jsk_apc¶
jsk_apc is a stack of packages for the Amazon Picking Challenge.
The code is open source and available on GitHub.
0.3: Stable Achievement of 30 points¶
- Opened: 2016-04-11
- Deadline: 2016-04-30
Goal¶
Achieve 30 points stably.
Configuration¶
- Gripper: vacuum2015
- Item: apc2015
- Hand Camera: use Creative camera (feature)
System¶
Recognition¶
- Location of shelf: old
- Object recognition in bin: new (feature)
- Grasp planning in bin: old
- Detection of grasps with vacuum sensor: old
- In-hand object recognition: old
- In-hand detection of grasps: old
Motion¶
- Pick: old
- Return: (feature)
- return the object into the back of the shelf.
- when IK fails, replay the executed trajectory in reverse.
A -> B -> C -> G -> (D) -> E -> F -> H
^                                    |
|<-----------------------------------+
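The reverse replay works on the trajectory that was already executed. A minimal sketch of the idea in Python (not the actual jsk_apc implementation; the waypoint buffer and send function are hypothetical):
# Hypothetical buffer of joint-angle waypoints recorded while the arm moved in.
recorded_waypoints = []

def record_waypoint(joint_angles):
    recorded_waypoints.append(list(joint_angles))

def replay_reversely(send_angle_vector):
    # When IK fails on the way back, retreat by playing the executed
    # trajectory backwards instead of planning a new path.
    for joint_angles in reversed(recorded_waypoints):
        send_angle_vector(joint_angles)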
0.4: New Gripper Stable Grasping¶
- Opened: 2016-04-23
- Deadline: 2016-04-30
Goal¶
Grasp most objects stably with the new gripper.
Configuration¶
- Gripper: vacuum2016 (feature) (vs vacuum2015)
- Item: apc2015
- Hand Camera: None
System¶
Recognition¶
- Location of shelf: old
- Object recognition in bin: old
- Grasp planning in bin: old
- Detection of grasping with vacuum sensor: new (feature)
- In-hand object recognition: None
- In-hand detection of grasping: None
Motion¶
- Pick objects in a given bin
Comparison¶
- Compare the new and old grippers by picking the same object in the same setting
- The goal of this comparison is to prove the new gripper's superiority over the old one
1.0: APC2016 ver1.0¶
- Opened: 2016-05-10
- Ended: 2016-07-08
Goal¶
Achieve 30 points in the pick task under the official APC2016 rules
1.5: APC2016 ver1.5¶
- Opened: 2016-06-02
- Ended: 2016-07-09
Goal¶
Achieve 30 points in both the pick and stow tasks under the official APC2016 rules
Schedule¶
Mon | Tue | Wed | Thu | Fri | Sat | Sun |
---|---|---|---|---|---|---|
6 | 7 Milestone1.0 | 8 | 9 | 10 | 11 Milestone1.5 | 12 |
13 | 14 | 15 | 16 | 17 | 18 | 19 |
20 | 21 | 22 | 23 | 24 | 25 | 26 |
27 Departure 1601 | 28 | 29 | 30 | 7/1 | 7/2 | 7/3 |
APC2016 rules¶
Objects¶

Bonus and Stocks¶

Scoring¶
Pick Task¶
- Bins with 1 or 2 items
- +10 points
- Bins with 3 or 4 items
- +15 points
- Bins with 5 or more items
- +20 points
Stow Task¶
- Bins that started with 1 or 2 items
- +10 points
- Bins that started with 3 or 4 items
- +15 points
- Bins that started with 5 or more items
- +20 points
Penalties¶
- For each item that is not in a bin, the tote, or actively held by the robot at the end of the attempt:
- -10 points
- For each incorrect final position in the Task Output File for an item that is in a bin or the tote:
- -10 points
- Minor damage to any item or the shelf:
- -5 points
- Major damage to any item or the shelf:
- -20 points
- Dropping an item from a height of more than 0.3 meters:
- -5 points
- Leaving an item protruding out of its bin by more than 0.5cm:
- -5 points
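As an illustration of how these values combine (a hypothetical attempt, not from the rulebook): a correct pick from a bin that held 3 items scores +15, and dropping that item from above 0.3 meters subtracts 5, for a net of +10.
def bin_points(n_items):
    # Point value of a correct pick/stow by the bin's item count (values above).
    if n_items <= 2:
        return 10
    elif n_items <= 4:
        return 15
    return 20

score = bin_points(3) - 5  # +15 for the pick, -5 for the high drop
print(score)  # 10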
Names¶

Pick Task¶
Stow Task¶
Tips & FAQ¶
How to run rostests?¶
catkin run_tests --this --no-deps
catkin run_tests jsk_arc2017_baxter --no-deps
How to run roslint?¶
catkin build --this --no-deps --catkin-make-args roslint --
catkin build jsk_arc2017_baxter --no-deps --catkin-make-args roslint --
Controlling joints of the robot does not work.¶
Run the command below to synchronize the time with the robot. Time synchronization is crucial.
sudo ntpdate baxter.jsk.imi.i.u-tokyo.ac.jp
Rosdep failure due to cython version.¶
rosdep install -y -r --from-paths . --ignore-src
This command may fail with the errors below.
pkg_resources.DistributionNotFound: cython>=0.21
...
ERROR: the following rosdeps failed to install
pip: command [sudo -H pip install -U scikit-image] failed
pip: Failed to detect successful installation of [scikit-image]
In this case, your setuptools may be too old. Please run the command below.
sudo pip install -U setuptools
See https://github.com/start-jsk/jsk_apc/issues/1244 for details.
How to release a new version of jsk_apc?¶
roscd jsk_apc
catkin_generate_change_log
# edit CHANGELOG.rst to create a pretty changelog
catkin_prepare_release
bloom-release --rosdistro indigo --track indigo jsk_apc # you may need to fix package.xml for pip packages
Gripper used in APC2016¶

This gripper has two parts made with 3D printers: the "base" part is printed on a ProJet and the "tube" part on a Dimension. The 3D data for these parts are here. As of 2016/9/27, the left gripper is gripper-v3 and the right gripper is gripper-v4.
The PCB data for the control board on this gripper are also here.
The servo motor used in this gripper is GWS S11HP/2BBMG/JR.
How to calibrate extrinsic parameters of Astra¶
% roslaunch jsk_2016_01_baxter_apc baxter.launch
% roscd jsk_2016_01_baxter_apc/rvizconfig
% rviz -d check_astra.rviz
% roslaunch jsk_2016_01_baxter_apc astra_hand.launch
You can see Rviz like below:

If you want to reverse right and left camera vision:
% roslaunch jsk_2016_01_baxter_apc astra_hand.launch left_first:=false
If the point cloud and the robot model diverge too much in Rviz, you should change the pose of the depth optical frame as shown below:
% rosrun tf static_transform_publisher -0.10 -0.008 0.015 -1.56 0.00 -0.08 right_hand right_hand_camera_depth_optical_frame 100 __name:=right_hand_camera_depth_static_tf_publisher # This is just an example
# OR
# % roslaunch jsk_2016_01_baxter_apc astra_hand.launch --args /right_hand_camera_depth_static_tf_publisher
% /opt/ros/indigo/lib/tf/static_transform_publisher -0.10 -0.008 0.015 -1.56 0.00 -0.08 right_hand right_hand_camera_depth_optical_frame 100 __name:=right_hand_camera_depth_static_tf_publisher # This is just an example
After you adjust the point cloud, check the color point cloud:

If the color point cloud and the robot model diverge too much in Rviz, you should change the pose of the RGB optical frame as shown below:
% rosrun tf static_transform_publisher 0.040 0.01 0 0.0 0 0 right_hand_camera_depth_optical_frame right_hand_camera_rgb_optical_frame 100 __name:=right_hand_camera_rgb_static_tf_publisher # This is just an example
# OR
# % roslaunch jsk_2016_01_baxter_apc astra_hand.launch --args /right_hand_camera_rgb_static_tf_publisher
% /opt/ros/indigo/lib/tf/static_transform_publisher 0.040 0.01 0 0.0 0 0 right_hand_camera_depth_optical_frame right_hand_camera_rgb_optical_frame 100 __name:=right_hand_camera_rgb_static_tf_publisher # This is just an example
jsk_apc2015_common¶
jsk_apc2015_common is a common stack for the Amazon Picking Challenge 2015.
Setup Berkeley dataset for object recognition¶
Download berkeley dataset¶
To download the dataset:
python scripts/download_dataset.py -O berkeley_dataset
Apply mask image for object images¶
To generate mask-applied images:
python scripts/create_mask_applied_dataset.py berkeley_dataset -O berkeley_dataset_mask_applied
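The script above pairs each object image with its mask. A minimal sketch of the masking operation itself (file names here are hypothetical, not the dataset layout):
import numpy as np
from PIL import Image

img = np.array(Image.open('object.jpg'))
mask = np.array(Image.open('mask.png').convert('L')) > 127  # binary object mask

masked = img.copy()
masked[~mask] = 0  # zero out everything outside the object region
Image.fromarray(masked).save('object_masked.jpg')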
ROS nodes¶
Python Library¶
jsk_apc2015_common.get_object_list() [source]¶
Returns the object name list for APC2015.
Parameters: None. Returns: objects – list of object names. Return type: list
jsk_apc2015_common.load_json(json_file) [source]¶
Load the json file which is the interface for APC2015.
Parameters: json_file (str) – Path to the json file.
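A short usage sketch of these two functions (the json path is only an example):
import jsk_apc2015_common

objects = jsk_apc2015_common.get_object_list()  # APC2015 object names
print(objects[:3])

data = jsk_apc2015_common.load_json('/path/to/apc2015_layout_1.json')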
jsk_2015_05_baxter_apc¶
jsk_2015_05_baxter_apc is a ROS package for the Amazon Picking Challenge in May 2015.
Demonstrate APC2015 on Real World¶
Real world demonstration for APC2015 can be done on baxter@sheeta.jsk.imi.i.u-tokyo.ac.jp.
- Prepare json.
- Setup objects in Kiva.
baxter@sheeta $ roscd jsk_apc && git checkout 0.2.2
baxter@sheeta $ roslaunch jsk_2015_05_baxter_apc baxter.launch
baxter@sheeta $ roslaunch jsk_2015_05_baxter_apc setup_torso.launch
baxter@sheeta $ ssh doura
baxter@doura $ tmux
baxter@doura $ roscd jsk_apc && git checkout 0.2.2
# on a tmux session
baxter@doura $ sudo -s # necessary to launch kinect2 over an ssh login
baxter@doura $ roslaunch jsk_2015_05_baxter_apc setup_head.launch
# detach from the tmux session and logout from doura here
baxter@sheeta $ roslaunch jsk_2015_05_baxter_apc main.launch json:=$(rospack find jsk_2015_05_baxter_apc)/json/layout_12.json
# optional visualization
$ rviz -d $(rospack find jsk_2015_05_baxter_apc)/rvizconfig/segmentation.rviz # check object segmentation in each bin
$ rviz -d $(rospack find jsk_2015_05_baxter_apc)/rvizconfig/real_demo.rviz # visualization for demo
https://github.com/start-jsk/jsk_apc/blob/master/jsk_2015_05_baxter_apc/json/layout_12.json




Demonstrate APC2015 on Simulation¶
The simulation demonstration for APC2015 can be run on any computer with ROS indigo.
Installation¶
- Install kiva_pod gazebo model from here.
- Install ROS (instructions for ROS indigo on Ubuntu 14.04).
- Setup catkin workspace as below:
# setup catkin
mkdir -p ~/ros/jsk_apc2015_sim && cd ~/ros/jsk_apc2015_sim
catkin init
# setup repos
cd ~/ros/jsk_apc2015_sim/src
wstool init
wstool merge https://raw.githubusercontent.com/start-jsk/jsk_apc/master/sim.rosinstall.$ROS_DISTRO
wstool update -j8
# install depends
rosdep install --from-path . -r -y
# build repos
catkin build -iv -j8
Demo¶
roslaunch jsk_2015_05_baxter_apc baxter_sim.launch
roslaunch jsk_2015_05_baxter_apc setup_head.launch gazebo:=true
roslaunch jsk_2015_05_baxter_apc main.launch json:=$(rospack find jsk_apc2015_common)/json/f2.json
# optional visualization
rviz -d $(rospack find jsk_2015_05_baxter_apc)/rvizconfig/gazebo_demo.rviz

ROS nodes¶
bin_contents.py¶
What is this?¶
Publishes the contents in bins of Kiva Pod whose layout is described in a json file for Amazon Picking Challenge 2015.
Subscribing Topic¶
None.
Publishing Topic¶
~ (jsk_2015_05_baxter_apc/BinContentsArray): Bin contents.
~bin_[a-l]_n_object (jsk_recognition_msgs/Int32Stamped): Number of objects in each bin.
Parameters¶
~json (type: String, required): Path of json file for the challenge.
Example¶
rosrun jsk_2015_05_baxter_apc bin_contents.py _json:=$(rospack find jsk_2015_05_baxter_apc)/json/apc2015_layout_1.json
boost_object_recognition.py¶
What is this?¶
This node weights each classifier's result to produce a combined object recognition result.
Subscribing Topic¶
~input/bof (jsk_recognition_msgs/ClassificationResult): Result of classification with Bag of Features.
~input/ch (jsk_recognition_msgs/ClassificationResult): Result of classification with Color Histogram.
Publishing Topic¶
~output (jsk_recognition_msgs/ClassificationResult): Result of boosting.
Parameters¶
~weight (type: String, required): Path to yaml file for boosting weights.
~queue_size (type: Int, default: 100): Queue size for subscriptions.
~approximate_sync (type: Bool, default: false): Synchronization policy for message_filters.
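For illustration only, a weighted combination of the two classifiers' per-class probabilities could look like the sketch below (the weights and array contents are hypothetical; the node's exact rule lives in its source):
import numpy as np

weights = {'bof': 0.6, 'ch': 0.4}  # hypothetical values from the ~weight yaml

proba_bof = np.array([0.1, 0.7, 0.2])  # per-class probabilities from ~input/bof
proba_ch = np.array([0.3, 0.4, 0.3])   # per-class probabilities from ~input/ch

boosted = weights['bof'] * proba_bof + weights['ch'] * proba_ch
best_class = int(np.argmax(boosted))   # index of the boosted best label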
color_object_matcher.py¶
What is this?¶
This node classifies images for object recognition using a color histogram feature.
Subscribing Topic¶
~input (sensor_msgs/Image): Input image.
~input/label (sensor_msgs/Image): Input label image which describes the regions of interest.
Publishing Topic¶
~output (jsk_recognition_msgs/ClassificationResult): Classification result of the input image for an object set.
Parameters¶
~queue_size (type: Int, default: 100): Queue size for subscriptions.
~approximate_sync (type: Bool, default: false): Synchronization policy for message_filters.
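As a sketch of the feature computation only (the node's actual feature layout and classifier may differ):
import numpy as np

def color_histogram(rgb_image, region_mask, bins=8):
    # Per-channel histograms of the pixels inside one labeled region,
    # concatenated and normalized into a single feature vector.
    pixels = rgb_image[region_mask]  # (n_pixels, 3) array
    hists = [np.histogram(pixels[:, c], bins=bins, range=(0, 255))[0]
             for c in range(3)]
    feature = np.concatenate(hists).astype(float)
    return feature / max(feature.sum(), 1.0)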
euclid_k_clustering.py¶
What is this?¶
This node dynamically reconfigures the ~tolerance rosparam of jsk_pcl_ros/euclid_clustering (jsk_pcl/EuclideanClustering), considering the number of objects in the region of interest.
Subscribing Topic¶
~k_cluster (jsk_recognition_msgs/Int32Stamped): Expected number of clusters.
~{node}/cluster_num (jsk_recognition_msgs/Int32Stamped): Actual number of clusters. {node} is the value of the rosparam ~node; see Parameters for detail.
Publishing Topic¶
None.
Parameters¶
~node (type: String, required): Node name of jsk_pcl_ros/euclid_clustering.
~default_tolerance (type: Float, required): Default value of tolerance.
~reconfig_eps (type: Float, default: 0.2): Rate of reconfiguration relative to the current value at each step.
~reconfig_n_limit (type: Int, default: 10): Limit on the number of reconfigurations.
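A minimal sketch of one reconfiguration step using dynamic_reconfigure directly (the helper and its arguments are illustrative, not the node's code):
from dynamic_reconfigure.client import Client

def reconfigure_tolerance(node_name, tolerance, expected_clusters, actual_clusters,
                          reconfig_eps=0.2):
    # Too few clusters means the tolerance merges separate objects (shrink it);
    # too many clusters means it splits single objects (grow it).
    if actual_clusters < expected_clusters:
        tolerance *= (1.0 - reconfig_eps)
    elif actual_clusters > expected_clusters:
        tolerance *= (1.0 + reconfig_eps)
    Client(node_name).update_configuration({'tolerance': tolerance})
    return tolerance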
initialize_baxter.py¶
What is this?¶
This node performs the setup below after waiting for the /clock and /robot/state topics.
- Enables the robot.
- Launches baxter_interface/joint_trajectory_action_server.py.
- Launches baxter_interface/head_action_server.py.
Subscribing Topic¶
None.
Publishing Topic¶
None.
main.l¶
What is this?¶
Main program that activates the robot: it subscribes to recognition results and performs manipulation.
Subscribing/Publishing Topic¶
Subscriptions and publications are done in the files below; see them for detail:
pr2eus/robot-interface.l
baxtereus/baxter-interface.l
jsk_2015_05_baxter_apc/euslisp/jsk_2015_05_baxter_apc/baxter-interface.l
Parameters¶
~[left,right]_hand/state (type: String, do not set manually)
~[left,right]_hand/target_bin (type: String, do not set manually)
The state and target bin of each hand. Mainly used for parallel activation of the dual arms.
work_order.py¶
What is this?¶
Publishes the picking order for each arm of Baxter robot in Amazon Picking Challenge 2015.
Rules¶
- It abandons bins whose target object is listed below:
- genuine_joe_plastic_stir_sticks (big & heavy)
- cheezit_big_original (big & heavy)
- rolodex_jumbo_pencil_cup (many holes)
- champion_copper_plus_spark_plug (small)
- oreo_mega_stuf (heavy)

- Left bins are assigned to the left arm, and right bins to the right arm.
- Center bins are assigned to one arm (left or right). (A sketch of these rules follows below.)
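A minimal sketch of these assignment rules only (the bin-to-column mapping and data layout are assumptions; the real node reads the json given by ~json):
BLACKLIST = {
    'genuine_joe_plastic_stir_sticks', 'cheezit_big_original',
    'rolodex_jumbo_pencil_cup', 'champion_copper_plus_spark_plug', 'oreo_mega_stuf',
}
LEFT_BINS = {'a', 'd', 'g', 'j'}   # assumed left column of the Kiva pod
RIGHT_BINS = {'c', 'f', 'i', 'l'}  # assumed right column

def assign_orders(work_order):
    # work_order: list of (bin, target_object) pairs (assumed layout).
    orders = {'left_hand': [], 'right_hand': []}
    for bin_, target in work_order:
        if target in BLACKLIST:
            continue  # abandon bins whose target is hard to pick
        if bin_ in RIGHT_BINS:
            orders['right_hand'].append((bin_, target))
        else:
            # left and center bins; center bins may go to either arm
            orders['left_hand'].append((bin_, target))
    return orders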
Subscribing Topic¶
None.
Publishing Topic¶
~left_hand, ~right_hand (jsk_2015_05_baxter_apc/WorkOrderArray): Picking orders for each arm.
Parameters¶
~json (type: String, required): Path of json file for the challenge.
Example¶
rosrun jsk_2015_05_baxter_apc work_order.py _json:=$(rospack find jsk_2015_05_baxter_apc)/json/apc2015_layout_1.json
Euslisp Code¶
Contents:
Reachable Space of Customized Baxter¶
roscd jsk_2015_05_baxter_apc/euslisp/examples
roseus baxter-reachable-space.l



Library:
euslisp/jsk_2015_05_baxter_apc/baxter-interface.l¶
jsk_2015_05_baxter_apc::baxter-interface¶
- :super baxter-interface
- :slots *tfl *bin-boxes *objects-in-bin-boxes *objects-in-bin-coms _bin-coords-list
:wait-for-user-input-to-start arm
:init &rest args
:start-grasp &optional (arm :arms)
:stop-grasp &optional (arm :arms)
:graspingp arm
:opposite-arm arm
:need-to-wait-opposite-arm arm
:arm-symbol2str arm
:arm-potentio-vector arm
:tf-pose->coords frame_id pose
:fold-pose-back &optional (arm :arms)
:detect-target-object-in-bin target-object bin
:recognize-bin-boxes &key (stamp (ros::time-now))
:bbox->cube bbox
:visualize-bins
:visualize-objects
:recognize-grasp-coords-list bin &key (stamp (ros::time-now))
:recognize-objects-in-bin bin &key (stamp (ros::time-now)) (timeout 10)
:recognize-object-in-hand arm &key (stamp (ros::time-now)) (timeout)
:verify-object arm object-name &key (stamp (ros::time-now))
:try-to-pick-in-bin arm bin
:try-to-pick-object-solidity arm bin &key (offset #f(0.0 0.0 0.0))
:try-to-pick-object arm bin &key (object-index 0) (offset #f(0.0 0.0 0.0))
:pick-object arm bin &key (object-index 0) (n-trial 1) (n-trial-same-pos 1) (do-stop-grasp nil)
:send-av &optional (tm 3000)
:force-to-reach-goal &key (arm :arms) (threshold 5) (stop 10)
:ik->bin-entrance arm bin &key (offset #f(0.0 0.0 0.0))
:move-arm-body->bin arm bin
:move-arm-body->order-bin arm
:spin-off-by-wrist arm &key (times 10)
:move-arm-body->head-view-point arm
:place-object arm
:get-work-orders arm
:get-next-work-order arm current-order
:get-bin-contents bin
:real-sim-end-coords-diff arm
jsk_2015_05_baxter_apc::baxter-init &key (ctype :default-controller)
euslisp/jsk_2015_05_baxter_apc/util.l¶
m->mm m
argmax fvec
vec-list-max vec-list &key axis
str2symbol str
symbol2str *_symbol*
ros::advertise-if-yet name data-class queue-size
underscore-to-space str_
which-bin-region bin
arm-to-ctype arm
arm-to-str arm
opposite-arm arm
get-object-size object-name
zip a b
dict zipped
Testing¶
catkin run_tests jsk_2015_05_baxter_apc --no-deps
jsk_apc2016_common¶
jsk_apc2016_common is a common stack for the Amazon Picking Challenge 2016.
Setup RBO Segmenter for segmentation in bin¶
Submodule update¶
To initialize the RBO submodule:
git submodule init
git submodule update --init --recursive
ROS nodes¶
visualize_stow_json.py¶
What is this?¶
Visualizes the json file for the Stow Task, which is the interface for the Amazon Picking Challenge 2016.

Subscribing Topic¶
None.
Publishing Topic¶
~output (sensor_msgs/Image): Bin contents image.
Example¶
rosrun jsk_apc2016_common visualize_stow_json.py $(rospack find jsk_2016_01_baxter_apc)/json/apc_stow.json
rosrun image_view image_view image:=/visualize_stow_json/output
Python Library¶
jsk_2016_01_baxter_apc¶
jsk_2016_01_baxter_apc is a ROS package for the Amazon Picking Challenge in June 2016.
APC2016 Pick Task Trial on Real World¶
Pick task trial on real world for APC2016 can be done on baxter@sheeta.
- Prepare json (ex. apc_pick_task_robocup2016.json).
- Setup objects in Kiva.
# Launch nodes to control robot.
baxter@sheeta $ roslaunch jsk_2016_01_baxter_apc baxter.launch
# Launch nodes in recognition pipeline for pick task.
baxter@sheeta $ roslaunch jsk_2016_01_baxter_apc setup_for_pick.launch
# optional: Check sanity.
baxter@sheeta $ rosrun jsk_2016_01_baxter_apc check_sanity_setup_for_pick
# Run task!
baxter@sheeta $ roslaunch jsk_2016_01_baxter_apc main.launch json:=$(rospack find jsk_apc2016_common)/json/apc_pick_task_robocup2016.json
# even if you pass rviz:=false to main.launch, you need to launch yes_no_button.
baxter@sheeta $ rosrun jsk_2016_01_baxter_apc yes_no_button
The commands above are automated with the single command below:
baxter@sheeta $ tmuxinator start apc
APC2016 Pick Task Trial on Real World with Right Gripper-v5¶
Pick task trial on real world with right gripper-v5 for APC2016 can be done on baxter@sheeta.
- Install right gripper-v5 in Baxter
- Prepare json (ex. test_gripper_v5.json).
- Setup objects in Kiva.
# Launch nodes to control robot.
baxter@sheeta $ roslaunch jsk_2016_01_baxter_apc baxterrgv5.launch
# Launch nodes in recognition pipeline for pick task.
baxter@sheeta $ roslaunch jsk_2016_01_baxter_apc setup_for_pick.launch
# optional: Check sanity.
baxter@sheeta $ rosrun jsk_2016_01_baxter_apc check_sanity_setup_for_pick
# Run task!
baxter@sheeta $ roslaunch jsk_2016_01_baxter_apc main_rgv5.launch json:=$(rospack find jsk_apc2016_common)/json/test_gripper_v5.json
# even if you pass rviz:=false to main.launch, you need to launch yes_no_button.
baxter@sheeta $ rosrun jsk_2016_01_baxter_apc yes_no_button
APC2016 Stow Task Trial on Real World¶
Stow task trial on real world for APC2016 can be done on baxter@satan and baxter@eyelash.
- Prepare json.
- Setup objects in Kiva.
- Setup objects in Tote.
# use satan
baxter@satan $ roscd jsk_apc && git fetch origin
baxter@satan $ git checkout 1.5.0
baxter@satan $ roslaunch jsk_2016_01_baxter_apc baxter.launch
baxter@satan $ roslaunch jsk_2016_01_baxter_apc setup_torso.launch use_stow:=true
# use eyelash
baxter@eyelash $ roscd jsk_apc && git fetch origin
baxter@eyelash $ git checkout 1.5.0
baxter@eyelash $ roslaunch jsk_2016_01_baxter_apc setup_astra.launch use_stow:=true
# use satan
baxter@satan $ roslaunch jsk_2016_01_baxter_apc main_stow.launch json:=$(rospack find jsk_apc2016_common)/json/stow_layout_1.json
Installation¶
See Installation.
MasterPiece¶
Well-done Images and Videos
READ/WRITE: https://drive.google.com/drive/u/1/folders/0B5DV6gwLHtyJS2NKU3J4WXo2TDA
Logs¶
2016-03-07¶
What you did?¶
- Try to achieve 30 points in apc2016 rules (but items are from apc2015)
What is issue?¶
- The vacuum cleaner's pressure sensor only reports 0/1
- Possible improvements for the new gripper (https://github.com/start-jsk/jsk_apc/issues/1223)
- There are regions where IK cannot be solved
- Check whether IK can be solved (https://github.com/start-jsk/jsk_apc/issues/1105)
- Segmentation does not work well
- The back of the bin cannot be seen
- Installing CREATIVE interactive gesture camera (https://github.com/start-jsk/jsk_apc/issues/1137)
- Failed to recognize the safety glasses
- Recognition failure of the safety glasses (https://github.com/start-jsk/jsk_apc/issues/1106)
- Picked the Oreo and dropped it outside the shelf
- Skip bins containing objects that cannot be picked (https://github.com/start-jsk/jsk_apc/issues/1104)
What you think/feel?¶
- Want to replace the pressure sensor
- Want IK to be solvable more reliably
- Improve segmentation
- Attach a recognition device to the hand
2016-03-10¶
What you did?¶
- Try to achieve 30 points in apc2016 rules (but items are from apc2015)
What is issue?¶
- During a pick the object is actually grasped, but the robot cannot detect this (graspingp returns nil), so it pulls the arm out of the shelf. Because this motion is fast, the object is dragged along and dropped before the suction is released
- When graspingp is nil, pull the arm out without cutting the suction (https://github.com/start-jsk/jsk_apc/issues/1134)
- The vacuum cleaner turns off in the middle of returning an object
- The vacuum sometimes turns off while an object is being returned (https://github.com/start-jsk/jsk_apc/issues/1131)
- The robot picks up objects it has already returned
- Push returned objects to the back of the bin (https://github.com/start-jsk/jsk_apc/issues/1129)
- The work order is not very good (i.e. right hand tried to pick target from bin with 9 items)
- Skip ‘level 3’ work order (https://github.com/start-jsk/jsk_apc/issues/1208)
- When returning an object with a large bag to the shelf, it catches on the bottom of the shelf and gets dropped
What you think/feel?¶
- Want to do collision checking and find more bins where the right and left hands can work simultaneously
2016-03-12¶
What you did?¶
- Try to achieve 30 points in apc2016 rules (but items are from apc2015)
What is issue?¶
- Recognition of the safety glasses: just adding more data did not improve performance.
What you think/feel?¶
- How about trying other feature extraction methods such as HOG?
Log data¶
- None
2016-04-11¶
THIS IS SAMPLE
What you did?¶
- DID_0
- DID_1
What is issue?¶
- ISSUE_0 - ISSUE_01 (https://github.com/start-jsk/jsk_apc/ISSUE_URL) - ISSUE_02 (https://github.com/start-jsk/jsk_apc/ISSUE_URL)
What you think/feel?¶
- Recognition is BAD!! OMG!
2016-04-12¶
Evaluation of CREATIVE camera and DepthSense camera¶
- compare CREATIVE camera and DepthSense camera
CREATIVE camera¶
Specs¶
- Must be at least 15 cm away from the target
- depth image: /softkinetic_camera/depth/image_raw
- height: 240
- width: 320
- encoding: 32FC1
- color image /softkinetic_camera/rgb/image_color
- height: 720
- width: 1280
- encoding: bgr8
- points xyzrgb /softkinetic_camera/depth/points
- height: 240
- width: 320
Issues¶
Experiment¶
Setup

Topic hz
Grid table:
topic | rate | min_delta | max_delta | std_dev |
---|---|---|---|---|
/softkinetic_camera/depth/points | 30.053 | 0.021 | 0.043 | 0.005 |
/softkinetic_camera/depth/image_raw | 30.136 | 0.018 | 0.044 | 0.006 |
/softkinetic_camera/rgb/image_color | 25.250 | 0.058 | 0.061 | 0.008 |
View
- front view

- above view
DepthSense¶
Specs¶
- Must be at least 15 cm away from the target
- points xyzrgb /softkinetic_camera/depth/points
- height: 240
- width: 320
- encoding: 32FC1
- depth image /softkinetic_camera/depth/image_raw
- height: 240
- width: 320
- encoding: 32FC1
- color image /softkinetic_camera/rgb/image_color
- height: 720
- width: 1280
- encoding: bgr8
Issues¶
- The color image /softkinetic_camera/rgb/image_color has an overall yellowish tint
- a little bit noisy?
Experiment¶
Setup

Topic hz
Grid table:
topic | rate | min_delta | max_delta | std_dev |
---|---|---|---|---|
/softkinetic_camera/depth/points | 30.164 | 0.017 | 0.043 | 0.006 |
/softkinetic_camera/depth/image_raw | 30.136 | 0.018 | 0.045 | 0.005 |
/softkinetic_camera/rgb/image_color | 24.960 | 0.032 | 0.051 | 0.005 |
View
- front view

- above view
2016-04-21¶
The day before, we achieved start-jsk/jsk_apc#1302, and with start-jsk/jsk_apc#1308 the new gripper's servo can now be driven from main.launch. While squashing the remaining bugs, we also made the grasp detection based on the pressure sensor inside the new gripper available from main.l. In addition, when last year's program is run with the new gripper, the top of the gripper hits the top of the bin; this is being fixed.
What you did?¶
Bug fix
The Info messages of the gripper joint trajectory action were hard to read and the program was not very extensible, so this was fixed.
New functions
Instead of the pressure sensor built into the vacuum cleaner, use the pressure sensor built into the new gripper to judge whether an object has been grasped.
Work in progress
When last year's program is run with the new gripper, the top of the gripper hits the top of the bin (movie1). This is being fixed.
What is issue?¶
- Flat items can be picked, but for tall items the new gripper is very likely to hit the shelf when IK is solved toward the item. We need to write a program that picks items without bending the gripper servo.
2016-04-22-gripper¶
Continuing from the previous day, work is in progress on start-jsk/jsk_apc#1321. The robot can now decide to pick fairly tall items with the gripper servo extended, and to pick short items or items leaning against the side of the shelf by bending the gripper servo and rotating the wrist to an appropriate angle.
What you did?¶
Work in progress
For tall items, the motion can now be switched depending on the item's position.
- When the program was first written, it contained a bug that made IK fail, as in movie1 and movie2.
- In movie3, the robot could approach a tall item near the center of the bin with the gripper servo extended, but this introduced a bug that made IK fail when trying to pick a short item.
- In movie4, IK no longer fails even for short items.
- In movie5, the robot could approach an item near the side of the bin by rotating the wrist joint, but the gripper interfered with the opposite side of the bin.
- In movie6, the interference seen in movie5 was resolved.
- In movie7, we tried grasping a thick item. It could be grasped, but the suction came off when the gripper joint was bent while pulling the item out.
What is issue?¶
- As shown in movie7, after approaching with the gripper joint straight, the gripper joint should only be bent once the item has been pulled out of the bin. A similar problem, shown in movie6, is that after picking an item near the side of the bin, rotating the wrist joint while the gripper is still inside the bin raises the risk that the item hits the bin and the suction comes off. Until the item is pulled out of the bin, the posture used during the approach should be kept as much as possible.
- Currently the gripper servo torque is always on, but we found that if the torque is cut while an item is grasped, the gripper joint angle settles according to gravity and to the interference between the item and the bin. We would like to make good use of this.
- As is obvious in movie4, the gripper is not brought accurately to the item's position. This is probably because the item's point cloud is not captured well enough. If a RealSense camera is mounted on the wrist and can be used effectively, this should improve.
- In movie4, the power jack on the gripper came loose and had to be plugged back in. The gripper also got caught on the vacuum cleaner hose, forcing an emergency stop. The loose jack has been dealt with, but we need to think about how the cables and hose are routed.
2016-04-25-26-gripper¶
We judged that start-jsk/jsk_apc#1321 was complete as of 4/22 and had it merged. However, it turned out that IK cannot be solved under certain conditions, so this is being fixed in start-jsk/jsk_apc#1345. We also revised the Arduino firmware and added a new node as safety measures to make the robot harder to damage.
What you did?¶
New functions
Resolves start-jsk/jsk_apc#1297 and start-jsk/jsk_apc#1327. Currently the gripper servo torque is released after 15 minutes of idling. When approaching an item without bending the gripper servo, releasing the servo torque while pulling out lets the item be pulled out cleanly. This succeeds in movie2.
Bug fix (Work in progress)
Fixes start-jsk/jsk_apc#1341, which appeared in the state of start-jsk/jsk_apc#1321.
- IK was failing, as in movie1.
- Working in start-jsk/jsk_apc#1342, IK can now be solved by rotating the wrist, as in movie4.
- However, we realized that rotating the gripper servo instead of the wrist might allow a smaller motion, so we closed the previous PR and created this one. It has not been tried on the real robot yet; we will try it on 4/27.
2016-04-27-gripper¶
On 4/26 we hypothesized in start-jsk/jsk_apc#1345 that rotating the gripper servo would be better, but it failed, so we reverted to rotating the wrist. We also ported euslisp/examples/picking-with-clustering.l from jsk_2015_05_baxter_apc to jsk_2016_01_baxter_apc, so we can now test just the sequence of clustering inside one bin, grasping an item and putting it into the Order Bin.
What you did?¶
Bug fix
Fixes start-jsk/jsk_apc#1341, which appeared in the state of start-jsk/jsk_apc#1321. We tried the gripper servo rotation thought up on 4/26, but as shown in movie1 it did not affect the posture produced by IK, so we abandoned it. In the end we are back to rotating the wrist.
New functions
Ports euslisp/examples/picking-with-clustering.l from jsk_2015_05_baxter_apc to jsk_2016_01_baxter_apc. We can now test just the sequence of clustering inside one bin, grasping an item and putting it into the Order Bin. The motion looks like movie2.
2016-04-28-30-gripper¶
Among the problems found on 4/27, to eliminate the one where IK produced postures with the gripper bent by about 45 degrees, we set the weight of the gripper joint to 0 when solving IK so that IK no longer moves it. We then tested all bins reachable by the right hand and found that the arm caught on the Baxter body when approaching Bin e, so this was fixed.
What you did?¶
Bug fix
Set the weight of the gripper joint to 0 when solving IK so that IK does not move it. When the gripper joint needs to be bent, rotate it directly from the script, e.g. (send *baxter* :rotate-gripper :rarm 90 :relative nil). In addition, by changing fold-pose-upper, the via point used when solving IK, IK for Bin c could be solved without rotating the wrist, as in movie9, so the line that rotates the wrist was removed. Experimenting in the state of #1362, for every bin other than Bin e the robot could move without strain up to bringing the hand in front of the bin, and it no longer tangled the vacuum cleaner hose. For Bin e, however, the arm caught on the Baxter body, as in movie10. Wondering whether this was specific to the right arm, we tried the left arm as well and it touched the Baxter body with a similar motion, as in movie11. We judged that avoid-shelf-pose-e, the via point between fold-pose-back and the bin, was to blame; after changing it, the robot could move stably until bringing the gripper to the bin, as in movie18, but the gripper caught on the Baxter body while being pulled out. Shortening the pull-out distance made the motion stable, as in movie19.
New functions
Download a rosbag file stored on Google Drive and test whether :ik->bin-entrance can be solved for each bin and each arm. This test lets us confirm on Travis that we have not fallen into a state where IK cannot be solved. Currently the rosbag is one minute long, and if :recognize-bin-boxes does not run within one minute of starting playback, an error occurs. We may need to do something about this.
What is issue?¶
- In movie14 we try an approach to Bin b, but the robot makes a rather large motion before reaching the bin entrance. This is probably because fixing fold-pose-upper in start-jsk/jsk_apc#1362 also changed the posture produced by (send *baxter* :ik->bin-entrance :rarm :b), but we forgot to compare with the left hand, so we are not yet sure. This comparison is needed. - start-jsk/jsk_apc#1383
2016-05-02-18-gripper¶
Here we describe, with videos, the PRs created between 5/2 and 5/18 and which issues they resolve. For detailed explanations of each PR and issue, see the linked pages.
What you did?¶
- start-jsk/jsk_apc#1394
  - Resolves start-jsk/jsk_apc#1382 (movie and movie) as shown in movie
- start-jsk/jsk_apc#1400
- start-jsk/jsk_apc#1404
- start-jsk/jsk_apc#1425
  - Resolves start-jsk/jsk_apc#1383 (movie) as shown in movie and movie
- start-jsk/jsk_apc#1432
  - Resolves start-jsk/jsk_apc#1430 (movie) as shown in movie
- start-jsk/jsk_apc#1445
  - Resolves start-jsk/jsk_apc#1429
- start-jsk/jsk_apc#1448
  - Until this PR, when a heavy item was placed into the Order Bin, the gripper joint tried to extend, putting a large load on the servo and making it produce cracking noises. This load stripped gear teeth inside the servo and destroyed one servo. This PR resolves the problem by releasing the servo torque.
- start-jsk/jsk_apc#1451
  - Resolves start-jsk/jsk_apc#1447 (movie) as shown in movie
- start-jsk/jsk_apc#1474
  - Resolves start-jsk/jsk_apc#1456
- start-jsk/jsk_apc#1478
  - Resolves start-jsk/jsk_apc#1449
- start-jsk/jsk_apc#1488
  - Resolves start-jsk/jsk_apc#1481
- start-jsk/jsk_apc#1498
  - Resolves start-jsk/jsk_apc#1489 (movie) as shown in movie
- start-jsk/jsk_apc#1499
  - Resolves start-jsk/jsk_apc#1497
- start-jsk/jsk_apc#1500
  - Resolves start-jsk/jsk_apc#1496
What is issue?¶
- IK becomes unsolvable under certain conditions
2016-05-09¶
Segmentation in bin with kinect2_torso
What you did?¶
- Run main.launch with kinect2_torso
- Run segmentation_in_bin.launch with kinect2_torso
- Run segmentation and try to pick object from bin l
- Succeeded to pick object
What is issue?¶
- segmentation program takes almost 1 minute with kinect2
What you think/feel?¶
- I want to do this with softkinetic camera
2016-05-17¶
Segmentation in bin with softkinetic camera
What you did?¶
- resize output mask image
- fix camera info
- add robot self filter
- get proper centroid position
Results¶

What you think/feel?¶
- it works well, I think :)
2016-05-21¶
Run main program with RBO segmentation
What you did?¶
- Run main.launch with RBO segmentation algorithms
What is issue?¶
- no issues for milestone 0.3.0
What you think/feel?¶
- Segmentation and recognition are good, but grasp planning is not.
- milestone 0.3.0 finished
Testing¶
catkin run_tests jsk_2016_01_baxter_apc --no-deps
jsk_arc2017_common¶
jsk_arc2017_common is a common stack for the Amazon Robotics Challenge 2017.
ROS nodes¶
candidates_publisher.py¶
What is this?¶
Publishes label candidates from a JSON file.
Subscribing topics¶
~input/json_dir (std_msgs/String): JSON file directory
Publishing topics¶
~output/candidates (jsk_recognition_msgs/LabelArray): Label candidates in the target location
Parameters¶
~target_location (String, required): Target location name (tote, bin_A, bin_B or bin_C). You can update it via dynamic_reconfigure.
~label_names (List of String, required): List of label names
Sample¶
roslaunch jsk_arc2017_common sample_candidates_publisher.launch
json_saver.py¶
What is this?¶
Updates and saves the item location JSON file during the tasks.
Services¶
~update_json (jsk_arc2017_common/UpdateJSON): Update and save item_location_file.json.
$ rossrv show jsk_arc2017_common/UpdateJSON
string item
string src
string dst
---
bool updated
~save_json (std_srvs/Trigger): Save item_location_file.json.
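A hedged example of calling the update service from Python (the resolved service name assumes the node runs under the name json_saver, and the item/location values are only examples):
import rospy
from jsk_arc2017_common.srv import UpdateJSON

rospy.init_node('update_json_client')
rospy.wait_for_service('/json_saver/update_json')
update_json = rospy.ServiceProxy('/json_saver/update_json', UpdateJSON)

# Record that an item was moved from bin_A into the tote.
res = update_json(item='avery_binder', src='bin_A', dst='tote')
print(res.updated)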
Parameters¶
~json_dir (String, required): Directory where initial json files are located
~output_dir (String, required): Directory where output json files will be located
Sample¶
roslaunch jsk_arc2017_common sample_json_saver.launch
visualize_json.py¶
Subscribing topics¶
~input/json_dir (String): Where json files are located.
Publishing topics¶
~output/item_location_viz (sensor_msgs/Image): Visualization of item_location_file.json. This is enabled if item_location is inside the ~types rosparam and ~json_dir/item_location_file.json is read.
~output/order_viz (sensor_msgs/Image): Visualization of order_file.json. This is enabled if order is inside the ~types rosparam and ~json_dir/order_file.json is read.
Parameters¶
~types (List of String, required): item_location and/or order.
Sample¶
roslaunch jsk_arc2017_common sample_visualize_json.launch
work_order_publisher.py¶
What is this?¶
Publishes optimized work orders for the tasks.
Subscribing topics¶
None
Publishing topics¶
~left_hand (jsk_arc2017_common/WorkOrderArray): Optimized work orders for the left hand.
~right_hand (jsk_arc2017_common/WorkOrderArray): Optimized work orders for the right hand.
Parameters¶
~json_dir (String, required): Directory where initial json files are located
~rate (Int, default: 1): Rate (Hz) of publishing topics
Sample¶
roslaunch jsk_arc2017_common sample_work_order_publisher.launch
jsk_arc2017_baxter¶
jsk_arc2017_baxter is a ROS package for the Amazon Robotics Challenge in July 2017.
ARC2017 Pick Task Trial on Real World¶
Pick task trial on real world for ARC2017 can be done on baxter@baxter-c1.
- Prepare json.
- Set objects in Shelf.
# Launch nodes to control robot.
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter baxter.launch
# Launch nodes in recognition pipeline for pick task.
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter setup_for_pick.launch
# optional: Check sanity.
baxter@baxter-c1 $ rosrun jsk_2016_01_baxter_apc check_sanity_setup_for_pick
# Run task!
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter pick.launch json_dir:=$(rospack find jsk_arc2017_common)/data/json/sample_pick_task
With Environment Imitating ARC2017 Pick Competition¶
Preparation¶
baxter@baxter-c1 $ rosrun jsk_arc2017_common install_pick_re-experiment
Execution¶
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter baxter.launch
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter setup_for_pick.launch
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter pick.launch json_dir:=$HOME/data/arc2017/system_inputs_jsons/pick_re-experiment/json
ARC2017 Stow Task Trial on Real World¶
Stow task trial on real world for ARC2017 can be done on baxter@baxter-c1.
- Prepare json.
- Set objects in Tote.
# Launch nodes to control robot.
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter baxter.launch pick:=false
# Launch nodes in recognition pipeline for stow task.
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter setup_for_stow.launch
# Run task!
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter stow.launch json_dir:=$(rospack find jsk_arc2017_common)/data/json/sample_stow_task
State Machine¶
Task states are controlled by smach. You can check the state machines with smach_viewer.
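A minimal smach sketch (not the actual task state machine) showing how such states are wired together:
import smach

class PickObject(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=['succeeded', 'failed'])

    def execute(self, userdata):
        # motion and recognition would run here
        return 'succeeded'

sm = smach.StateMachine(outcomes=['finished'])
with sm:
    smach.StateMachine.add('PICK_OBJECT', PickObject(),
                           transitions={'succeeded': 'finished',
                                        'failed': 'PICK_OBJECT'})
outcome = sm.execute()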
Pick Task¶

# Run pick task with smach_viewer
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter pick.launch json_dir:=$(rospack find jsk_arc2017_common)/data/json/sample_pick_task smach_viewer:=true
Stow Task¶

# Run stow task with smach_viewer
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter stow.launch json_dir:=$(rospack find jsk_arc2017_common)/data/json/sample_stow_task smach_viewer:=true
Usage of Baxter¶
How to control baxter via roseus
Preparation¶
Run the command below under emacs's shell environment (M-x shell):
roseus
When you start a new shell, DO NOT FORGET to run:
rossetip
rossetmaster baxter
source ~/catkin_ws/devel/setup.bash
Set up baxter-interface:
;; load modules
(load "package://jsk_arc2017_baxter/euslisp/lib/arc-interface.l")
;; create a robot model (*baxter*) and make a connection to the real robot (*ri*)
(jsk_arc2017_baxter::arc-init)
;; display the robot model
(objects (list *baxter*))
arc-interface function APIs¶
rotate left(right) gripper
(send *baxter* :rotate-gripper :larm 90 :relative nil)
slide right gripper
(send *baxter* :slide-gripper :rarm 50 :relative nil)
move fingers in right gripper
(send *baxter* :hand :rarm :angle-vector #f(90 90))
(send *baxter* :hand-grasp-pre-pose :rarm :opposed)
(send *baxter* :hand-grasp-pose :rarm :cylindrical)
send initial pose for arc2017
(send *baxter* :fold-pose-back)
send current joint angles of robot model to real robot
(send *ri* :send-av)
send current hand joint angles of robot model to real robot
(send *ri* :move-hand :rarm (send *baxter* :hand :rarm :angle-vector) 1000)
Gripper-v6 Setup¶
Adjust gravity compensation¶
Gripper-v6 is heavy (1.18kg), so we should adjust gravity compensation of Baxter.
For now (2017/6/17), roslaunch jsk_arc2017_baxter baxter.launch
does it by:
$ rostopic pub -1 /robot/end_effector/right_gripper/command baxter_core_msgs/EndEffectorCommand '{ id : 131073, command : "configure", args : "{ \"urdf\":{ \"name\": \"right_gripper_mass\", \"link\": [ { \"name\": \"right_gripper_mass\", \"inertial\": { \"mass\": { \"value\": 1.18 }, \"origin\": { \"xyz\": [0.0, 0.0, 0.15] } } } ] }}"}'
If you want to change gripper, you should restore to the original setting by:
$ rostopic pub -1 /robot/end_effector/right_gripper/command baxter_core_msgs/EndEffectorCommand '{ id : 131073, command : "configure", args : "{ \"urdf\":{ \"name\": \"right_gripper_mass\", \"link\": [ { \"name\": \"right_gripper_mass\", \"inertial\": { \"mass\": { \"value\": 0 }, \"origin\": { \"xyz\": [0.0, 0.0, 0.0] } } } ] }}"}'
More information about gripper customization of Baxter is on the official page.
Distinguish left DXHUB from right one¶
Each of the left and right grippers has its own DXHUB for communication with its motors.
To distinguish the two DXHUBs and create the correct symbolic links (/dev/r_dxhub and /dev/l_dxhub), you have to change the configuration of the left DXHUB from the default.
Because the configuration is stored in the EEPROM of the FTDI chip on the DXHUB, you have to write a new configuration to that EEPROM.
Method on Windows¶
You should use FT_PROG to program the EEPROM. Please install it and take the following steps to change the configuration.
- Connect DXHUB to PC with a USB cable. Don’t connect other USB devices. Power supply to DXHUB is not needed
- Wait until device driver installation is finished
- Launch FT_PROG
- Click the loupe icon to scan devices
- Click the plus icon of “USB String Descriptors”

- Change “Product description” value to the value of
ATTRS{product}
in the udev rule

- Click the lightning icon

- Click “Program” to write the modified configuration to EEPROM
Experiments¶
Create 2d dataset for object segmentation¶
Collect raw data on shelf bins¶
roslaunch jsk_arc2017_baxter baxter.launch moveit:=false
roslaunch jsk_arc2017_baxter create_dataset2d_rawdata_main.launch
rosrun jsk_arc2017_common view_dataset2d.py ~/.ros/jsk_arc2017_baxter/create_dataset2d/right_hand
Collect raw data on tote bin¶
roslaunch jsk_arc2017_baxter baxter.launch moveit:=false pick:=false
roslaunch jsk_arc2017_baxter stereo_astra_hand.launch
roslaunch jsk_arc2017_baxter create_dataset2d_rawdata_main.launch box:=tote
rosrun jsk_arc2017_common view_dataset2d.py ~/.ros/jsk_arc2017_baxter/create_dataset2d/right_hand
Annotate¶
dirname=raw_data_$(date +%Y%m%d_%H%M%S)
mkdir -p $dirname  # mv below needs an existing target directory
mv ~/.ros/jsk_arc2017_baxter/create_dataset2d/left_hand/* $dirname
mv ~/.ros/jsk_arc2017_baxter/create_dataset2d/right_hand/* $dirname
rosrun jsk_arc2017_common annotate_dataset2d.py $dirname
rosrun jsk_arc2017_common view_dataset2d.py $dirname