jsk_apc

jsk_apc is a stack of packages for the Amazon Picking Challenge.

The code is open source, and available on github.

0.3: Stable Achievement of 30 points

  • Opened: 2016-04-11

  • Deadline: 2016-04-30

Goal

Achieve 30 points stably.

Configuration

  • Gripper: vacuum2015

  • Item: apc2015

  • Hand Camera: use creative (feature)

System

Recognition

  1. Location of shelf: old

  2. Object recognition in bin: new (feature)

  3. Grasp planning in bin: old

  4. Detection of grasps with vacuum sensor: old

  5. In-hand object recognition: old

  6. In-hand detection of grasps: old

Motion

  1. Pick: old

  2. Return: (feature)

  • into the back of the shelf.

  • when IK fails, replay the executed trajectory in reverse.

A -> B -> C -> G -> (D) -> E -> F -> H -
     ^                           |     |
     |<---------------------------     |
     -----------------------------------
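The reverse-replay fallback above can be sketched as follows. This is a minimal illustrative sketch with hypothetical names (`TrajectoryRecorder` is not part of jsk_apc); it only shows the idea of retreating along the already-executed, known-safe path when IK fails:

```python
class TrajectoryRecorder:
    """Record executed waypoints so they can be replayed in reverse
    when IK fails on the way back (hypothetical sketch)."""

    def __init__(self):
        self.executed = []  # stack of joint-angle waypoints

    def record(self, waypoint):
        self.executed.append(waypoint)

    def reverse_trajectory(self):
        # Retreat along a path that is known to be collision-free
        # by replaying the executed waypoints in reverse order.
        return list(reversed(self.executed))

recorder = TrajectoryRecorder()
for wp in [[0.0, 0.1], [0.2, 0.3], [0.4, 0.5]]:
    recorder.record(wp)
print(recorder.reverse_trajectory())  # [[0.4, 0.5], [0.2, 0.3], [0.0, 0.1]]
```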

0.4: New Gripper Stable Grasping

  • Opened: 2016-04-23

  • Deadline: 2016-04-30

Goal

Grasp most objects stably with the new gripper.

Configuration

  • Gripper: vacuum2016 (feature) (vs vacuum2015)

  • Item: apc2015

  • Hand Camera: None

System

Recognition

  1. Location of shelf: old

  2. Object recognition in bin: old

  3. Grasp planning in bin: old

  4. Detection of grasping with vacuum sensor: new (feature)

  5. In-hand object recognition: None

  6. In-hand detection of grasping: None

Motion

  1. Pick objects in a specific bin

Comparison

  • Compare the new and old grippers picking the same object in the same setting

  • The goal of this comparison is to demonstrate the new gripper's superiority over the old one

1.0: APC2016 ver1.0

  • Opened: 2016-05-10

  • Ended: 2016-07-08

Goal

Achieve 30 points in the pick task under the official APC2016 rules

1.5: APC2016 ver1.5

  • Opened: 2016-06-02

  • Ended: 2016-07-09

Goal

Achieve 30 points in both the pick and stow tasks under the official APC2016 rules

schedule

| Mon               | Tue            | Wed | Thu | Fri | Sat             | Sun |
|-------------------|----------------|-----|-----|-----|-----------------|-----|
| 6                 | 7 Milestone1.0 | 8   | 9   | 10  | 11 Milestone1.5 | 12  |
| 13                | 14             | 15  | 16  | 17  | 18              | 19  |
| 20                | 21             | 22  | 23  | 24  | 25              | 26  |
| 27 Departure 1601 | 28             | 29  | 30  | 7/1 | 7/2             | 7/3 |

APC2016 rules

Objects

_images/apc2016_objects.jpg

Bonus and Stocks

_images/apc2016_bonus_and_stocks.jpg

Scoring

Pick Task

  • Bins with 1 or 2 items

    • +10 points

  • Bins with 3 or 4 items

    • +15 points

  • Bins with 5 or more items

    • +20 points

Stow Task

  • Bins that started with 1 or 2 items

    • +10 points

  • Bins that started with 3 or 4 items

    • +15 points

  • Bins that started with 5 or more items

    • +20 points

Penalties

  • For each item that is not in a bin, the tote, or actively held by the robot at the end of the attempt:

    • -10 points

  • For each incorrect final position in the Task Output File for an item that is in a bin or the tote:

    • -10 points

  • Minor damage to any item or the shelf:

    • -5 points

  • Major damage to any item or the shelf:

    • -20 points

  • Dropping an item from a height of more than 0.3 meters:

    • -5 points

  • Leaving an item protruding out of its bin by more than 0.5cm:

    • -5 points
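The scoring rules above can be expressed as a small helper. This is a sketch for illustration (`bin_points` is a hypothetical name, not part of any APC code):

```python
def bin_points(n_items_in_bin):
    """Points for a successful pick/stow, by bin item count (APC2016)."""
    if n_items_in_bin <= 2:
        return 10
    elif n_items_in_bin <= 4:
        return 15
    return 20

# Hypothetical example: a pick from a 3-item bin with one minor-damage penalty.
score = bin_points(3) - 5
print(score)  # 10
```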

Names

_images/apc2016_names.jpg

Tips & FAQ

How to run rostests?

catkin run_tests --this --no-deps
catkin run_tests jsk_arc2017_baxter --no-deps

How to run roslint?

catkin build --this --no-deps --catkin-make-args roslint --
catkin build jsk_arc2017_baxter --no-deps --catkin-make-args roslint --

Controlling joints of the robot does not work.

Run the command below to synchronize time with the robot. Time synchronization is crucial.

sudo ntpdate baxter.jsk.imi.i.u-tokyo.ac.jp

Rosdep failure due to cython version.

rosdep install -y -r --from-paths . --ignore-src

This command may fail with errors like the following.

pkg_resources.DistributionNotFound: cython>=0.21
...
ERROR: the following rosdeps failed to install
pip: command [sudo -H pip install -U scikit-image] failed
pip: Failed to detect successful installation of [scikit-image]

In this case, your setuptools may be too old. Run the command below.

sudo pip install -U setuptools

See https://github.com/start-jsk/jsk_apc/issues/1244 for details.

How to release a new version of jsk_apc?

roscd jsk_apc
catkin_generate_change_log
# edit CHANGELOG.rst to create a pretty changelog
catkin_prepare_release
bloom-release --rosdistro indigo --track indigo jsk_apc  # you may need to fix package.xml for pip packages

Gripper used in APC2016

_images/apc2016_gripper_base_tube.jpg

This gripper has two parts made by 3D printers: the “base” part is made by a ProJet and the “tube” part by a Dimension. 3D data of these parts are here. For now (2016/09/27), the left gripper is gripper-v3 and the right gripper is gripper-v4.

Also, PCB data of the control board on this gripper are here.

The servo motor used in this gripper is GWS S11HP/2BBMG/JR.

How to calibrate extrinsic parameters of Astra

% roslaunch jsk_2016_01_baxter_apc baxter.launch
% roscd jsk_2016_01_baxter_apc/rvizconfig
% rviz -d check_astra.rviz
% roslaunch jsk_2016_01_baxter_apc astra_hand.launch

You can see Rviz like below:

_images/check_astra_raw_point_cloud.jpg

If you want to reverse right and left camera vision:

% roslaunch jsk_2016_01_baxter_apc astra_hand.launch left_first:=false

If the point cloud and the robot model are misaligned in Rviz, change the pose of the depth optical frame as below:

% rosrun tf static_transform_publisher -0.10 -0.008 0.015 -1.56 0.00 -0.08 right_hand right_hand_camera_depth_optical_frame 100 __name:=right_hand_camera_depth_static_tf_publisher  # This is just an example

# OR

# % roslaunch jsk_2016_01_baxter_apc astra_hand.launch --args /right_hand_camera_depth_static_tf_publisher
% /opt/ros/indigo/lib/tf/static_transform_publisher -0.10 -0.008 0.015 -1.56 0.00 -0.08 right_hand right_hand_camera_depth_optical_frame 100 __name:=right_hand_camera_depth_static_tf_publisher  # This is just an example

After you adjust the point cloud, check the color point cloud:

_images/check_astra_color_point_cloud.jpg

If the color point cloud and the robot model are misaligned in Rviz, change the pose of the RGB optical frame as below:

% rosrun tf static_transform_publisher 0.040 0.01 0 0.0 0 0 right_hand_camera_depth_optical_frame right_hand_camera_rgb_optical_frame 100 __name:=right_hand_camera_rgb_static_tf_publisher  # This is just an example

# OR

# % roslaunch jsk_2016_01_baxter_apc astra_hand.launch --args /right_hand_camera_rgb_static_tf_publisher
% /opt/ros/indigo/lib/tf/static_transform_publisher 0.040 0.01 0 0.0 0 0 right_hand_camera_depth_optical_frame right_hand_camera_rgb_optical_frame 100 __name:=right_hand_camera_rgb_static_tf_publisher  # This is just an example

jsk_apc2015_common

jsk_apc2015_common is a common stack for Amazon Picking Challenge 2015.

Setup Berkeley dataset for object recognition

Download berkeley dataset

To download the dataset:

python scripts/download_dataset.py -O berkeley_dataset

Apply mask image for object images

To get mask applied images:

python scripts/create_mask_applied_dataset.py berkeley_dataset -O berkeley_dataset_mask_applied

ROS nodes

visualize_json.py

What is this?

Visualizes the json file which is the interface for Amazon Picking Challenge 2015.

_images/visualize_json.png
Subscribing Topic

None.

Publishing Topic
  • ~output (sensor_msgs/Image)

    Bin contents image.

Example
rosrun jsk_apc2015_common visualize_json.py $(rospack find jsk_2015_05_baxter_apc)/json/layout_1.json
rosrun image_view image_view image:=/visualize_json/output

Python Library

jsk_2015_05_baxter_apc

jsk_2015_05_baxter_apc is a ROS package for the Amazon Picking Challenge in May 2015.

Demonstrate APC2015 on Real World

Real world demonstration for APC2015 can be done on baxter@sheeta.jsk.imi.i.u-tokyo.ac.jp.

  • Prepare json.

  • Setup objects in Kiva.

baxter@sheeta $ roscd jsk_apc && git checkout 0.2.2

baxter@sheeta $ roslaunch jsk_2015_05_baxter_apc baxter.launch
baxter@sheeta $ roslaunch jsk_2015_05_baxter_apc setup_torso.launch

baxter@sheeta $ ssh doura
baxter@doura $ tmux
baxter@doura $ roscd jsk_apc && git checkout 0.2.2
# on a tmux session
baxter@doura $ sudo -s  # necessary for launch kinect2 with ssh login
baxter@doura $ roslaunch jsk_2015_05_baxter_apc setup_head.launch
# detach from the tmux session and logout from doura here

baxter@sheeta $ roslaunch jsk_2015_05_baxter_apc main.launch json:=$(rospack find jsk_2015_05_baxter_apc)/json/layout_12.json

# optional visualization
$ rviz -d $(rospack find jsk_2015_05_baxter_apc)/rvizconfig/segmentation.rviz  # check object segmentation in each bin
$ rviz -d $(rospack find jsk_2015_05_baxter_apc)/rvizconfig/real_demo.rviz  # visualization for demo

https://github.com/start-jsk/jsk_apc/blob/master/jsk_2015_05_baxter_apc/json/layout_12.json

_images/setup_demo_1.jpg _images/real_world.jpg _images/real_world_rviz.jpg Amazon Picking Challenge 2015 Real World Demonstration

Demonstrate APC2015 on Simulation

The APC2015 demonstration can also be run in simulation on any computer with ROS indigo.

Installation

# setup catkin
mkdir -p ~/ros/jsk_apc2015_sim && cd ~/ros/jsk_apc2015_sim
catkin init
# setup repos
cd ~/ros/jsk_apc2015_sim/src
wstool init
wstool merge https://raw.githubusercontent.com/start-jsk/jsk_apc/master/sim.rosinstall.$ROS_DISTRO
wstool update -j8
# install depends
rosdep install --from-path . -r -y
# build repos
catkin build -iv -j8

Demo

roslaunch jsk_2015_05_baxter_apc baxter_sim.launch
roslaunch jsk_2015_05_baxter_apc setup_head.launch gazebo:=true
roslaunch jsk_2015_05_baxter_apc main.launch json:=$(rospack find jsk_apc2015_common)/json/f2.json

# optional visualization
rviz -d $(rospack find jsk_2015_05_baxter_apc)/rvizconfig/gazebo_demo.rviz
Amazon Picking Challenge 2015 Gazebo Simulation

ROS nodes

bin_contents.py

What is this?

Publishes the contents of the bins of the Kiva Pod, whose layout is described in a json file for Amazon Picking Challenge 2015.

Subscribing Topic

None.

Publishing Topic
  • ~ (jsk_2015_05_baxter_apc/BinContentsArray)

    Bin contents.

  • ~bin_[a-l]_n_object (jsk_recognition_msgs/Int32Stamped)

    Number of objects in each bin.

Parameters
  • ~json (type: String, required)

    Path of json file for the challenge.

Example
rosrun jsk_2015_05_baxter_apc bin_contents.py _json:=$(rospack find jsk_2015_05_baxter_apc)/json/apc2015_layout_1.json
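What bin_contents.py computes from the layout json can be sketched minimally as below. The `bin_contents` key follows the APC2015 interface json format; the items and counts here are illustrative only:

```python
import json

# Minimal sketch of the per-bin object count published on
# ~bin_[a-l]_n_object (item names are illustrative).
layout = json.loads('''
{"bin_contents": {"bin_A": ["oreo_mega_stuf"],
                  "bin_B": ["crayola_64_ct", "expo_dry_erase_board_eraser"]}}
''')

for bin_name, contents in sorted(layout["bin_contents"].items()):
    # One message per bin would carry this count.
    print(bin_name, len(contents))
```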

boost_object_recognition.py

What is this?

This node considers each classifier’s weight for the object recognition result.

Subscribing Topic
  • ~input/bof (jsk_recognition_msgs/ClassificationResult)

    Result of classification with Bag of Features.

  • ~input/ch (jsk_recognition_msgs/ClassificationResult)

    Result of classification with Color Histogram.

Publishing Topic
  • ~output (jsk_recognition_msgs/ClassificationResult)

    Result of boosting.

Parameters
  • ~weight (type: String, required)

    Path to yaml file for boosting weight.

  • ~queue_size (type: Int, default: 100)

    Queue size for subscriptions.

  • ~approximate_sync (type: Bool, default: false)

    Synchronization policy for message_filters.
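The boosting step can be sketched as a weighted sum of per-class scores from the two classifiers. The weights and scores below are made-up illustrative numbers; the actual node reads its weights from the ~weight yaml file:

```python
# Combine per-class probabilities from the Bag of Features and
# Color Histogram classifiers with per-classifier weights (sketch).
def boost(bof_proba, ch_proba, w_bof=0.6, w_ch=0.4):
    return [w_bof * b + w_ch * c for b, c in zip(bof_proba, ch_proba)]

bof = [0.7, 0.2, 0.1]   # P(class) from Bag of Features (illustrative)
ch = [0.3, 0.5, 0.2]    # P(class) from Color Histogram (illustrative)
combined = boost(bof, ch)
best = max(range(len(combined)), key=combined.__getitem__)
print(best)  # 0
```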

color_object_matcher.py

What is this?

Classifies images for object recognition using color histogram features.

Subscribing Topic
  • ~input (sensor_msgs/Image)

    Input image.

  • ~input/label (sensor_msgs/Image)

    Input label image which describes the regions of interest.

Publishing Topic
  • ~output (jsk_recognition_msgs/ClassificationResult)

    Classification result of the input image for an object set.

Parameters
  • ~queue_size (type: Int, default: 100)

    Queue size for subscriptions.

  • ~approximate_sync (type: Bool, default: false)

    Synchronization policy for message_filters.

euclid_k_clustering.py

What is this?

This node dynamically reconfigures the ~tolerance rosparam of jsk_pcl_ros/euclid_clustering (jsk_pcl/EuclideanClustering) considering the number of objects in the region of interest.

Subscribing Topic
  • ~k_cluster (jsk_recognition_msgs/Int32Stamped)

    Expected number of clusters.

  • ~{node}/cluster_num (jsk_recognition_msgs/Int32Stamped)

    Actual number of clusters. {node} is the value of rosparam ~node. See Parameters for detail.

Publishing Topic

None.

Parameters
  • ~node (type: String, required)

    Node name of jsk_pcl_ros/euclid_clustering.

  • ~default_tolerance (type: Float, required)

    Default value of tolerance.

  • ~reconfig_eps (type: Float, default: 0.2)

    Fractional change applied to the tolerance at each reconfiguration, relative to its current value.

  • ~reconfig_n_limit (type: Int, default 10)

    Maximum number of reconfiguration attempts.
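One plausible reconfiguration step might look like the following. This is an assumed sketch; the actual update rule in euclid_k_clustering.py may differ:

```python
def reconfigure_tolerance(tolerance, expected_k, actual_k, eps=0.2):
    """One reconfiguration step (assumed logic): too many clusters means
    the tolerance is too tight, so grow it to merge nearby points; too
    few clusters means it is too loose, so shrink it to split them."""
    if actual_k > expected_k:
        return tolerance * (1.0 + eps)
    elif actual_k < expected_k:
        return tolerance * (1.0 - eps)
    return tolerance

tol = 0.02  # default_tolerance (illustrative value)
tol = reconfigure_tolerance(tol, expected_k=3, actual_k=5)
print(round(tol, 4))  # 0.024
```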

initialize_baxter.py

What is this?

This node performs the setup below after waiting for the /clock and /robot/state topics.

  • Enables the robot.

  • Launches baxter_interface/joint_trajectory_action_server.py.

  • Launches baxter_interface/head_action_server.py.

Subscribing Topic

None.

Publishing Topic

None.

main.l

What is this?

Main program to activate the robot: it subscribes to recognition results and performs manipulation.

Subscribing/Publishing Topic

Subscriptions and publications are done in the files below; see them for details:

  • pr2eus/robot-interface.l

  • baxtereus/baxter-interface.l

  • jsk_2015_05_baxter_apc/euslisp/jsk_2015_05_baxter_apc/baxter-interface.l.

Parameters
  • ~[left,right]_hand/state (type: String, Do not set manually)

  • ~[left,right]_hand/target_bin (type: String, Do not set manually)

    The state and target bin of the hand. Mainly used for parallel activation of dual arms.

work_order.py

What is this?

Publishes the picking order for each arm of Baxter robot in Amazon Picking Challenge 2015.

Rules
  1. It abandons bins whose target object is listed below:

  • genuine_joe_plastic_stir_sticks (big & heavy)

  • cheezit_big_original (big & heavy)

  • rolodex_jumbo_pencil_cup (many holes)

  • champion_copper_plus_spark_plug (small)

  • oreo_mega_stuf (heavy)

_images/abandon_objects.png
  2. Left bins are assigned to the left arm, and right bins to the right arm.

  3. Center bins are assigned to either arm (left or right).
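The rules above can be sketched as a small helper. This is hypothetical, not the actual work_order.py logic; in particular, the mapping from bin name to shelf column is an assumption (bins a-l laid out in a 4x3 grid):

```python
# Objects that are abandoned per the rules above.
ABANDON = {"genuine_joe_plastic_stir_sticks", "cheezit_big_original",
           "rolodex_jumbo_pencil_cup", "champion_copper_plus_spark_plug",
           "oreo_mega_stuf"}

def assign_arm(bin_name, target_object):
    """Return which arm handles the bin, or None if abandoned (sketch)."""
    if target_object in ABANDON:
        return None
    col = "abcdefghijkl".index(bin_name) % 3  # 0: left, 1: center, 2: right
    if col == 0:
        return "left"
    elif col == 2:
        return "right"
    return "left"  # center bins may go to either arm; pick left here

print(assign_arm("a", "crayola_64_ct"))   # left
print(assign_arm("c", "oreo_mega_stuf"))  # None (abandoned object)
```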

Subscribing Topic

None.

Publishing Topic
  • ~left_hand, ~right_hand (jsk_2015_05_baxter_apc/WorkOrderArray)

    Picking orders for each arm.

Parameters
  • ~json (type: String, required)

    Path of json file for the challenge.

Example
rosrun jsk_2015_05_baxter_apc work_order.py _json:=$(rospack find jsk_2015_05_baxter_apc)/json/apc2015_layout_1.json

Euslisp Code

Contents:

Reachable Space of Customized Baxter

roscd jsk_2015_05_baxter_apc/euslisp/examples
roseus baxter-reachable-space.l
_images/baxter_reachable_0.png _images/baxter_reachable_1.png _images/baxter_reachable_2.png

Library:

euslisp/jsk_2015_05_baxter_apc/baxter-interface.l

jsk_2015_05_baxter_apc::baxter-interface
  • :super baxter-interface

  • :slots *tfl *bin-boxes *objects-in-bin-boxes *objects-in-bin-coms _bin-coords-list

:wait-for-user-input-to-start arm

:init &rest args

:start-grasp &optional (arm :arms)

:stop-grasp &optional (arm :arms)

:graspingp arm

:opposite-arm arm

:need-to-wait-opposite-arm arm

:arm-symbol2str arm

:arm-potentio-vector arm

:tf-pose->coords frame_id pose

:fold-pose-back &optional (arm :arms)

:detect-target-object-in-bin target-object bin

:recognize-bin-boxes &key (stamp (ros::time-now))

:bbox->cube bbox

:visualize-bins

:visualize-objects

:recognize-grasp-coords-list bin &key (stamp (ros::time-now))

:recognize-objects-in-bin bin &key (stamp (ros::time-now)) (timeout 10)

:recognize-object-in-hand arm &key (stamp (ros::time-now)) (timeout)

:verify-object arm object-name &key (stamp (ros::time-now))

:try-to-pick-in-bin arm bin

:try-to-pick-object-solidity arm bin &key (offset #f(0.0 0.0 0.0))

:try-to-pick-object arm bin &key (object-index 0) (offset #f(0.0 0.0 0.0))

:pick-object arm bin &key (object-index 0) (n-trial 1) (n-trial-same-pos 1) (do-stop-grasp nil)

:send-av &optional (tm 3000)

:force-to-reach-goal &key (arm :arms) (threshold 5) (stop 10)

:ik->bin-entrance arm bin &key (offset #f(0.0 0.0 0.0))

:move-arm-body->bin arm bin

:move-arm-body->order-bin arm

:spin-off-by-wrist arm &key (times 10)

:move-arm-body->head-view-point arm

:place-object arm

:get-work-orders arm

:get-next-work-order arm current-order

:get-bin-contents bin

:real-sim-end-coords-diff arm

jsk_2015_05_baxter_apc::baxter-init &key (ctype :default-controller)

euslisp/jsk_2015_05_baxter_apc/util.l

m->mm m

argmax fvec

vec-list-max vec-list &key axis

str2symbol str

symbol2str *_symbol*

ros::advertise-if-yet name data-class queue-size

underscore-to-space str_

which-bin-region bin

arm-to-ctype arm

arm-to-str arm

opposite-arm arm

get-object-size object-name

zip a b

dict zipped

Shared Files

READONLY: https://drive.google.com/drive/u/1/folders/0B9P1L--7Wd2vS1pjRENUMlFPYlU

A Google Drive folder is shared; it contains files such as logs and datasets.

Testing

catkin run_tests jsk_2015_05_baxter_apc --no-deps

jsk_apc2016_common

jsk_apc2016_common is a common stack for Amazon Picking Challenge 2016.

Setup RBO Segmenter for segmentation in bin

Submodule update

To initialize RBO submodule:

git submodule init
git submodule update --init --recursive

Setup RBO Segmenter

To set up RBO segmenter:

bash scripts/setup_rbo_segmenter.bash

ROS nodes

visualize_stow_json.py

What is this?

Visualizes the json file for the Stow Task, which is the interface for Amazon Picking Challenge 2016.

_images/visualize_stow_json.png
Subscribing Topic

None.

Publishing Topic
  • ~output (sensor_msgs/Image)

    Bin contents image.

Example
rosrun jsk_apc2016_common visualize_stow_json.py $(rospack find jsk_2016_01_baxter_apc)/json/apc_stow.json
rosrun image_view image_view image:=/visualize_stow_json/output

Python Library

jsk_2016_01_baxter_apc

jsk_2016_01_baxter_apc is a ROS package for the Amazon Picking Challenge in June 2016.

APC2016 Pick Task Trial on Real World

Pick task trial on real world for APC2016 can be done on baxter@sheeta.

  • Prepare json (ex. apc_pick_task_robocup2016.json).

  • Setup objects in Kiva.

# Launch nodes to control robot.
baxter@sheeta $ roslaunch jsk_2016_01_baxter_apc baxter.launch

# Launch nodes in recognition pipeline for pick task.
baxter@sheeta $ roslaunch jsk_2016_01_baxter_apc setup_for_pick.launch

# optional: Check sanity.
baxter@sheeta $ rosrun jsk_2016_01_baxter_apc check_sanity_setup_for_pick

# Run task!
baxter@sheeta $ roslaunch jsk_2016_01_baxter_apc main.launch json:=$(rospack find jsk_apc2016_common)/json/apc_pick_task_robocup2016.json
# even if you pass rviz:=false to main.launch, you need to launch yes_no_button.
baxter@sheeta $ rosrun jsk_2016_01_baxter_apc yes_no_button

Above commands are automated with a single command below:

baxter@sheeta $ tmuxinator start apc

APC2016 Pick Task Trial on Real World with Right Gripper-v5

Pick task trial on real world with right gripper-v5 for APC2016 can be done on baxter@sheeta.

  • Install right gripper-v5 in Baxter

  • Prepare json (ex. test_gripper_v5.json).

  • Setup objects in Kiva.

# Launch nodes to control robot.
baxter@sheeta $ roslaunch jsk_2016_01_baxter_apc baxterrgv5.launch

# Launch nodes in recognition pipeline for pick task.
baxter@sheeta $ roslaunch jsk_2016_01_baxter_apc setup_for_pick.launch

# optional: Check sanity.
baxter@sheeta $ rosrun jsk_2016_01_baxter_apc check_sanity_setup_for_pick

# Run task!
baxter@sheeta $ roslaunch jsk_2016_01_baxter_apc main_rgv5.launch json:=$(rospack find jsk_apc2016_common)/json/test_gripper_v5.json
# even if you pass rviz:=false to main.launch, you need to launch yes_no_button.
baxter@sheeta $ rosrun jsk_2016_01_baxter_apc yes_no_button

APC2016 Stow Task Trial on Real World

Stow task trial on real world for APC2016 can be done on baxter@satan and baxter@eyelash.

  • Prepare json.

  • Setup objects in Kiva.

  • Setup objects in Tote.

# use satan
baxter@satan $ roscd jsk_apc && git fetch origin
baxter@satan $ git checkout 1.5.0
baxter@satan $ roslaunch jsk_2016_01_baxter_apc baxter.launch
baxter@satan $ roslaunch jsk_2016_01_baxter_apc setup_torso.launch use_stow:=true

# use eyelash
baxter@eyelash $ roscd jsk_apc && git fetch origin
baxter@eyelash $ git checkout 1.5.0
baxter@eyelash $ roslaunch jsk_2016_01_baxter_apc setup_astra.launch use_stow:=true

# use satan
baxter@satan $ roslaunch jsk_2016_01_baxter_apc main_stow.launch json:=$(rospack find jsk_apc2016_common)/json/stow_layout_1.json

Installation

See Installation.

System

_images/system.png

MasterPiece

Well-done Images and Videos

READ/WRITE: https://drive.google.com/drive/u/1/folders/0B5DV6gwLHtyJS2NKU3J4WXo2TDA

Logs

2016-03-07

What you did?
  • Try to achieve 30 points in apc2016 rules (but items are from apc2015)

What is issue?
What you think/feel?
  • Want to replace the pressure sensor

  • Want IK to be solved more reliably

  • Improve the segmentation

  • Attach a recognition device to the hand

2016-03-10

What you did?
  • Try to achieve 30 points in apc2016 rules (but items are from apc2015)

What is issue?
What you think/feel?
  • Want to run collision checking to find more bins where the left and right hands can work simultaneously

2016-03-12

What you did?
  • Try to achieve 30 points in apc2016 rules (but items are from apc2015)

What is issue?
  • Recognition of safety glasses: just adding training data did not improve performance.

What you think/feel?
  • How about trying other feature extraction methods such as HOG?

Log data
  • None

2016-03-21

What you did?
  • new gripper test

2016-04-04

What you did?
  • try to pick apc2016 items with apc2015 gripper

2016-04-06

What you did?
  • overall trial to achieve 30 points under apc2016 rules

2016-04-11

THIS IS SAMPLE

What you did?
  • DID_0

  • DID_1

What you think/feel?
  • Recognition is BAD!! OMG!

2016-04-12

Evaluation of CREATIVE camera and DepthSense camera
  • compare CREATIVE camera and DepthSense camera

Conclusion

Almost the same performance, but the DepthSense camera behaves a little oddly.

CREATIVE camera
Specs
  • Must be at least 15 cm away from the subject

  • depth image: /softkinetic_camera/depth/image_raw

    • height: 240

    • width: 320

    • encoding: 32FC1

  • color image /softkinetic_camera/rgb/image_color

    • height: 720

    • width: 1280

    • encoding: bgr8

  • points xyzrgb /softkinetic_camera/depth/points

    • height: 240

    • width: 320

Issues
Experiment

Setup

_images/creative_setup.jpg

Topic hz

Grid table:

| topic                               | rate   | min_delta | max_delta | std_dev |
|-------------------------------------|--------|-----------|-----------|---------|
| /softkinetic_camera/depth/points    | 30.053 | 0.021     | 0.043     | 0.005   |
| /softkinetic_camera/depth/image_raw | 30.136 | 0.018     | 0.044     | 0.006   |
| /softkinetic_camera/rgb/image_color | 25.250 | 0.058     | 0.061     | 0.008   |

View

  • front view

_images/creative_front.png
  • above view

_images/creative_above.png
DepthSense
Specs
  • Must be at least 15 cm away from the subject

  • points xyzrgb /softkinetic_camera/depth/points

    • height: 240

    • width: 320

    • encoding: 32FC1

  • depth image /softkinetic_camera/depth/image_raw

    • height: 240

    • width: 320

    • encoding: 32FC1

  • color image /softkinetic_camera/rgb/image_color

    • height: 720

    • width: 1280

    • encoding: bgr8

Issues
  • the colors of the color image /softkinetic_camera/rgb/image_color are yellowish overall

  • a little bit noisy?

Experiment

Setup

_images/depthsense_setup.jpg

Topic hz

Grid table:

| topic                               | rate   | min_delta | max_delta | std_dev |
|-------------------------------------|--------|-----------|-----------|---------|
| /softkinetic_camera/depth/points    | 30.164 | 0.017     | 0.043     | 0.006   |
| /softkinetic_camera/depth/image_raw | 30.136 | 0.018     | 0.045     | 0.005   |
| /softkinetic_camera/rgb/image_color | 24.960 | 0.032     | 0.051     | 0.005   |

View

  • front view

_images/depthsense_front.png
  • above view

_images/depthsense_above.png

2016-04-14

Install DepthSense cameras on both arms

What you did?
  • front view

_images/depthsense_both_arm.png
  • point clouds with kiva pod

_images/depthsense_kiva.png
What is issue?
  • Point clouds are noisy

2016-04-21

The day before, we achieved start-jsk/jsk_apc#1302, and with start-jsk/jsk_apc#1308 the new gripper's servo can now be driven from main.launch. While fixing bugs there, we also made the grasp detection from the pressure sensor inside the new gripper available to main.l. In addition, running last year's program with the new gripper caused the top of the gripper to hit the top of the bin; this is being fixed.

What you did?
  • Bug fix

    The Info messages of the gripper joint trajectory action were hard to read and the program itself was not very extensible, so this was fixed
  • New functions

    Judge whether an object has been grasped using the pressure sensor built into the new gripper, instead of the one built into the vacuum cleaner
  • Work in progress

    Running last year's program with the new gripper caused the top of the gripper to hit the top of the bin (movie1). This is being fixed.
    • In movie2 and movie3 the gripper no longer hits the bin, but IK failed when trying to take an object out.

    • In movie4, IK can now be solved.

    • In the current state, flat items can be picked without hitting the bin.

What is issue?
Recognition fails. On the new gripper side, reading the launch file revealed that the failures begin at AttentionClipper.
  • Flat items can be picked, but for tall items there is a high risk that the new gripper hits the shelf when IK is solved toward the item. A program that picks items without bending the gripper servo needs to be written.

2016-04-22-gripper

Continuing from yesterday, work is in progress on start-jsk/jsk_apc#1321. The robot can now decide to pick reasonably tall items with the gripper servo extended, and to pick short items or items leaning against the shelf wall with the gripper servo bent and the wrist rotated to an appropriate angle.

What you did?
  • Work in progress

    Movements can now be switched for tall items depending on their position.
    • When the program was first written, a bug was introduced where IK failed, as in movie1 and movie2.

    • In movie3, tall items near the center of the bin could be approached with the gripper servo extended, but this introduced a bug where IK failed for short items.

    • In movie4, IK no longer fails for short items.

    • In movie5, items near the bin's side wall could be approached by rotating the wrist joint, but the gripper interfered with the opposite side wall of the bin.

    • In movie6, the interference seen in movie5 was resolved.

    • In movie7, grasping a thick item was attempted. It could be grasped, but the suction peeled off when the gripper joint was bent while pulling the item out.

What is issue?
When the emergency stop switch is pressed, Baxter's joints can no longer be driven actively and only the electromagnetic brakes engage, yet the gripper's servo motor keeps its torque. If the gripper catches on a bin after an emergency stop, a large load can be put on the servo motor, which is dangerous. Addressing this is the top priority.
  • As movie7 shows, after approaching with the gripper joint straight, the joint should only be bent once the item has been pulled out of the bin. A similar problem is shown in movie6: after picking an item near the bin's side wall, rotating the wrist while the gripper is still inside the bin raises the risk that the item hits the bin and the suction peels off. The approach posture should be kept as much as possible until the item is out of the bin.

  • The gripper servo's torque is currently always on, but we know that if the torque is cut while an item is grasped, the gripper's joint angle conforms to gravity and to contact between the item and the bin. We would like to exploit this.

  • As is most evident in movie4, the gripper is not brought accurately to the item's position, probably because the item's point cloud is not captured well enough. Once the RealSense Camera mounted on the wrist can be used effectively, this should improve.

  • In movie4, the gripper's power jack came unplugged and had to be reinserted, and the gripper caught on the vacuum cleaner hose, forcing an emergency stop. The jack problem has been addressed, but cable and hose routing still needs thought.

2016-04-25-26-gripper

We judged start-jsk/jsk_apc#1321 complete as of 4/22 and had it merged. However, IK turned out to be unsolvable under certain conditions, so this is being fixed in start-jsk/jsk_apc#1345. We also revised the Arduino firmware and added a new node as safety measures to make the robot harder to break.

What you did?
What is issue?
Being fixed.
  • In movie3, the robot approached a thinnish item placed near the center with the gripper extended. This is unavoidable under the current decision criteria, but we would like to fix it.

2016-04-27-gripper

On 4/26, in start-jsk/jsk_apc#1345, we hypothesized that rotating the gripper's servo might be better, but it failed, so we reverted to rotating the wrist. We also ported euslisp/examples/picking-with-clustering.l from jsk_2015_05_baxter_apc to jsk_2016_01_baxter_apc, so the sequence of clustering within a single bin, grasping an item, and putting it into the Order Bin can now be tried on its own.

What you did?
  • Bug fix

    Fixes start-jsk/jsk_apc#1341, which occurred in the state of start-jsk/jsk_apc#1321. The gripper servo rotation idea from 4/26 was tried, but as movie1 shows it did not affect the posture produced by IK, so it was abandoned. In the end, the wrist-rotation behavior was restored.
  • New functions

    Ports euslisp/examples/picking-with-clustering.l from jsk_2015_05_baxter_apc to jsk_2016_01_baxter_apc. The sequence of clustering within a single bin, grasping an item, and putting it into the Order Bin can now be tried on its own; the motion looks like movie2.
What is issue?
In movie3, an upright item was knocked over when the gripper approached it while extending. This is because the gripper is brought to the bin entrance bent and only then extended; this motion needs fixing.
In movie4 and movie5, the gripper caught on the bin when approaching Bin i. This is because the motion offsets had been tuned only for Bin c; the offsets need to change with the bin size.
  • In movie6 and movie8, approaching Bin e or l produced IK postures with the gripper bent about 45 degrees. In such a posture, items deep in the bin cannot be reached, and in movie8 the vacuum hose got tangled as well. Perhaps the gripper joint should not be used in IK.

  • movie7 shows the suction pad crushed when the gripper was pulled out. The tip of the pad is drawn inward, which tends to happen when approaching a flat-surfaced item slowly. Replacing the suction pad might be better.

2016-04-28-30-gripper

To fix the problem from 4/27 of IK producing postures with the gripper bent about 45 degrees, we set the gripper joint's weight to 0 when solving IK so that IK does not move it. Testing all bins reachable by the right hand then revealed a problem where the arm caught on Baxter's body when approaching Bin e, which we fixed.

What you did?
  • Bug fix

    IKを解く際のグリッパー関節の重みを0にし、IKでは動かないようにする。グリッパー関節を曲げる必要がある場合は、 (send *baxter* :rotate-gripper :rarm 90 :relative nil) のようにスクリプトから直接回転させる。この際、IKを解く際の中継点であるfold-pose-upperに変更を加えることで、 movie9 のように、手首を回転させなくてもBin cでのIKが解けるようになったので、手首を回転させる行は削除した。
    #1362の状態で実験してみた所、Bin e以外のBinに対しては、Binの前に手を持っていくまでは無理のない動作を行うことができた。掃除機のホースを巻き込むこともなくなった。
    しかし、Bin eに対しては、 movie10 のように、アームがBaxter本体と引っかかってしまった。これは右アーム固有の問題なのか疑問に思ったので、左アームでも実験してみた所、 movie11 のように似たような動きでBaxter本体と接触した。
    これは、fold-pose-backからBinに至るまでの経由点であるavoid-shelf-pose-eが悪いのではないかと判断し、変更を行った所、 movie18 のようにグリッパーをBinに持って行くまでは安定して動けるようになったが、グリッパーを引き出す所でBaxter本体とグリッパーがひっかかってしまった。
    そこで、グリッパーを引き出す距離を短くした所、 movie19 のように安定した動きができるようになった。
  • New functions

    Download the rosbag files stored on Google Drive and test whether :ik->bin-entrance can be solved for each bin and each arm. Running this test lets us confirm on travis that we have not fallen into a state where IK cannot be solved.
    Currently the rosbag files are one minute long, and an error occurs if :recognize-bin-boxes does not finish within one minute of starting playback. This may need to be addressed.
What is issue?
  • In movie14, approaching Bin b involves a rather large motion before reaching the bin entrance. This is probably because fixing fold-pose-upper in start-jsk/jsk_apc#1362 also changed the posture produced by (send *baxter* :ik->bin-entrance :rarm :b), but we forgot to compare with the left hand, so this is not yet certain. The comparison needs to be done.

  • start-jsk/jsk_apc#1383

Watching this session's movies, the gripper's position is often too low for the bin and it frequently gets caught. This is the same problem noted in the 4/27 log: the motion offsets are tuned only for Bin c. This will be fixed.

2016-05-02-18-gripper

The PRs created between 5/2 and 5/18, and which issues each one solves, are explained with videos. For detailed explanations of each PR and issue, see the linked pages.

What you did?
What is issue?

2016-05-09

Segmentation in bin with kinect2_torso

What you did?
  • Run main.launch with kinect2_torso

  • Run segmentation_in_bin.launch with kinect2_torso

  • Run segmentation and try to pick object from bin l

  • Succeeded to pick object

What is issue?
  • segmentation program takes almost 1 minute with kinect2

What you think/feel?
  • I want to do this with softkinetic camera

2016-05-17

Segmentation in bin with softkinetic camera

What you did?
  • resize output mask image

  • fix camera info

  • add robot self filter

  • get proper centroid position

Results
_images/sib_filtered.png
What you think/feel?
  • it works well, I think :)

2016-05-21

Run main program with RBO segmentation

What you did?
  • Run main.launch with RBO segmentation algorithms

What is issue?
  • no issues for milestone 0.3.0

What you think/feel?
  • Segmentation and recognition are good, but grasp planning is not.

  • milestone 0.3.0 finished

2016-05-27

Get Bounding Box from SIB

What you did?
  • Use ClusterPointIndicesDecomposer to get Bounding Box

_images/sib_bbox.png
What is issue?
What you think/feel?
  • segmentation really depends on object kind, of course.

Shared Files

READ/WRITE: https://drive.google.com/drive/u/1/folders/0B9P1L--7Wd2vLXo1TGVYLVh3aE0

A Google Drive folder is shared; it contains files such as logs and datasets.

Testing

catkin run_tests jsk_2016_01_baxter_apc --no-deps

jsk_arc2017_common

jsk_arc2017_common is a common stack for the Amazon Robotics Challenge 2017.

ROS nodes

candidates_publisher.py

What is this?

Publishes label candidates from a JSON file.

Subscribing topics

  • ~input/json_dir (std_msgs/String)

    JSON file directory

Publishing topics

  • ~output/candidates (jsk_recognition_msgs/LabelArray)

    Label candidates in target location

Parameters

  • ~target_location (String, required)

    Target location name. (tote, bin_A, bin_B or bin_C)

    You can update by dynamic_reconfigure.

  • ~label_names (List of String, required)

    List of label names

Sample

roslaunch jsk_arc2017_common sample_candidates_publisher.launch

json_saver.py

What is this?

Updates and saves the item location JSON file during the tasks.

Services

  • ~update_json (jsk_arc2017_common/UpdateJSON)

    Update and save item_location_file.json

$ rossrv show jsk_arc2017_common/UpdateJSON
string item
string src
string dst
---
bool updated

  • ~save_json (std_srvs/Trigger)

    Save item_location_file.json.
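What the ~update_json service presumably does can be sketched like this. The field names follow the UpdateJSON srv definition above, but the data layout and helper are assumptions for illustration, not the actual node code:

```python
# Hypothetical item-location structure (the real json layout may differ).
item_location = {"bins": {"bin_A": ["item_x"], "bin_B": []}, "tote": []}

def update_json(item, src, dst):
    """Move `item` from location `src` to `dst`; return whether updated."""
    locations = dict(item_location["bins"], tote=item_location["tote"])
    if item not in locations[src]:
        return False
    locations[src].remove(item)
    locations[dst].append(item)
    return True  # in the real node, the json file is saved at this point

updated = update_json("item_x", "bin_A", "bin_B")
print(updated, item_location["bins"])  # True {'bin_A': [], 'bin_B': ['item_x']}
```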

Parameters

  • ~json_dir (String, required)

    Directory where initial json files are located

  • ~output_dir (String, required)

    Directory where output json files will be located

Sample

roslaunch jsk_arc2017_common sample_json_saver.launch

visualize_json.py

What is this?

Visualizes item_location_file.json and order_file.json.

Item Location: https://user-images.githubusercontent.com/4310419/27720914-d5e07802-5d97-11e7-881e-9ee2ebd2c888.png

Order: https://user-images.githubusercontent.com/4310419/27720897-c5344718-5d97-11e7-9e50-fbcbbd622f47.png

## Subscribing topics

  • ~input/json_dir (std_msgs/String)

    Where json files are located.

## Publishing topics

  • ~output/item_location_viz (sensor_msgs/Image)

    Visualization of item_location_file.json. This is published if item_location is included in the ~types rosparam and ~json_dir/item_location_file.json can be read.

  • ~output/order_viz (sensor_msgs/Image)

    Visualization of order_file.json. This is published if order is included in the ~types rosparam and ~json_dir/order_file.json can be read.

## Parameters

  • ~types (List of string, required)

    item_location and/or order.
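Before drawing anything, the node essentially turns each order into a line of text. A minimal sketch in plain Python, assuming a simplified order-file layout (the structure shown is an assumption; the real node renders such lines onto a sensor_msgs/Image):

```python
import json

# Hypothetical order_file.json content for illustration.
order_json = '{"orders": [{"size_id": "A3", "contents": ["balloons", "band_aid_tape"]}]}'
order = json.loads(order_json)
lines = ["box %s: %s" % (o["size_id"], ", ".join(o["contents"]))
         for o in order["orders"]]
print("\n".join(lines))
# -> box A3: balloons, band_aid_tape
```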

## Sample

```bash
roslaunch jsk_arc2017_common sample_visualize_json.launch
```

# work_order_publisher.py

## What is this?

Publish optimized work orders of the tasks.

## Subscribing topics

None

## Publishing topics

  • ~left_hand (jsk_arc2017_common/WorkOrderArray)

    Optimized work orders for left hand.

  • ~right_hand (jsk_arc2017_common/WorkOrderArray)

    Optimized work orders for right hand.

## Parameters

  • ~json_dir (String, required)

    Directory where initial json files are located

  • ~rate (Int, default: 1)

    Hz of publishing topics
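How "optimized" orders might be produced can be illustrated with a simple heuristic sketch. This is hypothetical: the node's real cost model is not reproduced here, and `make_work_orders` is an invented name:

```python
def make_work_orders(order_items, item_location):
    """Sort (item, bin) pairs so items in less cluttered bins are tried first.

    A plausible heuristic for illustration only; the actual optimization
    in work_order_publisher.py may differ.
    """
    def clutter(entry):
        _item, bin_name = entry
        return len(item_location[bin_name])  # more items in the bin = harder pick
    return sorted(order_items, key=clutter)

item_location = {"bin_A": ["a", "b", "c"], "bin_B": ["d"]}
orders = [("a", "bin_A"), ("d", "bin_B")]
print(make_work_orders(orders, item_location))
# -> [('d', 'bin_B'), ('a', 'bin_A')]
```

In the real system the resulting lists are published separately for the left and right hands.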

## Sample

```bash
roslaunch jsk_arc2017_common sample_work_order_publisher.launch
```

jsk_arc2017_baxter

jsk_arc2017_baxter is a ROS package for the Amazon Robotics Challenge held in July 2017.

ARC2017 Pick Task Trial on Real World

A pick task trial in the real world for ARC2017 can be run on baxter@baxter-c1.

  • Prepare the JSON files.

  • Set objects in the shelf.

# Launch nodes to control robot.
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter baxter.launch

# Launch nodes in recognition pipeline for pick task.
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter setup_for_pick.launch

# optional: Check sanity.
baxter@baxter-c1 $ rosrun jsk_2016_01_baxter_apc check_sanity_setup_for_pick

# Run task!
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter pick.launch json_dir:=$(rospack find jsk_arc2017_common)/data/json/sample_pick_task

With an Environment Imitating the ARC2017 Pick Competition

Preparation
baxter@baxter-c1 $ rosrun jsk_arc2017_common install_pick_re-experiment
Execution
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter baxter.launch
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter setup_for_pick.launch
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter pick.launch json_dir:=$HOME/data/arc2017/system_inputs_jsons/pick_re-experiment/json

ARC2017 Stow Task Trial on Real World

A stow task trial in the real world for ARC2017 can be run on baxter@baxter-c1.

  • Prepare the JSON files.

  • Set objects in the tote.

# Launch nodes to control robot.
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter baxter.launch pick:=false

# Launch nodes in recognition pipeline for stow task.
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter setup_for_stow.launch

# Run task!
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter stow.launch json_dir:=$(rospack find jsk_arc2017_common)/data/json/sample_stow_task

State Machine

Task states are controlled by smach. You can inspect the state machines with smach_viewer.
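smach itself is not needed to see the idea; a toy Python state machine shows how task states chain together. The state names below are illustrative, not the actual pick-task states:

```python
def run_state_machine(transitions, start, goal):
    """Follow `transitions` from `start` until `goal`, returning the visited path."""
    state, path = start, [start]
    while state != goal:
        state = transitions[state]  # each state names its successor
        path.append(state)
    return path

path = run_state_machine(
    {"set_target": "recognize", "recognize": "pick", "pick": "place", "place": "done"},
    "set_target", "done")
print(" -> ".join(path))
# -> set_target -> recognize -> pick -> place -> done
```

Real smach states additionally return outcomes (e.g. succeeded/failed), so each state can branch to different successors.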

Pick Task

_images/pick_state_machine.png
# Run pick task with smach_viewer
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter pick.launch json_dir:=$(rospack find jsk_arc2017_common)/data/json/sample_pick_task smach_viewer:=true

Stow Task

_images/stow_state_machine.png
# Run stow task with smach_viewer
baxter@baxter-c1 $ roslaunch jsk_arc2017_baxter stow.launch json_dir:=$(rospack find jsk_arc2017_common)/data/json/sample_stow_task smach_viewer:=true

Usage of Baxter

How to control baxter via roseus

preparation

Run the following in Emacs's shell environment (M-x shell).

roseus

When you start a new shell, DO NOT FORGET to run:

rossetip
rossetmaster baxter
source ~/catkin_ws/devel/setup.bash

Set up the baxter-interface

;; load modules
(load "package://jsk_arc2017_baxter/euslisp/lib/arc-interface.l")
;; create a robot model (*baxter*) and connect to the real robot (*ri*)
(jsk_arc2017_baxter::arc-init)
;; display the robot model
(objects (list *baxter*))

arc-interface function APIs

  • rotate the left (or right) gripper

    (send *baxter* :rotate-gripper :larm 90 :relative nil)
    
  • slide right gripper

    (send *baxter* :slide-gripper :rarm 50 :relative nil)
    
  • move fingers in right gripper

    (send *baxter* :hand :rarm :angle-vector #f(90 90))
    (send *baxter* :hand-grasp-pre-pose :rarm :opposed)
    (send *baxter* :hand-grasp-pose :rarm :cylindrical)
    
  • send initial pose for arc2017

    (send *baxter* :fold-pose-back)
    
  • send current joint angles of robot model to real robot

    (send *ri* :send-av)
    
  • send current hand joint angles of robot model to real robot

    (send *ri* :move-hand :rarm (send *baxter* :hand :rarm :angle-vector) 1000)
    

Gripper-v6 Setup

Adjust gravity compensation

Gripper-v6 is heavy (1.18 kg), so we need to adjust Baxter's gravity compensation.

As of 2017/06/17, roslaunch jsk_arc2017_baxter baxter.launch does this by running:

$ rostopic pub -1 /robot/end_effector/right_gripper/command baxter_core_msgs/EndEffectorCommand '{ id : 131073, command : "configure", args : "{ \"urdf\":{ \"name\": \"right_gripper_mass\", \"link\": [ { \"name\": \"right_gripper_mass\", \"inertial\": { \"mass\": { \"value\": 1.18 }, \"origin\": { \"xyz\": [0.0, 0.0, 0.15] } } } ] }}"}'

If you want to change the gripper, restore the original setting with:

$ rostopic pub -1 /robot/end_effector/right_gripper/command baxter_core_msgs/EndEffectorCommand '{ id : 131073, command : "configure", args : "{ \"urdf\":{ \"name\": \"right_gripper_mass\", \"link\": [ { \"name\": \"right_gripper_mass\", \"inertial\": { \"mass\": { \"value\": 0 }, \"origin\": { \"xyz\": [0.0, 0.0, 0.0] } } } ] }}"}'

More information about gripper customization of Baxter is available on the official page.

Distinguish the left DXHUB from the right one

The left and right grippers each have their own DXHUB for communicating with their motors. To distinguish the two DXHUBs and create the correct symbolic links (/dev/r_dxhub and /dev/l_dxhub), you have to change the configuration of the left DXHUB from the default. Because the configuration is stored in the EEPROM of the FTDI chip on the DXHUB, you have to write a new configuration to that EEPROM.

Method on Windows

Use FT_PROG to program the EEPROM. Install it and take the following steps to change the configuration.

  1. Connect the DXHUB to the PC with a USB cable. Don’t connect other USB devices. A power supply for the DXHUB is not needed

  2. Wait until device driver installation is finished

  3. Launch FT_PROG

  4. Click the loupe icon to scan devices

  5. Click the plus icon of “USB String Descriptors”

_images/ft_prog_before_edit.jpg
  6. Change “Product description” value to the value of ATTRS{product} in the udev rule

_images/ft_prog_after_edit.jpg
  7. Click the lightning icon

_images/ft_prog_before_prog.jpg
  8. Click “Program” to write the modified configuration to EEPROM

Experiments

# Create 2d dataset for object segmentation

## Collect raw data on shelf bins

```bash
roslaunch jsk_arc2017_baxter baxter.launch moveit:=false
roslaunch jsk_arc2017_baxter create_dataset2d_rawdata_main.launch
rosrun jsk_arc2017_common view_dataset2d.py ~/.ros/jsk_arc2017_baxter/create_dataset2d/right_hand
```

![](https://user-images.githubusercontent.com/4310419/28227820-ac70352c-6916-11e7-8a95-277f913cd9e9.gif)

## Collect raw data on tote bin

```bash
roslaunch jsk_arc2017_baxter baxter.launch moveit:=false pick:=false
roslaunch jsk_arc2017_baxter stereo_astra_hand.launch
roslaunch jsk_arc2017_baxter create_dataset2d_rawdata_main.launch box:=tote
rosrun jsk_arc2017_common view_dataset2d.py ~/.ros/jsk_arc2017_baxter/create_dataset2d/right_hand
```

## Annotate

```bash
dirname=raw_data_$(date +%Y%m%d_%H%M%S)
mv ~/.ros/jsk_arc2017_baxter/create_dataset2d/left_hand/* $dirname
mv ~/.ros/jsk_arc2017_baxter/create_dataset2d/right_hand/* $dirname
rosrun jsk_arc2017_common annotate_dataset2d.py $dirname
rosrun jsk_arc2017_common view_dataset2d.py $dirname
```

![](https://user-images.githubusercontent.com/4310419/28228470-a51d903c-6919-11e7-97b1-688f7f1ccf48.gif)

Shared data

Please upload shared data to Google Drive:

https://drive.google.com/open?id=0B5DV6gwLHtyJa2MxMDVGVDZ0aE0