This repository includes the packages and instructions to run the LASA Bimanual Motion planning architecture, initially developed for a bimanual zucchini-peeling task within the RoboHow project. It can be used for any bimanual task that involves coordination between two end-effectors while controlling for a desired Cartesian pose, force/torque, and stiffness.
OS: Ubuntu 14.04
ROS compatibility: Indigo
| Dependencies |
|---|
| kuka_interface_packages |
| kuka-rviz-simulation |
| net-ft-ros |
| coupled-dynamical-systems |
| state-transformers |
| bimanual-dynamical-system |
| fast-gmm |
### Real-time Execution of Bi-manual reaching motions:
##### Stream Robot data (joint states, Pose, FT, Stiff)
$ rosrun kuka_fri_bridge run_lwr.sh right
$ rosrun kuka_fri_bridge run_lwr.sh left
Follow the instructions in kuka_fri_bridge to set the control mode for each robot (joint impedance control, control = 1).
$ roslaunch kuka_lwr_bringup bimanual2_realtime.launch ft_sensors:=true
##### Cartesian-to-Joint/Joint-to-Cartesian Estimation
$ roslaunch state_transformers bimanual_joint_ctrls_real.launch
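The state_transformers node maps desired Cartesian poses to joint-space commands for the robots. As a rough illustration of this kind of Cartesian-to-joint mapping (not the actual implementation, which runs on the 7-DOF KUKA LWR), here is a toy damped-least-squares resolved-rate loop for a planar 2-link arm; all lengths and gains are made up:

```python
import math

# Hypothetical link lengths for a toy planar 2-link arm [m].
L1, L2 = 0.4, 0.3

def fk(q):
    """Forward kinematics: joint angles -> end-effector (x, y)."""
    x = L1 * math.cos(q[0]) + L2 * math.cos(q[0] + q[1])
    y = L1 * math.sin(q[0]) + L2 * math.sin(q[0] + q[1])
    return x, y

def jacobian(q):
    """2x2 analytic Jacobian of fk."""
    s1, c1 = math.sin(q[0]), math.cos(q[0])
    s12, c12 = math.sin(q[0] + q[1]), math.cos(q[0] + q[1])
    return ((-L1 * s1 - L2 * s12, -L2 * s12),
            (L1 * c1 + L2 * c12, L2 * c12))

def cart_to_joint_step(q, x_des, damping=1e-3):
    """One damped-least-squares step: dq = J^T (J J^T + damping*I)^-1 e."""
    x, y = fk(q)
    e = (x_des[0] - x, x_des[1] - y)
    (a, b), (c, d) = jacobian(q)
    # M = J J^T + damping * I (2x2, symmetric), inverted in closed form.
    m11 = a * a + b * b + damping
    m12 = a * c + b * d
    m22 = c * c + d * d + damping
    det = m11 * m22 - m12 * m12
    w1 = (m22 * e[0] - m12 * e[1]) / det
    w2 = (-m12 * e[0] + m11 * e[1]) / det
    # dq = J^T w
    return (q[0] + a * w1 + c * w2, q[1] + b * w1 + d * w2)

q = (0.3, 0.5)        # initial joint configuration [rad]
target = (0.5, 0.2)   # desired end-effector position [m]
for _ in range(100):
    q = cart_to_joint_step(q, target)
```

After the loop, `fk(q)` has converged to the Cartesian target; the damping term keeps the step well-conditioned near singular configurations.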
A bimanual action server containing different control methods for bimanual actions, currently:
- Coupled CDS for each arm. The models can be found in bimanual-task-models; they are task- and action-specific.
- Virtual Object Dynamical System (spatial and temporal coupling)
$ roslaunch bimanual_motion_planner peeling_bimanual_action_server.launch
To run a test with two CDS models independently for each arm:
$ rosrun bimanual_action_planners uncoupled_test.py
To run a test with the Virtual Object Dynamical System:
$ rosrun bimanual_action_planners virtual_object_test.py
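Conceptually, the Virtual Object DS places a virtual object between the two end-effectors: the object's pose converges to the goal under a dynamical system, each arm tracks its attachment point on the object (spatial coupling), and the object slows down when an arm lags behind (temporal coupling). A minimal 1-D toy sketch of this idea follows; it is not the actual implementation in bimanual-dynamical-system, and all gains, offsets, and the lag model are invented:

```python
# 1-D toy: xo is the virtual object, xl/xr are the left/right end-effectors.
def vo_step(xo, xl, xr, goal, dt=0.01, k_obj=2.0, k_arm=4.0, half_width=0.2):
    # Temporal coupling: slow the virtual object down when either arm lags.
    lag = max(abs(xl - (xo - half_width)), abs(xr - (xo + half_width)))
    slowdown = 1.0 / (1.0 + 10.0 * lag)
    xo += dt * slowdown * k_obj * (goal - xo)   # object DS toward the goal
    # Spatial coupling: each arm tracks its attachment point on the object.
    xl += dt * k_arm * ((xo - half_width) - xl)
    xr += dt * k_arm * ((xo + half_width) - xr)
    return xo, xl, xr

xo, xl, xr = 0.0, -0.3, 0.4   # object and arms start out of sync
goal = 1.0
for _ in range(3000):
    xo, xl, xr = vo_step(xo, xl, xr, goal)
```

At convergence both arms end up at their attachment points (`goal - half_width` and `goal + half_width`), so the arms reach the goal together even though they started out of sync.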
To run the peeling task demo:
$ rosrun bimanual_action_planners peeling_demo.py
You will need the complete perception module for detecting the zucchini and computing its observable features:
- Follow the instructions in kinect-process-scene
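A demo script like this typically sequences the phases of the task and sends each one to the bimanual action server. The sketch below is purely hypothetical: the phase names and the `execute()` interface are invented for illustration and do not reflect the actual code in bimanual_action_planners:

```python
# Hypothetical phase sequence for a peeling-style demo.
PHASES = ["reach", "peel", "retract"]

def run_demo(execute):
    """Execute each phase in order; abort on the first failure.

    execute(phase) stands in for sending an action goal and waiting
    for its result; it returns True on success.
    """
    completed = []
    for phase in PHASES:
        if not execute(phase):
            return completed, phase  # phases done, phase that failed
        completed.append(phase)
    return completed, None

# Dry run with a stub executor that always succeeds:
completed, failed = run_demo(lambda phase: True)
```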
### Simulation of Bi-manual reaching motions:
$ roslaunch kuka_lwr_bringup bimanual2_simulation.launch
##### Cartesian-to-Joint/Joint-to-Cartesian Estimation
$ roslaunch state_transformers bimanual_joint_ctrls_sim.launch
A bimanual action server with the same action types as the real-time setup.
$ roslaunch bimanual_motion_planner bimanual_action_server.launch simulation:=true
- Run the same test scripts as in the real-time setup.
### Recording/Replaying bimanual demonstrations:
To record/replay demonstrations you must also install this package:

| Dependencies |
|---|
| record_ros |
$ rosrun kuka_fri_bridge run_lwr.sh right
$ rosrun kuka_fri_bridge run_lwr.sh left
$ roslaunch kuka_lwr_bringup bimanual2_realtime.launch ft_sensors:=true vision:=true
$ roslaunch bimanual_action_planners record_bimanual_demos.launch
Start/stop the rosbag recording via the record_ros service:
$ rosservice call /record/cmd "cmd: 'record/stop'"
To replay a recorded demonstration:
$ roslaunch kuka_lwr_bringup bimanual2_realtime.launch ft_sensors:=true not_bag:=false
$ rosbag play *.bag
To analyze the recorded demonstrations in MATLAB, use my-matlab-rosbag.