Code Baxter with Python 3!
Useful port of the Baxter interface to control your Baxter with Python 3. If you need an introductory tutorial on Baxter, check out this Baxter introduction.
- Baxter Robot
- Human Robot Interaction
- Python 3
Everything should already be installed if you are running on the robot; to run it on your PC you need:
- Python 3
- rospy
- OpenCV
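If you need `rospy` with Python 3 on your PC, one option is RoboStack (linked in the Links section below). A minimal sketch, assuming a conda/mamba setup and that the `robostack-staging` channel serves the ROS Noetic packages:

```bash
# create an environment with a Python 3 build of ROS Noetic (includes rospy)
mamba create -n ros_env -c conda-forge -c robostack-staging ros-noetic-desktop
mamba activate ros_env
```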
- Add this repo as a submodule inside your repo:
  ```bash
  git submodule add git@github.com:igor-lirussi/baxter-python3.git
  ```
- If you want to check for updates of the submodule:
  ```bash
  git submodule update --remote
  ```
- Import the baxter-python3 interface in your code. Since it's in a subfolder, you can use the import shown in the "Run" section below.
- Remember to tell users to clone your repo with `--recurse-submodules`:
  ```bash
  git clone --recurse-submodules https://github.com/your_name/your_repo
  ```
- If a user clones without `--recurse-submodules`, they will find an empty folder; the submodules can be downloaded with the two commands below.
  ```bash
  git submodule init
  git submodule update
  ```
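The last two commands can also be combined into one, which initializes and fetches any missing submodules in a single step:

```bash
git submodule update --init
```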
For example, this is a repo that uses this module:
```python
# import the baxter.py interface from the repository submodule
import importlib
baxter = importlib.import_module("baxter-python3.baxter")

import rospy

rospy.init_node("example")
rospy.sleep(2.0)
robot = baxter.BaxterRobot(arm="left")
rospy.sleep(2.0)
robot.set_robot_state(True)  # enable the robot

# move the robot
robot.move_to_neutral()
robot.move_to_zero()
robot.move_to_joint_position({"left_s0": 1.0})
robot.move_to_joint_position({"left_s0": -1.0})
robot.move_to_neutral()

# get the position of the hand
p = robot._endpoint_state.pose.position
q = robot._endpoint_state.pose.orientation
print(p)
# instead of the blocking call:
# msg = rospy.wait_for_message("/robot/limb/left/endpoint_state", EndpointState)
# p = msg.pose.position
# q = msg.pose.orientation

robot.set_robot_state(False)  # disable the robot
```
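As a quick illustration of how these calls compose, here is a minimal sketch that sweeps the left shoulder joint through a few positions. It reuses the `robot` object and `rospy` from the example above and uses only the methods shown there; the helper name `sweep_shoulder`, the angle list, and the pauses are illustrative assumptions:

```python
def sweep_shoulder(robot, angles, pause=1.0):
    # illustrative helper: move left_s0 to each angle in turn, pausing in between
    for angle in angles:
        robot.move_to_joint_position({"left_s0": angle})
        rospy.sleep(pause)

sweep_shoulder(robot, [-0.5, 0.0, 0.5])
robot.move_to_neutral()
```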
Check the examples for more.
Several example scripts are available in the folder to demonstrate the capabilities:
- `robotStateFalse.py` to put both arms in the normal position and deactivate the motors
- `example_arm_moving.py` for moving the joints
- `example_inverse_kinematics.py` for moving the joints to a target point with inverse kinematics
- `example_joystick_keyboard.py` for moving the robot with the keyboard, given a target point
- `example_grippers.py` to use the grippers
- `example_ir_arm_distance.py` for getting the infrared distance between the gripper and an object
- `example_pics_arm.py` for getting the camera video in the hands and taking pictures with their buttons (useful to create a dataset of images of the objects you are working on)
- `example_trajectories_recorder.py` to record trajectories in Cartesian (task-space) and also joint positions (joint-space); visualize them with `example_trajectories_visualizer.py`. Check the TUTORIAL_TRAJECTORIES.md for more.
- `example_trajectories_playback.py` to play back trajectories in Cartesian (task-space) and also joint positions (joint-space). Check the TUTORIAL_TRAJECTORIES.md for more.
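One way to run an example, assuming your ROS master is connected to the Baxter and the submodule folder kept its default name:

```bash
cd baxter-python3
python example_arm_moving.py
```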
This code repo, if imported in your project, also allows you to give different facial expressions to the Baxter robot.
This is useful to warn the people around of movements that are about to happen, by looking at the target location before moving the joints.
```python
import importlib
face = importlib.import_module("baxter-python3.faces")

# set the looking direction
face._set_look(robot, "down")
# or display a face
face._set_face(robot, "left_down")
```
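For instance, a minimal sketch that announces an arm movement by looking toward the workspace first (reusing the `robot` object and the direction string from above; the pause and joint target are illustrative):

```python
# look down toward the workspace so people nearby can anticipate the motion
face._set_look(robot, "down")
rospy.sleep(1.0)
robot.move_to_joint_position({"left_s0": 1.0})
robot.move_to_neutral()
```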
- Links
- https://robostack.github.io/GettingStarted.html
- http://web.archive.org/web/20221007182536/https://sdk.rethinkrobotics.com/wiki/API_Reference
- https://sdk.rethinkrobotics.com/wiki/
- https://sdk.rethinkrobotics.com/wiki/Foundations
- https://sdk.rethinkrobotics.com/wiki/Advanced_Understanding
- https://sdk.rethinkrobotics.com/wiki/Examples
- https://sdk.rethinkrobotics.com/wiki/Customer_Videos
- http://web.archive.org/web/20191118144640/http://api.rethinkrobotics.com/baxter_interface/html/index.html
- https://github.com/RethinkRobotics/baxter_common/tree/master/baxter_core_msgs/msg
- https://github.com/RethinkRobotics/baxter_interface/tree/master/src/baxter_interface
- Resources
- Igor Lirussi @ BOUN Boğaziçi University - CoLoRs (Cognitive Learning and Robotics) Lab
- Alper Ahmetoglu @ BOUN Boğaziçi University - CoLoRs (Cognitive Learning and Robotics) Lab
- Deniz Bilge Akkoç @ BOUN Boğaziçi University - CoLoRs (Cognitive Learning and Robotics) Lab
- All the people who contributed suggestions and tips.
This project is licensed - see the LICENSE file for details.