  1. CAN BUS Communication Configuration for Ubuntu 20

Problem: when we start to build up communication between the WAM computer (running Ubuntu 20) and the Barrett robot arm, the CAN port may not be found, as in the screenshot below.

img

Solution: run the following command in a new terminal:

$ sudo ip link set can0 up type can restart-ms 100 bitrate 1000000
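Before launching the WAM node, you can sanity-check the interface from the same terminal. This is a sketch: it assumes the SocketCAN interface is named can0, and the `sample` variable stands in for real `ip -details link show can0` output (on the WAM PC, pipe the real command into the same grep checks).

```shell
# Stand-in for: ip -details link show can0
sample='3: can0: <NOARP,UP,LOWER_UP,ECHO> mtu 16 state UP
    can state ERROR-ACTIVE restart-ms 100
    bitrate 1000000 sample-point 0.750'

# The interface must be UP and configured at 1 Mbit/s for the WAM.
echo "$sample" | grep -q "state UP"        && echo "can0 is up"
echo "$sample" | grep -q "bitrate 1000000" && echo "bitrate OK"
```

If either check fails, re-run the `ip link set` command above before starting the ROS node.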

Then reboot the computer and run:

cd catkin_ws

roslaunch wam_node wam_node.launch

Then follow the prompts step by step.

  1. Press the “Shift” + “Reset/Idle” buttons at the same time.

Image

  2. When the “Reset/Idle” button lights up, press “Enter” on the keyboard, following the prompts in the terminal:

Image

img

  3. Press the “Shift” + “Activate” buttons at the same time. When the “Activate” button lights up, the communication has been established successfully.

Image

  2. Communication problem for the CAN-USB adapter

When you need another computer (not the WAM PC) to control the Barrett arm or hand, you need a CAN-USB adapter to establish communication between your PC and the WAM robot, unless your computer is equipped with a PCI CAN card. Instructions for installing the related drivers, libraries, and packages can be found at the links below:

Install libbarrett and ROS Noetic on Ubuntu 20.04:

https://git.barrett.com/software/barrett-ros-pkg#on-ubuntu-2004

Install barrett-ros-pkg:

https://git.barrett.com/software/barrett-ros-pkg#compiling-the-package

After you have installed everything needed, make sure your system recognizes the CAN port, as in the picture below (you should find ‘pcan32’, ‘pcan-usb’, and ‘pcanusb32’):

img

Then open a new terminal and run:

roslaunch wam_node wam_node.launch

If the node runs successfully, you are done; if not, you need to change the delay time in the puck.h file:

file path: gedit /home/XXX/libbarrett/include/barrett/products/puck.h

(PS: XXX is the username of the administrator of your system)

img

Increase the timeout variable timeout_s on lines 118 and 121 to a value larger than 0.001 (here I changed it to 0.005; the right value depends on the performance of your PC: the higher the performance of your computer, the longer the delay-time variable should be).
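If you prefer not to edit puck.h by hand, the substitution can be scripted with sed. The snippet below demonstrates it on a stand-in file so it can be run safely anywhere; the real target is libbarrett/include/barrett/products/puck.h, and the exact source text there may differ, so check the file first.

```shell
# Demonstrate the timeout edit on a stand-in file; point sed at the real
# puck.h (and adjust the matched text) on your own machine.
printf 'double timeout_s = 0.001;\n' > /tmp/puck_snippet.h
sed -i 's/timeout_s = 0.001/timeout_s = 0.005/' /tmp/puck_snippet.h
cat /tmp/puck_snippet.h   # -> double timeout_s = 0.005;
```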

After that, save the file and rebuild the drivers and libraries with the following instructions:

cd libbarrett

cmake .

make

sudo make install

cd ~/catkin_wam/src/barrett-ros-pkg

sudo -s

./build.sh

After that, you can control the Barrett robot on your own computer. That’s all.

3. Problems and related operations for the Vicon system in Agman Lab

Sometimes when we need to initialize the Vicon system and reset the Cartesian coordinate system, we use Vicon Tracker (currently version 3.9) to configure the system. First, open the software and find the page below:

img

  1. Camera Selection
  2. View Changer
  3. Important tabs
  4. Non-functional Camera symbol

When connecting the Vicon system, make sure that all the cameras you intend to work with are pointing at the scene you are working in. If a camera is pointing in the wrong direction, you will see a symbol like (4) in the image above, and its camera view may show an empty frame similar to the image below.

img

If you run into the scenario above, the first thing to do is double-check the camera positioning to make sure the camera faces the right direction and is actually capturing the markers you’re interested in.

If you find that the camera is capturing the correct frame, then another option would be to lower the “Threshold” option under that camera's specific properties.

img

Once the threshold has been adjusted, your camera should begin showing marker positions, and then you’re ready for the next step.

img

①. Click the ‘Calibrate’ button.

②. Find the ‘Camera Calibration Feedback’ column, then choose the cameras you need to initialize.

③. After making your selection, click the ‘Start’ button in the ‘Calibrate Cameras’ column.

/*********************************************************************************************

Notice: here we may see an error after clicking the ‘Start’ button, as in the situation below:

img

As the terminal reports, the Tracker is unable to write to the calibration .x2d file. To fix this error, we need to find the Tracker’s data storage path:

img

Here we can find a compressed folder and previous data files, including .x2d files.

Firstly, move all files except the last one into the compressed folder to save the previous data (if required). This ensures there are no .x2d or .xcp files left in this path, so the Tracker can write new .x2d and .xcp files here.

Secondly, go back to the Tracker and repeat steps ①~③ above.

Then we can initialize the Vicon system successfully, as in the screenshot below:

img

************************************************************************************************/

④. After the selected cameras are initialized successfully, we need to set the origin point of the 3-D workspace to build a new spatial Cartesian coordinate system. Here we choose one position, which should be the center point of the workspace’s base on the ground.

img

Next, find the ‘Set Volume Origin’ column on the left and click ‘Start.’ Then we can see the origin marker as below:

img

⑤. After we click the ‘Set Origin’ button, all camera positions are reset, and the new spatial Cartesian coordinate system is established.

img

Once you’ve collected your data using the recording section, you can view the graphs associated with your trial by loading in a trial,

img

selecting the object in the tracked object in the 3D perspective view, and then switching to graph mode. Once done, you should select the appropriate graph using the drop-down menu, similar to the image below.

img

You can then view the position graphs, and from there you can select the velocity and acceleration graphs.

img

Problem: C++ or Python scripts crash when controlling the WAM

img

Sometimes when we establish communication with the WAM robot arm and control it, we may encounter a problem like the figure above: the C++ or Python script you wrote crashes when you run it. First, make sure the script itself is correct and that the hardware settings in the program exactly follow the parameters set in the manual.

Secondly, if everything is okay, check the file at the following path:

libbarrett/include/barrett/bus/bus_manager.h:112

If your computer is high-performance, decrease the variable MESSAGE_BUFFER_SIZE below its default value; if your computer is low-performance, increase it above the default. In my case, I decreased the value and the script ran successfully on my high-performance computer. Here is my setting:

img

After that, rebuild and reinstall the libbarrett library as recorded in ‘Communication problem for the CAN-USB adapter.’ Then you can run your own script successfully.

MuJoCo and mujoco-py installation instructions (Python 3.x version)

(Note: don’t install other versions with other Python versions!)

-Last update: 2022.6.8

Step 1: Install Anaconda

Download Anaconda3-2021.04-Linux-x86_64.sh, then run:

sudo chmod +x Anaconda3-2021.04-Linux-x86_64.sh

./Anaconda3-2021.04-Linux-x86_64.sh

Step 2: Install git

sudo apt install git

Step 3: Install the MuJoCo library

  1. Download the Mujoco library from

https://mujoco.org/download/mujoco210-linux-x86_64.tar.gz

  2. Create a hidden folder:

    sudo mkdir /home/username/.mujoco

​ (Note: if the folder was created without proper permissions, use “sudo chmod a+rwx /path/to/folder” to grant access)

  3. Extract the library into the .mujoco folder.
  4. Include these lines in your .bashrc file (terminal command: gedit ~/.bashrc):

export LD_LIBRARY_PATH=/home/user_name/.mujoco/mujoco210/bin

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/lib/nvidia

export PATH="$LD_LIBRARY_PATH:$PATH"

export LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libGLEW.so

  5. source ~/.bashrc

  6. Test that the library is installed by going into:

     cd ~/.mujoco/mujoco210/bin
    

​ ./simulate ../model/humanoid.xml
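As a quick sanity check of steps 2–6, the sketch below verifies the expected install layout and shows an idempotent way to add the LD_LIBRARY_PATH export to .bashrc. The append is demonstrated on a temp file so it can be run safely anywhere; point BASHRC at ~/.bashrc in practice.

```shell
# 1) Append the export line only if it is not already present.
BASHRC=/tmp/demo_bashrc        # use "$HOME/.bashrc" in practice
: > "$BASHRC"
LINE='export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HOME/.mujoco/mujoco210/bin'
grep -qxF "$LINE" "$BASHRC" || echo "$LINE" >> "$BASHRC"
grep -qxF "$LINE" "$BASHRC" || echo "$LINE" >> "$BASHRC"   # no-op on 2nd run
echo "copies in file: $(grep -cxF "$LINE" "$BASHRC")"      # -> copies in file: 1

# 2) Check that the extracted library has the files the simulate test needs.
for f in bin/simulate model/humanoid.xml; do
    [ -e "$HOME/.mujoco/mujoco210/$f" ] && echo "$f found" || echo "$f missing"
done
```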

Step 4: Install mujoco-py

conda create --name mujoco_py python=3.8

conda activate mujoco_py

sudo apt update

sudo apt-get install patchelf

sudo apt-get install python3-dev build-essential libssl-dev libffi-dev libxml2-dev

sudo apt-get install libxslt1-dev zlib1g-dev libglew1.5 libglew-dev python3-pip

git clone https://github.com/openai/mujoco-py

cd mujoco-py

pip install -r requirements.txt

pip install -r requirements.dev.txt

pip3 install -e . --no-cache

Step 5: Reboot your machine

Step 6: Run these commands

conda activate mujoco_py

sudo apt install libosmesa6-dev libgl1-mesa-glx libglfw3

sudo apt-get install libosmesa6-dev

sudo ln -s /usr/lib/x86_64-linux-gnu/libGL.so.1 /usr/lib/x86_64-linux-gnu/libGL.so

pip3 install -U 'mujoco-py<2.2,>=2.1'

cd examples

python3 setting_state.py

If you’re getting a Cython error, try:

pip install "cython<3"

MSI laptop NVIDIA driver problem for multi-monitor display, solution:

Disable Secure Boot in the BIOS options.

(PS: the same solution applies to the MSI laptop PCAN detection problem: ‘pcan-32 not found’)

CUDA install problem solution reference (completely removing CUDA packages and NVIDIA drivers on Ubuntu):

dpkg -l | grep cuda- | awk '{print $2}' | xargs -n1 sudo dpkg --purge

df -h

sudo apt-get purge nvidia\*

sudo apt-get -f install

sudo apt autoremove
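The purge one-liner above works by pulling the package name (column 2) out of `dpkg -l` output and handing each name to `dpkg --purge`. The column extraction can be seen in isolation on a hypothetical sample row:

```shell
# Hypothetical `dpkg -l` row; the real pipeline greps for "cuda-" first.
sample='ii  cuda-toolkit-11-4  11.4.0-1  amd64  CUDA Toolkit 11.4'
echo "$sample" | awk '{print $2}'   # -> cuda-toolkit-11-4
```

To preview what the pipeline would remove without purging anything, run it without the final `xargs` stage.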

Mujoco Environment setup in PyCharm:

img

Environment variables: LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/wam/.mujoco/mujoco210/bin:/usr/lib/nvidia;LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libGLEW.so

Note: when you install MuJoCo via pip install mujoco, keep in mind that the newest version changed its dataset library a lot. That means when you run your previous projects, errors may occur that are hard to fix for every data structure, so please use pip install mujoco==2.3.0.

You can refer to this link.

UR5e Operation Information:

  1. Activating the Robot

  2. Activating the Gripper

  3. Activating the Wrist Camera

  4. Programming the UR5e via Python

  1. Activating the Robot:

To begin using the robot, one must first ensure the controller is plugged in, and the connections to the controller are valid.

To turn on the UR5e, you hit the power button located on the front of the teach pendant next to the emergency stop button.

img

Once the system has turned on, you still must activate the robot by selecting the button located at the bottom left corner of the screen.

Once clicked, you will be taken to this screen. Press the “ON” button to activate the robot.

After providing power to the robot, click the “START” button to unlock the robot’s joints.

img

After being unlocked, if everything went well, your screen should look as it does below.

img

If your screen looks like the one above, hit the “Exit” button.

With the robot active, you can begin moving it easily using the “Move” tab. An example of what your screen should look like is below.

img

  2. Activating the Gripper

With the robot on and activated, you can now activate the gripper supplied by RobotIQ.

Gripper Model:

RobotIQ 2F-85

img

The gripper connects to the UR5e via a URCap, which is a software plugin for the UR controller.

img

  3. Activating the Wrist Camera

Product website: https://robotiq.com/products/2f85-140-adaptive-robot-gripper?ref=nav_product_new_button

img

  4. Programming the UR5e via Python

You will need either Ubuntu 18.04 with ROS Melodic or Ubuntu 20.04 with ROS Noetic. At our lab, we are all using ROS Noetic with Ubuntu 20.04.

Drivers & Prerequisites:

First, begin by installing the Universal Robots ROS Driver here: https://github.com/UniversalRobots/Universal_Robots_ROS_Driver

For the UR5e to function properly, you must install the External Control URcap, which has already been installed on our UR5e, but installation information can be found here: https://github.com/UniversalRobots/Universal_Robots_ROS_Driver/blob/master/ur_robot_driver/doc/install_urcap_e_series.md

You will also need to set up tool communication with the UR5e if you would like to access the RobotIQ gripper. Information on how to install this URCap (it has already been installed on the robot) can be found here: https://github.com/UniversalRobots/Universal_Robots_ROS_Driver/blob/master/ur_robot_driver/doc/setup_tool_communication.md

Each UR robot is calibrated in the factory, giving exact forward and inverse kinematics. To make use of this in ROS, you first have to extract the calibration information from the robot. Information on how to do so can be found in the Universal Robots ROS Driver in the first link. If we ever get another robot, it would be a good idea to follow the details on this page: https://github.com/UniversalRobots/Universal_Robots_ROS_Driver/blob/master/ur_calibration/README.md

Once you have the drivers set up, use this script to collect the calibration information from the robot. First, make sure the robot is turned on and connected to the network.

conda deactivate

$ roslaunch ur_calibration calibration_correction.launch \
robot_ip:=192.168.1.17 \
target_filename:="${HOME}/my_robot_calibration.yaml"

Once you’ve done this, you should find a file named my_robot_calibration.yaml in your home directory. Just leave that there.

Next, open another terminal, and input the following to set-up a connection with the robot:

$ roslaunch ur_robot_driver ur5e_bringup.launch robot_ip:=192.168.1.17 kinematics_config:=${HOME}/my_robot_calibration.yaml
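If the launch hangs or times out, a quick network check helps separate driver problems from connectivity problems. A minimal sketch, assuming the robot IP used above (192.168.1.17):

```shell
# Report whether the robot answers a single ping within 1 s, without
# aborting the shell on failure.
ROBOT_IP=192.168.1.17
if ping -c 1 -W 1 "$ROBOT_IP" > /dev/null 2>&1; then
    echo "robot reachable"
else
    echo "robot unreachable: check cabling and the IP on the teach pendant"
fi
```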

If you’re getting an error, it is likely the driver was installed incorrectly or you have not properly set up your ~/.bashrc. To alter it, use:

gedit ~/.bashrc

This is what my ~/.bashrc looks like:

img

Setting up the RT Kernel (NEED TO FINISH)

To operate the UR driver, it is recommended to set up an Ubuntu system with real-time capabilities. Because we are using an e-Series UR, the higher control frequency might lead to non-smooth trajectories if we don’t run the system on a real-time-enabled kernel.

To do this, I followed the following tutorial: https://github.com/UniversalRobots/Universal_Robots_ROS_Driver/blob/master/ur_robot_driver/doc/real_time.md

There were a few things that did not get explained in the tutorial, so I will elaborate here.

Make sure to check what kernel version you have installed in your system, and use this version instead of the 4.14.139-rt66 version they were using.

I found it helpful, in the “Getting the sources for a real-time kernel” section, to paste the URL provided in their example into a browser and use that to find the appropriate file needed for the wget step.

In the “Setup user privileges to use real-time scheduling” section, to edit your /etc/security/limits.conf file I used

sudo nano /etc/security/limits.conf

to add the @realtime lines in the tutorial. Then you can use Ctrl + O to save the file.

HAVING ISSUES WITH THE “Setup GRUB to always boot the real-time kernel” SECTION

My RT kernel doesn’t appear to show up as a choice when running their

$ awk -F\' '/menuentry |submenu / {print $1 $2}' /boot/grub/grub.cfg

I’ll come back to this (^^^) later.

Using RVIZ and MoveIt!

This website has great information on how to get rviz & MoveIt! up and running: https://github.com/UniversalRobots/Universal_Robots_ROS_Driver/blob/master/ur_robot_driver/doc/usage_example.md

I was able to connect the UR5e to my computer using the instructions followed in their tutorial, but I was having issues once I started trying to implement MoveIt!. It is likely I need to look at my current installation and make adjustments.

The UR External Control program must be playing for the robot to accept commands. You should be able to connect to the robot and see its position in rviz, but when you try to move the robot using:

rosrun ur_robot_driver test_move

you’ll run into issues. If you are having trouble getting External Control to run, be sure that its Host IP setting is your own computer’s IP.

To find your own IP address, go to Settings > Network > Ethernet > Settings, and you should see a screen like this:

img

Use the IPv4 Address in External Control, and then run the program.
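You can also read the IPv4 address from the terminal with `ip -4 addr show`. The parsing below is demonstrated on a sample output line (the interface name and address are made up); pipe the real command’s output through the same awk in practice.

```shell
# Sample `ip -4 addr show` line; pick the address on the robot's subnet.
sample='    inet 192.168.1.42/24 brd 192.168.1.255 scope global enp3s0'
echo "$sample" | awk '/inet / {split($2, a, "/"); print a[1]}'   # -> 192.168.1.42
```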

UR5e Setup on Ubuntu 20.04 and ROS Noetic Instructions:

Requirements

Install UR Driver and FMauch’s fork of config files per https://github.com/UniversalRobots/Universal_Robots_ROS_Driver/blob/master/README.md

For Gazebo simulated ur5e:

In separate windows:

roslaunch ur_gazebo ur5e_bringup.launch

roslaunch ur5e_moveit_config ur5e_moveit_planning_execution.launch sim:=true

rosrun rviz rviz -f world

Then in rviz, add MotionPlanning and change the planning group from endeffector to manipulator.

For a simulated ur3e: as above, but with 3 instead of 5.

However, Gazebo enforces collision with the ground, but this is not forwarded to MoveIt’s PlanningScene.

To get moveit_commander working in Python, I attempted a from-source install of MoveIt’s master branch plus the fmauch UR descriptions, but catkin failed with a C++ compilation error in motion_planning_frame_manipulation.cpp.o.

I then determined that the deb packages had simply not been installed. Running sudo apt-get install ros-noetic-moveit added several packages, including moveit_commander.

MoveIt struggles with path planning even for very easy tasks with long time allotments. It works much better with the joint-limited version (add limited:=true to the *_gazebo.launch and moveit_planning_execution.launch invocations), but it still isn’t perfect at avoiding collisions with the ground. In particular, the floor object seems to need to be added twice before it is actually present in the planning scene; this may be a synchronization issue.

For Real ur:

  1. Release estop
  2. In upper right, change from automatic or remote to manual mode. Password: biorobotics
  3. In settings/system/URCaps make External Control Active (reboot if required)
  4. In Installation/URCaps/External Control set the remote host IP to your machine
  5. In Run, load the existing External Control program (.urp), OR create a new program with only an ExternalControl block
  6. Power On
  7. Start robot
  8. Do: roslaunch ur_robot_driver ur3e_bringup.launch robot_ip:=192.168.0.102 kinematics_config:=/home/ggutow/NGtestbedrightcalibration.yaml
    1. Replace path to kinematics config file and robot_ip as necessary
  9. Do: roslaunch ur3_moveit_config ur3_moveit_planning_execution.launch limited:=true
  10. Do: rosrun rviz rviz -f base
  11. Press play on robot, terminal window should say “Robot connected to reverse interface. Ready to receive control commands.”

Get a VPN and SSH into the computer. You need to learn UNIX basics first:

  • SSH - connecting to a machine
  • UNIX - getting around the file system, permissions, basic commands
  • Linux command line cheat sheet
  • man command - manual pages
  • Checking what is running on a system
  • Basic utilities: listing all the processes that are running and what they are using
  • htop, etc.
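A few of the basics above, as they look in a terminal (output will differ per machine):

```shell
# Getting around the file system: listing, permissions, and who you are.
ls -ld /tmp                      # mode, owner, and group of a directory
echo "logged in as: $(whoami)"
echo "home directory: $HOME"

# Manual pages: `man ls` describes every flag; `man -k <keyword>` searches.
# Process inspection: `ps aux` lists everything running; `htop` is interactive.
```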

Environment setup for using Python to control the UR5e

  1. img

Add the ROS Noetic path in your PyCharm settings.

  2. Install the rospkg package in your Anaconda environment (pip install rospkg).
  3. Then you can run the code.

(PS: for the UR5e environment setup, please reference this on GitHub. I made some changes based on the original version so that the new version suits our lab’s environment.)

Smart Satellite Actuated Linear Rail

When using the Vernier Dual-Range force sensor, the maximum force required to move the satellite is 32.78 N, while the average force required is close to 18 N. The high-force point is in the middle of the track, where two lengths of aluminum extrusion connect. If the load of the satellite increases, this maximum force will likely also increase.