Control and applications of large-scale tactile sensor
To simulate the robot with a velocity control input properly in Gazebo, make sure gravity along the z-axis is set to 0.
In Gazebo, open the World tab, click Physics, then set z = 0 in the gravity field.
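If you prefer editing the world file directly, the same setting corresponds to the `<gravity>` element of the SDF `<world>` (a minimal fragment; the world file name and the rest of its contents depend on your setup):

```xml
<world name="default">
  <!-- zero gravity so the velocity-controlled robot is not pulled down -->
  <gravity>0 0 0</gravity>
</world>
```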
The project is dockerized and (nearly) all of the required dependencies are automatically installed when building the Dockerfile.
The following programs must be installed on the host computer:
- Docker
- nvidia-docker2
- docker-compose
Skip this step if you already have an SSH key paired to your GitHub account.
Open a shell and run:
```
ssh-keygen
```
Leave the default name and location. It should be /home/username/.ssh/id_rsa
Add the public key to your GitHub account as explained on this page.
Open a shell inside the project folder and run:
```
docker-compose build
```
**Remark:** The image must be built once initially, and rebuilt whenever changes are applied to the Dockerfile.
Open a shell inside the project folder and run:
```
docker-compose up
```
As a result, a container will be created. Check it by running `docker ps` in a shell.
Connect to the container using Visual Studio Code (recommended) or using the command line:
```
docker exec -it (name of container) bash
```
The default working directory inside the container is /home/catkin_ws
This is required to allow scripts to open windows from the container. On the host machine, open a terminal and run:
```
xhost +
```
**Remark:** This must be done only once each time the PC is turned on.
From the catkin_ws folder inside the container, run `catkin_make` to build the packages and dependencies.
Open a new bash inside the container, navigate to the catkin_ws root folder and run:
```
roslaunch ur_robot_driver ur5e_bringup.launch robot_ip:=150.65.152.107 headless_mode:=true
```
Navigate to the catkin_ws folder inside the container and run:
```
source devel/setup.sh
rosrun protac_perception protac_acquisition.py --cam_id 0
```
It is also possible to specify the camera bus. Run the node with the -h or --help option for more information.
**Remark:** To run this script, the `.pt` file is necessary. Due to its size it is not hosted in this repo; it must be added to the folder `protac_map/resource/`.
This node visualizes TacLink mounted on the UR5e in RViz. The depth values are represented using grayscale colors, and the contact centroid is represented with a spherical marker. In a sourced environment run:
```
rosrun ur_protac skin_marker_publisher.py
```
This node displays the flat representation of TacLink and sends commands to the robot. Navigate to the catkin_ws folder inside the container and, from a sourced environment, run:
```
rosrun protac_map protac_map_node.py
```
A window showing the TacLink map should appear. Remember to disable the access control as previously described. It is also possible to specify the threshold for contact detection. Run the node with the -h or --help option for more information.
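As an illustration of the grayscale encoding described above, a depth value can be mapped to an 8-bit gray intensity roughly like this (a plain-Python sketch; the depth bounds `d_min`/`d_max` are hypothetical and not the node's actual calibration):

```python
def depth_to_gray(depth, d_min=0.0, d_max=0.02):
    """Map a depth value (meters) to an 8-bit grayscale intensity.

    Illustrative mapping only: deeper contact -> darker pixel,
    no contact -> white. The real node's calibration may differ.
    """
    # Clamp depth into the valid range
    d = max(d_min, min(d_max, depth))
    # Normalize to [0, 1] and invert so zero depth is white
    t = (d - d_min) / (d_max - d_min)
    return int(round(255 * (1.0 - t)))

print(depth_to_gray(0.0))    # no contact -> 255 (white)
print(depth_to_gray(0.02))   # maximum depth -> 0 (black)
```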
This node publishes a message of type `TactileControlInfo` containing the information to be sent to the robot controller.
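The exact fields of `TactileControlInfo` are defined in the package's message files; as a rough, ROS-free sketch, a controller consuming such a message might turn a contact centroid into a 6-element velocity command like this (all field names, gains, and the retreat logic are hypothetical):

```python
def contact_to_velocity(centroid_xyz, contact_detected, gain=0.5, max_vel=0.05):
    """Turn a contact centroid into a 6-element Cartesian velocity command.

    Hypothetical logic for illustration only: retreat from the contact
    point proportionally to the centroid position, clamped to max_vel,
    with zero angular velocity. The project's real controller may differ.
    """
    if not contact_detected:
        # No contact: command zero velocity on all axes
        return [0.0] * 6
    # Linear part: move opposite to the centroid direction, clamped
    linear = [max(-max_vel, min(max_vel, -gain * c)) for c in centroid_xyz]
    # Angular part left at zero in this sketch
    return linear + [0.0, 0.0, 0.0]
```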
To be added
A script to test that the cameras are working properly can be found in the protac_map package. To run the script, simply issue the following commands from the catkin_ws folder:
```
source devel/setup.sh
rosrun protac_map test_cam.py
```
The control scripts write to the topic /joint_group_vel_controller/command. To check that the correct controller is enabled, run the following command inside the container (remember to enable the robot controller as previously described):
```
rostopic pub /joint_group_vel_controller/command std_msgs/Float64MultiArray "data: [-0.05, 0.0, 0.0, 0.0, 0.0, 0.0]"
```
The robot end-effector should start rotating. To stop it, resend the same command but with zeros in the data field.
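The jog-and-stop pattern above can be sketched as a small helper that builds the `data` payload of the `Float64MultiArray` (plain Python for illustration; publishing it would still go through `rostopic` or `rospy`):

```python
def jog_command(joint_index, velocity, n_joints=6):
    """Build the data field for a joint_group_vel_controller command.

    One joint moves at the given velocity (rad/s); all others stay at zero.
    """
    data = [0.0] * n_joints
    data[joint_index] = velocity
    return data

def stop_command(n_joints=6):
    """All-zeros command: resending this stops the motion."""
    return [0.0] * n_joints

print(jog_command(0, -0.05))  # [-0.05, 0.0, 0.0, 0.0, 0.0, 0.0]
```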
If the robot does not move, check the following:
- remote control is enabled on the robot
- the motors are powered on and not locked
- the velocity controller is enabled from ROS
You can test the Cartesian control of the robot by running:
```
source devel/setup.sh
rosrun protac_map test_publisher_node.py
```
It will publish a reference command. Remember to run the control node first.
To test the connection, it can be useful to first check with the simulated version of the robot. You can run it with:
```
roslaunch ur_gazebo ur5e_bringup.launch
```
To set up the MoveIt! nodes for motion planning, run:
```
roslaunch ur5_moveit_config moveit_planning_execution.launch sim:=true
```
The robot can then be moved using MoveIt! and RViz:
```
roslaunch ur5_moveit_config moveit_rviz.launch
```
- Docker installation: https://docs.docker.com/engine/install/ubuntu/
- Docker Compose installation: https://docs.docker.com/compose/install/linux/
- Nvidia Docker installation: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/archive/1.6.0/install-guide.html
- Universal robot ROS packages: https://github.com/ros-industrial/universal_robot
- Universal robot ROS driver: https://github.com/UniversalRobots/Universal_Robots_ROS_Driver