
Run through of ROS tutorials

pull/16/head
hello-chintan · 1 year ago · commit b1392d233a
25 changed files with 575 additions and 428 deletions
  1. getting_started/updating_software.md (+5, -5)
  2. ros1/aruco_marker_detection.md (+12, -24)
  3. ros1/example_1.md (+23, -18)
  4. ros1/example_10.md (+48, -34)
  5. ros1/example_11.md (+34, -22)
  6. ros1/example_12.md (+38, -32)
  7. ros1/example_13.md (+24, -11)
  8. ros1/example_2.md (+29, -20)
  9. ros1/example_3.md (+25, -20)
  10. ros1/example_4.md (+26, -12)
  11. ros1/example_5.md (+23, -12)
  12. ros1/example_6.md (+31, -16)
  13. ros1/example_7.md (+32, -27)
  14. ros1/example_8.md (+21, -8)
  15. ros1/example_9.md (+45, -23)
  16. ros1/follow_joint_trajectory.md (+46, -39)
  17. ros1/gazebo_basics.md (+9, -7)
  18. ros1/getting_started.md (+7, -5)
  19. ros1/internal_state_of_stretch.md (+11, -9)
  20. ros1/moveit_basics.md (+20, -14)
  21. ros1/navigation_stack.md (+20, -18)
  22. ros1/perception.md (+7, -9)
  23. ros1/respeaker_microphone_array.md (+16, -20)
  24. ros1/rviz_basics.md (+5, -3)
  25. ros1/teleoperating_stretch.md (+18, -20)

getting_started/updating_software.md (+5, -5)

@ -4,7 +4,7 @@ Stretch's software is improved with new features and bug fixes with each update.
## When to Update
We develop our software publicly on Github, allowing anyone to follow/propose the development of a code feature or bug fix. While we wholeheartedly welcome collaboration on Github, it is not necessary to be active on Github to follow our software releases. We announce every major release of software on our [forum](https://forum.hello-robot.com/c/announcements). These are stable releases with code that has been extensively tested on many Stretch robots. To be notified of new releases, create an account on the forum and click the bell icon in the top left of the [announcements section](https://forum.hello-robot.com/c/announcements/6). The forum is also available to report issues and ask questions about any of our software packages.
We develop our software publicly on GitHub, allowing anyone to follow and propose the development of a code feature or bug fix. While we wholeheartedly welcome collaboration on GitHub, it is not necessary to be active on GitHub to follow our software releases. We announce every major release of software on our [forum](https://forum.hello-robot.com/c/announcements). These are stable releases with code that has been extensively tested on many Stretch robots. To be notified of new releases, create an account on the forum and click the bell icon in the top left of the [announcements section](https://forum.hello-robot.com/c/announcements/6). The forum is also available to report issues and ask questions about any of our software packages.
## How to Update
@ -12,7 +12,7 @@ Each Stretch is shipped with firmware, a Python SDK, and ROS packages developed
### Stretch ROS
Stretch ROS is the [Robot Operating System](https://www.ros.org/about-ros/) (ROS) interface to the robot. Many robotics developers find ROS useful to bootstrap their robotics software developments. Depending on whether you want to setup a ROS or ROS 2 workspace, the easiest way to download the most recent updates in the stretch_ros and stretch_ros2 repos, while resolving all source built dependencies at the same time, is by following the instructions in the [Creating a New ROS Workspace](https://github.com/hello-robot/stretch_install/blob/master/docs/ros_workspace.md) section in the stretch_install repo.
Stretch ROS is the [Robot Operating System](https://www.ros.org/about-ros/) (ROS) interface to the robot. Many robotics developers find ROS useful to bootstrap their robotics software developments. Depending on whether you want to set up a ROS or ROS 2 workspace, the easiest way to download the most recent updates in the stretch_ros and stretch_ros2 code repositories, while resolving all source-built dependencies at the same time, is by following the instructions in the [Creating a New ROS Workspace](https://github.com/hello-robot/stretch_install/blob/master/docs/ros_workspace.md) section in the stretch_install repo.
**Warning**: Before you proceed, please ensure that all your personal files in the catkin or ament workspace have been backed up safely. This is important because executing the following set of commands deletes your existing workspace and replaces it with a fresh one.
@ -44,7 +44,7 @@ To replace the ROS 2 Galactic ament_ws in Ubuntu 20.04, execute:
### Stretch Body Python SDK
Stretch Body is the Python SDK to the robot. It abstracts away the low level details of communication with the embedded devices and provides an intuitive API to working with the robot. You may update it using the following commands depending on the Python version.
Stretch Body is the Python SDK for the robot. It abstracts away the low-level details of communication with the embedded devices and provides an intuitive API for working with the robot. You may update it using the following commands depending on the Python version.
If you are using Python2, execute:
```{.bash .shell-prompt}
@ -70,7 +70,7 @@ The firmware and the Python SDK (called Stretch Body) communicate on an establis
$ REx_firmware_updater.py --status
```
This script will automatically determine what version is currently running on the robot and provide a recommendation for a next step. Follow the next steps provided by the firmware updater script.
This script will automatically determine what version is currently running on the robot and provide a recommendation for the next step. Follow the next steps provided by the firmware updater script.
### Ubuntu
@ -99,7 +99,7 @@ Please upgrade the firmware and/or version of Stretch Body.
----------------
```
This error appears because the low level Python SDK and the firmware cannot communicate to each other. There is a protocol mismatch preventing communication between the two. Simply run the following script and follow its recommendations to upgrade/downgrade the firmware as necessary to match the protocol level of Stretch Body.
This error appears because the low-level Python SDK and the firmware cannot communicate with each other. There is a protocol mismatch preventing communication between the two. Simply run the following script and follow its recommendations to upgrade/downgrade the firmware as necessary to match the protocol level of Stretch Body.
```{.bash .shell-prompt}
$ REx_firmware_updater.py --status

ros1/aruco_marker_detection.md (+12, -24)

@ -1,41 +1,34 @@
## ArUco Marker Detector
For this tutorial, we will go over how to detect Stretch's ArUco markers and how to files the hold the information for each tag.
For this tutorial, we will go over how to detect Stretch's ArUco markers and review the files that hold the information for the tags.
### Visualize ArUco Markers in RViz
Begin by running the stretch driver launch file.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
To activate the RealSense camera and publish topics to be visualized, run the following launch file in a new terminal.
```bash
# Terminal 2
roslaunch stretch_core d435i_low_resolution.launch
```
Next, run the stretch ArUco launch file which will bring up the [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) node.
Next, in a new terminal, run the stretch ArUco launch file, which will bring up the [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) node.
```bash
# Terminal 3
roslaunch stretch_core stretch_aruco.launch
```
Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/aruco_detector_example.rviz) with the topics for transform frames in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/aruco_detector_example.rviz) with the topics for the transform frames in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
```bash
# Terminal 4
rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/aruco_detector_example.rviz
```
You are going to need to teleoperate Stretch's head to detect the ArUco marker tags. Run the following command in a new terminal and control the head in order to point the camera towards the markers.
You are going to need to teleoperate Stretch's head to detect the ArUco marker tags. Run the following command in a new terminal and control the head to point the camera toward the markers.
```bash
# Terminal 5
rosrun stretch_core keyboard_teleop
```
@ -44,10 +37,8 @@ rosrun stretch_core keyboard_teleop
</p>
### The ArUco Marker Dictionary
When defining the ArUco markers on Stretch, Hello Robot utilizes a YAML file, [stretch_marker_dict.yaml](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/config/stretch_marker_dict.yaml), that holds the information about the markers.
If the [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) node doesn’t find an entry in [stretch_marker_dict.yaml](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/config/stretch_marker_dict.yaml) for a particular ArUco marker ID number, it uses the default entry. For example, most robots have shipped with the following default entry:
```yaml
@ -83,7 +74,8 @@ and the following entry for the ArUco marker on the top of the wrist
The `length_mm` value used by [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) is important for estimating the pose of an ArUco marker.
**IMPORTANT NOTE:** If the actual width and height of the marker do not match this value, then pose estimation will be poor. Thus, carefully measure custom Aruco markers.
!!! note
If the actual width and height of the marker do not match this value, then pose estimation will be poor. Thus, carefully measure custom ArUco markers.
```yaml
'use_rgb_only': False
@ -94,8 +86,8 @@ If `use_rgb_only` is `True`, [detect_aruco_markers](https://github.com/hello-rob
```yaml
'name': 'wrist_top'
```
`name` is used for the text string of the ArUco marker’s [ROS Marker](http://docs.ros.org/en/melodic/api/visualization_msgs/html/msg/Marker.html) in the [ROS MarkerArray](http://docs.ros.org/en/melodic/api/visualization_msgs/html/msg/MarkerArray.html) Message published by the [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) ROS node.
`name` is used for the text string of the ArUco marker’s [ROS Marker](http://docs.ros.org/en/melodic/api/visualization_msgs/html/msg/Marker.html) in the [ROS MarkerArray](http://docs.ros.org/en/melodic/api/visualization_msgs/html/msg/MarkerArray.html) Message published by the [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) ROS node.
```yaml
'link': 'link_aruco_top_wrist'
@ -103,26 +95,22 @@ If `use_rgb_only` is `True`, [detect_aruco_markers](https://github.com/hello-rob
`link` is currently used by [stretch_calibration](https://github.com/hello-robot/stretch_ros/blob/master/stretch_calibration/nodes/collect_head_calibration_data). It is the name of the link associated with a body-mounted ArUco marker in the [robot’s URDF](https://github.com/hello-robot/stretch_ros/blob/master/stretch_description/urdf/stretch_aruco.xacro).
It’s good practice to add an entry to [stretch_marker_dict.yaml](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/config/stretch_marker_dict.yaml) for each ArUco marker you use.
### Create a New ArUco Marker
At Hello Robot, we’ve used the following guide when generating new ArUco markers.
We generate ArUco markers using a 6x6 bit grid (36 bits) with 250 unique codes. This corresponds with[ DICT_6X6_250 defined in OpenCV](https://docs.opencv.org/3.4/d9/d6a/group__aruco.html). We generate markers using this [online ArUco marker generator](https://chev.me/arucogen/) by setting the Dictionary entry to 6x6 and then setting the Marker ID and Marker size, mm as appropriate for the specific application. We strongly recommend to measure the actual marker by hand prior to adding an entry for it to [stretch_marker_dict.yaml](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/config/stretch_marker_dict.yaml).
We generate ArUco markers using a 6x6-bit grid (36 bits) with 250 unique codes. This corresponds with [DICT_6X6_250 defined in OpenCV](https://docs.opencv.org/3.4/d9/d6a/group__aruco.html). We generate markers using this [online ArUco marker generator](https://chev.me/arucogen/) by setting the Dictionary entry to 6x6 and then setting the Marker ID and Marker size, mm, as appropriate for the specific application. We strongly recommend measuring the actual marker by hand before adding an entry for it to [stretch_marker_dict.yaml](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/config/stretch_marker_dict.yaml).
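As an illustration (not part of this commit), a marker from the same dictionary can also be generated in Python with OpenCV's aruco module. This is a hedged sketch: the call names assume an opencv-contrib-python release before 4.7, which renamed them to `getPredefinedDictionary` and `generateImageMarker`.

```python
# Sketch: generate a 6x6 ArUco marker image with OpenCV.
# Assumes opencv-contrib-python < 4.7; newer releases rename these calls.
import cv2

aruco_dict = cv2.aruco.Dictionary_get(cv2.aruco.DICT_6X6_250)
marker_id = 42       # user-reserved range is 0 - 99
side_pixels = 200    # rendered image size; measure the printed marker by hand
marker_img = cv2.aruco.drawMarker(aruco_dict, marker_id, side_pixels)
cv2.imwrite('aruco_%d.png' % marker_id, marker_img)
```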
We select marker ID numbers using the following ranges.
* 0 - 99 : reserved for users
* 100 - 249 : reserved for official use by Hello Robot Inc.
* 100 - 199 : reserved for robots with distinct sets of body-mounted markers
* 0 - 99: reserved for users
* 100 - 249: reserved for official use by Hello Robot Inc.
* 100 - 199: reserved for robots with distinct sets of body-mounted markers
* Allows different robots near each other to use distinct sets of body-mounted markers to avoid confusion. This could be valuable for various uses of body-mounted markers, including calibration, visual servoing, visual motion capture, and multi-robot tasks.
* 5 markers per robot = 2 on the mobile base + 2 on the wrist + 1 on the shoulder
* 20 distinct sets = 100 available ID numbers / 5 ID numbers per robot
* 200 - 249 : reserved for official accessories
* 200 - 249: reserved for official accessories
* 245 for the prototype docking station
* 246-249 for large floor markers

ros1/example_1.md (+23, -18)

@ -8,22 +8,21 @@ The goal of this example is to give you an enhanced understanding of how to cont
Begin by running the following command in a new terminal.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Switch the mode to *navigation* mode using a rosservice call. Then drive the robot forward with the [move.py](https://github.com/hello-robot/stretch_tutorials/tree/noetic/src/move.py) node.
Switch to `navigation` mode using a rosservice call. Then, in a new terminal, drive the robot forward with the [move.py](https://github.com/hello-robot/stretch_tutorials/tree/noetic/src/move.py) node.
```bash
# Terminal 2
rosservice call /switch_to_navigation_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 move.py
```
To stop the node from sending twist messages, type **Ctrl** + **c**.
To stop the node from sending twist messages, type `Ctrl` + `c`.
### The Code
Below is the code which will send *Twist* messages to drive the robot forward.
Below is the code which will send `Twist` messages to drive the robot forward.
```python
#!/usr/bin/env python3
@ -68,74 +67,80 @@ if __name__ == '__main__':
```
### The Code Explained
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
from geometry_msgs.msg import Twist
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). The `geometry_msgs.msg` import is so that we can send velocity commands to the robot.
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). The `geometry_msgs.msg` import is so that we can send velocity commands to the robot.
```python
class Move:
def __init__(self):
self.pub = rospy.Publisher('/stretch/cmd_vel', Twist, queue_size=1)#/stretch_diff_drive_controller/cmd_vel for gazebo
```
This section of code defines the talker's interface to the rest of ROS. `pub = rospy.Publisher("/stretch/cmd_vel", Twist, queue_size=1)` declares that your node is publishing to the */stretch/cmd_vel* topic using the message type `Twist`. The `queue_size` argument limits the amount of queued messages if any subscriber is not receiving them fast enough.
This section of code defines the talker's interface to the rest of ROS. `pub = rospy.Publisher("/stretch/cmd_vel", Twist, queue_size=1)` declares that your node is publishing to the `/stretch/cmd_vel` topic using the message type `Twist`. The `queue_size` argument limits the amount of queued messages if any subscriber is not receiving them fast enough.
```Python
command = Twist()
```
Make a `Twist` message. We're going to set all of the elements, since we can't depend on them defaulting to safe values.
Make a `Twist` message. We're going to set all of the elements since we can't depend on them defaulting to safe values.
```python
command.linear.x = 0.1
command.linear.y = 0.0
command.linear.z = 0.0
```
A `Twist` data structure has three linear velocities (in meters per second), along each of the axes. For Stretch, it will only pay attention to the x velocity, since it can't directly move in the y direction or the z direction.
A `Twist` data structure has three linear velocities (in meters per second), along each of the axes. For Stretch, it will only pay attention to the x velocity, since it can't directly move in the y or the z direction.
```python
command.angular.x = 0.0
command.angular.y = 0.0
command.angular.z = 0.0
```
A `Twist` message also has three rotational velocities (in radians per second). The Stretch will only respond to rotations around the z (vertical) axis.
A `Twist` message also has three rotational velocities (in radians per second). Stretch will only respond to rotations around the z (vertical) axis.
```python
self.pub.publish(command)
```
Publish the `Twist` commands in the previously defined topic name */stretch/cmd_vel*.
Publish the `Twist` commands in the previously defined topic name `/stretch/cmd_vel`.
```Python
rospy.init_node('move')
base_motion = Move()
rate = rospy.Rate(10)
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
The `rospy.Rate()` function creates a Rate object. With the help of its method `sleep()`, it offers a convenient way for looping at the desired rate. With its argument of 10, we should expect to go through the loop 10 times per second (as long as our processing time does not exceed 1/10th of a second!)
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node. Until rospy has this information, it cannot start communicating with the ROS Master.
!!! note
The name must be a base name, i.e. it cannot contain any slashes "/".
The `rospy.Rate()` function creates a Rate object. With the help of its method `sleep()`, it offers a convenient way for looping at the desired rate. With its argument of 10, we should expect to go through the loop 10 times per second (as long as our processing time does not exceed 1/10th of a second!).
```python
while not rospy.is_shutdown():
base_motion.move_forward()
rate.sleep()
```
This loop is a fairly standard rospy construct: checking the `rospy.is_shutdown()` flag and then doing work. You have to check `is_shutdown()` to check if your program should exit (e.g. if there is a **Ctrl-C** or otherwise). The loop calls `rate.sleep()`, which sleeps just long enough to maintain the desired rate through the loop.
This loop is a fairly standard rospy construct: checking the `rospy.is_shutdown()` flag and then doing work. You have to check `is_shutdown()` to check if your program should exit (e.g. if there is a `Ctrl-C` event or otherwise). The loop calls `rate.sleep()`, which sleeps just long enough to maintain the desired rate through the loop.
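Assembled end to end, the fragments above amount to the following minimal node, a condensed sketch of [move.py](https://github.com/hello-robot/stretch_tutorials/tree/noetic/src/move.py) with docstrings trimmed.

```python
#!/usr/bin/env python3
# Condensed sketch of move.py: publish Twist messages at 10 Hz to drive
# Stretch forward at 0.1 m/s until Ctrl-C.
import rospy
from geometry_msgs.msg import Twist

class Move:
    def __init__(self):
        self.pub = rospy.Publisher('/stretch/cmd_vel', Twist, queue_size=1)

    def move_forward(self):
        command = Twist()
        command.linear.x = 0.1    # Stretch only acts on linear.x ...
        command.angular.z = 0.0   # ... and angular.z (all other fields stay 0.0)
        self.pub.publish(command)

if __name__ == '__main__':
    rospy.init_node('move')
    base_motion = Move()
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        base_motion.move_forward()
        rate.sleep()
```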
## Move Stretch in Simulation
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/move.gif"/>
</p>
@ -146,11 +151,11 @@ Using your preferred text editor, modify the topic name of the published `Twist`
self.pub = rospy.Publisher('/stretch_diff_drive_controller/cmd_vel', Twist, queue_size=1)
```
After saving the edited node, bringup [Stretch in the empty world simulation](gazebo_basics.md). To drive the robot with the node, type the following in a new terminal
After saving the edited node, bring up [Stretch in the empty world simulation](gazebo_basics.md). To drive the robot with the node, type the following in a new terminal.
```bash
cd catkin_ws/src/stretch_tutorials/src/
python3 move.py
```
To stop the node from sending twist messages, type **Ctrl** + **c**.
To stop the node from sending twist messages, type `Ctrl` + `c`.

ros1/example_10.md (+48, -34)

@ -1,37 +1,38 @@
# Example 10
This tutorial provides you an idea of what tf2 can do in the Python track. We will elaborate how to create a tf2 static broadcaster and listener.
In this tutorial, we will explain how to create a tf2 static broadcaster and listener.
## tf2 Static Broadcaster
For the tf2 static broadcaster node, we will be publishing three child static frames in reference to the *link_mast*, *link_lift*, and *link_wrist_yaw* frames.
For the tf2 static broadcaster node, we will be publishing three child static frames in reference to the `link_mast`, `link_lift`, and `link_wrist_yaw` frames.
Begin by starting up the stretch driver launch file.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/tf2_broadcaster_example.rviz) with the topics for transform frames in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/tf2_broadcaster_example.rviz) with the topics for the transform frames in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
```bash
# Terminal 2
rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/tf2_broadcaster_example.rviz
```
Then run the [tf2_broadcaster.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/tf2_broadcaster.py) node to visualize three static frames.
Then run the [tf2_broadcaster.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/tf2_broadcaster.py) node to visualize three static frames. In a new terminal, execute:
```bash
# Terminal 3
cd catkin_ws/src/stretch_tutorials/src/
python3 tf2_broadcaster.py
```
The gif below visualizes what happens when running the previous node.
The GIF below visualizes what happens when running the previous node.
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/tf2_broadcaster.gif"/>
</p>
**OPTIONAL**: If you would like to see how the static frames update while the robot is in motion, run the [stow_command_node.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/stow_command.py) and observe the tf frames in RViz.
!!! tip
If you would like to see how the static frames update while the robot is in motion, run the [stow_command_node.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/stow_command.py) and observe the tf frames in RViz.
In a terminal, execute:
```bash
# Terminal 4
@ -43,7 +44,6 @@ python3 stow_command.py
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/tf2_broadcaster_with_stow.gif"/>
</p>
### The Code
```python
@ -132,6 +132,7 @@ import tf.transformations
from geometry_msgs.msg import TransformStamped
from tf2_ros import StaticTransformBroadcaster
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import `tf.transformations` to get quaternion values from Euler angles. Import the `TransformStamped` from the `geometry_msgs.msg` package because we will be publishing static frames and it requires this message type. The `tf2_ros` package provides an implementation of a `tf2_ros.StaticTransformBroadcaster` to help make the task of publishing transforms easier.
```python
@ -143,6 +144,7 @@ def __init__(self):
"""
self.br = StaticTransformBroadcaster()
```
Here we create a `TransformStamped` object which will be the message we will send over once populated.
```python
@ -151,13 +153,15 @@ self.mast.header.stamp = rospy.Time.now()
self.mast.header.frame_id = 'link_mast'
self.mast.child_frame_id = 'fk_link_mast'
```
We need to give the transform being published a timestamp, we'll just stamp it with the current time, `rospy.Time.now()`. Then, we need to set the name of the parent frame of the link we're creating, in this case *link_mast*. Finally, we need to set the name of the child frame of the link we're creating. In this instance, the child frame is *fk_link_mast*.
We need to give the transform being published a timestamp; we'll just stamp it with the current time, `rospy.Time.now()`. Then, we need to set the name of the parent frame of the link we're creating, in this case `link_mast`. Finally, we need to set the name of the child frame of the link we're creating. In this instance, the child frame is `fk_link_mast`.
```python
self.mast.transform.translation.x = 0.0
self.mast.transform.translation.y = 2.0
self.mast.transform.translation.z = 0.0
```
Set the translation values for the child frame.
```python
@ -167,58 +171,63 @@ self.wrist.transform.rotation.y = q[1]
self.wrist.transform.rotation.z = q[2]
self.wrist.transform.rotation.w = q[3]
```
The `quaternion_from_euler()` function takes in a Euler angle argument and returns a quaternion values. Then set the rotation values to the transformed quaternions.
This process will be completed for the *link_lift* and *link_wrist_yaw* as well.
The `quaternion_from_euler()` function takes in an Euler angle as an argument and returns a quaternion. Then set the rotation values to the transformed quaternions.
This process will be completed for the `link_lift` and `link_wrist_yaw` as well.
```python
self.br.sendTransform([self.mast, self.lift, self.wrist])
```
Send the three transforms using the `sendTransform()` function.
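Put together, a single-frame version of the broadcaster looks roughly like the sketch below; the frame names and the 2 m y-offset mirror the tutorial's values.

```python
#!/usr/bin/env python3
# Condensed single-frame sketch of the static broadcaster above.
import rospy
import tf.transformations
from geometry_msgs.msg import TransformStamped
from tf2_ros import StaticTransformBroadcaster

rospy.init_node('tf2_broadcaster_sketch')
br = StaticTransformBroadcaster()

mast = TransformStamped()
mast.header.stamp = rospy.Time.now()
mast.header.frame_id = 'link_mast'    # parent frame
mast.child_frame_id = 'fk_link_mast'  # new static child frame
mast.transform.translation.y = 2.0    # offset along the parent's y axis
q = tf.transformations.quaternion_from_euler(0.0, 0.0, 0.0)
mast.transform.rotation.x = q[0]
mast.transform.rotation.y = q[1]
mast.transform.rotation.z = q[2]
mast.transform.rotation.w = q[3]

br.sendTransform([mast])  # accepts one transform or a list of them
rospy.spin()
```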
```python
rospy.init_node('tf2_broadcaster')
FixedFrameBroadcaster()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master.
!!! note
The name must be a base name, i.e. it cannot contain any slashes "/".
Instantiate the `FixedFrameBroadcaster()` class.
```python
rospy.spin()
```
Give control to ROS. This will allow the callback to be called whenever new
messages come in. If we don't put this line in, then the node will not work,
and ROS will not process any messages.
## tf2 Static Listener
In the previous section of the tutorial, we created a tf2 broadcaster to publish three static transform frames. In this section we will create a tf2 listener that will find the transform between *fk_link_lift* and *link_grasp_center*.
In the previous section of the tutorial, we created a tf2 broadcaster to publish three static transform frames. In this section, we will create a tf2 listener that will find the transform between `fk_link_lift` and `link_grasp_center`.
Begin by starting up the stretch driver launch file.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Then run the [tf2_broadcaster.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/tf2_broadcaster.py) node to create the three static frames.
Then run the [tf2_broadcaster.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/tf2_broadcaster.py) node in a new terminal to create the three static frames.
```bash
# Terminal 2
cd catkin_ws/src/stretch_tutorials/src/
python3 tf2_broadcaster.py
```
Finally, run the [tf2_listener.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/tf2_listener.py) node to print the transform between two links.
Finally, run the [tf2_listener.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/tf2_listener.py) node in a separate terminal to print the transform between two links.
```bash
# Terminal 3
cd catkin_ws/src/stretch_tutorials/src/
python3 tf2_listener.py
```
Within the terminal the transform will be printed every 1 second. Below is an example of what will be printed in the terminal. There is also an image for reference of the two frames.
```bash
Within the terminal, the transform will be printed every 1 second. Below is an example of what will be printed in the terminal. There is also an image for reference of the two frames.
```{.bash .no-copy}
[INFO] [1659551318.098168]: The pose of target frame link_grasp_center with reference from fk_link_lift is:
translation:
x: 1.08415191335
@ -235,7 +244,6 @@ rotation:
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/tf2_listener.png"/>
</p>
### The Code
```python
@ -282,7 +290,6 @@ if __name__ == '__main__':
rospy.spin()
```
### The Code Explained
Now let's break the code down.
@ -297,13 +304,14 @@ from geometry_msgs.msg import TransformStamped
import tf2_ros
```
You need to import rospy if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import the `TransformStamped` from the `geometry_msgs.msg` package because we will be publishing static frames and it requires this message type. The `tf2_ros` package provides an implementation of a `tf2_ros.TransformListener` to help make the task of receiving transforms easier.
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import the `TransformStamped` from the `geometry_msgs.msg` package because we will be publishing static frames and it requires this message type. The `tf2_ros` package provides an implementation of a `tf2_ros.TransformListener` to help make the task of receiving transforms easier.
```python
tf_buffer = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(tf_buffer)
```
Here, we create a `TransformListener` object. Once the listener is created, it starts receiving tf2 transformations over the wire, and buffers them for up to 10 seconds.
Here, we create a `TransformListener` object. Once the listener is created, it starts receiving tf2 transformations and buffers them for up to 10 seconds.
```python
from_frame_rel = 'link_grasp_center'
@ -315,7 +323,8 @@ Store frame names in variables that will be used to compute transformations.
rospy.sleep(1.0)
rate = rospy.Rate(1)
```
The first line gives the listener some time to accumulate transforms. The second line is the rate the node is going to publish information (1 Hz).
The first line gives the listener some time to accumulate transforms. The second line is the rate at which the node is going to publish information (1 Hz).
```python
try:
@ -327,18 +336,23 @@ try:
except (tf2_ros.LookupException, tf2_ros.ConnectivityException, tf2_ros.ExtrapolationException):
rospy.logwarn(' Could not transform %s from %s ', to_frame_rel, from_frame_rel)
```
Try to look up the transform we want. Use a try-except block, since it may fail on any single call, due to internal timing issues in the transform publishers. Look up transform between *from_frame_rel* and *to_frame_rel* frames with the `lookup_transform()` function.
Try to look up the transformation we want. Use a try-except block, since it may fail on any single call due to internal timing issues in the transform publishers. Look up the transform between the `from_frame_rel` and `to_frame_rel` frames with the `lookup_transform()` function.
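Condensed, the listener amounts to the sketch below, using the same frame names as the tutorial.

```python
#!/usr/bin/env python3
# Condensed sketch of the tf2 listener above: print the transform
# between the two tutorial frames once per second.
import rospy
import tf2_ros

rospy.init_node('tf2_listener_sketch')
tf_buffer = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(tf_buffer)

from_frame_rel = 'link_grasp_center'
to_frame_rel = 'fk_link_lift'

rospy.sleep(1.0)  # give the buffer time to accumulate transforms
rate = rospy.Rate(1)
while not rospy.is_shutdown():
    try:
        trans = tf_buffer.lookup_transform(to_frame_rel, from_frame_rel, rospy.Time())
        rospy.loginfo('The pose of %s relative to %s is:\n%s',
                      from_frame_rel, to_frame_rel, trans)
    except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
            tf2_ros.ExtrapolationException):
        rospy.logwarn('Could not transform %s from %s', to_frame_rel, from_frame_rel)
    rate.sleep()
```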
```python
rospy.init_node('tf2_listener')
FrameListener()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master.
!!! note
The name must be a base name, i.e. it cannot contain any slashes "/".
Instantiate the `FrameListener()` class.
```python
rospy.spin()
```
Give control to ROS. This will allow the callback to be called whenever new messages come in. If we don't put this line in, then the node will not work, and ROS will not process any messages.

ros1/example_11.md (+34, -22)

@ -1,40 +1,36 @@
## Example 11
This tutorial highlights how to create a [PointCloud](http://docs.ros.org/en/melodic/api/sensor_msgs/html/msg/PointCloud.html) message from the data of a [PointCloud2](http://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/PointCloud2.html) message type, then transform the PointCloud's reference link to a different frame. The data published by the RealSense is referencing its *camera_color_optical_frame* link, and we will be changing its reference to the *base_link*.
This tutorial highlights how to create a [PointCloud](http://docs.ros.org/en/melodic/api/sensor_msgs/html/msg/PointCloud.html) message from the data of a [PointCloud2](http://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/PointCloud2.html) message type, then transform the PointCloud's reference link to a different frame. The data published by RealSense is referencing its `camera_color_optical_frame` link, and we will be changing its reference to the `base_link`.
Begin by starting up the stretch driver launch file.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
To activate the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/) and publish topics to be visualized, run the following launch file in a new terminal.
```bash
# Terminal 2
roslaunch stretch_core d435i_low_resolution.launch
```
Then run the [pointCloud_transformer.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/pointcloud_transformer.py) node.
Then run the [pointCloud_transformer.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/pointcloud_transformer.py) node. In a new terminal, execute:
```bash
# Terminal 3
cd catkin_ws/src/stretch_tutorials/src/
python3 pointcloud_transformer.py
```
Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/PointCloud_transformer_example.rviz) with the `PointCloud` in the Display tree. You can visualize this topic and the robot model by running the command below in a new terminal.
```bash
# Terminal 4
rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/PointCloud_transformer_example.rviz
```
The gif below visualizes what happens when running the previous node.
The GIF below visualizes what happens when running the previous node.
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/PointCloud_transformer.gif"/>
</p>
### The Code
```python
@ -121,6 +117,7 @@ Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
@ -131,23 +128,27 @@ from sensor_msgs.msg import PointCloud2, PointCloud
from geometry_msgs.msg import Point32
from std_msgs.msg import Header
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import `tf` to utilize the `transformPointCloud` function. Import various the message types from `sensor_msgs`.
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import `tf` to utilize the `transformPointCloud` function. Import various message types from `sensor_msgs`.
```python
self.pointcloud2_sub = rospy.Subscriber("/camera/depth/color/points", PointCloud2, self.callback_pcl2, queue_size=1)
```
Set up a subscriber. We're going to subscribe to the topic */camera/depth/color/points*, looking for `PointCloud2` message. When a message comes in, ROS is going to pass it to the function `callback_pcl2()` automatically.
Set up a subscriber. We're going to subscribe to the topic `/camera/depth/color/points`, looking for `PointCloud2` messages. When a message comes in, ROS is going to pass it to the function `callback_pcl2()` automatically.
```python
self.pointcloud_pub = rospy.Publisher("/camera_cloud", PointCloud, queue_size=1)
```
This section of code defines the talker's interface to the rest of ROS. `self.pointcloud_pub = rospy.Publisher("/camera_cloud", PointCloud, queue_size=1)` declares that your node is publishing to the */camera_cloud* topic using the message type `PointCloud`.
This section of code defines the talker's interface to the rest of ROS. `self.pointcloud_pub = rospy.Publisher("/camera_cloud", PointCloud, queue_size=1)` declares that your node is publishing to the `/camera_cloud` topic using the message type `PointCloud`.
```python
self.pcl2_cloud = None
self.listener = tf.TransformListener(True, rospy.Duration(10.0))
```
The first line of code initializes *self.pcl2_cloud* to store the `PointCloud2` message. The second line creates a `tf.TransformListener` object. Once the listener is created, it starts receiving tf transformations over the wire, and buffers them for up to 10 seconds.
The first line of code initializes `self.pcl2_cloud` to store the `PointCloud2` message. The second line creates a `tf.TransformListener` object. Once the listener is created, it starts receiving tf transformations and buffers them for up to 10 seconds.
```python
def callback_pcl2(self,msg):
@ -158,24 +159,28 @@ def callback_pcl2(self,msg):
"""
self.pcl2_cloud = msg
```
The callback function that stores the the `PointCloud2` message.
The callback function then stores the `PointCloud2` message.
```python
temp_cloud = PointCloud()
temp_cloud.header = self.pcl2_cloud.header
```
Create a `PointCloud` for temporary use. Set the temporary PointCloud's header to the stored `PointCloud2` header.
Create a `PointCloud` for temporary use. Set the temporary PointCloud header to the stored `PointCloud2` header.
```python
for data in pc2.read_points(self.pcl2_cloud, skip_nans=True):
temp_cloud.points.append(Point32(data[0],data[1],data[2]))
```
Use a for loop to extract `PointCloud2` data into a list of x, y, z points and append those values to the `PointCloud` message, *temp_cloud*.
Use a for loop to extract `PointCloud2` data into a list of x, y, and z points and append those values to the `PointCloud` message, `temp_cloud`.
```python
transformed_cloud = self.transform_pointcloud(temp_cloud)
```
Utilize the `transform_pointcloud` function to transform the points in the `PointCloud` message to reference the *base_link*
Utilize the `transform_pointcloud` function to transform the points in the `PointCloud` message to reference the `base_link`.
```python
while not rospy.is_shutdown():
@ -187,19 +192,24 @@ while not rospy.is_shutdown():
except (tf.LookupException, tf.ConnectivityException,tf.ExtrapolationException):
pass
```
Try to look up and transform the `PointCloud` input. Use a try-except block, since it may fail on any single call, due to internal timing issues in the transform publishers. Transform the point cloud data from *camera_color_optical_frame* to *base_link* with the `transformPointCloud()` function.
Try to look up and transform the `PointCloud` input. Use a try-except block, since it may fail on any single call, due to internal timing issues in the transform publishers. Transform the point cloud data from `camera_color_optical_frame` to `base_link` with the `transformPointCloud()` function.
```python
self.pointcloud_pub.publish(transformed_cloud)
```
Publish the new transformed `PointCloud`.
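Taken together, the conversion-and-transform step can be sketched as a standalone helper. This is a sketch, assuming a `tf.TransformListener` constructed with a buffer as shown above; the helper name `to_base_link` is illustrative.

```python
# Sketch of the transform step: convert a PointCloud2 message to a
# PointCloud and re-express it in the base_link frame.
import tf
import sensor_msgs.point_cloud2 as pc2
from sensor_msgs.msg import PointCloud, PointCloud2
from geometry_msgs.msg import Point32

def to_base_link(cloud2, listener):
    temp_cloud = PointCloud()
    temp_cloud.header = cloud2.header  # keep the original frame and stamp
    for x, y, z in pc2.read_points(cloud2, field_names=('x', 'y', 'z'),
                                   skip_nans=True):
        temp_cloud.points.append(Point32(x, y, z))
    # transformPointCloud may raise tf exceptions on any single call;
    # callers should catch them and retry, as the node above does.
    return listener.transformPointCloud('base_link', temp_cloud)
```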
```python
rospy.init_node('pointcloud_transformer',anonymous=True)
PCT = PointCloudTransformer()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master.
!!! note
The name must be a base name, i.e. it cannot contain any slashes "/".
Declare a `PointCloudTransformer` object.
@ -207,11 +217,13 @@ Declare a `PointCloudTransformer` object.
rate = rospy.Rate(1)
rospy.sleep(1)
```
The first line gives the listener some time to accumulate transforms. The second line is the rate the node is going to publish information (1 Hz).
The first line gives the listener some time to accumulate transforms. The second line is the rate at which the node is going to publish information (1 Hz).
```python
while not rospy.is_shutdown():
PCT.pcl_transformer()
rate.sleep()
```
Run a while loop until the node is shutdown. Within the while loop run the `pcl_transformer()` method.
Run a while loop until the node is shut down. Within the while loop run the `pcl_transformer()` method.

ros1/example_12.md (+38, -32)

@ -1,12 +1,10 @@
# Example 12
For this example, we will send follow joint trajectory commands for the head camera to search and locate an ArUco tag. In this instance, a Stretch robot will try to locate the docking station's ArUco tag.
## Modifying Stretch Marker Dictionary YAML File.
When defining the ArUco markers on Stretch, hello robot utilizes a YAML file, [stretch_marker_dict.yaml](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/config/stretch_marker_dict.yaml), that holds the information about the markers. A further breakdown of the yaml file can be found in our [Aruco Marker Detection](aruco_marker_detection.md) tutorial.
## Modifying Stretch Marker Dictionary YAML File
When defining the ArUco markers on Stretch, Hello Robot utilizes a YAML file, [stretch_marker_dict.yaml](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/config/stretch_marker_dict.yaml), that holds the information about the markers. A further breakdown of the YAML file can be found in our [Aruco Marker Detection](aruco_marker_detection.md) tutorial.
Below is what the needs to be included in the [stretch_marker_dict.yaml](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/config/stretch_marker_dict.yaml) file so the [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) node can find the docking station's ArUco tag.
Below is what needs to be included in the [stretch_marker_dict.yaml](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/config/stretch_marker_dict.yaml) file so the [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) node can find the docking station's ArUco tag.
```yaml
'245':
@ -17,39 +15,33 @@ Below is what the needs to be included in the [stretch_marker_dict.yaml](https:/
```
## Getting Started
Begin by running the stretch driver launch file.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
To activate the RealSense camera and publish topics to be visualized, run the following launch file in a new terminal.
```bash
# Terminal 2
roslaunch stretch_core d435i_high_resolution.launch
```
Next, run the stretch ArUco launch file which will bring up the [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) node.
Next, run the stretch ArUco launch file, which will bring up the [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) node. In a new terminal, execute:
```bash
# Terminal 3
roslaunch stretch_core stretch_aruco.launch
```
Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/aruco_detector_example.rviz) with the topics for transform frames in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/aruco_detector_example.rviz) with the topics for the transform frames in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
```bash
# Terminal 4
rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/aruco_detector_example.rviz
```
Then run the [aruco_tag_locator.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/aruco_tag_locator.py) node.
Then run the [aruco_tag_locator.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/aruco_tag_locator.py) node. In a new terminal, execute:
```bash
# Terminal 5
cd catkin_ws/src/stretch_tutorials/src/
python3 aruco_tag_locator.py
```
@ -207,6 +199,7 @@ Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
@ -221,9 +214,9 @@ from sensor_msgs.msg import JointState
from control_msgs.msg import FollowJointTrajectoryGoal
from trajectory_msgs.msg import JointTrajectoryPoint
from geometry_msgs.msg import TransformStamped
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import other python modules needed for this node. Import the `FollowJointTrajectoryGoal` from the [control_msgs.msg](http://wiki.ros.org/control_msgs) package to control the Stretch robot. Import `JointTrajectoryPoint` from the [trajectory_msgs](http://wiki.ros.org/trajectory_msgs) package to define robot trajectories. The [hello_helpers](https://github.com/hello-robot/stretch_ros/tree/master/hello_helpers) package consists of a module the provides various Python scripts used across [stretch_ros](https://github.com/hello-robot/stretch_ros). In this instance we are importing the `hello_misc` script.
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import other Python modules needed for this node. Import the `FollowJointTrajectoryGoal` from the [control_msgs.msg](http://wiki.ros.org/control_msgs) package to control the Stretch robot. Import `JointTrajectoryPoint` from the [trajectory_msgs](http://wiki.ros.org/trajectory_msgs) package to define robot trajectories. The [hello_helpers](https://github.com/hello-robot/stretch_ros/tree/master/hello_helpers) package consists of a module that provides various Python scripts used across [stretch_ros](https://github.com/hello-robot/stretch_ros). In this instance, we are importing the `hello_misc` script.
```python
def __init__(self):
@ -238,11 +231,12 @@ def __init__(self):
self.joint_state = None
```
The `LocateArUcoTag` class inherits the `HelloNode` class from `hm` and is instantiated.
Set up a subscriber with `rospy.Subscriber('/stretch/joint_states', JointState, self.joint_states_callback)`. We're going to subscribe to the topic "*stretch/joint_states*", looking for `JointState` messages. When a message comes in, ROS is going to pass it to the function `joint_states_callback()` automatically.
Set up a subscriber with `rospy.Subscriber('/stretch/joint_states', JointState, self.joint_states_callback)`. We're going to subscribe to the topic `stretch/joint_states`, looking for `JointState` messages. When a message comes in, ROS is going to pass it to the function `joint_states_callback()` automatically.
`rospy.Publisher('ArUco_transform', TransformStamped, queue_size=10)` declares that your node is publishing to the *ArUco_transform* topic using the message type `TransformStamped`. The `queue_size` argument limits the amount of queued messages if any subscriber is not receiving them fast enough.
`rospy.Publisher('ArUco_transform', TransformStamped, queue_size=10)` declares that your node is publishing to the `ArUco_transform` topic using the message type `TransformStamped`. The `queue_size` argument limits the amount of queued messages if any subscriber is not receiving them fast enough.
```python
self.min_pan_position = -4.10
@ -250,6 +244,7 @@ self.max_pan_position = 1.50
self.pan_num_steps = 10
self.pan_step_size = abs(self.min_pan_position - self.max_pan_position)/self.pan_num_steps
```
Provide the minimum and maximum joint positions for the head pan. These values are needed for sweeping the head to search for the ArUco tag. We also define the number of steps for the sweep, then create the step size for the head pan joint.
```python
@ -257,11 +252,13 @@ self.min_tilt_position = -0.75
self.tilt_num_steps = 3
self.tilt_step_size = pi/16
```
Set the minimum position of the tilt joint, the number of steps, and the size of each step.
```python
self.rot_vel = 0.5 # radians per sec
```
Define the head actuation rotational velocity.
```python
@ -273,6 +270,7 @@ def joint_states_callback(self, msg):
"""
self.joint_state = msg
```
The `joint_states_callback()` function stores Stretch's joint states.
```python
@ -289,7 +287,8 @@ def send_command(self, command):
trajectory_goal.trajectory.joint_names = [joint_name]
point = JointTrajectoryPoint()
```
Assign *trajectory_goal* as a `FollowJointTrajectoryGoal` message type. Then extract the string value from the `joint` key. Also, assign *point* as a `JointTrajectoryPoint` message type.
Assign `trajectory_goal` as a `FollowJointTrajectoryGoal` message type. Then extract the string value from the `joint` key. Also, assign `point` as a `JointTrajectoryPoint` message type.
```python
if 'delta' in command:
@ -299,13 +298,15 @@ if 'delta' in command:
new_value = joint_value + delta
point.positions = [new_value]
```
Check to see if `delta` is a key in the command dictionary. Then get the current position of the joint and add the delta as a a new position value.
Check to see if `delta` is a key in the command dictionary. Then get the current position of the joint and add the delta as a new position value.
```python
elif 'position' in command:
point.positions = [command['position']]
```
Check to see if `position`is a key in the command dictionary. Then extract the position value.
Check to see if `position` is a key in the command dictionary. Then extract the position value.
```python
point.velocities = [self.rot_vel]
@ -315,7 +316,8 @@ trajectory_goal.trajectory.header.frame_id = 'base_link'
self.trajectory_client.send_goal(trajectory_goal)
self.trajectory_client.wait_for_result()
```
Then `trajectory_goal.trajectory.points` is defined by the positions set in *point*. Specify the coordinate frame that we want (*base_link*) and set the time to be now. Make the action call and send the goal. The last line of code waits for the result before it exits the python script.
Then `trajectory_goal.trajectory.points` is defined by the positions set in `point`. Specify the coordinate frame that we want (`base_link`) and set the time to be now. Make the action call and send the goal. The last line of code waits for the result before it exits the Python script.
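In short, the command-sending path can be sketched as one helper. This assumes `trajectory_client` is the `FollowJointTrajectory` action client that `hm.HelloNode` creates; the helper name `send_position` is illustrative.

```python
# Sketch of send_command(): move one named joint to an absolute position
# through the FollowJointTrajectory action server.
import rospy
from control_msgs.msg import FollowJointTrajectoryGoal
from trajectory_msgs.msg import JointTrajectoryPoint

def send_position(trajectory_client, joint_name, position, velocity=0.5):
    point = JointTrajectoryPoint()
    point.positions = [position]
    point.velocities = [velocity]

    trajectory_goal = FollowJointTrajectoryGoal()
    trajectory_goal.trajectory.joint_names = [joint_name]
    trajectory_goal.trajectory.points = [point]
    trajectory_goal.trajectory.header.stamp = rospy.Time.now()
    trajectory_goal.trajectory.header.frame_id = 'base_link'

    trajectory_client.send_goal(trajectory_goal)
    trajectory_client.wait_for_result()  # blocks until the motion finishes
```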
```python
def find_tag(self, tag_name='docking_station'):
@ -331,9 +333,9 @@ def find_tag(self, tag_name='docking_station'):
self.send_command(pan_command)
tilt_command = {'joint': 'joint_head_tilt', 'position': self.min_tilt_position}
self.send_command(tilt_command)
```
Create a dictionaries to get the head in its initial position for its search and send the commands the the `send_command()` function.
Create dictionaries to get the head into its initial position for the search and send the commands with the `send_command()` function.
```python
for i in range(self.tilt_num_steps):
@ -342,8 +344,8 @@ for i in range(self.tilt_num_steps):
self.send_command(pan_command)
rospy.sleep(0.5)
```
Utilize nested for loop to sweep the pan and tilt in increments. Then update the *joint_head_pan* position by the *pan_step_size*.
Use `rospy.sleep()` function to give time for system to do a Transform lookup before next step.
Utilize a nested for loop to sweep the pan and tilt in increments. Then update the `joint_head_pan` position by the `pan_step_size`. Use the `rospy.sleep()` function to give the system time to do a transform lookup before the next step.
```python
try:
@ -356,7 +358,8 @@ try:
except (tf2_ros.LookupException, tf2_ros.ConnectivityException, tf2_ros.ExtrapolationException):
continue
```
Use a try-except block to look up the transform between the *base_link* and requested ArUco tag. Then publish and return the `TransformStamped` message.
Use a try-except block to look up the transform between the `base_link` and the requested ArUco tag. Then publish and return the `TransformStamped` message.
```python
pan_command = {'joint': 'joint_head_pan', 'position': self.min_pan_position}
@ -365,8 +368,8 @@ tilt_command = {'joint': 'joint_head_tilt', 'delta': self.tilt_step_size}
self.send_command(tilt_command)
rospy.sleep(.25)
```
Begin sweep with new tilt angle.
Begin sweep with new tilt angle.
```python
def main(self):
@ -376,8 +379,8 @@ def main(self):
"""
hm.HelloNode.main(self, 'aruco_tag_locator', 'aruco_tag_locator', wait_for_first_pointcloud=False)
```
Create a funcion, `main()`, to do all of the setup for the `hm.HelloNode` class and initialize the `aruco_tag_locator` node.
Create a function, `main()`, to do the setup for the `hm.HelloNode` class and initialize the `aruco_tag_locator` node.
```python
self.static_broadcaster = tf2_ros.StaticTransformBroadcaster()
@ -385,13 +388,15 @@ self.tf_buffer = tf2_ros.Buffer()
self.listener = tf2_ros.TransformListener(self.tf_buffer)
rospy.sleep(1.0)
```
Create a StaticTranformBoradcaster Node. Also, start a tf buffer that will store the tf information for a few seconds.Then set up a tf listener, which will subscribe to all of the relevant tf topics, and keep track of the information. Include `rospy.sleep(1.0)` to give the listener some time to accumulate transforms.
Create a `StaticTransformBroadcaster` node. Also, start a tf buffer that will store the tf information for a few seconds. Then set up a tf listener, which will subscribe to all of the relevant tf topics and keep track of the information. Include `rospy.sleep(1.0)` to give the listener some time to accumulate transforms.
```python
rospy.loginfo('Searching for docking ArUco tag.')
pose = self.find_tag("docking_station")
```
Notify Stretch is searching for the ArUco tag with a `rospy.loginfo()` function. Then search for the ArUco marker for the docking station.
Notify that Stretch is searching for the ArUco tag with a `rospy.loginfo()` function. Then search for the ArUco marker for the docking station.
```python
if __name__ == '__main__':
@ -401,4 +406,5 @@ if __name__ == '__main__':
except KeyboardInterrupt:
rospy.loginfo('interrupt received, so shutting down')
```
Declare `LocateArUcoTag` object. Then run the `main()` method.
Declare `LocateArUcoTag` object. Then run the `main()` method.

ros1/example_13.md (+24, -11)

@ -1,30 +1,28 @@
# Example 13
In this example, we will be utilizing the [move_base ROS node](http://wiki.ros.org/move_base), a component of the [ROS navigation stack](http://wiki.ros.org/navigation?distro=melodic) to send base goals to the Stretch robot.
In this example, we will be utilizing the [move_base package](http://wiki.ros.org/move_base), a component of the [ROS navigation stack](http://wiki.ros.org/navigation?distro=melodic), to send base goals to the Stretch robot.
## Build a map
First, begin by building a map of the space the robot will be navigating in. If you need a refresher on how to do this, then check out the [Navigation Stack tutorial](navigation_stack.md).
## Getting Started
Next, with your created map, we can navigate the robot around the mapped space. Run:
```bash
roslaunch stretch_navigation navigation.launch map_yaml:=${HELLO_FLEET_PATH}/maps/<map_name>.yaml
```
Where `${HELLO_FLEET_PATH}` is the path of the `<map_name>.yaml` file.
!!! note
    It's likely that the robot's location on the map does not match the robot's location in real space. In the top bar of RViz, use `2D Pose Estimate` to lay an arrow down roughly where the robot is located in real space. Below is a GIF for reference.
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/2D_pose_estimate.gif"/>
</p>
Now we are going to use a node to send a move_base goal half a meter in front of the map's origin. Run the following command in a new terminal to execute the [navigation.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/navigation.py) node.
```bash
# Terminal 2
cd catkin_ws/src/stretch_tutorials/src/
python3 navigation.py
```
### The Code Explained
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal
from geometry_msgs.msg import Quaternion
from tf import transformations
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes).
```python
self.client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
self.client.wait_for_server()
rospy.loginfo('{0}: Made contact with move_base server'.format(self.__class__.__name__))
```
Set up a client for the navigation action. On the Stretch, this is called `move_base`, and has type `MoveBaseAction`. Once we make the client, we wait for the server to be ready.
```python
self.goal = MoveBaseGoal()
self.goal.target_pose.header.frame_id = 'map'
self.goal.target_pose.header.stamp = rospy.Time()
```
Make a goal for the action. Specify the coordinate frame that we want, in this instance the `map` frame. Then we set the time to be now.
```python
self.goal.target_pose.pose.position.x = 0.0
self.goal.target_pose.pose.orientation.y = 0.0
self.goal.target_pose.pose.orientation.z = 0.0
self.goal.target_pose.pose.orientation.w = 1.0
```
Initialize a position in the coordinate frame.
```python
def get_quaternion(self, theta):
"""
return Quaternion(*transformations.quaternion_from_euler(0.0, 0.0, theta))
```
A function that transforms Euler angles to quaternions and returns those values.
```python
def go_to(self, x, y, theta, wait=False):
rospy.loginfo('{0}: Heading for ({1}, {2}) at {3} radians'.format(self.__class__.__name__,
x, y, theta))
```
The `go_to()` function takes in 3 arguments: the x and y coordinates in the `map` frame, and the heading.
```python
self.goal.target_pose.pose.position.x = x
self.goal.target_pose.pose.position.y = y
self.goal.target_pose.pose.orientation = self.get_quaternion(theta)
```
The `MoveBaseGoal()` data structure has three goal positions (in meters), one along each axis. For Stretch, it will only pay attention to the x and y coordinates, since it can't move in the z-direction.
```python
self.client.send_goal(self.goal, done_cb=self.done_callback)
self.client.wait_for_result()
```
Send the goal and include the `done_callback()` function in one of the arguments in `send_goal()`.
```python
def done_callback(self, status, result):
else:
rospy.loginfo('{0}: FAILED in reaching the goal.'.format(self.__class__.__name__))
```
Conditional statement to print whether the goal status in the `MoveBaseActionResult` succeeded or failed.
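For reference, a hedged sketch of what such a callback can look like, using the `GoalStatus` values from `actionlib_msgs` (the exact logging strings are assumptions):

```python
from actionlib_msgs.msg import GoalStatus

def done_callback(self, status, result):
    # status is a GoalStatus value; SUCCEEDED == 3
    if status == GoalStatus.SUCCEEDED:
        rospy.loginfo('{0}: SUCCEEDED in reaching the goal.'.format(self.__class__.__name__))
    else:
        rospy.loginfo('{0}: FAILED in reaching the goal.'.format(self.__class__.__name__))
```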
```python
rospy.init_node('navigation', argv=sys.argv)
nav = StretchNavigation()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master.

!!! note
    The name must be a base name, i.e. it cannot contain any slashes "/".
Declare the `StretchNavigation` object.
```python
nav.go_to(0.5, 0.0, 0.0)
```
Send a move_base goal half a meter in front of the map's origin.
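If you wanted to send a different goal, the same method applies; the second call below is purely hypothetical:

```python
nav.go_to(0.5, 0.0, 0.0)   # half a meter ahead of the map origin, facing forward
nav.go_to(1.0, 0.5, 1.57)  # hypothetical second goal: x, y in meters, heading in radians
```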
+ 29
- 20
ros1/example_2.md View File
## Example 2
The aim of this example is to provide instruction on how to filter scan messages.
For robots with laser scanners, ROS provides a special Message type in the [sensor_msgs](http://wiki.ros.org/sensor_msgs) package called [LaserScan](http://docs.ros.org/en/api/sensor_msgs/html/msg/LaserScan.html) to hold information about a given scan. Let's take a look at the message specification itself:
```{.bash .no-copy}
# Laser scans angles are measured counter clockwise, with Stretch's LiDAR having
# both angle_min and angle_max facing forward (very closely along the x-axis)
float32 range_max # maximum range value [m]
float32[] ranges # range data [m] (Note: values < range_min or > range_max should be discarded)
float32[] intensities # intensity data [device-specific units]
```
The above message tells you everything you need to know about a scan. Most importantly, you have the angle of each hit and its distance (range) from the scanner. If you want to work with raw range data, then the above message is all you need. There is also an image below that illustrates the components of the message type.
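As a minimal illustration of working with this raw data (a sketch, assuming a `/scan` topic is being published), each range can be paired with its angle using `angle_min` and `angle_increment`:

```python
#!/usr/bin/env python3
import rospy
from sensor_msgs.msg import LaserScan

def on_scan(msg):
    # Walk the ranges, reconstructing each hit's angle from the message fields
    angle = msg.angle_min
    for r in msg.ranges:
        if msg.range_min < r < msg.range_max:  # discard out-of-range returns
            rospy.loginfo('hit at %.2f rad, %.2f m', angle, r)
        angle += msg.angle_increment

rospy.init_node('scan_echo')
rospy.Subscriber('/scan', LaserScan, on_scan)
rospy.spin()
```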
The start angle, `angle_min`, and end angle, `angle_max`, are closely located along the x-axis of Stretch's frame.
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/scan_angles.png"/>
</p>
Knowing the orientation of the LiDAR allows us to filter the scan values for a desired range. In this case, we are only considering the scan ranges in front of the Stretch robot.
First, open a terminal and run the stretch driver launch file.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Then in a new terminal run the `rplidar.launch` file from `stretch_core`.
```bash
# Terminal 2
roslaunch stretch_core rplidar.launch
```
To filter the LiDAR scans for ranges that are directly in front of Stretch (a width of 1 meter), run the [scan_filter.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/scan_filter.py) node by typing the following in a new terminal.
```bash
# Terminal 3
cd catkin_ws/src/stretch_tutorials/src/
python3 scan_filter.py
```
Then run the following command in a separate terminal to bring up a simple RViz configuration of the Stretch robot.
```bash
# Terminal 4
rosrun rviz rviz -d `rospack find stretch_core`/rviz/stretch_simple_test.rviz
```
Change the topic name of the LaserScan display from `/scan` to `/filtered_scan`.
### The Code Explained
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
from numpy import linspace, inf
from math import sin
from sensor_msgs.msg import LaserScan
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). There are functions from `numpy` and `math` that are required within this code, which is why `linspace`, `inf`, and `sin` are imported. The `sensor_msgs.msg` import is so that we can subscribe to and publish `LaserScan` messages.
```python
self.width = 1
self.extent = self.width / 2.0
```
We're going to assume that the robot is pointing up the x-axis, so that any points with y coordinates further than half of the defined width (1 meter) from the axis are not considered.
```python
self.sub = rospy.Subscriber('/scan', LaserScan, self.callback)
```
Set up a subscriber. We're going to subscribe to the topic `scan`, looking for `LaserScan` messages. When a message comes in, ROS is going to pass it to the function `callback` automatically.
```python
self.pub = rospy.Publisher('filtered_scan', LaserScan, queue_size=10)
```
`pub = rospy.Publisher("filtered_scan", LaserScan, queue_size=10)` declares that your node is publishing to the `filtered_scan` topic using the message type `LaserScan`. This lets the master tell any nodes listening on `filtered_scan` that we are going to publish data on that topic.
```python
angles = linspace(msg.angle_min, msg.angle_max, len(msg.ranges))
```
This line of code utilizes `linspace` to compute each angle of the subscribed scan.
```python
points = [r * sin(theta) if (theta < -2.5 or theta > 2.5) else inf for r,theta in zip(msg.ranges, angles)]
```
Here we are computing the y coordinates of the ranges that are **below -2.5** and **above 2.5** radians of the scan angles. These limits are sufficient for considering scan ranges in front of Stretch, but these values can be altered to your preference.
```python
new_ranges = [r if abs(y) < self.extent else inf for r,y in zip(msg.ranges, points)]
```
If the absolute value of a point's y-coordinate is under `self.extent`, then keep the range; otherwise use `inf`, which means "no return".
```python
msg.ranges = new_ranges
self.pub.publish(msg)
```
Substitute the new ranges in the original message, and republish it.
```python
rospy.init_node('scan_filter')
ScanFilter()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master.

!!! note
    The name must be a base name, i.e. it cannot contain any slashes "/".
Instantiate the class with `ScanFilter()`.
```python
rospy.spin()
```
Give control to ROS. This will allow the callback to be called whenever new messages come in. If we don't put this line in, then the node will not work, and ROS will not process any messages.
+ 25
- 20
ros1/example_3.md View File
## Example 3
The aim of example 3 is to combine the two previous examples and have Stretch utilize its laser scan data to avoid collision with objects as it drives forward.
Begin by running the following command in a new terminal.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Then, in a new terminal, type the following to activate the LiDAR sensor.
```bash
# Terminal 2
roslaunch stretch_core rplidar.launch
```
To set `navigation` mode and to activate the [avoider.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/avoider.py) node, type the following in a new terminal.
```bash
# Terminal 3
rosservice call /switch_to_navigation_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 avoider.py
```
To stop the node from sending twist messages, type `Ctrl` + `c` in the terminal running the avoider node.
<p align="center">
<img height=600 src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/avoider.gif"/>
</p>

### The Code Explained
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
from math import sin
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). There are functions from `numpy` and `math` that are required within this code, thus `linspace`, `inf`, `tanh`, and `sin` are imported. The `sensor_msgs.msg` import is so that we can subscribe to `LaserScan` messages. The `geometry_msgs.msg` import is so that we can send velocity commands to the robot.
```python
self.pub = rospy.Publisher('/stretch/cmd_vel', Twist, queue_size=1)  # /stretch_diff_drive_controller/cmd_vel for gazebo
```
This section of the code defines the talker's interface to the rest of ROS. `pub = rospy.Publisher("/stretch/cmd_vel", Twist, queue_size=1)` declares that your node is publishing to the `/stretch/cmd_vel` topic using the message type `Twist`.
```python
self.sub = rospy.Subscriber('/scan', LaserScan, self.set_speed)
```
Set up a subscriber. We're going to subscribe to the topic `scan`, looking for `LaserScan` messages. When a message comes in, ROS is going to pass it to the function `set_speed()` automatically.
```python
self.width = 1
self.extent = self.width / 2.0
self.distance = 0.5
```
`self.width` is the width of the laser scan we want in front of Stretch. Since Stretch's front is pointing along the x-axis, any points with y coordinates further than half of the defined width (`self.extent`) from the x-axis are not considered. `self.distance` defines the stopping distance from an object.
```python
self.twist = Twist()
self.twist.angular.x = 0.0
self.twist.angular.y = 0.0
self.twist.angular.z = 0.0
```
Allocate a `Twist` to use, and set everything to zero. We're going to do this when the class is initiated. Redefining this within the callback function, `set_speed()`, can be more computationally taxing.
```python
angles = linspace(msg.angle_min, msg.angle_max, len(msg.ranges))
points = [r * sin(theta) if (theta < -2.5 or theta > 2.5) else inf for r,theta in zip(msg.ranges, angles)]
new_ranges = [r if abs(y) < self.extent else inf for r,y in zip(msg.ranges, points)]
```
This line of code utilizes `linspace` to compute each angle of the subscribed scan. Here we compute the y coordinates of the ranges that are below -2.5 and above 2.5 radians of the scan angles. These limits are sufficient for considering scan ranges in front of Stretch, but these values can be altered to your preference. If the absolute value of a point's y-coordinate is under `self.extent`, then keep the range; otherwise use `inf`, which means "no return".
```python
error = min(new_ranges) - self.distance
```
Calculate the difference between the closest measured scan and where we want the robot to stop. We define this as `error`.
```python
self.twist.linear.x = tanh(error) if (error > 0.05 or error < -0.05) else 0
```
Set the speed according to a `tanh` function. This method gives a nice smooth mapping from distance to speed, and asymptotes at +/- 1.
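To see the mapping concretely, here is a small standalone illustration of that expression (the sample errors are arbitrary):

```python
from math import tanh

for error in (-0.50, -0.04, 0.00, 0.30, 2.00):
    speed = tanh(error) if (error > 0.05 or error < -0.05) else 0
    # near zero error the speed snaps to 0; large errors saturate near +/- 1
    print('error={:+.2f} m -> speed={:+.3f}'.format(error, speed))
```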
```python
self.pub.publish(self.twist)
```
Publish the `Twist` message.
```python
rospy.init_node('avoider')
Avoider()
rospy.spin()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master.

!!! note
    The name must be a base name, i.e. it cannot contain any slashes "/".

Instantiate the class with `Avoider()`.
Give control to ROS with `rospy.spin()`. This will allow the callback to be called whenever new messages come in. If we don't put this line in, then the node will not work, and ROS will not process any messages.
+ 26
- 12
ros1/example_4.md View File
## Example 4
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/balloon.png"/>
</p>
Let's bring up Stretch in the Willow Garage world from the [gazebo basics tutorial](gazebo_basics.md) and RViz by using the following command.
```bash
# Terminal 1
roslaunch stretch_gazebo gazebo.launch world:=worlds/willowgarage.world rviz:=true
```
The `rviz` flag will open an RViz window to visualize a variety of ROS topics. In a new terminal, run the following commands to execute the [marker.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/marker.py) node.
```bash
# Terminal 2
cd catkin_ws/src/stretch_tutorials/src/
python3 marker.py
```
The GIF below demonstrates how to add a new `Marker` display type and change the topic name from `/visualization_marker` to `/balloon`. A red sphere marker should appear above the Stretch robot.
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/balloon.gif"/>
</p>
### The Code
```python
#!/usr/bin/env python3
```

### The Code Explained
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
```

You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes).
```python
self.pub = rospy.Publisher('balloon', Marker, queue_size=10)
```
This section of code defines the talker's interface to the rest of ROS. `pub = rospy.Publisher("balloon", Marker, queue_size=10)` declares that your node is publishing to the `/balloon` topic using the message type `Marker`.
```python
self.marker = Marker()
self.marker.header.frame_id = 'base_link'
self.marker.header.stamp = rospy.Time()
self.marker.type = self.marker.SPHERE
```
Create a `Marker()` message type. Markers of all shapes share a common type. Set the frame ID and type. The frame ID is the frame in which the position of the marker is specified. The type is the shape of the marker. Further details on marker shapes can be found here: [RViz Markers](http://wiki.ros.org/rviz/DisplayTypes/Marker)
```python
self.marker.id = 0
```
Each marker has a unique ID number. If you have more than one marker that you want to be displayed at a given time, then each needs to have a unique ID number. If you publish a new marker with the same ID number as an existing marker, it will replace the existing marker with that ID number.
```python
self.marker.action = self.marker.ADD
```
This line of code sets the action. You can add, delete, or modify markers.
```python
self.marker.scale.x = 0.5
self.marker.scale.y = 0.5
self.marker.scale.z = 0.5
```
These are the size parameters for the marker. These will vary by marker type.
```python
self.marker.color.r = 1.0
self.marker.color.g = 0.0
self.marker.color.b = 0.0
```
Color of the object, specified as r/g/b/a, with values in the range of [0, 1].
```python
self.marker.color.a = 1.0
```
The alpha value is from 0 (invisible) to 1 (opaque). If you don't set this then it will automatically default to zero, making your marker invisible.
```python
self.marker.pose.position.x = 0.0
self.marker.pose.position.y = 0.0
self.marker.pose.position.z = 2.0
```
Specify the pose of the marker. Since spheres are rotationally invariant, we're only going to specify the positional elements. As usual, these are in the coordinate frame named in `frame_id`. In this case, the position will always be directly 2 meters above the frame_id (*base_link*), and will move with it.
```python
def publish_marker(self):
self.publisher.publish(self.marker)
```
Publish the Marker data structure to be visualized in RViz.
```python
rospy.init_node('marker', argv=sys.argv)
balloon = Balloon()
rate = rospy.Rate(10)
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master.

!!! note
    The name must be a base name, i.e. it cannot contain any slashes "/".

Instantiate the class with `Balloon()`.

Give control to ROS with `rospy.spin()`. This will allow the callback to be called whenever new messages come in. If we don't put this line in, then the node will not work, and ROS will not process any messages.
```python
while not rospy.is_shutdown():
balloon.publish_marker()
rate.sleep()
```
This loop is a fairly standard rospy construct: checking the `rospy.is_shutdown()` flag and then doing work. You have to check `is_shutdown()` to check if your program should exit (e.g. if there is a `Ctrl-c` or otherwise). The loop calls `rate.sleep()`, which sleeps just long enough to maintain the desired rate through the loop.
+ 23
- 12
ros1/example_5.md View File
## Example 5
In this example, we will review a Python script that prints out the positions of a selected group of Stretch joints. This script is helpful if you need the joint positions after you have teleoperated Stretch with the Xbox controller or physically moved the robot to a desired configuration after hitting the run stop button.
If you are looking for a continuous print of the joint states while Stretch is in action, then you can use the [rostopic command-line tool](http://wiki.ros.org/rostopic) shown in the [Internal State of Stretch Tutorial](internal_state_of_stretch.md).
Begin by starting up the stretch driver launch file.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
You can then hit the run-stop button (you should hear a beep and the LED light in the button will blink) and move the robot's joints to a desired configuration. Once you are satisfied with the configuration, hold the run-stop button until you hear a beep. Then run the following command to execute the [joint_state_printer.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/joint_state_printer.py) node, which will print the joint positions of the lift, arm, and wrist. In a new terminal, execute:
```bash
cd catkin_ws/src/stretch_tutorials/src/
python3 joint_state_printer.py
```
Your terminal will output the `position` information of the previously mentioned joints shown below.
```{.bash .no-copy}
name: ['joint_lift', 'wrist_extension', 'joint_wrist_yaw']
position: [0.6043133175850597, 0.19873586673129257, 0.017257283863713464]
```
!!! note
    Stretch's arm has four prismatic joints and the sum of their positions gives the *wrist_extension* distance. The *wrist_extension* is needed when sending [joint trajectory commands](follow_joint_trajectory.md) to the robot. Further, you cannot actuate an individual arm joint. Here is an image of the arm joints for reference:
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/joints.png"/>
</p>
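For illustration, here is a short sketch of how the *wrist_extension* distance can be computed from a `JointState` message; the four joint names follow Stretch's standard URDF and are assumptions here:

```python
from sensor_msgs.msg import JointState

ARM_JOINTS = ['joint_arm_l0', 'joint_arm_l1', 'joint_arm_l2', 'joint_arm_l3']

def wrist_extension(msg):
    # Sum the positions of the four telescoping arm segments
    return sum(msg.position[msg.name.index(j)] for j in ARM_JOINTS)
```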
### The Code
```python
#!/usr/bin/env python3
if __name__ == '__main__':
    rospy.spin()
```
### The Code Explained
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
import sys
from sensor_msgs.msg import JointState
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import `sensor_msgs.msg` so that we can subscribe to `JointState` messages.
```python
self.sub = rospy.Subscriber('joint_states', JointState, self.callback)
```
Set up a subscriber. We're going to subscribe to the topic `joint_states`, looking for `JointState` messages. When a message comes in, ROS is going to pass it to the function `callback` automatically.
```python
def callback(self, msg):
self.joint_states = msg
```
This is the callback function where the `JointState` messages are stored as `self.joint_states`. Further information about this message type can be found here: [JointState Message](http://docs.ros.org/en/lunar/api/sensor_msgs/html/msg/JointState.html)
```python
def print_states(self, joints):
joint_positions = []
```
This is the `print_states()` function, which takes in a list of joints of interest as its argument. There is also an empty list, `joint_positions`, where the positions of the requested joints will be appended.
```python
for joint in joints:
    index = self.joint_states.name.index(joint)
    joint_positions.append(self.joint_states.position[index])
```
In this section of the code, a for loop is used to parse the names of the requested joints from the `self.joint_states` list. The `index()` function returns the index of the name of the requested joint, and the respective position is appended to the `joint_positions` list.
```python
rospy.signal_shutdown("done")
sys.exit(0)
```
The first line of code initiates a clean shutdown of ROS. The second line of code exits the Python interpreter.
```python
rospy.init_node('joint_state_printer', anonymous=True)
JSP = JointStatePublisher()
rospy.sleep(.1)
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master.

!!! note
    The name must be a base name, i.e. it cannot contain any slashes "/".
Declare the object *JSP* from the `JointStatePublisher` class.
```python
joints = ["joint_lift", "wrist_extension", "joint_wrist_yaw"]
#joints = ["joint_head_pan","joint_head_tilt", joint_gripper_finger_left", "joint_gripper_finger_right"]
JSP.print_states(joints)
```
Create a list of the desired joints whose positions you want printed. Then use that list as an argument for the `print_states()` method.
```python
rospy.spin()
```
Give control to ROS with `rospy.spin()`. This will allow the callback to be called whenever new messages come in. If we don't put this line in, then the node will not work, and ROS will not process any messages.
+ 31
- 16
ros1/example_6.md View File
## Example 6
In this example, we will review a Python script that prints and stores the effort values from a specified joint. If you are looking for a continuous print of the joint state efforts while Stretch is in action, then you can use the [rostopic command-line tool](http://wiki.ros.org/rostopic) shown in the [Internal State of Stretch Tutorial](internal_state_of_stretch.md).
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/effort_sensing.gif"/>
</p>
Begin by running the following command in a terminal.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Switch the mode to `position` mode using a rosservice call. Then run the [effort_sensing.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/effort_sensing.py) node. In a new terminal, execute:
```bash
# Terminal 2
rosservice call /switch_to_position_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 effort_sensing.py
```
This will send a `FollowJointTrajectory` command to move Stretch's arm or head while also printing the effort of the lift.
### The Code
```python
#!/usr/bin/env python3
import rospy
```
### The Code Explained
This code is similar to that of the [multipoint_command](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/multipoint_command.py) and [joint_state_printer](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/joint_state_printer.py) nodes. Therefore, this example will highlight sections that are different from those tutorials. Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
from trajectory_msgs.msg import JointTrajectoryPoint
from sensor_msgs.msg import JointState
import hello_helpers.hello_misc as hm
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import the `FollowJointTrajectoryGoal` from the `control_msgs.msg` package to control the Stretch robot. Import `JointTrajectoryPoint` from the `trajectory_msgs` package to define robot trajectories. The `hello_helpers` package consists of a module that provides various Python scripts used across [stretch_ros](https://github.com/hello-robot/stretch_ros). In this instance, we are importing the `hello_misc` script.
```Python
class JointActuatorEffortSensor(hm.HelloNode):
    def __init__(self, export_data=False):
        """
        ...
        """
        hm.HelloNode.__init__(self)
```
The `JointActuatorEffortSensor` class inherits the `HelloNode` class from `hm` and is initialized.
```python
self.sub = rospy.Subscriber('joint_states', JointState, self.callback)
self.joints = ['joint_lift']
```
Set up a subscriber. We're going to subscribe to the topic `joint_states`, looking for `JointState` messages. When a message comes in, ROS is going to pass it to the function `callback` automatically. Create a list of the desired joints you want to print.
```Python
self.joint_effort = []
self.save_path = '/home/hello-robot/catkin_ws/src/stretch_tutorials/stored_data'
self.export_data = export_data
```
Create an empty list to store the joint effort values. The `self.save_path` is the directory path where the .txt file of the effort values will be stored. You can change this path to a preferred directory. The `self.export_data` is a boolean and its default value is set to `False`. If set to `True`, then the joint values will be stored in a .txt file, otherwise, the values will be printed in the terminal where you ran the effort sensing node.
```python
self.trajectory_client.send_goal(trajectory_goal, feedback_cb=self.feedback_callback, done_cb=self.done_callback)
```
Include the `feedback_callback()` and `done_callback()` functions as arguments in the `send_goal()` function.
```python
def feedback_callback(self, feedback):
    """
    :param feedback: FollowJointTrajectoryActionFeedback message.
    """
```
The feedback callback function takes in the `FollowJointTrajectoryActionFeedback` message as its argument.
```python
if 'wrist_extension' in self.joints:
    self.joints.remove('wrist_extension')
    self.joints.append('joint_arm_l0')
```
Use a conditional statement to replace `wrist_extension` with `joint_arm_l0`. This is because `joint_arm_l0` has the effort values that the `wrist_extension` is experiencing.
```python
current_effort = []
for joint in self.joints:
    index = self.joint_states.name.index(joint)
    current_effort.append(self.joint_states.effort[index])
```
Create an empty list to store the current effort values. Then use a for loop to parse the joint names and effort values.
```python
if not self.export_data:
    ...  # elided: print the current effort values to the terminal
else:
    self.joint_effort.append(current_effort)
```
Use a conditional statement to print effort values in the terminal or store values into a list that will be used for exporting the data in a .txt file.
```Python
def done_callback(self, status, result):
    """
    :param result: result attribute from FollowJointTrajectoryActionResult message.
    """
```
The done callback function takes in the status and result attributes of the `FollowJointTrajectoryActionResult` message as its arguments.
```python
else:
    rospy.loginfo('Failed')
```
Conditional statement to print whether the goal status in the `FollowJointTrajectoryActionResult` succeeded or failed.
```python
if self.export_data:
    ...  # elided: open a time-stamped file and create a csv writer
    writer.writerows(self.joint_effort)
```
A conditional statement is used to export the data to a .txt file. The file's name is set to the date and time the node was executed, so that no previous files are overwritten.
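A hedged sketch of that export step, using Python's `csv` module and the time-stamped naming seen above (the directory and format string are assumptions):

```python
import csv
from datetime import datetime

file_name = datetime.now().strftime('%Y-%m-%d_%I:%M:%S-%p')
path = '/home/hello-robot/catkin_ws/src/stretch_tutorials/stored_data/{0}.txt'.format(file_name)

joint_effort = [[5.2], [5.5], [6.1]]  # stand-in for self.joint_effort
with open(path, 'w') as f:
    writer = csv.writer(f)
    writer.writerows(joint_effort)
```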
### Plotting/Animating Effort Data
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/stored_data/2022-06-30_11:26:20-AM.png"/>
</p>
We added a simple Python script, [stored_data_plotter.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/stored_data_plotter.py), to this package for plotting the stored data.

!!! note
    You have to change the name of the file you wish to see in the Python script. This is shown below:
```Python
####################### Copy the file name here! #######################
file_name = '2022-06-30_11:26:20-AM'
```
Once you have changed the file name, run the following commands in a new terminal.
```bash
cd catkin_ws/src/stretch_tutorials/src/
python3 stored_data_plotter.py
```
Because this is not a node, you don't need `roscore` to run this script. Please review the comments in the Python script for additional guidance.
+ 32
- 27
ros1/example_7.md View File
## Example 7
In this example, we will review the [image_view](http://wiki.ros.org/image_view?distro=noetic) package and methods to capture an image from Stretch's [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/).
Begin by running the stretch driver launch file.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
To activate the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/) and publish topics to be visualized, run the following launch file in a new terminal.
```bash
# Terminal 2
roslaunch stretch_core d435i_low_resolution.launch
```
Within this tutorial package, there is an RViz config file with the topics for perception already in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
```bash
# Terminal 3
rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/perception_example.rviz
```
## Capture Image with image_view
There are a couple of methods to save an image using the [image_view](http://wiki.ros.org/image_view) package.
**OPTION 1:** Use the `image_view` node to open a simple image viewer for ROS *sensor_msgs/image* topics. In a new terminal, execute:
```bash
# Terminal 4
rosrun image_view image_view image:=/camera/color/image_raw_upright_view
```
Then you can save the current image by right-clicking on the display window. By default, images will be saved as frame0000.jpg, frame0001.jpg, etc. Note that the image will be saved to the terminal's current working directory.

**OPTION 2:** Use the `image_saver` node to save an image to the terminal's current working directory. In a new terminal, execute:
```bash
# Terminal 4
rosrun image_view image_saver image:=/camera/color/image_raw_upright_view
```
## Capture Image with Python Script
In this section, we will use a Python node to capture an image from the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/). Execute the [capture_image.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/capture_image.py) node to save a .jpeg image of the image topic `/camera/color/image_raw_upright_view`. In a terminal, execute:
```bash
# Terminal 4
cd ~/catkin_ws/src/stretch_tutorials/src
python3 capture_image.py
```
An image named **camera_image.jpeg** is saved in the **stored_data** folder in this package.
### The Code
```python
#!/usr/bin/env python3
```

### The Code Explained
Now let's break the code down.

```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
import sys
import os
import cv2
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). There are functions from `sys`, `os`, and `cv2` that are required within this code. `cv2` is a library of Python functions that implements computer vision algorithms. Further information about `cv2` can be found here: [OpenCV Python](https://www.geeksforgeeks.org/opencv-python-tutorial/).
```python
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError
```
The `sensor_msgs.msg` is imported so that we can subscribe to ROS `Image` messages. Import [CvBridge](http://wiki.ros.org/cv_bridge) to convert between ROS `Image` messages and OpenCV images.
```python
def __init__(self):
    self.sub = rospy.Subscriber('/camera/color/image_raw', Image, self.callback, queue_size=1)
    self.save_path = '/home/hello-robot/catkin_ws/src/stretch_tutorials/stored_data'
```
Initialize the CvBridge class, the subscriber, and the directory where the captured image will be stored.
```python
def callback(self, msg):
    try:
        image = self.bridge.imgmsg_to_cv2(msg, 'bgr8')  # encoding assumed in this excerpt
    except CvBridgeError as e:
        rospy.logwarn('CV Bridge error: {0}'.format(e))
```
Try to convert the ROS Image message to a cv2 Image message using the `imgmsg_to_cv2()` function.
```python
file_name = 'camera_image.jpeg'
completeName = os.path.join(self.save_path, file_name)
cv2.imwrite(completeName, image)
```
Join the directory and file name using the `path.join()` function. Then use the `imwrite()` function to save the image.
```python
rospy.signal_shutdown("done")
sys.exit(0)
```
The first line of code initiates a clean shutdown of ROS. The second line of code exits the Python interpreter.
```python
rospy.init_node('capture_image', argv=sys.argv)
CaptureImage()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master.

!!! note
    The name must be a base name, i.e. it cannot contain any slashes "/".
Instantiate the `CaptureImage()` class.
```python
rospy.spin()
```
Give control to ROS. This will allow the callback to be called whenever new messages come in. If we don't put this line in, then the node will not work, and ROS will not process any messages.
## Edge Detection
In this section, we highlight a node that utilizes the [Canny Edge filter](https://www.geeksforgeeks.org/python-opencv-canny-function/) algorithm to detect the edges of an image and convert it back to a ROS image to be visualized in RViz. In a terminal, execute:
```bash
# Terminal 4
cd ~/catkin_ws/src/stretch_tutorials/src
python3 edge_detection.py
```
The node will publish a new Image topic named `/image_edge_detection`. This can be visualized in RViz and a gif is provided below for reference.
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/camera_image_edge_detection.gif"/>
</p>
### The Code
```python
#!/usr/bin/env python3
```
### The Code Explained
Since there are similarities with the capture image node, we will only break down the different components of the edge detection node.
```python
self.lower_thres = 100
self.upper_thres = 200
```
Define the lower and upper bounds of the hysteresis thresholds.
```python
image = cv2.Canny(image, self.lower_thres, self.upper_thres)
```
Run the Canny Edge function to detect edges from the cv2 image.
```python
image_msg = self.bridge.cv2_to_imgmsg(image, 'passthrough')
```
Convert the cv2 image back to a ROS image so it can be published.
```python
image_msg.header = msg.header
self.pub.publish(image_msg)
```
Publish the ROS image with the same header as the subscribed ROS message.
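Putting the pieces together, a compact sketch of such an edge-detection node might look like this (topic names mirror the ones above; the grayscale conversion is an addition for safety, since `cv2.Canny()` expects an 8-bit single-channel image):

```python
#!/usr/bin/env python3
import rospy
import cv2
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError

class EdgeDetection:
    def __init__(self):
        self.bridge = CvBridge()
        self.lower_thres = 100
        self.upper_thres = 200
        self.sub = rospy.Subscriber('/camera/color/image_raw', Image, self.callback, queue_size=1)
        self.pub = rospy.Publisher('/image_edge_detection', Image, queue_size=1)

    def callback(self, msg):
        try:
            image = self.bridge.imgmsg_to_cv2(msg, 'bgr8')
        except CvBridgeError as e:
            rospy.logwarn('CV Bridge error: {0}'.format(e))
            return
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, self.lower_thres, self.upper_thres)
        image_msg = self.bridge.cv2_to_imgmsg(edges, 'passthrough')
        image_msg.header = msg.header  # keep the original timestamp and frame
        self.pub.publish(image_msg)

if __name__ == '__main__':
    rospy.init_node('edge_detection')
    EdgeDetection()
    rospy.spin()
```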
+ 21
- 8
ros1/example_8.md View File
## Example 8
This example will showcase how to save the interpreted speech from Stretch's ReSpeaker Mic Array v2.0 to a text file.
Begin by running the `respeaker.launch` file in a terminal.
```bash
# Terminal 1
roslaunch respeaker_ros sample_respeaker.launch
```
Then run the [speech_text.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/speech_text.py) node. In a new terminal, execute:
```bash
# Terminal 2
cd catkin_ws/src/stretch_tutorials/src/
python3 speech_text.py
```
The ReSpeaker will be listening and will start to interpret speech and save the transcript to a text file. To shut down the node, type `Ctrl` + `c` in the terminal.
### The Code
```python
#!/usr/bin/env python3
```

### The Code Explained
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
import os
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes).
```python
from speech_recognition_msgs.msg import SpeechRecognitionCandidates
```
Import `SpeechRecognitionCandidates` from `speech_recognition_msgs.msg` so that we can receive the interpreted speech.
```python
def __init__(self):
    self.sub = rospy.Subscriber("speech_to_text", SpeechRecognitionCandidates, self.callback)
```
Set up a subscriber. We're going to subscribe to the topic `speech_to_text`, looking for `SpeechRecognitionCandidates` messages. When a message comes in, ROS is going to pass it to the function `callback` automatically.
```python
self.save_path = '/home/hello-robot/catkin_ws/src/stretch_tutorials/stored_data'
```
Define the directory to save the text file.
```python
transcript = ' '.join(map(str,msg.transcript))
```
Take all items in the iterable list and join them into a single string named `transcript`.
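For example (a standalone illustration; the words are made up):

```python
words = ['hello', 'robot']             # stand-in for msg.transcript
transcript = ' '.join(map(str, words))
print(transcript)                      # -> hello robot
```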
```python
file_name = 'speech.txt'
completeName = os.path.join(self.save_path, file_name)
```
Define the file name and create a complete path directory.
```python
with open(completeName, "a+") as file_object:
    file_object.write("\n")
    file_object.write(transcript)
```
Append the transcript to the text file.
```python
rospy.init_node('speech_text')
SpeechText()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master.
!!! note
The name must be a base name, i.e. it cannot contain any slashes "/".
Instantiate the `SpeechText()` class.
```python
rospy.spin()
```
Give control to ROS. This will allow the callback to be called whenever new
messages come in. If we don't put this line in, then the node will not work,
and ROS will not process any messages.
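For reference, the snippets above assemble into a compact node along these lines (a sketch, not necessarily byte-identical to the tutorial file; the `loginfo` message is illustrative):
```python
#!/usr/bin/env python3
import rospy
import os
from speech_recognition_msgs.msg import SpeechRecognitionCandidates

class SpeechText:
    def __init__(self):
        self.sub = rospy.Subscriber("speech_to_text", SpeechRecognitionCandidates, self.callback)
        self.save_path = '/home/hello-robot/catkin_ws/src/stretch_tutorials/stored_data'
        rospy.loginfo("Listening to speech...")

    def callback(self, msg):
        # Join the transcript candidates into one string and append it to speech.txt
        transcript = ' '.join(map(str, msg.transcript))
        completeName = os.path.join(self.save_path, 'speech.txt')
        with open(completeName, "a+") as file_object:
            file_object.write("\n")
            file_object.write(transcript)

if __name__ == '__main__':
    rospy.init_node('speech_text')
    SpeechText()
    rospy.spin()
```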

+ 45
- 23
ros1/example_9.md View File

@ -1,31 +1,30 @@
## Example 9
This example aims to combine the [ReSpeaker Microphone Array](respeaker_microphone_array.md) and [Follow Joint Trajectory](follow_joint_trajectory.md) tutorials to voice teleoperate the mobile base of the Stretch robot.
Begin by running the following command in a new terminal.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Switch the mode to `position` mode using a rosservice call. Then run the `respeaker.launch` file. In a new terminal, execute:
```bash
# Terminal 2
rosservice call /switch_to_position_mode
roslaunch stretch_core respeaker.launch
```
Then run the [voice_teleoperation_base.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/voice_teleoperation_base.py) node in a new terminal.
```bash
# Terminal 3
cd catkin_ws/src/stretch_tutorials/src/
python3 voice_teleoperation_base.py
```
In terminal 3, a menu of voice commands is printed. You can reference this menu layout below.
```{.bash .no-copy}
------------ VOICE TELEOP MENU ------------
VOICE COMMANDS
@ -44,12 +43,13 @@ STEP SIZE
"quit" : QUIT AND CLOSE NODE
-------------------------------------------
```
To stop the node from sending twist messages, type `Ctrl` + `c` or say "**quit**".
### The Code
```python
#!/usr/bin/env python3
@ -265,6 +265,7 @@ This code is similar to that of the [multipoint_command](https://github.com/hell
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python script.
```python
@ -279,18 +280,21 @@ from trajectory_msgs.msg import JointTrajectoryPoint
import hello_helpers.hello_misc as hm
from speech_recognition_msgs.msg import SpeechRecognitionCandidates
```
You need to import `rospy` if you are writing a ROS Node. Import the `FollowJointTrajectoryGoal` from the `control_msgs.msg` package to control the Stretch robot. Import `JointTrajectoryPoint` from the `trajectory_msgs` package to define robot trajectories. The `hello_helpers` package consists of a module that provides various Python scripts used across stretch_ros. In this instance, we are importing the `hello_misc` script. Import `sensor_msgs.msg` so that we can subscribe to JointState messages.
```python
class GetVoiceCommands:
```
Create a class that subscribes to the speech-to-text recognition messages, prints a voice command menu, and defines the step sizes for translational and rotational mobile base motion.
```python
self.step_size = 'medium'
self.rad_per_deg = math.pi/180.0
```
Set the default step size as medium and create a float value, `self.rad_per_deg`, to convert degrees to radians.
```python
self.small_deg = 5.0
@ -305,6 +309,7 @@ self.big_deg = 20.0
self.big_rad = self.rad_per_deg * self.big_deg
self.big_translate = 0.1
```
Define the three rotation and translation step sizes.
```python
@ -313,14 +318,16 @@ self.sound_direction = 0
self.speech_to_text_sub = rospy.Subscriber("/speech_to_text", SpeechRecognitionCandidates, self.callback_speech)
self.sound_direction_sub = rospy.Subscriber("/sound_direction", Int32, self.callback_direction)
```
Initialize the voice command and sound direction to values that will not result in moving the base.
Set up two subscribers. The first subscribes to the topic `/speech_to_text`, looking for `SpeechRecognitionCandidates` messages. When a message comes in, ROS is going to pass it to the `callback_speech` function automatically. The second subscribes to the `/sound_direction` topic and passes each message to the `callback_direction` function.
```python
self.sound_direction = msg.data * -self.rad_per_deg
```
The `callback_direction` function converts the `/sound_direction` message from degrees to radians.
```python
if self.step_size == 'small':
@ -331,7 +338,8 @@ if self.step_size == 'big':
inc = {'rad': self.big_rad, 'translate': self.big_translate}
return inc
```
The `callback_speech` function stores the increment size for translational and rotational base motion in `inc`. The increment size is contingent on the `self.step_size` string value.
```python
command = None
@ -346,21 +354,24 @@ if self.voice_command == 'right':
if self.voice_command == 'stretch':
command = {'joint': 'rotate_mobile_base', 'inc': self.sound_direction}
```
In the `get_command()` function, `command` is initialized as `None` or set as a dictionary in which the `joint` and `inc` values are stored. The `command` message type depends on the `self.voice_command` string value.
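For instance, two hypothetical return values of `get_command()` might look like this (the numeric increments are illustrative, not the tutorial's exact step sizes):
```python
# Hypothetical dictionaries returned by get_command():
forward_cmd = {'joint': 'translate_mobile_base', 'inc': 0.05}   # "forward": drive 5 cm ahead
stretch_cmd = {'joint': 'rotate_mobile_base', 'inc': -0.17}     # "stretch": rotate toward the last sound direction (radians)
```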
```python
if (self.voice_command == "small") or (self.voice_command == "medium") or (self.voice_command == "big"):
self.step_size = self.voice_command
rospy.loginfo('Step size = {0}'.format(self.step_size))
```
Based on the `self.voice_command` value, set the step size for the increments.
```python
if self.voice_command == 'quit':
rospy.signal_shutdown("done")
sys.exit(0)
```
If `self.voice_command` is equal to `quit`, initiate a clean shutdown of ROS and exit the Python interpreter.
```python
class VoiceTeleopNode(hm.HelloNode):
@ -379,7 +390,8 @@ class VoiceTeleopNode(hm.HelloNode):
self.joint_state = None
self.speech = GetVoiceCommands()
```
A class that inherits the `HelloNode` class from `hm`, declares an object from the `GetVoiceCommands` class, and sends joint trajectory commands.
```python
def send_command(self, command):
@ -393,18 +405,21 @@ def send_command(self, command):
point = JointTrajectoryPoint()
point.time_from_start = rospy.Duration(0.0)
```
The `send_command` function stores the joint state message and uses a conditional statement to send joint trajectory goals. Also, assign `point` as a `JointTrajectoryPoint` message type.
```python
trajectory_goal = FollowJointTrajectoryGoal()
trajectory_goal.goal_time_tolerance = rospy.Time(1.0)
```
Assign `trajectory_goal` as a `FollowJointTrajectoryGoal` message type.
```python
joint_name = command['joint']
trajectory_goal.trajectory.joint_names = [joint_name]
```
Extract the joint name from the command dictionary.
```python
@ -412,23 +427,27 @@ inc = command['inc']
rospy.loginfo('inc = {0}'.format(inc))
new_value = inc
```
Extract the increment type from the command dictionary.
```python
point.positions = [new_value]
trajectory_goal.trajectory.points = [point]
```
Assign the new value position to the trajectory goal message type.
```python
self.trajectory_client.send_goal(trajectory_goal)
rospy.loginfo('Done sending command.')
```
Make the action call and send the goal of the new joint position.
```python
self.speech.print_commands()
```
Reprint the voice command menu after the trajectory goal is sent.
```python
@ -443,7 +462,8 @@ def main(self):
rate = rospy.Rate(self.rate)
self.speech.print_commands()
```
The main function instantiates the `HelloNode` class, initializes the subscriber, and calls other methods in both the `VoiceTeleopNode` and `GetVoiceCommands` classes.
```python
while not rospy.is_shutdown():
@ -451,6 +471,7 @@ while not rospy.is_shutdown():
self.send_command(command)
rate.sleep()
```
Run a while loop to continuously check speech commands and send those commands to execute an action.
```python
@ -460,4 +481,5 @@ try:
except KeyboardInterrupt:
rospy.loginfo('interrupt received, so shutting down')
```
Declare a `VoiceTeleopNode` object. Then execute the `main()` method.

+ 46
- 39
ros1/follow_joint_trajectory.md View File

@ -1,28 +1,27 @@
## FollowJointTrajectory Commands
Stretch ROS driver offers a [`FollowJointTrajectory`](http://docs.ros.org/en/api/control_msgs/html/action/FollowJointTrajectory.html) action service for its arm. Within this tutorial, we will have a simple FollowJointTrajectory command sent to a Stretch robot to execute.
## Stow Command Example
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/stow_command.gif"/>
</p>
Begin by running the following command in a terminal.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
In a new terminal, switch the mode to `position` mode using a rosservice call. Then run the stow command node.
```bash
# Terminal 2
rosservice call /switch_to_position_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 stow_command.py
```
This will send a `FollowJointTrajectory` command to stow Stretch's arm.
### The Code
@ -81,12 +80,12 @@ if __name__ == '__main__':
```
### The Code Explained
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
@ -96,14 +95,16 @@ from trajectory_msgs.msg import JointTrajectoryPoint
import hello_helpers.hello_misc as hm
import time
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import the `FollowJointTrajectoryGoal` from the [control_msgs.msg](http://wiki.ros.org/control_msgs) package to control the Stretch robot. Import `JointTrajectoryPoint` from the [trajectory_msgs](http://wiki.ros.org/trajectory_msgs) package to define robot trajectories. The [hello_helpers](https://github.com/hello-robot/stretch_ros/tree/master/hello_helpers) package consists of a module that provides various Python scripts used across [stretch_ros](https://github.com/hello-robot/stretch_ros). In this instance, we are importing the `hello_misc` script.
```python
class StowCommand(hm.HelloNode):
def __init__(self):
hm.HelloNode.__init__(self)
```
The `StowCommand` class inherits the `HelloNode` class from `hm` and is initialized.
```python
def issue_stow_command(self):
@ -111,7 +112,8 @@ def issue_stow_command(self):
stow_point.time_from_start = rospy.Duration(0.000)
stow_point.positions = [0.2, 0.0, 3.4]
```
The `issue_stow_command()` is the name of the function that will stow Stretch's arm. Within the function, we set `stow_point` as a `JointTrajectoryPoint` and provide the desired positions of the lift (meters), wrist extension (meters), and yaw of the wrist (radians), respectively. The corresponding joint names are defined next.
```python
trajectory_goal = FollowJointTrajectoryGoal()
@ -120,13 +122,15 @@ The `issue_stow_command()` is the name of the function that will stow Stretch's
trajectory_goal.trajectory.header.stamp = rospy.Time(0.0)
trajectory_goal.trajectory.header.frame_id = 'base_link'
```
Set `trajectory_goal` as a `FollowJointTrajectoryGoal` and define the joint names as a list. Then `trajectory_goal.trajectory.points` is defined by the positions set in `stow_point`. Specify the coordinate frame that we want (*base_link*) and set the time to be now.
```python
self.trajectory_client.send_goal(trajectory_goal)
rospy.loginfo('Sent stow goal = {0}'.format(trajectory_goal))
self.trajectory_client.wait_for_result()
```
Make the action call and send the goal. The last line of code waits for the result before it exits the Python script.
```python
@ -136,7 +140,8 @@ def main(self):
self.issue_stow_command()
time.sleep(2)
```
Create a function, `main()`, to set up the `hm.HelloNode` class and issue the stow command.
```python
if __name__ == '__main__':
@ -147,8 +152,8 @@ if __name__ == '__main__':
rospy.loginfo('interrupt received, so shutting down')
```
Declare the object `node` from the `StowCommand()` class. Then run the `main()` function.
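Assembled, the stow node looks roughly like the sketch below. The joint names and the arguments to `hm.HelloNode.main` follow stretch_ros conventions; treat them as assumptions if your setup differs.
```python
#!/usr/bin/env python3
import rospy
import time
from control_msgs.msg import FollowJointTrajectoryGoal
from trajectory_msgs.msg import JointTrajectoryPoint
import hello_helpers.hello_misc as hm

class StowCommand(hm.HelloNode):
    def __init__(self):
        hm.HelloNode.__init__(self)

    def issue_stow_command(self):
        # One waypoint: lift (m), wrist extension (m), wrist yaw (rad)
        stow_point = JointTrajectoryPoint()
        stow_point.time_from_start = rospy.Duration(0.000)
        stow_point.positions = [0.2, 0.0, 3.4]

        trajectory_goal = FollowJointTrajectoryGoal()
        trajectory_goal.trajectory.joint_names = ['joint_lift', 'wrist_extension', 'joint_wrist_yaw']
        trajectory_goal.trajectory.points = [stow_point]
        trajectory_goal.trajectory.header.stamp = rospy.Time(0.0)
        trajectory_goal.trajectory.header.frame_id = 'base_link'

        self.trajectory_client.send_goal(trajectory_goal)
        rospy.loginfo('Sent stow goal = {0}'.format(trajectory_goal))
        self.trajectory_client.wait_for_result()

    def main(self):
        hm.HelloNode.main(self, 'stow_command', 'stow_command', wait_for_first_pointcloud=False)
        rospy.loginfo('stowing...')
        self.issue_stow_command()
        time.sleep(2)

if __name__ == '__main__':
    try:
        node = StowCommand()
        node.main()
    except KeyboardInterrupt:
        rospy.loginfo('interrupt received, so shutting down')
```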
## Multipoint Command Example
@ -156,23 +161,24 @@ Declare object, *node*, from the `StowCommand()` class. Then run the `main()` fu
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/multipoint.gif"/>
</p>
Begin by running the following command in a terminal:
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
In a new terminal, switch the mode to `position` mode using a rosservice call. Then run the multipoint command node.
```bash
# Terminal 2
rosservice call /switch_to_position_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 multipoint_command.py
```
This will send a list of `JointTrajectoryPoint` message types to move Stretch's arm.
### The Code
```python
#!/usr/bin/env python3
@ -242,31 +248,31 @@ if __name__ == '__main__':
node.main()
except KeyboardInterrupt:
rospy.loginfo('interrupt received, so shutting down')
```
### The Code Explained
Seeing that there are similarities between the multipoint and stow command nodes, we will only break down the different components of the `multipoint_command` node.
```python
point0 = JointTrajectoryPoint()
point0.positions = [0.2, 0.0, 3.4]
```
Set `point0` as a `JointTrajectoryPoint` and provide desired positions. These are the positions of the lift, wrist extension, and yaw of the wrist, respectively. The lift and wrist extension positions are expressed in meters, whereas the wrist yaw is in radians.
```python
point0.velocities = [0.2, 0.2, 2.5]
```
Provide the desired velocity of the lift (m/s), wrist extension (m/s), and wrist yaw (rad/s) for `point0`.
```python
point0.accelerations = [1.0, 1.0, 3.5]
```
Provide desired accelerations of the lift (m/s^2), wrist extension (m/s^2), and wrist yaw (rad/s^2).
!!! note
    The lift and wrist extension can only go up to 0.2 m/s. If you do not provide any velocities or accelerations for the lift or wrist extension, they go to their default values. However, the velocity and acceleration of the wrist yaw will stay the same as the previous value unless updated.
```python
trajectory_goal = FollowJointTrajectoryGoal()
@ -275,18 +281,17 @@ trajectory_goal.trajectory.points = [point0, point1, point2, point3, point4, poi
trajectory_goal.trajectory.header.stamp = rospy.Time(0.0)
trajectory_goal.trajectory.header.frame_id = 'base_link'
```
Set `trajectory_goal` as a `FollowJointTrajectoryGoal` and define the joint names as a list. Then `trajectory_goal.trajectory.points` is defined by a list of the 6 points. Specify the coordinate frame that we want (*base_link*) and set the time to be now.
## Single Joint Actuator
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/single_joint_actuator.gif"/>
</p>
You can also actuate a single joint on Stretch. Below is the list of joints and their position limits.
```{.bash .no-copy}
############################# JOINT LIMITS #############################
joint_lift: lower_limit = 0.15, upper_limit = 1.10 # in meters
wrist_extension: lower_limit = 0.00, upper_limit = 0.50 # in meters
@ -301,26 +306,25 @@ rotate_mobile_base: No lower or upper limit. Defined by a step size in radian
########################################################################
```
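If you compute target positions programmatically, it can help to clamp them against these limits before sending a goal. A minimal sketch, with the two limit pairs copied from the table above and the helper name purely hypothetical:
```python
# Hypothetical helper: clamp a commanded position to a joint's limits.
# Values are copied from the joint-limit table above; extend as needed.
JOINT_LIMITS = {
    'joint_lift': (0.15, 1.10),       # meters
    'wrist_extension': (0.00, 0.50),  # meters
}

def clamp_position(joint_name, position):
    lower, upper = JOINT_LIMITS[joint_name]
    return max(lower, min(position, upper))

print(clamp_position('joint_lift', 1.5))  # prints 1.1
```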
Begin by running the following command in a terminal.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
In a new terminal, switch the mode to `position` mode using a rosservice call. Then run the single joint actuator node.
```bash
# Terminal 2
rosservice call /switch_to_position_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 single_joint_actuator.py
```
This will send a list of `JointTrajectoryPoint` message types to move Stretch's arm.
The joint, `joint_gripper_finger_left`, is only needed when actuating the gripper.
### The Code
```python
#!/usr/bin/env python3
@ -377,8 +381,8 @@ if __name__ == '__main__':
except KeyboardInterrupt:
rospy.loginfo('interrupt received, so shutting down')
```
### The Code Explained
Since the code is quite similar to the `multipoint_command` code, we will only review the parts that differ.
Now let's break the code down.
@ -387,7 +391,8 @@ Now let's break the code down.
trajectory_goal = FollowJointTrajectoryGoal()
trajectory_goal.trajectory.joint_names = ['joint_head_pan']
```
Here we only input the joint name that we want to actuate. In this instance, we will actuate the `joint_head_pan`.
```python
point0 = JointTrajectoryPoint()
@ -396,11 +401,13 @@ point0.positions = [0.65]
# point1 = JointTrajectoryPoint()
# point1.positions = [0.5]
```
Set `point0` as a `JointTrajectoryPoint` and provide the desired position. You also have the option to send multiple point positions rather than one.
```python
trajectory_goal.trajectory.points = [point0]#, point1]
trajectory_goal.trajectory.header.stamp = rospy.Time(0.0)
trajectory_goal.trajectory.header.frame_id = 'base_link'
```
Set `trajectory_goal` as a `FollowJointTrajectoryGoal` and define the joint names as a list. Then `trajectory_goal.trajectory.points` is set by your list of points. Specify the coordinate frame that we want (`base_link`) and set the time to be now.

+ 9
- 7
ros1/gazebo_basics.md View File

@ -1,24 +1,26 @@
# Spawning Stretch in Simulation (Gazebo)
## Empty World Simulation
To spawn Stretch in Gazebo's default empty world, run the following command in your terminal.
```bash
roslaunch stretch_gazebo gazebo.launch
```
This will bring up the robot in the Gazebo simulation, similar to the image shown below.
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/stretch_gazebo_empty_world.png"/>
</p>
## Custom World Simulation
In Gazebo, you can spawn Stretch in various worlds. First, source the Gazebo world files by running the following command in a terminal:
```bash
echo "source /usr/share/gazebo/setup.sh"
```
Then using the world argument, you can spawn Stretch in the Willow Garage world by running the following:
```bash
roslaunch stretch_gazebo gazebo.launch world:=worlds/willowgarage.world

+ 7
- 5
ros1/getting_started.md View File

@ -1,6 +1,7 @@
# Getting Started
## Prerequisites
1. A Stretch robot (see below for simulation instructions if you don’t have a robot)
2. Follow the [Getting Started]() guide (hello_robot_xbox_teleop must not be running in the background)
3. Interacting with Linux through the [command line](https://ubuntu.com/tutorials/command-line-for-beginners#1-overview)
@ -14,11 +15,12 @@ If you cannot access the robot through ssh due to your network settings, you wil
Users who don’t have a Stretch but want to try the tutorials can set up their computer with Stretch Gazebo.
Although lower specifications might be sufficient, for the best experience we recommend the following for running the simulation:
* **Processor**: Intel i7 or comparable
* **Memory**: 16 GB
* **Storage**: 50 GB
* **OS**: Ubuntu 20.04
* **Graphics Card**: NVIDIA GTX2060 (optional)
### Setup
Hello Robot is currently running Stretch on Ubuntu 20.04 and ROS Noetic. To begin the setup, follow the [Run the new robot installation script](https://github.com/hello-robot/stretch_install/blob/master/docs/robot_install.md#run-the-new-robot-installation-script) on your system.

+ 11
- 9
ros1/internal_state_of_stretch.md View File

@ -1,20 +1,22 @@
## Getting the State of the Robot
Begin by starting up the stretch driver launch file.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Then utilize the ROS command-line tool [rostopic](http://wiki.ros.org/rostopic) to display Stretch's internal state information. For instance, to view the current state of the robot's joints, simply type the following in a new terminal.
```bash
# Terminal 2
rostopic echo /joint_states -n1
```
Note that the `-n1` flag at the end of the command defines how many messages to print before the command exits. Remove the flag if you prefer to continuously print the topic for debugging purposes.
Your terminal will output the information associated with the `/joint_states` topic. Your `header`, `position`, `velocity`, and `effort` information may vary from what is printed below.
```{.bash .no-copy}
header:
seq: 70999
stamp:
@ -31,13 +33,14 @@ effort: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```
Let's say you are only interested in seeing the `header` component of the `/joint_states` topic; you can output this with the rostopic command-line tool by typing the following command.
```bash
# Terminal 2
rostopic echo /joint_states/header -n1
```
Your terminal will then output something similar to this:
```{.bash .no-copy}
seq: 97277
stamp:
secs: 1945
@ -46,12 +49,11 @@ frame_id: ''
---
```
Additionally, if you were to type `rostopic echo /` in the terminal, then press the `Tab` key on your keyboard, you will see the list of published active topics.
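If you would rather consume this information from a node than from the command line, a minimal rospy subscriber might look like the following sketch (the node name is illustrative):
```python
#!/usr/bin/env python3
import rospy
from sensor_msgs.msg import JointState

def callback(msg):
    # Log each joint name with its current position
    for name, position in zip(msg.name, msg.position):
        rospy.loginfo('%s: %.3f', name, position)

rospy.init_node('joint_state_printer')
rospy.Subscriber('/joint_states', JointState, callback)
rospy.spin()
```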
A powerful tool to visualize ROS communication is the ROS [rqt_graph package](http://wiki.ros.org/rqt_graph). By typing the following in a new terminal, you can see a graph of topics being communicated between nodes.
```bash
# Terminal 3
rqt_graph
```
<p align="center">

+ 20
- 14
ros1/moveit_basics.md View File

@ -8,7 +8,7 @@ To run MoveIt with the actual hardware, (assuming `stretch_driver` is already ru
roslaunch stretch_moveit_config move_group.launch
```
This will run all of the planning capabilities, but without the setup, simulation and interface that the above demo provides. To create plans for the robot with the same interface as the offline demo, you can run
```bash
roslaunch stretch_moveit_config moveit_rviz.launch
``` -->
@ -19,20 +19,20 @@ To begin running MoveIt! on stretch, run the demo launch file. This doesn't requ
```bash
roslaunch stretch_moveit_config demo.launch
```
This will bring up an RViz instance where you can move the robot around using [interactive markers](http://wiki.ros.org/rviz/Tutorials/Interactive%20Markers%3A%20Getting%20Started) and create plans between poses. You can reference the bottom gif as a guide to plan and execute motion.
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/moveit.gif"/>
</p>
Additionally, the demo allows a user to select from the three groups, `stretch_arm`, `stretch_gripper`, and `stretch_head`, to move. A few notes to keep in mind:
* Pre-defined start and goal states can be specified in the Start State and Goal State drop-downs in the Planning tab of the Motion Planning RViz plugin.
* The *stretch_gripper* group does not show markers and is intended to be controlled via the joints tab located on the far right of the Motion Planning RViz plugin.
* When planning with the *stretch_head* group, make sure you select *Approx IK Solutions* in the Planning tab of the Motion Planning RViz plugin.
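These groups can also be driven from Python with `moveit_commander` while the demo is running. The sketch below is an assumption-laden illustration (node name and target offset invented; only the `stretch_arm` group name comes from the list above):
```python
#!/usr/bin/env python3
import sys
import rospy
import moveit_commander

# Sketch: nudge the first joint of the stretch_arm group and execute the plan.
moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node('stretch_arm_sketch', anonymous=True)

arm = moveit_commander.MoveGroupCommander('stretch_arm')
joint_goal = arm.get_current_joint_values()
joint_goal[0] += 0.1  # illustrative offset on the first joint in the group
arm.go(joint_goal, wait=True)
arm.stop()  # make sure there is no residual movement
moveit_commander.roscpp_shutdown()
```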
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/moveit_groups.gif"/>
@ -40,20 +40,26 @@ Additionally, the demo allows a user to select from the three groups, *stretch_a
## Running Gazebo with MoveIt! and Stretch
To run in Gazebo, execute:
```bash
roslaunch stretch_gazebo gazebo.launch
```
Then, in a new terminal, execute:
```bash
roslaunch stretch_core teleop_twist.launch twist_topic:=/stretch_diff_drive_controller/cmd_vel linear:=1.0 angular:=2.0 teleop_type:=keyboard
```
In a separate terminal, launch:
```bash
roslaunch stretch_moveit_config demo_gazebo.launch
```
This will launch an RViz instance that visualizes the joints with markers and an empty world in Gazebo with Stretch, and will load all the controllers. There are pre-defined positions for each joint group for demonstration purposes. There are three joint groups, namely `stretch_arm`, `stretch_gripper`, and `stretch_head`, that can be controlled individually via the Motion Planning RViz plugin. Start and goal positions for joints can be selected similarly to [this MoveIt tutorial](https://ros-planning.github.io/moveit_tutorials/doc/quickstart_in_rviz/quickstart_in_rviz_tutorial.html#choosing-specific-start-goal-states).
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/gazebo_moveit.gif"/>
</p>

+ 20
- 18
ros1/navigation_stack.md View File

@ -1,12 +1,12 @@
## Navigation Stack with the Actual Robot
stretch_navigation provides the standard ROS navigation stack as two launch files. This package utilizes gmapping, move_base, and AMCL to drive Stretch around a mapped space. Running this code will require the robot to be [untethered](https://docs.hello-robot.com/0.2/stretch-tutorials/getting_started/untethered_operation/).
Then run the following command to map the space that the robot will navigate in.
```bash
roslaunch stretch_navigation mapping.launch
```
Rviz will show the robot and the map that is being constructed. With the terminal open, use the instructions printed by the teleop package to teleoperate the robot around the room. Avoid sharp turns and revisit previously visited spots to form loop closures.
<p align="center">
@ -20,7 +20,8 @@ mkdir -p ~/stretch_user/maps
rosrun map_server map_saver -f ${HELLO_FLEET_PATH}/maps/<map_name>
```
!!! note
The `<map_name>` does not include an extension. Map_saver will save two files as `<map_name>.pgm` and `<map_name>.yaml`.
Next, with `<map_name>.yaml`, we can navigate the robot around the mapped space. Run:
@ -28,27 +29,21 @@ Next, with `.yaml`, we can navigate the robot around the mapped space.
roslaunch stretch_navigation navigation.launch map_yaml:=${HELLO_FLEET_PATH}/maps/<map_name>.yaml
```
Rviz will show the robot in the previously mapped space; however, the robot's location on the map does not match the robot's location in the real space. In the top bar of Rviz, use 2D Pose Estimate to lay an arrow down roughly where the robot is located in real space. AMCL, the localization package, will better localize our pose once we give the robot a 2D Nav Goal. In the top bar of Rviz, use 2D Nav Goal to lay down an arrow where you'd like the robot to go. In the terminal, you'll see move_base go through the planning phases and then navigate the robot to the goal. If planning fails, the robot will begin a recovery behavior: spinning around 360 degrees in place.
It is also possible to send 2D Pose Estimates and Nav Goals programmatically. In your launch file, you may include `navigation.launch` to bring up the navigation stack. Then, you can send *move_base_msgs::MoveBaseGoal* messages to navigate the robot programmatically.
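For example, a minimal Python sketch of sending a single goal through actionlib (the frame and coordinates are illustrative):
```python
#!/usr/bin/env python3
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('send_nav_goal')
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

# Illustrative goal: 1 m forward in the map frame, keeping the current heading
goal = MoveBaseGoal()
goal.target_pose.header.frame_id = 'map'
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 1.0
goal.target_pose.pose.orientation.w = 1.0

client.send_goal(goal)
client.wait_for_result()
```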
## Running in Simulation
To perform mapping and navigation in the Gazebo simulation of Stretch, substitute the `mapping_gazebo.launch` and `navigation_gazebo.launch` files into the commands above. The default Gazebo environment is the Willow Garage HQ. Use the "world" ROS argument to specify the Gazebo world within which to spawn Stretch.
```bash
# Terminal 1
roslaunch stretch_navigation mapping_gazebo.launch gazebo_world:=worlds/willowgarage.world
```
### Teleop using a Joystick Controller
The mapping launch files, `mapping.launch` and `mapping_gazebo.launch`, expose the ROS argument `teleop_type`. By default, this ROS argument is set to `keyboard`, which launches keyboard teleop in the terminal. If the Xbox controller that ships with Stretch is plugged into your computer, the following command will launch mapping with joystick teleop:
```bash
# Terminal 2
roslaunch stretch_navigation mapping.launch teleop_type:=joystick
```
@ -57,15 +52,22 @@ roslaunch stretch_navigation mapping.launch teleop_type:=joystick
</p>
### Using ROS Remote Master
If you have set up [ROS Remote Master](https://docs.hello-robot.com/0.2/stretch-tutorials/getting_started/untethered_operation/#ros-remote-master) for [untethered operation](https://docs.hello-robot.com/0.2/stretch-tutorials/getting_started/untethered_operation/), you can use Rviz and teleop locally with the following commands:
On the robot, execute:
```bash
roslaunch stretch_navigation mapping.launch rviz:=false teleop_type:=none
```
On your machine, execute:
```bash
rviz -d `rospack find stretch_navigation`/rviz/mapping.rviz
```
In a separate terminal on your machine, execute:
```bash
roslaunch stretch_core teleop_twist.launch teleop_type:=keyboard
```

+ 7
- 9
ros1/perception.md View File

@ -1,25 +1,22 @@
## Perception Introduction
The Stretch robot is equipped with the [Intel RealSense D435i camera](https://www.intelrealsense.com/depth-camera-d435i/), an essential component that allows the robot to measure and analyze the world around it. In this tutorial, we are going to showcase how to visualize the various topics published by the camera.
Begin by running the `stretch_driver.launch` file.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
To activate the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/) and publish topics to be visualized, run the following launch file in a new terminal.
```bash
# Terminal 2
roslaunch stretch_core d435i_low_resolution.launch
```
Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/perception_example.rviz) with the topics for perception already in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
```bash
# Terminal 3
rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/perception_example.rviz
```
@ -27,8 +24,7 @@ rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/perce
A list of displays on the left side of the interface can visualize the camera data. Each display has its own properties and a status that notifies a user whether topic messages have been received.
For the `PointCloud2` display, a [sensor_msgs/pointCloud2](http://docs.ros.org/en/lunar/api/sensor_msgs/html/msg/PointCloud2.html) message named `/camera/depth/color/points` is received and the GIF below demonstrates the various display properties when visualizing the data.
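The same topic can also be consumed programmatically; a minimal sketch that reads one point per incoming cloud (the node name is illustrative):
```python
#!/usr/bin/env python3
import rospy
import sensor_msgs.point_cloud2 as pc2
from sensor_msgs.msg import PointCloud2

def callback(msg):
    # Read the first valid (x, y, z) point as a quick sanity check
    for x, y, z in pc2.read_points(msg, field_names=('x', 'y', 'z'), skip_nans=True):
        rospy.loginfo('first point: x=%.2f y=%.2f z=%.2f', x, y, z)
        break

rospy.init_node('point_cloud_reader')
rospy.Subscriber('/camera/depth/color/points', PointCloud2, callback)
rospy.spin()
```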
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/perception_rviz.gif"/>
@ -36,22 +32,24 @@ For the `PointCloud2` display, a [sensor_msgs/pointCloud2](http://docs.ros.org/e
### Image Display
The `Image` display, when toggled, creates a new rendering window that visualizes a [sensor_msgs/Image](http://docs.ros.org/en/lunar/api/sensor_msgs/html/msg/Image.html) message, `/camera/color/image_raw`. This feature shows the image data from the camera; however, the image comes out sideways. Thus, you can select `/camera/color/image_raw_upright_view` from the **Image Topic** options to get an upright view of the image.
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/perception_image.gif"/>
</p>
### Camera Display
The `Camera` display is similar to that of the `Image` display. In this setting, the rendering window also visualizes other displays, such as the PointCloud2, the RobotModel, and Grid Displays. The `visibility` property can toggle what displays you are interested in visualizing.
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/perception_camera.gif"/>
</p>
### DepthCloud Display
The `DepthCloud` display is visualized in the main RViz window. This display takes in the depth image and RGB image provided by RealSense to visualize and register a point cloud.
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/perception_depth.gif"/>
</p>
## Deep Perception
Hello Robot also has a ROS package that uses deep learning models for various detection demos. A link to the package is provided: [stretch_deep_perception](https://github.com/hello-robot/stretch_ros/tree/master/stretch_deep_perception).

+ 16
- 20
ros1/respeaker_microphone_array.md View File

@ -1,16 +1,12 @@
## ReSpeaker Microphone Array
For this tutorial, we will get a high-level view of how to use Stretch's [ReSpeaker Mic Array v2.0](https://wiki.seeedstudio.com/ReSpeaker_Mic_Array_v2.0/).
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/respeaker.jpg"/>
</p>
### Stretch Body Package
In this section we will use command line tools in the [Stretch_Body](https://github.com/hello-robot/stretch_body) package, a low-level Python API for Stretch's hardware, to directly interact with the ReSpeaker.
Begin by typing the following command in a new terminal.
@ -18,37 +14,34 @@ Begin by typing the following command in a new terminal.
stretch_respeaker_test.py
```
The following will be displayed in your terminal:
```{.bash .no-copy}
For use with S T R E T C H (TM) RESEARCH EDITION from Hello Robot Inc.
* waiting for audio...
* recording 3 seconds
* done
* playing audio
* done
```
The ReSpeaker Mic Array will wait until it hears audio loud enough to trigger its recording feature. Stretch will record audio for 3 seconds and then replay it through its speakers. This command-line tool is a good way to check whether the hardware is working correctly.
To stop the python script, type `Ctrl` + `c` in the terminal.
### ReSpeaker_ROS Package
A [ROS package for the ReSpeaker](https://index.ros.org/p/respeaker_ros/#melodic) is utilized for this section.
Begin by running the `sample_respeaker.launch` file in a terminal.
```bash
# Terminal 1
roslaunch respeaker_ros sample_respeaker.launch
```
This will bring up the necessary nodes that will allow the ReSpeaker to implement a voice and sound interface with the robot.
Below are executables you can run to see the ReSpeaker results.
```bash
rostopic echo /sound_direction # Direction of arrival of audio (in degrees)
@ -59,20 +52,23 @@ rostopic echo /speech_audio # Raw audio data when there is speech
rostopic echo /speech_to_text # Voice recognition
```
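To consume these topics from a node rather than the command line, a minimal `/sound_direction` subscriber could look like this sketch (the node name is illustrative; the topic carries `std_msgs/Int32` values in degrees, which the voice teleoperation example later converts to radians):
```python
#!/usr/bin/env python3
import rospy
from std_msgs.msg import Int32

def callback(msg):
    # The ReSpeaker reports the direction of arrival of sound
    rospy.loginfo('sound direction: %d degrees', msg.data)

rospy.init_node('sound_direction_listener')
rospy.Subscriber('/sound_direction', Int32, callback)
rospy.spin()
```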
An example is when you run the `speech_to_text` executable and speak near the microphone array. In a new terminal, execute:
```bash
# Terminal 2
rostopic echo /speech_to_text
```
In this instance, "hello robot" was said. The following will be displayed in your terminal:
```{.bash .no-copy}
transcript:
- hello robot
confidence: []
---
```
You can also set various parameters via `dynamic_reconfigure` by running the following command in a new terminal.
```bash
# Terminal 3
rosrun rqt_reconfigure rqt_reconfigure
```

+ 5
- 3
ros1/rviz_basics.md View File

@ -6,10 +6,12 @@ You can utilize RViz to visualize Stretch's sensor information. To begin, in a t
roslaunch stretch_core stretch_driver.launch
```
Then, run the following command in a separate terminal to bring up a simple RViz configuration of the Stretch robot.
```bash
rosrun rviz rviz -d `rospack find stretch_core`/rviz/stretch_simple_test.rviz
```
An RViz window should open, allowing you to see the various DisplayTypes in the display tree on the left side of the window.
<p align="center">
@ -24,14 +26,14 @@ If you want to visualize Stretch's [tf transform tree](http://wiki.ros.org/rviz/
There are further tutorials for RViz which can be found [here](http://wiki.ros.org/rviz/Tutorials).
## Running RViz and Gazebo (Simulation)
Let's bring up Stretch in the Willow Garage world from the [gazebo basics tutorial](gazebo_basics.md) and RViz by using the following command.
```bash
roslaunch stretch_gazebo gazebo.launch world:=worlds/willowgarage.world rviz:=true
```
The `rviz` flag will open an RViz window to visualize a variety of ROS topics.
<p align="center">
<img src="https://raw.githubusercontent.com/hello-robot/stretch_tutorials/noetic/images/willowgarage_with_rviz.png"/>

+ 18
- 20
ros1/teleoperating_stretch.md View File

@ -5,21 +5,21 @@ If you have not already had a look at the [Xbox Controller Teleoperation](https:
### Keyboard Teleoperating: Full Body
For full-body teleoperation with the keyboard, you first need to run the `stretch_driver.launch` in a terminal.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Then, in a new terminal, type the following command:
```bash
# Terminal 2
rosrun stretch_core keyboard_teleop
```
Below are the keyboard commands that allow a user to control all of Stretch's joints.
```{.bash .no-copy}
---------- KEYBOARD TELEOP MENU -----------
i HEAD UP
@ -52,29 +52,28 @@ Below are the keyboard commands that allow a user to control all of Stretch's jo
q QUIT
-------------------------------------------
```
To stop the node from sending twist messages, press `Ctrl` + `c` in the terminal.
### Keyboard Teleoperating: Mobile Base
Begin by running the following command in your terminal:
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
To teleoperate Stretch's mobile base with the keyboard, you first need to switch the mode to `navigation` for the robot to receive `Twist` messages. This is done using a rosservice call in a new terminal. In the same terminal, run the teleop_twist_keyboard node with the argument remapping the `cmd_vel` topic name to `stretch/cmd_vel`.
```bash
# Terminal 2
rosservice call /switch_to_navigation_mode
rosrun teleop_twist_keyboard teleop_twist_keyboard.py cmd_vel:=stretch/cmd_vel
```
Below are the keyboard commands that allow a user to move Stretch's base.
```{.bash .no-copy}
Reading from the keyboard and Publishing to Twist!
---------------------------
Moving around:
@ -100,36 +99,35 @@ e/c : increase/decrease only angular speed by 10%
CTRL-C to quit
currently: speed 0.5 turn 1.0
```
To stop the node from sending twist messages, type `Ctrl` + `c`.
### Create a node for Mobile Base Teleoperating
To move Stretch's mobile base using a Python script, please look at [Teleoperate Stretch with a node](example_1.md) for reference.
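As a taste of what such a node does, here is a minimal sketch that publishes a constant forward velocity on `stretch/cmd_vel` (it assumes the driver is already in navigation mode; the node name and speed are illustrative):
```python
#!/usr/bin/env python3
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('move_forward')
pub = rospy.Publisher('/stretch/cmd_vel', Twist, queue_size=1)
rate = rospy.Rate(10)

command = Twist()
command.linear.x = 0.1  # m/s forward; all other components stay zero

while not rospy.is_shutdown():
    pub.publish(command)
    rate.sleep()
```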
## Teleoperating in Gazebo
### Keyboard Teleoperating: Mobile Base
For keyboard teleoperation of Stretch's mobile base, first [start up Stretch in simulation](gazebo_basics.md). Then run the following command in a new terminal.
```bash
# Terminal 1
roslaunch stretch_gazebo gazebo.launch
```
In a new terminal, type the following:
```bash
# Terminal 2
roslaunch stretch_core teleop_twist.launch twist_topic:=/stretch_diff_drive_controller/cmd_vel linear:=1.0 angular:=2.0 teleop_type:=keyboard
```
The same keyboard commands will be presented to a user to move the robot.
### Xbox Controller Teleoperating
An alternative for robot base teleoperation is to use an Xbox controller. Stop the keyboard teleoperation node by typing `Ctrl` + `c` in the terminal where the command was executed. Then connect the Xbox controller device to your local machine and run the following command.
```bash
# Terminal 2
roslaunch stretch_core teleop_twist.launch twist_topic:=/stretch_diff_drive_controller/cmd_vel linear:=1.0 angular:=2.0 teleop_type:=joystick
```
Note that the teleop_twist_joy package has a deadman switch by default, which prevents drive commands from being published unless it is pressed. For a Logitech F310 joystick, this button is A.
