
Merge branch 'features/mkdocs' of https://github.com/hello-robot/stretch_tutorials into features/mkdocs

Aaron Edsinger · 2 years ago · commit 20f10531c6
52 changed files with 704 additions and 743 deletions
  1. +1 -2 mkdocs.yml
  2. +13 -14 ros1/README.md
  3. +0 -2 ros1/aruco_marker_detection.md
  4. +17 -21 ros1/example_1.md
  5. +21 -36 ros1/example_10.md
  6. +13 -35 ros1/example_11.md
  7. +10 -31 ros1/example_12.md
  8. +220 -0 ros1/example_13.md
  9. +20 -23 ros1/example_2.md
  10. +11 -14 ros1/example_3.md
  11. +13 -23 ros1/example_4.md
  12. +28 -29 ros1/example_5.md
  13. +17 -34 ros1/example_6.md
  14. +23 -54 ros1/example_7.md
  15. +12 -30 ros1/example_8.md
  16. +19 -49 ros1/example_9.md
  17. +23 -51 ros1/follow_joint_trajectory.md
  18. +0 -2 ros1/gazebo_basics.md
  19. +4 -4 ros1/getting_started.md
  20. BIN
  21. BIN
  22. BIN
  23. +1 -3 ros1/internal_state_of_stretch.md
  24. +1 -3 ros1/moveit_basics.md
  25. +1 -4 ros1/navigation_stack.md
  26. +0 -1 ros1/other_nav_features.md
  27. +0 -1 ros1/other_nav_stack.md
  28. +4 -7 ros1/perception.md
  29. +0 -1 ros1/ros_testing.md
  30. +79 -89 ros1/rviz/aruco_detector_example.rviz
  31. +5 -5 ros1/rviz/perception_example.rviz
  32. +1 -3 ros1/rviz_basics.md
  33. +11 -16 ros1/src/aruco_tag_locator.py
  34. +7 -9 ros1/src/avoider.py
  35. +3 -3 ros1/src/capture_image.py
  36. +4 -4 ros1/src/edge_detection.py
  37. +9 -15 ros1/src/effort_sensing.py
  38. +10 -5 ros1/src/joint_state_printer.py
  39. +5 -7 ros1/src/marker.py
  40. +5 -6 ros1/src/move.py
  41. +8 -9 ros1/src/multipoint_command.py
  42. +20 -19 ros1/src/navigation.py
  43. +5 -5 ros1/src/pointcloud_transformer.py
  44. +7 -9 ros1/src/scan_filter.py
  45. +11 -13 ros1/src/single_joint_actuator.py
  46. +2 -3 ros1/src/speech_text.py
  47. +3 -5 ros1/src/stored_data_plotter.py
  48. +9 -11 ros1/src/stow_command.py
  49. +4 -4 ros1/src/tf2_broadcaster.py
  50. +4 -6 ros1/src/tf2_listener.py
  51. +19 -20 ros1/src/voice_teleoperation_base.py
  52. +1 -3 ros1/teleoperating_stretch.md

+1 -2 mkdocs.yml

@@ -125,8 +125,6 @@ nav:
- ArUco Marker Detection: ./ros1/aruco_marker_detection.md
- ReSpeaker Microphone Array: ./ros1/respeaker_microphone_array.md
- FUNMAP: https://github.com/hello-robot/stretch_ros/tree/master/stretch_funmap
- ROS testing: ./ros1/ros_testing.md
- Other Nav Stack Features: ./ros1/other_nav_features.md
- Other Examples:
- Teleoperate Stretch with a Node: ./ros1/example_1.md
- Filter Laser Scans: ./ros1/example_2.md
@@ -140,6 +138,7 @@ nav:
- Tf2 Broadcaster and Listener: ./ros1/example_10.md
- PointCloud Transformation: ./ros1/example_11.md
- ArUco Tag Locator: ./ros1/example_12.md
- 2D Navigation Goals: ./ros1/example_13.md
- ROS2:
- Overview: ./ros2/README.md
- Basics:

+13 -14 ros1/README.md

@@ -19,8 +19,6 @@ This tutorial track is for users looking to become familiar with programming the
| 10 | [ArUco Marker Detection](aruco_marker_detection.md) | |
| 11 | [ReSpeaker Microphone Array](respeaker_microphone_array.md) | |
| 12 | [FUNMAP](https://github.com/hello-robot/stretch_ros/tree/master/stretch_funmap) | |
| 13 | [ROS testing](ros_testing.md) | |
| 14 | [Other Nav Stack Features](other_nav_features.md) | |
## Other Examples
@@ -29,15 +27,16 @@ To help get you get started on your software development, here are examples of n
| | Tutorial | Description |
|---|-------------------------------------------------|----------------------------------------------------|
| 1 | [Teleoperate Stretch with a Node](example_1.md) | Use a python script that sends velocity commands. |
| 2 | [Filter Laser Scans](example_2.md) | Publish new scan ranges that are directly in front of Stretch.|
| 3 | [Mobile Base Collision Avoidance](example_3.md) | Stop Stretch from running into a wall.|
| 4 | [Give Stretch a Balloon](example_4.md) | Create a "balloon" marker that goes where ever Stretch goes.|
| 5 | [Print Joint States](example_5.md) | Print the joint states of Stretch.|
| 6 | [Store Effort Values](example_6.md) | Print, store, and plot the effort values of the Stretch robot.|
| 7 | [Capture Image](example_7.md) | Capture images from the RealSense camera data.|
| 8 | [Voice to Text](example_8.md) | Interpret speech and save transcript to a text file.|
| 9 | [Voice Teleoperation of Base](example_9.md) | Use speech to teleoperate the mobile base.|
| 10 | [Tf2 Broadcaster and Listener](example_10.md) | Create a tf2 broadcaster and listener.|
| 11 | [PointCloud Transformation](example_11.md) | Convert PointCloud2 data to a PointCloud and transform to a different frame.|
| 12 | [ArUco Tag Locator](example_12.md) | Actuate the head to locate a requested ArUco marker tag and return a transform.|
| 1 | [Teleoperate Stretch with a Node](example_1.md) | Use a python script that sends velocity commands. |
| 2 | [Filter Laser Scans](example_2.md) | Publish new scan ranges that are directly in front of Stretch.|
| 3 | [Mobile Base Collision Avoidance](example_3.md) | Stop Stretch from running into a wall.|
| 4 | [Give Stretch a Balloon](example_4.md) | Create a "balloon" marker that goes where ever Stretch goes.|
| 5 | [Print Joint States](example_5.md) | Print the joint states of Stretch.|
| 6 | [Store Effort Values](example_6.md) | Print, store, and plot the effort values of the Stretch robot.|
| 7 | [Capture Image](example_7.md) | Capture images from the RealSense camera data.|
| 8 | [Voice to Text](example_8.md) | Interpret speech and save transcript to a text file.|
| 9 | [Voice Teleoperation of Base](example_9.md) | Use speech to teleoperate the mobile base.|
| 10 | [Tf2 Broadcaster and Listener](example_10.md) | Create a tf2 broadcaster and listener.|
| 11 | [PointCloud Transformation](example_11.md) | Convert PointCloud2 data to a PointCloud and transform to a different frame.|
| 12 | [ArUco Tag Locator](example_12.md) | Actuate the head to locate a requested ArUco marker tag and return a transform.|
| 13 | [2D Navigation Goals](example_13.md) | Send 2D navigation goals to the move_base ROS node.|

+0 -2 ros1/aruco_marker_detection.md

@@ -130,5 +130,3 @@ When coming up with this guide, we expected the following:
* Body-mounted accessories with the same ID numbers mounted to different robots could be disambiguated using the expected range of 3D locations of the ArUco markers on the calibrated body.
* Accessories in the environment with the same ID numbers could be disambiguated using a map or nearby observable features of the environment.
**Next Tutorial:** [ReSpeaker Microphone Array](respeaker_microphone_array.md)

+17 -21 ros1/example_1.md

@@ -12,13 +12,13 @@ Begin by running the following command in a new terminal.
roslaunch stretch_core stretch_driver.launch
```
Switch the mode to *navigation* mode using a rosservice call. Then drive the robot forward with the move node.
Switch the mode to *navigation* mode using a rosservice call. Then drive the robot forward with the [move.py](https://github.com/hello-robot/stretch_tutorials/tree/noetic/src) node.
```bash
# Terminal 2
rosservice call /switch_to_navigation_mode
cd catkin_ws/src/stretch_tutorials/src/
python move.py
python3 move.py
```
To stop the node from sending twist messages, type **Ctrl** + **c**.
@@ -26,14 +26,14 @@ To stop the node from sending twist messages, type **Ctrl** + **c**.
Below is the code which will send *Twist* messages to drive the robot forward.
```python
#!/usr/bin/env python
#!/usr/bin/env python3
import rospy
from geometry_msgs.msg import Twist
class Move:
"""
A class that sends Twist messages to move the Stretch robot foward.
A class that sends Twist messages to move the Stretch robot forward.
"""
def __init__(self):
"""
@@ -72,16 +72,16 @@ if __name__ == '__main__':
Now let's break the code down.
```python
#!/usr/bin/env python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python script.
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
from geometry_msgs.msg import Twist
```
You need to import rospy if you are writing a ROS Node. The `geometry_msgs.msg` import is so that we can send velocity commands to the robot.
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). The `geometry_msgs.msg` import is so that we can send velocity commands to the robot.
```python
@@ -89,21 +89,20 @@ class Move:
def __init__(self):
self.pub = rospy.Publisher('/stretch/cmd_vel', Twist, queue_size=1)#/stretch_diff_drive_controller/cmd_vel for gazebo
```
This section of code defines the talker's interface to the rest of ROS. `pub = rospy.Publisher("/stretch/cmd_vel", Twist, queue_size=1)` declares that your node is publishing to the /stretch/cmd_vel topic using the message type Twist. The queue_size argument limits the amount of queued messages if any subscriber is not receiving them fast enough.
This section of code defines the talker's interface to the rest of ROS. `pub = rospy.Publisher("/stretch/cmd_vel", Twist, queue_size=1)` declares that your node is publishing to the */stretch/cmd_vel* topic using the message type `Twist`. The `queue_size` argument limits the amount of queued messages if any subscriber is not receiving them fast enough.
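The effect of `queue_size=1` can be illustrated without ROS at all. The sketch below is a hypothetical stand-in (plain Python, no rospy) for a publisher's outgoing queue: when messages are produced faster than they are consumed, only the newest survives.

```python
from collections import deque

# Hypothetical stand-in for a publisher's outgoing queue with queue_size=1:
# if the transport (or a slow subscriber) has not drained the queue yet,
# a newly published message displaces the one still waiting.
outgoing = deque(maxlen=1)

for velocity in [0.1, 0.2, 0.3]:   # three rapid publishes, nothing drained
    outgoing.append(velocity)

# Only the most recent command survives -- dropping stale velocity
# commands is exactly what you want for a mobile base.
print(list(outgoing))  # [0.3]
```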
```Python
command = Twist()
```
Make a Twist message. We're going to set all of the elements, since we
can't depend on them defaulting to safe values.
Make a `Twist` message. We're going to set all of the elements, since we can't depend on them defaulting to safe values.
```python
command.linear.x = 0.1
command.linear.y = 0.0
command.linear.z = 0.0
```
A Twist has three linear velocities (in meters per second), along each of the axes. For Stretch, it will only pay attention to the x velocity, since it can't directly move in the y direction or the z direction.
A `Twist` data structure has three linear velocities (in meters per second), along each of the axes. For Stretch, it will only pay attention to the x velocity, since it can't directly move in the y direction or the z direction.
```python
@@ -111,30 +110,29 @@ command.angular.x = 0.0
command.angular.y = 0.0
command.angular.z = 0.0
```
A *Twist* also has three rotational velocities (in radians per second).
The Stretch will only respond to rotations around the z (vertical) axis.
A `Twist` message also has three rotational velocities (in radians per second). The Stretch will only respond to rotations around the z (vertical) axis.
```python
self.pub.publish(command)
```
Publish the Twist commands in the previously defined topic name */stretch/cmd_vel*.
Publish the `Twist` commands in the previously defined topic name */stretch/cmd_vel*.
```Python
rospy.init_node('move')
base_motion = Move()
rate = rospy.Rate(10)
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. In this case, your node will take on the name talker. NOTE: the name must be a base name, i.e. it cannot contain any slashes "/".
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
The `rospy.Rate()` function creates a Rate object rate. With the help of its method `sleep()`, it offers a convenient way for looping at the desired rate. With its argument of 10, we should expect to go through the loop 10 times per second (as long as our processing time does not exceed 1/10th of a second!)
The `rospy.Rate()` function creates a Rate object. With the help of its method `sleep()`, it offers a convenient way for looping at the desired rate. With its argument of 10, we should expect to go through the loop 10 times per second (as long as our processing time does not exceed 1/10th of a second!)
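What `rospy.Rate` does under the hood can be approximated in plain Python. `SimpleRate` below is a hypothetical, ROS-free sketch, not the real rospy implementation:

```python
import time

class SimpleRate:
    """Rough, ROS-free sketch of what rospy.Rate(hz) does: sleep just long
    enough that each pass through the loop takes about 1/hz seconds."""
    def __init__(self, hz):
        self.period = 1.0 / hz
        self.last = time.monotonic()

    def sleep(self):
        elapsed = time.monotonic() - self.last
        if elapsed < self.period:          # work was faster than the period:
            time.sleep(self.period - elapsed)  # sleep off the remainder
        self.last = time.monotonic()

rate = SimpleRate(10)                      # aim for 10 iterations per second
start = time.monotonic()
for _ in range(5):
    # do work here (e.g. publish a Twist), then pace the loop
    rate.sleep()
elapsed = time.monotonic() - start         # roughly 0.5 s: 5 passes at 10 Hz
```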
```python
while not rospy.is_shutdown():
base_motion.move_forward()
rate.sleep()
```
This loop is a fairly standard rospy construct: checking the `rospy.is_shutdown()` flag and then doing work. You have to check is_shutdown() to check if your program should exit (e.g. if there is a Ctrl-C or otherwise). The loop calls `rate.sleep()`, which sleeps just long enough to maintain the desired rate through the loop.
This loop is a fairly standard rospy construct: checking the `rospy.is_shutdown()` flag and then doing work. You have to check `is_shutdown()` to check if your program should exit (e.g. if there is a **Ctrl-C** or otherwise). The loop calls `rate.sleep()`, which sleeps just long enough to maintain the desired rate through the loop.
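The same check-flag-then-work pattern can be sketched without ROS; here a `threading.Event` stands in for rospy's internal shutdown flag, flipped by a timer instead of **Ctrl-C**:

```python
import threading

# ROS-free sketch of the standard "check a shutdown flag, do work" loop.
# In rospy the flag is flipped by Ctrl-C / node shutdown; here a timer
# flips a threading.Event after 50 ms.
shutdown = threading.Event()
threading.Timer(0.05, shutdown.set).start()

iterations = 0
while True:                      # analogous to `while not rospy.is_shutdown():`
    iterations += 1              # do work here, e.g. publish a command
    if shutdown.is_set():        # exit cleanly once shutdown is requested
        break
```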
## Move Stretch in Simulation
@@ -142,7 +140,7 @@ This loop is a fairly standard rospy construct: checking the `rospy.is_shutdown(
<img src="images/move.gif"/>
</p>
Using your preferred text editor, modify the topic name of the published *Twist* messages. Please review the edit in the **move.py** script below.
Using your preferred text editor, modify the topic name of the published `Twist` messages. Please review the edit in the **move.py** script below.
```python
self.pub = rospy.Publisher('/stretch_diff_drive_controller/cmd_vel', Twist, queue_size=1)
@@ -152,9 +150,7 @@ After saving the edited node, bringup [Stretch in the empty world simulation](ga
```bash
cd catkin_ws/src/stretch_tutorials/src/
python move.py
python3 move.py
```
To stop the node from sending twist messages, type **Ctrl** + **c**.
**Next Example:** [Filter Laser Scans](example_2.md)

+21 -36 ros1/example_10.md

@@ -18,13 +18,12 @@ Within this tutorial package, there is an RViz config file with the topics for t
# Terminal 2
rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/tf2_broadcaster_example.rviz
```
Then run the tf2 broadcaster node to visualize three static frames.
Then run the [tf2_broadcaster.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/tf2_broadcaster.py) node to visualize three static frames.
```bash
# Terminal 3
cd catkin_ws/src/stretch_tutorials/src/
python tf2_broadcaster.py
python3 tf2_broadcaster.py
```
The gif below visualizes what happens when running the previous node.
@@ -32,15 +31,14 @@ The gif below visualizes what happens when running the previous node.
<img src="images/tf2_broadcaster.gif"/>
</p>
**OPTIONAL**: If you would like to see how the static frames update while the robot is in motion, run the stow command node and observe the tf frames in RViz.
**OPTIONAL**: If you would like to see how the static frames update while the robot is in motion, run the [stow_command_node.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/stow_command.py) and observe the tf frames in RViz.
```bash
# Terminal 4
cd catkin_ws/src/stretch_tutorials/src/
python stow_command.py
python3 stow_command.py
```
<p align="center">
<img src="images/tf2_broadcaster_with_stow.gif"/>
</p>
@@ -49,7 +47,7 @@ python stow_command.py
### The Code
```python
#!/usr/bin/env python
#!/usr/bin/env python3
import rospy
import tf.transformations
@@ -124,9 +122,9 @@ if __name__ == '__main__':
Now let's break the code down.
```python
#!/usr/bin/env python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python script.
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
@@ -134,8 +132,7 @@ import tf.transformations
from geometry_msgs.msg import TransformStamped
from tf2_ros import StaticTransformBroadcaster
```
You need to import rospy if you are writing a ROS Node. Import `tf.transformations` to get quaternion values from Euler angles. Import the `TransformStamped` from the `geometry_msgs.msg` package because we will be publishing static frames and it requires this message type. The `tf2_ros` package provides an implementation of a `tf2_ros.StaticTransformBroadcaster` to help make the task of publishing transforms easier.
You need to import rospy if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import `tf.transformations` to get quaternion values from Euler angles. Import the `TransformStamped` from the `geometry_msgs.msg` package because we will be publishing static frames and it requires this message type. The `tf2_ros` package provides an implementation of a `tf2_ros.StaticTransformBroadcaster` to help make the task of publishing transforms easier.
```python
def __init__(self):
@@ -146,7 +143,6 @@ def __init__(self):
"""
self.br = StaticTransformBroadcaster()
```
Here we create a `TransformStamped` object which will be the message we will send over once populated.
```python
@@ -155,7 +151,6 @@ self.mast.header.stamp = rospy.Time.now()
self.mast.header.frame_id = 'link_mast'
self.mast.child_frame_id = 'fk_link_mast'
```
We need to give the transform being published a timestamp, we'll just stamp it with the current time, `rospy.Time.now()`. Then, we need to set the name of the parent frame of the link we're creating, in this case *link_mast*. Finally, we need to set the name of the child frame of the link we're creating. In this instance, the child frame is *fk_link_mast*.
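The fields being populated can be shown with a ROS-free stand-in. `FakeTransformStamped` below is hypothetical (it mirrors only the fields this tutorial touches, plus translation and rotation), not the real `geometry_msgs` type:

```python
from dataclasses import dataclass
import time

# Hypothetical ROS-free stand-in for geometry_msgs/TransformStamped, just
# to show which fields a static transform carries before it is broadcast.
@dataclass
class FakeTransformStamped:
    stamp: float = 0.0           # when the transform was recorded
    frame_id: str = ''           # parent frame
    child_frame_id: str = ''     # child frame
    translation: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0, 1.0)  # identity quaternion (x, y, z, w)

mast = FakeTransformStamped()
mast.stamp = time.time()             # analogous to rospy.Time.now()
mast.frame_id = 'link_mast'          # parent link
mast.child_frame_id = 'fk_link_mast' # child link being created
```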
```python
@@ -186,9 +181,9 @@ rospy.init_node('tf2_broadcaster')
FixedFrameBroadcaster()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. In this case, your node will take on the name talker. NOTE: the name must be a base name, i.e. it cannot contain any slashes "/".
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
Instantiate the class with `FixedFrameBroadcaster()`
Instantiate the `FixedFrameBroadcaster()` class.
```python
rospy.spin()
@@ -198,7 +193,6 @@ messages come in. If we don't put this line in, then the node will not work,
and ROS will not process any messages.
## tf2 Static Listener
In the previous section of the tutorial, we created a tf2 broadcaster to publish three static transform frames. In this section we will create a tf2 listener that will find the transform between *fk_link_lift* and *link_grasp_center*.
@@ -208,23 +202,20 @@ Begin by starting up the stretch driver launch file.
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Then run the tf2 broadcaster node to create the three static frames.
Then run the [tf2_broadcaster.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/tf2_broadcaster.py) node to create the three static frames.
```bash
# Terminal 2
cd catkin_ws/src/stretch_tutorials/src/
python tf2_broadcaster.py
python3 tf2_broadcaster.py
```
Finally, run the tf2 listener node to print the transform between two links.
Finally, run the [tf2_listener.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/tf2_listener.py) node to print the transform between two links.
```bash
# Terminal 3
cd catkin_ws/src/stretch_tutorials/src/
python tf2_listener.py
python3 tf2_listener.py
```
Within the terminal the transform will be printed every 1 second. Below is an example of what will be printed in the terminal. There is also an image for reference of the two frames.
```bash
@@ -248,7 +239,7 @@ rotation:
### The Code
```python
#!/usr/bin/env python
#!/usr/bin/env python3
import rospy
from geometry_msgs.msg import TransformStamped
@@ -261,7 +252,7 @@ class FrameListener():
"""
def __init__(self):
"""
A function that initializes the variables and looks up a tranformation
A function that initializes the variables and looks up a transformation
between a target and source frame.
:param self: The self reference.
"""
@@ -296,9 +287,9 @@ if __name__ == '__main__':
Now let's break the code down.
```python
#!/usr/bin/env python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python script.
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
@@ -306,7 +297,7 @@ from geometry_msgs.msg import TransformStamped
import tf2_ros
```
You need to import rospy if you are writing a ROS Node. Import the `TransformStamped` from the `geometry_msgs.msg` package because we will be publishing static frames and it requires this message type. The `tf2_ros` package provides an implementation of a `tf2_ros.TransformListener` to help make the task of receiving transforms easier.
You need to import rospy if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import the `TransformStamped` from the `geometry_msgs.msg` package because we will be publishing static frames and it requires this message type. The `tf2_ros` package provides an implementation of a `tf2_ros.TransformListener` to help make the task of receiving transforms easier.
```python
tf_buffer = tf2_ros.Buffer()
@@ -324,7 +315,6 @@ Store frame names in variables that will be used to compute transformations.
rospy.sleep(1.0)
rate = rospy.Rate(1)
```
The first line gives the listener some time to accumulate transforms. The second line is the rate the node is going to publish information (1 Hz).
```python
@@ -337,7 +327,6 @@ try:
except (tf2_ros.LookupException, tf2_ros.ConnectivityException, tf2_ros.ExtrapolationException):
rospy.logwarn(' Could not transform %s from %s ', to_frame_rel, from_frame_rel)
```
Try to look up the transform we want. Use a try-except block, since it may fail on any single call, due to internal timing issues in the transform publishers. Look up transform between *from_frame_rel* and *to_frame_rel* frames with the `lookup_transform()` function.
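The retry pattern is independent of tf2 itself. Below is a ROS-free sketch in which a hypothetical `lookup_transform()` fails a few times before the buffer is ready, mirroring how the real call can raise transiently:

```python
class LookupException(Exception):
    """Stand-in for tf2_ros.LookupException in this ROS-free sketch."""

def lookup_transform(attempt, ready_after=3):
    # Hypothetical lookup that fails until the buffer has accumulated
    # enough data, mimicking the transient failures the real
    # lookup_transform() can raise.
    if attempt < ready_after:
        raise LookupException('transform not yet available')
    return ('fk_link_lift', 'link_grasp_center')

transform = None
attempt = 0
while transform is None:         # keep trying at the loop rate
    try:
        transform = lookup_transform(attempt)
    except LookupException:
        attempt += 1             # a real node would log a warning and sleep
```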
```python
@@ -345,15 +334,11 @@ rospy.init_node('tf2_listener')
FrameListener()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. In this case, your node will take on the name talker. NOTE: the name must be a base name, i.e. it cannot contain any slashes "/".
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
Instantiate the class with `FrameListener()`
Instantiate the `FrameListener()` class.
```python
rospy.spin()
```
Give control to ROS. This will allow the callback to be called whenever new messages come in. If we don't put this line in, then the node will not work, and ROS will not process any messages.
**Previous Example** [Voice Teleoperation of Base](example_9.md)
**Next Example** [PointCloud Transformation](example_11.md)

+13 -35 ros1/example_11.md

@@ -8,30 +8,25 @@ Begin by starting up the stretch driver launch file.
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
To activate the RealSense camera and publish topics to be visualized, run the following launch file in a new terminal.
To activate the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/) and publish topics to be visualized, run the following launch file in a new terminal.
```bash
# Terminal 2
roslaunch stretch_core d435i_low_resolution.launch
```
Then run the `PointCloud` transformer node.
Then run the [pointCloud_transformer.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/pointcloud_transformer.py) node.
```bash
# Terminal 3
cd catkin_ws/src/stretch_tutorials/src/
python pointcloud_transformer.py
python3 pointcloud_transformer.py
```
Within this tutorial package, there is an RViz config file with the `PointCloud` in the Display tree. You can visualize this topic and the robot model by running the command below in a new terminal.
```bash
# Terminal 4
rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/PointCloud_transformer_example.rviz
```
The gif below visualizes what happens when running the previous node.
<p align="center">
@@ -43,7 +38,7 @@ The gif below visualizes what happens when running the previous node.
### The Code
```python
#!/usr/bin/env python
#!/usr/bin/env python3
import rospy
import tf
import sensor_msgs.point_cloud2 as pc2
@@ -65,7 +60,7 @@ class PointCloudTransformer:
self.pointcloud2_sub = rospy.Subscriber("/camera/depth/color/points", PointCloud2, self.callback_pcl2, queue_size=1)
self.pointcloud_pub = rospy.Publisher("/camera_cloud", PointCloud, queue_size=1)
self.pcl2_cloud = None
self.listener = tf.TransformROS(True, rospy.Duration(10.0))
self.listener = tf.TransformListener(True, rospy.Duration(10.0))
rospy.loginfo('Publishing transformed PointCloud. Use RViz to visualize')
def callback_pcl2(self,msg):
@@ -98,7 +93,7 @@ class PointCloudTransformer:
:param self: The self reference.
:param msg: The PointCloud message.
:returns new_cloud: PointCloud message.
:returns new_cloud: The transformed PointCloud message.
"""
while not rospy.is_shutdown():
try:
@@ -124,9 +119,9 @@ if __name__=="__main__":
Now let's break the code down.
```python
#!/usr/bin/env python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python script.
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
@@ -136,8 +131,7 @@ from sensor_msgs.msg import PointCloud2, PointCloud
from geometry_msgs.msg import Point32
from std_msgs.msg import Header
```
You need to import `rospy` if you are writing a ROS Node. Import `tf` to utilize the `transformPointCloud` function. Import various the message types from `sensor_msgs`.
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import `tf` to utilize the `transformPointCloud` function. Import the various message types from `sensor_msgs`.
```python
self.pointcloud2_sub = rospy.Subscriber("/camera/depth/color/points", PointCloud2, self.callback_pcl2, queue_size=1)
@@ -147,14 +141,12 @@ Set up a subscriber. We're going to subscribe to the topic */camera/depth/color
```python
self.pointcloud_pub = rospy.Publisher("/camera_cloud", PointCloud, queue_size=1)
```
This section of code defines the talker's interface to the rest of ROS. `self.pointcloud_pub = rospy.Publisher("/camera_cloud", PointCloud, queue_size=1)` declares that your node is publishing to the */camera_cloud* topic using the message type `PointCloud`.
```python
self.pcl2_cloud = None
self.listener = tf.TransformListener(True, rospy.Duration(10.0))
```
The first line of code initializes *self.pcl2_cloud* to store the `PointCloud2` message. The second line creates a `tf.TransformListener` object. Once the listener is created, it starts receiving tf transformations over the wire, and buffers them for up to 10 seconds.
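The buffering behavior can be sketched without tf: keep timestamped entries and evict anything older than the cache window. `TransformBuffer` below is a hypothetical illustration, not the real `tf.TransformListener`:

```python
from collections import deque

class TransformBuffer:
    """Rough sketch of the listener's buffering: keep received transforms
    for up to `cache_time` seconds, discarding anything older."""
    def __init__(self, cache_time=10.0):
        self.cache_time = cache_time
        self.entries = deque()   # (timestamp, transform) pairs, oldest first

    def add(self, transform, stamp):
        self.entries.append((stamp, transform))
        cutoff = stamp - self.cache_time
        while self.entries and self.entries[0][0] < cutoff:
            self.entries.popleft()   # evict entries outside the window

buf = TransformBuffer(cache_time=10.0)
buf.add('T0', stamp=0.0)
buf.add('T1', stamp=5.0)
buf.add('T2', stamp=12.0)   # evicts the transform stamped at t=0
```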
```python
@@ -172,20 +164,17 @@ The callback function that stores the the `PointCloud2` message.
temp_cloud = PointCloud()
temp_cloud.header = self.pcl2_cloud.header
```
Create a `PointCloud` for temporary use. Set the temporary PointCloud's header to the stored PointCloud2 header.
Create a `PointCloud` for temporary use. Set the temporary PointCloud's header to the stored `PointCloud2` header.
```python
for data in pc2.read_points(self.pcl2_cloud, skip_nans=True):
temp_cloud.points.append(Point32(data[0],data[1],data[2]))
```
Use a for loop to extract `PointCloud2` data into a list of x, y, z points and append those values to the `PointCloud` message, *temp_cloud*.
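What `pc2.read_points` does, stripped to its essence, is walk a packed binary buffer. A minimal ROS-free sketch with three xyz float32 points:

```python
import struct

# A PointCloud2 stores points as packed binary data; pc2.read_points walks
# that buffer for you. Here, three xyz points are packed as little-endian
# float32 triples (12 bytes each), then unpacked back into tuples.
points = [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0), (0.5, -0.5, 0.25)]
data = b''.join(struct.pack('<fff', *p) for p in points)

unpacked = [struct.unpack_from('<fff', data, offset)
            for offset in range(0, len(data), 12)]  # 12 bytes per point
```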
```python
transformed_cloud = self.transform_pointcloud(temp_cloud)
```
Utilize the `transform_pointcloud` function to transform the points in the `PointCloud` message to reference the *base_link*
```python
@@ -198,14 +187,11 @@ while not rospy.is_shutdown():
except (tf.LookupException, tf.ConnectivityException,tf.ExtrapolationException):
pass
```
Try to look up and transform the `PointCloud` input. Use a try-except block, since it may fail on any single call, due to internal timing issues in the transform publishers. Transform the point cloud data from *camera_color_optical_frame* to *base_link* with the `transformPointCloud()` function.
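The transform applied by `transformPointCloud()` is, per point, a rotation plus a translation. Below is a ROS-free sketch of the planar (rotation about z) case, with made-up numbers rather than the real camera-to-base_link transform:

```python
import math

def transform_point(p, yaw, translation):
    """Apply a planar rigid-body transform (rotation about z, then a
    translation) to one point -- the per-point core of what
    transformPointCloud() does with the full frame-to-frame transform."""
    x, y, z = p
    c, s = math.cos(yaw), math.sin(yaw)
    tx, ty, tz = translation
    return (c * x - s * y + tx, s * x + c * y + ty, z + tz)

# Rotate (1, 0, 0) by 90 degrees about z, then shift up 0.1 m:
p = transform_point((1.0, 0.0, 0.0), math.pi / 2, (0.0, 0.0, 0.1))
# p is approximately (0.0, 1.0, 0.1)
```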
```python
self.pointcloud_pub.publish(transformed_cloud)
```
Publish the new transformed `PointCloud`.
```python
@@ -213,27 +199,19 @@ rospy.init_node('pointcloud_transformer',anonymous=True)
PCT = PointCloudTransformer()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. In this case, your node will take on the name talker. NOTE: the name must be a base name, i.e. it cannot contain any slashes "/".
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
Declare object, *PCT*, from the `PointCloudTransformer` class.
Declare a `PointCloudTransformer` object.
```python
rate = rospy.Rate(1)
rospy.sleep(1)
```
The first line gives the listener some time to accumulate transforms. The second line is the rate the node is going to publish information (1 Hz).
```python
# Run while loop until the node is shutdown
while not rospy.is_shutdown():
# Run the pcl_transformer method
PCT.pcl_transformer()
rate.sleep()
```
Run a while loop until the node is shutdown. Within the while loop run the `pcl_transformer()` method.
**Previous Example** [Tf2 Broadcaster and Listener](example_10.md)
**Next Example** [ArUco Tag Locator](example_12.md)

+10 -31 ros1/example_12.md

@@ -46,22 +46,22 @@ Within this tutorial package, there is an RViz config file with the topics for t
rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/aruco_detector_example.rviz
```
Then run the aruco tag locator node.
Then run the [aruco_tag_locator.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/aruco_tag_locator.py) node.
```bash
# Terminal 5
cd catkin_ws/src/stretch_tutorials/src/
python aruco_tag_locator.py
python3 aruco_tag_locator.py
```
<p align="center">
<img src="images/aruco_locator.gif"/>
<img src="images/aruco_detector.gif"/>
</p>
### The Code
```python
#! /usr/bin/env python
#! /usr/bin/env python3
import rospy
import time
@@ -111,7 +111,6 @@ class LocateArUcoTag(hm.HelloNode):
"""
self.joint_state = msg
def send_command(self, command):
'''
Handles single joint control commands by constructing a FollowJointTrajectoryGoal
@@ -142,7 +141,6 @@
self.trajectory_client.send_goal(trajectory_goal)
self.trajectory_client.wait_for_result()
def find_tag(self, tag_name='docking_station'):
"""
A function that actuates the camera to search for a defined ArUco tag
@ -150,7 +148,7 @@ class LocateArUcoTag(hm.HelloNode):
:param self: The self reference.
:param tag_name: A string value of the ArUco marker name.
:returns transform: The docking station's TransformStamped message.
"""
pan_command = {'joint': 'joint_head_pan', 'position': self.min_pan_position}
self.send_command(pan_command)
@ -194,7 +192,6 @@ class LocateArUcoTag(hm.HelloNode):
rospy.loginfo('Searching for docking ArUco tag.')
pose = self.find_tag("docking_station")
if __name__ == '__main__':
try:
node = LocateArUcoTag()
@ -208,9 +205,9 @@ if __name__ == '__main__':
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
@ -226,7 +223,7 @@ from trajectory_msgs.msg import JointTrajectoryPoint
from geometry_msgs.msg import TransformStamped
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import other python modules needed for this node. Import the `FollowJointTrajectoryGoal` from the [control_msgs.msg](http://wiki.ros.org/control_msgs) package to control the Stretch robot. Import `JointTrajectoryPoint` from the [trajectory_msgs](http://wiki.ros.org/trajectory_msgs) package to define robot trajectories. The [hello_helpers](https://github.com/hello-robot/stretch_ros/tree/master/hello_helpers) package consists of a module that provides various Python scripts used across [stretch_ros](https://github.com/hello-robot/stretch_ros). In this instance we are importing the `hello_misc` script.
```python
def __init__(self):
@ -241,7 +238,6 @@ def __init__(self):
self.joint_state = None
```
The `LocateArUcoTag` class inherits the `HelloNode` class from `hm` and is instantiated.
Set up a subscriber with `rospy.Subscriber('/stretch/joint_states', JointState, self.joint_states_callback)`. We're going to subscribe to the topic "*stretch/joint_states*", looking for `JointState` messages. When a message comes in, ROS is going to pass it to the function `joint_states_callback()` automatically.
@ -254,7 +250,6 @@ self.max_pan_position = 1.50
self.pan_num_steps = 10
self.pan_step_size = abs(self.min_pan_position - self.max_pan_position)/self.pan_num_steps
```
Provide the minimum and maximum joint positions for the head pan. These values are needed for sweeping the head to search for the ArUco tag. We also define the number of steps for the sweep, then create the step size for the head pan joint.
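The step-size arithmetic can be checked in plain Python. A sketch with illustrative joint limits (the numbers below are placeholders, not the values used by the actual script):

```python
# Illustrative head-pan limits (placeholders, not Stretch's real limits)
min_pan_position = -3.90
max_pan_position = 1.50
pan_num_steps = 10

# Total sweep range divided into equal increments
pan_step_size = abs(min_pan_position - max_pan_position) / pan_num_steps

# The sweep then visits pan_num_steps positions starting at the minimum
pan_positions = [min_pan_position + i * pan_step_size
                 for i in range(pan_num_steps)]
```

With these placeholder limits the range spans 5.4 radians, giving a 0.54 radian step.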
```python
@ -262,7 +257,6 @@ self.min_tilt_position = -0.75
self.tilt_num_steps = 3
self.tilt_step_size = pi/16
```
Set the minimum position of the tilt joint, the number of steps, and the size of each step.
```python
@ -279,7 +273,6 @@ def joint_states_callback(self, msg):
"""
self.joint_state = msg
```
The `joint_states_callback()` function stores Stretch's joint states.
```python
@ -296,7 +289,6 @@ def send_command(self, command):
trajectory_goal.trajectory.joint_names = [joint_name]
point = JointTrajectoryPoint()
```
Assign *trajectory_goal* as a `FollowJointTrajectoryGoal` message type. Then extract the string value from the `joint` key. Also, assign *point* as a `JointTrajectoryPoint` message type.
```python
@ -307,14 +299,12 @@ if 'delta' in command:
new_value = joint_value + delta
point.positions = [new_value]
```
Check to see if `delta` is a key in the command dictionary. Then get the current position of the joint and add the delta as a new position value.
```python
elif 'position' in command:
point.positions = [command['position']]
```
Check to see if `position` is a key in the command dictionary. Then extract the position value.
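Stripped of the trajectory-message machinery, the two branches reduce to a small dictionary lookup. A ROS-free sketch (`resolve_target` is a hypothetical helper, not part of the tutorial code):

```python
def resolve_target(command, current_position):
    """Return the goal position encoded by a command dictionary.

    The dictionary carries either a relative 'delta' or an absolute
    'position', mirroring the two branches in send_command().
    """
    if 'delta' in command:
        return current_position + command['delta']
    if 'position' in command:
        return command['position']
    raise ValueError("command needs a 'delta' or 'position' key")

# Relative move: the delta is added to the current joint value
relative = resolve_target({'joint': 'joint_head_pan', 'delta': 0.1}, 0.5)
# Absolute move: the requested position is used directly
absolute = resolve_target({'joint': 'joint_head_tilt', 'position': -0.75}, 0.2)
```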
```python
@ -325,7 +315,6 @@ trajectory_goal.trajectory.header.frame_id = 'base_link'
self.trajectory_client.send_goal(trajectory_goal)
self.trajectory_client.wait_for_result()
```
Then `trajectory_goal.trajectory.points` is defined by the positions set in *point*. Specify the coordinate frame that we want (*base_link*) and set the time to be now. Make the action call and send the goal. The last line of code waits for the result before it exits the python script.
```python
@ -336,7 +325,7 @@ def find_tag(self, tag_name='docking_station'):
:param self: The self reference.
:param tag_name: A string value of the ArUco marker name.
:returns transform: The docking station's TransformStamped message.
"""
pan_command = {'joint': 'joint_head_pan', 'position': self.min_pan_position}
self.send_command(pan_command)
@ -344,7 +333,6 @@ def find_tag(self, tag_name='docking_station'):
self.send_command(tilt_command)
```
Create dictionaries to get the head into its initial position for the search, and send the commands to the `send_command()` function.
```python
@ -354,7 +342,6 @@ for i in range(self.tilt_num_steps):
self.send_command(pan_command)
rospy.sleep(0.5)
```
Use a nested for loop to sweep the pan and tilt in increments. Then update the *joint_head_pan* position by the *pan_step_size*.
Use the `rospy.sleep()` function to give the system time to do a transform lookup before the next step.
@ -369,8 +356,6 @@ try:
except (tf2_ros.LookupException, tf2_ros.ConnectivityException, tf2_ros.ExtrapolationException):
continue
```
Use a try-except block to look up the transform between the *base_link* and requested ArUco tag. Then publish and return the `TransformStamped` message.
```python
@ -380,7 +365,6 @@ tilt_command = {'joint': 'joint_head_tilt', 'delta': self.tilt_step_size}
self.send_command(tilt_command)
rospy.sleep(.25)
```
Begin the sweep again with the new tilt angle.
@ -401,7 +385,6 @@ self.tf_buffer = tf2_ros.Buffer()
self.listener = tf2_ros.TransformListener(self.tf_buffer)
rospy.sleep(1.0)
```
Create a `StaticTransformBroadcaster` node. Also, start a tf buffer that will store the tf information for a few seconds. Then set up a tf listener, which will subscribe to all of the relevant tf topics and keep track of the information. Include `rospy.sleep(1.0)` to give the listener some time to accumulate transforms.
```python
@ -418,8 +401,4 @@ if __name__ == '__main__':
except KeyboardInterrupt:
rospy.loginfo('interrupt received, so shutting down')
```
Declare `LocateArUcoTag` object. Then run the `main()` method.

ros1/example_13.md (+220 -0)

@ -0,0 +1,220 @@
# Example 13
In this example, we will be utilizing the [move_base ROS node](http://wiki.ros.org/move_base), a component of the [ROS navigation stack](http://wiki.ros.org/navigation?distro=melodic), to send base goals to the Stretch robot.
## Build a map
First, begin by building a map of the space the robot will be navigating in. If you need a refresher on how to do this, then check out the [Navigation Stack tutorial](navigation_stack.md).
## Getting Started
Next, with your created map, we can navigate the robot around the mapped space. Run:
```bash
roslaunch stretch_navigation navigation.launch map_yaml:=${HELLO_FLEET_PATH}/maps/<map_name>.yaml
```
Where `${HELLO_FLEET_PATH}` is the directory that contains the `<map_name>.yaml` file.
**IMPORTANT NOTE:** It's likely that the robot's location in the map does not match the robot's location in the real space. In the top bar of RViz, use 2D Pose Estimate to lay an arrow down roughly where the robot is located in the real space. Below is a gif for reference.
<p align="center">
<img src="images/2D_pose_estimate.gif"/>
</p>
Now we are going to use a node to send a move base goal half a meter in front of the map's origin. Run the following command to execute the [navigation.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/navigation.py) node.
```bash
# Terminal 2
cd catkin_ws/src/stretch_tutorials/src/
python3 navigation.py
```
### The Code
```python
#!/usr/bin/env python3
import rospy
import actionlib
import sys
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal
from geometry_msgs.msg import Quaternion
from tf import transformations
class StretchNavigation:
"""
A simple encapsulation of the navigation stack for a Stretch robot.
"""
def __init__(self):
"""
Create an instance of the simple navigation interface.
:param self: The self reference.
"""
self.client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
self.client.wait_for_server()
rospy.loginfo('{0}: Made contact with move_base server'.format(self.__class__.__name__))
self.goal = MoveBaseGoal()
self.goal.target_pose.header.frame_id = 'map'
self.goal.target_pose.header.stamp = rospy.Time()
self.goal.target_pose.pose.position.x = 0.0
self.goal.target_pose.pose.position.y = 0.0
self.goal.target_pose.pose.position.z = 0.0
self.goal.target_pose.pose.orientation.x = 0.0
self.goal.target_pose.pose.orientation.y = 0.0
self.goal.target_pose.pose.orientation.z = 0.0
self.goal.target_pose.pose.orientation.w = 1.0
def get_quaternion(self,theta):
"""
A function to build Quaternions from Euler angles. Since the Stretch only
rotates around z, we can zero out the other angles.
:param theta: The angle (radians) the robot makes with the x-axis.
"""
return Quaternion(*transformations.quaternion_from_euler(0.0, 0.0, theta))
def go_to(self, x, y, theta):
"""
Drive the robot to a particular pose on the map. The Stretch only needs
(x, y) coordinates and a heading.
:param x: x coordinate in the map frame.
:param y: y coordinate in the map frame.
:param theta: heading (angle with the x-axis in the map frame)
"""
rospy.loginfo('{0}: Heading for ({1}, {2}) at {3} radians'.format(self.__class__.__name__,
x, y, theta))
self.goal.target_pose.pose.position.x = x
self.goal.target_pose.pose.position.y = y
self.goal.target_pose.pose.orientation = self.get_quaternion(theta)
self.client.send_goal(self.goal, done_cb=self.done_callback)
self.client.wait_for_result()
def done_callback(self, status, result):
"""
The done_callback function will be called when the move_base action is complete.
:param self: The self reference.
:param status: status attribute from MoveBaseActionResult message.
:param result: result attribute from MoveBaseActionResult message.
"""
if status == actionlib.GoalStatus.SUCCEEDED:
rospy.loginfo('{0}: SUCCEEDED in reaching the goal.'.format(self.__class__.__name__))
else:
rospy.loginfo('{0}: FAILED in reaching the goal.'.format(self.__class__.__name__))
if __name__ == '__main__':
rospy.init_node('navigation', argv=sys.argv)
nav = StretchNavigation()
nav.go_to(0.5, 0.0, 0.0)
```
### The Code Explained
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
import actionlib
import sys
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal
from geometry_msgs.msg import Quaternion
from tf import transformations
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import `actionlib` for the action client, `MoveBaseAction` and `MoveBaseGoal` from [move_base_msgs.msg](http://wiki.ros.org/move_base_msgs) to define navigation goals, `Quaternion` from `geometry_msgs.msg`, and the `transformations` module from `tf` to convert Euler angles to quaternions.
```python
self.client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
self.client.wait_for_server()
rospy.loginfo('{0}: Made contact with move_base server'.format(self.__class__.__name__))
```
Set up a client for the navigation action. On the Stretch, this is called `move_base`, and has type `MoveBaseAction`. Once we make the client, we wait for the server to be ready.
```python
self.goal = MoveBaseGoal()
self.goal.target_pose.header.frame_id = 'map'
self.goal.target_pose.header.stamp = rospy.Time()
```
Make a goal for the action. Specify the coordinate frame that we want, in this instance the *map*. Then we set the time to be now.
```python
self.goal.target_pose.pose.position.x = 0.0
self.goal.target_pose.pose.position.y = 0.0
self.goal.target_pose.pose.position.z = 0.0
self.goal.target_pose.pose.orientation.x = 0.0
self.goal.target_pose.pose.orientation.y = 0.0
self.goal.target_pose.pose.orientation.z = 0.0
self.goal.target_pose.pose.orientation.w = 1.0
```
Initialize a position in the coordinate frame.
```python
def get_quaternion(self,theta):
"""
A function to build Quaternions from Euler angles. Since the Stretch only
rotates around z, we can zero out the other angles.
:param theta: The angle (radians) the robot makes with the x-axis.
"""
return Quaternion(*transformations.quaternion_from_euler(0.0, 0.0, theta))
```
A function that transforms Euler angles to quaternions and returns those values.
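For a rotation purely about the z-axis, `quaternion_from_euler(0, 0, theta)` reduces to the closed form x = y = 0, z = sin(theta/2), w = cos(theta/2). A ROS-free sketch of the same math (`yaw_to_quaternion` is a hypothetical helper, not part of tf):

```python
from math import sin, cos, pi

def yaw_to_quaternion(theta):
    """Quaternion (x, y, z, w) for a rotation of theta about the z-axis."""
    return (0.0, 0.0, sin(theta / 2.0), cos(theta / 2.0))

identity = yaw_to_quaternion(0.0)   # no rotation: (0, 0, 0, 1)
half_turn = yaw_to_quaternion(pi)   # 180 degrees: z ~ 1, w ~ 0
```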
```python
def go_to(self, x, y, theta):
"""
Drive the robot to a particular pose on the map. The Stretch only needs
(x, y) coordinates and a heading.
:param x: x coordinate in the map frame.
:param y: y coordinate in the map frame.
:param theta: heading (angle with the x-axis in the map frame)
"""
rospy.loginfo('{0}: Heading for ({1}, {2}) at {3} radians'.format(self.__class__.__name__,
x, y, theta))
```
The `go_to()` function takes in three arguments: the x and y coordinates in the *map* frame, and the heading.
```python
self.goal.target_pose.pose.position.x = x
self.goal.target_pose.pose.position.y = y
self.goal.target_pose.pose.orientation = self.get_quaternion(theta)
```
The `MoveBaseGoal()` data structure has three goal positions (in meters), along each of the axes. For Stretch, it will only pay attention to the x and y coordinates, since it can't move in the z direction.
```python
self.client.send_goal(self.goal, done_cb=self.done_callback)
self.client.wait_for_result()
```
Send the goal and pass the `done_callback()` function as one of the arguments in `send_goal()`.
```python
def done_callback(self, status, result):
"""
The done_callback function will be called when the move_base action is complete.
:param self: The self reference.
:param status: status attribute from MoveBaseActionResult message.
:param result: result attribute from MoveBaseActionResult message.
"""
if status == actionlib.GoalStatus.SUCCEEDED:
rospy.loginfo('{0}: SUCCEEDED in reaching the goal.'.format(self.__class__.__name__))
else:
rospy.loginfo('{0}: FAILED in reaching the goal.'.format(self.__class__.__name__))
```
A conditional statement prints whether the goal status in the `MoveBaseActionResult` message succeeded or failed.
```python
rospy.init_node('navigation', argv=sys.argv)
nav = StretchNavigation()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
Declare the `StretchNavigation` object.
```python
nav.go_to(0.5, 0.0, 0.0)
```
Send a move base goal half a meter in front of the map's origin.

ros1/example_2.md (+20 -23)

@ -44,18 +44,18 @@ First, open a terminal and run the stretch driver launch file.
roslaunch stretch_core stretch_driver.launch
```
Then in a new terminal run the `rplidar.launch` file from `stretch_core`.
```bash
# Terminal 2
roslaunch stretch_core rplidar.launch
```
To filter the lidar scans for ranges that are directly in front of Stretch (width of 1 meter) run the [scan_filter.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/scan_filter.py) node by typing the following in a new terminal.
```bash
# Terminal 3
cd catkin_ws/src/stretch_tutorials/src/
python3 scan_filter.py
```
Then run the following command to bring up a simple RViz configuration of the Stretch robot.
@ -72,7 +72,7 @@ Change the topic name from the LaserScan display from */scan* to */filter_scan*.
### The Code
```python
#!/usr/bin/env python3
import rospy
from numpy import linspace, inf
@ -80,24 +80,24 @@ from math import sin
from sensor_msgs.msg import LaserScan
class ScanFilter:
"""
A class that implements a LaserScan filter that removes all of the points
that are not in front of the robot.
"""
def __init__(self):
self.width = 1.0
self.extent = self.width / 2.0
self.sub = rospy.Subscriber('/scan', LaserScan, self.callback)
self.pub = rospy.Publisher('filtered_scan', LaserScan, queue_size=10)
rospy.loginfo("Publishing the filtered_scan topic. Use RViz to visualize.")
def callback(self,msg):
"""
Callback function to deal with incoming LaserScan messages.
:param self: The self reference.
:param msg: The subscribed LaserScan message.
:publishes msg: updated LaserScan message.
"""
angles = linspace(msg.angle_min, msg.angle_max, len(msg.ranges))
points = [r * sin(theta) if (theta < -2.5 or theta > 2.5) else inf for r,theta in zip(msg.ranges, angles)]
@ -116,9 +116,9 @@ if __name__ == '__main__':
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
@ -127,7 +127,7 @@ from numpy import linspace, inf
from math import sin
from sensor_msgs.msg import LaserScan
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). There are functions from `numpy` and `math` that are required within this code, which is why `linspace`, `inf`, and `sin` are imported. The `sensor_msgs.msg` import is so that we can subscribe and publish `LaserScan` messages.
```python
self.width = 1
@ -138,12 +138,12 @@ We're going to assume that the robot is pointing up the x-axis, so that any poin
```python
self.sub = rospy.Subscriber('/scan', LaserScan, self.callback)
```
Set up a subscriber. We're going to subscribe to the topic *scan*, looking for `LaserScan` messages. When a message comes in, ROS is going to pass it to the function "callback" automatically.
```python
self.pub = rospy.Publisher('filtered_scan', LaserScan, queue_size=10)
```
`pub = rospy.Publisher("filtered_scan", LaserScan, queue_size=10)` declares that your node is publishing to the *filtered_scan* topic using the message type `LaserScan`. This lets the master tell any nodes listening on *filtered_scan* that we are going to publish data on that topic.
```python
angles = linspace(msg.angle_min, msg.angle_max, len(msg.ranges))
@ -171,7 +171,7 @@ Substitute in the new ranges in the original message, and republish it.
rospy.init_node('scan_filter')
ScanFilter()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
Instantiate the class with `ScanFilter()`.
@ -180,7 +180,4 @@ rospy.spin()
```
Give control to ROS. This will allow the callback to be called whenever new
messages come in. If we don't put this line in, then the node will not work,
and ROS will not process any messages.

ros1/example_3.md (+11 -14)

@ -14,13 +14,13 @@ Then in a new terminal type the following to activate the LiDAR sensor.
roslaunch stretch_core rplidar.launch
```
To set *navigation* mode and to activate the [avoider.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/avoider.py) node, type the following in a new terminal.
```bash
# Terminal 3
rosservice call /switch_to_navigation_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 avoider.py
```
To stop the node from sending twist messages, type **Ctrl** + **c** in the terminal running the avoider node.
@ -31,7 +31,7 @@ To stop the node from sending twist messages, type **Ctrl** + **c** in the termi
### The Code
```python
#!/usr/bin/env python3
import rospy
from numpy import linspace, inf, tanh
@ -66,9 +66,9 @@ class Avoider:
def set_speed(self,msg):
"""
Callback function to deal with incoming LaserScan messages.
:param self: The self reference.
:param msg: The subscribed LaserScan message.
:publishes self.twist: Twist message.
"""
@ -91,9 +91,9 @@ if __name__ == '__main__':
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
@ -103,7 +103,7 @@ from math import sin
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). There are functions from `numpy` and `math` that are required within this code, which is why `linspace`, `inf`, `tanh`, and `sin` are imported. The `sensor_msgs.msg` import is so that we can subscribe to `LaserScan` messages. The `geometry_msgs.msg` import is so that we can send velocity commands to the robot.
```python
@ -142,7 +142,7 @@ angles = linspace(msg.angle_min, msg.angle_max, len(msg.ranges))
points = [r * sin(theta) if (theta < -2.5 or theta > 2.5) else inf for r,theta in zip(msg.ranges, angles)]
new_ranges = [r if abs(y) < self.extent else inf for r,y in zip(msg.ranges, points)]
```
This line of code utilizes linspace to compute each angle of the subscribed scan. Here we compute the y coordinates of the ranges that are below -2.5 and above 2.5 radians of the scan angles. These limits are sufficient for considering scan ranges in front of Stretch, but these values can be altered to your preference. If the absolute value of a point's y-coordinate is under self.extent then keep the range, otherwise use inf, which means "no return".
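The same filtering logic can be exercised without ROS. A sketch using a hand-made five-beam scan (the ranges and angles are illustrative, and plain `math` stands in for numpy):

```python
from math import pi, sin

inf = float('inf')
extent = 0.5                                   # half of a 1 m wide strip
ranges = [1.0, 1.0, 1.0, 1.0, 1.0]             # toy scan of five beams
angles = [-pi, -pi / 2, 0.0, pi / 2, pi]       # stands in for linspace(...)

# Lateral (y) offset of each return; only beams beyond +/-2.5 rad,
# which face forward on Stretch's lidar, get a real offset
points = [r * sin(theta) if (theta < -2.5 or theta > 2.5) else inf
          for r, theta in zip(ranges, angles)]
# Keep only returns whose y offset lies inside the strip
new_ranges = [r if abs(y) < extent else inf
              for r, y in zip(ranges, points)]
```

Only the first and last beams survive here: they face forward and land inside the half-meter extent, while the rest become `inf` ("no return").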
```python
@ -167,11 +167,8 @@ rospy.init_node('avoider')
Avoider()
rospy.spin()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
Instantiate the class with `Avoider()`.
Give control to ROS with `rospy.spin()`. This will allow the callback to be called whenever new messages come in. If we don't put this line in, then the node will not work, and ROS will not process any messages.

ros1/example_4.md (+13 -23)

@ -3,22 +3,20 @@
<img src="images/balloon.png"/>
</p>
Let's bring up Stretch in the Willow Garage world from the [gazebo basics tutorial](gazebo_basics.md) and RViz by using the following command.
```bash
# Terminal 1
roslaunch stretch_gazebo gazebo.launch world:=worlds/willowgarage.world rviz:=true
```
The `rviz` flag will open an RViz window to visualize a variety of ROS topics. In a new terminal run the following commands to execute the [marker.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/marker.py) node.
```bash
# Terminal 2
cd catkin_ws/src/stretch_tutorials/src/
python3 marker.py
```
The gif below demonstrates how to add a new `Marker` display type, and change the topic name from */visualization_marker* to */balloon*. A red sphere Marker should appear above the Stretch robot.
<p align="center">
<img src="images/balloon.gif"/>
@ -26,7 +24,7 @@ The gif below demonstrates how to add a new *Marker* display type, and change th
### The Code
```python
#!/usr/bin/env python3
import rospy
from visualization_msgs.msg import Marker
@ -37,7 +35,7 @@ class Balloon():
"""
def __init__(self):
"""
Function that initializes the marker's features.
:param self: The self reference.
"""
self.publisher = rospy.Publisher('balloon', Marker, queue_size=10)
@ -70,7 +68,7 @@ class Balloon():
if __name__ == '__main__':
rospy.init_node('marker')
balloon = Balloon()
rate = rospy.Rate(10)
@ -84,16 +82,16 @@ if __name__ == '__main__':
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
from visualization_msgs.msg import Marker
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import the `Marker` type from the `visualization_msgs.msg` package. This import is required to publish a `Marker`, which will be visualized in RViz.
```python
self.pub = rospy.Publisher('balloon', Marker, queue_size=10)
@ -107,8 +105,7 @@ self.marker.header.frame_id = 'base_link'
self.marker.header.stamp = rospy.Time()
self.marker.type = self.marker.SPHERE
```
Create a `Marker()` message type. Markers of all shapes share a common type. Set the frame ID and type. The frame ID is the frame in which the position of the marker is specified. The type is the shape of the marker. Further details on marker shapes can be found here: [RViz Markers](http://wiki.ros.org/rviz/DisplayTypes/Marker)
```python
self.marker.id = 0
@ -144,9 +141,7 @@ self.marker.pose.position.x = 0.0
self.marker.pose.position.y = 0.0
self.marker.pose.position.z = 2.0
```
Specify the pose of the marker. Since spheres are rotationally invariant, we're only going to specify the positional elements. As usual, these are in the coordinate frame named in `frame_id`. In this case, the position will always be directly 2 meters above the frame_id (*base_link*), and will move with it.
```python
def publish_marker(self):
@ -159,8 +154,7 @@ rospy.init_node('marker', argv=sys.argv)
balloon = Balloon()
rate = rospy.Rate(10)
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
Instantiate class with `Balloon()`
@ -172,8 +166,4 @@ while not rospy.is_shutdown():
balloon.publish_marker()
rate.sleep()
```
This loop is a fairly standard rospy construct: checking the `rospy.is_shutdown()` flag and then doing work. You have to check `is_shutdown()` to check if your program should exit (e.g. if there is a **Ctrl-C** or otherwise). The loop calls `rate.sleep()`, which sleeps just long enough to maintain the desired rate through the loop.
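The rate-keeping behavior can be sketched without ROS. The `SimpleRate` class below is a hypothetical stand-in for `rospy.Rate`, written here only to illustrate the "sleep just long enough" idea; it is not part of rospy.

```python
import time

class SimpleRate:
    """A minimal stand-in for rospy.Rate: sleeps away the remainder of each cycle."""
    def __init__(self, hz):
        self.period = 1.0 / hz
        self.last = time.monotonic()

    def sleep(self):
        # Sleep only for whatever is left of the current cycle, so the loop
        # body's own runtime is absorbed and the overall rate stays steady.
        remaining = self.period - (time.monotonic() - self.last)
        if remaining > 0:
            time.sleep(remaining)
        self.last = time.monotonic()

rate = SimpleRate(50)      # 50 Hz, i.e. a 0.02 s period
start = time.monotonic()
for _ in range(5):
    pass                   # stand-in for balloon.publish_marker()
    rate.sleep()
elapsed = time.monotonic() - start
print(round(elapsed, 2))   # roughly 5 cycles * 0.02 s
```

Since `time.sleep()` never undersleeps, the five cycles take at least about 0.1 s in total.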

ros1/example_5.md

@ -10,21 +10,19 @@ Begin by starting up the stretch driver launch file.
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
You can then hit the run-stop button (you should hear a beep and the LED light in the button blink) and move the robot's joints to a desired configuration. Once you are satisfied with the configuration, hold the run-stop button until you hear a beep. Then run the following command to excecute the [joint_state_printer.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/joint_state_printer.py) which will print the joint positions of the lift, arm, and wrist.
```bash
cd catkin_ws/src/stretch_tutorials/src/
python3 joint_state_printer.py
```
Your terminal will output the `position` information of the previously mentioned joints shown below.
```
name: ['joint_lift', 'wrist_extension', 'joint_wrist_yaw']
position: [0.6043133175850597, 0.19873586673129257, 0.017257283863713464]
```
**IMPORTANT NOTE:** Stretch's arm has 4 prismatic joints and the sum of these positions gives the *wrist_extension* distance. The *wrist_extension* is needed when sending [joint trajectory commands](follow_joint_trajectory.md) to the robot. Further, you **can not** actuate an individual arm joint. Here is an image of the arm joints for reference:
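As a quick illustration of that sum (the joint values below are invented, not taken from a real robot), adding the four `joint_arm_l*` positions yields the *wrist_extension* distance:

```python
# Hypothetical joint_states snapshot; the names mirror Stretch's arm joints,
# the position values are illustrative only.
names = ['joint_lift', 'joint_arm_l0', 'joint_arm_l1',
         'joint_arm_l2', 'joint_arm_l3', 'joint_wrist_yaw']
positions = [0.6, 0.05, 0.05, 0.05, 0.05, 0.017]

# wrist_extension is the sum of the four prismatic arm segment positions.
wrist_extension = sum(positions[names.index(j)]
                      for j in names if j.startswith('joint_arm_l'))
print(round(wrist_extension, 3))
```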
<p align="center">
<img src="images/joints.png"/>
@ -32,7 +30,7 @@ It's important to note that the arm has 4 prismatic joints and the sum of these
### The Code
```python
#!/usr/bin/env python3
import rospy
import sys
@ -44,7 +42,7 @@ class JointStatePublisher():
"""
def __init__(self):
"""
Function that initializes the subscriber.
:param self: The self reference.
"""
self.sub = rospy.Subscriber('joint_states', JointState, self.callback)
@ -61,10 +59,14 @@ class JointStatePublisher():
"""
print_states function to deal with the incoming JointState messages.
:param self: The self reference.
:param joints: A list of string values of joint names.
"""
joint_positions = []
for joint in joints:
if joint == "wrist_extension":
index = self.joint_states.name.index('joint_arm_l0')
joint_positions.append(4*self.joint_states.position[index])
continue
index = self.joint_states.name.index(joint)
joint_positions.append(self.joint_states.position[index])
print("name: " + str(joints))
@ -76,7 +78,7 @@ if __name__ == '__main__':
rospy.init_node('joint_state_printer', anonymous=True)
JSP = JointStatePublisher()
rospy.sleep(.1)
joints = ["joint_lift", "wrist_extension", "joint_wrist_yaw"]
#joints = ["joint_head_pan", "joint_head_tilt", "joint_gripper_finger_left", "joint_gripper_finger_right"]
JSP.print_states(joints)
rospy.spin()
@ -87,17 +89,16 @@ if __name__ == '__main__':
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
import sys
from sensor_msgs.msg import JointState
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import `sensor_msgs.msg` so that we can subscribe to `JointState` messages.
```python
self.sub = rospy.Subscriber('joint_states', JointState, self.callback)
@ -108,20 +109,22 @@ Set up a subscriber. We're going to subscribe to the topic "*joint_states*", lo
def callback(self, msg):
self.joint_states = msg
```
This is the callback function where the `JointState` messages are stored as *self.joint_states*. Further information about this message type can be found here: [JointState Message](http://docs.ros.org/en/lunar/api/sensor_msgs/html/msg/JointState.html)
```python
def print_states(self, joints):
joint_positions = []
```
This is the `print_states()` function, which takes a list of joints of interest as its argument. An empty list, *joint_positions*, is also created; this is where the positions of the requested joints will be appended.
```python
for joint in joints:
if joint == "wrist_extension":
index = self.joint_states.name.index('joint_arm_l0')
joint_positions.append(4*self.joint_states.position[index])
continue
index = self.joint_states.name.index(joint)
joint_positions.append(self.joint_states.position[index])
```
In this section of the code, a for loop is used to parse the names of the requested joints from the *self.joint_states* list. The `index()` function returns the index of the requested joint's name, and the respective position is appended to the *joint_positions* list.
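The same name-to-index lookup can be exercised with plain lists standing in for the `JointState` fields (the values below are invented for illustration):

```python
# Stand-ins for self.joint_states.name / .position with illustrative values.
state_names = ['joint_lift', 'joint_arm_l0', 'joint_wrist_yaw']
state_positions = [0.6, 0.05, 0.017]

requested = ['joint_lift', 'wrist_extension', 'joint_wrist_yaw']
joint_positions = []
for joint in requested:
    if joint == 'wrist_extension':
        # The four arm segments travel together, so the total extension
        # is four times one segment's position.
        index = state_names.index('joint_arm_l0')
        joint_positions.append(4 * state_positions[index])
        continue
    index = state_names.index(joint)
    joint_positions.append(state_positions[index])

print(joint_positions)  # [0.6, 0.2, 0.017]
```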
@ -129,31 +132,27 @@ In this section of the code, a forloop is used to parse the names of the request
rospy.signal_shutdown("done")
sys.exit(0)
```
The first line of code initiates a clean shutdown of ROS. The second line of code exits the Python interpreter.
```python
rospy.init_node('joint_state_printer', anonymous=True)
JSP = JointStatePublisher()
rospy.sleep(.1)
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
Declare object, *JSP*, from the `JointStatePublisher` class.
The use of the `rospy.sleep()` function is to allow the *JSP* class to initialize all of its features before requesting to publish joint positions of desired joints (running the `print_states()` method).
```python
joints = ["joint_lift", "wrist_extension", "joint_wrist_yaw"]
#joints = ["joint_head_pan", "joint_head_tilt", "joint_gripper_finger_left", "joint_gripper_finger_right"]
JSP.print_states(joints)
```
Create a list of the desired joints that you want positions to be printed. Then use that list as an argument for the `print_states()` method.
```python
rospy.spin()
```
Give control to ROS with `rospy.spin()`. This will allow the callback to be called whenever new messages come in. If we don't put this line in, then the node will not work, and ROS will not process any messages.

ros1/example_6.md

@ -6,28 +6,25 @@ In this example, we will review a Python script that prints out and stores the e
<img src="images/effort_sensing.gif"/>
</p>
Begin by running the following command in a new terminal.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Switch the mode to *position* mode using a rosservice call. Then run the [effort_sensing.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/effort_sensing.py) node.
```bash
# Terminal 2
rosservice call /switch_to_position_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 effort_sensing.py
```
This will send a `FollowJointTrajectory` command to move Stretch's arm or head while also printing the effort of the lift.
### The Code
```python
#!/usr/bin/env python3
import rospy
import time
import actionlib
@ -81,7 +78,7 @@ class JointActuatorEffortSensor(hm.HelloNode):
trajectory_goal.trajectory.header.stamp = rospy.Time(0.0)
trajectory_goal.trajectory.header.frame_id = 'base_link'
self.trajectory_client.send_goal(trajectory_goal, feedback_cb=self.feedback_callback, done_cb=self.done_callback)
rospy.loginfo('Sent position goal = {0}'.format(trajectory_goal))
self.trajectory_client.wait_for_result()
def feedback_callback(self,feedback):
@ -117,7 +114,7 @@ class JointActuatorEffortSensor(hm.HelloNode):
:param result: result attribute from FollowJointTrajectoryActionResult message.
"""
if status == actionlib.GoalStatus.SUCCEEDED:
rospy.loginfo('Succeeded')
else:
rospy.loginfo('Failed')
@ -153,9 +150,9 @@ if __name__ == '__main__':
This code is similar to that of the [multipoint_command](https://github.com/hello-robot/stretch_tutorials/blob/main/src/multipoint_command.py) and [joint_state_printer](https://github.com/hello-robot/stretch_tutorials/blob/main/src/joint_state_printer.py) nodes. Therefore, this example will highlight the sections that differ from those tutorials. Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
@ -171,19 +168,18 @@ from trajectory_msgs.msg import JointTrajectoryPoint
from sensor_msgs.msg import JointState
import hello_helpers.hello_misc as hm
```
You need to import rospy if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import the `FollowJointTrajectoryGoal` from the `control_msgs.msg` package to control the Stretch robot. Import `JointTrajectoryPoint` from the `trajectory_msgs` package to define robot trajectories. The `hello_helpers` package consists of a module that provides various Python scripts used across [stretch_ros](https://github.com/hello-robot/stretch_ros). In this instance, we are importing the `hello_misc` script.
```Python
class JointActuatorEffortSensor(hm.HelloNode):
"""
A class that sends multiple joint trajectory goals to a single joint.
"""
def __init__(self, export_data=False):
"""
Function that initializes the subscriber and other features.
:param self: The self reference.
:param export_data: A boolean message type.
"""
hm.HelloNode.__init__(self)
```
@ -192,20 +188,19 @@ The `JointActuatorEffortSensor ` class inherits the `HelloNode` class from `hm`
```python
self.sub = rospy.Subscriber('joint_states', JointState, self.callback)
self.joints = ['joint_lift']
```
Set up a subscriber. We're going to subscribe to the topic "*joint_states*", looking for `JointState` messages. When a message comes in, ROS is going to pass it to the function "callback" automatically. Create a list of the desired joints you want to print.
```Python
self.joint_effort = []
self.save_path = '/home/hello-robot/catkin_ws/src/stretch_tutorials/stored_data'
self.export_data = export_data
```
Create an empty list to store the joint effort values. The *self.save_path* is the directory path where the .txt file of the effort values will be stored. You can change this path to a preferred directory. The *self.export_data* is a boolean and its default value is set to False. If set to True, then the joint values will be stored in a .txt file, otherwise, the values will be printed in the terminal where you ran the effort sensing node.
```python
self.trajectory_client.send_goal(trajectory_goal, feedback_cb=self.feedback_callback, done_cb=self.done_callback)
```
Include the feedback and done call back functions in the send goal function.
```python
@ -218,7 +213,6 @@ def feedback_callback(self,feedback):
:param feedback: FollowJointTrajectoryActionFeedback message.
"""
```
The feedback callback function takes in the `FollowJointTrajectoryActionFeedback` message as its argument.
```python
@ -226,7 +220,6 @@ if 'wrist_extension' in self.joints:
self.joints.remove('wrist_extension')
self.joints.append('joint_arm_l0')
```
Use a conditional statement to replace `wrist_extension` with `joint_arm_l0`. This is because `joint_arm_l0` reports the effort values that the `wrist_extension` is experiencing.
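That substitution can be reproduced with an ordinary Python list (the joint names mirror the tutorial; the initial list contents are illustrative):

```python
joints = ['joint_lift', 'wrist_extension']

# Effort feedback is reported per arm segment, so swap in the first segment.
if 'wrist_extension' in joints:
    joints.remove('wrist_extension')
    joints.append('joint_arm_l0')

print(joints)  # ['joint_lift', 'joint_arm_l0']
```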
```python
@ -235,7 +228,6 @@ for joint in self.joints:
index = self.joint_states.name.index(joint)
current_effort.append(self.joint_states.effort[index])
```
Create an empty list to store the current effort values. Then use a for loop to parse the joint names and effort values.
```python
@ -257,17 +249,15 @@ def done_callback(self, status, result):
:param result: result attribute from FollowJointTrajectoryActionResult message.
"""
```
The done callback function takes in the `FollowJointTrajectoryActionResult` messages as its arguments.
```python
if status == actionlib.GoalStatus.SUCCEEDED:
rospy.loginfo('Succeeded')
else:
rospy.loginfo('Failed')
```
Conditional statement to print whether the goal status in the `FollowJointTrajectoryActionResult` succeeded or failed.
```python
@ -281,7 +271,6 @@ if self.export_data:
writer.writerows(self.joint_effort)
```
A conditional statement is used to export the data to a .txt file. The file's name is set to the date and time the node was executed. That way, no previous files are overwritten.
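A minimal, standalone sketch of that export step (writing to the current directory instead of the robot's `stored_data` folder, with invented effort rows, and using a `.csv` extension for this sketch):

```python
import csv
import os
import time

joint_effort = [[0.0, 41.2], [0.1, 40.8]]  # illustrative [time, effort] rows

# Name the file after the current date and time so earlier runs survive.
file_name = time.strftime('%Y-%m-%d_%I:%M:%S-%p') + '.csv'
with open(file_name, 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerows(joint_effort)

print(os.path.exists(file_name))  # True
```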
@ -291,23 +280,17 @@ A conditional statement is used to export the data to a .txt file. The file's na
<img src="stored_data/2022-06-30_11:26:20-AM.png"/>
</p>
We added a simple python script, [stored_data_plotter.py](https://github.com/hello-robot/stretch_tutorials/blob/main/src/stored_data_plotter.py), to this package for plotting the stored data. **Note** you have to change the name of the file you wish to see in the python script. This is shown below:
```Python
####################### Copy the file name here! #######################
file_name = '2022-06-30_11:26:20-AM'
```
Once you have changed the file name, run the following commands in a new terminal.
```bash
cd catkin_ws/src/stretch_tutorials/src/
python3 stored_data_plotter.py
```
Because this is not a node, you don't need `roscore` to run this script. Please review the comments in the python script for additional guidance.

ros1/example_7.md

@ -1,36 +1,24 @@
# Example 7
In this example, we will review the [image_view](http://wiki.ros.org/image_view?distro=melodic) ROS package and a Python script that captures an image from the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/).
<p align="center">
<img src="images/camera_image.jpeg"/>
<img src="images/camera_image_edge_detection.jpeg"/>
</p>
Begin by running the `stretch_driver.launch` file.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
To activate the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/) and publish topics to be visualized, run the following launch file in a new terminal.
```bash
# Terminal 2
roslaunch stretch_core d435i_low_resolution.launch
```
Within this tutorial package, there is an RViz config file with the topics for perception already in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
```bash
@ -40,9 +28,9 @@ rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/perce
## Capture Image with image_view
There are a couple of methods to save an image using the [image_view](http://wiki.ros.org/image_view) package.
**OPTION 1:** Use the `image_view` node to open a simple image viewer for ROS *sensor_msgs/image* topics.
```bash
# Terminal 4
@ -51,7 +39,6 @@ rosrun image_view image_view image:=/camera/color/image_raw_upright_view
Then you can save the current image by right-clicking on the display window. By default, images will be saved as frame0000.jpg, frame0001.jpg, etc. Note that the image will be saved to the terminal's current working directory.
**OPTION 2:** Use the `image_saver` node to save an image to the terminal's current working directory.
```bash
# Terminal 4
rosrun image_view image_saver image:=/camera/color/image_raw_upright_view
@ -59,19 +46,18 @@ rosrun image_view image_saver image:=/camera/color/image_raw_upright_view
## Capture Image with Python Script
In this section, you can use a Python node to capture an image from the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/). Execute the [capture_image.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/capture_image.py) node to save a .jpeg image of the image topic */camera/color/image_raw_upright_view*.
```bash
# Terminal 4
cd ~/catkin_ws/src/stretch_tutorials/src
python3 capture_image.py
```
An image named **camera_image.jpeg** is saved in the **stored_data** folder in this package.
### The Code
```python
#!/usr/bin/env python3
import rospy
import sys
@ -92,7 +78,7 @@ class CaptureImage:
:param self: The self reference.
"""
self.bridge = CvBridge()
self.sub = rospy.Subscriber('/camera/color/image_raw', Image, self.callback, queue_size=1)
self.save_path = '/home/hello-robot/catkin_ws/src/stretch_tutorials/stored_data'
def callback(self, msg):
@ -104,7 +90,7 @@ class CaptureImage:
"""
try:
image = self.bridge.imgmsg_to_cv2(msg, 'bgr8')
except CvBridgeError as e:
rospy.logwarn('CV Bridge error: {0}'.format(e))
file_name = 'camera_image.jpeg'
@ -117,17 +103,15 @@ if __name__ == '__main__':
rospy.init_node('capture_image', argv=sys.argv)
CaptureImage()
rospy.spin()
```
### The Code Explained
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
@ -136,14 +120,12 @@ import sys
import os
import cv2
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). There are functions from sys, os, and cv2 that are required within this code. cv2 is a library of Python functions that implements computer vision algorithms. Further information about cv2 can be found here: [OpenCV Python](https://www.geeksforgeeks.org/opencv-python-tutorial/).
```python
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError
```
The `sensor_msgs.msg` is imported so that we can subscribe to ROS `Image` messages. Import [CvBridge](http://wiki.ros.org/cv_bridge) to convert between ROS `Image` messages and OpenCV images.
```python
@ -153,10 +135,9 @@ def __init__(self):
:param self: The self reference.
"""
self.bridge = CvBridge()
self.sub = rospy.Subscriber('/camera/color/image_raw', Image, self.callback, queue_size=1)
self.save_path = '/home/hello-robot/catkin_ws/src/stretch_tutorials/stored_data'
```
Initialize the CvBridge class, the subscriber, and the directory of where the captured image will be stored.
```python
@ -169,10 +150,9 @@ def callback(self, msg):
"""
try:
image = self.bridge.imgmsg_to_cv2(msg, 'bgr8')
except CvBridgeError as e:
rospy.logwarn('CV Bridge error: {0}'.format(e))
```
Try to convert the ROS Image message to a cv2 Image message using the `imgmsg_to_cv2()` function.
```python
@ -180,31 +160,26 @@ file_name = 'camera_image.jpeg'
completeName = os.path.join(self.save_path, file_name)
cv2.imwrite(completeName, image)
```
Join the directory and file name using the `path.join()` function. Then use the `imwrite()` function to save the image.
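The path-joining step can be tried on its own, without cv2. The directory and file name below are the ones used in the tutorial script:

```python
import os

save_path = '/home/hello-robot/catkin_ws/src/stretch_tutorials/stored_data'
file_name = 'camera_image.jpeg'

# os.path.join inserts the separator so the pieces form one valid path.
complete_name = os.path.join(save_path, file_name)
print(complete_name)
```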
```python
rospy.signal_shutdown("done")
sys.exit(0)
```
The first line of code initiates a clean shutdown of ROS. The second line of code exits the Python interpreter.
```python
rospy.init_node('capture_image', argv=sys.argv)
CaptureImage()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
Instantiate the `CaptureImage()` class.
```python
rospy.spin()
```
Give control to ROS. This will allow the callback to be called whenever new messages come in. If we don't put this line in, then the node will not work, and ROS will not process any messages.
## Edge Detection
@ -214,9 +189,8 @@ In this section, we highlight a node that utilizes the [Canny Edge filter](https
```bash
# Terminal 4
cd ~/catkin_ws/src/stretch_tutorials/src
python3 edge_detection.py
```
The node will publish a new Image topic named */image_edge_detection*. This can be visualized in RViz and a gif is provided below for reference.
<p align="center">
@ -225,7 +199,7 @@ The node will publish a new Image topic named */image_edge_detection*. This can
### The Code
```python
#!/usr/bin/env python3
import rospy
import sys
@ -235,7 +209,6 @@ import cv2
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError
class EdgeDetection:
"""
A class that converts a subscribed ROS image to a OpenCV image and saves
@ -248,7 +221,7 @@ class EdgeDetection:
:param self: The self reference.
"""
self.bridge = CvBridge()
self.sub = rospy.Subscriber('/camera/color/image_raw', Image, self.callback, queue_size=1)
self.pub = rospy.Publisher('/image_edge_detection', Image, queue_size=1)
self.save_path = '/home/hello-robot/catkin_ws/src/stretch_tutorials/stored_data'
self.lower_thres = 100
@ -265,7 +238,7 @@ class EdgeDetection:
"""
try:
image = self.bridge.imgmsg_to_cv2(msg, 'bgr8')
except CvBridgeError as e:
rospy.logwarn('CV Bridge error: {0}'.format(e))
image = cv2.Canny(image, self.lower_thres, self.upper_thres)
@ -302,8 +275,4 @@ Convert the cv2 image back to a ROS image so it can be published.
image_msg.header = msg.header
self.pub.publish(image_msg)
```
Publish the ROS image with the same header as the subscribed ROS message.

ros1/example_8.md

@ -2,32 +2,27 @@
This example will showcase how to save the interpreted speech from Stretch's [ReSpeaker Mic Array v2.0](https://wiki.seeedstudio.com/ReSpeaker_Mic_Array_v2.0/) to a text file.
<p align="center">
<img src="images/respeaker.jpg"/>
</p>
Begin by running the `sample_respeaker.launch` file in a terminal.
```bash
# Terminal 1
roslaunch respeaker_ros sample_respeaker.launch
```
Then run the [speech_text.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/speech_text.py) node.
```bash
# Terminal 2
cd catkin_ws/src/stretch_tutorials/src/
python3 speech_text.py
```
The ReSpeaker will listen, interpret speech, and save the transcript to a text file. To shut down the node, type **Ctrl** + **c** in the terminal.
### The Code
```python
#!/usr/bin/env python3
import rospy
import os
@ -42,7 +37,7 @@ class SpeechText:
Initialize subscriber and directory to save speech to text file.
"""
self.sub = rospy.Subscriber("speech_to_text", SpeechRecognitionCandidates, self.callback)
self.save_path = '/home/hello-robot/catkin_ws/src/stretch_tutorials/stored_data'
rospy.loginfo("Listening to speech.")
def callback(self,msg):
@ -66,27 +61,23 @@ if __name__ == '__main__':
rospy.spin()
```
### The Code Explained
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
import os
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes).
```python
from speech_recognition_msgs.msg import SpeechRecognitionCandidates
```
Import `SpeechRecognitionCandidates` from `speech_recognition_msgs.msg` so that we can receive the interpreted speech.
```python
@ -96,35 +87,29 @@ def __init__(self):
"""
self.sub = rospy.Subscriber("speech_to_text", SpeechRecognitionCandidates, self.callback)
```
Set up a subscriber. We're going to subscribe to the topic "*speech_to_text*", looking for `SpeechRecognitionCandidates` messages. When a message comes in, ROS is going to pass it to the function "callback" automatically.
```python
self.save_path = '/home/hello-robot/catkin_ws/src/stretch_tutorials/stored_data'
```
Define the directory to save the text file.
```python
transcript = ' '.join(map(str,msg.transcript))
```
Take all items in the iterable list and join them into a single string named transcript.
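With a plain list standing in for the message's `transcript` field (the words below are made up), the join step looks like this:

```python
# Hypothetical transcript field from a SpeechRecognitionCandidates message.
transcript = ['Hello', 'Stretch', 'robot']

# map(str, ...) tolerates non-string items; join glues them with spaces.
joined = ' '.join(map(str, transcript))
print(joined)  # Hello Stretch robot
```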
```python
file_name = 'speech.txt'
completeName = os.path.join(self.save_path, file_name)
```
Define the file name and create a complete path directory.
```python
with open(completeName, "a+") as file_object:
file_object.write("\n")
file_object.write(transcript)
```
Append the transcript to the text file.
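The append step can be sketched standalone (writing `speech.txt` to the current directory, with an invented transcript string):

```python
file_name = 'speech.txt'   # current directory for this sketch
transcript = 'hello robot'

# 'a+' creates the file if needed and appends; each call adds a new line.
with open(file_name, 'a+') as file_object:
    file_object.write('\n')
    file_object.write(transcript)

with open(file_name) as f:
    print(f.read().splitlines()[-1])  # hello robot
```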
```python
@ -132,16 +117,13 @@ rospy.init_node('speech_text')
SpeechText()
```
The next line, `rospy.init_node(NAME, ...)`, is very important as it tells rospy the name of your node -- until rospy has this information, it cannot start communicating with the ROS Master. **NOTE:** the name must be a base name, i.e. it cannot contain any slashes "/".
Instantiate the `SpeechText()` class.
```python
rospy.spin()
```
Give control to ROS. This will allow the callback to be called whenever new
messages come in. If we don't put this line in, then the node will not work,
and ROS will not process any messages.

ros1/example_9.md (+19, -49)

@ -8,24 +8,22 @@ Begin by running the following command in a new terminal.
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Switch the mode to *position* mode using a rosservice call. Then run the `respeaker.launch` file.
```bash
# Terminal 2
rosservice call /switch_to_position_mode
roslaunch stretch_core respeaker.launch
```
Then run the [voice_teleoperation_base.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/voice_teleoperation_base.py) node in a new terminal.
```bash
# Terminal 3
cd catkin_ws/src/stretch_tutorials/src/
python3 voice_teleoperation_base.py
```
In terminal 3, a menu of voice commands is printed. You can reference this menu layout below.
```
------------ VOICE TELEOP MENU ------------
@ -53,7 +51,7 @@ To stop the node from sending twist messages, type **Ctrl** + **c** or say "**qu
### The Code
```python
#!/usr/bin/env python3
import math
import rospy
@ -100,16 +98,17 @@ class GetVoiceCommands:
def callback_direction(self, msg):
"""
A callback function that converts the incoming message, sound direction,
from degrees to radians.
:param self: The self reference.
:param msg: The Int32 message type that represents the sound direction.
"""
self.sound_direction = msg.data * -self.rad_per_deg
def callback_speech(self,msg):
"""
A callback function that takes the incoming message, a list of the speech-to-text candidates, and joins all items in that iterable list into a single string.
:param self: The self reference.
:param msg: The SpeechRecognitionCandidates message type.
"""
@ -121,7 +120,7 @@ class GetVoiceCommands:
base motion.
:param self:The self reference.
:returns inc: A dictionary type that contains the increment size.
"""
if self.step_size == 'small':
inc = {'rad': self.small_rad, 'translate': self.small_translate}
@ -161,7 +160,7 @@ class GetVoiceCommands:
A function that defines the teleoperation command based on the voice command.
:param self: The self reference.
:returns command: A dictionary type that contains the type of base motion.
"""
command = None
if self.voice_command == 'forward':
@ -194,6 +193,7 @@ class VoiceTeleopNode(hm.HelloNode):
"""
A function that declares an object from the GetVoiceCommands class, instantiates
the HelloNode class, and sets the publishing rate.
:param self: The self reference.
"""
hm.HelloNode.__init__(self)
self.rate = 10.0
@ -251,26 +251,22 @@ class VoiceTeleopNode(hm.HelloNode):
self.send_command(command)
rate.sleep()
if __name__ == '__main__':
try:
node = VoiceTeleopNode()
node.main()
except KeyboardInterrupt:
rospy.loginfo('interrupt received, so shutting down')
```
### The Code Explained
This code is similar to that of the [multipoint_command](https://github.com/hello-robot/stretch_tutorials/blob/main/src/multipoint_command.py) and [joint_state_printer](https://github.com/hello-robot/stretch_tutorials/blob/main/src/joint_state_printer.py) nodes. Therefore, this example will highlight sections that are different from those tutorials. Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import math
import rospy
@ -283,10 +279,8 @@ from trajectory_msgs.msg import JointTrajectoryPoint
import hello_helpers.hello_misc as hm
from speech_recognition_msgs.msg import SpeechRecognitionCandidates
```
You need to import `rospy` if you are writing a ROS Node. Import the `FollowJointTrajectoryGoal` from the `control_msgs.msg` package to control the Stretch robot. Import `JointTrajectoryPoint` from the `trajectory_msgs` package to define robot trajectories. The `hello_helpers` package consists of a module that provides various Python scripts used across stretch_ros. In this instance, we are importing the `hello_misc` script. Import `sensor_msgs.msg` so that we can subscribe to `JointState` messages.
```python
class GetVoiceCommands:
```
@ -296,8 +290,7 @@ Create a class that subscribes to the speech to text recognition messages, print
self.step_size = 'medium'
self.rad_per_deg = math.pi/180.0
```
Set the default step size as medium and create a float value, *self.rad_per_deg*, to convert degrees to radians.
```python
self.small_deg = 5.0
@ -312,7 +305,6 @@ self.big_deg = 20.0
self.big_rad = self.rad_per_deg * self.big_deg
self.big_translate = 0.1
```
Define the three rotation and translation step sizes.
```python
@ -321,16 +313,13 @@ self.sound_direction = 0
self.speech_to_text_sub = rospy.Subscriber("/speech_to_text", SpeechRecognitionCandidates, self.callback_speech)
self.sound_direction_sub = rospy.Subscriber("/sound_direction", Int32, self.callback_direction)
```
Initialize the voice command and sound direction to values that will not result in moving the base.
Set up two subscribers. The first one subscribes to the topic */speech_to_text*, looking for `SpeechRecognitionCandidates` messages. When a message comes in, ROS is going to pass it to the function `callback_speech` automatically. The second subscribes to */sound_direction* message and passes it to the `callback_direction` function.
```python
self.sound_direction = msg.data * -self.rad_per_deg
```
The `callback_direction` function converts the *sound_direction* topic from degrees to radians.
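The conversion itself is plain arithmetic; here is a hedged standalone check (the 90° reading is just an example value from the */sound_direction* topic):

```python
import math

rad_per_deg = math.pi / 180.0

# Example reading from /sound_direction: the direction of arrival is
# reported in degrees, and the negation flips its sign to match the
# base rotation convention
msg_data = 90
sound_direction = msg_data * -rad_per_deg
print(sound_direction)  # -1.5707963267948966
```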
```python
@ -342,7 +331,6 @@ if self.step_size == 'big':
inc = {'rad': self.big_rad, 'translate': self.big_translate}
return inc
```
The `get_inc()` function stores the increment size for translational and rotational base motion in *inc*. The increment size is contingent on the *self.step_size* string value.
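The selection can be sketched with plain dictionaries (the step-size values below are illustrative, not the tutorial's exact numbers):

```python
# Illustrative increment sizes per step-size setting
increments = {
    'small':  {'rad': 0.09, 'translate': 0.025},
    'medium': {'rad': 0.17, 'translate': 0.05},
    'big':    {'rad': 0.35, 'translate': 0.1},
}

def get_inc(step_size):
    # Mirror of the if-chain above: the returned dict depends only on step_size
    return increments[step_size]

print(get_inc('medium'))  # {'rad': 0.17, 'translate': 0.05}
```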
```python
@ -358,7 +346,6 @@ if self.voice_command == 'right':
if self.voice_command == 'stretch':
command = {'joint': 'rotate_mobile_base', 'inc': self.sound_direction}
```
In the `get_command()` function, the *command* is initialized as None, or set as a dictionary where the *joint* and *inc* values are stored. The *command* message type is dependent on the *self.voice_command* string value.
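A minimal stand-alone mirror of that mapping (joint names follow the snippet above; the increment values are examples):

```python
def get_command(voice_command, inc, sound_direction):
    """Map a recognized word to a base-motion command dictionary."""
    command = None
    if voice_command == 'forward':
        command = {'joint': 'translate_mobile_base', 'inc': inc['translate']}
    if voice_command == 'left':
        command = {'joint': 'rotate_mobile_base', 'inc': inc['rad']}
    if voice_command == 'stretch':
        # Rotate the base toward the detected sound direction
        command = {'joint': 'rotate_mobile_base', 'inc': sound_direction}
    return command

inc = {'rad': 0.17, 'translate': 0.05}
print(get_command('left', inc, 0.0))  # {'joint': 'rotate_mobile_base', 'inc': 0.17}
```

An unrecognized word leaves *command* as `None`, so the caller can simply skip sending a goal.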
```python
@ -366,7 +353,6 @@ if (self.voice_command == "small") or (self.voice_command == "medium") or (self.
self.step_size = self.voice_command
rospy.loginfo('Step size = {0}'.format(self.step_size))
```
Based on the *self.voice_command* value, set the step size for the increments.
```python
@ -374,10 +360,8 @@ if self.voice_command == 'quit':
rospy.signal_shutdown("done")
sys.exit(0)
```
If the *self.voice_command* is equal to "quit", then initiate a clean shutdown of ROS and exit the Python interpreter.
```python
class VoiceTeleopNode(hm.HelloNode):
"""
@ -388,14 +372,14 @@ class VoiceTeleopNode(hm.HelloNode):
"""
A function that declares an object from the GetVoiceCommands class, instantiates
the HelloNode class, and sets the publishing rate.
:param self: The self reference.
"""
hm.HelloNode.__init__(self)
self.rate = 10.0
self.joint_state = None
self.speech = GetVoiceCommands()
```
A class that inherits the `HelloNode` class from `hm`, declares object from the `GetVoiceCommands` class, and sends joint trajectory commands.
```python
def send_command(self, command):
@ -409,21 +393,18 @@ def send_command(self, command):
point = JointTrajectoryPoint()
point.time_from_start = rospy.Duration(0.0)
```
The `send_command` function stores the joint state message and uses a conditional statement to send joint trajectory goals. Also, assign *point* as a `JointTrajectoryPoint` message type.
```python
trajectory_goal = FollowJointTrajectoryGoal()
trajectory_goal.goal_time_tolerance = rospy.Time(1.0)
```
Assign *trajectory_goal* as a `FollowJointTrajectoryGoal` message type.
```python
joint_name = command['joint']
trajectory_goal.trajectory.joint_names = [joint_name]
```
Extract the joint name from the command dictionary.
```python
@ -431,27 +412,23 @@ inc = command['inc']
rospy.loginfo('inc = {0}'.format(inc))
new_value = inc
```
Extract the increment type from the command dictionary.
```python
point.positions = [new_value]
trajectory_goal.trajectory.points = [point]
```
Assign the new value position to the trajectory goal message type.
```python
self.trajectory_client.send_goal(trajectory_goal)
rospy.loginfo('Done sending command.')
```
Make the action call and send goal of the new joint position.
```python
self.speech.print_commands()
```
Reprint the voice command menu after the trajectory goal is sent.
```python
@ -466,9 +443,7 @@ def main(self):
rate = rospy.Rate(self.rate)
self.speech.print_commands()
```
The main function instantiates the `HelloNode` class, initializes the subscriber, and calls other methods in both the `VoiceTeleopNode` and `GetVoiceCommands` classes.
```python
while not rospy.is_shutdown():
@ -476,10 +451,8 @@ while not rospy.is_shutdown():
self.send_command(command)
rate.sleep()
```
Run a while loop to continuously check speech commands and send those commands to execute an action.
```python
try:
node = VoiceTeleopNode()
@ -487,7 +460,4 @@ try:
except KeyboardInterrupt:
rospy.loginfo('interrupt received, so shutting down')
```
Declare a `VoiceTeleopNode` object. Then execute the `main()` method.

ros1/follow_joint_trajectory.md (+23, -51)

@ -7,42 +7,27 @@ Stretch ROS driver offers a [`FollowJointTrajectory`](http://docs.ros.org/en/api
<img src="images/stow_command.gif"/>
</p>
Begin by running the following command in a terminal.
```bash
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Switch the mode to *position* mode using a rosservice call. Then run the stow command node.
```bash
# Terminal 2
rosservice call /switch_to_position_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 stow_command.py
```
This will send a `FollowJointTrajectory` command to stow Stretch's arm.
### The Code
```python
#!/usr/bin/env python3
import rospy
from control_msgs.msg import FollowJointTrajectoryGoal
@ -59,7 +44,7 @@ class StowCommand(hm.HelloNode):
def issue_stow_command(self):
'''
Function that makes an action call and sends stow position goal.
:param self: The self reference.
'''
stow_point = JointTrajectoryPoint()
@ -100,10 +85,9 @@ if __name__ == '__main__':
Now let's break the code down.
```python
#!/usr/bin/env python3
```
Every Python ROS [Node](http://wiki.ros.org/Nodes) will have this declaration at the top. The first line makes sure your script is executed as a Python3 script.
```python
import rospy
@ -112,11 +96,10 @@ from trajectory_msgs.msg import JointTrajectoryPoint
import hello_helpers.hello_misc as hm
import time
```
You need to import `rospy` if you are writing a ROS [Node](http://wiki.ros.org/Nodes). Import the `FollowJointTrajectoryGoal` from the [control_msgs.msg](http://wiki.ros.org/control_msgs) package to control the Stretch robot. Import `JointTrajectoryPoint` from the [trajectory_msgs](http://wiki.ros.org/trajectory_msgs) package to define robot trajectories. The [hello_helpers](https://github.com/hello-robot/stretch_ros/tree/master/hello_helpers) package consists of a module that provides various Python scripts used across [stretch_ros](https://github.com/hello-robot/stretch_ros). In this instance, we are importing the `hello_misc` script.
```python
class StowCommand(hm.HelloNode):
def __init__(self):
hm.HelloNode.__init__(self)
```
@ -137,7 +120,7 @@ The `issue_stow_command()` is the name of the function that will stow Stretch's
trajectory_goal.trajectory.header.stamp = rospy.Time(0.0)
trajectory_goal.trajectory.header.frame_id = 'base_link'
```
Set *trajectory_goal* as a `FollowJointTrajectoryGoal` and define the joint names as a list. Then `trajectory_goal.trajectory.points` is defined by the positions set in *stow_point*. Specify the coordinate frame that we want (*base_link*) and set the time to be now.
```python
self.trajectory_client.send_goal(trajectory_goal)
@ -153,7 +136,7 @@ def main(self):
self.issue_stow_command()
time.sleep(2)
```
Create a function, `main()`, to do all of the setup for the `hm.HelloNode` class and issue the stow command.
```python
if __name__ == '__main__':
@ -179,21 +162,19 @@ Begin by running the following command in a terminal.
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Switch the mode to *position* mode using a rosservice call. Then run the multipoint command node.
```bash
# Terminal 2
rosservice call /switch_to_position_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 multipoint_command.py
```
This will send a list of `JointTrajectoryPoint` message types to move Stretch's arm.
### The Code
```python
#!/usr/bin/env python3
import rospy
import time
@ -265,7 +246,7 @@ if __name__ == '__main__':
```
### The Code Explained
Seeing that there are similarities between the multipoint and stow command nodes, we will only break down the different components of the `multipoint_command` node.
```python
point0 = JointTrajectoryPoint()
@ -294,7 +275,7 @@ trajectory_goal.trajectory.points = [point0, point1, point2, point3, point4, poi
trajectory_goal.trajectory.header.stamp = rospy.Time(0.0)
trajectory_goal.trajectory.header.frame_id = 'base_link'
```
Set *trajectory_goal* as a `FollowJointTrajectoryGoal` and define the joint names as a list. Then `trajectory_goal.trajectory.points` is defined by a list of the 6 points. Specify the coordinate frame that we want (*base_link*) and set the time to be now.
## Single Joint Actuator
@ -310,13 +291,10 @@ You can also actuate a single joint for the Stretch. Below are the list of joint
joint_lift: lower_limit = 0.15, upper_limit = 1.10 # in meters
wrist_extension: lower_limit = 0.00, upper_limit = 0.50 # in meters
joint_wrist_yaw: lower_limit = -1.75, upper_limit = 4.00 # in radians
joint_head_pan: lower_limit = -2.80, upper_limit = 2.90 # in radians
joint_head_tilt: lower_limit = -1.60, upper_limit = 0.40 # in radians
joint_gripper_finger_left: lower_limit = -0.35, upper_limit = 0.165 # in radians
# INCLUDED JOINT IN MANIPULATION MODE
joint_mobile_base_translation: lower_limit = -0.50, upper_limit = 0.50 # in meters
# INCLUDED JOINTS IN POSITION MODE
translate_mobile_base: No lower or upper limit. Defined by a step size in meters
rotate_mobile_base: No lower or upper limit. Defined by a step size in radians
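Before sending a goal for one of these joints, it can be useful to clamp the commanded position into the table's range. The helper below is hypothetical (not part of stretch_ros) and simply applies the limits listed above:

```python
# Joint limits copied from the table above: (lower, upper)
LIMITS = {
    'joint_lift':      (0.15, 1.10),   # meters
    'wrist_extension': (0.00, 0.50),   # meters
    'joint_wrist_yaw': (-1.75, 4.00),  # radians
    'joint_head_pan':  (-2.80, 2.90),  # radians
    'joint_head_tilt': (-1.60, 0.40),  # radians
}

def clamp_goal(joint, value):
    """Hypothetical helper: clip a commanded position to the joint's range."""
    lower, upper = LIMITS[joint]
    return max(lower, min(upper, value))

print(clamp_goal('joint_lift', 2.0))  # 1.1
```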
@ -329,25 +307,22 @@ Begin by running the following command in a terminal.
# Terminal 1
roslaunch stretch_core stretch_driver.launch
```
Switch the mode to *position* mode using a rosservice call. Then run the single joint actuator node.
```bash
# Terminal 2
rosservice call /switch_to_position_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 single_joint_actuator.py
```
This will send a list of `JointTrajectoryPoint` message types to move Stretch's arm.
The joint, *joint_gripper_finger_left*, is only needed when actuating the gripper.
### The Code
```python
#!/usr/bin/env python3
import rospy
import time
@ -404,7 +379,7 @@ if __name__ == '__main__':
```
### The Code Explained
Since the code is quite similar to the `multipoint_command` code, we will only review the parts that differ.
Now let's break the code down.
@ -428,7 +403,4 @@ trajectory_goal.trajectory.points = [point0]#, point1]
trajectory_goal.trajectory.header.stamp = rospy.Time(0.0)
trajectory_goal.trajectory.header.frame_id = 'base_link'
```
Set *trajectory_goal* as a `FollowJointTrajectoryGoal` and define the joint names as a list. Then `trajectory_goal.trajectory.points` is set by your list of points. Specify the coordinate frame that we want (*base_link*) and set the time to be now.

ros1/gazebo_basics.md (+0, -2)

@ -27,5 +27,3 @@ roslaunch stretch_gazebo gazebo.launch world:=worlds/willowgarage.world
<p align="center">
<img src="images/stretch_willowgarage_world.png"/>
</p>

ros1/getting_started.md (+4, -4)

@ -10,6 +10,8 @@ Instructions on installing Noetic can be found in our open-source [installation
```bash
cd ~/catkin_ws/src
git clone https://github.com/hello-robot/stretch_tutorials.git
cd stretch_tutorials
git checkout noetic
cd ~/catkin_ws
catkin_make
```
@ -52,9 +54,7 @@ source ~/.bashrc
## RoboMaker
<p align="center">
<img src="./images/aws-robomaker.png"/>
</p>
If you cannot dual boot and install Ubuntu on your local machine, an alternative is to use [AWS RoboMaker](https://aws.amazon.com/robomaker/). AWS RoboMaker extends the ROS framework with cloud services. The service provides a robotics simulation service, allowing for testing the Stretch RE1 platform. If you are a first-time user of AWS RoboMaker, follow the [guide here](https://github.com/aws-robotics/aws-robomaker-robotics-curriculum/blob/main/getting-started-with-aws-robomaker/_modules/mod-2a-aws.md) to get up and running with the service.

ros1/internal_state_of_stretch.md (+1, -3)

@ -58,6 +58,4 @@ rqt_graph
<img src="images/rqt_graph.png"/>
</p>
The graph allows a user to observe and affirm if topics are broadcast to the correct nodes. This method can also be utilized to debug communication issues.

ros1/moveit_basics.md (+1, -3)

@ -56,6 +56,4 @@ This will launch an Rviz instance that visualizes the joints with markers and an
<p align="center">
<img src="images/gazebo_moveit.gif"/>
</p>

ros1/navigation_stack.md (+1, -4)

@ -68,7 +68,4 @@ roslaunch stretch_navigation mapping.launch rviz:=false teleop_type:=none
rviz -d `rospack find stretch_navigation`/rviz/mapping.rviz
# On your machine, Terminal 2:
roslaunch stretch_core teleop_twist.launch teleop_type:=keyboard # or use teleop_type:=joystick if you have a controller
```

ros1/other_nav_features.md (+0, -1)

@ -1 +0,0 @@
Coming soon

ros1/other_nav_stack.md (+0, -1)

@ -1 +0,0 @@
Coming soon

ros1/perception.md (+4, -7)

@ -1,8 +1,8 @@
## Perception Introduction
The Stretch robot is equipped with the [Intel RealSense D435i camera](https://www.intelrealsense.com/depth-camera-d435i/), an essential component that allows the robot to measure and analyze the world around it. In this tutorial, we are going to showcase how to visualize the various topics published from the camera.
Begin by checking out the [feature/upright_camera_view](https://github.com/hello-robot/stretch_ros/tree/feature/upright_camera_view) branch in the [stretch_ros](https://github.com/hello-robot/stretch_ros) repository. The configuration of the camera results in the images being displayed sideways. Thus, this branch publishes a new topic that rotates the raw image upright.
```bash
cd ~/catkin_ws/src/stretch_ros/stretch_core
@ -15,7 +15,7 @@ Then run the stretch driver launch file.
roslaunch stretch_core stretch_driver.launch
```
To activate the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/) and publish topics to be visualized, run the following launch file in a new terminal.
```bash
# Terminal 2
@ -60,7 +60,4 @@ The `DepthCloud` display is visualized in the main RViz window. This display tak
## Deep Perception
Hello Robot also has a ROS package that uses deep learning models for various detection demos. A link to the package is provided: [stretch_deep_perception](https://github.com/hello-robot/stretch_ros/tree/master/stretch_deep_perception).

ros1/ros_testing.md (+0, -1)

@ -1 +0,0 @@
Coming soon

ros1/rviz/aruco_detector_example.rviz (+79, -89)

@ -7,8 +7,10 @@ Panels:
- /Status1
- /RobotModel1
- /RobotModel1/Links1/camera_bottom_screw_frame1
Splitter Ratio: 0.5058823823928833
Tree Height: 366
- /TF1
- /TF1/Frames1
Splitter Ratio: 0.8150289058685303
Tree Height: 915
- Class: rviz/Selection
Name: Selection
- Class: rviz/Tool Properties
@ -27,7 +29,7 @@ Panels:
Experimental: false
Name: Time
SyncMode: 0
SyncSource: Image
SyncSource: PointCloud2
Preferences:
PromptSaveOnExit: true
Toolbars:
@ -293,8 +295,12 @@ Visualization Manager:
Frame Timeout: 15
Frames:
All Enabled: false
base_left:
Value: true
base_link:
Value: false
base_right:
Value: true
camera_accel_frame:
Value: false
camera_accel_optical_frame:
@ -307,16 +313,10 @@ Visualization Manager:
Value: false
camera_color_optical_frame:
Value: false
camera_color_optical_frame_rotated:
Value: false
camera_depth_frame:
Value: false
camera_depth_optical_frame:
Value: false
camera_gyro_frame:
Value: false
camera_gyro_optical_frame:
Value: false
camera_infra1_frame:
Value: false
camera_infra1_optical_frame:
@ -327,8 +327,6 @@ Visualization Manager:
Value: false
camera_link:
Value: false
docking_station:
Value: true
laser:
Value: false
link_arm_l0:
@ -379,95 +377,89 @@ Visualization Manager:
Value: false
link_wrist_yaw:
Value: false
odom:
Value: true
respeaker_base:
Value: false
shoulder_top:
Value: true
wrist_inside:
Value: true
wrist_top:
Value: true
Marker Scale: 1
Name: TF
Show Arrows: true
Show Axes: true
Show Names: true
Tree:
odom:
base_link:
laser:
{}
link_aruco_left_base:
{}
link_aruco_right_base:
{}
link_left_wheel:
{}
link_mast:
link_head:
link_head_pan:
link_head_tilt:
camera_bottom_screw_frame:
camera_link:
camera_accel_frame:
camera_accel_optical_frame:
{}
camera_aligned_depth_to_color_frame:
camera_color_optical_frame:
camera_color_optical_frame_rotated:
{}
docking_station:
{}
camera_color_frame:
base_link:
laser:
{}
link_aruco_left_base:
{}
link_aruco_right_base:
{}
link_left_wheel:
{}
link_mast:
link_head:
link_head_pan:
link_head_tilt:
camera_bottom_screw_frame:
camera_link:
camera_accel_frame:
camera_accel_optical_frame:
{}
camera_depth_frame:
camera_depth_optical_frame:
{}
camera_gyro_frame:
camera_gyro_optical_frame:
camera_aligned_depth_to_color_frame:
{}
camera_color_frame:
camera_color_optical_frame:
base_left:
{}
camera_infra1_frame:
camera_infra1_optical_frame:
base_right:
{}
camera_infra2_frame:
camera_infra2_optical_frame:
shoulder_top:
{}
link_lift:
link_arm_l4:
link_arm_l3:
link_arm_l2:
link_arm_l1:
link_arm_l0:
link_aruco_inner_wrist:
wrist_inside:
{}
link_aruco_top_wrist:
wrist_top:
{}
link_wrist_yaw:
link_gripper:
link_grasp_center:
camera_depth_frame:
camera_depth_optical_frame:
{}
camera_infra1_frame:
camera_infra1_optical_frame:
{}
camera_infra2_frame:
camera_infra2_optical_frame:
{}
link_lift:
link_arm_l4:
link_arm_l3:
link_arm_l2:
link_arm_l1:
link_arm_l0:
link_aruco_inner_wrist:
{}
link_aruco_top_wrist:
{}
link_wrist_yaw:
link_gripper:
link_grasp_center:
{}
link_gripper_finger_left:
link_gripper_fingertip_left:
{}
link_gripper_finger_left:
link_gripper_fingertip_left:
{}
link_gripper_finger_right:
link_gripper_fingertip_right:
{}
link_aruco_shoulder:
{}
respeaker_base:
link_gripper_finger_right:
link_gripper_fingertip_right:
{}
link_aruco_shoulder:
{}
link_right_wheel:
respeaker_base:
{}
link_right_wheel:
{}
Update Interval: 0
Value: true
- Class: rviz/Image
Enabled: true
Image Topic: /camera/color/upright_image_raw
Max Value: 1
Median window: 5
Min Value: 0
Name: Image
Normalize Range: true
Queue Size: 2
Transport Hint: raw
Unreliable: false
Value: true
Enabled: true
Global Options:
Background Color: 48; 48; 48
@ -496,7 +488,7 @@ Visualization Manager:
Views:
Current:
Class: rviz/Orbit
Distance: 4.624553680419922
Distance: 2.342942714691162
Enable Stereo Rendering:
Stereo Eye Separation: 0.05999999865889549
Stereo Focal Distance: 1
@ -511,10 +503,10 @@ Visualization Manager:
Invert Z Axis: false
Name: Current View
Near Clip Distance: 0.009999999776482582
Pitch: 0.4497976303100586
Pitch: 0.37479761242866516
Target Frame: <Fixed Frame>
Value: Orbit (rviz)
Yaw: 3.255469799041748
Yaw: 3.980466365814209
Saved: ~
Window Geometry:
Displays:
@ -522,9 +514,7 @@ Window Geometry:
Height: 1212
Hide Left Dock: false
Hide Right Dock: false
Image:
collapsed: false
QMainWindow State: 000000ff00000000fd0000000400000000000002aa0000041efc020000000cfb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb000000100044006900730070006c006100790073010000003d000001f9000000c900fffffffb0000000a0049006d006100670065010000023c0000021f0000001600fffffffb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000000c00430061006d006500720061000000022f000000c70000000000000000fb0000000a0049006d00610067006500000002920000026d0000000000000000fb0000000c00430061006d0065007200610000000429000000d60000000000000000000000010000010f00000363fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fb0000000a00560069006500770073000000003d00000363000000a400fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e100000197000000030000056f0000003efc0100000002fb0000000800540069006d006501000000000000056f000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000002bf0000041e00000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000
QMainWindow State: 000000ff00000000fd0000000400000000000001a20000041efc020000000bfb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb000000100044006900730070006c006100790073010000003d0000041e000000c900fffffffb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000000c00430061006d006500720061000000022f000000c70000000000000000fb0000000a0049006d00610067006500000002920000026d0000000000000000fb0000000c00430061006d0065007200610000000429000000d60000000000000000000000010000010f00000363fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fb0000000a00560069006500770073000000003d00000363000000a400fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e100000197000000030000056f0000003efc0100000002fb0000000800540069006d006501000000000000056f000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000003c70000041e00000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000
Selection:
collapsed: false
Time:
@ -534,5 +524,5 @@ Window Geometry:
Views:
collapsed: false
Width: 1391
X: 242
Y: 92
X: 1030
Y: 84

+ 5
- 5
ros1/rviz/perception_example.rviz View File

@ -273,7 +273,7 @@ Visualization Manager:
Value: true
- Class: rviz/Image
Enabled: false
Image Topic: /camera/color/image_raw_upright_view
Image Topic: /camera/color/image_raw
Max Value: 1
Median window: 5
Min Value: 0
@ -286,7 +286,7 @@ Visualization Manager:
- Class: rviz/Camera
Enabled: false
Image Rendering: background and overlay
Image Topic: ""
Image Topic: /camera/color/image_raw
Name: Camera
Overlay Alpha: 0.5
Queue Size: 2
@ -371,6 +371,7 @@ Visualization Manager:
Stereo Focal Distance: 1
Swap Stereo Eyes: false
Value: false
Field of View: 0.7853981852531433
Focal Point:
X: 0.1298813819885254
Y: -0.5344312787055969
@ -382,7 +383,6 @@ Visualization Manager:
Near Clip Distance: 0.009999999776482582
Pitch: 0.23479780554771423
Target Frame: <Fixed Frame>
Value: Orbit (rviz)
Yaw: 3.4154603481292725
Saved: ~
Window Geometry:
@ -395,7 +395,7 @@ Window Geometry:
Hide Right Dock: false
Image:
collapsed: false
QMainWindow State: 000000ff00000000fd000000040000000000000253000004c2fc020000000bfb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb000000100044006900730070006c006100790073010000003d000004c2000000c900fffffffb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000000c00430061006d006500720061000000022f000000c70000000000000000fb0000000a0049006d00610067006500000002920000026d0000001600fffffffb0000000c00430061006d0065007200610000000429000000d60000001600ffffff000000010000010f00000363fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fb0000000a00560069006500770073000000003d00000363000000a400fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000006550000003efc0100000002fb0000000800540069006d0065010000000000000655000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000003fc000004c200000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000
QMainWindow State: 000000ff00000000fd000000040000000000000253000004c2fc020000000bfb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb000000100044006900730070006c006100790073010000003d000004c2000000c900fffffffb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000000c00430061006d0065007200610000000438000000c70000001600fffffffb0000000a0049006d00610067006500000002920000026d0000001600fffffffb0000000c00430061006d0065007200610000000429000000d60000000000000000000000010000010f00000363fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fb0000000a00560069006500770073000000003d00000363000000a400fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000006550000003efc0100000002fb0000000800540069006d0065010000000000000655000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000003fc000004c200000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000
Selection:
collapsed: false
Time:
@ -406,4 +406,4 @@ Window Geometry:
collapsed: false
Width: 1621
X: 864
Y: 38
Y: 27

+ 1
- 3
ros1/rviz_basics.md View File

@ -39,6 +39,4 @@ the `rviz` flag will open an RViz window to visualize a variety of ROS topics.
<img src="images/willowgarage_with_rviz.png"/>
</p>
Bringup the [keyboard teleop](teleoperating_stretch.md) to drive Stretch and observe its sensor input.
**Next Tutorial:** [Navigation Stack](navigation_stack.md)
Bringup the [keyboard teleop](teleoperating_stretch.md) to drive Stretch and observe its sensor input.

+ 11
- 16
ros1/src/aruco_tag_locator.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
@ -15,11 +15,11 @@ import hello_helpers.hello_misc as hm
from sensor_msgs.msg import JointState
# Import the FollowJointTrajectoryGoal from the control_msgs.msg package to
# control the Stretch robot.
# control the Stretch robot
from control_msgs.msg import FollowJointTrajectoryGoal
# Import JointTrajectoryPoint from the trajectory_msgs package to define
# robot trajectories.
# robot trajectories
from trajectory_msgs.msg import JointTrajectoryPoint
# Import TransformStamped from the geometry_msgs package for the publisher
@ -38,7 +38,7 @@ class LocateArUcoTag(hm.HelloNode):
# Initialize the inherited hm.HelloNode class
hm.HelloNode.__init__(self)
# Initialize Subscriber
# Initialize subscriber
self.joint_states_sub = rospy.Subscriber('/stretch/joint_states', JointState, self.joint_states_callback)
# Initialize publisher
@ -62,10 +62,9 @@ class LocateArUcoTag(hm.HelloNode):
self.tilt_num_steps = 3
self.tilt_step_size = pi/16
# Define the head actuation rotational velocity
# Define the head actuation rotational velocity
self.rot_vel = 0.5 # radians per sec
def joint_states_callback(self, msg):
"""
A callback function that stores Stretch's joint states.
@ -74,7 +73,6 @@ class LocateArUcoTag(hm.HelloNode):
"""
self.joint_state = msg
def send_command(self, command):
'''
Handles single joint control commands by constructing a FollowJointTrajectoryGoal
@ -110,7 +108,7 @@ class LocateArUcoTag(hm.HelloNode):
# extract the head position value from the `position` key
point.positions = [command['position']]
# Set the rotaitonal velocity
# Set the rotational velocity
point.velocities = [self.rot_vel]
# Assign goal position with updated point variable
@ -125,23 +123,22 @@ class LocateArUcoTag(hm.HelloNode):
self.trajectory_client.send_goal(trajectory_goal)
self.trajectory_client.wait_for_result()
def find_tag(self, tag_name='docking_station'):
"""
A function that actuates the camera to search for a defined ArUco tag
marker. Then the function returns the pose
marker. Then the function returns the pose.
:param self: The self reference.
:param tag_name: A string value of the ArUco marker name.
:returns transform: A TransformStamped message type.
:returns transform: The docking station's TransformStamped message.
"""
# Create a dictionaries to get the head in its initial position for its search
# Create dictionaries to get the head in its initial position
pan_command = {'joint': 'joint_head_pan', 'position': self.min_pan_position}
self.send_command(pan_command)
tilt_command = {'joint': 'joint_head_tilt', 'position': self.min_tilt_position}
self.send_command(tilt_command)
# Nested for loop to sweep the pan and tilt in increments
# Nested for loop to sweep the joint_head_pan and joint_head_tilt in increments
for i in range(self.tilt_num_steps):
for j in range(self.pan_num_steps):
# Update the joint_head_pan position by the pan_step_size
@ -177,7 +174,6 @@ class LocateArUcoTag(hm.HelloNode):
# Notify that the requested tag was not found
rospy.loginfo("The requested tag '%s' was not found", tag_name)
def main(self):
"""
Function that initiates the issue_command function.
@ -197,13 +193,12 @@ class LocateArUcoTag(hm.HelloNode):
# Give the listener some time to accumulate transforms
rospy.sleep(1.0)
# Notify Stretch is searching for the ArUco tag with a rospy loginfo function
# Notify Stretch is searching for the ArUco tag with `rospy.loginfo()`
rospy.loginfo('Searching for docking ArUco tag.')
# Search for the ArUco marker for the docking station
pose = self.find_tag("docking_station")
if __name__ == '__main__':
# Use a try-except block
try:
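The sweep in `find_tag()` above steps `joint_head_pan` and `joint_head_tilt` across a fixed grid of head positions. A minimal plain-Python sketch of that search pattern; the three tilt steps of pi/16 mirror the diff, while the pan start, step count, and step size are illustrative assumptions:

```python
from math import pi

def sweep_positions(min_pan, pan_steps, pan_step,
                    min_tilt, tilt_steps, tilt_step):
    """Yield (pan, tilt) pairs covering the head's search grid, row by row."""
    for i in range(tilt_steps):
        tilt = min_tilt + i * tilt_step
        for j in range(pan_steps):
            yield (min_pan + j * pan_step, tilt)

# Tilt values mirror the diff (3 steps of pi/16); the pan values are assumed.
grid = list(sweep_positions(-2.0, 7, pi / 8, -0.75, 3, pi / 16))
```

At each grid position the real script sends a head goal and checks the TF tree for the tag's transform before moving on.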

+ 7
- 9
ros1/src/avoider.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
@ -27,7 +27,7 @@ class Avoider:
self.pub = rospy.Publisher('/stretch/cmd_vel', Twist, queue_size=1) #/stretch_diff_drive_controller/cmd_vel for gazebo
self.sub = rospy.Subscriber('/scan', LaserScan, self.set_speed)
# We're going to assume that the robot is pointing up the x-axis, so that
# We're going to assume that the robot is pointing along the x-axis, so that
# any points with y coordinates further than half of the defined
# width (1 meter) from the axis are not considered.
self.width = 1
@ -47,18 +47,16 @@ class Avoider:
self.twist.angular.y = 0.0
self.twist.angular.z = 0.0
# Called every time we get a LaserScan message from ROS
def set_speed(self,msg):
"""
Callback function to deal with incoming laserscan messages.
Callback function to deal with incoming LaserScan messages.
:param self: The self reference.
:param msg: The subscribed laserscan message.
:param msg: The subscribed LaserScan message.
:publishes self.twist: Twist message.
"""
# Figure out the angles of the scan. We're going to do this each time, in case we're subscribing to more than one
# laser, with different numbers of beams
# Figure out the angles of the scan. We're going to do this each time,
# in case we're subscribing to more than one laser, with different numbers of beams
angles = linspace(msg.angle_min, msg.angle_max, len(msg.ranges))
# Work out the y coordinates of the ranges
@ -81,7 +79,7 @@ if __name__ == '__main__':
# Initialize the node, and call it "avoider"
rospy.init_node('avoider')
# Instantiate he Avoider class
# Instantiate the Avoider class
Avoider()
# Give control to ROS. This will allow the callback to be called whenever new

+ 3
- 3
ros1/src/capture_image.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import Modules
import rospy
@ -27,7 +27,7 @@ class CaptureImage:
self.bridge = CvBridge()
# Initialize subscriber
self.sub = rospy.Subscriber('/camera/color/image_raw_upright_view', Image, self.callback, queue_size=1)
self.sub = rospy.Subscriber('/camera/color/image_raw', Image, self.callback, queue_size=1)
# Create path to save captured images to the stored data folder
self.save_path = '/home/hello-robot/catkin_ws/src/stretch_tutorials/stored_data'
@ -42,7 +42,7 @@ class CaptureImage:
# Try to convert the ROS image message to a cv2 Image
try:
image = self.bridge.imgmsg_to_cv2(msg, 'bgr8')
except CvBridgeError, e:
except CvBridgeError as e:
rospy.logwarn('CV Bridge error: {0}'.format(e))
# Define the file name

+ 4
- 4
ros1/src/edge_detection.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import Modules
import rospy
@ -25,11 +25,11 @@ class EdgeDetection:
parameter values.
:param self: The self reference.
"""
# Initialize the CvBridge class
# Instantiate a CvBridge() object
self.bridge = CvBridge()
# Initialize subscriber
self.sub = rospy.Subscriber('/camera/color/image_raw_upright_view', Image, self.callback, queue_size=1)
self.sub = rospy.Subscriber('/camera/color/image_raw', Image, self.callback, queue_size=1)
# Initialize publisher
self.pub = rospy.Publisher('/image_edge_detection', Image, queue_size=1)
@ -55,7 +55,7 @@ class EdgeDetection:
# Try to convert the ROS image message to a CV2 Image
try:
image = self.bridge.imgmsg_to_cv2(msg, 'bgr8')
except CvBridgeError, e:
except CvBridgeError as e:
rospy.logwarn('CV Bridge error: {0}'.format(e))
# Apply the Canny Edge filter function to the transformed CV2 Image

+ 9
- 15
ros1/src/effort_sensing.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
@ -8,15 +8,15 @@ import os
import csv
# Import the FollowJointTrajectoryGoal from the control_msgs.msg package to
# control the Stretch robot.
# control the Stretch robot
from control_msgs.msg import FollowJointTrajectoryGoal
# Import JointTrajectoryPoint from the trajectory_msgs package to define
# robot trajectories.
# robot trajectories
from trajectory_msgs.msg import JointTrajectoryPoint
# We're going to subscribe to a JointState message type, so we need to import
# the definition for it
# We're going to subscribe to a JointState message type, so we need to import
# the definition for it
from sensor_msgs.msg import JointState
# Import hello_misc script for handling trajectory goals with an action client.
@ -31,7 +31,7 @@ class JointActuatorEffortSensor(hm.HelloNode):
"""
def __init__(self, export_data=False):
"""
Function that initializes the subscriber, and other features.
Function that initializes the subscriber and other features.
:param self: The self reference.
:param export_data: A boolean message type.
"""
@ -53,7 +53,6 @@ class JointActuatorEffortSensor(hm.HelloNode):
# Create boolean data type for conditional statements if you want to export effort data.
self.export_data = export_data
def callback(self, msg):
"""
Callback function to update and store JointState messages.
@ -63,7 +62,6 @@ class JointActuatorEffortSensor(hm.HelloNode):
# Store the joint messages for later use
self.joint_states = msg
def issue_command(self):
"""
Function that makes an action call and sends joint trajectory goals
@ -76,11 +74,11 @@ class JointActuatorEffortSensor(hm.HelloNode):
trajectory_goal = FollowJointTrajectoryGoal()
trajectory_goal.trajectory.joint_names = self.joints
# Provide desired positions for joint name.
# Provide desired positions for joint name.
point0 = JointTrajectoryPoint()
point0.positions = [0.9]
# Set a list to the trajectory_goal.trajectory.points
# Set a list to the `trajectory_goal.trajectory.points`
trajectory_goal.trajectory.points = [point0]
# Specify the coordinate frame that we want (base_link) and set the time to be now.
@ -93,7 +91,6 @@ class JointActuatorEffortSensor(hm.HelloNode):
rospy.loginfo('Sent position goal = {0}'.format(trajectory_goal))
self.trajectory_client.wait_for_result()
def feedback_callback(self,feedback):
"""
The feedback_callback function deals with the incoming feedback messages
@ -130,7 +127,6 @@ class JointActuatorEffortSensor(hm.HelloNode):
else:
self.joint_effort.append(current_effort)
def done_callback(self, status, result):
"""
The done_callback function will be called when the joint action is complete.
@ -141,7 +137,7 @@ class JointActuatorEffortSensor(hm.HelloNode):
"""
# Conditional statements to notify whether the action succeeded or failed.
if status == actionlib.GoalStatus.SUCCEEDED:
rospy.loginfo('Suceeded')
rospy.loginfo('Succeeded')
else:
rospy.loginfo('Failed')
@ -157,7 +153,6 @@ class JointActuatorEffortSensor(hm.HelloNode):
writer.writerow(self.joints)
writer.writerows(self.joint_effort)
def main(self):
"""
Function that initiates the issue_command function.
@ -170,7 +165,6 @@ class JointActuatorEffortSensor(hm.HelloNode):
self.issue_command()
time.sleep(2)
if __name__ == '__main__':
try:
# Declare object, node, from JointActuatorEffortSensor class
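When exporting is enabled, the done callback in `effort_sensing.py` writes the joint names as a header row and one row of effort values per feedback sample with Python's `csv` module. A minimal sketch of that export step, using made-up effort samples and an in-memory buffer instead of the script's file path:

```python
import csv
import io

joints = ['joint_lift']                   # header row: the commanded joint names
joint_effort = [[41.2], [40.8], [39.9]]   # made-up effort samples, one per feedback

buf = io.StringIO()                       # in-memory stand-in for the CSV file
writer = csv.writer(buf)
writer.writerow(joints)
writer.writerows(joint_effort)
csv_text = buf.getvalue()
```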

+ 10
- 5
ros1/src/joint_state_printer.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import ROS Python basic API and sys
import rospy
@ -15,7 +15,7 @@ class JointStatePublisher():
def __init__(self):
"""
Function that initializes the subsriber.
Function that initializes the subscriber.
:param self: The self reference
"""
# Set up a subscriber. We're going to subscribe to the topic "joint_states"
@ -36,7 +36,7 @@ class JointStatePublisher():
"""
print_states function to deal with the incoming JointState messages.
:param self: The self reference.
:param joints: A list of joint names.
:param joints: A list of string values of joint names.
"""
# Create an empty list that will store the positions of the requested joints
joint_positions = []
@ -45,6 +45,11 @@ class JointStatePublisher():
# The index() function returns the index at the first occurrence of
# the name of the requested joint in the self.joint_states.name list
for joint in joints:
if joint == "wrist_extension":
index = self.joint_states.name.index('joint_arm_l0')
joint_positions.append(4*self.joint_states.position[index])
continue
index = self.joint_states.name.index(joint)
joint_positions.append(self.joint_states.position[index])
@ -64,7 +69,7 @@ if __name__ == '__main__':
# Initialize the node.
rospy.init_node('joint_state_printer', anonymous=True)
# Declare object from JointStatePublisher class
# Instantiate a `JointStatePublisher()` object
JSP = JointStatePublisher()
# Use the rospy.sleep() function to allow the class to initialize before
@ -74,7 +79,7 @@ if __name__ == '__main__':
# Create a list of the joints and name them joints. These will be an argument
# for the print_states() function
joints = ["joint_lift", "joint_arm_l0", "joint_arm_l1", "joint_arm_l2", "joint_arm_l3", "joint_wrist_yaw"]
joints = ["joint_lift", "wrist_extension", "joint_wrist_yaw"]
# joints = ["joint_head_pan","joint_head_tilt", "joint_gripper_finger_left", "joint_gripper_finger_right"]
JSP.print_states(joints)
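The new `wrist_extension` branch above reports the extension as four times the `joint_arm_l0` position, since the arm's four telescoping segments extend in unison and each contributes an equal share. A standalone sketch of that lookup logic, with made-up joint-state values:

```python
def get_joint_positions(names, positions, joints):
    """Return positions for the requested joints; 'wrist_extension' is
    derived as 4x joint_arm_l0 because the four telescoping arm segments
    extend in unison."""
    out = []
    for joint in joints:
        if joint == 'wrist_extension':
            index = names.index('joint_arm_l0')
            out.append(4 * positions[index])
            continue
        index = names.index(joint)
        out.append(positions[index])
    return out

# Example joint state (values are made up)
names = ['joint_lift', 'joint_arm_l0', 'joint_wrist_yaw']
positions = [0.6, 0.05, 1.57]
vals = get_joint_positions(names, positions,
                           ['joint_lift', 'wrist_extension', 'joint_wrist_yaw'])
```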

+ 5
- 7
ros1/src/marker.py View File

@ -1,9 +1,9 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
# Import the Marker message type from the visualization_msgs package.
# Import the Marker message type from the visualization_msgs package
from visualization_msgs.msg import Marker
class Balloon():
@ -18,7 +18,7 @@ class Balloon():
# Set up a publisher. We're going to publish on a topic called balloon
self.pub = rospy.Publisher('balloon', Marker, queue_size=10)
# Create a marker. Markers of all shapes share a common type.
# Create a marker. Markers of all shapes share a common type
self.marker = Marker()
# Set the frame ID and type. The frame ID is the frame in which the position of the marker
@ -61,10 +61,9 @@ class Balloon():
# Create log message
rospy.loginfo("Publishing the balloon topic. Use RViz to visualize.")
def publish_marker(self):
"""
Function that publishes the sphere marker
Function that publishes the sphere marker.
:param self: The self reference.
:publishes self.marker: Marker message.
@ -72,12 +71,11 @@ class Balloon():
# publisher the marker
self.pub.publish(self.marker)
if __name__ == '__main__':
# Initialize the node, as usual
rospy.init_node('marker')
# Declare object from the Balloon class
# Instantiate a `Balloon()` object
balloon = Balloon()
# Set a rate

+ 5
- 6
ros1/src/move.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
@ -8,18 +8,17 @@ from geometry_msgs.msg import Twist
class Move:
"""
A class that sends Twist messages to move the Stretch robot foward.
A class that sends Twist messages to move the Stretch robot forward.
"""
def __init__(self):
"""
Function that initializes the publisher.
:param self: The self reference
:param self: The self reference.
"""
# Setup a publisher that will send the velocity commands for the Stretch
# Setup a publisher that will send the velocity commands to Stretch
# This will publish on a topic called "/stretch/cmd_vel" with a message type Twist
self.pub = rospy.Publisher('/stretch/cmd_vel', Twist, queue_size=1) #/stretch_diff_drive_controller/cmd_vel for gazebo
def move_forward(self):
"""
Function that publishes Twist messages
@ -51,7 +50,7 @@ if __name__ == '__main__':
# Initialize the node, and call it "move"
rospy.init_node('move')
# Declare object from Move class
# Instantiate a `Move()` object
base_motion = Move()
# Rate allows us to control the (approximate) rate at which we publish things.

+ 8
- 9
ros1/src/multipoint_command.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
@ -19,9 +19,11 @@ class MultiPointCommand(hm.HelloNode):
"""
A class that sends multiple joint trajectory goals to the stretch robot.
"""
# Initialize the inhereted hm.Hellonode class
def __init__(self):
"""
Function that initializes the inherited hm.HelloNode class.
:param self: The self reference.
"""
hm.HelloNode.__init__(self)
def issue_multipoint_command(self):
@ -33,8 +35,8 @@ class MultiPointCommand(hm.HelloNode):
# Set point0 as a JointTrajectoryPoint()
point0 = JointTrajectoryPoint()
# Provide desired positions of lift, wrist extension, and yaw of
# the wrist (in meters)
# Provide desired positions of lift (m), wrist extension (m), and yaw of
# the wrist (radians)
point0.positions = [0.2, 0.0, 3.4]
# Provide desired velocity of the lift (m/s), wrist extension (m/s),
@ -85,8 +87,6 @@ class MultiPointCommand(hm.HelloNode):
rospy.loginfo('Sent list of goals = {0}'.format(trajectory_goal))
self.trajectory_client.wait_for_result()
# Create a funcion, main(), to do all of the setup the hm.HelloNode class
# and issue the stow command
def main(self):
"""
Function that initiates the multipoint_command function.
@ -99,10 +99,9 @@ class MultiPointCommand(hm.HelloNode):
self.issue_multipoint_command()
time.sleep(2)
if __name__ == '__main__':
try:
# Instantiate the MultiPointCommand class and execute the main() method
# Instantiate a `MultiPointCommand()` object and execute the main() method
node = MultiPointCommand()
node.main()
except KeyboardInterrupt:
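`issue_multipoint_command()` chains several `JointTrajectoryPoint` waypoints into one `FollowJointTrajectoryGoal`, each supplying one position per joint name. A plain-Python sketch of that waypoint structure; only the first dict mirrors the script's point0, the other two are invented for illustration:

```python
joint_names = ['joint_lift', 'wrist_extension', 'joint_wrist_yaw']

# Waypoints as plain dicts standing in for JointTrajectoryPoint messages.
waypoints = [
    {'positions': [0.2, 0.0, 3.4], 'velocities': [0.2, 0.2, 2.5]},
    {'positions': [0.3, 0.1, 2.0], 'velocities': [0.2, 0.2, 2.5]},
    {'positions': [0.5, 0.2, 0.0]},
]

def consistent(points, names):
    """Every waypoint must supply exactly one position per joint name."""
    return all(len(p['positions']) == len(names) for p in points)

ok = consistent(waypoints, joint_names)
```

The real goal message carries these points in `trajectory_goal.trajectory.points`, and the action server interpolates between them.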

+ 20
- 19
ros1/src/navigation.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
@ -28,11 +28,6 @@ class StretchNavigation:
self.client.wait_for_server()
rospy.loginfo('{0}: Made contact with move_base server'.format(self.__class__.__name__))
# Initialize the callback functions
self.active_callback = None
self.feedback_callback = None
self.done_callback = None
# Create a MoveBaseGoal message type, and fill in all of the relevant fields.
self.goal = MoveBaseGoal()
self.goal.target_pose.header.frame_id = 'map'
@ -53,16 +48,14 @@ class StretchNavigation:
"""
A function to build Quaternians from Euler angles. Since the Stretch only
rotates around z, we can zero out the other angles.
:param theta: The angle the robot makes with the x-axis.
:param theta: The angle (radians) the robot makes with the x-axis.
"""
return Quaternion(*transformations.quaternion_from_euler(0.0, 0.0, theta))
def go_to(self, x, y, theta, wait=False):
def go_to(self, x, y, theta):
"""
Drive the robot to a particlar pose on the map. The Stretch only needs
Drive the robot to a particular pose on the map. The Stretch only needs
(x, y) coordinates and a heading.
:param x: x coordinate in the map frame.
:param y: y coordinate in the map frame.
:param theta: heading (angle with the x-axis in the map frame)
@ -75,17 +68,25 @@ class StretchNavigation:
self.goal.target_pose.pose.position.y = y
# Set the orientation. This is a quaternion, so we use the helper function.
self.goal.target_pose.pose.orientation = self.get_quaternion(0.0)
self.goal.target_pose.pose.orientation = self.get_quaternion(theta)
# Make the action call. Include the callbacks. Unless these have been set somewhere else, they are passed
# as None, which means no callback.
self.client.send_goal(self.goal,
active_cb=self.active_callback,
feedback_cb=self.feedback_callback,
done_cb=self.done_callback)
# Make the action call and include the `done_callback` function
self.client.send_goal(self.goal, done_cb=self.done_callback)
self.client.wait_for_result()
def done_callback(self, status, result):
"""
The done_callback function will be called when the joint action is complete.
:param self: The self reference.
:param status: status attribute from MoveBaseActionResult message.
:param result: result attribute from MoveBaseActionResult message.
"""
# Conditional statements to notify whether the action succeeded or failed.
if status == actionlib.GoalStatus.SUCCEEDED:
rospy.loginfo('{0}: SUCCEEDED in reaching the goal.'.format(self.__class__.__name__))
else:
rospy.loginfo('{0}: FAILED in reaching the goal.'.format(self.__class__.__name__))
if __name__ == '__main__':
# Initialize the node, and call it "navigation"
@ -95,4 +96,4 @@ if __name__ == '__main__':
nav = StretchNavigation()
# Send a nav goal to the `go_to()` method
nav.go_to(0.25, 0.0, 0.0, wait=True)
nav.go_to(0.5, 0.0, 0.0)
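`get_quaternion()` wraps `transformations.quaternion_from_euler` with roll and pitch zeroed, since Stretch's base only rotates about z. For a yaw-only rotation the conversion reduces to qz = sin(theta/2), qw = cos(theta/2), which a dependency-free sketch can show:

```python
from math import sin, cos, pi

def quaternion_from_yaw(theta):
    """(x, y, z, w) quaternion for a rotation of theta about z only;
    matches transformations.quaternion_from_euler(0.0, 0.0, theta)."""
    return (0.0, 0.0, sin(theta / 2.0), cos(theta / 2.0))

q = quaternion_from_yaw(pi / 2)  # a 90 degree heading on the map
```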

+ 5
- 5
ros1/src/pointcloud_transformer.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
@ -48,7 +48,7 @@ class PointCloudTransformer:
"""
A function that extracts the points from the stored PointCloud2 message
and appends those points to a PointCloud message. Then the function transforms
the PointCloud from its the header frame id, 'camera_color_optical_frame'
the PointCloud from its header frame id, 'camera_color_optical_frame'
to the 'base_link' frame.
:param self: The self reference.
"""
@ -74,7 +74,7 @@ class PointCloudTransformer:
:param self: The self reference.
:param msg: The PointCloud message.
:returns new_cloud: PointCloud message.
:returns new_cloud: The transformed PointCloud message.
"""
while not rospy.is_shutdown():
try:
@ -91,7 +91,7 @@ if __name__=="__main__":
# Initialize the node, and call it "pointcloud_transformer"
rospy.init_node('pointcloud_transformer',anonymous=True)
# Declare object, PCT, from the PointCloudTransformer class.
# Instantiate a `PointCloudTransformer()` object
PCT = PointCloudTransformer()
# We're going to publish information at 1 Hz
@ -103,6 +103,6 @@ if __name__=="__main__":
# Run while loop until the node is shutdown
while not rospy.is_shutdown():
# Run the pcl_transformer method
# Run the `pcl_transformer()` method
PCT.pcl_transformer()
rate.sleep()

+ 7
- 9
ros1/src/scan_filter.py View File

@ -1,17 +1,16 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
from numpy import linspace, inf
from math import sin
# We're going to subscribe to a LaserScan message
from sensor_msgs.msg import LaserScan
class ScanFilter:
"""
A class that implements a LaserScan filter that removes all of the points
that are not infront of the robot.
that are not in front of the robot.
"""
def __init__(self):
# We're going to assume that the robot is pointing up the x-axis, so that
@ -32,17 +31,16 @@ class ScanFilter:
# Create log message
rospy.loginfo("Publishing the filtered_scan topic. Use RViz to visualize.")
def callback(self,msg):
"""
Callback function to deal with incoming laserscan messages.
Callback function to deal with incoming LaserScan messages.
:param self: The self reference.
:param msg: The subscribed laserscan message.
:param msg: The subscribed LaserScan message.
:publishes msg: updated laserscan message.
:publishes msg: updated LaserScan message.
"""
# Figure out the angles of the scan. We're going to do this each time, in case we're subscribing to more than one
# laser, with different numbers of beams
# Figure out the angles of the scan. We're going to do this each time,
# in case we're subscribing to more than one laser, with different numbers of beams
angles = linspace(msg.angle_min, msg.angle_max, len(msg.ranges))
# Work out the y coordinates of the ranges
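The callback computes each beam's lateral offset as range * sin(angle) and keeps only beams within half the configured width of the x-axis, replacing the rest with inf as the script does. A minimal sketch of that filter; the three-beam scan here is illustrative:

```python
from math import sin, pi, inf

def filter_scan(ranges, angle_min, angle_max, width=1.0):
    """Replace beams whose lateral (y) offset exceeds width/2 with inf,
    keeping only points roughly in front of the robot."""
    step = (angle_max - angle_min) / (len(ranges) - 1)
    angles = [angle_min + i * step for i in range(len(ranges))]
    return [r if abs(r * sin(a)) <= width / 2.0 else inf
            for r, a in zip(ranges, angles)]

# A made-up three-beam scan: the straight-ahead beam survives, the side beams do not
filtered = filter_scan([2.0, 2.0, 2.0], -pi / 2, pi / 2)
```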

+ 11
- 13
ros1/src/single_joint_actuator.py View File

@ -1,15 +1,15 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
import time
# Import the FollowJointTrajectoryGoal from the control_msgs.msg package to
# control the Stretch robot.
# control the Stretch robot
from control_msgs.msg import FollowJointTrajectoryGoal
# Import JointTrajectoryPoint from the trajectory_msgs package to define
# robot trajectories.
# robot trajectories
from trajectory_msgs.msg import JointTrajectoryPoint
# Import hello_misc script for handling trajectory goals with an action client
@ -19,8 +19,11 @@ class SingleJointActuator(hm.HelloNode):
"""
A class that sends multiple joint trajectory goals to a single joint.
"""
# Initialize the inhereted hm.Hellonode class
def __init__(self):
"""
Function that initializes the inherited hm.HelloNode class.
:param self: The self reference.
"""
hm.HelloNode.__init__(self)
def issue_command(self):
@ -34,13 +37,10 @@ class SingleJointActuator(hm.HelloNode):
# joint_lift: lower_limit = 0.15, upper_limit = 1.10 # in meters
# wrist_extension: lower_limit = 0.00, upper_limit = 0.50 # in meters
# joint_wrist_yaw: lower_limit = -1.75, upper_limit = 4.00 # in radians
# joint_head_pan: lower_limit = -4.10, upper_limit = 1.60 # in radians
# joint_head_pan: lower_limit = -2.80, upper_limit = 2.90 # in radians
# joint_head_tilt: lower_limit = -1.60, upper_limit = 0.40 # in radians
# joint_gripper_finger_left: lower_limit = -0.35, upper_limit = 0.165 # in radians
#
# INCLUDED JOINT IN MANIPULATION MODE
# joint_mobile_base_translation: lower_limit = -0.50, upper_limit = 0.50 # in radians
#
# INCLUDED JOINTS IN POSITION MODE
# translate_mobile_base: No lower or upper limit. Defined by a step size in meters
# rotate_mobile_base: No lower or upper limit. Defined by a step size in radians
@ -52,7 +52,7 @@ class SingleJointActuator(hm.HelloNode):
trajectory_goal.trajectory.joint_names = ['joint_head_pan']
# Provide desired positions for joint name.
# Create a JointTrajectoryPoint message type
# Set positions for the following 5 trajectory points
point0 = JointTrajectoryPoint()
point0.positions = [0.65]
@ -63,7 +63,7 @@ class SingleJointActuator(hm.HelloNode):
# trajectory points
trajectory_goal.trajectory.points = [point0]#, point1]
# Specify the coordinate frame that we want (base_link) and set the time to be now.
# Specify the coordinate frame that we want (base_link) and set the time to be now
trajectory_goal.trajectory.header.stamp = rospy.Time(0.0)
trajectory_goal.trajectory.header.frame_id = 'base_link'
@ -85,11 +85,9 @@ class SingleJointActuator(hm.HelloNode):
self.issue_command()
time.sleep(2)
if __name__ == '__main__':
try:
# Declare object from the SingleJointActuator class. Then execute the
# main() method/function
# Instantiate a `SingleJointActuator()` object and execute the main() method
node = SingleJointActuator()
node.main()
except KeyboardInterrupt:
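The joint-limit comments above (including the corrected `joint_head_pan` range of -2.80 to 2.90 radians) suggest guarding commanded positions before sending a goal. A hedged sketch of such a clamp; the LIMITS table copies the comment values, and the helper itself is an illustration, not part of the tutorial code:

```python
# Limits copied from the comments in single_joint_actuator.py
LIMITS = {
    'joint_lift': (0.15, 1.10),        # meters
    'wrist_extension': (0.00, 0.50),   # meters
    'joint_wrist_yaw': (-1.75, 4.00),  # radians
    'joint_head_pan': (-2.80, 2.90),   # radians
    'joint_head_tilt': (-1.60, 0.40),  # radians
}

def clamp_position(joint, position):
    """Clamp a commanded position into the joint's valid range."""
    lo, hi = LIMITS[joint]
    return max(lo, min(hi, position))

safe = clamp_position('joint_head_pan', 4.10)  # the old upper limit, now out of range
```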

+ 2
- 3
ros1/src/speech_text.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
@ -7,7 +7,6 @@ import os
# Import SpeechRecognitionCandidates from the speech_recognition_msgs package
from speech_recognition_msgs.msg import SpeechRecognitionCandidates
class SpeechText:
"""
A class that saves the interpreted speech from the ReSpeaker Microphone Array to a text file.
@ -15,6 +14,7 @@ class SpeechText:
def __init__(self):
"""
Initialize subscriber and directory to save speech to text file.
:param self: The self reference.
"""
# Initialize subscriber
self.sub = rospy.Subscriber("speech_to_text", SpeechRecognitionCandidates, self.callback)
@ -44,7 +44,6 @@ class SpeechText:
file_object.write("\n")
file_object.write(transcript)
if __name__ == '__main__':
# Initialize the node and name it speech_text
rospy.init_node('speech_text')

+ 3
- 5
ros1/src/stored_data_plotter.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import needed python packages
import numpy as np
@ -65,7 +65,6 @@ class Plotter():
# Reset y_anim for the next joint effort animation
del self.y_anim[:]
# Conditional statement for regular plotting (No animation)
else:
self.data[joint].plot(kind='line')
@ -73,18 +72,17 @@ class Plotter():
# plt.savefig(save_path, bbox_inches='tight')
plt.show()
def plot_animate(self,i):
"""
Function that plots every increment of the dataframe.
:param self: The self reference.
:param i: index value.
"""
# Append self.effort values for given joint.
# Append self.effort values for given joint
self.y_anim.append(self.effort.values[i])
plt.plot(self.y_anim, color='blue')
if __name__ == '__main__':
# Instantiate a `Plotter()` object and execute the `plot_data()` method
viz = Plotter(animate=True)
viz.plot_data()
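`plot_animate()` appends one effort sample per frame and redraws, which is what `matplotlib.animation.FuncAnimation` calls repeatedly. A minimal headless sketch of that incremental-plot pattern (the `effort` samples are illustrative, not real joint data):

```python
import matplotlib
matplotlib.use('Agg')  # headless backend; no display needed
import matplotlib.pyplot as plt

effort = [0.0, 0.5, 1.2, 0.9, 0.4]  # illustrative effort samples
y_anim = []

def plot_animate(i):
    """Append one sample and redraw, as Plotter.plot_animate() does;
    FuncAnimation would call this once per frame index i."""
    y_anim.append(effort[i])
    plt.plot(y_anim, color='blue')

# Drive the frames manually so the sketch runs without an animation loop
for i in range(len(effort)):
    plot_animate(i)
```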

+ 9
- 11
ros1/src/stow_command.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
@ -19,23 +19,24 @@ class StowCommand(hm.HelloNode):
'''
A class that sends a joint trajectory goal to stow the Stretch's arm.
'''
# Initialize the inherited hm.HelloNode class
def __init__(self):
"""
Function that initializes the inherited hm.HelloNode class.
:param self: The self reference.
"""
hm.HelloNode.__init__(self)
def issue_stow_command(self):
'''
Function that makes an action call and sends stow postion goal.
Function that makes an action call and sends stow position goal.
:param self: The self reference.
'''
# Set stow_point as a JointTrajectoryPoint()
stow_point = JointTrajectoryPoint()
stow_point.time_from_start = rospy.Duration(0.000)
# Provide desired positions of lift, wrist extension, and yaw of
# the wrist (in meters)
# Provide desired positions of lift (meters), wrist extension (meters), and yaw of
# the wrist (radians)
stow_point.positions = [0.2, 0.0, 3.4]
# Set trajectory_goal as a FollowJointTrajectoryGoal and define
@ -57,8 +58,6 @@ class StowCommand(hm.HelloNode):
rospy.loginfo('Sent stow goal = {0}'.format(trajectory_goal))
self.trajectory_client.wait_for_result()
# Create a funcion, main(), to do all of the setup the hm.HelloNode class
# and issue the stow command
def main(self):
'''
Function that initiates stow_command function.
@ -71,10 +70,9 @@ class StowCommand(hm.HelloNode):
self.issue_stow_command()
time.sleep(2)
if __name__ == '__main__':
try:
# Declare object from StowCommand class then execute the main() method
# Instantiate a `StowCommand()` object and execute the main() method
node = StowCommand()
node.main()
except KeyboardInterrupt:
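The stow point pairs its three position values with three joints by index: lift height (meters), arm extension (meters), and wrist yaw (radians). A ROS-free sketch of that pairing, where the dataclass stands in for `trajectory_msgs/JointTrajectoryPoint` and the joint-name list is an assumption based on the Stretch driver's conventions:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class JointTrajectoryPoint:
    """Stand-in for trajectory_msgs/JointTrajectoryPoint in this sketch."""
    positions: List[float] = field(default_factory=list)
    time_from_start: float = 0.0

# Assumed joint names; positions align with these by index
STOW_JOINT_NAMES = ['joint_lift', 'wrist_extension', 'joint_wrist_yaw']

def make_stow_point():
    stow_point = JointTrajectoryPoint()
    stow_point.positions = [0.2, 0.0, 3.4]  # values from issue_stow_command()
    return dict(zip(STOW_JOINT_NAMES, stow_point.positions))
```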

+ 4
- 4
ros1/src/tf2_broadcaster.py View File

@ -1,12 +1,12 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
# Because of transformations
# Import tf.transformations to change quaternion values to euler values
import tf.transformations
# The TransformStamped message is imported to create a child frame
# The TransformStamped message is imported to broadcast a transform frame
from geometry_msgs.msg import TransformStamped
# Import StaticTransformBroadcaster to publish static transforms
@ -84,7 +84,7 @@ if __name__ == '__main__':
# Initialize the node, and call it "tf2_broadcaster"
rospy.init_node('tf2_broadcaster')
# Instantiate the FixedFrameBroadcaster class
# Instantiate the `FixedFrameBroadcaster()` class
FixedFrameBroadcaster()
# Give control to ROS. This will allow the callback to be called whenever new
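The broadcaster uses `tf.transformations` to convert an euler angle into the quaternion that fills `TransformStamped.transform.rotation`. For a planar (yaw-only) frame, that conversion reduces to one line; a small sketch of just that slice of `quaternion_from_euler(0, 0, yaw)`:

```python
import math

def quaternion_from_yaw(yaw):
    """Quaternion (x, y, z, w) for a rotation about z only — the piece of
    tf.transformations.quaternion_from_euler(0, 0, yaw) a planar frame needs."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))
```

A zero yaw yields the identity quaternion `(0, 0, 0, 1)`, which is what a static transform with no rotation should carry.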

+ 4
- 6
ros1/src/tf2_listener.py View File

@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import rospy
@ -9,11 +9,10 @@ from geometry_msgs.msg import TransformStamped
# Import StaticTransformBroadcaster to publish static transforms
import tf2_ros
class FrameListener():
"""
This Class prints the transformation between the fk_link_mast frame and the
target frame, link_grasp_center.
This class prints the transform between the fk_link_mast and the
link_grasp_center frame.
"""
def __init__(self):
"""
@ -55,12 +54,11 @@ class FrameListener():
# Manage the rate the node prints out a message
rate.sleep()
if __name__ == '__main__':
# Initialize the node, and call it "tf2_listener"
rospy.init_node('tf2_listener')
# Instantiate the FrameListener class
# Instantiate the `FrameListener()` class
FrameListener()
# Give control to ROS. This will allow the callback to be called whenever new
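Conceptually, the tf2 lookup composes frame poses: the transform from `fk_link_mast` to `link_grasp_center` is the inverse of the mast's pose times the grasp frame's pose, both expressed in a common parent. A ROS-free sketch with homogeneous matrices (the translations are made-up values, not Stretch's real geometry):

```python
import numpy as np

def make_tf(x, y, z):
    """Homogeneous transform with translation only (identity rotation)."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Made-up poses of the two frames in a common parent (e.g. base_link)
T_base_mast = make_tf(0.0, 0.0, 1.3)
T_base_grasp = make_tf(0.3, 0.0, 0.8)

# What lookup_transform('fk_link_mast', 'link_grasp_center', ...) reports:
# the grasp frame expressed in the mast frame
T_mast_grasp = np.linalg.inv(T_base_mast) @ T_base_grasp
translation = T_mast_grasp[:3, 3]
```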

+ 19
- 20
ros1/src/voice_teleoperation_base.py View File

@ -1,23 +1,23 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# Import modules
import math
import rospy
import sys
# We're going to subscribe to a JointState message type, so we need to import
# the definition for it
# We're going to subscribe to 64-bit integers, so we need to import the definition
# for them
from sensor_msgs.msg import JointState
# Import Int32 message types from the std_msgs package
from std_msgs.msg import Int32
# Import the FollowJointTrajectoryGoal from the control_msgs.msg package to
# control the Stretch robot.
# control the Stretch robot
from control_msgs.msg import FollowJointTrajectoryGoal
# Import JointTrajectoryPoint from the trajectory_msgs package to define
# robot trajectories.
# robot trajectories
from trajectory_msgs.msg import JointTrajectoryPoint
# Import hello_misc script for handling trajectory goals with an action client
@ -69,16 +69,17 @@ class GetVoiceCommands:
def callback_direction(self, msg):
"""
A callback function that converts the sound direction from degrees to radians.
A callback function that converts the incoming message, sound direction,
from degrees to radians.
:param self: The self reference.
:param msg: The Int32 message type.
:param msg: The Int32 message type that represents the sound direction.
"""
self.sound_direction = msg.data * -self.rad_per_deg
def callback_speech(self,msg):
"""
A callback function that takes all items in the iterable list and join
them into a single string.
A callback function that takes the incoming message, a list of the speech-to-text
candidates, and joins all items in that iterable list into a single string.
:param self: The self reference.
:param msg: The SpeechRecognitionCandidates message type.
"""
@ -90,7 +91,7 @@ class GetVoiceCommands:
base motion.
:param self: The self reference.
:returns inc: A dictionary type.
:returns inc: A dictionary type that contains the increment size.
"""
if self.step_size == 'small':
inc = {'rad': self.small_rad, 'translate': self.small_translate}
@ -130,7 +131,7 @@ class GetVoiceCommands:
A function that defines the teleoperation command based on the voice command.
:param self: The self reference.
:returns command: A dictionary type.
:returns command: A dictionary type that contains the type of base motion.
"""
command = None
# Move base forward command
@ -141,11 +142,11 @@ class GetVoiceCommands:
if self.voice_command == 'back':
command = {'joint': 'translate_mobile_base', 'inc': -self.get_inc()['translate']}
# Move base left command
# Rotate base left command
if self.voice_command == 'left':
command = {'joint': 'rotate_mobile_base', 'inc': self.get_inc()['rad']}
# Move base right command
# Rotate base right command
if self.voice_command == 'right':
command = {'joint': 'rotate_mobile_base', 'inc': -self.get_inc()['rad']}
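The four branches above map a recognized word to a command dictionary: 'forward'/'back' translate the mobile base, 'left'/'right' rotate it, and anything else yields no command. A self-contained sketch of that mapping (the increment sizes here are illustrative stand-ins for `get_inc()`, not the repo's actual values):

```python
# Illustrative increment sizes standing in for get_inc()
SMALL_INC = {'rad': 0.1, 'translate': 0.05}

def get_command(voice_command, inc=SMALL_INC):
    """Mirror GetVoiceCommands.get_command(): 'forward'/'back' translate the
    mobile base, 'left'/'right' rotate it; unrecognized words return None."""
    command = None
    if voice_command == 'forward':
        command = {'joint': 'translate_mobile_base', 'inc': inc['translate']}
    if voice_command == 'back':
        command = {'joint': 'translate_mobile_base', 'inc': -inc['translate']}
    if voice_command == 'left':
        command = {'joint': 'rotate_mobile_base', 'inc': inc['rad']}
    if voice_command == 'right':
        command = {'joint': 'rotate_mobile_base', 'inc': -inc['rad']}
    return command
```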
@ -181,6 +182,7 @@ class VoiceTeleopNode(hm.HelloNode):
"""
A function that declares an object from the GetVoiceCommands class, instantiates
the HelloNode class, and sets the publishing rate.
:param self: The self reference.
"""
hm.HelloNode.__init__(self)
self.rate = 10.0
@ -200,7 +202,7 @@ class VoiceTeleopNode(hm.HelloNode):
"""
Function that makes an action call and sends base joint trajectory goals.
:param self: The self reference.
:param command: A dictionary type.
:param command: A dictionary that contains the base motion type and increment size.
"""
joint_state = self.joint_state
# Conditional statement to send joint trajectory goals
@ -222,20 +224,19 @@ class VoiceTeleopNode(hm.HelloNode):
rospy.loginfo('inc = {0}'.format(inc))
new_value = inc
# Assign the new_value position to the trajectory goal message type.
# Assign the new_value position to the trajectory goal message type
point.positions = [new_value]
trajectory_goal.trajectory.points = [point]
trajectory_goal.trajectory.header.stamp = rospy.Time.now()
rospy.loginfo('joint_name = {0}, trajectory_goal = {1}'.format(joint_name, trajectory_goal))
# Make the action call and send goal of the new joint position.
# Make the action call and send goal of the new joint position
self.trajectory_client.send_goal(trajectory_goal)
rospy.loginfo('Done sending command.')
# Reprint the voice command menu
self.speech.print_commands()
def main(self):
"""
The main function that instantiates the HelloNode class, initializes the subscriber,
@ -255,11 +256,9 @@ class VoiceTeleopNode(hm.HelloNode):
self.send_command(command)
rate.sleep()
if __name__ == '__main__':
try:
# Declare object from the VoiceTeleopNode class. Then execute the
# main() method/function
# Instantiate a `VoiceTeleopNode()` object and execute the main() method
node = VoiceTeleopNode()
node.main()
except KeyboardInterrupt:

+ 1
- 3
ros1/teleoperating_stretch.md View File

@ -161,6 +161,4 @@ An alternative for robot base teleoperation is to use an Xbox controller. Stop t
# Terminal 2
roslaunch stretch_core teleop_twist.launch twist_topic:=/stretch_diff_drive_controller/cmd_vel linear:=1.0 angular:=2.0 teleop_type:=joystick
```
Note that the teleop_twist_joy package has a deadman switch by default, which prevents drive commands from being published unless it is held. For a Logitech F310 joystick, this button is A.
**Next Tutorial:** [Internal State of Stretch](internal_state_of_stretch.md)
