
![](../images/banner.png)
## Overview
*hello_helpers* mostly consists of the hello_helpers Python module. This module provides various Python files used across stretch_ros that have not attained sufficient status to stand on their own.
## Capabilities
*fit_plane.py* : Fits planes to 3D data.
*hello_misc.py* : Various functions, including a helpful Python object with which to create ROS nodes.
*hello_ros_viz.py* : Various helper functions for visualizations using RViz.
## Typical Usage
```python
import hello_helpers.fit_plane as fp
```
```python
import hello_helpers.hello_misc as hm
```
```python
import hello_helpers.hello_ros_viz as hr
```
# API
## Classes
### [HelloNode](./src/hello_helpers/hello_misc.py)
This class is a convenience class for creating a ROS node for Stretch. The most common way to use this class is to extend it. In your extending class, the main function would call `HelloNode`'s main function. This would look like:
```python
import hello_helpers.hello_misc as hm

class MyNode(hm.HelloNode):
    def __init__(self):
        hm.HelloNode.__init__(self)

    def main(self):
        hm.HelloNode.main(self, 'my_node', 'my_node', wait_for_first_pointcloud=False)
        # my_node's main logic goes here

node = MyNode()
node.main()
```
There is also a one-liner class method for instantiating a `HelloNode` for easy prototyping. One example where this is handy is sending pose commands from IPython:
```python
# roslaunch the stretch launch file beforehand
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp')
temp.move_to_pose({'joint_lift': 0.4})
```
#### Attributes
##### `joint_states`
This attribute gives you the entire joint state of the robot as a [JointState](https://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/JointState.html) message. The utility method [`get_joint_state()`](#get_joint_statejoint_name-moving_threshold0001) is an easier alternative to parsing the JointState message.
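For example, assuming a joint state message has already been received, a minimal sketch of reading the lift's position directly from the message in an interactive session:
```python
# roslaunch the stretch launch file beforehand
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp')
# JointState holds parallel name/position/velocity/effort lists
i = temp.joint_states.name.index('joint_lift')
print(f"Lift position: {temp.joint_states.position[i]:.3f}m")
```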
##### `point_cloud`
This attribute is a [PointCloud2](https://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/PointCloud2.html) message as seen by the head camera. The utility method [`get_point_cloud()`](#get_point_cloud) is an easier alternative to parsing the PointCloud2 message.
##### `tool`
This attribute is the name of the end-effector as a string. You can use this attribute to flag an error for other Stretch users if their robot isn't configured with the correct tool. Most commonly, this attribute will be either `'tool_stretch_dex_wrist'` or `'tool_stretch_gripper'`. To learn about the other end-effectors available for Stretch, or how to create your own and plug it into Stretch's ROS driver, check out the [documentation on tools](https://docs.hello-robot.com/0.2/stretch-tutorials/stretch_body/tutorial_tool_change/).
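For example, a node that requires the Dex Wrist could check this attribute at startup (a minimal sketch; the error message is illustrative):
```python
# roslaunch the stretch launch file beforehand
import rospy
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp')
if temp.tool != 'tool_stretch_dex_wrist':
    rospy.logerr(f"This node expects the Dex Wrist, but the robot reports tool: {temp.tool}")
```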
##### `mode`
This attribute is the mode the robot's driver is in, as a string. See the driver's API to learn more about [driver modes](../stretch_core/README.md#mode-std_msgsstring).
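For example, a quick interactive sketch that prints the current mode:
```python
# roslaunch the stretch launch file beforehand
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp')
print(f"stretch_driver is currently in '{temp.mode}' mode")
```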
##### `dryrun`
This attribute allows you to control whether the robot actually moves when calling `move_to_pose()`, `home_the_robot()`, `stow_the_robot()`, or other motion methods in this class. When `dryrun` is set to True, these motion methods return immediately. This attribute is helpful when you want to run just the perception/planning part of your node without actually moving the robot. For example, you could replace the following verbose snippet:
```python
# roslaunch the stretch launch file beforehand
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp')
actually_move = False
[...]
if actually_move:
    temp.move_to_pose({'translate_mobile_base': 1.0})
```
to be more concise:
```python
# roslaunch the stretch launch file beforehand
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp')
[...]
temp.dryrun = True
temp.move_to_pose({'translate_mobile_base': 1.0})
```
#### Methods
##### `move_to_pose(pose, return_before_done=False, custom_contact_thresholds=False, custom_full_goal=False)`
This method takes in a dictionary that describes a desired pose for the robot and communicates with [stretch_driver](../stretch_core/README.md#stretchdrivernodesstretchdriver) to execute it. The basic format of this dictionary is string/number key/value pairs, where the keys are joint names and the values are desired position goals. For example, `{'joint_lift': 0.5}` would put the lift at 0.5m in its joint range. A full list of command-able joints is published to the `/stretch/joint_states` topic. Used within a node extending `HelloNode`, calling this method would look like:
```python
self.move_to_pose({'joint_lift': 0.5})
```
Internally, this dictionary is converted into a [FollowJointTrajectoryGoal](http://docs.ros.org/en/diamondback/api/control_msgs/html/msg/FollowJointTrajectoryGoal.html) that is sent to a [FollowJointTrajectory action](http://docs.ros.org/en/noetic/api/control_msgs/html/action/FollowJointTrajectory.html) server in stretch_driver. This method waits by default for the server to report that the goal has completed executing. However, you can return before the goal has completed by setting the `return_before_done` argument to True. This can be useful for preempting goals.
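For example, a sketch of sending a goal without blocking and then preempting it with `stop_the_robot()` (documented below):
```python
import rospy

# within a node extending HelloNode: start the lift moving, keep doing
# other work, then preempt the goal before it finishes (a sketch)
self.move_to_pose({'joint_lift': 0.9}, return_before_done=True)
rospy.sleep(1.0)  # stand-in for other work
self.stop_the_robot()
```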
There are two additional arguments that enable you to customize how the pose is executed. If you set `custom_contact_thresholds` to True, this method expects a dictionary in a different format: string/tuple key/value pairs, where the keys are still joint names, but the values are `(position_goal, effort_threshold)`. The addition of an effort threshold enables you to detect when a joint has made contact with something in the environment, which is useful for manipulation or safe movements. For example, `{'joint_arm': (0.5, 20)}` commands the telescoping arm fully out (the arm is nearly fully extended at 0.5 meters) but with a low enough effort threshold (20% of the arm motor's max effort) that the motor will stop when the end of the arm makes contact with something. Again, in a node, this would look like:
```python
self.move_to_pose({'joint_arm': (0.5, 40)}, custom_contact_thresholds=True)
```
If you set `custom_full_goal` to True, the dictionary format is string/tuple key/value pairs, where keys are joint names again, but values are `(position_goal, velocity, acceleration, effort_threshold)`. The velocity and acceleration components allow you to customize the trajectory profile the joint follows while moving to the goal position. In the following example, the arm telescopes out slowly until contact is detected:
```python
self.move_to_pose({'joint_arm': (0.5, 0.01, 0.01, 40)}, custom_full_goal=True)
```
##### `home_the_robot()`
This is a convenience method to interact with the driver's [`/home_the_robot` service](../stretch_core/README.md#home_the_robot-std_srvstrigger).
##### `stow_the_robot()`
This is a convenience method to interact with the driver's [`/stow_the_robot` service](../stretch_core/README.md#stow_the_robot-std_srvstrigger).
##### `stop_the_robot()`
This is a convenience method to interact with the driver's [`/stop_the_robot` service](../stretch_core/README.md#stop_the_robot-std_srvstrigger).
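For example, a quick interactive sketch that exercises all three convenience methods (make sure the robot has room to stow its arm):
```python
# roslaunch the stretch launch file beforehand
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp')
temp.home_the_robot()  # run the homing sequence
temp.stow_the_robot()  # tuck the arm and wrist into the stowed configuration
temp.stop_the_robot()  # cancel any currently executing motion
```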
##### `get_tf(from_frame, to_frame)`
Use this method to get the transform ([geometry_msgs/TransformStamped](https://docs.ros.org/en/noetic/api/geometry_msgs/html/msg/TransformStamped.html)) between two frames. This method is blocking. For example, this method can do forward kinematics from the base_link to the link between the gripper fingers, link_grasp_center, using:
```python
# roslaunch the stretch launch file beforehand
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp')
t = temp.get_tf('base_link', 'link_grasp_center')
print(t.transform.translation)
```
##### `get_joint_state(joint_name, moving_threshold=0.001)`
Use this method to retrieve the joint state for a single joint. It returns a tuple of the joint's position, velocity, effort, and an is_moving boolean (determined using the moving_threshold argument). For example:
```python
# roslaunch the stretch launch file beforehand
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp')
pos, vel, eff, is_moving = temp.get_joint_state('joint_head_pan')
print(f"The head pan is {'moving' if is_moving else 'not moving'}")
```
##### `get_point_cloud()`
Use this method to retrieve the point cloud seen by the head camera as a Numpy array. It returns a tuple with a named point array, an Nx3 array of 3D points, the timestamp at which the cloud was captured, and the TF frame in which it was captured. For example:
```python
# roslaunch the stretch launch file beforehand
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp')
cloud, cloud_xyz, capture_time, capture_frame = temp.get_point_cloud()
print(f"Head camera saw a cloud of size {cloud_xyz.shape} in frame {capture_frame} at {capture_time}")
# Head camera saw a cloud of size (275925, 3) in frame camera_color_optical_frame at 1695973195045439959
import numpy as np
i = np.argmax(cloud['z'])
print(f"If the capture frame is camera_color_optical_frame (i.e. z axis points out from camera), the point furthest from the camera is {np.sqrt(cloud['x'][i]**2 + cloud['y'][i]**2 + cloud['z'][i]**2):.2f}m away and has the color {(cloud['r'][i], cloud['g'][i], cloud['b'][i])}")
# If the capture frame is camera_color_optical_frame (i.e. z axis points out from camera), the point furthest from the camera is 1.81m away and has the color (118, 121, 103)
```
##### `get_robot_floor_pose_xya(floor_frame='odom')`
Returns the current estimated x, y position and angle of the robot on the floor. This is typically called with respect to the odom frame or the map frame. x and y are in meters and the angle is in radians.
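For example, a sketch of querying the robot's pose interactively (this assumes the method returns the pose as an `[x, y, angle]` list along with a timestamp; check your version of hello_misc for the exact return value):
```python
# roslaunch the stretch launch file beforehand
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp')
xya, timestamp = temp.get_robot_floor_pose_xya(floor_frame='odom')
x, y, a = xya
print(f"Robot is at ({x:.2f}m, {y:.2f}m) with heading {a:.2f}rad in the odom frame")
```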
##### `main(node_name, node_topic_namespace, wait_for_first_pointcloud=True)`
When extending the `HelloNode` class, call this method at the very beginning of your `main()` method. This method handles setting up a few ROS components, including registering the node with the ROS server, creating a TF listener, creating a [FollowJointTrajectory](http://docs.ros.org/en/noetic/api/control_msgs/html/action/FollowJointTrajectory.html) client for the [`move_to_pose()`](#movetoposepose-returnbeforedonefalse-customcontactthresholdsfalse-customfullgoalfalse) method, subscribing to the depth camera's point cloud topic, and connecting to the quick-stop service.
Since it can take up to 30 seconds for the head camera to start streaming data, the `wait_for_first_pointcloud` argument makes the node wait until it has received camera data, which is helpful if your node processes camera data.
##### `quick_create(name, wait_for_first_pointcloud=False)`
A class-level method for quick testing. This allows you to avoid having to extend `HelloNode` to use it.
```python
# roslaunch the stretch launch file beforehand
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp')
temp.move_to_pose({'joint_lift': 0.4})
```
#### Subscribed Topics
##### /camera/depth/color/points ([sensor_msgs/PointCloud2](http://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/PointCloud2.html))
Provides a point cloud as currently seen by the Realsense depth camera in Stretch's head. Accessible from the `self.point_cloud` attribute.
```python
# roslaunch the stretch launch file beforehand
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp', wait_for_first_pointcloud=True)
print(temp.point_cloud)
```
#### Subscribed Services
##### /stop_the_robot ([std_srvs/Trigger](https://docs.ros.org/en/noetic/api/std_srvs/html/srv/Trigger.html))
Provides a service to quickly stop any motion currently executing on the robot.
```python
# roslaunch the stretch launch file beforehand
from std_srvs.srv import TriggerRequest
import hello_helpers.hello_misc as hm
temp = hm.HelloNode.quick_create('temp')
temp.stop_the_robot_service(TriggerRequest())
```
## License
For license information, please see the LICENSE files.