This attribute gives you the entire joint state of the robot as a [JointState](https://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/JointState.html) message. The utility method [`get_joint_state()`](#get_joint_statejoint_name-moving_threshold0001) is an easier alternative to parsing the JointState message.
##### `point_cloud`
This attribute is a [PointCloud2](https://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/PointCloud2.html) message as seen by the head camera. The utility method [`get_point_cloud()`](#get_point_cloud) is an easier alternative to parsing the PointCloud2 message.
##### `tool`
This attribute is the name of the end-effector as a string. You can use this attribute to flag an error for users running your node if their robot isn't configured with the tool it expects. Most commonly, this attribute will be either `'tool_stretch_dex_wrist'` or `'tool_stretch_gripper'`. To learn about the other end-effectors available for Stretch, or how to create your own and plug it into Stretch's ROS driver, check out the [documentation on tools](https://docs.hello-robot.com/0.2/stretch-tutorials/stretch_body/tutorial_tool_change/).
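For instance, a node that depends on the Dex Wrist could check this attribute at startup. A minimal sketch (the choice of error handling is illustrative):

```python
# Fail fast if the expected end-effector isn't installed
if self.tool != 'tool_stretch_dex_wrist':
    raise RuntimeError(f"This node requires the Dex Wrist, but the robot reports '{self.tool}'")
```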
##### `mode`
This attribute is the mode the robot's driver is in, as a string. See the driver's API to learn more about [driver modes](../stretch_core/README.md#mode-std_msgsstring).
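For example, a node that sends position goals could sanity-check the mode first. A minimal sketch, assuming `'position'` is the mode string your node expects:

```python
# Warn if the driver isn't in the mode this node was written for
if self.mode != 'position':
    print(f"Expected the driver to be in 'position' mode, but it reports '{self.mode}'")
```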
##### `dryrun`
This attribute allows you to control whether the robot actually moves when calling `move_to_pose()`, `home_the_robot()`, `stow_the_robot()`, or other motion methods in this class. When `dryrun` is set to True, these motion methods return immediately. This attribute is helpful when you want to run just the perception/planning part of your node without actually moving the robot. For example, you could replace the following verbose snippet:
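(A sketch; the `planning_only` flag and the motion targets are illustrative.)

```python
if not self.planning_only:
    self.move_to_pose({'joint_lift': 0.6})
if not self.planning_only:
    self.stow_the_robot()
```

with this:

```python
self.dryrun = self.planning_only
self.move_to_pose({'joint_lift': 0.6})
self.stow_the_robot()
```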
##### `get_tf(from_frame, to_frame)`
Use this method to get the transform ([geometry_msgs/TransformStamped](https://docs.ros.org/en/noetic/api/geometry_msgs/html/msg/TransformStamped.html)) between the `from_frame` and `to_frame` frames.
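A minimal sketch (the frame names are standard Stretch frames, used here for illustration):

```python
# Look up where the grasp center sits relative to the mobile base
transform = self.get_tf('base_link', 'link_grasp_center')
t = transform.transform.translation
print(f"Grasp center is at ({t.x:.2f}, {t.y:.2f}, {t.z:.2f}) relative to base_link")
```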
##### `get_joint_state(joint_name, moving_threshold=0.001)`
Use this method to retrieve the joint state for a single joint. It returns a tuple of the joint's position, velocity, effort, and an `is_moving` boolean (determined by checking against the `moving_threshold` argument). For example:
```python
position, velocity, effort, is_moving = self.get_joint_state('joint_head_pan')
print(f"The head pan is {'' if is_moving else 'not '}moving")
```
##### `get_point_cloud()`
Use this method to retrieve the point cloud seen by the head camera as NumPy arrays. It returns a tuple of a structured array with named fields (e.g. `x`, `y`, `z`, `r`, `g`, `b`), an Nx3 array of XYZ points, the timestamp at which the cloud was captured, and the TF frame in which it was captured. For example:
```python
import numpy as np

# Returns (structured array, Nx3 XYZ array, capture timestamp, capture frame)
cloud, cloud_xyz, capture_time, capture_frame = self.get_point_cloud()
print(f"Head camera saw a cloud of size {cloud_xyz.shape} in frame {capture_frame} at {capture_time}")
# Head camera saw a cloud of size (275925, 3) in frame camera_color_optical_frame at 1695973195045439959

i = np.argmax(cloud['z'])
print(f"If the capture frame is camera_color_optical_frame (i.e. z axis points out from camera), the point furthest from the camera is {np.sqrt(cloud['x'][i]**2 + cloud['y'][i]**2 + cloud['z'][i]**2):.2f}m away and has the color {(cloud['r'][i], cloud['g'][i], cloud['b'][i])}")
# If the capture frame is camera_color_optical_frame (i.e. z axis points out from camera), the point furthest from the camera is 1.81m away and has the color (118, 121, 103)
```
##### `get_robot_floor_pose_xya(floor_frame='odom')`
Returns the current estimated x, y position and angle of the robot on the floor. This is typically called with respect to the `odom` frame or the `map` frame. x and y are in meters, and the angle is in radians.
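A minimal usage sketch; the return structure shown here (an `[x, y, angle]` list plus a capture timestamp) is an assumption for illustration:

```python
# Assumed return shape: ([x, y, angle], timestamp)
(x, y, angle), timestamp = self.get_robot_floor_pose_xya(floor_frame='odom')
print(f"Base is at x={x:.2f}m, y={y:.2f}m, heading={angle:.2f}rad")
```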