!!! note
    Please be advised that the code in this tutorial is currently in beta and is under active development.

# Autodocking with Nav Stack

Wouldn't it be awesome if, after a hard day's work, Stretch could just go charge itself without you having to worry about it? In this tutorial we explore experimental code that allows Stretch to locate a charging station and charge itself autonomously. This demo builds on top of some of the tutorials we have explored earlier, like [ArUco detection](https://docs.hello-robot.com/0.2/stretch-tutorials/ros1/aruco_marker_detection/), [base teleoperation](https://docs.hello-robot.com/0.2/stretch-tutorials/ros1/example_1/) and using the [Nav Stack](https://docs.hello-robot.com/0.2/stretch-tutorials/ros1/navigation_stack/). Be sure to check them out.
## Docking Station

The [Stretch Docking Station](https://github.com/hello-robot/stretch_tool_share/tree/master/tool_share/stretch_docking_station) is a Stretch accessory that allows the robot to charge itself autonomously. The top part of the docking station carries ArUco marker number 245 from the 6x6, 250 dictionary. To understand why this is important, refer to [this](https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html) handy guide provided by OpenCV. This marker enables Stretch to accurately locate the docking station in its environment. The docking station also has a Y-shaped recess in its base plate that guides the caster wheel of the robot towards the charging port in case of minor misalignments while backing up. Overall, it's a minimal yet effective way to allow Stretch to charge itself.
## Behavior Trees

Traditionally, high-level task planning has been achieved using Finite State Machines, or FSMs, which break each functional element of the task into states that are traversed cyclically to accomplish the overall task. This approach has recently gone out of vogue in favor of [Behavior Trees](https://robohub.org/introduction-to-behavior-trees/), or BTs. BTs, a special case of directed acyclic graphs, have been popularized through their use in the gaming industry to impart complex behaviors to NPCs. BTs organize behaviors in a tree representation where control flow is achieved not through cyclic state transitions but through tree traversal. Additionally, conditions and actions form distinct leaves in the tree, which results in better modularity. This ensures that augmenting behaviors with additional leaves does not require restructuring the preceding and following leaves, only the final tree graph. It also means that actions in complex behaviors can be developed independently, resulting in higher scalability.
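To make these traversal rules concrete, here is a minimal, library-free sketch of the two composite node types this demo relies on. The class and leaf names below are illustrative only, not the py-trees API:

```python
# Toy behavior tree composites (illustrative only, not the py-trees API).
SUCCESS, FAILURE = "SUCCESS", "FAILURE"

class Sequence:
    """Ticks children left to right; fails as soon as one child fails."""
    def __init__(self, *children):
        self.children = children

    def tick(self):
        for child in self.children:
            if child.tick() == FAILURE:
                return FAILURE
        return SUCCESS

class Fallback:
    """Ticks children left to right; succeeds as soon as one child succeeds."""
    def __init__(self, *children):
        self.children = children

    def tick(self):
        for child in self.children:
            if child.tick() == SUCCESS:
                return SUCCESS
        return FAILURE

class Leaf:
    """A condition or action node with a fixed, pre-baked result."""
    def __init__(self, name, result):
        self.name, self.result = name, result

    def tick(self):
        return self.result

# A condition fails, so the fallback runs its recovery action instead.
tree = Sequence(
    Fallback(Leaf("Predock found?", FAILURE), Leaf("Camera scan", SUCCESS)),
    Leaf("Move to predock", SUCCESS),
)
print(tree.tick())  # SUCCESS
```

Note that swapping in a new condition or action only touches the lines where the tree is assembled, which is the modularity argument made above.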
## Py-trees

We decided to implement this demo using the open-source behavior tree library [py-trees](https://py-trees.readthedocs.io/en/devel/introduction.html) because of the following features:

- Open-source
- Pythonic, for quicker adoption and turnaround
- Well-documented, enabling users to self-learn
- Scalable, working equally well on all levels of our software stack, including ROS 2
- Well-supported, allowing us to continue building on top of it for the foreseeable future
## Prerequisites

1. Since this demo uses the ROS Navigation Stack to navigate to the docking station, it requires a pregenerated map that can be used to localize the robot. To learn how to generate the map, refer to the [Nav Stack](https://docs.hello-robot.com/0.2/stretch-tutorials/ros1/navigation_stack/) tutorial.
2. To understand the underlying implementation, it is important to review the concept of behavior trees. Although this demo does not use some of their more useful features, such as a blackboard or tree visualization, a preliminary read on the concept should be sufficient to understand what's happening under the hood.
3. This demo requires the behavior tree library py-trees to be installed. To do this, execute the following command:

```{.bash .shell-prompt}
sudo apt install ros-noetic-py-trees-ros ros-noetic-rqt-py-trees
```
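Optionally, you can sanity-check the installation before launching anything. This short snippet is our own addition, not part of the demo; it just confirms that the two Python packages imported later in this tutorial can be found:

```python
# Check that the py-trees packages used later in this tutorial are importable.
import importlib.util

for module in ("py_trees", "py_trees_ros"):
    found = importlib.util.find_spec(module) is not None
    print(module, "ok" if found else "MISSING")
```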
Once you have the above covered, we are ready to set up the demo.
## Setup and Launch

The demo requires the docking station to rest against a wall, with the charger connected at the back and the arm stowed for safety. The robot also needs to start close to the origin of the map to have the correct pose estimate at startup. If not, the pose estimate will have to be supplied manually using the `2D Pose Estimate` button in RViz as soon as the demo starts.

Let's stow the arm first:

```{.bash .shell-prompt}
stretch_robot_stow.py
```

To launch the demo, execute the following command:

```{.bash .shell-prompt}
roslaunch stretch_demos autodocking.launch map_yaml:=${HELLO_FLEET_PATH}/maps/<map_name>.yaml
```

<p align="center">
<img src="https://user-images.githubusercontent.com/97639181/224141937-6024cbb5-994b-4d15-83c7-bddbfaaee08f.gif" width="400">
</p>
## How It Works

Below is a simplified version of the behavior tree we implemented for this demo. Please be advised that the actual implementation has minor variations, but the image below should serve well to understand the control flow. Before moving ahead, we recommend supplementing this tutorial with a read on the concept of behavior trees. Let's dig in!

The root of our tree is the `sequence` node (right-pointing arrow) below the `repeat` decorator node (circular arrow). This sequence node succeeds only when all of its children succeed; otherwise, it returns failure.

<p align="center">
<img src="https://user-images.githubusercontent.com/97639181/224143721-71ed392e-3e3f-47b8-a6b4-757da2dfefe5.png" width="600">
</p>
It has the `fallback` node (question mark) on the left as its first child. As per the rules of tree traversal, this fallback node is the first to be executed. In turn, this fallback node has two children - the `Predock found?` condition node and the `Camera scan` action node. The `Predock found?` condition node is a subscriber that waits for the predock pose to be published on a topic called `/predock_pose`. At the start of the demo, we expect this to fail, as the robot does not yet know where the pose is. This triggers the `Camera scan` action node, an action client for the `ArucoHeadScan` action server that detects the docking station's ArUco marker. If this action node succeeds, it publishes the predock pose and the next child of the sequence node is ticked.
The second child of the sequence node is again a `fallback` node with two children - the `At predock?` condition node and the `Move to predock` action node. The `At predock?` condition node is simply a TF lookup, wrapped in a behavior tree class called `CheckTF`, that checks if the `base_link` frame is aligned with the `predock_pose` frame. We expect this to fail initially, as the robot needs to travel to the predock pose for this condition to be met. This triggers the `Move to predock` action node, which is an action client for the MoveBase action server from the Nav Stack. This action client passes the predock pose as the goal to the robot. If this action succeeds, the robot navigates to the predock pose and the next child of the root node is triggered.
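The `At predock?` check boils down to comparing the robot's current pose against a goal pose within some tolerance. As a rough, self-contained illustration - the 2D simplification, function name, and tolerance values here are our own assumptions, not the actual `CheckTF` implementation:

```python
import math

def at_pose(x, y, yaw, gx, gy, gyaw, pos_tol=0.05, yaw_tol=0.1):
    """Return True if pose (x, y, yaw) is within tolerance of (gx, gy, gyaw).

    Tolerances are illustrative values, not the demo's real ones.
    """
    pos_err = math.hypot(gx - x, gy - y)
    # Wrap the yaw difference to [-pi, pi] before comparing
    yaw_err = abs(math.atan2(math.sin(gyaw - yaw), math.cos(gyaw - yaw)))
    return pos_err < pos_tol and yaw_err < yaw_tol

print(at_pose(0.0, 0.0, 0.0, 0.02, 0.01, 0.05))  # True: within both tolerances
```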
The third child of the root node is the `Move to dock` action node. This is a simple error-based controller wrapped in a behavior tree class called `VisualServoing`. Its operation is explained in the image below. This controller enables the robot to back up and align itself to the docking station in case the navigation stack introduces errors in getting to the predock pose.

<p align="center">
<img src="https://user-images.githubusercontent.com/97639181/224143937-22c302e4-5fd0-4c7e-97e0-b56fc6f40217.png" width="600">
</p>
The fourth and final child of the sequence node is another `fallback` node with two children - the `Charging?` condition node and the `Move to predock` action node with an `inverter` decorator node (+/- sign). The `Charging?` condition node is a subscriber that checks if the `present` attribute of the `BatteryState` message is True. If the robot has backed up correctly into the docking station and the charging port has latched, this node should return SUCCESS and the autodocking succeeds. If not, the robot moves back to the predock pose through the `Move to predock` action node and tries again.
## Code Breakdown

Let's jump into the code to see how things work under the hood. Follow along [here]() (TODO after merge) to have a look at the entire script.

We start off by importing the dependencies. The ones of interest are those relating to py-trees and the various behavior classes in `autodocking.autodocking_behaviours`, namely `MoveBaseActionClient`, `CheckTF` and `VisualServoing`. We also created custom ROS action messages for the `ArucoHeadScan` action, defined in the action directory of the `stretch_demos` package.
```python
import py_trees
import py_trees_ros
import py_trees.console as console
import rospy
import sys
import functools
from autodocking.autodocking_behaviours import MoveBaseActionClient
from autodocking.autodocking_behaviours import CheckTF
from autodocking.autodocking_behaviours import VisualServoing
from stretch_core.msg import ArucoHeadScanAction, ArucoHeadScanGoal
from geometry_msgs.msg import Pose
from sensor_msgs.msg import BatteryState
import hello_helpers.hello_misc as hm
```
The main class of this script is the `AutodockingBT` class, which is a subclass of `HelloNode`.

```python
class AutodockingBT(hm.HelloNode):
```
The `create_root()` method is where we construct the autodocking behavior tree. As seen in the figure above, the root node of the behavior tree is a sequence node called `autodocking_seq_root`. This sequence node executes its child nodes sequentially until either all of them succeed or one of them fails. It begins by executing its first child node, called `dock_found_fb`.

The `dock_found_fb` node is a fallback node, which starts executing from its left-most child node and only executes the following child node if the one preceding it fails. This is useful for executing recovery behaviors in case a required condition is not met. Similarly, `at_predock_fb` and `charging_fb` are also fallback nodes.
```python
def create_root(self):
    # behaviours
    autodocking_seq_root = py_trees.composites.Sequence("autodocking")
    dock_found_fb = py_trees.composites.Selector("dock_found_fb")
    at_predock_fb = py_trees.composites.Selector("at_predock_fb")
    charging_fb = py_trees.composites.Selector("charging_fb")
```
The `predock_found_sub` node is a behavior node that is a child of the `dock_found_fb` fallback node. It subscribes to the `/predock_pose` topic to check for incoming messages and returns SUCCESS when a predock pose is being published. At the start of the demo, since the robot likely does not have the docking station in its view, no messages are received on this topic. The fallback to this condition is to scan the area using the head camera. The `head_scan_action` action node sends a goal to the `ArucoHeadScan` server through `ArucoHeadScanGoal()` to look for marker number 245 at a camera tilt angle of -0.68 radians. If this action returns SUCCESS, we start receiving the predock pose.
```python
predock_found_sub = py_trees_ros.subscribers.CheckData(
    name="predock_found_sub?",
    topic_name='/predock_pose',
    expected_value=None,
    topic_type=Pose,
    fail_if_no_data=True,
    fail_if_bad_comparison=False)

aruco_goal = ArucoHeadScanGoal()
aruco_goal.aruco_id = 245
aruco_goal.tilt_angle = -0.68
aruco_goal.publish_to_map = True
aruco_goal.fill_in_blindspot_with_second_scan = False
aruco_goal.fast_scan = False

# Publishes the predock pose to the /predock_pose topic and a tf frame called /predock_pose
head_scan_action = py_trees_ros.actions.ActionClient(
    name="ArucoHeadScan",
    action_namespace="ArucoHeadScan",
    action_spec=ArucoHeadScanAction,
    action_goal=aruco_goal,
    override_feedback_message_on_running="rotating"
)
```
Next, we want to move to the predock pose. We do this by passing the predock pose as a goal to the Move Base action server using `predock_action`. This is followed by the `dock_action` action node, which uses a mock visual servoing controller to back up into the docking station. This action uses the predock pose to align the robot to the docking station. Internally, it publishes Twist messages on the `/stretch/cmd_vel` topic after computing the linear and angular velocities based on the positional and angular errors, as defined by the simple controller in the image above.
```python
predock_action = MoveBaseActionClient(
    self.tf2_buffer,
    name="predock_action",
    override_feedback_message_on_running="moving"
)
invert_predock = py_trees.decorators.SuccessIsFailure(name='invert_predock', child=predock_action)
dock_action = VisualServoing(
    name='dock_action',
    source_frame='docking_station',
    target_frame='charging_port',
    override_feedback_message_on_running="docking"
)
```
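The actual controller is implemented in the `VisualServoing` class of `autodocking.autodocking_behaviours`; the sketch below only illustrates the kind of error-based control described above. The gains, stopping threshold, and 2D frame simplification are our own assumptions, not the demo's real values:

```python
import math

def servo_to_dock(dx, dy, heading_err, k_lin=0.5, k_ang=1.0):
    """Toy proportional controller for backing up toward a dock.

    dx, dy:      position of the charging target in the robot's base frame (m)
    heading_err: angular misalignment with the dock axis (rad)
    Returns (linear, angular) velocities for a Twist-like command.
    """
    distance = math.hypot(dx, dy)
    if distance < 0.02:          # close enough: stop
        return 0.0, 0.0
    linear = -k_lin * distance   # negative, because the robot backs up
    angular = -k_ang * heading_err
    return linear, angular

# Backs up while turning to null out a small heading error
print(servo_to_dock(dx=0.3, dy=0.0, heading_err=0.1))
```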
Finally, we define the `is_charging_sub` behavior node which, like `predock_found_sub`, subscribes to the `/battery` topic and checks for the `present` attribute of the `BatteryState` message to turn True. If this behavior node returns SUCCESS, the root node returns SUCCESS as well.
```python
is_charging_sub = py_trees_ros.subscribers.CheckData(
    name="battery_charging?",
    topic_name='/battery',
    variable_name='present',
    expected_value=True,
    topic_type=BatteryState,
    fail_if_no_data=True,
    fail_if_bad_comparison=True)
```
Once we have defined the behavior nodes, the behavior tree can be constructed using the `add_child()` or `add_children()` methods. The root node is then returned to the caller.
```python
autodocking_seq_root.add_children([dock_found_fb, at_predock_fb, dock_action, charging_fb])
dock_found_fb.add_children([predock_found_sub, head_scan_action])
at_predock_fb.add_children([predock_action])
charging_fb.add_children([is_charging_sub, invert_predock])
return autodocking_seq_root
```
The `main()` method is where the behavior tree is ticked. First, we create an instance of the `BehaviourTree` class using the root of the tree we created in the `create_root()` method. The `tick_tock()` method then ticks the behavior nodes in order until the root returns either SUCCESS or FAILURE.
```python
def main(self):
    """
    Entry point for the demo script.
    """
    hm.HelloNode.main(self, 'autodocking', 'autodocking')

    root = self.create_root()
    self.behaviour_tree = py_trees_ros.trees.BehaviourTree(root)
    rospy.on_shutdown(functools.partial(self.shutdown, self.behaviour_tree))
    if not self.behaviour_tree.setup(timeout=15):
        console.logerror("failed to setup the tree, aborting.")
        sys.exit(1)

    def print_tree(tree):
        print(py_trees.display.unicode_tree(root=tree.root, show_status=True))

    try:
        self.behaviour_tree.tick_tock(
            500
            # period_ms=500,
            # number_of_iterations=py_trees.trees.CONTINUOUS_TICK_TOCK,
            # pre_tick_handler=None,
            # post_tick_handler=print_tree
        )
    except KeyboardInterrupt:
        self.behaviour_tree.interrupt()
```
## Results and Expectations

This demo serves as an experimental setup to explore self-charging with Stretch. Please be advised that this code is not expected to work perfectly. Some of the shortcomings of the demo include:

- ArUco detection fails often, and users might need to experiment with different docking station locations and lighting conditions for better results
- The controller implementation is not robust to an erroneous predock pose supplied by the camera, or to friction introduced by floor surfaces like carpet
- The current design of the docking station is minimal; we recommend that users find ways to fix the station to the floor to prevent it from moving while docking

Users are encouraged to try this demo and submit improvements.