Next, in a new terminal, run the stretch ArUco launch file, which will bring up the [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) node.
```{.bash .shell-prompt}
roslaunch stretch_core stretch_aruco.launch
```
Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/aruco_detector_example.rviz) with the topics for the transform frames in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
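A sketch of that visualization command, assuming the tutorials package is cloned under `~/catkin_ws/src` (adjust the path to match your workspace):

```{.bash .shell-prompt}
rosrun rviz rviz -d ~/catkin_ws/src/stretch_tutorials/rviz/aruco_detector_example.rviz
```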
You are going to need to teleoperate Stretch's head to detect the ArUco marker tags. Run the following command in a new terminal and control the head to point the camera toward the markers.
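One option is the keyboard teleoperation node from `stretch_core`, which includes keys for panning and tilting the head (a sketch; any teleoperation method that moves the head will do):

```{.bash .shell-prompt}
rosrun stretch_core keyboard_teleop
```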
The goal of this example is to give you an enhanced understanding of how to control the mobile base.
Begin by running the following command in a new terminal.
```{.bash .shell-prompt}
roslaunch stretch_core stretch_driver.launch
```
Switch to `navigation` mode using a rosservice call. Then, in a new terminal, drive the robot forward with the [move.py](https://github.com/hello-robot/stretch_tutorials/tree/noetic/src/move.py) node.
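A minimal sketch of those two steps, assuming the `/switch_to_navigation_mode` service exposed by the stretch driver and the tutorials cloned under `catkin_ws/src`:

```{.bash .shell-prompt}
rosservice call /switch_to_navigation_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 move.py
```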
After saving the edited node, bring up [Stretch in the empty world simulation](gazebo_basics.md). To drive the robot with the node, type the following in a new terminal:
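For example, assuming the tutorials live under `catkin_ws/src`:

```{.bash .shell-prompt}
cd catkin_ws/src/stretch_tutorials/src/
python3 move.py
```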
For the tf2 static broadcaster node, we will be publishing three child static frames.
Begin by starting up the stretch driver launch file.
```{.bash .shell-prompt}
roslaunch stretch_core stretch_driver.launch
```
Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/tf2_broadcaster_example.rviz) with the topics for the transform frames in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
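That command is along these lines, assuming the linked config sits at the path below:

```{.bash .shell-prompt}
rosrun rviz rviz -d ~/catkin_ws/src/stretch_tutorials/rviz/tf2_broadcaster_example.rviz
```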
Then run the [tf2_broadcaster.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/tf2_broadcaster.py) node to visualize three static frames. In a new terminal, execute:
```{.bash .shell-prompt}
cd catkin_ws/src/stretch_tutorials/src/
python3 tf2_broadcaster.py
```
Begin by starting up the stretch driver launch file.
```{.bash .shell-prompt}
roslaunch stretch_core stretch_driver.launch
```
To activate the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/) and publish topics to be visualized, run the following launch file in a new terminal.
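A hedged sketch of that step; the exact launch file name (`d435i_low_resolution.launch` here) is an assumption, so check `stretch_core/launch` for the D435i file your install provides:

```{.bash .shell-prompt}
roslaunch stretch_core d435i_low_resolution.launch
```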
Then run the [pointcloud_transformer.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/pointcloud_transformer.py) node. In a new terminal, execute:
```{.bash .shell-prompt}
cd catkin_ws/src/stretch_tutorials/src/
python3 pointcloud_transformer.py
```
Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/PointCloud_transformer_example.rviz) with the `PointCloud` in the Display tree. You can visualize this topic and the robot model by running the command below in a new terminal.
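For instance, assuming the config is at the linked path in your workspace:

```{.bash .shell-prompt}
rosrun rviz rviz -d ~/catkin_ws/src/stretch_tutorials/rviz/PointCloud_transformer_example.rviz
```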
Next, run the stretch ArUco launch file, which will bring up the [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) node. In a new terminal, execute:
```{.bash .shell-prompt}
roslaunch stretch_core stretch_aruco.launch
```
Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/aruco_detector_example.rviz) with the topics for the transform frames in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
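As before, a sketch assuming the default workspace layout:

```{.bash .shell-prompt}
rosrun rviz rviz -d ~/catkin_ws/src/stretch_tutorials/rviz/aruco_detector_example.rviz
```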
Then run the [aruco_tag_locator.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/aruco_tag_locator.py) node. In a new terminal, execute:
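A sketch of that step, assuming the tutorials are cloned under `catkin_ws/src`:

```{.bash .shell-prompt}
cd catkin_ws/src/stretch_tutorials/src/
python3 aruco_tag_locator.py
```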
Now we are going to use a node to send a move_base goal half a meter in front of the map's origin. Run the following command in a new terminal to execute the [navigation.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/navigation.py) node.
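For example, assuming the node is run straight from the tutorials source directory:

```{.bash .shell-prompt}
cd catkin_ws/src/stretch_tutorials/src/
python3 navigation.py
```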
Knowing the orientation of the LiDAR allows us to filter the scan values for a desired range.
First, open a terminal and run the stretch driver launch file.
```{.bash .shell-prompt}
roslaunch stretch_core stretch_driver.launch
```
Then, in a new terminal, run the `rplidar.launch` file from `stretch_core`.
```{.bash .shell-prompt}
roslaunch stretch_core rplidar.launch
```
To filter the LiDAR scans for ranges that are directly in front of Stretch (width of 1 meter), run the [scan_filter.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/scan_filter.py) node by typing the following in a new terminal.
```{.bash .shell-prompt}
cd catkin_ws/src/stretch_tutorials/src/
python3 scan_filter.py
```
Then run the following command in a separate terminal to bring up a simple RViz configuration of the Stretch robot.
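A sketch of that command; the config file name (`stretch_simple_test.rviz`) is an assumption, so substitute whichever RViz config you prefer:

```{.bash .shell-prompt}
rosrun rviz rviz -d ~/catkin_ws/src/stretch_tutorials/rviz/stretch_simple_test.rviz
```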
This example aims to combine the two previous examples and have Stretch utilize its LiDAR scan data to avoid collisions.
Begin by running the following command in a new terminal.
```{.bash .shell-prompt}
roslaunch stretch_core stretch_driver.launch
```
Then, in a new terminal, type the following to activate the LiDAR sensor.
```{.bash .shell-prompt}
roslaunch stretch_core rplidar.launch
```
To set `navigation` mode and to activate the [avoider.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/avoider.py) node, type the following in a new terminal.
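A minimal sketch, assuming the driver's `/switch_to_navigation_mode` service and the default workspace layout:

```{.bash .shell-prompt}
rosservice call /switch_to_navigation_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 avoider.py
```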
The `rviz` flag will open an RViz window to visualize a variety of ROS topics. In a new terminal, run the following commands to execute the [marker.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/marker.py) node.
```{.bash .shell-prompt}
cd catkin_ws/src/stretch_tutorials/src/
python3 marker.py
```
```python
while not rospy.is_shutdown():
    rate.sleep()
```
This loop is a fairly standard rospy construct: checking the `rospy.is_shutdown()` flag and then doing work. You have to check `is_shutdown()` to know whether your program should exit (e.g. after a `Ctrl+c`). The loop calls `rate.sleep()`, which sleeps just long enough to maintain the desired rate through the loop.
Begin by starting up the stretch driver launch file.
```{.bash .shell-prompt}
roslaunch stretch_core stretch_driver.launch
```
You can then hit the run-stop button (you should hear a beep and see the LED in the button start blinking) and move the robot's joints to a desired configuration. Once you are satisfied with the configuration, hold the run-stop button until you hear a beep. Then run the following command to execute the [joint_state_printer.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/joint_state_printer.py) node, which will print the joint positions of the lift, arm, and wrist. In a new terminal, execute:
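A sketch, assuming the tutorials are under `catkin_ws/src`:

```{.bash .shell-prompt}
cd catkin_ws/src/stretch_tutorials/src/
python3 joint_state_printer.py
```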
In this example, we will review a Python script that prints and stores the effort values of Stretch's joints.
Begin by running the following command in a terminal.
```{.bash .shell-prompt}
roslaunch stretch_core stretch_driver.launch
```
Switch to `position` mode using a rosservice call. Then run the [effort_sensing.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/effort_sensing.py) node. In a new terminal, execute:
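A minimal sketch of both steps, assuming the driver's `/switch_to_position_mode` service:

```{.bash .shell-prompt}
rosservice call /switch_to_position_mode
cd catkin_ws/src/stretch_tutorials/src/
python3 effort_sensing.py
```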
Begin by running the stretch driver launch file.
```{.bash .shell-prompt}
roslaunch stretch_core stretch_driver.launch
```
To activate the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/) and publish topics to be visualized, run the following launch file in a new terminal.
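As with the other camera examples, a hedged sketch (the `d435i_low_resolution.launch` file name is an assumption):

```{.bash .shell-prompt}
roslaunch stretch_core d435i_low_resolution.launch
```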
Within this tutorial package, there is an RViz config file with the topics for perception already in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
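A sketch of that command; the config file name (`perception_example.rviz`) is an assumption borrowed from the perception tutorial:

```{.bash .shell-prompt}
rosrun rviz rviz -d ~/catkin_ws/src/stretch_tutorials/rviz/perception_example.rviz
```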
In this section, we will use a Python node to capture an image from the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/). Execute the [capture_image.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/capture_image.py) node to save a `.jpeg` image from the image topic `/camera/color/image_raw_upright_view`. In a terminal, execute:
```{.bash .shell-prompt}
cd ~/catkin_ws/src/stretch_tutorials/src
python3 capture_image.py
```
Give control to ROS. This will allow the callback to be called whenever new messages arrive.
## Edge Detection
In this section, we highlight a node that utilizes the [Canny Edge filter](https://www.geeksforgeeks.org/python-opencv-canny-function/) algorithm to detect the edges in an image and publish the result as a ROS image to be visualized in RViz. In a terminal, execute:
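A sketch of that step; the node file name (`edge_detection.py`) is an assumption, so check the tutorials `src` directory for the exact name:

```{.bash .shell-prompt}
cd ~/catkin_ws/src/stretch_tutorials/src
python3 edge_detection.py
```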
Begin by running the following command in a new terminal.
```{.bash .shell-prompt}
roslaunch stretch_core stretch_driver.launch
```
Switch to `position` mode using a rosservice call. Then run the `respeaker.launch` file. In a new terminal, execute:
```{.bash .shell-prompt}
rosservice call /switch_to_position_mode
roslaunch stretch_core respeaker.launch
```
Then run the [voice_teleoperation_base.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/voice_teleoperation_base.py) node in a new terminal.
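For example, assuming the default workspace layout:

```{.bash .shell-prompt}
cd catkin_ws/src/stretch_tutorials/src/
python3 voice_teleoperation_base.py
```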
Begin by starting up the stretch driver launch file.
```{.bash .shell-prompt}
roslaunch stretch_core stretch_driver.launch
```
Then utilize the ROS command-line tool [rostopic](http://wiki.ros.org/rostopic) to display Stretch's internal state information. For instance, to view the current state of the robot's joints, type the following in a new terminal.
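A sketch of that command:

```{.bash .shell-prompt}
rostopic echo /joint_states
```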
Let's say you are interested in only seeing the `header` component of the `/joint_states` topic. You can output this with the rostopic command-line tool by typing the following command.
```{.bash .shell-prompt}
rostopic echo /joint_states/header -n1
```
Additionally, if you were to type `rostopic echo /` in the terminal, then press your keyboard's `Tab` key, you will see a list of all active topics.
A powerful tool to visualize ROS communication is the ROS [rqt_graph package](http://wiki.ros.org/rqt_graph). By typing the following in a new terminal, you can see a graph of topics being communicated between nodes.
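The standard invocation is:

```{.bash .shell-prompt}
rosrun rqt_graph rqt_graph
```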
It is also possible to send 2D Pose Estimates and Nav Goals programmatically.
## Running in Simulation
To perform mapping and navigation in the Gazebo simulation of Stretch, substitute the `mapping_gazebo.launch` and `navigation_gazebo.launch` files into the commands above. The default Gazebo environment is the Willow Garage HQ. Use the `world` ROS argument to specify the Gazebo world within which to spawn Stretch.
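For instance, a hedged sketch (the `stretch_navigation` package name and the world path below are assumptions; reuse the package from the commands above):

```{.bash .shell-prompt}
roslaunch stretch_navigation mapping_gazebo.launch world:=worlds/willowgarage.world
```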
The mapping launch files, `mapping.launch` and `mapping_gazebo.launch`, expose the ROS argument `teleop_type`. By default, this ROS argument is set to `keyboard`, which launches keyboard teleop in the terminal. If the Xbox controller that ships with Stretch is plugged into your computer, the following command will launch mapping with joystick teleop:
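A sketch of that command, assuming the launch files live in the `stretch_navigation` package:

```{.bash .shell-prompt}
roslaunch stretch_navigation mapping.launch teleop_type:=joystick
```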
The Stretch robot is equipped with the [Intel RealSense D435i camera](https://www.intelrealsense.com/depth-camera-d435i/).
Begin by running the stretch driver launch file.
```{.bash .shell-prompt}
roslaunch stretch_core stretch_driver.launch
```
To activate the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/) and publish topics to be visualized, run the following launch file in a new terminal.
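Again, a hedged sketch (the exact D435i launch file name is an assumption):

```{.bash .shell-prompt}
roslaunch stretch_core d435i_low_resolution.launch
```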
Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/perception_example.rviz) with the topics for perception already in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
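For example, assuming the linked config sits at the path below:

```{.bash .shell-prompt}
rosrun rviz rviz -d ~/catkin_ws/src/stretch_tutorials/rviz/perception_example.rviz
```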
For full-body teleoperation with the keyboard, you first need to run `stretch_driver.launch` in a terminal.
```{.bash .shell-prompt}
roslaunch stretch_core stretch_driver.launch
```
Then, in a new terminal, type the following command:
```{.bash .shell-prompt}
rosrun stretch_core keyboard_teleop
```
To stop the node from sending twist messages, press `Ctrl` + `c` in the terminal.
Begin by running the following command in your terminal:
```{.bash .shell-prompt}
roslaunch stretch_core stretch_driver.launch
```
To teleoperate Stretch's mobile base with the keyboard, you first need to switch to `navigation` mode so the robot can receive `Twist` messages. This is done using a rosservice call in a new terminal. In the same terminal, run the teleop_twist_keyboard node with an argument remapping the `cmd_vel` topic name to `stretch/cmd_vel`.
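Concretely, that looks like the following, assuming the driver exposes the `/switch_to_navigation_mode` service:

```{.bash .shell-prompt}
rosservice call /switch_to_navigation_mode
rosrun teleop_twist_keyboard teleop_twist_keyboard.py cmd_vel:=stretch/cmd_vel
```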
### Keyboard Teleoperating: Mobile Base
For keyboard teleoperation of Stretch's mobile base, first [start up Stretch in simulation](gazebo_basics.md). Then run the following command in a new terminal.
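A hedged sketch; the `teleop_keyboard.launch` file name in `stretch_gazebo` is an assumption:

```{.bash .shell-prompt}
roslaunch stretch_gazebo teleop_keyboard.launch
```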
The same keyboard commands will be presented to a user to move the robot.
### Xbox Controller Teleoperating
An alternative for robot base teleoperation is to use an Xbox controller. Stop the keyboard teleoperation node by typing `Ctrl` + `c` in the terminal where the command was executed. Then connect the Xbox controller device to your local machine and run the following command.
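A hedged sketch; the `teleop_joy.launch` file name in `stretch_gazebo` is an assumption:

```{.bash .shell-prompt}
roslaunch stretch_gazebo teleop_joy.launch
```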