
Update bash prompt sign and copy

pull/16/head
hello-chintan committed 1 year ago
commit 6bf180572b
24 changed files with 92 additions and 92 deletions
1. ros1/aruco_marker_detection.md (+5, -5)
2. ros1/example_1.md (+3, -3)
3. ros1/example_10.md (+4, -5)
4. ros1/example_11.md (+4, -4)
5. ros1/example_12.md (+5, -5)
6. ros1/example_13.md (+2, -2)
7. ros1/example_2.md (+4, -4)
8. ros1/example_3.md (+3, -3)
9. ros1/example_4.md (+3, -3)
10. ros1/example_5.md (+2, -2)
11. ros1/example_6.md (+3, -3)
12. ros1/example_7.md (+7, -7)
13. ros1/example_8.md (+2, -2)
14. ros1/example_9.md (+3, -3)
15. ros1/follow_joint_trajectory.md (+2, -2)
16. ros1/gazebo_basics.md (+3, -3)
17. ros1/getting_started.md (+1, -0)
18. ros1/internal_state_of_stretch.md (+4, -4)
19. ros1/moveit_basics.md (+4, -4)
20. ros1/navigation_stack.md (+8, -8)
21. ros1/perception.md (+3, -3)
22. ros1/respeaker_microphone_array.md (+6, -6)
23. ros1/rviz_basics.md (+3, -3)
24. ros1/teleoperating_stretch.md (+8, -8)
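Every hunk below applies the same mechanical substitution: the plain "```bash" opening fence becomes "```{.bash .shell-prompt}", a Pandoc-style attribute list that a docs theme can style with a prompt sign. As a hypothetical sketch (the fixture path and the one-liner are illustrative, not part of this commit; the real commit also hand-edited a few surrounding lines), the bulk of the change could be reproduced with GNU `sed`:

```shell
# Illustrative only: reproduce the fence rewrite on a throwaway fixture.
mkdir -p /tmp/fence_demo/ros1
printf '```bash\nroslaunch stretch_core stretch_driver.launch\n```\n' \
  > /tmp/fence_demo/ros1/example.md
# Rewrite every plain bash opening fence in-place (GNU sed).
find /tmp/fence_demo/ros1 -name '*.md' -print0 |
  xargs -0 sed -i 's/^```bash$/```{.bash .shell-prompt}/'
head -n1 /tmp/fence_demo/ros1/example.md   # prints: ```{.bash .shell-prompt}
```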

ros1/aruco_marker_detection.md (+5, -5)

@@ -4,31 +4,31 @@ For this tutorial, we will go over how to detect Stretch's ArUco markers and rev
 ### Visualize ArUco Markers in RViz
 Begin by running the stretch driver launch file.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 To activate the RealSense camera and publish topics to be visualized, run the following launch file in a new terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core d435i_low_resolution.launch
 ```
 Next, in a new terminal, run the stretch ArUco launch file which will bring up the [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) node.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_aruco.launch
 ```
 Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/aruco_detector_example.rviz) with the topics for the transform frames in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
-```bash
+```{.bash .shell-prompt}
 rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/aruco_detector_example.rviz
 ```
 You are going to need to teleoperate Stretch's head to detect the ArUco marker tags. Run the following command in a new terminal and control the head to point the camera toward the markers.
-```bash
+```{.bash .shell-prompt}
 rosrun stretch_core keyboard_teleop
 ```

ros1/example_1.md (+3, -3)

@@ -7,13 +7,13 @@ The goal of this example is to give you an enhanced understanding of how to cont
 Begin by running the following command in a new terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 Switch to `navigation` mode using a rosservice call. Then, in a new terminal, drive the robot forward with the [move.py](https://github.com/hello-robot/stretch_tutorials/tree/noetic/src/move.py) node.
-```bash
+```{.bash .shell-prompt}
 rosservice call /switch_to_navigation_mode
 cd catkin_ws/src/stretch_tutorials/src/
 python3 move.py
@@ -153,7 +153,7 @@ self.pub = rospy.Publisher('/stretch_diff_drive_controller/cmd_vel', Twist, queu
 After saving the edited node, bring up [Stretch in the empty world simulation](gazebo_basics.md). To drive the robot with the node, type the following in a new terminal
-```bash
+```{.bash .shell-prompt}
 cd catkin_ws/src/stretch_tutorials/src/
 python3 move.py
 ```

ros1/example_10.md (+4, -5)

@@ -6,19 +6,19 @@ For the tf2 static broadcaster node, we will be publishing three child static fr
 Begin by starting up the stretch driver launch file.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/tf2_broadcaster_example.rviz) with the topics for the transform frames in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
-```bash
+```{.bash .shell-prompt}
 rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/tf2_broadcaster_example.rviz
 ```
 Then run the [tf2_broadcaster.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/tf2_broadcaster.py) node to visualize three static frames. In a new terminal, execute:
-```bash
+```{.bash .shell-prompt}
 cd catkin_ws/src/stretch_tutorials/src/
 python3 tf2_broadcaster.py
 ```
@@ -34,8 +34,7 @@ The GIF below visualizes what happens when running the previous node.
 In a terminal, execute:
-```bash
-# Terminal 4
+```{.bash .shell-prompt}
 cd catkin_ws/src/stretch_tutorials/src/
 python3 stow_command.py
 ```

ros1/example_11.md (+4, -4)

@@ -4,24 +4,24 @@ This tutorial highlights how to create a [PointCloud](http://docs.ros.org/en/mel
 Begin by starting up the stretch driver launch file.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 To activate the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/) and publish topics to be visualized, run the following launch file in a new terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core d435i_low_resolution.launch
 ```
 Then run the [pointCloud_transformer.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/pointcloud_transformer.py) node. In a new terminal, execute:
-```bash
+```{.bash .shell-prompt}
 cd catkin_ws/src/stretch_tutorials/src/
 python3 pointcloud_transformer.py
 ```
 Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/PointCloud_transformer_example.rviz) with the `PointCloud` in the Display tree. You can visualize this topic and the robot model by running the command below in a new terminal.
-```bash
+```{.bash .shell-prompt}
 rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/PointCloud_transformer_example.rviz
 ```

ros1/example_12.md (+5, -5)

@@ -17,31 +17,31 @@ Below is what needs to be included in the [stretch_marker_dict.yaml](https://git
 ## Getting Started
 Begin by running the stretch driver launch file.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 To activate the RealSense camera and publish topics to be visualized, run the following launch file in a new terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core d435i_high_resolution.launch
 ```
 Next, run the stretch ArUco launch file which will bring up the [detect_aruco_markers](https://github.com/hello-robot/stretch_ros/blob/master/stretch_core/nodes/detect_aruco_markers) node. In a new terminal, execute:
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_aruco.launch
 ```
 Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/aruco_detector_example.rviz) with the topics for the transform frames in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
-```bash
+```{.bash .shell-prompt}
 rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/aruco_detector_example.rviz
 ```
 Then run the [aruco_tag_locator.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/aruco_tag_locator.py) node. In a new terminal, execute:
-```bash
+```{.bash .shell-prompt}
 cd catkin_ws/src/stretch_tutorials/src/
 python3 aruco_tag_locator.py
 ```

ros1/example_13.md (+2, -2)

@@ -7,7 +7,7 @@ First, begin by building a map of the space the robot will be navigating in. If
 ## Getting Started
 Next, with your created map, we can navigate the robot around the mapped space. Run:
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_navigation navigation.launch map_yaml:=${HELLO_FLEET_PATH}/maps/<map_name>.yaml
 ```
@@ -22,7 +22,7 @@ Where `${HELLO_FLEET_PATH}` is the path of the `.yaml` file.
 Now we are going to use a node to send a move_base goal half a meter in front of the map's origin. run the following command in a new terminal to execute the [navigation.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/navigation.py) node.
-```bash
+```{.bash .shell-prompt}
 cd catkin_ws/src/stretch_tutorials/src/
 python3 navigation.py
 ```

ros1/example_2.md (+4, -4)

@@ -37,26 +37,26 @@ Knowing the orientation of the LiDAR allows us to filter the scan values for a d
 First, open a terminal and run the stretch driver launch file.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 Then in a new terminal run the `rplidar.launch` file from `stretch_core`.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core rplidar.launch
 ```
 To filter the lidar scans for ranges that are directly in front of Stretch (width of 1 meter) run the [scan_filter.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/scan_filter.py) node by typing the following in a new terminal.
-```bash
+```{.bash .shell-prompt}
 cd catkin_ws/src/stretch_tutorials/src/
 python3 scan_filter.py
 ```
 Then run the following command in a separate terminal to bring up a simple RViz configuration of the Stretch robot.
-```bash
+```{.bash .shell-prompt}
 rosrun rviz rviz -d `rospack find stretch_core`/rviz/stretch_simple_test.rviz
 ```

ros1/example_3.md (+3, -3)

@@ -3,19 +3,19 @@ This example aims to combine the two previous examples and have Stretch utilize
 Begin by running the following command in a new terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 Then, in a new terminal, type the following to activate the LiDAR sensor.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core rplidar.launch
 ```
 To set `navigation` mode and to activate the [avoider.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/avoider.py) node, type the following in a new terminal.
-```bash
+```{.bash .shell-prompt}
 rosservice call /switch_to_navigation_mode
 cd catkin_ws/src/stretch_tutorials/src/
 python3 avoider.py

ros1/example_4.md (+3, -3)

@@ -6,13 +6,13 @@
 Let's bring up Stretch in the Willow Garage world from the [gazebo basics tutorial](gazebo_basics.md) and RViz by using the following command.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_gazebo gazebo.launch world:=worlds/willowgarage.world rviz:=true
 ```
 The `rviz` flag will open an RViz window to visualize a variety of ROS topics. In a new terminal, run the following commands to execute the [marker.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/marker.py) node.
-```bash
+```{.bash .shell-prompt}
 cd catkin_ws/src/stretch_tutorials/src/
 python3 marker.py
 ```
@@ -180,4 +180,4 @@ while not rospy.is_shutdown():
 rate.sleep()
 ```
-This loop is a fairly standard rospy construct: checking the `rospy.is_shutdown()` flag and then doing work. You have to check `is_shutdown()` to check if your program should exit (e.g. if there is a `Ctrl-c` or otherwise). The loop calls `rate.sleep()`, which sleeps just long enough to maintain the desired rate through the loop.
+This loop is a fairly standard rospy construct: checking the `rospy.is_shutdown()` flag and then doing work. You have to check `is_shutdown()` to check if your program should exit (e.g. if there is a `Ctrl+c` or otherwise). The loop calls `rate.sleep()`, which sleeps just long enough to maintain the desired rate through the loop.

ros1/example_5.md (+2, -2)

@@ -5,13 +5,13 @@ If you are looking for a continuous print of the joint states while Stretch is i
 Begin by starting up the stretch driver launch file.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 You can then hit the run-stop button (you should hear a beep and the LED light in the button blink) and move the robot's joints to a desired configuration. Once you are satisfied with the configuration, hold the run-stop button until you hear a beep. Then run the following command to execute the [joint_state_printer.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/joint_state_printer.py) which will print the joint positions of the lift, arm, and wrist. In a new terminal, execute:
-```bash
+```{.bash .shell-prompt}
 cd catkin_ws/src/stretch_tutorials/src/
 python3 joint_state_printer.py
 ```

ros1/example_6.md (+3, -3)

@@ -8,13 +8,13 @@ In this example, we will review a Python script that prints and stores the effor
 Begin by running the following command in a terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 Switch the mode to `position` mode using a rosservice call. Then run the [effort_sensing.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/effort_sensing.py) node. In a new terminal, execute:
-```bash
+```{.bash .shell-prompt}
 rosservice call /switch_to_position_mode
 cd catkin_ws/src/stretch_tutorials/src/
 python3 effort_sensing.py
@@ -303,7 +303,7 @@ file_name = '2022-06-30_11:26:20-AM'
 Once you have changed the file name, then run the following in a new command.
-```bash
+```{.bash .shell-prompt}
 cd catkin_ws/src/stretch_tutorials/src/
 python3 stored_data_plotter.py
 ```

ros1/example_7.md (+7, -7)

@@ -9,19 +9,19 @@ In this example, we will review the [image_view](http://wiki.ros.org/image_view?
 Begin by running the stretch `driver.launch` file.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 To activate the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/) and publish topics to be visualized, run the following launch file in a new terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core d435i_low_resolution.launch
 ```
 Within this tutorial package, there is an RViz config file with the topics for perception already in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
-```bash
+```{.bash .shell-prompt}
 rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/perception_example.rviz
 ```
@@ -30,7 +30,7 @@ There are a couple of methods to save an image using the [image_view](http://wik
 **OPTION 1:** Use the `image_view` node to open a simple image viewer for ROS *sensor_msgs/image* topics. In a new terminal, execute:
-```bash
+```{.bash .shell-prompt}
 rosrun image_view image_view image:=/camera/color/image_raw_upright_view
 ```
@@ -38,14 +38,14 @@ Then you can save the current image by right-clicking on the display window. By
 **OPTION 2:** Use the `image_saver` node to save an image to the terminal's current work directory. In a new terminal, execute:
-```bash
+```{.bash .shell-prompt}
 rosrun image_view image_saver image:=/camera/color/image_raw_upright_view
 ```
 ## Capture Image with Python Script
 In this section, we will use a Python node to capture an image from the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/). Execute the [capture_image.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/capture_image.py) node to save a .jpeg image of the image topic `/camera/color/image_raw_upright_view`. In a terminal, execute:
-```bash
+```{.bash .shell-prompt}
 cd ~/catkin_ws/src/stretch_tutorials/src
 python3 capture_image.py
 ```
@@ -191,7 +191,7 @@ Give control to ROS. This will allow the callback to be called whenever new mes
 ## Edge Detection
 In this section, we highlight a node that utilizes the [Canny Edge filter](https://www.geeksforgeeks.org/python-opencv-canny-function/) algorithm to detect the edges from an image and convert it back as a ROS image to be visualized in RViz. In a terminal, execute:
-```bash
+```{.bash .shell-prompt}
 cd ~/catkin_ws/src/stretch_tutorials/src
 python3 edge_detection.py
 ```

ros1/example_8.md (+2, -2)

@@ -8,12 +8,12 @@ This example will showcase how to save the interpreted speech from Stretch's [Re
 Begin by running the `respeaker.launch` file in a terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch respeaker_ros sample_respeaker.launch
 ```
 Then run the [speech_text.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/speech_text.py) node. In a new terminal, execute:
-```bash
+```{.bash .shell-prompt}
 cd catkin_ws/src/stretch_tutorials/src/
 python3 speech_text.py
 ```

ros1/example_9.md (+3, -3)

@@ -4,20 +4,20 @@ This example aims to combine the [ReSpeaker Microphone Array](respeaker_micropho
 Begin by running the following command in a new terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 Switch the mode to `position` mode using a rosservice call. Then run the `respeaker.launch` file. In a new terminal, execute:
-```bash
+```{.bash .shell-prompt}
 rosservice call /switch_to_position_mode
 roslaunch stretch_core respeaker.launch
 ```
 Then run the [voice_teleoperation_base.py](https://github.com/hello-robot/stretch_tutorials/blob/noetic/src/voice_teleoperation_base.py) node in a new terminal.
-```bash
+```{.bash .shell-prompt}
 cd catkin_ws/src/stretch_tutorials/src/
 python3 voice_teleoperation_base.py
 ```

ros1/follow_joint_trajectory.md (+2, -2)

@@ -9,13 +9,13 @@ Stretch ROS driver offers a [`FollowJointTrajectory`](http://docs.ros.org/en/api
 Begin by running the following command in a terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 In a new terminal, switch the mode to `position` mode using a rosservice call. Then run the stow command node.
-```bash
+```{.bash .shell-prompt}
 rosservice call /switch_to_position_mode
 cd catkin_ws/src/stretch_tutorials/src/
 python3 stow_command.py

ros1/gazebo_basics.md (+3, -3)

@@ -3,7 +3,7 @@
 ## Empty World Simulation
 To spawn Stretch in Gazebo's default empty world run the following command in your terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_gazebo gazebo.launch
 ```
@@ -16,13 +16,13 @@ This will bring up the robot in the gazebo simulation similar to the image shown
 ## Custom World Simulation
 In Gazebo, you can spawn Stretch in various worlds. First, source the Gazebo world files by running the following command in a terminal:
-```bash
+```{.bash .shell-prompt}
 echo "source /usr/share/gazebo/setup.sh"
 ```
 Then using the world argument, you can spawn Stretch in the Willow Garage world by running the following:
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_gazebo gazebo.launch world:=worlds/willowgarage.world
 ```

ros1/getting_started.md (+1, -0)

@@ -14,6 +14,7 @@ If you cannot access the robot through ssh due to your network settings, you wil
 ## Setting Up Stretch in Simulation
 Users who don’t have a Stretch, but want to try the tutorials can set up their computer with Stretch Gazebo.
 ### Requirements
+Although lower specifications might be sufficient, for the best experience we recommend the following for running the simulation:
 * **Processor**: Intel i7 or comparable

ros1/internal_state_of_stretch.md (+4, -4)

@@ -2,13 +2,13 @@
 Begin by starting up the stretch driver launch file.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 Then utilize the ROS command-line tool [rostopic](http://wiki.ros.org/rostopic) to display Stretch's internal state information. For instance, to view the current state of the robot's joints, simply type the following in a new terminal.
-```bash
+```{.bash .shell-prompt}
 rostopic echo /joint_states -n1
 ```
@@ -34,7 +34,7 @@ effort: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
 Let's say you are interested in only seeing the `header` component of the `/joint_states` topic, you can output this within the rostopic command-line tool by typing the following command.
-```bash
+```{.bash .shell-prompt}
 rostopic echo /joint_states/header -n1
 ```
@@ -53,7 +53,7 @@ Additionally, if you were to type `rostopic echo /` in the terminal, then press
 A powerful tool to visualize ROS communication is the ROS [rqt_graph package](http://wiki.ros.org/rqt_graph). By typing the following in a new terminal, you can see a graph of topics being communicated between nodes.
-```bash
+```{.bash .shell-prompt}
 rqt_graph
 ```
 <p align="center">

ros1/moveit_basics.md (+4, -4)

@@ -16,7 +16,7 @@ roslaunch stretch_moveit_config moveit_rviz.launch
 ## MoveIt! Without Hardware
 To begin running MoveIt! on stretch, run the demo launch file. This doesn't require any simulator or robot to run.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_moveit_config demo.launch
 ```
@@ -42,19 +42,19 @@ Additionally, the demo allows a user to select from the three groups, `stretch_a
 To run in Gazebo, execute:
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_gazebo gazebo.launch
 ```
 Then, in a new terminal, execute:
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core teleop_twist.launch twist_topic:=/stretch_diff_drive_controller/cmd_vel linear:=1.0 angular:=2.0 teleop_type:=keyboard
 ```
 In a separate terminal, launch:
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_moveit_config demo_gazebo.launch
 ```

ros1/navigation_stack.md (+8, -8)

@@ -3,7 +3,7 @@ stretch_navigation provides the standard ROS navigation stack as two launch file
 Then run the following commands to map the space that the robot will navigate in.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_navigation mapping.launch
 ```
@@ -15,7 +15,7 @@ Rviz will show the robot and the map that is being constructed. With the termina
 In Rviz, once you see a map that has reconstructed the space well enough, you can run the following commands to save the map to `stretch_user/`.
-```bash
+```{.bash .shell-prompt}
 mkdir -p ~/stretch_user/maps
 rosrun map_server map_saver -f ${HELLO_FLEET_PATH}/maps/<map_name>
 ```
@@ -25,7 +25,7 @@ rosrun map_server map_saver -f ${HELLO_FLEET_PATH}/maps/
 Next, with `<map_name>.yaml`, we can navigate the robot around the mapped space. Run:
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_navigation navigation.launch map_yaml:=${HELLO_FLEET_PATH}/maps/<map_name>.yaml
 ```
@@ -36,14 +36,14 @@ It is also possible to send 2D Pose Estimates and Nav Goals programmatically. In
 ## Running in Simulation
 To perform mapping and navigation in the Gazebo simulation of Stretch, substitute the `mapping_gazebo.launch` and `navigation_gazebo.launch` files into the commands above. The default Gazebo environment is the Willow Garage HQ. Use the "world" ROS argument to specify the Gazebo world within which to spawn Stretch.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_navigation mapping_gazebo.launch gazebo_world:=worlds/willowgarage.world
 ```
 ### Teleop using a Joystick Controller
 The mapping launch files, `mapping.launch` and `mapping_gazebo.launch`, expose the ROS argument `teleop_type`. By default, this ROS argument is set to `keyboard`, which launches keyboard teleop in the terminal. If the Xbox controller that ships with Stretch is plugged into your computer, the following command will launch mapping with joystick teleop:
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_navigation mapping.launch teleop_type:=joystick
 ```
@@ -56,18 +56,18 @@ If you have set up [ROS Remote Master](https://docs.hello-robot.com/0.2/stretch-
 On the robot, execute:
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_navigation mapping.launch rviz:=false teleop_type:=none
 ```
 On your machine, execute:
-```bash
+```{.bash .shell-prompt}
 rviz -d `rospack find stretch_navigation`/rviz/mapping.launch
 ```
 In a separate terminal on your machine, execute:
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core teleop_twist.launch teleop_type:=keyboard
 ```

ros1/perception.md (+3, -3)

@@ -4,19 +4,19 @@ The Stretch robot is equipped with the [Intel RealSense D435i camera](https://ww
 Begin by running the stretch `driver.launch` file.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 To activate the [RealSense camera](https://www.intelrealsense.com/depth-camera-d435i/) and publish topics to be visualized, run the following launch file in a new terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core d435i_low_resolution.launch
 ```
 Within this tutorial package, there is an [RViz config file](https://github.com/hello-robot/stretch_tutorials/blob/noetic/rviz/perception_example.rviz) with the topics for perception already in the Display tree. You can visualize these topics and the robot model by running the command below in a new terminal.
-```bash
+```{.bash .shell-prompt}
 rosrun rviz rviz -d /home/hello-robot/catkin_ws/src/stretch_tutorials/rviz/perception_example.rviz
 ```

ros1/respeaker_microphone_array.md (+6, -6)

@@ -10,7 +10,7 @@ In this section we will use command line tools in the [Stretch_Body](https://git
 Begin by typing the following command in a new terminal.
-```bash
+```{.bash .shell-prompt}
 stretch_respeaker_test.py
 ```
@@ -35,7 +35,7 @@ A [ROS package for the ReSpeaker](https://index.ros.org/p/respeaker_ros/#melodic
 Begin by running the `sample_respeaker.launch` file in a terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch respeaker_ros sample_respeaker.launch
 ```
@@ -43,7 +43,7 @@ This will bring up the necessary nodes that will allow the ReSpeaker to implemen
 Below are executables you can run to see the ReSpeaker results.
-```bash
+```{.bash .shell-prompt}
 rostopic echo /sound_direction # Result of Direction (in Radians) of Audio
 rostopic echo /sound_localization # Result of Direction as Pose (Quaternion values)
 rostopic echo /is_speeching # Result of Voice Activity Detector
@@ -54,8 +54,8 @@ rostopic echo /speech_to_text # Voice recognition
 An example is when you run the `speech_to_text` executable and speak near the microphone array. In a new terminal, execute:
-```bash
-hello-robot@stretch-re1-1005:~$ rostopic echo /speech_to_text
+```{.bash .shell-prompt}
+rostopic echo /speech_to_text
 ```
 In this instance, "hello robot" was said. The following will be displayed in your terminal:
@@ -69,6 +69,6 @@ confidence: []
 You can also set various parameters via `dynamic_reconfigure` by running the following command in a new terminal.
-```bash
+```{.bash .shell-prompt}
 rosrun rqt_reconfigure rqt_reconfigure
 ```

ros1/rviz_basics.md (+3, -3)

@@ -2,13 +2,13 @@
 You can utilize RViz to visualize Stretch's sensor information. To begin, in a terminal, run the stretch driver launch file.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 Then, run the following command in a separate terminal to bring up a simple RViz configuration of the Stretch robot.
-```bash
+```{.bash .shell-prompt}
 rosrun rviz rviz -d `rospack find stretch_core`/rviz/stretch_simple_test.rviz
 ```
@@ -29,7 +29,7 @@ There are further tutorials for RViz which can be found [here](http://wiki.ros.o
 ## Running RViz and Gazebo (Simulation)
 Let's bring up Stretch in the willow garage world from the [gazebo basics tutorial](gazebo_basics.md) and RViz by using the following command.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_gazebo gazebo.launch world:=worlds/willowgarage.world rviz:=true
 ```

ros1/teleoperating_stretch.md (+8, -8)

@@ -7,13 +7,13 @@ If you have not already had a look at the [Xbox Controller Teleoperation](https:
 For full-body teleoperation with the keyboard, you first need to run the `stretch_driver.launch` in a terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
 Then in a new terminal, type the following command
-```bash
+```{.bash .shell-prompt}
 rosrun stretch_core keyboard_teleop
 ```
@@ -60,13 +60,13 @@ To stop the node from sending twist messages, press `Ctrl` + `c` in the terminal
 Begin by running the following command in your terminal:
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core stretch_driver.launch
 ```
-To teleoperate a Stretch's mobile base with the keyboard, you first need to switch the mode to `navigation` for the robot to receive `Twist` messages. This is done using a rosservice call in a new terminal. In the same terminal run the teleop_twist_keyboard node with the argument remapping the *cmd_vel* topic name to *stretch/cmd_vel*.
+To teleoperate a Stretch's mobile base with the keyboard, you first need to switch the mode to `navigation` for the robot to receive `Twist` messages. This is done using a rosservice call in a new terminal. In the same terminal run the teleop_twist_keyboard node with the argument remapping the `cmd_vel` topic name to `stretch/cmd_vel`.
-```bash
+```{.bash .shell-prompt}
 rosservice call /switch_to_navigation_mode
 rosrun teleop_twist_keyboard teleop_twist_keyboard.py cmd_vel:=stretch/cmd_vel
 ```
@@ -111,13 +111,13 @@ To move Stretch's mobile base using a python script, please look at [Teleoperate
 ### Keyboard Teleoperating: Mobile Base
 For keyboard teleoperation of the Stretch's mobile base, first, [startup Stretch in simulation](gazebo_basics.md). Then run the following command in a new terminal.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_gazebo gazebo.launch
 ```
 In a new terminal, type the following
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core teleop_twist.launch twist_topic:=/stretch_diff_drive_controller/cmd_vel linear:=1.0 angular:=2.0 teleop_type:=keyboard
 ```
@@ -126,7 +126,7 @@ The same keyboard commands will be presented to a user to move the robot.
 ### Xbox Controller Teleoperating
 An alternative for robot base teleoperation is to use an Xbox controller. Stop the keyboard teleoperation node by typing `Ctrl` + `c` in the terminal where the command was executed. Then connect the Xbox controller device to your local machine and run the following command.
-```bash
+```{.bash .shell-prompt}
 roslaunch stretch_core teleop_twist.launch twist_topic:=/stretch_diff_drive_controller/cmd_vel linear:=1.0 angular:=2.0 teleop_type:=joystick
 ```

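Since each hunk swaps only the opening fence, one property worth checking after a rewrite like this is that the fences still pair up: every attribute-form opener must still be closed by a bare fence line. A minimal, self-contained sketch (the file path and fixture content are made up for illustration):

```shell
# Fence-balance sanity check on a throwaway file: the opener is in the
# rewritten attribute form, the closer stays a bare fence, so line-start
# fence markers should come in pairs (an even count).
f=/tmp/fence_check.md
printf '%s\n' '```{.bash .shell-prompt}' 'roslaunch stretch_core stretch_driver.launch' '```' > "$f"
n=$(grep -c '^```' "$f")
[ $((n % 2)) -eq 0 ] && echo "fences balanced"
```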