Commit 44f8c25a authored by Inès El Hadri

Update readme

parent 152dda2d
# Astro Group's repository for the LARM UV
---
## Authors
Inès El Hadri
Lucas Naury
---
## Introduction
This repository is a ROS2 package that allows the control of a Kobuki robot.
### Table of Contents
1. [Authors](#authors)
1. [Introduction](#introduction)
1. [How it works](#how-it-works)
1. [Goal](#goal)
1. [Expected behaviour](#expected-behaviour)
1. [Additional functionality](#additional-functionality)
1. [Installation](#installation)
1. [Requirements](#requirements)
1. [Install the package](#install-the-package)
1. [Tune the camera HSV](#tune-the-camera-hsv)
1. [Build the package](#build-the-packages)
1. [How to use the package](#how-to-use-the-package)
1. [In simulation](#in-simulation)
1. [Visualization](#visualization)
1. [Frequently Asked Questions](#faq)
---
## How it works
### Goal
The goal is to explore a closed area (i.e. an area bounded by obstacles) with the robot while avoiding obstacles. While doing so, we must determine the number and positions of the green painted bottles.
### Expected behaviour
- The robot moves **continuously** in the area by **avoiding obstacles**
- If there are no obstacles, it moves straight
- If an obstacle is in front, it turns in the opposite direction until there is no obstacle left in front.
- Using a **SLAM algorithm** with data from the LiDAR and the odometry, the robot builds a map and localizes itself in it.
- Using a RealSense RGBD camera (D435i), the robot is able to **detect the green bottles**. Messages are published on the following topics:
  - `detection`: states whether a bottle is detected
  - `bottle_relative_pos`: gives the position of the bottle relative to the camera
  - `bottle_marker`: marks the position of a bottle on the map
- Experiments can be performed with **2 computers**:
  - one on the robot (the control PC), running the robot, movement and vision nodes
  - a second for visualization and human control (the operator PC)
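The avoidance rule described above can be sketched in a few lines of plain Python. This is a hypothetical illustration, not the package's actual node code; the function name, thresholds and scan format are assumptions:

```python
# Hypothetical sketch of the avoidance rule: go straight when the frontal
# cone is clear, otherwise turn away from the side where the obstacle is.

def avoidance_command(ranges, angles, front_half_angle=0.5, obstacle_dist=0.4):
    """Return (linear, angular) velocities from a laser scan.

    ranges: distances in metres; angles: matching angles in radians
    (0 = straight ahead, positive = left).
    """
    left_blocked = any(d < obstacle_dist for d, a in zip(ranges, angles)
                       if 0 <= a <= front_half_angle)
    right_blocked = any(d < obstacle_dist for d, a in zip(ranges, angles)
                        if -front_half_angle <= a < 0)
    if not (left_blocked or right_blocked):
        return (0.3, 0.0)  # nothing in front: move straight
    # obstacle on the left -> turn right (negative angular), and vice versa
    return (0.0, -0.8) if left_blocked else (0.0, 0.8)
```

In the real package this decision would be fed by the LiDAR topic and published as a velocity command; here it is kept sensor-agnostic for clarity.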
### Additional functionality
- An [**automatic HSV tuner script**](#tune-the-camera-hsv) allows you to calculate the ideal threshold to mask your bottle
- Most configuration variables (robot speeds, bounding boxes, etc.) are set as **ROS parameters** so that they can be easily modified
- The robot stops moving when you press any of the **3 robot buttons**. If you press one again, movement will continue.
> All other data is still processed while the robot is paused
- The robot **stops when you lift it** (i.e. when the wheels are "falling"). If you put the robot back on the ground, movement will continue.
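The pause behaviour above amounts to gating the velocity commands while sensing keeps running. A minimal sketch, with hypothetical names (this is not the package's actual implementation):

```python
# Hypothetical motion gate: button presses toggle a "paused" flag, a wheel
# drop forces a stop, and commands are zeroed while either condition holds.

class MotionGate:
    def __init__(self):
        self.paused = False   # toggled by any of the 3 robot buttons
        self.lifted = False   # True while the wheels are "falling"

    def on_button_press(self):
        self.paused = not self.paused

    def on_wheel_drop(self, dropped):
        self.lifted = dropped

    def filter(self, linear, angular):
        """Return the command to actually send to the motors."""
        if self.paused or self.lifted:
            return (0.0, 0.0)
        return (linear, angular)
```

Only the outgoing command is suppressed, which matches the note above: mapping, detection and the other pipelines keep processing while the robot is paused.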
## Installation
### Requirements
Before starting, please ensure you have installed the following:
* cvbridge3
* scikit-image
</br>
> Command:
> `pip install numpy colcon-common-extensions opencv-python pyrealsense2 cvbridge3 scikit-image`
- $`\textcolor{red}{\text{[OPTIONAL]}}`$ Gazebo (for the simulation)
- $`\textcolor{red}{\text{[OPTIONAL]}}`$ Teleop twist keyboard (to control the robot manually)
```
export ROS_AUTOMATIC_DISCOVERY_RANGE=LOCALHOST #Tell ROS to make your nodes only accessible by the same machine
```
However, if you want to be able to visualize data from another computer on the same network, add:
```
export ROS_AUTOMATIC_DISCOVERY_RANGE=SUBNET #Tell ROS to make your nodes accessible by machines on the same network
```
### Tune the camera HSV
To tune the HSV threshold parameters for the camera mask, we will use a script to automate it.
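The idea behind such a tuner can be sketched with the standard library alone: sample pixels from the bottle region, convert them to HSV, and take per-channel min/max with a safety margin. This is an illustrative sketch, not the repository's actual tuner script; the function name and margin are assumptions:

```python
# Hypothetical HSV auto-tuning: derive mask bounds from sampled bottle pixels.
import colorsys

def hsv_bounds(rgb_pixels, margin=0.05):
    """rgb_pixels: iterable of (r, g, b) tuples in 0..255.
    Returns (lower, upper) HSV bounds, each channel normalized to 0..1."""
    hsv = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
           for r, g, b in rgb_pixels]
    # widen the observed range slightly so the mask tolerates lighting changes
    lower = tuple(max(0.0, min(c) - margin) for c in zip(*hsv))
    upper = tuple(min(1.0, max(c) + margin) for c in zip(*hsv))
    return lower, upper
```

The real script works on live camera frames (via OpenCV), but the core computation is the same: the resulting bounds become the threshold parameters of the green-bottle mask.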
First, put a bottle in front of the robot.
Then, go to the `larm` directory and launch the HSV tuner Python script using the following command
To launch challenge 1 on the **real turtlebot**, run the following command:
`ros2 launch grp_astro tbot_launch.yaml`
### Visualization
In parallel, if you want to **visualize** the information that is published on the different topics, you can run:
- `ros2 launch tbot_operator_launch.yaml` for the real robot
- `ros2 launch sim_operator_launch.yaml` in simulation
> If you want to run this visualization on a computer other than the one running the robot, make sure:
> - they are on the **same network**