CogniPilot MR-B3RB Robot installation: specific instructions for the NXP Cup
To follow this tutorial, you first need to install and set up everything explained in: CogniPilot developer guide on MR-B3RB
You must also follow the NXP-CUP 2024: Simulation Install instructions, which explain how to update your development computer for the NXP-CUP 2024.
If you have correctly followed the previous steps, you should already be familiar with how to flash the CogniPilot software onto the MR-CANHUBK344 board. In this step, we will flash the board again, but this time with an updated version of the Cerebri software that includes support for the NXP Cup 2024. We will also need to log in to the NavQPlus via SSH and update the cranium workspace to enable NXP Cup support, similar to the procedure outlined on the previous page.
On your development computer, go to:
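The target directory was elided here; judging from the Cerebri paths used later in this guide, it is the Cerebri workspace (path assumed):

```bash
cd ~/cognipilot/ws/cerebri
```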
Check that you are on the nxp-cup branch; the branch command should return nxp-cup. If it doesn't show that, check out the nxp-cup branch and pull the latest changes.
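A minimal sketch of those checks with plain git (the exact commands in the original guide may differ):

```bash
# Show the branch currently checked out; it should print: nxp-cup
git branch --show-current

# If it prints something else, switch to the nxp-cup branch and update it:
git checkout nxp-cup
git pull
```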
Remember, if your PDB does not have the battery sensor, you have to follow this guide: Configuring Cerebri Software for MR-CANHUBK344 Without Power Measurement Capability
Once you have confirmed that you are on the nxp-cup branch and that no updates are pending, run the following commands:
```bash
# Build Cerebri for the B3RB and flash it over J-Link. The build arguments
# are assumed from the standard CogniPilot workflow; adjust the board target
# to match your setup.
west build -b mr_canhubk3 app/b3rb -p
west flash
```
Before flashing, make sure Segger J-Link is installed on your system and that the MR-CANHUBK3 is connected to the J-Link programmer: Connect J-Link EDU Mini to MR-CANHUBK344 with DCD-LZ adapter board.
Once it has finished, you can disconnect the J-Link from your computer and from the MR-CANHUBK344.
In case the west flash command failed, run the following commands in the terminal:
If you have any doubts or problems in this section, please first refer to this documentation: https://airy.cognipilot.org/cranium/compute/navqplus/setup/
Once you have flashed the MR-CANHUBK344, it's time to log into the NavQPlus. Do so with the following command:
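A typical SSH login (the default NavQPlus account name and hostname are assumptions; substitute your board's actual address or IP):

```bash
ssh user@imx8mpnavq.local
```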
Next, we will install the packages needed to develop the algorithm for the NXP CUP 2024 on the B3RB robot.
To simplify repository updates, we've created a shell script. This script automates the process of updating all repositories to the latest versions hosted on the NXPHoverGames GitHub, specifically for the NXP Cup.
The changes to the repositories are:
Synapse Protobuf and Synapse Tinyframe: Define the Pixy Vector message in the Protobuf language and provide essential definitions and functions for the TinyFrame library to operate correctly. For more information, refer to the Synapse documentation.
Synapse msgs and Synapse ros: Extend ROS2 with the Pixy Vector message definition and support Synapse, which facilitates communication between the NavQPlus (ROS2) and the MR-CANHUBK344 (Zephyr).
B3RB robot: Includes the launch file for the node that processes the race track images.
NXP CUP Vision: Adds a ROS2 package for processing race track images, transforming perspectives, detecting track edges, and publishing both the debug image and the Pixy Vector, all pivotal for visual processing.
This script ensures that all the repositories are updated. Save it as update_repos_navqplus.sh in the ~/cognipilot directory, using any text editor of your choice, such as Vim or nano.
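The script body was elided here; as a hypothetical outline, a script of this kind walks the cranium repositories and pulls the latest changes. The real script pins the NXPHoverGames nxp-cup versions, so prefer the one distributed with the competition materials:

```bash
#!/bin/bash
# Placeholder sketch of update_repos_navqplus.sh: pull the latest changes
# in every repository of the cranium workspace.
for repo in ~/cognipilot/cranium/src/*/; do
    echo "Updating ${repo}"
    git -C "${repo}" pull
done
```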
Then, open the terminal, change directory to ~/cognipilot, and make the script executable before running it:
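A minimal sketch of those steps:

```bash
cd ~/cognipilot
chmod +x update_repos_navqplus.sh
./update_repos_navqplus.sh
```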
This should have updated all your repositories. Please review the output of this script: if you have made local changes to the repositories, you may need to stash them.
To build the software, navigate to the cranium/src directory within ~/cognipilot:
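A plausible build sequence, assuming the usual colcon workflow (the flags are assumptions):

```bash
cd ~/cognipilot/cranium/src
cd ..                          # colcon builds from the workspace root
colcon build --symlink-install
```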
Now, you are ready to operate the platform.
On the NavQPlus, run the B3RB launch file:
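A typical invocation (the package and launch-file names are assumed from the CogniPilot cranium workspace):

```bash
ros2 launch b3rb_bringup robot.launch.py
```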
To start Foxglove for operation and data visualization, run the following command on your development computer:
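Electrode provides the Foxglove bridge and layouts in CogniPilot; a typical launch (name assumed):

```bash
ros2 launch electrode electrode.launch.py
```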
These commands start the algorithm on the MR-B3RB robot and open the Foxglove viewer. Place your robot on the racetrack.
To initiate the robot's movement, press the AUTO button and the arm button on the joystick. This action will activate the robot's line-following mode and set it to the armed state, preparing it for operation.
Once the AUTO and arm buttons on the joystick are pressed, the robot should immediately begin to navigate around the racetrack. To understand the underlying code and setup, and to explore ways to achieve better performance for the NXP CUP 2024, proceed to the next tutorial.
This section explains the essential software required for participating in the NXP Cup 2024 using the MR-B3RB Robot.
Participants will use a combination of pre-configured ROS2 and Zephyr-Cerebri software. Here, "Cerebri" refers to the application layer, while "Zephyr" is the underlying Real-Time Operating System (RTOS). This software stack integrates with the CogniPilot autopilot system, which is available at https://cognipilot.org.
The following pages provide detailed installation instructions for both the simulation environment and the actual robot. Additionally, there is a section dedicated to explaining the line follower algorithm for the NXP Cup 2024.
CogniPilot simulation installation: specific instructions for the NXP Cup
To follow this tutorial, you first need to install and set up everything explained in: CogniPilot developer guide on MR-B3RB
If your installation was successful, you should be able to launch the Gazebo simulation by running the following commands on your development computer:
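For reference, these are typical commands from the CogniPilot B3RB workspace (launch-file names and the sim argument are assumptions; run each in its own terminal):

```bash
ros2 launch b3rb_gz_bringup sil.launch.py
ros2 launch electrode electrode.launch.py sim:=true
```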
Please ensure that these commands function correctly and that your setup resembles what is shown in this example video.
Launching the Gazebo Harmonic simulation will enable you to control the robot via the Foxglove interface, as explained in this section: b3rb_simulation
Next, we will install all the necessary packages for developing the algorithm for the NXP CUP 2024, applicable to both the MR-B3RB simulation and the actual robot.
To simplify repository updates, we've created a shell script. This script automates the process of updating all repositories to the latest versions hosted on the NXPHoverGames GitHub, specifically for the NXP Cup.
The changes to the repositories are:
Synapse Protobuf and Synapse Tinyframe: Define the Pixy Vector message in the Protobuf language and provide essential definitions and functions for the TinyFrame library to operate correctly. For more information, refer to the Synapse documentation.
Synapse msgs and Synapse ros: Extend ROS2 with the Pixy Vector message definition and support Synapse, which facilitates communication between the NavQPlus (ROS2) and the MR-CANHUBK344 (Zephyr).
Dream world: Adds a simulated race track environment to the workspace.
B3RB simulator: Includes the launch file for the node that processes the race track images.
Electrode: Integrates support for the debug image and pixy vector in Foxglove, facilitating debugging and visualization.
Cerebri: Implements a subscribing mechanism to receive the Pixy Vector message and controls for the line follower algorithm.
NXP CUP Vision: Adds a ROS2 package for processing race track images, transforming perspectives, detecting track edges, and publishing both the debug image and the Pixy Vector, all pivotal for visual processing.
This script ensures that all the repositories are updated. Save it as update_repos_native.sh in the ~/cognipilot directory, using any text editor of your choice, such as Vim, Gedit, Nano, or VSCode.
Then, open the terminal and execute the following commands:
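A minimal sketch of those steps, mirroring the NavQPlus procedure:

```bash
cd ~/cognipilot
chmod +x update_repos_native.sh
./update_repos_native.sh
```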
This should have updated all your repositories. Please review the output of this script: if you have made local changes to the repositories, you may need to stash them.
To build the software, navigate to the cranium/src directory within ~/cognipilot:
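As before, a plausible colcon build run from the workspace root (flags assumed):

```bash
cd ~/cognipilot/cranium
colcon build --symlink-install
```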
Repeat the build process for the electrode directory:
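Assuming electrode is its own colcon workspace under ~/cognipilot:

```bash
cd ~/cognipilot/electrode
colcon build --symlink-install
```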
Update the West workspace for cerebri:
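A minimal sketch, assuming the default workspace layout:

```bash
cd ~/cognipilot/ws/cerebri
west update
```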
Now, you are ready to run the simulation:
To launch the Gazebo simulation featuring the MR-B3RB robot on a simple raceway, execute:
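The exact command was elided here; a plausible form reuses the simulation launch from before with a race track world. The world name below is a placeholder; check the launch files added by the nxp-cup branch for the real one:

```bash
ros2 launch b3rb_gz_bringup sil.launch.py world:=nxp_raceway
```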
To start the Foxglove viewer for the simulation, run:
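As before, a typical Electrode launch for simulation (arguments assumed):

```bash
ros2 launch electrode electrode.launch.py sim:=true
```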
These commands will initiate the Gazebo simulation with the MR-B3RB robot and the raceway, alongside launching the Foxglove viewer.
To initiate the robot's movement, press the AUTO button and the arm button on the joystick. This action will activate the robot's line-following mode and set it to the armed state, preparing it for operation.
Once the AUTO and arm buttons on the joystick are pressed, the robot will immediately begin to navigate around the racetrack. To understand the underlying code and setup, and to explore ways to achieve better performance for the NXP CUP 2024, proceed to the next tutorial.
This page explains the software required for the NXP CUP 2024 with the MR-B3RB robot.
The CogniPilot software is integrated into both the NavQPlus, within ROS2, and the MR-CANHUBK344, within the Zephyr RTOS, together forming an Ackermann-steering robotic development platform.
Please follow the installation instructions to obtain the modified branch of the CogniPilot software, specifically tailored for the NXP CUP 2024; these instructions are explained in the previous section: NXP-CUP 2024: Simulation Install instructions
The ROS2 part of the software is designed for high-level tasks and should be treated as a "black box" for the NXP-CUP 2024, meaning that its code should not be modified. If you encounter any errors in this code, please report them to the NXP Cup organizers or through the technical support channel on the official NXP CUP Discord: https://discord.gg/JcWPbw649S.
The main CogniPilot modification on the ROS2 side for the NXP Cup 2024 is the addition of the nxp_track_vision node, which receives camera data, transforms the perspective, and extracts the Pixy Vector message. This message consists of two vectors, each with a tail and a head, defining the borders of the race track. This information is used by the robot controller to ensure that the robot stays within the track boundaries.
Here's the definition of the Pixy Vector ROS2 message:
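The definition itself was elided here; based on the field description below, the message plausibly looks like this (the field names are assumptions in the style of the Pixy2 vector interface):

```
std_msgs/Header header
uint32 m0_x0
uint32 m0_y0
uint32 m0_x1
uint32 m0_y1
uint32 m1_x0
uint32 m1_y0
uint32 m1_x1
uint32 m1_y1
```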
The std_msgs/Header in the Pixy Vector ROS2 message includes the timestamp indicating when the message was sent. Additionally, the message is composed of two vectors, each represented by a pair of points: one indicating where the vector begins (tail) and the other where it ends (head). Each of these points is defined by an x and a y value, specifying its position in two-dimensional space.
The nxp_track_vision_node also publishes the Camera Debug Image. This image shows the perspective transformation applied to the camera's view, along with the detected lines of the track. Additionally, it displays the vectors being sent by the track vision node. This visual feedback should prove helpful for debugging and optimizing the robot's control software.
This is an example of the debug camera image:
You can visualize this image through Foxglove. To do that, first run:
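A typical invocation on the development computer (launch-file name assumed, as in the install pages):

```bash
ros2 launch electrode electrode.launch.py
```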
Then add an Image panel to the B3RB layout provided and configure the ROS2 topic to /nxp_cup/debug_image.
Now let's look at the line-following control algorithm. This algorithm is located in Cerebri, running on the Zephyr RTOS on the MR-CANHUBK344. Cerebri acts as the brain of our operation, facilitating the complex decision-making processes that guide the robot seamlessly along the track.
This code is what you must modify and improve for the NXP CUP 2024.
The main function of the line follower algorithm is located in the velocity state inside the finite state machine logic that manages the robot's behaviour. This file is located at ~/cognipilot/ws/cerebri/app/b3rb/src/velocity.c
This is the GitHub location: See code.
The application folder is where the vehicle finite state machine, control, and estimation code reside. For more information visit: https://airy.cognipilot.org/cerebri/platforms/rovers/
In that folder you will find a file called velocity.c. This code is written in C and has two main operating modes: the cmd_vel mode, which listens to the /cmd_vel topic published by the Nav2 node in ROS2, and the auto mode, which listens to the Pixy Vector message sent by the nxp_track_vision_node and estimates the velocity needed to keep the robot on the race track.
Several important variables and functions are used in the code; a brief description of each is given in the table below:
The switch instruction evaluates the number of vectors found in the camera image; what the robot car will do depends on this number (a hedged C sketch follows this list):
If no vectors are found (case 0), then the vehicle will stop.
If one vector is found, the algorithm computes the gradient of the vector, stores it in steer, and sets the speed accordingly.
If two vectors are found, the example algorithm computes the offset in the x direction of the average of the two vectors' head points. This yields a steering value that steers the cup car in the correct direction, which is stored in steer. The speed value is calculated and stored in speed.
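As a rough illustration of that logic, here is a minimal C sketch. The types, variable names (taken from the table below), gains, and geometry are assumptions; the real velocity.c differs in detail:

```c
#include <math.h>

/* Hypothetical stand-in for one Pixy Vector: tail (x0, y0), head (x1, y1). */
typedef struct { double x0, y0, x1, y1; } pixy_vec_t;

/* Sketch of the auto-mode decision logic: choose steer and speed from the
 * number of detected track lines. */
static void follow_track(const pixy_vec_t *v, int num_vectors,
                         double frame_width, double linear_velocity,
                         double angular_velocity,
                         double single_line_steer_scale,
                         double *steer, double *speed)
{
    switch (num_vectors) {
    case 0:  /* no track lines detected: stop the vehicle */
        *steer = 0.0;
        *speed = 0.0;
        break;
    case 1: {  /* one line: steer along the vector's gradient */
        double gradient = atan2(v[0].x1 - v[0].x0, v[0].y0 - v[0].y1);
        *steer = single_line_steer_scale * gradient;
        *speed = linear_velocity;
        break;
    }
    case 2: {  /* two lines: steer toward the midpoint of the head points */
        double head_mid_x = (v[0].x1 + v[1].x1) / 2.0;
        double offset = head_mid_x - frame_width / 2.0;
        *steer = -angular_velocity * offset / (frame_width / 2.0);
        *speed = linear_velocity;
        break;
    }
    }
}
```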
After calculating the speed and steer values, a call to the CASADI function is made to estimate the control variables for the actuators, which then move the robot as needed.
Before making changes to the velocity.c file or any files in the ~/cognipilot/ws/cerebri/app/b3rb directory, remember to rebuild the application to see the changes take effect.
For simulation:
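A plausible rebuild for simulation (the native_sim target is an assumption from Cerebri's simulation workflow):

```bash
cd ~/cognipilot/ws/cerebri
west build -b native_sim app/b3rb -p
```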
For real robot:
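A plausible rebuild and reflash for the real robot (the same assumed board target as in the flashing step above):

```bash
cd ~/cognipilot/ws/cerebri
west build -b mr_canhubk3 app/b3rb -p
west flash
```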
In the nxp_track_vision.py file within the ROS2 package nxp_cup_vision, you'll find the following code snippet:
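The snippet itself was elided here; from the description below, it plausibly builds a sensor_msgs/Joy message with the AUTO and arm buttons set. Only self.JoystickPub.publish(joystick_msg) is named by this page; the helper, field values, and button indices below are hypothetical:

```python
from sensor_msgs.msg import Joy

def publish_auto_and_arm(node):
    """Hypothetical reconstruction: publish a Joy message that selects
    AUTO mode and arms the robot. Which button index maps to which
    action is an assumption."""
    joystick_msg = Joy()
    joystick_msg.header.stamp = node.get_clock().now().to_msg()
    joystick_msg.buttons = [0] * 8
    joystick_msg.buttons[7] = 1  # assumed AUTO-mode button
    joystick_msg.buttons[0] = 1  # assumed arm button
    # node.JoystickPub.publish(joystick_msg)  # commented out by default
```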
This code snippet automatically sets the B3RB robot in AUTO mode and arms the robot. By default, the self.JoystickPub.publish(joystick_msg) lines are commented out to allow control through Foxglove Studio during development. Uncomment these lines if you want the robot to automatically enter AUTO mode and arm itself.
If you need the robot to delay its start, add a sleep command:
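For example, at the start of the same code path (the 15-second value comes from the note below):

```python
import time

time.sleep(15)  # wait 15 seconds before the robot starts
```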
This makes the robot wait for 15 seconds before starting, which can be particularly useful for the NXP-CUP competition.
| Variable/Function | Purpose |
|---|---|
| `Context* ctx` | Pointer to the context struct, a custom data type that stores information such as the Pixy Vector, the robot status, and the wheel base and radius. |
| `Pixy Vector` | Contains the Pixy Vectors obtained from the nxp_track_vision node, including the starting and ending points (x0, y0, x1, y1) of each vector detected by the camera. |
| `linear_velocity` | Stores the linear velocity at which the car moves forward; a constant value representing the speed in a straight line. |
| `angular_velocity` | Stores the angular velocity at which the car steers; represents the rate of change of the car's direction. |
| `single_line_steer_scale` | Controls how much the car steers when only one line of the race track is found; scales the angular velocity to adjust steering sensitivity. |
| `steer` | Stores the steering (angular) velocity in the same form as the standard ROS2 cmd_vel message; calculated from the Pixy Vector information, it controls the direction the car turns. |
| `vel_linear_x` | Stores the linear velocity (vel.linear.x) in the same form as the standard ROS2 cmd_vel message; adjusted based on the steering angle to ensure smooth turns. |
| `num_vectors` | Number of Pixy Vectors obtained (0, 1, or 2), indicating how many track lines the camera has detected. |
| `frame_width` | Width of the green frame in the debug image, on which the Pixy Vector values from the nxp_track_vision node are based. |
| `frame_height` | Height of the green frame in the debug image, on which the Pixy Vector values from the nxp_track_vision node are based. |