NXP-CUP 2024: line follower algorithm

This page explains the software required for the NXP CUP 2024 with the MR-B3RB robot.

The CogniPilot software is integrated into both the NavQPlus, within ROS2, and the MR-CANHUBK344, within the Zephyr RTOS, together forming an Ackermann-steering robotic development platform.

Please follow the installation instructions to obtain the modified branch of the CogniPilot software, specifically tailored for the NXP CUP 2024. These instructions are explained in the previous section, NXP-CUP 2024: Simulation Install Instructions.

Brief explanation of the ROS2 side

The ROS2 part of the software is designed for high-level tasks and should be treated as a "black box" for the NXP-CUP 2024, meaning that its code should not be modified. If you encounter any errors in this code, please report them to the NXP Cup organizers or through the technical support channel on the official NXP CUP Discord: https://discord.gg/JcWPbw649S.

The main CogniPilot modification on the ROS2 side for the NXP Cup 2024 is the addition of the nxp_track_vision node, which receives camera data, transforms the perspective, and extracts the Pixy Vector message. This message consists of two vectors, each with a tail and a head, defining the borders of the race track. The robot controller uses this information to keep the robot within the track boundaries.

Here's the definition of the Pixy Vector ROS2 message:

std_msgs/Header header
# Vector 0 head and tail points
uint32 m0_x0    # Tail of vector @ x
uint32 m0_y0    # Tail of vector @ y
uint32 m0_x1    # Head of vector @ x
uint32 m0_y1    # Head of vector @ y

# Vector 1 head and tail points
uint32 m1_x0    # Tail of vector @ x
uint32 m1_y0    # Tail of vector @ y
uint32 m1_x1    # Head of vector @ x
uint32 m1_y1    # Head of vector @ y

The std_msgs/Header in the Pixy Vector ROS2 message includes the timestamp indicating when the message was sent. The message itself is composed of two vectors, each represented by a pair of points: one indicating where the vector begins (tail) and the other where it ends (head). Each of these points is defined by an x and a y value, specifying its position in two-dimensional space.
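As a quick illustration, here is a minimal C sketch of how the head and tail of one of these vectors can be turned into a direction. The pixy_vector_t struct and the vector0_angle helper below are hypothetical and merely mirror the message fields for the example; in Cerebri the same data is accessed through ctx->pixy_vector.

#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* Mirrors the Pixy Vector message fields (illustrative only) */
typedef struct {
    uint32_t m0_x0, m0_y0, m0_x1, m0_y1;
    uint32_t m1_x0, m1_y0, m1_x1, m1_y1;
} pixy_vector_t;

/* Angle of vector 0 from tail (x0, y0) to head (x1, y1), in radians.
 * Casting to double before subtracting avoids unsigned wrap-around
 * when a head coordinate is smaller than the tail coordinate. */
static double vector0_angle(const pixy_vector_t* v)
{
    double dx = (double)v->m0_x1 - (double)v->m0_x0;
    double dy = (double)v->m0_y1 - (double)v->m0_y0;
    return atan2(dy, dx);
}

int main(void)
{
    /* Tail at (39, 51), head at (30, 20): a line leaning to the left */
    pixy_vector_t v = {39, 51, 30, 20, 0, 0, 0, 0};
    printf("vector 0 angle: %.2f rad\n", vector0_angle(&v));
    return 0;
}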

The nxp_track_vision node also publishes the Camera Debug Image. This image shows the perspective transformation applied to the camera's view, along with the detected lines of the track. Additionally, it displays the vectors being sent by the track vision node. This visual feedback should prove helpful for debugging and optimizing the robot's control software.

This is an example of the debug camera image:

You can visualize this image through Foxglove. To do that, first run:

ros2 launch electrode electrode.launch.py             # real robot
ros2 launch electrode electrode.launch.py sim:=true   # simulation

Then add an Image panel to the B3RB layout provided and configure the ROS2 topic to /nxp_cup/debug_image.

Control algorithm through Cerebri

Now the line-following control algorithm will be explained. This algorithm lives in Cerebri, running on the Zephyr RTOS on the MR-CANHUBK344. Cerebri acts as the brain of our operation, facilitating the decision-making processes that guide the robot seamlessly along the track.

This code is what you must modify and improve for the NXP CUP 2024.

The main function of the line follower algorithm is located in the velocity state inside the finite state machine logic that manages the robot's behaviour. This file is located at ~/cognipilot/ws/cerebri/app/src/velocity.c

This is the GitHub location: See code.

The application folder is where the vehicle finite state machine, control, and estimation code reside. For more information visit: https://airy.cognipilot.org/cerebri/platforms/rovers/

In that folder you will find a file called velocity.c. This code is written in C and has two main operation modes: the cmd_vel mode, which listens to the /cmd_vel topic published by the Nav2 node in ROS2, and the auto mode, which listens to the Pixy Vector message sent by the nxp_track_vision node and estimates the velocity needed to keep the robot on the race track:

// Follow line function in velocity.c
static void follow_line(context* ctx)
{
    // Dimensions of the (green) frame the Pixy Vector coordinates refer to
    double frame_width = 78;
    double frame_height = 51;
    double window_center = frame_width / 2;
    // Tunable parameters of the example controller
    double linear_velocity = 0.7;
    double angular_velocity = -0.6;
    double single_line_steer_scale = 0.6;

    double x = 0.0;
    double y = 0.0;

    double steer = 0.0;
    double speed = 0.0;

    // Count the vectors actually present: an all-zero vector means "not detected"
    int num_vectors = 0;
    if(!(ctx->pixy_vector.m0_x0 == 0 && ctx->pixy_vector.m0_x1 == 0 && ctx->pixy_vector.m0_y0 == 0 && ctx->pixy_vector.m0_y1 == 0)) {
        num_vectors++;
    }
    if(!(ctx->pixy_vector.m1_x0 == 0 && ctx->pixy_vector.m1_x1 == 0 && ctx->pixy_vector.m1_y0 == 0 && ctx->pixy_vector.m1_y1 == 0)) {
        num_vectors++;
    }

    switch(num_vectors) {
        case 0:
            // No track lines detected: stop the vehicle
            speed = 0.0;
            steer = 0.0;
            break;
        case 1:
            // One line detected: steer proportionally to the vector's gradient.
            // Cast to double before subtracting so that unsigned wrap-around
            // cannot occur when a head coordinate is smaller than the tail.
            if(ctx->pixy_vector.m0_x1 > ctx->pixy_vector.m0_x0) {
                x = ((double)ctx->pixy_vector.m0_x1 - (double)ctx->pixy_vector.m0_x0) / frame_width;
                y = ((double)ctx->pixy_vector.m0_y1 - (double)ctx->pixy_vector.m0_y0) / frame_height;
            } else {
                x = ((double)ctx->pixy_vector.m0_x0 - (double)ctx->pixy_vector.m0_x1) / frame_width;
                y = ((double)ctx->pixy_vector.m0_y0 - (double)ctx->pixy_vector.m0_y1) / frame_height;
            }

            // Only steer when the vector is neither vertical (x0 == x1) nor horizontal (y == 0)
            if((ctx->pixy_vector.m0_x0 != ctx->pixy_vector.m0_x1) && (y != 0.0)) {
                steer = -angular_velocity * (x / y) * single_line_steer_scale;
            } else {
                steer = 0.0;
            }
            speed = linear_velocity;
            break;
        case 2:
            // Two lines detected: steer towards the midpoint of one endpoint
            // from each vector (head or tail, chosen per vector orientation),
            // measured relative to the window center
            if((ctx->pixy_vector.m1_x0 >= ctx->pixy_vector.m1_x1) && (ctx->pixy_vector.m0_x0 <= ctx->pixy_vector.m0_x1)){
                steer = angular_velocity * (((ctx->pixy_vector.m0_x1 + ctx->pixy_vector.m1_x1) / 2.0) - window_center) / frame_width;
            }else if ((ctx->pixy_vector.m1_x0 < ctx->pixy_vector.m1_x1) && (ctx->pixy_vector.m0_x0 <= ctx->pixy_vector.m0_x1)){
                steer = angular_velocity * (((ctx->pixy_vector.m0_x1 + ctx->pixy_vector.m1_x0) / 2.0) - window_center) / frame_width;
            }else if ((ctx->pixy_vector.m1_x0 > ctx->pixy_vector.m1_x1) && (ctx->pixy_vector.m0_x0 > ctx->pixy_vector.m0_x1)){
                steer = angular_velocity * (((ctx->pixy_vector.m0_x0 + ctx->pixy_vector.m1_x1) / 2.0) - window_center) / frame_width;
            }else {
                steer = angular_velocity * (((ctx->pixy_vector.m0_x0 + ctx->pixy_vector.m1_x0) / 2.0) - window_center) / frame_width;
            }

            speed = linear_velocity;
            break;
    }

    // Slow down while turning: full speed at steer == 0, zero at |steer| == 0.5
    double vel_linear_x = speed * (1 - fabs(2 * steer));

    // Convert the commanded (V, omega) pair to a steering angle via the
    // CasADi-generated ackermann_steering function
    double turn_angle = 0;
    double omega_fwd = 0;
    double V = vel_linear_x;
    double omega = steer;
    double delta = 0;
    CASADI_FUNC_ARGS(ackermann_steering);
    args[0] = &ctx->wheel_base;
    args[1] = &omega;
    args[2] = &V;
    res[0] = &delta;
    CASADI_FUNC_CALL(ackermann_steering);

    // Wheel angular velocity from linear velocity and wheel radius
    omega_fwd = V / ctx->wheel_radius;
    if (fabs(V) > 0.01) {
        turn_angle = delta;
    }
    b3rb_set_actuators(&ctx->actuators, turn_angle, omega_fwd);
}

There are several important variables and functions used in this code. A brief description of each is given below:

context* ctx: a pointer to the context struct, a custom data type that stores information such as the Pixy Vector, the robot status, and the wheel base and radius.

pixy_vector: contains the Pixy Vectors obtained from the nxp_track_vision node, i.e. the starting and ending points (x0, y0, x1, y1) of each vector detected by the camera.

linear_velocity: stores the linear velocity at which the car will move forward. It is a constant value representing the speed in a straight line.

angular_velocity: stores the angular velocity at which the car will steer. It represents the rate of change of the car's direction.

single_line_steer_scale: controls how much the car steers when only one line of the race track can be found. It scales the angular velocity to adjust the steering sensitivity.

steer: stores the steering (angular) velocity in the same way the standard cmd_vel message from ROS2 does. It is calculated from the Pixy Vector information and controls the direction the car turns.

vel_linear_x: stores the linear velocity (vel.linear.x) in the same way the standard cmd_vel message from ROS2 does. It is reduced as a function of the steering value to ensure smooth turns.

num_vectors: stores the number of Pixy Vectors obtained: 0, 1, or 2, indicating how many track lines have been detected.

frame_width: width of the green frame on the debug image, on which the Pixy Vector values from the nxp_track_vision node are based.

frame_height: height of the green frame on the debug image, on which the Pixy Vector values from the nxp_track_vision node are based.

The switch statement evaluates the number of vectors found in the camera image, and the robot's behaviour depends on this number:

Case: 0 vectors found

If no vectors are found (case 0), then the vehicle will stop.

Case: 1 vector found

If one vector is found, the algorithm computes the gradient of the vector, derives the steering value from it, stores that in steer, and sets speed to linear_velocity. A hand-computed example follows.
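Here is a hand-computed illustration of the one-vector case, using the constants from follow_line() and the signed subtraction shown above. The vector coordinates are invented for the example:

/* One-vector case, worked through with invented coordinates */
#include <stdio.h>

int main(void)
{
    double frame_width = 78, frame_height = 51;
    double angular_velocity = -0.6, single_line_steer_scale = 0.6;

    /* Suppose vector 0 has tail (20, 51) and head (35, 10).
     * Since x1 > x0, the first branch of the algorithm applies: */
    double x = (35.0 - 20.0) / frame_width;   /*  0.192 */
    double y = (10.0 - 51.0) / frame_height;  /* -0.804 */

    double steer = -angular_velocity * (x / y) * single_line_steer_scale;
    printf("steer = %.3f\n", steer);          /* about -0.086 */
    return 0;
}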

Case: 2 vectors found

If two vectors are found, the example algorithm computes the offset in the x direction between the window center and the average of one endpoint from each vector (head or tail, chosen according to each vector's orientation). This yields a steering value that steers the cup car in the correct direction, stored in steer. The speed is set to linear_velocity and stored in speed. A worked example is sketched below.
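The following sketch works through the two-vector case with the same constants as follow_line(); the endpoint x-coordinates are invented for the example:

/* Two-vector case: steering is proportional to how far the midpoint of
 * the two chosen line endpoints lies from the window center */
#include <stdio.h>

int main(void)
{
    double frame_width = 78;
    double window_center = frame_width / 2;   /* 39 */
    double angular_velocity = -0.6;

    /* Chosen endpoint x-coordinates: 30 (left line) and 60 (right line) */
    double left_x = 30.0, right_x = 60.0;

    double steer = angular_velocity
                   * (((left_x + right_x) / 2.0) - window_center)
                   / frame_width;
    printf("steer = %.3f\n", steer);  /* midpoint 45, offset +6, about -0.046 */
    return 0;
}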

After calculating the speed and steer values, the CasADi-generated ackermann_steering function is called to turn the commanded linear and angular velocity into a steering angle for the actuators; the wheel angular velocity is then derived from the linear velocity and the wheel radius, and both are passed to b3rb_set_actuators.
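Judging by its name and arguments (wheel base, omega, V), the generated function is assumed to implement the standard bicycle-model relation delta = atan(wheel_base * omega / V). The plain-C sketch below shows that assumption, not the generated code itself, and the numeric values in main() are hypothetical:

#include <math.h>
#include <stdio.h>

/* Assumed bicycle-model relation: omega = V * tan(delta) / wheel_base */
static double ackermann_steering_sketch(double wheel_base, double omega, double V)
{
    if (fabs(V) < 1e-6) {
        return 0.0;  /* no meaningful steering angle at standstill */
    }
    return atan(wheel_base * omega / V);
}

int main(void)
{
    /* Hypothetical values: wheel base 0.17 m, omega -0.046 rad/s, V 0.64 m/s */
    printf("delta = %.4f rad\n", ackermann_steering_sketch(0.17, -0.046, 0.64));
    return 0;
}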

Modifying the code

After making changes to the velocity.c file or any files in the ~/cognipilot/ws/cerebri/app/b3rb directory, remember to rebuild the application for the changes to take effect.

For simulation:

cd ~/cognipilot/ws/cerebri 
west update
west build -b native_sim app/b3rb/ -p -t install

For real robot:

cd ~/cognipilot/ws/cerebri 
west update
west build -b mr_canhubk3 app/b3rb -p
west flash

Auto mode and Arming the robot automatically

In the nxp_track_vision.py file within the ROS2 package nxp_cup_vision, you'll find the following code snippet:

def statusCallback(self, data):
    # Putting the robot in AUTO mode
    if data.mode != 2:
        joystick_msg = sensor_msgs.msg.Joy()
        joystick_msg.header.stamp = ROSClock().now().to_msg()
        joystick_msg.axes = [0.0, 0.0, 0.0, 0.0]
        joystick_msg.buttons = [0, 1, 0, 0, 0, 0, 0, 0]
        # self.JoystickPub.publish(joystick_msg)
    # If the robot is in AUTO mode -> arm the robot
    elif data.mode == 2 and data.arming != 2:
        joystick_msg = sensor_msgs.msg.Joy()
        joystick_msg.header.stamp = ROSClock().now().to_msg()
        joystick_msg.axes = [0.0, 0.0, 0.0, 0.0]
        joystick_msg.buttons = [0, 0, 0, 0, 0, 0, 0, 1]
        # self.JoystickPub.publish(joystick_msg)

This code snippet automatically sets the B3RB robot in AUTO mode and arms the robot. By default, the self.JoystickPub.publish(joystick_msg) lines are commented out to allow control through Foxglove Studio during development. Uncomment these lines if you want the robot to automatically enter AUTO mode and arm itself.

If you need the robot to delay its start, add a sleep call (if the file does not already import it, add from time import sleep at the top):

sleep(15.0)

This makes the robot wait for 15 seconds before starting, which can be particularly useful for the NXP-CUP competition.
