
NXP-CUP 2024: line follower algorithm

This page explains the software required for the NXP CUP 2024 with the MR-B3RB robot.

The CogniPilot software runs on both the NavQPlus (within ROS2) and the MR-CANHUBK344 (within the Zephyr RTOS), together forming an Ackermann-steering robotic development platform.

Please follow the installation instructions to obtain the modified branch of the CogniPilot software, specifically tailored for the NXP CUP 2024; these instructions are explained in the previous section, NXP-CUP 2024: Simulation Install instructions.

ROS2 brief explanation

The ROS2 part of the software is designed for high-level tasks and should be treated as a "black box" for the NXP-CUP 2024, meaning that its code should not be modified. If you encounter any errors in this code, please report them to the NXP Cup organizers or through the technical support channel on the official NXP CUP Discord: https://discord.gg/JcWPbw649S

The main CogniPilot modification on the ROS2 side for the NXP Cup 2024 is the addition of the nxp_track_vision node, which receives camera data, transforms the perspective, and extracts the Pixy Vector message. This message consists of two vectors, each with a tail and a head, defining the borders of the race track. This information is used by the robot controller to ensure that the robot stays within the track boundaries.
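The perspective transform itself happens inside the nxp_track_vision node; conceptually, it maps each image point through a 3x3 homography matrix. A minimal sketch of that mapping (the matrix below is an illustrative placeholder, not the node's actual calibration):

```python
def apply_homography(H, x, y):
    """Map an image point (x, y) through a 3x3 homography H
    (a list of three row lists) and return the transformed point."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w

# A pure translation homography: shift points by (+5, -3)
H_shift = [[1, 0, 5], [0, 1, -3], [0, 0, 1]]
print(apply_homography(H_shift, 2.0, 2.0))  # (7.0, -1.0)
```

A real bird's-eye-view transform uses a non-trivial bottom row, so straight track edges that converge in the camera image become parallel after the mapping.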

Here's the definition of the Pixy Vector ROS2 message:

The std_msgs/Header in the Pixy Vector ROS2 message includes the timestamp indicating when the message was sent. Additionally, the message is composed by two vectors, with each vector represented by a pair of points: one indicating where the vector begins (tail) and the other where it ends (head). Each of these points is defined by an x and a y value, specifying their position in a two-dimensional space.
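For illustration, the message can be mirrored in plain Python to show how the two vectors are encoded and how an absent vector (all four coordinates zero) is detected. The field names come from the message definition reproduced on this page; the class itself is a sketch, not the generated ROS2 message class:

```python
from dataclasses import dataclass

@dataclass
class PixyVector:
    """Plain-Python mirror of the Pixy Vector message fields (header
    omitted). Illustrative only; the real class is generated from the
    .msg definition."""
    m0_x0: int = 0
    m0_y0: int = 0
    m0_x1: int = 0
    m0_y1: int = 0
    m1_x0: int = 0
    m1_y0: int = 0
    m1_x1: int = 0
    m1_y1: int = 0

def count_vectors(v: PixyVector) -> int:
    """A vector whose four coordinates are all zero is treated as absent,
    mirroring the num_vectors check in follow_line()."""
    n = 0
    if any((v.m0_x0, v.m0_y0, v.m0_x1, v.m0_y1)):
        n += 1
    if any((v.m1_x0, v.m1_y0, v.m1_x1, v.m1_y1)):
        n += 1
    return n

# One track edge visible: only the m0 vector is populated
print(count_vectors(PixyVector(m0_x0=10, m0_y0=51, m0_x1=30, m0_y1=11)))  # 1
```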

The nxp_track_vision node also publishes the Camera Debug Image. This image shows the perspective transformation applied to the camera's view, along with the detected lines of the track. Additionally, it displays the vectors being sent by the track vision node. This visual feedback is helpful for debugging and optimizing the robot's control software.

This is an example of the debug camera image:

You can visualize this image through Foxglove. To do that, first run:

Then add an Image panel to the B3RB layout provided and configure the ROS2 topic to /nxp_cup/debug_image.

Control algorithm through Cerebri

This section explains the line-following control algorithm. The algorithm lives in Cerebri, running on the Zephyr RTOS on the MR-CANHUBK344. Cerebri acts as the brain of the operation, handling the decision-making that guides the robot along the track.

This code is what you must modify and improve for the NXP CUP 2024.

The main function of the line follower algorithm is located in the velocity state of the finite state machine that manages the robot's behaviour. This file is located at ~/cognipilot/ws/cerebri/app/src/velocity.c

The code is also available on GitHub.

The application folder is where the vehicle finite state machine, control, and estimation code reside. For more information visit: https://airy.cognipilot.org/cerebri/platforms/rovers/

In that folder you will find a file called velocity.c. This code is written in C and has two main operation modes: cmd_vel mode, which listens to the /cmd_vel topic published by the Nav2 node in ROS2, and auto mode, which listens to the Pixy Vector message sent by the nxp_track_vision node and estimates the velocity needed to keep the robot on the race track:

Several important variables and functions are used in the code; they are described in the reference table below:

A switch statement evaluates the number of vectors found in the camera image; what the robot does depends on this number:

Case: 0 vectors found

If no vectors are found (case 0), then the vehicle will stop.

Case: 1 vector found

If one vector is found, the algorithm computes the gradient of the vector, derives the steering value from it (stored in steer), and sets the speed to the constant linear velocity (stored in speed).
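The case-1 logic can be restated as a standalone function, using the constants from the velocity.c listing on this page (78x51 frame, angular_velocity of -0.6, single_line_steer_scale of 0.6). This is a Python re-statement for illustration, not the code the robot runs:

```python
def single_vector_steer(x0, y0, x1, y1,
                        frame_width=78.0, frame_height=51.0,
                        angular_velocity=-0.6, single_line_steer_scale=0.6):
    """Orient the vector so it points toward increasing x, normalise it by
    the frame size, then steer in proportion to the gradient x / y
    (mirrors case 1 of follow_line in velocity.c)."""
    if x1 > x0:
        x = (x1 - x0) / frame_width
        y = (y1 - y0) / frame_height
    else:
        x = (x0 - x1) / frame_width
        y = (y0 - y1) / frame_height
    if x0 != x1 and y != 0.0:
        return -angular_velocity * (x / y) * single_line_steer_scale
    return 0.0

# A perfectly vertical vector produces no steering correction
print(single_vector_steer(20, 51, 20, 0))  # 0.0
```

Note that a vector leaning right (head x larger than tail x after orienting) yields a negative steer here because angular_velocity is negative, so the sign conventions follow the C listing.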

Case: 2 vectors found

If two vectors are found, then the example algorithm will find the offset in the x direction of the average of the two head points of each vector. This will give us a steering value that will steer the cup car in the correct direction, which will be stored in steer. The speed value will be calculated and is stored in speed.
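The mid-point computation can be restated standalone; this Python sketch mirrors the first branch of case 2 in the velocity.c listing (both vectors oriented the same way) and is for illustration only:

```python
def two_vector_steer(m0_head_x, m1_head_x,
                     frame_width=78.0, angular_velocity=-0.6):
    """Steer in proportion to the offset of the mid-point of the two
    head x-coordinates from the centre of the frame (mirrors the first
    branch of case 2 of follow_line in velocity.c)."""
    window_center = frame_width / 2
    return angular_velocity * ((m0_head_x + m1_head_x) / 2.0 - window_center) / frame_width

# Heads centred around the frame middle: no correction needed
print(two_vector_steer(30.0, 48.0))
```

The further the mid-point of the two track edges drifts from the frame centre, the larger the correction, with the sign again set by the negative angular_velocity constant.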

After calculating the speed and steer values, the code calls the CASADI-generated ackermann_steering function to compute the steering angle for the actuators, and b3rb_set_actuators applies the resulting turn angle and wheel rate. The robot then moves accordingly.
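The ackermann_steering routine itself is generated by CASADI and is a black box in the listing. Assuming it implements the standard kinematic bicycle model (an assumption, not confirmed by the source), the relation would be delta = atan(wheel_base * omega / V), with the forward wheel rate V / wheel_radius, as in this sketch (the wheel_base and wheel_radius values are hypothetical):

```python
import math

def ackermann_actuators(wheel_base, wheel_radius, V, omega):
    """Map commanded linear velocity V and yaw rate omega to a steering
    angle and a forward wheel rate, assuming a kinematic bicycle model.
    (Assumption: the CASADI-generated ackermann_steering function computes
    a comparable relation; this is illustrative, not the generated code.)"""
    omega_fwd = V / wheel_radius
    # Like velocity.c, only steer when the vehicle is actually moving
    turn_angle = math.atan(wheel_base * omega / V) if abs(V) > 0.01 else 0.0
    return turn_angle, omega_fwd

# Driving straight: zero steering angle, wheel rate = V / wheel_radius
angle, wheel_rate = ackermann_actuators(0.3, 0.05, 0.7, 0.0)
```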

Modifying the code

Before making changes to the velocity.c file or any files in the ~/cognipilot/ws/cerebri/app/b3rb directory, remember to rebuild the application to see the changes take effect.

For simulation:

For real robot:

Auto mode and Arming the robot automatically

In the nxp_track_vision.py file within the ROS2 package nxp_cup_vision, you'll find the following code snippet:

This code snippet automatically sets the B3RB robot in AUTO mode and arms the robot. By default, the self.JoystickPub.publish(joystick_msg) lines are commented out to allow control through Foxglove Studio during development. Uncomment these lines if you want the robot to automatically enter AUTO mode and arm itself.

If you need the robot to delay its start, add a sleep command:

This makes the robot wait for 15 seconds before starting, which can be particularly useful for the NXP-CUP competition.

Variable/function reference for velocity.c:

  • steer: Stores the steering (angular) velocity, in the same way as the standard ROS2 cmd_vel message. It is calculated from the Pixy Vector information and controls the direction the car turns.

  • vel_linear_x: Stores the linear velocity (vel.linear.x), in the same way as the standard ROS2 cmd_vel message. It is adjusted based on the steering angle to ensure smooth turns.

  • num_vectors: Stores the number of Pixy Vectors obtained (0, 1, or 2), indicating how many lines the Pixy camera has detected on the track.

  • frame_width: Width of the green frame on the debug image, on which the Pixy vector values from the nxp_track_vision node are based.

  • frame_height: Height of the green frame on the debug image, on which the Pixy vector values from the nxp_track_vision node are based.

  • Context* ctx: A pointer to the context struct, a custom data type that stores information such as the Pixy vector, the robot status, and the wheel base and radius.

  • Pixy Vector: Contains the Pixy vectors obtained from the nxp_track_vision node, including the starting and ending points (x0, y0, x1, y1) of each vector detected by the camera.

  • linear_velocity: Stores the constant linear velocity at which the car moves forward in a straight line.

  • angular_velocity: Stores the angular velocity at which the car steers; it represents the rate of change of the car's direction.

  • single_line_steer_scale: Controls how much the car steers when it can only find one line of the race track; it scales the angular velocity to adjust the steering sensitivity.

(Figures, not shown here: the camera image prior to the perspective transformation and the image after the perspective transformation.)

# Pixy Vector ROS2 message definition
std_msgs/Header header
# Vector 0 head and tail points
uint32 m0_x0    # Tail of vector @ x
uint32 m0_y0    # Tail of vector @ y
uint32 m0_x1    # Head of vector @ x
uint32 m0_y1    # Head of vector @ y

# Vector 1 head and tail points
uint32 m1_x0    # Tail of vector @ x
uint32 m1_y0    # Tail of vector @ y
uint32 m1_x1    # Head of vector @ x
uint32 m1_y1    # Head of vector @ y
ros2 launch electrode electrode.launch.py #sim:=true (If you are using simulation)
// Follow line function in velocity.c
static void follow_line(context* ctx)
{
    double frame_width = 78;
    double frame_height = 51;
    double window_center = frame_width / 2;
    double linear_velocity=0.7;
    double angular_velocity=-0.6;
    double single_line_steer_scale=0.6;

    double x = 0.0;
    double y = 0.0;

    double steer = 0.0;
    double speed = 0.0;
     
    int num_vectors = 0;
    if(!(ctx->pixy_vector.m0_x0 == 0 && ctx->pixy_vector.m0_x1 == 0 && ctx->pixy_vector.m0_y0 == 0 && ctx->pixy_vector.m0_y1 == 0)) {
        num_vectors++;
    }
    if(!(ctx->pixy_vector.m1_x0 == 0 && ctx->pixy_vector.m1_x1 == 0 && ctx->pixy_vector.m1_y0 == 0 && ctx->pixy_vector.m1_y1 == 0)) {
        num_vectors++;
    }
    
    switch(num_vectors) {
        case 0:
            speed = 0.0;
            steer = 0.0;
            break;
        case 1:
            if(ctx->pixy_vector.m0_x1 > ctx->pixy_vector.m0_x0) {
                x = (ctx->pixy_vector.m0_x1 - ctx->pixy_vector.m0_x0) / frame_width;
                y = (ctx->pixy_vector.m0_y1 - ctx->pixy_vector.m0_y0) / frame_height;
            } else {
                x = (ctx->pixy_vector.m0_x0 - ctx->pixy_vector.m0_x1) / frame_width;
                y = (ctx->pixy_vector.m0_y0 - ctx->pixy_vector.m0_y1) / frame_height;
            }

            if((ctx->pixy_vector.m0_x0 != ctx->pixy_vector.m0_x1) && ((y > 0.0) || (y <0.0))) {
                steer = -angular_velocity * (x / y) * single_line_steer_scale;
            } else {
                steer = 0.0;
            }
            speed = linear_velocity;
            break;
        case 2:
            if((ctx->pixy_vector.m1_x0 >= ctx->pixy_vector.m1_x1) && (ctx->pixy_vector.m0_x0 <= ctx->pixy_vector.m0_x1)){
                steer = angular_velocity * (((ctx->pixy_vector.m0_x1 + ctx->pixy_vector.m1_x1) / 2.0) - window_center) / frame_width;
            }else if ((ctx->pixy_vector.m1_x0 < ctx->pixy_vector.m1_x1) && (ctx->pixy_vector.m0_x0 <= ctx->pixy_vector.m0_x1)){
                steer = angular_velocity * (((ctx->pixy_vector.m0_x1 + ctx->pixy_vector.m1_x0) / 2.0) - window_center) / frame_width;
            }else if ((ctx->pixy_vector.m1_x0 > ctx->pixy_vector.m1_x1) && (ctx->pixy_vector.m0_x0 > ctx->pixy_vector.m0_x1)){
                steer = angular_velocity * (((ctx->pixy_vector.m0_x0 + ctx->pixy_vector.m1_x1) / 2.0) - window_center) / frame_width;
            }else {
                steer = angular_velocity * (((ctx->pixy_vector.m0_x0 + ctx->pixy_vector.m1_x0) / 2.0) - window_center) / frame_width;
            }

            speed = linear_velocity;      
            break;
    }
    
    double vel_linear_x = speed * (1 - fabs(2 * steer));  
     
    double turn_angle = 0;
    double omega_fwd = 0;
    double V = vel_linear_x;
    double omega = steer;
    double delta = 0;
    CASADI_FUNC_ARGS(ackermann_steering);
    args[0] = &ctx->wheel_base;
    args[1] = &omega;
    args[2] = &V;
    res[0] = &delta;
    CASADI_FUNC_CALL(ackermann_steering);

    omega_fwd = V / ctx->wheel_radius;
    if (fabs(V) > 0.01) {
        turn_angle = delta;
    }
    b3rb_set_actuators(&ctx->actuators, turn_angle, omega_fwd);
}
# Rebuild for simulation:
cd ~/cognipilot/ws/cerebri
west update
west build -b native_sim app/b3rb/ -p -t install
# Rebuild and flash for the real robot:
cd ~/cognipilot/ws/cerebri
west update
west build -b mr_canhubk3 app/b3rb -p
west flash
def statusCallback(self, data):
    # Put the robot in AUTO mode
    if data.mode != 2:
        joystick_msg = sensor_msgs.msg.Joy()
        joystick_msg.header.stamp = ROSClock().now().to_msg()
        joystick_msg.axes = [0.0, 0.0, 0.0, 0.0]
        joystick_msg.buttons = [0, 1, 0, 0, 0, 0, 0, 0]
        # self.JoystickPub.publish(joystick_msg)
    # If the robot is already in AUTO mode, arm it
    elif data.mode == 2 and data.arming != 2:
        joystick_msg = sensor_msgs.msg.Joy()
        joystick_msg.header.stamp = ROSClock().now().to_msg()
        joystick_msg.axes = [0.0, 0.0, 0.0, 0.0]
        joystick_msg.buttons = [0, 0, 0, 0, 0, 0, 0, 1]
        # self.JoystickPub.publish(joystick_msg)
sleep(15.0)

NXP-CUP 2024: Simulation Install instructions

Cognipilot simulation installation specific instructions for nxp-cup

Warning:

To follow this tutorial you first need to follow and install everything explained in: MR-B3RB Software developer guide

If your installation was successful, you should be able to launch the Gazebo simulation by running the following commands on your development computer:

Important:

Please ensure that these commands function correctly and that your setup resembles what is shown in the example video.

Launching the Gazebo Harmonic simulation will enable you to control the robot via the Foxglove interface. As explained in this section:

Explanation of repositories to be updated

Next, we will install all the necessary packages for developing the algorithm for the NXP CUP 2024, applicable to both the MR-B3RB simulation and the actual robot.

To simplify repository updates, we've created a shell script. It automates updating all the repositories to the latest versions hosted on GitHub, specifically for the NXP Cup.

The changes to the repositories are:

  • Synapse Protobuf and Synapse Tinyframe: Define the Pixy Vector message in the Protobuf language and provide essential definitions and functions for the TinyFrame library to operate correctly. For more information, refer to the Synapse documentation.

  • Synapse msgs and Synapse ros: Extend ROS with the Pixy Vector message definition and support Synapse, which facilitates communication between the NavQPlus (ROS2) and the CANHUBK344 (Zephyr).

  • Dream world: Adds a simulated race track environment to the workspace.

Bash script to update the repositories

This script ensures that all the repositories are updated. Save it as update_repos_native.sh in the ~/cognipilot directory, using any text editor of your choice, such as Vim, Gedit, Nano, or VSCode.

Building cranium, electrode and cerebri

Then, open the terminal and execute the following commands:

This should update all your repositories. Please review the output of the script; if you have made local changes to the repositories, you may need to stash them.

To build the software, navigate to the cranium directory within ~/cognipilot:

Repeat the build process for the electrode directory:

Update the West workspace for cerebri.

Testing the update

Now, you are ready to run the simulation:

To launch the Gazebo simulation featuring the MR-B3RB robot on a simple raceway, execute:

To start the Foxglove viewer for the simulation, run:

These commands will initiate the Gazebo simulation with the MR-B3RB robot and the raceway, alongside launching the Foxglove viewer.

To initiate the robot's movement, press the AUTO button and the arm button on the joystick. This action will activate the robot's line-following mode and set it to the armed state, preparing it for operation.

Once the AUTO and arm buttons on the joystick are pressed, the robot will immediately begin to navigate around the racetrack. To understand the underlying code and setup, and to explore ways to obtain better performance for the NXP CUP 2024, proceed to the next tutorial.

  • B3RB simulator: Includes the launch file for the node that processes the race track images.

  • Electrode: Integrates support for the debug image and pixy vector in Foxglove, facilitating debugging and visualization.

  • Cerebri: Implements a subscribing mechanism to receive the Pixy Vector message and controls for the line follower algorithm.

  • NXP CUP Vision: Added ROS2 package for processing race track images, transforming perspectives, detecting track edges, and publishing both the debug image and the Pixy Vector, pivotal for visual processing.

    # Launch the Gazebo simulation with the MR-B3RB robot:
    ros2 launch b3rb_gz_bringup sil.launch.py
    # Start the Foxglove viewer for the simulation:
    ros2 launch electrode electrode.launch.py sim:=true
    #!/bin/bash
    
    declare -A repos=(
        ["cranium/src/synapse_protobuf"]="https://github.com/NXPHoverGames/synapse_protobuf"
        ["cranium/src/synapse_tinyframe"]="https://github.com/NXPHoverGames/synapse_tinyframe"
        ["cranium/src/synapse_msgs"]="https://github.com/NXPHoverGames/synapse_msgs"
        ["cranium/src/synapse_ros"]="https://github.com/NXPHoverGames/synapse_ros"
        ["cranium/src/dream_world"]="https://github.com/NXPHoverGames/dream_world"
        ["cranium/src/b3rb_simulator"]="https://github.com/NXPHoverGames/b3rb_simulator"
        ["electrode/src/electrode"]="https://github.com/NXPHoverGames/electrode"
        ["ws/cerebri"]="https://github.com/NXPHoverGames/cerebri"
        ["cranium/src/nxp_cup_vision"]="https://github.com/NXPHoverGames/nxp_cup_vision"
    )
    
    # Update existing repositories or clone if they don't exist
    for repo in "${!repos[@]}"; do
        echo "Processing $repo..."
        repo_path="${repo}" # Derive the full path
        
        # Check if the repository directory exists. If it does not, clone it.
        if [ ! -d "${repo_path}" ]; then
            echo "Repository ${repo_path} does not exist. Cloning..."
            git clone --branch nxp-cup "${repos[$repo]}" "${repo_path}"
            echo "Cloned ${repo} into ${repo_path}."
        fi
        
        # Navigate to the repository directory
        cd "${repo_path}" || { echo "Failed to change directory to ${repo_path}. Does it exist?"; continue; }
        
        # Set the new remote URL
        git remote set-url origin "${repos[$repo]}"
        echo "Remote changed to ${repos[$repo]}"
    
        # Fetch changes from the new remote
        git fetch origin
        echo "Fetched changes from origin."
    
        # Checkout the specific branch
        git checkout nxp-cup
        if [ $? -eq 0 ]; then
            echo "Checked out nxp-cup branch."
            # Pull the latest changes from the branch
            git pull origin nxp-cup
            echo "Pulled latest changes from nxp-cup branch."
        else
            echo "Failed to checkout nxp-cup branch. Does it exist?"
        fi
        
        # Return to the original directory
        cd - > /dev/null
    done
    cd ~/cognipilot
    chmod +x update_repos_native.sh
    ./update_repos_native.sh
    # Build cranium:
    cd ~/cognipilot/cranium/
    colcon build --symlink-install
    cd ~/cognipilot/cranium/
    source install/setup.bash
    # Build electrode:
    cd ~/cognipilot/electrode/
    colcon build --symlink-install
    cd ~/cognipilot/electrode/
    source install/setup.bash
    # Update the West workspace and build cerebri for simulation:
    cd ~/cognipilot/ws/cerebri
    west update
    west build -b native_sim app/b3rb/ -p -t install
    source ~/.bashrc
    # Launch Gazebo with the MR-B3RB robot on the raceway:
    ros2 launch b3rb_gz_bringup sil.launch.py world:=nxp_raceway_octagon
    # Start the Foxglove viewer:
    ros2 launch electrode electrode.launch.py sim:=true

    NXP-CUP 2024 with MR-B3RB

    This section explains the essential software required for participating in the NXP Cup 2024 using the MR-B3RB Robot.

    Participants will use a combination of pre-configured ROS2 and Zephyr-Cerebri software. Here, "Cerebri" refers to the application layer, while "Zephyr" is the underlying Real-Time Operating System (RTOS). This software stack integrates with the Cognipilot autopilot system, which is available at https://airy.cognipilot.org/.

    The following pages provide detailed installation instructions for both the simulation environment and the actual robot. Additionally, there is a section dedicated to explaining the line follower algorithm for the NXP Cup 2024.


    NXP-CUP 2024: MR-B3RB robot Install instructions

    Cognipilot MR-B3RB Robot installation specific instructions for nxp-cup

    Warning:

    To follow this tutorial you first need to follow and install everything explained in: MR-B3RB Software developer guide

    Important:

    You must also follow the NXP-CUP 2024: Simulation Install instructions, as they explain how to update your development computer for the NXP-CUP 2024.

    If you have correctly followed the previous steps, you should already be familiar with how to flash the Cognipilot software onto the MR-CANHUBK344 board. In this step, we will flash the board again, but this time with an updated version of the Cerebri software that includes support for the NXP Cup 2024. Additionally, we will need to log in via SSH to the NavQPlus and update the cranium workspace to enable support for the NXP Cup, similar to the procedure outlined on the previous page.

    Build and flash the NXP-CUP Cerebri image for B3RB

    On your development computer, go to:

    Important:

    Check that you are on the nxp-cup branch:

    Which should return:

    If it doesn't show that, check out the nxp-cup branch:

    Once you have ensured that you are on the nxp-cup branch and that there are no pending updates, type the following commands:

    Build NXP-CUP Cerebri B3RB image for MR CANHUBK3 with west

    Flash Cerebri B3RB image with west to MR CANHUBK3

    Make sure the Segger J-Link software is installed on your system and the MR CANHUBK3 is connected to the J-Link programmer (J-Link EDU Mini via the DCD-LZ adapter board).

    Once it has finished, you can disconnect the J-Link from your computer and from the MR-CANHUBK344.

    Note:

    In case the west flash command failed, enter the following commands in the terminal:

    Updating repositories on NavQPlus

    Important:

    If you have any doubt or problem in this section, please first refer to this documentation: https://airy.cognipilot.org/cranium/compute/navqplus/setup/

    Once you have flashed the MR-CANHUBK344, it's time to log into the NavQPlus with this command:

    Next, we will install the packages to develop the algorithm for the NXP CUP 2024 in the B3RB robot.

    To simplify repository updates, we've created a shell script. It automates updating all the repositories to the latest versions hosted on GitHub, specifically for the NXP Cup.

    The changes to the repositories are:

    • Synapse Protobuf and Synapse Tinyframe: Define the Pixy Vector message in the Protobuf language and provide essential definitions and functions for the TinyFrame library to operate correctly. For more information, refer to the Synapse documentation.

    • Synapse msgs and Synapse ros: Extend ROS with the Pixy Vector message definition and support Synapse, which facilitates communication between the NavQPlus (ROS2) and the CANHUBK344 (Zephyr).

    • B3RB robot: Includes the launch file for the node that processes the race track images.

    This script ensures that all the repositories are updated. Save it as update_repos_navqplus.sh in the ~/cognipilot directory, using any text editor of your choice, such as Vim or nano.

    Then, open the terminal and execute the following commands:

    Change directory to ~/cognipilot and make the script executable:

    This should update all your repositories. Please review the output of the script; if you have made local changes to the repositories, you may need to stash them.

    To build the software, navigate to the cranium directory within ~/cognipilot:

    Now, you are ready to operate the platform.

    On the NavQPlus, run the B3RB launch file:

    To start Foxglove for operation and data visualization, run the following command on your development computer:

    These commands will start the algorithm on the MR-B3RB robot and the Foxglove viewer. Place your robot on the racetrack.

    To initiate the robot's movement, press the AUTO button and the arm button on the joystick. This action will activate the robot's line-following mode and set it to the armed state, preparing it for operation.

    Once the AUTO and arm buttons on the joystick are pressed, the robot should immediately begin to navigate around the racetrack. To understand the underlying code and setup, and to explore ways to obtain better performance for the NXP CUP 2024, proceed to the next tutorial.

    • NXP CUP Vision: Added ROS2 package for processing race track images, transforming perspectives, detecting track edges, and publishing both the debug image and the Pixy Vector, pivotal for visual processing.

    # Go to the cerebri workspace and pull the latest changes:
    cd ~/cognipilot/ws/cerebri
    git pull
    # Check the current branch:
    cd ~/cognipilot/ws/cerebri
    git status
    # Expected output:
    On branch nxp-cup
    Your branch is up to date with 'origin/nxp-cup'.
    
    nothing to commit, working tree clean
    # If it doesn't show that:
    git checkout nxp-cup
    # Update the West workspace:
    cd ~/cognipilot/ws/cerebri
    west update
    # Build the Cerebri B3RB image for the MR CANHUBK3:
    cd ~/cognipilot/ws/cerebri
    west build -b mr_canhubk3 app/b3rb -p
    # Flash the image to the board:
    cd ~/cognipilot/ws/cerebri
    west flash
    # If west flash failed, flash manually with JLinkExe:
    JLinkExe
    connect
    S32K344
    S
    4000
    loadbin /home/$user/cognipilot/ws/cerebri/build/zephyr/zephyr.elf 0x400000
    exit
    ssh <username>@<hostname>.local
    #!/bin/bash
    
    declare -A repos=(
        ["cranium/src/synapse_protobuf"]="https://github.com/NXPHoverGames/synapse_protobuf"
        ["cranium/src/synapse_tinyframe"]="https://github.com/NXPHoverGames/synapse_tinyframe"
        ["cranium/src/synapse_msgs"]="https://github.com/NXPHoverGames/synapse_msgs"
        ["cranium/src/synapse_ros"]="https://github.com/NXPHoverGames/synapse_ros"
        ["cranium/src/b3rb_robot"]="https://github.com/NXPHoverGames/b3rb_robot"
        ["cranium/src/nxp_cup_vision"]="https://github.com/NXPHoverGames/nxp_cup_vision"
    )
    
    # Update existing repositories or clone if they don't exist
    for repo in "${!repos[@]}"; do
        echo "Processing $repo..."
        repo_path="${repo}" # Derive the full path
        
        # Check if the repository directory exists. If it does not, clone it.
        if [ ! -d "${repo_path}" ]; then
            echo "Repository ${repo_path} does not exist. Cloning..."
            git clone --branch nxp-cup "${repos[$repo]}" "${repo_path}"
            echo "Cloned ${repo} into ${repo_path}."
        fi
        
        # Navigate to the repository directory
        cd "${repo_path}" || { echo "Failed to change directory to ${repo_path}. Does it exist?"; continue; }
        
        # Set the new remote URL
        git remote set-url origin "${repos[$repo]}"
        echo "Remote changed to ${repos[$repo]}"
    
        # Fetch changes from the new remote
        git fetch origin
        echo "Fetched changes from origin."
    
        # Checkout the specific branch
        git checkout nxp-cup
        if [ $? -eq 0 ]; then
            echo "Checked out nxp-cup branch."
            # Pull the latest changes from the branch
            git pull origin nxp-cup
            echo "Pulled latest changes from nxp-cup branch."
        else
            echo "Failed to checkout nxp-cup branch. Does it exist?"
        fi
        
        # Return to the original directory
        cd - > /dev/null
    done
    cd ~/cognipilot
    chmod +x update_repos_navqplus.sh
    ./update_repos_navqplus.sh
    # On the NavQPlus, build cranium:
    cd ~/cognipilot/cranium/
    colcon build --symlink-install
    cd ~/cognipilot/cranium/
    source install/setup.bash
    # On the NavQPlus, run the B3RB launch file:
    ros2 launch b3rb_bringup robot.launch.py
    # On your development computer, start Foxglove:
    ros2 launch electrode electrode.launch.py