The MR-B3RB should look like this unit after opening the box.
MR-B3RB
The lower chassis comes partially assembled with:
the side skirts attached
the side plastic attached
the PDB partially wired
and the LED lighting installed and wired.
Remove the screws that hold on the side plates and the screws that hold on the top plate.
The screws holding the side skirts and the top plate will later be replaced with the larger thumbscrews.
Use a hex screwdriver or Allen key to remove all the top screws securing the top plate to the metal frame, and carefully set them aside.
There are a variety of NXP Mobile Robotics enablements that each have their own GitBook doc as well as reference and product pages on NXP.com. You may want to review some of these other GitBooks to find more detail about specific boards used in the B3RB or boards which could interface with the B3RB:
Quick reference and index for all our mobile robotics solutions.
NavQPlus: i.MX 8M Plus companion computer for mobile robotics
MR-B3RB: NXP Buggy3 Rev B platform using NavQPlus and MR-CANHUBK344
The original drone competition, with links to drone and rover development kits using the FMUK66 vehicle management unit, and now the MR-VMU-RT1176 vehicle management unit.
Autonomous model car competition for students.
Small form factor CAN-FD to 100BASE-T1 Ethernet bridge.
CAN-FD node for mobile robotics applications.
RDDRONE-BMS772: Battery management system (3-6 cells).
RDDRONE-T1ADAPT: 100BASE-T1 ethernet adapter.
The GitBook documentation is updated regularly. Feedback is welcome: iain.galloway@nxp.com
We hope you are ready to embark on an exciting journey into the world of robotics. Meet MR-B3RB, your very own robot buggy that you build from the ground up. Even though the B3RB is toy-sized, it is a complete, generic example of a heterogeneous ROS2-enabled robotic platform with real-time processors and model predictive real-time control. This guide will walk you through the steps to prepare the hardware and the starter software. Please note this is an ADVANCED system, and a background in, or desire to learn, ROS2, Linux, and real-time operating systems is a prerequisite for successful further development on the platform.
MR-B3RB stands for "Mobile Robotics - Buggy 3 Revision B." It's a mobile robotics platform designed to ignite your curiosity and expand your understanding of robotics, programming, and engineering.
Embedded Linux Computer
Running Ubuntu POC and ROS2
Wireless Connectivity
CAN Bus
T1 two wire Ethernet and "regular" Ethernet
UARTs and other IO
Real-time microcontroller running
Zephyr RTOS
Cognipilot, which provides
hosting for a state-of-the-art Model Predictive Control (MPC) real-time control system
and a framework for transparent sensor and actuator communications to ROS2
T1 Ethernet
CAN connectivity
UART/SPI/I2C/PWMS, LED lighting and other hardware controls.
Hands-On Learning: Dive into the practical aspects of robotics, electronics, and programming.
Skill Development: Sharpen your critical thinking, creativity, and technical skills.
Fun and Engaging: Experience the joy of building and bringing your own robot to life.
Community Support: Join a community of like-minded individuals passionate about technology and innovation.
Step-by-Step Instructions: Detailed, easy-to-follow steps to assemble and program your MR-B3RB.
Visual Aids: Diagrams, images, and videos to guide you through each stage of the process.
Troubleshooting Tips: Solutions to common issues to ensure a smooth assembly experience.
Additional Resources: Links to materials, tools, and further learning opportunities.
Let's get started on this exciting adventure and bring your MR-B3RB to life!
While anyone can build the MR-B3RB and get it to perform the basic functions described here, further meaningful development will at minimum require some knowledge of ROS2. Fundamentally, the B3RB is designed as a Linux-based ROS2 robotic system with an attached real-time control microcontroller. The default configuration uses Ubuntu Linux to run ROS2, alongside Zephyr RTOS + Cognipilot. Cognipilot provides the infrastructure in Zephyr for transparent and efficient communication with ROS2, manages sensors and actuators in real time, and hosts the actual control system algorithms for the vehicle. Even more involved is the development of the control algorithms themselves. You do NOT need to become a controls theory expert to use the B3RB or any of the other pre-configured vehicle models in Cognipilot, but be aware of the following: the control algorithm software is developed externally, synthesized, and then hosted by Cognipilot, which is a key feature of the Cognipilot Cerebri module. Control algorithms can be simple or state-of-the-art Model Predictive Control (MPC). Currently the B3RB uses modern Lie group MPC theory; refer to the Cognipilot documentation for more details if you are interested in diving deeper. Note also that the same MR-B3RB hardware *can* support any other Linux or RTOS software, but this GitBook focuses on the ROS2 + Zephyr robotics enablement.
Below are some suggested training websites for learning ROS2. These are only suggestions; because ROS2 is developing rapidly and continuously, you may find better resources by searching yourself.
ROS2 is an entire subject of study on its own. To be successful in further development, it is strongly suggested that you spend some time studying ROS2 fundamentals. Many resources are publicly available; below are only a few.
Udemy and other course providers offer "completion certificates"; a few examples are:
To learn how to run the robots, you will want to take:
ROSCON is a highly recommended in-person conference and a great way to jumpstart your understanding of the latest developments in ROS2. If you have the opportunity to attend, it is an excellent experience.
The gears are adjusted at the factory and should usually not need to be touched on a new B3RB. However over time it may be that they need adjustment. Follow this procedure to re-adjust the gear meshing.
1. Gently rotate the wheels using your hand while holding the B3RB in the air
2. Listen Carefully: Listen to the sound of the gears as you rotate the wheels. Ideally, the gears should mesh smoothly without any grinding or excessive noise.
3. Feel for Resistance: As you rotate the wheels, feel for any resistance, unevenness or skipping of gears. This could indicate that the gears are not meshing properly.
If you feel the gears are not meshing well, you may adjust them as follows.
1. Locate the Motor Mounting Bracket: Identify the motor mounting bracket where the motor is attached to the chassis or frame of your device.
2. Loosen the Adjustment Screws: Typically, there are two screws at the front of the motor mounting bracket that secure the motor in place. Use an appropriate screwdriver to gently loosen these screws. You don’t need to remove them completely, just enough to allow slight movement of the motor.
3. Adjust the Motor Position: Apply mild pressure to the motor in the direction that will improve the gear meshing. This usually means pushing the motor closer to the gear or slightly adjusting its angle to ensure the teeth of the gears are aligned properly.
4. Check and Tighten: After adjusting, rotate the wheels again and listen for any improvement in the gear meshing. Once satisfied with the adjustment, carefully tighten the screws back into place to secure the motor in its adjusted position.
The XT60 battery connector is installed at the factory, ensure that it pivots freely as shown in the video below.
This guide is designed to help you quickly understand and effectively use the MR-B3RB platform. Whether you're a beginner or an experienced developer, you can tailor your experience by following the structure below:
If you're new to the MR-B3RB platform or its components, we recommend starting from the beginning and working your way through each section in order. The guide is organized sequentially, with foundational concepts explained first, followed by advanced topics.
Are you unfamiliar with some of the NXP systems? Please check out the section to learn more about the variety of compatible robotics boards.
Experienced users looking for specific information can jump directly to the section they need. Each section is designed to be self-contained, so you can explore topics independently without needing to read the entire guide.
Throughout the guide, you'll find code snippets, examples, and exercises. These are marked clearly and are essential for hands-on learning. We encourage you to replicate and experiment with these examples to deepen your understanding.
Look out for highlighted notes, tips, and warnings. These callouts contain critical information, best practices, and potential pitfalls that can save you time and effort.
Tips
Warnings
Some sections include links to external resources, interactive diagrams, or tools. These are designed to enhance your learning experience, so don’t hesitate to explore them for additional context.
A geometrical arrangement in vehicles ensuring all wheels follow concentric circular paths while turning, allowing inner wheels to turn sharper than outer wheels, thus reducing tire slip and wear. The B3RB uses Ackermann steering geometry.
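As a quick illustration of the geometry, the sketch below computes the inner and outer front-wheel angles for a given turn radius. The dimensions used are hypothetical, not measured from the B3RB:

```python
import math

def ackermann_angles(wheelbase, track, turn_radius):
    """Steering angles (radians) for the inner and outer front wheels.

    turn_radius is measured from the turn center to the rear-axle midpoint.
    """
    inner = math.atan(wheelbase / (turn_radius - track / 2.0))
    outer = math.atan(wheelbase / (turn_radius + track / 2.0))
    return inner, outer

# Hypothetical dimensions (NOT measured from the B3RB):
inner, outer = ackermann_angles(wheelbase=0.26, track=0.25, turn_radius=1.0)
print(round(math.degrees(inner), 1), round(math.degrees(outer), 1))  # prints 16.5 13.0
```

Note that the inner wheel always turns sharper than the outer one, which is exactly the property described above.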
Controller Area Network Bus: A robust vehicle bus standard designed to allow microcontrollers and devices to communicate with each other without a host computer. The NavQPlus and the MR-CANHUBK344 both have CAN bus ports. By default nothing is attached, but you could attach devices such as servos, motor controllers, or a GPS if they have a CAN bus interface. The MR-UCANS32K1SIC is a small board that allows for development of CAN peripherals.
The software platform used in MR-B3RB to enable autonomous navigation and control, integrating sensor data and executing navigation algorithms.
(Global Positioning System): A satellite-based navigation system used in MR-B3RB to provide accurate position and timing information.
(Inertial Measurement Unit): A device that measures and reports a robot's specific force, angular rate, and sometimes the magnetic field surrounding the body, aiding in navigation and orientation.
Kalman Filter: An algorithm that uses a series of measurements observed over time, containing statistical noise and other inaccuracies, to produce estimates of unknown variables that tend to be more accurate than those based on a single measurement alone.
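A minimal one-dimensional sketch may make the idea concrete. The sensor values and noise figures below are toy numbers, not B3RB data:

```python
def kalman_update(est, var, meas, meas_var):
    """One scalar Kalman update: fuse a noisy measurement into an estimate."""
    k = var / (var + meas_var)       # Kalman gain: how much to trust the measurement
    est = est + k * (meas - est)     # corrected estimate
    var = (1.0 - k) * var            # uncertainty shrinks after each fusion
    return est, var

# Fuse three noisy range readings of a wall that is really 2.0 m away:
est, var = 0.0, 1000.0               # start from a vague prior
for z in (2.1, 1.9, 2.05):
    est, var = kalman_update(est, var, z, meas_var=0.04)
print(round(est, 2))  # prints 2.02
```

After only three fusions the estimate is already closer to the true value than any single noisy reading, which is the property the definition describes.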
LIDAR: Light Detection and Ranging, a remote sensing method used in MR-B3RB for measuring distances and creating detailed maps of the environment.
MR-CANHUBK344: A central hub in the MR-B3RB system used for managing CAN (Controller Area Network) communications between different modules.
NavQPlus: A development platform used in MR-B3RB for advanced computing and sensor integration, including cameras and GPS.
PWM (Pulse Width Modulation): A technique used to control the power supplied to electrical devices, crucial for motor speed control in MR-B3RB.
QDEC (Quadrature Decoder): A system used to interpret signals from rotary encoders, providing precise measurements of wheel rotations in MR-B3RB.
RGB LED: A light-emitting diode that can emit red, green, and blue colors, used in MR-B3RB for visual status indicators.
ROS2 (Robot Operating System 2): An open-source framework for robot software development, providing tools and libraries for building and controlling robots like MR-B3RB.
SLAM (Simultaneous Localization and Mapping): A process by which a robot or device constructs a map of an unknown environment while simultaneously keeping track of its location within that environment.
ToF (Time of Flight) Camera: A depth sensor that measures the time it takes for a light signal to return after being reflected off an object, used in MR-B3RB for obstacle detection.
- Automation: Derived from the Greek word "automaton," meaning something that acts by itself. Refers to machines repeating the same motion or tasks without human intervention.
- Autonomy: The capability of a system to govern itself and make decisions independently, operating under its own set of laws.
Robot vs Vehicle
- Robot: From the Czech word "robota," meaning forced labor. A machine that operates automatically or via remote control to perform tasks.
- Vehicle: A device designed for transporting people or cargo. (Despite this, the B3RB may still commonly be referred to as a type of vehicle)
Robotic System: A combination of a robot, its environment, supporting infrastructure, other machines, and humans interacting with it.
AMR vs AGV
- AMR: Autonomous Mobile Robot, capable of navigating and making decisions independently.
- AGV: Automated Guided Vehicle, follows predefined paths or instructions.
Agent and World vs Controller and Plant
- Agent and World: Conceptual framework where the agent (robot) interacts with the world (environment).
- Controller and Plant: The controller directs the plant (mechanical system) to achieve desired outcomes.
Estimate and Belief
- Estimation: The process of transforming sensor data into actionable information such as position, orientation, and velocity for the robot.
- Belief: The robot's internal representation or estimate of its current state based on the processed data.
PID Controller (Proportional-Integral-Derivative Controller): A control loop mechanism used in industrial control systems, employing feedback to maintain the desired output of a system.
Waypoint Navigation: A navigation method in which a robot or vehicle follows a series of pre-defined points, or waypoints, to reach its destination.
Instructions for the NXP MR-B3RB.
The following pages describe the mechanical assembly of the MR-B3RB-M.
Note that the lower chassis of the latest version of MR-B3RB-M should be partially assembled as follows:
Front and rear "bumper" plastic
PDB (power distribution board) installed
LED lighting installed and wired to PDB
QDEC (Quadrature Decoder) cable to motor installed
Motor cable connected to PDB
XT60 battery wire and power switch attached to PDB
These instructions are now updated for the production version of the MR-B3RB. If you have a pre-production version, please see the archive area here or the previous version of the document.
PART
QTY
DESCRIPTION
1
M3 NYLON SPACER KIT 180PCS.
1
M2/M2.5/M3 SCREW FASTENER KIT FOR WLTOYS 144001 1/14 RC
(Add what kind of screws are in there in the picture or add a note in the screw box)
3
JST-GH 6-pin
To
JST-XH 4-pin
4 wires, 50mm wire
(Color codes for different but nearly similar connector types?)
1
NavQPlus
1
Camera and camera mount for NavQPlus. Most of the time already connected to the NavQPlus
1
Power-Delivery Board (PDB)
Should be already attached to the B3RB under body.
MUK mechanical upgrade kit:
Lidar arch plate metal and metal top cover.
Front and rear plastics (for LED covers and angles front and rear covers)
MR-CANHUBK344 is the real time controller board for the robot. After successfully attaching the NavQPlus, the next step is to install the MR-CANHUBK344 evaluation board.
Included in the kit is a CANHUB ADAP board. This board has IMU (Inertial Measurement Unit) components on it, such as an accelerometer, gyroscope, and magnetometer, as well as specific connectors for GPS and other interfaces. Attach it to the MR-CANHUBK344 as shown below.
The MR-CANHUBK344 board should be attached in the highlighted holes of the following image. Reuse the M3x5mm screws that were previously removed from the side skirts. These M3x5mm screws should be used to screw in from the bottom side.
Be cautious to not squeeze or pinch any of the existing wires attached to the NavQPlus!
Upon completion, the entire assembly should appear as follows:
We will now mount the NavQPlus and the MR-CANHUBK344 to the metal top plate. Follow the steps in the next two sections. In the end, the top plate assembly should look like this:
The RGB LEDs communicate using an SPI-like protocol, modified in that it is unidirectional. The data and power for the LEDs are supplied by a STEMMA-QT type connector on the PDB, and the SCL/SDA (SPI) lines have a second wire connector which attaches between the PDB and the SPI port on the MR-CANHUBK344.
The following image shows the LED lights setup, with the left side representing the rear and the right side the front of the car. This is pre-assembled. If for some reason you need to re-create these connections, you should know that it is a daisy chain of STEMMA-QT type connectors, and that there is a marked IN and OUT connector on each of the LED light bars. The connections run PDB -> IN (rear) -> OUT (rear) -> IN (front).
Note that STEMMA-QT connectors are used but ABUSED: they carry SPI signals, not the normal STEMMA-QT (I2C) signals.
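If you ever need to drive such light bars from your own software, a common technique for this kind of unidirectional SPI "abuse" is to encode each LED data bit as three SPI bits so the SPI waveform approximates the LED's pulse timing. The sketch below assumes WS2812-compatible LEDs on an SPI bus clocked at about 2.4 MHz; this is an assumption, so check the PDB schematic for the actual LED type before relying on it:

```python
def ws2812_to_spi(grb_bytes):
    """Encode WS2812-style LED data for an SPI peripheral at ~2.4 MHz.

    Each LED data bit becomes 3 SPI bits: 1 -> 110, 0 -> 100, so each SPI
    bit lasts ~416 ns and the high/low times approximate WS2812 timing.
    """
    out_bits = []
    for byte in grb_bytes:
        for i in range(7, -1, -1):
            out_bits += [1, 1, 0] if (byte >> i) & 1 else [1, 0, 0]
    # Pack the bit stream into bytes, MSB first
    out = bytearray()
    for i in range(0, len(out_bits), 8):
        chunk = out_bits[i:i + 8]
        chunk += [0] * (8 - len(chunk))
        out.append(int("".join(map(str, chunk)), 2))
    return bytes(out)

# One LED, full green (WS2812 data order is G, R, B):
frame = ws2812_to_spi(bytes([0xFF, 0x00, 0x00]))
print(len(frame))  # 3 data bytes * 8 bits * 3 SPI bits / 8 = 9 SPI bytes
```

The resulting byte string would then be written to the SPI peripheral in a single transfer; the exact driver call depends on your OS and SPI device.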
Use the seven-pin JST-GH wire depicted in the picture below to connect from the PDB to the MR-CANHUBK344.
Connect one end of the wire to the SPI pin of the PDB.
Connect the other end to the connector labeled SPI2 on the plastic case of the MR-CANHUBK344:
The camera housing included with the NavQPlus mounts to the front "GoPro" style mount on the front plastic:
You must use the screw and bolt that are included with the original camera mount.
The camera flex cable should come out the back of the NavQPlus and loop over the top of it, then route under the lower edge of the front bracket and up into the camera module. The final setup should look like this:
Screw the front and rear plastic to the top plate using two M3x5 screws from the bottom
The camera flex cable should already be attached to the front plastic.
The kit may include an ARMING board, a GPS module, or both. The GPS module, in addition to providing GPS, also includes an arming/safety switch, LED lights, and a beeper. Generally, if you have a GPS module, it is preferred to use that rather than the ARMING board.
NOT all kits include the GPS module; notably, the NXP-CUP car kit does not. Instead use the ARMING board and stick it down in a convenient location.
The arming board provides the same beeper and arming buttons; however, it does not include a compass component.
If provided in your kit, please install the GPS and the GPS post mounting. The pieces are listed in the page.
There is a set of screws that comes with the GPS unit. Use the 4 longer screws and Nylock Lock Nuts to attach the base mount.
It should be placed in this furthest back position:
Attach the mounting mast rod and fasten it with one of the two smaller screws left. At this time only tighten the screws loosely, as you will need to rotate, align, and remove the mast before completion.
Attach the top mount with the remaining screw. Keep it also only loosely tightened. It should look like this when completed:
Locate the M10 GPS and the 3M double tape.
Carefully align the GPS in the center of this tape. Ensure the arrow on the GPS module is facing forward. The cable of the GPS should be zip tied so it stays very close to the GPS mast in order to not obstruct the LIDAR sensor.
Trim any long cable ties with side cutters.
Ensure the metal top arch is oriented the correct way up and the Lidar is mounted facing the front. A common mistake is for the top arch to be flipped upside down, resulting in the LIDAR facing the back. Check the direction of the LIDAR by looking at the images below and noting the embossed arrow on top.
There is a black (or orange) 3D-printed plastic standoff post, which also protects the cable.
The holes in the lidar standoff can be used to store some of the thumbscrews when not in use.
There are two 5 pin JST-GH cables that provide battery power to the MR-CANHUBK344 and the NavQPlus. They are 5 pins, with no wire in the center position. Two wires are battery voltage+ and two wires are battery voltage-. Be sure to limit the voltage applied at the battery to <20V or the specified ratings for any boards plugged in.
The nominal battery voltage is expected to be ~12V.
There is one long and one short power cable included in the kit.
Connect the LONG power cable from the MR-CANHUBK344 to one of the Power Distribution Board Pout connectors.
The shorter power cable can be similarly connected to the NavQPlus.
When done, ZIP TIE the cables to the edge of the top plate and make them neat.
The NavQPlus and MR-CANHUBK344 communicate with each other using 100BASE-T1 two-wire Ethernet.
Connect the ethernet cable between the NavQPlus and the MR-CANHUBK344.
Zip Tie the cable at both ends and tuck it under the top plate.
Note the specific side notches to use.
Use a piece of tape to hold it neatly out of the way under the plate
Locate the DUAL LOCK type mounting tape; it eliminates the need for the screw-fastening procedure below. It will also elevate the NavQPlus board slightly so that the Ethernet and USB connectors have better clearance. We recommend you use DUAL LOCK tape instead of screwing the NavQPlus to the metal frame.
Alternative mounting using screws
An alternative method of mounting is to use the provided M3 screws to securely attach the board's corners to the following position in the metal frame:
Screw them in from the bottom side. It should look like this:
Use a ziptie to secure the power cable (4 wire cable with 5 pin JST-GH), and route the CAN cables down and underneath as shown below.
Next: After successfully attaching the NavQPlus, the next step is to install the MR-CANHUBK344 evaluation board.
It is a four-wire cable with two red and two black wires, representing VCC and ground respectively. First, connect one end of the cable to any of the Pout terminals on the PDB.
Then connect the other end to the following connector on the NavQPlus.
The cable shown below is NOT USED with the B3RB and PDB. It may already be attached when you first open the NavQPlus. Keep it safe, as it may be useful for other configurations or when bench-testing the NavQPlus with a 3S LiPo battery or bench power supply.
This is also explained here: PDB Wiring of RGB LED
<TODO> refer to the mounting of the LIDAR base
The cable needed to connect the NavQPlus to the LIDAR-STL-27 is number one in WITB Cables and Screws.
A closer view of this cable:
The Lidar used is the following:
You should connect the smaller end of the cable to the Lidar and the larger end to the UART3 serial port of the NavQPlus:
These are all the connections available for the NavQPlus:
<TODO >
note on CAN
note on console
note on Ethernet
The Quadrature Decoder (QDEC) cable connects the PDB to the MR-CANHUBK344. Quadrature signals come from the encoder on the motor itself; the PDB is used to interface between the two different cable types. The QDEC signal provides measured information on how many revolutions the motor has made, and therefore how far the robot has travelled. This can then be used as an input to the control system. See here for more information about quadrature encoding.
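As a sketch of how encoder counts become odometry, the function below converts a count delta into distance travelled. The encoder resolution, gear ratio, and wheel diameter used are hypothetical placeholders, not the B3RB's actual values:

```python
import math

def odometry_distance(count_delta, counts_per_motor_rev, gear_ratio, wheel_diameter_m):
    """Distance travelled (m) from a quadrature encoder count delta.

    counts_per_motor_rev: decoded (4x) counts per motor-shaft revolution.
    gear_ratio: motor revolutions per wheel revolution.
    """
    wheel_revs = count_delta / (counts_per_motor_rev * gear_ratio)
    return wheel_revs * math.pi * wheel_diameter_m

# Hypothetical numbers, NOT the B3RB's actual encoder, gearing, or wheels:
d = odometry_distance(count_delta=4800, counts_per_motor_rev=48,
                      gear_ratio=10.0, wheel_diameter_m=0.065)
print(round(d, 3))  # prints 2.042
```

Dividing the count delta by the elapsed time instead gives wheel speed, which is the other quantity the control system derives from this signal.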
Note: Pre-production MR-B3RB used a special QDEC cable with orange band markings. If you happen to have one, or are upgrading a pre-production kit, do not use this cable, as the signals are reversed.
This cable is crucial for calculating the odometry, speed and direction of the robot car. There are six black wires and one red wire in it. The cable should look like this:
One end of the cable must be attached to the QDEC out pin on the PDB:
And the other end must be attached to the following pin on the MR-CANHUBK344:
The MR-B3RB is versatile and can support several network wiring configurations. There are also several optional communication bus connections. This section will serve to show the BASELINE connections. We will also attach cables that may not be used, but should stay with the B3RB for future expansion or debugging.
Provide power to the NavQPlus and MR-CANHUBK344
Connect the NavQPlus and CANHUBK344 with T1 Two-wire Ethernet
Attach the LIDAR to the NavQPlus
Attach the SERVO and Motor Control PWM to the MR-CANHUBK344
Quadrature encoding from the motor/PDB to the MR-CANHUBK344
Connect the RGB LED lighting
Have debuggers and consoles attached
Optionally plug in unused RC-PWM input
Optionally connect CAN-FD cables, which could be used for peripherals and attachments
Optionally connect a serial console from the MR-CANHUBK344 to the NavQPlus <<TODO - add link to section that removes the jumper on the PDB. Add PDB wiring section>>>
The final setup with all the established connections should look like this:
A closer view on MR-CANHUBK344 after wiring is completed:
And in the NavQPlus:
Please follow the steps 4a and 4b for the detailed explanation on these connections.
Here we distinguish between the B3RB's lower and upper chassis:
This is the assembled buggy minus the top cover. The "bridge" with the lidar module may be referred to as the top plate, the top arch, Lidar arch or the lidar plate:
You will notice that throughout this guide, we use these names. At least now you know what we mean by it.
This connection between the Steering servo and the PDB is pre-wired on the lower chassis but is shown here for reference.
The image illustrates how a servo motor is connected to the PDB (Power Distribution Board). It shows three distinct wires: the red wire is the +5V positive voltage supply, the brown wire is the ground connection, and the yellow wire carries the control RC-PWM signal from the microcontroller (MR-CANHUBK344).
As shown in the images, the correct orientation of the servo wire is with the yellow wire toward the short edge of the PDB. The servo will NOT be damaged if this connection is reversed; it will simply fail to move.
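For reference, RC-PWM encodes the commanded position as a pulse width of roughly 1000 to 2000 microseconds repeated every 20 ms. The sketch below maps a steering angle onto that range; the angle limits are illustrative assumptions, not the B3RB servo's real travel:

```python
def steering_pulse_us(angle_deg, max_angle_deg=30.0,
                      min_us=1000.0, center_us=1500.0, max_us=2000.0):
    """Map a steering angle to a standard RC-PWM pulse width (microseconds).

    The servo expects this pulse repeated every 20 ms (50 Hz). The angle
    limits here are illustrative, not the B3RB servo's real travel.
    """
    angle = max(-max_angle_deg, min(max_angle_deg, angle_deg))  # clamp request
    half_span = (max_us - min_us) / 2.0
    return center_us + (angle / max_angle_deg) * half_span

print(steering_pulse_us(0.0))    # 1500.0 (center)
print(steering_pulse_us(30.0))   # 2000.0 (full deflection, by this convention)
print(steering_pulse_us(-45.0))  # 1000.0 (request clamped to -30 degrees)
```

On the B3RB this pulse is generated by the MR-CANHUBK344's PWM hardware rather than by Python; the sketch only shows the mapping.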
The GPS module is primarily used to provide the magnetometer, sound, and SAFETY button for the system. Not all systems will use the GPS localization functionality; the software, however, is capable of using GPS position data for navigation outdoors.
An ARMING board is also included in the kit and can be used with alternative software. It can provide the safety button and sound functionality, but not the magnetometer needed for robotics navigation using Cognipilot.
The M10 GPS cable attaches to the MR-CANHUBK344 as shown below
(optional) Tie wrap wires
After connecting all the cables, proceed to attach the metal board frame to the robot car using the screws you removed at the beginning of this tutorial:
Please attach the plastic shield with the camera housing to the robot's front using M2.5 pan head Phillips self-tapping screws, which are located in the box provided:
For reference, a picture of the screws is displayed here: <<<TODO - these are not the correct screws. They should be flat head (tapered head) screws. Take picture and add here>>>
The mounting must look like these pictures:
Then attach the back plastic shield of the robot with the same screws:
The antenna depicted in the image must be handled with utmost caution and preserved within the protective plastic enclosure.
CAN networks typically require proper termination at both ends of the bus in order to function correctly. Many of the boards used include this termination optionally on board. Additionally, many of the NXP boards use CAN SIC (Signal Improvement Capability) PHYs, which are more resilient to mismatched or incorrect termination. Also included in the kit is a CAN termination board. Plugging the CAN termination board into the second connector on the MR-CANHUBK344 or NavQPlus adds a termination network to that end of the circuit. This board also includes a USB input and solder pads for instances where power is to be supplied into the CAN bus through this device. In the MR-B3RB this power injection is NOT needed, as 5V power is already supplied.
Typical battery and battery connection
Side note: The 3D model for the has been prepared and may be 3D printed if a replacement is needed.
There are two locations where on-board batteries could be located after being connected:
1) On the "mezzanine" level of the frame, and
2) On the lower part of the lower chassis.
Since the mezzanine level may be used for additional electronics in the future, we will refer only to the removable battery location on the lower chassis.
The exact method of securing the battery is dependent on the user preferences.
There is a battery strap included in the kit. This could also be used to strap to the upper/mezzanine level of the metal frame.
You may wish to remove the small plastic battery tray frame that is already installed in the kit in order to free up additional space.
A foam piece can be used to wedge between the underside of the battery and the lower chassis in order to prevent any movement. This wedges the battery "up" against the underside of the mezzanine level.
3D printed pieces may be designed also to hold the battery in a preferred way.
Latest versions of MR-B3RB-M include a power switch mounted under the front edge of the B3RB. Toggle it to the on or off position as desired.
The MR-CANHUBK344 sends PWM signals to the Motor controller and also to the Servo. These are standard RC Model type PWM connections. In order to facilitate keeping the signals in the correct order they are routed and consolidated onto a single cable and small "XDHP" connector board. It is still quite important to get the XDHP connector positioned correctly on the PWM header of the MR-CANHUBK344
Technical note: The steering servo gets +5V power from the PDB
The PWM/XDHP wire connection should look as depicted in the following picture:
The cable carrying PWM signals between the MR-CANHUBK344 board, and the Power Distribution Board should look like as shown in picture below. It is a seven wire interface which carries multiple PWM signals and has a VCC line and a ground.
There is a small "XDHP" adapter board to fit on the PWM pin header of the MR-CANHUBK344 board. This is provided in the kit and shown in the image above.
Please be extra careful that the XDHP adapter board is connected and aligned correctly. Damage to the board could result from incorrect connection.
One end of the PWM cable is connected to the XDHP pins on the PDB:
The other end of the cable, with the XDHP adapter on it, is mounted on the first three columns (three pairs of three pins) of PWM pins, which represent the PWM0, PWM1, and PWM2 positions of the MR-CANHUBK344 respectively:
Installation of the XDHP adapter board and cable is most easily done by connecting the cable and connector together first. Then gently bend the wires at a 90 degree angle and push the XDHP adapter onto the MR-CANHUBK344 pin headers.
The XDHP should be aligned toward the PWM header CLOSEST to the side of the board with the CAN connectors.
The PDB serves two main functions:
1) It takes in battery voltage and provides FUSED RAW battery power and regulated 5V power out.
2) It acts as a connector interface converter board.
An INA226 power measurement IC is used to give an indication of battery state via current and voltage measurement.
The raw battery connectors go to other boards such as the NavQPlus and MR-CANHUBK344.
The 5V supply powers the servo and other local interfaces.
Other connectors are provided to adapt between the B3RB motor's Hall sensor, quadrature encoder and drive signals.
LED lighting is also adapted via a STEMMA QT-type connector. NOTE: this does NOT follow the STEMMA QT signalling specification, but it is convenient to use the readily available off-the-shelf cabling.
Most of the software ecosystem examples will require a native Linux PC. It may be possible to use a Windows PC running WSL or a virtual Linux machine (VMware, VirtualBox), but this is not supported in this documentation. Robotics generally uses ROS 2 and a corresponding LTS version of Ubuntu. Other configurations may work, but are not supported in this documentation.
MR-B3RB is intended as a research/development platform. That means we expect end users to compile and install their own software based on open-source references. Example software configurations are described in the following sections.
This will be a checklist review of all the connections for the base B3RB
You must use a LiPo battery charger to safely charge the battery. There are very inexpensive ones available that will charge using the cell termination connectors at the expense of them being quite slow to charge. Many other types of chargers exist. When choosing a charger consider that this is a safety issue and overcharging or improperly charging can be dangerous and result in a fire.
There are many excellent LiPo battery chargers available. ISDT makes several high-end chargers like the following: (This is ONLY an example, and not a specific recommendation)
MR-B3RB units have a power monitor built into the PDB board. This will tell you the total pack voltage and individual cell voltage. It will alarm when the cell voltage goes too low. There is an optional RDDRONE-BMS772 BMS that can interface with the B3RB. It provides advanced battery monitoring capabilities.
LiPo batteries should never be over-discharged. The PDB battery monitor will measure the battery voltage level, and properly running software and control software will alert the user if the battery is low. As an extra backup, a small battery alarm board is included that can be attached directly to the battery CT leads. This is recommended and can save you from over-discharging during development or when you may get distracted (i.e. at a tradeshow or seminar). The alarm is connected as shown in the attached photo. While its usage is mostly self-explanatory, you can learn more via YouTube tutorials such as this one: https://youtu.be/A0EjDtGSepw?si=PWPuZGCJoSq1ri9W
An external battery alarm is included in the kit.
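The same low-voltage logic the alarm implements in hardware can be sketched in software. The per-cell thresholds below (3.5 V warning, 3.3 V stop) are commonly cited LiPo guidelines, assumptions for illustration rather than B3RB firmware values.

```python
WARN_PER_CELL_V = 3.5   # assumed warning threshold per cell
STOP_PER_CELL_V = 3.3   # assumed stop/shutdown threshold per cell

def check_cells(cell_voltages):
    """Return 'ok', 'warning' or 'stop' based on the weakest cell."""
    weakest = min(cell_voltages)
    if weakest <= STOP_PER_CELL_V:
        return "stop"
    if weakest <= WARN_PER_CELL_V:
        return "warning"
    return "ok"
```

Checking the weakest cell, rather than the pack total, is what the balance-lead alarm does as well: one weak cell can be over-discharged even when the pack voltage still looks healthy.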
Bonus feature! The top cover doubles as a work stand. This will keep the wheels up off the ground and can be quite useful while testing. You should add soft tape or foam on the underside of the buggy chassis to avoid scratches on the top cover.
This is one of the last steps. You may still want to check everything over first to avoid having to insert and remove screws repeatedly.
In previous steps you should have mounted the Lidar and standoff to the Lidar Arch
There are several options for the screws to hold on the arch:
The Lidar arch attaches with screws into the plastic front and rear covers. Ideally M3x5 socket-head cap screws are used, but the plastic is not threaded, making it difficult to insert the screws. With care and patience you can cut your own threads by pushing consistently while turning the screw, then backing off and cleaning the debris off the screw threads. Repeat this several times until the screw goes in all the way.
Alternatively, if you have access to an M3-0.5 threading tap, it will form the threads very quickly and easily.
These are small self tapping black screws from the Buggy spare parts kit. While somewhat small, they will work to hold the arch in place.
When completed your Lidar arch should look something like this:
The PDB (Power Distribution Board) is where power enters from the battery, and it also provides a number of connector "translations" to match the motor and LED lighting. It adapts the specialized connectors on the motor and encoder to Dupont-style or standard Dronecode connector pinouts. Note that the block diagram above does not show ALL the power distribution connections, which are as follows:
Battery input through a fuse
Battery voltage and current measurement through the INA226
3x Battery voltage outputs on 5 pin JST-GH for NavQPlus, MR-CANHUBK344, Extra
A regulated 5V supply for the LED lights and for powering the PWM servo rail on the MR-CANHUBK344, and therefore the servo itself.
PWM signals are consolidated into a single XDHP connector for ease of connection.
The GPS combines UART, I2C for Compass, and GPIO for switches, LED, and Beeper into a single large DroneCode standard connector.
The DCD-LZ interface simply combines SWD and a UART on a single JST-GH Dronecode connector. Depending on your kit you may have any of the following debug adapter boards:
FTDI USB-UART cable and adapter for UART/Console on the NavQPlus
FTDI USB-UART cable + DCD-LZ adapter for MR-CANHUBK344 DCD-LZ interface
In some systems you will find a NeuralProbe board with a USB-C interface and connectors for the UART/console AND the DCD-LZ interface. This board includes a USB-UART converter and also a button that toggles the RST pin on the SWD/JTAG interface. There is a third interface on this board for the Pixhawk V6X debug+UART connectors (not used here; used with the NXP MR-VMU-RT1176).
Typically we use off-the-shelf LiPo battery pack similar to what is used in RC airplanes and Quadcopters.
The battery should have the following specifications:
LiPo 3S (three cell), 11.1V nominal voltage (9.0-12.6V range)
2000-3000mAh
XT60 connector with balance leads
Dimensions
A) 75mm L x 34mm W x 26 H (short and stubby)
B) 100-110mm L x 35mm W x 22-26mm H (longer format)
Other dimensions are possible. Ensure the battery chosen fits the compartment area.
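When choosing a capacity in the 2000-3000 mAh range, a back-of-envelope runtime estimate can help. The 80% usable fraction (to avoid over-discharging) and the average current draw below are illustrative assumptions, not measured B3RB figures.

```python
def runtime_hours(capacity_mah: float, avg_current_ma: float,
                  usable_fraction: float = 0.8) -> float:
    """Rough runtime estimate: usable capacity divided by average draw."""
    return capacity_mah * usable_fraction / avg_current_ma

# e.g. a 2200 mAh pack at an assumed 2 A average draw:
# runtime_hours(2200, 2000) -> about 0.88 hours
```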
A battery is NOT included with the kit. This can be purchased online or from a local hobby shop
Below are links to some example batteries. THIS IS IN NO WAY an endorsement of these specific batteries, only a link so you can see more detail about the types of similar batteries that could be used:
You will find that "stubby/shorty" batteries such as those listed below will give you more working space. However, longer, thinner batteries have also worked well.
Zeee 3S 2200mAh Lipo Battery 11.1V 50C Shorty Pack Battery with XT60 Plug for RC Car Truck RC Vehicles Boat Drone RC Airplane Quadcopter Helicopter FPV Racing Hobby Models (2 Pack)
HOOVO 3S Lipo Battery 2200mAh 80C 11.1V Shorty Lipo Battery Pack Softcase with XT60 Connector Compatible with Drone RC Airplane Helicopter Quadcopter FPV RC Car (2 Pack)
MR-B3RB is a kit of kits
MR-B3RB is short for Mobile Robotics - Buggy3 Revision B. This is the second revision to a robotic vehicle called MR-Buggy3 introduced in 2023. Some people didn't like calling a product "buggy". So even though Buggy3 is a brave name, we decided to instead make it MR-B3RB as it is a nice and friendly robot like R2D2.
We took lessons learned from that platform and applied them as an upgrade for 2024.
The upgrades of the MR-B3RB are:
Uses the same rugged metal base RC car, with readily available replacement and upgrade parts
No need to change the steering servo
Upgraded BLDC motor with gear reduction, integrated ESC (motor controller) and quadrature encoder output that can be used for vehicle odometry
Metal upper chassis that can be removed as a module
Power distribution board with additional connector matching and power supply for boards and servos.
Removable mounting plate for companion computer, real-time processors, communications radios and any other add-on components
RGB addressable lighting
Optional upgrade to plastic sides, front, rear and lidar arch mount
Optional GPS / remote Magnetometer/ arming button
Currently this is the PRE-production version; there will still be some updates coming to improve the modularity of the top plate and provide additional mounting for user-added 3D-printed components.
CogniPilot is a (newer) open-source vehicle control software framework that sits cleanly on top of the Zephyr RTOS. It hosts an example Model Predictive Control real-time vehicle controller which works with MR-B3RB kits. Other software or control algorithms may also be used, but that is out of scope for this section of the documentation. CogniPilot is the default reference example control software.
Note that EVEN when not using the full capability of CogniPilot, the framework still includes many software modules, high level drivers, communications methods and related PC tools and setup that makes other less demanding uses (such as NXP-CUP) much easier. The framework is there and can be used as much or as little as needed.
Zephyr RTOS has build targets for MR-CANHUBK344 and MR-VMU-RT1176 which can be used as part of the real-time controller and IMU for MR-B3RB. In various versions of MR-B3RB kits these boards may already be included.
CogniPilot's claim to fame is that the control-system software component is synthesized as a module from popular control software called CasADi. CasADi is popular with control-systems engineers and researchers because it allows decoupling the control theory, simulation and mathematical proofs from the actual code that implements the particular controller math. This approach helps ensure safety and robustness while also supporting rapid innovation.
CasADi is an open-source tool for nonlinear optimization and algorithmic differentiation.
It facilitates rapid — yet efficient — implementation of different methods for numerical optimal control, both in an offline context and for nonlinear model predictive control (NMPC).
CogniPilot is a group of software modules making up the vehicle control system. More detail about what each module does is available on the project website.
Cerebri is the CogniPilot control-system software module; it is named after the cerebellum of the human brain. The code synthesized from CasADi, that is, the code representing the control-theory module for the intended vehicle, plugs into CogniPilot's Cerebri module. The control system for the vehicle type is a swappable module: it can define control functions or approaches for a ground vehicle vs. a multicopter vs. a winged vehicle. You then separately provide a hardware description file for the specific vehicle characteristics.
Any one of many control theories could be used to control an Ackermann-steering car. The specific control theory is built in CasADi; code is then synthesized and put into Cerebri for execution. In this case the control theory is for a generic Ackermann vehicle type. The hardware description file defines the power of the motors, the size of the vehicle and wheels, and the placement of those wheels. You could theoretically have an Ackermann vehicle where the steering is misaligned with the drive-wheel thrust. The same control software works for a passenger car or a toy car, since the vehicle size and power are defined independently. A different synthesized control theory would be loaded to control a multicopter, whose hardware definition would cover the arm lengths, the number and orientation of the motors, and the thrust from the motors.
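A hypothetical sketch of that separation (illustrative data structures, not CogniPilot's actual API): the vehicle-type math stays generic, here the standard bicycle-model turning radius R = L / tan(steer), while a hardware description supplies the numbers for one specific car.

```python
import math
from dataclasses import dataclass

@dataclass
class AckermannDescription:
    """Hardware description for a specific Ackermann vehicle."""
    wheel_base_m: float      # distance between front and rear axles
    wheel_radius_m: float
    max_steer_rad: float     # steering servo limit

def turn_radius_m(desc: AckermannDescription, steer_rad: float) -> float:
    """Generic Ackermann (bicycle-model) math: R = L / tan(steer)."""
    steer = max(-desc.max_steer_rad, min(desc.max_steer_rad, steer_rad))
    if abs(steer) < 1e-9:
        return math.inf          # driving straight
    return desc.wheel_base_m / math.tan(steer)

# The same turn_radius_m() works for a toy car or a passenger car;
# only the AckermannDescription values change.
```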
There is much more detail and information about the design and structure of CogniPilot here:
The developer guide for CogniPilot can be found here:
CogniPilot software modules operate across multiple processors to form a complete system:
CogniPilot Electrode - Runs on PC host groundstation with Linux/ROS2
CogniPilot Corti and Nav2 - Runs on NavQPlus Embedded companion computer running Linux/ROS2
CogniPilot Synapse - Bridge over Ethernet, running on CANHUBK344 and NavQPlus
CogniPilot Cerebri - CANHUBK344 real time control MCU running Zephyr with ZROS nodes and topics
These cables connect to any of the 4 pin Pout connectors on the PDB. All Pout connectors are the same, but some may be closer and provide a better location. Choose one that makes your wiring tidy without overstretching the cable.
And the other end connects to the power input (AKA VBat) of the MR-CANHUBK344
Setting up B3RB with CogniPilot environment software
The definitive guide is the CogniPilot website; these sub-pages are only for additional details and guidance.
Follow this section to prepare CogniPilot-Cranium on NavQPlus:
NEW
This will:
Flash the eMMC on the NavQPlus with the current Linux image, using the uuu tool over the USB interface
Establish a console connection using one of the three possible methods (USB-UART, SSH, USB-C gadget Ethernet)
Connect NavQPlus to a Wi-Fi network and establish an alternative console via SSH over Wi-Fi connection
Install CogniPilot Cranium on the NavQPlus (using the script provided)
The developer guide for CogniPilot can be found here:
NEW
Setting up B3RB with CogniPilot environment software
The definitive guide is the CogniPilot website; these sub-pages are only for additional details and guidance.
Please start with an Ubuntu Linux install 22.04 or newer.
Follow this section of CogniPilot to prepare your Linux development PC with ROS, Zephyr and CogniPilot development tools https://airy.cognipilot.org/getting_started/install/
NEW https://brave.cognipilot.org/getting_started/install/ Ubuntu Linux 24.04
This will:
Set up SSH keys and GPG keys for Git access
Install Git
Install ROS 2
Install Zephyr build tools and the B3RB CogniPilot packages on your host Linux laptop
Prepare a SIL (software-in-the-loop) example
Optionally set up local serving of the CogniPilot documentation
At this point you have configured a Linux PC for development. Follow the remaining subpage steps for guidance on getting the B3RB boards (MR-CANHUBK344 and NavQPlus) updated. Note that the next section to refer to in the CogniPilot docs will be https://airy.cognipilot.org/reference_systems/b3rb/setup/
NEW https://brave.cognipilot.org/reference_systems/b3rb/setup/
(** Add note about how to set network interfaces correctly for DDS CycloneDDS config. when more than one interface exists. This is a ROS configuration and not really Cognipilot.)
Launch Notes for B3RB:
# serial terminal to NavQPlus
screen /dev/ttyUSB0 115200
# Setup wifi hotspot on phone.
# same network as host PC
# show local wifi networks, find the one you want
sudo nmcli device wifi
# for a text interface
sudo nmtui
# Connect NavQPlus Wifi to network
sudo nmcli device wifi connect <network name> password <password>
When Foxglove starts, choose "Open Connection" and connect to
ws://b3rb-xx.local:4242
(replace b3rb-xx with whatever your buggy's hostname is)
# SSH login to NavQPlus
ssh user@mrb3rb
#or
ssh user@mrb3rb.local
# FIRST launch ROS 2 electrode on host PC
ros2 launch electrode electrode.launch.py rviz2:=false
# Then launch Ros2 on B3RB
ros2 launch b3rb_bringup robot.launch.py
If there are multiple vehicles being used at the same time on the same network, then you will also need to set a unique ROS_DOMAIN_ID on both the host Linux machine and the NavQPlus.
Edit your ~/.bashrc file
using vim, nano or gedit
and change the line (in nano, find it with CTRL+W):
export ROS_DOMAIN_ID=xxx
where xxx is a number unique among the robots
Sometimes you just want to start over. It's OK, we all have days like that. If you need a fresh CogniPilot start and want to be extra sure there are no leftover dependencies or unwanted files, then follow these steps. Keep in mind this is going to erase local changes you may have made to CogniPilot files; note them in advance if you want to re-apply the changes afterwards (for example, you may have disabled the battery voltage check in <todo> xxx.conf).
sudo rm -rf /opt/toolchains
sudo rm -rf /opt/zeth
sudo rm -rf /opt/poetry
rm -rf ~/bin/build_*
rm -rf ~/bin/west
rm -rf ~/bin/cyecca
rm -rf ~/bin/docs
rm -rf ~/cognipilot
Then start again with the cognipilot install instructions for the Linux host development PC.
or this link https://airy.cognipilot.org/getting_started/install/#use-cognipilot-universal-installer
The instructions above will also take care of the Gazebo install. If you happen to know you are stuck on an old version of Gazebo, you may want to just try this first. The example below is for when Gazebo "Garden" (gzgarden) is installed but you want Gazebo "Harmonic":
sudo apt-get remove ros-humble-ros-gzgarden
sudo apt-get install ros-humble-ros-gzharmonic
sudo apt-get install gz-harmonic
Connect the buggy to your laptop with the J-Link cable. Connect your laptop and the buggy to the same Wi-Fi network. You can use the network used previously so the buggy connects to it automatically. But if you want to use another one, you need to connect a serial cable to the NavQPlus board, open a shell session in MobaXterm, and run the following command: sudo nmcli device wifi connect . It is good to check your new IP: ifconfig mlan0
Start a new session by filling in the Wi-Fi IP address: 192.168.1.105
Once the Wi-Fi connection is set up, open an SSH session in MobaXterm and enter your IP/hostname and the username (you can type b3rb-anne.local instead of the IP):
Shell login:
username: user
password: user
In a shell prompt, run the following ros2 command:
ros2 launch b3rb_bringup robot.launch.py
If it does not run well, check history | grep ros
Once the Wi-Fi connection is set up and you can see topics displayed in Foxglove, you are ready to do a test drive.
First, press the button on the GPS module; you should see that SAFETY is OFF in Foxglove
Connect the GPS to the CANHUBK3 and press the switch button
Then, choose a mode: manual for manual drive, and auto or cmd_vel for autonomous mode
Then, press arm button in foxglove to arm the buggy
If you choose the manual mode, use the joystick in foxglove to drive the buggy
If you choose autonomous mode, use the arrow button in foxglove and choose a destination in the map
CANHUB-ADAP is used with MR-B3RB to provide:
An IMU
An SD card slot
Connections for STEMMA QT / Qwiic I2C devices
GPS connectors
More
The schematics below can be used to review the interfaces
Here you will find other hardware modules (not included in the box) that are great for expanding your MR-B3RB.
User-contributed 3D-printed parts may be linked here
Setting up B3RB with CogniPilot environment software
The full robotics reference design experience on MR-B3RB relies on the Linux NavQPlus companion computer working in concert with the real-time controller MR-CANHUBK344. CogniPilot is the preferred open-source example framework; it works hand in hand with ROS 2 to provide a fully robotic vehicle, and it sets up and configures the majority of the associated companion computer and development PC tools. The CogniPilot-based "system" is therefore more than just the Cerebri real-time vehicle control module.
Other frameworks including bare metal code can be used on B3RB when used in NXP-CUP or AIM. Those competitions may also just use a subset of CogniPilot or even alternative software.
Note however that the combination of Zephyr + CogniPilot is valuable even when you don't intend to take advantage of the full implementation environment and setup. You can choose not to run CogniPilot Cerebri or other CogniPilot modules.
The development process can be repurposed. Because CogniPilot sits cleanly as an application on top of the Zephyr RTOS, various aspects of its implementation, such as sensor drivers, can be leveraged.
This is especially true for systems like NXP-CUP or AIM India, which are less complex robotics implementations.
Below is a high-level overview of the installation. The details are well covered in the CogniPilot webpages themselves.
Follow this section of Cognipilot to prepare your Linux development PC with ROS, Zephyr and Cognipilot development tools : https://airy.cognipilot.org/getting_started/install/
NEW https://brave.cognipilot.org/getting_started/install/
Follow this section of CogniPilot to prepare CogniPilot-Cranium on NavQPlus: https://airy.cognipilot.org/cranium/compute/navqplus/setup/
NEW https://brave.cognipilot.org/cranium/compute/navqplus/setup/
Follow this section of CogniPilot to prepare Cognipilot-Cerebri software on MR-CANHUBK344 : https://airy.cognipilot.org/reference_systems/b3rb/setup/
NEW https://brave.cognipilot.org/reference_systems/b3rb/setup/
NEW https://brave.cognipilot.org/reference_systems/b3rb/about/
https://airy.cognipilot.org/reference_systems/b3rb/about/
The definitive guide is the CogniPilot website, but you can also follow these sub-pages for additional guidance and details.
After installation, you are ready to use the MR-B3RB.
You can log into the NavQPlus and even into the MR-CANHUBK344 running CogniPilot/Zephyr
You can run ROS SIL or "real hardware" examples using RVIZ or Foxglove as a control station.
The vehicle is capable of being safely armed through several steps and autonomously navigating to a position and pose as specified/pointed to on the control station software.
The developer guide for Cognipilot can be found here: https://airy.cognipilot.org/
This guide provides instructions for connecting the J-Link EDU Mini to the MR-CANHUBK344 with the objective of flashing the Cognipilot software onto the board.
First take the J-Link EDU Mini, you can find more information here:
Use the cable with a micro-USB connector on one end and a USB Type-A connector on the other end.
Then take the 10-Pin 2x5 Socket-Socket IDC (SWD) Ribbon Cable and connect it to the J-Link EDU mini:
Please, connect the 10-Pin 2x5 Socket-Socket IDC (SWD) Ribbon cable in the following position to the J-Link EDU Mini:
Next, please pick up the DCD-LZ adapter board:
Next, connect one end of the 10-pin 2x5 socket-socket IDC (SWD) ribbon cable to the connector on the DCD-LZ adapter board. Ensure that the orientation of the connector corresponds to the one shown in the following image:
Then pick up the 7-position JST-GH cable:
Finally, connect one end to the DCD-LZ adapter board and the other end of the 7-position JST-GH cable to the MR-CANHUBK344. Connect it to the connector shown in the following image:
Now, connect the USB cable to your development computer, and remember to connect the LiPo battery to the PDB, as the MR-CANHUBK344 needs power in order to be flashed. Then you can return here to follow the instructions: Prepare MR-CANHUBK344 real-time vehicle controller
This guide provides instructions for connecting the MCU-Link-MR to the MR-CANHUBK344 with the objective of flashing the CogniPilot software, or any other Zephyr-based software, onto the board.
Either the J-LINK or MCU-Link may be used to program the MR-CANHUBK344
DRAFT-DRAFT-DRAFT Connect the MCU-Link-MR to the 7-position JST-GH programming connector on the MR-CANHUBK344. Please connect it as shown in the following image: <todo> update image to show MCU-Link-MR <todo> alternatively connect the 10-pin JTAG/SWD ribbon cable connector <todo> update photos below
Connect the USB cable to your development computer and also remember to connect power to the Buggy (the LiPo battery to the PDB) in order to flash the MR-CANHUBK344.
Use this command when programming using the MCU-Link or MCU-Link-MR from Zephyr west tool
The MCU-LINK-MR is an updated version of the MCU-Link which includes debug interfaces for robotics, including DCD, DCD-LZ, Pixhawk debug large and small, and a USB-C to UART console. The pyOCD method must be used when using an MCU-Link or MCU-Link-MR.
west flash --runner pyocd
More detailed official documentation, including how the default runner can be set, is available here: https://docs.zephyrproject.org/latest/develop/west/build-flash-debug.html#choosing-a-runner Then you can return here to follow the instructions: Prepare MR-CANHUBK344 real-time vehicle controller
Setting up B3RB with CogniPilot environment software
The definitive guide is the CogniPilot website; these sub-pages are only for additional details and guidance.
Follow this section of CogniPilot to prepare Cognipilot-Cerebri on MR-CANHUBK344: https://airy.cognipilot.org/reference_systems/b3rb/setup/
https://airy.cognipilot.org/reference_systems/b3rb/about/
NEW: https://brave.cognipilot.org/reference_systems/b3rb/setup/
NEW: https://brave.cognipilot.org/reference_systems/b3rb/about/
This will:
Update MR-CANHUBK344 with the Zephyr+Cognipilot-Cerebri image. (Cerebri is the application, Zephyr is the RTOS. They are flashed simultaneously using a prepared image.)
You need to follow this guide to correctly flash Cerebri onto the board: Connect J-Link EDU Mini to MR-CANHUBK344 with DCD-LZ adapt board
The following steps must be performed after you have prepared your Linux Development PC.
cd ~/cognipilot/ws/cerebri
west update
cd ~/cognipilot/ws/cerebri/app/b3rb
west build -b mr_canhubk3 -p
west flash
At this point you are ready to use the MR-B3RB.
You can log into the NavQPlus and even into the MR-CANHUBK344 running CogniPilot/Zephyr
You can run ROS SIL or "real hardware" examples using RVIZ or Foxglove as a control station.
The vehicle is capable of being safely armed through several steps and autonomously navigating to a position and pose as specified/pointed to on the control station software.
When booted, MR-CANHUBK344 should display the following CogniPilot logo and welcome screen on the console.
uart:~$
▄▄▄▄▄▄▄▄
▄▄▄▄▄ ▄▄▄▄▄ ▀▀▀▀▀▀▀▀▀
▄███████▀▄██████▄ ▀█████████████████████▀
▄██████████ ████████ ▄ ▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄
███████████▀ ███████▀ ██ ▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
█████████▀ ▀▀▀▀▀▀▀▀ ████ ▀███████████▀
▀█████▀ ▄▄███████████▄ ████ ▄▄▄▄▄▄▄▄▄
▀▀▀ ███████████████▀ ████ ▀▀▀▀▀▀▀▀
▀▀█████▀▀▀▀▀▀ ▀▀▀▀ ▄█████▀
████████▀ ▄▄▄
▀███▀ ▀▀▀
▀▀
╔═══╗╔═══╗╔═══╗╔═╗ ╔╗╔══╗╔═══╗╔══╗╔╗ ╔═══╗╔════╗
║╔═╗║║╔═╗║║╔═╗║║║║ ║║╚╣╠╝║╔═╗║╚╣╠╝║║ ║╔═╗║║╔╗╔╗║
║║ ╚╝║║ ║║║║ ╚╝║║╚╗║║ ║║ ║║ ║║ ║║ ║║ ║║ ║║╚╝║║╚╝
║║ ║║ ║║║║╔═╗║╔╗╚╝║ ║║ ║╚═╝║ ║║ ║║ ║║ ║║ ║║
║║ ╔╗║║ ║║║║╚╗║║║╚╗║║ ║║ ║╔══╝ ║║ ║║ ╔╗║║ ║║ ║║
║╚═╝║║╚═╝║║╚═╝║║║ ║║║╔╣╠╗║║ ╔╣╠╗║╚═╝║║╚═╝║ ╔╝╚╗
╚═══╝╚═══╝╚═══╝╚╝ ╚═╝╚══╝╚╝ ╚══╝╚═══╝╚═══╝ ╚══╝
┏━━━┓┏━━━┓┏━━━┓┏━━━┓┏━━┓ ┏━━━┓┏━━┓
┃┏━┓┃┃┏━━┛┃┏━┓┃┃┏━━┛┃┏┓┃ ┃┏━┓┃┗┫┣┛
┃┃ ┗┛┃┗━┓ ┃┗━┛┃┃┗━┓ ┃┗┛┗┓┃┗━┛┃ ┃┃
┃┃ ┏┓┃┏━┛ ┃┏┓┏┛┃┏━┛ ┃┏━┓┃┃┏┓┏┛ ┃┃
┃┗━┛┃┃┗━━┓┃┃┃┗┓┃┗━━┓┃┗━┛┃┃┃┃┗┓┏┫┣┓
┗━━━┛┗━━━┛┗┛┗━┛┗━━━┛┗━━━┛┗┛┗━┛┗━━┛
Within a few seconds, the red light on the T1-Ethernet port on both the boards should light up.
ros2 launch b3rb_bringup robot.launch.py
The developer guide for CogniPilot can be found here: https://cognipilot.org/releases/airy/getting_started/install
NEW https://brave.cognipilot.org/reference_systems/b3rb/setup/
This guide provides instructions for connecting the J-Link EDU Mini to the MR-CANHUBK344 with the objective of flashing the Cognipilot software onto the board
This is an alternative connection using a Cognipilot debugger adapter.
First take the J-Link EDU Mini, you can find more information here:
Use the cable with a micro-USB connector on one end and a USB Type-A connector on the other end.
Then take the 10-Pin 2x5 Socket-Socket IDC (SWD) Ribbon Cable and connect it to the J-Link EDU mini:
Please, connect the 10-Pin 2x5 Socket-Socket IDC (SWD) Ribbon cable in the following position to the J-Link EDU Mini:
Next, please pick up the Cognipilot debug adapter:
Next, connect one end of the 10-pin 2x5 Socket-Socket IDC (SWD) ribbon cable to the connector on the Cognipilot debugger. Ensure that the orientation of the connector corresponds to the one shown in the following image:
Finally, connect the other end of the 7-position JST-GH cable to the MR-CANHUBK344. Connect it to the connector shown in the following image:
Now, connect the USB cable to your development computer, and remember to connect power to the buggy (the LiPo battery to the PDB) in order to flash the MR-CANHUBK344, as the board needs to be powered. Then you can return here to follow the instructions: Prepare MR-CANHUBK344 real-time vehicle controller
Connect the MCU-Link-MR to the 7-position JST-GH programming connector on the MR-CANHUBK344. Please connect it as shown in the following image:
Now, connect the USB cable to your development computer, and remember to connect the LiPo battery to the PDB, as the board needs power to be flashed. Then you can return here to follow the instructions: Prepare MR-CANHUBK344 real-time vehicle controller
Setting up B3RB with CogniPilot environment software
The definitive guide is the CogniPilot website; these sub-pages are only for additional details and guidance.
At this point you are ready to use the MR-B3RB. If you have reviewed the documentation, you should now be set up as follows:
You can log into the NavQPlus console and even into the MR-CANHUBK344 running CogniPilot/Zephyr
You can run ROS SIL or "real hardware" examples using RVIZ or Foxglove as a control station.
The vehicle is capable of being safely armed through several steps and autonomously navigating to a position and pose as specified/pointed to on the control station software.
OPTIONAL: When booted, if you have a console connected, MR-CANHUBK344 will display the following CogniPilot logo and welcome screen on the console.
uart:~$
▄▄▄▄▄▄▄▄
▄▄▄▄▄ ▄▄▄▄▄ ▀▀▀▀▀▀▀▀▀
▄███████▀▄██████▄ ▀█████████████████████▀
▄██████████ ████████ ▄ ▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄
███████████▀ ███████▀ ██ ▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
█████████▀ ▀▀▀▀▀▀▀▀ ████ ▀███████████▀
▀█████▀ ▄▄███████████▄ ████ ▄▄▄▄▄▄▄▄▄
▀▀▀ ███████████████▀ ████ ▀▀▀▀▀▀▀▀
▀▀█████▀▀▀▀▀▀ ▀▀▀▀ ▄█████▀
████████▀ ▄▄▄
▀███▀ ▀▀▀
▀▀
╔═══╗╔═══╗╔═══╗╔═╗ ╔╗╔══╗╔═══╗╔══╗╔╗ ╔═══╗╔════╗
║╔═╗║║╔═╗║║╔═╗║║║║ ║║╚╣╠╝║╔═╗║╚╣╠╝║║ ║╔═╗║║╔╗╔╗║
║║ ╚╝║║ ║║║║ ╚╝║║╚╗║║ ║║ ║║ ║║ ║║ ║║ ║║ ║║╚╝║║╚╝
║║ ║║ ║║║║╔═╗║╔╗╚╝║ ║║ ║╚═╝║ ║║ ║║ ║║ ║║ ║║
║║ ╔╗║║ ║║║║╚╗║║║╚╗║║ ║║ ║╔══╝ ║║ ║║ ╔╗║║ ║║ ║║
║╚═╝║║╚═╝║║╚═╝║║║ ║║║╔╣╠╗║║ ╔╣╠╗║╚═╝║║╚═╝║ ╔╝╚╗
╚═══╝╚═══╝╚═══╝╚╝ ╚═╝╚══╝╚╝ ╚══╝╚═══╝╚═══╝ ╚══╝
┏━━━┓┏━━━┓┏━━━┓┏━━━┓┏━━┓ ┏━━━┓┏━━┓
┃┏━┓┃┃┏━━┛┃┏━┓┃┃┏━━┛┃┏┓┃ ┃┏━┓┃┗┫┣┛
┃┃ ┗┛┃┗━┓ ┃┗━┛┃┃┗━┓ ┃┗┛┗┓┃┗━┛┃ ┃┃
┃┃ ┏┓┃┏━┛ ┃┏┓┏┛┃┏━┛ ┃┏━┓┃┃┏┓┏┛ ┃┃
┃┗━┛┃┃┗━━┓┃┃┃┗┓┃┗━━┓┃┗━┛┃┃┃┃┗┓┏┫┣┓
┗━━━┛┗━━━┛┗┛┗━┛┗━━━┛┗━━━┛┗┛┗━┛┗━━┛
Within a few seconds, the red light on the T1 Ethernet port on both the boards should light up.
(The definitive guide is on cognipilot.org; it is copied here for convenience only.)
From the command line run
ros2 launch b3rb_bringup robot.launch.py foxglove:=false
Login to the NavQPlus using ssh or serial console.
Ensure the NavQPlus and Host PC are on the same network
AFTER starting ROS on the Host PC, THEN on the NavQPlus run
ros2 launch b3rb_bringup robot.launch.py
The developer guide for CogniPilot can be found here: https://cognipilot.org/releases/airy/getting_started/install
A newer version is available at: https://brave.cognipilot.org/reference_systems/b3rb/setup/
This page describes the integration of PMD Flexx2 ToF camera with NavQPlus
A Time-of-Flight camera (ToF) is a range imaging sensor that produces a depth image, each pixel of which encodes the distance to the corresponding point in the scene [1].
In addition to the camera that produces 2D images, the ToF sensor includes an IR transmitter. By measuring the phase difference between the radiated (red) and reflected (blue) IR waves, we can calculate the distance to the object as depicted in the figure below.
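As a rough numeric illustration of this principle, the distance follows d = c·Δφ/(4π·f_mod). The sketch below is illustrative only; the 60 MHz modulation frequency is an assumed value for the example, not a Flexx2 specification.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Distance from the phase shift of a continuous-wave ToF signal:
    d = c * delta_phi / (4 * pi * f_mod). The 4*pi (rather than 2*pi)
    accounts for the light travelling to the object and back."""
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

# A 90-degree phase shift at an assumed 60 MHz modulation frequency:
print(round(tof_distance(math.pi / 2, 60e6), 3))  # 0.625 (metres)
```

Note that the phase wraps every 2π, so the unambiguous range is c/(2·f_mod); lower modulation frequencies extend the range at the cost of depth resolution.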
The PMD Flexx2 camera's intended uses are augmented reality / virtual reality and SLAM. It has the following specifications:
Depth Measurements from 10 cm to 7 m
VCSEL with 940 nm Wavelength
38,000 3D Pixels with up to 60 FPS
Independent of External Light Source
224 x 172 Resolution
Multiple built-in User Modes
56 x 44 Degree Field-of-View
USB 3.0 Type-C Port
71.9 x 19.2 x 10.6 mm compact size
Software Suite "Royale" and API
Download the royale SDK from the official website: https://pmdtec.com/en/download-sdk/
You need to fill in a form; you will then receive an email with a download link valid for 24 hours. Use the password that comes with the camera.
You need to download two .zip files: one for the PC on which you will display the camera data (from the camera connected to the PC or to the NavQPlus), and one for the NavQPlus.
The most recent SDK for the Flexx2 camera does not include the ROS2 driver, but the previous SDK does. We therefore recommend downloading the zip file libroyale-4.24.0.1201-LINUX-arm-64Bit-ub2004.zip for the NavQPlus.
Linux PC: libroyale-<ROYALE-SDK-VERSION>-LINUX-x86-64Bit.zip
Windows PC: libroyale-<ROYALE-SDK-VERSION>-WINDOWS-x86-64Bit.exe.zip
NavQP: libroyale-4.24.0.1201-LINUX-arm-64Bit-ub2004.zip
Connect the camera to the PC using a USB-A to USB-C cable.
If you are using a Linux machine, download the zip file libroyale-<ROYALE-SDK-VERSION>-LINUX-x86-64Bit.zip from the download page, add the udev rules then run royaleviewer after reboot:
$ unzip ~/libroyale-<ROYALE-SDK-VERSION>-LINUX-x86-64Bit.zip
$ cd ~/libroyale-<ROYALE-SDK-VERSION>-LINUX-x86-64Bit/
$ sudo cp driver/udev/10-royale-ubuntu.rules /etc/udev/rules.d/
$ sudo udevadm control --reload-rules && sudo udevadm trigger (or reboot)
$ cd ~/libroyale-<ROYALE-SDK-VERSION>-LINUX-x86-64Bit/bin
$ ./royaleviewer
A GUI will start, click on start button on the bottom right corner to start the display:
You can use the TOOLS button to change some parameters: for example, the color range or the type of data displayed (gray image, depth image), etc.
There is also a LOG button on the top right corner of the GUI where you can find useful debug or error logs.
If you are using a Windows machine, download the zip file libroyale-<ROYALE-SDK-VERSION>-WINDOWS-x86-64Bit.exe.zip. Unzip the file, then run the executable to install the Royale Viewer.
Once installed you can find the executable with the name royaleviewer-royale_<ROYALE-SDK-VERSION> under C:\Users\<USER>\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\royale_<ROYALE-SDK-VERSION> (Win64)
Once the program is launched, the steps are similar to those described in the Linux sub-section.
Connect the camera to the NavQPlus with a USB-C to USB-C cable (see picture below).
Before using the camera, you need to add the udev rules available in the SDK zip file that you downloaded from PMD download link.
$ unzip ~/libroyale-4.24.0.1201-LINUX-arm-64Bit-ub2004.zip
$ cd ~/libroyale-4.24.0.1201-LINUX-arm-64Bit/
$ sudo cp driver/udev/10-royale-ubuntu.rules /etc/udev/rules.d/
$ sudo udevadm control --reload-rules && sudo udevadm trigger (or reboot)
Under the bin/ directory, run the binary tcpserver:
$ cd ~/libroyale-4.24.0.1201-LINUX-arm-64Bit/bin
$ ./tcpserver
The Royale SDK provides sample code under the samples/cpp/ directory. There you should find a directory sampleROS2, which contains the code for the ROS2 driver.
You need to adjust this code so that it compiles under ROS2 Humble. First create a ROS2 workspace and copy the ROS2 driver there:
$ mkdir -p ws_tof/src
$ cp -rf ~/libroyale-4.24.0.1201-LINUX-arm-64Bit/samples/cpp/sampleROS2 ~/ws_tof/src
Then change the CMakeLists.txt file and the package.xml file to add the package pluginlib dependency:
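The exact edits depend on the SDK version; as a sketch based on standard ROS 2 packaging conventions (these lines are assumptions, not taken from the SDK's actual files), the two additions look roughly like this:

```xml
<!-- package.xml: add next to the existing dependencies -->
<depend>pluginlib</depend>
```

```cmake
# CMakeLists.txt: find pluginlib and add it to the node's dependencies
find_package(pluginlib REQUIRED)
ament_target_dependencies(royale_in_ros2 pluginlib)
```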
Then compile the ROS2 workspace using the following command:
$ colcon build --cmake-args "-DCMAKE_PREFIX_PATH=~/libroyale-4.24.0.1201-LINUX-arm-64Bit/share"
Change the path to Royale SDK depending on your environment, and make sure to reference to share/ directory in this command.
$ source install/setup.bash
$ ros2 run royale_in_ros2 royale_in_ros2
Make sure that the mlan0 interface is listed in the list of interfaces in the CycloneDDSConfig.xml file, then run RViz on a PC:
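As a sketch of what that entry can look like (element names follow the CycloneDDS XML schema; verify against the file shipped with your installation):

```xml
<CycloneDDS xmlns="https://cdds.io/config">
  <Domain Id="any">
    <General>
      <Interfaces>
        <!-- Bind DDS traffic to the Wi-Fi interface used by the NavQPlus -->
        <NetworkInterface name="mlan0"/>
      </Interfaces>
    </General>
  </Domain>
</CycloneDDS>
```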
[1] Hansard, Miles, Seungkyu Lee, Ouk Choi, and Radu Horaud. Time-of-Flight Cameras: Principles, Methods and Applications. SpringerBriefs in Computer Science. London: Springer London, 2013. https://doi.org/10.1007/978-1-4471-4658-2.
[2] https://3d.pmdtec.com/en/3d-cameras/flexx2/
[3] SDK download link: https://pmdtec.com/en/download-sdk/
There is an additional B3RB gitbook supporting the NXP AIM CHALLENGE (2024) where the B3RB is configured to use vision to drive around a track and avoid several obstacles. The simulation setup may be of interest to others and can be found here: https://nxp.gitbook.io/nxp-aim/running-simulation
This tutorial describes how to use the SteamVR tracker with the B3RB
The Vive tracker can be used to provide a reasonably accurate ground truth reference for robotics. By enabling it using the libsurvive library, position and pose information can be viewed and stored in ROSBAGS for further analysis.
VIVE Tracker (3.0) can pair with HTC’s wireless dongle or use its USB interface to transfer tracking data to a PC. An accessory attached to VIVE Tracker (3.0) can:
Simulate buttons of the VIVE Controller through the underlying Pogo pin port.
Send specific data to a PC through the USB interface of VIVE Tracker (3.0).
Tracking: SteamVR BS1.0 and BS2.0
Status indicator: LED
Input: Power button, Pogo pin, USB-C
Charging: USB-C
Attachment: 1/4-inch UNC threaded mount (standard tripod mount)
The status light shows:
Green when the controller is in normal mode
Blinking red when battery is low
Blinking blue when the controller is pairing with the headset
Blue when the controller is connecting with the headset
Orange when charging
As mentioned in the Developer guide [2], the VIVE Tracker provides five use cases:
Use case 1: Track passive objects through USB interface in VR. In this case, the dongle is not used. VIVE Tracker (3.0) is connected to the PC through USB to directly transfer tracking data.
Use case 2: Track passive objects through the USB interface in VR, with the accessory passing data to a PC through USB, BT/Wi-Fi or proprietary RF. This is similar to Use Case 1, but the accessory directly transfers the tracking data to a PC for a specific purpose based on its design.
Use case 3: Track moving objects by wireless interface in VR. In this case, the dongle is used to transfer tracking data from the VIVE Tracker (3.0) to a PC.
Use case 4: Track moving objects using a wireless interface in VR, with the accessory passing data to a PC through USB, BT/Wi-Fi or proprietary RF. This is similar to Use Case 3, but the accessory directly transfers the tracking data to/from a PC for a specific purpose based on its design.
Use case 5: Track moving objects using a wireless interface in VR, with the accessory simulating buttons of the VIVE Controller or passing data to a PC through the VIVE Tracker (3.0). This is similar to Use Case 3 but the accessory connects with the VIVE Tracker (3.0) to transfer a button event to a PC through the Pogo pins or USB interface.
To use the VIVE Tracker on the NavQPlus with ROS2, follow the instructions in this GitHub repository: https://github.com/asymingt/libsurvive_ros2
libsurvive_ros2 provides a lightweight ROS2 wrapper around the libsurvive project, an open-source set of tools and libraries that enable 6-DoF tracking on Lighthouse- and Vive-based systems.
First of all set the udev rules:
sudo curl -fsSL https://raw.githubusercontent.com/cntools/libsurvive/master/useful_files/81-vive.rules \
-o /etc/udev/rules.d/81-vive.rules
sudo udevadm control --reload-rules && sudo udevadm trigger
And install the required packages:
sudo apt-get install build-essential \
cmake \
freeglut3-dev \
libatlas-base-dev \
liblapacke-dev \
libopenblas-dev \
libpcap-dev \
libusb-1.0-0-dev \
libx11-dev \
zlib1g-dev
Then create a workspace and install the ROS2 wrapper:
mkdir -p ~/ros2_ws/src
cd ~/ros2_ws/src
git clone https://github.com/asymingt/libsurvive_ros2.git
cd ..
colcon build
source install/setup.bash
Connect the VIVE Tracker to the NavQPlus using a USB-C to USB-C cable.
Launch the ROS2 node:
ros2 launch libsurvive_ros2 libsurvive_ros2.launch.py rosbridge:=true
Three topics are published:
/libsurvive/cfg: listens for device configuration
/libsurvive/imu: listens for device inertial measurements
/libsurvive/joy: listens to the button
We need to look at the topic /libsurvive/imu of type sensor_msgs/msg/Imu.
To display topic data in foxglove, we need first to add the topics /libsurvive/imu and /tf_static to the whitelist.
This is simply the default method for connecting the unit. It can be used to ensure the device is working correctly. However for collecting data in ROS2, you will want to use the libsurvive library.
1- Download steam : https://store.steampowered.com/about/
2- Download steamVR https://store.steampowered.com/app/250820/SteamVR/
Connect the tracker to the PC via the USB-C cable, then open the Steam app. Go to "Library", then launch the SteamVR app (click on the green button).
When launched, a small window will appear:
[1] Official website
[2] Developer guide
[3] Borges, Miguel, Andrew Symington, Brian Coltin, Trey Smith, and Rodrigo Ventura. “HTC Vive: Analysis and Accuracy Improvement.” In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2610–15. Madrid: IEEE, 2018. https://doi.org/10.1109/IROS.2018.8593707.
Measurement of distance within the range 1 mm to 1300 mm for cliff detection using VL53L4CD.
The B3RB can use a ToF distance sensor as a cliff detector, with the appropriate software to prevent it from driving off a table, down a stairway, or over any other "cliff" it may encounter while moving forward. These sensors could also be pointed in other directions for other proximity-detection applications. This tutorial describes how to perform cliff detection on the MR-B3RB using the ToF VL53L4CD sensor with the MR-CANHUBK344 and MR-CANHUBK3-ADAP boards. The example and configuration shown below use the MR-CANHUBK344 running Zephyr and CogniPilot, but show the board outside of its installation on an MR-B3RB. When used on a B3RB, power is supplied in the normal fashion, not with the external connectors shown.
The Time of Flight VL53L4CD sensor measures distance within the range 1 mm to 1300 mm.
The sample application samples/sensor/vl53l4cd polls the sensor every second, reads the current distance measured by the device, and prints the distance (in meters) on the console.
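The sample only prints distances; turning it into a cliff detector reduces to a threshold check on the measured range. The sketch below is illustrative only: the nominal floor distance, the margin, and the stop decision are assumptions, not part of the Zephyr sample.

```python
# With the sensor aimed down at the floor ahead of the vehicle, a reading
# much larger than the usual floor distance means the floor has dropped
# away (table edge, stairway), so the vehicle should stop.

FLOOR_DISTANCE_M = 0.08   # assumed nominal sensor-to-floor distance
CLIFF_MARGIN_M = 0.05     # assumed tolerance before declaring a cliff

def is_cliff(distance_m: float) -> bool:
    """Return True when the measured range indicates a drop-off ahead."""
    return distance_m > FLOOR_DISTANCE_M + CLIFF_MARGIN_M

# Readings in metres, similar in scale to the console output shown later:
readings = [0.019, 0.082, 0.450]
print([is_cliff(d) for d in readings])  # [False, False, True]
```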
Regarding item no. 9: the MR-CANHUBK344 kit that comes with the MR-B3RB-S includes multiple cables with a "SYP" connector, which can be used for supplying power to the MR-CANHUBK344.
Bill of materials (quantity 1 of each):
Item 1
Item 2
Item 3
Item 4
Item 5 (complete set)
Item 6
Item 7
Item 8
Item 9: cable for supplying power to MR-CANHUBK344
Item 10: DC power supply (any) for supplying 5V to MR-CANHUBK344
This hardware setup shows these boards outside of the MR-B3RB on a tabletop; it can of course also be done on the vehicle itself. Simply plug the sensor and the STEMMA QT cable into the CANHUB-ADAP. The power and other connections do not change from a normal MR-B3RB configuration.
Connect the hardware components as shown in the block diagram above. The CANHUB-ADAP includes an I2C connector that follows the Adafruit STEMMA QT standard.
Attach the MR-CANHUBK3-ADAP board on top of MR-CANHUBK344.
Power on the MR-CANHUBK344 board: connect the SYP connector to the MR-CANHUBK344 and the other end of the cable to your bench power supply or any source that can provide 5V DC.
cd ~
west init -m "https://github.com/NXPHoverGames/zephyr.git" cliff_detection_project
cd ~/cliff_detection_project
west update
cd ~/cliff_detection_project/zephyr
git checkout tof_vl53l4cd_with_mux_pr
west build -b mr_canhubk3 samples/sensor/vl53l4cd -p
west flash
Open a shell on the MR-CANHUBK344 over the debug serial port (replace ttyUSBn with the actual device number):
sudo chmod 666 /dev/ttyUSBn
cu -l /dev/ttyUSBn -s 115200
The shell should display output as:
proximity = 1
distance = 0.019 m
proximity = 1
distance = 0.021 m
proximity = 1
distance = 0.018 m
proximity = 1
distance = 0.020 m
...
The MR-B3RB platform is equipped with a camera connected to the NavQPlus, enabling a variety of functions such as robot visualization, Visual SLAM, and other computer vision and machine learning applications. The CogniPilot package for ROS2 includes an 'ov5645.ini' configuration file with the camera's intrinsic calibration data. This data is crucial for correcting lens distortion and understanding the camera's geometric properties in computer vision and robotics applications. However, this file serves only as an indication; users must perform a camera calibration after assembling the MR-B3RB robot to ensure better operation.
Some cameras, such as pinhole cameras, introduce significant distortion into images. The two major kinds of distortion are radial distortion and tangential distortion.
Radial distortion causes straight lines to appear curved, and it grows the farther a point is from the center of the image. Tangential distortion occurs because the lens is not aligned perfectly parallel to the imaging plane, so some areas of the image may look nearer than expected. In short, we need to find five parameters, known as the distortion coefficients.
Extrinsic parameters correspond to the rotation and translation vectors that transform the coordinates of a 3D point into the camera's coordinate system.
To find these parameters, we must provide some sample images of a well defined pattern (e.g. a chess board). We find some specific points of which we already know the relative positions (e.g. square corners in the chess board). We know the coordinates of these points in real world space and we know the coordinates in the image, so we can solve for the distortion coefficients. For better results, we need at least 10 test patterns.
In addition to this, we need some other information, such as the intrinsic and extrinsic parameters of the camera. Intrinsic parameters are specific to a camera and include information such as the focal length and optical center. The focal length and optical center can be used to build the camera matrix, which can be used to remove distortion caused by the lens of a specific camera. The camera matrix is unique to a specific camera, so once calculated it can be reused on other images taken by the same camera.
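To make the camera matrix concrete, here is a minimal pinhole projection. The focal lengths and optical centre below are made-up illustrative numbers, not the OV5645's calibrated values (those come from the calibration procedure described on this page).

```python
# Pinhole model: pixel = K @ (X/Z, Y/Z, 1), where K holds the focal
# lengths (fx, fy) and the optical centre (cx, cy), all in pixels.
fx, fy = 500.0, 500.0     # assumed focal lengths
cx, cy = 320.0, 240.0     # assumed optical centre

def project(X: float, Y: float, Z: float) -> tuple:
    """Project a 3D point in the camera frame (metres) to pixel
    coordinates, ignoring lens distortion."""
    return fx * (X / Z) + cx, fy * (Y / Z) + cy

# A point 2 m ahead, 0.1 m to the right, 0.05 m below the optical axis:
print(project(0.1, 0.05, 2.0))  # (345.0, 252.5)
```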
After this theoretical background, let's see how to calibrate the camera on the MR-B3RB robot.
Before starting, make sure that you have followed the MR-B3RB software developer guide and have installed the latest stable version of CogniPilot. (NXP-CUP: if you are a participant in the NXP CUP 2024, please also follow this tutorial: NXP-CUP 2024 with MR-B3RB)
Connect through ssh to the NavQPlus as explained in Cognipilot: CheatSheet for operation
sudo apt update
sudo apt install ros-humble-camera-calibration-parsers
sudo apt install ros-humble-camera-info-manager
sudo apt install ros-humble-launch-testing-ament-cmake
cd ~/cognipilot/cranium/src
git clone https://github.com/NXPHoverGames/b3rb_camera_calibration
cd ~/cognipilot/cranium/
colcon build --symlink-install --packages-select b3rb_camera_calibration
source ~/cognipilot/cranium/install/setup.sh
Remember to print the calibration pattern: Calibration pattern
In one terminal run the following command:
ros2 launch b3rb_camera_calibration b3rb_camera_calibration_launch.py
Then open a new terminal, connect through ssh to the NavQPlus, and run:
ros2 run b3rb_camera_calibration cameracalibrator --size 9x6 --square 0.02 --ros-args -r image:=/camera/image_raw -p camera:='chessboard'
If you want to see the output, you can run on your host computer:
ros2 launch electrode electrode.launch.py
This will open foxglove and you will be able to select the topic /calibration_camera_feed on the image panel.
You should be able to see something like this.
For the calibration process, it's advised to keep the robot stationary and move the calibration pattern to different positions instead. Place it at various angles, because the calibration system requires multiple samples and poses to calibrate accurately.
You will see the 'Continue Sampling' message in the terminal until the calibration process has collected enough samples. After that, the message will switch to 'READY TO CALIBRATE.' Additionally, the 'calibrate' text in the Foxglove image will change to blue, as shown in the image below:
After the calibration node has collected enough samples, you'll need to open a new terminal window. Then, connect to the NavQPlus via SSH to begin calculating the camera matrix. In this new terminal, execute the following command to start the calibration:
ros2 topic pub /calibration_control std_msgs/msg/String "data: 'start_calibration'" --once
For saving the results please run the following command:
ros2 topic pub /calibration_control std_msgs/msg/String "data: 'save_results'" --once
Then you must see a message similar to the following:
The calibration has now finished successfully. To stop the calibration, run:
ros2 topic pub /calibration_control std_msgs/msg/String "data: 'stop'" --once
To see the calibration results, first navigate to the cognipilot folder:
cd ~/cognipilot
Then create a new folder:
mkdir calibration_data
Then uncompress the archive, placing the files in this new folder:
tar -xzvf calibrationdata.tar.gz -C calibration_data/
The files ost.txt and ost.yaml contain the camera matrix and distortion parameters.
You have to print a calibration pattern. This tutorial uses a 9x6 chessboard pattern, but you can use another pattern. This website allows you to choose the one you prefer: https://calib.io/pages/camera-calibration-pattern-generator
This is the one used in this tutorial: https://github.com/opencv/opencv/blob/4.x/doc/pattern.png
Please print it on A4 paper, maintaining the original size.
In this tutorial you will learn how to integrate the BMS772 into the B3RB.
It is important to take precautions when working with batteries:
Before connecting a battery to the BMS, test it first using a power supply which has current limiting enabled. See section: Setup validation. This is to avoid damage that could occur because of bad soldering, a short of some kind or swapped wires.
LiPo batteries can be dangerous and catch on fire. When first plugging in a LiPo battery, please do it outdoors in an area that is fire-safe. Have a fire extinguisher nearby.
The RDDRONE-BMS772 is a standalone BMS reference design supporting 3 to 6 cell batteries.
The General Purpose MCU S32K144 provides great flexibility and communication to a PX4 based FMU via UAVCAN or I2C/SMBus. More information about the board can be found in references [1][2].
The BMS will be integrated in the system following the wiring diagram below:
For setting up the hardware, we need:
prepare the 3D printed mount
solder connectors and make cables
solder the balance leads and make its extension cable
prepare a battery emulator setup
This particular 3D print allows the RDDRONE-BMS772 to attach to the pre-production version of the MR-B3RB lower-chassis metal frame. Future versions may not need this adapter, or may be able to use standoffs directly.
We recommend pluggable cable connectors as shown below. Follow these images for guidance. Some XT60 connectors have a "back shell" to also help cover the solder joints. The backshell is preferable.
Because of the limited space on the B3RB, it is preferable to use a balancing-leads connector that points upward. This better clears the PDB adjacent to this board and allows you to more easily plug in the battery's balancing cable later. Solder the white balance connector as shown below:
You will also need to make an extension cable for the balance-leads connector, since the leads that typically come with the battery are quite short. The suggested cable length is approximately 17 cm.
In this section we will prepare a setup to emulate the battery using a power supply. For this setup we need:
4 x 1kΩ resistors
2 wires: one black and one red
Female balance leads connector with wires
Here are the steps:
Take the female balance-leads connector with its wires and cut the wires short.
Solder the resistors in series then solder two cables in both sides (black and red) as depicted in the picture above. Please make sure that the sides of the black and red wires are similar to the picture.
Connect the black and red wires of the battery-cell emulator to the - and + inputs of the power supply.
Set the voltage on your power supply to 12V, then plug the input port into the power supply and the output port into the load you are powering.
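With four equal resistors in series across 12 V, the balance taps sit at equal voltage steps, so the BMS sees four emulated cells of 3 V each. A quick check of the tap voltages (plain voltage-divider arithmetic):

```python
# N equal resistors across V_in form a ladder; tap k (counted from the
# negative end) sits at V_in * k / N, emulating N identical cells.
V_IN = 12.0
N = 4  # number of 1 kOhm resistors

taps = [V_IN * k / N for k in range(1, N + 1)]
print(taps)     # voltage at each balance tap: [3.0, 6.0, 9.0, 12.0]
print(taps[0])  # per-cell voltage seen by the BMS: 3.0
```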
You can compile a binary from this GitHub repository: or flash the existing binary (follow ).
Activate the cyphal can and setup the can id:
In this section, we present three methods to display received messages via CAN:
using yakut CLI
using pycyphal library
using pycyphal library and ROS2
Before the next three steps, you need to prepare the environment variables and add the DSDL namespace as follows:
Yakút is a simple cross-platform command-line interface (CLI) tool for diagnostics and debugging of Cyphal networks. (For more information refer to [3])
The BMS772 software is already publishing three messages:
type: reg.udral.physics.electricity.SourceTs_0_1, port: 4096
type: reg.udral.service.battery.Status_0_2, port: 4097
type: reg.udral.service.battery.Parameters_0_3, port: 4098
Print published Cyphal message in a terminal:
First, clone the following repository:
Run the script to print the received cyphal messages:
Clone the following repository:
Run the ros2 node:
Print the topic:
Once the topic is publishing battery state messages, we can display it in Foxglove. First add the topic to the whitelisted topics in the robot.launch.py file.
In your PC, download the following layout file and add it to Foxglove studio:
You should see two different tabs: one tab General for displaying the B3RB topics and one tab Battery for displaying the battery status raw data:
[1] BMS gitbook:
[2]
[3]
This section explains the essential software required for participating in the NXP Cup 2024 using the MR-B3RB Robot.
Participants will use a combination of pre-configured ROS2 and Zephyr-Cerebri software. Here, "Cerebri" refers to the application layer, while "Zephyr" is the underlying Real-Time Operating System (RTOS). This software stack integrates with the Cognipilot autopilot system, which is available at .
The following pages provide detailed installation instructions for both the simulation environment and the actual robot. Additionally, there is a section dedicated to explaining the line follower algorithm for the NXP Cup 2024.
There is an opening in the rear plastic which can accept antennas from telemetry radios and other types of antennas. Changing the WiFi/BT antenna on NavQPlus may result in improved range. However it should be noted that this may cause your B3RB to no longer be compliant with FCC or CE regulations. It is your responsibility to know if you are compliant. The NavQPlus is tested and compliant only when used with the antenna provided when shipped.
Specifications: Gain: 8dBi Frequency Range: Dual Band WiFi 2.4GHz 5.8GHz VSWR: <2.0; Polarization: Linear Vertical; Impedance: 50 ohm; Direction: Omni-directional; Connector: RP-SMA Male Connector; Antenna Dimension: 16cm x 1.5cm Diameter; Net Weight: 62g; Operating Temperature: -20°С ~ +80°С Sto...
ON/Off power switch
A low cost 20A power switch can be added inline to the battery power cable in order to turn the whole unit off without having to remove the battery. A 3D printed bracket was designed to attach to the underside of the main plate, and stick out the front of the B3RB under the front headlights.
Cognipilot MR-B3RB Robot installation specific instructions for nxp-cup
To follow this tutorial you first need to follow and install everything explained in:
You must also follow the , as it explains how to update your development computer for the NXP-CUP 2024.
If you have correctly followed the previous steps, you should already be familiar with how to flash the Cognipilot software onto the MR-CANHUBK344 board. In this step, we will flash the board again, but this time with an updated version of the Cerebri software that includes support for the NXP Cup 2024. Additionally, we will need to log in via SSH to the NavQPlus and update the cranium workspace to enable support for the NXP Cup, similar to the procedure outlined on the previous page.
In your development computer go to:
Check that you are on the nxp-cup branch
Which should return:
If it doesn't show that:
If you have ensured that you are on the nxp-cup branch and that there are no updates available, then type the following commands:
west
west
Make sure the J-Link software is on your system and the MR-CANHUBK3 is connected to the JLink programmer.
Once it has finished, you can disconnect the J-Link from your computer and from the MR-CANHUBK344.
If you have any doubt or problem in this section please first refer to this documentation:
Once you have flashed the MR-CANHUBK344, it's time to log into the NavQPlus. Do this with the following command:
Next, we will install the packages to develop the algorithm for the NXP CUP 2024 in the B3RB robot.
To simplify repository updates, we've created a shell script that automates updating all repositories to the latest versions hosted on GitHub, specifically for the NXP Cup.
The changes on the repositories are:
Synapse Protobuf and Synapse Tinyframe: Define the Pixy Vector message in the Protobuf language and provide essential definitions and functions for the TinyFrame library to operate correctly. For more information, refer to the documentation.
Synapse msgs and Synapse ros: Extend ROS with the Pixy Vector message definition and support Synapse, which facilitates communication between the NavQPlus (ROS2) and the CANHUBK344 (Zephyr).
B3RB robot: Includes the launch file for the node that processes the race track images.
NXP CUP Vision: Adds a ROS2 package for processing race-track images, transforming perspectives, detecting track edges, and publishing both the debug image and the Pixy Vector, pivotal for visual processing.
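As an illustration of how a Pixy-style track-edge vector can be turned into a steering command (the field layout and the gain are assumptions for this sketch, not the actual synapse_msgs definition):

```python
import math

def steering_from_vector(x0: float, y0: float, x1: float, y1: float,
                         gain: float = 1.0) -> float:
    """Steering angle (radians) from a track vector running from tail
    (x0, y0) to head (x1, y1) in image coordinates, where y grows
    downward. A vector pointing straight up the image means 'drive
    straight'; leaning right/left yields a positive/negative command."""
    return gain * math.atan2(x1 - x0, -(y1 - y0))

# Vector pointing straight up the image -> zero steering:
print(round(steering_from_vector(40, 50, 40, 10), 3))  # 0.0
# Vector leaning to the right -> positive (right) steering:
print(round(steering_from_vector(40, 50, 60, 10), 3))  # 0.464
```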
This script ensures that all the repositories are updated. Save it as update_repos_navqplus.sh in the ~/cognipilot directory, using any text editor of your choice, such as Vim or nano.
Then, open the terminal and execute the following commands:
Change directory to ~/cognipilot and make the script executable:
This should have updated all your repositories. Please review the script's output; if you have made local changes to the repositories, you may need to stash them.
To build the software, navigate to the cranium/src directory within ~/cognipilot:
Now, you are ready to operate the platform.
On the NavQPlus, run the B3RB launch file:
To start Foxglove for operation and data visualization, run the following command on your development computer:
These commands will start the algorithm on the MR-B3RB robot and the Foxglove viewer. You should see your robot on the racetrack.
To initiate the robot's movement, press the AUTO button and the arm button on the joystick. This action will activate the robot's line-following mode and set it to the armed state, preparing it for operation.
Once the AUTO and arm buttons on the joystick are pressed, the robot should immediately begin to navigate around the racetrack. To understand the underlying code and setup, and to explore ways to obtain a better performance for the NXP CUP 2024, proceed to the next tutorial.
nsh> bms set can-mode cyphal
nsh> bms set cyphal-node-static-id 100
nsh> bms save
nsh> reboot
mkdir -p ~/.cyphal
wget https://github.com/OpenCyphal/public_regulated_data_types/archive/refs/heads/master.zip -O dsdl.zip
unzip dsdl.zip -d ~/.cyphal
mv -f ~/.cyphal/public_regulated_data_types*/* ~/.cyphal
export UAVCAN__NODE__ID=42
export UAVCAN__CAN__IFACE=socketcan:can0
export UAVCAN__CAN__MTU=8
export CYPHAL_PATH=/home/user/.cyphal
# yakut sub --with-metadata port:type
# Run the following commands in separate terminals
yakut sub --with-metadata 4096:reg.udral.physics.electricity.sourcets
yakut sub --with-metadata 4097:reg.udral.service.battery.status
yakut sub --with-metadata 4098:reg.udral.service.battery.parameters
git clone git@github.com:mounalbaccouch-nxp/BMS_cyphalcan.git
cd BMS_cyphalcan
python3.10 cyphal_bms_print_status.py
git clone git@github.com:mounalbaccouch-nxp/BMS_cyphalcan.git
cd BMS_cyphalcan
python3.10 cyphal_bms_ros2-pub_status.py
ros2 topic echo /battery_state
cd ~/cognipilot/ws/cerebri
git pull
cd ~/cognipilot/ws/cerebri
git status
On branch nxp-cup
Your branch is up to date with 'origin/nxp-cup'.
nothing to commit, working tree clean
git checkout nxp-cup
cd ~/cognipilot/ws/cerebri
west update
cd ~/cognipilot/ws/cerebri
west build -b mr_canhubk3 app/b3rb -p
cd ~/cognipilot/ws/cerebri
west flash
JLinkExe
connect
S32K344
S
4000
loadbin /home/$user/cognipilot/ws/cerebri/build/zephyr/zephyr.elf 0x400000
exit
ssh <username>@<hostname>.local
#!/bin/bash
declare -A repos=(
["cranium/src/synapse_protobuf"]="https://github.com/NXPHoverGames/synapse_protobuf"
["cranium/src/synapse_tinyframe"]="https://github.com/NXPHoverGames/synapse_tinyframe"
["cranium/src/synapse_msgs"]="https://github.com/NXPHoverGames/synapse_msgs"
["cranium/src/synapse_ros"]="https://github.com/NXPHoverGames/synapse_ros"
["cranium/src/b3rb_robot"]="https://github.com/NXPHoverGames/b3rb_robot"
["cranium/src/nxp_cup_vision"]="https://github.com/NXPHoverGames/nxp_cup_vision"
)
# Update existing repositories or clone if they don't exist
for repo in "${!repos[@]}"; do
echo "Processing $repo..."
repo_path="${repo}" # Derive the full path
# Check if the repository directory exists. If it does not, clone it.
if [ ! -d "${repo_path}" ]; then
echo "Repository ${repo_path} does not exist. Cloning..."
git clone --branch nxp-cup "${repos[$repo]}" "${repo_path}"
echo "Cloned ${repo} into ${repo_path}."
fi
# Navigate to the repository directory
cd "${repo_path}" || { echo "Failed to change directory to ${repo_path}. Does it exist?"; continue; }
# Set the new remote URL
git remote set-url origin "${repos[$repo]}"
echo "Remote changed to ${repos[$repo]}"
# Fetch changes from the new remote
git fetch origin
echo "Fetched changes from origin."
# Checkout the specific branch
git checkout nxp-cup
if [ $? -eq 0 ]; then
echo "Checked out nxp-cup branch."
# Pull the latest changes from the branch
git pull origin nxp-cup
echo "Pulled latest changes from nxp-cup branch."
else
echo "Failed to checkout nxp-cup branch. Does it exist?"
fi
# Return to the original directory
cd - > /dev/null
done
cd ~/cognipilot
chmod +x update_repos_navqplus.sh
./update_repos_navqplus.sh
cd ~/cognipilot/cranium/
colcon build --symlink-install
cd ~/cognipilot/cranium/
source install/setup.bash
ros2 launch b3rb_bringup robot.launch.py
ros2 launch electrode electrode.launch.py
Try playing with this option for performing the flip without 2D:
gst-launch-1.0 v4l2src device=/dev/video0 extra-controls='c,horizontal_flip=1,vertical_flip=1' ! xxxxx
[1/11 3:41 PM] Benjamin Perseghetti
K3 <-ENET-> NavQPlus improvement update!
Hey everyone, we just merged a massive improvement for the network between K3 and NavQPlus. Please follow these instructions to update your B3RB system:
Update B3RB and dev computer's cranium by running on both of them:
cd ~/cognipilot/cranium/
vcs pull
colcon build --symlink-install
Update, build, and flash the new cerebri image from the dev computer (where these instructions should be run) to the K3 with the programmer attached:
cd ~/cognipilot/ws/cerebri
git pull
west build -b mr_canhubk3 app/b3rb -p
west flash
This note is for code inside of CogniPilot and should be considered more advanced. It is included here only for reference.
If you want to play around with making your own:
import numpy as np

def create_sound(tone_name, freq_max, freq_min, duration_sec, num_phases, steps):
    print('struct tones_t {:s}[] = {{'.format(tone_name))
    amp = (freq_max-freq_min)/2
    for i in np.linspace(0, duration_sec, num=steps, endpoint=True):
        note = freq_min+amp*(1+np.sin(np.pi*(num_phases/duration_sec)*i))
        note_str = ' {{ .note = {:d}, .duration = {:d} }},'.format(int(note), int((duration_sec/steps)*1000))
        print(note_str)
    print(' { .note = REST, .duration = whole },')
    print('};')

create_sound('b3rb_reject_tone', 2500, 150, .75, 2.5, 60)
And all I did was insert this at the very top of the generated tone:
struct tones_t b3rb_reject_tone[] = {
{ .note = 110, .duration = half },
{ .note = REST, .duration = thirtysecond },
This section is intended as a quick guide to set up the buggy and let it drive either autonomously or manually.
Instead of the M10 GPS provided:
if you want to use your M8 GPS add this to your cerebri/app/b3rb/boards/mr_canhubk3.conf:
CONFIG_CEREBRI_SENSE_UBX_GNSS_MODULE_TYPE_M8=y
CONFIG_CEREBRI_SENSE_UBX_GNSS_BAUD=38400
And to remove the need for the battery monitor also add:
CONFIG_CEREBRI_SENSE_POWER=n
This provides a clip/mount for the battery extension cable, which attaches to the front standoff between the body and the frame.
If you're integrating the Cognipilot software with the MR-CANHUBK344 and are not using a Power Distribution Board (PDB) with power measurement capability, you'll need to make a specific configuration adjustment to ensure proper functionality. This guide walks you through the necessary steps to modify the configuration for compatibility with your hardware setup.
When building the CogniPilot software for real robot operation on the MR-CANHUBK344 (see CogniPilot: Prepare MR-CANHUBK344 for programming), if you are using a PDB without power measurement capability, as shown in the picture above, you have to make the following update:
First, you must update a configuration line in the prj.conf file to reflect the absence of power measurement capabilities in your setup. Follow these instructions:
Navigate to the configuration file located at ~/cognipilot/ws/cerebri/app/b3rb/prj.conf.
Locate the line CONFIG_CEREBRI_SENSE_POWER=y and also the line CONFIG_CEREBRI_SENSE_SAFETY=y.
Change them to CONFIG_CEREBRI_SENSE_POWER=n and CONFIG_CEREBRI_SENSE_SAFETY=n respectively.
This adjustment disables the power and safety sensing features, accommodating the hardware limitation.
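After the edit, the affected fragment of prj.conf should end up looking like the sketch below (only these two lines change; everything else in the file stays as-is):

```
# ~/cognipilot/ws/cerebri/app/b3rb/prj.conf (fragment)
CONFIG_CEREBRI_SENSE_POWER=n
CONFIG_CEREBRI_SENSE_SAFETY=n
```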
After updating the configuration, proceed with the following steps to apply the changes and build the software:
cd ~/cognipilot/ws/cerebri
west update
cd ~/cognipilot/ws/cerebri/app/b3rb
west build -b mr_canhubk3 -p
west flash
By following these steps, you configure the Cognipilot software to operate correctly with your MR-CANHUBK344 hardware without power measurement capability.
This page explains the software required for the NXP CUP 2024 with the MR-B3RB robot.
The CogniPilot software is integrated into both the NavQPlus (within ROS2) and the MR-CANHUBK344 (within the Zephyr RTOS), together forming an Ackermann-steering robotic development platform.
Please follow the installation instructions to obtain the modified branch of the CogniPilot software, specifically tailored for the NXP CUP 2024; these instructions are explained in the previous section: NXP-CUP 2024: Simulation Install instructions.
The ROS2 part of the software is designed for high-level tasks and should be treated as a "black box" for the NXP-CUP 2024, meaning that its code should not be modified. If you encounter any errors in this code, please report them to the NXP Cup organizers or through the technical support channel on the official NXP CUP Discord: https://discord.gg/JcWPbw649S.
The main CogniPilot modification on the ROS2 side for the NXP Cup 2024 is the addition of the nxp_track_vision node, which receives camera data, transforms the perspective, and extracts the Pixy Vector message. This message consists of two vectors, each with a tail and a head, defining the borders of the race track. This information is used by the robot controller to ensure that the robot stays within the track boundaries.
Here's the definition of the Pixy Vector ROS2 message:
std_msgs/Header header
# Vector 0 head and tail points
uint32 m0_x0 # Tail of vector @ x
uint32 m0_y0 # Tail of vector @ y
uint32 m0_x1 # Head of vector @ x
uint32 m0_y1 # Head of vector @ y
# Vector 1 head and tail points
uint32 m1_x0 # Tail of vector @ x
uint32 m1_y0 # Tail of vector @ y
uint32 m1_x1 # Head of vector @ x
uint32 m1_y1 # Head of vector @ y
The std_msgs/Header in the Pixy Vector ROS2 message includes the timestamp indicating when the message was sent. Additionally, the message is composed of two vectors, each represented by a pair of points: one indicating where the vector begins (tail) and the other where it ends (head). Each of these points is defined by an x and a y value, specifying its position in two-dimensional space.
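To make the geometry concrete, here is a small, self-contained Python sketch (not part of the shipped code; the frame dimensions match the constants used later in velocity.c) that normalizes a vector's tail-to-head displacement by the frame size, which is how the controller consumes these fields:

```python
# Frame dimensions used by the nxp_track_vision output (same values as velocity.c)
FRAME_WIDTH, FRAME_HEIGHT = 78.0, 51.0

def vector_slope(x0, y0, x1, y1):
    """Normalized (x, y) displacement from the vector's tail to its head."""
    dx = (x1 - x0) / FRAME_WIDTH
    dy = (y1 - y0) / FRAME_HEIGHT
    return dx, dy

# Example vector 0: tail (10, 50) near the bottom, head (20, 5) near the top
dx, dy = vector_slope(10, 50, 20, 5)
print(round(dx, 3), round(dy, 3))  # -> 0.128 -0.882
```

A negative dy means the vector points "up" the image, i.e. away from the robot along the track.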
The nxp_track_vision node also publishes the Camera Debug Image. This image shows the perspective transformation applied to the camera's view, along with the detected lines of the track. Additionally, it displays the vectors being sent by the track vision node. This visual feedback is helpful for debugging and optimizing the robot's control software.
This is an example of the debug camera image:
You can visualize this image through Foxglove. To do that, first run:
ros2 launch electrode electrode.launch.py #sim:=true (If you are using simulation)
Then add an Image panel to the B3RB layout provided and configure the ROS2 topic to /nxp_cup/debug_image.
Next, the line-following control algorithm will be explained. This algorithm runs in Cerebri, on the Zephyr RTOS on the MR-CANHUBK344. Cerebri acts as the brain of the operation, handling the decision-making that guides the robot along the track.
This code is what you must modify and improve for the NXP CUP 2024.
The main function of the line follower algorithm is located in the velocity state inside the finite state machine logic that manages the robot's behaviour. This file is located in the ~/cognipilot/ws/cerebri/app/src/velocity.c
This is the GitHub location: See code.
The application folder is where the vehicle finite state machine, control, and estimation code reside. For more information visit: https://airy.cognipilot.org/cerebri/platforms/rovers/
In that folder you will find a file called velocity.c. This code is written in the C language and has two main operation modes: the cmd_vel mode, which listens to the /cmd_vel topic published by the ROS2 Nav2 node, and the auto mode, which listens to the Pixy Vector message sent by the nxp_track_vision node and estimates the velocity needed to keep the robot on the race track:
// Follow line function in velocity.c
static void follow_line(context* ctx)
{
double frame_width = 78;
double frame_height = 51;
double window_center = frame_width / 2;
double linear_velocity=0.7;
double angular_velocity=-0.6;
double single_line_steer_scale=0.6;
double x = 0.0;
double y = 0.0;
double steer = 0.0;
double speed = 0.0;
int num_vectors = 0;
if(!(ctx->pixy_vector.m0_x0 == 0 && ctx->pixy_vector.m0_x1 == 0 && ctx->pixy_vector.m0_y0 == 0 && ctx->pixy_vector.m0_y1 == 0)) {
num_vectors++;
}
if(!(ctx->pixy_vector.m1_x0 == 0 && ctx->pixy_vector.m1_x1 == 0 && ctx->pixy_vector.m1_y0 == 0 && ctx->pixy_vector.m1_y1 == 0)) {
num_vectors++;
}
switch(num_vectors) {
case 0:
speed = 0.0;
steer = 0.0;
break;
case 1:
if(ctx->pixy_vector.m0_x1 > ctx->pixy_vector.m0_x0) {
x = (ctx->pixy_vector.m0_x1 - ctx->pixy_vector.m0_x0) / frame_width;
y = (ctx->pixy_vector.m0_y1 - ctx->pixy_vector.m0_y0) / frame_height;
} else {
x = (ctx->pixy_vector.m0_x0 - ctx->pixy_vector.m0_x1) / frame_width;
y = (ctx->pixy_vector.m0_y0 - ctx->pixy_vector.m0_y1) / frame_height;
}
if((ctx->pixy_vector.m0_x0 != ctx->pixy_vector.m0_x1) && ((y > 0.0) || (y <0.0))) {
steer = -angular_velocity * (x / y) * single_line_steer_scale;
} else {
steer = 0.0;
}
speed = linear_velocity;
break;
case 2:
if((ctx->pixy_vector.m1_x0 >= ctx->pixy_vector.m1_x1) && (ctx->pixy_vector.m0_x0 <= ctx->pixy_vector.m0_x1)){
steer = angular_velocity * (((ctx->pixy_vector.m0_x1 + ctx->pixy_vector.m1_x1) / 2.0) - window_center) / frame_width;
}else if ((ctx->pixy_vector.m1_x0 < ctx->pixy_vector.m1_x1) && (ctx->pixy_vector.m0_x0 <= ctx->pixy_vector.m0_x1)){
steer = angular_velocity * (((ctx->pixy_vector.m0_x1 + ctx->pixy_vector.m1_x0) / 2.0) - window_center) / frame_width;
}else if ((ctx->pixy_vector.m1_x0 > ctx->pixy_vector.m1_x1) && (ctx->pixy_vector.m0_x0 > ctx->pixy_vector.m0_x1)){
steer = angular_velocity * (((ctx->pixy_vector.m0_x0 + ctx->pixy_vector.m1_x1) / 2.0) - window_center) / frame_width;
}else {
steer = angular_velocity * (((ctx->pixy_vector.m0_x0 + ctx->pixy_vector.m1_x0) / 2.0) - window_center) / frame_width;
}
speed = linear_velocity;
break;
}
double vel_linear_x = speed * (1 - fabs(2 * steer));
double turn_angle = 0;
double omega_fwd = 0;
double V = vel_linear_x;
double omega = steer;
double delta = 0;
CASADI_FUNC_ARGS(ackermann_steering);
args[0] = &ctx->wheel_base;
args[1] = &omega;
args[2] = &V;
res[0] = &delta;
CASADI_FUNC_CALL(ackermann_steering);
omega_fwd = V / ctx->wheel_radius;
if (fabs(V) > 0.01) {
turn_angle = delta;
}
b3rb_set_actuators(&ctx->actuators, turn_angle, omega_fwd);
}
There are several important variables and functions that are used in the code. A key description of them is given below:
context* ctx
This variable is a pointer to the context struct, a custom data type that stores information such as the Pixy Vector, the robot status, and the wheel base and radius.
Pixy Vector
Contains information of pixy vectors obtained from the nxp_track_vision node. This includes the starting and ending points (x0, y0, x1, y1) for each vector detected by the camera
linear_velocity
The variable that stores the linear velocity at which the car will move forward. It is a constant value representing the speed in a straight line
angular_velocity
The variable that stores the angular velocity at which the car will steer. It represents the rate of change in the car's direction
single_line_steer_scale
This variable controls how much the car steers when it can only find one line of the race track. It scales the angular velocity to adjust the steering sensitivity
steer
Variable that stores the steer or angular velocity in the same way that the standard cmd_vel message from ROS2 does. It is calculated based on the Pixy Vector information and controls the direction the car turns
vel_linear_x
Variable that stores the linear velocity or vel.linear.x in the same way that the standard cmd_vel message from ROS2 does. It is adjusted based on the steering angle to ensure smooth turns
num_vectors
Variable that stores the number of Pixy Vectors obtained. The value range for this variable is 0, 1, or 2, indicating how many lines the Pixy camera has detected on the track
frame_width
Width of the green frame on the debug image, which the Pixy vector values from the nxp_track_vision node are based on
frame_height
Height of the green frame on the debug image, which the Pixy vector values from the nxp_track_vision node are based on
The switch instruction evaluates the number of vectors found in the camera image; what the robot car does depends on this number:
If no vectors are found (case 0), then the vehicle will stop.
If one vector is found, the algorithm finds the gradient of the vector, stores it in steer, and sets the speed to the linear velocity.
If two vectors are found, the example algorithm finds the offset in the x direction of the average of the two head points of the vectors. This yields a steering value that steers the cup car in the correct direction, which is stored in steer. The speed value is calculated and stored in speed.
After calculating the speed and steer values, the CasADi ackermann_steering function is called to compute the control variables for the actuators, which then drive the robot.
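As a cross-check of the logic above, here is a hedged Python transcription of the two-vector case and the actuator conversion. The constants mirror velocity.c; the WHEEL_BASE value is an illustrative assumption (the real value comes from ctx->wheel_base), and the bicycle-model relation delta = atan(L * omega / V) stands in for the generated CasADi ackermann_steering function:

```python
import math

FRAME_WIDTH = 78.0
WINDOW_CENTER = FRAME_WIDTH / 2
LINEAR_VELOCITY = 0.7
ANGULAR_VELOCITY = -0.6
WHEEL_BASE = 0.225  # metres; assumption for illustration only

def steer_two_vectors(head0_x, head1_x):
    """Case 2 of the switch: steer toward the midpoint of the two vector heads."""
    return ANGULAR_VELOCITY * (((head0_x + head1_x) / 2.0) - WINDOW_CENTER) / FRAME_WIDTH

def to_actuators(speed, steer):
    """Reduce speed in turns, then convert (V, omega) into a steering angle delta."""
    v = speed * (1 - abs(2 * steer))
    # Bicycle-model approximation of the CasADi call; delta = 0 when nearly stopped
    delta = math.atan2(WHEEL_BASE * steer, v) if abs(v) > 0.01 else 0.0
    return v, delta

# Both heads at the window center -> full speed, zero steering angle
v, delta = to_actuators(LINEAR_VELOCITY, steer_two_vectors(39, 39))
print(v, delta)
```

Note how any nonzero steer value shrinks the forward speed linearly, which is what makes the car slow down in corners.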
After making changes to the velocity.c file or any files in the ~/cognipilot/ws/cerebri/app/b3rb directory, remember to rebuild the application for the changes to take effect.
For simulation:
cd ~/cognipilot/ws/cerebri
west update
west build -b native_sim app/b3rb/ -p -t install
For real robot:
cd ~/cognipilot/ws/cerebri
west update
west build -b mr_canhubk3 app/b3rb -p
west flash
In the nxp_track_vision.py file within the ROS2 package nxp_cup_vision, you'll find the following code snippet:
def statusCallback(self, data):
    # Putting robot in AUTO mode
    if data.mode != 2:
        joystick_msg = sensor_msgs.msg.Joy()
        joystick_msg.header.stamp = ROSClock().now().to_msg()
        joystick_msg.axes = [0.0, 0.0, 0.0, 0.0]
        joystick_msg.buttons = [0, 1, 0, 0, 0, 0, 0, 0]
        # self.JoystickPub.publish(joystick_msg)
    # If robot is in AUTO mode -> arm the robot
    elif data.mode == 2 and data.arming != 2:
        joystick_msg = sensor_msgs.msg.Joy()
        joystick_msg.header.stamp = ROSClock().now().to_msg()
        joystick_msg.axes = [0.0, 0.0, 0.0, 0.0]
        joystick_msg.buttons = [0, 0, 0, 0, 0, 0, 0, 1]
        # self.JoystickPub.publish(joystick_msg)
This code snippet automatically sets the B3RB robot in AUTO mode and arms the robot. By default, the self.JoystickPub.publish(joystick_msg) lines are commented out to allow control through Foxglove Studio during development. Uncomment these lines if you want the robot to automatically enter AUTO mode and arm itself.
If you need the robot to delay its start, add a sleep command:
sleep(15.0)
This makes the robot wait for 15 seconds before starting, which can be particularly useful for the NXP-CUP competition.
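The mode/arming decision in statusCallback can be summarized as a small pure function. The button indices below simply mirror the snippet above (index 1 requests AUTO, index 7 requests arming); treating them as fixed joystick mappings is an assumption for illustration:

```python
def auto_arm_buttons(mode, arming):
    """Return the Joy buttons array statusCallback would publish, or None.

    mode == 2 means AUTO is already active; arming == 2 means already armed.
    Index 1 = AUTO button, index 7 = arm button (assumed mapping).
    """
    if mode != 2:
        return [0, 1, 0, 0, 0, 0, 0, 0]  # press the AUTO button
    if arming != 2:
        return [0, 0, 0, 0, 0, 0, 0, 1]  # press the arm button
    return None  # already in AUTO and armed; nothing to publish

print(auto_arm_buttons(0, 0))  # -> [0, 1, 0, 0, 0, 0, 0, 0]
```

Once mode is 2 but arming is not, the second branch fires, matching the elif in the node's callback.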
Cognipilot simulation installation specific instructions for nxp-cup
To follow this tutorial you first need to follow and install everything explained in: MR-B3RB Software developer guide
If your installation was successful, you should be able to launch the Gazebo simulation by running the following commands on your development computer:
Please ensure that these commands function correctly and that your setup resembles what is shown in this example video
ros2 launch b3rb_gz_bringup sil.launch.py
ros2 launch electrode electrode.launch.py sim:=true
Launching the Gazebo Harmonic simulation will enable you to control the robot via the Foxglove interface. As explained in this section: b3rb_simulation
Next, we will install all the necessary packages for developing the algorithm for the NXP CUP 2024, applicable to both the MR-B3RB simulation and the actual robot.
To simplify repository updates, we've created a shell script. This script automates the process of updating all repositories to the latest versions hosted on the NXPHoverGames GitHub, specifically for the NXP Cup.
The changes to the repositories are:
Synapse Protobuf and Synapse Tinyframe: Define the Pixy Vector message in the Protobuf language and provide essential definitions and functions for the TinyFrame library to operate correctly. For more information, refer to the Synapse documentation.
Synapse msgs and Synapse ros: Extend ROS2 with the Pixy Vector message definition and support Synapse, which facilitates communication between the NavQPlus (ROS2) and the MR-CANHUBK344 (Zephyr).
Dream world: Adds a simulated race track environment to the workspace.
B3RB simulator: Includes the launch file for the node that processes the race track images.
Electrode: Integrates support for the debug image and pixy vector in Foxglove, facilitating debugging and visualization.
Cerebri: Implements a subscribing mechanism to receive the Pixy Vector message and controls for the line follower algorithm.
NXP CUP Vision: Adds a ROS2 package for processing race track images, transforming perspectives, detecting track edges, and publishing both the debug image and the Pixy Vector, pivotal for visual processing.
This file ensures that all the repositories are updated. Save this script in ~/cognipilot:
#!/bin/bash
declare -A repos=(
["cranium/src/synapse_protobuf"]="https://github.com/NXPHoverGames/synapse_protobuf"
["cranium/src/synapse_tinyframe"]="https://github.com/NXPHoverGames/synapse_tinyframe"
["cranium/src/synapse_msgs"]="https://github.com/NXPHoverGames/synapse_msgs"
["cranium/src/synapse_ros"]="https://github.com/NXPHoverGames/synapse_ros"
["cranium/src/dream_world"]="https://github.com/NXPHoverGames/dream_world"
["cranium/src/b3rb_simulator"]="https://github.com/NXPHoverGames/b3rb_simulator"
["electrode/src/electrode"]="https://github.com/NXPHoverGames/electrode"
["ws/cerebri"]="https://github.com/NXPHoverGames/cerebri"
["cranium/src/nxp_cup_vision"]="https://github.com/NXPHoverGames/nxp_cup_vision"
)
# Update existing repositories or clone if they don't exist
for repo in "${!repos[@]}"; do
echo "Processing $repo..."
repo_path="${repo}" # Derive the full path
# Check if the repository directory exists. If it does not, clone it.
if [ ! -d "${repo_path}" ]; then
echo "Repository ${repo_path} does not exist. Cloning..."
git clone --branch nxp-cup "${repos[$repo]}" "${repo_path}"
echo "Cloned ${repo} into ${repo_path}."
fi
# Navigate to the repository directory
cd "${repo_path}" || { echo "Failed to change directory to ${repo_path}. Does it exist?"; continue; }
# Set the new remote URL
git remote set-url origin "${repos[$repo]}"
echo "Remote changed to ${repos[$repo]}"
# Fetch changes from the new remote
git fetch origin
echo "Fetched changes from origin."
# Checkout the specific branch
git checkout nxp-cup
if [ $? -eq 0 ]; then
echo "Checked out nxp-cup branch."
# Pull the latest changes from the branch
git pull origin nxp-cup
echo "Pulled latest changes from nxp-cup branch."
else
echo "Failed to checkout nxp-cup branch. Does it exist?"
fi
# Return to the original directory
cd - > /dev/null
done
Save this script as update_repos_native.sh in the ~/cognipilot directory, using any text editor of your choice, such as Vim, Gedit, Nano, or VSCode.
Then, open the terminal and execute the following commands:
cd ~/cognipilot
chmod +x update_repos_native.sh
./update_repos_native.sh
This should have updated all your repositories. Please review the output of this script; if you have made local changes to the repositories, you may need to stash them.
To build the software, navigate to the cranium directory within ~/cognipilot:
cd ~/cognipilot/cranium/
colcon build --symlink-install
cd ~/cognipilot/cranium/
source install/setup.bash
Repeat the build process for the electrode
directory:
cd ~/cognipilot/electrode/
colcon build --symlink-install
cd ~/cognipilot/electrode/
source install/setup.bash
Update the West workspace for cerebri:
cd ~/cognipilot/ws/cerebri
west update
west build -b native_sim app/b3rb/ -p -t install
source ~/.bashrc
Now, you are ready to run the simulation:
To launch the Gazebo simulation featuring the MR-B3RB robot on a simple raceway, execute:
ros2 launch b3rb_gz_bringup sil.launch.py world:=nxp_raceway_octagon
To start the Foxglove viewer for the simulation, run:
ros2 launch electrode electrode.launch.py sim:=true
These commands will initiate the Gazebo simulation with the MR-B3RB robot and the raceway, alongside launching the Foxglove viewer.
To initiate the robot's movement, press the AUTO button and the arm button on the joystick. This action will activate the robot's line-following mode and set it to the armed state, preparing it for operation.
Once the AUTO and arm buttons on the joystick are pressed, the robot will immediately begin to navigate around the racetrack. To understand the underlying code and setup, and to explore ways to obtain a better performance for the NXP CUP 2024, proceed to the next tutorial.
1
901-10008
HW ACCESSORY,
1
2
901-77273
HW ACCESSORY,
1
3
700-31354
PWA,
1
4
600-77370
1
5
901-77364
1
6
280-77408
10
7
901-77659
1
8
600-77612
10
9
700-47472
1
10
600-77546
1
11
901-10005
HW ACCESSORY, DIRECT TIME OF FLIGHT LIDAR
1
12
280-10004
FASTENER, SCREW M2.5x10MM BLACK HEX SOCKET HEAD CAP
3
13
280-10006
NUT, M2.5 x 0.45MM NYLON INSERTED HEX SELF LOCK NUT, BLACK ZINC PLATED, DIN985
3
14
280-10007
3
15
280-10008
3
16
280-10009
16
17
280-10012
8
18
280-10013
4
19
280-10014
4
20
805-77575
1
21
805-10002
2
22
805-10001
1
23
805-10003
1
24
901-77605
1
25
901-77606
1
26
600-10006
2
27
600-77359
1
28
901-10012
HW ACCESSORY,
1
29
600-10009
1
30
590-10002
FOAM, BLOCK 400MM WIDTH 600MM LENGTH X 200MM THICKNESS
1
31
280-77781
FASTENER STANDOFFS 6x35MM & (TOP/BOTTOM) 3MMx5 SOCKET HEAD
1
1
1
2
1
3
1
4
1
NOTE <TODO> 1) The image shows the incorrect cable for item #26. This is a servo extension cable, which is MALE-FEMALE, and it is already attached to the B3RB. There is only one of these male-female cables. Note that the naming of 600-10006 "CABLE, SERVO EXTENSION LEAD WIRING CABLE MALE TO MALE, 300 MM" is in fact correct. These cables are used to jumper between the pin headers on the PDB and the MR-CANHUBK344.
If you're integrating the Cerebri software with the MR-CANHUBK344 and you are not using a GPS, but rather the PX4ARMINGBRD you'll need to make a specific configuration adjustment to ensure proper functionality. This guide walks you through the necessary steps to modify the configuration for compatibility with your hardware setup.
Please ensure you have the following PX4ARMINGBRD:
When building the Cognipilot software for real robot operation on the MR-CANHUBK344, if you lack GPS but possess the PX4ARMINGBRD as depicted above, you must perform the following update:
First, you must update a configuration line in the prj.conf file to reflect the absence of GPS and magnetometer measurement capabilities in your setup. Follow these instructions:
Navigate to the configuration file located at ~/cognipilot/ws/cerebri/app/b3rb/prj.conf.
Locate the lines CONFIG_CEREBRI_SENSE_MAG=y and CONFIG_CEREBRI_SENSE_UBX_GNSS=y.
Change them to CONFIG_CEREBRI_SENSE_MAG=n and CONFIG_CEREBRI_SENSE_UBX_GNSS=n respectively.
This adjustment disables the magnetometer and GPS features, accommodating the hardware limitation.
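After the edit, the relevant fragment of prj.conf should read as sketched below (assuming the rest of the file is unchanged; the line locations within the file may differ):

```
# ~/cognipilot/ws/cerebri/app/b3rb/prj.conf (fragment)
CONFIG_CEREBRI_SENSE_MAG=n
CONFIG_CEREBRI_SENSE_UBX_GNSS=n
```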
After updating the configuration, proceed with the following steps to apply the changes and build the software:
cd ~/cognipilot/ws/cerebri
west update
cd ~/cognipilot/ws/cerebri/app/b3rb
west build -b mr_canhubk3 -p
west flash
By following these steps, you configure the Cognipilot software to operate correctly with your MR-CANHUBK344 hardware without GPS.
Get the PX4ARMINGBRD and the 10-Pin JST-GH to 10-Pin JST-GH cable that comes with it:
And connect it to the following pin on the MR-CANHUBK344:
Since Cerebri sits on top of Zephyr, you can use the normal Zephyr west commands to debug. Specifically, with the JLinkEDU attached you can run west debug from the development machine's ~/cognipilot/ws/cerebri. That will rebuild and flash the latest Zephyr binary if there have been changes and attach you to a GDB session (make sure to have recently built/flashed Cerebri so it has the context of the last board and app for the build combination).
If you are looking for just general debug details, you can use the Zephyr logging backend through a UART shell, or add printf statements that go directly to the shell. Be careful how much you dump to the shell, though; it is not an infinite resource, and the logging backend helps protect you from spamming the shell too much. Note that the logging backend also lets you set the logging level and enable/disable the log at runtime from the shell. On the included DCD-LZ adapter, use the UART FTDI cable and connect to it over screen: screen /dev/ttyUSB_ 115200
(fill in the USB number it comes up as; if you have multiple, it should tab complete)
UPDATING: The final version of this kit will be updated so that the camera mount (item 2.B) is instead open on the bottom, allowing the camera cable and camera to be more easily separated from the mount.
1
900-77143
HW ASSY, 8MPNAVQ-8GB-G KIT
1
2
900-77144
HW ASSY, INNOCAM KIT WITH ENCLOSURE AND SCREWS
1
3
700-87692
PWA, 8MPNAVQ-USBSER
1
4
334-77918
Micro SDCS2 32GB, Class 10 UHS-I, Canvas Select Plus
1
5
600-77602
HW ACCESSORY, 6 PIN TO 6 PIN POWER INPUT CABLE
1
6
600-77604
HW ACCESSORY, POWER INPUT CABLE, SYR 2 PINS TO GHR 5 PINS
1
7
600-77625
HW ACCESSORY, XT60 Male to XT60 Female with SYP in parallel
1
8
600-77521
DATA CABLE, 2 PIN JST-GH TO 2 PIN JST-GH 254 MM
1
9
600-77456
CABLE, 4 PIN JST-GH TO 4 PIN JST-GH 300 MM
2
10
600-77370
CABLE, USB-TTL, FTDI TTL-232R-3V3, 6 PIN CONNECTOR, +3.3V, 0.70 M
1
11
650-77719
HW CABLE GH 5 PIN TO GH 5 PIN, 100 MM
1
12
600-77212
USB CABLE, USB 3.0 A/M TO TYPE C CABLE, 1M
1
13
600-77658
CABLE, IX INDUSTRIAL IP20 CABLE, IX TO RJ45
1
14
928-79059
QSG, Quick Start Guide For NAVQPLUSPC
1
15
926-79301
FLYER, INNOWAVE DESIGN MARKETING FLYER
1
16
926-79302
EMCRAFT QR CODE STICKER
3
17
901-77834
HW ACCESSORY, ZIPLOCK BAG CONTAINING MOUNTING ACCESSORIES FOR SMPNAVQ-8GB-G AND MR-NOCAM-OV5646
1
1
700-89377
PWA, MR-CANHUBK344
1
2
700-47294
PWA, DRONE-CAN-TERM (CAN Bus Termination Resistor)
1
3
336-77458
HW ACCESSORY, 0.91 inch OLED Display Module IIC SSD1306 128x32 OLED
1
4
600-77466
CABLE, 4 PIN JST-GH TO 4 PIN JST-GH 300 MM
1
5
600-77467
CABLE, ASSY, 4 PIN JST-GH TO 4 PIN JST-GH 50 MM
1
6
600-77521
DATA CABLE, 2 PIN JST-GH TO 2 PIN JST-GH 254 MM
1
7
600-77602
HW ACCESSORY, 5 PIN TO 5 PIN POWER INPUT CABLE
1
8
600-77603
HW ACCESSORY, POWER INPUT CABLE SYP TO DC5.5, 2.1 JACK
1
9
600-77604
HW ACCESSORY, POWER INPUT CABLE, SYR 2 PINS TO GHR 5 PINS
1
10
600-77625
HW ACCESSORY, XT60 Male to XT60 Female with SYP in parallel
1
11
926-79071
POSTCARD, MR-CANHUBK344, 6" X 4"
1
12
CABLE, 6 PIN JST-GH TO 6 PIN JST-GH 150 MM
1
Because this is the pre-production version of the MR-B3RB, some cables and screws are not as clearly separated in the sub-kits as they could be. For clarity we will mention them and their purpose here.
The wires represented by rows with serial numbers 6 and 7 in the following table are not included in the current (Pre-Production) unit. Refer to the Errata on the Pre-Production unit. In their place, incompatible cables are included; their pictures are shown in the diagrams on this page for reference purposes.
1
Cable, Lidar to NavQPlus 6-Pin JST-GH to 4-Pin JST-PH
1
2
Cable, XT60 PANEL MOUNT ADAPTER to XT60 Connector with Cover
1
3
XT60 Connector with Cover
1
4
Cable, Twisted, Black, 5-PIN JST-GH to 5-PIN JST-GH (With only 4 wires connected and middle wire unconnected)
1
5
Cable, 3 Pin Female Dupont Jumper to 6-PIN JST-GH
1
6
Cable, Back APA102 Strip (IN Port) to CANHUBK3, 7-PIN JST-GH to Stemma QT/Qwiic JST SH 4-Pin
1
7
Cable, Back APA102 Strip (OUT Port) to Front APA102 Strip (IN Port), Stemma QT/Qwiic JST SH 4-Pin to Stemma QT/Qwiic JST SH 4-Pin
1
The MR-B3RB-PDB (Power Distribution board) sits on the lower chassis and provides the following functions:
3S power input from LiPO battery or power supply with XT60 type power connector.
Regulated 5V supply for boards and servos
JST-GH Power cable connections for the NavQPlus and MR-CANHUBK344
Filtering on the power supply connectors
Optional 5V power source and termination for a CAN bus.
3-pin Dupont style PWM connector to MR-CANHUBK344 and ESC cable connector. The PWM signals connect through to the two signals (Enable and PWM) on the ESC (motor controller) integrated into the BLDC motor on the B3RB Buggy Chassis.
3-Pin Dupont style PWM connectors also provide 5V "up" to the "servo rail/PWM rail" on the MR-CANHUBK344 in order to provide power to the Steering Servo or any other RC-PWM connected devices.
2 pins for use in providing 5V to the RGB LEDs (TBD; this is optional and power is presently supplied via a 4-pin JST-GH on the MR-CANHUBK344)
Quadrature Encoder in (custom) and out (JST-GH) connectors. In from B3RB motor, out via JST-GH to MR-CANHUBK344.
During manufacturing test, there may be a jumper on the PDB connecting + and S of the 3-pin servo header, as shown below. It is very important to remove this jumper; it will NOT be used.
UPDATE - this BOM is not accurate. In Production it should consist only of these four sub-kits: > MR-B3RB-S + MR-B3RB-MUK + GPS + MR-LIDAR-STL27
These pages are for REFERENCE; skip to the for actual assembly.
The MR-B3RB is a "kit of kits", therefore, components are grouped to sub-pages relating to their sub-kits.
These sub-pages will describe the individual sub-kit. Each page contains a components table which enumerates all the parts that are included in the kit.
The diagrams on the pages contain the pictures of the parts.
The parts are labelled with their corresponding serial numbers, either directly in the diagram or in the heading of the diagram. To understand the possible configurations please refer to the.
UPDATING: The final version will have the excess side rail material removed from the board. Meanwhile, if your board looks as below, please snap off all four side rails as shown below.
1
Metal Parts
1.A
Lidar "Arch"
1
1.B
Decorative Top Cover
1
1.C
Side Skirts
2
1
2
Plastic Parts
2.A
Side Plastic Panels
4
2.B
Front Plastic Shield with Mount for Camera
1
2.C
Back Plastic Shield
1
2.D
Covers for LEDs (Transparent)
2
2.E
Front Plastic Panel for LEDs
1
2.F
Back Plastic Panel for LEDs
1
1
3
Thumbscrew M3 X 0.5 X 15MM
12
4
FASTENER, SCREW M3 X 0.5 X 10MM LONG, PHILLIPS, FLAT HEAD TAPPING, 18-8 SS
10
5
NUT, M3 X 0.5 X 4MM HEX SELF LOCK NUT, 18-8 SS
10
6
FASTENER, SCREW M3 X 0.5 X 10MM LONG, HEX DRIVE, SOCKET HEAD, 18-8 SS
9
1
91352
1
2
91354
1
3
91353
X-MR-B3RB-BMF and includes X-MR-LIDAR-STL-27
1
4
901-10014
HW ACCESSORY, M10 GPS MODULE STANDARD
1
5
926-10002
MR-B3RB Pre-Production Prototype PAPER
1
6
901-10015
1
7
901-77659
1
| Item | Part Number | Description | Qty |
| --- | --- | --- | --- |
| 1 | | | 1 |
| 2 | | | 1 |
| 3 | | | 1 |
| 4 | | | 1 |
| 1.A | 700-88954 | PWA, NAVQ-PLUS-2A | 1 |
| 1.B | 700-89912 | PWA, 8MPNAVQ-808-G | 1 |
| 1.C | 800-76568 | ENCLOSURE, NAVQPLUS ENCLOSURE KIT WITH HEATSINK | 1 |
| 2.A | 901-77809 | HW ACCESSORY, FIXED FOCUS CAMERA MODULE, CMOS 5.0MP | 1 |
| 2.B | 800-78571 | ENCLOSURE, ASSY, Resin, MR-NQCAM-OV5645 | 1 |
| 17.A | 280-78113 | FASTENER, SCREW, M3-0.5X12, SOCKET HEAD, SS | 4 |
| 17.B | 280-78116 | FASTENER, SCREW, 10-32x3/4", SOCKET HEAD, SS | 1 |
| 17.C | 269-78117 | FASTENER, NUT, 10-32, HEX, SS | 1 |
| 17.D | 801-77546 | BRACKET, MOUNT, White Resin, MR-NQCAM-OV5645 | 1 |
| 17.E | 260-77408 | PRE-CUT DOUBLE-SIDED FOAM SQUARES, 1/16" thick, 1/2" x 1/2" | 4 |
The PDB board is being revised from the pre-production unit to include power-measurement capability, so the final production version will look different from what is shown here. An image of the production version is included at the end of this page.
Below is the production version of the PDB.
- The PDB includes a digital voltage and current monitoring circuit (PX4-standard INA226AQDGSRQ1).
- Different method of connecting PWM signals that avoids mixing up the channels: the servo connects to the PDB, and servo and motor control PWM are sent to the real-time controller through a single JST-XH connector (XHDP).
- LED lighting connection improved. Instead of a split power+signal cable, it now uses a STEMMA QT-type cable to the PDB and a separate SPI cable to connect to the real-time controller. Power is injected from the PDB.
- A third 5-pin JST power connector is included.
- QDEC pinout corrected to be 1:1 (an error on the pre-production version meant that a reversed cable was needed).
- Fuse on board (5A populated; a 3A fuse could be used depending on the application).
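The INA226 exposes its measurements as 16-bit big-endian registers over I2C, so the battery rail can be polled from the companion computer. The sketch below is illustrative only and not from the B3RB documentation: the I2C bus number (1), device address (0x40), and shunt resistor value (2 mOhm) are assumptions; check the production PDB schematic for the real values before relying on them.

```python
# Hedged sketch: polling an INA226 voltage/current monitor over I2C.
# Register addresses and LSB weights are from the TI INA226 datasheet.

REG_SHUNT_VOLTAGE = 0x01  # LSB = 2.5 uV, signed two's complement
REG_BUS_VOLTAGE = 0x02    # LSB = 1.25 mV, unsigned

def bus_voltage_v(raw: int) -> float:
    """Convert a raw bus-voltage register value to volts."""
    return raw * 1.25e-3

def shunt_current_a(raw: int, r_shunt_ohms: float) -> float:
    """Convert a raw shunt-voltage register value to amps via Ohm's law."""
    if raw & 0x8000:              # sign-extend 16-bit two's complement
        raw -= 1 << 16
    return (raw * 2.5e-6) / r_shunt_ohms

def read_reg(bus, addr: int, reg: int) -> int:
    """Read one 16-bit big-endian INA226 register."""
    hi, lo = bus.read_i2c_block_data(addr, reg, 2)
    return (hi << 8) | lo

if __name__ == "__main__":
    from smbus2 import SMBus  # pip install smbus2
    with SMBus(1) as bus:     # assumed I2C bus number
        v = bus_voltage_v(read_reg(bus, 0x40, REG_BUS_VOLTAGE))
        i = shunt_current_a(read_reg(bus, 0x40, REG_SHUNT_VOLTAGE), 0.002)
        print(f"battery rail: {v:.2f} V, {i:.2f} A")
```

The conversion helpers are kept separate from the I2C access so they can be checked without hardware; the INA226's calibrated current/power registers (0x04/0x03) could be used instead once the calibration register is programmed for the actual shunt.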
B3RB Base Mechanical Frame Assembly.
This assembly video shows how some of the mechanical parts go together. Some of these steps are performed by the user during assembly; the video is included here for archival completeness.
Note that the MR-B3RB-BMF (Base Mechanical Frame) includes some M3 screws in the side plates that will be replaced with thumbscrews in the full kit.
Link: https://a360.co/4bVC8pm
B3RB is a kit of kits
MR-B3RB is a kit of kits. That means it is built up as a series of other part numbers that are packaged together to define a few different sub-kits and final kits. This allows a mix-and-match approach internally to provide a few different final variants. Only the final part numbers are available for sale on NXP.com, but you may find references to the sub-kits and internal kits in the documentation. In case this becomes confusing, we thought it best to explain the internal and external names here.
Most Mobile Robotics part numbers/names start with "MR-".
The following part numbers/names are internal and not normally for sale separately.
MR-B3RB-BMF, "BMF"
Base Mechanical Frame
RC model, motor/ESC/encoder, metal upper frame, RGB LEDs, partial cables/wires, MR-B3RB-PDB
MR-CANHUB-ADAP
Adapter board for MR-CANHUBK344
Adds IMU, regulators, reset button, connectors for GPS, I2C (StemmaQT), RC input, SDCARD expander
MR-B3RB-PDB, "PDB"
Power Distribution Board
Power in from the XT60 battery connector; regulated power out to the boards; connectors bridging motor PWM and encoder through to the MR-CANHUBK344; 5V power for the PWM servo rail
MR-B3RB-MUK, "-MUK"
Mechanical Upgrade Kit
Cosmetic upgrades: plastic front, rear, and side plates, covers for the LEDs, a mount for the camera, the Lidar "arch", a decorative top cover, skirts, and thumbscrews
MR-LIDAR-STL-27, "Lidar"
Lidar
Model STL-27 Lidar and cable
The following part numbers/names are the assembled kits of kits and include the components listed below. Other kit combinations may be available in the future under different names; check https://www.nxp.com/mr-b3rb or https://www.nxp.com/mr-buggy3 for orderable part numbers.
MR-B3RB-S
Small (Standard) Buggy3 RevB
MR-B3RB-BMF + NavQPlus incl. Camera, MR-CANHUBK344, MR-CANHUB-ADAP
MR-B3RB-M
Medium (Main) Buggy3 RevB
MR-B3RB-S + MR-B3RB-MUK + GPS + LIDAR
MR-B3RB-(-Other, -MAX)
Other configurations mixing these components will follow a similar naming scheme.
Please ask about specific needs. We are considering additional kits for specific competitions, as well as a -MAX kit with everything and anything that could connect (BMS, RADAR, UWB, long-range radios, etc.).