NavQ Linux companion computer platform with Vision for Mobile Robotics based on NXP i.MX 8M Mini SOC. Found at: https://nxp.gitbook.io/8mmnavq/
NavQ was used in the HoverGames2 program, and this document is its supporting information. However, it was not commercialized. The NavQPlus (NavQ+) is the upgraded replacement for NavQ, available on NXP.com and through global distributors.
Note that some software and information here is still completely relevant to NavQPlus. NavQ may also be referred to as: 8MMNavQ, MR-8MMNavQ, or RDDRONE-8MMNavQ.
Also take a look at some of our other Gitbooks:
8M Plus NavQ - newer supported version, available on NXP.com
HoverGames challenges
NXP Cup - car/buggy racing series
The 8MMNavQ is a small purpose built experimental Linux computer based on the NXP i.MX 8M Mini SOC. It is focused on the common needs of Mobile Robotics systems.
The system is built as a stack of boards: the top board is a SOM (system on module) containing the processor, memory, and other components with strict layout requirements, while the secondary boards are relatively inexpensive (often 4-layer boards), allowing customized versions to be built easily.
This is a new set of boards and software enablement and will undergo several iterations. Our intent is to provide a "friendly Linux" with typical packages and additional tools included rather than the typical highly optimized and stripped down Linux found in deeply embedded products.
Please check for Linux updates regularly. Feedback and needs will be incorporated and updated as much as possible and reasonable.
There is a discussion forum here for questions specifically about NavQ
And a general HoverGames forum here.
The 8MMNavQ features:
NXP i.MX 8M Mini SOM with LPDDR4 DRAM and eMMC Flash.
A secondary board with SDCARD, Networking, MIPI-CSI (Camera) and MIPI-DSI (Display) interfaces
MIPI-DSI to HDMI converter
A Google Coral camera module
A third HGI (HoverGames Interposer board) with common interfaces and specific drone and rover interfaces which follow PX4 standards.
The NavQ is suitable for many purposes, including generic robots and various vision systems.
Drones, QuadCopters, Unmanned Aircraft, VTOL
Rovers
Road going Delivery Vehicles
Robotic Lawnmowers
Robotic Vacuum
Flying vehicles (PX4)
DIYRobotCars
Marine vessels
Camera and Vision processing modules
Time of Flight (TOF) Cameras
AI/ML inference
Cellular gateway
Vision systems in other applications
e.g. a hospital bed monitor that detects if a patient is sitting up or at risk of falling out of bed.
Two specific complete developer tool examples are the NXP HoverGames Drone, and the NXP-CUP car.
The intent of the 8MMNavQ in HoverGames is to enable participants with a solution that allows them to harness common robotics packages and libraries such as:
ROS
OpenCV
GStreamer
Tensorflow
pyeIQ
And more!
The 8MMNavQ runs Linux with a package manager, so you should be able to install the packages you need to complete your projects successfully and efficiently.
This work is licensed under a Creative Commons Attribution 4.0 International License.
NavQ is a companion computer reference design for HoverGames and commercial development of drones and rovers.
While the 8MMNavQ is a standalone computer, it has been designed with the NXP HoverGames coding competition in mind, and specifically the NXP KIT-HGDRONEK66 using the RDDRONE-FMUK66 flight controller.
HoverGames specific features include
NavQ can connect to HoverGames (RDDRONE-FMUK66)
via serial
via Ethernet (using 100BaseT1 2 wire ethernet adapter)
via USB (requires specific configuration)
RGB LED onboard for status reporting
USB-C console for debugging
Power input via USB-C or JST-GH power header
MicroUSB port for peripherals (hub, usb cameras, sensors)
IX industrial Ethernet jack
Serial ports using JST-GH connectors
I2C/SPI port using JST-GH connectors
3 wire LED strip connector with power supply
Wifi and Bluetooth
MIPI Camera interface (Google Coral Camera default)
eMMC and removable SDCard memories
MIPI DSI for display (particularly for Rover applications)
Here are a few videos of some of the capabilities of NavQ:
Rolling updates on the status of hardware and software
5/10/2020 - Production Hardware in manufacturing
The changelog for the October Demo Image update is as follows:
Added support for SLCAN devices
Added SPI support through the userspace SPI driver "spidev"
Fixed issues with SDMA firmware loading
Fixed an issue with Bluetooth sometimes not working after a reboot
As of July 17 2020:
The Ubuntu 20.04 image is fully working with the package manager, ROS, UART, WiFi, etc.
As of July 8 2020:
An Ubuntu 19.10 image was built and tested with the Google Coral camera. A picture was successfully taken using this image. A working image with the following features should be finished Soon™:
Gstreamer + OpenCV w/ working camera
UART communication to NXP RDDRONE-FMUK66 for Offboard control using MAVROS
WiFi connection for communication with NavQ
Would enable streaming video back to base station for processing as well as creating in-house user interfaces for controlling the HoverGames drone
As of July 7 2020:
NXP Yocto Debian image
3rd party Ubuntu-like image
Works, can install ROS/OpenCV, but most hardware doesn't work (camera, hdmi, etc)
3rd party Debian image
EmCraft Linux image
Works, has desktop, has OpenCV/Gstreamer, no ROS
The SOM includes the processor, RAM, eMMC flash, and WiFi chip for the NavQ.
The NavQ SOM (System on Module) contains the brains of the NavQ. On this board, we have our i.MX 8M Mini processor, 2GB of LPDDR4 memory, 16GB of eMMC flash storage, and a QCA9377 WiFi AC + BT 5.0 chip. There are connectors on the bottom of this board that allow for modularity.
| Name | Details |
| --- | --- |
| NXP i.MX 8M Mini Processor | Quad ARM Cortex-A53, Cortex-M4 @ 1.8GHz |
| LPDDR4 Memory | 2GB |
| eMMC Flash | 16GB |
| Qualcomm WiFi/BT | 802.11ac + BT 5.0 |
The NavQ's dimensions are 3" W x 2" L x 7/16" H.
The Media Board consists of an SD Card slot, MIPI connectors for a camera serial interface as well as a display serial interface, and a PCIe connector.
| Name | Details |
| --- | --- |
| SD Card slot | MicroSD card compatible |
| MIPI CSI | Google Coral Camera connection |
| MIPI DSI | MIPI to HDMI adapter for full desktop |
Image of Google Coral Camera + MIPI to HDMI Adapter:
The Google Coral Camera dimensions are 25mm x 25mm, as shown in the tech specs here:
https://coral.ai/products/camera/#description
The HoverGames Interposer Board (HGI) is the final board in the stack and has a multitude of I/O for your needs in HoverGames. Connect sensors, switches, and LEDs to the NavQ using the HGI to drastically improve your drone system, and even control your drone using NavQ using offboard control with MAVROS.
| Name | Details |
| --- | --- |
| UART | 3 UART ports for serial communication through JST-GH connectors. |
| USB-C | Powers the board and serves as an interface for flashing new firmware. |
| MicroUSB | Connect USB devices to this port, such as keyboards and mice. USB hub included. |
| SPI | JST-GH connector for SPI interface. |
| Hirose IX Industrial Ethernet | IX Industrial Ethernet cable is included. |
| 2 Wire Automotive Ethernet | JST-GH connector for 2-wire Ethernet as well as GPIO. |
| JTAG | Pads are available for JTAG. You may solder your own JTAG connector. |
| Boot Mode Switches | You can use these switches to boot from eMMC or SD card, or boot into fastboot. |
| GPIO | Through-hole points for GPIO headers. |
| RGB LED | WS2811 RGB LED available for status. |
The HoverGames Interposer Board has a large amount of connectors and I/O for you to connect your devices, sensors, switches, LEDs, and more. This section will give you an overview of the connector pinouts on the HGI. The picture below has a few silkscreen labels for pinouts on each connector, but some connectors have multiplexers to make them more flexible.
UART2 is used for the serial monitor; it should not be used for anything else or reconfigured.
UART3 will mainly be used for serial communication to the FMU in HoverGames, but it can be used as an SPI port if you're not using it for the drone.
The bottom 9 pin JST-GH connector in the image of the HGI is used for UART4/I2C/GPIO.
The UART4 pins do not have flow control on this connector. There is no multiplexing on this connector either. The pinout is below.
| GPIO Pin (JST-GH Pin) | Linux GPIO ID |
| --- | --- |
| GPIO1_IO10 (6) | 10 |
| GPIO1_IO12 (7) | 12 |
| GPIO1_IO14 (8) | 14 |
The SPI/GPIO port has a full pinout for SPI as well as 3 GPIO pins. The SPI pins can be muxed to a full UART 4 port with flow control. The pinout is below.
| GPIO Pin (JST-GH Pin) | Linux GPIO ID |
| --- | --- |
| GPIO1_IO11 (6) | 11 |
| GPIO1_IO13 (7) | 13 |
| GPIO1_IO15 (8) | 15 |
The GPIO header pads on the HGI are not labeled correctly with the silkscreen. The layout is shown below with TP labels and schematics.
If you would like to take a look at the schematics of each board in the NavQ, you can download the .zip file at the bottom of this page. The table below gives detail for each folder inside the .zip file.
| Folder Name | Description |
| --- | --- |
| RDDRONE-8MLPDDR4 | Schematics for the top SoM board, with the CPU, RAM, eMMC, and WiFi/BT chips |
| RDDRONE-HGI8MM | HoverGames Interposer Board schematics with GPIO, UART, I2C, SPI, Hirose IX Industrial Ethernet, and more |
| RDDRONE-MEDIA8MM | Media Board schematics with MIPI CSI/DSI, PCIe, and SD Card |
| RDDRONE-MIPIHDMI | Schematic for the MIPI HDMI board |
| RDDRONE-T1ADAPT | Schematic for the 100BaseT1 2-Wire Automotive Ethernet Adapter board |
NavQ as provided is an experimental board
NXP Friendly Linux is experimental and in development
This reference design is intended for ENGINEERING DEVELOPMENT OR EVALUATION PURPOSES ONLY. It is provided as a sample IC pre-soldered to a printed circuit board to make it easier to access inputs, outputs, and supply terminals. This reference design may be used with any development system or other source of I/O signals by simply connecting it to the host MCU or computer board via off-the-shelf cables. Final device performance in an application will be heavily dependent on proper printed circuit board layout and heat sinking design as well as attention to supply filtering, transient suppression, and I/O signal quality.
The goods provided may not be complete in terms of required design, marketing, and or manufacturing related protective considerations, including product safety measures typically found in the end product incorporating the goods. Due to the open construction of the product, it is the user's responsibility to take any and all appropriate precautions with regard to electrostatic discharge. In order to minimize risks associated with the customers applications, adequate design and operating safeguards must be provided by the customer to minimize inherent or procedural hazards.
For any safety concerns, contact NXP, EMCRAFT AND SOFTWARECANNERY sales and technical support services. Should this reference design not meet the specifications indicated in the kit, it may be returned within 30 days from the date of delivery and will be refunded or replaced by a new kit. NXP, EMCRAFT AND SOFTWARECANNERY reserves the right to make changes without further notice to any products herein.
NXP, EMCRAFT AND SOFTWARECANNERY, makes no warranty, representation or guarantee regarding the suitability of its products for any particular purpose, nor does NXP, EMCRAFT AND SOFTWARECANNERY, assume any liability arising out of the application or use of any product or circuit, and specifically disclaims any and all liability, including without limitation consequential or incidental damages. Typical parameters can and do vary in different applications and actual performance may vary over time. All operating parameters, including Typical, must be validated for each customer application by customer’s technical experts.
NXP, EMCRAFT AND SOFTWARECANNERY, does not convey any license under its patent rights nor the rights of others. NXP, EMCRAFT AND SOFTWARECANNERY, products are not designed, intended, or authorized for use as components in systems intended for surgical implant into the body, or other applications intended to support or sustain life, or for any other application in which the failure of the NXP, EMCRAFT AND SOFTWARECANNERY, product could create a situation where personal injury or death may occur.
Should the Buyer purchase or use NXP, EMCRAFT AND SOFTWARECANNERY, products for any such unintended or unauthorized application, the Buyer shall indemnify and hold NXP, EMCRAFT AND SOFTWARECANNERY, and its officers, employees, subsidiaries, affiliates, and distributors harmless against all claims, costs, damages, and expenses, and reasonable attorney fees arising out of, directly or indirectly, any claim of personal injury or death associated with such unintended or unauthorized use, even if such claim alleges NXP, EMCRAFT AND SOFTWARECANNERY, was negligent regarding the design or manufacture of the part.
| Board | Rev | Item |
| --- | --- | --- |
| RDDRONE-HGI | 1A | PTN5110 FAULT_N should be pulled up and connected to INT of NX20P3483UK |

Action - Mask this interrupt in the register FAULT_STATUS_MASK (0x15h): bit 4 -- Force Discharge Failed Interrupt Status Mask (to be zeroed).
Resolve in software, since both the PTN5110 (TCPC) and the NX20P3483 (power switch) are connected to the I2C bus. Polling the NX20P3483 will address this errata.
This issue will be resolved in the next board revision (RDDRONE-HGI-2A).
| Board | Rev | Item |
| --- | --- | --- |
| RDDRONE-MEDIA | 1A | The design is missing a power switch required by the SD 3.0 standard |

Action - disable SD3.0 support in U-Boot and/or avoid SD3.0 cards.
This issue will be resolved in the next board revision (RDDRONE-MEDIABOARD-2A).
The SD3.0 PWR switch will be added:
Issue - The 3rd mounting hole for the NavQ was missed on the carbon fiber mounting plate. This hole is needed to secure the board and avoid vibrations.
Action - Please drill a 3mm hole manually or use double-sided tape on this side of the board mounting. This must be done to avoid vibration of the board during flight.
These components are designed to interface and work with the NavQ.
Unless otherwise indicated, the following add on modules are not included with the NavQ HoverGames 2 kit (HG2).
In addition, they may not all be available, and may be experimental in nature.
USB-UART serial debug console (included with HG2 Kit)
LTE CAT-M1 cellular modem
PMDTEC Time of Flight (TOF) Camera
Lighthouse tracking module
NXP 100BaseT1 2-wire Automotive Ethernet
Edgelock SE050 Secure Element
5" high resolution MIPI-DSI LCD
MIPI-DSI to HDMI adapter
PCIe M.2 module - Kingston SSD
Non contact gesture tracking module
Kingston eMMC, LPDDR4, Industrial SDCARD
RJ45 breakout.
100BaseT1 2-Wire Automotive Ethernet media converter
Additional RDDRONE-T1ADAPT media converter details can be found here: https://nxp.gitbook.io/rddrone-t1adapt/
NXP's TJA1101 is an Ethernet PHY that provides a two-wire 100BaseT1 Ethernet interface. The Ethernet MAC side of this interface is not unusual, and the traffic on the line is "regular Ethernet".
NXP's Flight controller for Mobile Robotics - RDDRONE-FMUK66 includes a 2-wire Ethernet interface on board. In order to connect this with the 8MMNavQ this media converter can be used. The RDDRONE-T1ADAPT is also useful when connecting to other experimental modules such as V2X or an Automotive 5/10 port switch.
On RDDRONE-T1ADAPT power is supplied via a 3 pin JST-GH connector. There is a matching 3 pin JST-GH connector on the 8MMNavQ. A simple 1:1 cable is used. Optionally a USB-C cable can be used to provide power (only) connection. A 2 pin JST-GH connector is used for connecting Ethernet between this board and another - such as the RDDRONE-FMUK66. A simple 1:1 cable is used.
There are also locations marked on the bottom side of the board for soldering in wires for both power and 2-wire Ethernet
There is a small NXP LPC processor on board to configure the back to back PHYs and manage setup and LEDs. This board comes pre-programmed and there is no user software required. Contact hovergames@nxp.com or your local NXP representative if there is a specific need to access the software.
For development, use on a rover or ground station there is a 5" LCD panel that can attach to the NavQ MIPI DSI output.
Specs: <TODO>
This is a Murata Type1SC LTE CAT M1 Cellular Modem and is a low bandwidth solution for IOT.
This LTE Cat M1 modem is a small form factor module for development that includes the PX4 type connectors suitable for connection with the NavQ.
Cellular IoT solutions are new standards defined by the 3GPP Group to answer requirements such as low power, long range and low data rate usage. Two standards currently exist - LTE Cat.M1 is one of the current standards and is intended for low bandwidth intermittent use, as expected for an IOT-type device.
Cat M1 can deliver secure, world-wide coverage by using the same base stations, public networks and power supplies as mobile phones.
It is important to recognize that Cat M1 is not the same as 4G LTE data bandwidth you have come to expect on your cell phone data plan. This is not suitable for streaming video, instead it would be intended more for intermittent low bandwidth telemetry or sensor data.
Generally speaking LTE Cat M1 modems are designed to support these features:
High Security, Encrypted communication and FOTA
Low consumption capable of 10+ years of battery life
Wide coverage using existing smartphone networks
Low cost - Reduction of R&D and operational cost
GPS-free geolocation solution
Large network capacity w/ reduction of data rate
|  | LTE Cat.M1 (Release 13) |
| --- | --- |
| Specification | Based on LTE |
| Bandwidth | Up to 1.4MHz |
| Peak DL / UL data rate | 300kbps / 375kbps |
| Frequency deployment | LTE in Band |
| Duplex mode | Half or Full duplex |
| Voice/Data support | Voice & Data |
| Mobility | Yes |
| Tx Power | 20, 23dBm |
| Targeted Applications | Critical applications (healthcare, smart factory, security); low latency (emergency devices, smart cities); geolocation (asset tracking, wearables, fleet management) |
How to order your own NavQ
To order a NavQ, you'll currently need to purchase directly from the Emcraft website. If you are a HoverGames 2 participant and have had your application accepted be sure to use your coupon code.
See links below
Guide to get the 8MMNavQ up and running quickly
The NavQ is a device that will allow you to add extra compute to your HoverGames drone system. With an i.MX 8M Mini processor, you will be able to reach new boundaries of vision and sensor data processing.
The current Demo build was built on 7/24/2020. Confirm you are on the correct image by running uname -a. You should get the following output:
The SD Card slot on the NavQ is sandwiched between the Media Board and the HoverGames Interposer Board. There are several important components underneath the SD card slot. We highly recommend that you be very careful when using the SD card slot so the components are not damaged. One notable component is the USB controller - it is quite small, so if it gets damaged, you won't be able to use USB devices over the MicroUSB port. One way to be safe when inserting or removing the SD card is by using some tweezers as seen in the image below.
The SD card included in the NavQ kit is preloaded with our HoverGames-Demo Linux distribution. The default username and password are both navq.
To power your NavQ, there are two options. The first is to use one of the included USB-C cables and connect it to a USB port on your computer.
The other option is to power it through one of the included connectors in your NavQ kit. These connectors plug into the 5-pin POWER port next to the boot switches on your NavQ. You may use the barrel connectors or the XT60 power breakout connector. Some images and more details can be found here:
Power Cables

To access the serial console on your NavQ, attach one of the included USB-C cables to the USB-UART adapter included in your kit. You can use programs such as PuTTY to access the serial console. A full guide to do this is linked below.
Serial root console

When your board arrives, the Demo image will already be loaded to the SD card. This image does not take up the full amount of space on the SD card, so you'll need to expand the space in order to install more packages such as ROS or OpenCV. Follow the guide here to do so:
Expand space on SD card/eMMC

Follow the guide linked below to mount your NavQ to your drone:
Mounting NavQ on HoverGames Drone

Depending on which Linux distribution is loaded, you may find that the NavQ includes a desktop application. This may be a minimal desktop with only a terminal emulator, or it may be more feature-rich, like Liri Desktop.
The signaling is output on the MIPI-DSI port and if a compatible LCD panel is attached, then it would be visible there. Most of us will have access to a standard HDMI monitor, and there is a MIPI-DSI to HDMI adapter included in the kit also.
In order to connect both a mouse and keyboard to the NavQ you will need to connect the included microUSB to USB-A hub. Other USB peripherals may also be supported but need to be tested as it is not guaranteed that all USB drivers will be available.
You can use GStreamer to take 1080p 30fps video. This uses the included H264 encoding plugin for i.MX 8M Mini. Here's an example pipeline you can run on your NavQ to take video:
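The exact pipeline shipped with the image is not reproduced here; the following is a minimal sketch, assuming the Coral camera enumerates as /dev/video0 and the NXP vpuenc_h264 encoder element is present in the image's GStreamer plugins:

```bash
# -e makes gst-launch-1.0 send EOS on Ctrl+C so the MP4 file is finalized correctly
gst-launch-1.0 -e v4l2src device=/dev/video0 ! "video/x-raw,width=1920,height=1080,framerate=30/1" ! vpuenc_h264 ! h264parse ! mp4mux ! filesink location=video.mp4
```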
When you want to end the video, just press Ctrl+C to cancel the pipeline, and the file should be saved to the present directory.
To record video with your NavQ, you can run this simple python script that uses OpenCV to write video to a file:
This is a simple example that you can use as a starting point for even bigger things with OpenCV/computer vision! If you'd like a more sophisticated guide that runs through example code to detect red objects, head to the developer guide on OpenCV to find more.
To perform off-board control of the HoverGames drone from the NavQ, you'll need to get a little bit involved with ROS + MAVLink (MAVROS). To see a guide on how to get started, head over to the developer guide!
A package named connman is included in the image to help you connect to WiFi through the command line. To connect to WiFi, run the following commands:
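A typical interactive connmanctl session looks like the sketch below (the service identifier is an example; pick the wifi_..._managed_psk entry that matches your SSID from the services listing):

```bash
connmanctl
connmanctl> enable wifi
connmanctl> scan wifi
connmanctl> services        # note the wifi_..._managed_psk entry for your SSID
connmanctl> agent on        # lets connman prompt for the passphrase
connmanctl> connect wifi_0123456789ab_myssid_managed_psk
connmanctl> quit
```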
Your NavQ should automatically connect to WiFi when rebooted. If you want to connect to another WiFi network, just go through the same process again.
If you need to transfer files to and from the NavQ over a wired or wireless connection, you can use FileZilla to access the NavQ's FTP server. First, you'll want to connect the NavQ to your local network (WiFi or Ethernet) and run ifconfig to find the IP address that was assigned to your NavQ. Then, use FileZilla to connect to that IP with the username navq and password navq.
A guide on how to use FileZilla is here:
Now that you've gone through the Quick Start Guide, you can move on to the Developer Guide if you'd like to go more in depth. Use the sidebar on this Gitbook to navigate to the next section.
The NavQ can take a voltage of 5V to 20V.
The board may also be powered via USB-C, which also has a 20V max input rating.
Note that the voltage regulator (MP8759GD) on the external JST-GH power connector has an absolute maximum rating of 26V. According to the schematics, there is a 20V TVS diode (PTVS20VS1UR) at that input before the regulator, with a breakdown voltage of 22.2V to 24.5V according to the datasheet.
Any experimentation above 20V input should take this into account and be done at your own risk.
There is some, but limited power input protection on board. Given this is typically for USB-C application, it may not be sufficient in harsh operating environments. You may want to provide some additional reverse polarity protection, over-current, or DC-DC conversion/isolation from the battery if you expect to be experimenting outside the HoverGames normal operating range or treating it harshly.
Please monitor your LiPo battery carefully for undervoltage conditions. If left connected, the NavQ and HoverGames drone will completely drain your LiPo battery and could cause permanent damage to your battery. There are no undervoltage disconnect provisions built in.
A simple solution to undervoltage protection may be to add a hobby grade low battery monitor alarm. These sound a loud alarm when any cell goes below a user set threshold voltage. The LED display will also show the individual cell voltage and total pack voltage. This plugs into the balance connector of the LiPo battery. They are inexpensive and available at hobby stores or online at typical outlets. Here are some examples:
Amazon - "Battery Voltage Checker Alarm"
Amazon - "Battery Voltage Monitor Low Voltage"
Options and cables for powering the NavQ
The NavQ ships with several cables. They allow for several configurations when powering the NavQ.
When working on the bench, you may wish to power the NavQ from the USB-C input cable, which is provided. It would also be possible to power the NavQ while on a drone from a separate USB-C battery pack, like those used for cell phone charging.
Please note that some USB-C ports on charging adapters and particularly on unpowered Hubs may not supply enough current to run the NavQ and all the peripherals. If you notice a booting failure, please first try powering from an external battery or known high power USB-C charger or power supply.
The HoverGames images come as a .bz2 compressed archive. To decompress this image, you'll need to use a program like 7zip on Windows, use the bunzip2 command on Linux, or double click the archive on Mac.
NXP has a tool for flashing i.MX hardware called UUU (Universal Update Utility). You can download UUU from here:
If you're on Windows, you'll want to download the uuu.exe file, and if you're on Linux, you'll want to download the uuu file.
You must agree to all of the applicable licenses and agreements at the following link before downloading the Linux software. It is hosted here:
NOTE: This file is only needed for flashing with UUU to the eMMC/SD Card. If you want to flash your SD Card with dd or Win32DiskImager, this file is not needed.
Flip the DIP switches on your NavQ to put it into USB flashing mode (boot from USB in the image below). Here is an image that shows how to do so:
Connect your NavQ to your computer using the included USB-C cable. You should receive a message on your computer that it has been connected. To make sure the NavQ is connected, you can run the UUU program with the -lsusb flag, and you should see an output similar to this:
You can flash both the SD card and the eMMC using this tool. The keyword for flashing the SD card is sd_all, while the keyword for flashing the eMMC is emmc_all. The command to flash your board is outlined below:
After a few moments, your board should be flashed. Unplug your NavQ from power, reset the DIP switches to the desired boot device, and you're good to go!
To flash the image, you'll need to use dd on Linux/Mac or Win32DiskImager on Windows.
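On Linux/Mac, a typical dd invocation looks like the sketch below; the device node and file name are placeholders, so double-check the target with lsblk first, since writing to the wrong device will destroy its contents:

```bash
lsblk                                         # identify your SD card, e.g. /dev/sdX
sudo dd if=navq-image.wic of=/dev/sdX bs=1M status=progress conv=fsync
```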
Download Win32DiskImager:
Open the program and select your SD card. Choose the .wic OR .img file, then click "Write".
We have created an easy-to-use script to resize the flashed image on your device. Since the image is smaller than the storage device, it is not properly expanded when first flashed. You can run the following script to expand the filesystem.
To run the script, run the following commands for the boot device you're currently using:
For eMMC:
For SD card:
The script should be executable when it is included in the image. If it isn't, then you'll need to make it executable by running the following command:
You may run into an issue where you run out of space on the eMMC or SD card when installing ROS. To expand the rootfs partition, follow these steps:
If you're on the eMMC, you'll use /dev/mmcblk2. If you're on an SD card, use /dev/mmcblk1. By default the NavQ boots from the included SD card in your kit.
Once you're done with those steps, run this command:
and reboot. You should now be able to install ROS Melodic without size issues.
If you prefer to just see the commands, these are the commands you need to run in fdisk in order to resize your disk.
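A condensed sketch of that procedure (the rootfs is assumed to be partition 2; substitute the device node as described above), followed by growing the filesystem:

```bash
sudo fdisk /dev/mmcblk1       # use /dev/mmcblk2 on the eMMC
#  p        print the table and note the start sector of partition 2
#  d, 2     delete partition 2 (the data on disk is untouched)
#  n, p, 2  recreate it with the same first sector and the default (maximum) last sector
#  N        answer "No" if asked to remove the ext4 signature
#  w        write the new table
sudo resize2fs /dev/mmcblk1p2
sudo reboot
```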
Connecting to the root console on NavQ using the USB-UART adapter
The root console will allow monitoring of the board from initial boot. Since modern PCs don't tend to include serial ports anymore, a small FTDI USB-UART adapter is provided to convert the serial port to USB.
Plug the 6-pin JST-GH cable from the USB-UART converter into the NavQ connector "UART2/I2C2"
Plug the USB-UART into your PC like you would a dongle or memory stick
Your PC operating system should respond right away when it recognizes the USB-UART. On Windows it will play a "USB connect" sound
There should be a red light illuminated on the USB-UART board (even when it is not plugged into the NavQ)
Follow the terminal configuration below and power on the NavQ
Troubleshooting Tip: If the COM port does not show in Device Manager or you don't hear the "USB connect" sound, double check that the USB-C connector is fully plugged in and seated in the USB-UART adapter board.
Troubleshooting Tip: If the USB-UART is not detected on your PC, in some instances you may need to download the FTDI USB-UART driver software. It can be found here: https://www.ftdichip.com/Drivers/D2XX.htm
A terminal program will be required to communicate over the serial console. The following example is for a Windows 10 PC using the terminal program PuTTY.
Default NavQ terminal settings are: 115200 Baud, N, 8, 1 (no Parity, no flow control)
In Windows 10 use Device manager to determine which COM port is assigned to the FTDI USB-UART serial adapter. The open PuTTY window shown below is the root console of the NavQ.
There are several options for a terminal program that can be used. This example is for PuTTY on a Windows PC. Other programs and other hosts can be used. Example using PuTTY to connect to the serial console via a USB-UART adapter. In this example the COM port was determined above by looking at Windows 10 Device manager.
How to mount the NavQ using the plate and standoffs provided
The suggested method of mounting the NavQ to the Drone frame is shown below. Improvements and suggestions are welcomed. Please message on community.nxp.com here: https://community.nxp.com/community/mobilerobotics/hovergames-drone-challenge
The NavQ can mount to the small carbon fiber plate using screws, standoffs or double sided tape.
We did allow for mounting the NavQ with screws; HOWEVER, one of the three mounting holes did not get drilled. Please drill the third hole by hand if you wish to mount with screws. Alternatively, you can use two screws and some double-sided tape in place of the missing screw. You should use tape or a third hole, because otherwise the board will vibrate too much.
One modification tried was to add a second standoff "hanging" from the front so that the camera module could be attached between these two front standoffs with double-sided tape.
Ensure your NavQ is working correctly with these fixes
Currently, the HoverGames-Demo image is in a beta state. Over time we will be improving the image by fixing bugs and including even more software so that it's more of an out-of-the-box experience. Here are some of the current fixes at the moment to get a working system.
WiFi sometimes does not automatically connect to the last WiFi network after reboot.
Workaround: Open connmanctl, run disable wifi, and reconnect using the instructions in the Quick Start Guide.
Setting the timezone on your NavQ is necessary to ensure the apt package manager works. First, you'll need to locate the correct timezone file at /usr/share/zoneinfo. There should be a folder for your country and a file in that folder for the closest city to you.
For example, if you're in Central Time USA, you'd use the following commands:
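A minimal sketch (America/Chicago is the zoneinfo entry for US Central Time):

```bash
sudo ln -sf /usr/share/zoneinfo/America/Chicago /etc/localtime
date    # verify that the correct local time is now reported
```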
Now, you can run sudo apt update and sudo apt upgrade to get your system up to date.
There are several editors that can be used in Linux. Nano is a lightweight command shell editor that is popular. There are many others to choose from.
Some discussion on the topic: https://yujiri.xyz/software/nano
Assuming you already have internet access on your NavQ, to install nano on the demo image type the following.
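For example:

```bash
sudo apt update
sudo apt install nano
```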
Suggested proposal for the hostname of NavQ
Since several NavQs might be present in a WiFi network, it's essential to set a unique hostname to determine which one is the correct NavQ you want to connect to.
To change the hostname you need to modify /etc/hostname. We suggest the following format:
e.g. if the MAVLink SysID is 10, the NavQ should be named navq-10.
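One way to apply the change (a sketch, using the example value above):

```bash
echo "navq-10" | sudo tee /etc/hostname
# you may also want to update the matching entry in /etc/hosts
sudo reboot
```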
After a reboot the new hostname will be visible on the network.
mavlink-router routes MAVLink data dynamically between several end nodes.
To be able to have several end nodes communicating via MAVLink simultaneously, we need to set up mavlink-router on the NavQ. The end nodes can be:
A process for onboard control running on NavQ.
A QGroundControl (QGC) computer the NavQ connects to via a data link such as WiFi.
Other mavlink enabled peripherals on the vehicle.
Another program running on the same remote PC as QGC
Connect to your FMU over USB and open QGroundControl. Navigate to Settings -> Parameters -> MAVLink and set these parameters:
Also, you'll need to make sure that the settings in Settings -> Parameters -> Serial look like this:
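The exact values are not reproduced here, but a companion-computer link on TELEM2 typically amounts to something like MAV_1_CONFIG = TELEM 2, MAV_1_MODE = Onboard, and SER_TEL2_BAUD = 921600 (matching the baud rate used by mavlink-router and MAVROS elsewhere in this guide); reboot the FMU after changing MAV_1_CONFIG so the serial parameters appear.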
To install and compile mavlink-router, follow the steps below (internet access required on your NavQ).
1) Connect to NavQ console via ssh / serial
2) Type the following commands
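A sketch of a typical build from the intel/mavlink-router sources (the autotools flow the project used at the time; check the project README for the current steps):

```bash
git clone --recursive https://github.com/intel/mavlink-router.git
cd mavlink-router
./autogen.sh && ./configure --sysconfdir=/etc --localstatedir=/var --prefix=/usr
make
sudo make install
```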
Configuration of mavlink-router is done via a single configuration file, /etc/mavlink-router/main.conf. This file needs to be created from scratch. An example configuration file is available in the mavlink-router sources: https://github.com/intel/mavlink-router/blob/master/examples/config.sample
As of today, the mavlink-router make install does not create the /etc/mavlink-router directory and main.conf file. Therefore, please use the following commands to create the directory and file initially.
~$ sudo mkdir /etc/mavlink-router
~$ sudo touch /etc/mavlink-router/main.conf
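As a sketch, a main.conf along these lines matches the setup described in this section (UART3 to the FMU at 921600 baud, plus a UDP endpoint for a QGC device at 192.168.43.1; adjust device, baud rate, address, and port to your setup):

```
[UartEndpoint FMU]
Device = /dev/ttymxc2
Baud = 921600

[UdpEndpoint QGConMobile]
Mode = Normal
Address = 192.168.43.1
Port = 14550
```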
You can leave out the unused connection. Via the [UdpEndpoint QGConMobile] section, the MAVLink stream is forwarded to a QGC computer/mobile device, assuming it has the address 192.168.43.1 and the NavQ is connected to this network, e.g. via WiFi.
Enable the auto-start of mavlink-router via systemd and start it. You can check the status of mavlink-router using systemctl:
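For example (assuming make install registered the mavlink-router service unit):

```bash
sudo systemctl enable mavlink-router
sudo systemctl start mavlink-router
systemctl status mavlink-router
```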
T1 Ethernet between FMUK66 and NavQ
The RDDRONE-FMUK66 has a two-wire 100BaseT1 Ethernet interface on board. The 8MMNavQ board does not include T1 Ethernet; however, an adapter may be used. To run the T1 Ethernet connection between FMUK66 and NavQ, use a separate RDDRONE-T1ADAPT media converter.
The 8MPNavQ or "NavQPlus" will have two Ethernet interfaces. It is planned that one of these interfaces will natively be configured as 100BaseT1.
It is not recommended to use DHCP in a vehicle such as a drone, since you generally don't want the network to change without knowing the explicit details. Therefore, since there is no DHCP and the FMUK66 by default has a fixed IP of 10.0.0.2, we need to set a fixed IP on the NavQ for eth0 to be able to communicate via Ethernet to the FMUK66.
The Linux program connman is used for configuring the network settings. To force connman to use a fixed IP (as in the case when no DHCP is available), the following file needs to be created.
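A sketch of such a provisioning file; connman reads *.config files from /var/lib/connman/, and the addressing below matches the rest of this section:

```bash
sudo tee /var/lib/connman/ethernet.config > /dev/null << 'EOF'
[service_eth0]
Type = ethernet
IPv4 = 10.0.0.3/255.255.255.0/10.0.0.3
EOF
# restart connman (or reboot) so the new settings are applied
```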
It is important that you have an Ethernet cable connected first, otherwise connman will not register the network.
The IPv4 settings are in the order ownIP/netmask/router. Note that 10.0.0.3 is set as the router since in this particular hardware configuration no other device is present.
T1 Ethernet is supported by PX4 on FMUK66 with latest master.
In the example configuration above, 10.0.0.3 is the IP address of NavQ on the vehicle. More detailed description of the mavlink start parameters can be found here: https://dev.px4.io/v1.9.0/en/middleware/modules_communication.html
Additionally the MAV_BROADCAST parameter on the FMU needs to be set to "2 - only multicast".
This section covers various Linux commands that may be useful when using the NavQ and the Demo image.
If you've ever run into a situation where you need to view a raw stream from the NavQ's Google Coral Camera, or need to run a lightweight GUI application on NavQ, you can do so using the guide below.
You can run the commands below to start a VNC server on your NavQ.
You can use TightVNC to connect. You'll want to use the IP address of your NavQ at port 5901.
GPS Time for Small Unmanned Aerial Systems by Andrew Brahim Part 1: https://dirksavage88.medium.com/gps-time-for-small-unmanned-aerial-systems-a-primer-for-better-drone-technology-part-one-52d91a908323 Part 2: https://dirksavage88.medium.com/gps-time-for-small-unmanned-aerial-systems-pps-beginnings-part-two-f1a45d0882e1
The i.MX 8M Mini parts are rated 0°C to +95°C. We do not expect they will need any additional heat sinking, especially while flying, but you can monitor the core temperature with the following command:
cat /sys/class/thermal/thermal_zone0/temp
We have two separate Linux distributions that you can build for NavQ. There are pros and cons to both.
The HoverGames-BSP image is based on NXP's Yocto BSP for i.MX 8M Mini. This distro is not as easy to use, and requires much more effort to get a working system, but currently it is the only system that is allowed for use if you want to use NavQ commercially. If you need tight integration and a small system that has only the packages you need installed on it, or if you're a company looking to use the NavQ in production, this is the one for you.
The HoverGames-Demo image comes with the APT package manager as well as some other pre-installed software specific to i.MX 8M Mini. This distro is the easiest to use due to its great compatibility with pre-existing binaries coming from official repositories. You can install ROS, OpenCV, and more on this image just as you would on a normal desktop computer. This makes the Demo image great for quick development and iteration. To use this image, you must agree to a Demo license stating that you will not use NavQ commercially with this distribution.
You'll need to use a computer with Ubuntu 18.04 installed, and we recommend a high core count and plenty of RAM to build these images in a decent time frame. You will also need a large HDD or SSD (>500GB) to store the build files. A table of recommended specs is below:
| Component | Recommended Hardware |
| --- | --- |
| CPU | Recent 6-core Intel i-Series or AMD Ryzen processor with HyperThreading or SMT (Simultaneous Multi-Threading) |
| RAM | 16GB DDR4 or more |
| Storage | 500GB SSD recommended; an HDD will suffice but will be slow |
| Operating System | Ubuntu 18.04 (not 20.04!) |
Follow the guide at Yocto's website to install the necessary build tools for Ubuntu/Debian, or just install the list of packages below:
We have a GitHub repo with instructions for both the HoverGames-BSP and HoverGames-Demo images. The BSP image can be built using the master branch, whereas the Demo image can be built using the demo branch. The link to the repo is below.
There are several GPIO pins on the various JST-GH connectors on NavQ. To control these GPIO pins, follow the instructions below.
In order to use GPIO pins, we need to export them in Linux first. To do this, we need to know the GPIO number for the pin we want to access. We can compute this number using the following formula:
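The i.MX GPIO banks are 32 pins wide, so the Linux GPIO number works out as follows (a reconstruction consistent with the pinout tables earlier in this document):

GPIO number = (GPIO bank - 1) × 32 + IO index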
For example, if we want to access the GPIO1_IO12 pin on the UART4/I2C/GPIO connector, we would find that the GPIO number is:
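(1 - 1) × 32 + 12 = 12, which matches the Linux GPIO ID shown in the pinout tables.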
If you want to find out which pins correspond to which GPIO numbers, we have tables in the Hardware Overview/Pinouts and Connector info section here:
Once you know the GPIO number of the pin you want to access, exporting the pin for use is easy. All you have to do is echo the pin number to /sys/class/gpio/export. For example, if we were to export GPIO1_IO12, we would run the following in our NavQ console:
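A sketch (run as root, per the note below):

```bash
echo 12 > /sys/class/gpio/export
ls /sys/class/gpio/    # a gpio12 directory should now exist
```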
Currently we have not created a specific user group to control GPIO pins, so you must be root to export/control pins. If someone in the community would like to submit a process for creating a GPIO user group, please make a post on our hackster.io page and we will add it to the demo image. :)
Next, we will want to change the direction of the GPIO pin for our specific use case. There are two options: in and out. To do this for GPIO1_IO12, you can run the following in your NavQ console:
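For example, to make the pin an output:

```bash
echo out > /sys/class/gpio/gpio12/direction    # use "in" instead to read the pin
```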
To read or write a value to the GPIO pin, we will follow a similar process to changing the pin direction. A pseudo-file named value is created at /sys/class/gpio/gpioXXX/value that holds a 1 or a 0. If you echoed out to the GPIO direction file, you can control the pin. To control the GPIO1_IO12 pin, just run the following in your NavQ console:
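For example:

```bash
echo 1 > /sys/class/gpio/gpio12/value    # drive the pin high
echo 0 > /sys/class/gpio/gpio12/value    # drive the pin low
```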
If you echoed in to the GPIO direction file, you can read the value file to find the current state of the pin. To do this for the GPIO1_IO12 pin, you can run the following in your NavQ console:
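For example:

```bash
cat /sys/class/gpio/gpio12/value
```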
1. Create a new group called gpio
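A sketch (assuming the default navq user):

```bash
sudo groupadd gpio
sudo usermod -aG gpio navq    # log out and back in for the new group to take effect
```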
2. Create new udev rules file
Create a file at /etc/udev/rules.d/99-gpio.rules and add the following to it:
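The exact rule used on the image is not reproduced here; a commonly used rule that hands the sysfs GPIO files to the gpio group looks like this sketch:

```
SUBSYSTEM=="gpio", KERNEL=="gpiochip*", ACTION=="add", PROGRAM="/bin/sh -c 'chown root:gpio /sys/class/gpio/export /sys/class/gpio/unexport; chmod 220 /sys/class/gpio/export /sys/class/gpio/unexport'"
SUBSYSTEM=="gpio", KERNEL=="gpio*", ACTION=="add", PROGRAM="/bin/sh -c 'chown root:gpio /sys%p/value /sys%p/direction; chmod 660 /sys%p/value /sys%p/direction'"
```

Reload the rules with sudo udevadm control --reload-rules (or reboot) for the change to take effect.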
This will allow you to access the GPIO pseudofiles without being root.
[Work In Progress]
A comprehensive guide on using the NavQ as an I2C master (work in progress)
The NavQ includes an I2C port in one of the JST-GH connectors. You may use this port to communicate to other devices in your drone system. In this example, we will go over the process of connecting a Teensy LC to the NavQ over I2C to control some WS2812 LEDs.
Add guide for using C/Python SMBus libraries for controlling I2C
Add more pictures/visuals
Explain teensy code
etc
Teensy LC
JST-GH connectors and pre-terminated wires
Headers
Soldering kit
Teensy side
Arduino IDE
TeensyDuino
NavQ side
i2c-tools (installable from apt)
To create the I2C connector, you'll need to order some JST-GH hardware. Here is a link to a digikey page where you can purchase connectors:
And here is a page where you can purchase the jumpers:
In the hardware overview (link here: Hardware Overview), you can see the pinout for the I2C connector. Here is another screenshot of it:
The 5VP pin is on the left-most side of the connector, and GND is on the right-most side. I2C2_SDA is pin 4, and I2C2_SCL is pin 5. The JST-GH connector is positioned with the retention clip facing away from you when you are determining the left/right sides.
You'll need to do some soldering for the first step in this project. In the two pictures below, the NeoPixels are connected to the LED 5V, LED GND, and LED SIG pins. The JST-GH connector to the NavQ connects to the SDA/SCL pins and 5V + GND pads on the back of the Teensy.
One thing to keep in mind is that even though the Teensy LC does not include pullup resistors to 3.3V for the I2C lines, pullups are not required since the NavQ has internal 4.7k pullups on its own I2C bus (on the SoM).
Here are a couple of images of this setup:
We have written some simple example code that changes the color of the NeoPixel LEDs when the Teensy receives I2C data. In the example below, the slave address of the Teensy is 0x29, and the color of the LEDs changes from green to white when a 0x1 byte is sent to the Teensy. If any other byte is sent to the Teensy, the color changes back to green.
To use the i2c commands without root, you'll need to add the navq user to the i2c group. To do this, you can run the following command:
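A sketch:

```bash
sudo usermod -aG i2c navq
# log out and back in (or reboot) for the new group membership to take effect
```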
Once your Teensy is connected using the I2C JST-GH connector, you need to confirm that the NavQ recognizes the Teensy as an I2C device. To do this, you can run the following command on the NavQ:
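A sketch using i2c-tools; the connector carries I2C2, which shows up as one of the /dev/i2c-* buses (list them with i2cdetect -l and adjust the bus number accordingly):

```bash
i2cdetect -l      # list the available I2C buses
i2cdetect -y 1    # scan the bus (the bus number here is an assumption)
```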
You should see a device at address 0x29. If there is no device at address 0x29, you'll need to check your wiring.
To send data to the Teensy, you can use the following command:
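A sketch with i2c-tools (the bus number is an assumption, as above; 0x29 is the Teensy's slave address from the example code):

```bash
i2cset -y 1 0x29 0x01
```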
This will change the LEDs to white. You can swap the 0x1 with a 0x0 or any other byte to switch back to green.
Controlling the I2C bus with console commands is great, but what about when we want to integrate those commands into code? Well, with Python and C, we can control the Teensy over I2C by using some libraries supplied in both the Linux kernel and through pip.
First, you'll need to install the smbus pip package. To do this, just run the following in your terminal:
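A sketch (assuming pip for Python 3 is installed; some systems provide the package as python3-smbus via apt instead):

```bash
pip3 install smbus
```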
Once that is installed, you can run a simple script that sends a 1 or 0 to the Teensy to change the color of the LEDs.
The expected output of this script is as follows:
By selecting 1 or 0, you can change the color of the LEDs to white or green.
To control the I2C bus with C, you can use the following code:
[WIP]
This "Project Guide" is written to show some of the capabilites of NavQ. In conjunction with a Teensy LC and a strip of WS2812B LEDs, you can add a forward-facing battery indicator light to your drone.
The software needed to run this project on your NavQ is as follows:
ROS Noetic
MAVROS
You can install this software using the guides here:
ROS1
Controlling your drone from NavQ using MAVROS

The hardware needed is the same as the hardware from the I2C guide here:
I2C

At the moment, we're just going to paste the code here, and a more detailed guide will be written later.
This code should be uploaded to the Teensy using the Arduino IDE.
The ROS node should be placed in the home folder ('/home/navq/')
The service file should be located in /etc/systemd/system/.
The Launch script should be located in /usr/local/bin/.
Once all of the necessary files are placed in their respective directories, you need to make the systemd service run at boot. To do this, run in the terminal:
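A sketch; substitute the actual name of the service file you placed in /etc/systemd/system/ (the name below is hypothetical):

```bash
sudo systemctl daemon-reload
sudo systemctl enable battery-indicator.service
```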
The PWM chips are tied to the onboard LED on NavQ. There are three PWM chips: pwmchip0, pwmchip1, and pwmchip2. Each of these "chips" has one PWM line attached to it: pwm0. To use these PWM lines, you will need to use the sysfs interface.
Log into the root user on NavQ by running this command:
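For example (the default navq user has sudo rights):

```bash
sudo su
```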
Navigate to /sys/class/pwm and run the following commands:
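A sketch, exporting the single pwm0 line on each chip:

```bash
echo 0 > /sys/class/pwm/pwmchip0/export
echo 0 > /sys/class/pwm/pwmchip1/export
echo 0 > /sys/class/pwm/pwmchip2/export
```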
Now that our PWM lines are exported for each chip, we can change the duty cycle of the PWM lines and enable them. The default frequency is 2730667 Hz. For a 50% duty cycle, we will use half of this number: 1365333. Apply this duty cycle to each chip by running the following commands:
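A sketch using the 50% value from above:

```bash
echo 1365333 > /sys/class/pwm/pwmchip0/pwm0/duty_cycle
echo 1365333 > /sys/class/pwm/pwmchip1/pwm0/duty_cycle
echo 1365333 > /sys/class/pwm/pwmchip2/pwm0/duty_cycle
```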
We will now enable each line. The colors for each chip are as follows:
pwmchip0: RED
pwmchip1: GREEN
pwmchip2: BLUE
To enable the colors, run the following commands:
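For example:

```bash
echo 1 > /sys/class/pwm/pwmchip0/pwm0/enable   # RED
echo 1 > /sys/class/pwm/pwmchip1/pwm0/enable   # GREEN
echo 1 > /sys/class/pwm/pwmchip2/pwm0/enable   # BLUE
```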
Running these commands in succession should enable the LEDs in a RED, GREEN, BLUE pattern until you reach a white LED.
[WIP] A guide on communicating over CAN/SLCAN using NavQ and UCANS32K146
If you're thinking about using the CAN protocol on your drone, this guide will walk you through using our UCANS32K146 to create a CAN interface.
Since there isn't a native CAN bus on the NavQ, we can use a protocol called SLCAN to communicate CAN messages across a UART connection. We have built a binary for the UCANS32K146 that acts as an SLCAN transfer layer. This means that we can add a CAN bus to NavQ by just connecting the UCANS32K146 to the UART3 port.
To enable SLCAN on NavQ, run these commands:
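A sketch using the userspace slcand tool from can-utils (the UART device and bitrate are assumptions; UART3 typically appears as /dev/ttymxc2, and -s8 selects a 1 Mbit/s CAN bitrate):

```bash
sudo slcand -o -s8 /dev/ttymxc2 slcan0
sudo ip link set slcan0 up
```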
Now you can use SocketCAN or python-can to send and receive CAN messages over the slcan0 interface. As an example, here is how to send a CAN message from the command line:
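For example, with can-utils installed:

```bash
cansend slcan0 123#DEADBEEF   # send CAN ID 0x123 with four data bytes
```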
Follow the guide at the link below to flash the SLCAN binary to your UCAN board:
To use the package manager (apt) on the Demo image, you'll need to change your timezone.
First, you'll need to locate the correct timezone file at /usr/share/zoneinfo. There should be a folder for your country and a file in that folder for the closest city to you.
For example, if you're in Central Time USA, you'd use the following commands:
Now, you can run sudo apt update and sudo apt upgrade to get your system up to date.
ROS on NavQ will allow you to interface with sensors, control your drone using MAVROS, and more. To get started, follow the install guide below and then continue to the next sections.
When you install ROS Noetic on your NavQ, make sure to install the base version of ROS and not the desktop version. If you install the desktop version, critical gstreamer packages for NavQ can be overwritten and therefore become non-functional.
To install ROS, you need to be on the Demo image. You can follow the guide for installing ROS Noetic Ninjemys at http://wiki.ros.org/noetic/Installation/Ubuntu
ROS Melodic is automatically installed on the HoverGames-BSP image. It includes MAVROS by default. You will need to do a little bit of setup, though, once you first boot your image.
Run the following commands to enable ROS on the HoverGames-BSP image:
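The exact commands are not reproduced here; on a BSP image with ROS Melodic pre-installed, this typically amounts to sourcing the ROS environment and making it persistent:

```bash
source /opt/ros/melodic/setup.bash
echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc
```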
You'll also want to download the following script and run it to install GPS geoids:
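MAVROS provides an installer script for the GeographicLib geoid datasets; a sketch of fetching and running it (check the MAVROS repository if the path has moved):

```bash
wget https://raw.githubusercontent.com/mavlink/mavros/master/mavros/scripts/install_geographiclib_datasets.sh
sudo bash ./install_geographiclib_datasets.sh
```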
Now, you can continue with the ROS tutorials for setting up a build environment and installing your first package. We will go over this in the next section.
The 8MMNavQ can control your HoverGames drone by communicating with the RDDRONE-FMUK66 over MAVROS. A UART cable will be included in the kit that connects the UART3 port on the 8MMNavQ to the TELEM2 port on the RDDRONE-FMUK66.
NOTICE: When running the off-board script, make sure that you confirm the landing zone for your drone in QGroundControl. The local position parameter in the offboard ROS node is set to x:0, y:0, z:2, which means it will hover at 2 meters above its landing zone. If the drone takes off from a position away from its landing zone, it will quickly return to its landing zone and hover 2 meters above it. This is especially important to note if you turn the drone on indoors and then place it somewhere outside to take off. We don't want your drone to smack into a building!
Connect to your FMU over USB and open QGroundControl. Navigate to Settings -> Parameters -> MAVLink and set these parameters:
Also, you'll need to make sure that the settings in Settings -> Parameters -> Serial look like this:
A coding guide for the ROS node we will be using is located at the link below.
This guide will help you install the ROS node outlined in the MAVROS Offboard Example.
To start, you'll want to make sure that you have already set up a development environment for ROS. ROS has a guide on how to get a catkin workspace set up in the link below.
Once you've completed that tutorial, you may want to add an extra line to your ~/.bashrc so that your devel/setup.bash is always sourced when you open a new terminal:
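A sketch, assuming your workspace lives at ~/catkin_ws:

```bash
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
```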
This will ensure that your development environment is properly set up when you open a new shell.
Follow the "binary installation" guide on the page below to install the necessary MAVROS packages from apt.
Make sure to use 'noetic' in place of 'kinetic' in the commands they give you on this page. Also, you do NOT need to follow the "Source Installation" section of the guide.
To create our first ROS package, we will want to navigate to our catkin workspace's src folder and run the following command:
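Based on the package name and dependencies described below, the command is:

```bash
cd ~/catkin_ws/src
catkin_create_pkg offb roscpp mavros_msgs geometry_msgs
```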
This command will create a new package folder named offb and will add the dependencies roscpp, mavros_msgs, and geometry_msgs to the 'CMakeLists.txt' and 'package.xml' files. Next, you'll want to take the code from the PX4 MAVROS example and create a file named offb_node.cpp in the src/ folder in the offb package. Your directory structure should now look like this:
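Roughly:

```
catkin_ws/
└── src/
    └── offb/
        ├── CMakeLists.txt
        ├── package.xml
        └── src/
            └── offb_node.cpp
```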
In order to build your ROS package, you'll need to make some edits to CMakeLists.txt so the catkin build system knows where your source files are. Two edits need to be made.
The first edit is to add your executable to CMakeLists. Your executable should be named offb_node.cpp. Uncomment line 136 to add it:
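In the generated CMakeLists.txt template this is the commented-out add_executable() entry; uncommented it looks roughly like this (with the offb package, ${PROJECT_NAME}_node resolves to offb_node):

```cmake
add_executable(${PROJECT_NAME}_node src/${PROJECT_NAME}_node.cpp)
```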
The second edit is to link your target libraries (roscpp, mavros_msgs, and geographic_msgs). Uncomment lines 149-151 to do so:
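Uncommented, that block looks roughly like:

```cmake
target_link_libraries(${PROJECT_NAME}_node
  ${catkin_LIBRARIES}
)
```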
And that's all you need to do for now to set up your workspace!
To build your ROS node, return to the root of your catkin_ws/ directory and run:
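A sketch (use catkin build instead if you set the workspace up with catkin tools):

```bash
cd ~/catkin_ws
catkin_make
```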
To run our ROS node, we need to make sure that MAVROS is running. On the NavQ, run the following command:
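A sketch matching the description below (MAVROS's stock px4.launch with the FCU URL pointed at UART3):

```bash
roslaunch mavros px4.launch fcu_url:=/dev/ttymxc2:921600
```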
This will start roscore and the mavros node with a pointer to the UART port /dev/ttymxc2 at a 921600 baud rate. To run the ROS node we created, run the following in an ssh terminal:
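Assuming the executable kept the template name offb_node:

```bash
rosrun offb offb_node
```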
and your drone should take off to an altitude of 2 meters!
Follow the guide at the link below to install ROS2 Foxy Fitzroy on your NavQ running the Demo image.
Note 1: at Setup Sources step you might get an error message by curl. To avoid this, run the following commands:
Note 2: at the Install ROS2 packages step, run the ros-foxy-ros-base installer, as the Desktop tools are not needed on NavQ:
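That is:

```bash
sudo apt install ros-foxy-ros-base
```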
FastRTPS and the microRTPS agent are needed on NavQ in order to bridge uORB topics from PX4 to ROS2 on NavQ over a UART or UDP connection. Follow the guide below to build and install these packages.
Follow the link below for more details on microRTPS and PX4 ROS Com:
First, we will install the FastRTPS project from eProsima. Use the following commands below to do so:
Next, we will build and install the necessary software that will allow us to use ROS2 to communicate with the microRTPS bridge. First, run the following commands:
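The PX4 documentation of the era used a dedicated workspace like the sketch below; treat the paths as assumptions you can adapt:

```bash
mkdir -p ~/px4_ros_com_ros2/src
git clone https://github.com/PX4/px4_ros_com.git ~/px4_ros_com_ros2/src/px4_ros_com
git clone https://github.com/PX4/px4_msgs.git ~/px4_ros_com_ros2/src/px4_msgs
```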
URGENT: Building px4_ros_com requires a lot of RAM. Enabling a swap disk is highly recommended. This will take up 1GB of space on your storage medium.
Run the following commands to enable a 1GB swapfile:
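A sketch of a standard 1GB swap file:

```bash
sudo fallocate -l 1G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
swapon --show   # verify the swap file is active
```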
Now, build the workspace:
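px4_ros_com ships a helper script for this; a sketch, assuming the workspace layout above (alternatively run colcon build from the workspace root):

```bash
cd ~/px4_ros_com_ros2/src/px4_ros_com/scripts
./build_ros2_workspace.bash
```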
In order to run all of your specific ROS2 software successfully, you must source the install/setup.bash files in each of your ROS2 workspace folders. Add the following lines to your .bashrc to do so:
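For example, append lines like these to ~/.bashrc (the paths are assumptions matching the workspace above):

```bash
source /opt/ros/foxy/setup.bash
source ~/px4_ros_com_ros2/install/setup.bash
# add one line per additional ROS2 workspace you build
```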
Continue to the next page to set up a systemd service that will automatically start the microRTPS agent on your NavQ. The guide will also cover how to automatically start the client on the FMU.
Generate a start-up script for the microRTPS agent under /usr/local/bin with the following content:
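A sketch of such a script (the file name /usr/local/bin/micrortps-agent-start.sh is hypothetical; the UART device and baud rate are assumptions matching the TELEM2 wiring used elsewhere in this guide):

```bash
#!/bin/sh
# source your ROS2/px4_ros_com environment here if micrortps_agent is not on the default PATH
micrortps_agent -d /dev/ttymxc2 -b 921600
```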
Save the file and exit nano. Make the file executable
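For example:

```bash
sudo chmod +x /usr/local/bin/micrortps-agent-start.sh
```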
Generate a systemd service file to start the startup script at boot, with the following content:
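A sketch of such a unit (the file name /etc/systemd/system/micrortps-agent.service is hypothetical):

```
[Unit]
Description=microRTPS agent (PX4 to ROS2 bridge)
After=network.target

[Service]
ExecStart=/usr/local/bin/micrortps-agent-start.sh
Restart=on-failure
User=navq

[Install]
WantedBy=multi-user.target
```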
Save the file and exit nano. Check that the process starts:
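For example:

```bash
sudo systemctl daemon-reload
sudo systemctl start micrortps-agent.service
systemctl status micrortps-agent.service
```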
You should see the state active (running); quit with <q>. Finally, enable the systemd service file so it is active at boot:
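For example:

```bash
sudo systemctl enable micrortps-agent.service
```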
In order to use the microRTPS client with NavQ, you'll need to build PX4 with the _rtps tag for the fmuk66-v3 build target. To do this, you will need to have both the FastRTPS and Fast-RTPS-Gen packages installed. You can just follow the previous guide on your Linux development VM or computer.
Once you have successfully installed those two packages, you can navigate to your cloned PX4 repository and run the following:
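A sketch, assuming a recent PX4-Autopilot checkout where the FMUK66 target is named nxp_fmuk66-v3:

```bash
make nxp_fmuk66-v3_rtps
```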
You will need to flash your FMU with the updated RTPS binary. If you don't know how to do this yet, follow the guide here:
To make the microRTPS client start at boot on the FMU, you will need to have an SD card inserted. On your SD card, make a file at /etc/extras.txt and insert one of the following options:
[WORK IN PROGRESS]
NOTE: This guide is currently a work in progress. Some details may not be finished.
In this section, we will guide you through the process needed to detect AprilTags on your NavQ. There are a few things that need to be done to accomplish this:
Install ROS2 image tools
Build and install AprilTag detection nodes
Calibrate the camera on your NavQ using a checkerboard pattern
Run!
Before we start, you will need a few things:
To create a checkerboard for camera calibration, download this PDF: https://www.mrpt.org/downloads/camera-calibration-checker-board_9x7.pdf
In order to calibrate the camera, you will need to set up your NavQ with a mouse, keyboard, and monitor. Use the included microUSB hub and HDMI connector to do so.
Install the following packages with the apt package manager by running the commands below:
Once that is finished, move on to the next step.
Hook up your NavQ to a monitor with the provided HDMI cord and connect a USB mouse + keyboard through the included microUSB hub. Open the terminal by clicking the icon at the top left of the screen and open the bash shell by running:
Have your printed checkerboard ready in a well lit environment and run the camera calibration software by running the following commands:
Now use the link in the note above to run through calibrating the camera.
A prerequisite for the apriltag_ros node is apriltag_msgs. Clone the repo and build it by running these commands:
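A sketch (assuming the repository name apriltag_msgs on the author's GitHub):

```bash
git clone https://github.com/christianrauch/apriltag_msgs.git
cd apriltag_msgs
colcon build
```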
Make sure to source the install/setup.bash file so that apriltag_ros can see apriltag_msgs when it is being built.
First, in order to detect AprilTags, we need to build the apriltag_ros node written by christianrauch. You can clone his repository by using this git repo:
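Assuming the standard repository name on the author's GitHub (treat the URL as an assumption):

```bash
git clone https://github.com/christianrauch/apriltag_ros.git
```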
To make his repo work with ROS2 Foxy, you will need to make a small change in the CMakeLists.txt file. Go to line 26 in that file and delete the apriltag:: token in the AprilTagNode apriltag::apriltag part.
Next, you'll want to save that file and run colcon build in the apriltag_ros folder. Once it is done building, you'll want to source the install/setup.bash file. Add this line to your .bashrc:
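For example (adjust the path to wherever you built apriltag_ros):

```bash
source ~/apriltag_ros/install/setup.bash
```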
In order to make the apriltag_ros node work, we need to make sure that camera info messages are being sent in sync with each camera frame published by the cam2image node. We have written an example node that does just that. You can download it here:
You will need to replace the matrices in the node file to match your camera calibration parameters. The source file is located at pypysub/py_pysub/publisher_member_function.py. Once you have done that, make sure to build and install the node and source the install/setup.bash file.
To run the code, you'll need to run the following ROS nodes:
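The exact node names aren't reproduced here; a hedged sketch, one node per terminal (the py_pysub executable name, the apriltag_ros executable, and the topic remappings are all assumptions -- check the packages you built):
# terminal 1: camera frames from the Coral camera
ros2 run image_tools cam2image
# terminal 2: the camera_info publisher node built above
ros2 run py_pysub talker
# terminal 3: the AprilTag detector
ros2 run apriltag_ros apriltag_node --ros-args -r image_rect:=/image -r camera_info:=/camera_info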
There is an NXP community user guide for gstreamer available here: https://community.nxp.com/t5/i-MX-Processors-Knowledge-Base/i-MX-8-GStreamer-User-Guide/ta-p/1098942
To take a picture on your NavQ using GStreamer, run the following command:
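One plausible pipeline (the camera device node is an assumption; /dev/video0 is typical for the Coral camera):
gst-launch-1.0 v4l2src device=/dev/video0 num-buffers=1 ! videoconvert ! jpegenc ! filesink location=picture.jpg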
To take video, you can run the following pipeline:
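A sketch that records roughly ten seconds of 720p H.264 video, assuming the NXP VPU encoder element (vpuenc_h264) is available in the image; substitute a software encoder such as x264enc if it is not:
gst-launch-1.0 -e v4l2src device=/dev/video0 num-buffers=300 ! video/x-raw,width=1280,height=720,framerate=30/1 ! videoconvert ! vpuenc_h264 ! h264parse ! mp4mux ! filesink location=video.mp4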
In this guide, we need a few things:
NavQ Companion Computer mounted with Google Coral Camera attached
Laptop/Phone with QGroundControl Installed
Both NavQ and mobile device connected to the same WiFi network
In QGroundControl, click the Q logo in the top left, and configure the video section as seen in the image below:
This will set up your QGroundControl instance to receive the UDP video stream from the NavQ.
Follow the WiFi setup guide using connman in the Quick Start guide to connect your NavQ to the same router as your mobile device. You will need to use the serial console to do this. Once you have your NavQ connected, you can run ifconfig in the serial console to find the IP address of your NavQ.
You can SSH into the NavQ to run the GStreamer pipeline once you have the IP.
With your NavQ on, SSH into it by using the IP address you noted when connected to the serial console. Once you're successfully SSHed in, you should note the IP address that you logged in from as seen here:
This is the IP of your computer that you should be sending the video stream to.
To run the GStreamer pipeline, run the following command:
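A sketch of an H.264-over-RTP/UDP pipeline (replace <ground-station-ip> with the IP address you noted above, and make sure the port matches the one configured in QGroundControl; vpuenc_h264 assumes the NXP VPU plugin is present):
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=30/1 ! videoconvert ! vpuenc_h264 ! rtph264pay config-interval=1 pt=96 ! udpsink host=<ground-station-ip> port=5600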
Once you run that command, you should be able to see the video stream from your NavQ on QGroundControl!
To enable Mobile Hotspot on Windows, go to Settings->Network & Internet->Mobile Hotspot. Next, you'll want to edit your mobile hotspot settings to set a password and SSID. Once you've done this, you can enable Mobile Hotspot. You can see a full configuration in the screenshot below.
By default, ports 5000 and 5600 are not open in the Windows firewall, so any UDP stream packets will be blocked. To allow the stream through, go to your Windows search bar and type "Firewall". Select "Windows Defender Firewall".
Once you open Windows Defender Firewall, you'll want to navigate to "Advanced Settings" from the menu on the left.
You will then be brought to a new window with Windows Firewall rules. To create a new rule for QGC streaming, you'll need to click "New Rule" on the right side.
You will be brought to a new window to add a rule. Select "Program" and click "Next".
At the next window, it will ask you to specify the program you are adding a rule for. Paste the following into that field and click "Next":
Once you've done this, you can click "Next" through the rest of the fields and you should be good to go.
To connect your NavQ to your new Mobile Hotspot, follow the connecting to WiFi guide in the Gitbook here:
Now you can stream to QGroundControl as you normally would. Follow the guide here:
Streaming Video to QGroundControl using NavQ over WiFi
To enable a WiFi hotspot in Ubuntu 20.04, you'll first need to go to Settings->WiFi. Then, at the top right, click the 3 dots button and select "Turn On Wi-Fi Hotspot...".
After you click that entry, this window will pop up. Enter a network name and password, and you should be good to go! Follow Steps 3 and 4 in the Windows section above to configure your NavQ.
With OpenCV on NavQ, you will be able to harness a vast library of computer vision tools for use in HoverGames. OpenCV is installed out of the box on the HoverGames-BSP image and can be installed easily through the package manager on the HoverGames-Demo image. If you'd like to get a jump start on OpenCV, follow the guide below to create a program that detects red objects.
Let's go through a quick example of running OpenCV on the NavQ to identify a red object in an image taken on the Google Coral camera. This example will be written in Python and uses OpenCV.
If you are using the default OS that is shipped with the NavQ, you can skip this step.
If you're using the HoverGames-Demo image, you'll need to install python3-opencv. To do so, run the following command in your terminal:
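On the Ubuntu-based image the package comes straight from apt:
sudo apt update
sudo apt install python3-opencv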
First, create a new python source file (name it whatever you want!). We only need two imports for this program: opencv (cv2) and numpy. Numpy is used to create arrays of HSV values.
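The two imports look like this:
import cv2
import numpy as np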
To capture an image, we must first open a camera object and then read from it.
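For example (the device index 0 is an assumption; the Coral camera normally enumerates as the first video device):
cap = cv2.VideoCapture(0)
ret, img = cap.read()
cap.release()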
To make our OpenCV pipeline run faster, we're going to shrink our image down to 640x480 resolution. This resolution isn't so small that the image quality will be reduced enough to make a difference in detecting objects, but it will make OpenCV process our image much quicker.
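For example:
img = cv2.resize(img, (640, 480))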
Another pre-processing step that we will run is a box blur. This will get rid of small artifacts in the image that can throw off our pipeline and will make detecting large objects much easier.
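For example (the kernel size is an assumption -- larger kernels blur more aggressively):
img = cv2.blur(img, (10, 10))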
In order to find objects that are red in our image, we will apply an HSV filter to the image to create a mask of the color red in the image.
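A sketch of the mask; note that red wraps around the hue axis, so two ranges are combined (the exact thresholds are assumptions and should be tuned for your lighting):
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
mask = cv2.bitwise_or(
    cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255])),
    cv2.inRange(hsv, np.array([170, 120, 70]), np.array([180, 255, 255])))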
To find the location of the objects in our image, we will find contours in the mask and sort them by total area. This will allow us to filter out smaller objects that we aren't interested in. We will also be able to detect the objects' position in the image and draw a box around them.
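A sketch of the contour search and annotation (the minimum-area threshold is an assumption):
# [-2] keeps this working on both OpenCV 3.x and 4.x return conventions
contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
contours = sorted(contours, key=cv2.contourArea, reverse=True)
if contours and cv2.contourArea(contours[0]) > 500:
    x, y, w, h = cv2.boundingRect(contours[0])
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.putText(img, "red object", (x, y - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)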
Finally, we will store the images we generated from this program: the mask and the final image with annotations (bounding box and text).
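For example:
cv2.imwrite("mask.png", mask)
cv2.imwrite("result.png", img)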
To run the code, you'll need to use python3. Run the following command (<file.py> will be the filename that you saved the code to):
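For example:
python3 <file.py>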
Here is the complete source code if you'd like to run it on your own NavQ as an example:
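The original listing isn't reproduced here; the following sketch simply assembles the hedged snippets from the steps above (camera index, blur kernel, HSV thresholds, and area threshold all remain assumptions):
import cv2
import numpy as np

# capture a single frame from the camera (device index 0 is an assumption)
cap = cv2.VideoCapture(0)
ret, img = cap.read()
cap.release()
if not ret:
    raise RuntimeError("could not read a frame from the camera")

# shrink the frame to 640x480 to speed up processing
img = cv2.resize(img, (640, 480))

# box blur to remove small artifacts (kernel size is an assumption)
img = cv2.blur(img, (10, 10))

# HSV mask for red; red wraps around the hue axis, so two ranges are combined
# (thresholds are assumptions -- tune them for your lighting)
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
mask = cv2.bitwise_or(
    cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255])),
    cv2.inRange(hsv, np.array([170, 120, 70]), np.array([180, 255, 255])))

# find contours, sort by area, and annotate the largest red object
# ([-2] keeps this working on both OpenCV 3.x and 4.x return conventions)
contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
contours = sorted(contours, key=cv2.contourArea, reverse=True)
if contours and cv2.contourArea(contours[0]) > 500:
    x, y, w, h = cv2.boundingRect(contours[0])
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.putText(img, "red object", (x, y - 10),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)

# store the mask and the annotated image
cv2.imwrite("mask.png", mask)
cv2.imwrite("result.png", img)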
Python framework for eIQ on i.MX
This page is a work in progress. NOTE - THIS WILL NOT WORK ON NAVQ!! UPDATED 12/03/2020 - Updated with notes that this will not work as-is for NavQ using the 8M Mini. Apologies for any confusion. These notes are here only for reference by advanced developers. The 8M Mini does not have any NN acceleration and can only run inference on the processor cores.
pyeIQ is not targeted at the i.MX Mini processor, but it may still work albeit with much lower performance than if an accelerator was available. We expect to use this more with the upcoming i.MX 8M Plus that includes a 2.25 TOPS neural net accelerator.
Please refer to the following pyeIQ documentation:
Note that eIQ support is only included in the imx-image-full-imx8mpevk.wic pre-built image [1]. *** THIS IMAGE is only for the 8M Plus!
Please take a look at the switch_image application; we are using TFLite 2.1.0. This application offers a graphical interface for users to run an object classification demo using either the CPU or the NPU.
# pyeiq --run switch_image
We also have a TFLite example outside of pyeIQ; please refer to the instructions below. Details can be found in the i.MX Linux User's Guide [2].
# cd /usr/bin/tensorflow-lite-2.1.0/examples
# ./label_image -m mobilenet_v1_1.0_224_quant.tflite -i grace_hopper.bmp -l labels.txt
The i.MX Linux User's Guide [2] also provides instructions on how to get our latest Linux BSP [1] up and running. *** NOTE FOR 8M Plus only!
[2]: https://www.nxp.com/docs/en/user-guide/IMX_LINUX_USERS_GUIDE.pdf
Where to learn more about Gazebo
Gazebo is one of several simulators that work with PX4 and ROS.
Simulation is important in order to test code without risk of damaging real hardware. It can be critical in uncovering faults that would otherwise be very difficult to trigger. This is not a tutorial on Gazebo, but a list of some resources to get started.
PX4.io Gazebo developer guide https://dev.px4.io/v1.9.0/en/simulation/gazebo.html
Youtube videos e.g. https://youtu.be/mranHM9wn0g
Read about Gazebo on Wikipedia.
Try out this simple Gazebo tutorial to control a differential drive robot, which is a fun way to learn both Gazebo and ROS. (smile)
[WORK IN PROGRESS]
Telerobotics is the control of robots over the internet. A good example of this is Twitch Plays. Twitch Plays allows users to play games on a stream by giving commands through Twitch Chat.
Now you may be wondering, "Well Twitch Plays isn't controlling a robot, that's a video game!" and you'd be correct. There is an alternative for robotics though! It's called Remo.TV, and we have written a module for the NavQ to be supported on the site.
Remo.TV is a website where anyone can log in and control robots over the internet. With NavQ, your family and friends can easily control your HoverGames Drone or NXP Cup Rover through Remo.TV. Below we have written a guide for you to set up your drone or rover with the service.
This guide is written for an NXP Cup Car with PX4. You can follow this guide for other robots, but you will need to create your own hardware file to work with your setup. To do this, follow Remo.TV's documentation. You can start by visiting the front page of their website:
To set up Remo.TV on NavQ, we are going to have to install a few packages. Lets install those now:
Install dependencies:
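The original package list isn't shown; a plausible set for the stock Ubuntu image (ffmpeg is assumed to be needed for RemoTV's video streaming) is:
sudo apt update
sudo apt install git python3 python3-pip ffmpeg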
Download RemoTV from GitHub:
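For example (the repository URL is assumed to be the official remotv controller repo):
git clone https://github.com/remotv/controller.git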
Install Python dependencies:
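Assuming the repository layout from the clone above:
cd controller
python3 -m pip install -r requirements.txt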
Open RemoTV and copy the sample config file:
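The sample file name is an assumption; adjust it if your checkout differs:
cp controller.sample.conf controller.conf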
And your RemoTV is cloned and ready for configuration! Next we will set up our configuration file.
To get Remo.TV to work on your robot, you'll need to set up the configuration file to work with NavQ. Open the controller.conf file that we copied earlier for editing. Below you will see each field that needs to be edited for the configuration file.
Config field        Value
owner=              Set this as your Remo.TV username.
robot_key=          Set this as your Remo.TV API key.
type=               navq

Config field        Value
x_res=              640
y_res=              480
video_framerate=    30
Log into your Remo.TV account and go to your robot. The screen should look like this:
At the bottom, there is a movement tab. For the NXP Cup car, you'll want to press the (edit buttons) text and paste this code into that window:
This will set up your controls to be compatible with the example navq hardware file.
Once you have set up the software, you can run the controller by going into the controller directory and running:
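For example (the script name is assumed from the RemoTV controller repository):
python3 controller.py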
Your robot should start streaming to Remo.TV!
Currently the image does not include connman due to a bug in the build system. Also, for connman to work, it requires a package called dhcpcd5. To install connman and dhcpcd5, connect your NavQ to the internet using ethernet, and run the following:
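For example:
sudo apt update
sudo apt install connman dhcpcd5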
Use the quick start guide to connect to WiFi once you're finished installing those packages.
There is an issue with resolv.conf that prevents you from connecting to the internet on WiFi due to an incorrect DNS setup. To fix this, edit /etc/resolv.conf by removing the current nameserver IP and replacing it with 8.8.8.8.
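After the edit, /etc/resolv.conf should contain a single nameserver line:
nameserver 8.8.8.8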
By default, the navq user on the HoverGames-Demo image is not in the dialout and video groups. If your user is not in those groups, you will not have control of the camera nor the UART port for communication to the FMUK66. To fix this, run the following commands:
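The original commands aren't reproduced here; one way to add the navq user to both groups is:
sudo usermod -aG dialout,video navq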
You will need to log out and log back in so that your new session picks up the updated group membership.
In the current build, there is an issue with ldconfig which causes the v4l2src driver to not be found. To fix this, run the following commands:
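The original commands aren't reproduced here; if the GStreamer plugin libraries are installed but the dynamic linker cache is simply stale, rebuilding the cache may be all that is needed:
sudo ldconfig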
Currently the patches for the linux device tree in our build system are not working. To fix hardware issues like the camera, HDMI, and WiFi, you'll need to replace the *.dtb file on the boot partition of your SD card. You'll need the following file to follow the steps below:
Insert your SD card into your computer
Open the boot partition that is mounted by default.
Drag and drop the imx8mm-cube.dtb file into the boot partition.
Boot your NavQ. Quickly connect to it using a serial monitor such as PuTTY.
Press Enter repeatedly after it boots to bring up the u-boot=> prompt.
In the u-boot=> prompt, run the following commands:
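The exact commands aren't reproduced here; a plausible sketch, assuming the standard NXP i.MX U-Boot environment where the fdt_file variable selects the device tree blob loaded from the boot partition:
setenv fdt_file imx8mm-cube.dtb
saveenv
boot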
Once you have completed these steps, you should be good to go until you reflash your SD card with a new image.
How to assemble the NavQ for use. For initial developers only. Yocto team.
NOTE: This information is archived and not up to date. Some information might still be useful, but continue at your own risk.
The boards ship pre-programmed, pre-assembled and configured to boot from eMMC.
While the boards ship pre-programmed with an image and configured to boot from eMMC, you will want to update to the latest code after confirming that everything is running.
There are several interfaces that can be used when connecting the NavQ board to a PC/laptop. The base connections are as follows below:
Connect the NavQ Debug UART to the USB-UART adapter (included) and then the USB-UART adapter to the PC. You can use a terminal program like Putty to connect to the root console.
Plug the USB C to USB A cable between the NavQ and a PC to provide power
An Ethernet connection is available by using the IX Ethernet to RJ45 cable between the NavQ and a PC or Ethernet hub
On a factory fresh system, when power is applied using the USB-C cable (or one of the other power inputs), the system will boot to the Linux prompt. This test shows that the system is functional (UBoot and Linux in terms of the software, UART, Ethernet and eMMC in terms of the hardware).
The default login is root with no password.
To provide a better understanding, refer to the following breakdown of the system:
The SOM - the top layer in the assembly - consists of:
The processor itself: can be assembled with IMX8M MINI / IMX8M NANO
The FLASH (eMMC): can be assembled in the range from 4GB up to 512 GBytes
The first option for the RAM (LPDDR4): can be assembled in the range from 1GB up to 4GB, the data bus is 32 bit (high throughput)
The second option for the RAM (DDR4): can be assembled in the range from 256MB up to 1GB, the data bus is 16 bit (medium throughput)
The WiFi [optional]
The MEDIA board - the middle board in the stack - provides the following connectivity:
The MIPI DSI (Display Serial Interface) interface at a flex connector can be used with:
MIPI Display adapter board (with a 5.5 inch FullHD display) [optional, via 24pin flex cable]
MIPI to HDMI adapter board (HDMI at the output) [optional, via 24pin flex cable]
The MIPI CSI (Camera Serial Interface) interface at a flex connector can be used with:
Time Of Flight Camera Adapter board (38kPix) [optional, via 24pin flex cable]
RGB Camera Adapter board (Google Coral camera, 5MPix) [optional, via 24pin flex cable]
The PCIe interface at a flex connector can be used with:
PCIe extension board with a PCIe M.2 key E adapter board (any NVMe SSD)
The 1G Ethernet PHY which can be connected to the:
Ethernet+USB adapter board [optional, via 24pin flex cable]
HGI board [optional, via board-to-board connector]
The microSD card holder
The USB 2.0 interface (USB0) with the optional connectivity to the:
Ethernet+USB adapter board [optional, via 24pin flex cable]
HGI board [optional, via board-to-board connector]
The HGI (HoverGames Interposer) [optional] provides the following connectivity:
The USB 2.0 interface (USB0) (available at USB type-C connector)
The USB 2.0 interface (USB1) (available at micro-USB connector)
The 1G Ethernet connector (IX industrial Ethernet connector)
The JTAG interface (for debugging purposes)
The BootMode switches (to switch between USB/eMMC/SD card)
The RGB LED (for some fun for HoverGames)
The UART interfaces (2 pcs)
The SPI interface (1 pc)
The secure element (an IC for HG games)
The actual kit consists of all three layers (SOM + Media board + HoverGames Interposer Board) with the Google Coral camera installed by default (https://coral.ai/products/camera/).
Each layer is implemented as an individual project and has its own documentation (schematic + layout).
Here you can find the schematics and renders for each board in the stack:
Please note that, in accordance with the manufacturer's recommendations, the flex cable connectors are designed for only 20 mating cycles and the board-to-board connectors for only 30 mating cycles.
For development purposes, our own experience shows that the board-to-board connectors can in practice withstand up to 500 mating cycles. However this is beyond specification and for critical application use it is best to minimize mating cycles and stress to the connectors.
Engineering sample boards nevertheless are suitable for debugging all the other interfaces and capabilities.
In order to switch the board into the various i.MX debugging modes you may refer to the table below which describes the boot modes available at the board.
An X means that the position of the switch does not matter
A dark square shows the position of the movable element on the switch.
To configure boot access to the system via USB, please set the boot mode switches in accordance with the table above for the "8M MINI: boot from USB" configuration. After power cycling the board, the processor will power on and become available via the USB-C connector. It will not execute code from the onboard SDCARD or eMMC.
Any regular tests and operations are now available via the UUU tool. The DDR Stress Test tool can also be used to perform the LPDDR4 calibration.
This particular system is built with 3GBytes of RAM, which means the default UBoot configuration and the lpddr4_timing.c file should be updated. The patch provided below has been applied to the UBoot tree available at:
https://source.codeaurora.org/external/imx/uboot-imx.git -b refs/heads/imx_v2018.03_4.14.98_2.0.0_ga
Alternatively, you may use only the LPDDR4 calibration file:
A detailed manual on how to build U-Boot from scratch is available at: https://community.nxp.com/docs/DOC-345535
The U-Boot image may be re-built using the manual and patch above. This will allow you to boot the system via USB using the NXP UUU tool and program U-Boot to the eMMC using the UUU script below:
After successfully reaching this point you may switch back from "Boot from USB" mode to "boot from eMMC" mode.
In order to build Linux you may refer to the git available at
https://source.codeaurora.org/external/imx/linux-imx.git -b refs/heads/imx_4.14.98_2.0.0_ga
INITIAL DEVELOPERS ONLY - NOTE: The patch below for the tree is extremely raw and should be used as a reference only
In order to get the root file system please refer to the Yocto reference manual, chapter 5.
This is an initial version of the manual and its coverage of all aspects is quite limited. If you face any HW issues, please let me know via email at abushuev@emcraft.com and/or via NXP's Microsoft Teams messaging system.
Contact NXP at iain.galloway@nxp.com for coordination or other questions.
You will need to download the Ubuntu 18.04 rootfs files from this link:
NOTE: NXP Internal. Only NXP employees currently have access to this link.
The two files you will need from this link are located in the im8mmevk/ folder:
imx-boot-imx8mmevk-ds.bin-flash_evk
imx-image-multimedia-ubumtu-imx8mmevk-20200611102300.rootfs.sdcard.bz2
You will need to extract the .bz2 file using 7zip on Windows or bzip2 -d <filename> on Linux.
UUU is a tool used for flashing i.MX8 boards. You can download UUU for Windows or Linux using the link below:
In order to flash the NavQ using UUU, you'll have to change the DIP switches to the following configuration:
Power the board over USB-C as normal. You'll also want to connect to the USB-C/UART adapter and pull up a serial console (with baud rate 115200) to watch the serial output while flashing in case there are any errors.
(NOTE: This has only been tested on Windows 10)
Open CMD on Windows as administrator and change your directory to where you stored the uuu.exe, boot, and ubuntu rootfs files that you downloaded at the beginning of this guide. Then, plug the NavQ in with the correct DIP switch configuration and run the following command in the CMD:
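The original command isn't reproduced here; a plausible invocation using UUU's built-in emmc_all script with the two files downloaded earlier (the .bz2 already extracted) would be:
uuu.exe -b emmc_all imx-boot-imx8mmevk-ds.bin-flash_evk imx-image-multimedia-ubumtu-imx8mmevk-20200611102300.rootfs.sdcard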
Your eMMC should now be flashed with the Ubuntu 18.04 rootfs. After this step, it still won't boot, so there are a few more steps to get it to boot.
(NOTE: You must have an SD card that you can boot from or you can't perform these steps.)
Boot from the SD card and run the following commands to mount the eMMC boot partition:
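A sketch (the eMMC device node is an assumption -- confirm it with lsblk before mounting):
sudo mkdir -p /mnt/boot
sudo mount /dev/mmcblk2p1 /mnt/boot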
Your eMMC boot partition should now be mounted. Next, you'll need to connect your NavQ over ethernet to your computer or router. Use an FTP client (I used FileZilla) to FTP into the NavQ, and then place the following file in /mnt/boot:
This is a Device Tree Blob (dtb) file that tells Linux what the device tree looks like. (WiFi does not work with this .dtb, a new one with WiFi working will be provided soon). Once you've successfully copied over the .dtb file, you'll need to unmount the boot partition:
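For example, using the mount point from above:
sudo umount /mnt/boot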
Now you can change the DIP switches to boot from eMMC and reboot by running reboot in the serial console.
Once the NavQ starts booting from the eMMC, you'll want to press enter repeatedly until you get to the U-Boot console prompt. Once you're in the U-Boot prompt, you'll need to run the following commands:
Your NavQ should then boot into the Ubuntu rootfs.
The default username and password for this image is:
Note there are other usernames and passwords used for the SDK and Demo (Ubuntu) images.
Voila! You're done!