
NXP 8MMNAVQ: NavQ Companion Computer

NavQ

NavQ Linux companion computer platform with Vision for Mobile Robotics based on NXP i.MX 8M Mini SOC. Found at: https://nxp.gitbook.io/8mmnavq/

NavQ was used in the HoverGames2 program, and this document provides supporting information. However, it was not commercialized. The NavQPlus (NavQ+) is the upgraded replacement for NavQ, available on NXP.com and through global distributors.

Note that some software and information here is still completely relevant to NavQPlus. NavQ may also be referred to as 8MMNavQ, MR-8MMNavQ, or RDDRONE-8MMNavQ.

NavQ Mounted on HoverGames drone

Also take a look at some of our other Gitbooks:

  • 8M Plus NavQ - newer supported version, available on NXP.com

  • HoverGames challenges

  • HoverGames Drone

  • NXP Cup - car/buggy racing series

  • UCANS32K146 UAVCAN/CAN Node

  • UCANS32K UAVCAN Node

  • RDDRONE-BMS772 Battery Management System

The 8MMNavQ is a small, purpose-built, experimental Linux computer based on the NXP i.MX 8M Mini SOC. It is focused on the common needs of mobile robotics systems.

The system is built as a stack of boards. The top board is a SOM (System on Module) containing the processor, memory, and other components with strict layout requirements, while the secondary boards are relatively inexpensive (often four-layer boards), allowing customized versions to be built easily.

This is a new set of boards and software enablement that will undergo several iterations. Our intent is to provide a "friendly Linux" with typical packages and additional tools included, rather than the highly optimized and stripped-down Linux typically found in deeply embedded products.

Please check for Linux updates regularly. Feedback and needs will be incorporated and updated as much as possible and reasonable.

There is a discussion forum here for questions specifically about NavQ, and a general HoverGames forum here.

The 8MMNavQ features:

  1. NXP i.MX 8M Mini SOM with LPDDR4 DRAM and eMMC Flash.

  2. A secondary board with SDCARD, Networking, MIPI-CSI (Camera) and MIPI-DSI (Display) interfaces

    • MIPI-DSI to HDMI converter

    • A Google Coral camera module

  3. A third HGI (HoverGames Interposer board) with common interfaces and specific drone and rover interfaces which follow PX4 standards.

Applications

The NavQ is suitable for many purposes, including generic robots and various vision systems.

  • Drones, QuadCopters, Unmanned Aircraft, VTOL

  • Rovers

  • Road going Delivery Vehicles

  • Robotic Lawnmowers

  • Robotic Vacuum

  • Flying vehicles (PX4)

  • DIYRobotCars

  • Marine vessels

  • Camera and Vision processing modules

  • Time of Flight (TOF) Cameras

  • AI/ML inference

  • Cellular gateway

  • Vision systems in other applications

    • e.g. a hospital bed monitor that detects if a patient is sitting up or at risk of falling out of bed.

Two specific complete developer tool examples are the NXP HoverGames Drone, and the NXP-CUP car.

The NavQ was prepared with the intention of working with and supporting the NXP HoverGames drone program.

Software

The intent of the 8MMNavQ in HoverGames is to enable participants with a solution that allows them to harness common robotics packages and libraries such as:

  • ROS

  • OpenCV

  • GStreamer

  • Tensorflow

  • pyeIQ

  • And more!

The 8MMNavQ runs Linux with a package manager, so you should be able to install the packages that you need to complete your projects successfully and efficiently.

Creative Commons License This work is licensed under a Creative Commons Attribution 4.0 International License.

NavQ and HoverGames

NavQ is a companion computer reference design for HoverGames and commercial development of drones and rovers.

NXP HoverGames Drone Software Competition

While the 8MMNavQ is a standalone computer, it has been designed with the NXP HoverGames coding competition in mind, and specifically the NXP KIT-HGDRONEK66 drone kit using the RDDRONE-FMUK66 flight controller.

  • www.HoverGames.com

  • https://nxp.gitbook.io/HoverGames

As a result of HoverGames planning, the NavQ also makes a great companion computer for many other PX4 or ArduPilot flight controllers.

HoverGames-specific features include:

  • NavQ can connect to HoverGames (RDDRONE-FMUK66)

    • via serial

    • via Ethernet (using 100BaseT1 2 wire ethernet adapter)

    • via USB (requires specific configuration)

  • RGB LED onboard for status reporting

  • USB-C console for debugging

  • Power input via USB-C or JST-GH power header

  • MicroUSB port for peripherals (hub, usb cameras, sensors)

  • IX industrial Ethernet jack

  • Serial ports using JST-GH connectors

  • I2C/SPI port using JST-GH connectors

  • 3 wire LED strip connector with power supply

  • Wifi and Bluetooth

  • MIPI camera interface (Google Coral Camera default)

  • eMMC and removable SDCard memories

  • MIPI DSI for display (particularly for Rover applications)

NavQ Examples with NXP HoverGames Drone

Here are a few videos of some of the capabilities of NavQ:

Current HW/SW status

Rolling updates on the status of hardware and software

Hardware

  • 5/10/2020 - Production Hardware in manufacturing

Software

October demo update

The changelog for the October Demo Image update is as follows:

  • Added support for SLCAN devices

  • Added SPI support through the userspace SPI driver "spidev"

  • Fixed issues with SDMA firmware loading

  • Fixed an issue with Bluetooth sometimes not working after a reboot

NavQ development updates

As of July 17 2020:

  • The Ubuntu 20.04 image is fully working with the package manager, ROS, UART, WiFi, etc.

As of July 8 2020:

  • An Ubuntu 19.10 image was built and tested with the Google Coral camera. A picture was successfully taken using this image. A working image with the following features should be finished Soon™:

    • Gstreamer + OpenCV w/ working camera

    • UART communication to NXP RDDRONE-FMUK66 for offboard control using MAVROS

    • WiFi connection for communication with NavQ

      • Would enable streaming video back to base station for processing as well as creating in-house user interfaces for controlling the HoverGames drone

As of July 7 2020:

  • NXP Yocto Debian image

  • 3rd party Ubuntu-like image

    • Works, can install ROS/OpenCV, but most hardware doesn't work (camera, HDMI, etc.)

  • 3rd party Debian image

  • EmCraft Linux image

    • Works, has desktop, has OpenCV/Gstreamer, no ROS

Hardware Overview

NavQ SOM

The SOM includes the processor, RAM, memory, and WiFi chip for the NavQ.

Summary

The NavQ SOM (System on Module) contains the brains of the NavQ. On this board, we have our i.MX 8M Mini processor, 2GB of LPDDR4 memory, 16GB of eMMC flash storage, and a QCA9377 WiFi AC + BT 5.0 chip. There are connectors on the bottom of this board that allow for modularity.

Components

  • NXP i.MX 8M Mini Processor: Quad ARM Cortex-A53, Cortex-M4 @ 1.8GHz

  • LPDDR4 Memory: 2GB

  • eMMC Flash: 16GB

  • Qualcomm WiFi/BT: 802.11ac + BT 5.0

Dimensions

The NavQ's dimensions are 3" W x 2" L x 7/16" H.

Media Board

Summary

The Media Board consists of an SD Card slot, MIPI connectors for a camera serial interface as well as a display serial interface, and a PCIe connector.

Components

  • SD Card slot: MicroSD card compatible

  • MIPI CSI: Google Coral Camera connection

  • MIPI DSI: MIPI to HDMI adapter for full desktop

Image of Google Coral Camera + MIPI to HDMI Adapter:

Google Coral Camera Dimensions

The Google Coral Camera dimensions are 25mm x 25mm, as shown in the tech specs here:

https://coral.ai/products/camera/#description

Hovergames Interposer Board (HGI)

Summary

The HoverGames Interposer Board (HGI) is the final board in the stack and has a multitude of I/O for your needs in HoverGames. Connect sensors, switches, and LEDs to the NavQ using the HGI to drastically improve your drone system, and even control your drone using NavQ using offboard control with MAVROS.

Components

  • UART: 3 UART ports for serial communication through JST-GH connectors.

  • USB-C: Powers the board and serves as an interface for flashing new firmware.

  • MicroUSB: Connect USB devices to this port such as keyboards and mice. USB hub included.

  • SPI: JST-GH connector for SPI interface.

  • Hirose IX Industrial Ethernet: IX Industrial Ethernet cable is included.

  • 2 Wire Automotive Ethernet: JST-GH connector for 2-wire Ethernet as well as GPIO.

  • JTAG: Pads are available for JTAG. You may solder your own JTAG connector.

  • Boot Mode Switches: You can use these switches to boot from eMMC or SD card, or boot into fastboot.

  • GPIO: Through-hole points for GPIO headers.

  • RGB LED: WS2811 RGB LED available for status.

Pinouts and Connector info

HoverGames Interposer Board (HGI)

The HoverGames Interposer Board has a large amount of connectors and I/O for you to connect your devices, sensors, switches, LEDs, and more. This section will give you an overview of the connector pinouts on the HGI. The picture below has a few silkscreen labels for pinouts on each connector, but some connectors have multiplexers to make them more flexible.

Overview of the HoverGames Interposer Board

UART2

UART2 is used for the serial console and should not be used for anything else or altered.

UART3

UART3 will mainly be used for serial communication to the FMU in HoverGames, but it can be used as an SPI port if you're not using it for the drone.

UART4/I2C/GPIO

The bottom 9 pin JST-GH connector in the image of the HGI is used for UART4/I2C/GPIO.

The UART4 port on this connector is tied to the ARM Cortex-M4 core. It is not available for use in Linux.

The UART4 pins on this connector do not have flow control, and there is no multiplexing on this connector either. The pinout is below.

Linux GPIO Pin IDs

  • GPIO1_IO10 (JST-GH pin 6): Linux GPIO ID 10

  • GPIO1_IO12 (JST-GH pin 7): Linux GPIO ID 12

  • GPIO1_IO14 (JST-GH pin 8): Linux GPIO ID 14

SPI/GPIO

The SPI/GPIO port has a full pinout for SPI as well as 3 GPIO pins. The SPI pins can be muxed to a full UART 4 port with flow control. The pinout is below.

Linux GPIO Pin IDs

  • GPIO1_IO11 (JST-GH pin 6): Linux GPIO ID 11

  • GPIO1_IO13 (JST-GH pin 7): Linux GPIO ID 13

  • GPIO1_IO15 (JST-GH pin 8): Linux GPIO ID 15

GPIO Headers

The GPIO header pads on the HGI are not labeled correctly on the silkscreen. The correct layout is shown below with TP labels and schematics.

GPIO Pin Labels
GPIO pinout for labels

Schematics

If you would like to take a look at the schematics of each board in the NavQ, you can download the .zip file at the bottom of this page. The table below gives detail for each folder inside the .zip file.

  • RDDRONE-8MLPDDR4: Schematics for the top SoM board, with the CPU, RAM, eMMC, and WiFi/BT chips

  • RDDRONE-HGI8MM: HoverGames Interposer Board schematics with GPIO, UART, I2C, SPI, Hirose IX Industrial Ethernet, and more

  • RDDRONE-MEDIA8MM: Media Board schematics with MIPI CSI/DSI, PCIe, and SD Card

  • RDDRONE-MIPIHDMI: Schematic for the MIPI HDMI board

  • RDDRONE-T1ADAPT: Schematic for the 100BaseT1 2-Wire Automotive Ethernet Adapter board

  • 8m-lpddr4-som-2a.pdf (4MB) - RDDRONE-8MLPDDR4 Schematic

  • RDDRONE-HGI8MM.pdf (6MB) - RDDRONE-HGI8MM Schematic

  • RDDRONE-MEDIA8MM-1A.pdf (6MB) - RDDRONE-MEDIA8MM Schematic

  • RDDRONE-MIPIHDMI-1A.pdf (1MB) - RDDRONE-MIPIHDMI Schematic

  • RDDRONE-T1ADAPT.pdf (2MB) - RDDRONE-T1ADAPT Schematic

Disclaimers

  • NavQ as provided is an experimental board

  • NXP Friendly Linux is experimental and in development

NXP, EMCRAFT AND SOFTWARECANNERY, provides the enclosed product(s) under the following conditions:

This reference design is intended for ENGINEERING DEVELOPMENT OR EVALUATION PURPOSES ONLY. It is provided as a sample IC pre-soldered to a printed circuit board to make it easier to access inputs, outputs, and supply terminals. This reference design may be used with any development system or other source of I/O signals by simply connecting it to the host MCU or computer board via off-the-shelf cables. Final device performance in an application will be heavily dependent on proper printed circuit board layout and heat sinking design, as well as attention to supply filtering, transient suppression, and I/O signal quality.

The goods provided may not be complete in terms of required design, marketing, and or manufacturing related protective considerations, including product safety measures typically found in the end product incorporating the goods. Due to the open construction of the product, it is the user's responsibility to take any and all appropriate precautions with regard to electrostatic discharge. In order to minimize risks associated with the customers applications, adequate design and operating safeguards must be provided by the customer to minimize inherent or procedural hazards.

For any safety concerns, contact NXP, EMCRAFT AND SOFTWARECANNERY, sales and technical support services. Should this reference design not meet the specifications indicated in the kit, it may be returned within 30 days from the date of delivery and will be refunded or replaced with a new kit. NXP, EMCRAFT AND SOFTWARECANNERY, reserves the right to make changes without further notice to any products herein.

NXP, EMCRAFT AND SOFTWARECANNERY, makes no warranty, representation or guarantee regarding the suitability of its products for any particular purpose, nor does NXP, EMCRAFT AND SOFTWARECANNERY, assume any liability arising out of the application or use of any product or circuit, and specifically disclaims any and all liability, including without limitation consequential or incidental damages. Typical parameters can and do vary in different applications and actual performance may vary over time. All operating parameters, including Typical, must be validated for each customer application by customer’s technical experts.

NXP, EMCRAFT AND SOFTWARECANNERY, does not convey any license under its patent rights nor the rights of others. NXP, EMCRAFT AND SOFTWARECANNERY, products are not designed, intended, or authorized for use as components in systems intended for surgical implant into the body, or other applications intended to support or sustain life, or for any other application in which the failure of the NXP, EMCRAFT AND SOFTWARECANNERY, product could create a situation where personal injury or death may occur.

Should the Buyer purchase or use NXP, EMCRAFT AND SOFTWARECANNERY, products for any such unintended or unauthorized application, the Buyer shall indemnify and hold NXP, EMCRAFT AND SOFTWARECANNERY, and its officers, employees, subsidiaries, affiliates, and distributors harmless against all claims, costs, damages, and expenses, and reasonable attorney fees arising out of, directly or indirectly, any claim of personal injury or death associated with such unintended or unauthorized use, even if such claim alleges NXP, EMCRAFT AND SOFTWARECANNERY, was negligent regarding the design or manufacture of the part.

Revisions and Errata

Fault_N on USB C chips

Board/Rev: RDDRONE-HGI-1A

Issue: PTN5110 FAULT_N should be pulled up and connected to the INT pin of the NX20P3483UK.

Action: Mask this interrupt in the FAULT_STATUS_MASK register (0x15), bit 4, Force Discharge Failed Interrupt Status Mask (to be zeroed). Resolve this in software, since both the PTN5110 (TCPC) and the NX20P3483 (power switch) are connected to the I2C bus; polling the NX20P3483 will address this erratum.

This issue will be resolved in the next board revision (RDDRONE-HGI-2A).

SD 3.0 Power Switch is missing

Board/Rev: RDDRONE-MEDIA-1A

Issue: The design is missing a power switch required by the SD 3.0 standard.

Action: Disable SD 3.0 support in U-Boot and/or avoid SD 3.0 cards.

This issue will be resolved in the next board revision (RDDRONE-MEDIABOARD-2A), where the SD 3.0 power switch will be added.

3rd Mounting hole on carbon fiber plate

Issue - The 3rd mounting hole for the NavQ was missed on the carbon fiber mounting plate. This hole is needed to secure the board and avoid vibrations.

Action - Please drill a 3mm hole manually or use double-sided tape on this side of the board mounting. This must be done to avoid vibration of the board during flight.

NavQ add on modules

These components are designed to interface and work with the NavQ.

Unless otherwise indicated, the following add-on modules are not included with the NavQ HoverGames 2 kit (HG2). In addition, they may not all be available, and some may be experimental in nature.

  • USB-UART serial debug console (included with HG2 Kit)

  • LTE CAT-M1 cellular modem

  • PMDTEC Time of Flight (TOF) Camera

  • Lighthouse tracking module

  • NXP 100BaseT1 2-wire Automotive Ethernet

  • Edgelock SE050 Secure Element

  • 5" high resolution MIPI-DSI LCD

  • MIPI-DSI to HDMI adapter

  • PCIe M.2 module - Kingston SSD

  • Non contact gesture tracking module

  • Kingston eMMC, LPDDR4, Industrial SDCARD

  • RJ45 breakout.

RDDRONE-T1ADAPT

100BaseT1 2-Wire Automotive Ethernet media converter

RDDRONE-T1ADAPT TOP view
RDDRONE-T1ADAPT Bottom view - USB-C alternative power input connector

Additional RDDRONE-T1ADAPT media converter details can be found here: https://nxp.gitbook.io/rddrone-t1adapt/

TJA110x 2-Wire Ethernet

NXP's TJA1101 is an Ethernet PHY that provides a two-wire 100BaseT1 Ethernet interface. The Ethernet MAC side of this interface is not unusual, and the traffic on the line is "regular Ethernet".

NXP's Flight controller for Mobile Robotics - RDDRONE-FMUK66 includes a 2-wire Ethernet interface on board. In order to connect this with the 8MMNavQ this media converter can be used. The RDDRONE-T1ADAPT is also useful when connecting to other experimental modules such as V2X or an Automotive 5/10 port switch.

Connecting the RDDRONE-T1ADAPT

On the RDDRONE-T1ADAPT, power is supplied via a 3-pin JST-GH connector. There is a matching 3-pin JST-GH connector on the 8MMNavQ; a simple 1:1 cable is used. Optionally, a USB-C cable can be used to provide a power (only) connection. A 2-pin JST-GH connector is used for connecting Ethernet between this board and another, such as the RDDRONE-FMUK66, again using a simple 1:1 cable.

There are also locations marked on the bottom side of the board for soldering in wires for both power and 2-wire Ethernet.

RDDRONE-T1ADAPT Software

There is a small NXP LPC processor on board to configure the back to back PHYs and manage setup and LEDs. This board comes pre-programmed and there is no user software required. Contact hovergames@nxp.com or your local NXP representative if there is a specific need to access the software.

Engineering Sample - 100BaseT1 Ethernet media converter

5" LCD panel module

5" developer LCD panel

For development use on a rover or ground station, there is a 5" LCD panel that can attach to the NavQ MIPI-DSI output.

Specs: <TODO>

LTE Cat M1 modem

This is a Murata Type1SC LTE Cat M1 cellular modem, a low-bandwidth solution for IoT.

This LTE Cat M1 modem is a small form factor module for development that includes the PX4 type connectors suitable for connection with the NavQ.

Note that at this time any software for this module is the responsibility of the user. Murata and Emcraft may have example or reference code to follow.

Details may be found here: Murata Type1SC LTE Cat M1 Cellular modem

Cellular IoT solutions are new standards defined by the 3GPP group to answer requirements such as low power, long range, and low data rate usage. Two standards currently exist; LTE Cat M1 is one of them and is intended for low-bandwidth, intermittent use, as expected for an IoT-type device.

Cat M1 can deliver secure, world-wide coverage by using the same base stations, public networks and power supplies as mobile phones.

It is important to recognize that Cat M1 is not the same as 4G LTE data bandwidth you have come to expect on your cell phone data plan. This is not suitable for streaming video, instead it would be intended more for intermittent low bandwidth telemetry or sensor data.

Murata Type1SC LTE Cat M1 Cellular modem (low bandwidth)

Generally speaking LTE Cat M1 modems are designed to support these features:

  • High Security, Encrypted communication and FOTA

  • Low consumption capable of 10+ years of battery life

  • Wide coverage using existing smartphone networks

  • Low cost - Reduction of R&D and operational cost

  • GPS-free geolocation solution

  • Large network capacity w/ reduction of data rate

LTE Cat M1 (Release 13) specifications:

  • Specification: Based on LTE

  • Bandwidth: Up to 1.4MHz

  • Peak DL data rate: 300kbps

  • Peak UL data rate: 375kbps

  • Frequency deployment: LTE in-band

  • Duplex mode: Half or full duplex

  • Voice/Data support: Voice & data

  • Mobility: Yes

  • Tx Power: 20, 23dBm

Targeted Applications

  • Critical applications: healthcare, smart factory, security

  • Low latency: emergency devices, smart cities

  • Geolocation: asset tracking, wearables, fleet management

Ordering Info

How to order your own NavQ

To order a NavQ, you'll currently need to purchase directly from the Emcraft website. If you are a HoverGames 2 participant and your application has been accepted, be sure to use your coupon code.

NavQ has been superseded by NavQPlus. As of December 2022, there are no original NavQ units available to the general public (commercial customers can work directly with EmCraft on build-to-order). Following the HoverGames3 competition, a commercial version of the NavQPlus EVK will be available in 2023 on NXP.com here: https://www.nxp.com/design/designs/navqplus-ai-ml-companion-computer-evk-for-mobile-robotics-ros-ground-stations-and-camera-heads:8MPNAVQ

See links below

  • NXP 8mmNavQ Starter Kit: https://emcraft.com/products/1125#starter-kit

  • Direct link to buy 8mmNavQ: https://store.emcraft.com/shoppingcart.asp?ProductCode=nxp-8mmnavq-kit

Getting Started

Quick start Guide

Guide to get the 8MMNavQ up and running quickly

Getting started with your NavQ kit

The NavQ is a device that will allow you to add extra compute to your HoverGames drone system. With an i.MX 8M Mini processor, you will be able to reach new boundaries of vision and sensor data processing.

Current Demo build version

The current Demo build was built on 7/24/2020. Confirm you are on the correct image by running uname -a. You should get the following output:

Linux imx8mmnavq 5.4.24-2.1.0+gbabac008e5cf #1 SMP PREEMPT Fri Jul 24 23:17:18 UTC 2020 aarch64 aarch64 aarch

Notice about SD Card Slot

The SD Card slot on the NavQ is sandwiched between the Media Board and the HoverGames Interposer Board. There are several important components underneath the SD card slot. We highly recommend that you be very careful when using the SD card slot so the components are not damaged. One notable component is the USB controller - it is quite small, so if it gets damaged, you won't be able to use USB devices over the MicroUSB port. One way to be safe when inserting or removing the SD card is by using some tweezers as seen in the image below.

Be very careful when inserting or removing the SD Card!

Default username/password for the Demo image

The SD card included in the NavQ kit is preloaded with our HoverGames-Demo Linux distribution. The default username and password are:

Username: navq Password: navq

Powering the NavQ

To power your NavQ, there are two options. The first is to use one of the included USB-C cables and connect it to a USB port on your computer.

Location of USB-C Port on the NavQ.

The other option is to power it through one of the included connectors in your NavQ kit. These connectors plug into the 5-pin POWER port next to the boot switches on your NavQ. You may use the barrel connectors or the XT60 power breakout connector. Some images and more details can be found here:

Power Cables

Accessing the serial console

To access the serial console on your NavQ, attach one of the included USB-C cables to the USB->UART adapter included in your kit. You can use programs such as PuTTY to access the serial console. A full guide to do this is linked below.

Serial root console

Expanding the SD card

When your board arrives, the Demo image will already be loaded to the SD card. This image does not take up the full amount of space on the SD card, so you'll need to expand the space in order to install more packages such as ROS or OpenCV. Follow the guide here to do so:

Expand space on SD card/eMMC

Mounting your NavQ to the HoverGames Drone

Follow the guide linked below to mount your NavQ to your drone:

Mounting NavQ on HoverGames Drone

Using your NavQ as a desktop computer

Depending on which Linux distribution is loaded, you may find that the NavQ includes a desktop environment. This may be a minimal desktop with only a terminal emulator, or it may be more feature-rich, like the Liri Desktop.

Liri Desktop is not yet supported. Currently we only have a basic Wayland desktop with a terminal. You can run GUI applications through the terminal. Try installing firefox with apt and run it!

MIPI-DSI to HDMI adapter

Display output is driven on the MIPI-DSI port; if a compatible LCD panel is attached, the desktop will be visible there. Most users will have access to a standard HDMI monitor, so a MIPI-DSI to HDMI adapter is also included in the kit.

Connecting a mouse and keyboard

In order to connect both a mouse and keyboard to the NavQ you will need to connect the included microUSB to USB-A hub. Other USB peripherals may also be supported but need to be tested as it is not guaranteed that all USB drivers will be available.

Recording a video with NavQ

GStreamer

You can use GStreamer to take 1080p 30fps video. This uses the included H264 encoding plugin for i.MX 8M Mini. Here's an example pipeline you can run on your NavQ to take video:

$ sudo gst-launch-1.0 v4l2src ! vpuenc_h264 ! avimux ! filesink location='video.avi'

When you want to end the video, just press Ctrl+C to cancel the pipeline, and the file should be saved to the present directory.

OpenCV

To record video with your NavQ, you can run this simple python script that uses OpenCV to write video to a file:

record_video.py (353B)
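If the attachment is not available to you, the sketch below shows roughly what such a script looks like. It is only an approximation of the attached file and assumes the Coral camera is exposed as V4L2 device index 0; adjust the device index, resolution, and codec for your setup.

import cv2

# Open the camera (assumption: the Coral camera appears as V4L2 device index 0)
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)

# MJPG is a widely available software codec for .avi containers
fourcc = cv2.VideoWriter_fourcc(*'MJPG')
out = cv2.VideoWriter('video.avi', fourcc, 30.0, (1920, 1080))

try:
    while True:
        ok, frame = cap.read()      # grab a frame from the camera
        if not ok:
            break
        out.write(frame)            # append the frame to video.avi
except KeyboardInterrupt:
    pass                            # stop recording with Ctrl+C
finally:
    cap.release()
    out.release()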

This is a simple example that you can use as a starting point for even bigger things with OpenCV/computer vision! If you'd like a more sophisticated guide that runs through example code to detect red objects, head to the developer guide on OpenCV to find more.

Controlling HoverGames drone with NavQ

To perform off-board control of the HoverGames drone from the NavQ, you'll need to get a little bit involved with ROS + MAVLink (MAVROS). To see a guide on how to get started, head over to the developer guide!

Connecting to WiFi

A package named connman is included in the image to help you connect to WiFi through the command line. To connect to WiFi, run the following commands:

$ connmanctl
connmanctl> enable wifi
connmanctl> scan wifi
connmanctl> services

WIFI_SSID     wifi_e8de27077de3_41483034303434393134_managed_psk

connmanctl> agent on
connmanctl> connect wifi_e8de27077de3_41483034303434393134_managed_psk
<enter passphrase>
<wait for connection success message>
connmanctl> exit
$ ping www.nxp.com

When you run services, there may be duplicates of each WiFi network. Try to connect to your WiFi network with each key until it works. Sometimes the first one works, and sometimes the second one works. You will get a connection successful message if it works correctly.

Your NavQ should automatically connect to WiFi when rebooted. If you want to connect to another WiFi network, just go through the same process again.

Transferring files to and from NavQ over FTP

If you need to transfer files to and from the NavQ over a wired or wireless connection, you can use FileZilla to access the NavQ's FTP server. First, you'll want to connect the NavQ to your local network (WiFi or Ethernet) and run ifconfig to find the IP address that was assigned to your NavQ. Then, use FileZilla to connect to that IP with the username navq and password navq.

A guide on how to use FileZilla is here:

Using - FileZilla Wiki
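As a scriptable alternative to FileZilla, the sketch below uses Python's standard ftplib module. It assumes the NavQ really does expose a plain FTP service with the navq/navq credentials as described above; 192.168.1.50 is only a placeholder for the address reported by ifconfig.

from ftplib import FTP

NAVQ_IP = "192.168.1.50"   # placeholder, use the address reported by ifconfig on the NavQ

ftp = FTP(NAVQ_IP)
ftp.login(user="navq", passwd="navq")

# Upload a local file to the NavQ home directory
with open("my_script.py", "rb") as f:
    ftp.storbinary("STOR my_script.py", f)

# Download a file from the NavQ
with open("video.avi", "wb") as f:
    ftp.retrbinary("RETR video.avi", f.write)

ftp.quit()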

Next steps

Now that you've gone through the Quick Start Guide, you can move on to the Developer Guide if you'd like to go more in depth. Use the sidebar on this Gitbook to navigate to the next section.

Power input

Power input specification

The NavQ can take a voltage of 5V to 20V.

The board may also be powered via USB-C, which also has a 20V maximum input rating.

Note that the voltage regulator (MP8759GD) on the external JST-GH power connector has an absolute maximum rating of 26V. According to the schematics, there is a 20V TVS diode (PTVS20VS1UR) at that input before the regulator, with a breakdown voltage of 22.2V to 24.5V according to its datasheet.

Any experimentation above 20V input should take this into account and be done at your own risk.

Power input protection

There is some, but limited power input protection on board. Given this is typically for USB-C application, it may not be sufficient in harsh operating environments. You may want to provide some additional reverse polarity protection, over-current, or DC-DC conversion/isolation from the battery if you expect to be experimenting outside the HoverGames normal operating range or treating it harshly.

Battery supply protection

Please monitor your LiPo battery carefully for undervoltage conditions. If left connected, the NavQ and HoverGames drone will completely drain your LiPo battery and could cause permanent damage to your battery. There are no undervoltage disconnect provisions built in.

Low battery monitor

A simple solution to undervoltage protection may be to add a hobby grade low battery monitor alarm. These sound a loud alarm when any cell goes below a user set threshold voltage. The LED display will also show the individual cell voltage and total pack voltage. This plugs into the balance connector of the LiPo battery. They are inexpensive and available at hobby stores or online at typical outlets. Here are some examples:

  • Amazon - "Battery Voltage Checker Alarm"

  • Amazon - "Battery Voltage Monitor Low Voltage"

Power Cables

Options and cables for powering the NavQ

NavQ Cables

The NavQ ships with several cables. They allow for several configurations when powering the NavQ.

USB-C

When working on the bench, you may wish to power the NavQ from the included USB-C input cable. It is also possible to power the NavQ while on a drone from a separate USB-C battery pack, like those used for charging cell phones.

Please note that some USB-C ports on charging adapters and particularly on unpowered Hubs may not supply enough current to run the NavQ and all the peripherals. If you notice a booting failure, please first try powering from an external battery or known high power USB-C charger or power supply.

Drone power cable supply

JST-GH Power input cable connector (USB-C connector also available on other side of board)
JST-GH Power input cable connector
3-way splitter-extender (optional), XT60 adapter and "bullet" adapter provided
Choose XT60 or Bullet connectors for powering your NavQ

Downloading & Flashing the HoverGames-Demo Linux Image

Preparing the image

The HoverGames images come as a .bz2 compressed archive. To decompress this image, you'll need to use a program like 7zip on Windows, use the bunzip2 command on Linux, or double click the archive on Mac.

Flashing the image to eMMC or SD card using UUU

Downloading UUU

NXP has a tool for flashing i.MX hardware called UUU (Universal Update Utility). You can download UUU from here:

It is recommended to download the "Latest Release", not the "Pre-Release" at the top of the page.

Releases · NXPmicro/mfgtools (GitHub)

If you're on Windows, you'll want to download the uuu.exe file, and if you're on Linux, you'll want to download the uuu file.

Downloading the Linux image

You must agree to all of the applicable licenses and agreements at the following link before downloading the Linux software. It is hosted here:

HoverGames EULA | NXP Semiconductors

Downloading the bootloader

NOTE: This file is only needed for flashing with UUU to the eMMC/SD Card. If you want to flash your SD Card with dd or Win32DiskImager, this file is not needed.

imx-boot-imx8mmnavq-sd.bin-flash_navq (Google Docs)

Step 1

Flip the DIP switches on your NavQ to put it into USB flashing mode (boot from USB in the image below). Here is an image that shows how to do so:

Once you're done flashing, you can use this image to select the boot mode: eMMC or SD card.

Step 2

Connect your NavQ to your computer using the included USB-C cable. You should receive a message on your computer that it has been connected. To make sure the NavQ is connected, you can run the UUU program with the -lsusb flag; you should see output similar to this:

Linux
-----
$ ./uuu -lsusb

Windows
-------
$ .\uuu.exe -lsusb

Output
------
uuu (Universal Update Utility) for nxp imx chips -- libuuu_1.3.191-0-g4fe24b9

Connected Known USB Devices
        Path     Chip    Pro     Vid     Pid     BcdVersion
        ==================================================
        2:7      MX8MM   SDP:    0x1FC9 0x0134   0x0101

Step 3

You can flash both the SD card and the eMMC using this tool. The keyword for flashing the SD card is sd_all, while the keyword for flashing the eMMC is emmc_all. The command to flash your board is outlined below:

There are advantages and disadvantages to each storage medium. eMMC is faster, but is locked to 16GB size and is non-removable. SD cards can be of any size you like and are removable, but they are quite a bit slower.

Linux
-----
$ ./uuu -b [emmc_all|sd_all] <.bin-flash_navq> <.wic.bz2 OR .img>

Windows
-------
$ .\uuu.exe -b [emmc_all|sd_all] <.bin-flash_navq> <.wic.bz2 OR .img>

After a few moments, your board should be flashed. Unplug your NavQ from power, reset the DIP switches to the desired boot device, and you're good to go!

Flashing the image to SD card using dd or Win32DiskImager

To flash the image, you'll need to use dd on Linux/Mac or Win32DiskImager on Windows.

Linux

Replace the X in "/dev/sdX" with the letter of your SD card in linux. You can use a program like "GParted" or "Disks" to find the letter of your SD card.

$ sudo dd if=<path to .wic file> of=/dev/sdX bs=1M status=progress

Mac

Replace the X in "/dev/rdiskX" with the number of your SD card in Mac. You can use diskutil list to find the number of your SD Card.

$ sudo diskutil unmountDisk /dev/rdiskX
$ sudo dd if=<path to .wic file> of=/dev/rdiskX bs=1m

Windows

Download Win32DiskImager:

Win32 Disk Imager (SourceForge)

Currently the HoverGames-Demo Linux image is packaged as a .img file. In future releases, it may be packaged as a .wic.bz2 file. If it is packaged as a .wic.bz2 file, you'll want to extract the .bz2 file before flashing using Win32DiskImager.

Open the program and select your SD card. Choose the .wic OR .img file, then click "Write".

Expand space on SD card/eMMC

Easy-to-use script

We have created an easy-to-use script to resize the flashed image on your device. Since the image is smaller than the storage device, it is not properly expanded when first flashed. You can run the following script to expand the filesystem.

This script should be in the home folder of the demo image. If it isn't there, or if you lost it and need it, it is linked below.

resizeDisk.sh (729B)

To run the script, run the following commands for the boot device you're currently using:

For eMMC:

$ sudo ./resizeDisk.sh eMMC

For SD card:

$ sudo ./resizeDisk.sh sd

Making the script executable

The script should be executable when it is included in the image. If it isn't, then you'll need to make it executable by running the following command:

$ chmod a+x ./resizeDisk.sh

Manually expanding space

If you don't want to use the script, you can run the commands below.

You may run into an issue where you run out of space on the eMMC or SD card when installing ROS. To expand the rootfs partition, follow these steps:

If you're on the eMMC, you'll use /dev/mmcblk2. If you're on an SD card, use /dev/mmcblk1. By default the NavQ boots from the included SD card in your kit.

$ sudo fdisk /dev/mmcblk1

Command (m for help): p

Device           Boot      Start   End      Blocks   Size   Id  System
/dev/mmcblk1p1   *         16384   186775   170392   83.2M  c   W95 FAT32 (LBA)
/dev/mmcblk1p2             196608  <end>   <blocks>  <size> 83  Linux

Command (m for help): d
Partition number (1,2, default 2): 2

Partition 2 has been deleted.

Command (m for help): p

Device           Boot      Start   End      Blocks   Size   Id  System
/dev/mmcblk1p1   *         16384   186775   170392   83.2M  c   W95 FAT32 (LBA)
   
Command (m for help): n
Partition type
   e   extended
   p   primary partition (1-4)
Select (default p): p
Partition number (1-4, default 1): 2
First sector (2048-30621695, default 2048): 196608
Last sector, +sectors or +size{K,M,G} (2048-30621695, default 30621695): <press enter for default>

Created a new partition 2 of type 'Linux' and of size 14.5 GiB.
Partition #2 contains a ext4 signature.

Do you want to remove the signature? [Y]es/[N]o: n

Command (m for help): p

Device           Boot      Start   End      Blocks   Size   Id  System
/dev/mmcblk1p1   *         16384   186775   170392   83.2M  c   W95 FAT32 (LBA)
/dev/mmcblk1p2             196608  30621695 39425088 14.5G  83  Linux

Command (m for help): w
The partition table has been altered!

Once you're done with those steps, run this command:

$ sudo resize2fs /dev/mmcblk1p2

and reboot. You should now be able to install ROS Melodic without size issues.

Commands for fdisk

If you prefer to just see the commands, these are the commands you need to run in fdisk in order to resize your disk.

d <enter>
2 <enter>
n <enter>
p <enter>
2 <enter>
196608 <enter>
<enter>
n <enter>
w <enter>
<fdisk should exit>

$ sudo resize2fs /dev/mmcblk2p2 <enter> (FOR eMMC)
$ sudo resize2fs /dev/mmcblk1p2 <enter> (FOR SD CARD)

Serial root console

Connecting to the root console on NavQ using the USB-UART adapter

The root console will allow monitoring of the board from initial boot. Since modern PCs don't tend to include serial ports anymore, a small FTDI USB-UART adapter is provided to convert the serial port to USB.

Hardware connection

  • Plug the 6 pin JST-GH cable from the USB-UART converter into NavQ connector "UART2/I2C2"

  • Plug the USB-UART into your pc like you would a dongle or memory stick

    • Your PC operating system should respond right away when it recognizes the USB-UART. On windows it will play a "usb connect sound"

    • There should be a red light illuminated on the USB-UART board (even when not plugged into the NavQ)

  • Follow the terminal configuration below and power on the NavQ

Troubleshooting Tip: If the COM port does not show in Device Manager or you don't hear the "usb connect sound" double check that the USB-C connector is fully plugged in and seated into the USB-UART adapter board.

Troubleshooting Tip: If the USB-UART is not detected on your PC, in some instances you may need to download the FTDI USB-UART driver software. It can be found here: https://www.ftdichip.com/Drivers/D2XX.htm

Terminal software connection

A terminal program will be required to communicate over the serial console. The following example is for a Windows 10 PC using the terminal program PuTTY.

Default NavQ terminal settings are: 115200 Baud, N, 8, 1 (no Parity, no flow control)

In Windows 10, use Device Manager to determine which COM port is assigned to the FTDI USB-UART serial adapter. The open PuTTY window shown below is the root console of the NavQ.

In Windows 10, Device manager can be used to determine which COM port is assigned

Configuring PuTTY

There are several terminal programs that can be used; this example uses PuTTY on a Windows PC, but other programs and other hosts will also work. The example below shows PuTTY connecting to the serial console via the USB-UART adapter, using the COM port determined in Windows 10 Device Manager above.

Default baud rate is 115200
You may also want to configure the serial console to turn off flow control

Mounting NavQ on HoverGames Drone

How to mount the NavQ using the plate and standoffs provided

The suggested method of mounting the NavQ to the Drone frame is shown below. Improvements and suggestions are welcomed. Please message on community.nxp.com here: https://community.nxp.com/community/mobilerobotics/hovergames-drone-challenge

Use two of the Hex standoffs together to make ~35mm total standoff height between the top plate and the NavQ mounting plate

The NavQ can mount to the small carbon fiber plate using screws, standoffs or double sided tape.

We did allow for mounting the NavQ with screws; HOWEVER, one of the three mounting holes did not get drilled. Please drill the third hole by hand if you wish to mount with screws. Alternatively, you can use two screws and some double-sided tape in place of the missing screw. You should use tape or a third hole, because otherwise the board will vibrate too much.

Note the use of short hex standoffs instead of nuts under the main top plate, because they are easier to hold with needle-nose pliers without needing to disassemble the complete frame.

One modification tried was to add a second standoff "hanging" from the front so that the camera module could be attached between these two front standoffs with double-sided tape.

NavQ mounted on standoff plate with camera mounted on hex shafts

HoverGames-Demo Bugs/Fixes

Ensure your NavQ is working correctly with these fixes

Overview

Currently, the HoverGames-Demo image is in a beta state. Over time we will be improving the image by fixing bugs and including even more software so that it's more of an out-of-the-box experience. Here are some of the current fixes at the moment to get a working system.

These fixes are for the build on 7/24/2020. This page will be updated for each new release.

Quick Workarounds

  • WiFi sometimes does not automatically connect to the last WiFi network after reboot.

    • Workaround: Open connmanctl, run disable wifi, and reconnect using the instructions in the Quick Start Guide.

Setting timezone

Setting the timezone on your NavQ is necessary to ensure the apt package manager works. First, you'll need to locate the correct timezone file under /usr/share/zoneinfo. There should be a folder for your region and a file in that folder for the city closest to you.

$ sudo rm -f /etc/localtime
$ sudo ln -sf /usr/share/zoneinfo/<region>/<city> /etc/localtime

For example, if you're in Central Time USA, you'd use the following commands:

$ sudo rm -f /etc/localtime
$ sudo ln -sf /usr/share/zoneinfo/America/Chicago /etc/localtime

Now, you can run sudo apt update and sudo apt upgrade to get your system up to date.

User Guide

Nano Editor

There are several editors that can be used in Linux. Nano is a popular, lightweight command-line editor. There are many others to choose from.

Some discussion on the topic: https://yujiri.xyz/software/nano

Assuming you already have internet access on your NavQ, to install nano on the demo image type the following.

~$ sudo apt install nano

Setting the NavQ hostname

Suggested proposal for the hostname of NavQ

Since several NavQs might be present on a WiFi network, it's essential to set a unique hostname so you can determine which NavQ is the one you want to connect to.

To change the hostname you need to modify /etc/hostname. We suggest the following format:

navq-[Vehicle Mavlink SysID]

e.g. If Mavlink SysID is 10 the NavQ should be named navq-10

Adjusting the hostname with nano

~$ sudo nano /etc/hostname
navq-10

Adjusting the hostname with echo command

~$ sudo sh -c 'echo navq-10 > /etc/hostname'

(A plain sudo echo navq-10 > /etc/hostname fails for a non-root user, because the output redirection is not performed with root privileges.)

After a reboot the new hostname will be visible on the network.

MavLink Specific details

mavlink-router

mavlink-router routes MavLink data dynamically between several end nodes.

Introduction

To be able to have several end nodes communicating via MavLink simultaneously, we need to set up mavlink-router on the NavQ. The end nodes can be:

  • A process for onboard control running on NavQ.

  • A QGroundControl (QGC) computer the NavQ connects to via a data link such as WiFi.

  • Other mavlink enabled peripherals on the vehicle.

  • Another program running on the same remote PC as QGC

Prerequisites

Set up TELEM2 on the FMU

Connect to your FMU over USB and open QGroundControl. Navigate to Settings -> Parameters -> MAVLink and set these parameters:

Also, you'll need to make sure that the settings in Settings -> Parameters -> Serial look like this:

Installation and compiling mavlink router

To install and compile mavlink-router, follow the steps below (internet access is required on your NavQ).

1) Connect to NavQ console via ssh / serial

2) Type the following commands

~$ mkdir src
~$ cd src
~/src$ git clone https://github.com/intel/mavlink-router.git
~/src$ cd mavlink-router 
~/src/mavlink-router$ git submodule update --init --recursive
~/src/mavlink-router$ ./autogen.sh && ./configure CFLAGS='-g -O2' --sysconfdir=/etc --localstatedir=/var --libdir=/usr/lib --prefix=/usr
~/src/mavlink-router$ make
~/src/mavlink-router$ sudo make install

Mavlink-router configuration

Configuration of mavlink-router is done via a single configuration file, /etc/mavlink-router/main.conf. This file needs to be created from scratch. An example configuration file is available in the mavlink-router sources: https://github.com/intel/mavlink-router/blob/master/examples/config.sample

As of today, the mavlink-router make install step does not create the /etc/mavlink-router directory and main.conf file. Therefore, please use the following commands to create the directory and file initially.

~$ sudo mkdir /etc/mavlink-router

~$ sudo touch /etc/mavlink-router/main.conf

Setup the config file with minimal configuration

~$ sudo nano /etc/mavlink-router/main.conf
#Mavlink router configuration navq
#
[General]
TcpServerPort=5760
ReportStats=false
MavlinkDialect=auto

[UartEndpoint FMUuart]
Device=/dev/ttymxc2
Baud=921600

[UdpEndpoint FMUeth]
Mode = Eavesdropping
Address = 0.0.0.0
Port = 14551

[UdpEndpoint QGConMobile]
Mode = Normal
Address = 192.168.43.1
Port = 14550

The configuration above assumes that the NavQ gets MavLink data from the FMU either via UART3 (/dev/ttymxc2) or via UDP. If you use UART, set the corresponding serial port on the FMU to 921600 baud; the SER_TELx_BAUD parameter (x = number of the telemetry port) needs to be adjusted to 921600 8N1. If you use a lower speed, QGroundControl might fail to load parameters.

You can leave out the unused connection. Via the UdpEndpoint QGConMobile section, the MavLink stream is forwarded to a QGC computer/mobile device, assuming it has the IP address 192.168.43.1 and the NavQ is connected to this network, e.g. via WiFi.

Enabling auto-start of mavlink-router

Enable the auto-start of mavlink-router via systemd and start it

~$ sudo systemctl enable mavlink-router
~$ sudo systemctl start mavlink-router

Checking status of mavlink-router

You can check the status of mavlink router using the command

~$ sudo systemctl status mavlink-router
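As a quick sanity check that the router is actually forwarding traffic, a short Python script using pymavlink (not preinstalled; assume it has been installed with pip) can connect to the TCP server port configured above and wait for a heartbeat from the FMU.

from pymavlink import mavutil

# Connect to mavlink-router's TCP server (TcpServerPort=5760 in main.conf)
conn = mavutil.mavlink_connection("tcp:127.0.0.1:5760")

# Block until a HEARTBEAT arrives, then print who sent it
conn.wait_heartbeat()
print("Heartbeat from system %u, component %u" % (conn.target_system, conn.target_component))

# Print the next few message types to confirm the stream is flowing
for _ in range(5):
    msg = conn.recv_match(blocking=True)
    print(msg.get_type())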

Running MavLink over T1 Ethernet

T1 Ethernet between FMUK66 and NavQ

Prerequisite

The RDDRONE-FMUK66 has a two-wire 100BaseT1 Ethernet interface on board. The 8mmNavQ board does not include T1 Ethernet, but an adapter may be used: to run the T1 Ethernet connection between the FMUK66 and the NavQ, use a separate RDDRONE-T1ADAPT media converter.

The 8mpNavQ or "NavQPlus" will have two Ethernet interfaces. It is planned that one of these interfaces will natively be configured as 100BaseT1.

Setting a fixed IP to use Ethernet for FMU communication

It is not recommended to use DHCP in a vehicle such as a drone, since you generally don't want the network to change without knowing the explicit details. Therefore, since there is no DHCP and the FMUK66 by default has a fixed IP of 10.0.0.2, we need to set a fixed IP on the NavQ for eth0 to be able to communicate with the FMUK66 via Ethernet.

It is suggested to use IP address 10.0.0.3 for the NavQ.

connman connection manager

The Linux program connman is used for configuring the network settings. To force connman to use a fixed IP (as in the case when no DHCP is available), the following file needs to be created.

It is important that you have an Ethernet cable connected beforehand, otherwise connman will not register the network.

~$ sudo nano /var/lib/connman/ethernet.config
[global]
Name = Ethernet_config
Description = Ethernet fixed IP setting

[service_onboard_ethernet]
Type = ethernet
IPv4 = 10.0.0.3/255.255.255.0/10.0.0.3

The IPv4 settings are in the order ownIP/netmask/router. Note that 10.0.0.3 is set as the router since no other device is present in this particular hardware configuration.

Setting up FMUK66 for mavlink over T1 Ethernet

T1 Ethernet is supported by PX4 on the FMUK66 with the latest master.

To enable RDDRONE-FMUK66 MavLink telemetry via UDP, sending to a specific IP, you must add the following file on the FMUK66 SD card:

/etc/extras.txt

set +e
mavlink start -x -u 14551 -o 14551 -r 200000 -t 10.0.0.3 -m onboard
set -e

In the example configuration above, 10.0.0.3 is the IP address of the NavQ on the vehicle. A more detailed description of the mavlink start parameters can be found here: https://dev.px4.io/v1.9.0/en/middleware/modules_communication.html

Additionally the MAV_BROADCAST parameter on the FMU needs to be set to "2 - only multicast".

Distributing MavLink data can be done by installing mavlink-router on NavQ.

Miscellaneous Linux Commands

This section is for various Linux commands that may be useful when using the NavQ and the Demo image.

VNC desktop environment

Intro

If you've ever run into a situation where you need to view a raw stream from the NavQ's Google Coral Camera, or need to run a lightweight GUI application on NavQ, you can do so using the guide below.

Instructions

Installation

You can run the commands below to start a VNC server on your NavQ.

# Credit to: https://www.vandorp.biz/2012/01/installing-a-lightweight-lxdevnc-desktop-environment-on-your-ubuntudebian-vps/#.YCQEGS1h3O4
# Install X, LXDE, and VNC programs

sudo apt-get install xorg lxde-core tightvncserver

# Start VNC to create config file

tightvncserver :1

# Then stop VNC

tightvncserver -kill :1

# Edit config file to start session with LXDE:

nano ~/.vnc/xstartup

# Add this at the bottom of the file:
lxterminal &
/usr/bin/lxsession -s LXDE &

# Restart VNC

tightvncserver :1

Connecting

You can use TightVNC to connect. You'll want to use the IP address of your NavQ at port 5901.

GPS Time

GPS Time for Small Unmanned Aerial Systems by Andrew Brahim:

  • Part 1: https://dirksavage88.medium.com/gps-time-for-small-unmanned-aerial-systems-a-primer-for-better-drone-technology-part-one-52d91a908323

  • Part 2: https://dirksavage88.medium.com/gps-time-for-small-unmanned-aerial-systems-pps-beginnings-part-two-f1a45d0882e1

Linux: Get the core temperature

The i.MX 8M Mini parts are rated for 0C to +95C. We do not expect that they will need any additional heat sinking, especially while flying, but you can monitor the core temperature with the following command:

cat /sys/class/thermal/thermal_zone0/temp
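The value is reported in millidegrees Celsius. A small Python sketch (purely an illustration) that reads and converts it:

# Read the SoC temperature and convert from millidegrees to degrees Celsius
with open("/sys/class/thermal/thermal_zone0/temp") as f:
    millidegrees = int(f.read().strip())

print("core temperature: %.1f C" % (millidegrees / 1000.0))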

NavQ Developer Guide

Building a Linux image for NavQ

HoverGames-Linux Distros

We have two separate Linux distributions that you can build for NavQ. There are pros and cons to both.

[1] HoverGames-BSP

The HoverGames-BSP image is based on NXP's Yocto BSP for i.MX 8M Mini. This distro is not as easy to use, and requires much more effort to get a working system, but currently it is the only system that is allowed for use if you want to use NavQ commercially. If you need tight integration and a small system that has only the packages you need installed on it, or if you're a company looking to use the NavQ in production, this is the one for you.

[2] HoverGames-Demo

The HoverGames-Demo image comes with the APT package manager as well as some other pre-installed software specific to i.MX 8M Mini. This distro is the easiest to use due to its great compatibility with pre-existing binaries coming from official repositories. You can install ROS, OpenCV, and more on this image just as you would on a normal desktop computer. This makes the Demo image great for quick development and iteration. To use this image, you must agree to a Demo license stating that you will not use NavQ commercially with this distribution.

Building the Linux images for NavQ

Prerequisites

Recommended Specs

You'll need to use a computer with Ubuntu 18.04 installed, and we recommend a high core count and plenty of RAM to build these images in a reasonable time frame. You will also need a large HDD or SSD (>500GB) to store the build files. A table of recommended specs is below:

  • CPU: Recent 6-core Intel i-Series or AMD Ryzen processor with HyperThreading or SMT (Simultaneous Multi-Threading)

  • RAM: 16GB DDR4 or more

  • Storage: 500GB SSD recommended; an HDD will suffice but will be slow

  • Operating System: Ubuntu 18.04 (not 20.04!)

Install Yocto build tools

Follow the guide at Yocto's website to install the necessary build tools for Ubuntu/Debian, or just install the list of packages below:

$ sudo apt-get install gawk wget git-core diffstat unzip texinfo gcc-multilib \
     build-essential chrpath socat cpio python python3 python3-pip python3-pexpect \
     xz-utils debianutils iputils-ping libsdl1.2-dev xterm

Running the build script

We have a GitHub repo with instructions for both the HoverGames-BSP and HoverGames-Demo images. The BSP image can be built using the master branch, whereas the Demo image can be built using the demo branch. The link to the repo is below.

GitHub - NXPmicro/meta-nxp-hovergames

Communications Interfaces

There are many communications interfaces supported on NavQ through the JST-GH connectors on the HoverGames Interposer Board. The page links below are guides for each one:

  • GPIO

  • SPI

  • I2C

  • CAN

GPIO

Controlling GPIO through the command line

There are several GPIO pins on the various JST-GH connectors on NavQ. To control these GPIO pins, follow the instructions below.

Exporting GPIO pins

In order to use GPIO pins, we need to export them in Linux first. To do this, we need to know the GPIO number for the pin we want to access. We can compute this number using the following formula:

gpio_number = ((gpio_bank - 1) * 32) + gpio_pin

For example, if we want to access the GPIO1_IO12 pin on the UART4/I2C/GPIO connector, we would find that the GPIO number is:

gpio_number = ((1 - 1) * 32) + 12
gpio_number = (0) + 12
gpio_number = 12

If you want to find out what pins correspond to what GPIO numbers, we have tables in the Hardware Overview/Pinouts and Connector info section here:

Pinouts and Connector info

Once you know the GPIO number of the pin you want to access, exporting the pin for use is easy. All you have to do is echo the pin number to /sys/class/gpio/export. For example, if we were to export GPIO1_IO12, we would run the following in our NavQ console:

$ sudo sh -c 'echo 12 > /sys/class/gpio/export'

(The sh -c wrapper is used so that the redirection itself runs with root privileges; a plain sudo echo with a redirect would fail for a non-root user.)

Currently we have not created a specific user group to control GPIO pins, so you must be root to export/control pins. If someone in the community would like to submit a process for creating a GPIO user group, please make a post on our hackster.io page and we will add it to the demo image. :)

Changing the pin direction

Next, we will want to change the direction of the GPIO pin for our specific use case. There are two options: in and out. To do this for GPIO1_IO12, you can run the following in your NavQ console:

$ sudo sh -c 'echo in > /sys/class/gpio/gpio12/direction'
$ sudo sh -c 'echo out > /sys/class/gpio/gpio12/direction'

Reading or Writing the GPIO pin value

To read or write a value to the GPIO pin, we will follow a similar process to changing the pin direction. A pseudo file named value is created at /sys/class/gpio/gpioXXX/value that holds a 1 or a 0. If you echoed out to the GPIO direction file, you can control the pin. To control the GPIO1_IO12 pin, just run the following in your NavQ console:

$ sudo sh -c 'echo 0 > /sys/class/gpio/gpio12/value'
$ sudo sh -c 'echo 1 > /sys/class/gpio/gpio12/value'

If you echoed in to the GPIO direction file, you can read the value file and find the current state of the pin. To do this for the GPIO1_IO12 pin, you can run the following in your NavQ console:

$ sudo cat /sys/class/gpio/gpio12/value
# a 0 or a 1 should be printed to your console

Controlling GPIO programmatically (in C)

Prerequisites

  1. Create new group called gpio

$ sudo groupadd gpio
$ sudo usermod -aG gpio navq

2. Create new udev rules file

Create a file at /etc/udev/rules.d/99-gpio.rules and add the following to it:

SUBSYSTEM=="gpio", KERNEL=="gpiochip*", ACTION=="add", PROGRAM="/bin/sh -c 'chown root:gpio /sys/class/gpio/export /sys/class/gpio/unexport ; chmod 220 /sys/class/gpio/export /sys/class/gpio/unexport'"
SUBSYSTEM=="gpio", KERNEL=="gpio*", ACTION=="add", PROGRAM="/bin/sh -c 'chown root:gpio /sys%p/active_low /sys%p/direction /sys%p/edge /sys%p/value ; chmod 660 /sys%p/active_low /sys%p/direction /sys%p/edge /sys%p/value'"

This will allow you to access the GPIO pseudofiles without being root.

Source Code

Source code coming soon

SPI

[Work In Progress]

SPI userspace driver

In the October Demo image, we have enabled SPI in the kernel to allow HoverGames participants to communicate with SPI devices. The October Demo image will be released soon.

This page is a work in progress.
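In the meantime, here is a minimal Python sketch of userspace SPI access. It assumes the spidev Python package (pip3 install spidev) and that the kernel exposes the bus as /dev/spidev0.0; check which /dev/spidevX.Y nodes exist on your image and adjust the bus/device numbers to match:

import spidev

spi = spidev.SpiDev()
spi.open(0, 0)                # open /dev/spidev0.0 (bus 0, chip select 0) - adjust as needed
spi.max_speed_hz = 1000000    # 1 MHz SPI clock
spi.mode = 0                  # SPI mode 0 (CPOL=0, CPHA=0)

# Full-duplex transfer: clock out three bytes and capture whatever the device returns
response = spi.xfer2([0x01, 0x02, 0x03])
print(response)

spi.close()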

I2C

A comprehensive guide on using the NavQ as an I2C master (work in progress)

I2C Example

The NavQ includes an I2C port in one of the JST-GH connectors. You may use this port to communicate to other devices in your drone system. In this example, we will go over the process of connecting a Teensy LC to the NavQ over I2C to control some WS2812 LEDs.

TODO

  1. Add guide for using C/Python SMBus libraries for controlling I2C

  2. Add more pictures/visuals

  3. Explain teensy code

  4. etc

Prerequisites

Hardware

  1. Teensy LC

  2. NeoPixel LED Strip (Ex. https://www.amazon.com/WS2812-Channel-Color-Driven-Development-Arduino/dp/B081BBF4R3/ref=sr_1_48?dchild=1&keywords=neopixel+led+strip&qid=1599144002&sr=8-48)

  3. JST-GH connectors and pre-terminated wires

  4. Headers

  5. Soldering kit

Software

  1. Teensy side

    1. Arduino IDE

    2. TeensyDuino

  2. NavQ side

    1. i2c-tools (installable from apt)

Preparing the JST-GH connector

To create the I2C connector, you'll need to order some JST-GH hardware. Here is a link to a digikey page where you can purchase connectors:

https://www.digikey.com/catalog/en/partgroup/gh-series/8397

And here is a page where you can purchase the jumpers:

https://www.digikey.com/catalog/en/partgroup/gh-series/61220

NOTE: For the I2C connector, you'll need the 9-pin JST-GH connector.

In the hardware overview (link here: Hardware Overview), you can see the pinout for the I2C connector. Here is another screenshot of it:

I2C JST-GH connector

The 5VP pin is on the left-most side of the connector, and GND is on the right-most side. I2C2_SDA is pin 4, and I2C2_SCL is pin 5. The JST-GH connector is positioned with the retention clip facing away from you when you are determining the left/right sides.

Wiring the Teensy

You'll need to do some soldering for the first step in this project. In the two pictures below, the NeoPixels are connected to the LED 5V, LED GND, and LED SIG pins. The JST-GH connector to the NavQ connects to the SDA/SCL pins and 5V + GND pads on the back of the Teensy.

Tip: you can solder the pre-terminated JST-GH wires directly to the pads and the through-hole pins to make things easier.

One thing to keep in mind is that even though the Teensy LC does not include pullup resistors to 3.3V for the I2C lines, pullups are not required since the NavQ has internal 4.7k pullups on its own I2C bus (on the SoM).

Pictures

Here are a couple of images of this setup:

You can clearly see the soldering on the top of the Teensy: the SDA/SCL and LED headers.
Wire run from Teensy to NavQ

Teensy code

We have written some simple example code that changes the color of the NeoPixel LEDs when the Teensy receives I2C data. In the example below, the slave address of the Teensy is 0x29, and the color of the LEDs changes from green to white when a 0x1 byte is sent to the Teensy. If any other byte is sent to the Teensy, the color changes back to green.

Make sure that you install the Adafruit_NeoPixel library in the Arduino IDE.

The i2c_t3 library is included with the TeensyDuino software. Make sure to use "Wire1" instead of "Wire" since we are using the SDA1/SCL1 pins on the Teensy.

#include <Adafruit_NeoPixel.h>
#include <i2c_t3.h>

// MACROS
#ifdef __AVR__
  #include <avr/power.h>
#endif
#define PIN       17
#define NUMPIXELS 8
#define DELAYVAL 50

#define SLAVE_ADDRESS 0x29

// INIT PIXELS
Adafruit_NeoPixel pixels(NUMPIXELS, PIN, NEO_GRB + NEO_KHZ800);
void receiveEvent(size_t bytes);
void requestEvent(void);

// MEMORY
#define MEM_LEN 256
char databuf[MEM_LEN];
volatile uint8_t received;

// INIT VARS
bool latch = false;
bool color = false;

// SETUP I2C AND PIXELS
void setup() 
{
  Wire1.begin(I2C_SLAVE, 0x29, I2C_PINS_22_23, I2C_PULLUP_EXT, 400000);
  received = 0;
  memset(databuf, 0, sizeof(databuf));
  Wire1.onReceive(receiveEvent);
  Wire1.onRequest(requestEvent);
  Serial.begin(9600);
#if defined(__AVR_ATtiny85__) && (F_CPU == 16000000)
  clock_prescale_set(clock_div_1);
#endif
  pixels.begin();
}

// LOOP
void loop() 
{
  latch = !latch;
  for(int i=0; i<NUMPIXELS; i++) 
  {
    if(color)
    {
      if(latch)
        pixels.setPixelColor(i, pixels.Color(5,5,5));
      else
        pixels.setPixelColor(i, pixels.Color(15,15,15));
    }
    else
    {
      if(latch)
        pixels.setPixelColor(i, pixels.Color(0,5,0));
      else
        pixels.setPixelColor(i, pixels.Color(0,15,0));
    }

    pixels.show();
    delay(DELAYVAL);
    //Serial.println("loop..");
  }
}

// I2C DATA RECV CALLBACK
void receiveEvent(size_t bytes)
{
  Wire1.read(databuf, bytes);
  if(databuf[0] == 1) color = true;
  else color = false;
  Serial.println(databuf[0]);
  received = bytes;
  Serial.println("recv");
}

void requestEvent(void)
{
  Wire1.write(databuf, MEM_LEN);
  Serial.println("req..");
}

NavQ commands

Add navq user to i2c group

To use the i2c commands without root, you'll need to add the navq user to the i2c group and create a udev rule. To do this, you can run the following commands:

$ sudo usermod -aG i2c $USER
$ sudo su
$ echo 'KERNEL=="i2c-[0-9]*", GROUP="i2c"' >> /etc/udev/rules.d/10-local_i2c_group.rules

Checking connection

Once your Teensy is connected using the I2C JST-GH connector, you need to confirm that the NavQ recognizes the Teensy as an I2C device. To do this, you can run the following command on the NavQ:

$ i2cdetect -y 1

You should see a device at address 0x29. If there is no device at address 0x29, you'll need to check your wiring.

Sending data to the Teensy

To send data to the Teensy, you can use the following command:

$ i2cset -y 1 0x29 0x1

This will change the LEDs to white. You can swap the 0x1 with a 0x0 or any other byte to switch back to green.

Controlling I2C bus with Python/C

Controlling the I2C bus with console commands is great, but what about when we want to integrate those commands into code? With Python and C, we can control the Teensy over I2C using the smbus library from pip and the I2C device interface provided by the Linux kernel.

Python

First, you'll need to install the smbus pip package. To do this, just run in your terminal:

$ pip3 install smbus

Once that is installed, you can run a simple script to select a 1 or 0 to send to the Teensy to change the color of the LEDs.

from smbus import SMBus

addr = 0x29
bus = SMBus(1)

numb = 1

print("Enter 1 for WHITE or 0 for GREEN")
while(numb == 1):
    ledstate = input(">>>>   ")

    if(ledstate == "1"):
        bus.write_byte(addr, 0x1)
    elif(ledstate == "0"):
        bus.write_byte(addr, 0x0)
    else:
        numb = 0

The expected output of this script is as follows:

navq@imx8mmnavq:~$ python3 i2c.py
Enter 1 for WHITE or 0 for GREEN
>>>>   1
>>>>   0

By selecting 1 or 0, you can change the color of the LEDs to white or green.

C

To control the I2C bus with C, you can use the following code:

#include <linux/i2c-dev.h>
#include <string.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <sys/types.h>
#include <fcntl.h>

int main() {
        // Init vars - file descriptor and I2C slave address
        int file;
        int addr = 0x29;
        char filename[20];

        // Open the /dev/i2c-1 device filename and apply the address using ioctl
        sprintf(filename, "/dev/i2c-1");
        file = open(filename, O_RDWR);
        if(file < 0) {
                printf("Failed to open the i2c bus");
                exit(1);
        }
        if(ioctl(file, I2C_SLAVE, addr) < 0) {
                printf("Failed to acquire bus access and/or talk to slave.\n");
                exit(1);
        }
        
        // Create a data buffer, then ask the user for a 0 or 1 to change LED color
        // LED color is changed by writing buf to the I2C device file
        char buf[10] = {0};
        unsigned int input = 0;
        while(1){
                printf("Enter a 0 for GREEN and a 1 for WHITE: ");
                scanf("%X", &input);   /* read into an unsigned int to match %X */
                buf[0] = (char)input;
                if(write(file,buf,1) != 1) {
                        printf("Failed to write to the i2c bus.\n");
                }
                printf("\n");
        }
}

Battery LED w/ Teensy LC

[WIP]

Introduction

This "Project Guide" is written to show some of the capabilites of NavQ. In conjunction with a Teensy LC and a strip of WS2812B LEDs, you can add a forward-facing battery indicator light to your drone.

The battery on my drone is quite low!

Prerequisites

Software

The software needed to run this project on your NavQ is as follows:

  1. ROS Noetic

  2. MAVROS

You can install this software using the guides here:

ROS1 · Controlling your drone from NavQ using MAVROS

Hardware

The hardware needed is the same as the hardware from the I2C guide here:

I2C

Code

For now, we're just going to provide the code files here; a more detailed guide will be written later.
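To give an idea of what the NavQ-side node does, here is a minimal sketch. This is not the attached batt_led.py, just an illustration: it subscribes to the MAVROS battery topic and forwards a status byte to the Teensy over I2C, reusing the 0x29 address from the I2C guide.

#!/usr/bin/env python3
# Sketch only (not the attached batt_led.py): forward battery status to the Teensy.
import rospy
from sensor_msgs.msg import BatteryState
from smbus import SMBus

TEENSY_ADDR = 0x29      # same slave address as in the I2C example
bus = SMBus(1)          # /dev/i2c-1 on NavQ

def battery_callback(msg):
    # BatteryState.percentage is 0.0-1.0 (NaN if unknown); send 0x1 when low, 0x0 otherwise
    low = msg.percentage < 0.2
    bus.write_byte(TEENSY_ADDR, 0x1 if low else 0x0)

if __name__ == "__main__":
    rospy.init_node("batt_led_sketch")
    rospy.Subscriber("/mavros/battery", BatteryState, battery_callback)
    rospy.spin()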

Teensy code

This code should be uploaded to the Teensy using the Arduino IDE.

789B
batt_led.zip
archive
Teensy batt_led code

NavQ code

The ROS node should be placed in the home folder ('/home/navq/')

951B
batt_led.py
ROS node

The service file should be located in /etc/systemd/system/.

157B
batt_led.service
"systemd" service file

The Launch script should be located in /usr/local/bin/.

223B
batt_led.sh
Launch script

Making the ROS node run on boot

Once all of the necessary files are placed in their respective directories, you need to make the systemd service run at boot. To do this, run in the terminal:

$ sudo systemctl enable batt_led

PWM (Onboard RGB LED)

Controlling PWM on NavQ

The PWM chips are tied to the onboard LED on NavQ. There are three PWM chips: pwmchip0, pwmchip1, and pwmchip2. Each of these "chips" has one PWM line attached to it: pwm0. To use these PWM lines, you will need to use the sysfs interface.

Using the sysfs interface to control the onboard LED

Currently, you must be root to access these PWM chips. In the future we will use a udev rules file to change the permissions. This will allow the navq user to write to the pseudofiles for these chips.

Step 1

Log into the root user on NavQ by running this command:

$ sudo su -
<enter password>

Step 2

Navigate to /sys/class/pwm and run the following commands:

$ echo 0 > pwmchip0/export
$ echo 0 > pwmchip1/export
$ echo 0 > pwmchip2/export

Step 3

Now that our PWM lines are exported for each chip, we can change the duty cycle of the PWM lines and enable them. The default period is 2730667 ns (you can confirm this by reading pwmchipX/pwm0/period). For a 50% duty cycle, we will use half of this number: 1365333. Apply this duty cycle to each chip by running the following commands:

$ echo 1365333 > pwmchip0/pwm0/duty_cycle
$ echo 1365333 > pwmchip1/pwm0/duty_cycle
$ echo 1365333 > pwmchip2/pwm0/duty_cycle

Step 4

We will now enable each line. The colors for each chip are as follows:

pwmchip0: RED

pwmchip1: GREEN

pwmchip2: BLUE

To enable the colors, run the following commands:

$ echo 1 > pwmchip0/pwm0/enable
$ echo 1 > pwmchip1/pwm0/enable
$ echo 1 > pwmchip2/pwm0/enable

Running these commands in succession should turn on the red, then green, then blue channels of the LED, ending with the LED appearing white once all three are enabled.

Controlling the onboard LEDs programmatically

Coming soon.
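In the meantime, here is a minimal Python sketch of the sysfs sequence above. It is only an illustration: it assumes the pwm0 lines have already been exported (Step 2) and that the script runs as root.

# Sketch: drive the onboard RGB LED through the sysfs PWM interface.
PWM_NODE = "/sys/class/pwm/pwmchip{chip}/pwm0/{node}"

def write_pwm(chip, node, value):
    with open(PWM_NODE.format(chip=chip, node=node), "w") as f:
        f.write(str(value))

def read_pwm(chip, node):
    with open(PWM_NODE.format(chip=chip, node=node)) as f:
        return int(f.read())

def set_rgb(red, green, blue):
    """red/green/blue are brightness levels from 0.0 to 1.0; chips 0/1/2 are red/green/blue."""
    for chip, level in zip((0, 1, 2), (red, green, blue)):
        period = read_pwm(chip, "period")              # period in nanoseconds
        write_pwm(chip, "duty_cycle", int(level * period))
        write_pwm(chip, "enable", 1)

if __name__ == "__main__":
    set_rgb(0.5, 0.0, 0.5)   # purple at half duty cycle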

CAN

[WIP] A guide on communicating over CAN/SLCAN using NavQ and UCANS32K146

Introduction

If you're thinking about using the CAN protocol on your drone, this guide will walk you through using our UCANS32K146 to create a CAN interface.

Since there isn't a native CAN bus on the NavQ, we can use a protocol called SLCAN to communicate CAN messages across a UART connection. We have built a binary for the UCANS32K146 that acts as an SLCAN transfer layer. This means that we can add a CAN bus to NavQ by just connecting the UCANS32K146 to the UART3 port.

Diagram of setup

Setting up SLCAN on NavQ

SLCAN support is enabled in the October image coming out this month.

To enable SLCAN on NavQ, run these commands:

$ sudo modprobe slcan
$ sudo slcand -o -t sw -s8 /dev/ttymxc2 -S 115200
$ sudo ip link set up slcan0

Now you can use SocketCAN or python-can to send and receive CAN messages over the slcan0 interface. As an example, here is how to send a CAN message from the command line:

$ cansend slcan0 123#deadbeef
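If you would rather do this from Python than from the command line, here is a minimal python-can sketch (assuming the python-can package is installed, e.g. pip3 install python-can) that sends the same frame and waits for a reply:

import can

# slcan0 was created by slcand above, so it behaves like a normal SocketCAN interface
bus = can.interface.Bus(channel="slcan0", bustype="socketcan")

# Send the same frame as the cansend example: ID 0x123, data de ad be ef
msg = can.Message(arbitration_id=0x123, data=[0xDE, 0xAD, 0xBE, 0xEF], is_extended_id=False)
bus.send(msg)

# Block until a frame arrives (or 5 seconds pass) and print it
reply = bus.recv(timeout=5.0)
print(reply)

bus.shutdown()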

Flashing your UCANS32K146 with the SLCAN conversion binary

This binary is not yet available. This page will be updated with a link to the binary when it is ready.

Follow the guide at the link below to flash the SLCAN binary to your UCAN board:

Flashing HoverGames boards - NXPMobileRobotics

Software Support

We have pages for several common software packages. Click the links below or follow the guide on the left of your screen.

Package Management · ROS1 · GStreamer · OpenCV · pyeIQ · Gazebo

Package Management

To use the package manager (apt) on the Demo image, you'll need to change your timezone.

First, you'll need to locate the correct timezone file at /usr/share/zoneinfo. There should be a folder for your country and a file in that folder for the closest city to you.

$ sudo rm -f /etc/localtime
$ sudo ln -sf /usr/share/zoneinfo/<country>/<city> /etc/localtime

For example, if you're in Central Time USA, you'd use the following commands:

$ sudo rm -f /etc/localtime
$ sudo ln -sf /usr/share/zoneinfo/America/Chicago /etc/localtime

Now, you can run sudo apt update and sudo apt upgrade to get your system up to date.

ROS1

ROS on NavQ

ROS on NavQ will allow you to interface with sensors, control your drone using MAVROS, and more. To get started, follow the install guide below and then continue to the next sections.

NOTE: ROS1 support is good, but the Mobile Robotics team at NXP's focus is on ROS2. There is a lot more documentation on ROS1 than ROS2, but ROS2 may be easier to use in the long run. We suggest that you do not cross-pollinate with ROS, i.e. only use ROS1 or ROS2, not both. Keep in mind that any documentation under the ROS1 section is for ROS1 only, and vice versa.

Install guide by OS

HoverGames-Demo image

NOTE: HoverGames participants should be using the Demo image. If you flashed your NavQ with the image from the HoverGames website, or if you're using the image that came installed on the SD Card included in your kit, you're using the Demo image.

When you install ROS Noetic on your NavQ, make sure to install the base version of ROS and not the desktop version. If you install the desktop version, critical gstreamer packages for NavQ can be overwritten and therefore become non-functional.

To install ROS, you need to be on the Demo image. You can follow the guide for installing ROS Noetic Ninjemys at http://wiki.ros.org/noetic/Installation/Ubuntu

HoverGames-BSP image

If you're using NavQ commercially and are running the HoverGames-BSP image, you'll follow these steps.

ROS Melodic is automatically installed on the HoverGames-BSP image. It includes MAVROS by default. You will need to do a little bit of setup, though, once you first boot your image.

Run the following commands to enable ROS on the HoverGames-BSP image:

$ sudo rosdep init
$ rosdep update
$ source /opt/ros/melodic/setup.bash
$ echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc
$ source ~/.bashrc

You'll also want to download the following script and run it to install GPS geoids:

$ wget https://raw.githubusercontent.com/mavlink/mavros/master/mavros/scripts/install_geographiclib_datasets.sh
$ chmod a+x ./install_geographiclib_datasets.sh
$ ./install_geographiclib_datasets.sh

Now, you can continue with the ROS tutorials for setting up a build environment and installing your first package. We will go over this in the next section.

Controlling your drone from NavQ using MAVROS

MAVLink / MAVROS

The 8MMNavQ can control your HoverGames drone by communicating with the RDDRONE-FMUK66 over MAVROS. A UART cable will be included in the kit that connects the UART3 port on the 8MMNavQ to the TELEM2 port on the RDDRONE-FMUK66.

NOTE: This page is for ROS1 only. MAVLINK and MAVROS are deprecated for ROS2 applications. ROS2 uses microRTPS and PX4 ROS Com in place of MAVROS.

NOTICE: When running the off-board script, make sure that you confirm the landing zone for your drone in QGroundControl. The local position parameter in the offboard ROS node is set to x:0, y:0, z:2, which means it will hover at 2 meters above its landing zone. If the drone takes off from a position away from its landing zone, it will quickly return to its landing zone and hover 2 meters above it. This is especially important to note if you turn the drone on indoors and then place it somewhere outside to take off. We don't want your drone to smack into a building!

Prerequisites

Set up TELEM2 on the FMU

Connect to your FMU over USB and open QGroundControl. Navigate to Settings -> Parameters -> MAVLink and set these parameters:

Also, you'll need to make sure that the settings in Settings -> Parameters -> Serial look like this:

Offboard control guide

MAVROS Offboard node example

A coding guide for the ROS node we will be using is located at the link below.

MAVROS Offboard control example | PX4 User Guide

This guide will help you install the ROS node outlined in the MAVROS Offboard Example.

Setting up your development environment

To start, you'll want to make sure that you have already set up a development environment for ROS. ROS has a guide on how to get a catkin workspace set up in the link below.

ROS/Tutorials/InstallingandConfiguringROSEnvironment - ROS Wiki

Once you've completed that tutorial, you may want to add an extra line to your ~/.bashrc so that your devel/setup.bash is always sourced when you open a new terminal:

$ echo "source /home/<user>/catkin_ws/devel/setup.bash" >> ~/.bashrc

This will ensure that your development environment is properly set up when you open a new shell.

Installing MAVROS specific packages

Follow the "binary installation" guide on the page below to install the necessary MAVROS packages from apt.

Make sure to use 'noetic' in place of 'kinetic' in the commands they give you on this page. Also, you do NOT need to follow the "Source Installation" section of the guide.

ROS with MAVROS Installation Guide | PX4 User Guide

Creating a new package

To create our first ROS package, we will want to navigate to our catkin workspace's src folder and run the following command:

$ catkin_create_pkg offb roscpp mavros_msgs geometry_msgs

This command will create a new package folder named offb and will add the dependencies roscpp, mavros_msgs, and geometry_msgs to the 'CMakeLists.txt' and 'package.xml' files. Next, you'll want to take the code from the PX4 MAVROS example and create a file named offb_node.cpp in the src/ folder in the offb package. Your directory structure should now look like this:

navq@imx8mmnavq:~/catkin_ws/src/offb$ tree
.
├── CMakeLists.txt
├── include
│   └── offb
├── package.xml
└── src
    └── offb_node.cpp

3 directories, 3 files
navq@imx8mmnavq:~/catkin_ws/src/offb$

Editing CMakeLists

In order to build your ROS package, you'll need to make some edits to CMakeLists.txt so the catkin build system knows where your source files are. Two edits need to be made.

The first edit is to add your executable to CMakeLists. Your source file should be named offb_node.cpp. Uncomment line 136 to add it:

136 add_executable(${PROJECT_NAME}_node src/offb_node.cpp)

The second edit is to link your target libraries (the catkin libraries, which cover roscpp, mavros_msgs, and geometry_msgs). Uncomment lines 149-151 to do so:

149 target_link_libraries(${PROJECT_NAME}_node
150   ${catkin_LIBRARIES}
151 )

And that's all you need to do for now to set up your workspace!

Building your ROS node

To build your ROS node, return to the root of your catkin_ws/ directory and run:

$ catkin_make && catkin_make install

Running your ROS node

To run our ROS node, we need to make sure that MAVROS is running. On the NavQ, run the following command:

$ roslaunch mavros px4.launch fcu_url:='/dev/ttymxc2:921600' &

This will start roscore and the mavros node with a pointer to the UART port /dev/ttymxc2 at a 921600 baud rate. To run the ROS node we created, run the following in an ssh terminal:

$ rosrun offb offb_node &

and your drone should take off to an altitude of 2 meters!

ROS2

ROS2 Foxy Fitzroy Install Guide

NOTE: ROS2 is new, but we suggest you use it over ROS1, as ROS1 will be deprecated in the near future. You may run into issues with the ROS2 section of this Gitbook. If you have any issues with the guide, please email landon.haugh@nxp.com if external, or use Teams/Email if internal. MAVROS is not compatible with ROS2. MicroRTPS and PX4 ROS Com replace MAVROS.

Follow the guide at the link below to install ROS2 Foxy Fitzroy on your NavQ running the Demo image.

ROS2 Foxy Fitzroy installation guide (Ubuntu)

Note 1: at the Setup Sources step you might get an error message from curl. To avoid this, run the following commands:

sudo rm -rf /usr/lib/libcurl*
sudo apt install curl

Note 2: at the Install ROS2 packages step, install ros-foxy-ros-base, as the desktop tools are not needed on NavQ:

sudo apt install ros-foxy-ros-base

Building and Installing FastRTPS for ROS2 communication to FMU

FastRTPS and the microRTPS Agent

FastRTPS and the microRTPS agent are needed on NavQ in order to bridge uORB topics from PX4 to ROS2 on NavQ over a UART or UDP connection. Follow the guide below to build and install these packages.

NOTE: FastRTPS and PX4 ROS Com work differently from MAVROS (ROS1). PX4 ROS Com subscribes to uORB topics rather than MAVLink messages. See below for a diagram of how microRTPS and PX4 ROS Com work.

Follow the link below for more details on microRTPS and PX4 ROS Com:

RTPS/DDS Interface: PX4-Fast RTPS(DDS) Bridge | PX4 User Guide

Installing FastRTPS and PX4 ROS Com on NavQ

Prerequisites

~$ sudo apt update
~$ sudo apt install cmake python3-pip gradle python3-colcon-common-extensions
~$ pip3 install --user pyros-genmsg

FastRTPS installation

First, we will install the FastRTPS project from eProsima. Use the following commands below to do so:

~$ mkdir src && cd src
~/src$ git clone --recursive https://github.com/eProsima/Fast-RTPS.git -b 1.8.x FastRTPS-1.8.2
~/src$ cd FastRTPS-1.8.2
~/src/FastRTPS-1.8.2$ mkdir build
~/src/FastRTPS-1.8.2$ cd build 
~/src/FastRTPS-1.8.2/build$ cmake -DTHIRDPARTY=ON -DSECURITY=ON .. 
~/src/FastRTPS-1.8.2/build$ make 
~/src/FastRTPS-1.8.2/build$ sudo make install
~/src/FastRTPS-1.8.2/build$ cd ~/src
~/src$ git clone --recursive https://github.com/eProsima/Fast-RTPS-Gen.git -b v1.0.4 Fast-RTPS-Gen
~/src$ cd Fast-RTPS-Gen
~/src/Fast-RTPS-Gen$ unset TERM
~/src/Fast-RTPS-Gen$ ./gradlew assemble
~/src/Fast-RTPS-Gen$ sudo su
~/src/Fast-RTPS-Gen# unset TERM
~/src/Fast-RTPS-Gen# ./gradlew install
~/src/Fast-RTPS-Gen# exit
~/src/Fast-RTPS-Gen$

px4_ros_com installation

Next, we will build and install the necessary software that will allow us to use ROS2 to communicate with the microRTPS bridge. First, run the following commands:

$ cd ~/
~$ mkdir -p ~/px4_ros_com_ros2/src

~$ git clone https://github.com/PX4/px4_ros_com.git ~/px4_ros_com_ros2/src/px4_ros_com
~$ git clone https://github.com/PX4/px4_msgs.git ~/px4_ros_com_ros2/src/px4_msgs

URGENT: Building px4_ros_com requires a lot of RAM. Enabling a swap file is highly recommended; it will take up 1GB of space on your storage medium.

Run the following commands to enable a 1GB swapfile:

$ sudo fallocate -l 1G /swapfile
$ sudo chmod 600 /swapfile
$ sudo mkswap /swapfile
$ sudo swapon /swapfile
$ sudo vim /etc/fstab
Insert: /swapfile swap swap defaults 0 0
$ sudo swapon --show
(make sure swap is active)

Now, build the workspace:

This will take a long time to build on NavQ. In our experience, it takes anywhere from 45 minutes to an hour. Make sure you have a stable connection to NavQ over UART or SSH, and do not let the NavQ lose power!

~$ ./px4_ros_com_ros2/src/px4_ros_com/scripts/build_ros2_workspace.bash

Sourcing ROS2 bash files

In order to run all of your specific ROS2 software successfully, you must source the install/setup.bash files in each of your ROS2 workspace folders. Add the following lines to your .bashrc to do so:

source /opt/ros/foxy/setup.bash
source ~/px4_ros_com_ros2/install/setup.bash

Next steps

Continue to the next page to set up a systemd service that will automatically start the micrortps agent on your NavQ. The guide will also cover how to automatically start the client on the FMU.

Auto-start microRTPS client/agent on FMU/NavQ

Creating a systemd service to auto-start the microRTPS agent on NavQ

Generate a startup script for the micrortps agent under /usr/local/bin

sudo nano /usr/local/bin/start_micrortps_agent.sh

with content

#!/bin/bash
## startup script for micro_rtps_agent
## agent will communicate to FMUK66 via UDP
## FMUK66 IPv4 addr = 10.0.0.2 
##
## Author: Gerald Peklar <gerald.peklar@nxp.com>  

source /opt/ros/foxy/setup.bash
source ~/px4_ros_com_ros2/install/setup.bash

# Comment out the line that you are not using:

# If you're using T1 Ethernet communication:
micrortps_agent -t UDP -i 10.0.0.2

# If you're using UART communication over the UART3 port:
micrortps_agent -d /dev/ttymxc2 -b 921600

Save the file and exit nano. Make the file executable

sudo chmod +x /usr/local/bin/start_micrortps_agent.sh

Generate a systemd service file to start the startup script at boot

sudo nano /etc/systemd/system/micrortps_agent.service

with content

[Unit]
Description=PX4 micrortps service
After=network.target

[Service]
Restart=always
TimeoutStartSec=10
User=navq
Group=navq
WorkingDirectory=~
ExecStart=/usr/local/bin/start_micrortps_agent.sh

[Install]
WantedBy=multi-user.target

Save the file and exit nano. Check if the process starts

sudo systemctl start micrortps_agent.service
sudo systemctl status micrortps_agent.service

You should see the state active (running); quit with <q>. Finally, enable the systemd service so that it starts at boot:

sudo systemctl enable micrortps_agent.service

Auto-start the microRTPS client on the FMU

To run FastRTPS over Ethernet with the NavQ board, the RDDRONE-T1ADAPT is needed.

Building PX4 with microRTPS

You will need a Linux VM or computer to complete this step.

In order to use the microRTPS client (which runs on the FMU and talks to the agent on NavQ), you'll need to build PX4 with the _rtps tag for the fmuk66-v3 build target. To do this, you will need to have both the FastRTPS and Fast-RTPS-Gen packages installed. You can just follow the previous guide on your Linux development VM or computer.

Once you have successfully installed those two packages, you can navigate to your cloned PX4 repository and run the following:

$ make nxp_fmuk66-v3_rtps

Flashing your FMU with the updated binary

You will need to flash your FMU with the updated RTPS binary. If you don't know how to do this yet, follow the guide here:

Program software using debugger - NXP HoverGames

Creating a startup file on the SD card

To make the microRTPS client start at boot on the FMU, you will need to have an SD card inserted. On your SD card, make a file at /etc/extras.txt and insert one of the following options:

set +e
# For T1 Ethernet communication:
micrortps_client start -t UDP -i <NavQ_IP_Address>

# For UART communication over the TELEM port:
micrortps_client start -d /dev/ttyS4 -b 921600

# For UART communication over the IR/TELM2 port:
micrortps_client start -d /dev/ttyS1 -b 921600
set -e

Calling set +e at the beginning and set -e at the end is needed to prevent boot errors. Further details can be found at https://dev.px4.io/master/en/concept/system_startup.html#replacing-the-system-startup

Detecting AprilTags with ROS2

[WORK IN PROGRESS]

Overview

NOTE: This guide is currently a work in progress. Some details may not be finished.

In this section, we will guide you through the process needed to detect AprilTags on your NavQ. There are a few things that need to be done to accomplish this:

  1. Install ROS2 image tools

  2. Build and install AprilTag detection nodes

  3. Calibrate the camera on your NavQ using a checkerboard pattern

  4. Run!

Prerequisites

Before we start, you will need a few things:

Checkerboard

To create a checkerboard for camera calibration, download this PDF: https://www.mrpt.org/downloads/camera-calibration-checker-board_9x7.pdf

Desktop Setup

In order to calibrate the camera, you will need to set up your NavQ with a mouse, keyboard, and monitor. Use the included microUSB hub and HDMI connector to do so.

Installing required ROS2 software

Install the following packages with the apt package manager by running the commands below:

$ sudo apt install ros-foxy-cv-bridge \
ros-foxy-image-tools \
ros-foxy-image-transport \
ros-foxy-image-transport-plugins \
ros-foxy-image-pipeline \
ros-foxy-camera-calibration-parsers \
ros-foxy-camera-info-manager \
ros-foxy-launch-testing-ament-cmake 

Once that is finished, move on to the next step.

Calibrating the camera

This section is a consolidation of the specific commands for the NavQ. If you run into any issues with this section of the guide, email landon.haugh@nxp.com and refer to the official guide: https://navigation.ros.org/tutorials/docs/camera_calibration.html

Hook up your NavQ to a monitor with the provided HDMI cord and connect a USB mouse + keyboard through the included microUSB hub. Open the terminal by clicking the icon at the top left of the screen and open the bash shell by running:

$ bash

Have your printed checkerboard ready in a well lit environment and run the camera calibration software by running the following commands:

# Start publishing camera images to ROS2 topic /camera/image_raw
$ ros2 run image_tools cam2image --ros-args -p device_id:=0 -p width:=640 -p height:=480 -r /image:=/camera/image_raw > /dev/null 2>&1 &
# Start the camera calibration software
$ ros2 run camera_calibration cameracalibrator --size 7x9 --square 0.02

Now use the link in the note above to run through calibrating the camera.

Detecting AprilTags

Building apriltag_msgs

A prerequisite for the apriltag_ros node is apriltag_msgs. Clone the repo and build it by running these commands:

$ git clone https://github.com/christianrauch/apriltag_msgs
$ cd apriltag_msgs
$ colcon build

Make sure to source the install/setup.bash file so that apriltag_msgs can be found when apriltag_ros is built.

Building the apriltag_ros node

First, in order to detect AprilTags, we need to build the apriltag_ros node written by christianrauch. You can clone his repository by using this git repo:

$ git clone https://github.com/christianrauch/apriltag_ros

To make his repo work with ROS2 Foxy, you will need to make a small change in the CMakeLists.txt file. Go to line 26 in that file and delete the apriltag:: token in the AprilTagNode apriltag::apriltag part.

Next, you'll want to save that file and run colcon build in the apriltag_ros folder. Once it is done building, you'll want to source the install/setup.bash file. Add this line to your .bashrc:

source /home/navq/<apriltag_ros folder>/install/setup.bash

Creating a new package to concatenate camera information to each camera frame

In order to make the apriltag_ros node work, we need to make sure that camera info messages are being sent in sync with each camera frame published by the cam2image node. We have written an example node that does just that. You can download it here:

86KB
py_pysub.zip
archive
Full Camera Republish Node Workspace

You will need to replace the matrices in the node file to match your camera calibration parameters. The source file is located at pypysub/py_pysub/publisher_member_function.py. Once you have done that, make sure to build and install the node and source the install/setup.bash file.
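For reference, a minimal sketch of such a republisher is shown below. This is not the attached node, just an illustration: the topic names (/camera_image in, /camera/image_raw and /camera_info out) and the intrinsics are placeholders that you would replace with your own calibration values.

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, CameraInfo

class CameraInfoRepublisher(Node):
    def __init__(self):
        super().__init__('camera_info_republisher')
        # Placeholder calibration: replace k and d with the values from your calibration run
        self.info = CameraInfo()
        self.info.width = 640
        self.info.height = 480
        self.info.distortion_model = 'plumb_bob'
        self.info.k = [600.0, 0.0, 320.0,
                       0.0, 600.0, 240.0,
                       0.0, 0.0, 1.0]
        self.info.d = [0.0, 0.0, 0.0, 0.0, 0.0]
        self.image_pub = self.create_publisher(Image, '/camera/image_raw', 10)
        self.info_pub = self.create_publisher(CameraInfo, '/camera_info', 10)
        self.create_subscription(Image, '/camera_image', self.on_image, 10)

    def on_image(self, msg):
        # Re-stamp the CameraInfo with the image header so both messages stay in sync
        self.info.header = msg.header
        self.image_pub.publish(msg)
        self.info_pub.publish(self.info)

def main():
    rclpy.init()
    rclpy.spin(CameraInfoRepublisher())

if __name__ == '__main__':
    main()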

Running the code

To run the code, you'll need to run the following ROS nodes:

$ ros2 run image_tools cam2image --ros-args -p device_id:=0 -p width:=640 -p height:=480 -r /image:=/camera_image > /dev/null 2>&1 &
$ ros2 run py_pysub talker > /dev/null 2>&1 &
$ ros2 launch apriltag_ros tag_16h5_all.launch.py --ros-args -p image_transport:=raw > apriltag_log.txt 2>&1 &

GStreamer

There is an NXP community user guide for gstreamer available here: https://community.nxp.com/t5/i-MX-Processors-Knowledge-Base/i-MX-8-GStreamer-User-Guide/ta-p/1098942

Taking a picture

To take a picture on your NavQ using GStreamer, run the following command:

$ gst-launch-1.0 -v v4l2src num-buffers=1 ! jpegenc ! filesink location=capture1.jpeg

To take video, you can run the following pipeline:

$ gst-launch-1.0 v4l2src ! 'video/x-raw,width=1920,height=1080,framerate=30/1' ! vpuenc_h264 ! avimux ! filesink location='/home/navq/video.avi'

Streaming Video to QGroundControl using NavQ over WiFi

Prerequisites

Devices required

In this guide, we need a few things:

  1. NavQ Companion Computer mounted with Google Coral Camera attached

  2. Laptop/Phone with QGroundControl Installed

  3. Both NavQ and mobile device connected to the same WiFi network

Setting up QGroundControl

In QGroundControl, click the Q logo in the top left, and configure the video section as seen in the image below:

This will set up your QGroundControl instance to receive the UDP video stream from the NavQ.

Connecting your NavQ to your router and getting IPs

Follow the WiFi setup guide using connman in the Quick Start guide to connect your NavQ to the same router as your mobile device. You will need to use the serial console to do this. Once you have your NavQ connected, you can run ifconfig in the serial console to find the IP address of your NavQ.

Your IP address should be next to 'inet' under 'wlan0' if connected over WiFi.

You can SSH into the NavQ to run the GStreamer pipeline once you have the IP.

Running the GStreamer pipeline

With your NavQ on, SSH into it by using the IP address you noted when connected to the serial console. Once you're successfully SSHed in, you should note the IP address that you logged in from as seen here:

This is the IP of your computer that you should be sending the video stream to.

To run the GStreamer pipeline, run the following command:

$ sudo gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480,framerate=30/1 ! vpuenc_h264 bitrate=500 ! rtph264pay ! udpsink host=xxx.xxx.xxx.xxx port=5600 sync=false

Make sure to replace the 'xxx.xxx.xxx.xxx' with the IP you noted when first SSHing into the NavQ.

Once you run that command, you should be able to see the video stream from your NavQ on QGroundControl!

NavQ Streaming over UDP to QGroundControl

Ad-Hoc Streaming using Mobile Hotspot

Configuring Windows

Step 1 - Enable Mobile Hotspot

You must have a WiFi adapter in your Laptop/PC to follow this guide.

To enable Mobile Hotspot on Windows, go to Settings->Network & Internet->Mobile Hotspot. Next, you'll want to edit your mobile hotspot settings to set a password and SSID. Once you've done this, you can enable Mobile Hotspot. You can see a full configuration in the screenshot below.

Mobile Hotspot setup in Windows with imx8mmnavq connected

Step 2 - Enable Port 5000/5600 in Firewall

By default, ports 5000 and 5600 are not open in the Windows firewall, so any UDP stream packets will be blocked. To enable this, go to your Windows search bar, and type "Firewall". Select "Windows Defender Firewall".

Once you open Windows Defender Firewall, you'll want to navigate to "Advanced Settings" from the menu on the left.

You will then be brought to a new window with Windows Firewall rules. To create a new rule for QGC streaming, you'll need to click "New Rule" on the right side.

You will be brought to a new window to add a rule. Select "Program" and click "Next".

At the next window, it will ask you to specify the program you are adding a rule for. Paste the following into that field and click "Next":

%ProgramFiles%\QGroundControl\QGroundControl.exe

Once you've done this, you can click "Next" through the rest of the fields and you should be good to go.

On the page that tells you to name your rule, just name it "QGroundControl".

Step 3 - Connect NavQ to new Mobile Hotspot

To connect your NavQ to your new Mobile Hotspot, follow the connecting to WiFi guide in the Gitbook here:

Quick start Guide - 8MMNavQ

Step 4 - Stream to QGroundControl

Now you can stream to QGroundControl as you normally would. Follow the guide here:

Streaming Video to QGroundControl using NavQ over WiFi

Configuring Ubuntu

Step 1 - Enable Wifi Hotspot

To enable a WiFi hotspot in Ubuntu 20.04, you'll first need to go to Settings->WiFi. Then, at the top right, click the 3 dots button and select "Turn On Wi-Fi Hotspot...".

After you click that entry, this window will pop up. Enter a network name and password, and you should be good to go! Follow Steps 3 and 4 in the Windows section above to configure your NavQ.

OpenCV

With OpenCV on NavQ, you will be able to harness a vast library of computer vision tools for use in HoverGames. OpenCV is installed out of the box on the HoverGames-BSP image and can be installed easily through the package manager on the HoverGames-Demo image. If you'd like to get a jump start on OpenCV, follow the guide below to create a program that detects red objects.

Quick Example

Let's go through a quick example of running OpenCV on the NavQ to identify a red object in an image taken on the Google Coral camera. This example will be written in Python and uses OpenCV.

Installing OpenCV

If you are using the HoverGames-BSP image (the default OS shipped on the NavQ's eMMC), OpenCV is already installed and you can skip this step.

If you're using the HoverGames-Demo image, you'll need to install python3-opencv. To do so, run the following command in your terminal:

$ sudo apt install python3-opencv

Imports

First, create a new python source file (name it whatever you want!). We only need two imports for this program: opencv (cv2) and numpy. Numpy is used to create arrays of HSV values.

import cv2
import numpy as np

Capturing an image

To capture an image, we must first open a camera object and then read from it.

# Open camera and capture an image from it
cap = cv2.VideoCapture('v4l2src ! video/x-raw,width=640,height=480 ! decodebin ! videoconvert ! appsink', cv2.CAP_GSTREAMER)
ret,frame = cap.read()

Downsizing the image

To make our OpenCV pipeline run faster, we're going to shrink our image down to 640x480 resolution. This resolution isn't so small that the image quality will be reduced enough to make a difference in detecting objects, but it will make OpenCV process our image much quicker.

Another pre-processing step that we will run is a box blur. This will get rid of small artifacts in the image that can throw off our pipeline and will make detecting large objects much easier.

# Resize to make processing faster
frame = cv2.resize(frame, (640,480), interpolation = cv2.INTER_AREA)

# Blur image to make contours easier to find
radius = 10
ksize = int(2 * round(radius) + 1)
image = cv2.blur(frame, (ksize, ksize))

Color filtering

In order to find objects that are red in our image, we will apply an HSV filter to the image to create a mask of the color red in the image.

The lower_red and upper_red variables are found by using a program called GRIP. GRIP is a GUI program for OpenCV. It has tons of great features including code generation. To check GRIP out, go to the website here.

# Convert to HSV color for filtering
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)

# Filter out all colors except red
lower_red = np.array([0,87,211])
upper_red = np.array([36,255,255])

# Create binary mask to detect objects/contours
mask = cv2.inRange(hsv, lower_red, upper_red)

Finding contours

To find the location of the objects in our image, we will find contours in the mask and sort them by total area. This will allow us to filter out smaller objects that we aren't interested in. We will also be able to detect the objects' position in the image and draw a box around them.

# Find contours and sort using contour area
cnts = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
cnts = cnts[0] if len(cnts) == 2 else cnts[1]
cnts = sorted(cnts, key=cv2.contourArea, reverse=True)
for c in cnts:
    # Once we hit smaller contours, stop the loop
    if(cv2.contourArea(c) < 100):
        break

    # Draw bounding box around contours and write "Red Object" text
    x,y,w,h = cv2.boundingRect(c)
    cv2.rectangle(frame,(x,y),(x+w,y+h),(0,255,0),2)
    font = cv2.FONT_HERSHEY_SIMPLEX
    cv2.putText(frame,'Red Object', (x,y), font, 1, (0, 255, 0), 2, cv2.LINE_AA)

Storing the generated images

Finally, we will store the images we generated from this program: the mask and the final image with annotations (bounding box and text).

# Write images to disk for debugging
cv2.imwrite('thresh.png', mask)
cv2.imwrite('image.png', frame)

# Close camera
cap.release()

Running the code

To run the code, you'll need to use python3. Run the following command (<file.py> will be the filename that you saved the code to):

$ python3 <file.py>

Source code

Here is the complete source code if you'd like to run it on your own NavQ as an example:

# Landon Haugh (NXP) 2020

import cv2
import numpy as np

# Open the camera and capture a frame from it
cap = cv2.VideoCapture(0)
ret,frame = cap.read()

# Resize to make processing faster
frame = cv2.resize(frame, (640,480), interpolation = cv2.INTER_AREA)

# Blur image to make contours easier to find
radius = 10
ksize = int(2 * round(radius) + 1)
image = cv2.blur(frame, (ksize, ksize))

# Convert to HSV color for filtering
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)

# Filter out all colors except red
lower_red = np.array([0,87,211])
upper_red = np.array([36,255,255])

# Create binary mask to detect objects/contours
mask = cv2.inRange(hsv, lower_red, upper_red)

# Find contours and sort using contour area
cnts = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
cnts = cnts[0] if len(cnts) == 2 else cnts[1]
cnts = sorted(cnts, key=cv2.contourArea, reverse=True)
for c in cnts:
    # Once we hit smaller contours, stop the loop
    if(cv2.contourArea(c) < 100):
        break

    # Draw bounding box around contours and write "Red Object" text
    x,y,w,h = cv2.boundingRect(c)
    cv2.rectangle(frame,(x,y),(x+w,y+h),(0,255,0),2)
    font = cv2.FONT_HERSHEY_SIMPLEX
    cv2.putText(frame,'Red Object', (x,y), font, 1, (0, 255, 0), 2, cv2.LINE_AA)
    
    
# Write images to disk for debugging
cv2.imwrite('thresh.png', mask)
cv2.imwrite('image.png', frame)

# Close camera
cap.release()

pyeIQ

Python framework for eIQ on i.MX

This page is a work in progress. NOTE - THIS WILL NOT WORK ON NAVQ!! UPDATED 12/03/2020 - Updated with notes that this will not work as-is for NavQ using the 8M Mini. Apologies for any confusion. These notes are here only as a reference for advanced developers. The 8M Mini does not have any NN acceleration and can only run inference on its processor cores.

pyeIQ is not targeted at the i.MX Mini processor, but it may still work albeit with much lower performance than if an accelerator was available. We expect to use this more with the upcoming i.MX 8M Plus that includes a 2.25 TOPS neural net accelerator.

Please refer to the following pyeIQ documentation:

https://pyeiq.dev/

Note that eIQ support is only included on imx-image-full-imx8mpevk.wic pre-built image [1]. *** THIS IMAGE is only for 8M Plus!

Please take a look at the switch_image application; we are using TFLite 2.1.0. This application offers a graphical interface for running an object classification demo using either the CPU or the NPU.

# pyeiq --run switch_image

We also have a TFLite example outside of pyeIQ; please refer to the instructions below. Details can be found in the i.MX Linux User's Guide [2].

# cd /usr/bin/tensorflow-lite-2.1.0/examples

# ./label_image -m mobilenet_v1_1.0_224_quant.tflite -i grace_hopper.bmp -l labels.txt

The i.MX Linux User's Guide [2] also provides instructions on how to get our latest Linux BSP [1] up and running. *** NOTE FOR 8M Plus only!

[1]: https://www.nxp.com/webapp/sps/download/license.jsp?colCode=L5.4.47_2.2.0_MX8MP-BETA2&appType=file1&DOWNLOAD_ID=null

[2]: https://www.nxp.com/docs/en/user-guide/IMX_LINUX_USERS_GUIDE.pdf

Gazebo

Where to learn more about Gazebo

Gazebo is one of several simulators that work with PX4 and ROS.

Simulation is important in order to test code without risk of damaging real hardware. It can be critical in uncovering faults that would otherwise be very difficult to trigger. This is not a tutorial on Gazebo, but a list of some resources to get started.

  • PX4.io Gazebo developer guide https://dev.px4.io/v1.9.0/en/simulation/gazebo.html

  • Youtube videos e.g. https://youtu.be/mranHM9wn0g

  • Read about Gazebo on Wikipedia.

  • Try out this simple Gazebo tutorial to control a differential drive robot, which is a fun way to learn both Gazebo and ROS. (smile)

Telerobotics with NavQ and Remo.TV

[WORK IN PROGRESS]

What is telerobotics?

Telerobotics is a platform in which robots can be controlled over the internet. A good example of this is Twitch Plays. Twitch Plays allows users to play games on a stream by giving commands through Twitch Chat.

Now you may be wondering, "Well Twitch Plays isn't controlling a robot, that's a video game!" and you'd be correct. There is an alternative for robotics though! It's called Remo.TV, and we have written a module for the NavQ to be supported on the site.

Remo.TV is a website where anyone can log in and control robots over the internet. With NavQ, your family and friends can easily control your HoverGames Drone or NXP Cup Rover through Remo.TV. Below we have written a guide for you to set up your drone or rover with the service.

This guide is written for an NXP Cup Car with PX4. You can follow this guide for other robots, but you will need to create your own hardware file to work with your setup. To do this, follow Remo.TV's documentation. You can start by visiting the front page of their website:

Remo.TV

Setting up Remo.TV on NavQ

Dependencies and repos

To set up Remo.TV on NavQ, we are going to have to install a few packages. Let's install those now:

The steps below come from the README on the official remotv GitHub repository: https://github.com/remotv/controller

Install dependencies:

$ sudo apt update
$ sudo apt upgrade -y
$ sudo apt install ffmpeg python-serial python-dev libgnutls28-dev espeak python-smbus python-pip git

Download RemoTV from GitHub:

$ git clone https://github.com/remotv/controller.git ~/remotv

Install Python dependencies:

$ sudo python -m pip install -r ~/remotv/requirements.txt

Open RemoTV and copy the sample config file:

$ cd remotv
$ cp controller.sample.conf controller.conf

And your RemoTV is cloned and ready for configuration! Next we will set up our configuration file.

Configuring the Configuration

To get Remo.TV to work on your robot, you'll need to set up the configuration file to work with NavQ. Open the controller.conf file that we copied earlier for editing. Below you will see each field that needs to be edited for the configuration file.

[robot] Section

owner= - Set this as your Remo.TV username.
robot_key= - Set this as your Remo.TV API key.
type= - navq

[camera] Section

x_res= - 640
y_res= - 480
video_framerate= - 30
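Put together, the edited fields in controller.conf should look something like this (placeholder values shown for owner and robot_key; leave the rest of the sample file unchanged):

[robot]
owner=your_remo_username
robot_key=your_remo_api_key
type=navq

[camera]
x_res=640
y_res=480
video_framerate=30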

Setting up controls on Remo.TV

Log into your Remo.TV account and go to your robot. The screen should look like this:

At the bottom, there is a movement tab. For the NXP Cup car, you'll want to press the (edit buttons) text and paste this code into that window:

[
  {
    "break": "line",
    "label": "movement"
  },
  {
    "label": "forward",
    "command": "f",
    "hot_key": "w"
  },
  {
    "label": "stop",
    "command": "s",
    "hot_key": "s"
  },
  {
    "label": "reverse",
    "command": "x",
    "hot_key": "x"
  },
  {
    "label": "left",
    "command": "l",
    "hot_key": "a"
  },
  {
    "label": "right",
    "command": "r",
    "hot_key": "d"
  }
]

This will set up your controls to be compatible with the example navq hardware file.

Running the Remo.TV controller

Once you have set up the software, you can run the controller by going into the ~/remotv directory and running:

$ python3 controller.py

Your robot should start streaming to Remo.TV!

Archive

Old Fixes

Old Fixes [Archive]

These fixes are common for older versions. If you don't run into issues with the sections stated below, you shouldn't need to use these fixes.

WiFi

Currently the image does not include connman due to a bug in the build system. Also, for connman to work, it requires a package called dhcpcd5. To install connman and dhcpcd5, connect your NavQ to the internet using ethernet, and run the following:

$ sudo apt install connman dhcpcd5

Use the quick start guide to connect to WiFi once you're finished installing those packages.

resolv.conf

There is an issue with resolv.conf that prevents you from connecting to the internet on WiFi due to an incorrect DNS setup. To fix this, edit /etc/resolv.conf by removing the current nameserver IP and replacing it with 8.8.8.8 (so that the file contains the line nameserver 8.8.8.8).

Setting the `dialout` and `video` usergroups

By default, the navq user on the HoverGames-Demo image is not in the dialout and video groups. If your user is not in those groups, you will not have access to the camera or to the UART port for communication to the FMUK66. To fix this, run the following commands:

$ sudo usermod -a -G dialout $USER
$ sudo usermod -a -G video $USER
$ logout

It is necessary to log out to ensure that the Linux kernel knows that your user is part of those user groups.

V4L2SRC & Gstreamer

In the current build, there is an issue with ldconfig which causes the v4l2src driver to not be found. To fix this, run the following commands:

$ echo 'export LD_LIBRARY_PATH=/usr/lib' >> ~/.bashrc
$ source ~/.bashrc

Linux Device Tree

Currently the patches for the linux device tree in our build system are not working. To fix hardware issues like the camera, HDMI, and WiFi, you'll need to replace the *.dtb file on the boot partition of your SD card. You'll need the following file to follow the steps below:

38KB
imx8mm-cube.dtb
imx8mm-cube.dtb

Steps

  1. Insert your SD card into your computer

  2. Open the boot partition that is mounted by default.

  3. Drag and drop the imx8mm-cube.dtb file into the boot partition.

  4. Boot your NavQ. Quickly connect to it using a serial monitor such as PuTTY.

  5. Press Enter repeatedly while it boots to stop autoboot and bring up the u-boot=> prompt.

  6. In the u-boot=> prompt, run the following commands:

u-boot=>setenv fdt_file imx8mm-cube.dtb
u-boot=>saveenv
u-boot=>boot

Once you have completed these steps, you should be good to go until you reflash your SD card with a new image.

Developer Quick Start Guide

How to assemble the NavQ for use. For initial developers only. Yocto team.

NOTE: This information is archived and not up to date. Some information might still be useful, but continue at your own risk.

Assembly manual

The boards ship pre-assembled, pre-programmed, and configured to boot from eMMC.

Once you have confirmed that everything is running, you will want to update to the latest available code.

Connecting the NavQ to a PC

There are several interfaces that can be used when connecting the NavQ board to a PC/laptop. The base connections are as follows below:

  1. Connect the NavQ Debug UART to the USB-UART adapter (included) and then the USB-UART adapter to the PC. You can use a terminal program like Putty to connect to the root console.

  2. Plug the USB C to USB A cable between the NavQ and a PC to provide power

  3. An Ethernet connection is available by using the IX Ethernet to RJ45 cable between the NavQ and a PC or Ethernet hub

IX Industrial is a new IEC 61076-3-124 standard providing rugged and compact Ethernet connections for industrial equipment. It was pioneered by Hirose and has multiple sources.

Initial powering on

On a factory fresh system, when power is applied using the USB-C cable (or one of the other power inputs), the system will boot to the Linux prompt. This test shows that the system is functional (UBoot and Linux in terms of the software, UART, Ethernet and eMMC in terms of the hardware).

  • The default login is root with no password.

For a better understanding, refer to the following breakdown of the system:

  • The SOM - the top layer in the assembly - consists of:

    • The processor itself: can be assembled with IMX8M MINI / IMX8M NANO

    • The FLASH (eMMC): can be assembled in the range from 4GB up to 512 GBytes

    • The first option for the RAM (LPDDR4): can be assembled in the range from 1GB up to 4GB, the data bus is 32 bit (high throughput)

    • The second option for the RAM (DDR4): can be assembled in the range from 256MB up to 1GB, the data bus is 16 bit (medium throughput)

    • The WiFi [optional]

  • The MEDIA board - the middle board in the stack - can provide connectivity as follows:

    • The MIPI DSI (Display Serial Interface) interface at a flex connector can be used with:

      • MIPI Display adapter board (with a 5.5 inch FullHD display) [optional, via 24pin flex cable]

      • MIPI to HDMI adapter board (HDMI at the output) [optional, via 24pin flex cable]

    • The MIPI CSI (Camera Serial Interface) interface at a flex connector can be used with:

      • Time Of Flight Camera Adapter board (38kPix) [optional, via 24pin flex cable]

      • RGB Camera Adapter board (Google Coral camera, 5MPix) [optional, via 24pin flex cable]

    • The PCIe interface at a flex connector can be used with:

      • PCIe extension board with a PCIe M.2 key E adapter board (any NVMe SSD)

    • The 1G Ethernet PHY which can be connected to the:

      • Ethernet+USB adapter board [optional, via 24pin flex cable]

      • HGI board [optional, via board-to-board connector]

    • The microSD card holder

    • The USB 2.0 interface (USB0) with the optional connectivity to the:

      • Ethernet+USB adapter board [optional, via 24pin flex cable]

      • HGI board [optional, via board-to-board connector]

  • The HGI (HoverGames Interposer) can provide connectivity as follows: [Optional]

    • The USB 2.0 interface (USB0) (available at USB type-C connector)

    • The USB 2.0 interface (USB1) (available at micro-USB connector)

    • The 1G Ethernet connector (IX industrial Ethernet connector)

    • The JTAG interface (for debugging purposes)

    • The BootMode switches (to switch between USB/eMMC/SD card)

    • The RGB LED (for some fun for HoverGames)

    • The UART interfaces (2 pcs)

    • The SPI interfaces (1 pcs)

    • The secure element (an IC for HG games)

The actual kit consists of all three layers (SOM + Media board + HoverGames Interposer Board), with the Google Coral camera installed by default (https://coral.ai/products/camera/)

Each layer is implemented as an individual project and has its own documentation (schematic + layout)

Here you can find the schematics and renders for each board in the stack:

IMX8MM-SOM-1A
540KB
SPF-31399_C2.pdf
pdf
IMX8MM-SOM-1A. 32-bit LPDDR4, Murata 1PJ WiFi, Full IO at 3V3
MEDIA BOARD. Engineering sample.
1MB
MEDIABOARD-ENGINEERING.SAMPLE.pdf
pdf
MEDIA BOARD. Engineering Sample ()
1MB
8m-hover-games-1a.pdf
pdf
Hover Games Interposer. Engineering sample.

Please note that, in accordance with the manufacturer's recommendations, the flex cable connectors are designed for only 20 mating cycles and the board-to-board connectors for only 30 mating cycles.

For development purposes, our own experience shows that the board-to-board connectors can in practice withstand up to 500 mating cycles. However this is beyond specification and for critical application use it is best to minimize mating cycles and stress to the connectors.

Engineering Sample boards

Pre-production hardware marked as "engineering samples" have the following known issues and limitations:

  • The SD Card Detect signal has been permanently tied to ground (which signals that an SD card is always present).

  • The 1G Ethernet may be unstable with some Ethernet hubs and/or end-point devices

These issues will be resolved in the next revision.

Boot Mode and Config Switches

  • Engineering sample boards are nevertheless suitable for debugging all the other interfaces and capabilities.

Boot mode Table

In order to switch the board into the various i.MX debugging modes, refer to the table below, which describes the boot modes available on the board.

The 8mmNavQ ships with an 8M Mini SOM in the board stack, not the 8M Nano. (Portions of this board stack can be re-purposed and used with the 8M Nano. Contact emCraft for details.)

  • An X means that the position of the switch does not matter.

  • A dark square shows the position of the movable element on the switch.

Boot from USB

To boot the system via USB, set the boot mode switches according to the table above for the "8M MINI: boot from USB" configuration. After power cycling the board, the processor will power on and become available via the USB-C connector. It will not execute code from the onboard SD card or eMMC.

Any regular tests and operations are now available via the UUU tool. The DDR Stress Test tool can also be used to perform LPDDR4 calibration.
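As a quick sanity check that the board has enumerated in serial download mode, you can ask UUU to list the devices it can see; the NavQ should show up as an i.MX 8M Mini device on an SDP path:

uuu -lsusb

(On Windows, run .\uuu.exe -lsusb from an administrator command prompt.)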

NavQ UBoot and LPDDR4 - Engineering Samples

This particular system is built with 3 GB of RAM, which means the default U-Boot configuration and the lpddr4_timing.c file must be updated. The patch provided below has been applied to the U-Boot tree available at:

  • https://source.codeaurora.org/external/imx/uboot-imx.git -b refs/heads/imx_v2018.03_4.14.98_2.0.0_ga

  • imx8m_navq_uboot.patch (52KB)

Alternatively, you may use just the LPDDR4 calibration file:

  • lpddr4_timing.c (38KB)
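As a rough sketch of how the patch might be applied before following the build manual in the next section (the branch name is taken from the link above, the path to the downloaded patch is a placeholder, and git is assumed to be installed):

git clone https://source.codeaurora.org/external/imx/uboot-imx.git -b imx_v2018.03_4.14.98_2.0.0_ga
cd uboot-imx
git apply /path/to/imx8m_navq_uboot.patch
# alternatively, overwrite the board's default lpddr4_timing.c with the calibration file above

Note that producing a bootable flash.bin additionally requires the SPL, ATF and DDR firmware binaries; that part is covered by the manual linked in the next section.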

Building U-Boot from Scratch

A detailed manual on how to build U-Boot from scratch is available at: https://community.nxp.com/docs/DOC-345535

The U-Boot image may be rebuilt using the manual and patch above. This allows you to boot the system via USB using the NXP UUU tool and program U-Boot to the eMMC using the UUU script below:

uuu_version 1.0.1

# for IMX8MQ, IMX8MM
SDP: boot -f flash.bin
SDPV: delay 1000
SDPV: write -f flash.bin -skipspl
SDPV: jump

# Configure U-Boot variables
FB: ucmd setenv fastboot_dev mmc

# Flash the MMC image
FB: ucmd mmc dev 1
FB: ucmd setenv mmcdev 1

FB: flash bootloader flash.bin

FB: ucmd mmc partconf 1 0 1 1
FB: ucmd mmc bootbus 1 1 0 2

FB: Done
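Assuming the script is saved next to the flash.bin produced by the build (the file name flash_uboot.uuu below is just an example), it can be run with:

sudo ./uuu flash_uboot.uuu

On Windows the equivalent is .\uuu.exe flash_uboot.uuu from an administrator command prompt.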

Switch back to eMMC mode

After successfully reaching this point you may switch back from "Boot from USB" mode to "boot from eMMC" mode.

Building Linux

To build Linux, refer to the git tree available at:

  • https://source.codeaurora.org/external/imx/linux-imx.git -b refs/heads/imx_4.14.98_2.0.0_ga

INITIAL DEVELOPERS ONLY - NOTE: The patch below for this tree is extremely raw and should be used as a reference only.

  • dirty.patch (7KB)
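A minimal sketch of fetching the tree, applying the reference patch and cross-compiling the kernel (the cross-compiler prefix and the use of the default arm64 defconfig are assumptions; check the i.MX Linux User's Guide for the exact configuration):

git clone https://source.codeaurora.org/external/imx/linux-imx.git -b imx_4.14.98_2.0.0_ga
cd linux-imx
git apply /path/to/dirty.patch
make ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu- defconfig
make ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu- -j$(nproc) Image dtbs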

  • To get the root file system, refer to the Yocto reference manual, chapter 5.

  • This is an initial version of the manual and its coverage is still quite limited. If you face any hardware issues, please report them via email to abushuev@emcraft.com and/or via NXP's Microsoft Teams messaging.

  • Contact NXP at iain.galloway@nxp.com for coordination or other questions.

Flashing the NavQ with 18.04 rootfs (internal)

Grabbing necessary files

Download Ubuntu rootfs + bootloader

You will need to download the Ubuntu 18.04 rootfs files from this link:

LINK

NOTE: NXP Internal. Only NXP employees currently have access to this link.

The two files you will need from this link are located in the imx8mmevk/ folder:

  • imx-boot-imx8mmevk-ds.bin-flash_evk

  • imx-image-multimedia-ubumtu-imx8mmevk-20200611102300.rootfs.sdcard.bz2

You will need to extract the .bz2 file using 7zip on Windows or bzip2 -d <filename> on Linux.
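For example, on Linux (the -k flag keeps the original archive):

bzip2 -dk imx-image-multimedia-ubumtu-imx8mmevk-20200611102300.rootfs.sdcard.bz2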

Download UUU

UUU is a tool used for flashing i.MX8 boards. You can download UUU for Windows or Linux using the link below:

LINK

Preparing the board

Setting DIP switches

In order to flash the NavQ using UUU, you'll have to change the DIP switches to the following configuration:

Powering the board

Power the board over USB-C as normal. You'll also want to connect the USB-C/UART adapter and open a serial console (baud rate 115200) to watch the serial output while flashing in case there are any errors.

Flashing the board

Using UUU to flash the board

(NOTE: This has only been tested on Windows 10)

Open CMD on Windows as administrator and change your directory to the location of uuu.exe, the bootloader, and the Ubuntu rootfs files you downloaded at the beginning of this guide. Then plug in the NavQ with the correct DIP switch configuration and run the following command:

.\uuu.exe -b emmc_all imx-boot-imx8mmevk-ds.bin-flash_evk imx-image-multimedia-ubumtu-imx8mmevk-20200611102300.rootfs.sdcard

Your eMMC should now be flashed with the Ubuntu 18.04 rootfs. The board still won't boot at this point; a few more steps are needed.
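On Linux the equivalent command should look like the following (untested here, per the note above; uuu typically needs root or a suitable udev rule to access the USB device):

sudo ./uuu -b emmc_all imx-boot-imx8mmevk-ds.bin-flash_evk imx-image-multimedia-ubumtu-imx8mmevk-20200611102300.rootfs.sdcard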

Adding correct NavQ DTB file

(NOTE: You must have an SD card that you can boot from or you can't perform these steps.)

Boot from the SD card and run the following commands to mount the eMMC boot partition:

cd /mnt/
sudo mkdir boot
sudo mount /dev/mmcblk2p1 /mnt/boot

Your eMMC boot partition should now be mounted. Next, connect your NavQ over Ethernet to your computer or router. Use an FTP client (I used FileZilla) to FTP into the NavQ, and then place the following file in /mnt/boot:

  • imx8mm-navq.dtb (39KB)
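Alternatively, if an SSH server happens to be running on your SD card image (an assumption; only FTP access is described here), the file can be copied from your PC with scp, where <user> and <navq-ip> are placeholders for your login and the board's address:

scp imx8mm-navq.dtb <user>@<navq-ip>:/tmp/
# then, on the NavQ:
sudo mv /tmp/imx8mm-navq.dtb /mnt/boot/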

This is a Device Tree Blob (.dtb) file that tells Linux what the device tree looks like. (WiFi does not work with this .dtb; a new one with working WiFi will be provided soon.) Once you've successfully copied over the .dtb file, you'll need to unmount the boot partition:

sudo umount /dev/mmcblk2p1

Now you can change the DIP switches to boot from eMMC and reboot by running reboot in the serial console.

Configure U-Boot to use new .dtb file

Once the NavQ starts booting from the eMMC, you'll want to press enter repeatedly until you get to the U-Boot console prompt. Once you're in the U-Boot prompt, you'll need to run the following commands:

setenv fdt_file imx8mm-navq.dtb
saveenv
boot

Your NavQ should then boot into the Ubuntu rootfs.
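If the board does not come up as expected, you can interrupt U-Boot again and confirm the variable was actually stored before issuing boot:

printenv fdt_file

It should report fdt_file=imx8mm-navq.dtb.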

The default username and password for this image is:

Username: bluebox
Password: bluebox

Note there are other usernames and passwords used for the SDK and Demo (Ubuntu) images.
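For anything beyond bench testing it is a good idea to change the default password once logged in:

passwd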

Voilà! You're done!