OpenCV

Note: If using a downloaded image, first check whether these packages are already installed.

Accessing the cameras using OpenCV in Python

Below are short examples of using OpenCV. Refer to NXP.com or other guides for more detail.

Tip: You may find more complete application examples in the MR-B3RB documentation.

Getting started

To install OpenCV for Python on the image, run the following command:

sudo apt install python3-opencv

Getting images from the camera using GStreamer pipelines

To access the Google Coral Camera(s) on NavQ+ in OpenCV, you may use the following VideoCapture instantiation:

cap = cv2.VideoCapture('v4l2src device=/dev/video3 ! video/x-raw,framerate=30/1,width=640,height=480 ! appsink', cv2.CAP_GSTREAMER)

You may change the source resolution by editing the width and height values in the GStreamer pipeline. The supported resolutions and framerates are listed below.

video/x-raw, format=YUY2, width=2592, height=1944, framerate=8/1
video/x-raw, format=YUY2, width=1920, height=1080, framerate={ (fraction)15/1, (fraction)30/1 }
video/x-raw, format=YUY2, width=1280, height=720, framerate={ (fraction)15/1, (fraction)30/1 }
video/x-raw, format=YUY2, width=1024, height=768, framerate={ (fraction)15/1, (fraction)30/1 }
video/x-raw, format=YUY2, width=720, height=576, framerate={ (fraction)15/1, (fraction)30/1 }
video/x-raw, format=YUY2, width=720, height=480, framerate={ (fraction)15/1, (fraction)30/1 }
video/x-raw, format=YUY2, width=640, height=480, framerate={ (fraction)15/1, (fraction)30/1 }
video/x-raw, format=YUY2, width=320, height=240, framerate={ (fraction)15/1, (fraction)30/1 }
video/x-raw, format=YUY2, width=176, height=144, framerate={ (fraction)15/1, (fraction)30/1 }
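
Building on the VideoCapture line above, the short sketch below opens the camera, grabs a single frame, and writes it to disk. It is a minimal example, assuming OpenCV was built with GStreamer support (check cv2.getBuildInformation() if the capture fails to open) and that the camera enumerates as /dev/video3; a videoconvert element is added so appsink delivers the BGR frames OpenCV expects, which is an assumption about the simplest working pipeline rather than part of the original example.

import cv2

# GStreamer pipeline for the Google Coral camera on /dev/video3.
# videoconvert is added (assumption) so appsink delivers BGR frames for OpenCV.
pipeline = (
    'v4l2src device=/dev/video3 ! '
    'video/x-raw,framerate=30/1,width=640,height=480 ! '
    'videoconvert ! appsink'
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError('Could not open camera; check GStreamer support and the device node')

ret, frame = cap.read()               # grab a single frame
if ret:
    cv2.imwrite('frame.jpg', frame)   # save it next to the script
cap.release()

The same VideoCapture object can of course be read in a loop for continuous processing.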

I2C

I2C configuration and setup

Install I2C tools

If not already present, the first step is to install the I2C tools so that the I2C ports can be used:

sudo apt-get update -y
sudo apt-get install -y i2c-tools

Adjust I2C user group

To use the I2C commands without root, you'll need to add the NavQ+ user to the i2c group. To do this, you can run the following commands:

sudo usermod -a -G i2c user
sudo su
echo 'KERNEL=="i2c-[0-9]*", GROUP="i2c"' >> /etc/udev/rules.d/10-local_i2c_group.rules

Check the I2C connection

Now check the connection and confirm that the port is working correctly. Connect something to the I2C JST-GH port, then run the command below. It should show raw output from most devices connected on the I2C bus(es). Note that the onboard NXP secure element SE05x will not respond to this command.

i2cdetect -y 5

Example

Tip: The link below is a third-party example from one of the NXP HoverGames participants that shows the use of I2C.
Smart and Sustainable Agriculture System (Hackster.io)
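
If you prefer to talk to the bus from Python rather than the command line, below is a minimal sketch using the third-party smbus2 package (an assumption; install it with pip3 install smbus2 if it is not on your image). The bus number matches the i2cdetect example above, while the device address and register are placeholders you should replace with the values reported for your own sensor.

from smbus2 import SMBus

I2C_BUS = 5          # same bus number as in "i2cdetect -y 5"
DEVICE_ADDR = 0x40   # placeholder: use the address shown by i2cdetect for your device
REGISTER = 0x00      # placeholder: device-specific register to read

# Open the bus, read one byte from the register, and print it
with SMBus(I2C_BUS) as bus:
    value = bus.read_byte_data(DEVICE_ADDR, REGISTER)
    print(f'Register 0x{REGISTER:02X} = 0x{value:02X}')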

GStreamer

Pipeline examples

Take an image

gst-launch-1.0 -v v4l2src num-buffers=1 device=/dev/video3 ! jpegenc ! filesink location=capture.jpeg

Record a video

gst-launch-1.0 v4l2src device=/dev/video3 ! imxvideoconvert_g2d ! "video/x-raw, width=640, height=480, framerate=30/1" ! vpuenc_h264 ! avimux ! filesink location='/home/user/video.avi'

Streaming examples

  • On PC: RX pipeline to receive the stream from the NavQPlus AI/ML companion computer (a Python/OpenCV variant is sketched after this list):

gst-launch-1.0 udpsrc port=50000 ! "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! queue ! autovideosink

  • On NavQPlus (e.g. from SSH connection):

gst-launch-1.0 v4l2src io-mode=dmabuf device=/dev/video3 ! "video/x-raw,width=1920,height=1080,framerate=(fraction)30/1" ! vpuenc_h264 ! h264parse ! rtph264pay ! udpsink host=10.0.1.101 port=50000

Note: Modify the host IP address 10.0.1.101 to match your receiving laptop.

  • Video Test Source instead of camera:

gst-launch-1.0 videotestsrc ! "video/x-raw,width=640,height=480,framerate=(fraction)10/1" ! vpuenc_h264 ! h264parse ! rtph264pay ! udpsink host=10.0.1.101 port=50000

  • Camera Tuning: 1) Get the video subdevice using:

media-ctl -p /dev/media0

Remember the sub-device of the camera (ov5645) in the media video chain, e.g. /dev/v4l-subdev2.

2) List the available user and camera controls and try them out (e.g. contrast, saturation, auto white balancing etc.):

v4l2-ctl -l -d /dev/v4l-subdev2

  • Streaming and storing to an H.264-compressed file simultaneously, including text overlays, for 30 seconds:

gst-launch-1.0 v4l2src io-mode=dmabuf device=/dev/video3 num-buffers=900 ! "video/x-raw,width=1920,height=1080,framerate=(fraction)30/1" ! tee name=t ! queue leaky=1 ! textoverlay text="Live" ! vpuenc_h264 ! h264parse ! rtph264pay ! udpsink host=10.0.1.101 port=50000 sync=false t. ! queue ! textoverlay text="recorded" ! vpuenc_h264 ! mpegtsmux ! filesink location=record_$(date +"%Y-%m-%d_%T").mp4

Note: The VPU is encoding two streams in parallel, as we have different text overlays on the live stream and the recorded file.
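
As an alternative to the gst-launch RX pipeline above, the stream can also be consumed on the receiving PC in Python with OpenCV. This is a sketch under the assumption that the PC's OpenCV build has GStreamer support; the pipeline mirrors the RX example and adds a videoconvert so the decoded frames arrive as BGR images.

import cv2

# Receive the RTP/H.264 stream sent by the NavQPlus on UDP port 50000 and display it.
rx_pipeline = (
    'udpsrc port=50000 caps="application/x-rtp, media=(string)video, '
    'clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! '
    'rtph264depay ! decodebin ! videoconvert ! appsink'
)

cap = cv2.VideoCapture(rx_pipeline, cv2.CAP_GSTREAMER)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow('NavQPlus stream', frame)   # press 'q' in the window to quit
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()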

Note on hostapd

To avoid needing to set up a Wi-Fi router, hostapd can be installed so that the NavQPlus acts as an access point. Additionally, a DHCP server can be configured. This way you can connect directly to the Wi-Fi network spawned by the NavQPlus.

The IP address 10.0.1.101 used in the examples above is assigned by the DHCP server on the NavQPlus.


WebServer

Controlling a NavQPlus with an HTML WebServer

Introduction

This is an example of how to use a webserver running locally on the NavQPlus. Note that this is one of several possible methods. In this example, the idea is to control the robot (or anything else) using low-level commands written as shell scripts, invoked from an HTML page. Not relying on a higher-level language such as Python helps keep the robot responsive and quick to act.

This page explains how to set up the webserver and control your NavQ+.

The webserver we will use is called Lighttpd. For more information on Lighttpd, see the following link:

WikiStart - Lighttpd - lighty labs (redmine.lighttpd.net)

The first step is to install the Lighttpd webserver and its components. Run the following commands in your NavQ+ serial console:

sudo apt-get -y install lighttpd
sudo lighttpd-enable-mod cgi
sudo lighttpd-enable-mod fastcgi

The three commands above should complete without errors.

Lighttpd looks for an index.html page at /var/www/html. We will change this so that index.html is served from /var/www instead. For that, we must edit the Lighttpd config file using nano (if you do not have nano, see this chapter):

sudo nano /etc/lighttpd/lighttpd.conf

In this file you should change:

server.document-root = "/var/www/html"

to:

server.document-root = "/var/www"

Note: I personally did not have to change the file location; it may already be set correctly on your image.

Then exit the file and save. For the changes we made just now to take effect, we must restart the web server. To do that, enter both commands in order:

sudo /etc/init.d/lighttpd stop
sudo /etc/init.d/lighttpd start

At this point the web server is running, and if an index.html page is located at /var/www, we can access it from any browser. Get the NavQ+ IP address, type it into your browser's address bar, and you should see the default Lighttpd web page.

Now let's place an example template webpage and access it.

Stop the server for the next few steps:

sudo /etc/init.d/lighttpd stop

Clone the following repository to somewhere you will remember in your home directory:

git clone https://github.com/eslamfayad/Hover_Games3_E.F.git

After cloning the repository, copy some of the files to /var/www. This can be done with these commands:

sudo cp -r "yourdirectory"/Hover_Games3_E.F/ROBOT_WEB_SERVER/images /var/www
sudo cp -r "yourdirectory"/Hover_Games3_E.F/ROBOT_WEB_SERVER/cgi-bin /var/www
sudo cp "yourdirectory"/Hover_Games3_E.F/ROBOT_WEB_SERVER/index.html /var/www

Now access the page again using the IP address of your NavQPlus; you should see the example page.

You have now made a simple webserver. This webpage can be quite useful as a GUI for robot controls, as shown in the example page.
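
Once the server is up, the CGI scripts can also be triggered from any HTTP client, not just the HTML page. The sketch below is a hypothetical example: the script name forward.sh and the board IP are placeholders, so substitute a script that actually exists in the cgi-bin directory you copied and the IP address of your NavQPlus.

import urllib.request

NAVQ_IP = '10.0.1.101'                        # placeholder: your NavQPlus IP address
url = f'http://{NAVQ_IP}/cgi-bin/forward.sh'  # placeholder: a script from /var/www/cgi-bin

# Requesting the CGI URL asks lighttpd to execute the shell script on the NavQPlus,
# which is essentially what the buttons on the example HTML page do.
with urllib.request.urlopen(url, timeout=5) as response:
    print(response.status, response.read().decode(errors='replace'))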

Application Software

The NavQPlus may be considered a generic embedded Linux computer. When running the Ubuntu POC image, most Linux packages may be installed using apt or apt-get. As with any Linux machine, not all packages are suited to the specific hardware.

There is also specific enablement related to the i.MX 8M Plus SoC used on the board; please refer to NXP.com for more details. This includes things like hardware-accelerated video using GStreamer and hardware-accelerated neural-net processing using eIQ on the NPU. Note that the MR-B3RB documentation demonstrates the usage of NavQPlus as a robotic platform running ROS and is considered one of our reference development tools. Following the software guides there may be preferable and provide more detail.
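
As a concrete illustration of the NPU enablement mentioned above, the sketch below runs a TensorFlow Lite model through the VX delegate provided by NXP's eIQ / i.MX machine learning packages. The delegate path /usr/lib/libvx_delegate.so, the tflite_runtime package, and the model file name are assumptions based on NXP's i.MX Machine Learning User's Guide and may differ on your image; consult NXP.com for the authoritative instructions.

import numpy as np
import tflite_runtime.interpreter as tflite

# Load a TFLite model and offload supported operations to the i.MX 8M Plus NPU
# via the VX delegate (path is an assumption; adjust to your image).
delegate = tflite.load_delegate('/usr/lib/libvx_delegate.so')
interpreter = tflite.Interpreter(model_path='model.tflite',      # placeholder model
                                 experimental_delegates=[delegate])
interpreter.allocate_tensors()

# Run a single inference on dummy data shaped like the model's input tensor
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
dummy = np.zeros(inp['shape'], dtype=inp['dtype'])
interpreter.set_tensor(inp['index'], dummy)
interpreter.invoke()
print('output shape:', interpreter.get_tensor(out['index']).shape)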
