Coding the VEX AI Robot

This article covers an example project that displays a dashboard reporting the status of robot-to-robot communications over VEXlink as well as the status of the connection to the Jetson. The ai_demo project is hosted on our GitHub. This demo project collects data from the Jetson processor via a USB serial connection. Once the data is received, it is displayed on the V5 Brain's screen and also transmitted to a partner V5 robot connected via VEXlink.

Note: This project requires the latest version of the VS Code Extension for V5. Download the VS Code Extension for V5 here.

NVIDIA Jetson Nano to VEX V5 Brain Communications

The Jetson processor contains an application that collects the following data from the VEX AI software:


Robot Location data:

  • Robot's X, Y location in meters from the center of the field.
  • Robot's Azimuth (Heading), Elevation (Pitch), and Rotation (Roll), all in radians.

Object Detection data (three types):

[Image: pixel coordinate diagram, origin (0,0) at the top left of the camera image]

Image Detection (type one):
  • This data represents a detected object by the VEX AI Intel Camera.
  • This data describes the object with reference to the camera image.
  • Values for X, Y, width, and height are in units of pixels, measured from the top left corner of the image to the top left corner of the object's detection box. The image resolution is 640x480.


[Image: field X/Y coordinate diagram]

Map Detection (type two):

  • This data represents the location of the object on the field, in the same coordinate system as the GPS Sensor, reported in meters.
  • Values for X and Y are in meters from the center of the field along their respective axes. The value of Z is the object's height in meters above the field tiles.

Detection Object (type three):


  • This encapsulates all information about a detected object.
  • Each object contains a value that represents the classification of the detected object (Class ID: 0 = GreenTriball, 1 = RedTriball, 2 = BlueTriball).
  • Each object also contains a probability that represents the VEX AI's confidence in the detection, after a filter has removed detections of low probability.
  • In addition, the depth of the object is reported in meters from the VEX AI Intel Camera.
  • The Image Detection and Map Detection are attached to each object to represent the object's coordinates on the image as well as in the real world.
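Putting the three detection types together, the record for a single detected object might look like the following sketch. This is plain C++ with hypothetical names chosen to mirror the description above; the actual layouts are defined in the ai_demo project's headers.

```cpp
#include <cstdint>

// Hypothetical sketch of the detection data described above; the real
// definitions live in the ai_demo project's headers.
struct ImageDetection {      // type one: pixel coordinates in the 640x480 image
    int32_t x, y;            // top left corner of the detection box, in pixels
    int32_t width, height;   // box size, in pixels
};

struct MapDetection {        // type two: field coordinates (GPS Sensor frame)
    float x, y;              // meters from the center of the field
    float z;                 // height in meters above the field tiles
};

struct DetectionObject {     // type three: everything known about one object
    int32_t classID;         // 0 = GreenTriball, 1 = RedTriball, 2 = BlueTriball
    float   probability;     // confidence, after low-probability filtering
    float   depth;           // meters from the VEX AI Intel Camera
    ImageDetection image;    // where the object appears in the camera image
    MapDetection   map;      // where the object sits on the field
};
```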

A breakdown of the ai_demo program:



Standard includes for VEX projects:


Declare an instance of the Jetson class. This class is used to send requests for data to the Jetson as well as receive data via the USB serial connection.



Declare an instance of the robot_link class. This object is used to connect and transfer data between this robot and a partner robot. The same project can be downloaded to two separate robots. One robot will need to have the line:

#define MANAGER_ROBOT 1

Before you load the code onto the second robot, you will need to comment out that line:

//#define MANAGER_ROBOT 1

The robot_link class sets up the robot's VEXlink and handles transmitting and receiving data between the two robots. This article will not go into detail about how that class works; it is a good idea to understand how VEXlink works first. For more detailed information on using the V5 VEXlink API, this document explains the new libraries and how to use them effectively for robot-to-robot communication.
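As a sketch of how one source file can serve both robots, the MANAGER_ROBOT define can select the link's role at compile time. The names below are illustrative stand-ins, not the actual ai_demo code, which constructs its robot_link with its own port and link names:

```cpp
#include <string>

#define MANAGER_ROBOT 1   // comment out this line before building the second robot

// Returns which VEXlink role this build would take; an illustrative stand-in
// for constructing robot_link with a manager vs. worker link type.
inline std::string linkRole() {
#ifdef MANAGER_ROBOT
    return "manager";     // first robot: creates and manages the link
#else
    return "worker";      // second robot: joins the existing link
#endif
}
```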

  • Competition Event handlers


    One of the biggest differences between VAIC and VRC is that there is no driver control period. Instead, there are two autonomous periods: the isolation period and the interaction period. In this example, there is a separate routine for each autonomous period. Because the VEX API does not support two different autonomous callbacks, the program uses a flag to determine which routine to execute. In this example program, the “firstAutoFlag” is used to call the isolation function the first time autonomous is enabled, and the interaction function when autonomous is enabled for the second time. One thing to note: if for some reason the match needs to be reset, the demo program must also be restarted so that the firstAutoFlag is reset.
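The flag-based dispatch described above can be sketched as follows. The function names and counters here are illustrative; the demo registers its own isolation and interaction routines with the competition API.

```cpp
// Sketch of the firstAutoFlag pattern: the same autonomous callback runs
// twice per match, dispatching to a different routine each time.
static bool firstAutoFlag  = true;
static int  isolationRuns   = 0;   // stand-ins for the real routines
static int  interactionRuns = 0;

void autoIsolation()   { ++isolationRuns; }
void autoInteraction() { ++interactionRuns; }

// In the demo, a function like this is registered as the autonomous callback.
void autonomousMain() {
    if (firstAutoFlag) {
        autoIsolation();        // first autonomous enable: isolation period
        firstAutoFlag = false;  // never reset, so a match reset requires a restart
    } else {
        autoInteraction();      // second autonomous enable: interaction period
    }
}
```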

  • Main()

    [Image: main() source listing]

    This is the main task for this project. It starts off by calling vexcodeInit() to correctly set up the VEXcode environment. Next, a local AI_RECORD object is declared to store the data we receive from the Jetson. A separate task is also set up to handle updating the screen with the most current data. The code for that task is contained in the dashboard.cpp file. The autonomous callback is also registered to handle when the autonomous periods are initiated.

    The main while() loop starts by copying the latest data from the jetson_comms object into our local AI_RECORD object. It then passes the robot's location information to the link object so that it can be transmitted to our partner robot. Once it is done processing the data, it requests more data from the Jetson and sleeps for 66 milliseconds, giving a polling rate of about 15 Hz. There is no reason to poll any faster, as the AI system's data updates at roughly 15 Hz.

    Note: the Jetson map data only needs to be requested by a single task.
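    One iteration of that loop can be sketched as below. The jetson_comms, link, and AI_RECORD names come from the demo project, but the stub types and method signatures here are simplified stand-ins so the loop structure can run on its own:

```cpp
// Minimal stand-ins for the demo's types so the loop structure is runnable;
// in ai_demo these come from the Jetson and robot_link classes.
struct AI_RECORD { float x, y, az; };

struct JetsonStub {
    void get_data(AI_RECORD *r) { r->x = 0; r->y = 0; r->az = 0; }
    void request_map() { ++requests; }
    int  requests = 0;
};

struct LinkStub {
    void set_remote_location(float, float, float) { ++sends; }
    int  sends = 0;
};

// One pass of the main while() loop: copy the latest Jetson data, share our
// pose with the partner robot, then request the next update.
void pollOnce(JetsonStub &jetson_comms, LinkStub &link, AI_RECORD &local_map) {
    jetson_comms.get_data(&local_map);                                // latest data
    link.set_remote_location(local_map.x, local_map.y, local_map.az); // to partner
    jetson_comms.request_map();                                       // ask for more
    // the real loop then sleeps ~66 ms, matching the ~15 Hz update rate
}
```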

For more information, help, and tips, check out the many resources at VEX Professional Development Plus
