Understanding the Data in the AI Vision Utility in VEXcode EXP

The AI Vision Utility is what allows you to connect and configure your AI Vision Sensor. For step-by-step instructions on connecting to and configuring the sensor, see the related VEX Library articles.

Understanding how the AI Vision Sensor detects and measures objects can help you better use these measurements in your coding projects. With this knowledge, you can improve your coding skills and create more precise solutions for tasks like object recognition and spatial analysis.

Understanding Hue and Saturation

AI Vision Utility with a color signature's options shown. The color signature is named Blue and has a Hue Range value of 32 and a Saturation Range value of 0.57.

When configuring a color signature, options appear for both Hue and Saturation Ranges. These allow you to tune the color signature to be more resilient. A color signature is considered resilient when the object can be moved around and still be tracked by the AI Vision Utility.

Color wheel demonstrating how the 360 degree circle correlates to a hue value. The color red is at a degree of 0, the color green is at a degree of 120, and the color blue is at a degree of 240.

The first slider is the Hue Range. Hue is the perceived color, defined by its position on the color wheel. The color wheel spans 0 to 359.9 degrees, and each color on the wheel has a defined degree value.

The Hue Range allows you to choose the degrees above and below the configured color that will report as that color. For example, a dark blue may have the hue value of 240 degrees. With a Hue Range of 20 degrees, anything from 220 degrees to 260 degrees will report as that dark blue configured color.
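To see how this works, here is a minimal Python sketch (plain math, not the VEX API) that checks whether a measured hue falls within a configured Hue Range, including the wrap-around at 0/360 degrees:

```python
# Minimal sketch (not the VEX API): check whether a measured hue falls
# within a configured color signature's Hue Range.

def hue_in_range(measured_hue, configured_hue, hue_range):
    """Return True if measured_hue is within +/- hue_range degrees
    of configured_hue, wrapping around the 0-360 degree color wheel."""
    difference = abs(measured_hue - configured_hue) % 360
    # A difference of 350 degrees is really only 10 degrees the other way around.
    if difference > 180:
        difference = 360 - difference
    return difference <= hue_range

# Dark blue configured at 240 degrees with a Hue Range of 20 degrees:
print(hue_in_range(225, 240, 20))  # True  (within 220-260)
print(hue_in_range(265, 240, 20))  # False (outside 220-260)
```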

Graph of a saturation value increasing from 0% to 100%. The 0% saturation value is the color gray, and the 100% saturation value is colored bright red.

The second slider is the Saturation Range. Saturation is the intensity or purity of the color; the brighter the color, the more saturated it is. Saturation is measured as a percentage on a relative scale, from 0%, a muted grey tone, to 100%, an intense version of that hue.

The Saturation Range allows you to choose the percent of saturation above and below the configured color that will report as that color. For example, a red ball in dimmer lighting may appear as 50% saturation. With a Saturation Range of .25 (the decimal equivalent of 25%), anything from 25% to 75% saturation will report as that red configured color.
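The same idea applies to saturation. Below is a minimal Python sketch (again plain math, not the VEX API) that checks whether a measured saturation falls within a configured Saturation Range:

```python
# Minimal sketch (not the VEX API): check whether a measured saturation
# falls within a configured color signature's Saturation Range.

def saturation_in_range(measured_sat, configured_sat, sat_range):
    """All values are fractions from 0.0 to 1.0. Return True if
    measured_sat is within +/- sat_range of configured_sat."""
    return abs(measured_sat - configured_sat) <= sat_range

# Red ball configured at 50% saturation with a Saturation Range of 0.25:
print(saturation_in_range(0.30, 0.50, 0.25))  # True  (within 25%-75%)
print(saturation_in_range(0.20, 0.50, 0.25))  # False (below 25%)
```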

Understanding Pixels and Resolution

Diagram of a cartoon house drawn on top of grid paper, with some of the squares fully colored to represent pixels.

Imagine you're drawing a picture on a piece of grid paper. Each tiny square on the paper is like a pixel. When you color in these squares, you're making your picture.

Low Resolution: VEX 123 robot shown at a very low resolution to demonstrate the individual pixels in low-res displays.
High Resolution: VEX 123 robot shown at a high resolution to demonstrate the sharper image on high-res displays.

Now, let's talk about resolution. Resolution is the number of pixels in an image. If you have lots of tiny squares (pixels) in your grid paper, your picture will look sharp and detailed. But if you only have a few pixels, your picture might look blurry and not very clear.

Diagram of the AI Vision Sensor's resolution. The top left corner is labeled 0, 0, the top right corner is labeled 320, 0, and the bottom left corner is labeled 0, 240. The center of the screen is labeled 160, 120.

The AI Vision Sensor has a resolution of 320 pixels horizontally by 240 pixels vertically. This means that the precise center of detection aligns with coordinates 160 on the X-axis and 120 on the Y-axis.
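Because the resolution is fixed at 320 x 240 pixels, you can compare any reported pixel coordinate against the center of the image. The short Python sketch below (plain math, not the VEX API) shows one way to compute that offset:

```python
# Minimal sketch: how far a pixel coordinate is from the center of the
# AI Vision Sensor's 320 x 240 frame.

WIDTH, HEIGHT = 320, 240
CENTER_X, CENTER_Y = WIDTH // 2, HEIGHT // 2   # 160, 120

def offset_from_center(x, y):
    """Positive x offset means the point is right of center;
    positive y offset means it is below center."""
    return x - CENTER_X, y - CENTER_Y

print(offset_from_center(160, 120))  # (0, 0)    exactly centered
print(offset_from_center(200, 100))  # (40, -20) right of and above center
```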

How Does the AI Vision Sensor Measure Objects?

Data Reported by the Sensor

The AI Vision Sensor collects data on configured colors, AprilTags, and AI Classifications. Some of this data is shown in the AI Vision Utility and can help when planning and creating a VEXcode project. 

AI Vision Sensor is shown tracking a Blue Buckyball. The Buckyball has a tracking rectangle around it, and the label above shows that it has a width of 80 pixels and a height of 78 pixels. Red arrows are highlighting the tracking rectangle to demonstrate its width and height.

Width and Height

These are the width and height of the detected object, in pixels.

The width and height measurements help identify different objects. For example, a Buckyball will have a larger height than a Ring.
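As a rough illustration, the Python sketch below tells a Buckyball from a Ring using only the reported height. The 60-pixel cutoff is a made-up example value, not a calibrated threshold:

```python
# Minimal sketch, using a made-up pixel threshold: tell a Buckyball from a
# Ring by comparing the reported height of the detected object.

def guess_object(height_px):
    """height_px is the detected object's height in pixels. The 60-pixel
    cutoff is an illustrative assumption, not a calibrated value."""
    if height_px > 60:
        return "Buckyball"   # Buckyballs report a larger height than Rings
    return "Ring"

print(guess_object(78))  # "Buckyball"
print(guess_object(35))  # "Ring"
```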


AI Vision Sensor is shown tracking a Blue Buckyball. The Buckyball has a tracking rectangle around it, and the label above shows that it has an X position of 176 and a Y position of 117. The tracking rectangle's center is highlighted to demonstrate that the position is measured from the center.

CenterX and CenterY

These are the center coordinates of the detected object, in pixels.

CenterX and CenterY coordinates help with navigation and positioning. Because the AI Vision Sensor has a resolution of 320 x 240 pixels, an object centered in its view reports a CenterX of 160 and a CenterY of 120.
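One common use is steering toward an object by comparing its CenterX to the middle of the image. The Python sketch below is a minimal illustration; the 10-pixel dead band is an assumed value you would tune for your own robot:

```python
# Minimal sketch: use CenterX to decide which way to turn toward a detected
# object. The 10-pixel dead band is an illustrative assumption.

CENTER_X = 160  # horizontal center of the 320-pixel-wide image

def turn_direction(center_x, dead_band=10):
    """Return 'left', 'right', or 'straight' based on where the object's
    CenterX sits relative to the middle of the image."""
    error = center_x - CENTER_X
    if error > dead_band:
        return "right"
    if error < -dead_band:
        return "left"
    return "straight"

print(turn_direction(165))  # "straight" (within the dead band)
print(turn_direction(250))  # "right"
```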

Animation of a red square and a green square being rotated together to demonstrate the 360 degrees of an angle value.

Angle

Angle is a property available only for Color Codes and AprilTags. It reports how the detected Color Code or AprilTag is rotated within the sensor's view.
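For example, you might only accept a Color Code or AprilTag that is close to upright. The Python sketch below is a minimal illustration; the 15-degree tolerance is an assumed value:

```python
# Minimal sketch: check whether a reported angle (in degrees) is close to
# upright. The 15-degree tolerance is an illustrative assumption.

def is_upright(angle_deg, tolerance=15):
    """True if the Color Code or AprilTag is rotated less than `tolerance`
    degrees away from 0 in either direction."""
    difference = abs(angle_deg % 360)
    if difference > 180:
        difference = 360 - difference
    return difference <= tolerance

print(is_upright(5))    # True
print(is_upright(350))  # True  (10 degrees the other way)
print(is_upright(90))   # False
```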


AI Vision Sensor is shown tracking a Blue Buckyball. The Buckyball has a tracking rectangle around it, and the label above shows that it has an X position of 176 and a Y position of 117. The tracking rectangle's upper left corner is highlighted to demonstrate that the origin position is measured from its top left corner.

OriginX and OriginY

OriginX and OriginY are the coordinates of the top-left corner of the detected object, in pixels.

OriginX and OriginY coordinates help with navigation and positioning. By combining this coordinate with the object's Width and Height, you can determine the size of the object's bounding box. This can help with tracking moving objects or navigating between objects.
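The Python sketch below shows that relationship: given OriginX, OriginY, Width, and Height, it computes the bounding box corners and center. The example uses an 80 x 78 pixel object whose center works out to (176, 117), consistent with the Buckyball example pictured above (the origin values here are derived for the example, not reported in the image):

```python
# Minimal sketch: compute a detected object's bounding box corners and
# center from its OriginX, OriginY, Width, and Height (all in pixels).

def bounding_box(origin_x, origin_y, width, height):
    """Return ((left, top), (right, bottom), (center_x, center_y))."""
    left, top = origin_x, origin_y
    right, bottom = origin_x + width, origin_y + height
    center = (origin_x + width // 2, origin_y + height // 2)
    return (left, top), (right, bottom), center

# An 80 x 78 pixel object whose top-left corner is at (136, 78):
print(bounding_box(136, 78, 80, 78))
# ((136, 78), (216, 156), (176, 117)) -- center matches CenterX 176, CenterY 117
```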


Three AprilTags are being tracked by the AI Vision Utility. Each tag is identified, located, and outlined, indicating its tracking by the system. The AprilTag IDs in this example read 0, 9, and 3.

Tag ID

Tag ID is available only for AprilTags. It is the identification number of the detected AprilTag.

Identifying specific AprilTags allows for selective navigation. You can program your robot to move towards certain tags while ignoring others, effectively using them as signposts for automated navigation.
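As a simple illustration, the Python sketch below filters a hypothetical list of detections down to a single target tag. The detection data here is made up for the example, not live sensor output:

```python
# Minimal sketch, using hypothetical detection data: keep only the
# AprilTag your program cares about and ignore the rest.

detections = [                      # illustrative data, not live sensor output
    {"tag_id": 0, "center_x": 60},
    {"tag_id": 9, "center_x": 150},
    {"tag_id": 3, "center_x": 250},
]

TARGET_TAG = 9  # drive toward AprilTag 9, ignore tags 0 and 3

targets = [d for d in detections if d["tag_id"] == TARGET_TAG]
if targets:
    print("Drive toward tag", TARGET_TAG, "at CenterX", targets[0]["center_x"])
else:
    print("Target tag not visible")
```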

Four objects are being tracked by the AI Vision utility, two BuckyBalls and two Rings. Each object is identified, located, and outlined, indicating its tracking by the system. The utility also lists each object's AI Classification score, in this example each score reads 99%.

Score

The score property is used when detecting AI Classifications with the AI Vision Sensor.

The confidence score indicates how certain the AI Vision Sensor is about its detection. In this image, it's 99% confident in identifying these four objects' AI Classifications. You can use this score to ensure your robot only focuses on highly confident detections.
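As a simple illustration, the Python sketch below keeps only detections whose score meets a chosen threshold. The detection data and the 95% threshold are made-up example values:

```python
# Minimal sketch, using hypothetical detection data: only act on AI
# Classifications whose confidence score meets a chosen threshold.

MIN_SCORE = 95  # percent; an illustrative threshold, tune it for your project

detections = [
    {"label": "Buckyball", "score": 99},
    {"label": "Ring",      "score": 62},
]

confident = [d for d in detections if d["score"] >= MIN_SCORE]
for d in confident:
    print("Confident detection:", d["label"], d["score"], "%")
# Only the 99% Buckyball passes; the low-confidence Ring is ignored.
```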

For more information, help, and tips, check out the many resources at VEX Professional Development Plus.
