
Exercising Misty's Extensibility

3 Apr 2019 · CPOL · 6 min read
Deconstructing Misty's "Follow Ball" Skill

This article is in the Product Showcase section for our sponsors at CodeProject. These articles are intended to provide you with information on products and services that we consider useful and of value to developers.


Misty knows the importance of playing as hard as you work. That's why she's willing to risk a few grass stains in her Follow Ball skill.


In this skill, Misty connects to a microcontroller to employ the object recognition capabilities of a Pixy2 vision sensor and chase a soccer ball as it moves around the room. Here's a high-level overview of how it works:

  • A Pixy2 - "trained" to recognize and detect a soccer ball - sends image data to a microcontroller connected to Misty.
  • The microcontroller calculates the distance and heading from Misty to the ball, then sends these values to the code running on Misty by way of the UART serial port built into Misty's back (an example message is shown after this list).
  • When Misty receives this data, she transforms it into linear and angular velocity values for her drive command and moves toward or away from the ball.
  • The Pixy2 sends updated information to the microcontroller, and the process repeats with the new image data.
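
To make the data concrete, each serial message is a small stringified JSON object with pan, size, and direction fields. A message might look like this (the values here are made up for illustration):

{"pan":"42","size":"35","direction":"1"}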

This post breaks down each point in a little more detail, so you can build your own Follow Ball skill for Misty. We'll start by looking at the Pixy2.

Priming the Pixy2

The Pixy2 is a lightning-fast vision sensor for robots. Training the Pixy2 to recognize an object is as simple as holding the object in front of the sensor and pressing a button. It's built to easily connect with microcontrollers and single-board computers like an Arduino or Raspberry Pi, and Pixy provides libraries to simplify using image data in your code.

In the Follow Ball skill, we connect the Pixy2 to the Misty (Arduino-Compatible) Backpack using the ICSP connectors on the microcontroller. We use a 3D-printed attachment to position the sensor front-and-center on Misty’s base. Once this setup is done, we “teach” the sensor to recognize a soccer ball by pointing it in the right direction and holding down the button on the Pixy2.


Yep. That's all there is to it.

Now that the Pixy2 knows the object, we're ready to write the code for the microcontroller.

Preparing the microcontroller

For this skill, Misty uses the microcontroller in her own Arduino-compatible backpack. This microcontroller is a clone of the Arduino Uno that's been changed in a few important ways. One of these changes is the addition of magnets that attach the board to Misty's back for a direct connection to her UART serial port.


Extend Misty’s capabilities with her Arduino-compatible backpack, or use your own microcontroller.

We call code that runs on Misty's Arduino-compatible backpack a "sketch". You write a sketch with the same tools used to program an Arduino Uno. To process data from the vision sensor, we include the Pixy2 Arduino library in the sketch.

#include <Pixy2.h>

// This is the main Pixy object 
Pixy2 pixy;

We initialize the pixy object in the setup() function of the sketch. The built-in Serial.begin() function opens a serial port and sets the data transfer rate to 9600 baud (the rate for communicating with Misty's UART serial port). We also set the values of a few variables used in calculating the ball's movement.

// Globals used to track the ball between loops
int32_t OLD, NEW, panOffset;
int sizeMin, sizeMax, towards, count;

void setup()
{
  pixy.init();        // Initialize the Pixy2 vision sensor
  OLD = 0;            // Last heading offset to the ball
  towards = 0;        // Ball's direction of motion (-1, 0, or 1)
  Serial.begin(9600); // Open the UART serial connection to Misty at 9600 baud
}

Before she can execute her drive command, Misty needs to know her position relative to the ball. Specifically, she needs to know her heading, or the angle between the direction she's facing and the ball's position. She also needs to know the ball's distance from her and its current direction of movement.

The loop() function in this sketch transmits all of this information to Misty once every 200 milliseconds. In the loop function, we call pixy.ccc.getBlocks() to return data about objects the vision sensor detects. This data includes the width and height of detected objects in the frame (in pixels), which lets us approximate how close the ball is to Misty: a larger apparent size means the ball is closer, and a smaller size means it's farther away.


We get Misty's heading by calculating the difference between the horizontal center of the frame and the X coordinate of the ball's center within the frame. To determine which direction the ball is moving, we compare this value to the heading from the previous loop.

We then package all this data - size, pan (heading), and direction - into a stringified JSON object and send it to Misty using Serial.println(). Check out the sketch on GitHub, or see the loop() function below.

void loop()
{
  pixy.ccc.getBlocks();
  if (pixy.ccc.numBlocks) {

    // Approximate size of the ball in pixels (smallest and largest block dimension)
    sizeMin = min(pixy.ccc.blocks[0].m_width, pixy.ccc.blocks[0].m_height);
    sizeMax = max(pixy.ccc.blocks[0].m_width, pixy.ccc.blocks[0].m_height);

    // Ignore detections too small to be the ball
    if (sizeMin > 10) {

      // Heading to ball: offset of the ball's center from the center of the frame
      panOffset = (int32_t)pixy.frameWidth/2 - (int32_t)pixy.ccc.blocks[0].m_x;

      // Calculate direction of motion by comparing this heading to the last one
      NEW = panOffset;
      if (abs(NEW - OLD) > 20) {
        if (NEW - OLD < 0) { towards = -1; }
        else { towards = 1; }
        OLD = NEW;
        count = 0;
      }

      // If the heading stays steady for a while, treat the ball as stationary
      if (abs(NEW - OLD) < 10) {
        count++;
        if (count > 30) {
          towards = 0;
        }
      }

      // Send the data to Misty as a stringified JSON object
      Serial.println("{\"pan\":\""+String(panOffset)+"\",\"size\":\""+String(sizeMax)+"\",\"direction\":\""+String(towards)+"\"}");
    }
  }
  delay(200);
}

Next, we set up the JavaScript code that runs on Misty to handle incoming data and calculate the values for her drive command.

Coding Misty

The code running on Misty processes each message the microcontroller sends through her UART serial port. It converts this data into arguments that, when passed into Misty's drive command, move her closer to (or farther away from) the ball.

To manage this in the skill, we use misty.RegisterEvent() to register for StringMessage events. These events occur each time the microcontroller sends a message through the receiver (RX) pin on Misty's UART serial port.

function sub_arduino(){
    misty.AddReturnProperty("StringMessage", "StringMessage");
    misty.RegisterEvent("StringMessage", "StringMessage", 0, true);
}

By default, when StringMessage events occur, the skill passes the event data to a callback function named _StringMessage(). This data includes any messages the microcontroller sends to Misty. You define how to process this data in the function definition for the _StringMessage() callback.

function _StringMessage(data) { 
// Process the incoming message from Misty's UART serial port
}

In the Follow Ball skill, _StringMessage() parses out the data from the microcontroller and converts it into values for the linearVelocity and angularVelocity arguments of the misty.Drive() method. Each of these values is an integer between -100 and 100. For linearVelocity, a value of -100 translates to "drive backward at max speed", while 100 means "drive forward at max speed". For angularVelocity, -100 means "rotate clockwise at max speed", and 100 means "rotate counterclockwise at max speed".

We use the size of the ball in the image from the Pixy2 to calculate an appropriate value for the linearVelocity argument. If the ball appears smaller than it should, Misty drives toward it, and if it appears larger, she backs away. The value for angularVelocity correlates with Misty's heading – we calculate a value that has Misty rotate or drive in a curve toward the ball. See lines 35 - 72 of followBall.js for how these calculations are made in the code.
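
As a rough sketch of that mapping (not a copy of followBall.js), a callback along these lines could parse the message and turn it into drive arguments. It assumes the microcontroller's message arrives as the first item in the callback's AdditionalResults array, and targetSize, sizeScale, and panScale are made-up tuning values:

function _StringMessage(data) {
    // Assumption: the serial message is available here as a JSON string
    var message = JSON.parse(data.AdditionalResults[0].Message);
    var size = parseInt(message.size);
    var pan = parseInt(message.pan);

    // Made-up tuning values for illustration only
    var targetSize = 60; // apparent ball size (in pixels) Misty tries to maintain
    var sizeScale = 2;   // how strongly the size difference affects forward/backward speed
    var panScale = 0.5;  // how strongly the heading offset affects rotation

    // Ball looks smaller than the target -> drive forward; larger -> back away
    var linearVelocity = (targetSize - size) * sizeScale;
    // Turn toward the side the ball is on
    var angularVelocity = pan * panScale;

    // Clamp both values to the -100 to 100 range misty.Drive() expects
    linearVelocity = Math.max(-100, Math.min(100, linearVelocity));
    angularVelocity = Math.max(-100, Math.min(100, angularVelocity));

    misty.Drive(linearVelocity, angularVelocity);
}

The real skill also uses the direction value the microcontroller sends and a few other checks; see followBall.js for the full calculation.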

Misty recalculates these values each time the microcontroller sends new image data, so she's constantly adjusting her speed and position relative to the ball. All this adds up to a dynamically playful robot, optimized for cuteness. Watch the video to see what we mean.

Extensibility means limitless potential

You'd be hard-pressed to find a robot whose hardware enables every job you dream up. That's what makes extensible platforms so compelling. A robot with robust native capabilities that you can augment with third-party hardware has a lot of employment opportunities.

Don't miss the full repository for the Follow Ball skill on GitHub, and if you're interested in more talk about robots, be sure to join the conversation in the Community Forums.

Read this article (and others) on the Misty Robotics blog.

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


Written By
United States
I write to learn, to simplify complexity, and to enable developers to build new things. These ambitions drive my work as a Developer Writer for the platform we're building at Misty Robotics. When I'm not documenting robot APIs, I spend as much time as I can in the mountains near my Utah home.
