Motion Tracking Robot

by Matthew, Laurin, Lucas

Summary: Last time, we wrote code using OpenCV to track moving objects through image processing. This time, we move our project into the physical world with a robot that uses that code to track people with a laser.

What we did

Designing the turret

Our original idea in Project 1 was to make a turret in the style of the video game Portal. However, we quickly found that this was a non-trivial problem that many hobbyists and professional engineers have been working on for years. Since none of us have much in the way of engineering experience, we instead went with a utilitarian design.

[Image: the OpenSCAD turret design]

This was our OpenSCAD design. The box has enough space to comfortably house the webcam, the Arduino Mega, and the breadboard we use for basic circuitry. Instead of making the box a single object, we made each section removable for rapid prototyping. The webcam is held securely in place by a back plate that is screwed into the front, and there are mounting holes and standoffs for the Arduino.

The actual turret is a very simple two-axis servo mount. The bottom servo controls horizontal rotation, and the top servo controls vertical. The laser itself is housed in the cylinder you see in the picture above. Additionally, the turret was designed so that the emitting end of the laser experiences no translation during normal operation. This was to simplify some of the math you’ll read about later.

Overall, the turret can be made very cheaply (around $30-40), which would make it a good DIY project. We plan to upload the files to Thingiverse once we have made a statically linked executable for the control software.

A picture of the finished turret (under repair):

[Photo of the finished turret]

And here is a video of the turret in action:

Finding Turret Position

In the end, the answer to where to point the laser was not calculus, but just more trig. We decided that when the laser is pointed straight ahead, with both servos (up/down and left/right) at 90 degrees, it should point at the very center of the image from our webcam. (It turned out that this wasn't the case; to get the base calculations done, though, we assumed we were correct and dealt with fixing it later.)

With our assumption, the calculations became fairly simple. We looked at the image from the webcam and ignored everything in it except the center of the target we draw on it. The target-drawing code already does all the calculations to center the target on the tracked object, so all we needed to do was figure out how to make the laser point at the center of the target.

What we ended up doing was thinking of the image as a Cartesian grid, with the center of the image as our (0,0), the point where both servos sit at 90 degrees. From there, we looked at where the target was on that grid and calculated the angle each of our servos needed to go to in order to line up the laser. We did this with two arctangents. One servo needed to know how far left or right to go, so we ignored the target's height, projected it onto the horizontal axis, and solved the arctangent to get the servo's offset from 90 degrees. We did the same for the vertical axis and ran both servos simultaneously, effectively turning two single-axis motions into one two-axis motion that tracked actual movement in the real world with a laser pointer!

Using arctangents this way did have some problems, though. For one, an arctangent needs to know how far away the target is, and we had no way of calculating that, so we assumed a fixed distance between the object and the webcam. We also still had the off-by-a-little-bit error mentioned earlier: at the default 90 degrees on both servos, the laser did not point at the center of the webcam image, because the laser was mounted a few inches above the webcam.

How did we end up dealing with that problem? Just more trig! In fact, just another arctangent. Instead of using the location of the target, though, we used the already-fixed distance to the tracked object and the distance between the laser and the webcam to calculate a fairly small angle offset that the up/down servo needs to take into account in order to actually keep the laser on the target.

The math worked out to be this:

int x_command = -atan2((destination.x - configuration["CAM_RES_X"] / 2) * parallax_unit, configuration["PARALLAX"]) * 180 / CV_PI + 90;

int y_command = atan2((destination.y - configuration["CAM_RES_Y"] / 2) * parallax_unit, configuration["PARALLAX"]) * 180 / CV_PI + 90 + vertical_offset;

The two commands, the angles the servos should rotate to, are pretty much identical, except that the Y command also adds in the vertical offset, which is calculated elsewhere. The calculations are fairly simple: the first parameter of atan2 is the vertical or horizontal distance of our target from the center of the image, and the second is the hard-coded distance from the camera to the target it's tracking. The result is multiplied by 180/pi to convert from radians to degrees, and then the base 90 degrees is added, so that when the target is in the center of the screen the servos return to their proper default positions.
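Since the vertical offset is calculated elsewhere in our code, here is a minimal sketch of how the geometry described in the previous section works out; the configuration entry LASER_CAM_OFFSET is a hypothetical name for the laser-to-webcam separation, in the same units as PARALLAX:

// Sketch: angle correction for the laser sitting a few inches above the webcam.
// LASER_CAM_OFFSET (hypothetical name) is the laser-to-camera separation;
// PARALLAX is the assumed fixed distance to the tracked object.
int vertical_offset = atan2(configuration["LASER_CAM_OFFSET"], configuration["PARALLAX"]) * 180 / CV_PI;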

Sending the Data

So now that we have the calculations for where to move the servos, how do we get that data across to the Arduino? We had to figure out how to get our C++ code to send data through the serial port to the Arduino, so the Arduino could extract it and do something meaningful with it.

We found that you can write data to a serial port by treating it as a file: write your data to a stream, and it is sent along through the serial port. Really straightforward.

So the next problem was: how should the data being sent be formatted?

We decided to go with a single character followed by two integer values. The character could be an 'A', to indicate that the turret is moving automatically and that the following integers are the location of the target the turret should move to. Or the character could be an 'M', to indicate that we have manual control and should treat the following stream differently. At the time of writing, manual mode wasn't fully working, though all of the architecture is in place to implement it.
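As a minimal sketch of both ideas, writing to the port as a file and the character-plus-two-integers format, here is roughly what the PC side can look like. The device path /dev/ttyACM0 is a typical Arduino port on Linux, and the space separators are an assumption, not necessarily what our code uses:

#include <fstream>

// Send one command over the serial port by treating it as a file.
// mode is 'A' (automatic) or 'M' (manual); x and y are the two integers.
void send_command(char mode, int x, int y) {
    std::ofstream serial("/dev/ttyACM0");            // assumed device path
    if (!serial.is_open()) return;                   // port not available
    serial << mode << ' ' << x << ' ' << y << '\n';  // e.g. "A 95 87"
    serial.flush();                                  // push the bytes out immediately
}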

The Arduino breaks apart the data depending on the first character. If the first character is an 'A', it parses the following two integers, does the math on them as discussed in the previous section, and then tells the servos where to go.
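And a minimal sketch of the receiving side, using the standard Arduino Servo library. The pin numbers are placeholders, and this version assumes the two integers arrive as ready-to-use servo angles, whereas our actual code does some of the math on the Arduino itself:

#include <Servo.h>

Servo pan;   // bottom servo: left/right
Servo tilt;  // top servo: up/down

void setup() {
  Serial.begin(9600);
  pan.attach(9);     // placeholder pin numbers
  tilt.attach(10);
}

void loop() {
  if (Serial.available() > 0) {
    char mode = Serial.read();
    if (mode == 'A') {             // automatic mode: two integers follow
      int x = Serial.parseInt();   // pan command
      int y = Serial.parseInt();   // tilt command
      pan.write(constrain(x, 0, 180));
      tilt.write(constrain(y, 0, 180));
    }
    // an 'M' (manual mode) branch would go here
  }
}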

GUI

To give our project a smoother user interface, we worked on providing a GUI for our robot. Ultimately, we used OpenGL with OpenCV, since the two seemed to collaborate fairly well. First things first: we needed to get OpenCV working with OpenGL, since we had not used OpenGL for our last project.

(We also utilized GLUT and GLEW. Old, I know; I got constant reminders from Xcode that my code was deprecated.)

To do this, we put our global variables at the top of the file and moved our OpenCV motion-tracking loop code (previously in main.cpp) into a function I called opencvLoop(). OpenGL needed main() for itself, so it could call glutMainLoop() and update the window. I ended up calling opencvLoop() in my main window's display function, because that is where I take our images and bind them to texture objects.
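As a minimal sketch of that structure (the variable names and window setup are illustrative, not necessarily what our repository uses), the display callback pulls a frame through opencvLoop(), uploads it to a texture, and draws it as a full-window quad:

#include <GL/glut.h>
#include <opencv2/opencv.hpp>

cv::VideoCapture cap(0);   // webcam
cv::Mat frame;             // latest processed frame
GLuint frameTex;           // texture the frame is uploaded into

// Stand-in for the tracking code that used to live in main().
void opencvLoop() {
    cap >> frame;          // tracking and target drawing would happen here
}

void display() {
    opencvLoop();
    if (frame.empty()) return;

    // OpenCV stores frames as BGR, so tell OpenGL to read them that way.
    glBindTexture(GL_TEXTURE_2D, frameTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, frame.cols, frame.rows,
                 0, GL_BGR, GL_UNSIGNED_BYTE, frame.data);

    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_TEXTURE_2D);
    glBegin(GL_QUADS);                       // full-window textured quad;
    glTexCoord2f(0, 1); glVertex2f(-1, -1);  // texture v is flipped so the
    glTexCoord2f(1, 1); glVertex2f( 1, -1);  // image isn't drawn upside down
    glTexCoord2f(1, 0); glVertex2f( 1,  1);
    glTexCoord2f(0, 0); glVertex2f(-1,  1);
    glEnd();
    glutSwapBuffers();
    glutPostRedisplay();                     // keep pulling new frames
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(640, 480);
    glutCreateWindow("Motion Tracking Robot");
    glGenTextures(1, &frameTex);
    glBindTexture(GL_TEXTURE_2D, frameTex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glutDisplayFunc(display);
    glutMainLoop();                          // OpenGL owns main()
    return 0;
}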

The rest was simpler and more straightforward, because I used code my graphics teacher provided us for educational purposes to draw text to the screen and to have an easy-to-use shader program. Thank you, Dr. Chappel!

All in all, I added arrow-key functionality, three debugging sub-windows, our video feed, and instructions for the user to the GUI.

[Screenshot of the GUI]

See our code here:

GitHub
