OpenCV Blob Tracker

The code is on GitHub: https://github.com/jessicaaustin/robotics-projects/tree/master/blob-tracker

The Blob Tracker is a simple demo that shows how you can track a certain color in OpenCV.

The setup consists of a camera mounted on a pan-tilt unit that’s wired to an Arduino. The camera and Arduino are hooked up to a computer via USB. On the computer, a simple Python script takes in the camera images, processes them using OpenCV, and sends back commands to the Arduino to move the pan-tilt servos and track the desired color.

Parts List and Assembly

This tutorial builds on the Remotely Controlled Pan-Tilt Unit post. Follow the instructions there to assemble the unit and upload the provided Arduino code to your Arduino.

If you’ve done things correctly, you should be able to view images from the camera on your computer (using any webcam software, like Skype or Google Hangouts), and control the pan-tilt unit by plugging in your Arduino and sending commands over serial.
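If you’d like to sanity-check the serial link from Python directly, a quick pyserial session looks something like this. The "pan 90" command below is a made-up example; use whatever commands the pan-tilt Arduino sketch actually expects:

# Quick serial sanity check using pyserial. The command string is
# hypothetical -- substitute the commands your Arduino sketch expects.
import serial
import time

ser = serial.Serial("/dev/ttyACM0", 9600, timeout=1)
time.sleep(2)           # the Arduino resets when the port opens; give it a moment
ser.write(b"pan 90\n")  # hypothetical command: center the pan servo
print(ser.readline())   # print any response from the Arduino
ser.close()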

Installing OpenCV

On Linux, install OpenCV by running:


# Run `apt-cache search opencv` first to see which version is available;
# anything newer than 2.2 should work. The Python scripts below need the
# Python bindings:
sudo apt-get install python-opencv

Alternatively, you can install ROS, which comes with OpenCV.
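Either way, you can verify that the Python bindings are installed by printing the OpenCV version:

python -c "import cv2; print(cv2.__version__)"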

Code

The code is available on GitHub at https://github.com/jessicaaustin/robotics-projects/tree/master/blob-tracker

Understanding color tracking

(Note: If you’re lazy and just want to track a color without understanding how it works, you can skip this section and pass --red, --green, or --blue to blob_tracker.py below)

Most of the time, we think in terms of the RGB color model. However, when it comes to trying to track an object of a certain “color”, the RGB space is not very useful. That’s because something that’s “red” in one lighting condition might look like “dark red” in low light or “light red” in bright light.

An alternative is the HSV color model. HSV stands for Hue, Saturation, and Value. The hue is what we care about — for example, red — and the range we’ll look for here will be fairly narrow. Saturation and value depend on the object’s texture and the lighting conditions, so we set those to a wider range to account for that variation.
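To make this concrete, here’s a minimal sketch of HSV filtering with OpenCV’s Python API. This is an illustration of the idea, not the actual color_detector.py code (note that OpenCV stores hue as 0-179 in 8-bit images):

# Minimal HSV filtering sketch (illustrative, not the actual color_detector.py).
import cv2
import numpy as np

capture = cv2.VideoCapture(1)  # e.g., /dev/video1

# H = 100 +/- 50; S and V = 155 +/- 200, clamped to the valid 0-255 range
hsv_min = np.array([ 50,   0,   0], np.uint8)
hsv_max = np.array([150, 255, 255], np.uint8)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)  # camera frames arrive as BGR
    mask = cv2.inRange(hsv, hsv_min, hsv_max)     # white where the color matches
    cv2.imshow("camera feed", frame)
    cv2.imshow("filtered feed", mask)
    if cv2.waitKey(10) == 27:                     # ESC to quit
        break

capture.release()
cv2.destroyAllWindows()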

To illustrate this concept to yourself, try running the color_detector.py script in the blob-tracker folder:


# use --camera=N to set the index of your camera.
# e.g., if /dev/video1 is your camera device, then use --camera=1
./color_detector.py --camera=1

The program will pop up two windows: “camera feed” and “filtered feed”. The filtered feed is a mask where white marks the color you’re tracking and black marks everything else.

[Image: color_detector_1]

The program will start out with HSV set to the following values:
H = 100 +/- 50
S = 155 +/- 200
V = 155 +/- 200

Place a solid-color object in front of the camera — for example, a red ball — and use the keyboard to modify these ranges:

hue: sat: val:
 e    t    u
s d  f g  h j
 x    v    n

For example, to increase the max hue press e, to decrease the min hue press x, to decrease the range press s, and to increase the range press d. (If things aren’t working, make sure the window called “filtered feed” is selected when you press the keys.)

Play with the values until you’re consistently seeing just the color you want, and not anything else (for example, a red jacket in the background).

[Image: color_detector_2]

Now try changing the lighting conditions. How does this change the trackability? What if you modify the sat and val values?

The program will spit out the current HSV min/max ranges to the terminal. Once you’re happy with your ranges, hit ESC to exit and save the HSV values — you’ll need to input them into the blob-tracker program next.

For example, for tracking a red object I ended up with:

(h,s,v):
min=(146.0, 146.5, 55.0, 0.0)
max=(182.0, 283.5, 255.0, 0.0)

Running everything together

At this point you’ve got a camera to capture images, mounted on a pan-tilt unit that you can control over serial. You also have an HSV range to track. Now it’s just a matter of running the blob-tracker code! This code will process the images, find the color you want to track in the image, and send commands to the servos to close the loop and track the color.

To run:

# get options
./blob_tracker.py --help
# find a red object (no tracking), using the camera on /dev/video1
./blob_tracker.py --camera=1 --red
# track a red object, with the Arduino on /dev/ttyACM0:
./blob_tracker.py --camera=1 --device=/dev/ttyACM0 --red --follow

First try without the --follow flag. You should see two windows pop up: “camera” and “threshed”. The code performs some filtering on the image to reduce noise, so the color blob in the threshed image is “smooth”. A red circle on the camera window marks the center of the largest “blob” matching your color; if there is more than one blob of the same color in the image, the code finds the largest one and tracks it.
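Under the hood, picking out the largest blob boils down to denoising the mask and taking the biggest contour. Here’s a rough sketch of that step — my own illustration, assuming the two-value findContours return of OpenCV 2.x, not the actual blob_tracker.py code:

# Sketch of largest-blob detection (illustrative).
import cv2

def largest_blob_center(mask):
    # erode then dilate to remove speckle noise from the mask
    mask = cv2.erode(mask, None, iterations=2)
    mask = cv2.dilate(mask, None, iterations=2)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    biggest = max(contours, key=cv2.contourArea)
    m = cv2.moments(biggest)
    if m["m00"] == 0:
        return None
    # the contour's centroid, computed from its image moments
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))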

[Image: blob_tracker_1]

Now try running with the --follow flag. Your pan-tilt unit should move around to track the object!
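The follow behavior itself is a simple closed loop: compute the blob’s offset from the image center, then nudge the pan and tilt servos proportionally. Here’s a hedged sketch of one iteration — the serial command strings are made up; the real protocol is whatever the pan-tilt Arduino sketch defines:

# Illustrative proportional tracking step; the serial commands are hypothetical.
import serial

arduino = serial.Serial("/dev/ttyACM0", 9600)
K = 0.1  # proportional gain; tune for your servos

def follow(blob_center, frame_size):
    cx, cy = blob_center
    w, h = frame_size
    # pixel error between the blob center and the image center
    pan_err = cx - w / 2
    tilt_err = cy - h / 2
    # nudge each servo proportionally to its error (hypothetical commands)
    arduino.write(("pan %d\n" % int(K * pan_err)).encode())
    arduino.write(("tilt %d\n" % int(K * tilt_err)).encode())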

[Images: blob_tracker_follow_1, blob_tracker_follow_2, blob_tracker_follow_3]

Antibiotic Apprentice: An Educational Game

As I talked about in a previous post, last term I took a class at CMU called Design of Educational Games by Vincent Aleven. For my final project, I worked with two other classmates, Martina Pavelko and Yujun Song, to create an educational game for med students studying for exams.

The game, called Antibiotic Apprentice, is an adventure RPG where the main character uses Items (antibiotics) to fight Enemies (bacterial infections). For an overview of the game, watch the video above. To learn more about our process for making the game, and our evaluations of it, check out our final presentation. Finally, if you really want to get into detail, you can read our final report.

[Image: Game Screenshot]

Education and Technology

It’s tempting to treat technology as a panacea for education. In fact, we’ve been doing it for a long time—well before iPads or MOOCs or even computers came around [1]. However (and this is true of life in general), technology is only the answer in very specific cases, and we should be aware of what those cases are if we don’t want to waste valuable time and resources.

I wanted to learn more about the intersection of technology and education, to see what’s out there, what’s working, and why. So, last term, I took two courses offered by the Human-Computer Interaction Institute at Carnegie Mellon University: The Role of Technology in Learning in the 21st Century and Design of Educational Games.

The Role of Technology in Learning in the 21st Century, taught by Prof. Amy Ogan, was essentially an overview of existing educational technology, with an emphasis on students with low socioeconomic status. We covered dozens of technologies, in the form of papers, discussions, case studies, and expert panels. I learned quite a few things from this course, but the things that stand out are:

  • It is incredibly difficult to create a piece of educational technology that is effective for a large audience. All of the success stories had a clear, narrow focus and a specific target audience. For example, Project LISTEN has been successful by many standards, but it targets literacy for children in grades 1-4 who are currently behind in school.
  • Context is everything. Most of the failed projects failed because they didn’t take into account the existing infrastructure, educational culture, language barriers, etc.
  • Teacher support and understanding is crucial. You can create the best technology ever, but if a teacher doesn’t understand how it works, or how it can fit into his/her curriculum, it will be ignored.

To sum up: I think technology does have potential in the classroom, but, without good evidence to the contrary, one should be skeptical that it’s effective. Ideally, the person choosing or designing the technology should understand the teachers, understand the students, understand the curriculum, and see how the tech can fit in with all of that in order to supplement learning.

The second class I took, Design of Educational Games with Vincent Aleven, covered some of the same ground but was focused on games specifically. The two key takeaways from this class were: integrate learning with your core game mechanic, and evaluate out-of-game transfer.

Bad educational games are ones where the educational aspect feels “tacked on”. Good games have tight integration between the core game mechanic and the learning objectives. An easy test to see if there is good integration: if you replaced the learning objectives with something else, would the game still make sense? For example, in the Math Blaster games, in many cases you could replace the math puzzles with spelling puzzles and the game wouldn’t change much. On the other hand, if you took Where in the World is Carmen Sandiego and replaced the “clues” with math puzzles, the game wouldn’t make sense anymore.

Secondly, for an educational game to be considered a “success”, there MUST be evidence of out-of-game transfer. That is, progressing in the game itself is not good enough—the player must transfer whatever they’ve learned out of the game and apply it in a real-life scenario. The best way to evaluate this is with pre- and post-game quizzes. For example, take DragonBox. A Forbes article trumpets, “On average, it took 41 minutes and 44 seconds for students to master Algebra skills … using the DragonBox App” [3]. But this is talking about in-game mastery—without data to show that kids got better at doing actual algebra (e.g., homework, quizzes, word problems) there’s no proof that this game is actually educational.

While those two takeaways are high-level, we also learned about a practical way of evaluating the details of an educational game called the EDGE Framework [2].

The EDGE framework looks at a game from three different perspectives:

  • Learning Objectives: what knowledge or skills do you want to impart upon the player?
  • Instructional Principles: what best-practices or techniques (ideally, proven by research or commonly used in industry) will you use to achieve those learning objectives?
  • Mechanics, Dynamics, and Aesthetics (MDA): how does the game actually function, and how does this provide a certain experience to the user?

All three of these work together. For example, the MDA and Instructional Principles are chosen to support the Learning Objectives.

[Image: The EDGE Framework]

For some more information on this framework, and an example of it applied to a real game, check out the presentation I made about Where In the World is Carmen Sandiego.

Finally, I have to point out a less well-known genre of games: Interactive Fiction. I think IF has excellent potential for education, something I talk about in another presentation, Can we use Interactive Fiction in the Classroom?


[1] Reiser, Robert A. “A history of instructional design and technology: Part I: A history of instructional media.” Educational technology research and development 49.1 (2001): 53-64.

[2] Aleven, V., Myers, E., Easterday, M., & Ogan, A. (2010). Toward a framework for the analysis and design of educational games. In G. Biswas, D. Carr, Y. S. Chee, & W. Y. Hwang (Eds.), Proceedings of the 3rd IEEE Conference on Digital Game and Intelligent Toy-Enhanced Learning (pp. 69-76). Los Alamitos, CA: IEEE Computer Society. doi: 10.1109/DIGITEL.2010.55

[3] Shapiro, Jordan. “It Only Takes About 42 Minutes To Learn Algebra With Video Games”. Forbes, 7/01/2013. http://www.forbes.com/sites/jordanshapiro/2013/07/01/it-only-takes-about-42-minutes-to-learn-algebra-with-video-games


Robotics: books and online courses for independent study

This fall, I’m going back to school to study Robotics as a graduate student.

It’s been almost five years since I graduated from undergrad, so to prepare myself I created a list of study materials for review. I hope others might find this list of recommendations helpful.

There were two main areas I wanted to cover: mathematics review, and introduction to robotics concepts. The latter section might be useful for someone interested in robotics but not sure which areas they want to pursue.

Please comment if you have any questions!

General Math

How to Prove It by Velleman

Man, I wish I had read this book BEFORE undergrad. In this book, Velleman does three things:

  • describes basic concepts in Logic
  • gives common proof strategies, with plenty of examples
  • dives deeper into set theory, defining functions, etc.

He does all this assuming the reader is NOT a mathematician—in fact, he does an excellent job of explaining a mathematician’s thought process when trying to prove something.

I highly recommend this book if you feel uncomfortable reading and/or writing proofs, since it will make the following math books much more enjoyable to read!

Calculus

Barron’s College Review Series: Calculus

This book was my warm-up. It is very simple, and is focused more on computation than rigorous proofs. I think I got through it in a weekend, while completing most of the exercises. It does NOT include multivariate calculus.

Khan Academy: Calculus

Khan Academy lectures, while time-consuming, are a great reference if there is a specific concept that you’re struggling with. That said, I don’t recommend watching the whole series, but rather searching for a specific topic (say, “gradient”) when you want more information.

Probability and Statistics

Khan Academy: Probability and Statistics (combined with Combinatorial Probabilities cheat sheet)

I have to say: I always had problems getting combinatorics straight in my head, and watching these videos + completing the exercises really helped.

Introduction to Bayesian Statistics by Bolstad

This book is AMAZING. Bayesian statistics is extremely important to modern robotics, and this book provides an excellent introduction. Highly recommended!

Note that if you’re already comfortable with traditional probability, you can skip Khan Academy altogether and go straight to the Bolstad book.

Differential Equations

Elementary Differential Equations by Boyce and DiPrima

All-around excellent book. Probably my favorite, most-referenced textbook from undergrad.

Khan Academy: Differential Equations

Again, don’t watch all the lectures, but use them as a reference when you want a simple, thoroughly-explained overview of a specific topic.

Linear Algebra

Linear Algebra by Hefferon (also available in print)

If you had to pick a single math topic to study before entering robotics, linear algebra would be it. This book is particularly good because it starts with solving systems of equations, defining spaces, and creating functions and maps between spaces—and only after this foundation is laid does it introduce matrices as a convenient form for dealing with these concepts.

Khan Academy: Linear Algebra

Again, don’t watch all the lectures, but use them as a reference when you want a simple, thoroughly-explained overview of a specific topic.

Code

The Nature of Code

I’ve been programming since high school, so I didn’t really need much review in this area. However, The Nature of Code is an amazing book (and it’s free!), with online exercises in the Processing language, so I have to recommend it.

Also note that the Udacity CS-373 course includes programming exercises in Python.

Robotics

If you complete the following courses, you’ll get a high-level understanding of some of the most important concepts in robotics.

Udacity CS-373, Artificial Intelligence for Robotics

Topics include: Localization, Particle Filters, Kalman Filters, Search (including A* Search), PID control, and SLAM (simultaneous localization and mapping). If you understand these concepts, you can write software for a mobile robot! Even better, each section has multiple programming exercises in Python, so you really get practice with the topic.

If you want to dig deeper into some of the above topics, I recommend Sebastian Thrun’s book, Probabilistic Robotics.

Udacity CS-271, Introduction to Artificial Intelligence

If you’re interested in Machine Learning, this is a great course. It’s not as slick as CS-373, but still worthwhile.

ChiBots SRS RoboMagellan 2012: Nomad Overview

This summer, my friend Bill Mania and I entered our robot in the ChiBots SRS RoboMagellan contest. To steal the description directly from the website:

Robo-Magellan is a robotics competition emphasizing autonomous navigation and obstacle avoidance over varied, outdoor terrain. Robots have three opportunities to navigate from a starting point to an ending point and are scored on time required to complete the course with opportunities to lower the score based on contacting intermediate points.

Basically, we had to develop a robot that could navigate around a campus-like setting, find GPS waypoints marked by orange traffic cones, and do it faster than any of the other robots entered.

To give you an idea of what this looked like for us, here’s a picture of us testing in Bill’s backyard:

Our robot moving between two waypoints. Note the red-orange planters and yellow plastic bin—“red herrings” that our robot is wisely ignoring!

For our platform, we used a modified version of the CoroWare CoroBot, with additional sensors like ultrasonic rangefinders, a 6-DOF IMU, and wheel encoders.

Our software platform was ROS — rospy specifically — and we made liberal use of various components in the navigation stack. We were even able to attend the very first ROSCon in St. Paul, MN, which was a blast and greatly expanded our knowledge of the software and what it was capable of.

Over the next few weeks, I’ll be writing more detailed posts about the robot and specific challenges we faced, including:

  • Hardware and sensor overview
  • Using robot_pose_ekf for sensor fusion of IMU + wheel encoders to allow us to navigate using dead reckoning
  • Localization in ROS using a very, very sparse map
  • Our attempts to use the move_base stack with hobby-grade sensors, and why we ended up writing our own strategy node
  • Using OpenCV + ROS to find an orange traffic cone, and using this feedback to “capture” the waypoint

In the meantime, enjoy this video of the above scene, from the robot’s point of view!