Pixy2 is the second version of Pixy, a vision sensor for DIY robotics and similar applications. It's faster, smaller, and more capable than the original Pixy, adding line tracking/following algorithms as well as other features. Here are the added features:
And Pixy2 is capable of everything that the original Pixy can do:
If you want to give your robot the ability to pick up an object, chase a ball, locate a charging station, etc., a good solution is a vision (image) sensor. But image sensors have two drawbacks: 1) they output lots of data, dozens of megabytes per second, and 2) processing this much data can overwhelm many processors. And even when a processor can keep up with the data, much of its processing power is no longer available for other tasks.
Pixy2 addresses these problems by pairing a powerful dedicated processor with the image sensor. Pixy2 processes images from the image sensor and sends only the useful information to your microcontroller. And it does this at a frame rate of 60 Hz. The information is available through one of several interfaces: UART serial, SPI, I2C, USB, or digital/analog output. So your Arduino or other microcontroller can talk easily with Pixy2 and still have plenty of CPU available for other tasks.
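To see why offloading the image processing matters, here is a rough back-of-the-envelope comparison of a raw pixel stream versus Pixy2's structured output. The sensor resolution, bytes-per-pixel, and record-size figures below are illustrative assumptions, not official Pixy2 specifications:

```python
# Rough data-rate comparison: raw image stream vs. structured object output.
WIDTH, HEIGHT = 1296, 976      # assumed sensor resolution (pixels)
BYTES_PER_PIXEL = 1            # assumed raw Bayer data, 1 byte per pixel
FPS = 60                       # Pixy2's frame rate

raw_rate = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS          # bytes/second
print(f"raw stream: {raw_rate / 1e6:.1f} MB/s")            # ~75.9 MB/s

# A vision sensor like Pixy2 instead sends a handful of object records per frame.
BYTES_PER_OBJECT = 14          # assumed size of one detection record
objects_per_frame = 5
pixy_rate = BYTES_PER_OBJECT * objects_per_frame * FPS
print(f"structured output: {pixy_rate / 1e3:.1f} kB/s")    # ~4.2 kB/s
```

Under these assumptions, the structured output is four orders of magnitude smaller than the raw stream, which is what leaves the host CPU free for other tasks.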
Pixy2 processes an entire image frame every 1/60th of a second (16.7 milliseconds). This means that you get a complete update of all detected objects' positions every 16.7 ms. At this rate, tracking the path of a falling/bouncing ball is possible. If your robot is performing line following, it will typically move only a small fraction of an inch between frames.
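The "small fraction of an inch" claim is easy to check with a little arithmetic. The robot speed below is an assumption chosen for illustration:

```python
# How far does a robot travel between consecutive Pixy2 frames?
frame_period_s = 1 / 60            # Pixy2 updates every ~16.7 ms
speed_in_per_s = 12.0              # assume a robot moving at 1 ft/s

travel_per_frame = speed_in_per_s * frame_period_s
print(f"{travel_per_frame:.2f} inches per frame")   # 0.20 inches
```

Even at a brisk 1 ft/s, the robot moves only about a fifth of an inch between position updates.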
Pixy2 is unique because you can physically teach it what you are interested in sensing. A blue cube? Place the cube in front of Pixy2 and press the button. It’s easy, and it’s fast.
More specifically, you teach Pixy2 by holding the object in front of its lens while holding down the button on top. While you do this, the RGB LED under the lens provides feedback by matching the color of the object directly in front of Pixy2. For example, the LED turns orange when an orange ball is placed directly in front of Pixy2. Release the button and Pixy2 generates a statistical model of the colors contained in the object and stores it in flash. From then on, it uses this statistical model to find objects with similar color signatures in its frames.
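Pixy2's actual color-signature algorithm isn't published, but the general idea of a "statistical model of the colors in an object" can be sketched with a minimal per-channel range model. Everything here (the range representation, the `slack` tolerance, the sample pixel values) is an illustrative assumption, not Pixy2's implementation:

```python
# Illustrative sketch: learn a color signature from taught pixels, then
# classify new pixels against it. NOT Pixy2's actual algorithm.
def learn_signature(pixels):
    """pixels: list of (r, g, b) samples taken from the taught object."""
    lo = [min(p[c] for p in pixels) for c in range(3)]
    hi = [max(p[c] for p in pixels) for c in range(3)]
    return lo, hi

def matches(signature, pixel, slack=10):
    """True if pixel falls within the learned per-channel ranges (+/- slack)."""
    lo, hi = signature
    return all(lo[c] - slack <= pixel[c] <= hi[c] + slack for c in range(3))

# "Teach" with a few orange samples, then test new pixels against the model.
sig = learn_signature([(240, 120, 30), (235, 110, 25), (250, 130, 40)])
print(matches(sig, (242, 118, 33)))   # True: a similar orange
print(matches(sig, (30, 30, 200)))    # False: blue
```

A real detector would work in a hue-based color space to be robust to lighting changes, but the teach-then-match flow is the same.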
Once Pixy2 detects a new object, it adds it to a table of objects it is currently tracking and assigns it a tracking index. It then attempts to find each object in the table in the next frame by finding its best match. Each tracked object receives an index between 0 and 255 that it keeps until it either leaves Pixy2's field of view or Pixy2 can no longer find it in subsequent frames.
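Pixy2's matcher isn't published, but the tracking-table idea described above can be sketched with a greedy nearest-neighbor match: known objects claim their closest detection in the new frame, unclaimed detections get fresh indices, and objects with no match drop out. The distance threshold is an assumed tuning parameter:

```python
# Minimal sketch of frame-to-frame tracking with persistent indices (0-255).
import math

class Tracker:
    def __init__(self, max_dist=50.0):
        self.table = {}          # tracking index -> (x, y) last known position
        self.next_index = 0
        self.max_dist = max_dist # assumed: max movement between frames

    def update(self, detections):
        """detections: list of (x, y); returns list of (index, (x, y))."""
        assigned, unmatched = {}, list(detections)
        for idx, last in self.table.items():
            if not unmatched:
                break
            best = min(unmatched, key=lambda d: math.dist(d, last))
            if math.dist(best, last) <= self.max_dist:
                assigned[idx] = best      # same object, index preserved
                unmatched.remove(best)
        self.table = assigned             # unmatched old objects drop out
        for d in unmatched:               # brand-new objects get fresh indices
            self.table[self.next_index] = d
            self.next_index = (self.next_index + 1) % 256
        return sorted(self.table.items())

t = Tracker()
print(t.update([(10, 10), (100, 100)]))   # two new objects: indices 0 and 1
print(t.update([(12, 11), (98, 103)]))    # both moved slightly, indices kept
```

A production tracker would also match on size and signature, not just position, but this captures why an index stays stable across frames.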
Pixy2 adds the ability to detect and track lines. Line following is a popular robotics demo/application because it is relatively simple to implement and gives a robot basic navigation abilities. Most line-following robots use discrete photosensors, but that approach works well only with thick lines.
Pixy2 attempts to solve the more general line-following problem by using its image (array) sensor. Each of Pixy2's camera frames provides information about the line being followed, its direction, other lines, and any intersections these lines may form.
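Because each frame reports the followed line's position and direction, a robot can steer with a simple proportional controller on the line's horizontal offset from frame center. The frame width and gain below are illustrative assumptions, not Pixy2 constants:

```python
# Sketch: proportional steering from the x-coordinate of the detected line.
FRAME_WIDTH = 316     # assumed horizontal resolution of the camera frame
GAIN = 0.5            # proportional steering gain (tuned per robot)

def steer(line_x):
    """Return a steering command in [-1, 1]: negative = left, positive = right."""
    error = line_x - FRAME_WIDTH / 2          # offset from frame center
    command = GAIN * error / (FRAME_WIDTH / 2)
    return max(-1.0, min(1.0, command))

print(steer(158))   # 0.0 -> line is centered, go straight
print(steer(250))   # positive -> line is to the right, steer right
print(steer(60))    # negative -> line is to the left, steer left
```

This per-frame loop is where the 60 Hz update rate pays off: the correction is recomputed before the robot has drifted far from the line.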
Pixy2 comes with various cables so that you can connect it to an Arduino or a Raspberry Pi out of the box. Furthermore, the I/O port offers several interfaces (SPI, I²C, UART, USB) for connecting Pixy2 to most boards.