[Today’s run: 3.5 mile interval workout on the treadmill]
I was doing some research recently involving a camera system, and we needed a way to locate objects in view of the camera. Someone suggested AprilTags, so I read up on them and got a simple sample program running on my computer.
AprilTags come from the APRIL Robotics Laboratory at the University of Michigan. Each tag is a pattern of black and white squares. Tags are generated in families, each family containing a limited number of patterns. Every pattern in a family is unique, and directionally unique as well: no pattern is symmetrical or a mirror image of another.
The result is that they can be detected in a scene, and the detection software can tell their orientation as well as position. Not only will the software know this pattern is #12 (or whatever), it will know the center of that pattern, and it will know its distance and orientation relative to the camera. It is pretty cool stuff.
Here is a video from YouTube. There are multiple tags in the picture. The software is marking each tag with a "box" of lines showing the pattern number and the orientation of each tag in three dimensions. Those aren't added to the video in later editing; that is done in real time.
I printed out some tags and waved them around, and it does work.
We’ve grown used to machine-readable tagging systems like UPC barcodes, square QR codes, and many others. AprilTags have been designed specifically to indicate position in space.
In this video, the people are driving a remote vehicle which has a big AprilTag on a piece of cardboard. The flying drone is automatically following the tag at a fixed distance, height, and orientation.