This is a place where you will find what people do with Angus.ai (including us at Angus!).
If you want to share your own hack, drop us an email at firstname.lastname@example.org with video or photo links and a brief summary.
We are currently working hard on a service that extracts 3D information from two standard RGB cameras looking at the same scene (aka stereo vision), and
we are using our artificial chameleon for this!
It is made of two USB cams (like this), four small but very fast servos, and a Raspberry Pi (model B+) that streams the videos to Angus.ai and controls the servos. The Stereovision service is just about to be released, stay tuned!
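Driving hobby servos like the chameleon's from a Raspberry Pi usually means generating a 50 Hz PWM signal whose pulse width (roughly 1 to 2 ms) encodes the target angle. A minimal sketch of that conversion, assuming a standard hobby servo; the pin number and pulse range below are assumptions, not the actual chameleon wiring:

```python
def angle_to_duty(angle, min_pulse_ms=1.0, max_pulse_ms=2.0, period_ms=20.0):
    """Map a servo angle in [0, 180] degrees to a PWM duty cycle in percent.

    Assumes a standard hobby servo: 50 Hz signal (20 ms period), 1 ms pulse
    at 0 degrees and 2 ms at 180 degrees. Adjust the range for your servo.
    """
    if not 0 <= angle <= 180:
        raise ValueError("angle must be between 0 and 180 degrees")
    pulse_ms = min_pulse_ms + (max_pulse_ms - min_pulse_ms) * angle / 180.0
    return 100.0 * pulse_ms / period_ms

# On the Pi itself, this duty cycle would be fed to a PWM output, e.g.:
#   import RPi.GPIO as GPIO
#   GPIO.setmode(GPIO.BCM)
#   GPIO.setup(18, GPIO.OUT)        # GPIO 18 is an assumption
#   pwm = GPIO.PWM(18, 50)          # 50 Hz
#   pwm.start(angle_to_duty(90))    # center the servo
```

For example, 90 degrees maps to a 1.5 ms pulse, i.e. a 7.5% duty cycle at 50 Hz.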
And now, testing stereo vision with OpenCV and the Angus.ai chameleon:
The Angus Lamp
The lamp leverages the Angus Movement, Face, Age, Gender and Expression Estimation services to greet people entering the scene and react (in a very simplistic way!)
to people smiling or frowning.
The lamp is made from a $30 architect desk lamp, three high-torque servos (see here), a USB cam and a Raspberry Pi (model B+) that streams the video to Angus.ai and controls the servos accordingly.
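The "very simplistic" reaction can be as little as a lookup from the estimated expression to a head pose for the lamp. A hypothetical sketch of that logic; the expression labels and servo angles here are assumptions for illustration, not the lamp's actual code:

```python
# Hypothetical mapping from an estimated facial expression to target
# angles (degrees) for the lamp's three servos: base, shoulder, elbow.
POSES = {
    "smile": (90, 60, 120),   # perk up toward the person
    "frown": (90, 110, 70),   # droop down
    "neutral": (90, 90, 90),  # rest position
}

def react(expression):
    """Return servo target angles for a detected expression.

    Unknown expressions fall back to the neutral pose.
    """
    return POSES.get(expression, POSES["neutral"])
```

On the lamp, each returned angle would then be turned into a PWM duty cycle and written to the corresponding servo.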
Jumping Sumo + Angus
The Angus.ai API is used here to automate a Jumping Sumo drone from Parrot (see here).
If you have a Jumping Sumo (or something similar), look at this GitHub repository for details.