Little conversational robot - the Eyes
This is the first part of a new project, building a small conversational/AI robot.
Inspired by commonly used animatronic mechanisms and, in particular, Will Cogley’s projects, I decided to start with the eyes, since those will really help give this little robot some personality.
After some successful tests, I started tweaking the mechanism for my use, and to start integrating the eyes inside some sort of body.
The eyes themselves were printed in resin, then hand painted to be realistic, but not so much that it would get into creepy territory for a robot.
Eyelids were also printed in resin and painted silver, and the rest of the small parts were printed either in resin or filament. I designed and printed my own custom arms for the servo motors.
Since this is a robot and not a remote controlled animatronic, I wanted the eyes to be able to follow someone’s face. For this, I used a K210 AI vision module sold by a company called Yahboom. DFRobot also makes one called the HuskyLens, but I went with the Yahboom module because it’s rectangular and it looked like the camera could be detached easily. It can, but I’ll need to replace it with one that has a longer cable.
NOTE:
That module is not plug-and-play: it requires a bit of work to get going and good familiarity with this type of development, and there is absolutely ZERO support from the manufacturer, as they refuse to provide support for a “DIY” installation (as if it could be anything but…). You are warned :)
There are other modules that can be used for this, such as the much smaller and cheaper Person Sensor V2 from Useful Sensors, but I preferred this one for the added capabilities and better performance. In this case, the module communicates with the main controller, an ESP32-S3, via serial.
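To give an idea of what that serial link involves, here is a rough sketch of the parsing side in Python. The `FACE,x,y,w,h` line format below is purely hypothetical, chosen for illustration; the actual Yahboom protocol has to be worked out from their demo code, and the real parsing runs on the ESP32-S3.

```python
# Hypothetical protocol: assume the K210 firmware sends one ASCII line per
# detection, e.g. b"FACE,120,88,64,64\n" (x, y, width, height in pixels).
# This is NOT the module's real wire format, just an illustration of the idea.

def parse_detection(line: bytes):
    """Parse a hypothetical 'FACE,x,y,w,h' line into a dict, or None on junk."""
    try:
        parts = line.decode("ascii").strip().split(",")
        if parts[0] != "FACE" or len(parts) != 5:
            return None
        x, y, w, h = (int(p) for p in parts[1:])
        # The eye tracker only really needs the face center point
        return {"cx": x + w // 2, "cy": y + h // 2, "w": w, "h": h}
    except (ValueError, UnicodeDecodeError):
        return None

print(parse_detection(b"FACE,120,88,64,64\n"))
print(parse_detection(b"not a detection\n"))
```

Returning `None` on anything malformed matters here: serial lines from a module like this can arrive truncated or garbled, and the tracking loop should just skip them rather than crash.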
This took a bit of work. It all had to be calibrated for the field of view of the camera, the range of movement of the eyes and lids, the motors, and then more work to try to make all of this move in a natural and personable way. Well…for a robot. Movement is smoothed out so the eyes don’t just dart around. There’s still more work to be done here.
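A minimal Python sketch of what that calibration and smoothing boils down to: map the face position in the camera frame to a servo angle range, then low-pass filter the motion so the eyes glide instead of darting. The frame resolution, servo range, and smoothing factor are all made-up values for illustration, not the actual calibration (the real code runs on the ESP32-S3).

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly map value from one range to another, clamped to the output."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

class SmoothedEye:
    """Exponential moving average so the eye eases toward its target."""
    def __init__(self, alpha=0.15, start=90.0):
        self.alpha = alpha   # 0..1; lower = smoother and slower
        self.angle = start   # current servo angle in degrees

    def update(self, target_angle):
        # Move a fraction of the remaining distance each tick
        self.angle += self.alpha * (target_angle - self.angle)
        return self.angle

# Assume a 320x240 detection frame and a pan servo calibrated to 60..120
# degrees; the output range is reversed so image-left makes the eyes look left.
pan = SmoothedEye()
face_x = 80  # x coordinate of the detected face center, in pixels
target = map_range(face_x, 0, 320, 120, 60)
for _ in range(50):
    angle = pan.update(target)
print(round(angle, 1))
```

The same mapping applies to the vertical axis and the eyelids; the per-axis smoothing factor is one of the knobs for making the motion feel natural rather than mechanical.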
I then worked on a design for a “face” and a temporary body for this little one, which also required several parts of the mechanism to be refactored, especially how the eyelids attach. I will need to make more changes, as mounting this inside a head takes way too much time, with screws that are very difficult to reach.
See the short video below for a demo of how it moves, plus a preview of the face/body:
Stay tuned for the next steps:
- Make it conversational. I’ve already run some promising tests with ChatGPT and text-to-speech.
- Add head movements so it can turn and look at faces, not just give the side-eye.
- Additional detection sensors. I have an mmWave sensor I want to try.
- Work on the body design.
- Maybe allow it to move around.