
Connecting Snake Eyes Bonnet to Google AIY Vision bonnet
Moderators: adafruit_support_bill, adafruit


Connecting Snake Eyes Bonnet to Google AIY Vision bonnet

by ugthak on Tue Oct 13, 2020 11:22 am

I'm attempting to connect a Google AIY Vision build to a Snake Eyes build, so that the whole setup detects the nearest face, and the eyes then follow that face while it's in the field of view of the camera. I'm a neophyte with both the hardware and software aspects of this, so apologies in advance for what could be a dumb question.

I've got a Raspberry Pi 3B+ with a Snake Eyes Bonnet driving two 1.54" 240x240 TFT displays, with the analog inputs for controlling eye movement enabled.

Separately, I've got a Google AIY Vision Kit v1.1 running off the included Pi Zero WH. I've modified the included face_detection_camera.py example to output the x/y coordinates of a point (in the middle of a detected face, at eye level) via two of the GPIO pins on the Vision Bonnet, using PWM to approximate the analog inputs the Snake Eyes Bonnet expects.
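Roughly, the output side of that change looks like the sketch below; the pin choice and PWM frequency here are placeholders rather than my exact values.

Code: Select all
# Sketch of the PWM output side (pin names and frequency are assumptions)
from aiy.pins import PIN_A, PIN_B        # GPIO expansion pins on the Vision Bonnet
from gpiozero import PWMOutputDevice     # gpiozero accepts aiy.pins pin objects

pan = PWMOutputDevice(PIN_A, frequency=100)    # x axis
tilt = PWMOutputDevice(PIN_B, frequency=100)   # y axis

def output_point(x_norm, y_norm):
    """Encode a face position (0.0-1.0 fractions of the frame) as PWM duty cycles."""
    pan.value = max(0.0, min(1.0, x_norm))
    tilt.value = max(0.0, min(1.0, y_norm))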

The power pin on the Vision Bonnet is 5 V, while the analog inputs on the Snake Eyes Bonnet are 3.3 V, and I'm at a bit of a loss on how to connect the two without putting too much voltage into the inputs. I understand a logic level converter won't work here since this is an analog signal. Is it as simple as wiring Pin A on the Vision Bonnet through a 2.2k resistor, then splitting that node to analog input 0 and a 3.3k resistor to ground on the Snake Eyes Bonnet, and doing the same for Pin B and analog input 1? Do I need to connect the grounds? Do I need to do anything with the power pins on either board? Is this better done with a buck converter, or would that not work with an analog signal either? Help for a newbie would be much appreciated!
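For reference, the divider math I'm picturing is just the standard voltage divider formula; this little check is only an illustration of the resistor values I proposed, not a claim about what the boards actually require.

Code: Select all
# Worked check of the proposed 2.2k / 3.3k divider (illustrative only)
v_in = 5.0                  # volts, if the output really were a 5 V signal
r1 = 2200.0                 # ohms, series resistor from Pin A
r2 = 3300.0                 # ohms, resistor from the analog input down to ground
v_out = v_in * r2 / (r1 + r2)
print(v_out)                # 3.0 V, which would sit inside a 3.3 V input range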

ugthak
 
Posts: 2
Joined: Fri Sep 25, 2020 12:18 am

Re: Connecting Snake Eyes Bonnet to Google AIY Vision bonnet

by McLoven on Mon Oct 19, 2020 6:56 pm

I am looking to do the same thing. I have the Snake Eyes Bonnet hooked up to a Raspberry Pi 3, and I installed a Pi camera on it. I want to know if it is possible to do face tracking and facial recognition so the eyes follow a face.

McLoven
 
Posts: 1
Joined: Mon Oct 19, 2020 6:49 pm

Re: Connecting Snake Eyes Bonnet to Google AIY Vision bonnet

by ugthak on Mon Oct 19, 2020 8:22 pm

It is 100% possible; I've got mine successfully doing exactly that.

I solved the problem I described in the post above, because it turned out it wasn't a real problem at all: the GPIO pins (other than the power pins) put out 3.3 V signals, so the voltage didn't need to be reduced. The PWM output from the Google AIY Vision Bonnet, however, wasn't being read correctly by the Snake Eyes Bonnet: the eyes were jumping around because the analog inputs were sampling the peaks and valleys of the PWM square wave rather than a true analog voltage. That was solved with a combination of software and hardware; the full setup is as follows:


Hardware:
Raspberry Pi 3b+, 240x240 TFTs, Snake Eyes Bonnet from Adafruit

Raspberry Pi zero WH with Google AIY Vision Bonnet and camera

Two low-pass RC filters (one each for the x and y axes) connecting the PWM outputs from the Vision Bonnet to the analog inputs on the Snake Eyes Bonnet; these smooth the highs and lows of the PWM into a steady voltage (see the quick cutoff check after this list)

A whole mess of jumper wires connecting everything
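The exact RC values aren't critical as long as the cutoff sits well below the PWM frequency. Here's the kind of back-of-the-envelope check I mean; the 4.7 kΩ / 10 µF values are examples, not necessarily what ended up on my board.

Code: Select all
# Example RC low-pass cutoff check (component values are illustrative)
import math

r = 4700.0                       # ohms (example value)
c = 10e-6                        # farads (example value)
f_cutoff = 1.0 / (2.0 * math.pi * r * c)
print(round(f_cutoff, 1))        # ~3.4 Hz: far below a ~100 Hz PWM carrier,
                                 # so the ripple is filtered out while the eye
                                 # position still changes quickly enough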

Software:
Snake Eyes Bonnet: the base software, plus I've configured it to average the last N samples (currently 20) from the analog inputs to help smooth the PWM-derived input. I've since added the low-pass filters, which might be enough on their own; I haven't disabled or changed the smoothing function yet to know for sure.
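Conceptually the smoothing is just a rolling average over the most recent readings; something like this sketch (an illustration of the idea, not the actual Snake Eyes code):

Code: Select all
# Rolling-average smoothing of one analog channel (conceptual sketch)
from collections import deque

SAMPLES = 20                     # how many recent readings to average
history_x = deque(maxlen=SAMPLES)
history_y = deque(maxlen=SAMPLES)

def smoothed(raw, history):
    """Average the last SAMPLES readings from one analog channel."""
    history.append(raw)
    return sum(history) / len(history)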

Google AIY Vision: I started with the face_detection_camera.py example that comes with the kit and used the AIY annotator and pins APIs to first place a point estimated to be right between the eyes (one third of the way down from the top of the bounding box, horizontally centered), then converted that point's coordinates to proportional 0-1 signals output from the GPIO pins... and, voilà, it's working.
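The rough shape of my changes is sketched below; the resolution and variable names are from memory rather than a verbatim copy, and output_point() is the PWM helper sketched in my first post.

Code: Select all
# Rough shape of the face_detection_camera.py changes (not a verbatim copy)
from picamera import PiCamera
from aiy.vision.inference import CameraInference
from aiy.vision.models import face_detection

WIDTH, HEIGHT = 1640, 1232       # resolution used by the stock example

with PiCamera(sensor_mode=4, resolution=(WIDTH, HEIGHT)) as camera, \
        CameraInference(face_detection.model()) as inference:
    # camera stays open so the Vision Bonnet keeps receiving frames
    for result in inference.run():
        faces = face_detection.get_faces(result)
        if not faces:
            continue
        # Treat the largest bounding box as the nearest face
        face = max(faces, key=lambda f: f.bounding_box[2] * f.bounding_box[3])
        x, y, w, h = face.bounding_box
        eye_x = x + w / 2        # horizontally centered in the box
        eye_y = y + h / 3        # roughly eye level: 1/3 down from the top
        output_point(eye_x / WIDTH, eye_y / HEIGHT)   # 0-1 duty cycles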

Apologies if any of that is hard to follow or unclear; this is literally both my first programming project and my first soldering project, following some tutorials and instructional videos.

I hope this is helpful. Long term, I'm trying to teach myself to train a machine learning model of my own, since I don't need processing power spent on the "Joy Detection" function in the Google AIY software, which I suspect helps drag performance down to the 8-12 FPS I'm seeing. I'd also like to get things working with an infrared camera instead of a normal one so it can be used in a low-light application. We'll see when/if I get there!

ugthak
 
Posts: 2
Joined: Fri Sep 25, 2020 12:18 am
