The Big Mouth Billy Bass has been a part of American culture for decades. For a complete history of this novelty item, check out the Wikipedia page: https://en.wikipedia.org/wiki/Big_Mouth_Billy_Bass
After finding Billy in a thrift shop, I was able to take him home for $5. I'd seen other projects modify Billy Bass to sing arbitrary songs, or to act as the front end for an Alexa or Google Home. I had other plans in mind.
My plan was to turn Billy Bass into a completely self-contained AI unit.
Overall I used a Raspberry Pi 5 (8 GB) single-board computer running Raspberry Pi OS. Motor driver boards control the DC motors for the mouth, head, and tail. Two separate AC-to-5 V DC power supplies power the electronics. A small powered speaker handles audio output, and an electret microphone handles voice input.
Vosk (https://github.com/alphacep/vosk-api) handles the offline speech-to-text. The medium-sized model was chosen for performance.
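A minimal Vosk recognition loop looks roughly like this sketch. The model directory, sample rate, and the use of sounddevice for capture are my assumptions, not details from the build:

```python
import json
import queue


def extract_text(result_json: str) -> str:
    """Pull the recognized utterance out of a Vosk result JSON string."""
    return json.loads(result_json).get("text", "")


def listen_once(model_dir="vosk-model-en-us-0.22", rate=16000):
    """Block until Vosk reports a final utterance, then return it as text.

    Third-party imports are kept local so extract_text stays importable
    without Vosk or sounddevice installed.
    """
    import sounddevice as sd
    from vosk import Model, KaldiRecognizer

    rec = KaldiRecognizer(Model(model_dir), rate)
    audio = queue.Queue()

    def on_audio(indata, frames, time, status):
        audio.put(bytes(indata))

    with sd.RawInputStream(samplerate=rate, blocksize=8000,
                           dtype="int16", channels=1, callback=on_audio):
        while True:
            if rec.AcceptWaveform(audio.get()):
                return extract_text(rec.Result())
```

`AcceptWaveform` returns True once Vosk has a final result, at which point `Result()` yields JSON with a `"text"` field.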
Piper (https://github.com/rhasspy/piper) generates the text-to-speech.
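Piper can be driven from its command line, reading text on stdin and writing a WAV file. A hedged sketch of that step (the voice model filename and output path are assumptions):

```python
import subprocess


def piper_command(voice: str, wav_path: str) -> list[str]:
    """Build the piper CLI invocation. The voice model filename passed in
    is whichever voice you downloaded; nothing here is project-specific."""
    return ["piper", "--model", voice, "--output_file", wav_path]


def speak(text: str, voice="en_US-lessac-medium.onnx",
          wav_path="/tmp/billy.wav") -> None:
    """Pipe text into piper on stdin, then play the WAV with aplay."""
    subprocess.run(piper_command(voice, wav_path),
                   input=text.encode(), check=True)
    subprocess.run(["aplay", wav_path], check=True)
```

Keeping the command builder separate from the subprocess call makes the invocation easy to inspect and test without audio hardware.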
A smaller LLM was chosen in order to generate quick responses. I set up the LLM with a system prompt to let Billy know that he was a fish on the wall and to answer all questions with some folksy wisdom!
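The post doesn't name the model or runtime, so the sketch below assumes a local Ollama server and an illustrative persona prompt; the actual wording and model are placeholders:

```python
def billy_messages(user_text: str) -> list[dict]:
    """Build the chat messages carrying Billy's persona. This prompt
    text is illustrative, not the author's actual prompt."""
    system = ("You are Billy, a wise-cracking fish mounted on a wall. "
              "Answer every question briefly, with folksy wisdom.")
    return [{"role": "system", "content": system},
            {"role": "user", "content": user_text}]


def ask_billy(user_text: str, model="llama3.2:1b") -> str:
    """Query a local Ollama server (an assumption; swap in your runtime)."""
    import requests  # local import: only needed when actually querying

    r = requests.post("http://localhost:11434/api/chat",
                      json={"model": model,
                            "messages": billy_messages(user_text),
                            "stream": False},
                      timeout=120)
    r.raise_for_status()
    return r.json()["message"]["content"]
```

With `"stream": True` instead, the same endpoint returns the reply in chunks, which is what lets responses flow to the TTS stage as they arrive.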
From there, I used an Arduino-compatible board to sample the audio output. Based on the amplitude levels, it generates motor control signals for the tail flapping, head turning, and mouth movements.
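The actual firmware runs on the Arduino-compatible board in C++, but the amplitude-to-motor mapping can be sketched in Python. The thresholds below are illustrative guesses, not the project's tuned values:

```python
def motor_commands(amplitude: int,
                   mouth_threshold: int = 200,
                   flap_threshold: int = 600) -> dict:
    """Map a sampled audio amplitude (0-1023 from a 10-bit ADC) to
    motor states. Any audio opens the mouth; loud peaks flap the tail.
    Threshold values are assumptions for illustration."""
    return {
        "mouth_open": amplitude > mouth_threshold,  # speech -> open mouth
        "tail_flap": amplitude > flap_threshold,    # loud peak -> flap
    }
```

A real sketch would also smooth the amplitude (a short running average) so the mouth doesn't chatter on every sample.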
I put this all together with a small Python script that listens for audio input, converts it to text, sends the text to the LLM, and streams responses from the LLM to Piper for text-to-speech conversion. From there, the audio is played using the aplay utility.
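One turn of that pipeline can be sketched with the stages injected as callables (the names `listen`, `ask`, `synthesize` stand in for the Vosk, LLM, and Piper pieces, and the streaming hand-off is collapsed into a single call for clarity):

```python
import subprocess


def one_turn(listen, ask, synthesize, play) -> str:
    """Run one conversation turn: mic -> STT -> LLM -> TTS -> speaker.
    The four stages are injected so the turn logic is testable offline."""
    heard = listen()         # Vosk speech-to-text
    reply = ask(heard)       # LLM response
    wav = synthesize(reply)  # Piper writes a WAV, returns its path
    play(wav)                # powered speaker
    return reply


def play_wav(path: str) -> None:
    """Play a WAV with the aplay utility, as the post describes."""
    subprocess.run(["aplay", path], check=True)
```

The main script would simply call `one_turn(...)` in a loop, passing `play_wav` as the final stage.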
Overall this was a super fun project to work on! Good old Billy now sits on my wall, chatting away with me daily!