Undergraduate Projects
PixieBot: Smart Pixelated Companion
The goal of this project is to build a smart pixelated companion called PixieBot. PixieBot requires speech processing with the use of a hot word, text-to-speech for its replies, and an implementation of facial recognition, built on trained models, so that it can recognise users’ faces. After each user input, PixieBot responds with pixelated renderings on its display as well as short audio responses, all driven by a predictable artificial personality so that PixieBot can visually express emotion.
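The brief does not name the speech library or the actual hot word, so the following is only a minimal sketch of hot-word-gated listening, assuming the open-source SpeechRecognition package, its free Google recogniser backend, and a placeholder hot word "pixie":

import speech_recognition as sr

HOT_WORD = "pixie"  # assumed wake word; the real hot word is not specified in the brief

def listen_for_command():
    """Block on the microphone and return whatever is spoken after the hot word."""
    recogniser = sr.Recognizer()
    with sr.Microphone() as source:
        recogniser.adjust_for_ambient_noise(source)
        audio = recogniser.listen(source)
    try:
        text = recogniser.recognize_google(audio).lower()
    except sr.UnknownValueError:
        return None  # nothing intelligible was heard
    if text.startswith(HOT_WORD):
        # Strip the hot word and return the remainder of the utterance
        return text[len(HOT_WORD):].strip()
    return None

if __name__ == "__main__":
    while True:
        command = listen_for_command()
        if command:
            print(f"Heard command: {command}")  # placeholder for PixieBot's response pipeline

The utterance returned here would then be routed into whichever module (weather, calendar, and so on) matches the request.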
To achieve this goal, PixieBot is developed in Python with pre-existing open-source libraries for speech and facial recognition. Python was chosen for the flexibility and reliability of dynamically importing code at runtime. Multiprocessing separates the UI from the recognition processing, and individual modules are built for features such as text-to-speech and weather. Docker provides a containerised environment so the system runs regardless of the operating system. PixieBot also integrates with the Philips Hue Smart Hub and with users’ phones. To make PixieBot sound more ‘Pixie-like’, its audio output is processed as a WAV file, and its emotions (Neutral, Thinking, Confused, Happy, Angry and Sad) are displayed as animated GIFs.
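The report does not detail how the processes are wired together; a minimal sketch of the UI/recognition split, assuming a shared multiprocessing queue and placeholder worker logic, could look like this:

import multiprocessing as mp
import time

def recognition_worker(queue):
    """Runs speech and face recognition in its own process and posts results to the UI."""
    while True:
        # Placeholder for the real hot-word / face-recognition loop
        time.sleep(5)
        queue.put({"emotion": "Happy", "speech": "Hello!"})

def ui_loop(queue):
    """Owns the display: reads recognition results and renders the matching emotion GIF."""
    while True:
        event = queue.get()  # blocks until the recognition process produces something
        print(f"Displaying {event['emotion']} GIF and saying: {event['speech']}")

if __name__ == "__main__":
    queue = mp.Queue()
    recogniser = mp.Process(target=recognition_worker, args=(queue,), daemon=True)
    recogniser.start()
    ui_loop(queue)  # the UI stays in the main process so it keeps control of the screen

Keeping the UI in the main process means the display never blocks on recognition work, which is the separation described above.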
Technologies Used:
● Final product runs on a small machine
● 4 GB RAM
● No more than 1.5 GHz processing speed
● Desktop application running through Docker
Development platform:
● Linux Virtual Machine
● Visual Studio Code
● Python
Services:
● Google Calendar
● Google TTS (see the sketch after this list)
● Firebase (notifications)
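The listing above names Google TTS but not the client library, so the following sketch assumes the open-source gTTS package for synthesis; the ‘Pixie-like’ pitch shift from the overview is illustrated with the standard-library wave module on a WAV copy of the response (the MP3-to-WAV conversion step is assumed and not shown):

import wave
from gtts import gTTS  # assumed client for the Google TTS service listed above

def synthesise(text, path="response.mp3"):
    """Generate a short spoken response with Google's TTS service."""
    gTTS(text=text, lang="en").save(path)
    return path

def pixiefy(wav_in, wav_out, factor=1.3):
    """Raise the pitch of a WAV response by replaying it at a higher frame rate."""
    with wave.open(wav_in, "rb") as src:
        params = src.getparams()
        frames = src.readframes(src.getnframes())
    with wave.open(wav_out, "wb") as dst:
        dst.setnchannels(params.nchannels)
        dst.setsampwidth(params.sampwidth)
        dst.setframerate(int(params.framerate * factor))  # higher rate -> higher, 'Pixie-like' voice
        dst.writeframes(frames)

Raising the frame rate plays the same samples back faster, which lifts the pitch and gives the chipmunk-style ‘Pixie’ effect.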
Developed By:
Christal Tan
Stephen Byatt
Andy Long Pham
Mingkai Guo