Making Rosie listen: Rosie and overlay stuff
For a while I've wanted to turn Rosie, our chatbot, into a voice assistant, and today we worked on exactly that. We used the Web Speech API for speech recognition and started connecting to wit.ai for intent recognition.
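For context, here is a minimal sketch of the kind of Web Speech API setup involved; the wiring into Rosie (the handleCommand call) is a placeholder for illustration, not the actual stream code:

```js
// Minimal Web Speech API sketch; Chrome exposes it as webkitSpeechRecognition.
const SpeechRecognition = window.SpeechRecognition || window.webkitSpeechRecognition;
const recognition = new SpeechRecognition();

recognition.lang = 'en-US';
recognition.continuous = true;      // keep listening instead of stopping after one phrase
recognition.interimResults = false; // only report final transcripts

recognition.onresult = (event) => {
  const result = event.results[event.results.length - 1];
  const transcript = result[0].transcript.trim();
  console.log('Heard:', transcript);
  // handleCommand(transcript); // placeholder: send the transcript on to intent handling
};

recognition.onerror = (event) => console.error('Speech recognition error:', event.error);

recognition.start();
```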
Segments
| Timestamp | Topic |
| --- | --- |
| 0:00:00 | Hello World |
| 0:09:43 | Giving a tour around Rosie and the layout system |
| 0:35:14 | Working on the speech recognition |
| 2:16:00 | Connecting to Wit.ai |
| 2:41:25 | The end |
Announcements
- Improved the MIDI extension. Great feedback!
- Tried to create visual AI with face tracking and hand tracking. Unsuccessful :(
- !discord
- !git
Project working on today
- Want to make Rosie ‘listen’ and respond to spoken commands
Next Steps
- Add more intents
- Connect intents to the Chatbot code (a rough sketch follows this list)
- Combine the layouts and the bot into one app, possibly with Electron
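For the "connect intents" step, here is a rough sketch of sending a transcript to Wit.ai's /message endpoint and picking the top intent. The token, the example utterance, and the handleIntent hook are assumptions for illustration, not the actual RosieBot code:

```js
// Sketch: ask Wit.ai which intent a spoken phrase matches (Node 18+ for global fetch).
// WIT_TOKEN and handleIntent are placeholders.
const WIT_TOKEN = process.env.WIT_TOKEN;

async function getIntent(utterance) {
  const url = `https://api.wit.ai/message?v=20200101&q=${encodeURIComponent(utterance)}`;
  const response = await fetch(url, {
    headers: { Authorization: `Bearer ${WIT_TOKEN}` },
  });
  const data = await response.json();

  // Wit.ai returns matched intents sorted by confidence; take the best one if any.
  const intent = data.intents && data.intents[0];
  return intent ? intent.name : null;
}

getIntent('switch to the coding layout').then((intent) => {
  if (intent) {
    console.log('Matched intent:', intent);
    // handleIntent(intent); // placeholder: hand off to the chatbot code
  } else {
    console.log('No intent matched.');
  }
});
```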
Things to Review & Notes
- [21:45] +wingysam: for (const module of fs.readdirSync('modules')) { const moduleExports = require('./modules/' + module); modules[moduleExports.name] = moduleExports.run }
- [22:19] +wingysam: Btw, VSCode understands JSDoc natively: just type /** above a function and it will spit out a JSDoc template with your function's params (see the example after these notes)
- [22:25] +wingysam: https://github.com/sorskoot/RosieBot/issues/1
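To go with wingysam's JSDoc tip, here is a tiny example of the kind of comment VSCode generates from /** and then uses for IntelliSense; the function itself is just a made-up placeholder, not code from the repo:

```js
/**
 * Builds the bot's reply to a chat command.
 * @param {string} user - Display name of the viewer who typed the command.
 * @param {string} command - The command without the leading "!".
 * @returns {string} The message the bot should send back.
 */
function respond(user, command) {
  return `Hey ${user}, thanks for using !${command}`;
}
```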
Today's stream brought to you by
All the wonderful people that hung out with us in chat!