Could Q.ai be commercializing the AlterEgo tech coming out of the MIT Media Lab?
i.e. "detects faint neuromuscular signals in the face and throat when a person internally verbalizes words"
If this works well, I could finally see AI wearable pins becoming socially feasible. IMO, speaking aloud to an AI in public doesn't seem like something that will work, yet it's also what OpenAI is apparently investing heavily in with its Jony Ive hardware ambitions [0].
> ...in most people, when they "talk to themselves" in their mind (inner speech or internal monologue), there is typically subtle, miniature activation of the voice-related muscles — especially in the larynx (vocal cords/folds), tongue, lips, and sometimes jaw or chin area. These movements are usually extremely small — often called subvocal or sub-articulatory activity — and almost nobody can feel or see them without sensitive equipment. They do not produce any audible sound (no air is pushed through to vibrate the vocal folds enough for sound).

> Key evidence comes from decades of research using electromyography (EMG), which records tiny electrical signals from muscles: EMG studies consistently show increased activity in laryngeal (voice box) muscles, tongue, and lip/chin areas during inner speech, silent reading, mental arithmetic, thinking in words, or other verbal thinking tasks.
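For anyone curious about the signal-processing side, here's a rough sketch of what picking that subvocal EMG activity out of a raw signal might look like: bandpass to the typical surface-EMG band, rectify and smooth into an activation envelope, then threshold against a resting baseline. Every specific here (1 kHz sample rate, 20-450 Hz band edges, the window and threshold values, the function names) is my own illustrative assumption, not anything from the AlterEgo paper or Q.ai:

    # Rough sketch of a subvocal-EMG detection pipeline. All parameters
    # (sample rate, band edges, window, threshold) are illustrative guesses,
    # not values from AlterEgo or Q.ai.
    import numpy as np
    from scipy.signal import butter, filtfilt

    FS = 1000  # assumed sample rate in Hz

    def bandpass(x, lo=20.0, hi=450.0, fs=FS, order=4):
        """Keep the typical surface-EMG band; drop drift and motion artifacts."""
        b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return filtfilt(b, a, x)

    def rms_envelope(x, win_ms=100, fs=FS):
        """Rectify and smooth the signal into a muscle-activation envelope."""
        win = int(fs * win_ms / 1000)
        kernel = np.ones(win) / win
        return np.sqrt(np.convolve(x**2, kernel, mode="same"))

    def detect_subvocal(x, rest, k=3.0):
        """Flag samples whose envelope exceeds the resting mean + k*std."""
        env = rms_envelope(bandpass(x))
        rest_env = rms_envelope(bandpass(rest))
        thresh = rest_env.mean() + k * rest_env.std()
        return env > thresh  # boolean mask of candidate inner-speech activity

    # Demo on synthetic data: quiet baseline around a stronger "subvocal" burst.
    rng = np.random.default_rng(0)
    rest = rng.normal(0, 1.0, FS * 2)    # 2 s of resting noise
    burst = rng.normal(0, 4.0, FS)       # 1 s of elevated activity
    signal = np.concatenate([rest, burst, rest])
    mask = detect_subvocal(signal, rest)
    print(f"fraction flagged: {mask.mean():.2f}")

The actual hard part, of course, isn't detecting that *something* is happening but decoding *which words* are being subvocalized — that's where the multi-electrode placement and the learned models come in.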
Yep, looks like that is it. Recent patent from one of the founders: https://scholar.google.com/citations?view_op=view_citation&h...