That's an interesting point! We did consider adding memory to the voice agent, and we have use cases like an AI therapy session that needs to recall earlier conversations with the patient. Adding the previous chat history would be very helpful as well.
The use case I recall involves a nonprofit organization focused on suicide prevention. They are hoping for an AI therapy solution capable of listening to patients and answering the phone when no human is available. This isn't entirely unreasonable, since one of a therapist's roles is to listen to problems, so an AI can substitute in that narrow respect.
You're not wrong, and I agree this is a great use case, but consider calling it crisis response rather than therapy. A therapist is there to help you dig deep over a long period; crisis response is a tactical mechanism to prevent imminent self-harm.
Amazing product, looking forward to working with it.
If it were only to fill gaps, that sounds reasonable, but another risk here is that the voice agent picks up the slack and lowers the pressure to staff properly and to solve these problems in the first place.
What is worse: that no one is available to listen to you when you're suicidal, or the feeling that you matter so little that only a machine will talk to you? I'm sure some people would react extremely poorly to that.