Nice start. There's definitely room for a good native macOS chat client; I've tried a few now and none feel perfect. I found two that feel usable:
HuggingChat (https://github.com/huggingface/chat-macOS)
It has a launcher interface in the current release, and code, LaTeX, etc. are pretty-printed. You can switch from the HF-hosted models to local MLX ones (though those are hardcoded right now, I think). I like it for quick queries to qwen2.5-coder, and I think it would be great if they keep developing it.
Enchanted (https://github.com/gluonfield/enchanted)
This one feels a bit buggy and may be abandoned, but it has the basic functionality for working with Ollama models (a rough sketch of the local API it talks to is below).
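For context on what clients like this do under the hood: Ollama serves a local HTTP API (by default on http://localhost:11434), and the app just posts chat messages to it. Here's a minimal Python sketch of such a call; the model name is only an example and assumes you've already pulled it with Ollama:

    # Sketch of a single non-streaming call to a local Ollama server.
    # Assumes Ollama is running on the default port and the model has been pulled.
    import json
    import urllib.request

    def ask_ollama(prompt, model="qwen2.5-coder", host="http://localhost:11434"):
        payload = {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,  # ask for one JSON response instead of a stream
        }
        req = urllib.request.Request(
            f"{host}/api/chat",
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["message"]["content"]

    print(ask_ollama("Write a Swift one-liner that reverses a string."))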
I've had a generally positive experience with MindMac[1], which is another native macOS app. I've raised a few issues and feature requests and the developer has been pretty responsive to feedback.
Also worth a mention is aichat (https://github.com/sigoden/aichat). It's not a native GUI app, but it's an impressive CLI client.