
Can this get the ggerganov treatment so that I can run it on Apple silicon?


SD models are much smaller, so you can probably already run them with the GitHub code.

Image models work well at least: https://huggingface.co/docs/diffusers/optimization/mps

It is even possible to run them in the browser: https://stablediffusionweb.com/
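To make the parent's point concrete, here is a minimal sketch of running a Stable Diffusion pipeline on Apple silicon via PyTorch's MPS backend, along the lines of the linked diffusers guide. This assumes `torch` and `diffusers` are installed; the model name is illustrative, not prescribed by the thread.

```python
# Sketch: Stable Diffusion on Apple silicon via the MPS backend.
# Assumes `torch` and `diffusers` are installed; model name is illustrative.
import platform


def mps_candidate() -> bool:
    # PyTorch's MPS backend only exists on macOS arm64 (Apple silicon) builds.
    return platform.system() == "Darwin" and platform.machine() == "arm64"


def run_pipeline(prompt: str):
    # Heavyweight imports are done lazily so the module loads without them.
    from diffusers import DiffusionPipeline

    pipe = DiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
    pipe = pipe.to("mps" if mps_candidate() else "cpu")
    # Attention slicing trades a little speed for much lower peak memory,
    # which the diffusers docs recommend on Macs with limited unified memory.
    pipe.enable_attention_slicing()
    return pipe(prompt).images[0]
```

On a machine without Apple silicon the sketch falls back to CPU, which works but is slow; the MPS path is where the speedup comes from.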


With llama.cpp getting GPT4All inference support the same day it came out, I feel like llama.cpp might soon become a general-purpose, high-performance inference library/toolkit. Excited!


Heh, things seem to be moving in this direction, but I think there's still a very long way to go. But who knows - the amount of contributions to the project keeps growing. I guess once we have a solid foundation for LLM inference, we can think about supporting SD as well.


Thanks for your work in this space, it’s incredible!


Absolutely, it's just unbelievable what this one person has enabled. ChatGPT isn't as mind-blowing as ggerganov is.



