Hacker News

The GPT-4 model is crazy huge. Almost 1T parameters, probably 512 GB to 1 TB of VRAM minimum. You need a huge machine just to run inference on it. I wouldn't be surprised if they are just having scaling issues rather than any sort of conspiracy.
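The VRAM figure follows from simple arithmetic on bytes per parameter. A back-of-envelope sketch (assuming weights dominate memory and ignoring KV cache and activations; the ~1T parameter count is the rumored figure from the comment, not a confirmed spec):

```python
# Back-of-envelope VRAM estimate for holding a ~1T-parameter model's weights.
# Memory for weights = (number of parameters) x (bytes per parameter).
def weight_memory_gb(params: float, bytes_per_param: float) -> float:
    """Memory needed just to hold the weights, in GB (1 GB = 1e9 bytes)."""
    return params * bytes_per_param / 1e9

PARAMS = 1e12  # the rumored ~1T parameters

for precision, bytes_per_param in [("fp16", 2), ("int8", 1), ("4-bit", 0.5)]:
    print(f"{precision}: {weight_memory_gb(PARAMS, bytes_per_param):,.0f} GB")
# fp16: 2,000 GB; int8: 1,000 GB; 4-bit: 500 GB
```

At 8-bit or 4-bit quantization this lands in the 512 GB to 1 TB range the comment cites; at fp16 it would be roughly double that.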


> Almost 1T parameters

AFAIK, there is literally no basis for this persistent claim other than outside speculation.



