Hacker News

3090, trivially.

No reason to go 4090, since it's no more capable for this use (same 24GB of VRAM), and the 5090 probably won't have more than 24GB either, simply because Nvidia wants to maintain its margins through market segmentation (adding more VRAM to that card would obsolete their low-end enterprise AI cards that cost $6000+).



Appreciate the info!

In another thread I saw a recommendation for dual 3090s if you're not doing anything gaming-related, so it's good to have some confirmation here.


I'd also consider dual A6000-48GB (96GB total) if you have a budget of $8000 or dual V100-32GB (64GB) if you have a budget of $4000.

The V100 is older and slower, but for AI applications VRAM is king, and there are lots of enterprise V100s coming off racks and being sold cheap on eBay.
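To see why VRAM capacity matters more than raw speed here, a rough back-of-the-envelope check is parameters times bytes per parameter, plus some headroom. This is my own sketch, not anything from the thread; the 20% overhead factor for KV cache and activations is an assumption, and real usage varies by runtime and context length.

```python
# Back-of-the-envelope VRAM estimate for LLM inference.
# Assumption (mine): weight memory dominates, plus ~20% overhead
# for KV cache and activations.

def vram_needed_gb(params_billions: float,
                   bytes_per_param: float,
                   overhead: float = 1.2) -> float:
    """Rough VRAM needed to load a model for inference, in GB."""
    return params_billions * bytes_per_param * overhead

# A 70B model at 4-bit quantization (~0.5 bytes/param):
print(round(vram_needed_gb(70, 0.5), 1))  # ~42GB: fits dual 24GB 3090s

# The same model at fp16 (2 bytes/param):
print(round(vram_needed_gb(70, 2.0), 1))  # ~168GB: far beyond any of these setups
```

By this estimate, dual 3090s (48GB) or dual V100-32GBs (64GB) comfortably hold a quantized 70B model, which is why total VRAM drives the recommendations above.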



