
someone has a major confusion of levels here. have they ever tried asking gpt-4 to multiply some matrices, i wonder?


I'm guessing that's a joke. The benefit of hardware support for fractions is eliminating the rounding errors you get with today's machines. You can do it with libs like PyMath, but ultimately you need to build it from the ground up to keep type and class abstractions from messing up the accuracy, so why not start at the hardware level so potential future chips are automatically supported? Then you can get today's performance without the overhead of legacy software.
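For what it's worth, you can already see the rounding-error point at the library level; here's a minimal sketch using Python's standard fractions module (not PyMath or any hardware support, just an illustration of exact rationals vs. binary floats):

    from fractions import Fraction

    # Binary floats can't represent 0.1 or 0.2 exactly, so the sum drifts.
    print(0.1 + 0.2 == 0.3)                                      # False

    # Exact rationals keep integer numerator/denominator: no rounding.
    print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True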


it's not a joke. writing a program in fractran to multiply input fractions that are part of the input data is no easier than writing a program in other turing-tarpit esolangs with no built-in arithmetic support, like the λ-calculus. try it. or check out https://stackoverflow.com/questions/1749905/code-golf-fractr... which has a couple of unusably inefficient fractran interpreters in fractran
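to make the level confusion concrete: in fractran the fractions are the program, and the only data is a single positive integer whose prime-factor exponents act as registers. a minimal interpreter sketch in python, with conway's one-fraction adder as the example (names are mine, not from any standard implementation):

    from fractions import Fraction

    def fractran(program, n, max_steps=100_000):
        # State is one positive integer n. Each step, multiply n by the
        # first fraction in the program that yields an integer; halt
        # when no fraction applies.
        for _ in range(max_steps):
            for f in program:
                m = n * f
                if m.denominator == 1:
                    n = int(m)
                    break
            else:
                return n  # no fraction applied: program halts
        raise RuntimeError("step limit reached")

    # Conway's adder is the single fraction 3/2: starting from 2**a * 3**b
    # it halts at 3**(a+b). The "addition" happens only in the exponents;
    # the fractions themselves are instructions, not operands.
    adder = [Fraction(3, 2)]
    a, b = 4, 5
    assert fractran(adder, 2**a * 3**b) == 3**(a + b)

multiplying two fractions supplied as input data would mean encoding their numerators and denominators as prime exponents and writing a far longer program to combine them, which is exactly the point.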



