You have to think ahead. You have 96 numbers now, but beyond 96 you have to switch to a larger data type to store the bigger values. So when you pre-calculate, say, 10,000 numbers, what happens to your array then? What if the program is run, for some reason, on an embedded system with memory limits? Say someone finds a reason to use Fibonacci numbers on an embedded system that monitors and controls machines?
You may not know of a reason to use Fibonacci numbers, but that doesn't mean someone else won't find one.
Given the limits of the data type, I really am not going to think ahead until I change the data type[1]. If the new data type can hold 10,000 numbers, then I will probably go back to an algorithm.
I am really not sure about these embedded systems that have more RAM than ROM / EPROM / Flash. 384 bytes spent for constant-time performance in an embedded situation sounds pretty good to me.
[1] Not only would that be premature optimization, it would also change the function from constant time to variable time. I do wonder how big the stack frame is.
Actually, in the distant past, I did. I don't remember many embedded systems where Fibonacci numbers were important or where floating point was a real option (it's been a while; probably about as long since 384 bytes was the killer). I was always more constrained by RAM than by ROM/EPROM.