
> The central thesis is that the shortage will lead to innovation, but in my experience it has put a HUGE strain on everyone doing R&D. I suspect it'll instead lead to rising costs and stagnation as people wait for supplies. This is already basically what we have in the consumer hardware market, where last year's consoles still aren't consistently on store shelves.

I think there's some truth to creativity being enhanced by constraints. Certainly, if supplies are limited, and especially if the limits are uneven, there's going to be an incentive to design around the chips that are scarce, and some of that might be innovation that stays useful even after supply goes back to normal. Of course, CPU shortages are hard to design around, especially if all CPUs are in short supply; but some applications might be able to make use of whatever CPUs happen to have better availability.



Technically, sure. But the extent of the innovation I've seen so far is people going back to using inferior chips because that was all they could get within their budget.

Microcontrollers aren't exactly interchangeable, even within the same product line. You could design for flexibility and use the Arduino framework to run your code on most Microchip/Atmel, ST, and a million other chips, but that comes at enormous cost -- to put it nicely, it's an incredibly inefficient library if you're not doing anything demanding, and damn near worthless if you are. Any multiplatform framework that abstracts away the inner workings of microcontrollers is going to be too heavy for a huge fraction of real power and complexity budgets.
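To make the cost concrete, here's a rough sketch (assuming an ATmega328P target, where Arduino pin 13 is PB5; nothing here is from any particular product's firmware):

    #include <avr/io.h>

    int main(void) {
        /* Bare-metal: two register writes, a couple of instructions,
           but tied to this one chip. */
        DDRB  |= (1 << DDB5);    /* configure PB5 as an output */
        PORTB |= (1 << PORTB5);  /* drive PB5 high */

        /* The portable call, digitalWrite(13, HIGH), does the same thing,
           but first maps "13" to a port/bit through lookup tables, checks
           whether a timer owns the pin, and masks interrupts around the
           write. That overhead is the price of running on "any" chip. */
        for (;;) { }
        return 0;
    }

The register version only builds for that one part family, which is exactly the tradeoff: the portable call buys flexibility at the cost of lookups and interrupt masking on every pin write.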

It's not just MCUs and firmware either: any time you replace a component due to shortages, you need to revalidate your design. Constantly redesigning and revalidating boards based on available stock is what people are doing right now just to keep the lights on. It's hell.

If you don't need a microcontroller to do whatever you do, then sure, pop it out and save a few bucks. But that's hardly innovation; it's more like rectifying a mistake made in the initial design.


I think you're like 98% right. Swapping an MCU is a lot of work, and so is swapping other chips... I just wonder how many people are going to have all the chips but one, figure out how to wing it, and how many of those solutions end up being interesting/useful/kept past when the missing chip becomes available.

I'm thinking of stuff like how (at least some) dishwashers with 'dirt' sensing don't actually sense dirt at all; instead the pump has a thermal overload, and they count how many times the pump cycles on it to indicate dirtiness.

If you used to have a dirt sensor but it was delayed 18 months, you might figure something like that out, and maybe that's handy. Or maybe there's some other thing you'd like to measure that has no good direct sensor, but it causes something else to misbehave in a way you can measure; you just wouldn't have thought of it except that you ran out of dirt sensors.
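As a toy illustration of that kind of proxy sensing (the counter and thresholds here are invented, not taken from any real dishwasher firmware), the "sensor" ends up being little more than an event counter:

    #include <stdio.h>

    /* Hypothetical proxy 'dirt sensor': instead of measuring turbidity,
       count how many times the pump's thermal overload tripped during a
       wash phase. A dirtier load clogs the filter and works the pump
       harder, so more trips ~ more soil. Thresholds are made up. */
    typedef enum { LOAD_LIGHT, LOAD_NORMAL, LOAD_HEAVY } load_t;

    static load_t estimate_soil(unsigned overload_trips) {
        if (overload_trips == 0) return LOAD_LIGHT;
        if (overload_trips <= 2) return LOAD_NORMAL;
        return LOAD_HEAVY;
    }

    int main(void) {
        unsigned trips = 3;  /* in firmware this would come from an interrupt counter */
        static const char *names[] = { "light", "normal", "heavy" };
        printf("estimated soil level: %s\n", names[estimate_soil(trips)]);
        return 0;
    }

In real firmware the trip count would come off an interrupt or a status line, but the point stands: the signal was already there, you just weren't treating it as a sensor.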



