Hacker News

A couple months back I was on a panel on Agile methods with a fellow from Intel. [1] Normally I scoff at large-company Agile efforts, but I came away impressed. As I understood his presentation, they had virtualized their entire System-on-Chip development pipeline [2], allowing them to use software methods for hardware products.

Clearly we're not there for open hardware, but I think that's the kind of tooling necessary. If the cost of a build drops from $1k of manufacturing to $0.01 of electricity, and if the hardware experts express their knowledge as automated tests, then suddenly randoms like me can, as with open source, jump in, try a patch, and be given a chance to learn how to do it right.

[1] http://computer.ieeesiliconvalley.org/agile-update-and-is-it...

[2] http://computer.ieeesiliconvalley.org/wp-content/uploads/sit...



For well-established projects with high throughput and lots of resources this is likely practical, but those aren't the kind of things that "open source hardware" (a term I personally hate) typically targets.

For niche products being worked on by a small team with shoestring resources, the problems the article points out just scratch the surface. It gets even more fun when you add mechanical systems and sensors to the mix. I work in embedded systems and spend at least as much of my time debugging hardware as software.

The number of ways hardware can go wrong is astonishingly large. As well as all the ones mentioned in the article, there are temperature effects, power supply effects and most of all: noise. Put your system into an industrial environment with lots of big electric motors generating noise on the mains and see what it does to your design, even after it's passed EMC testing.

Noise and grounding (particularly leakage between analog and digital grounds) cause a ridiculous range of hard-to-debug problems. Software by comparison is easy.

My career has been about 80% software/firmware and 20% hardware, and I'm a hardware hobbyist (including time as a mentor for a FIRST Robotics Competition team, an experience I highly recommend), so I've seen both sides of the coin. The typical "open source hardware" electronics problem is the kind of flaky, intermittent weirdness that is really rare in well-designed software these days (it's been years since I've seen a real Heisenbug in software, whereas they used to be a lot more common). Some of that speaks to how well-designed a lot of modern software is, especially open source stuff. Some of it speaks to how difficult it is to predict and ensure the quality of electronics in the real world.


Sure, but the history of computing has a lot of people solving hard/expensive problems and then having the techniques trickle down to small/cheap ones. That's my hope here.

Of course, I doubt it will ever work for the hand-built stuff. I'd guess it will only make sense for stuff whose fabrication is automated or controlled enough that it can be reasonably simulated.


You seem to be discussing a very different level of hardware development than the linked article is.

SoC design, FPGAs, and generally any digital-logic-oriented device lend themselves well to software-style development approaches because a large portion of the work is software. The design of such devices is done in an HDL (Hardware Description Language), not with schematics.

There really isn't an HDL for board-level design (unless you count SPICE, I guess), mostly because board-level design doesn't have the neat, clean abstractions that on-chip digital design has. E.g., you can black-box a 4-bit adder and glue a few together without much care about the implementation. You can't do the same thing with an audio pre-amp circuit that will be part of some mixed-signal board, because the implementation will often be driven by complex interactions with the rest of the design, right down to the physical placement and layout of the circuit on the PCB.
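To make the black-box point concrete, here's a rough sketch (in Python rather than a real HDL, purely as an analogy) of why digital design composes so cleanly: a 4-bit adder can be treated as an opaque unit and two of them chained through the carry line to build an 8-bit adder, with no knowledge of the internals. All names here are illustrative, not from any real tool.

```python
def adder4(a, b, carry_in=0):
    """A 4-bit adder treated as a black box: returns (4-bit sum, carry_out)."""
    total = (a & 0xF) + (b & 0xF) + carry_in
    return total & 0xF, total >> 4

def adder8(a, b):
    """Glue two 4-bit adder black boxes together via the carry chain."""
    lo, carry = adder4(a & 0xF, b & 0xF)
    hi, carry_out = adder4(a >> 4, b >> 4, carry)
    return (hi << 4) | lo, carry_out

print(adder8(0x7F, 0x01))  # (128, 0): 0x7F + 0x01 = 0x80, no overflow
```

An analog pre-amp has no equivalent of that clean `(sum, carry)` interface: its behavior leaks into and out of the surrounding circuit, which is the parent's point.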


Sure, I'm not saying they're equivalent. I'm just saying that this points the way.

There was also a time when making software was too expensive for modern methods to work. When my dad got started, machine time was far too expensive to spend on things like frequent compiling and automated testing; it was cheaper to have humans stare very hard at paper printouts and simulate operations with paper and pencil.

That changed with the rise of the personal computer, and changed further as computing power got cheaper. Now developers can spin up whole virtual clusters of machines to test things out, easing the development of open-source software for those environments. If software is eating the easy end of hardware, that at least means that some open-source hardware will become plausible. But I hope the progress will continue over coming decades.



