This reminds me of http://nand2tetris.org/ (and the accompanying Coursera course). It's a fantastic course to learn about computing from first principles.
The course starts off with just a few logic gates and shows how you can combine them to make more complicated circuits. These are in turn combined to reach higher levels of abstraction, until you finally have a 16-bit computer with 16K of RAM.
It's a really cool course that helped me flesh out my understanding of what's going on in my computer.
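To make the "gates all the way up" idea concrete, here's a toy illustration (not actual course material, just a Python sketch): every gate, and eventually the adders inside the ALU, can be expressed as compositions of NAND.

```python
# A minimal sketch of the nand2tetris idea: everything is built from NAND.
def nand(a, b):
    return not (a and b)

# Every other gate is just a combination of NANDs.
def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor_(a, b):
    return or_(and_(a, not_(b)), and_(not_(a), b))

# A half adder -- the first step toward an ALU -- is just XOR and AND.
def half_adder(a, b):
    return xor_(a, b), and_(a, b)  # (sum, carry)
```

From there the course stacks full adders, multi-bit adders, registers, and so on, until the whole machine exists.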
Quick question if I may. I have a very keen interest in trying this course...is what you build (hack) using actual physical circuits / components? Or do you run it all in simulation?
I don't know if I should post on HN my 32-bit CPU that only exists as a software emulator, plus basic firmware consisting of a simple BIOS and a machine code monitor that imitates Wozniak's Apple ][ machine code monitor. I even managed to get a Raspberry Pi B to run the emulator.
I tried to write the CPU in VHDL, but I only did a few pieces (register storage and a 32-bit ALU), so I could get a real and accurate cycle count for each instruction.
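For anyone curious what a software CPU emulator looks like at its core, here's a hedged sketch of the classic fetch-decode-execute loop. The three-instruction ISA (LOADI/ADD/HALT) is made up for illustration, not the parent's actual instruction set.

```python
# Hypothetical minimal CPU emulator: fetch, decode, execute, count cycles.
def run(mem, max_cycles=1000):
    regs = [0] * 4   # register file
    pc = 0           # program counter
    cycles = 0
    while cycles < max_cycles:
        op, a, b = mem[pc]      # fetch
        pc += 1
        if op == "LOADI":       # decode + execute
            regs[a] = b         # load immediate b into register a
        elif op == "ADD":
            regs[a] = (regs[a] + regs[b]) & 0xFFFFFFFF  # 32-bit wraparound
        elif op == "HALT":
            break
        cycles += 1
    return regs, cycles

# Example program: r0 = 2 + 3
prog = [("LOADI", 0, 2), ("LOADI", 1, 3), ("ADD", 0, 1), ("HALT", 0, 0)]
```

A machine code monitor in the Woz style is essentially a thin read/write/run front end over a memory array like `mem`.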
Many people would be really interested to read a submission about that. I think they'd especially be interested to read how you solved any problems you had.
This is really cool, something I've wanted to do but never actually done. The closest I've come is a very simple 8-bit CPU implemented in Verilog ( https://gist.github.com/benpye/07aa8a66e25a7c702766 ). The program in the BRAM calculates the 5th number in the Fibonacci series, or something like that, I've forgotten. It works on the iCE40 HX8K breakout board if anyone wants to try it; I was using the IceStorm toolchain.
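For context, a Fibonacci program on a tiny CPU like that usually boils down to a two-register loop. A Python sketch of the same computation (assuming 8-bit wraparound, not the actual Verilog):

```python
def fib_8bit(n):
    # Two "registers" with 8-bit wraparound, the way a minimal CPU would do it.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, (a + b) & 0xFF
    return a

# fib_8bit(5) -> 5 (sequence 0, 1, 1, 2, 3, 5, ...)
```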
Since it looks like the author posted this, we've added "Show HN" to the title. Normally that requires something people can try out, but when you're making something on an FPGA * that's hard to do, so a demo is fine, as long as it works.
* I originally said "making your own hardware" but that caused confusion. Sorry!
He didn't make his own hardware though? It's a software-based CPU. Actually manufacturing a CPU at home would require access to a foundry. Or, y'know, build a really, really slow one out of parts you could actually use in your home.
I think this is really cool, but all he did was use a cool product for exactly what it is intended for.
The video seems deliberately coy about the fact that this is all software based. If you're going to post it for public consumption you should be more up front, I think, or some people are probably going to assume from the title that you actually manufactured your own CPU.
There are plenty of hand-built CPU projects built by enthusiasts, made out of wire-wrapped integrated circuits[0]. There was even a Zuse homage I saw that was built entirely out of magnetic relays[1]. It's...insane, tbh.
I've worked for some characters in my time, and some of my elderly bosses were hardcore on this subject. As it happens, various industries - rail/transportation, for example - often have large surpluses of electro-mechanical relays sitting in bins in copious quantities, which, by way of practical application of apprentices, become a new 'test subsystem rack' waaaaaay in the back of RailtrainCo.'s computing facility. What I'm trying to say is: "don't power off the back racks, ever, because they are the boss's little relay-CPU project, and it's computing .. something .. (actually it's running the model railway switching system, upstairs in the lobby)" ..
That sounds like something from a Charlie Stross Laundry novel.
Edit: I have encountered systems that were designed in the 1960s where they wanted to rely on tried and tested technology - so pretty much clockwork! This was for sequencing operations during the emergency shutdown of a nuclear reactor, so being conservative in technology platform was perhaps understandable!
Remote reprogramming requiring a metal lathe, milling machine, and a cooperative operator onsite is not necessarily a bug with nukes. Expensive, but accidents are more expensive.
In my case, it was just so the Boss could prove the point: even his relay-CPU could sometimes handle the train signalling/switching systems better than the 'modern' versions we were all sweating to develop. Technological prowess runs deep in the train-nerds world.
This semantic argument over whether an FPGA is "real hardware" is pointless. All the same arguments the "software" side can produce apply equally to, say, a circuit built on a breadboard. Is that circuit not a "real circuit" but "software" for some weird reason? If not, stop calling FPGA bitfiles "software".
There are whole papers in CompSci on why FPGAs aren't ASICs, and many FPGA-to-ASIC conversions failed to achieve their objectives. They really are different enough to warrant a distinction. I'd just call them bitstreams instead of software, though, given that "software" usually implies instructions a la x86 running on fixed HW. Technically that's what FPGAs and bitstreams are, but there's a world of difference in meaning.
It gets extra muddy when we're talking about anti-fuse FPGAs, where the "software" runs exactly once and then becomes hardware. ;) There are still significant logical, structural, and other differences from ASICs or TTL designs - enough not to consider HDL-to-FPGA the same as making, especially manufacturing or wiring up, those.
The point is that the structures and underlying workings of an FPGA doing something are really different from an ASIC's. A person who can prove a design on an FPGA can't necessarily put that specific design on a 350nm node, and surely not on deep sub-micron. The FPGA won't even run by itself in the development configuration. A stand-alone FPGA board with the design flashed is closer to building real hardware, but it's still only a subset of the skill required.
Ok we'll change the title back to what it was originally. Sorry for the distraction.
Edit: actually this is still pretty cool work and it isn't the creator's fault that I oversold it, so I've put "Show HN" back and edited my root comment above. Let's discuss the video now!
I think it's totally valid, but I can see how the word 'homemade' could be misleading. All designs nowadays start as HDL, even the ones manufactured at a foundry.