I'm of similar vintage, although my father was a mainframe technician, so I had some significant on-site expertise to draw on, even for the 6502 and Z-80 boards in our basement. One thing I've noticed, now that I teach programming to the Stack Overflow generation, is that we had to be much, much better at troubleshooting. Now, we had the advantage of dealing with pretty primitive systems -- 'troubleshooting' often involved an oscilloscope (or the poor kid's version: a latch accessorized with an LED and resistor), and an 'upgrade' would as likely as not involve a soldering iron -- but my systems were simple enough that I had a pretty accurate mental model of how the whole system worked, and the number of subsystems that could interact to produce an effect was very limited. Modern systems have complex multiprocessing, multithreading, dozens of peripherals, operating systems providing shared services to hundreds of processes, and every program connected through a fault-tolerant network stack to effectively infinite other computers that may or may not be standards-compliant, or may even be malicious. My childhood mental model of "well, this address line should be going high if the data is being sent to ..." is almost quaint in the face of that.
Still, I've noticed that the Google/SO approach to troubleshooting is, all too often, to collect an exhaustive set of things to try, and begin trying them. If you discover one that works, say yay!, and move on to the next problem. If you're very, very nice, post that it worked so that the solution can be lifted marginally higher in the queue for the next person. My students (who, admittedly, are very inexperienced; I suspect this behaviour becomes less prevalent later) seem to spend very little time on what was a critical first step for me: hypothesize about what things could possibly be causing the problem and then come up with an experiment (often involving a simplified program or using only a subset of the system). My initial tests were rarely to isolate the problem: they were to either eliminate or confirm a class of problems as I successively isolated the specific cause (ok, so it remains if I move it to another block of memory. Maybe it's a race condition? Ok, what if I force this test high, then there won't be any branching in this code block ---that's vaguely recalled from a problem that came down to needing a NOP in one branch path to avoid a fencepost error in some probably over-optimized code). The benefit was that, after a few months of troubleshooting like that, you knew your system absurdly well. Conversations within the tiny community of fellow travellers (there were about a half-dozen in my high school of more than a thousand students) were heroic tales of bugs run to ground that we all learned from and tried ourselves.
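That "eliminate or confirm a class of problems" workflow is essentially binary search over a hypothesis space: each experiment halves the set of suspects instead of testing candidate fixes one at a time. A minimal sketch of the idea (all names and the toy "system" here are made up for illustration):

```python
# Hypothesis-driven debugging as successive elimination: each experiment
# rules an entire class of causes in or out, rather than trying fixes at random.

def bisect_first_bad(items, is_bad):
    """Return the index of the first item whose prefix fails is_bad,
    assuming failure is monotone (once a prefix is bad, longer ones are too)."""
    lo, hi = 0, len(items) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(items[:mid + 1]):
            hi = mid          # culprit is in the first half: shrink the suspect class
        else:
            lo = mid + 1      # first half is exonerated: culprit must come later
    return lo

# Toy example: a list of changes, one of which breaks the build.
tweaks = ["cache_on", "dma_enable", "bad_patch", "log_verbose"]
print(bisect_first_bad(tweaks, lambda applied: "bad_patch" in applied))  # → 2
```

Each call to `is_bad` plays the role of one experiment -- the simplified test program that confirms or eliminates a whole class of suspects at once, which is why a handful of well-chosen experiments beat dozens of blind fixes.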
Unfortunately, it's not really possible to learn a system that well any more; they're too complex. But I wonder if the ease of just typing the error message into Google means this generation doesn't learn the application of the scientific method that good troubleshooting really is. Because, for many of my students, it seems as though if the answer isn't on SO, then it will forever remain a mystery.
i'm firmly in the SO generation. self-/on-the-job-taught. web dev. i enjoy and take the journey of grokking problems, but many of my peers don't. slows me down a bit at first, but then i'm a local expert. feels like these people weren't taught how to think critically, or weren't able to apply that skill to code.
> [...] my systems were simple enough that I had a pretty accurate mental model of how the whole system worked [...]
This is key. I've worked on dozens of systems, platforms, frameworks, etc., and the common predictor of how well I program in each environment is directly related to how well I understand the mental model of the architecture.