Inverted computer culture: A thought experiment (viznut.fi)
184 points by pabs3 on March 6, 2023 | 146 comments


Replace 'computers' with 'open source software' and you have something closer to an uncomfortable truth. Open source has been critical to the advancement of our world, but people who go out of their way to use and encourage open source past the basics (even F-Droid or Firefox, not even getting to hardware and full software stacks) are treated as oddballs even by the tech community.


Embracing open source and freedom ideologically is very difficult. You can install Firefox and still feel claustrophobic when you see paid-for default bookmarks and sponsored content everywhere. You can install an open source operating system, but when you learn your system is ACTUALLY running MINIX at the hardware level, connected to a 3G chipset listening for updates to do whatever the OEM wants, it all starts to feel like a losing battle.

Let's say you're really determined to have freedom, so you buy a fancy new RISC-V single-board computer. Fully open source architecture, bleeding-edge stuff. You'll learn quickly that the GPU relies on a closed-source kernel driver that the OEM refuses to provide updates for after as little as a year or two, effectively acting as hardware DRM, when your device was supposed to be as free as it gets.

There are many roadblocks in the way of being a free software/open source fanboy.


Please enumerate the demonstrable advancements of society attributable to open source software. Edit: lots of downvotes, no credible responses. Guess y'all are stewing in your own cognitive dissonance?


The Human Genome Project heavily used open source software, as did the detection of the Higgs boson on the Large Hadron Collider at CERN. I'm not sure what your standard for advancing society is, but the Internet probably helps investment banks and short sellers all over the world try to price in various current and future events and prevent massive and sudden price corrections that can have major impacts on real people's lives. For example, your pension going bust.


> The Human Genome Project heavily used open source software, as did the detection of the Higgs boson on the Large Hadron Collider at CERN.

Do you think that those projects wouldn't have existed with proprietary software?


Ok so gcc and Linux don’t exist in your world.


Or the internet.


Claiming the internet is a net advance for society is a pretty hard sell in the face of the dissolution of critical institutions, the proliferation of mental health issues, and the hyper-concentration of wealth it has made possible.


These critical institutions you speak of were not here since the beginning of time, nor will they be here till the end of time. Blaming “the internet” for their dissolution is implicitly assuming that they should have continued in their current form (just to begin listing the claims that are hard to sell).

Further, hyper-concentration of wealth is not something “made possible” by the internet. These problems existed before it and will continue existing. Because, if it is a problem at all, it is a social one.


> Further, hyper-concentration of wealth is not something “made possible” by the internet

It should be pretty obvious that, even if the internet did not create concentration of wealth, it has certainly enabled it to a far greater extent than was possible before, which was the point. It would be petty semantics to argue over whether this means that it enabled "hyper"-concentration.


I'm sorry, I can't hear you over Google's near-monopoly over online search and ads and Amazon's deathgrip on merchandising. Could you speak up a little?


Sure, let me say that again. Online search, online advertising, and merchandising are not "critical institutions". Nor have you provided any evidence that the companies Google and Amazon displaced deserved their role as incumbents.

In fact, 2 of the 3 were literally non-existent before the internet was created, so the internet is obviously not causing their dissolution.

Let me spell it out further: no matter how much people moan about it, Google does not have a monopoly on search in the way that the word "monopoly" is being used. There are several alternatives (hello Kagi) and people are free and able to switch in less than 30 seconds; they just don't want to and just don't care.


I'm not sure what you're on about. ICC/MSVC and Windows both exist. They're real compilers and OSes that millions of people use every day, and I've never seen anyone make the case that they're somehow not capable of doing things that their open-source competitors can.


Both great for IT guys, society writ large though? Really?


Most of the world's servers run Linux. Most of the world's people use at least one service running on those servers.


If Linux didn't exist, those servers would be running another open-source OS. Or a proprietary one. Linux isn't special, and in fact, its design and popularity together have set OS research a few decades back.


Android is a Linux.


Android is the worst Linux. On 99% of devices, it's TiVo-ized garbage.

The history of Android is a story about the exploitation of 'open-source', not the incorporation of the benefits of software freedom into society.


I am comfortably working on hardware 10 to 15 years old. So are all the people in my family and the non-technical friends I help. This is possible with Linux and software resulting from the free and open source communities.

With a proprietary OS, all this hardware would be trashed. It would go to waste dumps or poor countries to be dismantled at high health and environmental expense. And I would buy new hardware requiring more destruction of nature through the mining of rare materials. Finally, this additional investment would decrease my financial capacity to do other things.

So to make a long story short, the many (millions of?) people preserving their hardware investment with open source software are saving the environment, health and money to a large extent.

And last but not least, they weaken software company monopolies, which can only be good for society as well.


The open-source community significantly intersects with what is most accurately described as a "cult" dedicated to the irrational defense of open-source by any means necessary.


I am well aware. I was an open source developer for 12 years.


I've watched over time as the desktop stopped being the primary computing device of choice for many people. I've grown far too accustomed to the power and flexibility of a nice keyboard and 32" monitor, and the virtually endless storage and general-purpose compute that I own, to fall into that usage pattern, but I understand the appeal.

No Wednesday morning headaches from Microsoft, no applications breaking with the new Python revision, etc. Just apps from the app store that work (mostly).

So, the Server Farms that comprise the cloud aren't ancient, but they might as well be remote temples just like those we had in the 1960s.


I don’t think much “computing” is done, in our sense of the word, on tablets and phones. Personally, I’ve found that almost any slightly complex, has-to-be-done-once-and-done-right action is better performed on an actual computer. And that’s not even considering writing new programs.


This hasn't been my experience. I'm able to be surprisingly effective on my smartphone. I've constructed trip itineraries, balanced outing budgets, shared shopping lists, designed room layouts, computed food and drink recipe ratios, built DJ sets, laid out wood pieces, and read books using my smartphone. Coding is probably the only thing where I find a keyboard and a seated position unparalleled compared to my smartphone. I used to game more often when I was younger, and I still generally prefer to game in a seated position, though I have gamed on my smartphone before.


Different tools for different tasks. Can you score a movie whilst maintaining perfect clock sync with umpteen devices via SMPTE and load-balance your DSP across a cluster of computers whilst recording an orchestra in real-time? Can you do digital productions in Unreal whilst designing the set in real-time and showing it to the director in-camera as the cast are on set? No, and I wouldn't want to try anyway.

What you're describing is fairly basic stuff, and the computational resources required to perform those tasks have been outpaced by the miniaturization and price-to-performance value of a smartphone or tablet. But complex tasks have always required complex tools, and complex tasks become more and more complex over time as they take advantage of modern-day performance.

The smartphone is the modern-day personal computer. A workstation is still a workstation, and just because it comes in an ATX tower doesn't mean it's consumer-grade hardware.


Really agree. Programming and gaming are the only things I use my laptop for. I have my iPad for CAD and art, everything else I do on my phone. All my trip planning is done on there, all my finances, everything on the phone.


What CAD tools work well on an iPad? I'd love to try it. Fusion 360 is okay on Windows, but it's pretty bad on a Mac.


I've heard good reviews of Shapr3d on iPad. Can't vouch for it myself, but I've heard positive sentiment.


None. Trust me, you need a mouse.


I tend to agree, but I feel like there should be a way to come up with a good Pencil interface.


Also very interested in this!


While certainly a CPU is working, I don’t classify any of that as computing. To me, computing is any app where your output is files for later use. So, for instance, Apple Notes is borderline computing, and virtually anything using a browser isn’t, with the exception of running your own files.


That seems like a highly reductive definition, because then a filesystem is a requirement for computation. A lot of early computing worked only in memory, and results were printed out on tape or via TTY. Moreover, it seems like your definition of computing only makes sense as input into another computer program.

Regardless many of the examples I provided do produce files. When I do finances I create spreadsheets, when I share shopping lists I'm making formatted text documents.


Hah, I'm also "accustomed" like you and always prefer a proper monitor/keyboard first, then a laptop, then lastly a phone even for simple tasks. Obviously things are different if I'm on the street, but if I'm in front of a computer I'd rather use that to check something rather than pull out my phone.

It's partially an eyestrain issue (I find the text too small) as well as the fact that the phone shows much less information than the computer screen. For instance on a mobile web page I have to scroll a lot, whereas on the computer I can just take in the text and images in a single glance. The keyboard and mouse also seem much more intuitive to me. So I guess I find the phone more cumbersome while many people I know think the other way :D


>whereas on the computer I can just take in the text and images in a single glance.

Web2 and Web3 websites do everything possible to waste as much screen real estate as possible with as little content density as possible.


This kind of speculative fiction doesn't work. Usually the template is described as: take some part of the world and tweak it and then press play and see what the world looks like now.

Instead this is just "invert some part of culture", which makes no sense. Culture is the emergent, externalized collective intelligence of social creatures. You can't invent a culture which would preserve this property without asserting a global belief in the community isomorphic to the alteration. You haven't altered the culture. You've just asserted a group of unthinking zombies.

So when I read:

> Imagine a world where computers are inherently old. Whatever you do with them is automatically seen as practice of an ancient and unchanging tradition. Even though new discoveries do happen, they cannot dispel the aura of oldness.

I can't conceive of such a world. It's incoherent. I also can't imagine a world where no one likes metals for whatever reason. You can't just assert a fact like that and press play. It's nonsensical.


It's not just speculative. It's a retrospective on the early mainframe era.

The nitty gritty punch card programming was something stereotypically done by middle aged/older women. Many early computers were massive and housed in what could be considered temples in big universities surrounded by gothic architecture. They were expensive to replace and kept around for long periods of time because of how difficult it was to upgrade (some large mainframe systems still exist today for that reason). Mechanical computers like the Babbage difference engine would last a lifetime. The idea of an average Joe spending all day using expensive compute time was insane. Those who focused on them did so for academic reasons, and spent most of their time meditating on computation, math, and the structure of language and meaning.


It brings in those aesthetics, but then gives individuals as much computer time as they want. Then zero people make spreadsheets.


Imagine a world where you can just assert any fact as the starting point of speculative fiction. Then imagine that someone would write a piece where computers were inherently old.


This breaks immersion for a lot of people, me included. It's one of the biggest issues I have with lots of sci-fi novels (more so than most other types of speculative fiction). Just asserting a fact this fundamental and building your world around it only works if societies, culture, technologies, and history are built around it. Humanity's history with tools and automation is millennia old. To alter this would require a lot of changes. We've been automating, forgetting, copying, and maintaining technologies for as long as our history.

Alien technology encounter stories satisfy some of this for me, as the injection of a foreign, advanced technology makes sense in this framework, much like how ancient peoples would discover technologies built by neighbors or rivals they didn't communicate regularly with.


To some degree I agree with your last point. Before the idea of aliens, writers like Plato and Thomas More would posit some undiscovered island where they could imagine a culture unlike their own existing, which, like with aliens, gave plausibility as to why that culture would have a different history and traditions than their own. Of course they, like many speculative writers today, were really writing about how they wished their own society worked.


Hmmm. Good observations. I hope this doesn't break my interest in the Warhammer universe, lol, which I believe probably suffers from a lacking "origin story"


Expert prompt crafting!


A sci-fi magazine has cut off submissions after a flood of AI-generated stories - https://www.npr.org/2023/02/24/1159286436/ai-chatbot-chatgpt...

> The science fiction and fantasy magazine Clarkesworld has been forced to stop accepting any new submissions from writers after it was bombarded with what it says were AI-generated stories.

> The magazine officially shut off submissions on February 20 after a surge in stories that publisher and editor-in-chief Neil Clarke says were clearly machine-written.


I got some pretty interesting sci-fi type results for that in Stable Diffusion. It helped to negative-prompt titles, text, words, etc


I'm so mad you made this comment before me.


It's like asking "would you believe in homeopathy if doctors all told you that it was correct?"

To which the answer is "Doctors believe things for reasons. You can't just flip a switch and change what doctors think without changing the entire world so that homeopathy is actually true, and that would be such a weird world that science in it would be unrecognizable."

(And since I'm not ChatGPT, I can answer your entire question: "Such a story would have an incoherent world. I could imagine someone writing a story that takes place in an incoherent world, but you wouldn't be able to get useful insights from it.")


Step 1: for some reason, the world stops making computers, but keeps the ones that already exist

Step 2: 100 years pass

Step 3: computers are now an old (and dying) tradition, kept going only by a small group of people who understand how to care for the machines.

Was that so hard?


Hell, the Living Computer Museum in Seattle, before its (at least physical) closure by Paul Allen's sister, was a bit like this. They have a bunch of vintage big iron that they have restored to working condition and allow members of the public to use. Some of these are still available on the internet.

https://www.livingcomputers.org/

https://www.livingcomputers.org/Computer-Collection/Online-S...

Hell, deep in the chip shortage I talked with more than one person who was buying up electronics at thrift shops and on eBay because they knew it contained chips they could not source at the time.


Step 1 is really hard if computers are useful (why do you stop making something that's useful?), but step 3 is simply not believable: a society that has machines that could, at the minimum, help solve complex calculations and engineering problems much better than humans decides to completely ignore them. That's not how societies work.


1 is easy to imagine - a large enough collapse that making new ones is very very hard, especially compared to scrounging and repurposing the huge amount of existing ones. Imagine a huge population decline, and search parties looking through the ruins for useful tech, like computers, decades later.

Such a society could probably make new computers, but making new, much worse-specced computers may not be cost-effective for a long time, as long as old computers can be found and repaired.

Edit: such a movie or novel would make for great Fallout-ish world-building. Imagine for instance a large hotel in a big city running the booking system on an old laptop in a central secure location, with new 8-bit computers used as terminals at the front desk. Public-access 8-bit computer terminals at public libraries connected to copies of Wikipedia running on some old gaming rig, and such things. :-)


No, it isn't (no, 1 is not easy to imagine). Computers are only useful because we have a stable society that we can build on. Post-apocalyptic societies have zero practical use for computers.


Post-apocalyptic doesn't necessarily mean unstable. We weren't post-apocalyptic back in the 1930s or 1950s, and computing machines had plenty of uses already (and even earlier).


> Post apocalyptic societies have zero practical use for computers.

Post global nuclear war reconstruction would really benefit from having even a simple spreadsheet program to aid with resource allocation, wouldn’t it?


> Whatever you do with them [computers] is automatically seen as practice of an ancient and unchanging tradition.

During a nuclear war reconstruction scenario, computing wouldn't be seen as ancient or unchanging. We'd be in a transitory period where we try and recreate computing infrastructure with what we have and the power sources we have access to. But we wouldn't develop an ancient tradition or anything.


A lot can be done with pen and paper, but if you were rebuilding civilisation from scratch I'd be happy to have a solar-powered computer holding a copy of Wikipedia and heaps of documentation to help me put to use any artifacts I'd find from the old world.


Step 1 and 3 are, arguably, quite probable to happen in the future. I don't think you're appreciating just how many circular dependencies we have in the global supply chains. We're one big cataclysm or global war away from reverting to pre-industrial times, and if that were to happen, we'd likely be stuck there for quite a long time. Consider the following observations:

A. For obvious reasons, we continuously improve tooling and processes. This creates circular dependencies that make our civilization non-restartable.

As a toy example, imagine we made tool A, which let us create tool B, which let us, among other things, upgrade A to A'. A' can still produce B, but can also produce upgraded B'. B' obviously lets us make A'', but also C', and A'' + C' let us make B''. Rinse, repeat for a hundred years. The result is that the whole industry relies on tools A'''''''''' and B''''''''''. The hobbyists still play with A and B, maybe even A' and B', but there's nothing between A' and A'''''''''' in actual use or production, nor anyone who remembers their design, because they've all been made obsolete long ago. If the calamity hits, and we can no longer make new A'''''''''' and B'''''''''', we don't get to downgrade a step or two - we go back all the way to A' and B'. And all the other tools and technologies that also relied on our advanced capacity - they all stop working and go "back to beginning" in lock step.

B. Making and operating tools involves ever increasing energy use.

C. We've used up all easily-available high-density energy sources long ago. Both our renewable tech and our fossil fuel mining and combustion processes are A''''''''''-level tech.

With that in mind, it's not that big of a stretch to write a plausible backstory for a culture as described in the article.


I can't decide if I would enjoy reading real examples of these cycles. Just knowing how precarious the supply of natural rubber is, and how inadequate artificial solutions are (including prospective ones from Continental), makes me uneasy, but I'll be damned if there isn't some kind of allure there as well.

It seems almost certain that climate change is going to severely disrupt several of these cycles in a decade or two. A slightly different failure mode from what you allude to in C is that our tech and processes become capped before we adequately scale up renewables, overshoot kicks in, and it's actually a negative feedback loop that certainly doesn't just stop with computers.


Even if we're ignoring wind and water power, boosted by charcoal, computers would be quite useful for lots of business purposes.

And if things are so low tech, the long-distance communication would be far more impressive than the story suggests!


1 happens every fucking day. Hell, wasn't there an article today about people who have older equipment that depends on floppy disks and how hard it is to find reliable disks these days? The article discussed an older embroidery machine; the only way to supply it with patterns to embroider was a 3.5-inch floppy, and his disks are starting to fail. But this is also true of older samplers, drum machines, synthesizers, etc.

Sometimes someone manages to spin up new manufacturing of retro components or things to replace the retro components (there are a few PCBs running around that can replace floppy drives conforming to certain interfaces with USB sticks or SD cards or even a network): vacuum tubes, parts for vintage cars, parts for vintage watches, vinyl records, etc. But often, before that happens, the entire known supply needs to be scavenged.

So I can imagine a world where we manage to start doing the things we are doing with some new technology that is not digital silicon based computers that run on electricity. But there will be edges where the new technology is not compatible with some system, and in that edge lives this kind of speculation.


We stop making specific models, and lose how to make spare parts for those models. We don't lose how to make the machines in general.

This story would be completely different if the old silicon computers were replaced with neo-computers.

Also for anything that uses a standard floppy drive there are cheap and plentiful adapters to use flash storage instead of disks.


I have yet to see a thing I can insert into a floppy drive that will pretend to be a disk. I have seen plenty of things that can replace the drive itself with something that pretends to be a floppy drive.

The problem with those is that not all equipment that uses floppy drives conforms to standards. If you have a 50-year-old embroidery machine from a company that no longer even exists, you'd need someone pretty fucking technical to pull that shit apart and figure out if any of those drive replacements will even work in your weirdass machine. If you are lucky, your machine used some off-the-shelf drive from a reputable company (I've seen some old weird drum machines that had Apple Disk ][ drives hanging off them), but otherwise...


I can imagine it, but it would require a lot of worldbuilding to explain.

I reckon the primary means of computing for the lay person becomes interacting with prompts to make the computer do what you want, but the actual knowledge of software engineering and computer science is gone. Even the prompts, due to linguistic evolution, are in a language that regular people don't understand, and people's mastery of the language is akin to moderns trying to reconstruct something like Akkadian. We sort of know what specific words mean and the general grammar, but we simply don't know 90% of the vocabulary to express most ideas. Whatever language model there is was trained on data from an archaic language, and is also full of weird idioms and phrases and misspellings that we can't reconstruct in an organized way.

In this context, the computers might be able to solve some basic problems for us, but everything we do with them is basically just received incantations from other crusty old people. Innovations keep happening, but they're small scale and mostly revolve around finding out new ways to tweak the incantations to do novel things. But the most we can do is say, like, "draw a picture" or "sort this list."


Unless they have other means of solving those calculations better than computers; in that case, call those other means "computers" and the study of older computers "retro-computing".


Something like the Amish, relying on 100-year-old technology developed by humans instead of those newfangled tools that aliens brought that no human can understand.


As I was reading the post I assumed (1) happened because something better came and completely replaced computers: magic, raw energy manipulation, biomachines, hextech... something like that.


The issue is with Step 1 - there is some reason why the world chooses to stop using computers despite them being available (as far as I can see in the 'thought experiment') and despite the homo sapiens needs/desires for which we use(d) computers seemingly being the same. That reason isn't provided, isn't even implied or hinted at; however, it's something major with far-reaching consequences for human behavior.

So it's kind of frustrating - in effect, the author describes a thought experiment of how society would change if X happened, but without saying anything at all about that X other than asserting that one of the things X causes is a mysterious lack of interest in using computers to fill needs/desires which (at least according to the original article) aren't filled by some better alternative.


That's just 40k with extra steps


I'd say it's 40k without a whole lot of extra steps.


You're basically missing the whole point of speculative fiction in this style, which is "we start with imagining a world which has these conclusions. Trying to imagine it forces us to try to make sense of what could lead to those conclusions."

You assert it's incoherent. That's part of the point: it's incoherent with your current understanding of our current world. There might be other circumstances that lead to such a world without being incoherent. If the conclusions in a piece of speculative fiction are appealing, it may be worth thinking of what it takes to reach those conclusions.

The thought exercise of trying to make it coherent is part of the point.


No, the point is it can't be done. It's like cleaning your house with a 100-year-old tortoise: you absolutely would do that, if tortoises were plentiful and did a good job cleaning. Why would anyone clean their houses themselves, when there were all these old tortoises willing and able to do the job for them? Now if the tortoise moves all your crap around so you can't find it, leaves streaks on your mirrors and windows, fire his scaly butt!


The parent is describing an inductive approach to worldbuilding, while you are describing a deductive approach. Two sides of the same coin.


Good speculative fiction at least tries to give some reason why something might come to be the case. This article just says "pretend computers are old; old things are like X today; therefore, computers are going to be just like X," while ignoring all the things that are different.


>I can't conceive of such a world.

One word: Terminals.

Computers are inherently old at this point, the technology harkens back at least the better part of a century.

The old guard in particular cannot and will not let go of CLIs and esoteric flags and arguments, and will berate anyone who would dare suggest anything concerning a GUI.

To them, the only human interfaces that exist are the keyboard and the monochrome monitor, computing is still grey-on-black (or green/orange-on-black) monospace text and running emacs or vim. The rest of the world disregards them as living relics of ancient history.


Imagine a world where everybody has an AI assistant in their pocket that simply tells them anything they might need to know. Designs programs for them. Draws them any image. Perhaps even constructs virtual worlds. We are almost there already.

Only weird old people would bother to learn how to program dumb desktop computers.


There are a lot of people who feel disquieted by the pace of software systems and aspire to do work in software that feels similar to the more methodical, slower-paced work of more mature fields like HVAC engineering or structural engineering. These folks find solace in this kind of speculative fiction.

I feel their disquiet is misplaced. Any young, fast-moving field is going to be full of the same issues that software is in now. You can look back at the history of mechanical engineering during the industrial revolution to see many similar problems we had with unsafe projects, hyped-up snake oil, or iteration for the sake of iteration. The history of automobiles and aviation was also marked by similar issues. Slower paced engineering fields are more mature and have gone through decades or centuries of iteration before coming up with tried and true solutions. But fundamentally fiction speaks to the soul more than it speaks to any measurable outcome. Truly the only fix for this disquiet is to search inside rather than look out. Fiction can be a great tool for that.


This is not unlike the speculative future of Anathem.


> Computers are seldom privately owned – they are considered essentially communal rather than personal

This struck me as true of most people's use of computers now. Almost everyone I know who isn't either an old fashioned developer or a gamer uses a mobile phone or tablet merely as a graphical terminal; the actual computer is some mysterious entity elsewhere on the Net.

The golden age of computing when 'everyone' had one of their own that was useable offline is now long past.


Likewise Gen Z and Alpha are not very technical. Many teens and 20 somethings have a hard time with seemingly basic tasks like finding files. As a 20 something myself, my public school did not offer any 'technical' courses beyond office programs and keyboarding.


In fairness, modern mobile devices seem to be determined to make how the filesystem works as opaque as possible.


Files and folders are based on a metaphor applying to paper-based organizational systems. People used to have typewriter printouts that they used to mark up by pen and place into folders, then file away into filing cabinets. Is there really a need for a filesystem other than in the vestiges of what we consider an operating system?


This.

The "files and folders" metaphor never worked even in the heyday of computer literacy, most people never understood it and just dumped all their crap on the desktop.

Seeing as the metaphor did not, does not, and will not work it is only reasonable that personal computing is slowly moving away from it and towards something the users actually want: One singular place to store everything (and I mean everything) and a one-stop quick way to find something in there, organization be damned.

Email is perhaps one of the earliest manifestations of this user desire: Everything goes in the one singular Inbox, and you search for whatever mail you need.


> a one-stop quick way to find something in there, organization be damned.

I understand the desire, and clever search indexing allegedly solves this, but it's fundamentally an impossible tradeoff. Organization is essential to finding things.

I'm tired of people pretending search and AI can solve that problem. It can help, but it can't tell you what x or y thing you want is. I find myself typing nonsense like "what's the thing that's important in speaking" if I watched a video about how to give a good talk and am wanting to watch a part of it again. I haven't given it a title relative to how I think about it, I haven't thought about how to categorize it, and I haven't let a "place" for it settle in my head, because I didn't bookmark it or add it to a playlist.


I prefer being able to find stuff.

Tags or folders, I don't care. But I need some kind of organization. A giant pile with a crappy search box that hardly works does not cut it.


>Tags or folders, I don't care.

My point exactly. The people who cry that the young today can't file-and-folder fail to understand that bloody nobody, neither young nor old, cares for file-and-folder.


The problem is that mobile devices lack competent tagging on most things and don't let you organize with folders either.

And folders are a very simple concept. If you can't handle them on a system that uses them, that's bad.


>And folders are a very simple concept. If you can't handle them on a system that uses them, that's bad.

Simple to us. I've seen accountants, aka people who deal with files and folders as a matter of their god damn line of work, fail to understand or care for files and folders in computers. They still put everything on the desktop.

Fact of the matter is people do not care for files-and-folders. It's high time we admit most people do not want it, do not care for it, and just want to store everything in one place with an easy one-stop way to find what they want. Most people don't care about file systems, and both iOS and Android are right for abstracting away something nobody cares about.


Putting things on the desktop or not caring for folders is different from not understanding the concept.

> just want to store everything in one place with an easy one-stop way to find what they want

That's a pipe dream. If you want good search results, then you need to put in organizational effort. Whether or not you're using folders.


>Putting things on the desktop or not caring for folders is different from not understanding the concept.

Yes, but whether they understand or not they still clearly don't care.

>That's a pipe dream. If you want good search results, then you need to put in organizational effort.

The thing is people will not put in effort, whether out of neglect or ignorance. "It's in the 'puter." is as far as most people will care, and operating environments that respect this attitude end up having the widest mainstream appeal and adoption.

It's like manual and automatic transmission in cars. Most people drive their cars to get from A to B, they couldn't give a flying banana how it works. The enthusiasts arguing over stick shift are martians, the people just want a bloody car to get them where they want to be and car manufacturers will and should oblige the people.


Automatic transmissions actually do a good and thorough job. File search, if the user hasn't done any sorting at all, will fail a lot. That's why it's a pipe dream. People can always refuse to sort but they won't get the experience they want.


I like file and folder. I like tags. I need something.


Yeah, and that something for most people is the search box.

For another example of this "I refuse to file-and-folder" phenomenon, try to count how many people go to Amazon by going to the address bar, typing www.amazon.com, and hitting enter instead of going to Google, searching Amazon, and clicking the first listing.

Fact of the matter is people don't fucking care for file-and-folder, people don't fucking care for locators. For most people, a given something is "in the computer" and that is as far as they can be bothered to care. Abstracting away file systems and similar constructs to improve end-user usability is the correct decision, given this situation.

Yes, the small handful of us that understand and use file-and-folder and Uniform Resource Locators hate the abstraction, but we are Nobody. They (most people) are Everyone.

The Photos app dumping all photos into one singular location with no hints as to where it's located in the file system is better usability for everyone. That's the brutal-to-nerds reality.


Each generation consists of roughly 90% of people who are completely clueless about anything computer related. Those of us that work in the field usually can't even conceive of it but it's true.


Indeed, you could say the same about cars and plumbing too (with some variation in %). People tend to think younger people are automatically 'good at computers' though.


I wouldn't say it's "long past", but rather in its last gasps outside of niche hobbyist communities. There are plenty of people whose primary computer use is still local, they're just considered old-fashioned


> Solid-state computer components, on the other hand, have no mechanical decay, so they are practically eternal.

I wish.

Electrolytic capacitors seem to have a functional lifespan of about 10-20 years, transistors do wear out, electromigration is a problem in chips in the "hundreds of years" span discussed, and there are all sorts of other interesting failure modes of "solid state" electronics that mean they're not going to last hundreds of years without some pretty massive heroics - and that's when you can repair it at all.


My Schneider 286 @ 12 MHz from 1990, with its huge 40 MB HDD, is still running fine, all original components.


How much of that time has it been running?

I would wager that it's not been running powered on for 33 years without any repairs. If it has, probably long past due for a re-cap job, and you probably don't want to put a scope on the voltage rails to see what the ripple is.

For a broad handwave, "sitting powered off" isn't too bad for solid state equipment (it does bad things for hard drives, see "stiction"), but operating (and operating at temperature) is where the wear occurs from a range of effects. I've reworked [0] a Core 2 Duo board that stopped booting after a decade or so, because the capacitors filtering power for the IDE controller got so bad it wouldn't boot reliably (it would load the kernel off the drive, and then insist the drive wasn't present later).

[0]: https://www.sevarg.net/2018/04/15/on-art-of-repair-re-capaci...


Yes, it ran 24/7 for about 4 years when I was a kid; nowadays I rarely turn it on.


Older hardware tends to be more resilient due to wider traces, which means lower susceptibility to ESD and electromigration. But eventually the last atom will get eroded out of a critical trace and the thing will fail. Nothing lasts forever, especially when made to be as cheap as possible.

https://en.wikipedia.org/wiki/Electromigration


Well, all of the components are less sensitive, not just traces. Also, a lot of the old machines weren’t designed to be cheap.


I recently had to replace an SSD that was only 3 years old; it's dead-dead, as in, slowed down one day and then wouldn't boot. I've never had an HDD die this fast in my life. I know this is only one example but I'm curious if anyone else has a similar experience.


Had a brand new Seagate 2.5" hard drive fail after ~60 days of laptop use, back in 2010. I was a reasonably heavy desktop user, but scarcely a stress-tester, and I didn't have any memorable drop incident or accident that'd explain the failure, either (I was pretty careful with my machines). Retailer or manufacturer, I can't recall, replaced the drive (with a refurb) and I was able to recover some of the most important new data off the old drive as it failed, but it left me pretty unhappy.

Still have a few mis-named files deep in my personal directories, 2 PCs and 3 Mac migrations later.


> it's dead-dead, as in, slowed down one day and then wouldn't boot

Did you double check if the files could be read by another system? A drive that goes read-only will also fail to boot, I think.


Here's a pretty good document about HDD life expectancy. Depending on the model HDDs can have a fail rate of up to 12% in 3 years. I'm unsure how many drives you've used, or how much of a workload you put them through, but I wouldn't assume SSDs are significantly more reliable than HDDs based on one failed drive.

[0] https://www.backblaze.com/blog/hard-drive-life-expectancy/


SSDs are a consumable, so depending on how much writing you do per day to it you can run it down pretty quickly.

The vast majority of people never write enough for this to become a problem, though.
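
For a rough sense of scale (a back-of-the-envelope sketch with hypothetical numbers, not any particular drive's rating):

    # Hypothetical 1 TB consumer SSD rated for 600 TB written (TBW),
    # with a fairly heavy 30 GB of writes per day.
    tbw_rating_tb = 600
    daily_writes_gb = 30
    years = tbw_rating_tb * 1000 / daily_writes_gb / 365
    print(round(years, 1))  # ~54.8 years before the write endurance rating is exhausted

Even at several times that write load, the rated endurance would outlast a typical upgrade cycle.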


Have a virtual memory pagefile on it?


[flagged]


Let me Google that for you:

"All electrolytic capacitors with non-solid electrolyte age over time, due to evaporation of the electrolyte. The capacitance usually decreases and the ESR usually increases. The normal lifespan of a non-solid electrolytic capacitor of consumer quality, typically rated at 2000 h/85 °C and operating at 40 °C, is roughly 6 years. It can be more than 10 years for a 1000 h/105 °C capacitor operating at 40 °C. Electrolytic capacitors that operate at a lower temperature can have a considerably longer lifespan."

( https://en.m.wikipedia.org/wiki/Capacitor_plague )
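Those figures follow the usual rule of thumb that electrolytic lifetime roughly doubles for every 10 °C below the rated temperature (a back-of-the-envelope sketch, not taken from either article):

    # Rule of thumb: rated hours double for every 10 C below the rated temperature.
    def estimated_life_hours(rated_hours, rated_temp_c, operating_temp_c):
        return rated_hours * 2 ** ((rated_temp_c - operating_temp_c) / 10)

    hours = estimated_life_hours(2000, 85, 40)  # 2000 h / 85 C consumer part at 40 C
    print(round(hours / (24 * 365), 1))         # ~5.2 years of continuous operation
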

"Electromigration decreases the reliability of integrated circuits (ICs). It can cause the eventual loss of connections or failure of a circuit. Since reliability is critically important ... the reliability of chips (ICs) is a major focus of research efforts. ... With increasing miniaturization, the probability of failure due to electromigration increases ... In modern consumer electronic devices, ICs rarely fail due to electromigration effects. ... Nevertheless, there have been documented cases of product failures due to electromigration."

( https://en.m.wikipedia.org/wiki/Electromigration )

Before dismissively replying, maybe do your research.


The electrolyte doesn't evaporate.

The "capacitor plague" thing was because in the early 2000s every manufacturer started buying the same kind of cheapest electrolytics they could find that simply did not meet spec.

If your electrolytic caps are failing after six years, you're running them at too high a voltage.


From an actual research paper, not just Wikipedia:

"The variation of electrolytic capacitor electrical properties due to ageing can be attributed to two key degradation mechanisms: 1. The evaporation of electrolyte [3, 4, 5] 2. The electrolyte reacting with the insulation material within the electrolytic capacitor [6] A. Evaporation of electrolyte Over time the liquid electrolyte will evaporate, and therefore reduce the amount of available electrolyte within the capacitor."

https://eprints.whiterose.ac.uk/104114/1/IECONv2a.pdf

Yes, the capacitor plague was a particular incident not a general thing; yes, six years seems slightly ridiculous; yes, you can make them last significantly longer; but no, capacitors are not immortal - and you can't make them be immortal by just repeating "no that doesn't happen".


Electrolytic capacitors can dry out, boil out, oxidize, generally break down. There's many failure modes for them that can be caused by wear, age, or unuse.


Only if they're used grossly outside their rated limits.

Whether or not the rated limits printed on the can match what they're actually capable of is another matter, as people found out for a couple of years in the early 2000s.


Here's another study for you, which shows a decrease in capacitance and increase in ESR (equivalent series resistance) of electrolytics operated in a DC-DC converter circuit at room temperature over extended periods of time. This shows that capacitors degrade even when used within their rated limits.

https://www.researchgate.net/publication/228874152_Experimen...


Unuse also can damage capacitors through oxidation. I've never seen a datasheet that said "make sure to stretch out your capacitor's legs at least once a year."

Manufacturing errors can make capacitors die young, but even the most upstanding member of a capacitor tribe will one day kick the bucket through no fault of anyone's.


Can you explain what you mean?

In 100,000,000 years I can almost surely guarantee that your hard drive will be nonexistent. Surely there is a point between “fresh off the assembly line” and “disintegration” where the drive can be considered to be “worn out.”


> Nope.

> Nope.

> None of these things "wear out".

Would you perhaps like to present any evidence or even argument whatsoever?


This is a story, not a thought experiment.

To be an experiment there must be some sort of question to explore, after specifying some sort of condition.

This is a set of assertions from beginning to end.


And half the assertions are at odds with what computers are and can do rather than just our culture around them.


All the comments here are so negative for such a beautiful essay!

This paragraph at the end in particular really struck a chord with me:

> In the real world, people associate computers with many different things: corporate dehumanization, overwhelming consumer capitalism, alienation from the material world, shortened attention spans, ridiculously short obsolescence cycles, etc. etc. It is often difficult to tell these cultural biases apart from the "essence" of computing, and it is even more difficult to envision alternatives due to the lack of diversity. Thought experiments like this may be helpful for widening the perspective.

Computing is a fundamental part of the world and crazy exciting to play around with. It allows us to experience whole new spectrums of reality! At its core, isn't this the essence of the hacker ethos?


Strongly agree!

This post so wonderfully blends a hypothetical with where we really are, what computing has in fact become. Computing is in fact vanishing, hidden inside massive data centers (temples) & behind firewalls. Applications offer packaged consumer experiences, but actual computing recedes, gets further off, & few people experience it. Real connection to computing is slow & takes concentration & will to develop; it's not fast.

This project to invert common conceptions actually reveals a lot of truth.

The final summation is great.

> So, what is the "essence" of computing then? I'd say universality. The universality of computing makes it possible to bend it to reflect and amplify just about any kind of ideology or cultural construct.


I’m with you. I really liked the essay, and I also particularly like speculative fiction.


They are just describing how computing works in an academic system if you’ve run out of HPC cluster credits and haven’t gotten a grant for a new system recently.


It's a good story but it kind of ignores something fundamental. Most early writing systems were created for accounting, like Bob gets 5 wheat, Doug gets 5 sheep, etc.

In a world where computers are some ancient artifact, they would still have significant value as calculators and accounting tools. It's likely some scribe class would emerge around using them to manage the day-to-day of whatever social hierarchy is present.

Visicalc will never die!


I would just be happy if the circuit board in my furnace would last longer than 5 years and wouldn't cost 10x more to replace than it probably should.


> It is commonly thought to be futile to even try to make youngsters interested in computers – they simply don't yet have the required patience or concentration.

You say that, but I find it far easier to explain how to solve computer problems over the phone to older people than younger people.


That's more or less how programming is today, as opposed to watching TikTok: Gen Z doesn't have the attention span to bother, and corporate veterans waste 90% of their time debating the abstract merits of design concepts rather than getting things done.


As someone who primarily works with computers in industrial machinery, this doesn't seem all that fanciful.

What breaks the metaphor for me however is the 'computers have no practical use in society' line. Medicine, modern precision manufacturing and transportation have radically changed the quality of life for billions. Without the development of global timekeeping, communications and computing (mainframes at least) life would be much nastier and shorter for many.

Does the inversion also come with all of that baggage?


Make everyone a long-range telepath.

You eliminate most of the mundane uses of computers (such as WhatsApp, buying stuff and getting news) since you can do these things almost immediately, person to person.

To use computers, you need to learn to read (most people can't), which is esoteric by itself. Computers are used to store knowledge too fragile to be transferred via pollination, and for computations. Obviously, you need a lot of concentration to use a computer in such a world, and will be considered loony.


Computers are, essentially, fast and iterative, extremely useful for automation and communication tasks, and they naturally attract young, detail-focused minds because of these qualities. Those essential qualities lend themselves to computer culture. Inherent qualities lead to cultural connotations, not the other way around.


I'm not sure you can say that their attraction of youth is actually "natural" - as other subthreads here have suggested, Gen Z is considerably less interested in computing than my (millennial) generation was. When I was a teenager I was torrenting 100% of my media, tweaking and installing VSTs in Ableton's Max for Live to make music, hacking my school district's crappy computer security system to get onto a proxy and play video games in the library, and then figuring out how to code in PowerShell so I could bot on those games... the list goes on. My friends and I thought we were Epic Trolls and hackers - I think that image has been relegated to the "cringe" bin of history, the same way that people look at pictures of themselves wearing leather jackets with teased hair from the 80's and go - Look! Look how ridiculous I was back then! Times have changed.

I think for a certain segment of the population born between 1980 and 1995, mostly male gamer/geek type personalities (and believe me, I do cringe at the person I believed I was in 2006 when I was doing all of this stuff), computers were and remain attractive because of the stereotypical associations with the Hacker/slacker archetype - the cultural milieu consistent with The Matrix, online video games, shows like the I.T. crowd, musicians like Aphex Twin and Radiohead, fantasy and sci-fi ranging from the more primitive Dwarf Fortress and Lord of The Rings to the fast-paced cyberpunk of Counter Strike and Snow Crash.

I'm not particularly "hip" but a lot of the mystique and glory of the Hacker seems to have worn off with kids these days - gaming and mobile phones are ubiquitous and easy to use, where they used to only be for the "hardcore nerds". Tech startups full of people like me when I was 18 have taken over the world (unfortunately, I was not one of them) and they seem to have done their fair share of harm.

I know that computer science remains popular as a career avenue to make money for young people, but I'm not sure that they're particularly passionate about it in the way that some people my age and older are. When I was into it as a self-styled "outcast" neckbeard teenager, I was into it because of the cultural connotations and the associations marketed to me as a young man in the "hacker" heyday. Imagine telling me or any young "geek" type guy in 2005 that when I was 35 that I would be a systems engineer working with ex-frat bro type guys on the hardest problems I have ever approached - I probably would have scoffed and told you that there's no way such a small mind could ever approach the mathematics and algorithms needed to REALLY be an engineer (again, cringe, but true). Yet here I am, and it's great.

Suffice it to say that I don't think there's anything uniquely young or male about computing or computers, and that the culture around them is completely free to morph and shift as time goes on.


Between hardware and underlying software becoming more capable and reliable, software removing more and more avenues for tinkering with a computer's innards, passion getting replaced with profit, and straight up computing becoming "stuff my mom and dad does", the young of today are rightfully finding greener pastures for inspiration.

I know in my day I had a blast just messing with Windows 95 and Microsoft Office. Yes, really. Just messing in them was as fun, or even more fun than, playing video games. Microsoft Access was my favorite "game". The sheer potential that even my kid brain could see was just mindblowing.

Now? It's not. No more are developers writing for passion, most are writing for profit. No more are we allowed to tinker; to be clear the likes of Windows and Linux still let us, but iOS and Android refuse tinkerers. No more is computing "fashionable" to young minds, it's the stuff old people (read: us) do and that is so not young, come on man!

Times are changing, and we're simply no longer in the front row seats.


> It is commonly thought to be futile to even try to make youngsters interested in computers – they simply don't yet have the required patience or concentration. There's no addictivity or instant gratification, nothing flashy or punk that fascinates the young mind. The appreciation and understanding of computers is something that develops slowly over years, often via gateway interests such as [...] pure mathematics. The kind of people who want to settle in a monastery and dedicate their lives to science or art may also develop an interest in computing.

This part is already true about the study of theoretical computer science today. People view algorithm design, big-O, proofs, recursion, automata, Turing machines, etc. as a chore.


> Solid-state computer components, on the other hand, have no mechanical decay, so they are practically eternal.

This is unfortunately a wrong assumption. Electromigration will kill everything with time. Most chips are designed for a finite lifespan, not in the 100s or 1000s of years, but single digits. Sure, they last a lot longer, but not forever.

However, the article is still a lovely read!

A lot of it does not seem inverted at all.

> Programming is the most essential element of all computer use

The idea that it is a sacred place, where breathing slows down and thoughts become wholesome.

Indeed, it is the essence of computing.


I am not sure when it happened, but I discovered that every game I played over the last twenty years, reduced to its bare minimum functional components, was aligning a pointer on an X/Y axis and pressing the correct button within a time window.

The goal of computers, reduction or multiplication of work, is often lost because the output does not last longer than the work required to generate it.

Computers are tools. They are used as toys, or at best, substitution of physical activities. Thankfully I only print paper twice a year. Yet why at all? Clearly I'm trusted enough to enter into a contract with companies yet can't be trusted to complete trivial tasks like sending a package without paperwork.

I'm curious what would happen if we reserved computers for qualified owners and operators, like firearms or vehicles. I personally would love to operate one, but without certification or training, the ability to have a negative impact on the population or cause needless self-harm outweighs the enjoyment of <current popular social media app>

I started driving at ten. I was also a teen with unfiltered internet over dial-up, but that isn't the same as personalized, curated streaming video aligned to my habits or an exploitative corporate/State apparatus.

If we cannot determine when we're being manipulated, cannot agree on what truth is, nor speak against authority without being labeled an enemy, how can we adapt as a people to the growing power of computers in every pocket?


Sports are just a series of muscle contractions with timing; music is just different accelerations of the air; ... I think these statements, and yours concerning video games, are overly reductive.


Life is just a sequence of body movements to position into the correct shape for various contexts.


> aligning a pointer on an X/Y axis and pressing the correct button within a time window

Heroes of Might and Magic III is somewhat older than 20 years, but it is super different from that.


Yeah, I can think of a few games that this doesn't describe; Disco Elysium, for example, has nothing to do with "timing" as they have described it.


Replace "computers" with "metal" and it seems about as likely, and as interesting as far as thought experiments go.


Why not say "the essence of computing is computing"? It's not like "universality" is any more explicit or clear than "computing" itself.

What is the point of this kind of "finding the essence" exercise anyway? Definitely seems to be very subjective, to say the least.


Reminds me of Kurzweil's Age of Spiritual Machines


Ha, possible if all the lunatics who want us to be licensed as Professional Engineers in order to program have their way. We must fight them at every turn.


Reminds me of The Temples of Syrinx, by Rush


Actually, this is pretty much how things were during the mainframe days :)

Or so I hear...

punch cards, anyone? reserve some time on the system?


Always Coming Home


[flagged]


> Embrace utilitarism.

As long as everyone else embraces it, we seem to be doomed to a market economy, that (all its virtues notwithstanding) unfortunately makes:

> Design durable, low powered and hackable computers with easy protocols to exchange data over a flakey network

not possible. As long as other people also take a utilitarian approach, the only things we can get are "low powered" and a "flakey network". Durability, hackability and easy protocols are anathema to those who make money on technology at scale.



