As far as I know, the value produced by programmers per unit of time is extremely difficult, if not impossible, to measure. You could argue that a programmer in 2020 is not 10 times faster at writing 1990-style GUI software, and maybe you are right.

However, if you tried to write a modern web application like Basecamp using only methods and technologies from 1990, it would not work out at all. There would be no web frameworks, no web servers, no browsers, no Ruby, no Java, no Stack Overflow, no Google, no automated testing, no Git, no Stripe, no JavaScript, no CSS - no anything.

You would probably give up and go for something like a client-server application with Delphi and Oracle. And you'd probably work with a waterfall development style, create tons of features that nobody wants, and then ship the whole thing as a bunch of floppy disks. Who would pay for something like that? Even if you sold it at 1/10th of what people pay for Basecamp or Jira, no one would want it. It would be trash. There have simply been fundamental improvements in the capabilities of software, and in our abilities and methods of writing such software, to the point that the two are not even comparable - and yes, I would say that is at least an order of magnitude of improvement.



Having programmed since the 1980s, I'd say you're somewhat right.

Git is way better than pkZip and floppies full of dated zip files. It's not 10x better, though. Back then, a single developer wrote most things for a single customer.

We had text with attributes, colors, images, all of that under Windows 3.1 and later. When you deployed a program, you knew it was going to run in Windows, on 640x480 or larger... maybe as big as 1024x768 if the user had thousands of dollars for a nice graphics card and a big Sony Trinitron Multisync.

We had multi-user databases, like Access, that allowed all of the users of a system to share data across their organizations. Programs were shipped on disks, either floppies or CDs. They then worked forever.

The Waterfall programming model was meant as a joke. On a small scale, I did it once... the program took two months from start to finish... the customer was happy that we met the requirements, but then wanted more. We negotiated a deal and I spent the next year doing rapid prototyping (agile?) on site, with lots of user testing. That application was deployed, along with its hardware, in one site visit, and usually ran forever after that with only the occasional phone call or field trip for faulty hardware.

Things are NOT better today than they were. In Delphi, for example, every function had a working example included in the documentation. You didn't need to search Stack Overflow every ten minutes... it just worked. The fact that the deployment platform was known, and that you had control over the code all the way down, made it incredibly easy to ship to a customer and provide support over the phone.

Today, I can write applications in Lazarus/Free Pascal and ship them in a single zip file. The customer's screen looks exactly like mine does; there's no need to worry about dependencies, internet connectivity issues, etc. Recently I reached back into my archives from 1994 for a string function I wanted... and it worked.

Things are mostly just different today, not mostly better.


I simply don't believe that the combined improvements of the last 30 years - software, hardware and management together - add up to less than a 10x improvement.

Where is the data that shows that things are just different and not better? Things are different today, yes, and for a reason: people demand different software today for a reason, and the way software is made has changed for a reason.


What can you do today that you couldn't do in Windows for Workgroups? The hardware is massively better. I had a side project last year that I tried to do in Python, because I had used it a bit in the past, and figured things just had to be better, since it had been decades since the last time I wrote something big from scratch.

It was a horrible experience, except for Git replacing ZIP files to allow undo. WXBuilder only generates Python; it doesn't let you edit the results and go backward... a significantly less useful paradigm than Delphi on Windows, or Visual Basic 6, for that matter.

Eventually, after much frustration, I managed to write layers that completely decoupled the GUI from the actual working code, which was OK... but then I needed to change one of the controls from a list to a combo box... everything broke... two lost weekends of work before I got it working again. Any GUI change took 20+ minutes.
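
Roughly the shape of the decoupling I mean, as a simplified sketch rather than the actual project code (the class and function names here are made up):

    # tasks_core.py - the "working code": pure logic, no GUI imports at all.

    class TaskStore:
        """Holds the data; knows nothing about widgets."""
        def __init__(self):
            self._tasks = []

        def add(self, title):
            self._tasks.append(title)

        def titles(self):
            return list(self._tasks)


    # gui_adapter.py - the thin layer the generated GUI code talks to.
    # Only this layer should have to change when a widget changes
    # (e.g. list control -> combo box); the core above stays untouched.

    class TaskPresenter:
        def __init__(self, store, refresh_widget):
            # refresh_widget: whatever callable repopulates the actual control
            self._store = store
            self._refresh_widget = refresh_widget

        def on_add_clicked(self, title):
            self._store.add(title)
            self._refresh_widget(self._store.titles())


    if __name__ == "__main__":
        # Stand-in for a real widget: just print what would be displayed.
        presenter = TaskPresenter(TaskStore(), lambda items: print("display:", items))
        presenter.on_add_clicked("write report")

In theory, only the presenter layer should have needed touching when the list became a combo box; in practice, the regenerated GUI code broke the wiring anyway.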

Eventually I gave up, pulled out Lazarus/Free Pascal, and re-implemented everything in about a week of spare time.

After that, GUI changes took seconds, builds took seconds. It just works.

I greatly appreciate the power of the hardware, and persistent internet instead of 56k dialup... but the GUI tools have gone downhill.

I tried Visual C++, but it generates a forest of code and parameters that you really shouldn't have to deal with.

Maybe I haven't found the right set of tools, but as far as I'm concerned, programming was actually better in the 90s, except for the hardware and Git.


> What can you do today that you couldn't do in Windows for Workgroups?

1990 was two years too early for Windows for Workgroups and 11 years too early for Lazarus. Nevertheless:

- Being able to choose among dozens of memory-managed and open-source programming languages without having to think too much about performance or memory usage

- The ability to comfortably do in memory what previously could not even fit on a hard disk

- Doing anything with (compressed) video in realtime

- Deployment and distribution of software to a global audience of users, using a broad range of device types, screen sizes and processor architectures, all in fractions of a second

- Setting up servers and fully managed platforms, data stores and databases at the touch of a button

- Full text search in a global database containing all documentation for any available software, including millions of Q&A articles, in a fraction of a second

- The ability to use a variety of third-party online services for automated billing, monitoring, mailing, streaming, analytics, testing and machine learning

- ...

You could probably continue this list for a long time and find many more such improvements made in the past 30 years. If you could find just a dozen such improvements, each giving you just a 20 percent productivity advantage, the compounding would already put you in the neighborhood of a 10x improvement.
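
For what it's worth, the compounding is easy to check (purely illustrative numbers):

    # compound effect of n independent 20% improvements: 1.2 ** n
    for n in (12, 13):
        print(n, round(1.2 ** n, 1))   # 12 -> 8.9, 13 -> 10.7

So a dozen such gains gets you close to 9x, and a thirteenth pushes you past 10x.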

By the way, I think to make a fair comparison, we should not compare what is mainstream today with what was leading edge in 1990, but with what was mainstream in 1990. The difference between leading edge and mainstream is 10 years or more; so the question "What could you do in 1990" should be "What was typical in 1990".


We didn't have to set up servers all the time; they just ran, for years, without interruption. Some machines had uptimes measured in decades. We had worldwide software distribution before the internet: BBSs, shareware, etc. Usenet had every support channel in the universe. Email involved routing through ihnp4.

Many things are clearly better, but IDEs really didn't keep up.


Today, you don't have to set up servers all the time either. But you can, and that is a huge advantage.

You don't need to hire a special person to constantly optimize your database server and indexes. You don't need sharding, except in the most exotic use cases. You don't need to manage table ranges. You no longer need to manually set up and manage an HA cluster.

> Some machines had uptimes in decades.

What machine had uptime in decades (i.e. >= 20 years) in 1990? Did you have access to such a machine?


I agree that it is nice to be able to fire up a machine from a script. Back in the days of MS-DOS, it was entirely possible that your work system consisted of a few disks which contained the whole image of everything, and you never hit the hard drive. That's pretty close to a configuration-less system.

As for databases, they were small enough that they just worked. Database Administrators were a mainframe thing, not a PC thing.

I didn't have a huge network, only a handful of machines, but one of my Windows NT servers had a four-year uptime before Y2K testing messed things up.

A friend had a NetWare machine with 15 years of uptime... started in the 1990s.

Moore's law and the push to follow it have given us amazing increases in performance. The software that runs on this hardware isn't fit for purpose, as far as I'm concerned.

None of the current crop of operating systems is actually secure enough to handle direct internet connectivity. This is a new threat. Blaming the application and the programmer for vulnerabilities that should fall squarely on the operating system, for example, is a huge mistake.

It should be possible to have a machine connected to the internet that does real work, with an uptime measured in the economically useful life of the machine. The default permissive model of computing inherited from Unix isn't up to that task.

Virtualization and containers are a poor man's ersatz capability-based security. Such systems (also known as Multi-Level Security) are capable of keeping the kernel of the operating system from ever being compromised by applications or users. They have existed in niche applications since the 1970s.

For the end user, lucky enough to avoid viruses, things have vastly improved since the 1980s. The need to even use removable media, let alone load stacks of it spanning hours, is gone. Being limited to text, without sound or always-on internet, sucked.

But, in the days of floppy disks... you could buy shareware disks at your user group meetings, take the stuff home, and try it. You didn't have to worry about viruses, because you had write-protected copies of your OS, and you didn't experiment with your live copies of the data. Everything was transparent enough that a user could manage their risk, even though there was a non-zero chance of getting an infected floppy disk.

Gosh that's a lot of writing.



