I was using Linux before things like Slackware came about, when it was just a boot and a root floppy disk. We had DECstation 3100/5000 machines costing small fortunes, that couldn’t reliably write a CD. The small 386 in the corner running “that newfangled thing” was far better at this :)
In my lifetime, I’ve gone through:
- building (as in: soldering chips to a motherboard) my own computer at home at age 11*
- buying an 8-bit Atari and learning about Antic and the 6502
- eventually getting a “disk drive” which stored an entire 128k
- moving onto a 32-bit cpu with the Atari ST
- blowing my student budget for the term on a hard disk, 20MB
- finally getting connected, at the blazing speed of 2048 baud
- lather, upgrade, rinse, repeat
- to where I have a 1gbit internet connection, a 10gbit home network, 100TB of storage locally, and a server rack in the garage with more than 100 “cores” available.
Things have changed so much, so quickly, relatively speaking.
[This is the ‘*’ from above. I couldn’t get ‘edit’ to accept it as an update, it just kept on putting the original text back, so...]
My parents bought me a black-and-white TV and a computer kit (they couldn’t afford both the TV and the already-assembled version) for Xmas at age 11. The big gift here was the TV, as far as I was concerned, we only had the one downstairs before that, and I got one in my bedroom! I even convinced myself I could watch snooker on a black & white TV, even if you did have to tune it by turning a dial until the picture appeared :)
About a month later, they were getting at me to put the computer together, which was the main present in their eyes - the TV was second-hand. Grumbling, I did so, and got it working. Once I’d told them, the whole family wanted to see this new technological marvel, so I took it downstairs, plugged it into the main TV, and everyone gathered around.
I typed in what the manual had told me to do, to test things out
PRINT 2+2=4
To which it displayed “1”. And I turned round to the family expecting all the validation an 11 year old desired. My dad looked at me, looked at the screen, and just said “I knew it, you’ve buggered it” and walked out the room.
It took me a few days to convince him that “1” was the right answer. To this day, I think his mistrust in computers stems from that episode. He was a docker in a Northern city, and all he’d say for years afterwards was “you can’t trust those bloody things” in ... more colourful... language.
I remember we had an older Compaq tower server that had a Pentium 60MHz chip AND a SCSI card. It was the perfect device to burn a CD with, and even it messed up on occasion. I can't imagine doing them on a 386. It wasn't until the Pentium II days that we could reliably burn a CD and still use the computer for normal tasks without having a buffer underrun.
Oh it was pretty much a dedicated machine when it was burning a CD. This was a postgrad office, there were 4 of us (three called Simon...) and we had a Unix workstation each.
Generally the PC sat in the corner and wasn’t really used. It had a SCSI card too, and when it was burning CDs it was left alone.
I remember one day a colleague of mine burnt 50 CDs so he could give them out after a presentation; it was pretty damn reliable. I also used it as a mastering machine for a CD that I had professionally duplicated to sell, full of Atari ST shareware/public domain s/w.
These were very early days of Linux. I actually had already released the “Mint distribution kit” which let my beloved Atari ST work like the Unix machine I had at college, and this was before any sort of distribution for Linux (at the time, Slackware had yet to be released) was available. The MDK was quite popular, mainly amongst students I think, but of course paled into insignificance compared to what Linux/Slackware/all-the-rest would evolve into :)
I see your WWII-era parents got what I call the Alastair Memo, which stated that at least 50% of the boys born around 1955-1970 were required to be named Simon, Alastair, or Nigel.
That's nice, but it's hard to beat going from a typewriter as a gift in one's teens, to flying in metal cages, to going to the moon, to VR headsets.
I remember reading opinion polls from people who saw this rapid rate of progress in the 60s, 70s, and 80s, and they all assumed the 2000s would be this "flying cars everywhere" magical land.
I'm ridiculously busy right now, and I don't have the time - sorry. What I will do is point you to the resources I've been using:
http://opencircuitdesign.com is the primary resource, which gives you layout, design, proofing, and netlist generation.
I'm not aiming for anything even remotely state-of-the-art, I'm looking at a 180nm design and even that might be pushing the hobby funds. You can get "shuttle service" at various places to share a wafer, or you can use efabless (link on the magic page above) to do a lot of the work for you, at additional cost.
The guy who for years wrote and maintained Magic (the layout tool) now works for efabless, and he's a great guy - especially if you submit patches to him :)
It's a lot of hard work, you have to worry about all sorts of things you can take for granted in an FPGA (clock routing, i/o bonding and pad designs, oscillators for clock multiplication etc. etc. and yes etc. again). But there's not many people who can say they taught themselves how to make an ASIC :)
And yes, many kudos for teaching yourself how to make an ASIC, that is very cool :-)
I have a long way to go to get there.
I've been out of the open source silicon field for a while but want to get back in. My side project is building a small-scale factory for custom ASICs, rather than getting them made in another factory. It's a very interesting problem, and quite different from ASIC design issues since a lot of it is physics, and obviously there are many factors that are different on a small scale.
The goal is an open source silicon service to the extent of making it relatively affordable for others to iterate and reuse designs, in the hope of developing a thriving scene much like happened with open source software, GNU/Linux etc. But it is proving hard to find the time these days. And as you say, it's expensive, no matter how you go about it, even though affordability is a goal of the final service.
The Linux kernel has been the greatest example of a global network of programmers contributing in the open, and it has shown that open-source software under the GPL works both for the contributors and for companies relying on Linux in production environments. I see most of the FAANG companies (except Apple, obviously) have at least contributed in some way, which is interesting to see.
For server-side environments, or to some extent Android, I can see reasons for companies to contribute patches, but I'm not sure about the future of the several individual distros still floating around today, even if I rarely switch between macOS and Ubuntu these days.
Well, 'using' Linux in production is quite different from actually contributing to its development.
From the FAANMG group of companies: Facebook, Amazon, Netflix, Microsoft and Google all have employees signing off patches under their company emails and no-one should be surprised to see no contributions from Apple to the Linux kernel anyway.
> to see no contributions from Apple to the Linux kernel anyway
There is one person submitting patches to the linux kernel under an @apple.com address and a few more listed as having reported them. That's as of the last time I pulled the kernel repo back in November.
> no-one should be surprised to see no contributions from Apple to the Linux kernel anyway
Why not? I know Apple is generally looked as a closed down company, but I'm still surprised that a company with their caliber, capital, resources and engineers can't find the time to help out the community that is helping them.
As someone who works for a company where the IP lawyers own your soul and constantly remind you of the fact, I would not be surprised that Apple contributions to the Linux kernel are few and far between.
Don't worry, the Apple thinktroopers will catch up with those people and then they'll probably be free to contribute to any free software they want in their now-limitless leisure time.
Because in Apple's case they are merely using Linux as off-the-shelf server software to serve a website, or maybe files in their HQ. They have no need to modify Linux, therefore they have no need to create patches. Apple doesn't even contribute to FreeBSD despite lifting many components for OS X, AFAIK.
The others have deep investment in Linux and therefore contribute patches.
AFAIK Apple favors FreeBSD and NetBSD (the latter used for the Airport routers). I wouldn’t be surprised if they have contributed code to those projects.
I was born 17.09.1990, exactly a year off. I grew up with Linux and am eternally grateful to the "scene" surrounding the magazine CDs with - I don't know - like 17 distros packed in together and endless tutorials on each of them.
It's also amusing how many people - my age or so - use Ubuntu as a daily driver these days that never went through the pain of configuring LILO or Broadcom drivers from source in Slackware ;)
I started using Linux in around 94/95. I heard some people say how cool Linux was. So after numerous attempts I got it installed and was booted to a command prompt and asked "What the fuck is so cool about this?"
It would turn out to be love at first sight. I've been using Linux ever since, and I'll have been working at SUSE for 10 years this fall.
I remember going to a Linux meetup in downtown Seattle in about the spring of 1993. I was surprised at the large number who attended, probably a couple hundred.
[0] I thought to check because I instantly knew the number was not divisible by 3 (the 9s drop out mod 3, and (1+7+1+1) % 3 = 1), and at that point it's just quicker to look it up than run through the other primes up to sqrt(1791991).
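That check can be sketched in a few lines of Python. This is just an illustration of the reasoning (digit-sum shortcut, then trial division up to the square root), not anything from the thread:

```python
import math

def is_prime(n: int) -> bool:
    """Trial division up to sqrt(n) -- plenty fast for numbers this small."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    for d in range(3, math.isqrt(n) + 1, 2):
        if n % d == 0:
            return False
    return True

# The divisibility-by-3 shortcut: a number is divisible by 3 iff its
# digit sum is. For 1791991 the 9s contribute nothing mod 3, leaving
# 1+7+1+1 = 10, and 10 % 3 == 1, so it is not divisible by 3.
assert sum(int(c) for c in "1791991") % 3 == 1
```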
I remember my first Linux distro, Monkey Linux, downloaded from the BBS. It fit on 5 floppy disks, with XFree86. You had to use `arj` to extract it, an alternative to `pkunzip` at that time.
(http://www.ibiblio.org/pub/historic-linux/distributions/monk...).
SuSE invented the LiveCD, AFAIR, years before Knoppix claimed to be the first LiveCD Linux in the late 90s.
Yggdrasil was amazing. It single-handedly helped me sell Linux to many a grizzled greybeard, who couldn't believe that this toy kernel would amount to anything .. all I had to do was boot the CD and show them a working X-terminal a few minutes later, and that was all it took: I spent days copying those CDs for the entire team.
Curious, do you remember when user contributable/rolling package managers became popular? Back in 2000 when I wanted latest software, I remember having to resolve dependencies manually (view compile errors, then yahoo/google for libraries and errors). Each dependency had to be compiled manually, sometimes requiring patching code to get things to play nice. This was always a headache lol, but felt awesome once things actually compiled.
Debian was the first distro to allow volunteers to maintain packages. Then Ubuntu released Hoary and handed out invitations to become MOTU maintainers, with a hangout channel on Freenode. Later on, Mercurial was released and Launchpad.net was created during the Dapper days, at a time when Git was not yet mainstream, which gave way to PPA packages. Then, a year or so later, Arch Linux gained popularity, letting anyone submit packages via `yaourt`, a custom package manager built on top of `pacman`.
We used Linux in a product before it had a network stack. We did need networking, and used KA9Q to do it. We also used rz / sz (zmodem protocol) for file transfer over phone lines. I recently integrated zmodem into an embedded system for firmware updates over a serial port- it's still very useful.
The performance of the floppy drive was terrible at first- a friend of mine and I added buffering, so that it could read a track at a time instead of block at time (which caused it to read only one block per disk rotation).
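The idea behind that track buffering can be sketched in Python. This is a toy model, not the actual kernel floppy driver; the geometry constants and names are purely illustrative:

```python
BLOCKS_PER_TRACK = 18   # e.g. a 1.44MB floppy has 18 sectors per track
BLOCK_SIZE = 512

class TrackBufferedDisk:
    """On the first access to any block in a track, read the whole
    track and cache it; later blocks in that track come from memory.
    Without this, each block read can cost a full disk rotation."""

    def __init__(self, raw_read_track):
        self._read_track = raw_read_track  # callable: track_no -> bytes
        self._cache = {}                   # track_no -> track bytes

    def read_block(self, block_no):
        track, offset = divmod(block_no, BLOCKS_PER_TRACK)
        if track not in self._cache:
            # One physical read (one rotation) per track, not per block.
            self._cache[track] = self._read_track(track)
        start = offset * BLOCK_SIZE
        return self._cache[track][start:start + BLOCK_SIZE]
```

Reading all 18 blocks of a track then triggers a single physical track read instead of 18 single-block reads.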
For the same project my friend created the generic SCSI driver that still exists today. It allowed us to connect a medical film scanner to Linux.
I've always wondered what Torvalds's answer would be if you asked him: given the chance to snap his fingers and have the kernel rewritten from scratch, what would he change?
Linus' monolithic kernel won out over Tanenbaum's microkernel, because it just worked. In the 1990s that was important.
Now we want it to work and not get totally pwned because we opened a sketchy email attachment before we had our coffee. Tanenbaum was right [1], but for the wrong reasons, and way too early.
Ah, the memories of ordering distro bundles from cheapbytes and gleefully experimenting in my “test lab” (aka bedroom). It’s hard to quantify how liberating the early Linux days felt after a slow and painful indoctrination to computing in the Windows world.
When I was 15-16, me and a friend ordered CD-ROMs from CheapBytes and sold them in The Netherlands (with a label applied to cover the CheapBytes branding). Most people didn't have credit cards in the 90s, so this scheme worked well enough to earn us some additional 'pocket money'. The website is forever preserved by Tripod (at some point we stopped, but never removed the website):
We did get some complaints from people who were convinced that selling Linux distributions was illegal.
I was a C.S. student at UNC-Wilmington when RedHat first launched, and I remember a lot of people were freaked out about that. They were just howling "they're SELLING Linux?!???!" Heh.
One of the most interesting folders on the Funet archive is the mirror of the Simtel[0] FTP server[1]: there is a lot of very interesting software for various old platforms.
Especially interesting: there are some very cool CAD, GIS and graphics apps for MS-DOS and for Windows (from 3.1 to XP).[2]
Ah yeah, the good old days. Yggdrasil Linux, Turbo Linux, Slackware, etc. I think I installed my first Linux system in 1996 (maybe 1997) and never looked back. I didn't drop all use of OS/2 and Windows immediately, but by 2001 I had adopted Linux as my fulltime desktop OS and to this day it's all I use, outside of situations ($DAYJOB mostly) where I'm required to use something else.
I wonder if Linus had any idea of the impact his "toy project" would have on the world?
I remember buying "Linux - unleashing the workstation in your PC" circa '94 which had the tag line "friends don't let friends use DOS !". I was quite stunned at how much better it was than DOS/Windows at the time. I've been a Linux user ever since!
These specs, and it runs over HTTP... and is likely written in Vim, in unformatted HTML.
> It runs on a Linux server with dual 20 core processors, 786GB of memory and 80+TB of NetApp NFS storage.
> It has a 2 x 25Gbit/s connection to the Funet backbone.
Got my first copy of Linux (Debian Potato!) in early 2002. I'd known about it for years thanks to second-hand computer magazines and the excellent (in retrospect) ZDTV, and it definitely lived up to the hype.
ZDTV was amazing. First I learned about Linux was from "The ScreenSavers". Loved that show.
I think my first functioning Linux install was Debian Sarge in maybe 2005. I remember the first time I got it to boot and it was just the command line that I had working, but it felt like freaking magic.
Those were the days too of Compiz and all that fun. It was mind blowing to realize that there was an alternative to Windows that not only looked fancier but had free access to things like compilers and interpreters. Definitely started me down the path to my current programming career.
My guess is that they're mostly optimizing to minimize impact on other operations and the effort needed to host this: 80TB of storage from a shared system might be nothing compared to the needs of scientific computing, but letting requests hit it directly might be too much. Having lots of cache at the frontends takes care of that, and dual processors are needed to support that much memory.
Yup, the niche science site may have a big working set for a single server to handle a worldwide random workload, but it is still cost effective versus trying to pay for a CDN that will have such low user density.
The conventional science community approach is volunteer mirror sites. We could have many benefits of a CDN without the big recurring costs.
You may as well ask why big systems are needed for anything. You can run Postgres on a 10 year old laptop, but you can't run Postgres on a 10 year old laptop as a backend for the USPS.
An SQL database workload I would understand, but an FTP seems mostly filesystem/io not memory and cpu bound. If you have more information, I'd be happy to have the details.
So I actually live (and am typing this) about 5 minutes walk from the CSC head office in Keilaranta, which I can see out of my window. I wonder if the machine is physically located there.
(I'm pretty sure my first linux kernel download was 1997, one of the 2.0-pre series, and it probably came from funet.fi)
It has less to do with capacity required for the task than it has to do with justifying the IT manager's budget and general dickwavery. Although, if IT really needed to justify its budget, today it'd be talking about how many AWS nodes they use and how big their Kubernetes cluster is. To serve FTP.
What does 17.9.1991 mean? Can I suggest people stop coming up with their own novel date formats and use a standard that universally makes sense? ISO 8601 is worth looking at:
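For illustration, Python's standard library produces ISO 8601 calendar dates directly (a minimal sketch; the date is the one from the thread):

```python
from datetime import date

d = date(1991, 9, 17)
iso = d.isoformat()   # ISO 8601: YYYY-MM-DD, unambiguous everywhere
print(iso)            # 1991-09-17

# A nice side effect: ISO 8601 dates sort correctly as plain strings.
assert "1991-09-17" < "1992-01-01" < "1992-02-01"
```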
You're technically correct. However, to a large number of people it not only appears novel, it is ambiguous for ~40% of dates of the year with their native format (MM/DD/YYYY). Even if it's convenient for the majority, it may cause so much friction for the minority that it's not worth using anymore.
My native format is mdy but I dislike it almost as much as dmy.
Day month year is a common date format in much of the world. I think they should write the year as 9191 for consistency but I’m not willing to die on that hill.
I agree, but 17 being greater than 12 and the year being 4 digits reveals the order in this case. This only works sometimes, though. Like how 14 o'clock is pretty clear, but because not everyone uses 24 hour time yet, 2 o'clock is actually ambiguous. I find a leading 0 helps, but in spoken conversation it's not so simple.
Not in English, but the direct translation is used in other European languages.
Danish: klokken 14
German: 14 Uhr
That does mean native speakers of languages like this might say "14 o'clock" when speaking English.
I very occasionally use this if I need to be certain to be understood by someone who rarely speaks English, "we'll arrive at 19 o'clock". (A friend's parents in rural rest-of-Europe, for example.)