That is one of the most worthwhile videos I've ever watched. I also had no idea it was a "precursor" to UNIX, so it kind of amazed me, as if I were understanding time-sharing for the first time in my life.
I also really like how he describes what I know as an OS, but from before that term was used. He also mentions magnetic disks and terms like "words" (we use bytes now), "supervisor" (what we'd call the OS) and "alarm clock", which gives me a feeling of the turning point between huge, expensive computers and personal computers. The work of this man and his team is truly the birth of computers as we know them.
Exactly, this is why I shared this video. Hearing concepts explained by those who pioneered them really brings them home, because it highlights the context in which they were created: a description of what the trouble with computing was back then, how it was solved, and the excitement of the achievement.
For instance, he situates his breakthrough in a timeline whose main milestones are, first, getting computers to work at all, and then high-level languages, which made them much easier to program (Fortran, https://www.youtube.com/watch?v=dDsWTyLEgbk). Then, time-sharing made them much easier to use.
The birth of the terminal, no less, which is still around and which we all take for granted (I know it's deeper than this, multiprocessing, but this is the most visible manifestation) ...
Judging from this video, it also seems that he was a very good communicator.
I haven't watched the video yet so I may be missing some context, but terminals (as in the ASR-33 and its ilk) predate computers. They were used for telegraph/telex services, for example. Old teletypes had no electronics, btw.
I mean a computer terminal, in the sense of a device from which you can control multiple computations and get real-time feedback, with the illusion that it's all happening on a dedicated computer even though others might be sharing it.
But he does say a "word" is the information of 6 letters or 10 numbers, which is a bit curious.
My understanding is that most encodings for letters in the 1960s were 6 bits, so that would perhaps imply a 36-bit word for Dr. Corbato's computer. But then, if you had to fit ten numbers into a word, each could only be an integer of up to 3 bits, which doesn't sound right.
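Though, rereading it, maybe "10 numbers" means one number of up to ten decimal digits rather than ten separate integers. A quick back-of-the-envelope check (assuming a 36-bit word and 6-bit character codes, as on the IBM 7090-class machines of that era) makes both figures line up:

```python
import math

WORD_BITS = 36   # assumed word size (IBM 7090/7094-class machine)
CHAR_BITS = 6    # assumed character width (e.g. BCD encoding)

# How many letters fit in one word: 36 / 6 = 6
chars_per_word = WORD_BITS // CHAR_BITS

# How many decimal digits a 36-bit binary word can hold:
# floor(36 * log10(2)) = floor(10.83...) = 10
decimal_digits = math.floor(WORD_BITS * math.log10(2))

print(chars_per_word, decimal_digits)  # 6 10
```

So "6 letters or 10 numbers" would describe the same 36-bit word interpreted two ways, not ten packed integers.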
Well, I meant that he used words instead of bytes, even if in some architectures a word was a byte. From what I can tell, the term "word" was simply more widely used back then. Nowadays it's much more meaningful to give the actual size in bytes rather than words, because 8-bit bytes are common throughout most modern computer systems.
About 11 years ago, Dr. Corbato was invited to speak at an ACM meeting in the computer science department at UConn.
He came, but he was, it seemed, too elderly to actually speak. So he stood up, played that video on YouTube, and then sat down, which was the entirety of his interaction with our students.
I was thrilled to become aware of the video and have watched it several times since, but I feel the in-person appearance was not a great use of Dr. Corbato's time.