
at the time, just out of undergrad, I ended up working for the remnants of the #9 video card company that had been bought by S3 and was making a last effort at a Linux-based, Transmeta-powered "web-pad" (tablet): the "Frontpath ProGear" (new management wouldn't let them give it a Beatles-related name like #9 equipment used to get)

in any case, due to the unfortunate timing of the dot-com implosion it never really went anywhere (I wish I had managed to keep one; they used to appear on eBay occasionally)

the one thing I remember is that it was memory-limited: it had 64MB, but I think the code-morphing software really wanted 16MB of it, which cut into the available system memory


saying "windows 98 was bad too" is just an example that Microsoft has always had poor code quality. Back in the day Linux, for all its flaws, was generaly a lot more stable on the same hardware.

Microsoft has a lot to answer for after 50 years of normalizing poor-quality software


Not sure about this. All OSes were janky and buggy: the Linux desktop up until at least the late 2000s (I've been using it since ~2000), early Mac OS X, and I don't even want to talk about classic Mac OS, which was an abomination. Software quality and user experience were notoriously worse than they are today. This applies to everything: I've lost a ton of work to bugs in ZBrush, Maya, Word, FL Studio, backup software, and more.


as someone who has written my own OS from scratch (vmwOS) and teaches a class on it, I have to agree with a lot of the other comments that x86-based OS projects do end up being exercises in 40-year-old PC/x86 retrocomputing.

A few years ago I would have recommended the path I took (writing an OS for the Raspberry Pi), but the Pis have gone off the rails recently. Writing a simple OS for a Pi 1B+ is relatively doable (simple enough, sort-of-OK documentation; the biggest downside is needing USB for the keyboard).

Things headed toward disaster once everyone wanted to use the Pi 4 (which was all we could manage to source during the CPU shortage of '23): the documentation was poor, getting interrupts going was nearly impossible, and the virtual memory/cache/etc. setup on the 64-bit cores (at least a few years ago) was not documented well at all.
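
(Not from the comment above, just a rough illustration of why the Pi 1 path is doable: a minimal bare-metal sketch that toggles a GPIO pin by poking the BCM2835 registers documented in the public "BCM2835 ARM Peripherals" manual. The pin number and the kernel_main entry point are arbitrary examples, and the usual boot glue (linker script, stack setup, kernel.img on the SD card) is assumed to exist elsewhere.)

    /* Hypothetical sketch: blink a GPIO pin bare-metal on a Pi 1 B+.
     * Register addresses come from the BCM2835 ARM Peripherals manual;
     * GPIO 16 is just an example pin. Boot glue is assumed elsewhere. */
    #include <stdint.h>

    #define GPFSEL1 ((volatile uint32_t *)0x20200004) /* function select, GPIO 10-19 */
    #define GPSET0  ((volatile uint32_t *)0x2020001C) /* set pins 0-31   */
    #define GPCLR0  ((volatile uint32_t *)0x20200028) /* clear pins 0-31 */

    void kernel_main(void)
    {
        /* GPIO 16: bits 18-20 of GPFSEL1, 001 = output */
        uint32_t sel = *GPFSEL1;
        sel &= ~(7u << 18);
        sel |=  (1u << 18);
        *GPFSEL1 = sel;

        for (;;) {
            *GPSET0 = (1u << 16);                        /* pin high */
            for (volatile int i = 0; i < 500000; i++);   /* crude delay */
            *GPCLR0 = (1u << 16);                        /* pin low */
            for (volatile int i = 0; i < 500000; i++);
        }
    }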


If you are still interested in SMP on a 64-bit ARM, we have had some success with virtual memory/cache/peripherals on the Pi Zero 2 W


quite possible because it's from Europe, but remember that Apple was sticking + on the end of their model names 6 years before the Amiga existed.


> remember that Apple was sticking + on the end of their model names 6 years before the Amiga existed.

Did they? AFAIK, Apple always used “Plus”, not “+” (see https://en.wikipedia.org/wiki/Apple_II_Plus, https://mirrors.apple2.org.za/ftp.apple.asimov.net/documenta...), and “+” is shorthand invented by the community.

The Macintosh Plus, similarly, wasn’t a Macintosh+ in Apple’s marketing, AFAIK.

And, looking at https://en.m.wikipedia.org/wiki/Amiga_500#Amiga_500_Plus, it doesn’t look like Commodore stuck + on the end of their model names, either.


I think that many companies have been appending + to the end of product names for an extremely long time. This is hardly an Apple innovation.


Next you're gonna try and tell me that Apple didn't invent the mobile phone. Or the portable MP3 player. Or the windowing GUI.


Jobs was "inspired" by a visit at Xerox labs, they showed him a GUI built using Smalltalk (which they'd also invented). So naturally, he ran back to his office and invented GUI ;)


There is no prize for second… unless you do it better.


Or for third, I guess, lol. Jobs demoed the Mac GUI to Gates, and apparently Gates ran straight to his Microsoft office, where he too invented the GUI. Jobs was very upset for years.


No, but they did invent rounded corners :^)


Steve Jobs invented the "+" sign at Reed College!

/s


Yes but it was much more stylish: ⌘

/s


No, it was not Steve who found the symbol but Susan Kare, the Macintosh graphics artist. https://www.folklore.org/Swedish_Campground.html


But unlike Steve, Susan will live on, immortal, as the inventor of the Dogcow.


Yeah, that means it was Steve. :P

/s

(joking, relax..)


I think it actually was an Apple innovation, at least for {hobbyist, home, personal} computers. I did some digging and wasn't able to find anything before the Apple II+ in 1979. Please do prove me wrong, though!


I found an old edition of Byte from 77 where they advertise a "Vector Graphic Vector 1+":

https://isaac.lsu.edu/byte/issues/197710_Byte_Magazine_Vol_0...

A quick search didn't turn up pictures, but I did find a "Vector 1++":

https://vintagecomputer.ca/vector-graphic-vector-1/


The BBC Model B (the machine the Raspberry Pi got its A/B designation from) was supplemented with a Model B+ in 1985, with twice the memory.

https://chrisacorns.computinghistory.org.uk/Computers/BBCB+6...


as someone who has built various Raspberry Pi clusters over the years (I even got an academic paper out of one), the big shame is that, as far as I know, it's still virtually impossible to use the fairly powerful GPUs they have for GPGPU work


sub-hundred gigaflops counts as "fairly powerful" now?


ironically, you should be thanking Apple that the IBM PC exists

The Apple II was an open system and IBM clearly took a lot of inspiration from the Apple II line. Look at the 5150 motherboard in the picture in the article and compare it to the motherboard from an Apple II+


Contrary to popular belief, Apple is not the second coming of our lord and savior. We can thank Compaq[1] for the open PC ecosystem, though.

[1] https://www.allaboutcircuits.com/news/how-compaqs-clone-comp...


The Apple II was an open system in a sense. Apple published the schematics and ROM source code. But it didn't have well-defined interfaces that developers respected. A lot of published software, including some of the most popular apps, made use of variables and entry points in "unofficial" ways. This made it impossible for Apple or anybody else to even know how it was being used, much less to write a compatible ROM or OS that was not an exact copy of the original.

And if an updated system were to break any published app, Apple would be blamed. There were apps, albeit only a few, that would not run on an Apple IIe, and, I think, a few more that wouldn't run on a IIc.

There were some notable violations of published interfaces in MS-DOS software as well, most famously the hard-coded page locations of display memory, which led to the famous "640K barrier." But they weren't enough to dissuade developers from treating the PC as an "open enough" platform.

I doubt that developers felt a particular sense of morality about the DOS interface that they didn't feel about the Apple II; rather, the interface was good enough to use as-is.

The really important thing here was the openly published interface, and the mutual agreement among devs to respect that interface. I mean "open enough" and "mostly respect," of course.
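
(For illustration only, not something from the comment above: this is roughly what "not respecting the interface" looked like on DOS. Instead of calling the documented INT 10h BIOS services, many programs wrote straight into the color text-mode buffer at segment B800h, which is part of why the memory layout under 1MB could never move. The sketch assumes a 16-bit DOS compiler with far pointers, e.g. Turbo C; it is not standard, portable C.)

    /* Hypothetical example: direct video memory access, DOS style.
     * Writes a character straight into color text-mode memory at B800:0000
     * instead of going through the documented BIOS INT 10h interface.
     * Requires a 16-bit DOS compiler with far pointers (e.g. Turbo C). */
    #include <dos.h>

    void put_char_fast(int row, int col, char ch, unsigned char attr)
    {
        unsigned char far *video = (unsigned char far *)MK_FP(0xB800, 0);
        unsigned offset = (row * 80 + col) * 2;   /* 2 bytes per cell */

        video[offset]     = ch;     /* character */
        video[offset + 1] = attr;   /* attribute, e.g. 0x1F = white on blue */
    }

    int main(void)
    {
        put_char_fast(0, 0, 'A', 0x1F);
        return 0;
    }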


And when people made Apple II clones, most of them (like the Franklin Ace series) got sued out of existence by Apple. Eventually true clean-room ROMs were created, like for the Laser 128, but that was fairly late in the lifespan of the Apple II.


I believe they were both "accidentally open" for similar reasons. Neither company produced the most important chips and components in the device itself. That meant you could assemble a greater understanding of the device than even the manufacturer had, and there was good incentive to put in that effort in the early days of computing.


Intentionally open. As noted above, Apple published schematics and ROM source code. IBM published system board schematics as well as the BIOS source code.


> The Apple II was an open system

All computers which could be bought by individuals at that time were 'open systems', they usually came with a full set of hardware schematics and programming documentation, and sometimes even ROM listings. The Apple II was nothing special in that regard.


dating myself here, but I remember reading a really funny spoof article in the '90s about Microsoft announcing they had developed nuclear weapons. It didn't even seem that implausible at the time.

I would have linked it here, but none of the search engines are turning up anything at all, and in fact I don't think it's even possible to find stuff like that with search engines anymore.


I had thought maybe it was on the old 0xdeadbeef mailing list, but no luck; it was probably this from rec.humor.funny, which in the end isn't quite as clever as I remembered it being.

https://groups.google.com/g/rec.humor.funny/c/4zIyBq1-1_E/m/...


The funny part about our 1990s memes on big tech is that today's big tech is 100-1000x larger.

NVidia is worth more than Germany.


We're not for sale, but still… numbers? Nvidia's stock price isn't even a 10th of the gold reserve, as far as I can see?


Also, comparing GDP (rate of production) to valuation (area under the curve) is silly. Like comparing velocity to distance.


it's already happened. So many of the main contributors work for IBM, Microsoft, and Intel. It's extremely difficult to have your voice heard or your patches accepted if you're just a hobbyist developer

I've gone to the extreme of writing my own OS because I got fed up with how corporate Linux has gotten


it's funny that a lot of us Linux nuts on comp.os.linux.advocacy back in the 90s predicted this was Microsoft's planned endgame. I personally thought it would take less than 30 years for them to get around to it though.


You were pretty spot on; it's been about 20, I think


my plan to rick-roll every major coding competition continues apace. muahahaha

I actually had another entry that I felt had much more clever coding, doing some nice sixel animations, but from what I understand there were many entries doing that this year too

