On the Origin of the iPhone (daringfireball.net)
178 points by tambourine_man on Feb 26, 2022 | hide | past | favorite | 129 comments


"The cutthroat internal politics of Apple under Steve Jobs — strong personalities with large egos — amidst tumultuous technical drama (see timeline below) sounds like the makings for a damn good show like Succession."

Halt and Catch Fire and Silicon Valley (leaving aside Jobs) have arguably already successfully covered this ground from a reasonably safe biographical distance.


Pirates of Silicon Valley was literally about Jobs (and Woz and Gates).


I don't think mellosouls means that those portrayals left Jobs aside. I think they mean we should put the 2013 movie, Jobs, aside (hence their italics).

(Wise, though I personally think Kutcher did a good job with what he was given.)

Jobs very famously liked (or at least whimsically came to terms with) Noah Wyle's amazing portrayal of him in Pirates Of Silicon Valley, getting Wyle to repeat his performance on stage at a Macworld keynote.

https://www.youtube.com/watch?v=TIClAanU7Os

At the time, people thought that was Jobs doing damage control because Pirates made him look like an asshole. At this point looking back, it seems more like part of his self-reflective pivot to what came next.

Either way I am not sure it needs more treatment and more demonisation, but they were all pretty lucky with the care that went into their portrayals in Pirates Of Silicon Valley.


2001 Antitrust is about Gates :)


I have no idea why this sticks in my head, but way way way back when, I remember Leo Laporte being adamant, on MacBreak Weekly, that he was sure the iPhone wasn’t really running a version of OS X, because it was impossible to do so.

And mind you, as someone who has been using a Mac since System 7, and OS X as a daily driver since 10.2—yeah, at the time, it seemed wild beyond belief that a phone could run what amounts to OS X. Those first versions of OS X were not snappy.


Famously, after the initial presentation, BlackBerry had a meeting of important people and decided the whole thing had been faked because it was impossible.

The original iPhone (and to a degree the iPad) really seemed to have been right at the edge of what was technologically possible.


The only people in the industry who seemed to get it were the Android team. It took Microsoft 3 years to realise they actually needed to go back to the drawing board on their mobile tech stack.


    iPhone announcement: January 2007
    Android release: October 2008
    Windows Phone 7 release: October 2010
Indeed it seems that Microsoft took their sweet time to rearchitect their OS, but these dates suggest Microsoft realized it much earlier than three years in.


Microsoft had been shipping PDAs (of which the iPhone is an evolution) since the 90s. I had a Casio E-95. It's about the same size as an iPhone. It has a single button below the display at the bottom (same place as the home button on the original iPhone). Its home screen has a grid of 3x4 icons for apps (similar to an iPhone's 4x5 grid). What it doesn't have is a capacitive touchscreen; it has a resistive one, so it used a stylus (though I often used my fingernail). It doesn't have cellular service (but plenty of PDAs did before the iPhone shipped). Its UX is not optimized for touch as much as iOS. But it runs an OS, Windows CE, on a PDA, so I'm not sure anything needed to be rearchitected rather than just deprecating the desktop UI widgets for more touch-friendly alternatives (yeah, I know that's easier said than done, but it's a tiny part of an OS vs. all the rest).

Also, I'm not under any delusion that MS would have gotten it right. The same reasoning that led them to make only minimal changes going from Windows to Windows CE would likely have remained a drag on them fully letting go and making the necessary changes.


From early Windows CE memories, Microsoft's mobile hubris always seemed to start from "desktop Windows is the best OS." Ergo, the closer to desktop Windows they could put on mobile devices, the happier people would be.

Blackberry, Palm, and Apple all realized that given hardware and interface limitations, a better solution was to cut the Gordian knot and strike a different balance between desktop and something else.

Honestly, any of the three had a shot at being Apple. Palm had Graffiti and OSes that were really slick for the time. Blackberry had an iMessage of their own. But Apple had a more complete package + Apple fans + iTunes.


I think the issue is that Microsoft only targeted enterprise users and never tried that hard to make their mobile platform for consumers.

Because I remember having a Windows Mobile device and it was fairly powerful. You could do more on it than you could on Android or iOS for many years.


> From early Windows CE memories, Microsoft's mobile hubris always seemed to start from "desktop Windows is the best OS." Ergo, the closer to desktop Windows they could put on mobile devices, the happier people would be.

This insanity reached its peak just before the iPhone introduction with Microsoft's "Ultra Mobile PC" handheld device initiative.

https://en.wikipedia.org/wiki/Ultra-mobile_PC

I can remember an implementation of this from ASUS that used a D-Pad to drive a mouse pointer on a handheld device running an unchanged Windows XP UI.


To be fair to Microsoft,* there was customer pull for “desktop everywhere”. I remember asking 20 years ago at a city council meeting why they planned to spend so much on Exchange/Outlook rather than, say, Netscape.

The answer was that the Netscape icons looked different from Microsoft’s so lack of training costs would make MS cheaper. Sounds like bullshit but back then there was enormous learned helplessness when it came to anything computerish. Walk to any desk in most companies and there would be a stack of books to explain it all.

One of the best things about the iPhone was that it jettisoned all that crap and acknowledged that people aren’t idiots. The stores still offered free “training” for the fearful.

* strange, I know, as they were rarely fair to others.


And then later on Microsoft shipped Windows 8 - which was a touch-first interface on millions and millions of desktop computers. I still remember the utter despair when non-pro users didn't understand what was happening with the interface.

You can't make this sh!t up!


That kind of feature-parity comparison minimizes how much better the iPhone was, even at launch when it was clearly rough around the edges. It’s like comparing a 1988 Hyundai with a Tesla Model S. The features are all the same: 4 doors, 4 seats, steering wheel, brake and gas pedals. But the experiences couldn’t be more different.


One thing Apple had was the political ability to iterate on the design. Remember that the original iPhone had no third-party apps and no 3G data.

That made it look like an odd duck in the sea of computerphones, but what it had was a slick user interface completely designed around the capabilities of capacitive screens, and that took many years for the competition to even get close to.

Good user interfaces are truly hard, and Apple has always put it front and center, so it must have made sense for them.


Also, besides the form factor (PDA/phone) there was - before Windows CE - the little-known Windows for Pen Computing (essentially Windows 3.11). I used to have a Compaq "laptop" that was actually a tablet (with detachable keyboard), the Compaq Concerto, which was actually quite usable with its specific pen. Whether that is to be seen as a precursor of the iPad/tablet or of the Microsoft Surface or similar, it came well before anything else (and lasted only a very brief time):

https://en.wikipedia.org/wiki/Compaq_Concerto


I can't find any trace of an E-95 model; was it part of the Cassiopeia line? https://en.wikipedia.org/wiki/Casio_Cassiopeia


The Windows Phone 7 "tech stack" was based on the Windows CE kernel and, speaking as someone who worked on it briefly, it was a dumpster fire. It wasn't until Windows Phone 8 in 2012 that Microsoft put a real OS kernel on their phones the way Apple put Darwin on iPhones. Devices could not upgrade from 7 to 8 because they were essentially different operating systems.


For the customer, though, the difference was not that dramatic. My wife had a WP7 device and I liked it enough that I replaced my Android phone with a WP8, and while it was clearly better than 7, it was not dramatically better. At the time the lack of backwards compatibility was baffling for the WP7 buyers I knew, and all of them switched to iPhone or Android after that because of being burned by that transition.

I had a similar experience when they went to WP10. While WP8 was a fast, stable and highly usable OS, WP10 was a dumpster fire (for me as a user, cannot talk about underpinnings). Constant crashes, slow, terrible battery life. This was a widely shared experience among WP10 users as far as I know. And on top of that there was another software transition that all but guaranteed no apps properly jumped to the new platform. Looking back on it I think jumping to WP10 instead of carefully incrementing WP8 was the big mistake, and rolling out WP8 without a way to make the transition easier on existing WP7 buyers was the lesser one.

I don’t think a perfect execution would have saved microsoft though: they would have always been third and it is pretty clear developers cannot support more than two mobile platforms properly. The app situation would have always been bad, and without all of the popular apps a smartphone is dead in the water.


I might say that Microsoft didn’t really rearchitect the mobile OS until Windows Phone 8 in November 2012, which was based on Windows NT. Calling Windows Phone 7 a new skin on Windows CE is reductive, but CE was the kernel.


As if the 2.5-year delay to market wasn't enough, Microsoft also decided to invent a whole new UI paradigm for Windows Phone 7. I really want to know what went on when they made that decision.


The new UI paradigm was fantastic for users. That was never the problem with the Windows Phone. I'm not sure if you ever used one, but I loved it. Its problem was always the ecosystem. The device was fast even on low-end hardware, yet felt fresh and productive. It got out of the way. I always felt the tabs were a couple of points too large. I never wrote an app for it, so I only suspect that part sucked.

On the phone, people's social media was all integrated into one hub, and while that was great for the user, it reduced the platforms to a transport layer - which did not appeal to them. YouTube was a big deal at the time, and didn't want their app on the new competitor's platform. Microsoft wrote some clients themselves, and they were taken offline. On Windows Phone, YouTube would have been a few steps away from being just a storage platform.

Microsoft had a brand reputation problem at the time. Windows is what your parents used for work. Apple had an emotional connection of being fresh and hip. They had the iPod and people dancing in silhouette.

iTunes was available on Mac. You could run it on Windows, if you shut everything else down - the thing was a massive resource hog. I'm so scarred by it that even owning a Mac now I never use it.

The Zune app was a joy to use in comparison. I used the Zune app for years without even owning a Zune. You had to use iTunes to sync your iPod, whereas the older portable MP3 players were storage devices that you just copied files onto.

The end user experience was not the problem.

edit: some of this might be slightly inaccurate since I'm going from memory here. Happy to be corrected by someone who remembers better than I.


I had an iPhone at that point. I remember playing with Windows Phone 7 at a store and my gut feeling wasn't great about it.

Perhaps it was a combination of that text-based tab control, or the black & white + flat + text design they adopted for the live tiles and the UI controls.

It's been a while, so it's possible that it wasn't as bad as I remember it.


You’re so scarred by iTunes you didn’t even realize they got rid of it years ago.


As I understand it, the Android team had to completely redesign the user-facing aspect of the platform after the iPhone was announced.


Not really. The “I guess we’re not shipping that phone” quote was about not shipping the Blackberry-clone Sooner device, but the full-screen touch-first “dream” prototype was already on the drawing board.

Android source code is publicly available, with time stamps. You can read it and see how little had to change to make it a touch first ux.

Android had to add multi-touch events, make the default controls bigger, and add inertial scrolling to the list views. They didn’t even have to add a soft keyboard, because the first few gens of phones had physical keyboards.
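
Inertial scrolling itself is conceptually tiny. Here's a rough sketch of the idea (toy code with made-up names and constants, not Android's actual implementation): each frame, the scroll offset advances by the current velocity, and the velocity decays by a friction factor until it falls below a stopping threshold.

```python
# Toy model of inertial ("fling") scrolling. Purely illustrative:
# the names and constants are invented, not Android's real code.
def fling(offset, velocity, friction=0.9, stop_below=0.5):
    """Advance a scroll offset frame by frame until the fling dies out."""
    frames = [offset]
    while abs(velocity) >= stop_below:
        offset += velocity        # move by the current velocity each frame
        velocity *= friction      # exponential decay stands in for drag
        frames.append(offset)
    return frames

# A fling starting at offset 0 with velocity 40 px/frame coasts to a stop,
# covering less than 40 / (1 - 0.9) = 400 px in total.
path = fling(0, 40)
```

The real work, of course, was tuning curves like this against the hardware until they felt right, which is a large part of what the iPhone nailed first.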


> Android source code is publicly available, with time stamps. You can read it and see how little had to change to make it a touch first ux.

> Android had to add multi-touch events, make the default controls bigger, and add inertial scrolling to the list views. They didn’t even have to add a soft keyboard, because the first few gens of phones had physical keyboards.

Whether or not it was a lot of code, multi-touch was a revolutionary new UI paradigm and a beautiful symbiosis between well-tuned hardware and software. I don’t think its importance should be trivialized, and the iPhone was undoubtedly the catalyst that brought FingerWorks's invention to the masses.


This. The genius of the iPhone UI was that it was so obvious once you saw it. There were plenty of so-called smartphones then, with or without keyboards. But none of the touchscreen phone interfaces were user-friendly. Zero.


There was a full touchscreen phone that predated the iPhone, the LG Prada, but if you look at the demo you can see that in some places they arrived at a similar design (the dialer), in others they had a laughably bad design (T9 text message entry on the touchscreen), and overall it was very unresponsive. The iPhone didn’t just have the right touch UI concepts, it executed them really well.

LG Prada phone demo from 15 years ago: https://youtu.be/5mo7Ab6ZcJ4


Yes, I even had one, the LG Prada. Along with other smartphones like the Sony Ericsson P800, or the O2 XDA, which was similar to the LG Prada but way earlier, I think 2002. They were based on Windows Mobile. And some other PDAs before that. The O2 XDA was built by HTC; at the time they were like Foxconn, doing ODM work before becoming a consumer brand.

There were lots and lots of other small details that Apple got right. And I said it the first time I saw the iPhone: what set them apart was that Apple was trying to make the iPhone like an appliance, while all the other smartphones were trying to cram a computer into a phone form factor. The day I saw the iPhone I knew that was it. The smartphone I had been looking for. Strange that it's been 15 years already.

I still don't know why the Visual Voicemail patents haven't expired. It really should be standard on all networks.


I purchased the fingerworks gesture keyboard insert for my TiBook. It was amazing. I remember thinking it was the future of all laptop input devices. While I'm glad it resurfaced as the iOS input device, I am super bummed I still don't have that keyboard on my MacBook. That thing was incredible.


> Not really

Yes, really.

>[Google] had been working with proto­types for six months and had planned a launch by the end of the year . . . until Jobs took the stage to unveil the iPhone.

Chris DeSalvo’s reaction to the iPhone was immediate and visceral. “As a consumer I was blown away. I wanted one immediately. But as a Google engineer, I thought ‘We’re going to have to start over.’”

“What we had suddenly looked just so . . . nineties,” DeSalvo said. “It’s just one of those things that are obvious when you see it.”

https://www.theatlantic.com/technology/archive/2013/12/the-d...


Those quotes are entirely compatible with "not really".

Chris is talking about the Sooner: https://www.androidcentral.com/look-back-google-sooner-first.... It was called Sooner because it was going to come to market "sooner" than the other, more advanced, Dream prototype. After the iPhone announcement Sooner was canceled, in favor of the "Dream" prototype, which eventually shipped in a modified form as the G1.

The _software_ differences between the non-touch Sooner and the touch-capable G1 were relatively minor. Same OS, same view system, same application framework.

As I recall, Chris's role at the time was as a UX focused engineer, so perhaps from his POV it was a massive redesign. But if you consider Android as a whole, relatively little changed. I recall the apps being pretty easy to port over to support touch. (Moving to a bigger screen helped offset the problem of having to make the buttons way larger.)


There was a really similar mp3 player that had been out for a while. It was too expensive for me to buy one, so I don't remember the name


Zune?


Android didn't do anything (in the ballpark) for several years either.


Android had been working on their phone long before that. There is a story by Richard Hipp, the creator of SQLite, who was flown in by the Android team to help with their SQLite deployments on their phones long before it was public knowledge. He had also been working with Nokia/Ericsson, who reigned supreme in the cell phone market at that time, and according to him what he saw from Android blew him away; he knew the other companies were doomed. This was well before the iPhone days; I forget the exact date he mentioned, but it was circa mid-2000s. It is clear that Apple and Android had been working on a similar product; Apple just got to market first.


This is not how history went.

Android being worked on is very different from Android being anything at all like what it ended up being.

It literally took years before Android was in the same league.


It wasn’t several years. The G1 was released just over a year after the first iPhone, not long after the iPhone 3G, and it was roughly equivalent to the 3G.


Here's the G1: https://www.androidpolice.com/2021/09/23/the-first-android-p...

Here's cnet's review (literally just the first link I pulled from Google).

"But still, the G1 doesn't quite offer the mass appeal and ease of use as the iPhone, so it won't be a good fit for someone making the jump from a regular mobile to their first smartphone." https://www.cnet.com/reviews/t-mobile-g1-review/


I don’t need to read a review. I owned one when they were launched, while I was working as an iPhoneOS developer. I would spend eight hours a day using an iPhone 3G and the rest of the time using my G1. They were roughly equivalent. The iPhone did some things better, the G1 did some things better. It didn’t take “several years” for Google to be in the same ballpark, they were in the same ballpark from launch, which was fifteen months after the original iPhone launch.


I remember Android had a much better notification experience out of the gate, and IIRC, shipped copy-and-paste first. Same ballpark sounds about right.


OS X, at its core, is just Mach + FreeBSD. People were running Darwin in limited-resource environments for years.

By the time the iPhone hit the market, there were already plenty of handheld devices that had similar specs that were running full-blown Linux.

Pretending that the iPhone was "at the edge of what was technologically possible" is rewriting history.


OS X is not Darwin. All the interesting parts are on the layers built on top of it.

What was remarkable was being able to run a graphics-accelerated version of AppKit built for touch screens (UIKit) and have RAM to spare to run a non-trivial app, like, you know, a full-blown web browser, all in 128MB.

So yeah, Darwin + window server + SpringBoard + Safari + webpage, all in 128MB, is pretty impressive indeed.


128MB is absolutely massive compared to the hardware that NextSTEP / OS X was designed to run on.

https://en.wikipedia.org/wiki/NeXTstation


NeXTSTEP didn’t run Safari, nor Quartz, nor Aqua. And NeXTSTEP was slooooowww. One of the reasons some of us rooted for BeOS.


Those awesome BeOS demos! They would open a dozen apps to max out both CPUs (25 or 33 MHz as I recall) and then show that you could grab any window and drag it around, and everything else would slow to a crawl, but that window, the one you were interacting with, was still snappy.

Then they would shut off one of the CPUs.

Everything was as slow as winter molasses. And they would grab a window, and it was still super snappy, while everything else almost froze solid.

I miss being the center of my computer's universe like that.


NeXTSTEP didn't run Quartz, no.

It used Display PostScript -- a comparably impressive achievement at the time.

It also wasn't that slow on the Intel machines, e.g. the black NeXTSTEP PCs Elonex made in the UK. It was quite nippy on some of the earlier NeXT hardware, too.

BeOS was a very different operating system and -- let's be clear -- would have been a terrible, terrible choice.

I tell you what was slow: WebObjects on the Mach subsystem layer on top of Windows NT.


Mach was slow, disk access (the spinning beachball Mac OS inherited was supposed to be a CD originally) was very slow. Using a Turing complete language for drawing was scary.

> BeOS was a very different operating system and -- let's be clear -- would have been a terrible, terrible choice.

Oh, sure. But it sure was snappy. And gorgeous. And that file system was awesome.


> Mach was slow, disk access (the spinning beachball Mac OS inherited was supposed to be a CD originally) was very slow.

https://pbfcomics.com/comics/beach-closing/

> Using a Turing complete language for drawing was scary.

Not particularly sure why. Aren't all apps using Turing complete languages to draw on screen? It's just a question of at which layer that happens.

Display PostScript did the same, surely?


The comment is about the display layer. Yes [almost] all computer languages are Turing complete; the distinction is about the level at which they produce drawing instructions. Some simply set particular bits in memory that cause pixels to light up. That's essentially how the original Mac worked. More abstract is producing instructions to "draw a rectangle" or "rotate all future lines by 20 degrees." That's the "display list" approach and it's more flexible than manipulating raw pixels.

More flexible still is producing commands that can contain loops, like PostScript. Now we're in tricky territory, because it's easy to make a mistake that e.g. produces an infinite loop and the whole display locks up. And it's much harder to reason about such things as the update rate of the drawing process because the drawing commands are themselves Turing complete.

This doesn't matter when you're merely printing because there is no refresh. Not so with an interactive display. That's why the "Display" in "Display PostScript" was a big deal.
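
The difference can be sketched in a few lines of toy code (hypothetical, not any real graphics API): a display list is plain data, so rendering it is guaranteed to terminate in time proportional to its length, whereas executing an arbitrary drawing program on every refresh needs an instruction budget to avoid freezing the display.

```python
# Toy illustration of "display list" vs. Turing-complete drawing commands.
# All names here are invented for the sketch; this is not a real renderer.
def render_display_list(commands):
    """A display list is just data: one pass, guaranteed to finish."""
    ops = 0
    for _cmd in commands:
        ops += 1                  # pretend to execute one drawing op
    return ops

def render_program(step, budget=1000):
    """A drawing *program* may loop forever, so an interactive display
    must impose a budget instead of trusting the program to halt."""
    ops = 0
    state = {"done": False}
    while not state["done"]:
        step(state)
        ops += 1
        if ops >= budget:         # bail out rather than lock up the screen
            raise TimeoutError("drawing program exceeded its budget")
    return ops
```

A printer can simply run the program to completion, since there is no refresh to miss; an interactive display cannot, which is the point about the "Display" in Display PostScript.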


I think PostScript was considered “too powerful” for its own good by the industry, which is why it was mostly replaced by PDF (initially a subset of it).

With PostScript, you can have a document that renders differently every time you open it (on purpose). Or a file that will never finish rendering or printing, like the infinite loop you mentioned.

PostScript's infinite possibilities are cool and cute, but not all that practical for a graphics description language.


> With PostScript, you can have a document that renders differently every time you open it (on purpose). Or a file that will never finish rendering or printing, like the infinite loop you mentioned.

There was a font, wasn't there -- was it Just van Rossum's Beowulf -- that drew itself slightly differently every time? Can't find a reference now.

edit: it was!

https://www.fontshop.com/families/ff-beowolf


Oh my god. Thanks for this.


128 MB is also the base requirement for 10.2, 10.3, and 10.4 (although at least 10.3 will run with much less than that in practice).


It will run. It’s just a very bad experience.

The original iPhone was a delight to use, responsive like the desktop counterparts weren’t, despite much slower hardware and running on battery.


I think part of that was that the graphics layer on the iPhone was GPU-accelerated from the start, and the first few versions of OS X were not.

One of the things that Microsoft got right and Google got wrong was that Microsoft copied the concept of using GPU-accelerated graphics with Windows Phone, and it took Google quite some time to catch up.


> OS X is not Darwin.

iOS is not OS X.

> So yeah, Darwin + window server + SpringBoard + Safari + webpage, all in 128MB, is pretty impressive indeed.

Maemo did all of this years before, and in 64MB of RAM.


I don’t think Debian was doing double-buffered GPU compositing of the UI. Nor was it running WebKit.

Show me a video of a smooth pinch-to-zoom on a webpage with MicroB.


Konqueror has modular backend support, and it shipped with WebKit support as backend before the iPhone was released. Rekonq also used it and not KHTML. Epiphany used WebKit since mid-2007, it's now Gnome Web. All were available on Debian and Debian-based distributions. Phone-wise, Symbian used WebKit in 2005.


My iMac from 2000 had 128MB and ran OS X.


And it was terribly slow.

The iPhone was responsive like no other computer at its time, let alone phone. It prioritizes user input above all else, which is the right choice.


> And it was terribly slow.

https://youtu.be/YW7z0uMOYrw?t=364 Does that look terribly slow to you?


That's a 600MHz G3 with 256MB of RAM. Probably a 3X faster CPU and twice the RAM.

I was a user at the time. Had many iMac G3s.

But even on this machine, try navigating through nested menus, dragging windows around. Now boot from Mac OS 9 and do the same. It's night and day.


Show me the handheld devices that were running full-blown Linux at the levels the iPhone was (i.e. with a responsive UI). Linux being Linux I’m sure it can be run on a sock but that’s not the same as what iPhone OS 1.0 was.


Well of course they weren't running it at iPhone level performance.

First because the device and the software hadn't been optimised for each other.

And second, the fact that iPhone OS was not a "full blown" Unix OS is precisely why it _did_ have that performance on otherwise reasonably similar Samsung ARM hardware (better specs but not, you know, alien technological advance stuff).

The point is that there is this tradeoff. Developers who'd squeezed full multitasking Linux into phones would have been well aware of the kind of performance they'd see without multitasking and without a BSD subsystem.

FWIW Jim Gettys' handhelds.org project was demonstrating -- years earlier -- pretty impressive Linux GUI performance on much earlier hardware that had been only modestly adjusted for it [0]. Including for example tilt-sensitivity.

Those who knew about those projects could see that Apple had likely benefited enormously from seeing what was being done, and had decided to make an OS profile for iPhone that prioritised single-app performance with a sizeable slice of OS X.

[0] The iPaq as a product was designed in part to explicitly accommodate research from the DEC Itsy on off-the-shelf devices


> Well of course they weren't running it at iPhone level performance. First because the device and the software hadn't been optimised for each other.

I disagree entirely. Maemo was designed, built, and optimized for low-powered mobile devices, as compared to desktops.


If it was so optimized, why did it let apps run down the battery in 30 minutes?


I don't know, you tell me? I could get hours of FreeCiv out of the N770, and that will steal a core of even a modern processor for a while.

Both the N770 and N810 were marketed as products for early adopters and developers.


Ah the goals are shifting I see


Too few words to tell if you're being sarcastic, rude or whimsical. (edit: I'm going with rude)

The entire mobile phone industry was -- and still is -- making these tradeoffs. (Android used and uses a slightly more advanced process model that iOS eventually adopted, but even it does not and did not offer full multitasking).

It's not shifting goals, it's an indication that there's more than one possible goal to shoot at.


Nokia Nxx series like the Nokia N770 and Nokia N800, and then the N810 and N900. Maemo[1] was full-fledged Linux, and you could run full-blown Linux applications. GTK apps were natively supported. I played a lot of FreeCiv on them and used SSH's X forwarding on them. Maemo became MeeGo, which became Sailfish OS.

[1] https://en.wikipedia.org/wiki/Maemo


The N810 and N900 were AFTER the iPhone (N900 significantly so), and it’s quite revisionist to compare the N770 or N800 (or N810 for that matter) to the first iPhone, in terms of its performance and feel. Yes, on a pure feature list comparison, the N770 and N800 might have compared well (or even better, especially for iPhone OS 1.0), but those devices never felt the same as an iPhone. Capacitive touch screens made a big difference too.

The N-series Nokias (including the N95 and N97 phones) were really special, but they weren’t iPhones. The iPhone just made everything else feel old instantly.


Until you tried to copy/paste text, and then you went back to your Nokia. This is a pretty subjective thing you're trying to argue, here.


It was at the edge in a few small ways, but it was carefully profiled to be that; it had limitations (the single-process model) that didn't really matter, the absence of 3G excepted, in favour of performance (graphics, capacitive multitouch) that did.

It was out in front in a lot of ways but not witchcraft.


That's just it, craft. They took the time to craft the system to fit a certain profile which people found desirable and it changed the industry.


Right.

The common criticisms about Apple kit, at any point in time, will be led by feature comparison rather than benefit comparison.

On almost every level, the iPhone prioritised benefits (consistency, simplicity, responsiveness, discoverability) over features.

And in all but really two of the feature cases (text management/copy and paste, and the absence of 3G), they gambled right: people cared about the benefits they'd engineered for, and not about the features they'd skipped.

(They managed to bluff past the apps question: we knew they had to be working on it and they managed to tell us they were working on it without telling us they were working on it)

Commentators still make this mistake with Apple.


“It was out in front in a lot of ways but not witchcraft.”

I’ve been running Linux since 14.

Had all the great smartphones of the age.

At 19 I purchased the first iPhone.

It was fucking witchcraft.


> I’ve been running Linux since 14.

Judging by the maths I was running it when you were four or five and have done ever since. I'm not sure what point you're making.

> Had all the great smartphones of the age.

Good for you. So you ran the handhelds.org linux project on a Compaq iPaq back in 2002-2003, or no?

> It was fucking witchcraft.

It was not. Apart from the multi-touch implementation which is transformational.

Apple did not put anything in that device that other manufacturers didn't have access to, and their device lacked capabilities others had.

They just did it right.


Nope, I contend chicken blood must have been spilled in pentagram marked areas because after using the first iPhone I was spellbound.


I think you are mis-remembering history.

People were in awe of how responsive the system was. Things like the pins dropping in the Google Maps app seemed faked; no way they could do it.


I had a Zaurus at the time. It could run X11 and Java apps and even had a self-hosting dev environment.

The iPhone was mostly a regression from that.


What everyone wants, X11 and dev environments on their cell phones. You sir, are a visionary.


People don't like the constraints of smartphones but they don't understand computers well enough to articulate it. That you do and then you mock me for pointing it out indicates a particularly ugly kind of follower mentality.

X11 isn't that important, it's the large app library it brings with it (many apps that after 24 years still have no replacements on either smartphone OS.) For power users the flexibility is nice too.

Onboard dev environments aren't useful to most people but it indicates/allows a few things:

1) Individual users who are inclined can easily improve the platform with very little effort.

2) The owners of the platform aren't stopping people from doing what they want with their own devices.


“stopping people from doing what they want with their own devices.”

I’m guessing you don’t use toasters or Nintendos either then, because you can’t do what you want with those.

The tired arguments you trot out have been covered to death. You are a hyper-niche use case; pretending the general user feels like you do is delusional.


Wrong, people love the constraints. Or else smartphones wouldn't be the most popular computing devices in history, and we'd all be rocking mobile terminals.


>Popularity -> utility

Talk about tired arguments. All popularity really means is good PR.

As to toasters and Nintendos, yes, I do whatever I want with those: toasters have nothing to stop me, and Nintendos are usually trivial to get a shell on (and often have flash carts available).


[flagged]


Why wouldn't applying reason (especially since you're familiar with your personal situation) result in better decisions than just listening to people who are paid to tell you what to think? What argument are you even making here?


You are extrapolating your own personal niche tastes to the wider market. This is obviously a mistake.

Personally I am happy you have devices that satisfy your needs. I also understand that the average person makes different value calls and that drives different choices.

Summing all that up as good PR is lazy and intellectually dishonest.


BlackBerry's hang-up was that they couldn't figure out how Apple was able to power such a device, and how AT&T could allow it to use so much bandwidth when that was such a point of contention.

https://us.macmillan.com/books/9781250096067/losingthesignal


“So much bandwidth”? It was an EDGE device in a 3G world. It relied on Wi-Fi for location services when high-end phones had used 3G towers and GPS for years.

In so many ways the iPhone was a poor high end phone on paper.


OTOH the iPhone accessed the real HTTP Web not WAP and it didn't use a split browser architecture. For comparison, the Danger Hiptop/T-Mobile Sidekick was only allowed to have an unlimited data plan because all traffic from the phone went through proxy servers that compressed everything. AFAIK Blackberries also routed everything through a BES/BIS proxy.


There were a lot of people in tech and the media who made all sorts of claims that we now recognize as nonsense. One that stuck out to me was a “tech guru” stating that the screen must be using heat to detect touch, and that this meant the screen would wear out with use over time.

Even back then it felt utterly comical to see that on TV.


After the iPad was announced, some industry expert wrote a long post about his knowledge of screen technology and how the promised battery life was clearly a flagrant lie, that there was no way the initial iPad could last more than 2-3 hours.


I remember one of the spoof commercials (on MadTV perhaps?) when the iPhone was first released that described all the awesome features and wrapped up with a battery life of 5 minutes (or some such small number). The other thing we forget about now, but was expected back then, was that fingerprints would render the display unreadable after a short time. That also turned out not to be an issue (largely thanks to a combination of glass design and a strong backlight).


Amazingly, my 2014 iPad Mini lasts longer on standby than my 2018 iPad Pro. Even with all optional stuff turned off (and no cellular antenna), the new Pro lasts about 2 weeks, while the old Mini lasts a month.

I'm not sure, but I think the difference is that the old iPad can't update to iPadOS, which has worse standby performance. It's pretty frustrating to pick up the Pro after a week of standby and find the battery half depleted.


I saw NT running on an ARM candy-bar smartphone at an internal MS demo in 2004. There was a Start button and it opened Notepad. This was surreal, but not surprising, since NT was engineered to be portable from the start.

MS was dysfunctional at the time, and they continued to ship the abysmal Windows Mobile instead of replacing it with NT. Not that it would have helped; it takes more than a good kernel to make a good smartphone.


"A version of OS X" is potentially a bit misleading for many at that point, because while the iPhone had a slimmed down Darwin kernel and a slice of the same UI framework on top, its process management was completely different.

The first version of the iPhone's OS had a brutal process management approach not unlike PalmOS, as I recall -- with the exception of bits of the system apps, if the app wasn't on screen, it wasn't running.

And that really changes the equation. There's a completely different notion of performance and memory management because you don't have multitasking. There was no window management, no interprocess communication, no unix subsystem. (Essentially no text selection and no copy and paste!)

So it wasn't "running Mac OS X"; that was a red herring. It was running a Mach kernel and a big chunk of Cocoa.

My guess is that people at Palm would have understood this right from that first demo.

And it wouldn't for example have been surprising to anyone at Compaq, because the iPaq series at that point was producing not-too-dissimilar devices in terms of power, able to run Linux and X11 while multitasking.

Likewise it wouldn't have surprised anyone at Nokia that here was a very powerful device with a high resolution screen and a pretty much desktop-capable webkit browser that played streaming video and audio, because Nokia had already shipped a device with some of those characteristics and shipped it 14 months earlier; like many in the UK and Europe I had a webkit-capable 3G device in my hand on the day the iPhone was launched (an E61) and had been using it (and testing websites on it) for months.

So what Apple had done on a technical/OS level wasn't so mindblowing if you knew your way around the decisions others had made, and had a view of the devices on the market (outside the USA where it was seemingly Windows CE or nothing). They were ahead, but they weren't so far ahead as for it to be completely unbelievable.

They had made careful choices and tuned the device for them.

(Seriously: no copy and paste until 2009, right? No 3G.)

The difference was -- as it always is with Apple -- execution.

People (including me!) should have been paying more attention not to this sort of OS-level achievement but to the design and delivery of the UI and apps. Serious thought had been given to capacitive touch and multitouch, which Apple didn't invent but did perfect, right out of the gate.

The CPU performance, the OS technicalities, whether it was OS X or not, didn't sell it to _anyone_. Indeed, Apple confounded anyone interested in features over benefits -- it didn't support third party applications.

What sold it was that it was understandable and comfortable.

> Those first versions of OS X were not snappy.

I ran OS X on day one of the public beta, on a blue and white iMac, and I don't recall its performance being bad (apart from the absence of sound). I remember it being amazing. But it's important to note that OS X on a Mac was a very different operating system; a fully preemptive-multitasked, multi-window OS with a BSD subsystem, IPC, swap, etc. etc., and running on hard disks, not flash.


As a Symbian and Windows Mobile user at the time, let's not kid ourselves. Those smartphones and PDAs were in some respects far superior (high-res screens, cameras, 3G, apps, etc.), but the software was junk compared to the iPhone's, especially the browsers. I did like my HTC Diamond, though (which came after the iPhone).


Yes, the software was not great (though I still found it a very useful mobile terminal). But the software was a legacy product.

The hardware was not -- it was older and it had less RAM and flash, but it wasn't worlds apart. (Indeed as you say they had higher-density displays on some hardware).

And they took those S60 products to market 14, 15 months earlier, with hardware designed much earlier.

The point, again, is learning from the competition, and then execution.

(Execution is why, after all, those Nokia products had to shoehorn Blackberry enterprise support into much more modern hardware; at that point the market cared about Blackberry because Blackberry knew how to execute. The world ran on Blackberry.)

We should expect those who come later to do so with better hardware and software. I think the iPhone was impressive but I still think it is worth putting in the effort to see the context and not pretend that it came out of the blue, fully formed and perfect.


> I ran OS X on day one of the public beta, on a blue and white iMac, and I don't recall its performance being bad (apart from the absence of sound). I remember it being amazing

It was amazing and dog slow. It wasn't until the Intel transition that OS X was on par with classic Mac OS in "snappiness" for me.

Even if a G3 was probably dozens of times faster than a 68k, the window server was doing so much more drawing that there was really no competition. Every single menu and window was composited in real time with transparency and drop shadows. Phew.

I remember running a patcher app (haxies, shadow killer, something like that) to remove all drop shadows and make the damn thing usable.


I was an intern at Apple right around the time of the beta. The performance was bad. They had a program where the best performance improvement of the week would net a top-of-the-line Mac development machine.

They did a whole bunch of tuning over the years, including parallelizing the startup sequence and optimizing linking to make startup faster that really made a difference.


I don't mean to understate the accomplishment, but the pedigree of OS X was NeXTSTEP, which was released in 1989. I'm sure a lot of bloat was added over the decades, but the underpinnings had run on hardware from almost 20 years earlier.


Although the iPhone launched coming into the Leopard era, which wasn't that bad.


That 2004-2007 period is really quite incredible from the perspective of Steve Jobs. To have that many balls up in the air simultaneously and get every decision, even in hindsight, correct while raising children and recovering from his pancreatic cancer diagnosis scare. On top of that, he was under undoubtedly intense pressure from internal feuds, external competitors (Motorola), buyout negotiations (Disney), suppliers (Foxconn), and the telecom industry (AT&T).

This was a man at the peak of his life.


> "recovering from his pancreatic cancer diagnosis scare"

I feel it's important to say here -- for all those with pancreatic cancer or who have lost loved ones to a shockingly awful cancer and who keep hearing about Steve Jobs beating the odds for all that time, that he didn't have a scare.

He had actual pancreatic cancer.

It's just that he was (a tiny bit) luckier with one of the rare diagnoses that has a very high degree of treatability, and as a result vastly better five year survival odds.

But he didn't so much recover as delay treatment unnecessarily in favor of alternative therapies and nutrition.

(Having lost a relative to less treatable pancreatic cancer, learning of this gamble made me furious at him at the time and still really does)

The treatment he might possibly have avoided with earlier targeted intervention, but ultimately received (the Whipple procedure), is effective but brutal, and the long-term knock-on effects of that procedure affected him as they do almost everyone; needing a liver transplant, as he did, has a very high probability among Whipple patients.

(He actually misled investors about the seriousness of his illness when people surmised that he was experiencing the longer-term side effects of his procedure.)

An aside: as much as people like to claim Tim Cook doesn't have Jobs's guts or heart or bravery, he literally offered his boss a part of his own (compatible) liver to save his life. Tim Cook is brave and honourable.


Well, he rolled the dice, because from what I understand the treatment at the time still carried some percentage of serious complications. If he had elected to do so and unluckily suffered some complication after the procedure in 2004, few, if any, of the subsequent events would have occurred.


Alas, not so at all -- the treatment he should have had at the moment of diagnosis was simple and much less radical, with far fewer side effects, than the treatment he had nine months later when it had spread.

I correct this story not to be rude or to say he was an idiot (he was not, at all -- just scared because it is terrifying), but to try to counter the unrealistic things people still hear about Jobs beating the pancreatic cancer odds.


Even the simpler treatment for his special type of curable cancer would have had some % possibility of severe complications, correct? That is why I imagine he decided not to risk it at that critical juncture and instead bet on his alternative treatment regimen.


That picture of the "Wallabies" https://twitter.com/kocienda/status/880451736049139712 suggests that the often-repeated claim about the hardware and software teams working in mutual ignorance is overblown. The Wallabies really seem not so far from final hardware, especially compared to (IIRC) some other devkits for old mobile devices. It would have been quite clear that the release hardware would be substantially thinner and lighter (assuming those things are fairly heavy, as well as bulky), and if you imagine them so, you get something pretty close to the original iPhone (aside from the metal-case flash and sizzle).


I always heard things were almost always on a need to know basis. Designing a keyboard kind of requires an approximate form-factor. Like the sibling comment says, "Wallabies" were a touchscreen in a housing. It sounds like other developers were writing software for a generic grey box. To me, that makes sense if the hardware wasn't known even if they weren't trying to be covert.


Seems they were truly just touchscreens in a housing. https://twitter.com/kocienda/status/880522635112726529


FTA: "There is a certain kind of Linux enthusiast who thinks the answer to any technical problem can be found in Linux."


The comparison of Facebook and Apple is a weird one. Apple is good at innovation. Facebook is good at acquisitions.


> to scale the iPod’s embedded Linux OS up to serve as a phone OS

The OS of the iPod was Linux-based?

edit: hm I should have read all of the article before commenting, although the way it is written at the start is quite misleading, I find


> The OS of the iPod was Linux-based?

It wasn't. For those who didn't read the article: Apple was considering (not very seriously) replacing the iPod OS with one based on Linux, as the existing OS codebase was such a mess it was bordering on unmaintainable.

Obviously, the idea was not implemented, but it was this proposed OS that would have been "scaled up" to the iPhone.


Jobs, if I recall correctly, had tried to hire Linus to work on Darwin. I don't believe the Linux on iPhone/iPad project had any meaningful, realistic traction beyond engineering proposing it as a path forward and building the prototypes on it. My impression was that it was a nonstarter politically.


One of the first jailbreaks gave you a shell to the file system IIRC


Maybe I missed it, but was the original TED demo of the multitouch prototype part of this? I thought that had a heavy influence on the design of how the iPhone software would work.


[flagged]


"Please don't complain about tangential annoyances—things like article or website formats, name collisions, or back-button breakage. They're too common to be interesting."

https://news.ycombinator.com/newsguidelines.html


It's perfect on mobile. You can zoom, there's no cookie popups, there's no prompt asking you to sign up for a mailing list, and the whole article loads at once without needing to click a "continue" button. There's no ads. It's just a clean site that you can zoom in on to read.

All sites should be this good.


Don’t all mobile browsers support double-tapping to zoom into logical segments? Or is that just mobile Safari? Because that’s all I do for it.


I think it could be much better. You have to manually zoom into the article area on every page, and every time I scroll down it tends to drift to one side a bit, since there is space to either side of the article. (I know mobile browsers tend to suppress horizontal scrolling when they sense you are scrolling vertically, but not every swipe is perfect enough for them to do this.) Considering the desktop and mobile versions are identical, I suspect no effort was made to make it mobile-friendly.


I recall some comment he made about wanting to do a “final site redesign” to make it responsive at some point, but then never got around to it.

Still, it’s a bit of an odd oversight, because this is the simplest of 1-column-with-a-sidebar layouts, and it’s just a matter of getting that sidebar nav out of the way.


He has spoken about it many times in his podcast.

In short, Gruber is obsessive/perfectionist/lazy, the current site is OK on mobile with a double tap, making it “just right” would mean more work for him than what it looks like on the surface, etc.

He is the kind of guy who took years to choose the right shade of slate blue, and still uses Verdana because of the way it looked without anti-aliasing on low-res CRTs running Netscape/IE.

Basically, a lot of thought on small things, to a paralyzing degree.


Interestingly, Chrome on my Android prompted me to view the page in 'simplified mode'. I'd never seen that feature prompted before.


I just use Reader Mode


Double-tapping the text is also a decent solution, as it zooms in enough to fit the text column to the screen.



