Understand that PCs were end-user fixable and upgradeable by accident.
The intention was never for you, the person who bought your computer, to be able to pop a new CPU into your Compaq five years later.
Standardized, modular parts were always all about the OEMs, who were all trying to build an industry by cloning the IBM PC. They could have a variety of manufacturers making various components and as long as everybody followed the standards, they could pick and choose parts to create SKUs for various needs and then sell them to you, the end-user.
The fact that a twenty year-old me could spend a summer saving up for the parts to build a brand-new Pentium II computer and then spend an exciting weekend putting it all together was a side effect of all that.
A side effect many of us enjoy, obviously. And a side effect that developed into a small industry of its own, with various companies (even Intel and AMD at times) targeting the "enthusiast builder" community.
When you understand this, though, things like soldered-on memory and CPUs make sense. The reasons why your 1988 Compaq had standardized IDE drives and memory slots are utterly inapplicable to a 2015 MacBook.
I hate the non-hackable place to which PC hardware is headed, but I don't get angry at it.
Almost no other category of consumer good works in the modular way that we nerds wish everything would. Aside from the fact that they run on the same gasoline, you can't put Chevy stuff into a Ford; you can't even put Ford F-150 stuff into a Ford Fusion. Your toaster oven, your ceiling fan, and your lawn mower are the same way.
Yeah, I hear where you're coming from, but all of those parts are still serviceable. If your starter goes on your Ford car, you might have to buy a Ford starter (or a third party starter for a Ford car), but it can still be fixed with a commonly accessible set of relatively inexpensive tools.
If the RAM in my Macbook gets cooked, either I can buy an entire new motherboard, or a new laptop. It's throwing the baby out with the bath water, and the truck/lawnmower example you've mentioned doesn't extend to that.
"Moore's Law" is probably part of the reason we're okay with this: by the time RAM/CPU/mobo start failing, a laptop is 5 (or 6 or 7 or 10) years old and the laptop's a dinosaur anyway. Heck, by that point a lot of us are probably hoping our laptops fail so we have an excuse to get a new one.
I wonder if we'll stop being okay with this eventually, now that desktop/laptop CPUs aren't getting faster at the rate they used to. My 2011 Macbook Pro is still plenty fast enough for me. It would be nice if it lasted a long time.
We've hit a plateau on overall compute speed. There are incremental improvements, but the difference in performance over 5 years today is much less than it was 7-10 years ago.
My 10 year old laptop is awesome, and it's a Thinkpad too. I have fixed it, replacing a fan, upgrading RAM, etc... all good, but its performance isn't so hot. Buying new made a lot of sense.
But my current 4-5 year old, high-end laptop is still competitive in a way that didn't used to be true.
I've a 2012 MBP, and it's still plenty fast. The newer machines are nice, but it's not like the 2012 is irrelevant, and that particular MBP is serviceable. Just barely, but possible. I've done a number of things on it.
The new machines now just aren't. So even the expert user service is off the table, and I find that hard to deal with personally, because I know I absolutely do plan on running the machine for 5+ years.
Apple is still selling the old, user-upgradable model of the MacBook Pro for $1100 (http://www.apple.com/shop/buy-mac/macbook-pro), the same price I bought one for in 2011. Except mine was refurbished and had a smaller HD.
Some of the new machines Apple is selling for $1000 are actually slower at many tasks. But the market has spoken, and Apple is selling hardware that suits most users' narrow tasks of email, browsing and video. It also streamlines manufacturing, as they're already focused on mobile, where everything must be stuck together in one chip.
But they're enabling 4K video editing on iPhones and iPad Pros with what I'm assuming is custom hardware. And that could be a very smart approach if it was applied to more tasks on Pro computers. Performance could actually move forward again.
Indeed. Custom hardware can move us forward, and that's an interesting trend that may actually drive me to buy new machines sooner.
One of my daily driver tasks is CAD. Mechanical CAD. Due to the age and massive human investment in geometry kernels, single thread performance really matters. It is difficult to apply multi core processing to CAD problems.
It would be interesting to understand what special hardware could improve on this.
They are actually working on that. With DRM, DMCA, shitty software, etc. They want you to go back to the vendor to make sure things are fixed. Repairable? Only by authorized service dealers. At a monopolistic cost.
And in the US you can thank the EPA for helping with that.
Who is "they"? Apple? Because they USED to sell laptops that were serviceable, and have slowly over time migrated to ones that aren't. Not because of software, but in order to reduce the size.
If you want a 1kg notebook that's 100x the power of a room-sized $40,000,000 Cray 2 from back when, you'll have to leave off things like connectors, removable covers, generalized internal interfaces, empty space (for varying-sized components), beefier batteries (to run whatever you plug in), etc. Some of us value compact over modular.
You want "repairable by user for cheap"? Such is eminently available, and will decidedly not weigh 1kg.
You're right, my Thinkpad X201t weighs 2.89 lbs. That's 1.31kg, it's so difficult to lug around but at least I can replace the hard drive, RAM, keyboard, CPU fan, battery, and fundamentally keep it maintained until the processor fails.
Probably means CARB, where states that have adopted California's emissions regulations require certain parts of the vehicles be replaced by special CARB-approved versions when they fail. Even if they're exactly the same as non-carb approved versions yet cost 2-3x as much.
If your engine block is damaged in your Corvette, you'll probably need a whole new engine block. Modern engines are using increasingly exotic materials & construction that are ever less repairable.
Your RAM can be replaced by an expert with the right tools, but computing hardware has become so cheap that it often simply isn't worth hiring an expert to perform serious repairs.
Amen to that. I am seriously thinking about going out and buying a 1975-1985 or so Chevy C-10 pickup with a small block V8. Those things are dead simple to work on, parts are cheap and plentiful and readily available, and there's plenty of room in the engine compartment to get in there and work.
Contrast that with my 2000 Ford Expedition where you have to jack the truck up and crawl underneath just to change spark plugs. Never mind the need for more specialized tools when you're dealing with fuel injection and messing with the ECM, etc.
Yeah, I think it's time to go back to older cars. I want something I can work on myself without needing a $5,000 tool that I'll use once, and that you probably can't buy in the first place. :-(
Hi-fi stereo sound is strongly out of cultural favor; tinny little speakers and 64k MP3 quality mean you're conspicuously consuming smartphones, which is more important. However, component hi-fi stereo, the 80s-era home entertainment system, was a very modular thing in the past. From my wife's point of view it's one octopus-like appliance, but to mine it was a collection of amp, speakers, and legacy media players.
I remember seeing a vintage studio recorder at one point; the manual it came with included all of the schematics of the electronics, so that you could make your own replacement parts. That's some future-proofing right there - the device's creator (probably Philips) knew they wouldn't last forever, while their device probably would. It did cost a fortune back then, though, I have to add.
If you buy a pair of Sony MDR-7506 headphones -- they've been around since 1991, still being made, under $100 on sale -- you get an exploded diagram with part numbers for everything.
At the end of the cable is a 3.5mm TRS phone jack connector, which has a passive adapter to 6.35mm (1/4"). The 1/4 inch phone jack was invented in 1878, for use in the repetitive insertion environment of a manual telephone switchboard.
Basically every set of headphones you've used in your life has used a phone plug/jack connector. It's a pretty good bet that a hundred years from now there will still be people using phone plug/jack systems to connect up systems where some part needs 2-4 wires and will be repeatedly removed and re-inserted.
Probably because the observation is accurate but it's mixed in with snark like "conspicuously consuming smartphones." Component systems were certainly modular although I honestly don't know if I were starting over today I'd go with a big multi-speaker stereo system over something like a Sonos.
Although, even bicycles (at least those designed for racing) are starting to become more and more proprietary. Try mixing and matching Shimano and Campagnolo shifters and derailleurs or using a seatpost from an aerodynamic frame in another one.
Have Campagnolo components ever been compatible with other makers' parts? As far as I know Shimano and SRAM are compatible, and most of the smaller players work the same way. Campagnolo, AFAIK, is the only outlier here.
As for the seat post, it's these molded carbon frames, seemingly built for short use and then the trash, that are problematic in my eyes. They're just wasteful, and most of the people buying them are doing it just to buy some time in their various races.
Having gotten into triathlons a bit lately, I'm constantly annoyed that someone can literally buy gear to make themselves minutes faster, even though our engines may be comparable. I wish the organizing bodies would standardize the equipment for competition.
In the days of friction shifting, it used to be common to mix and match components across the entire drivetrain. You could have shifters, derailleurs, chains, freewheels, and hubs all from different manufacturers.
Drivelines are more integrated now, with derailleurs, cog tooth profiles, chains, and shifters designed as a system with minimal thought given to cross compatibility. At best you could use a non-Campy/non-Shimano chain.
My understanding is that they aren't, at least where we are talking about indexed shifting (which is pretty much everything these days). The amount of cable pull per "click" is different between SRAM and Shimano, which means you can't use a Shimano shifter with a SRAM derailleur. Or maybe that's just true in MTB world and not road bike world, not sure.
OTOH, chains, cogs and chainrings are pretty much compatible, at least within the constraints imposed by the changes in chain width to accommodate the newer, denser cassettes. E.g., a 10-speed or 9-speed chain won't work with an 11-speed drivetrain.
The flip side to this (triathlon gear) is ITU draft-legal racing, where it's basically like a criterium and requires the same UCI-compliant bike setups as a road race would. This solves the problem in one way but creates a new one. Take a look at the world standings (for women, especially). Gwen Jorgensen is a beast, for sure, but she wins because she's a ridiculously strong runner who managed to get her swim good enough so she could hang with the leaders. On the bike she just has to ride with the pack... then, like Mirinda Carfrae in Ironman, she leaves everyone in her dust on the run. I'm not sure whether this is fairer or not.
Not really. Exceptional cyclists who are average runners still do very well at full iron distances. "Average runner" meaning they can crank out a 3:15-3:30 marathon after the first two events (so not really "average" at all).
That said, people who come into triathlon from a running background do tend to do better than swimmers or cyclists who have to learn the other two events.
"Understand that PCs were end-user fixable and upgradeable by accident."
I think if that were the case, IBM would not have released such complete documentation with the PC, XT, and AT. They could've just as well kept it to themselves, like they started to do with the MCA-based systems (PS/2).
The IBM PC design was a fabulously successful attempt to co-opt the much more accidental S-100 ecosystem. S-100 software and hardware could be easily ported to the PC, solving the chicken/egg problem inherent in creating a new ecosystem to compete with established ones.
The one mistake they made was assuming they could keep control by controlling the BIOS.
That information was primarily intended for OEMs, system integrators, and service technicians. They weren't really concerned about little Johnny down the street upgrading his computer... they certainly weren't trying to prevent that but it's not why they created those specs.
You're hopelessly, utterly wrong. I refer you to, for just one of millions of examples, the Apple II reference manual: https://archive.org/details/applerefjan78. The early culture of PCs was one of end-user fixing and upgrading by default, often primarily.
The early history of PCs included sending away for a box of parts and a manual and then soldering the pieces together yourself. And then when you put the kit together, there was no software, so you had to write it yourself. Also, there was no keyboard, so you had to input the machine code for your software directly into RAM using toggle switches. And there was no real display, just a few LEDs that you could blink on and off.
And then the Mac was sealed. You needed a special tool just to open the case.
I upgraded the RAM in my Atari ST by solder-sucking 16*(16+2) through-holes and soldering the additional chips and caps myself. Not exactly modular by design.
For brevity's sake, I talked about IBM PC hardware because that's what our current x86 hardware is directly descended from.
(If you typed your post on some crazy descendant of the 6502 and/or the Apple II's architecture... then I stand corrected, and please share details of your setup!)
In fact, the fundamental success of the PC/XT line was that it had increased user-expandability and maintainability over the competitors, as documented extensively in all accounts of the period.
And we're all typing on crazy descendants of the Apple II architecture, because pieces of it informed all subsequent architectures:
"It was also decided along the way that the machine would have an "open" architecture, like that of the Apple family. IBM would include slots under the cover of the machine that could accommodate plug-in boards that would add features or even change the entire personality of the PC. And, to make it relatively easy for outside companies to participate in the building of the PC market, IBM would publish a Technical Reference Manual with the entire set of electrical schematics for the machine and a full explanation and printout of the ROM-based BIOS (Basic input Output System) that provides the hooks into the machine for hardware and software. The ROM BIOS and the IBM logo were actually the only elements of the entire machine that bore an IBM copyright." (http://www.atarimagazines.com/creative/v10n11/298_IBM_coloss...)
Frankly, your lack of understanding of the history you're so confidently asserting facts about is beyond shocking. I'm sorry if that seems mean-spirited, but there's no other way to put it.
This is a very American point of view. German and Japanese cars are very modular; you can take certain parts from a 3 series and use them in a 7 series, and vice-versa. The transmission in the Toyota 86 is straight out of a Lexus IS.
This is actually true for American cars as well. I used "Ford Fusion" and "Ford F-150" in my example because one's a family sedan and one's a big pickup truck. I bet the Fusion shares more than a few parts with the Focus.
At the very least, each American manufacturer shares engines and transmissions across its various models.
80s and 90s American car interiors were sort of quietly hilarious. They shared common parts like crazy. You'd see the same gearshift, steering wheel, and so forth on various models. Maybe that's why it was a really ugly era in automotive interiors; their designs were constrained by all the common parts.
And the transmissions from e36 and e39 (early-mid 2000s) are built by GM. There is actually a lot of cross platform and cross manufacturer parts rattling around in our vehicles.
Honda is a great example of modular components. Taking a d-series and swapping it with a b-16/18/20 and even k-series with just a change in motor mounts/axles/ecu. It takes an afternoon to swap it over.
Of course as you get further into the aftermarket realm things like bolt-in conversions for swapping 2jz into 90s mustangs becomes possible. Fabbing up some brackets isn't that hard, conversely, desoldering a cpu or dealing with non-modular computer components is a much more daunting task to me. The parts are too small and fragile.
For some brands, there are retrofit fans, who take parts from newer models and put them into older, to get features that were never offered when their cars were made.
It's not really a nationality-based thing (although Toyota and VW do have the largest volumes on a single platform). There's been a significant shift toward modular automobile platforms in general. Some numbers I saw recently indicated that almost half of passenger cars were based on the top 20 platforms.
I'm not sure I buy the contrast between old computers and macbooks. The smallest replaceable part on a MacBook is the entire MacBook (excluding any wired or wireless peripherals), but the smallest in a normal computer might be the CPU. The CPU cannot be repaired except by replacing the entire thing. You can't repair a CPU or a MacBook, so it seems the only real difference is how much it costs to replace the broken part.
I like to think that as Moore's law ends and computers stop becoming obsolete, we may get to a point where a computer could be passed down through generations like a solid dining room table. In that case maybe it makes more sense to be able to repair it, and people will build them that way. Or maybe computers will cost nothing and it will be like bequeathing a Pez dispenser.
>I'm not sure I buy the contrast between old computers and macbooks. The smallest replaceable part on a MacBook is the entire MacBook (excluding any wired or wireless peripherals), but the smallest in a normal computer might be the CPU. The CPU cannot be repaired except by replacing the entire thing. You can't repair a CPU or a MacBook, so it seems the only real difference is how much it costs to replace the broken part.
Only, in my experience at least, the CPU is the LEAST likely part to break. If it were the most commonly broken part, your argument would have some merit.
Besides, he also talks about upgrading. An old laptop HD that has 1/4 the space and much less speed than a modern hard disk but can't be replaced is as good as broken, even when it's not. And then there's the memory that people want to upgrade and can't.
> The smallest replaceable part on a MacBook is the entire MacBook
this is where the Bourne comparison needs to come back into the discussion, because it depends who is doing the replacing.
My partner, my uncle and I have had three recent macbooks experience sudden unintended swamping, two with water, one with red wine. charlie fixed them all.
The apple geniuses - james bond company men all - carefully removed the screws from my computer, reverently placing them in actual surgical kidney dishes, then looked at me with the practiced empathy on their young beautiful faces, shook their heads slowly without breaking eye contact and said 'logic board' - the macbook cancer diagnosis.
magic charlie is the Bourne in this story.
I met charlie at his office, a claustrophobic box on a suburban shopping strip, tobacco fogged,
every horizontal surface except for a few stepping stones between charlie and the door
fractalling out jagged piles of derelict dismembered macbooks, maybe about a thousand, in pieces,
some with surprisingly brutal injuries, a fist-sized hole through a 15 inch retina.
I was lucky to meet him, the shop had been open a year, he was planning on closing it down and working out of home.
he got few walk in customers,
the few that did seemed unsettled by his casual disemboweling of their pristine objects of beauty
atop of a mound of incomplete repairs, crowded by ashtrays - most of them overflowing, some for screws.
Those people would leave to take their computers to apple licensed stores,
who - in violation of licensing agreements, but in obedience with the imperative of profit -
sent the work to charlie. He charged the same price either way.
If a macbook is atomic, charlie is a quarks man, I watched for about ten minutes until he kicked me out,
as he scrounged chips from the racks of donor laptops,
desoldering and re-soldering tiny components into the open heart of my machine.
The only explanations he gave of the undertaking were when he pointed at a chip on the logic board and said 'sound',
then lifted the board to eye level, held at two diagonal corners in his finger tips, and slowly spun the board through a full rotation, then made eye contact for the one and only time and said 'keyboard'.
When I came back the shop was shut. For about a week no-one answered the phone.
Then I got a message from a woman on my phone "your computer's ready".
I dropped by charlie's, I wanted to speak with him a moment, to emote my awe and gratitude,
he wasn't interested, he didn't look up.
the woman, standing in one of the bare-floored stepping stones took my money, don't remember how much, but less than a hundred.
Don't be sad about it, there are plenty of Charlies.
I knew a Charlie in Hollywood, CA. He was an old cranky Asian-American guy, greying beard, real guru stereotype, had a big store full of hi-fi and guitar and studio equipment. Every studio engineer in SoCal knew that if they needed something fixed, fast, no questions asked, you took your synth/guitar/studio equipment to him. He looked at you, told you exactly how much, exactly when to come back, don't argue with him or leave the shop immediately.
In the 90's I built my synth studio by buying cheap stuff that was considered borked - SH101's, Pro One, DX's, etc. - by some studio mishap - spilled wine, homicidal drummer, etc. All of it broken somehow, and thus cheap. A Saturday afternoon with Charlie, and it was like it was brand new.
So the point is, there are plenty of 'em out there, still doing the work. The key to it is always remember one thing, and one thing only: hardware rules us all. Working hardware, even more so.
> as Moore's law ends and computers stop being obsolete we may get to a point where a computer could be passed down generations like a solid dining room table
I had to smile at this. I just handed down my old iPhone 3GS to my daughter to use as she's started secondary school. I bought it in 2009 and it's now on its fourth owner - me, my wife, her father and now our daughter - so it's been used by three generations of our family. That's only 6 years, but counting. Getting it set up for her reminded me what a great device it is, and it was remarkable to me that it's still very usable even today. It still has a load of photos and videos on it (all backed up into iPhoto already, of course) and it was a lot of fun going through them all together on the device they were taken with.
I think it's this last point that relates to the Pez dispenser concern. Computers are still currently highly personal devices that capture and provide access to so much of our lives, but as more of that is persisted to cloud services, the devices themselves will become less central to the experience.
It's a shame this doesn't apply to the original iPad... no OS updates for quite a while, so you can't use the majority of apps (as they don't offer older versions) and the browser regularly crashes.
It was excellent hardware, let down by just becoming obsolete through age.
One website that works perfectly on the original iPad is HN - the browser seems to crash on more complex pages so I've been assuming that it's a lack of RAM.
[I still have a couple of original iPads in addition to an Air 2]
The other difference is the rate at which the part fails. CPUs basically last forever if you keep them sufficiently cooled. Batteries or screen backlights or keyboards, not so much.
But CPU spring connectors don't last forever. If you want to ensure the CPU stays reliable for a good long time (at today's clock rates), you're going to have to solder it down.
Contacts are a source of trouble, especially in devices that are subjected to vibration. That's why they're sometimes soldered down. For stationary devices this is much less of a concern, the long term killer there is contact oxidization. Since CPU contacts are made of gold this is not usually a source of concern. Even circuit boards made in the 70's with regular sockets still function fine today, the 'bad' parts are usually electrolytic capacitors (they dry out) and tantalum ones (they like to change into miniature fireworks).
I got a great score on an exotic DMM from the 80s where the only problem was a weak ROM socket. Machine pin sockets might be OK (if you're always inserting a brand new IC) but dual wipe definitely wears out.
Gold plating only works for expensive parts, and only if you have the same plating on both parts. The 10u gold alloys on flashy audio equipment look great, but they aren't likely to be any more reliable than tin. A gold socket can't save a tin-balled CPU: http://www.te.com/documentation/whitepapers/pdf/p316-90.pdf
Just google; there's a massive amount of industry data. One term is "insertion cycle", another is "loss of force". A typical soldered PCB connection will always last longer than a comparable mechanical one (comparing cheap to cheap and exotic to exotic, so no fair comparing a corroded solder joint in a throwaway greeting card to the backplanes in an F/A-18).
The term "Mechanical Durability (min.): 30 cycles" means that this socket is only guaranteed for 30 insertion/removal cycles and they don't specify a lifetime. Is that lower than you were expecting? Making connectors is hard!
> The smallest replaceable part on a MacBook is the entire MacBook
It wasn't always like that. In my 2011 Macbook Pro, I can replace the memory, hard disk and optical drive (with an SSD, for example), and - revolutionary for Apple - the manual even tells me how. It's sad they moved away from that.
I also like how an old car like the Citroen 2CV is so easily hackable, and many car enthusiasts (and even the manufacturer) have hacked that car in numerous ways. It's not being made anymore, and I don't think there's any comparable car available nowadays.
The “soldered to the motherboard” memory on a $400 computer today costs 10,000 times less than the same quantity of memory in 1995.
What’s the point of making something repairable generations down the line when nobody is going to still want it, because you’ll be able to buy 10 new ones that are each 100 times better for the same price?
The only computer parts from <2000 that are still worth keeping around today are the keyboards, and maybe the laser printers.
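For a rough sense of scale, here's a back-of-the-envelope check of that 10,000x figure. The 1995 and 2015 prices are ballpark assumptions on my part, not sourced numbers:

```python
# Ballpark street prices (assumed, not sourced):
# ~1995: DRAM retailed at roughly $30 per MB.
# ~2015: commodity laptop DRAM runs roughly $3 per GB.
price_per_mb_1995 = 30.0                      # USD per MB (assumption)
price_per_gb_2015 = 3.0                       # USD per GB (assumption)
price_per_mb_2015 = price_per_gb_2015 / 1024  # convert to USD per MB

ratio = price_per_mb_1995 / price_per_mb_2015
print(f"price dropped by a factor of ~{ratio:,.0f}")
# -> price dropped by a factor of ~10,240
```

With those assumed prices the factor lands right around 10,000; nudging either assumption moves it a few multiples in either direction, but the order of magnitude holds.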
>What’s the point of making something repairable generations down the line when nobody is going to still want it because you’ll be able to buy 10 new ones that are each 100 times better for the same price.
If you're buying stuff for your 1995 laptop, yes, you have a point.
If I want to double the memory of my 2014 laptop and can't, because it's soldered-on 4GB, OTOH, I'm hosed.
The soldered memory also tends to cost the end user double what exactly the same user-replaceable memory costs, just because they can charge more for the "upgrade".
> The only computer parts from <2000 that are still worth keeping around today are the keyboards, and maybe the laser printers.
And yet I have a Dell Latitude CPx laptop from that era that is perfectly usable, and common parts are readily available via eBay (RAM, hard drives, battery, screen, etc). No, it's not going to play any modern games and no, it's not a good choice for compiling software. But OpenBSD, Slackware, and Debian all run surprisingly well on it given a light weight window manager. It's a good backup device for when my main laptop (also a "relic", a Dell Latitude D400 from 2003) is unavailable.
"The world is seven billion people swimming in a boiling froth of water, oil, guns, steel, race, sex, language, wisdom, secrets, hate, love, pain and TCP/IP."
> More to the point, this is why the soi-disant-designer snob ... comes across as such a douchebag. It’s not “minimalist” if you buy a new one every two years; it’s conspicuous consumption with chamfered edges.
Some people certainly were; many didn't (or did, and gave the old one to someone else to use for a while). I study CS and I think most people have had their laptops for 3 or 4 years now. (The only exception pattern I can think of is people who jumped at Retina screens.)
I'm still using my mid-2009 macbook. The battery is not that unremovable, it's just screwed in. I upgraded to 8GB of RAM and an SSD, as most have. Thanks to Apple's move to super-weak hardware, you can't buy a new one that's better for the same price.
Ah, he found me out. I'm buying a new laptop every three years to impress strangers so they think I'm rich and are impressed with my status. Definitely not because my entire existence has happened in a time when the number of transistors on a chip doubling every 18 months was essentially considered a fact of nature.
Just because the next model has twice the whatever doesn't mean you need to get it. The only reason people upgrade on such a nice, predictable cycle is because of planned obsolescence.
It is interesting to compare Bond with Smiley. While Bond is who the salesman aspired to be, Smiley is who he would actually become.
The expense account is a reptile fund, overdrawn by senior management for spurious expenses; he travels, but to cold and boring places, usually on a budget and staying out of sight in cramped hotels and safe houses.
Instead of sleeping with beautiful women, he spends most of his time working, and being cheated on by his detached wife.
And when he actually makes it to the top, he just finds himself overburdened with political machinations and stakeholder management, longing for the time when he'd actually get something done.
I watched a fascinating interview with a former MI6 manager. This was probably about 10 years ago. The interviewer made some comment about the men she had to manage: the agents. She said, effectively 'men?'.
The Grey Man principle is a good one, as far as it goes, except maybe the gender.
Fictional spies are interesting reflections of social angst, but I think we're about ready now for a fictional Valerie Plame.
What does gender truly have to do with the "Grey Man" idea? The core idea as I see it is to blend in with your surroundings, be "boring grey" instead standing out. Sure, "Grey Man" has the word "Man" in it, but I think the operative word in it is "Grey."
Oh, I wasn't trying to make a massive point on gender, beyond pointing to the fact that fictional spies tend to be male. My interpretation of 'grey man' is not so much blending in, as not being the kind of person one would expect to be a spy. A person who 'looks like they can handle themselves' is probably less useful than an inconspicuous office intern. It would be interesting to see a fictional spy that was believable in that way.
I'm trying to remember where. It was a TV program in the UK, and the interview took place in an unused London Underground station, I think. Sorry. ...
[edit 1:] Was it some kind of show where they trained people to be spies for the camera? I seem to also remember footage of someone having to plant a letter in their mother's handbag without being seen by them. Sorry for not being helpful. Like I said I wasn't trying to make a significant point: the discussion of spies just reminded me.
[edit 2:] Can't find a YT clip of that bit, but the show was 'Spymaster', or 'Spy'. I hope I'm not remembering it too wrongly.
Very perceptive. Introducing a third data point helps to clarify that there is more to the story of how we went from Bond to Bourne than just millennial job anxiety. There was quiet desperation back in the Bond era too.
Well then we're all Bob Howard from the Laundry: the unassuming backroom guy who knows his way around a computer yet surprisingly can hold it together when confronted with gibbering Lovecraftian monsters from beyond the void. He has an iPhone that can kill demons, senior management are keen to push him up the ladder, and he has a wife who is clearly out of his league.
This might be an interesting reply: https://www.fairphone.com/phone/ - it's not exactly a clunky brick, and it's completely modular and user-serviceable (at the level of modules, anyhow). Apple didn't pare down the openness; they designed it out. They've been actively hostile to user servicing since the toaster Mac (which you couldn't even open without a proprietary tool).
My hobby is photography. I love my Nikon D5300 to bits, but after a year of usage two buttons are stuck and the eyecup needs replacing. And then there are people using twin reflex cameras from the 50s and they run like butter...
Thank you for the advice. I kind of reckoned one would need to. I'm about to buy one and am currently looking into whether there are shops that do repairs, if need be.
Could it be because your camera is now pretty much a whole photography studio in a compact package, and to make it as sturdy as those old cameras that had three moving parts it will need to cost twice as much?
Of course. I believe it's because of the manufacturers' (and the consumers') changed priorities. I think when a product arrives at a certain level of quality, it's difficult to see a difference unless you have very good knowledge of the subject matter. For example, I don't know how to tell the difference in quality between Carl Zeiss Planar and Tessar lenses, or when I would need to.
But as a company you need to differentiate yourself. Hence, you go digital, then you introduce all kinds of gizmos, built-in HDR processing, retouching capabilities, etc., etc. You save on the build quality to be able to claim the pole position in the megapixel race.
Understandable, but does not cover the whole market.
I don't really agree, at least when it comes to photography. Photographers care more about low light performance than they do about megapixels, for example, but there's a simple reason for build quality taking a back seat to features:
Fifty years ago, next year's camera would be 3% better. Nowadays, next year's camera will be 100% better. Why would you spend money on build quality for a camera that will be de facto obsolete in two or three years?
Modularity has advantages (repairability, upgradability, switching out manufacturers) and also disadvantages (cost, design constraints).
The way I see it, technological advances are changing how big those advantages and disadvantages are. I still remember how soldered-on CPUs were seen as a problem with laptops, yet I don't know anyone who ever upgraded their CPU (it might have been more common earlier). Laptop batteries today are way better than they used to be: they used to last maybe 2 or 3 years before their runtimes became unusable. And the development of RAM is stagnating.
And of course Apple now is so big that they can just dictate proprietary connectors.
I really, really miss the days when there was more than one manufacturer of computers.
I think it was in 2002/2003 when I was last in the market for a non-apple device and there was a lot of interesting competition ... thinkpads, sony vaio, HP ... Dell had some great laptops ... there was actually a reason to scroll through gizmodo/engadget and look at new products coming out.
Now there's only one.[1]
I really do think that the macbook air is the final evolution of the laptop computer. Yes, I do wish I could pull the battery and yes I do miss PCMCIA and yes I cannot use a laptop with a single port[2] but the 13" MBA that I bought in late 2008 lasted 7 years of hard use and three battery swap-outs (done by apple, of course).[3]
I don't really understand why other manufacturers can't make a MBA ... I'd like to see some minor competition along the very, very final bits of design latitude you can have in this thoroughly distilled form factor ... but for whatever reason Dell/HP/Sony/Lenovo refuse to just make their laptops out of metal for god's sake.
I wonder if this is where we are going with high-end cars? I notice the exact same inability to function on the part of all the incumbent car manufacturers. How many years is it since the Prius came out, and how many since the Tesla Roadster, and Porsche has only just announced an all-electric car, and of course it's a concept car?[4] BMW has a brand-new 7 Series with the predictably lame hybrid option mated to some little lawnmower engine. Just concept cars and half-assed hybrids over and over again. Meanwhile, Tesla has a dual-motor, AWD supercar for sale.
There used to be more than one maker of high end four door luxury cars ...
[1] Well, actually there's two - apple for laptops/desktops and supermicro (god bless them) for anything that goes in a rack.
[2] New macbook 12" with the single combined USB/power.
[3] Actually it still works now, but I get a new 11" MBA.
Tesla is [not there yet](http://jalopnik.com/heres-what-a-tesla-model-s-can-do-around...), and all these carmakers have to sell cars that 1) handle well and 2) do not go into reduced-performance mode after some burst of activity. So I expect a lot of 7ers to be sold with their lawnmower engines. If I were in the market for a nice sedan, I would also pick the G11 over the Model S.
P.S. With regard to Lenovo: have you ever seen the 'For those who do' series of ads? I would never dare take any Apple laptop into such an environment. No problem with ThinkPads ;). And even in coffee-house conditions, having built-in mobile broadband or a smart-card reader is nice.
The thing that keeps me on them is that no other hardware/OS combo is even in the same ballpark for battery life as Apple mobile/iOS and Macbooks/OSX. Maybe, maybe high-end Lenovos or something with a carefully-tweaked install of Linux could come close, but probably not without sacrificing features and/or convenience.
> I don't really understand why other manufacturers can't make a MBA...
Lenovo's X1 Carbon is that machine for me. I carried an Air (running Debian) for a couple of years. I've since replaced it with an X1 Carbon, and there's nothing I miss about the Air. The Carbon is simply fantastic.
That's... a lot of personal preference. I won't really address how you're allowing for Apple and Apple only, but just for one thing: I would really dislike a laptop made of metal.
It's mostly nonsense. You can invent comparisons like this, which are really just half-baked and not very convincing attempts at social science deconstruction, between any number of things and it may appear to be just as thought provoking.
I am sure that the generation that watched James Bond in the 1960s trusted their employers more than action-movie watchers of the last decade, but the Bourne stories are about government conspiracies, and the first book was written even before the Iran-Contra deals were known and before President Reagan took office.
I'm on my 3rd thinkpad. First one was a T30 that I got 12 years ago. I used it for 4 years and my mother for 4 years after that. I still have it and it still runs but it is now made out of parts from two T30s, the original, whose screen died, and another that had a bad hard drive. My T60p I used for 4 years and then my mother for 4 more years, it had to have its cpu fan replaced once about 6 years into its life, otherwise it is sitting here next to me in perfect working order.
I'm 3 years into a first gen X1 Carbon, it runs perfectly. The only annoyance is that I can't get at the battery to replace it. I'd be a bit happier if it had a 2560x1440 screen since then there would really be no need for an upgrade ever again, but as it stands I expect to be able to use it for at least another 5 years.
Build quality on the newer T series feels like shit to me, but I haven't spent enough time with them to be able to judge and they are so new we don't really have any data yet.
edit: having seen the link to the potential retro ThinkPad, that one I would get without a second thought
I'm on the 3443CTO revision of the X1 Carbon. The "carbon" part of it is starting to fall apart, the palmrest creaks, and I really should have bought it a week later, when the i7 with 8GB of RAM was available; 4GB and an i5 feel restrictive (what a sad statement on current software optimisation).
Oh, and you can only get ~6 hours of battery out of it with very particular settings and a specific power management driver in Windows 8. Otherwise it's 3-4 hours at best.
I use a W530 for work. It's a true beast of a machine, but I cannot stand the way all the plastic flexes. It feels cheap. I had to replace the keyboard the first week I had it due to a defective key (it happens, no big deal), but there's a little plastic strip at the top that was way too easy to bend, and so now it's just... there, reminding me of its cheapness.
I have a T430s that I like quite a bit. When I was idly considering buying a T450s to get a higher resolution screen I looked into how easy it would be to upgrade the RAM and hard drive (much cheaper than buying a good one from Lenovo) and was disappointed that it seems much harder to replace them now.
That was a lot of words to say Apple is wrong about design and I'm right and I wish someone would realize this and build something that suits me exactly.
They took everything he had, and promised that if he gave himself up to the System, in return the System would take care of him.
It turned out to be a lie.
We’re all Jason Bourne now.
I read this with tongue in cheek but walked away with a profound sense of camaraderie and Tyler Durden feels.
Those last few sentences. WOW. As a Jason Bourne fan, I can now be him.
This is the most pretentious article I remember reading in recent history but it seems the author has a few interesting points. Can anyone do a tl;dr in plain english?
Design as crafty/minimalism/apple's aesthetic is irresponsible in that it refuses to acknowledge that things can break. Design which pays attention to the reality of a thing's environment and usage patterns, and which does not prompt conspicuous consumption of every brand new version after even the slightest failure of the old, is better.
Well, let's call that out then--folks should be honest that they buy un-repairable hardware at least partly because they enjoy having an excuse to get the latest model two years down the line.
I don’t think two years is the average lifetime of a current MacBook. I would guess maybe three or four years, maybe even longer. Most problems will crop up during the first year and be fixed under warranty.
Mechanically the devices are very solid and will survive for many years.
The biggest issue is the battery, really, and probably the power supply, both of which are costly (maybe 25% or so more than a replacement battery for another laptop, plus you have to bring in the MacBook to get the battery replaced; you can’t do it yourself), but they can be replaced. Most people will just not do that, though, and will live with diminished battery life and fraying power-supply cables.
I think you are stuck in your filter bubble if you think people want to buy a new one quickly. Many will live happily with a current MacBook for many years. A decade ago the situation was a bit different (and performance crummy), but nowadays the speed is there.