If they had enough time to warn people ahead of time, they had plenty of time to push a fix to their system for this. We are literally talking about adding support for one more image format.
Emails, tweets, texts are no excuse for broken products. The iPhone is the best selling model in the United States. It is on College Board to support its default image format.
Good product design is owning your users' success. It is not sending people workaround emails.
The bare minimum would have been to show a warning before every single AP test and give students a few minutes to change their default image format. Sending out a tweet (!!!) does not count as doing any work.
Even stranger is that when a file failed to parse after uploading, they just threw it out instead of keeping it to analyze later.
If they still had the uploads they could go back and convert them properly and apologize for the delay.
It's just bad engineering all around. Even if there was a less glaringly obvious bug that caused parsing to fail, how would they debug that parsing bug without a sample file?
I mean, it's not that odd when you think about how software is typically written. The parsing logic probably threw an exception that was never handled, and everything uploaded just got popped off the stack by default.
I'm not sure how you typically write software, but I don't consider it typical for software that opens a file and encounters unexpected input to throw an exception and then delete the file.
If an exception is thrown and not caught, the software should stop doing anything.
Uploads are often initially stored in a temp directory before they are validated and moved to wherever they are meant to be stored, and the default behavior in PHP, for example, is to delete uploads that are not moved or renamed at the end of the request.
In the scenario I'm describing, nothing is ever written to disk. The uploaded image data is streamed into memory directly from the socket and processed in situ; when an exception occurs, the stack unwinds and deallocates the memory storing the image data.
Writing extra code to delete a file in a catch block doesn't seem like something someone trying to account for failure scenarios would do; it's much more likely that the data was living in memory and no thought was put into failure scenarios in that part of the code.
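To make the failure mode concrete, here is a minimal sketch in plain Node.js (parseImage and the JPEG marker check are illustrative stand-ins, not anyone's actual code): an upload buffered in memory is handed to a parser that throws on unexpected input, and the only thing standing between the raw bytes and the garbage collector is a catch block that bothers to keep them.

```javascript
// Hypothetical parser: accepts only JPEG (leading SOI marker 0xFFD8)
// and throws on anything else, e.g. a HEIC file.
function parseImage(buf) {
  if (buf[0] !== 0xff || buf[1] !== 0xd8) {
    throw new Error("unrecognized image format");
  }
  return { format: "jpeg", bytes: buf.length };
}

// Samples retained for later debugging; without this array and the
// catch block below, the buffer is simply deallocated when the stack
// unwinds and there is nothing left to analyze.
const savedForDebugging = [];

function handleUpload(buf) {
  try {
    return parseImage(buf);
  } catch (err) {
    savedForDebugging.push(buf); // the cheap fix: keep the evidence
    return { error: err.message };
  }
}

// A HEIC-like upload: ISO BMFF "ftyp" box header, not a JPEG.
const heicLike = Buffer.from([0x00, 0x00, 0x00, 0x18, 0x66, 0x74, 0x79, 0x70]);
const result = handleUpload(heicLike);
console.log(result.error);             // "unrecognized image format"
console.log(savedForDebugging.length); // 1
```

The point is not the parser but the catch block: persisting the rejected bytes costs one line and makes the "how do we debug this without a sample file?" question answerable.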
But it is incredibly unlikely that web uploads are piped directly into custom software rather than just being written as files which are processed later. That would be an extreme amount of extra work for no benefit at all.
Tomcat gives you an HTTP request object where you can just grab the input stream and pass it to pretty much any library that processes files, because opening a file just gives you a FileInputStream anyway; adding general support for InputStreams is much easier than adding support that only works on files.
It's not at all unlikely; this is the default behavior for various setups, e.g. Node.js with Express, which is primarily a streams-based system where you'd have to do extra work to write to disk.
Either it was never on the file system (kept in RAM), or it was in some temporary folder where files get deleted once the upload request has finished processing, to prevent DoS attacks. Automatically keeping uploaded files sounds like a really, really bad idea.
While I kind of agree with the sentiment, I'm also totally done with the notion of "Apple decides to have their own unique format every 2 years, and makes the change in a backwards incompatible way, so now the world needs to kowtow to them, despite Apple dragging their feet in many areas of standardization."
Seriously, fuck Apple. It took legal changes in the EU to force them to the "Just f'ing support USB-C like the rest of the world instead of making half your money selling dongles".
It's funny, because the Lightning connector has been around since 2012, when Android phones were largely using Micro USB, with an awkward flirtation with the wide USB 3.0 connector. Now they're standardizing on USB-C, so if you were upgrading frequently you may have needed 3 cables in the past 8 years.
Before then, iPhones used the 30-pin connector, backwards compatible with 5-year-old iPod accessories. At that time other manufacturers seemed to be shipping different barrel-plug chargers and proprietary cables for every model.
So that's 2 connectors introduced over a span of 18 years supported by dozens of product models that sold billions of units. Cables have been available from third parties for most of that time. The only dongles that might apply are 30-pin to Lightning or USB-A to USB-C.
The 30-pin's raison d'être was to provide features you couldn't get out of USB, like analog audio and video out. And Lightning was a much better designed connector than Micro USB due to being reversible, which informed the design of the Type C connector.
I'd agree that the Mac now has a dongle problem, but it's precisely _because_ it switched to USB-C, as you suggested.
This is misguided. The problem is not how quickly one format or the other iterates. The problem is forcing your users to endure a closed, licensed format.
A USB accessory will work on any device, but a Lightning accessory only works on an iPhone, to nobody's benefit but Apple's. Apple's hate of standards is anti-consumer, and that's why the EU ruled against them.
What particularly irks me is that Apple has acknowledged that USB-C is the superior plug by going all in on their laptops. But they can't let go of all the money they make selling licenses for third-party cables on iPhones.
> The problem is forcing your users to endure a closed, licensed format.
Lightning was a huge win for consumers because it was years ahead of the incompetently designed clusterfuck that micro-USB was.
> Apple's hate of standards is anti-consumer, and that's why the EU ruled against them.
Apple’s “hate of standards” is in part the reason the USB-C ecosystem exists today. They contributed quite a bit to its development.
The EU ruled against Apple because the EU is full of bureaucratic idiots that care more about looking good than actually knowing what they’re doing. The circlejerk that the EU is always correct needs to end.
If the EU ruling happened a few years ago we’d never have had Lightning and we’d have been stuck with the piece of shit known as micro-USB. Thankfully, Apple was allowed to innovate independently as any remotely reasonable government would allow, and created a connector that would later inspire USB-C.
> What particularly irks me is that Apple has acknowledged that USB-C is the superior plug by going all in on their laptops. But they can't let go of all the money they make selling licenses for third-party cables on iPhones.
Catch-22; if they change the cable people like you complain that they’re trying to obsolete accessories, and if they don’t change the cable people like you complain that they’re trying to profiteer off of accessories.
I find it really surreal that this format is named "high efficiency image file format" when it makes no guarantees, no claims, and harbors no aspirations about efficiency. It's an encoding-agnostic container format!
You're correct, I shouldn't have said their own unique format.
Still, what matters is the reality of the situation. They could have easily made it so that uploading or transferring images, especially to websites, uses a standard format that 99.9% of websites support, instead of one that virtually no one supports (yet).
And at the same time that Apple rushes to support this new standard without providing a good backwards compatible experience, they've been dragging their feet for YEARS on Safari support for progressive web app features that would let devs build truly feature-comparable web apps without being beholden to the App Store walled garden.
> They could have easily made it so that uploading or transferring images, especially to websites, uses a standard format that 99.9% of websites support
The parent comment said they sent a tweet a week before, and they had something in their FAQ, but it doesn't say how long before that was posted.
Generally, when I've worked at places that were not startups, a week was not reliably enough time to get a fix pushed through.
I didn't see anything in the article about how long they had been aware of the problem; perhaps they became aware of it just before testing was scheduled to start. I guess that is a problem with their QA system, but at any rate I can think of lots of ways they could discover a problem a week out (or even longer, really), not be able to fix it in their particular system, and have to notify people instead.
Of course I agree they did a lousy job of notifying people.
Yeah, I'm not surprised, since corporations like this are more concerned with making sure the Business Impact Assessment was routed in compliance with Standard Operating Procedures, establishing the Quality Verification Steering Committee to discuss possible impact to critical systems, and getting sign-offs from Validation Specialists and Risk Analysts.
> Perhaps Apple should make it easier or automatic to convert into a format that's universally usable.
Further down this thread you'll see that the College Board messed up and isn't accepting images it should, due to a poor implementation.
As oefrha reported:

“Tried a standard input tag with the proper accept attribute

<input type="file" accept="image/jpeg,image/png" />

Selected a HEIC file from Photos in Safari, the selected image was automatically converted to JPEG”
I'd posit that Apple ought to assume the worst (i.e., nothing but JPEG and PNG is supported) if no format is specified. That's how we've built most of the web: to ensure backwards compatibility and avoid these kinds of problems.
That said, come on College Board. Fix your crap. What a stupid bug.
If no format is specified, then the presumption would be that the website wants a raw octet stream, and that adulterating it would be the last thing a client device should do, because it knows nothing about what the website's going to do to the result.
Okay, but the contents of that raw octet stream will be a file in some format. The iPhone isn't like a traditional computer—the user picks an image from a library of images, and the type of image is abstracted away. Yes, it so happens that modern iPhones store images on disk as HEIC, but since this isn't user-visible it amounts to an implementation detail.
Since the user didn't specify a format, and the website didn't specify a format, the iPhone needs to guess something. Seems to me it should guess the format that's most likely to work, not the one only a tiny number of devices support.
But the website did specify a format. Like I said in my sibling comment, a lack of an `accept` attribute (which is the same as saying `accept="*/*"`) has a conventional meaning from a plethora of legacy use-cases; and that meaning is:
"Give me the underlying data, just as it is. You may or may not understand what it is, but I'm asking you to pretend that you don't, because I definitely don't understand what it is. I'm acting as a courier on behalf of a recipient who wants whatever you give me. All they told me was to get it from you. Please don't try to guess why they want it, or to prepare it for them in some way. Their motivations are beyond our understanding. They just want it. They want what you have, the way you currently have it. Do as little to it as possible, because anything you do may be counter-productive to their unknowable designs."
This is, for one thing, the use-case of file-sharing websites. If you upload something to e.g. MEGA, or WeTransfer, you're doing that in order for something further to happen to it on the other side. The other side may or may not have wanted the file in its original format, but that question is up to them, not up to the sender. The job of a "dumb pipe" file-transfer service, is to take what it's given, and losslessly pass it on to the recipient, so that further steps can happen. And, as such, it's also a responsibility of a file-transfer service to ask the User-Agent to also send the file on to it losslessly, because in this case the User-Agent is also acting as part of the "dumb pipe" transfer process.
Let me put it this way: if my photos were not saving correctly, and someone at Apple asked me to file a Radar ticket and attach such a mis-encoded photo to the ticket... how would the Radar web-app express the intent of "I want the stupid mis-encoded file that you-the-device are using to store what you think is a Photos photo"? Well, our legacy convention is that it would express that intent through `accept="*/*"`. (Or a lack of an `accept` attribute at all.)
Note that this is different from an `accept` attribute like "image/*". In that case, we know something—we know that the recipient we're acting as courier for intends to use the uploaded file as an image—so both the mis-encoded file, and maybe HEIC files, are probably bad candidates. One should be filtered out as an upload candidate; the other should maybe be transcoded (just like a RAW camera file would be).
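The convention being argued here can be written out as three variants of the input tag (a sketch; the transcoding notes reflect the Safari behavior reported elsewhere in this thread, not a spec guarantee):

```html
<!-- No accept attribute: "courier" semantics. Send the file's bytes
     exactly as stored, which on a modern iPhone may mean HEIC. -->
<input type="file">

<!-- "Any image": the client knows the recipient wants an image, so
     transcoding HEIC to something widely supported is reasonable. -->
<input type="file" accept="image/*">

<!-- Explicit formats: Safari reportedly transcodes HEIC to JPEG
     to satisfy this constraint. -->
<input type="file" accept="image/jpeg,image/png">
```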
By "raw octet stream", I meant that the client should upload a file named X made of opaque bytes 0xABCD, as a file named X made of opaque bytes 0xABCD; rather than assuming a higher-level intent on the server's part to acquire some abstract document, where the client would be able to better serve that request by transforming its file to a different file-format.
I didn't mean that e.g. the client should avoid using Transfer-Encoding compression during the HTTP POST request, or anything like that. (That is, after all, a fact about the wire representation of the request; it doesn't affect the file received on the server end.)
Or, to put that another way, an <input type="file"> with no `accept`, is to the client, as `Cache-Control: no-transform` is to the server: an expressed desire to get some specific data that exists the other end sent over "as it is", rather than to get something customized to the peer's needs.
I thought you were suggesting that images should be sent as raw octets for the image, rather than picking a compression format. But that raw data is extremely large, and therefore would have horrible impacts on bandwidth and latency.
That said, you're right. Trying to be clever about what people are sending results in a lot of hidden complexity and bugs of various forms.
As I understood the article, the problem did not arise when uploading the picture directly from the phone, but in cases where the picture was first transferred to a computer (AirDrop was explicitly mentioned, but it could probably also have been a cable connection) and then uploaded from the computer. Whatever conversion the browser on the iPhone (or Safari on macOS, because there are other browsers on computers as well) does or does not do is irrelevant in such a situation.
You know, I thought that about FAT32. But apparently neither Windows nor Mac OS X can see a FAT32 partition on an SD card if it's not the primary partition (at least, not in an obvious way).
Windows and macOS do that in order to hide EFI system partitions, I believe. (Not that they should have to; MBR and GPT both define a specific tag to mark a partition as being an EFI partition. But so many partitioning tools don't bother to use that tag—or to adhere to any other standard that could be used to identify an EFI partition—that OSes are stuck with a very bad/loose heuristic.)
They may also get some other benefits from this bad/loose heuristic, e.g. hiding Linux's common FAT32 /boot partition; OEMs' "backup" and "BIOS update" partitions; OS Recovery partitions from unknown (and therefore unpredictable-in-approach) OSes; etc.
What the consumer OSes really need is a bit in each MBR/GPT partition's bitflags, that has a meaning equivalent to one of those "no user-serviceable parts inside" stickers. I think it's too late to fit that bit into either standard, sadly.
> If they had enough time to warn people ahead of time, they had plenty of time to push a fix to their system for this
Yeah, no. Absolutely not. All of the testing for their platform would have to be redone, and if a bug is found, then what?
You can argue that they should have done a better job notifying users, but to argue that "of course they had time to push a fix in the week before the most important testing period" is just nonsensical.
IIRC, you have to use special flags when you compile from source to get HEIC support. And just because it's available doesn't mean you can legally use it. For example, the HEIF container's reference implementation is pretty explicit about not allowing commercial use [0]. The MPEG consortium lists over 7,000 patents on their webpage for HEIC [1]. Making sure that you're not infringing on those patents and/or working out a license deal with the patent cartel is a nontrivial amount of work.
I don't think you would be comfortable pushing this few-hours hack to a system of such high importance. A mistake, be it stupid or complex, could break far more than the issue at hand does. And you would be at fault. Would you like to face the response of all the students then? Would you really dare to risk this scenario?
That support depends on installing or building native libraries (mainly libheif, I think), which is not trivial, or may be something developers can't do for security reasons; it's also different for each platform.
I don't think it would have been possible to push a fix that quickly. Verification and sign off on these sorts of systems would require weeks or months to push changes.
Pretty much any mobile industry stat. Most have Android at a bit over 50% in the US and Apple at a bit under. Worldwide, Apple is under 20% market share. Remember, iOS is just Apple whereas Android is Samsung + LG + HTC + Google + Oppo + OnePlus + etc...
Just a guess, but it seems like you work in theoretical lala land and never have to deliver something that works.
I'm not saying you are wrong; sure, you can argue that it's the iPhone that is broken, exporting a non-standard format. However, if you are designing an app that needs pictures to be taken, and about 45% of your customers (the students) can't just take a picture without going through some conversion while on a time-sensitive test, then you majorly fucked up.
End of the day, this is on the AP designers for not supporting a format which 45% of their base will use by default.
Scanners almost universally output in TIFF, and need to be converted to JPG or something more universally accepted. Nobody complains.
Some scanners even will do the conversion to JPG for you, because they know nobody accepts TIFF files.
If Apple does it, then everyone has to accept it?
And why? Because images are large and Apple's trying to reduce the size on the phone? How about giving everyone more than 5GB of iCloud storage in 2020? Google gives you 10-15GB for free, and costs half as much for more storage.
Well I could turn that around and say if you develop a phone and the image format it exports to is not accepted by 99% of websites maybe then you majorly fucked up.
But the phone does the right thing when told to do the right thing. If the input tag has a proper accept attribute set, the iPhone will transparently and automatically transcode a HEIC image to JPEG.
A file input tag with no accept attribute, when you're expecting a particular type of file, is broken. Would Android phones be "majorly fucked up" if they stored images as WebP by default?
And they were correct. If more than 10% of the population is using it, it should be supported, particularly if you are providing a service that can affect people heavily (like AP test submission).
Apple should convert to JPG or PNG when exporting the image to anywhere anytime.
The same problem happens with WebP images too: totally unusable almost everywhere, even in Photoshop.
Apple wants to reduce storage space photos take up. Fine - but they should convert back to a standard format when exporting the image to be used on some other system.
With that said, it seems the issue is from students that didn't upload the image from their phone (where Apple correctly converts to JPG), but rather transferred the image to their computer, then uploaded into the AP Exam.
If that's true, then this is absolutely on Apple. Why would they export in a format that literally nobody else supports? What are these people supposed to do with a folder full of HEIC photos that can't be uploaded anywhere else, edited with any program, or even opened on some operating systems?
Apple should assume nobody else uses HEIC because... well, nobody else uses HEIC.
File formats change fast. It was not too long ago PNG was the newcomer, and people were touting how much better it was than GIF for non-photographs (alpha transparency, better compression, etc.). It has been successful, and now almost all applications support it. Same thing with moving from AVI to the MPEG formats in video.
New formats are a good thing, and fast adoption of them is good for users.
The point is that we have all seen what happens when we start letting a single company dictate formats. Because the next step is "i can't believe those lazy fucking programmers can't support heic2" or whatever magical bullshit they come up with after abandoning heic1.
And if you want to talk about incompetence, I'd say pushing a format that is almost guaranteed to be incompatible with the millions of existing backends out there is profoundly stupid.
> If they had enough time to warn people ahead of time, they had plenty of time to push a fix to their system for this
At the very least, a message that appeared at the time that one attempted to upload the image with instructions on how to fix the problem on the spot. How hard could that have possibly been?
Quite hard? I mean shame on them for letting this bug through in the first place but I'd be pretty horrified if someone at CB tried to hot-fix this problem out.
This isn't a small agile organization, it's enormous and may not have any route to get a patch out in less than a week due to QA requirements. (That isn't to say such slow deployment processes are good, but they do exist and may be contractually required)
> The iPhone is the best selling model in the United States.
That's an interesting (and slightly misleading) way to present that statistic. Apple has the best selling models, but does not have the majority of the market by operating system, which is what matters here. It's close, but Android is still supposed to have over 51% of the US market.[1] (If the graph itself is occluded, see the summary below it).
> It is on College Board to support its default image format.
It's a minority platform. It's only a slight minority, but it is one. If you want to make a case that they should support any default format for a platform over a certain percentage of usage (or "almost half"), that's fine, but you can't rely on the obvious argument that it's the dominant platform and thus should be supported, because it isn't the dominant platform.
I disagree. Bowing down to the whims and fancies of corporations is how we got into this situation (in terms of media formats) in the first place. According to Wikipedia, HEIC isn't even supported natively in any browser; clearly, rolling it out this soon was a bad idea.
Media format support will always be a chicken and egg problem. You got to start somewhere.
Apple's approach of automatically converting images at the edges is the right way to go. It does require your software to be explicit about what media formats it understands. This is where the College Board failed.
Their approach to supporting the new formats is actually quite easy to work with and properly defined input tags will just automagically trigger file conversions on the iOS side.
I also really dislike Apple's usual "my way or the highway" approach (it's causing my nephews serious issues, since some of their remote learning tools use Flash, which Apple refuses to support), but in this case they are using the right approach to make it a smooth transition.
> it's causing my nephews serious issues since some of their remote learning tools use flash which Apple refuses to support
That's really not Apple's problem. Flash Player is dead. Adobe has declared that all support for the plugin will end in December 2020, and every major web browser has indicated that they intend to discontinue support for the plugin in advance of that date. Some desktop browsers (including Chrome and Firefox) have already disabled Flash content by default, and the Flash plugin for Android was discontinued in 2012.
As a developer I'm totally on board with Flash having died a while back, and these educational suites really shouldn't have anything to do with Flash. All that said, when Apple killed Flash they did it unilaterally and really did break a lot of existing systems; if this pandemic were happening a decade ago, I'd absolutely be on an Apple hate train, since the sudden drop of support forced people to scramble.
At this point though, Flash is known to be dead and buried and folks that haven't migrated off of it have made their own beds[1].
1. ...And unfortunately caused a lot of headache for quite a few parents with multiple children that are trying to let all their children learn concurrently on different devices they have around the house.
> when Apple killed Flash they did it unilaterally and really did break a lot of existing systems
What are you referring to when you say "when Apple killed Flash"?
The big outcry was back when Apple made it clear that they would never add Flash support to iOS. But that support never existed in the first place -- one can hardly "kill" something which was never alive.
Desktop Safari still supports Flash, for now. It's off by default (with a "click to enable" icon), but that's no different from how it's handled in other browsers. All signs indicate that they intend to remove Flash support with the next major release, but that just puts them on the same track as everyone else.
Refusing to support the tech on your platform is killing it. iOS's big selling point, initially, was as a consumption device: a phone with a browser. Deploying your browser without Flash meant removing support that was expected of any browser at that time.
And yeah, I agree that Flash is dead at this point and I'm quite happy it's gone. Apple actually contributed significantly to the death of Flash-based advertisements, and there is nothing in the world I hated more than those.
Open competition among standards is the way adoption has mostly happened at least since the VHS/Betamax days. Otherwise it’s decision by committee and/or fiat which can sometimes be a net benefit but more often than not standardizes on a cumbersome standard that is not a good fit for any one application.
This is a failure. An abysmal failure.