I think it would be amazing if it were one day possible to run macOS on an Apple Silicon iPad. It seems like something many people would want, so I assume there are technical issues preventing it. In another thread, the absence of JIT support was mentioned, for example.
Memory is another issue. There isn't much extra RAM on the iPad. I ran UTM on my iPad Pro once and crashed it by giving a VM too much RAM.
As someone happily tinkering on a PinePhone, I just don't think the RK3399 is going to satiate Linus's hunger for performance. I think there's an implied subtext: "that is actually fast enough for my needs".
He's currently running a 32-core Ryzen desktop CPU, an upgrade he consciously chose over a lesser chip for its improved build speeds. He has also lamented poor screen quality and resolutions before, and I imagine the Pinebook Pro's screen would not impress him.
The Air is notable because its processor can run build jobs (like compiling the Linux kernel) in stellar times. While the RK3399 has dual A72s and should be ... better than, say, the PinePhone, I'd imagine it's closer to 2010-ish Core 2 Duos in performance.
The issue is that you really start to feel the performance limitations of a cheap ARM SoC, especially if you do a lot of C compiling like Torvalds would.
C compiles in a flash on today's hardware. It's only when you have a BIG C codebase (the kernel counts as big), or a medium-sized C++ codebase, that things slow down.
C++ code that uses templates heavily is painful to compile; even a "make -j1" build can use 6 GiB of RAM. It's kind of funny that a VPS with little RAM can build a full Linux kernel with all the modules and drivers, yet cannot build a small web server. For big projects, the final linking step also eats an enormous amount of RAM; it's possible for a small system to survive the whole build and then get killed during the final link.
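As a toy illustration (mine, not from the thread): template-heavy code forces the compiler to materialize a new type for every instantiation, so the computation and the memory cost happen at compile time rather than runtime. A single small file like the sketch below compiles quickly; the multi-GiB builds happen when thousands of such instantiations pile up across a large codebase (heavy Boost or Eigen use, for instance).

    // Toy sketch: compile-time Fibonacci via recursive template instantiation.
    // The compiler must instantiate a distinct Fib<N> type for every N it
    // reaches, so the work (and memory) is spent at compile time, not runtime.
    #include <cstdio>

    template <unsigned N>
    struct Fib {
        static constexpr unsigned long long value =
            Fib<N - 1>::value + Fib<N - 2>::value;
    };

    // Explicit specializations act as base cases that stop the recursion.
    template <> struct Fib<1> { static constexpr unsigned long long value = 1; };
    template <> struct Fib<0> { static constexpr unsigned long long value = 0; };

    int main() {
        // The binary just prints a precomputed constant; the compiler paid the cost.
        std::printf("Fib(40) = %llu\n", Fib<40>::value);
        return 0;
    }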
Sometimes it's just valuable to have a local environment so you can develop and test conveniently "at home" on the same computer. It's why ARM workstations exist.
For example, Linus Torvalds said [0],
> Some people think that "the cloud" means that the instruction set doesn't matter. Develop at home, deploy in the cloud.
> That's bullshit. If you develop on x86, then you're going to want to deploy on x86, because you'll be able to run what you test "at home" (and by "at home" I don't mean literally in your home, but in your work environment).
> Which means that you'll happily pay a bit more for x86 cloud hosting, simply because it matches what you can test on your own local setup, and the errors you get will translate better.
> This is true even if what you mostly do is something ostensibly cross-platform like just run perl scripts or whatever. Simply because you'll want to have as similar an environment as possible,
> Which in turn means that cloud providers will end up making more money from their x86 side, which means that they'll prioritize it, and any ARM offerings will be secondary and probably relegated to the mindless dregs (maybe front-end, maybe just static html, that kind of stuff).
> Guys, do you really not understand why x86 took over the server market?
> It wasn't just all price. It was literally this "develop at home" issue. Thousands of small companies ended up having random small internal workloads where it was easy to just get a random whitebox PC and run some silly small thing on it yourself. Then as the workload expanded, it became a "real server". And then once that thing expanded, suddenly it made a whole lot of sense to let somebody else manage the hardware and hosting, and the cloud took over.
> Do you really not understand? This isn't rocket science. This isn't some made up story. This is literally what happened, and what killed all the RISC vendors, and made x86 be the undisputed king of the hill of servers, to the point where everybody else is just a rounding error. Something that sounded entirely fictional a couple of decades ago.
> Without a development platform, ARM in the server space is never going to make it.
And,
> And the only way that changes is if you end up saying "look, you can deploy more cheaply on an ARM box, and here's the development box you can do your work on".
> Actual hardware for developers is hugely important. I seriously claim that this is why the PC took over, and why everything else died.
> So you can pooh-pooh it all you want, and say "just cross-build", but as long as you do that, you're going to be a tiny minority, and you don't see the big picture, and you're ignoring actual real history.
> And btw, calling this an "unixoid" mindset is just showing what a total disconnect to reality you have, and how stupid your argument is. Unix lost. Yes, it lives on in the shape of Linux, but Unix lost not just to Linux, but to Windows. In fact, arguably it lost to windows first.
> Why? Same exact reason, just on the software side. In both cases. Where did you find developers? You found them on Windows and on Linux, because that's what developers had access to. When those workloads grew up to be "real" workloads, they continued to be run on Windows and Linux, they weren't moved over to Unix platforms even if that would have been fairly easy in the Linux case. No, that was just unnecessary and pointless work. Just continue to deploy on the same platform.
> Exact same issue on the software side as with the hardware. Cross-development is pointless and stupid when the alternative is to just develop and deploy on the same platform. Yes, you can do it, but you generally would like to avoid it if at all possible.
> End result: cross-development is mainly done for platforms that are so weak as to make it pointless to develop on them. Nobody does native development in the embedded space. But whenever the target is powerful enough to support native development, there's a huge pressure to do it that way, because the cross-development model is so relatively painful.
I have been debating a Pi400 vs a Pinebook Pro as an Xmas gift for a 10-year-old. The Pi400 looks great except that it doesn't support a webcam+microphone(+audio out?) out of the box, and attempts to find an HDMI monitor with these things built in and supported by the Pi have thus far been unsuccessful. Is it odd to be considering a Linux laptop for a 10-year-old? The open source ethos of both machines seems ideal for this use case.
I don't think the Pinebook Pro has a 40-pin GPIO header, so the possibilities for interacting with the outside world are limited and thus the fun is severely reduced.
It's possible that the shared values and commitment to public service inherent in choosing to serve in the military caused your father's military co-workers to see him more clearly and/or in a different light.
It is also possible that his leadership ability did not manifest equally in both settings, for example if he was more passionate about the military than his civilian job and/or if it was a better fit for his skill set.
Not to suggest that some institutions don't still discriminate against Asians etc, unfortunately. Hopefully things are changing for the better though.
I assure you there are lots of stupid people in the military who are racist, and "the shared values and commitment to public service" are not nearly as inherent in government as Parks and Recreation would have you believe. Remember, in his full-time job he also worked for the Federal Government. In the VA, no less. You'd think that the employees of the VA would have the highest level of commitment to public service and the deepest understanding of what it means to be a successful leader in the military (and, for that matter, the greatest care for the health of veterans), but that doesn't really seem to be the case.
How well one person fits into a team is a very "butterfly effect" thing. I've been both "the star" and the mediocre, kinda-struggling guy on different teams.
It's called "team chemistry" not "team logic" for a reason.
To all the people here trying to throw CBP under the bus: did you read their report?
If I have a 'terroristic' picture on my phone and I try to cross a border, and at the same time I tell CBP that I want to work in my cousin's gas station so that he will be freed up to do other things, it is reasonable to think that CBP might exercise their right to decline to admit me.
Commenters who are saying that CBP are lying: if it's he said / she said, I am going to believe the person without the dead-baby image on their phone, not the person with the 5-year visa who wants to be away from their wife and child for three months for 'vacation'.
It seems a bit weak to me to call CBP liars absent any evidence (and here there is none) that they have lied.
Mainly what I want to say is about the dead-baby image: if my 'friend' sends me that photo, we are having a talk about good judgement, stat; he is not sending me any more photos, I am probably blocking him, and that photo is getting deleted.
And if I have that photo on my phone and I choose to tell some bullshit story about child safety PSAs etc, and about my cousin's gas station, I have to expect that discretionary decisions are not going to go well for me.
As I write this, I am attending the SILMO 2017 optical trade show in Paris. I've been attending for a few years, and what you see when talking to industry participants is a real aversion to business models that could disrupt the industry and lower costs. Frame suppliers that sell to brick-and-mortar shops will very often refuse to sell to e-commerce players, for example, for fear of alienating their brick-and-mortar accounts. Opticians will often refuse to disclose pupillary distance as a way to prevent their customers from purchasing elsewhere. Many states/provinces/countries have pre-Internet laws on the books that effectively disallow selling online, and efforts to amend those laws to keep pace with technology are aggressively derailed by special interest groups.
As noted elsewhere in this discussion, it was expensive before. Independent opticians feel attacked on one side by e-commerce, and on the other by Essilor/Luxottica, chains in general, and big-box stores. The real issue may be that they can't see the value in changing how they have always worked, or that even if individual opticians are open to change, they are stuck in a supply chain that may hold exactly the opposite view.
The opticians are safe and sound for now, but if I were in the business I'd be wary.
1. Augmented reality could get real and good very fast. Lots of people would be comfortable trying on glasses at home instead of visiting the optician. The optician's is the most awkward shop I go to.
2. How long before we have advanced cameras and the eye test can be done online? Probably a bit longer than I'd like, but it's in the realm of possibility.
I feel that we need to see an uptick in automation/computer vision in the machines at the opticians before we're likely to see the same capability at home. I would feel safer if the machines made diagnoses and an onsite optometrist/ophthalmologist verified them.
Also, some conditions such as glaucoma can be difficult to test for (http://www.glaucoma.org/glaucoma/diagnostic-tests.php). I remember one of the tests uses a tonometer to check pressure by firing a puff of air at the eye. Any of today's tests that give output values could feasibly be administered at home, but they wouldn't form a complete picture of our eye health.
Interesting to think what an "optician's" could be 20 years from now.
The state of the art in eye-imaging sensors is the equipment used in LASIK surgery. It's already being used to supplement smaller opticians with remote optometrists: https://news.ycombinator.com/item?id=15374489