free2OSS's comments

What kind of stat problems?

Also, I used to love Python... until I got a full-time job and learned why static typing exists.


Any sort of statistical or econometric estimator is typically published as an R package.

So for example, I recently saw a paper with a quite complex estimator based on dynamic panels and network (or spatial) interdependence that could identify missing network ties. For that, an R package exists.

If you want to use it in Python, you'd have to replicate a whole estimation infrastructure yourself, starting by extending the basic models in statsmodels.

That example is quite typical in my opinion.
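
For what it's worth, a middle ground is to call the R package from Python via rpy2 instead of reimplementing the estimator in statsmodels. A minimal sketch (the package name "netpanel" and the function "estimate_ties" are hypothetical stand-ins for whatever a given paper actually ships):

    # Minimal sketch: drive an R estimator from Python with rpy2.
    # "netpanel" / "estimate_ties" are hypothetical stand-ins.
    import rpy2.robjects as ro
    from rpy2.robjects.packages import importr

    netpanel = importr("netpanel")   # load the installed R package
    df = ro.r('data.frame(y = rnorm(100), x = rnorm(100))')  # toy data
    fit = netpanel.estimate_ties(df)
    print(ro.r["summary"](fit))

You still pay the R dependency, but you keep the rest of your pipeline in Python.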

Like I said, I really like to code in Python and I don't like R all that much. But if someone says "Why would you use R? Python is better", then we can confidently say the person does not know what R is actually used for.


I wonder if philosophy can solve the first 4?

The latter through Science/logic, where it's possible.


Same question but for general Linux distros.

Not sure I'm willing to try Ubuntu for the 8th time. I use Linux servers all the time, but I don't want to mess with configurations and installing a bunch of unique software to get Netflix to work. The more bloatware the better. (I half joke)


I've used Slackware, Void Linux, Debian, and Arch. Slackware and Void Linux I found really nice technically. However, I'm currently using the latest Ubuntu simply because of its popularity, which translates to a large ecosystem and minimal setup required for basic things.

I'm not a fan of how Ubuntu does things, but I'm at a point in my life where I want something that's fire-and-forget. That certainly wasn't the case before, and might not be the case in the future. You'll be making some kind of compromise when choosing your computing environment. Be upfront about what you can compromise on and what you can't, and choosing will be easier.


Slackware was my very first Linux (Slackware 3.6). Back then I built a Linux From Scratch system (2001/2002) and used it for two semesters, and it really helped me understand a lot about Linux.

Today I use Gentoo on my main box, Alpine on my file server (I wrote this about using Alpine with full disk encryption: https://battlepenguin.com/tech/alpine-linux-with-full-disk-e...), and Void on my router and dedicated server.

My favorite distros are currently Void, Gentoo, and Alpine.


I find that Linux devotees often leave out the compromises they've made in their computing choices. Thank you for stating yours, it helps contextualize the reasons for choosing Ubuntu!


I've done a similar (very interesting technically) trip through various distros. Instead of Ubuntu, I settled on Fedora. It provides newer packages than Ubuntu but also "just works" (at least for me). The disadvantage would be the far lower popularity compared to Ubuntu.


Cinnamon on Linux Mint or Manjaro are very 'batteries included' setups. Ditto Kubuntu (KDE) and Pop!_OS (GNOME).

Even Ubuntu is pretty good these days. As of April it also includes Nvidia drivers.


The Linux Foundation's annual report was apparently produced on a Mac using Adobe Creative Cloud [1].

Linux is an essential part of my workflow too, but I completely understand why that poor Linux Foundation person had to use Adobe CC on a Mac.

[1] https://twitter.com/grifferz/status/1334671602156507143


In my experience, Cinnamon Mint is the easiest to use with a background in Mac OS Classic or Windows XP through 7.

Netflix works on Firefox out of the box.


Pop!_OS is a nice distro with defaults that don't require a tonne of tweaking.


Install Gentoo...

Here is a list of popular distros:

https://distrowatch.com/


That's an indicator of popularity among distrowatch.com readers, not at all representative of what Linux users worldwide use.


>That's an indicator of popularity among distrowatch.com readers, not at all representative of what Linux users worldwide use.

So Linux users worldwide don't use any of those Distros like Mint (#3), Ubuntu (#5), etc?

What do they use?


>popularity

The ranking portion can be misleading.


Epidemiologists are the experts; I'm baffled why so many people are asking physicians about coronavirus.

Physicians are not scientists; they aren't supposed to make policy decisions.

It goes to show how few people understand what Science is and, probably more dangerously, that Medicine is mistakenly thought of as Science. (Medicine is older than the scientific revolution; it's a hybrid of Tradition/Authority/Art/Science.)


free2OSS says: "Epidemiologists are the experts"

Epidemiology of COVID-19 has been a train wreck. Indeed we must ask if epidemiology is a science at all.

Interviewing 700 epidemiologists is a fool's errand providing no useful information. One might as well simplify the selection criteria and interview 700 (or better, simply 30) people of above-average intelligence.

free2OSS says: "Physicians are not scientists; they aren't supposed to make policy decisions."

I agree with the first sentence but not with the second. Most medical organizations/hospitals/institutions/clinics are headed up by physicians who make policy decisions all day long.


“Interviewing 700 epidemiologists is a fool's errand providing no useful information.”

Do you seriously believe there is “no useful information” from such an exercise? None? Methinks you are being facetious.

I agree you shouldn’t necessarily follow what they say they do, but even realizing there is a diversity of responses among those who have some formal training is surely of some value. Especially among the target audience of the NYT that tends to believe what an authority tells them more than the average American.


Epidemiologists are also MDs. And epidemiology is also a hybrid of the kind you mention. Dogmas exist in every science, and although it is true that clinical medicine is far behind other disciplines in getting rid of dogmas, things are improving steadily.

Epidemiologists don't always know better what to do in practice. They understand things in their own manner, shaped by their profession, and that's exactly what we ask of them.


Some are MDs, but most have a PhD or MPH (Master of Public Health) instead of, or in addition to, an MD. Epidemiology is a science, and one that is not well covered in medical schools.


Epidemiology is a science as much as, for instance, infectiology is a science: an aggregate of best practices relying only partially on hard (experimental or mathematical) science. For example, IMO the Bradford Hill criteria are backed by informal reasoning and, although seemingly trivially logical, are very much dogmatic in nature.

It's true that many epidemiologists are not MDs, though. I formulated my thoughts badly in that regard.


Most people don't really understand that.

Even relatively informed people don't understand that physicians aren't members of the scientific class and in practice are closer to car mechanics than fuel chemists.

People with a DMA, DPA, or DMM are all still called "Doctor" colloquially, but not people with a JD.

A Doctorate of Science should, by the sound of it, be the one that members of the science class have, not a PhD, but it curiously represents an award that is equivalent to, beyond, or less than a PhD depending on who awards it and how it's awarded -- and it is often the degree of choice for medical doctors and other health practitioners, but almost never the degree for chemists, biologists, physicists, or other scientists.

Most people don't even know there are doctorate degrees other than PhD.


>Most people don't even know there are doctorate degrees other than PhD

I think you sort of answered why institutions often award a PhD rather than an ScD/DSc/etc. (And there are some related examples with master's degrees.) If there's an industry job opening for a PhD, how many ScD resumes end up getting filtered out because the candidate doesn't have the "right" degree? I've definitely heard this type of thing on occasion from graduates of schools that don't, or at least didn't, award the standard degrees. Even my undergraduate degree isn't quite "normal" (SB vs. BS), so I use the standard form on my resume, not that it matters at this point.


I can't remember at the moment whether it was Harvard or Johns Hopkins, but one of them was phasing out the ScD because it didn't have the "brand" recognition of the PhD, even though for years both degrees had exactly the same programs.


> Physicians are not scientists

Neither are epidemiologists.

> they aren't supposed to make policy decisions.

Scientists don't make policy decisions either. Politicians do.

> It goes to show how few people understand what Science is

Ain't that the truth.


> Scientists don't make policy decisions either. Politicians do.

Even politicians clearly aren't that good at solving this issue.

It's because centralization is bad at responding to the needs of individuals.

> The reason that this top-down bureaucratic management approach does not work is because the knowledge we need to plan is local, widely dispersed, and held by individuals. Hayek’s point, which won him Nobel honors in 1974, was that there is no such thing as a centralized repository of knowledge from which epidemiologists, policy-makers, or anyone else “in charge” can pull any required data at any time and then — poof — solve societal problems. He further warned that we must treat economic problems differently from scientific problems. Truer words have not been spoken regarding how we should think about the COVID-19 pandemic. Successfully containing COVID-19 is a problem that both requires scientific investigation and economic thinking.

https://spectator.org/covid-experts-shutdowns/


No, politicians framed the issue like this. Canceling school in 3-month bursts is a political move because it's much harder to tell a generation of parents their kids won't return to school for 2-3 years.

Shame on our politicians, and shame on Fauci for not making this clear.


It's best to separate the human from their technical work.

I do technical work and unfortunately can't contain my politics on social media (due to their psychology tricks). I know both Republicans and Democrats will dismiss my math because I don't belong to their side.


I disagree. In particular, I think it is important for one to care about the ethics of what one works on and whom one works with. This is an industry where people are generally in a position to be choosy about what they do, and it is a myth that it's generally possible to separate something technical or theoretical from how it is used.

Even if one releases software with a license like the GPL that explicitly doesn’t constrain what the software is used for, I think one should still care about how it is used and the ethical implications of it.


I'll give you some credit since Programming is not Science/Engineering but rather some combination of Science/Authority/Tradition/Art.

But in science fields, authors' names don't really matter. The same outcome will exist regardless of who is studying it.


> But in science fields, authors' names don't really matter. The same outcome will exist regardless of who is studying it.

I'm curious, how much experience do you have in scientific academia? Names matter a great deal in my experience. A lot of results are simply ignored if they're not backed by a big name, and many experiments will simply not be done unless there's a big name to attract funding in the first place.


I may have written 'Programming' but I was mostly thinking about mathematics. It is wishful thinking to ignore the ethical implications of one's work in academia.


Yeah, no politics at all at work. And the worst fact (at least for me): roughly 90% of all programming guys are Democrats :(


Not even 90% of programmers are Americans. Or men. The whole world is not just Republicans vs. Democrats...


The whole opening bars/restaurants/casinos but closing gyms thing is mind-boggling.

I understand keeping the population subdued with circuses, but you'd think politicians would realize the gym is a positive outlet that will reduce coronavirus deaths by keeping the population healthy.

My gym group broke up and one guy became overweight. He went from no coronavirus risk to high risk because politicians messed up his routine.


Gyms are where there is the most aggressive breathing and grunting though, and that volatilizes as much contagion as people singing. Restaurants are a calmer, more subdued breathing environment.


>Restaurants are a calmer, more subdued breathing environment.

Depends on the restaurant. Alcohol and loud music lead to raised voices and presumably greater transmission.


So what? KN95 mask it up.

Your gym population isn't going to clog up hospitals.

And we need to kill this myth that every old person lives with their kid.

Terrible, unscientific policy is causing more damage than if we did nothing.


I read that Japan is doing well because they use KN95 masks.

It baffles me we haven't been giving these away to old and obese people.


I'd be shocked if this ever made it to production.

I'd be more shocked if they did this for 2 generations of pixels.

The problem with vertical integration is how fast you can be made obsolete by competitors. It only takes one competitor making a better product, and your internal engineers are going to be begging for it.

Unlike Apple customers, the rest of the population is sensitive to performance and cost. If Google can't compete people move on.

Edit- Samsung is not a flagship company; their high pricing has nothing to do with performance. At best they are mid-tier.


> Unlike Apple customers, the rest of the population is sensitive to performance and cost. If Google can't compete people move on.

This is an odd phrasing: Apple's pricing isn't higher than the competition — flagship Android phones frequently cost noticeably more — and when it comes to performance it's very hard to beat the $400 iPhone SE2 at any price.

Vertical integration is what allows Apple to be ahead on both price and performance. The only way Google is matching that is with a strong long-term commitment to make similar investments. Apple will falter at some point but after years of ignoring the basics Google won't be able to take advantage of that — especially since at this point they're chasing Apple's previous generation so it'd take an extended sag for Google to catch up much less surpass with their current half-hearted strategy.


I think free2OSS is implying that Apple customers care more about the brand than the performance and cost.


A weird suggestion, given that the products they choose have better performance and cost than the competitors.

The facts suggest it is Samsung customers (and perhaps a few other brands) who care more about the brand than performance and cost.


There's also the OS and other aspects of the walled garden to consider when you must choose between an iPhone and some other flagship. I personally select for some perceived optimization of attributes such as "has Android" and "has good value for price", among other things. If Apple made a phone like that I might buy one. Unfortunately, all of their phones have iOS.


How many apps have you sideloaded?


Google curates the play store with a much lighter touch than Apple curates its store. Possibly because they don't want to drive people to sideloading.


It’s still a walled garden.


The wall is pretty low: when you download an apk and open it, it asks you if you want to unlock the gate. In earlier builds of Android, you had to find the setting yourself, which was at least a little harder.


What percentage of users actually have a version of Android where that’s all it takes to install a downloaded apk?


I have no stats. Only my own anecdotal experience.

On Android 2.x, I remember apk download installs just failing until you changed the settings; but everything else has been pretty easy. A Samsung KitKat phone I was using recently was even nice enough to have an 'allow just this app, or let it be wide open' checkbox.

About the only thing I sideload that's not on the play store is rooting toolkits, but sometimes it's convenient to get the APK another way (Google Play requires logging in to Google, which is hard to undo, for example)


This is cool, and much better than the impression I had from reading comments here.

I have frequently read that sideloading is impractical as a way to sidestep the store on Android when delivering software to consumers, but that sounds like it’s just wrong.

I wonder which version this changed in.


I don't think it's impractical, but it is more difficult.

Some users will be scared away by the warning text, and it's hard to prepare them with example dialogs, because there's such variation.

You'll need to build your own auto-update service, which (probably?) can't do delta updates like Play can. If you have a lot of users, and you can't dither the downloads[1], you may have a huge download peak.
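
For illustration, here's a minimal sketch of the kind of update-check endpoint such a service could expose (Python purely for illustration; the /latest path and the JSON fields are assumptions, not any platform standard):

    # Minimal sketch of a self-hosted update-check endpoint that a
    # sideloaded app could poll; it returns the newest versionCode and
    # a download URL, and the client compares against its own version.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    LATEST = {"versionCode": 42, "apkUrl": "https://example.com/app-42.apk"}

    class UpdateHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/latest":
                body = json.dumps(LATEST).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_response(404)
                self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8000), UpdateHandler).serve_forever()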

If you have native libraries, off-store downloads need to contain all the versions, but the store downloads only the arch-appropriate ones.

If you make apks for download, people will archive them and make them available to others, even after they're obsolete. This happens with play store too, of course.

People who download your apk directly are also likely not to have Play libraries or access to Google, etc. If your app depends on those things anyway, you're going to be missing functionality, or have to implement fallbacks for, say, maps and push, etc.

People are still going to look for your app in the play store, so you probably want it there too; at least if it's not something that Google will disallow.

[1] Android really likes grouping background timers, so you will probably experience peaks at local X:00 for all of your users. And local X:00:08 for your users on networks with slightly off clocks


Nearly 100%, but you're comparing against the wrong devices. When I run a Linux router, I am not choosing at random from all the vendors. It's the same for my phone. I choose which one to use on purpose. Not only can I install my own apps, but I can install my own system as well. This lets me run devices from 2013 on the latest version of Android.

Answers below until I am no longer rate limited:

> Sure but almost nobody can do that because they didn’t make the same careful choice as you did.

What does that have to do with anything? Just because all iOS devices are bad and many Android devices are bad doesn't mean all devices are bad. I choose from the good devices.

> Also, you routinely make unsupported claims

Show me one unsupported claim I have made.

> you are going to say nearly 100%, you’ll need to show a source.

I have literally never seen an Android phone that does not allow sideloading. I have heard of many that do not allow unlocking the bootloader, but sideloading is standard. I can't provide a source because the idea that an Android phone would not allow sideloading is so absurd that nobody would bother explaining why it doesn't exist.

> it contradicts what others say when I suggest Android is an open alternative to iOS.

What have others said?

Edit 2 for the second post:

> Usually when I bring up sideloading as a reason why people should choose Android if they don’t like the restrictions of iOS, people claim that it is too hard for regular users to do because of the configuration required.

The OS takes the user to the settings checkbox to check. This has been true for as long as I can remember.


Sure but almost nobody can do that because they didn’t make the same careful choice as you did.

Also, you routinely make unsupported claims, so if you are going to say nearly 100%, you’ll need to show a source.

I’m happy to accept this as true by the way - I would like Android to have this level of openness, but it contradicts what others say when I suggest Android is an open alternative to iOS.


I’m not going to respond in general to your edited comment, since it is not what I replied to originally.

Editing a comment to get round a rate limit creates a misrepresentation of the thread your interlocutor was responding to.

I don’t mind waiting for you to be allowed to respond in the normal way.

Nobody is talking about Android phones ‘not allowing sideloading’.

The question is only about what percentage of phones allow a sideloaded app to be installed from the web just by responding to a dialog, rather than changing a configuration.

Usually when I bring up sideloading as a reason why people should choose Android if they don’t like the restrictions of iOS, people claim that it is too hard for regular users to do because of the configuration required.

I was responding to a comment which said this had changed and now it is much easier.

I don’t believe nearly 100% of users have this experience yet, but I am curious to know what the number actually is, because it matters for developers.


> people claim that it is too hard for regular users to do because of the configuration required.

Who are these people? The error message has always linked to the settings checkbox to allow installing the app. The only thing that has changed is which checkbox it links to because the settings for installing apps from unknown sources have changed. Here is a video of the way it used to be. I know that you're going to say that you don't believe video proof, but this is all I have for you.

https://youtu.be/GuzKSUMyCUM


I don’t need video proof for this. Other people are confirming it.

‘These people’ are generally replying to the suggestion that if people don’t like iOS because it it doesn’t allow sideloading, then why not buy Android?

From what you and others are saying, sideloading may not even be an appropriate term - you make it sound like you can just install software from the web on Android directly.


> I don’t need video proof for this. Other people are confirming it.

So you take an unconfirmed report over video proof. Got it. No wonder your "facts" are all wrong.

> you make it sound like you can just install software from the web on Android directly.

Yes, that is the case. Sideloading is just the ability to install apps from somewhere other than the official store(s) on the device.


I take a report from an honest and reasonable interlocutor who has demonstrated domain knowledge on HN over a random YouTube video, yes.

If there is a wrong fact that I have stated, you would be able to show an example.


Don't call it a garden. That's just propaganda from the dictators.


IDK, a handful. The point I was trying to make is less about the selection of apps and more about the actual operating system.


Hundreds. I develop apps for my own devices. This is supposed to be a forum for technologists.


“Hundreds. I develop apps for my own devices. This is supposed to be a forum for technologists.”

Sideloading doesn’t refer to installing apps you have developed yourself.

It’s about distributing software outside of app stores.


> Sideloading doesn’t refer to installing apps you have developed yourself.

If you can do one, you can trivially do the other. It's effectively the same feature to the end user. I develop my own apps and upload them to cloud storage to install them on all my devices.


This is not correct.

You can install your own app on your own iOS device without using the store, but you cannot distribute to end users.

That’s why we say Android supports side loading but iOS does not.

The two things are not one and the same.


> You can install your own app on your own iOS device without using the store, but you cannot distribute to end users.

My own devices and the devices of my friends and family are end users. I do not have to connect the devices to my computer. On iOS, you cannot do this without paying Apple yearly unless you want to deal with reinstalling every week.

> The two things are not one and the same.

They are enabled by the same mechanism. If you have one, you automatically have the other.


This is called “moving the goalposts”. It’s a kind of fallacy.

You said: “Hundreds. I develop apps for my own devices. This is supposed to be a forum for technologists.”

I said: Sideloading doesn’t refer to installing apps you have developed yourself.

You said: If you can do one, you can trivially do the other.

This is clearly not true. You can install apps you have developed yourself on your own iOS devices for free, but you cannot distribute them. They do expire, but you can certainly install and use them on your own devices.

So then you changed the goalposts to “My own devices and the devices of my friends and family are end users”.

Which is not what you originally said. You originally said “for my own devices”.

We agree that sideloading is about distributing software to end users. Not just your own devices.

The two are not one and the same mechanism. The fact that you had to change your qualifier shows this.


Apple computers have always been slower and more expensive than their non-Apple counterparts, and this gap has been larger recently due to sourcing from the struggling CPU vendor and the struggling GPU vendor. Worse, the software adds a 30% performance penalty on top.

https://www.phoronix.com/scan.php?page=article&item=macos101...

Their phones were slower at productivity tasks than midtier phones from the previous generation for many years.

https://www.youtube.com/watch?v=emPiTZHdP88

https://youtu.be/hPhkPXVxISY

https://youtu.be/B5ZT9z9Bt4M

I agree that Samsung's phones are also overpriced.


“The software adds a 30% performance penalty on top”

That seems like a straight up falsehood.


Which is why I provided a link to show it. Whenever I see IntelliJ IDEs on Macs, they seem so sluggish, and the benchmarks in that link show that Java2D is several times slower on MacOS on the same hardware.

iOS is even worse, taking vastly superior hardware and still managing to perform worse on standard productivity tasks.


The post doesn’t substantiate the claim: “The software adds a 30% performance penalty on top”. It shows performance on an extremely narrow test which clearly doesn’t represent general performance.

As to your claim about iOS. There is literally nothing to substantiate it.

You are simply lying and hoping nobody checks.


> The post doesn’t substantiate the claim: “The software adds a 30% performance penalty on top”.

From the link I posted: "Ubuntu 19.10 meanwhile had a 29.5% advantage over Apple macOS..."

Ubuntu 20.04 is faster still on that set of benchmarks, and ClearLinux has a 10% performance increase on top of that.

>It shows performance on an extremely narrow test.

No, it includes a very broad range of tests. I just highlighted one of them, on which MacOS performs particularly poorly and which affects software a lot of us use.

> As to your claims about iOS. There is literally nothing to substantiate it.

I gave you three links to substantiate it. It also matches my own experience.


Those results rely on Java and OpenGL.

These technologies are long deprecated on Apple platforms.

It’s completely unrepresentative. But sure, if you rely primarily on Java or OpenGL for your workloads, I agree that a Mac isn’t the right choice for you.

It’s a lie to say “The software adds a 30% performance penalty on top” based on these results. It doesn’t.

If you have something credible to link to about iOS performance being slower, I'm interested, but I'm not going to waste time watching YouTube videos that nobody else is going to bother with.

If it’s that bad, you’ll be able to find some credible analysis in writing. I’m guessing you can’t.


> These technologies are long deprecated on Apple platforms.

This doesn't matter to the user, who is still using apps like IntelliJ IDEs. To the end user, this is a problem with MacOS, which does not allow GPU vendors to update their OpenGL drivers. Meanwhile, Ubuntu has far superior OpenGL performance and superior Vulkan performance on the same hardware.

> It’s a lie to say “The software adds a 30% performance penalty on top” based on these results.

You keep saying that, but you haven't explained why. MacOS is also twice as slow on git operations according to the benchmarks. Java2D and git are software that people use that are multiple times slower (not merely 30%) on the same hardware.

> I’m not going to watch YouTube videos.

Then I'll post text summarizing them. Android on midtier previous generation phones launches apps to interactivity faster than iOS running on latest generation hardware, and this has been the case for about ten years now. There are hundreds of videos on YouTube substantiating that.

Edit to respond to below comment due to rate limiting:

> YouTube videos, however many, are not a credible source of operating system performance claims. You can find YouTube videos supporting flat earth theories or that Obama is a lizard alien.

The YouTube videos don't say that Android is faster. They show that Android is faster. You seem to not understand the difference between people claiming things are true and people showing things are true.

It's the same with the 30%. I showed a suite of benchmarks where MacOS is 30% slower on average. It was many times slower on a few specific benchmarks I called out.

> If you are a heavy Java or OpenGL user, don't buy a Mac.

I'm glad we can agree on that. You forgot heavy git users.

Edit 2:

> It creates a dishonest impression of the thread.

How does replying with edits create a dishonest impression of the thread when I very clearly indicate which comment I am replying to? If you would like to respond normally, there is a very easy way to enable that. Simply upvote my comments to disable the rate limiting.


If you are a heavy Java or OpenGL user, don’t buy a Mac.

If you are running a git server farm, don't use Macs. If you are a developer using Git, you'll be just fine.

These just aren’t representative of most people’s experience.

It’s a lie to say the software is 30% slower.

It’s even a lie to say it’s 30% slower for an IntelliJ user.

It’s only 30% slower on a narrow benchmark.

YouTube videos, however many, are not a credible source of operating system performance claims. You can find YouTube videos supporting flat earth theories or that Obama is a lizard alien.

If you can’t find any article anywhere substantiating your claim, I think we can conclude that it’s not supported by evidence.

If it was that bad, a reputable site like ars, or anandtech would have shown it.


> It’s a lie to say the software is 30% slower.

I gave you a source. The sluggishness is something I notice when I use MacOS.

> If it was that bad, a reputable site like ars, or anandtech would have shown it.

It is that bad. Hundreds of YouTubers have shown it, and you can do it yourself to confirm. I have friends who have iPhones. The launch times are noticeably slower.

I'll even give you my speculation for why this is the case in order of impact:

1. Android's zygote based process initialization.

2. Developers optimizing for the $50 Android phones in the long tail.

3. GC instead of ARC.


Ok, so now we can see that you lied.

The software is not 30% slower.

There are a set of cold boot launch times which can be shown to be slower in a video, but this is not a metric of software performance nor does it impact users.

You’ve attempted to use a set of YouTube videos to justify a false statement.

As I have said before - if iOS was slower than Android you’d be able to find a reputable source to show it through analysis, not some staged YouTube videos looking for clicks.


> The software is not 30% slower.

I showed you that it is, and you keep saying that it isn't. I even gave you some examples that are more than 100% slower.

> There are a set of cold boot launch times which can be shown to be slower in a video

That's what I said they were. You kept not believing me even though I showed proof. Now you say that it doesn't matter. It is in fact the biggest thing that matters for productivity apps, aside from UX, where Android wins by a bigger margin due to smart replies and other actions in notifications.

> You’ve attempted to use a set of YouTube videos to justify a false statement.

You keep saying that, but you've just agreed in this comment that my statement was wholly correct.

Get your fingers out of your ears.


What you said is this: “Apple computers have always been slower and more expensive than their non-Apple counterparts, and this gap has been larger recently due to sourcing from the struggling CPU vendor and the struggling GPU vendor. Worse, the software adds a 30% performance penalty on top.”

This is a false statement. The YouTube videos do not change that.

Moving the goalposts to “the most important thing for productivity apps”, doesn’t change the original statement.

Setting aside the moved goalposts, even your new statement is false. Cold boot time is not the most important thing for productivity apps, since cold starting a regularly used app is rare.

The fact that you know these videos only show a rarely occurring situation, shows that you know you were lying when you made the original statement: “Worse, the software adds a 30% performance penalty on top”.

Also I said: “There are a set of cold boot launch times which can be shown to be slower in a video”

To which you said: “That's what I said they were.”

If this is true, you’ll be able to link to the comment where you said you were talking about cold boot launch times that precedes mine.

The pattern here is that you make a general statement which is completely false as written, and then attempt to change the goalposts and justify a much narrower statement that doesn’t actually support what you originally said.


> This is a false statement. The YouTube videos do not change that.

I never said they did. Can you not read? How many straw men do I have to burn down to have a discussion with you? The substantiation comes from an article with benchmark results.

> Moving the goalposts to “the most important thing for productivity apps”, doesn’t change the original statement.

I made two separate claims: one that MacOS is slow and one that iOS is slow. There are two separate goal posts that I put up, and neither one has moved. It's like I'm arguing with a post.

> If this is true, you’ll be able to link to the comment where you said you were talking about cold boot launch times that precedes mine.

I said it is faster at launching apps to interactivity. This holds for cold boot and otherwise. The apps launched after the first in each video are long after cold boot.

Your reading ability is so low that it is useless to carry on a discussion with you. No wonder you use Apple products. You're the rube who actually believes Apple's marketing.


> I made two separate claims: one that MacOS is slow and one that iOS is slow.

No. That’s a new statement you are making now.

You made this claim:

“The software adds a 30% performance penalty on top.”

Which is false, regardless of which platform you are talking about, and not supported by the links you shared, which in both cases were corner cases that cannot support a general claim like this.

As I have said before, if this claim were true, you’d be able to find a reputable source, e.g. Ars or Anandtech.


Please stop replying with edits.

It creates a dishonest impression of the thread.


It's hard to imagine Apple's brand maintaining its luster without good performance, and when Apple sells over 1B phones, it's hard to imagine that price-efficacy wasn't a huge part of the package.

On the other hand, Androids have a reputation for somehow getting jankier or more sluggish over time, and that surely must affect the platform's brand and overall standing with consumers.


I think Apple is on the hook for $100M+ for degrading performance of their phones over time intentionally.

https://www.npr.org/2020/11/18/936268845/apple-agrees-to-pay...

I understand there are arguments about why this is a good idea, but Apple is settling, which tells me a fair amount about how Apple thinks those arguments would land in court.


I find the opposite true.

I imagine you watch lots of Apple ads.


> Vertical integration is what allows Apple to be ahead on both price and performance.

I don't know about that. The processor is the main thing they have vertically integrated, right? And if I go look at a flagship like a Galaxy S20 Ultra 5G I see a BOM of $57 for the processor. (It does have a hideously expensive modem for 5G but it didn't need to have that.)


They control the entire system on a chip — remember how long it was before Android devices started to have the security features which Apple introduced in the iPhone 3GS? That took Google getting phone vendors to get the CPU vendors to all work together, whereas at Apple that's a couple of teams all working on the same product with the same incentives to make it succeed.


With Apple, it was security theater. With multiple vendors working together, it has to actually work to get everybody on board with putting in the resources.

https://www.wired.com/2009/07/iphone-encryption/


Consider the knock-on effects though - that processor isn't necessarily optimising the things Samsung might want, nor is Android, which costs them more in other ways (I'm particularly thinking of the fact that Android phones seem to have to include a lot more RAM than iPhones, so that's another chunk of BOM).


>Apple's pricing isn't higher than the competition

Back in the Intel days you were paying twice as much for a MacBook that used the same CPU as a Windows laptop with similar specs.


That might have been true for a couple of years if you just compare pure computing power; as soon as you factor in build quality and compare it to ThinkPad X1s, XPS & Co. you were in similar price ranges again. Or if you don't care about build quality, another classic is the high-res displays (and software support), which took the competitors years, and some might argue that they're still not there.

Oh, and there were also a couple of generations where Apple had pretty much kind-of exclusivity on Intel's newest CPUs, and they were really only available in MacBooks for the first few months.


Build quality is a term non-engineers use. It's a marketing term.

Seems like you got sucked up into it.

Edit- butterfly keyboard, AirPods, bad iPhone signals come to mind


Yeah, all stuff that sucked. There is even more: Staingate, Bendgate, ... Never said Apple is flawless :) You think the comparable offerings from Dell, Lenovo or HP are flawless? Oh, I have news for you.

You know what, if you look close enough, "they all suck".

Nice Ad hom though. :)


Apple and build quality should not appear anywhere in the same sentence.

Watch any of Louis Rossmann's "Think Different" series to see Apple build quality...


Not for comparable quality, you weren't. The prices varied depending on features but it wasn't twice as much unless you failed to control specs (I remember forum posters accusing everyone of being fanboys when they came up with a number like that by comparing, say, an Apple device with an SSD to something with a previous-generation processor and spinning metal drive). If you're buying comparable quality you get fairly comparable cost.


Back in the Intel days? You mean one month ago? lol


Apple has a first-class in-house design team. Google is subbing out, so they don't get any long-term benefit from an experienced employee base, and they have to share a slice of the extra profit with someone else in addition to paying larger license fees to ARM than Apple needs to.

This is probably just a shot across Qualcomm's bow to get a better deal out of them, the same way PC vendors threaten to switch to AMD to squeeze Intel.


I think that's very much the case.

Qualcomm has been price gouging while limiting features from each new release (for example, the new 888 doesn't have AV1 decode support... WTF!)

This is google forcing them to up their game if they want to keep playing.

Will google keep doing this? Probably not (IMO). Rather, it's likely just a demonstration of "Hey, we don't need you, so straighten up!"

That being said, if Qualcomm ignores them (since Pixel isn't a major player in the phone market) google may keep producing their own chips.

Another interesting aspect here is that Google may decide that they don't want to keep paying for the ARM licenses. This is an avenue for them to start getting RISC-V as a first class Android citizen.


Android on RISC-V will go over as well as it did on Atom. Apps with native ARM code will suck or be unusable.


I actually think there's a pretty easy way around this for Google. They could create an ARM -> RISC-V translation layer. That works well because RISC-V already has a very limited instruction set; to make things fast, you don't need to do a lot of fancy transforms.
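
As a toy illustration of how mechanical the easy cases are (Python just for illustration; a real binary translator also has to handle register renaming, condition flags, atomics, and SIMD, which is where the actual work is):

    # Toy sketch: many AArch64 ALU instructions map 1:1 onto RISC-V
    # mnemonics, so the naive path is close to a table lookup.
    # Register renaming and the genuinely hard cases are ignored here.
    ONE_TO_ONE = {"add": "add", "sub": "sub", "and": "and",
                  "orr": "or", "eor": "xor", "lsl": "sll", "lsr": "srl"}

    def translate(insn: str) -> list[str]:
        op, operands = insn.split(None, 1)
        if op in ONE_TO_ONE:
            return [f"{ONE_TO_ONE[op]} {operands}"]
        # Anything else expands to a short sequence, e.g. a load-pair
        # (ldp) becomes two ld instructions.
        raise NotImplementedError(op)

    print(translate("add x3, x4, x5"))  # -> ['add x3, x4, x5']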

The issue with doing the same for ARM -> x86 translation is that the x86 instruction set is vast, and getting the best performance requires you to actually use those instructions effectively. That requires you to merge instructions that you wouldn't have merged (think of things like the lea instruction).

There's less of an instruction impedance mismatch going from ARM to RISC-V.

Interestingly, I think going from x86 -> ARM or RISC-V would present less of a performance problem, because most of the work would be splitting one instruction into many. The evidence for this is x86 -> IA-64 translation, which resulted in something like a 10-20% performance loss vs. natively compiled IA-64 code.


The real risk to Qualcomm is if Samsung uses this in their phones and Chromebooks as well.


Samsung had their own mobile processor division as well.


I don’t get how you can look at Apple’s mobile offerings, which have been years ahead of all the Android competition, and come to the conclusion that Apple customers don’t care about performance.

I’d say Apple’s customers are far more sensitive to performance than Google’s given what Android users have been willing to put up with (e.g. flagship SoCs that are years behind what’s in last year’s iPhones). If you factor in broader aspects of performance (e.g. Face ID, fingerprint unlock speed) it’s clear that the _only_ thing most Android users care about is cost. I’m not saying this to diss them, I just think the Android manufacturers owe it to their users to actually produce competitive hardware and software.

P.S. if you factor in longevity, Apple products are usually cheaper, too.


It's not about features or performance for most of the population, who are light users. Most users of anything are not power users and modern smartphones are good enough for casual usage, since 2016 or so. Even mid range Android phones.

In the US iPhones are entrenched because of network effects (iMessage or whatever it's called) plus the Apple ecosystem. In the rest of the world where this is less the case, iPhones are a status symbol. Not even expensive Android phones have the halo iPhones have, especially in poorer countries.

Yes, people will enjoy the extra performance or extended software support, but that's not why they will buy them. I doubt 99% of regular users even know about those aspects.


> Unlike Apple customers, the rest of the population is sensitive to performance and cost. If Google can't compete people move on.

Maybe they meant both independently. As an example:

- A friend of mine just upgraded out of his iPhone 6. Until recently performance and storage seemingly just weren't that important to them.

- Another friend of mine will just buy iPhones, no matter the cost.

Both of these friends stick with Apple because they like and care about other things in the ecosystem more than performance or cost (independently), whereas OP was perhaps implying that Android users tend to care more about the "performance to cost ratio", since they could just as well buy a different phone from another maker and get the same Android experience. This is, of course, in the context of sticking with one phone maker or another.


As someone who upgraded this year from the 6 to the SE 2, I held out not only because I like the ecosystem, but also because I'm not shelling out almost one grand for a phone. The SE hit the sweet spot, so I ordered one the first day it came out.


I wish I had waited and gotten an SE 2, especially because Touch ID is way more convenient than Face ID now that one's often wearing a face mask, haha.


Apple has a 10-year lead on Google for running OSes on in-house silicon. If there is no intention to compete with Qualcomm, how are they going to solve the economies-of-scale problem and get the chips affordable + performant? The difference between the volumes of Chromebooks/Pixels sold in a week vs. iPhones in a week is stark.


"X has an insurmountable lead" almost always turns out to be wrong. You could have made the same argument with x86. Intel/AMD used to at least have the argument that they had vertically integrated because they not only designed the chips, but also designed the fabs.

Apple is a fabless designer; other vendors have access to TSMC as well, as long as it has spare capacity. Realistically, other ARM vendors are about 2 years behind. A Snapdragon 875 scores the same as an Apple A12 on single-thread Geekbench.

On GPU and TPU, Apple isn't that far ahead at all. TPUs are relatively simple devices, and the Snapdragon 888 already has almost double the A14's performance in TOPS on paper. GPU-wise, the A14's (PowerVR-derived) GPU is roughly equivalent to a 2016-era NVidia 1060. AMD's RDNA2-based cores for SOCs are likely more powerful and more capable (e.g. raytracing acceleration).

Much is made of how far Qualcomm is behind Apple, but like with AMD and Intel, Qualcomm has a wider market focus to address. Apple sells only a few SKUs, Qualcomm needs to make chips for a much wider array of demands, and so they're a jack of all trades, master of none. In much the same way, AMD and Intel have desktops and enterprise vendors to satisfy, including people running SMP systems. Does the M1 support multiple M1 SMP? Does it support ECC? The other vendors are kind of hamstrung by trying to make one architecture that pleases too many markets at once.

Look at the Anandtech Snapdragon 888 article: https://www.anandtech.com/show/16271/qualcomm-snapdragon-888...

"Qualcomm’s 25% generational boost is also less than Arm’s advertised 30% as the new S888 continues to use a 4MB L3 cache for the CPU cluster, versus Arm’s envisioned 8MB configuration for a high-end 5nm SoC with the new X1 cores. Qualcomm explained to us that this was simply a balance between cost, implementation effort, and diminishing returns of a higher cache configuration design."

Basically, Qualcomm needs to worry about what their customers are willing to pay for the chip and how much work it takes to integrate. Apple doesn't, if need be, they can bump the price for a more expensive SOC, Qualcomm Android vendors can't.

25% CPU performance uplift and 30% GPU performance uplift for the 888. The A14 gets 1583 single-thread in Geekbench; the 888 will likely turn in a final single-core score between 1300 and 1450, only 9% less than the A14. The situation is the same for multi-core, roughly on par, and on Antutu the Snapdragon 875 beats the A14 substantially, and wins on several 3DMark tests.

Apple does not have an insurmountable lead over the other vendors in terms of SOC, I'd argue that their primary advantage right now is their software has always been better. In particular, WebKit's JIT for ARM does a lot better than V8 for ARM, and Android Dalvik does not produce code that runs as well as LLVM on Swift (plus there's generational GC vs ARC+refcounting).

Otherwise, a 9% CPU advantage in single thread performance is nothing to write home about.


> Samsung is not a flagship company; their high pricing has nothing to do with performance. At best they are mid-tier.

Their high pricing has to do with the slew of software and hardware features they include, many of which are executed at least reasonably well and later trickle down to mid-tier brands like Google's Pixel line or more conservative high-end brands like Apple. For instance: DeX, more flexible fast USB C and wireless charging than most manufacturers, pen/remote support with in-device charging and corresponding apps, Samsung Pay with magnetic stripe emulation, etc.

Samsung isn't focused on raw computing power, because they largely depend on the same chips as most other manufacturers, but if you've used a newer Galaxy S/Note device, the level of hardware polish and the random "gimmicky" but occasionally useful features far outpace a Pixel or similar.

Disclaimer: I currently own a Pixel (because there are other things that Google competes on) but last owned a Note 9.


The problem I see is, will they support it? Or will they send it to the graveyard?


Maybe this is a logical progression for the TPU chips: having on-board cores to manage the TPU side of things does make sense for performance, and if they can scale this to other products, then why not. Given that mobile CPU performance has caught up, power usage has become the biggest saving left on the table, both as PR and as direct financial savings. So I'd hazard an educated stab that maybe that's what is afoot here.

However it pans out, the Linux kernel is sure going to see some nice ARM optimisations over the coming years out of all this drive.


The smart move for Google here would be to debut their own chips in Pixels, but then sell the chips to other phone manufacturers to use in their phones. Assuming the chips provide enough raw performance and interesting hardware acceleration, that would make it a reasonable business for Google to stay involved in.


Democrats want fewer tariffs?

I would love a Neoliberal Democratic party.

