Hacker News: romanoderoma's comments

> they want to get the f--- out the EU as soon as they can.

Not in the EU where I was born and raised.


English isn't an official EU language anymore; the only country that chose it was the UK.

https://www.politico.eu/article/english-will-not-be-an-offic...


Do you have any references to English actually being phased out in the EU? The Politico article was written just a few days after the Brexit referendum, and 4 years later I can only take it for empty sabre-rattling.


Making either French or German the language of diplomacy wouldn't be the first time in history that was done.


Proposing either of those two as a lingua franca (irony aside) would be a surefire way to start an enormous argument in the EU.

Surprisingly, with the UK out of the EU, English is perhaps best placed to fill this role, as it is now a "neutral" language, spoken natively only by the Irish and Maltese contingents who, broadly speaking, don't wield the same influence as France and Germany.


Legally it could be done, but not practically since English is the language we actually use to communicate.


Greek is the most iconic European language.


> Do you have any references of English actually being phased-out in EU?

It's not clear yet whether it will be used in official documents after the Brexit agreements are signed, but the UK was the only country that chose English as an official language.

There's a hard requirement in the EU to translate every official document into every official language. If English is no longer an official language, that requirement will cease to exist, and slowly there will be no official documents in English anymore, because translation is also a very costly operation.

Unless Ireland or Malta applies for English as its primary language.

It's also a political stance: Macron is pushing to use French more and more in the EU institutions.

It doesn't mean that English won't be taught in schools or that it will fade as the most spoken language in the EU (it still is, by a large margin), but that it will move away from UK English and become more similar to the English that 75% of the more than one billion English speakers in the world speak as a second language.

Native English speakers not understanding, or not being understood by, non-native speakers is already a thing; the gap could widen in the future.


> It's not clear yet whether it will be used in official documents after the Brexit agreements are signed, but the UK was the only country that chose English as an official language.

The official languages of the EU are set by Council Regulation No 1, which declares English to be an official language. Brexit did not change that regulation in any way; it would take a unanimous vote of the Council of the European Union to amend it, and it is very unlikely that Ireland would vote to remove as an official language the first language of the vast majority of its citizens.

https://eur-lex.europa.eu/eli/reg/1958/1(1)/2013-07-01


Funny side note: Sweden never registered Swedish as an official language with the EU, because they thought it would be a waste of EU funds to translate all documents into Swedish, since most people there can get along in English.

Then Finland went ahead and ruined it by registering both Finnish and Swedish as their official languages (they have a Swedish minority there) :)


It’s a good story, but Finland and Sweden joined the EU at the same time in 1995, so there wouldn’t actually have been a period when Sweden was a member but Swedish wasn’t an official language.

Of course the Swedes would like to blame things on Finland if they can ;)


What language EU documents are written in is entirely irrelevant to EU citizens, though (they'll get translations anyway). What matters is what people choose as their second language, and I'm pretty sure English is and will remain the most popular choice.

(And US cultural influence is the main reason for this; the UK hasn't really mattered much in that regard since around the 19th century.)


Yes and no.

First of all, I'm European.

Use of English as a second language, it might surprise you, has dropped over the past 10 years: it was spoken by 52% of the population in 2012, and it's now down to 44%.

As for native speakers, English is the fourth language in Europe by number of native speakers, after German, French and Italian (if we don't count Russian, which is first by a large margin with 120 million native speakers).

Secondly, things can change rapidly.

20 years ago the most studied second language in Italian schools was French.

Thirdly, European countries have spoken to each other for centuries: when I go to Spain I speak Spanish; when I go to France or Belgium I speak French; I can also speak some German.

Not perfectly, of course, but we still understand each other better than through English, because of the similarities between the languages.

So yes, a second language counts, but only when the other options are worse (I speak English with Swedes or the Dutch; I've been studying Swedish on my own, but my progress with the language has been very slow).

In many schools we study more than 1 foreign language.

I studied French in primary and middle school then English and German in high school.

They were regular public schools, far from the best.

It's quite common elsewhere.

My cousins live in Belgium, their kids study French, Flemish and of course Italian in school.

Many Italian researchers, for example, move to Germany and learn German, because it's better to know it when you live there; English is used to bootstrap social life and at work.

CERN is in Switzerland, and even though I believe everyone there speaks English, in Geneva the official language is French; learning it helps a lot outside of work, more than English. Move a few km and they speak German; go south and they speak Italian, and you'll still be in small Switzerland.

Where many people are naturally bi- or trilingual.

Adding English as a third or fourth language is very easy for them.

Eventually everybody will learn some English, undoubtedly, but it's going to be foreigners' English, which is different from native English and is sometimes pure nonsense to native speakers, and vice versa.


As an American, I was surprised by the amount of English education I encountered in Europe.

Though the places I visited (Sweden, Rome, Greece, the Netherlands, France, Spain, Vienna, Ukraine, Berlin) all have above-average levels.

Do you see the Brexit changing this?

Ref: https://en.wikipedia.org/wiki/English_language_in_Europe


Actual map of English proficiency in Europe, you're welcome ;)

https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6Ikp...


Nope, not at all. The EU has essentially nothing to do with English being spoken in Europe. Brexit is a legal matter.


Nah, the amount of English education across Europe is a function of global business and cultural production (Hollywood and the internet). Those will not change even after Brexit.


China is the largest producer of cultural products in the world right now; Hollywood itself made blockbusters there, financed and co-produced by China.

It has already changed, before Brexit was even a thing.

From 2013:

China is now the lead exporter of cultural goods, followed by the United States. In 2013 the total value of China's cultural exports was US$60.1 billion; more than double that of the United States at US$27.9 billion.


Should use US/EU English instead.


I think you meant Muwekma Ohlone.

If we are trying to be precise it was the Spanish who stole their land.

Not all Europeans are made equal.


I did, thank you!


That's not true.

You can, as an employee, walk away whenever you want; you just forfeit the salary for the difference between the notice period and when you actually left.

Example: if you have to give 30 days' notice (very common in Italy) and leave on the spot, after your resignation has been accepted, you have to give the company the equivalent of 30 days of pay.

Which is exactly the money they would have paid you anyway if you had stayed for the entire notice period.

But usually the employee and the company reach an agreement before it comes to that.


> they could claim wrongful termination in this scenario and likely be awarded the pay they would have received for the missing notice period.

Correct.

That's what happens in Italy, for example, and I believe in large parts of Europe (I know for sure in France, Germany, Sweden and Spain).

The missing pay is granted anyway, but with a wrongful termination the employee will most likely obtain fair compensation on top.

A couple of decades ago I was awarded a year of salary, plus the 3 months of notice we agreed upon when I was hired, for a wrongful termination after an acquisition.


> with extremly strong worker protection law

Serious question: what makes them strong if employees can't organize a simple strike?


You can organize a strike BUT this is absolutely not the way you do it.

You have to follow strict procedures, work with your union, and sit down at one or more formal negotiating tables with the representatives of all the interested parties before even thinking about giving an ultimatum like that. And you certainly cannot communicate that way directly through your work email.


But can you or can't you be fired for that?

Because in my country you can't be fired for using the work email for that.

Communicating with other workers using their work email is absolutely allowed; courts have ruled on it several times [1].

I usually receive trade union communications (there are several unions) on my work email, sent from the unions' work email, because they have been authorized by the company to send them.

In this case she wrote directly to workers using work email, but with no union or labor protection laws it doesn't make any difference whether she could or couldn't: she could be fired anyway, without the company having to provide any reason.

In countries with strong laws protecting workers, she would have written to the union members and they would have done the same thing: write directly to the workers.

Of course she did it at Google, so it is different, but here the strict procedures for calling a strike are only necessary if a public service that requires continuity risks being interrupted; otherwise unions are only required to alert the company that the strike is going to happen, and have no requirement whatsoever on how to organise it.

Which sounds logical to me: strong protection means, IMO, the freedom to collectively counter the actions of the company; if that's not allowed, the protection is not strong.

[1] Court of Catania, Labor Section, February 2, 2009 "The RSU employee can send trade union communications by e-mail to the employees of the company during their working hours and to their company e-mail address using his personal e-mail address"


In many countries you do need a vote for a strike to be organised. You can't just spam people with a request to stop working. Strikes are a powerful tactic that has evolved a lot of formality around it, to try to ensure the outcomes aren't totally destructive. Italy is a rather unusual exception to this. Perhaps it's a contributing factor to the long stagnation of the Italian economy.


> Perhaps it's a contributing factor to the long stagnation of the Italian economy

Perhaps.

But it doesn't explain the stagnation of the Japanese economy, where there are no Italian-style unions.

Or why France did much better despite having even stronger worker protection laws than Italy and wilder strikes (the gilets jaunes, for example, or last year's strikes against the pension reform, where public transportation workers went on strike for weeks without even announcing it).

Even countries like Germany, Singapore, Switzerland and Finland are doing worse than Pakistan in the GDP growth race.

My point was that if there are stronger worker protection laws somewhere else and the laws of your country are weaker, they are not very strong; they are moderately strong.


Re: last point. Fair enough. That point is sound.

Japan seems to have evolved a work culture very similar to strongly unionised societies but without unions. Japanese salaryman culture is famously a culture of employment for life with unusually strong loyalty between employee and employer, hence weird things like "banishment rooms" that you don't find elsewhere. If it's the end results that matter and not the means, Japan might not be a good counter example.

I think French strike law sounds tighter than Italian strike law. The French are famous for striking, but strikes must still be a collective decision related to a specific set of issues, whereas Italian strike law really does sound incredibly broad and vague.

With respect to Pakistan, that doesn't mean anything: poor countries always have very high GDP growth. It's easy to grow something small and backwards by a lot, because you can get a lot of relative growth just by copying what other countries do, and less absolute improvement is needed to gain a percentage point of growth to begin with. You can only compare GDP growth rates between countries of a similar level of wealth.


Do Google employees work in barracks?


Unpopular opinion: I never loved Slack, and the Linux client wasn't working really well for me, so I had to use the web one.

I work for a company with thousands of employees, and we used Slack in our small team; we chose it because the majority were already familiar with it (I voted for Mastodon), and we spent part of our team budget to pay for it.

When COVID hit and we started working from home, the company chose Teams as the collaboration platform for all of us.

I don't love Teams either, but I honestly have to admit that the voice/video calls are incredibly reliable, even on a shitty DSL connection I found myself using that had a packet loss ratio of around 50%, where no other alternative worked.

For everything else Teams is worse than Slack, except for the integration with the AD domain and sharing documents with colleagues who have never used anything other than Windows.

On average Teams is as good as Slack, and for casual users who only use chats and calls there is no difference at all.

I can see why companies are switching to Teams: they have been MS customers for years and Teams comes bundled; it's one less contract to sign and one less vendor to start a relationship with.


> (I voted for Mastodon)

Does Mastodon do well as a chat app? It doesn't feel like 'Twitter for companies' would do well in that respect if everything has to be a threaded conversation.


Curious, what was lacking with your experience of Slack on Linux? I haven't experienced any issues at all.


Nothing was lacking; it simply didn't work as expected.

I'm on Debian + KDE Plasma 5, and the tray icon disappeared, the UI froze for no obvious reason, sometimes only the frame of the window was visible while the rest was a black square, notifications were hit and miss, sometimes the mic wasn't released after a voice call and I had to restart PulseAudio to free it, and sometimes the client couldn't detect the audio devices at all.

I believe it has to do with my configuration, but I never had the same problems with the Teams client.


Reddit.


But honestly, does it matter?

People buying a threadripper today don't care about it.

What matters to them is the performance. My gf is a 3D artist; she doesn't care if some CPU consumes more or less power than another (when I say she doesn't care, I mean she doesn't even know that CPUs have very different power requirements); she needs the fastest gear she can buy on her budget.

Apple M1s are not it.

There are many others who don't care because the hardware stays at the office, where someone else pays the bills; and they don't care either, because the electricity bill is the last of their problems, and fixing it by buying more efficient computers would mean spending a lot of money up front to replace at least half of the stock.

How many more months of electricity could that money pay?

A lot.

People buying Apple for its M1 low power consumption are a niche inside a niche.

So I think the reasoning stands: Apple M1s are not a real threat to their competitors, because the market that really cares about their strengths is smaller than the one that doesn't, and Apple will keep largely missing the second one.


Laptops have outsold desktops every year for the last decade+. The M1, being a mobile friendly chip, is addressing the larger market. People may not care about power usage from the wall, but they do care about battery life.

Also, the fact that the M1 is even being mentioned along with a desktop powerhouse like the threadripper says it all. The M1 is the lowest end mobile Apple Silicon. We know higher end mobile chips are coming along with desktop ones. The M1 has put all the chip manufacturers on notice, much the same the way the iPhone did (to the point RIM thought Apple was lying about the iPhone because they thought the device was impossible).


It's only being mentioned alongside threadripper because the article went overboard and claimed the M1 "embarrass all other PCs — ... including ... every single machine running Windows or Linux"

This is a joke. The M1's performance is nowhere near the Threadripper's. Even on per-thread performance, new AMD CPUs perform better than the M1, and they have many more cores. The M1 can only win these comparisons when we start adding all kinds of conditions: power draw, heat, price point, etc. They need to add those conditions to redirect the conversation, because on raw performance the Threadripper destroys the M1.

The article made the mistake of leaving out those conditions to make it seem even more impressive than it is, but hyperbole and fantasy will always sound better than reality. Reality requires those conditions. That doesn't detract from the M1's impressiveness, but it's foolish to forget that and go off making ridiculous claims like the article has done.


That was sort of my takeaway as well. I use desktop computers for most of my more demanding work. Laptops are great and (like smartphones, tablets, etc) any improvement in their capabilities is to be welcomed--but there's a difference between a headline that reads "New laptop is the fastest computer ever!" and one that reads "New laptop faster than all previous laptops!"

I don't do the most intensive tasks on thin, battery-powered computers for a reason. I try to do those on the box that's always plugged into mains power with heatsinks and fans as needed, plenty of ports to plug things in, and a big monitor. A more performant portable computer is awesome, but until I can spend a similar amount to make my workstation smaller without losing capability, I'll likely keep using it for heavy lifting alongside a cheaper, less powerful laptop or tablet for lighter work or remoting in.


Android has outsold Apple 3 to 1.

Now if we look at what people are buying and ask ourselves 'are these the people who care about TDP?', the answer is simply 'no'.

The majority of laptops sold are sub-700; as the previous post said: low budget, low power, and in education, markets that Apple has never even considered (except education, but that's mainly college students in the USA).

I'm not saying the M1s aren't good, but that they won't shake the market the way Nvidia buying ARM could.

P.S. The M1 is a Zen 3 competitor; the Threadripper has been mentioned for the exact opposite reason: because that's the state of the art for people looking at performance in personal computing (while in HPC, ARM is prominent).


> People buying Apple for its M1 low power consumption are a niche inside a niche.

Got it, watching Netflix in Chrome without the laptop getting hot is "a niche inside a niche" while GPU intensive 3d graphics workflows are mainstream.


This is doable on practically anything. Talking about being powerful with low heat/power consumption implies heavier workloads than Netflix because otherwise the "powerful" part is irrelevant.

Of course I'm sure you can find laptops with terrible cooling design that would get uncomfortably hot watching Netflix, but there's plenty of thin and light laptops available for < $1000 that would do just fine too.


Even my 5-year-old passively cooled Core m3 12" MacBook streams Netflix for hours on battery without getting hot.


>watching Netflix in Chrome without the laptop getting hot is "a niche inside a niche" while GPU intensive 3d graphics workflows are mainstream.

Well then I have to say Apple achieved what my 7-year-old Nexus 5 already could... bravo.


If not getting hot while watching Netflix is the benchmark, I can show you a 400-euro laptop that can do 9 hours of Netflix on battery (or 7 hours of 1080p 60fps streaming).

Don't need no M1

AMD and Nvidia have sold more GPUs than Apple has sold laptops, by a large margin.


People using powerful laptops care about battery life, though. For someone moving from a "workstation"-class laptop, the M1's value is significant.


>For someone moving from a "workstation"-class laptop, the M1's value is significant.

Em, no: the M1s are not comparable with workstation-class laptops (not CPU-wise, nor RAM-wise, and especially not GPU-wise).


> she needs the fastest gear she can buy on her budget.

> Apple M1s are not it.

If a person had a $1000 budget, they could buy an M1-equipped laptop from Apple. Is there something significantly faster at that price?


Or a desktop with 64 GB of RAM and two previous generation GPUs


64 GB? From where?

I just looked at Dell and an XPS desktop with 16 GB of RAM starts at $650. You aren't going to be able to add two GPUs and a monitor for $350.

At HP you can get to 32 GB for $650 but again you still need to buy GPUs and a monitor.


64GB of RAM costs < 300 euros.

A lot of studios sell their old GPUs (as in, a generation old) for cheap.

If you stay out of vertically integrated supply chains, you can buy the best your budget allows.

If you are saying that you cannot assemble a laptop of the same quality as an Apple one, I agree.

If you are saying that Apple laptops are what people want because they consume 20 watts at full load, I disagree.

The market for 'good enough' products is always going to be larger than the one for 'the best money can buy'.

Apple will never make 'good enough' products, hence they are not competing with the bulk of what is being sold right now.

It could change in the future, but not in the immediate future IMO


I'm not saying any of those things. I'm saying a $1000 laptop from Apple is a pretty good value for a lot of people and I'm not sure you can do a lot better at that price point. When you include things like having an Apple store across town where you can go when things break, it pulls even further ahead.

The best that money can buy changes when you set an actual budget. Today, Apple may be the best that $1000 can buy.


> 64GB of ram cost < 300 euros

Not all RAM is equivalent. If one wants 64GB of the kind of RAM M1 uses, which is very high performance, it will cost more than 300 euros.


M1 is using standard LPDDRx. It's not "very high performance". It uses a different interconnect -- that's it.

I guarantee that 64GB does not cost anything near EUR300.

You might be thinking of HBM[2] which has a wider I/O path and costs more.


> M1 is using standard LPDDRx. It's not "very high performance".

AnandTech disagrees with you: https://www.anandtech.com/show/16252/mac-mini-apple-m1-teste...

Besides the additional cores on the part of the CPUs and GPU, one main performance factor of the M1 that differs from the A14 is the fact that’s it’s running on a 128-bit memory bus rather than the mobile 64-bit bus. Across 8x 16-bit memory channels and at LPDDR4X-4266-class memory, this means the M1 hits a peak of 68.25GB/s memory bandwidth.

Later in the article:

Most importantly, memory copies land in at 60 to 62GB/s depending if you’re using scalar or vector instructions. The fact that a single Firestorm core can almost saturate the memory controllers is astounding and something we’ve never seen in a design before.


Anandtech is comparing M1 vs A14. It's high performance for a cellphone part.

Dual-channel DDR3L or DDR4L also has a 128-bit bus. 4266 MT/s DDR4 is clocked on the high side for most laptops, sure, but it's hardly unusual.

Run the numbers and you get the exact same throughput figure as for the M1, which isn't surprising, because we're just taking width * rate = throughput.

So I'll repeat my assertion, downvotes be damned: the memory on the M1 is not special. The packaging and interconnect is interesting. It might reduce latency a little; it probably reduces power consumption a lot. But there's nothing special about it. The computer you're on right now probably has the same memory subsystem with different packaging.
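The width * rate arithmetic can be checked in a couple of lines (a sketch using only the figures quoted in this thread, a 128-bit bus and LPDDR4X-4266, not independent measurements):

```python
# Peak memory bandwidth = bus width * transfer rate.
# Figures from the thread: 128-bit bus, LPDDR4X-4266 (4266 MT/s).
bus_width_bits = 128
transfers_per_second = 4266e6

bytes_per_transfer = bus_width_bits / 8  # 16 bytes moved per transfer
peak_bandwidth_gbps = transfers_per_second * bytes_per_transfer / 1e9

print(f"{peak_bandwidth_gbps:.2f} GB/s")  # ~68.26 GB/s, in line with the 68.25 GB/s figure
```

The same formula applied to dual-channel DDR4 at the same transfer rate gives an identical result, which is the point being made here.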


> Anandtech is comparing M1 vs A14. It's high performance for a cellphone part.

That’s where they started, but their conclusion was beyond that.

Did you miss the part where they said "the fact that a single Firestorm core can almost saturate the memory controllers is astounding and something we’ve never seen in a design before"?

This isn’t only about A14 vs M1.

It’s not that LPDDR4X-4266-class memory is special; it’s been around for a while. What is special is that the RAM is part of the SoC package, and due to the unified memory model, the CPU, GPU, Neural Engine and the other units have very fast access to the memory.

This is common for tablets and smartphones; it’s not common for general purpose laptops and desktops. And while Intel and AMD have added more functionality to their processors, they don’t have everything that’s part of the M1 system on a chip:

* Image Processing Unit (ISP)
* Digital Signal Processor (DSP)
* 16-core Neural Processing Unit (NPU)
* Video encoder/decoder
* Secure Enclave

There’s no other desktop like the M1 Mac mini that combines all of these features with this level of performance at the price point of $699.

That is special.


> Did you miss the part where they

I don't think that's notable, sorry. I would expect that of any modern CPU.

> it’s not common for general purpose laptops and desktops

Well, yeah, because "memory on package" has major disadvantages. You (laptop/desktop manufacturer) are making minor gains in performance and power and need to buy a CPU which doesn't exist. Apple can do it, but they were already doing it for iPhone, and they must do it for iPhone to meet space constraints.

I think unified memory is the right way to go, long term, and that's a meaningful improvement. But as you point out, there is plenty of prior work there.

> they don’t have everything that’s part of the M1 system on a chip

They actually do! The 'CPU' part of an Intel CPU is vanishingly small these days. Most area is taken up with cache, GPU and hardware accelerators, such as... hardware video encode and decode, image processing, security and NN acceleration.

Most high-end Android cellphone SoCs have the same blocks. NVIDIA's SoCs have been shipping the same hardware blocks, with the same unified memory architecture, for at least four years. They all boot Ubuntu and give a desktop-like experience on a modern ARM ISA.

> There’s no other desktop ... at the price point of $699

Literally every modern Intel desktop does this.


I've seen latencies when pointer chasing (in a relatively TLB friendly pattern) of 30-34ns. Have you seen similar elsewhere?


https://news.ycombinator.com/item?id=25050625

showed https://www.cpu-monkey.com/en/cpu-apple_m1-1804

which determined the M1's memory is LPDDR4X-4266 or LPDDR5-5500. If those memories are not high performance, what is?


You can't do what you do on a desktop on a laptop, not even on a good one.

Who cares if an M1 consumes less energy than a candle, if I can buy 64GB of DDR4-3600 for 250 bucks and render the VFX for a 2-hour movie in 4K?

Another 300 bucks buys me a second GPU.

When I deliver the job I put aside another 300 bucks and buy a third GPU

Or a better CPU

Vertical products are an absolute waste of money when you chase the last bit of performance to save time (for you and your clients) and don't have the budget of Elon Musk.

The M1 changes nothing in that space

Which is also a very lucrative space where every hour saved is an hour billed doing a new job instead of waiting to finish the last one to get paid

You can't mount your old gear in a rack and use it as a rendering node; plus, you're paying for things you don't need: design, thermal constraints, a very expensive panel (a very good one, but still attached to the laptop body, and small).

So no, the M1 is not comparable to a Threadripper; it's not even close, even though the Threadripper consumes a lot more energy.

When I see the same performance and freedom to upgrade in 20W chips, I will be the first one to buy them!

https://www.newegg.com/corsair-64gb-288-pin-ddr4-sdram/p/N82...

Then there's the 92% (actually 92.4%) of the remaining market that is not using an Apple computer and will keep buying non-Apple hardware.

Even if Apple doubled their market share, it would still be 15% vs 85%.

How people on HN don't realise that 90 is much bigger than 10, and that a new laptop won't overturn the situation in a month, is beyond me.


> You can't do what you do on a desktop on a laptop, not even a good one

Ummm... ok. But my comment was that not all RAM is equivalent.


Not all cars are equivalent

I guess you don't drive a Ferrari or a Murciélago

And does it really matter to have a faster car if you can't use it to go camping with your family because space is limited?

That's what an Apple gives you, but it's not even a Ferrari; it's more like an Alfa Duetto.

It's not expensive if you compare it to similar offers in the same category with the same constraints (which are artificially imposed on Macs, as if there's no other way to use a computer...).

But if you compare it to the vast number of better configurations that the same money can buy, it is.


>You can't do what you do on a desktop on a laptop, not even a good one

Yeah… no, those days are over. The reviews clearly show the M1 Macs, including the MacBook Pro outperform most "desktops" at graphics-intensive tasks.

>So no, M1 is not comparable to a Threadripper, it's not even close, even if it consumes a lot more energy

Um… nobody is comparing an M1 Mac to a processor that often costs more than either the M1 Mac mini or the MacBook Pro. However, the general consensus is that the M1 outperforms PCs with mid- to high-end GPUs and CPUs from Intel and AMD. The Threadripper is a high-end, purpose-built chip that can cost more than complete systems from most other companies, including Apple. However, that comes at the cost of power consumption, special cooling in some cases, etc.

>Who cares if an M1 consumes less energy than a candle if I can buy 64GB of DDR4 3600 for 250 bucks and render the VFX for a 2 hours movie in 4k. Another 300 bucks buy me a second GPU

The MacBook Pro has faster LPDDR4X-4266 RAM on a 128-bit-wide memory bus. The memory bandwidth maxes out at over 60 GB/s. And because the RAM, CPU and GPU (and all of the other units in the SoC) are in the same package, memory access is extremely fast.

From AnandTech; emphasis mine [1]: "A single Firestorm achieves memory reads up to around 58GB/s, with memory writes coming in at 33-36GB/s. Most importantly, memory copies land in at 60 to 62GB/s depending if you’re using scalar or vector instructions. The fact that a single Firestorm core can almost saturate the memory controllers is astounding and something we’ve never seen in a design before."

It can easily render a 2-hour 4K video in the background, unplugged, while you're doing other stuff. And when you're done, you'll still have enough battery to last until the next day if necessary. According to the AnandTech review [1], it blows away all other integrated GPUs and is even faster than several dedicated GPUs. That's not nothing, and these machines do it for less money.

>vertical products are an absolute waste of money when you chase the last bit of performance to save time (for you and your clients) and don't have the budget of Elon Musk

>The M1 changes nothing in that space

This is not correct… seeing should be believing.

Here's a video of 4k, 6k and 8k RED RAW files being rendered on an M1 Mac with 8 GB of RAM, using DaVinci Resolve 17 [2]. Spoiler: while the 8k RAW file stuttered a little, once the preview resolution was reduced to 4k, the playback was smooooth.

[1]: https://www.anandtech.com/show/16252/mac-mini-apple-m1-teste...

[2]: https://www.youtube.com/watch?v=HxH3RabNWfE


The M1 beats low-end desktop GPUs from a couple of generations ago (~25% faster than the 1050 Ti and RX 560 according to this benchmark [0]). Current high-end GPUs are much faster than that (e.g. the 3080 is ~5 times as powerful as a 1050 Ti).

Don't get me wrong, this is still very impressive with a ~20W combined(!) power draw under full load, but it definitely doesn't beat mid- to high-end desktop GPUs.

(This is largely irrelevant for video encoding/decoding though as you can see - as that's mostly done either on the CPU or dedicated silicon living in either the CPU or the GPU that's separate from the main graphics processing cores.)

[0] https://www.macrumors.com/2020/11/16/m1-beats-geforce-gtx-10...
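The ratios above can be combined for a rough sense of the gap (a sketch using only the approximate figures cited in this comment, not independent benchmarks):

```python
# Relative GPU throughput implied by the ratios cited above,
# with the 1050 Ti as the baseline (numbers are the comment's, not measurements).
gtx_1050ti = 1.0
m1_gpu = 1.25 * gtx_1050ti   # "~25% faster than the 1050 Ti"
rtx_3080 = 5.0 * gtx_1050ti  # "~5 times as powerful as a 1050 Ti"

print(f"3080 vs M1 GPU: {rtx_3080 / m1_gpu:.1f}x")  # 4.0x
```

So by these figures a 3080 would land around 4x the M1's GPU throughput, which is why "beats desktop GPUs" needs the generation qualifier.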


How much does a 3080 cost? Could you build a complete computer around one for $1000?


You're missing the point. I'm not trying to argue about which system is better, I'm just saying that the comment I'm replying to is saying incorrect things about GPU performance. I'll answer your question anyway though:

You could build a complete desktop system including a GPU that's more powerful than the one in the M1 for ~$1000, but certainly not a 3080. They're very expensive, and nobody has any in stock anyway.

An RX 580 or 1660 would probably be the right GPU with that budget. (Although you could go with something more powerful and skimp out on CPU and ram if you only cared about gaming performance).


- A 3080 costs > $750. Good luck buying one; I would if it weren't out of stock. On the other hand, the mobile GTX 1050 that the M1 gets compared against can easily be found on eBay for < $50.

- Yes, you totally can. The best thing is that with a 1k entry-level machine you can start working on real-life projects that have deadlines, and start earning money that will let you upgrade your gear to the level you actually need, without having to buy an entirely new machine. The old components can serve as spare parts or to build a second node. You don't waste a single penny on things you don't need.

Even though, it's true, you can't brag to friends that it draws only 20 watts at full load and that the heat of the aluminium body is actually pleasant.

It's a big sacrifice, I understand it.


> The reviews clearly show the M1 Macs, including the MacBook Pro outperform most "desktops" at graphics-intensive tasks.

They don't!

Cut the BS

> Here's a video of 4k, 6k and 8k RED RAW files being rendered on an M1 Mac

Blablablabla

That's not rendering

