Dialog Semiconductor to acquire Atmel for $4.6B (dialog-semiconductor.com)
93 points by metaphor on Sept 22, 2015 | 44 comments


Atmel saw huge success with the AVR at the end of the 90s, and they had the ARM microcontroller market nearly to themselves with their SAM7 and SAM9 series of controllers. Unfortunately, they dropped the ball when the Cortex cores came along (and ARM really sprang into popularity), which ST gladly picked up and ran away with (around 2008, when the STM32F1 was introduced). What happened is that Atmel chose to invest their resources in the AVR32 series (riding on the success of the 8-bit AVR and the strong political position of the AVR group inside Atmel; IIRC the ARM group was in France while the AVR group was based in Norway), but they didn't read the market correctly. Everyone was moving away from proprietary architectures and into the Cortex ecosystem (not always rationally) - customers bought into the pseudo-standardization ARM was selling them. This standardization was only perceived: once you get into the 32-bit domain, the code is mostly C that compiles easily between cores. It's the peripheral drivers that are non-standard and where most of the porting work lives - going from an ST Cortex to an Atmel Cortex is a big headache. Anyway, this shift in the market caught Atmel (and others, like Microchip and Renesas) off guard and pushed ST into the Cortex dominance it has today.
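The "porting work lives in the peripheral drivers" point can be sketched in a few lines: the application logic is portable, while the thin driver layer is what changes between vendors. The classes below are simulated stand-ins, not real drivers; USART_DR and US_THR are the actual names of the ST and Atmel SAM UART data registers, but everything else here is invented for illustration.

```python
# Sketch of why 32-bit "standardization" is only partial: the portable
# application code is identical, but the driver layer underneath differs.

class StUart:
    """Stand-in for an ST-style UART driver (simulated, not real HAL code)."""
    def __init__(self):
        self.log = []
    def write_byte(self, b):
        self.log.append(("USART_DR", b))   # ST names the UART data register USART_DR

class AtmelUart:
    """Stand-in for an Atmel SAM-style UART driver (simulated)."""
    def __init__(self):
        self.log = []
    def write_byte(self, b):
        self.log.append(("US_THR", b))     # Atmel SAM names it US_THR

def send_message(uart, msg):
    """Portable application code: runs unchanged against either driver."""
    for ch in msg.encode("ascii"):
        uart.write_byte(ch)

st, at = StUart(), AtmelUart()
send_message(st, "hi")
send_message(at, "hi")
```

The application layer ports for free; it's the driver layer (the `write_byte` internals, plus clocks, DMA, interrupts) that has to be rewritten per vendor.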

Around that time, Atmel changed management and refocused the company (selling off their memory lines, other fringe products and their fabs) to be primarily a microcontroller company, but it was a bit too late and they found themselves behind. It's been a while since I worked with 8-bit micros, but word is that their AVR series is not really innovative anymore.

Seems like Atmel is back on its feet though with great new Cortex-F7 processors (I think they have the industry speed record at 300 MHz and some other benefits).

I haven't looked at the graphs, but $4.6bn is kind of a valuation low point considering their last 10 years or so. They probably understood that they're too small to survive by themselves, especially given how complicated microcontrollers are becoming. I don't think microcontrollers are such a fantastic business - talking to my friends at component distributors like Avnet and Arrow, they make a lot more money on some passive components (like power) than on micros, and for a long time some companies have allegedly been selling microcontrollers at a loss, just to get the designs.


Nice summary of Atmel. I had no idea that Atmel was not doing well. Atmel has the basic Arduino business, although a lot of other architectures, especially newer and more powerful ones, are using the Arduino framework for their CPUs. Even the newest official Arduino platform is ARM-based.

I'm constantly reading about how the 8-bit market is still doing well, despite the emergence of ARM. Even though the AVR hasn't really innovated much, it has picked up a lot of peripherals and low-power modes, which have kept the AVRs from being strictly in the bargain-basement MCU market. I don't think there has been much innovation in the 8-bit market, but the AVRs have a really good programming model, especially compared to the 8-bit PICs. I don't know if that's enough to give AVRs a shot at new designs, but they're one of the best 8-bit options today, IMHO.


Arduino has been huge for Atmel in terms of brand recognition and the popularity of AVRs, but overall, it's not a huge revenue source (even if 1M Arduino boards are sold yearly, that's at best $3M-$4M in sales). Of course, there are indirect benefits.

Most of the designs I know migrate away from 8-bit and into 32-bit. The Cortex-M0 is great. While designing with 8-bit is somewhat of an upgrade dead-end, if you design a product with an ST Cortex-M0, you can pretty easily go up their Cortex family ladder while retaining a lot of compatibility at both the board and firmware levels (ST is particularly good at that, which is why it's my default recommendation when designing a new product - I can explain why if anyone is interested). The denser 8-bit parts are really expensive anyway - I suspect it's because manufacturers know it's mostly "locked-in" customers that need more flash or I/O for an existing design and need a more capable, near drop-in replacement.

I also vastly prefer AVRs to PICs, but AVRs were kinda where I started my embedded programming career so I guess I just reflect on them fondly.


Please do. I can easily see the benefits of being able to upgrade a chip while retaining software/hardware compatibility but I'd love to hear any other selling points. Are they reasonably competitive with equivalent AVRs in terms of cost and power? Do they have a PicoPower equivalent?


PicoPower - I can't tell you the specifics, as comparing power consumption is a bit difficult (you need to figure out which peripherals you need running, what the system clock is, input voltage, sleep/wakeup regime, etc., and calculate from there), but they do have the STM32L0 series of ultra-low-power parts based on the Cortex-M0+.

As to why I like ST:

1. They are very easy to design boards for. Each IO pin on the Cortex micros maps to up to 16 different internal functions. That gives amazing flexibility from the board design perspective.

2. They have fantastic long term migration paths. You can directly go from Cortex-M3 to Cortex-M4 to Cortex-M7, while, in many cases, retaining package compatibility.

3. They have very good design validation. Their errata sheets are quite a bit shorter than what I remember from other manufacturers.

4. They have a huge amount of useful IO, such as timers (I have one design using over 15 timer channels) and a quite flexible and powerful ADC architecture.

5. They probably have the widest selection of Cortex microcontrollers in the industry.
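Point 1 (the 16-way pin mux) can be sketched as the register arithmetic it implies. The 4-bits-per-pin AFRL/AFRH layout below matches how STM32 GPIO ports encode alternate functions, but the helper itself is a hypothetical illustration, not vendor code:

```python
def set_alternate_function(afr, pin, af):
    """Return the (AFRL, AFRH) pair after assigning alternate function
    `af` (0-15) to `pin` (0-15). Each pin gets a 4-bit field: AFRL
    covers pins 0-7, AFRH covers pins 8-15, as on STM32 GPIO ports."""
    assert 0 <= pin <= 15 and 0 <= af <= 15
    reg = pin // 8               # 0 = AFRL, 1 = AFRH
    shift = (pin % 8) * 4        # 4 bits per pin within the register
    afr = list(afr)
    afr[reg] &= ~(0xF << shift)  # clear the pin's 4-bit field
    afr[reg] |= af << shift      # write the new alternate-function number
    return tuple(afr)

# Route pin 9 to alternate function 7 (e.g. a UART mux position):
afr = set_alternate_function((0, 0), pin=9, af=7)
```

Sixteen selectable functions per pin is what gives the board designer the routing flexibility mentioned above: almost any peripheral signal can be brought out on several candidate pins.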

There are downsides, too; I generally find their peripherals less advanced than Atmel's (regular CAN instead of CAN-FD, USB inferior to Atmel's, I2C that's really sucky and prone to errors, a slower clock and no double-precision FPU on the new F7 family). There was also some story about ST's 2 MB flash parts not being really usable - but I'm not sure what that was about.

Overall you can't go wrong with ST. The long term migration ability and board design flexibility are well worth the downsides.

Coming from an AVR, you're immediately going to get way more memory, much higher clock frequencies, ARM CMSIS libraries (fun if you have any math/DSP going on) and more industry standard tools.


Unfortunately STM licenses a lot of external silicon IP (as all MCU designers have to) from different firms, and the blocks don't really work well together or are straight-up inferior, as you mentioned.

Their errata sheets are shorter, but I've run into a number of problems that I've worked around with help from their engineers, and a year later I still couldn't find any notes on them. I'm guessing they underreport edge-case silicon bugs, like almost all silicon manufacturers.

My last project used an STM32F437 with 100 Mbit Ethernet, just about saturating the connection at 80 Mbit/s while maxing out the processor to interleave the bits of all incoming data and bit-banging every GPIO unused by peripherals to control 30k individually addressable LEDs at 45 fps, which is just incredible for a $7-10 microcontroller. Then, after a few weeks of field testing, we identified half a dozen problems, all resulting in nondeterministic failures of the DMA or MAC peripheral requiring a hard reset. I think the most problematic one, DMA1 or DMA2 stalling when they tick within 10-20 cycles of each other, is the only one that was added to the errata, and it still plagues the STM32F4x7 parts.
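The bit-interleaving trick for driving many LED strips from one GPIO port is essentially a bitwise transpose: take one byte from each of 8 channels and emit 8 port-wide writes, one per bit-time. A minimal sketch of that transform (a common technique for parallel addressable-LED output, not tied to any particular MCU):

```python
def interleave_frames(channel_bytes):
    """Transpose one byte from each of 8 LED channels into 8 port writes.
    Writing the result to an 8-bit GPIO port, one value per bit-time,
    clocks out bit 7..0 of all channels simultaneously; channel N lands
    on port bit N."""
    assert len(channel_bytes) == 8
    port_writes = []
    for bit in range(7, -1, -1):            # MSB first, as WS2812-style LEDs expect
        value = 0
        for ch, byte in enumerate(channel_bytes):
            value |= ((byte >> bit) & 1) << ch
        port_writes.append(value)
    return port_writes

# Channel 0 sends 0xFF, channels 1-7 send 0x00:
writes = interleave_frames([0xFF, 0, 0, 0, 0, 0, 0, 0])
```

On real hardware the inner loop would be replaced by DMA-fed precomputed buffers, since doing this transpose per frame is exactly the CPU load the parent comment describes.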

That said, if you care about part swappability and developer productivity, and need a very powerful ARM that you're not pushing to the edge, the STM32 line of chips is a great choice.


That sounds nasty. I've had my share of problems with Atmel parts though when operating at high rates - on their Cortex-M3 and SAM9 there were problems with AHB overruns and data loss when doing fast Ethernet or SPI transfers. No hard lockups though.


>> I2C that's really sucky and prone to errors

Is that fixable via software? If not, is it worth the risk?


Yes, you can bitbang your way out of that, but it's not great for all applications (like when you want to do DMA to/from the I2C in the background while the processor sleeps or does something else; you can't do that with GPIO bitbanging).

I was really surprised by that when I first encountered it; I2C is pretty rudimentary.
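For reference, the bit-bang escape hatch really is small. Here is a bare sketch of clocking out one I2C data byte through caller-supplied pin functions; start/stop conditions, ACK handling and clock stretching are omitted, and the simulated pins below just record what a receiver would sample on each rising clock edge:

```python
def i2c_write_byte(sda, scl, byte, delay=lambda: None):
    """Bit-bang one I2C data byte, MSB first, via `sda(level)` and
    `scl(level)` pin functions. Incomplete on purpose: no start/stop,
    no ACK clock, no clock stretching."""
    for bit in range(7, -1, -1):
        scl(0)
        sda((byte >> bit) & 1)   # change SDA only while SCL is low
        delay()
        scl(1)                   # receiver samples SDA on the rising edge
        delay()
    scl(0)

# Simulated pins: capture what a receiver would see on each SCL rising edge.
sampled, state = [], {"sda": 1}
def sda(level): state["sda"] = level
def scl(level):
    if level:
        sampled.append(state["sda"])

i2c_write_byte(sda, scl, 0xA5)
```

This also makes the parent comment's DMA objection concrete: with the CPU toggling every edge in a loop, there is nothing for the DMA engine to do and no chance to sleep between bits.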


I was designing a many-sensor system back in very early 2011 and we wanted to go with AVR, but couldn't. They didn't have enough memory, enough flash, enough IO, enough anything, even on their $20 chips.

Then we found the STM32F lines and got chips that were dramatically, objectively superior to the AVR32 chips in literally every way, for about $3/piece.

I still feel like I must have missed something for there to have been such a gap, but the project went great.

So, I think there may have been more to it than marketing pitch about portability.


You've totally ignored the 8051 market, which is the most competitive and gets the newest features well before AVR-core processors. Not to mention their dozens of other product lines. I think the Arduino has led people to believe AVRs are more popular in industry than they actually are -- it isn't hobbyists buying tens of millions of microcontrollers per year.


I would think that in mass-manufactured products AVRs are actually more common than Atmel-made '51s, and for the whole '51 market Atmel seems mostly irrelevant, as most '51s you see today are various highly integrated, special-purpose SoCs from mostly Asian manufacturers.

I've not seen anything that can be called "consumer electronics" that contained an Atmel-made '51, while I've seen some such things with AVRs (probably most often the mega169). In various low-volume industrial stuff there are heaps of Atmel-made '51s, but the only reason for that is familiarity (and, to a lesser extent, AVR-compatible ISP).

Another thing is that Atmel is prone to supply chain problems. In our case it meant that we redesigned the same device three times: the first design was based on the mega8, which in late 2010 we found was completely unavailable in any non-prototype quantities; then we found that our distributor had cheap inventory of the AT89C2051 and went with that (with all the development pains inherent in a non-ISP '51); and on the last major revision we went with an MSP430 (G2253, IIRC) and didn't have to change anything due to supply problems for 3 years.

In any case, I believe that for many uses of Atmel microcontrollers, familiarity and availability (like the mega8 being a fifth of the price of a PIC16F84 in unit quantities in ~2007) were the major selling points.


You're exactly right about the peripherals being the important part for microcontrollers. When Atmel developed the XMEGA and AVR32 families, they radically changed the peripherals. At that point, switching to an ARM Cortex was just as hard. And switching instruction sets is a nothing-burger if most of your code is in C/C++.


>> and for a long time some companies have been allegedly selling microcontrollers at a loss,

So why aren't companies rushing to build 40 nm commodity MCUs like Infineon did? It should cost almost nothing to make an MCU that way.


I don't know - I believe the barrier is in the embedded flash technology used (NOR flashes that are fast/reliable/big enough).


Cortex-F7 (or just F7) is the ST brand name of their Cortex-M7 based products. ARM only sells the A/M/R series of processors.


Yes yes, sorry, that was a typo ;)


The problem is that the ARM Cortex microcontroller space doesn't have any differentiation left.

As a microcontroller user, I care about two features: RAM and price.

Really. That's it. I'll eat the engineering to switch lines if I can hold my price but double my RAM.

I don't care about frequency--everything is fast enough. I care a bit about flash, but I'm almost never flash limited. If you don't have enough of the peripherals I need I won't buy, but everybody has a ton of peripherals since transistors are cheap. And everybody's tool chains suck equally so there is no advantage to be gathered there.


There are people who care about power as well, right? E.g. the security system where I'm living now has remote wireless units with fairly big lithium batteries that last roughly 10 years....


> There are people who care about power as well, right?

Not as much as you would think.

Battery life in the embedded space is mostly defined by how much you can stay off and what your leakages are rather than what your active current is--since you can quite often just stay asleep a little longer.

Most of these microcontrollers are in the same range for leakage and are at the point where you have to start considering other leakage sources on the board (capacitors for example).

It's Amdahl's Law in action: the big contributors have all been smacked down pretty much as far as they can go (sure, you might get 10% here or there ... but it's going to be real work now), so further overall gains aren't easy.
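The "stay asleep longer" arithmetic is simple enough to sketch. At a small duty cycle, the average current (and hence battery life) is dominated by the sleep/leakage term rather than the active current; the numbers below are illustrative, not from any datasheet:

```python
def battery_life_hours(capacity_mah, i_active_ma, i_sleep_ua, duty):
    """Crude battery-life estimate from a sleep/active duty cycle.
    With a small `duty`, the sleep (leakage) term dominates the average
    current, which is the point made above. Illustrative numbers only."""
    i_avg_ma = duty * i_active_ma + (1 - duty) * (i_sleep_ua / 1000.0)
    return capacity_mah / i_avg_ma

# 220 mAh coin cell, 5 mA active, 2 uA asleep, awake 0.01% of the time:
life = battery_life_hours(220, i_active_ma=5.0, i_sleep_ua=2.0, duty=0.0001)
```

At this duty cycle the estimate lands around ten years, and halving the sleep current moves the answer far more than halving the active current - which is why the leakage floor (including board-level leaks like capacitors) is what differentiates parts now.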

If someone really wanted to get my attention with their microcontroller, they should produce a combined BLE/WiFi chip where the WiFi can be run off of a coin cell battery like BLE (that would mean something like 2mW transmit power rather than 200mW).


In particular, I suspect there are very few gains left to be made in wireless communications. My contention is that there exists an information-theoretic lower bound on the amount of energy it takes to transmit a given amount of information a given distance. Like a heat engine, at some point you can no longer make generational improvements in efficiency: you get asymptotically closer to the bound, but you can't ever beat it. And wireless power tends to dominate the power budget of anything that uses it.

High-bandwidth signals require more power to travel a given distance, low-bandwidth signals go farther on less power but transmit less information. Intuitively this is why you can go worldwide with CW (Morse code) on just a few watts of power, while a SSB signal on the same frequency doesn't go far at all. But I think you can mathematically support this by playing with the bandwidth+power terms of the Shannon-Hartley theorem.

If we want to transmit a given amount of data per time unit, there are two natural extremes. We can either transmit it at the average data rate necessary, or we can race-to-sleep by blasting it out as fast as possible. I'm not sure which is more efficient given that SNR is inside the log2 of the S-H theorem, it depends on the relationship between SNR and bandwidth as you spread a given quantity of RF power thinner and thinner. Based on the log2 I would think this is an inverse-square kind of thing, making it an equivalent tradeoff.
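One way to poke at the race-to-sleep question is to plug numbers into Shannon-Hartley directly: energy per payload is transmit power times airtime at capacity, C = B·log2(1 + P/(N0·B)). The noise density and power levels below are assumed values for illustration only; real radios operate well below capacity:

```python
import math

def energy_per_payload(bits, bandwidth_hz, tx_power_w, n0):
    """Joules to deliver `bits` at Shannon capacity C = B*log2(1 + P/(N0*B))."""
    snr = tx_power_w / (n0 * bandwidth_hz)     # spreading power over more bandwidth lowers SNR
    capacity_bps = bandwidth_hz * math.log2(1 + snr)
    airtime_s = bits / capacity_bps
    return tx_power_w * airtime_s

N0 = 1e-17  # assumed noise spectral density, W/Hz (illustrative scale)
# Same 1 kbit payload, same 1 mW transmitter: narrow trickle vs wide blast.
slow = energy_per_payload(1000, bandwidth_hz=1e4, tx_power_w=1e-3, n0=N0)
fast = energy_per_payload(1000, bandwidth_hz=1e6, tx_power_w=1e-3, n0=N0)
```

With these (high-SNR) numbers the wide channel finishes far sooner for the same power, so race-to-sleep wins; at very low SNR the energy per bit approaches the N0·ln 2 floor either way, and the two strategies converge.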

Showerthoughts: because of the Inverse Square Law, it's probably more efficient to make a mesh of multiple small hops than a single long shot. As the spacing of the mesh decreases to zero, you end up with a wire.

Also, check out the ESP8266 chip, it doesn't have a BTLE onboard but it does handle the "SOC with low-power wifi" thing.


> Also, check out the ESP8266 chip, it doesn't have a BTLE onboard but it does handle the "SOC with low-power wifi" thing.

I know all about it. Still won't run on a coin cell.

Nobody will build a 2mW WiFi chip until Apple forces somebody to build it. Then EVERYBODY will rush to build it. Then EVERYBODY-1 will go bankrupt when Apple doesn't use their chip.

It's the Zen of Hardware.


Once LoRa is deployed, it will probably be a much better solution than WiFi.


Ayup, but until then I need a gateway to talk to the outside world. If I don't have a phone, then that's WiFi.


Yes - bsder's comment should only be taken to represent his own reality.

As an embedded firmware developer for 18 years, I've seen that every project has its own constraints, just like most engineering. Design constraints on microcontrollers include: power, clock speed, FPU, number of digital GPIOs, number of analog GPIOs, quality of IDE and debugger, price, package size, available operating systems, longevity of the part, volatile and non-volatile memory, quality of compilers, peripherals and so on.

IME, Atmel had a great 8-bit series with good documentation and few bugs that scaled from 8 pins to 40 very well. With analog inputs and PWM outputs, they were well-equipped for analog input -> processing -> output tasks. They really blew it when they moved to 32-bit, as others have said: they weren't very fast, changed all the peripherals around and didn't improve on the clumsy fuse system. ARM Cortex-M ("M" -> embedded) cores like the STM32F4, Freescale Kinetis, etc. have all but obliterated Atmel.

Finally, differentiation in the Cortex family includes Ethernet (F2x7/F4x7), a flexible static memory controller (FSMC), FPU, low power (STM32L0), extreme low power (MSP432), low price (STM32F0) and so on.


> yes. bsder's comment should only be taken to represent his own reality.

No argument.

> Design constraints on microcontrollers include: power, clock speed, FPU, number of digital GPIO, number of analog GPIO, quality of IDE and debugger, price, package size, available operating systems, longevity of part, volatile and non-volatile memory, quality of compilers, peripherals and so on.

True. But most of the embedded work I have been dealing with over the last 5? years seems very different from before. In the last 5 years, the choice of microcontroller moved from contentious to almost an afterthought: 8-bit, 16-bit, 32-bit? Motorola/Atmel/Microchip/ST/Renesas? gcc vs. proprietary? Sufficient frequency? GPIB/RS-232/USB? All gone.

I get asked about memory and footprint (BGA/CSP is starting to become popular ... bangs head on wall) and that's about it. Even cost just doesn't come up much anymore -- I don't know if it's that things are cheap enough or that everybody now actually knows what microcontrollers cost.

Pretty much things seem to be splitting into two bands: internal memory only (M0, M3/M4 class--generally an RTOS) and external memory (A-class+, runs Linux). And, given some of the higher end M-series, I suspect Linux is going to hit there shortly.


Which problem spaces are you working in, btw? I've been doing medical, microwave radio, infra-red LEDs, bike light LEDs, musical instruments, architecture and some other small stuff.


Lately: monitoring and sensing across a vast spectrum of industries.

Lots of people want data about business processes and throwing some coin cell sized boards with BLE and sensors into a product as it goes from start to end customer can be enlightening ... sometimes too much so for some businesses :). The real trick is having the infrastructure and team to receive and analyze the data (which we do now that we've done it enough).

Industrial automation is in the mix. A lot of that has been moving from combinations of IEEE-488, RS-232, USB or other non-differential links to CAN and Ethernet. Customers are often amazed how many fewer problems they have when they use a connection that is actually electrically differential.

It's gotten to the point that when I hear "reliability issues" I look for USB and just wipe it out. Generally there will be a network of PCs connected via USB to "random industrial machine X" or, worse, PC -> USB adapter -> RS-232/IEEE-488/etc. (To be fair, the old systems with actual RS-232 ports or IEEE-488 cards generally worked fine, as they had more than enough signal overhead and shielding to deal with industrial electrical spikes -- it's the USB interface that disconnects and causes Windows to crap its pants.)

Replacing a bunch of those with something that has a CAN interface generally fixes industrial problems. I finally sat down and wrote my own CANopen stack because I was using it so frequently.

Medical and bio come up occasionally, but they haven't been such a big mover for me lately. The people who cut checks and the people who actually know what their problems are are too far apart in the hierarchy right now. My big problem in bio is that I could do quite a bit using full-custom VLSI to get to ultra-low-power functionality, but nobody wants to pay for that. So I'll continue to use off-the-shelf microcontrollers and FPGAs and use way too much power.

Most of my grief these days isn't electrical. It's mechanical. I use 3D printers and silicone molds a lot. Very messy and annoying to work with, but it saves us having to cut molds for injection molding.

I can 3D print a lot of parts for the cost of even a single injection mold. And my margins aren't so tight that I need the cost savings. And I can change the part if something is wrong. And I don't have to scream at the mold maker when (not if) he screws my mold up.


Aren't the USB data lines differential?


Yes, but the detection that a cable has been plugged in isn't. Whoops.

To be fair, this isn't really an issue unless your device grounds are likely to have more than a couple of volts difference due to spikes. And that happens very rarely in the home or office.

In industrial settings, however, that's not true.


Apart from this, there are other issues to consider when you have to go through the different certifications required for different domains. When you have MCUs in automotive, industrial, medical and military (invasive instruments) applications, it is better to go with safety-critical MCUs with built-in core protection to pass the various certifications. Thermal considerations also play a big factor when you go through the different qualification tests.


Yes, there are, and the industry players are pushing this as their main differentiator.

However, for most projects I see, people don't care much. Better efficiency in a car is good, but a few microwatts can't compete with double the RAM in this context.

Lots of electronics projects are not battery dependent, and engineers care more about reliability, ease and speed of development, and flexibility. I surely like the idea of not reimplementing a firmware ten years from now - just upgrading the microcontroller, instead of dealing with a whole new set of peripheral drivers.


For commodity systems that run off a cheap lithium primary cell with essentially any modern low-power microcontroller, the major limiting factor for battery lifetime is actually the self-discharge rate of the battery itself (and, to a somewhat lesser extent, the fact that most lithium battery chemistries have discharge curves that depend on peak load).


Here's why Dialog is buying Atmel for $4.6 billion http://fortune.com/2015/09/21/dialog-buying-atmel/?xid=soc_s...


As an Atmel customer, what does it mean to me?

Will nothing change? Will Atmel's (or is it Dialog's?) microcontrollers become even more power-efficient?


Most likely, nothing. This seems to be a channel-driven acquisition. Dialog wants to put Atmel products in its sales channels and its own products in Atmel's sales channels. And it looks like the product lines don't overlap much, so there shouldn't be significant anticompetitive effects.


Hell, what happens to Arduino?


Even if they stopped making AVRs altogether, there are plenty of other (arguably better) MCUs on the market anyway.

At the company I work for, we've replaced all the ATmega chips in our products with STM32F0 series chips for lower-end requirements and with the STM32F4 series for higher-end stuff, because they have more grunt, generally have more peripherals, and are similarly priced.


How many MCUs have the same strong, libre toolchain?


Free GCC toolchains exist for the TI MSP430, the TI Tiva C Series (Cortex-M4) and the TI Hercules (Cortex-R4) series. Apart from that, I have tried the STM32F4 series with GCC toolchains.

I have personally developed a Bluetooth stack using the GCC tools for the MSP430 with the CC256x Bluetooth modules, and it was a blast.

So you have a lot of options with free toolchains. I personally use only free toolchains on Linux. It's the best development setup, and you have a lot of options for debugging and tool development.


Are there any which are easily used in breadboard-style projects?


NXP's (and now Freescale's) LPC1100 are ARM Cortex M0 processors available in DIP packages. IIRC, NXP is the only one making DIP ARMs, but breakout boards exist....


There are indeed a bunch of them. Here are the ones STM provides: http://www.st.com/web/catalog/tools/FM116/SC959/SS1532


According to user qq66, probably nothing.

Though since Dialog is a power management IC company, maybe Atmel's microcontrollers or Atmel-based boards will get better power management features and/or become more power-efficient?



