Yep, 10 out of 10 of those "IoT frameworks" unequivocally qualify as epic fails.
Recently I had to work on Microsoft Azure IoT. Their "embedded" library doesn't even load into memory unless you plug external RAM into the beefiest MCU on the market. It takes 1 MB+ to do very, very basic HTTP + TLS + JSON parsing, and even then it often triggers the watchdog on moderately long transfers.
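(For what it's worth, the usual way to avoid those watchdog resets is to pull the body down in small chunks, kick the watchdog between reads, and parse as you go instead of buffering the whole payload. A rough bare-metal sketch; wdt_kick(), tls_read() and handle_chunk() are placeholders for whatever your HAL and TLS stack actually provide, not anything from the Azure SDK:)

```c
#include <stddef.h>
#include <stdint.h>

/* Placeholder HAL/TLS hooks -- substitute your platform's real calls. */
extern void wdt_kick(void);                           /* reset the hardware watchdog    */
extern int  tls_read(void *buf, size_t len);          /* returns bytes read, <0 on err  */
extern int  handle_chunk(const uint8_t *p, size_t n); /* stream to a parser, not to RAM */

/* Download a large HTTP body without tripping the watchdog:
 * small chunks, watchdog kicked after every read, body parsed
 * incrementally so the whole payload never has to fit in RAM. */
static int download_body(size_t content_length)
{
    uint8_t chunk[512];                   /* fits comfortably in SRAM */
    size_t remaining = content_length;

    while (remaining > 0) {
        size_t want = remaining < sizeof(chunk) ? remaining : sizeof(chunk);
        int n = tls_read(chunk, want);
        if (n <= 0)
            return -1;                    /* transport error */

        wdt_kick();                       /* keep the watchdog happy */

        if (handle_chunk(chunk, (size_t)n) != 0)
            return -1;                    /* parser error */

        remaining -= (size_t)n;
    }
    return 0;
}
```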
-----
On a side note, I once had a conversation with an ex-webdev about developing for MCUs:
— "How did you get it to 1MB?"
— "I shortened the strings, and removed the dead code!"
— "No, I mean, how did you manage to get it past that 1MB?"
-----
It's quite hard finding experienced non-webdevs in general these days, but among those, the number of people who can do C for embedded (and by embedded I mean bare metal, not Linux on an SBC) is so low that real embedded devs have pretty much gone extinct.
I totally agree with you: these IoT frameworks mostly ship clients that don't really work properly on the "edge". Also, IoT frameworks based on "runtimes" always carry way too much overhead and don't scale.
I have seen some amazing technology in this space though, for example DDS, Zenoh, and OPC UA.
For my part, I look at the reliability and quality of the signals. Most IoT frameworks don't care about when or how the data arrives. What about QoS? What about jitter in the signal?
If people use managed languages in IoT, for example hobby stuff like MicroPython on microcontrollers: it's fun as a hobby, but nothing else. It wastes a lot of resources.
A good IoT framework offers tweakable QoS for the data, from hard real-time (with TSN, Time-Sensitive Networking) down to soft or non-real-time, and the network balances things out according to that QoS.
What if I need an IoT platform/framework with hard real-time data between a sensor somewhere and a robot with a control loop?
Reliability, scalability, adjustable QoS per signal, and low enough resource usage to run on microcontrollers with a TCP stack that enables TSN (an RTOS is the best solution for that, though preferably no RTOS if possible), plus two NICs so the hardware stays reliable when one of them fails.
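To make "tweakable QoS" concrete, here's roughly what a per-signal QoS descriptor could look like on the application side. The struct and field names are purely illustrative, loosely modelled on DDS-style reliability/deadline policies plus a TSN traffic class, and not any particular framework's API:

```c
#include <stdbool.h>
#include <stdint.h>

/* Illustrative per-signal QoS descriptor: chosen per data stream, then
 * mapped by the framework onto retransmission behaviour and, for hard
 * real-time streams, onto a TSN traffic class. Not a real API. */
typedef enum {
    QOS_BEST_EFFORT,   /* drop on congestion, no retransmits */
    QOS_RELIABLE       /* retransmit until acknowledged      */
} qos_reliability_t;

typedef struct {
    qos_reliability_t reliability;
    uint32_t deadline_us;      /* max interval between samples           */
    uint32_t jitter_budget_us; /* acceptable variation in arrival time   */
    uint8_t  tsn_priority;     /* 802.1Q PCP / TSN traffic class, 0..7   */
    bool     hard_realtime;    /* missed deadline -> treat as a fault    */
} signal_qos_t;

/* Example: sensor feeding a robot control loop vs. a temperature log. */
static const signal_qos_t control_loop_qos = {
    .reliability      = QOS_RELIABLE,
    .deadline_us      = 1000,          /* 1 kHz control loop */
    .jitter_budget_us = 100,
    .tsn_priority     = 6,             /* high-priority TSN class */
    .hard_realtime    = true,
};

static const signal_qos_t telemetry_qos = {
    .reliability      = QOS_BEST_EFFORT,
    .deadline_us      = 0,             /* no deadline, whenever it arrives */
    .jitter_budget_us = 0,
    .tsn_priority     = 1,
    .hard_realtime    = false,
};
```

The point is that the framework treats these two streams differently end to end instead of pushing every message through the same best-effort pipe.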
For one project I spent considerable time trying to get the Azure IoT SDK going on a low-cost WiFi chipset, only to find that it didn't work once all the platform glue was in place, thanks to inexplicable decisions in both the host APIs (no source code access, of course) and the Azure SDK. Those decisions made most of the built-in functionality (e.g. HTTPS/MQTT/encryption) useless for Azure, resulting not only in code bloat but also in running out of heap memory with fantastic silent crashes.
That was after spending weeks just trying to fit everything into ROM in the first place. The end result was ditching the SDK and re-implementing most of MQTT and websockets ourselves before the project was ultimately canned for other reasons.
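To give a sense of how little code the hand-rolled route actually needs: an MQTT 3.1.1 CONNECT packet is only a handful of bytes you can assemble yourself. A sketch (it assumes the remaining length fits in one byte, i.e. a short client ID; a real client also needs the varint length encoding, keep-alive pings, and PUBLISH/SUBSCRIBE handling):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Build a minimal MQTT 3.1.1 CONNECT packet (clean session, no auth).
 * Assumes the remaining length fits in one byte (<128), which holds for
 * short client IDs; a full implementation needs the varint encoding. */
static size_t mqtt_build_connect(uint8_t *buf, const char *client_id,
                                 uint16_t keepalive_s)
{
    size_t id_len = strlen(client_id);
    size_t remaining = 10 + 2 + id_len;   /* variable header + client id  */
    size_t i = 0;

    buf[i++] = 0x10;                      /* CONNECT packet type          */
    buf[i++] = (uint8_t)remaining;        /* remaining length (<128 only) */

    buf[i++] = 0x00; buf[i++] = 0x04;     /* protocol name "MQTT"         */
    buf[i++] = 'M'; buf[i++] = 'Q'; buf[i++] = 'T'; buf[i++] = 'T';
    buf[i++] = 0x04;                      /* protocol level 3.1.1         */
    buf[i++] = 0x02;                      /* flags: clean session         */
    buf[i++] = (uint8_t)(keepalive_s >> 8);
    buf[i++] = (uint8_t)(keepalive_s & 0xFF);

    buf[i++] = (uint8_t)(id_len >> 8);    /* client id, length-prefixed   */
    buf[i++] = (uint8_t)(id_len & 0xFF);
    memcpy(&buf[i], client_id, id_len);

    return i + id_len;                    /* total bytes to send          */
}
```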
A Pi Zero W, on the other hand, costs about $20 (with a reasonable SD card) vs $5 for something like an ESP32, and makes life easier with effectively unlimited storage and open-source platform code. But it's impossible to mass-produce anything for under about $120 retail due to the high price of the RPi alone, which is a real bummer in a competitive IoT market.
At least in Sydney, I found that bare-metal jobs typically paid far less and required far more experience. Some of the best engineers I know are being paid peanuts but are just happy to be quietly amazing in their corner.
I was proper bare metal at one point (even wrote my own crappy RTOS), but couldn't resist the better pay and better treatment you get further up the stack.
Oi, I'm not extinct. It's just that many embedded programmers, especially bare-metal programmers, tend to hang out in very different places, if they hang out anywhere on the net at all.
I work with a few! We may have to get our pod registered as an endangered species; I could monetize it by selling sightseeing tours for web and backend devs to see what real engineers look like.
Generally, wages there correlate with the productivity of the field.
Embedded/firmware work is a lot more tedious, often requiring more people and more time, and that's just for a single-purpose device. Whereas a few web developers can create a web service that runs an entire profitable company with tens of thousands or millions of users. That's a generalization, of course, but I'd guess it's a 1:10, or at least a 1:5, ratio on income/profit versus development time/effort between web dev and embedded.
I strongly believe that projects like Nerves can help lower that ratio and make a lot of IoT-type projects follow economics more similar to web development. Sure, an RPi costs $35 per unit, but let's say you're making a veterinary monitoring system where the software costs $100k: that fits into the budget much more easily if a couple of devs can implement it in weeks versus the months and/or large teams it used to take. There's a lot to be said for the long tail of economics in areas where more productive tooling can increase productivity and profitability.
Unfortunately that generally only applies to roughly pre-canned hardware, commodity sensors, etc. You won't get that effect when building, say, a new camera sensor CCD chip.
> Recently I had to work on Microsoft Azure IoT. Their "embedded" library doesn't even load into memory unless you plug external RAM into the beefiest MCU on the market.
AWS Greengrass has the same issue: it only runs on Raspberry Pis and up, which is like using a semi-truck as your daily driver in a city where a good bicycle is all you need. At worst we need things to load into 0.5 MB with room to spare for our own code; at best we'd like things that load into 32 KiB, again with room left over for our own code.
32 KiB lets me get temperature/humidity monitoring over WiFi working at <$2 per sensor + WiFi + MCU. So if you need 200 sensors to monitor a space, that's just $400 for the bare minimum parts, not including enclosure or power supply.
0.5 MB raises that price to $5/unit, but MCUs that can handle this can generally also handle generating/using enterprise-grade WiFi certificates, which might be nice.
Using AWS Greengrass / Azure IoT means I need a Raspberry Pi, so $38 or so per unit. Now instead of $400 you're looking at $7,500 for the bare minimum parts. Put another way, that would raise a hypothetical consumer-oriented device's MSRP by $100 versus whatever it would be if it could use the lightweight $1 MCU.
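For context, the application code on such a node is genuinely tiny; a sketch of the whole main loop, with sensor_read(), net_publish() and deep_sleep_s() standing in for whatever drivers and network stack the vendor SDK provides:

```c
#include <stdint.h>

/* Placeholder driver hooks -- whatever the MCU vendor's SDK provides. */
extern int  sensor_read(int16_t *temp_c10, uint16_t *rh_pct10);  /* 0.1 units  */
extern int  net_publish(const char *topic, const void *payload, uint16_t len);
extern void deep_sleep_s(uint32_t seconds);

/* The entire application of a <$2 temperature/humidity node:
 * wake, sample, publish one small binary payload, sleep.
 * Everything else (WiFi, TCP, MQTT) is the vendor stack. */
int main(void)
{
    for (;;) {
        int16_t  temp;
        uint16_t rh;

        if (sensor_read(&temp, &rh) == 0) {
            uint8_t payload[4] = {
                (uint8_t)(temp >> 8), (uint8_t)temp,
                (uint8_t)(rh >> 8),   (uint8_t)rh,
            };
            net_publish("site/room1/th", payload, sizeof(payload));
        }
        deep_sleep_s(60);   /* one sample per minute */
    }
}
```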