Espressif is taking quite a radical approach as compared to other semiconductor companies. They provide everything from a Cloud SaaS for their chips, to open source phone apps that connect over BLE/WiFi to their chips, reference hardware designs, integrations with voice assistants, and more. Typically, semiconductor companies have shied away from providing anything comprehensive because they don't want to upset their distributors and partners who provide the missing services - but not Espressif.
I think if the US-China chip war hadn't been going on, and customers hadn't been paranoid about using a Chinese chip (and about ending up in a tough spot like Huawei's customers), Espressif chipsets would be waaaaay more popular.
Disclaimer/source - worked at Espressif for a bit (2017-2019)
> Espressif chipsets would be waaaaay more popular
Oh, don't worry, they're getting a ton of attention from the "pros" these days. In hardware no one likes to be the guinea pig if they don't have to be. But these days Espressif is not seen as an exotic choice, and their low cost, excellent feature set, and good supporting material are big draws.
Honestly it feels like they're truly just eating this market, but it's just happening (what looks like) slowly.
My job uses the ESP32-S3 (the 2MB PSRAM version, for some extra memory headroom) professionally (and with Nim for the firmware, which is fun). Super neat little SoC!
Interesting that you use Nim. Would you have some resources/pointers for using Nim on embedded devices?
Do you use a native compiler, or do you cross-compile for this (RISC-V) chip?
So the S3 is actually an Xtensa LX7 processor rather than a RISC-V one, but the process is the same: “--compileOnly” to output C sources from Nim, and point CMake at that nimcache output folder.
With #line macros injected into the compiled C output source files, JTAG debugging and other tools just work, which is quite nice.
That said, if I was on the C3 or any of the RISC-V chips I’d look at using the C toolchain directly: you can easily have “nim c main.nim” call out to a specific C compiler with your specific build and linker flags!
The one annoyance in all of the embedded toolkits when trying to use non-C/C++ languages is always CMake; companies build component systems and such on top of it which a naive C compiler usage can’t really leverage, which means you end up rebuilding the whole build toolchain again.
Easier to just stick within CMake and use Nim’s C output instead, at least for now (until I get some time to write a nimscript parser for a subset of ESP-IDF’s CMake file format…)
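For the curious, a minimal sketch of that flow (exact paths, component names, and flags below are my guesses for a typical setup, not from the poster):

```shell
# 1) Ask Nim to emit C sources only, into a known cache folder.
#    --compileOnly skips invoking a C compiler; ESP-IDF's toolchain
#    compiles the generated .c files later.
nim c --compileOnly --nimcache:components/nim_main/nimcache src/main.nim

# 2) In the ESP-IDF component's CMakeLists.txt, register the generated
#    C files as ordinary sources, roughly:
#      file(GLOB nim_srcs ${CMAKE_CURRENT_LIST_DIR}/nimcache/*.c)
#      idf_component_register(SRCS ${nim_srcs} INCLUDE_DIRS nimcache)

# 3) Build as usual; with --lineDir-style #line directives in the
#    generated C, JTAG debugging maps back to the .nim sources.
idf.py build
```

The nice part of this arrangement is that ESP-IDF's component system never knows Nim is involved; it just sees a folder of C files.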
I also use Nim quite a bit on linux/windows for machine-learning purposes.
But even on these systems, it is generally a pain to trick Nim into using a compiler it doesn't directly support, as some of its hard-coded flags are incompatible with the one I'm trying to use, for example NVCC.
Follow-up question: do you generally use Nim on ESP-like chips which run some form of RTOS, or also on bare-metal microcontrollers?
Yeah, we deliberately have a fork of Nim that is purely for adding workarounds when we hit small internal things like that — though most of our changes are upstreamed proper now which is nice!
To answer your follow-up question: both. We’re in the process of bringing up firmware for the STM32G0 and L4 microcontrollers, and CubeMX is such a pain to try to bind as-is that we started playing with svd2nim to see if we can’t bring up peripherals that way instead. That’s a less well-trodden path of course, but it’s promising so far. Only downside is that it’s more like Rust’s approach of rebuilding the world, but oh well.
Run-time configurable box that acts as an IoT gateway to various sensors/protocols/etc, pulling data from a heap of different possible devices and kicking it up to our web platform. Used to track all sorts of stuff across a few industries.
I forget we actually have a website, probably easier to share that and answer specific technical questions instead! https://www.venturi.io/
> Honestly it feels like they're truly just eating this market, but it's just happening (what looks like) slowly.
I have a friend who works at ARM; he said there's a mix of people who don't understand what's happening and people who are freaking out but have no power to respond.
I have no idea what the market share changes have been, but my speculation based on what I've heard is that Espressif is large enough that ARM is aware of their presence but doesn't comprehend the threat they pose. It sounds like it's going to play out like a typical market-disruptor story, where the titans respond too slowly and too late to stop them.
The thing is, it's all about that radio. No one cares what the core is. ESP32 is ample evidence of that: neither their Xtensa nor their RISC-V cores are as good as an ARM Cortex-M, in terms of design quality, implementation quality, documentation, ecosystem, or general familiarity. And it just doesn't matter. No one even cares about the Xtensa versus RISC-V products. The toolchain, documentation, major features, and raw horsepower available are all that's important.
Which is a long way to say that ARM has no moat here.
From what I recall, the ESP8266 was originally just a module designed to provide WiFi capabilities to other circuits, but over time hobbyists discovered that it had much more potential. Its incredibly low cost sparked its popularity. Espressif did an excellent job of designing the ESP32 to accommodate the demands of the emerging audience. But I could be mistaken; I might be missing a part of the story.
> The ongoing US-China chip war has created a situation where users hesitate to use Chinese chips, wary of ending up like Huawei's customers. Without this scenario, Espressif chipsets might have enjoyed much greater popularity.
These chips are intricate enough to potentially contain backdoors. While it is something I never thought about with respect to Espressif, it does seem plausible that they could be a target.
What really sparked my interest was having WiFi in a chip that only cost maybe 2 usd at the time. Plus, you could find great usb dev boards for less than 5 usd.
If I had this stuff as a teenager, I would've lost my mind...
Yeah, important to remember that the old Wifi module for the Arduino was like 20-30 bucks and was basically just an AT modem that still needed a host Arduino. Barely anybody ever used them that I saw. The original ESP-01 for a couple bucks was a very exciting product for hobbyists just as a serial-to-wifi adapter even with minimal English documentation, and then it went bonkers when they released Arduino support and a package with enough IO to do useful things. Espressif has also been pretty nice to the hobbyist community in general once they realized what they had done.
The first time I saw an 8266 I thought "why am I struggling with an Arduino or Raspi if this thing is all I need to make a sensor + wifi mesh". Also, being powered through USB is a plus; there are lots of unused USB ports around in 24/7 machines.
Agreed. Hobbyist (and retired professional here) who fools around with IoT stuff at home. I've done lots of things with Raspberry Pi Zeroes that can be done with an ESP8266/ESP32 and with the benefit of on board analog inputs. Some of the devices I've used (like the Bosch BME280) seem to work better on the ESP and I suspect that's because the serial communication does not suffer from the risk of missed data due to managing the protocol in user space. The ESP is easier to program for real time either using FreeRTOS or just bare metal. Espressif's SDKs are a bit cumbersome to install but the VS Code extension helps to manage a lot of that. And it all works well on Linux.
> These chips are intricate enough to potentially contain backdoors. While it is something I never thought about with respect to Espressif, it does seem plausible that they could be a target.
As the old saying goes, "the S in IoT stands for security" - I choose to trust Google/Amazon and their peers to have an Internet-connected device, but everything else (95% the IoT devices I have) gets sectioned off to a dedicated VLAN & WLAN with no Internet access (and no access to the rest of my network).
This keeps me safe, and keeps the devices safe from each other (micro-segmentation in the access level). No need to trust what has minimal interfaces, and then I don't worry as much if I don't roll software updates every week...
> These chips are intricate enough to potentially contain backdoors. While it is something I never thought about with respect to Espressif, it does seem plausible that they could be a target.
They could indeed be, as could all PCs, phones, TVs, and any smart connected household appliance, whether or not they're built in China. I would never consider building an IoT home system without putting it behind a dedicated, firewalled hardware subnet that protects from the outside and from the inside. OK, the firewall hardware itself could contain tampered chips with firmware instructing them to encapsulate and redirect certain traffic outside of the rules without reporting it, but that's another story; when in doubt, I prefer to consider every piece of hardware that is closed and connected as potential spyware.
They already are very popular. They're embedded in devices from smart home gadgets to washing machines and smart appliances.
The great thing I find about them is that they are "hackable". Don't want to depend on some weird company and their cloud, especially with long-lasting appliances (which outlast the company's cloud or even the whole company)? Just buy something with an esp8266/esp32 inside, and there's a high chance that some open-source firmware (tasmota, esphome, ...) is available.
(disclaimer: sometimes the "hacking" part involves soldering irons, wires, usb->serial adapters, and knowledge of electrical safety, especially with mains-powered devices)
It's also great to see how, with a little technical know-how, you can avoid paying hundreds of dollars for some IoT tech that relies on more people buying overpriced hardware for it to keep working in the future, and instead go for DIY or cheap hardware (e.g. Sonoff) that gets a huge audience due to its low prices, which guarantees someone like me will spend the time to get it properly supported soon enough...
All of a sudden, the premium product becomes the inferior product, because it will have a smaller market share, and therefore, fewer hackers :-)
A lot of grey-beard type embedded programmers, who programmed in assembly through the 80s and 90s, really got attached to their architecture of choice, whether that is AVR or MSP430 or 8051. That crew tends to be very conservative with tech choices, for both good and bad reasons.
Some then learned to tolerate ARM Cortex M.
But Espressif, this weird new and unknown Chinese manufacturer, came out with their own arch and instruction set that no one else uses (xtensa). A lot of old timers were wary.
Their price-to-feature ratio is unbeatable though. That got them into the maker/hobby community fast, and lately I've heard of more and more companies using the esp32 for actual professional products.
Xtensa is licensed from Tensilica, which is an American company, and the architecture is used extensively in non-consumer-facing industrial systems. It's not that nobody uses it; it's that you're likely not even close to the target market it's made for. Espressif is a fabless semiconductor company: they don't actually make their chips, they license a bunch of parts, design some developer boards, and write software around it. They're much closer to a software company than a hardware company.
Atheros wireless cards (ath10k) have also used Xtensa.
And the audio DSP in newer intel chipsets (e.g. Apollo Lake) is also Xtensa-based, but unfortunately quite locked down (signed firmware only). See https://thesofproject.github.io/latest/platforms/index.html
Also ISTR that older Radeon graphics cards used Xtensa (e.g. in the Unified Video Decoder).
There was a period centered around the late '90s where new architectures were a dime a dozen, as access to fabs opened up, and small companies with an idea hoped to become the next Intel.
Anyway, ESP32-C3 is RISC-V, and I suspect they won't do any more new Xtensa cores.
I can't agree with this enough. Willow[0] (disclaimer - I'm the creator) just wouldn't be possible without the robust Espressif ecosystem, libraries, code samples, docs, etc. Let alone all on a finished product (ESP BOX, $50) available worldwide through their distribution channels! We have a lot of background and experience in this and related fields but the fact remains we went from concept to initial release in less than six weeks (with two part time devs - if you include me, which you shouldn't haha).
Wouldn't the newer RISCV based ESP32 models eventually add to the commercial popularity? Maybe some commercial users stayed away from the Xtensa powered ones, especially if they would have needed to do lower level work?
ESP32s aren't really ‘lower level’ in the sense that anyone is likely to write assembly code for them (compared to, say, 8051 or PIC), other than maybe some driver author at Espressif. The big win from using RISC-V, other than name recognition, is mainstream compiler support (which is nothing to sneeze at, especially when it's largely funded by someone else).
When I worked on Matter¹, the Xtensa and RISC-V versions were basically fungible from the software point of view. (And really, so were other vendors' various ARMs.) We did find that Bloaty McBloatface² didn't support Xtensa, so I had to write an alternative.
We do use the ESP-32-C3-MINI module in our open source / open hardware air quality monitor [1].
We used the ESP8266 before and the C3 is a great replacement and big improvement. The direct programming through the USB works great and our makers can actually directly flash it from the Chrome browser [2]. So the complexity of the BOM is greatly reduced.
Having two real hardware serials is a great improvement over the ESP8266 as the software serial with the ESP8266 was never 100% reliable.
So far we have produced a few hundred boards with the ESP-C3 and they all seem to work stably and reliably, and we are planning to use it for our indoor air quality kits as well.
Espressif's software support for their chips is incredible, and they are actively collaborating with the open-source community to fix bugs and implement feature requests.
The fact that their SDK supports all major features of an embedded device out of the box (e.g. OTA, clients and servers for application-level protocols like HTTP and MQTT, file system and kv storage over flash and sd memories) makes it a compelling choice for many commercial use cases.
Add a wide selection of FCC-certified modules with embedded PSRAM and flash, which are also cheap and available in high quantities from multiple suppliers, including LCSC, DigiKey, and Mouser.
And their higher-performance options, like the ESP32-S3, can also be used to run low-complexity image-classification CNNs.
The 8266 was based on an Xtensa core instead of an Arm core, which is the norm in the industry. Espressif created a replacement for this wildly popular chip with the C3 and based it on RISC-V to keep up with the times and the community. No more toolchain complaints from the community, plus they get to use an open standard like RISC-V instead of Xtensa from Tensilica. The never-ARM approach has worked out well for Espressif across the multiple alternatives they've chosen over the years.
If someone writes a book on the downfall of Arm in a few years, this is going to be a very interesting chapter IMO.
disclaimer - worked at Espressif for a bit (2017-2019)
You know that ARM is used in a huge number of microcontrollers? There are over 100 of them in every modern car, including Chinese ones, so that's a billion a year in that area alone. Then there are all the peripherals that use either 8051- or ARM-based microcontrollers, down to simple things like the keyboard controller.
While I think RISC-V has its benefits, I don't know how it compares today to the scalability of the Cortex-M series (starting with the M0) followed by the Cortex-A series. From what I can see, it still has a long way to go to reach par.
Something can have massive market penetration and still be doomed. Nobody talks about the biggest fax machine manufacturer, after all.
Mind you, whether that applies to ARM is a bigger question, it's like asking whether sand is going obsolete. I have my doubts unless something radical happens in the RISC-V space commercially; I don't think Espressif going all-in would do it on its own.
- As one of the replies said, I was talking mostly about the IoT space, but I think it's true for the non-IoT laptop/server space as well.
- Sure - looking at what Apple's silicon team is doing with ARM would make you think that ARM has nothing to worry about for a while. However, notice that Apple is the only one who is getting a lot done with ARM at the high end. Qualcomm, etc. aren't getting the same amount of performance. I believe that's because of the non-ARM stuff that Apple puts in their silicon efforts. Imagine in a few years, when the small companies designing and licensing RISC-V cores are not that small anymore - folks like Amazon (uses ARM cores in their servers) and Apple might have a serious alternative.
- I think open source models win in the long run, similar to Windows vs Linux, or closed version control software vs Git. The tooling around open source grows over a few years and then suddenly it feels like it's miles better than the closed options. I think similar stuff will happen with silicon.
- ARM themselves worry about this btw - not that long ago they created a website smearing RISC-V which was mostly full of FUD. Their employees revolted and made them get rid of that site: https://www.theregister.com/2018/07/10/arm_riscv_website/
ARM should be worried. In anticipation of their new IPO, they changed their licensing model to be based on the value of the final device, not the chip itself. Manufacturers are profoundly unhappy with that, and hey, there happens to be an alternative to invest in now.
Talk to someone who works at a RISC-V startup. There’s a lot of energy there. It’s going to take a while, but startups are working on every class of chip.
ARM changed the fee structure because growth has been flat, and this is a way to get money short term, not grow the addressable market. It will settle into Intel-like stagnation as a disruptor takes its markets. What's ARM's moat?
I believe OP is speaking specifically to the IoT space, which is still a bit of a stretch - but slightly more accurate than the massive ARM dominance we're seeing elsewhere.
RISC-V is a much better-aligned ISA for IoT: you just use the parts you want instead of having to pull in the whole package. Plus, it's free!
ST, Atmel and Microchip maintain massive product catalogues so users can choose between 4MHz, 10MHz, 16MHz and 20MHz for their 8-bit, 8-pin microcontroller, and choose whether they want 1750 bytes of program memory or 3500 - but for the same price customers can get a 160 MHz, 32-bit microcontroller with built in wifi?
Why would Espressif pay the engineering costs and supply-chain complexity costs to make worse products, when their flagship product is already a great price?
It's not about the number of ARM cores shipping today, but tomorrow. If RISC-V tools are about as good as ARM and the cores are cheaper it will become the preferred instruction set somewhere in the future.
Seems in this age of open source tools the instruction set is mostly commodity anyway.
I regard the ESP32-C3 as the natural successor to the ESP8266, but considering the relatively minor price differences on sites like Aliexpress, I'd opt for a more robust option like ESP32-S3, which offers significantly fewer limitations.
However, I'm personally anticipating the upcoming releases of zigbee/matter compatible chips for my home automation endeavors (such as the ESP32-H2, ESP32-C6) and possibly their esphome support. I'm enthusiastic about building low power devices that can operate efficiently for up to a year on simple alkaline batteries. While these chips already have an effective sleep mode with low power consumption, the Wi-Fi component remains a notable energy drain.
I've been developing my home automation stuff recently. I have an esp32 calling home over wifi about every minute, and I wanted to try a battery version. I just happened to have two rechargeable AAs and a AA battery holder.
You're very correct that the wifi is a huge drain. I configured it to go into sleep mode between measurements and was able to get 3x the life out of it, but for 2 AAs without a real battery circuit, that still meant only about 36 hours.
I'm excited for the zigbee/matter chips, I'd like to really get power down so I can embed sensors in things like guitar cases.
It's been a while since I used it but from what I remember it works really well between ESP devices and you can certainly do a many to one/gateway approach. The range and power consumption is outstanding all things considered.
Would BLE work for your use case? I have a few ESP32 projects that fire out occasional sensor readings using BLE advertisements which is a lot faster and also lower power than trying to hook up to WiFi on every wake cycle.
Just to avoid confusion, when you say ‘zigbee/matter’ you're referring to Matter over Thread, as Matter can also run over WiFi for ‘big’ (wall-powered) devices.
They are (at least the C6); I actually just had one arrive on my doorstep this morning.
I get that more is better, but even the C3 and C6 are really incredible overkill for most applications. Maybe another way to put it is that there is tremendous utility in just having a cheap low-power chip that can read sensors, do some super basic processing on them (converting a voltage to a temperature or humidity reading, etc.), and then send them back to the mothership, wherever that may live (it could be the cloud), to store the data and take action.
I might be missing the forest for the trees here but multicore chips with cores that push into the hundreds of megahertz, I actually don't know what the use case is. Driving touchscreen UIs?
It might be that many modern IoT standards include some form of encryption as standard, which requires a certain baseline of compute from the chips it runs on. Or it might be that the cost delta between the hypothetical low-perf IoT chip and a higher-perf one would actually be so low as to make the former not worth producing. Pure speculation though.
As someone who primarily specializes in programming and only understands the basics of electronics, I often use esphome[1] for my home automation projects. I want to mention it here as it significantly lowers the entry barrier for creating functional (and fun!) projects quickly, focusing on code and capabilities (with one click you can update any esp chip wirelessly, after the initial firmware upload via USB, which you can also do via a web interface).
Working at a lower level and with code more closely tied to the machine can definitely be rewarding. Yet, it’s esphome that has revived my pleasure of working with electronics and enabled numerous complex projects that I wouldn't have even begun otherwise.
It's also incredibly powerful, and if you wish, you can write libraries to add more features. It seamlessly integrates with Home Assistant (auto-discovering the devices), but it can also be used with other interfaces.
I trust it to not disappear because I think Home Assistant acquired the project.
Oh, and it's open source too (MIT + GNU GENERAL PUBLIC LICENSE)!
I moved to the ESP32-C3 after using the 8266 for a while, and the C3 is a great chip. It's small, power-efficient, and has bluetooth and wifi onboard. It's got USB pins so you can connect it up to USB, program it, and connect to Serial (no USB device though, sadly). It's got a really full feature set, and a really strong Arduino community to boot. Probably best of all, it's inexpensive and in stock. While so many other chips have been unavailable, the C3 can be had just about anywhere at very reasonable prices.
And if the chip isn't quite your thing, the modules available are _excellent_.
I use the ESP32 in many of my art-projects and it works perfectly: enough memory, compact, cheap and the “fuses” that I use to engrave the edition number and signature. Programming through Arduino works well.
LCD driver for ASCII art (there is an image of the back with the chip visible):
I haven't done much with it yet, but I'm excited about the bare-metal (no_std) rust support for the esp32c3 (as opposed to some other variants that require a custom toolchain as I understand it).
I hope to eventually get it working with MQTT (there may be examples already, I haven't yet looked in-depth), at which point I think this will be my go-to for the majority of my IOT projects going forward!
They are callable from plain Rust. See the crate `esp-idf-sys` [1]. I'm using ESP32-C3, Rust, and the IDF for a DIY RGB room lighting solution [2]. Convenient MQTT and OTA updates -- it's great! Espressif and the Rust community have done an amazing job.
But I'd also love to be able to do the project without ESP IDF, using `no_std` and smaller specific crates instead. The solution for using ESP IDF in Rust is fairly convenient, but it does require some extra steps and the disk usage is not insignificant. A pure Rust solution would be more elegant.
The utility of these chips just makes me a little sad we don't quite have the same thing for hard-wired ethernet. The size you can hit with these is so small that the bigger component is always getting power to them, which is something a device at a similar cost/performance point could solve if you could hardwire it and run PoE.
cnlohr managed to add Ethernet to an esp8266 module (1) using the I2S signals also present in the esp32-c3, plus a driver and magnetics. Once the transformer is in place, passive PoE can be added with a regulator connected to the relevant transformer pins, probably 2 and 7 in the schematic at (2), but yeah, the size grows significantly.
Something like 10BASE-T1S (802.3cg) would be nice. Single-pair Ethernet with PoE (PoDL) and bus-like connectivity (from what I've read, I haven't had the opportunity to try it in practice).
What kind of power supply concerns? I've used quite a few of them and never had a problem.
For USB-powered development boards IMO the best option is the raspberry pi power supplies -- they are cheap, high quality, and can supply plenty of current. It's sometimes necessary to bridge over the input diode if you want to draw any significant current from the 5v pin.
If making your own boards the maximum 3.3v current is higher than you might expect (0.5A), but it's perfectly doable.
Piggy-backing them on mains USB chargers has got me a long way, but what I want right now is something I can strap an 18650 to, wake out of deep sleep once every five minutes to squirt a reading over MQTT, and recharge once a month (or whatever, as long as it's better than once a week). I haven't seen a board with the lipo management onboard to make it easy, but I also can't imagine it's that rare a use case.
My smart home is also my home security solution, so I wanted it to work through a (short) power failure.
Getting a UPS for my Home Assistant node was the easy part. Like you, I was worried about the many nodes scattered throughout the house that needed power.
The solution I've come to for now is using a USB plug for mains power, that goes through a DIY USB battery bank charging board (that I got off AliExpress) that's plugged into an 18650 cell. Most of these boards can either charge or provide a 5v out, but some are happy to do both at the same time, and can act as a mini-UPS.
I use feathers a lot in hobby level projects. I wish they could use a 2s battery instead of 1s only. Other parts of my projects sometimes need the extra voltage and it would simplify things if the feather could handle (and charge) a 2s.
I love my ESP32-S series. I use them in tons of LED projects[1]. Sadly it seems WS2812b are fundamentally incompatible, under FreeRTOS, with wifi due to timing glitches from interrupts.
In my next two projects I'm going to have to run two 30ft cables, because I can't figure out how to get low-latency wifi to work on a Raspberry Pi RP2040 (where PIO is fabulous, but 500ms request latency is killing me) or without glitches on an ESP32.
Been using ESP32 to drive WS2812b LEDs for years. You need to use the SPI peripheral to drive the LEDs, using a driver that outputs the data format the LEDs understand. The SPI peripheral uses DMA and does not glitch when wifi is accessed at the same time.
One of my favourite series of chips. Use them in pretty much every project I do. The combination of a pretty powerful processor along with built in wifi and Bluetooth is perfect.
Edit: apparently it doesn't. The ESP32 however does. What is the major use case for going S2/S3/C3 again? It seems the regular ESP32 has it all, and the price is excellent.
The ESP32 is pretty old and will reach EOL "soon", at least when you are thinking on a larger scale with long product lifecycles. You wouldn't design it into a product you start developing now and want to put on the market by the end of next year for at least 3 to 5 years. Also, things get better: you really don't want to release a product next year that still sits on Bluetooth 4.2, for example.
I know ZERO about Hardware and IoT, but I would love to learn to tinker with these things like ESP32 and Arduinos, is this guide comprehensive and easy enough to follow to go from zero to building something?
Once setup, you can open any HA on a computer of your choice, plug in an esp* via usb, and flash it from your browser.
Once it's flashed over usb once, you can disconnect from your computer and flash it via OTA (i.e. remotely) in the future. You can connect whatever sensors or outputs you want and change it at any time.