Wellington Ma's business card was a rectangular slice of pink synthetic quartz, laser-engraved with his name, 'The Ma-Mariano Agency,' an address on Beverly Boulevard, and all kinds of numbers and e-mail addresses. It arrived by GlobEx in its own little gray suede envelope while Rydell was still in the hospital.
"Looks like you could cut yourself on it," Rydell said.
"You could, many no doubt have," said Karen Mendelsohn, "and if you put it in your wallet and sit down, it shatters."
"Then what's the point of it?"
"You're supposed to take very good care of it. You won't get another."
Loved the article, but loved learning about MicroPython even more.
As an amateur programmer who has never physically handled hobbyist microcontroller boards, but has read a lot about them, I found it amazing to be able to try out the emulator at https://micropython.org/unicorn/
Make the LEDs blink, see how servo control works, etc., all in the browser without having to actually buy any hardware! I had never seen anything like it.
I'd love it if others could point me to more advanced emulators of hardware microcontrollers (if such a thing exists!)
MicroPython is fantastic for education and rapid prototyping with hardware - definitely try it out! It can be difficult to build complex long-running applications because of post-GC memory fragmentation and the VM's frequent dynamic allocations, but the Python syntax makes it all worthwhile. It also has a friendly community where it's easy to get help, and the official PyBoards are great.
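For a sense of how little ceremony there is, a blink loop is only a few lines. This is just a rough sketch - the pin number is board-specific (GPIO2 happens to drive the onboard LED on many ESP8266 boards, active-low), so adjust for whatever you're running:

```python
# Minimal MicroPython blink loop. Pin 2 is only an example: on many
# ESP8266 boards it drives the onboard LED (active-low); change it
# to suit your board.
from machine import Pin
import time

led = Pin(2, Pin.OUT)

while True:
    led.value(not led.value())   # toggle the pin
    time.sleep(0.5)              # blink at roughly 1 Hz
```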
MakeCode is also similar to what you describe - check out Adafruit's Blockly/JS simulator:
Note that CircuitPython is Adafruit's fork which focuses more on education than industrial prototyping. So even within the embedded Python ecosystem, you have options.
Embedded development is so easy these days, it is just fantastic. The barrier to entry, in both cost and complexity, is so low now that just about anyone can apply physical automation to the problems that they see in their life and community. I'm excited to see what emerges in the next couple of decades.
We've made some pretty capable hardware at my job on top of the pyboard/MicroPython. We've outgrown it in certain cases, but it's really powerful being able to build something for potentially critical business needs (like one-off testing platforms) without spending weeks rewriting drivers.
Hardware is so cheap and accessible these days, just get an Arduino board. There are lots of kits available with sensors and actuators, and it's all so easy to use it's like building with Legos.
If you've never tried Python, with its REPL, for the kind of project you'd otherwise use an Arduino for, it's definitely worth giving a shot. It's a different kind of game than C.
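To give a flavour of that difference: instead of an edit-compile-flash loop, you open the serial REPL and poke at the hardware live. A hypothetical session might look like this (the pin and ADC numbers are placeholders for whatever your board exposes, and the reading depends on what's actually wired up):

```python
# Interactive MicroPython REPL over the USB serial port
# (pin/ADC numbers are placeholders; adjust for your board).
>>> from machine import Pin, ADC
>>> led = Pin(2, Pin.OUT)
>>> led.on()        # the LED changes state the moment you hit Enter
>>> adc = ADC(0)
>>> adc.read()      # read the analog input right now
512
```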
To me it's truly mind-boggling that a tiny $1.42 chip contains almost everything needed to boot Linux: a 500 MHz CPU, 32 MB of SDRAM, a 2D GPU, SD/MMC support, and a USB controller, all packaged inside a 10mm x 10mm part. It makes me really want to get into embedded development.
Calling this an embedded system feels like an insult to the spirit of 'running light without overbyte', because it's so comically capable.
My first PC had a 66/33 MHz 80486 processor, 8 MB of RAM, and a 320 MB HDD. You could run AutoCAD, play Doom and Civilization, dual-boot Linux and Windows 3.11, etc. It could compile Linux from source.
You were living large! My first Linux computer was a 20 MHz 80386SX with a 40MB hard drive partitioned half for Windows 3.0 and half for Linux (SLS) and swap. It had a whopping 4MB of RAM and I also compiled my own kernel making sure to trim down features as much as possible. It was magic seeing X11 come up for the first time just like the big-bucks Sun boxes!
My XT had an 8088 @ ~4 MHz in it, and we pushed it uphill to school both ways, in the snow!
It was multi-functional though. The power supply was inefficient enough to heat the room, while simultaneously being heavy enough to bludgeon large creatures for sustenance.
Something inside me misses the radiant heat, sitting here on a cold day with my chilly metallic laptop.
oh kids! i used to crank the wheel by hand on my analytical engine that my hacker friends and me built after there was a leak of babbage's plans! he never knew!
you kids and you petabytes of ram! back in my day....
/s
(i'm looking forward for these comments to be commented on a certain website i'm not supposed to name)
The prime directive of said website is to never name it on hacker news. But I'm pretty sure you can Google the prime directive itself and figure out where it comes from. As a hint, it starts with "ng".
Ran Windows 3.1, Civilization (VGA graphics), Railroad Tycoon, Star Trek: 25th Anniversary - and could have them all installed at the same time. Other programs included Championship Manager 93, and I think Day of the Tentacle.
I was a kid when my father first acquired a similar computer.
We ran Windows 3.0 and my father configured a boot entry for running Doom or Ultima 7, which required so much memory that it wasn't possible to run Windows, only DOS.
I remember feeling a bit envious that my neighbor had the more powerful 486, which could run Doom much faster than our PC could.
Made me laugh at "Dad configured a boot entry"... I remember the hours with my Dad trying to stuff whatever device drivers we could into that 640K via Autoexec.bat and Config.sys. Got the CD-ROM running, but damn if the sound doesn't work now. Those were the days, trying to get TIE Fighter (in my case) to work.
Worst part was that my Dad, in his wisdom, bought a family PC without a Pentium but with a Cyrix P166. Had basically zero floating-point performance. Ran like a damn dog on any 3D game.
Any Brits out there might remember Time Computers. On every back page of any newspaper ever selling PoS computers with whatever subpar hardware they could cram into a big beige box ;-)
Not to brag, but I got a 4.77 MHz 8086 XT PC for my 8th birthday :) It had a whopping 256KB of RAM, a monochrome display (later upgraded to a 4-color CGA that blew my mind), and two (yes, TWO) floppy drives.
My dad got one of those for us both to use...what was truly "whopping" about it was its price...
If I remember right it was almost $3k...but that included that blazing fast 300bps Hayes modem for wide-area networking.
And my mind was truly blown when we replaced the 2nd floppy with a more-memory-than-we-will-EVER-need 20MB hard drive... what could we do with all those bits??
And a decade before that, the distributor DigiKey was just a one-column advertisement in Popular Electronics, selling 7400 DIP ICs for $0.29. Inflation-adjusted, that $0.29 now buys an IC that can run Linux.
Wow, I admire the far-sightedness of your parents, to give a child a computer at such a young age (and in those early days of PCs).
I was of a similar age when my dad brought home an NEC PC 9801 - with an 8086 CPU (8MHz), 640KB RAM, kanji (~3000 characters) font ROM, even a Japanese word processor. I think it ran MS-DOS 2~3.
"In 1987, NEC announced one million PC-98s were shipped."
That was a big wave, and I'm so glad my parents let me play with a computer as a "toy" - it was a huge influence on my mental development.
Kinda like the monolith moment in 2001: A Space Odyssey. :)
My parents had no idea, and they couldn't afford it anyway :) My dad asked his cousin, who happened to be working with PCs way, way back and recommended he get a PC instead of a C64 which is what everyone else had. My dad asked his dad and my grandfather forked over what must have been an insane amount back in 1980's Israel.
Aww, that's so nice that your dad asked around and pulled resources together for you. And I'm sure grandfather knew it was a worthy investment in your future.
My parents refused to buy it, and instead let me play with the PC-98, where I learned BASIC, Turbo Pascal, even some 8086 assembly language.
I suppose if I have children, I'd discourage mobile phones/apps and instead give them Raspberry Pi, microcomputers, sensors, devices they can build stuff with.
This time of year is ripe for nostalgia. I recall that it was just about this time in 1993 during Christmas break that I loaded Linux version 0.96(?) onto this 386SX machine. This involved taking a stack of 1.44MB floppies and driving to a place where I had access to the internet. I'd choose SLS packages and copy each to an individual floppy. Then I'd drive home and load them one by one until I had a bootable system. And of course with floppies, it was inevitable that you'd get an error now and then. So, back to the car, download, copy and repeat. All to get away from the limitations of Windows 3.0...
Talking about Christmas, we used to go to my aunt's on Christmas day and I remember the time they bought an Amiga for my cousins. We were very young so between games we used to have a good laugh with the program that did text-to-speech (can't remember the name right now), but it only spoke English and we were easily amused by typing phrases in Italian and having them read back as if it was English (so the pronunciation was all messed up).
Same here, a Packard Bell, but IIRC it was 20/40 MHz with 40 MHz being the turbo. Before that I did have a Zenith 8088 dual 5.25" floppy system that had to boot DOS from one disk and run programs off the other (mostly written in GW-BASIC).
My first few PCs weren't even PCs, I didn't know what they were until I was a little bit older. Somehow I intuited them naturally though; I liked exploring and being curious about what I was looking at.
First one I remember was a Commodore 64, along with a 600 page book full of BASIC that you could type out and record on cassette to have your own game. The book was total gibberish to anyone else; it was just pure code with no explanation. But that's what the C64 gave you; an interactive environment on boot where you could program a new game and write it to a cassette. By default. If you wanted to play a game you had to type `RUN` or maybe one or two other things to set it up. But you wouldn't know that, because you just had an interpreter on a basic blue and white screen.
Worst bit was the 10 minutes of spasmodic strobe animations that showed you the game was loading. But also each game controlled those loading animations. You had to know what game you wanted to play, and be sure of it, or otherwise you could just flip to the B-side and get a different game.
After that I think we had a BBC Micro at school but I'm not sure. All I remember is an adventure game and one of those classic 5" floppies. I still really love the look and feel of inserting a floppy and manually locking it in. Floppies, cassettes, VHS tapes and MiniDiscs were truly fantastic for the time. They were still mechanical, unlike CDs.
Then on my dad's side I and my siblings got an Acorn PC and a bunch of random floppies. None of them made any sense but some of them did cool things when you ran them. I remember hiding from my family and putting in the floppies that made flashing colours and watching it until time passed.
Must have been 11 or 12 years old before we first got a PC and by that point I was utterly fascinated. It was some shitty off-the-shelf eMachines thing but it was the best we could get; I managed to retrofit a decent graphics card in it a little bit later.
There's embedded devices with an order of magnitude more CPU and two orders more RAM, if you're being strict about what counts as "embedded". If you're not, probably another order of magnitude on each.
Well now I feel old. I remember buying a 486 with 8MB and thinking "I'm really living in the future now!" The MHz were ridiculous and -- according to my memory -- most instructions ran in a single cycle too! (Warning: my memory is not a reliable gauge of the state of computing at the time.)
8MB was pretty extravagant but it turned out to be a good call even though it could be had for half the price within a few years.
Lucky! We had the same 486 system but only a 40MB disk. I had to spend some lawn mowing money to get a 400MB second drive so I could install Sid Meier’s Gettysburg. :)
Remember there was a lot less to the Linux kernel at the time. Wasn't too bad. The single-threaded compile time for Linux has stayed fairly consistent for a while, since the tree grows about as fast as the average desktop improves. The big improvement in compile time has come from being able to compile with 8-32+ threads.
Yes, that's true. It has been O(1 hour) for pretty much the entire history of Linux. And if you think about it, this makes sense. If it were significantly less than that, the development cycle would speed up so it would be easier to add capabilities, which would slow down the build. If it were significantly longer, the addition of new features would slow down until the hardware started to catch up. So the compile time acts as a sort of natural control mechanism to throttle the addition of new features.
I'm not sure that is the primary reason -- many other projects have had their compile times explode over many years, even though the same logic should apply.
Not to mention if you build the kernel regularly, you benefit from incremental compilation. If you change a few non-header files the rebuild time can be as little as 2-3 minutes. Oh, and "make localmodconfig" will reduce your from-scratch compile times to the 5-15 minute mark. I highly doubt most kernel devs are building a distro configuration when testing (especially since they'd be testing either in a VM or on their local machine).
When I worked at Microsoft, Windows took far, far longer than an hour to build from scratch. I remember walking a few buildings over to the burn lab to pick up DVDs of the latest build. I don’t have any hard data, but running a full build on your dev box was very rarely done.
The dynamics of commercial projects can be very different from open-source. You're much more likely to tolerate sitting through long builds if you're being paid to do so.
I'm convinced that compiler speed is all a matter of how it was initially designed.
Take Delphi, for example, which can compile millions of lines of code in under a minute or something ridiculous. Then there are D, Go, Rust and the like: they compile, in much shorter spans of time, rather large codebases that would take C++ a good 30 minutes on today's high-end hardware (I'm not as familiar with how fast Rust compiles, but I know Go and D do pretty nicely, especially if you enable concurrency for D) - and which probably took those same 30 minutes on high-end hardware from about a decade ago.
Sadly, from what I have heard, C/C++ compilers have to run through source files numerous times before they finally compile the darn thing into anything meaningful. Facebook had a preprocessor written in D to speed things up for them; it technically didn't have to be coded in D, but Walter wrote the preprocessor for them, so he got to pick whatever language he liked.
The C++ language cannot be parsed with a context-free parser.
A large part of the build times, however, is due to optimizations.
In the early days, clang compiled much faster than gcc. Over the years it has improved its optimization output, but as a consequence has lost much of that compilation-speed advantage.
There are many examples on https://godbolt.org/ that show how much work the optimizer does. For example, the Eigen library (http://eigen.tuxfamily.org) relies on the optimizer to generate optimized code for all sorts of combinations of algorithms.
Rust doesn't compile very fast unfortunately, but it's being worked on. General wisdom says it's about as fast as Clang, but comparing compile speeds across languages in a meaningful way is difficult.
Fair enough, thanks for that, I wasn't sure; all my Rust programs have been small enough that I haven't noticed. I wonder if it would have made sense for Rust to consider this earlier on, in the same way that Pascal did. I believe I heard Pascal was designed to be easily parsed by the machine.
"Parsing" was probably not the best choice of word on the GP's part, but they meant that Pascal was specifically designed to be implementable with a naive single-pass compiler; of course that would exclude many optimizations we take for granted these days.
It's the layout of the code that allows Pascal (/Delphi) to compile everything in a single pass. By starting with the main file you can easily build the tree of dependencies and the public interfaces/functions versus the private implementation details.
C and C++ even though they have header files, make no restriction on what goes in a header file, so it takes a bit to figure out the dep graph.
But the preprocessor isn't what makes compiling C++ slow. It's a combination of the complex grammar, and ahead of time optimization of a (mostly) static language. Turns out you can compile code really fast if you don't actually try to optimize it.
Linux in the 90's wasn't as bloated as it is now. It definitely took well under an hour to compile a kernel. My first Linux box was a 386SX-16. I later upgraded to a 486DX4-100.
That doesn't sound right; it's still not bloated. The majority is drivers, then the CPU arch folder takes the next biggest cut. You can check yourself with cloc:
It used to take me just over two hours to compile a (very minimal stripped down) kernel on my 25 MHz 80486 with 4 MB of RAM (in the late 2.0.x/early 2.2.x days).
Long. I remember it being something like an hour or so to compile the kernel, but the kernel was also much smaller then. I specifically remember Linux 2.0.0 coming in a 2MB tar ball. Because it probably took me longer to download with my 28800 baud modem from an over saturated FTP server than it did to compile. 8)
I mainly work in web dev, but embedded systems are kind of my hobby. It's refreshing working on bare metal, optimizing down to single bytes of data, when I'm usually working on layers and layers of abstraction every day.
For other low-level programming, Atmel's AVR Studio and their dev boards are incredibly newcomer friendly. Their source-level debugger (with the debug coprocessor) is a miracle.
If you'd like to get into "big iron", Embedded Linux is amazing. Raspberry Pi is a great start (lots of GPIOs programmable from userspace). To get into embedded Linux kernel programming, start with a loadable module on the Ras-Pi. Also, build/load your own Ras-Pi kernel.
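To give a concrete idea of what "GPIOs programmable from userspace" means, here's a rough sketch using the classic sysfs interface (newer kernels prefer the libgpiod character-device interface, but sysfs is the easiest to read). GPIO 17 is an arbitrary choice, and you'll need root or the right group permissions:

```python
# Rough sketch: blink a Raspberry Pi GPIO from userspace via the
# (now-legacy) sysfs interface. GPIO 17 is an arbitrary example.
import time

PIN = "17"
BASE = "/sys/class/gpio"

try:
    with open(BASE + "/export", "w") as f:    # ask the kernel to expose the pin
        f.write(PIN)
except OSError:
    pass                                      # pin was already exported

time.sleep(0.1)                               # give udev a moment to set permissions

with open(f"{BASE}/gpio{PIN}/direction", "w") as f:
    f.write("out")

for _ in range(10):                           # blink ten times
    for value in ("1", "0"):
        with open(f"{BASE}/gpio{PIN}/value", "w") as f:
            f.write(value)
        time.sleep(0.5)

with open(BASE + "/unexport", "w") as f:      # tidy up
    f.write(PIN)
```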
Other folks suggested the ESP8266 which is also great fun to use.
Edit: Learn C. Lots and lots of C. Embedded dev is almost all C. A teeny bit of ASM, sometimes C++. But almost all of it will be in C.
Can recommend https://buildroot.org/ for building complete Linux system images. Been using it on multiple embedded projects, both personal and professional.
Somewhat related, I've also used buildroot in both AWS and GCP to run workloads from read-only system images. Quite liberating in my opinion. No ssh, no ansible, etc. Build the image, launch it and off it goes. GCE even allows you to use the same boot disk, if mounted read-only, for multiple instances, perfect for these type of images.
I can second this as well, Arduinos have a bad rep for being too beginner-focused but AVR is a super solid platform and you can do a ton with them. I'd pick Arduino over RasPi for a starter project anytime.
I'm a React Developer in my day job, but I'm disillusioned with the web as a whole and I've always had an interest in low-level stuff.
At the moment, I'm doing a structured C book course (C: A Modern Approach), and I've also signed up for the edX course "Embedded Systems - Shape The World: Microcontroller Input/Output"
It uses the TM4C123 board, which from a look around seemed to be a decent enough board for a beginner. I'd seen complaints of Arduino, but I'm not experienced enough to know the validity of their claims.
Either way, I'm having fun. Not sure if I'd switch career as Web Dev is full of decent paying jobs but it's nice to have a hobby that's different from the day job.
FWIW, those TI boards have tons of hardware bugs. Way more than you'd even typically expect from embedded hardware.
And if you don't use the arduino ecosystem, but use an RTOS on an Arduino, it's a perfectly valid dev board for learning real embedded systems development.
I've heard about the hw bugs a little but couldn't find much information about them.
I'm mostly just using it for the course which was highly recommended in multiple places, so I hope I won't encounter many of these bugs.
To be honest, the whole picking a board thing was quite overwhelming, with many recommendations, boards, variations, etc. Hopefully once I've finished the course I'll have some more knowledge to help me pick the next board.
I think Micropython is fantastic and you can get an ESP8266 + FTDI cable for around $25. You can make this a web server and do fun hardware integrations with it like flashing LEDs for successful CI builds. Here's a good tutorial: https://docs.micropython.org/en/latest/esp8266/tutorial/inde...
I highly recommend the ESP8266 and ESP32 once you're a little more familiar with the arduinos.
They have Wifi, and the ESP32 is actually very powerful and has two cores. They make for great DIY home automation/IOT devices and are pretty easy to work with.
Second this. The ESP8266 (a nodeMCU) is about as powerful as an Arduino Uno, is about half the size of a credit card, is very thin and costs around $3.
The wifi part is the biggest advantage. You can send sensor data directly to your server via simple http requests.
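To show how little code "send sensor data via simple http requests" actually is, here's a rough MicroPython sketch for an ESP8266. The SSID, password, URL and sensor wiring are placeholders, and it assumes the urequests module is available in your firmware build:

```python
# Rough MicroPython sketch for an ESP8266: join WiFi and POST a sensor
# reading to a server. SSID, password, URL and wiring are placeholders.
import time
import network
import urequests
from machine import ADC

wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect("my-ssid", "my-password")
while not wlan.isconnected():        # wait for the connection to come up
    time.sleep(0.5)

adc = ADC(0)                         # the ESP8266's single analog input

while True:
    reading = adc.read()             # 0..1023
    urequests.post("http://example.com/api/readings",
                   json={"value": reading}).close()
    time.sleep(60)                   # one reading per minute
```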
The ESP8266 is way more powerful than an Arduino Uno. It's got a 32-bit 80 MHz processor with 96K of RAM and WiFi, often coupled with 4MB of flash. The Uno has 32K of flash, 2K of RAM and (from memory) a 16 MHz 8-bit CPU.
No matter what you do, you NEED tools. I cannot stress this enough. If you can spend $250-$500 on a nice desktop scope, awesome. If you only have $80, you can get a serviceable Hantek USB scope that will at least give you an idea of what you're doing. If you can only spare $20, you can at least get an el cheapo JYETech scope. In addition, you'll want to pick up a multimeter and probably a logic analyzer. Again, no need to go all out; a cheap $20 meter and a $15 Saleae knockoff will get the job done when you're just starting out. DO NOT SKIP OUT ON THESE. Embedded development without being able to see and know what you're doing is miserable, so unless you're doing this because you want to be frustrated, just buy the tools you need upfront.
As for what microcontroller to actually learn on, I would say the MSP430 is a very good starting point. It's a fairly mundane 16-bit RISC microcontroller series with very forgiving electrical design requirements, very good documentation, and very good community support. They make a devboard (TI calls them Launchpads) for the MSP430G2553 that's more than enough to get a beginner started. When you need a little more power, you can either opt to invest in learning the ARM ecosystem, or go for something a little more exotic. Just about every manufacturer makes an ARM microcontroller of some sort, so if that's what you're interested in, take your pick and go with it. If you're looking for something else, the Renesas RL78 and RX series provide a lot of functionality if you're willing to deal with how stodgy Renesas can be.
Some important notes:
1.) Don't bother with Arduino. They were a much more compelling product 15 years ago, when you had to pay thousands in tools and compiler/environment licensing to get in on embedded development. Today, what Arduino nets you is a painfully barren environment that abstracts away a lot of what you're trying to learn when you're starting out. Losing out on the debugging, profiling, tracing, disassembly, memory usage statistics, etc. that modern development environments give you will do nothing but stunt your growth, especially if you're used to having all these tools while writing desktop software.
2.) Be careful with (or preferably just avoid) starting with embedded Linux; it's pushing the limits of "embedded". You're going to miss out on a lot of important knowledge and insight jumping straight into using an operating system (and a very heavy one at that), and for many applications, it is MASSIVE overkill. When you start, you're not going to need an RTOS. When you need an RTOS, you're going to reach for something more reasonable, like FreeRTOS. If FreeRTOS doesn't cut it, then you can start looking at Linux.
3.) Don't get tangled up with Raspberry Pis; the microprocessor on these is complex and the documentation is severely lacking/nonexistent. RPis are much closer to a desktop computer than they are an embedded system.
If you really want to get it, I would say one of the most useful exercises is implementing your own microcontroller/processor. You can pick up an FPGA devboard for fairly cheap, and there are plenty of textbooks (Harris & Harris' Digital Design and Computer Architecture comes to mind) that will get you through most of the key concepts. Once you've done this, a lot of the gaps in understanding when dealing with a microcontroller will be filled in. This exercise isn't strictly necessary, but I don't know anybody who has done it that wasn't better off for it.
My final note is to buy or "acquire" Horowitz and Hill's The Art of Electronics. Embedded development is inseparable from electrical engineering, so even if you don't read it front to back, there are some sections that you will definitely be visiting if your background isn't in electronics.
I wouldn't say that. Yes the Arduino "libraries" abstract away a lot of the complexities and hinder true understanding. However for a beginner it is a perfect platform simply because of the huge community i.e. tutorials, code and projects. Once they gain confidence from using this, they can move on to "traditional embedded development" by learning to program the underlying AVR directly i.e. use the toolchains installed by the IDE, delete bootloader and upload your program using the ISP interface (see the book; "Make: AVR Programming"). This gives you the best of both worlds using the same development platform. Another advantage is that many other MCU families also provide a "Arduino-like" interface (eg. Energia for MSP430) and thus all the skills learnt on one can be transferred to another.
My take is driven by me finding the Arduino platform too easy. My first exposure to embedded development was Energia, and I was so underwhelmed that I didn't bother with it again until years afterwards, when I downloaded CCS and got to play in the "big leagues". I know multiple people that had that same experience.
If you're already writing software, you're not going to be struggling with having to learn programming. Translated to Arduino, this is great for engineers who don't write software and just need to get simple shit done fast, but for those with a background in software, it feels equal parts mundane (I called analogRead and... got an ADC value, just as expected) and magic (what did analogRead actually do?). Given, you can go look at the source for these libraries, but a lot of them make very generous use of the preprocessor for the sake of supporting multiple products in one codebase, and it's often not pleasant to read. Working with an MSP430 (or basically any architecture if you're using more capable tooling, like you mentioned with AVR) has a certain clarity to it that the Arduino ecosystem just doesn't seem to capture.
I would make the same argument for AVR and the "real deal" tooling as I do the MSP430; why bother with Arduino when you're probably coming in with decent programming skills?
>why bother with Arduino when you're probably coming in with decent programming skills?
Pure Software programming skills are NOT enough when it comes to embedded programming. You need to know the logical interface to the HW and the EE/Electronics behind the HW itself. This is a huge challenge for people who have only worked with applications software over layers of abstraction. This is where the Arduino shines; hide the overwhelming HW complexities and provide a simple api for software guys to get their job done. This allows them to slowly gain HW knowledge and transition to "traditional embedded programming" as needed. Also, many experienced embedded developers are using the Arduino as a rapid prototyping platform to implement an idea quickly and get clarity on the project before implementation on the actual target using traditional means. Learners can do both on the same Arduino platform.
So here is my recipe for somebody wanting to learn Embedded Development:
1) Get an Arduino Uno, a couple of extra ATmega328Ps, an AVR programmer and an electronic components toolkit.
2) Get a book on Arduino programming and AVR programming. I can recommend "Exploring Arduino" and "Make: AVR Programming". You also need a book on Electronics and "Practical Electronics for Inventors" is a pretty good one.
3) Install the Arduino IDE and go through some blinky tutorials. Do some projects from "Exploring Arduino".
4) Now move on to direct AVR programming. The Arduino IDE has already installed the required toolchains. Set up a Makefile environment using them, following the instructions in "Make: AVR Programming". Do some blinky and other projects from this book. This gives you a command-line approach to AVR programming.
5) Next repeat the above using a proper vendor supplied IDE eg. Atmel Studio for AVRs. These IDEs are pretty complex but quite powerful and used extensively in the industry. Go through some tutorials and redo the blinky and other projects using the IDE.
6) Get some test equipment tools to look into the inner workings of the system. I recommend the multi-functional "Analog Discovery 2" which has lots of learning tutorials.
Congratulations; you are now a bare-metal "Embedded Developer"!
To enter the "big boys league", move onto ARM Cortex boards.
Finally you get to "Embedded Linux" and become a "Master Super-Duper Embedded Developer" :-)
A great book, particularly for learners/students, is "Introduction to Embedded Systems: Using Microcontrollers and the MSP430" by Jimenez, Palomera et al. For every feature of embedded programming it gives an illustrated, platform-independent explanation followed by how it is realized in the MSP430. This cements understanding like nothing else.
FPGA is pretty challenging to get started with. If you decide to go in on it, you're going to spend quite a bit of your time at the beginning getting your ass kicked. Vendor tooling blows, open source tooling is playing continuous catch-up or is much more focused on VLSI, and resources for beginners feel pretty anemic compared to what you're probably used to coming from the software world.
I can't tell you if this is "the way", but I can tell you how I started. I flipped a coin to decide between Altera and Xilinx and started with a Terasic DE2 board (an Altera Cyclone II devboard) that I borrowed from my university. I don't recommend using this board or something like it (it has actually been superseded by a nearly identical board with a Cyclone IV in place of the dated Cyclone II); the extra peripherals are a headache more than anything, and the simple breakout pins on the Terasic DE0-Nano are greatly appreciated. As for the environment, you can go and download Quartus from Altera basically no-questions-asked.
After pissing around with the board for a bit, I decided to pick up a book. Rapid Prototyping of Digital Systems by Hamblen, Hall, and Furman is good, if a bit out of date. I had this book to thumb through instead of read front-to-back. It is written for older versions of Altera's Quartus software, but with a little bit of exploring the UI I was able to find just about everything I needed. It makes a decent quick reference for Verilog and VHDL syntax, has quite a bit of info on interfacing various things to an FPGA (think PS/2 mouse, VGA, etc.), and a couple of chapters on microcontrollers and microprocessors.
An important bit to know is that a lot of the usage of an FPGA (at least in the way I use them) happens in two phases: the actual logical design, which you'll do on your computer via simulation, and the actual usage of that design on the FPGA. The logical design happens largely in ModelSim (with Quartus only being used to generate libraries at this stage), or some other Verilog/VHDL simulation tool. Altera ships ModelSim free-of-charge with Quartus. It is your run-of-the-mill Tk-based clunker of an EDA tool, but it works and I haven't had it crash on me yet, so I can't really complain, even if I have some gripes with it. This is where most of the work/magic happens, at least for a beginner; you write your logic in your editor of choice, write testbenches, and simulate those testbenches in ModelSim. Having read the Harris and Harris book mentioned in my initial post in the past, I took a quick look at a processor called "Ahmes" on GitHub, and just had a go at it. After getting a primitive processor working/executing instructions, adding some peripherals like some timers is where things started to come full circle, and I started realizing both the power of the FPGA, and why things are the way they are on a microcontroller/processor.
The bit where you actually put it on the FPGA hardware is largely uneventful, or at least it was for me. I didn't have multiple clock domains or anything like that, so I didn't suffer from any timing issues, and basically had the good fortune of just mapping pins and letting it rip. In theory, actually translating the design to the chip doesn't do much, but in practice, a design that exists only on your computer isn't terribly useful. Actually using the hardware adds that "visceral feeling" and lets you play with what you've spent so much time doing, along with getting you familiar with the process if/when the day comes that you actually need to do it. You also get to enjoy/experience the power of bolting on arbitrary hardware to your jalopy processor for the sake of VGA output, GPIO, timers, SPI communication, etc.
I wouldn't consider myself terribly talented or knowledgeable, so if you just throw enough time at it, you can probably end up having just as much fun as I did.
I'd recommend an ESP32 (or its slightly less capable little brother, the ESP8266) using the Arduino environment to get started. If you prefer JavaScript, you can program it that way using something like NodeMCU or MongooseOS. Once you get comfortable with that, if you want to get a little lower level you can use PlatformIO with the vendor's SDK directly.
I like the ESP boards from Wemos. The US made ones from Sparkfun and Adafruit are also really good. The advantage of the Wemos ones (and the many counterfeits you'll see on sites like Aliexpress) is that they're so cheap you don't even have to feel bad if you fry one.
For anyone interested in embedded JavaScript but wants broader hardware support and/or modern JavaScript support[1], the XS JavaScript engine is microcontroller-independent[2], and Moddable provides supported ports for ESP8266, ESP32 (the successor to ESP8266), and the Gecko series of processors. (They also sell some hardware kits for folks who want that: https://www.moddable.com/product.php)
The roadblock has always been the MMU. Vanilla Linux won't work without it, and uCLinux is always in some strange phase of support for newer architectures if it's supported at all.
I found a decent port of uCLinux to the Cortex-M3 once, but the host core was 120 MHz and it was pretty much a science experiment and not much more.
Raspberry Pi's goal is computer science and coding education. They aren't going to make anything that isn't beginner friendly, and can't at least run a Linux GUI, have basic networking, support standard peripherals etc.
I don't understand how that's relevant. You can still release something smaller and less powerful, and still be accessible to beginners.
Why are you arguing about having standard peripherals? It doesn't let people learn about I/O.
I was in high school and we were given embedded hardware and were able to program it using assembly and upload a program into it.
If you give students some very minimal embedded hardware with wifi and some terminal, they should be able to learn using that. Let them install things, maybe setup a 2D interface...
The RPi's educational value is only enabled by the I/O pins. I'm not sure that's really worthwhile.
The RPi is interesting because it's a powerful but cheap computer. I just wish there was much cheaper hardware to show people you can also do things with smaller stuff.
Because it also means increased power consumption. If you're trying to run something off battery or solar, not using hardware that's more powerful than you actually need can be necessary for the project to be viable.
I think it's fantastic. I don't hand out physical business cards often, so the recipient would be someone who I feel would genuinely find the project exciting and intriguing, and it provides a great conversation piece.
It's simple enough to discuss it briefly and mention that for any security-conscious individuals, they can check out the GitHub that's stated on the card and see the code and still have the same insight if not more than if they had just plugged in the device.
For anyone else, the information they need from a business card is all still nicely stated, in a design that's enticing even without functionality. The production cost is easy to justify with the chance of achieving whatever your goal may be, whether it be securing a job offer or generating new business.
Maybe I'm twisted, but my first thought is that if I were a pen tester I would want business cards exactly like this. And yes, they would run Linux... and some other things. :-)
Ha someone on Reddit suggested this. See the thing is, they have my contact info printed on them, so I'm really easy for the police to find after everything is hacked!
I do wonder how easy it would be for someone who has your business card to frame you... Is it possible to overwrite the OS on these with something malicious?
It is absolutely possible to overwrite the operating system. You have full root access when you log in. You can dd right over the flash device /dev/mtd0 with whatever you want.
As to framing me, well... I don't think it's much easier than framing me using some other method. And I think the cross section of people that want to frame me, and people who know how to compile Linux, is hopefully very small.
Well, it is appropriate as a business card for an embedded systems engineer, not so much for a systems security one!
More seriously, I never managed to understand the "obsession" with ROHS (this here is just an example):
>I made sure to have a lead-free process—the boards, parts, and solder paste are all RoHS—so that I wouldn't feel bad about giving them to people.
I mean, you are creating something that (besides being very nice) is not needed, that will very likely end up in the trash soon (and that contains anyway a few grams of "far from clean" materials), and you worry about the tiny amount of lead in the solder?
Industry wide, it's about sanitizing a highly toxic substance from the supply chain. I've met a child whose father worked with lead, and would hug him after coming home. The child ended up with permanent brain damage from ingesting traces of leaded dust on the father's clothing. In this case, it's more likely about keeping lead out of something likely to get crunched up in a parent's wallet and then shown to their child.
Lead causes permanent brain damage in children. There is no detectable amount of lead in the blood of children which is considered safe. While an adult body absorbs roughly 95% of lead into its bones, children's bodies end up storing lead in blood and soft tissue where it causes damage. When untreated, it can take months for lead to exit a child's bloodstream.
I would never consider reducing the lead used in something I've built to be wasted time.
I have no idea. I'm not a doctor, I just read a lot.
I used lead based solder as a child, and I turned out okay. That said, people used to use pewter plates and got sick from lead poisoning after eating tomatoes (the acid leached the lead from the plate); Roman soldiers would keep their wine in a lead container to sweeten it; pretty sure I've heard something about people painting their houses with leaded paint as well. On the one hand, society survived all that. On the other hand, if I have the opportunity to not risk poisoning a child by spending a couple extra dollars buying solder, that seems like a pretty easy decision to make.
I don't get this lead-free hatred on internet forums. People say they can't solder with lead-free, and that it is terrible, and that joints look dull. So much hatred that I actually tried it out. And I found the process is basically just the same and there is nothing to be afraid of. Just set the solder iron temperature a bit higher and that's it. Works like a charm.
My note was not at all about lead-free hate; actually, for several years now everyone in the EU (and I believe in the US as well) has been using lead-free solder (ROHS is from 2003, if I recall correctly).
As you say it is just a matter of slightly increasing the soldering iron temperature, but it doesn't end there and there may be issues further on (JFYI):
It seems like lead-free alloys tend to be more problematic in these - fortunately rare - cases.
But because it has been more than 15 years since ROHS came into force, I read "and I made it lead-free" in 2019 like I would read "and I buckled my safety belt" or "I put my helmet on", etc. I see it as "the normal" way, nothing worth a mention (nowadays).
Bismuth is used as a replacement for lead, if you want lower temperature solder. 58Bi/42Sn alloy is cheap and about as good as Pb/Sn, but is more susceptible to thermal fatigue. I used 57Bi/42Sn/1Ag to solder heatpipes. Works great!
Washing your hands after handling solder seems to be a fairly universal recommendation. It's not a concern for normal products because the users do not touch the bare PCB.
Sure, the lead may cross one or two layers of cloth and migrate from the pocket to your skin and bite you.
More considerate would be to provide the card in a protective sleeve (that would protect from all the bad, bad substances it may contain), or just use a good ol' plain paper business card, possibly printed with "bioethical" ink.
>You get it on your hands and then ingest it when you eat. (People don't usually wash their hands after touching a business card.)
I was referring to the parent poster that talked of people putting them in their pockets.
Anyway, ROHS (or lead-free) is a good thing, but it is not like you get lead poisoning (or saturnism) because of the one time in your life you touched something containing lead and then - if you eat with your hands - managed to ingest it:
You need to drink tap water from lead or lead-soldered pipes, or water otherwise contaminated by lead, to become poisoned by it.
In the case of the Linux business card, the exposed surface that might contain a minimal amount of lead - pre-ROHS solder, the common eutectic Sn-Pb alloy, is 37% lead - is in the below-1-square-mm range.
You would need to actually lick hundreds or thousands of such cards to ingest any meaningful amount of lead.
And lead free soldering tin may contain (in minimal amounts):
I wish I could find a source on this, but when my 12mo got his routine lead test, his lead levels were elevated. It was then that I learned that even "safe" levels of lead can (it appears, inconclusively) result in long term permanent changes: reduced IQ, increased aggression.
I will forever feel guilt about letting him play in the yard as a baby (my best guess as to where it came from) and take other efforts to reduce the lead in his environment.
Depending on your life situation, it may be impractical if not impossible to reduce the lead in your environment. I am open to being further educated on this subject, but it seems like reducing lead in all products that will be handled by people is a net benefit for getting to home environments that will not cause brain damage to the infants that live in them.
> I was referring to the parent poster that talked of people putting them in their pockets.
Right, but how are you going to put it in your pocket without touching it with your hands?!
>You would nead to actually lick hundreds or thousands of such cards to ingest any meaningful amount of lead.
Lead poisoning is cumulative, so you want to avoid ingesting even very small amounts. I doubt that good data is available regarding the amount of lead that would be ingested in this scenario or its potential effects. Best to be cautious. Lead free solder is not expensive.
Avoiding leaded solder is trivial, and it is something that's hopefully handled a bunch, so why not avoid lead? I didn't know "following best practices you have to follow in your daily work anyhow" is "obsessing".
It’s actually probably harder these days to build something that isn’t RoHS compliant as a hobbyist than one that is. The vast majority of what you can buy from Digikey and Mouser and whatnot will be compliant, PCB shops will mostly do lead-free HASL (though you’d want ENIG anyway if you’re going to use a stencil), and lead-free solder pastes are more common than leaded ones. Basically you’ll probably have to either use old stock parts or pull some really old solder out of the fridge as a hobbyist to end up with lead.
I meant - jokingly - the opposite, i.e. you call in a security engineer for a consultancy because you are interested in improving the digital security of your business, he hands you the Linux USB business card, you insert it into your computer, and he says:
In fact, a USB condom would be a nice variant of this project for a security engineer's business card, although the female USB portion might be a bit bulky for practical use.
Good to see you here, and congratulations on your idea and creation.
I replied to that in another post: ROHS is 15 years old or more and most available parts and solders are ROHS compliant, so I believe you would have had a tough time intentionally making something that was entirely non-ROHS compliant.
I spent a month visiting a PCB factory in China. I met the people working with lead soldering.
Seriously, please choose lead-free.
Pregnant women weren't allowed to eat in the cafeteria due to the high number of miscarriages. The people inhaling the stuff all day... let's just say they had a reputation among the other factory workers for being a bit mentally challenged.
The whole beauty of USB ports is that we have one unified interface for practically everything. It can't be safe without giving that up, adding some friction to user interaction, or doing some dark art involving centrally signed devices that we expect won't be broken. If anybody can produce storage and input devices that plug into a universal port and just work, malicious input devices can be made to physically look like storage devices in the eyes of the user. I'm not saying we shouldn't make more security conscious tradeoffs, I'm just saying that there are tradeoffs to be made, and most users probably won't appreciate the added friction.
OTOH, a security conscious engineer could exploit this to have the business card call home and automatically block communications from the company so one does not waste time with them ;)
It would be funny to have the business card list known exploits on their system -- whether this would be taken lightly by the hiring manager is another question.
I don't see why you were downvoted. I fully get why the blog writer didn't include USB-C with this awesome project. But there's now a growing group of people who simply can't use USB-A anymore with their laptops. Most will have an adapter, but not all.
I've played around putting USB connectors directly on PCBs, and USB-A is way more lenient than USB-C. With USB-A, you can use basically any PCB thickness between the standard 1.6mm and 2.2mm, and it'll work well enough.
USB-C's midplate is specified at 0.7mm, and all of the cheap chinese PCB manufacturers only offer 0.6mm and 0.8mm. 0.6mm is too thin to make reliable contact with most of the cables I've tried: I haven't gotten around to ordering another batch at 0.8mm, but plugging the corner of a 0.8mm M.2 SSD into a USB-C cable, it's pretty (i.e. probably slightly too) snug. Thin PCBs are also extremely flexible, which isn't great for brittle solder joints, ceramic capacitors, etc.
While this is generally true, I think it's less true for embedded systems. All of the tools I use at work for embedded systems engineering (JLink, serial adapter, oscilloscope, logic analyzer, nearly all MCU dev boards and FPGAs) are USB-A. I use a MBP and simply couldn't do any work whatsoever without a USB hub. I think it's fair to assume anyone in the industry can use USB-A.
In the industry, yes, I agree. Except for maybe managers but even they probably have a dock on their desk that includes USB-A.
Edit: I remember something, having done work related to embedded stuff. About 6 years ago, we still had 3.5" floppy drives in a couple of scopes in the lab. We replaced those with a device that internally had the interface of a 3.5" floppy drive, but on the outer faceplate it had a USB-A connector for thumb drives.
I would assume someone doing embedded work would not only have to have USB, but serial. Is that incorrect? It seemed like that was even included in your list?
I usually have two or three USB to serial converters plugged into my computer at work at any time. I expect most embedded people would do similar (or use serial console servers over a network).
Yeah, they sell them but they're not supposed to. It's because putting an A on each end of a C cable and connecting two A hosts together can destroy the two A hosts.
Practically speaking, since you can find them as you say, the consequence is that as a peripheral vendor in the West you can't ship a C device with a C-to-A plug, because you'll lose your license.
USB-A is not 'just fine'. It only goes in one way, which is an insane design for something being constantly plugged in and out. It's also low-bandwidth and low-power. Most devices these days have Bluetooth, so it isn't really needed except for power and monitors. USB-C is a step up in everything and I'm happy with only USB-C.
Not everything needs to be constantly plugged. My desktop doesn't even have Bluetooth in it. I'm quite happy with the USB A ports on my PC when it comes to plugging in mice, keyboards, USB storage, etc. If you use USB devices on a regular basis, you'll remember which way USB ports are oriented, and you can just look at the end of the USB cable to see which way it needs to go in.
Having said that, I will say the USB C connector on my phone is far superior, as I plug/unplug it daily.
Power and bandwidth aren't even related to connector type.
> If you use USB devices on a regular basis, you'll remember which way USB ports are oriented, and you can just look at the end of the USB cable to see which way it needs to go in.
You shouldn't need to remember orientation or look at it! You're apologising for a bad design.
> Power and bandwidth aren't even related to connector type.
Not true! For example, the max bandwidth you can currently put through USB-A is 10 Gbps with USB 3.1. Through USB-C it's up to 40 Gbps with Thunderbolt 3.
The asymmetry of the USB-A connector is at worst a minor inconvenience. Definitely not in itself a good reason to break compatibility. And most USB devices are not frequently plugged in and out (indeed, for those that are, Bluetooth is a better choice!)
Growing pains; we'll get over it. USB-C fixes the problem everyone complained about forever of turning the plug over three times before getting it right-side up, and gives us enough contact points for modern video streams. I feel like making it compatible with the older USB standards by just connecting a subset of pins is super useful, and it's about as much backwards compatibility as we can expect without major sacrifices in other regards. I've got a phone with two USB-C ports, a hub that gives me two USB-A ports, an ethernet port, and another USB-C, and a separate USB-C to HDMI adapter. I wouldn't trade either of my USB-C ports for a dedicated USB-A.
Absurd tragedy? Maybe for you. For me, it's been a good life since the 2016 MacBook Pro. I got a decent dock, and connect everything with a single cable.
More. I paid 250 euros for a CalDigit TS3 Plus. Thing is amazing.
When I first got my MacBook Pro, I just got USB-C cables for my external monitor and my (USB-A) hub. That was very cheap, but I love the convenience of Thunderbolt and shelled out the cash.
I never understood why Apple and other thin laptop manufacturers didn't do this. I remember X-Jacks fondly, and would love to have a super-thin laptop with ethernet.
> It has a USB port in the corner. If you plug it into a computer, it boots in about 6 seconds and shows up over USB as a flash drive and a virtual serial port that you can use to log into the card's shell.
The parent of this thread has a valid point here, and the security implications of this are nefarious if one isn't careful.
While it is a technically cool, fun project, I can imagine a bad actor taking this to DEFCON/CCC or even a tech-centric concert and mass-producing something that emulates a USB drive but is also a keylogger or a remote access tool of some sort, which reminds me of the nightmares of BadUSB.
I'm worried that the potential employer infects the card, uses it to take over their competitor personal computer network and appends: "Don't forget! The S in IoT stands for Security!" to all of their documents and audio files. Then the traffic lights after we colonize mars will be like "It is safe to cross. Don't forget!..." and some ATM on the moon will be like "Thanks for using the bank of the moon. Don't forget!..."
While I'm impressed by how low the bill of materials is, I can't help but put into perspective the kind of skills the author of the post has, and how mind-bogglingly expensive they must be. Contrary to what the media says, it seems that the digital era has made (some) human beings more essential than ever...
The bias of HN towards web dev is pretty funny sometimes. I'd definitely be very impressed if I received these business cards; it's more novel than other PCB business cards I've seen and took a lot more initiative and effort, with all the sourcing and research required. But the design and integration work is something I'd expect any embedded systems engineer to be capable of, and the vast majority of those engineers are making less than all of you guys at FAANG talking about kubernetes or whatever it is that goes on there.
After reading the post, I was under the impression that the author just did some shopping for major components and assembled everything into a fun shape. Obviously I know nothing about electronic board design, so could you explain which extraordinary skills this required?
EDIT: OK, I missed the part where he actually designed the whole board, obviously...
Designing the layout of the PCB. Populating it with surface-mount components using home-made solder paste stencils and an oven. Selecting the MCU. Figuring out what kernel drivers are needed and what can be stripped out. Searching the web for datasheets and drivers that others have created/ported. Building the userland. Debugging and testing.
Like many pieces of technology, once you lay out the steps it starts to look as though each one, and therefore the whole, are achievable to a highly-motivated but somewhat typical person. And it's true! But even so, few people develop the whole set of skills to build something from start to finish, even notwithstanding the incredible help you can get nowadays from open-source software, consumer access to services such as PCB manufacturing, and online documentation. Even in a high-achieving community focused on building stuff, like this one, I doubt many posters here have completed as impressive an individual project.
In my experience both as an embedded systems engineer and as a hiring manager, the salary sites miss a lot of nuance.
A large amount (probably most?) of embedded systems positions are at the lower end and don't require a lot of experience so that is reflected in the lower salary numbers. At the higher end, where more niche skills and higher levels of complexity & system integration are needed, you'll see the kind of salaries you are more familiar with.
It's also very industry dependent. I've always found it surprising that Factory Automation engineers are paid generally below a typical web development salary, but that's where a lot of embedded engineers end up working and get the lower pay that goes along with the job.
When I worked in Medical Devices, programmer salary didn't make a distinction between embedded or database, or UI work.
In addition, the positions aren't always broken out that way. In my day to day work I do embedded systems programming, Windows desktop stuff, some network (IoT) stuff and even a small amount of web dev when needed. My job title has never reflected any distinction in the kind of programming I did.
LinkedIn reports embedded software engineer wages are a bit higher than the number Payscale gives - https://www.linkedin.com/salary/embedded-software-engineer-s... - still lower than the full stack engineer level but certainly within the same ballpark.
I suspect the number is biased to the type of website that's gathering the data. Maybe employees of higher paying companies don't report their salaries on Payscale.
One of the big caveats is that embedded engineering is very location dependent.
Barcelona (where I live) has big demand for web/app software engineers, but embedded engineer jobs are way rarer.
For embedded engineers, getting >40k€ is just not a thing around here. Meanwhile, my friends with backend experience (2 years) get those kinds of offers on a regular basis.
I think you're going to need more than 100 or so people's self-reported salaries before you can reasonably claim what pay is like across the professions or in comparison to each other.
Rocket scientists are also paid less. It's a supply and demand market. There are simply more people wanting to build rockets and IoT gadgets than there are people willing to do a CRUD website for an adtech company.
This is true; I had to turn down various jobs in big cities due to very low pay. The guys looked like I'd just slapped them when I named my price (I'd previously done full-stack work).
Yeah, maybe it's a "I don't like this" vote. I tend to agree - I find embedded software engineering skills quite intimidating. Nevertheless, the market demand just isn't as high.
Where's the mystery? Someone looks at what this guy did for his business card and, to paraphrase, says "Wow, look at this guy's skills. He must charge a very, very non-average amount of money for them."
Your response to this is "But actually, according to payscale.com, an average embedded software engineer only makes X."
1. Nobody likes but-actually posts.
2. Quoting average salary when someone is commenting on how non-average a guy's skillset is isn't really relevant.
I get the "but actually" thing, but I don't think it was that.
The OP said the skills are "mindbogglingly expensive". From personal experience they aren't (I worked in a team with embedded engineers with similar skills to those required for this).
I thought this was well understood - it's been discussed here a number of times.
I added the average salaries after getting voted down to -2 and that turned around the voting.
While there are a number of issues with average salaries I think it's notable that no one is claiming the opposite.
A complete "Embedded System" (HW+SW) is orders of magnitude more complex than "Web Client Design and Implementation" (unless you approach Google/Facebook scale). The malleability of SW means you can quickly try out various techniques without too much "learning overhead". With all the existing Frameworks/Libraries/etc it becomes more of a "plug & play" proposition. Not so with Embedded Systems. You have to know EE/Electronics, programming interfaces to HW, Bootloader and OS kernel (if using one), Complex Toolchains/ICE/Simulators/Debuggers and finally; The Application. Each is a huge domain in itself and has a significant learning curve. To have all of them in "one head" is remarkable and non-trivial.
An "embedded system" can be as simple as "when this button is pressed for 3-5 seconds, turn on this light for 30 seconds." That hardly requires much of a skillset.
And as far as having all the things you mention in "one head," every member of my team can handle that easily and we're not particularly noteworthy!
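Just to make that button/light example concrete, here's roughly what it looks like in MicroPython - a minimal sketch where the pin numbers are made up and the button is assumed to be wired active-low, so treat it as an illustration rather than a drop-in program:

    # "Hold the button 3-5 s, then light the LED for 30 s" - MicroPython sketch.
    # Pin numbers are hypothetical; check your own board's pinout and wiring.
    from machine import Pin
    import time

    button = Pin(0, Pin.IN, Pin.PULL_UP)   # assumes an active-low pushbutton
    led = Pin(2, Pin.OUT)

    while True:
        if button.value() == 0:                        # button pressed
            start = time.ticks_ms()
            while button.value() == 0:
                time.sleep_ms(10)
            held = time.ticks_diff(time.ticks_ms(), start)
            if 3000 <= held <= 5000:                   # held for 3-5 seconds
                led.on()
                time.sleep(30)                         # light on for 30 seconds
                led.off()
        time.sleep_ms(10)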
I think in a sense you are proving my point. Real knowledgeable people have done the hard work to simplify much of this technology so that it is easy to get started on. But the moment you are past the "commoditized" part, the "learning ramp" for an embedded system becomes exponential. Not so with pure higher-layer Software Apps. There is also the fact that an "embedded system" spans a very broad domain and hence requires a broad spectrum of knowledge and skills, e.g. developing a Network Appliance vs. a bread toaster.
In my own experience moving from pure application software development (though not a web developer) to network appliances to lower-level embedded development, I have been amazed at all the "hidden complexity" you suddenly become exposed to. And this is just the software part. If you get into the hardware part you have a whole other world of knowledge to learn. Merely doing something without understanding is the difference between a "Technician" and an "Engineer/Scientist".
True. This is what I dislike about the "market". There is no reward for knowledge, hard work, solving complex problems, etc. Just "commoditization" and "supply and demand dynamics".
The five following are the principal circumstances which, so far as I have been able to observe, make up for a small pecuniary gain in some employments, and counterbalance a great one in others: first, the agreeableness or disagreeableness of the employments themselves; secondly, the easiness and cheapness, or the difficulty and expense of learning them; thirdly, the constancy or inconstancy of employment in them; fourthly, the small or great trust which must be reposed in those who exercise them; and, fifthly, the probability or improbability of success in them.
-- Adam Smith, An Inquiry into the Nature and Causes of the Wealth of Nations, "Chapter 10: Of Wages and Profit in the different Employments of Labour and Stock"
Information and skill aren't rewarded of themselves, for much the same reason that information wants to be free. If the skills themselves are both rare and valuable, that tends to change, effectively serving as a type of rent (in the economic sense). Effort in acquisition, risk in employment, trust, and the underlying attractiveness of the activity all factor in.
Smith's analysis doesn't hold in all cases, but is a good first approximation. It's remarkably poorly known, even among market advocates.
There's almost certainly a book or two in there, to do it justice.
Tackling the latter: I recall listening to a BBC interview of a banker, probably in the wake of the 2007-8 global financial crisis, though the example's really an evergreen. The presenter asked how the executive justified his (it was a he) income. He responded based on value creation. Oddly, that's not a basis for compensation in a competitive market -- price should be set equal to marginal cost. Of course, if bank-executive services are a form of rent, he might be correct, but that's an interesting argument on its own.
It was a few years later when I finally got to actually reading Smith that I came across his "five following" discussion. Which a true market-capitalist banker really ought to have fully internalised. It made the interview response all the more curious.
I've also kept this in mind as other discussions of wages (particularly concerning the tech world, but also other worker classifications) come up.
On the "information wants to be free" comment: skill itself is (in parts) an informational good, though one that lacks inforation's fundamental fungibility (we can transfer information from storage device to storage device across channels and networks, less so skills).
But like information, skill is difficult to both assert and assess. Beyond zero marginal costs, a chief reason information markets behave so poorly is that it's hard to assert high-quality information, and expensive to assess it. If you're reading a book, or article, or ... lengthy and/or cryptic HN comment ... somewhere in your head is the question "is this even worth my time?"
This is what makes tech recruiting, from both sides of the table, such a PITA. The front-line people writing ads and screening calls ... have all but no capability to assess skills. Responses are all over the map, from the absolutely unqualified to domain specialists and experts. "Expertise" itself is almost certainly a misnomer as so much information has so short a half-life -- it's generalised problem-solving and learning which are most required. And the tasks and projects to which talent is applied can itself be called into question. Take the 737 MAX development -- Boeing almost certainly would have been better for not trying to drag out the lifespan of that airframe, a decision which likely should have been made in the mid-1990s. Full costs (or benefits) of decisions cannot be known when they're made, and only manifest over years, decades, or even centuries (fossil fuel use).
The notion of "manifest vs. latent" properties or consequences is one I've been looking at. Some earlier work by Robert K. Merton and others.
"The market" rewards short-term, highly-apparent, risk-externalising, liquidity-generating behaviours. There are inherent distortions to all of this. Skills and competence (as well as skills development, mentoring, training, preservation, etc.) are poorly served.
There's also some interesting discussion relating to this in Alvin Toffler's Future Shock, from 1970, which I'm reading for the first time. Bits drag, but there is much that's prescient.
You make some very interesting points. The Adam Smith quote points out the various axes to consider when looking at the "viability" of something in the market which is quite correct. But there is another wrinkle added by the "Modern" world. I presume you are aware of the works of Nassim Taleb. He makes a distinction between "Extremistan" and "Mediocristan" (and lambasts Economics and Economists :-) which i feel is very applicable to understanding "The Market". With the current "Information/Data" revolutions the frequency and impact of "Extremistan" events have increased dramatically and existing economic models are no longer sufficient to explain them. For example, what exactly is the "marginal cost" of software? How do we explain the rise of "The Internet" as a major economic driver? I lean towards the viewpoint that this dichotomy is applicable to the whole Software/IT industry. Thus "Embedded Systems Industry" has moved from "Extremistan" to "Mediocristan" (commoditization, cost reduction, large supply pool etc.) while "Web Development" is still in "Extremistan" and this gets reflected in their respective salaries.
Thanks for the extremistan/mediocristan reference. I've read some Taleb, but not much and not deeply, and that's a valid point.
Financialisation and capital accumulation allow tremendous inequalities in allocation. These have always been present to some degree, but recent history (say, 1800 - present) seems exceptional. Note that arguably the Gilded Age personal fortunes were probably even more extreme than those of today.
Unless I'm missing something, the marginal cost of software publishing is very near zero. Development is ... more expensive.
The Internet's role is a whole 'nother discussion, but pulling in a notion of an ontology of technological mechanisms (see: https://ello.co/dredmorbius/post/klsjjjzzl9plqxz-ms8nww), it's a network, with network effects, as well as a system. These ... tend to define general parameters and considerations.
Where specific industries fall on the mediocristan / extremistan scale .... is an interesting question. Though I'd argue that systems/networks elements tend toward extremistan. Whether hardware is/isn't systems/network (as I'm using the terms) is another question.
For successful fundamental component manufacturers (e.g., Intel, AMD), hardware certainly resembles extremistan.
I rarely keep business cards; typically I scan them with my phone and throw them away. But... if I hadn't read the article and someone handed me that business card, my jaw would have dropped all the way to the other side of the earth.
My version of this would have a tiny LCD, buttons, and a Wi-Fi interface. I know all of those exist in incredibly inexpensive formats. (Adds to to-do list.)
My first thought is that literally billions of people already do something similar every day. They use a card that runs its own small, programmable computer, that is portable, energy efficient, and can be powered 'wirelessly.' It's called a universal integrated circuit card (UICC) -- or the chip used in credit cards / sim / ID cards.
The OS these cards run is often referred to as the "Java Card Platform." The OS has many of the same features you would expect from a modern operating system, like support for multiple programs, I/O, and very strong crypto primitives. It even has features you wouldn't expect (because some applets give you access to APIs that aren't currently available outside that environment).
JavaCard is hilariously weak though, as far as Java is concerned. Support for "int" datatypes is optional, and there's no such thing as java.lang.String.
To your point though, it's pretty amazing to see the sheer number of devices with it deployed, and the things that can be done with sim card applets (mobile banking, etc.).
I really appreciate this write-up. I didn't know about JLCPCB and their assembly prices seem very reasonable. I also usually order my parts from Newark but their shipping is kind of silly on small orders ($8 for anything under 2 lb).
Yesterday I started a small electronics project (a Bluetooth-connected tachometer for my motorcycle), and I want to use ESP32 chips for it. I'm the kind of masochist who has soldered SMD components by hand in the past, but with these new services at my disposal I could do so much better. Time to build a reflow oven?
You don't really even need to build one. Buy a $20 toaster oven, turn it on, and put your board in. When your solder paste reflows, turn it off. There is a reflow profile you're supposed to follow, but for small runs and personal projects it's hardly worth the time and money. Check out YouTube for specifics.
I really want to learn how to compile and solder an embedded system, but I don't know where to start. Assuming I have the parts, tools, and materials, how can I learn where to put capacitors/resistors etc. in a custom design? What kind of simulators are good for testing these ideas without needing to waste expensive components? What was missing from my Computer Engineering curriculum that leaves those circuit parts (except the CPU) completely alien to me?
Start with a simple, known-good design, then figure out why each part is there.
After you get some experience reverse engineering existing designs, start trying to build something simple. Think blinking lights without a processor.
Warning: Embedded tools are not user friendly. Because most people who use them, use them all the time.
Eagle has a free tier. I think it's the EDA tool we used in my CS/CSE courses.
There are open source alternatives, but they had even rougher edges a decade ago when I was using them.
Buy an Arduino clone, install the crappy GUI and see where that leads you.
Forget simulators. IME, software devs obsess about simulators. I think they're mostly a waste of time: at the beginner levels we're talking about here, signals are slow (after 30 years in the business, I can't believe I now think a 16MHz signal is "slow" :-) and parts are cheap. Just dive right in and blink a few LEDs or something.
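If anyone wants that very first step spelled out, the embedded "hello world" in MicroPython is only a few lines - the pin number here is a guess, so swap in whatever your board's LED actually sits on:

    # Blink an LED forever (MicroPython). Pin 25 is an assumption - check your board.
    from machine import Pin
    import time

    led = Pin(25, Pin.OUT)
    while True:
        led.on()
        time.sleep(0.5)
        led.off()
        time.sleep(0.5)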
Open source projects like Arduino and the Pyboard have schematics available. If you want to learn about why those components are there, read the datasheets for the major ICs; the board designer will usually be following the manufacturer's recommendations.
Emulation is rarely done; it would be a high-effort project with a lot of nuance in specifying the external hardware, and the hardware is cheap anyway. A rough flow for an embedded project would be to start with a development board from the MCU manufacturer to get a head start on firmware, move on to a custom dev board of your actual product but with more easily accessible pins and test points, and then iterate on the final form-factor board.
For a hobbyist, start with an Arduino and a Pyboard connected to a breadboard. Once you're happy with the firmware and circuit design, start with the open source schematic, modify it to add your new components, and lay it out on your custom board.
There are a lot of electronics tutorials that you can find starting from Arduino. SparkFun and AdaFruit sell various kits and have a lot of tutorials for things like how to solder.
Read the datasheets for your parts. Good datasheets will tell you what components you need, what values they need to have, and how to lay out the board. They are often written assuming some knowledge, but you'll have terms to Google.
Very cool project! It would be neat to incorporate WiFi into the next version instead of USB. Then you could publish a mobile app to communicate with it. It would be fun to play with at parties.
https://www.gridconnect.com/products/esp8266ex-tiny-wireless... .
I wonder if tiny solar cells covering the entire surface of the card would be enough to power it. Not sure if the wifi module can be powered with < 1 watt of electricity though.
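Rough back-of-envelope on the solar idea, where every number below is an assumption for illustration rather than a measurement:

    # Could solar cells covering a business card power a Wi-Fi module?
    card_area_m2 = 0.085 * 0.054      # ~85 mm x 54 mm business card
    irradiance_w_m2 = 1000            # full direct sunlight; indoor light is closer to 1-10 W/m^2
    cell_efficiency = 0.20            # optimistic for small PV cells

    peak_power_w = card_area_m2 * irradiance_w_m2 * cell_efficiency
    print("peak power: %.2f W" % peak_power_w)   # roughly 0.9 W

So in full sun you might scrape together close to a watt, which is marginal for an ESP8266 that can draw on the order of half a watt while transmitting; under indoor lighting it looks hopeless without harvesting into a storage cap or battery first.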
This is cool as it showcases his embedded engineering prowess.
Years of startup meets[1] have left a bad taste in my mouth for business cards. I've seen people exchange business cards ceremoniously, only to give each other a missed call immediately afterwards.
Business cards are a waste of money and paper, and I don't want to get started on the plastic ones. I honestly thought the proliferation of NFC in smartphones would help change this behaviour, but with Google shutting down Android Beam (its reasoning: no one is using it) and newer, cheaper smartphones shipping without NFC, I think that ship has sailed.
At least in WhatsApp countries, people share contacts via WhatsApp.
I find that business cards are still helpful to get the spelling of people's names right - otherwise you have to either hand your phone over to the other person and have them type their name into your phone, or at least have them text you the spelling, which can be awkward to ask for and anyway has more friction than accepting a business card.
Also, sharing WhatsApp contacts makes me uncomfortable. There are various fields that people can store in their private contacts system which are inappropriate to share without permission - birthdays and home addresses at least. When you share a contact, you have to remember to sanitize it before you send it. A business card only consists of details which are pre-approved to share with strangers, so business cards don't have this problem.
One more concern for me is that the utility of a business card is just once: if you want to contact the person in future you'll store the details on your phone, and once the card goes into your pocket it likely ends up straight in the trash.
As for WhatsApp or other chat-app contact sharing, it is not limited to business/professional contexts, and so it has become the norm in several countries.
Yeah, I don’t really get the practice of sharing contact cards with people. A coworker sent me his once and now I have the names of his family members, his birthday, and home address.
On Android, Google Lens on Pixel phones can do it, and a few other manufacturers like OnePlus have deeper integration with Google Lens via their camera app, but it needs to be installed separately.
I support the idea of having a built-in barcode reader that can scan barcodes from the screen as well, e.g. images in a webpage.
Actual card exchanges seem to be used as a polite way to fob uninteresting people off as much as an actual exchange of contact info with people you want to talk to later.
This seems to be a bit of an exception. They're not really expensive, but they are expensive enough that you're not going to just hand them out to random annoying recruiters or wantrepreneurs you want to fob off.
It is an unusually cheap system, and OP clearly did some thorough research on finding a really well integrated SoC that is available for super low piece part prices. I'd guess Allwinner targets that part at some ultra high volume application and some AliExpress sellers managed to capture some trays of excess parts to sell individually. I wouldn't be surprised if the unit price there is lower than what Allwinner quotes for volume.
And this is one of the dangers of basing your supply chain on AliExpress. I've had to caution clients that just because they found some normally expensive component dirt cheap on Amazon or AliExpress doesn't mean it'll still be available in 6 months when you need 1000 more.
Base your BOM on a manufacturer-approved supplier. If you can find the same parts for less elsewhere (assuming they're not counterfeit), go for it.
It’s QFN though so you need a reflow oven to make the boards at home.
I've had bad experiences with Allwinner chips (the boards I had years ago were very unstable), but the mainline kernel support is very exciting. That you can order a few phones built around them now too is just awesome.
The one that blew me away was the JW5211 switching regulator. I thought he was being a bit extravagant by not just using a cheap linear regulator...but that buck regulator costs 4.5 cents.
Guess I'm a little behind the times. Sure, throw three of them on the board at that price.
Only other surprise was the MOSFET labeled "Q_NMOS_GSD" for the status/power LED. At first glance, with the pull-up resistor, it looks like the FET does nothing. Then I located the reference net (a flag would have helped) on pin 13, so you can turn on/off the LED with the processor, or (depending on initial state of that pin) it will power up with the LED on and turn off only after the processor boots. Neat!
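If you wanted to poke that LED from userspace, here's a purely hypothetical sketch - it assumes the gate net lands on a GPIO the kernel exposes through the legacy sysfs interface, and the GPIO number below just echoes the schematic pin label, which is almost certainly not the real kernel GPIO number on this board:

    # Toggle the status LED by driving the FET's gate net from Linux userspace.
    # GPIO number and sysfs paths are assumptions; adapt to the real board.
    import time

    GPIO = "13"

    def write(path, value):
        with open(path, "w") as f:
            f.write(value)

    try:
        write("/sys/class/gpio/export", GPIO)          # fails harmlessly if already exported
    except OSError:
        pass
    write("/sys/class/gpio/gpio%s/direction" % GPIO, "out")
    for _ in range(5):
        write("/sys/class/gpio/gpio%s/value" % GPIO, "1")   # LED on
        time.sleep(0.5)
        write("/sys/class/gpio/gpio%s/value" % GPIO, "0")   # LED off
        time.sleep(0.5)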
Author here. It's a really good conversation starter. I've given a couple of them out and the reaction is always that I bought it somewhere, and I have to explain that no, I really did the entire design from start to finish!
Yes and no. It's based on the Hugo hyde-hyde theme and modified to suit my tastes. I'm not entirely happy with it yet. I would like to handle rich media better but I am definitely stunted by my weak web developer skills.
Nice work! MicroPython is nice as well although I prefer C for now, but that's probably just because MicroPython would not be very feasible at the moment for projects we do.
I have been dreaming about making our card (see my profile here if interested) run a real (well, embedded) OS; we had a version which could run some embedded OSes, but to cut material costs we had to remove an MCU, which now leaves us with less than 23 kB of workable memory. It is possible to run 'an OS' in that, like computers from the 70s-80s; the speed of the current MCU matches that era as well.
I like this kind of thing, but I would want a display on it and maybe no USB (just a radio circuit to communicate with it for updates, etc.).
I'm amazed he did this for $3. I've been struggling to get my cheap electronic Christmas ornament under $15. I guess I need to find an alternative toolchain that's cheaper.
To the author: did you think about creating a course on embedded hardware and software design and launching a funding campaign on Kickstarter? I would definitely help fund it.
I have thought about it and I am writing one entitled "Mastering Embedded Linux." Its goal is to take you from zero to expert with hacking embedded Linux systems.
Kids today don't appreciate that in the 1990s, if you wanted a bootable business card (BBC), you had to satisfy yourself with a CDROM, cut to "business card" form factor, which would boot, but you had to supply your own computer.
(@schoen here, among others, will be ... extraordinarily familiar with this.)
Hilliard's solution here is the 2020s variant.
This is also a testament to the fact that it's not just ICs which are now cheap (as opposed to, in the 1990s, static data archives), but entire SoCs.
The main limitation is now power supplies, though the prospect of parasitic OTA power extraction is becoming a possibility, perhaps also thin-film PV and small-scale power storage systems.
Which means that peel-and-stick computing is well within reach, if not a present reality. I suspect Hilliard's BoM would fall by 25-50% in bulk, and given technological advances, should be halving every 18-24 months, which means that $1.50 systems are available now, $0.75 in roughly two years, and only a few more halvings before the 1995 Dilbert Unix user advice, "here's a nickel, kid, get yourself a better computer", will be a practical reality.
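(Sanity-checking that arithmetic, with the 18-24 month halving period being my assumption rather than a measured trend:

    # Halvings needed to get from ~$1.50 to Dilbert's nickel, and roughly how long.
    import math
    halvings = math.log2(1.50 / 0.05)                                   # ~4.9 halvings
    print("%.1f halvings" % halvings)
    print("%.0f-%.0f years at 18-24 months per halving" % (halvings * 1.5, halvings * 2))

...call it seven to ten years out, if the trend actually holds that long.)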
Applied to packages, this means that shipments can report their own progress. Applied to a vehicle, these will permit covert tracking on the cheap. Applied to arbitrary surfaces, indoors or out, surveillance of a given location (audio, video, or other sensors). RFID tags will give way to fully-capable computers, integrated into inventory and pricing stickers. Applied to a drone or glider, an integrated surveillance-and-flight-control package (combine with very small body and lightweight electric motors for power-gliders deployable en masse, not individually threats, but capable of close-up, large-scale monitoring and guidance for more capable systems). Applied to windows, tapping of conversations inside. Likely tapping (or hacking) into available wireless comms nets.
The devices will be well within reach of white hats and black, as well as numerous shades of grey.
Decreasing costs mean that activities not presently viable, many of which won't benefit the common weal, will come into viability, much as email, fax, pop-up, and phone spam have over the past three decades.
Power, sensors, and antennae are likely the biggest limiting factors at this point, though all are also rapidly miniaturising.
Impressive work. But I'd ask for a few LEDs with some easy-to-use way to drive them (can they be a device, like /dev/led1?). The Arduino has one on pin 13 (usually), and it is very useful for minimal projects, like a blinking LED.
(Bonus points for a few LEDs, so you can show the Knight Rider animation, or a 7-segment display.)
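For what it's worth, mainline Linux already has something close to that in the LED class under /sys/class/leds/ - if the device tree defines an LED, you get a little directory you can poke from a script. A sketch, with a made-up LED name (list /sys/class/leds/ to see what actually exists on a given board):

    # Drive a kernel LED-class LED - roughly the "/dev/led1" idea.
    # The name "card:green" is hypothetical; check /sys/class/leds/ for real names.
    import time

    LED = "/sys/class/leds/card:green"

    def set_brightness(value):
        with open(LED + "/brightness", "w") as f:
            f.write(str(value))

    for _ in range(10):        # with one lonely LED, settle for a blink rather than Knight Rider
        set_brightness(1)
        time.sleep(0.2)
        set_brightness(0)
        time.sleep(0.2)

You can also hand the blinking off to the kernel entirely by writing a trigger name (e.g. "heartbeat", if that trigger is compiled in) to /sys/class/leds/<name>/trigger.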
And then there's the occasional sourfaced old party pooper like myself. Two seconds in, and I no longer see the sweet wizardry or the polished black surface; I only see @gmail, internally translating to "someone who doesn't really care".
This is brilliant. I’m an embedded systems engineer by training (and interest - I still occasionally solder things to bend them to my will!) and absolutely love the implementation. It’s aesthetically pleasing and functional in both interesting and fun ways. Very slick.
I love this! And while it's well suited to the person (embedded engineer), it could be realigned to someone else by changing the tagline. For example, "This card runs vim!" for developers, or "This card runs Linux containers!" for devops, etc.
It is a gimmick, the value is that he designed it himself and no one else does it. If you just buy a computer card and install your own software, it is just a really impractical waste of resources.
This is a very cool project. I know very little about embedded engineering. I have a question: Why would MicroPython be needed? I thought it was useful only for simple chips, and this chip runs a full Linux, so couldn't you run CPython on it?
OP here. CPython is very heavyweight on small systems like these, with multi-second imports, multi-megabyte stdlib, and limited library availability (cross building native extensions is tricky).
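If you want to see what the import cost means in practice, timing a few imports on the target itself makes the point quickly - this is just an illustration, and the module list is arbitrary:

    # Time some stdlib imports on the board itself; on a ~500 MHz core loading from
    # slow flash, this can easily run to seconds for a non-trivial application.
    import time
    t0 = time.time()
    import json, argparse, socket      # stand-ins for whatever your app actually pulls in
    print("imports took %.2f s" % (time.time() - t0))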
Very cool. Back in 2004 I used to have business card-shaped CD-ROMs that booted Tiny Core Linux. They were always eye catching. In fact, one of those was a factor that landed me a job in a small startup in Palo Alto called VMware...
It looks promising for someone to share their personal profile and their resume.
For business, the best part would be having received cards go into the system and transfer to their CRM, contacts, or some other application they use.
I remember installing Coherent from a bunch of 5 1/4 inch floppies onto my third computer, a 286 (or perhaps it was onto my fourth computer, a 386 sx).
What's the time spent on bootup? Hardware POST? I remember seeing that some Linux systems can boot in under a second. It would be cool to see the business card come to life instantly.
You might know this and use it as a shorthand, but just in case: ARM doesn't use a BIOS of any sort, so there is no POST, just a bunch of vendor blobs (burned into the chip) that do bootstrapping and then jump to your bootloader.
The biggest problem in his case is probably the 8 MiB serial flash; that is going to be rather slow to load from.
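If you just want a rough number once Linux is up, reading /proc/uptime from as early as possible in userspace (e.g. from the init script) tells you how long the kernel-plus-rootfs part took - it won't capture the boot ROM / bootloader / flash-load time before that:

    # Seconds since the kernel started, as seen from early userspace.
    with open("/proc/uptime") as f:
        uptime_s = float(f.read().split()[0])
    print("reached userspace %.2f s after kernel start" % uptime_s)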
@paxys - Not sure what purpose you're referring to. Sure, purchasing them would defeat the purpose of making your own business card, but not the traditional purpose of having a business card that's cool & which doesn't immediately get tossed in the garbage after you give it out.
I imagine these aren't for sale, but I'm just throwing it out there that I would probably buy them if they were.
It's only fuzzed in the pictures. The actual card has the phone number. I just didn't want a ton of people from the internet hitting my phone directly! Email is fine, however, so that's un-blurred.
OP here. Bingo! Karl Schroeder is fairly obscure unfortunately... I love Ventus and Lady of Mazes. I picked this handle mainly because it's unique everywhere I've tried to sign up.
Truth being flagged just because it doesn’t look “cool”. This profession desperately needs a re-do. Otherwise I’m pretty sure that the person that flagged my statement really cares about the environment and recycles religiously.
People are flagging your comments because they aren't innumerate enough to be taken in by your silly argument that this project is somehow relevant to our current environmental catastrophe.
So we can agree on something, at least, even though I still do believe that you treat as ignorant those people pointing out that more not-needed stuff is worse than less not-needed stuff. I do sincerely hope that people like you will have a "we are the baddies" moment, but as things are going I don't see that happening any time soon. Maybe the next dot-com-like crash will solve it, if and when it comes.
The only thing that is needed is death; everything else is optional. So I reject your anti-life philosophy that “more not needed stuff is worse than less not needed stuff.” Because the category of “not needed stuff” includes precious life itself, as well as everything that makes it worth living, and of course an endless variety of worthless and pernicious nonsense.
That makes no sense. This guy isn't advertising his mad "registering firstnamelastname.com" skills, he's advertising his mad embedded engineering skills.
I wonder what world we would be living in if people like this fellow were actually focusing on automation and not on making business cards that run Linux.
Maybe we wouldn't be living in such a lowly automated reality, using concepts from the previous centuries. Robotics might even make sense...
I wonder what world we would be living in if people like you were actually focusing on automation and not wasting time reading and commenting on HN posts.
Maybe we wouldn't be living in such a lowly automated reality, using concepts from the previous centuries. Robotics might even make sense...
I read a story once about a fellow in, I believe, St. Louis who devotes all of his energy to rescuing stray dogs. It’s an obsession.
People ask him why he doesn’t apply that energy to helping the homeless instead.
As he says, there are plenty of people tackling that problem (debatable), this is his passion, and I imagine he asks, or should ask, the questioner what they’ve done to help the homeless recently.
Follow your passions.
(Plus, as others have pointed out, this is a personal project and he does this stuff full-time, so yours is a particularly odd complaint.)
I have worked on some industrial automation stuff. In my experience, it's genuinely really hard to do in a way that's actually better than manual systems, or very narrowly-scoped and simple automation. The number of edge cases and failure modes is mind-boggling. I'm skeptical that there's much room for effective solutions between extremely simple and dumb automation, and near-human level AI.
> I'm skeptical that there's much room for effective solutions between extremely simple and dumb automation, and near-human level AI.
That's literally my job, and there's a ton of room actually. You just have to be able (and willing) to fall back to humans at any point in the process. Like real-world "fix-and-continue" debugging...
"Looks like you could cut yourself on it," Rydell said.
"You could, many no doubt have," said Karen Mendelsohn, "and if you put it in your wallet and sit down, it shatters."
"Then what's the point of it?"
"You're supposed to take very good care of it. You won't get another."