I can't speak for cameras specifically, but my partner is Taiwanese and apparently this hardware/software dichotomy is extremely prevalent there as well. Namely, there is a broad social perception that hardware design is "real" engineering, and that software is a joke. Thus, the best and most talented engineers go into hardware, and the jokers work in software, leading to this "good hardware, bad software" observed outcome and reinforcing the stereotype further. Rinse, repeat, and you eventually end up with decent hardware running absolutely garbage firmware.
Given the social cross-pollination between Japan and Taiwan, I wouldn't be surprised if a similar pattern held true there as well.
> Namely, there is a broad social perception that hardware design is "real" engineering, and that software is a joke.
Yup. Not just in Asia. The US suffered from that as well. It may have changed (for the US) by now, but I wouldn't know firsthand, as I spent 27 years at a Japanese hardware company.
I spent most of my career as a software dev at hardware companies, and got the brunt of that crap. It was infuriating.
During my time, I wrote some very good software. In the early days, when my team was given a lot of leeway, it was sent out, and got [mostly] positive reviews.
As time went on, Japan got more and more involved in (and in control of) the software development that we did, and threw more and more restrictions at us.
We were forced to do a standard hardware-centric waterfall development process. If I even mentioned the word "agile," I might as well have just gotten up and left the meeting, because everything I said, after that, was ignored.
They took away all of the user interface from us, and we were just doing "engine" work, which was actually pretty cool, but they sucked at UI.
Towards the end, I was reading terrible reviews about our software, and tried writing stuff that would directly address these gripes.
My work, and any similar work from my team, was ignored. Instead, they had some disastrous relationships with external companies, under (I assume) the impression that we were not capable of writing "modern" software, and these folks were (and judging by how terrible their work was, I suppose it was "modern"; I have issues with the Quality of "modern" software, in general).
It is indeed exactly the reverse in the US currently. Pay ranges for software engineers tend to be higher than for hardware engineers at big tech companies, and many folks with electrical engineering backgrounds end up going into software as a result. Also similarly, people building hardware inside of software companies tend to have to put up with mismatches in expectations, including questions around why they can’t build hardware in an “agile” process!
Sometimes I feel that "Agile" has become so diluted that it just means "there are feedback loops in the design/execute process," and if that's the case then 6σ is an "agile" process for hardware.
I like the spirit of the Agile Manifesto. I feel that the devil is in the details[0], though.
Nowadays, the word "Agile" means "Waterfall, but with different names," or "Tear off all your clothes and run naked through the bluebells! Do what you want!"
I'm really big on Discipline and Quality. It's entirely possible to have a flexible and iterative development process, but there's no way to avoid the difficult bits. They just get shifted around.
> Namely, there is a broad social perception that hardware design is "real" engineering, and that software is a joke. Thus, the best and most talented engineers go into hardware, and the jokers work in software, leading to this "good hardware, bad software" observed outcome and reinforcing the stereotype further.
Curious. In the US, the software people can usually make a lot more money so even many EE’s end up in software. I wonder if it’s the opposite in some of these countries, where software people are paid less than hardware people.
Actually, it still happens the same way in the USA as well -- for physical products. The hardware side of physical products is often well-supported, with higher budgets for R&D and engineering salaries, while the software side of the physical products is expected to be done barebones and as an afterthought at the end of the product development cycle.
Software engineers in the US who do not work on physical products are highly paid, because they can potentially create nearly infinite return on investment with near-zero marginal product costs.
But software for widgets doesn't have that infinite margin ratio. So firmware suffers greatly. Think auto infotainment systems, smart-home electronics, appliance interfaces, point-of-sale kiosks, etc.
Don't forget device drivers, back in the day before all the chips got thrown directly onto the motherboard. You might buy a nice sound card, but the software that came with it (drivers and utilities both) was quite a mess.
I think a big part of Apple's success was getting both hardware and software right.
> Curious. In the US, the software people can usually make a lot more money so even many EE’s end up in software. I wonder if it’s the opposite in some of these countries, where software people are paid less than hardware people.
In Japan and Taiwan, both EEs and SWEs are generally underpaid. SWEs and some EEs go to the USA or (gasp) mainland China to make more money, since software talent is generally more appreciated in those two countries. The same applies in other Asian countries (e.g. HK and Singapore, where it is software vs. financial services rather than software vs hardware).
I think the US situation just reflects economics. The value of software scales up more than hardware. So software teams and companies get more investment.
Anyone who's worked with software management of commercial hardware like cameras, digital signage, time clocks, door controllers, I don't know, 1,000 other product types, can attest to the horror-show software you're provided by these manufacturers.
Think: Windows only, often IE/Edge only, ActiveX, crashes constantly. Random UI strings are in Chinese. Barely, barely usable.
Why is this still true? I could understand it in the past, but after the rise of all the tech companies, and the obviously important software everyone uses every day (Android and iOS), how can anyone at this point think software is a joke and lesser than hardware?
Most hardware companies are decades old, and so are most of their established competitors. Until one of the old guard breaks rank or a new competitor manages to break into the industry using software as a clear competitive advantage (e.g. Tesla), the success of tech in general means nothing to them.
It doesn’t even matter how big the companies are or if they’re a “hardware” company. All the lumberyards in my area still use DOS-era machines that I’m not even sure are networked. I know that at least one of them runs the whole thing by printing the day’s transactions from each computer and paying a secretary to re-enter them into their similarly ancient accounting/inventory management software. Cost of land and fuel overwhelms labor costs in the lumber business, so there’s zero incentive to even try.
While software is important, quality of software is usually not. There regularly are articles and comments on HN about how common software dev practices would not fly in real engineering.
I think this would change overnight if management were actually held accountable for quality. Right now all the incentives are on ship fast, ship early, ship often. A PM who delays a release to fix bugs (is a hero IMHO, but) looks terrible to management higher up. The PM who rushes to market looks good, even if the reputation of the company as a whole suffers because they shipped crap.
Millions of bridges have been built in human history, but only a handful of GPU drivers.
The bridge doesn’t need to withstand the river suddenly turning into lava or the atmosphere becoming sulphuric. The driver has to be prepared for whatever Windows and the hardware put up.
In those tech companies, that knowledge has arrived. Of course, something similar is true among software people, because who doesn't like being told they are important and valued? But there are various kinds of "tech" companies: ones founded by hardware people and EEs, where the key innovation that made the company big was in hardware design, and ones founded by software people in their dorm room or something like that. Usually companies from the latter category respect software engineering, while companies from the former category see it as a cost center and something that ideally you'd outsource.
DSLR manufacturers got big by making great cameras. They didn't really feel the need for making good software. Compare this to Google which got big by implementing a clever algorithm and using distributed computing.
JS and frontend are terrible and you'll hear this loudest from frontend people themselves. It's an entire industry built purely around the inertia of an unexpectedly wildly successful product.
I think it's worth noting that the Web ate software largely because the ergonomics for new devs are vastly superior to building native apps, and can be used cross platform without downloading binaries. What language is easier to get moving in? If writing cross platform native apps was as easy as using a single html file with a script tag, they would be more in vogue.
To accommodate the greater scope of the web the language has evolved. It's fast, supports multiple paradigms, and never makes breaking changes, so your code will run the same 20 years from now.
Is this a real issue? I doubt the average new coder needs to worry about supporting 20 year old browsers today. I've never worked at a company that needed to support ie8 or whatever.
These are orthogonal. You can believe software is important and a great area to work in, and still think JS and frontend is terrible. In fact, the two are often correlated!
If you think frontends, as a general category, are terrible, and backend software, as a general category, is more "serious", "real", or "important", you have precisely the mindset that produces theoretically useful gadgets that are ruined by poor user interfaces.
That's not the point OP was making. On the contrary, you have to believe that frontends are important to be really mad about how terribly they are made.
In general, I think any engineering community that congregates around a particular set of issues is just trying its best to address its needs and build solutions to its problems, and it's important to respect that. Rather than being dismissive, exposure and cross-pollination are how we lift all boats together.
Just because it’s terrible doesn’t mean the haters have to suck at it. It makes the opinion more valuable if you’re good at something and then criticize the bad parts.
The issue is that the recent growth in the software field has caused people who would otherwise major in something else, and aren't really interested in software, to become your coworkers, and they don't care about doing a good job. There are some areas of software which would benefit (lower cost over time) from an engineering mindset. That's not what happens with agile. The whole ethos is about being able to change the design around, shipping MVPs, and quick iteration. In hardware it has to be correct when you ship it, leading to a more methodical approach. As a result, some software work can come off as sloppy in comparison to hardware work.
Define recent? I remember "too many new people are just chasing money in IT" already being a well-established trope 25 years ago, long before Agile or most of the modern stacks were even a thing.
In Europe, management is considered more important. It's all bullshit indeed.
If they design cameras from the user's perspective and expectations, there is still a lot of room to take on phones.
I just want to shoot, possibly edit, and publish the images on my server, and have some API to make the appropriate database entries.
Instead I have to hook up the cam over USB and then pretend it is a slow drive??? Oh, and the battery is draining while doing this??? Some models have replaceable batteries that you have to remove to charge???? As a hard drive it scores 0/10.
I have to start up an editor, find the right image, load it, and find a folder to store the edit???? What a nonsense workflow!
iPhones let you shoot the images straight into the upload dialogue... but it's not using the wonderful hacks the photo app offers.
Maybe camera makers should just make a frankenphone the size of a brick, with a few TB of storage, automatic wifi connectivity (with enough options that one never has to look at it), a week's worth of battery (the extra weight helps make sharper images), and probably a cloud account with a list of highly configurable APIs.
I'll be so weird as to suggest that websites could get physical buttons on the top, so that one can shoot things straight onto Facebook and press delete later.
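To make that API wish concrete, here's roughly the publish flow I have in mind, as a minimal Python sketch. The endpoint URL, token, and field names are all hypothetical; the point is that the camera (or its companion tooling) should be able to do this in one step over wifi:

```python
# Hypothetical sketch of the shoot -> server -> database flow I want from
# a camera. The endpoint URL, auth token, and field names are made up.
import json
import pathlib

import requests  # pip install requests

API_URL = "https://photos.example.com/api/v1/images"  # hypothetical endpoint
API_TOKEN = "replace-with-your-token"                 # hypothetical credential

def publish(image_path: str, caption: str = "") -> dict:
    """Upload one image and let the server create the database entry."""
    path = pathlib.Path(image_path)
    with path.open("rb") as f:
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            files={"image": (path.name, f, "image/jpeg")},
            data={"meta": json.dumps({"caption": caption})},
            timeout=30,
        )
    resp.raise_for_status()
    return resp.json()  # e.g. the server's record for the new database row

if __name__ == "__main__":
    # One call per shot: no USB, no "slow drive", no editor scavenger hunt.
    print(publish("DSC_0001.JPG", caption="straight off the camera"))
```

That's the whole workflow I'm asking for: one call per image, and the server handles the database entries.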
Why wouldn't it be true? All the software that ate the world did so from a very small number of places. Outside those few focal points of software wealth, if an area isn't essentially preindustrial, whatever is happening there related to hardware will greatly outshine any local software endeavors.
That's a bit odd. Are they unaware of the last 30 years of computer history?
Even as someone with a background in mechanical engineering the degree of complexity behind some software products, such as Windows, is really impressive.
I suspect "hardware is real engineering" is really just "hardware engineering is where you can find prestigious employment in this country".
It used to be quite similar in South Korea until the more recent rise of domestic software giants like Naver and Kakao Corp.
In a lot of the East Asian countries, there is a large gap in desirability between the large, established employers and smaller companies due to outrageous differences in pay grade, benefits and job stability. So new business has a tougher time making it to escape velocity and offering significant numbers of jobs.
+1. I have vague memories of my time at a Japanese automation vendor right out of uni. I quit out of frustration: the software was super buggy, and there was no one to help except a couple of people in Japan who knew the software but would not reply to emails. I also remember feeling neglected, as folks working on the hardware or on customer projects were paid more than me.
A recent experience at a neobanking startup from SE Asia reaffirms the point. Despite the product being built around an API-only model, the firm was operations-heavy when it came to decision making and investing in people, as operations was believed to be the core company strength (for a variety of reasons, including institutional bureaucracy, corruption in these markets, etc.).
TL;DR people work where the money flows. Companies get what they pay for. And the investors pay for what they think is the strength or is likely to sell at inflated valuations.
I have family members who consider themselves "real engineers" compared to me, an SWE. They have backgrounds in Mechanical and other "traditional" engineering fields.
About once a quarter I am subject to conversations where they remark condescendingly about how flabbergasted they are at SWE salaries. I stopped engaging beyond "Mmm if you're interested you should learn more about the field".
This interaction is beyond grating and is detrimental to our relationships.
Yep, sounds like one of those "agree to disagree" topics. Or diffuse using mild humor, like you have tried. Or redirect, and blame supply and demand, or social media.
I too have a background in Mechanical Engineering, and while many software products are complex, I wouldn’t categorize all of them as engineering projects in the historical sense of the word. That’s not to say there are no quality software products that satisfy real business requirements. But it is to say that a lot of software projects would be WAY too expensive if they were engineered the way a passenger jet or a skyscraper is engineered.
The software development field is quite new compared to the other engineering disciplines, and many, many decisions are made on gut feel, intuition, or outright personal preference. Alan Kay has some very good talks on this specific subject, referring to the current state of our field as a Cargo Cult.
However, I would also say firmware would be the least expensive to engineer because the requirements for that type of software are better known and more rigid.
I believe that a part of the problem with software engineering is the "we can always fix this later" mindset.
Even during development, the only cost of iterating over errors until you get it right is time.
But HW engineers just don't have the luxury of making 100 iterations of a product until it works, nor the safety net of "we'll update it over the internet". They must put a lot of effort into testing and verification until they say "ok, this is good, let's ship it."
Also, failure modes of mechanical products are often known and intuitive.
I am guessing that before the advent of the Internet, the quality of shipped software was higher on average. Nobody would dare ship a hot mess like Battlefield 2042 if they knew it was the last version they'd ever ship.
Automotive and other mixed-criticality systems are where these two worlds butt together and have a lot to learn from each other.
Mech eng processes on one side, ASIL-style safety requirements in the middle, and someone wishing to pour a bucket load of Android apps into the same computer from the other end.
Are they ever really "the same computer"? I don't think that's true even in entirely software-mediated-control vehicles like Teslas.
The discipline of robotics (which is really what you're talking about here — cars are just very manually-micromanaged robots these days) is all about subsumptive distributed architectures: e.g. the wheels in an electric car don't need a control signal to tell them to brake if they're skidding; they have a local connection to a skid sensor that allows them to brake by themselves, and they instead need a control signal to stop braking in such a situation.
This is why, in anything from planes to trains to cars, you see the words "auxiliary" or "accessory" used to describe infotainment displays et al — the larger systems are architected such that even an electrical fault (e.g. dead short) in the "accessory" (non-critical) systems can't impact QoS for the "main" (critical) systems.
I really can't imagine a world where they've got engineers building the car that understand that, but who are willing to let Android apps run on the same CPU that's operating the car. They'd very clearly insist for separate chips; ideally, separate logic boards, connected only by opto-isolated signals and clear fault-tolerant wire protocols.
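To make the subsumption idea concrete, here's a toy Python sketch (all sensor values, thresholds, and message names are invented for illustration): the wheel's local reflex engages the brake on its own when the skid sensor fires, and the higher-level controller can only subsume that behavior by explicitly commanding a release.

```python
# Toy sketch of a subsumption-style wheel controller. The local skid
# reflex acts on its own; the higher layer can only override it with an
# explicit "release" command. Thresholds and message names are invented.
from dataclasses import dataclass

SKID_THRESHOLD = 0.3  # invented wheel-slip ratio above which we brake

@dataclass
class WheelController:
    braking: bool = False
    release_requested: bool = False  # set by the higher-level controller

    def on_skid_sensor(self, slip_ratio: float) -> None:
        # Local reflex: no central computer in this loop.
        if slip_ratio > SKID_THRESHOLD and not self.release_requested:
            self.braking = True

    def on_control_message(self, msg: str) -> None:
        # The higher layer subsumes the reflex only by explicit command.
        if msg == "release_brake":
            self.release_requested = True
            self.braking = False
        elif msg == "clear_release":
            self.release_requested = False

wheel = WheelController()
wheel.on_skid_sensor(0.5)                  # reflex brakes by itself
assert wheel.braking
wheel.on_control_message("release_brake")  # central signal to STOP braking
assert not wheel.braking
```

The inversion is the point: the safe behavior is the local default, and central software has to actively intervene to change it, rather than the other way around.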
The point you're making is valid in general and you provide valuable context. A modern car does have many different computers, and there is a lot of intentional partitioning (and even some redundancy) into different CPUs, as well as guests under hypervisors.
For example, a typical headunit computer (the "infotainment computer") tends to contain two to three SoCs performing different duties, and one or two of them will run hypervisors with multiple guest operating systems. And that is just one of multiple computers of that weight class in the overall car architecture.
That said, there's an overall drive to integrate/consolidate the electrical architecture into fewer, beefier systems, and you do now encounter systems where you have mixed criticality within a single computational partition, e.g. a single Linux kernel running workloads that contribute both to entertainment and safety use cases. One specific driver is that they sometimes share the same camera hardware (e.g. a mixed-mode IR/RGB camera doing both seat occupancy monitoring tasks and selfies).
Safety-vs-not-safety aside, you also simply have different styles of development methodology (i.e. how do you govern a system) run into each other within the same partition. AUTOSAR Adaptive runs AUTOSAR-style apps right next to your POSIX-free-for-all workloads on the same kernel, for example.
What however is typically not the case in that scenario is that the safety workload in a partition is the only contributor to its safety use case; typically you will always have another partition (or computer) that also contributes to assuring an overall safe result.
In more auto terms, you might now have ASIL B stuff running alongside those Android apps on the same kernel, but you will still have an ASIL D system somewhere.
In general, you will start to see more of both in cars: more aviation- and telco-style redundancy and fault tolerance, and more mixed criticality. The trends are heading in both directions simultaneously.
> I don't think that's true even in entirely software-mediated-control vehicles like Teslas.
Tesla has been in the media for bugs where flipping tracks on your Bluetooth-tethered phone, or opening the wrong website in the headunit web browser, reboots the Instrument Cluster display. This is an example of mixed criticality done wrong. Many other cars are not architected quite as poorly. However, the IC and HU/central displays sharing the same computer (not necessarily the same computational partition/guest OS) is increasingly common.
> perfection is finally attained not when there is no longer anything to add, but when there is no longer anything to take away
In traditional engineering, there's at least a BOM and manufacturing processes that create pressure to keep things simpler. If physical items were engineered like software, you'd have people bolting a keyboard onto the monitor chassis they're designing because they needed an 'on' button, and keyboards have buttons. Obviously they'd then also have to add in an always-on raspberry pi to plug the USB keyboard into and emit a GPIO signal when the button is pressed. You'd get a lot more complexity, but for most of it, "impressive" would be the wrong word.
I think part of it is that hardware is tangible, software isn't, so for some reason people resent being expected to pay for software. Building software thus has less legitimacy in some peoples' minds.
I see this in the retro computing scene: people will quite happily fork over large amounts of cash to have an old bit of kit repaired, or buy a newly-developed expansion for old hardware, but those same people - even the people doing the repairs and building the new hardware - can be incredibly hostile to the idea of someone asking for money in return for new software for those old platforms.
It's hard to overcome preconceived notions. As we know from politics, emotions are much stronger than logic. You can't simply say, "be logical!" or "change your view".
The worst part is that, despite treating their software like a joke, every damn business guards their source code, protocols, etc. as if it were their crown jewels.
So end users end up having to reverse engineer it just to fix issues that the manufacturer should have addressed.
And - the real kicker - far too often it turns out to be based on open source work, with a few random modifications, distributed in violation of the license.
Isn't that exaggerated by semiconductor manufacturing (TSMC et al) dominating the Taiwanese economy? If your nation's existence is driven by EE-type concerns, software engineering doesn't seem important.