> if I have untrusted code running on my CPU, I've already lost
Don’t forget about JavaScript, a common way for people to run untrusted code on their computers. Not all micro-architectural data sampling attacks are exploitable from JavaScript, but some are.
Yeah, JS is the only hairy part. I considered mentioning it, since I know it was going to come up. But luckily, all I've seen so far are basic demos (like leaky.page) that read data from a carefully-crafted array that the page itself populated. I've yet to be convinced that you could realistically exfiltrate meaningful data at any sort of scale with in-browser JS, especially now that more blatant bugs like Meltdown are fixed.
If anyone can show a proof-of-concept ("this page grabs your password manager extension's data") I'll eat my words. But I feel confident that most of these issues are purely academic and, while interesting, serve more to provide content for PhD theses than represent urgent hazards on the web.
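For context, the "carefully-crafted array" demos mentioned above boil down to a timing probe over a 256-slot array. Here's an illustrative sketch of just that probe phase, not a working exploit; the names (`LINE`, `probe`, `fastestSlot`) are made up for illustration, and in a real browser `performance.now()` is deliberately coarsened, which is exactly why these demos struggle:

```javascript
// Illustrative sketch of the probe phase used by leaky.page-style
// demos -- NOT a working exploit. All names here are invented.

const LINE = 64; // one cache line per possible byte value
const probe = new Uint8Array(256 * LINE);

// Time a single array access. Browsers coarsen performance.now()
// precisely to frustrate this kind of measurement.
function timeAccess(buf, index) {
  const t0 = performance.now();
  const v = buf[index]; // the access being timed
  const t1 = performance.now();
  return { value: v, elapsed: t1 - t0 };
}

// After a speculative, secret-dependent access has (hypothetically)
// pulled one slot into cache, the fastest slot reveals the secret byte.
function fastestSlot(buf) {
  let best = 0;
  let bestT = Infinity;
  for (let i = 0; i < 256; i++) {
    const { elapsed } = timeAccess(buf, i * LINE);
    if (elapsed < bestT) {
      bestT = elapsed;
      best = i;
    }
  }
  return best;
}

console.log('fastest slot:', fastestSlot(probe));
```

Note that without the speculative access that populates the cache, the "fastest" slot here is just noise, which is the gap between these demos and a realistic attack.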
Indeed, I've been feeling indifferent about all these timing side channels ever since the very first ones (Spectre/Meltdown). The PoCs have not been particularly convincing to me: they are extremely contrived and rely on knowing the exact details of the target system, to the point that anyone with those details would be better off with other ways in. They also assume those details haven't changed at all during the time the attack requires --- and the nature of these side channels is such that even the smallest change in environment can completely change the results.
In other words, if I choose a process on my system at random, and dump a few dozen bytes from it, I can technically claim to have leaked some data; but the use of that data to an attacker likely depends strongly on factors which are outside of the attacker's control. It's somewhat like finding a (real-world) key on the ground: you theoretically now have access to something you shouldn't have, but you have next to no idea what that something is.
> But I feel confident that these issues are purely academic and, while interesting, serve more to provide content for PhD theses than represent urgent hazards on the web.
They also provide content for sensationalist clickbait articles and fuel the paranoia that drives society towards authoritarianism and furthers the war on general-purpose-computing, which IMHO is a much bigger issue to worry about.
> Yeah, JS is the only hairy part. I considered mentioning it, since I know it was going to come up. But luckily, all I've seen so far are basic demos (like leaky.page) that read data from a carefully-crafted array that the page itself populated.
A lack of public PoCs says very little. If I were head of a nation-state APT I'd look into exploiting this, because the attack surface of JS is huge. I'd only use it in targeted operations, for example against the Microsoft Azure team, as outlined in Darknet Diaries #78.
If they’re targeting Joe and Jane Average, the long history of government tech procurement failures means I expect them to fail — fail dangerously, but fail.
How would statically typed Wasm open an even wider hole? I assume you mean that the size of Wasm's hole is larger than JS's, not that their combined holes are larger than either one.
Wasm has more control over timing and memory access than JS does. From a capabilities-model perspective it is more secure, but from a side-channel threat model, Wasm is a more effective tool than JS.
Be afraid. Be very afraid: the hackers are reading this too, as well as the malevolent nation-states hell-bent on hacking.
I cannot show you a proof of concept right now, but I am betting that within a year, maybe even as quickly as six months, you will see this in the wild.
It sounds like a dream, but going back to interpreted JS instead of JIT may finally stem the insane bloat that JS has evolved in an environment of increasingly fast implementations.
The problem of Javascript bloat doesn't have a technical solution.
JavaScript bloat exists because of a social problem: the guy who fixes the corporate webpage's JavaScript is called a "webdesigner", and "webdesigners" are the lowest rung on the corporate IT ladder, maybe only a bit above first-tier tech support.
If you want to make some sort of career you need to upgrade from "webdesigner" to "frontend developer", and that means cryptic, incomprehensible and pointless "frontend frameworks".
It provides no value to the business or to users, but management puts up with it because it fixes the problem of employee churn. (Frontend positions are a big pain in the ass.)
I posit something even simpler. JavaScript bloat exists because it's easy for a newb to learn and put something real on a screen, and it's a pleasure to write in. Writing these frameworks/libraries/websites/whatevers is literally its own reward, and the barrier to sharing tools is low. Couple that with enthusiastic developers across the entire spectrum of naivety and experience finding new tools fun and exciting to develop and use, and you have an ecosystem with endemic bitrot. It feels absurd to have to say this, but the people who are part of this ecosystem and contribute to the bloat do not despise it the way HN people seem to. They don't see it as broken. It's not going away.
>Javascript bloat exists because of a social problem
JS bloat exists because the HTML Working Group, along with the whole industry, believes JS is the solution to everything. They believe everything on the web should be a Web App, and they completely neglect Web Page development. Only in the past two years have we started seeing discussions about reversing course.
But JavaScript bloat is allowed to stay (by product managers, middle managers, UX designers, etc) because the website is still fast. If the JS bloat actually caused the site to become too slow on fast machines, people with power to change stuff would demand change.
People with the power to change stuff thought Java applets were a good idea in 1995, when most computers in wide circulation could barely run the JVM at any acceptable speed.
Never trust the tech industry to make optimal decisions; you are only in for a bad time.
They can set metrics on quality, but those can be easily gamed. (E.g. measuring average TTFB for a site instead of the real wall time to show visible content for the user.)
Management has the power to say that something isn't good enough and make it a priority. They also have the power to hire employees or consultants if the current team isn't capable of doing it.
Project management _definitely_ has the power to dedicate time to fixing performance issues.
Metrics can be gamed, but certain metrics - such as time to interactive, and time to fully loaded - are fairly well in line with what users actually care about. Even if they're gamed, a project manager can say, "This still feels slow to use. Dedicate the next (sprint|cycle|month|whatever) to performance work."
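The TTFB-vs-real-experience gap above can be made concrete with a few lines. This is a hedged sketch: the field names follow the W3C PerformanceNavigationTiming interface, but the `summarize` helper and the sample entry are invented for illustration:

```javascript
// Why TTFB alone is gameable: derive a few user-facing numbers from
// Navigation Timing fields. The sample entry below is invented.
function summarize(entry) {
  return {
    ttfb: entry.responseStart - entry.startTime,         // server answered
    interactive: entry.domInteractive - entry.startTime, // page usable-ish
    fullyLoaded: entry.loadEventEnd - entry.startTime,   // everything in
  };
}

// A site can report a great 120 ms TTFB while users still wait
// over five seconds for the page to finish loading.
const sample = {
  startTime: 0,
  responseStart: 120,
  domInteractive: 2400,
  loadEventEnd: 5200,
};
console.log(summarize(sample));
```

In a browser, a real entry comes from `performance.getEntriesByType('navigation')[0]`; averaging only `ttfb` across a site is exactly the kind of gaming the parent comment describes.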
Once JIT is disabled, webapps are no longer viable. Which means we can start to deprecate features content-based websites don't need, and eventually, JS itself.
No reason to insult web developers in general. The simpler explanation is that webapps exist because of an economic problem: you can make more money (and have lower barriers) through recurring payments for services, by selling your users' attention, or both.
It is a technical problem. Fix the platform by enabling the writing of modular code with controlled, scoped imports and exports between html, css and JS, and you'll see the bloat go away.