
However they applied it to all phones of that model, not just ones with degraded batteries

No, it was dynamic based on voltage. iPhones with worn batteries had higher performance at full battery and swapping the battery with a fresh replacement restored full performance even at low battery percentage. In fact this is how the slowdown was discovered: someone replaced their iPhone battery with a non-genuine replacement and it got noticeably faster.

Given that a) most human-rated rockets have had 0 flights before use, and b) I'd expect each Starship to have at least 10 flights, and at least 100 in total without mishap before launching crew, the statistics should be good

I don’t think (a) is true. The Shuttle flew with people on its maiden voyage, but that’s the only one I can think of.

(b) is true and should make it substantially safer than other launch systems. But given how narrow the margins are for something going wrong (zero ability to land safely with all engines dead, for example) it’s still going to be pretty dangerous compared to more mundane forms of travel.


Most rockets flew test flights before sticking people inside the same model, but most rockets are also single use and so each stack is fundamentally new.

A future starship could plausibly be the first rocket to fly to space unmanned, return, and then fly humans to space!


Don't use the homepage, use the subscriptions page

Turning off search history makes the homepage useless, and the subscriptions page unavoidably becomes the next step.

Discovery of content happens in the sidebar from videos I enjoy now, and only when I'm in the mood to discover something.


Quassel is basically a self-hosted IRCCloud; I use it on 5 computers across Windows, Linux and Android, and it's great

https://news.ycombinator.com/item?id=42385393 When the only copy of your computer is 15 billion miles away, and the documentation is OCR'd 50-year-old printouts, and no one recorded what was patched when


Do you assume laws cannot be updated?


Low-level EU Regulations like this take approximately 3 years to be drafted and adopted (validated).

A whole decade is often required if the member states consider that a new mandate is needed: typically a directive, regulation, or treaty clause giving the EC authority and a framework to regulate something.

Any update to this regulation will have to wait at least 3 years after a new standard has been agreed on. And there will probably be a period for adoption by the industry, typically 2 years. So at least 5 years after everyone has agreed what is needed. It most probably won't be updated for the next 25 years.


Considering the number of stupid laws that haven't been updated, and the conflicting interests every time an update is proposed, I'd say it can safely be assumed that in most circumstances the law will never be updated.


The assumption that a law that directly influences millions of people's daily lives and has close to 0 direct budget cost associated with it won't be updated when it becomes counterproductive is quite funny.

Are you American, perchance?


The EU's cookie law still requires a banner for everything except "strictly necessary cookies",[1] which means you must have a banner if you use cookies to save preferred language, default location, or any internal analytics data (such as New Relic, Datadog, etc).

So yes, I think updating the law will take a significant amount of time.

1. https://gdpr.eu/cookies/


You do not need a banner, you need informed consent. I'm sure there are ways of getting consent other than a half-screen pop-up with a big red accept button on first visit, but they probably won't get a 70% "opt in" rate.


Law: The optimum behaviour is annoying banners.

Companies: Annoying banners.

Legislators: Mission Accomplished. A win for the good guys!

Situation persists for at least a decade.


A more accurate version:

Law: You have to get some form of affirmative consent if you want to do specific often-abused things.

Companies: We'll do it in the most obnoxious way possible ("here are our 853 technology partners... no, there's not a 'deselect all' option, have fun clicking") so people blame the law instead of the industry that didn't want to allow consent at all.


There's always a deselect all option (or rather, the equivalent "accept only the technically required ones"), because it's required by law. Sometimes the operator tries to hide the option. That, too, is illegal.


There is frequently not a "deselect all" option; there's a reason regulators keep having to warn about it.

https://ico.org.uk/about-the-ico/media-centre/news-and-blogs...


I so wish that "our 1234 trusted partners" was an exaggeration.


Selecting the language you want actually sounds like "functionality that has been explicitly requested by the user" who "did a positive action to request a service with a clearly defined perimeter". This is clearly allowed.

https://ec.europa.eu/justice/article-29/documentation/opinio...


Section 3.6 says that UI customizations such as language preferences are only exempt if they last for a session (no more than a few hours). Anything longer requires a cookie notice, though they do claim that a less prominent notice than a modal is acceptable.


There's no section 3.6.

It doesn't say only a few hours.

The optimum behaviour under the law is not to show a cookie banner. It's not to collect copious amounts of data.

You've only had 8 years to learn about the law, and you still remain willfully ignorant and misinformed about it.


Page 8 of the PDF[1]: 3.6 UI customization cookies

> These customization functionalities are thus explicitly enabled by the user of an information society service (e.g. by clicking on button or ticking a box) although in the absence of additional information the intention of the user could not be interpreted as a preference to remember that choice for longer than a browser session (or no more than a few additional hours). As such only session (or short term) cookies storing such information are exempted under CRITERION B.

It specifically says that a consent notice is required for UI customization cookies that persist more than a few hours, and it gives an example of preferred language as one of those UI customizations.

1. https://ec.europa.eu/justice/article-29/documentation/opinio...


> Page 8 of the PDF[1]: 3.6 UI customization cookies

What's "Opinion 04/2012 on Cookie Consent Exemption" adopted on 2012, 4 years before GDPR?

Edit: On top of that, the actual quote:

--- start quote ---

"They may be session cookies or have a lifespan counted in weeks or months, depending on their purpose

... addition of additional information in a prominent location (e.g. “uses cookies” written next to the flag) would constitute sufficient information for valid consent to remember the user’s preference for a longer duration,

--- end quote ---

12 years since this opinion, 8 years since GDPR, and you still have no idea about either.


Sounds perfect to me.


Maybe I am dense but I cannot find the requirement for cookie-banners in your link.


You're linking to a fake website made by the private company behind Proton Mail, which tries to present itself as an official EU site. What they claim serves their own financial interest, not what the GDPR actually says.

From the horse's mouth:

"GDPR.EU is a website operated by Proton Technologies AG, which is co-funded by Project REP-791727-1 of the Horizon 2020 Framework Programme of the European Union. This is not an official EU Commission or Government resource. The europa.eu webpage concerning GDPR can be found here. Nothing found in this portal constitutes legal advice."


GDPR isn't the cookie law. It is a law regulating the storage of personal data overall. The banners are a result of greed and incompetence. The companies made stupid amounts of money by closely profiling every single individual using cookies and fingerprinting. They are in malicious compliance, and if the behavior continues the regulation may become more stringent.


I never said the GDPR was the cookie law. I was just linking to a site that summarized the actual law. If storing preferred language in a cookie (without any uniquely identifying info) does not require a cookie banner, then I'd be happy to be corrected on that.


You don’t need an intrusive banner on page open. You just need consent.

If the user is saving a setting like a language preference, just put "by saving this preference you agree for us to store the setting" next to the option/OK button (it's really implicit, just like their shopping cart example, but this is if you want to be really paranoid).
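
For a rough sense of how small this is in practice, here's a sketch in Python/Flask (the /set-language endpoint and cookie name are made up for illustration): the cookie stores only the preference itself, with no identifier that could be linked back to a person.

    from flask import Flask, request, make_response

    app = Flask(__name__)

    @app.route("/set-language", methods=["POST"])
    def set_language():
        # The form next to the OK button carries the consent wording, e.g.
        # "by saving this preference you agree for us to store the setting".
        lang = request.form.get("lang", "en")
        resp = make_response(f"Language preference saved: {lang}")
        # Only the preference is stored (no user identifier), so nothing
        # in the cookie can be tied to a person.
        resp.set_cookie("lang", lang, max_age=60 * 60 * 24 * 365, samesite="Lax")
        return resp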


If it cannot be linked to you, it's no longer PII, and doesn't require consent.

As easy as that.


Can't see the source


It seems marketing BS… will they open source the ARM710 Verilog/VHDL?

Probably the board schematic… what a joke


By that logic any OSHW project using integrated circuits is closed-source. After all, the PDK is proprietary, the fab's production pipeline isn't publicly available, and there isn't even a way to confirm the HDL supplied is actually what is being used in the chip!

If we want truly open-source hardware we'll have to go back to relay computers. Everything beyond that is "a joke".


No love for a Verilog design in an FPGA as open source hardware?

It’s a flawed surrogate of course - lower performance and higher cost than an asic produced at volume - but the open source hardware aspect is very strong.


The circuitry (i.e. transistor level) of the FPGA is still closed, no?


Yes, but there are many different FPGAs, and the same Verilog will run on many of them, which to my mind makes the downsides of proprietariness low enough that this might be the sweet spot to stop at when going down the stack.

However awesome an open-source fabbed ASIC or FPGA would be, of course.


It's not open unless you can make it from sand.

Actually, it's not open unless you can make it from hydrogen atoms.


That’s going too far. RYF certification is what you could reasonably require.


> By that logic any OSHW project using integrated circuits is closed-source.

There are degrees to things. It's not cool to try to claim the prestige of "open source" when the entire board is nothing but a vehicle for your very-much-closed processor design.


> when the entire board is nothing but a vehicle

You can get the board design and order it manufactured yourself, or you can change it to suit a different processor that's more open.


What about the many genuinely open-source RISC-V designs?


Most people wouldn't realise they can't recover their TOTP codes. But the hacker would still need to know your password, surely?


...so you agree that this is missing the '2' in 2FA?


For "something you have" to be true to its purpose it has to be something that has one and only one copy - so either only you have it, or you don't, but nothing in between. The second you have "cloud backup", or activate an additional device, or "transfer to a new device" then you turn the attack into "phishing with extra steps".


You can support transferring to a new device without increasing the phishing risk; the transfer just needs to be done via a physical cable rather than via the cloud.


I'll grant you that it's a better option but by no means good if you want to stand on the 2FA hill and put security first (only?). That "just" does a lot of heavy lifting.

The only time I'd consider transferring a secret like this is secure is within an HSM cluster. But these are exceptionally hardened devices, operating in very secure environments, managed by professionals.

Your TOTP seed, on the other hand, is stored on any of the thousands of types of phones, most of which can be (and are) outdated and about as secure as a sieve. These devices also have no standard protocol for the transfer. Allowing the extraction via cable is still allowing the extraction; the cable just "helps" with the transfer. Once you have the option to extract, as I said, you add some extra steps to an attack. Many if not most attacks would maybe be thwarted, but a motivated attacker (and a potential payoff in the millions is a hell of a motivator) will find ways to exfiltrate a copy of the keys from the device even without a cable.

This is plain security vs. convenience. The backup to cloud exists because people lose/destroy the phones and with that their access to everything. The contactless transfer exists because there's no interoperability between phones, they used different connectors, etc. No access to the phone is a more pressing risk than phishing for most people, hence the convenience over security.


I think this is also the main drawback of physical U2F/FIDO2/Webauthn tokens: security-wise they are by far the best 2FA option out there, but in practice it quickly becomes quite awkward to use because it assumes you only own a single token which you permanently carry around.

Sure, when I make a new account I can easily enroll the token hanging on my keychain, but what about the backup token lying in my safe? Why can't I easily enroll that one as well? It's inconvenient enough that I don't think I could really recommend it to the average user...


I don't quite get this "I need to add every possible authenticator I have at account creation or I'm not doing it" kind of mentality I see a lot.

When I make an account, if I have at least two authenticators around me, I'll set up the hardware authenticators or make sure it's got a decent recovery set up. As time goes on I'll add the rest of them when it's convenient. If I don't have at least two at account creation or I don't trust their recovery workflow, I guess I'll just wait to add them. No big deal.

If I'm out and I make an account with $service but I only have my phone, I'll probably wait to add any authenticators. When I'm with my keys, I'll add my phone and my keyring authenticator to it. When I sit down at my desktop sometime in the next few days and I use $service I'll add my desktop and the token in my desk drawer to it. Next time I sit down with my laptop and use $service, I'll add that device too. Now I've got a ton of hardware authenticators to the account in question.

It's not like I want to make an account to $service, gotta run home and have all my devices around so I can set this up only this one time!


>When I make an account, if I have at least two authenticators around me

If you do, you're in a tiny minority of users. Well, even if you have one you're in a tiny minority, but having two laying around is extremely unusual.


Only because I bothered to buy a few. If they're making a new account they're probably on a device which can be an authenticator, i.e. a passkey. Is it rare for people to be far away from their keyring where they potentially have a car key and a house key and what not?

Do most people with hardware authenticators not also have laptops, desktops, or phones? They just have an authenticator, no other computers?

This person I replied to already has two hardware tokens. They probably also have a phone that can be used with passkeys, they probably also have a laptop which can be used with passkeys, they might also have a tablet or desktop which can be used with passkeys. That person probably has 3-6 authenticators, and is probably with two of them often if they carry keys regularly.


I don't understand the existence of an HSM cluster. I thought HSM was meant to be a very "chain-of-custody" object, enabling scenarios like: cryptographically guarantee one can only publish firmware updates via the company processes.


The HSM is more generic than that - a Hardware Security Module. It's just a hardware device (usually; software HSMs exist too) that securely stores your secret cryptographic material, like certificate private keys. The devices are exceptionally hardened, both physically and in their software. In theory, any attempt to attack them (physically opening them, or even turning them upside down to investigate them, leaving them unpowered for longer than some hours, attempting too many wrong passwords, etc.) results in the permanent deletion of all the cryptographic material inside. These can be server-sized or pocket-sized; the concept is the same.

Their point is to ensure the private keys cannot be extracted, not even by the owner. So when you need to sign that firmware update, or log into a system, or decrypt something, you don't use a certificate (private key) file lying around that someone can just copy, you have the HSM safely handling that for you without the key ever leaving the HSM.

You can already guess the point of a cluster now. With only one HSM there's a real risk that a maintenance activity, malfunction, accident, or malicious act will lead to temporary unavailability or to permanently losing all the keys. So you have many more HSMs duplicating the functionality and keys. So by design there must be a way to extract a copy and sync it to the other HSMs in the cluster. But again, these are exceptionally hardened HW and SW, so this is incomparably more secure than any other transfer mechanism you'd run into day to day.
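
To make "the key never leaves the HSM" concrete: applications typically talk to the device through an interface like PKCS#11 and only ever receive the signature back. A rough sketch using the python-pkcs11 package, where the module path, token label, PIN, and key label are all placeholders for whatever a real deployment would use:

    import pkcs11
    from pkcs11 import KeyType, Mechanism, ObjectClass

    # Placeholder path; a real deployment points at the vendor's PKCS#11 module.
    lib = pkcs11.lib("/usr/lib/softhsm/libsofthsm2.so")
    token = lib.get_token(token_label="firmware-signing")

    with token.open(user_pin="1234") as session:
        # The returned key object is only a handle; the private key material
        # itself never leaves the device.
        key = session.get_key(object_class=ObjectClass.PRIVATE_KEY,
                              key_type=KeyType.RSA,
                              label="firmware-key")
        # Signing happens inside the HSM; only the signature comes back.
        signature = key.sign(b"firmware image bytes",
                             mechanism=Mechanism.SHA256_RSA_PKCS)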


Ah, got it. So in the event someone managed to get access, they are limited to signing things in that moment on that infrastructure. I can see how that would reduce the blast radius of a hack.


Ideally this would destroy the initial copy too - but forcing physical access would indeed be a great start.


Even so, if you have a copy even for a fraction of a second then you can have two copies, or skip the deletion, or keep the temporary copy that was used during the transfer. Even the transfer process could fail and leave a temporary file behind with your secrets.


I quite like Apple's Advanced Data Protection; I set it up with two physical YubiKeys recently. To log in to iCloud/Apple on a new device that's not part of your trusted devices, you must use the hardware token.


They'd have to know your password and get you to click your 2FA accept button, so that's still 2 factors.


If I'm the only person going from/to somewhere, having a large public transport vehicle take me is wasteful


WhatsApp wins because it doesn't require a username and password; that's too complex for many people


> wins because it doesn't require a username and password

And loses because you can't give it to a kid that doesn't have a mobile phone number.

I have shared custody of my daughter and we communicate via XMPP on a tablet they carry over there when they spend a week at their mother's.


True, but I mean wins in a global "it's now the default platform for most people in the world" sense.

