

W3C DID (Decentralized Identifier) keypairs (sk/pk) can be generated offline, and the public key can then optionally be registered online. If there is no connection to the broader internet - as is sometimes the case during solar storms in space - offline key generation without a central authority would ensure ongoing operations.
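
As a rough sketch of the offline step (my illustration, in Python with the `cryptography` package, using the did:key method's multicodec prefix and base58btc encoding):

    # Generate a DID keypair entirely offline; no registry or network is needed.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.hazmat.primitives import serialization

    BASE58 = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

    def b58encode(data: bytes) -> str:
        n, out = int.from_bytes(data, "big"), ""
        while n:
            n, rem = divmod(n, 58)
            out = BASE58[rem] + out
        return "1" * (len(data) - len(data.lstrip(b"\x00"))) + out  # keep leading zeros

    sk = Ed25519PrivateKey.generate()  # secret key never leaves the device
    pk = sk.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw
    )
    # 0xed01 is the multicodec prefix for Ed25519 public keys; 'z' marks base58btc.
    print("did:key:z" + b58encode(b"\xed\x01" + pk))

The resulting identifier is usable immediately; registering the public key anywhere is optional.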

Blockcerts can be verified offline.

ENS (the Ethereum Name Service) avoids various pitfalls of DNS, DNSCrypt, DNSSEC, and DoH and DoT (which depend upon NTP, which depends upon DoH, which depends upon the X.509 cert validity interval and the system time being set correctly).

Dapps don't need DNS and are stored in the synchronized (*) chain.

From https://news.ycombinator.com/item?id=32753994 :

> W3C DID pubkeys can be stored in a W3C SOLID LDPC: https://solid.github.io/did-method-solid/

> Re: W3C DIDs and PKI systems; CT Certificate Transparency w/ Merkle hashes in google/trillian and edns,

And then there's actually handling money with keys in space, because how do people pay for limited-time leases on domain names in space?

In addition to peering, clearing, and settlement, ILP (Interledger Protocol) specifies addresses: https://news.ycombinator.com/item?id=36503888

> ILP is not tied to a single company, payment network, or currency

ILP Addresses - v2.0.0 > Allocation Schemes: https://github.com/interledger/rfcs/blob/main/0015-ilp-addre...
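
As a rough illustration of those allocation schemes, a small check in Python (the regex approximates the address grammar in the linked RFC; treat it as a sketch, not the normative definition):

    import re

    ILP_ADDRESS = re.compile(
        r"(?=^.{1,1023}$)"                                   # overall length limit
        r"^(g|private|example|peer|self|test[1-3]?|local)"   # allocation scheme
        r"([.][a-zA-Z0-9_~-]+)+$"                            # dot-separated segments
    )

    def allocation_scheme(address: str) -> str | None:
        m = ILP_ADDRESS.match(address)
        return m.group(1) if m else None

    print(allocation_scheme("g.acmebank.acmecorp.sales.199"))  # 'g' (global)
    print(allocation_scheme("no-scheme-here"))                 # None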


That seems like a pretty extreme solution to that problem.


Poster prioritized their uptime.


It's the right amount of solution IMO.


Teaching an LLM to play Mornington Crescent: https://vm.tiktok.com/ZGejrSJfE/


Nice!

GPT-4 can play; it will actually pick up on what you're doing and follow along. It's a bit stilted if its history is full of explanations of what the places are, though it does pick up on inventing move names and the like. It'll follow your lead if you start invoking random rules and history.

> Given your astute deployment of the reverse-polish maneuver to Angel, it's clear I must think creatively to maintain a competitive stance in our game. Angel's position on the Northern Line presents an intriguing array of tactical responses.

> In light of this, and adhering to the spirit of our pre-98 rule set, I will execute a move that's both unexpected and steeped in the lore of our game. I navigate to Stratford, invoking the East London Shuffle.

https://chat.openai.com/share/6a275e17-ea27-4c35-b0a3-6d9f2c...
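
For anyone who wants to try this themselves, a minimal sketch with the OpenAI Python client (the system prompt wording is mine, not taken from the linked transcript):

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    messages = [{
        "role": "system",
        "content": "We are playing Mornington Crescent. Play along in the spirit "
                   "of the game: invent plausible-sounding rules and precedents, "
                   "and never admit that the rules are made up.",
    }]

    while True:
        move = input("Your move: ")
        messages.append({"role": "user", "content": move})
        reply = client.chat.completions.create(model="gpt-4", messages=messages)
        answer = reply.choices[0].message.content
        messages.append({"role": "assistant", "content": answer})
        print(answer)
        if move.strip().lower() == "mornington crescent":
            break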


Verisign sold its CA back in 2010.


I kind of get the vibe that Mozilla is laying the groundwork for Microsoft vs Netscape Round II - or at least some kind of antitrust litigation.


I really wish they'd go after Apple and Safari. Safari is basically the modern-day IE. Whenever I do anything slightly weird, I'm fairly confident it works in Chrome/Firefox and almost sure that it's broken in Safari.

The fact that you can't test on Safari without macOS is insane. Some bugs can be reproduced in other WebKit browsers (I test with Epiphany), but some are Safari-only. Not to mention the fact that Safari is the only choice on Apple mobile devices.

I believe Apple is significantly worse than Microsoft with regard to browsers. I wish Mozilla would focus on them.


Once Chrome is allowed on iOS, then Firefox is dead. When they're no longer forced to test in a second browser by management, all the web devs will just check for a recent Chrome user agent and if it's not there they just say "install chrome to use this site", and we're back in the IE days. And then Google can stop paying Mozilla for search field defaults and then all of Mozilla's income is gone.


I've seen that sentiment a lot, but I don't buy it. Google pays for Firefox so they can claim they are not a monopoly.

I'm hopeful that if people have a choice on iOS, Apple would invest more in their browser.


They won't. They already gave both Apple and Google a pass when Chrome was launched and when smartphones took off. Mozilla sees Microsoft as the devil incarnate, while Google and Apple are allies in their holy war.

I am saying this as a developer who has been using Firefox since the Firebug v0.8 era.



I hope so. But a lot has changed, notably the financial power and sheer size of their counterparts. We're talking about three of the most valuable companies in the world.


About time!


The UN publishes a list of recognised member states. ISO maintains a list of codes. IANA uses that list to determine whether a country code TLD is valid. But that's the extent of the UN's involvement in the governance of the root zone.


There are plenty of TLDs for areas which aren't countries, which don't claim to be countries, and which nobody recognises as countries. .gg and .gu, for example.


> The IANA is not in the business of deciding what is and what is not a country. The selection of the ISO 3166 list as a basis for country code top-level domain names was made with the knowledge that ISO has a procedure for determining which entities should be and should not be on that list.

Jon Postel, RFC 1591 (Domain Name System Structure and Delegation)


Those are not ccTLDs though.



Oh, I was wrong. My bad.


Are you sure? Wikipedia describes both as being country code TLDs, and both have an official ISO 3166-1 code that matches their domain.
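
For reference, a quick lookup in Python with the pycountry package (my choice of tool; it just wraps the ISO 3166-1 list, whereas IANA's actual delegation process is more than a table lookup) shows both codes as officially assigned:

    import pycountry

    for tld in ("gg", "gu"):
        entry = pycountry.countries.get(alpha_2=tld.upper())
        print(tld, "->", entry.name if entry else "not in ISO 3166-1")
    # gg -> Guernsey
    # gu -> Guam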



As the parent comment said, it's a temporary TLD, so by design it does not appear in the root listing. See: https://en.wikipedia.org/wiki/.xk


It has never been present in the DNS root zone. A TLD that doesn't exist in the DNS is not really a TLD, is it?


(2010)


Which partially explains how they managed to have a 10 Mbit port at an internet exchange. Although that seems weird even for 2010? I assume this would be a 100 Mbps or Gbit port artificially throttled to 10 Mbps?

That said, 50 Mbps of traffic sounds like a non-issue for someone like Cloudflare, so I'm not surprised they took the prefix.

Edit: A post by Cloudflare (linked in another subthread here) mentioned 10 Gbps of traffic at the time Cloudflare started using it. Which, as it would be spread over multiple locations, is probably a relatively minor annoyance.


It seems like it's a MB vs. Mb mix-up, but it still feels like a small number even by 2010 standards.


Has to be a virtual network, right? The throttle was for safety, not cheapness.


Aside: Arachnophilia was the first editor I used for web page development in the late 90s as it was installed on every PC in my university's computer rooms. It's amazing to rediscover it now!


Mods, I suggest adding (1999) to the post title.

