I don't think I've ever truly "bricked" an iPhone, or got one to the point where it CAN'T be put into DFU mode and restored. Cydia tweaks made it sound like one wrong move could render your device permanently unusable, but at most it was an inconvenience.
I was not the original retail customer; it was a phone used by a small business that I had owned. The employees were not good about keeping track of stuff like that; they'd use their own personal deets to sign up for business things all the time.
My point was that there was no "official" procedure to verify. I told him the story, he believed me, and he went in the back and reset the password or something. They have the capability if they want to use it. He was a manager, I think, and I didn't get to him through the reservations queue; I just walked up to the counter to find out if this was the place I should go, and after I asked my question he took an interest and solved the problem for me.
That one's new to me. I was aware of Angie and Freenginx, which are both led by former nginx developers who left F5 after the acquisition. Tengine looks to be a much older fork, but I can't find much recent discussion about it, though that may be because it's an Alibaba/Taobao project with a primarily Chinese userbase, judging by the GitHub issues.
For the Wayback Machine, are those compressed, deduplicated numbers?
A semi-popular domain can have millions of results on its CDX API, but with http/https duplicates, and about 90% of the results are error pages or pages with deliberate garbage / LFI attempts in them.
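For illustration, here's a rough sketch of narrowing that down with the CDX API's own knobs (example.com is a placeholder domain; collapse=digest collapses consecutive identical captures and the statuscode filter drops error pages):

    import json
    import urllib.request

    # Placeholder domain; collapse=digest collapses runs of captures with
    # the same payload digest, filter=statuscode:200 drops error pages.
    url = ("https://web.archive.org/cdx/search/cdx"
           "?url=example.com/*&output=json"
           "&filter=statuscode:200&collapse=digest&limit=1000")
    with urllib.request.urlopen(url) as resp:
        rows = json.load(resp)

    header, *captures = rows  # first row of the JSON output is the header
    print(len(captures), "distinct 200-status captures")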
Deduplication is not trivial. Each scrape is stored in a WARC archive, so you would have to unpack several large files, dedupe, and then pack them back up again. I believe they are at least compressed within each snapshot, though.
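As a rough sketch of what the dedupe step would involve, assuming the warcio Python library and deduping purely by payload digest (filenames are placeholders; a real implementation would emit WARC revisit records instead of dropping duplicates outright):

    from warcio.archiveiterator import ArchiveIterator
    from warcio.warcwriter import WARCWriter

    seen = set()
    with open("in.warc.gz", "rb") as src, open("out.warc.gz", "wb") as dst:
        writer = WARCWriter(dst, gzip=True)
        for record in ArchiveIterator(src):
            # Keep only the first record carrying each payload digest.
            digest = record.rec_headers.get_header("WARC-Payload-Digest")
            if digest and digest in seen:
                continue
            if digest:
                seen.add(digest)
            writer.write_record(record)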
Yes, that seems a silly way to go about it if your goal is to store the whole web and not just a single scrape. Of course, anything that deduplicates data is more vulnerable to data corruption (or at least corruption can have wider consequences), so it's not a trivial problem, but you'd think deduplicating identical resources would have been added the first time they came close to their storage limits.
macOS is notorious for this. By default, it will only run binaries signed with an Apple-issued certificate. You can bypass this in several ways, of course, but that requires knowing it can be bypassed in the first place.
Then there are mobile OSes where you don't get to see the binaries at all. Yes, you can repack an APK, but again, that's a more involved process requiring specific tools and knowledge (and very awkward to do on the device itself), and iOS is completely locked down.
> macOS is notorious for this. By default, it will only run binaries signed with an Apple-issued certificate. You can bypass this in several ways, of course, but that requires knowing it can be bypassed in the first place.
What do you mean? When I compile something with any of a myriad of different language stacks or compiler toolchains, I'm not aware of an Apple-issued certificate ever being involved, and those binaries run just fine.
Probably because the environment you use to compile it, like the terminal or Xcode, is added to "Developer Tools" under the security settings. Xcode in particular adds itself there automatically.
This would be fine if the user were empowered to re-sign it after the mucking. The problem is that the user is rarely in charge of their own computer anymore.
Outside of mobile operating systems, e.g. on Linux, Windows, and macOS (and all the BSDs, etc.), it's fairly trivial to run binaries you built yourself.
But: re-signing is an extra step that someone who's just starting out and mucking around with a hex editor might not know how to do, or even be aware of.
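For what it's worth, the re-sign step itself is a one-liner once you know it exists. A minimal sketch, with a hypothetical path to an already-patched binary (an ad-hoc signature, identity "-", needs no Apple-issued certificate and is enough for something you only run locally):

    import subprocess

    binary = "./patched_tool"  # hypothetical hex-edited binary

    # Ad-hoc re-sign: --sign - uses no certificate at all, which is
    # sufficient for a binary that only runs on your own machine.
    subprocess.run(["codesign", "--force", "--sign", "-", binary], check=True)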
Creating a Reddit account took 25 seconds, and they even generated a username for me. No email verification necessary; sorry to a@example.com if you're getting emails about Reddit user "Agile_Rectangle729".
Why not just use tar or any other archive tool on the repository's .git folder? Unless your repository is an un-gc'd mess with millions of unpacked objects...
I think that is a fair alternative, but restoring the backup leaves the repository in a bit of a weird state, whereas a bundle can be cloned from nicely. Your way does have the property that it includes hooks and config, though (which may or may not be desired).
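For reference, a minimal sketch of the bundle round-trip (file and directory names are placeholders):

    import subprocess

    # Bundle every ref into a single file, then check it restores cleanly.
    subprocess.run(["git", "bundle", "create", "backup.bundle", "--all"],
                   check=True)
    subprocess.run(["git", "clone", "backup.bundle", "restored-repo"],
                   check=True)

The clone comes up as an ordinary repository with the bundle file as its origin remote, which you can then repoint at a real URL.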
This is also non-deterministic between versions of tar, but I guess for this use case that would be fine. It's just not good for reproducible build systems when trying to recreate tarballs after years.
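If determinism does matter, Python's tarfile shows the usual normalizations compactly. A sketch, assuming Python 3.7+ (where add() recurses in sorted order) and an uncompressed archive, since gzip would embed its own timestamp:

    import tarfile

    def normalize(info: tarfile.TarInfo) -> tarfile.TarInfo:
        # Clamp the metadata that varies between machines and runs.
        info.mtime = 0
        info.uid = info.gid = 0
        info.uname = info.gname = ""
        return info

    # Plain "w" mode: "w:gz" would add a gzip header timestamp.
    with tarfile.open("repo-backup.tar", "w") as tar:
        tar.add(".git", filter=normalize)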
> The naive solution of simply backing up the entire file-system tree is clearly not desirable since that would clutter the backup with useless build artifacts.
Build artifacts can be filtered out with tar --exclude patterns, but this is a language-dependent set that will require curation.
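The same idea in tarfile form, with a hypothetical exclude set that would indeed need per-language curation (returning None from the filter drops a member, and dropping a directory skips its whole subtree):

    import tarfile

    # Hypothetical, language-dependent build-artifact directories.
    EXCLUDE = {"node_modules", "target", "build", "dist",
               "__pycache__", ".venv"}

    def keep(info: tarfile.TarInfo):
        if any(part in EXCLUDE for part in info.name.split("/")):
            return None  # omit this member (and, for a dir, its contents)
        return info

    with tarfile.open("project-backup.tar", "w") as tar:
        tar.add("myproject", filter=keep)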