
...Found the guy who crashed the site ;)


You can add delays with -w 10 (10 seconds) or --random-wait, according to https://stackoverflow.com/questions/35771287/delays-between-...


--random-wait with the default tries (20) is working for me


Don't forget --timestamping in case the command fails and you want to try again, but only download files you don't already have.
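Putting those flags together, something like this should be gentle on the server (the URL and the jpg-only filter are placeholders, adjust for the actual page):

    # polite recursive mirror: 10s base delay, randomized, skips files you already have
    wget --recursive --no-parent --accept '*.jpg' \
         --wait=10 --random-wait --tries=20 --timestamping \
         https://example.com/images/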


It's really the tragedy of the commons and why we can't have nice things. Ghibli went out of their way and gave something nice, only to have it immediately abused by gluttonous data hoarders.


Yeah, it's definitely the ~100 people using a shell script to download a couple of JPEGs crashing the site. Not millions of regular people looking around.


This is a nice use case for IPFS.

People `ipfs pin` the images to satisfy their data lust, and instantly become part of the growing swarm of computers which are now serving the content to new visitors.
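Roughly like this, assuming someone has already published a CID for the image set (the CID here is a placeholder):

    # fetches the images from the swarm and pins them locally,
    # so your node keeps serving them to other peers
    ipfs pin add <cid>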


Or just put it on a tracker, like we've been doing for 20 years.


https://instant.io/#e371e76f57923f555c0585910420137cd615911a

magnet:?xt=urn:btih:e371e76f57923f555c0585910420137cd615911a
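For anyone who prefers a terminal over instant.io, any client that takes magnet links works; a sketch with aria2, seed ratio picked arbitrarily:

    # download, then keep seeding until you've uploaded 2x what you pulled
    aria2c --seed-ratio=2.0 \
      'magnet:?xt=urn:btih:e371e76f57923f555c0585910420137cd615911a'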


Thanks for sharing – 8 peers and downloaded in 30 seconds!


This is a nice use case for a regular and boring CDN; caching images and making them available without crashing a web server is a solved problem.


You've described a solution that puts the cost/work on the provider. IPFS and BitTorrent shift that to the consumers.


€50 per month for 50 TB of traffic = 50 million downloads of a 1 MB file. The costs are so low that CDNs are practically free. Oh wait, Cloudflare IS free for such primitive use cases.


You ignored the cost of labor.

Not that it matters; my point stands. The cost exists, and different solutions put it in different places.


Or pin them on Skynet, which is just a better implementation of IPFS.



Are there metrics that indicate this, or is it better in other ways also?


The big difference with Skynet is that you pay for professionals to pin the data on your behalf. It works better for unpopular files because you aren't relying on the uptime of people at home.

It also works better for popular files because the professionals generally have great bandwidth and good peering.

There are a bunch of other benefits as well, such as access to an API that lets Skynet pages/apps upload and download. For something like this you wouldn't use those endpoints, but if you wanted to build, say, a decentralized blogging platform, you could use them to upload the posts that users write.


> because you aren't relying on the uptime of people at home

IPFS doesn't automatically spread whatever you put into your client, so you should never rely on others to randomly host your content for you. In fact, if you start a default IPFS node, add your content, then turn it off again, no one is going to have the content you added.
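Concretely, something like this (directory name and CID are placeholders):

    # chunks the files and announces them from *your* node only;
    # prints a CID, but copies nothing anywhere else
    ipfs add -r ghibli-images/

    # the data outlives your node only if other peers explicitly pin it
    ipfs pin add <cid>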

Would have been interesting to hear a fair/impartial comparison between Skynet and IPFS, but your misunderstanding of IPFS is now evident, so maybe someone else can fill in the gaps?


Out of their way? Releasing a few pictures they already stored digitally hardly seems like a lot of effort.


Or the studio could have provided a single zip/tar for the data archivist. ; )


A torrent would be easiest on their servers.



