Command-line file sharing (transfer.sh)
226 points by masolino on Sept 30, 2014 | 79 comments



That's great. You can then encrypt stuff as follows (on a Mac):

  brew install gpg
  gpg --gen-key
  gpg -ac < file.unencrypted | curl https://transfer.sh/file.encrypted -T -
To decrypt, take the returned URL:

  curl https://transfer.sh/1nKXr/file.encrypted | gpg -ad
Someone can probably improve this!
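
One possible improvement, sketched here with a hypothetical recipient@example.com standing in for a key already in your keyring: encrypt to the recipient's public key instead, so no passphrase has to be shared out of band.

  # encrypt (-e) to the recipient's key (-r), ASCII-armored (-a), then upload
  gpg -ea -r recipient@example.com < file.unencrypted | curl https://transfer.sh/file.encrypted -T -

  # the recipient decrypts with their private key
  curl https://transfer.sh/1nKXr/file.encrypted | gpg -d > file.unencrypted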


OP here. Right, this is public/private key encryption. The sample on the site itself is symmetric encryption (using a password).


Handy tip for OS X users: after creating the "transfer" alias, simply use

  transfer file.txt | pbcopy 
to get the link copied to your clipboard!


For Linux users: install xclip from your distribution's package manager, then:

    transfer file.txt | xclip
The link is copied to your clipboard.
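
One caveat: by default xclip writes to the X primary selection (middle-click paste), not the clipboard proper. For Ctrl+V-style pasting:

    transfer file.txt | xclip -selection clipboard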


My usual procedure goes something like this:

    rsync -P foobar.png personal.sircmpwn.com:/var/html/
    # "hey dude, go download http://personal.sircmpwn.com/foobar.png"


I do this too, but I find it annoying not knowing if the file has been received and when it's safe for me to delete it.


"Did you get that file I sent you?" "Yes" "Great, thanks" $ ssh personal.sircmpwn.com rm /var/html/foobar.png


I often do this too, and I find that I often forget about the files and it turns into a big mess of outdated files, so I run a periodic cron job that deletes files older than a certain age. Works great for me.
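
A minimal sketch of such a job, assuming the uploads live in /var/html and anything older than 7 days can go:

  # crontab entry: every night at 03:00, delete uploads older than 7 days
  0 3 * * * find /var/html -type f -mtime +7 -delete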


In the next version I'll add the option to set a lifetime before deletion, as a number of days or a number of downloads. What about that?
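
If that lands as HTTP headers on the upload request, usage might look something like this (hypothetical header names; the interface isn't decided yet):

  curl --upload-file ./file.txt https://transfer.sh/file.txt -H "Max-Days: 2"
  curl --upload-file ./file.txt https://transfer.sh/file.txt -H "Max-Downloads: 1"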


I'd personally love a website that offered self-destructing URL pointers or downloads. So, a URL that redirects once to the target, that sort of thing.


You might not want to be too strict about it. I know I'll sometimes look at an attachment on the go, but not be able to download it until later when I'm at home.


I like how in the little example animation, hello.txt is uploaded and a URL is returned, but then a completely different URL is used to get hello.txt back.


OP here, good point. Will update the demo ;-)


You might also want to add a delay at the end of the animation before looping. As it is you only see the end result for a single frame.


Slow it down a bit too while you're at it.


How did you make the demo? I'd love to put something similar on http://ipinfo.io


Using tty2gif; I got it working on my Mac. I just need to find some time to push the changes to the repo.

  http://z24.github.io/tty2gif/


tty2gif author here. Could you send me a pull request with your Mac version? I'll merge it when I have time. Thanks for your efforts!


make it not loop!


Or at least add a delay at the end. It's not super hard to figure out what must be going on, but really the only thing you have time to see is the first line.


How do you intend to make money?


I was wondering the same thing. It seems like this will be abused very quickly and could end up costing a considerable amount of money.


Put in some text ads under the curl progress bar.


"These pills will make your dick grow like this progress bar!"


Haha, yes, I was thinking the same. Currently I'm just glad I can provide a useful service. There is no business case (yet).


curl puts the data from the server on stdout and progress on stderr. What's the process for getting server-sent ads to appear in curl's progress output?
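
For anyone unfamiliar with the split the parent describes, it's easy to verify by redirecting the two streams separately:

  # the body lands in the file via stdout; the progress meter goes to the log via stderr
  curl https://transfer.sh/1nKXr/file.txt > file.txt 2> progress.log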


I use CloudApp for OS X for sharing files and it's pretty convenient. The best part is sharing screenshots with it. You take a screenshot, it is automatically uploaded to the cloud (no pun intended) and the link is copied to your Clipboard. You could not ask for anything simpler.


I'm still looking for an app that does just that, but with my own server via scp/sftp instead of Dropbox or a third-party service. It's so convenient once you get used to it.

- https://news.ycombinator.com/item?id=8039708


I've used TinyGrab for this for quite some time: http://tinygrab.com/


You can do this by hitting Command-Shift-4 with Dropbox, too.

https://www.dropbox.com/en/help/1964


> You take a screenshot, it is automatically uploaded to my butt (no pun intended) and the link is copied to your Clipboard.

No pun detected ;)


I installed that plugin on my boss's computer, totally forgot about it, and he was about to reach out to a bunch of executives on LinkedIn to ask them what was up with their profiles when I finally told him about it. Joke almost went a little bit too far :)


It also has a CLI app:

    $ cloudapp -d myfile.pdf  # upload the file and copy the link to your clipboard


Which of all of these is the best? Which is most likely to still have my files after 5+ years?


The one I posted is mine and it's amateur; don't use it.

It will probably be around for 5 years, but it's not under active development; it was a side project I worked on for funsies (I may come back to it one day, and it's kind of a leave-it-be project, especially without hundreds of users).

Please refer to the other ones for a serious (or at least more well-thought-out) solution.


Purrrl files never expire.


I'm getting a 301 when I try to upload to chuck.io.


I use curl.io from time to time and it has a big drawback: you need to visit curl.io to get your upload URL. transfer.sh got it right.


http://purrrl.link generates the links automatically if you are uploading via the CLI.


I wish the URL could include a hash of the content, so it could be used to verify uploads and downloads. It would be handy for error checking and security checks.
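
In the meantime, a client-side approximation (a sketch; transfer.sh itself doesn't expose content hashes): record a checksum at upload time and verify after download.

  # sender: record the checksum, then upload
  sha256sum file.txt > file.txt.sha256
  curl --upload-file file.txt https://transfer.sh/file.txt

  # receiver: with file.txt.sha256 shared out of band, verify after download
  curl https://transfer.sh/1nKXr/file.txt -o file.txt
  sha256sum -c file.txt.sha256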


Better yet, if there were a wrapper for this, it could just operate on an abbreviated hash prefix (like a git ref) rather than a URL.

    $ transfer < ./foo
    acH39gew

    $ transfer acH39gew > ./foo
The ref would just be short for the URL, and you could still use the URL, of course.
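
A rough sketch of such a wrapper, assuming the random path token in the returned URL serves as the ref, with a local ref-to-URL map (the token alone doesn't encode the filename):

    # upload:   transfer < ./foo       -> prints a short ref like acH39gew
    # download: transfer acH39gew > ./foo
    transfer() {
        local db="$HOME/.transfer-refs"    # local ref -> URL map
        if [ $# -eq 1 ]; then
            # download: expand the ref back to its full URL and fetch it
            curl -s "$(awk -v r="$1" '$1 == r {print $2}' "$db")"
        else
            # upload: send stdin, keep the URL's random token as the ref
            local url ref
            url=$(curl -s --upload-file - https://transfer.sh/stdin)
            ref=${url#*transfer.sh/}; ref=${ref%%/*}
            echo "$ref $url" >> "$db"
            echo "$ref"
        fi
    }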


An even shorter / easier-to-remember version of the command (you don't have to specify the filename twice if you don't want to):

    curl -T ./filename transfer.sh


This will be abused for hosting questionable things in no time, if it's not already.


Yeah, nice target for drive-by malware uploaders.

A better approach than blanket-allowing hotlinking is to whitelist certain user agents (curl, wget) and only let those download the file directly, while presenting a landing page describing the file and its contents to everyone else. It's been on my list of things to implement in my own sharing site.
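
From the client side, such a gate would behave roughly like this (hypothetical host and paths; only the whitelisted tools get the raw bytes):

  # curl's default User-Agent is whitelisted: the raw file comes back
  curl https://example.com/abc123/file.bin -o file.bin

  # a browser-style User-Agent would get the HTML landing page instead
  curl -A "Mozilla/5.0" https://example.com/abc123/file.bin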


These are the people behind transfer.sh: https://github.com/dutchcoders/

...Based on some words I see on that page, I think they will anticipate abuse and roll with the punches.


Right you are! We will implement an interstitial page depending on the referer. Direct-download users won't have any issues, but to prevent abuse the page will be shown if the request comes from another site.


Nice, finally a hassle-free service. No registration, no proprietary clients, etc.


This makes me wonder about their future. I don't see any revenue streams, but I can see nontrivial expenses.


Hi... I wrote a wrapper tool, hopefully easy to use; for now it just has upload and download functions.

https://github.com/daineseh/py-transfer.sh

Ex. 1: Upload a file to http://transfer.sh

  $ ./pt.py -u /home/something/file1
It outputs a link for download:

  https://transfer.sh/19Xwp/file1

Ex. 2: Upload multiple files to http://transfer.sh

  $ ./pt.py -u /home/something/file1 /home/something/file2 /home/something/file3
It outputs one link per file:

  https://transfer.sh/19Xwp/file1
  https://transfer.sh/19Xwp/file2
  https://transfer.sh/19Xwp/file3

Ex. 3: Download multiple files from http://transfer.sh

  $ ./pt.py -d https://transfer.sh/19Xwp/file1 https://transfer.sh/1fn4k/file2
It prints the download status:

  Download ./file1 done.
  Download ./file2 done.

Ex. 4: Download multiple files from http://transfer.sh to a specified path

  $ ./pt.py -d https://transfer.sh/19Xwp/file1 https://transfer.sh/1fn4k/file2 -w /home/user/
It prints the download status:

  Download /home/user/file1 done.
  Download /home/user/file2 done.


I didn't know about tty2gif (http://z24.github.io/tty2gif/) -- that looks pretty cool.


It is, but it creates some amazingly huge gifs :)


Author here. Thanks for your comment; I've reduced the gif size quite a lot. Please take a look.


I really like http://fh.tl/ because you can also easily send shell output:

    [/] $ ls -la / | curl -F 'paste=<-' http://fh.tl/
    
    Your paste has been saved!
    
    Share this URL:
    
           http://fh.tl/MO
    
    [/] $ 

Helpful when trying to debug something on a remote worker's machine.


I was puzzled to see a symlinked libnss3.so in your root directory and couldn't help myself but google it. Here's the reason (bug in package ca-certificates-java): https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=688415


This also works with transfer.sh, e.g.:

  ls -la | curl --upload-file - https://transfer.sh/bla.txt


I would love to use this for my own files, without expiration, etc. Do you have any plans to open-source this?


Maybe I don't understand what this service actually does. When you take away the file expiration and the convenience of not having to manage your own infrastructure, what is left? I've had this script in ~/bin for years.

  #!/bin/sh
  SERVER="example.com"
  LOCATION="public_html/uploads"
  BASE_URL="http://$SERVER/uploads"

  scp "$1" "$SERVER:$LOCATION"
  # basename keeps the URL valid even when $1 includes a path
  echo "$BASE_URL/$(basename "$1")" | xclip -i


Add a cron job to that (to manage timed deletions), and you've got a very sweet setup! :-)


OP here. Yes, I have plans to open-source this. I'm just looking for the right awesome design; then it will be open-sourced.


It's quite heavyweight, but I just installed https://owncloud.org/ on my own play VPS recently. Very happy with it so far. It's basically open-source Dropbox.


I built one that doesn't expire files and we're planning to open source it: http://purrrl.link


I've been wanting something like this for semi-private image shares, as Imgur's URLs aren't very random; but transfer.sh has a similar security problem: it looks like the URLs need to be longer as a prerequisite to being un-crawlable. I hope you'll consider it.


Never expect things you upload to the internet to be private. There are so many vectors where this can go wrong: the host might list directory indexes, S3 buckets can be open, sites can be hacked, etc.


You did see the part where I said "semi-" private? I'm just suggesting OP might want to harden the URLs; otherwise the URLs might as well be /1 /2 /3.


Well, the filename is part of the URL, so you could give your files really long names.


OP here. The URLs are randomly generated and combined with the chosen filename; I assume this should be safe enough. Links expire in 2 weeks. I'm working on functionality to configure the lifetime, e.g. just 1 download or 2 days.


OP, I'm not an expert, but I think you need to introduce more entropy to prevent opportunistic crawling of every possible URL.
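
For a rough sense of scale, assuming the visible tokens are 5 alphanumeric characters like the 1nKXr seen above:

  # 62 possible characters (a-z, A-Z, 0-9) over 5 positions
  $ echo $((62**5))
  916132832

Under a billion candidates is within reach of a patient crawler, and every extra character only multiplies the space by 62.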


At least for common use cases like curl and wget you probably need a wrapper. I mean, I love the idea of never leaving my CLI, but... as a developer and bash user I'm lazy... so please don't make me type long curl commands. (Yeah, I know I can add some aliases, but a wrapper could be handy.) Anyway, good job :)
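
For reference, a minimal version of such a wrapper as a shell function (a sketch along the lines of the alias the site suggests, not necessarily the exact one):

  # upload a file and print the resulting URL
  transfer() {
    curl --progress-bar --upload-file "$1" "https://transfer.sh/$(basename "$1")"
    echo    # the server's response has no trailing newline
  }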


Are there any solutions similar to this and the others mentioned in the discussion that can be self-hosted?


The responsive navbar button is broken. Also, is there any reason to alias a function like in the sample?


Is there a way to upload an image without "attachment" in the Content-Disposition header?


Yes, this was done because it was abused. We will make some smarter changes that will allow removing the Content-Disposition header.


Thank you for this.



