Uses for cURL (httpkit.com)
339 points by KrisJordan on Nov 9, 2012 | 71 comments



I would strongly suggest HTTPie for the majority of these tasks: https://github.com/jkbr/httpie


And I will continue to pitch our curlish instead, which does not replace curl but augments it: http://packages.python.org/curlish/

It just wraps curl and adds some nicer options, cookie handling, OAuth and colorizing to it. Same command line you are already used to, but with some extras that simplify life and make it more enjoyable.


May I ask why? What benefits does HTTPie have?


The GitHub page does a good job of highlighting all of the benefits HTTPie has if you're already familiar with cURL. But for me it's the design of the interface with syntax highlighting being the sugar on top.

See: https://github.com/jkbr/httpie#interface-design , https://github.com/jkbr/httpie#redirected-input , https://github.com/jkbr/httpie#usage


I prefer HTTPie because after years of using curl, I still have to Google simple use cases. HTTPie has a very simple and intuitive API.


Apparently it's for humans.


Thanks for this recommendation. I still value tips on using cURL because it is practically ubiquitous, which is helpful when you are logged into some random server, but HTTPie definitely has very common-sense usage and I think it will be very useful as a testing tool on my local machine.


Strongly agree. For Windows users like me, I recommend installing Python and pip using Chocolatey NuGet and then doing a `pip install httpie` to avoid Windows Python dependency hell and still get HTTPie.


Here's a pitch for another nice one: http://echohttp.com/


> If you only care about headers use the -I flag and the response body will be hidden

That is actually wrong. The -I flag sets the request method to HEAD. So in some cases it will return different headers than a normal GET request (and some servers don't implement HEAD responses at all).


That's why I have this in my shell config and use it daily:

  alias h='curl -sIX GET -w "Total time: %{time_total} s\n"'
It issues a GET request, prints only the response headers, and displays the time the request took.
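Usage is then just:

    h http://www.example.com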


Very true, but let's not forget that [0]

> The HEAD method is identical to GET except that the server MUST NOT return a message-body in the response.

and that

> The metainformation contained in the HTTP headers in response to a HEAD request SHOULD be identical to the information sent in response to a GET request.

[0]: http://www.w3.org/Protocols/rfc2616/rfc2616-sec9.html


Yes, it should work the same. But it's not the same.


Ah, great catch. Added a note about having to specify the method explicitly / that -I uses HEAD implicitly. Thanks!


A better way to view only the headers of a response is to use the -o flag to redirect the body and -D to redirect the response headers. When I want to print only the headers to stdout I do

    curl -o/dev/null -D- http://www.example.com
-I does not work with methods other than GET. This does the job.
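For example (placeholder URL and form data), the same trick keeps the headers visible even for a request with a method and body that -I would choke on:

    curl -o /dev/null -D - -X POST -d 'name=value' http://www.example.com/form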


So you can use -I with another method, but as soon as you add a body curl refuses. Went ahead and just took that scenario out of the post -- but your redirection snippet is great.


Why not just use -i?


-i mixes the header and the body of the response into one stream. If you want only the header it won't help.


Note that -I and -i are different:

-i, --include (HTTP) Include the HTTP-header in the output. The HTTP-header includes things like server-name, date of the document, HTTP-version and more...

-I, --head (HTTP/FTP/FILE) Fetch the HTTP-header only! HTTP-servers feature the command HEAD which this uses to get nothing but the header of a document. When used on an FTP or FILE file, curl displays the file size and last modification time only.
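To make the difference concrete, a quick side-by-side (against a placeholder host):

    # -i: issues a GET, prints the response headers followed by the body
    curl -i http://www.example.com

    # -I: issues a HEAD, prints only the response headers
    curl -I http://www.example.com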


If you're on Windows and don't feel like using the command line you can click around in Fiddler to achieve similar things. The help is much shorter: http://www.fiddler2.com/fiddler/help/composer.asp

I wasn't able to change the Host header successfully, though. A workaround is needed: https://groups.google.com/forum/?fromgroups=#!topic/httpfidd...


> If you're on windows and don't feel like using the commandline

Just install gow ( https://github.com/bmatzelle/gow ) and you'll be able to run all of those examples as intended. The command line becomes very pleasant on Windows after that one little install. It's very lightweight and well designed.


Does gow support curses yet? Last I used it, it broke my zsh completion.

(I just use Cygwin now, though, because mintty beats CSRSS all hollow.)


How does that compare to mingw32/MSYS?

http://mingw.org/


Nice. I like the way examples were constructed.

BTW, here are some more cURL tips I noted down a while back, which I found useful for daily work: http://laktek.com/2012/03/12/curl-tips-for-daily-use/


The site does not work on the iPhone at all if you want to zoom in to read the text. If you start to touch and scroll, the menu that was hidden away on the left decides to rear its ugly head and makes the site completely unreadable.

I'm pretty frustrated with sites that don't just have a basic two-column layout. Is this a template theme? Otherwise why would you waste so much time on a left column that reflows and messes up the browser experience? Or better yet, why not test this on an actual mobile browser?

I am not trying to harp on this site or the author specifically because there are certainly other offenders. This site though is quite annoying because once you finally think you got that stupid menu out of the way, BOOM, it pops right back and ruins the site again.

This is both a rant and a notice to the author since most people probably got too fed up to tell him about this problem.


Thanks for the heads up. Made a quick fix by taking out the affix on the column. Tried to get this up pretty quickly with Bootstrap, need to spend some more time on it this weekend. Sorry for the annoyance.


This title is a bit misleading. I was assuming that I'd be reading about how to use something that wasn't curl?


It got changed, sorry. Original title was "9 uses for cURL everyone should know". Perhaps a bit assumptive.


Ah, thanks for the info, and nice examples! The main title on the page is now "9 uses for cURL worth knowing" for any mods watching.. :)


In my opinion that's a much better title. Thanks for at least submitting it with an informative title.


Check out the guidelines here: http://ycombinator.com/newsguidelines.html

They suggest you drop the number and go with something like "Uses for cURL everyone should know" instead.


Yep, you're right. Duly noted, thanks.


I knew 8 of the 9, but the one I didn't (testing virtual hosts) was worth the cost of reading the entire article. Thanks.


That will only work with named vhosts. Outside of shared hosting / personal web servers, almost everyone uses IP-defined vhosts.


Well, doesn't everyone do "their own" shared hosting? With IPv4 getting more and more scarce, and before IPv6 is viable, I'll definitely keep using named vhosts for stuff. Add in SNI, and it even works with SSL. Sort of.


I just tweak my /etc/hosts to do this.
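For anyone who hasn't done it before, the entry is just the target server's IP (a placeholder below) mapped to the production hostname:

    # /etc/hosts
    192.0.2.10   www.example.net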


> Test Virtual Hosts, Avoid DNS

> With cURL just point the request at your host’s IP address

It's not even necessary to manually look up the IP, since a -H "Host: ..." option takes precedence over the Host header curl would derive from the URL:

  curl server1.example.com -H Host:\ www.example.net
looks up server1.example.com and connects to that IP with the given Host: header. Just try the "-v" option to see what's going on.
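For completeness, newer curl versions also have a --resolve option that pins a hostname and port to a specific IP without touching DNS or /etc/hosts (the IP below is a placeholder):

    curl -v --resolve www.example.net:80:192.0.2.10 http://www.example.net/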


Fantastic, there's also a grand tool called httpie that's a bit nicer than curl https://github.com/jkbr/httpie


I generally use wget myself. For basic HTTP debugging needs, I run ":%!wget -Sd http://www.example.com" inside a new vim buffer. Then I can read the Varnish headers or whatever and figure out what's going on.

I ran into an issue where the SSL implementation was a bit dated, though, and didn't recognize how a GoDaddy cert implemented multiple hostnames -- but it turned out to follow the standard. wget was just lacking in its implementation and reporting an error when the cert was fine.


Curl is great to have, but remembering all the option flags is a pain. Nowadays I use:

Chrome: "Postman" extension

Firefox: "REST Client" addon

...both great utilities for creating and saving any HTTP request you need.


man curl


I'm sure there must be other sites documenting this too, but that's a very well written and prioritized bit of documentation - nicely done.

Your echo service is pretty nifty too.


It is. One thing I just noticed, though: the echo service indicates that docs are available at http://httpkit.com/echo but that URI is a 404.


Good catch -- need to get that page up :) Built echo primarily to help me test wiretap so it hasn't gotten much documentation love. Will get that up this weekend.


Real HTTP from the command line is 'telnet localhost 80'.


'nc localhost 80' FTFY


    exec 5<>"/dev/tcp/localhost/80"
    echo -ne "GET / HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n" >&5
    cat <&5
FTFY


/dev/tcp is a bashism. It's not a real (kernel-provided) device.


Clever, though nc really is better than telnet. It is much easier to pipe into just for starters.


Actually, you broke it. telnet sends CRLF line breaks like you're supposed to. nc will just send LF. A strict web server won't talk to you.
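One way around that (a sketch; swap in whatever host and path you're testing) is to write the request with explicit CRLFs and pipe it into nc:

    printf 'GET / HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n' | nc localhost 80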


This can be combined well with http://news.ycombinator.com/item?id=4762444 :)


Nice!!! Here is one more when it comes to working with the Internet of Things: http://cosm.com/docs/quickstart/curl.html Using this I was inspired and managed to do this: http://www.agilart.com/blog/agilart-programs-using-cosm #YAY


I never learnt cURL, and for that matter missed out on many command-line tools, because I was too lazy to read boring, long man pages. This is a perfect example of how a man page should be: all the options explained one by one with simple examples! It took me hardly 5 minutes, and now I feel confident using cURL the next time I need it.


This is a great resource. We use curl for sample API calls in our docs (developers.box.com/docs) because it's ubiquitous, but we've come to learn over time that knowledge of how to use curl is not ubiquitous.

Thanks for making this!


I can add the -I parameter to these. It sends a HEAD request, which only returns headers; it works better than -i when you don't need the response body.


`curl --head http://google.com` is useful just to get the response headers back.


And if you prefer Ruby, try out htty: https://github.com/htty/htty


Recommended, I use it to test all my REST apps. The project needs a breath of fresh air and to get moving with its feature roadmap.


Does someone know how to use the Google Analytics API with cURL?

btw: nice overview!


Helpful. I 'know' these, but this is a good cheat sheet to keep around.

Really like the casual plug of your new project; I signed up :-)


Why do people like cURL? I've regularly found it to have stupid defaults compared to wget. Is it because it's default on Mac and wget isn't?


cURL is worth learning in my experience because it is bulletproof, ubiquitous, fast, highly configurable, and comprehensive. No other option listed here is all of those things, though they are all easier to use. But start talking about SOCKS5 proxies or FTPS and all of a sudden you have to start monkey-patching your (pretty) tool or, you know, learning curl.

Seriously, how often are you guys performing PUTs and DELETEs manually from the command line? `man` up and write a wrapper.
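A rough sketch of such a wrapper (the function names and flags are just one way to do it):

    # thin wrappers around curl for the common REST verbs
    put()    { curl -sS -X PUT    -d "$2" "$1"; }
    post()   { curl -sS -X POST   -d "$2" "$1"; }
    delete() { curl -sS -X DELETE         "$1"; }

    # usage: put http://www.example.com/widgets/1 'name=foo'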


I didn't realize there was such a discrepancy in proxy functionality; that's exactly what I was looking for.


As I tend to use it in very simple situations, I happen to like curl's default use of stdout. Not a very compelling reason, I know.


I hope tomorrow I'll see an even more useful article on sending email from the command line! Wow. /sarcasm


You know, sending email from the command line isn't quite trivial if you want to send anything other than ASCII. Telnet[0] to port 25 can be a lot of fun (alternatively openssl s_client[1] -starttls smtp ...).

For anyone not already familiar with this I'd recommend having a look at heirloom mailx[2].

Note, you can of course also pipe stuff to "/usr/bin/sendmail -t", which I'm sure was the thing claudio was alluding to.
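i.e. something along these lines (addresses and body are placeholders):

    printf 'To: user@example.com\nSubject: test\n\nHello from the command line.\n' | /usr/bin/sendmail -t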

[0] See eg: http://www.freebsdwiki.net/index.php/SMTP,_testing_via_Telne... vs: http://www.pcvr.nl/tcpip/smtp_sim.htm

[1] http://www.madboa.com/geek/openssl/#cs-smtp

[2] http://heirloom.sourceforge.net/mailx.html


I'd be interested to know exactly what you mean (I usually don't really get sarcasm). It seems that you think this advice is useless, or that people never use the command line, or something, but I don't really know what.

Could you be more specific? I find it useful to understand different points of view.

Thanks


From the guidelines: On-Topic: Anything that good hackers would find interesting.

My guess is that claudio believes that good hackers would not find this article interesting. I somewhat agree, as I feel the article adds little to what can already be found in the cURL man page. Having said that, I do think there is worth in having good examples, which this article certainly has.

It's a little disheartening to see this at the top of HN though.


Site is "Hacker News", not "How-tos for dummies", last time I checked. Do we really need this kind of stuff on the home page? Hardly so. What about an article on how to create a three-column layout with CSS? Or how to edit a file using a text editor? Come on... how to use curl? Really?


Seeing that 200+ people upvoted, it would suggest that enough people found it useful.

I will admit that I am not familiar with curl and I found this helpful and enlightening. If you don't find it useful, then just move on. We come to HN to learn about aspects that we are not familiar with. Just be mindful that not everyone has the same expertise.


Not every hacker knows every tool well. I wouldn't sweat it.



