It is widely assumed in SEO circles that Google uses toolbar data, among other sources, to find new URLs to crawl. They're very keen on getting every page on the public Internet into the crawl set, and user data gets them there faster and more reliably than a crawler relying only on, e.g., the observed link graph would.
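To make the mechanics concrete, here is a minimal Python sketch of the general idea: a crawl frontier that folds in candidate URLs from several discovery sources, whatever those happen to be (link extraction, sitemaps, or a hypothetical user-data feed). The function name and sample URLs are made up for illustration; this is not anything Google has published.

  from urllib.parse import urldefrag

  def merge_into_frontier(known_urls, *sources):
      # known_urls: set of URLs already in the crawl set
      # sources: iterables of candidate URLs (link extraction, a user-data feed, ...)
      for source in sources:
          for url in source:
              url, _fragment = urldefrag(url)  # normalize away #fragments
              if url not in known_urls:
                  known_urls.add(url)
                  yield url

  # A URL seen only in the (hypothetical) user-data feed enters the frontier
  # even though no crawled page links to it.
  link_graph_urls = ["https://example.com/a", "https://example.com/b"]
  user_data_urls = ["https://example.com/b", "https://example.com/unlinked"]
  print(list(merge_into_frontier(set(), link_graph_urls, user_data_urls)))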
Edit: I did some digging to see if I could find an authoritative source on this, and found that Matt Cutts, on his blog, specifically denies this particular use of toolbar data for expanding the crawl set. Mea maxima culpa.
http://www.mattcutts.com/blog/toolbar-indexing-debunk-post/
Edit the second: An amusing note on this general subject: Google will fuzz-test certain search forms on, e.g., high-value government websites to get at the juicy data behind them, which would not otherwise be reachable by just traversing the link graph.
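For the curious, here is a rough Python sketch of what that kind of form probing might look like: generate candidate query terms, submit them through a GET-based search form, and collect whatever result URLs come back. The endpoint, parameter name, and terms are invented, and this is a toy illustration of the technique rather than anything Google has documented about its crawler.

  import urllib.parse
  import urllib.request

  def probe_search_form(action_url, param, terms):
      # Submit each candidate term through a GET-based search form and
      # return (term, result_url, status) for whatever pages it exposes.
      discovered = []
      for term in terms:
          query = urllib.parse.urlencode({param: term})
          result_url = f"{action_url}?{query}"
          try:
              with urllib.request.urlopen(result_url, timeout=10) as resp:
                  discovered.append((term, result_url, resp.status))
          except OSError:
              pass  # a polite crawler would log, back off, and honor robots.txt
      return discovered

  # e.g. probe_search_form("https://records.example.gov/search", "q",
  #                        ["permit", "budget", "zoning"])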