PageRank is like a Markov chain: you iterate over the same dataset again and again until you're happy with the result. If 20 iterations is good enough, it's good enough. Good explanation here: http://www.iterativemapreduce.org/samples.html#Pagerank
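To make the "iterate until happy" idea concrete, here's a minimal single-machine sketch of the power-iteration that each MapReduce round corresponds to. The toy graph, damping factor, and fixed 20-iteration cutoff are illustrative assumptions, not anything from the linked sample.

```python
def pagerank(links, iterations=20, d=0.85):
    """links: dict mapping each node to the list of nodes it links to.

    Toy in-memory version; a MapReduce run would do one of these
    loop bodies per job, re-reading the same link data each round.
    (Dangling nodes are ignored here to keep the sketch short.)
    """
    nodes = set(links) | {t for targets in links.values() for t in targets}
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):          # same data, repeated passes
        new_rank = {node: (1.0 - d) / n for node in nodes}
        for src, targets in links.items():
            if targets:
                share = d * rank[src] / len(targets)
                for t in targets:        # spread rank along outlinks
                    new_rank[t] += share
        rank = new_rank
    return rank

ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
```

After 20 rounds the ranks have long since converged for a graph this small; in practice you'd stop once the per-iteration change drops below a tolerance rather than hardcoding the count.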
Whereas if you combine signals from multiple sources, each join is its own MR job, never mind the actual calculations.
From what I've read, Google's index is built in around 20 MapReduce jobs.