In the age of computers, there will be fewer fun surprises like that.
What surprises me is that the Euler-Mascheroni constant, which shows up all over number theory and also in probability (the coupon collector's problem, the entropy of the Weibull and Lévy distributions), is so difficult to work with that not only has no one proven it transcendental, it hasn't even been proven irrational.
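To make the coupon collector connection concrete: γ is the limit of H_n − ln n (where H_n is the n-th harmonic number), and the expected number of draws to collect all n coupons is exactly n·H_n ≈ n ln n + γn. A minimal Python sketch of both facts (the helper name `harmonic` is just mine):

    import math

    GAMMA = 0.5772156649015329  # Euler-Mascheroni constant, to double precision

    def harmonic(n):
        # H_n = 1 + 1/2 + ... + 1/n
        return sum(1.0 / k for k in range(1, n + 1))

    # H_n - ln(n) converges (slowly, roughly like 1/(2n)) to gamma.
    for n in (10, 1_000, 100_000):
        print(n, harmonic(n) - math.log(n))

    # Coupon collector: expected draws to see all n coupons is n * H_n,
    # well approximated by n*ln(n) + gamma*n.
    n = 1_000
    print(n * harmonic(n), n * math.log(n) + GAMMA * n)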
If you can compute a number's digits (i.e., approximate it), then you can check whether they eventually settle into a repeating cycle. If you compute a billion digits and see no repetition, and you have no reason to expect the number to be a fraction with a huge denominator, then a good conjecture is that the number is irrational; conversely, if the digits do start repeating, conjecture that it's rational. You'll be right more often than not.
(Of course, like many heuristics, this one is easily exploitable by an adversary trying to make you give the wrong answer. And clearly this is not a proof, just a rule of thumb where you expect to be wrong now and then, but not usually.)
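Here's a minimal sketch of the heuristic (the function name `find_period` and the cutoffs are mine): it scans the tail of a digit string for a repeating cycle, skipping the first half so any non-repeating pre-period is ignored.

    def find_period(digits, max_period=1000):
        # Return a period length p if the tail of `digits` repeats
        # every p digits, else None.
        tail = digits[len(digits) // 2:]
        for p in range(1, min(max_period, len(tail) // 2) + 1):
            # tail is periodic with period p iff shifting it by p changes nothing
            if tail[p:] == tail[:-p]:
                return p
        return None

    print(find_period("142857" * 50))                     # 1/7: period 6
    print(find_period("141592653589793238462643383279"))  # pi: None

A serious version would want far more digits than candidate period lengths, since a fraction with denominator q can have a period as long as q − 1.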
The only thing that computers add is the ability to check that a number's digits don't repeat out to far more decimal places than a human working on their own ever could, or to apply any other technique that requires brute-force calculation on a large scale.
I admit I was surprised that "Most (but not all) interesting numbers admit a polynomial-time algorithm to compute their digits." But this is a really good reason why computers would make a difference.
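For example, with the mpmath library (mp.dps and mp.euler are its standard precision setting and built-in Euler-Mascheroni constant), tens of thousands of digits of γ come back quickly on an ordinary machine:

    from mpmath import mp

    # Constants like mp.euler are recomputed at whatever working
    # precision is currently set.
    mp.dps = 10_000             # 10,000 significant decimal digits
    digits = str(mp.euler)
    print(digits[:50] + "...")  # -> 0.57721566490153286...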
Computers don't make it much easier to prove. But when you can quickly compute the first few billion digits, you can at least look for repetition and conjecture whether a number is rational.
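And repetition is exactly the right signal: in long division by q there are only q possible remainders, so the digits of any fraction must become periodic with period below q, and the moment a remainder recurs you've found the cycle. A small self-contained sketch (the function name is mine):

    def decimal_digits(p, q, n):
        # First n digits of p/q after the decimal point, via long division.
        # Also returns (start, end) of the repeating cycle if a remainder
        # recurs within n steps.
        seen = {}   # remainder -> step at which it first appeared
        digits = []
        r = p % q
        for i in range(n):
            if r in seen:
                return "".join(digits), (seen[r], i)  # period = i - seen[r]
            seen[r] = i
            r *= 10
            digits.append(str(r // q))
            r %= q
        return "".join(digits), None

    print(decimal_digits(1, 7, 20))   # ('142857', (0, 6)): period 6
    print(decimal_digits(1, 6, 20))   # ('16', (1, 2)): pre-period 1, then 6 repeats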