It is well-known that our intuition is not perfect. We are predictably irrational in countless ways in our everyday lives. But what about something a bit more sophisticated? Are there times when we use our reason, our capacity for extrapolation and prediction, and still fail because things are simply too complicated? This sort of situation seems to be embodied in a Quora question I recently came across: What is an example of a conjecture that was proven wrong for "very large" numbers?
Essentially, the questioner was interested in situations where a mathematical conjecture appeared to be true but failed only once advanced computational power tested cases far beyond human abilities.
And there are a lot of these. One of the more famous examples is the Pólya conjecture. It states that, given a number N, the fraction of numbers less than N with an even number of prime factors (counted with multiplicity) never exceeds the fraction with an odd number of prime factors. This seems to be true. Until, that is, you get to 906,150,257. Our intuition failed.
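To make the statement concrete, here is a minimal sketch of what "checking" the conjecture means (the function names are my own, and trial division is far too slow to ever reach the actual counterexample near 906 million):

```python
def prime_factor_count(n):
    """Omega(n): number of prime factors of n, counted with multiplicity."""
    count, d = 0, 2
    while d * d <= n:
        while n % d == 0:
            count += 1
            n //= d
        d += 1
    if n > 1:
        count += 1
    return count

def polya_first_failure(limit):
    """Scan N = 2..limit and return the first N at which the numbers with an
    even Omega outnumber those with an odd Omega, or None if the conjectured
    inequality holds throughout.  (1 counts as even: Omega(1) = 0.)"""
    even, odd = 1, 0  # tally for m = 1
    for m in range(2, limit + 1):
        if prime_factor_count(m) % 2 == 0:
            even += 1
        else:
            odd += 1
        if even > odd:
            return m
    return None

print(polya_first_failure(10_000))  # the conjecture holds this far
```

Run over any modest range, the scan finds no failure, which is exactly why the conjecture looked so plausible for so long.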
Another example is a conjecture from Euler:
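The Euler conjecture Singh discusses held that x^4 + y^4 + z^4 = w^4 has no solution in positive whole numbers. It stood for about two centuries before Noam Elkies found a counterexample in 1988; the smallest counterexample, found by Roger Frye shortly afterward, can be verified directly with exact integer arithmetic:

```python
# Frye's counterexample to Euler's quartic conjecture:
# the sum of these three fourth powers is itself a fourth power.
x, y, z, w = 95800, 217519, 414560, 422481
assert x**4 + y**4 + z**4 == w**4
print(f"{x}^4 + {y}^4 + {z}^4 = {w}^4")
```

Python's arbitrary-precision integers make the check exact, with no floating-point rounding to worry about.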
As Simon Singh notes, "The moral of the story is that you cannot use evidence from the first million numbers to prove absolutely a conjecture about all numbers."
These instances are fascinating. They remind us that large numbers are not infinity, and that a trend is not a proof. Our human brains are powerful, but we must increasingly work in concert with machines to help place bounds on our intuition.