Largest prime number found?

Over the past few months, I have been inundated with tweets about the largest prime number ever found. That number, according to Nature News, is 2^{57,885,161}-1. This is certainly a very large prime number and one would think that we would need a supercomputer to find a prime number larger than this one. In fact, Nature mentions that there are infinitely many prime numbers, but the powerful prime number theorem doesn’t tell us how to find them!
Well, I am here to tell you of the discovery of the new largest prime number ever found, which I will call P_{euclid}. Here it is:

P_{euclid} = 2\cdot 3\cdot 5\cdot 7\cdot 11 \cdot \cdots \cdot (2^{57,885,161}-1) +1.

This number, the product of all prime numbers known so far plus one, is so large that I can’t even write it down on this blog post. But it is certainly (proof left as an exercise…!) a prime number (see Problem 4 in The allure of elegance) and definitely larger than the one getting all the hype. Finally, I will be getting published in Nature!

In the meantime, if you are looking for a real challenge, calculate how many digits my prime number has in base 10. Whoever gets it right (within an order of magnitude) will be my co-author in the shortest Nature paper ever written.

Update 2: I read somewhere that in order to draw attention to your blog posts, you should sprinkle them with grammatical errors and let the commenters do the rest for you. I wish I were mastermind-y enough to engineer this post in this fashion. Instead, I get the feeling that someone will run a primality test on P_{euclid} just to prove me wrong. Well, what are you waiting for? In the meantime, another challenge: What is the smallest number (ballpark it using the Prime Number Theorem) of primes we need to multiply together before adding one, in order to have a number with a larger prime factor than 2^{57,885,161}-1?

Update: The number P_{euclid} given above may not be prime itself, as pointed out quickly by Steve Flammia, Georg and Graeme Smith. But, it does contain within it the new largest prime number ever known, which may be the number itself. Now, if only we had a quantum computer to factor numbers quickly…Wait, wasn’t there a polynomial time primality test?
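As the Update concedes, the construction already fails for small cases. A minimal sketch in plain Python (trial division, so only good for tiny examples; the helper names are mine) checks the first few "Euclid numbers":

```python
def is_prime(n):
    """Plain trial division; fine for the tiny numbers below."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def euclid_number(n):
    """Product of the first n primes, plus one."""
    prod, k, found = 1, 2, 0
    while found < n:
        if is_prime(k):
            prod *= k
            found += 1
        k += 1
    return prod + 1

for n in range(1, 8):
    e = euclid_number(n)
    print(n, e, "prime" if is_prime(e) else "composite")
# n = 6 gives 30031 = 59 * 509, the standard counterexample.
```

The first five Euclid numbers (3, 7, 31, 211, 2311) happen to be prime, which is exactly the trap: the pattern breaks at n = 6.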

Note: The number mentioned is the largest known Mersenne prime. That Mersenne primes are crazy hard to find is an awesome problem in number theory.
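For Mersenne numbers specifically, primality can be decided with the classical Lucas-Lehmer test: start from s = 4 and square-and-subtract-2 modulo M_p. A minimal sketch (nothing like the optimized big-integer arithmetic real searches use):

```python
def lucas_lehmer(p):
    """Lucas-Lehmer test for M_p = 2**p - 1; assumes p is an odd prime."""
    m = 2**p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m  # one squaring step of the Lucas-Lehmer recurrence
    return s == 0  # M_p is prime iff the final residue vanishes

# M_7 = 127 is prime; M_11 = 2047 = 23 * 89 is composite.
print(lucas_lehmer(7), lucas_lehmer(11))  # True False
```

Note the test needs only p - 2 modular squarings, which is why numbers of this special form with tens of millions of digits can be certified at all.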

38 thoughts on “Largest prime number found?”

  1. Ummm, I don’t see why that number is prime. N!+1 isn’t prime in general. Consider 6!+1 = 7*103. Is there a special property of Mersenne primes which makes the statement true?

  2. While your P_{Euclid} certainly has a prime factor larger than 2^{57,885,161}-1, it need not itself be prime. For example, 2*3*5*7*11*13+1 = 30,031 = 59*509.

  3. Sorry, I misspoke above, but the point still stands. Consider the “primorial” function P(n) which returns the product of the first n prime numbers. Then P(n)+1 is not in general prime. P(6)+1 = 30031=59*509.

  4. Spiros, the product of the first n primes plus 1 doesn’t have to be prime itself. It’s just relatively prime with all of the first n primes. Consider: 2*3*5*7*11*13 + 1 = 30031, which is divisible by 59. So, while 2^{57885161} – 1 is the largest known Mersenne prime, it’s also the largest known prime. Actually, the 10 largest known primes are Mersennes (11th isn’t).

  5. I think you’ve fallen into a common trap here. Euclid’s proof only tells you that the product of a list of primes plus one is not divisible by anything on the list. It doesn’t have to be prime itself, as long as its prime factors are not on the list (for example, 2*3*5*7*11*13+1 is divisible by 59). Your P_euclid definitely contains a bigger prime factor β€” the real challenge is figuring out what that is!

  6. Your Euclid number is not a product of the first N primes, you are cherry-picking. For example, 2*7+1 is not a prime. It even has a divisor smaller than 7.

      • Technically, you defined P_euclid as “the product of all prime numbers known so far plus one”, which misses out on unknown primes less than 2^{57,885,161}-1.

        • Interesting and valid point! I don’t even remember saying “known primes”, but either way I wonder if there are in fact primes smaller than the one listed which we don’t know about… Again, great point.

          • I’m certain there are plenty (an understatement) of primes we don’t know about below this value, since the search for large primes has focused almost entirely on Mersenne primes (see

            Quote: “Why Mersennes? Because the way the largest numbers N are proven prime is based on the factorizations of either N+1 or N-1, and for Mersennes the factorization of N+1 is as trivial as possible (a power of two).”

            • Also, check the link for a list of the largest known Primorial primes, which are primes of the form 2*3*5*7*…*p_n ± 1.

            • Thanks Anzel. The link is very interesting. Primorial primes have a name after all… Now I want to know if there is a pattern for which primorial numbers are prime!

            • Also, this may have some errors, but here’s a rough estimate of the number of digits of P_{euclid}. Write log = log10, and let M_{48} = 2^{57,885,161}-1 be the newest (48th known) Mersenne prime. Then
              # digits = log(P_{euclid})
              ≈ log(product of all primes up to M_{48})
              = sum over all primes p_i ≤ M_{48} of log(p_i).
              So as a first estimate, we can ask how many primes there are with 1 digit, 2 digits, 3 digits, … up to log(M_{48}) digits:
              ≈ 1*(pi(10) - pi(1)) + 2*(pi(100) - pi(10)) + 3*(pi(1000) - pi(100)) + …
              Turning this into an integral,
              ≈ int_epsilon^{log(M_{48})} x*(pi(10^x) - pi(10^{x-1})) dx.

              Note the lower limit is essentially irrelevant (the values there are small). Using the most basic approximation pi(x) ~ x/ln(x), the integral evaluates to
              (9/10)*10^{log(M_{48})}/ln(10)^2 + (1/log(10))*Ei((log(M_{48}) - 1)*ln(10)).

              Since Ei(u) ~ exp(u)/u for large u, this is
              ≈ (9/10)*M_{48}/ln(10)^2 + (1/10)*M_{48}/ln(10)^2/(log(M_{48}) - 1)
              ≈ (9/10)*M_{48}/ln(10)^2
              ≈ 0.17*M_{48} = 0.17*10^{log(2)*57,885,161},
              so # digits ~ 1E17425169, i.e. P_{euclid} ~ 10^{M_{48}}.

              Having gone through all of that, I found a reference which states that ln(p#) ~ p, so the real answer is about 0.43*M_{48} digits. Oh well, close enough.

            • Thanks for doing all the hard work Paul! I guess I was right about one thing: this number is a massive beast that wouldn’t fit inside this blog post (10^{17.4 million} digits would require many universes to lend me their WordPress space). Now, what is the last digit of P_{euclid}?

  7. Whoops, I did the Riemann sum incorrectly. The real answer is
    # digits ~ \int^{\log(M_{48})} x \, (d/dx) \pi(10^x) \, dx,
    which simplifies to the correct answer (taking \pi(x) ~ x/\ln(x)).

    The final digit is 7. Try and prove me wrong.

  8. Dear Spiros,

    I used to use all the Linux computers in the physics high-energy theory department to run the very same software to find the Mersenne primes – an enthusiasm for this particular question I no longer quite understand – and as a result, I can assure you of many things, for example that the number 2^{57,885,161} – 1 definitely *is* the largest known prime, not only the largest known Mersenne prime.

    Others have explained to you why your “Euclid” number almost certainly isn’t a prime. But there’s a subtler point about which you’re totally wrong as well. A note of yours at the end says:

    Note: The number mentioned is the largest known Mersenne prime. That Mersenne primes are crazy hard to find is an awesome problem in number theory.

    But Spiros, this is a complete misunderstanding of the role of Mersenne primes among all primes. The Mersenne primes are, on the contrary, the *easiest* large primes to look for! That’s partly why the Mersenne prime project exists: it is simultaneously the search for the largest known prime. Indeed, the largest known Mersenne prime has also been the largest known prime for quite some time now, without interruption!

    Finally, if this text of yours was meant as a real claim and not just a self-evident artificial error meant to build traffic, and if the “correction” was a real correction, then I must say that it is a very lame attempt to mask your previous argument’s lethal error. Of course, a random number incredibly larger than the currently largest known prime will contain some very large prime factors. But so will all other random very large numbers, such as 10^{10^{100}}+1.

    If you pick the “product of all primes up to the largest one plus one”, you only guarantee that the result isn’t divisible by any of the known primes in the product (most primes below the largest known prime are themselves unknown – whether the numbers below it are prime has, of course, mostly never been checked). But this is a negligible fraction of the tests one has to run in order to check whether a large number is a prime. Your “Euclid” number is a number of the sort 10^{10^{10,000,000}} and the number of its potential prime divisors is pretty much a number of the same sort, 10^{10^{10,000,000}}. But your formula has only eliminated 10^{10,000,000} or so candidates, a hierarchically exponentially tiny fraction of the tests!

    So e.g. my claim that there is a larger prime number, and that it may be obtained as a factor of 10^{10^{100}}+1, or of any large number without an obvious complete factorization, is exactly “as good” as your claim about the Euclid number. The large number in that sentence may be replaced by any comparably large number. However, this kind of prescription – the number is a factor of an even larger number – is what deserves the label of a “crazy hard problem”, unlike the problem of finding a Mersenne prime. That’s why cryptographic algorithms are built on the practical impossibility of recovering the primes p, q from p*q in realistic time!

    If your text is serious, I must say that pretty much all of its far-reaching messages, much like its main technical claim, are incorrect. It’s still brave that you discussed it, because such things should be discussed and should be more widely known, at least among the broader math-trained public, but your contribution towards this goal was just a provoking sequence of errors. 😉


    • Dear Lubos,

      Indeed, the post was meant to be a provocative attempt to shed more light on the search for large prime numbers, following the previous great post about cryptography. Fortunately, I made a grave mistake about the consequences of Euclid’s algorithm and something interesting happened: I received comments that were far more informative and engaging than the post itself. I learned about Primorial primes, that there are huge gaps in our knowledge of primes, that Mersenne primes are really difficult to find but also the easiest targets when looking for large primes and that you once spent your days looking for them with a cluster of Unix machines.

      The teacher became the student again. Not bad for a lame little post.

      • It’s a great success then, Spiros. πŸ˜‰ You may want to run a text about a theory of everything, maybe it will lead to a discussion that will complete this task.

      • Incidentally, you’re into algorithms, sort of, so you may want to try to understand the Lucas-Lehmer test

        that is used (along with some other test, to optimize the timing) to decide whether the Mersenne numbers are primes. Of course, the first thing to do is to notice that for 2^p-1 to be prime, p itself has to be a prime: for p=KL, 2^K-1 would be a divisor of 2^p-1.

        I once understood why this test works, but I forgot it and won’t study it again today. At any rate, it’s a special clever structure, a special algorithm, and this extra structure makes it easier, not harder, to decide about the primality of the Mersenne numbers.

        Just imagine how amazing it is that we may check whether a number with tens of millions of digits is a prime. Normally, by brute force, we would need to check all potential divisors up to sqrt(candidate) – so many candidates that their count is itself a number with tens of millions of digits (well, half as many as the original).

        One clearly can’t make that many operations: when the brute force is the only recipe, i.e. when one talks about “generic” candidates for primes, they could have at most dozens of digits if you want the calculation to take a shorter time than the time that remains for the Sun to go red giant. πŸ˜‰ But for the Mersenne primes, we’re remarkably able to decide about the primality of numbers that have tens of millions (not just hundreds!) of digits. Isn’t it amazing? To a certain extent, the Mersenne primes are “semi-constructive”: we may pretty much generate large primes of this sort while the remaining work to be done is vastly smaller than the verification of primality for a generic number.

        So to a limited extent, the Mersenne primes are exactly playing the role you wanted to assign to the Euclid prime – except that the Mersenne numbers that pass the Lucas-Lehmer tests are really primes while your Euclid prime is not. πŸ˜‰


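The digit estimate in the thread above ultimately rests on ln(p#) ~ p, where p# is the primorial of all primes up to p (this sum of logarithms is Chebyshev’s theta function). A quick numerical sketch (plain trial division, small cutoffs; the helper name is mine) checks that asymptotic:

```python
import math

def log_primorial(x):
    """ln of the product of all primes up to x (Chebyshev's theta function)."""
    total = 0.0
    for n in range(2, x + 1):
        if all(n % d for d in range(2, math.isqrt(n) + 1)):  # n is prime
            total += math.log(n)
    return total

# The prime number theorem gives theta(x) ~ x, i.e. ln(p#) ~ p, so a primorial
# built from all primes up to M has roughly M / ln(10) ~ 0.434 * M digits.
for x in (10**3, 10**4, 10**5):
    print(x, log_primorial(x) / x)  # ratios creep toward 1 as x grows
```

For the cutoffs above the ratio is already within a few percent of 1, consistent with the ~0.43*M_{48}-digit answer quoted in the thread.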
