ECC: collusion or blindness?

 

Vancouver, British Columbia
September 8, 2017

The large math library may not be commercialized without a license. The Boren integer factorization utility is free and available for download at the bottom of this page. It is provided for research and development.

This article is based on a paper presented at the IEEE Smart World CyberTrust workshop on August 4, 2017 and published by IEEE Advanced Trusted Computing.

http://www.wnlabs.com/news/IEEE_copyright_WN_papers.php

http://www.wnlabs.com/pdf/IEEE-ATC-R-CyberTrust-27.pdf

Download the Whitenoise Boren Newton bisection-and-test integer factorization utility:

RAPID ECC FACTORIZATION RESEARCH MATERIALS: factoring executable, big math library, scope documents, code etc.

Download this web page as a pdf for easier reading and research: http://www.wnlabs.com/pdf/ECC_is_easier_to_break.pdf

 

Article

abrisson@wnlabs.com

Some say that there may have been collusion between governments, standards bodies and prominent companies that actually compromised general security by convincing people that Elliptic Curve Cryptography (ECC) was a legitimate replacement for RSA Integer Factorization Cryptography (IFC). Security experts globally followed the shiny object and swallowed the red herring. It has long been publicly speculated that an approved ECC random number generator had a backdoor. It wasn't needed. They already had the key to the front door, validated by the same break approaches that downgraded RSA in the first place. The bisect-and-test approach of the Whitenoise Boren factoring technique solves the same challenge much more quickly.

ECC became the standard replacing RSA.

RSA ran factoring challenges throughout the 90s that examined the difficulty of factoring, or breaking, RSA keys. These contests existed because the industry needed to predict how long a cryptographic algorithm could safely be used. The intent of the original Advanced Encryption Standard (AES) process was to identify an algorithm that would be secure for seventy years and then use it for only thirty. We see how important this is every day: the single Equifax hack severely compromised almost half of all Americans.

Long before the advent of quantum computing, NIST and NSA downgraded the use of RSA Integer Factorization Cryptography because it was obvious that RSA was degraded and very vulnerable. Read the notes in their chart. In 2015 NIST and NSA announced they intended to transition away from the current cipher suite, which includes both RSA and ECC. Why did they include ECC?

Elliptic Curve Cryptography became the standard based on three rationales. Functioning democracies have to question whether there were any additional motives.

The first rationale was definitive knowledge that RSA could be factored.

The second rationale was a red herring. Because elliptic curve mathematics is even more complicated than the mathematics RSA uses in asymmetric contexts, and because ECC is harder for system administrators to understand, it was harder to challenge ECC's efficacy. It was blindly accepted that ECC cryptography is safe to use with much smaller moduli than RSA's simply because ECC is more complicated.

One argument is that it is still difficult to process numbers that large on regular computers. This isn't true. The materials in the challenge issued below include a large number math library that allows mathematical processes, like factorization and manipulation of 500 to 700 digit numbers, on a typical notebook. The Whitenoise large number math library and factoring utility are free for any researcher to use to advance prime number theory. The large number math library cannot be commercialized without a license from Whitenoise Laboratories Canada Inc.

http://www.wnlabs.com/downloads/ECC_factoring_research.zip

If one agrees with the second rationale, then the third rationale makes sense: ECC is used to lower the overall overhead and processing required by asymmetric systems.

Security has always been sacrificed for speed, and broadband demand was, and continues to be, exploding exponentially.

When NSA and NIST announced they intended to phase out RSA, they also announced they were phasing out ECC, which is included in the same cipher suite. Why? Is it recognized that quantum computing, even if not yet perfected, makes brute force attacks efficient because of incredible speed? If so, it makes sense to discontinue ECC, because it uses much smaller moduli (prime number composites) in asymmetric frameworks than RSA, which was already too weak relative to the processing effort required.

•  RSA's deadly relationship: using keys of adequate strength requires an enormous amount of processing.

•  ECC's deadly relationship: if the required ECC key strength must be increased to RSA's because the same factoring techniques factor ECC prime number composites IN THE SAME FASHION, bypassing elliptic curve mathematics entirely, then ECC's overhead and processing requirements will be even greater than RSA's.

What were the factoring challenges RSA ran in the 90s that sank Integer Factorization Cryptography?

The basic challenge is a × b = c.

Multiply two prime numbers (private key equivalents) together to get a prime number composite (a public key equivalent). That is simple to do, and either a or b divides c evenly without knowledge of the other. For example, if a is 3 and b is 5, then c is 15, and it is easy to divide either a or b into c while c is known to the public and a and b remain unknown to each other. This is the underpinning of all session key exchange and security controls in asymmetric public key systems design.
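
A toy illustration of that asymmetry in Python (standard library big integers; the primes here are small stand-ins, not key-strength values):

    # Multiplying two primes is trivial; either prime divides the composite evenly.
    a = 2**31 - 1    # 2147483647, a Mersenne prime (private value stand-in)
    b = 2**61 - 1    # 2305843009213693951, another Mersenne prime
    c = a * b        # the public prime number composite (semiprime)

    assert c % a == 0 and c % b == 0   # a and b each divide c with no remainder
    print(c)   # c can be published; recovering a and b from c is the hard direction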

The security premise for RSA and ECC in an asymmetric network framework (PKI) is that it is supposed to be infeasible to reverse that formula and calculate a and b knowing only c (say, a public key), provided large prime numbers are used.

The red herring: since Elliptic Curve Cryptography uses a much more complicated process to locate the prime numbers a and b, that complexity is conflated with security, implying that it is even harder to factor the known c if one has to reverse the elliptic curve mathematics.

The Whitenoise Boren alternative to the Lenstra elliptic curve sieve requires no knowledge or use of elliptic curve mathematics. The problem remains the same: we are factoring the prime number composite c. We are not concerned with the process that was used to select the values a and b. Source code for the Lenstra method is available from Wikipedia:

https://en.wikipedia.org/wiki/Lenstra_elliptic_curve_factorization

The Boren factorization source code and large number math library are available here:

http://www.wnlabs.com/news/ECC-is-easier-to-break.php

It is easy to see the increased complexity of elliptic curve mathematics compared to the Euclidean mathematics that RSA relies upon. A layman's explanation is provided on Wikipedia.

After doing all the processing that elliptic curve mathematics demands, the solution is confirmed using the Euclidean algorithms derived from a × b = c. The Boren integer factorization utility simply solves that problem directly in the first place. Feasible, simple brute force attacks and sieves that solve that problem are what resulted in the downgrading of both ECC and IFC.

The factoring process of using Newton bisection and a remainder volatility test solves that problem and BYPASSES the need to understand or use any elliptic curve mathematics.
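
The Boren source itself is in the download above. For readers who want a feel for the general idea of starting at the square root and testing candidates, here is a minimal Python sketch of the classic Fermat-style search outward from the root. It illustrates only the bisect-around-the-square-root framing, not the Boren remainder volatility test itself:

    import math

    def fermat_factor(n):
        # Search outward from sqrt(n) for x such that x*x - n is a perfect
        # square y*y; then n = (x - y) * (x + y). Assumes n is odd and composite.
        x = math.isqrt(n)
        if x * x < n:
            x += 1
        while True:
            y2 = x * x - n
            y = math.isqrt(y2)
            if y * y == y2:
                return x - y, x + y
            x += 1

    # The two prime factors of 101 * 113 = 11413 straddle its square root (~106.8).
    print(fermat_factor(101 * 113))  # (101, 113)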

Was it collusion or lack of insight that led to the adoption of ECC as the de facto standard?

First, President Obama was using a BlackBerry phone while ECC was approved for the protection of secret and top secret government information, a status ECC still holds because it has yet to be replaced.

Second, the appropriate government agencies, law enforcement, standards bodies and pertinent corporations have been aware of this factoring approach, and of a preferred crypto algorithm, for a long time. The challenge materials, including the factoring utility and a large number math library, were first built in a project at the British Columbia Institute of Technology in 2001.

RSA is to be discontinued because NIST and NSA know that it is vulnerable to sieve-based attacks and brute force attacks from regular computers. They also knew of the advent of quantum computers.

ECC was then adopted because it had more complicated mathematics but lower processing requirements than RSA IFC. The more complicated mathematics was supposed to mean that it is safe to use smaller ECC prime number composites to get comparable RSA strength.

From the first chart we see that NIST says 256 bit ECC is equivalent in strength to 3072 bit RSA. The shiny object that system architects saw and swallowed is the fallacy that a factorable 32 byte number is more secure than a factorable 384 byte number. 256 bit ECC (the 32 byte option) is the NIST and NSA recommended key strength through the year 2030. Does this guarantee wide open surveillance by friends and foes alike?

Given the relative sizes of the prime number composites (256 bits versus 3072 bits), factoring the ECC composite is a far smaller task than factoring the RSA composite if both are vulnerable to brute force and sieve attacks on the prime number composite. This means that RSA and ECC cryptography might be broken at will with even inefficient techniques.

It was previously posited that simple prime number dictionary attacks are possible as well. All the prime numbers up to 10^18 have already been calculated. It seems improbable that prime number lists haven't been compiled into dictionaries, and a vast range of the primes usable for attacking such composites has already been tabulated. The vulnerability for ECC is identical to RSA's vulnerability regardless of the mathematics used to derive the necessary PKI components a and b.

http://www.wnlabs.com/pdf/Rapid_Factorization_of_semiprimes.pdf

Prime number dictionary attacks

256 bit ECC remains approved for government use for secret and top secret information through the year 2030. This is what a 32 byte (256 bit) number looks like at its upper bound:

115,792,089,237,316,195,423,570,985,008,687,907,853,269,984,665,640,564,039,457,584,007,913,129,639,936 = 2^256 ≈ 1.16 × 10^77 = the size of approved ECC
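
These sizes are easy to verify with Python's arbitrary-precision integers (a minimal standard-library check, not part of the downloadable materials):

    import math

    n_max = 2**256                      # upper bound of a 256-bit number
    print(len(str(n_max)))              # 78: a 256-bit number has about 78 decimal digits
    print(len(str(math.isqrt(n_max))))  # 39: its square root is a 128-bit, ~39-digit number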

Some factoring utilities, including the Boren technique, start by taking the square root of the prime number composite, bisecting the search space in a manner that assures that one of the two prime factors falls on either side of the bisection.

The utility then searches for the prime factor that falls on the smaller side of the bisection, so the work space is reduced.

The square root of a 256 bit (roughly 78 digit) number is a 128 bit number, roughly 39 digits long (about 3.4 × 10^38).

All the prime numbers up to 10^18 (19 digits) have already been calculated and are available on the internet. Those tables do not yet reach the roughly 39 digit square root boundary of a 256 bit composite, but the thesis here is that the same dictionary approach, extended and combined with the bisect-and-test search, makes 256 bit ECC, which is approved for US government use for secret and top secret information through the year 2030, breakable with a prime number dictionary attack.
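
At bottom, a prime number dictionary attack is trial division against a precomputed list. A minimal sketch, assuming the sympy library as a convenient stand-in for a real prime dictionary (which would be a large disk-backed table rather than a generator):

    from sympy import primerange  # assumed dependency; any published prime list works

    def dictionary_attack(n, limit):
        # Try every known prime up to `limit` as a divisor of n.
        for p in primerange(2, limit):
            if n % p == 0:
                return p, n // p
        return None  # the smaller factor lies beyond the dictionary

    # Toy example: both factors sit inside the "dictionary" range.
    print(dictionary_attack(104729 * 1299709, 2_000_000))  # (104729, 1299709)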

If this thesis is correct then we really never have had security from the prying eyes of criminals or governments. The ultimate challenge of the digital age is balancing security and privacy.

That balance is affected by political and human reality. The majority of people will choose life and physical freedom over freedom of speech. They believe that they can temporarily sacrifice free speech for physical safety and regain it when the crisis has passed. This is the "veneer of democracy" that former Director of National Intelligence James Clapper warns democracies have to vigilantly protect. General Michael Hayden, former head of both NSA and CIA, has recognized that the time is nigh for unbreakable end-to-end encryption. Our way of life and government can collapse with the collapse of critical infrastructures.

The question to balance is "how much surveillance and under what contexts will a democracy allow and retain its character?"

Challenge

Advance research! Join the challenge to optimize this utility and large number math library and define the underlying operations that make it work.

DOWNLOAD

RAPID ECC FACTORIZATION RESEARCH MATERIALS: factoring executable, big math library, scope documents, code etc.

The large number math library, intended to allow manipulation of 500 to 700 digit numbers on a typical desktop or laptop computer, is included. The application will allow you to test and factor some sample semiprimes. Grab two primes from prime number lists, multiply them together, and then use the utility to factor the product. Start small and build up to larger composite sizes, i.e. 48 bit, 80 bit, 112 bit, etc. For ECC, just optimizing to factor 256 bit composites compromises recommended security levels.
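
One way to generate test composites at a given bit size is sketched below, assuming sympy's randprime for convenience (primes pulled from published lists work just as well):

    from sympy import randprime  # assumed helper for picking random primes

    def make_semiprime(bits):
        # Build a composite of roughly `bits` bits from two primes of bits/2 each.
        half = bits // 2
        p = randprime(2**(half - 1), 2**half)
        q = randprime(2**(half - 1), 2**half)
        return p, q, p * q

    p, q, n = make_semiprime(48)   # start small, then try 80, 112, ... bits
    print(p, q, n, n.bit_length())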

Running the utility does not require an install; it can be run from its folder after download. Scope and design documents are included, as well as sample code.

Bookmark this page and visit often. We will be announcing major ECC factoring challenges:

Can you optimize the provided code or library? What is the largest composite you can factor?

How long will it take you to factor a 256 bit ECC prime number composite (a 128 bit strength semiprime) at the security level of a curve like Curve25519?

Can you mathematically explain and prove why the volatility of the sixth derivative of a remainder points to the correct prime? (This would be a significant contribution to general and prime number theory.)

SIMPLE BENCHMARK COMPARISON OF BOREN VS LENSTRA FACTORING SPEED

The Lenstra elliptic curve factorization technique is considered the benchmark. However, "ECM is considered a special purpose factoring algorithm as it is most suitable for finding small factors. Currently, it is still the best algorithm for divisors not greatly exceeding 20 to 25 digits (64 to 83 bits or so), as its running time is dominated by the size of the smallest factor p rather than by the size of the number n to be factored." Make an 80 bit prime number composite and time how long the Lenstra method takes to determine the factors. Run the same prime number composite through the Whitenoise Boren factoring utility; it will handle numbers that size easily on a decent notebook. https://en.wikipedia.org/wiki/Lenstra_elliptic_curve_factorization
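
A simple timing harness for such a head-to-head might look like the following sketch. Pollard's rho is used here only as a freely available stand-in, since neither the Boren utility nor an ECM implementation is reproduced in this text; swap in the real candidates for the actual benchmark:

    import math, random, time

    def pollard_rho(n):
        # Stand-in factoring routine; returns one nontrivial factor of composite n.
        if n % 2 == 0:
            return 2
        while True:
            x = random.randrange(2, n)
            c = random.randrange(1, n)
            y, d = x, 1
            while d == 1:
                x = (x * x + c) % n            # tortoise advances once
                y = (y * y + c) % n
                y = (y * y + c) % n            # hare advances twice
                d = math.gcd(abs(x - y), n)
            if d != n:
                return d

    n = 1_000_000_007 * 998_244_353            # two well-known primes as the test composite
    start = time.perf_counter()
    f = pollard_rho(n)
    print(f, n // f, f"{time.perf_counter() - start:.3f}s")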

Lenstra ECM elliptic curve factoring is currently the best approach. However, in his presentation Professor Lenstra indicates that general factoring has been stuck since the 1970s and is running out of steam. See: Cryptanalysis of Public Key Cryptographic Algorithms - http://cs.ucsb.edu/~koc/ccs130h/notes/lenstra.pdf.

Lenstra indicates that there is a need for an entirely new approach. The Boren integer factorization bisection and volatility testing technique is a new approach to factoring that merits study, optimization and better definition.

This approach was developed for Whitenoise Labs by mathematician and computer scientist Stephen Lawrence Boren, a quadriplegic who co-founded the original Whitenoise Labs (www.wnlabs.com). Because he programmed with the back of one knuckle, his code is sparsely commented.

Using Lenstra's example from his paper, 80 bit security should take more than 3 million processing years of effort on a 10 GHz machine to factor. Quickly running a benchmark test on such samples will show this approach works as quickly as ECM. This merits further research, optimization and understanding.

Download the Boren Integer Factorization utility from the link above.

Compare with the online Lenstra ECM integer factorization calculator: https://www.alpertron.com.ar/ECM.HTM

For additional information or to provide feedback write to Andre Brisson: abrisson@wnlabs.com

or correspond through LinkedIn at https://www.linkedin.com/in/andre-brisson-51077/?trk=nav_responsive_tab_profile_pic