This claims to be well-documented and secure by default, but I cannot find a link to the documentation (or is the README all there is?). What actual algorithm is used for encryption and for signing?
The README has a major red flag:
> From the math perspective, there is no difference between private and public keys, they both can be used to encrypt messages that only can be decrypted by the other. Most of the security tools, including enc, artificially forbid using the public key for decrypting messages because that's not how it should be used (encrypting messages that anyone can decrypt is pointless).
Most security tools forbid that because it is dangerous! Also, this description sounds like textbook RSA is being used...
> Rather, device-independent quantum key distribution allows you to scale back the assumptions on your implementation to a well-motivated, minimal set. To me, this is already intriguing enough without the need for hyperbole!
Would it be accurate to say it is scaled back to the level achieved by classical (non-quantum) cryptography?
> Would it be accurate to say it is scaled back to the level achieved by classical (non-quantum) cryptography?
Not quite. Classical cryptography of course requires the additional assumption that the computational capacity of the attacker is limited (at least if the amount of key material available is less than the length of the messages to be exchanged). QKD does not need any such computational assumptions. Looking at this purely from a theoretical perspective, I hope you'll agree that the ability to create new shared randomness "out of thin air" by drawing on quantum correlations, and to do so in an information-theoretically secure fashion, is a pretty neat trick.
Now, if you asked me how likely it is _in practice_ that $THREE_LETTER_AGENCY has broken your cryptosystem to the point where they can feasibly attack it/have backdoored it, compared to the likelihood that they've bugged your devices in a supply chain attack or found any number of other ways to compromise the practical implementation, I suspect my answer wouldn't be much different to yours. Nevertheless, I still think it is interesting to explore additions to the cryptographer's toolbox that, in a very practical sense, have a rather different profile of assumptions and tradeoffs.
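To make the information-theoretic point concrete: with a one-time pad, a truly random key as long as the message yields a ciphertext that is equally consistent with every plaintext of that length, so no amount of computation helps the attacker. A minimal sketch (function and variable names are mine):

```python
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each message byte with a key byte.

    Information-theoretically secure *only* if the key is truly random,
    at least as long as the message, and never reused.
    """
    assert len(key) >= len(message), "key must cover the whole message"
    return bytes(m ^ k for m, k in zip(message, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # sharing this key is QKD's job

ciphertext = otp_encrypt(message, key)
# Decryption is the same XOR with the same key.
assert otp_encrypt(ciphertext, key) == message

# Any equal-length plaintext is consistent with the ciphertext under
# *some* key, so the ciphertext alone carries no information about it:
other = b"defend at dusk"
fake_key = bytes(c ^ p for c, p in zip(ciphertext, other))
assert otp_encrypt(ciphertext, fake_key) == other
```

The last two lines are the whole argument: an eavesdropper holding only the ciphertext cannot distinguish between the candidate plaintexts, no matter how much compute they have.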
Oh absolutely, the theory behind QKD is fascinating! And I do think that some day there may be genuinely secure implementations, maybe even ones that are practical for more than a few niche applications.
But you mentioned the assumptions on the implementation, not on the underlying mathematics. What concerns me is that QKD introduces additional hardware to operate, and there have been many demonstrations of weaknesses in that hardware that threaten the overall security of the system. With DIQKD those issues no longer affect security (again, it is absolutely remarkable that this is possible at all), but you still have to contend with all the implementation vulnerabilities that also plague classical cryptography. In that sense I mean that the implementation assumptions are now the same.
The article suggests that quantum key distribution (QKD) is a replacement for a courier, but that is not true. Distributing an initial small secret key (which allows authenticity of the post-processing of the quantum measurements) is still required.
Also I think the added value of device-independence is overstated. While it does indeed prevent loss of security by faulty quantum hardware (even if constructed maliciously), there is still a lot of classical post-processing required. That device still needs to be trusted.
For example, when the device is outputting the shared key, it still needs to be trusted that it isn't also delivering that key back to Eve.
> The article suggests that quantum key distribution (QKD) is a replacement for a courier, but that is not true. Distributing an initial small secret key (which allows authenticity of the post-processing of the quantum measurements) is still required.
Is checking authenticity needed if you’re communicating with just one party?
From what I understood from the article, the data received can be assumed to be random and private between the two parties if a high enough win rate is achieved.
Or is checking authenticity there to guard against another party intercepting the entangled particles without also holding the key used for authentication?
FYI I don’t have very strong knowledge in this area.
Yes, otherwise how would you know you are indeed communicating with that party?
Otherwise the standard Person-in-the-Middle attack would apply: Eve (claiming to be Bob) first runs a full protocol session (quantum + classical communication) with Alice, resulting in a shared key X. Then she does the same to Bob, resulting in a key Y. When Alice wants to encrypt a message to Bob, she encrypts with X. Eve can decrypt (and optionally re-encrypt with Y and forward the message to Bob).
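The attack described above can be sketched with a toy unauthenticated Diffie-Hellman exchange (tiny parameters and invented names, purely for illustration; the same logic applies to any unauthenticated key agreement, quantum or classical):

```python
import secrets

# Toy Diffie-Hellman over a small prime group. Real systems use
# ~2048-bit groups or elliptic curves; this is illustration only.
P = 0xFFFFFFFB  # 2^32 - 5, a prime
G = 5

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared(priv, other_pub):
    return pow(other_pub, priv, P)

# Alice and Bob each generate a key pair...
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# ...but without authentication, Eve intercepts both public values and
# substitutes her own, running one full exchange with each victim.
e1_priv, e1_pub = keypair()  # Eve's session with Alice
e2_priv, e2_pub = keypair()  # Eve's session with Bob

X = shared(a_priv, e1_pub)   # Alice thinks X is "a key with Bob"
Y = shared(b_priv, e2_pub)   # Bob thinks Y is "a key with Alice"

# Eve derives both session keys and can transparently relay traffic.
assert shared(e1_priv, a_pub) == X
assert shared(e2_priv, b_pub) == Y
```

Nothing in the exchange tells Alice whether the public value she received came from Bob or from Eve; that is exactly what the pre-shared authentication key fixes.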
So the part I'm getting hung up on: if Eve attempts to MitM the quantum key exchange, wouldn't the probability of winning drop below the acceptable threshold, since Eve does not possess the entangled particles? If that's the case, wouldn't Alice invalidate the exchange, and likewise Bob?
Without authentication, any form of communication is susceptible to a man-in-the-middle attack. You simply don't know who you are communicating with.
This makes using QKD very hard to justify in practice. If you have exchanged a pre-shared key (which is required for authentication anyway), you can just use a symmetric cipher such as AES (a block cipher, run as a stream cipher in e.g. CTR mode) for encrypting the communication. This is many orders of magnitude cheaper and faster than QKD and works independently of the communication medium. Also, it doesn't look like AES is going to be broken anytime soon.
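As a sketch of the pre-shared-key approach: Python's stdlib has no AES, so this uses HMAC-SHA-256 in counter mode as a stand-in keystream PRF. The function name, nonce handling, and key derivation are all invented for illustration; a real system would use AES-GCM or ChaCha20-Poly1305 from a vetted library, plus message authentication.

```python
import hashlib
import hmac

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Counter-mode stream encryption, with HMAC-SHA-256 standing in
    for the keystream PRF. Illustrative only: no authentication, and
    the (key, nonce) pair must never be reused."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

psk = hashlib.sha256(b"pre-shared secret from the courier").digest()
nonce = b"session-0001"
ct = keystream_xor(psk, nonce, b"no quantum channel required")
assert keystream_xor(psk, nonce, ct) == b"no quantum channel required"
```

The point of the sketch: once the pre-shared key exists, everything else runs over any ordinary channel at negligible cost.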
That's simply a logical fallacy: P => Q does not imply (not P) => (not Q). Here:
"If it is inaccurate, then it is wrong" does not imply "If it is accurate, then it is right."
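For the record, the claim checks mechanically: enumerating all four truth assignments shows the single counterexample is P false, Q true.

```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    # Material implication: a => b is equivalent to (not a) or b.
    return (not a) or b

# Find assignments where (P => Q) holds but (not P => not Q) fails.
counterexamples = [
    (p, q)
    for p, q in product([False, True], repeat=2)
    if implies(p, q) and not implies(not p, not q)
]
print(counterexamples)  # [(False, True)]
```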
No configuration, curve or key-size would protect any of the asymmetric crypto mentioned in the book from quantum computing. You really have to switch to something mentioned in the post-quantum chapter.
Ultimately, that's true, but I am clearly referencing the interim transition. After all, that's what the book is about - developers today who use cryptography. That "ultimate" timeframe won't happen today or tomorrow. Look at CNSA for examples.
This take is rather naive. Those RSA factoring records were done by a large international team of researchers, using well established algorithms and decades of work on implementing those methods as fast as possible.
The blog post says the paper mentions 8.4e10 operations for factoring, but I can't find that number in the paper anywhere. The post then states: "The 800-bit claims would be 36 bits of work." I don't know what that means.
[edit]: the numbers are in the new version (https://eprint.iacr.org/2021/232). I was looking at the old version uploaded yesterday.
> The post states that 800-bit claims would be 36 bits of work. I don't know what that means.
From the article: "Schnorr’s paper claims to factor ... 800-bit moduli in 8.4·10¹⁰ operations"
2^36 ~= 8.4·10¹⁰, so I guess "36 bits of work" means 2^36 operations. Analogous to how a password with 36 bits of entropy would require 2^36 guesses. My first time encountering the phrase "bits of work" as well, though.
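The conversion is just a base-2 logarithm; a quick sanity check:

```python
import math

ops = 8.4e10  # operation count claimed for the 800-bit moduli
bits = math.log2(ops)
print(f"{bits:.1f} bits of work")  # 36.3 bits of work
```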
Yeah, that's what got me. Doubling the length of the key only requires a single order of magnitude more work? If that turns out to be true, I'm going to need to revise my beliefs about how the universe works. In particular, information theory and thermodynamics, because multiplying two numbers together doesn't preserve information about what the factors were. Or at least pretty much everyone thought so. (Caveat: if the values of primes turn out to have a predictable pattern, that could provide the needed information. However, that would mean that the Riemann Hypothesis is false, and that'd be an even more astounding result.)
Thing is, RSA keys are rather 'sparse' because they are the product of two primes. There aren't that many primes, so there aren't that many numbers that only have two (proper) divisors.
Hence, if you look at the strength of the currently best-known attack on RSA keys, you see that the key strength grows quite slowly as the keys get larger. This is purely from how sparse prime numbers are. From [1], which quotes NIST in 2015, we see (both columns are in bits):

    RSA modulus    security level
    1024           80
    2048           112
    3072           128
    7680           192
    15360          256
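The slow growth tracks the sub-exponential cost of the best-known attack, the General Number Field Sieve. Its heuristic complexity can be sketched in a few lines (constants and o(1) terms dropped, so treat the absolute numbers as rough):

```python
import math

def gnfs_security_bits(modulus_bits: int) -> float:
    """Heuristic GNFS cost exp((64/9)^(1/3) (ln n)^(1/3) (ln ln n)^(2/3)),
    expressed in bits of work. Rough: the o(1) term and all constant
    factors are ignored."""
    ln_n = modulus_bits * math.log(2)
    cost = (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
    return cost / math.log(2)

for bits in (1024, 2048, 3072, 7680, 15360):
    print(bits, round(gnfs_security_bits(bits)))
```

Even with constants dropped, the trend is visible: multiplying the modulus size by 5 (3072 to 15360 bits) does not even double the estimated security level.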
2^36 "operations" can still take a lot of time if each operation is multiplying two giant numbers, unless the meaning of "operation" is somehow normalized to mean e.g. 64 bit integer operations.
> 2^36 "operations" can still take a lot of time if each operation is multiplying two giant numbers
It took me 3.3 years of actual computation time to do about 2^46 multiply+modulo operations on 2048-bit numbers on a Core i7. 2^36 such operations should be doable in a day on an eight-year-old CPU.
P.S.: that was on a single core, because the problem I solved was explicitly constructed not to be parallelizable.
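A back-of-the-envelope check of that extrapolation (the 3.3-year and 2^46 figures are from the comment above; the rest is arithmetic):

```python
# Single-core throughput implied by 2^46 operations in 3.3 years.
seconds = 3.3 * 365.25 * 24 * 3600        # 3.3 years of compute
ops_per_sec = 2**46 / seconds             # ~6.8e5 2048-bit modmuls/s
days_for_2_36 = 2**36 / ops_per_sec / 86400
print(f"{ops_per_sec:.2e} ops/s, 2^36 ops in ~{days_for_2_36:.1f} days")
```

This comes out to roughly 1.2 days, consistent with "doable in a day" on a single core.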
It didn't take long for custom Bitcoin-mining ASICs to emerge. It wouldn't take long either for custom ASICs that perform these kinds of operations a lot faster than a general-purpose CPU.
Supposing the paper does describe a more efficient factorisation algorithm, that does not imply that factoring an 800-bit modulus (as the author of this article suggests) would be cheap.
It's broken for me, thanks to the "responsive" design. When the browser window occupies half my screen (1920x1080px), the top-right menus (bell, plus, profile) get replaced with only a bell icon. How am I supposed to reach the settings page now?
In that case the answer is: never. We've tried to generously interpret earlier annealing-based factoring results (there is nothing new in the top-posted paper) and had to conclude that the overall method just doesn't scale well: https://arxiv.org/abs/1902.01448
This study considers seven semiprimes (without justification for why these seven) and reports an average(?) runtime for factoring each on D-Wave. No conclusions should be drawn from so few data points.
The proposed "Block Multiplication Table Method" will only affect the constants and thus has no effect on the asymptotics.
The embedding from logical to physical qubits appears to have an exponential gap (but again, we shouldn't draw conclusions from so few data points). However, if this is indeed true, then even a polynomial runtime for annealing would still result in an exponential runtime for factoring.
All in all, it eludes me why the authors conclude that the obtained results are promising.
It's strange that the authors pick integer factorization as the problem to solve on their machine. Although their machine may provide a speedup for optimization problems, these speedups are not relevant for the problem of integer factorization, as shown in
https://arxiv.org/abs/1902.01448 (disclaimer: I am a co-author of that paper).
Skimming over the paper it seems their method of translating factorization to the optimization problem consists of simplifying equations without justification that this can be done efficiently. I suspect that their preprocessing step is already NP-hard.
The second, and more important, issue is that the overall strategy of translating a problem with sub-exponential classical complexity (via the Number Field Sieve) into an optimization problem with exponential runtime is not expected to succeed, as confirmed by careful measurements in our paper.
I agree that integer factorization is a poor choice. The last statement in the abstract talks about sampling and optimization, which could be the bigger point.
Their pre-processing seems to be simply expanding out a cost function of the form E = (F - XY)^2. Of course that is a lot of multiplications, since X and Y are binary and multi-dimensional. Not sure it would be NP-hard, though.
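For intuition, minimizing E = (F - XY)^2 by brute force over candidate factors shows both the idea and why the search space blows up exponentially with the bit length (toy code, my own naming; an annealer samples this landscape instead of enumerating it):

```python
def factor_by_minimization(F: int, bits: int):
    """Minimize E = (F - x*y)^2 over all odd `bits`-bit candidates x, y.

    A classical brute-force stand-in for the annealing formulation: the
    search space has size ~2^(2*bits), which is exactly the exponential
    blow-up discussed above. Returns (energy, x, y) with energy 0 iff
    x*y == F was found.
    """
    best = None
    for x in range(3, 2**bits, 2):
        for y in range(x, 2**bits, 2):
            e = (F - x * y) ** 2
            if best is None or e < best[0]:
                best = (e, x, y)
    return best

energy, x, y = factor_by_minimization(143, 4)
print(energy, x, y)  # 0 11 13 -- the global minimum recovers the factors
```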