Incorrect hashing algorithm selected for long n values in keys #1

Open
by Ghost opened 4 years ago · 4 comments
Ghost commented 4 years ago

When a long n value is used in account keys, SHA-384/SHA-512 will be selected for hashing the thumbprint when signing JWS or generating the key authorization. The specification requires SHA-256 for these.

I noticed that the repository linked in the README for keypairs is correct and behaves as expected.

However, the actual repository used is @root/keypairs which is an outdated version of the aforementioned.
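For reference, RFC 7638 defines the thumbprint over the required JWK members in lexicographic order, and RFC 8555 (ACME) pins the digest to SHA-256 regardless of the account key's size. A minimal sketch using only Node's built-in crypto (the `jwk` argument is a placeholder account key, not a real one):

```js
// Minimal RFC 7638 thumbprint sketch using Node's built-in crypto.
// ACME (RFC 8555 §8.1) expects SHA-256 here even for larger account keys.
const crypto = require('crypto');

function jwkThumbprint(jwk) {
  // Required members only, in lexicographic order, no whitespace (RFC 7638 §3).
  const members = jwk.kty === 'RSA'
    ? { e: jwk.e, kty: jwk.kty, n: jwk.n }
    : { crv: jwk.crv, kty: jwk.kty, x: jwk.x, y: jwk.y };
  return crypto
    .createHash('sha256')
    .update(JSON.stringify(members))
    .digest('base64')
    .replace(/\+/g, '-')
    .replace(/\//g, '_')
    .replace(/=+$/, '');
}
```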

Owner

Actually, it's the opposite. And it's complicated. And don't use non-standard values. 😃 There are only two acceptable values for all practical purposes:

  • RSA 2048 with SHA-256
  • EC P-256 with SHA-256

The "provisional" standards (specified but basically declared as "not recommended") are

  • RSA
    • 3072 / SHA-384
    • 4096 / SHA-512
  • EC
    • P-384 / SHA-384
    • P-521 (not a typo) / SHA-512
  • Other
    • (I think ED and X curves are coming into popularity, but not commonly implemented yet)

You shouldn't use a SHA-256 hash on a larger bit value key because you reduce the entropy of the thumbprint to the lowest common denominator, which effectively cancels out the security of the identifier.

Likewise, when you attempt to use the signing function of a ~256-bit entropy key (2048 RSA is effectively 256 due to the entropy space of prime numbers) with a larger hash algorithm, the hash is truncated down to 256 bits. The thumbprint itself is not affected by this, but you get no advantage.
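To make the conventional pairing above concrete, here's a rough sketch of that kind of selection (illustrative only; it is not the actual logic in keypairs or @root/keypairs):

```js
// Illustrative sketch of the conventional key-to-hash pairing; not the
// actual selection logic in keypairs or @root/keypairs.
function conventionalHash(jwk) {
  if (jwk.kty === 'EC') {
    return { 'P-256': 'SHA-256', 'P-384': 'SHA-384', 'P-521': 'SHA-512' }[jwk.crv];
  }
  if (jwk.kty === 'RSA') {
    // Modulus length in bits, recovered from the base64url-encoded 'n'.
    const bits = Buffer.from(jwk.n, 'base64').length * 8;
    if (bits >= 4096) return 'SHA-512';
    if (bits >= 3072) return 'SHA-384';
    return 'SHA-256'; // 2048-bit keys, the common case
  }
  throw new Error('Unsupported key type: ' + jwk.kty);
}
```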

Google Cloud doesn't allow keys larger than 2048 bits - it's basically considered to be an attack on CPU processing power, and it increases TLS handshake times beyond what they are willing to accept and the guarantees that they're striving to make.

Many other cloud providers do likewise.

Bottom line: If someone is well-versed enough to know the security ramifications of the choices they're making, then they should know that [bigger isn't better](https://www.schneier.com/blog/archives/2009/07/another_new_aes.html) (and can in fact be less secure), and whatever special knowledge someone might have to make such a decision despite the recognized best practices should be more than enough to hack up their own implementation (I commented something to that effect in the code).

That said, I'll leave this open because I do need to update the documentation.

😃

Owner

And if you know something that I don't, by all means correct me, but I've gone through several revisions of the ACME and JWK and JOSE standards as well as Node crypto and Web Crypto, and the current implementation is the most correct across the board.

To the best of my knowledge, even if any particular vendor happened to support the other standards, none that I'm aware of recommends them (although a few faux-security blog posts make false claims based on the very incorrect "bigger is better" philosophy).


I definitely agree with your assessment of the situation, but as you mentioned, most providers effectively only accept SHA-256 based signatures.

To that effect, in the event that a longer key length is used and the implementations don't actually allow the correct signatures for that length, there should be at least some provision to override the behaviour or act in the best way possible.

In this case, signing with JWS allows an override via the `alg` parameter in a JWK key, but generating the thumbprint offers no such override. While the effects of using a thumbprint algorithm different from the norm should generally be understood, and deviating should remain discouraged by default (as is the case now), the option to override could be provided for cases like Let's Encrypt, where the "correct" thumbprint causes the key authorization to exceed the length restrictions as well as not being the expected SHA-256.

I mentioned that the newest version of keypairs effectively only uses SHA-256, while @root/keypairs attempts to select the algorithm intelligently, which is a disparity between the specifications, norms, and accepted usage.

I think the position you have taken is not wrong, but I think offering the ability to self-select an algorithm for generating the thumbprint is preferable in cases where these things cannot be adjusted.
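As a concrete shape for that override (purely hypothetical; not the current keypairs or @root/keypairs API), the thumbprint function could take an options argument that defaults to the spec-compliant SHA-256, so the discouraged behaviour stays opt-in:

```js
// Hypothetical override shape; not the current keypairs API. The default
// stays SHA-256, so spec-compliant callers are unaffected.
const crypto = require('crypto');

function thumbprint(jwk, { hash = 'sha256' } = {}) {
  const members = jwk.kty === 'RSA'
    ? { e: jwk.e, kty: jwk.kty, n: jwk.n }
    : { crv: jwk.crv, kty: jwk.kty, x: jwk.x, y: jwk.y };
  return crypto
    .createHash(hash)
    .update(JSON.stringify(members))
    .digest('base64')
    .replace(/\+/g, '-')
    .replace(/\//g, '_')
    .replace(/=+$/, '');
}

// thumbprint(accountJwk);                     // default: SHA-256
// thumbprint(accountJwk, { hash: 'sha384' }); // explicit, discouraged opt-in
```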


So, I've cobbled something together for myself to renew my Let's Encrypt certificates using this acme.js on AWS Lambda. It works really well, but I recently tried experimenting with having an account key that was P-384 instead of P-256, and ran into a weird error "status:400 Unable to update challenge :: authorization must be pending" when trying to accomplish the challenges. Everything works fine when my account key uses P-256, and I don't really have any need for a P-384 key (I was just playing around to see if it'd work).

But I guess my question is: Am I running into the issue being reported here? That is, acme.js isn't designed to work with however Let's Encrypt implements the spec when using a P-384 account key, and this is somewhat "by design", so fulfilling challenges can't work as intended with such a key because of how the thumbprints are hashed?

I could certainly share my code (I have a writeup and code at https://cooperjr.name/2020/08/13/acme-lambda-renewal/ actually, though I've made some tweaks to things since I posted that), but I don't want to dig too much into it if it's something that's just not going to work and I should be sticking with P-256 in the long run anyway. Thanks!
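For context, the key authorization that the challenge is validated against is just the token joined to the account-key thumbprint, and RFC 8555 §8.1 pins that thumbprint to SHA-256, which seems consistent with what this issue describes. A self-contained sketch (the EC account key members are placeholders):

```js
// Sketch of the ACME key authorization (RFC 8555 §8.1). The spec pins the
// account-key thumbprint to SHA-256, so a SHA-384 thumbprint for a P-384
// account key would not match what Let's Encrypt expects.
const crypto = require('crypto');

const b64u = (buf) =>
  buf.toString('base64').replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');

function keyAuthorization(token, accountJwk) {
  // SHA-256 thumbprint over the canonical EC members (RFC 7638).
  const canonical = JSON.stringify({
    crv: accountJwk.crv,
    kty: accountJwk.kty,
    x: accountJwk.x,
    y: accountJwk.y,
  });
  const thumb = b64u(crypto.createHash('sha256').update(canonical).digest());
  return token + '.' + thumb;
}

// http-01 serves this value verbatim (RFC 8555 §8.3); dns-01 sets the TXT
// record to b64u(SHA-256(keyAuthorization)) (RFC 8555 §8.4).
```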
