NuGet package signing prototype #5260

Closed
jariq opened this Issue May 20, 2017 · 24 comments


jariq commented May 20, 2017

About a month ago I was asking for more information in issue #2577 related to package signing, but I have received no response so far. In the meantime I've created a proof-of-concept tool called NuSign, which digitally signs NuGet packages and verifies the signatures of previously signed packages. I did my best to briefly document the design goals and signing technology in the README, and I would like to ask the NuGet team and/or interested community members to take a look at it and provide any kind of feedback. I am hoping that the existence of a package signing prototype might help revive the public upstream discussion and the implementation of this very important security feature.


onovotny commented May 20, 2017

One initial thought -- the tool should either support Azure Key Vault signing directly, or at least expose an add-in API where someone else can write an adapter for it. Basically, at the point where you sign the digest, don't call CNG directly; we want to be able to use Key Vault's SignAsync method instead.

As an example, see here: https://github.com/vcsjones/OpenVsixSignTool/blob/a089007679cb29cbd734071ef5d947946acce0c6/src/OpenVsixSignTool.Core/KeyVaultSigningContext.cs#L47

The goal is that the certificates can live entirely in Azure Key Vault and not exist at all on the local machine running the tool.
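The pattern onovotny describes -- hash locally, then delegate only the digest to a pluggable signing backend -- can be sketched as follows. This is an illustrative sketch, not NuSign's actual API; the `Signer` callable and the HMAC-based local backend are stand-ins for real CNG or Key Vault adapters.

```python
import hashlib
import hmac
from typing import Callable

# A "signer" is anything that maps a digest to signature bytes.
# A Key Vault adapter would implement the same callable shape by
# sending the digest to the service's sign operation, so the private
# key never has to exist on the local machine.
Signer = Callable[[bytes], bytes]

def make_local_hmac_signer(secret: bytes) -> Signer:
    """Toy local backend standing in for CNG: 'signs' with HMAC-SHA256."""
    def sign(digest: bytes) -> bytes:
        return hmac.new(secret, digest, hashlib.sha256).digest()
    return sign

def sign_package(package_bytes: bytes, signer: Signer) -> bytes:
    """Hash locally, then hand only the digest to the signing backend."""
    digest = hashlib.sha256(package_bytes).digest()
    return signer(digest)

signature = sign_package(b"fake nupkg contents", make_local_hmac_signer(b"secret"))
```

The key design point is the seam: because `sign_package` never touches key material, swapping the local backend for a remote one is a one-line change at the call site.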


Contributor

maartenba commented May 21, 2017

That looks super cool! Ideally the sign/verify commands should be part of NuGet.exe, but that's the entire goal of this issue and proof of concept anyway :-)

I like the simple approach of this tool. Things I miss (and which imo are more important than being able to use a third-party certificate store):

  • Being able to block certain certificates, or unsigned packages, from being installed into a project in the NuGet client
  • Timestamping, so that it can later be verified that the signature was made with a then-valid certificate (which may be expired today). Example: if I publish and sign a package today, I want the signature to still be valid 5 years from now, even if my signing certificate is only valid until next week. It seems the OpenVsixSignTool @onovotny referenced has timestamping as well.

Love this so far, cool stuff!
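The timestamping point above reduces to a simple check: without a trusted timestamp, the verifier can only compare the certificate's validity window against the current time; with one, it can compare against the moment of signing. A minimal sketch (dates are made up for illustration):

```python
from datetime import datetime, timezone

def cert_valid_at(not_before: datetime, not_after: datetime, when: datetime) -> bool:
    """A certificate is acceptable if `when` falls inside its validity window."""
    return not_before <= when <= not_after

# Hypothetical certificate, valid only through next week.
nb = datetime(2017, 1, 1, tzinfo=timezone.utc)
na = datetime(2017, 5, 28, tzinfo=timezone.utc)

signed_at = datetime(2017, 5, 21, tzinfo=timezone.utc)   # trusted timestamp
checked_at = datetime(2022, 5, 21, tzinfo=timezone.utc)  # verification 5 years later

# Without a timestamp we can only check against "now", and the check fails...
assert not cert_valid_at(nb, na, checked_at)
# ...but a trusted timestamp proves signing happened while the cert was valid.
assert cert_valid_at(nb, na, signed_at)
```

Real timestamping (e.g. RFC 3161) additionally has the timestamp itself countersigned by a TSA so the verifier can trust `signed_at`; this sketch only shows why the recorded time matters.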


onovotny commented May 21, 2017

@maartenba It seems that if NuGet were to retain/restore use of the OPC format, we could just use the existing tooling? OPC already has signature support; is there a reason why NuGet moved away from OPC?

Contributor

maartenba commented May 21, 2017

No idea what the decision was there. OPC indeed has a lot of these things built in.


vcsjones commented May 21, 2017

This does look interesting. (Note: I am the author of the OpenVsixSignTool Oren referenced above.)

I've weighed the pros and cons of OPC signing vs. something else. OpenVsixSignTool, and even Microsoft's own VsixSignTool, will actually just "work" on a NuGet package today. The NuGet client doesn't verify the signature, but signing does indeed work.

For NuGet, there are a few challenges. The first is that Microsoft's OPC framework lives in WindowsBase (which falls under the WPF umbrella), so it is not readily available xplat, which the NuGet client will require. This is fine for Visual Studio extensions, as they are Windows only. OpenVsixSignTool has a branch for xplat signing, but only because it has its own OPC implementation. The NuGet team would either need to write their own, as I did, or get the relevant part of WindowsBase ported into netstandard.

The CMS-based signing in NuSign works on netstandard2.0, so more of the needed framework components are available without rewriting too much of them. The approach here is to use a CMS signature instead of an XmlDSig in OPC, which is a sound idea provided it is implemented correctly. The downside is that the NuGet team would be inventing their own signing scheme, which has its own challenges and is likely to be put under intense scrutiny. If I am not mistaken, this is somewhat similar to what Appx package signing is like.

I agree that NuGet package signing is something we want sooner rather than later. At a minimum, netstandard2.0 will have to be used; otherwise some difficult-to-get-right things end up being redesigned. OPC seems like a natural choice, but without the right framework pieces it will be hard. As I understand it today, the WindowsBase components for OPC in the .NET Framework are not slated to be included in netstandard2.0, so that puts OPC signing at a disadvantage.

I also agree that some form of counter signature for timestamping is necessary even for a minimum viable product. This is another piece that is hard to do xplat, though I've made some progress using OpenSSL in OpenVsixSignTool for Linux and macOS. Right now OVST P/Invokes into Windows, as there is nothing in the framework to accomplish this. It could be written in pure managed code, but that is an undertaking in itself.


Member

blowdart commented May 21, 2017

Let me explain, roughly, how this sort of thing works at MSFT. Any "novel" use of crypto has to pass muster with our Crypto Board; this includes new implementations of existing standards. Then it has to get beaten up to ensure we're not introducing more problems than we solve. Community contributions or ideas also need to be taken to the Crypto Board, and that adds a couple of months where we can't say much until the process is complete. This isn't a fast process, for obvious reasons, and unfortunately it's not one that lends itself to being done in the open, because someone, somewhere is going to take a first draft and start using it before problems are discovered.

Timestamping is something they do consider mandatory. Which is good news for some of you :)


Contributor

maartenba commented May 21, 2017

But do the same rules apply to .NET Foundation projects, which NuGet falls under if I'm not mistaken? :-)
(Or, in other words: does it make sense for the community to put effort into this kind of thing?)


jariq commented May 21, 2017

Thanks for the valuable feedback guys.

I also consider support for time-stamps and different key stores necessary for any real-world solution, but I have purposely kept the implementation as simple as possible so that even people without previous crypto experience can explore the code and understand the basic idea behind package signing.

I had never heard of OPC before; I guess you are referring to the Open Packaging Conventions. I'll try to take a look at the specification this week. It would be great if the NuGet team could confirm whether the signing solution has to be OPC based or not.

Personally I am not a big fan of XMLDSig-based signatures because, IMO, they add unnecessary complexity to most solutions. To work with X.509 certificates and CMS-based signatures one only needs an ASN.1 framework and basic crypto primitives. To work with XMLDSig-based signatures one needs everything required for CMS-based signatures plus a solid XML framework (because of normalization etc.). So unless I really need to work at the XML level (sign only parts of an XML structure, not the whole structure), I almost always pick CMS.

@vcsjones I believe xplat support is achievable. We have already built a light-weight time-stamping client that can be easily ported to netstandard, a netstandard-compatible PKCS#11 wrapper, and an almost finished netstandard-compatible ASN.1 processor. There is also a FIPS 140-2 validated Bouncy Castle module available.

Any "novel" use of crypto has to pass muster with our Crypto Board. This includes new implementations of existing standards.

@blowdart If that brings package signing support into NuGet tooling, then it's a sacrifice I'm willing to make :)
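The "solid XML framework (because of normalization etc.)" point can be illustrated concretely: CMS signs raw octets, so signer and verifier trivially agree on the input, while XMLDSig must canonicalize first because semantically equivalent XML can differ byte-for-byte. A minimal sketch:

```python
import hashlib

# Semantically equivalent XML elements, different byte representations
# (attribute order and whitespace differ).
a = b'<pkg id="Foo" version="1.0"/>'
b = b'<pkg  version="1.0"   id="Foo" />'

# A CMS-style signature hashes the raw bytes, so there is no ambiguity:
# the exact octets that were signed are the exact octets verified.
# But naively hashing XML text means equivalent documents disagree:
assert hashlib.sha256(a).hexdigest() != hashlib.sha256(b).hexdigest()

# XMLDSig therefore requires canonicalization (C14N) on both sides so that
# equivalent documents digest identically -- that is the extra XML machinery
# jariq refers to.
```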


vcsjones commented May 22, 2017

I never heard of OPC before. I guess you are referring to Open Packaging Conventions.

Correct.

It would be great if NuGet team could confirm whether signing solution has to be OPC based or not.

I don't think the current barrier is "what are they going to use" but rather what problems it solves. Personally I think some uniformity is desirable too. NuGet chose to use a specification for a reason, and they should continue to use a specification, especially if it is already being used in other areas. It would seem strange for NuGet packages to use OPC and then for the signatures to not follow the standard for no particularly good reason. There are many libraries out there that can already validate an OPC signature -- .NET being one of them, with the PackageDigitalSignatureManager, or openxml4j in Java. This is also the same signing process used for Office documents, VSIX, XAML Packages, and a bunch of other things.

I'm not sure I agree with your points on CMS over XMLDSig. CMS is not just "sign a bunch of ASN.1". There's the matter of the unauthenticatedAttributes section, which is not signed and has been a problem plaguing Windows' Authenticode for a while. The same goes for the certificate chain stored in the CMS message: Google watermarks Chrome downloads by injecting a bogus certificate into the chain of the signature, which is fine as far as CMS is concerned.

My point is that a CMS message is parts of an ASN.1 structure that are signed, just like XMLDSig is XML elements that are canonicalized (much like DER in ASN.1) and signed. The .NET Framework's SignedCms and company do all of the legwork for the former, much like SignedXml does for XMLDSig. Both, however, are well studied and have been around for a while; attacks have been made, and the most egregious problems have been fixed.

I think we and the NuGet team also need to define what the threat model is for signing, and what problems they want to solve with it, before we just start signing stuff. For example, is being able to watermark downloads or append data to the signature part of the threat model?

I believe xplat support is achievable.

I never said it isn't - just that it is something that is required for the client.
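The unauthenticatedAttributes problem vcsjones mentions can be modeled with a toy structure (this is not real CMS; an HMAC stands in for the signature): anything held outside the signed portion can change freely without invalidating verification.

```python
import hashlib
import hmac

KEY = b"signer-key"  # toy stand-in for a real signing key

def sign(authenticated: bytes) -> bytes:
    """Only the authenticated portion is covered by the signature."""
    return hmac.new(KEY, hashlib.sha256(authenticated).digest(), hashlib.sha256).digest()

def verify(authenticated: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(authenticated), signature)

# A message with a signed body and an unsigned attribute section,
# mimicking CMS's unauthenticatedAttributes.
message = {"content": b"package bytes", "unauthenticated": b""}
sig = sign(message["content"])

# Anyone can later inject data into the unauthenticated section --
# e.g. a per-download watermark -- and verification still passes,
# because that section was never part of the signed input.
message["unauthenticated"] = b"watermark: user-12345"
assert verify(message["content"], sig)
```

This is exactly why a threat model matters: whether such appendable-but-unverified data is acceptable is a policy decision, not something the signature format decides for you.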


vcsjones commented May 22, 2017

I've given this much more consideration now and gotten some input from other folks, and the more I think about it, we need to take a step back before we start putting together proof of concepts.

My concerns lie around what is the threat model, or problem, that package signing is trying to solve. The proposed approach above may solve some of them, and it will not solve others.

I'm going to ask a bunch of questions that I honestly don't know the answer to - I want to highlight that there are a lot of unknowns left to be filled in before we can start signing packages.

The first is identity: "who" signed this package? There are two sources of identity: the publisher of the package and, in this scenario, the Subject of the certificate. From a UX perspective, which takes precedence, especially when you consider that we have verified authors now? It would seem that the identity of the certificate would "win" when presented alongside the author of the package, but again I am not 100% sure.

This also assumes that Code Signing certificates are the correct route. This puts a financial burden on package authors to sign their packages. There are many options for low-cost certificates for OSS programs, but also consider that many popular OSS Windows packages are not code signed -- why? Is it the financial burden? The difficulty of the infrastructure? Clearly there is a barrier there, and a similar barrier would be introduced in NuGet. You could leave your package unsigned, but that means your package has less chance of being used (especially in my scenario below). It puts some packages in a position of privilege over others.

The second is integrity. Has the package been tampered with? This is easy enough to do with OPC or something else at the package level, but NuGet packages have dependencies. Should a package be considered "signed" if it has unsigned dependencies? From the perspective of integrity - I think not. If my package "A" has a dependency on someone else's package, package "B", and "B" is compromised because they had a poor password, should my package "A" still have a valid signature?

Let's say the answer is "no", packages must have signed dependencies. That is a massive challenge and I don't know how to solve that. You would have to sign the entire package graph, and I don't know how to handle a dependent package changing. How would a package be "locked" to a dependency, cryptographically? Would it be by the Subject of the package's signature? The digest of its signature? If the former, what happens if a company gets acquired and they need to start signing with a new certificate identity, or if someone had to revoke their certificate? If the latter, then it would be impossible to support version ranges, which is a no-go scenario. This also leads to the "strong name signing" problem, but about 200x worse.

Or we could say the answer is "yes", a package's signature should not account for the signature of a dependency. Then what good is the signature of my package? If the threat model is a compromised NuGet account or server, then even though my package is signed, it could now be compromised through a dependency being compromised. That would be unfortunate. That also raises the question of @maartenba's enterprise policy: should a dependency be transitively trusted?

Is there a relationship between the code signing identity and a NuGet account? If so, who is the authoritative source of that relationship, and how is that relationship secured? If the threat model encompasses a compromised NuGet server, then this relationship authority is also at risk.

Finally, when do package signatures get validated? When they are downloaded by the nuget client? During a restore? Is the local nuget cache part of the integrity threat model? All things to consider.

My point with all of this is, signing is a hard problem to solve - I don't expect all of this to be answered without careful discussion, planning, and understanding. Some people want to solve the identity problem and gloss over the integrity problem. Others want to solve the integrity problem, which is hard, needs to consider dependencies, and would introduce a "pay to play" problem with obtaining certificates.

I don't think package signing is a technical problem. The tools already exist, and Microsoft has some incredibly smart crypto and appsec people working for them. Rather I think it is trying to understand what problems are, and are not, going to be solved by this and being clear about it.
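The "must dependencies be signed?" question above has a concrete shape: under the strict policy, a package is only as trustworthy as its full transitive closure. A hedged sketch of that policy check (package names, the graph, and the acyclic-graph assumption are all hypothetical):

```python
def fully_signed(package: str, deps: dict[str, list[str]], signed: set[str]) -> bool:
    """Strict policy: a package counts as signed only if it and every
    transitive dependency are signed. Assumes an acyclic dependency graph."""
    if package not in signed:
        return False
    return all(fully_signed(d, deps, signed) for d in deps.get(package, []))

# A depends on B, which depends on C.
deps = {"A": ["B"], "B": ["C"], "C": []}

# A and B are signed, but transitive dependency C is not:
assert not fully_signed("A", deps, signed={"A", "B"})
# Once the entire graph is signed, A passes the strict policy.
assert fully_signed("A", deps, signed={"A", "B", "C"})
```

Note this sketch sidesteps the hard part vcsjones raises: it checks signatures at resolve time rather than cryptographically locking "A" to a particular "B", so it avoids the version-range problem at the cost of not protecting against a dependency being swapped between resolutions.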


jariq commented May 23, 2017

@vcsjones that's a great summary of problems that need to be carefully considered and solved before package signatures can be automatically processed during package install. I agree with your post almost 100% but I am afraid that trying to solve all problems at once might be quite overwhelming and may not lead to satisfactory results in a reasonable time. I believe we should take one small step at a time. Let's first define how a single package can be correctly signed/verified and let's create/define tools required to correctly do that. We can move to next step once we have this basic building block ready and approved by NuGet team.

I'm going to ask a bunch of questions that I honestly don't know the answer to...

Let me share my opinions...

  1. Identity - "Who" signed this package?

It is hard for me to imagine other answer than "owner of signing key/certificate".

  1. Are Code Signing certificates the correct route?

X.509 certificates seem to be a natural choice but there are similar systems (for example RPM) that use PGP/GPG.

I believe that signing technology/tooling should allow to use keys/certificates independent from centrally managed commercial entity.

  1. Integrity - Should a package be considered "signed" if it has unsigned dependencies?

I think that verification should be performed on a single package level. Dependencies can be verified one by one using the same process.

  1. Is there a relationship between the code signing identity and a NuGet account?

Tooling should be extensible enough and should provide a way to define custom trust anchors for signature validation. That would give us possibility to to create such relationship if needed and to explore alternative approaches such as KeyBase mentioned in #2577 by @asbjornu.

  1. When do package signatures get validated?

Whenever signature verification method/command is executed. We can decide what's the best time once we have that method/command available.
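The per-package model described above — verify each package on its own, then apply the same check to each dependency — can be sketched in a few lines. This is an illustrative sketch, not NuGet's implementation; the trusted-digest table, dependency graph, and package contents below are all hypothetical stand-ins for whatever a real signature or signed manifest would provide.

```python
import hashlib

# Hypothetical table of trusted SHA-512 digests, one per package.
# In a real system these would come from a verified signature on
# each package, not a hard-coded dict.
CONTENTS = {"A.1.0.0": b"contents of A", "B.2.0.0": b"contents of B"}
DEPENDS = {"A.1.0.0": ["B.2.0.0"], "B.2.0.0": []}
TRUSTED = {pid: hashlib.sha512(data).hexdigest() for pid, data in CONTENTS.items()}

def verify_package(package_id: str) -> bool:
    """Verify a single package against its trusted digest."""
    digest = hashlib.sha512(CONTENTS[package_id]).hexdigest()
    return digest == TRUSTED.get(package_id)

def verify_closure(package_id: str, seen=None) -> bool:
    """Verify a package and, one by one, each of its dependencies
    using exactly the same single-package check."""
    seen = seen if seen is not None else set()
    if package_id in seen:
        return True
    seen.add(package_id)
    if not verify_package(package_id):
        return False
    return all(verify_closure(dep, seen) for dep in DEPENDS[package_id])

print(verify_closure("A.1.0.0"))  # prints True while contents match their digests
```

The point of the sketch is that "is this dependency tree trusted?" reduces to repeated applications of the single-package check, which is why defining that one building block first makes sense.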

ferventcoder commented Jun 1, 2017

I've thought about this quite a bit over the last 5 or so years, and I'm firmly in the camp of PGP/GPG signing being the path forward - it's what other package managers already use, it has a trusted infrastructure, and it's completely free. Authenticode has a cost for real use, and unless Microsoft is going to issue free certificates and keep a local store for traceability and identity for signing packages (all without any cost to folks), then Authenticode is a non-starter. Never underestimate the blocker that even a tiny cost has over free.

mishra14 (Collaborator) commented Nov 9, 2017

Folks, I want to close this in favor of https://github.com/NuGet/Home/wiki/Author-Package-Signing. Please let me know otherwise.

anangaur (Member) commented Nov 9, 2017

Agree with @mishra14.
@jariq Can we close this issue and have further discussions on the Package signing spec issue: #5889?

jariq commented Nov 10, 2017

@mishra14 sure. I am closing this one.

jariq closed this Nov 10, 2017

joshpearce commented Dec 18, 2017

I realize this issue is closed, but for those of us in a regulated environment, is there a stop-gap workflow for nuget restore, with or without a local cache, to at least verify a checksum for all packages and assert they have not changed since they were introduced into the project? Just a blog post from someone who's using existing tools would be great.

onovotny commented Dec 18, 2017

@joshpearce Based on the current spec and code in the NuGet client repo, it verifies the integrity of the package at package extraction time. If you want to re-verify, you can clear the local NuGet cache and it'll re-download/extract the packages, triggering the verification.

Would that work? As far as "not changed since introduction" goes, packages are immutable, so 1.0.4 is always 1.0.4, at least on NuGet.org. For private feeds, that depends on your server config, but I believe all of them can be configured to prohibit overwriting existing versions.

joshpearce commented Dec 18, 2017

@onovotny, it seems that package verification will assert, "this package, which I am about to restore, is signed by a certificate chain in your trusted root." Is that accurate?

Getting back to what's available today, I'm actually more interested in the current state of package hashing:

I see that when I publish a project, the [app].deps.json has an sha512 for each package. Also, project.assets.json has the same info. Finally, I see that packages in NuGetFallbackFolder now have a [name].nupkg.sha512 file with a hash. When is this .sha512 file generated, and when is the package file verified to still have the same hash? Does that happen at restore time?

onovotny commented Dec 18, 2017

> @onovotny, it seems that package verification will assert, "this package, which I am about to restore, is signed by a certificate chain in your trusted root." Is that accurate?

Yes, that's correct, plus that it's not modified. Though to be precise, "this package, which I am about to extract..."

> I see that when I publish a project, the [app].deps.json has an sha512 for each package. Also, project.assets.json has the same info. Finally, I see that packages in NuGetFallbackFolder now have a [name].nupkg.sha512 file with a hash. When is this .sha512 file generated, and when is the package file verified to still have the same hash? Does that happen at restore time?

I'm not sure; one of the other team members may know - @rohit21agrawal @emgarten?

rohit21agrawal (Contributor) commented Dec 18, 2017

Cc: @dtivel @mishra14 for signing related questions

anangaur (Member) commented Dec 18, 2017

/cc: @rido-min

@joshpearce I have seen the package hash being generated at the time the package is added to the repository/source, for example with the `nuget add` command. And as far as I know, the hash is currently not used before restore.

joshpearce commented Dec 18, 2017

@anangaur, I work in medical software. We have to verify that our dependent assemblies are the same at install time as they were when the project was built for testing. It seems these hashes could help if there were a story/tools/documentation around their use.

With NuGet, I assume it's possible to declare a single, file-system-based, offline location to restore packages from? That would probably do it for me.

anangaur (Member) commented Dec 18, 2017

@joshpearce Yes, that's very much possible. Whether they are your company's packages or public packages, you can put them all into a folder and then use that folder as a package source in Visual Studio or for command-line builds. Let me get more helpful links for you in a bit.

anangaur (Member) commented Dec 18, 2017

Use this documentation to add another folder-based source to Visual Studio. Alternatively, you can just modify NuGet.config to add the same source.
Details on adding a local feed/source: https://docs.microsoft.com/en-us/nuget/hosting-packages/local-feeds
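For reference, the NuGet.config change for a folder-based offline source is a short fragment along these lines (the source name and path here are placeholders; adjust them to your own vetted package store):

```xml
<configuration>
  <packageSources>
    <!-- Hypothetical offline folder feed used as the only restore source -->
    <add key="offline-feed" value="C:\nuget\offline-packages" />
  </packageSources>
</configuration>
```

Combined with a folder that only ever receives audited packages, this gives the single, file-system-based restore location asked about above.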
