
Signing appcasts #971

Open
jkbullard opened this issue Jan 3, 2017 · 16 comments

Comments

@jkbullard (Contributor)

In Issue #962, @pornel said "We don't sign appcasts and don't intend to".

Can I ask why you "don't intend to"?

Signed appcasts would allow Sparkle to trust them (although Sparkle must still protect itself from authors' errors, of course). (This was just mentioned in #969.) It's easy to script the creation of the appcast so it gets signed. I do that in Tunnelblick, and generate_appcast (PR #953) could presumably be modified to sign the appcast.

I can create a PR to check the appcast signature if there is any interest but if there is some reason the Sparkle developers won't consider it I won't bother.

Would a PR be more acceptable if it only required signed appcasts if some flag was set by the application? (Perhaps via an "appcastMustBeSigned" delegate?)

@kornelski (Member)

Our main focus is to protect against installation of unauthorized code, and for that particular purpose a signed appcast doesn't offer a significant improvement.

We already support multiple checks for the payload: HTTPS + DSA + Apple Code signing. Using the same keys for the appcast wouldn't create any new security barrier (i.e. if author leaks their DSA private key and HTTPS is compromised, then the attacker will replace the appcast as easily as the payload).

@jkbullard (Contributor, Author)

Thanks, @pornel -- so would a PR be considered because you just don't have time to work on stuff that isn't your main focus? Or would you not want one because of the additional complexity it would entail (for the code and for application authors)?

One reason for using signed appcasts is that they can make sure that updates are presented to the user properly. A forged appcast can prevent a user from being presented with a critical update, or it can include release notes with malicious links like "Don't click the 'Update' button; instead, please click here to update". Signed appcasts would also allow an author to include JavaScript in the update notes without it being a security risk.

Signed appcasts would also have mitigated the problems caused by not using HTTPS that caused such an uproar a year or so ago, but of course that's 20/20 hindsight.

@zorgiepoo (Member)

I think signing appcasts could be valuable. However, it should be an option developers have to opt into, and we would still want to be as secure as possible when developers don't sign the appcast. So there would be an Info.plist key/value in the bundle that specifies the feed should be signed. (It would be inconvenient if we required appcast signing.)

Do you download another file from a different URL to obtain the signature of the appcast, and then validate it? Or do you do it some other way?

My concern at the moment is maintainability and merging in new features, while I'm trying to merge changes from my fork.

@zorgiepoo (Member)

Also what if the appcast has external file references (release notes, images, ..)? Do we just not care, or take precautions around that?

@jkbullard (Contributor, Author)

@zorgiepoo - Thanks. An optional Info.plist key/value pair (boolean SUAppcastMustBeSigned, defaulting to false) is fine with me. If true and the appcast's signature isn't valid, the appcast fetch would fail hard. If absent or false, no signature checking would be done.

Note that this is a dangerous option -- if you lose your private key you won't be able to create a new appcast with display notes such as "Please download a copy of XXX from the website", which you could do if appcasts aren't signed. All you can do is create an appcast that causes an error. The error message for an invalid appcast signature should mention that as a possibility.

Regarding the mechanism of the signature, Tunnelblick calculates the signature of the appcast, then prepends an HTML/XML comment which includes the signature to the appcast. The start of a Tunnelblick appcast looks like this:

<!-- Tunnelblick DSA Signature v1 MCwCFFz8MOFVX4y0O0aCYzRcLFxICTU5AhQSPmWgsF1ozBctorVQOo6zwwGDiw== -->
<rss version="2.0" xmlns:sparkle="http://www.andymatuschak.org/xml-namespaces/sparkle"  xmlns:dc="http://purl.org/dc/elements/1.1/">
	<channel>

When my patched Sparkle gets the appcast, it removes the comment and checks the rest of the appcast against the signature.
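A minimal sketch of that extract-and-verify flow, in Python. HMAC-SHA256 stands in for DSA here purely so the example is self-contained; the `Example` label and the key handling are illustrative, not Tunnelblick's actual implementation:

```python
import base64
import hashlib
import hmac
import re

# Matches a signature comment like the one Tunnelblick prepends:
# <!-- Tunnelblick DSA Signature v1 <base64> -->
SIG_COMMENT = re.compile(r"^<!--\s*\S+ DSA Signature v1 (\S+)\s*-->\r?\n")

def split_signed_appcast(appcast_text):
    """Split a signed appcast into (signature_bytes, payload_text).

    The payload is everything after the leading signature comment,
    which is the part the signature was computed over.
    """
    m = SIG_COMMENT.match(appcast_text)
    if m is None:
        raise ValueError("appcast has no signature comment")
    return base64.b64decode(m.group(1)), appcast_text[m.end():]

def verify_appcast(appcast_text, key):
    # HMAC-SHA256 is only a stand-in so this sketch runs with the
    # standard library; a real implementation would verify DSA/EdDSA.
    sig, payload = split_signed_appcast(appcast_text)
    expected = hmac.new(key, payload.encode("utf-8"), hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)
```

The key point is that the comment is stripped before verification, so the signed bytes are exactly the feed that Sparkle goes on to parse.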

How might signatures work for dynamically-generated appcasts? The webserver would have to sign them on-the-fly, which means the webserver would need to have the private DSA key available, which is a really bad idea because of the public-facing nature of a website. So although they could, I don't imagine anyone would want to do that.

I don't think the Sparkle developers should care about external links in an appcast. An appcast does not need to have external references except the update link [1] . Everything else is up to the appcast's author. The release notes, for example, can be inlined so they are protected by the appcast's signature, and so could images. Sparkle requires the appcast; that's why I think it should be protected.

[1] I'm not sure if the update link is actually required. If missing, Sparkle might reasonably show the update notes with a disabled "Update" button. The update notes could have a link to a download or something if the author wanted. That wouldn't be very user-friendly, but it might be better than complaining (if that's what Sparkle does now).

@kornelski (Member)

There is no absolute security, and we have to choose which attacks we protect from, and which we give up on. We face trade-offs in usability, reliability and complexity of updating.

For example, we don't guarantee that the app will be updated.

An active MITM can prevent any updates from being shown at all by breaking the network at any level, and the breakage can be done in a way that's indistinguishable from common failures (e.g. an attacker can simulate a captive portal, firewall, etc.).

If we took seriously a scenario in which an out-of-date app may contain exploitable vulnerabilities, we'd have to assume that any problem with the appcast or network connectivity could be a sign of an attack in progress that has exploited or is about to exploit the host app, so the app would have to be immediately stopped or even deleted. That's obviously a very drastic, user-hostile measure for an unlikely attack, so we choose not to protect from it.

or it can include release notes with malicious links

We rely on HTTPS to protect against that on the network level, and I think that's perfectly adequate — our threat model is "not-Mossad". If we chose to fight adversaries that can hack/infiltrate/intimidate a CA, we'd first have to worry that it's even easier to do the same to the developers. The CA system is frighteningly large and only as strong as the weakest link, but realistically it's probably still the strongest link in Sparkle. We tell developers to keep the private DSA key, without a passphrase, on disk. The Flash plugin has privileges to sign Sparkle updates.

@kornelski (Member) commented Jan 3, 2017

One scenario that I think is realistic, and that we could protect from, is when the web server is compromised but the developer's computer is not.

In such a case, a signed appcast would prevent phishing.

However, I think that's still a low priority, because a phishing attack is the least effective way to exploit the server. It massively increases the chance of the compromise being discovered, killing the golden goose for the attacker. It's much better to backdoor the regular download archives (and the DSA public key in them) and stay under the radar for as long as possible. AFAIK that's how it went when Transmission got hacked, so Sparkle's security turned out to be good enough.

@jkbullard (Contributor, Author)

@pornel - Thanks.

If you can't get a validly-signed appcast then, as you say, it could be a network problem or it could be an attack. But if you do get a validly-signed appcast, you know whether or not an update is available with a bit more assurance than you would if it were not signed.

It's a low priority, though, so I'll skip making a PR for now.

@zorgiepoo (Member)

AFAIK, a signed appcast should be useful exactly in the scenario where the server is compromised but not the developer's computer. I personally think that, as a software update framework, our security shouldn't rely on whether the feed is served over HTTP or HTTPS. It should be agnostic. I vaguely recall reading about the "Software Update Framework" online, which went into greater detail.

@kornelski (Member) commented Jan 3, 2017

you know whether or not an update is available with a bit more assurance that you would if it is not signed.

Not even that. If there's no expiration date in the signature, an attacker could save an old version of your signed appcast and perform a replay attack. And if you do have an expiration date, the attacker may spoof the NTP server response and change the system clock to allow an expired one. :)

@kornelski kornelski reopened this Jan 3, 2017
@kornelski (Member)

I'm warming up to the idea, but I'd like to think this through to avoid making it a problem for developers.

  1. Currently devs can treat the appcast from generate_appcast as a template, and customize the feed and add release notes manually. If the tool adds the signature, that won't be possible, and any edit will make the feed fail to work. How can we keep the workflow convenient and avoid that trap?

  2. What if developer loses their DSA private key? Can we fall back to something else without losing security and phishing-prevention?

  3. Developers trying to sign updates dynamically on the server will make the whole server-compromise-prevention feature pointless. Can we prevent this mistake somehow?

@kornelski (Member)

I'm thinking:

  1. Signing could forbid newlines, and the tool would output the whole feed in one line. It would be a clear signal the feed is not hand-editable, and would avoid problems with git doing CRLF conversion. But I'm not sure where to get release notes from.

  2. If we trusted the server, then for lost DSA keys we could have a feature that makes Sparkle tell the user to download the app again. But if we don't trust the server, we can't do that, as we'd send users straight to the phisher's download. Maybe do that only if the "I lost my DSA key" message is consistently up for X weeks/months? (hoping that a server compromise would be found sooner). And/or require Info.plist to contain an alternative trusted URL for this? (hoping that both servers won't be compromised at the same time)

  3. I'd only implement appcast signing code in the generate_appcast tool, and the tool can't be accidentally run server-side.
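A sketch of the single-line idea from point 1, in Python. The canonicalization rule here (collapse whitespace between tags) is just one possible convention for illustration, not a full XML canonicalization:

```python
import re

def canonicalize_feed(xml_text):
    """Collapse the feed onto one line before signing, so any hand edit
    or git CRLF conversion produces a feed that visibly fails to verify."""
    return re.sub(r">\s+<", "><", xml_text.strip())

def is_canonical(xml_text):
    # A signed feed must contain no line breaks at all.
    return "\n" not in xml_text and "\r" not in xml_text
```

A verifier could then reject any feed where `is_canonical` is false before even checking the signature, making the "this file is generated, don't hand-edit it" signal explicit.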

@RickyRomero commented May 17, 2017

I'd like to bump this issue in light of the Handbrake breach, which circumvented Sparkle through social-engineering (emphasis mine):

HandBrake had been nagging me for some time to install an update. I finally decided, for whatever reason, to do the update. There was a note in HandBrake’s update dialog that the incremental update was not available, and that I’d have to download an entirely fresh copy from their server. I didn’t think too much of this, as we’ve been in a similar situation with a broken Sparkle update channel once before (the worst).

Source: https://panic.com/blog/stolen-source-code/.

A signed appcast may have blocked off this attack vector by preventing the attacker from adding this note, assuming the update signing keys weren't compromised (which, judging by the attack method, seems to be the case).

@jkbullard (Contributor, Author)

pornel wrote:

Not even that. If there's no expiration date in the signature, an attacker could save an old version of your signed appcast and perform a replay attack. And if you do have an expiration date, the attacker may spoof the NTP server response and change the system clock to allow an expired one. :)

I modified the Tunnelblick implementation to include an expiration date (protected by the signature). That means the appcasts have to be "refreshed" with a new signature before they expire. Other developers could set the expiration far into the future, I suppose, to avoid that, if they aren't concerned with long-term replay attacks.

I'm not particularly concerned about NTP spoofing; really paranoid folks don't use NTP or have very tight controls on what time deltas are allowed.
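A sketch of that expiration check, assuming the expiry is an ISO-8601 timestamp embedded in the signed portion of the feed (the field format is an assumption; Tunnelblick's actual scheme may differ):

```python
from datetime import datetime, timezone

def appcast_still_valid(expires_iso, now=None):
    """Return True if the signed expiration timestamp hasn't passed.

    Call this only after the signature itself has verified: the
    timestamp is only trustworthy because the signature covers it.
    """
    if now is None:
        now = datetime.now(timezone.utc)
    return now <= datetime.fromisoformat(expires_iso)
```

An attacker replaying an old signed feed then has a bounded window, at the cost of the developer re-signing periodically.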

@RickyRomero commented Apr 7, 2020

Today I saw this update prompt from 1Password and have no guarantee that it's legitimate.

[screenshot: 1Password software update prompt]

@zorgiepoo (Member)

Adding this to a future milestone as it's of potential interest to explore. It's not an easy enhancement. It should be easy to use, be safe to opt into, and offer more security.

There are several desirable constraints I can think of, many are from this discussion:

  • It should be easy to sign appcasts -- this needs support from generate_appcast
  • Signing external release notes needs to be supported as well; external release notes are too prevalent for a major feature to simply not support them. Whether it's possible to reject other external references is still up in the air.
  • It should be 'difficult' to accidentally alter an appcast feed in a way that invalidates its signature; stripping newlines is one approach suggested here, making the feed less readable/adjustable.
  • It should still be easy to alter appcasts by hand and re-sign them, however -- generate_appcast could maybe generate a non-signed appcast template that contains the source of truth, for example.
  • sign_update should still be able to sign appcast feeds too. We should have documentation that covers not storing private keys on the server; we can't impair the tool because of that concern.
  • A fallback for losing the private EdDSA key, for Developer-ID-signed app updates, which would allow key rotation. For example, we could operate in a "safe" mode where we don't trust release notes or informational-only updates, disable automatic downloading, still show a good signing error if update validation fails, and potentially allow a couple of weeks to go by before even entering this mode once an invalid feed has been detected (ignoring the feed until then, except perhaps for a user-initiated check).

I'm not too concerned about expiration dates. We should continue to rely on disallowing downgrades and on not showing updates that have already been installed.
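A sketch of that guard: even without expiration dates, refusing to offer anything at or below the installed version blunts replay of old signed feeds. The dotted-integer comparison below is a simplification for illustration, not Sparkle's actual version comparator:

```python
def should_offer_update(feed_version, installed_version):
    """Never offer a version at or below what's already installed,
    so a replayed old (but validly signed) feed can't downgrade us
    or re-show an update the user already has."""
    def parts(v):
        # "1.2.10" -> [1, 2, 10]; compares numerically, not lexically.
        return [int(piece) for piece in v.split(".")]
    return parts(feed_version) > parts(installed_version)
```

Combined with signature verification, a replayed feed can then at worst show the user nothing new, rather than roll them back to a vulnerable version.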

@zorgiepoo zorgiepoo added this to the 2.4 milestone Jul 3, 2022
@zorgiepoo zorgiepoo modified the milestones: 2.4, 2.5 Nov 26, 2022
@zorgiepoo zorgiepoo removed this from the 2.5 milestone Mar 19, 2023