[Feature discussion] Alternative signature spoofing #1467

Closed
chris42 opened this issue May 11, 2021 · 19 comments

@chris42

chris42 commented May 11, 2021

I think it would be worthwhile to get your view on the alternative signature spoofing that @dylangerdaly created here:
dylangerdaly/platform_frameworks_base@b58aa11

It seems to bury/automate the spoofing in the system by comparing against the microG certificate. That could be quite an improvement for some ROMs that viewed the user-granted permission negatively?
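
For illustration, here is a minimal sketch of how I understand the idea; the class, method and the pinned digest below are made up for this comment and are not taken from the actual commit. The system would only report the Play services signature for com.google.android.gms when the installed package's real signing certificate matches a pinned microG certificate digest:

```java
// Hypothetical sketch of the "compare the microG certificate" idea, not the linked patch.
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Locale;

final class MicroGSpoofCheck {
    // Placeholder value; a real implementation would pin the actual microG signing cert digest.
    private static final String MICROG_CERT_SHA256 =
            "0000000000000000000000000000000000000000000000000000000000000000";

    private MicroGSpoofCheck() {}

    /** Returns true if the spoofed Play signature should be reported for this package. */
    static boolean shouldSpoof(String packageName, byte[] actualSigningCert) {
        if (!"com.google.android.gms".equals(packageName)) {
            return false;
        }
        return MICROG_CERT_SHA256.equalsIgnoreCase(sha256Hex(actualSigningCert));
    }

    private static String sha256Hex(byte[] data) {
        try {
            byte[] digest = MessageDigest.getInstance("SHA-256").digest(data);
            StringBuilder sb = new StringBuilder(digest.length * 2);
            for (byte b : digest) {
                sb.append(String.format(Locale.ROOT, "%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new AssertionError("SHA-256 is always available", e);
        }
    }
}
```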

So I would be quite interested in your view on this, @mar-v-in. Maybe an additional SigSpoof patch?

Regarding the feature request:
Probably some work on the signature spoofing check would be needed so as not to confuse users.
Also, does this change the need to move microG to priv-app/system?
What happens with FakeStore/GsfProxy, as mentioned in #897 (comment)?

@chirayudesai
Contributor

This implementation actually means microG doesn't need to be a system app (for spoofing, anyway) or request the permission at all. From what I can tell by reading the code, microG signed with the right key automatically gets the spoofed signature.

We have an alternative implementation at CalyxOS: https://review.calyxos.org/c/CalyxOS/platform_frameworks_base/+/388
This is more for when microG is a system / privileged app.

@mar-v-in
Member

mar-v-in commented May 13, 2021

The patches provided by microG serve two purposes:

  • Allow for easy and sufficiently secure development of microG. This requires allowing non-system apps, and also apps that don't have the official signature, to spoof signatures, but it also means that apps doing so need to confirm the permission with the user. On top of that, it can also be beneficial to spoof the signatures of other apps in some cases during development.
  • Give ROM developers an example of how signature spoofing works. If ROMs want to add further restrictions or modifications, they certainly can.

I don't think it's necessary to have all possible variations of this patch (restricted to specific package names, package signing keys, spoofed keys, system apps only, etc.). All of these changes are rather easy, and every ROM developer should be able to adjust the patch for these variations.

Regarding ROMs that decided against signature spoofing like LineageOS: There is absolutely no significant security impact as soon as the protectionLevel of the signature spoofing permission is changed to system-only. This is because the signature of system apps is not properly verified anyway (ref), so if you allow system apps to spoof signatures, you only allow them to do things they could do anyways.
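
As an illustration of that kind of restriction, here is a rough sketch of a runtime gate a ROM could add on top of changing the permission's protectionLevel in the framework manifest (e.g. to signature|privileged). The class and method names are made up for this example; this is not a patch shipped by microG:

```java
// Minimal sketch, assuming a ROM-specific policy helper: only preinstalled
// system apps (or updated system apps) are allowed to spoof signatures.
import android.content.pm.ApplicationInfo;

final class SpoofPolicy {
    private SpoofPolicy() {}

    /** Only packages installed on /system (or updated system apps) may spoof. */
    static boolean isAllowedToSpoof(ApplicationInfo app) {
        if (app == null) {
            return false;
        }
        int systemFlags = ApplicationInfo.FLAG_SYSTEM | ApplicationInfo.FLAG_UPDATED_SYSTEM_APP;
        return (app.flags & systemFlags) != 0;
    }
}
```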

@chris42 chris42 closed this as completed May 14, 2021
@thestinger

Regarding ROMs that decided against signature spoofing: There is absolutely no security impact as soon as the protectionLevel of the signature spoofing permission is changed to system-only. This is because the signature of system apps is not properly verified anyway (ref), so if you allow system apps to spoof signatures, you only allow them to do things they could do anyways.

If it's not scoped to the Play services signature alone, then a compromise of microG allows completely bypassing the signature-based security model. The standard patch destroys the verified boot security feature, where persistent state is not supposed to be trusted. Persistent state is controlled by the attacker in the threat model for verified boot. The goal is stopping them from gaining privileged access in the OS from there, generally to persist after a remote exploit. It's an additional barrier for them and means that if they don't block updates (which attestation can detect), they'll likely lose their privileged access eventually, ultimately leading to their methods of hiding the compromise crumbling apart.

When it's properly scoped to only being able to spoof Play services, it's still a security issue. The spoofing is bypassing a security check in apps to trick them into using a different implementation of Play services. That means the security expectations they have from Play services for enforcing certain component signature checks, key pinning and a security model need to be provided by microG in their entirety to avoid downgrading security. If Play services pins the signature for one of their services but microG does not, then microG is reducing the data security and therefore hurting privacy. The same applies to missing aspects of the security model. It is not without costs.

If an OS simply trusts your app builds rather than only their own, then they're also adding an additional trusted party for builds.

@thestinger

I think it's worth noting that we've laid this out many times before. You should acknowledge what we've actually said about this and respond to that, rather than something no one has said. Scoping the signature spoofing to only Play services and only to a first-party build of microG doesn't address that microG doesn't do the same security checks as Play services. Scoping it that way is also not the approach the microG project presents. Most integrations of it aren't doing both of those things.

@mar-v-in
Member

mar-v-in commented Aug 7, 2021

If it's not scoped to the Play services signature alone, then a compromise of microG allows completely bypassing the signature-based security model. [...]
If an OS simply trusts your app builds rather than only their own, then they're also adding an additional trusted party for builds.

I don't get this point. If we restrict the permission to system apps, the system needs to be compromised. If the system includes a compromised microG or trusts a signing key that is compromised, this would indeed be problematic, same as if the system was compromised anywhere else. If an OS developer does not feel happy with trusting the official builds, they can easily build microG from source and create their own signed build in their system.

The standard patch destroys the verified boot security feature [...]

As I mentioned above, the "standard patch" is for "easy and sufficiently secure development". It's not meant to be super-secure, just sufficiently secure (and I consider everything that asks users for confirmation as sufficiently secure; it's basically a "sudo"). Considering that most aftermarket Android OSes do not even implement verified boot (or allow turning it off without users being able to see it), breaking its security model with the "standard patch" is fine for me.

That means the security expectations they have from Play services for enforcing certain component signature checks, key pinning and a security model need to be provided by microG in their entirety to avoid downgrading security.

Technically correct. I'd be happy to hear if there was any Android app developer that had any security expectations at all from Play services (which are actually met by Play services). And I'd happily adjust microG (and/or accept pull requests) to meet these expectations.

If Play services pins the signature for one of their services but microG does not, then microG is reducing the data security and therefore hurting privacy. The same applies to missing aspects of the security model. It is not without costs.

Nothing is without costs. Using microG breaks many apps or makes them behave incorrectly. And I hope it's clear to users that microG may have security issues that are not present in original Play services (and vice-versa). However, I believe (and I guess the same is true for most microG users) that using an open-source and intentionally privacy-preserving implementation of the features provided by Play services is worth these costs. Your opinion may vary.

Most integrations of it aren't doing both of those things.

OS developers may decide that

  • It's secure enough for them and their users if they grant permission for signature spoofing on request and per-app.
  • Official microG builds are trustworthy.
  • Having microG support is not worth any compromise and users should rather stick with no Play services or opt into tracking.

All of these are decisions that OS developers make. Same as OS builders have to make choices about which entities to trust for hosting, TLS certificates, build servers, etc. These decisions are why some users pick one OS and others pick another OS.

I'm happy about the existence of GrapheneOS and I think it's perfectly fine that you decided against supporting microG.

@thestinger

I don't get this point. If we restrict the permission to system apps, the system needs to be compromised. If the system includes a compromised microG or trusts a signing key that is compromised, this would indeed be problematic, same as if the system was compromised anywhere else. If an OS developer does not feel happy with trusting the official builds, they can easily build microG from source and create their own signed build in their system.

You're presenting it as if it has to be shipped with a backdoor to be compromised. That's not what was actually being talked about and certainly isn't required. A compromise only requires that the code has a remote or local code execution vulnerability. Since you're bypassing the security model, it won't be properly contained in the sandbox. microG will be able to spoof the signatures of highly privileged components and arbitrary other apps, not simply Play services. This is not the same as any other app being exploited. The whole point is that by severely weakening the sandbox and making microG into a highly trusted component, you're weakening the system against attacks. You treat your code as infallible and bypass the standard sandboxing. It's very unfortunate that instead of addressing the vulnerabilities being introduced, you're continuously misrepresenting the issues being raised.

By denying that this is what you're doing and not communicating it to the people using the patch, you've made these into security vulnerabilities. It's not a design decision but rather a security vulnerability that you're covering up. We're more than happy to obtain CVE assignments and draw attention to these vulnerabilities being denied by the project.

As I mentioned above, the "standard patch" is for "easy and sufficiently secure development". It's not meant to be super-secure, just sufficiently secure (and I consider everything that asks users for confirmation as sufficiently secure; it's basically a "sudo"). Considering that most aftermarket Android OSes do not even implement verified boot (or allow turning it off without users being able to see it), breaking its security model with the "standard patch" is fine for me.

The patch has far more serious issues than bypassing verified boot. You turn a basic bug in the UI layer into a very deep compromise of the OS with this approach. It's not sufficiently secure. It has awful security and puts users seriously at risk. It's usually being adopted with the standard patch, not a modified form of it. You don't provide an example of a secure implementation. You break far more than the security model of verified boot. You deeply break the sandbox-based security model of the OS and make vulnerabilities into far more serious ones that are able to cause far more harm to end users and persist with those privileges.

Technically correct. I'd be happy to hear if there was any Android app developer that had any security expectations at all from Play services (which are actually met by Play services). And I'd happily adjust microG (and/or accept pull requests) to meet these expectations.

It's not simply technically correct. It's a major downgrade from how connections/components are secured with Play services itself and impacts users' privacy/security. It shows why the signature spoofing patch is compromising the application security model even when it's scoped to only the Play services signature and only by microG.

Nothing is without costs. Using microG breaks many apps or makes them behave incorrectly. And I hope it's clear to users that microG may have security issues that are not present in original Play services (and vice-versa). However, I believe (and I guess the same is true for most microG users) that using an open-source and intentionally privacy-preserving implementation of the features provided by Play services is worth these costs. Your opinion may vary.

GrapheneOS doesn't bundle Play services and doesn't support giving it any special privileges, just as it doesn't support giving microG any special privileges. All that it provides is support for running it at a regular sandboxed app without any of those special privileges or integration into the OS where the OS uses it as a backend for services.

OS developers may decide that

Most of them aren't actually making an informed decision, especially since you cover up and misrepresent the security issues with the approach.

I'm happy about the existence of GrapheneOS and I think it's perfectly fine that you decided against supporting microG.

Then please don't misrepresent our reasoning for not including it. CalyxOS and microG participate in an official capacity in the substantial harassment being perpetrated against our developers and raids on our community. Misrepresenting our statements + reasoning and pushing misinformation is one way you've encouraged the relentless attacks on us disrupting our project and harming our developers. Covering up security weaknesses is not a good look. If it's not an intentional design decision to seriously weaken the OS security model and bypass sandboxing, then they're security vulnerabilities. Which is it?

@thestinger

and/or accept pull requests

You can't really expect pull requests addressing problems you deny, especially when you close issues raising legitimate concerns about security vulnerabilities introduced by the approach and misrepresent the issues being raised. I'm sure someone would submit and maintain a patch avoiding the security weaknesses here. Perhaps people would also work on adding the missing security checks from Play services if the project didn't downplay it as irrelevant and pretend it isn't a problem.

The reason GrapheneOS hasn't included a properly scoped form of signature spoofing for only the Play services signatures by first-party builds of microG is due to not wanting to reduce the security of data those applications trust Play services to handle. Downplaying this and denying that the OS sandbox and principle of least privilege matter erodes our confidence in the ability of microG to provide privacy and security. Even if we can eventually convince you that these are real problems, that doesn't imply that whatever led to not seeing them as problems will have changed. Making our own first-party builds wouldn't mean that we aren't heavily relying on you to uphold privacy and security. Changing something specific in a fork doesn't change the approach taken for the overall project. I think it's pretty important for any project that cares about privacy and security to avoid weakening the app sandbox and greatly expanding attack surface. Even if you claim to only care about privacy and not security, it depends on it, and the app sandbox is a huge part of that. A compromise of microG via a vulnerability should only compromise data handled by microG and access/permissions it has been granted.

Not so long ago, we were auditing the microG code and figuring out how we could come up with a way to support it while meeting our requirements. https://grapheneos.org/usage#sandboxed-play-services does not reserve the app IDs within the OS, and Play services will never be bundled in any form, so it doesn't rule out supporting other options. We'd still like to have the stub implementation described at https://grapheneos.org/faq#google-services, but while we do have substantial resources available now, we're not going to be focusing so much on one specific area in the short term to the point that we offer multiple different options meeting our requirements. It's our hope that the support for Android apps in Windows 11 without Play services will greatly expand interest in supporting it, greatly reducing the need for making a stub implementation.

CalyxOS has dragged us into this with their escalating one-sided war waged on us but if you're doing the same kind of things too, then

You say it's "technically correct" but it's a real issue and is why GrapheneOS does not include a scoped implementation of signature spoofing for only the Play services signatures by first-party microG builds. Making first-party builds of it only takes your build and signing infrastructure out of the picture as attack surface. It doesn't fix security weaknesses in the code and doesn't avoid trusting that the project is going to seriously consider security and avoid weakening sandboxing and data security. We wanted to include this in GrapheneOS, and what was linked above was someone trying to come up with an approach that fit our requirements. A signature permission only allowing spoofing of the Play services signature would be fine too, but we won't sign microG with that key unless it upholds the same security.

There's a whole lot more to privacy than simply not using closed source code especially when so many open source projects do not seem to care much about privacy and security. Protecting data in transit and upholding the application security model is part of what's required for decent privacy. If someone wants to use Google apps and services, we're more than happy to let them do it, but without invasive integration into the OS or downgrading the privacy and security of user data.

@thestinger

In fact, if you look at GrapheneOS/platform_frameworks_base@d411882 you can see there has been explicit work done to avoid breaking microG integration by including this feature. This has no relevance to GrapheneOS but is there nonetheless for others using this code.

@mar-v-in
Member

mar-v-in commented Aug 7, 2021

You're presenting it as if it has to be shipped with a backdoor to be compromised.

You're presenting it as if there was a huge security issue when the security issue in fact is more like a sudo command. While certainly, sudo can cause huge security issues when used wrongly, it isn't a security issue of itself as long as only users are authorized to use it that should be and those users don't misuse it.

A compromise only requires that the code has a remote or local code execution vulnerability.

Are you talking about the signature spoofing patch itself, or the microG GmsCore package that has the permission to spoof signatures? In the first case I totally agree that a vulnerability in the patch could theoretically have massive impact because the patch changes the base framework which could affect everything.

microG will be able to spoof the signatures of highly privileged components and arbitrary other apps, not simply Play services.

The signature spoofing permission is granted on a per-app level. Apps with the permission can only spoof to change their own signature, not their package name or the signature of another app. microG GmsCore is installed under the com.google.android.gms package name. As a consequence, it can only spoof to be an app with the com.google.android.gms package name. I'd thus argue that microG GmsCore will only be able to spoof to be Play services or some com.google.android.gms package signed by an unknown third party.
Also note that any attempt to change the signature being spoofed will require installing an update to the app, as the spoofed signature certificate is encoded as meta-data in the AndroidManifest.xml and thus can't be changed without installing an updated app with a modified AndroidManifest.xml.
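
To make that concrete, here is a rough sketch of how such a lookup could look in framework-style Java. The meta-data key and permission name below are placeholders for illustration, not necessarily the exact identifiers used by any particular patch:

```java
// Sketch only: report a spoofed signature solely when the app declares it in its
// own manifest meta-data AND has been granted the spoofing permission.
import android.content.pm.ApplicationInfo;
import android.content.pm.PackageManager;
import android.content.pm.Signature;

final class FakeSignatureReader {
    private static final String SPOOF_PERMISSION = "android.permission.FAKE_PACKAGE_SIGNATURE"; // assumed name
    private static final String META_DATA_KEY = "fake-signature"; // assumed key

    private FakeSignatureReader() {}

    /**
     * Returns the signature to report for this app, or null to keep the real one.
     * The spoofed value comes from the app's own manifest, so an app can only
     * change what it reports for itself, not for any other package.
     */
    static Signature getSpoofedSignature(PackageManager pm, ApplicationInfo app) {
        if (app == null || app.metaData == null) {
            return null;
        }
        String encodedCert = app.metaData.getString(META_DATA_KEY);
        if (encodedCert == null) {
            return null;
        }
        if (pm.checkPermission(SPOOF_PERMISSION, app.packageName) != PackageManager.PERMISSION_GRANTED) {
            return null;
        }
        // Signature accepts a hex-encoded X.509 certificate string.
        return new Signature(encodedCert);
    }
}
```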

The whole point is that by severely weakening the sandbox and making microG into a highly trusted component, you're weakening the system against attacks.

"severely weakening" the sandbox is highly subjective. I don't know any app that could be misused by having the com.google.android.gms package fake any signature except for the Google signature (= the intended functionality). The patches only affect the PackageInfo.signatures field, which is deprecated and according to Android developer documentation should normally not be used to grant access to a third-party app. Instead one should either use permissions or PackageManager.checkSignatures(), both of which would not be affected by the fake signature.

By denying that this is what you're doing and not communicating it to the people using the patch

I stopped providing patches and merging them in the microG repository a long time ago (no patches for Android 10+). I'll happily remove those patches if you think this would improve the situation.

It's not a design decision but rather a security vulnerability that you're covering up.

Even if I'd say (and I don't agree with this assessment) THIS IS A BIG SECURITY RISK, it still is a design decision to accept that risk for the advantage of allowing users to use certain apps without proprietary Google code on their devices.

We're more than happy to obtain CVE assignments and draw attention to these vulnerabilities being denied by the project.

I don't think assigning a CVE to the patches which serve documentation purposes and are not shipped as part of microG GmsCore works or makes sense. Besides, I'm happy to receive any notification of security vulnerabilities in microG GmsCore. You can always mail security@microg.org for RD (PGP FP: 22F796D6E62E6625A0BCEFEA7F979A66F3E08422). And in that case you can also assign a CVE.

You turn a basic bug in the UI layer into a very deep compromise of the OS with this approach.

So does sudo.

that are able to cause far more harm to end users

Can you please give detailed examples of how signature spoofing can harm end users in practice, outside the desired functionality of spoofing Play services?

  • When the "standard patch" is used as is, with user accepting signature spoofing permission for arbitrary untrusted third-party apps.
  • When the "standard patch" is used as is, with user accepting signature spoofing permission for microG apps, assuming a remote code execution vulnerability in microG.
  • When a modified patch is used, that restricts to apps on the system partition and signed by the OS platform key (protectionLevel set to signature|privileged) and restricted to the package names com.google.android.gms and com.android.vending (which come preinstalled with the OS).
  • When a modified patch is used, that restricts to apps on the system partition and signed by the OS platform key (protectionLevel set to signature|privileged) and restricted to the package names com.google.android.gms and com.android.vending (installed by user on demand).
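
For the last two scenarios, purely as an illustration and not as any ROM's actual patch, the additional restriction could look roughly like this; the class and method names are made up for this example:

```java
// Sketch of a package-name allowlist on top of a signature|privileged permission.
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

final class RestrictedSpoofPolicy {
    // Only the two package names named in the scenarios above.
    private static final Set<String> ALLOWED_PACKAGES = new HashSet<>(
            Arrays.asList("com.google.android.gms", "com.android.vending"));

    private RestrictedSpoofPolicy() {}

    /**
     * @param packageName  package requesting to report a spoofed signature
     * @param isPrivileged whether the package is a privileged/system app that
     *                     holds the (signature|privileged) spoofing permission
     */
    static boolean mayReportSpoofedSignature(String packageName, boolean isPrivileged) {
        return isPrivileged && ALLOWED_PACKAGES.contains(packageName);
    }
}
```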

Feel free to send these examples to security@microg.org if you don't want to make them public.

It's a major downgrade from how connections/components are secured with Play services itself and impacts users' privacy/security. It shows why the signature spoofing patch is compromising the application security model even when it's scoped to only the Play services signature and only by microG.

You are making bold claims here, that original Play services would have a high privacy and security standard. These claims are hard or impossible to confirm or deny due to Play services not being open-source and Google not publishing any information on security issues in it. My findings during reverse engineering work on Play services do not support the claim of high security, and I can't take the claim of privacy seriously, given we're literally talking about Google here.

Then please don't misrepresent our reasoning for not including it.

I have not made, nor did I intend to make, any misrepresenting statements about GrapheneOS. My previous statements regarding ROMs deciding to not accept signature spoofing patches were with respect to LineageOS and ROMs that come with root access preinstalled. Those ROMs which don't put an emphasis on security can hardly be significantly affected by the signature spoofing patch.

CalyxOS and microG participate in an official capacity in the substantial harassment being perpetrated against our developers and raids on our community. Misrepresenting our statements + reasoning and pushing misinformation is one way you've encouraged the relentless attacks on us disrupting our project and harming our developers.

I am not participating and I never encouraged anyone to act in any capacity to do harm or damage to the GrapheneOS project and/or its developers. I applaud your approach of not adding support for microG or Play services and losing potential users for doing so, as this encourages developers to create apps that work without Google services. Your work (this, but obviously also the security improvements you are developing) is good for everyone.
I am not involved with CalyxOS. Of course I was in touch with developers of CalyxOS and some of them also contributed to microG, but that doesn't make me or the project a part of the CalyxOS project. FWIW, I'm not using CalyxOS on any of my devices but I do own a Pixel 3 with GrapheneOS for testing purposes (using it to test how apps behave with neither microG nor original Play services).

I'm sure someone would submit and maintain a patch avoiding the security weaknesses here.

There is no single patch to fit them all. Restricting signature spoofing to certain package names, signing keys, system apps, etc., can easily increase development effort. How about we create documentation about the possible security risks due to signature spoofing and how to mitigate these risks? This way, OS developers can make an informed decision and you can link there as one of the reasons in case users question why you decided not to support microG.

Perhaps people would also work on adding the missing security checks from Play services if the project didn't downplay it as irrelevant and pretend it isn't a problem.

I'm not downplaying any missing security checks in microG. In all the history of microG, I was made aware of two missing security checks and immediately fixed those.

We'd still like to have the stub implementation described at https://grapheneos.org/faq#google-services

As part of microG, I'm also releasing all the API interfaces which will dramatically reduce the workload of creating such stubs. In fact, many API implementations in microG are also only stub implementations.

CalyxOS has dragged us into this with their escalating one-sided war waged on us

As mentioned above, I'm not involved with CalyxOS. Please don't drag me personally or the microG project in your war.

If someone wants to use Google apps and services, we're more than happy to let them do it, but without invasive integration into the OS or downgrading the privacy and security of user data.

No matter how invasive the integration of original Play services is, it will always downgrade privacy significantly. Just to give an example: whenever any app requests GPS geolocation through Play services (using the Google Location Services API as suggested in the Android developer documentation), the device location and surrounding mobile network information will be uploaded to Google servers, linked to your device's "android id", which in itself is linked to your Google account when signed in. microG skips the uploading and just reports the GPS geolocation to the requesting app.
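
For illustration, this is the kind of standard client code the example refers to (assuming the app depends on the play-services-location client library); the call is serviced by whichever app provides com.google.android.gms on the device, original Play services or microG, while the requesting app's code stays identical. Error handling and the runtime location permission check are omitted for brevity:

```java
// Typical Fused Location Provider usage from an app's perspective.
import android.content.Context;
import android.location.Location;
import com.google.android.gms.location.FusedLocationProviderClient;
import com.google.android.gms.location.LocationServices;

final class LastLocationExample {
    private LastLocationExample() {}

    static void logLastLocation(Context context) {
        FusedLocationProviderClient client =
                LocationServices.getFusedLocationProviderClient(context);
        // Whichever implementation owns com.google.android.gms answers this request.
        client.getLastLocation().addOnSuccessListener((Location location) -> {
            if (location != null) {
                System.out.println("lat=" + location.getLatitude()
                        + " lon=" + location.getLongitude());
            }
        });
    }
}
```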

@thestinger

You're presenting it as if there was a huge security issue when the security issue in fact is more like a sudo command. While certainly, sudo can cause huge security issues when used wrongly, it isn't a security issue of itself as long as only users are authorized to use it that should be and those users don't misuse it.

I'm not claiming that this is anything close to that serious of a problem. That would be a very serious security vulnerability in an OS that actually has a proper application and multi-user security model. It's not true that it requires misuse. An app or OS component can be compromised by an exploit. A huge part of the security model is avoiding trust not just in apps but in the OS components themselves. It's not at all a huge massively privileged monolith, although that unfortunately does describe the Linux kernel architecture.

Sudo is a very poorly designed tool even on platforms without that kind of security model. It's generally quite misguided to use it and provides a false sense of security with how it's deployed/used in practice. Better to use actual access control. It's a quite poor implementation too, which is why OpenBSD phased it out along with others, and why others like Alpine are following their lead. Not particularly relevant to the fundamental design issues with the approach though.

Are you talking about the signature spoofing patch itself, or the microG GmsCore package that has the permission to spoof signatures? In the first case I totally agree that a vulnerability in the patch could theoretically have massive impact because the patch changes the base framework which could affect everything.

I'm talking about the expanded impact of a vulnerability in microG if spoofing is not scoped.

The signature spoofing permission is granted on a per-app level. Apps with the permission can only spoof to change their own signature, not their package name or the signature of another app. microG GmsCore is installed under the com.google.android.gms package name. As a consequence, it can only spoof to be an app with the com.google.android.gms package name. I'd thus argue that microG GmsCore will only be able to spoof to be Play services or some com.google.android.gms package signed by an unknown third party.

Privileges are granted not just based on package name + signature but also signature alone.

"severely weakening" the sandbox is highly subjective. I don't know any app that could be misused by having the com.google.android.gms package fake any signature except for the Google signature (= the intended functionality). The patches only affect the PackageInfo.signatures field, which is deprecated and according to Android developer documentation should normally not be used to grant access to a third-party app. Instead one should either use permissions or PackageManager.checkSignatures(), both of which would not be affected by the fake signature.

External use being phased out doesn't imply it's not used internally including by other APIs.

I stopped providing patches and merging them in the microG repository a long time ago (no patches for Android 10+). I'll happily remove those patches if you think this would improve the situation.

Not encouraging that approach would improve the situation. It won't change much if it's still encouraged elsewhere. That's not what stops us supporting it regardless.

Even if I'd say (and I don't agree with this assessment) THIS IS A BIG SECURITY RISK, it still is a design decision to accept that risk for the advantage of allowing users to use certain apps without proprietary Google code on their devices.

Proprietary Google code is still there in the apps using it for the Play libraries. Many of those libraries like the Ads one provide extensive functionality without Play services installed too.

I don't think assigning a CVE to the patches which serve documentation purposes and are not shipped as part of microG GmsCore works or makes sense.

It's still part of the project and also isn't what stopped us from integrating support for it.

You are making bold claims here, that original Play services would have a high privacy and security standard. These claims are hard or impossible to confirm or deny due to Play services not being open-source and Google not publishing any information on security issues in it. My findings during reverse engineering work on Play services do not support the claim of high security, and I can't take the claim of privacy seriously, given we're literally talking about Google here.

I don't see what's bold about saying that microG doesn't implement the same transport security or other security checks. A project being open source does not make it inherently more private or secure, and does not make it trustworthy.

https://arstechnica.com/gadgets/2021/07/for-years-a-backdoor-in-popular-kiwisdr-product-gave-root-to-project-developer/

I don't think it's a bold claim to think that Google is best positioned to secure the connections to their own services and to implement the internal security model and checks.

I have not made, nor did I intend to make, any misrepresenting statements about GrapheneOS. My previous statements regarding ROMs deciding to not accept signature spoofing patches were with respect to LineageOS and ROMs that come with root access preinstalled. Those ROMs which don't put an emphasis on security can hardly be significantly affected by the signature spoofing patch.

LineageOS doesn't screw up security enough that it isn't impacted by this. Sure, it doesn't have verified boot, but they might eventually change their minds about that. Still worth arguing against screwing things up further.

I clearly perceive this differently than you do because what I see is our reasoning/concerns being downplayed and misrepresented, and then that feeds into the attacks from others.

I am not participating and I never encouraged anyone to act in any capacity to do harm or damage to the GrapheneOS project and/or its developers. I applaud your approach of not adding support for microG or Play services and losing potential users for doing so, as this encourages developers to create apps that work without Google services. Your work (this, but obviously also the security improvements you are developing) is good for everyone.
I am not involved with CalyxOS. Of course I was in touch with developers of CalyxOS and some of them also contributed to microG, but that doesn't make me or the project a part of the CalyxOS project. FWIW, I'm not using CalyxOS on any of my devices but I do own a Pixel 3 with GrapheneOS for testing purposes (using it to test how apps behave with neither microG nor original Play services).

We won't ever include Play services in any form but we are fine with people using Google apps and services. The part we aren't fine with is it having special integration and privileges not available to other apps. In theory, microG could be something we were fine with supporting but definitely not bundling, but as is it's not. https://grapheneos.org/usage#sandboxed-play-services is an approach we're fine with as long as it's not bundled, is a relatively small amount of total code (~1000 lines of actual code would be fine) and grants no special privileges/access. It also led us to removing a few bits of leftover Play services integration cruft.

I'm not downplaying any missing security checks in microG. In all the history of microG, I was made aware of two missing security checks and immediately fixed those.

I'm sure you would have a similar opinion as you do about the signature spoofing scoping for most of the things we consider to be issues.

As mentioned above, I'm not involved with CalyxOS. Please don't drag me personally or the microG project in your war.

I've been dragged into this. Our community isn't making relentless offensive attacks on your projects. I'm here to defend our reasoning and to counter the claim that there's no significant security impact from bypassing these checks.

No matter how invasive the integration of original Play services is, it will always downgrade privacy significantly. Just to give an example: whenever any app requests GPS geolocation through Play services (using the Google Location Services API as suggested in the Android developer documentation), the device location and surrounding mobile network information will be uploaded to Google servers, linked to your device's "android id", which in itself is linked to your Google account when signed in. microG skips the uploading and just reports the GPS geolocation to the requesting app.

If the user has it configured that way, sure. It's also not a traditional desktop OS where apps can do what they want, access any data and impersonate the user. If someone installs Play services, or microG for that matter, on GrapheneOS, they're regular sandboxed apps without any special privileges. ANDROID_ID is a per-profile-per-app identifier, not a hardware identifier, and any issues with the approach are a general problem, not something specific to Play services. It has no access to hardware identifiers on GrapheneOS; it gets a fake serial number based on the normal unprivileged identifier. If someone installs Play services in 2 profiles and logs into their account, they have 2 separate devices connected to their account. If they wipe the profile, that device is dead and gone. It doesn't have location access unless they grant that to it and they choose when it has it. It's also limited to the profile that it's installed in.
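
For reference, reading ANDROID_ID is just this; since Android 8 the value is scoped per app signing key, per user profile and per device, which is standard platform behavior rather than anything specific to GrapheneOS or Play services:

```java
// Plain, unprivileged read of the per-profile-per-app identifier.
import android.content.Context;
import android.provider.Settings;

final class AndroidIdExample {
    private AndroidIdExample() {}

    static String readAndroidId(Context context) {
        // Not a hardware identifier; unprivileged apps get no serial/IMEI access this way.
        return Settings.Secure.getString(context.getContentResolver(),
                Settings.Secure.ANDROID_ID);
    }
}
```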

I don't really see how this is any different from using apps like Discord, WhatsApp, Tinder or whatever else people use. Many of those have extensive integration of Google's libraries, including ones like the Ads SDK functioning without Play services. The OS has to protect the user's privacy and security regardless, and users choose what data / access they grant to apps. Those apps can communicate within a profile with mutual consent just as they do with Play services. There are many other widely used libraries like Facebook SDKs too. The OS should be preserving privacy and security regardless of what the user installs, along with giving them good controls over what they make available to apps.

@chirayudesai
Contributor

I've been dragged into this. Our community isn't making relentless offensive attacks on your projects. I'm here to defend our reasoning and to counter the claim that there's no significant security impact from bypassing these checks.

@thestinger Please stop with this, there is no need to bring any of that here. There is no reason the other discussions cannot be had without any of this. Don't you see that you're simply doing exactly what you're falsely accusing us of? I urge you to not drag that here.

@mar-v-in Sorry about that, I did not want to see this here but for the record we have no involvement in any of this. We're grateful for microG, and the only things I say are about our implementation of the signature spoofing patch with a link to this thread for providing the rationale about the patches present in the repo.

@thestinger

Please stop with this, there is no need to bring any of that here

I'm responding to escalating attacks and misinformation from you folks.

There is no reason the other discussions cannot be had without any of this. Don't you see that you're simply doing exactly what you're falsely accusing us of?

I'm not falsely accusing you of anything or engaging in that behavior. You're simply being incredibly dishonest. I've seen you participating in the threads where these attacks are happening and you have no problem with subtly encouraging it.

@thestinger

@mar-v-in Sorry about that, I did not want to see this here but for the record we have no involvement in any of this

If you're not involved in the recent escalating attacks, why do I see you participating where it's happening, perfectly content to promote CalyxOS based on harming GrapheneOS with malicious misinformation?

@chirayudesai
Contributor

@mar-v-in Sorry about that, I did not want to see this here but for the record we have no involvement in any of this

If you're not involved in the recent escalating attacks, why do I see you participating where it's happening, perfectly content to promote CalyxOS based on harming GrapheneOS with malicious misinformation?

I haven't mentioned CalyxOS once here outside the patch. The issue is about a patch made by somebody for GrapheneOS, and I happily clarified what it does rather than any attacks. You're the one promoting GrapheneOS here and your play services stub. You're doing exactly what you're falsely accusing me of.

Please stop, Daniel; this place is for microG discussion, not fights or bickering.

@thestinger

You're the one engaging in increasingly active malicious attacks on our project. These have massively ramped up in the past week across a bunch of platforms. Don't pretend that CalyxOS is not involved. The dishonesty will just increase our response to the attacks. We're not given any peace anywhere. There are non-stop attacks and raids all day, every day. It's your community engaging in these attacks across platforms, encouraged by the project which heavily spreads misinformation and makes dishonest claims including on your site.

@thestinger

I haven't mentioned CalyxOS once here outside the patch. The issue is about a patch made by somebody for GrapheneOS, and I happily clarified what it does rather than any attacks. You're the one promoting GrapheneOS here and your play services stub. You're doing exactly what you're falsely accusing me of.

I'm here defending us from misrepresentations and your usual spreading of dishonest claims.

@chris42
Author

chris42 commented Aug 7, 2021

As the starter of this thread and the one who posted the original question, I intended to see whether better collaboration or alignment was possible between microG and GrapheneOS, which in my view are both incredible projects with talented people aboard.

I did not want this to become a pit of claims and hurt feelings.

@thestinger After observing the GrapheneOS community, I can understand where you are coming from. I have seen multiple weird people showing up in issues and chats, where I do not understand who they are, why they are attacking you, or what they think they will achieve. I understand that this is stressful and hurtful, but with regard to Marvin (microG community), I know that he has no part in this, and if anything he said is misused by someone, he would be more than willing to correct that.

@chirayudesai and @thestinger Your discussion needs to take place somewhere else. Not to hide it or anything, but so that you two can talk it out and give each other a second chance. Trust me, this is not about being right or wrong; this is a matter of understanding and agreeing on a path forward that removes the bitterness.

@thestinger

@chirayudesai has recently been actively spreading misinformation and attacking me with false claims, and that is what he was doing here in a subtle way along with using it to push his attacks elsewhere.

@mar-v-in
Member

mar-v-in commented Aug 7, 2021

@thestinger
I do not deny that the signature spoofing patch provided by microG and most of its variants do have an impact on the security of the system. But I have also heard a lot of false accusations like "With signature spoofing an attacker can create a modified update of an app and steal the private app data". This is not how signature spoofing works, and many of these statements were from leading LineageOS developers; people who should know better.
This is why I started to be very reserved as soon as someone claims "signature spoofing is a big security issue", because I'm not aware of any practically possible or relevant attack that can be run due to signature spoofing - except faking to be Play services, which is exactly the purpose of it. I did check multiple versions of AOSP code for references to the PackageInfo.signatures field to ensure none of its uses could be negatively impacted by the signature spoofing patch (under the assumption that any app installed to the system comes with a trusted AndroidManifest.xml, which even holds in case of code execution vulnerabilities). That doesn't mean it's not possible that there is other code that would be significantly affected by signature spoofing, but it makes it less likely that such issues become relevant in practice, even without further mitigation (like restricting the permission to certain package names or system apps).

In any case, please don't let my stance on signature spoofing stop you from reporting other security issues in microG should you see any, especially since you have already put some time into auditing the microG code. And I'd also be happy to assist you in creating a stub-only version of microG should you ever want to do that.
