Learning from the mistakes of Firefox OS #10

Open
benfrancis opened this issue Nov 23, 2022 · 3 comments

Comments

@benfrancis

benfrancis commented Nov 23, 2022

Hi,

I worked on Firefox OS at Mozilla for five years, founded the Webian project and am now an Invited Expert in the W3C Web Apps Working Group.

I wanted to share some experience with signed packaged apps from my time working on Firefox OS, in the hope that you can avoid repeating some of the mistakes we made a decade ago. The long version of the story is in a blog post, but I will share some key technical lessons here.

There were two key problems that led us to implement signed packaged apps in Firefox OS in circa 2012:

  1. We needed apps to work offline and the state of the art of the time was AppCache, which had many issues
  2. All of the UI in Firefox OS (including the system UI) was implemented using web technologies, so we needed to expose many very privileged APIs (low-level access to hardware, making phone calls, sending text messages, etc.) to web content

Due to the time constraints of getting a product to market, "mozApps" (signed packaged apps) were originally intended as a temporary solution to these two problems, for a subset of Firefox OS apps which needed offline functionality or access to the most privileged APIs. In the longer term the intention was that we'd find (and standardise) an alternative solution which worked with real hosted web apps, but by the time the project was cancelled five years later we still hadn't delivered that alternative.

In my view this decision ended up being the biggest single technical mistake we made for Firefox OS, because it ultimately ended up adding to the problem (of centralised single-vendor app stores) which we'd set out to solve.

It had two huge downsides:

  1. Although not the original intention, privileged apps ultimately had to be centrally reviewed and signed by Mozilla in order for them to be installable on Firefox OS
  2. It negated what I would argue are all of the biggest benefits of web apps including linkability, discovery, progressive enhancement, deep linking, transparent automatic updating and cross-platform use

Ultimately, they were not the web. That was important because Firefox OS was meant to prove that the web as a platform could provide a viable alternative to native apps, not just create another vendor-specific packaged app format.

mozApps had many similarities with your current proposal:

  • They were packages, downloaded over the web
  • The packages were signed using public key cryptography
  • They were used to provide access to privileged APIs like low level access to hardware, network sockets, webviews and bypassing same-origin checks
  • They used a special URI scheme (app://) with unique origins (UUIDs) separate from the DNS system
  • They were updated by downloading a new package
  • They used installation as a signal of user trust
  • They ran inside a special sandbox isolated at the OS level
  • Their storage was isolated from other web content
  • They always launched in a standalone window
  • Out of scope navigations would open in the browser (or something similar to Chrome Custom Tabs)

Re-visiting those two original problems today:

  1. The offline problem is now solved by Service Workers
  2. The privileged API problem is still a hot topic of debate. I'm still firmly in the camp that privileged low-level hardware APIs like WebBluetooth, WebUSB, WebMIDI and WebSerial are not safe to expose to web content using the web's current security model, because it's not possible to get informed consent from users. There are just too many ways in which those APIs can be misused.

So how would I solve it today?

I would definitely avoid packaged apps which have to be signed by a central authority. Having a solution where in theory anyone could sign the app is not good enough; it has to work that way in practice. The tendency of these types of systems is to centralise around a single authority, usually a single-vendor app store. I would argue that apps which are not linkable (including deep links to individual resources) and have to be installed and updated from a central app store are the antithesis of the web.

We had started work on "hosted packaged apps" in the later days of Firefox OS, which I think worked a bit like web bundles and could be self-hosted by app authors. But they still had the problem that someone had to sign the packages to say they were safe. It's as much a social problem as it is a technical one.

Maybe you can design a solution where packages can be self-hosted at a URL on the web, support deep links, and can be signed by the app author rather than an app store owner, but you still need to solve the permissions UI problem.

Personally, my instinct has always been that the answer to privileged hardware APIs lies on the server side of the web stack rather than the client side. To a browser vendor every problem looks like a DOM API. But what if rather than exposing hardware features as JavaScript APIs, they were treated as web resources with URLs?

The starting point for this would be a local web server which exposes hardware resources as locally hosted web services (e.g. using the Web of Things to describe capabilities which are exposed via an HTTP REST API or WebSocket sub-protocol from localhost or a .local domain). These higher level web services could be much more application specific instead of providing generic low level access to hardware, which could help solve the problem of informed user consent. They could be authorised by the user using a mechanism like OAuth. The biggest problem I see with this approach is that TLS doesn't work on local networks. There is a W3C Community Group looking at that problem but they haven't yet come up with a solution.
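As a rough illustration of the idea, a minimal W3C Web of Things Thing Description for such a locally hosted service might look like the following. This is a sketch only: the gateway hostname, device, action name and security details are all invented for the example, not taken from any real product.

```python
import json

# Sketch of a W3C Web of Things Thing Description for a device exposed
# as a locally hosted web service. All names and URLs are illustrative.
thing_description = {
    "@context": "https://www.w3.org/2019/wot/td/v1",
    "title": "Living Room Speaker",
    "securityDefinitions": {
        "oauth2_sc": {"scheme": "oauth2", "flow": "code"}
    },
    "security": ["oauth2_sc"],
    "actions": {
        "playMusic": {
            "description": "Play an audio stream on this speaker",
            "input": {
                "type": "object",
                "properties": {"url": {"type": "string"}}
            },
            # A higher-level, application-specific endpoint rather than
            # generic low-level hardware access:
            "forms": [
                {"href": "https://gateway.local/things/speaker/actions/playMusic"}
            ]
        }
    }
}

print(json.dumps(thing_description, indent=2))
```

A web app would then request authorisation for this one described action (e.g. via OAuth) rather than for raw Bluetooth or USB access to the device.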

Those are my thoughts so far. If you'd like to discuss this further, please feel free to reply here, or get in touch.

@fabricedesre
Contributor

Hi! I also worked on FxOS and now at KaiOS shipping the same platform.

My feeling here is that this proposal and Ben don't have the same goals. The crux is to figure out what's best for the web at large - users first, then developers. All the "packaged apps with opaque urls" proposals naturally align themselves with the app store model, where a single actor has full control over the user experience and can stifle innovation by being an omnipotent middleman between app providers and users.

I'm not convinced by Ben's solution of using backend services - this looks like just moving the problem elsewhere in the stack. How do you trust the backend services?

In https://capyloon.org we explore alternative solutions with decentralized web protocols. Using ipfs:// or ipns:// resources gives offline support after first access, because these resources are immutable in a way that is simpler than service workers (no extra work for the developers). In contrast with package signing, where the signature is used both for integrity and reputability, IPFS gives you just the integrity part, and lets us layer a reputability/trust solution on top. As Ben wrote, providing trust is as much a social problem as a technical one, so it's important to decouple that aspect from content integrity. You will likely trust the OS vendor by default (since you already trust it...), but could also have more involved trust schemes like "I trust apps that are trusted by this set of people I know".

ipfs:// urls are resolvable like any regular https:// ones, so deep linking comes for free; they fit well with the overall web security model and can be considered a secure context because of the integrity guarantees. This seems like a better primitive to build on than re-inventing something that we know has flaws.

But again, it's all about goals. If the intention is to make Electron apps obsolete, the current proposal probably fits the bill. It would just not make the Web much better.

@benfrancis
Author

benfrancis commented Nov 23, 2022

Hi Fabrice,

I was hoping you might comment because I know you've been doing some interesting work in this area in Capyloon.

> My feeling here is that this proposal and Ben don't have the same goals.

If the goal is to create trusted apps which can be distributed from a Google app store, a Microsoft app store and an Apple app store and nowhere else then I suspect you may be correct.

> I'm not convinced by Ben's solution of using backend services - this looks like just moving the problem elsewhere in the stack.

FWIW I agree my proposed alternative is half baked. I would love to find a way to prototype it in order to test my hypotheses and work through the security model.

> How do you trust the backend services?

The back end services themselves would be provided by the OS (or web browser) and would therefore be trusted by default, as you described above.

My thinking is that semantic annotations in Thing Descriptions could provide much richer semantics around what a service does, so a user can provide more informed consent for a given web app to be given access to a given service. E.g. "Do you want to give this web app access to play music on your Sony speaker?" or "Do you want to give this web app access to control your Aerial quadcopter?" rather than "Do you want to give this web app access to Bluetooth?". Or "Do you want to give this web app access to notes played on your M-Audio MIDI keyboard?" rather than "Do you want to give this web app access to MIDI, which could include updating the firmware of your MIDI keyboard to turn it into a HID device and send inputs on your behalf?"
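One way to picture this is a function that derives a capability-scoped consent prompt from a Thing Description, instead of asking for raw protocol access. This is a sketch of my own interpretation; the field names and wording are invented for illustration, not from any spec.

```python
# Hypothetical sketch: build a human-readable, capability-scoped consent
# prompt from a Web of Things-style description, rather than asking for
# raw "Bluetooth" or "MIDI" access. Field names are illustrative.

def consent_prompt(app_name: str, td: dict, action: str) -> str:
    """Build a permission question scoped to one described capability."""
    description = td["actions"][action]["description"]
    device = td["title"]
    return (f"Do you want to give {app_name} access to "
            f"{description.lower()} on your {device}?")

td = {
    "title": "Sony speaker",
    "actions": {
        "playMusic": {"description": "Play music"},
    },
}

print(consent_prompt("this web app", td, "playMusic"))
# → Do you want to give this web app access to play music on your Sony speaker?
```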

It's true that those semantics could be provided on the client side rather than the server side as I have described, but the Web of Things provides an already standardised mechanism for describing the capabilities of devices and consuming those capabilities using existing web protocols, in such a way that can be authorised using existing proven security schemes like OAuth. It would certainly be a radical departure from how web apps currently interact with hardware though.

> In https://capyloon.org/ we explore alternative solutions with decentralized web protocols.

I haven't evaluated these solutions in detail, but I'm interested to learn more.

> If the intention is to make Electron apps obsolete, the current proposal probably fits the bill. It would just not make the Web much better.

I agree with this, and if the goal is to standardise packaged apps (which happen to use web technologies but aren't web apps) then I question whether the W3C is the right place to do that. We already went through this with the System Applications Working Group.

@reillyeon
Collaborator

Thanks for the perspectives and my apologies for not responding in a timely manner. I've read through these posts a couple times but haven't been able to sit down and write out a thorough reply until now.

> My feeling here is that this proposal and Ben don't have the same goals.

> If the goal is to create trusted apps which can be distributed from a Google app store, a Microsoft app store and an Apple app store and nowhere else then I suspect you may be correct.

I would say the goal is to create trusted apps. Distribution from a store is one of the most obvious solutions for establishing trust but isn't necessarily the only one. In particular both Windows and macOS offer code signing without the store distribution component as an alternative that offers more control for developers while still providing a strong trust signal. Distributed models, like Binary Transparency, may also be practical. As @fabricedesre mentioned, you can start by trusting the OS vendor but also expand to trusting apps that are trusted by people you know.

I think that packaging is a key aspect of making trust work. Transparent automatic updates are great (and still possible with packaging) but the web lacks the concept of application versioning and that requires a package (or at least a resource manifest). SRI gets us very close but without a way to pin the versions of all of a site's top-level documents it is too much of a moving target to trust. I think an IPFS/IPNS approach would work similarly, with IPFS providing integrity (a hash that represents an application version and points to an immutable set of resources) and IPNS providing identity (a public key which is used to publish new versions that reference that hash). Further trust can be layered on top of that. The current proposal does the same thing with Web Bundles: a developer's signature over the Web Bundle provides integrity and identity and additional signatures can be added to provide trust signals.
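The integrity half of that split can be sketched very simply: hash every resource, then hash a canonical manifest of those hashes to get one identifier per immutable application version (loosely analogous to an IPFS CID or SRI pinning; the manifest format below is invented for illustration). Identity and further trust signatures would then be layered over this identifier.

```python
import hashlib
import json

# Sketch: derive a single integrity identifier for an immutable set of
# application resources. The manifest scheme here is illustrative, not
# the actual Web Bundle or IPFS format.

def resource_hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def version_id(resources):
    # resources: dict mapping path -> resource bytes
    manifest = {path: resource_hash(body) for path, body in resources.items()}
    canonical = json.dumps(manifest, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

v1 = version_id({"/index.html": b"<h1>hi</h1>", "/app.js": b"boot();"})
v2 = version_id({"/index.html": b"<h1>hi!</h1>", "/app.js": b"boot();"})
assert v1 != v2  # any changed resource yields a new version identifier
```

Because the identifier commits to every resource, there is no "moving target": a signature over it (the identity half) vouches for exactly one version.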

> Maybe you can design a solution where packages can be self-hosted at a URL on the web, support deep links, and can be signed by the app author rather than an app store owner, [...]

As mentioned above, in this proposal developers are the first ones to sign their packages, not an app store. A package could be self-hosted at a URL, and maybe apps that ask for no special permissions could work that way without any additional trust signals.

The deep linking restrictions are there to help the app developer protect themselves from malicious content, similar to X-Frame-Options, and the point is that the developer has to opt-in to a particular page being externally linkable.

> How do you trust the backend services?

> The back end services themselves would be provided by the OS (or web browser) and would therefore be trusted by default, [...]

I like the idea of trusted services mediating between low-level (and thus dangerous and hard-to-explain) APIs and applications requesting high-level (and thus more explainable to users) APIs, but this is my sticking point as well: it feels like shifting the problem elsewhere without solving the trust problem, because not all of these services can be provided by the OS vendor.

I don't disagree with the argument that the restrictions on IWAs mean that they aren't truly part of the web.

If the web can't be trusted with these capabilities, and a separate "not-web" platform is therefore necessary, then maybe this can be the platform for building services that need to be more trusted, built on top of the web technology stack rather than something proprietary and vendor-specific.

> If the intention is to make Electron apps obsolete, the current proposal probably fits the bill. It would just not make the Web much better.

> I agree with this, and if the goal is to standardise packaged apps (which happen to use web technologies but aren't web apps) then I question whether the W3C is the right place to do that. We already went through this with the System Applications Working Group.

This might not be the web (and so might not be in-scope for the W3C), but I do think that considering this proposal in some sort of standards organization is the right thing to do for users, because it gives them the same choice they have when developers target the web: cross-platform compatibility built on standards.
