
Integrity check for ASAR files #9648

Closed
wants to merge 2 commits

Conversation

@develar (Contributor) commented Jun 1, 2017

Addresses Code Signing of ASAR files.

  • Any application that uses external files as code sources is not protected, and code signing doesn't make sense for it.
  • Anyone can use a trusted app from a trusted vendor as a template for a malicious app.
    • macOS: static validation fails (when you manually check using the codesign tool or other tools), but dynamic validation (performed on run for quarantined apps) passes.
    • Windows: there is no app bundle concept at all, no way to statically check the app, and no dynamic validation. Nothing prevents tampering.
  • Java and Electron apps are affected.

How can we fix it?

  1. Embed code sources into the executables.
    1.1 Modify the existing executable.

    As this is not possible on macOS, this approach is rejected.

    1.2 Link during build. This would complicate Electron app development, so another approach should be used.

    So, although embedding code sources into the executables is the most secure approach, the price is too high.

  2. Really sign the ASAR file and verify the signature on ASAR file load. The implementation would be complex and fragile (it is very easy to verify code integrity, but identity validation is not). So, this approach is rejected.

  3. Keep code integrity data in the application manifest and verify only integrity on ASAR file load. As the application manifest is part of the executable file (Windows) and Info.plist is part of the application bundle (macOS), we don't need to verify identity, and so our implementation is robust and straightforward. If the integrity data is changed, the code signature is broken and the application is marked as damaged by the OS.

The third approach, "code integrity data in the application manifest", is implemented in this PR.

Some notes:

  • macOS specific: for as-yet-unknown reasons, even a modified Info.plist sometimes does not prevent the app from running on macOS. Under some conditions the app is marked as damaged, but under other conditions it is not (interestingly, in both cases the app is quarantined; com.apple.quarantine can be checked using the xattr tool). This doesn't invalidate the whole idea of this PR:
    • a small amount of data can be embedded into the existing Mach-O object file if it is proven that Info.plist is not really secure. So, that can be implemented later if needed.
    • static validation does not pass (since the code signature is broken), so the user can still verify the application using standard tools.
  • sha512 is used because it is faster than sha256 on x64 machines.
    • 5 MB is hashed in 4 ms, 30 MB in 45 ms. So, the integrity check doesn't slow down app loading.
  • macOS specific: it is officially not recommended to sign non-Mach-O files. So, it is unlikely that a PR to electron-osx-sign will be filed to sign ASAR files.
  • Windows is insecure, and even macOS Gatekeeper doesn't work as expected... Should we worry about code integrity when even OS-level checks are so insecure? Well, that is no excuse for us. The integrity check is cheap and the build process is not complicated. So, why not make sure that at our level the user is protected and we do our best to protect users?

Checksums are stored as a JSON object under the key AsarChecksums. If the key is not specified, no integrity check is performed. electron-builder (where asar is enabled by default) will add the integrity data on build (this will be implemented as a separate npm module so that other build tools can reuse the logic).

Implemented for macOS only for now; I want to hear your feedback.

update 1: macOS performs full static validation on the first run of a downloaded application. So, the changes in this PR are not "should" but "must" be implemented (a planned fix in electron-osx-sign is under investigation).

```cpp
std::vector<std::string> GetSearchPath() {
#if defined(OS_MACOSX)
  if (asar::GetAsarChecksums().empty()) {
    return {"app", "app.asar", "default_app.asar"};
```

@develar (Author, Contributor) commented Jun 1, 2017

Code duplication (the default search path) will be eliminated once Windows is supported.


```cpp
namespace asar {

std::string GetAsarChecksums();
```

@develar (Author, Contributor) commented Jun 1, 2017

Maybe another string type should be used; I am not an expert.

```js
try {
  packagePath = path.join(process.resourcesPath, packagePath)
  packageJson = require(path.join(packagePath, 'package.json'))
  break
} catch (error) {
  if (error.code && error.code.startsWith('EDAMAGED')) {
    throw error
  }
  continue
}
```

@develar (Author, Contributor) commented Jun 1, 2017

We must not swallow the exception if the file is damaged.

```diff
@@ -103,14 +104,15 @@ require('./guest-window-manager')
 // Now we try to load app's package.json.
 let packagePath = null
 let packageJson = null
-const searchPaths = ['app', 'app.asar', 'default_app.asar']
-for (packagePath of searchPaths) {
+for (packagePath of asar.getSearchPath()) {
```

@develar (Author, Contributor) commented Jun 1, 2017

I hope this is the only place where we do the search, since it is very critical (see the comment for the GetSearchPath method in atom_api_asar.cc).

@MarshallOfSound (Member) commented Jun 1, 2017

This looks pretty cool; I will be interested to see how much you can mess with the Info.plist before macOS refuses to launch the app.

Quick question though: will this validate any .asar file you load against the checksum in the Info.plist, or only the app.asar file? I want to be sure that any other asar file we might want to load won't be rejected for not matching the checksum of the main app.asar file.

@develar (Author, Contributor) commented Jun 1, 2017

see how much you can mess with the Info.plist before macOS refuses to launch the app.

As soon as Info.plist is modified, the code signature is broken and the app is marked as damaged. It worked that way yesterday, but doesn't today... see "Under some conditions the app is marked as damaged". Because this doesn't invalidate the whole idea of the PR, I will continue the investigation (using other Macs/users and so on) a little later.

will this validate any .asar file you load against the checksum in the Info.plist

Any. Every required asar file is validated, and the search path is limited to ensure the check cannot be bypassed. If an asar file is not listed in the checksums, an error is thrown.

```xml
<key>AsarChecksums</key>
<string>{"electron.asar": "512hash", "app.asar": "512hash"}</string>
```
@MarshallOfSound (Member) commented Jun 1, 2017

@develar What if the app requires an ASAR file outside of its own package? For instance, I'm working on something at the moment that implements plugin support in an app. The current implementation ships ASAR files for each plugin, and the plugin manager internally does public/private key validation of the plugin ASAR files.

With this implementation, AFAIK, that would stop working because it would attempt to validate the plugin ASAR files against the AsarChecksums key, and they wouldn't exist in that key even though app.asar would.

Can we set this up so any ASAR file not in the AsarChecksums object isn't validated? Removing an entry from the object would still cause the Info.plist to be invalidated, so I think it's safe.

@develar (Author, Contributor) commented Jun 1, 2017

Can we set this up so any ASAR file not in the AsarChecksums object isn't validated? Removing an entry from the object would still cause the Info.plist to be invalidated, so I think it's safe.

I want to avoid any possible side effects when some asar file is added. But we don't need to relax the requirements: such plugin ASAR files are located outside of the application bundle, right? So, we can skip this validation for external asar files.

@MarshallOfSound (Member) commented Jun 1, 2017

Having read through this, isn't bypassing it as simple as generating your own electron.asar file without the validation JS and swapping it into the resources folder? For this to be effective, I think all the validation logic needs to be on the native side of things.

@develar (Author, Contributor) commented Jun 1, 2017

all the logic for validation needs to be on the native side of things.

No doubt; that's why it is fixed in Electron and not in userland. It is on the native side, not in electron.asar: the code is processed and embedded into the executable by js2c.

@MarshallOfSound (Member) commented Jun 1, 2017

Dope, forgot we did that for the asar js files 😆 There goes all my questions then 😄 Will defer to others for opinions / review 👍

Integrity check for ASAR files: add externalAllowed option
By default false, external loading is forbidden
@develar (Author, Contributor) commented Jun 3, 2017

The externalAllowed option has been added. By default it is false. If false, external asar files (outside of the resources directory) are forbidden; if true, the integrity of external files is not checked.

Thanks @MarshallOfSound for review and comment about external files.

asar-integrity 0.1.0 has been published. electron-builder with this support is not yet released.

@rsaccon commented Jun 5, 2017

Great feature, thanks for implementing this. But what exactly happens in Electron at startup if the asar integrity check fails? I looked through the sources but couldn't find the answer.

@develar (Author, Contributor) commented Jun 5, 2017

An error is printed to stderr and the process exits. If only the user asar file is corrupted (not electron.asar), you also get an error dialog.

@seanzer (Contributor) commented Jun 8, 2017

Are there changes to support validation on Windows? I read through this discussion and looked at the PR, but I don't see any Windows support. Maybe I'm missing it?

@develar (Author, Contributor) commented Jun 8, 2017

@seanzer Please note "Implemented for macOS only for now, I want to hear your feedback."

I am going to add Windows support as soon as the Electron maintainers approve the idea.

@seanzer (Contributor) commented Jun 8, 2017

Oh, thanks for the clarification. I wonder how this change could be extended to support the unpacked-files feature described here: https://electron.atom.io/docs/tutorial/application-packaging/#adding-unpacked-files-in-asar-archive

Adding the ability to validate native node modules might not make sense in this PR, but I'm curious whether anyone has given it any thought. In my opinion, this isn't a general "node" problem, since Electron apps are typically prebuilt and distributed.

@develar (Author, Contributor) commented Jun 8, 2017

Adding the ability to validate native node modules might not make sense with this PR

I am the author of electron-builder. To provide the best results, asar is enabled by default. I did my best to reduce users' pain: electron-builder automatically detects files that should be unpacked and provides a very handy asarUnpack option if you need to tweak the automatic resolution.

So, in my experience, electron-builder's smart unpack algorithm doesn't unpack native node modules; they stay part of the ASAR file. And so, this PR protects native node modules as well.

In the future, electron-builder could also produce integrity data for all unpacked files.

@develar (Author, Contributor) commented Jun 12, 2017

@kevinsawicki could you please review?

@groundwater (Member) commented Jun 12, 2017

There is an effort worth watching, https://github.com/WICG/webpackage, that may eventually bring signed packages to Node.

Now might be a great time to give input into that project. If Node can natively support a signed bundling format, that would make our lives easier and could potentially serve as an ASAR replacement.

@develar (Author, Contributor) commented Jun 12, 2017

@groundwater thanks for the links. Adobe AIR is dead; for users a self-contained app is better (no dependency on some extra app manager). But that is another story.

Technically, a generic implementation of code signing requires identity validation. That can increase app start time and would duplicate the OS implementation (and, as a result, could introduce extra sources of bugs). It would also complicate build tools (currently we just compute a sha512 checksum).

@kevinsawicki (Contributor) commented Jun 14, 2017

could you please review?

Added this to the project board to queue up review of this next week, thanks.

@zcbenz (Member) commented Jun 28, 2017

This only checks the JavaScript path for reading asar archives; when you load a page from an asar archive via file://, this check will not work.

While this should be fine for most apps, it could be simply bypassed by replacing electron.asar with one that does not do the checks. (EDIT: that file is embedded into the exe, so it is safe.)

So I think the check should be done in C++ when creating archives (atom/common/asar/archive.cc), which works for both the C++ and JavaScript APIs.

@develar (Author, Contributor) commented Jun 28, 2017

by replacing electron.asar with one that does not do checks.

No, please see above #9648 (comment)

when you load a page in asar archive via file:// this check will not work.

In general, it is the user's responsibility not to do such things; e.g. the user can simply read a file and evaluate any external JS, and we cannot and should not forbid that. I.e. we could, but that task is a little out of scope. Our current task is to make code signing work; "lint user code" is a different task.

@develar (Author, Contributor) commented Oct 2, 2017

The response so far has been that it's not in scope, but that's unsatisfactory, because it fundamentally affects the usefulness of the whole idea.

Because, as @timfish already pointed out above, the question is not correct. macOS does a code signature check on each run, but not a full static one, only a dynamic one (Info.plist, executable).

which then wouldn't be signed by the original publisher

...and would be marked as damaged and blocked from running.

@MarshallOfSound (Member) commented Oct 2, 2017

Replacing the entire bundle won't work.

But it will. I can ship an app called "Your Payment Service", sign it with my certificate, delete your app, and add mine. If you lose control of the user's machine, there is no way to stop people doing whatever they want.

@galvesribeiro commented Oct 2, 2017

@MarshallOfSound

I can ship an app called "Your Payment Service" and sign it with my certificate, delete your app and add mine

If the app was first signed with the developer's certificate, and then you replace that app executable with another one signed by your certificate, it will fail the check.

It is not about whether or not the app is signed; it is about whether it is signed with the correct/original key.

However, if you have access to the original signing key, of course you can bypass any validation, since you can sign the whole package, including malicious code, and replace the other package.

But bear with me: if you are dealing with application development that requires tight security procedures, that key will be heavily secured without human access.

Again, this PR will not make the application hacker-proof. It is just one of several measures to make your system safer.

@timfish (Member) commented Oct 2, 2017

@MarshallOfSound Apple can revoke the certificates of the malicious app you've replaced it with. Publishers want to be sure that code running under their signing certificate was written by them. I'm sure most of us here wouldn't want our certificates revoked because malicious code is being run under them.

Before this PR

  • Users are blocked from running the app if any executables or Info.plist have been modified
  • asar contents can be modified and the app still opens with no warning

After this PR

  • When Electron runs, a modified app.asar no longer matches the hash in Info.plist, so making the modification stick requires modifying Info.plist as well
  • Users are then blocked from running the app, as Info.plist has been modified

This means that on macOS there will be no way to modify an Electron app without triggering Gatekeeper. This puts Electron's protection at the same level as "any other signed executable".

@develar (Author, Contributor) commented Oct 2, 2017

Just to make the future plans clear and answer the possible question "and what about Windows": yes, Windows is not a secure system and allows you to run any unsigned app. But sysadmins should be able to use standard tools to ensure security, and on Windows the checksums will be stored using resources (as stated in the PR description). This will be addressed in another PR, since the implementation (on both sides, Electron and the build tool) is not as easy as in the macOS case.

@MarshallOfSound

There are a lot of "if"s, but the right answer is the absence of the question. I could spend time answering all your "if"s, but it isn't worth it. OK, if my point is still not clear, let me explain:

  1. A user adds the Foo app (native) to the firewall whitelist.
  2. Someone modifies the Foo app, because a new app Bar ("sign it with my certificate": signed, OK) is not suitable, as only Foo is on the whitelist.
  3. App Foo will be blocked because it is damaged.

The same scenario, but app Foo is not native, it is an Electron app:

  1. ...
  2. ...
  3. App Foo is not blocked, because dynamic validation passes. And so, the hacked app Foo passes the firewall.

And so, the user has a question for the Electron team: why doesn't the OS's standard protection work? "A user adds the Foo app (native) to the firewall whitelist" can be replaced with "A sysadmin blocks all publishers, and only apps from a limited set of publishers can run" and numerous other cases where the user or sysadmin relies on the code signature.

@galvesribeiro commented Oct 2, 2017

@develar just to complete your last comment.

There are two ways on Windows to ensure the integrity of an app:

  1. The one you already mentioned. There are multiple ways to set up AppLocker. In the worst-case scenario, the user will be notified about the signature check failure, but of course they can bypass it (if UAC is enabled, they need elevated permissions to do so).
  2. Publishing your app to the Windows Store. That behaves similarly to macOS Gatekeeper, since you are under the store's "rules" and AppLocker is not optional. You would need to sideload an app in order to bypass it, and even then users are completely aware that they are probably using an app from an untrusted source outside the Store, which is clearly reported to them.
@poiru (Member) commented Oct 2, 2017

Replacing the entire bundle won't work. This PR adds the hash to the manifest which cannot be modified without breaking the signing and causing macOS to show a warning. With these changes you'd have to replace the electron executable too which then wouldn't be signed by the original publisher.

@develar @timfish We've already established that macOS will refuse to run a modified bundle when it is downloaded from the internet (see this comment). This is the case even if only the .asar file has been modified.

If the application is downloaded and then modified locally by the attacker (which is what this PR seems to be protecting against, right?), that check will not be performed by macOS. The attacker could simply remove the AsarIntegrity key from Info.plist and you would be none the wiser. Please demonstrate how you will protect against that scenario.

@develar (Author, Contributor) commented Oct 2, 2017

@poiru Sorry if it is not clear from the comments above: macOS does dynamic validation on each app run. Info.plist is one of the files checked during dynamic validation, as stated in the PR description (see "3. Keep code integrity data in").

@galvesribeiro commented Oct 2, 2017

The attacker could simply remove the AsarIntegrity key from Info.plist and you would be none the wiser. Please demonstrate how you will protect against that scenario.

Wow! If that is possible, it defeats the whole purpose of the PR, at least for my use case.

I believe both OSes (Windows and macOS) are not so naive as to only check on the first run or only when the package was downloaded from the internet. That would indeed be a ridiculous failure.

@poiru (Member) commented Oct 2, 2017

Sorry if it is not clear from the comments above: macOS does dynamic validation on each app run. Info.plist is one of the files checked during dynamic validation, as stated in the PR description (see "3. Keep code integrity data in").

@develar I don't think it's performed on each run; it's definitely cached in some form. And in any case, if the attacker has write access, they can just remove Contents/_CodeSignature/CodeResources and get on with it.

@poiru (Member) commented Oct 2, 2017

I believe both OSes (Windows and macOS) are not so naive as to only check on the first run or only when the package was downloaded from the internet. That would indeed be a ridiculous failure.

No, it's not a ridiculous failure. If the attacker has compromised the OS, the battle has already been lost, regardless of what the PCI requirements say. See this:

This is necessary to trigger the Gatekeeper check as Gatekeeper only checks quarantined files the first time they're opened.

@galvesribeiro commented Oct 2, 2017

If the attacker has compromised the OS, the battle has already been lost, regardless of what the PCI requirements say

Like I said: it is one of several security measures people should take. It alone won't make your app hacker-proof, but that doesn't mean it is useless, as some people here are stating.

@baconbrad (Contributor) commented Oct 2, 2017

To me this protection depends on many other protections to work: machine level, OS level, application level, user level, etc. It's not a first line of defense, and as mentioned before, this line of defense itself can be compromised in one way or another. So it seems to be a very niche protection. While it is always nice to have layers and layers of security, this seems like a "what if" or "maybe this will work" protection rather than a necessary one. There are much more efficient file integrity monitoring solutions for complying with PCI requirements than trying to deal with it ourselves at the Electron level.

Overall, I don't like the idea of the application saying it's trusted as opposed to trying to prove it's trusted. Integrity checks should be handled outside of the thing being checked. It is like buying beer: I give my ID, the clerk verifies it's me and tries to determine it's not a fake ID, and in some states they can scan the ID to further test it. Sure, there are ways around this, but it is a lot better than me saying "hey, I double-checked my own age just now and I am in fact old enough" and the clerk selling me beer on just my word. Having the ASAR prove itself to Electron is like me vouching to the clerk that my 6-year-old daughter is 21. Your country's or state's alcohol commission would not find this a suitable solution.

I like the initiative you've put in; I just don't see how a case can be built to justify this PR.

@poiru (Member) commented Oct 2, 2017

It just occurred to me that the check as implemented in this PR could be trivially bypassed by simply removing the relevant code from electron.asar. In other words, the JS code in electron.asar responsible for the integrity check of app.asar could be trivially modified without triggering any warnings. (This specific problem could be fixed by doing the check completely in native code, but the other problems remain.)

@poiru (Member) commented Oct 2, 2017

It seems like this discussion has already been had with some Apple engineers here:

David Simpson: could someone point me to some code that would allow me to check the validity of the application inside our own code?

Jens Alfke: Checking your own validity doesn't seem very useful for hack-resistance, since any hacker patching the binary can just delete the code that runs the check. Trying to make code protect itself is just an unwinnable arms-race of obfuscation and other tricks; it can only deter or delay a crack, not prevent it.

Perry The Cynic: Exactly. The Code Signing calculus is designed to help someone else check your program's validity. The "someone else" can be a launch controller (Parental Controls, MCX) that can then inhibit your launch, or it can be a service (keychain, firewall) that grants service based on your validity. While you can "check yourself" (or you could if the API was, uh, public, which it isn't - do ask for that :-), if you've been hacked, you may be lying to yourself.

Note that Jens Alfke and Perry The Cynic were both Apple engineers. In particular, Perry worked on the macOS codesigning infrastructure. With this information straight from the horse's mouth, I'm going to go ahead and close this PR. Please re-open if you feel otherwise!

@poiru closed this Oct 2, 2017

@timfish (Member) commented Oct 3, 2017

@poiru As it stands, Electron apps offer a way for untrusted code to bypass Gatekeeper. I doubt Jens and Perry would be happy about selective quoting of their past comments being used to defend this position!

Reading the full discussion you linked to between the Apple engineers, they actually appear to contradict the decision you've made:

It seems more useful to use code-checking to verify plug-ins/bundles that your app loads, since that way the code doing the check is separate from the possibly-tampered-with code.

Perry: Yup. The basic idea is to check newly added code before it can affect you, and if it doesn't verify, then either refuse to use it (that's what the HARD flag is about), or clear your own dynamic validity, thereby announcing (irrevocably) to everyone that you're no longer (sure you are) valid.

What if Apple decides Electron apps pose a risk and don't abide by their terms, and starts revoking our certificates?

@zeke (Member) commented Oct 3, 2017

I'm reopening this PR, as there doesn't yet seem to be consensus on the issue. Let us continue the conversation (respectfully) and if the community is unable to come to a collective understanding then we'll put it to a vote.

@zeke reopened this Oct 3, 2017

@develar (Author, Contributor) commented Oct 4, 2017

The reality is that System Integrity Protection is enabled only for apps downloaded from the App Store. So Apple, as far as I can judge by this fact, also thinks that an additional level of protection is good and necessary.

macOS High Sierra no longer marks the app as damaged if Info.plist is modified (for apps from the App Store you get https://gist.github.com/develar/820f0c9620eafb5dd36a110117e79c29, thanks to SIP) (for previous versions it was "prevent run on macOS sometimes", see the PR description). The app's identification will still be invalidated (e.g. access to the Keychain will be blocked), but this needs investigation to make sure.

So, for now, the App Store is the preferable way to distribute Electron apps if you need integrity protection.

But... even with SIP enabled, I can change app.asar in the Slack.app from the App Store (although in that case I need sudo to change the file permissions, because an app from the App Store is writable only by the system). Modifying the MacOS/Slack binary file leads to a damaged app.

I still think this protection should be implemented for macOS (for Windows, I think we all agree it is required, right?), but for now I don't have strong arguments to convince you. Personally, the fact that modifying Info.plist leads to a damaged app (not always for apps from outside the App Store, but at least the app identity will be invalidated) is enough for me (since the check is cheap and completely optional if someone doesn't want to enable it).

the JS code in electron.asar responsible for the integrity check of app.asar can be trivially modified

No, please see #9648 (comment)

@miniak (Contributor) commented Oct 4, 2017

Btw, what if you just extract app.asar into an app directory and modify the files there? The checksum would still be valid, but Electron would use the files from app instead of app.asar, as the former has priority.

@develar (Author, Contributor) commented Oct 4, 2017

Btw, what if you just extract app.asar into an app directory and modify the files there? The checksum would still be valid, but Electron would use the files from app instead of app.asar, as the former has priority.

@miniak

  1. If you mean this PR: no, we explicitly change the search path to avoid this problem.
  2. If you mean the code signature: no, full static validation will fail (reason: unknown added files).
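The search-path restriction mirrors the C++ GetSearchPath excerpt quoted earlier in the thread; in JavaScript it could be sketched like this (filtering out electron.asar is my assumption about how the listed keys map to the app search path):

```javascript
// When no checksums are embedded, fall back to the historical defaults.
// When checksums exist, search only the archives listed in them, so an
// extracted `app` directory cannot shadow a verified app.asar.
function getSearchPath (checksumsJson) {
  if (!checksumsJson) {
    return ['app', 'app.asar', 'default_app.asar']
  }
  return Object.keys(JSON.parse(checksumsJson))
    .filter((name) => name !== 'electron.asar') // electron.asar is not an app package
}
```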
@timfish (Member) commented Oct 4, 2017

@develar if validation fails, could you add the com.apple.quarantine extended attribute to the app file? I think this signifies to Gatekeeper that the app needs checking again.

@timfish (Member) commented Oct 4, 2017

I can confirm that setting the com.apple.quarantine extended attribute on the app file causes Gatekeeper to re-check the app on run and re-prompt the user, stating that "it's from the internet".

However, if the app has been modified in any way (even a one-character change in app.asar), the signing is broken and instead I am told the app is "damaged and can't be opened", with the only option offered being to "move it to trash".

I am not sure what stops malicious code from removing extended attributes, but I'm guessing Apple has thought of that? There has been work on adding this attribute here.

@vanessayuenn (Contributor) commented Oct 10, 2017

As the conversation seems to have tapered off and no consensus has come out of it, this was brought up in the weekly maintainers' meeting, and we have agreed to close this PR.

@develar (Author, Contributor) commented Oct 10, 2017

@vanessayuenn Please share what you think about the Windows case.

@jkleinsc (Contributor) commented Oct 10, 2017

@develar one of the things that came up in the discussion of this issue is that we should have an RFC process for adding features to Electron. That way we can have discussions like the one in this PR before actual work takes place. We don't have the RFC process formalized yet, but what I would recommend at this point is to open a new issue (prefixed with "RFC:") with a solution proposal for the Windows case.
