ufuzz failure #3689
@kzc this one's a little tricky.
$ cat test.js
console.log(function(a) {
return a + (a[0] = 0).toString();
}([]));
$ cat test.js | node
00
$ uglifyjs test.js -c unsafe | node
0
$ uglifyjs test.js -c unsafe
console.log(function(a) {
return a + "" + (a[0] = 0);
}([]));
Good one. Given the dynamic nature of the language, I suspect that a bug-free optimizing JS compiler is not possible.
A bit worse than that - most browsers (except Chrome) used to be capable of handling large memory usage (say 64GB), but all that has changed in the last couple of years. Firefox nowadays blows up around the 20GB mark, and for some bizarre reason Microsoft decided to "update" IE11 such that it can only use 2GB before going OOM. (Safari hasn't loaded Web Workers properly for at least a year, so let's not mention that.)
I've been amazed by the speed of these JS JIT engines, but it's only possible with some compromises as you've discovered. You can only imagine the bug attack surface of these engines given their incredible complexity. |
I'm also curious what JS applications you're designing that need 64GB in a browser. The language was invented to do simple form field validation, after all. :-) |
Just running data analytics with distributed storage, nothing fancy 😉 Thought I would (ab)use web technologies as much as I could, but the recent push of feature creep and security theatre is really making me wonder if I should migrate away from this platform. |
Still remember back in the day when Internet Explorer first introduced CSS crop and transparency filters - getting image kernels (e.g. sharpen, motion blur) to work inside the browser was hours of fun. 🤣
- migrate de-facto compression to `conditionals` & `strings` fixes mishoo#3689
That's been my thought as well. The back end alternatives all have their own issues and baggage though.
I didn't even know that was a thing! It's getting harder and harder to be a programming generalist these days - so much technology to learn. You're basically forced to pick a silo and run with it.
After #3710:
$ uglifyjs test.js --toplevel -mc passes=1000000,unsafe --reduce-test
// reduce test pass 1, iteration 0: 1271 bytes
// reduce test pass 1, iteration 25: 918 bytes
// reduce test pass 1, iteration 50: 726 bytes
// reduce test pass 1, iteration 75: 373 bytes
// reduce test pass 1, iteration 100: 260 bytes
// reduce test pass 1, iteration 125: 131 bytes
// reduce test pass 1: 131 bytes
// reduce test pass 2: 126 bytes
// reduce test pass 3: 126 bytes
var a = 0, b = 0;
function f0(a_1) {
return a_1 += (a_1[0] >>>= 0).toString();
}
var c = f0([]);
console.log(null, a, b, c, Infinity, NaN, undefined);
// output: null 0 0 00 Infinity NaN undefined
// minify: null 0 0 0 Infinity NaN undefined
// options: {"compress":{"passes":1000000,"unsafe":true},"mangle":true,"toplevel":true}
Not bad at all. Just curious what was generated before #3710.