libcore panic on ridiculous compilation #56799
Triage: I had to apply this patch to get things to compile:

diff --git a/day_2/galaxy_brain/src/lib.rs b/day_2/galaxy_brain/src/lib.rs
index 208c3b0..55e68e4 100644
--- a/day_2/galaxy_brain/src/lib.rs
+++ b/day_2/galaxy_brain/src/lib.rs
@@ -14,7 +14,7 @@ pub fn construct_galaxy_brain(_input: TokenStream) -> TokenStream {
use packed_simd::u8x32;
use ugly_array_decl::ugly_array_decl;
-const BYTES: [u8; 6750] = *include_bytes!("../../zesterer-input.txt");
+const BYTES: [u8; 7000] = *include_bytes!("../../zesterer-input.txt");
const LINES: usize = 250;
const ZEROS: u8x32 = u8x32::splat(0);
const ONES: u8x32 = u8x32::new(1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
@@ -40,4 +40,4 @@ fn part_2_ff() -> String {
}
}
unreachable!();
-}
\ No newline at end of file
+}
diff --git a/day_2/src/main.rs b/day_2/src/main.rs
index 9e2f244..957fab6 100644
--- a/day_2/src/main.rs
+++ b/day_2/src/main.rs
@@ -60,7 +60,7 @@ use std::hint::unreachable_unchecked;
use packed_simd::u8x32;
use ugly_array_decl::big_ugly_array_decl;
-const BYTES: [u8; 2700000] = *include_bytes!("../fixed-big-input.txt");
+const BYTES: [u8; 2800000] = *include_bytes!("../fixed-big-input.txt");
const LINES: usize = 100_000;
const ZEROS: u8x32 = u8x32::splat(0);
const ONES: u8x32 = u8x32::new(1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
@@ -241,4 +241,4 @@ fn zesterer(b: &mut Bencher) {
fn og_zesterer(b: &mut Bencher) {
let i = include_bytes!("../ameo-input.txt");
b.iter(|| black_box(original_zesterer(i)));
-}*/
\ No newline at end of file
+}*/

I didn't go all the way through a full compile, so I don't know how long it will take, but this gets it building.
This crate expands to 60MB of source code, primarily repeating a bunch of expressions that look like this:
Looking at some logging from the compiler, I think this is slow and memory-hungry primarily because we type-check each of these index expressions separately, which involves re-proving various bounds. Each indexing expression desugars with a unique lifetime assigned to parts of it, so we're not able to deduplicate the proofs as well as we perhaps could.

That said, I think this input is pretty pathological. 60MB of code concentrated around a few huge consts/statics isn't going to perform well. I'm not sure there's much to learn from looking into this further, beyond what we already know about wanting to improve caching/normalization in the trait solver.

I'm going to go ahead and close this issue even though this particular case isn't fixed: I don't think anyone is likely to care much about the particulars, and the larger problem won't be helped by keeping it open.
I had a friend try to compile my solution for Advent of Code day 2 with 100k inputs instead of the 250 given in the puzzle, since I was unable to compile it myself because it ate up so much memory. He ended up using an external 2TB HDD as a swap disk to compile it (on a Lenovo IdeaPad Yoga 11S), and it took eight days before it spat out this error (with said friend's name replaced with
user
for the sake of privacy):

I don't want to trouble him to compile it again, so the code can be found here.
Here's the output of rustc --version --verbose:

Note on memory use: we don't actually know how much memory it took to compile this thing. All I know is that a different friend tried compiling it, and it ate up the better part of their 40GB of RAM plus a 60GB swap file. As for the 2TB swap disk, I don't know how much of it was used, since friend 1's laptop was frozen for five of the eight days it was compiling.