diff --git a/AGENTS.md b/AGENTS.md index 725007dd7b..de749990a1 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -254,4 +254,173 @@ The compiler is designed for fast feedback loops and scales to large codebases: - Remember Lambda IR is the core optimization layer - All `lam_*.ml` files process this representation - Use `lam_print.ml` for debugging lambda expressions -- Test both with and without optimization passes \ No newline at end of file +- Test both with and without optimization passes + +## Working on the Build System + +### Rewatch Architecture + +Rewatch is the modern build system written in Rust that replaces the legacy bsb (BuckleScript) build system. It provides faster incremental builds, better error messages, and improved developer experience. + +#### Key Components + +``` +rewatch/src/ +├── build/ # Core build system logic +│ ├── build_types.rs # Core data structures (BuildState, Module, etc.) +│ ├── compile.rs # Compilation logic and bsc argument generation +│ ├── parse.rs # AST generation and parser argument handling +│ ├── packages.rs # Package discovery and dependency resolution +│ ├── deps.rs # Dependency analysis and module graph +│ ├── clean.rs # Build artifact cleanup +│ └── logs.rs # Build logging and error reporting +├── cli.rs # Command-line interface definitions +├── config.rs # rescript.json configuration parsing +├── watcher.rs # File watching and incremental builds +└── main.rs # Application entry point +``` + +#### Build System Flow + +1. **Initialization** (`build::initialize_build`) + - Parse `rescript.json` configuration + - Discover packages and dependencies + - Set up compiler information + - Create initial `BuildState` + +2. **AST Generation** (`build::parse`) + - Generate AST files using `bsc -bs-ast` + - Handle PPX transformations + - Process JSX + +3. **Dependency Analysis** (`build::deps`) + - Analyze module dependencies from AST files + - Build dependency graph + - Detect circular dependencies + +4. 
**Compilation** (`build::compile`) + - Generate `bsc` compiler arguments + - Compile modules in dependency order + - Handle warnings and errors + - Generate JavaScript output + +5. **Incremental Updates** (`watcher.rs`) + - Watch for file changes + - Determine dirty modules + - Recompile only affected modules + +### Development Guidelines + +#### Adding New Features + +1. **CLI Arguments**: Add to `cli.rs` in `BuildArgs` and `WatchArgs` +2. **Configuration**: Extend `config.rs` for new `rescript.json` fields +3. **Build Logic**: Modify appropriate `build/*.rs` modules +4. **Thread Parameters**: Pass new parameters through the build system chain +5. **Add Tests**: Include unit tests for new functionality + +#### Common Patterns + +- **Parameter Threading**: New CLI flags need to be passed through: + - `main.rs` → `build::build()` → `initialize_build()` → `BuildState` + - `main.rs` → `watcher::start()` → `async_watch()` → `initialize_build()` + +- **Configuration Precedence**: Command-line flags override `rescript.json` config +- **Error Handling**: Use `anyhow::Result` for error propagation +- **Logging**: Use `log::debug!` for development debugging + +#### Testing + +```bash +# Run rewatch tests (from project root) +cargo test --manifest-path rewatch/Cargo.toml + +# Test specific functionality +cargo test --manifest-path rewatch/Cargo.toml config::tests::test_get_warning_args + +# Run clippy for code quality +cargo clippy --manifest-path rewatch/Cargo.toml --all-targets --all-features + +# Check formatting +cargo fmt --check --manifest-path rewatch/Cargo.toml + +# Build rewatch +cargo build --manifest-path rewatch/Cargo.toml --release + +# Or use the Makefile shortcuts +make rewatch # Build rewatch +make test-rewatch # Run integration tests +``` + +**Note**: The rewatch project is located in the `rewatch/` directory with its own `Cargo.toml` file. 
All cargo commands should be run from the project root using the `--manifest-path rewatch/Cargo.toml` flag, as shown in the CI workflow. + +**Integration Tests**: The `make test-rewatch` command runs bash-based integration tests located in `rewatch/tests/suite-ci.sh`. These tests use the `rewatch/testrepo/` directory as a test workspace with various package configurations to verify rewatch's behavior across different scenarios. + +#### Debugging + +- **Build State**: Use `log::debug!` to inspect `BuildState` contents +- **Compiler Args**: Check generated `bsc` arguments in `compile.rs` +- **Dependencies**: Inspect module dependency graph in `deps.rs` +- **File Watching**: Monitor file change events in `watcher.rs` + +#### Performance Considerations + +- **Incremental Builds**: Only recompile dirty modules +- **Parallel Compilation**: Use `rayon` for parallel processing +- **Memory Usage**: Be mindful of `BuildState` size in large projects +- **File I/O**: Minimize file system operations + +#### Performance vs Code Quality Trade-offs + +When clippy suggests refactoring that could impact performance, consider the trade-offs: + +- **Parameter Structs vs Many Arguments**: While clippy prefers parameter structs for functions with many arguments, sometimes the added complexity isn't worth it. Use `#[allow(clippy::too_many_arguments)]` for functions that legitimately need many parameters and where a struct would add unnecessary complexity. + +- **Cloning vs Borrowing**: Sometimes cloning is necessary due to Rust's borrow checker rules. If the clone is: + - Small and one-time (e.g., `Vec` with few elements) + - Necessary for correct ownership semantics + - Not in a hot path + + Then accept the clone rather than over-engineering the solution. + +- **When to Optimize**: Profile before optimizing. Most "performance concerns" in build systems are negligible compared to actual compilation time. 
+
+- **Avoid Unnecessary Type Conversions**: When threading parameters through multiple function calls, use consistent types (e.g., `String` throughout) rather than converting between `String` and `&str` at each boundary. This eliminates unnecessary allocations and conversions.
+
+#### Compatibility with Legacy bsb
+
+- **Command-line Flags**: Maintain compatibility with bsb flags where possible
+- **Configuration**: Support both old (`bs-*`) and new field names
+- **Output Format**: Generate compatible build artifacts
+- **Error Messages**: Provide clear migration guidance
+
+### Common Tasks
+
+#### Adding New CLI Flags
+
+1. Add to `BuildArgs` and `WatchArgs` in `cli.rs`
+2. Update `From<BuildArgs> for WatchArgs` implementation
+3. Pass through `main.rs` to build functions
+4. Thread through build system to where it's needed
+5. Add unit tests for the new functionality
+
+#### Modifying Compiler Arguments
+
+1. Update `compiler_args()` in `build/compile.rs`
+2. Consider both parsing and compilation phases
+3. Handle precedence between CLI flags and config
+4. Test with various `rescript.json` configurations
+
+#### Working with Dependencies
+
+1. Use `packages.rs` for package discovery
+2. Update `deps.rs` for dependency analysis
+3. Handle both local and external dependencies
+4. Consider dev dependencies vs regular dependencies
+
+#### File Watching
+
+1. Modify `watcher.rs` for file change handling
+2. Update `AsyncWatchArgs` for new parameters
+3. Handle different file types (`.res`, `.resi`, etc.)
+4. Consider performance impact of watching many files
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 75c4404ab0..1261c8bb45 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -32,6 +32,7 @@
 - Option optimization: do not create redundant local vars. https://github.com/rescript-lang/rescript/pull/7915
 - Js output: remove superfluous newline after every `if`. 
https://github.com/rescript-lang/rescript/pull/7920
 - Rewatch: Traverse upwards for package resolution in single context projects. https://github.com/rescript-lang/rescript/pull/7896
+- Rewatch: Add `--warn-error` flag to `build`. https://github.com/rescript-lang/rescript/pull/7916

 #### :house: Internal
diff --git a/rewatch/src/build.rs b/rewatch/src/build.rs
index eaad489da0..a8114bf124 100644
--- a/rewatch/src/build.rs
+++ b/rewatch/src/build.rs
@@ -82,6 +82,8 @@ pub fn get_compiler_args(rescript_file_path: &Path) -> Result<String> {
         &project_context.current_config,
         relative_filename,
         &contents,
+        /* is_local_dep */ true,
+        /* warn_error_override */ None,
     )?;
     let is_interface = filename.to_string_lossy().ends_with('i');
     let has_interface = if is_interface {
@@ -101,6 +103,7 @@
         &None,
         is_type_dev,
         true,
+        None, // No warn_error_override for compiler-args command
     )?;

     let result = serde_json::to_string_pretty(&CompilerArgs {
@@ -132,7 +135,8 @@ pub fn initialize_build(
     path: &Path,
     build_dev_deps: bool,
     snapshot_output: bool,
-) -> Result<BuildState> {
+    warn_error: Option<String>,
+) -> Result<BuildCommandState> {
     let project_context = ProjectContext::new(path)?;
     let compiler = get_compiler_info(&project_context)?;
@@ -189,7 +193,7 @@
         let _ = stdout().flush();
     }

-    let mut build_state = BuildState::new(project_context, packages, compiler);
+    let mut build_state = BuildCommandState::new(project_context, packages, compiler, warn_error);
     packages::parse_packages(&mut build_state);
     let timing_source_files_elapsed = timing_source_files.elapsed();
@@ -300,7 +304,7 @@ impl fmt::Display for IncrementalBuildError {
 }

 pub fn incremental_build(
-    build_state: &mut BuildState,
+    build_state: &mut BuildCommandState,
     default_timing: Option<std::time::Duration>,
     initial_build: bool,
     show_progress: bool,
@@ -370,7 +374,8 @@
         }
     }
     let timing_deps = Instant::now();
-    deps::get_deps(build_state, &build_state.deleted_modules.to_owned());
+    
let deleted_modules = build_state.deleted_modules.clone();
+    deps::get_deps(build_state, &deleted_modules);
     let timing_deps_elapsed = timing_deps.elapsed();

     current_step += 1;
@@ -385,7 +390,7 @@
     }

     mark_modules_with_expired_deps_dirty(build_state);
-    mark_modules_with_deleted_deps_dirty(build_state);
+    mark_modules_with_deleted_deps_dirty(&mut build_state.build_state);

     current_step += 1;
     //print all the compile_dirty modules
@@ -488,7 +493,7 @@
     }
 }

-fn log_deprecations(build_state: &BuildState) {
+fn log_deprecations(build_state: &BuildCommandState) {
     build_state.packages.iter().for_each(|(_, package)| {
         // Only warn for local dependencies, not external packages
         if package.is_local_dep {
@@ -522,7 +527,7 @@ fn log_deprecated_config_field(package_name: &str, field_name: &str, new_field_n
 // is watching this file.
 // we don't need to do this in an incremental build because there are no file
 // changes (deletes / additions)
-pub fn write_build_ninja(build_state: &BuildState) {
+pub fn write_build_ninja(build_state: &BuildCommandState) {
     for package in build_state.packages.values() {
         // write empty file:
         let mut f = File::create(package.get_build_path().join("build.ninja")).expect("Unable to write file");
@@ -530,6 +535,7 @@
     }
 }

+#[allow(clippy::too_many_arguments)]
 pub fn build(
     filter: &Option<regex::Regex>,
     path: &Path,
@@ -538,7 +544,8 @@
     create_sourcedirs: bool,
     build_dev_deps: bool,
     snapshot_output: bool,
-) -> Result<BuildState> {
+    warn_error: Option<String>,
+) -> Result<BuildCommandState> {
     let default_timing: Option<std::time::Duration> = if no_timing {
         Some(std::time::Duration::new(0.0 as u64, 0.0 as u32))
     } else {
@@ -552,6 +559,7 @@
         path,
         build_dev_deps,
         snapshot_output,
+        warn_error,
     )
     .map_err(|e| anyhow!("Could not initialize build. 
Error: {e}"))?; diff --git a/rewatch/src/build/build_types.rs b/rewatch/src/build/build_types.rs index 64a8895da4..273684ec10 100644 --- a/rewatch/src/build/build_types.rs +++ b/rewatch/src/build/build_types.rs @@ -3,7 +3,7 @@ use crate::config::Config; use crate::project_context::ProjectContext; use ahash::{AHashMap, AHashSet}; use blake3::Hash; -use std::{fmt::Display, path::PathBuf, time::SystemTime}; +use std::{fmt::Display, ops::Deref, path::PathBuf, time::SystemTime}; #[derive(Debug, Clone, PartialEq)] pub enum ParseState { @@ -90,6 +90,9 @@ impl Module { } } +/// Core build state containing all the essential data needed for compilation. +/// This is the minimal state required for basic build operations like cleaning. +/// Used by commands that don't need command-line specific overrides (e.g., `clean`). #[derive(Debug)] pub struct BuildState { pub project_context: ProjectContext, @@ -101,6 +104,21 @@ pub struct BuildState { pub deps_initialized: bool, } +/// Extended build state that includes command-line specific overrides. +/// Wraps `BuildState` and adds command-specific data like warning overrides. +/// Used by commands that need to respect CLI flags (e.g., `build`, `watch`). 
+///
+/// The separation exists because:
+/// - `clean` command only needs core build data, no CLI overrides
+/// - `build`/`watch` commands need both core data AND CLI overrides
+/// - This prevents the "code smell" of optional fields that are None for some commands
+#[derive(Debug)]
+pub struct BuildCommandState {
+    pub build_state: BuildState,
+    // Command-line --warn-error flag override (takes precedence over rescript.json config)
+    pub warn_error_override: Option<String>,
+}
+
 #[derive(Debug, Clone)]
 pub struct CompilerInfo {
     pub bsc_path: PathBuf,
@@ -116,6 +134,7 @@ impl BuildState {
     pub fn get_module(&self, module_name: &str) -> Option<&Module> {
         self.modules.get(module_name)
     }
+
     pub fn new(
         project_context: ProjectContext,
         packages: AHashMap<String, Package>,
@@ -142,6 +161,48 @@
     }
 }

+impl BuildCommandState {
+    pub fn new(
+        project_context: ProjectContext,
+        packages: AHashMap<String, Package>,
+        compiler: CompilerInfo,
+        warn_error_override: Option<String>,
+    ) -> Self {
+        Self {
+            build_state: BuildState::new(project_context, packages, compiler),
+            warn_error_override,
+        }
+    }
+
+    pub fn get_warn_error_override(&self) -> Option<String> {
+        self.warn_error_override.clone()
+    }
+
+    pub fn module_name_package_pairs(&self) -> Vec<(String, String)> {
+        self.build_state
+            .modules
+            .iter()
+            .map(|(name, module)| (name.clone(), module.package_name.clone()))
+            .collect()
+    }
+}
+
+// Implement Deref to automatically delegate method calls to the inner BuildState
+impl Deref for BuildCommandState {
+    type Target = BuildState;
+
+    fn deref(&self) -> &Self::Target {
+        &self.build_state
+    }
+}
+
+// Implement DerefMut to allow mutable access to the inner BuildState
+impl std::ops::DerefMut for BuildCommandState {
+    fn deref_mut(&mut self) -> &mut Self::Target {
+        &mut self.build_state
+    }
+}
+
 #[derive(Debug)]
 pub struct AstModule {
     pub module_name: String,
diff --git a/rewatch/src/build/clean.rs b/rewatch/src/build/clean.rs
index 34a698fcb4..17b7c4ef51 100644
--- a/rewatch/src/build/clean.rs
+++ 
b/rewatch/src/build/clean.rs @@ -103,7 +103,7 @@ fn clean_source_files(build_state: &BuildState, root_config: &Config) { // and then do cleanup on that state (for instance remove all .mjs files that are not in the state) pub fn cleanup_previous_build( - build_state: &mut BuildState, + build_state: &mut BuildCommandState, compile_assets_state: CompileAssetsState, ) -> (usize, usize) { // delete the .mjs file which appear in our previous compile assets @@ -299,7 +299,7 @@ fn has_compile_warnings(module: &Module) -> bool { ) } -pub fn cleanup_after_build(build_state: &BuildState) { +pub fn cleanup_after_build(build_state: &BuildCommandState) { build_state.modules.par_iter().for_each(|(_module_name, module)| { let package = build_state.get_package(&module.package_name).unwrap(); if has_parse_warnings(module) { diff --git a/rewatch/src/build/compile.rs b/rewatch/src/build/compile.rs index 7db67b653d..c48457a5e1 100644 --- a/rewatch/src/build/compile.rs +++ b/rewatch/src/build/compile.rs @@ -22,7 +22,7 @@ use std::sync::OnceLock; use std::time::SystemTime; pub fn compile( - build_state: &mut BuildState, + build_state: &mut BuildCommandState, show_progress: bool, inc: impl Fn() + std::marker::Sync, set_length: impl Fn(u64), @@ -166,6 +166,7 @@ pub fn compile( module, true, build_state, + build_state.get_warn_error_override(), ); Some(result) } @@ -177,6 +178,7 @@ pub fn compile( module, false, build_state, + build_state.get_warn_error_override(), ); let cmi_digest_after = helpers::compute_file_hash(Path::new(&cmi_path)); @@ -243,70 +245,100 @@ pub fn compile( } } - let module = build_state - .modules - .get_mut(module_name) - .ok_or(anyhow!("Module not found"))?; + let package_name = { + let module = build_state + .build_state + .modules + .get(module_name) + .ok_or(anyhow!("Module not found"))?; + module.package_name.clone() + }; let package = build_state + .build_state .packages - .get(&module.package_name) + .get(&package_name) .ok_or(anyhow!("Package name not 
found"))?; - match module.source_type { - SourceType::MlMap(ref mut mlmap) => { - module.compile_dirty = false; - mlmap.parse_dirty = false; - } - SourceType::SourceFile(ref mut source_file) => { - match result { + // Process results and update module state + let (compile_warning, compile_error, interface_warning, interface_error) = { + let module = build_state + .build_state + .modules + .get_mut(module_name) + .ok_or(anyhow!("Module not found"))?; + + let (compile_warning, compile_error) = match module.source_type { + SourceType::MlMap(ref mut mlmap) => { + module.compile_dirty = false; + mlmap.parse_dirty = false; + (None, None) + } + SourceType::SourceFile(ref mut source_file) => match result { Ok(Some(err)) => { source_file.implementation.compile_state = CompileState::Warning; - logs::append(package, err); - compile_warnings.push_str(err); + (Some(err.to_string()), None) } Ok(None) => { source_file.implementation.compile_state = CompileState::Success; + (None, None) } Err(err) => { source_file.implementation.compile_state = CompileState::Error; - logs::append(package, &err.to_string()); - compile_errors.push_str(&err.to_string()); - } - }; - match interface_result { - Some(Ok(Some(err))) => { - source_file.interface.as_mut().unwrap().compile_state = CompileState::Warning; - logs::append(package, &err.to_string()); - compile_warnings.push_str(&err.to_string()); + (None, Some(err.to_string())) } - Some(Ok(None)) => { - if let Some(interface) = source_file.interface.as_mut() { - interface.compile_state = CompileState::Success; + }, + }; + + let (interface_warning, interface_error) = + if let SourceType::SourceFile(ref mut source_file) = module.source_type { + match interface_result { + Some(Ok(Some(err))) => { + source_file.interface.as_mut().unwrap().compile_state = CompileState::Warning; + (Some(err.to_string()), None) } + Some(Ok(None)) => { + if let Some(interface) = source_file.interface.as_mut() { + interface.compile_state = CompileState::Success; + } + 
(None, None) + } + Some(Err(err)) => { + source_file.interface.as_mut().unwrap().compile_state = CompileState::Error; + (None, Some(err.to_string())) + } + _ => (None, None), } - - Some(Err(err)) => { - source_file.interface.as_mut().unwrap().compile_state = CompileState::Error; - logs::append(package, &err.to_string()); - compile_errors.push_str(&err.to_string()); - } - _ => (), + } else { + (None, None) }; - match (result, interface_result) { - // successfull compilation - (Ok(None), Some(Ok(None))) | (Ok(None), None) => { - module.compile_dirty = false; - module.last_compiled_cmi = Some(SystemTime::now()); - module.last_compiled_cmt = Some(SystemTime::now()); - } - // some error or warning - (Err(_), _) | (_, Some(Err(_))) | (Ok(Some(_)), _) | (_, Some(Ok(Some(_)))) => { - module.compile_dirty = true; - } - } + // Update compilation timestamps for successful compilation + if result.is_ok() && interface_result.as_ref().is_none_or(|r| r.is_ok()) { + module.compile_dirty = false; + module.last_compiled_cmi = Some(SystemTime::now()); + module.last_compiled_cmt = Some(SystemTime::now()); } + + (compile_warning, compile_error, interface_warning, interface_error) + }; + + // Handle logging outside the mutable borrow + if let Some(warning) = compile_warning { + logs::append(package, &warning); + compile_warnings.push_str(&warning); + } + if let Some(error) = compile_error { + logs::append(package, &error); + compile_errors.push_str(&error); + } + if let Some(warning) = interface_warning { + logs::append(package, &warning); + compile_warnings.push_str(&warning); + } + if let Some(error) = interface_error { + logs::append(package, &error); + compile_errors.push_str(&error); } } @@ -384,6 +416,8 @@ pub fn compiler_args( // Is the file listed as "type":"dev"? 
is_type_dev: bool,
     is_local_dep: bool,
+    // Command-line --warn-error flag override (takes precedence over rescript.json config)
+    warn_error_override: Option<String>,
 ) -> Result<Vec<String>> {
     let bsc_flags = config::flatten_flags(&config.compiler_flags);
     let dependency_paths = get_dependency_paths(config, project_context, packages, is_type_dev);
@@ -417,7 +451,7 @@
     let jsx_preserve_args = root_config.get_jsx_preserve_args();
     let gentype_arg = config.get_gentype_arg();
     let experimental_args = root_config.get_experimental_features_args();
-    let warning_args = config.get_warning_args(is_local_dep);
+    let warning_args = config.get_warning_args(is_local_dep, warn_error_override);

     let read_cmi_args = match has_interface {
         true => {
@@ -593,6 +627,7 @@ fn compile_file(
     module: &Module,
     is_interface: bool,
     build_state: &BuildState,
+    warn_error_override: Option<String>,
 ) -> Result<Option<String>> {
     let BuildState {
         packages,
@@ -626,6 +661,7 @@
         &Some(packages),
         is_type_dev,
         package.is_local_dep,
+        warn_error_override,
     )?;

     let to_mjs = Command::new(&compiler_info.bsc_path)
@@ -800,7 +836,7 @@ pub fn mark_modules_with_deleted_deps_dirty(build_state: &mut BuildState) {
 //
 // We could clean up the build after errors. 
But I think we probably still need // to do this, because people can also force quit the watcher of -pub fn mark_modules_with_expired_deps_dirty(build_state: &mut BuildState) { +pub fn mark_modules_with_expired_deps_dirty(build_state: &mut BuildCommandState) { let mut modules_with_expired_deps: AHashSet = AHashSet::new(); build_state .modules diff --git a/rewatch/src/build/compiler_info.rs b/rewatch/src/build/compiler_info.rs index f19145dbf0..98a5ae81b0 100644 --- a/rewatch/src/build/compiler_info.rs +++ b/rewatch/src/build/compiler_info.rs @@ -1,6 +1,6 @@ use crate::helpers; -use super::build_types::{BuildState, CompilerInfo}; +use super::build_types::{BuildCommandState, CompilerInfo}; use super::packages; use super::{clean, logs}; use ahash::AHashMap; @@ -113,7 +113,7 @@ pub fn verify_compiler_info( } } -pub fn write_compiler_info(build_state: &BuildState) { +pub fn write_compiler_info(build_state: &BuildCommandState) { let bsc_path = build_state.compiler_info.bsc_path.to_string_lossy().to_string(); let bsc_hash = build_state.compiler_info.bsc_hash.to_hex().to_string(); let runtime_path = build_state diff --git a/rewatch/src/build/parse.rs b/rewatch/src/build/parse.rs index f5fabc9e36..22ab88ff46 100644 --- a/rewatch/src/build/parse.rs +++ b/rewatch/src/build/parse.rs @@ -14,7 +14,7 @@ use std::path::{Path, PathBuf}; use std::process::Command; pub fn generate_asts( - build_state: &mut BuildState, + build_state: &mut BuildCommandState, inc: impl Fn() + std::marker::Sync, ) -> anyhow::Result { let mut has_failure = false; @@ -52,14 +52,21 @@ pub fn generate_asts( package.to_owned(), &source_file.implementation.path.to_owned(), build_state, + build_state.get_warn_error_override(), ) .map_err(|e| e.to_string()); let iast_result = match source_file.interface.as_ref().map(|i| i.path.to_owned()) { Some(interface_file_path) => { - generate_ast(package.to_owned(), &interface_file_path.to_owned(), build_state) - .map_err(|e| e.to_string()) - .map(Some) + match 
generate_ast( + package.to_owned(), + &interface_file_path.to_owned(), + build_state, + build_state.get_warn_error_override(), + ) { + Ok(v) => Ok(Some(v)), + Err(e) => Err(e.to_string()), + } } _ => Ok(None), }; @@ -94,7 +101,24 @@ pub fn generate_asts( )>>() .into_iter() .for_each(|(module_name, ast_result, iast_result, is_dirty)| { - if let Some(module) = build_state.modules.get_mut(&module_name) { + // Get package name first to avoid borrow checker issues + let package_name = build_state + .build_state + .modules + .get(&module_name) + .map(|module| module.package_name.clone()) + .unwrap_or_else(|| { + eprintln!("Module not found: {module_name}"); + String::new() + }); + + let package = build_state + .build_state + .packages + .get(&package_name) + .expect("Package not found"); + + if let Some(module) = build_state.build_state.modules.get_mut(&module_name) { // if the module is dirty, mark it also compile_dirty // do NOT set to false if the module is not parse_dirty, it needs to keep // the compile_dirty flag if it was set before @@ -102,10 +126,6 @@ pub fn generate_asts( module.compile_dirty = true; module.deps_dirty = true; } - let package = build_state - .packages - .get(&module.package_name) - .expect("Package not found"); if let SourceType::SourceFile(ref mut source_file) = module.source_type { // We get Err(x) when there is a parse error. 
When it's Ok(_, Some( // stderr_warnings )), the outputs are warnings @@ -180,65 +200,71 @@ pub fn generate_asts( .map(|(_, module)| module.package_name.clone()) .collect::>(); - build_state.modules.iter_mut().for_each(|(module_name, module)| { - let is_dirty = match &module.source_type { - SourceType::MlMap(_) => { - if dirty_packages.contains(&module.package_name) { - let package = build_state - .packages - .get(&module.package_name) - .expect("Package not found"); - // probably better to do this in a different function - // specific to compiling mlmaps - let compile_path = package.get_mlmap_compile_path(); - let mlmap_hash = helpers::compute_file_hash(Path::new(&compile_path)); - if let Err(err) = namespaces::compile_mlmap( - &build_state.project_context, - package, - module_name, - &build_state.compiler_info.bsc_path, - ) { - has_failure = true; - stderr.push_str(&format!("{err}\n")); - } - let mlmap_hash_after = helpers::compute_file_hash(Path::new(&compile_path)); + // Collect package names first to avoid borrow checker issues + let module_package_pairs = build_state.module_name_package_pairs(); - let suffix = package - .namespace - .to_suffix() - .expect("namespace should be set for mlmap module"); - let base_build_path = package.get_build_path().join(&suffix); - let base_ocaml_build_path = package.get_ocaml_build_path().join(&suffix); - let _ = std::fs::copy( - base_build_path.with_extension("cmi"), - base_ocaml_build_path.with_extension("cmi"), - ); - let _ = std::fs::copy( - base_build_path.with_extension("cmt"), - base_ocaml_build_path.with_extension("cmt"), - ); - let _ = std::fs::copy( - base_build_path.with_extension("cmj"), - base_ocaml_build_path.with_extension("cmj"), - ); - let _ = std::fs::copy( - base_build_path.with_extension("mlmap"), - base_ocaml_build_path.with_extension("mlmap"), - ); - match (mlmap_hash, mlmap_hash_after) { - (Some(digest), Some(digest_after)) => !digest.eq(&digest_after), - _ => true, + for (module_name, package_name) in 
module_package_pairs { + if let Some(module) = build_state.build_state.modules.get_mut(&module_name) { + let is_dirty = match &module.source_type { + SourceType::MlMap(_) => { + if dirty_packages.contains(&package_name) { + let package = build_state + .build_state + .packages + .get(&package_name) + .expect("Package not found"); + // probably better to do this in a different function + // specific to compiling mlmaps + let compile_path = package.get_mlmap_compile_path(); + let mlmap_hash = helpers::compute_file_hash(Path::new(&compile_path)); + if let Err(err) = namespaces::compile_mlmap( + &build_state.build_state.project_context, + package, + &module_name, + &build_state.build_state.compiler_info.bsc_path, + ) { + has_failure = true; + stderr.push_str(&format!("{err}\n")); + } + let mlmap_hash_after = helpers::compute_file_hash(Path::new(&compile_path)); + + let suffix = package + .namespace + .to_suffix() + .expect("namespace should be set for mlmap module"); + let base_build_path = package.get_build_path().join(&suffix); + let base_ocaml_build_path = package.get_ocaml_build_path().join(&suffix); + let _ = std::fs::copy( + base_build_path.with_extension("cmi"), + base_ocaml_build_path.with_extension("cmi"), + ); + let _ = std::fs::copy( + base_build_path.with_extension("cmt"), + base_ocaml_build_path.with_extension("cmt"), + ); + let _ = std::fs::copy( + base_build_path.with_extension("cmj"), + base_ocaml_build_path.with_extension("cmj"), + ); + let _ = std::fs::copy( + base_build_path.with_extension("mlmap"), + base_ocaml_build_path.with_extension("mlmap"), + ); + match (mlmap_hash, mlmap_hash_after) { + (Some(digest), Some(digest_after)) => !digest.eq(&digest_after), + _ => true, + } + } else { + false } - } else { - false } + _ => false, + }; + if is_dirty { + module.compile_dirty = is_dirty; } - _ => false, - }; - if is_dirty { - module.compile_dirty = is_dirty; } - }); + } if has_failure { Err(anyhow!(stderr)) @@ -252,6 +278,8 @@ pub fn parser_args( 
package_config: &Config,
     filename: &Path,
     contents: &str,
+    is_local_dep: bool,
+    warn_error_override: Option<String>,
 ) -> anyhow::Result<(PathBuf, Vec<String>)> {
     let root_config = project_context.get_root_config();
     let file = &filename;
@@ -267,6 +295,7 @@
     let jsx_preserve_args = root_config.get_jsx_preserve_args();
     let experimental_features_args = root_config.get_experimental_features_args();
     let bsc_flags = config::flatten_flags(&package_config.compiler_flags);
+    let warning_args = package_config.get_warning_args(is_local_dep, warn_error_override);

     let file = PathBuf::from("..").join("..").join(file);
@@ -279,6 +308,7 @@
         jsx_mode_args,
         jsx_preserve_args,
         experimental_features_args,
+        warning_args,
         bsc_flags,
         vec![
             "-absname".to_string(),
@@ -296,13 +326,20 @@ fn generate_ast(
     package: Package,
     filename: &Path,
     build_state: &BuildState,
+    warn_error_override: Option<String>,
 ) -> anyhow::Result<(PathBuf, Option<String>)> {
     let file_path = PathBuf::from(&package.path).join(filename);
     let contents = helpers::read_file(&file_path).expect("Error reading file");

     let build_path_abs = package.get_build_path();
-    let (ast_path, parser_args) =
-        parser_args(&build_state.project_context, &package.config, filename, &contents)?;
+    let (ast_path, parser_args) = parser_args(
+        &build_state.project_context,
+        &package.config,
+        filename,
+        &contents,
+        package.is_local_dep,
+        warn_error_override,
+    )?;

     // generate the dir of the ast_path (it mirrors the source file dir)
     let ast_parent_path = package.get_build_path().join(ast_path.parent().unwrap());
diff --git a/rewatch/src/build/read_compile_state.rs b/rewatch/src/build/read_compile_state.rs
index 3a3b8adc59..3b76629471 100644
--- a/rewatch/src/build/read_compile_state.rs
+++ b/rewatch/src/build/read_compile_state.rs
@@ -7,7 +7,7 @@ use std::fs;
 use std::path::{Path, PathBuf};
 use std::time::SystemTime;

-pub fn read(build_state: &mut BuildState) -> anyhow::Result<CompileAssetsState> {
+pub fn read(build_state: &mut BuildCommandState) -> 
anyhow::Result<CompileAssetsState> {
     let mut ast_modules: AHashMap = AHashMap::new();
     let mut cmi_modules: AHashMap = AHashMap::new();
     let mut cmt_modules: AHashMap = AHashMap::new();
diff --git a/rewatch/src/cli.rs b/rewatch/src/cli.rs
index 5a66fe0df1..0e5dc564a2 100644
--- a/rewatch/src/cli.rs
+++ b/rewatch/src/cli.rs
@@ -126,6 +126,14 @@ pub struct BuildArgs {
     /// Watch mode (deprecated, use `rescript watch` instead)
     #[arg(short, default_value_t = false, num_args = 0..=1)]
     pub watch: bool,
+
+    /// Warning numbers and whether to turn them into errors
+    ///
+    /// This flag overrides any warning configuration in rescript.json.
+    /// Example: --warn-error "+3+8+11+12+26+27+31+32+33+34+35+39+44+45+110"
+    /// This follows the same precedence behavior as the legacy bsb build system.
+    #[arg(long)]
+    pub warn_error: Option<String>,
 }

 #[derive(Args, Clone, Debug)]
@@ -147,6 +155,14 @@ pub struct WatchArgs {

     #[command(flatten)]
     pub snapshot_output: SnapshotOutputArg,
+
+    /// Warning numbers and whether to turn them into errors
+    ///
+    /// This flag overrides any warning configuration in rescript.json.
+    /// Example: --warn-error "+3+8+11+12+26+27+31+32+33+34+35+39+44+45+110"
+    /// This follows the same precedence behavior as the legacy bsb build system. 
+    #[arg(long)]
+    pub warn_error: Option<String>,
 }

 impl From<BuildArgs> for WatchArgs {
@@ -158,6 +174,7 @@ impl From<BuildArgs> for WatchArgs {
             create_sourcedirs: build_args.create_sourcedirs,
             dev: build_args.dev,
             snapshot_output: build_args.snapshot_output,
+            warn_error: build_args.warn_error,
         }
     }
 }
diff --git a/rewatch/src/config.rs b/rewatch/src/config.rs
index 2d2c0e8d98..6e12a9f209 100644
--- a/rewatch/src/config.rs
+++ b/rewatch/src/config.rs
@@ -563,12 +563,18 @@ impl Config {
         }
     }

-    pub fn get_warning_args(&self, is_local_dep: bool) -> Vec<String> {
+    pub fn get_warning_args(&self, is_local_dep: bool, warn_error_override: Option<String>) -> Vec<String> {
         // Ignore warning config for non local dependencies (node_module dependencies)
         if !is_local_dep {
             return vec![];
         }

+        // Command-line --warn-error flag takes precedence over rescript.json configuration
+        // This follows the same precedence behavior as the legacy bsb build system
+        if let Some(warn_error_str) = warn_error_override {
+            return vec!["-warn-error".to_string(), warn_error_str];
+        }
+
         match self.warnings {
             None => vec![],
             Some(ref warnings) => {
@@ -1176,4 +1182,99 @@ pub mod tests {
             true,
         )
     }
+
+    #[test]
+    fn test_get_warning_args_with_override() {
+        let config = create_config(CreateConfigArgs {
+            name: "test".to_string(),
+            bs_deps: vec![],
+            build_dev_deps: vec![],
+            allowed_dependents: None,
+            path: PathBuf::from("./rescript.json"),
+        });
+
+        // Test that warn_error_override takes precedence
+        let args = config.get_warning_args(true, Some("+3+8+11".to_string()));
+        assert_eq!(args, vec!["-warn-error".to_string(), "+3+8+11".to_string()]);
+    }
+
+    #[test]
+    fn test_get_warning_args_without_override() {
+        let mut config = create_config(CreateConfigArgs {
+            name: "test".to_string(),
+            bs_deps: vec![],
+            build_dev_deps: vec![],
+            allowed_dependents: None,
+            path: PathBuf::from("./rescript.json"),
+        });
+
+        // Set up warnings in config
+        config.warnings = Some(Warnings {
+            number: Some("+8+32".to_string()),
+            error: Some(Error::Catchall(true)),
+        });
+
+        // Test that config warnings are used when no override
+        let args = config.get_warning_args(true, None);
+        assert_eq!(
+            args,
+            vec![
+                "-w".to_string(),
+                "+8+32".to_string(),
+                "-warn-error".to_string(),
+                "A".to_string()
+            ]
+        );
+    }
+
+    #[test]
+    fn test_get_warning_args_non_local_dep() {
+        let config = create_config(CreateConfigArgs {
+            name: "test".to_string(),
+            bs_deps: vec![],
+            build_dev_deps: vec![],
+            allowed_dependents: None,
+            path: PathBuf::from("./rescript.json"),
+        });
+
+        // Test that non-local deps ignore warning config
+        let args: Vec<String> = config.get_warning_args(false, None);
+        assert_eq!(args, Vec::<String>::new());
+    }
+
+    #[test]
+    fn test_get_warning_args_override_ignores_config() {
+        let mut config = create_config(CreateConfigArgs {
+            name: "test".to_string(),
+            bs_deps: vec![],
+            build_dev_deps: vec![],
+            allowed_dependents: None,
+            path: PathBuf::from("./rescript.json"),
+        });
+
+        // Set up warnings in config
+        config.warnings = Some(Warnings {
+            number: Some("+8+32".to_string()),
+            error: Some(Error::Catchall(true)),
+        });
+
+        // Test that override completely ignores config warnings
+        let args = config.get_warning_args(true, Some("+3+8+11".to_string()));
+        assert_eq!(args, vec!["-warn-error".to_string(), "+3+8+11".to_string()]);
+    }
+
+    #[test]
+    fn test_get_warning_args_non_local_dep_ignores_override() {
+        let config = create_config(CreateConfigArgs {
+            name: "test".to_string(),
+            bs_deps: vec![],
+            build_dev_deps: vec![],
+            allowed_dependents: None,
+            path: PathBuf::from("./rescript.json"),
+        });
+
+        // Non-local dependency should never receive warning args, even if override is provided
+        let args = config.get_warning_args(false, Some("+3+8+11".to_string()));
+        assert_eq!(args, Vec::<String>::new());
+    }
 }
diff --git a/rewatch/src/main.rs b/rewatch/src/main.rs
index 5fc08d4250..f7f09eca1e 100644
--- a/rewatch/src/main.rs
+++ b/rewatch/src/main.rs
@@ -45,6 +45,7 @@ fn main() -> Result<()> {
                 *build_args.create_sourcedirs,
                 *build_args.dev,
                 *build_args.snapshot_output,
+                build_args.warn_error.clone(),
             ) {
                 Err(e) => {
                     println!("{e}");
@@ -69,6 +70,7 @@ fn main() -> Result<()> {
                 *watch_args.create_sourcedirs,
                 *watch_args.dev,
                 *watch_args.snapshot_output,
+                watch_args.warn_error.clone(),
             );

             Ok(())
diff --git a/rewatch/src/watcher.rs b/rewatch/src/watcher.rs
index 09b1c6bc15..114098eefb 100644
--- a/rewatch/src/watcher.rs
+++ b/rewatch/src/watcher.rs
@@ -64,6 +64,7 @@ struct AsyncWatchArgs<'a> {
     create_sourcedirs: bool,
    build_dev_deps: bool,
     snapshot_output: bool,
+    warn_error: Option<String>,
 }

 async fn async_watch(
@@ -76,11 +77,19 @@ async fn async_watch(
         create_sourcedirs,
         build_dev_deps,
         snapshot_output,
+        warn_error,
     }: AsyncWatchArgs<'_>,
 ) -> notify::Result<()> {
-    let mut build_state =
-        build::initialize_build(None, filter, show_progress, path, build_dev_deps, snapshot_output)
-            .expect("Can't initialize build");
+    let mut build_state: build::build_types::BuildCommandState = build::initialize_build(
+        None,
+        filter,
+        show_progress,
+        path,
+        build_dev_deps,
+        snapshot_output,
+        warn_error,
+    )
+    .expect("Can't initialize build");
     let mut needs_compile_type = CompileType::Incremental;
     // create a mutex to capture if ctrl-c was pressed
     let ctrlc_pressed = Arc::new(Mutex::new(false));
@@ -164,14 +173,19 @@ async fn async_watch(
                     .canonicalize()
                     .map(StrippedVerbatimPath::to_stripped_verbatim_path)
                 {
-                    for module in build_state.modules.values_mut() {
-                        match module.source_type {
-                            SourceType::SourceFile(ref mut source_file) => {
-                                // mark the implementation file dirty
-                                let package = build_state
-                                    .packages
-                                    .get(&module.package_name)
-                                    .expect("Package not found");
+                    // Collect package names first to avoid borrow checker issues
+                    let module_package_pairs = build_state.module_name_package_pairs();
+
+                    for (module_name, package_name) in module_package_pairs {
+                        let package = build_state
+                            .build_state
+                            .packages
+                            .get(&package_name)
+                            .expect("Package not found");
+
+                        if let Some(module) = build_state.build_state.modules.get_mut(&module_name) {
+                            match module.source_type {
+                                SourceType::SourceFile(ref mut source_file) => {
                                     let canonicalized_implementation_file =
                                         package.path.join(&source_file.implementation.path);
                                     if canonicalized_path_buf == canonicalized_implementation_file {
@@ -199,8 +213,9 @@ async fn async_watch(
                                         break;
                                     }
                                 }
+                            }
+                            SourceType::MlMap(_) => (),
                         }
-                    SourceType::MlMap(_) => (),
                     }
                 }
             needs_compile_type = CompileType::Incremental;
@@ -270,6 +285,7 @@ async fn async_watch(
                     path,
                     build_dev_deps,
                     snapshot_output,
+                    build_state.get_warn_error_override(),
                 )
                 .expect("Can't initialize build");
                 let _ = build::incremental_build(
@@ -308,6 +324,7 @@ async fn async_watch(
     }
 }

+#[allow(clippy::too_many_arguments)]
 pub fn start(
     filter: &Option<regex::Regex>,
     show_progress: bool,
@@ -316,6 +333,7 @@ pub fn start(
     create_sourcedirs: bool,
     build_dev_deps: bool,
     snapshot_output: bool,
+    warn_error: Option<String>,
 ) {
     futures::executor::block_on(async {
         let queue = Arc::new(FifoQueue::<Result<Event, Error>>::new());
@@ -328,7 +346,7 @@ pub fn start(
             log::debug!("watching {folder}");

             watcher
-                .watch(folder.as_ref(), RecursiveMode::Recursive)
+                .watch(Path::new(folder), RecursiveMode::Recursive)
                 .expect("Could not start watcher");

             let path = Path::new(folder);
@@ -342,6 +360,7 @@ pub fn start(
                 create_sourcedirs,
                 build_dev_deps,
                 snapshot_output,
+                warn_error: warn_error.clone(),
             })
             .await
             {
diff --git a/rewatch/testrepo/packages/namespace-casing/rescript.json b/rewatch/testrepo/packages/namespace-casing/rescript.json
index 7a739a9a88..a46ee03dcf 100644
--- a/rewatch/testrepo/packages/namespace-casing/rescript.json
+++ b/rewatch/testrepo/packages/namespace-casing/rescript.json
@@ -18,5 +18,9 @@
   "compiler-flags": [],
   "jsx": {
     "version": 4
+  },
+  "warnings": {
+    "number": "+1000",
+    "error": "-2000"
   }
 }
diff --git a/rewatch/tests/compiler-args.sh b/rewatch/tests/compiler-args.sh
index bbe455bfc0..7402d4ea54 100755
--- a/rewatch/tests/compiler-args.sh
+++ b/rewatch/tests/compiler-args.sh
@@ -34,3 +34,27 @@ else
   git diff --no-index "$tmp1" "$tmp2" || true
   exit 1
 fi
+
+# Additional check: warnings/error flags should be present in both parser_args and compiler_args (using namespace-casing package)
+bold "Test: warnings/error flags appear in both parser_args and compiler_args (namespace-casing)"
+
+stdout=$(rewatch compiler-args packages/namespace-casing/src/Consume.res 2>/dev/null)
+if [ $? -ne 0 ]; then
+  error "Error grabbing compiler args for packages/namespace-casing/src/Consume.res"
+  exit 1
+fi
+
+# The package has warnings.number = +1000 and warnings.error = -2000
+# Expect both parser_args and compiler_args to include -warn-error/-2000 and -w/+1000
+warn_error_flag_count=$(echo "$stdout" | grep -F -o '"-warn-error"' | wc -l | xargs)
+warn_error_val_count=$(echo "$stdout" | grep -F -o '"-2000"' | wc -l | xargs)
+warn_number_flag_count=$(echo "$stdout" | grep -F -o '"-w"' | wc -l | xargs)
+warn_number_val_count=$(echo "$stdout" | grep -F -o '"+1000"' | wc -l | xargs)
+
+if [ "$warn_error_flag_count" -ne 2 ] || [ "$warn_error_val_count" -ne 2 ] || [ "$warn_number_flag_count" -ne 2 ] || [ "$warn_number_val_count" -ne 2 ]; then
+  error "Expected -w/+1000 and -warn-error/-2000 to appear twice (parser_args and compiler_args)"
+  echo "$stdout"
+  exit 1
+fi
+
+success "warnings/error flags present in both parser and compiler args (namespace-casing)"