
Conversation

@marcorudolphflex
Contributor

@marcorudolphflex marcorudolphflex commented Oct 30, 2025

Previous state
ModeSolver simulations never write to the new local cache because store_result cannot find stub_data.simulation.

Changes
web.load no longer stores ModeSolver data in the cache; instead, the wrapping functions do so via the new helper _store_mode_solver_in_cache.
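
A minimal, runnable sketch of this split, using placeholder names for everything except the helper (whose signature is assumed here): web.load skips the cache write for ModeSolver workflows, and the wrapping function stores the result afterwards, passing the solver object itself.

    # Illustrative sketch only: every name is a placeholder, not the actual tidy3d
    # API, and the helper's signature is assumed.
    MODE_SOLVER = "MODE_SOLVER"
    _cache: dict[str, str] = {}


    def _store_mode_solver_in_cache(mode_solver, task_id: str, path: str) -> None:
        # Stand-in for the new helper: key the cache on the solver object itself
        # instead of relying on stub_data.simulation.
        _cache[repr(mode_solver)] = path


    def load(task_id: str, path: str, workflow_type: str) -> dict:
        stub_data = {"task_id": task_id, "path": path}  # stand-in for the downloaded result
        if workflow_type != MODE_SOLVER:
            _cache[task_id] = path  # generic cache write stays inside load()
        return stub_data  # ModeSolver results are intentionally NOT cached here


    def run_mode_solver(mode_solver, task_id: str, path: str) -> dict:
        data = load(task_id, path, workflow_type=MODE_SOLVER)
        _store_mode_solver_in_cache(mode_solver, task_id=task_id, path=path)
        return data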

Greptile Overview

Updated On: 2025-10-30 12:33:17 UTC

Greptile Summary

Fixed ModeSolver simulation results not being cached locally by introducing a specialized cache storage path that passes the simulation object directly instead of relying on stub_data.simulation.

Key Changes

  • Modified LocalCache.store_result() in tidy3d/web/cache.py:315 to accept an optional simulation parameter and return a boolean indicating success/failure (see the sketch after this list)
  • Added new helper function _store_mode_solver_in_cache() in tidy3d/web/cache.py:550 to encapsulate ModeSolver-specific cache storage logic
  • Updated web.load() in tidy3d/web/api/webapi.py:1420 to skip generic cache storage for MODE_SOLVER workflow types
  • Modified web.run(), Job.load(), and Batch.load() to explicitly call the new cache helper after loading ModeSolver results
  • Added comprehensive test coverage for all ModeSolver cache code paths (run, Job, Batch)
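
A minimal sketch of the store_result change from the first bullet above. Everything besides the name store_result, its new optional simulation parameter, and the boolean return is an assumption; the real LocalCache also does hashing, cache keys, file copies, and metadata handling that is omitted here.

    # Minimal self-contained sketch, not the actual tidy3d implementation.
    import logging
    from typing import Any, Optional

    log = logging.getLogger(__name__)


    class LocalCacheSketch:
        """Stand-in for LocalCache, reduced to the changed store_result behavior."""

        def __init__(self) -> None:
            self._entries: dict[str, tuple[str, str]] = {}

        def store_result(
            self,
            stub_data: Any,
            task_id: str,
            path: str,
            simulation: Optional[Any] = None,
        ) -> bool:
            """Store a result; the new boolean return reports success or failure."""
            # New behavior: prefer an explicitly passed simulation object and fall
            # back to stub_data.simulation only when none was given.
            sim = simulation if simulation is not None else getattr(stub_data, "simulation", None)
            if sim is None:
                log.debug("Failed storing local cache entry: could not find simulation data.")
                return False
            try:
                cache_key = str(hash(repr(sim)))  # placeholder for the real hashing scheme
            except Exception as err:
                log.error(f"Could not store cache entry: {err}")
                return False
            self._entries[cache_key] = (task_id, path)
            return True


    # Quick check of the two paths:
    cache = LocalCacheSketch()
    assert cache.store_result(stub_data=object(), task_id="t-1", path="out.hdf5") is False
    assert cache.store_result(
        stub_data=object(), task_id="t-1", path="out.hdf5", simulation=("mode_solver", 1.55)
    ) is True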

Issues Found

  • Duplicate download() call in Batch.load() at tidy3d/web/api/container.py:1439-1440 (already called on line 1420)

Confidence Score: 4/5

  • This PR is safe to merge after fixing the duplicate download() call
  • The core fix is sound and well-tested, but contains one critical logic error (the duplicate download call) that should be fixed before merge. The approach of using a specialized helper function with an explicit simulation parameter correctly addresses the original issue where stub_data.simulation was unavailable for ModeSolver.
  • Pay close attention to tidy3d/web/api/container.py - contains duplicate download() call that needs removal

Important Files Changed

File Analysis

  • tidy3d/web/cache.py (5/5): Added new helper function _store_mode_solver_in_cache and modified store_result to accept optional simulation parameter, fixing ModeSolver cache storage
  • tidy3d/web/api/webapi.py (5/5): Modified run() and load() to use new ModeSolver cache helper, excluding MODE_SOLVER workflow from generic cache storage in load()
  • tidy3d/web/api/container.py (4/5): Updated Job and Batch classes to call new cache helper for ModeSolver results; includes duplicate download() call bug in Batch.load()
  • tests/test_web/test_local_cache.py (5/5): Added comprehensive test _test_mode_solver_caching covering all code paths (run, Job, Batch) for ModeSolver cache functionality

Sequence Diagram

sequenceDiagram
    participant User
    participant web.run/Job/Batch
    participant load()
    participant LocalCache
    participant _store_mode_solver_in_cache()

    alt Mode Solver Run
        User->>web.run/Job/Batch: run(ModeSolver)
        web.run/Job/Batch->>load(): load(task_id, path)
        load()->>LocalCache: Check if MODE_SOLVER type
        Note over load(),LocalCache: Skip generic store_result<br/>for MODE_SOLVER
        load()-->>web.run/Job/Batch: return data
        web.run/Job/Batch->>_store_mode_solver_in_cache(): Store with simulation param
        _store_mode_solver_in_cache()->>LocalCache: store_result(simulation=mode_solver)
        LocalCache->>LocalCache: Hash simulation object
        LocalCache->>LocalCache: Store with cache key
        web.run/Job/Batch->>User: return data
    end

    alt Regular Simulation Run
        User->>web.run/Job/Batch: run(Simulation)
        web.run/Job/Batch->>load(): load(task_id, path)
        load()->>LocalCache: store_result(stub_data)
        Note over load(),LocalCache: Uses stub_data.simulation<br/>for hashing
        load()-->>web.run/Job/Batch: return data
        web.run/Job/Batch->>User: return data
    end


@greptile-apps greptile-apps bot left a comment


4 files reviewed, 1 comment


@github-actions
Contributor

github-actions bot commented Oct 30, 2025

Diff Coverage

Diff: origin/develop...HEAD, staged and unstaged changes

  • tidy3d/web/api/container.py (100%)
  • tidy3d/web/api/webapi.py (70.0%): Missing lines 1317-1318,1320,1434-1436
  • tidy3d/web/cache.py (72.2%): Missing lines 357,360,363-364,388

Summary

  • Total: 44 lines
  • Missing: 11 lines
  • Coverage: 75%

tidy3d/web/api/webapi.py

  1313     task = SimulationTask.get(task_id)
  1314     path = Path(path)
  1315     if path.suffix == ".json":
  1316         task.get_simulation_json(path, verbose=verbose)
! 1317     elif path.suffix == ".hdf5":
! 1318         task.get_simulation_hdf5(path, verbose=verbose)
  1319     else:
! 1320         raise ValueError("Path suffix must be '.json' or '.hdf5'")
  1321     return Tidy3dStub.from_file(path)
  1322 
  1323 
  1324 @wait_for_connection

  1430             if lazy:  # get simulation via web to avoid unpacking of lazy object in store_result
  1431                 try:
  1432                     with tempfile.NamedTemporaryFile(suffix=".hdf5") as tmp_file:
  1433                         simulation = load_simulation(task_id, path=tmp_file.name, verbose=False)
! 1434                 except Exception as e:
! 1435                     log.info(f"Failed to load simulation for storing results: {e}.")
! 1436                     return stub_data
  1437             simulation_cache.store_result(
  1438                 stub_data=stub_data,
  1439                 task_id=task_id,
  1440                 path=path,
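
The uncovered lines 1434-1436 above form the best-effort guard around the lazy-load path. A standalone sketch of that pattern, with placeholder names (fetch_simulation, cache) instead of the actual tidy3d call sites:

    import logging
    import tempfile

    log = logging.getLogger(__name__)


    def load_with_best_effort_cache(task_id, stub_data, fetch_simulation, cache):
        """Sketch: if fetching the full simulation fails, skip caching and return data as-is."""
        try:
            with tempfile.NamedTemporaryFile(suffix=".hdf5") as tmp_file:
                simulation = fetch_simulation(task_id, path=tmp_file.name)
        except Exception as e:
            # Caching is best-effort: a failed download must not break loading results.
            log.info(f"Failed to load simulation for storing results: {e}.")
            return stub_data
        cache.store_result(stub_data=stub_data, task_id=task_id, simulation=simulation)
        return stub_data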

tidy3d/web/cache.py

  353                 simulation_obj = simulation
  354             else:
  355                 simulation_obj = getattr(stub_data, "simulation", None)
  356                 if simulation_obj is None:
! 357                     log.debug(
  358                         "Failed storing local cache entry: Could not find simulation data in stub_data."
  359                     )
! 360                     return False
  361             simulation_hash = simulation_obj._hash_self() if simulation_obj is not None else None
  362             if not simulation_hash:
! 363                 log.debug("Failed storing local cache entry: Could not hash simulation.")
! 364                 return False
  365 
  366             version = _get_protocol_version()
  367 
  368             cache_key = build_cache_key(

  384                 metadata=metadata,
  385             )
  386         except Exception as e:
  387             log.error(f"Could not store cache entry: {e}")
! 388             return False
  389         return True
  390 
  391 
  392 def _copy_and_hash(

Collaborator

@lucas-flexcompute lucas-flexcompute left a comment


If I understand correctly, PF should now use the Job class to upload, start, and load back results to enable caching for the mode solver, right? And it should also work for other simulation types, since Job.load calls web.load. Looks good.

@marcorudolphflex
Contributor Author

If I understand correctly, PF should now use the Job class to upload, start, and load back results to enable caching for the mode solver, right? And it should also work for other simulation types, since Job.load calls web.load. Looks good.

Not necessarily - both storing and loading should work with any of {run, Job, Batch}, see the new test. But if Job works for you, go for it.
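
For context, a rough usage sketch of the three entry points mentioned above. It assumes an already-constructed ModeSolver instance mode_solver; the task names, paths, and folder name are illustrative.

    import tidy3d.web as web

    # 1) One-shot run
    mode_data = web.run(mode_solver, task_name="mode_cache_demo", path="mode_data.hdf5")

    # 2) Job: explicit upload/start/load steps wrapped in one object
    job = web.Job(simulation=mode_solver, task_name="mode_cache_demo_job")
    mode_data = job.run(path="mode_data_job.hdf5")

    # 3) Batch of mode solvers
    batch = web.Batch(simulations={"ms_0": mode_solver}, folder_name="mode_cache_demo")
    batch_data = batch.run(path_dir="batch_results")

Each of these loads results through web.load / Job.load / Batch.load, and with this PR each path also writes the ModeSolver result to the local cache.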

@marcorudolphflex
Contributor Author

@lucas-flexcompute or did you have another usage in mind? In principle, you could also use _store_mode_solver_in_cache directly if the supported entry points do not fit your pipeline.
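
For reference, a hypothetical sketch of that direct route. The helper lives in tidy3d/web/cache.py per this PR, but its exact signature and the surrounding low-level calls shown here are assumptions, not a documented workflow:

    # Hypothetical only: _store_mode_solver_in_cache's signature is assumed, and
    # mode_solver is an already-constructed ModeSolver instance.
    import tidy3d.web as web
    from tidy3d.web.cache import _store_mode_solver_in_cache

    task_id = web.upload(mode_solver, task_name="mode_manual_upload")
    web.start(task_id)
    web.monitor(task_id)
    mode_data = web.load(task_id, path="mode_data_manual.hdf5")  # no cache write for MODE_SOLVER
    _store_mode_solver_in_cache(mode_solver, task_id=task_id, path="mode_data_manual.hdf5")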

@lucas-flexcompute
Collaborator

@lucas-flexcompute or did you have another usage in mind? In principle, you could also use _store_mode_solver_in_cache directly if the supported entry points do not fit your pipeline.

Using Job seems fine. IIRC I did not use it originally because it would automatically upload the simulation, which I did not want.

Collaborator

@yaugenst-flex yaugenst-flex left a comment


Thanks @marcorudolphflex looks great!

@marcorudolphflex marcorudolphflex force-pushed the FXC-3884-mode-solver-cache-hashing branch 2 times, most recently from a85d3eb to b87144c on October 30, 2025 15:16
@marcorudolphflex marcorudolphflex force-pushed the FXC-3884-mode-solver-cache-hashing branch from b87144c to 623d7ac on October 30, 2025 15:38
@yaugenst-flex yaugenst-flex added this pull request to the merge queue Oct 30, 2025
Merged via the queue into develop with commit be656b0 Oct 30, 2025
26 checks passed
@yaugenst-flex yaugenst-flex deleted the FXC-3884-mode-solver-cache-hashing branch October 30, 2025 21:37