
Conversation

@spirousschuh
Contributor

@spirousschuh spirousschuh commented Jul 31, 2025

Motivation and Context

I found that saving a model together with its weights and then loading it again did not work the way I expected. I therefore wrote a unit test that fails on the current version of the repo, and I propose a code fix.

Detailed Problem Description
From my perspective, the underlying problem is that tabpfn.model.loading.save_tabpfn_model saves a tabpfn.model.config.ModelConfig together with the torch model, while the loading routine uses the ModelConfig.upgrade_config method, which only works with dictionaries.

The fix
I decided to convert the ModelConfig to a dict inside the ModelConfig.upgrade_config function during loading. Of course, one could instead fix the saving routine to save a dict rather than a ModelConfig.
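The proposed fix can be sketched as follows. This is a minimal, self-contained illustration: ModelConfig here is a stand-in for tabpfn's actual config class (its fields are made up), and upgrade_config only shows the normalization step, not the real upgrade logic.

```python
from dataclasses import asdict, dataclass, is_dataclass
from typing import Any


@dataclass
class ModelConfig:
    """Illustrative stand-in for tabpfn's ModelConfig; fields are invented."""

    emsize: int = 192
    nhead: int = 6


def upgrade_config(config: "ModelConfig | dict[str, Any]") -> dict[str, Any]:
    """Sketch of the fix: normalize to a plain dict before upgrading."""
    if is_dataclass(config) and not isinstance(config, type):
        config = asdict(config)  # ModelConfig instance -> plain dict
    # ... the existing dict-based upgrade logic would run here ...
    return dict(config)
```

With this guard in place, passing either a ModelConfig instance or a plain dict through upgrade_config yields the same dictionary.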


Public API Changes

  • No Public API changes
  • Yes, Public API changes (Details below)

How Has This Been Tested?

Yes. I provide a unit test for the regressor interface.

Checklist

  • The changes have been tested locally.
  • Documentation has been updated (if the public API or usage changes).
  • An entry has been added to CHANGELOG.md (if relevant for users).
  • The code follows the project's style guidelines.
  • I have considered the impact of these changes on the public API.

Copilot AI review requested due to automatic review settings July 31, 2025 15:56
@CLAassistant

CLAassistant commented Jul 31, 2025

CLA assistant check
All committers have signed the CLA.

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Summary of Changes

Hello @spirousschuh, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses a critical bug in the model serialization process, specifically preventing the loss of model weights when saving and loading TabPFN models. The changes ensure that the model configuration handling is robust across different input types during deserialization, and a new comprehensive test validates the end-to-end persistence functionality.

Highlights

  • Model Configuration Compatibility: The ModelConfig.upgrade_config method has been enhanced to accept both dictionary and ModelConfig object inputs. This ensures that when a ModelConfig instance is passed during model loading, it is correctly converted to a dictionary before the upgrade logic is applied, resolving an incompatibility issue.
  • Robust Model Persistence Testing: A new unit test, test_saving_and_loading_model_with_weights, has been added to tests/test_regressor_interface.py. This test rigorously verifies that a TabPFNRegressor model, including its modified weights, can be successfully saved to disk and subsequently loaded, with all parameters accurately preserved. This directly addresses the core problem identified by the author.

This comment was marked as outdated.

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request addresses an issue with saving and loading models by ensuring the model configuration is correctly handled as a dictionary during the loading process. The fix is sound and is accompanied by a solid unit test that verifies the corrected behavior. I've added one suggestion for a minor refactoring to improve efficiency in the configuration upgrade logic. Overall, this is a good contribution that improves the robustness of model serialization.

noahho added 2 commits August 3, 2025 10:44
…ng_weights

# Conflicts:
#	src/tabpfn/model/config.py
@noahho
Collaborator

noahho commented Aug 3, 2025

@spirousschuh thanks for fixing this! I just merged with main, which was quite a change; would you like to double-check?

@noahho
Collaborator

noahho commented Aug 3, 2025

Sorry, I have by now deviated a lot from your initial PR. Is it right that test_load_fitted_model already tests your case?

@spirousschuh
Contributor Author

Hi, thanks for looking at my PR. I assume you are referring to the test "test_save_load_happy_path". It does cover loading a model, i.e. a checkpoint with the keys "state_dict" and "config". In that test, however, what is loaded from the "config" key and passed to the "ModelConfig.upgrade_config" method is of type dict.

The problem I ran into was that the "save_tabpfn_model" function does not save a dict, but a ModelConfig, in the checkpoint: https://github.com/PriorLabs/TabPFN/blob/main/src/tabpfn/model_loading.py#L615
So loading such a checkpoint with the loading functions, like "initialize_tabpfn_model", failed before. Therefore, I would recommend keeping the unit test that you removed, because it checks that the format written by "save_tabpfn_model" is compatible with the loading functions.

@noahho
Collaborator

noahho commented Aug 4, 2025

@spirousschuh okay great. Maybe the test file for saving and loading the model is the most appropriate place for the test - would be great to add that test there?

@spirousschuh
Contributor Author

@noahho I added the test as you suggested, and I added a small twist to the PR: save_tabpfn_model now internally converts the ModelConfig to a dict before saving. This avoids adding the context manager to the loading routine that allows loading custom classes. So we would no longer need the loading function's ability to load a ModelConfig. Do you want me to remove it?
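A minimal sketch of that saving-side change, under stated assumptions: the real save_tabpfn_model writes a torch checkpoint, so pickle and the helper names save_checkpoint/load_checkpoint here are stand-ins; only the asdict() conversion reflects the actual fix.

```python
import pickle
from dataclasses import asdict, dataclass
from pathlib import Path
from tempfile import TemporaryDirectory


@dataclass
class ModelConfig:
    """Illustrative stand-in for tabpfn's ModelConfig."""

    emsize: int = 192


def save_checkpoint(state_dict: dict, config: ModelConfig, path: Path) -> None:
    # The real save_tabpfn_model uses torch.save; the key change is that
    # the config is converted with asdict() before it enters the checkpoint.
    payload = {"state_dict": state_dict, "config": asdict(config)}
    path.write_bytes(pickle.dumps(payload))


def load_checkpoint(path: Path) -> dict:
    loaded = pickle.loads(path.read_bytes())
    # No custom-class allow-listing is needed: the config is a plain dict.
    assert isinstance(loaded["config"], dict)
    return loaded


with TemporaryDirectory() as tmp:
    ckpt = Path(tmp) / "model.ckpt"
    save_checkpoint({"layer.weight": [0.1, 0.2]}, ModelConfig(), ckpt)
    restored = load_checkpoint(ckpt)
    print(restored["config"]["emsize"])  # the config round-trips as a dict
```

Because only built-in types end up in the checkpoint, the loader never has to deserialize a custom class.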

@noahho
Collaborator

noahho commented Aug 5, 2025

Thanks a lot, also for adding that test. The fix now seems simpler, so removing the previous fix seems like a good idea.
Are you familiar with how to run ruff on the code properly? Otherwise I can give it a go.
Also, was there a typing issue with the input to the save/load functions?

@spirousschuh
Contributor Author

Ok. I undid the changes in the upgrade_config method and fixed the linting error.

To ruff the code I ran pre-commit run --all-files. If you mean anything beyond that, please let me know.

@noahho noahho enabled auto-merge (squash) August 6, 2025 06:47
Collaborator

@noahho noahho left a comment


Amazing, thanks a lot for this!

@noahho noahho requested a review from Copilot August 6, 2025 06:48
@noahho
Collaborator

noahho commented Aug 6, 2025

/gemini review

Contributor

Copilot AI left a comment


Pull Request Overview

This PR fixes a compatibility issue between the model saving and loading functionality where saved models could not be properly loaded due to a type mismatch between ModelConfig objects and dictionaries.

  • Updates save_tabpfn_model to save the model config as a dictionary instead of a ModelConfig object
  • Removes an unnecessary blank line in the config upgrade method
  • Adds a comprehensive unit test to verify the save/load functionality works correctly

Reviewed Changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 1 comment.

Reviewed files:

  • src/tabpfn/model_loading.py: converts the ModelConfig to a dictionary using asdict() before saving and imports the required function
  • src/tabpfn/architectures/base/config.py: removes an extraneous blank line
  • tests/test_save_load_fitted_model.py: adds a comprehensive save/load test with weight-preservation verification
Comments suppressed due to low confidence (1)

tests/test_save_load_fitted_model.py:96

  • The test only covers the regressor case but the fix applies to both TabPFNRegressor and TabPFNClassifier. Consider adding a similar test for TabPFNClassifier to ensure the fix works for both model types.
    regressor = TabPFNRegressor(model_path="auto", device="cpu", random_state=42)

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request addresses an issue with saving and loading model weights and configurations. The core change involves converting the model configuration to a dictionary before saving, which ensures compatibility with the loading process that expects a dictionary. A new test case has been added to verify this save/load cycle.

My main feedback is regarding backward compatibility. The current change will prevent models saved with the older version of the code from being loaded. I've suggested a modification to the loading logic to handle both old and new model formats, ensuring a smooth transition for users with existing saved models.
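That backward-compatibility concern could be addressed with a loader-side guard along these lines. This is a sketch: normalize_loaded_config is a hypothetical helper, not an actual tabpfn function, and the dataclass check stands in for detecting the old ModelConfig-based checkpoint format.

```python
from dataclasses import asdict, is_dataclass
from typing import Any


def normalize_loaded_config(config: Any) -> dict[str, Any]:
    """Accept both checkpoint formats: old checkpoints store a ModelConfig
    dataclass instance under the "config" key, new ones store a plain dict."""
    if is_dataclass(config) and not isinstance(config, type):
        return asdict(config)  # old format: dataclass instance
    if isinstance(config, dict):
        return dict(config)  # new format: already a plain dict
    raise TypeError(f"Unsupported config type: {type(config).__name__}")
```

Calling this immediately after reading the checkpoint would let both old and new saved models flow into the same dict-based upgrade path.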

@noahho noahho merged commit 3ee9e3d into PriorLabs:main Aug 6, 2025
8 checks passed
oscarkey pushed a commit that referenced this pull request Nov 12, 2025
* Record copied public PR 420

* fix saving and loading model including weights (#420)

Co-authored-by: noahho <Noah.homa@gmail.com>

---------

Co-authored-by: mirror-bot <mirror-bot@users.noreply.github.com>
Co-authored-by: spirousschuh <hasensilvester@gmail.com>
Co-authored-by: noahho <Noah.homa@gmail.com>
Co-authored-by: Noah Hollmann <noah@priorlabs.ai>