Issues with InferenceConfig in Newer Versions & Query on Performance Improvements
rascalforsjtu opened this issue
Hi there,
I’m a user of TabPFN and have been relying on it for my classification tasks. Recently, I tried updating to the latest versions of TabPFN, but ran into a problem: the InferenceConfig class, which my pre-trained models depend on, seems to have been removed or restructured in newer releases. This made it impossible to load my existing models—after several attempts, I had no choice but to roll back to v2.0.9, where everything still works.
It’s a bit frustrating to be stuck on an older version, but I understand that software evolves and changes are often necessary for improvement!
That said, I’m curious: have the newer versions (post-v2.0.9) brought significant performance improvements? For example, better accuracy, faster inference, or improved handling of certain datasets?
I’m weighing whether it’s worth the effort to re-fine-tune my models with the latest version to take advantage of any upgrades. Any insights or recommendations you could share would be really helpful.
Thanks for your hard work on TabPFN—it’s a great tool, and I appreciate the continuous updates!
Hi @rascalforsjtu, breaking changes are inevitable; I'm afraid you'll need to do some work on your end to adapt your config. InferenceConfig has been renamed to ModelConfig, and there is a method that can convert your existing configuration: tabpfn.model.config.ModelConfig.upgrade_config.
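For example, the migration might look roughly like this (a minimal sketch only; the checkpoint path, the "config" key, and the exact upgrade_config signature are assumptions on my part, so please check them against your installed version):

```python
# Minimal migration sketch. Assumptions (verify against your installed
# version): the finetuned checkpoint is a dict saved with torch.save,
# the old InferenceConfig contents live under a "config" key, and
# ModelConfig.upgrade_config accepts that old config and returns the
# new-schema equivalent.
import torch

from tabpfn.model.config import ModelConfig

checkpoint = torch.load("my_finetuned_tabpfn.ckpt", map_location="cpu")

# Convert the v2.0.9-era config to the new ModelConfig schema.
checkpoint["config"] = ModelConfig.upgrade_config(checkpoint["config"])

torch.save(checkpoint, "my_finetuned_tabpfn_upgraded.ckpt")
```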
Hope that helps.
Out of curiosity, what exactly are you changing? It seems to me that for inference it might not be a good idea to play with those.
Hi @rascalforsjtu, thanks for posting! As @iivalchev writes, this change was important for us to make: the config object was much too complex, and restructuring it let us simplify things considerably.
That said, I’m curious: have the newer versions (post-v2.0.9) brought significant performance improvements? For example, better accuracy, faster inference, or improved handling of certain datasets?
It has brought significant improvements. Could you try running the new model version without finetuning first to see how it does now? That would help determine whether there is an improvement for your specific case.
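If it helps, a quick comparison could look like this (just a sketch; the dataset here is a stand-in for your own data):

```python
# Quick check: run the stock pretrained model from the latest release
# with no finetuning and compare against your finetuned v2.0.9 scores.
from sklearn.datasets import load_breast_cancer  # stand-in for your data
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

from tabpfn import TabPFNClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

clf = TabPFNClassifier()
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```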
Out of curiosity, what exactly are you changing? It seems to me that for inference it might not be a good idea to play with those.
My actions were strictly limited to training and saving the model; no other modifications were made. When newer versions failed to load it, I reverted to v2.0.9, since retraining would require significant effort without guaranteed results. Your guidance helped me understand the root cause, which I truly appreciate!
Could you try running the new model version without finetuning first to see how it does now?
Thanks for the suggestion! Will definitely try the new version in other projects down the road.