wetliu / energy_ood

How did you come up with your idea?

algoteam5 opened this issue · comments

I am reading your paper, but I don't understand the math part. How did you come up with using the Gibbs distribution and Helmholtz free energy? What kinds of math books can teach us about the Gibbs distribution and Helmholtz free energy? Are they probability books, or do they combine different areas of math? I really want to know how you linked all of this math together to create your awesome paper idea.

Thanks for your interest in our paper! In case you are attending NeurIPS this week, you might want to check out this nice tutorial: https://neurips.cc/media/neurips-2021/Slides/21896.pdf (ML for Physics and Physics for ML). It covers some nice ground.

The idea stems from the taken-for-granted usage of softmax: no one had answered why we use it and what assumptions we make when we do. After all, if we simply need a non-negative mapping of any value to [0, 1], taking the absolute value could do the same work. It turns out that the maximum-entropy principle under a fixed mean is the key. In other words, once we use softmax, the mean is fixed; in statistical mechanics, this means the mean energy (or total energy) is fixed. That was the starting point of our project.
We had long had an intuition that the energy score corresponds to the data distribution, based on some books and LeCun's papers, but those simply mentioned it without proof. The theoretical connection was worked out thanks to @YixuanLi.
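To make the softmax/energy connection above concrete, here is a minimal sketch (not code from this repo): softmax read as a Boltzmann (Gibbs) distribution over the logits, and the negative log of its partition function as a Helmholtz-style free energy, which is the form of energy score used in the paper. The function names `softmax` and `free_energy` are illustrative, not taken from the repository.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Softmax viewed as a Boltzmann (Gibbs) distribution:
    # p(y | x) = exp(f_y(x) / T) / sum_i exp(f_i(x) / T),
    # where the logits f_i(x) play the role of negative energies.
    z = logits / T
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def free_energy(logits, T=1.0):
    # Helmholtz-style free energy of the logits:
    # E(x) = -T * log sum_i exp(f_i(x) / T)
    # (the negative log-partition function, computed stably).
    z = logits / T
    m = z.max()
    return -T * (m + np.log(np.exp(z - m).sum()))
```

Note that `free_energy` depends on the overall magnitude of the logits, while `softmax` is invariant to shifting them by a constant; that extra information is what makes the energy usable as a score.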

Admittedly, some discrepancies still remain to be answered...

Thank you both for answering! Even though I am still struggling to understand the math part :(