yihong-chen / neural-collaborative-filtering

PyTorch version of neural collaborative filtering


L2 regularization

ttgump opened this issue · comments

Hi,
I noticed that in your config code for mlp and neumf you set l2_regularization, but I can't find any L2 regularization in your model or loss implementation. Could you help me understand how the L2 regularization is implemented? Thanks.

Hi, the l2_regularization is simply the weight-decay hyper-parameter passed to the optimizer, so the penalty is applied inside the optimizer update rather than appearing explicitly in the model or the loss. I hope this helps.
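In PyTorch that amounts to passing the config value as the optimizer's weight_decay argument. A minimal sketch (the model and the config values here are placeholders for illustration, not the repo's actual code):

```python
import torch

# Hypothetical model and config values, just to show where the key is used.
model = torch.nn.Linear(8, 1)
config = {'adam_lr': 1e-3, 'l2_regularization': 1e-6}

# weight_decay makes Adam add weight_decay * p to each parameter's gradient,
# which is equivalent to an L2 penalty on the weights -- so nothing extra
# needs to be added to the model or the loss function.
optimizer = torch.optim.Adam(model.parameters(),
                             lr=config['adam_lr'],
                             weight_decay=config['l2_regularization'])
```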