ycjuan / libffm

A Library for Field-aware Factorization Machines

When I add some numeric continuous features, the loss decreases slowly

huanyu-liao opened this issue

I found that the loss decreases quickly when all features are categorical. But when some numerical features are included in the model, the loss decreases very slowly, even after 150 iterations.
Could you tell me why, or give me some advice?

I think you can try turning off normalization with --no-norm. libffm applies instance-wise normalization, so an instance that contains a numerical feature with a large raw value gets a very small scaling factor (its values are divided by a large norm). That shrinks every feature in the instance, including the categorical ones, and can slow down gradient descent.
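
For intuition, here is a minimal sketch of that effect in Python. It is an illustration, not libffm code: it assumes the per-instance scaling factor is 1 over the squared L2 norm of the instance's feature values (as in libffm's source when normalization is enabled), and the feature values below are made up.

```python
# Sketch of libffm-style instance-wise normalization.
# Assumption: scaling factor = 1 / ||x||^2 (1 over the squared L2 norm).

def instance_scale(values):
    """Per-instance scaling factor applied when normalization is on."""
    norm_sq = sum(v * v for v in values)
    return 1.0 / norm_sq

# All-categorical instance: every one-hot value is 1, so the factor is moderate.
categorical = [1.0] * 10
print(instance_scale(categorical))    # 0.1

# Same instance plus one unscaled numeric feature (e.g. a raw count).
with_numeric = categorical + [500.0]
print(instance_scale(with_numeric))   # ~4e-6: every value shrunk ~25000x
```

With the factor that small, the effective feature values (and hence the gradients) are tiny, which matches the slow loss decrease you see. Passing --no-norm to ffm-train makes it use the raw values directly; alternatively, rescaling the numeric features to a magnitude comparable to the one-hot features should avoid the problem while keeping normalization on.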