deepseek-ai / DeepSeek-V2

DeepSeek-V2: A Strong, Economical, and Efficient Mixture-of-Experts Language Model


Comparison Between MLA and MHA in dense model

mx8435 opened this issue

Hi, great job. Did you run an ablation study comparing the performance of MLA and MHA in a dense model? Thanks.

@mx8435 Our early experiments already verified that MLA outperforms MHA on a 7B dense model. The comparison was done at matched total parameter count: since MLA uses fewer attention parameters per layer, the MLA model was given more layers so that the overall parameter counts of the two models were aligned.
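
As a rough illustration of what "aligning the overall number of parameters" means, here is a minimal back-of-the-envelope sketch. All dimensions below (hidden size, FFN size, latent widths `d_c` and `d_c_q`, layer count) are assumed, LLaMA-7B-like values for illustration only, not the authors' actual configuration, and the MLA accounting ignores the decoupled RoPE projections for brevity.

```python
# Hypothetical sketch: estimate how many extra Transformer layers an MLA model
# would need so its total parameter count matches an MHA baseline of the same width.
# All sizes are assumed for illustration; they are not DeepSeek's actual configs.

d_model = 4096          # hidden size of a ~7B dense model (assumed)
n_layers_mha = 32       # baseline MHA depth (assumed)
d_ffn = 11008           # gated-FFN inner size (assumed)

# Per-layer parameter estimates, ignoring norms and biases.
mha_attn = 4 * d_model * d_model          # W_Q, W_K, W_V, W_O
ffn = 3 * d_model * d_ffn                 # W_gate, W_up, W_down

# MLA replaces the full-rank Q and K/V projections with low-rank
# down/up projections through latent vectors of width d_c_q and d_c (assumed).
d_c, d_c_q = 512, 1536
mla_attn = (
    d_model * d_c_q + d_c_q * d_model     # query down- and up-projection
    + d_model * d_c + 2 * d_c * d_model   # shared KV down-projection, K and V up-projections
    + d_model * d_model                   # output projection W_O
)

mha_layer = mha_attn + ffn
mla_layer = mla_attn + ffn
total_mha = n_layers_mha * mha_layer

# Extra depth needed so the MLA model's total parameters match the MHA baseline.
n_layers_mla = round(total_mha / mla_layer)
print(f"MHA: {n_layers_mha} layers  ->  MLA needs ~{n_layers_mla} layers to match parameters")
```

With these assumed sizes the sketch suggests roughly 38 MLA layers against 32 MHA layers, which is the kind of depth adjustment the reply describes when compensating for MLA's smaller attention-parameter count.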