LLaMA-MoE-v2 is a series of open-sourced Mixture-of-Experts (MoE) models based on LLaMA3. We build LLaMA-MoE-v2 in the following two steps: partition LLaMA's FFN layers or Attention layers into ...
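
Below is a minimal sketch (not the repo's actual implementation) of the idea behind the first step: carving a LLaMA-style SwiGLU FFN into sparse experts along its intermediate dimension and inserting a top-K gate in front of them. All class, parameter, and layer names here are hypothetical illustrations.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKGatedFFN(nn.Module):
    """Hypothetical MoE FFN built by partitioning one dense FFN into experts."""

    def __init__(self, hidden_size=4096, intermediate_size=14336,
                 num_experts=8, top_k=2):
        super().__init__()
        assert intermediate_size % num_experts == 0
        self.num_experts = num_experts
        self.top_k = top_k
        expert_dim = intermediate_size // num_experts  # each expert gets a slice of the FFN
        # Each expert is a smaller SwiGLU block carved out of the original FFN.
        self.experts = nn.ModuleList([
            nn.ModuleDict({
                "gate_proj": nn.Linear(hidden_size, expert_dim, bias=False),
                "up_proj": nn.Linear(hidden_size, expert_dim, bias=False),
                "down_proj": nn.Linear(expert_dim, hidden_size, bias=False),
            })
            for _ in range(num_experts)
        ])
        # Top-K gate inserted in front of the experts.
        self.router = nn.Linear(hidden_size, num_experts, bias=False)

    def forward(self, x):
        # x: (batch, seq, hidden) -> flatten tokens for per-token routing
        tokens = x.view(-1, x.size(-1))
        scores = F.softmax(self.router(tokens), dim=-1)
        weights, indices = scores.topk(self.top_k, dim=-1)      # (tokens, K)
        weights = weights / weights.sum(dim=-1, keepdim=True)   # renormalize gate weights
        out = torch.zeros_like(tokens)
        for k in range(self.top_k):
            for e in range(self.num_experts):
                mask = indices[:, k] == e
                if mask.any():
                    expert = self.experts[e]
                    h = F.silu(expert["gate_proj"](tokens[mask])) * expert["up_proj"](tokens[mask])
                    out[mask] += weights[mask, k:k + 1] * expert["down_proj"](h)
        return out.view_as(x)
```

In practice, the expert weights would be initialized from slices of the pretrained dense FFN so that the partitioned model starts close to the original, and the second step (supervised fine-tuning) then recovers and improves quality.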