2024-11-06 14:00
【Hunyuan-Large: Tencent's large MoE (Mixture of Experts) model, with 389 billion total parameters and 52 billion active parameters; currently the largest open-source Transformer-based MoE model in the industry, focused on natural language processing and long-text understanding】'Tencent/Tencent-Hunyuan-Large - Hunyuan-Large (Hunyuan-MoE-A52B) model is the largest open-source Transformer-based MoE model in the industry, featuring a total of 389 billion parameters and 52 billion active parameters.' GitHub: github.com/Tencent/Tencent-Hunyuan-Large #人工
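
The headline numbers reflect the usual MoE trade-off: a router selects only a few experts per token, so each forward pass computes with the 52 billion "active" parameters rather than all 389 billion. Below is a minimal, illustrative top-k routing sketch with toy sizes and a hypothetical `moe_layer` helper; it is not Hunyuan-Large's actual implementation, just the general mechanism the post refers to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes for illustration only; Hunyuan-Large's real dimensions differ.
d_model, n_experts, top_k = 8, 4, 2

# One weight matrix per expert, plus a small routing matrix.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router_w = rng.standard_normal((d_model, n_experts))

def moe_layer(x):
    """Route one token vector x to its top-k experts and mix their outputs."""
    logits = x @ router_w                  # router scores, shape (n_experts,)
    top = np.argsort(logits)[-top_k:]      # indices of the k highest-scoring experts
    gates = np.exp(logits[top])
    gates /= gates.sum()                   # softmax over the selected experts only
    # Only the selected experts run: these are the "active" parameters;
    # the remaining experts' weights sit idle for this token.
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

x = rng.standard_normal(d_model)
y = moe_layer(x)
print(y.shape)  # (8,) -- computed using top_k / n_experts of the expert weights
```

With top_k=2 of 4 experts, each token touches half the expert parameters; scale the same idea up and a 389B-parameter model can run with only 52B parameters active per token.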