Yogthos on Nostr: ByteDance just dropped Doubao-1.5-pro, which uses a sparse MoE architecture; it matches GPT ...
ByteDance just dropped Doubao-1.5-pro, which uses a sparse MoE architecture. It matches GPT-4o on benchmarks while being 50x cheaper to run, and it's 5x cheaper than DeepSeek.
https://www.aibase.com/news/14931

#ai #MachineLearning #technology
Published at 2025-01-24 14:57:52

Event JSON
{
  "id": "dc9dc62cd3d86eb67502fa4d52c50378d43a83ea0e8834cbbb32b9ab8a2b2e2a",
  "pubkey": "b1dcb61f2cdc8f5c8d92226a074c2a23bd2d5728fb4d4f277142fe9119f0392d",
  "created_at": 1737730672,
  "kind": 1,
  "tags": [
    ["t", "ai"],
    ["t", "machinelearning"],
    ["t", "technology"],
    ["proxy", "https://social.marxist.network/users/yogthos/statuses/113883917330454356", "activitypub"]
  ],
  "content": "ByteDance just dopped Doubao-1.5-pro tht uses sparse MoE architecture, it matches GPT 4o benchmarks while being 50x cheaper to run, and it's 5x cheaper than DeepSeek.\n\nhttps://www.aibase.com/news/14931\n\n#ai #MachineLearning #technology",
  "sig": "b9f707432f6c2c9d16c7c3563b4d69ac0ce5990ecd410eff9d6a0676957cfb60f00d1099d371944f3c7c18a1b1c9c404d2b7d5ba900361f2db5a23760289adf0"
}

(The `content` field is reproduced verbatim, typos included, since the event's `id` and `sig` are computed over the original bytes.)
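For readers unfamiliar with the Event JSON above: under the Nostr protocol (NIP-01), the `id` field is the SHA-256 hash of a canonical serialization of the event's other fields. The sketch below shows the general idea in Python; note that `json.dumps` does not exactly match NIP-01's character-escaping rules in every edge case, so this is an illustration of the derivation, not a drop-in verifier.

```python
import hashlib
import json

def nostr_event_id(pubkey, created_at, kind, tags, content):
    # NIP-01: the event id is the lowercase hex SHA-256 of the JSON array
    # [0, pubkey, created_at, kind, tags, content], serialized compactly
    # (no extra whitespace) and encoded as UTF-8.
    payload = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Fields from the event above (content abbreviated here for illustration,
# so this call will NOT reproduce the real id, which hashes the full text).
event_id = nostr_event_id(
    pubkey="b1dcb61f2cdc8f5c8d92226a074c2a23bd2d5728fb4d4f277142fe9119f0392d",
    created_at=1737730672,
    kind=1,
    tags=[["t", "ai"], ["t", "machinelearning"], ["t", "technology"]],
    content="example content",
)
print(event_id)  # 64-character lowercase hex digest
```

The `sig` field is then a Schnorr signature over that 32-byte id, made with the key whose public half appears in `pubkey`, which is why the `content` bytes must stay exactly as published for the event to verify.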