News

ByteDance's Doubao LLM Daily Token Usage Soars to 120 Trillion, Signaling Explosive AI Growth and Enterprise Adoption

Recent announcements from a Volcano Engine event in Wuhan have shed light on the latest advancements in China's large language model (LLM) and AI Agent ecosystem:

Firstly, ByteDance's Doubao LLM has surpassed a daily average of 120 trillion tokens in usage. This figure not only positions it as a leader in the domestic market but also highlights its astonishing growth rate: usage has nearly doubled from the 63 trillion tokens reported just three months prior, and stands at roughly 1,000 times the level of May 2024. This data underscores the explosive growth Doubao LLM is experiencing in real-world applications.
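As a quick sanity check on the growth multiples above, here is a minimal sketch; note that the May 2024 baseline is inferred from the stated ~1,000-fold increase and is not given directly in the article:

```python
# Sanity-check the reported growth multiples for Doubao's daily token usage.
# Figures are in trillions of tokens per day.

current = 120.0           # daily average, per the Wuhan announcement
three_months_ago = 63.0   # figure cited three months prior

growth_3mo = current / three_months_ago
print(f"3-month growth: {growth_3mo:.2f}x")  # ~1.90x, i.e. nearly doubled

# The claimed 1,000-fold increase since May 2024 implies a baseline of
# about 0.12 trillion (120 billion) tokens per day back then.
implied_may_2024 = current / 1000
print(f"Implied May 2024 baseline: {implied_may_2024:.2f} trillion tokens/day")
```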

Secondly, the number of enterprises whose cumulative Doubao usage exceeds one trillion tokens has reached 140, up from 100 last year. This increase reflects the growing impact of AI Agent solutions (referred to as "Lobster Agents" in the original context) across industries, as businesses actively integrate and adopt these technologies.

Furthermore, Tan Dai, CEO of Volcano Engine, outlined three essential elements for developing AI Agents (playfully termed "raising cyber lobsters"): Models, Skills, and Security. These principles underpin Volcano Engine's proprietary ArkClaw platform and serve as core guidelines for building and deploying robust AI Agents.

Lastly, Seedance 2.0, the highly anticipated AI tool for video and short-drama creation, has officially opened its public beta to enterprise clients. The launch is expected to open new opportunities in AI-powered content generation.

Taken together, these announcements show that token consumption has emerged as a key metric for gauging both the pace of AI technological advancement and the depth of its real-world adoption.
