In a federal lawsuit against OpenAI in the United States, Elon Musk admitted in testimony that xAI, the artificial intelligence company he founded, had used OpenAI's models to train its chatbot Grok through so-called "distillation," once again pushing this tacit industry practice into the spotlight.

Recently, OpenAI and Anthropic have publicly criticized third parties for training new models on the behavior of their publicly accessible chatbots and APIs through intensive questioning, a process known in the industry as "distillation." In recent months, public attention has focused on certain Chinese companies, which have been accused of using distillation to create open-weight models whose capabilities approach those of cutting-edge American products while being offered at far lower cost. Within the technology community, however, many practitioners have long believed that frontier labs in the United States also use similar methods to avoid falling behind in the competition.
This speculation has now been confirmed in at least one case. Asked during Thursday's testimony in federal court in California whether xAI had used distillation to train Grok on OpenAI's models, Musk said it was a "common practice among AI companies." When asked whether that could be taken as a "yes," he answered, "partly yes."
Musk is currently suing OpenAI, along with the company's CEO Sam Altman and co-founder Greg Brockman, alleging that they betrayed OpenAI's original nonprofit mission by converting the company to a for-profit structure. The trial began this week, with Musk's testimony becoming one of its central dramas.
Musk's admission is significant because distillation is seen as a threat to the core advantage of large AI companies: these firms invest enormous sums in computing infrastructure in an attempt to stay ahead through barriers of scale, while distillation may allow other developers to train models of comparable capability at a fraction of the original cost. Against this backdrop, the industry is not without irony: to obtain enough training data, the frontier labs themselves have repeatedly tested the boundaries of copyright, and have even been accused of crossing the line, yet they must now stop others from using compliant interfaces to "learn away" their models.
Judging from the timeline, xAI was founded in 2023, several years after OpenAI, so it is unsurprising that it sought to "learn" from the then industry leader. Whether distillation constitutes a clear violation of the law is unclear. The more practical constraint may come from each company's terms of service for its products: distillation is often treated as a violation of those terms, but does not necessarily breach statutory law itself.
Faced with concerns about model "plagiarism" from China, OpenAI, Anthropic and Google have launched a joint effort through the "Frontier Model Forum" to share intelligence and jointly respond to distillation attempts. Such large-model distillation reportedly relies on systematic, large-scale automated querying to infer a model's "internal behavioral patterns." To curb this, frontier labs are trying to identify and block suspected bulk or anomalous requests so that their models are not "drained of their essence." As of press time, OpenAI had not responded to a request for comment on Musk's testimony.
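The querying step described above can be illustrated with a toy sketch: systematically prompt a "teacher" model and record the prompt-response pairs in a JSONL file of the kind commonly used as fine-tuning data for a student model. The `query_teacher` function is a stand-in assumption, not any vendor's real API, and the pipeline below is a minimal illustration of the idea, not a working distillation system.

```python
import json

def query_teacher(prompt: str) -> str:
    """Toy stand-in for a remote chatbot API (illustrative assumption)."""
    return prompt.upper()  # placeholder for the teacher model's answer

def build_distillation_set(prompts, path="distill.jsonl"):
    """Query the teacher over a batch of prompts and record the
    (prompt, response) pairs as JSONL fine-tuning data."""
    records = []
    with open(path, "w", encoding="utf-8") as f:
        for p in prompts:
            rec = {"prompt": p, "response": query_teacher(p)}
            f.write(json.dumps(rec, ensure_ascii=False) + "\n")
            records.append(rec)
    return records

data = build_distillation_set(["what is distillation?", "rank the labs"])
print(len(data), data[0]["response"])
```

The batch-and-record pattern is also why labs can try to detect distillation: many thousands of systematic, templated requests from one client look very different from organic chat traffic.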
Later in the trial, Musk was asked about a high-profile statement he made last summer: that xAI would soon surpass every company except Google in capability. In court he offered a subjective ranking of the world's major AI providers, saying Anthropic currently ranks first, followed by OpenAI and Google, and placing China's open-source models in second place. By comparison, he described xAI as a much smaller company, currently with just a few hundred employees.