Inside the Creation of DBRX, One of the World’s Most Powerful Open Source AI Models


Last Monday, about a dozen engineers and managers at the data science and AI company Databricks gathered in conference rooms connected via Zoom to learn whether they had succeeded in building a top artificial intelligence language model. The team had spent months, and roughly $10 million, training DBRX, a large language model similar in design to the one behind OpenAI’s ChatGPT. But they wouldn’t know how powerful their creation was until results came back from the final tests of its abilities.

“We’ve surpassed everything,” Jonathan Frankle, chief neural network architect at Databricks and leader of the team that built DBRX, finally told the group, which responded with emojis, cheers, and applause. Frankle, who usually abstains from caffeine, was sipping an iced latte after pulling an all-nighter to write up the results.

Databricks will release DBRX under an open license, allowing others to build on top of its work. Frankle shared data showing that across about a dozen benchmarks testing an AI model’s ability to answer general knowledge questions, perform reading comprehension, solve logic puzzles, and generate high-quality code, DBRX outperformed every other open source model available.

AI Decision Makers: Jonathan Frankle, Naveen Rao, Ali Ghodsi, and Hanlin Tang. Photo: Gabriela Hasbun

It outshone Meta’s Llama 2 and Mistral’s Mixtral, two of the most popular open source AI models available today. “Yes!” shouted Ali Ghodsi, CEO of Databricks, when the scores were revealed. “Wait, did we beat the Elon thing?” Frankle replied that they had indeed beaten Grok, the AI model recently open-sourced by Musk’s xAI, adding, “I’ll consider it a success if we get a tweet from him.”

To the team’s surprise, on several measures DBRX also came close to GPT-4, OpenAI’s closed model that powers ChatGPT and is widely considered the pinnacle of machine intelligence. “We’ve set a new standard for open LLMs,” Frankle said with a huge grin.


By open-sourcing DBRX, Databricks is adding to a movement that challenges the secretive approach of the most prominent companies in modern AI development. OpenAI and Google keep the code for their GPT-4 and Gemini models, among the most widely used, closely guarded, but some rivals, notably Meta, have released their models for others to use, arguing that doing so will advance the technology by putting it in the hands of more researchers, entrepreneurs, startups, and established businesses.

Databricks says it also wants to be open about the work involved in creating its open source model, something Meta has not done for some key details of the creation of its Llama 2 models. The company released a blog post detailing the work that went into the model, and also asked WIRED to spend time with Databricks engineers as they made key decisions during the final stages of the multimillion-dollar DBRX training run. That provided a glimpse of how complex and challenging it is to build a leading AI model, and of how recent innovations in the field promise to bring down costs. That, combined with the availability of open source models like DBRX, suggests that AI development isn’t about to slow down anytime soon.

Ali Farhadi, CEO of the Allen Institute for AI, says greater transparency around the building and training of AI models is badly needed. The field has become increasingly secretive in recent years as companies have sought a competitive edge. Transparency is especially important when there are concerns about the risks that advanced AI models could pose, he says. “I’m very happy to see any effort to open up,” Farhadi says. “I do believe a significant portion of the market will move toward open models. We need more of that.”
