Stanford Researchers Build AI Program Similar to ChatGPT for $600

Robot Buddhist priest in Japan (CHARLY TRIBALLEAU/Getty)

Researchers at Stanford University have built an AI that they claim matches the capabilities of OpenAI’s ChatGPT, which currently leads the market in consumer-facing AI products. However, while powerful AIs seem to be easy and cheap to build, running them is a different matter.

The team at Stanford used LLaMA 7B, the smallest and cheapest of Facebook’s open-source language models, as the basis for their AI, which the researchers dubbed Alpaca.

OpenAI logo seen on screen with ChatGPT website displayed on mobile seen in this illustration in Brussels, Belgium, on December 12, 2022. (Photo by Jonathan Raa/NurPhoto via Getty Images)

Via New Atlas:

So, with the LLaMA 7B model up and running, the Stanford team then basically asked GPT to take 175 human-written instruction/output pairs, and start generating more in the same style and format, 20 at a time. This was automated through one of OpenAI’s helpfully provided APIs, and in a short time, the team had some 52,000 sample conversations to use in post-training the LLaMA model. Generating this bulk training data cost less than US$500.

Then, they used that data to fine-tune the LLaMA model – a process that took about three hours on eight 80-GB A100 cloud processing computers. This cost less than US$100.

Next, they tested the resulting model, which they called Alpaca, against ChatGPT’s underlying language model across a variety of domains including email writing, social media and productivity tools. Alpaca won 90 of these tests, GPT won 89.
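The data-generation step the excerpt describes, in which a model is shown a handful of human-written instruction/output pairs and asked to produce new ones in the same format, can be sketched roughly as follows. This is a minimal illustration assuming the legacy openai Python SDK and a hypothetical seed_tasks.json file holding the 175 seed pairs; it is not the Stanford team’s actual pipeline.

```python
# Rough sketch of the self-instruct-style data generation described above.
# Assumes the legacy (pre-1.0) openai Python SDK and a hypothetical
# seed_tasks.json containing the 175 human-written instruction/output pairs.
import json
import random

import openai

openai.api_key = "YOUR_API_KEY"  # placeholder


def build_prompt(seed_tasks, num_examples=3, num_new=20):
    """Show a few seed pairs and ask for 20 new ones in the same style."""
    examples = random.sample(seed_tasks, num_examples)
    prompt = "Here are some example instruction/output pairs:\n\n"
    for task in examples:
        prompt += f"Instruction: {task['instruction']}\nOutput: {task['output']}\n\n"
    prompt += f"Write {num_new} new instruction/output pairs in the same style and format.\n"
    return prompt


def generate_batch(seed_tasks):
    # text-davinci-003 is the GPT-3.5 model the excerpt refers to.
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=build_prompt(seed_tasks),
        max_tokens=2048,
        temperature=1.0,
    )
    return response["choices"][0]["text"]


if __name__ == "__main__":
    with open("seed_tasks.json") as f:  # hypothetical seed file
        seeds = json.load(f)
    # Repeating this call a few thousand times yields tens of thousands
    # of generated samples, in line with the ~52,000 figure quoted above.
    print(generate_batch(seeds))
```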
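The fine-tuning step, which the excerpt says took about three hours on eight 80 GB A100s, amounts to standard supervised training of the LLaMA model on the generated pairs. A minimal sketch using the Hugging Face transformers and datasets libraries might look like the following; the checkpoint name, data file, and hyperparameters are illustrative assumptions rather than the team’s exact configuration.

```python
# Minimal sketch of supervised fine-tuning on the generated pairs.
# Checkpoint, file names, and hyperparameters are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "huggyllama/llama-7b"  # placeholder LLaMA 7B checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # LLaMA has no pad token by default
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

# alpaca_data.json: the ~52,000 generated instruction/output pairs
dataset = load_dataset("json", data_files="alpaca_data.json", split="train")


def format_example(example):
    # Concatenate instruction and output into a single training string.
    text = f"Instruction: {example['instruction']}\nOutput: {example['output']}"
    return tokenizer(text, truncation=True, max_length=512)


tokenized = dataset.map(format_example, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="alpaca-7b",
        per_device_train_batch_size=4,
        num_train_epochs=3,
        learning_rate=2e-5,
        bf16=True,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```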

The result suggests that capable AI models are relatively cheap to reproduce. Building the technology, however, is not the same as operating it at scale.

Every time an AI returns a response to a prompt, it performs billions of calculations to produce a useful answer. This requires a great deal of computing power, which is expensive. One startup reportedly spent hundreds of thousands of dollars per month to meet its users’ AI processing requests, considerably more than a human employee would have cost.

The high operating costs of consumer-facing AI are one of the reasons why Microsoft is pouring hundreds of millions of dollars into a new supercomputer built specifically to support OpenAI and its products.

Allum Bokhari is the senior technology correspondent at Breitbart News. He is the author of #DELETED: Big Tech’s Battle to Erase the Trump Movement and Steal The Election.
