OpenAI dominates the generative AI market, and its GPT-4 is the industry’s best-performing model to date. But businesses are increasingly choosing to build their own, smaller AI models that are more tailored to their needs.
Salesforce, for example, launched two coding AI assistants called Einstein for Developers and Einstein for Flow, which were trained on both Salesforce’s internal programming data and open source data. They are “small” AI models for niche business applications. The assistants can also write poems and such, but they won’t be as good at it because they haven’t been trained on the broader Internet like ChatGPT, said Patrick Stokes, Salesforce’s executive vice president of product.
With OpenAI, Google, Amazon and Meta focused on building bigger and bigger AI models, there’s still a good case for companies to wait and see what capabilities emerge. But there could well be an ocean of smaller AI models designed for specific tasks, meaning people could interact with different AI bots for different activities throughout their day. Ultimately, companies may find they can adopt AI in a less expensive way by focusing on specific applications, said Yoon Kim, an assistant professor at the Massachusetts Institute of Technology whose research focuses on making generative AI models more efficient.
“You can’t use ChatGPT out of the box”
Braden Hancock is the chief technology officer of Snorkel AI, a Redwood City, California-based company that refines AI models. He has helped businesses, many in the financial sector, build small AI models that power bots that do one thing: a customer service assistant, or a coding assistant, for example.
“There was maybe a moment early on at the beginning of the year, right after ChatGPT came out, where people weren’t quite sure—like, oh my, is this game over? Is AI just solved now?” said Hancock. Then, upon closer inspection, companies realized there were few, if any, business applications that ChatGPT could address without modification.
What does this mean for OpenAI?
If hardware costs come down enough, there’s a scenario where GPT-4 will do everything for everyone, said Amin Ahmad, founder and CEO of Vectara, a software company focused on semantic search. AMD just released a set of chips that can lower the cost of developing AI models.
But there is another scenario where more large language models (LLMs) on the market will create greater competition for OpenAI. This may help explain why OpenAI is pushing for more AI regulation ahead of its competitors, which could make it harder for others to participate.