ChatGPT, how did you get here? It was a long journey through open source AI

Without publicly accessible code, there would be no AI chatbot

Opinion When OpenAI released ChatGPT, built on GPT-3.5, in late November 2022, no one expected much from the new release. It was just a "research preview," explained Sandhini Agarwal, an AI policy researcher at OpenAI. "We didn't want to oversell it as a big fundamental advance," added Liam Fedus, a scientist at the org.

Ha! That was then. This is now. 

Unless you've been living under a rock, you know ChatGPT has since become the hottest technology development this decade, heck, maybe this century. At least Bill Gates – you remember him, right? – thinks it's the biggest thing since he was introduced to the idea of a graphical user interface (GUI) in 1980. That led to a product called Windows.

Amusingly enough, there was nothing all that new in the ChatGPT release. It used the same large language model (LLM) as earlier GPT-3.5 versions. The key difference was that you could now simply ask it questions in natural language instead of reaching the model through application programming interfaces (APIs) or API-driven programs.
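
For context, here's roughly what that API-driven route looked like – a minimal sketch using the OpenAI Python client of that era; the model name, prompt, and settings are purely illustrative and not anything from OpenAI's own code:

```python
# A minimal sketch of the older, API-only way to query an OpenAI model,
# using the openai Python client as it existed at the time.
# The model name, prompt, and settings here are illustrative only.
import openai

openai.api_key = "sk-..."  # your API key

response = openai.Completion.create(
    model="text-davinci-003",  # a GPT-3.5-family model exposed over the API
    prompt="Explain what a large language model is in one sentence.",
    max_tokens=60,
    temperature=0.7,
)

print(response["choices"][0]["text"].strip())
```

ChatGPT's web interface did away with all of that: type a question, get an answer.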

By making ChatGPT that easy to use, OpenAI, to its surprise, saw it become wildly popular. And, oh, by the way, given that Microsoft then invested $10 billion in the business, ChatGPT seems to have done just fine by the company, too.

So, great news for open source, too, right? I mean, the company's name is OpenAI, yes? Yes, the name still has open in it, but the source code and the services based on it haven't been open for some time.

While Google's newly released answer to ChatGPT, Bard, "thinks" that "the GPT-4 model and ChatGPT are both open source projects," it's wrong.

It was meant to be open source, said one of the company's co-founders, another guy you may have heard of named Elon Musk. Musk noted: "OpenAI was created as an open source (which is why I named it 'Open' AI), non-profit company to serve as a counterweight to Google, but now it has become a closed source, maximum-profit company effectively controlled by Microsoft. Not what I intended at all."

What happened is that Musk left OpenAI, which was then a non-profit corporation, in 2018 to focus on SpaceX and Tesla. The next year, seeing it would need more money, OpenAI became, for all intents and purposes, a for-profit company. As Sam Altman, OpenAI's CEO, subsequently tweeted: "we will have to monetize it somehow at some point; the compute costs are eye-watering." I guess $10 billion only goes so far.

In other words, yet another company failed to figure out how to monetize its open source work. Then, having used open source to build up to GPT-2, it closed the doors on the code.

Indeed, even before Microsoft poured that big money into OpenAI, it had exclusively licensed the GPT-3 language model in 2020.

Mind you, ChatGPT still uses open source code. Just this week, when it became clear some users could see titles from other people's chat histories, Altman blamed an open source library: "We had a significant issue in ChatGPT due to a bug in an open source library, for which a fix has now been released and we have just finished validating."

The bug originated in redis-py*, the open source Redis client library for Python.

Now that OpenAI is in it for the money, it is no longer living up to the Open in its name. As Ben Schmidt, Nomic AI's VP of information design, tweeted: "I think we can call it shut on 'Open' AI: the 98 page paper introducing GPT-4 proudly declares that they're disclosing *nothing* about the contents of their training set."

Step back before all this, though, and you can trace ChatGPT's journey through the open source programs at the heart of AI: machine learning, natural language processing, and deep learning frameworks.

In particular, TensorFlow and PyTorch, developed by Google and Facebook, respectively, fueled ChatGPT. These frameworks provide essential tools and libraries for building and training deep learning models. Without them, there is no ChatGPT. 
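
To give a flavor of what those frameworks provide – and only a flavor; this toy sketch has nothing to do with OpenAI's actual code – here is a small PyTorch model and a single training step:

```python
# A toy PyTorch example: define a small network and run one training step.
# This only illustrates what the framework supplies (models, autograd,
# optimizers) -- it is nothing like an LLM and nothing like OpenAI's code.
import torch
from torch import nn

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

inputs = torch.randn(8, 10)          # a batch of 8 random feature vectors
targets = torch.randint(0, 2, (8,))  # random class labels

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()                      # autograd computes the gradients
optimizer.step()                     # the optimizer updates the weights
print(f"loss: {loss.item():.4f}")
```

Scale that basic recipe up by a few billion parameters and a few thousand GPUs and you're in LLM territory.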

Another vital open source ingredient in ChatGPT is Hugging Face's oddly named Transformers library, the leading open source library for building state-of-the-art machine learning models. It provides pre-trained models, architectures, and tools for natural language processing tasks, letting developers build on existing models and fine-tune them for specific use cases. ChatGPT benefited immensely from the library's support for the GPT series of models, enabling rapid deployment and scaling.
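
As a rough illustration of that workflow, this is all it takes to pull the openly released GPT-2 model down via Transformers and generate text with it; the prompt and settings are arbitrary, and ChatGPT's own model is, of course, far larger and not downloadable:

```python
# Fetch the open GPT-2 model via Hugging Face Transformers and generate text.
# Purely illustrative -- the prompt and generation settings are arbitrary.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Open source software matters because", max_new_tokens=40)
print(result[0]["generated_text"])
```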

You can see all this in OpenAI's GPT-2, ChatGPT's direct predecessor. While it made nowhere near the same headlines, GPT-2's impressive capabilities came from combining advances in deep learning, unsupervised learning, and the transformer architecture. The open source community played an essential role in developing, testing, and improving GPT-2.

How could OpenAI do this? Easily. The open source licenses on the projects above are permissive: TensorFlow is under Apache 2.0, PyTorch under a modified BSD license, and Hugging Face's Transformers supports models under a variety of open source licenses, including the BSDs. In other words, OpenAI is legally in the clear.

So, like many other programs, ChatGPT owes a great debt to open source, but it's not open source, nor is it ever likely to be. There you have it. Another depressing open source tale. ®

* We've updated the piece with a link to details of the bug that allowed some users to see titles from another active user's chat history.
