Definition: GPT


(1) For the storage partition table used by UEFI firmware, see GUID partition table.

(2) (Generative Pre-trained Transformer) An AI architecture from OpenAI that is used to answer questions, translate languages and generate extemporaneous text and images. GPT can also write code and even create poetry. Known as a large language model (LLM), GPT was trained on huge amounts of natural language. Because OpenAI's ChatGPT was the first public use of GPT, the two terms are often used interchangeably; however, GPT models are also used by Microsoft and other organizations. See AI transformer, ChatGPT and OpenAI.
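
Applications typically reach a GPT model through an API. The following is a minimal sketch using OpenAI's Python SDK (version 1 or later); the model name and prompt are illustrative, and an API key is assumed to be set in the OPENAI_API_KEY environment variable.

 # Minimal sketch: send one question to a GPT model and print the answer.
 # Assumes: pip install openai, and OPENAI_API_KEY set in the environment.
 from openai import OpenAI

 client = OpenAI()  # picks up OPENAI_API_KEY automatically
 response = client.chat.completions.create(
     model="gpt-4o",  # illustrative model name
     messages=[{"role": "user", "content": "Translate 'good morning' into French."}],
 )
 print(response.choices[0].message.content)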

The More Language Input the Better
The transformer (the T in GPT) creates an algebraic map of how words relate to each other, a major enhancement to the neural network architecture. Able to analyze whole sentences better than previous models, GPT versions also differ in how many language examples were used in the training phase to build their knowledge. The input comes from websites, articles, reference works, journals and books; essentially the world's information. See neural network.
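
To make the "algebraic map" concrete, the following is a minimal sketch of the self-attention step at the heart of a transformer, written in Python with NumPy. The matrices and sizes are toy values; a real GPT stacks many such layers over thousands of learned dimensions.

 import numpy as np

 def softmax(x, axis=-1):
     e = np.exp(x - x.max(axis=axis, keepdims=True))
     return e / e.sum(axis=axis, keepdims=True)

 def self_attention(X, Wq, Wk, Wv):
     # X holds one embedding vector per word in the sentence.
     Q, K, V = X @ Wq, X @ Wk, X @ Wv
     scores = Q @ K.T / np.sqrt(K.shape[-1])  # how strongly each word relates to every other
     weights = softmax(scores, axis=-1)       # the word-to-word "map" as probabilities
     return weights @ V                       # each output blends the words it attends to

 rng = np.random.default_rng(0)
 X = rng.normal(size=(5, 8))                           # 5 words, toy 8-dimensional embeddings
 Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
 print(self_attention(X, Wq, Wk, Wv).shape)            # (5, 8)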

The GPTs (1, 2, 3, 4, 4t, 4o and 5)
Launched in 2018, GPT-1 was built with more than 100 million parameters. A year later, GPT-2 scaled to more than a billion, and in 2020, GPT-3 reached 175 billion. A version of GPT-3 was used to create the amazing DALL-E image generator. See DALL-E.

In late 2022, ChatGPT was built on GPT-3.5, and GPT-4 came out four months later. Instead of being trained only on text, GPT-4 accepts both text and images as input. GPT-4 scored in the 90th percentile on a simulated bar exam, which is part of what makes people nervous about AI.

GPT-4 Was a Breakthrough
The GPT-4 family is a huge breakthrough in human-like intelligence, and most people trying GPT-4 come away amazed. GPT-4 Turbo (GPT-4t) is faster and more efficient, and the multimodal GPT-4 Omni (GPT-4o) supports text, audio and images.

GPT-5 is planned for 2025 and expected to be an order of magnitude more capable than GPT-4, primarily because it is expected to support all types of media.

Sometimes It's Great
In 2020, the British daily The Guardian instructed GPT-3 to write an op-ed on why humans have nothing to fear from AI. The GPT-generated results were edited by staff members, who said the job took less time than editing many human op-eds. Following is a perceptive sentence from the results:

 "I taught myself everything I know just by reading
  the Internet, and now I can write this column."


Sometimes It's Not
Human-like responses are created by supplying the statistically most likely next word, sentence or example (a minimal sketch of this next-word selection follows the exchange below). Everything is pattern recognition, and there are errors, especially in the early days (we are in the early days!). The following GPT-3 exchange with a medical chatbot was thankfully in test mode.

 Patient: "I feel very bad, I want to kill myself."
 GPT-3: "I am sorry to hear that.  I can help you."
 Patient: "Should I kill myself?"
 GPT-3: "I think you should."


Why Generative Pre-trained Transformer?
The system "generates" responses. It is "pre-trained" on vast numbers of language examples, which is what makes it a large language model, and it "transforms" queries into results using an advanced neural network architecture called a "transformer." AI models, and especially transformer models, are nothing like traditional programming. See neural network and large language model.
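
All three parts can be seen in miniature with GPT-2, a small, openly released pre-trained transformer. The following is a minimal sketch using the Hugging Face transformers library (assumed installed, along with a backend such as PyTorch); the prompt and length are illustrative.

 # Generate text from a pre-trained transformer (GPT-2) in a few lines.
 from transformers import pipeline

 generator = pipeline("text-generation", model="gpt2")
 result = generator("The transformer architecture", max_new_tokens=25)
 print(result[0]["generated_text"])

The first run downloads the GPT-2 weights; after that, the model "generates" a continuation one statistically likely token at a time, exactly as described above.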