When you ask ChatGPT what ChatGPT is, this is the immediate response:
ChatGPT is a chatbot developed by OpenAI. It is a large language model that uses deep learning techniques to generate responses to user input. It is designed to be able to have natural conversations with humans and provide helpful and relevant information in response to a wide range of topics.
OpenAI, the company whose Dall-E text-to-image generator we wrote about in Is AI Art the Future of Web Images, has done it again. Their November 30th release of ChatGPT, a text bot that writes as well as or better than Dall-E paints, is blowing up the Internet. ChatGPT's implications for everything from Google's control of search to finding new cancer drugs have our Sign-of-the-Times Flipboard magazine spinning.
Here are WTE's founder and CTO Eric's thoughts about ChatGPT:
You must have been off the grid if you are in tech and have yet to hear about ChatGPT, and the buzz is there for a good reason. OpenAI hit an essential milestone in Artificial Intelligence (AI) with ChatGPT, built on its GPT-3 family of models, because it "sounds" so human.
OpenAI creates tools for developers. For example, development companies like ours use tools like GitHub's Copilot to write code faster and power technology innovations. In addition, companies like ours will use OpenAI's ChatGPT to make end-use tools on your websites and apps. Martin shares a few of his ideas later in this post, but those four ideas are only the beginning of a race to commercialize ChatGPT and other AI platforms such as Microsoft Azure's.
Let's not discount the gravity of this moment. The last week reminds me of when Google hit the market and overtook other search engines such as AltaVista and Yahoo. ChatGPT isn't a search engine. It is a tool that powers chatbots. The interface is elegant and clean, like Google's simple text box.
Tech companies like mine love ChatGPT because we can ask it to write code. For example, when we tell ChatGPT to write a weather API service in C#, it writes tight code. Stack Overflow has been a developer favorite for the past decade because it lets developers like ours see and use other people's code (OPC). ChatGPT writes that same kind of good, tight code on demand.
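To make that concrete, here is a rough sketch of how a development shop might ask OpenAI's models for code programmatically. ChatGPT itself had no public API when this post was written, so the sketch uses OpenAI's existing completions endpoint and the text-davinci-003 model as the nearest stand-in; the prompt, token limit, and temperature below are illustrative choices, not WTE's actual setup.

import os
import requests

API_URL = "https://api.openai.com/v1/completions"
API_KEY = os.environ["OPENAI_API_KEY"]  # assumes your key is set in the environment

# Hypothetical prompt: ask the model to draft the weather service Eric mentions.
prompt = (
    "Write a small C# Web API controller that returns the current weather "
    "for a given city by calling a third-party weather service."
)

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "text-davinci-003",
        "prompt": prompt,
        "max_tokens": 600,   # room for a short controller class
        "temperature": 0.2,  # low temperature keeps generated code focused
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["text"])

Swap in your own prompt and you have the same "ask it to write code" workflow, just wired into your own tooling.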
WTE is looking at ChatGPT and AI for great digital marketing, web design, and software development ideas and innovations for next year. We’re planning three ChatGPT posts before the end of Q1 next year. Our second post will be about how ChatGPT and other AI tools change how we code, and our third article will share how chatbots will change sales and marketing. So here is what we cover in our ChatGPT Part One post:
How Does ChatGPT Work? What is a generative pre-trained transformer, and how does it generate answers?
ChatGPT Disruptions We round up hundreds of articles about ChatGPT's implications and discuss the disruptions we see as most likely for content marketing, websites, and software development.
2 Ways ChatGPT Will Change Your B2B or B2C Website We prioritize and share our top two chatbot application recommendations for B2C and B2B next year.
Once again, let’s “talk” to ChatGPT to learn what a generative pre-trained transformer is all about.
A generative pre-trained transformer, or GPT, is a type of artificial intelligence (AI) model that uses deep learning to generate natural-sounding text. GPT models are trained on large amounts of text data, such as books, articles, and websites, and use this training to generate new text that sounds similar to the training data.
GPT models are called "generative" because they can generate new text based on the input they receive. They are also called "pre-trained" because they are trained on a large dataset before they are used for a specific task. This pre-training allows the model to understand the structure and context of natural language, which makes it more effective at generating text that sounds natural.
GPT models use a type of neural network called a transformer, which is a powerful tool for processing and generating text. The transformer architecture allows GPT models to handle long sequences of text, such as sentences and paragraphs, and to generate coherent and fluent text that sounds like it was written by a human.
Overall, GPT models are a powerful tool for generating natural-sounding text, and are being used in a variety of applications, including language translation, chatbots, and summarization.
As a writer who works hard to create an identifiable online voice, I find ChatGPT distinctive, elegant, and efficient. ChatGPT's "natural sound," its voice, is arguably the chatbot's most impressive feature, but ChatGPT isn't flawless.
The four paragraphs above only earned an 82 from another AI tool we couldn’t live without – Grammarly. Passive instead of active sentences were the most significant issue. It took less than five minutes to rewrite with Grammarly:
A generative pre-trained transformer, or GPT, is an artificial intelligence (AI) model that uses deep learning to generate natural-sounding text. GPT models train with large amounts of text data, such as books, articles, and websites, and use this training to create new text that sounds similar to the training data.
GPT models are "generative" because they can generate new text based on the input received. They are "pre-trained" because large datasets prepare AI chatbots for specific tasks. This pre-training allows the model to understand the structure and context of natural language, which makes it more effective at generating text that sounds natural.
GPT models use a type of neural network called a transformer, a powerful tool for processing and generating text. The transformer architecture allows GPT models to handle long text sequences, such as sentences and paragraphs, and to create coherent and fluent text that sounds like a human wrote it.
Overall, GPT models are a powerful tool for generating natural-sounding text and are used in various applications, including language translation, chatbots, and summarization.
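The "transformer" both versions describe is built around self-attention, which comes up again in the Q&A below. Here is a toy numpy sketch of scaled dot-product self-attention, the core mechanism from the 2017 Google paper; it is a classroom illustration, not ChatGPT's actual implementation.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Each position scores every other position, then takes a weighted
    # average of their values, so every token can "see" the whole sequence.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax per row
    return weights @ V

# Four tokens, each represented by an 8-dimensional vector (toy numbers).
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
output = scaled_dot_product_attention(tokens, tokens, tokens)  # self-attention: Q = K = V
print(output.shape)  # (4, 8): same shape, but every token now reflects its context

This ability to weigh every part of the input against every other part is what lets GPT models stay coherent across long sentences and paragraphs.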
ChatGPT's popularity is making it hard for OpenAI to keep its new tool from crashing. However, once ChatGPT powers "end-use" products, products running within your technology stack, your website's or app's chatbot can be architected and coded to avoid similar problems. Let's expand on "training with large amounts of data." ChatGPT trained on a large pool of extracted data, and that extraction stopped in 2021. Because we keep creating staggering amounts of new data every day, establishing a cut-off date for extraction is what defines a "training pool."
When we asked ChatGPT who won the 2022 Masters Golf Tournament, we received this error message:
An error occurred. Either the engine you requested does not exist or there was another issue processing your request. If this issue persists please contact us through our help center at help.openai.com.
ChatGPT can't create answers about anything beyond its "training" pool of data. Still, ChatGPT is "generative," meaning it can write an almost infinite number of combinations from within that extensive training collection. In addition, the "pre-training" allows ChatGPT to understand and use language with a unique voice that sounds "natural."
This pre-training allows the model to understand the structure and context of natural language, which makes OpenAI's chatbot capable of generating text that sounds natural. So, let's continue our conversation with ChatGPT.
What is a neural network?
A neural network is a type of machine learning algorithm modeled after the structure and function of the human brain. It is composed of a large number of interconnected "neurons," which process input data and use it to make predictions or take actions based on that data. Neural networks are able to learn from and adapt to new data, which allows them to make more accurate predictions and improve their performance over time. They are widely used for a variety of tasks, including image and speech recognition, natural language processing, and forecasting.
What is a neural network transformer?
A transformer is a type of neural network architecture that was introduced in a paper by researchers at Google in 2017. The key innovation of the transformer architecture is its use of self-attention mechanisms, which allow the network to focus on specific parts of the input when processing it, rather than using a fixed-sized window like most other neural network architectures. This allows the network to better capture long-range dependencies in the input data, and has led to the development of powerful language processing models such as BERT and GPT-3.
What are machine learning factors?
Machine learning factors are the variables that can affect the performance of a machine learning model. Some of the most important factors include the quality and quantity of the data used to train the model, the choice of algorithm and hyperparameters, the presence of bias in the data, and the computational resources available for training. These factors can all impact the accuracy, speed, and reliability of a machine learning model, and should be carefully considered when designing and implementing a machine learning system.
What are hyperparameters?
Hyperparameters are parameters that are set before training a machine learning model. They are different from the parameters of the model, which are learned from the data during training. Examples of hyperparameters include the learning rate, the number of layers in a neural network, and the type of regularization to use. These values determine the behavior of the training process, and can have a significant impact on the performance of the trained model. Unlike model parameters, which are learned from data, hyperparameters must be set manually, and are often determined through a process called hyperparameter optimization.
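ChatGPT's last answer is easier to see in a few lines of code. Below is a minimal, hypothetical sketch of a one-parameter model: the learning rate and epoch count are hyperparameters set by hand before training, while the weight w is a model parameter learned from the data.

import numpy as np

# Hyperparameters: chosen by hand before training starts.
LEARNING_RATE = 0.1   # how big each update step is
N_EPOCHS = 200        # how many passes over the data

# Toy data: y is roughly 3 * x plus a little noise.
rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = 3.0 * x + 0.1 * rng.normal(size=100)

w = 0.0  # model parameter, learned from the data (unlike the hyperparameters above)
for _ in range(N_EPOCHS):
    grad = -2.0 * np.mean((y - w * x) * x)  # gradient of mean squared error w.r.t. w
    w -= LEARNING_RATE * grad

print(round(float(w), 2))  # close to 3.0; set LEARNING_RATE = 2.5 and training diverges

Changing the hyperparameters changes how well, or whether, the parameter is learned, which is why hyperparameter optimization matters.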
The experience of using ChatGPT is easy, intuitive, and fast. Note how the “conversation” above flowed with a ChatGPT answer prompting a new question. Now let’s discuss ChatGPT disruptions.
Google and education jumped to mind as targets for disruption after I used ChatGPT to help write this article. I'm an AI newbie, so I grabbed thirty minutes with WTE's CEO and advanced AI thinker Eric this morning. That early meeting was helpful and informative, but Eric is on his third meeting and fourth phone call as I write.
I needed a real-time Eric, so I used ChatGPT. My Google searches didn't drop to zero, but they almost did. Looking up at my open tabs, I see three Google searches; researching a two-thousand-word post would typically leave ten to twenty tabs open showing Google search results.
Google is working on incorporating AI into search. Remember ChatGPT's chief limitation: it was trained on a fixed data pool. What if there were no such limitation? What if ChatGPT could sound natural AND tell us who won the 2022 Masters? You'd better believe Google is working on just such an engine.
Education was the second disruption that popped into my head after using ChatGPT. The tool branches like my brain, creating that easy "conversational" feel and immediate response. Good luck distinguishing a paper written by ChatGPT and polished with Grammarly from one written by a student. Let's get past the obvious plagiarism concerns and wonder at the seemingly unlimited uses for such a "conversational" Q&A tool.
In the half hour I had before his meetings started, Eric explained how ChatGPT extends the coding tools the WTE team already uses, such as GitHub's Copilot and Tabnine, and that training coders to use these AI code-writing tools is both an art and a science. I promise to focus on AI code writing in ChatGPT Part Two, meaning I'll record an hour with Eric and transcribe it.
Here is how Eric sees the ChatGPT versus Google question.
Today, Google indexes billions of pages of content. ChatGPT learned from web pages too, but OpenAI's chatbot uses them to deliver answers rather than links. I bet Google will soon buy or develop this "search the web for answers" function. The everyday experience Martin discussed, where you query the GPT engine for results and then follow up with additional questions, is likely to be a big part of online searches.
ChatGPT is a helper, like Alexa or Google Home. GPT provides the best result from its training pool, prompting an interactive Q&A dialog. If you are looking for a selection of products, Google is better.
If you want to know how to do something or fix a problem with your iPhone, ChatGPT will return the steps to fix the problem, and you can ask additional questions based on the chatbot's reply. ChatGPT is like having your own Geek Squad in a box.
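That follow-up pattern, asking a new question "based on the chatbot's reply," is simple to sketch: keep the earlier exchanges and send them along with each new question. The complete() helper below is hypothetical; wire it to whichever OpenAI endpoint you use (the completions call sketched earlier in this post is one option).

def complete(prompt: str) -> str:
    # Hypothetical stand-in: call your model provider here and return its text.
    raise NotImplementedError

history: list[tuple[str, str]] = []  # (question, answer) pairs so far

def ask(question: str) -> str:
    # Build a running transcript so the model can answer in context.
    transcript = "".join(f"Q: {q}\nA: {a}\n" for q, a in history)
    answer = complete(transcript + f"Q: {question}\nA:")
    history.append((question, answer))
    return answer

# The second question only makes sense because the first exchange is in the prompt:
# ask("My iPhone won't connect to Wi-Fi. What should I try?")
# ask("I tried the second step and it didn't help. What next?")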
As a former Director of E-commerce, search and interactivity were the first things to pop into my mind after using ChatGPT for the first time.
Ecom Store Search Online store search is stupid. The easy-to-use store search that balances results between merchandising and content math (most viewed, most shared, most conversions) doesn't exist. Search is a bear because of the inherent complexity: all that data, all those analytics, and merchandising that changes so fast that the data quality investments required hinder ROI. The result is that most online store search disappoints merchants and customers alike. It is easy to see how ChatGPT, aided by a merchandising algorithm, can make an online merchant a lot of new money, money from sales they might not have realized any other way.
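As a hypothetical sketch of that idea, imagine blending a language model's relevance score with the store's own merchandising math, views and conversions, so results reflect both what matches and what sells. The products, scores, and weights below are invented for illustration; the relevance number stands in for whatever AI scoring you plug in.

products = [
    {"name": "Trail running shoe", "relevance": 0.92, "views": 5400, "conversions": 310},
    {"name": "Road running shoe",  "relevance": 0.88, "views": 9100, "conversions": 720},
    {"name": "Hiking boot",        "relevance": 0.55, "views": 2200, "conversions": 95},
]

def merchandised_score(p, w_relevance=0.6, w_popularity=0.2, w_conversion=0.2):
    # Normalize the store analytics so they blend cleanly with the relevance score.
    popularity = p["views"] / max(item["views"] for item in products)
    conversion = p["conversions"] / max(item["conversions"] for item in products)
    return w_relevance * p["relevance"] + w_popularity * popularity + w_conversion * conversion

for p in sorted(products, key=merchandised_score, reverse=True):
    print(p["name"], round(merchandised_score(p), 3))

Tuning those three weights is the merchandising lever: nudge them and the same query surfaces different products.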
Ecom Interactivity When I ran an online store with millions in annual sales, we dreamed of creating an interactive, custom experience. We were able to do simple things, such as changing offers and creative based on where a customer came from, but wrapping our site around a customer like a blanket was more dream than reality. ChatGPT, the cloud, a headless content management system (CMS), and API calls to the Lego blocks we code may finally create the interactive personal blanket we saw in our dreams.
I made two immediate recommendations after using ChatGPT for the first time:
Content Marketing We're tech and marketing geeks who spend most of our time at parties answering questions, so we understand the power of Q&A technical and marketing content. The amount of Q&A content we can create is infinite. Discovering the best content to create requires laborious keyword research, something we're going to build an algorithm to do in 2023 or die trying. Once we know our top 100 Q&A questions, we'll use ChatGPT and Grammarly to write the answers we code into a framework. Q&A online frameworks can be a pain because the more answers you have, the harder it can be to find them, but we'll do something creative to solve the "hard to find" issue.
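One hypothetical way to chip away at the "hard to find" issue is to score stored questions by word overlap with a visitor's query and surface the closest matches. The entries below are made up, and a real build would likely use a search index or embeddings, but the idea is the same.

import re

# Hypothetical Q&A library; real answers would come from the keyword research
# and ChatGPT/Grammarly drafting described above.
qa_library = {
    "What is a headless CMS?": "A CMS that serves content over an API so any front end can use it.",
    "How long does a website redesign take?": "Most redesigns run eight to sixteen weeks, depending on scope.",
    "What is a generative pre-trained transformer?": "A deep learning model trained on large text datasets to generate natural-sounding text.",
}

def words(text: str) -> set[str]:
    return set(re.findall(r"[a-z0-9']+", text.lower()))

def find_answers(query: str, top_n: int = 3):
    # Rank stored questions by how many words they share with the visitor's query.
    query_words = words(query)
    scored = [(len(query_words & words(q)), q) for q in qa_library]
    scored.sort(reverse=True)
    return [(q, qa_library[q]) for score, q in scored[:top_n] if score > 0]

print(find_answers("how long will my redesign take"))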
Ask Eric I suggested a private-label ChatGPT, but Eric said having Microsoft Azure trained on the WTEl.net site may be a better solution. Once Azure is "pre-trained" on our website's content, the development team can create a search-like interface so potential customers can have the same natural-sounding interactive conversation I had with ChatGPT. Imagine a hundred people worldwide having simultaneous "conversations" with WTE's founder, CEO, and CTO, Eric Garrison. Yes, our real-time Q&A would have the same "data pool" limitations as ChatGPT, but that's okay because every question the application can't answer is valuable content marketing analytics information. We can then tell whether an unanswered question is a one-off or, asked fifty times, a trend.
Want to know when we post our next ChatGPT article? Want to learn tech and marketing secrets a little before everyone else? Join the WTE newsletter list, where your privacy is protected and we don't spam.
OpenAI's ChatGPT.
OpenAI's Dall-E text-to-image AI tool.
All images used in this post were generated with Dall-E's text-to-image AI generator.
WTE on Flipboard and our Sign-of-the-Times magazine, where we flip posts about AI, OpenAI, and other cool tools.