Using OpenAI's ChatGPT API to Build a Conversational AI Chatbot

OpenAI's ChatGPT API lets developers quickly create conversational AI applications. With it, developers can build chatbots, virtual assistants, and other applications that engage with users in a natural, human-like manner.

Chat vs Completions

ChatGPT's gpt-3.5-turbo model provides an affordable alternative to text-davinci-003, while maintaining a similar level of performance. This makes it the preferred choice for most use cases, as it offers significant cost savings without sacrificing quality.

Transitioning to gpt-3.5-turbo is often a straightforward process for developers, requiring only minor adjustments to prompts and retesting. For instance, if you were using a completions prompt to translate English to French, you could easily switch to the gpt-3.5-turbo model by rewriting your prompt accordingly.
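To make the migration concrete, here is a minimal sketch of how a Completions-style translation prompt maps onto the chat format. The exact wording of the prompts is illustrative, not taken from OpenAI's documentation:

```python
# Completions-style prompt (text-davinci-003): instruction and input
# live together in one string.
completion_prompt = "Translate the following English text to French:\n\nHello, how are you?"

# Equivalent chat-style request (gpt-3.5-turbo): the instruction moves into
# a system message and the input becomes a user message.
chat_messages = [
    {"role": "system", "content": "You translate English text to French."},
    {"role": "user", "content": "Hello, how are you?"},
]
```

The model, prompts, and conversation structure are otherwise interchangeable; only the request shape changes.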


To get started with ChatGPT, you'll need to install the OpenAI Python package. You can do this by running the following command:

pip install openai -q

Once you've installed the OpenAI Python package, you need to set your OpenAI API key. You can do this by running the following code:

import openai
openai.api_key = "YOUR_OPENAI_API_KEY"
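Hardcoding the key in source code is risky if the file is ever shared or committed. A common alternative (a convention, not a requirement of the library) is to read it from an environment variable:

```python
import os

# OPENAI_API_KEY is the conventional variable name used by OpenAI's tooling.
# Falling back to an empty string keeps the lookup from raising if it is unset.
api_key = os.environ.get("OPENAI_API_KEY", "")

# openai.api_key = api_key  # then assign it exactly as before
```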

Building a Conversational AI Chatbot

Next, you'll need to define your conversation. Conversations are defined as an array of message objects, where each object has a role (either "system", "user", or "assistant") and content (the content of the message). Conversations can be as short as one message or as long as you like. Typically, a conversation is formatted with a system message first, followed by alternating user and assistant messages.

Here's an example conversation:

messages = [
    {"role": "system", "content": "You are a helpful AI Tutor."},
    {"role": "user", "content": "I am Pradip, I want to learn AI"},
    {"role": "assistant", "content": "Hello, Pradip. That's awesome! What do you want to know about AI?"},
    {"role": "user", "content": "What is NLP?"}
]

To get a response from the ChatGPT API, you can use the openai.ChatCompletion.create() method. It takes two required arguments: the name of the model to use (model) and the conversation you defined earlier (messages).

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful AI Tutor."},
        {"role": "user", "content": "I am Pradip, I want to learn AI"},
        {"role": "assistant", "content": "Hello, Pradip. That's awesome! What do you want to know about AI?"},
        {"role": "user", "content": "What is NLP?"}
    ]
)

The openai.ChatCompletion.create() method returns a response object that contains the AI's response. You can get the AI's response by accessing the choices list and then the message object. Here's an example:

{
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "message": {
        "content": "NLP stands for Natural Language Processing. It is a branch of AI that focuses on enabling machines to understand, interpret, and generate human language. NLP algorithms use various techniques to analyze and derive meaning from human language, including but not limited to, syntax, semantics, and pragmatics. Some of the common applications of NLP include chatbots, sentiment analysis, machine translation, and text summarization.",
        "role": "assistant"
      }
    }
  ]
}
With these basic steps, you can build a simple conversational AI application. Here's an example of a simple console-based chatbot that you can build using ChatGPT:

def update_chat(messages, role, content):
  messages.append({"role": role, "content": content})
  return messages

The update_chat() function is a helper function that takes in three arguments: messages, role, and content. It then appends a new message to the messages list, which represents the conversation history. The new message is created as a dictionary with two keys: role and content. The role key represents the role of the message sender, which can be either "user", "assistant", or "system". The content key represents the content of the message, which can be any text string.

After adding the new message to the messages list, the function returns the updated messages list. This function is useful in building a conversation history that can be used as input to the ChatGPT API for generating responses. The conversation history allows the API to provide more context-aware responses, which can lead to a more natural and engaging conversation experience.
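A quick sketch of how update_chat() accumulates history (the function is repeated here so the example is self-contained; the message contents are illustrative):

```python
def update_chat(messages, role, content):
  messages.append({"role": role, "content": content})
  return messages

# Build up a short conversation, one message at a time.
history = [{"role": "system", "content": "You are a helpful AI Tutor."}]
history = update_chat(history, "user", "What is NLP?")
history = update_chat(history, "assistant", "NLP stands for Natural Language Processing.")
# history now holds three messages, in order: system, user, assistant.
```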

def get_chatgpt_response(messages):
  response = openai.ChatCompletion.create(
      model="gpt-3.5-turbo",
      messages=messages
  )
  return response['choices'][0]['message']['content']

The get_chatgpt_response() function takes a list of messages as input and returns the generated response from the gpt-3.5-turbo model.

The function first calls the openai.ChatCompletion.create() method to generate a response. This method takes the name of the model to use (here, gpt-3.5-turbo) and the conversation history, represented as a list of messages.

After the response is generated, the function extracts the generated text from the choices attribute of the response. The choices attribute is a list of objects representing possible completions of the prompt, and the [0] index selects the first (and, by default, only) completion.

Overall, this function provides a convenient way to generate responses from the model given a conversation history.

import pprint

messages = [
    {"role": "system", "content": "You are a helpful AI Tutor."},
    {"role": "user", "content": "I am Pradip, I want to learn AI"},
    {"role": "assistant", "content": "Hello, Pradip. That's awesome! What do you want to know about AI?"},
]

while True:
  pprint.pprint(messages)
  user_input = input()
  messages = update_chat(messages, "user", user_input)
  model_response = get_chatgpt_response(messages)
  messages = update_chat(messages, "assistant", model_response)

This code sets up an interactive chat between a user and an AI assistant using the gpt-3.5-turbo model. It begins by defining a list called messages which contains three messages representing the start of the conversation.

The while loop then runs indefinitely, repeatedly displaying the current state of the conversation using pprint.pprint(messages) and prompting the user for input using user_input = input().

The user's input is then added to the messages list using the update_chat function with the role set to "user". The assistant then generates a response to the current conversation using the get_chatgpt_response function, which sends the current messages list to the model.

The response generated by the model is then added to the messages list with the role set to "assistant". The loop then repeats, displaying the updated conversation state and waiting for the user to input another message. This process continues until the loop is manually interrupted or stopped by the program.
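One caveat: the messages list grows without bound in this loop, and gpt-3.5-turbo has a finite context window. A common workaround is to drop the oldest non-system messages before each request. The sketch below uses message count as a crude stand-in for real token counting (a production version would count tokens, e.g. with a tokenizer library), and the function name and limit are illustrative:

```python
def trim_history(messages, max_messages=20):
  """Keep the system message(s) plus the most recent exchanges.

  Uses message count as a rough proxy for tokens; adjust max_messages
  to suit your prompts and model context window.
  """
  system = [m for m in messages if m["role"] == "system"]
  rest = [m for m in messages if m["role"] != "system"]
  # Keep only the newest messages that fit under the limit.
  return system + rest[-(max_messages - len(system)):]
```

Calling `messages = trim_history(messages)` just before get_chatgpt_response() keeps requests bounded while preserving the system instruction.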

Final Conversation

[{'content': 'You are a helpful AI Tutor.', 'role': 'system'},
 {'content': 'I am Pradip, I want to learn AI', 'role': 'user'},
 {'content': "Hello, Pradip. That's awesome! What do you want to know about "
             'AI?',
  'role': 'assistant'},
 {'content': 'What is NLP?', 'role': 'user'},
 {'content': 'NLP stands for Natural Language Processing. It is a branch of '
             'Artificial Intelligence that is focused on enabling machines to '
             'understand, interpret, and generate human language. This '
             'technology can be used to analyze large amounts of text data, '
             'understand sentiment, language translation, and more. NLP is a '
             'rapidly growing field that has many applications in industries '
             'such as customer service, marketing, healthcare, and finance.',
  'role': 'assistant'},
 {'content': 'can you tell me three use cases of NLP?', 'role': 'user'},
 {'content': 'Sure! Here are three use cases of NLP:\n'
             '1. Sentiment Analysis: NLP can be used to analyze large amounts '
             'of text data, such as social media posts or customer reviews, to '
             'determine the sentiment behind them. This can help companies '
             'understand how customers feel about their brand, products, or '
             'services, and can inform decision-making around marketing, '
             'customer service, and product development.\n'
             '2. Language Translation: NLP can be used to translate text from '
             'one language to another. This is a useful application in many '
             'contexts, such as global business and cross-cultural '
             'communication.\n'
             '3. Chatbots: NLP can also be used to develop chatbots that can '
             'interact with customers in a natural, conversational way. '
             'Chatbots can be used for customer service, sales, and other '
             'applications, and can be integrated into websites, messaging '
             'apps, and other platforms.',
  'role': 'assistant'},
 {'content': 'Can you repeat my name?', 'role': 'user'},
 {'content': 'Yes, your name is Pradip.', 'role': 'assistant'},
 {'content': 'can you talk more about the second usecase?', 'role': 'user'},
 {'content': 'Sure! Language translation is an important use case of NLP. With '
             'the help of machine translation systems, applications can now '
             'translate text from one language to another. This use can be '
             'helpful for people who need to translate documents, websites, or '
             'other types of content from one language to another. \n'
             'NLP-powered machine translation engines learn to use a '
             'combination of statistical models, neural networks, and '
             'rule-based systems to automatically translate text from one '
             'language to another. These models learn from vast amounts of '
             'text data in multiple languages and can handle complex language '
             'structures and nuances. \n'
             'One challenge with machine translation is maintaining accuracy '
             'and context during translation. The quality of translation '
             'depends on several factors, such as the language pair being '
             'translated and the domain of the content being translated. '
             'Despite these challenges, machine translation has made great '
             'strides in recent years, and continues to be a promising field '
             'in NLP.',
  'role': 'assistant'}]

Fine-tuning with gpt-3.5-turbo

While gpt-3.5-turbo offers a cost-effective and efficient alternative to text-davinci-003 for most use cases, it's important to note that fine-tuning is currently not available for this model. As of March 1, 2023, developers can only fine-tune base GPT-3 models.

If you're unfamiliar with fine-tuning, it's a process of further training a pre-trained language model on a specific task or domain to improve its performance on that task. With fine-tuning, developers can create custom models that are tailored to their specific use case.

However, even without the option to fine-tune, gpt-3.5-turbo still provides powerful and cost-effective language generation capabilities. For more information on fine-tuning GPT-3 models, check out the fine-tuning guide provided by OpenAI.
