Streamlit Updating Prompt After Each Input

Summary

Hi, I am new to Streamlit. I am looking to build a frontend interface for speaking with a chatbot. However, I need the chatbot to remember previous responses in order to complete tasks. I am running into an issue where every time I enter a new input in the text_input function, the new prompt doesn't include previous responses, and I'm not sure what I'm doing wrong. If someone has any advice, that would be much appreciated!

Steps to reproduce

Code snippet:

import openai
import streamlit as st
from streamlit_chat import message

openai.api_key = "xxxxxxxxxxxxxxxx"

# Use the OpenAI API to generate a response

start_sequence = "\nYou: "
restart_sequence = "\nFriend: "

prompt = "You: What have you been up to?
\nFriend: Watching old movies. 
\nYou: Did you watch anything interesting? 
\nFriend: Jaws"

title = ("Test")
st.title(title)

# Storing the chat
if 'generated' not in st.session_state:
    st.session_state['generated'] = []

if 'past' not in st.session_state:
    st.session_state['past'] = []

def gpt3(prompt):
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        temperature=0,
        max_tokens=256,
        top_p=1,
        frequency_penalty=0,
        presence_penalty=0,
        stop=["\n"]
    )
    # Get the response text
    response_text = response["choices"][0]["text"]
    # Return the response text
    return response_text

def chatbot_response(prompt, user_input):
    prompt = prompt + start_sequence + user_input + restart_sequence
    chat_response = gpt3(prompt)
    prompt = prompt + chat_response
    return chat_response, prompt

def get_text():
    input_text = st.text_input("Enter your message to speak to Chatbot: ","", key="text")
    return input_text

def clear_text():
    st.session_state["text"] = ""

user_input = get_text()
st.button("Clear Text", on_click=clear_text)

if user_input:
    output, prompt = chatbot_response(prompt, user_input)
    print(prompt)
    # store the output 
    st.session_state.past.append(user_input)
    st.session_state.generated.append(output)

if st.session_state['generated']:
    for i in range(len(st.session_state['generated'])-1, -1, -1):
        message(st.session_state["generated"][i], key=str(i))
        message(st.session_state['past'][i], is_user=True, key=str(i) + '_user')

Expected behavior:

The desired behavior is that the chatbot remembers the whole conversation, i.e. previous inputs as well as the most recent one.

Actual behavior:

If you run this code, only the most recent input and response are included when prompt is printed, which shows the chatbot is not remembering the entire conversation. Any help would be much appreciated, thanks!

I have a feeling this may involve session state, but I am not sure how I would go about it.

At the top you have this:

prompt = "You: What have you been up to?
\nFriend: Watching old movies. 
\nYou: Did you watch anything interesting? 
\nFriend: Jaws"

And when asking for a response you have this:

output, prompt = chatbot_response(prompt, user_input)

Unfortunately, the value of prompt passed in

chatbot_response(prompt, user_input)

is always the prompt defined at the top, because Streamlit reruns the script from top to bottom whenever a widget's state changes. See also Streamlit's main concepts. So the prompt actually sent to the GPT bot never contains the complete chat history.
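
To make the rerun behavior concrete, here is a small standalone illustration (not part of the chatbot code): a module-level variable is re-initialized on every rerun, while a value stored in st.session_state persists across reruns.

import streamlit as st

# Re-initialized on every rerun, so it can never accumulate anything.
counter = 0

# Persists across reruns for the lifetime of the browser session.
if "counter" not in st.session_state:
    st.session_state.counter = 0

if st.button("Increment"):
    counter += 1                   # reset to 0 again on the next rerun
    st.session_state.counter += 1  # keeps its value between reruns

st.write("module-level counter:", counter)
st.write("session_state counter:", st.session_state.counter)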

Here is one approach: define a new session state variable that records the user and bot messages, including the initial prompt.

if 'bot_prompt' not in st.session_state:
    st.session_state.bot_prompt = []

Save the initial prompt.

...

prompt = """You: What have you been up to?
\nFriend: Watching old movies. 
\nYou: Did you watch anything interesting? 
\nFriend: Jaws"""

# Save init prompt in the bot_prompt.
if len(st.session_state.bot_prompt) == 0:
    pr: list = prompt.split('\n')
    pr = [p for p in pr if len(p)]  # remove empty string
    st.session_state.bot_prompt = pr
    # print(f'init: {st.session_state.bot_prompt}')

When the user submits input, append it to bot_prompt.

if user_input:
    # Add the user input to the bot_prompt before sending the prompt.
    st.session_state.bot_prompt.append(f'You: {user_input}')

    # Convert a list of prompts to a string for the GPT bot.
    input_prompt: str = '\n'.join(st.session_state.bot_prompt)
    # print(f'bot prompt input list:\n{st.session_state.bot_prompt}')
    # print(f'bot prompt input string:\n{input_prompt}')

...

Once we receive the bot response, append it to bot_prompt as well so it is included in the next prompt.

if user_input:
    ...

    st.session_state.past.append(user_input)
    st.session_state.generated.append(output)

    # Add bot response for next prompt.
    st.session_state.bot_prompt.append(f'Friend: {output}')

Full code for handling user input.

if user_input:
    # Add the user input to the bot_prompt before sending the prompt.
    st.session_state.bot_prompt.append(f'You: {user_input}')

    # Convert a list of prompts to a string for the GPT bot.
    input_prompt: str = '\n'.join(st.session_state.bot_prompt)
    # print(f'bot prompt input list:\n{st.session_state.bot_prompt}')
    # print(f'bot prompt input string:\n{input_prompt}')

    output = chatbot_response(input_prompt)
    # print(prompt)
    # store the output 
    st.session_state.past.append(user_input)
    st.session_state.generated.append(output)

    # Add bot response for next prompt.
    st.session_state.bot_prompt.append(f'Friend: {output}')

Updated chatbot_response:

def chatbot_response(prompt):
    prompt = prompt + restart_sequence
    return gpt3(prompt)

I tried it and it worked.
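
For reference, the pieces above can be combined into one script along these lines. This is just a sketch of the approach, assuming the same legacy openai.Completion API and streamlit_chat package as the original snippet.

import openai
import streamlit as st
from streamlit_chat import message

openai.api_key = "xxxxxxxxxxxxxxxx"

restart_sequence = "\nFriend: "

prompt = """You: What have you been up to?
Friend: Watching old movies.
You: Did you watch anything interesting?
Friend: Jaws"""

st.title("Test")

# Conversation shown in the UI.
if 'generated' not in st.session_state:
    st.session_state['generated'] = []
if 'past' not in st.session_state:
    st.session_state['past'] = []

# Full prompt history sent to the model, seeded with the initial prompt.
if 'bot_prompt' not in st.session_state:
    st.session_state.bot_prompt = [p for p in prompt.split('\n') if p]

def gpt3(prompt):
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        temperature=0,
        max_tokens=256,
        top_p=1,
        frequency_penalty=0,
        presence_penalty=0,
        stop=["\n"]
    )
    return response["choices"][0]["text"]

def chatbot_response(prompt):
    return gpt3(prompt + restart_sequence)

def clear_text():
    st.session_state["text"] = ""

user_input = st.text_input("Enter your message to speak to Chatbot: ", "", key="text")
st.button("Clear Text", on_click=clear_text)

if user_input:
    # Record the user turn, then send the whole history as the prompt.
    st.session_state.bot_prompt.append(f'You: {user_input}')
    output = chatbot_response('\n'.join(st.session_state.bot_prompt))

    st.session_state.past.append(user_input)
    st.session_state.generated.append(output)

    # Record the bot turn so it is part of the next prompt.
    st.session_state.bot_prompt.append(f'Friend: {output}')

if st.session_state['generated']:
    for i in range(len(st.session_state['generated']) - 1, -1, -1):
        message(st.session_state["generated"][i], key=str(i))
        message(st.session_state['past'][i], is_user=True, key=str(i) + '_user')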

