How to keep previous image in conversation history

Summary

I am using a Streamlit app to build a CSV chatbot with the LangChain CSV agent and the ChatGPT API. The chatbot can respond with text, tables, and plots depending on the user's question. But plots from earlier questions in the conversation history disappear when the user asks a new question. This only happens to plots/graphs; text and tables persist. I wonder if it is related to the st.session_state I used.

Steps to reproduce

Code snippet:

from langchain.agents import AgentType
from langchain.agents import create_pandas_dataframe_agent
from langchain.callbacks import StreamlitCallbackHandler
from langchain.llms.openai import OpenAI
import streamlit as st
import pandas as pd
import matplotlib.pyplot as plt
from langchain.tools.python.tool import PythonREPLTool
import os

df = pd.read_csv('https://raw.githubusercontent.com/a-mt/fcc-medical-data-visualizer/master/medical_examination.csv')

if "messages" not in st.session_state or st.sidebar.button("Clear conversation history"):
    st.session_state["messages"] = [{"role": "assistant", "content": "How can I help you?"}]

for msg in st.session_state.messages:
    st.chat_message(msg["role"]).write(msg["content"])

if prompt := st.chat_input(placeholder="What is this data about?"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    st.chat_message("user").write(prompt)

    llm = OpenAI(openai_api_key="YOUR_API_KEY")
    pandas_df_agent = create_pandas_dataframe_agent(
        llm,
        df,
        verbose=True,
        agent_type=AgentType.OPENAI_FUNCTIONS,
        handle_parsing_errors=True,
    )

    with st.chat_message("assistant"):
        st_cb = StreamlitCallbackHandler(st.container(), expand_new_thoughts=True)
        response = pandas_df_agent.run(st.session_state.messages, callbacks=[st_cb])
        st.session_state.messages.append({"role": "assistant", "content": response})
        st.write(response)
        fig = plt.gcf()
        if fig.get_axes():  # plt.gcf() always returns a Figure, so check for axes instead
            st.pyplot(fig)


Expected behavior:

All plots in the conversation history should stay visible.

Actual behavior:

Say question 1 is "plot the distribution of age"; it will show a figure of the age distribution. Then ask another question like "how many rows and columns are there". It will return the number of rows and columns, but the previous plot of the age distribution is gone.


Same need here.

Hi @joebruin ,

The issue you’re facing is likely because Streamlit reruns the script on every interaction and clears your plots, so you have to save the history somewhere yourself.
You can do the following:

  1. Create a list to store the plots and their associated messages.
  2. Check if the message is from the assistant and contains a plot (i.e., a fig is not None). If so, append the message and the associated plot to your list.
  3. When displaying the conversation history, iterate through this list and show the plots along with their corresponding messages.

You can try this code:

import streamlit as st
import pandas as pd
import matplotlib.pyplot as plt

# Keep the conversation history (including plots) in st.session_state so it
# survives Streamlit's script reruns; a plain module-level list would be reset
# on every interaction
if "conversation_history" not in st.session_state:
    st.session_state.conversation_history = []

if "messages" not in st.session_state or st.sidebar.button("Clear conversation history"):
    st.session_state["messages"] = [{"role": "assistant", "content": "How can I help you?"}]
    st.session_state.conversation_history = []

for msg in st.session_state["messages"]:
    st.chat_message(msg["role"]).write(msg["content"])

if prompt := st.chat_input(placeholder="What is this data about?"):
    st.session_state["messages"].append({"role": "user", "content": prompt})
    st.chat_message("user").write(prompt)

    # Assuming your response generates a plot
    response_message, fig = generate_response(prompt)  # Modify this part accordingly

    if fig:
        st.session_state.conversation_history.append((response_message, fig))

# Display the conversation history with plots
for msg, fig in st.session_state.conversation_history:
    st.chat_message("assistant").write(msg)
    st.pyplot(fig)

Hello @joebruin @Hugo_2020, were you able to fix the problem? Please let me know.

Yes, +1. I would love to see a better method for keeping charts in messages.

Histories (including human and AI messages) contain only strings unless the Assistants API is used. So what I do is add a simple message containing the figure information as an AI message, while maintaining the list of figure objects separately. This is not very tidy, but it works. Here is what I do:

fig = plt.gcf()
if fig and fig.get_axes():
    fig_index = len(st.session_state.fig)
    st.session_state.message_history.add_ai_message(
        f"Figure {fig_index + 1} generated by AI."
    )
    st.session_state.fig.append(fig)

...

for msg in st.session_state.message_history.messages:
    ...

    if re.match(
        r"^Figure \d+ generated by AI\.$", msg.content
    ):  # Check to see if the message points to a figure object
        fig_number = re.search(r'\bFigure (\d+)\b', msg.content).group(1)
        st.pyplot(st.session_state.fig[int(fig_number) - 1])
...

In the above, ‘fig_index’ is maintained appropriately such that the message containing the fig information is connected to the right figure.

Hope this helps. Just for your reference, my app using this method of keeping figures is located at https://langchain-llm-agent.streamlit.app/. Another app utilizing the Assistants API (eliminating the need to maintain a separate list) can be found at https://assistants.streamlit.app/. These scripts are both single long files that are very untidy. Any advice on better ways of splitting these into multiple files in a systematic way would be appreciated.

Hello. I have come up with a better way of keeping images as part of a chat message, which is to add ‘additional_kwargs’ to the chat message as follows:

(Human/AI)Message(content='hi!', additional_kwargs={'image_urls': ["https://...", "https://...", ...]})

Figure objects generated by matplotlib are converted to base64-encoded images and then placed into the list value of the ‘image_urls’ key. My code introduced above has been changed to use this method.
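As a rough sketch of this encoding step (the helper name fig_to_data_url is mine, not from the original code), a matplotlib figure can be turned into a base64 data URL like this:

```python
import base64
import io

import matplotlib
matplotlib.use("Agg")  # headless backend, suitable for scripts and servers
import matplotlib.pyplot as plt


def fig_to_data_url(fig):
    """Encode a matplotlib figure as a base64 PNG data URL."""
    buf = io.BytesIO()
    fig.savefig(buf, format="png", bbox_inches="tight")
    buf.seek(0)
    return "data:image/png;base64," + base64.b64encode(buf.read()).decode("ascii")


fig, ax = plt.subplots()
ax.plot([1, 2, 3], [1, 4, 9])
url = fig_to_data_url(fig)
# url can then go into additional_kwargs={"image_urls": [url]}
```

Since the message content stays a plain string, this keeps the figure data inside the message object itself, so no separate figure list needs to be maintained.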

Thanks.

TWY

Hello yoon.tw, thank you for the solution. Can I see your code? Thank you.

@J11 I am sorry that the code is very untidy. LangChain_llm_Agent/LangChain_llm_Agent.py at main · twy80/LangChain_llm_Agent · GitHub

You can find the information in

Lines 1052~1065
Lines 671~682
Lines 576~597
Lines 890~916.

Please let me know if you need further assistance.


Thank you yoon.tw:) Will check and get back!