Add bar chart to session state

Hi

I am using the ChatGPT-like clone example from Build a basic LLM chat app - Streamlit Docs, and I want to add a bar chart to the chatbot.

I can draw it in the chat using st.bar_chart([1, 5, 3, 2, 7]), but it is not added to the session state list, so when I write a new message the chart disappears.

How should I write a bar chart to the session state? I have tried st.session_state["messages"].append(st.bar_chart([1, 5, 3, 2, 7])), but it does not work.

Thanks

To include a bar chart in the chat interface, you'll need to add a separate field, such as 'bar_chart', to each message that should carry a chart. Then, when replaying the history, use an if statement to check whether that field exists and render the chart accordingly.

import numpy as np
import streamlit as st

# Seed the history with one assistant message that carries chart data
if 'messages' not in st.session_state:
    st.session_state.messages = [
        {'role': 'assistant', 'content': 'Hello, human!', 'bar_chart': np.random.randn(30, 3)}
    ]

# Replay the whole history on every rerun
for message in st.session_state.messages:
    if message['role'] != 'system':  # Skip system messages
        with st.chat_message(message['role']):
            st.markdown(message['content'])

            # Check if the message includes a bar chart
            if 'bar_chart' in message:
                st.bar_chart(message['bar_chart'])
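
When a new assistant reply should carry a chart, append it to the history the same way, for example (placeholder data):

st.session_state.messages.append(
    {'role': 'assistant', 'content': 'Here is your chart', 'bar_chart': [1, 5, 3, 2, 7]}
)
# The replay loop above will redraw it on every rerun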

Thanks. Actually it's an Altair chart.

I am getting this error:

streamlit.errors.StreamlitAPIException: to_dict() is not a valid Streamlit command.

I have debugged the problem and it seems to come from this line: st.altair_chart(message['altair_chart'])

for message in st.session_state["messages"]:
    with st.chat_message(message["role"]):
        if 'altair_chart' in message:
            st.altair_chart(message['altair_chart'])

This is the data in message['altair_chart']:

DeltaGenerator(_provided_cursor=LockedCursor(_parent_path=(2,), _props={'delta_type': 'arrow_vega_lite_chart', 'add_rows_metadata': None}), _parent=DeltaGenerator(_provided_cursor=RunningCursor(_parent_path=(2,), _index=1), _parent=DeltaGenerator(), _block_type='chat_message', _form_data=FormData(form_id='')))

Any idea what might be wrong?

Thanks

Apparently you are storing the wrong thing in message['altair_chart']: st.altair_chart() returns a DeltaGenerator (the handle for the rendered element), not the chart itself. Store the alt.Chart object and only call st.altair_chart() when you render it.
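
A minimal sketch of the idea (the DataFrame and column names here are just placeholders):

import altair as alt
import pandas as pd
import streamlit as st

df = pd.DataFrame({'Month': ['Jan', 'Feb', 'Mar'], 'Info': [3, 7, 2]})

chart = alt.Chart(df).mark_bar().encode(x='Month', y='Info')  # the Chart object itself
st.altair_chart(chart)  # returns a DeltaGenerator; don't store this return value

# Store the Chart object so the history loop can call st.altair_chart() on it later
st.session_state['messages'].append(
    {'role': 'assistant', 'content': 'Here is your chart', 'altair_chart': chart}
)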

Ah right, yeah, tried that too. I added to altair chart the numpy object, and now it's the stream that fails, at the for chunk in stream: line of this code:

    stream = ollama.chat(
        model=st.session_state["model"],
        messages=prompt,
        stream=True,
    )
    for chunk in stream:
        yield chunk["message"]["content"]

raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type Chart is not JSON serializable

Looks like it is trying to print the chart again?

I don’t see anything here suggesting an attempt to print a chart. And it is unclear to me how this snippet is related to an altair chart. Also I don’t understand what you mean by “I added to altair chart the numpy object”. I don’t even know which line of code triggered the error let alone what the call stack is.

You should not store any chart object in the messages list that you pass to the LLM. Before handing the list to the LLM, strip all chart data out of it:

messages=[{"role": m["role"], "content": m["content"]} for m in st.session_state.messages]

I'm not sure if this will work… This is my code right now:

def chatbot():
    if 'chart' in st.session_state["messages"][-1]["content"]:
        info = return_drawing_data(data)
        chart = alt.Chart(info).mark_bar().encode(x=alt.X('Month'), y='Info')
        st.altair_chart(chart,use_container_width=True)
        st.session_state["messages"].append(
            {'role': 'assistant', 'content': 'Here is your chart', 'altair_chart': chart})
    else:
        prompt = [
            {"role": "system", "content": f"""You are another chatbot that needs to help me"""}
        ]
        prompt += st.session_state["messages"]
        stream = ollama.chat(
            model=st.session_state["model"],
            messages=prompt,
            stream=True,
        )
        for chunk in stream:
            yield chunk["message"]["content"]


for message in st.session_state["messages"]:
    with st.chat_message(message["role"]):
        if 'altair_chart' in message:
            st.altair_chart(message['altair_chart'], use_container_width=True)
        else:
            st.markdown(message["content"])

if prompt := st.chat_input("Enter prompt here.."):
    st.session_state["messages"].append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    with st.chat_message("assistant"):
        message = st.write_stream(chatbot())
        if message:
            st.session_state["messages"].append({"role": "assistant", "content": message})

What should I do to avoid passing the chart messages to the prompt?

What error message are you getting in this code?