I can draw it in the chat using st.bar_chart([1,5,3,2,7]), but it is not added to the session-state list, so when I write a new message the graph disappears.
How should I write a bar chart to the session state? I have tried st.session_state["messages"].append(st.bar_chart([1,5,3,2,7])), but it does not work.
To include a bar chart in the chat interface, you’ll need to add a separate field, such as 'bar_chart', to each message. Then, use an if statement to check whether that field exists and render the chart accordingly.
import numpy as np
import streamlit as st

if 'messages' not in st.session_state:
    st.session_state.messages = [
        {'role': 'assistant', 'content': 'Hello, human!', 'bar_chart': np.random.randn(30, 3)}
    ]

for message in st.session_state.messages:
    if message['role'] != 'system':  # Skip system messages
        with st.chat_message(message['role']):
            st.markdown(message['content'])
            # Check if the message includes a bar chart
            if 'bar_chart' in message:
                st.bar_chart(message['bar_chart'])
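The key point of the snippet above is that each message stores the chart's *data*, while the element itself is re-drawn on every rerun; appending the return value of st.bar_chart() cannot work because that call draws immediately and returns a DeltaGenerator rather than anything replayable. A minimal Streamlit-free sketch of the message schema (the renderable_charts helper is hypothetical, just to show the lookup):

```python
import numpy as np

# Each message stores its chart *data* under an optional 'bar_chart' key;
# the render loop later calls st.bar_chart(message['bar_chart']) on it.
messages = [
    {"role": "assistant", "content": "Hello, human!",
     "bar_chart": np.random.randn(30, 3)},
    {"role": "user", "content": "Thanks!"},
]

def renderable_charts(messages):
    # Mirrors the `if 'bar_chart' in message:` check in the chat loop.
    return [m["bar_chart"] for m in messages if "bar_chart" in m]

charts = renderable_charts(messages)
print(len(charts), charts[0].shape)  # prints: 1 (30, 3)
```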
streamlit.errors.StreamlitAPIException: to_dict() is not a valid Streamlit command.
I have debugged the problem and it seems to be in this line: st.altair_chart(message['altair_chart'])
for message in st.session_state["messages"]:
    with st.chat_message(message["role"]):
        if 'altair_chart' in message:
            st.altair_chart(message['altair_chart'])
I don’t see anything here suggesting an attempt to print a chart, and it is unclear to me how this snippet is related to an Altair chart. I also don’t understand what you mean by “I added to altair chart the numpy object”. I don’t even know which line of code triggered the error, let alone what the call stack is.
I’m not sure if this will work… This is my code right now:
def chatbot():
    if 'chart' in st.session_state["messages"][-1]["content"]:
        info = return_drawing_data(data)
        chart = alt.Chart(info).mark_bar().encode(x=alt.X('Month'), y='Info')
        st.altair_chart(chart, use_container_width=True)
        st.session_state["messages"].append(
            {'role': 'assistant', 'content': 'Here is your chart', 'altair_chart': chart})
    else:
        prompt = [
            {"role": "system", "content": "You are another chatbot that needs to help me"}
        ]
        prompt += st.session_state["messages"]
        stream = ollama.chat(
            model=st.session_state["model"],
            messages=prompt,
            stream=True,
        )
        for chunk in stream:
            yield chunk["message"]["content"]

for message in st.session_state["messages"]:
    with st.chat_message(message["role"]):
        if 'altair_chart' in message:
            st.altair_chart(message['altair_chart'], use_container_width=True)
        else:
            st.markdown(message["content"])

if prompt := st.chat_input("Enter prompt here.."):
    st.session_state["messages"].append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)
    with st.chat_message("assistant"):
        message = st.write_stream(chatbot())
        if message:
            st.session_state["messages"].append({"role": "assistant", "content": message})
What should I do so that these chart messages are not passed to the prompt?
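One way to keep the chart entries out of the LLM prompt (a sketch, assuming chart-bearing messages are marked by the 'altair_chart' key as in the code above, and that ollama.chat expects plain role/content dicts): filter the history before building the prompt. The helper name text_only is hypothetical.

```python
def text_only(messages):
    """Drop chart-bearing entries and keep only the role/content
    fields before handing the history to the LLM."""
    return [
        {"role": m["role"], "content": m["content"]}
        for m in messages
        if "altair_chart" not in m
    ]

# Example history: the middle entry carries a chart and is skipped.
history = [
    {"role": "user", "content": "draw a chart"},
    {"role": "assistant", "content": "Here is your chart", "altair_chart": object()},
    {"role": "user", "content": "thanks"},
]
print(text_only(history))
# prints: [{'role': 'user', 'content': 'draw a chart'}, {'role': 'user', 'content': 'thanks'}]
```

In chatbot(), the line `prompt += st.session_state["messages"]` would then become `prompt += text_only(st.session_state["messages"])`.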