Summary
Hello everyone. When I use the file uploader in combination with st.chat_input and st.chat_message, the session state is sometimes flushed seemingly at random. This happens inside functions imported from sub-directories.
Steps to reproduce
My folder structure is the following:
├─ utils
│ ├─ __init__.py
│ ├─ chat.py
│ └─ ui.py
├─ app.py
The app.py file is where the UI is built and the function from chat.py is called:
import streamlit as st
from utils.ui import (
    display_all_messages,
    process_chat_input,
    initialize_messages
)
from utils.chat import (
    generate_chat_response
)

PAGE_NAME = 'kw'

uploaded_file = st.file_uploader(
    'Upload a file'
)

# Initialize chat history
if f"{PAGE_NAME}_messages" not in st.session_state:
    initialize_messages(page_name=PAGE_NAME)

# Display chat messages from history on app rerun
display_all_messages(page_name=PAGE_NAME)

# Accept user input
if prompt := st.chat_input("Your message."):
    message_prompt = process_chat_input(prompt, page_name=PAGE_NAME)
    print("A BEFORE GENERATING RESPONSE")
    print(st.session_state)
    response = generate_chat_response(
        question=prompt,
        chat_history=st.session_state[f'{PAGE_NAME}_messages']
    )
    print("B AFTER GENERATING RESPONSE")
    print(st.session_state)
The utils/ui.py file is responsible for printing various parts of the app:
import streamlit as st

def initialize_messages(page_name=''):
    st.session_state[f'{page_name}_messages'] = []

def display_all_messages(page_name=''):
    for message in st.session_state[f'{page_name}_messages']:
        with st.chat_message(message["role"]):
            st.markdown(message['content'])

def process_chat_input(prompt, page_name=''):
    message = {
        'role': 'user',
        'content': prompt
    }
    # Add user message to chat history
    st.session_state[f'{page_name}_messages'].append(message)
    # Display user message in chat message container
    with st.chat_message("user"):
        st.markdown(prompt)
    return message
Finally, the utils/chat.py file contains dummy functions which simulate the generation of a chat response:
import streamlit as st
import time

def get_model():
    print("1.a BEFORE MODEL LOADING")
    print(st.session_state)
    time.sleep(1)
    model = 'dummy_model'
    print("1.b AFTER MODEL LOADING")
    print(st.session_state)
    return model

def generate_chat_response(question, chat_history):
    print("1 BEFORE CALLING GET_MODEL()")
    print(st.session_state)
    llm = get_model()
    return 'text'
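As a side note, caching the model load would at least prevent the blocking sleep from re-running on every prompt. In Streamlit the analogous tool would be the st.cache_resource decorator; the sketch below uses functools.lru_cache as a stand-in so the idea can be shown without a running Streamlit app (this is an assumed mitigation, not a confirmed fix for the flush):

```python
import functools
import time

# Cache the expensive model load so it only happens once per process.
# In the real app, st.cache_resource would play this role.
@functools.lru_cache(maxsize=1)
def get_model():
    time.sleep(1)  # simulate slow model loading
    return 'dummy_model'

first = get_model()   # slow: performs the load
second = get_model()  # fast: returned from the cache
```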
To reproduce the error, run app.py with Streamlit and submit a prompt, e.g. 'hi', in the chat input.
Expected behavior:
I expect the Streamlit session state to persist and be accessible throughout all function calls without being flushed. The following expected output is generated when the file uploader is removed from the app:
You can now view your Streamlit app in your browser.
Local URL: http://localhost:8501
Network URL: http://192.168.0.238:8501
A BEFORE GENERATING RESPONSE
{'kw_messages': [{'role': 'user', 'content': 'Hi'}]}
1 BEFORE CALLING GET_MODEL()
{'kw_messages': [{'role': 'user', 'content': 'Hi'}]}
1.a BEFORE MODEL LOADING
{'kw_messages': [{'role': 'user', 'content': 'Hi'}]}
1.b AFTER MODEL LOADING
{'kw_messages': [{'role': 'user', 'content': 'Hi'}]}
B AFTER GENERATING RESPONSE
{'kw_messages': [{'role': 'user', 'content': 'Hi'}]}
Actual behavior:
This is the output of the app with the file uploader included:
You can now view your Streamlit app in your browser.
Local URL: http://localhost:8502
Network URL: http://192.168.0.238:8502
A BEFORE GENERATING RESPONSE
{'kw_messages': [{'role': 'user', 'content': 'hi'}]}
1 BEFORE CALLING GET_MODEL()
{'kw_messages': [{'role': 'user', 'content': 'hi'}]}
1.a BEFORE MODEL LOADING
{'kw_messages': [{'role': 'user', 'content': 'hi'}]}
1.b AFTER MODEL LOADING
{}
B AFTER GENERATING RESPONSE
{}
Debug info
- Streamlit version: 1.27.0
- Python version: 3.11.3
- Using Conda? Yes
- OS version: Windows 10
- Browser version: Chrome Version 117.0.5938.150 (64bit)
Additional information
While reducing my app to a minimal working example, I noticed that the session state is not always flushed; the behavior appears to be intermittent.
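Until the root cause is found, a defensive workaround I have been experimenting with is to snapshot the chat history before calling into the generation code and restore it if the state comes back empty. This is a sketch under the assumption that st.session_state behaves like a mutable mapping; the helper name `call_with_state_guard` is hypothetical, and a plain dict stands in for the session state here:

```python
def call_with_state_guard(state, key, fn, *args, **kwargs):
    # Snapshot the value before calling code that may trigger the flush.
    snapshot = list(state.get(key, []))
    result = fn(*args, **kwargs)
    # If the key disappeared or was emptied during the call, restore it.
    if not state.get(key):
        state[key] = snapshot
    return result

# Usage with a plain dict standing in for st.session_state:
state = {'kw_messages': [{'role': 'user', 'content': 'hi'}]}

def flaky_generate():
    state.clear()  # simulate the observed session-state flush
    return 'text'

response = call_with_state_guard(state, 'kw_messages', flaky_generate)
```

This does not explain the flush, but it keeps the chat history usable while the underlying issue is investigated.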