Hi everyone! I'm building a chatbot using the Google Vertex AI LLM, and for the Vertex model to maintain context, the "session" object must survive for the entirety of the chat session. Here is a snippet of code showing how it's created:
import streamlit as st
from streamlit_chat import message
import vertexai
from vertexai.preview.language_models import ChatModel, ChatSession
chat_model = ChatModel.from_pretrained("chat-bison@001")
session = ChatSession(model=chat_model)
I couldn't work out why the bot would not maintain context, but after some playing around I realised that the entire script is re-run on every user input. When this happens, the session object is re-created and the chat context is lost.
I've read about the state engine in Streamlit, but I don't believe it will work in this particular situation, as I'm working with an object, not a variable. Is there any way I can prevent the session object from being re-created on every input?
In Python everything is an object. The following assumes you want to keep both the model and the session, and that you never have one without the other. If you only need the session, don't store the model in session_state:
if "session" not in st.session_state:
st.session_state.chat_model = ChatModel.from_pretrained("chat-bison@001")
st.session_state.session = ChatSession(model=chat_model)
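On later reruns the if block is skipped, so the same ChatSession (with its accumulated history) is reused. The rest of the script then reads it back from session_state, something like this sketch (I'm assuming st.chat_input for collecting input and the preview ChatSession's send_message/.text interface; adapt to however your app gathers prompts):

# Reuse the stored session on every rerun; its chat history is preserved
session = st.session_state.session

prompt = st.chat_input("Say something")
if prompt:
    response = session.send_message(prompt)
    st.write(response.text)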
You should declare the model once and add it to the session state.
When the app re-runs (which it does a lot) you have to read the model from the session state instead of instantiating it again (to maintain state).
Something like this:
if "model" not in st.session_state:
    # load_model() is whatever function creates your model
    model = load_model()
    st.session_state["model"] = model
else:
    model = st.session_state["model"]
In addition to the session state suggestions, you could also put the model or session loading into a function, and decorate the function with st.cache_resource (st.cache_resource - Streamlit Docs)
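A rough sketch of how that could be combined with the session_state approach, reusing the chat-bison model from above (note that st.cache_resource shares the cached object across all users and reruns, so the per-user conversation still belongs in session_state):

import streamlit as st
from vertexai.preview.language_models import ChatModel, ChatSession

@st.cache_resource
def load_chat_model():
    # Created once per server process and reused on every rerun
    return ChatModel.from_pretrained("chat-bison@001")

chat_model = load_chat_model()

# Keep the per-user conversation in session_state, not in the shared cache
if "session" not in st.session_state:
    st.session_state.session = ChatSession(model=chat_model)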
Thank you so much Goyo! I'm a data scientist, not a developer, and as you can see, I suck at Python. Your solution worked perfectly, thank you for your help.