I managed to get the new streaming feature to work together with a LangChain RetrievalQAWithSourcesChain. To achieve this, I used the new StreamlitCallbackHandler (see the docs: Streamlit | 🦜️🔗 LangChain), which apparently only works correctly for agents.
LLM:

```python
llm = OpenAI(client=OpenAI, streaming=True, callbacks=[StreamlitCallbackHandler(message_placeholder)])
```
Chain:

```python
chain = RetrievalQAWithSourcesChain.from_chain_type(
    llm=llm, chain_type='stuff', retriever=docsearch.as_retriever()
)
```
Get the answer from the LLM:

```python
with st.chat_message("assistant"):
    full_response = ''
    message_placeholder = st.empty()
    if 'chain' not in st.session_state:
        st.session_state['chain'] = load_data(st.session_state['thematic'], message_placeholder)
    if 'chain' in st.session_state:
        answer = st.session_state['chain']({'question': user_input}, return_only_outputs=True)
        for chunk in answer['answer'].split():
            full_response += chunk + " "
            time.sleep(0.05)
            # Add a blinking cursor to simulate typing
            message_placeholder.markdown(full_response + "▌")
        message_placeholder.markdown(full_response)
```
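For context, the loop above only simulates streaming: the complete answer comes back first and is then re-rendered one word at a time. Stripped of Streamlit, the same logic looks like this (the helper name `simulate_typing` is just my label for it, not part of any API):

```python
import time


def simulate_typing(answer, delay=0.05):
    """Yield the answer one word at a time, accumulating the text so far.

    Each yielded string is what message_placeholder.markdown() would
    display at that step in the loop above.
    """
    full_response = ""
    for chunk in answer.split():
        full_response += chunk + " "
        time.sleep(delay)
        yield full_response
```

Every yielded frame is the cumulative text, which is why the placeholder can simply overwrite its previous content on each iteration.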
Question 1: Why can't I use st.chat_message (or st.chat_input) inside a column? This makes no sense to me:
```
StreamlitAPIException: st.chat_input() can't be used inside an st.expander, st.form, st.tabs, st.columns, or st.sidebar.
```
Question 2: When I ask a second question in the same session, the following error pops up. Is there any way to circumvent this?
```
Bad message format
'setIn' cannot be called on an ElementNode
```
It seems this is caused by the st.empty() element that the StreamlitCallbackHandler writes to. See this issue:
https://github.com/streamlit/streamlit/issues/5880
Apart from that: great feature!