Questions about the new streaming feature

I managed to get the new streaming feature to work with a LangChain RetrievalQAWithSourcesChain. To achieve this, I used the new StreamlitCallbackHandler (read here: Streamlit | 🦜️🔗 LangChain), which apparently only works correctly for agents.


from langchain.llms import OpenAI
from langchain.callbacks import StreamlitCallbackHandler

# streaming=True plus the Streamlit callback so tokens render as they arrive
llm = OpenAI(streaming=True, callbacks=[StreamlitCallbackHandler(message_placeholder)])


chain = RetrievalQAWithSourcesChain.from_chain_type(llm=llm, chain_type='stuff', retriever=docsearch.as_retriever())

Get the answer from the LLM:

with st.chat_message("assistant"):
    full_response = ''
    message_placeholder = st.empty()
    if 'chain' not in st.session_state:
        st.session_state['chain'] = load_data(st.session_state['thematic'], message_placeholder)
    answer = st.session_state['chain']({'question': user_input}, return_only_outputs=True)
    for chunk in answer['answer'].split():
        full_response += chunk + " "
        # Add a blinking cursor to simulate typing
        message_placeholder.markdown(full_response + "▌")
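
For reference, the loop above only simulates streaming by splitting the already-finished answer into words. Real token-by-token streaming needs a callback whose on_llm_new_token hook (the method LangChain's BaseCallbackHandler exposes) appends each token to the placeholder as it arrives. Here is a minimal sketch of that accumulation logic, with the Streamlit placeholder stubbed out so it runs standalone; StreamToPlaceholder and FakePlaceholder are made-up names:

```python
# Sketch: stream tokens into a placeholder instead of splitting the answer.
# In a real app, StreamToPlaceholder would subclass langchain's
# BaseCallbackHandler and `placeholder` would be an st.empty() slot.
class StreamToPlaceholder:
    def __init__(self, placeholder):
        self.placeholder = placeholder  # e.g. st.empty() in the real app
        self.text = ""

    def on_llm_new_token(self, token, **kwargs):
        # Called once per generated token; append and redraw with a cursor
        self.text += token
        self.placeholder.markdown(self.text + "▌")

class FakePlaceholder:
    # Stand-in for st.empty(): records the last markdown() call
    def __init__(self):
        self.last = ""

    def markdown(self, body):
        self.last = body

ph = FakePlaceholder()
handler = StreamToPlaceholder(ph)
for tok in ["Hel", "lo", "!"]:
    handler.on_llm_new_token(tok)
print(ph.last)  # → Hello!▌
```

In the app, such a handler would be passed via callbacks=[...] exactly like the StreamlitCallbackHandler above.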

Question 1: Why can’t I have st.chat_message in a column? This makes no sense to me

StreamlitAPIException: st.chat_input() can't be used inside an st.expander, st.form, st.tabs, st.columns, or st.sidebar.
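
One workaround, at least in the version that raises this exception, is to keep st.chat_input at the top level of the script and only *render* the conversation inside the column. A sketch (the app layout here is invented for illustration):

```python
import streamlit as st

# st.chat_input at the top level of the script is allowed; the restriction
# only applies to nesting it inside columns, tabs, expanders, etc.
user_input = st.chat_input("Ask a question")

left, right = st.columns(2)
with left:
    if user_input:
        st.markdown(f"**You:** {user_input}")
        st.markdown("**Assistant:** ...")  # answer rendering goes here
```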

Question 2: When running a second question in the same session, the following error pops up. Is there any way to circumvent this?

Bad message format
'setIn' cannot be called on an ElementNode

It seems this is caused by the st.empty() element that the StreamlitCallbackHandler uses. See the issue here:
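
One thing that may help: the chain cached in session_state keeps a callback bound to the st.empty() placeholder from the *first* run, and that element no longer exists on the next rerun. A possible fix is to build the chain once without callbacks and attach a fresh handler to a fresh placeholder on every run, via the per-call callbacks argument that LangChain chains accept. A sketch, assuming load_data can build the chain without a placeholder:

```python
# Sketch: don't cache a chain whose callback points at a dead placeholder.
# Build the chain once (no callbacks), then pass a handler bound to a
# brand-new st.empty() on each rerun.
with st.chat_message("assistant"):
    message_placeholder = st.empty()  # fresh element every rerun
    if "chain" not in st.session_state:
        st.session_state["chain"] = load_data(st.session_state["thematic"])
    answer = st.session_state["chain"](
        {"question": user_input},
        return_only_outputs=True,
        callbacks=[StreamlitCallbackHandler(message_placeholder)],
    )
```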

Aside from that: great feature :raised_hands:


Hi! I get the same error, and I agree with you: it does not really make sense!


Interested in the solution for Q1. I would like to set up the chat input within one of two tabs, but it seems to be impossible.