Using write_stream with langchain llm streaming showing incorrect output

Hi, I’m creating a chatbot with LangChain and trying to add a streaming feature. However, when I use st.write_stream on the LangChain stream generator, I get incorrect output.


Here is the relevant code:

import streamlit as st
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_community.vectorstores import Chroma
from langchain_core.messages import HumanMessage, AIMessage

# get response
def get_response(query, chat_history, context):
    template = """
    You are a helpful customer support assistant. If you are given input in Thai, reply in Thai. If you are given input in English, reply in English. Do not include AIMessage in the message.
    Answer the following question using the following context and chat history:

    Context: {context}

    Chat history: {chat_history}

    User question: {user_question}
    """
    # prompt = ChatPromptTemplate.from_template(template)

    llm = ChatOpenAI(model="gpt-3.5-turbo")

    # llm.stream returns a generator of AIMessageChunk objects
    return llm.stream(template.format(context=context, chat_history=chat_history, user_question=query))

# user input
user_query = st.chat_input("Your question")
if user_query:
    load_db = Chroma(persist_directory=CHROMA_PATH, embedding_function=OpenAIEmbeddings())
    context = load_db.similarity_search(user_query)
    
    st.session_state.chat_history.append(HumanMessage(user_query))
    
    with st.chat_message("Human"):
        st.markdown(user_query)
        
    with st.chat_message("AI"):
        ai_response = st.write_stream(get_response(user_query, st.session_state.chat_history, context[0].page_content))
        
    st.session_state.chat_history.append(AIMessage(ai_response))

Does LangChain streaming output a generator type that st.write_stream doesn't support, and if so, how do I fix it?

I'm using Streamlit 1.34.0 and Python 3.9.16, running Streamlit locally.

Alright, I’ve gotten closer to my desired output by preprocessing the chunks, extracting chunk.content from each one in the following code:

def stream_response(response):
    for chunk in response:
        yield chunk.content

st.write_stream(stream_response(get_response(user_query, st.session_state.chat_history, context[0].page_content)))
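For anyone hitting the same issue: each item llm.stream yields is an AIMessageChunk object rather than a plain string, which is why st.write_stream rendered it oddly; yielding chunk.content hands it plain text instead. Here is a quick stand-alone check of the extraction step (FakeChunk is a minimal stand-in for the real chunk objects, since those require a live LLM call):

```python
from dataclasses import dataclass

# Minimal stand-in for LangChain's AIMessageChunk; only the .content
# attribute matters for this demonstration.
@dataclass
class FakeChunk:
    content: str

def stream_response(response):
    # Yield only the text of each chunk so st.write_stream receives strings
    for chunk in response:
        yield chunk.content

pieces = list(stream_response([FakeChunk("Hel"), FakeChunk("lo"), FakeChunk("!")]))
print("".join(pieces))  # Hello!
```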
    

However, I get an extra copy of the first letter at the end of the line.

How do I fix this?

Okay, my bad. It seems I had accidentally added an extra line:

st.write(ai_response[0])

My code is working fine now!
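For context on why that line caused the stray letter: st.write_stream returns the full concatenated text once the stream finishes, so indexing the result with [0] gives just its first character, and writing that rendered the duplicate letter. A tiny illustration (ai_response here is a stand-in for the string st.write_stream returns):

```python
ai_response = "Hello!"  # stand-in for the string st.write_stream returns

# Indexing a string yields a single character, so st.write(ai_response[0])
# would render a lone "H" after the streamed answer.
first_char = ai_response[0]
print(first_char)  # H
```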
