Hi, I'm creating a chatbot using LangChain and trying to add a streaming feature. However, when I pass the LangChain stream generator to st.write_stream, the output is rendered incorrectly.
Here is the relevant code:
import streamlit as st
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_community.vectorstores import Chroma
from langchain_core.messages import HumanMessage, AIMessage

# get response
def get_response(query, chat_history, context):
    template = """
You are a helpful customer support assistant. If you are given input in Thai, reply in Thai. If you are given input in English, reply in English. Do not include AIMessage in the message.
Answer the following questions using the following context and chat history:
Context: {context}
Chat history: {chat_history}
User question: {user_question}
"""
    # prompt = ChatPromptTemplate.from_template(template)
    llm = ChatOpenAI(model="gpt-3.5-turbo")
    return llm.stream(template.format(context=context, chat_history=chat_history, user_question=query))
# user input
user_query = st.chat_input("Your question")
if user_query is not None and user_query != "":
    load_db = Chroma(persist_directory=CHROMA_PATH, embedding_function=OpenAIEmbeddings())
    context = load_db.similarity_search(user_query)

    st.session_state.chat_history.append(HumanMessage(user_query))

    with st.chat_message("Human"):
        st.markdown(user_query)

    with st.chat_message("AI"):
        ai_response = st.write_stream(get_response(user_query, st.session_state.chat_history, context[0].page_content))

    st.session_state.chat_history.append(AIMessage(ai_response))
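I suspect I may need to wrap the generator myself so that st.write_stream only ever receives plain strings instead of LangChain message chunks. This is an untested sketch of what I have in mind, assuming each streamed chunk exposes a .content attribute holding the text fragment:

def stream_text(chunks):
    # untested: extract the text from each streamed chunk so that
    # st.write_stream only sees plain strings
    for chunk in chunks:
        yield chunk.content

I'm not sure whether a wrapper like that is actually needed, though.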
Does LangChain streaming return a generator type that st.write_stream doesn't support, and if so, how do I fix it?
I'm using Streamlit 1.34.0 and Python 3.9.16, and running Streamlit locally.