How to solve memory issues with LangChain agents?

Summary

I’m looking to add chat history memory to a LangChain OpenAI Functions agent, based on the instructions here: Add Memory to OpenAI Functions Agent | 🦜️🔗 LangChain

However, this does not seem to work when I call agent.run inside an st.chat_input block. If I test the agent outside of st.chat_input, the chat memory works!

Steps to reproduce

Code snippet:

import streamlit as st
from langchain.agents import AgentType, initialize_agent
from langchain.callbacks import StreamlitCallbackHandler
from langchain.memory import ConversationBufferMemory
from langchain.prompts import MessagesPlaceholder

# `tools`, `llm`, and `langfuse_handler` are defined earlier in the app

# CHAT MEMORY FOR AGENT
agent_kwargs = {
    "extra_prompt_messages": [MessagesPlaceholder(variable_name="memory")],
}
memory = ConversationBufferMemory(memory_key="memory", return_messages=True)

# initialize agent executor (run time)
agent = initialize_agent(
    tools, 
    llm, 
    agent=AgentType.OPENAI_FUNCTIONS, 
    verbose=True,
    agent_kwargs=agent_kwargs,
    memory=memory,
)

# Initialize chat history on first run
if "messages" not in st.session_state:
    st.session_state.messages = []

# Display chat messages from history on app rerun
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])
        
# Accept user input
if question := st.chat_input("Ask me anything..."):
    # Add user message to chat history
    st.session_state.messages.append({"role": "user", "content": question})
    # Display user message in chat message container
    with st.chat_message("user"):
        st.markdown(question)
    # Display assistant response in chat message container
    with st.chat_message("assistant"):
        st_callback = StreamlitCallbackHandler(st.container()) # streamlit container for output
        answer = agent.run(question, callbacks=[st_callback, langfuse_handler])
        st.session_state.messages.append({"role": "assistant", "content": answer})
        st.markdown(answer)

Expected behavior:

Chat memory is updated after every question and answer.

Actual behavior:

Chat memory does not update after each question and answer.

Thank you!

Hi @AnhNgDo,

It seems that the agent is already using ConversationBufferMemory; see the LangChain docs page on retrieving the stored messages.

Hope this helps!
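
One likely cause worth checking: Streamlit re-executes the whole script on every interaction, so `memory = ConversationBufferMemory(...)` is rebuilt on each rerun and the accumulated history is discarded. A common fix is to create the memory once and cache it in `st.session_state`. Below is a minimal, runnable sketch of that caching pattern; a plain dict stands in for `st.session_state` (and a stub class for `ConversationBufferMemory`) so it runs without Streamlit or LangChain. In the real app, replace `session_state` with `st.session_state` and use the actual LangChain class.

```python
# Sketch of the fix: build the memory once and cache it across reruns,
# instead of recreating it at the top of the script on every rerun.

session_state = {}  # stands in for st.session_state in this sketch


class ConversationBufferMemory:
    """Minimal stand-in for langchain.memory.ConversationBufferMemory."""

    def __init__(self):
        self.messages = []


def get_memory(state):
    # Only build the memory on the first run; later reruns reuse it.
    if "agent_memory" not in state:
        state["agent_memory"] = ConversationBufferMemory()
    return state["agent_memory"]


# First "rerun": memory is created and a message is stored.
memory = get_memory(session_state)
memory.messages.append({"role": "user", "content": "hello"})

# Second "rerun" (e.g. after st.chat_input triggers a rerun):
# the same memory object comes back, so the history survives.
memory_again = get_memory(session_state)
print(memory_again.messages)
```

The same guard can cache the whole agent (`if "agent" not in st.session_state: st.session_state.agent = initialize_agent(...)`), so the memory attached to it persists between questions.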