Langchain - Memory

Hi all,

If you are looking to implement LangChain memory to mimic a chatbot in Streamlit using the OpenAI API, here is a code snippet that might help you.

You need to decorate the setup with st.cache_resource and put all of the chain-construction code inside a single function, so it only runs once instead of on every rerun.
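A minimal sketch of that pattern (my own example, not the original snippet): it assumes langchain's ConversationChain, ConversationBufferMemory, and ChatOpenAI, reads OPENAI_API_KEY from the environment, and uses an illustrative helper name load_chain.

```python
import streamlit as st
from langchain.chains import ConversationChain
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory

# Build the LLM, memory, and chain once and reuse them across Streamlit reruns.
@st.cache_resource
def load_chain():
    llm = ChatOpenAI(temperature=0)       # reads OPENAI_API_KEY from the environment
    memory = ConversationBufferMemory()   # accumulates the running chat history
    return ConversationChain(llm=llm, memory=memory)

chain = load_chain()

user_input = st.text_input("You:")
if user_input:
    st.write(chain.predict(input=user_input))
```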


To anyone in the same boat, the decorator below also works:

@st.cache(allow_output_mutation=True)


Adding the memory to a cached resource saves the same history for all sessions.
How do you save LangChain's chat history per session?
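One common way to get per-session history (not from this thread, so treat it as a sketch) is to keep the memory object in st.session_state, which is scoped to a single browser session, while caching only the stateless LLM:

```python
import streamlit as st
from langchain.chains import ConversationChain
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory

# The LLM holds no conversation state, so it can safely be shared by all sessions.
@st.cache_resource
def load_llm():
    return ChatOpenAI(temperature=0)

# st.session_state is per browser session, so each visitor gets their own memory
# instead of the single shared one a cached resource would give them.
if "memory" not in st.session_state:
    st.session_state.memory = ConversationBufferMemory()

chain = ConversationChain(llm=load_llm(), memory=st.session_state.memory)

user_input = st.text_input("You:")
if user_input:
    st.write(chain.predict(input=user_input))
```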

This is an old thread, but here is an important addition: I was running into major problems with memory leaks and going over the 1 GB Community Cloud limit every couple of days. It turns out it wasn't LangChain but LangSmith. Turning off LangSmith brought the problem under control. It was easy to overlook because setting up LangSmith only requires a few environment variables. I love LangSmith and am sad to lose the insights, but I have no idea what's going on under the hood.
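For reference, LangSmith tracing in LangChain apps is typically toggled with environment variables such as LANGCHAIN_TRACING_V2 and LANGCHAIN_API_KEY (exact names can vary with your setup); a rough sketch of disabling it before any chains are created:

```python
import os

# Assumption: tracing was enabled via LANGCHAIN_TRACING_V2=true plus LANGCHAIN_API_KEY.
# Flipping the flag to "false" before any LangChain objects are constructed
# keeps the LangSmith tracing callbacks from being attached at all.
os.environ["LANGCHAIN_TRACING_V2"] = "false"
```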

Hopefully this helps somebody else out there.