Implementing LangChain's memory with Streamlit's new Chat Elements

I'm trying to integrate LangChain with the just-launched chat elements. I followed the example they posted and adapted it to use LangChain instead of calling OpenAI directly. However, the memory is not working even though I'm using session state to save the conversation. Here is my code.
Code Snippet:

from langchain import OpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
import openai
import streamlit as st

openai.api_key = "OpenAI Key"

# Create the chain once and keep it in session state so its memory
# survives Streamlit reruns.
if "conversation" not in st.session_state:
    llm = OpenAI(
        openai_api_key="xxxxxxxxxxxxxxxxxxxxx",
    )
    st.session_state.conversation = ConversationChain(
        llm=llm,
        memory=ConversationBufferMemory(),
    )

if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the chat history on every rerun.
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

if prompt := st.chat_input("What is up?"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    with st.chat_message("assistant"):
        llm_response = st.session_state.conversation.predict(input=prompt)
        st.markdown(llm_response)
    st.session_state.messages.append({"role": "assistant", "content": llm_response})
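A likely cause (an assumption, since the snippet above is incomplete) is that the chain or its memory gets recreated on every Streamlit rerun instead of being reused from session state. Here is a minimal pure-Python sketch of the pattern, with a plain dict standing in for `st.session_state` and a toy class standing in for `ConversationBufferMemory` (the names `ToyMemory` and `rerun` are illustrative, not LangChain or Streamlit APIs):

```python
class ToyMemory:
    """Toy stand-in for ConversationBufferMemory: just a list of turns."""
    def __init__(self):
        self.history = []

    def add(self, user, assistant):
        self.history.append((user, assistant))


def rerun(session_state, user_msg):
    """One Streamlit 'rerun': the whole script executes top to bottom."""
    # Wrong pattern: `memory = ToyMemory()` here would wipe the history
    # on every rerun. Right pattern: create the memory once and keep it
    # in session state, exactly as with st.session_state.
    if "memory" not in session_state:
        session_state["memory"] = ToyMemory()
    memory = session_state["memory"]

    reply = f"echo: {user_msg}"  # stands in for the actual LLM call
    memory.add(user_msg, reply)
    return reply


session_state = {}  # persists across reruns, like st.session_state
rerun(session_state, "hello")
rerun(session_state, "how are you?")
print(len(session_state["memory"].history))  # -> 2: history survived both reruns
```

The key point is that anything created outside the `if "..." not in session_state` guard is rebuilt from scratch each time the user sends a message, which looks exactly like "memory not working".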

Hi @fabiancbc

I noticed that you forgot to remove your OpenAI API key from your code snippet, so I've taken the liberty of replacing it with placeholder text to keep your key safe.

Hey @fabiancbc, did you ever find a solution to this?

This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.