Help build a basic LLM chat app


I am trying to follow the tutorial below to build a basic LLM chat app that uses AzureOpenAI.

Build a basic LLM chat app

Here’s the code I use, where `api_version`, `azure_endpoint`, and `st.session_state["openai_model"]` have appropriate values.

```python
from openai import AzureOpenAI
import streamlit as st

st.title("ChatGPT-like clone")

client = AzureOpenAI(
    api_version="...",
    azure_endpoint="...",
    api_key="...",
)

if "openai_model" not in st.session_state:
    st.session_state["openai_model"] = "..."

if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation history on each rerun
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

if prompt := st.chat_input("What is up?"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    with st.chat_message("assistant"):
        stream = client.chat.completions.create(
            model=st.session_state["openai_model"],
            messages=[
                {"role": m["role"], "content": m["content"]}
                for m in st.session_state.messages
            ],
            stream=True,
        )
        response = st.write_stream(stream)
    st.session_state.messages.append({"role": "assistant", "content": response})
```

When I enter `20+30=` in the prompt, I get the following response.

```
choices=[], created=0, model='', object='', system_fingerprint=None,
prompt_filter_results=[{'prompt_index': 0, 'content_filter_results':
{'hate': {'filtered': False, 'severity': 'safe'}, 'self_harm':
{'filtered': False, 'severity': 'safe'}, 'sexual': {'filtered': False,
'severity': 'safe'}, 'violence': {'filtered': False, 'severity':
```


When I follow up and enter "No, it is 40.", I get the following error.

```
BadRequestError: Error code: 400 - {'error': {'message': "'$.messages[1].content' is invalid. Please check the API reference:", 'type': 'invalid_request_error', 'param': None, 'code': None}}
```
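For what it's worth, the 400 error points at `messages[1]` (the first assistant turn in the history) and says its `content` is invalid: the Chat Completions API expects each message's `content` to be a string, so if something other than the streamed text (e.g. a raw response object or its repr'd chunks) ends up in `st.session_state.messages`, the next request fails. A minimal sketch of that constraint (the `first_bad_content` helper and the sample payloads are hypothetical, just for illustration):

```python
# Each message's "content" must be a string; a payload shaped like
# invalid_messages below would trigger a 400 pointing at messages[1].
valid_messages = [
    {"role": "user", "content": "20+30="},
    {"role": "assistant", "content": "50"},
]
invalid_messages = [
    {"role": "user", "content": "20+30="},
    # content is not a string here -- e.g. a raw response object was
    # stored in the chat history instead of the assistant's text
    {"role": "assistant", "content": {"choices": []}},
]

def first_bad_content(messages):
    """Return the index of the first message whose content is not a string."""
    for i, m in enumerate(messages):
        if not isinstance(m["content"], str):
            return i
    return None
```

Checking the contents of `st.session_state.messages` after the first exchange would show whether a non-string assistant message is what the second request trips over.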


```
File "C:\Python\Python311\Lib\site-packages\streamlit\runtime\scriptrunner\", line 535, in _run_script
    exec(code, module.__dict__)
File "C:\myapp\week 28-31\OpenAI\", line 27, in <module>
    stream =
File "C:\Python\Python311\Lib\site-packages\", line 271, in wrapper
    return func(*args, **kwargs)
File "C:\Python\Python311\Lib\site-packages\openai\resources\chat\", line 659, in create
    return self._post(
File "C:\Python\Python311\Lib\site-packages\", line 1200, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "C:\Python\Python311\Lib\site-packages\", line 889, in request
    return self._request(
File "C:\Python\Python311\Lib\site-packages\", line 980, in _request
    raise self._make_status_error_from_response(err.response) from None
```

Please advise.

In case it matters, I am using the gpt-4 model, while the tutorial uses the gpt-3.5-turbo model. The issue seems to be caused by `stream=True`.
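The stray `choices=[], created=0, …` output above is consistent with that suspicion: Azure streams typically begin with a chunk whose `choices` list is empty (it carries only `prompt_filter_results`), so code that indexes `choices[0]` on every chunk, or that echoes chunk objects instead of collecting their text deltas, misbehaves only when `stream=True`. A minimal sketch with stand-in classes (no API call; the class names merely mimic the SDK's chunk shapes):

```python
from dataclasses import dataclass
from typing import List, Optional

# Stand-in shapes mimicking the SDK's streaming chunks (illustrative only).
@dataclass
class Delta:
    content: Optional[str]

@dataclass
class Choice:
    delta: Delta

@dataclass
class Chunk:
    choices: List[Choice]

def collect_text(chunks):
    """Concatenate the text deltas from a stream of chunks.

    Guard against chunks with an empty `choices` list (Azure's
    content-filter preamble) before indexing choices[0], and skip the
    final chunk whose delta content is None.
    """
    parts = []
    for chunk in chunks:
        if chunk.choices and chunk.choices[0].delta.content is not None:
            parts.append(chunk.choices[0].delta.content)
    return "".join(parts)

stream = [
    Chunk(choices=[]),                     # Azure content-filter preamble
    Chunk(choices=[Choice(Delta("5"))]),
    Chunk(choices=[Choice(Delta("0"))]),
    Chunk(choices=[Choice(Delta(None))]),  # final chunk, no text
]
```

If consuming the stream this way works while the original code fails, the problem is in how the chunks are consumed and stored, not in the gpt-4 deployment itself.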

Hi @txt,

Sorry for the late reply, but in case you didn’t see it and also for posterity: Support for OpenAI was added in Streamlit version 1.32.0. :slight_smile: