Wanted to stream the LLM response as it arrives in a Streamlit application

I have built a Streamlit app using LangChain.

At the start of the application I initialize BedrockChat with a Claude model and streaming=True, so I expected the LLM response to arrive as a stream rather than all at once.

While debugging I noticed that the response from the LLM does in fact arrive token by token, not as a whole.

I want the tokens to be displayed in the Streamlit application as they arrive, instead of waiting for the whole response from the LLM before anything is displayed.

I used StreamingStdOutCallbackHandler while initializing the agent, and I call AgentExecutor.stream to get the response from the LLM.
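For reference, the part I am unsure about is how to turn what AgentExecutor.stream yields into plain text Streamlit can render incrementally. A minimal sketch of the adapter I have in mind, assuming each chunk is a dict and the generated text arrives under an "output" key (hypothetical chunk shape; the fake stream below stands in for the real AgentExecutor.stream call):

```python
def extract_text(chunks):
    """Yield only the text pieces from a stream of agent chunks.

    Assumes each chunk is a dict whose text (if any) sits under an
    "output" key -- a hypothetical shape; adjust to whatever your
    agent's chunks actually look like.
    """
    for chunk in chunks:
        if "output" in chunk:
            yield chunk["output"]

# Simulated stream of chunks (stand-in for AgentExecutor.stream(...)):
fake_stream = [{"actions": []}, {"output": "Hello"}, {"output": " world"}]
print("".join(extract_text(fake_stream)))  # → Hello world
```

A generator like this can then be handed directly to a Streamlit rendering call, since it yields plain strings.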

Would much appreciate the community's help here.

Is st.write_stream what you are looking for?


Yes, I did use st.write_stream, but it still did not help. I also tried looping over the response as it arrives and passing the chunks to st.write_stream, but that did not help either.
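When st.write_stream does not behave as expected, a fallback pattern is to drive a placeholder manually: create a placeholder with st.empty(), append each token to a buffer, and re-render the placeholder with the growing text on every token. A minimal sketch of that accumulation loop, with the Streamlit call factored out as a `render` callback so the logic is self-contained (the `frames` list below is just a stand-in for the placeholder updates):

```python
def stream_to_placeholder(tokens, render):
    """Accumulate streamed tokens and re-render the growing text.

    `render` stands in for a Streamlit placeholder update, e.g.
    placeholder = st.empty() and render = placeholder.markdown
    (hypothetical wiring; adapt to your app).
    """
    buffer = ""
    for token in tokens:
        buffer += token
        render(buffer)  # in Streamlit: placeholder.markdown(buffer)
    return buffer

# Usage with a fake token stream instead of a live LLM:
frames = []
final = stream_to_placeholder(["Str", "eam", "ing!"], frames.append)
print(final)  # → Streaming!
```

In the real app, `tokens` would be whatever your agent's stream yields once reduced to strings, and each `render(buffer)` call repaints the same placeholder so the text appears to grow in place.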

Have you tried using the chat elements (st.chat_message)?