I have built a Streamlit app using LangChain.
At the start of the application I initialize a BedrockChat LLM with a Claude model and streaming=True, so I expected the LLM response to arrive as a stream rather than as a whole.
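For reference, the initialization looks roughly like this (a sketch of my setup, not exact code; the model_id is a placeholder for whichever Claude model is used):

```python
from langchain_community.chat_models import BedrockChat

# Sketch of my setup; the model_id below is a placeholder.
llm = BedrockChat(
    model_id="anthropic.claude-v2",  # placeholder Claude model ID
    streaming=True,                  # expecting token-by-token output
)
```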
While debugging, I also noticed that the response from the LLM does come back token by token, not as a whole.
I want the tokens displayed in the Streamlit app as they arrive, instead of waiting for the whole response to come back from the LLM before displaying anything.
I pass a `StreamingStdOutCallbackHandler` when initializing the agent, and I call the `AgentExecutor.stream` method to get the response from the LLM, roughly as shown below.
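Simplified, the relevant part of my code looks like this (a sketch: `agent`, `tools`, and `user_question` are placeholders for my actual agent construction and input):

```python
import streamlit as st
from langchain.agents import AgentExecutor
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

# Sketch: `agent` and `tools` stand in for however the agent is built.
agent_executor = AgentExecutor(
    agent=agent,
    tools=tools,
    callbacks=[StreamingStdOutCallbackHandler()],  # tokens print to stdout
)

# AgentExecutor.stream yields chunks, but in the UI the answer still
# appears all at once rather than token by token.
for chunk in agent_executor.stream({"input": user_question}):
    if "output" in chunk:
        st.write(chunk["output"])
```

The stdout callback does show tokens streaming in the terminal, but I have not found how to get that same token-by-token flow into the Streamlit page itself.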
I would much appreciate the community's help here.