In my experience, LangChain is a very complex HIGH-LEVEL abstraction. If you follow their examples exactly, it's easy to get good results, but if you try to modify something yourself, it often leads to very complicated bugs because so much information is hidden inside it.
Just by looking at this part of your code, I have no idea what is happening. Also, I haven't obtained an Azure OpenAI API key yet, so I cannot test AzureChatOpenAI either.
If I were to debug it, I would first test whether the response is output correctly when streaming is set to False.
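A minimal sanity check for that step might look like the sketch below. It assumes the Azure endpoint and API key are already configured via environment variables; the deployment name is a placeholder, not something from the original post.

```python
from langchain.chat_models import AzureChatOpenAI
from langchain.schema import HumanMessage

# Streaming disabled: the call should return one complete AIMessage.
llm = AzureChatOpenAI(
    deployment_name="my-gpt-deployment",  # hypothetical deployment name
    streaming=False,
)
print(llm([HumanMessage(content="Say hello")]))
```

If this prints a full response, the problem is somewhere in the streaming path rather than in the model setup.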
If everything mentioned above works fine, note that the error message states: "Object of type StreamHandler is not JSON serializable." It's possible that the information returned by the AI is in JSON format. In that case, you might need to extract a specific part of the JSON, such as the text or token, and then pass it to the StreamHandler for processing. You can refer to the "output parser" documentation for guidance: https://python.langchain.com/en/latest/modules/prompts/output_parsers/getting_started.html
Or, if your entire program's code is not very long, you may want to copy all of it along with the error messages into GPT-4 or Claude 100k and let the model do the debugging.
In fact, I wrote this StreamHandler with the help of GPT-4: I gave it the callback documentation page and let it come up with the handler. It did a pretty good job.
Hey @goldengrape,
Thanks for your suggestion. I tried LangChain's built-in StreamingStdOutCallbackHandler to check whether the streaming output worked correctly, and I was able to stream the response to the terminal. But, as mentioned earlier, I was looking for a way to stream the output in Streamlit. I was able to do this by writing a custom stream handler (StreamlitCallbackHandler(BaseCallbackHandler)) and attaching it to the LLM via a callback manager before running the SequentialChain().
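For anyone landing here later, a minimal sketch of such a handler is below. The class name and the container argument are this example's own choices, not LangChain API; only BaseCallbackHandler and on_llm_new_token come from LangChain, and the handler is attached via callbacks= (newer versions accept this directly instead of a CallbackManager).

```python
import streamlit as st
from langchain.callbacks.base import BaseCallbackHandler
from langchain.chat_models import ChatOpenAI

class StreamlitCallbackHandler(BaseCallbackHandler):
    """Append each new token to a Streamlit placeholder as it arrives."""

    def __init__(self, container):
        self.container = container  # e.g. st.empty()
        self.text = ""

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        self.text += token
        self.container.markdown(self.text)

placeholder = st.empty()
llm = ChatOpenAI(
    streaming=True,
    callbacks=[StreamlitCallbackHandler(placeholder)],
)
# llm can now be used inside a SequentialChain(); tokens show up in `placeholder`.
```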
@goldengrape Hi, would this work if I provide custom CSS to it? I was actually trying to implement a chatbot app, using GitHub - AI-Yash/st-chat (a Streamlit component for a chatbot UI) to create the chat UI, but I'm having a hard time integrating streaming support into it. Can somebody please let me know a way?
Hey, same here. It would be great to get this streaming feature into streamlit_chat! Does anyone have an idea of how to do such a thing? All my previous attempts have failed so far.
With the latest (1.24) version of Streamlit, streaming is possible, however ONLY for some special cases like OpenAI's chat completion API. I am working on a Streamlit app that uses LangChain's RetrievalQAWithSourcesChain to answer questions from text documents.
Is there really no way to add streaming with Streamlit + LangChain RetrievalQAWithSourcesChain?
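One untested idea, reusing the handler sketched earlier in the thread: give the chain an LLM with streaming enabled and the Streamlit callback attached. Whether tokens actually stream depends on the chain type; with chain_type="stuff" the final answer comes from a single LLM call, so its tokens can be captured. `vectorstore` here stands in for whatever retriever you already have.

```python
import streamlit as st
from langchain.chat_models import ChatOpenAI
from langchain.chains import RetrievalQAWithSourcesChain

placeholder = st.empty()
llm = ChatOpenAI(
    streaming=True,
    callbacks=[StreamlitCallbackHandler(placeholder)],  # handler from the post above
)
chain = RetrievalQAWithSourcesChain.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=vectorstore.as_retriever(),  # hypothetical existing vector store
)
result = chain({"question": "What does the document say about X?"})
```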