Streamlit and LangGraph used to create a human-in-the-loop news writer app

https://meeting-reporter.streamlit.app/ mates Streamlit and LangGraph in an app that uses both multiple agents and a human in the loop to generate news stories more reliably than AI can alone and more cheaply than humans can without AI. It’s an example of how AI can help fill a gap in local news reporting.

It’s based on GPT-4 Turbo, so you do need your own paid OpenAI API key to get past the first screen (each run costs a few pennies).

The code is open source on GitHub at tevslin/meeting-reporter: human-AI collaboration to produce a news story about a meeting from minutes or a transcript.

Screenshots and a transcript of a session are here.

Most examples of LangGraph use are in Jupyter notebooks, so they aren’t really suitable for deployment to a broad audience. Streamlit solves the UI problem, but mating the Streamlit and LangGraph state machines is an interesting problem.
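The heart of the problem is that Streamlit reruns the whole script on every interaction, while LangGraph wants to pause mid-graph and wait for the human. Here is a rough sketch of one way to wire them together; it is not the app’s actual code, and the node names, state schema, and widgets are just illustrative. The idea is to compile the graph with an interrupt before the human-review node, keep the compiled graph in st.session_state so it survives reruns, and resume it once the person approves.

```python
import streamlit as st
from typing import TypedDict
from langgraph.graph import StateGraph, END
from langgraph.checkpoint.memory import MemorySaver

class StoryState(TypedDict):
    transcript: str
    draft: str

def draft_story(state: StoryState) -> dict:
    # In the real app an LLM drafts the story; a stub keeps the sketch self-contained.
    return {"draft": "DRAFT: " + state["transcript"][:60]}

def human_review(state: StoryState) -> dict:
    # The graph is interrupted *before* this node so a person can edit the draft.
    return {}

if "graph" not in st.session_state:
    builder = StateGraph(StoryState)
    builder.add_node("draft_story", draft_story)
    builder.add_node("human_review", human_review)
    builder.set_entry_point("draft_story")
    builder.add_edge("draft_story", "human_review")
    builder.add_edge("human_review", END)
    # Keep the compiled graph (and its in-memory checkpointer) across Streamlit reruns.
    st.session_state.graph = builder.compile(
        checkpointer=MemorySaver(), interrupt_before=["human_review"]
    )

graph = st.session_state.graph
config = {"configurable": {"thread_id": "demo-session"}}

if "started" not in st.session_state:
    # First run: execute the graph until it pauses at the interrupt.
    graph.invoke({"transcript": "Minutes of the town council meeting ...", "draft": ""}, config)
    st.session_state.started = True

draft = graph.get_state(config).values.get("draft", "")
edited = st.text_area("Review and edit the draft", value=draft)
if st.button("Approve"):
    # Push the human edit into the graph state, then resume past the interrupt.
    graph.update_state(config, {"draft": edited})
    final = graph.invoke(None, config)
    st.write(final["draft"])
```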


@tevslin it appears that you are not using langchain_community.callbacks.StreamlitCallbackHandler to render each step of the LangGraph workflow. That handler is the documented way of doing this with LangChain, but it produces the following error when used with a LangGraph workflow:

Error in StreamlitCallbackHandler.on_llm_end callback: RuntimeError('Current LLMThought is unexpectedly None!')

See this Stack Overflow post for more info.

Any ideas on how to properly use langchain_community.callbacks.StreamlitCallbackHandler with LangGraph? … or how to get the equivalent functionality with another approach?
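For reference, this is roughly the wiring I’m attempting: attach the handler to a container and pass it through the graph’s run config so it reaches the LLM calls inside the nodes. The one-node graph, model, and prompt below are just placeholders.

```python
import streamlit as st
from typing import TypedDict
from langchain_community.callbacks import StreamlitCallbackHandler
from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, END

class State(TypedDict):
    question: str
    answer: str

llm = ChatOpenAI(model="gpt-4-turbo")  # model choice is illustrative

def answer_node(state: State) -> dict:
    return {"answer": llm.invoke(state["question"]).content}

builder = StateGraph(State)
builder.add_node("answer", answer_node)
builder.set_entry_point("answer")
builder.add_edge("answer", END)
graph = builder.compile()

st_callback = StreamlitCallbackHandler(st.container())
result = graph.invoke(
    {"question": "Summarize the meeting minutes.", "answer": ""},
    config={"callbacks": [st_callback]},  # callbacks propagate into the node's LLM call
)
st.write(result["answer"])
```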

Nick, as you’ve seen, I’m not familiar with langchain_community.callbacks.StreamlitCallbackHandler, but I will take a look at it as I upgrade this code to add more functionality.


Hey @Nick, I solved the streaming problem with LangGraph and Streamlit in the toolkit I just published today:

In my repo I run the agent in a different container with a FastAPI server in between, but I believe the same principle and key code would work if the agent were running directly in the Streamlit app.
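For illustration, here is roughly what the in-process version of that principle looks like: stream LLM tokens out of the graph and let st.write_stream render them as they arrive. This is a stripped-down sketch, not code from the toolkit; the one-node graph and model name are placeholders.

```python
import streamlit as st
from langchain_openai import ChatOpenAI
from langgraph.graph import MessagesState, StateGraph, END

llm = ChatOpenAI(model="gpt-4o-mini")  # any streaming-capable chat model works here

def chat(state: MessagesState) -> dict:
    return {"messages": [llm.invoke(state["messages"])]}

builder = StateGraph(MessagesState)
builder.add_node("chat", chat)
builder.set_entry_point("chat")
builder.add_edge("chat", END)
graph = builder.compile()

prompt = st.chat_input("Ask the agent something")
if prompt:
    st.chat_message("user").write(prompt)

    def token_stream():
        # stream_mode="messages" yields (message_chunk, metadata) pairs as tokens arrive
        for chunk, metadata in graph.stream(
            {"messages": [("user", prompt)]}, stream_mode="messages"
        ):
            if chunk.content:
                yield chunk.content

    with st.chat_message("assistant"):
        # st.write_stream consumes the generator and renders tokens incrementally
        st.write_stream(token_stream())
```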


That’s awesome! Thanks @Joshua2 for sharing :pray:

@Joshua2 the complexity of your code, even just the draw_messages function, shows how helpful it would be for Streamlit to build in functionality for handling streaming output from LangGraph.
Otherwise, each developer has to reinvent this code (or at least find examples in other codebases).