How to build a chat interface with follow-up questions like in GPT-5

All the examples I found in the Streamlit docs are a bit old, and I cannot find anything on how to leverage newer GPT chat features, especially the way GPT-5 automatically renders follow-up questions after an answer. I haven't found a good way to reproduce this.
The approach I had in mind was to instruct the LLM to create the follow-ups itself. That would require the LLM to return the result as JSON, with the follow-ups as separate fields that I can then put into st.button widgets, roughly as sketched below.
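This is only a rough sketch of that idea, assuming the OpenAI Python client and a placeholder model name (the actual GPT-5 model id may differ):

```python
import json

import streamlit as st
from openai import OpenAI

client = OpenAI()

SYSTEM = (
    "Answer the user, then suggest three short follow-up questions. "
    'Respond as JSON: {"answer": "...", "follow_ups": ["...", "...", "..."]}'
)

if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if prompt := st.chat_input("Ask something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    # No streaming here: the whole JSON has to arrive before it can be parsed.
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        response_format={"type": "json_object"},
        messages=[{"role": "system", "content": SYSTEM}, *st.session_state.messages],
    )
    data = json.loads(response.choices[0].message.content)

    with st.chat_message("assistant"):
        st.markdown(data["answer"])
        for q in data.get("follow_ups", []):
            st.button(q, key=q)  # clicking could feed q back in as the next prompt

    st.session_state.messages.append({"role": "assistant", "content": data["answer"]})
```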
However, this leads to an issue: I cannot use streaming in that case, because the raw JSON would be streamed into the chat. Yet the ChatGPT interface (and others) do stream the answer while still showing suggested follow-ups, so there must be another way. I don't think they send a second prompt just to generate the follow-ups, because the suggestions appear without any further delay. How is that achieved?
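To make the streaming problem concrete, here is the streamed variant of the same kind of call (same placeholder model name). As far as I can tell, the user would watch the raw JSON being typed out token by token, which is exactly what I want to avoid:

```python
import streamlit as st
from openai import OpenAI

client = OpenAI()

stream = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    response_format={"type": "json_object"},
    stream=True,
    messages=[
        {
            "role": "system",
            "content": 'Answer, then add follow-ups. Respond as JSON: '
                       '{"answer": "...", "follow_ups": ["..."]}',
        },
        {"role": "user", "content": "Explain streaming in Streamlit"},
    ],
)

with st.chat_message("assistant"):
    # The chat bubble shows {"answer": "... appearing character by character,
    # and the follow_ups can only become st.button widgets after the stream
    # has finished anyway.
    st.write_stream(chunk.choices[0].delta.content or "" for chunk in stream)
```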
Any “modern” chat example would be helpful. Thanks!