Static paragraph while using chat

In the code below, I want the styled_paragraph to stay, while I Q and A using the chat elements. But it is not staying. As soon as I ask a question, the paragraph disappears.

# Imports implied by the snippet (the actual import block was not shown)
import openai
import streamlit as st
from langchain.chains import RetrievalQA
from langchain.chat_models import AzureChatOpenAI
from langchain.prompts import PromptTemplate

if key_validated:
    # Initialize the documents in memory
    with st.spinner(text="Initializing the bot. Hang tight!"):
        document_db = load_data()

        # instantiate the database retriever
        retriever = document_db.as_retriever(search_type="similarity", search_kwargs={"k": 3})

        # instantiate the large language model
        # (deployment name and credentials assumed to be configured elsewhere)
        llm = AzureChatOpenAI()
        template = """Use the following pieces of context to answer the question at the end.
        If you don't know the answer, just say that you don't know; don't try to make up an answer.
        Give detailed answers.
        Context: {context}
        Question: {question}
        Helpful Answer:"""
        QA_CHAIN_PROMPT = PromptTemplate.from_template(template)

        qa_chain = RetrievalQA.from_chain_type(
            llm=llm,
            retriever=retriever,
            chain_type_kwargs={"prompt": QA_CHAIN_PROMPT},
        )

    police_report_prompt = """
    Summarize my report in plain English
    """

    if st.button("Police Report Summary"):
        with st.spinner("Thinking..."):
            response = qa_chain({"query": police_report_prompt})
            # Using HTML tags to style the paragraph
            styled_paragraph = f"""
                <div style="background-color: #f5f5f5; padding: 20px; border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);">
                    <h2 style="color: #333;">My Report</h2>
                    <p style="font-size: 18px; line-height: 1.6; color: #666;">{response["result"]}</p>
                </div>"""
            # Display the styled paragraph using Streamlit
            st.markdown(styled_paragraph, unsafe_allow_html=True)

    # Initialize the chat history once per session
    if "messages" not in st.session_state:
        st.session_state.messages = []

    # Prompt for user input and save it to the chat history
    if prompt := st.chat_input(
            "Your question here ..",
            disabled=not openai.api_key):
        st.session_state.messages.append({"role": "user", "content": prompt})

    for message in st.session_state.messages:  # Display the prior chat messages
        with st.chat_message(message["role"]):
            st.write(message["content"])

    # Pass query to chat engine and display response
    # If last message is not from assistant, generate a new response
    if st.session_state.messages and st.session_state.messages[-1]["role"] != "assistant":
        with st.chat_message("assistant"):
            with st.spinner("Thinking..."):
                response = qa_chain({"query": prompt})
                st.write(response["result"])
                message = {"role": "assistant", "content": response["result"]}
                st.session_state.messages.append(message)  # Add response to message history

Hi @Sridhar_Iyer

Perhaps you can try using the single-element container st.empty() as a placeholder underneath st.button("Police Report Summary"), and then write any later text into that placeholder. This keeps the button located above the placeholder.

However, as more responses are generated, the scrolling mechanism kicks in, so the latest content is displayed at the bottom and older content at the top.

Another approach is to place the st.button("Police Report Summary") in the sidebar, which should not be affected by the response generation.

Hope this helps!

Trial #1:
Using st.empty() as a placeholder did not help. The st.markdown(styled_paragraph, unsafe_allow_html=True) still disappears after I ask a question in the chat.
My code -

placeholder = st.empty()
# Rest of the code
# ---
with placeholder.container():
    st.markdown(styled_paragraph, unsafe_allow_html=True)

Trial #2:
Having the Create Police Report button in the sidebar -
Same issue. The st.markdown(styled_paragraph, unsafe_allow_html=True) shows in the sidebar itself, which I don’t want. This also disappears after I use the chat.

What I want: static text on the main Streamlit screen along with the chat elements. It is OK if the static paragraph moves to the top while chatting.
