How to render the LLM response like ChatGPT does?

We see that when we ask ChatGPT to summarize a document, or to produce a title and subtitle for a query, it renders the answer beautifully.
By beautifully, I mean the title is larger in size, while the subtitle and body text are smaller.

I’ve created a PDF RAG app using LangChain v0.2. It has a streaming feature, so the result is streamed to me chunk by chunk.

Now I want my app’s responses styled the same way as ChatGPT’s, with larger headings and smaller body text. How can I do that?

Hi,

Using st.markdown() within st.chat_message should handle the styling of text via markdown syntax.
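For example, a minimal sketch of what that looks like (the summary text here is a hypothetical placeholder; inside a real Streamlit app the string would be passed to st.markdown):

```python
# Markdown that st.markdown() (or st.write / st.write_stream) renders with
# large headings and smaller body text. The title/subtitle below are
# hypothetical example content.
summary_md = (
    "# Document Summary\n\n"   # rendered as a large H1 title
    "## Key Findings\n\n"      # rendered as a smaller H2 subtitle
    "Body text renders at the normal, smaller size."
)

# In a Streamlit app this would be displayed as:
#   with st.chat_message("assistant"):
#       st.markdown(summary_md)
print(summary_md)
```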

# Imports needed by this snippet (llm, chatPrompt, history_aware_retriever,
# get_session_history, and the Pinecone client `pc` are defined elsewhere)
import streamlit as st
from langchain.chains import create_retrieval_chain
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain_core.messages import AIMessage, HumanMessage
from langchain_core.runnables.history import RunnableWithMessageHistory

question_answer_chain = create_stuff_documents_chain(llm, chatPrompt)

rag_chain = create_retrieval_chain(history_aware_retriever, question_answer_chain)

conversational_rag_chain = RunnableWithMessageHistory(
    rag_chain,
    get_session_history,
    input_messages_key="input",
    output_messages_key="answer",
    history_messages_key="chat_history",
)


# Generate the streamed response, yielding only the answer chunks
def generate_response(prompt: str):
    for chunk in conversational_rag_chain.stream(
        input={"input": prompt},
        config={"configurable": {"session_id": "gaurav"}},
    ):
        answer_chunk = chunk.get("answer")
        if answer_chunk:
            yield answer_chunk

prompt = st.chat_input("Hey, What's up?")

if prompt is not None and prompt != "":
    st.session_state.chat_history.append(HumanMessage(prompt))
    with st.chat_message("Human"):
        st.markdown(prompt)

    if len(pc.list_indexes()) == 0:
        st.error("Please upload some files first!")
    else:
        with st.chat_message("AI"):
            ai_response = st.write_stream(generate_response(prompt))

        st.session_state.chat_history.append(AIMessage(ai_response))

Hi @dataprofessor, thanks for replying. I tried the method you mentioned, but it doesn’t work, perhaps because of the way my output gets streamed. I’ve attached my code snippet. Can you guide me on how to add this feature to my app?
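One thing worth noting: st.write_stream already renders the accumulated text as Markdown, so if headings never appear, the model is likely not emitting Markdown syntax in the first place. A common fix is instructing the model (in chatPrompt) to format answers with `#`/`##` headings. The sketch below shows how chunked output reassembles into renderable Markdown even when the syntax is split across chunks; fake_answer_stream is a hypothetical stand-in for generate_response:

```python
# Hypothetical stream: chunks as they might arrive from the RAG chain.
# st.write_stream() concatenates such chunks and re-renders the running
# text as Markdown, so heading syntax emitted by the LLM displays large
# even when a chunk boundary falls mid-token.
def fake_answer_stream():
    for chunk in ["# Sum", "mary\n\n", "## Detail", "s\n\nBody text."]:
        yield chunk

# What st.write_stream would accumulate and render as Markdown:
rendered = "".join(fake_answer_stream())
print(rendered)
```

In the app itself this is just `st.write_stream(generate_response(prompt))`; the only change needed is on the prompt side, asking the LLM to answer in Markdown.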

This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.