How to add a system prompt

Hi

How can a system prompt be added to a Streamlit chat when using an LLM? (I'm using llama3.2 through Ollama.)

I tried adding the system prompt to st.session_state.messages, but it just gets displayed on the screen and doesn't modify the LLM's behaviour.

Thanks

You can include an if-statement check to prevent the system message from showing up in the chat:


if 'messages' not in st.session_state:
    st.session_state.messages = [{'role': 'system', 'content': 'You are a helpful assistant.'}]

for message in st.session_state.messages:
    if message['role'] != 'system':
        with st.chat_message(message['role']):
            st.markdown(message['content'])
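The filtering in the loop above can also be factored into a small helper that is independent of Streamlit, so the same history list can be reused both for rendering and for the model call. This is just a sketch; the function name is illustrative, not part of Streamlit's API:

```python
def visible_messages(history):
    """Return only the messages that should be rendered in the chat UI.

    System messages stay in `history` (so the LLM still receives them)
    but are skipped when drawing the chat.
    """
    return [m for m in history if m["role"] != "system"]


# Usage with the loop above:
# for message in visible_messages(st.session_state.messages):
#     with st.chat_message(message["role"]):
#         st.markdown(message["content"])
```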

Right, but it still doesn't work as a system prompt :frowning:

Are you passing all the messages, including the system message, to the LLM? Below is an example using the OpenAI API.

openai.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": m["role"], "content": m["content"]} for m in st.session_state.messages],
)

Hi

I finally made it work:

prompt = [
    {"role": "system", "content": "You are another boring chatbot that needs to help me"}
]
prompt += st.session_state["messages"]

stream = ollama.chat(
    model=st.session_state["model"],
    messages=prompt,
    stream=True,
)
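The approach above can be wrapped in a tiny helper that prepends the system prompt to the stored history just before each model call, without mutating the session history itself. A minimal sketch; the helper name and the prompt text are just examples:

```python
# Hypothetical helper: build the message list sent to the model each turn.
# The system prompt is prepended on the fly, so it never appears in the
# stored chat history (and therefore never in the rendered chat).
SYSTEM_PROMPT = {
    "role": "system",
    "content": "You are another boring chatbot that needs to help me",
}


def build_llm_messages(history):
    """Return a new list: system prompt first, then the chat history."""
    return [SYSTEM_PROMPT] + list(history)


# Usage with ollama:
# stream = ollama.chat(
#     model=st.session_state["model"],
#     messages=build_llm_messages(st.session_state["messages"]),
#     stream=True,
# )
```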

Maybe this should be added to the docs?

Great!