How to deploy Streamlit LLM with Vertex AI?


I’m trying to deploy a Streamlit app that uses LangChain’s OpenAI and VertexAI integrations. Is there any way to do so without exposing my Google Cloud service-account credentials (the .json key file)?

Steps to reproduce

Code snippet:

from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.llms import VertexAI
from langchain.prompts import ChatPromptTemplate

prompt_default = ChatPromptTemplate.from_messages([
    ("human", "{input}"),  # placeholder; the actual messages were truncated in this post
])

if st.session_state["llm_provider"] == "OpenAI":
    llm_default = ChatOpenAI(model="gpt-3.5-turbo", temperature=0.6, openai_api_key=openai_api_key, streaming=True)
elif st.session_state["llm_provider"] == "VertexAI":
    llm_default = VertexAI(temperature=0.4, max_output_tokens=512)

chain_default = LLMChain(llm=llm_default, prompt=prompt_default, memory=memory, verbose=True)

Additional information

The app works in my local environment, where LangChain picks up the environment variable GOOGLE_APPLICATION_CREDENTIALS (a path to the local .json file). But I’m not sure how to handle this when deploying to Streamlit Community Cloud.

Many thanks!

Hi @AnhNgDo

If you’re deploying to Streamlit Community Cloud, you can use the built-in Secrets management to store those credentials: copy the contents of the .json file into the Secrets text box (in the settings of the deployed app), then read them at runtime via st.secrets instead of pointing at a file on disk.
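The Secrets box expects TOML rather than JSON, so the key file has to be reshaped before pasting. As a minimal sketch, the stdlib-only helper below (a hypothetical function, not part of Streamlit, and the section name `gcp_service_account` is just a convention) converts a service-account JSON string into a TOML section; JSON string escapes are also valid TOML, so the `private_key` survives intact:

```python
import json

def service_account_json_to_toml(json_str: str, section: str = "gcp_service_account") -> str:
    """Render a service-account JSON key file as a TOML section suitable
    for pasting into the Community Cloud Secrets text box."""
    info = json.loads(json_str)
    lines = [f"[{section}]"]
    for key, value in info.items():
        # json.dumps quotes and escapes each value in a TOML-compatible way
        lines.append(f"{key} = {json.dumps(value)}")
    return "\n".join(lines)

# In the deployed app, rebuild the credentials object from st.secrets instead
# of relying on GOOGLE_APPLICATION_CREDENTIALS (assumes google-auth is
# installed, and that your LangChain version's VertexAI accepts a
# `credentials` argument -- check your version if this errors):
#
#   from google.oauth2 import service_account
#   credentials = service_account.Credentials.from_service_account_info(
#       dict(st.secrets["gcp_service_account"])
#   )
#   llm_default = VertexAI(temperature=0.4, max_output_tokens=512,
#                          credentials=credentials)
```

This keeps the key out of your repository entirely; the Secrets box contents are only exposed to the running app.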

Hope this helps!

This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.