Can't get environment variable during deployment

I’m building an LLM app using LangChain and OpenAI embeddings. It runs smoothly on localhost without any errors, but when I deployed the app to Streamlit Cloud, it shows me this error. Please suggest a solution.

Here is my GitHub code.

Hi @SoumyadeepOSD,

Thanks for posting!

You can use st.secrets to retrieve the API key from the .streamlit/secrets.toml file instead of using dotenv. You can follow this guide to add your API key to your deployed app.

Here’s another LangChain tutorial by @dataprofessor that you might find helpful as well.

import streamlit as st
from langchain_community.llms import OpenAI

# Read the key from Streamlit secrets instead of a .env file
# (assumes a secret named OPENAI_API_KEY in .streamlit/secrets.toml)
client = OpenAI(openai_api_key=st.secrets["OPENAI_API_KEY"])

Hi @SoumyadeepOSD, @tonykip. When he pushes this application to the cloud, I think there is no need for a local .streamlit folder with secrets.toml, since secrets are added through the deployed app’s settings instead. I have also noticed something on the documentation site about how to store secrets: there is a problem in the structure it shows, especially for OpenAI API keys. If we follow that procedure as written, the OpenAI team sends an email saying the key has been leaked.

Official documentation style:


In your code you would use:

import streamlit as st

But the corrected structure for storing OpenAI API keys is as follows:


In the application:

import streamlit as st

I’m requesting the community to correct this part of the documentation, especially for those working with OpenAI API keys.


Thanks for your clarification. I have made this app. Can you check it out and let me know about possible modifications and features to integrate?
👉 Chat with multiple PDF · Streamlit


Correct, I added that for him to use in the dev environment. There are different ways I have used to call keys from secrets in deployed apps with no issues. I also think it has to do with the changes OpenAI made to their client.
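One common pattern for making the same code work both locally (where the key lives in a .env file or environment variable) and on Streamlit Cloud (where it lives in secrets) is to fall back between the two sources. A sketch, where the helper name get_api_key and the secret name OPENAI_API_KEY are assumptions, not from this thread:

```python
import os

def get_api_key(secrets=None):
    """Return the OpenAI key from a Streamlit-style secrets mapping
    if present, otherwise fall back to the environment variable.

    Hypothetical helper: in a real app you would pass st.secrets
    as the `secrets` argument.
    """
    if secrets and "OPENAI_API_KEY" in secrets:
        return secrets["OPENAI_API_KEY"]
    return os.environ.get("OPENAI_API_KEY")
```

With this, the app does not need dotenv on Streamlit Cloud, and locally you can keep using environment variables without maintaining a secrets.toml file.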