Deployment issue with an OpenAI LLM app

Hi everyone, I am currently working on a Streamlit project where I am facing the following issue:

The app runs locally as expected, but when I deploy it to Streamlit's servers, I get an error message.

For reference, here's my Python code for that page, taken straight from the LLM documentation (LLM quickstart - Streamlit Docs).

I have tried the following things:

  1. Added my OpenAI API key to the secrets.
  2. Made my GitHub repo private, since I read that might have been causing an issue.
  3. Updated requirements.txt with all the dependencies I have used in the app.

None of these have resolved the issue. Could anyone please help me out? I would be grateful for any assistance.
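Regarding step 1, a minimal sketch of how a Streamlit app typically reads the key (the key name `OPENAI_API_KEY` and the environment-variable fallback are assumptions, not taken from the original post; `st.secrets` behaves like a dict, read from `.streamlit/secrets.toml` locally and from the app's Secrets panel on Streamlit Community Cloud):

```python
import os

def get_openai_key(secrets):
    """Return the OpenAI API key from a secrets mapping, else from the environment."""
    # Prefer the deployed secrets store (st.secrets on Streamlit Cloud);
    # fall back to an environment variable for local development.
    if "OPENAI_API_KEY" in secrets:
        return secrets["OPENAI_API_KEY"]
    return os.environ.get("OPENAI_API_KEY")

# Inside the app this would be called as: get_openai_key(st.secrets)
print(get_openai_key({"OPENAI_API_KEY": "sk-example"}))
```

If this returns `None` in the deployed app, the key was not saved under the name the code expects.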


Hi @ritabanm-99, thanks for posting…
Remove the hyphens ( - ) and extra spaces from your requirements.txt.
It should look like this:

```
streamlit
numpy
pandas
openai
langchain
```
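A quick check you can run before redeploying, a sketch assuming your file is named requirements.txt: it flags any line with a leading hyphen or surrounding whitespace, the formatting problems described above.

```python
def bad_requirement_lines(text):
    """Return lines that start with a hyphen or carry extra surrounding spaces."""
    bad = []
    for line in text.splitlines():
        if not line.strip():
            continue  # blank lines are fine
        # A leading "-" (e.g. "- numpy") or indentation breaks pip's parsing
        # of a plain package list.
        if line.lstrip().startswith("-") or line != line.strip():
            bad.append(line)
    return bad

# Example with two malformed lines:
sample = "streamlit\n- numpy\n pandas\nopenai\n"
print(bad_requirement_lines(sample))  # → ['- numpy', ' pandas']
```

To check your real file: `bad_requirement_lines(open("requirements.txt").read())` should return an empty list.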

Got it to work, thank you so much! This was bothering me for a couple of hours.

