The Python snippet is:
for delta in openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    engine="gpt35uc",
    messages=[{"role": m["role"], "content": m["content"]} for m in st.session_state.messages],
    stream=True,
):
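For reference, in openai >= 1.0 the streaming call returns an iterator of typed chunk objects rather than dicts, with the text in chunk.choices[0].delta.content. A minimal sketch of the accumulation pattern, using a hypothetical stand-in generator in place of a live API call (the real chunks come from client.chat.completions.create(..., stream=True)):

```python
from types import SimpleNamespace

# Hypothetical stand-in that mimics the shape of openai>=1.0 stream chunks:
# each chunk has .choices[0].delta.content, which may be None.
def fake_stream():
    for piece in ["Hel", "lo", None]:
        yield SimpleNamespace(
            choices=[SimpleNamespace(delta=SimpleNamespace(content=piece))]
        )

# Accumulate the streamed text, skipping empty deltas.
full = ""
for chunk in fake_stream():
    delta = chunk.choices[0].delta.content
    if delta:
        full += delta

print(full)  # Hello
```

In a Streamlit app you would typically append each delta to a placeholder via st.empty() instead of building a string, but the chunk shape is the same.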
I selected openai module version 1.25 and the others as shown in the screenshot. Since the code works when I run it locally, I packaged the required libraries in a zip file and uploaded it to the package stage in the corresponding schema, but it still doesn't work.
Hey Winnie, I was having the same issue. I don't write any of my own code; I use ChatGPT to write it. I found this on a forum and pasted it back as a response to the code ChatGPT gave me, and it would correct the code until it became more complicated, at which point it would refuse. Here's the response I found:
It seems you're using the earlier syntax for the ChatCompletion API. The syntax was changed from

response = openai.ChatCompletion.create(...)

to

client = OpenAI(api_key='...')
response = client.chat.completions.create(...)

Additionally, this line:

question = response['choices'][0]['message']['content'].strip()

needs to be modified to:

question = response.choices[0].message.content.strip()
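To make the access-pattern change concrete: in openai >= 1.0 the response is a typed object, so dict-style indexing no longer works and you use attribute access instead. A small sketch using types.SimpleNamespace as a stand-in for the real response object (no API call involved; a real response comes from client.chat.completions.create(...)):

```python
from types import SimpleNamespace

# Hypothetical stand-in shaped like a 1.x chat completion response.
response = SimpleNamespace(
    choices=[SimpleNamespace(message=SimpleNamespace(content=" Hello! "))]
)

# 0.x style, now broken: response["choices"][0]["message"]["content"]
# 1.x style, attribute access:
question = response.choices[0].message.content.strip()

print(question)  # Hello!
```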
When that response was no longer working, I ran my code through Claude and that corrected the issue.
It's weird that ChatGPT can't write code in its own syntax.
Anyway, I don't understand much of this. Hope this helps!
You can run openai migrate to automatically upgrade your codebase to the 1.0.0 interface. Alternatively, you can pin your installation to the old version.
Actually, I see the old version of openai, 0.27.4, is available in the Streamlit package list, so I could also choose that one. If I wanted to run openai migrate instead, how can I do it directly in the Streamlit app within my Snowflake account?
The other issue might be due to a firewall. My company's Snowflake account on Azure might be blocking the OpenAI endpoint. Error message: APIConnectionError: Error communicating with OpenAI. Could that be the case?
Did anyone resolve this? From the way it reads, it appears to be an issue with LangChain rather than Streamlit. However, I updated to the latest LangChain and OpenAI versions and still see the problem. It only happens when connecting to GPT-4 for me.
If you encounter an error in your deployed Streamlit app that doesn’t appear in your Codespace, the issue may lie in your requirements.txt file. Ensure that the file specifies the correct library versions you need.
For example, to use version 0.28 of the OpenAI library, your requirements.txt should include:
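```
openai==0.28
```

After updating requirements.txt, reboot or redeploy the app so the pinned version is reinstalled; a deployed app typically keeps its previously resolved dependencies until then.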