My ChatGPT application is not working properly once deployed

I’m having an issue where my local application and deployed application are giving two different responses from ChatGPT. Even though it is working and able to get the response, the one deployed on the Streamlit server is not making sense, while the one on my local machine is giving the expected output.

I have confirmed that both codes are identical.
I was wondering if anyone else is facing or has faced a similar issue.
Thank you.

This is where I’m calling ChatGPT in the dashboard:

st.header('Smart Analysis 🤖')

# Add a text box
text = f"Analyzing this dataframe {ttdata}"  # ttdata is a pandas DataFrame I'm trying to analyze.
text_input = st.text_input("Enter your request here")

# Add a button; the API call runs only when it is clicked
if st.button("Submit"):
    st.write("Analyzing...")

    request_headers = {
        # requests sets Content-Type: application/json automatically when json= is used
        "Authorization": f"Bearer {openai_api_key}"
    }

    request_data = {
        "model": "text-davinci-003",
        "prompt": f"{text} {text_input} using the dataframe",
        "max_tokens": 500,
        "temperature": 0.5
    }

    response = requests.post(api_endpoint, headers=request_headers, json=request_data)

    if response.status_code == 200:
        st.text(response.json()['choices'][0]['text'])
    else:
        st.text(f"Request failed with status code: {response.status_code}")

Hey @Rasha_Salim,

I’d recommend playing around with the temperature parameter. From OpenAI’s docs:

What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.

We generally recommend altering this or top_p but not both.

It sounds like you want the output to be less random, so setting the temperature to 0.2 or even 0 might be a good place to start.
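As a sketch of what that change looks like in your request body (the model name, prompt shape, and other fields are taken from your original snippet; only `temperature` changes):

```python
# Request body for the completions call, with temperature pinned to 0.
# At temperature 0 the model favors the most likely token at each step,
# so repeated runs (local vs. deployed) should produce much more
# consistent output than at 0.5.
request_data = {
    "model": "text-davinci-003",
    "prompt": "Analyzing this dataframe ... using the dataframe",  # your prompt here
    "max_tokens": 500,
    "temperature": 0,  # was 0.5; lower = less random
}

print(request_data["temperature"])
```

Note the docs' advice above: adjust either `temperature` or `top_p`, not both at once.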
