Using GSheetsConnection: conn.read() does not return the latest data

I’m building a chatbot and want to count the number of tokens and the cost of each user call to the LLM, using GSheetsConnection to record that information. However, conn.read() seems to return the spreadsheet as it was when the app first ran, and does not reflect rows added afterwards. Because of the nature of my project, I need conn.read() to return the latest version of the Google spreadsheet every time the user asks a question. Is there any way to do this?

Here is the relevant code:

user_query = st.chat_input("Your question")
if user_query:  # covers both None and the empty string
    load_db = FAISS.load_local(FAISS_PATH, OpenAIEmbeddings(), allow_dangerous_deserialization=True)
    context = load_db.max_marginal_relevance_search(user_query, k=3)
    context_text = "\n\n---\n\n".join([doc.page_content for doc in context])
    
    st.session_state.chat_history.append(HumanMessage(user_query))
    
    with st.chat_message("Human"):
        st.markdown(user_query)
        
    with st.chat_message("AI"):
        ai_response = st.write_stream(stream_response(get_response(user_query, st.session_state.chat_history, context_text)))
            
    st.session_state.chat_history.append(AIMessage(ai_response))
    
    # record token usage and cost in Google Sheets
    conn = st.connection("gsheets", type=GSheetsConnection)
    df = conn.read(worksheet="Sheet1", usecols=[0, 1, 2, 3])
    df_cleaned = df.dropna()
    total_cost, total_tokens = get_token_info_brute(user_query, st.session_state.chat_history, context_text)
    new_data = {"Time": strftime("%Y-%m-%d %H:%M:%S", localtime()), "Email": st.session_state.email, "Total_Tokens": total_tokens, "Total_Cost": total_cost}
    df_new = pd.DataFrame(new_data, index=[0])
    df_concat = pd.concat([df_cleaned, df_new], ignore_index=True)  # ignore_index avoids duplicate row indexes
    conn.update(worksheet="Sheet1", data=df_concat)
    st.dataframe(df_concat)


Never mind, I switched over to MongoDB to store my data instead, but if anyone knows the solution to this, please let me know!