MySQL Connector error in my app. It works when I reboot the app, but after reloading it 2-3 times it gives me the error I have mentioned below.

mysql.connector.errors.OperationalError: 2013 (HY000): Lost connection to MySQL server during query

Hey @minutetolearn,

Can you share a link to your app and your GitHub repo? There are a few different things that could be going on here.

app link is -

I can't share my Git repo because this is my business app.

Unfortunately, I think it’s going to be hard for folks to figure out the root cause without seeing the code. That said, this page is pretty helpful for this error. Notably:

Usually it indicates network connectivity trouble and you should check the condition of your network if this error occurs frequently. If the error message includes “during query,” this is probably the case you are experiencing.

Sometimes the “during query” form happens when millions of rows are being sent as part of one or more queries. If you know that this is happening, you should try increasing net_read_timeout from its default of 30 seconds to 60 seconds or longer, sufficient for the data transfer to complete.
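If you want to try that, you can raise net_read_timeout for just your session right after connecting. A minimal sketch – the helper name and the 120-second value are my own choices, not something from your app or the docs:

```python
def raise_net_read_timeout(cursor, seconds=120):
    # Ask the server to wait longer for data on this session's queries.
    # The server default is 30 seconds; SET SESSION only affects the
    # current connection, so no server-wide config change is needed.
    cursor.execute("SET SESSION net_read_timeout = %s", (seconds,))
```

You would call this once with your existing cursor, right after `conn = mysql.connector.connect(**config)`.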

My database code is pretty simple:

import mysql.connector
import datetime
import streamlit as st

config = {
    'user': st.secrets['Main_Database']['username'],
    'password': st.secrets['Main_Database']['password'],
    'host': st.secrets['Main_Database']['host'],
    'database': st.secrets['Main_Database']['database'],
    'port': st.secrets['Main_Database']['port'],
    'raise_on_warnings': True,
    'connection_timeout': 10
}

conn = mysql.connector.connect(**config)
cursor = conn.cursor()

def get_branches():
    cursor.execute("SELECT Branch_Name, Id FROM branch")
    branches = [(row[0], row[1]) for row in cursor.fetchall()]
    return branches

def get_sems(branch_id):
    cursor.execute("SELECT Sem_Name, Id FROM sem WHERE Branch_Id = %s", (branch_id,))
    sems = [(row[0], row[1]) for row in cursor.fetchall()]
    return sems

def get_subjects(sem_id):
    cursor.execute("SELECT Subject_Name, Id FROM subject WHERE Sem_Id = %s", (sem_id,))
    subjects = [(row[0], row[1]) for row in cursor.fetchall()]
    return subjects

def get_chapters(subject_id):
    cursor.execute("SELECT Chapter_Name, Id FROM chapter WHERE Subject_Id = %s", (subject_id,))
    chapters = [{"Chapter_Name": row[0], "Chapter_Id": row[1]} for row in cursor.fetchall()]
    return chapters

def get_pdf_link(chapter_id):
    cursor.execute("SELECT Link FROM chapter WHERE Id = %s", (chapter_id,))
    pdf_link = cursor.fetchone()

    if pdf_link is not None:
        return pdf_link[0]
    print(f"No link found for chapter ID: {chapter_id}")
    return None

def insert_record(user_messages, bot_responses, chapter_id):
    # The timestamp arguments were cut off in the original post;
    # reconstructed here as "now" for both Created_At and Updated_At.
    now = datetime.datetime.now()
    cursor.execute("INSERT INTO chat (Question, Response, Chapter_Id, Created_At, Updated_At) VALUES (%s, %s, %s, %s, %s)",
                   (user_messages, bot_responses, chapter_id, now, now))
    conn.commit()

def update_chat(chat_id, like_dislike):
    cursor.execute("UPDATE chat SET Reaction = %s WHERE Id = %s", (like_dislike, chat_id))
    conn.commit()

The database has hardly 5-10 rows; it is still in the testing phase, and the MySQL DB is hosted with GoDaddy.

Have you tried increasing connection_timeout?

Yes, you can see it's there at the top.

What value did you try increasing it to?

For now, only 10.

Locally it's working smoothly.

I’d recommend increasing it beyond that and seeing if that resolves the issue

mysql.connector.errors.ProgrammingError: 1045 (28000): Access denied for user 'aj2790'@'' (using password: YES)

Also, despite adding 2-3 IP addresses, a new one keeps appearing every now and then. Is there a way to pin a single IP?

Unfortunately, if you need to allow-list the outbound IP addresses, a different hosting platform is likely the way to go, since we can’t provide a list of stable outbound IP addresses.

Did increasing connection_timeout resolve the other error?

No, increasing the timeout didn't help.

What did you increase it to?

Now I am planning to grant all IPs (with password) to check.

I would try at least 180 (that would be three minutes) and see if that changes anything. There are a few other timeout-related settings you could try changing as well – check out this guide.
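Concretely, that means changing the config to something like the following – the credential values here are placeholders (in the real app they come from st.secrets), and 180 is just the suggested starting point:

```python
config = {
    "user": "USER",            # placeholder – use st.secrets in the real app
    "password": "PASSWORD",    # placeholder
    "host": "HOST",            # placeholder
    "database": "DB",          # placeholder
    "port": 3306,
    "raise_on_warnings": True,
    "connection_timeout": 180,  # three minutes instead of 10 seconds
}
```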

This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.