The page cannot handle displaying 120,000 records

When the number of records in the dataframe reaches 120,000, the page loads very slowly and sometimes hangs entirely. The code is as follows:
```python
import streamlit as st
import numpy as np
import pandas as pd

# Title: "This is a web app built with the Streamlit framework\n @Digital Audit"
st.title("这是用streamlit Web框架启动的web应用\n @数字化审计")


def load_contract():
    # utf_8_sig handles the BOM in CSV files saved with Chinese locales
    df = pd.read_csv(filepath, encoding='utf_8_sig')
    return df


# filepath, htType_list, and dwmc_list are defined elsewhere in the script
df = load_contract()

# Sidebar filters: "Select contract type" / "Select company name"
event_htType = st.sidebar.selectbox("选择合同类别", htType_list)
event_dwmcType = st.sidebar.selectbox("选择公司名称", dwmc_list)

# Keep only rows matching the selected contract type (合同类别) and company name (公司名称)
part_df = df[(df["合同类别"] == event_htType) & (df["公司名称"] == event_dwmcType)]
```



Hi @hillstone,

Without seeing your data I’m not totally sure what’s going on here, but at a size of 120K records it’s going to be one of two things.

  1. Maybe part_df is very large, so your browser is having trouble rendering the page.
  2. Maybe creating part_df is very resource-intensive because your dataset is not indexed on the right fields.

Remember that Streamlit re-runs your script every time something happens on the front-end. So if you have any “greedy” operations that will fill your computer’s memory and CPU, this will cause a consistent slowdown.

You might want to wrap the creation of part_df in an @st.cache function and see if that helps!