Load massive data from SQL

Hi there, maybe it’s a stupid question, but I need to understand something.
I’m trying to load a massive table (~65 MB) from a PostgreSQL database via pandas for a dashboard, then summarize and plot it with Plotly. But it doesn’t work: the page loads endlessly and then I get a runtime error.
My question is: is it possible to load a massive table into a web app, or is it a local problem on my end? And is there any solution for loading the data for the final summary?

Hi @TeodorGottfried,

In general 65MB worth of data should not be a problem, but it depends on the details of how you are using it.

Here’s a simple example with some data that is a bit larger than 65 MB:

import numpy as np
import pandas as pd
import streamlit as st


@st.experimental_memo  # cache the result so the data is only built once per session
def get_data() -> pd.DataFrame:
    # 5 million rows x 3 float64 columns ≈ 120 MB in memory
    random_data = np.random.randn(5_000_000, 3)

    df = pd.DataFrame(random_data, columns=["a", "b", "c"])

    return df


st.write(get_data())

This runs fine for me locally.

So, the short answer is that it should be possible. One thing you might try is running your app with a small fraction of the data first; if that doesn’t work, start debugging to see where it’s failing.
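Since the goal is just a summary for the dashboard, one approach worth trying is to aggregate the table chunk by chunk with pandas’ `chunksize` option instead of holding the full table in memory at once. Here’s a minimal sketch of the idea; it uses an in-memory SQLite table and hypothetical names (`big_table`, columns `a`/`b`) as a stand-in, since I don’t know your schema — in your app you’d pass a SQLAlchemy PostgreSQL engine to `pd.read_sql` instead:

```python
import sqlite3  # stand-in for PostgreSQL; in the real app use a SQLAlchemy engine

import pandas as pd

# Hypothetical demo table; with PostgreSQL this would be something like
# engine = sqlalchemy.create_engine("postgresql://user:pass@host/dbname")
conn = sqlite3.connect(":memory:")
pd.DataFrame({"a": range(100), "b": range(100)}).to_sql(
    "big_table", conn, index=False
)

# Accumulate the summary chunk by chunk so only `chunksize` rows
# are in memory at any one time
total = 0
rows = 0
for chunk in pd.read_sql("SELECT a, b FROM big_table", conn, chunksize=10):
    total += chunk["a"].sum()
    rows += len(chunk)

print(rows, total)
```

If the summary itself is small, you can then cache just the aggregated result (with `st.experimental_memo`) and feed that to Plotly, rather than caching the whole raw table.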
