Regarding questions 1 and 3, how are you inspecting memory usage in the browser? If you open Chrome DevTools, there's a Memory tab that shows the current JS heap size for the browser tab.
Yes, I see some increase in memory usage on my end, but not much, given that the sample data is small. If you're sending a million data points to the browser, it may simply be using more memory than the tab can handle, which would crash it.
I believe the maximum memory per tab is either 2 GB or 4 GB.
If you go to the Console tab in Chrome DevTools, you should see some output that says `Protobuf: ...`. These entries represent the data sent for the elements in your report. Given the current code, the last one corresponds to the graph at the bottom. If you drill in a bit, you can see the spec data for the chart as well as the size of the data, and that we're sending over 1.3 MB of data per chart just for this sample dataset.
Actually, I believe this output is only shown when Streamlit is installed in development mode, which means you may have to install it from source rather than from PyPI.
It seems that it's taking one to two seconds to send the 1.3 MB protobuf over the websocket, which is causing the small delay you're seeing before the graph loads. Rendering the graph after the data arrives also takes some time, so it's likely a combination of the two.
At a certain point, too many data points is simply too many data points. Is it possible for you to coalesce the data in some way? I imagine you don't need to plot the chart at the same granularity as the raw data…?
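For example, if your data is a time series, pandas' `resample` can aggregate it into coarser buckets before you hand it to the chart. A quick sketch with made-up data (the column name, frequency, and bucket size here are just assumptions, not from your code):

```python
import numpy as np
import pandas as pd

# Hypothetical dataset: one million points sampled once per second.
idx = pd.date_range("2023-01-01", periods=1_000_000, freq="s")
df = pd.DataFrame({"value": np.random.randn(1_000_000).cumsum()}, index=idx)

# Aggregate into one-minute buckets (mean of each bucket) before plotting.
# This cuts ~1,000,000 rows down to ~17,000, shrinking the protobuf
# proportionally while keeping the overall shape of the curve.
downsampled = df.resample("1min").mean()

print(len(df), "->", len(downsampled))
```

Then you'd pass `downsampled` to your chart call instead of the raw frame. At typical screen resolutions a chart only has a few thousand pixels of width anyway, so anything beyond a few thousand points per series is invisible to the viewer.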