Very Large Datasets

Hi all. I am trying to visualise a large dataset (potentially millions of polygons) on a map. I am using a pydeck PolygonLayer with a pandas DataFrame; in my case the data is an unstructured mesh. It seems that there is a limit to the size that can be viewed: beyond it, I get a pop-up with the following message.

“Bad message format
value is null”

My layer is

import pandas as pd
import pydeck as pdk

data = pd.read_pickle('./gdata.pkl')
elev = pdk.Layer(
    'PolygonLayer',
    data=data,
    get_polygon='coordinates',
    filled=True,
    get_fill_color=[34, 44, 90, 255],
    pickable=True,
)

Is there any way that I can overcome this? Is it a Streamlit issue or a pydeck issue? Does it depend on the available hardware?

Hi @brey, welcome to the Streamlit community!

It sounds like you might be running into this issue:

This happens because we transfer the data across a websocket to pass it over to the JavaScript side for rendering. How much data (in terms of polygons) are you looking to pass to the map?
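Since the whole frame is serialised and sent over the websocket, the per-polygon payload size is what matters. A rough back-of-the-envelope sketch, using a hypothetical miniature version of the DataFrame described in this thread (the column names match the post; the exact serialisation pydeck uses may differ, so treat this as an estimate only):

```python
import pandas as pd

# Miniature stand-in for the mesh DataFrame: one row per cell,
# with a list of [lon, lat] vertices, a scalar, and an RGBA color.
df = pd.DataFrame({
    "coordinates": [
        [[39.3699493, 47.2716255], [39.3896141, 47.2261314], [39.457634, 47.2116]],
        [[39.3896141, 47.2261314], [39.457634, 47.2116], [39.3824272, 47.1758614]],
    ],
    "max_elev": [-0.008906, -0.008672],
    "color": [[41, 148, 0, 255], [41, 149, 0, 255]],
})

# JSON size is a rough proxy for the per-polygon message cost.
payload = df.to_json(orient="records")
bytes_per_polygon = len(payload.encode("utf-8")) / len(df)
print(f"~{bytes_per_polygon:.0f} bytes per polygon")
# X bytes/polygon * 1e6 polygons = X MB for a million polygons
print(f"~{bytes_per_polygon:.0f} MB for a million polygons")
```

At a couple of hundred bytes per polygon, millions of polygons quickly add up to hundreds of megabytes in a single message, which is well beyond what a websocket frame limit will tolerate.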


Well, I would like to depict a global unstructured mesh with millions of polygons. I tried to split the mesh into two smaller parts, but it didn't work. Also, I am using a column to reference the fill color (it represents a relevant scalar quantity), so the pandas DataFrame is

   coordinates                                         max_elev   color
0  [[39.3699493, 47.2716255], [39.3896141, 47.226...  -0.008906  [41, 148, 0, 255]
1  [[39.3896141, 47.2261314], [39.457634, 47.2116...  -0.008672  [41, 149, 0, 255]
2  [[39.3824272, 47.1758614], [39.457634, 47.2116...  -0.007819  [41, 149, 0, 255]

and I am setting get_fill_color='color' in the layer.
I have noticed that setting get_fill_color to a constant allows for more polygons.
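That observation is consistent with the payload explanation: a per-row color column has to be serialised and shipped with every polygon, whereas a constant fill color is sent once. For reference, here is one way the color column could be built from the scalar; this helper and its green-to-red ramp are illustrative assumptions, since the thread does not show the actual colormap used:

```python
import numpy as np
import pandas as pd

def elev_to_rgba(values, cmin=None, cmax=None):
    """Map a scalar column to per-row [R, G, B, A] lists (green -> red ramp).

    Illustrative helper only; not part of pydeck or the original post.
    """
    v = np.asarray(values, dtype=float)
    cmin = v.min() if cmin is None else cmin
    cmax = v.max() if cmax is None else cmax
    # Normalise to [0, 1], guarding against a constant column.
    t = (v - cmin) / (cmax - cmin) if cmax > cmin else np.zeros_like(v)
    r = (255 * t).astype(int)
    g = (255 * (1 - t)).astype(int)
    return [[int(ri), int(gi), 0, 255] for ri, gi in zip(r, g)]

df = pd.DataFrame({"max_elev": [-0.008906, -0.008672, -0.007819]})
df["color"] = elev_to_rgba(df["max_elev"])
# The layer then references the column by name: get_fill_color='color'.
```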

@randyzwitch Any news on this one? How about the HTTP solution used for media?

Thanks for following up, @brey. Any chance you've tried a newer version of Streamlit? (0.62 is the newest.) We fixed a bug about a month ago that was causing issues with large file sizes (affecting file_uploader), but I wouldn't be surprised if that bug was also affecting what you are seeing: