Using caching with API calls and messy DataFrames

I’m working with a site’s API for data retrieval, and I have been using st.cache on the functions that make the calls, so new requests are only made when necessary (great functionality, thanks!). Currently, I am hoping to keep my fetching and cleaning functions separate.

The issue I am having right now, unfortunately, is that, from what I can tell, Streamlit is unable to hash (and therefore cache) the messy DataFrame that the API is returning to me.

My question is what the suggested workflow might be to use Streamlit effectively here. Will I only be able to cache the clean DataFrame? In that case, would I have to call a non-cached cleaning function from within my cached request function, so that the cleaning runs every time?

Here is my current output.

The first st.dataframe() is within my cached function and working as expected; the error is from a second st.dataframe() that sits outside my cached function so that, ideally, an st.dataframe() isn’t displayed every time I request something from the API.

Sorry for what is most likely an unclear and/or amateur question; thanks for the community platform and the opportunity to use Streamlit!

Hi J4th:

Thanks for that great question which is not amateurish at all! :slightly_smiling_face:

In general, you should absolutely be able to call a cached function from a cached function. That’s a great usage pattern!

It seems that you’ve uncovered a hashing bug related to Pandas. However, since Streamlit special-cases Pandas DataFrames when hashing, we might be able to fix this one in Streamlit too.

I’ve gone ahead and submitted a bug for you. Feel free to track it on GitHub if you’d like to follow its progress.

In the meantime, we’d suggest that you either (1) not cache the cleanup function, or (2) sanitize the DataFrame (by removing the lists) before passing it into the cached cleanup function.
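For option (2), a minimal sketch of the idea might look like this. (The `sanitize` helper and the column names are hypothetical, just to illustrate converting unhashable list cells into hashable tuples before the DataFrame reaches a cached function.)

```python
import pandas as pd

def sanitize(df: pd.DataFrame) -> pd.DataFrame:
    """Convert list-valued cells to tuples so the DataFrame becomes
    hashable for a caching layer such as st.cache."""
    out = df.copy()
    for col in out.columns:
        # Only rewrite columns that actually contain lists.
        if out[col].apply(lambda v: isinstance(v, list)).any():
            out[col] = out[col].apply(
                lambda v: tuple(v) if isinstance(v, list) else v
            )
    return out

# Example of the kind of "messy" API result that trips up hashing:
messy = pd.DataFrame({"id": [1, 2], "tags": [["a", "b"], ["c"]]})
clean = sanitize(messy)
```

You could then pass `clean` into your cached cleanup function, since tuples (unlike lists) can be hashed.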

Thanks for using Streamlit! :star:


p.s. We apologize for the delayed reply. We’re just setting up our discussion forums now. We should have faster response times going forward.

Awesome, thanks so much for the reply and submitting that bug for me; I added two others I had come across, so I apologise for the sudden influx on my part.

Happy Monday!

No worries. Bugs are great! :heart:

Thank you for filing the issues. I sent two PRs to fix them, and I expect the fixes will be included in the next release.