Thanks @thiago, I am getting there.
To check if I understood correctly…
My assumptions: a Streamlit app running under nginx. No authentication mechanism (no userId, no login, …). Users upload a sensitive CSV dataset they do not want anybody else to see. The app uses many instances of st.cache to improve the performance of different functions. The sensitive user data is always one of the input parameters of the cached functions.
If a user A uploads a dataset, this dataset cannot be accessed by user B, even if user B is a hacker. Great.
The global st.cache content that “belongs” to user A is not accessible to user B (since user B cannot access user A’s dataset, and therefore cannot call the function with the same parameters). Great.
User A and user B are working at the same time, with different datasets, on the Streamlit app. All is good and neither can see the other user’s file, whether they are hackers or not.
At one point, user A, for whatever reason, clears the cache, which, if I understand correctly, also deletes the entries in the global st.cache tied to user B.
Will user B then experience a loss in performance until the cache rebuilds itself?
If this is true (I hope not), then once you have enough users, statistically there will almost always be someone clearing the cache, so the cache will be more or less permanently empty? Any ideas on how to avoid this, assuming it is true?
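For what it’s worth, here is the mental model I’m assuming, sketched with Python’s functools.lru_cache as a stand-in for st.cache (the function name and the lru_cache analogy are mine, not Streamlit’s API):

```python
import functools
import hashlib

# Hypothetical stand-in for an @st.cache-decorated function: one
# process-global cache keyed on the function's arguments, so two users
# only share an entry if they pass byte-identical data.
@functools.lru_cache(maxsize=None)
def expensive_transform(csv_bytes: bytes) -> str:
    return hashlib.sha256(csv_bytes).hexdigest()

digest_a = expensive_transform(b"col\n1\n2\n")  # user A's dataset
digest_b = expensive_transform(b"col\n3\n4\n")  # user B's dataset
# Two distinct entries now live in the single shared cache.

# What I understand "clearing the cache" to do: wipe ALL entries,
# including user B's, not just the caller's own.
expensive_transform.cache_clear()
```

If that picture is right, one user hitting “clear cache” throws away every other user’s warm entries too, which is exactly the scenario I’m worried about above.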
Thanks for the clarification!