What is the best strategy for caching a large volume of data?

We’ve built an app that runs semantic searches across approximately 200 pre-processed documents stored as pickle files, each 50–100 MB in size. Five active users access these files from Azure blob storage, each loading anywhere from 20 files to the entire set. Since the list of files is constant, we’d like to load and cache all 200 of them once. What’s the optimal caching method, and could you point us to code examples? Recently our app’s performance has slowed down: because each user’s session state holds its own copies, we sometimes end up with the data of ~1000 files in session state…
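For context, here is a minimal sketch of the kind of process-wide caching we have in mind, so that each pickle is deserialized at most once per process rather than once per user session. This is a hypothetical illustration using `functools.lru_cache` and a local file read in place of the Azure blob download; `load_document` is a made-up name, not code from our app:

```python
import functools
import pickle

# Hypothetical sketch: cache deserialized documents at the module level,
# shared across all user sessions in the same process.
@functools.lru_cache(maxsize=200)  # one slot per document in the fixed set
def load_document(path: str):
    # In the real app this would download `path` from Azure blob storage;
    # a plain local read keeps the sketch self-contained.
    with open(path, "rb") as f:
        return pickle.load(f)
```

With this, repeated calls for the same path return the already-loaded object instead of re-reading and re-unpickling the file.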