Hi there,
I’m creating a web app that lets users run some TensorFlow models I’ve created. Nothing new here; Streamlit would handle this perfectly. The catch is that I also want users to be able to train new models with their own data and even their own parameters. So a number of user-specific models will be created, which should be stored and secured behind authentication. I have some experience with Django, and it sounds like the framework that could handle both authentication and database management, as well as TensorFlow tasks like inference and training.
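To make that requirement concrete, here’s a rough sketch of how those user-specific models could be tracked on the Django side (the model and field names are just assumptions on my part, not a final design):

```python
# models.py (sketch): one row per user-trained model; names are placeholders
from django.conf import settings
from django.db import models

class TrainedModel(models.Model):
    """A TensorFlow model trained by a specific user."""
    owner = models.ForeignKey(
        settings.AUTH_USER_MODEL,          # ties each model to its owner for auth checks
        on_delete=models.CASCADE,
        related_name="trained_models",
    )
    name = models.CharField(max_length=100)
    artifact_path = models.CharField(max_length=255)   # where the SavedModel lives on disk
    hyperparameters = models.JSONField(default=dict)   # training parameters chosen by the user
    created_at = models.DateTimeField(auto_now_add=True)

    def __str__(self):
        return f"{self.owner} / {self.name}"
```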
My proposed architecture would be a Streamlit frontend connected to a Django backend with 3 applications: one for general backend tasks and communication with the frontend and the DB (maybe via REST), one for TensorFlow-related tasks, and one for authentication. I thought of this because that way I could hypothetically deploy another Streamlit frontend for another data science project while reusing the same Django backend for both. My question is how I should containerize this system with Docker. I was thinking of one container for the Streamlit frontend (and, if I have several frontends in the future, one container for each), one for Django, and one for the database. Oh, and this is to be deployed on a single local machine. Below is a diagram of the proposed architecture:
Is this too many containers, or too few? I don’t know what the standard approach is here.
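To make the container layout concrete, this is roughly the docker-compose I have in mind (service names, ports, and the Postgres image are assumptions on my part, not settled choices):

```yaml
# docker-compose.yml (sketch): one container per frontend, one for Django, one for the DB
version: "3.8"

services:
  frontend:
    build: ./streamlit_app      # Streamlit frontend; add more services like this for future frontends
    ports:
      - "8501:8501"
    environment:
      - BACKEND_URL=http://backend:8000
    depends_on:
      - backend

  backend:
    build: ./django_backend     # Django project holding the 3 apps (core/REST, TensorFlow, auth)
    ports:
      - "8000:8000"
    environment:
      - DATABASE_URL=postgres://app:app@db:5432/app
    depends_on:
      - db
    volumes:
      - model_store:/models     # user-trained SavedModels persist outside the container

  db:
    image: postgres:15          # assuming Postgres; any Django-supported DB would work
    environment:
      - POSTGRES_USER=app
      - POSTGRES_PASSWORD=app
      - POSTGRES_DB=app
    volumes:
      - db_data:/var/lib/postgresql/data

volumes:
  model_store:
  db_data:
```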
Thanks in advance!