Project Insight
Project Code
Introduction
Project Insight is designed to deliver NLP as a service, with a code base for both the front-end GUI (Streamlit) and the backend server (FastAPI), demonstrating the usage of transformer models on various downstream NLP tasks. A minimal sketch of one such service follows the task list below.
The downstream NLP tasks covered:
- News Classification
- Entity Recognition
- Sentiment Analysis
- Summarization
- Information Extraction
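To make the service idea concrete, here is a minimal sketch, assuming a Hugging Face transformers pipeline behind a FastAPI route; the route path and request schema are illustrative, not the project's actual code.

```python
# Minimal sketch of a single NLP service; the route and schema are
# illustrative assumptions, not Project Insight's actual implementation.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI(title="Sentiment service (sketch)")

# Load a default sentiment-analysis pipeline once at startup.
sentiment = pipeline("sentiment-analysis")

class TextIn(BaseModel):
    text: str

@app.post("/api/v1/sentiment")
def classify(payload: TextIn):
    # Run inference and return the label and score from the pipeline.
    result = sentiment(payload.text)[0]
    return {"label": result["label"], "score": float(result["score"])}
```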
To Do
The user can select different models from the drop-down to run inference.
Users can also call the backend FastAPI server directly for command-line inference.
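For example, a command-line call against the backend could look like the following; the host, port, route, and payload shape are assumptions for illustration, so check the project README for the actual endpoints.

```python
# Hypothetical command-line client; URL and payload schema are assumed,
# not the project's documented API.
import requests

API_URL = "http://localhost:8080/api/v1/classification/predict"

payload = {"text": "Stocks rallied after the central bank held rates steady."}
response = requests.post(API_URL, json=payload)
response.raise_for_status()
print(response.json())
```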
Features of the solution
- Python Code Base: Built using FastAPI and Streamlit, keeping the complete code base in Python.
- Expandable: The backend is designed so that more Transformer-based models can be added, and they become available in the front-end app automatically (see the registry sketch below).
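One common way to get that kind of expandability (a sketch under assumptions, not necessarily how this project implements it) is a shared model registry that both the FastAPI backend and the Streamlit drop-down read from:

```python
# Illustrative model registry; names and structure are assumptions,
# not Project Insight's actual code.
from transformers import pipeline

# Adding a new entry here is enough to expose the model everywhere
# the registry is read, including a front-end drop-down.
MODEL_REGISTRY = {
    "distilbert-sentiment": {
        "task": "sentiment-analysis",
        "model": "distilbert-base-uncased-finetuned-sst-2-english",
    },
    "bart-summarizer": {
        "task": "summarization",
        "model": "facebook/bart-large-cnn",
    },
}

def load_model(name: str):
    """Build a transformers pipeline from a registry entry."""
    entry = MODEL_REGISTRY[name]
    return pipeline(entry["task"], model=entry["model"])
```

The Streamlit side could then populate its selector with something like `st.selectbox("Model", list(MODEL_REGISTRY))`, so new backend models appear in the GUI without front-end changes.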
Hello @Abhishek_Mishra, welcome to the community!
This looks amazing!
Thanks a lot for sharing! I love the combination of Streamlit + FastAPI (I have internal projects on the Streamlit/FastAPI/Postgres stack xD), and I’ve always wanted to try Hugging Face, so I’ll definitely try this. And it looks very well documented.
In the README of the project, can you add the command to run the Docker container so anyone can just copy/paste it from there? You could also add a docker compose file at the root of the project so running it becomes a one-liner.
Awesome work,
Fanilo
Hi @andfanilo, thank you for the feedback and recommendations.
I will add the Docker commands to the README.
This is my first rodeo with Docker, so I am not very well versed with docker compose yet; it might take a while.
Any feedback on improvements, and any contributions, are most welcome!
Hi team,
I have updated the project. The backend is now designed with a microservices architecture in mind, with every NLP service running its own server, tied together with nginx.
@andfanilo, I have added instructions to run the application, with docker compose to spin up all the services.
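As a rough illustration of that layout, the front end might address each per-task service through a single nginx gateway by path prefix; the gateway URL and route names below are assumptions, not the project's actual configuration.

```python
# Hypothetical gateway routing: each NLP microservice is assumed to sit
# behind nginx under its own path prefix. Routes and port are illustrative.
import requests

GATEWAY = "http://localhost:8080/api/v1"

SERVICES = {
    "News Classification": "classification",
    "Sentiment Analysis": "sentiment",
    "Summarization": "summary",
}

def predict(task: str, text: str):
    """Send the text to the microservice handling the selected task."""
    url = f"{GATEWAY}/{SERVICES[task]}/predict"
    response = requests.post(url, json={"text": text})
    response.raise_for_status()
    return response.json()
```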