Deploying an mT5 model on the community plan

Hi,
I wanted to deploy an mT5 summarizer on Streamlit Cloud to share, but it won't work; it keeps crashing.

Does anyone know of an obvious reason why that might be the case?

Thanks in advance.

Can you share a github link?

Hi Franky1, here is the repo link I forked from another public repo:
https://github.com/IgnatiusEzeani/mT5-summarization-app (Streamlit app for summarizing news articles with mT5 & XLSum)

What is the error message?

I assume you also have to install an ML framework as an extra dependency:

transformers[tf-cpu]

Oh okay. But transformers is already being installed. Are you suggesting installing transformers[tf-cpu] as well?

Just put this in your requirements.txt file and try again:

transformers[tf-cpu]

No guarantee though; it may be that other extras are missing as well.
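For reference, a minimal requirements.txt for an app like this might look roughly like the following. The exact entries depend on what the app actually imports (mT5 tokenizers usually need sentencepiece, and the model needs a TensorFlow or PyTorch backend), so treat this as a sketch rather than a known-good list:

    streamlit
    transformers[tf-cpu]
    sentencepiece
    torch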


Thanks, I did that and I believe it helped. I also included sentencepiece and torch, and then got this:

The service has encountered an error while checking the health of the Streamlit app: Get "http://localhost:8501/healthz": dial tcp 127.0.0.1:8501: connect: connection refused

Could it be that the app is too much for the cloud resources available on the Community plan?

See my pull request.

  • The app installs on Streamlit Cloud
  • However, the app crashes during the initial run while downloading the models; the models seem to be quite big (see the sketch after this list for one possible workaround)
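One thing that sometimes helps within the Community plan's limits is caching the model load so it only happens once per container, and pointing at the smallest checkpoint that works for your use case. A rough sketch follows; the model name is only a guess based on the repo description (mT5 + XLSum), and the UI code is just an illustration, not what the repo actually does:

    import streamlit as st
    from transformers import pipeline

    # Load the summarization pipeline once per container and reuse it across reruns.
    @st.cache_resource
    def load_summarizer():
        # Assumed checkpoint based on the repo description; swap in whatever model
        # the app really uses, ideally the smallest one that works.
        return pipeline("summarization", model="csebuetnlp/mT5_multilingual_XLSum")

    summarizer = load_summarizer()

    text = st.text_area("Paste an article to summarize")
    if st.button("Summarize") and text:
        summary = summarizer(text, max_length=128, truncation=True)[0]["summary_text"]
        st.write(summary)

Even with caching, that checkpoint is still large, so if the app keeps crashing, a smaller or distilled summarization model may be the only way to fit within the Community plan's resources.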

Yes, I thought as much too. Really appreciate your help.

Do you know if there is any solution to that? I would really love to deploy a summarizer on Streamlit to demo to my team.
