Error: This model doesn't have language tokens so it can't perform lang id

I created an app that was working fine, but now I’m receiving the following error:
“Error: This model doesn’t have language tokens so it can’t perform lang id”

Any help resolving this would be appreciated.

Hi @Notadeveloper,

Can you please share the full error log trace?

Thanks,
Charly

A few days back my app was working fine, but now it is giving the same error mentioned above. My app takes a YouTube video URL and lets you chat with the content of the video. Please guide me; any help would be really appreciated.

Hey @Zia_Younas

Can you please share the full error log trace?

Thanks,
Charly

[09:08:58] ❗️ The service has encountered an error while checking the health of the Streamlit app: Get "http://localhost:8501/healthz": dial tcp 10.12.178.10:8501: connect: connection refused

Collecting usage statistics. To deactivate, set browser.gatherUsageStats to False.

[09:09:02] 🔄 Updated app!
/home/appuser/venv/lib/python3.9/site-packages/whisper/timing.py:57: NumbaDeprecationWarning: The 'nopython' keyword argument was not supplied to the 'numba.jit' decorator. The implicit default value for this argument is currently False, but it will be changed to True in Numba 0.59.0. See https://numba.readthedocs.io/en/stable/reference/deprecation.html#deprecation-of-object-mode-fall-back-behaviour-when-using-jit for details.
  def backtrace(trace: np.ndarray):
2023-10-22 09:11:03.025 Session with id 977b6ea5-55d6-44c5-8fe9-aee6061c420d is already connected! Connecting to a new session.
[09:16:56] 🐙 Pulling code changes from Github...
[09:16:57] 📦 Processing dependencies...
[09:16:57] 📦 Apt dependencies were installed from /app/questibot/packages.txt using apt-get.
[09:16:57] 📦 Processed dependencies!
[09:16:58] 🔄 Updated app!
[09:21:32] 🐙 Pulling code changes from Github...
[09:21:33] 📦 Processing dependencies...
[09:21:33] 📦 Apt dependencies were installed from /app/questibot/packages.txt using apt-get.
[09:21:33] 📦 Processed dependencies!
[09:21:35] 🔄 Updated app!
[09:24:16] 🐙 Pulling code changes from Github...
[09:24:17] 📦 Processing dependencies...
[09:24:17] 📦 Apt dependencies were installed from /app/questibot/packages.txt using apt-get.
[09:24:17] 📦 Processed dependencies!
[09:24:18] 🔄 Updated app!
[09:26:09] 🐙 Pulling code changes from Github...
[09:26:10] 📦 Processing dependencies...
[09:26:10] 📦 Apt dependencies were installed from /app/questibot/packages.txt using apt-get.
[09:26:10] 📦 Processed dependencies!
[09:26:11] 🔄 Updated app!
youtube_video.title- Manual Answering
<Stream: itag="17" mime_type="video/3gpp" res="144p" fps="8fps" vcodec="mp4v.20.3" acodec="mp4a.40.2" progressive="True" type="video">
<Stream: itag="18" mime_type="video/mp4" res="360p" fps="30fps" vcodec="avc1.42001E" acodec="mp4a.40.2" progressive="True" type="video">
<Stream: itag="22" mime_type="video/mp4" res="720p" fps="30fps" vcodec="avc1.64001F" acodec="mp4a.40.2" progressive="True" type="video">
<Stream: itag="136" mime_type="video/mp4" res="720p" fps="30fps" vcodec="avc1.64001f" progressive="False" type="video">
<Stream: itag="247" mime_type="video/webm" res="720p" fps="30fps" vcodec="vp9" progressive="False" type="video">
<Stream: itag="135" mime_type="video/mp4" res="480p" fps="30fps" vcodec="avc1.4d401f" progressive="False" type="video">
<Stream: itag="244" mime_type="video/webm" res="480p" fps="30fps" vcodec="vp9" progressive="False" type="video">
<Stream: itag="134" mime_type="video/mp4" res="360p" fps="30fps" vcodec="avc1.4d401e" progressive="False" type="video">
<Stream: itag="243" mime_type="video/webm" res="360p" fps="30fps" vcodec="vp9" progressive="False" type="video">
<Stream: itag="133" mime_type="video/mp4" res="240p" fps="30fps" vcodec="avc1.4d4015" progressive="False" type="video">
<Stream: itag="242" mime_type="video/webm" res="240p" fps="30fps" vcodec="vp9" progressive="False" type="video">
<Stream: itag="160" mime_type="video/mp4" res="144p" fps="30fps" vcodec="avc1.4d400c" progressive="False" type="video">
<Stream: itag="278" mime_type="video/webm" res="144p" fps="30fps" vcodec="vp9" progressive="False" type="video">
<Stream: itag="139" mime_type="audio/mp4" abr="48kbps" acodec="mp4a.40.5" progressive="False" type="audio">
<Stream: itag="140" mime_type="audio/mp4" abr="128kbps" acodec="mp4a.40.2" progressive="False" type="audio">
<Stream: itag="249" mime_type="audio/webm" abr="50kbps" acodec="opus" progressive="False" type="audio">
<Stream: itag="250" mime_type="audio/webm" abr="70kbps" acodec="opus" progressive="False" type="audio">
<Stream: itag="251" mime_type="audio/webm" abr="160kbps" acodec="opus" progressive="False" type="audio">

@Zia_Younas Would it be possible for you to share the code as well, please?

Thanks,
Charly

```python
import datetime
import os

import whisper
from pytube import YouTube


def get_video_text(url):
    # load the Whisper model into memory
    model = whisper.load_model('base')
    youtube_video_url = url
    youtube_video = YouTube(youtube_video_url)
    print('youtube_video.title-', youtube_video.title)
    for stream in youtube_video.streams:
        print(stream)
    # keep only the audio streams and download the first one
    stream = youtube_video.streams.filter(only_audio=True).first()
    stream.download(filename='feds.mp4')
    # save a timestamp before transcription
    t1 = datetime.datetime.now()
    # print(f"started at {t1}")
    path = os.getcwd()
    # print(os.getcwd())
    file = 'feds.mp4'
    # print(os.path.join(path, file))
    full_file_location = os.path.join(path, file)
    # print("full_file_location-", full_file_location)
    # do the transcription
    output = model.transcribe(full_file_location, fp16=False)
    # show time elapsed after transcription is complete
    t2 = datetime.datetime.now()
    # print(f"ended at {t2}")
    # print(f"time elapsed: {t2 - t1}")
    text = output['text']
    # print(text)
    return text
```
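
For context, here is a rough sketch of how a helper like this might be wired into the Streamlit page; the widget label and spinner text are illustrative, not taken from the actual app:

```python
import streamlit as st

url = st.text_input("YouTube video URL")
if url:
    with st.spinner("Downloading and transcribing..."):
        transcript = get_video_text(url)
    st.write(transcript)
```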

Thanks @Zia_Younas,

We had some brief outages over the past few days that affected some users’ apps. A simple app restart usually fixes this.

If that doesn’t solve this issue, the error message shows there’s a connectivity issue with http://localhost:8501/healthz.

There could be a few reasons for this error.

The line whisper.load_model('base') shows that the Whisper model is being loaded into memory, so one possibility is that the model is consuming too many resources.

Have you tried leveraging caching? Also, is there a way to use a lighter model instead of the base model?
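
For reference, a minimal sketch of caching the model load with st.cache_resource might look like this (the 'tiny' model name is just an example of a lighter option):

```python
import streamlit as st
import whisper

@st.cache_resource  # load the model once per container and reuse it across reruns
def load_whisper_model(name: str = "tiny"):
    return whisper.load_model(name)

model = load_whisper_model()
```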

Thanks,
Charly

Thanks for your response,

I have rebooted the app, but it is still giving the same error. I have also tried the ‘tiny’ model instead of the ‘base’ model, but it still gives the same error.

Can you please guide me on this issue?

Thanks,
Zia

Would it be possible for you to rely on services like Replicate?
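
For illustration, offloading the transcription to a hosted service could look roughly like the sketch below; the model slug, version hash, and input keys are placeholders you would need to check against the model’s page on Replicate, and a REPLICATE_API_TOKEN environment variable is assumed:

```python
import replicate  # assumes REPLICATE_API_TOKEN is set in the environment

# Placeholder model identifier and input keys; check the model's page
# on Replicate for the exact version hash and expected inputs.
output = replicate.run(
    "openai/whisper:<version-hash>",
    input={"audio": open("feds.mp4", "rb")},
)
print(output)
```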

Thanks,
Charly

Thanks for your suggestion!

But I want to ask: is your team not going to resolve this issue? Streamlit Community Cloud is free, while the service you suggested is a paid one…

Hi @Zia_Younas,

I think what Charly was suggesting is that you might run into issues deploying this app and loading the model into memory on Community Cloud since apps on Community Cloud are limited to 1GB. Community Cloud serves as a free community resource for folks to get started with Streamlit. If your app requires significantly more than 1GB, we’d recommend exploring other deployment options (check out our community deployment wiki here).

That said, the error message you shared doesn’t sound like it’s related to your app’s memory usage.