Deployment Question for Older App and NLTK

Hello all,

I have an app that I originally deployed in the spring of 2021 on Streamlit 0.76.
Repo: coffee-reviews-nlp/web_app at main · ejfeldman7/coffee-reviews-nlp · GitHub

It used to run without issue, with the package versions pinned in requirements.txt, but when I came back to it a couple of weeks ago I got the error attached below. I believe the app was originally created with Python 3.8, so I redeployed it and selected 3.8, but I still got the same error. I have tried to reproduce this locally in a venv, but when I installed the same requirements.txt there I did not encounter any error. It seems to be specific to a dependency pulled in by the NLTK import (bolded in the error).

I’m curious if anyone has any thoughts on why this may be occurring on Streamlit but not in a local venv or if there are ideas for how to work around it in a deployment. Thanks!

Text from error:

AttributeError: module 'numpy' has no attribute 'int'. `np.int` was a deprecated alias for the builtin `int`. To avoid this error in existing code, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information. The alias was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:
File "/home/adminuser/venv/lib/python3.8/site-packages/streamlit/", line 332, in _run_script
    exec(code, module.__dict__)
**File "/mount/src/coffee-reviews-nlp/web_app/", line 12, in <module>**
**    from nltk import download**
File "/home/adminuser/venv/lib/python3.8/site-packages/nltk/", line 142, in <module>
    from nltk.chunk import *
File "/home/adminuser/venv/lib/python3.8/site-packages/nltk/chunk/", line 157, in <module>
    from nltk.chunk.api import ChunkParserI
File "/home/adminuser/venv/lib/python3.8/site-packages/nltk/chunk/", line 13, in <module>
    from nltk.parse import ParserI
File "/home/adminuser/venv/lib/python3.8/site-packages/nltk/parse/", line 100, in <module>
    from nltk.parse.transitionparser import TransitionParser
File "/home/adminuser/venv/lib/python3.8/site-packages/nltk/parse/", line 19, in <module>
    from sklearn.datasets import load_svmlight_file
File "/home/adminuser/venv/lib/python3.8/site-packages/sklearn/datasets/", line 22, in <module>
    from ._twenty_newsgroups import fetch_20newsgroups
File "/home/adminuser/venv/lib/python3.8/site-packages/sklearn/datasets/", line 45, in <module>
    from ..feature_extraction.text import CountVectorizer
File "/home/adminuser/venv/lib/python3.8/site-packages/sklearn/feature_extraction/", line 9, in <module>
    from .image import img_to_graph, grid_to_graph
File "/home/adminuser/venv/lib/python3.8/site-packages/sklearn/feature_extraction/", line 172, in <module>
File "/home/adminuser/venv/lib/python3.8/site-packages/numpy/", line 305, in __getattr__
    raise AttributeError(__former_attrs__[attr])

Hi @ejfeldman7,

Thanks for posting!

The error you’re getting comes from a version incompatibility between NumPy and the other libraries you’re using: `np.int` was deprecated in NumPy 1.20 and has since been removed. If you want to keep your code and functionality unchanged, I recommend downgrading NumPy in your requirements.txt file to something like numpy==1.19.5.
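For example, the relevant line in your requirements.txt would look something like this (1.19.5 was the last release of the 1.19 series; the exact pin that works will depend on your other dependencies):

```
numpy==1.19.5
```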

The other option is to modify your codebase to reflect the changes in NumPy. You could replace occurrences of `np.int` with either `int`, `numpy.int64`, or `numpy.int32`, depending on the precision your code requires. There’s more info in the NumPy docs.
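As a minimal sketch of what that replacement looks like (the array here is just an illustration):

```python
import numpy as np

# On NumPy >= 1.24 the removed alias raises AttributeError:
#   np.array([1.5, 2.5], dtype=np.int)  # AttributeError: module 'numpy' has no attribute 'int'
# Replace it with the builtin or an explicit precision:
arr = np.array([1.5, 2.5]).astype(int)         # builtin int
arr64 = np.array([1.5, 2.5]).astype(np.int64)  # explicit 64-bit precision
```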


I appreciate the thoughts. I had tried, and just tried again, pinning an earlier version of numpy, but I still got the same error. The issue is that the occurrences of `np.int` aren’t in my codebase; they’re inside the NLTK import chain. What I find most odd, though, is that I can set up a virtual environment locally with the same set of requirements and get no error, so why not on Streamlit?

Ah interesting. I’ll try it on my end and let you know if it works.
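In the meantime, one way to narrow down the difference between the two environments is to print the versions each one actually resolved, e.g. temporarily at the top of the app, and compare the local vs. deployed output. A minimal sketch (the package list just mirrors the ones in your traceback):

```python
import importlib.metadata as md  # stdlib since Python 3.8

# Collect the versions actually installed in this environment, so the
# local venv and the deployed app can be compared side by side.
versions = {}
for pkg in ("numpy", "scikit-learn", "nltk", "streamlit"):
    try:
        versions[pkg] = md.version(pkg)
    except md.PackageNotFoundError:
        versions[pkg] = None  # not installed in this environment
print(versions)
```

If the printed numpy version differs between local and Streamlit despite the same requirements.txt, that would explain why only the deployment fails.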