Hello! I’m just getting started with Streamlit and have been trying to host a little demo to test things out, but I’m running into an issue installing ‘pytorch-fast-transformers’:
```
Collecting torch (from -r /mount/src/{redacted}/requirements.txt (line 7))
Downloading torch-2.6.0-cp312-cp312-manylinux1_x86_64.whl.metadata (28 kB)
Collecting pytorch-fast-transformers (from -r /mount/src/{redacted}/requirements.txt (line 8))
Downloading pytorch-fast-transformers-0.4.0.tar.gz (93 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 93.6/93.6 kB 130.3 MB/s eta 0:00:00[2025-04-22 21:49:57.955717]
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'error'
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [26 lines of output]
Traceback (most recent call last):
File "<string>", line 19, in <module>
ModuleNotFoundError: No module named 'torch'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/adminuser/venv/lib/python3.12/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
main()
File "/home/adminuser/venv/lib/python3.12/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/adminuser/venv/lib/python3.12/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
^^^^^^^^^^^^^^^^^^^^^
File "/tmp/pip-build-env-hxhue8h0/overlay/lib/python3.12/site-packages/setuptools/build_meta.py", line 331, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=[])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/pip-build-env-hxhue8h0/overlay/lib/python3.12/site-packages/setuptools/build_meta.py", line 301, in _get_build_requires
self.run_setup()
File "/tmp/pip-build-env-hxhue8h0/overlay/lib/python3.12/site-packages/setuptools/build_meta.py", line 512, in run_setup
super().run_setup(setup_script=setup_script)
File "/tmp/pip-build-env-hxhue8h0/overlay/lib/python3.12/site-packages/setuptools/build_meta.py", line 317, in run_setup
exec(code, locals())
File "<string>", line 22, in <module>
ImportError: PyTorch is required to install pytorch-fast-transformers. Please install your favorite version of PyTorch, we support 1.3.1, 1.5.0 and >=1.6
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
```
I have checked that torch is listed before pytorch-fast-transformers in my requirements.txt. The supporting information I found suggests the problem is that pytorch-fast-transformers needs torch at build time, but uv defaults to building in isolation, which I don’t fully understand (sorry!!) but I take to mean each package is built in a fresh temporary environment, so it ignores the torch installed just before? Another keyword I noticed was PEP 517, which meant very little to me.
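For anyone who lands here, this is my understanding of what build isolation means locally (a sketch only; Streamlit Community Cloud runs the install itself, so you can’t do this there): pip/uv runs setup.py in a throwaway environment where your already-installed torch is invisible, and `--no-build-isolation` is the real pip flag that opts out, letting the build see whatever you installed first:

```shell
# Local workaround sketch: put torch into the environment first,
# then disable PEP 517 build isolation so the package's setup.py
# can import the torch that is already installed.
python -m pip install torch
python -m pip install --no-build-isolation pytorch-fast-transformers
```

This at least lets you build/test the package on a machine you control, even though there is nowhere to pass that flag on Community Cloud.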
Supporting links:
https://github.com/astral-sh/uv/issues/2991
https://github.com/astral-sh/uv/issues/2252
I have no idea how to remedy this for Streamlit, as I cannot pass a `--no-build-isolation` argument to Streamlit Community Cloud’s installer. I have tried many other things, such as pinning specific versions of torch/pytorch-fast-transformers, including torchvision and/or torchaudio, downgrading pytorch-fast-transformers, etc.
I have also tried using wheel to manually build a wheel of my own, but unfortunately my local machine runs Windows, and when I try to use that .whl on Streamlit I get an error about missing C/C++ libraries. (I also tried using WSL to generate a Linux wheel, but that did not work either; I was missing GLIBC libraries, among other things? Unsure.)
Note: the wheel I built on Linux named itself ‘pytorch_fast_transformers-0.4.0-cp312-cp312-linux_x86_64.whl’, which I thought matched the torch wheel automatically downloaded up there, but looking again, torch’s is tagged manylinux1_x86_64 rather than linux_x86_64, so maybe mine is tied to my own machine’s libraries and that’s why it didn’t work…
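In case it helps anyone, here is roughly how a portable (manylinux-tagged) wheel could be produced instead of a linux_x86_64 one. This is a hedged sketch, assuming Docker is available; the image and `/opt/python` layout are the standard PyPA manylinux ones, the cp312 path is my guess at matching the cloud’s Python 3.12, and auditwheel may need extra handling (e.g. excluding torch’s own shared libraries) for a package that links against torch:

```shell
# Sketch: build inside the official manylinux container, then repair the
# wheel so it is tagged manylinux* (portable glibc) instead of
# linux_x86_64 (tied to the build machine's glibc).
docker run --rm -v "$PWD:/out" quay.io/pypa/manylinux_2_28_x86_64 bash -c '
  /opt/python/cp312-cp312/bin/pip install torch &&
  /opt/python/cp312-cp312/bin/pip wheel --no-build-isolation --no-deps \
      -w /tmp/wheels pytorch-fast-transformers &&
  auditwheel repair /tmp/wheels/*.whl -w /out
'
```

The repaired wheel could then be committed to the repo and referenced from requirements.txt as a relative path (e.g. a `./wheels/….whl` line above the other entries), though I’m not certain what working directory Community Cloud resolves such paths against.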
The only method I have found that I haven’t explored is using fast-transformers-pytorch instead of pytorch-fast-transformers, but I would have to rework a model I am using (EMOPIA) against the different library, and that does in fact make me miserable.
However, I noticed in another topic,
https://discuss.streamlit.io/t/running-locally-streamlit-prompts-to-login-to-my-git-repo-how-can-i-disable/6751/5
that user quadebroadwell had installed torch and pytorch-fast-transformers together!! (presumably successfully!)
Unfortunately, I tried the same versions he used and did not get any better results: still a failed wheel build. This gives me hope, though! (and a reason to procrastinate on rewriting the EMOPIA model!)
I don’t think it’s proper etiquette, however, to reply to a thread from 2020, so I was wondering whether anyone more familiar with package management and Streamlit could point me in a direction to resolve this? It annoys me to no end how much time I’ve spent not debugging or playing around with Streamlit, but instead trying to fix an install.
P.S. If someone happened to have a wheel for pytorch-fast-transformers for a version of Linux that works for the servers backing Streamlit Community Cloud, I would love and cherish you for at least a week or two.