PySpark installation issue on Streamlit Cloud

Can anyone give me a solution for this Streamlit deployment error?
The app works fine when I run it on my local machine, but when I try to deploy it on Streamlit Cloud it gives me an error.
I pasted a screenshot of it below.


PySpark needs Java, so try adding this to your packages.txt file:

default-jre-headless
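
If you want to double-check that Java is actually visible to your app on Streamlit Cloud, a quick check like this near the top of app.py can help (just a sketch; the messages are only illustrative):

```python
# Sanity check: PySpark needs a Java runtime on the PATH.
# If packages.txt installed a JRE, shutil.which("java") should find it.
import shutil

import streamlit as st

java_path = shutil.which("java")
if java_path is None:
    st.error("No Java runtime found - check your packages.txt.")
else:
    st.write(f"Java runtime found at {java_path}")
```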

Does the packages.txt file contain the same things as the requirements.txt file?

No, the packages.txt file is for system (apt) packages, while requirements.txt is for pip packages.
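
So a repo deployed to Streamlit Cloud would typically look something like this (app.py is the file name used later in this thread; the example entries are just illustrations):

```
your-repo/
├── app.py            # the Streamlit app
├── requirements.txt  # pip packages, e.g. pyspark==3.3.2
└── packages.txt      # apt packages, e.g. default-jre-headless
```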

In requirements.txt I added:

findspark==2.0.1
matplotlib==3.7.0
numpy==1.24.2
pandas==1.5.3
plotly==5.13.0
pyspark==3.3.2
seaborn==0.12.2
streamlit==1.18.1

and in packages.txt:

default-jre-headless
but I am still getting the same error.


One thing I would also like to mention:
I wrote the PySpark code in a Jupyter notebook and saved my trained model using the save() method, and I load the same model in the app.py file using load() for prediction.

Does this work on Streamlit Cloud, or do I have to write all the code in a single Python (xyz.py) file?
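
Roughly, the load side looks like this (simplified, not my exact code; the model class and path are placeholders, and findspark is the package from my requirements.txt):

```python
# app.py - load the model that was saved in the notebook with model.save("saved_model").
# Assumes the saved model directory is committed to the repo and was produced
# by a pyspark.ml Pipeline; "saved_model" and PipelineModel are placeholders.
import findspark
findspark.init()  # locate the pip-installed Spark before importing pyspark modules

from pyspark.sql import SparkSession
from pyspark.ml import PipelineModel
import streamlit as st

spark = SparkSession.builder.appName("prediction-app").getOrCreate()
model = PipelineModel.load("saved_model")

st.write("Model loaded and ready for prediction.")
```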

I would try some other packages instead, for example:

  • default-jre
  • default-jdk

I don’t understand; can you please provide a link to your public GitHub repo?

Thank you so much! It works now with default-jre.
I spent almost 2 days trying to fix this error. Thanks a lot @Franky1


Hi, I am getting the same error when using PySpark. I have tried all three of the default-* options in packages.txt but am still getting the same error.

App weblink: AppWebLink
Github Public repo: GithubRepoLink
