How to most easily share your streamlit app locally?

@andreas_rc do you have any leads on which settings needed heavy tweaking? I’m happy to try from my end and debug.

Yes, that’s how it worked. Since we had a common shared folder, we did not need the ability to upload our own files.

Unfortunately nothing specific. I had several error messages and was able to get past some of them with Google, but in the end I stopped. I’ll check out Docker + AWS for sharing. If you have success with the .exe, I would be interested to see it!

Nice! I tested it and it does seem to work well. However, I get a 405 error when trying to use the st.file_uploader.

I also tried to select a path as follows:

import glob
import streamlit as st

# load data
path_input_data = st.sidebar.text_input("Provide the name of the subfolder in which your csv files are stored.")
if path_input_data:
    csv_files = glob.glob(path_input_data + '/**/*.csv', recursive=True)
    st.write(csv_files)

What I want to do is provide the complete path to the local folder in which the data is stored. When creating a container I can only refer to the relative path inside the container, which means that users cannot select their own data. Any suggestions on how to work around this? I’m probably missing something obvious here, as I am not too familiar with Docker images.

We also continue to keep in touch over email, but just to keep any other readers up to date, here is a summary of our discussion:

Thank you for getting in touch - I’m really pleased to hear you have enjoyed ContainDS Desktop!

I think the important piece of information you are missing is that the ‘workspace folder’ on your computer maps to the path ‘/app’ inside the Docker container.

So for example, on my Mac the container is running with a workspace of /Users/dan/streamlit-single and this can be accessed via /app in the container’s Streamlit code.

Here is my example folder on the Mac:

|- code.py
|- csvfiles
     |- test.csv
     |- testboxes.csv
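
(A quick sanity check, not part of the original example: if you want to confirm the workspace-to-/app mapping from inside your own app, you can simply list the mounted folder. With the workspace above, this should show code.py and the csvfiles subfolder.)

import os
import streamlit as st

# The workspace folder on the host is mounted at /app inside the container.
st.write(os.listdir('/app'))
st.write(os.listdir('/app/csvfiles'))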

My version of the Python code is at the end of this email, showing two methods of accessing the files - direct access through Python code, and then using st.file_uploader.

For the direct access method (not using st.file_uploader) it is important to understand that most of your computer’s hard drive is not accessible directly by the container. That is essential for security. So the csv files should ideally be in your workspace folder (as mine are above). You could always use a symlink if you really wanted to store them elsewhere on your hard drive.

Rather than expect the user to know that /app is the location for accessing the workspace from inside the container, in my example code I just expect them to type the immediate subfolder name of ‘csvfiles’ and then I prepend ‘/app/’ to that. In fact, if you just want to find all csv files in the workspace then you could just glob from the /app folder recursively anyway, so you don’t really need to specify the subfolder name.
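
(For completeness, here is a minimal sketch of that "glob the whole workspace" variant, assuming the standard /app mapping described above; it is not part of the original email.)

import glob
import streamlit as st

# Find every csv anywhere in the mounted workspace, recursively,
# without asking the user for a subfolder name at all.
csv_files = glob.glob('/app/**/*.csv', recursive=True)
st.write(csv_files)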

For the file uploader version, I’m not too sure why you were seeing the 405 error. It seems to work for me using the streamlit-single Docker image, but I do see the 405 error using streamlit-launchpad images. I’ll take a look.

import streamlit as st
import glob
from os import path
import pandas as pd

st.markdown('# Based on subfolder path')

path_input_data = st.sidebar.text_input("Provide the name of the subfolder in which your csv files are stored.")

st.write("Input path:", path_input_data)

docker_search_path = path.join('/app', path_input_data)

st.write("Search path:", docker_search_path)

if path_input_data:
    csv_files = glob.glob(docker_search_path + '/**/*.csv', recursive=True)
    st.write(csv_files)

st.markdown('# Now try uploading')

uploaded_file = st.file_uploader("Choose a CSV file", type="csv")

if uploaded_file is not None:
    st.write('Uploaded file: ', uploaded_file)
    data = pd.read_csv(uploaded_file)
    st.write(data)

st.write('Streamlit version: ', st.version._get_installed_streamlit_version())

I’ve updated streamlit-launchpad to work with the file_uploader now (version 0.0.6). To pick this up you might need to restart ContainDS Desktop and then create a new container from the latest ‘streamlit-launchpad’ in the Images tab.

@bjornvandijkman
I have successfully created the .exe file following your procedure, but I am unable to run it. It throws the error below:
‘Failed to execute script run_streamlit’
Please help!!

# st-run-test.py
import streamlit as st

def main():
    # Do stuff here.
    st.markdown("Test this and that")

if __name__ == "__main__":
    main()

# run-st-test.py
import shlex
import subprocess as sp

# Launch the Streamlit app defined in st-run-test.py via the streamlit CLI.
sp.call(shlex.split("streamlit run st-run-test.py"))

I seem to successfully freeze run-st-test.py with pyinstaller (pyinstaller run-st-test.py -c) and run the exe. Will give it a try with my rather complicated streamlit project. I would guess it should work.

It also works with more complicated cases. The problem is, it only freezes run-st-test.py.

I am looking for a way to freeze st-run-test.py and anything it makes use of. The reason: I want other people to be able to run my streamlit program without explicitly sharing the source code. Is this possible?

@mikee @bjornvandijkman
This is not the solution I was hoping for, as it only makes an .exe file of the calling script. Python and streamlit still need to be present on the machine that runs the .exe.
That’s why Srinath and andreas are not able to run the .exe.
Am I missing something or is it just not possible to create a (single) executable with pyinstaller for streamlit apps?
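
(Not an answer from this thread, but a workaround that is often suggested for exactly this situation: instead of shelling out to the streamlit command, call Streamlit’s CLI entry point from Python. PyInstaller then sees streamlit as a regular import and can attempt to bundle it, although extra hidden imports or data files may still be needed, and the module path has moved between Streamlit versions, so treat this as an untested sketch.)

# wrapper to freeze - file name and app name are placeholders
import sys

# Older Streamlit releases expose the CLI as streamlit.cli;
# newer ones use streamlit.web.cli instead.
from streamlit import cli as stcli

if __name__ == "__main__":
    # Equivalent to typing "streamlit run st-run-test.py" on the command line,
    # but without requiring the streamlit executable on the target machine's PATH.
    sys.argv = ["streamlit", "run", "st-run-test.py"]
    sys.exit(stcli.main())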

I have not used pyinstaller since, to be honest. I have, however, successfully shared applications locally using ContainDS by @danlester.


Hi Chekos,

That is actually a really brilliant solution that I could use for many challenges, not just streamlit.
Installing Python or Anaconda to a shared drive is easy enough. But what I do not get is how you managed to install the extra libraries to the shared drive?

I have Python on my own machine and on the shared drive now, but pip install will just put the libraries on my own machine. How did you get around this?
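
(Not an answer from the thread, but one common pattern, assuming everyone can read the shared drive at some path such as S:/shared/pylibs - the path is only an example: install the packages into a directory on the shared drive with pip's --target option, and have the app add that directory to sys.path at startup.)

# On your own machine, install the libraries into the shared drive, for example:
#   pip install --target "S:/shared/pylibs" streamlit pandas
#
# Then, at the top of the script that everyone runs from the shared drive:
import sys

SHARED_LIBS = "S:/shared/pylibs"  # example path - adjust to your shared drive
if SHARED_LIBS not in sys.path:
    sys.path.insert(0, SHARED_LIBS)

import streamlit as st  # now importable from the shared drive as well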