Wassup Guys,
I'm currently writing a transcription app and have been deploying it to the cloud for a couple of versions now.
However, I recently added a few files to Git LFS because they're too big to push to GitHub normally.
I store my model weights as .npy, .pth, and .bin files.
Everything works fine locally, but the cloud project seems to have trouble cloning the LFS-stored files.
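For reference, the files are tracked in .gitattributes roughly like this (a minimal sketch; the exact patterns in my repo may differ):

*.npy filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text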
How can I solve this? I don’t want to use an ugly workaround.
Hi @fuccDebugging -
Streamlit Cloud supports Git LFS, but if I'm not mistaken, GitHub requires you to purchase additional data packs once your LFS storage or bandwidth exceeds the free quota. How large are the files you're using?
Best,
Randy
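In the meantime, one quick way to tell whether the cloud clone fetched the real files or only LFS pointer stubs: an un-fetched LFS file is a tiny text placeholder, not the actual binary. A rough heuristic you could run on the cloud machine (a sketch, not an official Git LFS check):

import os

def looks_like_lfs_pointer(path: str) -> bool:
    # Un-fetched LFS files are small text stubs that begin with a
    # "version https://git-lfs.github.com/spec/v1" header line.
    if os.path.getsize(path) > 1024:
        return False
    with open(path, "rb") as f:
        return f.read(7) == b"version"

If that returns True for your model files, the clone never pulled the actual LFS objects.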
Purchasing the paid Git LFS tier didn't work either.
However, here's my solution: host the model files externally (Dropbox, in my case) and download any that are missing when the app starts:
import os
import time

import requests
import streamlit as st

CHECK_FILES = (
    "distilbert-dlf/pytorch_model.bin",
    "wandering-sponge-4.pth",
    "label_embeddings.npy",
)

# Dropbox direct-download URLs (truncated here).
URL_PTH = "https://www.dl.dropboxusercontent.com/s/..."
URL_NPY = "https://www.dl.dropboxusercontent.com/s/..."
URL_BIN = "https://www.dl.dropboxusercontent.com/s/..."

missing = [path for path in CHECK_FILES if not os.path.exists(path)]
if missing:
    print("Missing model files:", missing)
    msg = st.warning("🚩 Models need to be downloaded... ")
    try:
        with st.spinner('Initiating...'):
            time.sleep(3)  # brief pause so the spinner is visible
            r_pth = requests.get(URL_PTH, allow_redirects=True)
            r_npy = requests.get(URL_NPY, allow_redirects=True)
            r_bin = requests.get(URL_BIN, allow_redirects=True)
            # The .bin file lives in a subdirectory; make sure it exists.
            os.makedirs("distilbert-dlf", exist_ok=True)
            with open("wandering-sponge-4.pth", "wb") as f:
                f.write(r_pth.content)
            with open("label_embeddings.npy", "wb") as f:
                f.write(r_npy.content)
            with open("distilbert-dlf/pytorch_model.bin", "wb") as f:
                f.write(r_bin.content)
            del r_pth, r_npy, r_bin  # release the downloaded bytes
        msg.success("Download was successful ✅")
    except Exception:
        msg.error("Error downloading model files...😥")
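One caveat with this approach: requests.get() without stream=True buffers the whole file in memory, which can be tight on a small cloud instance with large models. A streamed variant keeps memory usage flat (a sketch, assuming the same URLs; download_file is a hypothetical helper, not part of my app):

import os
import requests

def download_file(url: str, dest: str, chunk_size: int = 1 << 20) -> None:
    # Stream the response to disk in 1 MiB chunks instead of
    # holding the entire file in memory at once.
    os.makedirs(os.path.dirname(dest) or ".", exist_ok=True)
    with requests.get(url, stream=True, allow_redirects=True, timeout=60) as r:
        r.raise_for_status()
        with open(dest, "wb") as f:
            for chunk in r.iter_content(chunk_size=chunk_size):
                f.write(chunk)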