The app takes a long time to deploy and has been stuck deploying for a whole day

I deployed the application on Streamlit, but after more than a day it still hasn't finished.

This is my requirements.txt file:
langchain
langchain-community
pysqlite3-binary
#streamlit==1.28.0
streamlit
requests
#llama_index
openai
#docx2txt
unstructured
unstructured[docx]
unstructured[pdf]
opencv-python-headless
chromadb
tiktoken
tesseract
pytesseract==0.3.8
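For context, pysqlite3-binary is usually added so chromadb can use a newer SQLite than the one Streamlit Community Cloud ships with, and it is normally paired with a module swap at the very top of the app's entry script, before chromadb is imported. A minimal sketch, assuming the entry script is something like streamlit_app.py (the file name is a placeholder):

# At the very top of the entry script, before any chromadb import,
# replace the stdlib sqlite3 module with the newer pysqlite3 build.
__import__("pysqlite3")
import sys
sys.modules["sqlite3"] = sys.modules.pop("pysqlite3")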

This is my packages.txt file:
libgl1
poppler-utils
tesseract-ocr
tesseract-ocr-por
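For context, tesseract-ocr installs the OCR engine binary that pytesseract shells out to, and tesseract-ocr-por adds the Portuguese language data; with pytesseract the language has to be selected explicitly. A minimal sketch (the image path is a placeholder):

from PIL import Image
import pytesseract

# tesseract-ocr-por provides the "por" model; pass it via the lang argument.
text = pytesseract.image_to_string(Image.open("page.png"), lang="por")
print(text)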

[08:07:11] :snake: Python dependencies were installed from /mount/src/support-client/requirements.txt using pip.
Check if streamlit is installed
Streamlit is already installed
[08:07:13] :package: Processed dependencies!

/home/adminuser/venv/lib/python3.9/site-packages/langchain_core/_api/deprecation.py:117: LangChainDeprecationWarning: The class langchain_community.embeddings.openai.OpenAIEmbeddings was deprecated in langchain-community 0.0.9 and will be removed in 0.2.0. An updated version of the class exists in the langchain-openai package and should be used instead. To use it run pip install -U langchain-openai and import as from langchain_openai import OpenAIEmbeddings.
warn_deprecated(
[nltk_data] Downloading package punkt to /home/appuser/nltk_data…
[nltk_data] Unzipping tokenizers/punkt.zip.
[nltk_data] Downloading package averaged_perceptron_tagger to
[nltk_data] /home/appuser/nltk_data…
[nltk_data] Unzipping taggers/averaged_perceptron_tagger.zip.
This function will be deprecated in a future release and unstructured will simply use the DEFAULT_MODEL from unstructured_inference.model.base to set default model name
Some weights of the model checkpoint at microsoft/table-transformer-structure-recognition were not used when initializing TableTransformerForObjectDetection: ['model.backbone.conv_encoder.model.layer2.0.downsample.1.num_batches_tracked', 'model.backbone.conv_encoder.model.layer3.0.downsample.1.num_batches_tracked', 'model.backbone.conv_encoder.model.layer4.0.downsample.1.num_batches_tracked']

  • This IS expected if you are initializing TableTransformerForObjectDetection from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
  • This IS NOT expected if you are initializing TableTransformerForObjectDetection from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
    2024-04-16 08:09:12.059 503 GET /script-health-check (10.12.122.60) 60058.59ms
    This function will be deprecated in a future release and unstructured will simply use the DEFAULT_MODEL from unstructured_inference.model.base to set default model name
    2024-04-16 08:10:12.378 503 GET /script-health-check (10.12.122.60) 60050.45ms
    2024-04-16 08:11:12.838 503 GET /script-health-check (10.12.122.60) 60076.18ms
    This function will be deprecated in a future release and unstructured will simply use the DEFAULT_MODEL from unstructured_inference.model.base to set default model name
    [the same unstructured deprecation message repeats many more times]
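For reference, the repeated 503 responses on /script-health-check after roughly 60 s suggest the script is still busy downloading the NLTK data and table-transformer weights when the platform's health check times out, and the repeated deprecation message suggests the document-parsing step runs again on every rerun. A common mitigation is to cache that heavy step so it runs only once per container; a minimal sketch, assuming the app parses files with the unstructured loader from langchain-community (the function and file names are placeholders, not taken from the original app):

import streamlit as st
from langchain_community.document_loaders import UnstructuredFileLoader

@st.cache_resource
def load_documents(path: str):
    # Runs once per container; later reruns reuse the cached result instead of
    # re-downloading NLTK data and the table-transformer model weights.
    return UnstructuredFileLoader(path, mode="elements").load()

docs = load_documents("data/manual.pdf")

The OpenAIEmbeddings deprecation warning earlier in the log is unrelated to the hang; as the message itself says, it goes away once langchain-openai is added to requirements.txt and OpenAIEmbeddings is imported from langchain_openai instead of langchain_community.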

Hi @Jane1702,

Thanks for sharing this question!

The app seems to be up and running now.


But it's like a lottery: it can break at any time and then start working again a few days later.

Same issue with this application: it worked yesterday, but today it broke.
