ModuleNotFoundError: No module named 'gpt_index'

Hi,

I’m getting the error:

ModuleNotFoundError: This app has encountered an error. The original error message is redacted to prevent data leaks. Full error details have been recorded in the logs (if you’re on Streamlit Cloud, click on ‘Manage app’ in the lower right of your app).

Traceback:

File "/home/appuser/venv/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 565, in _run_script
    exec(code, module.__dict__)File "/app/xxbot/newui11.py", line 3, in <module>
    from gpt_index import SimpleDirectoryReader, GPTListIndex, GPTSimpleVectorInde

even though everything works perfectly on my local machine.

Thanks

https://docs.streamlit.io/streamlit-community-cloud/get-started/deploy-an-app/app-dependencies

Hi Franky,

I had a requirements.txt file initially that led to the same types of errors.

I then created a packages.txt file that also produced errors.

Here are the dependencies (imports) in my app; maybe you can help me create the requirements file in the best way?

import re
import streamlit as st
from gpt_index import SimpleDirectoryReader, GPTListIndex, GPTSimpleVectorIndex, LLMPredictor, PromptHelper
from langchain.chat_models import ChatOpenAI
import sys
import os
from datetime import datetime

os.environ["OPENAI_API_KEY"] = 'NOT LISTED HERE'

streamlit
gpt-index

Hi Franky,

I tried that and am still getting the same errors: the gpt_index module isn’t being found. However, everything runs perfectly fine on my local machine when I run the Streamlit app.

Thanks!

Please share your public GitHub repo link; otherwise we are poking around in the dark.

No, your requirements.txt file in this repo is completely different and will not work for obvious reasons.

Do you know what I need to fix in that requirements doc?

When running locally, I didn’t need a requirements doc since all of the dependencies were listed as imports in the Python app file.

Thanks!

This is not a proper requirements.txt file, ehm, it contains only garbage… :face_with_diagonal_mouth:
Try this content only:

streamlit
gpt-index

Now, I get:

Traceback (most recent call last):
  File "/home/appuser/venv/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 565, in _run_script
    exec(code, module.__dict__)
  File "/app/cxbot/newui11.py", line 54, in <module>
    index = construct_index(docs_directory_path)
  File "/app/cxbot/newui11.py", line 24, in construct_index
    index = GPTSimpleVectorIndex(documents, llm_predictor=llm_predictor, prompt_helper=prompt_helper)
  File "/home/appuser/venv/lib/python3.9/site-packages/gpt_index/indices/vector_store/vector_indices.py", line 73, in __init__
    super().__init__(
  File "/home/appuser/venv/lib/python3.9/site-packages/gpt_index/indices/vector_store/base.py", line 54, in __init__
    super().__init__(
TypeError: __init__() got an unexpected keyword argument 'llm_predictor'


Your code probably no longer matches the latest version of gpt-index.

The whole gpt/llama/furry… ecosystem is evolving rapidly at the moment; code samples, tutorials, docs and implementations drift out of sync within weeks… :neutral_face:


Check the installed version on your local computer with:

pip show gpt-index

and pin that version in your requirements.txt file like this:

streamlit
gpt-index==1.2.3
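
If you prefer, pip freeze already prints installed packages in the pinned format, so you can copy the line straight into requirements.txt (the version below is only a placeholder; use whatever pip reports on your machine):

pip freeze | grep gpt-index    # on Windows: pip freeze | findstr gpt-index
# e.g. gpt-index==0.4.24   <- placeholder; copy the exact line that pip prints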

Edit: Even the documentation itself is confusing, since gpt-index and llama-index are used interchangeably.
I am not familiar with this ecosystem, but llama-index looks like the newer name for the same package. I could be wrong, though.
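
If you do end up on a newer llama-index release instead, the llm_predictor/prompt_helper arguments have reportedly moved into a ServiceContext, and indexes are built with from_documents(). A rough, untested sketch only (the exact names and parameter values below are assumptions; check the docs of the version you actually install):

# Rough sketch, assuming a newer llama-index (formerly gpt-index) API
# where the index takes a ServiceContext instead of llm_predictor/prompt_helper.
from llama_index import (
    SimpleDirectoryReader,
    GPTSimpleVectorIndex,
    LLMPredictor,
    PromptHelper,
    ServiceContext,
)
from langchain.chat_models import ChatOpenAI

documents = SimpleDirectoryReader("docs").load_data()
llm_predictor = LLMPredictor(llm=ChatOpenAI(temperature=0, model_name="gpt-3.5-turbo"))
prompt_helper = PromptHelper(max_input_size=4096, num_output=256, max_chunk_overlap=20)

# Bundle the predictor and prompt helper into a service context...
service_context = ServiceContext.from_defaults(
    llm_predictor=llm_predictor, prompt_helper=prompt_helper
)
# ...and pass that to the index instead of the old keyword arguments.
index = GPTSimpleVectorIndex.from_documents(documents, service_context=service_context)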


Hi Franky,

I just obscured my API key by putting it in my environment variables and reran the code. Now, I get:

tenacity.RetryError: This app has encountered an error. The original error message is redacted to prevent data leaks. Full error details have been recorded in the logs (if you’re on Streamlit Cloud, click on ‘Manage app’ in the lower right of your app).
Traceback:
File "/home/appuser/venv/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 565, in _run_script
    exec(code, module.__dict__)
File "/app/cxbot/newui11.py", line 54, in <module>
    index = construct_index(docs_directory_path)
File "/app/cxbot/newui11.py", line 24, in construct_index
    index = GPTSimpleVectorIndex(documents, llm_predictor=llm_predictor, prompt_helper=prompt_helper)
File "/home/appuser/venv/lib/python3.9/site-packages/gpt_index/indices/vector_store/vector_indices.py", line 84, in __init__
    super().__init__(
File "/home/appuser/venv/lib/python3.9/site-packages/gpt_index/indices/vector_store/base.py", line 63, in __init__
    super().__init__(
File "/home/appuser/venv/lib/python3.9/site-packages/gpt_index/indices/base.py", line 109, in __init__
    self._index_struct = self.build_index_from_documents(documents)
File "/home/appuser/venv/lib/python3.9/site-packages/gpt_index/token_counter/token_counter.py", line 55, in wrapped_llm_predict
    f_return_val = f(_self, *args, **kwargs)
File "/home/appuser/venv/lib/python3.9/site-packages/gpt_index/indices/base.py", line 278, in build_index_from_documents
    return self._build_index_from_documents(documents)
File "/home/appuser/venv/lib/python3.9/site-packages/gpt_index/indices/vector_store/base.py", line 206, in _build_index_from_documents
    self._add_document_to_index(index_struct, d)
File "/home/appuser/venv/lib/python3.9/site-packages/gpt_index/indices/vector_store/base.py", line 182, in _add_document_to_index
    embedding_results = self._get_node_embedding_results(
File "/home/appuser/venv/lib/python3.9/site-packages/gpt_index/indices/vector_store/base.py", line 102, in _get_node_embedding_results
    result_ids, result_embeddings = self._embed_model.get_queued_text_embeddings()
File "/home/appuser/venv/lib/python3.9/site-packages/gpt_index/embeddings/base.py", line 151, in get_queued_text_embeddings
    embeddings = self._get_text_embeddings(cur_batch_texts)
File "/home/appuser/venv/lib/python3.9/site-packages/gpt_index/embeddings/openai.py", line 260, in _get_text_embeddings
    embeddings = get_embeddings(texts, engine=engine)
File "/home/appuser/venv/lib/python3.9/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
File "/home/appuser/venv/lib/python3.9/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
File "/home/appuser/venv/lib/python3.9/site-packages/tenacity/__init__.py", line 326, in iter
    raise retry_exc from fut.exception()

Do I need to add a virtual environment to the requirements.txt file?

Thanks for all of your help!

Hi Franky,

I’ve made some progress, I think.

I cleaned up the requirements.txt file, created a secrets file in TOML format, etc.

However, now, when I reboot the app, I still get a few errors, and the secret that I’m trying to hide has been printed in the app’s UI.

Can you please take a look at my secrets file and the revised code on Github to see what I’m doing wrong in terms of the secrets?

Thanks!

Error messages usually hint at what one is doing wrong. Depriving @Franky1 of that information is not fair.

Besides, you might want to read about secrets management in Streamlit Cloud.
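
As a minimal sketch (assuming the secret is stored under the key OPENAI_API_KEY, either in .streamlit/secrets.toml locally or in the app's Secrets settings on Streamlit Cloud), read it via st.secrets instead of hard-coding it, and never write it to the UI:

# .streamlit/secrets.toml (or the Secrets field under "Manage app" on Streamlit Cloud)
# OPENAI_API_KEY = "sk-..."

import os
import streamlit as st

# Pull the key from Streamlit's secrets store; do not st.write() or print it.
os.environ["OPENAI_API_KEY"] = st.secrets["OPENAI_API_KEY"]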

Hi,

Here are the error messages. I did read about secrets management but must be missing something:

2023-04-25 13:17:00.060 Uncaught app exception
Traceback (most recent call last):
  File "/home/appuser/venv/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 565, in _run_script
    exec(code, module.__dict__)
  File "/app/cxbot/newui11.py", line 67, in <module>
    st.set_page_config(page_title="Carnegie Chatbot")
  File "/home/appuser/venv/lib/python3.9/site-packages/streamlit/runtime/metrics_util.py", line 311, in wrapped_func
    result = non_optional_func(*args, **kwargs)
  File "/home/appuser/venv/lib/python3.9/site-packages/streamlit/commands/page_config.py", line 225, in set_page_config
    ctx.enqueue(msg)
  File "/home/appuser/venv/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_run_context.py", line 90, in enqueue
    raise StreamlitAPIException(
streamlit.errors.StreamlitAPIException: `set_page_config()` can only be called once per app, and must be called as the first Streamlit command in your script.

For more information refer to the [docs](https://docs.streamlit.io/library/api-reference/utilities/st.set_page_config).

I’m trying a code modification and will report on the outcome.

Thanks!

The message says it all:

streamlit.errors.StreamlitAPIException: `set_page_config()` can only be called once per app, and must be called as the first Streamlit command in your script.
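
A minimal sketch of the fix (page title taken from your traceback): make st.set_page_config() the very first Streamlit command in the script, right after the imports, and call it only once.

import streamlit as st

# Must be the first Streamlit command in the script, called exactly once.
st.set_page_config(page_title="Carnegie Chatbot")

# Everything else (st.title, st.write, index construction, ...) comes after.
st.title("Carnegie Chatbot")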
