FileNotFoundError: [Errno 2] No such file or directory: 'UploadedFile'

Hi, I’m building an LLM application that accepts user inputs (a JSON service-account file upload, a project ID, and a dataset name) to establish a BigQuery connection. I have tried the hard-coded version given below, where the user doesn’t upload or enter anything, and it works perfectly. When I converted it to a Streamlit app that accepts the JSON file as user input, it fails with a file-not-found error. Full details of the error are given below. I would greatly appreciate it if someone could help me out.

Code snippet for the direct BigQuery connection:

service_account_file = "bigquery_sample.json" #local directory path 
project = "xxxxx" 
dataset = "xxxx_xxxxx" 

sqlalchemy_url = f'bigquery://{project}/{dataset}?credentials_path={service_account_file}' #connection

Code snippet for the BigQuery connection using Streamlit, accepting inputs from the user:

service_account_file = st.sidebar.file_uploader("Upload BigQuery Service Account file") 
project = st.sidebar.text_input("Enter Project ID:","xxxxx") 
dataset = st.sidebar.text_input("Enter Dataset Name:","xxxx_xxxxx") 

sqlalchemy_url = f'bigquery://{project}/{dataset}?credentials_path={service_account_file}'


FileNotFoundError: [Errno 2] No such file or directory: 'UploadedFile(file_id=\'7a217610-7090-4b3c-9e34-19f4555de9d5\', name=\'bigquery_sample.json\', type=\'application/json\', size=2355, _file_urls=file_id: "7a217610-7090-4b3c-9e34-19f4555de9d5"'


File "C:\Users\alens\Downloads\langchain_text2sql_1\venv\lib\site-packages\streamlit\runtime\scriptrunner\", line 541, in _run_script
    exec(code, module.__dict__)
File "C:\Users\alens\Downloads\langchain_text2sql_1\", line 93, in <module>
    query_engine = initialize_llm_predictor()
File "C:\Users\alens\Downloads\langchain_text2sql_1\", line 67, in initialize_llm_predictor
    db = SQLDatabase.from_uri(sqlalchemy_url)
File "C:\Users\alens\Downloads\langchain_text2sql_1\venv\lib\site-packages\langchain\utilities\", line 125, in from_uri
    return cls(create_engine(database_uri, **_engine_args), **kwargs)
File "<string>", line 2, in create_engine
File "C:\Users\alens\Downloads\langchain_text2sql_1\venv\lib\site-packages\sqlalchemy\util\", line 375, in warned
    return fn(*args, **kwargs)
File "C:\Users\alens\Downloads\langchain_text2sql_1\venv\lib\site-packages\sqlalchemy\engine\", line 560, in create_engine
    (cargs, cparams) = dialect.create_connect_args(u)
File "C:\Users\alens\Downloads\langchain_text2sql_1\venv\lib\site-packages\sqlalchemy_bigquery\", line 842, in create_connect_args
    client = _helpers.create_bigquery_client(
File "C:\Users\alens\Downloads\langchain_text2sql_1\venv\lib\site-packages\sqlalchemy_bigquery\", line 46, in create_bigquery_client
    credentials = service_account.Credentials.from_service_account_file(
File "C:\Users\alens\Downloads\langchain_text2sql_1\venv\lib\site-packages\google\oauth2\", line 260, in from_service_account_file
    info, signer = _service_account_info.from_filename(
File "C:\Users\alens\Downloads\langchain_text2sql_1\venv\lib\site-packages\google\auth\", line 78, in from_filename
    with, "r", encoding="utf-8") as json_file:

The way you are using service_account_file, it needs to be a file name, but the return value of st.file_uploader is not that: it is a file-like UploadedFile object, not a path on disk.

Thank you for the reply. Is there any workaround for my use case?

Doesn’t the bigquery API allow you to pass the credentials as parameters instead of a file name? Something like this:


I am making this up because I don’t know BigQuery, but hopefully you get the idea.
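Something like the following, perhaps (a sketch: `io.BytesIO` stands in here for the uploaded file, and the google-auth call `from_service_account_info`, which builds credentials from a dict instead of a file path, is shown commented out so the snippet runs standalone):

```python
import io
import json

# stand-in for st.file_uploader's return value, which is a file-like object
service_account_file = io.BytesIO(b'{"type": "service_account", "project_id": "xxxxx"}')

# deserialize the uploaded JSON into a dict
service_account_info = json.load(service_account_file)

# build credentials from the dict instead of a file path
# (requires the google-auth package, so commented out in this sketch):
# from google.oauth2 import service_account
# credentials = service_account.Credentials.from_service_account_info(service_account_info)
```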


Apparently you should be able to do something like


where credentials_base64 is the result of encoding the JSON in base64.
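A sketch of that approach, assuming sqlalchemy-bigquery’s `credentials_base64` URL parameter (with Streamlit, the raw bytes would come from `service_account_file.getvalue()`; the JSON literal here is a hypothetical stand-in):

```python
import base64

# stand-in for the raw bytes of the uploaded service-account JSON
# (with Streamlit this would be service_account_file.getvalue())
raw = b'{"type": "service_account", "project_id": "xxxxx"}'

# base64-encode the JSON so it can travel inside the URL
credentials_base64 = base64.b64encode(raw).decode("utf-8")

project = "xxxxx"
dataset = "xxxx_xxxxx"
sqlalchemy_url = f"bigquery://{project}/{dataset}?credentials_base64={credentials_base64}"
```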

or if you can use create_engine instead:

engine = create_engine(f'bigquery://{project}/{dataset}', credentials_info=credentials_info)

where credentials_info is the deserialized JSON (a dictionary).
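For example (a sketch; the `create_engine` call is commented out since it needs sqlalchemy and sqlalchemy-bigquery installed, and the JSON literal is a hypothetical stand-in for the uploaded file’s contents):

```python
import json

# hypothetical stand-in for the uploaded service-account JSON;
# with Streamlit this could be json.load(service_account_file)
credentials_info = json.loads('{"type": "service_account", "project_id": "xxxxx"}')

project = "xxxxx"
dataset = "xxxx_xxxxx"
url = f"bigquery://{project}/{dataset}"

# with sqlalchemy and sqlalchemy-bigquery installed:
# from sqlalchemy import create_engine
# engine = create_engine(url, credentials_info=credentials_info)
```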

Furthermore, you can pass engine arguments to from_uri:

from_uri(f'bigquery://{project}/{dataset}', engine_args={"credentials_info": credentials_info})
