BigQuery service account works locally but not on server

Hi there, love Streamlit.

I followed these instructions to create a GBQ Service Account and link it to my app. The app runs locally, but when deployed it gets a 403 Access Denied. Any ideas?

Thanks very much.

credentials = service_account.Credentials.from_service_account_info(
    st.secrets["gcp_service_account"]
)
client = bigquery.Client(credentials=credentials)

data = pd.DataFrame(etc)

pandas_gbq.to_gbq(data, destination, etc.)
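For context, the full write path I'm attempting looks roughly like this. This is a sketch, not my exact code: `my_dataset` and `my_table` are placeholder names, and the cloud-dependent imports are kept inside the function since they need the Google Cloud packages and valid credentials to run.

```python
# Hedged sketch of the write path; "my_dataset" / "my_table" are placeholders
# and "gcp_service_account" is the secrets key from the Streamlit BigQuery tutorial.
import pandas as pd


def full_table_id(dataset: str, table: str) -> str:
    """Build the "dataset.table" destination string that to_gbq expects."""
    return f"{dataset}.{table}"


def write_df(df: pd.DataFrame, dataset: str, table: str) -> None:
    """Write df to BigQuery with pandas_gbq.

    Imports are local because they require google-cloud packages and
    real Streamlit secrets.
    """
    import streamlit as st
    import pandas_gbq
    from google.oauth2 import service_account

    credentials = service_account.Credentials.from_service_account_info(
        st.secrets["gcp_service_account"]
    )
    pandas_gbq.to_gbq(
        df,
        full_table_id(dataset, table),
        project_id=credentials.project_id,  # be explicit about the project
        credentials=credentials,
        if_exists="replace",
    )


df = pd.DataFrame({"Country": ["Philippines", "Japan"]})
# write_df(df, "my_dataset", "my_table")  # needs real credentials to run
```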

I made an example app for BigQuery using Streamlit's example. In my tests it worked both locally and on the cloud.

source: github
app: bigquery example

Thanks @ferdy. It looks like the example app only reads a table; it doesn't write to one. I am specifically creating a new table. My service account's role is Owner (I tried Editor as well).

When I try reading a table I have no issue.
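(The read that works for me is roughly along these lines, adapted from the tutorial; the table path in the query is a placeholder, and the cloud imports are local to the function because they need real credentials:)

```python
import pandas as pd


def rows_to_df(rows) -> pd.DataFrame:
    """Turn an iterable of row-dicts into a dataframe."""
    return pd.DataFrame(rows)


def run_query(sql: str) -> pd.DataFrame:
    """Run a read-only BigQuery query and return the rows as a dataframe."""
    import streamlit as st
    from google.oauth2 import service_account
    from google.cloud import bigquery

    credentials = service_account.Credentials.from_service_account_info(
        st.secrets["gcp_service_account"]
    )
    client = bigquery.Client(
        credentials=credentials, project=credentials.project_id
    )
    rows = [dict(row) for row in client.query(sql).result()]
    return rows_to_df(rows)


# run_query("SELECT name FROM `my-project.my_dataset.my_table` LIMIT 10")
```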

That is correct, the example from Streamlit only reads.

I see, so is writing not supported yet?

Do you have code that writes data to BigQuery and works locally?

Yes, please see the example code in my original question.

Hi @bloukanov :wave:

This is just a hypothesis: reading the bigquery.Client API docs, I see that if you don't set the project param to your project ID, it defaults to one "inferred from the environment." Community Cloud might be blocking access to its own project, which may be what gets inferred from the app's container.

Could you try explicitly setting bigquery.Client(project="<your-bigquery-project>"), where <your-bigquery-project> is replaced with your BigQuery project ID?
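Something along these lines (a hypothetical sketch, assuming the `gcp_service_account` secrets layout from the tutorial; the explicit project falls back to the `project_id` baked into the key file so the client never has to infer one from the hosting environment):

```python
from typing import Optional


def pick_project(service_account_info: dict,
                 explicit: Optional[str] = None) -> str:
    """Choose the project to pass to bigquery.Client: an explicit ID wins,
    otherwise fall back to the project_id from the service-account JSON."""
    return explicit or service_account_info["project_id"]


def make_client(explicit_project: Optional[str] = None):
    """Build a BigQuery client with the project set explicitly.

    Imports are local: they need google-cloud-bigquery and real
    Streamlit secrets to run.
    """
    import streamlit as st
    from google.oauth2 import service_account
    from google.cloud import bigquery

    info = dict(st.secrets["gcp_service_account"])
    credentials = service_account.Credentials.from_service_account_info(info)
    return bigquery.Client(
        credentials=credentials,
        project=pick_project(info, explicit_project),  # never inferred
    )
```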

Hmm thanks I thought for sure this would be it! But unfortunately, no luck…

I tried both bigquery.Client(credentials=credentials, project="project-id") and bigquery.Client(credentials=credentials, project=credentials.project_id).

Both work locally but not when deployed…

Try this one; in my tests it worked both locally and in the cloud.

Code

"""Creates and inserts data in bigquery.

References:
    https://docs.streamlit.io/knowledge-base/tutorials/databases/bigquery
    https://cloud.google.com/bigquery/docs/tables
    https://cloud.google.com/bigquery/docs/locations
    https://practicaldatascience.co.uk/data-engineering/how-to-import-data-into-bigquery-using-pandas-and-mysql
"""


import streamlit as st
from google.oauth2 import service_account
from google.cloud import bigquery
from google.cloud.exceptions import NotFound
import pandas as pd


credentials = service_account.Credentials.from_service_account_info(
    st.secrets["gcp_service_account"]
)
client = bigquery.Client(credentials=credentials)


def create_dataset(dataset_id, region_name):
    """Create the dataset in the given region if it does not exist yet."""
    reference = client.dataset(dataset_id)
    try:
        client.get_dataset(reference)
    except NotFound:
        dataset = bigquery.Dataset(reference)
        dataset.location = region_name
        client.create_dataset(dataset)


def insert(df, table_id):
    """Load the dataframe into the given `project.dataset.table`."""
    return client.load_table_from_dataframe(df, table_id)


if __name__ == '__main__':
    data = {'Country': ['Philippines', 'Japan'], 'Capital': ['Manila', 'Tokyo']}
    df = pd.DataFrame(data)

    st.write('#### Dataframe to be imported to bigquery')
    st.dataframe(df)

    with st.form('form'):
        project_id = st.text_input('Project id')
        dataset_id = st.text_input('Dataset id')
        table_name = st.text_input('Table name')
        region_name = st.text_input('Region name')
        submit = st.form_submit_button('Submit')

    if submit:
        create_dataset(dataset_id, region_name)
        tableid = f'{project_id}.{dataset_id}.{table_name}'
        insert(df, tableid)

Streamlit output

Google bigquery view



Amazing! Thanks so much, this worked.

Ferdy, can you help me? I'm having an issue trying to save info from a Streamlit form into a BigQuery table.

This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.