Hi, I’m creating this topic because I can’t find a problem like mine. My BigQuery credentials are recognized when I run my Streamlit web application locally, but not when I deploy it from GitHub, even though the code is the same.
The error is: Access Denied: Project xxx: User does not have bigquery.jobs.create permission in project xxx.
(The xxx values are just there to hide my confidential information.)
My code is:
import streamlit as st
from google.oauth2 import service_account
from google.cloud import bigquery

def test():
    # credentials = service_account.Credentials.from_service_account_info(
    #     st.secrets["gcp_service_account"]
    # )
    # client = bigquery.Client(project="xxx", credentials=credentials)
    client = bigquery.Client()
    sql = """
        SELECT *
        FROM `xxx.xxx`
        LIMIT 1
    """
    df = client.query(sql).to_dataframe()
    return df

print(test())
All of the “xxx” values are meant to be replaced with the real values from your GCP account. For example, the project_id = "xxx" should become project_id = "<whatever-your-project-id-is>". If you followed the steps here Connect Streamlit to Google BigQuery - Streamlit Docs, and downloaded the JSON file with your credentials, you should be able to see all of values you’ll need to put into the .toml file. Once that works locally, you can just copy and paste the values from the secrets.toml file and put them in as the secrets for your app on Community Cloud.
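For reference, the secrets.toml file described above mirrors the fields of the downloaded service-account JSON. A sketch of its shape, with placeholder values (the field names come from the standard service-account key format; the values are yours to fill in):

```toml
# .streamlit/secrets.toml — values copied from the downloaded service-account JSON
[gcp_service_account]
type = "service_account"
project_id = "xxx"
private_key_id = "xxx"
private_key = "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n"
client_email = "xxx@xxx.iam.gserviceaccount.com"
client_id = "xxx"
auth_uri = "https://accounts.google.com/o/oauth2/auth"
token_uri = "https://oauth2.googleapis.com/token"
```

On Community Cloud, the same TOML content is pasted into the app's Secrets settings, and `st.secrets["gcp_service_account"]` then resolves identically in both environments.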
Hi @blackary, and thank you for your response! The xxx values are just there to hide my confidential information. Indeed, it works locally, but once deployed it won’t work, even though I put all of the information from my secrets.toml into the Streamlit app’s secrets.
Ah, so if I understand correctly, it works locally when you don’t pass any credentials (just bigquery.Client()), but it doesn’t work locally or remotely when you do pass the credentials (bigquery.Client(project=..., credentials=credentials))?
In that case, the error message is probably describing the issue accurately – you likely need to grant more permissions to the service account you created so that it has the bigquery.jobs.create permission for your project. Note that running any query creates a BigQuery job, so even read-only queries require this permission. Here is some more detail about BigQuery permissions: Access control with IAM | BigQuery | Google Cloud
Hey there, I have the exact same issue as above. My service account credentials work perfectly locally – they can read in data and do everything I need – but they don’t work on the server.
The weird thing is that I don’t know why my code would need create permissions at all. Does Streamlit caching somehow involve writing through BigQuery, though that sounds far-fetched?
Yes, it works for query results. But my problem was the complete opposite: I was loading data into BigQuery. I will try the resource caching option, thanks.