Since a Streamlit app must be tied to a GitHub repo, if you aren't using an external data source you have to push your updated data to the repository every time it changes.
My application reads a CSV and displays a chart. The relevant code is roughly this (I use `st.cache` to stop the load function from re-running on every interaction):

```python
import streamlit as st
import pandas as pd

@st.cache  # avoid re-reading the file on every rerun
def load_data():
    return pd.read_csv('Data/csv')

data = load_data()
st.line_chart(data)
```
Since I cannot manually push the CSV to GitHub fast enough, the only option remaining seems to be uploading the CSV to an external data store and reading from that instead of from the local file.
The problem is that this is proving incredibly difficult for me to implement. I followed the Streamlit documentation for the Google Sheets API, but it only has instructions for reading from a sheet. I need to upload my dataframe to the sheet first and then read it back. And the gsheetsdb documentation is incomprehensible to someone not versed in SQL.
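For what it's worth, here is roughly the upload step I have in mind, sketched with the gspread library. The sheet name, the credentials file, and whether gspread is even the right tool here are all guesses on my part, not something I got working:

```python
import pandas as pd

def upload_to_sheet(df, sheet_name="my-sheet", creds="service_account.json"):
    """Push a dataframe to the first worksheet of a Google Sheet.

    Assumes gspread is installed and the sheet is shared with the
    service account email -- both unverified assumptions on my part.
    """
    import gspread  # imported here so the rest of the script runs without it
    gc = gspread.service_account(filename=creds)
    ws = gc.open(sheet_name).sheet1
    ws.clear()
    # first row = column headers, remaining rows = the data itself
    ws.update([df.columns.tolist()] + df.values.tolist())

# toy stand-in for the dataframe my app actually produces
df = pd.DataFrame({"time": [1, 2, 3], "value": [10.0, 12.5, 11.8]})
# upload_to_sheet(df)  # commented out: needs real credentials and a real sheet
```

Is something along these lines the intended workflow, or am I off track entirely?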
What are my remaining options? From what I've read, it takes a long time to learn SQL properly, but my app is already done and I need to deploy it ASAP. The other databases mentioned in the Streamlit docs look too complex and costly for such a simple app.
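In case it clarifies what I'm after: the closest I've found to avoiding SQL entirely on the read side is having pandas pull the sheet's CSV export URL directly. The `SHEET_ID` placeholder and the assumption that the sheet is shared as "anyone with the link can view" are mine, and I don't know if this is a sanctioned approach:

```python
import pandas as pd

SHEET_ID = "YOUR_SHEET_ID"  # placeholder: the long id from the sheet's URL
csv_url = f"https://docs.google.com/spreadsheets/d/{SHEET_ID}/export?format=csv"

# Commented out because the id above is a placeholder; with a real,
# link-shared sheet this should return a dataframe I can chart.
# df = pd.read_csv(csv_url)
# st.line_chart(df)
```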
Again, all I want to do is read a CSV and display a line chart. Any help is greatly appreciated. Please assume that I have already read all of the official Streamlit documentation.