Enter, edit and delete data without fetching the entire df again

I am currently creating a Streamlit app whose data source is Databricks, and I need my users to be able to seamlessly update and add data in Databricks.

I have two tables, dbo.budget and dbo.comments. The budget table is only updated once a month, so it makes sense to keep it in cache; I have that working and it's super fast. But I would like users to be able to save new rows to dbo.comments without the whole table being reloaded every time they add or update a row, and I don't want them to have to fill in all their comments first and only patch them to the data source at the end.
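For context, a minimal sketch of the caching setup described above, assuming the databricks-sql-connector and Streamlit's st.cache_data; the secrets keys, the get_connection() helper, and the TTL are placeholders, not the exact setup from the app:

```python
import pandas as pd
import streamlit as st
from databricks import sql  # databricks-sql-connector


def get_connection():
    # Placeholder secrets; substitute your own workspace values.
    return sql.connect(
        server_hostname=st.secrets["DATABRICKS_HOST"],
        http_path=st.secrets["DATABRICKS_HTTP_PATH"],
        access_token=st.secrets["DATABRICKS_TOKEN"],
    )


@st.cache_data(ttl=24 * 60 * 60)  # budget only changes monthly, so a long TTL is fine
def load_budget() -> pd.DataFrame:
    with get_connection() as conn, conn.cursor() as cur:
        cur.execute("SELECT * FROM dbo.budget")
        rows = [list(r) for r in cur.fetchall()]
        cols = [c[0] for c in cur.description]
    return pd.DataFrame(rows, columns=cols)
```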

To add a comment, I currently run an insert statement: `INSERT INTO dbo.comments VALUES (...)`.
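A rough sketch of what that insert might look like as a parameterized query; the column names are placeholders, the `:name` markers assume a recent databricks-sql-connector release, and it reuses the hypothetical get_connection() helper from the sketch above:

```python
def insert_comment(comment: dict) -> None:
    # Placeholder column names; parameterized to avoid string-building the SQL.
    with get_connection() as conn, conn.cursor() as cur:
        cur.execute(
            "INSERT INTO dbo.comments (budget_id, author, comment_text) "
            "VALUES (:budget_id, :author, :comment_text)",
            comment,
        )
```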

How do I avoid having to rerun `SELECT * FROM dbo.comments` after each comment has been added or amended?

You can use st.cache to avoid reloading everything.
Or you can keep the dataframe in session_state as well and just add the new rows each time.
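A minimal sketch of the session_state idea (note that in recent Streamlit releases st.cache has been superseded by st.cache_data): the comments table is fetched once per user session and later reruns reuse the local copy. load_comments() and get_connection() are the hypothetical helpers from the sketches above.

```python
import pandas as pd
import streamlit as st


def load_comments() -> pd.DataFrame:
    # Runs SELECT * FROM dbo.comments once; get_connection() is the placeholder helper above.
    with get_connection() as conn, conn.cursor() as cur:
        cur.execute("SELECT * FROM dbo.comments")
        return pd.DataFrame(
            [list(r) for r in cur.fetchall()],
            columns=[c[0] for c in cur.description],
        )


# Only hit Databricks the first time; later reruns reuse the copy in session_state.
if "comments" not in st.session_state:
    st.session_state.comments = load_comments()

st.dataframe(st.session_state.comments)
```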

Thanks for the reply. When you say to keep the dataframe in session_state and just add the new rows each time, do you mean I only add them to the dataframe I'm displaying in the app? When would I then add them to the data source?

If you don't need your data refreshed every few minutes, you can store the dataframe in cache or session_state.

And you can keep sending the data to the server as you do right now; just do the two things separately: display the new row locally and send it to the server.
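Putting the two halves together, a sketch of "display and send separately": the INSERT goes to Databricks immediately, and the same row is appended to the session_state copy so the full table is never re-fetched. insert_comment() and the column names are the placeholders from the earlier sketches.

```python
import pandas as pd
import streamlit as st

with st.form("new_comment", clear_on_submit=True):
    budget_id = st.number_input("Budget id", step=1)
    author = st.text_input("Author")
    comment_text = st.text_area("Comment")
    submitted = st.form_submit_button("Save")

if submitted:
    new_row = {"budget_id": int(budget_id), "author": author, "comment_text": comment_text}
    insert_comment(new_row)                 # send it to dbo.comments right away
    st.session_state.comments = pd.concat(  # and update only the local copy
        [st.session_state.comments, pd.DataFrame([new_row])],
        ignore_index=True,
    )

st.dataframe(st.session_state.comments)
```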
