Caching a database connection pool for multiple users

Hello!

I am running my multipage app locally, and building it with Docker in preparation for deployment via Azure. In my app, users select filters and send a query to my Supabase database. Since users can send several queries one after another, I want to cache the database connection pool globally, so that each new user can “borrow” a connection from the shared pool. However, I can’t figure out how to store the connection pool globally, so that the pool is not recreated each time a user interacts with the site. I am using Python 3.12 with psycopg2-binary for the database access. The closest solution so far is using st.session_state; however, this only saves the connection pool individually for each user session.

Here is the relevant code I have so far:

import os

import streamlit as st
from psycopg2 import pool

def initialize_connection_pool():
    if "connection_pool" not in st.session_state:
        st.session_state.connection_pool = pool.SimpleConnectionPool(
            1, 20,
            host=os.getenv("host_hidden"),
            dbname=os.getenv("db_hidden"),
            user=os.getenv("user_hidden"),
            port=os.getenv("port_hidden"),
            password=os.getenv("password_hidden")
        )

def get_connection():
    initialize_connection_pool()
    conn = st.session_state.connection_pool.getconn()
    return conn

def return_connection(conn):
    if "connection_pool" in st.session_state and conn:
        st.session_state.connection_pool.putconn(conn)

Edit:
I have not gotten st.cache_resource to work.

I am grateful for any guidance I can get!