Caching asyncio functions

First of all, great platform!

Next, I’m trying to cache an asyncio function (the library I’m using forces me to), but I’m getting this error: Object of type builtins.coroutine: <coroutine object get_channel_details at 0x1a9be3560>

  1. builtins does not have a coroutine attribute; it’s inside asyncio.
  2. How can I cache it? This does not help: hash_funcs={asyncio.coroutine: lambda x: x} (probably because asyncio.coroutine does not correspond to builtins.coroutine as shown in the error message).
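(For what it’s worth, the error message is right about the type: the runtime type of a coroutine object really does report builtins as its module, and asyncio.coroutine is a decorator, not that type. A quick self-contained check, using only the standard library:)

```python
import types

async def demo():
    return 42

coro = demo()
# A coroutine object's type is types.CoroutineType, whose __module__ is
# "builtins" -- which is why the error says "builtins.coroutine".
print(type(coro).__module__, type(coro).__name__)  # builtins coroutine
coro.close()  # close it to avoid a "never awaited" warning
```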

Here is the function itself:

async def get_channel_details(channel):
    tg_client = TelegramClient(SESSION_FILE, tg_env['TELEGRAM_APP_ID'], tg_env['TELEGRAM_APP_HASH'])
    channel_details = {'title': "", 'description': "", 'recent_posts': []}
    async with tg_client:
        ent = await tg_client.get_entity(channel)
        assert isinstance(ent, tl.types.Channel)
        meta = await tg_client(functions.channels.GetFullChannelRequest(channel))
        posts = await tg_client.get_messages(channel, limit=10)
        channel_details.update({
            'title': meta.chats[0].title,
            'description': meta.full_chat.about,
        })
    if len(posts) > 0:
        channel_details['recent_posts'] = [p.message for p in posts if p.message is not None]
    return channel_details

Hey @Rusteam ! Thanks for reaching out!

To be honest, I’m not as familiar with our caching setup, but we likely didn’t design it with asyncio in mind. This is probably because Streamlit focuses on running a script top-to-bottom to build the interface, so async functions come up less often.

I basically searched for how to cache async functions and found this.

They describe a caching solution they created, and I noticed they used loop.run_until_complete, which basically makes an async call synchronous.

Curious if this would work for you:

import asyncio

# async def get_channel_details(channel):
#     ...

@st.cache # options here
def cached_channel_details(channel):
    loop = asyncio.get_event_loop()
    return loop.run_until_complete(get_channel_details(channel))
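(A side note: on Python 3.7+, asyncio.run is a simpler alternative to get_event_loop/run_until_complete, since it creates and closes its own event loop. Here is a sketch with a hypothetical stand-in function and functools.lru_cache in place of st.cache, just so it runs outside Streamlit:)

```python
import asyncio
import functools

# Hypothetical stand-in for the real Telethon function, just to keep
# the sketch self-contained.
async def get_channel_details(channel):
    return {'title': channel, 'description': '', 'recent_posts': []}

# In a Streamlit app you would decorate with @st.cache instead;
# lru_cache is only used here so the example runs on its own.
@functools.lru_cache(maxsize=None)
def cached_channel_details(channel):
    # asyncio.run (Python 3.7+) creates and tears down its own event
    # loop, so there is no need to fetch one with get_event_loop().
    return asyncio.run(get_channel_details(channel))

print(cached_channel_details('my_channel'))
```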

Hope that helps!

thanks a lot, it’s definitely worth checking out

@Rusteam Hey dude. I was browsing to see if streamlit had implemented async caching yet and seems they haven’t. As an alternative you can use aiocache. Here is a quick example with your func name:

from aiocache import Cache
from aiocache import cached

@cached(ttl=None, cache=Cache.MEMORY)
async def get_channel_details(channel):
    ...
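(If you’d rather avoid the extra dependency, the core idea, memoizing a coroutine’s result in an in-process dict, can be sketched with the standard library alone. This is a simplified version of what aiocache’s MEMORY backend does, with no TTL or locking, and a hypothetical stand-in function body:)

```python
import asyncio
import functools

def async_memoize(func):
    # Minimal in-memory memoization for coroutines: roughly what
    # aiocache's Cache.MEMORY backend does, minus TTL and locking.
    cache = {}

    @functools.wraps(func)
    async def wrapper(*args):
        if args not in cache:
            cache[args] = await func(*args)
        return cache[args]
    return wrapper

calls = 0

@async_memoize
async def get_channel_details(channel):
    # Hypothetical stand-in body; the real function queries Telegram.
    global calls
    calls += 1
    return {'title': channel}

asyncio.run(get_channel_details('some_channel'))
asyncio.run(get_channel_details('some_channel'))
print(calls)  # 1 -- the second call was served from the cache
```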

Hey streamlit team. I’d very highly recommend y’all implement async caching. The reason is there has been a seismic shift in Python API development since the release of FastAPI (30k+ stars on GitHub in 2 years). I realize that many data scientists are building without the “need” to use async/await, but it will become more necessary as more ML code needs to be integrated with an API framework like FastAPI, which is gaining API market share vs Django/Flask.

amazing! thanks for reaching out

Hi all! We’re currently doing some work on st.cache, so this conversation is super relevant and timely.

I’m curious to see some stand-alone app examples showing how you’d like to use asyncio in Streamlit. Would anyone be able to share?
