Connecting LlamaIndex custom retriever to Streamlit chat engine

Hello Everyone,

I am currently struggling to build my first Streamlit app. I followed the tutorial "Build a chatbot with custom data sources, powered by LlamaIndex" and tried to integrate it with a local LLM, a local embedding model, and a custom retriever. My custom retriever is a variation of the QueryFusionRetriever, which works very well with my data:

custom_retriever = QueryFusionRetriever(
    [vector_retriever, bm25_retriever],
    llm=llm,
    similarity_top_k=top_k,
    num_queries=0,
    mode="reciprocal_rerank",
    use_async=False,
    query_gen_prompt=vary_question_tmpl
)

However, the chat engine stored in st.session_state only ever uses the default LlamaIndex retriever. This is the line where I try to pass my custom one:

st.session_state.chat_engine = index.as_chat_engine(chat_mode="react", llm=llm, retriever=custom_retriever, verbose=True)

This produces the following error:

TypeError: RetrieverQueryEngine.from_args() got multiple values for argument 'retriever'

How would I go about solving this problem?