Sentence-transformers using CPU on local machine while using Streamlit despite having GPU enabled

Hello, I was trying to run a GUI for a sentence-transformers model using Streamlit, but Streamlit does not seem to use my GPU locally; it gives me the following message: Use pytorch device: cpu. When not using Streamlit I can use the GPU properly. Can anyone help me with this?
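
For reference, a minimal way to check this from inside the Streamlit script is to print what torch and the loaded model report (the model name below is just an example):

import torch
import streamlit as st
from sentence_transformers import SentenceTransformer

# Does the Streamlit process see the GPU at all?
st.write(f'cuda is available: {torch.cuda.is_available()}')

# Without an explicit device argument, sentence-transformers falls back
# to CPU whenever torch.cuda.is_available() returns False
model = SentenceTransformer('all-MiniLM-L6-v2')
st.write(f'model device: {next(model.parameters()).device}')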

Tagging @randyzwitch because I saw you replying to similar questions earlier.

I tried running PyTorch on the GPU with Streamlit locally and it works fine.

streamlit==1.20.0
torch==1.13.1+cu116
import torch
import streamlit as st

st.write(f'**cuda is available:** {torch.cuda.is_available()}')

st.subheader('tensor')
x = torch.randint(1, 100, (100, 100))
st.dataframe(x)

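# Move the tensor to the GPU and square it there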
x = x.to(torch.device('cuda'))

res_gpu = x ** 2
st.subheader('gpu result operation')
st.write(res_gpu)

Output (screenshot)


Hey @ferdy, thanks for your response. This problem does not arise when I am using torch directly, but it does when I use sentence-transformers. I have changed the question accordingly.

I tried it with sentence-transformers and it worked just fine.

from sentence_transformers import SentenceTransformer
import streamlit as st

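# Passing device='cuda' explicitly forces the model onto the GPU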
model = SentenceTransformer('all-MiniLM-L6-v2', device='cuda')

# The sentences we want to encode
sentences = ['The quick brown fox jumps over the lazy dog.']

if st.button('run'):
    # Sentences are encoded by calling model.encode()
    sentence_embeddings = model.encode(sentences)

    # Display each sentence and its embedding
    for sentence, embedding in zip(sentences, sentence_embeddings):
        st.write("Sentence:", sentence)
        st.dataframe(embedding.tolist(), use_container_width=True, height=300)

Output (screenshot)


Which sentence_transformers version did you use? Mine gives a ModuleNotFoundError. Could you show your requirements.txt?
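
In case it helps, a quick way to check which versions are actually installed in the environment that launches Streamlit (a generic check, not a specific requirements.txt):

# If this import raises ModuleNotFoundError, sentence-transformers is not
# installed in the same Python environment that runs Streamlit
import sentence_transformers
import streamlit
import torch

print('streamlit', streamlit.__version__)
print('torch', torch.__version__, '| cuda build:', torch.version.cuda)
print('sentence-transformers', sentence_transformers.__version__)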
