Putting a GPT-2 Model Up For Others to Interact With!

Hi All! Super new to Streamlit (& new to Data Science in general!), so excited by Streamlit's ease-of-use. In a sentence, I'm finetuning GPT-2 to give (hopefully decent!) relationship advice. I'd like anybody in the world to be able to interact with it. Just wondering what app I'd use in order to upload my version of the model, and to let others interact with it through a simple text input & text-only output.

Thanks all!

Hey @rjmalka,

First, welcome to the Streamlit community!! :partying_face: :partying_face: :two_hearts: :tada:

I'm actually not sure what you mean by this:

Streamlit will create your web app interface for you, and you can deploy it to share with the world :world_map: using the Streamlit Sharing platform. If you haven't signed up for Sharing, you can here: Streamlit Community Cloud • Streamlit

This will allow you to deploy your app in one click from your linked GitHub account (put your code and all the files you need in a public GitHub repo and sign up with your GitHub-associated email).

This is a free service, so there are limits on the RAM we give to each person; 800 MB is the max allowance!

Hope this helps?
Marisa

Hi Marisa,

Thanks for your patience, and sorry for being unclear! I should have waited until I was less burned out before writing that :slight_smile: Basically, I have code that calls GPT-2 and uses it to generate text. I also have the .rar file that represents the trained model checkpoint, so the model doesn't have to keep retraining. I want people to be able to ask GPT-2 for relationship advice, and that requires getting a text input from the user, which replaces a certain line of code in this notebook.

I've looked through the Streamlit docs and found streamlit.text_input(), which is awesome! While I'm putting the app together, I'm wondering if there's an API for sharing the results of the output with friends through FB, Twitter, etc.?

Lastly, I assume I should store my .rar file in the same GitHub folder and call it as I usually do before generating text?

Assuming the model uses only the .rar file, rather than trying to train in real time each time someone asks a question, I don't believe I'll ever exceed the 800 MB RAM limit.

Please let me know if I'm still being unclear, but I hope this helps articulate my questions!

Glad to be part of the Streamlit community :slight_smile:

Warmly,
Robert

1 Like

Related: I'm encountering the following error (ModuleNotFoundError: No module named 'gpt_2_simple') when I run:

%tensorflow_version 1.x
!pip install -q gpt-2-simple 

Makes perfect sense that this isn't executable in Streamlit… wondering if there's any way to have the model run through Streamlit? I'm getting tired, but if not, I'm sure there's some way to connect to & borrow from e.g. Google Colab's runtime/resources? Wondering if you've encountered this kind of problem before and if there are any suggestions for getting GPT-2 to run!

EDIT: Currently have the following code in question:

%tensorflow_version 1.x
!pip install -q gpt-2-simple #wondering what I can do instead of attempting a pip install, as I usually do on Google Colab?
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name="355M")
gpt2.copy_checkpoint_from_gdrive(run_name='run1')  #Other question is how to get this to work.

input = st.text_input('Describe your relationship troubles.', '')

text = gpt2.generate(sess, run_name='run1',
                     temperature=0.7,
                     top_k=40,
                     nsamples=100,
                     batch_size=25,
                     length=200,
                     prefix="<|startoftext|>[WP]" + input,
                     truncate="<|endoftext|>",
                     include_prefix=False,
                     sample_delim=''
                     )

st.write('', text)

Appreciate the help! Thanks all :blush:

Hi @Marisa_Smith! Just want to reach out on this and see if you have an answer to my questions (no luck so far) – thank you!

Robert

Hey Robert,

So sorry about the long delay; I haven't been around on the forum as much as I would like, as we have a lot of exciting things in the works!

No worries, we have all been there! Thanks for clearing this up and adding your code snippets!

So there is no Streamlit API for this, but community members make components (which are plug-ins) for Streamlit! :partying_face: You may find something out of the box there that works for you, or you might be able to adapt something someone else made! I'm thinking of the Disqus and Discord components that @okld made!

Here is a link to our components: App Gallery • Streamlit

Also, recently there was this fantastic app made by @jrieke (creator turned team member :sunglasses:) where he linked Twitter into his Year on Github app! I took the liberty (hope that's ok, @jrieke :wink:) of grabbing his GitHub repo for you; I think you will have some success with this example of linking Twitter into Streamlit! GitHub - jrieke/year-on-github: 🐙 Share your Github stats for 2020 on Twitter
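For the "share on Twitter" part specifically, one simple pattern (just a sketch, not necessarily how year-on-github does it internally) is a pre-filled tweet link built from Twitter's web intent URL; the function name here is only illustrative:

```python
import urllib.parse

def twitter_share_link(text: str) -> str:
    """Return a URL that opens Twitter with a pre-filled tweet of `text`."""
    # urllib.parse.quote percent-encodes spaces and special characters
    return "https://twitter.com/intent/tweet?text=" + urllib.parse.quote(text)
```

In the app you could then render it as a markdown link, e.g. `st.markdown(f"[Share on Twitter]({twitter_share_link(advice)})")`, so users can tweet the generated advice with one click.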

I would say yes; depending on its size, you could load it in before or after. But you may want to make a small function for it, so that you can add the @st.cache decorator to it in your Streamlit app. Because (I'm assuming) your .rar file won't change, the model will only have to load one time, which will speed up the app for users!

I am less sure about this one. Based on the notebook link you sent me, it seems you're trying to translate a Jupyter Notebook into a Streamlit app? Jupyter notebooks let you run pip install commands directly from notebook cells, but Streamlit acts just like any Python package: you need to install your dependencies in your environment before running the script. (Assuming I have understood this issue correctly!)

Here are the steps I would recommend:

  1. Make a clean environment (pip or conda) for a specific Python version
  2. In that new environment, pip or conda install the packages you want to use: streamlit, gpt-2-simple, tensorflow (?) and any others
  3. In your streamlit_app.py script, remove the %tensorflow_version 1.x and !pip install -q gpt-2-simple lines, and just import the packages like the third line in your app there :point_up: (import gpt_2_simple as gpt2)
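The steps above could look roughly like this (a sketch, assuming conda; the environment name is arbitrary, and the TensorFlow 1.x pin reflects what gpt-2-simple targeted at the time, which in turn needs Python 3.7 or older):

```shell
# 1. Clean environment with a Python version TF 1.x supports
conda create -n gpt2-app python=3.7
conda activate gpt2-app

# 2. Install dependencies into that environment (no notebook magics needed)
pip install streamlit gpt-2-simple "tensorflow>=1.15,<2.0"

# 3. Run the app; streamlit_app.py should only `import` packages, not install them
streamlit run streamlit_app.py
```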

Letā€™s see where this gets us!

Also, have you put your app in a GitHub repo? If you can do that and share it, that would be super helpful!

Hope this helps!
Happy Streamlit-ing!
Marisa