AWS deploy on free tier with 1GB memory

My dataset is around 800MB. Model A is 10KB (a linear regression), while model B is 600MB (a random forest).

Both deploy fine locally, but when I try model B on AWS, the prediction step breaks.
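For reference, here's a quick sketch of how I'd measure what the model actually costs in RAM once loaded (assuming it was saved with joblib; `model_b.joblib` is just a placeholder name, and `psutil` is a third-party package):

```python
import os

import joblib  # assuming the random forest was saved with joblib/pickle
import psutil  # third-party: pip install psutil


def rss_mb() -> float:
    """Resident memory of the current process, in MB."""
    return psutil.Process(os.getpid()).memory_info().rss / 1024**2


before = rss_mb()
model = joblib.load("model_b.joblib")  # hypothetical filename
after = rss_mb()
print(f"Model footprint once loaded: ~{after - before:.0f} MB")
```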

http://ec2-54-179-165-223.ap-southeast-1.compute.amazonaws.com:8501/

P.S. It’s OK if I can’t deploy model B, as long as I know whether this is the reason it fails. This is just a pet project, and the current model is good enough IMHO.

Hi @Andrew_Ng -

These types of things are always “it depends”… the file size of the model and the RAM it takes up can be widely different, depending on whether the model is serialized in a binary format or a text one. The same idea applies to data: “800MB” of data could be a dramatically different-sized dataset depending on whether it’s a CSV file or Parquet (which is a compressed format).
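To make that concrete, here's a rough sketch showing how the same data can come out at very different sizes depending on the format (a toy DataFrame stands in for your real one; writing Parquet needs `pyarrow` or `fastparquet` installed):

```python
import os

import numpy as np
import pandas as pd

# A toy 1M-row DataFrame standing in for the real 800MB dataset.
df = pd.DataFrame({
    "feature": np.random.rand(1_000_000),
    "category": np.random.choice(["a", "b", "c"], size=1_000_000),
})

df.to_csv("data.csv", index=False)
df.to_parquet("data.parquet")  # needs pyarrow or fastparquet installed

for path in ("data.csv", "data.parquet"):
    size_mb = os.path.getsize(path) / 1024**2
    print(f"{path}: {size_mb:.1f} MB on disk")

# In-memory size is a third number, often larger than either file.
in_mem_mb = df.memory_usage(deep=True).sum() / 1024**2
print(f"In memory: {in_mem_mb:.1f} MB")
```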

If you’re running into issues, my first guess would be that your VM’s memory is the issue. Locally, I’m sure you have at least 8GB of RAM, if not more, so it’s not surprising that it works there but not on AWS.
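If you want to confirm that on the instance itself, a quick check like this (using the third-party `psutil` package, run in a Python shell on the VM) will show how little headroom a 1GB box leaves once the OS and Python are running:

```python
import psutil  # third-party: pip install psutil

vm = psutil.virtual_memory()
print(f"Total RAM:     {vm.total / 1024**2:.0f} MB")
print(f"Available RAM: {vm.available / 1024**2:.0f} MB")
```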


Hi @randyzwitch ,

Thanks for your prompt response.

The dataset is just a plain .csv read with pandas, and the AWS hardware I’m using is the free tier with only 1GB of memory. Do you think 2GB of memory would help?
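In the meantime, I might try trimming pandas memory at load time, something like this sketch (the column names and dtypes are made up for illustration):

```python
import pandas as pd

# Hypothetical column names and dtypes -- adjust to the actual CSV.
dtypes = {
    "feature_1": "float32",  # half the memory of the default float64
    "feature_2": "float32",
    "label": "category",     # compact for low-cardinality strings
}

# Load only the columns we need, with the smaller dtypes.
df = pd.read_csv("data.csv", dtype=dtypes, usecols=list(dtypes))
print(f"{df.memory_usage(deep=True).sum() / 1024**2:.1f} MB in memory")
```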

Thanks again!