
Ask HN: What do you use for ML Hosting
by blululu on Hacker News.
I’m trying to set up a server to run ML inference. I need to provision a somewhat beefy GPU with a decent amount of memory (8-16 GB). Does anyone here have personal experience with, or recommendations for, the various companies operating in this space?
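
For reference, a quick sanity check I'd run on any freshly provisioned instance to confirm it actually exposes a GPU in that memory range (a minimal sketch assuming PyTorch and an NVIDIA card; the 8 GB threshold just mirrors the range above):

    import torch

    # Fail fast if the instance has no visible CUDA device.
    if not torch.cuda.is_available():
        raise SystemExit("No CUDA-capable GPU visible on this instance")

    # Report the device name and total memory of the first GPU.
    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / 1024**3
    print(f"GPU: {props.name}, memory: {total_gb:.1f} GB")

    # 8 GB is the lower end of the range I'm targeting.
    if total_gb < 8:
        print("Warning: less GPU memory than the 8-16 GB target")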
