Many researchers hear about the huge hardware clusters that companies use for machine learning and AI. But when you're getting started, or even taking on more extensive projects, do you really need to move away from your work laptop?
Most people use laptops these days. Desktop computers have become quite niche, clunky pieces of hardware mostly used by gamers and people doing numerical or image-processing work. Yet many people own a laptop, bought themselves or issued through work. I've had several friends ask me whether they can buy the new MacBook M1, or whether they need a $3,000 laptop to break into data science.
So the new Apple, eh?
Historically, Apple marketed its machines toward "creatives". But with the most recent iteration of the MacBook, Apple has stepped up its game. Everyone working close to the hardware was nervous about compatibility when Apple moved away from Intel to its own ARM-based chips. Still, alongside the M1 laptops, Apple immediately published an M1-optimized version of Google's popular deep learning framework, TensorFlow!
The crazy thing is that I've never even owned an Apple product, and I'm still impressed with the execution. Before, my Mac-owning friends and colleagues had to buy expensive external GPUs to run simulations, and now this! But this isn't just about Macs. Let's look at machine learning itself, because not all ML is deep neural networks. Your 2016 MacBook may still be adequate for many machine learning applications. And the best part? If it's not enough anymore, you can rent an instance in the cloud for a few dollars while you train!
Hardware Considerations for Machine Learning
Let's have a quick chat about computer hardware. Computers are made up of a few essential parts that do the computery things. For the kind of numerical work we're talking about, we're primarily interested in three of them. The CPU is the brain of the computer: it handles complex tasks and computations and runs every program you have. The CPU closely interacts with special memory called RAM, which is quite similar to short-term memory in humans. This is opposed to regular hard drives, which are much slower but perfect for longer-term storage of information.
Then there's the graphics processing unit, or GPU. GPUs were developed over time to make games more realistic, but because 3D graphics are essentially just a bunch of linear algebra and matrix calculations, people slowly figured out that you can do scientific calculations with them as well. GPUs are really good at this one thing and one thing only: throwing a bunch of matrices at each other. They often come with dedicated memory, usually called VRAM, which is even faster and even smaller than your CPU's RAM.
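Curious what your own machine packs? Here's a minimal sketch that prints the three things we care about, assuming the third-party psutil package is installed; the GPU line is just a crude heuristic that looks for Nvidia's driver tool:

```python
import os
import shutil

import psutil  # third-party: pip install psutil

# Number of logical CPU cores available for training
print(f"CPU cores: {os.cpu_count()}")

# Total and currently available RAM in gigabytes
mem = psutil.virtual_memory()
print(f"RAM: {mem.total / 1e9:.1f} GB total, {mem.available / 1e9:.1f} GB free")

# Crude check for an Nvidia GPU: is the nvidia-smi tool on the PATH?
print(f"Nvidia driver found: {shutil.which('nvidia-smi') is not None}")
```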
Many laptops, especially work-issued office laptops, only have a CPU with so-called integrated graphics. They can play Netflix and YouTube but will usually buckle when you start up any 3D game. This gets really specific really quick, but do you even need a GPU?
No, probably not. Most machine learning is done on the normal CPU every laptop has, so you should be able to train most simple models in scikit-learn, even on your phone.
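As a minimal sketch using scikit-learn's small built-in iris dataset, training a simple classifier takes a fraction of a second on any CPU:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# A tiny built-in dataset: 150 samples, 4 features
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Plain CPU training, no special hardware needed
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Test accuracy: {model.score(X_test, y_test):.2f}")
```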
The CPU decides how fast your model trains, and usually that doesn't matter much; model training times are relatively short anyway. The limiting factor on your laptop is RAM. Only a few classic machine learning methods can be trained iteratively, fed your dataset piece by piece, and none of them are the ones I usually throw at a problem first, like Support Vector Machines or Random Forests. There are some tricks, of course, but basically you need to fit the whole dataset AND your model into the memory (the RAM) of the laptop. On smaller problems this is negligible. Still, some of the problems I'm working on right now require millions of data points; at that scale, I'm consuming hundreds of GB of data. In my experience, for classic machine learning you should prioritize RAM first, then CPU or GPU.
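One of those tricks, as a sketch: a handful of scikit-learn estimators support out-of-core learning through partial_fit, so you can stream your dataset in chunks instead of holding it all in RAM at once. Here the chunks are simulated with random data for illustration:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
classes = np.array([0, 1])  # partial_fit needs to know all classes up front

# A linear model trained with stochastic gradient descent
model = SGDClassifier()

# Pretend each chunk was read from disk; only one chunk is in RAM at a time
for _ in range(10):
    X_chunk = rng.normal(size=(1_000, 20))
    y_chunk = rng.integers(0, 2, size=1_000)
    model.partial_fit(X_chunk, y_chunk, classes=classes)
```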
Especially in office computers, the amount of RAM can be as low as 4 GB (at which point opening Excel files becomes a dread), and such a laptop quickly reaches capacity on your problem. But essentially, you should be able to validate your machine learning code on any old machine with enough memory, and often even train models in a reasonable time. Random Forests, for example, are popular partly because they're very fast to train.
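To put a rough number on that, here's a small sketch timing a Random Forest on synthetic data (the dataset size is made up for illustration); on a modest laptop CPU this kind of thing typically finishes in seconds:

```python
import time

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# 50,000 synthetic samples with 20 features
X, y = make_classification(n_samples=50_000, n_features=20, random_state=0)

start = time.perf_counter()
# n_jobs=-1 spreads the trees across all available CPU cores
RandomForestClassifier(n_estimators=100, n_jobs=-1).fit(X, y)
print(f"Trained in {time.perf_counter() - start:.1f} s")
```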
You can move classical machine learning onto the GPU with Nvidia's RAPIDS cuML, which is basically scikit-learn on a GPU. That speeds up a lot of processing but confronts you with similar problems regarding the memory available on the GPU card. However, let's be honest here: if you want to train a deep learning model, you definitely want an Nvidia GPU in your laptop or desktop PC.
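As a sketch, assuming a machine with an Nvidia GPU and the RAPIDS cuML package installed, the API mirrors scikit-learn almost one-to-one:

```python
import numpy as np
from cuml.ensemble import RandomForestClassifier  # RAPIDS cuML, GPU-backed

rng = np.random.default_rng(0)
# cuML prefers float32 inputs; the data gets copied into GPU memory (VRAM)
X = rng.normal(size=(50_000, 20)).astype(np.float32)
y = rng.integers(0, 2, size=50_000).astype(np.int32)

# Same estimator name and fit/predict interface as scikit-learn
model = RandomForestClassifier(n_estimators=100)
model.fit(X, y)
print(model.predict(X[:5]))
```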
Reasons against running Deep Learning on your Beefy PC
Is that all?
No.
I'm not a fan of running deep learning on local hardware anymore. Running a reasonably big deep learning problem on your main PC will usually lock up the entire machine; opening even small programs becomes a drag, as powerful as the machine may be. Deep learning models train for hours or even weeks, during which your machine can be barely usable while drawing significant power.
Where there's power, there's heat! A machine training deep learning models for extended periods can become quite hot. If you're like me during the pandemic, a lot of work happens cosied up on the couch, despite having an amazing desk with screens and everything. Training might make the machine a bit more toasty than you'd prefer. Gaming laptops with dedicated Nvidia graphics cards are OK for this kind of training, and the M1 MacBooks will be too, but gaming laptops are notorious for being uncomfortably hot on the legs when playing heavy games or training models for extended periods, something to consider.
My Recommendation
You can totally train machine learning models on laptops or even your phone. Deep learning is a bit trickier, but it's also becoming much more commonplace with Apple's adoption of the M1 chip.
For Machine Learning for Science, I personally recommend getting a reasonable work computer with enough RAM and a decent CPU, but leave deep learning to dedicated machines or the cloud.

Try it out: train a model on your laptop!