Machine learning explained: what is a GPU and why are they so important?

Calipsa 10 September 2020

In this instalment of "Machine Learning Explained", we’re going to look at an important piece of equipment that is used in many machine learning applications: the Graphics Processing Unit (GPU). 

If you’re interested in learning about the fundamentals of machine learning, check out our ebook, "Machine Learning Explained".


What is a GPU?

A GPU is a type of processor used in computing. Most computers we are familiar with use a Central Processing Unit (CPU), which enables them to carry out several tasks at once. If you’ve ever used your laptop to browse the internet and stream music in the background, all while completing some work on a word processor, then you have a CPU to thank for making the whole process seem quick and seamless. 

CPUs are excellent multitaskers that work well for simpler artificial intelligence, but when it comes to advanced machine learning they do have some drawbacks. They can carry out complex computations fairly quickly; however, the more computations you want to carry out in parallel and the larger the dataset you work with, the more a CPU lags.

While CPU cores are extremely powerful, they are not designed to handle the sheer volume of parallel computation and data that deep learning requires. To learn more about how deep learning works, take a look at our article on neural networks.

GPUs, on the other hand, are ideal for algorithms that use large datasets, packing hundreds, sometimes even thousands, more processing cores into a chip than a CPU. Originally built for gaming and 3D rendering, they are optimised to carry out a narrower range of tasks, but with much greater speed. Many chip manufacturers are now designing GPUs specifically for machine learning, enabling even faster processing speeds.

So how exactly does a GPU work when it’s used in deep learning? The numerous cores in a GPU allow machine learning engineers to train complex models on lots of data relatively quickly. The ability to rapidly perform many computations in parallel is what makes GPUs so effective; with a powerful processor, the model can make statistical predictions about very large amounts of data. If we tried to do the same thing on a CPU, the process could take months or even years to complete.
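To make the difference concrete, here is a minimal sketch using PyTorch (a framework we come back to below). It times the same large matrix multiplication, the staple operation of deep learning, first on the CPU and then on a GPU. The matrix size is arbitrary and the exact speed-up will depend on your hardware, but on a typical machine the GPU version finishes many times faster.

```python
import time

import torch

# A large matrix multiplication: the kind of operation deep learning
# models run constantly, and one that parallelises very well.
a = torch.rand(4096, 4096)
b = torch.rand(4096, 4096)

# Time it on the CPU.
start = time.perf_counter()
torch.matmul(a, b)
print(f"CPU: {time.perf_counter() - start:.3f}s")

# Time the same operation on a GPU, if one is available.
if torch.cuda.is_available():
    a_gpu, b_gpu = a.to("cuda"), b.to("cuda")
    torch.cuda.synchronize()  # wait for the data transfer to finish
    start = time.perf_counter()
    torch.matmul(a_gpu, b_gpu)
    torch.cuda.synchronize()  # GPU calls are asynchronous, so wait for the result
    print(f"GPU: {time.perf_counter() - start:.3f}s")
```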

Why are GPUs so important to the development of machine learning?

Over the past few decades, machine learning has become increasingly advanced. The GPU’s contribution is economic as much as technical: far more processing power is now available at far lower cost. Since GPUs became cheaper to manufacture and buy in the 2010s, machine learning technology has improved at a faster rate, moving out of academic institutions and into the mainstream.

Easier, cheaper access to GPUs also encourages more advanced branches of machine learning to develop. Deep learning is a subset of machine learning, where a model ‘learns’ from previous computations; in other words, the model’s performance improves by training it to learn what a correct output looks like, so it can provide that output independently when faced with brand new data. 
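In code, that training process boils down to a loop: show the model some data, compare its output to the known correct output, and adjust the model’s weights to shrink the gap. Here is a minimal PyTorch sketch, with a made-up toy task and random data standing in for real labelled examples:

```python
import torch
import torch.nn as nn

# A toy model, purely for illustration: map 10 input features to one score.
model = nn.Linear(10, 1)
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

inputs = torch.rand(100, 10)  # stand-ins for real training examples
targets = torch.rand(100, 1)  # the "correct outputs" the model learns from

for epoch in range(50):
    optimizer.zero_grad()
    predictions = model(inputs)           # the model's current guesses
    loss = loss_fn(predictions, targets)  # how far off the correct outputs
    loss.backward()                       # work out how each weight contributed
    optimizer.step()                      # nudge the weights to reduce the error
```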

To make processing faster and more efficient for deep learning models, it is possible to cluster GPUs together for even better performance. These multi-GPU setups can either be built from scratch, or accessed through cloud-based services. Deep learning frameworks such as TensorFlow and PyTorch let engineers harness this processing power to build and test statistical models.
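As a rough sketch of what clustering looks like in practice, PyTorch ships a simple wrapper, nn.DataParallel, that splits each batch of data across every GPU visible to the process (this is just one of several multi-GPU approaches, and the model and batch sizes below are arbitrary):

```python
import torch
import torch.nn as nn

# A small illustrative model; any nn.Module would work the same way.
model = nn.Sequential(
    nn.Linear(1024, 512),
    nn.ReLU(),
    nn.Linear(512, 10),
)

device = "cuda" if torch.cuda.is_available() else "cpu"
if torch.cuda.device_count() > 1:
    # DataParallel splits each input batch across every visible GPU,
    # runs the forward pass on each in parallel, and gathers the results.
    model = nn.DataParallel(model)
model = model.to(device)

batch = torch.rand(256, 1024).to(device)
output = model(batch)  # the batch is processed by all available GPUs at once
print(output.shape)    # torch.Size([256, 10])
```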

The added advantage of cloud-based services is that engineers can access extremely powerful processors more affordably than before, which means that more and more machine learning technology can become available to a wider audience.

Technology like Calipsa’s False Alarm Filtering Platform uses the rapid processing power of GPUs to train our deep learning models to recognise human activity in images. Images are a particularly data-heavy type of input, and we expose our model to millions of them to help it learn to distinguish between true and false alarms.


Can’t get enough of all things machine learning? Why not check out our ebook, “Machine Learning Explained”?

