Does GPU Hardware Help Database Workloads?

I've covered GPUs in analytic databases here in the past. This post, by a senior PM at Oracle, explains that we're not seeing GPUs in analytic databases because the workloads simply don't align that well:

The huge number of parallel computation engines provided by these devices excel at accelerating tasks that require large numbers of computations on small amounts of data. GPUs are extremely effective for Blockchain applications because these require billions of computations on a few megabytes of data. GPUs are great for deep learning since these perform repeated computational loops on megabytes to gigabytes of data. Analytics typically perform a small number of simple calculations on large amounts of data, often hundreds of gigabytes to petabytes of data.
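The quote's core argument is about data movement versus computation. A back-of-envelope sketch makes it concrete: for a large analytic scan, just shipping the data across PCIe to the GPU can cost more than scanning it in host memory on the CPU. All bandwidth figures below are my illustrative assumptions, not numbers from the article:

```python
# Back-of-envelope model of why analytics rarely benefits from GPUs:
# the cost is dominated by moving data, not by computing on it.
# All bandwidth figures are illustrative assumptions.

def seconds_to_stream(data_bytes: float, bandwidth_bytes_per_s: float) -> float:
    """Time to stream `data_bytes` at a given sustained bandwidth."""
    return data_bytes / bandwidth_bytes_per_s

TB = 1e12
PCIE_BW = 32e9       # assumed PCIe 4.0 x16 host-to-GPU bandwidth, ~32 GB/s
HOST_MEM_BW = 200e9  # assumed aggregate host DRAM bandwidth, ~200 GB/s

scan = 1 * TB  # a modest analytic table scan

to_gpu = seconds_to_stream(scan, PCIE_BW)      # time just to reach the GPU
on_cpu = seconds_to_stream(scan, HOST_MEM_BW)  # time to scan in host memory

print(f"PCIe transfer: {to_gpu:.1f} s, CPU memory scan: {on_cpu:.1f} s")
```

Under these assumptions the transfer alone takes roughly six times longer than the CPU-side scan, which is why the "small number of simple calculations on large amounts of data" shape of analytics favors staying close to memory rather than shipping data to an accelerator.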

The article goes much deeper. Fascinating read and instructive on the future of this space.
