A picture was worth a thousand words

Stan Wade | 25 August 2017

They say a picture is worth a thousand words. Is that true in the world of computing? Well, I suppose most of us would say a word is usually 64 bits nowadays, and we render our big pictures with a GPU. Based on some work I have been doing with a client recently, that ‘picture’ can say not thousands but billions of words.

I remember getting my first CGA graphics card many decades ago and being so impressed at seeing colour on my screen. Things have got a tad better over time, and the behemoth in my current PC is the most expensive component of the system. That’s because I like to play games, and high-resolution graphical rendering takes some serious computational muscle.

Quite a long time ago, some bright spark looked at what these GPUs did and decided that maybe there was another role for what is essentially a massive maths co-processor. Things have moved on since then, and the world of GPGPU (General-Purpose Graphics Processing Unit) computing has become more mainstream. In fact, 7 of the world’s 10 fastest supercomputers use GPUs to power their processing.

So, what’s so special about a GPU over a CPU? Well, they are designed to do different things. A CPU has a big instruction set and typically between 4 and 20 cores (ignoring hyperthreading), so it can do up to around 20 things at once. A GPU, in many ways, is pretty dumb and can only do a small number of things, but the cards we are playing with have over 7,000 cores. So, as long as you want to do the kind of things a GPU is good at, the throughput is astounding. Let’s look at an example. We have a security client who sometimes needs to legitimately crack passwords. We ran a test that calculated 1.3 billion hashes. On the Xeon CPU it took 236 seconds. On the GPU it took just a single second! It’s an extreme example, but you can see why it’s all quite exciting.
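To give a feel for the shape of that workload, here’s a minimal sketch in Python that brute-force hashes a batch of made-up candidate passwords on the CPU using the standard hashlib library. The numbers quoted above come from our client test, not from this snippet, and in practice the GPU side would be handled by a dedicated cracking tool rather than hand-rolled code; the point is simply how repetitive and independent each little piece of work is, which is exactly what thousands of GPU cores eat for breakfast.

```python
import hashlib
import time

def hash_candidates(candidates):
    """Hash each candidate password with SHA-256, one after another, on the CPU."""
    return [hashlib.sha256(c.encode()).hexdigest() for c in candidates]

if __name__ == "__main__":
    # A modest batch of illustrative candidates (a far cry from 1.3 billion!)
    candidates = [f"password{i}" for i in range(1_000_000)]

    start = time.perf_counter()
    digests = hash_candidates(candidates)
    elapsed = time.perf_counter() - start

    print(f"Hashed {len(digests):,} candidates in {elapsed:.2f}s "
          f"({len(digests) / elapsed:,.0f} hashes per second)")
```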

So we have a great new tool in the box. But what can we do with it? What could you use this kind of computing power for? We are all aware of Big Data, and we are getting better at collecting it. The real challenge is learning things from the petabytes we have recorded. Welcome to the world of Machine Learning. This is essentially pointing a big computer at a heap of data and saying ‘what do you make of that?’. Arthur Samuel, who coined the term, described it as giving ‘computers the ability to learn without being explicitly programmed’. He came up with that in 1959, when it was almost science fiction. Well, now it’s science fact!
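To make that a little more concrete, here’s a minimal sketch (assuming Python with the scikit-learn library installed) of learning from data rather than being explicitly programmed. We never write any rules for telling the flower species in the classic Iris dataset apart; we just hand the model labelled examples and let it work the rules out for itself.

```python
# A minimal 'learning without being explicitly programmed' sketch, assuming
# scikit-learn is available. No classification rules are written by hand;
# the model infers them from labelled examples.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)  # the 'learning' step
print(f"Accuracy on unseen data: {model.score(X_test, y_test):.2%}")
```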

And the technology train doesn’t stop! The next logical stage is to progress from reutilising something designed for rendering graphics to something specifically designed for number crunching. Now we have the TPU, or Tensor Processing Unit. This is the next technology step, designed to give a computer an enhanced capability for Machine Learning. The next version of the card we are using will use some TPU technology and is said to be 12 times faster at Machine Learning. That’s a mind-blowing capability.
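For a sense of what ‘tensor processing’ actually means, the operation these units are built to accelerate is, at heart, large matrix multiplication, the workhorse of a neural-network layer. The NumPy sketch below (sizes are purely illustrative) runs one such layer on the CPU; a TPU or tensor core carries out the same arithmetic as one massively parallel, hardware-fused step.

```python
import numpy as np

# One fully connected neural-network layer: y = relu(x @ W + b).
# Training repeats operations like this millions of times, and this dense
# matrix arithmetic is exactly what TPUs and tensor cores are built to speed up.
batch = np.random.rand(1024, 512).astype(np.float32)    # 1,024 inputs, 512 features each
weights = np.random.rand(512, 256).astype(np.float32)   # layer weights
bias = np.zeros(256, dtype=np.float32)

activations = np.maximum(batch @ weights + bias, 0.0)   # ReLU non-linearity
print(activations.shape)                                 # (1024, 256)
```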

So, back to where we started. If a picture is worth a thousand words, we are now able to look at an expanded paradigm. With current technology trends, we have moved from a small black-and-white picture to full 4K video, and it’s only getting bigger and faster. Depending on how you look at it, the impact of this could be both marvellous and potentially terrifying!