This is pretty much a David versus Goliath matchup. In the '80s it was pretty clear that AMD were doing little more than re-engineering Intel's 808x chips. That has changed a lot since those halcyon days. I would be hard pressed to find anyone today who would call them copycats. AMD are a standalone manufacturer, and their product line reveals that.

Gordon Moore made his famous observation, now known as 'Moore's Law', back in 1965: every 18 months or so the number of components buried in a piece of silicon would double, effectively doubling the speed of any computer. Of course the software guys at Microsoft and everywhere else have been making sure that as fast as the new computers come out, they are completely drowned in useless software. Windows has just become an overhead rather than an asset.

The significant difference between the two companies is not so much the hardware as the way it can be used. Both companies' products can be used in a business environment; the big question in my mind is 'are they equally capable in an entertainment situation?'

I use my computer in a variety of boring ways: I do email, I write articles, and I browse the web. Quite honestly I could likely do all of this on a Commodore 64, or my old P90 system. I have lost interest in the race for speed; it does not mean anything. I run several computers that vary a lot in speed, but when it comes down to email, word processing, and generally goofing off on the net, it makes no difference whether you are on an ancient 600 MHz processor or some spanking new multi-core deal.

The exception to that rule is multimedia: rendering computer games and playing HD video most certainly need all of the horsepower you can muster. It is in this arena that we discover that not all processors are created equal, and I for one feel that this is an area where AMD lead Intel. The acquisition of graphics manufacturer ATI has added momentum to AMD's multimedia push. AMD have two approaches to the problem.

The first is to more closely couple the CPU with the GPU, enhancing the multimedia experience. The second, currently in beta test, is essentially server-side rendering: why bother to render at the client computer when you can harness the power of what is essentially a supercomputer cluster? This initiative that AMD are running with is called the Fusion Render Cloud, and it forms an important building block in the AMD Cinema 2.0 initiative.
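The idea behind server-side rendering can be sketched in miniature: the expensive pixel work happens on the server cluster, and the thin client only receives finished frames. The sketch below is a hypothetical toy of mine; the function names and the trivial gradient "renderer" are inventions for illustration, not the Fusion Render Cloud API (which in reality would stream compressed frames over the network).

```python
# Toy sketch of server-side rendering. All names here are hypothetical --
# this is not AMD's API, just the division of labour it implies.

def render_frame(scene: dict, frame: int) -> bytes:
    """Stand-in for an expensive GPU render job running on the server."""
    width, height = scene["width"], scene["height"]
    # Produce a trivial gradient "image" in place of a real render:
    # one byte per pixel, row by row.
    return bytes(((x + y + frame) % 256)
                 for y in range(height) for x in range(width))

def client_request(scene: dict, frame: int) -> bytes:
    """The thin client just asks for a finished frame; no local GPU needed.
    In a real cloud deployment this would be a network call."""
    return render_frame(scene, frame)

scene = {"width": 4, "height": 4}
frame0 = client_request(scene, 0)
print(len(frame0))  # 16 bytes: one per pixel of the 4x4 frame
```

The point of the split is that `client_request` could run on the weakest netbook imaginable, because all the horsepower lives behind the call.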

I guess I am just old and cynical and have been in the computer industry far too long, but I have a pet theory that there is nothing new: technology is a cyclic beast. When I started in the business in 1973 I worked on IBM mainframes; in fact it happened to be the largest concentration of computer power outside of the US. For word processing we used an IBM product called DCF Script. Guess what? It used tags, in fact much the same set of tags that you find in a basic HTML document: <H1>, <UL>, <LI>, and so on. So HTML was not a new invention, merely a reuse of some earlier technology.

Likewise this latest push for 'cloud' solutions. If you wind the clock back to 1980, the average computer was a mainframe with 'dumb terminals' connected to it. Today we have ditched the 'dumb terminal' moniker and call it a thin client, and the mainframe has been renamed a 'cloud'.

Now personally I am a cloud fan; in fact I spend a good deal of time in the cloud. That way I do not have to worry about my data: if this hard drive crashes I just slot in a new one, load up whatever OS I fancy (which tends to be anything that does not have the MS name on it), and I am back in business within minutes.

I like the direction that the folks at AMD are taking; the idea of server-side rendering could well change online entertainment, not just in movie-type applications but also in gaming.

Simon Barrett
