AMD's APUs. Actually Pretty Useful....
#1
The term 'Integrated Graphics' is something of a bogeyman. Anyone who remembers the older Intel parts from 5-10 years ago will have flashbacks (like taking whole seconds to render a single screensaver frame).

I don't know about you, but the latest Kaveri APUs from AMD seem set to turn that on its head. At least partially.

I celebrated my first paycheck a few weeks ago by buying and building a brand-new desktop centred around one of AMD's A10 processors (a 7700K). I didn't spec a dedicated GPU because I didn't expect to use the system for gaming, just for playing films quietly and the like... and maybe some BD ripping. I put the money I saved into an SSD and some faster RAM (2400MHz).

For shits and giggles, I loaded up an older game that I remember giving my last desktop some trouble, STALKER... and was amazed at how well it ran on a system that's still technically 'integrated graphics'. It ran smooth and crisp at near-maximum quality. Maybe that says more about the march of hardware, but I remember it being a challenge to run for a long time.

Even Metro 2033, which - although four years old at this stage - is still a pretty hefty graphics muncher, looks and plays amazingly well on a system that doesn't have a dedicated graphics card.

Needless to say, lighter fare like the Source games runs well enough. I haven't tried anything more modern because I'm not that much of a PC gamer, but the fact that it manages so well with older games is nothing short of amazing. For a 140 euro part, doubly so. I'd say it's definitely better than an Intel i3, and cheaper than an i3 plus a cheap dedicated graphics card... it blows anything from the Leixlip chocolate factory in its price band out of the water.

And yet, it seems woefully under-appreciated. Most people just don't buy AMD parts anymore. I think I'm almost unique amongst the people I know in having an AMD system.

Which is a shame, because the new Kaveri series definitely deserves better than being relegated to the bargain bin. For such a cheap part, it's more than exceeded my expectations as to its capability. It's definitely something to look at if you're on a tight budget. And if the OpenCL API becomes more common, it could be a force to be reckoned with.
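
For the curious, here's roughly what talking to the GPU through OpenCL looks like. A minimal sketch in C, assuming the OpenCL headers and an ICD loader are installed; error checking stripped for brevity:

Code:
/* Minimal OpenCL device query: grab the first GPU on the first
   platform and ask what it is. On a Kaveri box this should report
   the integrated Radeon, no discrete card required. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platform;
    cl_device_id device;
    char name[128];
    cl_uint units;

    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);
    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof name, name, NULL);
    clGetDeviceInfo(device, CL_DEVICE_MAX_COMPUTE_UNITS,
                    sizeof units, &units, NULL);
    printf("GPU: %s (%u compute units)\n", name, units);
    return 0;
}

Builds with gcc query.c -lOpenCL (filename's mine). Any program that queues its heavy number-crunching as kernels on that device gets the APU's shader hardware essentially for free.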

The only downsides are a slight tendency to eat power, and a real sensitivity to RAM speed. Fast RAM makes far more difference here than on an Intel system, because the integrated GPU has no VRAM of its own and does everything out of system memory.
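
Some back-of-the-envelope numbers on the RAM thing, assuming dual-channel DDR3 at 64 bits per channel:

Code:
DDR3-1600: 1600 MT/s x 8 bytes x 2 channels = 25.6 GB/s
DDR3-2400: 2400 MT/s x 8 bytes x 2 channels = 38.4 GB/s

That's half again as much bandwidth for the graphics side to chew on, which would go a long way towards explaining why the 2400MHz sticks pay off.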
________________________________
--m(^0^)m-- Wot, no sig?
 
#2
Heh. If anyone bothered looking at the specs, they'd find that both the Xbox One and the PS4 have custom AMD APUs under the hood. Food for thought, ain't it? :)
 
#3
So, instead of buying a cheap PC, I bought an expensive PlayStation... oh well. At least it's quieter than a console. And more upgradeable. And is a pretty subdued black monolith of a thing, without gaudy LEDs and the like.

But the APU is definitely proving far more capable than the online benchmarks suggest. Which makes me wonder a bit about how they're designed.
________________________________
--m(^0^)m-- Wot, no sig?
 
#4
Well, AMD doesn't exactly keep it under their hat, given that they hold patents on all their stuff. If you look for it, I'm sure you'll find detailed drawings of the architecture of their A-series chips.
BTW: Something you may want to look into if you want a suitably high-tech career would be Printed Circuit Board design. It may be right up your alley, it's all desk work, and it pays decently (20 USD/hour for entry-level positions). My father is a Senior Designer... almost what they refer to in the trade as a 'Guru'... and he says we're looking at a major shortfall of junior-level designers in the next few years.
 
#5
Actually, I was wondering more about the benchmarks themselves. Most of them seem to favour Intel hardware... which makes me wonder how they work and how they're coded. Given the near-ubiquity of Intel hardware these days, they're probably more 'Intel comparison tools' than hardware comparison tools. But down that road lies demagoguery and fanboyism....
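
Case in point, purely as a hypothetical sketch: it takes a handful of lines of C to read the CPU vendor string, and the old Intel compiler was notorious for emitting a dispatcher keyed off exactly this, so a benchmark built with it could end up on a slower code path on anything that doesn't report 'GenuineIntel':

Code:
/* Read the CPUID vendor string (leaf 0) - the value a
   vendor-keyed dispatcher branches on. GCC/Clang, x86 only. */
#include <stdio.h>
#include <string.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13];

    __get_cpuid(0, &eax, &ebx, &ecx, &edx);
    memcpy(vendor + 0, &ebx, 4);  /* the string comes back split */
    memcpy(vendor + 4, &edx, 4);  /* across EBX, EDX and ECX,    */
    memcpy(vendor + 8, &ecx, 4);  /* in that odd order           */
    vendor[12] = '\0';

    /* Prints "GenuineIntel" or "AuthenticAMD". A naughty benchmark
       can pick its fast path off this instead of off the feature
       flags the CPU actually advertises. */
    printf("CPU vendor: %s\n", vendor);
    return 0;
}

Whether any given benchmark suite actually does that is another question, but it shows how easy vendor bias is to bake in, deliberately or not.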

Unfortunately, I'm too far out of college to get a junior position. I struggled just to get part-time work at the place I'm in. And even then I managed to make the owner go 'Whoa' when, on my first day, I sat right down, put together a mockup of a custom PLC-controlled heating system on my desk, and programmed it up to work.
________________________________
--m(^0^)m-- Wot, no sig?
 
#6
Quote:Dartz wrote:
 And is a pretty subdued black monolith of a thing
Are its proportions 1:4:9? :)
--
Sucrose Octanitrate.
Proof positive that with sufficient motivation, you can make anything explode.
 
#7
Quote:Dartz wrote:
Which is a shame, because the new Kaveri series definitely deserves better than being relegated to the bargain bin. For such a cheap part, it's more than exceeded my expectations as to its capability. It's definitely something to look at if you're on a tight budget. And if the OpenCL API becomes more common, it could be a force to be reckoned with.

The problem is that AMD had some bad breaks and Intel managed to outpace them again. On high-end chips, AMD just can't come close to touching Intel, and Intel is basically half a manufacturing node ahead, meaning that for a lot of the big enterprise systems, which are very sensitive to power usage, Intel dominates there too. This has let Intel slowly push prices back up, but AMD still has its traditional stronghold of the budget market, and if you are looking to max performance per $ then AMD can't be beat.

AMD also takes great pains to keep the CPU socket the same for longer, meaning you can upgrade AMD-based machines more easily and cheaply. So if you are on a tight budget AMD is better, unless the computer is going to be running 24/7, in which case power draw becomes a significant part of the total cost. My last few machines were all AMD-based, but my latest is an Intel one, just because AMD can't keep up at the high end for now.

We'll see in 2-3 years whether their decision to drop hand-optimisation of the chip layout and rely on automated tools instead was the right one. It has a lot of advantages, though the early cores after the switchover particularly suffered. They'll get a bit less performance per chip (how much is disputed), but it removes the most expensive part of chip design and allows much, much faster iteration.
E: "Did they... did they just endorse the combination of the JSDF and US Army by showing them as two lesbian lolicons moving in together and holding hands and talking about how 'intimate' they were?"
B: "Have you forgotten so soon? They're phasing out Don't Ask, Don't Tell."
 
#8
Quote:CattyNebulart wrote:
The problem is that AMD had some bad breaks and Intel managed to outpace them again...
You know, I oftentimes wonder what AMD might have been like if people in the right places had caught on to Intel bullying their customers into using their chips exclusively.  It really did hurt AMD back in the day (to the point that they nearly did go out of business), and at times I still have trouble trying to find a good AMD option when I shop around for laptops and desktops.
 
#9
The Intel plant is just up the rail line from my home - about ten minutes on a train. I've been in there, and it's monstrous. You get a sense of the scale of the operation... it's like living next door to Genom. It has that big, monolithic corporate feel inside. The interview was held in what amounted to an abandoned cubicle farm. There's something ominous about the place that really didn't sit well with me when I was there.

Quote:and if you are looking to max performance per $ then AMD can't be beat.

Truth. The A10 with some properly fast RAM is at least on par with an i3 or low-end i5 paired with a cheap discrete GPU... and it works out cheaper, both on power and in cash, because Intel's integrated graphics are still a bit crap despite money being flung at the problem hand over fist, especially on the bottom-end chips. Assimilating ATI is probably the best thing AMD have done to save themselves, because it gives them a real edge.

Quote:at times I still have trouble trying to find a good AMD option when I shop around for laptops and desktops.

Best option is to build your own; it's what I had to do to get exactly what I wanted. It works out cheaper, and aside from a little troubleshooting caused by a knackered DVD drive, it didn't take much longer than de-crapifying something straight from a manufacturer.
________________________________
--m(^0^)m-- Wot, no sig?
 
#10
Quote:blackaeronaut wrote:
Quote:CattyNebulart wrote:
The problem is that AMD had some bad breaks and Intel managed to outpace them again...
You know, I oftentimes wonder what AMD might have been like if people in the right places had caught on to Intel bullying their customers into using their chips exclusively.  It really did hurt AMD back in the day (to the point that they nearly did go out of business), and at times I still have trouble trying to find a good AMD option when I shop around for laptops and desktops.
Ars Technica seems to think AMD's failure had more to do with poorly timed acquisitions, building too many fabs, and a lack of a coherent strategy.  Not that Intel didn't do nasty stuff to AMD above and beyond normal competition, but a lot of AMD's problems were self-inflicted.
-- ∇×V
 
#11
My own perspective is much simpler:

I have, in the past, had reliability and compatibility issues with AMD components in my computers.

I have had no such issues with NVidia and Intel.

Until that changes, I'll be sticking with Genom.
===============================================
"V, did you do something foolish?"
"Yes, and it was glorious."
 
#12
I've had no problems with AMD processors.  I have had issues with ATI/AMD graphics cards.  They tend to use clever design where NVIDIA goes for raw power; when they get that right, it works wonders, but when they get it wrong...
Let's just say that I went NVIDIA after trying to play KOTOR 1.  It ran just fine on a 32MB GeForce 4 (below the required specs, even!), but gave me 5 FPS IN THE MENU on a 128MB ATI card.  Worst performance I've ever seen.

My Unitarian Jihad Name is: Brother Atom Bomb of Courteous Debate. Get yours.

I've been writing a bit.