
Anandtech: Intel Kills Larrabee GPU, Will Not Bring a Discrete Graphics Product to Market

Dated: May 25, 2010
Bill Kircos, Intel’s Director of Product & Technology PR, just posted a blog entry on Intel’s site entitled “An Update on our Graphics-Related Programs”. In it, Bill addresses future plans for what he calls Intel’s three visual computing efforts.

There’s a ton of information in the vague but deliberately worded blog post, including a clear stance on Larrabee as a discrete GPU: “We will not bring a discrete graphics product to market, at least in the short-term.” Kircos goes on to say that Intel will increase

darkequitus 5121d ago

That's it then. From delay to complete bow-out.

DeadlyFire 5120d ago

This is old news. They said in 2009 that they would not be releasing Larrabee in 2010. It's still possible with a second generation, though. Larrabee is only the first generation of Intel's GPU efforts. They can use what they learn and apply it to the CPU/GPU hybrids that come along, as well as continue work on Larrabee 2, or whatever the next generation is called. Intel and AMD both seem to come up with new names for their line-ups every year.

led1090 5121d ago

Fudge that, give us Project Offset.

ultramoot 5120d ago

I'm really starting to NOT like Intel right now. What the fudge is up with them holding off Project Offset? (They were working on it, right?)

DeadlyFire 5120d ago

They scrapped and rebuilt the game engine not that long ago, sometime around 2009 I think. It's still coming as far as I know, but Intel is very quiet. Intel and Havok are at E3 2010, so we will have to wait and see. It appears that GDC is their most favored event for showing game bits, though. I hope something new is shown.

Marty8370 5120d ago (Edited 5120d ago)

Larrabee takes most of its design ideas from Cell and doesn't improve on them. I mean, they are even going down the 'hybrid' route, just like Cell.

BlackKnight 5120d ago

Will you fanboys shut up for once? Larrabee is based on the Pentium MMX architecture; that's from the '90s, dude.

And think about it: a CPU company that tries to make a GPU ends up with a GPU with CPU features... no shit. Why wouldn't they use their CPU expertise when making a GPU? I assure you it has nothing to do with Intel copying the Cell, or more accurately said, IBM.

Ju 5120d ago (Edited 5120d ago)

It is exactly based on the ideas of the CELL.

It takes those ideas a step further: many cores, high vector power (an SSE derivative), a discrete ISA (which happens to be IA-32), HW extensions to improve ROPs (a HW texture unit), and hyperthreading, crossed over with HW cache management instead of local storage on a ring bus.

These are exactly the ideas the CELL was born from. And Larrabee ignored some things the CELL did from the beginning: the ring bus was designed to solve cache-coherency problems with many cores and guarantee high bandwidth, and the SPUs use a from-scratch ISA because nothing else could be implemented in such a small footprint and still be reasonably fast.

The lack of branching is one of the compromises/shortcomings. The ring bus requires SW-based memory management. And the lack of a HW ROP unit means complex texture operations are limited by memory bandwidth and hence not feasible on the CELL.
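
To make "SW-based memory management" concrete, here's a minimal sketch of the double-buffering pattern SPU-style code typically uses. `dma_get`, `dma_wait` and `process` are hypothetical stand-ins, not the real MFC API:

```c
#include <stddef.h>

#define CHUNK 4096

/* Hypothetical DMA helpers standing in for the real MFC intrinsics
   (an mfc_get plus a tag-status wait on a real SPU). */
void dma_get(void *local, unsigned long long remote, size_t size, int tag);
void dma_wait(int tag);

void process(float *chunk, size_t n);

/* Double buffering: fetch the next chunk over the ring bus while the
   current one is processed, so the core never stalls on memory the
   way a cache-missing core would. Assumes total_bytes is a nonzero
   multiple of CHUNK. */
void stream_all(unsigned long long src, size_t total_bytes)
{
    static float buf[2][CHUNK / sizeof(float)];
    int cur = 0;

    dma_get(buf[cur], src, CHUNK, cur);            /* prime the first chunk  */
    for (size_t off = CHUNK; off < total_bytes; off += CHUNK) {
        int nxt = cur ^ 1;
        dma_get(buf[nxt], src + off, CHUNK, nxt);  /* kick off next transfer */
        dma_wait(cur);                             /* current chunk is ready */
        process(buf[cur], CHUNK / sizeof(float));
        cur = nxt;
    }
    dma_wait(cur);
    process(buf[cur], CHUNK / sizeof(float));
}
```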

Intel tried to address all those things, and yet their plans turned out to be too ambitious. Now that Intel is giving up, it just shows how far ahead the CELL designers were back then.

But note how Intel refers to Larrabee as a research project for HPC. There is demand for many-core vector processors; that is where Larrabee will end up, eventually.

Larrabee might have made the most sense in a (next-generation) gaming console. It's safe to say, now, that whatever that next generation will be, it ain't Intel (Larrabee) based...

E46M3 5120d ago

Intel is giving up because it can't keep up with the likes of AMD and Nvidia.

Something the Cell couldn't do either, which is why Sony abandoned their original design of two Cells and went with a GPU from Nvidia instead.

Had Sony stuck to their original design of two Cells, the PS3 would have been a joke compared to the 360.

While Sony likes to hype the Cell a lot, in reality the Cell's graphics capabilities are a joke compared to a real GPU.

Trroy 5120d ago (Edited 5120d ago)

The Cell is too functional to compete with GPUs -- it can do too much, which requires too much logic and sacrifices sheer numbers of pipes for flexibility. Lack of flexibility, and the resulting greater number of simplified work pipelines, is what gives modern GPUs their performance prowess.

On the other hand, the Cell, as an HPC CPU design, is downright unbeatable by any other design in existence, including the Larrabee and the i7.

The i7 is a stellar general-purpose processor, geared toward making slow apps faster, like your typical office program or server service. The Cell is incredible for tasks which lend themselves well to parallel, high-performance computing: tasks that require custom programming but can't be bound by the limitations of a specialized processor like a GPU. A Cell which shared the die size of an i7 would absolutely demolish the i7 in HPC tasks, one of which is simulation/gaming. Because of the custom-program nature of HPC tasks, Cell apps can always be written to be faster than Larrabee apps could be, simply because the shared cache actually gets in the way of performance computing -- it enforces ease of development at the expense of performance.

The Larrabee tried to be everything all at once, and failed.

BlackKnight 5120d ago

Ju, my point is that it's not based on the Cell. As you put it, it's based on some of the same ideas the Cell was, which is a completely different thing.

Larrabee is just not going to be released to the consumer market, since as far as gaming goes, it didn't reach the performance milestones they set in a timely manner. It's still going to be offered for business solutions and research.

BlackKnight 5120d ago

Trroy, when you say the Cell would beat a Core i7 at simulation/gaming, that's quite a generalization. Gaming contains a vast array of different calculations. A more accurate statement would be that the Cell would be faster in certain types of calculations done in gaming while losing in others.

sikbeta 5120d ago

Isn't Larrabee a GPGPU? Sorry if I'm wrong, I don't know crap about HW...

Ju 5120d ago

It is basically a CPU which does graphics. It's not a massive "stream processor" design like current-gen GPUs, but a "many cores" CPU which does with 32 more-general-purpose cores what 512 stream processors do. Stream processors are just that: they operate best on a large data set, but would not necessarily outperform general-purpose cores in, say, branching. A vector processor can be seen as a sort of hybrid: very fast on large data sets, but able to handle some general-purpose tasks. Larrabee used a standard (smaller) IA-32 architecture with extended SSE vector cores.

The expectation was that the higher performance in general-purpose computing could offset the advantage stream processors have on raw data.

An example from a couple of years ago showed the CELL outperforming the then-current GPUs in things like raytracing, which requires high vector performance but also more complex operations than those GPGPUs could perform at the time.

Lately, stream processors have gotten more general-purpose ops as well, meaning the benefits of discrete vector cores are getting smaller and smaller. I think there is still an advantage, from a programming perspective, to having a more complex ISA, but tools and modern "shader" languages hide this pretty well. In the scientific community, there is still high demand for complex vector CPUs.
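
As a rough illustration of how vector cores deal with the branching problem, here's a minimal C sketch (assuming SSE4.1 for `_mm_blendv_ps`) that replaces a per-element if with a compare-and-blend, which is exactly the kind of thing those shader languages hide:

```c
#include <stddef.h>
#include <smmintrin.h>  /* SSE4.1, for _mm_blendv_ps */

/* Per element: out[i] = (x[i] > 0) ? x[i] * 2 : x[i] * 0.5
   A scalar core branches per element; a vector core computes both
   sides and blends by mask. */
void scale_signed(const float *x, float *out, size_t n)
{
    const __m128 zero = _mm_setzero_ps();
    const __m128 two  = _mm_set1_ps(2.0f);
    const __m128 half = _mm_set1_ps(0.5f);

    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 v    = _mm_loadu_ps(x + i);
        __m128 mask = _mm_cmpgt_ps(v, zero);   /* per-lane condition */
        __m128 hi   = _mm_mul_ps(v, two);      /* "taken" path       */
        __m128 lo   = _mm_mul_ps(v, half);     /* "not taken" path   */
        _mm_storeu_ps(out + i, _mm_blendv_ps(lo, hi, mask));
    }
    for (; i < n; i++)                         /* scalar tail */
        out[i] = (x[i] > 0.0f) ? x[i] * 2.0f : x[i] * 0.5f;
}
```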

Larrabee was an "experiment" from Intel: build on what the CELL showed could be done with vector data, in order to avoid going massively parallel, which is what NVidia and AMD/ATI are doing.

Those new GPGPUs have 500 or so cores. Intel simply does not have the expertise to go down that route. Those cores are not just simplified CPUs; they were designed specifically around a massively parallel concept. That's not what Intel (or any CPU manufacturer) is used to doing, and it's not a trivial task either; there are only two companies doing it, apparently.

Intel tried to use what they know best to compete with the GPU designers: use their CPU knowledge to build a GPU. Theoretically it should have worked, but it never lived up to what it was supposed to do.

Trroy 5120d ago (Edited 5120d ago)

NOT like the Cell, which was a good idea then, and still is. The Larrabee relies on an automated caching mechanism, and the Cell doesn't, which is one of the things that makes the Cell both "hard" (lol) to program and shine in performance. SPUs never suffer from a cache miss -- ever. That's miles better than spending 30% of the total core time missing on a shared cache, plain and simple. With all that wasted cache die area, you could have more cores instead... oh hey, like the Cell. ;)

Caches are crutches for the lazy -- they're great for programs written without performance in mind. Convenience graphics programming? lol.
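
A tiny sketch of the difference, for the skeptics. Both loops below sum the same data, but the first walks memory in whatever order a linked structure dictates (a potential cache miss on every hop), while the second streams it predictably; all names here are illustrative, not from any real codebase:

```c
#include <stddef.h>

struct node { float value; struct node *next; };

/* Pointer chase: every hop can miss the cache, stalling a cached core
   for hundreds of cycles per node. An SPU-style design makes you stage
   the data into local store first, so the cost is explicit and can be
   overlapped with compute instead of eaten as stall time. */
float sum_chased(const struct node *head)
{
    float sum = 0.0f;
    for (const struct node *n = head; n != NULL; n = n->next)
        sum += n->value;
    return sum;
}

/* Streaming a flat array: the access pattern is predictable, so a
   prefetcher (or an explicit DMA, on the CELL) keeps up easily. */
float sum_streamed(const float *values, size_t n)
{
    float sum = 0.0f;
    for (size_t i = 0; i < n; i++)
        sum += values[i];
    return sum;
}
```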

The Larrabee was a design nightmare without any serious performance benefits. Smart of Intel to axe it. No one would ever have used it, with performance subpar to serious GPUs and greater expense to boot.


Here's a closer look at the new ASUS ROG Ally X as early renders are leaked online

It's almost time for the ASUS ROG Ally X to be revealed, and these leaked renders already provide a look at the device and its specs.

UltimateOwnage 1d 4h ago

Is it running the same Windows OS? Because that is the biggest issue with the current Ally.


The MSI Claw Gaming Handheld Sees Another Game Performance Boost Through New BIOS and MSI Center M

MSI is proud to announce that its gaming handheld, Claw, has achieved a significant performance increase of up to 30% through a new BIOS and MSI Center M update. Furthermore, the new BIOS and MSI Center M enable Claw to smoothly play all of the top 100 po…

purple101 8d ago (Edited 8d ago)

Is this the one with the Switch 2 chip inside?

This is Intel's first try at the format.

Probs not, though, as it's $799, so not good for Switch, actually.

Huey_My_D_Long 8d ago

Well, that's nice, considering I've heard it consistently performs worse when it really shouldn't.

Now if only Lenovo would do the same for the Legion Go.


New and improved ASUS ROG Ally X battery life is just what it needs to compete with the Steam Deck

Yet another leak for the ASUS ROG Ally X points towards as much as 8 hours of battery, but how does that compare to the competition?

Vits 15d ago

Honestly, I really like this updated version. But it doesn't solve the biggest flaw that the original had for me: the Z1 Extreme APU. Yes, it's an extremely powerful part, but it is not part of AMD's Adrenalin driver update program, so it's dependent on Asus for driver updates. And unfortunately, Asus doesn't have a stellar record of support for their devices.

Goodguy01 15d ago

Up to 8 hours basically just means the least demanding games. AAA gaming at the highest wattage would probably be about 2-3 hours, which is good compared to just about 1 hour with the current Ally. The OLED Deck can do about 2-3 hrs.
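
Napkin math backs that up, assuming the leaked 80 Wh battery figure and a sustained draw of roughly 30 W at the highest TDP (both assumptions, not confirmed specs):

```latex
\text{runtime} \approx \frac{E_{\text{battery}}}{P_{\text{draw}}} = \frac{80\,\text{Wh}}{30\,\text{W}} \approx 2.7\,\text{h}
```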

mrcatastropheAF 12d ago

With much less performance, so that makes sense.

The Steam Deck shines at the lower TDP end but gets absolutely mopped at the high end.

Similar longevity with much better performance is a big win for the Ally X.

Killa78 13d ago

Too bad Asus is an awful company.

PRIMORDUS 13d ago

They used to be the best when it came to motherboards; now I will never buy anything from them again.

Firebird360 13d ago

8 hrs? Yeah, right. Running Tetris?

Skuletor 13d ago

Only after setting the screen brightness to the lowest level, of course.

Notellin 12d ago

They tested the battery life watching a game of Tetris in 360p. Running Tetris natively brought the number down slightly to 1 hour and 38 minutes. 😂

Asuka 13d ago

Nope. The only improvement I want to hear about is better customer support. Otherwise, I can't be bothered.
