Motorola4495d ago

...That's it? I had an HD 6670 and it was good, but not amazing. It's a decent entry-level GPU, but it won't hit 60 FPS at 1080p in much except COD.

vortis4495d ago

I thought they were going for Xfired cards?

I still have Xfired 5770s; they can run the latest and greatest on the highest settings with DX11 turned all the way up.

I imagine two 6670s will tide MS over for the next four years or so, which should buy them enough time to start up Cloud services as an alternative for staying relevant in the [3D]HD arena.

C_Menz4495d ago

Why would they put in 2 GPUs? It would generate twice as much heat, use twice as much power (increasing their PSU cost), and make the console much larger.

I think when the next set of consoles are released, many will be disappointed. If they release this year, their GPUs will not surpass the level of a GTX 580 and will be closer to a GTX 560 Ti due to costs. However, if they wait until 2013/14, they will be able to use a much better GPU.

nondecaf4495d ago

@C_Menz Those GPUs are small and could be akin to Radeon's Mobility series in power consumption. I think they should be based on at least a 6770 or 6750, because two would have the same specs as the high-end 5800 series.

vortis4495d ago

@C_Menz

nondecaf is right, my two 5770s run better and consume less power than my one 4850.

AMD has done a superb job of cutting down energy consumption and heat output in their mid-range cards.

Not to mention, you can get two 6670s for the price of one GTX 560 Ti if you shop around. Also, Crossfired HD 6670s outperform a single GTX 560 Ti every time in the benchmarks.

It's a very smart, affordable and viable choice for MS if they decide to go that route.

gamingdroid4495d ago (Edited 4495d ago )

It's all about smart engineering by balancing the system and always keeping it busy.

Two 6670s will improve "initial" yields (yield problems often plague launches), and the chips can later be combined for cost reduction.

It shouldn't use much more power than a single chip of equal computing power to the two.

That said, this is a rumor... but bring the next generation on! I cannot wait!!!

milohighclub4495d ago

20% more than the Wii U, so about 70% more than the PS3 and 6x more powerful than the 360...

Let's hope Sony are aiming higher...

AKS4495d ago

I don't see them doing a Crossfire setup. Too much heat, power demands, and cost for a console. It wouldn't be practical or affordable.

ProjectVulcan4495d ago (Edited 4495d ago )

6 times as fast. 6670. Microsoft must be trying to build a fairly cheap machine then. The 360 and PS3 were easily 10+ times faster than their predecessors, not forgetting that the 360 in particular benefitted from a wonderful toolset compared to what the PS2 had. The 6670 isn't really 6 times faster either, TBH; it's more like 4 times or so, not least because they almost certainly wouldn't use the sort of clockspeeds the part runs at in a PC.

This is actually weaker than expected. Consoles are more efficient than the same part inside a PC, that much is true, but the 6670 is nowhere near fast enough to dish out native 1080p games and offer a significant jump over what the PS3 and 360 can do. It just doesn't possess the memory bandwidth or fillrate: less than a quarter the pixel shading performance of even a cheap 6870 and less than half the bandwidth!

You'll get a jump in 720p, but if moving to this generation means moving to native 1080p, as most people want, the 6670 just isn't fast enough for a leap on the level seen from Xbox to 360. A step forward, but no revolution. The thing would struggle to manage Battlefield 3 on maximum PC settings at 1080p even with all the optimisation advantages consoles have.

Even if this thing launches at the end of this year, it'll be massively outdated. Sony could go with a midrange GPU from the EXISTING Nvidia lineup (i.e. a 560), and it would obliterate Microsoft's new machine if it had a 6670-class part in it.

This is besides the fact that a new GPU generation is coming from Nvidia within the next 3 months.
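
ProjectVulcan's "more like 4 times" estimate can be sanity-checked with back-of-envelope math. This is a rough sketch using the commonly quoted spec-sheet figures for the 360's Xenos and a stock desktop HD 6670; nothing here is confirmed for the rumored console, which could run different clocks:

```python
# Rough throughput comparison: Xbox 360's Xenos vs. a stock Radeon HD 6670.
# Figures are commonly quoted spec-sheet numbers, not measured game performance.

xenos = {
    "gflops": 240.0,        # 48 unified shader ALUs @ 500 MHz
    "fill_gpix_s": 4.0,     # 8 ROPs @ 500 MHz
    "bandwidth_gb_s": 22.4, # 128-bit GDDR3 @ 700 MHz (ignoring the 10 MB eDRAM)
}

hd6670 = {
    "gflops": 768.0,        # 480 stream processors @ 800 MHz
    "fill_gpix_s": 6.4,     # 8 ROPs @ 800 MHz
    "bandwidth_gb_s": 64.0, # 128-bit GDDR5 @ 4 GT/s effective
}

for key in xenos:
    ratio = hd6670[key] / xenos[key]
    print(f"{key}: {ratio:.1f}x")
# Shader throughput lands near 3x and fillrate well under 2x,
# closer to the "4 times or so" estimate than to the headline "6x".
```

The spread between the three ratios is also why raw "X times faster" claims are slippery: a game limited by fillrate sees a much smaller jump than one limited by shader math.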

Statix4495d ago (Edited 4495d ago )

@C_Menz: I would be happy if the next Xbox had a GTX 560 ti. It's a decent card, and console developers can do a lot with limited hardware due to optimization. Just look at what devs have accomplished w/ the PS3 and its antiquated 7800 GTX.

If the PS4 does release later, I certainly hope it does a lot better than an HD 6670. Even a GTX 560 ti would be a lot better.

----------------

Michael Pachter said that Microsoft wasn't planning a truly next-gen console, but rather a beefed-up Xbox 360 that can run Windows 8. Maybe this HD 6670 is just for the interim, beefed-up 360. Anyone here give any credence to Pachter's speculation?

morganfell4495d ago

Statix, true, they did a lot with the GPU. But with a console you can't compare GPUs in a vacuum. You have to consider the CPU to which they are mated. Particularly so with the PS3, since it is common to offload processes, such as lighting, onto one of the Cell's SPUs.

Remember, the Cell was partially designed from GPU architecture. This is why it number crunches data rapidly and why it easily handles matters which require massive calculations such as certain lighting techniques. I remember the first time we saw a Cell handle Vertex Lighting and we were astounded.

fear884495d ago

This makes me think that Sony might be right in holding off on their next console.

In fact maybe Sony's full focus on the Vita may pay off seeing as how close the graphics will be between the PS3, 360, PC, and the Vita.

I just don't see a reason for a new Sony console at this point if the Vita and the PS3 are so close and the leap from the 360 to the next Xbox is so minuscule.

SugarSoSweet4495d ago

Where did the article state they were using crossfire? I couldn't see it....

decrypt4495d ago

Poor console gamers, going to pay $400 for tech that was outdated way before it launched.

Wake up, console gamers: a $150 GPU will outperform that piece of trash hardware. Any quad-core PC you have at home equipped with a $150 GPU will significantly outperform this.

NewMonday4495d ago

If Sony uses a new Kepler card with a better 32-core Cell, that would really be something.

T9004495d ago

Console gaming = having no options; basically, you rely on a company to set up hardware as they see fit. It may be old tech and they may charge a premium for it, but there's nothing you can do about it. You just have to bend over and purchase whatever they deem fit for you.

Will be funny, though, if the Xbox 720 launches with this hardware and then gets marketed as next gen, lol. Sadly, most consolers will think the hardware is new tech.

BattleAxe4495d ago

PS4 to outperform the Xbox 720... confirmed.

Disccordia4495d ago

Why are people sounding so disappointed? The GPUs in the current consoles were two years old when they released, and they are still going strong a further six years on. This is going to be better than the Wii U, which developers have said is already quite beefy (for a console).

You cannot make direct comparisons.

sikbeta4494d ago (Edited 4494d ago )

"Bu... but... we don't need next-gen consoles"

Bring it on MS!

Sobari4495d ago

Keep in mind that these GPUs are vastly more powerful than you might think when games are programmed specifically for them. Because PC games have to work on pretty much anything, the full potential of a single GPU can never be realized.

ShabbaRanks4495d ago

Couldn't have said it better... well said!

ProjectVulcan4495d ago (Edited 4495d ago )

I wouldn't say vastly. It helps a lot that the developer knows exactly what hardware the game is going to be built for, and the fixed platform means they can root out all the tricks and find performance the equivalent part on PC wouldn't be capable of, as the PC part is burdened by several levels of access (i.e. drivers), whereas consoles have direct-to-the-metal access.

Xenos in the 360 is a clever GPU, but it is still easily beaten in the real world, in a PC, by something like an 8800GT. The PC admittedly wins by employing brute force, but not massively more brute force is required to overcome the overheads the PC suffers from. An 8800GT is maybe 2-2.5 times faster than the 360, and it shows. Not that much is swallowed up by the OS and API.

Even with efficiency advantages, the 6670 is not fast by any metric after the year 2008, mainly because it is lacking in memory bandwidth and pixel fillrate.

It'll never be quick enough to do the sort of demos you have been drooling over: the Samaritan stuff, Battlefield 3 on ultra with all the filters jacked up, etc. Not even in 720p.

It'll comprehensively beat what the 360 and PS3 can do if you pit them against each other in 720p, but it could never do wonders in 1080p.

So I guess it depends what you want. It is a decent choice if they stick with the lower resolution; however, if you were hoping for more resolution, more filtering, Samaritan-style leaps, then not even a more efficient platform can rescue those dreams from the bin with this level of hardware.

wicko4495d ago

This is what Carmack was saying about the API overhead for PC GPUs being restrictive (basically not letting you get as close to the silicon as a console GPU would). A 6670 in a console would outperform a 6670 in a PC.

I still expected something a bit faster than that (the 360 was the equivalent of an X1900, an enthusiast-level card). The 6670 is a budget card; however, even budget cards are pretty powerful.

bozebo4494d ago

These GPUs nowadays are mostly the same architecture with a different number of cores. The same optimisations apply to the 6670 as would apply to the 6850, 6870, 6950 and 6970.

Once you are using 100% of the shaders in a card and the frame rate is pixel/fragment-shader or vertex-shader capped, you simply cannot optimise any further without reducing load on the shaders (by optimising the logic in your HLSL, for which any quirks will be architecture-specific, like I said above).

A 6670 will just NEVER produce Samaritan-quality graphics, EVER; not even at 30 fps 720p. It would need 5x the number of cores to even suspect that as a possibility, even after taking into account massive optimisations (like 100% extra performance pulled out of thin air, like God of War 2, lol).

Sony and MS had a play around with some interesting tweaks on their hardware this gen. The 360 has 10MB of special RAM (eDRAM) that can apply calculations as data is dumped into it, which took a lot of time and experience to optimise. And the PS3 had a weaker GPU, but lots of post-processing (combining layers from deferred rendering, motion blur, etc.) could be moved onto the SPU cores once the programmers were used to the hardware and the engine's general memory management was tweaked just right (essentially making multiplats a no-go for getting all the performance they can).

Anyway, this article is likely a bit misleading.
Sure, it could be based off the chip in a 6670, but it will probably be a newer fabrication and a lot snappier. If it is 6x as powerful as a 360 like the article says, then you would be better off comparing it to a 6850, which is still nowhere near enough power to run Samaritan at 1080p (not even a fifth of the required power). So this is a gen-and-a-half console at best if this rumour is true.

Also, PCs have no issue maxing out all the shaders in a GPU despite API inefficiencies; all that is wasted is time on the CPU. The 360 uses a DX9 almost identical to the DX9 on a PC anyway, so the API is doing most of the same things in the UMD (User-Mode graphics Driver) behind the scenes; the CPU is plenty powerful enough to deal with it, and you can apply the same low-level access on a PC (it just isn't practical, because it differs in D3D9 across card ranges, though it's exactly the same in 10 and 11 for all cards/manufacturers).

It could be that they are employing 6670-style GPU(s) along with some other fancy technique, such as the eDRAM that the 360 had; and it is certainly true that there will be extremely fast memory, like consoles usually have (though DDR3 in a Sandy Bridge setup these days is maddeningly fast and never even remotely stressed by any game currently out).

One thing is certain: it's impossible for it to render Samaritan-quality graphics. It just won't happen unless this rumour is massively incorrect. If the rumour is true, I suspect they have officially ditched gamers, because the tech is already outdated.

ardivt4494d ago

Found this list online and I think it's very helpful to get a little bit of perspective here.

The Playstation 1's graphics card was top-of-the-line when it came out. It could do 3D graphics better than PCs at the time of its release in late 1994. It had a technological lead over PCs of approximately one year.

The Nintendo 64's graphics card was top of the line at release in 1996. It could do special effects such as environmental mapping 6 months ahead of PCs. It had a technological lead over PCs of approximately 6 months.

The Playstation 2's graphics chip was top of the line in 1999, but due to delays the machine was not released until 2000. Combined with difficult programming, the full capabilities of the machine were not on display until Metal Gear Solid 2 in 2001. By then, the graphics were two years behind PC.

The Gamecube's graphics chip was essentially equivalent to the top-of-the-line chips of 2000 (Radeon 7000/GeForce 2), but the machine was delayed until 2001, by which point it was one year behind PCs.

The Xbox's graphics chip was a top-of-the-line GeForce 3, released 6 months after the GeForce 3 chips. It was 6 months behind in terms of graphics.

The Xbox 360's graphics chip used a combined vertex/pixel shader design that was not available on PCs for a long time, but a lack of RAM hampered visuals. On release it could probably be said to have been a year behind PC graphics.

The Wii used the same graphics chip as the Gamecube, just at a higher clock speed. Essentially, a year-2000 PC card in 2006, so it was 6 years behind. This is the biggest gap.

The Playstation 3 used a graphics chip equal in power to a 7800GTX (a 2005 card)... at the end of 2006. Due to a lack of RAM and programming difficulties, the full power of the machine was not really on display until Uncharted (2007). So the machine was two years behind PC.

The Xbox 720 will use a 6670, a middle-tier graphics card released in 2011, in a console to be released in 2013. It is roughly on par in graphics with a top-of-the-line-at-the-time Radeon HD3800, a 2007 graphics card. It will be 6 years out of date at the time of release, just like the Wii.

Microsoft is following Nintendo's strategy exactly.

ProjectVulcan4494d ago (Edited 4494d ago )

Long story short: if you rewind to the time when a 6670 would have been a top-end card (say, among the top 3 fastest single-GPU cards), you would have to go way back to early 2008, before the Radeon HD 4800 and GTX 2xx series launched.

If the rumours hold up and Microsoft launches this thing in early 2013, the hardware inside it will be 5 years old.

I hoped for a lot more, at least 6870 sort of speed. I hope Sony go with at least a GTX 560-level GPU, but if they don't launch till next year they could get something even faster crammed in, and it wouldn't be too pricey.

solidt124493d ago

Yep, exactly what I was gonna say.

Crazyglues4495d ago

Yeah, that's weak. They needed at least the 6970; hell, the 7970 is already out, and by 2013 that card is going to be just on the borderline of hot cards.

I don't know what they are planning, but so far it looks like the system is going to be seriously underpowered.

If this is true, Microsoft just put the ball back in Sony's hands... now let's see what they do.

gamingdroid4495d ago (Edited 4495d ago )

You mean how the Xbox 360 has less power than the PS3, yet the games look almost identical?

It's all about smart engineering, not about putting in the most high-tech and, by extension, most costly parts.

A perfect example of that is the PS3, with its vast computing power that it can't fully utilize because of bandwidth bottlenecks. Another example is the PS3's dedicated RAM, which has superior bandwidth, but the bottleneck wasn't the bandwidth... it was the amount!

You see this in games like Rage, where the PS3 version had far more texture pop-in than the Xbox 360's, due to its dedicated memory versus the 360's shared memory design, which had more available for VRAM.

Crazyglues4495d ago (Edited 4495d ago )

@ gamingdroid

Uh, no, this is not about the PS3; this is about the 6670 graphics card, and how I have a 6970 in my PC, and when playing Crysis 2 at 1080p with everything maxed and the high-res texture upgrade, it plays, but not 100 percent smoothly.

And I have 8GB of RAM... it's not crawling, but it's not 60fps, that's for sure.

So going with a weaker card does not look good for next-gen games, since the card has to last at least 5 years.

How can that be a good card if it's already weak by the time the system comes out?

Let's be honest, smart engineering is great, but no amount of smart engineering can get around the limits of the card. It is what it is.

If they are really going to corner the gaming market next gen, it's go hard or go home... (you gotta do something that will get even hardcore PC gamers to say, "yeah, I need to get a console").

Because Sony looks like they have learned from their mistakes with the bottlenecks and will be coming hard this time around.

That's all I'm saying... I want Microsoft to give me a real good reason to run out and buy an X720.

Hell, when I'm playing Crysis 3 or Battlefield 4 or whatever is out on the new systems, I want it to look like I'm in the damn jungle -(Like this)- http://www.youtube.com/watc...

They better go hard, or I don't see any reason for people to even run out and grab this next gen. Having just a 6670 sounds like it's just not good enough for a system coming out in 2013.

slayorofgods4495d ago (Edited 4495d ago )

The PS2 and original Xbox are probably the last consoles to have had more powerful graphics than current-generation PCs... This is because of cost, power consumption and, most importantly, heat. The current generation has been riddled with overheating consoles.

A generation without RROD or YLOD might get my attention. But really, on graphics alone, a new generation is too soon.

Dude4204495d ago (Edited 4495d ago )

@ Crazyglues

First of all, they can't just stick in a 6970. Given the console's size, they have to make it compact to fit all the other parts, and that costs more money. That's not even considering the CPU and RAM they're putting in there.

Second, the PC does not show the true power of video cards when it comes to gaming. All PC games today have to be written to work on so many types of video cards, from the old 8800GT (or even older) to the brand-new 7000-series AMD cards. Therefore, games are not truly optimized for any card.

Let's say they managed to stick a 6970 in the Xbox. With games written specifically for that card, it would probably destroy two-way SLI GTX 580s in a PC.

My point here is that putting in a 6670 probably isn't a bad idea, and we're just not seeing its true power.

Statix4495d ago

@gamingdroid: You're a bit confused. The general consensus amongst developers is that the PS3 is bottlenecked by the GPU. The 360 has a more powerful GPU than the PS3, which offsets the PS3's sizeable CPU advantage. The GPU is the single greatest determining factor of graphical prowess in any PC or console system. ANY PC gamer or PC builder will tell you that.

Ergo, if the Xbox 720 has a measly HD 6670, and the PS4 has even the midrange GTX 560 ti (hypothetically speaking), then the PS4 will absolutely DESTROY the Xbox 720 in graphics rendering. This much cannot be debated. This is assuming that Sony doesn't go the 100% ultra-cheap route and at least goes for a GTX 560 ti for the PS4.

However, I also very much doubt that Microsoft would be so cheap as to go with such a low-end card like the HD 6670 and leave the window wide open to be left behind in graphical prowess. That's why I don't believe this rumor. There is the slight possibility that Microsoft will be going with TWO 6670 GPUs, and that would be much better, but there are a plethora of issues that plague dual-card setups which make such a setup impractical for consoles. I.e., heat generation, manufacturing cost/yields, power consumption... not to mention the micro-stuttering phenomenon that could potentially be very annoying.

STONEY44495d ago (Edited 4495d ago )

^^What he said. The first time I played MGS2 on PS2, it was miles above anything I had played on PC. That lasted until around 2003; from Doom 3 on, every PC game looked a whole generation ahead of the PS2/Xbox/GC.

MGS4's original real-time 2005 trailer looked mind-blowing, better than any PC game at the time... then (according to Kojima) the PS3's specs changed, and by the time the game was released, there were better-looking games out there. That trailer still looks good today, especially compared to how the cutscene ended up looking in the final game.

But PC technology has become so cheap/affordable and is advancing so fast that it really won't happen again. The consoles this gen were already outdated on release.

CharlesDCI4495d ago

@Gamingdroid

"You mean how the Xbox 360 has less power than the PS3, yet the games look almost identical?"

That's because Microsoft's content policy for the 360 pretty much dictates parity among multiplatform games.

Gazondaily4495d ago

Yeah, I have a 6970. This seems underpowered for a next-gen console. Still... it hasn't been confirmed yet, so it's early days.

sikbeta4494d ago

Graphics are not the most important thing. Did you guys not see what happened this gen, or what?

Rhythmattic4494d ago

Crazyglues

Great graphics, totally suck-worthy AI.

Crazyglues4494d ago

@ Rhythmattic

Yeah, that's a good point; hopefully they step up the AI too in the next generation of consoles.

slayorofgods4495d ago

Please tell me the next generation is going to be more than just the equivalent of adjusting from medium graphics to high graphics on most games. If that is the case, then a jump isn't necessary yet.

Am I missing something? Is this upcoming generation going to offer some new feature that the current generation cannot?

ninjahunter4494d ago

That is exactly the jump, haha. Well, native 720p is probably expected; that graphics card just doesn't have the raw power for 1080p in anything that looks good.
The difference between each generation is just going to get smaller and smaller, because it requires exponentially more power to improve graphics. At some point, it will require so much horsepower to make any difference that polygon graphics will be replaced by ray tracing or something.
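
The resolution part of that cost is easy to quantify: moving from 720p to 1080p alone multiplies the pixels per frame by 2.25, before any per-pixel shading gets more expensive. A minimal arithmetic sketch:

```python
# Pixel budget per frame at the two render resolutions discussed in the thread.
resolutions = {
    "720p (1280x720)": 1280 * 720,
    "1080p (1920x1080)": 1920 * 1080,
}

base = resolutions["720p (1280x720)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 720p)")
# 1080p is 2.25x the pixels of 720p; at 60 fps that is roughly
# 124 million pixels shaded per second, before any overdraw.
```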

ZippyZapper4495d ago (Edited 4495d ago )

IGN? Strange rumors today: http://www.computerandvideo... This one says 7000 series. Windows 8 is right around the corner, so you never know. DX11 for sure, so a cross-platform Win8/Xbox with full backward compatibility is a serious possibility.

hazardman4495d ago

I read something about forward compatibility, and how Xbox 360 games would be optimized for the new console's graphics. I know I read this somewhere, but I can't recall the source... sorry!

osamaq4495d ago

Radeon HD 6670? You must be kidding me.
Why?
There are much more powerful GPUs out there.
By 2013 it will be far, far more obsolete.
I will go with Sony if they produce a more powerful console.

juggulator4495d ago

It doesn't say they will be using this exact card, just that it might be close to it.

Besides, it supports DX11, right? Yes.

And that's the biggest thing as we haven't seen what DX11 can accomplish on consoles.

osamaq4495d ago

How can the Radeon HD 6670 render Avatar-quality games?!
Either the rumors are BS or AMD's president was BSing us...

fr0sty4495d ago

So I already own a more powerful GPU... No thanks.

frostypants4495d ago

FPS is entirely up to the dev. People who think every game will be 60fps next gen are missing the point. Devs could make all games 60fps THIS gen if they wanted; they'd just have to tone down other stuff like textures and particle effects. The same will apply next gen. Devs might limit many games to 30fps in order to maximize textures. I'm not sure everyone gets this...
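
frostypants' tradeoff boils down to a frame-time budget: the renderer gets 1000/fps milliseconds per frame, so halving the frame rate doubles the time available for everything else. A quick sketch of the arithmetic (nothing console-specific here):

```python
# Per-frame time budget at the two target frame rates debated in the thread.
for fps in (30, 60):
    budget_ms = 1000.0 / fps
    print(f"{fps} fps -> {budget_ms:.2f} ms per frame")
# Targeting 30 fps doubles the per-frame budget (~33 ms vs ~17 ms),
# which is the headroom devs spend on richer textures and effects.
```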

SJPFTW4495d ago

Right on. I still think most AAA games next gen, though they'll receive a huge graphical boost, will be 720p + 30fps, mainly because that gives you the best balance between graphics and performance.

Also, don't forget it's not just graphics cards. You also have to look at RAM and CPU and all that other jazz.

frostypants4495d ago (Edited 4495d ago )

Also, consoles are more than just the GPU. Their entire infrastructure is geared towards speed: memory, bus systems, etc., all 100% geared to gaming in ways that PCs are not. High-end PCs will always be king, but consoles tend to get more out of what they have due to those aspects we tend to overlook.

GraveLord4495d ago

PC graphics cards never get used to their full power.
Devs can't optimize for just one graphics card, since PC gamers have a variety of builds.

TENTONGUN4495d ago

If the next-gen console can't run next-gen engines like Unreal 4 (not 3.5), then what's the point? I won't buy it.

Hanif-8764495d ago

I am willing to wait if they opt for the Radeon HD 7970, even a full year later, because it offers a lot more performance and consumes less power while generating less heat on the 28nm process. Tell me that's not what a console needs!

TheXgamerLive4495d ago

You do realize this is FAKE, don't you?

Common sense should tell you that.

CrimsonEngage4495d ago (Edited 4495d ago )

I have a 4850 and can run BF3 all on high at 60FPS. If the only game you can get 60fps on with a 6670 is COD, then the rest of your PC must be complete trash.

A graphics card is only as good as the other components in your case. If any one of them bottlenecks, it will bring everything else down with it.

Just saying.

hiredhelp4494d ago (Edited 4494d ago )

Yeah, he's way off. I have an old ATI 4870 1GB PowerColor and I don't get anywhere near 60.
Not with one card; hell, many top-line cards struggle to get 60, including my GTX 560, unless you have SLI, which I do.
Until March, when I'll buy a 7970 or 7950; Crossfire undecided.

tee_bag2424494d ago (Edited 4494d ago )

Yeah, at what res? I have the same card in a quad-core Boot Camped iMac 27. You may know the internals aren't exactly weak, so I'm curious how you can get 60fps.

rowdyBOY4494d ago

An HD 6670 is roughly equivalent to an 8800GT.
Check this link out to see how it performs against the best cards of today so far:

http://www.anandtech.com/be...

Ha ha, that's got to be a joke.
If it isn't, then bye bye Microsoft and hello Sony.

Drekken4494d ago

I have an 8800GT in my PC, and that card is so out of date. MS is concentrating on cartoony kiddie games anyway. They don't need all that much power.

PC_Enthusiast 4494d ago

Microsoft is going to put some expensive motion-control technology in the Xbox 720 and cheap out on the hardware, like the GPU, CPU, and maybe RAM as well. :/

SephirothX214494d ago

The next-gen consoles won't pass a GTX 580 in graphics. I bought that card for 500 euros and it is gigantic. By the time next-gen consoles come out, I'll have added a second one or a GTX 600. A 580 is very powerful, and I'd be surprised if next-gen console graphics end up even 75% as good.

Scenarist4494d ago

LOL, when I read the title, that was my first thought:

"That's it?"

FanboyPunisher4494d ago

People don't know shit.

The 360 had a GPU close to the X1800XL in terms of performance; a 6670 is FAR, FAR more powerful than an X1800XL (AMD) / 7800 (Nvidia).

People forget that just because you used your card in a PC and it was meh, a console will make that same card perform far better, due to the optimization in making the games, versus PC games that don't maximize the hardware as well as a console can.

It's fine; the 6670 seems realistic, IMO. The PS3 had a worse GPU than the 360, so even next gen the PS4 will likely have a similarly specced GPU if this article is accurate.

Either way, regardless of whether it's a 6670 or not, both the PS4 and 720 will have similarly specced GPUs.

lve2playbball4494d ago

Someone's post over at IGN... thought it was funny:

"I can already see their E3 presentation now.
360 x 6 = XBOX 2160, 2+1+6+0 = 9 (German techno beats began thumping with a robot voice repeating 'nein, nein, nein, nein, nein.') Spotlight hits a sweaty shirtless David Hasslehoff and Bill Gates doing the robot to the rhythm of the techno beat, They continue the robot as avant-garde images of exotic animals floating in space wearing master chief helmets are projected on the screen; all the while the beat continues thumping, 'nein, nein, nein, nein, nein.' Bold lettering fills the screen with the phrase, "All your XBOT are belong to us". The vivid images and mesmerizing trance become too much for the crowd to handle. David Hasslehoff and Bill Gates begin rolling in a pile of money while laughing uncontrollably as the audiences eyes began to melt inside their skulls like the finale of Raiders of the Lost Ark.

Actually this was just Greg Millers response when I asked him why he texted me at three in the morning saying he had an incredibly amazing dream of the new Xbox presentation at E3."

MEsoJD4494d ago

Why are people surprised or disappointed with Microsoft going with this card? So it won't do native 1080p and/or 60 fps for most games... it really doesn't need to. Try to understand that modern consoles benefit from users sitting far away from the screen, who won't notice a lot of jaggies and edges, as opposed to PC gamers, who sit up close and actually need sharper visuals and more power.

ChrisW4493d ago

Next-gen consoles with last-gen GPUs? This is why consoles will ultimately fail!

NYC_Gamer4495d ago

They should have gone with the 6950.

pc_masterrace4495d ago (Edited 4495d ago )

I have two 6950s and they're spectacular, but far too expensive an option for a console if they're looking at a 2013 ship date.

dirthurts4495d ago

6950? Seriously? That's a $260 card. Want a $600 console? Might as well buy a PC.
Not to mention those cards are huge compared to a console case; where would you put it? And then there's the power consumption and heat output.
The 6770 is a good choice: small enough to work, affordable enough, and much more powerful than what is out there right now.
If you want raw power, buy a PC.

NYC_Gamer4495d ago (Edited 4495d ago )

I already have a gaming PC with a 570. My 60GB launch PS3 cost me 600 dollars with nothing.

Statix4495d ago (Edited 4495d ago )

A GTX 560 Ti is about equivalent to an HD 6750, and I bought one on sale for $175 in the middle of LAST YEAR. By the time late 2013 hits, the GTX 560 Ti and HD 6750 will be MUCH cheaper.

I would be surprised if the PS4 doesn't include AT LEAST a GTX 560 ti or equivalent.

SyWolf4495d ago

The actual GPU die is relatively small; you're thinking of the PCB and other components attached to it. In a console, those things wouldn't be there. You're also assuming it would be a vanilla card, which isn't realistic. Microsoft would modify it to better suit the overall design of the console.

Bob5704495d ago (Edited 4495d ago )

People seem to forget that MS will be paying a fraction of what consumers would have to pay for a comparable GPU. I'd bet that MS would pay 20-30% of what we would have to pay.

At the time the 360 launched, its GPU was comparable to a $600 graphics card. MS actually sold 360 consoles at a loss for a while; it cost MS somewhere around $500 to $750 to make.

http://www.macworld.com/art...

http://www.joystiq.com/2005...
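
Bob570's point can be roughed out with a hypothetical bill-of-materials sketch. Both inputs are his estimates from the comment above (the 20-30% OEM ratio and the $600 retail-equivalent GPU), not published figures:

```python
# Hypothetical OEM component-cost sketch using Bob570's assumed figures.
retail_gpu_price = 600  # Bob570's retail-equivalent figure for the 360's GPU
for oem_ratio in (0.20, 0.30):
    oem_cost = retail_gpu_price * oem_ratio
    print(f"At {oem_ratio:.0%} of retail: ~${oem_cost:.0f} per GPU")
# Even the high end of that range (~$180) leaves room within a
# $500-750 build cost for the CPU, RAM, drive, and assembly.
```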

peowpeow4495d ago (Edited 4495d ago )

The HD 6750 competes with the 550, not the 560 Ti. The 6850 is closer, but still not as fast.

RyuStrife4495d ago (Edited 4495d ago )

A GTX 560 Ti is actually a lot closer to an HD 6950. Even the price range is. Currently, it's at $250.

Edit: That's actually the price for a superclocked version. But, you get the idea.

Double_Oh_Snap4495d ago

I'm not the best with tech, so can somebody explain, if this turns out to be real, how it would compare to the current Xbox? Just curious.

JsonHenry4495d ago

It means it is already outdated. Badly outdated.

Check out just how bad here - http://www.videocardbenchma...

That is assuming this is even true. It could be "akin" to the 6670 but have a higher clock rate and more RAM. Or it could have even less throughput than the regular PC counterpart.

Double_Oh_Snap4495d ago

Disappointing if true, thanks for the comparison though.

Dude4204495d ago

The comparison is pointless; some of these guys don't seem to understand writing code for tons of video cards vs. writing for one video card.

The GPU in the PS3 is apparently as powerful as the old 7800 GTX, yet look at the results. Just imagine what they can do with this card.

Take this for example: the hardware in all MacBooks of a certain model is the same. OS X was very well optimized for that hardware, which is why it can run very fast compared to Windows.

JsonHenry4495d ago

Look at the results, DUDE420? It's pulling down the same low-res textures at 720p (or less) in most games that the PC counterpart gets. Yes, coding for one set of hardware can help you pull off something the PC counterpart won't. But it isn't some sort of silver bullet that overcomes outdated and slow hardware. It's called optimization for a reason: it optimizes. It doesn't turn Fermi-card performance into Kepler-card performance.

Outdated is outdated no matter how you code for it.

Dude4204495d ago (Edited 4495d ago )

Yeah, sure it's 720p, but show me a game on PC, running on a 7800 GTX, that looks as good as Killzone 3, Uncharted 3 and God of War 3. I used to have a 7950 GTX and even that had trouble running Crysis at medium settings at 1280x1024. When I upgraded to a GTX 560 Ti (obviously with new PC hardware), I was amazed by the performance leap.

I do most of my gaming on PC, but I have to say, I'm impressed with what they could dish out with the PS3 and Xbox 360 this generation. Point is, in the right hands they can work wonders with video cards. Besides, it'll most likely be a custom chip based around the performance of a 6670; who knows what they could give it.

Edit: Oh, and yes, I think you need to understand the difference between a low-end card and an outdated card. If it was outdated, it wouldn't have the latest features that all the other cards out there have (like GDDR5 VRAM and DX11); it's just a lower-performing card.

STONEY44495d ago (Edited 4495d ago )

It can't even keep a constant 60fps in Modern Warfare 2 at 1920x1200 with 4xAA, which is poor even by low-end GPU standards. It's more of a mass-production, everyday-desktop GPU than a gaming GPU.

It's obviously a good leap above the current-gen consoles (which struggle to hit 60fps in COD even at sub-HD), but it doesn't even match the $300 cards of 2008. It's BARELY above 2006's high-end 8800 GTX. It's 2012 right now.

http://www.guru3d.com/artic...

Comparison with a larger range of cards that really shows how poorly it holds up, using Far Cry 2.

http://www.guru3d.com/artic...

Which leads me to believe that there is no way that Microsoft is that stupid.

mrshooter2k124495d ago

This rumor worries me because if true (which I hope not), the device would probably spontaneously combust upon starting up even the Samaritan demo.

Shackdaddy8364495d ago (Edited 4495d ago )

To give you some perspective: the GPU from 3 years ago (2009) in the computer I built is about two to three times as good as this card.

It is better than the current Xbox's GPU, but not by much... It will probably be stuck at 1080p, 30fps its whole life...

ninjahunter4494d ago

The jump will be to about modern PC graphics at 720p; that's of course dropping a few subtle yet demanding effects like SSAO.
The biggest difference will probably be native 720p, though. It just doesn't have the raw horsepower for anything good at 1080p.

AlbatrossRevue4495d ago

Can't see M$ using this GPU on a console that, if the rumor is true, won't go into production for a year.

mrshooter2k124495d ago

So... we will have waited almost eight years for what is, at best, an abysmal update. The GPU will be almost useless if Microsoft intends to carry on with a ten-year lifecycle.

Really disappointed if this actually turns out to be true.

SleazyChimp4494d ago

None of this lines up with what AMD itself stated about the next box having Avatar-like graphics. It is rumored that MS is launching 2 versions of the new box. One was supposed to be a low-cost set-top box; this card would be a better fit for such a thing. That would leave the high-end version still in line with using the 7900 series. Plus, this is just a rumor, so take it with a grain of salt. Besides, it doesn't make any sense for MS to have a long-term plan with such underpowered tech.
