
Kakkoii

Contributor
CRank: 18 · Score: 157070

Someone like SuperDaE could set that up in minutes; the process could very easily be automated. Hell, most people could manage it with a little bit of Google sleuthing.

3990d ago · 0 agree, 0 disagree

@zod18: Sorry to burst your bubble, but this isn't like any regular electronic device. Anybody can get their hands on an iPhone or Android devkit, and in Android's case, the source code. The Xbox One is a completely closed system; only a game developer can get some mid-level access to the console's operation, and even then it's nothing that would let them take control of the console without the user-permission checks built into the OS.

These kinds of features are the fu...

3992d ago · 0 agree, 1 disagree

The Call of Duty series actually does keep itself at the forefront of graphics tech for the most part. The teams often release whitepapers on new tech they've developed; some Black Ops tech was even featured at SIGGRAPH 2011. Here's the whitepaper:
http://advances.realtimeren...

4062d ago · 0 agree, 0 disagree

Lol dude, there are tonnes of 4K video games, because on PC, many games let you set the resolution to WHATEVER YOU WANT.

4109d ago · 0 agree, 2 disagree

Just watch, someone will come out with a modded piece of software for Shield that will allow you to do it with AMD cards as well somehow haha. The platform is said to be completely unlocked, so there isn't much stopping it.

4114d ago · 0 agree, 0 disagree

Well, I would argue that the type of people with money to spend on niche toys like this are the type who typically go for Nvidia already anyway, since the little bit of extra money is nothing to them, and they get PhysX/CUDA support along with Nvidia's reputation for more stable drivers and game support.

4114d ago · 0 agree, 0 disagree

@FriedGoat: Not true actually. Shield uses 2x2 Wireless N, which allows for around 300Mbps of bandwidth. 1080p Blu-ray movies are encoded at around 17Mbps, in extreme cases 30Mbps, and that's already high enough that there are no visible compression artifacts (Blu-ray video is compressed, just not visibly so).

So this device can EASILY stream 1080p, multiple 1080p streams if it needed to (although this device has a 720p screen... so there would be no point).
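A quick back-of-envelope check of that claim. This is only a sketch: the 300Mbps figure is the 2x2 802.11n link rate, and the ~50% usable-throughput haircut is just an assumption, not a measured number.

# Rough headroom check: 2x2 802.11n vs. Blu-ray-class 1080p streams.
# The PHY link rate is a ceiling; assume ~50% of it is usable in practice.
phy_rate_mbps = 300           # 2x2 802.11n link rate
usable_mbps = phy_rate_mbps * 0.5
typical_stream_mbps = 17      # typical 1080p Blu-ray video bitrate
worst_case_mbps = 30          # high-bitrate extreme case

print(f"Usable bandwidth: ~{usable_mbps:.0f} Mbps")
print(f"Typical 1080p streams that fit: {int(usable_mbps // typical_stream_mbps)}")
print(f"Worst-case 1080p streams that fit: {int(usable_mbps // worst_case_mbps)}")

Even with that conservative haircut, there's room for several simultaneous 1080p streams.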

4114d ago · 0 agree, 0 disagree

Not really. It seems aimed at families who were still on the fence about spending over $150 on a Wii, and the nice tiny form factor makes it even more attractive to those consumers.

It gives Nintendo a version of the Wii with which to drain the last bit of market they can out of it. They won't be updating the console, and there aren't all that many online games for the Wii that people care to play, especially for those who would be buying the console this late...

4182d ago · 0 agree, 0 disagree

It doesn't work with AMD cards. It's CUDA-based and there is no OpenCL/GL version, as they state those aren't mature/developed enough for these purposes. But that's no problem, since most serious studios that deal with rendering will be using Nvidia GPUs.

This is a great rendering engine, although it's nothing groundbreaking. It's definitely one of the fastest GPU raytracers, but there are still plenty of them out there, some with even more features, that ma...

4182d ago · 0 agree, 0 disagree

There's nothing wrong with saying "upon the earth". In fact, it's good to use such terms because it keeps the English language lively, instead of letting it descend into the hell of abbreviations and slang that creeps in more and more with each generation.

4306d ago · 0 agree, 1 disagree

How else would a game dev be able to give support other than by making a game?

4321d ago · 0 agree, 0 disagree

Actually, there are 3 supercomputers in the top 10 that utilize Nvidia GPUs.

http://en.wikipedia.org/wik...

Of course, they aren't 100% GPU-based supercomputers, since GPUs aren't the best at EVERYTHING. CPUs are still needed for the more serial tasks.
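That serial/parallel split is basically Amdahl's law: the serial fraction of a workload caps how much all that GPU parallelism can help. A minimal sketch, where the 90% parallel fraction is just an assumed number for illustration:

def amdahl_speedup(parallel_fraction, n_units):
    # Amdahl's law: speedup = 1 / ((1 - p) + p / n)
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_units)

# Even with thousands of GPU cores, a 10% serial portion caps speedup near 10x.
for n in (8, 512, 10000):
    print(f"{n:>6} units -> {amdahl_speedup(0.9, n):.1f}x speedup")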

Also, the top spot is soon to be replaced by this monster:
4323d ago · 0 agree, 0 disagree

If anyone is familiar with the injectSMAA hack that was floating around for a while, this is basically the full implementation of the tech behind it, which was developed with the help of two Crytek devs and Nvidia. It's quite amazing actually.

4328d ago · 0 agree, 0 disagree

Yes, the CARD costs that much, but the chip itself costs less than $80 each to make. Microsoft or Sony only need the chip, not a complete working graphics board; they'll be incorporating the chips into their own motherboards on their own manufacturing lines. So the costs are A LOT lower for them, not to mention they get a very good mass-production discount due to buying in the millions.

But it likely won't be the highest end of a series; it's just too large of a chip...

4335d ago · 1 agree, 0 disagree

They're probably still using the same base engine, merely updating the graphics and creating new characters, game UI, multiplayer, story, etc. Nintendo knows that hell would be raised if the core mechanics were changed in any significant way.

4344d ago · 2 agree, 0 disagree

Well, the 4850 is quite old now. We've already had the 5xxx, 6xxx and, just a few months ago, 7xxx series. The 4xxx series came out in mid-2008 on a 55nm fab process. The Wii U will likely use 40nm, or, if they managed to work it out with AMD, 28nm, which would drastically reduce the price of the chip.
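Rough math on why the process node matters. This is idealized scaling; a real design never shrinks perfectly linearly, so treat it as a sketch:

def shrunk_area_ratio(old_nm, new_nm):
    # Ideal die-area scaling: area goes with the square of the feature-size ratio.
    return (new_nm / old_nm) ** 2

for new_nm in (40, 28):
    ratio = shrunk_area_ratio(55, new_nm)
    print(f"55nm -> {new_nm}nm: ~{ratio:.0%} of the original die area")
# Smaller dies mean more chips per wafer, so cost per chip drops roughly in step.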

4427d ago · 1 agree, 0 disagree

Yeah, it supposedly uses a Radeon 4850.

4427d ago · 4 agree, 0 disagree

@hiredhelp: Because the market allows them to get away with it. A person pays a price based on a card's performance relative to the competition, not its die size or originally intended segment.

This lets them sit back and rake in tons of profit on this small chip, putting further financial distance between them and ATI/AMD, and gives them more time to perfect the full-sized version. It will likely come out as the 7xx series, to combat ATI's next gen around August.

4427d ago · 3 agree, 0 disagree

4GB models are coming out soon as well though.

And even in the most unbiased ones, Nvidia is winning the majority of the benches. And this is still with early drivers; AMD has had time to release a few driver iterations that improved performance in a lot of games, and Nvidia will be doing the same, as both always do. And in some instances the 680 is drastically more powerful... so... we shall see.

4427d ago · 3 agree, 1 disagree

This is Nvidia's coolest-running architecture yet. Even cooler, smaller and more power efficient than ATI's.

The chip would just be scaled down, possibly optimized even more, and run at a lower clock speed to make it cool enough for a phone, like with any processor.

It's just that this one would be much more powerful at phone clock rates than past GPUs in phones.
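The rule of thumb behind that is that dynamic power scales roughly with frequency times voltage squared (P ~ C * V^2 * f), so downclocking plus a modest voltage drop buys a lot of thermal headroom. A toy illustration with made-up numbers, not real chip specs:

def relative_power(v_ratio, f_ratio):
    # Dynamic power rule of thumb: P ~ C * V^2 * f (C = switched capacitance).
    return (v_ratio ** 2) * f_ratio

# Halve the clock and drop voltage ~20%: power falls to about a third.
print(f"Relative power: {relative_power(0.8, 0.5):.0%}")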

4432d ago · 0 agree, 0 disagree