500°

GTX 580 gets substantial price cut

With Kepler just a few weeks away, Nvidia partners are cutting GTX 580 prices to clear inventory, or to make the ageing cards a bit more competitive against AMD’s 28nm Radeons.

Read Full Story >>
fudzilla.com
NYC_Gamer4465d ago

People looking for a nice high end gpu should jump on this

Sarcasmology4465d ago

That would probably just break it. I'd recommend replacing your current graphics card with it.

SilentNegotiator4464d ago

Not that you could jump on it very well if you wanted to. It still costs a hand and a leg.

gamingdroid4464d ago

Yeah, these cards cost more than what I plan on spending on games this entire year.

RumbleFish4464d ago

Time to get my second 580! Framerate up! Metro: Last Light, I embrace you!

Kurylo3d4464d ago

@RumbleFish

What... 150 fps wasn't enough for you... going to need 300? lol

sikbeta4464d ago (Edited 4464d ago )

Wow! You can put two of those things on the same motherboard? @_@ Last time I tried to install my s*** old Nvidia GT-something-00 it was a pain, because the thing alone is like half the size of the motherboard in my PC, wtf!?

SantistaUSA4465d ago

I'm glad that the price is finally coming down. I got an EVGA GTX 570 back in November 2010 for $360 and I love it. I'm using it with an i7 960 and 12GB of DDR3, and I game at 1080p (55" LED); it runs everything at max without a problem! So I have no need to upgrade for a while. Whoever gets the GTX 580 will be happy for a long time!

D3vilzRightHand4464d ago

I have the same rig almost :D

1x overclocked i5 760 @ 3.8GHz
1x water cooler
1x 1TB SSD
1x 2TB HDD
1x GeForce GTX 560 Ti 1GB (PhysX/CUDA)
12GB RAM
750W PSU

Plays everything maxed out @ 1080p on a 28" LED.

kaveti66164464d ago

I have a GTX 570 with 1280 gigs of VRAM and I have a question. If I want to run Skyrim with the HD texture pack from the modding community, can I do that with just 1 gig of VRAM? I heard the textures are 2K.

I feel really bad for sticking with the 1 gig version of the card instead of getting the 2 gig one.

SantistaUSA4464d ago (Edited 4464d ago )

@kaveti6616 I guess it will depend on the resolution you're playing at. I've only tried the game without the HD texture pack, so I can't tell how it impacts performance.

kaveti66164464d ago

I have a 1920 by 1080 monitor so I know that in most cases a single gig will do, but I heard Skyrim's HD texture pack will have 2k textures and wonder if this will demand more than a gig. Of course, I do have 1.28 gigs to work with.

I also heard that in BF3, the game will utilize system RAM to render, but that drops the frame rate considerably.

Ghoul4464d ago (Edited 4464d ago )

Kaveti

2K refers to the resolution of the texture, not its VRAM consumption.
We've had 2K textures for years now.

Nvidia can process textures up to 4096x4096
ATI can process textures up to 2048x2048

Almost every texture is then compressed to further reduce the VRAM footprint.

Uncompressed texture sizes are as follows:

256x256 = 0.25MB (192KB no alpha)
512x512 = 1MB (0.75MB no alpha)
1024x1024 = 4MB (3MB no alpha)
2048x2048 = 16MB (12MB no alpha)
4096x4096 = 64MB (48MB no alpha)

Compression algorithms can reduce those numbers by a factor of 4 or more
I could go on and on, but the VRAM size of your GPU doesn't determine the texture resolution you can display.

Always keep in mind that textures have mipmap versions too, and the higher-res versions only get loaded once you get closer to them.

I could go on and on but that would take some pages ;)

2K textures are mostly pointless, since you can produce sharp textures much more elegantly with various tricks and composite materials instead of simply cranking up the texture size and using up VRAM for nothing.

Just remember your screen is most likely around 1920x1080. To see a texture's full benefit you'd need that one texture filling the screen, and even then you can't see the complete 1:1 texture size.

One last thing: no, you don't need every texture in VRAM, only what's displayed on screen, and even then textures get reloaded from system RAM when needed.
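
For anyone who wants to sanity-check those numbers, here's a rough back-of-the-envelope sketch (my own illustration, not from the comment above; it assumes 8 bits per channel and the roughly 4:1 ratio of common DXT-style block compression, and the function name is made up):

def texture_vram_mb(size, channels=4, mipmaps=True, compression=1):
    # Rough VRAM footprint in MB of a square 8-bit-per-channel texture.
    # compression: ~4-6 for DXT/BC formats, 1 for uncompressed.
    base = size * size * channels          # bytes, one byte per channel
    if mipmaps:
        base = base * 4 / 3                # a full mip chain adds roughly 33%
    return base / compression / (1024 * 1024)

print(texture_vram_mb(2048, mipmaps=False))   # 16.0 MB, matching the table above
print(texture_vram_mb(2048, compression=4))   # ~5.3 MB with mips and 4:1 compression

So even a full set of 2K textures compresses down to a few MB each, which is why a 1GB-class card can still display them.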

steve30x4464d ago

@ kaveti6616 : You have 1280 gigabytes on your GPU? Holy shit you have an extremely rare GPU. You could sell that for a huge fortune.

Denethor_II4464d ago

You can't be running any AA then. I'm using a GTX 580 with an i5 2500K and can run BF3, for example, at 60fps @ 1080p on ultra, but I still have to turn AA off. For me, I'd rather have 60 frames than slightly smoother edges.

steve30x4464d ago

I don't know why I got disagreed with. Read what the guy said himself: he said he has a 1280 gig GTX 570.

TooTall194465d ago

I've been looking into building or buying a PC. I would probably go with something like this.

SSultan4464d ago

@TooTall19, just keep in mind that by the time all the parts arrive at your door, your rig will be outdated. Unfortunately, that's the way it is.

superrey194464d ago

@ssultan
No they won't. Go troll somewhere else please.

RumbleFish4464d ago

Depending on the resolution you want to run at, you will have a lot of fun with one of those cards for some time.

Not every game in the next 2 years will have BF3-like graphics, so no need to worry as long as 1080p is enough for you.

JsonHenry4465d ago

Awesome. But I am gonna get the Kepler card. I think we are supposed to have a flood of Kepler news tomorrow. Not sure if the NDA is gonna be lifted tomorrow or if Nvidia is just doing a PR event about it.

Orpheus4464d ago

Yeah, you will get a special benefit in PhysX... though maybe only Metro: Last Light will use it extensively. I guess that will be the game that makes the buy worth it.

JsonHenry4464d ago

I want it to play BF3, in 3D, at max settings. Not to mention Crysis 2 in 3D, with DX11 + high-res textures at max settings. Then don't get me started on games like the new ARMA that are coming soon.

josephayal4465d ago

Good news for SONY and the Next Playstation Super HI DEF 1080Px 3 Console

LightofDarkness4464d ago (Edited 4464d ago )

Wow, you've been trying really hard with the trolling lately, seems you have a comment in almost every article possible, and yet no one bites. It must be so sad to fail at something so easy no matter how hard you try. I mean, it's literally the easiest thing to do on the Internet, and you can't do it.

ABizzel14464d ago

It's possible it could happen, but not likely.

As consumers we constantly forget that companies buying in bulk pay anywhere from 30% - 70% less than us (usually 33% - 50%). Each console manufacturer plans on buying at least 50 million GPUs, so they're probably Nvidia's and AMD's single largest customer, and Nvidia and AMD want to keep that business.

But spending $250 ($350 - 33%, plus taxes) on the GPU alone means another $500 - $600 PS4, which is not going to happen. But that's the price if it came out today. I don't expect to see a PS4 until holiday 2013 at the earliest, and in a year's time there will be another price drop on 580s.

Also consoles generally get an optimized version of an existing GPU to better fit their needs, so that could drop the price a bit as well.

He may have been trolling, but it's possible. Unless the rumors of Sony going unified AMD are true.

gamingdroid4464d ago

@ABizzel1

MS seems to have pioneered a new business model. They contract out the design to companies like ATI and own the design. They then contract out the manufacturing to two or three vendors which compete with each other. TSMC is one of them and I forget who the other one was.

In short, Nvidia squeezed the living sh!t out of MS on the GPU price, and that was one of the main reasons why the original Xbox was discontinued. It wouldn't surprise me if Sony went ATI next generation, as I'm sure they are being price-squeezed too.

That said, even for self-manufactured parts, a 70% lower cost is excessive... out of curiosity, do you have a source to back that up?

awi59514464d ago

@gamingdroid

The funny thing is Nvidia was disappointed that Microsoft didn't pick them this gen, lol. No sh**, they didn't pick you because you ripped them off all last gen. Nvidia charged Microsoft something like $150 for each GPU sold in the Xbox and wouldn't let them drop the price. So even though the rest of the parts got cheaper, Nvidia made them keep paying the same price for outdated tech. That's why the Xbox could never compete on price with the PS2, and that's why Microsoft killed it off.

ABizzel14463d ago

@gamingdroid

I have personal experience. My first job was in retail, and our stores purchased all items at 30% - 70% off the price we sold them for (most of the time 30% - 50%). Jewelry in particular is way overpriced and sold at a 50% - 70% markup.

I also worked with contractors when I was in Media Services at a hospital, and we had to purchase equipment for things like television installations for hundreds of rooms across multiple hospitals, as well as equipment for recording surgeries, events, conferences, etc. We generally received a 30% discount off normal retail due to buying in bulk versus 1 or 2 items.

gamingdroid4462d ago (Edited 4462d ago )

@ABizzel1

Certain products have a very high margin, including water, make-up, jewelry, brand-name clothing, etc. It is not uncommon for a retailer to have a 40% markup to cover ongoing business operations.

However, most consumer electronics (if not all) have closer to a 10-20% markup at retail, with some exceptions.

Also, if you buy products intended for commercial use (instead of consumers), the markup tends to be much higher, just because they can.

Ever wondered why a fancy commercial LCD panel still costs thousands, while a consumer one costs $500?

Computer chips are heavily commoditized except for high-end chips, and even those can be if you contract out the manufacturing. Chips are sold in bulk in general, so the margins are already thin and companies earn by volume.

Plagasx4465d ago (Edited 4465d ago )

When do the price cuts take place in the US? It doesn't seem to be in effect yet...

KRUSSIDULL4464d ago

I'm about to put a second GTX 580 in my PC.

SantistaUSA4464d ago

Nowadays that's overkill; we need more demanding games. I can't justify an SLI setup, as I can already play everything at 1080p.

But if I had plenty of cash I wouldn't even think twice :)

JAMurida4464d ago

I have to second that...

I've done a lot of research into building gaming PCs (since I'm about to do so soon), and I see A LOT of people who Crossfire/SLI. But it seems most of these games run just fine maxed out with just one very good card. Yet you still see people with rigs that make it seem as if they're running Crysis 2, Skyrim and BF3 at max settings at the same time on the same rig. I understand wanting to be future-proof and all, but damn...

330°

Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan49d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS, and AI in general, will lower the power draw. It seems like the days of just adding more VRAM & horsepower are over; law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more & more.

Tapani49d ago (Edited 49d ago )

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating don't tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPUs of 10 years ago. Plus, today you can undervolt + OC GPUs by a good margin to keep stock performance while using 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.
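
A quick back-of-the-envelope check of that claim (the 90% and 320W figures are from the comment above, not measurements):

stock_perf, stock_watts = 1.00, 450   # 4090 at its stock power limit
tuned_perf, tuned_watts = 0.90, 320   # claimed performance retained after tuning

print(tuned_watts / stock_watts)                                 # ~0.71 -> ~29% less power
print((tuned_perf / tuned_watts) / (stock_perf / stock_watts))   # ~1.27 -> ~27% better perf/W

If those numbers hold, that's roughly a quarter better performance-per-watt just from power limiting and undervolting.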

However, in today's world chip manufacturing is limited by physics, and we will see power increases over at least the next 5-10 years to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech coming to market that we have yet to invent, or perhaps we can solve existing technologies' manufacturing or production-cost problems.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion, solar, etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that is a pipe dream.

MrBaskerville49d ago

I don't think fusion is the way forward. It will most likely be too late by the time it's finally ready, meaning it will probably never get its chance. Something else might arrive before then, though, and become the viable option instead.

Firebird36049d ago

We need to stop the smear campaign on nuclear energy.
We could power everything forever if we wanted to.

Tacoboto49d ago

PS4 Pro had dedicated hardware in it for supporting checkerboard rendering that was used significantly in PS4 first party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?

InUrFoxHole49d ago

Well... it's a coffin, man. So at least 4?

Tacoboto49d ago

PSSR in the fall can assume that role.

anast49d ago

and those nails need to be replaced annually

Einhander197249d ago

I'm not sure what the point you're trying to make is, but PS4 Pro was before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR2.
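
For anyone unfamiliar with the technique, here's a rough conceptual sketch of what checkerboard rendering does (an illustration only; the PS4 Pro's real implementation uses dedicated ID-buffer hardware and motion vectors rather than naively reusing old pixels):

import numpy as np

def checkerboard_frame(prev_frame, shade_fn, frame_idx):
    # Shade only half the pixels each frame, in an alternating checker pattern,
    # and keep the other half from the previous frame's result.
    h, w = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    mask = (ys + xs) % 2 == (frame_idx % 2)
    out = prev_frame.copy()
    out[mask] = shade_fn(ys[mask], xs[mask])   # only ~50% of pixels are rendered fresh
    return out

That's why it gets close to native quality for roughly half the shading cost, and why its image quality depends on how well the "stale" half gets corrected.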

Tacoboto49d ago

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack49d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, but in terms of image quality it's way behind what DLSS 3 or FSR 3 currently offers.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" debate (that's been going around in the PC community since 2019) right out of the window.

Einhander197248d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboard.

But you're talking about FSR 3, which probably is better than checkerboard; FSR 3 has only started getting games this year, though, so checkerboard, which was the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for. Heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic49d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself..
And of course, then the idea of checkerboard rendering not being native....
For sure, maybe this tech makes the difference minimal while pixel counting, but alas, it seems performance and "close enough", rather than native, is what matters now.....
I want to see it run native without DLSS.. why not?

RonsonPL49d ago

An almost-deaf person:
- lightweight, portable $5 speakers with 0.5cm drivers are the final nail in the coffin of hi-fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either true 500fps on a 500Hz TN LCD or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article just to go with the flow.

There's no coffin for native-res quality and there never will be. Eventually we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of the ignorant on the internet. TAA is not "native", and the awful look of modern games when you disable any TAA is not "native" either, as it's ruined by the developers' design choices: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass over it later. When you disable it, you see a ruined image, horrible pixelation and other visual "glitches", but that is NOT what native would have looked like if you wanted to honestly compare the two.

Stay informed.
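
For readers who want the short version of why temporal techniques trade motion quality for stability, the "previous frame data" reuse boils down to something like this minimal sketch (illustrative only, with made-up names; real TAA and temporal upscalers add motion-vector reprojection and history clamping on top):

def temporal_accumulate(history, current, alpha=0.1):
    # Blend a small fraction of the new frame into the accumulated history.
    # Low alpha = stable, smooth image but slow response to motion (ghosting/smear);
    # high alpha = responsive but shimmering and aliased.
    return [(1 - alpha) * h + alpha * c for h, c in zip(history, current)]

The stability comes precisely from leaning on old frames, which is also where the motion artifacts described above come from.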

RaidenBlack49d ago

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of that time, but it'll slowly narrow to the point where it's indistinguishable.
Something similar happened with the genAI Sora... AI-generated videos were a turd back when they were introduced (the infamous Will Smith eating spaghetti video)... but now look at Sora, generating videos that just look like real life.

RonsonPL30d ago

You can improve quality, but you will never be able to reach native quality in motion. The biggest reason these upscalers are so praised is that they use previous-frame data. You cannot do that without degrading latency and/or hurting motion quality. If you stack another flaw on top of it, coming from the sample-and-hold method of displaying the image or from a low framerate, then sure, the difference between "screwed-up image" and "image screwed up even more" may seem small or non-existent. But if you're talking about gaming, not interactive movies, the upscalers are overhyped and harmful tech for gamers and the whole gaming industry. For example, a game designed around screwed-up motion, like TAA-enabled games, will never be played with improved quality even 100 years later when hardware allows native 16K res. The motion quality will be broken, and even if you disable the AA pass you will still get a broken image, because the devs designed their effects with a smeary filter in mind. This is why you can disable TAA in some games today, manually, with some tinkering, but you get 1:16 undersampled crap.
It's annoying that nobody seems to understand the serious drawbacks of AI-assisted upscalers. Everyone just praises them and calls them a great revolution. Don't get me wrong, AI has its place in rendering. But NOT in gaming.

Yui_Suzumiya49d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple10149d ago

Maybe better to get a budget gaming laptop and link a DualSense to it

= a portable console with far better graphics than a Steam Deck! Plus a bigger screen, and you can use it for work, etc.

170°

Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal16952d ago

Echo sentiment here - I think the way GPUs are going, gaming could be secondary to deep learning. Wonder if the 40 series was the last true generation of GPUs?

Number1TailzFan52d ago

No.. Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if you can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance, then there's always AMD, which is doing OK at the mid-range.

Christopher52d ago

***or play at just 1080p and 60fps or something***

My over-2-year-old laptop GPU still runs fine. I think this is more a reason why GPUs are shifting their priorities to other things: the market of new buyers is shrinking as more PC gamers focus less on replacing older, still-working parts that run RT/AI well enough as it is. Not to say there aren't people who still do it, but I think the market for having the latest and greatest is shrinking compared to the past two decades. The problem is we aren't advancing at the rate we were; we're reaching the flattening of that exponential curve in regards to advancement. We need another major technological advancement to restart that curve.

D0nkeyBoi52d ago

The unremovable ad makes it impossible to read the article.

Tzuno52d ago (Edited 52d ago )

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing52d ago

You also need to consider that Nvidia is heavily invested in cloud gaming. So they are likely going to make moves to push you into yet another lifelong subscription service.

Kayser8152d ago

Nvidia will never change their price point until AMD or Intel makes a GPU that is comparable and cheaper than theirs.
It happened before in the days of the GTX 280, when they changed the price from $650 to $450 in a matter of 2 weeks because of the Radeon HD 4870, which was being sold at $380.

230°

Nvidia AI Demo Unwittingly Proves that Human Voice Actors, Artists, and Writers are Irreplaceable

Nvidia presented Covert Protocol, a tech demo aiming to showcase the "power" of the Nvidia Ace technology applied to video game characters.

Read Full Story >>
techraptor.net
Eonjay72d ago (Edited 72d ago )

They look like they are in pain. Almost begging to be put down. It was uncomfortable to watch.

PRIMORDUS73d ago

The tech is too early. Come back in 10+ years and see what it can do then.

N3mzor73d ago

That presentation sounds like it was written by an AI using corporate buzzwords.

CS773d ago

I don’t know why people keep thinking of it as AI vs no AI.

A much more likely scenario is the use of AI alongside human work.

E.g. AI voices used during side quests or banter to boost the number of lines of dialog.

AI generating additional predetermined branches in dialog-tree options for more freedom in conversations with NPCs.
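
As a concrete illustration of that last idea, branches could be generated offline during development and shipped as ordinary dialog-tree data, so nothing needs to run on the player's machine. This is a hypothetical sketch; the function names and the generator callable are placeholders, not any real tool:

def expand_dialog_node(node, generate_variants, n=3):
    # node: {"npc_line": str, "responses": list[str]}
    # generate_variants: any offline text generator (e.g. an LLM called during authoring)
    suggested = generate_variants(node["npc_line"], n)
    node["responses"].extend(f"[NEEDS REVIEW] {s}" for s in suggested)   # writers vet before shipping
    return node

base = {"npc_line": "The caravan leaves at dawn.", "responses": ["Safe travels."]}
fake_gen = lambda line, n: [f"Variant {i} reply to: {line}" for i in range(n)]
print(expand_dialog_node(base, fake_gen))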

Smellsforfree72d ago

"AI generating additional pre determined branches in dialog tree options for more freedom in conversations with NPCs"

I'm wondering about that last one. Will that make a game more fun or more immersive? In the end, how can it possibly be more than filler content, and if it is filler content, how much do I really want to engage in conversing with it if I know it will lead nowhere?

MrBaskerville72d ago

It's one of those things that sounds cool on paper. But will probably get old fast.

DivineHand12572d ago

The tech is now available, and it is up to creators to create something unique with it.

Profchaos73d ago (Edited 73d ago )

The biggest thing to talk about here is that every interaction requires communication with Inworld's servers, so there are three big impacts here:
1) Games are always online, no question about it.
2) Delays in processing on Inworld's servers, outages, or unexpected load from some astronomically popular game will cause real-time in-game delays. Ever waited for a ChatGPT response? This will be similar, as the context must be pulled via the LLM.

Now as for the other impact, the artistic one: no, I don't think writers can be replaced. I've mentioned before that AI-generated writing is often word soup, and I still stand by that; it's also evident in the video too.
AI cannot accurately convey human emotions, and I don't think it ever will.

I know publishers are looking to cut down on development costs, but what happens when Inworld decides to charge per interaction or updates their pricing a year after your game goes live? You have no choice but to pay it or shutter the game.

I've felt for a while that we are heading towards games being disposable entertainment, and now it's feeling more and more accurate.
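
To make the latency concern in point 2 concrete, here's a minimal hypothetical sketch of what an online NPC dialogue call looks like; the endpoint URL, payload fields, and fallback line are all made up, not Inworld's actual API:

import requests

FALLBACK_LINE = "Hmm... give me a moment."   # canned dialogue for outages or slow responses

def npc_reply(player_text, context, url="https://example-dialogue-service/generate"):
    # Ask a remote LLM service for an NPC line; degrade to a pre-written line
    # if the server is slow or unreachable (the "always online" problem).
    try:
        resp = requests.post(url, json={"text": player_text, "context": context}, timeout=1.5)
        resp.raise_for_status()
        return resp.json()["line"]
    except requests.RequestException:
        return FALLBACK_LINE

Every line of banter becomes a network round trip, so server load, outages, and pricing changes all sit directly in the play experience.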
