GeForce.com writes, "According to Valve's hardware survey results, the 2008-released GeForce 9800 GT is the most popular graphics card used on Steam. While many gamers are still using three-year-old GPUs, interestingly enough, Valve's data also states that the most popular gaming resolution, by a vast majority, is 1920x1080 (21.1%). This discrepancy in what hardware the majority of people have versus what resolutions their displays support presents an interesting predicament. Essentially, users running 9800 GTs and even older graphics cards would have to make compromises in performance, graphical settings, or both to play games at such a high resolution. With the price of large 1920x1080-capable monitors dropping like flies, the popularity of HD gaming will only continue to grow, and because we wanted gamers to be able to play modern games at 1920x1080, the way they were meant to be played, we have designed the affordable GeForce GTX 560. "
Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.
I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS & AI in general will lower the power draw. It would seem like the days of just adding more VRAM & horsepower are over. Law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more & more.
The PS4 Pro had dedicated hardware for supporting checkerboard rendering, which was used heavily in first-party PS4 titles, so you don't even need to look to PC, let alone modern PC gaming. The first RTX cards released nearly six years ago, so how many nails does this coffin need?
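For anyone who hasn't seen how checkerboard rendering works: each frame shades only half the pixels in an alternating checkerboard pattern, and the other half is carried over from the previous frame. Here's a toy numpy sketch of the idea; it's a naive illustration (no motion vectors, no ID buffer), not Sony's actual hardware-assisted implementation:

```python
import numpy as np

def checkerboard_mask(h, w, frame_index):
    """Boolean mask selecting half the pixels; which half alternates
    between the two checkerboard cell sets from frame to frame."""
    yy, xx = np.mgrid[0:h, 0:w]
    return ((yy + xx + frame_index) % 2) == 0

def checkerboard_render(shade, h, w, num_frames):
    """Shade only the masked half each frame; fill the other half
    from the previous reconstructed frame (naive: no motion vectors)."""
    recon = np.zeros((h, w), dtype=np.float32)
    for f in range(num_frames):
        mask = checkerboard_mask(h, w, f)
        fresh = shade(f, h, w)                # what full shading would produce
        recon = np.where(mask, fresh, recon)  # new half + carried-over half
        yield recon

# Example: a scene that brightens over time. Each reconstructed frame
# mixes freshly shaded pixels with pixels held over from the frame before.
shade = lambda f, h, w: np.full((h, w), f / 10.0, dtype=np.float32)
for frame in checkerboard_render(shade, 4, 4, 3):
    print(frame)
```

Real implementations are smarter about reuse: the PS4 Pro's ID buffer and motion vectors tell the reconstruction pass when the carried-over half is stale and needs re-resolving.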
Almost deaf person:
- lightweight, portable $5 speakers with 0.5 cm drivers are the final nail in Hi-Fi audio's coffin!
Some people in 2010:
- smartphones are the final nail in console gaming's coffin!
This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either one that gets a true 500 fps at 500 Hz (TN LCD, OLED, or faster tech), or one that uses a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.
Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article just to drift with the current.
There's no coffin for native-res quality and there never will be. Eventually, we'll have enough rasterization performance to drive 500 fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500 fps a frame lasts only 2 ms, so the amount of time required for upscaling makes it completely useless.
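The back-of-the-envelope math, assuming a hypothetical fixed upscaler cost of around 1 ms (the real figure varies by GPU and quality mode; treat it purely as an assumption for illustration):

```python
# Frame-budget math. UPSCALER_MS is an assumed illustrative figure,
# not a measured DLSS/FSR number.
UPSCALER_MS = 1.0

for fps in (60, 144, 500):
    budget_ms = 1000.0 / fps            # total time available per frame
    share = UPSCALER_MS / budget_ms     # fraction eaten by upscaling
    print(f"{fps:>3} fps: {budget_ms:5.2f} ms/frame, "
          f"upscaler eats {share:.0%} of the budget")
```

At 60 fps a 1 ms pass is about 6% of the 16.7 ms budget; at 500 fps that same pass is half of the 2 ms budget, which is the point being made above.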
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorant people on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA is not "native" either, as it's ruined by the developers' design choices: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass over it later. When you disable that pass, you will see a ruined image, horrible pixelation, and other visual "glitches", but it is NOT what native would have looked like if you honestly compared the two.
Stay informed.
How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.
Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."
I echo the sentiment here. I think the way GPUs are going, gaming could become secondary to deep learning. I wonder if the 40 series was the last true generation of GPUs?
You also need to consider that NVIDIA is heavily invested in cloud gaming, so they are likely going to make moves to push you into yet another lifelong subscription service.
NVIDIA will never change their price point until AMD or Intel makes a GPU that is comparable and cheaper than theirs.
It happened before in the days of the GTX 280, when they cut the price from $650 to $450 in a matter of two weeks because the Radeon HD 4870 was selling at $380.
Nvidia presented Covert Protocol, a tech demo aiming to showcase the "power" of the Nvidia ACE technology applied to video game characters.
I don’t know why people keep thinking of it as AI vs no AI.
A much more likely scenario is the use of AI alongside human work.
E.g., AI voices used during side quests or banter to boost the number of dialog lines.
AI generating additional predetermined branches in dialog-tree options for more freedom in conversations with NPCs.
The biggest thing to talk about here is that every interaction requires communication with Inworld servers, so there are three big impacts here:
1) Games are always online, no question about it.
2) Delays in processing on Inworld servers, outages, or unexpected load from some astronomically popular game will cause real-time delays in-game. Ever waited on a ChatGPT response? This will be similar, as the context must be pulled via the LLM.
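To make point 2) concrete, here's a rough sketch of the pattern any such game would need: fire the dialog request to the remote NPC service and fall back to pre-written lines when the server is slow or down. The inworld_request function, the timeout, and the latency figure are all hypothetical, purely for illustration:

```python
import asyncio

CANNED_LINES = ["Hmm.", "Give me a moment...", "What were we talking about?"]

async def inworld_request(prompt: str) -> str:
    """Stand-in for a round trip to a remote NPC-dialog server.
    Hypothetical: simulates network + LLM latency with a sleep."""
    await asyncio.sleep(2.5)  # pretend the server is under heavy load
    return f"(server reply to: {prompt})"

async def npc_say(prompt: str, timeout_s: float = 0.5) -> str:
    """Ask the server, but never stall the game past timeout_s:
    on timeout or connection failure, fall back to canned dialog."""
    try:
        return await asyncio.wait_for(inworld_request(prompt), timeout_s)
    except (asyncio.TimeoutError, OSError):
        return CANNED_LINES[hash(prompt) % len(CANNED_LINES)]

# With the simulated 2.5 s server delay, this prints a canned line.
print(asyncio.run(npc_say("Where is the blacksmith?")))
```

And that fallback is exactly the compromise: the moment the servers hiccup, your "AI NPC" degrades to whatever static lines shipped on disc.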
Now, as for the other impact, the artistic one: no, I don't think writers can be replaced. I've mentioned before that AI-generated writing is often word soup. I still stand by that, and it's also evident in the video too.
AI cannot accurately convey human emotions, and I don't think it ever will.
I know publishers are looking to cut development costs, but what happens when Inworld decides to charge per interaction, or updates its pricing a year after your game goes live? You have no choice but to pay up or shutter the game.
I've felt for a while that we are heading toward games being disposable entertainment, and now that feels more and more accurate.
You can buy a mid-range GPU now for $120+.
Why use a 9800-series card or older?
http://www.tigerdirect.com/...
I need this for Duke Nukem!
Wow, a replacement for the GTX 460. The 560 is just better at OC, and that's cool. Those people with old GPUs might want to upgrade now; the 560 might hit a sweet spot.
Got a GTX 580, but I prefer my 1680x1050 monitor (120Hz) over my old 1920x1080 at 60Hz.
Resolution doesn't matter that much when you're close enough.
Plus, a 24-inch monitor gave me headaches and sometimes sore eyes.
22" is perfect at my viewing distance.
I currently have a 9800GT :D
Funny, after I got that card I noticed the 8800 GT was the most popular - but the two cards perform very similarly and use the same Nvidia G92 chip and RAM specifications (performance varies more by card manufacturer than by the 9800/8800 distinction; some 9800 GT G92 chips were manufactured on 55nm, which let them draw less power and therefore potentially clock higher), so it is odd that the 9800 GT has become more popular. I think what has happened is that a large number of 8800 GT owners upgraded because their cards began to break or underperform due to age, so they moved to something like an HD 5850 or similar.
The 560 is great, but the ATI 6950 (2GB version) is what you want, because they openly allow you to unlock it to a 6970 and it will perform to the same specs without any problems. Us 9800 GT owners will disappear soon too. I have been gaming at 1680x1050 and I have to lower the graphics quality quite a bit to keep my fps high - I'll get a 6950 soon.