720°

Why 'Watch Dogs' Is Bad News For AMD Users -- And Potentially The Entire PC Gaming Ecosystem

Forbes.com's Jason Evangelho has a fascinating story on the disappointment that owners of AMD graphics cards are about to feel when they discover that Watch Dogs is largely unoptimized for their hardware.

NYC_Gamer3642d ago

Nvidia aren't wrong for not going the shareable route with their tools/features..

starchild3642d ago

True. It's no different from Sony or Microsoft not sharing their exclusives or features or tools with each other.

UltraNova3642d ago (Edited 3642d ago )

Who said anything about sharing proprietary code?

It's the developers' fault; their damn release date windows are to blame here!

If they wanted to partner up with Nvidia and their GameWorks program, that's their right, but it's also their responsibility, when they optimize their game for Nvidia products, to sit down (if possible in parallel) with AMD and optimize their code for their products as well. They have to do this since they cater to both sides of the fence. Plus, they are expecting sales from AMD users, right? Incidentally, 40% of the market!

After reading the whole article and their benchmarking tests, it's obvious that not only is WD an unoptimized mess in general, but AMD users are screwed big time, and don't get me started on Crossfire's and SLI's non-existent support.

That's disrespectful to AMD customers, and customers in general, to say the least. And all this because they didn't have the balls to delay the game a few months more and deliver something actually finished. I would have respected that: shut up and wait for it!

I'm officially putting this game on the back burner until they fix it.

Born2Game833642d ago

You are talking about 2 entirely different console manufacturers vs PC. It's not the same thing.

ProjectVulcan3641d ago (Edited 3641d ago )

It's a poor show that Ubisoft have screwed AMD users, but this is why a lot of people will pay a bit more for Nvidia hardware: I daresay this happens less frequently for the green team.

The problem is that Nvidia is still considered the benchmark by most developers working on PC; they generally prefer to use Nvidia hardware in testing. Think back to Microsoft demonstrating their early Xbox One titles....on Nvidia-powered PCs. This is despite the console itself being entirely AMD powered!!!

TWIMTBP ("The Way It's Meant To Be Played") was a whole bunch of Nvidia engineers working hard to improve PC gaming and games support on Nvidia hardware. Many criticise it for allegedly biasing developers against AMD, but if Nvidia are going to provide all that support at their own expense, you can see exactly why many developers would take advantage of it.

AMD's counter to this Nvidia favouritism is mainly to pay to be associated with big titles and to support such titles themselves, for example Battlefield 4 recently.

Ubisoft are at fault. But it's still really in AMD's interests to chase down developers of really big titles to ensure game compatibility.

Giul_Xainx3641d ago

I bought this game for pc. I have an ati radeon 5750....

I also own a PS3 and want a PS4.... I think I know what my choice is going to be in the future from now on. Buy a console and stick with it.

frostypants3641d ago (Edited 3641d ago )

No, it's totally different. PC is a largely open platform. Consoles are not.

Having gamers worry about whether or not their video card's proprietary crap is supported is a huge problem. It's not like people are going to install 2 cards.

mrmarx3642d ago

AMD is the Xbone of the CPU world.. weak

Jonny5isalive3642d ago

yeah and they also make the APU in the PS4 as well, what a bunch of slow crap huh.

brads43641d ago

PS4 is built with AMD components. People are so stupid.

FlyingFoxy3641d ago (Edited 3641d ago )

That's why the R9 290 was so much cheaper than Nvidia's overpriced cards and forced them to drop prices a lot on the 780?

AMD are the ones keeping prices in check; if it weren't for them, Nvidia would keep ripping us off even more than they do now. And they are not "weak" in the least with their GPUs.

3-4-53641d ago

* Basically, people WON'T buy this game because they have an AMD graphics card.

* Devs will take note of that in the future and make sure games are compatible and run well on AMD, and it will affect Nvidia's graphics card sales down the line.

^ Not a TON, or a lot, but there will be some cause and effect from this.

ITPython3641d ago (Edited 3641d ago )

One of these days I plan to build my own high-end PC with all the bells and whistles, but it's stuff like this that makes me want to stick to consoles for my gaming needs.

It may not have the power of a PC, but it sure is a heck of a lot more convenient knowing I don't have to mess with my console or worry about games not playing on it correctly. And if there are bugs and issues, they usually get resolved pretty quickly, because if one person has the problem, everybody likely has the same exact issue. Whereas with PC gaming and the insane amount of variety between hardware, OSes, and software, if one person has a problem it's likely local to them, and the devs won't bother to look into it unless it's a large-scale problem affecting the majority of PC gamers.

Plus, I'm a bit OCD when it comes to my PC's performance, and I get real irritated when it doesn't perform as expected; I sometimes spend hours tweaking settings and troubleshooting. So if I'm playing a game and, say, the frame rate gets a bit iffy or there are some performance issues, I'll probably spend more time messing with the game's settings and my computer's settings than I would enjoying the game. With consoles, if there's a performance issue, there's nothing I can do about it, so I don't let the issue get in the way of enjoying the game, because it's out of my hands.

sourav933642d ago

Those GTX 770 benchmark numbers make me happy. Not because they're better than the 290X numbers (shame on you Ubisoft!), but because it means I'll be able to run the game decently (gtx 770). Hopefully Ubisoft fixes this AMD issue, as I've got a few friends who use AMD cards, and I wanna play WD with them online without their game getting screwed up all the time.

starchild3642d ago

I can confirm that it runs well on my gtx 770. It's a demanding game but it's doing a lot and I think it looks great.

choujij3642d ago

I tested it on a gtx 770 with a 4770K processor at 1080p, and it often stutters and lags on ultra settings.

adorie3642d ago

How much VRAM do you have?

uso3641d ago

I have an AMD R9 280X and I can run the game at 1080p on ultra, with 40 to 50 fps.

Kayant3642d ago

"Hopefully Ubisoft fixes this AMD issue" - They can never fix it; neither Ubisoft nor AMD knows what is going on with the code running in the GameWorks library. It's a black box to everyone apart from AMD, and that's the problem with this. When a 290X that goes toe to toe with a Titan, and keeps up quite well to the point of beating it at times in non-GameWorks-optimized games, can't beat a 770, you know something is very wrong.

Kayant3642d ago

"It's a black box to everyone apart from AMD" - Meant to say Nvidia there :p.

ginsunuva3642d ago

You thought a 770 would have trouble with this game?

raWfodog3641d ago

According to choujij (#3.1.1), his gtx770 is having issues.

duplissi3642d ago

I was thinking that it was running somewhat slower than I imagined it would. Regardless, it runs well enough on my 290X. I imagine there will be a game update and a driver update soon that will correct this.

flozn3641d ago

Keep in mind that this ONLY applies to the 4GB GTX 770.

sourav933641d ago

The only difference between the 4GB and 2GB 770 would be the texture settings. Everything else will be identical.

Are_The_MaDNess3642d ago

You get what you pay for, I guess.
I'm glad that Nvidia is patching games before release and is teaming up with so many devs to support their cards and exclusive features; don't think I'll change from NV any time soon.

VJGenova3642d ago (Edited 3642d ago )

Did you read the article? It stated that the $500 R9 290X didn't run as well as the $300 GTX 770... Not sure what you mean...

Ogygian3641d ago

Brand loyalty is only going to allow Nvidia to continue to get away with ludicrously high prices for their cards in future.

lets_go_gunners3642d ago

It's one game... No need to overreact. DICE and other devs still support Mantle, so AMD will be relevant going forward whether people want to believe it or not.

270°

AMD gaming revenue declined massively year-over-year, CFO says the demand is 'weak'

Poor Xbox sales have affected AMD's bottom line

Read Full Story >>
tweaktown.com
RonsonPL13d ago

Oh wow. How surprising! Nvidia overpriced their RTX cards by +100%, and AMD, instead of offering real competition, decided to join Nvidia in their greedy approach, while not having the same mindshare as Nvidia (sadly) does. The 7900 launch was a marketing disaster. All the reviews were made while the card was not worth the money at all; they lowered the price a bit later on, but not only was it not enough, it was also too late, outside the "free marketing" window that comes with a new card generation's release. Then the geniuses at AMD axed the high-end SKUs with increased cache etc., because "nobody will buy expensive cards to play games", while Nvidia laughed at them selling their 2000€ 4090s.
Intel had all the mindshare among PC enthusiasts with their CPUs. All it took was a competitive product and a good price (Ryzen 7000 series, and especially the 7800X3D) and guess what? AMD regained market share in DIY PCs in no time! The same could've happened with Radeon 5000, Radeon 6000 and Radeon 7000.
But meh. Why bother. Let's cancel high-end RDNA 4 and use the TSMC wafers for AI, and then let the clueless "analysts" write their articles about "gaming demand dwindling".

I'm sure low-end, very overpriced and barely faster (if not slower) RDNA 4 will turn things around. It will have AI and RT! Two things nobody asked for, especially not gamers who'd like to use the PC for what's most exciting about PC gaming (VR, high-framerate gaming, hi-res gaming).
The 8000 series will be slow, overpriced and marketed on its much improved RT/AI... and it will flop badly.
And there will be no sane conclusions made at AMD about that. There will be just one, insane: gaming is not worth catering to. Let's go into AI/RT instead, what could go wrong...

Crows9013d ago

What would you say would be the correct pricing for new cards?

Very insightful post!

RonsonPL13d ago

That's a complicated question. Depends on what you mean: the pricing at the release date, or the pricing planned ahead. They couldn't just suddenly end up in a situation where their existing stock of 6000 cards is unsellable, but if it was properly rolled out, the prices should be where they were while the PC gaming industry was healthy. I recognize the arguments about inflation, higher power draw and PCB/BOM costs, more expensive wafers from TSMC etc., but still, PC gaming needs some sanity to exist and be healthy. The past few years were very unhealthy and dangerous for the whole of PC gaming.

AMD should recognize that this market is very good for them, as they have an advantage in software for gaming, while other markets, however attractive short term, may just be too difficult to compete in. AI is the modern-day gold rush, and Nvidia and Intel can easily out-spend AMD on R&D. Meanwhile, gaming is tricky for newcomers, and Nvidia doesn't seem to care that much about gaming anymore. So I would argue that it should be in AMD's interest to sell even some Radeon SKUs at zero profit, just to prevent PC gaming from collapsing.

Cards like the 6400 and 6500 should never have existed at their prices. This tier was traditionally "office only" and priced at $50 in the early 2000s. Then we have the Radeon 7600, which is not really a 6-tier card; those were traditionally quite performant cards based on a wider-than-128-bit memory bus. Also, 8GB is screaming "low end". So I'd say the 7600 should've been available at below $200 (+taxes etc.) as soon as possible, at least for some cheaper SKUs. For faster cards, the situation is bad for AMD, because people spending $400+ are usually fairly knowledgeable and demanding. While personally I don't see any value in upscalers and RT for $400-700 cards, the fact is that DLSS especially is a valuable feature for potential buyers. Therefore, even the 7800 and 7900 cards should be significantly cheaper than they currently are.

People knew what they were paying for when buying a Radeon 9700, 9800, X800, 4870 etc. They were getting a gaming experience truly unlike console or low-end PC gaming. By all means, let's have expensive AMD cards at even above $1000, but first AMD needs to show value. Make the product attractive. PS5 consoles can be bought at $400. If AMD offers just a slightly better upscaled image on a $400 GPU, or their $900 GPU cannot even push 3x as many fps as a cheap console, the pricing acts like a cancer on PC gaming. And poor old PC gaming can endure only so much.

MrCrimson13d ago

I appreciate your rant, sir, but it has very little to do with GPUs. It is the fact that the PS5 and Xbox are at the end of their cycle, before a refresh.

RonsonPL12d ago

Yes, but also no. AMD let their PC GPU market share shrink by a lot (and accidentally helped the whole market shrink in general, due to the bad value of PC GPUs over the years), and while their console business may be important here, I'd still argue their profits from the GPU division could've been much better if not for mismanagement.

bababooiy13d ago

This is something many have argued over the last few years when it comes to AMD. The days of them selling their cards at a slight discount while having a similar offering are over. It's not just a matter of poor drivers anymore; they are behind on everything.

RNTody12d ago (Edited 12d ago )

Great post. I went for an Nvidia RTX 3060 Ti, which was insane value for money when I look at the fidelity and frame rates I can push in most games, including new releases. Can't justify spending 3 times what my card cost at the time to get marginally better returns, or the big sell of "ray tracing", which is a nice-to-have feature but hardly essential given what it costs to maintain.

KwietStorm_BLM13d ago

Well that's gonna happen when you don't really try. I want to support AMD so badly and give Nvidia some actual competition, but they don't seem very interested in challenging, by their own accord. I've been waiting for them to attack the GPU segment the same way they took over CPU, but they just seem so content with handing Nvidia the market year after year, and it's happening again this year with their cancelled high-end card.

MrCrimson13d ago

I think you're going to see almost zero interest from AMD or Nvidia in the gaming GPU market. They are all in on AI.

RhinoGamer8813d ago

No Executive bonuses then...right?

enkiduxiv13d ago

What are you smoking? You've got to layoff your way to those bonuses. Fire 500 employees right before Christmas. That should get you there.

Tapani13d ago (Edited 13d ago )

Well, if you are 48% down in Q4 in your gaming sector, as they are, which in absolute money terms is north of 500M USD, then you are not likely to get your quarterly STI, but you may still be eligible for the annual STI. The LTI may be something you are still eligible for, such as RSUs or other equity and benefits, especially if they are based on the company's total result rather than your unit's. It all depends on your contract and AMD's reward system.

MrCrimson13d ago

Lisa Su took AMD from near-bankruptcy to one of the best semiconductor companies on the planet. AMD went from $2 a share to $147. She can take whatever she wants.

Tapani12d ago

You are not wrong about what she did for AMD, and that is remarkable. However, MNCs' reward schemes do not work like "take whatever you want, because you performed well in the past".

darksky13d ago

AMD priced their cards thinking they would sell out just like in the mining craze. I suspect reality hit home when they realized most gamers cannot afford to spend over $500 on a GPU.

100°

Make your next GPU upgrade AMD as these latest-gen Radeon cards receive a special promotion

AMD has long been the best value option if you're looking for a new GPU. Now even their latest Radeon RX 7000 series is getting cheaper.

Father__Merrin23d ago

Best for the money is the Arc cards

just_looken23d ago

In the past, yes, but last gen AMD has gotten cheaper, and their new cards are on the horizon, making the 6000 series even cheaper.

The Arc cards are no longer made by Intel, but Asus/ASRock have some; the next line, Battlemage, is coming out, prices TBD.

Due to the longer software development, it's always best to go AMD over Intel if it's not too much more money, even though Intel makes a strong GPU; I own 2/4 card versions.

330°

Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan33d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS & AI in general will lower the power draw. It seems like the days of just adding more VRAM & horsepower are over. Law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough to hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more & more.

Tapani33d ago (Edited 33d ago )

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating do not tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPU of 10 years ago. Plus, today you can undervolt + OC GPUs by a good margin to keep stock performance while utilizing 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.
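To put rough numbers on that claim, here is the perf-per-watt arithmetic it implies (a minimal sketch; the 450W stock and 90%-performance-at-320W figures come from the comment above, not from measurements):

```python
# Perf-per-watt comparison for the undervolt scenario described above.
# All input figures are taken from the comment, not measured.
stock_perf, stock_watts = 1.00, 450.0   # stock performance (normalized), power
uv_perf, uv_watts = 0.90, 320.0         # undervolted performance, power

stock_eff = stock_perf / stock_watts    # performance per watt at stock
uv_eff = uv_perf / uv_watts             # performance per watt undervolted
gain = uv_eff / stock_eff - 1.0         # relative efficiency improvement

print(f"efficiency gain: {gain:.1%}")   # ~ +26.6% perf per watt
```

So even giving up 10% of the performance, the card does roughly a quarter more work per watt under these assumptions.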

However, in today's world chip manufacturing is limited by physics, and we will see power increases over at least the next 5-10 years to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we have new tech coming to the markets which we are yet to invent or perhaps we can solve existing technologies problems with manufacturing or cost of production.

On the other hand, if we were to solve the energy problem on earth by utilizing fusion and solar etc. it would not matter how much these chips require. That being said, in the next 30-40 years that is a pipedream.

MrBaskerville32d ago

I don't think fusion is the way forward. It will most likely be too late when it's finally ready, meaning it will probably never be ready. Something else might arrive before then, and then it becomes viable.

Firebird36032d ago

We need to stop the smear campaign on nuclear energy.
We could power everything forever if we wanted to.

Tacoboto33d ago

PS4 Pro had dedicated hardware in it to support checkerboard rendering, which was used significantly in PS4 first-party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?

InUrFoxHole33d ago

Well... it's a coffin, man. So at least 4?

Tacoboto33d ago

PSSR in the fall can assume that role.

anast33d ago

and those nails need to be replaced annually

Einhander197233d ago

I'm not sure what the point you're trying to make is, but the PS4 Pro came before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 still rivals the likes of FSR2.
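For readers unfamiliar with the technique being discussed, checkerboard rendering can be illustrated with a toy sketch (made-up pixel values, and not Sony's actual implementation, which also uses ID buffers and motion-aware filtering): each frame shades only half the pixels in a checkerboard pattern, and the gaps are filled from the previous frame's output.

```python
def checkerboard_reconstruct(prev_frame, new_samples, parity):
    """Merge the half of the pixels shaded this frame (checkerboard
    pattern selected by `parity`) with the previous frame's pixels."""
    h, w = len(prev_frame), len(prev_frame[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == parity:      # pixel shaded this frame
                out[y][x] = new_samples[y][x]
            else:                          # gap: reuse last frame's value
                out[y][x] = prev_frame[y][x]
    return out

# Alternating parity each frame means every pixel gets re-shaded at
# least once every two frames, at roughly half the shading cost.
prev = [[1, 1], [1, 1]]
new = [[5, 5], [5, 5]]
print(checkerboard_reconstruct(prev, new, 0))  # [[5, 1], [1, 5]]
```

Static scenes reconstruct almost perfectly this way; fast motion is where the stale half-frame shows, which is why real implementations add motion compensation on top.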

Tacoboto33d ago

Um. That is my point. There have been so many nails in this "native performance" coffin, and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack32d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, but in terms of image quality it's way behind what DLSS 3 or FSR 3 is currently offering.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" debacle (that's been going around in the PC community since 2019) right out of the window.

Einhander197232d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboard.

But you're talking about FSR 3, which probably is better than checkerboard; FSR 3 has only started to get games this year, though, so checkerboarding, the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for. Heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic33d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself..
And of course, then the idea of checkerboard rendering not being native....
For sure, maybe this tech makes it minimal while pixel counting, but alas, it seems "performant and close enough", not native, is what matters now.....
I want to see it run native without DLSS.. why not?

RonsonPL33d ago

Almost deaf person:
- lightweight portable $5 speakers of 0.5cm diameter are the final nail in the coffin of Hi-Fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (look at the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either one that gets a true 500fps on a 500Hz LCD TN or OLED (or faster tech), or one that uses a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is just as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article, just to flow with the current.

There's no coffin for native-res quality and there never will be. Eventually we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorants on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA is not "native" either, as it's ruined by the developers' design choices: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass on it later. When you disable it, you will see a ruined image, horrible pixellation and other visual "glitches", but it is NOT what native would've looked like if you wanted to honestly compare the two.

Stay informed.
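The 500fps point can be made concrete with simple frame-budget math (the ~1 ms upscaling cost below is an illustrative assumption; real pass times vary by GPU, resolution, and upscaler):

```python
UPSCALE_MS = 1.0  # assumed cost of an upscaling pass, in milliseconds

def budget_share(fps, pass_ms=UPSCALE_MS):
    """Fraction of the per-frame time budget consumed by a fixed-cost pass."""
    frame_budget_ms = 1000.0 / fps
    return pass_ms / frame_budget_ms

# The same fixed-cost pass eats a larger share of the budget as fps climbs.
for fps in (60, 144, 500):
    print(f"{fps:>3} fps: {1000.0 / fps:5.2f} ms budget, "
          f"{budget_share(fps):.0%} spent on the pass")
```

Under these assumptions, a pass that costs 6% of a 60fps frame would consume half of a 500fps frame's 2 ms budget, which is the shape of the argument being made.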

RaidenBlack32d ago

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of the time, but it'll slowly narrow to the point of being indistinguishable.
Something similar is happening with generative AI, like Sora... AI-generated videos were a turd back when they were introduced (the infamous Will Smith eating video), but now look at Sora, generating videos that just look like real life.

RonsonPL13d ago

You can improve quality, but you will never be able to reach native quality in motion. The biggest part of why these upscalers are so praised is that they use previous-frame data. You cannot do that without degrading latency and/or hurting motion quality. If you put another flaw on top of it, coming from the sample-and-hold method of displaying the image, or from a low framerate, sure, the difference between "screwed up image" vs. "image screwed up even more" may seem small or non-existent. But if you talk about gaming, not interactive movies, the upscalers are overhyped and harmful tech for gamers and the whole gaming industry. For example, a game designed around screwed-up motion, like the TAA-enabled games, will never be played with improved quality even 100 years later, when hardware allows for native 16K res. The motion quality will be broken, and even if you disable the AA pass, you will still get the broken image, because the devs designed their effects with a smeary filter in mind. This is why you can disable TAA in some games today, manually, with some tinkering, but you get 1-in-16 undersampled crap.
It's annoying that nobody seems to understand the serious drawbacks of AI-assisted upscalers. Everyone just praises them and calls them a great revolution. Don't get me wrong: AI has its place in rendering. But NOT in gaming.
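What "using previous-frame data" means mechanically can be sketched in one dimension (a toy model with an illustrative blend weight `alpha`; real TAA/DLSS-style accumulation adds jitter, clamping, and learned heuristics): each output pixel blends the new sample with history fetched along the motion vector, so a heavy history weight smooths the image but makes it converge over several frames, which is where the motion-lag complaint comes from.

```python
def temporal_accumulate(history, samples, motion, alpha=0.1):
    """Blend new samples with history reprojected by `motion` pixels.
    Low `alpha` = strong history reuse = smoother but laggier image."""
    out = []
    for x, s in enumerate(samples):
        src = x - motion                 # where this pixel was last frame
        if 0 <= src < len(history):
            out.append(alpha * s + (1 - alpha) * history[src])
        else:                            # history miss: raw (noisier) sample
            out.append(s)
    return out

# A sudden scene change from 0 to 1 takes many frames to converge
# at alpha=0.1, i.e. the image visibly lags behind the new content:
frame = [0.0] * 4
for _ in range(3):
    frame = temporal_accumulate(frame, [1.0] * 4, motion=0)
print([round(v, 3) for v in frame])  # [0.271, 0.271, 0.271, 0.271]
```

After three frames the accumulated value has only reached ~27% of the new signal, which is a crude illustration of the history-induced lag the comment is describing.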

Yui_Suzumiya32d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it, but certain titles suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple10132d ago

Maybe better to get a budget gaming laptop and link a DualSense to it.

= Portable console with far better graphics than a Steam Deck! + bigger screen, and able to use it for work / etc.