
Witcher 3 Dev says Nvidia HairWorks unoptimizable for AMD GPUs

Witcher 3 Dev says Nvidia HairWorks tech is unoptimizable for AMD GPUs, encouraging Radeon users to disable the feature for higher performance. Is Nvidia's GameWorks Tech anti-competitive?

Source: overclock3d.net
Adexus (3292d ago)

I thought this would be really obvious lol.

HairWorks in Far Cry 4 apparently had the same kind of performance impact on AMD cards that it did on Nvidia cards though, hopefully it's the same here!

Hasswell-NeverCold (3291d ago)

How much impact did it have in Far Cry 4? I haven't played it.

Well, I guess Tomb Raider's TressFX (and the whole game) worked better on AMD cards. I could play it just fine on some older AMD card (dropping a few settings of course), and when I switched to Nvidia it wasn't as much better as I'd expected. Oh yeah, that opinion is based on an AMD HD 6850 vs a GTX 760 (which I currently have).

It is really fucked that some games are basically just made for one vendor's cards.

kingeliran (3291d ago)

Yep... HairWorks & PhysX are hardware-related, so I don't think Nvidia will share its tech with its competitor AMD.

cartoonx1 (3291d ago)

Well, HairWorks in Far Cry 4 looks like crap anyway. I tested it with an AMD GPU and it wasn't really bad performance-wise; hope it's the same in this game.

Magicite (3291d ago)

Dying Light and AC Unity also had Nvidia-exclusive features. AMD is in big trouble.

Cueil (3291d ago)

TressFX is open source but NOOOO... screw over a bunch of gamers...

Activemessiah (3291d ago)

AMD owners getting shafted once again...

_LarZen_ (3291d ago)

Nothing new. And one of many reasons I am going back to Nvidia next time I upgrade.

NuggetsOfGod (3291d ago, edited)

Or the stupid PC community could stop supporting proprietary technology?

As far as I can see, AMD is the one who single-handedly changed the industry.

They forced low-level APIs onto PC and co-developed HBM memory.

Nvidia makes hair and AA.

The only reason I would get Nvidia is if AMD doesn't support the R9 390X with good drivers.

Nvidia seems to have good service though. But they don't do even one thing to benefit the industry.

If I can save $100 - $200 with FreeSync I wouldn't mind. We will see.

AMD drops the ball hard with driver support, it seems.

Other than that, they seem to be better for the industry.

Will TressFX in Deus Ex make you jump back to AMD? lol, Nvidia owners will be shafted I guess.

Proprietary tech divides PC gamers.

lol, like HairWorks won't be killing GTX fps too; PhysX kills GTX performance as well.

Everyone is shafted.

And it hardly looks better than PS4 so far.

Cueil (3291d ago)

No need... just buy yourself a cheap Nvidia card, slot it in alongside your AMD, and blam, you're good to go with DX12 and Windows 10.

Immorals (3291d ago)

I dunno, I wouldn't call it shafted. A technology designed for one company's graphics cards not working on its rival's sounds about right to me.

darksky (3291d ago)

So why do devs support this? They should develop features which are usable on all cards, not only some.

Immorals (3291d ago)

@darksky

Why limit yourself? If you can improve your product, even for a select few customers, it's worth it.

darksky (3291d ago)

@Immorals

Yet CD Projekt downgraded the PC version to be at parity with the XB1. They certainly limited themselves there.

Cueil (3291d ago)

TressFX works great and AMD made it open source... not sure why any company wants to use crap from Nvidia that doesn't work right on AMD cards

cartoonx1 (3291d ago, edited)

Even on Nvidia cards you have to use a Titan X or 980 SLI to get 60 fps with HairWorks on, so it's not worth it anyway. Also, I hope the game performs well without HairWorks on AMD, since there has been almost no game which uses GameWorks and performs well on AMD. On the other hand, almost all Gaming Evolved games perform fairly well on both GPUs.

@NeverCold - Nvidia has posted GPU requirements on their website; you can check there. It does look quite impressive with HairWorks on, but the performance drop is too much for average users. Also, the game devs said YouTube videos don't represent the final build, so maybe the final game has better graphics with the day one patch. Just 3 more days to know :)

Hasswell-NeverCold (3291d ago)

Even if that's actually true, I haven't seen much difference in PC vs PS4 comparisons on YouTube. That's a lot of money for just 30 to 60 fps. With hair. =)

Hopefully most people will be quite happy to finally play the game next week.

VJGenova (3291d ago)

One GTX 980 is said to be able to run it with HairWorks on at 60 fps, 1080p. 4K HairWorks is another story. Hoping AMD will at least be optimized enough to run 60 fps 1080p ultra with my R9 295X2. HairWorks would be great, but I've assumed from the start it wouldn't work.

cartoonx1 (3290d ago, edited)

@genova - IGN were running ultra with HairWorks and some other effects off, and they were getting dips below 60, and that's with a GTX 980 at 1080p. Also, they were using SSAO, which is a lot cheaper than HBAO+.

methegreatone (3291d ago)

Why?

Nvidia HairWorks is Nvidia technology, designed specifically to work on NV hardware.

Of course they aren't going to optimize it for AMD hardware.

Dario_DC (3291d ago)

What else is new?? Anything connected to Nvidia basically runs like crap on AMD cards... This is how Nvidia keeps gaining so much market share! They just work with/sponsor (pay) developers to use their proprietary software in these new games and make AMD cards perform like crap...
Funny that people blame AMD driver support when most of these issues are created by Nvidia... Then you have PC gamers buying Nvidia thinking they're the best... Any Nvidia-developed game WILL have issues on AMD cards; that's how it is.

uth11 (3291d ago)

Then when Nvidia is the only viable game in town, they'll lose the incentive to innovate or price competitively, and everyone will suffer.

Gwiz (3291d ago)

AMD has 3 consoles, so Nvidia is going to do everything they can to have an advantage on PC.

kingmushroom (3291d ago)

If I get the disc version on PC, do I get the digital version with it?

Cueil (3291d ago)

With most games the disc is a lie... but luckily, as long as your disc says GOG and not "Steam required", you're fine.


Here's Everything I Expect From The Witcher 4's Combat

Danish from eXputer: "The Witcher's upcoming sequel needs to overhaul the series' combat system if it wants to make a big splash among gamers."

on_line_forever (19d ago)

After what we've seen (2015 - 2024) in the combat of Dark Souls 3, Bloodborne, Sekiro, Dragon's Dogma 2, and Rise of the Ronin, they should be really careful and bring very good combat to The Witcher 4 this time.


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan (37d ago)

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious if using DLSS & AI in general will lower the power draw. It seems like the days of just adding more VRAM & horsepower are over; law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more & more.
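For context on why upscaling can lower power draw: the main mechanism is simply shading fewer pixels per frame. A minimal sketch, using the commonly cited per-axis scale factors for DLSS's modes (an assumption here, not figures from the article or the comment):

```python
# Rough shaded-pixel savings from upscaling to 4K output. Per-axis scale
# factors are the commonly cited DLSS mode values (assumed, not sourced).
out_w, out_h = 3840, 2160
for mode, s in [("Quality", 2 / 3), ("Balanced", 0.58), ("Performance", 0.5)]:
    share = s * s  # fraction of native pixels actually shaded per frame
    print(f"{mode}: {share:.0%} of native 4K pixels shaded")
```

Under those assumed factors, Quality mode shades about 44% of a native 4K frame's pixels and Performance mode about 25%, which is where most of the power and performance headroom comes from.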

Tapani (36d ago, edited)

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W do not tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPU of 10 years ago. Plus, today you can undervolt + OC GPUs by a good margin to keep stock performance while utilizing 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.
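A quick back-of-the-envelope check of that claim (the 90%-at-320W figures are the commenter's, not measured here):

```python
# Perf-per-watt implied by the undervolt claim above: 90% of stock
# performance at 320 W versus 100% at the 4090's 450 W board power.
stock_perf, stock_watts = 1.00, 450.0
uv_perf, uv_watts = 0.90, 320.0

gain = (uv_perf / uv_watts) / (stock_perf / stock_watts)
print(f"Efficiency gain: {gain:.2f}x perf-per-watt")  # ~1.27x
```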

However, in today's world chip manufacturing is limited by physics, and we will see power increases for at least the next 5-10 years to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech coming to market which we are yet to invent, or perhaps we can solve existing technologies' problems with manufacturing or cost of production.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion and solar etc., it would not matter how much these chips require. That being said, for the next 30-40 years that is a pipedream.

MrBaskerville (36d ago)

I don't think fusion is the way forward. It will most likely be too late when it's finally ready, meaning it will probably never be ready. Something else might arrive before it, though, and then it becomes viable.

Firebird360 (36d ago)

We need to stop the smear campaign against nuclear energy.
We could power everything forever if we wanted to.

Tacoboto (36d ago)

The PS4 Pro had dedicated hardware in it to support checkerboard rendering, which was used significantly in PS4 first-party titles, so you don't even need to look to PC or modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?

InUrFoxHole (36d ago)

Well... it's a coffin, man. So at least 4?

Tacoboto (36d ago)

PSSR in the fall can assume that role.

anast (36d ago)

And those nails need to be replaced annually.

Einhander1972 (36d ago)

I'm not sure what point you're trying to make, but the PS4 Pro came before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR 2.

Tacoboto (36d ago)

Um. That is my point. There have been so many nails in this "native performance" coffin, and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack (36d ago)

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, though in terms of image quality it's way behind what DLSS 3 or FSR 3 is currently offering.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" debate (that's been going around in the PC community since 2019) right out of the window.
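For anyone comparing the two techniques here, a minimal sketch of the checkerboard idea: shade half the pixels each frame on an alternating lattice and carry the other half over from the previous reconstructed frame. Real implementations (PS4 Pro's included) add motion-vector reprojection and edge-aware filtering; this toy version shows only the core pattern.

```python
import numpy as np

def checkerboard_reconstruct(shaded, history, frame_idx):
    # Rebuild a full frame from a half-shaded checkerboard render: the
    # lattice flips every frame, and un-shaded pixels are carried over
    # from the previous reconstructed frame (no reprojection here).
    h, w = shaded.shape
    ys, xs = np.mgrid[0:h, 0:w]
    mask = (xs + ys) % 2 == (frame_idx % 2)
    out = history.copy()
    out[mask] = shaded[mask]  # ~50% of pixels freshly shaded per frame
    return out

# Toy check: a static scene converges to the full image after two frames.
scene = np.random.rand(4, 4)
frame0 = checkerboard_reconstruct(scene, np.zeros((4, 4)), 0)
frame1 = checkerboard_reconstruct(scene, frame0, 1)
assert np.allclose(frame1, scene)
```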

Einhander1972 (35d ago)

RaidenBlack

DLSS, as I said, is a different thing from FSR and checkerboarding.

But you're talking about FSR 3, which probably is better than checkerboarding; FSR 3 has only started appearing in games this year, though, so checkerboarding, the first hardware upscaling solution, was and still is one of the best.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for. Heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic (36d ago)

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself..
And of course, then the idea of checkerboard rendering not being native....
For sure, maybe this tech makes the difference minimal while pixel counting, but alas, it seems "performant and close enough", not native, is what matters now.....
I want to see it run native without DLSS.. why not?

RonsonPL (36d ago)

An almost-deaf person:
- lightweight, portable $5 speakers of 0.5 cm diameter are the final nail in the coffin of hi-fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either true 500fps on a 500Hz LCD TN or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is just as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article, just to go with the flow.

There's no coffin for native-res quality and there never will be. Eventually we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorants on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA is not "native" either, as it's ruined by the developers' design choices: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass over it later. When you disable it, you see a ruined image, horrible pixelation and other visual "glitches", but that is NOT what native would have looked like if you honestly compared the two.

Stay informed.
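His 500fps point can be made concrete with simple frame-budget arithmetic. A minimal sketch; the ~0.5-1.0 ms upscaling cost used below is an illustrative assumption, not a measured figure:

```python
# A fixed upscaling cost eats a growing share of the frame as framerate
# rises (frame budget = 1000 ms / fps). The per-frame costs are assumed.
for fps in (60, 120, 500):
    budget_ms = 1000.0 / fps
    for cost_ms in (0.5, 1.0):
        print(f"{fps:>3} fps: {cost_ms} ms upscale = "
              f"{cost_ms / budget_ms:.0%} of the frame budget")
```

At 60fps an assumed 0.5 ms pass is about 3% of the frame; at 500fps that same pass is 25% of it, which is the sense in which a fixed-cost upscaler stops paying for itself at very high framerates.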

RaidenBlack (36d ago)

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of the time, but it'll slowly narrow to the point of being indistinguishable.
Something similar happened with the generative AI Sora: AI-generated videos were turds back when they were introduced (the infamous Will Smith eating video), but now look at Sora, generating videos that just look like real life.

RonsonPL (17d ago)

You can improve quality, but you will never be able to reach native quality in motion. The biggest part of why these upscalers are so praised is that they use previous-frame data. You cannot do that without degrading latency and/or hurting motion quality. If you stack another flaw on top of it, coming from the sample-and-hold method of displaying the image, or from a low framerate, then sure, the difference between "screwed-up image" vs. "image screwed up even more" may seem small or non-existent. But if you're talking about gaming, not interactive movies, the upscalers are overhyped and harmful tech for gamers and the whole gaming industry. For example, a game designed around screwed-up motion, like TAA-enabled games, will never be played with improved quality even 100 years later when hardware allows for native 16K res. The motion quality will be broken, and even if you disable the AA pass you will still get a broken image, because the devs designed their effects with a smeary filter in mind. This is why you can disable TAA in some games today, manually, with some tinkering, but you get 1-in-16 undersampled crap.
It's annoying that nobody seems to understand the serious drawbacks of AI-assisted upscalers. Everyone just praises them, calling them a great revolution. Don't get me wrong: AI has its place in rendering. But NOT in gaming.
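The "previous frame data" mechanism described above is, at its core, an exponential history blend. A minimal sketch, with alpha as an illustrative parameter (production TAA adds motion-vector reprojection and neighborhood clamping, which limit but don't eliminate the smear):

```python
import numpy as np

def taa_blend(current, history, alpha=0.1):
    # Each output pixel is mostly accumulated *old* frames: low alpha is
    # stable but smeary in motion, high alpha is responsive but shimmery.
    return alpha * current + (1.0 - alpha) * history

# A pixel that flips from 0 to 1 reaches only ~65% of its new value after
# ten frames with alpha = 0.1, since 1 - (1 - 0.1)**10 ≈ 0.651.
hist = np.zeros(1)
for _ in range(10):
    hist = taa_blend(np.ones(1), hist)
print(hist)  # ~[0.651]
```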

Yui_Suzumiya (36d ago)

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple101 (36d ago)

Maybe better to get a budget gaming laptop and link a DualSense to it.

= A portable console with far better graphics than a Steam Deck! Plus a bigger screen, and you can use it for work, etc.


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal169 (39d ago)

Echoing the sentiment here - I think the way GPUs are going, gaming could become secondary to deep learning. I wonder if the 40 series was the last true generation of GPUs?

Number1TailzFan (39d ago)

No.. Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if you can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance, there's always AMD, which is doing OK at the mid-range.

Christopher (39d ago)

***or play at just 1080p and 60fps or something***

My over-2-year-old laptop GPU still runs fine. I think this is more a reason why GPUs are shifting priority to other things: the market of new buyers is shrinking as more PC gamers hold off on replacing older, still-working parts that run RT/AI fine enough as it is. Not to say there aren't people who still upgrade, but I think the market for having the latest and greatest is shrinking compared to the past two decades. The problem is we aren't advancing at the rate we used to; we're reaching the flattening of that exponential curve. We need another major technological advancement to restart that curve.

D0nkeyBoi (39d ago)

The unremovable ad makes it impossible to read the article.

Tzuno (39d ago, edited)

I hope Intel takes some of the lead and makes a big dent in Nvidia's sales.

Jingsing (39d ago)

You also need to consider that Nvidia is heavily invested in cloud gaming, so they are likely going to make moves to push you into yet another lifelong subscription service.

Kayser81 (39d ago)

Nvidia will never change their price point until AMD or Intel makes a GPU that is comparable and cheaper than theirs.
It happened before in the days of the GTX 280, when they cut the price from $650 to $450 in a matter of 2 weeks because of the HD 4870, which was being sold at $380.
