Watch Dogs on PC to get more exclusive NVIDIA treatment


Posted on August 28, 2013 at 7:34 pm

Ubisoft and NVIDIA’s PC gaming partnership is bearing more fruit, with the graphics card company announcing that they’ll be giving the same treatment to Watch Dogs as they did with Splinter Cell: Blacklist.

While they’re still being coy about how exactly the game will be enhanced, it’s all part of the “unprecedented technical alliance” between the two companies. Splinter Cell: Blacklist, which just wrapped up the same treatment, certainly shipped with a truckload more features than any generic console port, so hopefully the same can be said of Watch Dogs.

We recently took a look at Watch Dogs at this year’s E3. Get that preview here.

26 comments

Yay, soft-locked PhysX features for no reason!


Yay, soft-locked PhysX features for no reason!

^pretty much this


There’s a lot of money to be made with these types of partnerships, so no surprises here. Guess I need to start learning what the heck’s so special about TXAA.


More badly optimized and arbitrarily proprietary features shoveled on top of a console port by Nvidia. Forgive me for not being excited.


LOL, console port? That term is mostly irrelevant for the new consoles as they are literally just PCs in a box…

However, the nVidia-only stuff is pretty much a waste of time. The consoles are already optimised for AMD hardware, so nVidia are just trying to keep a foot in the door, as they know a drought is coming for them.


The term is fairly relevant, actually. Effort still needs to be put into developing UI, controls, and variable graphics settings etc. It’s easier to make ports, but it’s still going to be very easy to make lazy ones.


Even in Radeon optimized games the Radeon cards were never superior in my experience. I’ve never regretted switching to Nvidia.


Agreed Hobomaster, but at the same time, it’s dependent on the studio whether they use the hi-res textures and detail level for the PC versions. Most games are authored at higher-than-PC detail and then reduced to console level; when studios port lazily, they just use the console assets, as there’s no extra work done to remake them from the original very-high-detail assets. Pity that some studios don’t just include PC-level assets in their pipeline from the start.

@yeapal – Having used both many times in PC builds, I don’t see much of an advantage to having nVidia over Radeon. Most of the driver horror stories I see are primarily user problems rather than the drivers themselves. Some obviously aren’t, but there are just as many issues with team green as there are with team red. Hell, I have an issue with my Radeon (a hardware fault, too), but that doesn’t mean I have a kneejerk reaction and say “OMG Radeonz are all crapz”. Having played with a Titan, a couple of GTX 780s and a couple of GTX 680s vs a lot of 7970s, I haven’t found the nVidia cards holding such a massive lead. Plus I still prefer the image quality of the Radeons.


Been using nVidia-based cards since my old TNT2 with 32MB on board. Those were the days… not that I hate Radeon, I just prefer nVidia.

That said, the big feature of going nVidia with Watch Dogs/Splinter Cell: Blacklist is slightly blurrier edges? Quick everybody, throw out your Radeons and rush out to grab an nVidia immediately, this is going to be sooooo awesome! <_<

(there has to be something else, I’m sure, but still…)


The term is fairly relevant, actually. Effort still needs to be put into developing UI, controls, and variable graphics settings etc. It’s easier to make ports, but it’s still going to be very easy to make lazy ones.

Again the hobo has it right: the term “console port” has never related to performance but to user-friendliness.

You were talking about next-gen consoles making the term irrelevant, then you bring up “it’s up to them if they include hi-res textures”. If next gen doesn’t use hi-res textures, what the hell is the point of next gen?

vcatkiller: (there has to be something else I’m sure, but still…)

PhysX, probably. Nvidia’s acquisition of PhysX was when I stopped buying Nvidia cards, because before that happened I didn’t realise how obnoxious their business practices were. Taking a technology and having it locked to your card is just not behaviour I want to support.


Last I heard, nVidia actually offered AMD PhysX support for their cards and AMD said no because they were going to use hardware Havok, which ended up dying. So if that’s true it’s actually scumbag AMD, not scumbag nVidia.
Looks like that was half the case: nVidia claimed they wanted to keep the API open for anybody to contribute. Shortly thereafter AMD started on about hardware-based Havok.
Don’t see anything re a breakdown between Havok and AMD, but you don’t hear anything about the thing since 2008, so…


Interesting that PhysX is performance-locked considering they want to keep it “open”. There’s more to this story than “you can have it, AMD”; there had to be some kind of catch, or else why would you gimp CPU performance when handling PhysX?

Admittedly I’m out of my element here; hardware is not my area of expertise. It just seems odd to me to have an open platform that you performance-lock.


Gimp CPU performance? What are you talking about? It was true that once upon a time PhysX was written with an outdated instruction set, which caused terribad CPU performance, but that was fixed ages ago. Unless it’s been discovered otherwise, CPUs can’t handle PhysX because they’re not powerful enough.

Unless you’re talking about those times when they hardcode the requirement of an nVidia card before you can use the physics, which is lame, though not very relevant given the CPU can’t handle it very well anyway (someone got BL2 running CPU PhysX and it was unplayable).

As for more to the AMD story: I believe nVidia were going to license PhysX to them, so whether nVidia was asking for too much or AMD were just being cheap (probably, imo, given they thought they had their own hardware physics solution on the horizon) we’ll likely never know.



I don’t completely know the specifics either, but it’s possible (and high levels of guessing here on my part, mind you) that nVidia locked it down after no other graphics manufacturer showed any interest. Intel had just acquired Havok, ATI had decided to go with Intel and Havok instead of accepting nVidia’s offer, and who else in the field produces any sort of graphics cards? Let’s just lock our tech.

Then of course AMD acquired ATI, which is probably why the Havok thing went quiet (not sure Intel would be interested in licensing their tech to a rival CPU company, you know…). I guess if they really wanted to, nVidia could re-unlock their tech and allow it to work on different GPUs, but now that it’s a selling point I can’t see that happening.


exe3: CPUs can’t handle PhysX because they’re not powerful enough.

My CPU usage in Borderlands 2 drops when I enable PhysX; if the problem were calculation power I would be maxing it out. The same thing occurs with Arkham Asylum. For Borderlands 2 I actually found a workaround that fixed this, and have had no issues with PhysX affecting frame rates since.

I have nothing beyond anecdotes to offer here, as again hardware is not my area of expertise and I don’t spend a large amount of time researching its ins and outs. But it seems odd to me that it supposedly isn’t performance-locked, and yet my CPU gets gimped to using a smaller amount of its capacity than it otherwise would without PhysX.

And really, since PhysX is (let’s be honest) kind of a gimmick, I just don’t see locking gimmicks like that away, or even really using them in general, as something good for PC gaming as an industry.


I haven’t regretted switching back to nVidia. Radeons may be superior for power and price, but I had horrible experiences with dodgy Catalyst drivers each time I’ve had a Radeon. Even though those issues were ironed out with eventual drivers, they were bad enough to swear me off trying Radeon ever again (or at least until they do something revolutionary enough to their software that they have to do away with the Catalyst name to usher in a new era for themselves).


Nvidia freely gave PhysX to PS3, Wii, and Xbox 360 developers, etc. AMD wanted to go another way, which is fair enough. AMD did not want PhysX (GPU), and the performance lock people talk about is that AMD users are stuck with the CPU version of PhysX.

AMD do not show any interest in PhysX and will always talk it down at every opportunity. I find it interesting that owners of AMD cards want PhysX, and yet AMD, the makers of the cards, do not want to support it. AMD card users should be asking AMD why they do not want the faster GPU PhysX.

CUDA, PhysX Are Doomed, Says AMD’s Roy Taylor

Said matter has been a tough topic over recent years, even resulting in claims that NVIDIA “hobbles” CPU PhysX performance on purpose, to make their GPUs look more advantageous.

However, recently we saw many reports (mostly from AMD users) that Borderlands 2 shows surprisingly good performance, while running with all PhysX effects enabled even without a NVIDIA card in the system.



This game has never been a console port; it has always been first and foremost a PC game, then ported to consoles, as officially stated by Ubisoft representatives.
