Hacked Nvidia drivers make any DisplayPort monitor into a G-Sync monitor (no module required)
Too bad my main monitor is DVI only; my secondary 21:9 is my only DP screen.
Well, he enabled it to work on DisplayPort 1.2a / 1.3 and eDP.
Isn't that then adaptive sync, which is part of the VESA spec (what AMD FreeSync is built on)?
I hope you guys aren't ordering DP cables to do this, because it doesn't work. The guy who wrote the blog seems sort of crazy and posted mostly inaccurate info; all he really did was repost a leaked OEM driver that enables adaptive refresh rate on two laptops. The funny thing is that every single comment on the blog the OP linked is people saying it doesn't work.
Never mind, my card isn't supported. Shit. Only the 650 Ti Boost is supported...
Has anyone ever had issues with tearing before? I always turn off V-Sync and haven't had a single issue with tearing.
This whole G-Sync stuff is foreign to me, and I think what Brt posted is more informative than the video posted afterwards.
[QUOTE=Used Car Salesman;47043619]Nvidia are cunts. I am definitely never spending a fucking dime on one of their products again.[/QUOTE]
That means using the Catalyst Control Center, so enjoy that lump of dump.
[QUOTE=Elspin;47042604]UHHHHHH, is there some kind of backing for this, or did this person literally just claim that Nvidia software would work better on AMD hardware with no proof or basis other than their own imagination?[/QUOTE]
AMD GPUs have a lot more ALUs/shaders than NVIDIA GPUs, although NVIDIA tries to compensate with a higher clock speed. In terms of raw performance (instructions per second) AMD knocks them out of the park, which is why people hoarded their GPUs back when bitcoins were still mined on GPUs.
[url]https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU#Why_are_AMD_GPUs_faster_than_Nvidia_GPUs.3F[/url]
[QUOTE]Firstly, AMD designs GPUs with many simple ALUs/shaders (VLIW design) that run at a relatively low frequency clock (typically 1120-3200 ALUs at 625-900 MHz), whereas Nvidia's microarchitecture consists of fewer more complex ALUs and tries to compensate with a higher shader clock (typically 448-1024 ALUs at 1150-1544 MHz). Because of this VLIW vs. non-VLIW difference, Nvidia uses up more square millimeters of die space per ALU, hence can pack fewer of them per chip, and they hit the frequency wall sooner than AMD which prevents them from increasing the clock high enough to match or surpass AMD's performance. This translates to a raw ALU performance advantage for AMD:
AMD Radeon HD 6990: 3072 ALUs x 830 MHz = 2550 billion 32-bit instructions per second
Nvidia GTX 590: 1024 ALUs x 1214 MHz = 1243 billion 32-bit instructions per second
[/QUOTE]
It depends a lot on what kind of work it is, but if you are doing repetitive grunt work, AMD GPUs will fare better. More complicated work like Folding@home apparently runs a lot better on NVIDIA GPUs.
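To make the quoted numbers concrete, here's a minimal back-of-the-envelope sketch using the same one-instruction-per-ALU-per-clock simplification the wiki uses (the function name is just for illustration):
[code]
# Raw ALU throughput, assuming one 32-bit instruction per ALU per clock
# (the same simplification the bitcoin wiki quote makes).
def raw_throughput_gops(alus, clock_mhz):
    """Billions of 32-bit instructions per second."""
    return alus * clock_mhz * 1e6 / 1e9

# Numbers from the quote above:
print(raw_throughput_gops(3072, 830))   # AMD Radeon HD 6990 -> ~2550
print(raw_throughput_gops(1024, 1214))  # Nvidia GTX 590     -> ~1243
[/code]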
[QUOTE=Code3Response;47045179]Has anyone ever had issues with tearing before? I always turn off V-Sync and haven't had a single issue with tearing.[/QUOTE]
It's really dependent on the particular game and the person playing. Some people don't notice any tearing, while some notice it any time V-Sync is off. And some games exhibit it far worse than others.
[QUOTE=Used Car Salesman;47043619]Nvidia are cunts. I am definitely never spending a fucking dime on one of their products again.[/QUOTE]
You should stop using Microsoft products as well; they are no better
and everything apple
ford
hersheys
sony
riteaid
pepsi
in fact, don't use any products made by any big business ever again, ever, because none of them are any better at all
snip
[QUOTE=Trekintosh;47045200]That means using the Catalyst Control Center so you enjoy that lump of dump.[/QUOTE]
It's actually decent now, and has been for a while. After switching, I've found the Nvidia control panel just as bad, and in some cases worse.
[QUOTE=Tasm;47045849]It's actually decent now, and has been for a while. After switching, I've found the Nvidia control panel just as bad, and in some cases worse.[/QUOTE]
It was alright before they made that GeForce Experience crap mandatory (which I uncheck every time, and EVERY TIME it STILL installs itself, oh my god)
So what this is trying to teach me is:
Spoofing G-Sync at the software/driver level = working G-Sync on xyz monitor without the hardware?
No.
If you want to actually see/play with G-Sync, get a fucking G-Sync monitor, and don't bother trying to see it by making your drivers claim there's something there that's not fucking there.
[QUOTE=Kaabii;47044628]I hope you guys aren't ordering DP cables to do this, because it doesn't work. The guy who wrote the blog seems sort of crazy and posted mostly inaccurate info; all he really did was repost a leaked OEM driver that enables adaptive refresh rate on two laptops. The funny thing is that every single comment on the blog the OP linked is people saying it doesn't work.[/QUOTE]
Of course it doesn't work. G-Sync is when the monitor waits for the graphics card, as opposed to V-Sync, where the graphics card waits for the monitor.
V-Sync is meant to remove tearing when the graphics card is pumping out frames faster than the monitor can display them.
G-Sync removes tearing and stutter when the graphics card can't keep up with the monitor's fixed refresh rate (because of a lag spike in a badly optimized game or something): the monitor holds off refreshing until the next frame is actually ready.
Just because your drivers suddenly send the "Wait for me!" signal to the monitor doesn't mean the monitor can actually wait for the graphics card. The module isn't just there to identify the monitor; it adds functionality the monitor needs.
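A rough sketch of who waits for whom (purely illustrative, hypothetical function names, not any real driver API):
[code]
# Illustrative only -- hypothetical functions, not a real API.

# V-Sync: the graphics card waits for the monitor's fixed refresh tick.
def vsync_loop():
    while True:
        frame = render_frame()      # may finish early or late
        wait_for_refresh_tick()     # GPU stalls until the monitor's next tick
        present(frame)              # no tearing, but added latency/stutter

# G-Sync / adaptive sync: the monitor waits for the graphics card.
# This only works if the panel hardware can actually delay its refresh,
# which is exactly what a hacked driver cannot conjure up.
def adaptive_sync_loop():
    while True:
        frame = render_frame()
        present(frame)              # monitor refreshes the moment the frame lands
[/code]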
This does not work on a GTX 770.
It just makes GeForce Experience error out, and the settings disappear from the control panel
[QUOTE=ashrobhoy;47046300]This does not work on a GTX 770.
It just makes GeForce Experience error out, and the settings disappear from the control panel[/QUOTE]
Because it doesn't work at all.
The monitor itself needs specific hardware inside it for G-Sync to work; you can't just magic it into existence with a modded driver
CUDA is just a proprietary version of DirectCompute/OpenCL, and both of those are pretty much intentionally gimped.
[editline]31st January 2015[/editline]
[QUOTE=Thunderbolt;47046387]Because it doesn't work at all
The monitor itself needs specific hardware inside it for g-sync to work, you can't just magic it into existence with a modded driver[/QUOTE]
It does need special hardware / firmware support.
G-Sync also happens to be basically VESA Adaptive-Sync with DRM on it, which was standardized sometime in 2014 with DisplayPort 1.2a (although monitors aren't required by the spec to implement it). There are likely a ton of monitors since then implementing Adaptive-Sync.
This is complete BS; someone checked the modified driver he was using, and nothing was changed except a single unrelated line (probably to change the checksum).
I have one of those G-Sync-compatible monitors (ASUS VG248QE), and you need a custom hardware board for it to work. I highly doubt hacked drivers add hardware functionality.
[QUOTE]AMD should take the opportunity and rub their FreeSync implementation of Adaptive-Sync in Nvidia's face.
I would guess that before long Nvidia will announce that their superior G-Sync technology can now work on any DisplayPort monitor just because they magically made it work, which should probably annoy some of the companies that invested in making G-Sync monitors.[/QUOTE]
See 2 posts lower
[QUOTE=Murkrow;47047130]AMD should take the opportunity and rub their FreeSync implementation of Adaptive-Sync in Nvidia's face.
I would guess that before long Nvidia will announce that their superior G-Sync technology can now work on any DisplayPort monitor just because they magically made it work, which should probably annoy some of the companies that invested in making G-Sync monitors.[/QUOTE]
Hi, read the thread, thanks.
[QUOTE=Tobba;47046402]CUDA is just a proprietary version of DirectCompute/OpenCL, and both of those are pretty much intentionally gimped.
[editline]31st January 2015[/editline]
It does need special hardware / firmware support.
G-Sync also happens to be basically VESA Adaptive-Sync with DRM on it, which was standardized sometime in 2014 with DisplayPort 1.2a (although monitors aren't required by the spec to implement it). There are likely a ton of monitors since then implementing Adaptive-Sync.[/QUOTE]
Adaptive sync sounds so nice. My GPU can hold the 60 FPS it needs to avoid tearing in a lot of games, but nothing from the last few years, and the tearing I get is fairly distracting.
Shame I don't have any DisplayPort outputs or a DP monitor with Adaptive-Sync. That would cost money.
[QUOTE=Rika-chan;47047167]Hi, read the thread, thanks.[/QUOTE]
Very informative post.
So I did re-read, and basically it only works over embedded DisplayPort (eDP) 1.0+, and only as a power-saving feature, meaning it is more or less pointless for most people. Fair enough.
Regardless, Nvidia should just focus on something that gets wide-scale adoption, namely the DisplayPort standard. Not that that's very likely from them.
Maybe Nvidia's and AMD's implementations will end up cross-compatible, but I'm not exactly going to count on it. Nvidia will definitely keep advertising G-Sync; I just hope AMD doesn't fall asleep on the PC market now that they have the current console generation in hand.
I've had a G-Sync monitor for about 3 months. I don't regret paying for it now that I've gotten used to seeing/feeling what it gives, although some games seem to utilize it a lot better than others.
It's nifty in fast/reactionary games, and extremely noticeable in games where camera or distant scene movement is a factor.
Even in the current DayZ SA, for example, maxed out in towns, even 20 FPS plays like 60+ FPS without G-Sync.
Probably even better, because you kind of forget what micro-stutters look like until you see a non-G-Sync screen again.
[QUOTE=Sam Za Nemesis;47042807]Funny how many flops Nvidia is dropping lately, reminds me of the good old Fermi days[/QUOTE]
All their flops seem to be extremely successful, somehow.