ShaderBeam v0.1 - feedback thread #2
16 comments · 41 replies
-
Hi! First of all, thank you for making this! Your efforts with ShaderGlass and this are always much appreciated. Now onto the issue I'm having: at 60Hz I get no flickering and all "seems fine"... but then I actually want to run non-60Hz content with this, and that is where everything falls apart for me. My litmus test here is the PPSSPP emulator with a 30fps game (in my case, Monster Hunter Freedom Unite). I tried setting the shader on the 4090, I tried the 3090 as the shader GPU, I tried 300Hz, I tried 240Hz. I made sure PPSSPP is rendering at 30fps via RTSS, and I used your tips to optimize the system (disabled MPO, Lasso optimizations, disabled HAGS, etc.) - no change! I had this exact same issue when trying the CRT beam emulation in RetroArch: 60fps worked fine but doubled up the image, while trying more sub-frames to match 30Hz content just resulted in intense flickering - but buttery motion and no doubled frames.

So the big question is: is it possible to use these types of shaders for non-60fps content at all? Seeing how crystal-clear and buttery the motion can be with the shader (ignoring the intense flicker) is such a tease! In my mind, I should be able to use whatever base fps divides evenly into my monitor's refresh rate (say, even 75fps with 300Hz) and just have this thing work, but it never does. Am I doing something wrong, or is this just a huge oversight on the shader side? I'm attaching photos of the setup at 240Hz, where I tried to see if somehow the GPU can't keep up at 300+ Hz. As far as I can tell the numbers look right; it's just that at a 60Hz base I get no flickering, and at a 30Hz base I get insane flicker. Thank you for your time!

These settings don't flicker, but double up frames in motion:

These settings - 30fps base - give insane flicker, but the motion itself is crystal-clear:

P.S.: I have to use the "Desktop duplication" capture API, otherwise my "captured FPS" stays the same as the monitor's refresh rate, which is wrong.
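The "divides evenly" intuition in the question can be sketched with a tiny hypothetical helper; the function name and numbers are illustrative, not part of ShaderBeam:

```python
# Hypothetical helper (not part of ShaderBeam): check whether a content
# frame rate divides evenly into a display refresh rate, i.e. whether the
# simulator can use an exact integer number of subframes per content frame.
def subframes_per_frame(refresh_hz: float, content_fps: float,
                        tol: float = 1e-6):
    ratio = refresh_hz / content_fps
    n = round(ratio)
    return n if abs(ratio - n) < tol else None

# 60 fps content on a 240 Hz display -> 4 subframes per content frame
assert subframes_per_frame(240.0, 60.0) == 4
# 30 fps on 240 Hz -> 8 subframes: a valid divisor, but the simulated
# 30 Hz refresh flickers heavily, just as a real 30 Hz CRT would
assert subframes_per_frame(240.0, 30.0) == 8
# 75 fps on 300 Hz -> 4 subframes
assert subframes_per_frame(300.0, 75.0) == 4
# 60 fps on 144 Hz -> not an integer divisor
assert subframes_per_frame(144.0, 60.0) is None
```

Note that an integer divisor only guarantees clean frame pacing; the flicker itself tracks the simulated refresh rate, not the divisor.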
-
You don't have to add stutter.
Image retention stops occurring at a 0.01Hz roll on all LCDs I tried, and stops occurring at a 0.001Hz roll on most LCDs. You can tell retention is occurring if the 0.001Hz roll has more intense gradients than the 0.01Hz roll, so you can try both. The shader handles the roll internally, so you keep an exact integer count of subframes and let the shader roll the phase offset of the scanout. That's why my shader automatically turns off the roll algorithm during odd native:simulated Hz divisors. Also, TestUFO supports global refresh (phosphor-fade BFI), which is done via a shader modification to support a scan direction of 0. And yes, LCD Saver works with phosphor-fade BFI by slewing the fadewave slowly too - a zero-stutter / zero-band / zero-flicker BFI. Global-refresh CRTs never existed, but my/Lotte's shader can be tweaked to be one.
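The slow roll described above can be illustrated with a hypothetical phase function (a sketch, not the actual shader code):

```python
# Sketch of the slow anti-retention roll described above: the scanout
# phase drifts at roll_hz cycles per second, so at 0.01 Hz it takes 100
# seconds to sweep through one full refresh cycle - slow enough to be
# invisible, but enough to prevent LCD image retention.
def rolled_phase(base_phase: float, time_s: float, roll_hz: float = 0.01) -> float:
    """Scanout phase in [0, 1) after applying the slow roll."""
    return (base_phase + roll_hz * time_s) % 1.0

# After 50 s at 0.01 Hz, the phase has drifted half a cycle:
assert abs(rolled_phase(0.0, 50.0) - 0.5) < 1e-9
```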
-
I'm trying dual GPUs as recommended and am curious whether I need x8/x8 bifurcation for it to not stutter? I know from experience that to use dual GPUs past ~110Hz on a CRT you need x8/x8 lanes to avoid stutter, and I seem to be getting stutters here, so I suspect it might also apply. I was actually getting less stutter with one GPU. Besides that, this is almost perfect with my 500Hz OLED. If I can fix this stutter, I'll probably start selling off all but one CRT monitor once I can get 4K 480Hz in the coming years. The contrast and lack of black crush is so nice.
-
Your work with this is incredible! While reading through your GitHub, it made me wonder if you've inadvertently also discovered a new method of using light guns (like the old USB GunCon 2) on high-refresh modern gaming LCD displays - through the recreation of CRT beam drawing (assuming it mimics the beam's horizontal and vertical movement?). Unfortunately I don't have a suitable LCD panel that would allow me to start testing this further. Has this use case been considered before? Curious whether it could theoretically work.
-
Heyy, amazing work. I'm just wondering if anyone else is getting around a 50ms increase in input latency? I'm consistently getting 50ms more latency with ShaderBeam in a simple click reaction test, even though subframes are set to 1 and all the FPS numbers match the display Hz. Is this expected? Running a 7800 XT for capture and a UHD 750 for the shader on Windows 10.
-
Love this! Can you add the CRT Dusha shader? It looked the best on my mini-LED TV and it's so bright too. Thank you so much Mausimus and Mark!!
-
|
Hi, I have encountered a few issues using your shader:
https://github.com/user-attachments/assets/20d17742-588e-4323-b0e2-89505f49895f
I think it has something to do with the gamma? I tried messing with the shader section values, especially gamma. Lowering the gamma from 2.2 to 1.7 made it less obvious, but nothing completely solved it. How do I fix it? (I did everything in the performance tips section, and I set the shader GPU to my iGPU; it can render 545 fps in the benchmark.)
(Not exactly a problem, but the app menu says to cap the FPS to 71.96, and when I try to set it to 71 inside RTSS I get a lot of flashing. The only fix is to close and reopen RTSS, and it happens again after applying the FPS limit again. The game works fine without the FPS limit, which is why I didn't count it as an issue. Also, how do you even set decimals for your FPS cap inside RTSS?) specs: I was supposed to study, but my inner demons told me to test this instead, so I HOPE it helps
-
Banding Calibration
Interactions with Temporal AA Algorithms
BTW, temporal AA is not very compatible with the CRT simulator. There are beat frequencies between the flicker in the AA and the flicker of the CRT simulator, which amplify AA artifacts. It's best to avoid temporal AA algorithms with the CRT simulator, unless you cap the framerate to the simulated Hz.
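The beat-frequency interaction can be sketched numerically; the specific frequencies below are illustrative assumptions, not measurements:

```python
# Hypothetical illustration of the beat-frequency effect described above.
def beat_hz(f_aa: float, f_crt: float) -> float:
    """Beat frequency between two periodic flicker sources."""
    return abs(f_aa - f_crt)

# AA jitter at (say) 72 Hz against a 60 Hz simulated refresh yields a
# visible 12 Hz beat that amplifies AA artifacts:
assert beat_hz(72.0, 60.0) == 12.0
# Capping the framerate to the simulated Hz makes the rates match, so
# the beat frequency drops to zero and the artifacts stop pulsing:
assert beat_hz(60.0, 60.0) == 0.0
```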
-
Regarding what mdrejhon said: "Odd divisors (e.g. 180Hz) are 100% retention proof; OR". On another note, I remember NVIDIA Profile Inspector having a setting to do vsync at 1/2, 1/3 and 1/4 of the display refresh rate for games; SpecialK has the same option too. Other info:
-
Fantastic work here! This isn't really a bug report, but I was wondering if you have any plans for Linux support in the future?
-
Hi, I just wanted to note that RTSS gives me problems with stutter, maybe because of Reflex injection? I added an exclude for ShaderBeam.exe but it still happened. For now, when I'm using it I just ignore RTSS and either use the SpecialK/NVIDIA limiter or the in-game one, or leave the FPS uncapped and choose a subframe count that works for me. I have a 360Hz OLED display and an RTX 4080, so a lot of games run pretty well, and I like the enhanced visual clarity even with 2 subframes if my FPS stays over 200+. I can't get the banding out, so I'm using scan direction 0 at the moment... Also, for HDR: https://github.com/ledoge/autohdr_force . You can kinda force it (?), but the blur/gain slider then becomes the brightness slider; it's tricky but it works. I hope you can figure the HDR stuff out, especially for people like me who use an OLED :) Long story short: your work is amazing and I'm happy with the program, because 2 or 3 subframes feels so good on my display and saves me some GPU power!
-
Aloha! Just wanted to share my experience for anyone who journeys a similar crazy path.
-
Hello! By the way, 400 nits seems to be the limit: the further I go over 400 nits (for example 480 nits), the higher the chance that bright elements like the sky in games get a green flickering, which, like I said, fully disappears at 400 nits. Also, about performance: the performance hit is worth it, but still, some data from my PC and why I prefer to run it fully on a dGPU: No Shader 81 FPS (only RTX 5090) Simple BFI The thing is, the iGPU is not even close to being maxed out - it's at about 20% usage, and the "VRAM" of the iGPU is only 10% used. It does 1300 FPS in the Simple BFI shader benchmark. Edit: I almost forgot - the only thing holding it back in some games is the frame limiter, I guess (if it can work as well as quarter vsync). It only runs 100% perfectly for me when I enable 1/4 vsync in-game, or, in games which don't have the option, with SpecialK, which can force 1/4 vsync. Any other kind of limiter stutters at some point, even when using the reset function.
-
Thanks everyone for testing and feedback! I've released a prerelease version of v0.2, which focuses mainly on reducing input lag. Your help in testing is much appreciated; download and details here:
-
ShaderBeam v0.2 is now out; mainly cosmetic changes from the pre-release version. Thanks everyone for testing and feedback, I couldn't do this without you!
-
Hey everyone, I released a minor patch, v0.2.1 - mostly UI fixes - and a call for testing! I'm looking again at single-GPU setups and the flashing problem. In my testing I have found:
If you have a single GPU and the flashing problem, please:
Thanks a lot for your help!
-
Thank you everyone for testing ShaderGlass/BFI, ShaderBeam is the next step in the process of trying to make things work as smoothly as possible, for as many of you as possible.
ShaderBeam supersedes ShaderGlass/BFI as the means of testing Blur Busters' CRT Beam Simulator on Windows desktop. It's a complete rewrite, optimized for running just BFI shaders. It's also quite barebones compared to ShaderGlass/BFI (e.g. fullscreen-only), but it should be sufficient to try the tech out.
Please make sure to read the front page of the GitHub repo for tips on how to set up your machine for the best performance. While ShaderBeam doesn't fully solve desync problems, the best way we've found so far to avoid them is to use a second GPU just for the shader, and ShaderBeam fully supports this scenario. I optimized the shader so that in some cases even an iGPU might suffice (in my case an Intel UHD 770 can easily do 1080p @ 240 Hz, which is all I have...).
One problem I haven't fully solved yet is LCD anti-retention. If you have an OLED you can safely disable it, but on LCDs, when it's enabled, it can introduce occasional stutter as it internally desyncs the CRT refresh rate from your content rate. It's also fine to run on LCDs for short periods without it, if you don't mind a bit of (temporary!) image retention on static portions of the screen.
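A rough back-of-the-envelope for why the desync shows up as occasional stutter; the 0.01 Hz offset below is an assumed value, borrowed from the anti-retention roll rates discussed earlier in the thread, not ShaderBeam's actual setting:

```python
# Sketch: if the anti-retention roll offsets the simulated refresh from
# the content rate by a small frequency, the phase error accumulates and
# slips one full cycle every 1/offset seconds - which is when the
# occasional stutter would appear.
def slip_period_s(offset_hz: float) -> float:
    """Seconds for the accumulated phase error to reach one full cycle."""
    return 1.0 / offset_hz

# An assumed 0.01 Hz offset slips one refresh cycle roughly every 100 s:
assert abs(slip_period_s(0.01) - 100.0) < 1e-6
```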
Please share your experiences or any configuration tips you've found that make things better. And since this is the first release of a new app, apologies for any jank/crashes - please report these as issues and I will try to address them, so that as many of you as possible can see why we're doing this.
UPDATE 11/01: ShaderBeam v0.2 prerelease now available for testing:
https://github.com/mausimus/ShaderBeam/releases/tag/v0.2-prerelease