In latest 2629 GoW Ghost Of Sparta Doesn't Work
#21
Nope, I don't get a black screen; everything's fine on both (great work btw!). I get around 59 average FPS, but I notice the same thing as in my Kingdom Hearts BBS:FM post: when the cube is about to scroll to a new side, I get that backwards/jitter FPS effect like in this video ( youtube.com/watch?v=aCM1H3WHxUE ), which I don't encounter on the Intel one.
All those black screens with audio, bad textures and weird visual bugs are always caused by Nvidia drivers and their bad relationship with OpenGL.

Either way, here are the results with the default settings, as you asked.
Also, if you want, I can provide tests (I guess on both cards) running Ubuntu on the same machine.


Attachment: Cube test with default settings.7z (Size: 4.46 KB / Downloads: 4)

#22
(10-30-2012, 04:37 PM)VIRGIN KLM Wrote: Nope, I don't get a black screen; everything's fine on both (great work btw!). I get around 59 average FPS, but I notice the same thing as in my Kingdom Hearts BBS:FM post: when the cube is about to scroll to a new side, I get that backwards/jitter FPS effect like in this video ( youtube.com/watch?v=aCM1H3WHxUE ), which I don't encounter on the Intel one.
All those black screens with audio, bad textures and weird visual bugs are always caused by Nvidia drivers and their bad relationship with OpenGL.

Either way, here are the results with the default settings, as you asked.
Also, if you want, I can provide tests (I guess on both cards) running Ubuntu on the same machine.
OK, then this has nothing to do with OGL 4.3.0 and the selection of compatibility/core profiles. If Jpcsp were running in the OGL 4.3.0 core profile (with all the deprecated functions disabled), you would just see nothing.

This seems to be "just" a synchronization problem at the display. Jpcsp's 60 FPS is maybe not exactly synchronized with the video card's 60 FPS, which becomes visible on Nvidia.
Have you changed something in the 3D settings of the Nvidia Control Panel? E.g. "Vertical Sync"?
Could you post a video of the cube demo on Nvidia to see how bad it is?
Always include a complete log file at INFO level in your reports. Thanks! How to post a log

#23
First try what gid15 said.

I know "Only GE graphics" sometimes used to cause the glitch effect for me, like in the video posted above (though I haven't seen it happen lately; I'm not sure if it's driver related, so I try to keep Only GE off when I can).
In the Nvidia control panel, I found that setting "Threaded Optimization" to Off might fix the problem.

BTW, when Jpcsp tries to keep vsync at 60 FPS it fully eats the CPU (causing up to 100% usage on some cores; I first noticed this with the game Growlanser). This might cause some glitches. You could try a program like Bandicam or DXTory and limit it to 1 FPS below the Jpcsp vsync, and you will notice CPU utilization drops to very low.
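A rough sketch of why this capping lowers CPU usage: a sleep-based frame limiter yields the CPU between frames, while a spin-wait limiter burns a full core until each deadline. The class and method names below are made up for illustration; this is not Jpcsp's actual code.

```java
// Illustrative comparison of two frame-pacing strategies (not Jpcsp source).
public class FramePacer {
    private final long frameNanos;
    private long nextFrame;

    public FramePacer(int targetFps) {
        this.frameNanos = 1_000_000_000L / targetFps;
        this.nextFrame = System.nanoTime() + frameNanos;
    }

    // Sleep-based pacing: yields the CPU until the next frame deadline,
    // so usage stays low (the effect the external FPS cap achieves).
    public void paceWithSleep() {
        long remaining = nextFrame - System.nanoTime();
        if (remaining > 0) {
            try {
                Thread.sleep(remaining / 1_000_000L, (int) (remaining % 1_000_000L));
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
        nextFrame += frameNanos;
    }

    // Busy-wait pacing: spins until the deadline, pinning a core near 100%.
    public void paceWithSpin() {
        while (System.nanoTime() < nextFrame) {
            // burn CPU until it is time for the next frame
        }
        nextFrame += frameNanos;
    }
}
```

If the driver's vsync wait behaves like the spin variant, that would match the two fully loaded cores seen below.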
Normal (two cores fully loaded, CPU clocked to 3.5 GHz):

Capped at 59 FPS using Bandicam (very low CPU usage, CPU even down-clocked to 1.7 GHz, still maintains 59 FPS):

#24
(10-30-2012, 08:22 PM)hyakki Wrote: BTW, when Jpcsp tries to keep vsync at 60 FPS it fully eats the CPU (causing up to 100% usage on some cores; I first noticed this with the game Growlanser). This might cause some glitches. You could try a program like Bandicam or DXTory and limit it to 1 FPS below vsync, and you will notice CPU utilization drops to very low.
Normal (two cores fully loaded, clocked to 3.5 GHz)


Capped at 59 FPS using Bandicam (very low CPU usage, CPU even down-clocked to 1.7 GHz, still maintains 59 FPS)
Interesting! I will try to analyze this strange behavior...

#25
(10-30-2012, 08:50 PM)gid15 Wrote: Interesting! I will try to analyze this strange behavior...

Actually, looking into this problem some more, it seems the high CPU usage could be related to the Nvidia drivers: when forcing vsync to 'off' in the Nvidia control panel, the CPU usage is very low with Jpcsp again (basically idle with the cube demo).

I wonder if Jpcsp needs to tell the drivers to turn off vsync (i.e. when it's set to "Use the 3D application setting"), so Jpcsp can handle its frame limiting alone without intervention from the drivers.
Definitely some kind of conflict going on between the vsyncs :)


#26
(10-31-2012, 10:15 AM)hyakki Wrote: I wonder if Jpcsp needs to tell the drivers to turn off vsync (i.e. when it's set to "Use the 3D application setting"), so Jpcsp can handle its frame limiting alone without intervention from the drivers.
This is done in sceDisplay, line 274:
Code:
setSwapInterval(1);
"setSwapInterval(0)" would mean VSync off.

I cannot reproduce the behavior when performing changes only through the Nvidia control panel. But I also get a low CPU load for the cube demo when changing the above line to "setSwapInterval(0)"...

#27
Meh, the video recorder refuses to capture anything OpenGL-render related; it just stays black in that place/window, and it's the only recorder that records and plays back at full speed despite CPU stress.
I have the feeling, and I guess I'm not wrong, that Jpcsp uses some kind of smart frameskip method that gets triggered only when it's not at 60 FPS, to try to maintain full speed even with FPS drops, compared to other emulators where frameskip is either enabled and constantly skips and draws a fixed number of frames, or disabled and loses speed on FPS drops. Even though it may give the best experience of all, it looks like GPUs don't like this idea at all, and it causes that non-sync jitter when combined with the conflicting behaviour of something else like VSync. Also, it's not just Nvidia where I get this jitter effect; I have it on every card I've tried Jpcsp on (Intel, AMD, Nvidia), it's just that Nvidia is the one that does it specifically on the cube demo.
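A guess at what such an adaptive frameskip could look like: render every frame while on schedule, and skip only the draw (not the emulation) when behind the 60 FPS deadline. This is purely illustrative; the class and method names are made up and this is not Jpcsp's actual code.

```java
// Illustrative adaptive frameskip: skip rendering only when the emulator
// has fallen behind its 60 FPS deadline, instead of a fixed skip ratio.
public class AdaptiveFrameskip {
    private static final long FRAME_NANOS = 1_000_000_000L / 60;
    private long nextDeadline;

    public AdaptiveFrameskip(long startNanos) {
        this.nextDeadline = startNanos + FRAME_NANOS;
    }

    // Returns true if this frame should be drawn, false if it should be
    // skipped (emulated but not presented) so the emulator can catch up.
    public boolean shouldRender(long nowNanos) {
        boolean onSchedule = nowNanos <= nextDeadline;
        nextDeadline += FRAME_NANOS;
        return onSchedule;
    }
}
```

Because frames are presented at irregular intervals under this scheme, a driver that also tries to lock presentation to vblank could plausibly produce the jitter described above.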

Also, we are way off-topic; I take the responsibility for that.

#28
I've disabled vsync on the graphics card in r2816. Could you check if there is a change? I see a much lower CPU load on the cube demo, but does it help with the small jitter effect?
Use explicitly r2816 ( http://buildbot.orphis.net/jpcsp/ ), as I will revert my change in r2817.

#29
OK, I did the test and tried to test generally in games that do this a lot.
On the demo, if I use it at default settings or if I enable the GE screen to Textures setting, everything's fine, no jitter.
This change, though, seems to affect Only GE graphics in the opposite way (crazy jittered frames, big chaos). I know it's mentioned that it might break homebrew, but it didn't in the past tests I did, so my point is that it might affect another game the same way.
So I tested Kingdom Hearts BBS:FM and Tekken 6.
Both had a slight performance boost and I didn't notice any jitter at all, though it seems to affect the frameskipping mechanism in a weird way, like it makes it more prone to desync.
(BTW, ugh, I just noticed on Tekken 6 that I have the same issue with posterized/16-bit-looking textures like in this picture img145.imageshack.us/img145/9620/nvidiageforcegt620.jpg, so it's for sure an issue with Nvidia cards, and I think I can tell when and why it's happening. I'm so sorry for coming up with all this stuff, but I am so mad with Nvidia right now.)

EDIT: Meh, this jitter, I hate it, not because I'm a stubborn child but because I have a history of epilepsy and I feel (health-wise) terrible testing it, but it's a motive to surpass my issues and try to help.

#30
(10-31-2012, 10:19 PM)VIRGIN KLM Wrote: OK, I did the test and tried to test generally in games that do this a lot.
On the demo, if I use it at default settings or if I enable the GE screen to Textures setting, everything's fine, no jitter.
This change, though, seems to affect Only GE graphics in the opposite way (crazy jittered frames, big chaos). I know it's mentioned that it might break homebrew, but it didn't in the past tests I did, so my point is that it might affect another game the same way.
So I tested Kingdom Hearts BBS:FM and Tekken 6.
Both had a slight performance boost and I didn't notice any jitter at all, though it seems to affect the frameskipping mechanism in a weird way, like it makes it more prone to desync.
(BTW, ugh, I just noticed on Tekken 6 that I have the same issue with posterized/16-bit-looking textures like in this picture img145.imageshack.us/img145/9620/nvidiageforcegt620.jpg, so it's for sure an issue with Nvidia cards, and I think I can tell when and why it's happening. I'm so sorry for coming up with all this stuff, but I am so mad with Nvidia right now.)

EDIT: Meh, this jitter, I hate it, not because I'm a stubborn child but because I have a history of epilepsy and I feel (health-wise) terrible testing it, but it's a motive to surpass my issues and try to help.

KLM, what are your system specs? (sorry if you said them before). I have a 560 GTX Ti video card that uses OpenGL 4, and I'm having trouble getting the same results as you, but I think I know what you're talking about, since on my old system (Core 2 Duo, 9800 GTX+, an OpenGL 3 card) I would get some jitter. Usually just turning 'Threaded optimization' to Off in the Nvidia drivers would fix it.

Also, to record a video, have you tried disabling 'desktop composition'? (sometimes that helps when recording videos)

Btw gid15,
with setSwapInterval(0); it does set vsync off (or has the same effect), but while testing I also noticed the FPS reading might be wrong in Jpcsp: using Bandicam's internal FPS reader, it's showing 120 FPS (but the Jpcsp title bar shows 60).
Nvidia CP Vsync OFF
setSwapInterval(1);

Nvidia CP Vsync (Application Controlled)
setSwapInterval(0);

Nvidia CP Vsync (Application Controlled) // default
setSwapInterval(1); // default

So if the Jpcsp FPS readout is wrong and it's really running at 120 FPS (with vsync off), I guess when vsync is on it has to reduce the FPS to 60, and this is what's causing the high CPU usage (though it's still strange why it would use almost 100% CPU on two cores to reduce the speed).
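One way the two readings could diverge, assuming the Jpcsp title bar counts emulated frames while Bandicam counts buffer swaps: presenting each emulated frame twice would read as 60 internally but 120 externally. A minimal illustration (the class and method names are made up):

```java
// Illustrative: two FPS counters diverge when each emulated frame is
// presented (swapped to screen) more than once per frame.
public class FpsCounters {
    private int emulatedFrames;
    private int bufferSwaps;

    // presentsPerFrame > 1 models re-presenting the same frame, which
    // inflates an external swap-based FPS reader but not the internal count.
    public void onEmulatedFrame(int presentsPerFrame) {
        emulatedFrames++;
        bufferSwaps += presentsPerFrame;
    }

    public int emulatedFps() { return emulatedFrames; } // internal counter
    public int swapFps()     { return bufferSwaps; }    // swap-based counter
}
```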
