Use shader is too bad for AMD GPU
#4
When "Use shaders" is enabled, you also need to enable "Disable optimized VertexInfo reading (may improve compatibility)"; otherwise, graphical garbage appears in many games. Enabling that option drops the FPS to sluggish levels, though, and it isn't a universal fix either: games like Kingdom Hearts: Birth by Sleep Final Mix actually look worse with "Disable optimized VertexInfo reading (may improve compatibility)" enabled.

One of the reasons JPCSP doesn't perform well on AMD GPUs is poor drivers. Nvidia's drivers seem to provide better OpenGL support for their GPUs than AMD's do.

Are there any fundamental differences between the two software rendering options? Don't both use the CPU for graphics rendering, and why are they called Internal and External Software Rendering respectively?


Messages In This Thread
Use shader is too bad for AMD GPU - by onelight - 12-29-2013, 09:03 AM
RE: Use shader is too bad for AMD GPU - by hlide - 12-30-2013, 06:52 PM
RE: Use shader is too bad for AMD GPU - by DragonNeos - 01-01-2014, 08:21 AM
RE: Use shader is too bad for AMD GPU - by hlide - 01-01-2014, 02:00 PM
