"Use shaders" works poorly on AMD GPUs
12-29-2013, 09:03 AM
Post: #1
"Use shaders" works poorly on AMD GPUs
Tested with Monster Hunter P2G and Monster Hunter P3HD.
For Nvidia users: if you want to run these games at full FPS, you need to check the following options (a rough sketch of the corresponding settings follows at the end of this post):
Use vertex cache
Use shaders
Enable saving GE screen to textures instead of memory

But if you are using an AMD GPU, "Use shaders" makes everything crash,
and you cannot find a way to run JPCSP at full FPS.
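
A rough sketch of how these three options might look in JPCSP's Settings.properties file; the key names below are recalled from memory and may not match your JPCSP build, so treat them as assumptions and prefer ticking the checkboxes in the Video configuration tab:

    # Assumed Settings.properties keys (1 = enabled, 0 = disabled)
    emu.useVertexCache=1
    emu.useshaders=1
    emu.useGETextures=1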
   
   
   
   
12-30-2013, 01:03 AM
Post: #2
RE: "Use shaders" works poorly on AMD GPUs
JPCSP has been updated with an external GE renderer. Will the external GE renderer make JPCSP run faster and fix the "Use shaders" problem on AMD GPUs?
12-30-2013, 06:52 PM
Post: #3
RE: "Use shaders" works poorly on AMD GPUs
(12-30-2013 01:03 AM)onelight Wrote:  JPCSP has been updated with an external GE renderer. Will the external GE renderer make JPCSP run faster and fix the "Use shaders" problem on AMD GPUs?

The external GE renderer is an attempt at an optimized software renderer. It is a work in progress and its DLL is not publicly released.
01-01-2014, 08:21 AM (This post was last modified: 01-01-2014 08:23 AM by DragonNeos.)
Post: #4
RE: "Use shaders" works poorly on AMD GPUs
When "Use shaders" is enabled, you also need to enable "Disable optimized VertexInfo reading (may improve compatibility)"; otherwise, graphical garbage appears in many games. The FPS drops to sluggish levels with this option enabled, and it isn't perfect either, since games like Kingdom Hearts: Birth by Sleep Final Mix become graphically worse with "Disable optimized VertexInfo reading (may improve compatibility)" enabled.

One of the reasons JPCSP doesn't perform well on AMD GPUs is poor drivers. Nvidia seems to have better OpenGL support for its GPUs than AMD does.

Are there any fundamental differences between the two software rendering options? Don't both options use the CPU for graphics rendering, and why are they called Internal and External Software Rendering respectively?
01-01-2014, 02:00 PM (This post was last modified: 01-01-2014 02:09 PM by hlide.)
Post: #5
RE: "Use shaders" works poorly on AMD GPUs
(01-01-2014 08:21 AM)DragonNeos Wrote:  One of the reasons JPCSP doesn't perform well on AMD GPUs is poor drivers. Nvidia seems to have better OpenGL support for its GPUs than AMD does.

Are there any fundamental differences between the two software rendering options? Don't both options use the CPU for graphics rendering, and why are they called Internal and External Software Rendering respectively?

Yeah, sadly the AMD shader compiler tends to report a successful compile when it shouldn't. It's a nightmare.

Yes, both use the CPU. The internal software renderer is written in Java inside the JPCSP source, hence "internal": it is the original software renderer of JPCSP. The external software renderer is a DLL because it is written "from scratch"* in C++. It uses more and more SIMD instructions (SSE2, SSE4.1, and AVX2 in the future), either through intrinsic functions or through code generated at run time by a dynamic compiler. It is at a stage where we can play games with it, but it remains a work in progress as we try to improve both its compliance and its speed. In fact, gid15 and I have joined our efforts to make this software renderer.

*: not exactly, as part of it is originally based on my OpenGL-GE source from pspe4all.
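
To make "SIMD instructions through intrinsic functions" concrete, here is a minimal C++ sketch. It is not code from the external renderer (whose source is not public); the function name and buffer layout are made up for the illustration. It uses SSE2 to 50/50-blend two spans of RGBA8888 pixels, four pixels per instruction:

    #include <emmintrin.h>   // SSE2 intrinsics
    #include <cstddef>
    #include <cstdint>

    // Hypothetical helper: rounded 50/50 blend of two RGBA8888 pixel spans.
    void blend50(const uint8_t* src_a, const uint8_t* src_b,
                 uint8_t* dst, size_t pixel_count)
    {
        size_t i = 0;
        // _mm_avg_epu8 averages 16 bytes at once, i.e. 4 RGBA pixels.
        for (; i + 4 <= pixel_count; i += 4) {
            __m128i a = _mm_loadu_si128(reinterpret_cast<const __m128i*>(src_a + i * 4));
            __m128i b = _mm_loadu_si128(reinterpret_cast<const __m128i*>(src_b + i * 4));
            _mm_storeu_si128(reinterpret_cast<__m128i*>(dst + i * 4),
                             _mm_avg_epu8(a, b));
        }
        // Scalar tail for any remaining pixels.
        for (; i < pixel_count; ++i)
            for (int c = 0; c < 4; ++c)
                dst[i * 4 + c] = static_cast<uint8_t>(
                    (src_a[i * 4 + c] + src_b[i * 4 + c] + 1) >> 1);
    }

A real rasterizer inner loop does far more than blending (texture sampling, depth testing, and so on), but widening the loop to several pixels per instruction is the idea hlide describes, and the same pattern scales to SSE4.1 and AVX2.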
01-02-2014, 05:56 AM
Post: #6
RE: "Use shaders" works poorly on AMD GPUs
I tested an AMD GPU with the 13.1 WHQL driver and the 13.2 beta 5 driver; shaders work well.
01-13-2014, 07:51 PM
Post: #7
RE: "Use shaders" works poorly on AMD GPUs
(01-02-2014 05:56 AM)onelight Wrote:  I tested an AMD GPU with the 13.1 WHQL driver and the 13.2 beta 5 driver; shaders work well.
What is your graphics card? My graphics card is an ATI Radeon HD 4200, so I'm not too sure whether updating its drivers would even make a difference (I'm currently using the 12.6 Legacy driver). The only options I have when it comes to updating are 13.1, 13.4, 13.9, or a modded Legacy driver.
01-14-2014, 05:59 AM
Post: #8
RE: "Use shaders" works poorly on AMD GPUs
(01-13-2014 07:51 PM)DragonNeos Wrote:  
(01-02-2014 05:56 AM)onelight Wrote:  I tested an AMD GPU with the 13.1 WHQL driver and the 13.2 beta 5 driver; shaders work well.
What is your graphics card? My graphics card is an ATI Radeon HD 4200, so I'm not too sure whether updating its drivers would even make a difference (I'm currently using the 12.6 Legacy driver). The only options I have when it comes to updating are 13.1, 13.4, 13.9, or a modded Legacy driver.

Actually, it's my friends who use AMD GPUs; we wanted to play MHP3 multiplayer over a Hamachi network, but JPCSP ran too slowly. They eventually installed the 13.1 or 13.2 driver, which made JPCSP run faster. You can try it if you want.