
Dissidia 012: Duodecim Final Fantasy - ULUS10566
The autosave issue is fixed as of revision d097511. With the option "Enable decoding of indexed textures (using CLUT) in shader (only relevant when using shaders)" enabled, the second cutscene now loads, but many graphical artifacts appear on screen (1st screenshot at the bottom of the post). This regression started in revision 3587 (bd71f28); the cutscene renders correctly in revision 3586 (d237620). The cutscene also appears darker than on an actual PSP.

Other issues that appear while the cutscene is displayed are the "OpenGL Graphics Issue" (improper hair textures), the "NOP Issue" (an endless stream of messages in the logger after the second cutscene ends), and the "Performance Issue" (audio out of sync with the cutscene).

The option "Enable saving GE screen to textures instead of memory" seems to trigger the cutscene crash starting in revision 3587 (bd71f28). Disabling it allows the second cutscene to load, though with buggy results. It can be kept enabled without crashing if "Enable decoding of indexed textures (using CLUT) in shader (only relevant when using shaders)" is also enabled.

Revision 3586 (d237620) Settings: (2nd Screenshot)
Code:
[X] Use vertex cache
[X] Use shaders
[X] Enable saving GE screen to textures instead of memory
[X] Enable dynamic shader generation (only relevant when using shaders, recommended for AMD/ATI)
   

(3rd Screenshot - 2nd Screenshot Settings + "Enable decoding of indexed textures (using CLUT) in shader (only relevant when using shaders)")
   

Revision 3587 (bd71f28) Settings: (4th Screenshot)
Code:
[X] Use vertex cache
[X] Use shaders
[X] Enable dynamic shader generation (only relevant when using shaders, recommended for AMD/ATI)
   

(5th Screenshot - 4th Screenshot Settings + "Enable saving GE screen to textures instead of memory" + "Enable decoding of indexed textures (using CLUT) in shader (only relevant when using shaders)")
   

As the screenshots show, enabling "Enable decoding of indexed textures (using CLUT) in shader (only relevant when using shaders)" makes no visual difference in revision 3586 (d237620); the graphical issues only begin in revision 3587 (bd71f28).


Attached Files:

.zip   Log_rd097511(64bit)_INFO.zip (Size: 243.08 KB / Downloads: 85)


Posted by DragonNeos - 12-29-2017, 05:33 PM
