Not able to combine animations and shaders?

edited November 2011 in Help request
I have a fragment shader that I am successfully using on plenty of objects, with the only exception being if orxObject_SetCurrentAnimation has been called on the object.

In this case, if I use AddShader and add that same shader, the object doesn't appear at all.

Is there something I need to do in the shader to support animation?
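For reference, the kind of setup involved looks roughly like this (the section and property names here are illustrative, not from my actual project):

```ini
; Hypothetical object combining an animation set with a fragment shader.
[MyObject]
Graphic      = MyGraphic   ; the object's main (static) graphic
AnimationSet = MyAnimSet   ; animations selected via orxObject_SetCurrentAnimation()
ShaderList   = myShader    ; equivalent to calling orxObject_AddShader() at runtime

[myShader]
Code = "
void main()
{
  // pass-through: sample the object's texture at the interpolated UVs
  gl_FragColor = texture2D(texture, gl_TexCoord[0].xy);
}"
ParamList = texture
```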

Thanks,
Matt

Comments

  • edited November 2011
    Hi MattP and welcome here!

    Mmh, no. Nothing really comes to my mind. I presume you want to use the current animation's graphic as input texture for your shader?
    My first guess would then be some sequencing issues in the shader initialization/param querying, probably in the render plugin. I'll have a look at it tomorrow (2:30 AM here, I should already be asleep. ^^)

    Which version of orx are you using, btw? The 1.3rc0 or the latest from SVN? Also, on which platform did you experience the issue?

    Cheers!
  • edited November 2011
    Hi iarwain,

    Thanks for the quick reply! Yes, I want to use the current animation's graphic in the shader.

    I have tried it on iOS and on the PC, with the same result each time (with all other objects using shaders being fine).

    I just tried a basic shader that forces full-alpha as follows:

    [basicShader]
    Code = "
    void main()
    {
      vec4 color = texture2D(texture, gl_TexCoord[0].xy);

      color.a = 1.0;

      gl_FragColor = color;
    }"
    ParamList = texture

    The result then is that I see a white square where the graphic should be; the bounds of that square change each frame, but it stays solid white.

    Any help would be greatly appreciated :)

    EDIT: And yes, this is with 1.3rc0
  • edited December 2011
    Hi MattP,

    I fixed a related issue a few weeks ago, and it might actually be the same one: is your animation's graphic a full texture, or a sub-region of a bigger texture/spritesheet (or texture atlas, depending on the terminology you prefer :))?

    If so, that'd probably be the bug I've fixed in revision #2731.

    I'd suggest using the svn version (it compiles out of the box for all platforms; just make sure to sync the /trunk folder), as there have been over 200 fixes/improvements since 1.3rc0. :)
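    To illustrate the difference (file names and values here are hypothetical), a full-texture graphic versus a sub-region of a spritesheet looks like this in config:

    ```ini
    ; Full texture: the graphic uses the whole image file.
    [FullGraphic]
    Texture = soldier.png

    ; Sub-region: the graphic is a rectangle cut out of a bigger spritesheet.
    [AtlasGraphic]
    Texture       = spritesheet.png
    TextureCorner = (32, 0, 0)  ; top-left corner of the region, in pixels
    TextureSize   = (32, 32, 0) ; size of the region, in pixels
    ```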
  • edited December 2011
    We are using texture atlasing, so that fix may be just what I need!

    To date, we have been using the precompiled distribution. I just pulled down the bleeding edge from SVN, compiled the libs, and linked them into our project, but now we get a crash at launch with the following console output:


    CEGUI::RendererException in file c:\cegui-0.7.5\cegui\src\renderermodules\opengl\ceguiopenglrenderer.cpp(641) : OpenGLRenderer failed to initialise the GLEW library. Missing GL version


    But then that will probably be on our end, I guess. Still, I'm not quite sure why changing the orx version would cause CEGUI to not initialise :S
  • edited December 2011
    That's a good question indeed. I'm not exactly sure what CEGUI means by "missing GL version".
    I assume that happens in your Windows build? Wild guess: could it be some incompatibility with the GL version orx is linked against, as it's now linked against your own local GL version and not mine?

    Just to be sure: did you compile one of the embedded dynamic versions of orx (and not a non-embedded one)? And did you make sure you're using all the new include headers and libraries, with nothing still pulling in the old code? (In that case, verifying the paths plus a clean/rebuild should do the trick.)

    Again, sorry for the obvious advice, but I can't think of anything else for now. :(
  • edited December 2011
    It seems that it is failing because orx isn't initialising the viewport. The value returned by:

    orxViewport_CreateFromConfig("Viewport");

    is null. But when I switch back to the old version, it has no problem with it :S

    Here is the ini file in full:
    ;-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
    ;-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
     
    ; This is the Main Menu configuration file.
    ; Viewport, Camera and other bits and bobs go here.
     
    ;-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
    ;-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
     
    ; [ViewportTemplate]
    ; BackgroundClear    = <bool>                ; Specifies if the background should be cleared before rendering it. Its default value is true.
    ; BackgroundColor    = <vector>              ; Defines which color will be used for clearing the viewport before rendering it. Its default value is black (0, 0, 0).
    ; Camera             = CameraTemplate        ; Template name of the camera that will be linked to this viewport. Each camera template will correspond to a unique camera at runtime. This means that if you use more than one viewport linked to the same camera, they will render the same content as seen by this camera.
    ; RelativePosition   = left|right|top|bottom ; Defines where the viewport will be placed in the main display. It should be a combination of two attributes. Ex.: 'top left' to have your viewport in the top left corner. Its default value is 'top left'.
    ; Position           = <vector>              ; Defines an absolute position for the viewport in the main display, in pixel coordinates. This value is only used if none is provided for RelativePosition.
    ; RelativeSize       = <vector>              ; Defines the viewport size relatively to the main display's one, ie. (1, 1, 0) means that it will cover the full display. Its default value is (1, 1, 0). The Z coordinate is ignored.
    ; Size               = <vector>              ; Defines the absolute viewport size, in pixels. This value is only used if none is provided for RelativeSize.
    ; ShaderList         = <list#list>           ; Defines a list of shaders that will be executed every time this viewport is rendered. Up to 4 shaders can be specified. By default, no shader is used.
    ; Texture            = path/to/TextureFile   ; Defines a texture where the viewport will be rendered. Its default value is the main display (ie. screen). NB: orx's default display plugin based on SFML doesn't support this property.
     
    [Viewport] ;==================================
    Camera               = Camera
    BackgroundColor      = (155, 0, 55)
     
    ;-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
     
    ; [CameraTemplate]
    ; FrustumHeight      = <float>               ; As orx's cameras are 2D, their frustums are rectangular cuboids instead of real frustums.
    ; FrustumWidth       = <float>               ;  - If you want to achieve a 1:1 aspect ratio with your main display window, you can use the Display.ScreenHeight and Display.ScreenWidth values.
    ; FrustumNear        = <float>               ; Defines the near plane for the camera frustum. The near plane is excluded when doing render culling.
    ; FrustumFar         = <float>               ; Defines the far plane for the camera frustum. The far plane is included when doing render culling.
    ; Position           = <vector>              ; Camera's initial position.
    ; Rotation           = <float>               ; Camera's initial rotation (along its Z-axis).
    ; Zoom               = <float>               ; Camera's initial zoom.
     
     
    [Camera] ;====================================
    FrustumWidth         = @Display.ScreenWidth
    FrustumHeight        = @Display.ScreenHeight
    FrustumFar           = 2.0                ; Frustum Near and Far are based upon the position of the camera: 0.0 minimum to +infinity.
    FrustumNear          = 0.0                   ; You cannot set this value 'behind' the camera.
    Position             = (0.0, 0.0, -1.0)
    Zoom                 = 1.0
     
    ;-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
    
    ;-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
    ;-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
    

    EDIT: I have cleaned and rebuilt, checked the paths, etc., and I am sure it is using the new code.
  • edited December 2011
    Could it then be that your config file is encoded in UCS-2 or another unsupported Unicode encoding?
    Only UTF-8 (and plain ANSI) is supported by orx, and I recently added a check (plus an error message in the debug log) when trying to read a config file with an incorrect BOM.

    Is the file actually being loaded? Maybe it's a working directory issue; I don't remember if 1.3rc0 forced the exe's directory to be the current working one, but that's now the case. (You can override this by calling orxConfig_SetBaseName() in your init and reloading the config.)
  • edited December 2011
    PS: I have to go home for now, but I'll check the forum again in a couple of hours. Good luck! :)
  • edited December 2011
    Thanks for your help so far, it's great that this project has such an active developer :)

    I tracked down where the viewport creation is failing: it's in orxStructure.c, on line 368, inside orxStructure_Create:
    ...
    
      /* Checks */
      orxASSERT(sstStructure.u32Flags & orxSTRUCTURE_KU32_STATIC_FLAG_READY);
      orxASSERT(_eStructureID < orxSTRUCTURE_ID_NUMBER);
    
      /* Is structure type registered? */
      if(sstStructure.astInfo[_eStructureID].u32Size != 0)
      {
    ...
      }
      else
      {
        /* Logs message */
        orxDEBUG_PRINT(orxDEBUG_LEVEL_OBJECT, "Failed to allocate structure bank.");
      }
    

    That if check fails (although, strangely, I don't get that debug output) and the function returns null.

    That function was called from line 247 in orxViewport.c:
      pstViewport = orxVIEWPORT(orxStructure_Create(orxSTRUCTURE_ID_VIEWPORT));
    
  • edited December 2011
    Mmh, the fact that you get a failed check here and no debug output sounds like there's some kind of release/debug mismatch in your setup.
    For example, using a release build (orx.lib/orx.dll) with __orxDEBUG__ defined, or a debug build (orxd.lib/orxd.dll) without defining __orxDEBUG__.
    Or not using the new headers from the /code/include folder of the svn repo but the old ones. Or using the new .lib with the old .dll. Well, I won't list all the combinations, but you see what I mean. :)
  • edited December 2011
    I cleaned out the project settings and rebuilt to ensure the old version wasn't still being included somehow. It now runs with the latest version from the repository, but unfortunately my issue is not solved :(

    I have, however, come up with a test case for you using the latest SVN code:

    In the 04_Anim example, add the following to the end of 04_Anim.ini:
    [testShader]
    Code = "
    void main()
    {
       gl_FragColor = texture2D(texture, gl_TexCoord[0].xy);
    
       gl_FragColor.a *= 0.5;
    }"
    ParamList = texture
    

    And change 04_Anim.c like this:
      /* Creates soldier */
      pstSoldier = orxObject_CreateFromConfig("Soldier");
    
      if(orxObject_AddShader(pstSoldier, "testShader") == orxSTATUS_SUCCESS)
      {
        orxLOG("SHADER SUCCESSFULLY LOADED");
      }
      else
      {
        orxLOG("SHADER NOT LOADED");
      }
    
      /* Gets main clock */
      pstClock = orxClock_FindFirst(orx2F(-1.0f), orxCLOCK_TYPE_CORE);
    

    Normally, the output of this example project is:

    [attached image: normalbc.png]

    But with the shader loaded, the output is:

    [attached image: shader.png]

    So the coordinates must still not be passed through correctly. Hope this helps. Thanks!

    EDIT: Note also that the image *does* change each frame, so it is possibly just the UV scale that is wrong? Also, the shader is being loaded properly, as I am able to alter the fragment alpha etc.
  • edited December 2011
    Well, I'm glad you were able to run your project with the latest version of orx, at least. I'll try to debug the shader/animation issue tonight or tomorrow night. I'm sure it's some UV miscomputation somewhere in the Render plugin, so hopefully it shouldn't be too hard to fix.
  • edited December 2011
    Hi!

    I found that particular issue and fixed it on the SVN. I hope it was the same one you were experiencing on your side.

    If you're curious: the coordinates were actually correct. It's simply that the default texture sent to the shader was the object's main graphic's texture, not the texture holding the current animation frame, whereas the UV coordinates were the ones of the current frame.

    I hadn't encountered that issue earlier because in most of my projects I use the same spritesheet for both the main graphic and the animation frames.

    Orx will now use the current frame's texture as the default texture parameter. Let me know if that fixes your problem.
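    For the record, the configuration that triggers the bug can be sketched like this (all names and values hypothetical): the object's main graphic points at one texture while the animation frames live on a separate spritesheet, so the two candidate default textures disagree:

    ```ini
    [Soldier]
    Graphic      = IdleGraphic     ; main graphic, on its own texture
    AnimationSet = SoldierAnimSet  ; its frames reference walk_sheet.png

    [IdleGraphic]
    Texture = idle.png             ; what used to be sent to the shader as 'texture'

    [WalkFrame1]
    Texture       = walk_sheet.png ; where the current frame's pixels actually live
    TextureCorner = (0, 0, 0)
    TextureSize   = (32, 32, 0)
    ```

    With the fix, the shader's 'texture' parameter defaults to walk_sheet.png (the current frame's texture) instead of idle.png.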
  • edited December 2011
    Hi iarwain,

    Thanks, it now works! Thanks for taking the time to help out with this; being able to have shaders on our animated objects is going to make a big difference to the polish of our title :D

    Thanks!
    Matt
    Gameloft Auckland
  • edited December 2011
    Hey MattP,

    I'm glad it worked. Sorry I couldn't check it earlier; I was busy with IGF judging last weekend.

    If you have any details you can communicate on your title, don't hesitate, I'm always curious to know what's made with orx. :)

    Cheers!
  • edited December 2011
    I would if I could, but you know what our industry is like :(
  • edited December 2011
    Indeed! Well then, whenever you can communicate (probably after release?), let us know. :)