TheoraPlay and Orx

edited February 2014 in Help request
Hey all,

I'm looking to include .ogv files for the cutscenes in our game. We're planning to use TheoraPlay to decode the video and get the frames. I can use orxDisplay_SetBitmapData() to draw each frame onto a bitmap.

I have Theora properly decoding the video/audio from the file, but I'm having issues with making it display through Orx.

Here are my questions/issues:

How can I get the bitmap drawn to the screen? orxDisplay_GetScreenBitmap() gives me the bitmap for the main display, but I can't call SetBitmapData() on the screen bitmap. If I call CreateBitmap(), will that bitmap automatically be drawn to the screen?

How can I get the proper formatting? TheoraPlay claims to support RGBA decoding, but when I pass frame->pixels to SetBitmapData(), I get the error output "Can't set bitmap data: format needs to be RGBA."

On a related note, I'm slightly confused about what to pass as the number of bytes. Presumably, it would be frame->width * frame->height * 4 (the number of pixels to draw multiplied by four bytes per pixel). Is this correct?
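
For reference, here's roughly the call I have in mind, as a sketch (decoder and pstBitmap are just my local variable names):

// One decoded TheoraPlay frame, which should already be in RGBA
const THEORAPLAY_VideoFrame *frame = THEORAPLAY_getVideo(decoder);
if(frame != NULL)
{
  // 4 bytes per pixel (RGBA)
  orxDisplay_SetBitmapData(pstBitmap, (const orxU8 *)frame->pixels, frame->width * frame->height * 4);
  THEORAPLAY_freeVideo(frame);
}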

Comments

  • edited February 2014
    Okay, so I've found the cause of the "not in RGBA format" error: I just didn't have the right bitmap size set. Getting the first frame of the video and creating the bitmap at the matching size fixes it.

    HOWEVER,

    how do I draw the bitmap to the screen through Orx?
  • jimjim
    edited February 2014
    I think what you need is to understand the InitTextures() part of this tutorial:
    http://orx-project.org/wiki/en/orx/tutorials/community/iarwain/compositing
    But I am not sure how to do this efficiently every frame.
  • edited February 2014
    Hi softero and welcome here!

    Jim's right, you can see an example of how to render a procedurally generated/updated texture in that tutorial.
    I'll try to explain 2 different options here, depending on one's needs. I'll also cover the steps you've already figured out, just for completeness' sake, as it might help someone else in the future. :)

    There are three steps in order to use a procedurally generated texture and render its content to screen:
    1. Create the texture
    2. Update its content, either in code or via a viewport
    3. Render it, either via a viewport (shader) or an object (with or without a shader)

    Step 1:

    Creating a texture is pretty straightforward. One needs to create a new texture, create a hardware surface (called a bitmap) and link the two using a unique name that can then be used to retrieve that texture and use it from config.

    Here's how it's done:
    // Creates new texture
    orxTEXTURE *pstTexture = orxTexture_Create();
    
    // Creates bitmap at the correct dimensions
    orxBITMAP *pstBitmap = orxDisplay_CreateBitmap(width, height);
    
    // Links them together with a unique name
    orxTexture_LinkBitmap(pstTexture, pstBitmap, "VideoTexture");
    

    Now we can refer to that new texture by the handle "VideoTexture".

    Also note that the user is responsible for deleting the texture and the bitmap when they are not needed anymore.
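
    For instance, a minimal cleanup sketch (assuming the pstTexture/pstBitmap variables from above):
    // Unlinks and deletes both once they're not needed anymore
    orxTexture_UnlinkBitmap(pstTexture);
    orxDisplay_DeleteBitmap(pstBitmap);
    orxTexture_Delete(pstTexture);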


    Step 2:

    Update the content of the texture. If you already have the content (RGBA format) from any source, in your case the video decoder, you can update the bitmap with a single line of code:
    orxDisplay_SetBitmapData(pstBitmap, data, width * height * sizeof(orxRGBA));
    

    Now, if you actually want to get its content from orx's renderer (like rendering a part of the scene in order to apply post-FX later on, for example), you need to create a viewport that will render to this texture instead of the screen.
    By default viewports render to screen unless given a texture list. Yes, texture destination can be a list as orx supports Multiple Render Targets on architectures that support it, ie. *not* OpenGL ES 2.0 (iOS/Android), but let's not get into that right now. :)

    It's done in config:
    [MyViewport]
    Camera = ... ; If a shader is attached to a viewport, the camera is optional as content can be entirely generated by the shader
    TextureList = VideoTexture ; Whatever is rendered through that viewport will now end up in "VideoTexture" instead of the screen
    

    Something noteworthy: if VideoTexture hasn't been created before the viewport, the viewport will create it with its own dimensions. If no dimensions are provided for the viewport, like in the above example, the screen resolution will be used.
    The good thing about this is that texture/bitmap deletion will also be handled by orx, which means step 1 is entirely skippable in this case.
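
    For example, if you'd rather give the viewport (and hence the auto-created texture) explicit dimensions, something along these lines should work (the size below is an arbitrary example):
    [MyViewport]
    TextureList = VideoTexture
    Size = (640, 480, 0) ; Explicit render target size instead of defaulting to the screen resolution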

    Step 3:
    Render the content of that texture on screen. Here there are different options depending on one's needs. I won't cover a 100% code solution; it's doable, but it's so much easier with config only. :)
    If you need the content to be displayed fullscreen, it can be done with a simple viewport. That viewport will render to screen and use a dead simple shader to display the content of the texture. Here's an example:
    [ScreenViewport]
    ShaderList = SimpleShader
    ; No need for a camera as we're not rendering a part of the scene
    
    [SimpleShader]
    Code = "
    void main()
    {
      // Simply copies the pixel from the source texture
      gl_FragColor = texture2D(MyTexture, gl_TexCoord[0].xy);
    }
    "
    ParamList = MyTexture
    MyTexture = VideoTexture ; We use the name of an existing texture as the shader parameter's value, so orx will know which type of parameter to use and will link it to the right texture automatically, *provided the texture already exists before this shader is compiled*
    

    Now if you want to render the content as part of a bigger scene, you simply need to "link" the texture to an object that will get rendered normally; instead of being static, its content will now be whatever you put in the texture. There's no special option for such an object; the only requirement, again, is that the texture exists prior to the object.
    In this example, let's create an object that will follow our camera and scale to cover the full viewport:
    [VideoObject]
    ParentCamera = MainCamera ; Or whichever name you use for your camera
    UseParentSpace = true ; Using parent space coordinates for both position & scale
    Scale = 1 ; Covering the whole camera field of view
    Position = (0, 0, 0.01) ; Putting it right in front of the camera, very close to its frustum's near plane
    Graphic = @ ; Reusing the current section for defining the graphic component
    Texture = VideoTexture ; Sounds familiar? ;)
    

    Now create the object, either as part of a bigger scene (via ChildList property of a parent object), or directly.
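
    For the direct route, a one-line sketch:
    // Creates the video object from the [VideoObject] section above
    orxObject_CreateFromConfig("VideoObject");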

    That should be it! :)

    Step 4 (debugging):

    Now, here's a little trick for debugging. If things don't work as expected, it could be that either the texture isn't correctly updated or isn't correctly displayed.
    To check whether the content is correctly updated, open the console (` key by default); here are two useful commands:

    To retrieve the texture:

    > texture.find VideoTexture
    [Returns texture's UID]

    To save the texture on disk:

    > texture.save <tab> test.png
    [Returns true/false]

    The <tab> key will auto-complete the previous result, in our case the texture's UID (much easier than typing it manually).
    You can now open test.png on disk and inspect its content.

    If you have any other questions or need any points to be clarified, please don't hesitate! :)
  • edited February 2014
    Hello! I am working with softero above on this game project, and I've taken over the task of getting cutscenes working.

    iarwain, I have followed your tutorial, and I have gotten it halfway working. I used your debugging trick, and I have confirmed that the video is getting decoded and the texture/bitmap is getting updated. However, nothing shows up on the screen, and I see the following output on the console:
    [23:38:18] [DISPLAY] [orxDisplay.c:orxDisplay_GLFW_CompileShader():969] Couldn't compile fragment shader:
    0:1(1): error: syntax error, unexpected NEW_IDENTIFIER

    I tried looking at the source code for orxDisplay.c to try and understand what was happening there, but to no avail.

    Since it says it's a syntax error, I figured I must have done something wrong in the .ini file, but I'm not sure what it could be.

    I've attached the .ini file that I'm using, but note that I'm not actually using many of the items listed in it. I am creating the CutsceneViewport from config in the Init() method, but that's about it. softero gave me this project to work on about 2 days ago, and I still haven't gotten the chance to really read all of the help material in depth, so maybe there's something I'm missing?

    Thanks in advance. https://forum.orx-project.org/uploads/legacy/fbfiles/files/Test.txt
  • jimjim
    edited February 2014
    I haven't played much with shaders, but I think you forgot to put your shader code inside main(). So your shader code should look like:
    Code = "
    void main()
    {
      gl_FragColor = texture2D(MyTexture, gl_TexCoord[0].xy);
    }
    "
    

    This might be the problem. Not sure though.
  • edited February 2014
    Thanks, jim! That indeed fixed that specific error I was getting. However, as is often the case, fixing that one made about two or three more show up.

    I reread iarwain's tutorial above, and realized that our code was linking the bitmap and the texture AFTER it was creating the custom CutsceneViewport from the config file. So, the "CutsceneTexture" that the viewport's shader needed wasn't being created until after the shader, and I figured that that could be a big reason why it wasn't working. So, I moved code around so that now it calls the linking function BEFORE creating the viewport from config.
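
    For reference, our Init() order now looks roughly like this (simplified sketch; CutsceneTexture/CutsceneViewport are our config names and the dimensions come from the first decoded frame):
    // Creates and links the texture first...
    orxTEXTURE *pstTexture = orxTexture_Create();
    orxBITMAP  *pstBitmap  = orxDisplay_CreateBitmap(u32Width, u32Height);
    orxTexture_LinkBitmap(pstTexture, pstBitmap, "CutsceneTexture");
    
    // ...then creates the viewport whose shader references it
    orxViewport_CreateFromConfig("CutsceneViewport");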

    However, when I run the program, it still shows up with the following error:
    [12:48:04] [DISPLAY] [orxDisplay.c:orxDisplay_GLFW_CompileShader():969] Couldn't compile fragment shader:
    0:3(35): error: `MyTexture' undeclared

    I thought my change to our code might have resolved this problem, but it didn't. I've gone through the shader tutorial to try and find something I might have missed, but all of the small changes I've made to the .ini file have resulted in the same error getting thrown.

    Maybe I am creating the texture and bitmap in the wrong section of code? Currently they are created in the Init() method, and then the bitmap is updated in the Run() method.

    In the meantime, I'm going to go read through more tutorials and see if there's anything that I failed to comprehend in the first glance-through.

    Thanks in advance.
  • edited February 2014
    I haven't used shaders beyond the example, but from taking a look, here's what I would try:

    1) In this tutorial here:
    http://orx-project.org/wiki/en/orx/tutorials/spawner

    There is:
    ParamList = texture#offset
    offset    = (-0.05, -0.05, 0.0) ~ (0.05, 0.05, 0.0); <= Let's take some random offset
    

    Maybe your ini should be:
    ParamList = MyTexture
    MyTexture = CutsceneTexture
    

    2) Use the default texture argument and apply the shader to the viewport. Since you are running a cutscene, I believe your shader should be applied to the whole scene anyway.
  • jimjim
    edited February 2014
    Yep, I was going to say the same. 'MyTexture' is probably undeclared because it is not listed in 'ParamList'. So including MyTexture in ParamList, as Knolan suggested, should fix the problem.
  • edited February 2014
    So I found and fixed another "fatal" flaw in the code. The code had been structured such that the entire video was being decoded and processed and written to the bitmap before the viewport could look at the new state of the bitmap and display it. The entire process was taking place within the first call to Run(), and successive calls to Run() after that essentially did nothing. I changed it so that each call to Run() would update the bitmap only once, so that we would actually be able to see each frame as it got updated!


    When I restructured the code to fix that and made the change to the .ini file that you guys suggested (I had actually tried the ParamList fix before, but not since restructuring the code), it finally worked. All that was left to do was make the thread sleep a little bit between frames, so that it wouldn't go through the video super fast. I haven't read the clock section of the tutorials, but I figure there's something I can do with them that'll make it easy to set a framerate for cutscenes.

    Anyway, thank you all for helping me sort through all of this. It's been a pleasure.
  • edited February 2014
    Hi Cubemaster333 and welcome here!

    Well, that was my mistake with the ParamList parameter: I realized I forgot to add it in my previous post in this thread. I've just updated it accordingly.

    You are right about the clock module: that's the easiest way to throttle your decoding code.
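
    As a sketch, one way to set that up (VideoClock and UpdateVideoFrame are just placeholder names):
    [VideoClock]
    Frequency = 30 ; Ticks per second, ie. the cutscene framerate
    
    And in code:
    void orxFASTCALL UpdateVideoFrame(const orxCLOCK_INFO *_pstClockInfo, void *_pContext)
    {
      // Decode the next frame and upload it with orxDisplay_SetBitmapData() here
    }
    
    // At init
    orxCLOCK *pstClock = orxClock_CreateFromConfig("VideoClock");
    orxClock_Register(pstClock, UpdateVideoFrame, orxNULL, orxMODULE_ID_MAIN, orxCLOCK_PRIORITY_NORMAL);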
    If you feel adventurous, you can sync to the "thread" branch instead of the "default" one and use the brand new orxThread module to spawn a video decoding thread that will run in parallel.
    This will yield much better performance; however, you'll have to be careful about concurrent accesses (as with any multithreaded environment). Also keep in mind that you can't call any orx API function from that new thread, especially not the orxDisplay ones.

    A compromise would be to use the orxThread_RunTask() function.
    This allows you to run a task on a separate thread (ie. decoding one frame), with an optional "closure" callback on the main thread. That means that when your frame has been decoded, a callback is invoked from the main thread, where you can upload your data into the texture. In this case, there's no need to worry about concurrent accesses.

    Here's how it works in pseudo-code:
    orxSTATUS orxFASTCALL DecodeOneFrame(void *_pContext)
    {
      // This runs on a separate thread
      // Simply decode one frame here
    
      // Return orxSTATUS_SUCCESS on success, orxSTATUS_FAILURE otherwise
      return orxSTATUS_SUCCESS;
    }
    
    orxSTATUS orxFASTCALL UpdateTexture(void *_pContext)
    {
      // This runs on the main thread, once DecodeOneFrame has completed
      // Fetch the last decoded frame (the context arg can be used for that purpose if you wish)
    
      // Update the texture with orxDisplay_SetBitmapData()
    
      // Re-add the task to decode another frame
      orxThread_RunTask(DecodeOneFrame, UpdateTexture, orxNULL, _pContext);
    
      return orxSTATUS_SUCCESS;
    }
    
    orxSTATUS orxFASTCALL Init()
    {
      // Start video decoding
      orxThread_RunTask(DecodeOneFrame, UpdateTexture, orxNULL, MyContext);
    
      return orxSTATUS_SUCCESS;
    }
    

    If you want to catch the decoding errors (ie. when DecodeOneFrame returns orxSTATUS_FAILURE), you can pass a dedicated error callback to orxThread_RunTask() instead of orxNULL.
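
    For instance (HandleDecodeError being a hypothetical callback with the same signature as the others):
    // HandleDecodeError gets called when DecodeOneFrame returns orxSTATUS_FAILURE
    orxThread_RunTask(DecodeOneFrame, UpdateTexture, HandleDecodeError, MyContext);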

    NOTE: If you use orxThread_RunTask(), you'll share a thread with some of orx's other modules; at the moment that means bitmap saving (ie. screenshots) and sound loading (sample disk loading and stream creation).

    Don't hesitate if you have any other questions!
  • edited February 2014
    Hello again. So, I read the Clock tutorials and got the video successfully throttled using a clock with the correct frequency. Now I also need to play the audio from the .ogv file. I looked at the sound and music tutorial, but that only covers how to play music from a specified music file. In our case, TheoraPlay is already decoding the audio from the .ogv into "float32 PCM format", according to the TheoraPlay website. When the program calls THEORAPLAY_getAudio(), we get back a packet of audio, and I am unsure which orx function I need to call in order to hand this packet to the system, like I did with orxDisplay_SetBitmapData() before.

    I've been looking through the orx documentation in the orxSound section and found a function called orxSound_CreateWithEmptyStream() that takes a channel number, a sample rate, and a name for the orxSound object (an orxSTRING). The description says: "Creates a sound with an empty stream (ie. you'll need to provide actual sound data for each packet sent to the sound card using the event system)". This sounds similar to what I had to do before with the bitmap: updating the initially empty bitmap with actual RGBA data for each video frame that we want the system to display.
    My biggest question is: how do I use the event system to put the packet data I get from Theoraplay into this sound object so that it can stream the audio from the .ogv?

    I believe I have found some of the types I will need to use, including orxSOUND_STREAM_PACKET, orxSOUND_EVENT_PACKET, and maybe orxSoundSystem. But I couldn't find a tutorial on this in the tutorial section or in the first 10 or so pages of the help forums. How do I interface with this stuff "using the event system" like the description says? Or do I need to be doing something different?

    Thanks in advance.
  • edited February 2014
    Hi Cubemaster333!

    You've reached one of the advanced features, and as such it isn't covered by a tutorial. Yet. :)

    That being said, you're very close to the solution. I haven't used that feature myself for almost a year and a half (I was using it in a synthesizer written with orx that I never got to finish, unfortunately), but it should still work just fine. If you encounter any trouble, please let me know.

    Now, for the specifics (and you've already uncovered most of the steps):

    1. Create an empty stream with orxSound_CreateWithEmptyStream() (see the sketch right after this list).
    You can also create it from config by specifying "empty" (without quotes) as the value of a Music property.

    2. Play/pause/stop the stream as needed.

    3. When the stream is playing, orx will fire an orxEVENT_TYPE_SOUND/orxSOUND_EVENT_PACKET event for each packet before forwarding it to OpenAL/the device. In our case, the packet will initially be filled with zeros for all samples.

    4. You can modify the content of the samples directly. If you need fewer samples than the packet holds, don't forget to update the u32SampleNumber field. If you need more samples, or if you already have a pre-built packet, you can substitute your own buffer for the current sample list, as16SampleList (and update u32SampleNumber as well).
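
    For step 1, here's a minimal sketch (the channel count, sample rate and "VideoAudio" name are just example values; match them to your decoder's output):
    // Creates a sound with an empty stream: 2 channels at 44100 Hz
    orxSOUND *pstSound = orxSound_CreateWithEmptyStream(2, 44100, "VideoAudio");
    
    // Starts playing it so orx begins requesting packets (cf. step 3)
    orxSound_Play(pstSound);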

    That's about it. Also don't forget that you can control the number of buffers used for one stream as well as their size (in bytes), using the config properties:
    [SoundSystem]
    StreamBufferNumber = [Int]; Number of buffers to use for sound streaming. Needs to be at least 2, defaults to 4;
    StreamBufferSize = [Int]; Size of buffer to use for sound streaming. Needs to be a multiple of 4, defaults to 4096;
    

    To sum it up, here's a small example (untested, so let me know if it doesn't work at all!):
    // Forward declaration of the event handler defined below
    orxSTATUS orxFASTCALL EventHandler(const orxEVENT *_pstEvent);
    
    orxSTATUS orxFASTCALL Init()
    {
      // At init, registers our handler for sound events
      orxEvent_AddHandler(orxEVENT_TYPE_SOUND, &EventHandler);
    
      return orxSTATUS_SUCCESS;
    }
    
    // Event handler
    orxSTATUS orxFASTCALL EventHandler(const orxEVENT *_pstEvent)
    {
      // Sound?
      if(_pstEvent->eType == orxEVENT_TYPE_SOUND)
      {
        // Streaming packet?
        if(_pstEvent->eID == orxSOUND_EVENT_PACKET)
        {
          orxSOUND_EVENT_PAYLOAD *pstPayload;
    
          // Gets event payload
          pstPayload = (orxSOUND_EVENT_PAYLOAD *)_pstEvent->pstPayload;
    
          // Here's how to copy all the samples to the packet
          {
            // Copies the samples to the packet, assuming there's enough room in it
            orxASSERT(MySampleCount <= pstPayload->stStream.stPacket.u32SampleNumber);
            orxMemory_Copy(pstPayload->stStream.stPacket.as16SampleList, MySamples, MySampleCount * sizeof(orxS16));
            pstPayload->stStream.stPacket.u32SampleNumber = MySampleCount;
          }
    
          // As an alternative, here's how to replace the samples with a buffer we already have (that buffer must not be freed before the stream is stopped!)
          // It's best if MySamples is static, then there's nothing to worry about
          {
            // Replace the packet content
            pstPayload->stStream.stPacket.as16SampleList = MySamples;
            pstPayload->stStream.stPacket.u32SampleNumber = MySampleCount;
          }
        }
      }
    
      return orxSTATUS_SUCCESS;
    }
    

    Now, as an FYI and to be future-proof: if you're using the "thread" dev branch, the sound streaming events will be the only orx events sent from a thread other than the main one, as sound streaming itself has been moved to a separate thread.

    This means you'll have to be careful when accessing your own data from within the event handler. Synchronization, such as using orxTHREAD_SEMAPHOREs or orxMEMORY_FENCE, might be necessary. But for now, I wouldn't worry too much about it. :)