Hey all,
I'm looking to include .ogv files for the cutscenes in our game. We're looking at using TheoraPlay to decode the video and get the frames. I can use orxDisplay_SetBitmapData() to draw each frame onto a bitmap.
I have Theora properly decoding the video/audio from the file, but I'm having issues with making it display through Orx.
Here are my questions/issues:
How can I get the bitmap drawn to the screen? orxDisplay_GetScreenBitmap() gives me the bitmap for the main display, but I can't call SetBitmapData() on it. If I call CreateBitmap(), will that bitmap automatically be drawn to the screen?
How can I get the proper format? TheoraPlay claims to support RGBA decoding, but when I pass frame->pixels to SetBitmapData(), I get the error output "Can't set bitmap data: format needs to be RGBA."
On a related note, I'm slightly confused about what to pass as the number of bytes. Presumably it would be frame->width * frame->height * 4 (the number of pixels to draw multiplied by four bytes per pixel). Is this correct?
Comments
However, how do I draw the bitmap to the screen through Orx?
http://orx-project.org/wiki/en/orx/tutorials/community/iarwain/compositing
But I am not sure how to do this efficiently, and on every frame.
Jim's right, you can see an example of how to render a procedurally generated/updated texture in that tutorial.
I'll try to explain 2 different options here, depending on one's needs. I'll also cover the steps you've already figured out, for completeness' sake, as it might help someone else in the future.
There are three steps in order to use a procedurally generated texture and render its content to screen:
1. Create the texture
2. Update its content, either in code or via a viewport
3. Render it, either via a viewport (shader) or an object (with or without a shader)
Step 1:
Creating a texture is pretty straightforward. One needs to create a new texture, create a hardware surface (called a bitmap) and link the two using a unique name that can then be used to retrieve that texture/use it from config.
Here's how it's done:
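A minimal sketch (u32VideoWidth/u32VideoHeight stand in for your video's dimensions; note that depending on your orx version, orxTexture_LinkBitmap() may take an extra ownership parameter):

// Create an empty texture and a hardware surface (bitmap) of the video's size
orxTEXTURE *pstTexture = orxTexture_Create();
orxBITMAP  *pstBitmap  = orxDisplay_CreateBitmap(u32VideoWidth, u32VideoHeight);

// Link them under a unique name, which code/config can then use to retrieve the texture
orxTexture_LinkBitmap(pstTexture, pstBitmap, "VideoTexture");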
Now we can refer to that new texture by the name "VideoTexture".
Also note that the user is responsible for deleting the texture and the bitmap when they are not needed anymore.
Step 2:
Update the content of the texture. If you already have the content (RGBA format) from any source, in your case the video decoder, you can update the bitmap with a single line of code:
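A sketch, assuming pu8FrameData points to your decoded RGBA frame (this also answers the byte count question above: width * height * 4 bytes, one byte per channel):

// Upload the decoded frame; the size is width * height * 4 bytes (RGBA)
orxDisplay_SetBitmapData(pstBitmap, pu8FrameData, u32VideoWidth * u32VideoHeight * 4);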
Now, if you actually want to get its content from orx's renderer (like rendering a part of the scene in order to apply post-FX later on, for example), you need to create a viewport that will render to this texture instead of the screen.
By default viewports render to screen unless given a texture list. Yes, texture destination can be a list as orx supports Multiple Render Targets on architectures that support it, ie. *not* OpenGL ES 2.0 (iOS/Android), but let's not get into that right now.
It's done in config:
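Something like this (section/texture names are illustrative):

[VideoViewport]
TextureList = VideoTexture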
Something noteworthy: if VideoTexture hasn't been created before the viewport, the viewport will create it with its own dimensions. If no dimensions are provided for the viewport, like in the above example, the screen resolution will be used.
The good thing about this is that texture/bitmap deletion will also be handled by orx, which means step 1 is entirely skippable in this case.
Step 3:
Render the content of that texture to the screen. There are different options here, depending on one's needs. I won't cover a 100%-code solution; it's doable, but it's so much easier with config only.
If you need the content to be displayed fullscreen, it can be done with a simple viewport. That viewport will render to screen and use a dead simple shader to display the content of the texture. Here's an example:
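For instance (a sketch; names are illustrative, and the shader simply samples the texture named VideoTexture):

[VideoViewport]
ShaderList = VideoShader

[VideoShader]
ParamList = texture
texture   = VideoTexture
Code      = "void main()
{
  gl_FragColor = texture2D(texture, gl_TexCoord[0].xy);
}"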
Now if you want to render the content as part of a bigger scene, you simply need to "link" the texture to an object that will get rendered normally; its content, instead of being static, will now be whatever you put in the texture. There's no special option for such an object; the only requirement, again, is that the texture exists prior to the object.
In this example, let's create an object that will follow our camera and scale to cover the full viewport:
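A sketch (assuming your camera section is called Camera; exact pivot/scale values may need tweaking for your setup):

[VideoGraphic]
Texture = VideoTexture

[VideoObject]
Graphic        = VideoGraphic
ParentCamera   = Camera
UseParentSpace = true
Pivot          = center
Scale          = 1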
Now create the object, either as part of a bigger scene (via ChildList property of a parent object), or directly.
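Directly, it's a one-liner (using the illustrative section name from above):

orxObject_CreateFromConfig("VideoObject");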
That should be it!
Step 4 (debugging):
Now, here's a little trick for debugging. If things don't work as expected, it could be that either the texture isn't correctly updated or isn't correctly displayed.
To check if the content is correctly updated, open the console (` key, by default) and here are two useful commands:
To retrieve the texture:
> texture.find VideoTexture
[Returns texture's UID]
To save the texture on disk:
> texture.save <tab> test.png
[Returns true/false]
The <tab> key will auto-complete the previous result, in our case the texture's UID (much easier than typing it manually).
You can now open test.png on disk and inspect its content.
If you have any other questions or need any points to be clarified, please don't hesitate!
iarwain, I have followed your tutorial, and I have gotten it halfway working. I used your debugging trick, and I have confirmed that the video is getting decoded and the texture/bitmap is getting updated. However, nothing shows up on the screen, and I see the following output on the console:
[23:38:18] [DISPLAY] [orxDisplay.c:orxDisplay_GLFW_CompileShader():969] Couldn't compile fragment shader:
0:1(1): error: syntax error, unexpected NEW_IDENTIFIER
I tried looking at the source code for orxDisplay.c to try and understand what was happening there, but to no avail.
Since it says it's a syntax error, I figured I must have done something wrong in the .ini file, but I'm not sure what it could be.
I've attached the .ini file that I'm using, but note that I'm not actually using many of the items listed in it. I am creating the CutsceneViewport from config in the Init() method, but that's about it. softero gave me this project to work on about 2 days ago, and I still haven't gotten the chance to really read all of the help material in depth, so maybe there's something I'm missing?
Thanks in advance. https://forum.orx-project.org/uploads/legacy/fbfiles/files/Test.txt
This might be the problem. Not sure though.
I reread iarwain's tutorial above and realized that our code was linking the bitmap and the texture AFTER creating the custom CutsceneViewport from the config file. So the "CutsceneTexture" that the viewport's shader needed wasn't being created until after the shader, which I figured could be a big reason why it wasn't working. I've now moved the code around so that it calls the linking function BEFORE creating the viewport from config.
However, when I run the program, it still shows up with the following error:
[12:48:04] [DISPLAY] [orxDisplay.c:orxDisplay_GLFW_CompileShader():969] Couldn't compile fragment shader:
0:3(35): error: `MyTexture' undeclared
I thought my change to our code might have resolved this problem, but it didn't. I've gone through the shader tutorial to try and find something I might have missed, but all of the small changes I've made to the .ini file have resulted in the same error getting thrown.
Maybe I am creating the texture and bitmap in the wrong section of code? Currently they are created in the Init() method, and then the bitmap is updated in the Run() method.
In the meantime, I'm going to go read through more tutorials and see if there's anything that I failed to comprehend in the first glance-through.
Thanks in advance.
1) In this tutorial here:
http://orx-project.org/wiki/en/orx/tutorials/spawner
There is a shader config example that declares the texture it samples via a ParamList property.
Maybe your ini should be:
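Something along these lines (a guess based on the names in this thread; the key point is that any texture the shader samples must be declared in ParamList so that orx generates its uniform declaration):

[CutsceneShader]
ParamList       = CutsceneTexture
CutsceneTexture = CutsceneTexture
Code            = "void main()
{
  gl_FragColor = texture2D(CutsceneTexture, gl_TexCoord[0].xy);
}"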
2) Use the default texture argument and apply the shader to the viewport. Since you are running a cut scene, I believe your shader is to be applied to the whole scene anyways.
I restructured the code to fix that, and I made the change to the .ini file that you guys suggested (I had actually tried ParamList before, but not since restructuring the code), and finally it worked. All that was left to do was make the thread sleep a little between frames, so that it wouldn't run through the video super fast. I haven't read the clock section of the tutorials yet, but I figure there's something there that will make it easy to set a framerate for cutscenes.
Anyway, thank you all for helping me sort through all of this. It's been a pleasure.
Well, that was my mistake about the ParamList parameter: I realized I forgot to add it in my previous post in this thread. I've just updated it accordingly.
You are right about the clock module, that's the easiest way you'll be able to throttle your decoding code.
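A sketch of that approach (assuming a 30 FPS video and a DecodeFrame function of yours; orxClock_Create's exact signature may vary between orx versions):

// Called at 30Hz: decode the next frame and upload it via orxDisplay_SetBitmapData()
static void orxFASTCALL DecodeFrame(const orxCLOCK_INFO *_pstClockInfo, void *_pContext)
{
  // ... decode + upload here ...
}

// Create a 30Hz user clock and register the decoding callback on it
orxCLOCK *pstClock = orxClock_Create(orx2F(1.0f / 30.0f), orxCLOCK_TYPE_USER);
orxClock_Register(pstClock, DecodeFrame, orxNULL, orxMODULE_ID_MAIN, orxCLOCK_PRIORITY_NORMAL);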
If you feel adventurous, you can sync to the "thread" branch instead of the "default" one and use the brand new orxThread module to spawn a video decoding thread that will run in parallel.
This will yield much better performance; however, you'll have to be careful about concurrent accesses (as with any multithreading environment). Also keep in mind that you can't call any orx API function from that new thread, especially not the orxDisplay ones.
A compromise would be to use the orxThread_RunTask() function.
This will allow you to run a task on a separate thread (ie. decoding one frame), as well as have an optional "closure" on the main thread. That means that when your frame has been decoded, a callback will be called from inside the main thread, where you can upload your data into your texture. In this case, there's no need to worry about concurrent access.
Here's how it works in pseudo-code:
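Something along these lines (a sketch; MyVideoContext and the Decode*/frame-buffer helpers are placeholders for your own code):

// Runs on a worker thread: decode one frame into a buffer (no orx API calls allowed here!)
orxSTATUS orxFASTCALL DecodeOneFrame(void *_pContext)
{
  MyVideoContext *pstContext = (MyVideoContext *)_pContext;
  return DecodeNextTheoraFrame(pstContext) ? orxSTATUS_SUCCESS : orxSTATUS_FAILURE;
}

// "Closure": runs on the main thread after a successful decode, safe to call orx from here
orxSTATUS orxFASTCALL UploadFrame(void *_pContext)
{
  MyVideoContext *pstContext = (MyVideoContext *)_pContext;
  orxDisplay_SetBitmapData(pstContext->pstBitmap, pstContext->pu8FrameData, pstContext->u32FrameSize);
  return orxSTATUS_SUCCESS;
}

// Queue the task: decode on a worker thread, upload from the main thread, no error callback
orxThread_RunTask(DecodeOneFrame, UploadFrame, orxNULL, pstContext);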
If you want to catch the decoding errors (ie. when DecodeOneFrame returns orxSTATUS_FAILURE), you can add a different callback when calling orxThread_RunTask instead of orxNULL.
NOTE: If you use orxThread_RunTask(), you'll share a thread with some of orx's other modules; at the moment, namely bitmap saving (ie. screenshots) and sound loading (sample disk loading and stream creation).
Don't hesitate if you have any other questions!
I've been looking through the orx documentation, in the orxSound section, and I found a function called orxSound_CreateWithEmptyStream() that takes a channel number, a sample rate, and a name for the orxSound object (an orxString). The description says: "Creates a sound with an empty stream (ie. you'll need to provide actual sound data for each packet sent to the sound card using the event system)". This sounds similar to what I had to do before with the bitmap: updating the initially empty bitmap with actual RGBA data for each video frame we want the system to display.
My biggest question is: how do I use the event system to put the packet data I get from Theoraplay into this sound object so that it can stream the audio from the .ogv?
I believe I have found some of the types I will need to use, including orxSOUND_STREAM_PACKET, orxSOUND_EVENT_PACKET, and maybe orxSoundSystem. But I couldn't find a tutorial on this in the tutorial section or in the first 10 or so pages of the help forums. How do I interface with this stuff "using the event system" like the description says? Or do I need to be doing something different?
Thanks in advance.
You've reached one of the advanced features, and as such they aren't covered by a tutorial. Yet.
That being said, you're very close to the solution. I haven't used that feature myself for almost a year and a half (I was using it on a synthesizer written with orx that I never got to finish, unfortunately), but it should still work just fine. If you encounter any troubles, please let me know.
Now, for the specifics (and you've already uncovered most of the steps):
1. Create an empty stream with orxSound_CreateWithEmptyStream().
You can also create it from config by specifying "empty" (without quotes) as the value of a Music property.
2. Play/pause/stop the stream as needed.
3. When the stream is playing, orx will fire an orxEVENT_TYPE_SOUND/orxSOUND_EVENT_PACKET event for each packet before forwarding it to OpenAL/the device. In our case, the packet will be filled with zeros for all samples.
4. You can directly modify the content of the samples. If you need fewer samples than the packet contains, don't forget to update the u32SampleNumber field. If you need more samples, or if you already have a packet pre-built, you can substitute it for the current list of samples, as16SampleList (and update u32SampleNumber as well).
That's about it. Also don't forget that you can control the number of buffers used for one stream as well as their size (in bytes), using the config properties:
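If I recall the keys correctly, that would be (values are illustrative):

[SoundSystem]
StreamBufferNumber = 4
StreamBufferSize   = 4096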
To sum it up, here's a small example (untested, so let me know if it doesn't work at all!):
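Something like this (FillWithDecodedAudio is a placeholder for your decoder, and the exact payload field names may vary with the orx version):

// Fill each sound packet with decoded audio as it's about to be sent to the device
static orxSTATUS orxFASTCALL SoundEventHandler(const orxEVENT *_pstEvent)
{
  if(_pstEvent->eID == orxSOUND_EVENT_PACKET)
  {
    orxSOUND_EVENT_PAYLOAD *pstPayload = (orxSOUND_EVENT_PAYLOAD *)_pstEvent->pstPayload;

    // Overwrite the zero-filled samples with decoded audio, updating u32SampleNumber if needed
    FillWithDecodedAudio(pstPayload->stStream.stPacket.as16SampleList,
                         &pstPayload->stStream.stPacket.u32SampleNumber);
  }
  return orxSTATUS_SUCCESS;
}

// Setup: listen to sound events, then create and play the empty stream (stereo, 44100Hz)
orxEvent_AddHandler(orxEVENT_TYPE_SOUND, SoundEventHandler);
orxSOUND *pstSound = orxSound_CreateWithEmptyStream(2, 44100, "VideoAudio");
orxSound_Play(pstSound);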
Now, as an FYI, to be future-proof, or if you're using the "thread" dev branch: the sound streaming events will be the only events in orx sent from a thread other than the main one, as sound streaming itself got moved to a separate thread.
Which means you'll have to be careful when accessing your own data from within the event. Synchronization such as using orxTHREAD_SEMAPHOREs or orxMEMORY_FENCE might be necessary. But for now, I wouldn't worry too much about it.