In the tutorials, there's an example of defining and using a shader within the config files. How would one go about doing so in code?
For example, suppose we have two shaders respectively named SA and SB defined in config files. How would I go about loading and alternating between them?
shaderA = orxShader_CreateFromConfig("SA");
shaderB = orxShader_CreateFromConfig("SB");
Beyond that, I'm not entirely sure. I'm rather new to shaders, and while there are nice tutorials out there on how to write GLSL, I'm still at a loss sometimes as to how to actually activate them.
Oh, and is there a straightforward way to determine whether a user's computer actually supports shaders? I imagine simply trying to load/enable one and checking the results might do the trick.
I think most recent graphics cards support shaders. GPU-Z is a nifty tool for detecting which shader model you support.
Still can't get the shader to work on my Player object, but that might just be a function of my old laptop (which is a Linux machine, so GPU-Z unfortunately won't work).
Thanks for the info, hopefully I'll figure this out.
So here come the shaders. First of all, the SDL and GLFW plugins behave correctly with per-object shaders, whereas the SFML plugins don't. There's already a thread about this on these forums; it's due to how SFML internally handles shaders.
Now let's speak about your options.
When you create a shader (you can do it all from code, but it's fairly complex compared to the config version), it gets compiled the first time. This usually takes a split second, depending on the complexity of your shader. So I'd advise creating all your shaders at the beginning with the KeepInCache option (turned on by default) so that they won't need to be recompiled until the next execution of your game (as a matter of fact, orx can't store and reuse precompiled shaders... yet).
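As a sketch, a config-defined shader created once at startup could look like this (the section name matches the earlier example; the key names follow this thread's description and the shader code is a placeholder, so double-check against your orx version's CreationTemplate.ini):

```ini
; Minimal sketch of a config-defined shader (code is illustrative)
[SA]
ParamList   = MyTexture
KeepInCache = true ; default, shown for clarity: compile once, reuse until exit
Code = "
void main()
{
  gl_FragColor = texture2D(MyTexture, gl_TexCoord[0].xy);
}
"
```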
Btw, you're totally right about testing the return value of an AddShader() function to check whether the current OS/hardware supports shaders. I guess I could expose this publicly in a more straightforward way in the future.
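For instance (a hedged sketch: orxObject_AddShader returns an orxSTATUS, so a failure can double as a rough capability check — pstPlayer is assumed to be a valid orxOBJECT*):

```c
/* Sketch: treat a failed AddShader as "shaders unavailable or shader invalid" */
if(orxObject_AddShader(pstPlayer, "SA") == orxSTATUS_FAILURE)
{
  orxLOG("Shaders don't seem to be supported on this machine.");
}
```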
Back to the shaders themselves. You can only create fragment/pixel shaders, as the vertex shader is always the same (it renders a quad), which makes sense in 2D for 99% of uses. I may add optional vertex shader support later, but it's not a priority as it's almost only useful for 3D programs.
The same piece of shader code can now be applied to different structures: an object, a viewport, or no support at all (which means it'll get applied to the whole screen after all the viewports have been rendered).
If a shader is applied to an object, it won't affect other objects' rendering (besides with the SFML plugins, again, but we can't do anything about that, unfortunately). When a shader is applied to a viewport, all the objects displayed in that viewport are rendered normally, and then the whole viewport is re-rendered with the shader on. It's mainly used for global effects.
You'll find the same shader handling functions in both the orxViewport and orxObject modules. Shaders can be added, stacked, removed & enabled/disabled.
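So, for the original question of alternating between SA and SB on an object, a minimal sketch (assuming pstPlayer is a valid orxOBJECT* and both shaders are defined in config) could be:

```c
/* Swap SB in for SA on a single object, referencing the config sections by name */
orxObject_RemoveShader(pstPlayer, "SA");
orxObject_AddShader(pstPlayer, "SB");

/* Or temporarily disable all shaders on the object without removing them */
orxObject_EnableShader(pstPlayer, orxFALSE);
```

The orxViewport module exposes the equivalent calls for viewport-level shaders.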
Shaders, by default, use custom parameters and send events to query the value of each parameter (you need to create an event handler to feed the shader these values at runtime, otherwise the default values from config will be used, i.e. a static shader).
Let's say you want to make a wave: you'll need at least a real-time time parameter. Whereas if you want to create a black & white shader, it doesn't require any runtime parameters, and you can turn off the UseCustomParameters option for better performance.
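A runtime-parameter handler could be sketched like this (a hedged sketch: the payload member names such as zParamName/fValue follow the orx shader event payload, and the "time" parameter name is illustrative — check them against your orx version's headers):

```c
/* Sketch: answer shader parameter queries at runtime */
orxSTATUS orxFASTCALL ShaderEventHandler(const orxEVENT *_pstEvent)
{
  if(_pstEvent->eType == orxEVENT_TYPE_SHADER)
  {
    orxSHADER_EVENT_PAYLOAD *pstPayload;

    pstPayload = (orxSHADER_EVENT_PAYLOAD *)_pstEvent->pstPayload;

    /* Feed a "time" float; other params keep their config defaults */
    if(orxString_Compare(pstPayload->zParamName, "time") == 0)
    {
      pstPayload->fValue = (orxFLOAT)orxSystem_GetTime();
    }
  }
  return orxSTATUS_SUCCESS;
}

/* At init: */
orxEvent_AddHandler(orxEVENT_TYPE_SHADER, ShaderEventHandler);
```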
Each shader will have implicit parameters created for every texture defined for it. These give you the relative extent values (between 0 & 1). For power-of-two standalone textures they'll be 0 & 1, whereas for atlas textures, or NPOT textures on some architectures, they might be different values, so be sure to use them instead of the numerical constants 0 & 1 to define top, bottom, left & right.
For a texture named MyTexture, those implicit parameters are named MyTexture_Left, etc...
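In GLSL, that might look like the sketch below (note: as explained later in this thread, orx declares these uniforms for you from the ParamList, so they're not declared in the shader code itself; the Top/Bottom orientation may need swapping depending on your setup):

```glsl
// Sketch: keep sampling inside the texture's region (e.g. within an atlas)
// MyTexture, MyTexture_Left, etc. are provided by orx, not declared here
void main()
{
  vec2 uv = clamp(gl_TexCoord[0].xy,
                  vec2(MyTexture_Left,  MyTexture_Top),
                  vec2(MyTexture_Right, MyTexture_Bottom));
  gl_FragColor = texture2D(MyTexture, uv);
}
```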
Now, for the texture: when defining the parameters, if you use the name screen, it'll use the actual rendering texture (NB: that might not be the actual screen if your viewport's target is another texture rather than the screen). If you name a real texture, it will use that. Otherwise, by default, it'll use the texture associated with the bound structure: for a viewport, the texture bound to the viewport (usually the screen); for an object, the current orxGRAPHIC's texture or, if animated, the current animation frame's orxGRAPHIC's texture.
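In config terms, a hedged sketch of the three cases might be (section name and file path are illustrative):

```ini
; Hypothetical shader section illustrating the texture-binding rules above
[MyShader]
ParamList = texture
; texture = screen          ; case 1: use the current rendering target
; texture = path/to/img.png ; case 2: use an explicit texture
; (no value defined)        ; case 3, default: texture of the bound object/viewport
```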
Hope this isn't too confusing!
You can have a look at Mushroom Stew's shaders, but they are outdated for the current version of orx: they use numerical constants instead of the _Left, _Right, _Bottom & _Top texture delimiters, and they run with SFML shaders, which means per-object shading is broken and I had to make a workaround in the shaders themselves that won't work with the correct shading support from the SDL/GLFW plugins.
Last detail: on every platform supported by orx, the same shaders can be used to obtain the exact same effect, even though some platforms use OpenGL shaders (Windows, Linux, Mac) and others use OpenGL ES 2.0 shaders (iPhone, iPad, iPod Touch). OpenGL ES 1.1 doesn't support non-fixed pipelines, i.e. shaders won't work there (old iPhone/iPod Touch).
Shaders can be tricky as some compilers/hardware/manufacturers are more permissive than others.
It's a bit like web programming: there are web standards such as HTML, but all the available browsers have their own quirks and don't fully respect the standards. Well, it's a bit the same with shaders, though not as bad as with web programming.
So when creating a shader, if you can, try it on other OSes or ask someone else to test it for you, as you sometimes get bad surprises such as random pixels on screen (happened to me with Mushroom Stew on Linux/Mac, as my shaders weren't strictly conformant to the standards but the Windows NVidia OpenGL driver was far more permissive).
If you have any questions about shaders: I'm far from a guru, as I only began using them about a year ago, but they aren't such a hard beast to tame! Well, mostly!
I use a very simple coloured wave with runtime frequency & amplitude variations.
The shader there is applied to the whole viewport, but for example, in BounceAlt.ini, add the line
to the [Wall1], [Wall2], [Wall3] or [ParticleSource] section and you'll see that the same shader can be applied directly to single objects (well, in this case the shader is very lame, as it was designed for a full screen with a black background and no transparency handling, but you get my point! ^^)
I've got shaders working now, near as I can tell. I think my previous problem really was just due to an outdated laptop. Everything works fine on my new one.
Well, I say "works fine"...I suppose I do have another question, more particular to shader programming itself. My shader is as follows:
This is only attached to my player character; just for testing purposes, the transparent bits should show up in red, while the rest is rendered normally. That's the theory, anyway. However, I'm noticing that it seems to be rendering different sprites altogether! Sometimes my player is replaced with other NPCs, sometimes with half of an environmental decoration (zoomed in, to boot), and other weirdness.
Is this part of the SFML strangeness you were talking about? Or did I perhaps neglect to do something in the code...
Once again, a big thanks up front.
Too bad you didn't ask the question two weeks ago as I'd have been happy to drink that beer with you! Well, I'm feeling thirsty now, I blame you... :P
Mmh, if you're using orx 1.2 or svn's head, you're probably not using SFML plugins anyway (at least with the embedded versions).
Your shader looks good to me, however there are two points I noticed:
- Some shader compilers will be a bit fussy about the if(0 == texel.a), as 0 is technically an int and texel.a is a float. Using 0.0 instead should work everywhere.
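For example:

```glsl
// Strict GLSL compilers reject comparing a float to the int literal 0
if (texel.a == 0.0) // instead of: if (0 == texel.a)
{
  gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // transparent bits shown in red
}
```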
- You don't have to declare any variables explicitly in your shader code, so here you should remove the uniform sampler2D tex. However, your shader's config section should have a ParamList key that lists tex. As you won't define a value for tex, it'll be assumed to be a texture: the one bound to the object/viewport the shader is applied to. If you don't have the ParamList config key, orx won't know about the tex variable; the shader will still compile (as it's valid), but orx won't know how to make the bridge between the shader variable tex (as it doesn't know it exists) and the texture of your object/viewport.
What will happen is that the texture used by the shader will be the last one bound to the texture unit the shader uses, which, in turn, results in undefined/random behavior.
My guess is that it's your current issue and it's really easy to fix!
Orx only recognizes 3 types of variables: 3-component vectors, floats and textures, and it infers their types from their initial values. For example, a shader using the variables F, V and T will have its parameters declared this way:
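A hedged sketch of such a section (the values and the texture path are illustrative; types are inferred from the values, as described above):

```ini
[MyShader]
ParamList = F # V # T
F = 1.0                  ; float
V = (1.0, 0.5, 0.0)      ; 3-component vector
T = path/to/texture.png  ; texture (illustrative path); leave it undefined
                         ; to use the bound object/viewport texture instead
```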
Let me know if this helps!
I.e., instead of declaring it in the shader code, declare it in the config.
Worked like a bloody charm. I originally thought that perhaps some variables (like the static ones) could be declared in the shader code itself; now I know to just put all of them in the ParameterList.
Now on to making a shader that's actually...well...interesting.
...where I've added the new "distance" variable. Running this, I was getting some shader compilation errors. On looking at your Bounce config example, I noticed it should instead be:
...that is, ParamList instead of ParameterList.
Not a big problem, just making note of it here in case someone else has the same confusion I did.
I don't see what you mean; you were probably very tired when reading my post!
More seriously, thanks for the info, I fixed my original post. Next time I'll use the CreationTemplate.ini reference file like everybody else!
(BTW, this isn't really part of the game; I was just testing dynamic parameters. But I thought it looked pretty damn trippy/hilarious.)
(Moved just to keep this thread specific to shader discussions.)