I'm having a problem adding a float parameter (or, I think, any parameter for that matter) to my shader, which is derived from the tileset example. I'd like to be able to set the alpha value of the highlighted block depending on whether it is within a certain range of the player. The range checking is working fine in my code, but trying to use the new parameter in the shader breaks it. I think it must be something very simple I've missed here.
Here is the parameter list; I've set a default value for the new parameter, HighlightAlpha:

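For reference, the shader section looks roughly like this (a trimmed sketch, not my exact config):

[MapShader]
ParamList      = HighlightAlpha ; the new parameter, alongside the existing entries
HighlightAlpha = 0.35           ; the default value I set
Code           = "..."          ; shader code from the tileset example, unchanged
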
If in the shader I use
color.a = 0.35;
it works fine, but
color.a = HighlightAlpha;
makes the shader not work at all.
I also see that when I try to set the parameter in code, it doesn't like the float value:

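The setting code follows the usual orx event pattern, more or less like this (a simplified sketch, not my exact code; the handler is registered at init with orxEvent_AddHandler(orxEVENT_TYPE_SHADER, ShaderEventHandler)):

#include "orx.h"

/* Value driven by the range check (which works fine on its own) */
static orxFLOAT sfHighlightAlpha = orx2F(0.35f);

static orxSTATUS orxFASTCALL ShaderEventHandler(const orxEVENT *_pstEvent)
{
  /* Fired by orx for every shader parameter when UseCustomParam is set */
  if((_pstEvent->eType == orxEVENT_TYPE_SHADER)
  && (_pstEvent->eID == orxSHADER_EVENT_SET_PARAM))
  {
    orxSHADER_EVENT_PAYLOAD *pstPayload = (orxSHADER_EVENT_PAYLOAD *)_pstEvent->pstPayload;

    /* Only overrides our parameter; everything else keeps its config value */
    if(orxString_Compare(pstPayload->zParamName, "HighlightAlpha") == 0)
    {
      pstPayload->fValue = sfHighlightAlpha;
    }
  }

  return orxSTATUS_SUCCESS;
}
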
Not sure what I've done wrong here.
Comments
Hi @funemaker!
In the tilemap sample, the shader parameters are actually defined in the CliffMap section, not in the MapShader section itself. As you haven't defined HighlightAlpha in the section of the object using the shader (which contains the original Code property), orx will assume by default that this parameter is the current object's texture. That's why you can't set it as a float later on, and why you get the shader compile error.
To be more precise: all the shader parameters are added programmatically at runtime, using what's defined in the shader section for Code and ParamList:
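For instance, something along these lines (a minimal sketch, not the exact sample config):

[CliffMap]
ParamList      = HighlightAlpha ; orx declares one uniform per entry listed here
HighlightAlpha = 0.35           ; a float value makes it a float parameter
UseCustomParam = true           ; lets you update the value at runtime via events
Code           = "..."          ; the original Code property lives in this section

A parameter with no value in that section (or with a texture value) is treated as a texture parameter, which is what happened here.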
Ah, I overlooked that... thanks very much. It is working as expected now!
Np, glad it's working.