I've spent a few days redoing all the graphics for the pinball game to make the table and elements higher resolution. Took the screen size up to around 960 x 1280 pixels.
I then considered creating lower-res versions of the assets, around 480 x 640, for regular mobile devices and switching between the two sets dynamically.
But I am wondering if I really need to do this. After toying around with some test code and turning on the profiler, trying various resolutions, zoom levels, scales and display sizes, the profiler results appear to be almost identical.
Is this because Orx might be scaling all elements in memory for me whenever the display size changes? That would mean Orx scales all the assets when the game initialises and then plays with the scaled versions rather than the full-size ones, hence the great performance.
Am I right? Besides putting an initial load on the device to do the scale... do I only need to create a single set of full sized assets?
There "should" be a performance difference when the GPU "reads" the texture on a high-def device, because it needs to read more pixels from the source texture; in practice there is no difference, because high-def devices also have more powerful GPUs.
And you should not see a difference on a low-def device either, because the GPU will simply skip pixels when sampling the texture. The memory bandwidth load will therefore be almost the same as if you had provided only low-def textures...
Of course, on a low-def device there may not be enough memory to load the high-def textures in the first place.
Nowadays, 1280x720 is a good target screen size.
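To illustrate the single-asset-set approach, here is a minimal config sketch (section and property names follow Orx's usual config conventions; the camera name `MainCamera` is just an example). The idea is that the camera frustum is fixed in world units, so the renderer samples your full-size textures down (or up) to whatever the actual display resolution is, with no second set of assets needed:

```ini
[Display]
ScreenWidth  = 1280
ScreenHeight = 720

[Viewport]
Camera       = MainCamera
; Viewport covers the whole display, whatever its actual resolution
RelativeSize = (1, 1, 0)

[MainCamera]
; World-space frustum stays constant across devices; the GPU scales
; the full-size textures to the real screen size at render time
FrustumWidth  = 1280
FrustumHeight = 720
FrustumFar    = 2
Position      = (0, 0, -1)
```

On a 480 x 640 device the same config still works: the viewport shrinks to fit the screen and the textures are simply sampled at the lower density, which is why the profiler numbers barely move.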
I have a fairly low-powered device (Motorola Defy+) and it copes fine.