Handling Different Resolutions

edited April 2013 in Help request
Hi, here I am again with another question :P

I was wondering: how should one handle different PC native resolutions? I'm making a PC/Mac game and I'm targeting a resolution of 1024x768. What happens when I set that resolution with fullscreen enabled? My PC handles it nicely, but when I tested it on a laptop the screen was totally stretched. (My native resolution is 1080p while the laptop's is 1366x768.)

Comments

  • edited April 2013
    Hi, good question! :)

    Orx tries to handle things in a resolution-independent way, but that requires some explanation. :)

    There are two things to consider: the actual screen resolution and the camera/viewport one.

    Let's say you target a native resolution for your assets of 1024x768. You'll then want to set this as your camera frustum size.

    By default, a viewport will try to occupy the whole "screen" (not the physical one, but the display surface) defined by the Display.ScreenWidth/Display.ScreenHeight config parameters, and it then inherits that resolution. This can be changed on the fly or via config, of course, but that's the default behavior.

    No matter what you do, the aspect ratio defined by your camera frustum size will be maintained when this camera is used for rendering. That will result in letter/pillar-boxing if the target aspect ratio (the one of the viewport) isn't the same.

    That means if you have a 1024x768 camera (aspect ratio (ar) = 4:3) which is linked to a 1680x1050 (ar = 16:10) viewport, the actual rendering will get upscaled to 1400x1050 and you'll have two bars of 140 pixels on the left and right of your viewport.
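
    Here's a minimal config sketch of that exact situation (the section names MyViewport/MyCamera are placeholders): the viewport defines no size, so it covers the whole 1680x1050 surface, and the 4:3 camera rendering gets pillar-boxed inside it.

    [Display]
    ScreenWidth = 1680
    ScreenHeight = 1050

    [MyViewport]
    Camera = MyCamera ; no Size/RelativeSize: covers the whole display surface by default

    [MyCamera]
    FrustumWidth = 1024
    FrustumHeight = 768
    FrustumFar = 2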

    Now when going full screen, there are two options: either using the native resolution defined by your desktop, or forcing one.

    If you force one and it's supported by your monitor, you'll end up with that exact resolution. In your case, your laptop switched to a 1024x768 resolution on a physical screen that is not 4:3, which resulted in actual stretching even though the rendering was done 1:1 to a 1024x768 video surface.
    In addition to the stretching, it usually takes a few seconds for most GPU/monitor combinations to switch to a new resolution.

    The alternative is to use the "native" resolution: the switching is much faster, often less than one second, and the result won't look stretched. It also means that your rendering is likely to get letter/pillar-boxed and/or up/down-scaled, depending on the desktop resolution.

    In order to use the native resolution, you can omit the Display.ScreenWidth/Display.ScreenHeight parameters. Orx will then query the desktop to know the current resolution, including the depth and refresh rate.
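
    In config, that simply looks like this (note that the parameters must be absent, not empty):

    [Display]
    FullScreen = true
    ; ScreenWidth/ScreenHeight/ScreenDepth intentionally absent:
    ; orx will query the desktop for resolution, depth and refresh rate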

    Now, if you want to be able to switch on the fly between both, in addition to calling orxDisplay_SetFullScreen(), you'll need to change the video mode too.
    To get the desktop's native resolution at runtime, you can call orxDisplay_GetVideoMode() with an index of -1 (ie. orxU32_UNDEFINED).
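
    Put together, a minimal sketch of that on-the-fly switch (using this two-call approach; a single-call variant appears later in this thread):

    orxDISPLAY_VIDEO_MODE stVideoMode;

    // Queries the desktop's native video mode (index -1, ie. orxU32_UNDEFINED)
    orxDisplay_GetVideoMode(orxU32_UNDEFINED, &stVideoMode);

    // Switches to it, then toggles fullscreen
    orxDisplay_SetVideoMode(&stVideoMode);
    orxDisplay_SetFullScreen(orxTRUE);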

    Note that changing the video mode results in the deletion and re-creation of the current OpenGL context. That usually takes some time, as orx will back up all the loaded textures/shaders and restore them in the new context.
    So in the end, that can be even slower than simply switching the monitor resolution. However, it will never induce any visual stretching.

    Lastly, by using the native resolution, alt-tabbing in and out of your game should be really fast, whereas using a resolution different from the desktop's makes it much slower.

    The way most 3D games let users pick their own resolution can also be achieved with orx: you can query all the available video modes and let the user select one from that list. However, as we're rendering 2D and not 3D, a mismatched choice can result in visual stretching instead of a different FOV.
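
    A sketch of such an enumeration, assuming that era's orxDisplay_GetVideoModeCounter() call (double-check the exact name in your orx version):

    orxU32 i, u32ModeCount;

    // Number of video modes supported by the display
    u32ModeCount = orxDisplay_GetVideoModeCounter();

    for(i = 0; i < u32ModeCount; i++)
    {
      orxDISPLAY_VIDEO_MODE stVideoMode;

      // Fills stVideoMode with the i-th supported mode
      orxDisplay_GetVideoMode(i, &stVideoMode);

      // Width/height/depth can now populate a resolution picker
      orxLOG("Mode %u: %ux%u@%ubpp", i, stVideoMode.u32Width, stVideoMode.u32Height, stVideoMode.u32Depth);
    }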

    Lemme know if something wasn't clear. :)
  • edited April 2013
    As a corollary, objects bound to a camera (very useful for GUIs) can be defined relative to that camera, both in position and scale, which results in a UI that adapts to different camera aspect ratios/resolutions.
    The downside is that for widely different resolutions you might end up with overlapping or unreadable buttons, for example. But for most cases it should work pretty well.
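
    A minimal config sketch of such a binding; ScoreText, ScoreGraphic and MyCamera are placeholder names, and the coordinate convention (camera center at (0, 0), corners at +/-0.5) is my assumption here:

    [ScoreText]
    Graphic = ScoreGraphic
    ParentCamera = MyCamera ; binds the object to the camera
    UseParentSpace = true ; Position & Scale are now expressed in camera space
    Position = (-0.5, -0.5, 0) ; assumed to be the camera's top-left corner, whatever its resolution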

    Lastly, by using the new resource system, one can support more than one native resolution (think SD and HD, or 4:3 and 16:9, etc.). Just use the same names for your assets/config files and place them in different folders (such as gfx_sd, gfx_hd, ...).

    Then, before loading them (or creating objects that use them), simply add the folder matching your current settings to the resource system with orxResource_AddStorage(), e.g. orxResource_AddStorage(orxTEXTURE_KZ_RESOURCE_GROUP, "gfx_hd", orxTRUE);
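
    For instance, a sketch that picks the folder based on the current screen height (the 1080-pixel threshold and folder names are placeholders):

    orxFLOAT fScreenWidth, fScreenHeight;

    // Current display surface size
    orxDisplay_GetScreenSize(&fScreenWidth, &fScreenHeight);

    // Adds the matching storage first (orxTRUE) so it takes precedence
    orxResource_AddStorage(orxTEXTURE_KZ_RESOURCE_GROUP,
                           (fScreenHeight >= orx2F(1080.0f)) ? "gfx_hd" : "gfx_sd",
                           orxTRUE);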
  • edited April 2013
    Thanks for the explanation :)
    What I wanted to do was to render to a fixed-size texture (say, 1024x768) and then just put it in the center of the screen without caring about the resolution (hoping that the resolution is at least 1024x768)...
    Right now I'm using the native resolution and putting a viewport with RelativeSize = (nothing here) and Size = (1024, 768, 0).
    All my cameras have a frustum size of 1024x768, so the viewport and camera sizes match, but the texture is still being stretched somehow.

    Am I doing something wrong?
  • edited April 2013
    So in that case, you need to satisfy these conditions:
    [Display]
    FullScreen = true
    ; No ScreenWidth, ScreenHeight or ScreenDepth here: defining them with = (nothing) won't work, they have to be entirely absent
    
    [MyViewport]
    Camera = MyCamera
    Size = (1024, 768, 0)
    RelativePosition = center
    ; Again, don't define RelativeSize at all, even with = (nothing)
    
    [MyCamera]
    FrustumWidth = 1024
    FrustumHeight = 768
    FrustumFar = ... ; depending on your needs
    Position = ... ; depending on your needs
    

    That should do the trick. :)
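
    And at init time, assuming the viewport is instantiated from config (the usual pattern), something like:

    // Creates the viewport (and its camera) from the sections above;
    // keep the pointer around if you need to enforce its size later
    orxVIEWPORT *pstViewport;

    pstViewport = orxViewport_CreateFromConfig("MyViewport");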
  • edited April 2013
    Ah, I forgot: if you change the video mode on the fly (fullscreen <-> windowed, or a change of resolution), your viewport will get moved & scaled accordingly, so you'll have to enforce its size afterward.
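
    A sketch of that enforcement, reusing the pstViewport pointer from above and re-centering the fixed 1024x768 viewport on the new display surface (the centering math is my own, not from this thread):

    orxFLOAT fScreenWidth, fScreenHeight;

    // New display surface size after the video mode change
    orxDisplay_GetScreenSize(&fScreenWidth, &fScreenHeight);

    // Restores the fixed size, then re-centers the viewport
    orxViewport_SetSize(pstViewport, orx2F(1024.0f), orx2F(768.0f));
    orxViewport_SetPosition(pstViewport,
                            orx2F(0.5f) * (fScreenWidth  - orx2F(1024.0f)),
                            orx2F(0.5f) * (fScreenHeight - orx2F(768.0f)));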
  • edited April 2013
    Sorry for the late answer, I've been busy with a lot of things lately!
    I'll try forcing the viewport size after going fullscreen, but do I have to force it even if I specify in the config that I want a fullscreen "window"?

    Thanks for everything as always :)
  • edited April 2013
    No worries for the delay. :)

    You shouldn't have to force it if you ask for fullscreen in config. It's only needed for any subsequent resolution changes.

    By the way, last weekend I made some modifications to allow for easier windowed <-> fullscreen switching across different resolutions (windowed with the game resolution and fullscreen with the desktop one, for example).

    Now orxDISPLAY_VIDEO_MODE contains a bFullScreen field that can be used to this effect.

    In this case, going from your windowed 1024x768 -> FullScreen in desktop native resolution would be:
    orxDISPLAY_VIDEO_MODE stVideoMode;
    
    orxDisplay_GetVideoMode(orxU32_UNDEFINED, &stVideoMode); // Gets the desktop's native mode (resolution, depth & refresh rate)
    stVideoMode.bFullScreen = orxTRUE;
    
    orxDisplay_SetVideoMode(&stVideoMode);
    
    // Here you might want to resize your viewport if you want to maintain a fixed size
    

    And, reciprocally, going from that fullscreen mode back to a windowed 1024x768 would be:
    orxDISPLAY_VIDEO_MODE stVideoMode;
    
    orxDisplay_GetVideoMode(orxU32_UNDEFINED, &stVideoMode); // We do this so as to maintain refresh rate and color depth
    
    stVideoMode.u32Width = 1024;
    stVideoMode.u32Height = 768;
    stVideoMode.bFullScreen = orxFALSE;
    
    orxDisplay_SetVideoMode(&stVideoMode);
    
    // Here you might want to resize your viewport if you want to maintain a fixed size
    
  • jimjim
    edited April 2013
    iarwain wrote:
    That means if you have a 1024x768 camera (aspect ratio (ar) = 4:3) which is linked to a 1680x1050 (ar = 16:10) viewport, the actual rendering will get upscaled to 1400x1050 and you'll have two bars of 140 pixels on the left and right of your viewport.
    I think another way to avoid letterboxing is to consider the width of the target device while scaling. In this case, we fix the width at 1680 and maintain the aspect ratio, so the rendering would be upscaled to 1680x1260 instead of 1400x1050.

    That means 105 pixels at the top and bottom of the screen will be cut, because we can only see a 1680x1050 portion of it. A simple fix is that, while developing the game, we can put some dummy graphics at the top and bottom of the screen and make sure there are no gameplay elements there. This way the user won't notice any bars or letterboxing effect.

    Of course, we can use different camera mechanics for UI elements. For UI I think there is no simple fix other than placing elements on a grid, based on the actual rendering size.
    So, if we divide our 1024x768 camera into a 400x300 grid, we then have to map the same grid onto the 1680x1050 viewport. Not to mention, we have to leave the 105 cut pixels at the top and bottom out of the grid.

    Well, I haven't tried these mechanics in practice, but I found something like this somewhere on the net. I might be wrong in many places, so feel free to correct me.

    Targeting different resolutions or devices is always a pain in the ass when developing for mobile. Once we find a reasonably good solution, it will help others :)
  • edited April 2013
    Hi jim.

    I'm not convinced that cutting content is a good idea, especially when you can't always know which resolution will be used by the end user, as is often the case on computers.

    In your example, you're simply targeting resolution B instead of resolution A, but the problem remains the same: in resolution A you now have more screen space to fill that is not covered by your game space.

    That being said, with the way things currently work, if you don't like black borders, you can still display something in them at your convenience. In that case, you're in control of exactly what gets displayed and won't have to juggle with how much of your actual game space gets cut. :)
  • jimjim
    edited April 2013
    Yeah, I understand your point: cutting content is kinda risky, and one has to guess the range of target resolutions and design accordingly, since it's uncertain how much would be cut. It also mostly depends on the game.

    But that's the beauty of orx :) Its viewport and camera system can be used to solve this kind of problem effectively, in different ways.

    So there could be two cameras: one for gameplay rendering, which would maintain the aspect ratio while showing the full content of the game with black bars, and another used to place dummy graphics covering those bars. Again, this depends on the game, though.
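
    A minimal config sketch of that two-camera idea (all section names are placeholders; I'm also assuming that a viewport without a BackgroundColor doesn't clear what's behind it, so the border graphics show through the game's letterbox bars):

    [BorderViewport]
    Camera = BorderCamera ; full-surface viewport showing the dummy border graphics

    [GameViewport]
    Camera = GameCamera ; full-surface as well; the 4:3 camera gets letter/pillar-boxed inside it

    [BorderCamera]
    FrustumWidth = 1680
    FrustumHeight = 1050
    FrustumFar = 2

    [GameCamera]
    FrustumWidth = 1024
    FrustumHeight = 768
    FrustumFar = 2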
  • edited April 2013
    Yep, definitely. That's one completely valid option. :)
  • edited April 2013
    Let me see if I have these definitions straight:

    Display: Physical window (or full screen) that actually shows something.

    Viewport: A rectangle that shows the output of a Camera. A display may have multiple (overlapping??) viewports, but in many cases has just one.

    Camera: Simply a location / angle from which to generate the view in a viewport.

    Frustum: The sliced pyramidal area in 3D space that is seen by the camera.

    Is this correct?
  • edited April 2013
    This is all correct.

    A couple of clarifications:

    - Viewports can be associated with a texture instead of the display; in that case, the rendering happens on that texture, which can then be reused in the scene (cf. the compositing tutorial).

    - The frustum is just another attribute of the camera, like position or angle. In our 2D world, the camera projection is orthographic, so the frustum isn't a truncated pyramid but a cuboid. We still use the name frustum, as it's the generic term used in game engines (and orx might support 3D one day).