orxDisplay_GetBitmapData() crashing on Android?

drpdrp
edited October 2011 in Help request
Hi iarwain and guys!

Thanks for the previous quick replies, they were all very helpful :)

Anyways, I was playing around and my Android build seems to crash when invoking orxDisplay_GetBitmapData(). It doesn't crash when I run the same version on Win32.

On Win32, I am doing something like:
static orxRGBA *ExtractBitmap(const orxTEXTURE *pTexture, orxFLOAT &fWidth, orxFLOAT &fHeight)
{
	orxRGBA *pSrcBuffer = NULL;
	if (pTexture)
	{
		orxBITMAP *pBitmap = orxTexture_GetBitmap(pTexture);
		assert(pBitmap);

		orxDisplay_GetBitmapSize(pBitmap, &fWidth, &fHeight);
		unsigned long u32SrcImageSizeInPixels = (unsigned long)(fWidth * fHeight);

		pSrcBuffer = new orxRGBA[u32SrcImageSizeInPixels];
		assert(pSrcBuffer);
		orxDisplay_GetBitmapData(pBitmap, (orxU8*)pSrcBuffer, u32SrcImageSizeInPixels * sizeof(orxRGBA));
	}
	return pSrcBuffer;
}

I thought it was because I was invoking the "new" operator instead of, for example:
pSrcBuffer = (orxRGBA *) orxMemory_Allocate(u32SrcImageSizeInPixels, orxMEMORY_TYPE_VIDEO);

but that change didn't solve the problem on Android (and it crashes on Win32).

Android debugging is a bit of a pain and I can't give you the specifics, but I was wondering if you had any ideas off the top of your head about this crash. Any hint or idea would be helpful.

Cheers,
Diego

Comments

  • drpdrp
    edited October 2011
    Just for testing, I changed the code to
    orxDisplay_GetBitmapData(pBitmap, (orxU8*)pSrcBuffer, u32SrcImageSizeInPixels * 2/*sizeof(orxRGBA)*/);
    

    and it doesn't crash?
  • edited October 2011
    Hi Diego!

    Your original code looks correct to me, but as I'm not too familiar with the Android version, we'll have to wait for the opinion of people like faistoiplaisir or lydesik.

    I can tell you why your orxMemory_Allocate version doesn't work though: you allocate only 1/4 of the required size as you didn't put * sizeof(orxRGBA) in the size parameter. :)

    In your last post you don't get a crash, but I think you might get orxSTATUS_FAILURE as a return value because the texture size and the size you pass to the function won't match.
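
    Something like this should do it (untested sketch on my end, reusing the names from your ExtractBitmap() snippet):
    pSrcBuffer = (orxRGBA *)orxMemory_Allocate(u32SrcImageSizeInPixels * sizeof(orxRGBA), orxMEMORY_TYPE_VIDEO);

    if (pSrcBuffer)
    {
      /* Asks the display plugin for the pixel data, passing the size in bytes */
      if (orxDisplay_GetBitmapData(pBitmap, (orxU8 *)pSrcBuffer, u32SrcImageSizeInPixels * sizeof(orxRGBA)) == orxSTATUS_FAILURE)
      {
        /* Size mismatch or the plugin couldn't read the texture back */
        orxMemory_Free(pSrcBuffer);
        pSrcBuffer = NULL;
      }
    }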
  • drpdrp
    edited October 2011
    Hehe, thanks for the reply, and I hope someone else has a brilliant idea xD

    Regarding the sizeof operator: yes, I realized that too but forgot to get back to the forums, sorry for making you write that, haha...

    Good call, I will check the status. In any case, if I figure out what the heck is happening I will try to remember to get back in here so it's (potentially) useful for everyone else.
  • edited October 2011
    No worries!

    Yeah, I'm sure this will be helpful to others. I had a quick look at the Android display plugin but didn't see any obvious issue in the GetBitmapData() function. :/
  • drpdrp
    edited October 2011
    I just noticed that in the Android version of orxDisplay.cpp, orxDisplay_android_GetBitmapData() uses frame buffers like this:
    /* Generates frame buffer */
    glGenFramebuffers(1, &uiFrameBuffer);
    glASSERT();
    
    /* Binds frame buffer */
    glBindFramebuffer(GL_FRAMEBUFFER, uiFrameBuffer);
    glASSERT();
    
    /* Links it to frame buffer */
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, _pstBitmap->uiTexture, 0);
    glASSERT();
    

    I think it's worth mentioning that I am targeting OpenGL ES 1.0 in my test application, so maybe that is why it's crashing? I assume it's asserting, and I've noticed that asserts on Android make the application crash.

    Does that even make sense? What do you think?

    Any other ideas guys?

    Cheers,
    Diego
  • edited October 2011
    I believe you're right: FBOs are an extension in OpenGL ES 1.1 (the _OES suffix). There's a quick runtime check sketch at the end of this post.

    I don't know exactly what is supported in the Android version.
    I believe you're using the Android one and not the Android_Native one, aren't you?

    Laschweinski was the original author of the Android plugins, but Lydesik rewrote most of them and also wrote the Android_Native ones, so you might want to PM him about that.

    All I know is that the iOS plugins support FBO for both OpenGL ES 1.1 & 2.0 but that won't be of much help, sorry! ^^

    Cheers!
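
    PS: if you want to check at runtime whether the driver exposes FBOs before going through that code path on ES 1.x, something along these lines should do (untested sketch, needs a current GL context):
    #include <GLES/gl.h>
    #include <string.h>

    /* Returns non-zero if the GL_OES_framebuffer_object extension is advertised by the driver */
    static int HasFBOExtension()
    {
      const char *zExtensions = (const char *)glGetString(GL_EXTENSIONS);
      return (zExtensions != NULL) && (strstr(zExtensions, "GL_OES_framebuffer_object") != NULL);
    }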
  • drpdrp
    edited October 2011
    Thanks for the quick reply!

    I just realized I may have overcomplicated things. Is there a way to extract the raw bytes out of a bitmap (as in orxBITMAP)? I just noticed the structure doesn't have any pointer to a buffer:
    struct __orxBITMAP_t {
    	GLuint uiTexture;
    	orxBOOL bSmoothing;
    	orxFLOAT fWidth, fHeight;
    	orxU32 u32RealWidth, u32RealHeight;
    	orxFLOAT fRecRealWidth, fRecRealHeight;
    	orxRGBA stColor;
    	orxAABOX stClip;
    };
    

    Because otherwise I'd be going through OpenGL ES just to get the pixels out of a texture/graphic? Yuck.
  • edited October 2011
    Well, if you want to be able to read a bitmap's content, you're a bit out of luck there: orx doesn't keep it in main memory, it hands it over to OpenGL ES to be stored on the GPU side. :/

    I guess one other option would be to load the bitmap separately on your side, using SOIL (one of orx's external dependencies) for example; there's a quick sketch at the end of this post.

    What are you trying to do in the end, btw?
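
    The SOIL route would look roughly like this (untested sketch, "MyImage.png" is just a placeholder path):
    #include "SOIL.h"

    /* Loads the image file directly into an RGBA buffer on the CPU side */
    int iWidth = 0, iHeight = 0, iChannels = 0;
    unsigned char *pPixels = SOIL_load_image("MyImage.png", &iWidth, &iHeight, &iChannels, SOIL_LOAD_RGBA);

    if (pPixels)
    {
      /* ... read/modify pPixels here (4 bytes per pixel, RGBA) ... */

      /* Frees the buffer once done with it */
      SOIL_free_image_data(pPixels);
    }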
  • drpdrp
    edited October 2011
    Simply put, I am trying to extract the data, do pixel manipulations (e.g. change some colours, create a grayscale version, etc.) and then create a new image with the manipulated bytes.
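
    The grayscale pass, for instance, would be something like this (rough sketch over a raw RGBA buffer, 4 bytes per pixel, like the one coming out of ExtractBitmap() above):
    /* Converts an RGBA buffer to grayscale in place, leaving alpha untouched */
    static void MakeGrayscale(orxU8 *pPixels, orxU32 u32PixelCount)
    {
      for (orxU32 i = 0; i < u32PixelCount; i++)
      {
        orxU8 *pPixel = pPixels + (i * 4);

        /* Simple luma approximation */
        orxU8 u8Gray = (orxU8)((pPixel[0] * 30 + pPixel[1] * 59 + pPixel[2] * 11) / 100);

        pPixel[0] = u8Gray;
        pPixel[1] = u8Gray;
        pPixel[2] = u8Gray;
        /* pPixel[3] (alpha) stays untouched */
      }
    }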
  • drpdrp
    edited October 2011
    Or in other words, I am doing what pixel shaders do, without actual pixel shaders? xD
  • drpdrp
    edited October 2011
    Don't worry about it too much, I will rethink/redo the way it's being done.

    (Plus going through OGL to get the original raw bytes is not very efficient either.)

    Thanks for all the support dude!
  • edited October 2011
    No worries!

    And yes, you're right about the perf, but it's mostly reading from the GPU that will be costly; if you keep a copy of the bitmap on your side at all times, it shouldn't be too bad (rough sketch at the end of this post)!

    Also, you have access to off-screen rendering and different blend modes, which can be helpful for doing some compositing depending on what you want to achieve. I'm currently doing something like this on one of my projects to simulate a ZX Spectrum color clash without having to use any shaders. :)

    As for the function itself in Android, even if you don't end up using it, it'd be nice if it could be fixed. :)
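
    PS: to illustrate the "keep a copy on your side" idea, a rough sketch (untested; I'm assuming from memory that orxDisplay_SetBitmapData() is the counterpart for pushing pixels back, so double check that):
    /* Keeps a persistent CPU-side copy so the costly GPU read-back only happens once */
    static orxU8  *spu8PixelCopy  = orxNULL;
    static orxU32  su32BufferSize = 0;

    static void UpdateBitmap(orxBITMAP *pBitmap)
    {
      orxFLOAT fWidth, fHeight;
      orxDisplay_GetBitmapSize(pBitmap, &fWidth, &fHeight);

      if (spu8PixelCopy == orxNULL)
      {
        su32BufferSize = (orxU32)(fWidth * fHeight) * 4;
        spu8PixelCopy  = (orxU8 *)orxMemory_Allocate(su32BufferSize, orxMEMORY_TYPE_VIDEO);

        /* One-time (costly) read-back from the GPU */
        orxDisplay_GetBitmapData(pBitmap, spu8PixelCopy, su32BufferSize);
      }

      /* ... modify spu8PixelCopy here (e.g. MakeGrayscale(spu8PixelCopy, su32BufferSize / 4)) ... */

      /* Pushes the modified pixels back to the texture */
      orxDisplay_SetBitmapData(pBitmap, spu8PixelCopy, su32BufferSize);
    }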
  • edited October 2011
    Hi all,

    Last time I tested orxDisplay_GetBitmapData(), it was working OK.
    I have rewritten things in orxDisplay.cpp since, but I don't think I broke this one... (I'll give the shader tutorial, which uses it, a try.)

    BUT... orx Android won't work at all on OpenGL ES 1.x: it assumes it's running on an OpenGL ES 2.0 device.

    Almost all Android devices less than a year old have an OpenGL ES 2.0 driver, but the emulator doesn't.
  • edited October 2011
    Thanks for the details!
  • drpdrp
    edited October 2011
    @lydesik, thanks for the details and the support
    @iarwain, thanks for everything xD

    It's a nice community here.