I can't believe I got iPhone touch handling working without any documentation.
When I started using the orxEVENT_TYPE_IPHONE macro in my .cpp files, I always got compile errors: the compiler said it couldn't find it.
I saw the macro defined in orx.h. After trying to define __orxIPHONE__ and __orxOBJC__ myself and failing, I noticed the demo could use orxEVENT_TYPE_IPHONE just fine. Oh my! The file has to be an ObjC one to use it!
Then I found out that mouse input comes through orxEVENT_TYPE_INPUT instead of orxEVENT_TYPE_MOUSE, and that there's only an event when the button goes up. That was my biggest victory in taking control of orx!
I think orx for iPhone needs a dedicated tutorial. I read the old tutorials, but they didn't help with any of the things I mentioned.
Yes, I totally forgot to mention that point, sorry!
As the application stub is written in ObjC (there isn't any other option available anyway), your main file has to be an ObjC one.
Yes, all the inputs go through orxEVENT_TYPE_INPUT, the events orxEVENT_TYPE_MOUSE/JOYSTICK/KEYBOARD being only used internally. I should write something about this somewhere. Or even remove them from the public event list.
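To make that concrete, here's a minimal sketch of listening to orxEVENT_TYPE_INPUT via an event handler. The input name "LeftClick" and its binding are assumptions for illustration (you'd bind it to MOUSE_LEFT in the [Input] config section); the event/payload names follow the orx input event API as I understand it:

```c
#include "orx.h"

/* Sketch: handler for generic input events.
 * "LeftClick" is a hypothetical input name, assumed bound to
 * MOUSE_LEFT in the [Input] config section. */
static orxSTATUS orxFASTCALL EventHandler(const orxEVENT *_pstEvent)
{
  if(_pstEvent->eType == orxEVENT_TYPE_INPUT)
  {
    orxINPUT_EVENT_PAYLOAD *pstPayload = (orxINPUT_EVENT_PAYLOAD *)_pstEvent->pstPayload;

    if(_pstEvent->eID == orxINPUT_EVENT_ON)
    {
      /* Touch began / button pressed */
      orxLOG("Input <%s> activated", pstPayload->zInputName);
    }
    else if(_pstEvent->eID == orxINPUT_EVENT_OFF)
    {
      /* Touch ended / button released */
      orxLOG("Input <%s> deactivated", pstPayload->zInputName);
    }
  }
  return orxSTATUS_SUCCESS;
}

/* Registration, typically done in your Init() function:
 * orxEvent_AddHandler(orxEVENT_TYPE_INPUT, EventHandler); */
```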
You're totally right. If you have any suggestions about what such a tutorial should contain, please let me know. Maybe I should base it on the small iPhone demo that ships with orx?
Also, I think orx could wrap the touch events further, without exposing any NS types, so that one could handle touch events from a .cpp file. That way, the differences between Windows/Mac/iPhone could be handled with a simple macro.
Right now macros alone can't do this: I need two event handler functions in two separate files, selected by macros. That said, access to the raw iPhone events is still needed.
I look forward to your new tutorials.
That will be done before the 1.2 release.
I guess I could add a wrapper there, but I'm not sure which information I should wrap, as everything could potentially be useful and the NS classes for touches and the accelerometer have a bunch of query methods.
I'll let you know when something's ready.
Touch move = left button down + move, touch end/cancel = left button up.
(I don't know if it's available yet.)
But with just the input events, I can only get the left button up event, i.e. a touch-up-inside.
orxMouse_IsButtonPressed will tell you if there's an active touch or not and orxMouse_GetPosition will then give you its position.
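As a sketch of that polling approach (assuming a clock callback registered elsewhere; the orxMouse calls are as I recall them from the orx API):

```c
#include "orx.h"

/* Sketch: polling single-touch state from a clock callback.
 * On iPhone the active touch is mapped onto the left mouse button. */
static void orxFASTCALL Update(const orxCLOCK_INFO *_pstClockInfo, void *_pContext)
{
  if(orxMouse_IsButtonPressed(orxMOUSE_BUTTON_LEFT))
  {
    orxVECTOR vPosition;

    /* orxMouse_GetPosition fills the vector and returns it,
     * or orxNULL on failure */
    if(orxMouse_GetPosition(&vPosition) != orxNULL)
    {
      orxLOG("Active touch at (%g, %g)", vPosition.fX, vPosition.fY);
    }
  }
}
```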
You can also use the generic input system to monitor mouse left button clicks (both touch on *and* off) via events, and I'll soon add a way to monitor the mouse axes (x and y separately, as the interface uses single float values).
But would I need to divide my code into two parts, one in an event callback and one in the run/clock callback? (If I want to detect all the input, including touch begin, touch move and touch end.)
You can query them with orxInput_IsActive() / orxInput_HasNewStatus().
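For example, a per-frame query can distinguish begin/held/end on its own; "Shoot" is a hypothetical input name assumed bound in the [Input] config section (e.g. to MOUSE_LEFT):

```c
#include "orx.h"

/* Sketch: querying the generic input system each frame.
 * "Shoot" is a hypothetical input name bound in config. */
static void Query(void)
{
  if(orxInput_IsActive("Shoot"))
  {
    if(orxInput_HasNewStatus("Shoot"))
    {
      orxLOG("Shoot just pressed (touch began)");
    }
    else
    {
      orxLOG("Shoot held (touch still down)");
    }
  }
  else if(orxInput_HasNewStatus("Shoot"))
  {
    orxLOG("Shoot just released (touch ended)");
  }
}
```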
If you'd rather go for an event callback, I added support for mouse position in the generic input system tonight. It's only on the SVN so far but it'll be part of 1.2 release.
Both ways allow you to fully handle single touch on iPhone. For multi-touch handling, you'd have to use event callbacks, as there's no support for it in the generic input system yet.