0 votes

This same problem (inverted camera) occurred with version 1.x (if I remember correctly) of this plugin.

A user named tcmxx2 explained: "EasyAR's backend algorithms are using right-hand convention while Unity uses left-hand..." (thanks for the explanation, tcmxx2)

Are the developers repeating this mistake? Why create a Unity plug-in that uses a different coordinate convention than Unity itself? What is the logic behind that? It makes no sense!

The EasyAR 2.x system works perfectly (although it does not support ARM64), but in the third version of this plugin the problem with the camera axis has returned.

I need some help to solve this problem... And I really ask you, developers: please change this!


+1 vote

To convert a matrix from EasyAR to Unity, we have created two utility functions: Utility.Matrix44FToMatrix4x4 and Utility.SetMatrixOnTransform. With the latter, we can do the right-handed-to-left-handed conversion and the matrix decomposition at the same time.

Or, if you prefer a simpler approach: suppose M is the 4x4 transform matrix from EasyAR; then you can calculate the transform matrix M' in Unity coordinates by

         (1 0  0 0)     (1 0  0 0)     ( m11  m12  -m13   m14)
    M' = (0 1  0 0)  M  (0 1  0 0)  =  ( m21  m22  -m23   m24)
         (0 0 -1 0)     (0 0 -1 0)     (-m31 -m32   m33  -m34)
         (0 0  0 1)     (0 0  0 1)     ( m41  m42  -m43   m44)

where

        (m11 m12 m13 m14)
    M = (m21 m22 m23 m24)
        (m31 m32 m33 m34)
        (m41 m42 m43 m44)
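This conjugation by diag(1, 1, -1, 1) can be checked numerically. A minimal NumPy sketch (not part of EasyAR; the sample matrix is an arbitrary stand-in):

```python
import numpy as np

# D flips the Z axis; conjugating M by D (M' = D M D) negates every entry
# that mixes Z with another axis, exactly the sign pattern shown above.
D = np.diag([1.0, 1.0, -1.0, 1.0])

M = np.arange(1.0, 17.0).reshape(4, 4)  # arbitrary stand-in for an EasyAR pose
M_prime = D @ M @ D

# Row 3 and column 3 (index 2) change sign, except the m33 entry,
# which is negated twice and keeps its sign.
print(M_prime)
```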

Notice that Unity converts its coordinate system to right-handed by inverting the Z-axis in the view matrix, so its projection matrix maps points from a right-handed coordinate system to a right-handed coordinate system: https://docs.unity3d.com/ScriptReference/Camera-worldToCameraMatrix.html

There are some defects in the EasyAR 2.x transforms and projection matrices, such as returning matrices with a negative determinant.

There are also inconsistencies between the Unity and non-Unity APIs, which slow down our iteration.

There are also problems with non-OpenGLES2 rendering APIs in Unity, which are not easy to support one by one.

To address the above problems, we have moved to a rendering-API-neutral model. All the APIs accept and return transforms between right-handed coordinate systems.

Regarding right-handedness, we have observed that nowadays most books and systems use right-handed coordinate systems, and Microsoft has shifted from left-handedness in Direct3D to right-handedness in WPF, XNA and Mixed Reality. As for Unity, we have found an inconsistency: it uses left-handed coordinate systems in the high-level API but right-handed coordinate systems in the low-level API. We suspect that Unity was influenced by Direct3D.

To mitigate problems with Unity, we have created the above utility functions.

It is used in ImageTargetController.OnTracking, where we pass a pose matrix calculated in ImageTrackerBehaviour.UpdateFrame, which is in turn based on the output of the EasyAR ImageTracker (targetInstance.pose()).

We do all the matrix calculations assuming right-handed coordinate systems, and convert to Unity transform only at the final step with SetMatrixOnTransform.

It has to be noticed that Unity uses the same projection matrix as OpenGL, so the z-reflection is not needed in the projection matrix; it is done in the view matrix, which we don't need to adjust. We only need to do the z-reflection for the transforms of GameObjects.
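The view-matrix Z-flip described above can be illustrated outside Unity. A hedged NumPy sketch (the camera transform here is a made-up identity example, not Unity's actual implementation):

```python
import numpy as np

# A camera at the world origin looking down +Z in Unity's left-handed world.
camera_local_to_world = np.eye(4)

# Unity's worldToCameraMatrix negates the Z row of the inverse camera
# transform, so camera space becomes right-handed (OpenGL convention):
# points in front of the camera end up at negative Z.
flip_z = np.diag([1.0, 1.0, -1.0, 1.0])
world_to_camera = flip_z @ np.linalg.inv(camera_local_to_world)

p_world = np.array([0.0, 0.0, 5.0, 1.0])  # 5 units in front of the camera
p_camera = world_to_camera @ p_world
print(p_camera[2])  # -5.0: an OpenGL-style projection matrix then applies unchanged
```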


Unfortunately, it didn't work... I tried flipping the 3D objects on the X axis, but the objects created at runtime appear lying down. And when the user touches the screen, the character moves on the wrong axis... what a pity! In version 2 of the plugin it worked perfectly. We are considering redoing the project, or changing plugins... but our preference is still EasyAR... maybe version 4? Anyway, thanks for your help.

You can change this line in ImageTrackerBehaviour:

pose = args.ImageRotationMatrixGlobal * pose;

To this one:

pose = args.ImageRotationMatrixGlobal * pose * Matrix4x4.Rotate(Quaternion.Euler(-90, 0, 180));

Or you can add this multiplication to the result of the Matrix44FToMatrix4x4 function.
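To see what post-multiplying the pose by an extra rotation does, here is a hedged NumPy sketch. Post-multiplication composes the rotation in the target's local frame before the tracked pose is applied, which is what `pose * Matrix4x4.Rotate(...)` does in the snippet above. Standard right-handed rotation matrices are used here, whereas Unity's Quaternion.Euler is left-handed with ZXY application order, so take the exact signs as illustrative only:

```python
import numpy as np

pose = np.eye(4)  # stand-in for the tracked pose (identity for clarity)

# Rotation of -90 degrees about X, right-handed convention.
c, s = np.cos(np.radians(-90)), np.sin(np.radians(-90))
rx_minus90 = np.array([[1, 0,  0, 0],
                       [0, c, -s, 0],
                       [0, s,  c, 0],
                       [0, 0,  0, 1]])

# Rotation of 180 degrees about Z is simply negating X and Y.
rz_180 = np.diag([-1.0, -1.0, 1.0, 1.0])

# Apply the correction in the target's local frame.
corrected = pose @ rx_minus90 @ rz_180
print(corrected[:3, :3])  # the target's local axes after the correction
```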


Thanks for your efforts, emanuel, but it didn't work either... With the code above, the objects in the scene really are in the right position, but the others, created at runtime, change position and scale...

I do not understand... Am I the only one who has problems with EasyAR's coordinates? Making a static cube on a surface is easy, but is no one else using particles and objects that need gravity (Rigidbody and Colliders) on the correct axis?


Forgot to mention that this code requires changing the ImageTrackerBehaviour CenterTarget to "FirstTarget" and setting every Target's position and rotation to 0,0,0. That way the objects remain in the same place all the time; only the camera is in motion.

Yes, I did it ... but it generated the problems described above...

I'll keep trying for now ... But the developers need to solve it, not us! Don't you agree?


Hey, emanuel...

One problem is solved. About object scales: I did some tests on the "Target Size" parameter, and noticed that the "Target Size" in the script was different from the "Scale" parameter in the object's Transform (my fault). When I tweaked it, it worked... One more thing: the 3D scenery is not stable on the surface (target); it seems to slide sideways...


...

edited Jul 22, 2019 by adenio

I believe it's really a big challenge to make a plugin for multiple platforms and programs that use different features ...

I can imagine that you (a developer?) intend to provide more information or help for this approach soon, but until then I have one more question... I noticed that the Matrix44FToMatrix4x4 function is used in the camera-attached script (CameraImageRenderer), but where and how should the SetMatrixOnTransform function be used to adjust the Unity camera axis? Can you give an example, please?

Thanks.