Is there a way in Irrlicht to find the rotation using two 3D points?
I am using a wiimote for my project, and I calculate the 3D positions of two IR LED dots.
I use these two dots to find the rotation in 3D. Is there an easy way in Irrlicht to do this (quaternions or something)?
I also did the rotation in 2D using Johnny Lee's affine transformation solver. I converted the float 4x4 to a matrix4 and stored the values.
Sometimes the Z-rotation jumps to 258.xx, and I really don't understand why this is happening.
Please see the code below.
// Angle of the vector (dx, dy). The dx == 0 case must respect the sign of dy,
// otherwise straight-down vectors are reported as straight up.
float computeAngle(float dx, float dy)
{
    if (dx == 0)
        return (dy >= 0) ? (float)PI / 2 : -(float)PI / 2;
    if (dx > 0)
        return (float)atan(dy / dx);
    return (float)(atan(dy / dx) + PI);
}

// Euclidean distance between (x1, y1) and (x2, y2).
// Note the parameter order: both x's first, then both y's.
float distance2(float x1, float x2, float y1, float y2)
{
    float dx = x1 - x2;
    float dy = y1 - y2;
    return (float)sqrt(dx * dx + dy * dy);
}
// I want to know if I am doing this properly here.
// Builds a 2D similarity transform (scale, rotation, translation) that maps
// segment (x1,y1)-(x2,y2) onto segment (x3,y3)-(x4,y4).
matrix4 solve2Dto4x4(float x1, float y1, float x2, float y2,
                     float x3, float y3, float x4, float y4)
{
    float scale = distance2(x3, x4, y3, y4) / distance2(x1, x2, y1, y2);
    float theta = computeAngle(x4 - x3, y4 - y3) - computeAngle(x2 - x1, y2 - y1);

    // Midpoints of the old and new segments.
    float tx1 = (x2 + x1) / 2;
    float ty1 = (y2 + y1) / 2;
    float tx2 = (x4 + x3) / 2;
    float ty2 = (y4 + y3) / 2;

    matrix4 result;
    result.makeIdentity(); // already sets the diagonal to 1

    result(0, 0) = (float)(scale * cos(theta));
    result(1, 0) = -(float)(scale * sin(theta));
    result(3, 0) = -tx1 * result(0, 0) - ty1 * result(1, 0) + tx2;
    result(0, 1) = (float)(scale * sin(theta));
    result(1, 1) = (float)(scale * cos(theta));
    result(3, 1) = -tx1 * result(0, 1) - ty1 * result(1, 1) + ty2;
    return result;
}
==================================
The main code that uses this solve2Dto4x4():
matrix4 WiiInteraction::computeTransformation()
{
    matrix4 transform;

    // Keep the two dots in a consistent left/right order between frames.
    if (normalizedPosition1.X < normalizedPosition2.X)
        leftCursor = 1;
    else
        leftCursor = 2;

    if (leftCursor == 1)
    {
        transform = solve2Dto4x4(old_normalizedPosition1.X, old_normalizedPosition1.Y,
                                 old_normalizedPosition2.X, old_normalizedPosition2.Y,
                                 normalizedPosition1.X, normalizedPosition1.Y,
                                 normalizedPosition2.X, normalizedPosition2.Y);
    }
    else
    {
        transform = solve2Dto4x4(old_normalizedPosition2.X, old_normalizedPosition2.Y,
                                 old_normalizedPosition1.X, old_normalizedPosition1.Y,
                                 normalizedPosition2.X, normalizedPosition2.Y,
                                 normalizedPosition1.X, normalizedPosition1.Y);
    }
    return transform;
}
if (wii.remoteL.IR.Dot[0].bVisible && wii.remoteL.IR.Dot[1].bVisible)
{
    cout << "********Computing Transformation**************" << endl;

    matrix4 m;
    m.setRotationDegrees(handle1.iNode->getRotation());

    // This returns the transformation matrix.
    matrix4 transform = wii.computeTransformation();

    matrix4 n;
    n.setRotationDegrees(transform.getRotationDegrees());

    m *= n;
    handle1.setRotation(m.getRotationDegrees());
    if (isRingAttached)
    {
        currentRing.setRotation(m.getRotationDegrees());
    }
}
-
- Posts: 105
- Joined: Mon Jul 27, 2009 4:06 pm
- Location: Cambridge, MA
Can you hit edit and put that code in [ code ] [ / code ] tags so that it's readable? I'm not sure about Johnny Lee's affine transform solver, but possibly the matrices are stored differently: Irrlicht uses column-major format, while many libraries use row-major format (a difference of a transpose if you just copy the data). Also, Irrlicht uses a left-handed coordinate system, while most math textbooks use a right-handed one (a difference of swapping the y and z axes, i.e. left- and right-multiplying your rotation matrix by the permutation matrix that swaps the y and z coordinates).
Edit: maybe I shouldn't assume that you're familiar with linear algebra. If your rotation matrix is
Code: Select all
| a b c |
| d e f |
| g h i |
in a right-handed coordinate system, then the same matrix in a left-handed coordinate system (such as Irrlicht) will be
Code: Select all
| 1 0 0 || a b c || 1 0 0 |
| 0 0 1 || d e f || 0 0 1 |
| 0 1 0 || g h i || 0 1 0 |
Edit: Oh, and to answer your original question, I don't think there's anything within the engine to help solve that problem. It's actually a pretty complicated problem to solve. You can do it using constrained non-linear optimization like this guy http://irrlicht.sourceforge.net/phpBB2/ ... highlight= , but the thing you're solving for is a rotation matrix, so there's a lot more arithmetic involved and it becomes a messy book-keeping problem. I actually have code to do this that I wrote in Java using the jMonkeyEngine, but god knows when I'll have time to dig it up. Basically you can restate your problem as "what is the rotation matrix that, when used to rotate two 3D points, minimizes the distance to the measured 3D locations?". It sounds like that's what your affine transform solver does, though, so your problem is probably just a difference in coordinate representations.
Edited the code section.
It's easy:
Code: Select all
core::vector3df Point1 = core::vector3df(X1, Y1, Z1);
core::vector3df Point2 = core::vector3df(X2, Y2, Z2);
core::vector3df vect = Point2 - Point1;
node->setRotation(vect.getHorizontalAngle());
(!) only 3 numbers can define a rotation, so the Y-axis is always up
Jeroen
@JeroenP: OK, that does answer the question as asked, since it calculates a rotation from two points, but I don't think that's what the OP meant.
@kparamas: you said you're using a Wii remote, so you are trying to calculate the orientation (rotation) in space of the remote, given the positions of the two points it can see. In other words, you know that the two IR dots are at positions p1, p2 in the reference frame of the Wii remote, and that they're at positions q1, q2 in the room, and you want to know the orientation of the remote. Is that an accurate statement of what you want?
P.S. Do you seriously use no indentation when you write code?
I don't think you're doing this correctly. The matrix that you generate is a rotation about just one axis. Is this what you want? Exactly what rotation are you looking for?
I calibrate the positions of the IR dots to screen coordinates.
My screen coordinates are X: -15 to 15, Y: -10 to 10, and Z: -18 to 18.
Normally raw X and raw Y range from 0.0 to 1.0, and I calculate the depth using 1/(xL - xR), where xL is the position seen by the left wiimote and xR the position seen by the right wiimote.
My question is: can this approach calculate the rotations about all the axes? Please help me out with this. I shall test the approach given by JeroenP.
Normally I indent my code, but here I copied it from my earlier post, so it isn't indented.