The obvious solution, interpolating along the ray by the collision distance divided by the line length, suffers from floating-point inaccuracies when dealing with long lines, notably those created by getRayFromScreenCoordinates().
So here's a quick example of a function that returns a vector of (0,0,0) on failure. For a more robust interface you might instead pass in two references, a collision boolean and an output vector, but this suited my needs.
Code:
/**
* collides a sphere with a ray, returning the position of the first collision
* @param ray the ray to collide
* @param position position of the sphere
* @param radius the radius of the sphere
* @return core::vector3df position of the first collision, or (0,0,0) if there is none
*/
core::vector3df getIntersectionWithSpherePosition(const core::line3df &ray,
                                                  const core::vector3df &position,
                                                  const f32 radius) {
    f64 outdistance = 0.0;
    bool collides = ray.getIntersectionWithSphere(position, radius, outdistance);
    if (collides) {
        // walk outdistance units along the normalized ray direction
        core::vector3df pos(ray.getVector().normalize() * (f32)outdistance);
        pos += ray.start;
        return pos;
    }
    // no collision
    return core::vector3df(0, 0, 0);
}