So I took a very simplistic approach: figure out how many pixels on the screen correspond to "one degree" of pitch, then count up (or down) from the center of the screen to determine where the horizon should sit at the screen's center. Clearly that isn't right, because when I run the code and the pitch angle decreases, the rendered horizon rises faster than the white "+" sign I draw where I think the horizon should be.
Any clue where I went wrong in my logic?
Code:
// Field of view (converted to degrees) and aspect ratio of the active camera.
f32 cam_fov = camera->getFOV() * core::RADTODEG;
f32 cam_aspect = camera->getAspectRatio();
core::dimension2du screen = driver->getCurrentRenderTargetSize();

// Pixels per degree in each screen direction, assuming the FOV maps linearly onto the screen.
core::dimension2df ppd(screen.Width / cam_fov, screen.Height / (cam_fov / cam_aspect));

// Vertical offset of the horizon from the screen center for the current pitch (in degrees).
f32 pitch_displacement = ppd.Height * pitch;
core::vector2di horizonCenter(screen.Width / 2, (screen.Height / 2) - pitch_displacement / 2);

// Draw a small white "+" where I expect the horizon to cross the center of the screen.
driver->draw2DLine(horizonCenter + core::dimension2di(2, 0), horizonCenter - core::dimension2di(2, 0), video::SColor(255, 255, 255, 255));
driver->draw2DLine(horizonCenter + core::dimension2di(0, 2), horizonCenter - core::dimension2di(0, 2), video::SColor(255, 255, 255, 255));
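To make the arithmetic concrete, here is a minimal standalone sketch of the conversion I have in mind, with made-up numbers (a 720-pixel-high screen, a 60° vertical FOV, and 10° of pitch) in place of the Irrlicht camera calls, just to show the pixels-per-degree logic in isolation:

Code:
#include <cstdio>

int main()
{
    // Assumed values, for illustration only.
    const float fovDeg       = 60.0f;   // vertical field of view in degrees
    const float screenHeight = 720.0f;  // render target height in pixels
    const float pitchDeg     = 10.0f;   // camera pitch in degrees

    // Linear "pixels per degree" assumption: the whole FOV maps evenly onto the screen.
    const float pixelsPerDegree = screenHeight / fovDeg;      // 720 / 60 = 12 px per degree

    // Offset of the horizon from the screen center under that assumption.
    const float offset   = pixelsPerDegree * pitchDeg;        // 12 * 10 = 120 px
    const float horizonY = screenHeight / 2.0f - offset;      // 360 - 120 = 240 px

    std::printf("pixels/degree = %.1f, offset = %.1f, horizon y = %.1f\n",
                pixelsPerDegree, offset, horizonY);
    return 0;
}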