I am facing an issue with camera movement around an axis. When the camera is farther from its target, it appears to move around the object more slowly than when it is closer. I understand this is due to distance: the target occupies a smaller portion of the field of view when the camera is farther away. It is just like watching a spider run across your countertop: up close, the spider looks like it is sprinting by, but from across the kitchen it appears to move more slowly because it covers a smaller portion of your field of view.
I need the camera to move at a constant rate no matter its distance from the target. Pretty straightforward problem. I am limited in what code I can share due to work.
float deltaX = mouse.X;
float deltaY = mouse.Y;
float moveSpeed = 100000f;
// These are set elsewhere and manipulated throughout the code.
Vector3 deltaMove = Vector3.Zero;
Vector3 Position = Vector3.Zero;
Vector3 Target = Vector3.Zero;
deltaMove += new Vector3(deltaX * moveSpeed / 2f, deltaY * moveSpeed / 2f, 0);
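To see why a fixed `deltaMove` looks slower at a distance, note that sliding the camera sideways by a fixed world-space amount sweeps an angle of roughly `atan2(move, distance)` around the target, which shrinks as the distance grows. A small sketch (in Python for brevity; the `move` and `distance` values are made up for illustration):

```python
import math

# Illustration only: a fixed sideways move sweeps a smaller angle
# around the target the farther the camera is from it, so the orbit
# appears slower at a distance.
move = 10.0  # fixed sideways move per frame (world units, assumed)

for distance in (10.0, 100.0, 1000.0):
    angle = math.atan2(move, distance)  # angle swept around the target
    print(f"distance={distance:7.1f}  swept angle={math.degrees(angle):6.2f} deg")
```

At distance 10 the same move sweeps 45 degrees; at distance 1000 it sweeps well under one degree, which matches the spider-on-the-counter effect described above.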
After plugging away at this for a while, I've found that the following solution produces the best result:
float distance = Vector3.Distance(CameraPosition, TargetPosition) * 50;
float x = (mouse.X * distance) / moveSpeed;
float y = (mouse.Y * distance) / moveSpeed;
deltaMove += new Vector3(x, y, 0);
The snippet above produced the cleanest and smoothest results of everything I have tried. Originally I wrote it as shown below, since that form directly produces a vector whose speed scales with distance: the smaller the distance, the smaller the speed, and vice versa. I had to rework it into the version above because the original, multiplying by both moveSpeed and distance, was producing values of 2.5 * 10^12 and larger, forcing the target off the screen.
float distance = Vector3.Distance(CameraPosition, TargetPosition);
float x = mouse.X * moveSpeed * distance;
float y = mouse.Y * moveSpeed * distance;
deltaMove += new Vector3(x, y, 0);
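The reason scaling the move by distance works is that the swept angle `atan2(move, distance)` becomes constant when `move` grows in proportion to `distance`. A quick check (Python for brevity; the constant `k` stands in for the mouse delta times the speed factor and is an assumption for illustration):

```python
import math

k = 0.01  # mouse delta times speed constant (assumed for illustration)

for distance in (10.0, 100.0, 1000.0):
    move = k * distance                  # move scaled by distance
    angle = math.atan2(move, distance)   # swept angle is now constant
    print(f"distance={distance:7.1f}  swept angle={math.degrees(angle):6.3f} deg")
```

Every distance sweeps the same angle per frame, which is exactly the constant apparent rotation rate the question asks for; the `* 50` and `/ moveSpeed` factors in the working snippet just keep that constant at a usable magnitude.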