Has anyone ever thought about this? This thing called "field of view".
Really it's angle of view, or focal length in cameras. I noticed the other day that the image in the software I work with is very distorted. I'd been noticing it for a while, actually, or how there seemed to be an absurd amount of parallax in the perspective, where things soar up into the air, or down below, without even reaching great heights.
I think this is funny, and I'm embarrassed that I've been working with 3D software for what feels like many lifetimes without taking notice. It wasn't until I began to see actual distortions, like ripples, that I noticed, and then I began to see them everywhere I looked...
Part of me wonders if this is because of my other recent post, where I discovered an aliasing-free 3D display technique. So I've been seeing 3D images devoid of jagged edges, and I've been spending a lot of time just lost in the images, because they're miraculous. But I wonder if being bombarded by so many jagged edges everywhere has a way of dulling the senses, so you do not notice the ripples and distortions of this thing called "FOV".
What's even more interesting about this subject is that, from what I've read, the conventional wisdom about "FOV" in computer games (what the right settings are, and so on) seems very misplaced to me upon examination.
It's hard to tell what numbers mean what. The software I've inherited uses a 50 degree vertical field of view. I would describe this as very distorted. About as much so as could possibly be acceptable. Other places I read 90 degrees is recommended. Valve's Source Engine website says its default is 75 degrees. But even though I think vertical "FOV" is almost universally used, I get the impression that if anyone is using 90 it must be either a horizontal or diagonal measurement.
I briefly tried 90 scaled by the reciprocal of the aspect ratio, and on my system I couldn't tell the difference from 50 degrees, even though the result depends on the aspect ratio.
Supposedly a high value is recommended for computer games, at least on PCs. I'm not sure why, but some of the reasons given are that it is better suited to sitting close to a smaller screen, and that it is supposed to be better for people who feel motion sickness. I'm skeptical of that; I wouldn't be surprised if the distortions are causing the sickness in many cases.
What I do know is that if you lower the value (perhaps to what is used on consoles, though I doubt it truly, and I wonder why I've never played a console game with a preference to control this) the distortions go away, and to me the scene begins to look like a movie. In fact Valve's Source Engine website suggests using a low setting and long shots for trailers to give them a movie-like feel, but out of the other side of its mouth it says that players do not notice the distortion. I'm very skeptical of this...
From what I can see, players seem to be using this setting to give themselves play advantages, because a greater field of view means that you can see more of your surroundings. It also makes them feel more secure, when virtual combat and threats can come from anywhere. Being able to see less means your targets can more easily go where you cannot see them, and so on.
So I just want to share this, because I worry about conventional wisdom's way of creating defaults that we use without ever knowing there is another way.
This setting is how games zoom in to use gun scopes and things. So when it is lowered, it can feel like you are zooming in, but this is truly a matter of context. You wouldn't know that you are zooming in if you were always zoomed in. The problem itself really boils down to vision being circular and screens being flat. Our displays must pick a spot to map to the orb, like flat polygons that form a sphere. I really like the intimacy of the "zoomed in" image...
When you come face to face with a character, they fill the entire screen; you really feel like you are in their presence, and not orbiting around them from a distance that can never be crossed. And of course, there is no warping, no towering things that close in on themselves as if to kiss, or like two staves lowered by guards blocking your path. I think subconsciously this distortion, like jagged edges, separates the video game realm from the cinematic.
When the setting is lowered, it can keep you from seeing things above and below, if there is a limit on looking up and down. I think this might be why some games use black bars to even this out. But I found that, for a range of 30 to 50 (I recommend 30 or 35), the position of the view can be pulled back a little ways so that the lower settings see the same amount as 50. Not being able to see as much also means the game doesn't have to display as many things, so you can see your frame rate improve significantly.
if(SOM::zoom<50) //back up so you can see above and below
{
	//20: this is just 50-30, covering the zoom modes in full
	//4: works surprisingly well for a 1.5m tall player character
	assert(SOM::zoom>=30);
	float up = fabsf(SOM::pov[1])/4;
	float back[3] = {SOM::pov[0],0,SOM::pov[2]};
	Somvector::map(back).unit<3>().se_scale<3>(float(50-SOM::zoom)/20*up);
	pos.move<3>(back);
}
In this code "pov" is the normalized view vector. So its vertical component is used to scale the push back vector, and its horizontal components are renormalized and scaled by the simple linear formula.
PS: In the code it should really be pos.remove<3>(back), except it is operating on the translation component of the scene's view matrix, so it is inverted.
EDITED: float up = powf(fabsf(SOM::pov[1]),1.5f)/4; smooths out the motion so it isn't camel-humpy. For homework you can do first-person avatar body parts. I find that with absurdly widescreen/matted settings I don't see distortion, but I'm sure it's there; I'll probably see it eventually. But it's cool to go really widescreen, because you can frame the whole NPC on one side and put whole blocks of spoken subtitles beside them, out of the way.