Lens for human-like field of view?

rtphokie

I've read that using a 50mm lens on a 35mm film camera provides a good approximation of human vision. Does that mean that a 37mm lens should be used on a dSLR?
 
I've read that using a 50mm lens on a 35mm film camera provides a good approximation of human vision. Does that mean that a 37mm lens should be used on a dSLR?

Yeah, 50mm sounds about right.

Though for a 1.5x crop body (i.e. Nikon & Pentax) it would be 33mm on a zoom lens, and 31mm for a 1.6x crop body (i.e. Canon). If you're looking for a prime lens, then the 30mm is the closest you're going to find for a dSLR (unless you have one of the FF Canon bodies, of course).
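For anyone who wants to check that arithmetic, here's a quick Python sketch of the usual "divide by the crop factor" rule (the function name is just for illustration; the crop factors are the nominal 1.5x and 1.6x mentioned above):

def equivalent_focal_length(full_frame_mm, crop_factor):
    """Focal length on a crop body that matches the full-frame field of view."""
    return full_frame_mm / crop_factor

for body, crop in [("Nikon/Pentax (1.5x)", 1.5), ("Canon APS-C (1.6x)", 1.6)]:
    print(f"{body}: {equivalent_focal_length(50, crop):.1f} mm")
# Nikon/Pentax (1.5x): 33.3 mm
# Canon APS-C (1.6x): 31.2 mm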
 

Actually, the field of view of the human eye is quite wide. More like 140+ degrees. A 50mm (on a film camera) will be about a 1:1 magnification ratio, which is why it's usually associated with the human eye. Its field of view is about 50 degrees.
 
I think that it's very confusing to talk about the field of view of the human eye as though our vision were like a simple camera sensor. Human vision is quite complex.

In a very simple experiment, hold your arms straight out and stare straight ahead. Wiggle your fingers. Keep wiggling them and bringing them forward until you can see the motion on either side. When I do that, the angle described by my fingers and my eyes is about 160 degrees.

The interesting thing is that once I stop moving them, I can't see them. It would appear that at the extreme edges eyes can see motion better than still things.

As I continue to move my fingers forward, my ability to see them steadily improves. It appears that the "sensor" cells on the edges of my eyes aren't as good (or perhaps not as dense) as the ones in front. So for me at least, my vision doesn't have a hard boundary like a photo sensor. That's what makes it confusing to discuss the human field of view.

Using an Angular Field of View calculator, I compute that a 50mm lens on a 35mm film body shows a horizontal angle of 40 degrees and a vertical angle of 27 degrees. That's not terribly wide. If that matches anything about what the eye can see, it must only be the "sweet spot" where vision is the best or maybe the field of view in which we typically focus our attention.
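If anyone wants to reproduce those numbers without a calculator, it's just a bit of trigonometry. This little sketch assumes a 36mm x 24mm (35mm film) frame and the lens focused at infinity; the snippet is mine, not from the calculator I used:

import math

def angle_of_view_deg(sensor_mm, focal_mm):
    """Angle of view (degrees) across one sensor dimension, lens at infinity."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

print(f"horizontal: {angle_of_view_deg(36, 50):.1f} deg")  # ~39.6
print(f"vertical:   {angle_of_view_deg(24, 50):.1f} deg")  # ~27.0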

A 50mm (on a film camera) will be about a 1:1 magnification ratio
I'm not sure that I understand the use of the phrase "magnification ratio" without also including the subject distance. As I understand it, how much a 50mm lens magnifies an object depends not just on the focal length but also on the distance the object is from the film plane. If the 50mm lens could focus close enough, it could achieve a 1:1 magnification ratio (meaning that the image of the object is the same size on the film/sensor as it is in real life). I would imagine that a 50mm lens could be designed to focus closer than that, or not as close.

Looking at the specs of Canon 50mm lenses, I see that the f/1.2 lens can only achieve a 1:6.7 ratio, the f/1.4 can get a 1:6.6 ratio, the f/1.8 can get a 1:6.66 ratio, and even the f/2.5 Macro can only get to a 1:2 ratio without a converter. All of these ratios are the maximum magnification ratios achievable by the lenses and occur when objects are at the minimum focusing distance for the lens.
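As a rough cross-check of those spec-sheet numbers, a simple thin-lens model gets you into the same ballpark. Treat this as an approximation only: real 50mm lenses aren't thin lenses, and the 0.45 m figure I plug in below is just a typical minimum focus distance for these primes, not something taken from the thread.

import math

def max_magnification(focal_mm, min_focus_mm):
    """Approximate magnification at closest focus (thin lens: 1/s + 1/s' = 1/f,
    with s + s' ~ the film-plane-to-subject distance, ignoring lens thickness)."""
    D, f = min_focus_mm, focal_mm
    root = math.sqrt(D * D - 4 * D * f)
    image_dist = (D - root) / 2    # lens-to-image distance s'
    subject_dist = (D + root) / 2  # lens-to-subject distance s
    return image_dist / subject_dist

# e.g. a 50mm lens with a 0.45 m minimum focus distance:
m = max_magnification(50, 450)
print(f"magnification ~ {m:.2f}x, i.e. about 1:{1/m:.1f}")
# ~0.15x, about 1:6.9 -- close to the 1:6.6 to 1:6.7 figures in the specs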
 
Actually, the field of view of the human eye is quite wide. More like 140+ degrees. A 50mm (on a film camera) will be about a 1:1 magnification ratio, which is why it's usually associated with the human eye. Its field of view is about 50 degrees.

I guess the question I'm asking is more naive: what lens and technique should I use to best approximate "being there"?

It's sounding like a combination of a 50mm (or whatever is appropriate for your digital sensor to get you as close as possible to 50mm) coupled with a panorama. To get a panorama in the neighborhood of 140 deg, it's going to take about 8 shots to get right.

Thinking back, the first panorama stitching software I used defaulted to 50mm and it makes sense now.

Good discussion, thanks everyone.
 
I'm not sure that I understand the use of the phrase "magnification ratio" without also including the subject distance. As I understand it, how much a 50mm lens magnifies an object depends not just on the focal length but also on the distance the object is from the film plane. If the 50mm lens could focus close enough, it could achieve a 1:1 magnification ratio (meaning that the image of the object is the same size on the film/sensor as it is in real life). I would imagine that a 50mm lens could be designed to focus closer than that, or not as close.

Looking at the specs of Canon 50mm lenses, I see that the f/1.2 lens can only achieve a 1:6.7 ratio, the f/1.4 can get a 1:6.6 ratio, the f/1.8 can get a 1:6.66 ratio, and even the f/2.5 Macro can only get to a 1:2 ratio without a converter. All of these ratios are the maximum magnification ratios achievable by the lenses and occur when objects are at the minimum focusing distance for the lens.

I guess I confused you by using a term more often associated with macro capabilities. What I mean is, when you use a 50mm lens, the image will appear to be the same distance from you as your eye sees it. If you were to use a 100mm lens, an object would appear to be twice as close to you as your eye sees it. I suppose I should have used 1X magnification, not 1:1.
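To put a number on that idea: relative to the roughly life-size view of a 50mm lens on full frame, the apparent magnification just scales with focal length. A tiny sketch (the 50mm "normal" reference point is the assumption here):

def apparent_magnification(focal_mm, normal_mm=50):
    """How magnified the view looks compared with the ~1x view of a 50mm lens on full frame."""
    return focal_mm / normal_mm

print(apparent_magnification(100))  # 2.0 -> objects appear about twice as close
print(apparent_magnification(50))   # 1.0 -> roughly what the eye sees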
 
It's sounding like a combination of a 50mm (or whatever is appropriate for your digital sensor to get you as close as possible to 50mm) coupled with a panorama. To get a panorama in the neighborhood of 140 deg, it's going to take about 8 shots to get right.

That's about right allowing for some overlap to aid in stitching the panorama.
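A quick sanity check on that 8-shot figure: the 27-degree per-frame angle below assumes a 50mm-equivalent lens shot in portrait orientation (the short side of the frame spanning the panorama), and the one-third overlap is my assumption, not something anyone specified.

import math

def frames_needed(target_deg, frame_deg, overlap):
    """Frames to cover target_deg when each new frame adds frame_deg*(1-overlap) of new coverage."""
    step = frame_deg * (1 - overlap)
    return math.ceil((target_deg - frame_deg) / step) + 1

print(frames_needed(140, 27, overlap=1/3))  # portrait orientation -> 8 frames
print(frames_needed(140, 40, overlap=1/3))  # landscape orientation -> 5 frames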
 
Try this:
Use a zoom lens that includes the expected "normal" range. View the scene through the camera with one eye, while viewing the scene (not through the camera) with the other eye (this usually means viewing through the camera with the right eye). Zoom the lens until the two views seem to correspond with each other, and note the focal length.
It is likely to be in the 30-something range on a 1.6x camera.

Before photography, painters generally painted a scene with a viewing angle that closely corresponds to what our eyes discern, similar to a "normal" lens.
 
View the scene through the camera with one eye, while viewing the scene (not through the camera) with the other eye
Wouldn't that give you the field of view of just one eye rather than two?

What we see and what a camera sees are different in too many ways to directly compare them. I agree that a 50mm (full frame) focal length is a reasonable match to what I sort of see when I'm looking straight ahead and not thinking about the edges much. I also agree that our vision extends way beyond those bounds to something more like a 17-20mm lens. That part of the visual image isn't what we "see" when our attention isn't drawn to it.

There's a difference between what our brain "sees" and the image that hits our retina. Vision is a fascinating topic. One of the cool things that I learned is that information coming from our eyes to our brain is split into three distinct processing paths very early on in the process. One part of the brain focuses on sensing motion. Another part focuses on edge detection, parallax, and other information required to mentally construct a three-dimensional image with objects in it. The last part focuses on color information. Later in the mental part of the process of "seeing", the three sets of information are combined to mentally construct what we see.

Many interesting artistic and psychological tricks take advantage of our complex visual process. Piet Mondrian's famous Broadway Boogie Woogie is an example of using colors with the same brightness. The spatial sensing part of our visual process focuses more on black and white information and doesn't pay much attention to the edges between those regions in the painting. When the color is added, there isn't a clear place for it to go. The effect is that the gray and yellow parts of the picture seem to jitter as our brain fails to build a coherent image.

Another great example of this effect that can be seen on the web is Rotating Snakes. Believe it or not, this is a static image. It's your brain that is getting confused and inferring motion.


There are also lots of tricks involving after images. The classic ones have you stare at a simple, brightly colored image for a while and then look at a white wall. Your mind will create an image on the white wall that is the negative of the original image.

I'm rambling, so I'll get back to my point. Vision isn't as simple as a lens and a sensor. There isn't a definitive answer to what your field-of-view is. There's a difference between what you can detect at the extremes of your vision and what part of your vision your mind usually pays conscious attention to. And that last part isn't some cheap cop-out: eyes don't really "see"; it's your mind that "sees." If your mind isn't building that mental image, you aren't really "seeing" something.

Then there is also the problem that pictures are flat and what we see is three-dimensional. That's why lighting is so important. A well-lit scene will provide the viewer with enough natural visual cues to help them mentally construct a 3D image. A poorly lit scene will just look flat.
 
I'm rambling, so I'll get back to my point. Vision isn't as simple as a lens and a sensor. There isn't a definitive answer to what your field-of-view is.

I learned a lot about peripheral vision, viewing angle, and more in Keith Code's 2nd level California Superbike class, and as you say, it isn't that simple. We don't even have to be looking directly at what we are concentrating on; it's easy after some practice.

What *is* simple about our vision is the magnification, it doesn't change and we can't change it without a lens. This is the part the "one eye in the viewfinder, one not" exercise can easily compare. When the magnification appears the same, that focal length is what corresponds to our eyes.
Strangely enough, when I actually tested it just now, the focal length that matched my eyes' magnification was slightly more than 50mm. Maybe the crop factor does not change the "normal" lens idea?
 
What *is* simple about our vision is the magnification, it doesn't change and we can't change it without a lens. This is the part the "one eye in the viewfinder, one not" exercise can easily compare. When the magnification appears the same, that focal length is what corresponds to our eyes.

I can see that. Of course, if you are looking through the viewfinder rather than the lens, you have to consider the magnification level of the viewfinder as well.

The viewfinder magnification of some Canon bodies with a 50mm lens focused at infinity:
400D - 0.8x
30D - 0.9x
5D - 0.71x
1DM2 - 0.72x
1DsM2 - 0.7x
Elan (film) - 0.7x
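Those numbers also suggest why the one-eye test can land a bit above 50mm: finder magnification is specified with a 50mm lens at infinity, so a roughly life-size view through the finder needs about 50mm divided by the finder magnification. A rough sketch (the formula is my approximation; dioptre adjustment and eye relief will shift it a little):

finder_mag = {
    "400D": 0.80,
    "30D": 0.90,
    "5D": 0.71,
    "1D Mark II": 0.72,
    "1Ds Mark II": 0.70,
    "Elan (film)": 0.70,
}

for body, m in finder_mag.items():
    # focal length at which the view through the finder looks about life-size
    print(f"{body}: life-size view at ~{50 / m:.0f} mm")
# e.g. 400D: ~62 mm, 30D: ~56 mm, 5D: ~70 mm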
 
Hopefully it's not too late for this debate. 50mm is only deemed normal because it was the lens that was first provided with 35mm film cameras. I would not say that it actually shows the equivalent of a person's visual range including peripheral vision (which does vary from person to person).

For my camera (with a 1.6X crop), I find 22-24mm fits my peripheral vision. But again, that is for my vision.

Mike
 