Interesting fact of the day: any source of light that doesn't appear as a single point (i.e. anything other than a point source) will have the same brightness no matter what distance you happen to be from it.
In fact, in a sense the opposite is true. Once you get close enough to an object that you can no longer see the whole of it within your field of view, it will get **less** bright as you get closer.
Amazingly this even applies to the sun. If you were at the surface of the sun, just a few feet away (such that one square meter of the sun's surface was within your field of vision), it would only appear to be 93 lumens bright. That would be equivalent to only a 6 watt incandescent light bulb! Compare that to the brightness of the sun from Earth, which is a whopping 127,000 lumens.
@freemo Not entirely true. You are confusing (overall) brightness with surface brightness. Surface brightness, that is, light power received per unit solid angle, does not decrease with distance; but since the apparent luminous area decreases with the square of the distance, so does overall brightness = solid angle * surface brightness. For this reason, photographing the Moon (same distance from the Sun as Earth) requires the same exposure settings as a sunlit landscape, but Saturn (10x the distance) requires 100x more (+6.6 EV).
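As a quick sanity check on that +6.6 EV figure, here is a minimal sketch of the arithmetic (the inverse-square assumption is spelled out in the comments):

```python
import math

# Surface brightness is distance-invariant, but the sunlight falling on a
# planet drops with the square of its distance from the Sun, so the light
# it reflects toward the camera does too.
def extra_exposure_ev(distance_ratio: float) -> float:
    """Extra exposure (in EV stops) needed at `distance_ratio` times the distance."""
    light_ratio = distance_ratio ** 2      # inverse-square falloff
    return math.log2(light_ratio)          # one EV step = doubling the exposure

print(extra_exposure_ev(10))  # Saturn at ~10x Earth's distance -> ~6.64 EV
```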
@freemo Also, on the surface of the Sun, the area from which light reaches you is 180°x180°, way more than the 0.5°x0.5° as seen from Earth. So you will be toast within a fraction of a second, although a 0.5°x0.5° piece of the Sun would have the same brightness as seen from Earth.
Incorrect, but only because you seem to have misunderstood what was asserted.
@freemo Well, you will certainly be toast very quickly, but you didn't define what "brightness" means, so the post was ambiguous. To a layman, brightness usually means how brightly something is lit by the light source (at a particular distance, that is). Surface brightness is a concept that needs to be introduced.
@AlphaCephei Yes, I should have said "perceived brightness of an object" to be more clear.
@freemo Perceived brightness of a *particular viewing angle smaller than the viewing angle of the object*. The object in its whole expanse, as long as it fits the field of view, does in fact get brighter when it is closer, because the viewing angle increases. And therefore, it lights up closer objects (say, Mercury) more than objects farther away (e.g. Saturn).
> Perceived brightness of a *particular viewing angle smaller than the viewing angle of the object*.
This was already stated in the OP when I said the following:
> such that one square meter of the sun's surface was within your field of vision
> The object in its whole expanse, as long as it fits the field of view, does in fact get brighter when it is closer,
Incorrect, but a common misconception.
As you get closer to an object, the object fills a larger portion of your retina. This will cause the object to transmit more overall energy to your eye, but the **same** energy per receptor of the eye. Therefore the brightness of the object appears the same; there are simply more of your "pixels" detecting it.
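Here is a minimal numerical sketch of that cancellation (the luminance and pupil values are arbitrary placeholders):

```python
# Minimal sketch: halving the distance lets ~4x more light into the pupil,
# but the retinal image also covers ~4x as many receptors, so the light
# per receptor (perceived brightness) stays the same.

LUMINANCE = 1.0     # surface brightness of the source (arbitrary units per sr)
PATCH_AREA = 1.0    # emitting area of the source, m^2
PUPIL_AREA = 1e-5   # pupil area, m^2 (placeholder value)

for d in (1.0, 2.0, 4.0):                        # viewing distance, m
    solid_angle = PATCH_AREA / d**2              # small-angle approximation
    flux_into_eye = LUMINANCE * solid_angle * PUPIL_AREA
    receptors_covered = solid_angle              # retinal image area ∝ solid angle
    print(d, flux_into_eye / receptors_covered)  # constant: same energy per receptor
```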
@freemo That's surface brightness again. Each retina cell receives the same amount of light, but more cells receive light, so the *object* gets brighter *overall*. The projection of the Sun inside the eye is a different matter from the illumination of a piece of surface like a planet.
I know all that. I know that a telescope will not make the Andromeda galaxy appear brighter per unit area than the naked eye does, but it will increase contrast against the background, and large objects are easier to see than small ones.
@AlphaCephei The piece of surface being illuminated here is the opening of the eye, not a planet.
And yeah, a telescope would make Andromeda brighter to the eye, since it would collect more light and focus it into that same area... it would be like having a larger retina.
We might be misconnecting again.
@freemo Nope. This only holds for point sources. Objects get brighter in telescopes the less magnification is used (more light in less area). However, the *exit pupil* of the telescope also gets larger as magnification decreases, and an exit pupil larger than the eye's pupil wastes light outside the eye. The optimal magnification (Mo) for aperture D is where exit pupil = eye pupil diameter (≈7mm).
Exit pupil = D/M, so Mo = D/7mm. An aperture D gains (D/7mm)^2 more light but extends the object's area by Mo^2 = (D/7mm)^2, which cancels. See
https://www.rocketmime.com/astronomy/Telescope/SurfaceBrightness.html
@freemo Therefore, D = N*7mm yields the same surface brightness as D = 7mm, only at Nx magnification. This is the reason why there are 7x50 glasses but no 6x50. It's basically the same principle as in your initial post: telescopes make things appear closer. That's why I thought we were on the same line here. It's different for photography, where exposure time depends on the f-stop number (focal length/D), or for point sources, where all the light ends up in one retina cell.
I've known telescopes for ~50 years; I have three.
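As a small sketch of that exit-pupil arithmetic (the 7mm eye pupil comes from the posts above; the 210mm aperture is just an example value):

```python
EYE_PUPIL_MM = 7.0                          # assumed dark-adapted eye pupil

def exit_pupil_mm(aperture_mm: float, magnification: float) -> float:
    return aperture_mm / magnification      # exit pupil = D / M

def optimal_magnification(aperture_mm: float) -> float:
    # Lowest useful magnification: exit pupil exactly fills the eye pupil.
    return aperture_mm / EYE_PUPIL_MM       # Mo = D / 7mm

D = 210.0                                   # example aperture in mm
Mo = optimal_magnification(D)               # -> 30x
light_gain = (D / EYE_PUPIL_MM) ** 2        # light gathered vs. the naked eye
area_gain = Mo ** 2                         # growth of the object's image area
print(exit_pupil_mm(D, Mo))                 # -> 7.0 mm, matches the eye pupil
print(light_gain / area_gain)               # -> 1.0: surface brightness unchanged
```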
@AlphaCephei I think we are getting bogged down with technical terms... let me put this in a technical but practical way.
If you have a camera in an otherwise dark scene with a large bonfire (a non-point source), the exposure time, ISO, etc. would be the same no matter what distance you are from the fire. In addition, the light meter would measure the same brightness for the fire regardless of your distance from it.
@freemo Correct, as the surface brightness is constant, which is what the light meter measures. You could also take the Moon as the bonfire: 1/250 s at f/16 and ISO 100 will give you a nicely exposed image. You could use the same settings on Earth (in sunshine) - or on the surface of the Moon.
However, the *overall* brightness of the Moon as seen from Earth is barely enough to read a newspaper headline, while on the Moon, you could easily read the newspaper in the reflected moonlight.
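To see how those Moon settings line up with daylight settings on Earth, here is a minimal sketch using the standard photographic exposure-value formula:

```python
import math

def settings_ev(f_number: float, shutter_s: float, iso: float = 100) -> float:
    """Photographic exposure value: EV = log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100)

moon = settings_ev(16, 1 / 250)     # the Moon settings quoted above -> ~16.0 EV
sunny16 = settings_ev(16, 1 / 125)  # classic "sunny 16" daylight rule -> ~15.0 EV
print(moon, sunny16)                # within a stop of each other - and neither
                                    # depends on how far away the subject is
```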