Interesting fact of the day: Not counting point sources of light, any source of light that doesn't appear as a single point will have the same brightness no matter what distance you happen to be from it.

In fact, the opposite is true in a sense: once you get close enough to an object that you can no longer see the whole object within your field of view, it will get **less** bright as you get closer.

Amazingly, this even applies to the sun. If you were at the surface of the sun, just a few feet away (such that one square meter of the sun's surface filled your field of vision), it would appear to be only about 93 lumens bright. That would be equivalent to only a 6 watt incandescent light bulb! Compare that to the brightness of the sun from Earth, which is a whopping 127,000 lumens.
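(For a rough sense of where the light-bulb comparison comes from, here is a quick sketch; the ~15 lm/W incandescent efficacy is an assumed ballpark, not a figure from the post.)

```python
# Rough check of the bulb comparison above, assuming a typical incandescent
# efficacy of about 15 lumens per watt (the exact value varies by bulb).
lumens_quoted_in_post = 93          # figure quoted for one square meter of surface
incandescent_lumens_per_watt = 15   # assumed ballpark efficacy
equivalent_bulb_watts = lumens_quoted_in_post / incandescent_lumens_per_watt
print(f"roughly a {equivalent_bulb_watts:.0f} W incandescent bulb")  # ~6 W
```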

@freemo Not entirely true. You are confusing (overall) brightness with surface brightness. Surface brightness, that is, light power received per unit solid angle, does not decrease with distance; but since the solid angle a source subtends decreases with distance squared, so does overall brightness = solid angle × surface brightness. For this reason, photographing the Moon (same distance from the Sun as Earth) requires the same exposure settings as a sunlit landscape, but Saturn (10x the distance) requires 100x more (+6.6 EV).
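A minimal numeric sketch of that exposure arithmetic (assuming the simple inverse-square scaling of received flux described above; the function names are just illustrative):

```python
import math

# Surface brightness of a sunlit object does not change with the observer's
# distance; the total light received scales with the solid angle the object
# subtends, which falls off as 1/distance^2.
def relative_flux(distance_au: float) -> float:
    """Light received from a sunlit scene, relative to one at 1 AU."""
    return 1.0 / distance_au ** 2

def extra_exposure_ev(distance_au: float) -> float:
    """Extra exposure needed, in EV (each +1 EV doubles the exposure)."""
    return math.log2(1.0 / relative_flux(distance_au))

print(extra_exposure_ev(1))    # Moon: same distance from the Sun as Earth -> 0.0 EV
print(extra_exposure_ev(10))   # Saturn: ~10x the distance -> ~6.6 EV more
```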


@AlphaCephei

I am talking about perceived brightness at the eye, and yes, it's very much as I described; it has to do with the pixelated nature of the eye.
