What it really looks like
A photograph taken during a discussion of how photographs often fail to capture scenes.
It seems to me that photos often don’t capture what the photographer saw in the scene (perhaps especially if the photographer isn’t really a photographer). But it’s kind of amazing that I think of this as ‘the photograph failed to capture what it was really like’, rather than ‘my perception failed to capture what it was really like, as evidenced by this photograph’!
I wonder if it is possible to make a camera that captures scenes the way they look to a person there.
For a mundane example, it would need to make the moon appear big, whereas a camera might capture it as tiny. Is there a two-dimensional set of pixels that can produce the same sense of how big the moon is, without pushing other aspects of the picture further from how they were perceived? Is there such a two-dimensional image for any perceived scene?
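To see why cameras render it tiny: the full moon subtends only about half a degree, so with an ordinary lens it occupies a sliver of the frame. Here is a rough sketch of the arithmetic (the focal length, sensor size, and pixel count are illustrative assumptions of mine, not anything from the scene above):

```python
# Back-of-the-envelope check: how much of a standard photo the full moon
# actually occupies. Numbers are illustrative assumptions, not measurements.
import math

MOON_ANGULAR_DIAMETER_DEG = 0.52   # apparent size of the full moon from Earth
FOCAL_LENGTH_MM = 50               # a "normal" full-frame lens
SENSOR_WIDTH_MM = 36               # full-frame sensor width
IMAGE_WIDTH_PX = 6000              # a typical 24-megapixel image

theta = math.radians(MOON_ANGULAR_DIAMETER_DEG)
moon_on_sensor_mm = 2 * FOCAL_LENGTH_MM * math.tan(theta / 2)   # ~0.45 mm
moon_px = moon_on_sensor_mm / SENSOR_WIDTH_MM * IMAGE_WIDTH_PX  # ~76 px
fraction = moon_on_sensor_mm / SENSOR_WIDTH_MM                  # ~1.3% of frame width

print(f"Moon spans about {moon_px:.0f} px of a {IMAGE_WIDTH_PX} px frame "
      f"({fraction:.1%} of its width).")
```

The moon illusion can make it feel several times larger than that sliver, and that is roughly the gap a what-it-seemed-like camera would have to close without warping the rest of the picture.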
With sufficiently good what-it-seemed-like-to-the-photographer cameras, we could make substantial progress on bridging the gaping gaps between different minds.
For now, I suppose you can always edit the colors a bit.