Subtitle: A viral Reddit post has revealed just how much processing the company’s cameras apply to photos of the Moon, further blurring the line between real and fake imagery in the age of AI.
From the comments section, which is relevant for context.
I mention this because I see a lot of comments where people still misunderstand what is going on. No, Samsung is not replacing your moon with a pre-downloaded moon.png; it's using a neural network to hallucinate some of those details in. My newer experiments show how this works in even more detail, and how an image looks when these 'enhancements' are not being applied.
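The distinction the poster draws can be sketched in code. The functions below are purely illustrative assumptions, not Samsung's actual pipeline: `texture_overlay` shows the naive "paste a stored moon.png" behavior that is *not* happening, while `detail_enhance` uses a simple unsharp mask as a stand-in for a learned model that injects high-frequency detail the capture never fully contained.

```python
import numpy as np

def texture_overlay(photo, moon_png):
    # What Samsung is NOT doing: discarding the capture and
    # returning a pre-stored texture.
    return moon_png.copy()

def detail_enhance(photo, strength=1.5):
    # Illustrative stand-in for a learned enhancement pass. A real
    # system would use a neural network trained on moon imagery;
    # here an unsharp mask plays that role (hypothetical sketch).
    h, w = photo.shape
    p = np.pad(photo.astype(float), 1, mode="edge")
    # 3x3 box blur via shifted slices
    blur = sum(p[1 + dy : 1 + dy + h, 1 + dx : 1 + dx + w]
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    # "Hallucinated" detail = amplified high-frequency residual
    detail = photo - blur
    return np.clip(photo + strength * detail, 0, 255)
```

The key difference: the overlay ignores the input entirely, while the enhancement output still depends on every pixel of the capture, which is why the poster's experiments (e.g. blurring the source) change the result.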
It seems that Samsung wants to make everything you look at Fake-N-Gay(TM).
texasblood 1 point 1.3 years ago
First thing you need is stability.
I have shooting sticks, bipods and tripods.
My S23 Ultra takes very clean pics at 100x zoom.
One can't possibly freehand enough stability for clear shots.