Design & Research | Master Thesis Log 02
#InteractionDesign #AIPhotography #HumanInTheLoop #ResearchJourney #ComputationalPhotography
Since its invention, photography has held a unique promise: the promise of truth. Unlike a painting, which is an interpretation, a photograph was historically seen as an “index”—a physical trace left by light hitting a sensor.
But what happens when the sensor stops recording light and starts predicting it?
In my previous post, I asked if photography is dead. This week, I conducted a deep dive into the Samsung “Space Zoom” controversy. This event is not just a consumer tech scandal; for my thesis, it serves as “Ground Zero” for the ontological shift in image-making. It proves we have moved from capturing the world to generating a statistical average of it.
Part 1: The Experiment
The controversy erupted when Reddit user u/ibreakphotos designed a clever stress test for Samsung’s “100x Space Zoom.” The user hypothesized that the camera’s optics were not actually powerful enough to resolve the moon’s craters, so any detail in the final shot had to come from software.
The Methodology:
- They downloaded a high-res image of the moon.
- They downsized it and blurred it until it was an unrecognizable, glowing white blob.
- They displayed this blob on a monitor in a dark room.
- They stood back and photographed the monitor using the Samsung S23 Ultra.
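The degradation step of this methodology is easy to reproduce with Pillow. The original post’s exact image sizes and blur radii are not public, so the values below are illustrative assumptions, and a synthetic disc stands in for the downloaded moon photo:

```python
from PIL import Image, ImageDraw, ImageFilter

# Stand-in for the downloaded high-res moon photo: a textured disc on black.
src = Image.new("L", (1024, 1024), 0)
draw = ImageDraw.Draw(src)
draw.ellipse((212, 212, 812, 812), fill=230)
for x in range(320, 720, 60):  # fake "craters" as darker spots
    draw.ellipse((x, x, x + 24, x + 24), fill=120)

# Step 2 of the methodology: downsize, then blur until the detail is gone.
blob = src.resize((170, 170), Image.LANCZOS).filter(
    ImageFilter.GaussianBlur(radius=8)
)

# Any crater texture a camera later "recovers" from this blob cannot have
# come from these pixels: the high-frequency information has been destroyed.
```

Displaying `blob` full-screen on a monitor in a dark room reproduces the remaining steps of the test.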

The hardware limitation: a tiny smartphone sensor cannot defy physics, yet the software claims it can. (Source: Reddit)
The Results:
The phone produced a sharp, detailed image of the moon, complete with craters and surface textures.
This was physically impossible. The source image (the blurred blob on the screen) contained zero texture data. The camera had effectively “hallucinated” the craters: its AI recognized the shape of a moon and synthesized lunar detail learned from its training data, detail that was never present in the scene.
Part 2: The Death of Indexicality
Why does this matter for Interaction Design? Because it breaks the fundamental contract between the user and the tool.
In semiotics, Charles Sanders Peirce classified the photograph as an “index”: a sign that has a direct physical connection to its object, like a footprint in the sand. When you look at a traditional photo, you know that the light actually touched the subject.
The Samsung Moon is no longer an Index. It is a Simulacrum. As the philosopher Jean Baudrillard argued, a simulacrum is a copy without an original. The image on the user’s phone is “hyperreal”—it looks more real than the blurry reality the user actually saw with their eyes, but it has no connection to the physical moment.
The friction lies here:
The User thinks: “I captured this.”
The System knows: “I generated this.”
This creates a gap in agency. The user believes they are the creator, but they are merely the “prompter.” The camera is no longer a tool for documentation; it is a tool for optimization. It prioritizes a “beautiful lie” over an “ugly truth.”
My Perspective: The Case for “Honest Interfaces”
After analyzing this case, I do not believe the solution is to ban AI. Most users do want a clear photo of the moon, even if it is fake. However, from an Interaction Design standpoint, the failure here is not technological—it is ethical.
The Failure of “Silent Substitution”
The interface lied. It presented a generated image as a captured one. My take is that we need to redesign the camera interface to be “Honest.”
My Proposal for Future Research:
We need a UI that distinguishes between “Documentation Mode” (optical truth, flaws included) and “Simulation Mode” (AI-enhanced).
If the user knows they are painting with data, the agency is restored. They become a “Director” rather than a duped consumer. The current design trend of hiding these choices behind a single “Shutter Button” is what I call “Agency Laundering”—the machine takes the credit, but lets the user feel like the artist. My thesis aims to challenge this specific pattern.
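To make the two-mode proposal concrete, here is a minimal data model sketch. The names and structure are my own illustrative assumptions, not any vendor’s API:

```python
from dataclasses import dataclass
from enum import Enum

class CaptureMode(Enum):
    DOCUMENTATION = "documentation"  # optical truth, flaws included
    SIMULATION = "simulation"        # AI enhancement allowed, and disclosed

@dataclass(frozen=True)
class Capture:
    mode: CaptureMode
    ai_steps: tuple = ()  # e.g. ("moon detail synthesis",)

    def user_role(self) -> str:
        """Name the user's actual role, instead of laundering agency."""
        if self.mode is CaptureMode.SIMULATION and self.ai_steps:
            return "director"        # the user curated a generated result
        return "photographer"        # the user captured light
```

An honest shutter would route the 100x moon shot through `Capture(CaptureMode.SIMULATION, ("moon detail synthesis",))` and surface `user_role()` in the gallery, rather than hiding the choice behind a single button.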
Key Questions Arising from this Case:
- Transparency: Should AI-enhanced photos carry a visible watermark or metadata tag indicating “Generative Content”?
- The “Raw” Mode: Is “Pro Mode” the last bastion of authenticity, or is AI seeping into the raw data as well?
- User Consent: Did the user consent to having their blurry moon replaced? Or did the interface assume their intent?
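For the transparency question above, the minimal mechanism is a provenance record saved alongside the capture. Real-world efforts such as the C2PA specification and the IPTC digital-source-type vocabulary define far richer schemas; the sidecar-JSON format below is a hypothetical sketch of my own:

```python
import json
from datetime import datetime, timezone

def provenance_sidecar(optical_exposure, generative_steps):
    """Build a hypothetical sidecar manifest declaring how an image was made."""
    record = {
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "optical_exposure": optical_exposure,  # did scene light reach the sensor?
        "generative_steps": list(generative_steps),  # empty list == documentation
        "label": "Generative Content" if generative_steps else "Documentation",
    }
    return json.dumps(record, indent=2)

# A Space-Zoom-style moon shot would carry an honest label:
moon_manifest = provenance_sidecar(True, ["moon detail synthesis"])
```

The point of the sketch is the contract, not the format: the pipeline, which knows what it generated, declares it; the interface merely has to stop hiding the declaration.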
References (IEEE)
[1] u/ibreakphotos, “Samsung ‘Space Zoom’ Moon Shots are Fake,” Reddit, 2023.
[2] J. Vincent, “Samsung’s Moon photos are fake—but so is a lot of mobile photography,” The Verge, 2023.
[3] J. Baudrillard, Simulacra and Simulation. University of Michigan Press, 1994.
AI Declaration: This blog post was drafted with the assistance of an LLM to structure the theoretical analysis. The research selection, case study choice, and final arguments regarding ‘Indexicality’ are my own.