
Consolidating Memories in the Photographic Field (Part II)

iPhone 5s (Paging Dr. Benjamin)

This post originally appeared at my secret level, just drafts.

I transferred it to this place because my secret level now focuses more strictly on game-based learning, education, and LLM/GPT/AI.

In my first post in this series, I wrote about in-game photography for memorable moments, and about the detrimental effects photos can have on our memories by highlighting these recorded moments over all other moments that might be of equal, or even higher, biographical value to us. And I wrote about how digital photography might alleviate this effect: it enables us not only to record an unprecedented number of moments of our lives, but also to record them at almost any given point in time. Our phone cameras are always with us, ready to record.

Originally, I planned to proceed to aspects of player memory and game-based learning, but I’m still processing these topics. Then, last week, an article in the New Yorker by Kyle Chayka, “Have iPhone Cameras Become Too Smart?”, triggered another artillery exchange in the digital photo quality wars that is worth mentioning in the context of these posts.

Chayka’s argument boils down to this: in contrast to the iPhone 7 camera, later models—iPhone 11–13—digitally manipulate shots so “aggressively and unsolicited” that they often don’t look “natural” but weird, or overprocessed and “over-real,” with glaring editing errors on top.

Backlash was swift, naturally. John Nack (h/t Daring Fireball) put up a post that juxtaposes iPhone 7 and iPhone 12 shots of the same objects at the same locations. And John Gruber argued that the problem “is not that iPhone cameras have gotten too smart. It’s that they haven’t gotten smart enough.”

It doesn’t help that Chayka’s article is not supported by evidence, like, well, photos. Not of editing glitches (we all know how those look), but examples of iPhone 7 shots that look better, or more natural, or more interesting, than corresponding shots from an iPhone 11 or later. For an article that makes such a deep, sweeping argument about digital photography, one could expect some examples to go with it. Perhaps it’s just me.

Also, it doesn’t help that the article’s arguments aren’t well structured and throw wildly different things into the mix, particularly the hypothesis (or assumption) that modern digital photography has a “destabilizing effect on the status of the camera and the photographer” (with reference to Benjamin, of course). That iPhones create “a shallow copy of photographic technique that undermines the impact of the original” and “mimics artistry without ever getting there”—now that’s a tune we’ve heard before.

First of all, for the overwhelming majority of iPhone users, iPhone photography is perfectly fine. And later iPhone models serve their purposes a lot better than the camera and software package of the iPhone 7 did.

Then, if the iPhone’s editing process is too aggressive for what you want to achieve, which is indeed a concern for many professional photographers and artists, you can and should switch to third-party photo apps, especially Halide, which lets you shoot RAW and, from the iPhone 12 models on, Apple’s ProRAW. (Chayka even mentions Halide, but somehow that doesn’t lead anywhere.) And it goes without saying that later iPhones will provide you with better RAW or ProRAW data to work with than an iPhone 7.

Seen in this light, the argument that digital photography is not really “professional photography” and is lacking in “artistry” is at least partly based on the hidden, quite bizarre, and most likely unexamined assumption that the “shot” that comes out of the camera is what makes or breaks a professional photo or a work of art. Just as with photos from non-digital cameras, the editing process is a substantial part of it, despite all the differences between the two.

Adding everything up, Chayka’s article is not well structured; it doesn’t provide evidence for its arguments; and it contains hidden assumptions that are at best dubious and at worst outright untenable.

And I’m not even a fan of iPhone photography! Early on over at Glass, I followed a good number of accounts that shot with iPhone 11+ models. Photographers, that is, who are very good and know what they’re doing. Some are a blast, and I keep enjoying them very much. But with time, I felt the majority of photos from pure iPhone accounts were just not interesting enough for me to stick around. At the same time, I began to follow more and more photographers who shoot film, and not just professionals. (Most of my older photos on Glass are shot with an iPhone 5s, so please ignore these curious sounds of breaking glass that you hear from the back of the house.)

Thus, there are two different but perfectly compatible takeaways.

On one side, following Gruber, iPhone cameras haven’t gotten smart enough yet for the overwhelming majority of iPhone users who aren’t professional photographers or artists. On the other side, newer and smarter iPhone cameras keep providing professional photographers and artists with more, better, and richer “raw” data to work with.

Bonus recommendation: if you want to dive into the nitty-gritty details of photo processing software on the iPhone 13 Pro, this post by Halide designer Sebastiaan de With on “iPhone 13 Pro: The Edge of Intelligent Photography” is what you want to read.


If you have something valuable to add or an interesting point to discuss, I look forward to meeting you on Mastodon!
