The iPhone 13 camera before the iPhone 14 launch: Apple and Android fail to make real camera phones

Photography, as your parents or grandparents knew it, is a dying art.

If 20 years ago the idea of a photo was to capture an important moment in one’s life as authentically as possible, today we live in a different world… Fair enough, not everyone had a camera in 1980, and phones managed to put something that was once accessible to only a few into everyone’s pocket, which is great!

However, as it turns out, in 2022 the world is less concerned with authenticity and more with “making everything better”, whatever that is supposed to mean. Nowadays, images (among other things) are supposed to enhance our reality and make it “cool and interesting”. Your child can have bunny ears, and you can puke rainbows.

But there’s more to this than the Snapchat filters that enhance the look of your photos, and it all boils down to something called computational photography. It’s the “hidden filter” that makes photos taken with your phone look “ready to share online”.

This little experiment will try to show you the pros and cons of modern phone cameras that rely on computational photography, and the phone of choice here is the iPhone 13, one of the most popular phones of the past 10 months.

Before I show you a bunch of “before and after” photo samples, let me make one thing clear: I’m well aware that people love photos that are ready to be shared online. And while I may not be one of those people, I think I know what happened here…

In short, social media has played a huge role in the demand for “Instagram-ready” images (a term we already use in the tech community). Speaking of the ’gram: since its debut in 2010, the photo and video sharing social network has encouraged the use of bold, exaggerated color filters, which people just can’t resist, which of course meant Apple and Android makers would jump on board…

For example, Instagram was the reason Apple felt the need to include a Square photo mode on the iPhone 5S (2013), which has remained part of the iPhone’s camera app for nearly a decade. More importantly, though, this was around the time when iPhone and Android phones started adding photo filters to their camera apps, because Instagram fever had made it clear that people love filters.

And then… we entered the age of what I call “filters on steroids” or “hardcore computational photography”, or “advanced filters”, if you like. The phone that represents the adoption of hardcore computational photography in my mind is the Google Nexus 6P. On that phone, computational photography came (mostly) in the form of something called HDR+.

What HDR+ did was advanced image stacking. HDR+ was part of the post-processing phase of taking a photo on the Nexus 6P / Nexus 5X, and its role was to balance highlights and shadows in high-contrast scenes, one of the biggest challenges phones faced in 2014-2015 (along with their near-total inability to produce usable night photos).
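For the curious, here is a minimal sketch of the core idea behind image stacking, written in Python with NumPy. It assumes the burst frames are already aligned and uses a simple gamma curve as a stand-in for tone mapping; the real HDR+ pipeline also handles alignment, ghost rejection, and far more sophisticated merging, so treat this as an illustration, not Google’s algorithm:

```python
import numpy as np

def stack_and_tone_map(frames, gamma=0.7):
    """Toy HDR-style merge of an aligned burst.

    frames: list of same-size float arrays with values in [0, 1].
    Averaging the burst suppresses sensor noise (especially in the
    shadows); the gamma < 1 curve then lifts the shadows while
    compressing the highlights, balancing a high-contrast scene.
    """
    stacked = np.mean(np.stack(frames), axis=0)      # noise reduction
    tone_mapped = np.clip(stacked, 0.0, 1.0) ** gamma
    return tone_mapped

# Example: three noisy simulated exposures of the same gradient scene
rng = np.random.default_rng(0)
scene = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
frames = [np.clip(scene + rng.normal(0.0, 0.05, scene.shape), 0, 1)
          for _ in range(3)]
merged = stack_and_tone_map(frames)
```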

Anyway, long story short: HDR+ made the Nexus 6P one of the best phones for taking photos. Sure, my bias plays a role in this statement (I never bought a Nexus 6P, but only because I couldn’t afford one), but there’s no denying that the somewhat darker photos taken by Google’s 2015 flagships had a certain appeal to them. Other tech enthusiasts loved them too.

Light, Highlights and Shadows: What Photography Should Really Be About

It wasn’t until about a year ago, when I watched an amazing 24-minute video by David Imel, that I was able to put into words what I felt back when the Nexus 6P and the original Google Pixel cameras dominated the phone camera industry.

To sum up the 24 minutes of storytelling, David draws a comparison between modern computational photography and classical art, all in an effort to explain the importance of light to both photography and painting.

What he tries to explain is that in the early days of photography, the controlling/artistic element in photos was based entirely “on the intensity of highlights and the depth of shadows”, as in paintings. These were used to express feeling and create depth through tonality. This is especially evident in monochrome photography, where light, shadows, and highlights are the only elements that create nuance and perspective.

But, he says, “the computing speed was progressing much faster than the physics changed,” and that seems to be why I dislike so many of the photos taken by the super-powerful iPhone 13 and wish they looked more like the original Google Pixel’s photos.

iPhone 13, Galaxy S22, and Pixel 6 take photos that aren’t true to life, and not always more attractive than the real scene

What we see here is a bunch of photos I took with my iPhone 13 in fully automatic mode. It’s important to note that I didn’t set out to take pictures in order to make my point; rather, the photos the iPhone 13 gave me became the reason for writing this story…

Anyway, the iPhone 13 photos taken in auto mode are on the left, and the same photos after my edits are on the right. I edited them not to my personal taste, but to match the authenticity of the scene at the time (as best I could).

I chose to edit the photos using the iPhone’s built-in editing tools because that’s what most people have access to. Of course, Lightroom would have given me more (and finer) control over the photos’ various properties (which weren’t captured in RAW), but that’s not the idea here.

If you’re curious, what helped me the most in making my iPhone 13 photos look more like the real scene was pulling the Brightness and Exposure sliders down, which tells you that photos taken with modern phones are simply too bright. Then, some Brilliance, Highlight, and Shadow tweaks helped me get a more accurate result.
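If you want a rough mental model of what those sliders do, here is a toy sketch in the same NumPy style as above. This is not Apple’s actual editing pipeline; the function names, knee thresholds, and curve shapes are assumptions chosen purely for illustration:

```python
import numpy as np

def lower_exposure(img, stops=-0.5):
    """Exposure is multiplicative in linear light: -0.5 stops = x 2^-0.5."""
    return np.clip(img * (2.0 ** stops), 0.0, 1.0)

def tame_highlights(img, amount=0.3, knee=0.7):
    """Pull down only the bright end, in proportion to how far a pixel
    sits above the knee, leaving midtones and shadows alone."""
    mask = np.clip((img - knee) / (1.0 - knee), 0.0, 1.0)
    return img - amount * mask * (img - knee)

def deepen_shadows(img, amount=0.2, knee=0.3):
    """Push the dark end back down to restore some of the contrast
    that aggressive HDR processing tends to flatten out."""
    mask = np.clip((knee - img) / knee, 0.0, 1.0)
    return np.clip(img - amount * mask * (knee - img), 0.0, 1.0)

# A 'more realistic' edit might chain these, e.g.:
# edited = deepen_shadows(tame_highlights(lower_exposure(photo)))
```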

iPhone 13, Galaxy S22, and Pixel 6 show the problems with modern HDR and computational photography

The results tell me that computational photography on phones today is very much hit or miss.

Sure, some people will like the default output of the iPhone 13, Galaxy S22, and Pixel 6 (the Galaxy also takes very bright photos, while the Pixel’s are incredibly flat) because it’s “shareable”. But even leaving authenticity aside, I would argue that the iPhone’s processing doesn’t make photos look “better” than the actual scene did. Take another look at the samples shown above. Which photos do you like the most, the ones on the left or the ones on the right?

Apple, Samsung, Google & Co. have made some amazing progress in all three areas of taking a photo, thanks to large camera sensors (capture), fast processors including dedicated image signal processors (processing), and ultra-bright, color-accurate displays for viewing your images (display). However, I would argue that, as so often happens, we don’t know when to stop… As it stands, most phone makers are abusing the amazing software and hardware power of the modern phone camera.

Photos and even videos taken with the iPhone 13 and other recent phones often appear too bright, too warm, too flat, and ultimately “lifeless”. Sure, they might capture both highlights and shadows incredibly well, and even turn night into day thanks to Night Mode, but without natural balance and contrast, photos taken with most phones don’t evoke any emotion…

But hey! Looks good on Instagram.

Finally: There is light at the end of the computational photography tunnel, thanks to Sony and Xiaomi

To end on a positive note, there’s a light (pun intended) at the end of the tunnel!

Unlike Apple and Samsung, companies like Sony have always tried to stick to the basics of photography, which is evident from the fact that the Sony Xperia 1 IV has incredible processing power but doesn’t even include a night mode in its camera. The phone also brings the first continuous optical zoom on a modern smartphone, which is as close to “real camera zoom” as we’ve ever gotten. And then, of course, there’s the Xiaomi 12S Ultra, which uses a full 1-inch sensor and some Leica magic to deliver some of the best (if not the best) photos I’ve ever seen come out of a phone camera. Xiaomi and Leica chose to let shadows be shadows, avoid excessive sharpening, and rely on the flagship-grade hardware, which (shock!) produces images with amazing depth and natural detail.

So, I invite Apple, Samsung, and even Google to go back and take a look at the original Pixel; to go back and look at the iPhone 4S (however unimpressive its camera might look today), and to bring back the realism in our photos. I’m sure that with all the extra hardware and software power at their disposal, a touch of authenticity can go a long way!

And you know what? For those who want bright and saturated photos… give them filters!

