What exactly is happening with the iPhone's camera?
The Blind Smartphone Camera Test: https://youtu.be/LQdjmGimh04
MKBHD Merch: http://shop.MKBHD.com
Tech I'm using right now: https://www.amazon.com/shop/MKBHD
Playlist of MKBHD Intro music: https://goo.gl/B3AWV5

http://twitter.com/MKBHD
http://instagram.com/MKBHD
http://facebook.com/MKBHD

- Okay, what exactly is happening with the iPhone's camera? We've done years of blind smartphone camera tests in bracket format, and the iPhone, supposedly one of the premium cameras in the entire smartphone industry, consistently loses in the first round. Then we do a scientific version with 20 million+ votes and it finishes in the middle of the pack. "And yet, Marques, you named it the best overall smartphone camera system for the fourth year running in 2022 and gave it a trophy. What's up with that?" A concerning number of people have started to notice that the iPhone's camera feels like it's taken a few steps back lately, and I agree with them.

I think we should take a closer look at this. So first of all, cameras have come a really long way, to the point where smartphone cameras aren't just cameras anymore. See, back in the day, a camera was basically a sensor that would travel around covered all the time, and when you wanted to take a photo, you would expose that sensitive bit to the environment around it, it would collect the light, and then it would close again.

Then the photo would be a representation of how much light hit each part of the sensor. The better the sensor, the more light information and the better an image you could get. Super simple. These days though, it's turned into a whole computational event. Your smartphone sensor is sampling the environment not once, but often several times in rapid succession, at different speeds.

It's taking that light information and merging exposures together. It's doing tone mapping, noise reduction, and HDR processing, and putting it all together into what it thinks will be the best-looking image. This, of course, is a very different definition of a picture. So now it's not just about having the best sensor that gathers the most light information; it's at the point where software makes a much bigger difference to the way the image looks at the end of the day than anything else.
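To make that a little more concrete, here is a minimal Python sketch of what "merge a burst of exposures, then tone map the result" can look like in principle. The function name, the weighting scheme, and the simple Reinhard-style tone curve are all illustrative assumptions on my part, not how Apple or Google actually implement their pipelines.

```python
import numpy as np

def merge_and_tone_map(frames, exposure_times):
    """Toy exposure merge + tone map (illustrative only, not any vendor's pipeline).

    frames: list of same-shaped float arrays scaled to [0, 1], one per burst shot.
    exposure_times: relative exposure time used for each frame.
    """
    acc = np.zeros_like(frames[0], dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for frame, t in zip(frames, exposure_times):
        # Trust well-exposed pixels more; blown highlights and crushed
        # shadows get a small weight so they contribute less.
        w = 1.0 - np.abs(frame - 0.5) * 2.0
        acc += w * (frame / t)          # normalize each frame by its exposure time
        weight_sum += w
    radiance = acc / np.maximum(weight_sum, 1e-6)
    # Simple global tone curve to squeeze the merged result back into a
    # displayable 0..1 range (a stand-in for real HDR tone mapping).
    return radiance / (1.0 + radiance)
```

In a real phone this kind of merge runs alongside frame alignment, noise modeling, local tone mapping, and semantic adjustments, and those choices are exactly where each company's "look" comes from.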

Next time you watch a smartphone reveal event, for example, keep an eye on all the new additions that get made and just how many of them are pure software. Google basically struck gold when they first started using the IMX363 sensor way back in the day with the Pixel 3's camera, because they got the software tuning with it just right and it was an instant smash hit. So they kept using that great camera combo in every Pixel since then: the 3, the 3a, the 4, the 4a, the 5, the 5a, and even the Pixel 6a.

So year after year of new phones: same sensor, same software tuning combo, because it just worked. If it ain't broke, don't fix it. So when you saw the Pixel 6a win December's scientific blind smartphone camera test, what you saw was a four-year-old sensor and software tuning combo that is still so good that in a postage-stamp-sized comparison of compressed side-by-side images, where you can't really judge sharpness or depth of field too much and are basically just appreciating the basics, this combo absolutely nailed the basics better than anyone else. Now, when the Pixel 6 came along, stay with me, Google finally updated their design and their branding, and they finally changed to a new sensor with this new camera system.
So they go from the tried-and-true 12 megapixel to this massive new 50 megapixel sensor, and it kind of threw a wrench into things. - So it looks to me that the Pixel is over-sharpening. I think the one on the left looks too crunchy. - The camera on the Pixel 6 does have a habit of making things just look HDR-y.

I dunno if there's really a technical term for that. - And if you look at all the photos, it's clear the Pixel is still doing Pixel things. - I think Google's still running all of their camera algorithms at 11, like when they don't need to anymore. - Right now, new phones with much bigger sensors are still processing like their smaller, older ones.

- The basic principle is: they were doing all this processing with the old sensors as if they were not getting a lot of light. Then suddenly they had this massive new sensor which is getting way more light information, but they were still running all of this processing. They would still do high-sensitivity stuff, and then they'd do noise reduction, because if you have high sensitivity, you need noise reduction.

But then, since you're doing noise reduction, you need to do sharpening on top of that to make up for it, and overall you're just doing way too much. And so the photos are literally overprocessed. So this fancy new phone would come out with a new camera system, but you could argue, legitimately, that the older Pixel still took better-looking photos. So Google had to go back to the drawing board and make some adjustments and some updates to the software to dial in this new sensor.
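As a rough illustration of that mismatch, here is a toy pipeline, with made-up parameter values, tuned for a small, light-starved sensor: heavy gain, then heavy denoise, then sharpening to win detail back. Run the same fixed settings on a clean capture from a much bigger sensor and you get the crunchy, over-processed look being described. This is a sketch of the general idea, not Google's or Apple's actual code.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def legacy_small_sensor_pipeline(raw, gain=4.0, denoise_sigma=1.5, sharpen_amount=1.2):
    """Fixed processing tuned for a noisy, light-starved sensor (values are invented).

    raw: grayscale float array in [0, 1]. Applied to a clean capture from a
    bigger sensor, the same heavy denoise + sharpen pass first destroys real
    detail and then exaggerates whatever edges are left.
    """
    img = np.clip(raw * gain, 0.0, 1.0)                    # boost as if underexposed
    denoised = gaussian_filter(img, sigma=denoise_sigma)   # smear away noise (and fine detail)
    detail = denoised - gaussian_filter(denoised, sigma=denoise_sigma)
    # Unsharp mask: add the high-frequency difference back, amplified,
    # to compensate for the detail the denoiser just removed.
    return np.clip(denoised + sharpen_amount * detail, 0.0, 1.0)
```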

It took a while, but now with the Pixel 7 out, a full year later with the same huge 50 megapixel sensor, they're back on track. And hey, would you look at that: Pixel 7 right behind the Pixel 6a in the blind camera test. So when I see iPhone 14 Pro photos looking a little inconsistent and a little overprocessed right now, I actually see a lot of the same stuff that Google just went through with the Pixel.

Because the iPhone story is kind of along the same lines: they used a small 12 megapixel sensor for years and years and years. Then the 13 Pro sensor got a little bigger, but this year, the iPhone 14 Pro is the first time they're bumping up to a dramatically larger 48 megapixel sensor. And so, guess what? Some iPhone photos this year are looking a little too processed. It's nothing extreme, but it's real, and they will have to work on it. I suspect that by the time we get to the iPhone 15 Pro, you know, a year later, they'll have some new software stuff they're working on.

And I bet there's one new word they use on stage. You know, we already have Deep Fusion and pixel-binning and all this stuff; I bet there's one new word they use to explain some software improvement with the camera. But anyway, I think this will continue improving with software updates over time, they'll continue to get it dialed in, and I think it'll be fine.
But that's only half my theory. This does not explain why all the previous 12 megapixel iPhones also lost in the first round in all those other bracket-style tests. And this is a separate issue that I'm actually a little more curious about, because as you might recall, all of our testing photos have been photos of me. Now, this was on purpose, right? We specifically designed the tests to have as many potential factors to judge a photo as possible.

If it was just a picture of this figurine in front of a white wall, the winner would probably just be whichever one's brighter, maybe whichever one has a better gold color, basically. But then if we shoot the figurine with some falloff in the background, now we're judging both color and background blur. Maybe you add a sky to the background; now you're also testing dynamic range and HDR. So yeah, with our latest photo, it's a lot.

It's two different skin tones. It's two different colored shirts. It's some textures for sharpness, the sky back there for dynamic range, short-range falloff on the left, long-range falloff on the right. I mean, with all these factors, whichever one people pick as a winner ideally is closer to the best overall photo.

I also wanted the pictures to be of a human, just because I feel like most of the important pictures that people take, and care about most, are of other humans. But as it turns out, using my own face as a subject for these revealed a lot about how different smartphones handle taking a picture of a human face. Because, as I've already mentioned, these smartphone cameras are so much software now that the photo you get when you hit that shutter button isn't so much reality as it is the computer's best interpretation of what it thinks you want reality to look like.

And each company makes different choices and different optimizations, to different degrees, to change how their pictures look. They used to actually be a little more transparent about it. There are phones that would literally identify when you're taking a landscape photo and pump up any greens they can find in the grass, or identify any picture with a sky in it and pump up the blues to make it look nicer.
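In code, that kind of scene-aware boost could look something like the sketch below. The masks, gain values, and function name are hypothetical; the point is just that once a classifier has labeled "sky" or "grass" regions, nudging their colors is a trivial, deliberate software choice.

```python
import numpy as np

def boost_scene_colors(img, sky_mask, grass_mask, sky_gain=1.15, grass_gain=1.15):
    """Toy scene-aware color boost (hypothetical; not any specific phone's algorithm).

    img: H x W x 3 float RGB array in [0, 1].
    sky_mask, grass_mask: H x W boolean arrays produced by some scene classifier.
    """
    out = img.copy()
    out[..., 2][sky_mask] *= sky_gain      # pump up blues where the classifier saw sky
    out[..., 1][grass_mask] *= grass_gain  # pump up greens where it saw grass
    return np.clip(out, 0.0, 1.0)
```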

I did a whole video on smartphone cameras versus reality that I'll link below the Like button if you wanna check it out. But the point is, when you snap that photo on your phone, you're not necessarily getting back a capture of what was really in front of you; they're really bending it in many ways. The iPhone's thing is that when you take a photo, it likes to identify faces and evenly light them.
It tries every time. And so this feels like a pretty innocent thing, right? If you ask people, "What do you think should look good in a photo?" and you say, "Oh, I'll evenly light all the faces in it," that sounds fine, right? And a lot of the time it looks fine, but it's a subtle thing, like in a photo where you can clearly see the light is coming from one side: in the Pixel's photo, there's a shadow on the right side of the face.

With the iPhone though, it's almost like someone walked up and added a little bounce fill, just a really nice, subtle little bounce fill. But sometimes it looks a little off. Like, look, this is the low-light photo test we did from our blind camera test. On the left is the Pixel 7 again, which looks like all the other top dogs.

And on the right is the iPhone 14 Pro that finished in the middle of the pack. It might be hard at first to see why it looks so weird, but look at how they completely removed the shadow from half of my face. I am clearly being lit from a source that's to the side of me, and that's part of reality. But in the iPhone's reality, you cannot tell, at least from my face, where the light is coming from.
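A minimal sketch of what "evenly light a face" could mean in software, assuming a face mask from some detector, might look like this. Everything here, including the name and the strength value, is a hypothetical illustration, not Apple's actual method.

```python
import numpy as np

def even_out_face_lighting(img, face_mask, strength=0.5):
    """Toy digital bounce fill: pull face pixels toward the face's mean brightness.

    img: H x W x 3 float RGB array in [0, 1].
    face_mask: H x W boolean array marking the detected face region.
    Pushing every face pixel toward the same brightness flattens the shadow
    side and the lit side, which is why the light direction disappears.
    """
    luma = img.mean(axis=-1)
    target = luma[face_mask].mean()
    lift = np.where(face_mask, (target - luma) * strength, 0.0)
    return np.clip(img + lift[..., None], 0.0, 1.0)
```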

Every once in a while you get weird stuff like this, and it all comes back to the fact that it's software making choices. And the other half of that is skin tones. You've heard me say for a few years in a row that I mostly prefer photos coming from the Pixel's camera, and we've done lots of tests with me as the sample photo where you can tell it looks really good.

Turns out Google's done this thing over the past few years with the Pixel camera called Real Tone. It doesn't get that much attention, but it turns out to be making a pretty big difference here. Historically, a real issue for film cameras back in the day was that they were calibrated for lighter skin tones, and people with darker skin tones would typically be underexposed in those pictures. Now, fast forward to today, and cameras are all software.

Smartphone cameras are software, so they can all make adjustments to account for a variety of skin tones, of course. But they still all do it to varying degrees. You might have noticed a lot of phones sold in China will just brighten up faces across the board, because that's what people in that region very often prefer in photos. Google goes the extra mile to train their camera software on data sets that have a large variety of skin tones, to try to represent them correctly across the board.

And that's what they're calling Real Tone. And Apple's cameras, from what I've observed, simply like to evenly light faces across the board and don't necessarily account for the different white balances and exposures necessary to accurately represent different types of skin tones, when I think they totally could. So basically, it turns out this is a big part of what we were observing: the Pixel and a lot of the other phones that do accurately represent my skin tone finished higher in this blind voting thing we did because they happen to do that really well. And that's something people really considered when they voted.
I haven't said this a lot, but I think this is one of the earliest reasons I actually really liked RED cameras: obviously 8K is great, the color science is great, but the way they represent and render my skin tone accurately, over a lot of the Sonys and ARRIs and Canons that I've tried, is one of the things that really drew me to those cameras. So all this software stuff is why photo comparisons between modern smartphones are so hard. There are a lot of channels that do a really good job with side-by-side photo tests, but even as you're trying to pick one over the other, you've probably noticed this: you might like the way one of them renders landscape photos, but prefer the way a different one renders photos with your own skin tone,

and then the way a third one renders photos of your pet, for example. So I'm sure Apple will defend everything they're doing now with their current cameras, as they typically do. But I'm going to keep an eye on what I'm also sure of, which is that they're for sure working on tuning these new cameras, dialing them in, and eventually getting it better with the iPhone 15 and 15 Pro.

So back to the original question from the beginning of the video, we can't leave that unanswered: "All right, the Pixel 6a, you like the Pixel photos, Marques, it won the blind scientific camera test, but you still gave the trophy for best overall camera system to the iPhone, the very 14 Pro we've been talking about this whole video. Why?" And if you've listened carefully, you've already got it: that scientific test we did tested one specific thing. It tested the small, postage-stamp-sized, exposure-and-colors general thing with a bunch of different factors, but sharpness and detail, with all the compression we did, weren't tested. Speed and reliability of autofocus weren't tested. The open-close time of the camera app, how fast and reliably you can get a shot, wasn't tested.

And also, video wasn't tested. So the microphone quality, video quality, speed and reliability of autofocus there, file formats, sharpness, HDR, all that stuff, wasn't tested. Maybe someday we will test all that, but until then, the lesson learned is that the pretty pictures coming from the Pixel, or whatever phone's in your pocket, are partly photons, but primarily processing. Thanks for watching.

Catch you guys in the next one. Peace.

By MKBHD
