this post was submitted on 01 Dec 2023
260 points (82.2% liked)


A U.K. woman was photographed standing in a mirror where her reflections didn't match, but not because of a glitch in the Matrix. Instead, it's a simple iPhone computational photography mistake.

top 50 comments
[–] [email protected] 222 points 11 months ago (5 children)

This story may be amusing, but it's actually a serious issue if Apple is doing this and people are not aware of it because cellphone imagery is used in things like court cases. Relative positions of people in a scene really fucking matter in those kinds of situations. Someone's photo of a crime could be dismissed or discredited using this exact news story as an example -- or worse, someone could be wrongly convicted because the composite produced a misleading representation of the scene.

[–] [email protected] 46 points 11 months ago (1 children)

I see your point, though I wouldn't take it that far. It's an edge case that has to happen within a very short window.
Similar effects can be achieved with traditional cameras via rolling shutter.
If you're only concerned about the relative positions of different people within a time frame, I don't think you need to be that worried. Being aware of it is enough.

[–] Odelay42 57 points 11 months ago (2 children)

I don't think that's what's happening. I think Apple is "filming" over the course of the seconds you have the camera open, and uses the press of the shutter button to select a specific shot from the hundreds of frames that have been taken as video. Then, some algorithm appears to assemble different portions of those shots into one "best" shot.

It's not just a mechanical shutter effect.
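The frame-selection idea described above can be sketched in a few lines (everything here is a toy illustration: the 1-D "frames", the sharpness metric, and the function names are all made up, since Apple's actual pipeline is not public). Score each region of a burst for sharpness and stitch the winners together, which is exactly how different moments can end up in one photo:

```python
def sharpness(strip):
    """Toy sharpness score: sum of absolute differences between
    neighboring pixels (a crude stand-in for a real focus metric)."""
    return sum(abs(a - b) for a, b in zip(strip, strip[1:]))

def composite_best_regions(burst, regions):
    """For each region (start, stop), copy pixels from whichever frame
    scores sharpest there, so the output can mix different moments."""
    out = [0] * len(burst[0])
    for start, stop in regions:
        best = max(burst, key=lambda f: sharpness(f[start:stop]))
        out[start:stop] = best[start:stop]
    return out

# Two 1-D "frames" of the same scene, each detailed in a different half.
burst = [
    [0, 0, 9, 1, 9, 1, 5, 5],  # detailed left half
    [5, 5, 5, 5, 9, 0, 9, 0],  # detailed right half
]
photo = composite_best_regions(burst, [(0, 4), (4, 8)])
print(photo)  # [0, 0, 9, 1, 9, 0, 9, 0]: halves taken at different times
```

The composite looks sharper than either input frame, but no single instant it depicts ever existed.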

[–] [email protected] 10 points 11 months ago (1 children)

I'm aware of the differences. I'm just pointing out that similar phenomena have been discussed ever since rolling-shutter artifacts became a thing. It still only takes milliseconds for an iPhone to finish taking its plethora of photos to composite. For the majority of forensic use cases it's a non-issue, imo. People don't move quickly enough to change relative positions substantially irl.

[–] Odelay42 14 points 11 months ago (1 children)

Did you look at the example in the article? It's clearly not milliseconds. It's several whole seconds.

[–] [email protected] 8 points 11 months ago* (last edited 11 months ago) (1 children)

You don't need a few whole seconds to put an arm down.

Edit: I should rephrase. I don't think computational photography algorithms would risk compositing photos that are whole seconds apart. In well-lit environments, one photo needs only 1/100 of a second or less to expose properly. Using photos that are temporally too far apart risks objects moving too much in the frame, and thus failing to composite.

[–] Odelay42 6 points 11 months ago (6 children)

There are three different arm positions in a single picture. That doesn't happen in the blink of an eye.

The camera is taking many frames over a relatively long time to do this.

This is nothing at all like rolling shutter, and it's very obvious from looking at the example in the article.

[–] LifeInOregon 7 points 11 months ago

Those arm positions occur over the course of a fluid motion in a single second. How long does it take for you to drop your hands to your side or raise them to clasped from the side? It doesn’t take me more than about half a second as a deliberate movement.

[–] [email protected] 6 points 11 months ago

It takes you several seconds to move your arm? I hope you don’t do manual work.

Also did you use the iOS camera app before? You can see how long it takes for the iPhone to take multiple shots for the always-on hdr feature, and it isn’t several seconds.

[–] [email protected] 30 points 11 months ago (3 children)

It should be. All computational photography has zero business being used in court

[–] Decoy321 18 points 11 months ago (1 children)

We might be exaggerating the issue here. Fallibility has always been an issue with court evidence. Analog photos can be doctored too.

[–] [email protected] 4 points 11 months ago (1 children)

Sure, but smartphones now automatically doctor every photo you take. The person who took the photo might not even know it was doctored and may believe it represents the truth.

[–] [email protected] 17 points 11 months ago* (last edited 11 months ago) (1 children)

All digital photography is computational. I think the word you're looking for is composite, not computational.

[–] NotSoCoolWhip 3 points 11 months ago (1 children)

Unless the dude is saying only film should be admissible, which doesn't sound all that bad.

[–] [email protected] 8 points 11 months ago

Film is also subject to manipulation in the development stage, even if you avoid compositing, e.g. through dodging and burning. Photographic honesty is an open and active philosophical debate that has been going on since the medium's inception. It's not like you can really draw a line in the sand and blanketly say admissible or not, although I'm sure established guidelines would help. Ultimately, it's an argument about the validity of evidence that needs to be made on a case-by-case basis, with the manipulations involved fully identified and accounted for in those discussions.

[–] [email protected] 9 points 11 months ago

With all the image manipulation and generation tools available to even amateurs, I'm not sure how any photography is admissible as evidence these days.

At some point there's going to have to be a whole bunch of digital signing (and timestamp signatures) going on inside the camera for things to be even considered.
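The in-camera signing idea could look roughly like this sketch (the key name and record format are hypothetical, and a real scheme, like the C2PA "Content Credentials" effort, would use public-key signatures rather than a shared secret): bind a hash of the sensor data to a capture timestamp so that any later edit, or any change to the timestamp, invalidates the signature.

```python
import hashlib
import hmac
import time

DEVICE_KEY = b"secret-key-burned-into-camera"  # hypothetical per-device key

def sign_capture(pixels: bytes, timestamp: float) -> dict:
    """Bind a hash of the raw sensor data to the capture time with an
    HMAC; editing the pixels or the timestamp breaks verification."""
    payload = hashlib.sha256(pixels).hexdigest() + f"|{timestamp:.3f}"
    tag = hmac.new(DEVICE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"timestamp": timestamp, "hmac": tag}

def verify_capture(pixels: bytes, record: dict) -> bool:
    payload = hashlib.sha256(pixels).hexdigest() + f"|{record['timestamp']:.3f}"
    expected = hmac.new(DEVICE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["hmac"])

raw = b"\x01\x02\x03"                    # stand-in for sensor data
rec = sign_capture(raw, time.time())
print(verify_capture(raw, rec))          # True
print(verify_capture(raw + b"!", rec))   # False: edited image fails
```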

[–] [email protected] 26 points 11 months ago (1 children)

I'm still waiting for the first time somebody uses it to zoom in on a car number plate and it helpfully fills the plate in with some AI bullshit that reads as something else entirely.

We've already seen such a thing with image compression.

https://www.zdnet.com/article/xerox-scanners-alter-numbers-in-scanned-documents/

[–] ook_the_librarian 24 points 11 months ago (4 children)

This was important in the Kyle Rittenhouse case. The zoom resolution was interpolated by software. It wasn't AI per se, but because a jury couldn't be relied upon to understand a black-box algorithm and its possible artifacts, the zoomed video was disallowed.

(this in no way implies that I agree with the court.)
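For illustration, here is the simplest possible version of what interpolated zoom does (plain 2x linear interpolation; the software at issue in the case would have used something more sophisticated): half the output pixels are values the sensor never recorded.

```python
def upscale_2x_linear(row):
    """Linear interpolation on a 1-D scanline: every other output pixel
    is an average the sensor never measured -- invented, plausible detail."""
    out = []
    for a, b in zip(row, row[1:]):
        out += [a, (a + b) / 2]
    out.append(row[-1])
    return out

captured = [10, 50, 30]                 # what the sensor actually recorded
zoomed = upscale_2x_linear(captured)
print(zoomed)  # [10, 30.0, 50, 40.0, 30]: 30.0 and 40.0 were never captured
```

Fancier resamplers (bicubic, learned super-resolution) only increase how much of the result is inference rather than measurement.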

[–] Jarix 11 points 11 months ago

This isn't an issue at all; it's a bullshit headline. And it worked.

This is the result of shooting in panorama mode.

In other news, the sky is blue

[–] [email protected] 204 points 11 months ago (4 children)
[–] ComradePorkRoll 71 points 11 months ago

Preventing people from perpetuating clickbait "journalism" is so punk rock.

[–] [email protected] 37 points 11 months ago (1 children)

Damn, this photo is weirdly unsettling to me

[–] [email protected] 12 points 11 months ago

I'm totally getting Black Swan vibes.

[–] [email protected] 13 points 11 months ago (1 children)

Not even a mistake, this is unavoidable if you move during a panorama. iPhones can't pause time. Cool photo tho

[–] slaacaa 63 points 11 months ago* (last edited 11 months ago) (1 children)

Uhm, ok?

The way the girl’s post is written, it’s like she found out Apple made camera lenses from orphans’ retinas (“almost made me vomit on the street”). I assumed it was well known that iPhone takes many photos and stitches the pic together (hence the usually great quality). Now the software made a mistake, resulting in a definitely cool/interesting pic, but that’s it.

Also, maybe stop flailing your arms around when you want your pic taken in your wedding dress.

[–] Jarix 4 points 11 months ago

When have panorama photos ever not done weird stuff?

[–] [email protected] 45 points 11 months ago (6 children)

Who wants photos of a fake reality? Might as well just AI generate them.

[–] LifeInOregon 47 points 11 months ago (1 children)

Generally the final photo is an accurate representation of a moment. Everything in this photo happened. It’s not really generating anything that wasn’t there. You can sometimes get similar results by exploiting the rolling shutter effect.

https://camerareviews.com/rolling-shutter/

It’s not like they’re superimposing an image of the moon over a night sky photo to fake astrophotography or something.

[–] [email protected] 23 points 11 months ago (7 children)

A photo is a fake reality. It's a capture of the world from the perspective of a camera that no person has ever seen.

Sure we can approximate with viewfinders and colour match as much as possible but it's not reality. Take a photo of a light bulb, versus look at a light bulb, as one obvious example.

This is just one more way in which different parts of a photo end up captured at slightly different times, in exchange for a better overall capture of what we want to see in a photo.

[–] [email protected] 17 points 11 months ago* (last edited 11 months ago)

To their credit, it's not "fake". This isn't from generative AI, this is from AI picking from multiple different exposures of the same shot and stitching various parts of them together to create the "best" version of the photo.

Everything seen in the photo was still 100% captured in-lens. Just... not at the exact same time.

[–] [email protected] 36 points 11 months ago (3 children)

Seriously? She almost vomited because the photos didn’t match? Give me a fucking break!

[–] [email protected] 30 points 11 months ago* (last edited 11 months ago)

I’m pretty sure that was just a joke.

[–] [email protected] 23 points 11 months ago

The woman in question is a comedian.

[–] [email protected] 31 points 11 months ago (1 children)

Ah yes, I remember noticing it would make like a short video instead of one picture, back when I had an iPhone. I turned that function off because I didn't see the benefits.

[–] [email protected] 15 points 11 months ago (12 children)

That’s not what this is. I also turned that off, it’s called “Live Photo” or something like that. Honestly I find it to be a dumb feature.

What this is, is the iPhone taking a large number of images and stitching them together for better results.

[–] [email protected] 14 points 11 months ago (3 children)

It's a really cool discovery, but I don't know how Apple is supposed to program against it.

What surprises me is how much of a time range each photo has to work with. Enough time for Tessa to put down one arm and then the other. It's basically recording a mini-video and selecting frames from it. I wonder if turning off things like Live Photo (which retroactively starts the video a second or two before you actually press record) would force the Camera app to select from a briefer range of time.

Maybe they could combine facial recognition with post-processing to tell the software that if it thinks it's looking at multiple copies of the same person, it needs to time-sync the sections of frames chosen for the final photo. It wouldn't be foolproof, but it would be better than nothing.
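That suggestion could look something like this sketch (all the names and the tile model are hypothetical; a real compositing pipeline is far more involved): reject any composite where tiles containing the same face were pulled from different burst frames.

```python
def consistent_composite(region_sources, face_tiles):
    """region_sources maps each tile of the final image to the burst
    frame it was copied from; face_tiles lists the tiles where the same
    face was detected. Hypothetical rule: those tiles must all share one
    source frame, otherwise the composite mixes moments and is rejected."""
    return len({region_sources[t] for t in face_tiles}) <= 1

# Tile -> source-frame index chosen by the (imaginary) compositor.
tiles = {"left": 0, "mirror1": 3, "mirror2": 7}
print(consistent_composite(tiles, ["mirror1", "mirror2"]))  # False: mixed
print(consistent_composite(tiles, ["left"]))                # True
```

In the wedding-dress photo, the two mirror reflections and the direct view would all be "same face" tiles, so this check would have forced them onto a single frame.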

[–] xantoxis 36 points 11 months ago (2 children)

Program against it? It's a camera. Put what's on the light sensor into the file, and you're done. They programmed it to make this happen, by pretending that multiple images are the same image.

[–] ninekeysdown 3 points 11 months ago (11 children)

That’s over simplified. There’s only so much you can get on a sensor at the sizes in mobile devices. To compensate there’s A LOT of processing that goes on. Even higher end DSLR cameras are doing post processing.

Even shooting RAW like you’re suggesting involves some amount of post processing for things like lens corrections.

It’s all that post processing that allows us to have things like HDR images for example. It also allows us to compensate for various lighting and motion changes.

Mobile phone cameras are more about the software than the hardware these days
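As a concrete example of that kind of processing, here is a naive per-pixel exposure-fusion (HDR-style) blend. This is a toy sketch, not Apple's or any camera's actual algorithm: each 8-bit pixel is weighted by how far it sits from clipping, so each exposure contributes where it holds usable detail.

```python
def fuse_hdr(short_exp, long_exp):
    """Naive exposure fusion: weight each 8-bit pixel by its distance
    from clipping (0 or 255), then take the weighted average."""
    out = []
    for s, l in zip(short_exp, long_exp):
        ws = 1 - abs(s / 255 - 0.5) * 2   # ~0 when crushed or blown out
        wl = 1 - abs(l / 255 - 0.5) * 2
        total = (ws + wl) or 1            # avoid division by zero
        out.append((ws * s + wl * l) / total)
    return out

short_exp = [10, 120, 200]   # dark frame: keeps highlights, crushes shadows
long_exp  = [80, 250, 255]   # bright frame: lifts shadows, blows highlights
print([round(v) for v in fuse_hdr(short_exp, long_exp)])  # [72, 125, 200]
```

Note that even this trivial merge requires frames captured at different instants, which is exactly where motion artifacts like the mirror photo creep in.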

[–] [email protected] 11 points 11 months ago (3 children)

With a DSLR, the person editing the pictures has full control over what post processing is done to the RAW files.

[–] [email protected] 13 points 11 months ago

I may have missed this in the comments already, but it's really important to note that the article says the photo was taken using panorama mode, which is why the computational photography thing is even an issue. If you have ever used panorama mode, you should go in expecting some funkiness, especially if someone in the shot is moving, as the bride apparently was when it was shot.

[–] satans_crackpipe 12 points 11 months ago

Stop posting Apple advertisements.

[–] orion2145 10 points 11 months ago

There’s a note at the end of the article that says it was take using pano. So this is doubly unsurprising. Despite the instagram caption reading it wasn’t.
