If everything’s edited and AI-generated, how do we know what to believe anymore? This is my take as an Arizona-based photographer.
We’re living in a time where scrolling through social media or reading headlines often leaves us wondering: “Is this even real?” That question isn’t just rhetorical anymore; it’s one we genuinely need to ask.
Recent coverage of Jeff Bezos’ high-profile wedding, for instance, included images that were either AI-generated, heavily edited with AI tools, or at the very least, manipulated to an extent that raised eyebrows. And while that may seem like tabloid-level gossip, it actually points to something much deeper and more alarming: our growing inability to trust what we see.
As a photographer and media strategist, I’ve felt this shift acutely. When the average consumer can no longer differentiate between real and fabricated, between editorial and sponsored, we’re treading on dangerous ground. I want to break down how AI and media manipulation are affecting our perception, our trust, and the ethics of creative work.
Trust is eroding as AI use increases
People who specialize in detecting AI-generated imagery were quick to analyze the Bezos wedding photos, identifying common AI mistakes like odd lighting, distorted hands, and uncanny smoothness. But beyond the technical anomalies, the bigger question is: Why were these images manipulated at all? What was the intention? When content is overly edited or outright fabricated, it creates distrust, and not just in the media covering the story, but in all media.
Even I’ve had my photos accused of being “too perfect” or manipulated, despite being known for minimal editing. I prefer to spend time upfront, getting the lighting and composition right, because frankly, I’m lazy when it comes to retouching. But even with that approach, I still face suspicion. And trust me, I get it! We’re at a point where trust is eroding so fast that people question even the most straightforward images.
Sponsored content and what these blurred lines mean
This extends far beyond personal photos or social media. Let’s talk about sponsored content, because that’s another area where blurred lines create confusion. Sponsored content is when someone pays to appear in a media outlet, but instead of running a standard ad, it’s designed to look like editorial content. It’s supposed to be clearly labeled as sponsored, but many media companies blur this distinction. Financial pressures and shrinking revenues have led to a rise in this tactic on television, online, and in print.
I highly recommend watching the Last Week Tonight segment on sponsored content: it shows just how easy it is to get a fake product promoted on legitimate news outlets. The public often can’t tell the difference, and that’s the danger.
When I worked as a photojournalist, we were strictly forbidden from accepting even a CD from a record label because it could compromise our editorial integrity. Cropping a photo too much could be considered unethical if it changed the context. That was the standard. Today, many of those lines have all but disappeared.
Where should we draw the line on editing?
In editorial photography, I hold myself to a standard: editing should not change the context of the photo. Yes, I’ve swapped expressions or arm positions on magazine covers, but I draw the line at altering body shapes or airbrushing someone to the point where they no longer resemble themselves. I once shot a major fitness influencer who wasn’t happy with my minimal retouching: they wanted something more “perfected,” and when I declined, they edited the photos themselves beyond recognition. I didn’t work with them again.
But when it comes to commercial photography such as advertising or product shots, I’m more flexible. Those images are meant to sell something, and everyone understands that there’s artistic license involved. I’ll move items, brighten products, clean up backgrounds, because that’s not pretending to be journalism.
AI and fakes abound
AI isn’t just affecting photography. I was recently pitched a book by a publicist. The book was over 700 pages, riddled with errors, and clearly structured by an AI: headlines followed by matching subheads in a way that screamed “non-human.” Even the Amazon reviews looked fake, with the publicist herself leaving the top review.
All of this contributes to a much bigger issue: the crumbling of consumer trust. Influencers have long been accused of portraying lifestyles they don’t live, achievements they haven’t earned, and now, with AI, the deception is getting even harder to spot.
So what do we do about it?
I don’t have a perfect solution, but I know this is a conversation we need to be having. We need transparency. If AI tools are used in media or advertising, that should be disclosed. Just like influencers are required to tag #sponsored (though let’s be honest, no one’s really checking those hashtags in 2025), there should be clear labeling when AI editing has been used.
If you’re so worried that admitting to AI use will make people distrust you, maybe you shouldn’t be using it in the first place. The Federal Trade Commission (FTC) is technically supposed to enforce advertising and sponsorship disclosure rules, but it’s severely underfunded and understaffed. That means the burden falls on us (creatives, marketers, photographers) to set and uphold our own standards.
Personally, I’m not ashamed to say I use AI tools for things like flyaway hair or background cleanup. I’ll even use it as a selling point: “Here’s what I can do with AI, and here’s what I can do without it.” It’s about honesty and context.
Taking responsibility
The problem we’re experiencing with AI goes so much further than just photography or media. It’s actually a looming cultural problem. As AI tools become more powerful and accessible, our responsibility to use them ethically becomes more urgent. We must be more intentional about how we present our work and how we discern the work of others.
And I want to hear from you. What do you think the future of media integrity looks like? What lines should we draw, and, elephant in the room, who should be drawing them?
To hear more about this, check out this episode of the Beyond the Image podcast.