

This photo isn’t real, but you wouldn’t know it by the way it swept across social media this month, with tens of thousands of people genuinely thinking they were looking at spy photos of two new Volvo cars and cheering the automaker’s return to boxy, retrofuture designs. Then came the counter-response: thousands more commenters declaring that it’s obviously an AI-generated image, and how could anyone be dumb enough to think otherwise? But the thing is, it’s actually neither of those. And the excitement and anger surrounding the conversation are an ominous sign of how AI can mess up even the smallest social interactions, like two car enthusiasts looking at a picture together.
The images are in fact renders created by digital artist Jordan Rubinstein-Towler, who published them on his Instagram page back on January 28. He’s been making sketches and 3D renders for years, mainly fan-service stuff like redoing the new Acura Integra to look more like the third-gen one. The two Volvos are just his latest project. He’s been having fun with the rollout, mocking them up in design sketches and camouflaged spy photo scenes in Instagram posts over the last six months.
How did they turn into a moment of mass misinformation? I’ll walk through it, but the big point is that relying on AI—to create or solve or filter the world around us—kills our already diminishing critical thinking skills. That’s not (just) my POV, it’s the conclusion of a recent study by Microsoft. (You know, the company investing $80 billion into AI this year.) As society struggles to adapt to these new technologies, problems will pop up in places we don’t expect. Like upsetting the delicate balance that previously made amateur 3D renders fun distractions for the car community, not the source of false rumors and deep frustration.



Over the last decade, more powerful graphics cards and the spread of open-source modeling software led to a real cottage industry for creators of fake car renders. Artists “fix” cars that got bad redesigns (think the fight against BMW’s ever-expanding kidney grilles), make outlandish dream builds, or use actual spy photographs to mock up future models. Those with talent can earn a big following and decent cash.
Until recently, renders haven’t caused a lot of issues. When one broke containment and found a new audience who didn’t know what it was, confusion was snuffed out pretty quickly. At the same time, you could argue that Jordan, like many artists, at least sorta hoped people would think they were real photos, if only for the views. His original caption, “Volvo 240 sedan and kombi high performance prototypes spotted at the Nürburgring,” sure isn’t helping. It’s fine if you’re in on the running joke on his profile, but without that context, it adds to the new problem today.
Because now, the right render breaking containment is met by an audience (especially on Facebook) that’s being deluged with pure AI-generated content, struggling to separate it from reality, and relying on an algorithm that prioritizes it to determine what they see next. It’s a spark in a bone-dry forest.
That’s what happened here. Jordan’s renders were quickly screenshotted by someone—whether they believed the caption or not is irrelevant—and uploaded to Facebook without its crucial context. Users latched onto them, thinking they were real and spreading them across dozens of enthusiast groups. Others immediately jumped in to declare that they were generative AI nonsense, but those fact checks couldn’t catch up with the number of people falling for them every second and excitedly sharing them further. By the time the dust settled, after the images had already leapt to Reddit and other platforms, they were presented as real to millions of people. How many just moved on with their lives believing something fake?

Again, this is not a new dynamic. The speed is really what’s different now, for something as innocuous as a 3D model of a car. You could argue that incorrectly believing Volvos are going to look a certain way isn’t going to ruin anyone’s life and this is all ultimately harmless. But the response of people saying it’s AI and getting legitimately angry with those who couldn’t tell shows why it’s not. Before, when there was confusion over a render, the usual correction was to just point it out. “Uh, that’s a render, guys.” It’s subtle, but when you say that, you’re acknowledging that this is something created by a person, who might not have meant to fool anyone, and it’s more on the viewer to keep that in mind. But when you say, THAT’S AI, ARE YOU KIDDING ME IDIOTS?, you’re saying that this is worthless content made just to trick people, and God help you if you fell for it.
I get why people thought it was AI, because most things that look like these two images are AI today: the too-precise shadows, weird depth of field effect, and incongruous body lines on the cars feel like tells. But that is the other side of the coin. Not only has AI content made it hard for less digitally savvy folks to tell what’s real, but it’s also set up those who are on guard with a new default assumption. If something isn’t real but looks like it’s trying to be, it’s AI. The idea of someone spending many hours building a 3D scene as a piece of art isn’t top of mind anymore. That person is erased from the conversation, as is the humanity of the person you’re judging for thinking it’s real. It’s no longer communication, it’s combat.
I’m not gonna get into what that means for society; this is a car blog. In our world, though, I think one consequence is that renders are going to cause more rough moments like this one, and the car community and industry are going to have to learn how to navigate them. And artists, whether they’re innocent bystanders or not, are going to see people writing off their work as AI in the aftermath, as people did on Jordan Rubinstein-Towler’s original post.
One last look at Jordan’s Instagram shows how things have shifted in just the last two years. In January 2023, he was in the middle of another rollout campaign for his retro redesign of the Acura Integra. That month, he posted a similar set of “spy shot” renders showing a partly-camo’d car in traffic with another factual-sounding caption. Over 13,000 likes, thousands of interactions, and almost as much algorithmic action as his ill-fated Volvo post last month. And not a single response from the time debating its origins. People knew it was a render, because what else would it be? Except last week, in the wake of all this, a curious user found their way to the two-year-old post. They weren’t sure if what they were seeing was real. They left a comment: “Is this AI?”
Got a tip? Send us a note: tips@thedrive.com