Puppets for Sale

Director and illustrator: Lanikai Yatomi Writer: Jo Carstensen

Most of the time, when people think of artificial intelligence running the world, images from “The Terminator” or “Blade Runner” pop into their heads: drones patrolling the skies of a world dominated by AI. What people miss is that AI already runs today’s world.

TikTok, the nouveau video-based social media platform, led the way in 2018 with an AI content algorithm that builds a distinctly catered “For You” page for each individual user. “It’s so addicting, I love getting such niche videos,” says Bela Lohman, a junior at Oregon State University. After only a few short months of using the app, Lohman says she’s already noticed patterns in the types of videos the app pushes her way.


“I think TikTok knows such specific things about me like it’s so personalized it’s kinda scary. I started out on skater TikTok and it very quickly added in surfing which I also do but then somehow it figured out I was a coast surfer, like versus a Southern California surfer, even though I didn’t interact or anything with those videos.”

Lohman isn’t the only one who’s noticed the unsettlingly quick evolution of AI algorithms, but the real problem is rooted much deeper. Former CEOs and presidents of Facebook, Instagram, and even Pinterest have started expressing ethical concerns about how the developing AI algorithms now running most social media platforms could have serious consequences.

As the time we spend on technology increases, the programs get better and better at catering to the individual, reinforcing the already deafening social media echo chamber. The negative mental health effects, polarization, fake news, and conspiracy-theory rabbit holes these programs create are concerning enough. But they only get worse as the AI goes from catering to your interests to predicting them.
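The feedback loop described above can be sketched in a few lines. This is a hypothetical toy model, not any platform’s actual code: each round the feed serves the topics the user has watched longest, and watching what is served adds more watch time, which feeds straight back into the next round’s ranking.

```python
# Toy sketch of a self-reinforcing feed (hypothetical, not TikTok's algorithm).
def rank_feed(history, catalog, k=2):
    """Return the k topics with the most accumulated watch time."""
    return sorted(catalog, key=lambda t: -history.get(t, 0.0))[:k]

catalog = ["skating", "surfing", "cooking", "politics"]
history = {"skating": 5.0, "surfing": 4.0, "cooking": 1.0, "politics": 1.0}

for _ in range(10):
    for topic in rank_feed(history, catalog):
        # Watching a served video adds watch time, which the next
        # round's ranking then rewards: the echo chamber in miniature.
        history[topic] += 1.0

print(rank_feed(history, catalog))  # → ['skating', 'surfing']
```

After ten rounds, the two topics that started with a slight edge have pulled far ahead, and the other topics never surface again.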


“[What] most people don’t realize is these social media companies are competing for your attention,” says Tristan Harris, co-founder of the Center for Humane Technology, in an interview for “The Social Dilemma,” a Netflix documentary. Formerly a design ethicist at Google, Harris explains that for companies like these, revenue is based almost solely on user interactions. “Their business model is to keep people engaged,” he explains. Meaning that in a capitalist society where everything is monetized, keeping you on your phone, and more specifically on their app, is their number one priority.

This is where tech addiction comes into play: there is still no monetary incentive for companies to get their users, especially the younger ones, to reduce their screen time.

For the last 10 years, the biggest tech companies in Silicon Valley have outgrown merely selling software or programs and have upgraded to the business of selling targeted ad placement. This is something that can only be done with access to lots of data: data that show who plays piano, who clicks on weight-loss videos, who rarely reads the news, and who spends less than three seconds on basketball highlights before skipping to football.
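As a rough illustration of how that kind of data becomes ad placement, here is a minimal sketch, assuming a simplified model where watch time stands in for interest. The function names and the topic labels are invented for the example; real ad-targeting systems are vastly more sophisticated.

```python
# Hypothetical sketch: turning logged interactions into an interest
# profile, then ranking ads against it. Not any real company's system.
from collections import defaultdict

def build_profile(events):
    """events: list of (topic, seconds_watched) pairs logged for one user."""
    profile = defaultdict(float)
    for topic, seconds in events:
        # Longer watch time is treated as stronger interest.
        profile[topic] += seconds
    total = sum(profile.values()) or 1.0
    # Normalize so each topic holds a share of the user's attention.
    return {topic: t / total for topic, t in profile.items()}

def rank_ads(profile, ads):
    """ads: dict of ad name -> topic it targets. Highest-interest first."""
    return sorted(ads, key=lambda ad: profile.get(ads[ad], 0.0), reverse=True)

events = [("piano", 40), ("weight-loss", 120), ("news", 2), ("basketball", 3)]
profile = build_profile(events)
ads = {"keyboard_sale": "piano", "diet_app": "weight-loss", "sports_tickets": "basketball"}
print(rank_ads(profile, ads))  # → ['diet_app', 'keyboard_sale', 'sports_tickets']
```

Even in this toy version, the weight-loss clicks put the diet ad at the top of the queue, which is exactly the kind of inference advertisers are paying for.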

Many of the apps we download to our phones don’t require a monthly payment or a one-time purchase, just a simple username and password. That’s because tech giants no longer need users’ money; they now deal in something far more valuable than currency: your time.

Justin Rosenstein, a former engineer at Facebook and Google, breaks down this concept in his own interview for “The Social Dilemma.”

“When you think about how some of these companies work, it starts to make sense,” Rosenstein explains. “They’re all these services on the internet that we think of as free but they’re not free. They’re paid for by advertisers. Why do advertisers pay those companies? They pay in exchange for showing their ads to us. We’re the product, our attention is the product being sold to advertisers.”

There’s a classic line: if you’re not paying for the product, you are the product.

Jaron Lanier, a founding father of virtual reality and author of “Ten Arguments for Deleting Your Social Media Accounts Right Now,” explains that this manipulation isn’t a quick, sudden change. “It’s the gradual, slight, imperceptible change in your own behavior and perception of the product,” he clarifies. But he admits that “If you can go to somebody and say, ‘Give me $10 million, and I will change the world 1% in the direction you want it to change,’ that’s still a lot of people.”

Everything from political campaigns to weight-loss programs could buy one percent of the world’s opinions, either in favor of their cause or in opposition to a competitor. When it comes to targeted ads, the real question isn’t when this will affect us, but what it already has.
