First they listened to your conversations. Now they want to anticipate your thoughts. I’ve disconnected everything except the toaster, and I’m not sure about the toaster.
They announced it like it was good news. Google’s Gemini can now tap into something called a “Shopping Graph” and use “agentic technology” to call stores on your behalf. OpenAI’s ChatGPT has struck deals with Walmart, Target, and Etsy to let you buy products “directly within chatbot interactions.” They’re calling it convenience. I’m calling it AI shopping surveillance dressed up in a friendly interface. Big difference.
Let me explain what’s actually happening, because the tech press won’t. Every time you ask a chatbot for product recommendations, you’re not getting help. You’re building a profile. A psychological fingerprint of your desires, your insecurities, your weaknesses. You ask for running shoes, they know you’re trying to get healthy. You ask for sleep aids, they know you’re stressed. You ask for gift ideas for your wife, they know your marriage needs work. This is surveillance, not assistance.
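Want to see how cheap this is to build? Below is a toy sketch of my own, completely hypothetical, not anyone’s actual system, showing how a handful of keyword rules turns innocent questions into a profile. The real platforms use machine learning that’s orders of magnitude more sophisticated. That’s the point: if I can sketch it in twenty lines, imagine what they can do.

```python
# Hypothetical illustration only: not Google's, OpenAI's, or anyone's real code.
# The point is how little machinery it takes to turn shopping questions
# into a psychological profile. The real systems do far more than this.

QUERY_INFERENCES = {
    "running shoes": ["trying to get healthy"],
    "sleep aid": ["stressed", "sleeping badly"],
    "gift for my wife": ["worried about the marriage"],
    "headphones": ["asks AI instead of researching"],
}

def build_profile(queries: list[str]) -> list[str]:
    """Accumulate inferred traits from a user's shopping queries."""
    profile: set[str] = set()
    for query in queries:
        for phrase, traits in QUERY_INFERENCES.items():
            if phrase in query.lower():
                profile.update(traits)
    return sorted(profile)

if __name__ == "__main__":
    # Three innocent questions from one chat session.
    session = [
        "What are the best running shoes under $100?",
        "What's a good over-the-counter sleep aid?",
        "Need a gift for my wife, anniversary is tomorrow",
    ]
    print(build_profile(session))
    # ['sleeping badly', 'stressed', 'trying to get healthy',
    #  'worried about the marriage']
```

Three questions, four inferred traits. Now multiply that by every session, every user, every day.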
The Pieces Are Connecting
I’ve been tracking this since 2019. Back then, the smart speakers were just listening. Then the phones started predicting. Now the chatbots are *anticipating*. The trajectory is clear if you’re paying attention. They want to know what you want before you consciously want it. That’s not AI. That’s pre-crime for consumer behavior.
My neighbor’s kid showed me how ChatGPT works last month. He typed in a question about headphones. Within seconds, it had recommendations, prices, links. He thought this was amazing. I thought: they now know he wants headphones, his budget, his preferences, and that he’s the type of person who asks AI for advice instead of doing his own research. That’s a complete psychological profile from one interaction.
The MIT Technology Review article said “consumer time spent chatting with AI keeps rising” while “web traffic from search engines and social media continues to plummet.” You know what that means? It means they don’t need to track you across the web anymore. You’re going directly to them. You’re volunteering the information. You’re training the algorithm to know you better than you know yourself.
What They’re Building
I talked to my guy at the phone store—the one who still sells the flip phones I trust. He said even he’s getting pressure to push the “smart” devices. The company wants everyone on the AI platforms. He didn’t know why. I do. They want everyone in the system. Every purchase, every question, every moment of uncertainty cataloged and cross-referenced.
Here’s what that same article mentioned almost in passing: AI companies are now dealing with lawsuits about “what their chatbots encourage people to do.” Think about that. The chatbots are encouraging behavior. They’re not neutral. They’re not just answering questions. They’re shaping decisions. Your decisions. About what to buy, what to think, what to want.
I still shop in person. Cash only. I drive to the store. I look at products with my actual eyes. I make decisions without algorithmic input. My wife says this takes longer. My wife also has six different shopping apps on her phone that know more about her preferences than I do after 34 years of marriage. There’s a lesson there.
The Path Forward
You want my advice? Don’t use the chatbots for shopping. Don’t use them for anything that reveals what you want or need. Every query is a data point. Every data point builds the profile. Every profile makes you more predictable. And predictable people are controllable people.
I know how this sounds. My daughter says I “see patterns that aren’t there.” But I saw the pattern with smartphones before everyone had one. I saw the pattern with social media before it started influencing elections. I saw the pattern with smart speakers before we all had microphones in our kitchens. The pattern is always the same: convenience first, control later.
The AI wants to shop for you. It wants to call stores for you. It wants to know your size, your style, your budget, your insecurities. It wants to be helpful. Helpful is just the first phase. What comes after helpful is always the same: indispensable. And what comes after indispensable? I’ve got theories. None of them are good.