What The Tech: Why artificial intelligence still can’t think like Santa

BY JAMEY TUCKER, Consumer Technology Reporter
If you’re using ChatGPT, Alexa, or Gemini to come up with the perfect Christmas gifts, you’re not alone.
A growing number of people are putting little thought into what they give; they just ask AI.
Here’s the thing:
I tried it. I asked ChatGPT for gift ideas for my wife. I added that she’s busy, practical, and
appreciates useful gifts. It confidently gave me a list that included a high-powered robotic vacuum cleaner and a
personal finance book about budgeting.
Are you kidding me? You could almost hear it say, “Good luck with that.”
What we don’t realize is that to come up with answers, ChatGPT and other AI tools study billions of examples: product reviews, blog posts, and shopping trends. They use all of that to predict what someone like your wife might want. It’s connecting data points, not relationships.
That’s not to say it doesn’t do a solid job of brainstorming. But to get the best results, you need to be very specific. Include what they enjoy, any hobbies, even what they don’t like.
What’s their favorite vacation spot? If they have a favorite book, include it in the prompt. The more details, the better.
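If you like to tinker, that advice boils down to packing those details into one prompt before you send it. Here’s a minimal sketch in Python; the helper and its field names are my own invention, not part of any AI tool:

```python
def build_gift_prompt(recipient, likes, hobbies, dislikes, favorite_book=None):
    """Assemble a detailed gift-idea prompt; the more specifics, the better."""
    parts = [f"Suggest Christmas gift ideas for {recipient}."]
    parts.append("They enjoy: " + ", ".join(likes) + ".")
    parts.append("Hobbies: " + ", ".join(hobbies) + ".")
    parts.append("Avoid anything related to: " + ", ".join(dislikes) + ".")
    if favorite_book:
        parts.append(f"Their favorite book is {favorite_book}.")
    return " ".join(parts)

# Example: everything the article suggests, in one detailed prompt.
prompt = build_gift_prompt(
    "my wife",
    likes=["cooking", "travel"],
    hobbies=["gardening"],
    dislikes=["cleaning", "budgeting"],
    favorite_book="Pride and Prejudice",
)
print(prompt)
```

Paste the resulting prompt into whichever chatbot you prefer. The point isn’t the code, it’s the habit: spell out likes, hobbies, and dislikes every time instead of asking for “a gift for my wife.”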
AI is great at patterns. It can predict what people buy, but it can’t understand why they buy it.
So the next time you unwrap something that feels a little impersonal, or oddly specific, don’t blame the giver. Blame the algorithm. And save the receipt.