On one of the last coffee outings before the hospitality business shut down in my city, I had been taking part in one of those hypochondriac conversations where you end up listing every terrible illness you could possibly have. When I got home and opened the Twitter app, an awareness announcement for a type of cancer was waiting for me.

A screenshot of the ad got laughs on WhatsApp, and there was at least an explanation for why I was seeing it: it was that cancer's awareness day, so campaigns were to be expected. Still, this was not the first time something like this had happened to me. Were the social network's targeting systems working too well? (Considering that for a while I was chased around by an ad inviting me to become a police officer in the UK, perhaps not so much.)

Even so, however coincidental it was, receiving that ad right after that conversation was deeply unsettling. It felt excessively invasive, as if Twitter had been listening to me and waiting to serve up ads for what, judging from my conversations, I seemed to fear most.

Internet companies have repeatedly denied, both actively and passively, that they spend their days listening to what we say in order to serve us the ads that best match our interests (it would be a nightmare for them under European data protection law and, they usually add, would require an overly ambitious technological capacity). Even so, the idea that they listen to us in order to sell us things is one of the great modern fears.
The phrase "Facebook listens to our conversations and what we say" returns no fewer than 1,470,000 results on Google. But although companies are not listening to us, and do not use what we say in private to decide that now is the moment to sell us baking supplies or a new mattress, it is true that they hold more and more data about us. With it they can better match what might interest us to who we are, and in doing so they are getting ever closer to being too unsettling with their ads. Instead of clicking on the remarkably well-matched thing they are trying to sell us, the knee-jerk reaction is rejection. It is too close, and therefore too disturbing.

When your room appears in an advertisement

One example of this is what a journalist from Vice magazine recounts: he had just seen "his" bedroom in an ad on Instagram.
It wasn't actually his room, but at first glance it could pass for it: the same bed, the same sheets, the same nightstand. "We have never seen ourselves so reflected in an ad," he writes in the piece about how he and his girlfriend reacted to the message from the loungewear brand trying to sell them something through their Instagram feed. Facebook explained that it was a coincidence. And, as the same article explains, there are two clear reasons why someone might see their own room in something that really wasn't it.

On one hand, there is the associative work the brain does. It foregrounds what you already know or what you have been thinking about most lately (it is the same mechanism that makes a car model seem to be everywhere, and wildly popular, right after you buy one). On the other, there is the data and advertising business.
Everything feels so close because companies have managed to accumulate vast amounts of data on how we buy, getting to the essence of who we are and what we want. It is not that they have actually seen your room; they have made an informed guess at what your room probably looks like.

Burning out the consumer

In theory, this brutal closeness should work better. Personalized ads mean consumers receive only what they really want and are interested in, filtering out everything that is irrelevant or of little value to them. However, getting the message too right can have the opposite effect: it can be negative for the brand, because it feels excessively invasive of privacy and burns out the consumer.