

When we think about Facebook, most of us picture that friendly social network where we post our vacation pictures and check what our favorite celebrities have said lately. But over time, Facebook has turned into a real Marketing Beast, capable of unimaginable things.

Most people didn't understand Google's constant struggle to create and own a social network, over and over again. There were so many projects, all of them failed, shut down, or without much success: Orkut, Dodgeball, Buzz and, last but not least, Google+. People always wondered why Google kept pushing all these products, most of them, in their opinion, unwanted and unneeded. They mocked and laughed at Google every time a new social project appeared. They got angry when Google linked Google+ to YouTube, and they refused to even give it a chance.

But Google knew why it was doing all that. Google understood the importance of social networks, what they mean, their potential. The company that generates the biggest part of its profits through marketing and advertising understood what a successful social network would mean.

But, unluckily for them, their efforts were unsuccessful. Unlucky for Google, lucky for Facebook. Facebook was able to "crack" the code and turn its network into the most powerful marketing machine ever created. We already touched on this subject briefly in the past, when we talked about Internet Security. But due to some current events, we're forced to take a deeper dive into it.

Facebook evolved in a very interesting way over time. At the beginning it was a simple social network with simple functions, but everything started to change when they understood its marketing potential. And everything skyrocketed with the introduction of the Reaction buttons. That was the next level.

Facebook has a different way of gathering information about its users than Google does. While Google uses lots of activity trackers and records your search history, among other things, Facebook knows a LOT more. Not by tracking or spying, but in a much easier way: its users "share" all that information with Facebook willingly.

From the beginning, Facebook knew as much about its users as they were willing to share with it: name, location, education level, job and workplace. Then they started tracking likes and shares. Then came image tagging, which made them able to "read" your pictures and "see" what they contain: people, things or places. They also started tracking your news feed scrolling. Every time you stopped scrolling to "read" a story, Facebook considered that you "showed interest" in that topic. And then, there are the Reactions.

Facebook, the (evil) Marketing Beast

Facebook gathers all this data and uses it to create what it calls "data points". But we will call them Filters, so it's a bit easier to understand. To make it even clearer, let's build a live example and explain exactly how everything works. To simplify the process further, we'll use just a small part of the filters, in a less complex way. So, Mr. Random Userman has just joined Facebook.

As soon as he opens the site, some information automatically reaches Facebook, like the type of device he's using, the operating system, his IP-based location and a few other things. The first part of the Filter would look like this:

Samsung Owner > Android User > German IP


The Information You Share Willingly

Random Userman has just finished creating his account and gets to the profile completion part. Besides your name, which Facebook requires to be your real one, and your age, there isn't much information you're actually required to share with the network. But people share it anyway, willingly.

So Random Userman would say that he is a male, born in 1981, that he lives in Berlin, Germany, that he went to university and that he works in the IT sector as a database engineer. So far, Random Userman's Filter would look like this:

Male > In the 31-40 Age Range > Lives in: City/Berlin/Germany > High Education/University > IT Employee


The Likes & Shares

The next day, RD (as we'll call Random Userman from now on) likes Real Madrid's fan page and Bill Nye's page, and also shares a Popular Science article on his timeline. Therefore, we can add these new elements to the Filter:

Football Fan > Real Madrid Fan > Science and Technology Fan


All nice and normal so far, right? It's just simple sociology and old-fashioned classification, nothing fancy. Marketing has always used that; it's the foundation on which it was built. So, if someone wants to sell something to Real Madrid fans who live in German cities and, thanks to their jobs, can afford to buy above-average stuff, RD would fit perfectly into that target. Basic marketing with very good accuracy. But there's more.

The Image Scans

The next day, RD uploads an album of photos. Facebook scans those photos and recognizes that RD has a dog, that he likes to drink a specific brand of beverage (let's call it "Ceko") and that he has a few pictures of himself with the Eiffel Tower. So now, new things join the Filter:

Dog Owner > Drinks Ceko > Visits Paris


Tracking the News Feed Scroll

After adding some friends and following some pages and famous people, RD starts to get a full News Feed. And while scrolling through the boring stuff, some headlines capture his attention. So he stops to check them before continuing to scroll, but he doesn't engage with them in any way. The Filter gets enriched with new info:

Interested in Donald Trump > Interested in 3D Printing > Interested in Stock Market Prices


The Reactions

Facebook radically upgraded its Filter creation process when it introduced the Reaction buttons. While before people could only tell Facebook what they "Like", users can now also say what they Love, what they Hate, what makes them Happy or Sad. Why is this such a big deal, you may ask? Well, it is, because Facebook can very easily build emotional leverage which it can use to control or influence people's responses to certain matters. But more on that later.

From his Reactions, RD’s Filter will get more details:

Loves Cristiano Ronaldo > Hates Donald Trump > Laughs at funny cats videos > Angry at Yulin Dog Meat Festival

Again, this is a simplified example of how the Filters work. Facebook uses a very large number of "data points" here, and probably only they know exactly how many and in what way. So, from what we've learned so far, RD's Filter, put together, looks like this:

Samsung Owner > Android User > German IP > Male > In the 31-40 Age Range > Lives in: City/Berlin/Germany > High Education/University > IT Employee > Football Fan > Real Madrid Fan > Science and Technology Fan > Dog Owner > Drinks Ceko > Visits Paris > Interested in Donald Trump > Interested in 3D Printing > Interested in Stock Market Prices > Loves Cristiano Ronaldo > Hates Donald Trump > Laughs at funny cats videos > Angry at Yulin Dog Meat Festival

*Insert Facebook WOW Reaction here* This is amazing! And now imagine that Facebook creates data points from everything you're doing (or even NOT doing) on its network. EVERYTHING. Every scroll, every like, every post, every picture, every reaction, every comment, every click. Everything matters, everything gets counted, analyzed and stored in the database. So it's safe to say that at some point, Facebook knows more about you than you and all your friends know combined. And that's plain scary, if you ask me.
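To make the idea of a Filter a bit more concrete, here is a minimal sketch of how such a profile of data points could be accumulated from different signals. It's purely illustrative: the user name, tag names and structure are assumptions made for this article, not Facebook's actual data model.

```python
# Purely illustrative sketch: a "Filter" as a growing set of tags per user.
# The tags and signal sources are invented for this example; Facebook's real
# "data points" are far more numerous and not publicly documented.

from collections import defaultdict

filters = defaultdict(set)  # one growing set of tags per user id

def add_data_points(user_id, tags):
    """Attach new tags (data points) to a user's Filter."""
    filters[user_id].update(tags)

# Signals collected automatically on the first visit
add_data_points("random_userman", {"Samsung Owner", "Android User", "German IP"})

# Information shared willingly in the profile
add_data_points("random_userman", {"Male", "Age 31-40", "Lives in Berlin",
                                   "University Educated", "IT Employee"})

# Likes & shares, image scans, scroll tracking, Reactions...
add_data_points("random_userman", {"Real Madrid Fan", "Dog Owner", "Drinks Ceko",
                                   "Angry at Yulin Dog Meat Festival"})

print(" > ".join(sorted(filters["random_userman"])))
```

Every new interaction simply adds more tags to the same profile, which is why the Filter only ever gets sharper over time.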

Why scary? Because it becomes pretty easy to manipulate people. A team of specialists will find it easy to push people in the direction they need if they know exactly what excites them and what their weaknesses are. It becomes very easy to build predictability patterns by studying people's behavior. And even easier to target your campaign with bullseye precision, thanks to the Filters.

Let's take a hypothetical example. But please keep in mind that this is ONLY a scenario and nothing more, unless proven otherwise. We've learned previously from Random Userman's pictures that he likes to drink Ceko. Let's say that Ceko has a competitor named Popsi, and Popsi wants to steal some of Ceko's customers and make them its own. To do so, they create a nice advertising spot whose main character is a charming dog.

And of course, Popsi will want to target its audience of Ceko drinkers, and especially Ceko drinkers who are dog owners, since the latest spot is very dog-friendly. Basic marketing, yes. But what if Popsi doesn't stop there and wants to go into the dark zone? What if they decide to plant some fake "news" stories on some platforms claiming that the people at the Yulin Dog Meat Festival have started boiling dogs in Ceko because it makes the meat more tender?

Here's how it would go. The people at Popsi create the articles and post them on some dodgy websites. Then they promote the articles on Facebook to Dog Owner > Ceko Drinker > Angry at Yulin Dog Meat Festival people, as sketched below. First, ad bots or fake account farms pick them up and spread them further, then they go viral, then people start sharing them, and sooner or later, one way or another, they reach Random Userman. Random Userman, who is already a Dog Owner > Ceko Drinker > Angry at Yulin Dog Meat Festival person.
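Conceptually, picking such an audience is just a matter of matching a campaign's required tags against each user's Filter. Here is a minimal sketch under the same assumptions as before (toy users, toy tags, nothing resembling Facebook's real ad tools):

```python
# Toy audience selection: match a campaign's required tags against user Filters.
# Users, tags and the matching rule are invented for this sketch; real ad
# targeting systems are far more elaborate.

filters = {
    "random_userman": {"Dog Owner", "Drinks Ceko", "Angry at Yulin Dog Meat Festival",
                       "Real Madrid Fan", "Lives in Berlin"},
    "other_user":     {"Cat Owner", "Drinks Popsi", "Lives in Munich"},
}

def build_audience(required_tags, filters):
    """Return the user ids whose Filter contains every required tag."""
    return [uid for uid, tags in filters.items() if required_tags <= tags]

# Popsi's hypothetical campaign targets Ceko-drinking dog owners upset about Yulin
campaign = {"Dog Owner", "Drinks Ceko", "Angry at Yulin Dog Meat Festival"}
print(build_audience(campaign, filters))   # -> ['random_userman']
```

The point is that the hard part is not finding the audience; the Filters hand it over in a few lines. The hard part is only deciding what to do with it.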

Of course, the article is fake and it will ring some bells, but by the time Ceko responds properly, by the time people find out the truth, the harm will be done, the damage will have taken place. And we've already learned that Facebook (and/or Google) doesn't yet know how to distinguish fake news from real news properly and quickly enough.


And the options are pretty much unlimited. Replace the dogs with cats, write articles that say "Cat owner? A new study finds that even a single sip of Ceko could kill your cat", promote them to the right audience and you get the same effect. Yes, Ceko will say it's not true, fake news, blah blah blah, but some people will believe it. And it would have an effect even on the people who fact-check, at a subconscious level, if it's done "by the manipulation book": by associating something you like with something you completely hate. In this case, your favorite beverage with the thought of losing your beloved pet.

And the worst part of all this is that if Facebook really wants to, it can make this look completely real. Imagine these links don't come from ad bots or strangers, and don't appear as promoted stories in your timeline, but instead come from your friends or friends of friends (of friends), real people with whom you have something (more or less) in common, arranged in a way that doesn't look at all suspicious or untrustworthy. This way, for most people it becomes close to impossible to tell real news from fake news.

OK, you may say, this is just marketing, one company using every available tool to win customers from its competition, nothing special, nothing new. True, but what happens when it goes beyond that? When it's not a company trying to sell beverages, but a group of people with shady interests who want to manipulate others for their own benefit? Or, even worse, when a country tries to influence and manipulate the people of other countries into acting against their own interests?

Due to recent events, we've learned that Facebook has a real problem handling that. And, more worryingly, an even bigger problem reassuring us that it will do a better job of preventing it in the future. Here's the now famous exchange between Colin Stretch, vice president and general counsel at Facebook, and Al Franken, a US senator, during the congressional hearings on the Russian ads. Pretty scary, right?

But by far the most troubling aspect is the ease with which you can manipulate people through Facebook. The Filters make it EXTREMELY easy. Do you want to sell a new car to cat owners who live in two-room apartments, like the color red, eat sweet tofu and hate zebras? You got it! Do you want to spread your message only to 42-year-old airplane pilots who like London, listen to dubstep, drink cherry milkshakes and drive a green sedan? Your audience is a few clicks away.

With that in mind, a good strategy team can create the perfect message, one that can reach and touch any individual. ANY individual. Including you, even if right now you think something like that would be impossible. If you know people's fears and desires, the job is half done. The other half is just crafting the message the right way.

To show that they somehow care, Facebook did create a place where you can see all your ad-related information. You can find it here. But can we really trust Facebook to actually honor those settings? Because it's so easy for everything to be done in a way you wouldn't even be aware of. Everything would look nice and normal, coming from real people. What's more, Facebook can always "blame" the algorithm and say that everything you see in your news feed is there because once, at some point in time, you showed interest in that subject. And if you check your Interests list (they call it Ad Preferences), you'd be amazed at what you find there.


And Facebook has already proved its manipulation power. Remember the emotional contagion study? In 2014, in a completely shocking and mind-blowing manner, Facebook admitted it had experimented with emotional manipulation on its network. The study, titled "Experimental evidence of massive-scale emotional contagion through social networks", was published in PNAS and was so disturbing that it now carries an "Editorial Expression of Concern and Correction" note at its beginning. Even Facebook admitted in a blog post that it was wrong and that in the future it would try to do things differently.

Long story short, Facebook filtered the news feed content of almost 700,000 of its users, users who had no idea they were participating in the study, either before or after it was completed. For some, the feed was skewed toward positive and happy posts and news; for others, toward sad and negative ones. Of course, the results came out as expected: those who received a "happy" news feed posted "happy" stuff as well, and the same went for the "sad" ones. So we know "officially", from Facebook itself, that manipulation on its network is possible and works extremely well.
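Just to illustrate the mechanics described in the study, here is a toy sketch of what skewing a feed by emotional tone looks like conceptually. The posts and sentiment labels are made up; this is not the researchers' code, and the real experiment filtered posts probabilistically at a vastly larger scale.

```python
# Toy illustration of a sentiment-skewed feed. Posts and labels are invented;
# the real study worked on classified posts from hundreds of thousands of users.

posts = [
    {"text": "Our team won the cup!",       "sentiment": "positive"},
    {"text": "Lost my keys again today...", "sentiment": "negative"},
    {"text": "Best holiday ever",           "sentiment": "positive"},
    {"text": "Feeling really down",         "sentiment": "negative"},
]

def skew_feed(posts, favor):
    """Return the feed with posts of the favored sentiment ranked first."""
    return sorted(posts, key=lambda p: p["sentiment"] != favor)

for post in skew_feed(posts, favor="positive"):
    print(post["text"])
```

The disturbing part is not the code, which is trivial, but the fact that it was quietly applied to real people's feeds and measurably changed what they went on to post.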

So what can we do, in the middle of all this, to protect ourselves? Well, there are a few options, but unfortunately none with real effect. We could shut down Facebook, but besides being a very unpopular option, it would not solve the problem: another social network would take its place in no time. We could also shut down all social networks, but that's really out of the question in our modern, free world.

We could try to use laws to regulate Facebook and the other internet companies better, but what guarantees that they will actually respect those regulations? The European Commission alone has fined the tech giants billions upon billions of dollars, but for them these amounts are ridiculously small. Not to mention how hard it is to discover and prove something like this, as we've just explained above.

We could also try to make people more aware of the dangers they expose themselves to by sharing too much information, but that won't work either. People love to be social, to share and belong, to be part of groups of individuals with the same likes and views as them. And frankly, it's not easy to understand all these things or to care about them, since they don't affect people physically and the effects are hard to see or measure. It's even worse when we talk about subversive things which, most of the time, have only subconscious effects.

So what can we REALLY do? Not much, actually. We can only hope that the people in charge, those with power over all this, know how to handle the situation properly. That they will resist the temptation to use this gigantic power for their own benefit. That they have an idea of how to stop and ban others from using their tools in harmful ways. Unfortunately, so far, they have pretty much failed at all of that. *Insert Facebook SAD Reaction here*

What do you think about Facebook and its power? Feel free to join the discussion below.

