MARCH 23 — I refer to the opinion piece by Ms Boo Su-Lyn that you published on 23 March 2018 (“Is Facebook to blame for data mining?”), and would like to register my disagreement with it. Here is my response to some of Ms Boo’s unsubstantiated assertions, dismissals, and lack of understanding of the nature of Facebook and the ongoing saga with Cambridge Analytica.

Boo said – “How can we demand for free things and expect a business to magically sustain itself? It is like asking for quality journalism, but refusing to pay for it and complaining about the state of the press.”

1. We’re not demanding ‘free things’. This may not be apparent to everyone, but when we use the platform, Facebook applies filters that show us the content we are more likely to prefer. This is loosely referred to as content curation, and it is mostly fair, because otherwise we would be overloaded with information.

Less savoury is that this is also done to cater to Facebook’s real customers, the advertisers. Facebook earns most of its revenue through ads, and it has come a long way in building the platform into a massive advertising ecosystem of its own, similar to Google’s ads system.

We aren’t freeloading on Facebook. Facebook needs us in the ecosystem because it sells the reach and impact of our presence to companies that want to sell things to us. The ones paying Facebook are the advertisers who need Facebook’s users to be there. It’s a captive market, and it sells for a lot of money.

So no, Facebook is not free for us. We pay a price for using the platform: our personal information, likes, personalities and buying habits.

2. Psychological warfare is a real thing. Using psychological tactics to sway the sentiments of the masses is a very powerful tool, and in the wrong hands it could be rather devastating.

“But how is that any different from other companies holding focus groups to test their products?”, you ask.

There is a world of difference between political parties, which form the core of a country’s governance, exploiting social media, and companies that simply want to sell you razors on Facebook.

This difference has ethical implications, and it becomes especially apparent when an incumbent government with fat coffers targets masses of voters using Facebook profiling.

Companies use targeted ads to get people to buy things. That, however, does not determine how governments get elected and, therefore, how the country is governed. In case you’re not aware, governance is a pretty serious thing.

Ideally, governments and political parties should be ethical in their politicking, and should not be in positions where they can use taxpayer monies to sway electoral sentiment. Again, in case it isn’t obvious, their agenda is to remain in power.

Furthermore, the playing field is uneven when one political side has more money. I know, that is normal. But it should not be happening when a private entity is questionably thrown into the mix, especially one whose services were exploited despite its obligation to protect its users.

If you can’t tell the difference between a company selling a product for profit and a government using taxpayer monies and exploiting private companies to keep its corrupt self in power, then I think you need to reassess your understanding of ethics.

You may then ask: “What’s the big deal, when political parties have always used standard non-digital means to incite and stir up emotions, and exploiting Facebook merely does the same thing?”

The big deal is that it has made psychological warfare far more insidious and efficient for incumbent governments with the cash flow to specifically target voters and use propaganda to get at their ‘inner demons’.

You dismissed the whistleblower’s statements as spurious and exaggerated, but realistically, what happens is that these companies target and exploit the underlying insecurities and emotional sentiments of large swathes of voters in order to prod them into making a biased decision, such as voting for a particular party.

Except, this is on steroids.

It’s a far more efficient tool for propaganda and for keeping populations afraid by preying on their insecurities. And when it is Malaysia or the US we are talking about, where much of the voting population is deliberately kept undereducated, unsavvy about current affairs and easily triggered emotionally by, say, religious issues, you are looking at a very dangerous situation.

3. I must point out the irony of a self-proclaimed feminist playing the victim-blaming card (i.e. that users should be responsible for their actions). While it is true that people should be wary when using any service (or buying anything), there should be checks and balances in place to prevent the exploitation of people and services where governance and the population are concerned.

I say exploitation because Facebook has a responsibility to keep its users (i.e. its cash cows) happy and secure, especially since Facebook is a personal social media platform for individuals. Its stated ethos makes it clear that it takes the privacy of its users seriously. I’m guessing you didn’t read the ToS either, so let me put it here for you:

How do we use this information?

- Provide, improve and develop services

- Communicate with you

- Show and measure ads and services

- Promote safety and security

Note the last point. Users’ data being used for malicious intent is neither safe nor secure.

4. The accusation is that governments or political parties allegedly spending money (again, I remind you, these are taxpayer monies) to buy the services of Cambridge Analytica is unethical, because the firm harvested users’ personal data through deceptive and misleading means. Again, Facebook has privacy policies in place that specifically outline how this is prohibited:

3 (9) You will not use Facebook to do anything unlawful, misleading, malicious, or discriminatory.

It is clear that Cambridge Analytica acted in a manner that was misleading and, ultimately, malicious. Users were not aware they were giving up important information that could be used for psychological warfare, and they played right into the hands of politicians.

Furthermore, Facebook was aware of it but was not exactly proactive in stopping it early. Heck, many news portals have been reporting on this phenomenon since as early as 2015.

It’s amazing that you have no problem with any of this happening. Mayhap you were ignorant of it all?

But I’ve got to admit, the cherry on top was when you victim-blamed the very users Facebook is obligated to protect.

Is Facebook responsible for data mining? No, of course not; we’re in the 21st century, where big data is a thing (commendable attempt at throwing a red herring, though).

The question should be: “Is Facebook responsible for being negligent about protecting users’ data?”

I think the answer is a resounding “No shit, Sherlock”.

*This is the personal opinion of the writer and does not necessarily represent the views of Malay Mail Online.