
"Inside Facebook" || Facebook Doesn't Like That

A man sits at his computer and scrolls through the Facebook data of a woman he recently went on a date with. Not her public profile, but the intimate information behind it: he can see whom she has been messaging on Facebook, what interests the platform ascribes to her, and he can even track her location in real time. The man is a Facebook employee, and what he is doing is not allowed. But it is possible.

This scene comes from the first chapter of the recently published book Inside Facebook by New York Times journalists Sheera Frenkel and Cecilia Kang. Between January 2014 and August 2015, the Facebook employee was one of 52 staff members who accessed users' intimate information, mostly men looking at the profiles of women they were interested in. And because Facebook had no process to systematically record this kind of data misuse, there were probably more than those 52, the authors write.
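The authors' point is that records of who looked at what simply did not exist. Purely as a sketch of what "systematically recording" internal data access could mean, and with every name and structure invented rather than taken from Facebook, an audit trail might look like this:

```python
# Illustrative sketch of an internal access audit trail. All names and
# structures here are invented; this is not Facebook's actual code.

import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("data_access_audit")

# Stand-in for the real user data store.
_USER_RECORDS = {"user-123": {"name": "Jane Doe", "interests": ["hiking"]}}

def fetch_user_record(employee_id: str, target_user_id: str, reason: str) -> dict:
    # Every internal lookup is recorded with who accessed what, when, and
    # why, so that misuse can later be detected, counted, and investigated.
    audit_log.info(
        "employee=%s accessed user=%s at=%s reason=%r",
        employee_id,
        target_user_id,
        datetime.now(timezone.utc).isoformat(),
        reason,
    )
    return _USER_RECORDS[target_user_id]

fetch_user_record("emp-42", "user-123", reason="support ticket #987")
```

With a log like this, the number 52 would not have to be an estimate; the book describes a system in which such a trail was missing.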

Things have come to a point with Facebook where such a case is hardly surprising anymore. The news fits seamlessly into all the stories that have been written about the social network in recent years: about disinformation on the platform, about attempted election manipulation, about disregard for users' privacy, about security gaps, about abuse of power.

A symbol of everything bad in Silicon Valley

In the past few years, Facebook has become a symbol of everything that is going wrong at Silicon Valley technology companies: the social network that set out to connect people around the world is now seen by many as one that divides society. It wanted to give people access to information; in fact, disinformation often spreads on it unchallenged. And the company behind the platform is still run autocratically, essentially by one person: founder and CEO Mark Zuckerberg (and, to a lesser extent, by his de facto deputy Sheryl Sandberg).

So much has been reported about Facebook that Inside Facebook reads like a summary of all the major scandals of recent years. It is to Frenkel and Kang's credit that they relate Facebook's many problems to one another and unravel in detail how the social network's beginnings contributed to its current image. The book offers a glimpse behind Facebook's already fragile-looking facade. The unsettling thing is that behind it, matters apparently look even worse than one had expected.

Software developers, for example, were able to stalk users as described at the beginning because the system was meant to be "transparent and accessible to all," as the authors write. It dated from the company's early days and had not been questioned over the years; Zuckerberg wanted little bureaucracy for his software developers. According to the book, employees had repeatedly pointed out the problem to the founder, but for a long time he simply ignored it. That only changed when the renowned security researcher Alex Stamos was hired as Chief Security Officer in 2015 and presented figures on the scale of the problem.

Or take the challenge of populism on the platform. Facebook's management apparently knew early on what the social network could expect from Donald Trump's nomination as the Republican presidential candidate; at the latest by December 2015, when he gave an inflammatory speech full of anti-Muslim rhetoric. At the time, the authors describe, Zuckerberg was worried about the post and wanted to examine how it could be removed. But Joel Kaplan, Vice President for Public Policy (or, put more simply, in charge of lobbying), warned that Trump and his supporters could see this as censorship: "Don't poke a wasp's nest," he is quoted as saying.

This early decision was to run through the election campaign as well as the Trump presidency: for fear of appearing partisan (and, later, of regulation), Facebook let Trump get away with many things that actually violated its rules, even if that meant adjusting the guidelines again and again. This zigzag course, which could also be observed from the outside, only ended with the storming of the Capitol in January 2021, when Facebook decided to suspend the president indefinitely. (The suspension was recently limited to two years.)

Or the enormous effects of fake news, which the authors illustrate through the persecution of the Rohingya, Myanmar's Muslim minority. For a long time, the military controlled the country, and with it the flow of information; that only began to change slowly in 2013. Suddenly smartphones and cheap mobile contracts were everywhere. And Facebook. The social network was present there with an app internally called "Blue". It was one of the most popular apps in the country and synonymous with the internet. But, as everywhere, it was not used only by people posting recipes or photos of family life. There were also people posting hatred, religious fanatics for example. Conspiracy theories against the Rohingya were spread via the app, and at some point death threats as well. The situation in Myanmar is one of the most oppressive passages in Inside Facebook.

The authors identify the structure of the platform as a central reason. It was designed to "always pour oil on the fire where a post generated emotions, even if that emotion was hate." Facebook's algorithm rewards interaction; it does not distinguish whether someone likes or comments on a post out of joy or anger, out of love or hate. A video in which a woman tries on a Chewbacca mask, makes a lot of people laugh, and collects a correspondingly high number of likes is treated the same way as a post with anti-Muslim agitation against the Rohingya: the more likes and comments it gets, the higher it is pushed up in users' newsfeeds, which means it gets even more attention.
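The book describes this mechanism only in prose, but a minimal sketch makes the logic tangible. The Python fragment below is an illustrative assumption, not Facebook's actual code: the field names, weights, and the engagement_score function are invented to show how a ranking signal that merely counts interactions cannot tell joy from hate.

```python
# Toy model of engagement-based ranking as the book characterizes it.
# Weights and names are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class Post:
    likes: int      # joyful and angry reactions count the same
    comments: int   # supportive and hateful comments count the same
    shares: int

def engagement_score(post: Post) -> float:
    # Interactions are simply summed with assumed weights; nothing in the
    # score reflects whether the interaction expressed love or hate.
    return 1.0 * post.likes + 2.0 * post.comments + 3.0 * post.shares

# A funny mask video and a hate post with identical interaction counts
# receive identical scores, and therefore identical reach.
mask_video = Post(likes=10_000, comments=1_200, shares=800)
hate_post = Post(likes=10_000, comments=1_200, shares=800)
assert engagement_score(mask_video) == engagement_score(hate_post)
```

The sketch also shows why reach feeds back into more interactions: the score has no notion of content at all, only of volume.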

These decisions (and this is unfortunately neglected in many places in the book) affect users' everyday lives around the world. 2.85 billion people use Facebook's platform every month, more than two billion chat on WhatsApp, and one billion post pictures and videos on Instagram. What content Facebook allows and what it blocks, what data it collects, what its employees can see, how it deals with populists on its platforms, how it deals with fake news: all of this has a direct impact on people and their everyday lives.

Of course, people still decide for themselves what to buy, whom to follow, whom to vote for. But without Facebook they would probably never come across some offers and some ideas. One may find the platform's automated decisions legitimate when they determine which animal videos or trouser ads a user sees. But it gets tricky as soon as the platform weights political or social issues by relevance.

Sole ruler Zuckerberg

What Frenkel and Kang work out is that many of these content problems might well be solvable if the massive structural problems did not exist alongside them. Mark Zuckerberg holds almost 60 percent of the voting rights in Facebook and can therefore make many decisions on his own. In short: what happens on Facebook and its subsidiaries WhatsApp and Instagram is at the boss's discretion. If he changes his mind, the course changes too.

A particularly striking example in the book is the reaction to the US presidential election in November 2020. When it became clear that there might be no clear winner on election day, the team sat down with Zuckerberg and asked him to permit changes to the algorithm. In the following days, more posts from quality media such as the New York Times or the Wall Street Journal were to be displayed. Zuckerberg agreed. For five days, Facebook appeared calmer and more balanced, and employees described the new weighting as a nicer newsfeed. But after the count, once Joe Biden was confirmed as the new president of the USA, the old algorithm was reinstated. The example makes one thing clear: Facebook could be different if it wanted to be. Apparently it doesn't want to.
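Purely as an illustration of what such a temporary reweighting can amount to: imagine a source-quality multiplier layered on top of an engagement score like the one sketched above. The QUALITY_FACTOR table, its values, and the ranked_score function below are hypothetical and not the mechanism Facebook actually used; the point is how easily such a weighting can be switched on, and switched off again.

```python
# Hypothetical sketch of a "quality media" weighting on top of engagement
# ranking. The table and its values are invented for illustration.

QUALITY_FACTOR = {
    "nytimes.com": 2.0,           # assumed boost for established quality media
    "wsj.com": 2.0,
    "unknown-blog.example": 0.5,  # assumed demotion for low-credibility sources
}

def ranked_score(engagement: float, source_domain: str) -> float:
    # Unknown sources default to 1.0, i.e. plain engagement ranking.
    # Reverting to the old algorithm amounts to clearing this table.
    return engagement * QUALITY_FACTOR.get(source_domain, 1.0)

# With the table active, a quality outlet outranks an unknown source that
# generated the same raw engagement; with an empty table, they would tie.
assert ranked_score(100.0, "nytimes.com") > ranked_score(100.0, "unknown-blog.example")
```

In this toy model, the "nicer newsfeed" of those five days is a single configuration choice, which is exactly the authors' point about where such choices sit.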

"Many people see Facebook as a company that has strayed from the right path," the authors write in the introduction, "the classic Frankenstein story of the monster that broke away from its creator." You would see it differently. Her assumption: "We rather believe that the moment they met at a Christmas party in December 2007, Zuckerberg and Sandberg had an inkling of the potential of the company and that it could be transformed into the global power it is today is."
