Facebook: You Don't Get To Profit From My Anger

Posted by outtacontext on November 2, 2021

Listening to Facebook whistleblower Frances Haugen’s 60 Minutes interview in October 2021, I was struck by the simplicity and impact of her allegation: Facebook makes its money from engagement (the more clicks and comments a post generates, the more money the company makes). And what causes the most engagement? Anger. Facebook was making millions of dollars financed by our anger. Haugen revealed that Facebook’s internal reports showed the company’s algorithms promoted political discord and anti-vaccination rhetoric, both domestically and internationally (in 2018, Myanmar’s military used Facebook to incite the Rohingya genocide). Company insiders warned Mark Zuckerberg, but he chose profit over the safety and well-being of his users. And despite numerous appearances before Congress, he consistently misled our legislators and us.

I use Facebook to stay in contact with friends. I also moderate a cultural and political page as part of my work on the Chamomile Tea Party. Over the last decade I’ve created more than 230 posters that chronicle the devolution of American political discourse. I don’t have to tell you: Americans are more polarized than ever. Donald Trump’s presidency and power were built on that divide. The Republican Party’s kowtowing to him, both during his tenure and even now, has created high levels of vigilance and anxiety. No matter where we stand on the political spectrum, we’ve had little power to do anything about it except yell at each other. Our anger was, and continues to be, palpable.

Trump’s defeat (even as he dangles a 2024 run for the presidency) has given us some room to breathe and to distance ourselves, if ever so slightly, from the precipice. But how have we fallen so far? How did we lose sight of what many believe is American Exceptionalism (a term I consider a fabricated national myth)?

As parents of two young adults, my wife and I have found that the “terrible twos” had nothing on the clueless early twenties. At 18, our daughters were legally adults. But they had little experience being adults. And with their prefrontal cortices still developing, they rarely asked for help or listened when we offered our expertise. Fair enough. I didn’t listen to my parents at that age either. But our challenge in helping them navigate adulthood is complicated by the fact that our twenties were so different from theirs. We cannot compare our pasts to their experiences as digital natives.

As a technologist and a former teacher of technology, I never taught my students the philosophical and moral underpinnings of the net. In the late 1990s, teachers focused on using programs like Photoshop and PageMaker, not on how to be good netizens. We didn’t have to. There was no need. Not yet.

At the beginning of the internet, the opportunity to meet new people and discuss ideas was a major attraction for me. As a teenager, I had pen pals in countries around the world, and I saw the internet as an extension of that interest and my curiosity. As an artist, I saw the opportunity to bypass the impediments of gallery representation and the art market and bring my work to new audiences. But at a “town hall” back then, hastily organized to discuss a Washington Post article bemoaning DC’s lackluster arts community, I warned my fellow artists that we needed to guard this new resource. If we didn’t make a concerted effort, companies and corporations would turn it into just another marketplace for their goods and services. I take no pleasure in my prescience.

Enter social media. By the early 2000s, I was a technology strategist and frontend web designer at the Smithsonian American Art Museum, where I helped shepherd our mission to new online audiences. In 2002, I proposed a blog as a way of posting current information about exhibitions and lectures weekly. Until then, websites were static. They presented the basics and were rarely updated. But as we recognized the value of engaging these new audiences, we needed ways to interact and inform on an ongoing basis. The introduction of content management systems allowed us to create that fresh content easily. In 2005, my idea gained enough traction to launch the first blog at the Smithsonian, Eye Level.

Everybody was trying to find ways to engage these new communities. In 2006, Facebook opened its membership to everyone. In 2008, Twitter did the same. Both platforms became part of our museum’s toolkit for social engagement. These apps heralded a revolution in social interaction. As the platforms grew, they looked to differentiate themselves from one another. When coworkers wondered whether Twitter would supplant blogging, I told them, “You tweet to react and blog to reflect.”

But the business of social media was developing too. As access and bandwidth increased, these companies grew exponentially. So did their power and their share prices. My ’90s prediction that capital would supplant real societal change came true. There was money to be made, and by the late 2000s, the net’s fate was set. Net cognoscenti have been advocating for net neutrality ever since.

The internet demanded a robust infrastructure to secure its future. Money poured in from venture capitalists. In Silicon Valley, just about every idea was a good one, that is, until the dot-com bubble burst in 2000. The wild, wild West was gone, but that didn’t stop the capital from flowing in, albeit with a little more restraint. And it began to coalesce. Companies bought up other companies. As Yuval Noah Harari, the historian and author of Sapiens, recently noted on 60 Minutes, platforms like Instagram and WhatsApp sold for billions. These apps had no tangible assets, so why were they valued so highly? It was their data that made those acquisitions so valuable. Their data on you and me.

I decided that if others coveted my interests, I wanted a piece of that pie. So, in 1999, I auctioned my personal demographics on eBay. When my children were young, I never mentioned their names or showed photos of them online. I wanted to protect their personal information for as long as possible.

Knowing all about our habits, companies could target content to each of us. Chris Anderson, the former editor of WIRED, called this “the Long Tail.” Amazon may make a lot of money from the sale of its best sellers, but it was the other 90% of its inventory that made the company rich. The volume of small sales from a long list of niche books surpassed that of more well-known fare. Wharton professor Serguei Netessine, however, found just the opposite: people overwhelmed with choices gravitate to bestsellers. The key was personalization. Develop algorithms that use your past searches to create a profile of your interests, so that search results can show you precisely what you were looking for (even if you didn’t know what you were looking for).

This is exactly what Facebook does. It knows everything about us. Everything. Harari told Anderson Cooper, “I came out as gay when I was 21. It should’ve been obvious to me when I was 15 that I’m gay. But something in the mind blocked it. Now, if you think about a teenager today, Facebook can know that they are gay or Amazon can know that they are gay long before they do just based on analyzing patterns.” To understand the consequences such knowledge could have, Harari asked us to consider what it would mean to LGBT+ communities in Iran, Russia, or any other homophobic country where “the police know that you are gay even before you know it.”

The dystopian message of the film Minority Report is coming true. Based loosely on Philip K. Dick’s short story “The Minority Report,” it depicts a special division of the police called “Precrime” that uses “precogs”—psychics—to identify and arrest people before they can commit a crime. Substitute algorithms for the precogs, and you have Facebook.
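To make the profiling described above concrete, here is a deliberately tiny sketch of how interest-based personalization works in principle: past activity builds a profile of topics, and new items are ranked against it. Every name, topic, and score here is hypothetical; real platforms use vastly more complex signals.

```python
# Toy sketch of interest-based personalization: count the topics a user
# has engaged with, then rank new items by how well they match.
from collections import Counter

def build_profile(past_clicks):
    """Count how often each topic appears in a user's click history."""
    return Counter(topic for _, topic in past_clicks)

def rank_items(items, profile):
    """Order candidate posts by how strongly they match the profile."""
    return sorted(items, key=lambda item: profile[item[1]], reverse=True)

history = [("post1", "politics"), ("post2", "politics"), ("post3", "art")]
feed = [("new_a", "cooking"), ("new_b", "politics"), ("new_c", "art")]

profile = build_profile(history)
print(rank_items(feed, profile))  # politics first, cooking last
```

Even this toy version shows the feedback loop: whatever you clicked on yesterday is what gets surfaced first today, whether or not you ever asked for it.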
The question is, as always, will this power be used for good or evil? Despite Zuckerberg’s assurances that he is only interested in the former, Haugen’s purloined documents tell another story. Before Haugen revealed herself on 60 Minutes, The Wall Street Journal published an investigation of these documents in a series called The Facebook Files. Here are a couple of the takeaways.

“Facebook Says Its Rules Apply to All. Company Documents Reveal a Secret Elite That’s Exempt.” While Zuckerberg portrays Facebook as a neutral platform that treats every user equally, the truth is just the opposite. A special class of high-profile users doesn’t always have to adhere to Facebook’s rules and algorithms. They are part of a program called “Cross Check,” or “XCheck.” Facebook’s algorithms and content moderators can’t keep up with the abundance of user-generated content, so the company gave special attention to these very visible and vocal VIPs to head off PR problems. Yet many of these “special people” have used their privilege to harass and incite violence. As regular users, their posts would have been taken down and, as many of us have experienced for much lesser “crimes,” they would have been thrown into Facebook jail. An internal review stated, “We are not actually doing what we say we do publicly,” and it called the company’s actions “a breach of trust.”

“Facebook Tried to Make Its Platform a Healthier Place. It Got Angrier Instead.” In 2018, the company changed its algorithm to make its platform kinder and gentler. The goal was to emphasize sharing and resharing posts among friends and family. Instead, the change had the opposite effect. Political parties and trolls exploited the algorithm to sensationalize content. In March 2021, Mark Zuckerberg announced that he would use the platform to promote COVID vaccinations, with the goal of getting 50 million people vaccinated. Despite this altruistic hope, his own app’s formula stymied his efforts by prioritizing resharing. Anti-vaxx comments and mis- and disinformation inundated pro-vaccination content. As The Wall Street Journal put it, Facebook’s problem is that “its users create the content, but their comments, posts, and videos are hard to control.”

In the lead-up to the 2020 elections, Facebook attempted to address these issues by forming the Civic Integrity working group. When Haugen joined Facebook, she was assigned to this group to help manage misinformation. But after the election, the company disbanded the unit. Haugen said, “They told us, ‘We're dissolving Civic Integrity.’ Like, they basically said, ‘Oh good, we made it through the election. There wasn't [sic] riots. We can get rid of Civic Integrity now.’ Fast forward a couple months, we got the insurrection. And when they got rid of Civic Integrity, it was the moment where I was like, ‘I don't trust that they're willing to actually invest what needs to be invested to keep Facebook from being dangerous.’”

As a moderator on a political page, I often bought ads to promote messages from the posters I designed. In defining my audiences for these ads, I wanted to get the word out without the back-and-forth animosity and name-calling so rampant in most social media tit-for-tats. Facebook allowed me to target my audiences extensively. Building an audience profile was an art form in and of itself. The platform provided very niche groups I could address, and combining these groups allowed me to pinpoint my messages. For example, I could focus on liberal or conservative movements and interests in many granular ways.

After the 2020 election, however, this specificity disappeared. I was only allowed to target more general audiences (“interested in politics” instead of specific liberal or conservative issues). With the election over, Facebook assumed hostilities would cease or at least lessen. They have not. My ability to define my audience has taken a big hit. My messages must now be broadcast to a more general group of people, just perfect for more anger and increased clicks. Sure, I’d like a larger following, but not at this cost. What I want is a more meaningful audience. Show me how I can accomplish that, Mark.

In 2017, Sean Parker, the founding president of Facebook, said, “The thought process that went into building these applications was all about: ‘How do we consume as much of your time and conscious attention as possible?’ And that means that we need to sort of give you a little dopamine hit every once in a while because someone liked or commented on a photo or a post. And that’s going to get you to contribute more content, and that’s going to get you more likes and comments [and more money for the company]. It’s a social-validation feedback loop, exactly the kind of thing that a hacker like myself would come up with because you’re exploiting a vulnerability in human psychology.”

I accept Parker’s reasoning. I know what I’m getting and giving up on the platform (and I’m constantly securing my data and watching what information I post). But it angers me that Zuckerberg et al. seem to have so much power with so little understanding and control over their platform. And I’m mad that he is misleading us, but not enough to yell and scream about it on Facebook. While everyone has a right to their opinion, no matter how distasteful or wrong I may think it is, no one has a right to spew that opinion on someone else. I live by that dictum. So I do most of my screaming into a pillow. Above all, I will not let Facebook profit from my anger.

Feel free to pass this poster on. Digital high-res downloads of all the Chamomile Tea Party posters are free here (click the down arrow on the lower right side of the image). Other options are available. And join our Facebook group.
Follow the history of our country's political intransigence from 2010-2020 through a seven-part exhibit of these posters on Google Arts & Culture.
