We are in the middle of a global trust crisis. Neighbors are strangers and local news sources are becoming scarcer; institutions that used to symbolize prestige, honor and a sense of societal security are ridiculed for being antiquated and out of touch. To fill the void, we turn to sharing economy companies and social media, which come up short, or worse. Our guest on this episode, academic and business advisor Rachel Botsman, guides us through how we got here, and how to recover. Botsman is the Trust Fellow at Oxford University, and the author of two books, including “Who Can You Trust?” The intangibility of trust makes it difficult to pin down, she explains, and she speaks directly to technology leaders about fostering communities and creating products the public is willing to put faith in. “The efficiency of technology is the enemy of trust,” she says.
-
When we press play on a YouTube video, we set in motion an algorithm that taps all available data to find the next video that keeps us glued to the screen. Because of its advertising-based business model, YouTube’s top priority is not to help us learn to play the accordion, tie a bow tie, heal an injury, or see a new city — it’s to keep us staring at the screen for as long as possible, regardless of the content. This episode’s guest, AI expert Guillaume Chaslot, helped write YouTube’s recommendation engine and explains how those priorities spin up outrage, conspiracy theories and extremism. After leaving YouTube, Guillaume’s mission became shedding light on those hidden patterns on his website, AlgoTransparency.org, which tracks and publicizes YouTube recommendations for controversial content channels. Through his work, he encourages YouTube to take responsibility for the videos it promotes and aims to give viewers more control.
-
In the second part of our interview with Renée DiResta, disinformation expert, Mozilla fellow, and co-author of the Senate Intelligence Committee’s Russia investigation, she explains how social media platforms use your sense of identity and personal relationships to keep you glued to their sites longer, and how those design choices have political consequences. The online tools and tactics of foreign agents can be very precise and deliberate, but they don’t have to be: Renée has seen how deception and uncertainty are powerful agents of distrust, and how easy they are to create. Do we really need the frictionless global amplification of information that social media enables, anyway? We don’t want spam in our email inbox, so why do we tolerate it in our social media feed? What would happen if we had to copy and paste, and click twice, or three times? Tristan and Aza also brainstorm ways to prevent and control disinformation in the lead-up to elections, particularly the 2020 U.S. elections.
-
Aza sits down with Yael Eisenstat, a former CIA officer and a former advisor at the White House. When Yael noticed that Americans were having a harder and harder time finding common ground, she shifted her work from counter-extremism abroad to advising technology companies in the U.S. She believed as danger at home increased, her public sector experience could help fill a gap in Silicon Valley’s talent pool and chip away at the ways tech was contributing to polarization and election hacking. But when she joined Facebook in June 2018, things didn’t go as planned. Yael shares the lessons she learned and her perspective on government’s role in regulating tech, and Aza and Tristan raise questions about our relationships with these companies and the balance of power.
-
Today’s online propaganda has evolved in unforeseeable and seemingly absurd ways; by laughing at or spreading a Kermit the Frog meme, you may be unwittingly advancing the Russian agenda. These campaigns affect our elections’ integrity, public health, and relationships. In this episode, the first of two parts, disinformation expert Renée DiResta talks with Tristan and Aza about how these tactics work, how social media platforms’ algorithms and business models allow foreign agents to game the system, and what these messages reveal to us about ourselves. Renée gained unique insight into this issue when, in 2017, Congress asked her to lead a team of investigators analyzing a data set of texts, images and videos from Facebook, Twitter and Google thought to have been created by Russia’s Internet Research Agency. She shares what she learned, and in part two of their conversation, Renée, Tristan and Aza will discuss what steps can be taken to prevent this kind of manipulation in the future.
-
In 1940, a group of 60 American intellectuals formed the Committee for National Morale. “They’ve largely been forgotten,” says Fred Turner, a professor of communications at Stanford University, but their work had a profound impact on public opinion. They produced groundbreaking films and art exhibitions. They urged viewers to stop, reflect and think for themselves, and in so doing, they developed a set of design principles that reimagined how media could make us feel calmer, more reflective and more empathetic; in short, more democratic.
-
Technology has shredded our attention. We can do better.
-
A new documentary called The Social Dilemma comes out on Netflix today, September 9, 2020. We hope that this film, full of interviews with tech insiders, will be a catalyst and tool for exposing how technology has been distorting our perception of the world, and will help us reach the shared ground we need to solve big problems together.
-
What causes addiction? Johann Hari, author of Chasing the Scream, traveled some 30,000 miles in search of an answer. He met with researchers and lawmakers, drug dealers and drug makers, those who were struggling with substance abuse and those who had recovered from it, and he came to the conclusion that our whole narrative about addiction is broken. "The opposite of addiction is not sobriety," he argues. "The opposite of addiction is connection." But first, we have to figure out what it really means to connect.
-
This summer, Facebook unveiled “2Africa,” a subsea cable project that will encircle nearly the entire continent of Africa — much to the surprise of Julie Owono. As Executive Director of Internet Without Borders, she’s seen how quickly projects like this can become enmeshed in local politics, as private companies dig through territorial waters, negotiate with local officials and gradually assume responsibility over vital pieces of national infrastructure. “It’s critical, now, that communities have a seat at the table,” Julie says. We ask her about the risks of tech companies leading us into an age of “digital colonialism,” and what she hopes to achieve as a newly appointed member of Facebook’s Oversight Board.