
Image: Medical masks on blue background (Envato Elements)

Big, If True Series On Tech & The Pandemic

"Big, If True" is a webinar series from the Technology and Social Change Research Project at the Shorenstein Center. Hosted by Dr. Joan Donovan, the series focuses on media manipulation, disinformation, and the future of democracy during a pandemic.

From xenophobia and "Zoombombing" to misinformation and why certain technologies get made, learn about research on the connections between technology, politics, and social issues during the COVID-19 crisis.

  • What can mis- and disinformation scholars learn from the security studies field? What happens when security threats are inflated by governments? And how do security scholars analyze and account for civil liberties against the rise of digital search tools and surveillance? The beginning of 2021 brings with it both new and old vulnerabilities and uncertainties: the roll-out of COVID-19 vaccines, cybersecurity data breaches and hacks, openings to expand state power, and opportunities for resistance. As we embrace the unsettled state of things, governments and media manipulators may capitalize on a fragile media ecosystem and shifting political landscapes. For many security studies scholars, it is important to understand how advanced information technologies create national vulnerabilities, increase instabilities in international relations, exploit and stockpile user data, and allow unauthorized people to intercept and infiltrate communications. Joan Donovan, Director of the Technology and Social Change Research Project at the Shorenstein Center on Media, Politics and Public Policy, hosts cyber security experts Susan Landau, Erik Lin-Greenberg, and Gabrielle Lim to discuss what this means for mis- and disinformation campaigns, and how interdisciplinary collaboration can unmask new strategies for pushing back against government overreach. Photo: [Pexels/Christina Morillo](https://www.pexels.com/photo/two-women-looking-at-the-code-at-laptop-1181263/)
    Partner:
    Shorenstein Center on Media, Politics and Public Policy
  • In light of the recent wave of Black Lives Matter protests, there are distressing concerns that facial recognition software is being used to target and catalogue people engaging in protected speech and assembly. Given the chilling effect it poses on civil liberties and its propensity for error, from misidentifying individuals to contributing to wrongful convictions, major cities like Boston and San Francisco have banned its use by law enforcement. The discussion navigates how community organizers are fighting back against the unprecedented use of surveillance tools that disproportionately exhibit racial and gender bias, and why the movement for racial justice calls for banning facial recognition. Drawing on the remarkable work of Ben Ewen-Campen, Chris Gilliard and Emily Dreyfuss, series host Joan Donovan asks: what is the potential human cost of widespread facial recognition technology? What prompted Amazon’s moratorium on selling its controversial Rekognition platform to law enforcement, and what are the consequences? Have the successful bans in Boston and San Francisco sparked enough momentum for a nationwide ban? And crucially, is facial recognition now so widespread that an effective ban is even possible? Photo: "Facial Recognition Art Mural," Hollywood CA by [Yowhathappenedtopeace/Flickr](https://www.flickr.com/photos/yowhathappenedtopeace/15861645255)
    Partner:
    Shorenstein Center on Media, Politics and Public Policy
  • This episode of [BIG, If True](http://shorensteincenter.org/programs/technology-social-change/big-if-true-webinar-series/) reflects on the hybrid battles being waged by journalists, activists, and dissidents against censorship and disinformation in Southeast Asia. The discussion traces the genesis of the recent attacks on freedom of expression, from the rise and fall of the Anti-Fake News Act in Malaysia to the conviction of Rappler CEO and Editor Maria Ressa and former Rappler researcher-writer Reynaldo Santos on cyber libel charges, a conviction that has been widely reported as a strike against press freedom and democracy in the Philippines. In light of these extraordinary censorship measures, this conversation charts the broader efforts being made by civil society to counter the repression of free speech. Image: Pixabay.com
    Partner:
    Shorenstein Center on Media, Politics and Public Policy
  • This webinar discusses some of the many cybersecurity threats and strategies emerging today. Joan Donovan, PhD, talks with experts: how secure and reliable is the American cyber ecosystem? How have threats, like opportunistic cybercrime, data breaches, and cyberattacks, changed given that many of us are still working from home? Is the potential for technology to serve as a force for good being usurped by malicious parties bent on oppression and surveillance? In the face of these myriad uncertainties and cosmic shifts, are we witnessing a resurgence of socio-technological hacking? This talk is part of the [Big, If True](http://forum-network.org/series/big-if-true-series-tech-pandemic/) webinar series hosted by Joan Donovan, Ph.D., who heads up the [Technology and Social Change Research Project (TaSC)](http://shorensteincenter.org/about-us/areas-of-focus/technology-social-change/) at Harvard University’s Shorenstein Center on Media, Politics and Public Policy. Image by [VIN JD](https://pixabay.com/users/jaydeep_-7740155/?utm_source=link-attribution&utm_medium=referral&utm_campaign=image&utm_content=3112539) for Pixabay
    Partner:
    Shorenstein Center on Media, Politics and Public Policy
  • Explore whether the rigorous peer review process – a process that has traditionally safeguarded information quality control – can compete in a media ecosystem riddled with fast-paced health misinformation and dangerous speculation. Panelists discuss the flurry of preprints and the limitations of correcting the record after an article has hit the mainstream. The panel also offers insight into how scientific communities are wrestling with new uncertainties and heightened public visibility, while forging new pathways for curating knowledge amidst the infodemic. This talk is part of the [Big, If True](http://forum-network.org/series/big-if-true-series-tech-pandemic/) webinar series hosted by Joan Donovan, Ph.D., who heads up the [Technology and Social Change Research Project (TaSC)](http://shorensteincenter.org/about-us/areas-of-focus/technology-social-change/) at Harvard University’s Shorenstein Center on Media, Politics and Public Policy. Image: Pexels.com
    Partner:
    Shorenstein Center on Media, Politics and Public Policy
  • Dive into the promises and perils of communicating science to public audiences. While trust in our politicians and the press has waned in recent years, trust in scientists has remained remarkably steady. However, as we chart out new uncertainties and complex facts and figures in a pandemic, we wonder: is trust in science eroding? How are science journalists and educators dispelling misinformation and tempering fear? Why don’t facts go viral on social media? And what resources and platforms can help marshal facts and good science? Jane Hu, a regular contributor to Slate’s [Future Tense](https://slate.com/technology/future-tense), and Mitchell Moffit and Gregory Brown, co-creators of the YouTube channel [AsapSCIENCE](https://www.youtube.com/channel/UCC552Sd-3nyi_tk2BudLUzA), join Joan Donovan to discuss the role of science communication in helping the general public steer a safe course against pseudoscience and misinformation. Image: [Creative Commons](https://www.peakpx.com/563861/clear-test-tubes)
    Partner:
    Shorenstein Center on Media, Politics and Public Policy
  • In this week’s episode of BIG, If True, our host Joan Donovan, PhD, asks: should we trust our search engines? Have joint industry efforts – led by Facebook, Google, Microsoft, YouTube, Twitter, Reddit and LinkedIn – to limit misinformation been successful? How are new content policies specific to COVID-19 being enforced, and at what cost? As we clumsily shift our lives online, the cracks in the information infrastructure are bursting open. While there has been an uptick in boosting trusted content from credible sources, like the Centers for Disease Control and Prevention and the World Health Organization, there have simultaneously been sweeping purges of suspicious accounts and of advertisements seeking to capitalize on the crisis, leaving us to wonder who’s heard and who’s harmed in the current infodemic. Amidst this sliding scale of uncertainty, we turn to leading voices in the field, UCLA professors Safiya Umoja Noble, PhD, and Sarah T. Roberts, PhD, and Washington Post reporter Elizabeth Dwoskin, who have been taking stock of how commercial content is being moderated during the pandemic. Image courtesy of Pixabay
    Partner:
    Shorenstein Center on Media, Politics and Public Policy
  • Recently, Luke O’Brien, a reporter at HuffPost, covered the controversy surrounding Clearview AI, a company that has amassed a large database of images and social media data of private citizens. His reporting also illustrated how right-wing activism shaped the design of the technology and flouted platforms’ terms of service in pursuit of big data. Luke joins the Shorenstein Center's director of research, Joan Donovan, and Biella Coleman, an anthropologist of hackers who has studied how technopolitics can influence law and change society. They talk about how the “alt-right” developed, how it spreads its ideas, and what responses are necessary to prevent this from recurring. Extra resources: [Luke O'Brien's reporting on Clearview AI and far-right extremism](https://www.huffpost.com/entry/clearview-ai-facial-recognition-alt-right_n_5e7d028bc5b6cb08a92a5c48?6p8); Whitney Phillips' book "You Are Here: A Field Guide for Navigating Polluted Information"; and Joan Donovan's [MIT Tech Review article](https://www.technologyreview.com/2020/04/30/1000881/covid-hoaxes-zombie-content-wayback-machine-disinformation/) on how COVID-19 conspiracy theorists are using tech to keep conspiracy theories alive. Image courtesy of Flickr
    Partner:
    Shorenstein Center on Media, Politics and Public Policy
  • What do public health advocates need to know about misinformation research? Like our hospitals, our information systems are completely overwhelmed with questions, ranging from the banal, “How do I know if I have coronavirus?,” “Where can I get tested for COVID-19?,” “Is there a vaccine?” to the conspiracy-driven, “Does 5G affect your health?” or “What is the World Health Organization and do they work for China?” The list goes on, but the fact remains: people are seeking more and more information about COVID-19, and wrong answers could be deadly. Ashish Jha, Director of the Harvard Global Health Institute, talks with Setti Warren, Executive Director of the Shorenstein Center and former Mayor of Newton, MA. The two discuss what public health advocates need to know about misinformation and how misinformation influences people’s behaviors. They cover the ways local governments communicate with residents and how public health professionals and local officials can work together to share life-saving recommendations during the infodemic. This talk is part of the [_Big, If True_](https://forum-network.org/series/big-if-true-series-tech-pandemic/) webinar series hosted by Joan Donovan, Ph.D., who heads up the [**Technology and Social Change Research Project** (TaSC)](https://shorensteincenter.org/about-us/areas-of-focus/technology-social-change/) at Harvard University's Shorenstein Center on Media, Politics and Public Policy.
    Partner:
    Shorenstein Center on Media, Politics and Public Policy
  • Explore how COVID-19 has exacerbated existing inequalities, fueled xenophobia, and harmed marginalized groups. How can policymakers, civil society, and media mitigate discrimination by shining a light on health disparities? What does xenophobia look like in a time of social distancing? How have misinformation and disinformation inflamed these divides? And what can journalists do to surface these tensions without compounding the problem? COVID-19 won’t be the last global crisis, but how we respond to these questions may make all the difference. ProPublica reporters Akilah Johnson and Talia Buford, who are covering the data on health outcomes across communities of color, Marita Etcubañez, Director of Strategic Initiatives for Asian Americans Advancing Justice, Lisa Nakamura, Professor and Director of the Digital Studies Institute at the University of Michigan, and Gabby Lim, Researcher at the Technology and Social Change Research Project, Shorenstein Center, discuss the impacts of the pandemic on these inequalities. This talk is part of the [_Big, If True_](https://forum-network.org/series/big-if-true-series-tech-pandemic/) webinar series hosted by Joan Donovan, Ph.D., who heads up the [**Technology and Social Change Research Project** (TaSC)](https://shorensteincenter.org/about-us/areas-of-focus/technology-social-change/) at Harvard University's Shorenstein Center on Media, Politics and Public Policy. Image courtesy of Flickr
    Partner:
    Shorenstein Center on Media, Politics and Public Policy