I’ve got another special episode of the Summer Spectacular podcast for you with my good friend Emi Kolawole. Emi has 10 years of experience in communications, design, and tech. In our conversation, she shares her insights from attending the Aspen Ideas Festival and the Socrates program.
She highlights the importance of including diverse voices, particularly women and minorities, in the conversations around artificial intelligence (AI) and technology. Emi also discusses the need for more intentional and inclusive panels and events in the tech industry. She expresses optimism about the future of technology and the emergence of new voices and perspectives.
Emi and I also discuss our interest in tarot as a tool for self-investigation and personal growth. We both gush about Rebecca Auman, who has helped us on our journeys; check out her podcast, Voices in the River, on which Emi and I were guests. Emi also encourages women entering the tech industry to believe in themselves and follow their talents.
Enjoy!
Please support the curation and analysis I’m doing with this newsletter. As a paid subscriber, you make it possible for me to bring you in-depth analyses of the most pressing issues in tech and politics.
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
Ahead of TrustCon next week, I wanted to bring you a conversation I’ve been trying to set up since Duco released the Trust and Safety Market Research Report in March.
My guests, investors Lauren Wagner and Shu Dar Yao, join me to discuss their involvement in the report and the reasons behind it. They highlight key takeaways from the report, such as the unbundling of the tech stack in trust and safety and the impact of regulations and talent outflow. The conversation also touches on trends in the trust and safety field, the challenges faced by startups, and the different perspectives on trust and safety among investors. They share advice for trust and safety professionals and discuss the future of the field and the need for more research and tools to support startups and investors.
Enjoy!
Lauren Wagner and Shu Dar Yao
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
I’ll admit, I wasn’t expecting to do my first Summer Spectacular episode quite this quickly. However, when Saurabh Shukla of NewsMobile pinged me on WhatsApp offering to talk about what had happened in the Indian election, I jumped at the chance. I then saw former Integrity Institute resident fellow Alexis Crews post about her work in the EU, and I knew I had to include her as well.
In our conversation, we discuss the role of social media and digital platforms in the elections, the use of AI for misinformation and disinformation, the impact of WhatsApp as a messaging app, and the use of influencers in campaigns. We also talk about the lessons learned from the EU elections and the recommendations for tech companies in mitigating disinformation.
I hope you enjoy!
Anchor Change with Katie Harbath is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
Don’t forget to follow us on YouTube!
This is the last episode of season 2! Thank you all for listening. I’ll be taking a break over the summer and returning with season 3 in the fall. Between now and then, I may plan a few Summer Spectacular episodes, so keep an eye out.
I couldn’t have asked for a better person to end the season with. Renee DiResta joins me this week to discuss her background and journey into researching the anti-vaccine movement, the impact of social media on public opinion, and the concept of 'invisible rulers' in her book, Invisible Rulers. She also explores the role of influencers in shaping public opinion and the ethical considerations of platform algorithms in content distribution, and she delves into the challenges of researching algorithms, the evolution of content control on platforms, and the politicization of her work. Additionally, she shares insights on the importance of engaging in the face of misinformation.
Takeaways
* Renee's background spans various roles, leading to her involvement in researching the anti-vaccine movement.
* The impact of social media on public opinion and the lack of counter-speech and counter-narratives are significant concerns.
* The concept of 'invisible rulers' and the role of influencers in shaping public opinion is explored in her book 'Invisible Rulers'.
* Ethical considerations of platform algorithms in content distribution, particularly the distinction between free speech and free reach, are important to address.
* Challenges of researching algorithms and content control on platforms.
* The politicization of research work and the importance of engaging in the face of misinformation.
* Insights on the shaping of public opinion and the impact of algorithms on society.
Please support the curation and analysis I’m doing with this newsletter. As a paid subscriber, you make it possible for me to bring you in-depth analyses of the most pressing issues in tech and politics.
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
Don’t forget to follow us on YouTube!
This week, Claire Wardle and Daniel Bramatti discuss their work creating fact-checking coalitions around elections as a follow-up to the guide we published back in April. They highlight the importance of collaboration and trust-building among news organizations and social media platforms. They also discuss the challenges they faced in handling misinformation during elections, such as dealing with blackouts and coordinating with tech companies. They emphasize the need for continuous adaptation and updating of strategies as bad actors evolve their tactics.
Overall, they highlight the progress made in building resilience and awareness among the public, but also acknowledge the ongoing challenges in combating misinformation. The conversation explores the challenges and strategies of fact-checking during elections, with a focus on the Comprova project in Brazil and the CrossCheck project in France. The guests discuss the importance of collaboration among news organizations, the role of technology and AI in fact-checking, and the need for media literacy to combat misinformation. They also address the business incentives and funding models for fact-checking, as well as the potential impact of AI-generated content on trust and democracy.
Takeaways
* Collaboration and trust-building among news organizations and social media platforms are crucial in creating effective fact-checking coalitions.
* Handling misinformation during elections requires continuous adaptation and updating of strategies as bad actors evolve their tactics.
* Building resilience and awareness among the public is essential in combating misinformation.
* Challenges in fact-checking include dealing with blackouts, coordinating with tech companies, and addressing less-talked-about disinformation tactics.
* Panic responsibly and avoid catastrophizing the issue of misinformation.
* Collaboration among news organizations is crucial for effective fact-checking during elections.
* Fact-checkers need to empower journalists in different countries to adapt and implement fact-checking strategies that work within their political and media systems.
* Fact-checking on encrypted platforms like WhatsApp requires innovative approaches, such as creating tip lines for users to submit content for fact-checking.
* The decision of what to fact-check should be based on the potential harm the misinformation can cause, with a focus on content that can affect public health or democracy.
* News organizations need to balance the business incentives of fact-checking with the responsibility to provide accurate information and build trust with the audience.
* AI-generated content poses new challenges for fact-checkers, and there is a need to educate the public about the tactics and techniques used to create and spread misinformation.
* Preparing for elections and other major events requires proactive fact-checking and spreading good information to prevent the spread of misinformation.
* Building resilience in societies and fostering collaboration among fact-checkers, researchers, and technology experts is essential to combat misinformation and maintain trust in democratic processes.
Chapters
* Introductions
* Creating Fact-Checking Coalitions
* Structure and Methodology of Fact-Checking Coalitions
* Trust-Building with Social Media Platforms
* Handling Misinformation During Elections
* Challenges in Brazil's Elections
* Changes and Evolutions in the Field of Fact-Checking
* Addressing Less Talked About Disinformation Tactics
* Building Collaborations and Empowering Journalists
* The Role of Users in Fact-Checking
* Prioritizing Fact-Checking Based on Harm
* Pre-Bunking and Proactive Fact-Checking
* Educating the Public about Misinformation Tactics
* Building Resilience and Collaboration
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
Don’t forget to follow us on YouTube!
I’m so honored to have my good friend and expert extraordinaire Kat Duffy as my guest this week. I have no words other than you should listen to everything she says.
Summary
Kat Duffy, a senior fellow for digital and cyberspace policy at the Council on Foreign Relations, discusses her work in the trust and safety space and her role leading the Trust and Safety Task Force. The task force brought together a diverse group of experts to explore the evolving landscape of trust and safety, with a focus on the intersection of technology, human rights, and humanitarian law. The project aimed to map the ecosystem, capture insights, and provide concrete recommendations for philanthropies and private funding. The conversation also touches on the challenges of consensus-building and the importance of including voices from the global majority in these discussions.

The conversation then turns to the concept of the global majority and its relevance in the tech industry. The term 'global majority' refers to individuals and countries that are not at the leading edge of revenue models or political interests. The discussion highlights the lack of cultural diversity and understanding in tech, the challenges faced by countries with fragile political and economic systems, the impact of regulations on trust and safety efforts, the need for independent expertise in civil society, and the future implications of technology for societal governance.
Takeaways
* The Trust and Safety Task Force brought together a diverse group of experts to explore the evolving landscape of trust and safety. Read the report here.
* The task force aimed to map the ecosystem, capture insights, and provide concrete recommendations for philanthropies and private funding.
* Consensus-building in the trust and safety space can be challenging, but it's important to include multiple perspectives and create dialogue.
* Including voices from the global majority is crucial in shaping discussions and decisions in the trust and safety space.
* The term 'global majority' refers to individuals and countries that are not at the leading edge of revenue models or political interests in the tech industry.
* Tech often lacks cultural diversity and understanding, reflecting the perspectives of a small group of thinkers.
* Countries with fragile political and economic systems face unique challenges in the tech space.
* Regulations can divert resources from trust and safety efforts and hinder innovation.
* Independent expertise in civil society is crucial for effective governance and impact assessment.
* The future of technology requires a better understanding of its implications and the involvement of diverse perspectives.
Chapters
* Introduction and Background
* Trust and Safety in Different Contexts
* The Trust and Safety Task Force
* The Impact of Generative AI
* Creating Room for Entrepreneurship and New Approaches
* Adapting to the Fast-Paced Nature of the Trust and Safety Space
* Including Voices from the Global Majority
* Understanding the Global Majority
* Tech's Lack of Cultural Diversity
* Challenges in Fragile Systems
* Impact of Regulations on Trust and Safety
* The Need for Independent Expertise
* Future Implications of Technology
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
ACK! I totally forgot to schedule this last night. So sorry for my delay.
Don’t forget to follow us on YouTube!
This week, we welcome Nina Jankowicz to the podcast. She talks about the importance of recognizing Russia's use of disinformation as a geopolitical strategy and the need for a more vocal online presence in countering disinformation. Nina also shares her observations of the impact of Russian disinformation during the 2016 US election from her vantage point in Ukraine. She discusses her research on disinformation in Central and Eastern Europe and the limits of counter-disinformation programming.
Nina then talks about her role in the Disinformation Governance Board and the misconceptions and attacks it faced. She emphasizes the importance of balancing free speech and protecting the information environment, citing examples from Germany, Singapore, Ukraine, and Brazil. The conversation covers various topics related to disinformation, online harassment, and the American Sunlight Project (where I serve as an advisor). It highlights the global nature of disinformation and the interconnectedness of these issues. The conversation also explores the challenges faced by women online, including gendered and sexualized abuse.
The American Sunlight Project is introduced as a new initiative to combat disinformation and promote truth and transparency in the discourse. The importance of transparency and accountability in tech platforms and government is emphasized.
Takeaways
* Recognizing and addressing Russia's use of disinformation as a geopolitical strategy is crucial in countering disinformation.
* The impact of Russian disinformation during the 2016 US election was significant, and the hubris of the United States in responding to it was concerning.
* Counter-disinformation programming often faces challenges in balancing its goals with free speech considerations.
* The Disinformation Governance Board faced misconceptions and attacks, highlighting the need for clear communication and understanding of its purpose.
* Balancing free speech and protecting the information environment is complex, and different countries have approached it differently.
* Disinformation is not limited to foreign actors and is often based on pre-existing social fissures.
* Women face gendered and sexualized abuse online, which has a silencing effect.
* The American Sunlight Project aims to promote truth and transparency in the discourse and inform voters.
* Transparency and accountability are crucial in addressing disinformation and online harassment.
Key Links
* Wilson Center: Freedom and Fakes: A Comparative Exploration of Countering Disinformation and Protecting Free Expression
* MSNBC: New documents show how disinformation expert was unfairly tarred
Please support the curation and analysis I’m doing with this newsletter. As a paid subscriber, you make it possible for me to bring you in-depth analyses of the most pressing issues in tech and politics.
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
Kate Dommett, a professor of digital politics, discusses her book “Data-Driven Campaigning and Political Parties” and her research on how political parties use data in election campaigns. She explores the prevailing narratives around data-driven campaigning and how they often do not match reality. Dommett also discusses the role of regulation in data-driven campaigning and the potential threats to democracy. She emphasizes the need for a nuanced understanding of data use in campaigns and the importance of considering the regulatory environment and data security.
Takeaways
* Data-driven campaigning is not a new phenomenon, but the use of digital technology has disrupted traditional accountability systems.
* The regulatory environment and electoral systems vary across countries, leading to different practices in data-driven campaigning.
* Campaigns use data for targeting, mobilization, and message development, but the level of sophistication varies.
* The role of online platforms in data-driven campaigning raises questions about access, responsibility, and unintended consequences.
* Data security is a significant concern in data-driven campaigning, especially for smaller parties with limited resources.
* Academic research should draw on historical lessons and theory to provide a responsible and nuanced understanding of the impact of new technologies, such as AI, on elections.
Chapters
* Introduction and Background
* Types of Data and Decision-Making
* Role of Online Platforms in Data-Driven Campaigning
* Concerns and Challenges in Data-Driven Campaigning
* Responsible Research on New Technologies in Elections
Links
* Katharine Dommett, Glenn Kefford, and Simon Kruschinski, Data-Driven Campaigning and Political Parties
* Amy Orben, The Sisyphean Cycle of Technology Panics
* Fabio Votta et al., Who Does(n't) Target You? Mapping the Worldwide Usage of Online Political Microtargeting
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
Sorry for the slight delay today. The tech gods were not being kind.
Watch this on YouTube!
NOTE: Much of this summary and these show notes were generated via a new AI tool by Riverside.fm.

In this episode, Brian Fishman, co-founder and chief strategy officer at Cinder, discusses his background in studying terrorism and his work at Facebook. He shares insights on the early days of using the internet for research on terrorist organizations and the challenges of trust and safety in the tech industry.
Brian also talks about the importance of pivoting in one's career and the trade-offs involved. He discusses the future of the tech industry, including the impact of AI and the role of regulation in trust and safety. The conversation covers various topics related to content moderation, media coverage, and the role of technology in society.
The main themes include the challenges of content moderation, the difficulty of making decisions in the face of complex issues, the role of the media in shaping narratives, and the need for a more comprehensive understanding of technology's impact. The conversation also touches on the importance of capturing the wider ecosystem of communication platforms and actors involved in events like January 6th.
We also discuss the potential for storytelling to shed light on the challenges of trust and safety in the tech industry.
Takeaways
* Brian Fishman has a background in studying terrorism and has worked on trust and safety issues at Facebook.
* The early days of using the internet for research on terrorist organizations provided valuable insights.
* Pivoting in one's career requires a mix of audacity and humility.
* The tech industry is facing challenges related to AI, synthetic content, and trust in institutions.
* Regulation will raise the floor for trust and safety expectations but may lower the ceiling.
* Content moderation is a complex and challenging task, regardless of whether it is done by private companies or regulators.
* The difficulty of making decisions in the face of complex issues remains even when responsibility is transferred from private companies to regulators.
* Media coverage plays a significant role in shaping narratives and public understanding of events.
* A comprehensive understanding of technology's impact requires considering the wider ecosystem of communication platforms and actors involved.
* Storytelling can be a powerful tool to explore the challenges of trust and safety in the tech industry.
Links
* Trust and Safety Tycoon Game
* How much mouse poop should be allowed in cereal boxes (a thought exercise related to content moderation that we discuss and that David Karpf wrote a piece about).
Show Notes
* Introduction and Background
* Early Internet Research on Terrorism
* Pivoting in a Career
* Challenges and Opportunities in the Tech Industry
* The Role of Regulation in Trust and Safety
* The Challenges of Content Moderation
* The Difficulty of Decision-Making
* The Role of Media in Shaping Narratives
* Understanding the Wider Ecosystem of Communication Platforms
* Storytelling and the Challenges of Trust and Safety
Anchor Change with Katie Harbath is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
Watch this episode on YouTube.
As most of you know, I started working full-time as the Chief Global Affairs Officer of Duco Experts in January. One of the things I like about the job is getting to work across the industry and seeing how companies tackle similar problems differently.
Thus, I thought it would be fun to bring the leadership team on the podcast to discuss some trends. Sidney Olinyk is the founder and CEO. Neema Basri is the Chief Operating Officer, and Scott Hoch is the Chief Technology Officer.
In this episode, Sidney and Scott talk a bit about the origins of Duco and some lessons learned from starting the business. Neema walks through some of the work that we do, and then we all jump into the trends we are seeing, our favorite productivity hacks, and some hot-take predictions for the future.
Anchor Change with Katie Harbath is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
I’m trying something different for this week’s podcast. I originally hoped I would have a webinar to repurpose, but we ended up rescheduling it, so rather than not doing a podcast this week, I thought I would record the answers to questions I get quite often.
In this discussion, I go into seven different areas of interest. Topics and time stamps are below.
I hope you enjoy it!
* 2:04 - My career path & how I mapped out my post-Facebook journey
* Lessons from Striking Out on My Own
* Mapping Out My Post Facebook Career
* Lori Brewer Collins: On Leading: Transformative Conversations
* 26:13 - Who I am outside of work
* Reclaim the Fairy Tale
* The Things We Whisper
* 32:16 - What I do and what a typical day looks like for me
* 34:41 - How to get a job in tech
* 40:54 - My reading/news consumption habits
* 48:10 - Why I started my Substack and how I put it together each week
* 57:25 - What’s next
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
Watch the video of our conversation on YouTube!
This week, I welcome Professor Kate Klonick to the podcast. The name of this episode comes from some amazing swag Kate made for a conference she put on last year on the history of the Trust and Safety profession. (You know how much I love swag.)
Kate is among the foremost experts on many things, including platform governance of speech. In 2018, she published a paper in the Harvard Law Review titled “The New Governors: The People, Rules, and Processes Governing Online Speech,” which was a first-of-its-kind behind-the-scenes look at how platforms handle content moderation.
In 2021, she wrote a piece for the New Yorker, “Inside the Making of Facebook’s Supreme Court,” about how then-Facebook set up the Oversight Board.
Recently, she has been writing on these topics at her Substack, The Klonickles. One of her pieces I cite all the time is a 2023 post about the end of the golden age of tech accountability, in which she makes the point:
[F]or all of the complaining we’ve done about Big Tech’s lack of cooperation with accountability, transparency, and research efforts, I unfortunately think we’ll look back on the last five years as a Golden Age of Tech Company access and cooperation.
We talk about all of this and more. Enjoy!
Kate Klonick teaches Property, Internet Law, and a seminar on information privacy. Klonick's research focuses on law and technology, most recently on private platform governance of online speech.
Klonick's scholarly work has appeared in The Yale Law Journal, Harvard Law Review, The Georgetown Law Journal, the peer-reviewed Copyright Journal of the U.S.A., The Maryland Law Review, and The Southern California Law Review. Her popular press writing has appeared in the New Yorker, New York Times, The Atlantic, The Guardian, Lawfare, Slate, Vox and numerous other publications.
Professor Klonick holds an A.B. with honors from Brown University where she studied both modern American History and cognitive neuroscience, a J.D. from Georgetown University Law Center where she was a Senior Editor on the Georgetown Law Journal, and a Ph.D. in Law from Yale Law School. She clerked for Hon. Eric N. Vitaliano of the Eastern District of New York and Hon. Richard C. Wesley of the Second Circuit. She is an affiliated fellow at the Yale Law School Information Society Project and a non-resident fellow at the Brookings Institution. She is on leave for 2022-2023 serving as a Visiting Scholar at the Rebooting Social Media Institute at Harvard University.
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
Don’t forget you can watch all of these on YouTube!
This week, we are discussing all things online influence operations with one of the foremost experts - Olga Belogolova. We’re talking about Russian, Chinese, Iranian, and other actors who want to influence the online information environment. The title of this episode comes from a class she used to teach at Georgetown.
Olga is the Director of the Emerging Technologies Initiative at the Johns Hopkins School of Advanced International Studies (SAIS). She is also a lecturer at the Alperovitch Institute for Cybersecurity Studies at SAIS, where she teaches a course on disinformation and influence in the digital age.
At Facebook/Meta, she led policy for countering influence operations within the Trust and Safety team, overseeing the development and execution of policies on coordinated inauthentic behavior, state media capture, and hack-and-leaks. Prior to that, she led threat intelligence work on Russia and Eastern Europe at Facebook, identifying, tracking, and disrupting coordinated IO campaigns, including the Internet Research Agency investigations between 2017 and 2019.
Olga previously worked as a journalist, and her work has appeared in The Atlantic, National Journal, Inside Defense, and The Globe and Mail, among others. She is a fellow with the Truman National Security Project and serves on the review board for CYBERWARCON.
Enjoy!
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
Don’t forget you can also watch this on YouTube!
This week, we are talking to Michael Bąk. Michael brings more than two decades of experience across international development, peacebuilding, diplomacy, and tech policy. Throughout his career, he has sustained a strong commitment to democratic governance, human rights, and information integrity.
He is currently the Executive Director of the Forum on Information and Democracy and a former colleague of mine at Facebook, where he was the head of public policy for Thailand.
We dig into the relationship between technology companies and civil society, how he’s thinking about our information environment and how we protect democracy in the age of social media and artificial intelligence.
Show Links:
* Global Call for Research to Expand Literature on Crucial Research Questions, with Emphasis on Global South Regions
* Policy brief - Tech firms, governments urged to combat digital election threats (I forgot to mention that we did this with International IDEA and Democracy Reporting International)
* Policy Brief - Information Integrity in Times of Conflict
* AI as a Public Good: Ensuring Democratic Control of AI in the Information Space framework
* Why Do We Need to Discuss So-called "Information Integrity"?
* Euroviews. 'Regulation stifles innovation' is a misguided myth
* Der Demokratieschützer Michael Bak über KI (“The democracy defender Michael Bąk on AI”) - SZ.de
* Fair Trade AI: https://background.tagesspiegel.de/digitalisierung/plaedoyer-fuer-eine-fairtrade-ki
* Forum on Information and Democracy: www.informationdemocracy.org
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
Don’t forget you can also watch these on YouTube!
This week, we welcome Stanford Law Professor Nate Persily to the podcast. I’ve known Nate since 2013/2014, when he held a gathering at Stanford with folks in the tech/digital industry and the Federal Election Commission. Nate has been a thought leader throughout his career, with experience across technology, academia, and election administration. We get into all of that in this conversation. Some links from what we talked about:
* Nate’s bio
* Stanford Cyber Policy Center
* Social Science One research partnership with Facebook
* Facebook 2020 election research
* Senate Testimony on Platform Transparency
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
Don’t forget you can now also watch these conversations on YouTube!
This week, we are diving deep into elections and specifically where people go to get information on the election. Rachel Orey is a Senior Associate Director at the Bipartisan Policy Center, where they are responsible for the organization’s election administration policy development, state and federal advocacy efforts, and the BPC Task Force on Elections. Their research focuses on evidence-based and data-driven reforms that meaningfully improve our elections ecosystem.
As many of you know, I was a fellow on Rachel’s team for nearly three years and one of my last acts as both a BPC and Integrity Institute fellow was to help get this survey off the ground. We did a similar one in 2022 as well.
Some of the findings include:

Most Americans have confidence in the 2024 presidential election. They are more confident that votes in their community and state will be counted accurately than votes across the country.
* A majority of respondents (69%) are confident their votes will be counted accurately in the 2024 election. This includes majorities of Republicans (60% very or somewhat confident), Independents (59%), and Democrats (85%).
* Across all groups, Americans are most confident about an accurate count of votes in their community (74%). Just 64% are confident in an accurate count across the country.
* This difference is most pronounced among Republicans. Only 50% of Republicans express confidence that votes will be counted accurately at the national level compared with 66% at the local level—a gap of 16 percentage points.
* The confidence gap between local and national counting is an opportunity for voter education about how the counting and certification process works at all levels of our election system. While election officials may be doing a good job building confidence in their community, this gap shows the need for national and state media outlets, candidates, and political elites to help voters understand the robust processes and security measures that are present in every state.
Rachel digs into that and more in this week’s podcast.
Here’s the link to the security and integrity protections that make American elections strong, resilient, and trustworthy in every jurisdiction.
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
This week on the podcast, we are taking you to SXSW for a conversation I had with Sasha Issenberg about his new book, Lie Detectives: How Political Campaigns Fight Disinformation. You might remember him from his popular book The Victory Lab, which was published after Obama won.
Sasha’s book looks at people in the U.S. and Brazil who were at the forefront of helping campaigns figure out how to combat disinformation—including when to ignore it. He also touches on how many on the Right view this work, and it’s something we talk about as well. You can read an excerpt on Politico.
Given our different experiences, we end up interviewing each other in this conversation.
A huge thank you to SXSW for the audio after my recorder ran out of batteries halfway through. 😬
Please enjoy!
Anchor Change with Katie Harbath is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
We’re back! I didn’t intend to take two full months off from the podcast, but as many of you know, I started a new job in January as the Chief Global Affairs Officer of Duco Experts - a technology consulting firm. It has been overwhelming, in a good way, but it took me a bit to get started again with the podcast.
I’ve got some exciting guests lined up. I figure we’ll do this season through the end of May, and then I’ll re-evaluate for the rest of the year.
To kick things off, I’m excited to have Luis Lozada, the CEO of Democracy Works. You may not have heard of Democracy Works, but you likely have encountered their work. They do the painstaking work of gathering all the information about where, when, and how to vote from the thousands of election officials across the country to put it in a readable format that companies like Google, TikTok, and Anthropic currently use.
I started working with them when I was at Facebook, and we used them to power many of our U.S.-based Election Day reminders. I was invited to join the board while I was at Facebook and have now been a board member for five years.
With the explosion of AI, Democracy Works is now helping companies think through the next generation of people getting election information. Luis and I cover that and more in our conversation.
Enjoy!
PS: If you are looking for the poll by the Bipartisan Policy Center, Integrity Institute and States United that we reference you can find it here.
PPS: We’re now on video, too! With Season 2, I’ve launched an Impossible Tradeoffs YouTube channel if you'd like to watch our conversation rather than listen.
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
In this special episode of Impossible Tradeoffs, I sit down with Latika Bourke - a journalist I first met in Sydney ten years ago who now has her own Substack, Latika Takes. In celebration of Facebook’s 20th birthday, we take a look back at the role the platform has played in politics and elections and what we might expect going forward.
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe -
It’s our last podcast of the year! Thank you all for joining me on this journey of starting a podcast. As a reminder, I’ll take some time off before restarting episodes in 2024. Don't forget to fill out this short form if you have ideas for what you’d like to see.
This week’s fun trade-off is:
For our last guest of 2023, I’m welcoming Sarah Oh to the podcast.
Sarah is a human rights expert and tech executive who has worked at Meta, Twitter, and, most recently, Pebble. Pebble worked a lot like Twitter, but one thing Sarah and the other co-founders were adamant about was building the platform with a human-rights and safety-first approach. While Pebble did shut down a few weeks ago (the success of Threads really hurt their ability to grow and get funding), there are a lot of lessons to be learned from their experiences. Sarah’s co-founder, Gabor Cselle, covered some of his lessons learned in this Medium post, and Sarah talks about what she learned in the podcast.
Enjoy!
Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe