What Can The Biotech Industry Learn From The Failures Of The Social Media Industry?

By now, we’re all aware that Big Tech has a problem. Social media giants in particular, like Facebook and Twitter, were originally intended to connect people online. But instead, the rise of hate speech, political extremism, and a fundamental distrust of science and facts have become a painful example of how our best intentions can go awry. So, what does this mean for Big Biotech? Social media has shaken our world with words and images. If the same failings were to occur in the biotech industry, the consequences could be unimaginable.

The synthetic biology community is working hard to avoid a social media-like catastrophe in the biotech industry. In light of this, I began this year’s SynBioBeta conference with a Leaps Talk with Tristan Harris, co-founder of the Center for Humane Technology and star of the eye-opening Netflix documentary, The Social Dilemma. The discussion was introduced by Jurgen Eckhardt, head of Leaps by Bayer, Bayer’s impact investment division, and SynBioBeta partner.

Eckhardt praised the scientific community’s astounding biotech innovations over the last nine months of the COVID-19 pandemic. But he also acknowledged just how badly public belief in science has been eroded by misinformation. To regain trust, leaders of industry, government, and public institutions “need to engage and work together with unprecedented transparency and clarity,” said Eckhardt. Leaps Talks are part of this transparency mission. “We can’t afford not to [work together]. So much is at stake,” Eckhardt concluded.

It feels like our world is balanced on a precipice. The last hundred years have arguably seen the greatest scientific and technological advancements in human history. But now, we are moving from the Information Age to the Disinformation Age—and belief in science, once a bedrock of society, has been swept up in the tumult.   

In our talk, Harris explained social media’s role in undermining basic facts. For Big Tech algorithms, embellished, salacious, and downright untruthful content is the most profitable. Companies like Facebook operate in the Attention Economy, enticing users to stay engaged with the platform for the benefit of advertisers. And nothing attracts attention more than exaggeration and falsehoods. The truth, by comparison, is boring. Users, unbeholden to journalistic standards, are manipulated into creating free content for advertisers. Algorithms don’t care what users post, as long as millions see it.

Social media disinformation is the crisis of today. But what about the tech crisis of tomorrow? I asked Harris for his thoughts on emerging technology like GPT-3, a language prediction model by OpenAI. This platform can spit out made-up quotes by any given person based on what they’ve previously said online. Suddenly, anyone can be made to say anything and there’s no real way (yet) to disprove that. Harris likened GPT-3 to “a digital nuke for trust.” This issue is critical for the biotech industry to recognize. It’s not enough to evaluate the potential negative consequences of market-ready platforms. The industry needs to establish systems now to address the societal impacts of technology that, today, is still part science fiction.

A prototype for one of these science fiction-seeming devices was recently unveiled by Elon Musk. Musk’s proposed biohacking implant, Neuralink, can both upload and download information from the brain. Harris expressed concern and pointed to Facebook as a case study. “Many of the examples [of] technology harming us has to do with us not seeing where the technology has manipulated our weaknesses,” says Harris.

Technology like Neuralink is billed as a way to optimize the human brain. But there’s a big catch. Even the most logical person can be overridden by their core emotional instincts. Harris quoted biologist E.O. Wilson, saying, “We have Paleolithic emotions, medieval institutions, and god-like technology.”  If we don’t recognize our fixed characteristics, like the need for social inclusion, technology like Neuralink could essentially be hacked by “the mass manipulation of social approval,” says Harris.

Evolutionarily speaking, disapproval or dislike from another person or community is more important to survival than complex, higher-level analysis. We’ve already seen what happens when social media algorithms hack this evolutionary pathway from outside the brain. If poorly designed, moving that technology into the brain could have terrifying consequences.

The need to anticipate the worst potential outcomes of our best scientific intentions is clear. In our discussion, I expressed to Harris that the synthetic biology industry is on the cusp of a revolution, particularly in reading, writing, and editing DNA. But as an industry we’re also concerned with the ethics of what we’re doing and how our technology is going to impact society and our environment. Having seen what happened to social media, I asked Harris what advice he could offer to the biotech community.

Harris said the first step in building ethical technology is for companies to understand their weaknesses and red team their thinking. “Under what conditions would the very thing that I’m saying would be great for the world lead to the exact opposite of that? Actively imagine a dystopia that could occur from this positive thing,” says Harris. Doing this will take humility and skepticism—challenging an idea can feel like a put-down. Harris says skepticism doesn’t have to be negative. Being critical and even cynical of our ideas is key to an ethical development process.

Harris stressed that technology is not inherently harmful. But living medicines, algae that absorb atmospheric carbon, DNA editing: these are god-like technologies. “We have to have the wisdom, love, and prudence of gods if we’re going to have the power of gods,” says Harris, paraphrasing Barbara Marx Hubbard. This is how social media failed; the industry de-coupled its god-like power from its commensurate responsibility.

So, what needs to happen in order for Big Tech to shoulder its responsibilities? Harris says we need to view human nature with more compassion. “And we need a different venture capital model where we invest in things that are thoughtfully done, ethically.” He also says we need to leverage policy and journalism to de-incentivize the “world-dominating mindset” of “racing around to grow fast at all costs.”

I asked Harris what responsibilities Big Tech leaders have in rebuilding the trust we’ve lost in facts and science. For Harris, the path forward is through truth and reconciliation. The Big Tech companies are among the most profitable in human history. It’s on them to use their funds “to rebuild the society that’s been eroded through their success,” says Harris. We may finally be entering that time of reckoning. The United States House of Representatives recently leveled a blistering, 450-page antitrust report at Facebook, Amazon, Google, and Apple.

Harris summed up our Leaps Talk, saying, “When you’re handing out god-like power like Oprah, we have to be really thoughtful that we pair that power with great responsibility.” Maybe, finally, that responsibility is returning to Big Tech. But now it’s on the biotech community to lead the way forward, with power and responsibility hand-in-hand. 

I’m the founder of SynBioBeta, and some of the companies that I write about—including Leaps by Bayer—are sponsors of the SynBioBeta conference and weekly digest. Thank you to Fiona Rose Mischel for additional research and reporting in this article.
