Like a lot of us these days, I can’t help but notice the amount of tension and divisiveness that seems to be everywhere you turn. From the highest seats in our government all the way to our own kitchen tables, this divisiveness has rooted itself into almost everything.
This can manifest itself in a variety of ways from simple bickering to outright violence. You may no longer speak to your cousins because they’ve “gone off the deep end”, or you could be an innocent teenage kid who gets shot for accidentally knocking on the wrong door.
Although divisiveness and conflict are certainly not new to society, the way that they are currently spreading has been accelerated by our use of technology. Now, I’m not one of those who thinks that every advance in technology spells the end for humanity. But, I am also not naive enough to overlook the fact that it certainly has that potential.
Technology is a tool, or even a vehicle, plain and simple. And much like any other vehicle, you can use it to drive your family to beautiful destinations, or you can use it to run over pedestrians on your way to driving off a cliff. The outcome depends on how you use it. And nowadays its use seems to be more in line with the latter. But technology is so ingrained in our modern-day lives that to say we need to get rid of it altogether is as ridiculous as it is impossible.
So, what can we do? How is tech contributing to, or even causing, this polarization? And how do we use it to bridge these current divides? Back in February I took a trip out to San Francisco for the Designing Tech for Social Cohesion conference to find out.
The conference was organized by members of some of the global leaders in peacebuilding, like Shamil Idriss, CEO of Search for Common Ground, and Dr. Lisa Schirch of the Toda Peace Institute. It featured people like Daanish Masood, a technologist at the United Nations Department of Political and Peacebuilding Affairs, and Calista Small and Daniel Yudkin of More In Common. The goal of this conference was to put these peacebuilders in the same room as those who design our technology, so we can find ways to embed social cohesion into the code and use this vehicle to build empathy and close perception gaps. Among the designers were Tristan Harris from the Center for Humane Technology, Colin Megill of Polis, Andrew Konya of Remesh, and my two favorites, Waidehi Gokhale of Soliya and Lisa Conn of Gatheround.
Steve Jobs once pointed out that computers are like bicycles. They enhance what we can do. Although we can steer the bike, it can only go where the road is paved. And the road the bike travels on is paved in code.
Right now, that road is paved for profit. The goal is to keep your eyes on the screen. Period. That's how these big tech companies make money. Unfortunately, there is more profit in our collective dysfunction. Because, as it turns out, outrage mixed with splashes of vindication is very compelling and satisfying, and it will keep you tuned in and coming back for more. At the beginning of the conference, Shamil Idriss got on stage and laid out the three steps to peacebuilding: connection leads to collaboration, which leads to breakthroughs. Sounds simple, right? But look at where we are today. Connection? We are increasingly divided. Collaboration? We are working against, not with, each other. Breakthroughs? Things are breaking down.
According to Dr. Schirch, "the road to hell is paved in code." As she puts it, the platforms we use today, Twitter, Facebook, YouTube, etc., are designed like digital coliseums, where gladiators fight it out while a passive and anonymous audience cheers and comments, and where the algorithms don't create an environment for constructive conversation but rather an arena for battle.
Trust &amp; Safety groups were formed by the major tech companies to create a set of business practices for their online platforms to reduce users' risk of exposure to harmful and fraudulent content. But that didn't last long. The end of 2022 and the beginning of 2023 saw companies like Meta, Amazon, Alphabet, and Twitter all drastically reduce the size of the teams that fight online hate and misinformation, to cut costs and maximize profit. And more recently, Elon Musk and Twitter started rolling back misinformation rules, a move quickly followed by Mark Zuckerberg and Meta.
What these Trust and Safety groups found while they were involved was that content moderation alone cannot (and will not ever be able to) keep pace with the amount of hate and misinformation online. We need to build from the ground up. It has to be embedded in the code. It has to be written into the very structure of these platforms. We need to start paving this road to lead toward social cohesion rather than the coliseum. We have to design our tech to be a more benevolent manipulator instead of a shit-stirring instigator.
Because, at this point, none of us are blind to the fact that online hate does "spill over" into the real world. With real physical violence. And it's not just violence that is "spilling over." A recent warning from the Surgeon General of the United States, Dr. Vivek Murthy, explained the dangers of social media for the mental health of today's youth. In his report he points out that social media use among young people ages 13 to 17 is almost universal at 95%, with more than a third saying they use it "almost constantly." This age represents a critical stage in brain and social development. So while they are in this critical stage and engaging almost constantly with technology, they "are exposed to harmful content on social media, ranging from violent and sexual content, to bullying and harassment," according to Dr. Murthy.
The point here is that ALL tech design involves social engineering. The consistent use of these devices and platforms embeds habits and values in their users. As Meagan Mitchell from New Republic said at the conference, "Tech is shaping what it is to be a human right now." It is time that we accept that, and accept what we need to do to fix it. We need to accept and address the fact that many of the tech designers who built this coliseum built in targeted design features to make engaging with it as addictive as possible. Again, your eyes on the screen is the one and only goal.
And we can see how this is panning out: heightened outrage and anger that lead to polarization, which means more individuals are being pushed to the extreme ends of their ideology. The way the code is written now taps into emotional contagion while stifling emotional communication, then compounds the effect by enabling it to become a social contagion.
What does that mean?
An emotional contagion is the phenomenon in which the emotional state of one person is transferred to another via unconscious sensory input rather than deliberate communication. It gets you riled up about something before you've had a chance to think about it logically. A social contagion is when this emotion spreads from one large group of people to another, rather than person to person, meaning the moods and perspectives of your group can influence the moods and perspectives of another group.
These contagions are one reason we say things can "go viral." These algorithms and platforms enable and amplify them. And as Gustave Le Bon pointed out over 100 years ago in his book The Crowd, groups experiencing an emotional situation can cause the emotional intensity felt by each individual to intensify even further.
This all leads us, as a whole, to begin to bypass things like empathy and understanding. We become locked in our echo chambers with our like-minded tribe. The algorithm feeds either our outrage at the other, or our vindication with our brother, proving to us over and over again how right we are and how wrong they are.
One of the most important things that I learned at the conference was the concept of “perception gaps”. I had never heard of this term before, but as I soon realized, we all have them.
PERCEPTION GAPS – MORE IN COMMON
At one point in the conference Dr. Schirch invited everyone to get up from the table we had been sitting at and find a seat at a different table, to engage in conversation with someone we had not yet spoken with. That is when I met Calista Small from More In Common.
She casually threw out the phrase "perception gaps," and at first it sounded interesting but I had no idea what it meant. So she gave me a link to the Perception Gap Quiz designed by More In Common, which lays it all out right in front of you.
It is quick, easy, and incredibly eye-opening. You start by defining yourself: are you a Republican, Democrat, or independent? Male or female? Etc. Once they know where you are coming from, they ask your opinion on gun rights, immigration, and sexual assault, among other things. Then they ask you to guess what percentage of "the other side" would agree or disagree with your feelings on these topics. For example: "What percentage of Republicans would agree that properly controlled immigration would be good for America?" So you guess what you think that they think. This shouldn't be hard, because we all make these assumptions about "the other side" and what they think, all of the time.
Next, they take your answers about what you think they think and match it up with their actual views. These views are taken from a survey that was conducted among a sample of 2,100 US adults (974 male, 1126 female, mean age 49) from November 7th to 10th, 2018—the week immediately following the 2018 midterm elections.
This is where things get really interesting. This survey will spit out a graph showing you exactly how far off your assumptions about “the other side” actually are. Check out the examples from their report below:
As we can clearly see, a good chunk of our assumptions about each other are just plain wrong. And the way we engage with each other on these platforms only serves to fortify these incorrect assumptions, a.k.a. perception gaps.
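The arithmetic behind the quiz is simple. Here is a minimal sketch of how a perception gap can be computed; the numbers are made up for illustration and are not More In Common's survey data.

```python
# Hypothetical illustration of a "perception gap": the difference
# between what you *think* the other side believes and what surveys
# show they *actually* believe. The numbers below are invented.

def perception_gap(guessed_pct, actual_pct):
    """Gap, in percentage points, between the guessed and actual share
    of 'the other side' who agree with a statement."""
    return guessed_pct - actual_pct

# "What percentage of Republicans would agree that properly controlled
# immigration would be good for America?"
guess = 50    # what our hypothetical respondent assumes
actual = 85   # what a survey of actual Republicans might show
gap = perception_gap(guess, actual)
print(f"Perception gap: {gap:+d} points")  # negative = agreement underestimated
```

A negative gap means you underestimated how much the other side agrees with you, which is exactly the pattern the report's charts make visible.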
If we don't make the effort to break out of our own echo chambers and engage in curiosity-based conversation with those we disagree with, then how are we ever going to come together and come up with a national course of action? A course of action on things like gun control, abortion rights, or climate change. I'm not suggesting that there is a solution that will please everyone. But as the studies by More In Common suggest, there are solutions we can reach that include more people from both sides, rather than one side versus the other. There are issues where our values are closer together than we think. We are not as divided as we may feel.
The inherent problem with placating one side and enraging the other is that it guarantees any progress made in one direction can and will be dismantled after the next election. One step forward, two steps back. Or, as the guys at More In Common put it, “preventing progress on shared concerns, and undermining Americans’ faith in democracy.”
We have to re-energize this idea of a “United” States of America.
BRIDGING ALGORITHMS
So how can we build these platforms in a way that brings opposing sides together rather than pitting them against each other? How can we expose these perception gaps for what they really are? One suggestion comes from Aviv Ovadya, a technologist and researcher focused on the impacts of internet platforms and artificial intelligence on society and democracy: what he calls Bridging-Based Ranking. He wrote a report on it in May of 2022 for the Technology and Public Purpose Project, published by the Harvard Kennedy School's Belfer Center for Science and International Affairs.
In his report, Aviv explains that the current way platforms like Facebook, among others, use algorithms to amplify or boost content in your feed is to promote content that receives engagement in the form of likes, shares, or comments. What shows up in your feed is determined by an algorithm that picks the things most engaged with.
As mentioned earlier, the best way to get someone to engage is to enrage. The second best way is to echo your currently held beliefs. Again, they want your eyes on the screen for as long as possible. They don’t care if the reason you are staring at the screen is good or bad. They also do not care if that information is true or false. So they “feed” you what angers you and what you agree with. Further driving the divide with outrage and vindication.
As Aviv puts it in his report, these platforms currently reward this divisive content with top placement in your stream of information “resulting in significant impacts on the quality of our decision-making, our capacity to cooperate, the likelihood of violent conflict, and the robustness of democracy.”
He further explains, “In summary, sensationalism and divisiveness attract attention. This drives engagement. Engagement-based ranking systems reward this with more attention…It is this combination of human psychology (what we pay attention to) and societal incentives (our desire for attention and its rewards) that leads to harm; engagement-based recommendations are just a particular way to increase the reward and thus the harm…All of this leads rapidly to a race to the bottom.”
His suggestion, “We can call such a system a bridging recommender—a recommendation system that rewards content that helps bridge divides. It would use bridging-based ranking to reward content that leads to positive interactions across diverse audiences, even when the topic may be divisive.”
In other words, these algorithms would reward posts and articles that help opposing sides understand each other. And, in my opinion, that should be the goal here: not to get opposing sides to agree with each other, necessarily, but to help them understand each other.
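To make the contrast concrete, here is a toy sketch of the two ranking philosophies. The scoring functions and the data are illustrative assumptions of mine, not the actual algorithms of any platform or the specific method in Aviv's report.

```python
# A toy contrast between engagement-based ranking and the idea behind
# bridging-based ranking. Posts and scores are invented for illustration.

posts = [
    # (title, reactions from group A, reactions from group B)
    ("Outrage bait dunking on the other side", 900, 10),
    ("Nuanced post both sides found fair",     400, 380),
]

def engagement_score(post):
    _, a, b = post
    return a + b  # reward raw attention, wherever it comes from

def bridging_score(post):
    _, a, b = post
    # Reward approval *across* groups: the score is capped by the group
    # that engaged least, so one-sided outrage scores poorly.
    return min(a, b)

print(max(posts, key=engagement_score)[0])  # the outrage post wins
print(max(posts, key=bridging_score)[0])    # the bridging post wins
```

Under the engagement score, the outrage post (910 total reactions) outranks the nuanced one (780); under the bridging score, the nuanced post (380 cross-group) easily beats the outrage post (10). That inversion is the whole point of rewarding positive interactions across diverse audiences.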
Because there will always be groups within our society who inherently disagree with each other. But as Julia Kamin of the Civic Health Project states, “It’s not about getting rid of conflict. It’s about getting rid of conflict that is generated to sell shirts and ad space.” Conflict can be and is useful for growth. But there is a difference between constructive conflict and destructive conflict. It is how these conflicts are resolved that will either promote growth or divisiveness.
It is imperative that we identify where and how technology is diminishing our cross group communication. Where is it causing our ability to have constructive conversations with someone we disagree with to atrophy? And I believe Aviv and his group have pinpointed a key element that could prove pivotal in turning things around if it is embraced and utilized.
Conflict is inevitable. But nothing resolves conflict and builds empathy better than communication. That’s right. Simple conversation could be the key to this whole thing. But how can conversation be instigated and cultivated through technology? How can platforms be built to explore the complexities and nuance of an issue, that will expose the interdependence our different tribes rely on, to make our society function in a healthy manner? And, how can they do this without engaging in the aspects of tech that drive destructive conflict and polarization?
POLIS
One of the more intriguing ways tech is being designed to help bridge these gaps is called Polis. Polis is supported by a group called The Computational Democracy Project and was founded by Colin Megill.
As described by them on their site, “Polis is a real-time system for gathering, analyzing and understanding what large groups of people think in their own words, enabled by advanced statistics and machine learning.” They work to bring data science to “deliberative democracy”, with the hope to create political practice that better reflects public will.
In general Polis works by taking a question or statement such as, “How can we affect climate change?” and allows the participants to submit short statements that reflect their views on the question. They can also vote to agree, disagree, or pass on statements made by other participants. As these statements and responses accumulate, Polis’ algorithm starts to group participants into clusters of “like-minded” responders. But, my favorite part is that the algorithm is able to identify “consensus statements” across these diverse clusters and show where these different groups actually agree with each other. Participants can see how this is playing out over the days and/or weeks that this conversation is being run. It also gives them the opportunity to create new statements that respond to what they see in that overview. They do all of this without engaging in a “comments” section, which on most platforms is the seat of the divisiveness and hate we find online. As this conversation evolves the statements become more specific, realistic and workable.
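The mechanics above can be sketched in miniature. Real Polis uses dimensionality reduction and clustering over the full vote matrix; this simplified version, with invented votes and a crude two-cluster split, is only meant to show how "consensus statements" can emerge across otherwise divided groups.

```python
# A toy sketch of the Polis idea: cluster participants by their
# agree/disagree votes, then surface "consensus statements" that every
# cluster supports. Votes and the clustering rule are illustrative.

votes = {
    # participant: votes on statements S0..S3 (+1 agree, -1 disagree, 0 pass)
    "p1": [+1, +1, -1, +1],
    "p2": [+1, +1, -1, +1],
    "p3": [-1, +1, +1, +1],
    "p4": [-1, +1, +1,  0],
}

def similarity(a, b):
    # Dot product of vote vectors: higher means more similar voting.
    return sum(x * y for x, y in zip(a, b))

# Crude two-cluster split: assign everyone to whichever seed they vote
# most like. (Real Polis infers clusters instead of picking seeds.)
seeds = ["p1", "p3"]
clusters = {s: [] for s in seeds}
for person, v in votes.items():
    best = max(seeds, key=lambda s: similarity(votes[s], v))
    clusters[best].append(v)

def mean_agreement(cluster, i):
    return sum(v[i] for v in cluster) / len(cluster)

# Consensus statements: positive average agreement in *every* cluster.
n_statements = 4
consensus = [i for i in range(n_statements)
             if all(mean_agreement(c, i) > 0 for c in clusters.values())]
print("Consensus statements:", consensus)
```

Here statements S1 and S3 surface as consensus: the two clusters disagree sharply on S0 and S2, yet both lean positive on S1 and S3, which is the kind of cross-cluster agreement Polis is built to find.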
Now, those who have designed this conversation have real-world feedback that can help make practical, far-reaching, and tangible solutions available to decision makers. These conversations and results have already been applied in places from Bowling Green, Kentucky, to New York City, all the way to Taiwan, to get a real-world consensus on what citizens think and feel about real issues they are facing in their communities. The local governments take this feedback and make their decisions based on what the participants have to say. All of this takes place while the algorithm is pointing toward the consensus statements reached across the different viewpoints.
SOLIYA
Another presenter at the conference that piqued my interest was Waidehi Gokhale. At the time of the conference she was the CEO of Soliya. After the conference I was able to speak with her and it was one of the best conversations that came from this event.
Soliya is an international nonprofit with headquarters in Manhattan and Tunisia. It has been in operation for over 20 years and was founded by Lucas Welch and Liza Chambers. From the onset, they understood that to make the changes necessary to bridge divides, they needed to get more and more people to come together and have meaningful exchanges across those divides. They design each experience to drive dialogue through virtual exchange in a way that leans into empathy and curiosity.
Soliya's original focus was on the 18-to-25 demographic, because it is a period of personality crystallization. It is the time in most people's lives when you find your "tribe" and are exposed to others. As she puts it, "We thrive in tribe, it is human nature. But wouldn't it be wonderful to keep the walls of these tribes permeable so we can bend and flex as things inevitably change?"
Soliya is a fee-for-service business: anyone can come to them and say, "Our community is having trouble with this particular issue," or a professor can come to them and say, "I would like my students to have a facilitated discussion about this topic." They have four global versions and one US-based version, so no matter where you are, they should be able to accommodate you.
They will do an initial consultation and a design process, then deliver a customized program with a facilitator. The applications are extensive: from the professor in the classroom, to a community leader who is having trouble tackling a local issue and would like to get 20 people together to address it, to a company that has expanded to three different regions, or to another country, and is having a hard time getting its people to communicate.
They can also provide facilitator training for those of us who wish to know how to guide others in difficult conversations. Now you can learn how to have these conversations and you can have them with others who can then have them with others, etc.
These dialogue experiences are meant to be facilitated by someone from Soliya who serves as the "invisible hand." The facilitators are there, but they are not contributing. Because, as Waidehi points out, "Can people come together for conversations? Absolutely. Can these conversations remain productive? Less absolutely." These facilitators serve as guardrails to keep things on track. And this is important, because these topics and issues can be difficult for some to navigate without getting emotional.
“We are not driving to common ground.” she explains, “We are driving through the mud.”
“Let’s ask the hard questions and have the tough conversations and at the end of it if we can find agreements, that’s wonderful. But what we are trying to accomplish is for those participating to understand and respect the differences.”
“We can still walk away thinking differently, we can still walk away voting differently, but I am now more human to you and you know that I have heard and respected what you are saying and feeling.”
“This helps us not just put people in drawers and then close them.” Which is something that we all do.
It's worth keeping in mind that although stereotypes do come from somewhere, it is important to recognize our differences without weaponizing them.
My conversation with Waidehi left me optimistic and energized. It was a relief to know that people like her are working on the solution rather than just working on the problem.
GATHEROUND
With things like “alternative facts” these days it’s hard for these polarized groups to agree on common truths. For example, in order for us to have a conversation about shapes we all have to agree that a square has four equal sides and four right angles. Without this shared understanding nothing productive or meaningful will come from our interaction. And, more likely, this interaction will devolve into something more destructive.
After the conference I was able to speak with the CEO of Gatheround, Lisa Conn. Lisa studied at the MIT Sloan School of Management and the MIT Media Lab. She also spent time in Community Partnerships while working at Facebook, now Meta.
During her time at Facebook, Lisa had an excellent insight. She noticed that when groups would form, let's say a "Hiking With Your Kids" group, to use her example, the members would begin to interact. People from different backgrounds and beliefs would join under the shared container and understanding that they liked hiking with their kids. But as these groups evolved, Lisa began to notice that, naturally, differences would come up.
They may start talking about the weather during hikes and this would lead to talks about climate change and differences in opinion would surface. Then the group admin would shut these conversations down saying something like, “No politics here.” etc.
Lisa's insight was that these groups are containers of shared understanding, and exactly the space where conversations could be the most productive. But the tools to facilitate this didn't exist in Facebook. So she left and co-founded Gatheround.
When developing Gatheround, Lisa decided to implement her platform in the container of the workplace, because the workplace is where most adults spend most of their time. It is also the place with the greatest likelihood of "cross-cutting interactions," as she puts it: interactions between people who do not share the same ideology. Even though you may not share all of your co-workers' beliefs, you are more likely to share values and a mission as far as the goals of the workplace are concerned. This helps the walls of the different tribes stay permeable and easily traversed.
The tools used in Gatheround were developed over her time with different peacebuilding organizations, and her approach boils down to: Structure + Facilitation = Engagement. In using the tools Gatheround provides to brainstorm a business problem, or to learn to lead through change, you also learn the tactics of listening, storytelling, compassion, and empathy, with the hope that this will carry over into other parts of your life.
As she gave me a live demo of the platform, I began to get more and more excited. The tools and tactics used are simple and very effective in building camaraderie and empathy. She showed me how a simple question like, "What was your favorite food growing up?" could uncover things about the environment in which you were raised, your socioeconomic background, and your personal preferences, which will ultimately make you a better colleague and collaborator.
Gatheround, as it is today, is the result of many iterations. What started as a list of questions, is now a series of quizzes, polls, flashcards and more. It even has a little photo booth. The experience is highly engaging, fun and very effective. They now have a library of “experiences” as she calls them, that you can implement with your team. This platform is brilliant and so well thought out. For me, this was definitely one of the gems that I stumbled upon because of the conference. This is how technology can be used to expose the interdependence our different tribes rely on to make our society function in a healthy manner. And, it does this without engaging in the aspects of tech that drive destructive conflict and polarization.
CONCLUSION
I went to the Designing Tech for Social Cohesion conference not knowing what I would find. And although, at times, the problem of technology's role in our collective dysfunction can seem insurmountable, there are ways to correct it. Not only that, but there are people who recognize the problem and are implementing effective solutions.
We still have a long way to go. More people need to get involved. More solutions have to be identified and implemented. More light has to be shined, not only on the problem, but also on the people working to remedy it. We cannot continue to blindly consume every new toy tech companies throw at us without examining how it affects us. As a species, our present and future are now inextricably linked to our technology. It does and will modify our behavior, our thoughts, and our emotions. Ignoring this fact means we will have no control over how it modifies us moving forward.
In this volatile and shifting point in our history, I feel the late science fiction writer Isaac Asimov may have put it best, “It is change, continuing change, inevitable change that is the dominant factor in society today. No sensible decision can be made any longer without considering not only the world as it is, but the world as it will be.”
DIG DEEPER:
The Computational Democracy Project
Conntix – Connecting People Through Technology and Innovation
Bridging-Based Ranking
Council on Technology and Social Cohesion