AIS Electronic Library (AISeL)


PACIS 2022 Proceedings


Governance of Social Media Platforms: A Literature Review

Manish Kumar A, Indian Institute of Management Raipur; Sumeet Gupta, Indian Institute of Management Raipur


Social media platforms have become an integral part of our lives. Initially used as a medium of interconnection, they have now become a source of creating, consuming, and disseminating information. The proliferation of misinformation, fake news, and hate speech is among the major challenges posed by these platforms. To curb these challenges, it is crucial to study the governance mechanisms of these platforms. Though there have been studies in the domain of platform governance, there is a dearth of studies on the governance of social media platforms specifically. This paper reviews the studies in the domain of governance of social media platforms and presents them under four major themes: modes of governance, content moderation and combating misinformation, algorithmic governance, and ethical aspects of governance. The paper not only synthesizes the existing literature but also identifies gaps for future research.

Paper Number 1493

Recommended Citation

A, Manish Kumar and Gupta, Sumeet, "Governance of Social Media Platforms: A Literature Review" (2022). PACIS 2022 Proceedings . 150. https://aisel.aisnet.org/pacis2022/150



Manish Kumar A, Sumeet Gupta. Governance of Social Media Platforms: A Literature Review. In Ming-Hui Huang, Guy Gable, Christy M. K. Cheung, Dongming Xu, editors, 26th Pacific Asia Conference on Information Systems, PACIS 2022, Virtual Event / Taipei, Taiwan / Sydney, Australia, July 5-9, 2022, paper 150, 2022.


The dark side of digitalization and social media platform governance: a citizen engagement study

Internet Research

ISSN : 1066-2243

Article publication date: 9 January 2023

Issue publication date: 27 November 2023

Purpose

Social media platforms are a pervasive technology that continues to define the modern world. While social media has brought many benefits to society in terms of connection and content sharing, numerous concerns remain for the governance of social media platforms going forward, including (but not limited to) the spread of misinformation, hate speech and online surveillance. However, the voice of citizens and other non-experts is often missing from such conversations in information systems literature, which has led to an alleged gap between research and the everyday life of citizens.

Design/methodology/approach

The authors address this gap by presenting findings from 16 h of online dialog with 25 citizens on social media platform governance. The online dialog was undertaken as part of a worldwide consultation project called “We, the internet”, which sought to provide citizens with a voice on a range of topics such as “Digitalization and Me,” “My Data, Your Data, Our Data” and “A Strong Digital Public Sphere.” Five phases of thematic analysis were undertaken by the authors to code the corpus of qualitative data.

Findings

Drawing on the Theory of Communicative Action, the authors discuss three dialogical processes critical to citizen discourse: lifeworld reasoning, rationalization and moral action. The findings point toward citizens’ perspectives of current and future issues associated with social media platform governance, including concerns around the multiplicity of digital identities, consent for vulnerable groups and transparency in content moderation. The findings also reveal citizens’ rationalization of the dilemmas faced in addressing these issues going forward, including tensions such as digital accountability vs data privacy, protection vs inclusion and algorithmic censorship vs free speech.

Originality/value

Based on outcomes from this dialogical process, moral actions in the form of policy recommendations are proposed by citizens and for citizens. The authors find that tackling these dark sides of digitalization is something too important to be left to “Big Tech” and equally requires an understanding of citizens’ perspectives to ensure an informed and positive imprint for change.

  • Digitalization of the individual
  • Citizen participation
  • Qualitative study

McCarthy, S. , Rowan, W. , Mahony, C. and Vergne, A. (2023), "The dark side of digitalization and social media platform governance: a citizen engagement study", Internet Research , Vol. 33 No. 6, pp. 2172-2204. https://doi.org/10.1108/INTR-03-2022-0142

Emerald Publishing Limited

Copyright © 2022, Stephen McCarthy, Wendy Rowan, Carolanne Mahony and Antoine Vergne

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode

1. Introduction

Over the last 25 years, social media platforms have transformed human relationships and society as we know it. Recent statistics estimate that nearly 58.4% of the world’s population is now connected to social media, with high growth rates projected in developing nations going forward ( Statista, 2022 ). The proliferation of social networks has brought new opportunities for individuals to connect almost instantaneously with friends, family, co-workers and other social groups across the world ( Cheung et al. , 2015 ; Kapoor et al. , 2018 ; Richey et al. , 2018 ). Social media offers an open channel for user-generated content and communication with a global community, allowing users to engage both within and outside their existing networks ( Richey et al. , 2018 ). This enables hobbyists to form online communities around their shared interests ( Dwivedi et al. , 2018 ), politicians and celebrities to share content with citizens during election campaigns ( Mallipeddi et al. , 2021 ) and entrepreneurs to collaborate with customers during new product development ( Namisango et al. , 2021 ).

Research has primarily focused on the positive impacts of digital technology, with less attention directed toward its negative consequences ( Rauf, 2021 ). However, despite recent advances, numerous ethical challenges remain for social media platform governance going forward ( Aral, 2020 ). Firstly, significant concerns have been raised around the integrity of the information provided on social media, given the rise of fake news and propagation of misinformation and disinformation through social networks ( Aral, 2020 ; Laato et al. , 2020 ; Torres et al. , 2018 ). This was notable during the coronavirus disease 2019 (COVID-19) pandemic when unverified claims about the virus spread through social media leading to “cyberchondria” among some user groups ( Laato et al. , 2020 ). The World Health Organization (2022) has also highlighted risks associated with an online “infodemic” where false or misleading information on disease outbreaks quickly spreads across digital channels leading to confusion and mistrust among the public. Secondly, much of this digitalization of life has been brought about by large companies that seek to profile users’ identities via social media for commercialization purposes. The profit motives of these companies have driven the emergence of “surveillance capitalism” ( Zuboff, 2015 , 2019a ), where social media is used to track citizens online to influence their behavior, e.g. the Cambridge Analytica scandal. New developments in analytics bring an astonishing range of opportunities to “watch” citizens, raising new concerns around privacy and dignity in the digital age ( Leidner and Tona, 2021 ).

Considering these issues, research is urgently needed to steer the future of digitalization and social media platform governance in a more responsible, ethical and inclusive direction ( D’Arcy et al. , 2014 ; Tarafdar et al. , 2015 ; Turel et al. , 2019 ). Responsible governance aims to ensure an informed and positive imprint for change by listening to the voices of different actors affected by a system ( McCarthy et al. , 2020 ; Stahl, 2012 ). The goal is to deliver socio-technical innovations that have beneficial consequences for all by highlighting the responsibilities of different stakeholders to make the “world a better place” ( Walsham, 2012 ). In a similar vein, ethical governance deals with moral principles around how people should act and set rules for judging “right” from “wrong”. This centers on ideals around how society should govern itself beyond formal institutions, rules and processes by empowering individuals to monitor and condemn immoral behaviors that violate principles such as privacy and fairness ( Chen et al. , 2020 ; Molas-Gallart, 2012 ). Inclusive governance, meanwhile, aims to close the gap between scientists and citizens by promoting equitable, open and transparent collaboration through citizen engagement ( Lukyanenko et al. , 2020 ). Inclusivity helps reduce the barriers to participation by engaging traditionally marginalized groups ( Liu and Zheng, 2012 ). For instance, online platforms can support enhanced levels of inclusivity, allowing citizens with impairments and disabilities a chance to influence policymaking ( Leible et al. , 2022, p. 2 ). Inclusive citizen engagement, therefore, incorporates a diversity of contributors who are each provided an equal opportunity to participate ( Liu and Zheng, 2012 ). Interactions between citizens, researchers and other stakeholders can, in turn, foster knowledge creation and transform how we understand the world through meaningful collaboration ( Kullenberg and Kasperowski, 2016 ).

Despite the importance of societal issues to Information Systems (IS) research, the voice of the citizen is often missing - as evident from the dearth of IS literature on citizen engagement studies compared to other disciplines ( Lukyanenko et al. , 2020 ; Weinhardt et al. , 2020 ). Our paper explores citizen engagement’s role in supporting inclusive deliberation on the “dark sides” of digitalization (cf. Tarafdar et al. , 2015 ; Turel et al. , 2019 ) and social media platforms, more specifically. We investigate this objective through the following research question: How do citizens perceive the current and future issues associated with social media platform governance? Findings are presented from “We, the Internet”, a citizen engagement project which sought citizens’ thoughts and feelings around online governance using participation events and foresight methods. Discussions centered on how Internet technologies should be governed in the future, with support from strategic partners such as the United Nations, Internet Society, Google, The United Nations Educational, Scientific and Cultural Organization (UNESCO) and the Wikimedia Foundation. The primary aim was to explore different perspectives on social media governance and the digitalization of life more broadly. Questions included: (1) How should social media platforms be managed and governed? (2) What is the role of the different actors in this interdependent system? (3) What will be the impact of emerging technologies on governance? The dialog included not only researchers, practitioners and policymakers but also Irish citizens who were part of a larger participant group located across the world. The Internet enabled diverse citizen groups to gain equal access to policymaking roundtables through electronic citizen participation forums ( Leible et al. , 2022 ).

Based on our findings, we present three primary contributions which will be of interest to academic and practitioner communities. Our first contribution is to reveal citizens’ perceptions of issues associated with social media platform governance and their rationalization of the ethical dilemmas faced in addressing these issues. In doing so, we answer calls for increased citizen involvement in IS research on digitalization to move existing conversations beyond the political and commercial sphere ( Kapoor et al. , 2018 ; Weinhardt et al. , 2020 ). Our findings reveal citizen concerns around the multiplicity of digital identities, sign-up consent for vulnerable groups and transparency in the moderation of content on social media timelines. Citizens also point towards ethical dilemmas in addressing each of these concerns, including tensions between the need for digital accountability vs data privacy, protection vs inclusion of user groups, as well as algorithmic censorship vs free speech. Drawing on the Theory of Communicative Action ( Habermas, 1984 , 1990 ), our second contribution is to explore how citizen engagement can harness the power of collectives to deliberate ethical dilemmas associated with Information Technology (IT). We discuss three dialogical processes critical to discourse ethics: lifeworld reasoning , rationalization and moral action . Lifeworld reasoning is a communicative act in which citizens share their pre-reflective knowledge and experiences, while rationalization involves argumentative processes where these speech acts are continuously exposed to criticism ( Habermas, 1984 ; Ross and Chiasson, 2011 ). Moral actions then involve coordinating new rules for the external world (e.g. society) based on mutual understanding, legitimacy and consensus ( Mingers and Walsham, 2010 ).

We further discuss the interplay between these three dialogical processes for inclusive deliberation on the ethical implications of IT. Thirdly, we discuss how citizen engagement can yield valuable insights into shaping a digital policy for the future. We assert that social media governance is not a trivial question to be left to “Big Tech” but a global issue demanding engagement from all members of society ( Ransbotham et al. , 2016 ). This can enable citizens to become the new pioneers of evolved and ethical technology use, developing new meanings for our digital experiences online. We argue that this process is enhanced by the support of strategic partners who can help combat public skepticism in IS research and policymaking ( Lukyanenko et al. , 2020 ; Weinhardt et al. , 2020 ), discussing how the United Nations’ involvement in “We, the Internet” as part of their program on Digital Cooperation reinforced the message that governance cannot be left to individual organizations and remains a societal issue ( Guterres, 2020 ).

The remainder of this paper is structured as follows: Section 2 provides the background to our study by reviewing literature on the dark sides of social media, citizen engagement and the Theory of Communicative Action. Section 3 provides an overview of our citizen engagement study, while findings are presented in Section 4 . Section 5 discusses contributions from our study, while Section 6 brings the paper to a close, emphasizing the importance of citizen input in IT governance.

2. Conceptual foundations

2.1 The dark sides of digitalization and social media

An emerging body of literature on the “dark sides” of digitalization points toward several negative consequences (both intended and unintended) from the use of ubiquitous technologies such as social media ( D’Arcy et al. , 2014 ; Ransbotham et al. , 2016 ; Turel et al. , 2019 ). Our study focuses on three related areas of research: the emotional and social impacts of ubiquitous technology use, misinformation and the credibility of online content and data privacy loss.

A growing body of research suggests that the use of digital technologies, such as social media, is intimately linked to our social and psychological well-being ( Agogo and Hess, 2018 ; Salo et al. , 2022 ). Psychological well-being is more than being free from depression, anxiety and stress; it is about being well-supported, empowered and satisfied with life ( Winefield et al. , 2012 ). On the “bright side”, social media can have a positive impact on users’ sense of social connectedness ( Brown and Kuss, 2020 ) by contributing to one’s sense of identity and self-representation ( Walther et al. , 2011) . However, this can also lead to the development of “bad habits” when users engage in negative self-comparisons and habitually respond to social media notifications to avoid a fear of missing out ( Walther et al. , 2011 ). Several studies have found a link between social media use and experiences of negative emotional states such as depression, attachment anxiety, personality distortion, mental exhaustion and attention deficiency, among others ( Busch and McCarthy, 2021 ; Sriwilai and Charoensukmongkol, 2016 ). Literature also highlights that the relationship between psychological well-being and social media use is far from straightforward. Lin et al. (2016) found that while social media use was significantly associated with depression, depressed individuals were also more inclined to interact with social media.

Much of this negativity in reporting is related to the emergence of new digital vulnerabilities such as online harassment by cybermobs and “sock puppet” accounts, e.g. offensive speech and social shaming ( Lowry et al. , 2016 ; Ransbotham et al. , 2016 ). Recent studies have shown a link between the level of anonymity offered by social media and the level of user-directed aggression displayed ( Lowry et al. , 2016 ). For instance, McHugh et al. (2018) find that risk exposure through social media (e.g. cyberbullying, sexual solicitations and seeing explicit content) can lead to symptoms of post-traumatic stress disorder among teens. Johnson (2018) also finds that “mob rules” formed by online groups can sometimes make users unable to distinguish between just and unjust actions online. Trolling differs from other forms of online misbehavior as its central focus tends to be deception and mischief, with the troll seeking to get a reaction from the larger community ( Cruz et al. , 2018 ; Dineva and Breitsohl, 2022 ). Trolling can escalate into cyberbullying ( Cruz et al. , 2018 ), causing psychological distress for both the perpetrator and the victim ( Dineva and Breitsohl, 2022 ).

Secondly, digitalization can transform what we believe, how we think, feel and act: the most extreme case of this being when terrorist groups connect and radicalize citizens using social media to propagate hate speech and misinformation ( Ransbotham et al. , 2016 ; Turel et al. , 2019 ; Thompson, 2011 ). For adolescents, their choice of online communities creates a custom “digital neighborhood”, which can impact their behavior, attitudes and cultural norms ( Brough et al. , 2020 ; Stevens et al. , 2016 ). Mihaylov et al. (2018) point toward the importance of algorithmic classifiers to filter out comments by “opinion manipulation trolls” who seek to compromise the credibility of news forums. While censorship offers one means of tackling these issues, academics have also signaled that human dignity, liberty and autonomy can be compromised by online censorship ( Leidner and Tona, 2021 ; Stoycheff et al. , 2020 ), aspects which are incompatible with debate, consensus and democracy ( Berners-Lee, 2017 ; Robbins and Henschke, 2017 ). It is estimated that 71% of those with Internet access live in countries where they can be imprisoned for posting social media content on political, social, or religious issues ( Stoycheff et al. , 2020 ). Emergency measures to combat misinformation and disinformation, such as automatic screening and the takedown of content, can backfire by negatively impacting legitimate journalism, eroding trust in institutions and pushing users to other platforms ( Radu, 2020 ). Algorithmic bias, caused by flawed data or unconscious assumptions, also poses a significant risk to users who may be falsely penalized ( Ransbotham et al. , 2016 ).

Thirdly, digital surveillance on social media has also raised questions around data privacy loss. This creates an imbalance in power between “the watcher” and “the watched” as companies and governments increasingly use data analytics to profile social media users and promote desired behaviors for commercial or political gain ( Ransbotham et al. , 2016 ; Zuboff, 2019a ). Zuboff (2019b) explains that information represents a new form of power as personal data on users and their social contacts can be monetized and modified to create a new principle of social ordering. This centers on the surplus of behavioral data created in the information age. Like Zuboff (2019b) , Sir Tim Berners-Lee expressed concern that the tech giants have become surveillance platforms. In an open letter on the 28th birthday of the World Wide Web, Sir Tim Berners-Lee (2017) highlighted the danger posed by companies and governments “ watching our every move online and passing extreme laws that trample on our rights to privacy ”. However, Cheung et al . (2015) find that social influence has a more significant impact on self-disclosure through social media than perceived privacy risk, as users are often willing to accept privacy loss in exchange for desired benefits. Paradoxically, research suggests that users perceive others as suffering from the threat of Internet privacy risks rather than themselves, recommending privacy protections to others whilst displaying a decreased willingness to adopt these measures for themselves ( Chen and Atkin, 2021 ). Questions, therefore, remain around the governance of social platforms in the ever-evolving landscape of privacy risks.

2.1.1 Social media platform governance

Concerns about the negative consequence of technology use have often been sidestepped in lieu of our fetish for innovation and commercial gain ( Ransbotham et al. , 2016 ; Stahl, 2012 ; Zuboff, 2015 , 2019a ). After decades of relatively light intervention to enable social media platforms to flourish more freely in their infancy, policymakers and civil society have become increasingly aware of the imperative for dialog on how to harness the benefits while containing the drawbacks ( Gorwa, 2019 ). A report by the World Economic Forum (2013) highlights the risk of “Digital Wildfires”, which combine global governance failures with issues such as misinformation, fraud and cyber-attacks (see Figure 1 ). These wildfires can spread rapidly, spreading provocative content, such as misinformation or inflammatory messages, through open and easily accessible systems ( World Economic Forum, 2013 ; Webb et al. , 2016 ).

Platform governance is a multifaceted concept that encapsulates areas such as content generation, technical infrastructure models, data subjects, policies and regulations ( DeNardis and Hackl, 2015 ; Van Dijck, 2021 ). Social media platforms are public spaces where users can interact and engage in activities such as gaming, learning, consuming news and commerce. Considering the range of activities and variety of users, it can be difficult to predict future conflicts and governance issues ( Almeida et al. , 2016 ). For Han and Guo (2022) , governance can be understood according to two perspectives. In a broad sense, IT governance relates to issues such as information, network culture, domain name regulation, etc. In Floridi’s philosophy of information and information ethics, to understand the impact of actions, we need to look beyond living creatures and their environment to include information itself. Floridi (2002) elaborates on the conceptual nature and basic principles of information and how they can be applied to philosophical problems. As information is a central component of social media platforms, Floridi’s philosophy is worthy of consideration to help guide IT governance in a more ethical direction.

In the narrow sense, IT governance encapsulates various stakeholders working together to formulate common principles, rules, standards and cooperative decision-making procedures. The core proposition is that IT governance should follow a process of co-governance, which is mirrored in the ideas of discourse ethics ( Gorwa, 2019 ; Van Dijck, 2021 ; Han and Guo, 2022 ). Discourse ethics has its home in Habermas’ (1984) Theory of Communicative Action and aims to study how different stakeholders select between technology frames, rejecting misaligned use or disengaging where there is an assessment of little value ( Mingers and Walsham, 2010 ). This goes beyond an individual’s own ethical decision-making process on technology use and considers the implications for wider society ( Stahl, 2012 ; Walsham, 2012 ). Habermas (1993) recognized that debates need to go beyond the justification of norms to debate their application, which can only occur when communities of actors are involved in discursive information ethics. We next discuss citizen engagement as an approach to discourse ethics.

2.2 Citizen engagement

Citizen engagement is an open movement that encourages voluntary participation by diverse citizen populations in research and policymaking ( Lukyanenko et al. , 2020 ; Kullenberg and Kasperowski, 2016 ; Olphert and Damodaran, 2007 ). Citizen engagement ensures the democratization of research by allowing diverse communities to influence decision-making processes by contributing their thoughts, opinions and perspectives ( Levy and Germonprez, 2017 ). This aims to change the way research is conducted by involving the public in the research process, from generating ideas to conducting research and disseminating findings ( Olphert and Damodaran, 2007 ; Weinhardt et al. , 2020 ).

While research has traditionally been the domain of academic experts who control the data-gathering process and analysis of findings, citizen engagement adopts a more non-discriminatory stance asserting that anyone impacted by a system should have a say in how it is designed, regardless of their educational background or subject matter expertise ( Khan and Krishnan, 2021 ; Levy and Germonprez, 2017 ; Lukyanenko et al. , 2020 ; Weinhardt et al. , 2020 ). This requires ongoing collaboration between researchers and members of the public, as epitomized by the discourse ethics school of thought in IS research ( Mingers and Walsham, 2010 ; Someh et al. , 2019 ). The aim is to make research easier to access and more inclusive by empowering citizens with the opportunity to participate. This supports freedom of expression, allowing individuals to realize their full potential through the development of ideas, exploration and self-discovery ( Emerson, 1962 ).

Citizen engagement is also designed to encourage consensus building between a broad range of actors in research, innovation and public policy decisions ( Ju et al. , 2019 ; McCarthy et al. , 2020 ; Olphert and Damodaran, 2007 ). This is achieved using various practices such as foresight processes and scenario planning ( McCarthy et al. , 2020 ). Online communication tools can also be used to make IS research more collaborative by supporting knowledge creation and dialog across geographical locations ( Khan and Krishnan, 2021 ). Lukyanenko et al. (2020) see technology as an open door for practicing collaborative research, breaking down traditional barriers and encouraging diversity and inclusion by harnessing the “wisdom of the crowds”. Indeed, social media can provide a platform for recruiting citizens in research and enable self-expression and the distribution of knowledge beyond the “soundbite” reporting of policy ( Tacke, 2010 ).

The principle of universality aims to bring together volunteer citizen groups to collectively shape a better future for all ( Mingers and Walsham, 2010 ; van der Velden and Mörtberg, 2021 ). For instance, citizen engagement can help promote new policy agendas through community self-organization ( Hecker et al. , 2018 ). Participation efforts can, in turn, lead to increased interest and participation in the democratic process and contribute to a more scientifically literate society ( Weinhardt et al. , 2020 ; Levy and Germonprez, 2017 ; Lukyanenko et al. , 2020 ). Table 1 outlines the participatory ideals of citizen engagement, moving away from centralized control and encouraging community involvement.

The next section draws on the Theory of Communicative Action as a lens for understanding how citizen engagement can support discourse ethics on the dark sides of digitalization.

2.3 Theory of communicative action

The Theory of Communicative Action proposes that collective deliberation between diverse groups can deliver informed and argumentative perspectives on issues of public concern ( Habermas, 1984 , 1990 ). Discourse ethics builds on the philosopher Jürgen Habermas’ theories of communicative rationality and consensus decision-making, and aims to promote involvement by diverse actors to expose decision-makers to a wider variety of perspectives, needs and potential solutions ( Mingers and Walsham, 2010 ). Discourse ethics differs from moral theories, such as principles of human rights, which have a catalog of clear morality ( Beschorner, 2006 ). Instead, discourse ethics is a procedural moral theory where norms are created during open discourse and agreed upon based on mutual understanding and consensus ( Beschorner, 2006 ; Lyytinen and Klein, 1985 ; Yuthas et al. , 2002 ). Discourse ethics proposes that morality can only be achieved through “an ideal-speech situation” where stakeholders can equitably debate the ethical concerns of different proposals ( Someh et al. , 2019 ).

While the ideal-speech situation may seem unattainable and utopian at face value, Habermas (1990) later offered clarifications on how it can be approximated in practice. He acknowledges that speech acts are socially constructed and subject to different interests, which may not always align with the pursuit of mutual understanding. Habermas (1990) nevertheless asserts that ideal-speech situations are a critical standard by which discourse should be evaluated. Therefore, researchers should aim to approximate these critical standards by guiding discussion and encouraging inclusivity ( Ross and Chiasson, 2011 ).

The aim is to bring together different participant groups to collectively shape a better future through the interplay of different dialogical practices ( Habermas, 1984 , 1990 ). The Theory of Communicative Action presents three practices central to social cooperation: lifeworld reasoning , rationalization and moral action (see Figure 2 ).

Lifeworld Reasoning is pre-reflective and centers on participants’ own perspectives, assumptions and ideals for a given situation. Participants are invited to express their inner world by freely sharing honest and personal thoughts on the topic of discussion (known as personal sincerity ). Each participant should be given an equal voice, with legitimacy for everyone’s contribution. Lifeworld reasoning seeks contributions from those directly affected by proposals and embeds their input into decisions ( Mingers and Walsham, 2010 ). It recognizes that systems impact the lifeworld of participants and, conversely, that the lifeworld impacts systems. Participants need to feel comfortable not only giving truthful information but also projecting a genuine picture of themselves to the group; this is vital for developing mutual understanding ( Lyytinen and Klein, 1985 ; Yuthas et al. , 2002 ).

Rationalization then refers to an argumentative process through which speech acts are increasingly exposed to criticism. During rationalization, the lifeworld reasoning of participants is challenged and negotiated through questioning ( Ross and Chiasson, 2011 ). The aim is for participants to justify their position and deliver better arguments by testing them with other participants. Habermas (1984 , 1990) argues that decisions subjected to rationalization suffer less polarization and have a much lower level of volatility than those that have not. Rationalization is, therefore, an “emancipatory communicative act” in which participants build a shared understanding of situations through reasoned argument, negotiation and consensus. The discourse can lead to changes in position due to a better understanding of the issue or impact on others (c.f. Beschorner, 2006 ). A speech act is successful when judged as merit-worthy, authentic and justified by the hearer, who then adopts “ an affirmative position ”. In contrast, a speech act is unsuccessful if it fails to gain uptake from others and must then shift toward new speech acts ( Ngwenyama and Lee, 1997 ).

Lastly, Moral Action refers to the creation and coordination of new rules for the external world (e.g. society) based on consensus and collective interests. Habermas (1984 , 1990) asserts that rules can be legitimate only if they arise from the will of citizens and represent the will of all. Moral action centers on rules which are said to be equally good for every citizen, beyond the interests of any one community or context. Mingers and Walsham (2010) recognize that while this ambitious ideal may never be realized, all debates should aspire toward it. The ideal is established through an open and fair debate process (or ideal speech situation ) and cannot pre-exist dialog or be imposed by more powerful stakeholders ( Someh et al. , 2019 ). Moral actions seek to reduce the ethical impacts of systems on the lifeworld of participants by coordinating actions and developing restraining barriers. Moral actions focus on questions of the good life (“ ethical goodness” ) for individuals (“ ethical-existential ” discourse) or collectives (“ ethical-political ” discourse).

Three conditions are presented for evaluating moral actions and the validity of claims made during dialogical processes: sincerity, truth and rightness. The condition of sincerity relates to the speaker’s inner world and evaluates whether claims of truthfulness are based on subjective perceptions and lifeworld reasoning (e.g. personal experiences). Truth shifts our focus to the external world and whether the speaker’s claims are based on statements with fair assumptions of the objective world (e.g. the performance of an information system). Rightness then directs attention towards the wider context of society to assess the speaker’s claims of legitimacy in relation to “our” social world. All speakers must defend their claims to validity according to these three conditions ( Lyytinen and Klein, 1985 ).

Another key aspect of Habermas’ work is “deliberative democracy”, which proposes that citizens are invited to participate in developing problem-solution pairings for complex challenges. Encouraging citizen participation in this way ensures legitimacy as diverse voices are engaged in creating new norms. Inclusive deliberation can similarly support democratic IS processes, allowing citizens to share their opinions on new systems before they are implemented in an organization or wider society ( Lyytinen and Klein, 1985 ; Ross and Chiasson, 2011 ). We next present the research design behind our citizen engagement study, which follows a Habermasian perspective.

3. Research design

Citizen engagement was selected as an appropriate research approach as it supports the investigation of contested, fragmented and multi-dimensional phenomena through discourse ethics (cf. Mingers and Walsham, 2010 ). Critical social theory was then chosen as the foundational philosophy of this study to develop explanations and understandings of a social situation and critique inequitable conditions ( Ngwenyama and Lee, 1997 ). This methodology is in line with Habermas' view that citizens, given the right environment, “ may become the supreme judges of their own best interests … [and] the idea is thus to open up public, democratic processes, based on dialogue between citizens ” ( Alvesson, 1996, p. 139 ). The critical standard of an “ideal-speech situation” was approximated by a trained group of facilitators who followed a set of guidelines that aimed at supporting intersubjectivity (see Appendix 1 ). During the discourse process, efforts were made to ensure underlying power dynamics in the group were addressed, and any marginalized participants were invited into the discussion to overcome any barriers to consensus.

Our study centered on the global consultation project “We, the Internet” ( https://wetheinternet.org/ ), which explored citizens’ attitudes toward the opportunities and challenges provided by Internet platforms and future developments in this technology ( McCarthy et al. , 2021 ). The project was coordinated by Missions Publiques (France) in collaboration with national organizers in over 80 countries worldwide. These national partners recruited citizens in their respective countries and were part of the facilitation team during the online dialog. In addition, the project had support from public and private strategic partners such as the German Federal Foreign Office, the United Nations, European Commission, World Economic Forum, Wikimedia Foundation, Internet Society and Google. The strategic partners constituted the advisory board and scientific committee to provide conceptual and scientific guidance. This network of partners (see Figure 3 ) was essential in ensuring global outreach with a diversity of participants, as well as enhancing the impact on decision-making processes.

On October 10 and 11, 2020, citizen assemblies were held in over 80 countries simultaneously, with about 25–100 citizens per country. In this paper, we refine our scope to concentrate on results from Ireland, where the co-authors facilitated over 16 h of dialog between 25 citizens. An open registration process was utilized to ensure a diverse sample of citizen participants and encourage discussion and reflection based on different experiences ( Chapple et al. , 2020 ). An online platform allowed volunteers to join two synchronous four-hour sessions across two days. Volunteers did not require expertise on the topic to register and the final sample had representation from different demographics (see Figure 4 ). Following Patton (2002) , we adopted a snowball sampling strategy using various recruitment channels to identify people with an interest in the topic of digitalization. Public adverts were disseminated across several digital and printed outlets, including the university website, social media and local and national news sites. While the demographics included in the final sample were not directly representative of the Irish population, we nevertheless ensured that different age ranges, genders and professions were represented, which supported our aim of inclusivity.

The study received ethical approval from the Social Research Ethics Committee in [ university name withheld for review ], Ireland (Log number 2020-115). All recruited citizens received an information leaflet about the study and were asked to provide their informed consent prior to participating.

3.1 Data collection

Data was collected through 16 h of dialog with citizen participants in Ireland. This translated into an 87-page transcript with 80,662 words (around 20,000 words per group interaction) and 72 pages of field notes. The views of volunteer citizens (see Appendix 2 ) were collected through a set of structured steps during roundtable discussions in virtual breakout rooms. A World Café approach ( Fouché and Light, 2011 ) was used to divide participants into sub-groups with a facilitator present to help guide the emerging discourse on the dark sides of digitalization. The facilitator was responsible for providing an inclusive space for different debates and viewpoints to emerge, moderating potential conflicts between participants and supporting them in working toward a resolution.

The sub-groups comprised about five participants each to provide a suitable atmosphere for everyone to engage in meaningful discourse. High levels of rationalization were vital to ensure that participants had opportunities to declare their position and justify or update it where required based on the evolving group discourse ( Alvesson, 1996 ). It also allowed group members to evaluate statements according to the conditions of sincerity, truth and rightness ( Habermas, 1984 ). Google Docs was used as a collaborative note-taking platform, with all participants invited to record key points from the dialog using designated worksheets. The participants could also include comments and share hyperlinks using the chat function available through Zoom Video Conferencing.

Discourse was organized around seven sessions that covered critical topics on digitalization, social media governance and the future of the Internet (see Table 2 ). Each session followed structured templates which aimed at supporting participants in their discussion (see sample templates in Appendix 3 ). The template designs were informed by the report of the High-Level Panel on Digital Cooperation launched by António Guterres, Secretary-General of the United Nations, and aimed to relate directly to current issues being discussed by policymakers and strategic partners. This encouraged reflection and discussion around a set of prompting questions derived from the report. Templates were uploaded to a virtual collaboration platform and acted as a visual canvas for participants to record ideas using post-it notes. After each session (typically lasting 45–60 min), participants were asked to record their contributions. Following Habermas’ (1984) fundamental principle of deliberation and open dialog, a semi-structured approach to facilitation was adopted to ensure that discussions were not constrained by the templates. Nevertheless, the facilitators also made certain that all prompted questions were answered and that the templates were used consistently across sessions. Participants in the deliberation were not required to read the material beforehand, and their contributions were not differentiated based on prior level of knowledge or expertise. Instead, the citizen engagement study aimed to support open debate on complex issues. For the purposes of this paper, our findings will center on the first four sessions.

3.2 Data analysis

The Gioia Methodology ( Gioia et al. , 2013 ) was followed to analyze transcribed material from our study and cluster interesting findings. The first and second authors coded the qualitative data across five phases to develop first-order concepts, second-order themes and aggregate dimensions.

In phase one, the first and second authors began by continuously reading and rereading 16 h of transcribed content from the citizen engagement study as well as field notes to generate a set of initial codes which they judged as meaningful and important to the study in question. Based on this process of familiarization, the authors noted initial ideas on the transcribed data. In phase two, the authors systematically coded interesting features of the entire data set and collated data relevant to each code. This process allowed strongly expressed ideas to emerge based on citizens’ evaluations of the issues associated with social media governance.

Phase three involved grouping initial codes together to form overarching categories of codes which helped organize the content. The collated codes were sorted into potential themes, with all data on these themes included. In phase four, themes were reviewed and critically appraised by each co-author to ensure they were representative of coded extracts and the entire data set. The authors refined each theme and formed an overall story of the analysis based on clear definitions and names. Finally, themes of interest to the citizen participants were collated, aggregated and reported by the co-authors. The authors selected vivid and compelling extracts guided by the research question and literature review.

Figure 5 illustrates our data structure and the full list of codes. In total, 205 first-order concepts were created during phases one and two of data analysis. This subsequently led to the definition of nine second-order themes during phase three and four aggregate dimensions during phase four. In the final phase, abductive reasoning was employed to draw on plausible theories that might give the authors deeper insights into the coded observations. The authors sought to better understand the dialogical process through which citizens engaged in discussion around the dark sides of digitalization. To do this, we adapted three constructs from the Theory of Communicative Action ( Habermas, 1984 ): Lifeworld Reasoning , Rationalization and Moral Actions . Working with the data, we realized that open codes primarily related to citizens’ own lifeworld reasoning , which was aggregated as personal experiences (feelings and intentions) as well as thoughts on states of affairs/norms in the social world ( Mingers and Walsham, 2010 ). The process of rationalization then captured citizens’ argumentative discussion around the ethical dilemmas faced in addressing concerns from their lifeworld reasoning and was coded as instances where the validity of claims is challenged ( Mingers and Walsham, 2010 ). Finally, platform governance recommendations were aggregated as moral actions (summarized in Table 3 of the discussion section) and centered on suggestions to resolve issues that transcend the interests of a particular group ( Mingers and Walsham, 2010 ). While some coded themes appear to overlap (e.g. data privacy and free speech), the reported findings are distinctive and cover different aspects of the discussion topic. Overlap is also indicative of the complex and interconnected nature of digitalization and social media governance.

The next section provides an overview of the findings from our discourse ethics study.

4. Findings

Findings center on the issues associated with social media platform governance from the citizens’ perspective and point towards the ethical dilemmas they identified in addressing governance issues. Drawing on the Theory of Communicative Action ( Habermas, 1984 , 1990 ), we investigate citizens’ contributions through the lens of lifeworld reasoning (citizens’ personal reflections on social media governance issues) and rationalization (critical debate around the ethical dilemmas associated with social media governance). In the discussion section of our paper, we then explore moral actions aimed at protecting the dignity, autonomy and liberty of citizens on social media. All names are anonymized for confidentiality reasons.

4.1 The multiplicity of digital identities (lifeworld reasoning)

Discussions during Session two, “ Digitalization and Me”, centered on the current loopholes in social media governance that enable citizens to create multiple “ fake identities ” online, often with malicious intent. Several participants shared personal accounts of abuse that they experienced either directly or indirectly through social media platforms and the relative freedoms that anonymity affords “sock puppet” accounts due to an overall lack of accountability. One story centered on a difficult experience involving a “troll” with multiple fake profiles who sought to inflict abuse on innocent users. “Amanda”, a 40-year-old public figure and advocate for employee well-being in the IT sector, noted how “ an individual with numerous fake identities targeted a few people, including myself. A stranger (I) don’t know this person, and they literally put out so much stuff, all our profiles, and photographs. It was huge online bullying harassment ”. “Amanda” also discussed the impact this experience had on her emotionally, in no small part due to the lack of mitigation mechanisms available through the social media platform in question. She described her efforts to seek legal redress for this attack on her privacy; however, the case was eventually dismissed by the courts: “ it led to legal threats at one particular point made against me by a completely mischievous person. Eventually, that case was dropped ”.

Participants also spoke about the potential for collaborative cyberbullying on social media, where users develop a “ herd mentality ” and work together to attack a fellow user of the platform. Reflecting again on the affordances of anonymity, citizens noted how “ hiding behind their digital identity, they can be as cruel and as nasty as they like ” and the lack of real-world implications can empower groups to rapidly spread rumors on social media. The posting of altered photographs (“ Instagram Fakery ”) to a broad audience was also discussed, leading to negative consequences such as body dysmorphia among young users. “Juliet”, a 35-year-old public school teacher working in the local area, reflected on her conversations with a child around Instagram Fakery and how it affects children’s perceptions of identities in real life: “ [the child said] ‘yeah everything is fake, everyone’s edited it’ but then on the flip side of that she said yeah ‘I also wouldn’t talk to someone if I don’t like the look of them [on social media] I just wouldn’t even bother trying to get to know them’ ”. This prompted others to share childhood experiences of bullying before the advent of social media and the difficulties that younger generations experience today. “John”, a 29-year-old law graduate now working in the field of digital rights, noted how prior to social media, victims of bullying would only have to deal with abuse during school hours, as they had reprieve once in the comforts of their family home. However, with social media, cyberbullying can now permeate the safe space of a young person’s home: “ originally, if you were a boy in school [bullying] was only during the day. Nowadays, because everyone has online access, you see that people are bullied so badly because, like, basically the bullying can go on 24/7 ”.

4.2 Digital accountability vs data privacy (rationalization)

Toward the end of session two, participants began to challenge the notion of digital accountability and the multiplicity of digital identity. Some argued the need for “ authenticity ” on social media platforms through new governance mechanisms which ensure real-world repercussions for instigators of abuse, i.e. “naming and shaming ” users in real life. Others argued that accountability was essential to protect victims and respect basic human rights to dignity on social media platforms: “ people seem to lose their normal human inhibitions when they go online. When I’m interacting with people face-to-face, there are certain ways [of] civilized behavior and empathy for other people. An ability to understand that even if you disagree with what he or she is saying, there are ways in which you can do that which are sensitive to their personhood .” Similarly, the lack of governance around digital identities on social media was argued to create more extreme views “ we feel free to make the most bombastic and black and white judgments about other people … taking ridiculously over the top positions about everything and forgetting about the fact they’re talking to other human beings and what you’re saying has an emotional impact .”

However, some citizens countered by highlighting the need for “fake” digital identities today to protect real-world privacy. They argued that “different personas” are required for separating professional and private contexts, asserting these two worlds should not necessarily mix: “ you might [use] a different persona with government organizations, companies, or different social media outlets. [Some are] a lot closer to what your real-world identity would be because you’re logging on to use essential services”. “Jane”, a senior manager with high visibility in her field, made a similar assertion that “ I have my public profile, which is my professional profile, but I refuse to have my private profile at all in most social media … I think it’s very dangerous to mix these. Yet, most people [do] ”. The facilitator “Johnathon”, however, welcomed contrasting approaches to data privacy which were less rational in nature: “ So anybody else got a different perspective on the way that they do it? ”. In response, one citizen noted that younger generations today wish to be transparent online and represent their authentic self, regardless of the potential impacts on their future employability: “ if an employer didn’t find their drunken photos funny on their publicly accessible Facebook page, which they had no intention of privatizing, then they didn’t want to work for that company ”. Younger generations were generally perceived as less concerned about mixing the professional and private worlds. “Thomas”, a 33-year-old male, also explained that this is driven by peer pressure and the demands for continuous self-promotion on social media to present a more socially attractive persona (e.g. opinionated, socially active and adventurous): “ [they’re] chasing social gratification … people used to call TV the opiate of the masses … I don’t even think kids are interested in drugs and cigarettes anymore because they’re on Instagram, and they get their dopamine from that ”. Education was highlighted as a critical intervention to ensure that younger generations are aware of the risks posed by privacy loss and the dangers of mixing personal and professional identities online.

4.3 Consent for vulnerable groups (lifeworld reasoning)

In session three, “ My data, your data, our data ”, discussions centered on the issue of consent online and the minimum age requirement for sign-up (typically 13 and above). “Juliet” again shared personal experiences of how consent policies fail to be enforced in practice and the real ethical risks that this carries for underage users: “ I’d be teaching classes maybe where there are children at eight, nine, ten years old and they all have smart devices, and all have accounts for TikTok or Instagram or various social media sites despite them being much younger than the minimum [age] to actually sign up” . She then reflected on a conversation with parents who allowed their child to bypass the minimum age of consent for social media, conceding that the decision was made due to peer pressure from other children in the class: “I said, how old is your child? (She said) “my child is seven”. Okay, so you obviously put in a fake date of birth to set up this account […] there’s a lot of kids hammering their parents to get a TikTok account. ” Others agreed that peer pressure was a primary driver of social media adoption by vulnerable groups: “ 10- or 11-year-olds … all have Snapchat and TikTok and Facebook and Twitter despite there being age restrictions. Think this age restriction is ridiculous anyway … [13-year-olds] are not informed about the dangers of the apps at that stage in your life” .

Participants then discussed the risk of underage social media users being targeted by sex offenders and extremist groups. One shared a harrowing story of an underage user who was stalked by a known pedophile in their local area using location data on their social media posts to determine the child’s location: “ if you go to Snapchat and you turn on the map, you can see everybody in Snapchat that you’re connected to, where they are and what they’re doing … one of my one of my students got contacted by the guards (police) because this pedophile was tracking his daughter and other pre-teens” . The maturity of underage users to assess their exposure to such risks on social media was discussed (“ too young to understand ”) and some felt they were often more willing to share information freely online, without concern for any negative consequences: “ they’re sitting inside the living room on apps so that’s their real-world [and] there’s no difference between the real world and the online world for a younger generation. They’re living through their phones ”. “Anthony”, a parent with young children, also shared his struggles when trying to communicate the risks of social media engagement, noting that warnings continuously fell on deaf ears: “ if it’s a parent telling them (about the risks online), it’s just like “another lifelong lesson” and they don’t tend to listen. Trying to explain that to young people, especially as a parent, they shut down a lot of the time” .

4.4 Protection vs inclusion (rationalization)

Building on the dialog presented above, participants began to challenge whether protection is the best way to address social media adoption by underage users. One rationale in support of protection was to shield young users from mistakes that could carry repercussions later in life: “ you know, there are pictures of me probably smoking weed when I was 15 or something up online somewhere, but there’s nothing I can do about that. But if you’re even younger, you know that’s crazy … I suppose reality in the physical world and the online world is becoming blurred ”. Young users’ discernment of appropriate content to share online was discussed, as well as the impact this might have on their reputation in adulthood. “Thomas”, whose sibling is on the autistic spectrum, shared a personal story of a young relative with special needs who continuously shares large volumes of information on social media without appreciating the risks it poses to their safety online: “ I know a person that has special needs and really didn’t understand the concept of private information and what you put out into the public. You kind of end up then opening yourself up to vulnerabilities the more you share online; the more you think it’s acceptable to share online ”.

However, others later challenged the recommendation of universal protection, countering the equal need for inclusion across diverse groups. Inclusion was argued to be an inherent advantage of social media: “ I think one of the advantages obviously is that you can send out multiple messages and reach so many more people than you would if you were just trying to do it and physically. You’d never be able to do that, and so you are able to share good news stories, you’re able to share positive photos and messages” . Some argued that the current low barriers to sign-up helped promote diversity on social media, allowing users to freely contribute content and communicate with others across the world: “ I think it [social media] gives people the opportunity to promote themselves in good ways. Like you can tell people: look at this interesting stuff we are doing; come and get involved or … spread the word. ” Inclusion was also highlighted as a way of empowering citizens who may have had limited access to content and resources in the past: “ We take for granted how easy it is to access all of those things now compared to not even ten years ago. Just the access and how quick everything has become at the end of our fingertips. We have access to information but also access to resources that ordinary citizens might not necessarily be involved in ”.

4.5 Transparency in content moderation (lifeworld reasoning)

During session five, “ A strong digital public sphere”, participants shared their personal concerns about the moderation of content on social media timelines. The increase of fake news during the COVID-19 pandemic was discussed, as well as its spillover from the digital world into how people act in the real world, e.g. anti-vax groups. The “ invisible hand ” of algorithms that shape what is displayed on social media timelines was also criticized for lacking transparency: “how can we discern what data is coming from reliable sources and what is not? And how do we deal with the stuff that’s not without simply saying, oh yeah, just ban all that? It’s a question really of judicial discernment and digital awareness.” Many noted that they “ didn’t understand the question ” as they felt AI was a topic beyond their understanding. However, the facilitator “Gary” assured them that “ there are no wrong answers, just reflect on how [social media] is now and what you foresee in the future. It could be kind of a utopian or dystopian view” . In response, one individual described social media timelines as a “ kind of free-floating world which is not grounded in real-life experience, whatever that experience may be … it creates a kind of a sense in which people are living at least partially in a world of fantasy .” Reflecting on earlier discussions around digital identities, participants agreed that social media had become an increasingly unruly environment of public discourse and the impact of “ catfishing ” was noted as another driver of fake news dissemination: “There are a lot of fake identities online and we have heard a lot about people doing catfishing. There is a difference … online; you can be anybody.”

The power that social media companies possess in controlling content, banning users and posting disclaimers on content flagged as fake news was also observed: “Facebook banned a marketing agent’s PR agency in the US who were using teenagers who were using (sock puppet accounts) on Facebook to support Mr. Trump to amplify and comment. So, they were able to shut down that kind of a micro-network [of] groups trying to influence the elections in the US”. However, questions remained around whether it is wise to empower private organizations with absolute control over the censorship of information online, with the lack of transparency around content moderation again noted. Participants felt it was important to continue dialog on the design of moderation policies, how these are enacted and by whom. This was seen as essential to avoid “blind trust” in the algorithms of private companies that moderate content: “ as things become more automated, we begin to trust more that ‘oh look, it’s all done in the background, I don’t need to worry’, but that’s exactly the [concern] … if we had accurate information that’s properly assessed using these really smart technologies then the world would benefit more than having all the information at our fingertips, which none of us have the expertise to analyze” . In the spirit of mutual understanding, the facilitator “Brenda” summarized: “So if we were to come to some sort of consensus […] it all relates to a strong digital public sphere [which] organizations, government bodies, and non-government agencies [should] offer ”.

4.6 Algorithmic censorship vs free speech (rationalization)

Through ongoing discussion, participants next explored the balance between algorithmic censorship and freedom of speech. The European Convention on Human Rights’ (ECHR) position on Internet content as distinct from printed media (subject to different regulations and controls) was noted by several participants, with its underlying rationality questioned: “[social media] is very much a free for all [on] how this information is disseminated … These platforms defend themselves saying they are just platforms and not publishers. I think they should be publishers and should be held legally accountable for what they do. That would be a good start and something which would be a major departure from how things are now.” As a result of ongoing dialog, some felt that regulating social media platforms as “publishers”, like traditional media (e.g. print, television, radio), could help address the concerns around censorship: “I do think there needs to be more accountability to people because of the amount of money and power that [social media companies] have … I don’t know who should regulate [social media companies], but I don’t want the companies self-regulating themselves. There is an international convention, but that doesn’t mean that all countries in the world are following those”.

There was general agreement that balancing algorithmic censorship and free speech would be a complex goal, and trade-offs may come at a price. Some argued that governance should be driven at an international level and “there needs to be an international body set up for the governance of the internet”. In contrast, others felt that judicial pressure from regional legislators is the only way to drive change: “it’s normally judicial pressure to actually get something made. The only reason that this Digital Media Bill is coming in is because the EU is making us.” “Jonas”, a 50-year-old digital inclusivity champion who works with engagement groups across the world, reaffirmed this by arguing that cultural differences around the expectations of censorship and freedom of speech must be accommodated at a regional level: “the issues that we have with Facebook and LinkedIn result from the fact that they see from the perspective of the United States only. They don’t consider the needs of other people. I believe there’s cultural bias in that model of scaling”.

5. Discussion

With this study, we respond to calls by Kapoor et al. (2018) and Weinhardt et al. (2020) for engaged IS research that offers citizens a voice in discussions around ubiquitous platforms such as social media, moving existing conversations beyond the political and commercial sphere. Despite their representation as both users and the “product” of such platforms (cf. Zuboff, 2015 , 2019a ), citizens and other “non-experts” are often missing from conversations on topics such as digitalization, raising questions of legitimacy and representation ( Lukyanenko et al. , 2020 ; Weinhardt et al. , 2020 ). Building on findings from 16 h of online dialog with citizens, we take steps towards addressing the current dearth of citizen engagement studies in IS by investigating their perceptions of issues such as the digitalization of the individual and the digital public sphere ( D’Arcy et al. , 2014 ; Tarafdar et al. , 2015 ; Turel et al. , 2019 ).

Our first research contribution centers on the exploration of citizens’ perceptions of the current issues associated with social media platform governance, as well as their rationalization of the ethical dilemmas faced in addressing these issues going forward ( Kapoor et al. , 2018 ). This includes the multiplicity of digital identities, sign-up consent for vulnerable groups and transparency in the moderation of content, among others. While existing research is mostly focused on the self-disclosure of singular “true” identities online and the impact of privacy risks ( Cheung et al. , 2015 ), our research points towards the complex nature of digital identity and its impact on accountability online. Some citizens discussed the creation of fake personas to represent their ideal selves to certain audiences. By dichotomizing social connections this way, users can control the type of persona they portray, returning some element of control to the user ( Kang and Wei, 2020 ). However, despite the potential for employers to assess candidates using social media, our findings also suggest that younger users may underestimate the risk of privacy loss ( Cheung et al. , 2015 ) by failing to separate their private and professional identities using different accounts. We also extend previous discussions by McHugh et al. (2018) around risk exposure on social media by providing real-life accounts of how vulnerable groups can be stalked by predatory groups online. Anonymity (“digital cloaking”) was identified as a key enabler of online harassment and the omnipresent nature of cybermobs ( Lowry et al. , 2016 ; Johnson, 2018 ; Ransbotham et al. , 2016 ). We also find that although parents can take steps toward protecting their children, further protections are needed at a platform level. This corresponds to accounts of 21% of children in the UK between 8 and 11 years of age owning social media profiles ( Ofcom, 2020 ). Our findings show citizens’ support for Mihaylov et al.’s (2018) assertion that algorithmic censorship can provide a better digital public sphere by filtering out malicious comments and fake news by trolls; however, they also highlight the need for algorithmic transparency to avoid false positives (algorithmic bias) and protect freedom of speech ( Emerson, 1962 ). As highlighted by Radu (2020) , the use of algorithmic censorship to fight misinformation and disinformation during the COVID-19 pandemic was not always successful and sometimes had unintended consequences. This further emphasizes the need for new and improved governance structures in the digital ecosystem.

Our second contribution is to adapt the Theory of Communicative Action ( Habermas, 1984 , 1990 ) to our discussions on how citizen engagement can harness the power of collective deliberation and rationalize potential ethical dilemmas around the governance of digital platforms (cf. Mingers and Walsham, 2010 ; Someh et al. , 2019 ). We highlight the importance of providing citizens with a “stage” to share their concerns and hopes for social media in the future by inductively revealing their perspectives based on a thematic analysis of transcribed dialog. We further uncover the interplay between key dialogical practices of discourse ethics and explore their role in the collective deliberation of governance issues associated with digitalization. Our findings showcase how ordinary citizens, regardless of their educational background or subject matter expertise, can contribute to discussions through lifeworld reasoning and sharing their personal experiences ( Khan and Krishnan, 2021 ). This draws on citizens’ interest in the public good to bring together volunteers who wish to have their say on research and policy ( Lukyanenko et al. , 2020 ). Findings also showcase how citizens engaged in the rationalization of ethical dilemmas faced in addressing governance issues going forward, exploring tensions such as digital accountability vs data privacy, protection vs inclusion and algorithmic censorship vs free speech. Consistent with previous research, we find that the diversity of citizens’ backgrounds can stimulate rationalization by leveraging the plurality of views they hold ( Lukyanenko et al. , 2020 ; Mingers and Walsham, 2010 ). In doing so, we take steps towards answering the call for more discourse ethics studies on IS research topics of societal concern, aspiring towards “an ideal-speech situation” where citizens can equitably debate different proposals ( Mingers and Walsham, 2010 ; Someh et al. , 2019 ).

Discourse ethics guides us to the principles of engagement over norms, through consideration of pragmatics and ethical values, but also through negotiation: being willing to take the other’s perspective, to modify one’s own perspective and to ensure that agreements are based on the “force of argument”, not the force of power ( Mingers and Walsham, 2010 ). The inequalities of society are often reflected in how information systems are designed, which calls into question any claims that technology is value-neutral and free from bias. For instance, algorithms can funnel vulnerable users into “echo chambers”, which proliferate disinformation, while trolls and bots can produce automated messaging content that triggers intergroup conflict ( Jamison et al. , 2019 ). Discourse ethics centers on moral, pragmatic and ethical questions such as: How do value systems affect our actions? What are the consequences of our actions? How do we ensure justice for all? ( Mingers and Walsham, 2010 ). We invite IS scholars to consider the importance of citizen engagement for supporting discourse ethics across three frames: collective thought and discourse work, whereby ethics is an open-ended process of joint deliberation; the balancing of a continuum of values; and the appropriation of ethics as a technical fix, solvable through technical or instrumental means (cf. Wehrens et al. , 2021 ).

For our third research contribution, we present a set of policy recommendations ( moral actions ) proposed by citizens and for citizens. Building on our findings and theoretical model, Table 3 summarizes citizens’ lifeworld reasoning of governance issues in social media and their rationalization of ethical dilemmas. In any moral reasoning, the critical issues include “Do the means justify the ends? Or do the ends justify the means?” Habermas (1984) argues that citizen engagement on such important questions can provide insights into what “the people” want from technology going forward. Habermas (1984 , 1990) asserts that morality and law are not separate and instead constitute an incremental and developing system of interplay. Moral consciousness requires a transition from “pre-conventional” expectations of behavior to “post-conventional” principles, where the ethics of conviction and responsibility are underpinned by formal laws and social norms ( Habermas, 1984 , 1990 ).

Table 3 outlines “post-conventional” principles and calls for future research and policies on new governance models which are both fair and responsible. Although our study focuses on Habermas’ conceptualization of moral action, it should be noted that other versions exist. For example, Rest’s (1994) model of ethical decision-making focuses on the individual rather than Habermas’ collective interest. In his model, Rest proposes a four-stage process that occurs after the individual has been presented with an ethical dilemma: (1) recognizing the situation ( moral awareness ), (2) evaluating choices and impacts ( moral judgment ), (3) choosing how to act ( moral intention ) and (4) the actual behavior ( moral action ) ( Agag et al. , 2016 ). Technology moral action is a complex concept that examines the system, user and designer across three types of responsibility (causal, moral and role), which sometimes overlap ( Johnson and Powers, 2005 ). The doctrine of double effect (DDE), meanwhile, attributed to St. Thomas Aquinas, focuses on the intention of the actor ( McIntyre, 2019 ). DDE postulates that it is acceptable to cause harm when it is an inescapable consequence of achieving a greater good ( Cushman, 2016 ). The fact that the action will cause harm can be foreseen; however, the harm must not be the goal. Foot (1967) offered the “trolley problem” as a method to examine DDE. According to the DDE, it is acceptable to redirect a runaway trolley away from five people to a side track where it will kill one person, because the foreseen side-effect of killing one person is outweighed by the benefit of saving five lives. However, throwing a person in front of the trolley to slow it down, thereby preventing it from hitting the five, is not permissible because, in this version, the person’s death is intended ( Cushman, 2016 ). Studies suggest that moral judgments by ordinary people are often consistent with DDE, although this process is unconscious and automatic ( Cushman et al. , 2006 ). By examining the trolley problem in different scenarios, Foot (1967) highlights the complexity of moral judgment and moral action, opining that other factors must be considered, such as avoiding injury and bringing aid.

Social media was originally intended as a place for sharing content and communicating with fellow users across the globe ( Dwivedi et al. , 2018 ; Mallipeddi et al. , 2021 ; Namisango et al ., 2021 ). More recently, however, the role of such digital platforms in society has started to change, leading to numerous unintended consequences in what is now a global communication channel ( Aral, 2020 ; Laato et al. , 2020 ; Torres et al. , 2018 ). In our “always online world”, ideals about morality, human dignity and law are continuously tested by ubiquitous technologies. Moreover, many of the conventional norms of society are under attack from phenomena such as fake news and the rise of hate speech, showing existing governance models to be inadequate. Our findings suggest the need for a re-evaluation of governance models to reduce users’ exposure to potential negative consequences of social media platforms. This requires a coordinated approach across different stakeholder groups.

In terms of practical contributions, our study provides insights into how citizen engagement practices can be used to engage diverse groups in IS research and policymaking processes ( Lukyanenko et al. , 2020 ; Weinhardt et al. , 2020 ). Our findings contribute insights into engaged scholarship by building on the ethos that collective intelligence emerges from constructive, non-partisan forums. The primary objective of citizen engagement is to encourage transparent collaboration and transform how we understand the world ( Lukyanenko et al. , 2020 ). It is, therefore, not only about deliberating for the sake of deliberating; the aim is to improve decision-making and governance by delivering recommendations to political authorities. Our study demonstrates the potential of citizen engagement to influence future-oriented policymaking ( Lukyanenko et al. , 2020 ; Levy and Germonprez, 2017 ; Weinhardt et al. , 2020 ) when intertwined with the decision-making processes of strategic partners. In the “We, the Internet” project, results from citizen assemblies were later recognized in the Roadmap on Digital Cooperation issued by the UN Secretary General’s (UNSG) Office and are well aligned with the roadmap presented in a UNSG options paper on the Future of Digital Cooperation. Outputs from the project were also endorsed by the German Government and the German Federal Foreign Office as part of the High-Level Panel’s follow-up process, with global findings incorporated into relevant publications.

To enable new forms of informed action, we suggest the need for a “deliberative imperative” in the IS field ( Habermas, 1984 , 1990 ; Lukyanenko et al. , 2020 ). IS researchers can seek to go outside the “organizational walls” to engage diverse groups in formulating problems and developing concrete solutions that are in the best interests of civil society. Our study shows how citizen engagement can offer a mechanism for different stakeholder groups to have an active role in the definition of public policies ( Mingers and Walsham, 2010 ; Someh et al. , 2019 ). In this model, divergent mindsets are put aside, and everyone is given a chance to speak and form enlightened viewpoints that can inspire policymakers. Most importantly, this enables citizens, as a previously underrepresented group, to discuss issues of public concern, debating proposals related to both their daily lives and those of future generations.

5.1 Limitations and future research

There are, nevertheless, limitations inherent in our study that future research can seek to address. Firstly, we recognize limitations associated with some of the primary assumptions of Habermas’ Theory of Communicative Action, which can seem overly idealistic and utopian when taken at face value ( Ross and Chiasson, 2011 ). For instance, definitions of an “ideal speech situation”, “consensus” and “moral action” may seem to neglect some of the realities of social interactions, such as power, conflict and context. We, therefore, suggest that these theoretical assumptions must be adapted to ensure that they are practically useful for the context under investigation. Secondly, the study was primarily focused on the initial stages of engaging citizens in dialog around current challenges. As a result, an in-depth study of potential solutions and the impact of the project outcomes on the future development of social media governance was beyond our paper’s scope. Future studies can seek to longitudinally explore the practical impact of citizen engagement studies on mitigating the dark sides of digitalization. Findings from our study highlight urgent questions for researchers going forward, including: How can we ensure accountability with digital identities? Who decides what is to be done and who has legitimate authority to govern? Further questions abound on the introduction of sanctions, with punishment for offenders of regulated systems. This calls for action by both industry and policymakers.

Our understanding of citizen engagement in IS research and policy is only emerging, with several factors yet to be considered. Firstly, limitations associated with information quality and the self-selection biases of participants involved in citizen engagement studies should be recognized ( Lukyanenko et al. , 2020 ). We suggest that future research also explore the fit between research purpose and citizen engagement design, as well as policymakers’ assessment of whether the resulting findings stand up to political scrutiny ( Ju et al. , 2019 ). Future research can also contribute insights into the potential of discourse ethics to foster change around technology use, beginning at the individual level and moving through groups to eventually support societal-level acceptance. Our findings on the need to protect underage users on social media suggest that future research should include younger users in discussions around the design and governance of online platforms. Lastly, future studies can seek to explore other “dark sides” of digitalization through engagement with the citizens directly affected by technologies, as well as the industry practitioners and policymakers responsible for their governance. Our research provides initial insights into how IS researchers might support openness by engaging diverse citizen groups on socio-political issues.

6. Conclusion

In this paper, we explored concerns and ethical dilemmas surrounding the dark sides of digitalization based on findings from a citizen engagement study. Drawing on the Theory of Communicative Action ( Habermas, 1984 , 1990 , 1993 ), we reveal that the interplay between different dialogical practices ( lifeworld reasoning , rationalization and moral action ) can promote critical input for the future of social media governance. We further discuss the study’s practical implications and highlight the need for consultation processes to be intertwined with the policymaking processes of high-level strategic partners to deliver concrete actions that address ethical dilemmas associated with social media platforms. Research on this citizen-led process can provide valuable insights and the impulse for transformative change, beginning with an inclusive, deliberative process that explores the attitudes of individuals towards the dark sides of digitalization. Combined with targeted policy recommendations, citizen discourse can, in turn, act as input for political decision-making. Future research efforts are vital for building a deeper understanding of complex issues, such as the interaction between open dialogs and the governance of digital technologies going forward.

Figures: Digital wildfires in a hyperconnected world; Theory of communicative action; We, the Internet consortium; Overview of participating citizens in Ireland; Data structure

Principles of citizen engagement

Principle | Definition | References
Democratization | Citizens from diverse backgrounds (e.g. education) are invited and actively encouraged to participate in the research design and development process | (2020)
Non-discrimination | Participation is not limited to experts, and all contributions are treated equally. This implies the facilitation of dialog, free from censorship | (2020), (2020)
Consensus | Shared understanding is sought through negotiation and transparency during the participatory process | (2019)
Universality | The aim is to formulate recommendations that go beyond a single context or the interests of a single community |

Overview of the citizen engagement sessions

Session | Description | Objective
Introduction | Facilitators presented the study’s objectives and program for dialog | To welcome participants and ensure they are aware of rules for good dialog, e.g. active listening and respect
Digitalization and me | Participants explored personal and collective experiences with the Internet and, more specifically, social media | To understand basic concepts of digital governance from the perspective of citizens
My data, your data, our data | Participants explored their understanding of digital identities on social media and the data footprint left through interactions on social media | To reflect on citizens’ understanding of the data that is gathered online
A strong digital public sphere | Participants explored their wishes for a strong digital public sphere | To explore the impacts of disinformation and approaches to tackle it online
Exploring artificial intelligence | Participants exchanged their thoughts and feelings about AI through targeted advertising and personalization of content | To explore citizens’ basic understanding of the data analytic practices adopted by social media platforms
Internet for and with citizens | Participants proposed policy actions for governance going forward | To discuss and prioritize policy actions for addressing ethical issues with social media
Conclusion | Facilitators closed the session and summarized the results | To outline the next steps of the project and thank participants

Summary of findings from the citizen engagement study

Digital accountability vs data privacy
Lifeworld reasoning | Citizens shared personal experiences where the guise of multiple digital identities allowed trolls to engage in acts of hate speech without consequence. Collective digital identities were also discussed, where users engage in collaborative acts of hate speech driven by a herd mentality
Rationalization | Citizens felt that the right to human dignity is currently set against a backdrop of “anything goes” with little to no accountability for harmful acts on social media. Herein lies the dilemma between social media as an open platform for connection and contribution and the need for content moderation
Moral action | Citizens believe education on what constitutes digital accountability and how this should be addressed in the future is crucial. This should be based on an assessment of human dignity for people from all levels of society. They believe moral caveats of decency and respect for self and others also require further consideration

Protection vs inclusion
Lifeworld reasoning | Citizens were concerned for young (often underage) users who feel increased pressure for peer validation on social media. The expectations of social conformity are pitched against the risks of vulnerable users being targeted by dangerous groups through their publicly visible profiles
Rationalization | A dilemma is perceived by citizens between the desire for inclusion on social media through equitable access to services and the need to protect vulnerable groups against risks online
Moral action | Citizens called on policymakers to increase protection by monitoring and enforcing the regulations around age requirements. Such consent policies should be inclusive but protectionist. They believe more clarity and openness are needed on ‘the right to be forgotten’ as per the General Data Protection Regulation (GDPR)

Algorithmic censorship vs free speech
Lifeworld reasoning | Citizens discussed how actions taken in the digital public sphere can impact different social groups and society more broadly. Human rights laws are seen as set against ambiguity around social media platforms as “publishers”
Rationalization | Dilemmas are perceived between a duty to respect freedom of speech and the challenge of making social media a trustworthy and safe place for all. Citizens questioned who (e.g. regulators) or what (e.g. algorithms) should moderate the content
Moral action | Citizens called for legal changes at both an international and regional level. This requires consensus and cooperation among international organizations, nation-states and social media platforms. Actions must consider individual and cultural needs

Appendix 1 Guidelines for discourse facilitation

As a facilitator, you have one of the most important roles during the day. You are the guarantor of the quality of the deliberation at your table, and hence at all tables and for all the dialogs. The quality of the results often depends on the quality of the facilitation.

Be neutral: be aware of your “power” over participants and do not misuse it. Your role is not to influence deliberations, but to facilitate them by reformulating and helping participants to express their thoughts.

Be aware of the group’s dynamics: pay attention to the participants and be able to identify if someone at the table is uncomfortable.

Be clear: reformulate and make sure participants understand the purpose of the discussions.

Be inclusive: identify participants that are participating less and try to make them comfortable enough to speak out.

Be polite: always be friendly, even with participants that are less polite.

The dialog must above all respect one value: inclusiveness. We are committed to it.

By welcoming each participant individually at your table (virtual or real), and by making sure that they feel comfortable and have a good day.

By introducing yourself at the beginning of the day and when participants change tables (if the participants change table/virtual room). That way, citizens at your table can identify you and their fellow citizens.

By explaining the principle of inclusiveness to citizens from the very beginning of the day: “No one here is an expert on the Internet and its issues. We are here to listen to you and to give you the keys you need to deepen your opinions. You are here to exchange with one another”.

As a facilitator, you must regulate speaking: some people are more comfortable speakers than others. In a group, there may be some people who monopolize the floor and others who are naturally withdrawn. In the latter case, do not hesitate to ask quieter participants if they have anything to add or supplement to what has been said, without pushing them to speak if they do not want to. Conversely, you will have to rein in participants who speak a bit too much and do not leave space for others.

Appendix 2 Anonymized list of volunteer participants

ID | Background | Subgroup
Participant 1 | Retired senior citizen | Subgroup 1
Participant 2 | Small business owner | Subgroup 1
Participant 3 | Head of social media in an advertising agency | Subgroup 1
Participant 4 | Teacher in a primary school | Subgroup 1
Participant 5 | Unemployed | Subgroup 1
Participant 6 | IT Director | Subgroup 2
Participant 7 | Coach and Mentor | Subgroup 2
Participant 8 | Undergraduate student | Subgroup 2
Participant 9 | Social media business evangelist | Subgroup 2
Participant 10 | University lecturer | Subgroup 2
Participant 11 | Marketing and communications professional | Subgroup 3
Participant 12 | PhD Student | Subgroup 3
Participant 13 | Postgraduate student (Law) | Subgroup 3
Participant 14 | Independent non-executive Chairman and Director | Subgroup 3
Participant 15 | Retired senior citizen | Subgroup 3
Participant 16 | Information scientist | Subgroup 4
Participant 17 | University lecturer | Subgroup 4
Participant 18 | Researcher/legal practitioner | Subgroup 4
Participant 19 | Chief Executive Officer | Subgroup 4
Participant 20 | Student (Applied Psychology) | Subgroup 4
Participant 21 | Undergraduate student | Subgroup 5
Participant 22 | Marketing manager | Subgroup 5
Participant 23 | Undergraduate student | Subgroup 5
Participant 24 | Administration professional | Subgroup 5
Participant 25 | Owner of HR firm | Subgroup 5

Appendix 3 Sample templates from the citizen engagement study


Agag , G. , El-masry , A. , Alharbi , N.S. and Ahmed Almamy , A. ( 2016 ), “ Development and validation of an instrument to measure online retailing ethics: consumers’ perspective ”, Internet Research , Vol.  26 No.  5 , pp.  1158 - 1180 , doi: 10.1108/IntR-09-2015-0272 .

Agogo , D. and Hess , T.J. ( 2018 ), “ How does tech make you feel? A review and examination of negative affective responses to technology use ”, European Journal of Information Systems , Vol.  27 No.  5 , pp.  570 - 599 , doi: 10.1080/0960085X.2018.1435230 .

Almeida , V.A.F. , Doneda , D. and Córdova , Y. ( 2016 ), “ Whither social media governance? ”, IEEE Internet Computing , Vol.  20 No.  2 , pp.  82 - 84 , doi: 10.1109/MIC.2016.32 .

Alvesson , M. ( 1996 ), “ Communication, power, and organization ”, Kieser , A. (Ed.), De Gruyter Studies in Organization: Organizational Theory and Research , Vol.  72 . Walter de Gruyter , Berlin, NY , doi: 10.1515/9783110900545 .

Aral , S. ( 2020 ), The Hype Machine: How Social Media Disrupts Our Elections, Our Economy and Our Health – and How We Must Adapt , Harper Collins , New York, NY .

Berners-Lee , T. ( 2017 ), Three Challenges for the Web, According to its Inventor , World Wide Web Foundation , available at: https://webfoundation.org/2017/03/web-turns-28-letter/ ( accessed 16 March 2022 ).

Beschorner , T. ( 2006 ), “ Ethical theory and business practices: the case of discourse ethics ”, Journal of Business Ethics , Vol.  66 No.  1 , pp.  127 - 139 , doi: 10.1007/s10551-006-9049-x .

Brough , M. , Literat , I. and Ikin , A. ( 2020 ), “ Good social media? Underrepresented youth perspectives on the ethical and equitable design of social media platforms ”, Social Media + Society , Vol.  6 No.  2 , pp.  1 - 11 , doi: 10.1177/2056305120928488 .

Brown , L. and Kuss , D.J. ( 2020 ), “ Fear of missing out, mental wellbeing, and social connectedness: a seven-day social media abstinence trial ”, International Journal of Environmental Research and Public Health , Vol.  17 No.  12 , p. 4566 , doi: 10.3390/ijerph17124566 .

Busch , P.A. and McCarthy , S. ( 2021 ), “ Antecedents and consequences of problematic smartphone use: a systematic literature review of an emerging research area ”, Computers in Human Behavior , Vol.  114 , 106414 , pp.  1 - 47 , doi: 10.1016/j.chb.2020.106414 .

Chapple , W. , Molthan-Hill , P. , Welton , R. and Hewitt , M. ( 2020 ), “ Lights off, spot on: carbon literacy training crossing boundaries in the television industry ”, Journal of Business Ethics , Vol.  162 No.  4 , pp.  813 - 834 , doi: 10.1007/s10551-019-04363-w .

Chen , H. and Atkin , D. ( 2021 ), “ Understanding third-person perception about Internet privacy risks ”, New Media and Society , Vol.  23 No.  3 , pp.  419 - 437 , doi: 10.1177/1461444820902103 .

Chen , X. , Huang , C. and Cheng , Y. ( 2020 ), “ Identifiability, risk, and information credibility in discussions on moral/ethical violation topics on Chinese social networking sites ”, Frontiers in Psychology , Vol.  11 , doi: 10.3389/fpsyg.2020.535605 .

Cheung , C. , Lee , Z.W. and Chan , T.K. ( 2015 ), “ Self-disclosure in social networking sites: the role of perceived cost, perceived benefits and social influence ”, Internet Research , Vol.  25 No.  2 , pp.  279 - 299 , doi: 10.1108/IntR-09-2013-0192 .

Cruz , A.G.B. , Seo , Y. and Rex , M. ( 2018 ), “ Trolling in online communities: a practice-based theoretical perspective ”, The Information Society , Vol.  34 No.  1 , pp.  15 - 26 , doi: 10.1080/01972243.2017.1391909 .

Cushman, F. (2016), “The psychological origins of the doctrine of double effect”, Criminal Law and Philosophy, Vol. 10 No. 4, pp. 763-776, doi: 10.1007/s11572-014-9334-1.

Cushman, F.A., Young, L. and Hauser, M.D. (2006), “The role of conscious reasoning and intuition in moral judgment: testing three principles of harm”, Psychological Science, Vol. 17 No. 12, pp. 1082-1089, doi: 10.1111/j.1467-9280.2006.01834.x.

DeNardis , L. and Hackl , A.M. ( 2015 ), “ Internet governance by social media platforms ”, Telecommunications Policy , Vol.  39 No.  9 , pp.  761 - 770 , doi: 10.1016/j.telpol.2015.04.003 .

Dineva , D. and Breitsohl , J. ( 2022 ), “ Managing trolling in online communities: an organizational perspective ”, Internet Research , Vol.  32 No.  1 , pp.  292 - 311 , doi: 10.1108/INTR-08-2020-0462 .

Dwivedi , Y.K. , Kelly , G. , Janssen , M. , Rana , N.P. , Slade , E.L. and Clement , M. ( 2018 ), “ Social Media: the good, the bad, and the ugly ”, Information Systems Frontiers , Vol.  20 No.  3 , pp.  419 - 423 , doi: 10.1007/s10796-018-9848-5 .

D’Arcy , J. , Gupta , A. , Tarafdar , M. and Turel , O. ( 2014 ), “ Reflecting on the ‘dark side’ of information technology use ”, Communications of the Association for Information Systems , Vol.  35 No.  1 , pp.  109 - 118 , doi: 10.17705/1CAIS.03505 .

Emerson , T.I. ( 1962 ), “ Toward a general theory of the First Amendment ”, The Yale Law Journal , Vol.  72 , pp.  877 - 956 .

Floridi , L. ( 2002 ), “ What is the philosophy of Information? ”, Metaphilosophy , Vol.  33 Nos 1-2 , pp.  123 - 145 , doi: 10.1111/1467-9973.00221 .

Foot , P. ( 1967 ), “ The problem of abortion and the doctrine of the double effect ”, Oxford Review , Vol.  5 , pp.  1 - 6 .

Fouché , C. and Light , G. ( 2011 ), “ An invitation to dialogue: ‘the world Café’ in social work research ”, Qualitative Social Work , Vol.  10 No.  1 , pp.  28 - 48 , doi: 10.1177/1473325010376016 .

Gioia , D.A. , Corley , K.G. and Hamilton , A.L. ( 2013 ), “ Seeking qualitative rigor in inductive research: notes on the Gioia methodology ”, Organizational Research Methods , Vol.  16 No.  1 , pp.  15 - 31 , doi: 10.1177/1094428112452151 .

Gorwa , R. ( 2019 ), “ What is platform governance? ”, Information, Communication and Society , Vol.  22 No.  6 , pp.  854 - 871 , doi: 10.1080/1369118X.2019.1573914 .

Guterres , A. ( 2020 ), “ Report of the secretary-general roadmap for digital cooperation ”, United Nations, available at: https://www.un.org/en/content/digital-cooperation-roadmap/assets/pdf/Roadmap_for_Digital_Cooperation_EN.pdf ( accessed 28 March 2022 ).

Habermas, J. (1984), The Theory of Communicative Action: Reason and the Rationalization of Society, trans. McCarthy, T., Beacon Press, Boston, MA.

Habermas, J. (1990), “Discourse ethics: notes on a program of philosophical justification”, in Habermas, J., Moral Consciousness and Communicative Action, Polity Press, Cambridge.

Habermas , J. ( 1993 ), Justification and Application , Polity Press , Cambridge .

Han, Y. and Guo, Y. (2022), “Literature review of the concept of ‘internet governance’ based on the background of E-society”, Proceedings of the 13th International Conference on E-Education, E-Business, E-Management, and E-Learning (IC4E 2022), Tokyo, 14-17 January, pp. 611-618, doi: 10.1145/3514262.3514263.

Hecker , S. , Haklay , M. , Bowser , A. , Makuch , Z. , Vogel , J. and Bonn , A. ( 2018 ), “ Innovation in open science, society and policy–setting the agenda for citizen engagement ”, Citizen Science: Innovation in Open Science, Society and Policy , UCL Press , London , pp.  1 - 23 , doi: 10.14324/111.9781787352339 .

Jamison , A.M. , Broniatowski , D.A. and Quinn , S.C. ( 2019 ), “ Malicious actors on Twitter: a guide for public health researchers ”, American Journal of Public Health , Vol.  109 No.  5 , pp.  688 - 692 , doi: 10.2105/AJPH.2019.304969 .

Johnson , B.G. ( 2018 ), “ Tolerating and managing extreme speech on social media ”, Internet Research , Vol.  28 No.  5 , pp.  1275 - 1291 , doi: 10.1108/IntR-03-2017-0100 .

Johnson, D.G. and Powers, T.M. (2005), “Computer systems and responsibility: a normative look at technological complexity”, Ethics and Information Technology, Vol. 7 No. 2, pp. 99-107, doi: 10.1007/s10676-005-4585-0.

Ju , J. , Liu , L. and Feng , Y. ( 2019 ), “ Design of an O2O citizen participation ecosystem for sustainable governance ”, Information Systems Frontiers , Vol.  21 No.  3 , pp.  605 - 620 , doi: 10.1007/s10796-019-09910-4 .

Kang, J. and Wei, L. (2020), “Let me be at my funniest: Instagram users’ motivations for using Finsta (aka, fake Instagram)”, The Social Science Journal, Vol. 57 No. 1, pp. 58-71, doi: 10.1016/j.soscij.2018.12.005.

Kapoor , K.K. , Tamilmani , K. , Rana , N.P. , Patil , P. , Dwivedi , Y.K. and Nerur , S. ( 2018 ), “ Advances in social media research: past, present and future ”, Information Systems Frontiers , Vol.  20 No.  3 , pp.  531 - 558 , doi: 10.1007/s10796-017-9810-y .

Khan , A. and Krishnan , S. ( 2021 ), “ Citizen engagement in co-creation of e-government services: a process theory view from a meta-synthesis approach ”, Internet Research , Vol.  31 No.  4 , pp.  1318 - 1375 , doi: 10.1108/INTR-03-2020-0116 .

Kullenberg , C. and Kasperowski , D. ( 2016 ), “ What is citizen science? – a scientometric meta-analysis ”, PLoS ONE , Vol.  11 No.  1 , pp.  1 - 16 , e0147152 , doi: 10.1371/journal.pone.0147152 .

Laato , S. , Islam , A.N. , Islam , M.N. and Whelan , E. ( 2020 ), “ What drives unverified information sharing and cyberchondria during the COVID-19 pandemic? ”, European Journal of Information Systems , Vol.  29 No.  3 , pp.  288 - 305 , doi: 10.1080/0960085X.2020.1770632 .

Leible , S. , Ludzay , M. , Götz , S. , Kaufmann , T. , Meyer-Lüters , K. and Tran , M.N. ( 2022 ), “ ICT application types and equality of E-participation - a systematic literature review ”, Pacific Asia Conference on Information Systems, 2022 Proceedings , Vol.  30 , available at: https://aisel.aisnet.org/pacis2022/30 ( accessed 5 July 2022 ).

Leidner , D. and Tona , O. ( 2021 ), “ The CARE theory of dignity amid personal data digitalization ”, MIS Quarterly , Vol.  45 No.  1 , pp.  343 - 370 , doi: 10.25300/MISQ/2021/15941 .

Levy , M. and Germonprez , M. ( 2017 ), “ The potential for citizen science in information systems research ”, Communications of the Association for Information Systems , Vol.  40 No.  1 , pp.  22 - 39 , doi: 10.17705/1CAIS.04002 .

Lin , L.Y. , Sidani , J.E. , Shensa , A. , Radovic , A. , Miller , E. , Colditz , J.B. , Hoffman , B.L. , Giles , L.M. and Primack , B.A. ( 2016 ), “ Association between social media use and depression among U.S. young adults ”, Depression and Anxiety , Vol.  33 No.  4 , pp.  323 - 331 , doi: 10.1002/da.22466 .

Liu , X. and Zheng , L. ( 2012 ), “ Government official microblogs: an effective platform for facilitating inclusive governance ”, Proceedings of the 6th International Conference on Theory and Practice of Electronic Governance , pp.  258 - 261 , available at: https://dl.acm.org/doi/pdf/10.1145/2463728.2463778?casa_token=qg85vmOL-CMAAAAA:QhCq9a1lSCVDlDwItMHKc4nZz7twQCTS3-othb_1jOJJu3DPjA89lgSTUkVmZFzWraptI-jWMaqSaw ( accessed 11 March 2022 ).

Lowry , P.B. , Zhang , J. , Wang , C. and Siponen , M. ( 2016 ), “ Why do adults engage in cyberbullying on social media? An integration of online disinhibition and deindividuation effects with the social structure and social learning model ”, Information Systems Research , Vol.  27 No.  4 , pp.  962 - 986 , doi: 10.1287/isre.2016.0671 .

Lukyanenko , R. , Wiggins , A. and Rosser , H.K. ( 2020 ), “ Citizen science: an information quality research Frontier ”, Information Systems Frontiers , Vol.  22 No.  4 , pp.  961 - 983 , doi: 10.1007/s10796-019-09915-z .

Lyytinen, K.J. and Klein, H.K. (1985), “The critical theory of Jürgen Habermas as a basis for a theory of information systems”, in Mumford, E., Hirschheim, R., Fitzgerald, G. and Wood-Harper, T. (Eds), Research Methods in Information Systems, North-Holland, Amsterdam, pp. 207-226.

Mallipeddi , R.R. , Janakiraman , R. , Kumar , S. and Gupta , S. ( 2021 ), “ The effects of social media content created by human brands on engagement: evidence from Indian general election 2014 ”, Information Systems Research , Vol.  32 No.  1 , pp.  212 - 237 , doi: 10.1287/isre.2020.0961 .

McCarthy, S., Rowan, W., Lynch, L. and Fitzgerald, C. (2020), “Blended stakeholder participation for responsible information systems research”, Communications of the Association for Information Systems, Vol. 47, pp. 124-149, doi: 10.17705/1CAIS.04733.

McCarthy, S., Mahony, C., Rowan, W., Tran-Karcher, H. and Potet, M. (2021), “The pragmatic school of thought in open science practice: a case study of multi-stakeholder participation in shaping the future of internet governance”, Proceedings of the 54th Hawaii International Conference on System Sciences, Kauai, HI, 4-8 January, pp. 670-679, available at: http://hdl.handle.net/10125/70693 (accessed 1 February 2022).

McHugh , B.C. , Wisniewski , P. , Rosson , M.B. and Carroll , J.M. ( 2018 ), “ When social media traumatizes teens: the roles of online risk exposure, coping, and post-traumatic stress ”, Internet Research , Vol.  28 No.  5 , pp.  1169 - 1188 , doi: 10.1108/IntR-02-2017-0077 .

McIntyre, A. (2019), “Doctrine of double effect”, in Zalta, E.N. (Ed.), The Stanford Encyclopedia of Philosophy, Spring 2019 Edition, available at: https://plato.stanford.edu/archives/spr2019/entries/double-effect (accessed 10 March 2022).

Mihaylov, T., Mihaylova, T., Nakov, P., Màrquez, L., Georgiev, G.D. and Koychev, I.K. (2018), “The dark side of news community forums: opinion manipulation trolls”, Internet Research, Vol. 28 No. 5, doi: 10.1108/IntR-03-2017-0118.

Mingers , J. and Walsham , G. ( 2010 ), “ Toward ethical information systems: the contribution of discourse ethics ”, MIS Quarterly , Vol.  34 No.  4 , pp.  833 - 854 , doi: 10.2307/25750707 .

Molas-Gallart , J. ( 2012 ), “ Research governance and the role of evaluation: a comparative study ”, American Journal of Evaluation , Vol.  33 No.  4 , pp.  583 - 598 , doi: 10.1177/1098214012450938 .

Namisango , F. , Kang , K. and Beydoun , G. ( 2021 ), “ How the structures provided by social media enable collaborative outcomes: a study of service Co-creation in non-profits ”, Information Systems Frontiers , Vol.  24 No.  2 , pp.  517 - 535 , doi: 10.1007/s10796-020-10090-9 .

Ngwenyama , O. and Lee , A. ( 1997 ), “ Communication richness in electronic mail: critical theory and the contextuality of meaning ”, MIS Quarterly , Vol.  21 No.  2 , pp.  145 - 167 , doi: 10.2307/249417 .

Ofcom ( 2020 ), Children and Parents: Media Use and Attitudes Report 2019 , Ofcom , London , available at: https://www.ofcom.org.uk/research-and-data/media-literacy-research/childrens/children-and-parents-media-use-and-attitudes-report-2019 ( accessed 2 February 2022 ).

Olphert , W. and Damodaran , L. ( 2007 ), “ Citizen participation and engagement in the design of e-government services: the missing link in effective ICT design and delivery ”, Journal of the Association for Information Systems , Vol.  8 No.  9 , pp.  491 - 507 , doi: 10.17705/1jais.00140 .

Patton, M.Q. (2002), Qualitative Research and Evaluation Methods, 3rd ed., Sage, Thousand Oaks, CA.

Radu , R. ( 2020 ), “ Fighting the ‘infodemic’: legal responses to COVID-19 disinformation ”, Social Media + Society , Vol.  6 No.  3 , pp.  1 - 4 , doi: 10.1177/2056305120948190 .

Ransbotham , S. , Fichman , R.G. , Gopal , R. and Gupta , A. ( 2016 ), “ Special section introduction—ubiquitous IT and digital vulnerabilities ”, Information Systems Research , Vol.  27 No.  4 , pp.  834 - 847 , doi: 10.1287/isre.2016.0683 .

Rauf , A.A. ( 2021 ), “ New moralities for new media? Assessing the role of social media in acts of terror and providing points of deliberation for business ethics ”, Journal of Business Ethics , Vol.  170 No.  2 , pp.  229 - 251 , doi: 10.1007/s10551-020-04635-w .

Rest , J. ( 1994 ), “ Background: theory and research ”, in Rest , J. and Narvaez , D. (Eds), Moral Development in the Professions: Psychology and Applied Ethics , Lawrence Erlbaum Associates , Mahwah, NJ , pp.  1 - 26 .

Richey , M. , Gonibeed , A. and Ravishankar , M.N. ( 2018 ), “ The perils and promises of self-disclosure on social media ”, Information Systems Frontiers , Vol.  20 No.  3 , pp.  425 - 437 , doi: 10.1007/s10796-017-9806-7 .

Robbins , S. and Henschke , A. ( 2017 ), “ The value of transparency: bulk data and authoritarianism ”, Surveillance and Society , Vol.  15 Nos 3/4 , pp.  582 - 589 .

Ross , A. and Chiasson , M. ( 2011 ), “ Habermas and information systems research: new directions ”, Information and Organization , Vol.  21 No.  3 , pp.  123 - 141 , doi: 10.1016/j.infoandorg.2011.06.001 .

Salo , M. , Pirkkalainen , H. , Chua , C. and Koskelainen , T. ( 2022 ), “ Formation and mitigation of technostress in the personal use of IT ”, MIS Quarterly , Vol.  46 No.  2 , pp.  1073 - 1108 , doi: 10.25300/MISQ/2022/14950 .

Someh , I. , Davern , M. , Breidbach , C.F. and Shanks , G. ( 2019 ), “ Ethical issues in big data analytics: a stakeholder perspective ”, Communications of the Association for Information Systems , Vol.  44 , doi: 10.17705/1CAIS.04434 .

Sriwilai , K. and Charoensukmongkol , P. ( 2016 ), “ Face it, don’t Facebook it: impacts of social media addiction on mindfulness, coping strategies and the consequence on emotional exhaustion ”, Stress and Health , Vol.  32 No.  4 , pp.  427 - 434 , doi: 10.1002/smi.2637 .

Stahl , B.C. ( 2012 ), “ Responsible research and innovation in information systems ”, European Journal of Information Systems , Vol.  21 No.  3 , pp.  207 - 221 , doi: 10.1057/ejis.2012.19 .

Statista (2022), “Social media – statistics and facts”, available at: https://www.statista.com/topics/1164/social-networks/#dossierKeyfigures (accessed 3 March 2022).

Stevens , R. , Dunaev , J. , Malven , E. , Bleakley , A. and Hull , S. ( 2016 ), “ Social Media in the sexual lives of African American and Latino youth: challenges and opportunities in the digital neighborhood ”, Media and Communication , Vol.  4 No.  3 , pp.  60 - 70 , doi: 10.17645/mac.v4i3.524 .

Stoycheff , E. , Burgess , G.S. and Martucci , M.C. ( 2020 ), “ Online censorship and digital surveillance: the relationship between suppression technologies and democratization across countries ”, Information, Communication and Society , Vol.  23 No.  4 , pp.  474 - 490 , doi: 10.1080/1369118X.2018.1518472 .

Tacke , O. ( 2010 ), “ Open science 2.0: how research and education can benefit from open innovation and Web 2.0 ”, in Bastiaens , T.J. , Baumöl , U. and Krämer , B.J. (Eds), On Collective Intelligence. Advances in Intelligent and Soft Computing , Vol.  76 , Springer , Berlin, Heidelberg , pp.  37 - 48 , doi: 10.1007/978-3-642-14481-3_4 .

Tarafdar , M. , Gupta , A. and Turel , O. ( 2015 ), “ Special issue on ‘dark side of information technology use’: an introduction and a framework for research ”, Information Systems Journal , Vol.  25 No.  3 , pp.  161 - 170 , doi: 10.1111/isj.12070 .

Thompson , R. ( 2011 ), “ Radicalization and the use of social media ”, Journal of Strategic Security , Vol.  4 No.  4 , pp.  167 - 190 , doi: 10.5038/1944-0472.4.4.8 .

Torres , R. , Gerhart , N. and Negahban , A. ( 2018 ), “ Epistemology in the era of fake news: an exploration of information verification behaviors among social networking site users ”, ACM SIGMIS Database: The DATABASE for Advances in Information Systems , Vol.  49 No.  3 , pp.  78 - 97 , doi: 10.1145/3242734.3242740 .

Turel , O. , Matt , C. , Trenz , M. , Cheung , C.M.K. , D’Arcy , J. , Qahri-Saremi , H. and Tarafdar , M. ( 2019 ), “ Panel report: the dark side of the digitization of the individual ”, Internet Research , Vol.  29 No.  2 , pp.  274 - 288 , doi: 10.1108/INTR-04-2019-541 .

van der Velden, M. and Mörtberg, C. (2021), “Citizen engagement and design for values”, in van den Hoven, J., Vermaas, P.E. and van de Poel, I. (Eds), Handbook of Ethics, Values, and Technological Design: Sources, Theory, Values and Application Domains, Springer Netherlands, Dordrecht, pp. 1-22.

Van Dijck , J. ( 2021 ), “ Seeing the forest for the trees: visualizing platformization and its governance ”, New Media and Society , Vol.  23 No.  9 , pp.  2801 - 2819 , doi: 10.1177/1461444820940293 .

Walsham , G. ( 2012 ), “ Are we making a better world with ICTs? Reflections on a future agenda for the IS field ”, Journal of Information Technology , Vol.  27 No.  2 , pp.  87 - 93 , doi: 10.1057/jit.2012.4 .

Walther, J.B., Liang, Y.J., DeAndrea, D.C., Tong, S.T., Carr, C.T., Spottswood, E.L. and Amichai-Hamburger, Y. (2011), “The effect of feedback on identity shift in computer-mediated communication”, Media Psychology, Vol. 14 No. 1, pp. 1-26, doi: 10.1080/15213269.2010.547832.

Webb , H. , Jirotka , M. , Stahl , B.C. , Housley , W. , Edwards , A. , Williams , M. , Procter , R. , Rana , O. and Burnap , P. ( 2016 ), “ Digital wildfires: hyper-connectivity, havoc and a global ethos to govern social media ”, ACM SIGCAS Computers and Society , Vol.  45 No.  3 , pp.  193 - 201 , doi: 10.1145/2874239.2874267 .

Wehrens , R. , Stevens , M. , Kostenzer , J. , Weggelaar , A.M. and de Bont , A. ( 2021 ), “ Ethics as discursive work: the role of ethical framing in the promissory future of data-driven healthcare technologies ”, Science, Technology, and Human Values , pp.  1 - 29 , doi: 10.1177/01622439211053661 .

Weinhardt, C., Kloker, S., Hinz, O. and van der Aalst, W.M.P. (2020), “Citizen science in information systems research”, Business &amp; Information Systems Engineering, Vol. 62, pp. 273-277, doi: 10.1007/s12599-020-00663-y.

Winefield , H.R. , Gill , T.K. , Taylor , A.W. and Pilkington , R.M. ( 2012 ), “ Psychological well-being and psychological distress: is it necessary to measure both? ”, Psychology of Well-Being , Vol.  2 No.  1 , pp.  1 - 14 , doi: 10.1186/2211-1522-2-3 .

World Economic Forum ( 2013 ), “ Digital wildfires in a hyperconnected world. Global risks report ”, World Economic Forum, available at: http://reports.weforum.org/global-risks-2013/risk-case1/digital-wildfires-in-a-hyperconnected-world/ ( accessed 30 March 2022 ).

World Health Organization ( 2022 ), “ Overview of the infodemic ”, WHO Health Topics, available at: https://www.who.int/health-topics/infodemic#tab=tab_1 ( accessed 7 February 2022 ).

Yuthas , K. , Rogers , R. and Dillard , J.F. ( 2002 ), “ Communicative action and corporate annual reports ”, Journal of Business Ethics , Vol.  41 No.  1 , pp.  141 - 157 , doi: 10.1023/A:1021314626311 .

Zuboff , S. ( 2015 ), “ Big other: surveillance capitalism and the prospects of an information civilization ”, Journal of Information Technology , Vol.  30 No.  1 , pp.  75 - 89 , doi: 10.1057/jit.2015.5 .

Zuboff , S. ( 2019a ), The Age of Surveillance Capitalism: the Fight for a Human Future at the New Frontier of Power , Profile Books , London .

Zuboff , S. ( 2019b ), “ Surveillance capitalism and the challenge of collective action ”, New Labor Forum , Vol.  28 No.  1 , pp.  10 - 29 , doi: 10.1177/1095796018819461 .

Acknowledgements

The authors would like to acknowledge the helpful comments provided by the senior editor, associate editor and four anonymous reviewers. An earlier version of this paper was published at the Hawaii International Conference on System Sciences (McCarthy et al., 2021). The authors are grateful for the feedback received from the chair, reviewers and attendees at the session.




U.S. flag

An official website of the United States government

The .gov means it’s official. Federal government websites often end in .gov or .mil. Before sharing sensitive information, make sure you’re on a federal government site.

The site is secure. The https:// ensures that you are connecting to the official website and that any information you provide is encrypted and transmitted securely.

  • Publications
  • Account settings
  • My Bibliography
  • Collections
  • Citation manager

Save citation to file

Email citation, add to collections.

  • Create a new collection
  • Add to an existing collection

Add to My Bibliography

Your saved search, create a file for external citation management software, your rss feed.

  • Search in PubMed
  • Search in NLM Catalog
  • Add to Search

The existential stakes of platform governance: a critical literature review

Affiliation.

  • 1 JOLT-ETN / LERASS, University Paul Sabatier - Toulouse III, Toulouse, France.
  • PMID: 37645107
  • PMCID: PMC10445813
  • DOI: 10.12688/openreseurope.13358.2

This study introduces a comprehensive overview of literature concerning the concepts of regulation and governance, and attempts to connect them to scholarly works that deal with the governance of and by social media platforms. The paper provides fundamental definitions of regulation and governance, along with a critique of polycentricity or multi-stakeholderism, in order to contextualise the discussion around platform governance and, subsequently, online content regulation. Moreover, where traditional governance literature conceptualised stakeholders as a triangle, this article proposes going beyond the triad of public, private and non-governmental actors, to account for previously invisible stakeholder clusters, like citizens and news media organisations. This paper also contends that, while platform governance is an important field of study and practice, the way it has been structured and investigated so far, is posing an existential risk to the broader internet governance structure, primarily, because of the danger of conflating the internet with platforms. As a result, there exists a timely need to reimagine the way in which we understand and study phenomena related to platform governance by adjusting our conceptual and analytical heuristics. So, this article wishes to expand the theorisation of this field in order to better engage with complicated platform governance issues, like the development of regulatory frameworks concerning online content regulation.

Keywords: Platform governance; multi-stakeholder governance; online content regulation; regulatory governance; social media regulation.

Copyright: © 2021 Papaevangelou C.


Conflict of interest statement

No competing interests were disclosed.

Figure 1. Robert Gorwa's formulation of the 'Platform Governance Triangle' depicting the EU content regulation…

Figure 2. Expansion of platform governance triangle.

References (excerpt):

  • Abbott KW, Snidal D. The Governance Triangle: Regulatory Standards Institutions and the Shadow of the State. In: The Politics of Global Regulation. Princeton University Press; 2009: 44–88.
  • Beresford AD. Foucault's Theory of Governance and the Deterrence of Internet Fraud. Admin Soc. 2003;35(1):82–103. doi:10.1177/0095399702250347
  • Bernstein S, Cashore B. Can non-state global governance be legitimate? An analytical framework. Regul Gov. 2007;1(4):347–371. doi:10.1111/j.1748-5991.2007.00021.x
  • Bietti E. Consent as a Free Pass: Platform Power and the Limits of the Informational Turn. Pace Law Review. 2020;40(1):92.
  • Black J. Critical Reflections on Regulation. Australian Journal of Legal Philosophy. 2002;27.



Association for Information Systems
AIS Electronic Library (AISeL)

PACIS 2022 Proceedings
Pacific Asia Conference on Information Systems (PACIS)
7-4-2022

Governance of Social Media Platforms: A Literature Review
Completed Research Paper

A Manish Kumar, Indian Institute of Management Raipur, Raipur, India ([email protected])
Sumeet Gupta, Indian Institute of Management Raipur, Raipur, India ([email protected])

Follow this and additional works at: https://aisel.aisnet.org/pacis2022

Recommended Citation: A, Manish Kumar and Gupta, Sumeet, "Governance of Social Media Platforms: A Literature Review" (2022). PACIS 2022 Proceedings. 150. https://aisel.aisnet.org/pacis2022/150

This material is brought to you by the Pacific Asia Conference on Information Systems (PACIS) at AIS Electronic Library (AISeL). It has been accepted for inclusion in PACIS 2022 Proceedings by an authorized administrator of AIS Electronic Library (AISeL). For more information, please contact [email protected].

Abstract

Social media platforms have become an integral part of our lives. Initially they were used as a medium of interconnection, but they have since become a source for creating, consuming, and disseminating information. The proliferation of misinformation, fake news, and hate speech is among the major challenges posed by these platforms. To curb these challenges, it is crucial to study the governance mechanisms of these platforms. Although there have been studies in the domain of platform governance, there is a dearth of studies on the governance of social media platforms specifically.
This paper synthesizes the studies on the governance of social media platforms and presents them under four major themes: different modes of governance, content moderation and combating misinformation, algorithmic governance, and ethical aspects of governance. The paper not only synthesizes the existing literature but also identifies gaps for future research.

Keywords: Social Media Governance, Internet Governance, Platform Governance, Internal Governance, Platform Regulation, Algorithmic Governance

Introduction

In the last two decades, information systems have emerged that affect our lives in unprecedented ways. One of the most prominent developments has been the advent of social media platforms, which have become one of the most popular channels for the instant dissemination and consumption of information (Qualman, E. 2012; Copeland, R. 2011; Napoli, P. M. 2015). Social media has not only changed the way people interact with each other but has also affected how business is conducted (e.g., Aral, S. et al. 2013; Nascimento, A. M., & Da Silveira, D. S. 2017) and how information is consumed (Kim, D. H., & Desai, M. 2021; Mellado, C., & Alfaro, A. 2020; Guo, M., & Sun, F. S. 2020). Social media has entered our lives in such a way that it has started influencing our opinions and even our decision-making (Gorodnichenko et al., 2021; Zhuravskaya et al. 2020). In the recent past we have observed many such incidents: the impact of social media on the US presidential elections [1], the role of social media in the Brexit referendum (Marshall, H., & Drieschova, A. 2018), and misinformation on social media platforms leading to violent incidents in society [2][3][4].
During the pandemic, though social media platforms became one of the leading mediums of information dissemination regarding COVID-19, it was the spread of misinformation and disinformation through these platforms that fueled problems such as vaccine hesitancy and refusal (Zhang et al., 2021). Such situations led to an infodemic, which even cost people their lives. Hence, it is crucial to regulate the content that is being created, disseminated, and consumed on these social media platforms.

[1] https://www.wsj.com/articles/central-roles-of-facebook-twitter-get-mixed-reviews-1478782066
[2] https://www.washingtonpost.com/politics/2020/02/21/how-misinformation-whatsapp-led-deathlymob-lynching-india/
[3] https://www.indiatoday.in/india/story/maharashtra-violence-cyber-cell-lists-36-social-media-poststriggered-1877775-2021-11-17
[4] https://www.livemint.com/Politics/jkSPTSf6IJZ5vGC1CFVyzI/Death-by-Social-Media.html

The governance of social media platforms has been a growing concern in recent times. The question now is not whether these social media platforms need to be governed, but how, and with what effectiveness, they should be governed (Rochefort, A. 2020). There is a substantial literature on the broader landscape of platform governance (De Reuver et al., 2018; Helberger et al., 2018; Tiwana, A., 2013) and on the governance of platform ecosystems (e.g., Mukhopadhyay, S., & Bouwman, H. 2019; Rietveld et al., 2020; DeNardis, L., & Hackl, A. M. 2015). There have also been studies analyzing social media governance in organizational settings and corporate governance (e.g., Vaast, E., & Kaganer, E. 2013; Stevens et al. 2016; Linke, A., & Zerfass, A. 2013; van den Berg, A. C., & Verhoeven, J. W. 2017; Parker et al. 2019) and e-governance (e.g., Mansoor, M. 2021; Kumar et al., 2016; Sideri et al.
2019), but very few studies address the governance of the social media platforms themselves. This study postulates that one reason behind the declining attention of researchers to this domain is not a lack of relevance but perhaps the lack of a comprehensive account of the work done to date, which might otherwise motivate researchers to work in this area. To the best of our knowledge, no comprehensive study in the literature summarizes the existing work on the governance and regulation of social media platforms. This paper attempts to fill this gap by synthesizing the existing knowledge (which helps in understanding what we know about the topic) and identifying the gaps (which helps in understanding what we still need to know). This study pursues the idea that addressing these questions will revive the interest and attention of IS researchers in the regulation and governance of social media platforms.

The rest of the paper is organized as follows. The next section provides a basic understanding of platforms, governance, and social media, the terms around which the entire paper revolves. The section after that synthesizes the research findings, giving an illustrative account of the governance of social media platforms as understood in this work. It is followed by a detailed future research agenda in this domain and, finally, by concluding remarks.

Governance of Social Media Platforms

What is a Platform?

"Platform" is an ambiguous term that has been used and understood by various communities in diverse ways over time (Gorwa, R. 2019a; Gillespie, T. 2018). The term has evolved into multiple conceptualizations.
Though the differences are subtle, computer scientists, lawyers, economists, and digital media scholars construe platforms quite differently (Andersson Schwarz, J. 2017; Gorwa, R. 2019a). Digital platforms are one of the major manifestations of digital transformation, but the conceptualization of platforms is not a recent phenomenon; the literature has discussed platforms both within and outside the digital landscape (de Reuver et al., 2018). The genesis of the word comes from industrial management, where a platform is seen as a stable core entity surrounded and supported by a variable periphery (Baldwin, C. Y., & Woodard, C. J. 2009). De Reuver et al. (2018) classified platforms through the lens of non-digital platforms, or platforms related to production processes, labeling them internal platforms, supply chain platforms, and industry platforms. The main role of such platforms is to act as a mediator between different stakeholders (or users), as in the case of supply chain and industry platforms. Studies in the literature illustrate the market power of two-sided platforms and how businesses can sustain themselves through platform competition (Rochet, J. C., & Tirole, J. 2003, 2006). These are offline platforms, and many of the notions and characteristics of digital platforms have been borrowed from them. However, digital platforms are noticeably different from traditional offline platforms (de Reuver, M., Sørensen, C., & Basole, R. C. 2018), and their dynamic nature makes the phenomenon complicated to understand.
Some studies discuss the changing architectural features of platforms (Tiwana et al., 2010), changes in user interaction among various participants as the internet reshaped these platforms (Spagnoletti et al., 2015), and even the inter-organizational relationships among the various stakeholders (Ghazawneh, A., & Henfridsson, O. 2013).

What is Platform Governance?

The word "governance" has different meanings; it varies widely, incorporates a variety of facets, and hence cannot be reduced to a single standard definition. Broadly speaking, governance encompasses all the processes that belong to an institution, and the structures supporting them, which are used to supervise the relations between the various actors of that institution (Linke, A., & Zerfass, A. 2013). From a control perspective, governance can be defined as the ability to implement a set of rules for the delivery of services (Fukuyama, F. 2013).

According to Gorwa (2019), platform governance can be defined as a way of understanding and handling the socio-technical aspects of platforms. These aspects mainly concern the interactions between the various actors of a platform. Platforms play a vital role in mediating and influencing the behavior of these actors and hence act mainly as political actors (Gillespie, T. 2018). Therefore, the platforms that engineer the actions of their stakeholders themselves need some control and governance. The governance of these platforms is directed by the local and national governance mechanisms of the territory in which they operate.
(Gorwa, 2019). Now, as these platforms take on a more political role and the power relationships between the various actors of these networks become more complicated, it is high time that scholars from various disciplines, including information systems, start looking at the ways in which these platforms can be checked and governed so that they operate freely and fairly and their users are the beneficiaries.

What is Social Media?

Social media is a very powerful form of ICT used for the transmission of information, affecting individuals, organizations (political or enterprise), and eventually the entire society. Social media has begun to shape society's viewpoint on almost any issue. In the IS literature, social media has been defined in different ways from different perspectives. Boyd and Ellison (2007) defined social media as websites that users utilize to create profiles and enhance their visibility within their networks. Along the same lines, social media can be defined as a subset of information technology that facilitates and enhances networking and interactions among users (Kapoor et al., 2017; Oestreicher-Singer & Zalmanson, 2013). However, the broader consensus appears to converge on the point that social media flourished and proliferated with the growth of Web 2.0 technologies, which catalyzed its adoption and usage (Wolf, M. et al., 2018). Web 2.0 is the set of technologies that enable media-rich content creation and transmission on the internet; it is an open-source ideology that encourages users to collaborate, generate, and broadcast content among themselves (Kaplan & Haenlein, 2010). Hence it is evident that scholars have defined social media through various lenses.
From a technocentric point of view, social media can be read as "a group of Internet-based applications that build on the ideological and technological foundations of Web 2.0, and that allow the creation and exchange of User Generated Content" (Kaplan & Haenlein, 2010). This definition emphasizes the Web 2.0 technology behind the proliferation of social media usage and implicitly refers to the content created by users, known as User Generated Content.

Boyd and Ellison (2007, p. 211) take a less technical approach and define "social network sites as web-based services that allow individuals to (1) construct a public or semipublic profile within a bounded system, (2) articulate a list of other users with whom they share a connection, and (3) view and traverse their list of connections and those made by others within the system."

Similarly, from a social, non-technical point of view, some definitions introduce the concept of "social computing," which enables social interactions and focuses on relationships and collaboration (Oestreicher-Singer & Zalmanson, 2013; Kapoor et al., 2018). These definitions lean toward the social implications of social media usage, that is, the creation and transmission of content among users, and socializing by maintaining relationships and collaboration among them.
It can be said that these views are more performative, as they focus on what social media platforms normally do rather than on what the underlying technology was envisioned to support (Wolf, M. et al., 2018). For this study, however, platforms are considered socio-technical assemblages, combinations of hardware, software, and the human interactions with them (de Reuver et al., 2018; Gillespie, T. 2018). Hence, adopting the definition of Gillespie (2018), platforms are "online sites and services that host, organize, and circulate users' shared content or social interactions for them, without having produced or commissioned (the bulk of) that content, built on an infrastructure, beneath that circulation of information, for processing data for customer service, advertising, and profit."

Characteristics of Social Media Platforms

One way of understanding social media platforms is to identify their basic characteristics. First, social media platforms here mean online offerings, services, and data-driven apps such as Facebook, YouTube, and LinkedIn. These platforms are not standalone entities: corporations, known as platform companies, stand behind these services and deploy them (Gorwa, R. 2019a). For example, Facebook, WhatsApp, and Instagram are run by Facebook Inc., while YouTube and Google are run by Alphabet Inc. These platforms mostly claim that they do not create content, which is true: on Facebook, Instagram, or any other social media platform, no content is generated by the platform itself. The content is generated only by the users and is known as User Generated Content (UGC). However, these platforms shape the conversations and make important choices about this content.
With the help of the algorithms running in the background, these platforms decide which content will be available to whom, what will be searchable by whom, how users will connect, and so on (Gillespie, T. 2018). Based on their usage, social media platforms can be classified as follows (Gancho, S. P. M. 2017):

  • Social news sites: portals mainly used for sharing news and data (e.g., Bloomberg, Social Times, Mashable).
  • Social networking sites: sites used for creating networks and sharing user-generated content (textual, audio, video, or images) within the closed group or network that is formed (e.g., Facebook, Instagram).
  • Social bookmarking sites: sites used for tagging or bookmarking content (e.g., Twitter, Pinterest, StumbleUpon).
  • Social sharing sites: sites used for sharing information, in image or audio-visual form, typically within an open network (e.g., blogs, events, wikis, forums).

Synthesizing Research Findings

Identifying Relevant Literature

To find the relevant literature, the suggestions of Webster and Watson (2002) were followed. The literature search was carried out in three phases and took around six months.

STEP 1: The first step was a title search in the relevant databases: EBSCO Business Source Complete, ScienceDirect, ACM Digital Library, and Scopus. The logical string was ('Social Media' AND 'Governance') OR ('Social Media' AND 'Regulation') OR ('Social Media' AND 'Content Moderation') OR ('Social Media' AND 'Platform Regulation') OR ('Social Media' AND 'Internet Regulation'). The search was not limited to any specific period. The number of papers from all sources after phase 1 was 407.

STEP 2: The second phase was a scan of the abstracts.
After going through the abstracts, papers not relevant to the topic of the governance of social media platforms were dropped. Many papers dealt with social media governance or regulation within an organizational setup; all such papers were dropped at this stage. At the end of phase 2, 69 papers remained.

STEP 3: In the third phase, a thorough analysis of the body of each paper, including its research questions, methodology, and findings, was carried out. Papers related to the governance of e-commerce platforms, or papers whose research questions were unclear, were removed. After this phase, 47 articles remained that dealt with the governance of social media platforms in one form or another. Figure 1 shows the steps followed in synthesizing the literature for this study.

Figure 1: Steps Followed for Synthesizing Literature

Analysis Technique Used

Thematic analysis was used to analyze the papers selected in the final step. Thematic analysis is one of the most commonly used techniques for identifying and analyzing themes and common patterns in documents or interviews (Boyatzis, 1998; Braun & Clarke, 2006). It is far more than counting words or phrases in a text; it emphasizes identifying repetitive patterns and implicit and explicit ideas in the data (in this case, research articles). The method also provides more control over the data, as it involves reading the text, identifying themes, comparing and contrasting themes, and building theoretical models on them (Babar, A., Bunker, D., & Gill, A. Q. 2018).
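The three-phase literature screening described above (407 records after the title search, 69 after the abstract scan, 47 after the full-text review) can be sketched as a simple filtering pipeline. This is purely an illustration of the procedure; the `Paper` record and the relevance flags are hypothetical stand-ins, not the authors' actual tooling.

```python
# Illustrative sketch of a three-phase literature screening pipeline.
# Phase 1 is the database title search; Phases 2 and 3 filter by
# abstract relevance and full-text relevance, respectively.
from dataclasses import dataclass

@dataclass
class Paper:
    title: str
    abstract_relevant: bool   # judged during Phase 2 (abstract scan)
    fulltext_relevant: bool   # judged during Phase 3 (full-text review)

# Phase 1: title search across EBSCO, ScienceDirect, ACM DL, and Scopus
QUERY = (
    "('Social Media' AND 'Governance') OR "
    "('Social Media' AND 'Regulation') OR "
    "('Social Media' AND 'Content Moderation') OR "
    "('Social Media' AND 'Platform Regulation') OR "
    "('Social Media' AND 'Internet Regulation')"
)

def screen(papers):
    """Apply Phases 2 and 3 in sequence and return both retained sets."""
    after_abstract = [p for p in papers if p.abstract_relevant]          # Phase 2
    after_fulltext = [p for p in after_abstract if p.fulltext_relevant]  # Phase 3
    return after_abstract, after_fulltext

# Toy corpus: two papers survive the abstract scan, one survives full text.
corpus = [
    Paper("Governing platforms", True, True),
    Paper("Social media in HR settings", True, False),
    Paper("E-commerce platform governance", False, False),
]
phase2, phase3 = screen(corpus)
print(len(phase2), len(phase3))  # 2 1
```

In the study itself the same funnel narrowed 407 papers to 69 and then to 47; the sketch only makes the sequencing of the exclusion criteria explicit.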
This paper adopted the guidelines offered by Braun and Clarke (2006), which are shown in the following table:

1. Familiarizing yourself with your data: reading and re-reading the data to get a rough initial idea of the document.
2. Generating initial codes: coding the salient features of the data in a systematic way and organizing the data pertinent to each code.
3. Searching for themes: collating the codes into major themes.
4. Reviewing themes: revisiting the themes and checking whether they relate to the entire data set extracted.
5. Defining and naming themes: refining the themes generated and producing clear definitions for each theme.
6. Producing the report: the last stage of the analysis; conducting the final analysis of selected extracts and themes, relating the analysis back to the research question and the literature, and producing a scholarly report.

Table 1: Different Phases of Thematic Analysis (Braun, V., & Clarke, V. 2006)

Research Findings

The selected papers addressed the topic of the governance of social media platforms in various ways: some dealt with the various modes or forms of governance that can be implemented, some discussed indirect governance through content moderation, and so on. To present the findings of the review, the papers were classified by the governance themes they discuss, into the following four themes (see the summary table below for key findings):

(a) Modes of Governance
(b) Content Moderation and Combating Misinformation
(c) Algorithmic Governance
(d) Ethical Aspects of Governance

(a) Modes of Governance

One of the most important themes addressed by scholars was the various modes of governance that can be implemented.
Gorwa (2019a) discusses three modes of governance: self-governance (internal governance), external governance, and co-governance. Along similar lines, Gillespie (2017) speaks of governance by platforms and governance of platforms. Self-governance is a mechanism by which social media platforms govern themselves using their own rules and regulations, such as community guidelines and terms of service. Under external governance mechanisms, the platforms are governed by local, national, or supranational governments; the Network Enforcement Act (NetzDG) and the Information Technology Act, 2000 are examples of external governance mechanisms. Co-governance lies midway between the two: governance of these platforms through some third party or a combination of both. The Christchurch Call (CC), the Global Network Initiative (GNI), and even the Facebook Oversight Board are examples of co-governance mechanisms (Gorwa, R. 2019a). Gorwa (2019b) maps the various governance mechanisms onto the 'Platform Governance Triangle' (Abbott, K. W., & Snidal, D. 2009), which helps in understanding the modes of governance along with inter-actor power relationships. Shankar, R., & Ahmad, T. (2021) analyze various external governance mechanisms and their links with regulatory mechanisms, concluding that these laws somewhat dilute free speech and privacy. Some papers also examine government intervention in the governance of these platforms, as in Ethiopia and Germany (Abraha, H. H. 2017; Heldt, A. P. 2019). Together, this set of papers gives a brief idea of the various modes of governance mechanisms employed for social media platforms (Obar, J. A., & Wildman, S. S. 2015; Meng et al., 2016; Rochefort, A.
2020; Papaevangelou, C. 2021; Donato, V. E. S. E. 2021). These different modes of governance, studied as part of the governance of social media platforms, underline the importance of the various stakeholders on these platforms. These studies also suggest to future administrators that no single form of governance can be optimal; rather, governance should combine all forms (Gorwa, 2019a). They also help in understanding the role of the various stakeholders in the governance mechanism.

(b) Content Moderation and Combating Misinformation

One of the most important ways in which social media platforms hamper the peace of society is through the proliferation of fake news and hate speech, and this misinformation leads to violent incidents that cost the lives of innocent people (Arayankalam, J. 2020). One way of governing these platforms is to curtail the spread of misinformation and hate speech through content moderation on the platforms (Kyza, E. A. et al. 2020). The platforms claim that they are mere neutral intermediaries, yet they play an important role in engineering the choices of content consumed by users and hence in enhancing the virality of misinformation (Gillespie, T. 2010, 2017). There have been intense governance efforts to prevent the dissemination of misinformation, both by implementing multilevel governance mechanisms (as in European countries) (Saurwein, F., & Spencer-Smith, C. 2020) and by pressuring the platforms to implement mechanisms that reduce and stop the spread of illicit content. One initiative by the platforms is to provide affordances that let users report content by raising flags and giving feedback (Crawford, K., & Gillespie, T. 2016).
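The flag-and-report affordance discussed above can be made concrete with a small sketch. The threshold, data model, and escalation rule here are invented purely for illustration; real platforms do not disclose how user flags are weighted or escalated.

```python
# Minimal illustration of flag-based content reporting: users raise flags
# on posts, and a post whose flag count crosses a threshold is routed to a
# human moderation queue. Threshold and data model are hypothetical.
from collections import Counter

FLAG_THRESHOLD = 3  # assumed value, chosen only for the example

class FlagQueue:
    def __init__(self, threshold=FLAG_THRESHOLD):
        self.threshold = threshold
        self.flags = Counter()   # post_id -> number of flags received
        self.review_queue = []   # posts escalated for human moderation

    def report(self, post_id, reason):
        """Record one user flag; escalate the post once the threshold is hit."""
        self.flags[post_id] += 1
        if self.flags[post_id] == self.threshold:
            self.review_queue.append((post_id, reason))

q = FlagQueue()
for _ in range(3):
    q.report("post-42", "misinformation")   # third flag triggers escalation
q.report("post-7", "spam")                  # below threshold, not escalated
print(q.review_queue)  # [('post-42', 'misinformation')]
```

The design point the sketch captures is that flagging shifts part of the moderation workload to users while the platform still controls the escalation rule, which is exactly the power asymmetry the literature discusses.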
Content moderation is one of the most challenging parts of the self-governance mechanism of social media platforms. On the one hand, it can be used to combat misinformation; on the other, it shapes what content users have access to based on their previous choices. This theme of the literature also helps in understanding the impact of platform affordances on content moderation and hence on tackling the proliferation of misinformation.

(c) Algorithmic Governance

The content consumed on social media platforms is created by users but is ordered and circulated by algorithms (Kitchin, 2016; Seaver, 2017; Ziewitz, 2016). Hence, another important aspect of governing these platforms is keeping a check on these algorithms. The algorithms are not independent and external; they depend on the cultural and economic aspects of the platforms (Katzenbach, C., & Ulbricht, L. 2019). Algorithmic governance also includes content moderation by AI-based algorithms, which constitutes the first stage of content moderation. As the inflow of data on these platforms is enormous, it is impossible to moderate content manually, and these algorithms therefore play an important role in automatic content moderation (Gillespie, T. 2018). Studies in this domain primarily deal with the functioning of the algorithms (which are mostly black boxes) and how they affect a user's choices on the platform. This area needs further attention, as it will also help governments understand the role of algorithms in the functioning of these platforms.

(d) Ethical Aspects of Governance

Some studies delve into ethical aspects of the governance of these platforms, predominantly transparency, accountability, fairness, and neutrality (Gorwa, R. 2018). One significant area in this domain is the study of the algorithms implemented by the platforms.
These platforms neither reveal the operations of these algorithms nor open them up for regulation. It is very important to understand how these social media platforms operate on the basis of these algorithms (Rieder, B., & Hofmann, J. 2020). Some studies distinctly mention that the corporates which own and run these platforms are profit-making organizations with economic motives behind governing these platforms. Hence, it is also important for these corporates to demonstrate to all stakeholders that transparency, accountability, and fairness are of foremost importance to them. Accordingly, Facebook and Google have started partnering with academic institutions and creating neutral bodies, such as the Facebook Oversight Board, for their governance (Gorwa, R., & Ash, T. G. 2020; Bostoen, F. 2018).

Theme: Modes of Governance
Key Findings:
(a) Most of the papers elaborately explained the various forms of governance of these platforms, i.e., self-governance (governance by platforms, or internal governance), external governance (governance of platforms), and co-governance.
(b) Some papers specifically dealt with the external governance mechanisms being implemented in various countries in the form of regulations such as NetzDG, the Information Technology Bill 2021, the Christchurch Call (CC), the Global Network Initiative (GNI), and the Facebook Oversight Board. They also examined the complex interactions among these actors, including their competencies and power relations.
(c) Some of the papers also discussed the governance of these platforms in specific contexts, such as public emergencies and governing fake news.
Literature: Gorwa (2019a); Gillespie (2017); Gorwa (2019b); Abbott and Snidal (2009); Shankar and Ahmad (2021); Abraha (2017); Heldt (2019); Obar and Wildman (2015); Meng et al. (2016); Rochefort (2020); Papaevangelou (2021); Donato (2021)

Theme: Content Moderation and Combating Misinformation
Key Findings:
(a) Some studies interpreted the governance of social media platforms from the perspective of content moderation and curtailing the proliferation of fake news and misinformation.
(b) Some studies highlighted the fact that these social media platforms are neither mere intermediaries nor neutral; they play a role in the dissemination of information to their users.
(c) They also illustrated how external governance mechanisms, such as multilevel governance and affordance-based mechanisms, are being implemented to reduce the spread of misinformation.
Literature: Arayankalam (2020); Kyza et al. (2020); Gillespie (2010, 2017); Saurwein and Spencer-Smith (2020); Crawford and Gillespie (2016); Bhagat, Kim, and Parrish (2020); Hayes and Oakley (2019)

Theme: Algorithmic Governance
Key Findings:
(a) Algorithmic governance is a very important part of the governance of social media platforms, as algorithms engineer the process of interaction among users.
(b) Algorithms also determine what information is listed to which user at what time, thereby impacting the decision-making processes of users.
(c) AI-based algorithms also play an important role in content moderation, and hence the regulation of these algorithms is crucial.
Literature: Katzenbach and Ulbricht (2019); Gillespie (2018); Copland (2020); Napoli (2015); Kumar (2019); Gorwa, Binns, and Katzenbach (2020); Riemer and Peter (2021)

Theme: Ethical Aspects of Governance
Key Findings:
(a) Studies mention that transparency, accountability, fairness, and neutrality are important pillars of social media governance.
(b) These social media platforms operate their algorithms as black boxes; transparency, fairness, and accountability should be important parameters for the governance of algorithms.
(c) The corporate organizations that own these platforms should also be held accountable for the governance of these platforms.
Literature: Gorwa (2018); Rieder and Hofmann (2020); Gorwa and Ash (2020); Bostoen (2018); Anansaringkarn and Neo (2021)

Table 2. Key Findings of the Literature Review

Identifying Research Gaps

A lot of research studying the governance of social media platforms has been done over the last decade (e.g., Gorwa, 2019a; Gillespie, 2017; Gorwa, 2019b); however, much can still be done in this domain by IS scholars. In fact, our literature review suggests several research agendas that can motivate future researchers.

Agenda 1: User's Perspective on Governance of Social Media Platforms

Most of the studies on governance have taken the platform's or the government's perspective, not the user's. One of the critical areas of research can therefore be studying governance from the user's perspective: What do users think of these governance mechanisms? Which governance mechanism do users trust the most, and why? How can users be involved in the entire structure of governance? This would be an important area, as users are among the most important actors in the social media ecosystem. For example, what if the government intervenes in the governance mechanism of these social media platforms and mandates them to share the source of a message in case of a threat to national security? How would users react to this?
Similarly, if these social media platforms decide to decrease the level of anonymity of users by introducing a compulsory disclosure mechanism, how would user behavior on these platforms change because of this change in the governance mechanism? Such questions need to be investigated further to understand the user's perspective on the governance of these platforms. Users are one of the dominant stakeholders in this case; hence, without understanding their perspective, it would be impossible to understand the complete canvas of governance of social media platforms.

Agenda 2: Studying Transparency & Biases of Algorithms

Another stream of research can explore the inherent biases of these algorithms. Social media platforms are governed largely by algorithms, which are the backbone of their operations. From the consumption of information to content moderation, algorithms play a vital role, but are these algorithms neutral? What are the algorithmic biases that these platforms embed, and how do they impact the overall governing process of these platforms? Studying these black-box algorithms will help in understanding how content moderation is implemented on these platforms. These algorithms determine what type of content appears on a user's newsfeed, what type of content is struck down, and which content is shown to whom. It is very important to conduct future research in this direction so that one can understand how these algorithms influence the governance mechanism of these social media platforms.

Agenda 3: Role of Third-Party Developers

In the platform ecosystem, third-party developers also play an important role in the development and operation of platforms.
Since the content generated by these third-party developers is consumed by users, the regulation of these third-party actors is also crucial for the overall governance of these platforms. Hence, researchers can also study the governance mechanisms for these third-party developers.

Discussion & Limitations

The research agenda proposed in the previous section addresses issues of three important stakeholders of the platform ecosystem: users, platforms, and the third-party developers who build applications for the platforms. Future research should delve into these directions and open new avenues for research on the governance of social media platforms. As discussed in the previous section, one future agenda can be to study the governance mechanism of social media platforms from the user's perspective. The main question this research agenda needs to investigate is how users react when the governance mechanism of these platforms changes. A myriad of studies have examined governance mechanisms from a platform's perspective (Gillespie, T. 2010; Gorwa, R. 2019a; Helberger et al., 2018; Linke, A., & Zerfass, A. 2013) and a government perspective (Gorwa, R. 2019b; Heldt, A. P. 2019; Anansaringkarn, P., & Neo, R. 2021), but there is a dearth of studies that look at the governance mechanism from the user's perspective. Future research can study the change in the behavior of social media users as the governance mechanism changes, which may include compulsory disclosure of identity, reduction in anonymity, changes in the business model of these platforms, and so on. Such studies will help in understanding the user's perspective on the regulatory mechanism. Similarly, agenda 2 points toward exploring the transparency of algorithms and their governance.
Algorithms are known to be the backbone of information management on these platforms, but what if the algorithms themselves are biased? These algorithms may be biased toward a specific type of information and suppress other types. Future studies must delve into this issue and try to understand whether these algorithms provide a level playing field to all stakeholders. Third-party developers are the stakeholders who develop third-party applications and games for these platforms. They use the core features of these platforms and build their games and applications around them. These third-party developers are a very important and dominant stakeholder for any platform, and hence future studies should examine and analyze the regulatory mechanisms for these third-party developers on these platforms. This study specifically focuses on reviewing the studies that deal with the governance of social media platforms. The extant literature reviews talk about social media and platform governance, but hardly any study comprehensively examines the governance of social media platforms. Hence, this study helps in understanding the governance of platforms in a specific context, that is, social media platforms. No study is free from limitations, and there is always scope for improvement; this one is no exception. One limitation is that this study used thematic analysis and hence derived themes. Future studies can combine thematic analysis with bibliometric analysis to derive an intellectual structure of the area of governance of social media platforms. Such a study would help future researchers understand not only the themes that have evolved but also the history of the topic across geography and time.

Conclusion

This paper gives a comprehensive overview of the studies that address the issue of social media governance.
Based on the literature review, which initially included more than 400 papers, we demonstrated that the literature can be classified into four major themes, i.e., different modes of governance, content moderation and combating misinformation, algorithmic governance, and ethical aspects of governance. We found that the majority of papers addressed the different modes of governance that could be used for governing these platforms. There were also studies exploring the role of content moderation and algorithmic governance in the regulation of these social media platforms. Based on this synthesized literature, we also identified research gaps that scholars can address in the future. Studying governance from the user's perspective, algorithmic biases, and the governance of third-party developers are some of the future research objectives that need to be addressed. Thus, this paper is not just a literature review but also pushes future researchers to explore and delve into this contemporary issue of social media governance.

References

Abbott, K. W., and Snidal, D. (2009). The governance triangle: Regulatory standards institutions and the shadow of the state. The Politics of Global Regulation, 44, pp. 44-88.
Abraha, H. H. (2017). Examining approaches to internet regulation in Ethiopia. Information & Communications Technology Law, 26(3), pp. 293-311.
Anansaringkarn, P., and Neo, R. (2021). How can state regulations over the online sphere continue to respect the freedom of expression? A case study of contemporary 'fake news' regulations in Thailand. Information & Communications Technology Law, 30(3), pp. 283-303.
Andersson Schwarz, J. (2017). Platform logic: An interdisciplinary approach to the platform-based economy. Policy & Internet, 9(4), pp. 374-394.
Aral, S., Dellarocas, C., and Godes, D. (2013).
Introduction to the special issue—social media and business transformation: A framework for research. Information Systems Research, 24(1), pp. 3-13.
Arayankalam, J. (2020). Disinformation as a strategic weapon: Roles of societal polarization, government's cybersecurity capability, and the rule of law.
Babar, A., Bunker, D., and Gill, A. Q. (2018). Investigating the relationship between business analysts' competency and IS requirements elicitation: A thematic-analysis approach. Communications of the Association for Information Systems, 42(1), pp. 12.
Baldwin, C. Y., and Woodard, C. J. (2009). The architecture of platforms: A unified view. Platforms, Markets, and Innovation, 32, pp. 19-44.
Bhagat, S., Kim, D. J., and Parrish, J. L. (2020). Disinformation in social media: Role of Dark Triad personality traits and self-regulation.
Bostoen, F. (2018). Neutrality, fairness, or freedom? Principles for platform regulation. Internet Policy Review, 7(1), pp. 1-19.
Boyatzis, R. E. (1998). Transforming Qualitative Information: Thematic Analysis and Code Development. Sage.
Boyd, D. M., and Ellison, N. B. (2007). Social network sites: Definition, history, and scholarship. Journal of Computer-Mediated Communication, 13(1), pp. 210-230.
Braun, V., and Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), pp. 77-101.
Copeland, R. (2011). Tweet all about it: Social media and the news revolution. Metro Magazine: Media & Education Magazine, (169), pp. 96-100.
Copland, S. (2020). Reddit quarantined: Can changing platform affordances reduce hateful material online? Internet Policy Review, 9(4), pp. 1-26.
Crawford, K., and Gillespie, T. (2016). What is a flag for? Social media reporting tools and the vocabulary of complaint. New Media & Society, 18(3), pp. 410-428.
Donato, V. E. S.
E. (2021). Governing fake news: The regulation of social media and the right to freedom of expression in the era of emergency. European Journal of Risk Regulation, pp. 1-41.
DeNardis, L., and Hackl, A. M. (2015). Internet governance by social media platforms. Telecommunications Policy, 39(9), pp. 761-770.
De Reuver, M., Sørensen, C., and Basole, R. C. (2018). The digital platform: A research agenda. Journal of Information Technology, 33(2), pp. 124-135.
Fukuyama, F. (2013). What is governance? Governance, 26(3), pp. 347-368.
Gancho, S. P. M. (2017). Social media: A literature review. e-Revista LOGO, 6(2), pp. 1-20.
Ghazawneh, A., and Henfridsson, O. (2013). Balancing platform control and external contribution in third-party development: The boundary resources model. Information Systems Journal, 23(2), pp. 173-192.
Gillespie, T. (2010). The politics of 'platforms'. New Media & Society, 12(3), pp. 347-364.
Gillespie, T. (2017). Platforms are not intermediaries. Geo. L. Tech. Rev., 2, pp. 198.
Gillespie, T. (2018). Custodians of the Internet. Yale University Press.
Gorodnichenko, Y., Pham, T., and Talavera, O. (2021). Social media, sentiment and public opinions: Evidence from #Brexit and #USElection. European Economic Review, 136, 103772.
Gorwa, R. (2018). Towards fairness, accountability, and transparency in platform governance. AoIR Selected Papers of Internet Research.
Gorwa, R. (2019a). What is platform governance? Information, Communication & Society, 22(6), pp. 854-871.
Gorwa, R. (2019b). The platform governance triangle: Conceptualizing the informal regulation of online content. Internet Policy Review, 8(2), pp. 1-22.
Gorwa, R., and Ash, T. G. (2020). Democratic transparency in the platform society. Social Media and Democracy: The State of the Field, Prospects for Reform, pp. 286.
Gorwa, R., Binns, R., and Katzenbach, C. (2020). Algorithmic content moderation: Technical and political challenges in the automation of platform governance.
Big Data & Society, 7(1), 2053951719897945.
Guo, M., and Sun, F. S. (2020). Like, comment, or share? Exploring the effects of local television news Facebook posts on news engagement. Journal of Broadcasting & Electronic Media, 64(5), pp. 736-755.
Hayes, T., and Oakley, R. L. (2019). Reducing misinformation on social media networks.
Helberger, N., Pierson, J., and Poell, T. (2018). Governing online platforms: From contested to cooperative responsibility. The Information Society, 34(1), pp. 1-14.
Heldt, A. P. (2019). Reading between the lines and the numbers: An analysis of the first NetzDG reports. Internet Policy Review, 8(2).
Kaplan, A. M., and Haenlein, M. (2010). Users of the world, unite! The challenges and opportunities of social media. Business Horizons, 53(1), pp. 59-68.
Kapoor, K. K., Tamilmani, K., Rana, N. P., Patil, P., Dwivedi, Y. K., and Nerur, S. (2018). Advances in social media research: Past, present and future. Information Systems Frontiers, 20(3), pp. 531-558.
Katzenbach, C., and Ulbricht, L. (2019). Algorithmic governance. Internet Policy Review, 8(4), pp. 1-18.
Kim, D. H., and Desai, M. (2021). Are social media worth it for news media? Explaining news engagement on Tumblr and digital traffic of news websites. International Journal on Media Management, 23(1-2), pp. 2-28.
Kumar, H., Singh, M. K., and Gupta, M. P. (2016, September). Smart governance for smart cities: A conceptual framework from social media practices. In Conference on e-Business, e-Services and e-Society (pp. 628-634). Springer, Cham.
Kumar, S. (2019). The algorithmic dance: YouTube's Adpocalypse and the gatekeeping of cultural content on digital platforms. Internet Policy Review, 8(2), pp. 1-21.
Kyza, E. A., Varda, C., Panos, D., Karageorgiou, M., Komendantova-Amann, N., Coppolino Perfumi, S., ... and Hosseini, A. S. (2020).
Combating misinformation online: Re-imagining social media for policymaking. Internet Policy Review, 9(4), pp. 1-24.
Linke, A., and Zerfass, A. (2013). Social media governance: Regulatory frameworks for successful online communications. Journal of Communication Management.
Mansoor, M. (2021). Citizens' trust in government as a function of good governance and government agency's provision of quality information on social media during COVID-19. Government Information Quarterly, 38(4), 101597.
Marshall, H., and Drieschova, A. (2018). Post-truth politics in the UK's Brexit referendum. New Perspectives, 26(3), pp. 89-105.
Meng, Q., Zhang, N., Zhao, X., Li, F., and Guan, X. (2016). The governance strategies for public emergencies on social media and their effects: A case study based on the microblog data. Electronic Markets, 26(1), pp. 15-29.
Mellado, C., and Alfaro, A. (2020). Platforms, journalists and their digital selves. Digital Journalism, 8(10), pp. 1258-1279.
Mukhopadhyay, S., and Bouwman, H. (2019). Orchestration and governance in digital platform ecosystems: A literature review and trends. Digital Policy, Regulation and Governance.
Napoli, P. M. (2015). Social media and the public interest: Governance of news platforms in the realm of individual and algorithmic gatekeepers. Telecommunications Policy, 39(9), pp. 751-760.
Nascimento, A. M., and Da Silveira, D. S. (2017). A systematic mapping study on using social media for business process improvement. Computers in Human Behavior, 73, pp. 670-675.
Obar, J. A., and Wildman, S. S. (2015). Social media definition and the governance challenge: An introduction to the special issue. Telecommunications Policy, 39(9), pp. 745-750.
Oestreicher-Singer, G., and Zalmanson, L. (2013). Content or community? A digital business strategy for content providers in the social age. MIS Quarterly, pp.
591-616.
Papaevangelou, C. (2021). The existential stakes of platform governance: A critical literature review. Open Research Europe, 1(31), pp. 31.
Parker, J. M., Marasi, S., James, K. W., and Wall, A. (2019). Should employees be "dooced" for a social media post? The role of social media marketing governance. Journal of Business Research, 103, pp. 1-9.
Qualman, E. (2012). Socialnomics: How Social Media Transforms the Way We Live and Do Business. John Wiley & Sons.
Rieder, B., and Hofmann, J. (2020). Towards platform observability. Internet Policy Review, 9(4), pp. 1-28.
Riemer, K., and Peter, S. (2021). Algorithmic audiencing: Why we need to rethink free speech on social media. Journal of Information Technology, 36(4), pp. 409-426.
Rietveld, J., Ploog, J. N., and Nieborg, D. B. (2020). Coevolution of platform dominance and governance strategies: Effects on complementor performance outcomes. Academy of Management Discoveries, 6(3), pp. 488-513.
Rochefort, A. (2020). Regulating social media platforms: A comparative policy analysis. Communication Law and Policy, 25(2), pp. 225-260.
Rochet, J. C., and Tirole, J. (2003). Platform competition in two-sided markets. Journal of the European Economic Association, 1(4), pp. 990-1029.
Rochet, J. C., and Tirole, J. (2006). Two-sided markets: A progress report. The RAND Journal of Economics, 37(3), pp. 645-667.
Saurwein, F., and Spencer-Smith, C. (2020). Combating disinformation on social media: Multilevel governance and distributed accountability in Europe. Digital Journalism, 8(6), pp. 820-841.
Shankar, R., and Ahmad, T. (2021). Information technology laws: Mapping the evolution and impact of social media regulation in India. DESIDOC Journal of Library & Information Technology, 41(4).
Sideri, M., Kitsiou, A., Filippopoulou, A., Kalloniatis, C., and Gritzalis, S. (2019).
E-Governance in educational settings: Greek educational organizations leadership's perspectives towards social media usage for participatory decision-making. Internet Research.
Spagnoletti, P., Resca, A., and Lee, G. (2015). A design theory for digital platforms supporting online communities: A multiple case study. Journal of Information Technology, 30(4), pp. 364-380.
Stevens, T. M., Aarts, N., Termeer, C. J. A. M., and Dewulf, A. (2016). Social media as a new playing field for the governance of agro-food sustainability. Current Opinion in Environmental Sustainability, 18, pp. 99-106.
Tiwana, A. (2013). Platform Ecosystems: Aligning Architecture, Governance, and Strategy. Newnes.
Vaast, E., and Kaganer, E. (2013). Social media affordances and governance in the workplace: An examination of organizational policies. Journal of Computer-Mediated Communication, 19(1), pp. 78-101.
Van den Berg, A. C., and Verhoeven, J. W. (2017). Understanding social media governance: Seizing opportunities, staying out of trouble. Corporate Communications: An International Journal.
Webster, J., and Watson, R. T. (2002). Analyzing the past to prepare for the future: Writing a literature review. MIS Quarterly, pp. xiii-xxiii.
Wolf, M., Sims, J., and Yang, H. (2018, March). Social media? What social media? In UKAIS (p. 3).
Zhang, S., Pian, W., Ma, F., Ni, Z., and Liu, Y. (2021). Characterizing the COVID-19 infodemic on Chinese social media: Exploratory study. JMIR Public Health and Surveillance, 7(2), e26090.
Zhuravskaya, E., Petrova, M., and Enikolopov, R. (2020). Political effects of the internet and social media. Annual Review of Economics, 12, pp. 415-438.

governance of social media platforms a literature review

Ebooks on Internet Governance

Related publications.

Logo

  • Browse by Year
  • Browse by Journal Title
  • Browse by Author
|
and Muhammad Ahsan Samad, and Rinawulandari, and Shamsiah Abd Kadir, (2024) Jurnal Komunikasi ; Malaysian Journal of Communication, 40 (2). pp. 205-223. ISSN 0128-1496


551kB

Official URL: https://ejournal.ukm.my/mjc/issue/view/1710

This systematic literature review examines the impact of social media on public opinion and its implications for policy-making. Utilising the PRISMA framework, the study analysed 19 articles from Scopus and Web of Science databases published between 2013-2023. The review identified five main categories of social media platforms discussed; Twitter/X, Meta (Facebook, Instagram, WhatsApp), YouTube, Chinese apps (Sina Weibo, WeChat, QQ), and several unspecified platforms. Key themes emerged across these categories, including the role of social media in knowledge dissemination, creation of filter bubbles and echo chambers, amplification of diverse voices, and spread of misinformation. The findings highlighted social media potential for real-time public opinion monitoring, facilitating engagement between policymakers and citizens, and early identification of emerging issues. However, challenges such as information credibility and algorithmic curation of content were also noted. The review suggests that strategic use of social media can raise awareness and mobilise support for global initiatives like the Sustainable Development Goals. It emphasises the need for policymakers to understand and leverage social media's influence on public sentiment while addressing associated risks. The study contributes to a nuanced understanding of how different social media platforms shape public discourse and influence policy decisions in the digital age. Future research directions are proposed to further explore the complex dynamics between social media, public opinion, and governance in an evolving technological landscape.

Item Type:Article
Keywords:Social media impact; Public opinion formation; Digital policymaking; Online political discourse; Civic engagement
Journal:
ID Code:23951
Deposited By: Mohd Hamka Md. Nasir
Deposited On:07 Aug 2024 07:21
Last Modified:12 Aug 2024 02:57

Repository Staff Only: item control page

archive-creating software, which generates eprints archives that are compliant with the Open Archives Protocol for Metadata Harvesting.Installed and configured by Division of Information System and Technology,

COMMENTS

  1. PDF Governance of Social Media Platforms: A Literature Review

    Method: This study reviews 64 relevant studies on the governance of social media platforms over the last decade. This paper adopts a thematic analysis approach in analyzing the relevant papers and categorizing them into potential themes. Results: This categorizes the relevant studies into what and how to govern.

  2. Governance of Social Media Platforms: A Literature Review

    The extant literature talks about the regulation of platforms, but very few studies delve into the governance of social media platforms. This paper synthesizes the existing knowledge and identifies the gaps in social media governance. Method: This study reviews 64 relevant studies on the governance of social media platforms over the last decade.

  3. Governance of Social Media Platforms: A Literature Review

    Method: This study reviews 64 relevant studies on the governance of social media platforms over the last decade. This paper adopts a thematic analysis approach in analyzing the relevant papers and ...

  4. Governance of Social Media Platforms: A Literature Review

    The governance of social media platforms has been a growing concern in recent times. The extant literature talks about the regulation of platforms, but very few studies delve into the governance of social media platforms. This paper synthesizes the existing knowledge and identifies the gaps in social media governance.

  5. Governance of Social Media Platforms: A Literature Review

    Social Media platforms have become an integral part of our life. Initially, they were used as a medium of interconnection but now have become a source of creating, consuming, and disseminating information. The proliferation of misinformation, fake news, and hate speech are some of the major challenges that are posed by these platforms. To curb these challenges, it is crucial to study the ...

  6. Governance of Social Media Platforms: A

    Governance of Social Media Platforms: A Literature Review A Manish Kumar; Gupta, Sumeet. Pacific Asia Journal of the Association for Information Systems ; Atlanta Vol. 15, Iss. 1, (Mar 2023): 3.

  7. Governing principles: Articulating values in social media platform policies

    Literature review Platform values as a site of contestation Platforms, broadly defined as digital infrastructures that "host, organize and circulate user's shared content or social exchanges" (Gillespie, 2017: 417), structure possibilities for expression and interaction. Although social media companies have made use of the

  8. Regulating Social Media Platforms: A Comparative Policy Analysis

    Gorwa explicated a framework analyzing governance of, and by, platforms with a typology consisting of three main models: self-governance, external governance, and co-governance. This work is instructive for its contextualization of how evolving mechanisms within the platform economy can be understood in terms of traditional models of oversight.

  9. © The Author(s) 2021 others: A review of the governance choices facing

    platforms and exchange platforms, it has shed little light on the governance of social media platforms. In this review, we synthesize the emerging literature on diverse social media platforms, focussing on four types of governance mechanisms: those that regulate user behaviour, those

  10. Governance of Social Media Platforms: A Literature Review

    Governance of Social Media Platforms: A Literature Review. Manish Kumar A, Sumeet Gupta. Governance of Social Media Platforms: A Literature Review. In Ming-Hui Huang, Guy Gable, Christy M. K. Cheung, Dongming Xu, editors, 26th Pacific Asia Conference on Information Systems, PACIS 2022, Virtual Event / Taipei, Taiwan / Sydney, Australia, July 5 ...

  11. The dark side of digitalization and social media platform governance: a

    Purpose. Social media platforms are a pervasive technology that continues to define the modern world. While social media has brought many benefits to society in terms of connection and content sharing, numerous concerns remain for the governance of social media platforms going forward, including (but not limited to) the spread of misinformation, hate speech and online surveillance.

  12. Governance of Social Media Platforms: A Literature Review

    Semantic Scholar extracted view of "Governance of Social Media Platforms: A Literature Review" by A. ManishKumar et al. ... {ManishKumar2022GovernanceOS, title={Governance of Social Media Platforms: A Literature Review}, author={A ManishKumar and Sumeet Gupta}, booktitle={Pacific Asia Conference on Information Systems}, year={2022}, url={https ...

  13. Misinformation on social platforms: A review and research Agenda

    Dissemination of misinformation is facilitated by "actors" (users) on social platforms who share the content in their networks, either actively (e.g. through 'like' or 'share' on Facebook, and 'tweet' or 'retweet' on Twitter), or passively (by 'commenting' on social media stories) [11].

  14. The impact of government use of social media and social media

    The development of social media platforms has altered modes of communication between the government and the public, introducing new forms of citizen engagement. ... Literature review. ... Scholars have demonstrated the promising role of social government in transforming governance by increasing governments' transparency in their communication ...

  15. A review of social media-based public opinion analyses: Challenges and

    Through a systematic literature review, we identify 54 papers to analyze and discuss issues related to data collection, data quality, and data mining. ... Social media platforms provide a new way of representing and measuring public opinions. ... including social governance, decision making, policy adoption and adjustment, ...

  16. Corporate Governance in a Social Media Era

    Through social media platforms, companies can build and promote their brands, introduce new products, and learn about their customer base. ... There is a lack of qualitative research on social media governance, and this paper aims to provide a systematic literature review of the governance structures currently in use by major corporations. More ...

  17. Governing principles: Articulating values in social media platform

    Literature review: Platform values as a site of contestation. Platforms, ... Hackl AM (2015) Internet governance by social media platforms. Telecommunications Policy 39(9): 761-770. Epstein D, Roth MC, Baumer EPS (2014) It's the definition, stupid! Framing of online privacy in the internet governance forum debates.

  18. The existential stakes of platform governance: a critical literature review

    Abstract. This study introduces a comprehensive overview of literature concerning the concepts of regulation and governance, and attempts to connect them to scholarly works that deal with the governance of and by social media platforms. The paper provides fundamental definitions of regulation and governance, along with a critique of ...

  19. Evaluating the legitimacy of platform governance: A review of research

    This work builds on the efforts of a broad range of researchers already working to systematically investigate the governance of social media platforms and telecommunications intermediaries. In this article, we present our review and analysis of the work that has been carried out to date, using the digital constitutionalism literature to ...

  20. Governance of Social Media Platforms A Literature Review

    Published by Ebooks on Internet Governance, 2022-06-18 20:24:34. Description: Social Media platforms have become an integral part of our life. Initially, they were used as a medium of interconnection but now have become a source of creating, consuming, and disseminating information. The proliferation of misinformation, fake news, and hate ...

  21. Social media in shaping public opinion roles and impact: a systematic

    This systematic literature review examines the impact of social media on public opinion and its implications for policy-making. Utilising the PRISMA framework, the study analysed 19 articles from the Scopus and Web of Science databases published between 2013 and 2023. The review identified five main categories of social media platforms discussed: Twitter/X, Meta (Facebook, Instagram, WhatsApp ...

  22. Mapping government social media research and moving it forward: A

    Drawing on a comprehensive review of government social media literature in the e-government, the Information Systems (IS), and the public administration (PA) research fields, we mapped government social media research into the six focus categories of context, user characteristics, user behavior, platform properties, management, and effects.

  23. Unveiling Digital Democracy: Social Media's Catalyst Role in Enhancing

    This research uses a systematic literature review methodology to describe emerging trends in digital democracy studies and analyze them to increase comprehensiveness and comparability, in particular highlighting articles that cover various aspects of digital democracy, political participation, and the role of social media in the context of ...

  24. The evolution of social media influence

    In the business world, social media became popular after 2012, and the academic literature indicates that social media evolved after 2000 (Boyd & Ellison, 2007). Therefore, only documents published in 2000 or later were considered for the review. First, the keyword "social media" was searched in the Scopus database.