9:00 am - 9:30 am
Opening remarks & Welcome
Symposium - Platform governance for better communities
Dr Fiona Martin and Nicolas Suzor
9:30 am - 12:45 pm
Building cohesive, productive online communities depends on good governance relationships – particularly with the platforms that host our conversations and content. But increasingly, transnational communications platform providers are portrayed as reluctant to control the violence and misinformation they host, and slow to provide moderation tools that help community managers minimise harmful content and deal with bad behaviour.
In light of the Christchurch Call, the 2019 SWARM Symposium investigates how we can govern online communities more effectively, for safer sociality. It’s sponsored by the Department of Media and Communications and Faculty of Arts & Social Sciences, and is inspired this year by a joint ARC Discovery grant with colleagues from QUT, Platform Governance: Rethinking Internet Regulation as Media Policy (DP190100222).
9.30 – 9.35 Welcome and Acknowledgement of Country
Dr Fiona Martin, Research Director, Dept. Media & Communications
9.35 – 9.50 Keynote: Professor Nicolas Suzor, Law School, Queensland University of Technology
A new social contract for the digital age: the responsibilities of platforms
Rampant abuse, hate speech, censorship, bias, and disinformation – our Internet has problems. It is governed by technology companies – search engines, social media platforms, content hosts, and infrastructure providers – whose rules influence what we are allowed to see and say. These companies govern our digital environment, but they are also subject to pressure from governments and other powerful actors to censor and control the flow of information online. As governments around the world grapple with how to regulate digital media platforms, it’s clear that big changes are coming. We are now at a constitutional moment – an opportunity to rethink the basic rules of how the Internet is governed. Can we build a vibrant, diverse, and flourishing Internet that promotes fundamental human rights? I argue that, if we care about the future of our shared social spaces, we need a new constitutionalism: real limits on how power is exercised online.
Bio: Professor Nicolas Suzor studies the regulation of networked society, including the governance of the internet and social networks, digital copyright, and knowledge commons. Nic is also the Chapter Lead of the Creative Commons Australia project and the deputy chair of Digital Rights Watch, an Australian non-profit organisation whose mission is to ensure that Australian citizens are equipped, empowered and enabled to uphold their digital rights. He is the author of Lawless: the secret rules that govern our digital lives (Cambridge, 2019).
Nic teaches intellectual property and technology law at QUT. He is an award-winning educator, receiving QUT’s David Gardiner Teacher of the Year medal in 2016, and in 2017 he was nationally recognised with an Australian Awards for University Teaching Citation for Outstanding Contributions to Student Learning for his engaging and innovative teaching.
9.55 – 10.15 Mr Luke Munn – Western Sydney University
Angry by Design?: Technical Affordances and Toxic Communication
Hate speech online is on the rise. Recent studies describe this rise statistically (Safehome 2017; Hango 2016), but stop short of analyzing its underlying conditions. Harm reduction on platforms seems heavily focused on improving automated systems (Pavlopoulos et al. 2017) or human content moderators (Gillespie 2017). Mainstream literature, for its part, often blames a toxic individual, someone with a predilection for hating or bullying, racism or sexism (Jennings-Edquist 2014). In contrast, this study seeks to understand how hate emerges from hate-inducing architectures. Just as the design of urban space influences the practices within it, the design of platforms, apps and technical environments shapes our behaviour in digital space. How does the design of technical environments promote toxic communication?
In the last few years, technical designers have admitted that their systems are addictive (Bosker 2016) and exploit negative “triggers” (Lewis 2017). Others have spoken about their tools “ripping apart the social fabric of how society works” (Vincent 2017). Facebook’s design privileges base impulses rather than considered reflection (Bosker 2016). Social media functionality enables negative messages to be distributed farther and faster (Vosoughi et al. 2018), while anger spreads contagiously (Fan et al., 2016). The “incentive structures and social cues of algorithm-driven social media sites” amplify the anger of users over time until they “arrive at hate speech” (Fisher & Taub 2018). Indeed such gradual amplification of hate creates a potential pipeline for alt-right radicalization (Munn 2019a; Munn 2019b). In warning others of these negative social effects, designers have described themselves as canaries in the coal mine (Mac 2019).
Very recently, then, a new wave of designers and technologists has begun thinking about how to redesign platforms to foster calmer behaviour and more civil discourse. How might design create ethical platforms that enhance users’ wellbeing (Han 2019)? Could technology be designed in a more humane way (Harris 2018), and what would the core principles and processes of such design look like (Yablonski 2019)? Identifying a set of hate-promoting architectures would allow designers and developers to construct future platforms that mitigate communication used to threaten, harass, or incite harm.
“Angry By Design,” recently funded by Netsafe, picks up on this nascent work, tracing the relationship between technical architectures and toxic communication. Three distinctly different platforms are examined: Facebook, Twitch, and 4chan. How does Facebook’s privileging of metrics influence the intensity of content that gets shared? What kind of features support Twitch’s “gamerbro” culture of misogynistic trolling? And how does message board design encourage memes that normalize hate against marginalized communities? For the SWARM symposium, this paper will survey the terrain of platform design and hate speech, introduce some early findings, and suggest some promising directions for future research.
Bio: Based in Tāmaki Makaurau, Aotearoa New Zealand, Luke Munn uses both practice-based and theoretical approaches to explore the intersections of digital cultures, investigating how technical environments shape the political and social capacities of the everyday. He is currently completing a PhD at Western Sydney University on algorithmic power.
10.20 – 10.40 Ms Jenna Price – University of Technology/University of Sydney
The emotional labour of online activism
As activists work on digital campaigns, they struggle with detractors, negotiate with other activists, and come face-to-face with the perpetually demanding participatory nature of online activism. In other words, they invest emotionally in their online labour. Managing feelings in this setting is a requirement in much the same way as it is in paid work: activists must manage their own feelings, their feelings about each other, and their feelings about the impact of both campaigning and campaigns in order to achieve their end goals. Digital activism is also outward-facing, conducted in the public sphere, making the job of exerting control over emotional states more pressing than it is in intimate surroundings. This presentation discusses the impact of emotional labour on the moderators and administrators of a feminist activist Facebook group. It’s not all bad news, either.
Bio: Jenna Price is a senior lecturer in journalism at the University of Technology, Sydney and a PhD candidate with the University of Sydney. She is a columnist for the Sydney Morning Herald, and a co-founder of feminist action group Destroy the Joint.
10.45 – 11.05 Dr Lukasz Swiatek (UNSW Sydney) and Chris Galloway (Massey University)
Platform Governance, AI & Boundary Spanning: New Approaches for PR Managers
As harmful online content and behaviour take a growing toll on online communities, and as concerns mount about international platform providers’ inadequate regulation of such content and behaviour, the role of public relations (PR) managers becomes increasingly important. This paper examines the vital ways in which PR managers – with a management rather than just a ‘technician’ role (Grunig & Grunig, 1992) – can contribute to platform governance. Specifically, it examines the contributions that they can make as boundary-spanners: operating across organisational boundaries (Grunig & Hunt, 1984) and facilitating communication flow between different departments, staff and other internal stakeholders.
The paper makes a novel contribution to theory and practice in platform governance (as well as PR) by examining the ways in which boundary-spanning by PR managers can help build and maintain safer online communities in an era of proliferating artificial intelligence (AI). To the best of the authors’ knowledge, based on an extensive review of the literature, this approach has not been considered before, with PR-AI possibilities only recently beginning to be investigated thoroughly in scholarly literature (see, for example, Tilson, 2017; Yaxley, 2018; Galloway & Swiatek, 2018). The paper argues that boundary-spanning by PR managers for more effective platform governance needs to take into account the affordances of AI technologies, as well as the dilemmas that they entail. It presents a new practice-based framework for boundary-spanning that includes both synchronous and asynchronous communications monitorable by AI technologies.
The paper’s case study is Facebook and its ‘policy team’ (comprising public relations professionals, crisis management practitioners, and lawyers), its 7,500 human moderators, and its technologists, among other organisational groups (Koebler & Cox, 2018). The paper speaks directly to the symposium theme, and two specific sub-themes: (1) AI and its impacts on community development, and (2) regulating live streaming and synchronous chat.
11.05 – 11.15 MORNING TEA
11.20 – 11.40 Mr Tim Koskie – Centre for Media Transition, UTS, Sydney
Insert culture here: Culturally intermediating online communities
The legal and ethical challenges facing organisations that choose to host user comments on their websites are increasingly visible. However, it is unclear what set of comment moderation and community management practices and objectives organisations are employing to achieve their goals. This research project investigated the work of the cultural intermediaries working in these comment sections in online newsrooms. It found that these workers often grounded their choices in the journalistic field that surrounded them and that they themselves inhabited.

Through observations and interviews at Fairfax Media and The Conversation with the staff that watch, moderate, manage, and shape user comments on news stories, the research uncovered the influences guiding their work, as well as some of the distinct tasks and practices they employ to cultivate the culture they want to see in the discussions below the articles. Employing newsroom ethnographic techniques, the study used participant observation alongside deep, unstructured interviews with nine participants, including journalists, editors, a web developer, and, most prominently, the key staff dedicated to comment moderation and community management.

It found that, while participants did not equally value comments, their practices and judgments related strongly to their backgrounds and the context in which they operated, which ultimately shaped their cultural intermediation work. Further, it revealed that, through the way they prioritise their work, they can have a significant influence on the way the comment sections develop. These results show that organisations need to consider how their culture and the background of their staff shape the comment sections they host. It also reveals subtle but crucial tasks and challenges facing the staff that engage in this work.
Bio: Timothy Koskie is a researcher, government consultant, and doctoral student at the Centre for Media Transition, UTS, as part of the joint Media Pluralism Project with the University of Sydney. His academic interests are user generated content, the culture of media work, and media pluralism, with his current project investigating how user generated content plays a part in pluralistic media ecosystems.
11.45 – 12.05 Dr Fiona Martin & Ms Venessa Paech, University of Sydney
Working with platforms: the parameters of community governance relationships
While media and communications research into platform content regulation has focused on the problematic of moderation (Roberts, 2016; Gillespie, 2018) and on everyday user experience of platforms’ regulatory regimes (Crawford & Gillespie, 2014; Sarikakis, 2017; Gerrard, 2018; Tan, 2018), little study has been undertaken of the governance relationships developing between platforms and community managers. Based on preliminary data from the 2019 Australian Community Managers Career survey, this paper explores the scope of community manager relationships with platform providers, and the challenges that they identify in governing groups on their communications infrastructures. Using a nodal governance framework (Holley and Shearing, 2017), it pinpoints areas for further research into the uneven dynamics and power inequities between platform companies and professional community managers.
The study finds that Australian community managers worked with over 20 different online communications platforms, specialist and non-specialist, to build their communities, with Facebook hosting the most-used non-specialist applications for this purpose. However, in contrast to the diversity of hosting architectures reported, the challenges community managers face in negotiating these relationships were more homogeneous, concentrating on functionality and usability, with data privacy, service regulation and content regulation as additional concerns. In exploring how they would like to improve their relationships with their platform providers, community managers’ responses highlight the barriers they face in achieving transparent, timely, relevant and consistent responses to the issues they raise.
Bio: Dr Fiona Martin researches digital journalism and dialogic technologies, as well as the uses, politics and regulation of online media (internet, web, mobile and social media) and the implications of these technologies for media industry change. She is the co-author, with Tim Dwyer, of Sharing News Online (Palgrave Macmillan, 2018) and the author of Mediating the Conversation (Routledge 2020). She is a co-investigator on the ARC Discovery project Platform Governance: Rethinking internet regulation as media policy, 2019-2022 and on the Facebook Content Policy Research on Social Media Platforms award: Regulating Hate Speech in the Asia Pacific.
Ms Venessa Paech is an internationally regarded community builder, manager and strategist. She has been engaged by numerous organisations as a community principal, consultant and strategist, including Lonely Planet, REA Group, Envato and Australia Post. She is the co-founder of the SWARM conference and the Australian Community Managers network, and is a founding member of the Global Community Management Leadership Group (with fellow community leaders from five nations), working closely with industry, government and researchers to grow community management practice in the Asia Pacific region. Venessa is a PhD student at the University of Sydney, studying the impact of automation and AI on community building and governance, and is a member of the Socio-Technical Futures (STuF) Lab.
12.05 – 13.00 Panel – Evaluating Australia’s platform governance strategy
Moderator: Fiona Martin. Panellists: Nic Suzor, Andre Oboler, Venessa Paech
In the Abhorrent Violent Material Act and the ACCC’s Digital Platforms Inquiry, the Australian federal government has shown an appetite for regulating social media platforms. How effective are the approaches it’s favouring, and what other directions might we take in governing online communities and content? We welcome to the discussion Dr Andre Oboler, CEO of the Online Hate Prevention Institute.
Bio: Dr Andre Oboler is a Senior Lecturer in the La Trobe Law School and CEO of the Online Hate Prevention Institute. He serves on the IEEE’s Global Public Policy Committee and on the Australian Government’s delegation to the International Holocaust Remembrance Alliance. He holds a PhD in Computer Science from Lancaster University (UK), and an LLM (Juris Doctor) and Honours in Computer Science from Monash University.
12:45 pm - 2:15 pm
Building awesome Facebook Groups: the secret sauce
Alison Michalk and Larah Kennedy - Quiip
2:30 pm - 4:00 pm
Looking to make use of Facebook Groups for your community? Drawing on decades of experience and case studies from their work with some of Australia’s most successful Facebook groups, Quiip’s Alison Michalk and Larah Kennedy will share their leading tips and tricks for making the most of the community product on the world’s largest social media platform. Bring an existing group you’re working with, or ideas about one you plan to launch.
ROI and business casing
2:30 pm - 4:00 pm
We get it: recalibrating your creative brain to prove the ROI of your work is tough! In this workshop, you’ll learn ways of thinking about deriving and proving metrics from your work. Using examples from 20 years in Silicon Valley companies such as eBay, Zynga, and Sephora, longtime online community advocate and strategist Shira Levine will make you smarter about building the business cases for your work. For this workshop you will need an open mind and a willingness to share in a group.
Self-care for community professionals (and their communities)
Dr Jennifer Beckett - University of Melbourne
4:00 pm - 5:30 pm
Community management is emotional labour. The University of Melbourne’s Dr Jennifer Beckett will lead you through strategies and tactics to better support you in this labour, so you can work at the top of your game while maintaining health and well-being. She will also explain how to apply care frameworks to your community and its members, which in turn supports your community objectives.
Social media legislation: community manager primer
Paul Gordon - Wallmans Lawyers
4:00 pm - 5:30 pm
As one of Australia’s leading experts on social media and privacy law, Wallmans Lawyers Partner Paul Gordon will help make sense of the gamut of new legislation and regulation impacting the work of community professionals in Australia. What do you have to know, what should you be careful of, and how can you stay on the right side of the law while doing your best work? He will also help attendees troubleshoot legal issues or questions they have concerning their communities.
7:45 am - 8:50 am
Arrivals & Registration
8:50 am - 9:00 am
Welcome to Country
9:00 am - 9:20 am
Why Woolies is investing in online community
Gemma Howells - Woolworths FoodCo
Community lessons from the support frontlines
Brent Patching - Prostate Cancer Foundation of Australia
9:55 am - 10:30 am
In exploring how online communities are changing the delivery of psychosocial support, Brent shares his tips on using a research mindset in developing online community strategy; creating a sustainable community with minimal resources; and empowering users to drive conversations, particularly in a support and NFP context.
10:30 am - 11:00 am
The Indispensable Community: how to stop chasing engagement and start delivering results
Rich Millington - Feverbee
11:00 am - 11:40 am
Do you spend your days endlessly trying to get community members to click like, share, and post comments? Do you feel forced to chase meaningless engagement metrics? If so, you’re probably one of thousands of community professionals caught in the engagement trap. During this talk, Rich explains how top community professionals have stopped chasing engagement and instead built communities neither they, nor their members, can live without.
This talk will help you:
- Stop chasing engagement metrics and focus on the behaviours that really matter.
- Get your boss and your colleagues to provide you with the resources to build an indispensable community.
- Persuade members to make their best possible contributions to your community.
- Develop remarkable new products.
- Align your community to many areas of your business and multiply your value.
- Develop a comprehensive strategy for your brand’s community from scratch.
Lessons in connection from the world's biggest neighbourhood network
Jennie Sager - Nextdoor
11:40 am - 12:15 pm
Ten years ago, a group of entrepreneurs set out on a mission to build an app that could make the world a better place. Technology was increasing social isolation, and they wanted to flip that reality on its head and create support for authentic and meaningful connections. This is how Nextdoor was born. Learn how it was built into a global powerhouse that has become the go-to online community for neighbours worldwide – and how its teams use hyperlocality to turbocharge connection and interaction.
12:15 pm - 1:15 pm
The skills you need to run a community of 330 million
Evan Hamilton - Reddit
1:15 pm - 1:50 pm
Being a team of one running a small community differs greatly from leading a team of 16 and a community larger than the population of the United States. Evan will share his journey and the skills he had to learn along the way to successfully run Reddit’s community team. His insights about scaling community management will help you build your own career roadmap – whether you manage a community of 1000 or 100 million.
The perfect platform is hard to find – how to make the most of the cards you've been dealt
Sarah Hawk - Discourse
1:50 pm - 2:25 pm
Community technology has come a long way in the past decade – so far that it can be difficult to keep up. Most community professionals find themselves battling to create a truly magical experience for their members while becoming increasingly weighed down by stakeholder expectations, budgetary restrictions, limited resources and technical debt. Even when starting from scratch, the many different considerations can become overwhelming. But fear not – help is here! With many years of experience as both a community practitioner and a software builder, Hawk will walk you through some simple strategies to make the most of the hand that you’ve been dealt.
2:25 pm - 2:50 pm
A quest for belonging: Building new online communities
2:50 pm - 3:25 pm
Building new communities is hard, and most of them fail. A seasoned community practitioner, Jason shares stories of how his career in community has unfolded, and the critical lessons he has learnt about building spaces people want to belong to – and communities that have a chance at lasting life.
How to become a Chief Community Officer
3:15 pm - 4:25 pm
Longtime online community advocate and strategist Shira Levine brings her infectious enthusiasm to an important topic for all of us: how to position yourself as your organisation’s Chief Community Officer (CCO) in five easy steps. After sharing her pathway to community leadership, Shira will lead us through an activity that locks the lessons in, for immediate impact on the job.
4:45 pm - 5:00 pm
Thanks & Wrap Up
5:00 pm - 7:00 pm
Drinks & Networking