
Journal article

Moderating borderline content while respecting fundamental values

Stuart Macdonald, Katy Vaughan

Policy and Internet

Swansea University Authors: Stuart Macdonald, Katy Vaughan

  • 64730.VOR.pdf

    PDF | Version of Record

    This is an open access article under the terms of the Creative Commons Attribution License.

    Download (2.12MB)


DOI (Published version): 10.1002/poi3.376

Abstract

As efforts to identify and remove online terrorist and violent extremist content have intensified, concern has also grown about so-called lawful but awful content. Various options have been touted for reducing the visibility of this borderline content, including removing it from search and recommendation algorithms, downranking it and redirecting those who search for it. This article contributes to this discussion by considering the moderation of such content, in terms of three sets of values. First, definitional clarity. This is necessary to provide users with fair warning of what content is liable to moderation and to place limits on the discretion of content moderators. Yet, at present, definitions of borderline content are vague and imprecise. Second, necessity and proportionality. While downranking and removal from search and recommender algorithms should be distinguished from deplatforming, tech companies’ efforts to deamplify borderline content give rise to many of the same concerns as content removal and account shutdowns. Third, transparency. While a number of platforms now publish their content moderation policies and transparency data reports, these largely focus on violative, not borderline content. Moreover, there remain questions around access to data for independent researchers and transparency at the level of the individual user.


Published in: Policy and Internet
ISSN: 1944-2866
Published: Wiley, 28 November 2023
Keywords: algorithms, borderline content, content moderation, freedom of expression, recommendation, terrorist and violent extremist content, transparency
Online Access: Check full text

URI: https://cronfa.swan.ac.uk/Record/cronfa64730