Profanity use in online communities

Sara Owsley Sood*, Judd Antin, Elizabeth F. Churchill

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

34 Citations (Scopus)

Abstract

As user-generated Web content increases, the amount of inappropriate and/or objectionable content also grows. Several scholarly communities are addressing how to detect and manage such content: research in computer vision focuses on detecting inappropriate images, and natural language processing technology has advanced to recognize insults. However, profanity detection systems remain flawed. Current list-based profanity detection systems have two limitations. First, they are easy to circumvent and quickly become stale; that is, they cannot adapt to misspellings, abbreviations, and the fast pace of profane slang evolution. Second, they offer a one-size-fits-all solution; they typically do not accommodate domain-, community-, and context-specific needs. Yet social settings have their own normative behaviors: what is deemed acceptable in one community may not be in another. In this paper, through analysis of comments from a social news site, we provide evidence that current systems perform poorly and evaluate the cases on which they fail. We then address community differences regarding the creation and tolerance of profanity and suggest a shift toward more contextually nuanced profanity detection systems.
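
To make the abstract's critique concrete, the sketch below shows a minimal list-based filter of the kind the paper argues against; the word list, tokenization, and example comments are illustrative assumptions for this summary, not the systems or data evaluated in the paper.

    import re

    # Placeholder blocklist standing in for a real profanity list (illustrative only).
    PROFANITY_LIST = {"darn", "heck"}

    def list_based_filter(comment: str) -> bool:
        """Flag a comment if any token exactly matches the static list."""
        tokens = re.findall(r"[a-z']+", comment.lower())
        return any(tok in PROFANITY_LIST for tok in tokens)

    print(list_based_filter("darn this weather"))    # True: exact match is caught
    print(list_based_filter("d4rn this weather"))    # False: a one-character substitution evades the list
    print(list_based_filter("daaarn this weather"))  # False: stretched spelling evades it too

Exact matches are flagged, but trivial misspellings and character substitutions slip through, which is the circumvention and staleness problem the abstract describes; a static list also cannot express the community-specific norms the authors highlight.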

Original language: English (US)
Title of host publication: Conference Proceedings - The 30th ACM Conference on Human Factors in Computing Systems, CHI 2012
Pages: 1481-1490
Number of pages: 10
ISBN (Print): 9781450310154
DOIs: https://doi.org/10.1145/2207676.2208610
State: Published - May 24, 2012
Event: 30th ACM Conference on Human Factors in Computing Systems, CHI 2012 - Austin, TX, United States
Duration: May 5, 2012 - May 10, 2012

Other

Other: 30th ACM Conference on Human Factors in Computing Systems, CHI 2012
Country: United States
City: Austin, TX
Period: 5/5/12 - 5/10/12

Keywords

  • Comment threads
  • Community management
  • Negativity
  • Online communities
  • Profanity
  • User-generated content

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design

Cite this

Sood, S. O., Antin, J., & Churchill, E. F. (2012). Profanity use in online communities. In Conference Proceedings - The 30th ACM Conference on Human Factors in Computing Systems, CHI 2012 (pp. 1481-1490). https://doi.org/10.1145/2207676.2208610