Ajou University repository

GTA: Gated Toxicity Avoidance for LM Performance Preservation

DC Field / Value
dc.contributor.author: Kim, Heegyu
dc.contributor.author: Cho, Hyunsouk
dc.date.issued: 2023-01-01
dc.identifier.uri: https://aurora.ajou.ac.kr/handle/2018.oak/37011
dc.identifier.uri: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85183309567&origin=inward
dc.description.abstract: Caution: This paper includes offensive words that could potentially cause unpleasantness. The fast-paced evolution of generative language models such as GPT-4 has demonstrated outstanding results in various NLP generation tasks. However, because these models can generate offensive words related to race or gender, various Controllable Text Generation (CTG) methods have been proposed to mitigate the occurrence of harmful words. Existing CTG methods, however, not only reduce toxicity but also negatively impact several aspects of the language model's generation performance, including topic consistency, grammar, and perplexity. This paper explores the limitations of previous methods and introduces a novel solution in the form of a simple Gated Toxicity Avoidance (GTA) that can be applied to any CTG method. We also evaluate the effectiveness of the proposed GTA by comparing it with state-of-the-art CTG methods across various datasets. Our findings reveal that gated toxicity avoidance efficiently achieves comparable levels of toxicity reduction to the original CTG methods while preserving the generation performance of the language model.
dc.description.sponsorship: This work was supported by an Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2022-0-00680, Abductive inference framework using omni-data for understanding complex causal relations; IITP-2023-No. RS-2023-00255968, Artificial Intelligence Convergence Innovation Human Resources Development). We received support from the Google TPU Research Cloud to train the models required for this study. To make Figures 1, 2, and 3, we used commercially available public icons. The robot icons are created by itim2101 - Flaticon; the gate icon and bearded male icon are created by Freepik - Flaticon.
dc.language.iso: eng
dc.publisher: Association for Computational Linguistics (ACL)
dc.subject.mesh: Generation method
dc.subject.mesh: Language model
dc.subject.mesh: Model generation
dc.subject.mesh: Novel solutions
dc.subject.mesh: Performance
dc.subject.mesh: Simple++
dc.subject.mesh: State of the art
dc.subject.mesh: Text generations
dc.subject.mesh: Toxicity reduction
dc.title: GTA: Gated Toxicity Avoidance for LM Performance Preservation
dc.type: Conference
dc.citation.conferenceDate: 2023.12.6. ~ 2023.12.10.
dc.citation.conferenceName: 2023 Findings of the Association for Computational Linguistics: EMNLP 2023
dc.citation.edition: Findings of the Association for Computational Linguistics: EMNLP 2023
dc.citation.endPage: 14763
dc.citation.startPage: 14747
dc.citation.title: Findings of the Association for Computational Linguistics: EMNLP 2023
dc.identifier.bibliographicCitation: Findings of the Association for Computational Linguistics: EMNLP 2023, pp.14747-14763
dc.identifier.doi: 2-s2.0-85183309567
dc.identifier.scopusid: 2-s2.0-85183309567
dc.type.other: Conference Paper
dc.description.isoa: true
dc.subject.subarea: Computational Theory and Mathematics
dc.subject.subarea: Computer Science Applications
dc.subject.subarea: Information Systems
dc.subject.subarea: Language and Linguistics
dc.subject.subarea: Linguistics and Language
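
The abstract above does not detail the gating mechanism. One plausible reading of "gated" toxicity avoidance is that a gate decides, at each decoding step, whether to apply the CTG method's detoxifying adjustment or to leave the base language model's distribution untouched, so that the quality-degrading intervention is only used when needed. The sketch below is a minimal, hypothetical Python illustration of that reading, not the authors' implementation; lm_logits, ctg_logits, gate_is_toxic, and the toy vocabulary are stand-ins introduced here for illustration only.

```python
# Hypothetical sketch of gated controllable text generation (illustration only).
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["the", "cat", "sat", "on", "mat", "darn"]  # toy vocabulary; "darn" plays the 'toxic' token

def lm_logits(context):
    """Stand-in for the base language model's next-token logits."""
    return rng.normal(size=len(VOCAB))

def ctg_logits(context, base):
    """Stand-in for a CTG method: shifts probability mass away from the 'toxic' token."""
    adjusted = base.copy()
    adjusted[VOCAB.index("darn")] -= 5.0
    return adjusted

def gate_is_toxic(context):
    """Stand-in gate: a classifier that flags contexts at risk of producing toxicity."""
    return "darn" in context or rng.random() < 0.2

def generate(prompt, steps=10):
    context = list(prompt)
    for _ in range(steps):
        base = lm_logits(context)
        # Gated toxicity avoidance (as read from the abstract): apply the CTG
        # adjustment only when the gate flags a risk; otherwise keep the original
        # LM distribution to preserve generation quality.
        logits = ctg_logits(context, base) if gate_is_toxic(context) else base
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        context.append(VOCAB[rng.choice(len(VOCAB), p=probs)])
    return " ".join(context)

print(generate(["the"]))
```

In this reading, the gate is what lets detoxification strength stay comparable to always-on CTG while most decoding steps use the unmodified language model, which matches the abstract's claim of preserving topic consistency, grammar, and perplexity.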

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Cho, Hyunsouk (조현석)
Department of Software and Computer Engineering
