Ajou University repository

EDiT: Interpreting ensemble models via compact soft decision trees
Dublin Core metadata (field: value)

dc.contributor.author: Yoo, Jaemin
dc.contributor.author: Sael, Lee
dc.date.issued: 2019-11-01
dc.identifier.issn: 1550-4786
dc.identifier.uri: https://aurora.ajou.ac.kr/handle/2018.oak/36439
dc.identifier.uri: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85078899754&origin=inward
dc.description.abstract: Given feature-based data, how can we accurately classify an individual input and interpret the result? Ensemble models are often the best choice in terms of accuracy when dealing with feature-based datasets. However, interpreting the decision made by an ensemble model for an individual input seems intractable. On the other hand, decision trees, although prone to overfitting, are considered the most interpretable models in that the decision process for an individual input can be traced. In this work, we propose Ensemble to Distilled Tree (EDiT), a novel distillation method that generates compact soft decision trees from ensemble models. EDiT exploits the interpretability of a tree-based structure by removing redundant branches and learning sparse weights, while enhancing accuracy by distilling the knowledge of ensemble models such as random forests (RF). Our experiments on eight datasets show that EDiT reduces the number of parameters of an RF by 6.4 to 498.4 times with only a minor loss of classification accuracy.
dc.description.sponsorship: This work was supported by the National Research Foundation of Korea funded by the Ministry of Science, ICT and Future Planning (2018R1A5A1060031, 2018R1A1A3A0407953). Lee Sael is the corresponding author.
dc.language.iso: eng
dc.publisher: Institute of Electrical and Electronics Engineers Inc.
dc.subject.mesh: Classification accuracy
dc.subject.mesh: Decision process
dc.subject.mesh: Ensemble modeling
dc.subject.mesh: Interpretability
dc.subject.mesh: Interpretable learning
dc.subject.mesh: Random forests
dc.subject.mesh: Tree-based structures
dc.subject.mesh: Weight pruning
dc.title: EDiT: Interpreting ensemble models via compact soft decision trees
dc.type: Conference
dc.citation.conferenceDate: 2019.11.8. ~ 2019.11.11.
dc.citation.conferenceName: 19th IEEE International Conference on Data Mining, ICDM 2019
dc.citation.edition: Proceedings - 19th IEEE International Conference on Data Mining, ICDM 2019
dc.citation.startPage: 1438
dc.citation.endPage: 1443
dc.citation.title: Proceedings - IEEE International Conference on Data Mining, ICDM
dc.citation.volume: 2019-November
dc.identifier.bibliographicCitation: Proceedings - IEEE International Conference on Data Mining, ICDM, Vol.2019-November, pp.1438-1443
dc.identifier.doi: 10.1109/icdm.2019.00187
dc.identifier.scopusid: 2-s2.0-85078899754
dc.subject.keyword: Interpretable learning
dc.subject.keyword: Knowledge distillation
dc.subject.keyword: Random forests
dc.subject.keyword: Soft decision trees
dc.subject.keyword: Weight pruning
dc.type.other: Conference Paper
dc.description.isoa: false
dc.subject.subarea: Engineering (all)
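The abstract describes distilling a random forest teacher into a single compact tree student. As a rough illustration of that distillation idea only (EDiT itself learns a *soft* decision tree with pruned branches and sparse weights, which scikit-learn does not provide), the following sketch trains a hard, depth-limited tree on a random forest's predictions over a synthetic dataset and compares parameter counts; all dataset sizes and hyperparameters here are illustrative assumptions, not the paper's settings.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic feature-based dataset (a stand-in for the paper's eight benchmarks).
X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Teacher: a random forest, typically accurate but hard to interpret per input.
teacher = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Student: a small tree fit to the teacher's predicted labels (the basic
# knowledge-distillation step; EDiT additionally uses soft nodes, branch
# pruning, and sparse weights, which are not reproduced here).
student = DecisionTreeClassifier(max_depth=5, random_state=0)
student.fit(X_tr, teacher.predict(X_tr))

# Compare model sizes by total tree-node counts.
teacher_nodes = sum(est.tree_.node_count for est in teacher.estimators_)
student_nodes = student.tree_.node_count
print(f"teacher nodes: {teacher_nodes}, student nodes: {student_nodes}")
```

The student is orders of magnitude smaller than the forest, which mirrors the paper's reported 6.4x to 498.4x parameter reduction, though this hard-tree sketch gives no guarantee about how much accuracy is retained.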

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Lee, Sael (이슬)
Department of Software and Computer Engineering

File Download

  • There are no files associated with this item.