Citation Export
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Ishaq, Ahmad | - |
| dc.contributor.author | Ullah, Fath U. Min | - |
| dc.contributor.author | Hamandawana, Prince | - |
| dc.contributor.author | Cho, Da Jung | - |
| dc.contributor.author | Chung, Tae Sun | - |
| dc.date.issued | 2025-02-01 | - |
| dc.identifier.issn | 2079-9292 | - |
| dc.identifier.uri | https://aurora.ajou.ac.kr/handle/2018.oak/38517 | - |
| dc.identifier.uri | https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85218879948&origin=inward | - |
| dc.description.abstract | Accurate detection and diagnosis of brain tumors at early stages are critical for effective treatment. While numerous methods have been developed for tumor detection and classification, many rely on traditional techniques, often resulting in suboptimal performance. In contrast, AI-based deep learning techniques have shown promising results, consistently achieving high accuracy across various tumor types while maintaining model interpretability. Inspired by these advancements, this paper introduces an improved variant of EfficientNet for multi-grade brain tumor detection and classification, addressing the gap between performance and explainability. Our approach extends the capabilities of EfficientNet to classify four tumor types: glioma, meningioma, pituitary tumor, and non-tumor. For enhanced explainability, we incorporate gradient-weighted class activation mapping (Grad-CAM) to improve model interpretability. The input MRI images undergo data augmentation before being passed through the feature extraction phase, where the underlying tumor patterns are learned. Our model achieves an average accuracy of 98.6%, surpassing other state-of-the-art methods on standard datasets while maintaining a substantially reduced parameter count. Furthermore, the explainable AI (XAI) analysis demonstrates the model's ability to focus on relevant tumor regions, enhancing its interpretability. This accurate and interpretable model for brain tumor classification has the potential to significantly aid clinical decision-making in neuro-oncology. | - |
| dc.description.sponsorship | This work was supported by the Institute of Information and Communications Technology Planning and Evaluation (IITP) under the Artificial Intelligence Convergence Innovation Human Resources Development (IITP-2024-RS-2023-00255968) grant and the ITRC (Information Technology Research Center) support program (IITP-2021-0-02051) funded by the Korean government (MSIT). Additionally, this work was supported by the BK21 FOUR program of the National Research Foundation of Korea funded by the Ministry of Education (NRF5199991014091). | - |
| dc.language.iso | eng | - |
| dc.publisher | Multidisciplinary Digital Publishing Institute (MDPI) | - |
| dc.title | Improved EfficientNet Architecture for Multi-Grade Brain Tumor Detection | - |
| dc.type | Article | - |
| dc.citation.number | 4 | - |
| dc.citation.title | Electronics (Switzerland) | - |
| dc.citation.volume | 14 | - |
| dc.identifier.bibliographicCitation | Electronics (Switzerland), Vol.14 No.4 | - |
| dc.identifier.doi | 10.3390/electronics14040710 | - |
| dc.identifier.scopusid | 2-s2.0-85218879948 | - |
| dc.identifier.url | www.mdpi.com/journal/electronics | - |
| dc.subject.keyword | brain cancer | - |
| dc.subject.keyword | deep learning | - |
| dc.subject.keyword | image classification | - |
| dc.subject.keyword | medical imaging | - |
| dc.subject.keyword | medical informatics | - |
| dc.subject.keyword | model tuning | - |
| dc.subject.keyword | transfer learning | - |
| dc.type.other | Article | - |
| dc.identifier.pissn | 20799292 | - |
| dc.description.isoa | true | - |
| dc.subject.subarea | Control and Systems Engineering | - |
| dc.subject.subarea | Signal Processing | - |
| dc.subject.subarea | Hardware and Architecture | - |
| dc.subject.subarea | Computer Networks and Communications | - |
| dc.subject.subarea | Electrical and Electronic Engineering | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
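The abstract describes a pipeline of an EfficientNet-based four-class MRI classifier combined with Grad-CAM for interpretability. The paper's improved EfficientNet variant is not reproduced here; as a rough illustration only, the following minimal sketch uses a stock torchvision EfficientNet-B0 backbone with a replaced four-class head and a simple hook-based Grad-CAM (all names and the dummy input are assumptions for the sketch, not the authors' code).

```python
# Minimal sketch, assuming a torchvision EfficientNet-B0 stand-in for the paper's
# improved EfficientNet variant; not the authors' implementation.
import torch
import torch.nn.functional as F
from torchvision import models

CLASSES = ["glioma", "meningioma", "pituitary", "non-tumor"]

# Pretrained backbone with the classification head replaced for four tumor classes.
model = models.efficientnet_b0(weights=models.EfficientNet_B0_Weights.DEFAULT)
model.classifier[1] = torch.nn.Linear(model.classifier[1].in_features, len(CLASSES))
model.eval()

# Capture activations and gradients of the last convolutional stage for Grad-CAM.
activations, gradients = {}, {}

def save_activation(module, inp, out):
    activations["value"] = out
    out.register_hook(lambda grad: gradients.update(value=grad))

model.features[-1].register_forward_hook(save_activation)

def grad_cam(image):
    """Return class probabilities and a Grad-CAM heatmap for the top class.

    `image` is a (1, 3, H, W) tensor normalized like the ImageNet pretraining data.
    """
    logits = model(image)
    probs = F.softmax(logits, dim=1)
    top_class = probs.argmax(dim=1)

    model.zero_grad()
    logits[0, top_class].backward()

    # Grad-CAM: weight each feature map by its average gradient, then ReLU and rescale.
    acts = activations["value"]      # (1, C, h, w)
    grads = gradients["value"]       # (1, C, h, w)
    weights = grads.mean(dim=(2, 3), keepdim=True)
    cam = F.relu((weights * acts).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear", align_corners=False)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
    return probs.detach(), cam.squeeze().detach()

if __name__ == "__main__":
    dummy_mri = torch.randn(1, 3, 224, 224)  # placeholder for a preprocessed MRI slice
    probs, heatmap = grad_cam(dummy_mri)
    print(dict(zip(CLASSES, probs[0].tolist())), heatmap.shape)
```

The sketch omits the data augmentation and training steps mentioned in the abstract; the heatmap highlights the image regions that most influenced the predicted class, which is the explainability behavior the paper evaluates.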