| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | Tae-Sun Chung | - |
| dc.contributor.author | HUANG HAIRUI | - |
| dc.date.issued | 2024-02 | - |
| dc.identifier.other | 33549 | - |
| dc.identifier.uri | https://aurora.ajou.ac.kr/handle/2018.oak/38823 | - |
| dc.description | Thesis (Master's) -- Department of Artificial Intelligence, 2024. 2 | - |
| dc.description.abstract | News is an important way for people to obtain information. In the open news environment, news types are increasingly diverse and the volume of news is enormous, which leads to problems such as information overload and redundancy. The open-domain event extraction task aims to identify and extract various types of event information from the given text, and it is usually approached with methods based on pre-training or neural topic modeling. However, existing methods have several problems. First, existing pre-trained models suffer from insufficient feature-vector extraction and excessively high embedding dimensions. Second, existing methods are not semantically rich enough and lack syntactic structural information, resulting in poorly readable results and insufficient extraction accuracy. To address these issues, this thesis first improves the open-domain event extraction method based on a BERT-based neural topic model, and then dynamically integrates semantic and syntactic dependency information to obtain rich semantic and syntactic features, further improving model performance. The main research is as follows: an improved neural topic modeling method based on BERT is proposed. First, BERT is used in the encoding layer for pre-training to obtain contextual representations of the feature sequences. Second, the UMAP dimensionality reduction method is applied to capture richer local and global information, and the joint distribution of variables is combined with a deep latent-variable probabilistic graphical model to further optimize the parameter inference and learning process. Finally, a self-attention mechanism is introduced to assign weights to different nodes, reducing the influence of noisy data so that the model attends to the most critical features and further improving the performance of the open-domain event extraction model. Keywords: Event extraction, Open-domain event extraction, Neural topic model. | - |
| dc.description.tableofcontents | I. Introduction 1 <br>II. Background 4 <br>III. Related work 7 <br>3.1 Dimensionality reduction methods 7 <br>3.2 Probabilistic graphical models 9 <br>3.3 Inference Methods 11 <br>3.4 Attention Mechanism 11 <br>3.5 Graph Neural Networks 12 <br>IV. Proposed Method 14 <br>4.1 Problem Analysis 14 <br>4.2 Model Architecture 16 <br>4.2.1 Embedding Layer 18 <br>4.2.2 Dimensionality Reduction Layer 19 <br>4.2.3 Model Generation 20 <br>4.2.4 Inference and Parameter Learning Layer 22 <br>V. Experiment 24 <br>5.1 Dataset 24 <br>5.2 Evaluation Metrics 24 <br>5.3 Experimental results and analysis 25 <br>5.3.1 Main experiment results and analysis 25 <br>5.3.2 Ablation experiment results and analysis 27 <br>VI. Conclusion 28 | - |
| dc.language.iso | eng | - |
| dc.publisher | The Graduate School, Ajou University | - |
| dc.rights | Ajou University theses are protected by copyright. | - |
| dc.title | Open-Domain News Event Extraction Method Based on BERT | - |
| dc.type | Thesis | - |
| dc.contributor.affiliation | Graduate School, Ajou University | - |
| dc.contributor.department | Department of Artificial Intelligence, Graduate School | - |
| dc.date.awarded | 2024-02 | - |
| dc.description.degree | Master | - |
| dc.identifier.url | https://dcoll.ajou.ac.kr/dcollection/common/orgView/000000033549 | - |
| dc.subject.keyword | Event extraction | - |
| dc.subject.keyword | Neural topic model | - |
| dc.subject.keyword | Open-domain event extraction | - |
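The abstract above describes a three-stage pipeline: BERT contextual encoding, UMAP dimensionality reduction, and self-attention weighting of nodes. The following is a minimal sketch of how such a pipeline might be wired up in Python. The model name `bert-base-uncased`, the mean-pooling strategy, and the `SelfAttentionWeighting` module are illustrative assumptions, not the thesis's actual implementation; the deep latent-variable probabilistic graphical model used for topic inference is omitted here.

```python
# Minimal sketch (assumptions noted above): BERT encoding -> UMAP reduction
# -> self-attention weighting. Not the thesis's actual code.
import torch
import umap
from transformers import AutoModel, AutoTokenizer


def encode_with_bert(texts, model_name="bert-base-uncased"):
    """Return one contextual vector per document (mean-pooled BERT output)."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    with torch.no_grad():
        batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
        hidden = model(**batch).last_hidden_state             # (B, T, 768)
        mask = batch["attention_mask"].unsqueeze(-1).float()  # (B, T, 1)
        pooled = (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # mean over real tokens
    return pooled                                              # (B, 768)


def reduce_with_umap(embeddings, n_components=10, n_neighbors=15):
    """Project BERT vectors to a lower dimension while preserving local and
    global neighborhood structure."""
    reducer = umap.UMAP(n_components=n_components, n_neighbors=n_neighbors,
                        metric="cosine")
    return torch.tensor(reducer.fit_transform(embeddings.numpy()),
                        dtype=torch.float32)


class SelfAttentionWeighting(torch.nn.Module):
    """Assign attention weights across document nodes so that noisier nodes
    contribute less to the final representation."""

    def __init__(self, dim):
        super().__init__()
        self.attn = torch.nn.MultiheadAttention(embed_dim=dim, num_heads=1,
                                                batch_first=True)

    def forward(self, x):               # x: (N, D) -- N document nodes
        seq = x.unsqueeze(0)            # treat the N nodes as one sequence
        out, weights = self.attn(seq, seq, seq)
        return out.squeeze(0), weights.squeeze(0)  # (N, D), (N, N)
```

In use, the three stages would be chained (encode, reduce, attend) and the weighted representations passed to the topic model's inference network; `n_components` and `n_neighbors` control how much local versus global structure UMAP preserves.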