Automatically extracting causal relations from text is a challenging task in Natural Language Processing (NLP). Most existing methods focus on intra-sentence or explicit causality, neglecting causal relations that are expressed implicitly or span sentence boundaries. In this paper, we propose the Cascaded multi-Structure Neural Network (CSNN), a novel and unified model that extracts inter-sentence and implicit causal relations from a Chinese corpus without relying on external knowledge. The model employs a Convolutional Neural Network (CNN) to capture important features and causal structural patterns, while a self-attention mechanism mines the semantic relations between different features. The outputs of the CNN and self-attention structures are concatenated to form higher-level phrase representations. A Conditional Random Field (CRF) layer then assigns a label to each word in sentences containing inter-sentence or implicit causal relations, which improves the performance of inter-sentence and implicit causality extraction. Experimental results show that the proposed model achieves state-of-the-art results on three datasets when compared with other methods.
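The pipeline the abstract describes (CNN features and self-attention features concatenated, then CRF decoding over per-word labels) can be sketched as below. This is a minimal, hypothetical NumPy illustration: the layer sizes, random weights, width-3 convolution, and tag count are assumptions for demonstration, not the paper's actual CSNN configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
seq_len, emb_dim, n_filters, n_tags = 6, 8, 4, 3   # illustrative sizes
X = rng.standard_normal((seq_len, emb_dim))        # word embeddings (assumed given)

# 1. CNN branch: width-3 convolution with zero padding, ReLU activation
W = rng.standard_normal((3, emb_dim, n_filters))
pad = np.vstack([np.zeros((1, emb_dim)), X, np.zeros((1, emb_dim))])
conv = np.stack([np.einsum('kd,kdf->f', pad[i:i + 3], W) for i in range(seq_len)])
conv = np.maximum(conv, 0)                         # (seq_len, n_filters)

# 2. Self-attention branch: scaled dot-product attention over the sequence
att = softmax(X @ X.T / np.sqrt(emb_dim)) @ X      # (seq_len, emb_dim)

# 3. Concatenate both branches into higher-level phrase representations
H = np.concatenate([conv, att], axis=1)            # (seq_len, n_filters + emb_dim)

# 4. Project to per-tag emission scores, then decode with CRF (Viterbi)
P = H @ rng.standard_normal((H.shape[1], n_tags))  # emission scores
T = rng.standard_normal((n_tags, n_tags))          # learned transition scores (random here)

def viterbi(emissions, transitions):
    """Return the highest-scoring tag sequence under a linear-chain CRF."""
    n, k = emissions.shape
    score = emissions[0].copy()
    back = np.zeros((n, k), dtype=int)
    for t in range(1, n):
        # cand[i, j] = best score ending in tag i at t-1, moving to tag j at t
        cand = score[:, None] + transitions + emissions[t]
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    path = [int(score.argmax())]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

tags = viterbi(P, T)
print(tags)  # one label per word, e.g. BIO-style causal-span tags
```

In the actual model the convolution filters, attention projections, and CRF transition matrix would be trained jointly; here they are random, so the decoded tags only demonstrate the data flow and the Viterbi step.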