Concept
Transition-based Neural RST Parsing with Implicit Syntax Features
This paper proposes an encoder-decoder structure, where the encoder represents the input span of words and EDUs using a hierarchical biLSTM. There are two levels of sequences: the first level is the words contained in each EDU, and the second level is the sequence of EDUs.
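The two-level encoding can be sketched as follows. This is a minimal pure-Python illustration, not the paper's implementation: the `LSTMCell` class, toy dimensions, and random weights are all assumptions made for clarity. A word-level biLSTM turns each EDU's word vectors into a fixed-size EDU representation (concatenated final forward/backward states), and an EDU-level biLSTM then encodes the sequence of EDU representations.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class LSTMCell:
    """Toy LSTM cell; each gate has weights over [input | prev hidden | bias]."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = random.Random(seed)
        def mat(rows, cols):
            return [[rng.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]
        cols = input_size + hidden_size + 1
        self.Wi, self.Wf, self.Wo, self.Wc = (mat(hidden_size, cols) for _ in range(4))
        self.hidden_size = hidden_size

    def step(self, x, h, c):
        z = x + h + [1.0]  # concatenate input, previous hidden state, bias term
        def lin(W, row):
            return sum(w * v for w, v in zip(W[row], z))
        h_new, c_new = [], []
        for j in range(self.hidden_size):
            i = sigmoid(lin(self.Wi, j))   # input gate
            f = sigmoid(lin(self.Wf, j))   # forget gate
            o = sigmoid(lin(self.Wo, j))   # output gate
            g = math.tanh(lin(self.Wc, j)) # candidate cell state
            cj = f * c[j] + i * g
            c_new.append(cj)
            h_new.append(o * math.tanh(cj))
        return h_new, c_new

def bilstm_encode(seq, cell_fwd, cell_bwd):
    """Run a biLSTM over seq; return concat of final forward/backward states."""
    H = cell_fwd.hidden_size
    h, c = [0.0] * H, [0.0] * H
    for x in seq:
        h, c = cell_fwd.step(x, h, c)
    h_fwd = h
    h, c = [0.0] * H, [0.0] * H
    for x in reversed(seq):
        h, c = cell_bwd.step(x, h, c)
    return h_fwd + h  # 2H-dimensional representation

# Hierarchical encoding: word-level biLSTM per EDU, then an EDU-level biLSTM.
EMB, H = 4, 3
word_fwd, word_bwd = LSTMCell(EMB, H, 1), LSTMCell(EMB, H, 2)
edu_fwd, edu_bwd = LSTMCell(2 * H, H, 3), LSTMCell(2 * H, H, 4)

rng = random.Random(42)  # three EDUs of 5, 3, and 7 random word embeddings
edus = [[[rng.uniform(-1, 1) for _ in range(EMB)] for _ in range(n)] for n in (5, 3, 7)]

edu_reprs = [bilstm_encode(edu, word_fwd, word_bwd) for edu in edus]  # level 1: words -> EDU
span_repr = bilstm_encode(edu_reprs, edu_fwd, edu_bwd)                # level 2: EDUs -> span
print(len(edu_reprs), len(edu_reprs[0]), len(span_repr))  # → 3 6 6
```

Each EDU, regardless of its word count, maps to a 2H-dimensional vector, which is what lets the second-level biLSTM treat the EDU sequence uniformly.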
Updated 2022-04-13
Tags
Data Science
Related
EDU Segmentation
Transition-based Neural RST Parsing with Implicit Syntax Features
RST-Pareval Metrics
Transition-based Neural RST Parsing with Implicit Syntax Features (Reference)