Deep Learning Approaches to Text Production. Shashi Narayan


Synthesis Lectures on Human Language Technologies


Gabrielle, and Caroline.

       – Claire

       Contents

       List of Figures

       List of Tables

       Preface

       1 Introduction

       1.1 What is Text Production?

       1.1.1 Generating Text from Meaning Representations

       1.1.2 Generating Text from Data

       1.1.3 Generating Text from Text

       1.2 Roadmap

       1.3 What’s Not Covered?

       1.4 Our Notations

       PART I Basics

       2 Pre-Neural Approaches

       2.1 Data-to-Text Generation

       2.2 Meaning Representations-to-Text Generation

       2.2.1 Grammar-Centric Approaches

       2.2.2 Statistical MR-to-Text Generation

       2.3 Text-to-Text Generation

       2.3.1 Sentence Simplification and Sentence Compression

       2.3.2 Document Summarisation

       2.4 Summary

       3 Deep Learning Frameworks

       3.1 Basics

       3.1.1 Convolutional Neural Networks

       3.1.2 Recurrent Neural Networks

       3.1.3 LSTMs and GRUs

       3.1.4 Word Embeddings

       3.2 The Encoder-Decoder Framework

       3.2.1 Learning Input Representations with Bidirectional RNNs

       3.2.2 Generating Text Using Recurrent Neural Networks

       3.2.3 Training and Decoding with Sequential Generators

       3.3 Differences with Pre-Neural Text-Production Approaches

       3.4 Summary

       PART II Neural Improvements

       4 Generating Better Text

       4.1 Attention

       4.2 Copy

       4.3 Coverage

       4.4 Summary

       5 Building Better Input Representations

       5.1 Pitfalls of Modelling Input as a Sequence of Tokens

       5.1.1 Modelling Long Text as a Sequence of Tokens

       5.1.2 Modelling Graphs or Trees as a Sequence of Tokens

       5.1.3 Limitations of Sequential Representation Learning

       5.2 Modelling Text Structures

       5.2.1 Modelling Documents with Hierarchical LSTMs

       5.2.2 Modelling Documents with Ensemble Encoders

       5.2.3 Modelling Documents with Convolutional Sentence Encoders

       5.3 Modelling Graph Structure

       5.3.1 Graph-to-Sequence Model for AMR Generation
