Legal Text Summarization Using Transformer Models

Date:

This talk presents our work on a transformer-based encoder-decoder architecture for abstractive legal text summarization. It combines PEGASUS's pre-training objective (Zhang et al., 2020) with Longformer's dilated sliding-window attention (Beltagy et al., 2020) to build a model that can handle extremely long input sequences when generating summaries of legal documents. The model achieves state-of-the-art summarization performance on the BIGPATENT dataset.

Slides: https://github.com/nimashoghi/dl-summarization
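As a rough illustration of the long-document summarization setup described above, the sketch below uses Hugging Face's Longformer Encoder-Decoder (LED) as a stand-in; the talk's actual model (PEGASUS pre-training plus Longformer attention) is not assumed to be published as a checkpoint, and the `allenai/led-base-16384` checkpoint here is only an analogous long-input summarizer chosen for illustration.

```python
# Minimal sketch (not the talk's code): summarizing a long document with a
# Longformer-style encoder-decoder via Hugging Face transformers.
from transformers import LEDForConditionalGeneration, LEDTokenizer

# Assumed checkpoint for illustration; supports inputs up to 16,384 tokens.
tokenizer = LEDTokenizer.from_pretrained("allenai/led-base-16384")
model = LEDForConditionalGeneration.from_pretrained("allenai/led-base-16384")

long_document = "..."  # a legal document, potentially thousands of tokens

inputs = tokenizer(
    long_document, max_length=16384, truncation=True, return_tensors="pt"
)

# Longformer-style attention mixes local (windowed) attention with a few
# global tokens; marking the first token as global is a common choice.
global_attention_mask = inputs.input_ids.new_zeros(inputs.input_ids.shape)
global_attention_mask[:, 0] = 1

summary_ids = model.generate(
    inputs.input_ids,
    attention_mask=inputs.attention_mask,
    global_attention_mask=global_attention_mask,
    max_length=256,
    num_beams=4,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

The sparse local-plus-global attention pattern is what keeps memory roughly linear in input length, which is what makes encoding full-length legal documents feasible in the first place.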