A rich set of examples is included to demonstrate the use of Texar. The implementations of cutting-edge models and algorithms also serve as references for reproducibility and comparison.

More examples are continuously added…

Examples by Models/Algorithms

RNN / Seq2seq

Transformer (Self-attention)

  • transformer: Transformer for machine translation
  • bert: Pre-trained BERT model for text representation
  • gpt-2: Pre-trained OpenAI GPT-2 language model
  • vae_text: VAE with a transformer decoder for improved language modeling
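To illustrate the self-attention mechanism underlying these models, here is a minimal NumPy sketch of scaled dot-product attention. This is an illustrative sketch only, not the Texar implementation; all function names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Scores are query-key dot products, scaled by sqrt(d_k)
    # to keep the softmax in a well-conditioned range.
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query positions, dim 8
K = rng.normal(size=(6, 8))   # 6 key/value positions
V = rng.normal(size=(6, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value vectors, weighted by query-key similarity; stacking several such heads and layers yields the Transformer architecture used in the examples above.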

Variational Autoencoder (VAE)

GANs / Discriminator-supervision

Reinforcement Learning

  • seq2seq_rl: Attentional seq2seq trained with policy gradient
  • seqGAN: Policy gradient for sequence generation
  • rl_gym: Various RL algorithms for games on OpenAI Gym
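For a sense of how policy-gradient training of sequence generation works in examples like seq2seq_rl and seqGAN, here is a toy REINFORCE sketch in NumPy. It is illustrative only; the function name and the toy matching reward are hypothetical, not the API of these examples.

```python
import numpy as np

def reinforce_step(logits, target, lr=0.5, rng=None):
    """One REINFORCE update for a toy sequence-generation policy.

    logits: (seq_len, vocab) unnormalized per-position token scores.
    Reward is the number of sampled tokens that match `target`;
    the score-function gradient of the sampled log-prob at each
    position is one_hot(sample) - probs.
    """
    if rng is None:
        rng = np.random.default_rng()
    probs = np.exp(logits - logits.max(-1, keepdims=True))
    probs /= probs.sum(-1, keepdims=True)
    # Sample one token per position from the current policy
    sample = np.array([rng.choice(len(p), p=p) for p in probs])
    reward = float((sample == target).sum())
    grad = -probs
    grad[np.arange(len(sample)), sample] += 1.0
    # Gradient ascent on expected reward (no baseline, for brevity)
    return logits + lr * reward * grad, reward

rng = np.random.default_rng(0)
logits = np.zeros((5, 4))              # 5 positions, vocab of 4
target = np.array([1, 3, 0, 2, 1])
for _ in range(200):
    logits, r = reinforce_step(logits, target, rng=rng)
```

In practice a learned baseline or discriminator score (as in SeqGAN) replaces the toy reward to reduce variance.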

Memory Network

Classifier / Sequence Prediction

Reward Augmented Maximum Likelihood (RAML)

Examples by Tasks

Language Modeling

Machine Translation

Dialog

  • hierarchical_dialog: Hierarchical recurrent encoder-decoder model for conversation response generation

Text Summarization

  • seq2seq_exposure_bias: Various algorithms tackling exposure bias in sequence generation (MT and summarization as examples)
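One classic remedy for exposure bias is scheduled sampling, which mixes gold tokens with the model's own previous predictions as decoder inputs during training. Here is a minimal sketch of the mixing step; the helper name is hypothetical and not an API from this repository.

```python
import random

def scheduled_sampling_inputs(gold, model_preds, epsilon, rng=random):
    """Choose decoder inputs token by token.

    With probability epsilon feed the gold token (teacher forcing);
    otherwise feed the model's own previous prediction, exposing the
    decoder at training time to the errors it will see at inference.
    epsilon is typically annealed from 1.0 toward 0.0 over training.
    """
    return [g if rng.random() < epsilon else p
            for g, p in zip(gold, model_preds)]
```

At epsilon = 1.0 this reduces to standard teacher forcing; at epsilon = 0.0 the decoder is conditioned entirely on its own samples, matching inference-time behavior.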

Text Style Transfer


Sequence Tagging

Games

  • rl_gym: Various RL algorithms for games on OpenAI Gym


Distributed Training