
Text generation with a Variational Autoencoder - GitHub Pages
In this post, I'm going to implement a text Variational Autoencoder (VAE) in Keras, inspired by the paper "Generating Sentences from a Continuous Space". First, I'll briefly introduce generative models and the VAE, its characteristics and its advantages; then I'll show the code to implement the text VAE in Keras, and finally I will …
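The architecture such a post describes fits in a few dozen lines of Keras. Below is a minimal sketch of a sentence VAE, assuming tf.keras (TF 2.x); the vocabulary size, sequence length, and layer widths are illustrative assumptions, and the decoder omits teacher forcing for brevity:

```python
import tensorflow as tf
from tensorflow.keras import layers

vocab_size, max_len, emb_dim, latent_dim = 10000, 30, 128, 32  # illustrative sizes

# Encoder: token ids -> LSTM summary -> Gaussian posterior parameters.
enc_in = layers.Input(shape=(max_len,), dtype="int32")
h = layers.Embedding(vocab_size, emb_dim)(enc_in)
h = layers.LSTM(256)(h)
z_mean = layers.Dense(latent_dim)(h)
z_log_var = layers.Dense(latent_dim)(h)

def sample(args):
    mu, log_var = args
    eps = tf.random.normal(tf.shape(mu))
    return mu + tf.exp(0.5 * log_var) * eps  # reparameterization trick

z = layers.Lambda(sample)([z_mean, z_log_var])

# Decoder: broadcast z over time and predict a token at each position.
d = layers.RepeatVector(max_len)(z)
d = layers.LSTM(256, return_sequences=True)(d)
out = layers.TimeDistributed(layers.Dense(vocab_size, activation="softmax"))(d)

vae = tf.keras.Model(enc_in, out)
kl = -0.5 * tf.reduce_mean(
    tf.reduce_sum(1.0 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var), axis=-1))
vae.add_loss(kl)  # KL divergence to the standard normal prior
vae.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

The `add_loss` call attaches the KL term directly, so `compile` only needs the token-level reconstruction loss.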
Topic-Guided Variational Autoencoders for Text Generation
Mar 17, 2019 · We propose a topic-guided variational autoencoder (TGVAE) model for text generation. Distinct from existing variational autoencoder (VAE) based approaches, which assume a simple Gaussian prior for the latent code, our model specifies the prior as a Gaussian mixture model (GMM) parametrized by a neural topic module. Experimental results show that TGVAE outperforms alternative approaches on both unconditional and conditional text generation, and can generate semantically meaningful sentences with various topics.
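To make the mixture prior concrete, here is a hedged sketch of how sampling from a topic-parametrized GMM prior could look. The parameters below are random placeholders standing in for the outputs of the neural topic module, not the paper's code:

```python
import numpy as np

n_topics, latent_dim = 10, 32

# Placeholders for what the neural topic module would produce (assumptions):
topic_probs = np.random.dirichlet(np.ones(n_topics))      # mixture weights
topic_means = np.random.randn(n_topics, latent_dim)       # per-topic component means
topic_log_vars = np.zeros((n_topics, latent_dim))         # per-topic log-variances

def sample_gmm_prior():
    t = np.random.choice(n_topics, p=topic_probs)          # pick a topic component
    std = np.exp(0.5 * topic_log_vars[t])
    return topic_means[t] + std * np.random.randn(latent_dim)

z = sample_gmm_prior()  # decode z to get a sentence flavored by the chosen topic
```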
A Hybrid Convolutional Variational Autoencoder for Text Generation
Feb 8, 2017 · In this paper we explore the effect of architectural choices on learning a Variational Autoencoder (VAE) for text generation. In contrast to the previously introduced VAE model for text, where both the encoder and decoder are RNNs, we propose a novel hybrid architecture that blends fully feed-forward convolutional and deconvolutional components …
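As a rough illustration of the hybrid idea, the following sketch pairs a feed-forward convolutional encoder with a deconvolutional decoder. It shows only the deterministic skeleton (the sampling layer and KL term would be the same as in the first sketch above), and all names and sizes are assumptions:

```python
import tensorflow as tf
from tensorflow.keras import layers

vocab_size, max_len, emb_dim, latent_dim = 10000, 32, 128, 32  # illustrative sizes

# Convolutional encoder: embeddings -> strided Conv1D stack -> latent code.
enc_in = layers.Input(shape=(max_len,), dtype="int32")
x = layers.Embedding(vocab_size, emb_dim)(enc_in)
x = layers.Conv1D(256, 5, strides=2, padding="same", activation="relu")(x)
x = layers.Conv1D(512, 5, strides=2, padding="same", activation="relu")(x)
x = layers.Flatten()(x)
z = layers.Dense(latent_dim)(x)  # stands in for the sampled latent

# Deconvolutional decoder: mirror the encoder back up to token logits.
d = layers.Dense((max_len // 4) * 512, activation="relu")(z)
d = layers.Reshape((max_len // 4, 512))(d)
d = layers.Conv1DTranspose(256, 5, strides=2, padding="same", activation="relu")(d)
d = layers.Conv1DTranspose(emb_dim, 5, strides=2, padding="same", activation="relu")(d)
logits = layers.Dense(vocab_size)(d)

hybrid_ae = tf.keras.Model(enc_in, logits)
```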
Variational Auto-Encoder for text generation - IEEE Xplore
Recurrent neural network language models (RNNLM) are powerful and scalable for text generation in unsupervised generative modeling. We extend the RNNLM and propose the Variational Auto-Encoder Recurrent Neural Network (VAE-RNNLM), which is designed to explicitly capture global features as a continuous latent variable.
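The key difference from a plain RNNLM is that the decoder sees a global latent variable at every time step. Here is a minimal sketch of that conditioning path in tf.keras, with illustrative shapes (an assumption-based sketch, not the authors' implementation):

```python
import tensorflow as tf
from tensorflow.keras import layers

vocab_size, max_len, emb_dim, latent_dim = 10000, 30, 128, 32  # illustrative sizes

tokens = layers.Input(shape=(max_len,), dtype="int32")  # previous words (teacher forcing)
z_in = layers.Input(shape=(latent_dim,))                # sentence-level latent variable

e = layers.Embedding(vocab_size, emb_dim)(tokens)
z_rep = layers.RepeatVector(max_len)(z_in)              # broadcast z to every step
h = layers.Concatenate()([e, z_rep])                    # word embedding ++ global latent
h = layers.LSTM(256, return_sequences=True)(h)
next_word = layers.TimeDistributed(layers.Dense(vocab_size, activation="softmax"))(h)

decoder = tf.keras.Model([tokens, z_in], next_word)
```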
Text generation with a Variational Autoencoder - GitHub
Text generation with a Variational Autoencoder in Keras. In this tutorial we will try to generate text with a variational autoencoder and interpolate between sentences, as in the paper "Generating Sentences from a Continuous Space" (https://arxiv.org/abs/1511.06349).
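The interpolation trick itself is independent of the model: encode two sentences, walk the straight line between their latent codes, and decode each point. A small sketch, where `encoder` and `decode_to_text` are hypothetical helpers standing in for the tutorial's model:

```python
import numpy as np

def interpolate(z_a, z_b, steps=8):
    """Yield latent codes on the straight line from z_a to z_b."""
    for alpha in np.linspace(0.0, 1.0, steps):
        yield (1.0 - alpha) * z_a + alpha * z_b

# Hypothetical usage with the tutorial's model (helper names are placeholders):
# z_a, z_b = encoder.predict(sentence_a), encoder.predict(sentence_b)
# for z in interpolate(z_a, z_b):
#     print(decode_to_text(decoder.predict(z[None, :])))
```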
Hierarchically-Structured Variational Autoencoders for Long Text Generation
Sep 27, 2018 · In this paper, we propose a novel framework, hierarchically-structured variational autoencoder (hier-VAE), for generating long and coherent units of text. To enhance the model’s plan-ahead ability, intermediate sentence representations are introduced into the generative networks to guide the word-level predictions.
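A hedged sketch of the hierarchical conditioning path: a sentence-level RNN expands the document latent into one plan vector per sentence, and a word-level RNN decodes each sentence from its plan. Teacher-forced inputs are omitted, and all names and sizes are illustrative assumptions:

```python
import tensorflow as tf
from tensorflow.keras import layers

n_sents, sent_len, vocab_size, latent_dim = 5, 20, 10000, 64  # illustrative sizes

z = layers.Input(shape=(latent_dim,))                   # paragraph-level latent code
plans = layers.RepeatVector(n_sents)(z)
plans = layers.LSTM(256, return_sequences=True)(plans)  # one plan vector per sentence

# Word-level decoder conditioned on each sentence plan.
h = layers.TimeDistributed(layers.RepeatVector(sent_len))(plans)   # (batch, n_sents, sent_len, 256)
h = layers.TimeDistributed(layers.LSTM(256, return_sequences=True))(h)
logits = layers.Dense(vocab_size)(h)                    # per-word vocabulary logits

hier_decoder = tf.keras.Model(z, logits)
```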
μ-Forcing: Training Variational Recurrent Autoencoders for Text Generation
Jul 13, 2019 · It has been previously observed that training Variational Recurrent Autoencoders (VRAE) for text generation suffers from a serious problem of uninformative latent variables. The model collapses into a plain language model that ignores the latent variables entirely and can only generate repetitive, dull samples.
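Two generic remedies for this collapse are KL annealing and free bits; note these are common baselines, not the paper's μ-forcing objective. A minimal sketch:

```python
import tensorflow as tf

def kl_term(z_mean, z_log_var, step, warmup_steps=10000.0, free_bits=0.5):
    """KL between N(z_mean, exp(z_log_var)) and N(0, I), with two collapse fixes."""
    kl_per_dim = -0.5 * (1.0 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var))
    # Free bits: stop penalizing a dimension once its KL falls below a small floor,
    # so the optimizer cannot profit from driving it all the way to zero.
    kl = tf.reduce_sum(tf.maximum(kl_per_dim, free_bits), axis=-1)
    # KL annealing: ramp the KL weight from 0 to 1 over the warm-up period, so the
    # decoder learns to use z before the prior-matching pressure kicks in.
    weight = tf.minimum(1.0, tf.cast(step, tf.float32) / warmup_steps)
    return weight * tf.reduce_mean(kl)
```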