A variational autoencoder assumes that the observed data are generated by a latent, unobserved random variable, and learns an approximation to the distribution of that variable. This function builds a wrapper for a variational autoencoder that uses a Gaussian distribution as the prior over the latent space.

autoencoder_variational(network, loss = "binary_crossentropy",
  auto_transform_network = TRUE)

Arguments

network

Network architecture as a "ruta_network" object (or an object coercible to one)

loss

Reconstruction error (such as "binary_crossentropy") to be combined with the Kullback-Leibler divergence in order to compute the variational loss; a conceptual sketch of this combination follows the argument list

auto_transform_network

Boolean: if the network contains no variational block, should its encoding layer be converted into one? An additional illustrative sketch appears at the end of the Examples section.
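
The variational loss optimized by the resulting autoencoder adds the Kullback-Leibler divergence between the learned latent posterior and the Gaussian prior to the chosen reconstruction error. The snippet below is a minimal conceptual sketch of that combination for a diagonal Gaussian posterior; it is not ruta's internal implementation, and the function name is illustrative only.

# Conceptual sketch only (not ruta's internal code): per-sample variational loss
# for a diagonal Gaussian posterior N(z_mean, exp(z_log_var)) against a standard
# Gaussian prior, added to the chosen reconstruction error.
variational_loss_sketch <- function(reconstruction_error, z_mean, z_log_var) {
  kl_divergence <- -0.5 * sum(1 + z_log_var - z_mean^2 - exp(z_log_var))
  reconstruction_error + kl_divergence
}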

Value

A construct of class "ruta_autoencoder"

References

  • Auto-Encoding Variational Bayes

  • Under the Hood of the Variational Autoencoder (in Prose and Code)

  • Keras example: Variational autoencoder

Examples

# Architecture with an explicit variational block as the encoding layer
network <- input() + dense(256, "elu") + variational_block(3) +
  dense(256, "elu") + output("sigmoid")

# Wrap the architecture as a variational autoencoder
learner <- autoencoder_variational(network, loss = "binary_crossentropy")
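
# The following is an additional sketch, not taken from the package documentation.
# With auto_transform_network = TRUE (the default), a network whose encoding layer
# is a plain dense layer is expected to be converted into a variational block
# automatically.
plain_network <- input() + dense(256, "elu") + dense(3) +
  dense(256, "elu") + output("sigmoid")
learner2 <- autoencoder_variational(plain_network, loss = "binary_crossentropy")

# Hypothetical usage: train() is ruta's generic for fitting an autoencoder to a
# numeric matrix; x_train and the epochs value below are placeholders.
# model <- train(learner2, x_train, epochs = 20)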