23 May 2024
Dive into the mechanics of normalizing flows, powerful probabilistic generative models that leverage bijective transformations to map latent distributions to data distributions. This post covers the theory, loss functions, and various types of flows including linear, coupling, autoregressive, residual, and multi-scale flows, highlighting their unique properties.
Generative adversarial networks
20 May 2024
Explore the family of generative models that use a discriminator as a training signal. This post covers how GANs work, delving into their loss functions, challenges in training, and advanced techniques such as the Wasserstein distance and conditional generation that enhance their performance and effectiveness.
15 April 2024
The Bayesian approach leverages the full distribution of model parameters, resulting in more robust predictions. This post explains the principles of Bayesian thinking, contrasts it with maximum likelihood estimation, and illustrates its application in linear regression, offering a refresher on matrix operations and theoretical basics.
15 March 2024
Loss functions are essential for training models, guiding the optimization process. This post explores how loss functions work from a probabilistic perspective, delves into maximum likelihood estimation, and connects this framework with methods such as least squares, KL-divergence, and cross-entropy.
14 March 2024
The complex nature of loss functions requires numerical methods to search through them. This post explains various optimization methods, including stochastic gradient descent, momentum, and adaptive moments like Adam. You'll gain insights into their mechanics, advantages, and how they address the challenges of training deep learning models.
13 March 2024
Understanding backpropagation is crucial, as it calculates the gradients needed to optimize model parameters. This post explains backpropagation, covering its theory, Jacobian matrices, and why proper initialization and residual connections are important for training deep neural networks effectively.