Probabilistic Frameworks in Machine Learning

I had some free time over the past couple of days, so I put together a summary of the probability-related material in machine learning.

Some recommended references:

1. Generative models:

(1) VAE: An Introduction to Variational Autoencoders; Variational Inference and Variational Autoencoders (VAE)

(2) GAN: A Review on Generative Adversarial Networks: Algorithms, Theory, and Applications

(3) Flow: Normalizing Flows: An Introduction and Review of Current Methods

(4) Autoregressive models: Introduction to Autoregressive Models
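
As a quick pointer for the generative-model references above: the objective the VAE papers build on is the evidence lower bound (ELBO). This is the standard textbook form, not specific to any one of the papers listed:

```latex
\log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right] \;-\; \mathrm{KL}\!\left(q_\phi(z \mid x) \,\|\, p(z)\right)
```

The first term rewards reconstruction of $x$ from the latent code $z$; the KL term keeps the approximate posterior $q_\phi$ close to the prior $p(z)$.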

2. Divergences:

(1) f-divergence: f-Divergences and Surrogate Loss Functions

(2) Wasserstein distance: [Math] Wasserstein Distance

(3) Stein discrepancy: Stein's Method for Practical Machine Learning

(4) Fisher divergence: Variational approximations using Fisher divergence

(5) Sliced methods for faster computation:

a. Fisher divergence: Sliced Score Matching: A Scalable Approach to Density and Score Estimation

b. Wasserstein distance: Generalized Sliced Wasserstein Distances
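
For orientation across the divergence references above, the two definitions that come up most often are the f-divergence and the Wasserstein-1 distance. These are the standard textbook forms (notation mine):

```latex
D_f(P \,\|\, Q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx
\qquad
W_1(P, Q) = \inf_{\gamma \in \Pi(P, Q)} \mathbb{E}_{(x, y) \sim \gamma}\!\left[\, \|x - y\| \,\right]
```

Here $f$ is convex with $f(1) = 0$ (choosing $f(t) = t \log t$ recovers the KL divergence), and $\Pi(P, Q)$ is the set of couplings with marginals $P$ and $Q$.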

3. BNNs and dropout:

(1) BNN: Bayesian Neural Networks

(2) Dropout: An Explanation of How Dropout Works in Deep Learning
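
The "Dropout as a Bayesian Approximation" paper listed further below motivates a trick worth sketching here: keep dropout active at test time and average several stochastic forward passes to get an approximate predictive mean and variance. Below is a minimal NumPy sketch with a hypothetical toy network; the weights, shapes, and dropout rate are illustrative choices of mine, not from any of the cited papers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer network with fixed random weights (for illustration only).
W1 = rng.normal(size=(1, 32))
W2 = rng.normal(size=(32, 1))

def forward(x, p_drop=0.5):
    """One stochastic forward pass with dropout on the hidden layer."""
    h = np.maximum(x @ W1, 0.0)          # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop  # Bernoulli dropout mask, kept at test time
    h = h * mask / (1.0 - p_drop)        # inverted-dropout scaling
    return h @ W2

x = np.array([[0.5]])
samples = np.stack([forward(x) for _ in range(200)])  # 200 Monte Carlo passes
pred_mean = samples.mean(axis=0)  # approximate predictive mean
pred_var = samples.var(axis=0)    # approximate predictive uncertainty
```

The spread of the stochastic predictions serves as an uncertainty estimate; a deterministic network (dropout disabled) would give `pred_var` of exactly zero.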

4. Bayesian meta-learning:

(1) Neural processes: Neural Processes

(2) Bayesian + MAML: Amortized Bayesian Meta-Learning; Bayesian Model-Agnostic Meta-Learning; Probabilistic Model-Agnostic Meta-Learning

5. Gaussian processes: Gaussian Processes in Machine Learning
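
The Gaussian-process reference above centers on closed-form posterior prediction. As a hedged illustration, here is a minimal NumPy sketch of GP regression with an RBF kernel; the function names and hyperparameters are my own choices, not from the cited tutorial:

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    sq = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq / length_scale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and covariance of GP regression with an RBF kernel."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)              # K^{-1} y
    mean = K_s.T @ alpha                             # posterior mean
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)     # posterior covariance
    return mean, cov

x_train = np.array([-1.0, 0.0, 1.0])
y_train = np.sin(x_train)
mean, cov = gp_posterior(x_train, y_train, np.array([0.0]))
```

At a training input the posterior mean stays close to the observation, and the diagonal of `cov` shrinks; far from the data it reverts to the prior variance.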

6. Representation learning:

(1) Information bottleneck: The Information Bottleneck Theory: Foundations and Applications

(2) Disentangled representation learning: [Disentangled Representation 1] InfoGAN and beta-VAE

(3) InfoMax and contrastive learning: Deep InfoMax: Learning good representations through mutual information maximization; A Simple Framework for Contrastive Learning of Visual Representations
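
The two contrastive-learning papers above both optimize variants of the InfoNCE objective. In the commonly quoted form (notation mine), with a similarity function $\mathrm{sim}$, temperature $\tau$, positive pair $(z_i, z_j)$, and the remaining samples in the batch as negatives:

```latex
\mathcal{L}_{\mathrm{InfoNCE}}
= -\,\mathbb{E}\!\left[ \log
\frac{\exp\!\left(\mathrm{sim}(z_i, z_j) / \tau\right)}
{\sum_{k \neq i} \exp\!\left(\mathrm{sim}(z_i, z_k) / \tau\right)} \right]
```

Minimizing this loss maximizes a lower bound on the mutual information between the two views, which is the link back to the InfoMax principle.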

7. Other topics:

(1) Information theory: Introduction to Information Theory

(2) Measure theory: sola's math notes

(3) MCMC: Markov Chain Monte Carlo Algorithms (MCMC)

(4) Gradient flow: Gradient Flow

(5) Bayesian optimization: An In-Depth Look at Bayesian Optimization

(6) Other papers:

Meta Dropout: Learning to Perturb Features for Generalization

Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm

Functional Variational Bayesian Neural Networks

Variational Implicit Processes

The Functional Neural Process

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
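
The MCMC entry in the list above can be illustrated with the simplest member of the family, random-walk Metropolis. This is a generic sketch; the step size, sample count, and standard-normal target are illustrative choices of mine:

```python
import numpy as np

def metropolis_hastings(log_p, x0, n_samples=5000, step=1.0, seed=0):
    """Random-walk Metropolis sampler for an unnormalized log-density log_p."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + step * rng.normal()  # symmetric Gaussian proposal
        # Accept with probability min(1, p(proposal) / p(x)), in log space.
        if np.log(rng.random()) < log_p(proposal) - log_p(x):
            x = proposal
        samples.append(x)
    return np.array(samples)

# Target: standard normal, up to an additive log-constant.
samples = metropolis_hastings(lambda x: -0.5 * x**2, x0=0.0)
```

Because the proposal is symmetric, the Hastings correction cancels and only the density ratio remains; the empirical mean and standard deviation of `samples` approach 0 and 1 as the chain grows.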
