
The Probabilistic Framework in Machine Learning


I had some free time over the past couple of days, so I put together a summary of the probability-related material in machine learning.


Some recommended reading on each topic:

1. Generative models:

(1) VAE: An Introduction to Variational Autoencoders; Variational Inference and Variational Autoencoders (VAE)

(2) GAN: A Review on Generative Adversarial Networks: Algorithms, Theory, and Applications

(3) Flow: Normalizing Flows: An Introduction and Review of Current Methods

(4) Autoregressive models: An Introduction to Autoregressive Models
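To make the VAE references above more concrete, here is a minimal NumPy sketch of the two pieces every Gaussian VAE shares: the closed-form KL term between the diagonal-Gaussian posterior and the standard-normal prior, and the reparameterization trick. The encoder outputs and the decoder log-likelihood are illustrative placeholders, not a trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative "encoder" outputs for one data point: mean and
# log-variance of the approximate posterior q(z|x).
mu = np.array([0.5, -0.3])
logvar = np.array([-1.0, 0.2])

# Closed-form KL(q(z|x) || N(0, I)) for diagonal Gaussians:
# 0.5 * sum(mu^2 + var - logvar - 1).
kl = 0.5 * np.sum(mu**2 + np.exp(logvar) - logvar - 1.0)

# Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, I),
# so the sampling step is differentiable w.r.t. mu and logvar.
eps = rng.standard_normal(mu.shape)
z = mu + np.exp(0.5 * logvar) * eps

# Placeholder decoder log-likelihood log p(x|z); in a real VAE this
# comes from the decoder network. The ELBO is reconstruction minus KL.
log_px_given_z = -1.23
elbo = log_px_given_z - kl
```

Since the KL term is nonnegative, the ELBO is always a lower bound on the reconstruction term alone.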

2. Divergences:

(1) f-divergence: f-Divergences and Surrogate Loss Functions

(2) Wasserstein distance: [Math] Wasserstein Distance

(3) Stein discrepancy: Stein's Method for Practical Machine Learning

(4) Fisher divergence: Variational approximations using Fisher divergence

(5) Sliced methods for computational acceleration:

a. Fisher divergence: Sliced Score Matching: A Scalable Approach to Density and Score Estimation

b. Wasserstein distance: Generalized Sliced Wasserstein Distances
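A quick way to build intuition for the divergence references: KL is the f-divergence with f(t) = t log t, and between Gaussians it has a closed form that you can check against a Monte Carlo estimate. A sketch with arbitrary distribution parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two univariate Gaussians p = N(mu1, s1^2) and q = N(mu2, s2^2).
mu1, s1 = 0.0, 1.0
mu2, s2 = 1.0, 2.0

# Closed-form KL(p || q) for univariate Gaussians.
kl_closed = np.log(s2 / s1) + (s1**2 + (mu1 - mu2) ** 2) / (2 * s2**2) - 0.5

# Monte Carlo estimate: KL(p || q) = E_{x~p}[log p(x) - log q(x)].
x = rng.normal(mu1, s1, size=200_000)
log_p = -0.5 * ((x - mu1) / s1) ** 2 - np.log(s1 * np.sqrt(2 * np.pi))
log_q = -0.5 * ((x - mu2) / s2) ** 2 - np.log(s2 * np.sqrt(2 * np.pi))
kl_mc = np.mean(log_p - log_q)
```

The two values agree to a couple of decimal places at this sample size; the sliced methods in (5) exist precisely because such estimates get expensive in high dimensions.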

3. BNNs and dropout:

(1) BNN: Bayesian Neural Networks

(2) Dropout: An Analysis of How Dropout Works in Deep Learning
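The dropout references connect to Bayesian uncertainty via MC dropout: keep dropout active at test time and average several stochastic forward passes, using their spread as a rough epistemic-uncertainty estimate. A toy NumPy sketch with an untrained, purely illustrative network:

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny fixed one-hidden-layer network; weights are arbitrary.
W1 = rng.normal(size=(1, 32))
W2 = rng.normal(size=(32, 1))

def forward(x, p_drop=0.5):
    """One stochastic forward pass with dropout kept ON at test time."""
    h = np.maximum(x @ W1, 0.0)           # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop   # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)         # inverted-dropout scaling
    return h @ W2

x = np.array([[0.7]])
# MC dropout: T stochastic passes; mean is the prediction, std a
# rough uncertainty estimate.
preds = np.array([forward(x)[0, 0] for _ in range(100)])
mean, std = preds.mean(), preds.std()
```

The "Dropout as a Bayesian Approximation" paper listed at the end gives the formal justification for reading this spread as approximate posterior uncertainty.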

4. Bayesian meta-learning:

(1) Neural processes: Neural Processes

(2) Bayesian + MAML: Amortized Bayesian Meta-Learning; Bayesian Model-Agnostic Meta-Learning; Probabilistic Model-Agnostic Meta-Learning

5. Gaussian processes: Gaussian Processes in Machine Learning
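GP regression itself fits in a few lines: condition a Gaussian prior on noisy observations to get a closed-form posterior mean and variance. A sketch using an RBF kernel with illustrative hyperparameters and toy data:

```python
import numpy as np

def rbf(a, b, length=1.0, var=1.0):
    """Squared-exponential (RBF) kernel matrix between 1-D inputs a, b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / length**2)

# Noisy observations of a smooth function.
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 20)
y = np.sin(X) + 0.1 * rng.standard_normal(X.shape)

# Standard GP posterior at test points via Gaussian conditioning.
Xs = np.linspace(-3, 3, 50)
noise = 1e-2  # observation-noise variance (0.1 standard deviation)
K = rbf(X, X) + noise * np.eye(len(X))
Ks = rbf(Xs, X)
Kss = rbf(Xs, Xs)

alpha = np.linalg.solve(K, y)
post_mean = Ks @ alpha
post_cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
post_std = np.sqrt(np.clip(np.diag(post_cov), 0.0, None))
```

The posterior mean tracks sin(x) closely inside the data range, and `post_std` grows wherever data is sparse, which is the calibrated-uncertainty property that makes GPs the backbone of Bayesian optimization (section 7).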

6. Representation learning:

(1) Information bottleneck: The Information Bottleneck Theory: Foundations and Applications

(2) Disentangled representation learning: [Disentangled Representation 1] InfoGAN and beta-VAE

(3) InfoMax and contrastive learning: Deep InfoMax: Learning good representations through mutual information maximization; A Simple Framework for Contrastive Learning of Visual Representations
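The contrastive-learning papers above rest on an InfoNCE-style objective (a lower bound on mutual information between views), which is short enough to sketch in NumPy. Batch size, temperature, and the random embeddings here are all illustrative:

```python
import numpy as np

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE-style loss for paired embeddings z1[i] <-> z2[i]."""
    # L2-normalize so dot products are cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature  # (N, N) similarity matrix
    # Row i's positive is the diagonal entry; the rest are negatives.
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_softmax = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_softmax))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
# Matched "augmented views" give a low loss; unrelated views a high one.
aligned = info_nce(z, z + 0.01 * rng.normal(size=z.shape))
unrelated = info_nce(z, rng.normal(size=(8, 16)))
```

SimCLR's NT-Xent loss is this cross-entropy over similarities, computed symmetrically over both views of each batch element.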

7. Miscellaneous:

(1) Information theory: Introduction to Information Theory

(2) Measure theory: sola's math notes

(3) MCMC: Markov Chain Monte Carlo (MCMC)

(4) Gradient flow: Gradient Flow

(5) Bayesian optimization: A Deep Dive into Bayesian Optimization

(6) Other papers:

Meta Dropout: Learning to Perturb Features for Generalization

Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm

Functional Variational Bayesian Neural Networks

Variational Implicit Processes

The Functional Neural Process

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
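Of the miscellaneous topics above, MCMC is the easiest to demonstrate end to end: a random-walk Metropolis-Hastings sampler needs only the unnormalized log-density of the target. A minimal sketch targeting a standard normal (the target, step size, and chain length are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Unnormalized log-density of the target: a standard normal here."""
    return -0.5 * x**2

# Random-walk Metropolis-Hastings: propose x' = x + eps and accept with
# probability min(1, p(x')/p(x)); the chain's stationary distribution
# is the target, so normalizing constants cancel and are never needed.
samples = np.empty(20_000)
x = 0.0
for i in range(samples.size):
    proposal = x + rng.normal(scale=1.0)
    if np.log(rng.random()) < log_target(proposal) - log_target(x):
        x = proposal
    samples[i] = x  # on rejection, the current state is repeated

burned = samples[5_000:]  # discard burn-in before computing statistics
```

The post-burn-in sample mean and standard deviation land close to the target's 0 and 1, though successive samples are correlated, which is why effective sample size matters in practice.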
