
Dropout, Part 2 (on "Dropout: A Simple Way to Prevent Neural Networks from Overfitting")

First, the menu:

Abstract:

Deep neural nets with a large number of parameters are very powerful machine learning systems. However, overfitting is a serious problem in such networks. Large networks are also slow to use, making it difficult to deal with overfitting by combining the predictions of many different large neural nets at test time. Dropout is a technique for addressing this problem. The key idea is to randomly drop units (along with their connections) from the neural network during training. This prevents units from co-adapting too much. During training, dropout samples from an exponential number of different "thinned" networks. At test time, it is easy to approximate the effect of averaging the predictions of all these thinned networks by simply using a single unthinned network that has smaller weights. This significantly reduces overfitting and gives major improvements over other regularization methods. We show that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.
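As a minimal sketch of the key idea in the abstract (not the paper's reference implementation; the function names, the drop probability, and the toy sizes are illustrative), standard dropout can be written in a few lines of NumPy: at training time each unit is zeroed with probability p, and at test time the full "unthinned" network is used with its weights scaled by the keep probability 1 - p:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_train(x, p=0.5):
    """Training time: zero each unit with probability p.

    Zeroing a unit's activation also silences its outgoing
    connections, which is what "thins" the network.
    """
    mask = rng.random(x.shape) >= p  # True = keep, with probability 1 - p
    return x * mask

def dropout_test(w, p=0.5):
    """Test time: keep all units, but multiply the weights by 1 - p.

    This approximates averaging the predictions of all the thinned
    networks sampled during training, as the abstract describes.
    """
    return w * (1 - p)

# Toy example: activations of one hidden layer of 10 units.
h = np.ones(10)
h_train = dropout_train(h, p=0.5)   # roughly half the units are zeroed
w = np.ones((10, 3))
w_test = dropout_test(w, p=0.5)     # every weight halved at test time
```

A common modern variant ("inverted dropout") instead divides by 1 - p at training time so that test-time code needs no rescaling; the version above follows the scheme stated in the abstract.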

Keywords: neural networks, regularization, model combination, deep learning

First, an overview of the paper's structure:

The paper is organized as follows. Section 2 describes the motivation for the idea. Section 3 describes relevant previous work. Section 4 formally describes the dropout model. Section 5 gives an algorithm for training dropout networks. Section 6 presents experimental results, applying dropout to problems in different domains and comparing it with other forms of regularization and model combination. Section 7 analyzes the effect of dropout on different properties of a neural network and describes how dropout interacts with the network's hyperparameters. Section 8 describes the Dropout RBM model. Section 9 explores the idea of marginalizing dropout. Appendix A provides a practical guide for training dropout networks, including a detailed analysis of the practical considerations involved in choosing hyperparameters. (Background: Sections 1-3; method: Sections 4-5; experiments and analysis: Sections 6-7; other: Sections 8-10; conclusion: Section 11; appendices: A-B.)

Some reference sites:

https://www.baidu.com/link?url=F-vklwp34FZsuOsiAw36yS2upENUfms5jn-R3VGUY3Pmhq210Q2c9K5N8YNN63BzYlCS9OPNUhl-eSms3QpNh9urQwhWo0HDis6G2MnoGm3&wd=&eqid=f9e01460000131a8000000055bceab97

https://blog.csdn.net/qq_25011449/article/details/81168369

https://blog.csdn.net/huplion/article/details/79208736

https://blog.csdn.net/u014422406/article/details/70257324?locationNum=2&fps=1

https://blog.csdn.net/lhc19940815/article/details/50907545

Reposted from: https://www.cnblogs.com/Ann21/p/9830781.html
