BatchNorm + Dropout = DNN Success!

A group of researchers from Tencent Technology, the Chinese University of Hong Kong, and Nankai University recently combined two commonly used techniques, Batch Normalization (BatchNorm) and Dropout, into an Independent Component (IC) layer that is inserted before each weight layer to make that layer's inputs more mutually independent.
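For a concrete picture, here is a minimal PyTorch sketch of what such an IC layer could look like: a BatchNorm step followed by Dropout, placed in front of each weight layer. The class name `IC`, the dropout rate, and the layer sizes are illustrative assumptions, not code from the paper.

```python
import torch
import torch.nn as nn

class IC(nn.Module):
    """Sketch of an Independent Component (IC) layer:
    BatchNorm followed by Dropout, applied before a weight layer."""
    def __init__(self, num_features: int, p: float = 0.5):
        super().__init__()
        self.bn = nn.BatchNorm1d(num_features)  # normalize the incoming activations
        self.drop = nn.Dropout(p)               # randomly zero units to reduce co-adaptation

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.drop(self.bn(x))

# Hypothetical usage: an IC layer inserted before each weight (Linear) layer.
model = nn.Sequential(
    IC(784, p=0.05),
    nn.Linear(784, 256),
    nn.ReLU(),
    IC(256, p=0.05),
    nn.Linear(256, 10),
)
```

Note the contrast with common practice, where BatchNorm and Dropout follow a weight layer; here the combined IC block precedes it, so each weight layer receives normalized, partially decorrelated inputs.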