
Andrew Ng Quiz Wrong-Answer Notes: Week 3

01_week3_quiz: Shallow Neural Networks

Question 1

Which of the following are true? (Check all that apply.)

- a^[2] denotes the activation vector of the 2nd layer.
- X … (option text truncated in extraction)
- X … (option text truncated in extraction)
- a^[2](12) denotes the activation vector of the 12th layer on the 2nd training example.
- a^[2]_4 is the activation output by the 4th neuron of the 2nd layer.
- a^[2]_4 is the activation output of the 2nd layer for the 4th training example.
- a^[2](12) denotes the activation vector of the 2nd layer for the 12th training example.

Answer: A, C, D, E
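To keep the notation straight, here is a minimal NumPy sketch (the sizes n_x = 3, n_h = 5, m = 20 are made up for illustration): with training examples stacked as columns, the superscript (i) selects a column of the activation matrix and the subscript selects a row (a neuron).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (assumptions, not from the quiz): 3 features, 5 hidden units, 20 examples.
n_x, n_h, m = 3, 5, 20
X = rng.standard_normal((n_x, m))    # each column is one training example
W1 = rng.standard_normal((n_h, n_x))
b1 = np.zeros((n_h, 1))

A1 = np.tanh(W1 @ X + b1)            # layer-1 activations, shape (n_h, m)

# a^[1](12): the activation *vector* of layer 1 for the 12th training example
a1_example12 = A1[:, 11]             # column index 11 = 12th example -> shape (5,)

# a^[1]_4: the activation of the 4th *neuron* of layer 1 (here, over all examples)
a1_neuron4 = A1[3, :]                # row index 3 = 4th neuron -> shape (20,)

print(a1_example12.shape, a1_neuron4.shape)
```

The same element can be reached either way: `A1[3, 11]` is the 4th neuron's activation on the 12th example, i.e. a^[1]_4(12).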

Question 6

Suppose you have built a neural network. You decide to initialize the weights and biases to be zero. Which of the following statements is true?

- Each neuron in the first hidden layer will perform the same computation. So even after multiple iterations of gradient descent, each neuron in the layer will be computing the same thing as the other neurons.
- Each neuron in the first hidden layer will perform the same computation in the first iteration. But after one iteration of gradient descent they will learn to compute different things because we have "broken symmetry".
- Each neuron in the first hidden layer will compute the same thing, but neurons in different layers will compute different things, thus we have accomplished "symmetry breaking" as described in lecture.
- The first hidden layer's neurons will perform different computations from each other even in the first iteration; their parameters will thus keep evolving in their own way.

Answer: A (reason: symmetry. With all-zero initialization, every hidden neuron computes the same function and receives the same gradient, so they remain identical after every update.)
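The symmetry claim can be checked numerically. Below is a minimal sketch (not the course's code; the network sizes, data, and learning rate are arbitrary) that zero-initializes a 1-hidden-layer network and runs plain gradient descent; the rows of W[1] stay identical throughout, so every hidden neuron keeps computing the same thing.

```python
import numpy as np

rng = np.random.default_rng(1)
n_x, n_h, m = 2, 4, 50
X = rng.standard_normal((n_x, m))
Y = (rng.random((1, m)) > 0.5).astype(float)

# All-zero initialization: the setup the question asks about.
W1 = np.zeros((n_h, n_x)); b1 = np.zeros((n_h, 1))
W2 = np.zeros((1, n_h));   b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(100):
    # forward pass
    A1 = np.tanh(W1 @ X + b1)
    A2 = sigmoid(W2 @ A1 + b2)
    # backward pass (binary cross-entropy loss)
    dZ2 = A2 - Y
    dW2 = dZ2 @ A1.T / m; db2 = dZ2.mean(axis=1, keepdims=True)
    dZ1 = (W2.T @ dZ2) * (1 - A1**2)
    dW1 = dZ1 @ X.T / m;  db1 = dZ1.mean(axis=1, keepdims=True)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Every row of W1 is still identical to every other row:
print(np.allclose(W1, W1[0]))
```

In this zero-init case the hidden layer's gradients are in fact exactly zero at every step (A1 is all zeros and W2 stays zero), so W1 never moves at all; with random initialization the rows diverge immediately.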

Question 9

Consider the following 1-hidden-layer neural network:

(figure: network diagram; image not preserved)

Which of the following statements are True? (Check all that apply).

The options give candidate shapes for W^[1], b^[1], W^[2], and b^[2] (two candidates each); the shape text did not survive extraction.

Answer: B, C, E, H
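Since the figure and the candidate shapes were lost, here is the general shape rule the question tests, as a hedged sketch (the sizes n_x = 2 inputs, n_h = 4 hidden units, n_y = 1 output are assumptions, not read from the lost figure): W^[l] has shape (units in layer l, units in layer l-1) and b^[l] has shape (units in layer l, 1).

```python
import numpy as np

# Assumed sizes (the original figure is lost): 2 inputs, 4 hidden units, 1 output.
n_x, n_h, n_y = 2, 4, 1

W1 = np.zeros((n_h, n_x))   # (4, 2): rows = this layer's units, cols = previous layer's
b1 = np.zeros((n_h, 1))     # (4, 1): one bias per hidden unit
W2 = np.zeros((n_y, n_h))   # (1, 4)
b2 = np.zeros((n_y, 1))     # (1, 1)

# A single example x of shape (n_x, 1) flows through with no shape errors:
x = np.zeros((n_x, 1))
a1 = np.tanh(W1 @ x + b1)   # (4, 1)
a2 = W2 @ a1 + b2           # (1, 1)
print(a1.shape, a2.shape)
```

Whatever the lettering of the lost options, the four correct ones must be the four shapes above for the figure's actual layer sizes.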


I don't understand why the answer to question 10 is m; if anyone knows the reason, please leave a comment.
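A guess at the m question (hedged, since question 10 itself did not survive): if it asks for the shape of Z^[1] or A^[1] when the whole training set is processed at once, the m comes from vectorization. X stacks all m training examples as columns, so Z^[1] = W^[1]X + b^[1] inherits one column per example, giving shape (n_h, m). A sketch with assumed sizes:

```python
import numpy as np

# Assumed sizes for illustration only: 2 features, 4 hidden units, m = 10 examples.
n_x, n_h, m = 2, 4, 10
X  = np.zeros((n_x, m))     # one training example per column
W1 = np.zeros((n_h, n_x))   # (4, 2)
b1 = np.zeros((n_h, 1))     # broadcast across all m columns

Z1 = W1 @ X + b1
print(Z1.shape)             # (n_h, m): one column of activations per example
```

So the m in the answer is just the number of training examples, carried through the matrix product from X.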