Artificial Neural Networks QUIZ (MCQ QUESTIONS AND ANSWERS)

Question: 1

What does the term "epoch" refer to in the context of neural network training?

Question: 2

What is the purpose of the term "momentum" in the context of gradient descent optimization for neural networks?
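
For reference, a minimal sketch of the classic momentum update in NumPy (names such as velocity and beta are illustrative, not tied to any particular library):

    import numpy as np

    def momentum_step(w, grad, velocity, lr=0.01, beta=0.9):
        # Accumulate an exponentially decaying average of past gradients,
        # then move the weights along that accumulated direction.
        velocity = beta * velocity - lr * grad
        return w + velocity, velocity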

Question: 3

Which technique helps improve the convergence of gradient descent in neural networks by adapting the learning rate for each parameter?
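
As an illustration, a rough RMSProp-style update, which rescales the step for each parameter by a running average of its squared gradients (a sketch, not a library implementation):

    import numpy as np

    def rmsprop_step(w, grad, cache, lr=0.001, decay=0.9, eps=1e-8):
        # Per-parameter running average of squared gradients.
        cache = decay * cache + (1 - decay) * grad**2
        # Parameters with consistently large gradients take smaller effective steps.
        w = w - lr * grad / (np.sqrt(cache) + eps)
        return w, cache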

Question: 4

What is the primary role of the cross-entropy loss function in classification neural networks?
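
For intuition, a minimal NumPy computation of categorical cross-entropy between one-hot targets and predicted class probabilities (a sketch; the clipping constant only avoids log(0)):

    import numpy as np

    def cross_entropy(y_true, y_pred, eps=1e-12):
        # y_true: one-hot targets (batch, classes); y_pred: predicted probabilities.
        y_pred = np.clip(y_pred, eps, 1.0)
        # Confident wrong predictions are penalized much more than mild ones.
        return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))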

Question: 5

In the context of neural networks, what does the term "overfitting" mean?

Question: 6

What is the primary purpose of the Adam optimizer in neural network training?
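
A compact sketch of one Adam update step, combining a momentum-style first-moment estimate with an RMSProp-style second-moment estimate (the hyperparameter values shown are the commonly cited defaults; names are illustrative):

    import numpy as np

    def adam_step(w, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
        m = b1 * m + (1 - b1) * grad          # first moment (mean of gradients)
        v = b2 * v + (1 - b2) * grad**2       # second moment (uncentered variance)
        m_hat = m / (1 - b1**t)               # bias correction for early steps
        v_hat = v / (1 - b2**t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
        return w, m, v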

Question: 7

Which algorithm is commonly used to optimize the weights of neural networks during training?
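
As a toy illustration, plain gradient descent on a one-parameter least-squares problem, with the gradient computed analytically (data and names are purely illustrative):

    import numpy as np

    x = np.array([1.0, 2.0, 3.0])
    y = np.array([2.0, 4.0, 6.0])   # underlying relation: y = 2x
    w, lr = 0.0, 0.1

    for step in range(100):
        grad = np.mean(2 * (w * x - y) * x)   # d/dw of the mean squared error
        w -= lr * grad                        # gradient descent update
    # w converges toward 2.0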

Question: 8

What is the primary objective of weight initialization techniques in neural networks?
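
For example, a rough sketch of one common variant of Xavier/Glorot and He initialization, both of which scale random weights by the layer's fan-in so that activations neither explode nor shrink as depth grows:

    import numpy as np

    def xavier_init(fan_in, fan_out):
        # Typically paired with tanh/sigmoid activations.
        return np.random.randn(fan_in, fan_out) * np.sqrt(1.0 / fan_in)

    def he_init(fan_in, fan_out):
        # Typically paired with ReLU (larger variance offsets the zeroed units).
        return np.random.randn(fan_in, fan_out) * np.sqrt(2.0 / fan_in)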

Question: 9

Which type of neural network layer is used to connect all neurons in one layer to all neurons in the next layer without skipping any connections?
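
A layer in which every input unit feeds every output unit reduces to a matrix multiply plus a bias; a minimal NumPy sketch (shapes are illustrative):

    import numpy as np

    x = np.random.randn(32, 128)         # batch of 32 inputs, 128 features each
    W = np.random.randn(128, 64) * 0.01  # every input unit connects to every output unit
    b = np.zeros(64)

    out = x @ W + b                      # output shape (32, 64)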

Question: 10

What is the term for the process of adjusting the model's hyperparameters to optimize its performance on a validation dataset?

Question: 11

In deep neural networks, what is the primary reason for using batch normalization layers?
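
A minimal sketch of what such a layer computes at training time (gamma and beta are the learned scale and shift; eps avoids division by zero):

    import numpy as np

    def batch_norm(x, gamma, beta, eps=1e-5):
        # Normalize each feature over the batch, then rescale and shift.
        mean = x.mean(axis=0)
        var = x.var(axis=0)
        x_hat = (x - mean) / np.sqrt(var + eps)
        return gamma * x_hat + beta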

Question: 12

What is the purpose of the dropout regularization technique in neural networks?
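
For intuition, a rough sketch of "inverted" dropout during training: each unit is zeroed with probability p, and the survivors are rescaled so the expected activation is unchanged:

    import numpy as np

    def dropout(x, p=0.5, training=True):
        if not training:
            return x                      # dropout is disabled at inference time
        mask = (np.random.rand(*x.shape) >= p) / (1.0 - p)
        return x * mask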

Question: 13

What role does the activation function ReLU (Rectified Linear Unit) play in neural networks?
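
ReLU itself is a one-line function, max(0, x); a NumPy sketch with its (sub)gradient for completeness:

    import numpy as np

    def relu(x):
        return np.maximum(0.0, x)        # passes positives through, zeroes negatives

    def relu_grad(x):
        return (x > 0).astype(float)     # gradient is 1 for positive inputs, 0 otherwise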

Question: 14

Which neural network architecture is suitable for image classification tasks and can automatically learn hierarchical features?

Question: 15

What is the primary difference between a feedforward neural network (FNN) and a recurrent neural network (RNN)?
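
The contrast is easiest to see in code: a feedforward layer maps input to output in one shot, while a recurrent layer reuses a hidden state across time steps. A rough sketch (weight names are illustrative):

    import numpy as np

    def feedforward(x, W, b):
        return np.tanh(x @ W + b)                  # no memory of previous inputs

    def rnn(xs, Wx, Wh, b):
        h = np.zeros(Wh.shape[0])
        for x_t in xs:                             # sequence processed step by step
            h = np.tanh(x_t @ Wx + h @ Wh + b)     # hidden state carries information forward
        return h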

Question: 16

What is the primary purpose of an activation function in an artificial neural network?
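
One way to see the point: without a non-linear activation, stacking layers collapses into a single linear map. A quick NumPy check (the random matrices are illustrative):

    import numpy as np

    x = np.random.randn(4)
    W1, W2 = np.random.randn(4, 8), np.random.randn(8, 3)

    two_linear_layers = (x @ W1) @ W2
    one_linear_layer = x @ (W1 @ W2)
    print(np.allclose(two_linear_layers, one_linear_layer))  # True: no extra expressive power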

Question: 17

What is the purpose of the learning rate in gradient descent optimization for neural networks?

Question: 18

Which technique helps address the issue of vanishing gradients in training deep neural networks?

Question: 19

What is the primary purpose of the softmax activation function in the output layer of a classification neural network?
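
For reference, a numerically stable softmax in NumPy, which turns raw output scores (logits) into probabilities that sum to 1:

    import numpy as np

    def softmax(logits):
        z = logits - np.max(logits, axis=-1, keepdims=True)   # shift for numerical stability
        exp_z = np.exp(z)
        return exp_z / np.sum(exp_z, axis=-1, keepdims=True)  # each row sums to 1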

Question: 20

Which type of neural network layer is often used to downsample the spatial dimensions of data in convolutional neural networks (CNNs)?
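
As an illustration, 2x2 max pooling halves each spatial dimension by keeping only the largest value in every 2x2 window; a minimal NumPy sketch for a single-channel feature map with even dimensions:

    import numpy as np

    def max_pool_2x2(x):
        # x: (H, W) feature map with even H and W.
        h, w = x.shape
        return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))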

Question: 21

What is the vanishing gradient problem in neural networks?
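
A tiny numeric illustration of the effect: backpropagation multiplies local derivatives layer by layer, so if each factor is below 1 (a saturated sigmoid contributes at most 0.25), the product shrinks toward zero:

    # Each sigmoid layer contributes a derivative of at most 0.25.
    grad = 1.0
    for layer in range(20):
        grad *= 0.25
    print(grad)   # ~9e-13: the gradient reaching the early layers is effectively zero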

Question: 22

What is the purpose of dropout layers in neural networks?

Question: 23

In a neural network, what is the term for the process of updating the model's weights to minimize the loss?

Question: 24

What is the purpose of an activation function in a neural network?

Question: 25

Which type of neural network is well-suited for sequential data like natural language processing and time series forecasting?

Question: 26

What is the primary purpose of a neural network's loss function during training?

Question: 27

In a convolutional neural network (CNN), what is the primary advantage of using convolutional layers?

Question: 28

What does the term "backpropagation" refer to in the context of neural networks?

Question: 29

What is a common technique used to prevent overfitting in deep neural networks?

Question: 30

In a feedforward neural network, what is the role of the input layer?