set-9
401. ______ is a possible mechanism for synaptic modification in the brain.
Hebbian Rule
McCulloch-Pitts neuron
Hopfield network
None of the above
402. The ______ can be used to train neural networks for pattern recognition.
Hebbian Rule
McCulloch-Pitts neuron
Hopfield network
Genetic Algorithm
403. The rule stating that if two neurons on either side of a synapse are activated simultaneously, the strength of the synapse will increase is the ______.
Hebbian Learning Rule
Perceptron Learning Rule
Delta Learning Rule
Genetic Algorithm Learning Rule
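For reference, the Hebbian rule asked about above can be sketched as a simple weight increment proportional to the product of the two activations; this is an illustrative sketch, with variable names chosen here for clarity:

```python
# Minimal sketch of the Hebbian learning rule: when pre- and
# post-synaptic activations are active together, the weight grows.
def hebbian_update(w, x, y, lr=0.1):
    """Return updated weights: w_i <- w_i + lr * x_i * y."""
    return [wi + lr * xi * y for wi, xi in zip(w, x)]

w = [0.0, 0.0]
# Input and output active simultaneously -> both weights increase.
w = hebbian_update(w, x=[1.0, 1.0], y=1.0)
print(w)  # [0.1, 0.1]
```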
404. ______ is one of the neural network learning rules.
Hebbian Learning Rule
Perceptron Learning Rule
Delta Learning Rule
All of the above
405. ______ is an error-correcting supervised learning algorithm for single-layer feedforward networks with a linear activation function.
Hebbian Learning Rule
Perceptron Learning Rule
Delta Learning Rule
None of the above
406. ______ is also called the Least Mean Square (LMS) rule.
Hebbian Learning Rule
Perceptron Learning Rule
Delta Learning Rule
None of the above
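The delta (LMS) rule referred to above adjusts the weights of a single linear unit in proportion to the error between target and output. A minimal sketch, with names chosen here for illustration:

```python
# Delta / LMS rule for one linear unit: w_i <- w_i + lr * (target - output) * x_i
def delta_update(w, b, x, target, lr=0.1):
    output = sum(wi * xi for wi, xi in zip(w, x)) + b
    error = target - output
    w = [wi + lr * error * xi for wi, xi in zip(w, x)]
    b = b + lr * error
    return w, b

# One step toward fitting y = 2*x on the sample (x=1, y=2).
w, b = delta_update([0.0], 0.0, [1.0], 2.0)
print(w, b)  # [0.2] 0.2
```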
407. A ______ is an algorithm for supervised learning of binary classifiers. This algorithm enables neurons to learn and process elements in the training set one at a time.
Hebbian Learning
Delta Learning
Perceptron
Backpropagation
408. ______ are one of the types of perceptron.
Single layer perceptron
Multilayer perceptron
No layer perceptron
Both A and B
409. ______ can learn only linearly separable patterns.
Single layer perceptron
Multilayer perceptron
No layer perceptron
None of the above
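The limitation named in the question above can be demonstrated directly: the perceptron learning rule converges on a linearly separable problem such as the AND function. A hedged sketch (parameters chosen for illustration):

```python
# Perceptron learning rule on the (linearly separable) AND function.
def train_perceptron(samples, epochs=10, lr=1.0):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            # Hard-threshold prediction: 1 if the weighted sum is positive.
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = target - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

AND = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(AND)
print([1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0 for x, _ in AND])
# [0, 0, 0, 1]
```

On a non-separable problem such as XOR, the same loop would never settle, which is why a multilayer perceptron is needed there.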
410. ______ has two or more layers and hence greater processing power.
Single layer perceptron
Multilayer perceptron
No layer perceptron
None of the above
411. The given figure is of ______.
[Figure: inputs X1, X2, ..., Xn feeding an input sum and an activation function]
Single layer perceptron
Multilayer perceptron
No layer perceptron
None of the above
412. The given figure is of ______.
[Figure: input layer, hidden layers, and output layer]
Single layer perceptron
Multilayer perceptron
No layer perceptron
None of the above
413. Identify the type of perceptron activation function.
Sigmoid and tanh
Tanh and sigmoid
Linear and ReLU
ReLU and Linear
414. Identify the type of perceptron activation function.
Sigmoid and tanh
Tanh and sigmoid
Leaky ReLU and ReLU
ReLU and Leaky ReLU
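For reference, the activation functions named in the options above (sigmoid, tanh, ReLU, Leaky ReLU) can be written as follows; the 0.01 slope for Leaky ReLU is a common illustrative default, not a fixed standard:

```python
import math

# Common neural-network activation functions.
def sigmoid(z):    return 1.0 / (1.0 + math.exp(-z))   # range (0, 1)
def tanh(z):       return math.tanh(z)                  # range (-1, 1)
def relu(z):       return max(0.0, z)                   # 0 for z < 0
def leaky_relu(z, alpha=0.01):                          # small slope for z < 0
    return z if z > 0 else alpha * z

print(sigmoid(0.0), tanh(0.0), relu(-2.0), leaky_relu(-2.0))
# 0.5 0.0 0.0 -0.02
```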
415. Gradient descent is an optimization algorithm which is commonly used to ______ machine learning models and neural networks.
Train
Test
Validate
None of the above
416. ______ is one of the types of gradient descent algorithm.
Batch Gradient Descent (BGD)
Stochastic Gradient Descent (SGD)
Mini-Batch Gradient Descent
All of the above
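The three variants listed above differ only in how many samples feed each weight update: batch GD uses the whole dataset, SGD uses one sample, and mini-batch GD uses a small subset. A hedged sketch of mini-batch gradient descent on a one-parameter least-squares problem (data and hyperparameters chosen for illustration):

```python
# Mini-batch gradient descent on f(w) = mean((w*x - y)^2) for data y = 3*x.
# Batch GD would use all samples per step; SGD would use a single sample.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 6.0, 9.0, 12.0]

w, lr, batch_size = 0.0, 0.05, 2
for epoch in range(200):
    for i in range(0, len(xs), batch_size):
        bx, by = xs[i:i + batch_size], ys[i:i + batch_size]
        # Gradient of the mean squared error w.r.t. w over this mini-batch.
        grad = sum(2 * (w * x - y) * x for x, y in zip(bx, by)) / len(bx)
        w -= lr * grad
print(round(w, 3))  # 3.0
```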
417. The backpropagation algorithm is used for which type of neural network?
Single-layer feedforward neural networks
Multilayer feedforward neural networks
Convolutional neural networks
Recurrent neural networks
418. Which of the following is true about the backpropagation algorithm?
It is a supervised learning algorithm
It is an unsupervised learning algorithm
It is a reinforcement learning algorithm
It is a semi-supervised learning algorithm
419. The backpropagation algorithm involves two phases. What are they?
Forward propagation and backward propagation
Feature selection and feature extraction
Clustering and classification
Regression and classification
420. Which of the following is the activation function commonly used in the backpropagation algorithm?
Linear
Sigmoid
ReLU
Tanh
421. The backpropagation rule is also known as the generalized delta rule. Is this true?
Yes
No
Partially yes
Not sure
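The two phases asked about above (forward propagation, then backward propagation of the error via the chain rule) can be sketched for a tiny network with one hidden unit and sigmoid activations; weights, learning rate, and iteration count here are illustrative:

```python
# Two-phase backpropagation sketch: one input, one hidden unit, one output,
# sigmoid activations, trained on a single (x, target) pair.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, target = 1.0, 1.0
w1, w2, lr = 0.5, 0.5, 0.5
for _ in range(1000):
    # Forward phase: compute activations layer by layer.
    h = sigmoid(w1 * x)
    o = sigmoid(w2 * h)
    # Backward phase: propagate the error back with the chain rule
    # (this is the "generalized delta rule").
    delta_o = (o - target) * o * (1 - o)
    delta_h = delta_o * w2 * h * (1 - h)
    w2 -= lr * delta_o * h
    w1 -= lr * delta_h * x
print(o)  # o approaches the target 1.0
```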
422. ______ consists of a set of neurons where each neuron corresponds to a pixel of the difference image and is connected to all the neurons in the neighborhood.
Hopfield neural network
Biological neural network
Hamming neural network
McCulloch-Pitts neural network
423. The ______ is commonly used for auto-association and optimization tasks.
Hopfield neural network
Biological neural network
Hamming neural network
McCulloch-Pitts neural network
424. In the ______, the input and output patterns are discrete vectors, which can be either binary (0, 1) or bipolar (+1, -1) in nature.
Continuous Hopfield n/w
Discrete Hopfield n/w
Sequential Hopfield n/w
None of the above
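The discrete Hopfield network described above stores bipolar patterns in a symmetric weight matrix (Hebbian outer product with a zero diagonal) and recalls them by repeated thresholding. A minimal sketch for one stored pattern:

```python
# Discrete Hopfield network: store one bipolar pattern with the Hebbian
# outer-product rule, then recover it from a corrupted probe.
def store(pattern):
    n = len(pattern)
    # w[i][j] = p_i * p_j, with zero diagonal (no self-connections).
    return [[pattern[i] * pattern[j] if i != j else 0 for j in range(n)]
            for i in range(n)]

def recall(w, state, steps=5):
    n = len(state)
    for _ in range(steps):
        for i in range(n):  # asynchronous unit-by-unit updates
            s = sum(w[i][j] * state[j] for j in range(n))
            state[i] = 1 if s >= 0 else -1
    return state

stored = [1, -1, 1, -1]
w = store(stored)
probe = [1, 1, 1, -1]    # stored pattern with one flipped bit
print(recall(w, probe))  # [1, -1, 1, -1]
```

This auto-associative recall, recovering a stored pattern from a noisy version, is exactly the use case named in question 423.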
425. In comparison with the discrete Hopfield network, the continuous Hopfield network has ______ as a continuous variable.
Space
Range
Time
Velocity
426. The ______ architecture can be built up by adding electrical components such as amplifiers, which can map the input voltage to the output voltage over a sigmoid activation function.
Continuous Hopfield n/w
Discrete Hopfield n/w
Sequential Hopfield n/w
None of the above
427. ______ is a network having a single linear unit.
Madaline
Adaline
Backpropagation
Perceptron
428. The basic structure of ______ is similar to that of the perceptron, with an extra feedback loop.
Madaline
Adaline
Backpropagation
None of the above
429. Identify the diagram and answer the question which training algorithm is this?
Madaline
Adaline
Perceptron
Backpropagation
430. ______ is a network which consists of many Adalines in parallel.
Madaline
Adaline
Backpropagation
Perceptron
431. In neural networks, the delta rule works only for the ______.
Hidden layer
Input layer
Weight and bias
Output layer
432. The generalized delta rule, also called the ______ rule, is a way of creating the desired values for the hidden layer.
Feed-forward
Backpropagation
Perceptron
Adaline