
1. When determining weights by learning, which learning law should be employed for orthogonal input vectors?
a) Hebb learning law
b) Widrow learning law
c) Hoff learning law
d) no learning law

Answer: a [Reason:] For orthogonal input vectors, the Hebb learning law is best suited.
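The Hebbian case can be sketched numerically: the weight matrix is the sum of outer products of target outputs with inputs, and orthonormal inputs give exact recall. The vectors below are illustrative assumptions, not taken from the quiz.

```python
import numpy as np

# Hebb learning law: W accumulates outer products of target outputs
# and inputs. With orthonormal inputs, recall of each pattern is exact.
x1 = np.array([1.0, 0.0])          # orthonormal input vectors
x2 = np.array([0.0, 1.0])
y1 = np.array([1.0, -1.0])         # associated target outputs
y2 = np.array([-1.0, 1.0])

W = np.outer(y1, x1) + np.outer(y2, x2)   # Hebbian weight determination

print(np.allclose(W @ x1, y1))     # exact recall for orthogonal inputs
print(np.allclose(W @ x2, y2))
```

Because x1 and x2 are orthonormal, the cross terms vanish when W is applied, which is exactly why Hebb learning suits orthogonal inputs.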

2. When determining weights by learning, which learning law should be employed for linearly independent input vectors?
a) Hebb learning law
b) Widrow learning law
c) Hoff learning law
d) no learning law

Answer: b [Reason:] For linearly independent input vectors, the Widrow learning law is best suited.
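The Widrow (LMS / delta) rule can be sketched as an iterative error-correcting update that works for linearly independent inputs even when they are not orthogonal. The learning rate and vectors below are illustrative assumptions.

```python
import numpy as np

# Widrow (LMS) rule: repeatedly reduce the error (y - W x) for each
# pattern. Works for linearly independent, non-orthogonal inputs.
X = np.array([[1.0, 1.0], [1.0, -2.0]])   # linearly independent inputs (rows)
Y = np.array([[1.0, 0.0], [0.0, 1.0]])    # desired outputs (rows)

W = np.zeros((2, 2))
eta = 0.1                                  # assumed learning rate
for _ in range(2000):
    for x, y in zip(X, Y):
        err = y - W @ x
        W += eta * np.outer(err, x)        # LMS / delta-rule update

print(np.allclose(W @ X[0], Y[0], atol=1e-3))
print(np.allclose(W @ X[1], Y[1], atol=1e-3))
```

A plain Hebb update would not recall these patterns exactly, because the inputs are correlated; the error-driven LMS update compensates for that correlation.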

3. When determining weights by learning, which learning law should be employed for noisy input vectors?
a) Hebb learning law
b) Widrow learning law
c) Hoff learning law
d) no learning law

Answer: d [Reason:] For noisy input vectors, none of these learning laws is suitable.

4. Which of the following can be accomplished using affine transformations?
a) arbitrary rotation
b) scaling
c) translation
d) all of the mentioned

Answer: d [Reason:] Affine transformations can perform arbitrary rotation, scaling, and translation of the input pattern.

5. Which feature could not be accomplished without affine transformations?
a) arbitrary rotation
b) scaling
c) translation
d) all of the mentioned

Answer: c [Reason:] Rotation and scaling can be realized by a plain linear weight matrix, but translation additionally requires the bias term introduced by the affine transformation.

6. What are affine transformations?
a) addition of a bias term (-1), which results in arbitrary rotation, scaling, and translation of the input pattern
b) addition of a bias term (+1), which results in arbitrary rotation, scaling, and translation of the input pattern
c) addition of a bias term (-1) or (+1), which results in arbitrary rotation, scaling, and translation of the input pattern
d) none of the mentioned

Answer: a [Reason:] It follows from the basic definition of an affine transformation.
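The role of the bias term can be sketched concretely: augmenting the input with a fixed -1 lets a single weight matrix carry out rotation, scaling, and translation at once. The angle, scale, and shift values below are illustrative assumptions.

```python
import numpy as np

# Appending a bias input of -1 turns a linear map into an affine one:
# the extra weight column encodes the translation.
theta, s = np.pi / 2, 2.0                 # assumed rotation angle and scale
t = np.array([1.0, 1.0])                  # assumed translation vector

R = s * np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

# With the bias input fixed at -1, storing -t in the last weight column
# gives W @ [x, -1] = R x + t, i.e. an affine transformation.
W = np.hstack([R, -t.reshape(2, 1)])

x = np.array([1.0, 0.0])
x_aug = np.append(x, -1.0)                # bias term (-1) appended
print(np.allclose(W @ x_aug, R @ x + t))
```

Without the augmented bias input, W x passes through the origin, so rotation and scaling are possible but translation is not.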

7. Can an artificial neural network capture an association when the number of input patterns exceeds the dimensionality of the input vectors?
a) yes
b) no

Answer: a [Reason:] Yes, by using nonlinear processing units in the output layer.

8. By using only linear processing units in the output layer, can an artificial neural network capture an association when the number of input patterns exceeds the dimensionality of the input vectors?
a) yes
b) no

Answer: b [Reason:] No; nonlinear processing units are needed in that case.
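Why a purely linear output layer fails here can be shown directly: with more patterns than input dimensions, some pattern is a linear combination of the others, so its output is forced rather than freely chosen. The vectors and weight values below are illustrative assumptions.

```python
import numpy as np

# Three patterns in a 2-D input space: x3 is a linear combination of
# x1 and x2, so ANY linear map W must satisfy W x3 = W x1 + W x2.
# The network cannot assign x3 an independent target output.
x1 = np.array([1.0, 0.0])
x2 = np.array([0.0, 1.0])
x3 = x1 + x2                              # dependent third pattern

W = np.array([[2.0, -1.0],
              [0.5,  3.0]])               # arbitrary linear weight matrix

print(np.allclose(W @ x3, W @ x1 + W @ x2))   # holds for every linear W
```

A nonlinear output unit breaks this constraint, since f(W x3) need not equal f(W x1) + f(W x2), which is why nonlinearity lets the network capture more associations than the input dimensionality.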

9. The number of output cases depends on which factor?
a) number of inputs
b) number of distinct classes
c) total number of classes
d) none of the mentioned

Answer: b [Reason:] The number of output cases depends on the number of distinct classes.

10. Can the Hebb learning law be employed for noisy input vectors?
a) yes
b) no

Answer: b [Reason:] As noted in Q3, no learning law is suitable for noisy input vectors.