1. A Multi-layer Perceptron (MLP) is capable of learning
A . Only linearly separable functions
B . Only clustering functions
C . Non-linear decision boundaries
D . Only regression tasks
2. The activation functions in hidden layers of MLP are usually
A . Linear
B . Non-linear
C . Step function only
D . None of the above
3. The process of calculating outputs from inputs in MLP is called
A . Forward pass
B . Backward pass
C . Gradient descent
D . Weight initialization
4. The main idea of backpropagation is
A . Adjust weights randomly
B . Propagate error backwards to update weights
C . Increase dataset size
D . Reduce neurons
5. Which optimization method is commonly used with backpropagation?
A . Gradient descent
B . K-means
C . PCA
D . Decision trees
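Questions 4–6 above concern backpropagation driven by gradient descent on a squared error. As an illustrative sketch (a hypothetical one-weight model, not part of the question bank), gradient descent repeatedly steps the weight against the error gradient:

```python
# Gradient descent on a one-weight model: predict y_hat = w * x and
# minimize the squared error E(w) = (w*x - y)**2.
# The gradient dE/dw = 2 * (w*x - y) * x drives each weight update.

def train(x, y, w=0.0, lr=0.1, steps=100):
    for _ in range(steps):
        grad = 2 * (w * x - y) * x   # derivative of the squared error
        w -= lr * grad               # step against the gradient
    return w

w = train(x=1.0, y=3.0)
print(round(w, 4))  # converges toward w = 3.0, since 3.0 * 1.0 = y
```

The same update rule, applied layer by layer via the chain rule, is exactly what backpropagation does in a full MLP.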
6. The error function minimized in backpropagation is usually
A . Sum of squared errors
B . Entropy only
C . Maximum likelihood
D . Random error
7. Which of the following problems can MLP solve that a single perceptron cannot?
A . AND function
B . OR function
C . XOR function
D . Linear regression
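Question 7's point can be shown concretely: a single perceptron cannot represent XOR, but a two-layer network of threshold units can. The weights below are hand-wired for illustration (not learned):

```python
# XOR via a hand-wired two-layer network of threshold (step) units.
# A single perceptron cannot represent XOR because it is not
# linearly separable; one hidden layer is enough.

def step(z):
    return 1 if z >= 0 else 0

def xor_mlp(a, b):
    h_or  = step(a + b - 0.5)        # hidden unit 1 computes OR(a, b)
    h_and = step(a + b - 1.5)        # hidden unit 2 computes AND(a, b)
    return step(h_or - h_and - 0.5)  # output: OR AND NOT AND = XOR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_mlp(a, b))  # 0 1 1 0 down the truth table
```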
8. Overfitting in MLP can be reduced using
A . Dropout
B . Regularization
C . Early stopping
D . All of the above
9. The Radial Basis Function (RBF) network uses which activation in hidden units?

A . Sigmoid
B . Gaussian
C . ReLU
D . Step function
10. RBF networks are particularly effective for
A . Linearly separable data
B . Function approximation and interpolation
C . Clustering only
D . Dimensionality reduction
11. The output of an RBF neuron depends mainly on
A . Distance from the center
B . Random initialization
C . Gradient descent
D . Step size
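Questions 9–11 hinge on the defining property of an RBF unit: its activation is a Gaussian of the distance between the input and the unit's center. A minimal sketch:

```python
import math

# Gaussian RBF unit: the activation depends only on the distance
# between the input x and the unit's center c, peaking at 1.0 when
# the input sits exactly on the center.

def rbf(x, c, sigma=1.0):
    dist2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
    return math.exp(-dist2 / (2 * sigma ** 2))

print(rbf([0.0, 0.0], [0.0, 0.0]))  # 1.0 at the center
print(rbf([1.0, 0.0], [0.0, 0.0]))  # exp(-0.5), any unit vector away
```

Note that any two inputs equidistant from the center produce the same activation, which is why "distance from the center" is the right answer to question 11.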
12. Splines are widely used for
A . Data clustering
B . Smooth curve fitting
C . Decision trees
D . Classification only
13. The "curse of dimensionality" refers to
A . Too many parameters make learning easier
B . Only neural networks suffer from it
C . Decreased dimensions lead to overfitting
D . Increased dimensions cause sparsity of data
14. In high dimensions, nearest-neighbor methods perform poorly because
A . Distance metrics lose meaning
B . Too few features
C . Too many labels
D . Lack of training data
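The distance-concentration effect behind questions 13–14 can be observed directly: as dimensionality grows, the nearest and farthest of a set of random points end up almost equally far from a query point. A small simulation (illustrative parameters only):

```python
import math, random

# Distance concentration: in high dimensions the gap between the
# nearest and farthest neighbor shrinks relative to the nearest
# distance, so distance metrics lose their discriminating power.

def relative_contrast(dim, n_points=200, seed=0):
    rng = random.Random(seed)
    query = [rng.random() for _ in range(dim)]
    dists = []
    for _ in range(n_points):
        point = [rng.random() for _ in range(dim)]
        dists.append(math.dist(query, point))
    return (max(dists) - min(dists)) / min(dists)

print(relative_contrast(2))     # low dimension: large contrast
print(relative_contrast(1000))  # high dimension: contrast collapses
```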
15. One way to overcome the curse of dimensionality is
A . Increase dataset size
B . Increase learning rate
C . Add more features
D . Dimensionality reduction (PCA, LDA)
16. The main objective of SVM is to
A . Minimize training error
B . Randomly separate data
C . Maximize margin between classes
D . Increase features
17. A linear SVM finds
A . Multiple decision boundaries
B . Cluster centers
C . A regression curve
D . A hyperplane that separates data with maximum margin
18. Which of the following is used in SVM to handle non-linear data?
A . Dropout
B . Decision tree
C . Kernel trick
D . Backpropagation
19. Support vectors in SVM are
A . Random points
B . Cluster centers
C . Points closest to the hyperplane
D . Outliers
20. A commonly used kernel in SVM is
A . Gaussian (RBF)
B . Polynomial
C . Linear
D . All of the above
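Questions 18–20 cover the kernel trick: the SVM only ever needs inner products of (implicitly) mapped inputs, so the kernel value is computed directly. The three standard kernels from question 20 can be evaluated by hand:

```python
import math

# The three common SVM kernels evaluated on a pair of vectors.
# The kernel trick lets the SVM use these values directly, without
# ever constructing the high-dimensional feature map explicitly.

def linear_kernel(x, z):
    return sum(xi * zi for xi, zi in zip(x, z))

def poly_kernel(x, z, degree=2, c=1.0):
    return (linear_kernel(x, z) + c) ** degree

def rbf_kernel(x, z, gamma=0.5):
    dist2 = sum((xi - zi) ** 2 for xi, zi in zip(x, z))
    return math.exp(-gamma * dist2)

x, z = [1.0, 2.0], [2.0, 1.0]
print(linear_kernel(x, z))  # 1*2 + 2*1 = 4.0
print(poly_kernel(x, z))    # (4 + 1)^2 = 25.0
print(rbf_kernel(x, x))     # 1.0 for identical inputs
```

Each kernel corresponds to a different implicit feature space; the Gaussian (RBF) kernel's space is infinite-dimensional, which is what gives the SVM its non-linear power.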
21. A Multi-layer Perceptron (MLP) contains at least one __________ layer.
22. The forward pass in MLP calculates _________ outputs.
23. Backpropagation updates weights by propagating the ___________ backward.
24. The learning rate controls the size of ___________ updates.
25. The error function often minimized in backpropagation is ____________ error.
26. The XOR problem can be solved using a __________.
27. To avoid overfitting in MLP, we can use __________, regularization, or early stopping.
28. RBF stands for ___________.
29. The most common activation used in RBF is the ____________ function.
30. RBF networks are often applied in ___________ and interpolation tasks.
31. Splines are mathematical functions used for smooth _________ fitting.
32. The curse of dimensionality arises when data becomes ________ in high dimensions.
33. In high-dimensional space, __________ metrics lose their meaning.
34. Dimensionality reduction techniques like ___________ help overcome the curse of dimensionality.
35. The main objective of SVM is to maximize the __________ between classes.
36. The separating surface in SVM is called a _________.
37. Support vectors are the data points lying closest to the ___________.
38. The method used in SVM to handle non-linear classification is called the __________ trick.
39. A commonly used kernel in SVM is the ________ kernel.
40. SVMs are effective in both classification and ___________ tasks.