NEC B.Tech(R19) CSE 4-1 Machine Learning Important Questions

UNIT–I

Supervised Learning: Learning a Class from Examples, Vapnik Chervonenkis (VC) Dimension, Probably Approximately Correct (PAC) Learning, Noise, Learning Multiple Classes, Regression, Model Selection and Generalization, Dimensions of a Supervised Machine Learning Algorithm 
Bayesian Decision Theory: Classification, Losses and Risks, Discriminant Functions, Utility Theory, Association Rules.
......................................................................................................................................................................

Questions:
1. Explain the process of learning a class from examples in supervised learning.

2. What is meant by the VC (Vapnik–Chervonenkis) dimension? How can positive examples be separated from negative examples in a dataset?

3. What is PAC learning? How is a probably approximately correct hypothesis found? Explain.

4. What do you mean by noise in machine learning? Is zero noise good for ML algorithms?

5. What is regression? Write the differences between linear regression and logistic regression.

6. Write the procedure for determining the dimensions of a supervised machine learning algorithm.

7. How does Bayesian decision theory work in classification problems? Derive the relevant formulas for classification (a worked sketch follows this list).

8. What are association rules? How is Bayesian theory helpful in this context?
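
For Question 7, the central step is Bayes' rule applied to class posteriors; under 0/1 loss, the minimum-risk decision reduces to choosing the class with the largest posterior. A minimal NumPy sketch, where the priors and likelihood values are made up purely for illustration:

```python
import numpy as np

# Assumed (made-up) priors P(C_i) and class-conditional likelihoods p(x | C_i)
# for a single observation x and two classes.
priors = np.array([0.6, 0.4])        # P(C1), P(C2)
likelihoods = np.array([0.2, 0.7])   # p(x | C1), p(x | C2)

# Bayes' rule: P(C_i | x) = p(x | C_i) * P(C_i) / p(x)
evidence = np.sum(likelihoods * priors)        # p(x)
posteriors = likelihoods * priors / evidence   # P(C_i | x)

# With 0/1 loss, minimising expected risk is the same as picking the
# class with the maximum posterior.
chosen = np.argmax(posteriors)
print(posteriors, "-> choose class", chosen + 1)
```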

UNIT–II
Parametric Methods: Maximum Likelihood Estimation, Evaluating an Estimator: Bias and Variance, The Bayes' Estimator, Parametric Classification, Regression, Tuning Model Complexity: Bias/Variance Dilemma, Model Selection Procedures
....................................................................................................................................................................

Questions:

1. How is maximum likelihood estimation carried out for a Bernoulli density?

2. How is maximum likelihood estimation carried out for a multinomial density?

3. How is maximum likelihood estimation carried out for a Gaussian (normal) density? (A worked sketch follows this list.)

4. What are the various factors that need to be considered while selecting a model? Explain.
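
For Questions 1 and 3, the maximum likelihood estimates have simple closed forms: the Bernoulli parameter is estimated by the sample proportion, and the Gaussian mean and variance by the sample mean and the average squared deviation (with a 1/N factor). A minimal NumPy sketch on synthetic data, for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# MLE for a Gaussian N(mu, sigma^2): sample mean and average squared deviation
# (note the 1/N factor, not the unbiased 1/(N - 1)).
x = rng.normal(loc=5.0, scale=2.0, size=1000)
mu_hat = np.mean(x)
sigma2_hat = np.mean((x - mu_hat) ** 2)

# MLE for a Bernoulli(p): the proportion of 1s in the 0/1 sample.
b = rng.integers(0, 2, size=1000)
p_hat = np.mean(b)

print(mu_hat, sigma2_hat, p_hat)
```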




UNIT–III

Dimensionality Reduction: Subset Selection, Principal Components Analysis, Factor Analysis, Multidimensional Scaling, Linear Discriminant Analysis. Association Learning: Basics of Association, Apriori Algorithm, Eclat Algorithm, FP Growth Algorithm with examples, SCADA application with FP Growth Algorithm
......................................................................................................................................................................

Questions:

1. Write about backward and forward selection using a wrapper algorithm.

2. Why Principal Components Analysis? How are the proportion of variance explained and the scree graph useful in PCA? (A worked sketch follows this list.)

3. Write the Apriori algorithm for association learning.

4. Explain the FP Growth algorithm with an example.
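
For Question 2, PCA can be computed from the eigendecomposition of the covariance matrix: the eigenvalues give the variance captured by each component, their cumulative ratio gives the proportion of variance explained, and plotting them against the component index gives the scree graph. A minimal NumPy sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # synthetic data, for illustration only

X_centered = X - X.mean(axis=0)
cov = np.cov(X_centered, rowvar=False)

# Eigenvalues = variance along each principal component (sorted descending).
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Proportion of variance explained by the first k components;
# plotting eigvals against the component index gives the scree graph.
prop_var = np.cumsum(eigvals) / np.sum(eigvals)
print(prop_var)

# Project onto the first two principal components.
Z = X_centered @ eigvecs[:, :2]
```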

UNIT-IV
Unsupervised Learning: Expectation Maximization, Self-Organizing Maps (SOM), Learning Process in SOM, SOM Algorithm, Adaptive Resonance Theory. Clustering: k-Means Clustering, Expectation-Maximization Algorithm, Supervised Learning after Clustering, Fuzzy Clustering, Document Clustering example, Hierarchical Clustering, Choosing the Number of Clusters.
....................................................................................................................................................................
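
As a quick revision aid for the clustering topics listed above, here is a minimal k-Means sketch in NumPy; the data X and the choice k = 3 are synthetic and for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))   # synthetic 2-D data
k = 3                           # arbitrary number of clusters

# Initialise centroids with k randomly chosen data points.
centroids = X[rng.choice(len(X), size=k, replace=False)]

for _ in range(100):
    # Assignment step: each point joins its nearest centroid.
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = np.argmin(dists, axis=1)
    # Update step: each centroid moves to the mean of its assigned points.
    new_centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    if np.allclose(new_centroids, centroids):
        break
    centroids = new_centroids

print(centroids)
```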

Questions:







UNIT-V
Decision Trees: Univariate Trees, Pruning, Rule Extraction from Trees, Learning Rules from Data. Random Forest: Basic Principle, Decision Tree vs Random Forest, Random Forest Algorithm with Example
..............................................................................................................................................................

Questions:

1. What is a decision tree? How is a decision tree represented?

2. What is pruning? Why is it needed in decision trees?

3. How are rules extracted from a decision tree? Explain with an example.

4. Write the RIPPER algorithm for learning rules from data.

5. Write about decision trees vs. random forests.

6. Explain the random forest algorithm with an example (see the sketch below).
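
For Questions 5 and 6, a random forest trains many trees on bootstrap samples with random feature subsets and combines their votes, which typically reduces the variance of a single tree. A minimal sketch, assuming scikit-learn is available; the Iris data set is used only as a convenient example:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# A single decision tree: interpretable, but prone to overfitting (hence pruning).
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)

# A random forest: an ensemble of trees built on bootstrap samples with random
# feature subsets, combined by majority vote.
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

print("tree accuracy  :", tree.score(X_test, y_test))
print("forest accuracy:", forest.score(X_test, y_test))
```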
