My coursework for Applied Machine Learning at Indiana University Bloomington
# PA1 - Decision Tree Algorithm
Implement a fixed-depth decision tree algorithm. The input to your algorithm will include the training data set and the maximum depth of the tree. For example, if the depth is set to one, you will learn a decision tree with a single test node, also called a decision stump. Test your implementation with depth = 1 and depth = 2 on the data set described below (train on the training data and evaluate on the testing data).

Data set information: this data set is extracted from the UCI Monk's problems data set (Monks-X.train and Monks-X.test). There are 7 features (the first 7 columns) and the class labels are in the last column; there are 2 classes. See http://archive.ics.uci.edu/ml/machine-learning-databases/monks-problems/monks.names for details on the data set.
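A minimal sketch of the fixed-depth tree learner described above, assuming categorical features (as in the Monk's data) and integer class labels. Splits are chosen by information gain with one branch per feature value; the function names (`build_tree`, `best_split`, `predict_one`) are illustrative, not part of the assignment spec.

```python
import numpy as np
from collections import Counter

def entropy(y):
    # Shannon entropy of an integer label array
    counts = np.bincount(y)
    probs = counts[counts > 0] / len(y)
    return -np.sum(probs * np.log2(probs))

def best_split(X, y):
    # pick the feature whose multiway partition maximizes information gain
    base = entropy(y)
    best_feat, best_gain = None, 0.0
    for f in range(X.shape[1]):
        remainder = 0.0
        for v in np.unique(X[:, f]):
            mask = X[:, f] == v
            remainder += mask.mean() * entropy(y[mask])
        gain = base - remainder
        if gain > best_gain:
            best_feat, best_gain = f, gain
    return best_feat

def build_tree(X, y, depth):
    # leaf: majority label; internal node: (feature, {value: subtree}, fallback)
    majority = Counter(y).most_common(1)[0][0]
    if depth == 0 or len(set(y)) == 1:
        return majority
    f = best_split(X, y)
    if f is None:  # no split improves purity
        return majority
    children = {}
    for v in np.unique(X[:, f]):
        mask = X[:, f] == v
        children[v] = build_tree(X[mask], y[mask], depth - 1)
    return (f, children, majority)

def predict_one(tree, x):
    # walk down the tree; fall back to the node's majority on unseen values
    while isinstance(tree, tuple):
        f, children, majority = tree
        tree = children.get(x[f], majority)
    return tree
```

With `depth=1` this yields exactly the decision stump the assignment mentions: a single test node whose children are majority-vote leaves.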
# PA2 - AdaBoost and Bagging Algorithms
Implement AdaBoost and bagging on top of your decision tree code. Take the code developed in Programming Assignment 1 and tailor it to learn AdaBoost and bagging ensembles. For AdaBoost, try tree depths of 1 and 2, with 5 and 10 trees. For bagging, try tree depths of 3 and 5, with bags of 5 and 10 trees.
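A minimal sketch of both ensemble methods, assuming binary labels in {-1, +1}. A one-level threshold stump stands in for the PA1 trees to keep the sketch short (the assignment itself calls for deeper trees, e.g. depth 3 and 5 for bagging); all function names here are illustrative.

```python
import numpy as np

def stump_fit(X, y, w):
    # exhaustive search for the (feature, threshold, sign) stump
    # with the lowest weighted error; labels y are in {-1, +1}
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            for sign in (1, -1):
                pred = np.where(X[:, f] <= thr, -sign, sign)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (f, thr, sign)
    return best, best_err

def stump_predict(stump, X):
    f, thr, sign = stump
    return np.where(X[:, f] <= thr, -sign, sign)

def adaboost_fit(X, y, n_rounds=5):
    # AdaBoost: reweight examples after each round so the next weak
    # learner focuses on the points the ensemble still gets wrong
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(n_rounds):
        stump, err = stump_fit(X, y, w)
        err = max(err, 1e-10)                  # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # weak learner's vote weight
        pred = stump_predict(stump, X)
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(a * stump_predict(s, X) for a, s in ensemble)
    return np.sign(score)

def bagging_fit(X, y, n_models=5, seed=0):
    # bagging: train each model on a bootstrap resample, no reweighting
    rng = np.random.default_rng(seed)
    n = len(y)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)  # sample n points with replacement
        w = np.full(n, 1.0 / n)
        stump, _ = stump_fit(X[idx], y[idx], w)
        models.append(stump)
    return models

def bagging_predict(models, X):
    # uniform majority vote (odd n_models avoids ties)
    return np.sign(sum(stump_predict(s, X) for s in models))
```

The contrast the assignment is after shows up directly in the code: AdaBoost trains its learners sequentially on reweighted data and combines them with learned weights `alpha`, while bagging trains them independently on bootstrap resamples and gives every tree an equal vote.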