
Data Mining

Author:   •  September 13, 2016  •  Essay  •  323 Words (2 Pages)


With the criterion set to gain ratio and a maximal depth of 10, the tree splits mainly on the Amount, Duration, Age, and History attributes, and accuracy is 72.5%. As the depth of the tree increases, accuracy increases.

The information gain criterion with a tree depth of 10 raises accuracy to 99.6%. As the depth increases further, accuracy rapidly approaches 100%.

The Gini index criterion gives very high accuracy even with a shallower tree.
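The criterion-versus-depth comparison above can be sketched with scikit-learn. This is an illustrative sketch only, not the author's setup: the original experiments appear to use a different tool on a credit dataset, while this uses synthetic data, and scikit-learn offers "gini" and "entropy" (information gain) but not gain ratio.

```python
# Illustrative sketch (assumption: synthetic data stands in for the
# original credit dataset). Compares split criteria at maximal depth 10.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for criterion in ("gini", "entropy"):
    tree = DecisionTreeClassifier(criterion=criterion, max_depth=10,
                                  random_state=0).fit(X_tr, y_tr)
    # Held-out accuracy for this criterion at depth 10.
    print(criterion, round(tree.score(X_te, y_te), 3))
```

The actual accuracy values depend entirely on the dataset, so the numbers printed here will not match the 72.5% or 99.6% reported in the text.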

Pruning:

Accuracy increases as we reduce the value of the confidence parameter.

With pre-pruning at a minimal gain of 0.1, the tree consists of only a single node. If we increase the confidence and the number of pre-pruning alternatives, the tree grows.
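The pre-pruning effect can be sketched in scikit-learn, with caveats: scikit-learn has no confidence-based pruning parameter (that belongs to C4.5-style tools), but its `min_impurity_decrease` plays a role similar to a minimal-gain threshold, so this sketch shows only that part, on synthetic data.

```python
# Hedged sketch: min_impurity_decrease as a stand-in for a minimal-gain
# pre-pruning threshold. A high threshold can halt growth very early,
# mirroring the near-trivial tree seen at minimal gain 0.1.
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

strict = DecisionTreeClassifier(min_impurity_decrease=0.1,
                                random_state=0).fit(X, y)
loose = DecisionTreeClassifier(min_impurity_decrease=0.0,
                               random_state=0).fit(X, y)
# The stricter threshold yields far fewer leaves than the unconstrained tree.
print(strict.get_n_leaves(), loose.get_n_leaves())
```

Note that scikit-learn weights the impurity decrease by the fraction of samples reaching the node, so the threshold is not numerically identical to a raw gain cutoff.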

As we increase the depth of the tree, accuracy increases, but this may result in overfitting; as we reduce the depth, accuracy decreases. The observed optimum depth is 10, which balances accuracy against generalization to new (unseen) data.
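The depth/overfitting trade-off described above can be sketched by sweeping depths and comparing training accuracy against held-out accuracy: the former keeps climbing while the latter plateaus or drops. Again an assumption-laden sketch on synthetic data (with label noise added via `flip_y` to make overfitting visible), not the author's experiment.

```python
# Hedged sketch: train vs. test accuracy as tree depth grows.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=10, flip_y=0.1,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for depth in (2, 5, 10, 20, None):  # None = grow the tree fully
    t = DecisionTreeClassifier(max_depth=depth, random_state=0).fit(X_tr, y_tr)
    # Training accuracy approaches 1.0 at full depth; test accuracy does not.
    print(depth, round(t.score(X_tr, y_tr), 3), round(t.score(X_te, y_te), 3))
```

The growing gap between the two columns at large depths is the overfitting the text refers to.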

