2016-09-23

Decision trees are created by making meaningful decisions about where to mark boundaries on the ranges of attribute values in order to split the instances into two or more subcategories, each represented by a different branch of the tree. This process continues, with branches being recursively split into smaller, more specific branches on different attributes, until the leaves of the tree represent classes. A subsequent walk of the tree with any unlabeled instance then leads to an unambiguous classification.
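
To make the splitting concrete, here is a minimal sketch using scikit-learn and its bundled iris data (both my own assumptions; the post names no library or dataset). It grows a tree, prints the learned attribute boundaries, and classifies a new instance by walking the tree.

from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)

# Each internal node marks a boundary on one attribute's value range;
# branches split recursively until the leaves correspond to classes.
tree = DecisionTreeClassifier(random_state=0).fit(X, y)

# Show the learned boundaries, e.g. "petal width (cm) <= 0.80".
print(export_text(tree, feature_names=load_iris().feature_names))

# Walking the tree with an unlabeled instance gives one unambiguous class.
print(tree.predict([[5.1, 3.5, 1.4, 0.2]]))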

After a decision tree model is built, it is often pruned. Branches that add no value to the classification, or whose removal would not reduce training-data classification accuracy by more than a pre-specified threshold, are removed and their sub-trees are merged. The effect of this pruning can be measured on the training data, but the effect on unseen test data (or real-world data) remains unknown at the time of model training, parameter tuning, and tree pruning.
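
As an illustration of pruning, the sketch below uses cost-complexity pruning via scikit-learn's ccp_alpha parameter, a stand-in for the accuracy-threshold rule described above (the post does not specify a pruning algorithm). Larger ccp_alpha values remove branches whose contribution does not justify their complexity.

from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

unpruned = DecisionTreeClassifier(random_state=0).fit(X, y)
# The alpha value is chosen purely for illustration.
pruned = DecisionTreeClassifier(random_state=0, ccp_alpha=0.02).fit(X, y)

# Pruning shrinks the tree; its effect on unseen data stays unknown until tested.
print("leaves, unpruned:", unpruned.get_n_leaves())
print("leaves, pruned:  ", pruned.get_n_leaves())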

An unpruned decision tree can lead to overfitting. Overfitting occurs when a model describes random error or noise rather than the underlying relationships in the data. An overfit model fits the known data more accurately but is correspondingly worse at predicting new data; in a decision tree, the result is too many overly specific class outcomes to be useful.
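
A hedged illustration of this trade-off: on a held-out split, an unpruned tree typically reaches near-perfect training accuracy, while a pruned tree fits the training data less tightly but may generalize as well or better. The dataset, split, and ccp_alpha value below are illustrative choices, not from the post.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

full = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)                   # unpruned
small = DecisionTreeClassifier(random_state=0, ccp_alpha=0.01).fit(X_tr, y_tr)  # pruned

# Compare training accuracy (fit to known data) with test accuracy (new data).
print("unpruned train/test:", full.score(X_tr, y_tr), full.score(X_te, y_te))
print("pruned   train/test:", small.score(X_tr, y_tr), small.score(X_te, y_te))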

Overfitting also prevents meaningful information from being learned from a model. A tree that is pruned, and so does not fit the training data quite as well, can still be useful because it has fewer, more meaningful classes. Fewer classes mean that more instances are grouped together, which gives a better chance that meaningful patterns will emerge and useful information can be extracted.

Any time instances are grouped together into fewer classes, there is a better chance of patterns being recognized. This is why pruned decision trees, which avoid the overfitting that unpruned trees are prone to, can be a better choice for learning.

As with the supervised vs. unsupervised learning discussion above, there are obvious trade-offs between pruning a tree and deciding against it.



All replies
2016-9-24 00:51:40
oliyiyi posted on 2016-9-23 09:26
Decision trees are created by making meaningful decisions as to where to mark boundaries on ranges o ...
Thanks for sharing!

2016-9-27 09:03:32
Thanks for sharing. I learned another term: pruned tree. It seems its scope is narrower.
