DSpace Repository

HHCART: An Oblique Decision Tree

dc.contributor.author Wickramarachchi, D.C.
dc.contributor.author Robertson, B.L.
dc.contributor.author Reale, M.
dc.contributor.author Price, C.J.
dc.contributor.author Brown, J.
dc.date.accessioned 2017-04-04T07:54:11Z
dc.date.available 2017-04-04T07:54:11Z
dc.date.issued 2015
dc.identifier.citation Wickramarachchi, D.C., Robertson, B.L., Reale, M., Price, C.J., & Brown, J. (2015). HHCART: An Oblique Decision Tree. Computational Statistics and Data Analysis. doi: 10.1016/j.csda.2015.11.006 en_US, si_LK
dc.identifier.issn 0167-9473
dc.identifier.uri http://dr.lib.sjp.ac.lk/handle/123456789/4830
dc.description.abstract Decision trees are a popular technique in statistical data classification. They recursively partition the feature space into disjoint sub-regions until each sub-region becomes homogeneous with respect to a particular class. The basic Classification and Regression Tree (CART) algorithm partitions the feature space using axis-parallel splits. When the true decision boundaries are not aligned with the feature axes, this approach can produce a complicated boundary structure. Oblique decision trees use oblique decision boundaries to potentially simplify the boundary structure. The major limitation of this approach is that the tree induction algorithm is computationally expensive. Hence, as an alternative, a new decision tree algorithm called HHCART is presented. The method uses a series of Householder matrices to reflect the training data at each non-terminal node during tree construction. Each reflection is based on the directions of the eigenvectors from each class's covariance matrix. Considering axis-parallel splits in the reflected training data provides an efficient way of finding oblique splits in the unreflected training data. Experimental results show that the accuracy and size of HHCART trees are comparable with some benchmark methods. The appealing feature of HHCART is that it can handle both qualitative and quantitative features in the same oblique split. en_US, si_LK
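The reflect-then-split idea described in the abstract can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the authors' implementation: the function names, the choice of Gini impurity as the split criterion, and the single-reflection setup are assumptions made for the example. The key fact it demonstrates is that a Householder matrix H mapping a class's dominant eigenvector onto the first coordinate axis turns an axis-parallel split in the reflected data into an oblique split in the original data.

```python
import numpy as np

def householder_reflect(X, y, target_class):
    """Reflect X so the dominant eigenvector of the target class's
    covariance matrix aligns with the first coordinate axis.
    (Sketch of the idea behind HHCART, not the paper's code.)"""
    cov = np.cov(X[y == target_class], rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
    d = eigvecs[:, np.argmax(eigvals)]              # unit dominant eigenvector
    e1 = np.zeros_like(d)
    e1[0] = 1.0
    if np.allclose(d, e1) or np.allclose(d, -e1):
        return X, np.eye(len(d))                    # already axis-aligned
    u = (d - e1) / np.linalg.norm(d - e1)
    H = np.eye(len(d)) - 2.0 * np.outer(u, u)       # Householder matrix: H @ d = e1
    return X @ H, H                                 # H is symmetric, so X @ H reflects each row

def best_axis_split(X, y, feature):
    """Best threshold on one feature by weighted Gini impurity (exhaustive sketch)."""
    def gini(labels):
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return 1.0 - np.sum(p ** 2)
    vals = np.unique(X[:, feature])
    best_score, best_t = np.inf, None
    for t in (vals[:-1] + vals[1:]) / 2.0:          # candidate thresholds: midpoints
        left, right = y[X[:, feature] <= t], y[X[:, feature] > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_score, best_t = score, t
    return best_score, best_t
```

An axis-parallel split on feature 0 of the reflected data, `X @ H[:, 0] <= t`, is an oblique split `H[:, 0] . x <= t` on the original data, which is the efficiency gain the abstract refers to.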
dc.language.iso en en_US, si_LK
dc.publisher Elsevier en_US, si_LK
dc.subject Oblique decision tree en_US, si_LK
dc.subject Data classification en_US, si_LK
dc.subject Statistical learning en_US, si_LK
dc.subject Householder reflection en_US, si_LK
dc.subject Machine learning en_US, si_LK
dc.title HHCART: An Oblique Decision Tree en_US, si_LK
dc.type Article en_US, si_LK

