Their high interpretability and ease of understanding have made decision trees one of
the most widely used machine learning algorithms. The key to building efficient and
effective decision trees is choosing a suitable splitting method. This paper proposes
a new splitting approach that builds the tree using the T-entropy criterion. The
proposed method is evaluated on three data sets using 11 evaluation criteria. The
results show that the proposed method produces more accurate decision trees than the
well-known Gini index and the Shannon, Tsallis, and Rényi entropies, and can serve as
an alternative approach for constructing decision trees.
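For reference, the baseline impurity measures the abstract compares against can be sketched as follows; this is a minimal illustration of the standard definitions (the T-entropy criterion itself is the paper's contribution and is not reproduced here), with parameter values `q` and `alpha` chosen arbitrarily for the example:

```python
import numpy as np

def gini(p):
    # Gini impurity: 1 - sum_i p_i^2
    return 1.0 - np.sum(p ** 2)

def shannon(p):
    # Shannon entropy: -sum_i p_i * log2(p_i), skipping zero probabilities
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def tsallis(p, q=2.0):
    # Tsallis entropy with parameter q != 1; recovers Shannon as q -> 1
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def renyi(p, alpha=2.0):
    # Renyi entropy with parameter alpha != 1; recovers Shannon as alpha -> 1
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

# Class-probability vector at a candidate tree node (illustrative values)
probs = np.array([0.5, 0.3, 0.2])
print(gini(probs), shannon(probs), tsallis(probs), renyi(probs))
```

In a decision tree, whichever measure is used, a split is chosen to maximize the reduction in the weighted impurity of the child nodes relative to the parent node.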