TY - JOUR
T1 - Decision trees based on 1-consequences
AU - Moshkov, Mikhail
N1 - KAUST Repository Item: Exported on 2021-07-27
Acknowledgements: Research reported in this publication was supported by King Abdullah University of Science and Technology (KAUST). The author is grateful to the anonymous reviewers for useful remarks and suggestions.
PY - 2021/7/20
Y1 - 2021/7/20
AB - In this paper, we study arbitrary infinite binary information systems, each of which consists of an infinite set of elements and an infinite set of two-valued non-constant functions (attributes) defined on the set of elements. We consider the notion of a problem over an information system, which is described by a finite number of attributes: for a given element, we should determine the values of these attributes. As algorithms for problem solving, we study decision trees that use arbitrary attributes from the considered infinite set of attributes and solve the problem based on 1-consequences. In such a tree, we take into account consequences each of which follows from a single equation of the kind “attribute = value” obtained during the decision tree’s work, and we ignore consequences that can be derived only from at least two such equations. As time complexity, we study the depth of decision trees. We prove that, in the worst case, with the growth of the number of attributes in the problem description, the minimum depth of decision trees based on 1-consequences grows either as a logarithm or linearly.
UR - http://hdl.handle.net/10754/670289
UR - https://linkinghub.elsevier.com/retrieve/pii/S0166218X21002766
U2 - 10.1016/j.dam.2021.07.017
DO - 10.1016/j.dam.2021.07.017
M3 - Article
SN - 0166-218X
VL - 302
SP - 208
EP - 214
JO - Discrete Applied Mathematics
JF - Discrete Applied Mathematics
ER -