English Abstract
In recent years, advances in the computing power of machines have given machine learning algorithms a new context in which to demonstrate their capabilities. Initially, the goal of machine learning was to find efficient solutions for data analysis, but over time it was confronted with new challenges. One of the most important was the growing size of data: data became massive and were generated at such speed that it was impossible to store them in memory. Such data came to be known as data streams, and machine learning had to develop new methods capable of managing them. Various methods have been introduced, each with its own pros and cons, but among them decision trees are one of the most popular algorithms because of their high ability to represent data structures. Unfortunately, decision trees suffer from an instability problem. In this thesis we introduce the Cross Split Decision Tree (CSDT), a new decision tree learning algorithm with improved stability. Unlike classical decision tree learning algorithms, which test a single attribute in each internal node, the new algorithm uses multiple attributes as the split test. The experimental results show that, in comparison with well-known decision tree learning algorithms, the proposed algorithm creates shallower and smaller decision trees with comparable accuracy.
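The contrast between a single-attribute split test and a multi-attribute one can be sketched as follows. This is a minimal illustrative example, not the thesis's actual CSDT algorithm: the function names, thresholds, and the choice of exactly two attributes are assumptions made for illustration only.

```python
def single_attribute_split(sample, attr, threshold):
    """Classical decision-tree test: route a sample on one attribute."""
    return "left" if sample[attr] <= threshold else "right"

def cross_split(sample, attr_a, thr_a, attr_b, thr_b):
    """Hypothetical two-attribute test: the pair of threshold outcomes
    selects one of four children, so a single node does the work of two
    stacked single-attribute nodes, yielding a shallower tree."""
    a = sample[attr_a] <= thr_a
    b = sample[attr_b] <= thr_b
    return {(True, True): 0, (True, False): 1,
            (False, True): 2, (False, False): 3}[(a, b)]

sample = {"x1": 0.3, "x2": 0.9}
print(single_attribute_split(sample, "x1", 0.5))   # -> left
print(cross_split(sample, "x1", 0.5, "x2", 0.5))   # -> 1 (x1 low, x2 high)
```

Because each internal node branches four ways instead of two, a tree of depth d can separate regions that a binary single-attribute tree would need up to depth 2d to express, which is consistent with the shallower trees reported above.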