Update README.md

Fixed a typo in the "Features" section.
Added links to the C4.5 and ID3 algorithms to aid readers unfamiliar with the subject.
Also added a link to the MIT License.
Embedded URLs inside the text for a cleaner reading experience.
Author: ahadc
Date: 2018-03-24 16:28:59 -07:00
Committed by: GitHub
Commit: f4eef94662
Parent: 7481ea98cf

@@ -1,22 +1,22 @@
 # Decision Tree
-A Ruby library which implements ID3 (information gain) algorithm for decision tree learning. Currently, continuous and discrete datasets can be learned.
+A Ruby library which implements [ID3 (information gain)](https://en.wikipedia.org/wiki/ID3_algorithm) algorithm for decision tree learning. Currently, continuous and discrete datasets can be learned.
 - Discrete model assumes unique labels & can be graphed and converted into a png for visual analysis
 - Continuous looks at all possible values for a variable and iteratively chooses the best threshold between all possible assignments. This results in a binary tree which is partitioned by the threshold at every step. (e.g. temperate > 20C)
 ## Features
 - ID3 algorithms for continuous and discrete cases, with support for inconsistent datasets.
-- Graphviz component to visualize the learned tree (http://rockit.sourceforge.net/subprojects/graphr/)
-- Support for multiple, and symbolic outputs and graphing of continuos trees.
+- [Graphviz component](http://rockit.sourceforge.net/subprojects/graphr/) to visualize the learned tree
+- Support for multiple, and symbolic outputs and graphing of continuous trees.
 - Returns default value when no branches are suitable for input
 ## Implementation
-- Ruleset is a class that trains an ID3Tree with 2/3 of the training data, converts it into a set of rules and prunes the rules with the remaining 1/3 of the training data (in a C4.5 way).
+- Ruleset is a class that trains an ID3Tree with 2/3 of the training data, converts it into a set of rules and prunes the rules with the remaining 1/3 of the training data (in a [C4.5](https://en.wikipedia.org/wiki/C4.5_algorithm) way).
 - Bagging is a bagging-based trainer (quite obvious), which trains 10 Ruleset trainers and when predicting chooses the best output based on voting.
-Blog post with explanation & examples: http://www.igvita.com/2007/04/16/decision-tree-learning-in-ruby/
+[Blog post with explanation & examples](http://www.igvita.com/2007/04/16/decision-tree-learning-in-ruby/)
 ## Example
@@ -68,4 +68,4 @@ puts "Predicted: #{decision} ... True decision: #{test.last}"
 ## License
-The MIT License - Copyright (c) 2006 Ilya Grigorik
+The [MIT License](https://opensource.org/licenses/MIT) - Copyright (c) 2006 Ilya Grigorik
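
For context on the Implementation section touched above: Ruleset wraps an ID3Tree with C4.5-style rule pruning, and Bagging trains 10 Rulesets and predicts by majority vote. Below is a minimal Ruby sketch of how those trainers are typically driven. It assumes the gem's `DecisionTree::ID3Tree`, `DecisionTree::Ruleset`, and `DecisionTree::Bagging` classes share an `(attributes, data, default, :continuous)` constructor and a `train`/`predict` interface; the Ruleset and Bagging calls in particular are assumptions for illustration, not part of this diff.

```ruby
require 'decisiontree'

attributes = ['Temperature']
training = [
  [36.6, 'healthy'],
  [37.0, 'sick'],
  [38.5, 'sick'],
  [37.6, 'sick'],
  [36.4, 'healthy'],
]

# Continuous ID3 tree: each node picks the best threshold split
# over the observed values (e.g. Temperature > 37.0).
dec_tree = DecisionTree::ID3Tree.new(attributes, training, 'sick', :continuous)
dec_tree.train

test = [37.5, 'sick']
decision = dec_tree.predict(test)
puts "Predicted: #{decision} ... True decision: #{test.last}"

# Ruleset: trains on 2/3 of the data, converts the tree to rules,
# prunes them with the remaining 1/3 (C4.5-style). Assumed API.
ruleset = DecisionTree::Ruleset.new(attributes, training, 'sick', :continuous)
ruleset.train
puts "Ruleset predicted: #{ruleset.predict(test)}"

# Bagging: trains 10 Rulesets and predicts by voting. Assumed API.
bagging = DecisionTree::Bagging.new(attributes, training, 'sick', :continuous)
bagging.train
puts "Bagging predicted: #{bagging.predict(test)}"
```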