Algorithm - Generate a decision tree from the given training data.
Input - The training samples, represented by discrete-valued attributes; the set of candidate attributes, attribute-list.
Output - A decision tree.
Method:
- Create a node N.
- If the samples are all of the same class, C, then
- Return N as a leaf node labeled with the class C.
- If attribute-list is empty then
- Return N as a leaf node labeled with the most common class in samples (majority voting).
- Select test-attribute, the attribute in attribute-list with the highest information gain.
- Label node N with test-attribute.
- For each known value ai of test-attribute, partition the samples:
- Grow a branch from node N for the condition test-attribute = ai.
- Let Si be the set of samples in samples for which test-attribute = ai (a partition).
- If Si is empty then
- Attach a leaf labeled with the most common class in samples.
- Else attach the node returned by recursively calling the algorithm on (Si, attribute-list minus test-attribute).
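The steps above are the basic ID3-style procedure. Below is a minimal, runnable Python sketch of them. The data representation (each sample as a dict with a "class" key, plus a known_values map listing every known value of each attribute) and the helper names (entropy, info_gain, majority_class, generate_decision_tree) are illustrative assumptions, not part of the original text.

```python
from collections import Counter
from math import log2


def entropy(samples):
    # Entropy of the class distribution in samples.
    counts = Counter(s["class"] for s in samples)
    total = len(samples)
    return -sum((c / total) * log2(c / total) for c in counts.values())


def info_gain(samples, attribute):
    # Information gain of partitioning samples on attribute.
    total = len(samples)
    remainder = 0.0
    for value in set(s[attribute] for s in samples):
        subset = [s for s in samples if s[attribute] == value]
        remainder += len(subset) / total * entropy(subset)
    return entropy(samples) - remainder


def majority_class(samples):
    # Most common class in samples (majority voting).
    return Counter(s["class"] for s in samples).most_common(1)[0][0]


def generate_decision_tree(samples, attribute_list, known_values):
    # known_values maps each attribute to all of its known values, so a branch
    # is grown for every value even when no sample falls into that partition.
    classes = set(s["class"] for s in samples)
    if len(classes) == 1:
        return classes.pop()                    # leaf labeled with class C
    if not attribute_list:
        return majority_class(samples)          # leaf via majority voting
    # Select the test attribute with the highest information gain.
    test_attribute = max(attribute_list, key=lambda a: info_gain(samples, a))
    node = {test_attribute: {}}
    remaining = [a for a in attribute_list if a != test_attribute]
    for value in known_values[test_attribute]:
        si = [s for s in samples if s[test_attribute] == value]
        if not si:
            # Empty partition: attach a leaf with the parent's most common class.
            node[test_attribute][value] = majority_class(samples)
        else:
            node[test_attribute][value] = generate_decision_tree(si, remaining, known_values)
    return node
```

A short run on a made-up weather-style dataset (purely for illustration); the tree is returned as nested dicts, with leaves holding class labels:

```python
samples = [
    {"outlook": "sunny",    "windy": "false", "class": "no"},
    {"outlook": "sunny",    "windy": "true",  "class": "no"},
    {"outlook": "overcast", "windy": "false", "class": "yes"},
    {"outlook": "rainy",    "windy": "false", "class": "yes"},
    {"outlook": "rainy",    "windy": "true",  "class": "no"},
]
known = {"outlook": ["sunny", "overcast", "rainy"], "windy": ["true", "false"]}
print(generate_decision_tree(samples, ["outlook", "windy"], known))
```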