<html><head><title>Generated Documentation</title></head><body> <img src="headerimage.png"> <br><br><table><tr><td><big><big><big style="font-family: arial;"><b>GDecisionTree</b></big></big></big><br>extends <a href="type_GSupervisedLearner.html">GSupervisedLearner</a><br></td><td></td></tr></table><br><br><big><big><i>Constructors (public)</i></big></big><br><div style="margin-left: 40px;"><big><b>GDecisionTree</b></big>(<a href="type_GDecisionTree.html">GDecisionTree</a>* pThat, GDecisionTreeNode* pInterestingNode, GDecisionTreeNode* ppOutInterestingCopy)<br><div style="margin-left: 80px;"><font color=brown> Makes a deep copy of another decision tree. Also, if pInterestingNode is non-NULL, then ppOutInterestingCopy will return the node that is a copy of pInterestingNode</font></div><br><big><b>GDecisionTree</b></big>(<a href="type_GArffRelation.html">GArffRelation</a>* pRelation)<br><div style="margin-left: 80px;"><font color=brown> The tree is built automatically in the constructor</font></div><br></div><br><big><big><i>Destructors</i></big></big><br><div style="margin-left: 40px;"><big><b>~GDecisionTree</b></big>()<br></div><br><big><big><i>Virtual (public)</i></big></big><br><div style="margin-left: 40px;">void <big><b>Eval</b></big>(double* pRow)<br><div style="margin-left: 80px;"><font color=brown> Evaluates the input values in the provided row and deduces the output values</font></div><br>void <big><b>Train</b></big>(<a href="type_GArffData.html">GArffData</a>* pData)<br><div style="margin-left: 80px;"><font color=brown> Divides the provided data into two parts, trains with one part and prunes with the other. 
(Use SetTrainingPortion to set the ratio of the two parts.)</font></div><br></div><br><big><big><i>Public</i></big></big><br><div style="margin-left: 40px;">void <big><b>Print</b></big>()<br><div style="margin-left: 80px;"><font color=brown> Prints an ASCII representation of the tree to stdout</font></div><br>void <big><b>Prune</b></big>(<a href="type_GArffData.html">GArffData</a>* pValidationSet)<br><div style="margin-left: 80px;"><font color=brown> Performs all pruning that causes the tree to give better results on the validation set</font></div><br>void <big><b>SetTrainingPortion</b></big>(double d)<br><div style="margin-left: 80px;"><font color=brown> Specifies how much of the training data is used to build the tree. (The rest is used to prune the tree.)</font></div><br>void <big><b>TrainWithoutPruning</b></big>(<a href="type_GArffData.html">GArffData</a>* pTrainingData)<br><div style="margin-left: 80px;"><font color=brown> Trains using all of the provided data and doesn't do any pruning</font></div><br></div><br><big><big><i>Protected</i></big></big><br><div style="margin-left: 40px;">void <big><b>BuildNode</b></big>(GDecisionTreeNode* pNode, <a href="type_GArffData.html">GArffData</a>* pData, bool* pUsedAttributes)<br><div style="margin-left: 80px;"><font color=brown> A recursive helper method used to construct the decision tree</font></div><br>void <big><b>DeepPruneNode</b></big>(GDecisionTreeNode* pNode, <a href="type_GArffData.html">GArffData</a>* pValidationSet)<br><div style="margin-left: 80px;"><font color=brown> Tries pruning the children of pNode. 
If that improves the tree, makes the change permanent; otherwise recurses on all children of pNode</font></div><br>double <big><b>MeasureInfoGain</b></big>(<a href="type_GArffData.html">GArffData</a>* pData, int nAttribute, double* pPivot)<br><div style="margin-left: 80px;"><font color=brown> InfoGain is defined as the difference in entropy in the data before and after dividing it based on the specified attribute. For continuous attributes it uses the difference between the original variance and the sum of the variances of the two parts after dividing at the point that maximizes this value.</font></div><br></div><br></body></html>