
CourseWWWork

    31.1 Introduction to Decision Tree Krish Naik ML (12:42)
    31.2 Entropy and Gini Impurity Krish Naik ML (11:31)
    31.3 Information Gain Krish Naik ML (9:07)
    31.4 Entropy vs Gini Impurity Krish Naik ML (2:48)
    31.5 Decision Tree Split for Numerical Features Krish Naik ML (4:58)
    31.6 Post Pruning & Pre Pruning Krish Naik ML (8:23)
    31.7 Decision Tree Regression Krish Naik ML (21:21)
    31.8 Decision Tree Implementation Krish Naik ML (16:53)
    31.9 Decision Tree Prepruning Krish Naik ML (8:27)
    32.1 Bagging & Boosting Ensemble Techniques Krish Naik ML (14:32)
    32.2 Random Forest Regression Krish Naik ML (12:05)
    32.3 Problem Classification Krish Naik ML (3:15)
    32.4 Feature Engineering Part 01 Krish Naik ML (13:19)
    32.5 Feature Engineering Part 02 Krish Naik ML (8:49)
    32.6 Model Training Step Krish Naik ML (11:49)
    32.8 Feature Engineering Krish Naik ML (11:54)
    32.9 Model Training Krish Naik ML (6:57)
    33.1 Introduction to Adaboost ML Algorithm Krish Naik ML (11:50)
    33.2 Creating Decision Tree Stump Krish Naik ML (7:34)
    33.3 Performance of Decision Tree Stump Krish Naik ML (8:07)
    33.4 Updating Weights Krish Naik ML (5:19)
    33.5 Normalising Weights and Assigning Bins Krish Naik ML (6:12)
    33.6 Selecting New Datapoints for Next Tree Krish Naik ML (6:27)