00:00Hello everyone, in this video we will talk about the Random Forest ensemble learning technique.
00:14Random Forest is an ensemble bagging algorithm.
00:17We combine multiple models' outputs to predict the final outcome.
00:23For example, consider a data set with 100 rows and 20 features.
00:30In the basic bagging technique, we arrange multiple models; in this case, we create n models.
00:41In bagging generally we can combine different algorithms, but in Random Forest every base model is a decision tree.
01:01So we build decision tree 1 through decision tree n.
01:04To the first decision tree we provide sample 1: a subset of the rows along with a subset of the 20 features.
01:20Then to the second model, M2, we provide sample 2: say, the 35th row to the 50th row, along with feature subset 2, say the 10th column to the 20th column.
01:33Then to the nth model we provide sample Sn: say, the 40th row to the 50th row, along with features such as the 3rd column to the 11th column.
01:44So every model is provided with different samples and different features.
01:58So for every model we do the same row sampling and column sampling.
02:08This sampling of rows with replacement is called bootstrapping.
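The row-and-column sampling described above can be sketched as follows. This is a minimal illustration assuming NumPy; the 100-row / 20-feature shape follows the example data set, while the 50-row and 10-column sample sizes are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(42)
n_rows, n_features = 100, 20          # the example data set: 100 rows, 20 features
X = rng.normal(size=(n_rows, n_features))

def bootstrap_sample(X, n_row_samples=50, n_col_samples=10, rng=rng):
    """Draw rows with replacement (bootstrap) and a random subset of columns."""
    row_idx = rng.choice(X.shape[0], size=n_row_samples, replace=True)
    col_idx = rng.choice(X.shape[1], size=n_col_samples, replace=False)
    return X[np.ix_(row_idx, col_idx)], row_idx, col_idx

# each tree would be trained on its own such sample
sample, rows, cols = bootstrap_sample(X)
print(sample.shape)  # (50, 10)
```

Calling `bootstrap_sample` once per tree gives each model its own rows and features, which is what makes the trees in the forest differ from each other.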
02:13Now consider a classification problem: we deal with data on whether a particular person is diabetic, considering different features.
02:28Since it is a classification problem, each base model is a decision tree classifier.
02:34By default, the random forest builds 100 decision trees.
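That default of 100 trees can be checked directly in scikit-learn (assuming scikit-learn 0.22 or later, where `n_estimators` defaults to 100):

```python
from sklearn.ensemble import RandomForestClassifier

# with no arguments, the forest is built from 100 decision tree classifiers
clf = RandomForestClassifier()
print(clf.n_estimators)  # 100
```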
02:47Now suppose we take a particular person's test data and give it to the models.
03:00Model 1 predicts that the person is diabetic; the second model predicts the person is not diabetic; then the third model, and so on up to the final model, predict not diabetic.
03:17So the outputs are yes or no, and when we apply maximum (majority) voting, the results are combined and the particular person is predicted as not diabetic.
03:32If instead we deal with a regression problem, the decision tree can handle that too: we use the DecisionTreeRegressor.
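The majority-voting step above can be sketched with hypothetical per-tree outputs (the yes/no values here are illustrative, following the diabetic example):

```python
from collections import Counter

# hypothetical outputs of the individual decision trees for one person:
# model 1 says diabetic, models 2 and 3 say not diabetic
tree_predictions = ["yes", "no", "no"]

# maximum (majority) voting: the most common class wins
majority = Counter(tree_predictions).most_common(1)[0][0]
print(majority)  # no
```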
03:46So for a classification problem we apply maximum (majority) voting, and for a regression problem we apply average voting.
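For regression, the average voting means the forest's prediction is the mean of the individual trees' predictions. A quick sketch with scikit-learn on synthetic data (`make_regression` and the small tree count are just for illustration):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# synthetic data mirroring the 100-row / 20-feature example
X, y = make_regression(n_samples=100, n_features=20, random_state=0)

reg = RandomForestRegressor(n_estimators=10, random_state=0)
reg.fit(X, y)

# the forest's output equals the mean of its trees' outputs (average voting)
forest_pred = reg.predict(X[:1])[0]
tree_preds = [tree.predict(X[:1])[0] for tree in reg.estimators_]
print(np.isclose(forest_pred, np.mean(tree_preds)))  # True
```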
04:02So this is how the basic random forest works; this is the Random Forest ensemble technique.
04:12We will see more in the next video.