Discover the power of Random Forest, one of the most widely used ensemble machine learning techniques, in this detailed and easy-to-understand tutorial in Tamil!

Join us now and transform your knowledge with practical examples and hands-on coding.

Our Website:
Visit 🔗 http://www.skillfloor.com

Our Blogs:
Visit 🔗 https://skillfloor.com/blog/

DEVELOPMENT TRAINING IN CHENNAI
https://skillfloor.com/development-training-in-chennai

DEVELOPMENT TRAINING IN COIMBATORE
https://skillfloor.com/development-training-in-coimbatore

Our Development Courses:
Certified Python Developer
Visit 🔗 https://skillfloor.com/certified-python-developer
Certified Database Developer
Visit 🔗 https://skillfloor.com/certified-data-base-developer
Certified Android App Developer
Visit 🔗 https://skillfloor.com/certified-android-app-developer
Certified iOS App Developer
Visit 🔗 https://skillfloor.com/certified-ios-app-developer
Certified Flutter Developer
Visit 🔗 https://skillfloor.com/certified-flutter-developer
Certified Full Stack Developer
Visit 🔗 https://skillfloor.com/certified-full-stack-developer
Certified Front End Developer
Visit 🔗 https://skillfloor.com/certified-front-end-developer

Our Classroom Locations:
Bangalore - https://maps.app.goo.gl/ZKTSJNCKTihQqfgx6
Chennai - https://maps.app.goo.gl/36gvPAnwqVWWoWD47
Coimbatore - https://maps.app.goo.gl/BvEpAWtdbDUuTf1G6
Hyderabad - https://maps.app.goo.gl/NyPwrN35b3EoUDHCA
Ahmedabad - https://maps.app.goo.gl/uSizg8qngBMyLhC76
Pune - https://maps.app.goo.gl/JbGVtDgNQA7hpJYj9

Our Additional Courses:
Analytics Course
https://skillfloor.com/analytics-courses
https://skillfloor.com/analytics-training-in-bangalore
Artificial Intelligence Course
https://skillfloor.com/artificial-intelligence-courses
https://skillfloor.com/artificial-intelligence-training-in-bangalore
Data Science Course
https://skillfloor.com/data-science-courses
https://skillfloor.com/data-science-course-in-bangalore
Digital Marketing
https://skillfloor.com/digital-marketing-courses
https://skillfloor.com/digital-marketing-courses-in-bangalore
Ethical Hacking
https://skillfloor.com/ethical-hacking-courses
https://skillfloor.com/cyber-security-training-in-bangalore

#randomforest #machinelearning #pythontutorial #tamilcoding #skillfloor #datascience #pythonintamil #ensemblelearning #mlalgorithm #decisiontree #datasciencetamil #machinelearningtamil #pythonprogramming #mlmodels #techeducation #codingintamil #dataanalysis #supervisedlearning #pythoncourse #tamiltech
Transcript
00:00 Hello everyone. In this video we will talk about the Random Forest ensemble learning technique.
00:07 Let's talk about the Random Forest bagging algorithm.
00:14 Random Forest is an ensemble bagging algorithm: we combine the outputs of multiple models and predict the final outcome from them.
00:23 For example, consider a dataset with 100 rows and 20 features.
00:30 In the basic bagging technique we arrange multiple models, so in this case we create n models.
00:41 In Random Forest, the base model we use is the decision tree.
00:51 Normally, in the bagging technique, we can use different algorithms as the base models; in Random Forest we build only decision trees.
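To make that distinction concrete, here is a minimal sketch assuming scikit-learn (no code is shown at this point in the video, and the toy data is made up): plain bagging can wrap any base algorithm, while Random Forest always bags decision trees.

import numpy as np
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))    # toy dataset: 100 rows, 20 features
y = rng.integers(0, 2, size=100)  # toy binary labels

# Generic bagging: any base algorithm can be bagged, e.g. k-nearest neighbours
bag = BaggingClassifier(KNeighborsClassifier(), n_estimators=10).fit(X, y)

# Random Forest: the base model is always a decision tree
forest = RandomForestClassifier(n_estimators=10).fit(X, y)

print(bag.predict(X[:1]), forest.predict(X[:1]))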
01:01 So we build decision tree 1 up to decision tree n.
01:04 The first decision tree is given sample 1: for example, rows up to the 50th row, along with a subset of the 20 feature columns.
01:20 The second model, M2, is given sample 2: for example, the 35th to the 50th rows, with the 10th to the 20th feature columns.
01:33 The nth model is given sample n: for example, the 40th to the 50th rows, with the 3rd to the 11th feature columns.
01:44 So each model is provided with a different set of samples and features.
02:00 This row sampling together with column (feature) sampling of the training data is what we call bootstrapping.
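A minimal sketch of that row and column sampling in plain NumPy (the number of rows and features drawn per tree are illustrative assumptions, not values from the video):

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))    # toy dataset: 100 rows, 20 features
y = rng.integers(0, 2, size=100)

n_trees = 3
for i in range(n_trees):
    # Row sampling: draw rows with replacement (a bootstrap sample)
    row_idx = rng.choice(100, size=100, replace=True)
    # Column (feature) sampling: draw a random subset of the 20 features
    col_idx = rng.choice(20, size=10, replace=False)
    X_sample, y_sample = X[row_idx][:, col_idx], y[row_idx]
    print(f"tree {i + 1}: sample shape {X_sample.shape}, features {sorted(col_idx.tolist())}")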
02:12 Now let's take a classification problem where we deal with data on whether a particular person is diabetic, considering different features.
02:25 For this classification problem, every one of the decision trees is a decision tree classifier.
02:34 By default, the number of decision trees is 100.
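A minimal sketch of that setup, assuming scikit-learn (the diabetes-style data here is synthetic; n_estimators=100 is scikit-learn's default, matching the 100 trees mentioned above, and scikit-learn combines the trees by averaging their class probabilities, which plays the same role as the voting described next):

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 20))    # 100 people, 20 features (synthetic)
y = rng.integers(0, 2, size=100)  # 1 = diabetic, 0 = not diabetic (toy labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# 100 decision tree classifiers, each trained on bootstrapped rows and a random feature subset
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# Combined prediction of all 100 trees for the first test person
print(model.predict(X_test[:1]))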
02:47 So for a particular person, the first model was trained on a specific subset of the data: say rows up to the 20th row and the 5th to the 15th feature columns.
03:00 We pass that person's test data to model 1, and it predicts that the person is diabetic.
03:05 The second model predicts that the person is not diabetic.
03:10 The third model, and likewise the final model, also predict that the person is not diabetic.
03:20 So the individual outputs are yes or no, and we apply maximum voting.
03:27 Once the results are combined, the final prediction is that the person is not diabetic.
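To show how that maximum (majority) voting combines the individual tree outputs, here is a small hand-rolled sketch (the yes/no votes are made up for illustration):

from collections import Counter

# Hypothetical predictions from the individual decision trees for one person
tree_votes = ["yes", "no", "no"]   # model 1, model 2, ..., model n

# Maximum voting: the class predicted by the most trees wins
final_prediction = Counter(tree_votes).most_common(1)[0][0]
print(final_prediction)            # -> "no", i.e. not diabetic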
03:38 If we are dealing with a regression problem, the decision tree can handle that too: we use the decision tree regressor.
03:46 Instead of maximum voting, we apply average voting.
03:56 So maximum voting is what we consider for a classification problem, and average voting for a regression problem.
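A minimal regression counterpart, again assuming scikit-learn (synthetic data; here each tree's numeric prediction is averaged rather than voted on):

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)
X = rng.normal(size=(100, 20))
y = X[:, 0] * 3.0 + rng.normal(scale=0.1, size=100)   # toy continuous target

# Decision tree regressors are the base models; their outputs are averaged
reg = RandomForestRegressor(n_estimators=100, random_state=7)
reg.fit(X, y)

# The forest prediction is the average of the 100 trees' predictions
per_tree = [tree.predict(X[:1])[0] for tree in reg.estimators_]
print(reg.predict(X[:1])[0], np.mean(per_tree))       # these two values agree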
04:02 So that is how the basic Random Forest works.
04:10 This is the Random Forest ensemble technique.
04:12 We will see more in the next video.