Python - Splitting Criteria: Entropy | Python Courses in Tamil | Skillfloor
Skillfloor
2 months ago
Understand the concept of Entropy, a fundamental splitting criterion used in Decision Tree algorithms, with this easy-to-follow tutorial in Tamil!
Our Website:
Visit 🔗 http://www.skillfloor.com
Our Blogs:
Visit 🔗 https://skillfloor.com/blog/
DEVELOPMENT TRAINING IN CHENNAI
https://skillfloor.com/development-training-in-chennai
DEVELOPMENT TRAINING IN COIMBATORE
https://skillfloor.com/development-training-in-coimbatore
Our Development Courses:
Certified Python Developer
Visit 🔗 https://skillfloor.com/certified-python-developer
Certified Database Developer
Visit 🔗 https://skillfloor.com/certified-data-base-developer
Certified Android App Developer
Visit 🔗 https://skillfloor.com/certified-android-app-developer
Certified iOS App Developer
Visit 🔗 https://skillfloor.com/certified-ios-app-developer
Certified Flutter Developer
Visit 🔗 https://skillfloor.com/certified-flutter-developer
Certified Full Stack Developer
Visit 🔗 https://skillfloor.com/certified-full-stack-developer
Certified Front End Developer
Visit 🔗 https://skillfloor.com/certified-front-end-developer
Our Classroom Locations:
Bangalore - https://maps.app.goo.gl/ZKTSJNCKTihQqfgx6
Chennai - https://maps.app.goo.gl/36gvPAnwqVWWoWD47
Coimbatore - https://maps.app.goo.gl/BvEpAWtdbDUuTf1G6
Hyderabad - https://maps.app.goo.gl/NyPwrN35b3EoUDHCA
Ahmedabad - https://maps.app.goo.gl/uSizg8qngBMyLhC76
Pune - https://maps.app.goo.gl/JbGVtDgNQA7hpJYj9
Our Additional Courses:
Analytics Course
https://skillfloor.com/analytics-courses
https://skillfloor.com/analytics-training-in-bangalore
Artificial Intelligence Course
https://skillfloor.com/artificial-intelligence-courses
https://skillfloor.com/artificial-intelligence-training-in-bangalore
Data Science Course
https://skillfloor.com/data-science-courses
https://skillfloor.com/data-science-course-in-bangalore
Digital Marketing
https://skillfloor.com/digital-marketing-courses
https://skillfloor.com/digital-marketing-courses-in-bangalore
Ethical Hacking
https://skillfloor.com/ethical-hacking-courses
https://skillfloor.com/cyber-security-training-in-bangalore
#entropy #decisiontree #pythontutorial #machinelearning #tamilcoding #skillfloor #datascience #pythonintamil #mlalgorithm #splittingcriteria #classification #datasciencetamil #machinelearningtamil #pythonprogramming #mlmodels #techeducation #codingintamil #featureselection #pythoncourse
Category
📚
Learning
Transcript
00:00
Hello everyone. In this video we look at the splitting criterion called entropy. A decision tree uses it to decide how to split the data at each node.
00:12
Entropy measures the randomness, or impurity, in our data, calculated from the class probabilities.
00:32
The Gini split index also measures impurity and randomness, but it uses only the probabilities, whereas entropy combines each probability with its logarithm.
00:41
If the entropy equals zero, we have a pure split. To score a feature, we take the weighted average of the entropies of its branches, and from that we get the information gain.
01:06
With the Gini split index we choose the feature with the minimum value; with information gain we choose the column with the highest value, and that column becomes the root node.
01:20
The information gain formula is E(S), the entropy of the complete data set, minus the weighted average of the entropies of each feature's branches. Each feature is judged by its information gain.
01:37
First, consider the data set: Outlook (that is, the weather), then Temperature, Humidity and Windy, along with a target column, Play, whose values are yes or no.
01:53
In the target column there are 9 "yes" rows and 5 "no" rows, 14 rows in total, so we can work out the probabilities.
02:05
The entropy formula is minus the summation of the probability of i times log to the base 2 of the probability of i, where i runs over the classes of the target column.
02:20
Here that is minus the probability of yes times log2 of the probability of yes, minus the probability of no times log2 of the probability of no.
02:32
The probability of yes is 9/14 and the probability of no is 5/14. Substituting, we get 0.94; that is, the data has about 94% impurity.
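The entropy calculation just described can be sketched in Python. This is not code from the video; the `entropy` helper is my own, and the 9/5 counts are the Play column split mentioned above:

```python
import math

def entropy(counts):
    """Shannon entropy (in bits) of a class distribution given as counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total)
                for c in counts if c > 0)

# Target column "Play": 9 "yes" rows and 5 "no" rows out of 14.
print(round(entropy([9, 5]), 2))  # 0.94, i.e. about 94% impurity
```

The `if c > 0` guard skips empty classes, since log2(0) is undefined; by convention a zero-probability class contributes zero entropy.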
02:44
To choose the root node, we calculate the entropy for each feature, starting with Outlook.
02:56
Outlook takes the values sunny, overcast and rainy: there are 5 sunny rows, 4 overcast rows and 5 rainy rows.
03:07
First, separate out the sunny rows and calculate their entropy. Among the 5 sunny rows there are 3 "no" rows and 2 "yes" rows, so the probability of yes is 2/5 and the probability of no is 3/5.
03:27
The entropy is minus the probability of yes times log2 of the probability of yes, minus the probability of no times log2 of the probability of no, which comes to 0.971.
03:39
Next, the overcast rows. There are 4 of them, and they are all "yes", so the entropy is 0: a completely pure split.
03:55
The rainy rows are calculated the same way.
03:58
Now the weighted entropy: the probability of sunny times the entropy of sunny, plus the probability of overcast times the entropy of overcast, plus the probability of rainy times the entropy of rainy. (With the Gini index we would use the Gini value of each branch instead of its entropy.)
04:26
Calculating this completely gives a weighted entropy of about 0.694.
04:33
The entropy of the whole data set was 0.94, so the information gain of Outlook is 0.94 minus 0.694, about 0.247.
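The Outlook calculation can be reproduced with a short sketch. The branch counts are the sunny/overcast/rainy yes-no splits from the play-tennis data above; the helper names are my own:

```python
import math

def entropy(counts):
    """Shannon entropy (in bits) of a class distribution given as counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total)
                for c in counts if c > 0)

# (yes, no) counts in each Outlook branch of the 14-row play-tennis data.
branches = {"sunny": (2, 3), "overcast": (4, 0), "rainy": (3, 2)}
n = sum(sum(c) for c in branches.values())  # 14 rows in total

# Weighted entropy: each branch's entropy weighted by its share of the rows.
weighted = sum(sum(c) / n * entropy(c) for c in branches.values())
# Information gain: entropy of the full data set minus the weighted entropy.
gain = entropy((9, 5)) - weighted

print(round(weighted, 3), round(gain, 3))  # 0.694 0.247
```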
04:52
Next, the Temperature column. Its values are hot, mild and cool, so separate the rows for each value and calculate the entropy of each group.
05:09
After that we take the weighted average. There are 4 hot rows in total, with 2 "yes" and 2 "no", so the probability of yes is 0.5 and the probability of no is 0.5.
05:27
Do the same for the mild and cool rows, then calculate the weighted entropy: the probability of hot times the entropy of hot, plus the probability of mild times the entropy of mild, plus the probability of cool times the entropy of cool.
05:49
Then calculate the information gain: the entropy of the entire data set minus the weighted entropy. The weighted entropy is 0.911, so the gain is 0.94 minus 0.911, about 0.029.
06:01
Similarly, choose the Humidity column, whose levels are high and normal. Split on it the same way, calculate the weighted entropy for the high and normal groups, and get the information gain.
06:22
That is the third column; the fourth column, Windy, is handled with the same calculation.
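Putting the per-column steps together, one way to rank all four features at once is sketched below. The 14-row table is the standard play-tennis data set the video appears to use, encoded here one character per row (my own shorthand, not the course's code):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, target):
    """Entropy of the target minus the weighted entropy after splitting on feature."""
    n = len(target)
    groups = {}
    for value, label in zip(feature, target):
        groups.setdefault(value, []).append(label)
    weighted = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(target) - weighted

# Standard 14-row play-tennis data, one character per row.
# Outlook: S=sunny O=overcast R=rainy; Temp: H=hot M=mild C=cool;
# Humidity: H=high N=normal; Windy: W=weak S=strong; Play: Y/N.
data = {
    "Outlook":  "SSORRROSSRSOOR",
    "Temp":     "HHHMCCCMCMMMHM",
    "Humidity": "HHHHNNNHNNNHNH",
    "Windy":    "WSWWWSSWWWSSWS",
}
play = "NNYYYNYNYYYYYN"

gains = {col: information_gain(vals, play) for col, vals in data.items()}
root = max(gains, key=gains.get)
print(root)  # Outlook has the highest gain, so it becomes the root
```

Running this gives gains of roughly 0.247 (Outlook), 0.029 (Temperature), 0.152 (Humidity) and 0.048 (Windy), matching the hand calculation.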
06:46
The highest information gain belongs to Outlook, so Outlook becomes the root node.
06:55
Outlook splits into sunny, overcast and rainy. The overcast branch has an entropy of 0, so it is already pure.
07:01
The splitting therefore continues on the sunny data and the rainy data.
07:08
For the sunny branch, the Humidity column decides yes or no; for the rainy branch, the Windy data, strong or weak, decides it.
07:18
So we create the root node, and for each further split we again calculate the entropies, find the feature whose information gain is higher, and choose it as the second node.
07:30
This is how the tree is created using entropy. We will continue in the next video. Thank you.
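The video leaves the Python implementation for a later lesson; as a hedged preview, scikit-learn's `DecisionTreeClassifier` accepts `criterion="entropy"` to apply exactly this splitting rule. The hand label-encoding below is an illustrative sketch of the same play-tennis data, not the course's code:

```python
from sklearn.tree import DecisionTreeClassifier

# Play-tennis rows, label-encoded by hand:
# outlook 0=sunny 1=overcast 2=rainy; temp 0=hot 1=mild 2=cool;
# humidity 0=high 1=normal; windy 0=weak 1=strong.
X = [[0,0,0,0],[0,0,0,1],[1,0,0,0],[2,1,0,0],[2,2,1,0],[2,2,1,1],[1,2,1,1],
     [0,1,0,0],[0,2,1,0],[2,1,1,0],[0,1,1,1],[1,1,0,1],[1,0,1,0],[2,1,0,1]]
y = [0,0,1,1,1,0,1,0,1,1,1,1,1,0]  # play: 1 = yes, 0 = no

# criterion="entropy" makes each split maximise information gain.
clf = DecisionTreeClassifier(criterion="entropy", random_state=0)
clf.fit(X, y)
print(clf.score(X, y))  # the tree separates the 14 distinct training rows perfectly
```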