When I said that decision trees were simple, I did not mean that they are a simplistic form of ML; I meant that they are easily understood. They are also transparent in the sense that we can trace how we arrived at a decision.
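To illustrate that transparency, here is a minimal sketch (my own illustration, not something from the webcast) that prints the learned splits of a fitted tree as readable rules, using scikit-learn and the iris dataset:

```python
# Minimal sketch: inspect a fitted decision tree's rules (assumes scikit-learn).
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(iris.data, iris.target)

# Print the learned splits as human-readable if/else rules.
print(export_text(tree, feature_names=iris.feature_names))
```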
When you ask about situations where decision trees are not sufficient, you probably mean that the accuracy is not high enough.
There could be many reasons:
- Data quality
- Not enough data / data not representative of the problem domain
- Too many attributes
- A tree that overfits (see the sketch after this list)
and probably more.
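On the overfitting point, one simple thing to try before switching algorithms is to constrain the tree and compare cross-validated accuracy. This is a minimal sketch with scikit-learn on synthetic data (the parameter values are just my illustration, not a recommendation):

```python
# Minimal sketch: compare an unconstrained tree against depth-limited ones.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

for depth in (None, 5, 3):
    tree = DecisionTreeClassifier(max_depth=depth, min_samples_leaf=5, random_state=0)
    score = cross_val_score(tree, X, y, cv=5).mean()
    print(f"max_depth={depth}: mean CV accuracy = {score:.3f}")
```

If the shallower trees generalize as well as (or better than) the deep one, overfitting was likely part of the problem.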
I did mention in the webcast that there are other algorithms that build on decision trees and can improve accuracy, for example random forests and gradient boosted trees.
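If you want a quick feel for the difference, here is a minimal sketch (again my own illustration, using scikit-learn and synthetic data) comparing a single tree against those two ensemble variants:

```python
# Minimal sketch: single tree vs. tree ensembles on a synthetic problem.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for name, model in [
    ("single decision tree", DecisionTreeClassifier(random_state=0)),
    ("random forest", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("gradient boosted trees", GradientBoostingClassifier(random_state=0)),
]:
    accuracy = model.fit(X_train, y_train).score(X_test, y_test)
    print(f"{name}: test accuracy = {accuracy:.3f}")
```

Your mileage will vary with the data, but the ensembles typically recover accuracy that a single tree leaves on the table.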
Of course, there are multiple ways to solve a problem, so other approaches could be taken, including neural networks. Still, don't give up on decision trees too fast :-)
------------------------------
Jacques Roy
------------------------------
Original Message:
Sent: 03-22-2019 01:03 PM
From: Henri Ajenstat
Subject: Webcast Into Data Science: Understanding Decision Trees Follow-up
Hi Jacques,
In the webinar, you said, if I recall correctly, that decision trees are a simple form of ML. What if decision trees are not sufficient? How should one proceed in that case?
Thanks!
------------------------------
Henri Ajenstat
------------------------------
Original Message:
Sent: 03-22-2019 01:00 PM
From: Jacques Roy
Subject: Webcast Into Data Science: Understanding Decision Trees Follow-up
Thank you everyone for attending today's webcast! In this webcast we covered decision trees:
- What are they?
- What are they good for?
- Limitations of decision trees
- Types of decision trees
- A demo showing the use of decision trees for customer churn prediction
The replay can be found here. If you have any questions, please feel free to post them in this thread.
------------------------------
Jacques Roy
------------------------------
#Askadatascientist
#GlobalAIandDataScience
#GlobalDataScience