This repository chronicles my exploration of the fascinating world of Machine Learning (ML). It documents my journey as I progress from a curious novice to, hopefully, a capable practitioner who can leverage data to build impactful models and contribute to the advancement of AI.
My journey begins with the very basics: understanding how data can be used to learn and make predictions. From there, I'll delve into various algorithms, tools, and libraries, gradually working my way up to complex deep learning architectures. Throughout this process, I'll share my learnings, projects, experiments, and failures (because let's face it, they're part of the learning curve!).
I'll make additions from time to time as I build more models with different training methods. So far I have trained two models on the same dataset, and comparing their **MAE (Mean Absolute Error)** shows that the XGBoost method produces a noticeably more accurate model. I first used the Decision Tree method, which was okay, but look at the difference between the two methods' MAEs: Decision Tree Regressor's MAE = 27,283 | XGBoost Regressor's MAE = 17,101.58.
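The comparison above can be sketched as follows. Since the repo's actual dataset and training code aren't shown here, this uses a synthetic dataset and scikit-learn's `GradientBoostingRegressor` as a stand-in for `xgboost.XGBRegressor` (both implement gradient-boosted trees), so the numbers will differ from the MAEs quoted above:

```python
# Minimal sketch: compare a single decision tree against a boosted ensemble
# on synthetic data, scoring both with Mean Absolute Error (MAE).
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

# Hypothetical dataset standing in for the one used in the repo
X, y = make_regression(n_samples=1000, n_features=8, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A single tree tends to overfit; boosting combines many shallow trees
tree = DecisionTreeRegressor(random_state=0).fit(X_train, y_train)
boost = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

mae_tree = mean_absolute_error(y_test, tree.predict(X_test))
mae_boost = mean_absolute_error(y_test, boost.predict(X_test))
print(f"Decision Tree MAE:     {mae_tree:.2f}")
print(f"Gradient Boosting MAE: {mae_boost:.2f}")
```

If you have the `xgboost` package installed, swapping `GradientBoostingRegressor()` for `xgboost.XGBRegressor()` gives the setup described above.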
**MAE (Mean Absolute Error)** is a measure of the average size of the errors in a collection of predictions, without taking their direction into account.
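In other words, MAE averages the absolute differences between actual and predicted values. A tiny hand-computed example (the numbers here are made up for illustration):

```python
# MAE = average of |actual - predicted|; the sign of each error is ignored
actual = [3.0, -0.5, 2.0, 7.0]
predicted = [2.5, 0.0, 2.0, 8.0]

errors = [abs(a - p) for a, p in zip(actual, predicted)]  # [0.5, 0.5, 0.0, 1.0]
mae = sum(errors) / len(errors)
print(mae)  # 0.5
```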