In this video we discuss two methods commonly used to tune the hyperparameters of a statistical model: (1) grid search and (2) random search, along with the advantages and disadvantages of each.
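The contrast between the two strategies can be sketched in plain Python. This is a minimal toy example, not code from the video: the `score` function below is a hypothetical stand-in for a cross-validated model score, and the hyperparameter names (`lr`, `depth`) and ranges are assumptions made for illustration.

```python
import itertools
import random

def score(lr, depth):
    # Hypothetical validation score: best near lr = 0.1, depth = 5.
    return -((lr - 0.1) ** 2) - 0.01 * (depth - 5) ** 2

# Grid search: evaluate every combination on a fixed grid of values.
lrs = [0.001, 0.01, 0.1, 1.0]
depths = [2, 3, 5, 8]
grid_best = max(itertools.product(lrs, depths), key=lambda p: score(*p))

# Random search: draw the same budget of configurations at random
# (learning rate sampled log-uniformly, depth uniformly).
random.seed(0)
candidates = [(10 ** random.uniform(-3, 0), random.randint(2, 8))
              for _ in range(len(lrs) * len(depths))]
rand_best = max(candidates, key=lambda p: score(*p))

print("grid best:", grid_best)
print("random best:", rand_best)
```

With the same evaluation budget, grid search is limited to the values you listed, while random search explores the continuous range, which is why it often finds better settings when only a few hyperparameters actually matter.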
References
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Grid search vs random search figure taken from: A kernel design approach to improve kernel subspace identification
Related Videos
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Covariance and Correlation Explained: • Covariance and Correla...
Eigendecomposition Explained: • Eigendecomposition Exp...
Multivariate Normal (Gaussian) Distribution Explained: • Multivariate Normal (G...
The Bessel's Correction: • Why We Divide by N-1 i...
Gradient Boosting with Regression Trees Explained: • Gradient Boosting with...
P-Values Explained: • P-Values Explained | P...
Kabsch-Umeyama Algorithm: • Kabsch-Umeyama Algorit...
Contents
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
00:00 - Hyperparameters Intro
01:04 - Grid Search
01:34 - Random Search
02:08 - Pros and cons
02:45 - Outro
Follow Me
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
🐦 Twitter: @datamlistic
📸 Instagram: @datamlistic
📱 TikTok: @datamlistic
Channel Support
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
The best way to support the channel is to share the content. ;)
If you'd like to also support the channel financially, donating the price of a coffee is always warmly welcomed! (completely optional and voluntary)
► Patreon: / datamlistic
► Bitcoin (BTC): 3C6Pkzyb5CjAUYrJxmpCaaNPVRgRVxxyTq
► Ethereum (ETH): 0x9Ac4eB94386C3e02b96599C05B7a8C71773c9281
► Cardano (ADA): addr1v95rfxlslfzkvd8sr3exkh7st4qmgj4ywf5zcaxgqgdyunsj5juw5
► Tether (USDT): 0xeC261d9b2EE4B6997a6a424067af165BAA4afE1a
#hyperparameters #finetune #gridsearch #randomsearch