Here are the notes: raw.githubusercontent.com/Cey...
How can we model distributions with more than one mode? How can we take into account that the chance of good weather depends on the season? Mixture Distributions provide us with a tool to model these scenarios. They are a special kind of directed graphical model (DGM) with a (commonly latent) class node and one conditional distribution per class.
In this video, you will find an intuitive introduction to this concept, including three ways to implement them in TensorFlow Probability.
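As a rough illustration of the two-stage structure described above, ancestral sampling from such a DGM first draws the (latent) class and then draws the observation from that class's distribution. A minimal sketch in plain Python (all names and parameter values here are made up for illustration, not taken from the video):

```python
import random

# Hypothetical example: a weather-related quantity whose distribution
# depends on a latent "season" class.
class_probs = [0.25, 0.25, 0.25, 0.25]  # prior over the latent class node

# One Gaussian per class, given as (mean, std) -- the class-conditional
# distributions (values are invented for illustration):
class_params = [(25.0, 3.0), (15.0, 4.0), (5.0, 3.0), (12.0, 4.0)]

def sample_mixture():
    """Ancestral sampling: first the class, then the observation given it."""
    k = random.choices(range(len(class_probs)), weights=class_probs)[0]
    mean, std = class_params[k]
    return k, random.gauss(mean, std)

samples = [sample_mixture() for _ in range(5)]
```

Because several classes place their mass at different means, the resulting marginal over the observation can have more than one mode.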
-------
📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): github.com/Ceyron/machine-lea...
📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: / felix-koehler and / felix_m_koehler
💸 : If you want to support my work on the channel, you can become a patron here: / mlsim
-------
Timestamps:
00:00 Introduction
00:37 A special case of DGMs
01:33 The (factorized) joint
04:06 The parameters of a Mixture Distribution
04:57 Example
05:54 TFP: Defining example values
06:38 TFP: Define the DGM
08:30 TFP: Investigating the Mixture Distribution
09:20 Latent Class Variables
10:06 Expressing the Marginal
13:29 Marginal for the Example
14:10 TFP: Ignoring the latent samples
14:43 TFP: Marginalization
15:21 TFP: Using built-in Mixture Distribution #1
16:52 TFP: Using built-in Mixture Distribution #2
18:29 Outro
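The marginalization step listed in the timestamps amounts to summing the latent class out of the factorized joint: p(x) = Σ_k π_k p(x | class = k). A minimal sketch of this formula with invented parameters, in plain Python rather than TensorFlow Probability:

```python
import math

# Hypothetical mixture of two Gaussians (weights and parameters are
# illustrative assumptions, not values from the video):
weights = [0.3, 0.7]   # pi_k, must sum to 1
means = [-2.0, 1.0]
stds = [0.5, 1.0]

def normal_pdf(x, mu, sigma):
    """Density of a univariate Gaussian N(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def marginal_pdf(x):
    """p(x) = sum_k pi_k * N(x; mu_k, sigma_k) -- latent class summed out."""
    return sum(w * normal_pdf(x, m, s) for w, m, s in zip(weights, means, stds))

# Sanity check: the marginal is a proper density, so a crude Riemann sum
# over a wide grid should come out close to 1.
grid = [-10 + i * 0.01 for i in range(2001)]
total = sum(marginal_pdf(x) * 0.01 for x in grid)
```

In TFP, this weighted sum over components is what the built-in mixture distributions shown at 15:21 and 16:52 encapsulate.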
Mixture Distributions | Introduction | with examples in TensorFlow Probability