- Aleatoric vs. Epistemic: Inherent physical noise vs. model ignorance.
- Overconfidence Danger: Point estimates fail silently in unknown regimes; uncertainty estimates flag when a model is extrapolating.
- Distribution over Functions: GP yields posterior mean and variance (uncertainty).
- Kernels as Physical Priors: Encodes assumptions about data smoothness/scale.
- Non-Parametric Nature: Model capacity grows with the data; well suited to small, high-quality materials datasets.
- Confidence Ribbons: Visualize reliability to guide further experiments.
- Kriging: Geostatistics' name for GP regression; interpolates materials property surfaces.
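The posterior mean and variance from the bullets above can be sketched in a few lines of NumPy. This is a minimal exact-GP implementation; the RBF kernel, length scale, noise level, and toy sine-wave "measurements" are illustrative assumptions, not values from the course:

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, variance=1.0):
    """Squared-exponential kernel: encodes a smoothness prior over functions."""
    sq = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq / length_scale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2, length_scale=1.0):
    """Exact GP regression: returns posterior mean and std at X_test."""
    K = rbf_kernel(X_train, X_train, length_scale) + noise**2 * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test, length_scale)
    K_ss = rbf_kernel(X_test, X_test, length_scale)
    L = np.linalg.cholesky(K)                          # stable inversion of K
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                               # posterior mean
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)         # posterior variance
    return mean, np.sqrt(np.maximum(var, 0.0))

# Toy data: four sparse "measurements" of a property curve
X_train = np.array([0.0, 1.0, 2.5, 4.0])
y_train = np.sin(X_train)
X_test = np.linspace(0.0, 5.0, 50)
mean, std = gp_posterior(X_train, y_train, X_test)
# Confidence ribbon: mean +/- 1.96*std; std is small near data, grows away from it
```

The variance term is what draws the confidence ribbon: plotting `mean ± 1.96*std` shows the ribbon pinching near measured points and widening where the model is ignorant, which is exactly where further experiments are most informative.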
- Risk management in materials processing.
- Visualizing distributions and error bars.
- Function vs. Parameter space.
- Kernels and "Similarity".
- Conditional Gaussians and Variance.
- Predicting tensile strength across parameters.
- GP for Experimental Design.
- Multi-Task GPs.
- Mixture Density Networks (MDNs).
- Dropout as Bayesian approximation.
- Safe process windows via confidence intervals.
- Building trustworthy models.
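The "Conditional Gaussians and Variance" item is the algebraic core of GP prediction: conditioning a joint Gaussian on an observation shifts the mean and shrinks the variance. A minimal 2-D check, with a made-up joint covariance chosen only for illustration:

```python
import numpy as np

def condition(mu, Sigma, observed_value):
    """Condition a 2-D Gaussian on its first coordinate.

    mu_{2|1}  = mu2 + S21 / S11 * (a - mu1)
    var_{2|1} = S22 - S21 * S12 / S11   (never larger than S22)
    """
    mu1, mu2 = mu
    s11, s12 = Sigma[0]
    s21, s22 = Sigma[1]
    cond_mean = mu2 + s21 / s11 * (observed_value - mu1)
    cond_var = s22 - s21 * s12 / s11
    return cond_mean, cond_var

mu = np.array([0.0, 0.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 1.0]])      # strong correlation: observing x1 tells us a lot about x2
m, v = condition(mu, Sigma, 1.5)
# observing x1 = 1.5 pulls the mean of x2 up and cuts its variance from 1.0 to 0.36
```

The same block-matrix identities, applied to the joint Gaussian over training and test points, yield the GP posterior mean and variance directly.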
Summary for ML-PC Week 12:
- Introduces Probabilistic Machine Learning for uncertainty quantification.
- Differentiates aleatoric (noise) from epistemic (ignorance) uncertainty.
- Uses Gaussian Processes (GPs) for uncertainty-aware regression.
- Applies confidence intervals to map robust process windows.
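Mapping a robust process window from confidence intervals, as in the last bullet, can be sketched as follows. All numbers here are hypothetical: the temperature grid, predicted tensile strengths, uncertainties, and spec limit are invented for illustration, not real process data:

```python
import numpy as np

# Hypothetical GP predictions over a process parameter (e.g., anneal temperature, C)
temps = np.linspace(100.0, 200.0, 11)
mean = np.array([290., 300., 310., 320., 325., 325., 320., 310., 300., 290., 280.])
std  = np.array([ 15.,  10.,   6.,   4.,   3.,   3.,   4.,   6.,  10.,  15.,  20.])
spec = 300.0                        # illustrative minimum acceptable strength (MPa)

lower = mean - 1.96 * std           # 95% lower confidence bound
safe = temps[lower >= spec]         # window where even the pessimistic bound passes
# Using the lower bound (not the mean) excludes settings that look good on
# average but are too uncertain to trust.
```

With the toy numbers above, the mean alone would accept most of the grid, while the lower bound keeps only the mid-range temperatures where the model is both optimistic and confident; that conservatism is the point of uncertainty-aware process windows.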