
Week 4 Summary: From classical microstructure metrics to learned representations

Cross-Book Summary

1. Classical Microstructure Metrics

  • Stereology: Quantifying 3D structure from 2D sections.
  • Hand-crafted Descriptors: Intuitive but potentially biased (e.g., aspect ratio).
  • Information Bottleneck: Reducing a microstructure to scalar metrics discards subtle but crucial details.
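To illustrate the information bottleneck, a minimal NumPy sketch (all shapes and values hypothetical) shows two visually distinct "grains" collapsing to nearly the same aspect ratio:

```python
import numpy as np

def aspect_ratio(coords):
    """Hand-crafted descriptor: ratio of principal-axis lengths of a
    2-D point cloud, via eigenvalues of its covariance matrix."""
    eigvals = np.sort(np.linalg.eigvalsh(np.cov(coords, rowvar=False)))[::-1]
    return float(np.sqrt(eigvals[0] / eigvals[1]))

rng = np.random.default_rng(0)
# A smooth elongated grain (Gaussian ellipse)...
ellipse = rng.normal(size=(2000, 2)) * np.array([2.0, 1.0])
# ...and a very different two-lobed shape, tuned to the same elongation.
blobs = rng.normal(size=(2000, 2))
blobs[:, 0] += np.where(rng.random(2000) < 0.5, np.sqrt(3.0), -np.sqrt(3.0))

# Both collapse to an aspect ratio near 2: the scalar cannot tell them apart.
```

The descriptor maps both morphologies to roughly the same number, which is exactly the loss of detail the summary warns about.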

2. Learned Representations

  • Artificial Neuron: A weighted sum of inputs plus a bias, passed through a non-linear activation.
  • Activations: ReLU mitigates vanishing gradients and so enables deep networks; Sigmoid/Tanh provide smooth, bounded non-linearities.
  • Universal Approximation: A single hidden layer with enough neurons can approximate any continuous function on a compact domain.
  • MLP Topology: Stacking layers builds increasingly abstract representations.
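The artificial neuron above can be sketched in a few lines of NumPy (input values and weights are hypothetical):

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x, w, b, activation=relu):
    """Artificial neuron: weighted sum of inputs plus bias,
    passed through a non-linear activation."""
    return activation(np.dot(w, x) + b)

x = np.array([0.5, -1.0, 2.0])   # hypothetical input features
w = np.array([0.2, 0.4, -0.1])   # weights
b = 0.05                         # bias

y_relu = neuron(x, w, b)             # pre-activation is -0.45, so ReLU gives 0
y_sig = neuron(x, w, b, sigmoid)     # sigmoid squashes the same sum into (0, 1)
```

Swapping the activation changes only the final non-linearity; the weighted-sum core is identical in both cases.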

90-Minute Lecture Strategy

Part 1: Classical Approach

  • Standard metrics: Grain size, phase fractions.
  • Quantitative Metallography.
  • Limitations of hand-crafted features.
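A classical metric such as phase fraction can be computed directly from a segmented micrograph; a minimal sketch, assuming a hypothetical 2-D label array with phases 0 and 1:

```python
import numpy as np

# Hypothetical segmented micrograph: each pixel labeled by phase (0 or 1).
micrograph = np.array([
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
])

# Phase (area) fraction: pixels of phase 1 over total pixels -> 6/16.
phase_fraction = float(np.mean(micrograph == 1))

# Stereology-style line intercepts: phase-boundary crossings per scan row,
# the raw count behind mean-linear-intercept grain-size estimates.
crossings = int(np.sum(np.abs(np.diff(micrograph, axis=1))))
```

Both quantities are classic quantitative-metallography outputs, and both reduce the full image to a single scalar.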

Part 2: Feature Engineering vs. Learning

  • Concept of Representation.
  • Incomplete expert features.
  • Shift to learned embeddings.

Part 3: Building Blocks of ML

  • Mathematical Neuron.
  • Weights and Biases.
  • Non-linear activations.
  • Forward propagation.
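Forward propagation through one dense layer, vectorized over a batch, can be sketched as follows (array sizes are hypothetical):

```python
import numpy as np

# One dense layer over a batch of inputs.
# Shapes: X is (batch, n_in), W is (n_in, n_out), b is (n_out,).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))          # batch of 4 samples, 3 features each
W = rng.normal(size=(3, 5)) * 0.1    # weights
b = np.zeros(5)                      # biases

Z = X @ W + b                        # affine pre-activation (weights and biases)
A = np.maximum(0.0, Z)               # ReLU non-linearity
```

The same two-step pattern, affine map then activation, is repeated for every layer during a forward pass.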

Part 4: Multi-Layer Perceptrons

  • Stacking layers for abstraction.
  • Hidden layers.
  • Universal approximators.
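Stacking such layers gives a full MLP forward pass; a minimal sketch with a hypothetical 3-8-8-1 topology and random untrained weights:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def mlp_forward(x, layers):
    """Forward propagation through stacked layers: each hidden layer applies
    an affine map followed by ReLU; the final layer is left linear."""
    h = x
    for W, b in layers[:-1]:
        h = relu(W @ h + b)          # hidden layers build abstractions
    W, b = layers[-1]
    return W @ h + b                 # linear output layer

rng = np.random.default_rng(42)
sizes = [3, 8, 8, 1]                 # hypothetical topology: 3 -> 8 -> 8 -> 1
layers = [(rng.normal(size=(m, n)) * 0.5, np.zeros(m))
          for n, m in zip(sizes[:-1], sizes[1:])]

y = mlp_forward(np.array([0.1, -0.2, 0.3]), layers)
```

With trained weights, the universal-approximation result says networks of this form can fit any continuous target on a compact domain to arbitrary accuracy.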

Part 5: Outlook

  • Moving to images (CNNs).
  • Trusting learned vs. classical metrics.

Quarto Website Update (Summary)

Summary for ML-PC Week 4:

  • Transitions from classical stereology to learned representations.
  • Reviews limits of hand-crafted microstructure metrics.
  • Introduces the artificial neuron, weights, and activations.
  • Builds the framework for Multi-Layer Perceptrons (MLPs) to automate feature extraction.