- Temporal Dimension: Process logs are sequential; order matters.
- RNNs: Possess hidden states that act as memory.
- Vanishing Gradients: Basic RNNs struggle with long-term dependencies.
- LSTMs/GRUs: Use gates to selectively remember/forget information.
- Preprocessing: Smoothing and "Triggering" to extract cycles.
- Anomaly Detection: Large prediction deviations signal defects.
- Surrogate Models: Fast RNN/LSTM replacements for slow simulations.
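The anomaly-detection bullet above can be made concrete: if a model forecasts the next sensor value, a defect shows up as an unusually large prediction residual. A minimal sketch (the function name, the 3-sigma threshold, and the synthetic data are illustrative assumptions, not part of the course material; a real monitor would use an RNN/LSTM forecast instead of the stand-in below):

```python
import numpy as np

def detect_anomalies(observed, predicted, k=3.0):
    """Flag time steps whose prediction error is unusually large.

    An anomaly is any step whose absolute residual exceeds k standard
    deviations of the residual distribution (k=3 is a common heuristic).
    """
    residuals = observed - predicted
    threshold = k * residuals.std()
    return np.abs(residuals) > threshold

# Synthetic example: a sensor trace the model tracks well,
# except for an injected defect spike at step 50.
rng = np.random.default_rng(0)
t = np.arange(100)
observed = np.sin(0.1 * t) + 0.05 * rng.standard_normal(100)
predicted = np.sin(0.1 * t)   # stand-in for an RNN one-step forecast
observed[50] += 2.0           # simulated defect

flags = detect_anomalies(observed, predicted)
print(np.flatnonzero(flags))  # the injected spike at step 50 stands out
```

The threshold adapts to the residual spread, so the same rule works across sensors with different noise levels.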
- Time-dependency of microstructure.
- Sensor data types (1D signals vs. event logs).
- Event-based vs. continuous measurements.
- Denoising filters.
- Triggering cycles.
- Temporal Feature Engineering.
- Unrolled RNN structure.
- Vanishing Gradient problem.
- LSTM and GRU mechanics.
- Predictive Maintenance.
- Additive manufacturing (AM) melt pool stability.
- Dilatometry phase predictions.
- Non-stationarity and machine drift.
- Transformers for 1D data.
- Closed-loop control outlook.
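The denoising and triggering steps in the outline above can be sketched in a few lines: a moving-average filter suppresses sensor noise, and a rising-edge trigger segments the log into repeating process cycles. All names, the window size, and the synthetic ramp signal are illustrative assumptions:

```python
import numpy as np

def moving_average(x, window=5):
    """Simple denoising filter: uniform moving average, same-length output."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

def trigger_cycles(signal, level):
    """Segment a periodic process log at rising crossings of a trigger level.

    Returns the indices where the signal rises from below `level`
    to at or above it, i.e. the start of each detected cycle.
    """
    above = signal >= level
    return np.flatnonzero(~above[:-1] & above[1:]) + 1

# Synthetic press-cycle log: three ramp-up/ramp-down repetitions plus noise.
rng = np.random.default_rng(1)
cycle = np.concatenate([np.linspace(0, 1, 20), np.linspace(1, 0, 20)])
raw = np.tile(cycle, 3) + 0.02 * rng.standard_normal(120)

smooth = moving_average(raw)
starts = trigger_cycles(smooth, level=0.5)   # three rising crossings
cycles = np.split(smooth, starts)[1:]        # one array per triggered cycle
```

Smoothing before triggering matters: on the raw signal, noise near the threshold can fire the trigger multiple times per cycle.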
Summary for ML-PC Week 7:
- Applies ML to Time-Series Data for process monitoring.
- Introduces RNNs and LSTMs for sequential dependencies.
- Details essential preprocessing like smoothing and triggering.
- Covers anomaly detection and process outcome prediction.
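To make the LSTM gate mechanics from the summary concrete, here is a single LSTM time step in plain NumPy. The weight layout, sizes, and random initialization are illustrative assumptions; frameworks like Keras or PyTorch implement the same equations internally:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step. W stacks the weights of the four gates:

      f: forget gate  -- how much of the old cell state to keep
      i: input gate   -- how much new information to write
      g: candidate    -- the new information itself
      o: output gate  -- how much of the cell state to expose
    """
    z = W @ np.concatenate([x, h_prev]) + b
    f, i, g, o = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)
    g = np.tanh(g)
    c = f * c_prev + i * g   # selectively forget / remember
    h = o * np.tanh(c)       # gated memory becomes the new output
    return h, c

# Run a randomly initialized cell over a short sequence (hypothetical sizes).
rng = np.random.default_rng(2)
n_in, n_hidden = 3, 4
W = 0.1 * rng.standard_normal((4 * n_hidden, n_in + n_hidden))
b = np.zeros(4 * n_hidden)
h = c = np.zeros(n_hidden)
for x in rng.standard_normal((10, n_in)):
    h, c = lstm_step(x, h, c, W, b)
```

The additive cell-state update `c = f * c_prev + i * g` is what lets gradients flow over long horizons, addressing the vanishing-gradient problem of basic RNNs noted above.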