Rohan Paul
2 weeks ago
Your brain's next 5 seconds, predicted by AI

A Transformer predicts brain activity patterns 5 seconds into the future using just 21 seconds of fMRI data, achieving 0.997 correlation with a modified time-series Transformer architecture.

-----

🧠 Original Problem:

Predicting future brain states from fMRI data remains challenging, especially for patients who cannot undergo long scanning sessions. Current methods require extensive scan times and lack accuracy in short-term predictions.

-----

🔬 Solution in this Paper:

→ The paper introduces a modified time-series Transformer with 4 encoder and 4 decoder layers, each containing 8 attention heads
→ The model takes a 30-timepoint window covering 379 brain regions as input and predicts the next brain state
→ Training uses Human Connectome Project data from 1,003 healthy adults, with preprocessing including spatial smoothing and bandpass filtering
→ Unlike traditional approaches, the model omits look-ahead masking, simplifying prediction of a single future timepoint (see the sketch below)

-----

🎯 Key Insights:

→ Temporal dependencies in brain states can be effectively captured by self-attention mechanisms
→ Short input sequences (21.6 s) suffice for accurate predictions
→ Error accumulation follows a Markov-chain pattern in longer predictions
→ The model preserves functional connectivity patterns that match known brain organization

-----

📊 Results:

→ Single-timepoint prediction achieves an MSE of 0.0013
→ Predictions stay accurate up to 5.04 seconds with correlation > 0.85
→ The first 7 predicted timepoints maintain high accuracy
→ Outperforms BrainLM, with a 20-timepoint MSE of 0.26 vs 0.568
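A minimal sketch of what the described setup could look like, assuming PyTorch: 4 encoder and 4 decoder layers, 8 attention heads, a 30-timepoint window over 379 regions, and no look-ahead mask since only one future timepoint is predicted. The hidden width, feed-forward size, and the omission of positional encoding are illustrative assumptions, not details from the paper.

```python
# Sketch of a modified time-series Transformer for next-brain-state prediction.
# Dimensions marked "assumed" are illustrative; only layer/head counts and the
# 30 x 379 input window come from the post.
import torch
import torch.nn as nn

class BrainStateTransformer(nn.Module):
    def __init__(self, n_regions=379, d_model=512, n_heads=8,
                 n_enc_layers=4, n_dec_layers=4, dim_ff=2048):  # d_model, dim_ff assumed
        super().__init__()
        # Project each 379-region brain state (one fMRI timepoint) to d_model.
        self.input_proj = nn.Linear(n_regions, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=n_heads,
            num_encoder_layers=n_enc_layers,
            num_decoder_layers=n_dec_layers,
            dim_feedforward=dim_ff, batch_first=True)
        # Map the decoder output back to a 379-region brain state.
        self.output_proj = nn.Linear(d_model, n_regions)

    def forward(self, src):
        # src: (batch, 30, 379) window of past brain states (~21.6 s of fMRI).
        x = self.input_proj(src)
        # Query the decoder with the most recent timepoint only; no look-ahead
        # mask is needed because a single future timepoint is predicted.
        tgt = x[:, -1:, :]
        out = self.transformer(x, tgt)      # (batch, 1, d_model)
        return self.output_proj(out)        # (batch, 1, 379): next brain state

model = BrainStateTransformer()
window = torch.randn(8, 30, 379)            # a batch of 30-timepoint windows
next_state = model(window)
print(next_state.shape)                     # torch.Size([8, 1, 379])
```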
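The Markov-chain-style error accumulation arises when predictions are rolled forward autoregressively: each predicted timepoint is appended to the window and fed back in, so errors compound step by step. A sketch of that rollout, reusing the `BrainStateTransformer` above and the Human Connectome Project repetition time of 0.72 s (so 7 timepoints ≈ 5.04 s, matching the reported accurate range):

```python
# Autoregressive rollout: slide the 30-timepoint window forward using the
# model's own predictions, which is why errors accumulate over longer horizons.
import torch

def rollout(model, window, n_steps):
    """Predict n_steps future brain states from a (batch, 30, 379) window."""
    preds = []
    current = window.clone()
    model.eval()
    with torch.no_grad():
        for _ in range(n_steps):
            next_state = model(current)                        # (batch, 1, 379)
            preds.append(next_state)
            # Drop the oldest timepoint, append the prediction.
            current = torch.cat([current[:, 1:, :], next_state], dim=1)
    return torch.cat(preds, dim=1)                             # (batch, n_steps, 379)

# 7 timepoints x 0.72 s TR = 5.04 s, the horizon the post reports as
# remaining highly accurate (correlation > 0.85).
future = rollout(model, window, n_steps=7)
print(future.shape)                                            # torch.Size([8, 7, 379])
```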