🤖 AI Fundamentals

This series provides a foundation in AI theory and methods for Earth Observation (EO) applications and research. It covers core topics such as machine learning, deep learning, self-supervised learning, generative models, optimization, and interpretability.

Latest Articles

Mastering TerraMind: From Understanding to Fine-tuning

TerraMind is the first large-scale, any-to-any generative multimodal foundation model for Earth Observation (EO). It is pre-trained with a dual-scale approach that combines token-level and pixel-level representations to capture both high-level contextual information and fine-grained spatial details. The model aims to facilitate multimodal data integration, provide powerful generative capabilities, and support zero-shot and few-shot applications, while outperforming existing models on EO benchmarks and further improving performance through ‘Thinking in Modalities’ (TiM). [Read More]

Monte Carlo Sampling

Understand the core concepts of Monte Carlo methods: the Law of Large Numbers, rejection sampling, importance sampling, and variance reduction techniques (antithetic variates, control variates, stratified sampling). [Read More]
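
As a quick taste of the importance-sampling idea covered in the article, here is a minimal NumPy sketch. The target, proposal distribution, and sample size are illustrative choices, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Goal: estimate E_p[f(X)] with p = N(0, 1) and f(x) = x**2 (true value 1.0),
# but draw samples from a wider proposal q = N(0, 2^2) and reweight.
def f(x):
    return x ** 2

def log_p(x):  # log-density of the target N(0, 1)
    return -0.5 * x ** 2 - 0.5 * np.log(2 * np.pi)

def log_q(x):  # log-density of the proposal N(0, 2^2)
    return -0.5 * (x / 2.0) ** 2 - np.log(2.0) - 0.5 * np.log(2 * np.pi)

n = 100_000
x = rng.normal(loc=0.0, scale=2.0, size=n)   # draw from the proposal q
w = np.exp(log_p(x) - log_q(x))              # importance weights p(x) / q(x)

estimate = np.mean(w * f(x))                 # importance-sampling estimate of E_p[f(X)]
print(f"IS estimate: {estimate:.4f} (true value: 1.0)")
```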

Introduction to MCMC

We need MCMC because many distributions are known only up to a normalizing constant, which makes traditional sampling and integration methods ineffective. By constructing a Markov chain whose stationary distribution is the target, the long-run distribution of the chain’s trajectory approximates the target distribution. [Read More]
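
To make this concrete, here is a minimal random-walk Metropolis sketch that samples from an unnormalized log-density by building such a Markov chain. The target, step size, and chain length are illustrative, not from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target_unnorm(x):
    # Unnormalized log-density of the target (a standard normal with
    # its normalizing constant deliberately dropped).
    return -0.5 * x ** 2

def random_walk_metropolis(n_samples, step=1.0, x0=0.0):
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.normal()                    # symmetric proposal
        log_accept = log_target_unnorm(proposal) - log_target_unnorm(x)
        if np.log(rng.uniform()) < log_accept:                # accept/reject step
            x = proposal
        samples[i] = x
    return samples

chain = random_walk_metropolis(50_000)
burned = chain[10_000:]                                       # discard burn-in
print(f"mean ≈ {burned.mean():.3f}, std ≈ {burned.std():.3f}  (target: 0, 1)")
```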

BYOL Explained: Self-Supervised Learning without Negative Pairs

Understanding BYOL: how the interaction between the online and target networks achieves SOTA performance without negative samples. A deep dive into the architecture and loss function. [Read More]
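
A minimal PyTorch-style sketch of the two mechanics mentioned in that summary: the negative-cosine loss between the online prediction and the (stop-gradient) target projection, and the EMA update of the target network. Function names and the momentum value are illustrative, not taken from the paper’s code:

```python
import torch
import torch.nn.functional as F

def byol_loss(online_pred, target_proj):
    # BYOL regresses the online network's prediction onto the target
    # network's projection using a negative cosine similarity.
    p = F.normalize(online_pred, dim=-1)
    z = F.normalize(target_proj.detach(), dim=-1)   # stop-gradient on the target branch
    return 2 - 2 * (p * z).sum(dim=-1).mean()

@torch.no_grad()
def ema_update(target_net, online_net, momentum=0.996):
    # Target parameters are an exponential moving average of the
    # online parameters; no gradient flows through this update.
    for t_param, o_param in zip(target_net.parameters(), online_net.parameters()):
        t_param.mul_(momentum).add_(o_param, alpha=1 - momentum)
```

In practice the loss is applied symmetrically to the two augmented views (each view is fed to both branches) and the two terms are summed.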

SimCLR Explained: Contrastive Learning Design & Code

A detailed visual guide to SimCLR. Understand the logic behind stochastic data augmentation, the NT-Xent loss, and why contrastive learning works. [Read More]
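
A compact sketch of the NT-Xent loss that the article walks through, for a batch of N paired augmented views. The batch size and temperature below are illustrative defaults, not values prescribed by the article:

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) loss.

    z1, z2: (N, D) projections of two augmented views of the same N images.
    """
    n = z1.shape[0]
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=-1)   # (2N, D), unit-norm
    sim = z @ z.T / temperature                           # (2N, 2N) scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))                     # exclude self-similarity

    # For row i, the positive is its other view: i <-> i + n.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Toy usage with random projections standing in for encoder outputs.
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(nt_xent_loss(z1, z2))
```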