Hybrid Interpretable and Explainable Deep Learning Workflow for Vineyard Wildfire Damage Mapping From UAV-RGB Imagery: PCA–k-Means Calibrated With U-Net Segmentation (Patras, August 2025)
Metadata
Author
Rodrigo Comino, Jesús; Domínguez Rull, Jacinto; Xouris, Christos; Kalogeras, Athanasios; Cerdà, Artemi; Herrera Triguero, Francisco
Publisher
John Wiley & Sons, Ltd.
Subject
artificial intelligence; burn severity; RGB indices
Date
2026-04-09
Bibliographic reference
Rodrigo-Comino, J., J. Domínguez-Rull, C. Xouris, A. Kalogeras, A. Cerdà, and F. Herrera. 2026. “Hybrid Interpretable and Explainable Deep Learning Workflow for Vineyard Wildfire Damage Mapping From UAV-RGB Imagery: PCA–k-Means Calibrated With U-Net Segmentation (Patras, August 2025).” The Photogrammetric Record 41, no. 194: e70044. https://doi.org/10.1111/phor.70044
Sponsor
Recovery, Transformation and Resilience Plan from the European Union Next Generation EU (TSI-100927-2023-1); Spanish Ministry of Science, Innovation and Universities (PID2023-150070NB-I00)
Abstract
Wildfires are becoming increasingly frequent and severe in Mediterranean regions, posing growing threats to both natural ecosystems and high-value agricultural systems such as vineyards. Following the August 2025 wildfire near Patras (Achaia, Western Greece), this study develops a rapid workflow for post-fire damage assessment in viticultural landscapes using ultra-high-resolution UAV RGB imagery (≈1.5 cm GSD). A hybrid interpretable and explainable artificial intelligence (AI) framework was designed to compare traditional and deep learning methods for burn severity mapping. Here, explainability refers to methodological transparency and role separation between interpretable RGB-based analysis and deep learning segmentation, rather than post hoc inspection of neural network internals. Results demonstrate that the interpretable PCA–k-means workflow captures vineyard burn severity patterns consistent with U-Net semantic segmentation, while requiring minimal data and computational cost. The classical approach integrated vegetation indices derived from RGB bands (ExG, NGRDI, GRVI, VARI) with principal component analysis (PCA) and k-means clustering, providing interpretable severity zonation from minimal data. In parallel, a U-Net semantic segmentation model was trained to produce pixel-level delineations of burned and surviving canopy, serving as a high-fidelity benchmark. PCA–k-means effectively captured overall burn gradients (PC1 explaining 50.1% of variance), while the U-Net achieved high spatial accuracy (mIoU = 0.91) and improved boundary precision. The combination of both approaches enabled cross-validation and calibration of severity thresholds, revealing their complementarity for operational post-fire analysis. 
This hybrid AI framework demonstrates that low-cost UAV RGB imagery, when coupled with interpretable and deep models, can deliver fast, reproducible, and transferable assessments of wildfire impacts in agricultural systems, supporting early recovery decisions and resilience planning under climate-driven fire regimes.
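The interpretable branch of the workflow described above (RGB vegetation indices, then PCA, then k-means severity zonation) can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the index formulas follow common definitions from the literature, and since GRVI is often written identically to NGRDI, only three distinct indices are computed here; cluster count and PCA dimensionality are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def severity_zones(img, n_clusters=4, seed=0):
    """PCA–k-means severity zonation from a UAV RGB orthomosaic.

    img: H x W x 3 float array scaled to [0, 1].
    Returns an H x W array of integer cluster labels.
    """
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    eps = 1e-6  # guard against division by zero
    # Standard RGB vegetation indices (common definitions assumed).
    exg = 2 * g - r - b                   # Excess Green
    ngrdi = (g - r) / (g + r + eps)       # NGRDI (GRVI is often identical)
    vari = (g - r) / (g + r - b + eps)    # VARI
    feats = np.stack([exg, ngrdi, vari], axis=-1).reshape(-1, 3)
    # Decorrelate the indices; PC1 typically carries the burn gradient.
    pcs = PCA(n_components=2).fit_transform(feats)
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=seed).fit_predict(pcs)
    return labels.reshape(img.shape[:2])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = rng.random((64, 64, 3))  # stand-in for a real orthomosaic tile
    zones = severity_zones(demo)
    print(zones.shape)
```

In an operational setting the cluster labels would then be ranked along PC1 and calibrated against the U-Net segmentation to assign severity classes, as the abstract describes.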