gxceed

EF-YOLO: Detecting Small Targets in Early-Stage Agricultural Fires via UAV-Based Remote Sensing


Jun Tao, Zhihan Wang, Jianqiu Wu, Yuan Li, Tomohiro Fukuda, Jiaxin Zhang

Remote Sensing · 📚 Peer-reviewed / Journal · 2026-04-09 · #Other · Origin: CN
DOI: 10.3390/rs18081119
Source: https://doi.org/10.3390/rs18081119

🤖 gxceed AI Summary

Japanese (translated)

This study proposes a joint data and model optimization framework to address the difficulty of detecting small targets in early-stage agricultural fires with UAVs. A high-quality dataset is built through an ROI-guided synthesis pipeline, and the EF-YOLO detector achieves high-sensitivity detection. In experiments, APS improves by 15.4 percentage points over YOLOv8s, while real-time inference remains possible.

English

This paper proposes a joint data and model optimization framework for early detection of small agricultural fire targets in UAV imagery. It builds a hybrid dataset using latent diffusion models and introduces EF-YOLO with SPD-Conv and a high-resolution P2 head to improve sensitivity. Experimental results show APS of 40.2% on sub-pixel targets, exceeding YOLOv8s by 15.4 percentage points, with 88.7% recall and 78 FPS inference speed.

Unofficial AI-generated summary based on the public title and abstract. Not an official translation.

📝 gxceed Editorial Notes — Why this matters

In the Japanese GX context

Osaka University participates in this joint research, so the work may contribute to the development of UAV-based fire detection technology for Japanese agriculture. However, its direct connection to GX policy or disclosure frameworks is limited.

In the global GX context

This paper presents a technical advancement in UAV-based agricultural fire detection, with potential applications in reducing emissions from agricultural fires globally. While not directly tied to climate disclosure frameworks, it contributes to environmental monitoring and disaster risk reduction.

👥 Implications by Reader

🔬 Researchers: Computer vision researchers can adopt the joint optimization approach for small-object detection in resource-constrained settings.

🏢 Practitioners: Agricultural tech firms can deploy EF-YOLO on edge devices for early fire warning systems.

🏛 Policymakers: Environmental agencies may consider this technology for improving agricultural fire monitoring and reducing related emissions.

📄 Abstract (original)

Early detection of agricultural fires with Unmanned Aerial Vehicles (UAVs) is important for environmental safety, yet it remains difficult because ignition cues are extremely small, smoke patterns vary widely, and farmland scenes often contain strong background interference such as specular reflections. Model development is further constrained by the scarcity of data from the early ignition stage. To address these challenges, we propose a joint data and model optimization framework. We first build a hybrid dataset through an ROI-guided synthesis pipeline, in which latent diffusion models are used to insert high-fidelity, carefully screened fire samples into real farmland backgrounds. We then introduce EF-YOLO, a detector designed for high sensitivity to small targets. The network uses SPD-Conv to reduce feature loss during spatial downsampling and includes a high-resolution P2 head to improve the detection of minute objects. To reduce background clutter, a Dual-Path Frequency–Spatial Enhancement (DP-FSE) module serves as a lightweight statistical surrogate that extracts global contextual cues and local salient features in parallel, thereby suppressing high-frequency noise. Experimental results show that EF-YOLO achieves an APS of 40.2% on sub-pixel targets, exceeding the YOLOv8s baseline by 15.4 percentage points. With a recall of 88.7% and a real-time inference speed of 78 FPS, the proposed framework offers a strong balance between detection performance and efficiency, making it well suited for edge-deployed agricultural fire early-warning systems.
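The SPD-Conv component mentioned in the abstract replaces strided downsampling with a lossless space-to-depth rearrangement followed by a non-strided convolution, so that small-target pixels are not discarded. A minimal NumPy sketch of the space-to-depth step only (the convolution is omitted; the array shapes and scale factor are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def space_to_depth(x: np.ndarray, scale: int = 2) -> np.ndarray:
    """Rearrange an (H, W, C) feature map into (H/scale, W/scale, C*scale**2).

    Unlike strided convolution or pooling, no pixel is discarded: each
    scale x scale spatial block is moved into the channel dimension, so a
    subsequent non-strided convolution still sees every input value.
    """
    h, w, c = x.shape
    assert h % scale == 0 and w % scale == 0, "spatial dims must be divisible"
    x = x.reshape(h // scale, scale, w // scale, scale, c)
    x = x.transpose(0, 2, 1, 3, 4)  # gather each block's pixels together
    return x.reshape(h // scale, w // scale, c * scale * scale)

# Toy 4x4 feature map with 3 channels (hypothetical sizes for illustration)
feat = np.arange(4 * 4 * 3, dtype=np.float32).reshape(4, 4, 3)
out = space_to_depth(feat, scale=2)
print(out.shape)  # (2, 2, 12): half the resolution, 4x the channels
```

The key property is that the output contains exactly the same values as the input, only rearranged, which is why the technique is attractive for sub-pixel fire cues that pooling would average away.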

🔗 Provenance — Source where this record was discovered


gxceed is a research-support dataset based on public metadata. Summaries, translations, and commentary are generated with AI assistance. Final interpretation and verification are expected to be carried out by the user against the original sources.