We present an analysis of the diffuse ionized gas (DIG) in a high-resolution simulation of an isolated Milky Way–like galaxy, incorporating on-the-fly radiative transfer and non-equilibrium thermochemistry. We use the Monte Carlo radiative transfer code COLT to self-consistently compute ionization states and line emission in post-processing.

We find a clear bimodal distribution in the electron densities (n_e) of ionized gas, allowing us to define a threshold of n_e = 10 cm⁻³ to differentiate the DIG from H II regions. The DIG is primarily ionized by stars aged 5–25 Myr, which become directly exposed to low-density gas after their H II regions have been cleared. Leakage from recently formed stars (< 5 Myr) is only moderately important for DIG ionization.
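The density-based separation described above amounts to a simple cut on per-cell electron densities. A minimal sketch, using hypothetical values (the real densities would come from the simulation's non-equilibrium chemistry, which is not reproduced here):

```python
import numpy as np

# Hypothetical electron densities [cm^-3] for ionized gas cells;
# real values would come from the simulation output.
n_e = np.array([0.05, 0.3, 2.0, 15.0, 120.0, 8.0, 450.0])

N_E_THRESHOLD = 10.0  # cm^-3, the DIG / H II-region boundary from the text

is_dig = n_e < N_E_THRESHOLD  # diffuse ionized gas
is_hii = ~is_dig              # dense H II regions

print(is_dig.sum(), "DIG cells;", is_hii.sum(), "H II-region cells")
# prints: 4 DIG cells; 3 H II-region cells
```

A fixed cut like this is only meaningful because the underlying n_e distribution is bimodal, so the classification is insensitive to the exact threshold value.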

We forward-model local observations and validate our simulated DIG against observed line ratios of [S II]/Hα, [N II]/Hα, [O I]/Hα, and [O III]/Hβ versus Hα surface brightness (Σ_Hα). The mock observations not only reproduce the observed correlations, but also demonstrate that these trends reflect an increasing temperature and a hardening ionizing radiation field with decreasing n_e. The hardening of the radiation within the DIG is caused by the gradual transition of the dominant ionizing sources, from 0 Myr to 25 Myr stars with progressively harder intrinsic spectra, primarily due to the extended Wolf–Rayet phase driven by binary interactions.
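Trends of this kind are typically extracted by binning per-pixel line ratios in surface brightness and taking the median per bin. A toy sketch with synthetic data (the power-law anticorrelation and scatter below are invented for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-pixel mock-observation values; real maps would come
# from the radiative-transfer post-processing described in the text.
sigma_ha = 10 ** rng.uniform(-1, 2, 1000)                     # Halpha surface brightness (arbitrary units)
sii_ha = 0.5 * sigma_ha**-0.2 * rng.lognormal(0, 0.1, 1000)   # toy [S II]/Halpha anticorrelation

# Median [S II]/Halpha in logarithmic bins of sigma_ha, the standard way
# DIG line-ratio trends are presented.
bins = np.logspace(-1, 2, 7)
idx = np.digitize(sigma_ha, bins)
medians = [np.median(sii_ha[idx == i]) for i in range(1, len(bins))]
print(np.round(medians, 3))
```

With the toy slope above, the binned medians rise toward low Σ_Hα, mimicking the observed enhancement of forbidden-line ratios in the DIG.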

Consequently, the DIG line-ratio trends can be attributed to ongoing star formation rather than to secondary ionization sources, and therefore provide a sensitive test of stellar feedback and stellar population models.
