Flores Velazquez et al., available on arXiv
Abstract: Understanding the rate at which stars form is central to studies of galaxy formation. Observationally, the star formation rates (SFRs) of galaxies are measured using the luminosity in different frequency bands, often under the assumption of a time-steady SFR in the recent past. We use star formation histories (SFHs) extracted from cosmological simulations of star-forming galaxies from the FIRE project to analyze the time-scales to which the Hα and far-ultraviolet (FUV) continuum SFR indicators are sensitive. In these simulations, the SFRs are highly time variable for all galaxies at high redshift, and continue to be bursty to z=0 in dwarf galaxies. When FIRE SFHs are partitioned into their bursty and time-steady phases, the best-fitting FUV time-scale fluctuates from its ~10 Myr value when the SFR is time-steady to ≳100 Myr immediately following particularly extreme bursts of star formation during the bursty phase. On the other hand, the best-fitting averaging time-scale for Hα is generally insensitive to the SFR variability in the FIRE simulations and remains ~5 Myr at all times. These time-scales are shorter than the 100 Myr and 10 Myr time-scales sometimes assumed in the literature for FUV and Hα, respectively, because while the FUV emission persists for stellar populations older than 100 Myr, the time-dependent luminosities are strongly dominated by younger stars. Our results confirm that the ratio of SFRs inferred using Hα vs. FUV can be used to probe the burstiness of star formation in galaxies.
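To make the averaging-time-scale idea concrete, here is a minimal sketch of how one might compare an SFH averaged over a short, Hα-like window (~5 Myr) against a longer, FUV-like window (~10-100 Myr), and use their ratio as a burstiness probe. The top-hat trailing average and the synthetic bursty SFH below are illustrative assumptions for this sketch, not the paper's actual fitting procedure, which compares indicator luminosities from stellar population synthesis against the simulated FIRE SFHs.

```python
# Hedged sketch: trailing top-hat averages of a star formation history over
# indicator-like time-scales. Window lengths follow the abstract (~5 Myr for
# Halpha; ~100 Myr for FUV after extreme bursts); the kernel shape and the
# synthetic SFH are assumptions for illustration only.
import numpy as np

def trailing_average(t_myr, sfr, window_myr):
    """Mean SFR over the last `window_myr` before each time (top-hat kernel)."""
    avg = np.empty_like(sfr)
    for i, t in enumerate(t_myr):
        mask = (t_myr > t - window_myr) & (t_myr <= t)
        avg[i] = sfr[mask].mean()
    return avg

# Synthetic bursty SFH: steady baseline plus short Gaussian bursts (assumed).
t = np.arange(0.0, 500.0, 1.0)  # time in Myr
rng = np.random.default_rng(0)
sfr = 1.0 + sum(5.0 * np.exp(-0.5 * ((t - t0) / 3.0) ** 2)
                for t0 in rng.uniform(50.0, 450.0, 8))  # Msun/yr

sfr_halpha_like = trailing_average(t, sfr, window_myr=5.0)    # ~5 Myr window
sfr_fuv_like = trailing_average(t, sfr, window_myr=100.0)     # ~100 Myr window

# Ratio > 1 during/just after a burst, < 1 in the lull that follows,
# so its spread traces the burstiness of the SFH.
ratio = sfr_halpha_like / sfr_fuv_like
print(f"Halpha-like / FUV-like SFR ratio: "
      f"min={ratio.min():.2f}, max={ratio.max():.2f}")
```

For a steady SFH the two averages agree and the ratio stays near unity; for a bursty SFH the short window responds to each burst while the long window smooths over it, which is the sense in which the Hα/FUV ratio probes burstiness.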