Probing the Radio Continuum–Far-IR Relationship in the Low-Mass Regime
Jonathan Westcott
University of Hertfordshire
Elias Brinks (University of Hertfordshire), Robert Beswick (University of Manchester)
The radio continuum–far-IR correlation fundamentally links radio emission to massive star evolution, potentially providing an ideal and direct tracer of star formation that holds out to cosmological distances. Worryingly, many factors must be delicately balanced to produce such a relationship, and it is therefore surprising that it remains tight over five orders of magnitude in radio luminosity. In order to fully calibrate a radio star-formation law, insight into the physics behind this relationship must be gained by pushing observations to extreme environments, for example those provided by low-density dwarf galaxies, which have recently been shown to deviate from the correlation. This deviation is believed to arise from Cosmic Ray electrons (CRe), accelerated in Supernova Remnants (SNR), escaping the shallow gravitational potential of these systems before radiating a substantial fraction of their injected energy via the synchrotron emission process. The fraction of this energy actually radiated inside the galaxy is currently unknown and is key to understanding how other factors ‘conspire’ to maintain the relation. In this poster, we present the theory behind CRe ‘age’ analysis and outline a pilot project to construct resolved CRe ‘age’ maps for a small sample of nearby dwarf irregular galaxies. These maps will offer a unique opportunity to study CRe energy loss as a function of environment and to analyse to what extent ‘age’ analysis can be used to recover galaxy-wide, spatially resolved recent star-formation histories.
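To illustrate the timescale underlying CRe ‘age’ analysis, the synchrotron cooling time of an electron can be estimated from first principles. The sketch below is ours, not part of the poster: it assumes the standard monochromatic approximation (an electron of Lorentz factor γ radiates near ν ≈ (3/2) γ² ν_g) and the single-electron synchrotron loss rate dE/dt = (4/3) σ_T c γ² U_B; the function name and parameter choices are illustrative.

```python
import math

def synchrotron_cooling_time_yr(B_uG, nu_GHz):
    """Approximate synchrotron cooling time (yr) for a cosmic-ray electron
    observed at frequency nu_GHz (GHz) in a magnetic field B_uG (microgauss).

    Monochromatic approximation: nu ~ (3/2) gamma^2 nu_g, with loss rate
    dE/dt = (4/3) sigma_T c gamma^2 U_B, so t = E / (dE/dt).
    """
    # CGS constants
    sigma_T = 6.652e-25   # Thomson cross-section [cm^2]
    c = 2.998e10          # speed of light [cm/s]
    m_e = 9.109e-28       # electron mass [g]
    e = 4.803e-10         # electron charge [esu]
    yr = 3.156e7          # seconds per year

    B = B_uG * 1e-6       # field strength [G]
    nu = nu_GHz * 1e9     # observing frequency [Hz]

    # Gyrofrequency, and the Lorentz factor of electrons radiating near nu
    nu_g = e * B / (2.0 * math.pi * m_e * c)   # [Hz]
    gamma = math.sqrt(nu / (1.5 * nu_g))

    # Magnetic energy density and cooling time t = E / (dE/dt)
    U_B = B**2 / (8.0 * math.pi)               # [erg/cm^3]
    t_s = 3.0 * m_e * c / (4.0 * sigma_T * gamma * U_B)
    return t_s / yr

# Electrons seen at 1.4 GHz in a ~10 microgauss field cool in a few 10^7 yr,
# while weaker fields (as in dwarf galaxies) give substantially longer times.
print(f"{synchrotron_cooling_time_yr(10.0, 1.4):.3g} yr")
```

Because the cooling time scales as B^(−3/2) ν^(−1/2), the synchrotron spectrum steepens with time at a rate set by the local field strength, which is what allows resolved spectral-index maps to be converted into CRe ‘age’ maps.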