Application of a novel radiative transfer scheme in the context of core collapse
Andre Klepitko
Solving radiative transfer in numerical simulations demands an enormous amount of computational resources. Similar to gravity, the effects of radiation act over long distances, so problems of this kind require considering the entire computational domain. This poses a challenge for most algorithms designed to run on parallel hardware, as communication between processes becomes a non-negligible part of the total cost.
When modelling radiation from dust, emission and absorption take place throughout the computational domain. To tackle this problem we employ a backwards ray-tracing scheme, which allows us to probe the entire domain efficiently from every point in space. As a result, we are able to treat multiple long-range effects such as gravity, ionising radiation and, with recent work, infrared radiation.
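The text does not spell out the algorithmic details, but the idea of backwards (reverse) ray tracing can be sketched on a uniform grid: the intensity at a cell is estimated by marching rays outwards from that cell and summing the emission of every cell crossed, attenuated by the optical depth accumulated along the way. The function name, the uniform-grid setup and the crude six-direction angular sampling below are illustrative assumptions, not the production scheme.

```python
import numpy as np

def reverse_ray_trace(j_nu, alpha_nu, dx, origin, directions, n_steps):
    """Estimate the angle-averaged intensity at one cell by marching rays
    outwards from it (backwards ray tracing) and summing the attenuated
    emission of every cell crossed.

    j_nu       : 3D array of emission coefficients [erg s^-1 cm^-3 sr^-1]
    alpha_nu   : 3D array of absorption coefficients [cm^-1]
    dx         : cell size [cm]
    origin     : (i, j, k) index of the cell whose intensity we want
    directions : (N, 3) array of unit vectors sampling the sphere
    n_steps    : number of marching steps per ray
    """
    shape = np.array(j_nu.shape)
    pos0 = (np.array(origin) + 0.5) * dx        # cell centre in physical units
    mean_intensity = 0.0
    for d in directions:
        tau = 0.0                                # optical depth along this ray
        intensity = 0.0
        for s in range(1, n_steps + 1):
            pos = pos0 + d * s * dx              # sample point along the ray
            idx = np.floor(pos / dx).astype(int)
            if np.any(idx < 0) or np.any(idx >= shape):
                break                            # ray has left the domain
            i, jj, k = idx
            # emission of this cell, attenuated by all material in between
            intensity += j_nu[i, jj, k] * dx * np.exp(-tau)
            tau += alpha_nu[i, jj, k] * dx
        mean_intensity += intensity
    return mean_intensity / len(directions)      # angle-averaged intensity J

# Tiny demo: one emitting blob in an optically thin 16^3 box
n = 16
j_nu = np.zeros((n, n, n)); j_nu[4, 8, 8] = 1.0
alpha_nu = np.full((n, n, n), 1e-3)
dirs = np.vstack([np.eye(3), -np.eye(3)])        # six axis-aligned directions
print(reverse_ray_trace(j_nu, alpha_nu, 1.0, (8, 8, 8), dirs, n))
```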
Figure 1 shows an astrophysical application of the new scheme, in which we have modelled the collapse of a subvirial 150 M☉ core seeded with an initial turbulent velocity field. Displayed from left to right are the column density, gas temperature, dust temperature and local infrared radiation field intensity. The upper left of Figure 1 gives the current time in units of the free-fall time, together with the number of sink particles present in the simulation. Each sink particle represents a stellar object, which is a principal source of radiation heating its surroundings. The simulation is still ongoing and has reached 0.3 t_ff.
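For a rough sense of the timescale involved, the free-fall time follows from the core's mean density, t_ff = sqrt(3π / (32 G ρ)). A minimal worked example; the 0.5 pc core radius is an assumed value chosen purely for illustration, as the text does not give one:

```python
import numpy as np

# Physical constants in cgs units
G     = 6.674e-8     # gravitational constant [cm^3 g^-1 s^-2]
M_SUN = 1.989e33     # solar mass [g]
PC    = 3.086e18     # parsec [cm]
YR    = 3.156e7      # year [s]

M = 150 * M_SUN      # core mass from the text
R = 0.5 * PC         # core radius: an assumed value for illustration

rho  = M / (4.0 / 3.0 * np.pi * R**3)            # mean density
t_ff = np.sqrt(3.0 * np.pi / (32.0 * G * rho))   # free-fall time

print(f"mean density: {rho:.2e} g/cm^3")
print(f"t_ff = {t_ff / YR:.2e} yr; 0.3 t_ff = {0.3 * t_ff / YR:.2e} yr")
```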
By feeding the radiation intensity into our chemistry network, we were able to improve our modelling of dust thermodynamics. This opens up new ways to investigate the role of dust cooling in astrophysical processes. Furthermore, we are able to compute the radiation pressure on dust and to quantify the role dust plays in boosting radiation pressure through re-emission.
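As an illustration of such a coupling, a local dust temperature can be obtained by balancing the power the dust absorbs from the infrared field against its thermal re-emission. This is only a sketch: the opacity law kappa_planck, its normalisation and the input numbers are illustrative assumptions, not the chemistry network used in the simulation.

```python
import numpy as np
from scipy.optimize import brentq

SIGMA_SB = 5.670e-5   # Stefan-Boltzmann constant [erg cm^-2 s^-1 K^-4]

def kappa_planck(T):
    """Toy Planck-mean dust opacity [cm^2/g]. The kappa ~ T^2 scaling is a
    common low-temperature approximation; the normalisation is assumed."""
    return 0.1 * (T / 10.0) ** 2

def dust_equilibrium_temperature(J_ir, kappa_abs):
    """Solve the energy balance
        4*pi*kappa_abs*J_ir = 4*kappa_planck(T_d)*sigma_SB*T_d**4
    for the equilibrium dust temperature T_d.

    J_ir      : angle-averaged IR intensity [erg cm^-2 s^-1 sr^-1]
    kappa_abs : absorption opacity for the incoming field [cm^2/g]
    """
    balance = lambda T: (4.0 * np.pi * kappa_abs * J_ir
                         - 4.0 * kappa_planck(T) * SIGMA_SB * T**4)
    # Absorption dominates at low T, emission at high T: bracket the root
    return brentq(balance, 1.0, 2000.0)

# Demo with assumed values for the local IR field and opacity
T_d = dust_equilibrium_temperature(J_ir=2.9, kappa_abs=0.4)
print(f"equilibrium dust temperature: {T_d:.1f} K")
```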