The mass of a star determines its ultimate fate. The smallest stars will burn lighter elements for tens of billions of years; stars like the Sun will make some heavier elements before shrinking into white dwarfs; and the most massive stars will create the heavier elements and scatter them into the Universe as they explode.
Estimating how often stars of different masses form is relatively easy: we can simply survey the Milky Way and count how many of each type of star we see. That, however, assumes the Milky Way is typical of other galaxies out there. Earlier this year, we got a hint that it wasn’t. Observations of one of the dwarf galaxies orbiting the Milky Way suggested that a star-forming region within it had an excess of massive stars.
But a dwarf galaxy is even more likely than the Milky Way to have an atypical star-formation process. So we really needed a more general measure of the masses of stars being formed across the larger Universe. We now have one, and big stars are still showing up at much higher rates than previous estimates would suggest.
The new work, done by a small European team, focuses on identifying specific isotopes in distant galaxies. One of those isotopes is carbon-13, which is predominantly made at the energies found in lower-mass stars. The other is oxygen-18, which requires energies only found in stars with masses greater than eight times that of the Sun. To measure these two isotopes from the era when the Universe was forming many of its stars, the researchers searched catalogs of past observations for galaxies whose light was emitted more than 10 billion years ago and that show strong carbon monoxide emissions.
While the differences between isotopes of the same element are subtle, they do show up in the wavelengths of light emitted by molecules that contain them. And the Atacama Large Millimeter/submillimeter Array (ALMA) has the spectral resolution to distinguish them. So the researchers directed ALMA to image these gravitationally lensed galaxies; the results allowed them to calculate the ratio of carbon-13 to oxygen-18.
And, for these distant galaxies, that ratio is quite low. Using modeling, the researchers show that the only way to get ratios that low is to have an unusually large number of massive stars.
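The logic here can be made concrete with a toy calculation. Suppose the distribution of stellar birth masses follows a single power law (a common simplification; the classic Salpeter slope of 2.35, the shallower "top-heavy" slope of 2.0, and the 0.1–100 solar-mass range below are all illustrative assumptions, not the model the researchers actually used). Flattening the slope sharply increases the fraction of newborn stellar mass locked into stars above eight solar masses, the oxygen-18 producers:

```python
import math

def mass_fraction_above(m_cut, alpha, m_min=0.1, m_max=100.0):
    """Fraction of newly formed stellar mass in stars heavier than m_cut
    (in solar masses), for a power-law initial mass function dN/dm ~ m**-alpha.
    A toy sketch for illustration only, not the study's actual model."""
    def mass_integral(a, b):
        # Integrate m * m**-alpha = m**(1 - alpha) from a to b.
        if abs(alpha - 2.0) < 1e-9:
            return math.log(b / a)  # special case: exponent would be zero
        p = 2.0 - alpha
        return (b**p - a**p) / p
    return mass_integral(m_cut, m_max) / mass_integral(m_min, m_max)

# Standard Salpeter slope vs. a hypothetical shallower, top-heavy slope
salpeter = mass_fraction_above(8.0, alpha=2.35)   # ~0.14
top_heavy = mass_fraction_above(8.0, alpha=2.0)   # ~0.37
print(f"mass fraction in stars > 8 M_sun: "
      f"Salpeter {salpeter:.2f}, top-heavy {top_heavy:.2f}")
```

Under these assumed numbers, the shallower slope more than doubles the share of mass going into oxygen-18-producing stars, which is the kind of shift that drives the carbon-13 to oxygen-18 ratio down.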
The researchers also plot this ratio for a variety of galaxies using data obtained by others and show that it changes along with the rate of star formation. Older, smaller galaxies that don’t form many stars tend to have higher ratios; the ratio drops in actively star-forming galaxies and drops further still in early-Universe galaxies that are experiencing a burst of star formation.
The range of values suggests that galaxies that are forming the most stars may be producing massive stars at up to seven times the rate of a more typical galaxy. This includes “starburst” galaxies in the early Universe as well as some luminous galaxies more locally.
So this is looking like a pattern we didn’t expect, and our models don’t predict it. Which means we must have gotten a few things wrong. Which things? Conveniently, the authors list them: “Classical ideas about the evolutionary tracks of galaxies and our understanding of cosmic star-formation history are challenged. Fundamental parameters governing galaxy formation and evolution—star-formation rates, stellar masses, gas-depletion and dust-formation timescales, dust extinction laws, and more—must be re-addressed.”
That seems like a pretty long list. The authors note that there have been advances in what we know about star formation and physics, so there’s a chance that we’re already in a position to handle some of these issues. But if this pattern of massive star formation continues to hold, it looks like we’ve got plenty of work to do.