Scientists claim to have discovered radioactive remnants of ancient alien stars that existed long ago but have since died out, concealed for millions of years in the depths of the Pacific, Atlantic, and Indian oceans.
These stars, thought to have been more massive than the sun, may have exploded in a succession of events in the geologically recent past, prompting experts to speculate whether these colossal stellar blasts altered the planet’s climate. The unearthed samples contain radioactive iron of a kind typically forged in the cores of giant stars and in violently erupting supernovae.
Scientists suspect these particles were dispersed across interstellar space as stardust before settling on Earth. The explosions are thought to have occurred approximately 300 light years from Earth, as far back as 3.2 million years ago. Astronomers note that, given the cataclysmic character of such explosions, a supernova detonating much closer to Earth would have generated enough energy and cosmic rays to exterminate virtually any form of organic life on the planet.
According to Berlin Institute of Technology astronomer Dieter Breitschwerdt, these stellar eruptions were associated with massive star systems that may have been relatively nearby.
“We have now a consistent and coherent picture of what happened around the solar system in the last 20 million years and we know how close these supernovae were. We can now proceed to find out if there might have been any biological effects.”
Nearly a decade ago, a faraway stellar blast was recorded some 50,000 light years from our planet in what was seen as the brightest galactic event ever observed. Its impact was powerful enough to alter the Earth’s upper atmosphere for a brief period. According to astronomers, an explosion of such formidable magnitude could easily have triggered a mass extinction on Earth had it originated approximately 30 light years from our planet.
Astronomers believe these explosions would also have been easily visible to the naked eye from Earth. Although the supernovae would have appeared almost as luminous as the moon, and perhaps equally captivating to behold, a much closer blast would nonetheless have been calamitous for all life on Earth.
According to University of Kansas astrophysicist Adrian Melott, the evidence will enable scientists to examine events in Earth’s past that may have been related to this kind of stellar devastation.
“Our local research group is working on figuring out what the effects were likely to have been. We really don’t know. The events weren’t close enough to cause a big mass extinction or severe effects, but not so far away that we can ignore them either. We’re trying to decide if we should expect to have seen any effects on the ground on the Earth.”
Iron is one of the most abundant elements in the Earth’s crust, and scientists contend that the planet’s core consists largely of iron. The element is also found in the chemical make-up of the sun and asteroids, as well as in the alien stars that abound beyond the solar system. One radioactive form, known as “iron-60,” is a telltale signature of colossal supernova explosions, blasts powerful enough to spark the formation of a nascent star system.
Supernovae are among the most luminous objects observed in outer space and are said to be the most powerful stellar eruptions in the universe. They occur when a massive star reaches the end of its lifetime in a huge explosion, scattering its stellar matter out into deep space. A stellar explosion on the farthest peripheries of the universe can also offer compelling clues about the origins of the ever-elusive black holes that persist in obscurity.