Astronomy News

New class of galaxy mainly dark matter?


From Rachel Feltman at Washington Post:

But now scientists have found something entirely new: a galaxy with the same mass as the Milky Way but with only 1 percent of our galaxy’s star power. About 99.99 percent of this other galaxy is made up of dark matter, and scientists believe it may be one of many.

The galaxy Dragonfly 44, described in a study published Thursday in the Astrophysical Journal Letters, is 300 million light years away. If scientists can track down a similar galaxy closer to home, however, they may be able to use it to make the first direct detection of dark matter. More.

See also: Dark matter skeptics wanted. These people should really talk.

Follow UD News at Twitter!

8 Replies to “New class of galaxy mainly dark matter?”

  1.
    Seversky says:

    As I understand it, the existence of dark matter is inferred from the observed motion of galaxies. If you estimate the total mass of all the visible stars in a galaxy, their combined gravitational attraction is not strong enough to prevent the galaxy from flying apart, given the speed at which it’s observed to be rotating. The best explanation is that there is something there we can’t see which provides the missing matter and hence the additional gravitational force needed to hold them together – dark matter.

    Now, tell us again how a design inference would have led us to the same conclusion or maybe an even better one.

  2.
    EDTA says:

    I’m not aware that accepting design means that we would stop observing and hypothesizing. That process should continue also. I for one welcome new discoveries (assuming they are genuine).

  3.
    ppolish says:

    Seversky lol – dark matter was inferred BECAUSE of design. If galaxies rotated according to scientifically accepted design rules – there would be no need for inferring dark matter. Lol again.

  4.
    Seversky says:

    ppolish @ 3

    Seversky lol – dark matter was inferred BECAUSE of design. If galaxies rotated according to scientifically accepted design rules – there would be no need for inferring dark matter. Lol again.

    So dark matter was inferred because galaxies don’t rotate the way they are designed to rotate? Sounds like another example of poor design to me – if design came into it at all.

  5.
    ppolish says:

    Seversky, where did I say “galaxies don’t rotate the way they are designed to rotate”? Where? Nowhere that’s where.

    They rotate differently from scientifically accepted design rules. Is that “bad”? Course not – it’s an inspiration for science to discover better explanations. And believe you me, the better explanation will not involve “Ooops” lol:)

  6.
    Pearlman says:

    ‘Missing’ dark matter is assumed because of the assumption that the universe is expanding, and because of the motion and observed cohesion of the galaxies. Ongoing cosmic expansion can be rejected because of the vastly greater probability of the SPIRAL cosmological redshift hypothesis; the motion, and the extent to which the galaxies have flown apart, can be explained by YeC being the reality, not deep-time doctrine dogma.

  7.

    Design answers the question of dark matter very simply. Genesis 1:2.
    It even tells you what it is made of. Water.

    But it requires a very low entropy Big Bang to work, something few scientists want to allow. Why? Because low entropy = design.

    Any more questions?

  8.
    Gordon Davisson says:

    Rob, I have a question: how do you think astronomers could possibly miss spotting that much water?

    To get a sense of the scale we’re talking about, let’s look at the Andromeda galaxy: its total mass is about 1.5e12 solar masses, or 3.0e42 kg. Assuming that’s 90% dark matter, that’s 2.7e42 kg, and if the dark matter is water that’d be 2.7e39 cubic meters of water. Andromeda’s radius is about 110,000 light years = 1.0e18 meters, so a spherical shell around it would have a surface area of 1.3e37 m^2, so we’re talking about enough water to make a shell about 200 meters thick around the entire galaxy.
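For anyone who wants to check the arithmetic, here is a quick sketch in Python. The solar mass and water density are standard values; the mass fraction and the radius-in-metres figure are taken from the comment as given:

```python
# Back-of-envelope check of the shell-thickness figure above,
# using the numbers quoted in the comment.
M_SUN = 1.989e30          # solar mass, kg (standard value)
WATER_DENSITY = 1000.0    # kg/m^3
PI = 3.141592653589793

total_mass = 1.5e12 * M_SUN               # Andromeda, ~3.0e42 kg
dark_mass = 0.9 * total_mass              # assume 90% dark matter
water_volume = dark_mass / WATER_DENSITY  # m^3 of liquid water

radius = 1.0e18                      # radius-in-metres figure from the comment
shell_area = 4 * PI * radius**2      # m^2
shell_thickness = water_volume / shell_area  # m

print(f"water volume:    {water_volume:.1e} m^3")
print(f"shell area:      {shell_area:.1e} m^2")
print(f"shell thickness: {shell_thickness:.0f} m")
```

This reproduces the "about 200 meters thick" figure from the comment's inputs.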

    Of course, dark matter is not arranged in a spherical shell, it’s concentrated near the plane of the galaxy. We see Andromeda nearly edge-on, which means we’d actually be looking through even more water than that.

    Ok, after a little googling, I assume you’re talking about your proposal that dark matter is composed of primordial comets containing liquid water (“Primordial Comets: Big Bang Nucleosynthesis, Dark Matter & Life”)? If the water was in the form of comets 30 km in diameter (the high end of the range you mention), they’d have an optical cross section of at least 5.0e-8 m^2/kg, meaning the total cross section of the comet cloud would be at least 1.3e35 m^2, which (at a radius of 110 kly) is enough to block/scatter 1% of the light from Andromeda… except again we’re looking at it through a denser section of dark matter, so it’d block more than that.

    If the comets were only 3km in diameter, the blocked light goes up to (over) 10%. If they were 100m (your estimate from dark matter constraints), we’re talking about complete blockage/scattering of all light from the galaxy.
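The scaling behind these three cases can be sketched as follows: the optical cross-section per kilogram of a spherical comet of radius r and density rho is (pi r^2) / ((4/3) pi r^3 rho) = 3 / (4 r rho), so opacity grows as the comets shrink. The dark-matter mass and shell-area figures below are the ones used in the comment:

```python
# Sketch of the comet-opacity scaling in the paragraphs above.
WATER_DENSITY = 1000.0   # kg/m^3
DARK_MASS = 2.7e42       # kg of dark matter, figure from the comment
SHELL_AREA = 1.3e37      # m^2, the comment's shell area at 110 kly

def blocked_fraction(comet_diameter_m):
    """Fraction of the shell covered by comet cross-sections (>1 = opaque)."""
    r = comet_diameter_m / 2
    sigma_per_kg = 3 / (4 * r * WATER_DENSITY)      # m^2 per kg of comet
    total_cross_section = sigma_per_kg * DARK_MASS  # m^2
    return total_cross_section / SHELL_AREA

for d in (30e3, 3e3, 100.0):   # 30 km, 3 km, 100 m comets
    print(f"{d:>8.0f} m comets -> blocked fraction ~ {blocked_fraction(d):.2f}")
```

With the comment's inputs this gives roughly 1% for 30 km comets, about 10% for 3 km comets, and a value above 1 (i.e. complete blockage) for 100 m comets.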

    There’s no way we could’ve missed that.

    I also have … doubts … about your theory of nucleosynthesis, but I’m not knowledgeable enough to critique it properly. This bit, however:

    But it requires a very low entropy Big Bang to work, something few scientists want to allow. Why? Because low entropy = design.

    is clearly wrong on all counts. First, the standard model of the big bang does start with a very low-entropy state — a (nearly) uniform state, instead of having the mass concentrated in black holes. The early states (e.g. quark-gluon plasma) look high-entropy only if you don’t take gravity into account.

    But low entropy has nothing to do with design. Investigations into Maxwell’s demon — a hypothetical intelligent being who could (supposedly) decrease entropy by sorting molecules — have found that as long as the demon obeys the other laws of physics, it’ll wind up producing as much entropy as it destroys, so there’s no net decrease in entropy. So, there’s no basis for linking intelligence to decreases in entropy.

    But normal physical processes can produce low-entropy states. This may seem impossible, since they can’t decrease total entropy either, but they can (in some situations) increase the total possible entropy, and thus decrease the actual entropy relative to the possible entropy (i.e. increase negentropy).

    “Huh?”, I hear you say. Ok, simple analogy: a glass partly full of water.

    – The water represents the entropy in our system
    – The total capacity of the glass represents the entropy the system could have (given its current contents) if it were at equilibrium
    – The empty space (the water/entropy the glass could hold but doesn’t) is the negentropy or missing entropy

    The second law says that water (/entropy) can be produced but not destroyed, so if there’s none leaving the system the amount of water (/entropy) cannot decrease. But the second law doesn’t say anything about the capacity of the glass, and if that’s increasing (stretchy glass?) fast enough the amount of empty space (/negentropy) can also be increasing.
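A minimal toy model of that bookkeeping (all the rates here are made-up illustrative numbers, not physics):

```python
# Entropy ("water") can only be produced, but if the capacity
# ("glass") grows faster, the gap (negentropy) grows too.
entropy = 5.0      # current entropy: the water in the glass
capacity = 5.0     # maximum possible entropy: the size of the glass

for step in range(10):
    entropy += 0.1     # second law: entropy can only increase
    capacity += 0.5    # constraints changing: the glass stretches faster
    negentropy = capacity - entropy

print(f"entropy={entropy:.1f}, capacity={capacity:.1f}, negentropy={negentropy:.1f}")
```

Entropy never decreases, yet the negentropy climbs from zero to a positive value, which is the point of the analogy.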

    Ok, that was pretty vague and hand-wavey. Here’s a specific example to illustrate the principle: start with a container with some air, water, and rocks in it. Isolate it, and let it reach equilibrium. That means everything in it winds up at the same temperature, and the water evaporates (or condenses) until the partial pressure of the water vapor in the air is equal to the vapor pressure at that temperature (i.e. 100% relative humidity). The system is now at its maximum entropy, so its negentropy is zero.

    Now expand the container rapidly. The air undergoes adiabatic cooling, but the rocks and water, being nearly incompressible, don’t (as much), so they’re no longer at the same temperature. Also, as the air cooled, the vapor pressure of water decreased (faster than the actual partial pressure of water vapor), so the air is now supersaturated with water (>100% humidity). Both of these effects mean the system is no longer at equilibrium, and its negentropy is now positive. Its entropy will have increased during the expansion, but its total entropy capacity increased even faster, so its negentropy increased as well.
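The supersaturation effect can be illustrated numerically with textbook formulas (adiabatic cooling, T V^(gamma-1) = const, plus Clausius–Clapeyron for the saturation pressure); the starting temperature and expansion factor below are arbitrary illustrative choices, not values from the comment:

```python
import math

# Start with saturated air (100% humidity) at 300 K, then double the
# volume adiabatically and compare how fast the vapor's partial pressure
# falls versus the saturation pressure.
GAMMA = 1.4      # adiabatic index of (dry) air
L_VAP = 44.0e3   # molar latent heat of water near room temp, J/mol
R = 8.314        # gas constant, J/(mol K)

T1 = 300.0       # initial temperature, K (illustrative choice)
expansion = 2.0  # V2 / V1 (illustrative choice)

# Adiabatic cooling of the gas: T * V**(GAMMA - 1) = const
T2 = T1 * expansion ** (1 - GAMMA)

# Partial pressure of the vapor (fixed moles, p = nRT/V):
p_ratio = (T2 / T1) / expansion

# Saturation pressure ratio via Clausius-Clapeyron: p_sat ~ exp(-L/(R*T))
psat_ratio = math.exp(-(L_VAP / R) * (1 / T2 - 1 / T1))

# Relative humidity started at 100%; after the expansion:
humidity = p_ratio / psat_ratio
print(f"gas cools to {T2:.0f} K; relative humidity -> {humidity:.0f}x saturation")
```

The gas cools to roughly 227 K, and because the saturation pressure collapses exponentially with temperature, the air ends up far above 100% humidity: the system is out of equilibrium, i.e. its negentropy is now positive.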

    If you leave it isolated again, its entropy will creep up to its new (higher) maximum, so its negentropy will drop back down to zero. But if it continued expanding, there’s no reason its negentropy would ever have to decrease.

    (BTW, my example may suggest that expansion is the critical element here, but in fact a variety of different changes in the system’s constraints could cause an increase in negentropy. For example, if the container contracted instead of expanding, the gas would heat up (more than the water and rocks), and the water vapor would become undersaturated, so this would also produce an increase in negentropy.)

    Similar effects clearly happened during the big bang. For one thing, after light decoupled from matter, the photon gas (now the microwave background) cooled off at a different rate from the matter in the universe (in much the same way the air cooled differently from the water & rocks in my first example). Even more significantly, as the temperature of the matter dropped, the thermodynamically preferred state for baryons shifted. At high temperatures, free protons (aka hydrogen nuclei) were preferred, but as the matter cooled the equilibrium state shifted up the periodic table, eventually settling at iron-56. But most of the baryons got stuck in the hydrogen state, creating a huge amount of negentropy — enough to power every star in existence.
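To put a rough number on that reservoir (this calculation is not in the comment; the ~8.79 MeV/nucleon binding energy of iron-56 is a standard nuclear-data value):

```python
# Matter frozen as hydrogen (free protons, ~0 binding energy) rather than
# iron-56 (~8.79 MeV/nucleon, the most tightly bound region of the
# periodic table) carries roughly this much releasable fusion energy.
MEV_TO_J = 1.602e-13       # joules per MeV
NUCLEON_MASS = 1.674e-27   # kg (approx. proton mass)
C = 2.998e8                # speed of light, m/s

BE_IRON56 = 8.79           # binding energy, MeV per nucleon
BE_HYDROGEN = 0.0          # free protons: no binding energy

energy_per_kg = (BE_IRON56 - BE_HYDROGEN) * MEV_TO_J / NUCLEON_MASS
fraction_of_mc2 = energy_per_kg / C**2

print(f"~{energy_per_kg:.1e} J/kg, i.e. {fraction_of_mc2:.1%} of rest mass")
```

That works out to nearly 1% of the rest-mass energy of every kilogram of hydrogen, which is the negentropy budget stars have been drawing down ever since.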

    That doesn’t explain the uniformity of the early universe, but I don’t see any reason to think it’s inexplicable by similar effects. There’s certainly no reason to think intelligence would’ve been required to produce it (or even that intelligence would’ve helped!).
