
Is this a CONSPIRACY?

14 Comments

science


Interesting article about conspiracy theorists. I have always said, and I am saying it again: conspiracy theorists have an interesting, not-so-balanced psychological profile, with a ‘touch of’ paranoia of course. No point arguing with them either. I have known a few, say no more!


Once you buy into the first conspiracy theory, the next one seems that much more plausible.

To believe that the U.S. government planned or deliberately allowed the 9/11 attacks, you’d have to posit that President Bush intentionally sacrificed 3,000 Americans. To believe that explosives, not planes, brought down the buildings, you’d have to imagine an operation large enough to plant the devices without anyone getting caught. To insist that the truth remains hidden, you’d have to assume that everyone who has reviewed the attacks and the events leading up to them—the CIA, the Justice Department, the Federal Aviation Administration, the North American Aerospace Defense Command, the Federal Emergency Management Agency, scientific organizations, peer-reviewed journals, news organizations, the airlines, and local law enforcement agencies in three states—was incompetent, deceived, or part of the cover-up.

Conspiracy theory psychology is becoming an empirical field with a broader mission: to understand why so many people embrace this way of interpreting history. As you’d expect, distrust turns out to be an important factor. But it’s not the kind of distrust that cultivates critical thinking.

The common thread between distrust and cynicism, as defined in these experiments, is a perception of bad character. More broadly, it’s a tendency to focus on intention and agency, rather than randomness or causal complexity. In extreme form, it can become paranoia. In mild form, it’s a common weakness known as the fundamental attribution error—ascribing others’ behavior to personality traits and objectives, forgetting the importance of situational factors and chance. Suspicion, imagination, and fantasy are closely related.

Read it for yourself!

http://www.slate.com/articles/health_and_science/science/2013/11/conspiracy_theory_psychology_people_who_claim_to_know_the_truth_about_jfk.single.html



Are we evolving stupidity?

44 Comments

An article that appeared in the latest New Scientist. You have to subscribe to read it, but you can also read it here: technology may be getting smarter, but humans are getting dumber, scientists have warned.

I am not in the least surprised! Off the top of my head I would agree, though one has to be careful not to appear discriminatory or politically incorrect. Have you noticed how people write, what they read, the level of their general knowledge? And this is passed on to their children. Every child has a smartphone, with language that I really do not understand. Shortcuts! Everything has to save time, more time to be bored!

The Daily Mail writes: An IQ test used to determine whether Danish men are fit to serve in the military has revealed that scores have fallen by 1.5 points since 1998.

Between the 1930s and 1980s, the average IQ score in the US rose by three points per decade, and in post-war Japan and Denmark, test scores also increased significantly – a trend known as the ‘Flynn effect’.

This increase in intelligence was due to improved nutrition and living conditions – as well as better education – says James Flynn of the University of Otago, after whom the effect is named. 

Now some experts believe we are starting to see the end of the Flynn effect in developed countries – and that IQ scores are not just levelling out, but declining. 

[Image: graph of world IQ scores over time]

Westerners have lost 14 IQ points on average since the Victorian age, according to a study published by the University of Amsterdam last year.

Jan te Nijenhuis thinks this could be because intelligent women tend to have fewer children than women who are not as clever, The Huffington Post reported.

The perceived link between IQ and fertility is a very contentious one.

Dr te Nijenhuis studied the results of 14 intelligence studies conducted between 1884 and 2004 to reach his conclusion.

Each study measured people’s reaction times – how long they took to press a button after being prompted.

It is claimed that reaction time mirrors mental processing speed – so it reflects intelligence.

The studies found that visual reaction times averaged 194 milliseconds in the late 19th century, but by 2004 they had increased to 275 milliseconds.

This would suggest that people have become less intelligent, they said.
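For a sense of scale, here is a minimal sketch in Python of the slowdown those two figures imply. The millisecond values are the ones quoted above; the study’s actual conversion from reaction time to IQ points is not reproduced here.

```python
# Slowdown implied by the reaction-time figures quoted above.
victorian_ms = 194  # average visual reaction time, late 19th century
modern_ms = 275     # average visual reaction time, 2004

slowdown_ms = modern_ms - victorian_ms
print(f"Absolute slowdown: {slowdown_ms} ms")                 # 81 ms
print(f"Relative slowdown: {slowdown_ms / victorian_ms:.0%}") # ~42%
```

In other words, the modern averages were roughly 42 per cent slower than the Victorian ones.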

But unfortunately the older generation, who are supposed to be ‘more intelligent’, are now becoming too lazy to think. A lot are not even computer literate; they have to ask their children. Too lazy to figure it out!

If you find this one on Hoax-Slayer, go shout at the BBC

31 Comments

dinosaur


Fossilised bones of a dinosaur believed to be the largest creature ever to walk the Earth have been unearthed in Argentina, palaeontologists say.

Based on its huge thigh bones, it was 40m (130ft) long and 20m (65ft) tall. Weighing in at 77 tonnes, it was as heavy as 14 African elephants, and seven tonnes heavier than the previous record holder, Argentinosaurus. Scientists believe it is a new species of titanosaur – an enormous herbivore dating from the Late Cretaceous period. Read more: http://ow.ly/wXCu1

Image credit: BBC Natural History Unit

Six health myths that you should avoid.

26 Comments

This comes from the New Scientist, but to read it you have to subscribe. So I am giving you the full story.


DON’T SWALLOW IT: SIX HEALTH MYTHS YOU SHOULD IGNORE. 26 August 2013 by Caroline Williams
Drink eight glasses of water per day
It’s the myth that just won’t go away. Almost everyone thinks they don’t drink enough water, but the idea that we all should drink lots of it – eight glasses per day – is based on no scientific data whatsoever. No one really knows where the eight-glasses idea comes from. Some blame the bottled water industry but plenty of doctors and health organisations have also promoted it over the decades. The source might be a 1945 recommendation by the US National Research Council (NRC) that adults should consume 1 millilitre of water for each calorie of food, which adds up to about 2.5 litres per day for men and 2 litres for women. According to Barbara Rolls, a nutrition researcher at Penn State University and author of the 1984 book Thirst, this amount is about right for people in a temperate climate who aren’t exercising vigorously. And 1.9 litres is what you’ll get from drinking eight 8-ounce glasses of water – the 8×8 rule – as per the US version of the myth.

What most people don’t realise, though, is that we get a lot of that water from our food, as the NRC pointed out at the time. Foods contain water and are broken down chemically into carbon dioxide and more water. So if you are not sweating buckets you need only about a litre a day – and 1.2 litres is what you will get from the eight 150-millilitre glasses recommended by the UK’s health service.

But any talk of glasses is misleading because there is no need to drink pure water. The fluids that people drink anyway, including tea and coffee, can provide all the water we need, says Heinz Valtin, a kidney specialist at Dartmouth Medical School in Lebanon, New Hampshire, who has reviewed the evidence (Regulatory Integrative and Comparative Physiology, vol 283, p R993). According to the myth, however, caffeinated drinks don’t count because they are diuretic, stimulating the body to lose more water than it gets from the drink. Not true. A comparison of healthy adults in 2000 found no difference in hydration whether they got their water from caffeinated drinks or not (Journal of the American College of Nutrition, vol 19, p 591). Even one or two mildly alcoholic drinks will hydrate you rather than dehydrating you.

Hydrophilics respond by saying that pure water is better than other drinks. Even this claim is arguable, but the crucial point is that if you are a healthy individual already drinking enough tea, milk, juice or whatever, there is no evidence that swigging down water as well will achieve anything other than making you go to the bathroom all the time.

The final aspect of this myth is that we need to force ourselves to drink because by the time we are thirsty we are already seriously dehydrated. Not so. Rolls showed nearly 30 years ago that we get thirsty long before there is any significant loss of bodily fluids. It takes less than a 2 per cent rise in the concentration of the blood to make us want to drink, while the body isn’t officially regarded as dehydrated until a rise of 5 per cent or more. So relax and trust your body. Don’t force yourself to gulp down gallons of water if you don’t want to – that can be dangerous – just drink the beverage of your choice whenever you’re thirsty.
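The arithmetic in that section is easy to check. Below is a minimal sketch in Python, assuming a US fluid ounce of roughly 29.57 millilitres; the 2,500 and 2,000 kcal/day intakes are the ones implied by the NRC figures quoted above (1 ml per calorie giving 2.5 and 2 litres).

```python
# Back-of-the-envelope check of the water-intake figures quoted above.
US_FL_OZ_ML = 29.5735  # millilitres per US fluid ounce (assumed conversion)

us_8x8 = 8 * 8 * US_FL_OZ_ML / 1000  # eight 8-ounce glasses: the US "8x8" rule
uk_glasses = 8 * 150 / 1000          # eight 150 ml glasses: UK guidance
nrc_men = 2500 * 1 / 1000            # 1 ml per calorie at ~2,500 kcal/day
nrc_women = 2000 * 1 / 1000          # 1 ml per calorie at ~2,000 kcal/day

print(f"US 8x8 rule: {us_8x8:.1f} litres/day")     # ~1.9
print(f"UK guidance: {uk_glasses:.1f} litres/day") # 1.2
print(f"NRC, men:    {nrc_men:.1f} litres/day")    # 2.5
print(f"NRC, women:  {nrc_women:.1f} litres/day")  # 2.0
```

All four figures line up with the article’s, which is a useful reminder that the competing recommendations differ by more than a factor of two.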
Our bodies can and should be “detoxed”
We live in a toxic world. You’re breathing in lead as you read this. Your next meal will contain everything from natural poisons to pesticides and pollutants. As a result, the human body is a veritable cesspit of suspect chemicals. The last US National Report on Human Exposure to Environmental Chemicals found potentially concerning levels of dozens of undesirable substances, including heavy metals, dioxins, PCBs and phthalate plasticisers, in the blood and urine of Americans.

The question is, what can we do about it? According to popular wisdom, we need to “detox” to get rid of these poisons in our body, and there is no shortage of advice on the best way to accomplish this. But do any of these detox plans actually work? And is detoxing really good for us?

For a start, we are already doing it all the time, with the help of our livers, kidneys and digestive systems. Most of the toxic chemicals we consume are broken down or excreted, or both, within hours. However, it can take weeks, months or even years to get rid of some substances, especially fat-soluble chemicals such as dioxins and PCBs. If we take these in faster than our bodies can get rid of them, levels build up in our bodies. Many detox programmes promote a period of consuming only fluids and no solid food, but this will make virtually no difference to levels of chemicals that have built up over years. “For many of these it will take between six and 10 years of zero exposure to get rid of one-half of the amount stored in our fat tissues,” says Andreas Kortenkamp, a toxicologist at Brunel University in London. “That is not achievable, because, unfortunately, there is no zero exposure.”

What’s more, fasting or dieting releases fat-soluble chemicals into the blood, rather than eliminating them from the body. One study found the level of organochlorines and pesticides in blood shot up by 25 to 50 per cent after people lost a lot of weight quickly (Obesity Surgery, vol 16, p 1145). Animal studies show that this increases the level of compounds in tissues like the muscles and brain, where they can do more harm than in fat. This sudden flood of chemicals could even cause the kind of problems detoxers are trying to avoid, says Margaret Sears, an environmental health researcher at the CHEO Research Institute in Ottawa, Canada. “These chemicals have toxic effects as endocrine disruptors that paradoxically affect energy levels and appetite, potentially contributing to yo-yo weight loss and gain,” she says. Plus there’s no guarantee that chemicals released from fat will actually leave the body – some will end up back in storage. With chemicals that the body does eliminate rapidly, such as phthalates, a short fast will lower levels. It’s not clear that this does you any good, though. As soon as you start eating again, says Kortenkamp, levels go back to where they were.

For these reasons, Sears recommends what she calls a “lifelong detox”, which involves eating as healthily as possible and avoiding chemicals in the home and workplace as much as you can. But Kortenkamp isn’t convinced that even that will help much. “Only regulatory action that reduces exposures will work. Individual avoidance strategies are but a drop in the ocean,” he says. That said, you can greatly reduce your exposure to toxic chemicals like nicotine and alcohol. There is also one way of speeding up the removal of many fat-soluble toxic chemicals that is supported by scientific evidence – producing milk (Lipids, vol 36, p 1289).
While it is possible for women to induce lactation without giving birth – and even for men to lactate – the milk-yourself detox method is probably unlikely to catch on.
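To see why a short fast cannot shift these long-lived chemicals, it helps to run the numbers. Here is a minimal sketch in Python, assuming simple first-order (exponential) elimination and the six-to-ten-year half-lives Kortenkamp quotes; zero ongoing exposure is assumed, which, as he notes, is unrealistic.

```python
# Exponential elimination of a stored fat-soluble chemical,
# assuming first-order kinetics and zero ongoing exposure.
def fraction_remaining(years: float, half_life_years: float) -> float:
    """Fraction of the stored amount still in the body after `years`."""
    return 0.5 ** (years / half_life_years)

for half_life in (6, 10):           # the 6-10 year range quoted above
    for years in (0.1, 1, 10, 20):  # a crash detox vs. years of abstinence
        left = fraction_remaining(years, half_life)
        print(f"half-life {half_life:>2} y, after {years:>4} y: {left:.0%} left")
```

Even a full year of zero exposure would leave roughly 89 to 93 per cent of the stored amount in place; a few weeks of fasting barely registers.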
Antioxidant pills help you live longer

It seems blindingly obvious. As our cells metabolise the food we eat, they produce rogue molecules called free radicals that wreak havoc. Over a lifetime, the damage they do slowly builds up and may cause all kinds of degenerative diseases. Luckily, though, many chemicals can act as antioxidants that mop up free radicals. Plus, eating vegetables rich in antioxidants seems to reduce the risk of degenerative diseases. So popping pills packed with antioxidants must surely help stave off these diseases too?

That’s what some scientists started thinking from the 1970s onwards. The Nobel prizewinning chemist Linus Pauling enthusiastically promoted high doses of vitamins without waiting for the evidence, the public lapped it up and a whole new industry sprang up to meet demand. Then, in the 1990s, the results of rigorous trials of some of the most popular supplements, including beta carotene, vitamin E and vitamin C, started to come in. Study after study has found that while these substances do work as antioxidants in the test tube, popping the pills does not provide any benefit. On the contrary, some studies suggest that they are harmful. A 2007 review of nearly 70 trials involving 230,000 people concluded that not only do antioxidant supplements not increase lifespan, but that supplements of beta carotene and vitamins A and E actually seem to increase mortality (Journal of the American Medical Association, vol 297, p 842).

Why? Perhaps because high levels of free radicals tell cells to ramp up their own built-in antioxidant defences, says Barry Halliwell, a biochemist at the National University of Singapore. He thinks these internal defences are far more effective than the antioxidants we get from food. So by taking supplements we may be deactivating a first-rate defence mechanism and replacing it with a poorer one (Nutrition Reviews, vol 70, p 257). “Free radicals in low amounts also play useful roles,” Halliwell says.

If this is right, the benefits of vegetables may have nothing to do with antioxidants. One suggestion is that vegetables are beneficial because they are mildly poisonous – a little poison may activate protective mechanisms that ward off disease. In the meantime, the antioxidant juggernaut rolls on. No one seems keen to abandon the idea that antioxidant supplements are good for you.
Being a bit overweight shortens life

Let’s be clear – being seriously obese is bad for your health. A body mass index of over 40 increases the risk of type 2 diabetes, heart disease and certain cancers and increases the risk of dying from any cause by up to 29 per cent. This is not a health myth.

But carrying just a few extra pounds, far from being a one-way ticket to an early grave, seems to deter the grim reaper, according to a recent review of nearly a hundred studies involving nearly 3 million people. The review, led by Katherine Flegal of the US Centers for Disease Control in Hyattsville, Maryland, reported earlier this year that being “overweight” – defined as having a body mass index (BMI) of 25 to 29 – seems to have a protective effect, with a 6 per cent reduction in death risk compared with people with a BMI of between 18.5 and 25. Those with BMIs over 35, however, have a higher risk (JAMA, vol 309, p 71).

It isn’t clear why being overweight might protect against an early death. Perhaps carrying a few extra pounds in reserve helps the body fight off illness or infection. Perhaps overweight people are more likely to receive medical attention. Or perhaps some of those counted as “normal” had lost weight due to serious illnesses. Whatever the reason, Flegal says her finding is not a green light to eat all the pies. Overweight people might be more likely to develop diseases that affect the quality of life, for instance. Even so, it seems that a little bit of flab may not be the crime against health it has always been made out to be.
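Since the whole argument turns on BMI bands, it may help to spell out the formula: BMI is weight in kilograms divided by the square of height in metres. Here is a minimal sketch in Python using the bands discussed above; the example weights and the 1.75 m height are made up for illustration.

```python
# BMI = weight (kg) / height (m)^2, with the bands discussed above.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def band(b: float) -> str:
    if b < 18.5:
        return "underweight"
    if b < 25:
        return "normal (18.5-25)"
    if b < 30:
        return "overweight (the protective 25-29 band)"
    if b < 35:
        return "obese"
    return "obese, higher risk (over 35)"

for weight_kg in (55, 75, 90, 115):  # hypothetical weights for a 1.75 m adult
    b = bmi(weight_kg, 1.75)
    print(f"{weight_kg} kg at 1.75 m -> BMI {b:.1f}: {band(b)}")
```

For an adult of that height, the much-discussed 25-29 “overweight” band spans only about 15 kilograms.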
We should live and eat like cavemen
Our bodies didn’t evolve for lying on a sofa watching TV and eating chips and ice cream. They evolved for running around hunting game and gathering fruit and vegetables. So, the myth goes, we’d all be a lot healthier if we lived and ate more like our ancestors.

This “evolutionary discordance hypothesis” was first put forward in 1985 by medic S. Boyd Eaton and anthropologist Melvin Konner, both of Emory University in Atlanta, Georgia (NEJM, vol 312, p 283). In it they claimed that while our genes haven’t changed for at least 50,000 years, our diets and lifestyles have changed greatly since the advent of agriculture 10,000 years ago, and it has all happened too quickly for us to evolve to deal with it. This, they argued, is the reason why diabetes, heart disease and cancers are rife. If we could only exercise more and eat like hunter-gatherers, we’d be fitter, happier and healthier.

In recent years, the Stone Age or “paleo” diet based on these ideas has become very popular. It involves eating game, fish, fruit, vegetables and nuts, and avoiding grains, dairy, legumes, oils, refined sugars and salt. Some aspects, such as exercising more and eating less highly processed grains and sugars, agree with the latest evidence. But others, such as ditching grains, legumes and dairy, do not.

And the underlying rationale is flawed. The idea that there was some evolutionary sweet spot 50,000 years ago just isn’t true, says Marlene Zuk, an evolutionary biologist at the University of Minnesota in Saint Paul, who has written a book debunking the paleo lifestyle. Our ancestors were not perfectly adapted to their lifestyles, and we have adapted to our agricultural diet. For instance, many people have extra copies of genes for digesting the starch found in grains. The ability to digest milk as an adult – lactose tolerance – has also evolved independently in several populations.

Another criticism is that we don’t know for sure what our ancestors ate. They definitely didn’t eat anything like the animals and plants we eat today, which have been transformed beyond recognition by selective breeding. Last but not least, it’s not clear that ancient hunter-gatherers really were that much healthier than the rest of us (The Lancet, vol 381, p 1211). Evolution, after all, doesn’t care if we drop dead once we’ve raised our children and grandchildren.

The original proponents of the discordance hypothesis still stand by their idea, but they have revised it in light of the latest evidence. Eaton and Konner now include low-fat dairy products and whole grains in their recommended foods (Nutrition in Clinical Practice, vol 25, p 594).
Sugar makes children hyperactive
Every parent has seen it happen: take a group of young children, add sugar, then stand back and watch them bounce off the walls. But although many parents will find it hard to believe, sugar does not cause hyperactivity. A 1996 review of 12 blinded studies, where no one at the time knew which kids had received sugar and which a placebo, found no evidence to support this notion. This is true even for children with ADHD or whose parents consider them to be sensitive to sugar (Critical Reviews in Food Science and Nutrition, vol 36, p 31).

In fact, one of these studies concluded that the sugar effect is all in parents’ minds. Parents and their 5 to 7-year-old “sugar-sensitive” children were split into two groups. The parents of one group were told their children had been given a large dose of sugar, while the others believed their kids were in the placebo group. In reality, all the children had been given sugar-free food. But when the parents watched their offspring at play afterwards, those who thought their kids were in the sugar group were more likely to rate their behaviour as hyperactive (Journal of Abnormal Child Psychology, vol 22, p 501).

Having said all that, sugar does affect kids’ brains, although in a surprising way. In one study, David Benton, a psychologist at Swansea University in the UK, found that in the half-an-hour or so after having a glucose drink, 9 to 11-year-old schoolchildren were better able to concentrate on tasks and scored higher in memory tests (Biological Psychology, vol 78, p 242). That’s the opposite of hyperactivity, one characteristic of which is an inability to concentrate.

But don’t start plying your kids with sugary drinks – as the study notes, the performance boost may not last long. Non-sugary meals that help the body maintain a constant supply of glucose to the brain are better. So perhaps what parents mistake for hyperactivity at parties is just sugar-fuelled kids focusing hard on having fun. “Provision of energy is clearly going to increase the possibility of energy expenditure,” says Andrew Scholey, who studies glucose and cognitive enhancement at Swinburne University in Melbourne, Australia.