Personal Health

 

Friday, January 6, 2012

Modest exercise cuts obese women's blood pressure

By Amy Norton

Reuters Health

Friday, January 6, 2012

NEW YORK (Reuters Health) - Even fairly modest amounts of exercise can help older obese women lower their blood pressure numbers, a new clinical trial suggests.

In a study that randomly assigned 404 women to exercise or not, researchers found that lower-intensity activity -- walking on a treadmill or pedaling an exercise bike -- curbed what's known as exercise blood pressure.

That refers to blood pressure levels during physical activity, including daily routines like walking around a store or climbing stairs.

While it's normal for blood pressure to rise when we're active, steeper increases have been linked to a heightened risk of heart disease -- independent of a person's usual blood pressure at rest.

So the new findings, the researchers say, reinforce the importance of staying physically active.

"Try to keep finding ways to fit more physical activity into your life," said lead researcher Damon L. Swift, of the Pennington Biomedical Research Center in Baton Rouge, Louisiana.

It's familiar advice. But the findings, reported in the journal Menopause, strengthen the case for exercise, say the authors, because they come from a well-designed study in which sedentary women were randomly assigned to exercise or stay inactive.

The fact that exercisers showed a blood pressure benefit suggests that the exercise itself deserves the credit.

For the study, Swift and his colleagues recruited 404 women ages 45 to 75 who were overweight or obese, sedentary and had higher-than-normal blood pressure.

They randomly assigned the women to one of four groups: three that exercised at different levels for six months and one that stuck with their usual lifestyle.

In one exercise group, the women got the equivalent of what experts generally recommend for most adults: about 2.5 hours of moderate activity per week.

A second group got only about half that amount of exercise, while the third worked harder -- equivalent to almost 4 hours of moderate activity per week.

In the end, all three exercise groups showed improvements in their exercise blood pressure. In the most-active group, systolic blood pressure (the "top" number) dropped by an average of about 14 points.

But the least active group was close behind, shaving 11 points off their systolic blood pressure.

Exercise did not change the women's blood pressure at rest, however.

But since high exercise blood pressure may put a strain on the heart, lowering it -- even without effects on resting blood pressure -- might do the heart good, according to Swift.

And even people who do not have full-blown chronic high blood pressure may have abnormal exercise readings.

"One of the things we saw was that even among women with pre-hypertension, a good portion had abnormally elevated exercise blood pressure," Swift said.

'It Doesn't Take A Lot Of Exercise'

At the study's start, 204 women had pre-hypertension -- higher than 120/80 millimeters of mercury (mm Hg), but lower than the threshold for high blood pressure, which is 140/90 mm Hg. Of those women, about 40 percent had abnormally high exercise blood pressure.
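To make those cutoffs concrete, here is a minimal, hypothetical sketch (in Python, not from the study) that applies the resting blood pressure thresholds quoted above: readings above 120/80 mm Hg but below 140/90 mm Hg fall in the pre-hypertension range, and 140/90 mm Hg or higher counts as high blood pressure.

    def bp_category(systolic, diastolic):
        # Classify a resting blood pressure reading (mm Hg) using the cutoffs cited above.
        if systolic >= 140 or diastolic >= 90:
            return "high blood pressure"
        if systolic > 120 or diastolic > 80:
            return "pre-hypertension"
        return "normal"

    # Example: a reading of 130/85 mm Hg falls in the pre-hypertension range.
    print(bp_category(130, 85))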

According to Swift, the bottom line for current couch potatoes -- men and women of all ages -- is that it doesn't take a lot of exercise to see potential health benefits.

"A lot of people think of exercise as this big thing that will take up a lot of their time," Swift said. But simple things like a walk around your neighborhood can fit the bill, he noted. "You don't have to go to the gym."

And don't get discouraged if you fail to shed pounds, Swift said. In this study, women who exercised did not lose weight, but they did lower their exercise blood pressure and boost their overall cardiovascular fitness levels.

It is always wise, however, for sedentary overweight people to have a check-up with the doctor before becoming active.

"It's good to know if you have a condition that could be aggravated by exercise," Swift noted.

Source: http://bit.ly/xpRSqs

Menopause, online December 12, 2011.

PSA Test for Prostate Cancer Doesn't Save Lives: Study

By Steven Reinberg
HealthDay Reporter

HealthDay News

Friday, January 6, 2012

FRIDAY, Jan. 6 (HealthDay News) -- Annual screening for prostate cancer doesn't save lives, finds a new study that is unlikely to quell the controversy surrounding routine prostate specific antigen (PSA) screening.

"Organized prostate cancer screening when done in addition to whatever background testing exists in the population does not result in any apparent benefit, but does result in harm from false positives and over-diagnosis," said lead researcher Philip Prorok, from the Division of Cancer Prevention at the U.S. National Cancer Institute.

"Men considering prostate cancer screening should be fully informed of the implications of such testing before making a decision," he added.

Experts have disagreed for some time on whether the blood test saves lives or results in over-diagnosis and over-treatment. The new findings, which extend prior results out to 13 years of follow-up, are published in the Jan. 6 online edition of the Journal of the National Cancer Institute.

The study followed men enrolled in the Prostate, Lung, Colorectal and Ovarian Cancer Screening (PLCO) Trial from 1993 to 2009, comparing results for a group of men who had undergone screening with those for men who hadn't had testing. The men were 55 to 74 years old.

One group had PSA screening every year for six years and a digital rectal examination every year for four years. The other men had regular care, which in some cases included screening if requested by the patient or doctor.

Compared to men getting usual care, the screened men had a 12 percent relative increase in prostate cancer but a slightly lower rate of high-grade cancer.

However, no difference in deaths was seen between the two groups.

This finding held true even after age, screening before the trial and other medical conditions were taken into account, the researchers said.

Prorok said that better treatment for prostate cancer may explain the similar mortality results.

Among prostate cancer patients, death from other causes was somewhat higher in the screened group (10.7 percent of 4,250 men with prostate cancer) compared to the usual care group (9.9 percent of 3,815 men with prostate cancer).

This indicates men who underwent PSA screening were over-diagnosed, meaning the test picked up slow-growing tumors that probably weren't lethal, the researchers said.

"PSA testing and digital rectal examination screening as conducted in this trial did not reduce prostate cancer mortality, but there was a persistent excess of prostate cancer cases in the screened arm, suggesting over-diagnosis of prostate cancer," Prorok said.

Some prostate cancer experts disagree with the authors' conclusions.

Dr. Anthony D'Amico, chief of radiation oncology at Brigham and Women's Hospital in Boston, said the results are invalid because the trial was flawed.

According to D'Amico, 52 percent of those who received usual care had a PSA screening. "That's a serious issue which makes it very hard for the study to show if any benefit exists for PSA screening," he said.

Also, 15 percent of those who were supposed to get PSA screening never did, D'Amico said. "So what you've got is a screening study in which 85 percent of the people got PSA screened on the screening arm and 52 percent got screened on the control arm, which makes it impossible to ever measure a difference," he said.

Men should ignore this study, "because it has no relevance to PSA screening," D'Amico said.

D'Amico said he has more confidence in the results of a European study published in 2009 in the New England Journal of Medicine, which showed a 20 percent reduction in cancer mortality with PSA screening.

Men who can benefit most from screening are those at risk for prostate cancer, particularly men who have a family history of prostate cancer, African Americans and men over 60, D'Amico said.

Prorok acknowledged that the PLCO trial wasn't perfect. "Nonetheless, the contamination was not enough to eliminate the early diagnosis of prostate cancers nor the persistent excess of cancers," he said.

PLCO provides information about over-diagnosis, Prorok added. "Even if the contamination did dilute a benefit compared to no screening, the result of no mortality difference between the arms in PLCO could be interpreted to suggest that more intensive screening is not beneficial but does result in harm," he said.

More information

For more information on prostate cancer, visit the American Cancer Society.

To optimize exercise, heed your heart rate training zone

By Dorene Internicola

Reuters

Friday, January 6, 2012

NEW YORK (Reuters) - Whether you're interested in running a marathon or staving off the chronic diseases of aging, getting into the zone is essential if you want to reap the rewards of your efforts.

Experts say knowing and staying within your heart rate training zone is an easy way to pace the intensity of your workout.

"Exercisers need to get to at least a moderate level of physical activity in order to reap the benefits," said Dr. Adrian Hutber of the American College of Sports Medicine. "Your goal is to get to a stage where you're fit enough to exercise within your heart rate training zone."

Your heart rate training zone, or target heart rate, is based on your maximum heart rate (MHR), which is roughly calculated as 220 minus your age.

"It's not exact but it doesn't need to be," said Hutber. "It's a really good indicator."

For moderate-intensity physical activity, a person's target heart rate should be 50 to 70 percent of MHR, according to the Centers for Disease Control and Prevention. Vigorous exercisers should aim for 70 to 85 percent.

A 62-year-old woman has an estimated target heart rate zone of 111-134 beats per minute. An 18-year-old boy has a range of 141-172.
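For readers who want to check the arithmetic, here is a minimal sketch (in Python, not part of the original article) of the estimate described above: maximum heart rate is taken as 220 minus age, with the moderate zone at 50 to 70 percent of that figure and the vigorous zone at 70 to 85 percent. The two examples quoted above (111-134 and 141-172 beats per minute) line up with the 70 to 85 percent, or vigorous, band of that formula.

    def heart_rate_zones(age):
        # Rough estimate: maximum heart rate is about 220 minus age.
        mhr = 220 - age
        moderate = (round(0.50 * mhr), round(0.70 * mhr))  # 50-70% of MHR
        vigorous = (round(0.70 * mhr), round(0.85 * mhr))  # 70-85% of MHR
        return mhr, moderate, vigorous

    # Age 62: MHR 158, vigorous zone about 111-134 bpm
    # Age 18: MHR 202, vigorous zone about 141-172 bpm
    print(heart_rate_zones(62))
    print(heart_rate_zones(18))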

Science tells us you need at least 150 minutes of moderate-level physical activity per week to be healthy, said Hutber, quoting U.S. government guidelines.

Heart rate is a user-friendly way to track intensity level, according to Hutber. METs (Metabolic Equivalent of Task), which measures energy consumption, is another, and VO2, which measures oxygen uptake, is a third.

"But for the public it's easier to talk about percentage of maximum heart rate," he said.

So short of wearing a heart monitor, how can you be sure you're training in the zone? Most modern treadmills, elliptical trainers, and other cardio machines will tell you if you feed them your correct age. And experts say you should.

"For the beginner who wants the most benefits and results, getting in that range is more important than worrying about calories burned," said Deborah Plitt, a trainer with Life Fitness, the equipment manufacturer.

She said the training zone is tied to age because as the heart gets older and becomes less efficient, it beats faster.

But as you become more fit your heart muscle recovers from exercise more quickly, returning sooner to the resting heart rate.

"Your resting heart rate becomes lower than it was because the same workout is getting easier," she explained. "The heart is a muscle and as it gets stronger it doesn't have to pump as many times ... It becomes more efficient."

People can check their heart rate any time simply by taking their pulse for 15 seconds and multiplying that number by four to calculate beats per minute.

A less disruptive way to check the intensity of your workout is the sing-talk test.

"It's a very approximate but very good litmus test for moderate physical activity," Hutber said. "If you're exercising hard enough that you can still carry on a conversation but you couldn't sing, that's moderate intensity. If you can't talk you're moving into vigorous."

And if you're able to both chat and carry a tune?

"Then you haven't brought your activity up to a moderate level," he said. "That shouldn't be your goal."


Paranoid or Placid? Scans Show Pot's Effect on Brain

By Alan Mozes
HealthDay Reporter

HealthDay News

Friday, January 6, 2012

FRIDAY, Jan. 6 (HealthDay News) -- Smoking marijuana can mean different things to different people -- for some, anxiety and paranoia can set in, while others mellow out.

Now, a unique brain scan study suggests two ingredients in pot may work independently to achieve these effects.

British scientists who watched the effects of the two marijuana ingredients -- Δ9-tetrahydrocannabinol (THC) and cannabidiol (CBD) -- on the brains of 15 young men say the research shows how the drug can either ease or agitate the mind.

"People have polarized views about marijuana," said study lead author Dr. Sagnik Bhattacharyya, a researcher in the department of psychosis studies at the Institute of Psychiatry, King's College London. "Some consider it to be essentially harmless but potentially useful as a treatment in a number of medical conditions, and others link it to potentially severe public health consequences in terms of mental health. This study explains why the truth is somewhere in between."

The findings were published in the January issue of Archives of General Psychiatry.

According to Bhattacharyya's team, it's long been noted that cannabis can prompt the onset of psychotic symptoms, such as paranoia and/or delusional thinking, among otherwise healthy people.

"A number of studies have (also) clearly shown that regular marijuana or cannabis use in vulnerable individuals is associated with increased risk of developing psychotic disorders such as schizophrenia, where one loses contact with reality," Bhattacharyya said.

Just how this occurs in the brain wasn't understood.

In the new study, the researchers used functional MRI brain imaging on 15 healthy men, roughly 27 years old on average and described as "occasional cannabis users."

On three occasions under fMRI monitoring, the men received one of three identical-looking gelatin capsules: one containing 10 milligrams (mg) of the marijuana ingredient THC (deemed to be a "modest" dose); another containing 600 mg of CBD; and a third filled with flour.

Testing was conducted in a highly controlled and monitored environment, in which no marijuana was actually smoked.

The fMRI scans (which track brain activity in real time) were conducted one and two hours after capsule administration. During the scans, the men engaged in simple visual-cognition tasks (such as pressing buttons to reflect the direction of a series of flashing arrows). Psychopathological assessments were conducted throughout the brain imaging process.

The team found that THC and CBD appeared to affect the brain in different and opposite ways.

Ingesting THC brought about irregular activity in two regions of the brain (the striatum and the lateral prefrontal cortex) that are key to the way people perceive their surroundings. THC seemed to boost the brain's responses to otherwise insignificant stimuli, while reducing response to what would typically be seen as significant or salient.

In other words, under the influence of THC, healthy individuals might give far more importance to details in their environment than they would have without the chemical in their brain.

THC also prompted a significant uptick in paranoid and delusional thinking, the authors said, and the more that "normal" brain responses were set off-kilter, the more severe the paranoid or even psychotic reaction.

The effect of the other main pot ingredient, CBD, was nearly the opposite, however.

Ingesting the CBD capsule appeared to prompt brain activity linked to appropriate responses to significant stimuli in the environment, the team reported.

According to Bhattacharyya, this suggests that, on balance, marijuana may play both a good and bad role in the context of psychosis.

The study also suggests that CBD, at least, may "have potential use for the treatment of psychosis," he said, even as marijuana's other principal ingredient, THC, raises the risk for developing psychotic complications.

Dr. Joseph Coyle, a professor of psychiatry and neuroscience at Harvard Medical School in Boston, said the current work goes a long way toward "connecting all the dots" when it comes to understanding the marijuana experience.

"What we're talking about here is the kind of perception, in this case prompted by marijuana, that leads a person to think that other people who are just talking in the subway are all actually talking about him," he noted. "Or people who are just tipping their hat for no reason are actually doing so specifically about him. And so this paper strikes me as important, because it actually looks at this kind of increased anxiety and increased hyper-alertness which are major factors in psychosis -- and then finds out what's going on in the brain among people who experience them.

"So I think this provides another brick in the foundation when talking about direct causality," he said. "It links the psychological state marijuana brings about with a specific psychophysical response in the brain. And that's very, very interesting."

More information

There's more on marijuana at the U.S. National Institute on Drug Abuse.

Thursday, January 5, 2012

Low Vitamin D Levels Linked to Depression, Psychiatrists Report

ScienceDaily

Thursday, January 5, 2012

ScienceDaily (Jan. 5, 2012) — Low levels of vitamin D have been linked to depression, according to UT Southwestern Medical Center psychiatrists working with the Cooper Center Longitudinal Study. It is believed to be the largest such investigation ever undertaken.

Low levels of vitamin D already are associated with a cavalcade of health woes from cardiovascular diseases to neurological ailments. This new study -- published in Mayo Clinic Proceedings -- helps clarify a debate that erupted after smaller studies produced conflicting results about the relationship between vitamin D and depression. Major depressive disorder affects nearly one in 10 adults in the U.S.

"Our findings suggest that screening for vitamin D levels in depressed patients -- and perhaps screening for depression in people with low vitamin D levels -- might be useful," said Dr. E. Sherwood Brown, professor of psychiatry and senior author of the study, done in conjunction with The Cooper Institute in Dallas. "But we don't have enough information yet to recommend going out and taking supplements."

UT Southwestern researchers examined the results of almost 12,600 participants from late 2006 to late 2010. Dr. Brown and colleagues from The Cooper Institute found that higher vitamin D levels were associated with a significantly decreased risk of current depression, particularly among people with a prior history of depression. Low vitamin D levels were associated with depressive symptoms, particularly among those with a history of depression, suggesting that primary care patients with a history of depression may be an important target for assessing vitamin D levels. The study did not address whether increasing vitamin D levels reduced depressive symptoms.

The scientists have not determined the exact relationship -- whether low vitamin D contributes to symptoms of depression, whether depression itself contributes to lower vitamin D levels, or chemically how that happens. But vitamin D may affect neurotransmitters, inflammatory markers and other factors, which could help explain the relationship with depression, said Dr. Brown, who leads the psychoneuroendocrine research program at UT Southwestern.

Vitamin D levels are now commonly tested during routine physical exams, and low levels already are accepted as risk factors for a number of other medical problems: autoimmune diseases; heart and vascular disease; infectious diseases; osteoporosis; obesity; diabetes; certain cancers; and neurological disorders such as Alzheimer's and Parkinson's diseases, multiple sclerosis, and general cognitive decline.

Investigators used information gathered by the institute, which has 40 years of data on runners and other fit volunteers. UT Southwestern has a partnership with the institute, a preventive medicine research and educational nonprofit located at the Cooper Aerobics Center, to develop a joint scientific medical research program aimed at improving health and preventing a wide range of chronic diseases. The institute maintains one of the world's most extensive databases -- known as the Cooper Center Longitudinal Study -- with detailed information from more than 250,000 clinic visits collected since Dr. Kenneth Cooper founded the institute and clinic in 1970.

Other researchers involved in the study were Dr. Myron F. Weiner, professor of psychiatry and neurology and neurotherapeutics; Dr. David S. Leonard, assistant professor of clinical sciences; lead author MinhTu T. Hoang, student research fellow; Dr. Laura F. DeFina, medical director of research at The Cooper Institute; and Benjamin L. Willis, epidemiologist at the institute.


Story Source:

The above story is reprinted from materials provided by UT Southwestern Medical Center.


Antibiotics in Pregnancy May Shield Newborns From Strep B

HealthDay News

Thursday, January 5, 2012

THURSDAY, Jan. 5 (HealthDay News) -- Giving antibiotics to pregnant women at risk of streptococcus B infection greatly reduces infection rates in newborns, according to a new study.

Use of antibiotics to prevent group B strep is common in high-income nations and should also be used in developing countries, at least until a vaccine becomes available, said study author Dr. Karen Edmond, of the London School of Hygiene and Tropical Medicine in England, and colleagues.

The researchers analyzed dozens of previous studies and found that the mean global incidence of group B strep infection in infants up to 3 months old was 0.53 per 1,000 live births and the mean death rate was 10 percent.

Africa had the highest incidence (1.21 cases per 1,000 live births) and death rate (22 percent). Incidence in the Americas (0.67 per 1,000 live births) and Europe (0.57 per 1,000 live births) was also higher than the global average. The death rate was 11 percent in the Americas and 7 percent in Europe.

Worldwide, the death rate for early-onset group B strep infection -- occurring in the first week of life -- was 12 percent, twice that of later-onset disease.

Sixty-nine percent of the studies reported use of any preventive antibiotic treatment in the time between labor and delivery (intrapartum). Rates of early-onset disease were three times lower in studies that reported preventive use of antibiotics than in those that did not report such use.

The study appears in the Jan. 5 issue of The Lancet.

The most common strep B serotype in all regions was serotype III (49 percent), followed by serotypes Ia (23 percent), Ib (7 percent), II (6 percent), and V (9 percent).

The distribution of strains of strep B appears similar worldwide, which means that vaccines currently in development could have near-universal applicability, according to the researchers.

"A conjugate vaccine incorporating five serotypes (Ia, Ib, II, III, V) could prevent most global group B streptococcal disease," they wrote in a journal news release. Phase 3 trials of vaccines will soon begin in Africa, they said.

"Vaccination of pregnant women also has the potential to reduce premature births, stillbirths, and puerperal sepsis [a toxic condition] caused by group B streptococcus," the researchers said.

More information

The U.S. Centers for Disease Control and Prevention has more about group B strep infections in newborns.

Chronic Heartburn a Growing Problem in U.S.

By Dennis Thompson
HealthDay Reporter

HealthDay News

Thursday, January 5, 2012

THURSDAY, Jan. 5 (HealthDay News) -- Heartburn and acid reflux strike many people as an annoying and painful but ultimately harmless problem -- a result of overindulgence and gluttony that must be endured, much like a hangover after a night of drinking.

But frequent bouts of heartburn and reflux constitute a real medical condition known as gastroesophageal reflux disease, or GERD, and GERD is on the rise worldwide.

"The overall prevalence is increasing over the past decades," said Dr. Ronnie Fass, a medical advisory board member for the International Foundation for Functional Gastrointestinal Disorders who's also a professor of medicine at the University of Arizona and chief of gastroenterology at the Southern Arizona VA Health Care System.

The increase has occurred "not only in the United States, but in Asian countries, where GERD was unheard of," Fass said. "But we are the trailblazers. We are leading the world."

If left untreated, GERD can lead to bleeding or ulcers in the esophagus, a buildup of scar tissue that makes swallowing difficult and, in extreme cases, esophageal cancer, according to the U.S. National Institutes of Health.

"People consider heartburn part of the eating experience," Fass said. "They have to understand the presence of heartburn denotes a real medical problem."

Frequent reflux or heartburn is apparently a regular occurrence for Americans. "We believe up to 20 percent of the population experiences symptoms once a week, and 7 percent have daily symptoms," he said.

Heartburn and acid reflux occur when acidic digestive juices from the stomach get past a ring of muscle known as the lower esophageal sphincter, which acts as a valve separating the stomach from the esophagus.

People experience heartburn when the digestive juices eat away at the lining of the esophagus. Sometimes the acid refluxes all the way up through the esophagus to the mouth, causing people to taste digestive juices or food in the back of their mouth.

Doctors consider people to be suffering from GERD if they experience persistent reflux, meaning at least twice a week, according to the NIH. Anyone at any age can have GERD, although symptoms tend to be different for children 12 and younger, who may have asthma-like symptoms, a dry cough or difficulty swallowing.

Most of the time, GERD stems from one of two causes -- what you eat and how much you weigh -- but excessive weight is the most prominent, said Dr. Kenneth R. DeVault, chairman of the gastroenterology department at the Mayo Clinic in Jacksonville, Fla., and co-author of the American College of Gastroenterology's guidelines for treating GERD.

"The most consistent factor is probably weight gain and obesity," DeVault said. "It's become pretty clear that a small amount of weight gain produces an increase in reflux symptoms. I'm not talking a large amount; I'm talking about 5 or 10 pounds, probably. Even if you're already overweight, increasing your weight will increase your risk of reflux over the next several months."

Excess weight can press on the stomach, forcing acid past the valve into the esophagus. And, as Fass explained, the problem isn't just the belly flab evident on people who are obese or overweight. Rather, the accumulation of fat around the organs inside the body contributes by increasing pressure on the stomach, making reflux much more likely to occur, he said.

GERD also can be caused, or exacerbated, by a person's diet. But foods contribute to reflux in different ways.

Caffeine, for example, has been shown to relax the esophageal sphincter, increasing the chances of reflux, DeVault said.

Eating fatty foods can also contribute to reflux because fats slow the emptying of the stomach, meaning "there's more material left in the stomach that can be refluxed," he said.

Acidic, spicy or strongly flavored foods also can contribute to reflux by increasing the amount of acid in the stomach, according to the NIH. Citrus fruits or juices, tomatoes, mint, garlic, onions and chocolate are among the main offenders.

Also, lifestyle changes can usually reduce the possibility of reflux, the two experts said. These include:

Making a concerted effort to lose weight, by exercising and adopting a healthy diet.

Learning which foods are more likely to trigger excess acid or reflux, and then avoiding them.

Eating the final meal of the day two to three hours before bedtime, thus reducing the amount of food in the stomach that would press against the esophageal valve.

Elevating the head of the bed, if nighttime reflux is a problem, as this can reduce the pressure of stomach acid and contents on the valve.

If reflux symptoms persist, however, DeVault stressed that more needs to be done to avoid damage to the esophagus.

"If they have frequent heartburn symptoms, more than weekly, and have had it for many years, they need to see a physician," he said.

More information

The U.S. National Institute of Diabetes and Digestive and Kidney Diseases has more on GERD.

For more on the effects of GERD, read about one woman's story.

Heart Failure, Diabetes Might Be Linked by Protein

HealthDay News

Thursday, January 5, 2012 

THURSDAY, Jan. 5 (HealthDay News) -- Researchers may have pinpointed how heart disease can lead to diabetes, a finding that could lead to new preventive treatments.

The team at Chiba University in Japan found that the stress of heart failure activates a protein called p53, resulting in inflammation in fat tissue, systemic insulin resistance and worsening heart function.

This domino effect is outlined in a study in the January issue of the journal Cell Metabolism.

"Our findings clarify the reasons why the incidence of heart failure is high among diabetic patients, why the prevalence of insulin resistance is increased in heart failure patients and why treatment of insulin resistance improves the prognosis of heart failure patients," study author Tohru Minamino said in a journal news release.

Previous research by the author has shown that build-up of p53 in the heart -- from stress or age -- promotes heart failure, the release said. While p53 is best known as a tumor suppressor, it is also a cellular aging agent, according to Minamino. He explained that constant activation of p53 can lead to inflammation and aging-related diseases.

Finding a way to block inflammation associated with p53 activation without compromising the protein's tumor-fighting abilities could lead to anti-aging therapy without the cancer risk, Minamino said.

More information

The U.S. National Diabetes Education Program outlines how you can prevent diabetes.

Study shows memory loss can start as early as 45

By Ben Hirschler

Reuters

Thursday, January 5, 2012

LONDON (Reuters) - Loss of memory and other brain function can start as early as age 45, posing a big challenge to scientists looking for new ways to stave off dementia, researchers said Thursday.

The finding from a 10-year study of more than 7,000 British government workers contradicts previous notions that cognitive decline does not begin before 60 years of age, and it could have far-reaching implications for dementia research.

Pinpointing the age at which memory, reasoning and comprehension skills start to deteriorate is important because drugs are most likely to work if given when people first start to experience mental impairment.

A handful of novel medicines for Alzheimer's disease, the most common form of dementia, are currently in clinical trials, but expectations are low and some experts fear the new drugs are being tested in patients who may be too old to show a benefit.

Companies with products in development include Eli Lilly, working on a drug called solanezumab, and Elan and Johnson & Johnson, developing bapineuzumab.

The research team led by Archana Singh-Manoux from the Center for Research in Epidemiology and Population Health in France and University College London found a modest decline in mental reasoning in men and women aged 45-49 years.

"We were expecting to see no decline, based on past research," Singh-Manoux said in a telephone interview.

Among older subjects in the study, the average decline in cognitive function was greater, but there was a wide variation at all ages, with a third of individuals aged 45-70 showing no deterioration over the period.

"It doesn't suddenly happen when you get old. That variability exists much earlier on," Singh-Manoux said. "The next step is going to be to tease that apart and look for links to risk factors."

Healthy Lifestyle

Participants were assessed three times during the study, using tests for memory, vocabulary, and aural and visual comprehension skills.

Over the 10-year period, there was a 3.6 percent decline in mental reasoning among both men and women aged 45-49 at the start of the study, while the decline was 9.6 percent for men aged 65-70 and 7.4 percent for women in that age group.

Since the youngest individuals at the start of the study were 45, it is possible that the decline in cognition might have commenced even earlier.

Singh-Manoux said the results may also have underestimated the cognitive decline in the broader population, since the office workers in the study enjoyed a relatively privileged and healthy lifestyle.

Factors affecting cardiovascular function -- such as obesity, high blood pressure, high cholesterol and smoking -- are believed to impact the development of Alzheimer's and vascular dementia through effects on brain blood vessels and brain cells.

The research findings were published in the British Medical Journal, alongside an editorial by Francine Grodstein of the Brigham and Women's Hospital in Boston, who described the results as convincing.

Most research into dementia has focused on people aged 65 and over. In future, scientists will need to devise long-term clinical studies that include much younger age groups and may have to enroll tens of thousands of participants, she said.

One way to deal with this "major challenge" might be to use computerised cognitive assessment tests, rather than face-to-face interviews, although more research is still needed on this approach, she added.

(Reporting by Ben Hirschler; Editing by Will Waterman)

Could Daily Aspirin Harm Seniors' Eyes?

By Alan Mozes
HealthDay Reporter

HealthDay News

Thursday, January 5, 2012

THURSDAY, Jan. 5 (HealthDay News) -- Daily aspirin use among seniors may double their risk of developing a particularly advanced form of age-related macular degeneration, a debilitating eye disease, a large new European study suggests.

The possible link involves the so-called "wet" type of age-related macular degeneration (AMD), a significant cause of blindness in seniors.

Aspirin use was not, however, found to be associated with an increased risk for developing the more common, and usually less advanced, "dry" form of AMD, according to the report published in the January issue of Ophthalmology.

Although the study team stressed that further research is needed, the findings could cause concern for the millions of older people who routinely take over-the-counter aspirin for pain, inflammation and blood-clot management, and to reduce their risk of heart disease.

"People should be aware that aspirin, often just bought over the counter without prescription, may have adverse effects -- apart from major gastrointestinal and other bleeding -- also for AMD," said lead author Dr. Paulus de Jong.

De Jong is an emeritus professor of ophthalmic epidemiology at the Netherlands Institute for Neuroscience of the Royal Academy of Arts and Sciences, as well as the Academic Medical Center, both in Amsterdam.

Age-related macular degeneration affects the critical central vision required for reading, driving and general mobility. The damage occurs when the retinal core of the eye (the macula) becomes exposed to leaking or bleeding due to abnormal growth of blood vessels.

To examine whether aspirin use might trigger this process, the authors focused on nearly 4,700 men and women over age 65 living in Norway, Estonia, the United Kingdom, France, Italy, Greece and Spain.

In the study, conducted between 2000 and 2003, the researchers looked at blood samples, frequency of aspirin use (though not doses), smoking and drinking history, stroke and heart attack records, blood pressure levels and sociodemographic data.

The team also analyzed detailed images of each participant's eyes, looking for indications of age-related macular degeneration and severity.

Daily aspirin use was associated with the onset of late-stage "wet" age-related macular degeneration, and to a lesser degree, the onset of early "dry" AMD -- even after the researchers took into account age and a history of heart disease, which in itself is a risk factor for AMD.

For late-stage wet AMD only, the association was stronger the more frequently an individual took aspirin.

Early AMD was found in more than one-third of participants (36 percent), while late-stage AMD was found in roughly 3 percent, or 157 patients.

Of those with late AMD, more than two-thirds (108) had wet AMD, while about one-third (49) had dry AMD, the researchers found.

More than 17 percent of participants said they took aspirin daily, while 7 percent took it at least once a week and 41 percent did so at least once a month.

About one-third of those with wet AMD consumed aspirin on a daily basis, compared with 16 percent of those with no AMD.

The study authors cautioned that further research is needed on aspirin's possible effects on eye health. Meanwhile, they suggested that doctors generally should not alter their current advice for aspirin use among older patients coping with heart disease risk.

"[But] I would advise persons who [already] have early or late AMD not to take aspirin as a painkiller," de Jong said. "[And] I would advise people with AMD who take small amounts of aspirin for primary prevention -- this means having no past history of cardiac or vascular problems like stroke, and no elevated risk factors for these diseases -- to discuss with their doctor if it is wise to continue doing so. For secondary prevention -- this means after having these elevated risks or disorders -- the benefits of daily aspirin outweigh the risks."

While the study uncovered an association between aspirin use and AMD, it did not prove a cause-and-effect relationship.

This point was also made by Dr. Alfred Sommer, a professor of ophthalmology and dean emeritus at the Bloomberg School of Public Health at Johns Hopkins University in Baltimore. He noted that while the study was "well executed," it should not be seen as definitive proof that aspirin use and AMD are linked.

An observational study of this type "merely calls attention to the fact that such an association may exist, and that it may be causal, but only randomized clinical trials can prove the matter one way or the other," he said.

"Hence, this might or might not be real," Sommer added, "and we will only know that when and if a randomized trial is done."

In the interim, he said the findings should not guide patient behavior.

"It is well known that aspirin [and other NSAIDs] can increase the risk of gastric distress and gastric ulcers," Sommer said. "Like any medicine, it should only be taken if needed. But those taking aspirin to prevent heart disease, particularly those at increased risk of heart disease, definitely do benefit and should not change what they do."

More information

For more on age-related macular degeneration, visit the U.S. National Eye Institute. 

Wednesday, January 4, 2012

Cancer-Killing Compound Spares Healthy Cells

ScienceDaily

Wednesday, January 4, 2012

ScienceDaily (Jan. 4, 2012) — Lithocholic acid (LCA), naturally produced in the liver during digestion, has been seriously underestimated. A study published in the journal Oncotarget shows that LCA can kill several types of cancer cells, such as those found in some brain tumors and breast cancer.

The research team, led by Concordia University, included scientists from McGill University and the Jewish General Hospital's Lady Davis Institute in Montreal as well as the University of Saskatchewan.

Previous research from this same team showed LCA also extends the lifespan of aging yeast. This time, the team found LCA to be very selective in killing cancer cells while leaving normal cells unscathed. This could signal a huge improvement over the baby-with-the-bathwater drugs used in chemotherapy.

"LCA doesn't just kill individual cancer cells. It could also prevent the entire tumor from growing," says senior author Vladimir Titorenko, a professor in the Department of Biology and Concordia University Research Chair in Genomics, Cell Biology and Aging.

What's more, LCA prevents tumors from releasing substances that cause neighboring cancer cells to grow and proliferate. Titorenko says LCA is the only compound that targets cancer cells, which could translate into tumor-halting power.

"This is important for preventing cancer cells from spreading to other parts of the body," he says, noting that unlike other anti-aging compounds, LCA stops cancer cell growth yet lets normal cells continue to grow.

A wide effect on different types of cancers

The next step for the research team will be to test LCA's effect on different cancers in mouse models. Titorenko expects that LCA will also kill cancer cells in those experiments and lead to human clinical trials. "Our study found that LCA kills not only tumors (neuroblastomas), but also human breast cancer cells," says Titorenko. "This shows that it has a wide effect on different types of cancers."

Titorenko emphasizes that unlike drugs used in chemotherapy, LCA is a natural compound that is already present in our bodies. Studies have shown that LCA can be safely administered to mice by adding it to their food. So why is LCA so deadly for cancer cells? Titorenko speculates that cancer cells have more sensors for LCA, which makes them more sensitive to the compound than normal cells.

LCA sensors send signals to mitochondria -- the powerhouses of all cells. It seems that when these signals are too strong, mitochondria self-destruct and bring the cell down with them. Simply put, Titorenko and his colleagues engaged in cancer cell sabotage by exploiting the cancer cells' heightened sensitivity to LCA.

This study was supported by the Canadian Institutes of Health Research, the Natural Sciences and Engineering Research Council of Canada and the Concordia University Research Chair program.


Story Source:

The above story is reprinted from materials provided by Concordia University.


Journal Reference:

Goldberg AA, Beach A, Davies GF, Harkness TA, Leblanc A, Titorenko VI. Lithocholic bile acid selectively kills neuroblastoma cells, while sparing normal neuronal cells. Oncotarget, 2011 Oct;2(10):761-82.

No extra risk of breaking bones after ovary removal

By Amy Norton

Reuters Health

Wednesday, January 4, 2012

NEW YORK (Reuters Health) - Women who have their ovaries surgically removed may go through menopause early, but that does not seem to raise their risk of breaking a bone, a new study suggests.

Women may have their ovaries removed during a hysterectomy (surgery to remove the uterus) or to prevent ovarian cancer if they are at high genetic risk of the disease. If the procedure is done before menopause, it will trigger an early "surgical menopause."

With natural menopause, women's estrogen levels wane and their bone density tends to decline -- raising the risk of fractures. So there has been concern that the early, and abrupt, menopause from ovary removal could put women at particular risk of broken bones.

But in the new study, reported in the journal Menopause, researchers found no extra risk.

The study combed data from more than 6,600 U.S. women age 65 and up who were followed for 14 years, on average. Of those women, 1,157 had undergone surgical menopause, at an average age of 44.

Overall, the researchers found, those women were no more likely to break a hip, wrist or any other bone outside the spine, compared with women who'd gone through natural menopause. (The study did not look at spinal fractures, which are not necessarily serious.)

Women in the surgical menopause group suffered fractures at a rate of 54 per 1,000 women each year. Among women who'd had a natural menopause, that rate was 50 per 1,000.

What's more, the bone fracture risk was no greater in women who had not taken estrogen replacement after surgical menopause.

Doctors sometimes recommend low-dose hormone replacement after a premenopausal woman has her ovaries removed. Hormone therapy can manage menopause symptoms like severe hot flashes, and protect bone density.

The findings are surprising to some extent, according to lead researcher Dr. Kimberly K. Vesco, of Kaiser Permanente Northwest in Portland, Oregon. But they also make sense, she told Reuters Health in an email.

One reason, she explained, is that the "long-term trajectory" of bone loss may not be much different whether a woman goes through menopause naturally or because of ovary removal.

Although women who go through surgical menopause have a more abrupt decline in estrogen, their bone loss -- which, in all women, actually begins before menopause -- may ultimately not be any greater.

So what does that mean for women who are deciding whether to start hormone replacement after surgical menopause?

Vesco said that, just as for women who go through natural menopause, they should base the decision on their own personal situation.

Although hormone replacement can cool severe hot flashes and protect the bones, many women, in general, are now wary of it.

That's because combined hormone therapy (with estrogen and progesterone) has been tied to increased risks of breast cancer, blood clots, heart disease and stroke. Women who've had their ovaries removed along with a hysterectomy may, however, choose estrogen-only therapy, which appears to have fewer risks than combined hormone therapy does.

"What I think our data suggest for a woman undergoing surgical menopause in her 40s or beyond...is that her long-term risk of fracture, even if she chooses not to use estrogen therapy, is not substantially different than if she were to experience natural menopause," Vesco said.

But for any one woman, she added, "her decision to use or not use estrogen should be guided by the same types of symptoms or risk factors that would influence her decision if she experienced natural menopause."

Some studies have suggested that women who go through menopause early, whether through surgery or naturally, have a higher risk of heart disease and a shorter lifespan.

But another recent study turned up no evidence of that in nearly 9,800 California women who'd gone through surgical menopause, before or after age 45. Their risks of death from heart disease, cancer or any other cause during the study period were no greater compared with women who'd gone through natural menopause.

Source: http://bit.ly/wb8wwO

Menopause, online December 12, 2011.

Antiestrogen Therapy May Decrease Risk for Melanoma

ScienceDaily

Wednesday, January 4, 2012

ScienceDaily (Jan. 4, 2012) — Women with breast cancer who take antiestrogen supplements may be decreasing their risk for melanoma, according to a study published in Cancer Prevention Research, a journal of the American Association for Cancer Research.

Christine Bouchardy, M.D., Ph.D., professor at the University of Geneva and head of the Geneva Cancer Registry, and colleagues analyzed data from 7,360 women who had breast cancer between 1980 and 2005. About half (54 percent) of these women received antiestrogen therapy.

The researchers followed the patients until 2008 and recorded 34 melanoma cases during the follow-up period. Risk for melanoma was 60 percent higher among patients who did not receive antiestrogen therapy compared with patients who received antiestrogen therapy.

According to Bouchardy, the increased focus on estrogen's role in breast cancer has led scientists to start questioning what role estrogen is playing in other cancers. "These data reinforce the hypothesis that estrogens play a role in melanoma occurrence," she said.

Bouchardy said this may be due to the fact that estrogens are associated with increased levels of melanocytes and melanin production in human skin, which have been linked to early-stage melanoma. However, she cautioned against widespread antiestrogen supplementation to prevent melanoma in the general population.

"These results need to be replicated in other studies, particularly given the numerous side effects linked to this kind of drug," said Bouchardy.

The study was funded by a grant from the Swiss Research Foundation against Cancer, a nonprofit group.


Story Source:

The above story is reprinted from materials provided by American Association for Cancer Research.


Journal Reference:

C. Huber, C. Bouchardy, R. Schaffar, I. Neyroud-Caspar, G. Vlastos, F.-A. Le Gal, E. Rapiti, S. Benhamou. Antiestrogen Therapy for Breast Cancer Modifies the Risk of Subsequent Cutaneous Melanoma. Cancer Prevention Research, 2011; 5 (1): 82 DOI: 10.1158/1940-6207.CAPR-11-0332

Dried Licorice Root Fights the Bacteria That Cause Tooth Decay and Gum Disease, Study Finds

ScienceDaily

Wednesday, January 4, 2012

ScienceDaily (Jan. 4, 2012) — Scientists are reporting identification of two substances in licorice -- used extensively in Chinese traditional medicine -- that kill the major bacteria responsible for tooth decay and gum disease, the leading causes of tooth loss in children and adults. In a study in ACS' Journal of Natural Products, they say that these substances could have a role in treating and preventing tooth decay and gum disease.

Stefan Gafner and colleagues explain that the dried root of the licorice plant is a common treatment in Chinese traditional medicine, especially as a way to enhance the activity of other herbal ingredients or as a flavoring. Despite the popularity of licorice candy in the U.S., licorice root has been replaced in domestic candy with anise oil, which has a similar flavor. Traditional medical practitioners use dried licorice root to treat various ailments, such as respiratory and digestive problems, but few modern scientific studies address whether licorice really works. (Consumers should check with their health care provider before taking licorice root because it can have undesirable effects and interactions with prescription drugs.) To test whether the sweet root could combat the bacteria that cause gum disease and cavities, the researchers took a closer look at various substances in licorice.

They found that two of the licorice compounds, licoricidin and licorisoflavan A, were the most effective antibacterial substances. These substances killed two of the major bacteria responsible for dental cavities and two of the bacteria that promote gum disease. One of the compounds -- licoricidin -- also killed a third gum disease bacterium. The researchers say that these substances could treat or even prevent oral infections.


Story Source:

The above story is reprinted from materials provided by American Chemical Society.


Journal Reference:

Stefan Gafner, Chantal Bergeron, Jacquelyn R. Villinski, Markus Godejohann, Pavel Kessler, John H. Cardellina, Daneel Ferreira, Karine Feghali, Daniel Grenier. Isoflavonoids and Coumarins from Glycyrrhiza uralensis: Antibacterial Activity against Oral Pathogens and Conversion of Isoflavans into Isoflavan-Quinones during Purification. Journal of Natural Products, 2011; 74 (12): 2514 DOI: 10.1021/np2004775

Tuesday, January 3, 2012

 

Soy may not protect against stomach cancer

By Andrew M. Seaman

Reuters Health

Tuesday, January 3, 2012

NEW YORK (Reuters Health) - Estrogen-like compounds that come with a soy-rich diet are sometimes linked to a reduced risk of cancer, but new research from Japan suggests that protection doesn't extend to stomach cancer.

In a study that tried to tease apart the effects of isoflavones -- also known as phytoestrogens -- found in soy, and other nutrients, like salt, Japanese researchers found no difference in gastric cancer risk between people who consumed a lot of isoflavones and those who consumed the least.

Azusa Hara and her colleagues from the National Cancer Center in Tokyo examined data on about 85,000 people in an existing Japanese study.

The researchers estimated how much isoflavone the study's participants ate from a list of questions they had answered in the 1990s, then followed the subjects until the end of 2006 to see how many developed stomach cancer.

During the follow-up period, approximately 1,250 of the study's participants got stomach cancer, but the researchers saw no difference in risk between those who ate the most isoflavone and those who ate the least.

According to Dr. Richard Peek, director of Gastroenterology, Hepatology and Nutrition at Vanderbilt University Medical Center in Nashville, Tennessee, estrogen is thought to protect against stomach cancer because the disease is much more common in men, at least until women are post-menopausal -- hinting that younger women's higher estrogen levels might be protecting them.

Peek told Reuters Health there are also studies on mice that suggest estrogen protects against stomach cancer.

The Japanese team, however, found an increase in stomach cancer risk among women taking hormone therapy who ate the most isoflavone-laden food, compared to those who ate the least.

The women in the study on hormone therapy were more likely to smoke, drink and have a family history of stomach cancer, the researchers note, which could explain the link.

Hara and her colleagues wrote in the American Journal of Clinical Nutrition that their results are limited by the use of their questionnaire and the fact that they could not account for whether the subjects were also infected with the bacterium Helicobacter pylori, which is also linked to increased stomach cancer risk.

Still another well-known risk factor for gastric cancers is high salt intake.

According to the American Cancer Society, a person in the United States has a one in 114 chance of developing stomach cancer. An estimated 21,500 Americans were diagnosed with it in 2011 and an estimated 10,500 died from it.

Stomach cancer was the leading cause of cancer deaths in the U.S. until about 1930.

"One of the reasons for decline is that people have fridges now, and they use less salt preservatives," said Khaldoun Almhanna, a medical oncologist at the Moffitt Cancer Center in Tampa, Florida.

Source: http://bit.ly/AuxWcS

American Journal of Clinical Nutrition, online December 20, 2011.

Insulin-Linked Hormone May Also Raise Alzheimer's Risk

HealthDay News

Tuesday, January 3, 2012

TUESDAY, Jan. 3 (HealthDay News) -- Rising levels of a hormone associated with sensitizing the body to insulin appear to raise the risk for developing dementia and Alzheimer's among women, new research reveals.

The hormone in question, adiponectin, is derived from visceral fat. It is known to play a role in regulating the metabolism of glucose and lipids, while also carrying certain anti-inflammatory characteristics.

The finding is therefore somewhat unexpected, given that insulin resistance and inflammation are considered to be hallmarks of both type 2 diabetes and Alzheimer's disease (AD). The logical presumption would have been that anything that lowers insulin resistance and inflammation might also reduce the risk for dementia.

Study leader Thomas van Himbergen, of the Lipid Metabolism Laboratory with the Human Nutrition Research Center on Aging at Tufts University in Boston, and his colleagues report their finding online Jan. 2 in the Archives of Neurology.

"It is well established that insulin signaling is dysfunctional in the brains of patients with AD, and since adiponectin enhances insulin sensitivity, one would also expect beneficial actions protecting against cognitive decline," van Himbergen said in a journal news release. "Our data, however, indicate that elevated adiponectin level was associated with an increased risk of dementia and AD in women."

The authors note that the global incidence of dementia is projected to double over the next two decades, at which point it will affect roughly 72 million people.

To get at possible mechanisms and indicators of the onset of Alzheimer's, the team took blood samples from 541 women over the course of 13 years. The samples were measured for levels of a number of markers, including glucose, insulin and adiponectin. All of the women were also monitored for dementia symptoms throughout the study.

During the study, 159 of the women went on to develop dementia, including 125 cases of Alzheimer's.

In the end, the investigators concluded that only a rise in adiponectin signaled an increased risk for both all-cause dementia and Alzheimer's.

More information

For more on dementia, visit the Alzheimer's Association.

Novel Compound to Halt Virus Replication Identified

ScienceDaily

Tuesday, January 3, 2012

ScienceDaily (Jan. 3, 2012) — A team of scientists from Boston University School of Medicine (BUSM) has identified a novel compound that keeps viruses from replicating. The findings, published online in the Journal of Virology, could lead to the development of highly targeted compounds to block the replication of poxviruses, such as the virus that causes the emerging infectious disease Monkeypox.

The basic research was led by Ken Dower, PhD, a postdoctoral fellow in the laboratory of John Connor, PhD, assistant professor of microbiology at BUSM and corresponding author on the paper. They worked with Scott Schaus, PhD, associate professor of chemistry at the Boston University College of Arts & Sciences and co-principal investigator in the Center for Chemical Methodology and Library Development (CMLD). The researchers collaborated with the United States Army Medical Research Institute of Infectious Diseases (USAMRIID), which conducted the experiments involving Monkeypox at its laboratory in Maryland.

Poxviruses, such as smallpox, vaccinia virus and the Monkeypox virus, invade host cells and replicate, causing disease. Smallpox, a deadly poxvirus that killed hundreds of millions of people worldwide, was declared eradicated by the World Health Organization in 1979 after successful vaccination efforts. Recent data shows that the number of people being infected by Monkeypox is increasing globally.

Utilizing state-of-the-art screening techniques, vaccinia virus and a library of chemicals from the CMLD, Dower and his colleagues looked for compounds that could stop vaccinia from replicating inside human cells. They identified several. In studying how one of these compounds works, they discovered that the virus could still enter cells in its presence, but that once inside, the compound inactivated an essential piece of the viral machinery.

USAMRIID researchers then tested the efficacy of the compound against the Monkeypox virus. Their experiments produced similar results, showing that the compound can inhibit multiple types of poxviruses.

"The compound we identified forces the catastrophic failure of the normal virus amplification cycle and illustrates a new drug-accessible restriction point for poxviruses in general," said Connor. "This can help us in developing new compounds that fight poxviruses infection."

This research was funded by the National Institutes of Health (NIH) and the Transformative Medical Technologies Initiative (TMTI).


Story Source:

The above story is reprinted from materials provided by Boston University Medical Center.


Journal Reference:

K. Dower, C. M. Filone, E. N. Hodges, Z. B. Bjornson, K. H. Rubins, L. E. Brown, S. Schaus, L. Hensley, J. H. Connor. Identification of a pyridopyrimidinone inhibitor of orthopoxviruses from a diversity-oriented synthesis library. Journal of Virology, 2011; DOI: 10.1128/JVI.05416-11

Calories, not protein, matter most for fat gain

By Genevra Pittman

Reuters Health

Tuesday, January 3, 2012

NEW YORK (Reuters Health) - When it comes to packing on body fat, how many calories you eat seems to count more than where those calories come from -- lots of protein, or very little.

Researchers found that people who ate high-calorie diets all gained about the same amount of fat. Those whose diets were low in protein gained less weight overall than people on high- and moderate-protein diets, but that's because the low-protein group also lost muscle.

"Huge swings in protein intake do not result in huge swings in body fat gain," said Dr. James Levine, who studies obesity at the Mayo Clinic in Rochester, Minnesota but wasn't involved in the new study.

"It really is the calories that count."

Previous research has suggested that when people over-eat, the amount of weight they gain varies from person to person. Whether the make-up of individuals' diets might affect how their bodies store the extra calories has remained unclear, though.

For the current study, researchers led by Dr. George Bray from the Pennington Biomedical Research Center in Baton Rouge, Louisiana, recruited 25 young, healthy volunteers to live in their lab and eat a prescribed diet for two to three months.

During the first couple of weeks, the researchers tinkered with participants' diets to determine exactly how many calories they needed to maintain their body weight.

Then, for eight weeks, they piled on about 1,000 extra calories to those daily diets. One-third of the participants were fed a standard diet with 15 percent of their calories coming from protein, while the others ate low- or high-protein diets with either five or 25 percent of calories from protein.

That worked out to volunteers eating an average of 47, 139 or 228 grams of protein per day.
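For readers who want to check the arithmetic: protein provides roughly 4 calories per gram, so the percentages translate into grams once the total calorie count is fixed. A minimal sketch of that conversion in Python, assuming a total intake of about 3,700 calories a day (a figure implied by the reported gram amounts, not stated in the article):

    # Back-of-the-envelope conversion (illustrative only, not taken from the study).
    # Protein supplies roughly 4 calories per gram, so a share of total daily
    # calories can be turned into grams per day.
    def protein_grams(total_calories, percent_from_protein, calories_per_gram=4):
        return total_calories * percent_from_protein / 100 / calories_per_gram

    # Assuming roughly 3,700 total daily calories during the overfeeding phase:
    for pct in (5, 15, 25):
        print(pct, "percent ->", round(protein_grams(3700, pct)), "grams per day")
    # Prints roughly 46, 139 and 231 grams per day, close to the reported
    # 47, 139 and 228 grams for the three groups.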

Those diets made everyone gain weight, but not equally. The low-protein diet group put on about seven pounds per person, compared to 13 or 14 pounds in the normal- and high-protein groups.

But people in the low-protein group stored more than 90 percent of their extra calories as fat and lost body protein (muscle mass), while other participants gained both fat and healthier lean muscle, researchers reported in the Journal of the American Medical Association. So the groups all gained a similar amount of total excess fat.

Donald Layman, a food science researcher at the University of Illinois in Urbana, said it's difficult to see how the findings apply to the general population, since people outside the lab are not being overfed a protein-deficient diet like the one given to the low-protein group.

"It's an interesting scientific study, but from an obesity standpoint, I don't think it tells us anything," he told Reuters Health.

But Levine said there are a couple of messages that people outside of a strict scientific study can take away from the findings -- especially that weight gain or loss might not be the best way to track how healthy a person's diet is.

Bray agreed.

"The scale that you step on isn't necessarily a good guide to the kind of weight you're gaining," he told Reuters Health.

"People who had the low protein gained about half as much weight as those that had normal or high protein, but the weight was different in one major component: they lost body protein, which is not healthy," Bray said. "The scale can fool you into thinking that you're winning when you aren't."

Regardless of that number, he concluded, "If you over-eat extra calories, no matter what the composition of the diet is, you'll put down more fat."

Source: http://bit.ly/hwxtTL

Journal of the American Medical Association, online January 3, 2012.

Monday, January 2, 2012 

Changes in Cerebrospinal Fluid May Signal Early Alzheimer's

HealthDay News

Monday, January 2, 2012

MONDAY, Jan. 2 (HealthDay News) -- Searching for a better screen for early Alzheimer's disease, researchers think they have found a marker of change in the brain that precedes the onset of the disease by five to 10 years.

The indicator of trouble to come, they say, is a shift in the levels of specific components of the cerebrospinal fluid (CSF) that bathes the brain and spinal cord. Among patients already diagnosed with mild cognitive impairment, such a shift appears to be a sign of Alzheimer's years before dementia develops.

The discovery, published in the January issue of Archives of General Psychiatry, could potentially aid in the use of disease-modifying therapies, which are designed to work best if applied when a patient is still in the early stages of disease.

"These markers can identify individuals at high risk for future [Alzheimer's disease] at least five to 10 years before conversion to dementia," study author Dr. Peder Buchhave, of Lund University and Skane University in Sweden, noted in a journal news release. "Hopefully, new therapies that can retard or even halt progression of the disease will soon be available. Together with an early and accurate diagnosis, such therapies could be initiated before neuronal degeneration is too widespread and patients are already demented."

The study results stem from more than nine years of follow-up to prior research that had involved 137 patients diagnosed with mild cognitive impairment, a mental state that often precedes dementia.

Over the course of the study, nearly 54 percent of the patients went on to develop Alzheimer's, while another 16 percent were ultimately diagnosed with other forms of dementia.

Specifically, among those who developed Alzheimer's, levels of certain CSF components, such as the protein fragment amyloid-beta, had already dropped in the years before diagnosis, while levels of other markers, including tau proteins, had risen.

The study team said they believe that about nine out of every 10 patients with mild cognitive impairment who experience such fluid shifts will eventually go on to develop Alzheimer's disease.

Commenting on the study, one expert in the United States said that the new research "provides confirmation of the general concept that CSF can predict the progression of mild memory loss to mild dementia."

Dr. Sam Gandy, associate director of the Mount Sinai Alzheimer's Disease Research Center at Mount Sinai School of Medicine in New York City, added that the results of the European study largely echo those of a trial reported by researchers at the U.S. National Institutes of Health in 2010.

He noted that methods of early detection might prove valuable for research into the treatment of Alzheimer's disease.

"Most new Alzheimer's drugs are aimed at reducing amyloid [protein plaque] accumulation, and the general consensus is that these drugs will only work at early or presymptomatic stages of disease," said Gandy, who is also Mount Sinai Chair in Alzheimer's Disease Research. "The new paper strengthens the likelihood that CSF biomarkers can be useful for identifying that population of subjects with early or presymptomatic disease in order to recruit them into trials."

More information

For more on early signs of Alzheimer's, visit the Alzheimer's Association.