Earliest useful Alzheimer’s disease test everyone over 65 needs to take

Earlier this month, Britain’s Prime Minister spoke at the G8 summit about a new brain scan to diagnose Alzheimer’s disease. The test detects the presence of beta-amyloid protein “plaques” in the brain, one of the hallmarks of AD. He wants the test, developed in Britain, to become a standard diagnostic tool.

That’s all well and good for the future of the British med-tech industry. But you need to know right now about a much simpler way to test for AD. You can do it in the privacy of your own home. It won’t cost you a dime. And it’s highly accurate. In fact, five recent studies found that it might be the earliest useful tool we have for diagnosing Alzheimer’s disease!

Believe it or not, while mainstream medicine fumbles around with dementia diagnosis, recent research suggests that your own judgment is the best tool we have for catching the earliest signs of Alzheimer’s disease and dementia.

Researchers presented these five new studies at the 2013 Alzheimer’s Association International Conference earlier this year in Boston.

In one study, researchers followed a group of cognitively “normal” older people. They had no prior history of neurological or psychiatric illness. But they reported concerns about their memory. And felt they had poorer cognition than their peers.

Well, the research showed that their concerns were well-founded. In fact, researchers found a significant connection between these entirely “subjective,” self-reported concerns and the actual build-up of beta-amyloid protein plaques. And the higher a patient’s education level, the more likely they were to notice even minor changes in cognition.

This study underscores a point I often make…that you should know yourself better than anyone else does. Especially when it comes to AD. Even when you appear “cognitively normal” on paper, or during the five minutes in your doctor’s office, you know better than anyone else when something is “off.”

Another study followed nearly 4,000 nurses aged 70 and older. Researchers found that the nurses’ subjective concerns about memory loss often accurately predicted subsequent, objectively measured memory decline. Especially among carriers of the ApoE4 gene variant, a strong, known genetic risk factor for Alzheimer’s disease.

In a third study, older adults underwent annual, objective cognitive tests over an average of 10 years. Men and women who reported a subjective change in memory since their last assessment were almost twice as likely to show objective cognitive impairment or dementia during follow-up. That’s compared to those who did not report such a change.

The remaining two studies had similar findings. And scientists are now starting to recognize the value of your “subjective” opinion about your own memory and cognition.

Your chance of developing dementia begins to increase dramatically after age 65. From that point on, it’s especially important to monitor any changes you notice in your thinking and memory.

But don’t go running to a neurologist the first time you misplace your keys. Or forget an actor’s name. We live in an era with ever increasing distractions and stress. We also interact with young people who often have limited attention spans. So, it can feel impossible to stay mentally sharp in a culture that tweets, streams, screams and updates ideas every half second. No wonder you sometimes feel like you can’t string two coherent thoughts together. (After all, few people seem to be doing that anymore.)

But certain cognitive changes aren’t a normal part of aging. And you need to watch out for these changes.

Source:

1. “Subjective Cognitive Decline May Be the Earliest Clinical Indicator of Alzheimer’s Disease,” Alzheimer’s Association (www.alz.org), July 17, 2013