One of the many tragedies of Alzheimer's disease is that patients don't know until it's too late that they actually have the condition. By the time the first signs of forgetfulness and confusion set in, experts believe, the disease has already been ravaging the brain for a decade or more, causing irreversible damage.
But researchers at the Cleveland Clinic report that they may have found a way to identify those most at risk of developing the neurological disorder long before symptoms develop simply by asking them whether they recognize celebrities such as Britney Spears and Johnny Carson. It turns out that when people who are at highest risk of Alzheimer's try to recognize a famous name, their brains activate in very different ways from those of people who aren't at risk. And scientists can actually see this difference using functional magnetic resonance imaging, or fMRI.
In the journal Neurology, a team led by Stephen Rao, a brain-imaging specialist, describes a study of 69 healthy men and women aged 65 to 85. The researchers divided the group into three: those who had no risk factors for Alzheimer's, those who had a family history of the disease but no genetic indicators of it themselves, and those who had both family members with Alzheimer's and a version of a gene for a protein called apolipoprotein E4 (ApoE4) that has been linked to the condition. Each volunteer was then slid into an fMRI machine and, while being scanned, shown names of both famous and not-so-famous people.
Rao's team found that when volunteers saw names such as Britney Spears, George Clooney, Albert Einstein and Marilyn Monroe, those at the highest risk of developing Alzheimer's (the group with both the genetic makeup and a family history) showed high levels of activity in the hippocampus, posterior cingulate and regions of the frontal cortex, all areas involved in memory. The control group showed the opposite pattern: their brains became more excited when they saw unfamiliar names, which included Irma Jacoby, Joyce O'Neil and Virginia Warfield.
That could mean that the at-risk people were working harder to recognize the well-known celebrities, compensating for neurons that were already damaged or destroyed and no longer functioning, while the control group had to struggle only when trying to place the names of noncelebrities, recruiting more nerve cells and connections and racking their memory banks and recall centers. Significantly, in neither group did brain images designed to pick up structural changes associated with dementia, such as signs of atrophy and dead neurons, show any differences, at least not yet.
"This pushes the envelope further in attempting to detect dysfunction in the brain at a stage earlier than any detectable clinical measurement of cognitive decline," says Dr. Ralph Nixon, a psychiatrist at New York University and vice chair of the medical and scientific advisory council of the Alzheimer's Association. "We all know that the brain is changing metabolically at a very early stage of the disease, well before clinical symptoms. This type of technique validates that concept."
While doctors can now test for the presence of ApoE4, you have to have two copies of a particular form of the gene to be at real risk of Alzheimer's; if you do, your chances of developing the disease increase 10- to 20-fold. So far, the Alzheimer's Association does not recommend widespread screening for the gene, even among those with a family history of Alzheimer's, since most people who carry the risky version of ApoE4 don't have two copies. But looking more closely at people who have a family history of the disease, by adding an fMRI scan such as the one Rao conducted to the genetic screen, could help doctors select those who do seem to be in the greatest danger of being claimed by the disorder.
The idea is not necessarily to diagnose Alzheimer's earlier, says Rao. But imaging studies can help to identify those most vulnerable to cognitive decline so they can participate in clinical trials of new drugs designed to postpone or reduce symptoms. "If we can delay the onset of Alzheimer's by five years," he says, "by some estimates we can cut the incidence of Alzheimer's in half. If we can delay the disease by 10 years, we could almost eliminate it because people would die from other conditions first."