Is Running Bad for Your Knees? Maybe Not

Perhaps because it seems intuitively true, the notion persists that running, especially when done long-term and over long distances, is bad for the joints. Indeed, it would be hard to think otherwise when, with each foot strike, a runner's knee withstands a force equal to eight times his or her body weight — for a 150-lb. person, that's about 1,200 lb. of impact, step after step.

The common wisdom is that regular running or vigorous sport-playing during a person's youth subjects the joints to so much wear and tear that it increases his or her risk of developing osteoarthritis later in life. Research has suggested that may be at least partly true: in a study of about 5,000 women published in 1999, researchers found that women who actively participated in heavy physical sports in their teenage years or weight-bearing activities in middle age had a higher than average risk of developing osteoarthritis of the hip by age 50.

But over the past few years, an emerging body of research has begun to show the opposite, especially when it comes to running. Not only is there no connection between running and arthritis, the new studies say, but running — and perhaps regular vigorous exercise generally — may even help protect people from joint problems later on.

In a well-known long-term study conducted at Stanford University, researchers tracked nearly 1,000 runners (active members of a running club) and nonrunners (healthy adults who didn't have an intensive exercise regimen) for 21 years. None of the participants had arthritis when the study began, but many of them developed the condition over the next two decades. When the Stanford team tabulated the data, published in the Archives of Internal Medicine in 2008, it found that the runners' knees were no more or less healthy than the nonrunners' knees. And it didn't seem to matter how much the runners ran. "We have runners who average 200 miles a year and others who average 2,000 miles a year. Their joints are the same," says James Fries, a professor emeritus of medicine at Stanford and the leader of the research group. The study also found that runners experienced less physical disability and had a 39% lower mortality rate than the nonrunners.

In 2007, a nine-year study of 1,279 elderly residents of Framingham, Mass., yielded similar findings: the most active people had the same risk of arthritis as the least active. About 9% of the participants overall developed arthritis over the course of the study, as measured by symptoms reported to their physicians (pain and difficulty walking) as well as X-ray scans. And in the same year, Australian researchers writing in the journal Arthritis and Rheumatism found that people who exercised vigorously had thicker and healthier knee cartilage than their sedentary peers. That suggests the exercisers may have also enjoyed a lower risk of osteoarthritis, which is caused by breakdown and loss of cartilage.

Together, the findings lend support to the theory that osteoarthritis, which affects nearly 20 million Americans, is caused mainly by genes and risk factors like obesity (obese men and women are at least four times as likely to become arthritic as their thinner peers), rather than by daily exercise or everyday wear and tear on the joints. In fact, a "normally functioning joint can withstand and actually flourish under a lot of wear," says Fries. Because cartilage — the soft connective tissue that cushions the ends of the bones in a joint — does not have arteries that deliver blood, it relies on the pumping action generated by movement to get its regular dose of oxygen and nutrients. "When you bear weight, [the joint] squishes out fluid, and when you release weight, it sucks in fluid," says Fries, explaining why a daily run or any other workout is useful for maintaining healthy cartilage.

That's not to say that there are no risks in running. It can sometimes cause soft-tissue injuries and stress fractures, also called hairline fractures, which result from the compounding of tiny cracks in the bone over time. It's not uncommon for such tiny cracks to appear in the bones that bear the heaviest loads, like the tibia (shinbone), but they usually heal quickly and go unnoticed. Stress fractures occur when the damage accumulates faster than the bone can repair it, typically after a sudden jump in activity. For instance, high school athletes who stop training all summer and then abruptly start attending practice every day have a much higher risk of stress fractures in their shinbones than their friends who practiced regularly over the break.

The good news is that there are ways to help reduce the risk of stress fracture. One method may be simply to strengthen the muscle attached to the bone. In a study published in the December issue of Medicine & Science in Sports & Exercise, researchers at the University of Minnesota found that among competitive female runners, those with larger calf muscles were less likely than runners with smaller calf muscles to suffer stress fractures in their shinbones. Why? The stronger the muscle, the greater the bending force it exerts on the bone when it contracts, like a string bending a bow, and that repeated loading gradually makes the bone stronger.

So simple calf-muscle exercises, like rising up on your toes about a dozen times a day, may be sufficient to increase strength in the shinbone, says study author Kristy Popp, who recently completed her Ph.D. in exercise physiology at the University of Minnesota. She suggests adding calf workouts to your regular exercise routine but cautions that increasing muscle and bone strength is a gradual process and that having strong calves is no cure-all. But "if it can help prevent stress fractures, it's worth a try," says Popp.

In a second study in the same journal, researchers at Iowa State University used computer modeling to figure out how the length of a runner's stride might change the force applied to his or her bones and thereby affect the risk of stress fractures. The researchers recruited 10 male participants, each of whom typically ran about three miles per day, and calculated their risk of experiencing a stress fracture — about 9% over 100 days. By observing the participants running at varying stride lengths and recording the amount of force their foot strikes exerted on the ground, the researchers were able to estimate the force each runner applied to his shinbone. According to the computer model, if the runners reduced their natural strides by about 10%, they could reduce their risk of fracture by a third.

The reason is less air time, researchers say — the less time a runner's feet spend airborne, the less force they strike the ground with. Still, the results of a mathematical model are difficult to re-create in real life, especially since it takes a fair amount of practice to adjust to a shortened stride. Runners who shorten their stride tend instinctively to quicken their pace to compensate. That can negate any protective effect of stride shortening — when you speed up, the force on the bone increases proportionately.
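
For readers who like to see the arithmetic, the sketch below (in Python) illustrates the kind of cumulative-damage reasoning such models rest on. It is not the Iowa State group's actual model: the power-law damage rule, the exponent and the force and step-count figures are illustrative assumptions, chosen only to show why small changes in per-step force can shift fracture risk so much in either direction.

```python
# A toy cumulative-damage sketch, NOT the Iowa State model. Bone fatigue is
# often described with a power law -- damage per step grows roughly as the
# tibial force raised to a large exponent -- so modest changes in per-step
# force swing the accumulated damage (and thus fracture risk) sharply.
# The exponent and the force/step-count scalings below are assumptions.

def relative_damage(force_scale: float, steps_scale: float = 1.0, b: float = 7.0) -> float:
    """Accumulated bone damage relative to a baseline of 1.0.

    force_scale -- per-step tibial force as a fraction of the runner's usual force
    steps_scale -- number of loading cycles as a fraction of usual
    b           -- assumed fatigue exponent (bone is very sensitive to load)
    """
    return steps_scale * force_scale ** b

print(relative_damage(1.00))                    # baseline: 1.0
# Shorter stride: less air time, a few percent less force per step, but
# slightly more steps to cover the same distance (assumed numbers).
print(relative_damage(0.95, steps_scale=1.10))  # ~0.77 -- damage drops noticeably
# Speeding up to compensate instead: force rises with pace, benefit negated.
print(relative_damage(1.05))                    # ~1.41 -- damage climbs sharply
```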

Study author Brent Edwards, now at the University of Illinois at Chicago, says he "would never recommend stride reduction to a competitive runner," but he suggests the technique for people with a history of stress fractures, like former athletes. The biggest risk factor for stress fractures, he notes, is simply having had such a fracture in the past. But the best advice for runners wishing to reduce injuries is to keep running; that is, run consistently and avoid long periods of inactivity. That may be especially hard during the snowy winter months, but runners should try to get in a daily workout — hitting the treadmill, running up and down stairs or even shoveling the driveway should do the job. Just don't sit around all winter and then start running three-milers in the spring. It's that sudden activity that increases the risk of injury.