The world’s first legally binding climate change treaty, the Kyoto Protocol, is in critical condition. Its original terms expired at the end of 2012, and although a last-minute extension is postponing its death, the treaty has little to show for a 16-year lifespan.
In 1997, signatories to the Kyoto Protocol committed to cutting their average greenhouse gas emissions 5 percent from 1990 levels. In the subsequent years, several industrialized nations shot wide of their goals. The Netherlands intended to cut its emissions 6 percent but allowed them to increase 20 percent by 2010. Japan planned a 6 percent cut but saw a 7 percent increase. Canada planned its own 6 percent cut, but gave up on the target almost as soon as the treaty went into effect: The country set a new target in 2010 and then last year withdrew from the treaty altogether—a move to avoid penalties for breaking its pledge, The Wall Street Journal reports. Overall, Canadian emissions increased 17 percent from 1990 to 2010.
The United States wasn’t bound to make cuts since it never ratified the treaty (its emissions increased 10 percent). The United Kingdom and European Union met or surpassed their own reduction targets, but that’s partly because their economies have shrunk and because they’ve been able to import more of their goods from places like China, where manufacturing plants operate under fewer environmental rules.
The Kyoto Protocol never bound major polluters like China and India to make emissions cuts. Overall, global greenhouse gas emissions have risen 58 percent since 1990, rendering the treaty little more than a device to promote Eastern growth and Western stagnation.
In December some original treaty members, including Australia and the EU—though not Japan or Canada—extended their reduction targets for a few more years. Since these committed countries represent just 14 percent of the world’s emissions, their cuts will make little difference.
Eating habits in the United States have shifted during the past three decades, as people eat fewer meals around the dinner table and more meals on the road. In 1970, Americans spent 26 percent of their food budget eating out. That proportion increased to 41 percent in 2010.
In a recent study, researchers from the U.S. Department of Agriculture charted the nutritional implications of this trend, comparing food consumption data from 1977-78 with data from 2005-08. Although Americans today consume less fat overall, they eat more calories (2,000 on average per day, up from 1,875 in the 1970s)—and a much higher proportion of calories comes from restaurants. Americans now consume one-third of their calories away from home, compared with one-fifth in the ’70s. Fast-food restaurants, unsurprisingly, have become significant contributors to fat and calorie intake. While fast food contributed just 3 percent of the average American’s total fat intake in the ’70s, it contributed 16 percent three decades later.
The increased popularity of eating out isn’t a recipe for great health: Foods prepared away from home, the study showed, tend to be lower in dietary fiber and higher in saturated fat, cholesterol, and sodium. The authors suggested that changes in household structure and the growing share of women in the workforce since the ’70s may have discouraged at-home cooking. —D.J.D.