
The U.S. International Trade Commission (ITC) is required by the Bipartisan Congressional Trade Priorities and Accountability Act of 2015 to prepare estimates of the economic effects of trade agreements.  Specifically:

“Not later than 105 calendar days after the President enters into a trade agreement under section 103(b), the Commission shall submit to the President and Congress a report assessing the likely impact of the agreement on the United States economy as a whole and on specific industry sectors, including the impact the agreement will have on the gross domestic product, exports and imports, aggregate employment and employment opportunities, the production, employment, and competitive position of industries likely to be significantly affected by the agreement, and the interests of United States consumers.”

This statutory language guided the ITC’s analysis of the twelve-nation Trans-Pacific Partnership (TPP).  The ITC study was released on May 18, 2016. 

It had been several years since the United States concluded a free trade agreement.  The previous one with South Korea (Korea-U.S. Free Trade Agreement, or KORUS) dates from 2007.  I served as chairman of the ITC at the time and am quite familiar with the KORUS study.  The econometric modeling used a “comparative static” analysis.  A comparative static approach can be likened to taking two snapshots of the economy.  The first photo was of the known baseline economy as it existed in 2007.  The second photo also used the 2007 baseline, but this time it was “shocked” by incorporating all provisions of KORUS as if they had been fully implemented.  This allowed a conceptually sound – albeit counterfactual – assessment of the likely economic effects of KORUS by analyzing how those reforms would have influenced the 2007 economy.  (These issues are explained in this Free Trade Bulletin.)  Static modeling has been used in all the ITC’s analyses of trade agreements prior to TPP.

One of the great strengths of the comparative static approach is that it makes no attempt to project the economy into the future.  There is no need to speculate on whether a recession will curb trade flows, or whether technological change will make some industries obsolete while spurring new ones into existence.  Precisely predicting the future requires a degree of clairvoyance not possessed by economists or anyone else.  A comparative static analysis deals with that reality by instead looking backward.  It imposes new policy reforms on an old – but well-known – economy.  And it allows economists to avoid trying to make forward-looking projections of economic activity that inevitably turn out not to be correct.
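To make the two-snapshot idea concrete, here is a minimal sketch in Python.  It is not the ITC’s model; the linear demand curve, fixed world price, and 25 percent tariff are invented for illustration.  What matters is the structure: both solutions use the same fixed baseline year, and the estimated “effect” is simply the difference between them.

```python
# Toy comparative static exercise: one imported good with linear demand,
# solved twice on the same fixed "2007" baseline, once with a tariff in
# place and once with it removed.  All numbers are hypothetical.

def equilibrium(tariff):
    world_price = 10.0                  # imports supplied at a fixed world price
    p = world_price * (1.0 + tariff)    # domestic price including the tariff
    q = 100.0 - 2.0 * p                 # linear demand: q = 100 - 2p
    return p, q

p0, q0 = equilibrium(tariff=0.25)       # snapshot 1: the baseline economy
p1, q1 = equilibrium(tariff=0.0)        # snapshot 2: same year, reform imposed

# The estimated "effect" is the difference between the two snapshots,
# not a forecast of any future year.
print(f"price change:  {p1 - p0:+.2f}")
print(f"import change: {q1 - q0:+.2f}")
```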

However, comparative static modeling is not the only tool in the econometrician’s toolbox.  For its analysis of the economic effects of TPP, the ITC has chosen to use a dynamic computable general equilibrium (CGE) model.  The Global Trade Analysis Project (GTAP) model “is an appropriate tool for analyzing the effects of trade agreements because it consists of a database with international trade flows and other macroeconomic information, social accounting matrixes that show how different segments of the economy are interlinked, and national income accounts data.”  Using a dynamic version of the GTAP model has allowed the ITC to estimate changes in various economic measures (real GDP, employment, exports, imports, etc.) up to 30 years in the future.  Most of the analysis focuses on the 15-year period beginning in 2017 and ending in 2032.

In order to evaluate how TPP might influence the economy in the future, it first was necessary to create a baseline projection of what the economy would be like in the years ahead without TPP.  The ITC has done this by incorporating projections made by the International Monetary Fund (IMF) and the Organisation for Economic Co-operation and Development (OECD) regarding growth rates in many countries for labor, population, and GDP.  Once the 30-year baseline was established, the model was shocked by adding TPP’s annual policy changes.  (TPP gradually phases in many reductions in trade restrictions year by year.)  The economic effects of TPP then were measured as differences between the original baseline and the baseline following the shocks from TPP.  The dynamic GTAP model provides a mathematically sound means to estimate how future economic outcomes would differ as a result of policy changes.
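The architecture of that exercise can be sketched in a few lines of Python.  Nothing below comes from the ITC’s actual model; the growth rate, shock size, and phase-in schedule are invented placeholders.  The point is the two-path structure: projections that share every assumption except the agreement, compared year by year.

```python
# Toy dynamic exercise in the spirit of the ITC's two-path comparison:
# build a no-agreement baseline from an assumed growth rate, rerun the
# same projection with phased-in policy shocks, and report differences.
# The growth rate and shock sizes below are invented for illustration.

years = range(2017, 2033)            # the analysis window ending in 2032
baseline_growth = 0.021              # assumed trend growth, hypothetical
phase_in_years = 10                  # reforms phase in gradually

gdp_base = 100.0                     # both paths indexed to 100 in 2016
gdp_tpp = 100.0
for i, year in enumerate(years):
    gdp_base *= 1.0 + baseline_growth
    # During phase-in, each year's growth gets a slightly larger bump.
    bump = 0.0003 * min(i + 1, phase_in_years)
    gdp_tpp *= 1.0 + baseline_growth + bump
    print(f"{year}: baseline {gdp_base:6.2f}  with-TPP {gdp_tpp:6.2f}  "
          f"estimated effect {100.0 * (gdp_tpp / gdp_base - 1.0):+.2f}%")
```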

The real questions regarding forward-looking estimates have to do with the baseline itself.  The IMF and OECD are quite capable when it comes to analyzing historical trends.  Generally it’s not unreasonable to project well-established trends a short distance into the future.  If global GDP has grown at an average rate of 3 percent over the past ten years, for instance, it may be quite sensible to estimate that growth in the coming year also will be around 3 percent.  The problems come as we look further into the future.  Life’s inherent uncertainties make it relatively likely that a future projection of U.S. exports or imports of cheese, for instance, will turn out not to be precisely accurate.  How much confidence should we have in projections five years into the future?  Fifteen years?  Thirty?
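A bit of compound arithmetic shows why confidence should fade with the horizon.  The sketch below is purely illustrative, not the ITC’s error analysis; the 3 percent trend and the one-percentage-point annual miss are assumptions.  Even that small yearly error band widens dramatically over 30 years.

```python
# Compounding a +/- 1 percentage point annual miss around an assumed
# 3 percent trend.  Purely illustrative arithmetic.

trend, miss = 0.03, 0.01
for horizon in (1, 5, 15, 30):
    low = (1 + trend - miss) ** horizon
    mid = (1 + trend) ** horizon
    high = (1 + trend + miss) ** horizon
    print(f"{horizon:2d} years out: projected level {mid:4.2f}, "
          f"plausible range {low:4.2f} to {high:4.2f}")
```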

Making estimates that turn out to be different from actual future outcomes is not a problem for economists and statisticians schooled in economic modeling.  They understand well that the ITC’s estimates were made using the best available information and up-to-date econometric techniques, and that the real-world economy simply diverged from what had been projected in the baseline.  Unfortunately, not everyone interested in trade policy has such depth of knowledge and understanding.  My concern is not with the integrity of the modeling, but rather with the challenges that trade supporters may face in defending the results of the analysis against criticism.

Even with comparative static modeling, opponents of expanded international trade have been inclined to misinterpret the analysis.  There are claims, for example, that the ITC did a poor job with its KORUS study because the U.S. trade deficit with South Korea has gone up since the agreement went into effect.  The KORUS study didn’t say anything about what might happen to the trade deficit in the future.  However, it did indicate a likely decrease in the deficit in the hypothetical situation in which all provisions of KORUS were somehow implemented during the static 2007 baseline period.

Now that the ITC’s TPP study has used a dynamic CGE approach that actually does make estimates about future trade flows, critics of trade agreements no doubt will be happy to point out how the ITC “got things all wrong.”  (In fact, the ITC’s estimates are seldom likely to be “right.”)  Trade skeptics are unlikely to bother explaining that the real source of the estimated “errors” is that the underlying economy evolved differently than the IMF and OECD had projected.  Most anti-trade NGOs have little interest in raising the quality of the trade policy debate.  Rather, they may be inclined to argue that all economic analysis showing positive effects for the United States from trade agreements is suspect and can’t be trusted.

Supporters of trade liberalization will do their best to counter such misinformation by explaining the details of dynamic CGE modeling.  But the criticism of the ITC’s estimates will take only a few words; setting the record straight will require several sentences or paragraphs.  Protectionist rhetoric may prove to have a greater influence on public opinion than the substantive explanations do.

It will be interesting to see whether analyzing trade agreements via dynamic CGE modeling leads to a more informed public discussion than has been the case for the comparative static technique.  With a comparative static approach, the ITC was never wrong, but often misunderstood.  With dynamic modeling, the ITC will almost never be right, while still being misunderstood. 

 

Daniel R. Pearson is a senior fellow in the Cato Institute’s Herbert A. Stiefel Center for Trade Policy Studies, and is a former chairman of the U.S. International Trade Commission.

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

There is a new paper generating some press attention (e.g., Chris Mooney at the Washington Post) that strongly suggests global warming is leading to specific changes in the atmospheric circulation over the Northern Hemisphere that are enhancing surface melting across Greenland—and, of course, that this mechanism will make things even worse than expected in the future.

We are here to strongly suggest this is not the case.

The new paper is by a team of authors led by Marco Tedesco from Columbia University’s Lamont-Doherty Earth Observatory. The main gist of the paper is that Arctic sea ice loss as a result of human-caused global warming is causing the jet stream to slow down and become wigglier—with deeper north-south excursions that hang around longer.  This type of behavior is referred to as atmospheric “blocking.”

If this sounds familiar, it’s the same theoretical argument that is made to try to link wintertime “polar vortex” events (i.e., cold outbreaks) and blizzards to global warming.  This argument has been pretty well debunked, time and time again.

Well, at least it has as it concerns wintertime climate.

The twist in the new paper by Tedesco and colleagues is that they’ve applied it to the summertime climate over Greenland.  They argue that global warming is leading to an increase in blocking events over Greenland in the summer, which is causing warm air to be “locked” in place, leading to enhanced surface melting there.  Chris Mooney, who likes to promote climate alarm buzzwords, refers to this behavior as “weird.”  And he describes the worrisome implications:

The key issue, then, is whether 2015 is a harbinger of a future in which the jet stream keeps sending Greenland atmospheric systems that drive major melt — and in turn, whether the Arctic amplification of climate change is driving this. If so, that could be a factor, not currently included in many climate change simulations, that would worsen the ice sheet’s melt, drive additional sea level rise and perhaps upend ocean currents due to large influxes of fresh water.

As proof that things were weird over Greenland in recent summers, Tedesco’s team offers up this figure in their paper:

" title="<--break-->" class="mceItem">

This chart (part of a multipanel figure) shows the time history of the North Atlantic Oscillation (NAO—a pattern of atmospheric variation over the North Atlantic) as red bars and something called the Greenland Blocking Index (GBI) as the black line, for the month of July during the period 1950-2015.  The chart is meant to show that in recent years the NAO has been very low, with 2015 being “a new record low of -1.23 (since 1899),” and the GBI has been very high, with the authors noting that “[c]oncurrently, the GBI also set a new record for the month of July [2015].”  Clearly the evidence shows that atmospheric blocking is increasing over Greenland, which fits nicely into the global warming/sea ice loss/wiggly jet stream theory.

So what’s our beef?

A couple of months ago, some of the same authors of the Tedesco paper (notably Ed Hanna) published a paper showing the history of the monthly GBI going back to 1851 (as opposed to 1950 as depicted in the Tedesco paper).

Here’s their GBI plotted for the month of July from 1851 to 2015:

[Figure from Hanna et al. (2016): July Greenland Blocking Index, 1851-2015.]

This picture tells a completely different story.  Instead of a long-term trend that could be related to anthropogenic global warming, what we see is large annual and multidecadal variability, with the end of the record not looking much different than, say, the period around 1880, and with the highest GBI occurring in 1918 (with 1919 coming in second place).  While this doesn’t conclusively demonstrate that the current rise in GBI is unrelated to jet stream changes induced by sea ice loss, it most certainly does demonstrate that global-warming-induced sea ice loss is not a requirement for blocking events to occur over Greenland, and that recent events are not at all “weird.”  An equally plausible, if not much more plausible, expectation of future behavior is that this GBI highstand is part of multidecadal natural variability and will soon relax back toward normal values.  But such an explanation isn’t Post-worthy.

Another big problem with all the new hype is that history shows the current goings-on in Greenland to be irrelevant, because humans just can’t make it warm enough up there to melt all that much ice.  For example, in 2013, Dorthe Dahl-Jensen and her colleagues published a paper in Nature detailing the history of the ice in Northwest Greenland during the beginning of the last interglacial, which included a 6,000-year period in which her ice core data showed summer temperatures averaging a whopping 6°C warmer than the 20th-century average.  Greenland only lost around 30% of its ice with a heat load of (6 × 6,000) 36,000 degree-summers.  The best humans could ever hope to do with greenhouse gases is—very liberally—about 5 degrees for 500 summers, or (5 × 500) 2,500 degree-summers.  In other words, the best we can do is 2,500/36,000 times 30%, or about 2% of the ice, resulting in a grand total of roughly six inches of sea level rise over 500 years.  That’s pretty much the death of the Greenland disaster story, despite every lame press release and hyped “news” article on it.
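For readers who want to check the back-of-the-envelope math, here is the same arithmetic spelled out.  The Eemian figures follow Dahl-Jensen et al. (2013) as summarized above; the 5-degrees-for-500-summers ceiling comes from the paragraph above, while the roughly 7.4 meters of sea level equivalent stored in Greenland’s ice is a round assumption added for the sketch.

```python
# Back-of-the-envelope degree-summers comparison.  The ~7.4 m of sea
# level equivalent in the Greenland ice sheet is an assumed round number.

eemian_load = 6 * 6000        # 36,000 degree-summers melted ~30% of the ice
human_load = 5 * 500          # 2,500 degree-summers, a liberal upper bound

fraction_lost = 0.30 * human_load / eemian_load   # scale melt by heat load
sea_level_rise_m = 7.4 * fraction_lost

print(f"ice lost: {100 * fraction_lost:.1f}%")                 # ~2.1%
print(f"sea level rise: {sea_level_rise_m / 0.0254:.0f} in")   # ~6 inches
```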

While you won’t find this kind of analysis elsewhere, we’re happy to do it here at Cato. 

References:

Dahl-Jensen, D., et al., 2013.  Eemian interglacial reconstructed from a Greenland folded ice core.  Nature, 493, 489-494, doi:10.1038/nature11789.

Hanna, E., et al., 2016.  Greenland Blocking Index 1851-2015: a regional climate change signal.  International Journal of Climatology, doi:10.1002/joc.4673.

Tedesco, M., et al., 2016.  Arctic cut-off high drives the poleward shift of a new Greenland melting record.  Nature Communications, doi:10.1038/ncomms11723, http://www.nature.com/ncomms/2016/160609/ncomms11723/full/ncomms11723.html

North Korea’s ruling elite appears to be getting along fine despite international sanctions. Washington needs to find a new approach toward the North.

The so-called Democratic People’s Republic of Korea poses one of the most vexing challenges to American policy. For more than 20 years U.S. presidents have insisted that the DPRK cannot be allowed to develop nuclear weapons. Yet it apparently is preparing for a fifth nuclear test.

A military strike, as proposed by Ashton Carter before he was appointed Defense Secretary, would risk engulfing the peninsula in war.  So the U.S. has relied on sanctions.  Every time Pyongyang misbehaves—especially by testing a nuclear weapon or launching a missile—American officials impose tougher domestic economic penalties and press for harsher UN sanctions.

After the North’s latest nuclear test earlier this year, China agreed to a new round of restrictions.  The increased penalties had no impact on North Korean policy.  To the contrary, in early May the Kim regime used the party congress to highlight Pyongyang’s nuclear program.

To be sure, sanctions have had an impact.  The People’s Republic of China has been losing patience and appears to be regulating cross-border commerce more tightly.  Some North Korean representatives of blacklisted agencies have moved from China to Southeast Asian nations.  The regime has resorted to smuggling to bring in banned products.  Moreover, Pyongyang appears to be having more difficulty selling weapons abroad.

Nevertheless, Beijing continues to moderate the impact of sanctions.  Illicit goods still cross the border, and some observers expect the PRC commitment to fade as Western attention moves elsewhere.  Beijing fears chaos on its border more than a North Korea with nuclear weapons.  President Xi Jinping recently declared: “As a close neighbor of the peninsula, we will absolutely not permit war or chaos on the peninsula.”

The Xi government so far refuses to halt energy and food shipments, the only step that would apply bone-crunching pressure to the Kim regime. Even then, Pyongyang might refuse to comply. The regime already is blaming the West, preparing its people for what it calls an “arduous march.”

During the late 1990s the regime survived the virtual collapse of the economy and the starvation deaths of a half million or more North Koreans.  The Kim dynasty might survive similar hardship in the future.

Unfortunately, the uniform experience of sanctions is that they hurt those with the least resources and influence. That appears to be the case in North Korea.

So far the powerful have prospered, despite penalties directed against luxury imports. The Washington Post recently reported on “Pyonghattan,” home to North Korea’s privileged elite. In contrast, argued Andrei Lankov of Kookmin University, “the average North Korean will also bear the brunt of the sanctions.”

The latest round of sanctions has increased hardship. Choi Ha-young, chairman of the Love North Korean Children Charity, complained: “Currently, due to the UN sanctions, people in the lowest class are really impacted.”

As I point out in National Interest: “Washington seems to have only one response to the North: increase sanctions. However, this policy is a dead-end. The U.S. and its allies must find a new strategy toward Pyongyang.”
