Policy Institutes

The Constitution has gotten short shrift in the ongoing presidential debates, save for an occasional mention by Rand Paul. Now that he’s out of the race, Politico reports this morning, in a piece entitled “Ted Cruz, born-again libertarian,” that Cruz is scrambling for Paul’s supporters, claiming that he’s the one remaining “constitutional conservative.” That’s rich, and here’s why.

If there is any test of libertarian constitutionalism, it concerns the proper role of the courts in limiting legislative and executive excesses, federal, state, and local. Even many conservatives today are rethinking their earlier views and arguing now that courts need to be more engaged in the business of limiting government and preserving liberty. And no Supreme Court decision in our history more symbolizes the divide between the earlier conservatives and the libertarians who’ve gradually brought this rethinking about than Lochner v. New York, where the Court in 1905 struck down an economic regulation because it violated the right to liberty of contract protected by the 14th Amendment.

And where does Ted Cruz stand on that? Here’s Damon Root writing yesterday about the Paul exit in Reason’s “Hit & Run” blog:

Ted Cruz, meanwhile, stands in direct opposition to the libertarian legal movement on the central issue of economic liberties and the Constitution. For example, in July 2015 Cruz attacked the Supreme Court’s Lochner decision as a regrettable example of the Court’s “imperial tendencies” and “long descent into lawlessness.”

Unfortunately for Cruz, he undercut his own position in that speech by mangling the facts of Lochner, which he incorrectly described (while reading from a prepared text) as a case where “an activist Court struck down minimum wage laws” on behalf of an individual right “that has no basis in the language of the Constitution.” (Cruz’s opposition to Lochner also happens to be indistinguishable from Barack Obama’s negative view of the case.)

In reality, Lochner was not a minimum wage case at all; it was a maximum working hours case, plain and simple. What’s more, there is significant historical evidence showing that the individual right at issue in Lochner—liberty of contract—is deeply rooted in the text and history of the 14th Amendment.

Ted Cruz may be a “constitutional conservative” in the old and, increasingly, passing sense, but he’s hardly heir to those Rand Paul supporters who take the Constitution seriously. If his views on Lochner are any indication, he’d be more comfortable with the deferential Court that has left Obamacare largely intact. At the least, he needs to bone up on his constitutional theory and history.

Last year I referred readers to the abuse of civil asset forfeiture laws by the IRS in its attempt to take more than $107,000 from North Carolina small business owner Lyndon McLellan without charging him with any crime.

The IRS cleaned out Mr. McLellan’s business account because it suspected him of “structuring,” an offense in which a person evades legally mandated financial reporting requirements by keeping deposits and withdrawals under $10,000.  Because there are many perfectly legitimate reasons a business owner might deposit less than $10,000 at a time (for instance, if an insurance policy covers only $10,000 in cash on hand), and because civil asset forfeiture allows the government to seize cash and property without proving any wrongdoing, IRS structuring seizures are prone to abuse.

Tacitly recognizing the abuse allowed by the law, former Attorney General Eric Holder announced changes to the use of civil forfeiture in structuring offenses last year.  The policy changes should have spared innocent business owners like Lyndon McLellan, but it seems some federal prosecutors never got the memo.  In fact, the Assistant U.S. Attorney in charge of the case responded to criticism by sending veiled threats to Lyndon McLellan and his lawyers at the Institute for Justice, warning them against publicizing the case lest it “ratchet up feelings” in the IRS offices.

The publicity worked. After significant public and political pressure, the IRS relented and returned the amount they had taken from Mr. McLellan’s bank account. As I noted last year, however, the IRS refused to reimburse Mr. McLellan for the costs of fighting the seizure or to pay interest on the money it had wrongfully seized.

But this week a federal judge ruled that the IRS must do more to make Mr. McLellan whole, and awarded him legal costs totaling more than $20,000.

The court held:

Certainly, the damage inflicted upon an innocent person or business is immense when, although it has done nothing wrong, its money and property are seized. Congress, acknowledging the harsh realities of civil forfeiture practice, sought to lessen the blow to innocent citizens who have had their property stripped from them by the Government… . This court will not discard lightly the right of a citizen to seek the relief Congress has afforded.

Fortunately, thanks to the efforts of Mr. McLellan and the Institute for Justice, the good guys won this time. Ultimately, however, the only way to ensure that civil forfeiture abuses stop happening is to abolish civil forfeiture. If the government cannot prove beyond a reasonable doubt that a person engaged in criminal activity, it should not be able to punish them as if they’re guilty.  As long as Congress and state legislatures allow this practice to continue, more innocent Americans will end up fighting for their livelihoods like Lyndon McLellan had to.  

For the Institute for Justice page detailing Mr. McLellan’s case, click here.

For Cato’s explainer on the troubling history of civil asset forfeiture, click here.

The big trade news from yesterday was that government officials from the 12 nations negotiating the Trans Pacific Partnership traveled to New Zealand for the official signing ceremony. While the negotiators are no doubt relieved, and are looking forward to some time off, we now get to perhaps the most difficult part of the process: seeing whether Congress will approve what the Obama administration negotiated.

Finding a way for different branches in a divided government to work together is never easy. This year, presidential elections are thrown into the mix, which makes things even harder.

Senator Mitch McConnell is sounding pretty skeptical about holding a vote before the election, and maybe even after as well:

McConnell said his “advice” is that Congress not vote on TPP prior to the election in part because the two Democratic presidential candidates and several Republican candidates oppose the agreement.

With respect to a lame-duck vote, McConnell signaled it may not be fair to constituents to take a vote on a controversial issue such as trade after they cast their votes for who should represent them in Congress.

People are sometimes able to resolve their differences, and maybe there is some deal to be struck here. On the other hand, Senator McConnell feels pretty strongly about the “tobacco carveout” that was included in the TPP’s investment provisions. The Obama administration has used this carveout to generate TPP support from groups such as the Cancer Action Network, but it’s not clear that such support will lead to any Democratic votes for the TPP, whereas it clearly is affecting Republican views of the TPP.

So, the TPP has been signed, but it is not clear whether it can be sealed and delivered.  In fact, at this point, it seems very possible that whoever becomes President will want to take a fresh look at the terms. Hillary Clinton might want to see if it is “progressive” enough (the Obama administration keeps calling it the “most progressive trade agreement in history”); on the other side, Marco Rubio might want to make it a lot less progressive (e.g., by taking out the minimum wage provisions, and deleting the tobacco carveout).

I’ll close with a quote from Victoria Guida of Politico: “The future of the Trans-Pacific Partnership is as clear as mud … .” 

Remember peak oil? Remember when oil prices were $140 a barrel and Goldman Sachs predicted they would soon reach $200? Now, the latest news is that oil prices have gone up all the way to $34 a barrel. Last fall, Goldman Sachs predicted prices would fall to $20 a barrel, which other analysts argued was “no better than its prior predictions,” but in fact they came a lot closer to that than to $200.

Low oil prices generate huge economic benefits. Low prices mean increased mobility, which means increased economic productivity. The end result, says Bank of America analyst Francisco Blanch, is “one of the largest transfers of wealth in human history” as $3 trillion remain in consumers’ pockets rather than going to the oil companies. I wouldn’t call this a “wealth transfer” so much as a reduction in income inequality, but either way, it is a good thing.

Naturally, some people hate the idea of increased mobility from lower fuel prices. “Cheap gas raises fears of urban sprawl,” warns NPR. Since “urban sprawl” is a made-up problem, I’d have to rewrite this as, “Cheap gas raises hopes of urban sprawl.” The only real “fear” is on the part of city officials who want everyone to pay taxes to them so they can build stadiums, light-rail lines, and other useless urban monuments.

A more cogent argument is made by UC Berkeley sustainability professor Maximilian Auffhammer, who argues that “gas is too cheap” because current prices fail to cover all of the external costs of driving. He cites what he calls a “classic paper” that calculates the external costs of driving to be $2.28 per gallon. If that were true, then one approach would be to tax gasoline $2.28 a gallon and use the revenues to pay those external costs.

The only problem is that most of the so-called external costs aren’t external at all but are paid by highway users. The largest share of calculated costs, estimated at $1.05 a gallon, is the cost of congestion. This is really a cost of bad planning, not gasoline. Either way, the cost is almost entirely paid by people in traffic consuming that gasoline.

The next largest cost, at 63 cents a gallon, is the cost of accidents. Again, this is partly a cost of bad planning: remember how fatality rates dropped nearly 20 percent between 2007 and 2009, largely due to the reduction in congestion caused by the recession? This decline could have taken place years before if cities had been serious about relieving congestion rather than ignoring it. In any case, most of the cost of accidents, like the cost of congestion, is largely internalized by auto drivers through insurance.

The next-largest cost, pegged at 42 cents per gallon, is “local pollution.” While that is truly an external cost, it is also rapidly declining as shown in figure 1 of the paper. According to EPA data, total vehicle emissions of most pollutants have declined by more than 50 percent since the numbers used in this 2006 report. Thus, the 42 cents per gallon is more like 20 cents per gallon and falling fast.

At 12 cents a gallon, the next-largest cost is “oil dependency,” which the paper defines as exposing “the economy to energy price volatility and price manipulation” that “may compromise national security and foreign policy interests.” That problem, which was questionable in the first place, seems to have gone away thanks to the resurgence of oil production within the United States, which has made other oil producers, such as Saudi Arabia, more dependent on us than we are on them.

Finally, at a mere 6 cents per gallon, is the cost of greenhouse gas emissions. If you believe this is a cost, it will decline when measured as a cost per mile as cars get more fuel efficient under the current CAFE standards. But it should remain fixed as a cost per gallon as burning a gallon of gasoline will always produce a fixed amount of greenhouse gases.

In short, rather than $2.28 per gallon, the external cost of driving is closer to 26 cents per gallon. Twenty cents of this cost is steadily declining as cars get cleaner, and all of it is declining when measured per mile as cars get more fuel-efficient.
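The arithmetic behind these totals is easy to check. A minimal sketch in Python: the first set of figures is the component breakdown quoted above from the 2006 paper, while the adjusted set reflects this post's argument (the roughly 20-cent pollution figure is an estimate, not a number from the paper):

```python
# External-cost components of driving, in dollars per gallon,
# as quoted above from the 2006 paper.
paper_costs = {
    "congestion": 1.05,
    "accidents": 0.63,
    "local_pollution": 0.42,
    "oil_dependency": 0.12,
    "greenhouse_gases": 0.06,
}

# Revised view argued in this post: congestion, accidents, and oil
# dependency are internalized or obsolete, and local pollution has
# fallen by more than half since the 2006 numbers were compiled.
adjusted_costs = {
    "local_pollution": 0.20,   # rough estimate, roughly half of 0.42 and falling
    "greenhouse_gases": 0.06,  # fixed per gallon of gasoline burned
}

paper_total = sum(paper_costs.values())
adjusted_total = sum(adjusted_costs.values())

print(f"Paper's total external cost: ${paper_total:.2f} per gallon")
print(f"Adjusted external cost:      ${adjusted_total:.2f} per gallon")
```

The components do sum to the paper's $2.28 figure, and the two surviving components sum to the 26 cents argued for here.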

It’s worth noting that, though we are seeing an increase in driving due to low fuel prices, the amount of driving we do isn’t all that sensitive to fuel prices. Real gasoline prices doubled between 2000 and 2009, yet per capita driving continued to grow until the recession began. Prices have fallen by 50 percent in the last six months or so, yet the 3 or 4 percent increase in driving may be as much due to increased employment as to more affordable fuel.

This means that, though there may be some externalities from driving, raising gas taxes and creating government slush funds with the revenues is not the best way of dealing with those externalities. I’d feel differently if I felt any assurance that government would use those revenues to actually fix the externalities, but that seems unlikely. I actually like the idea of tradable permits best, but short of that the current system of ever-tightening pollution controls seems to be working well at little cost to consumers and without threatening the economic benefits of increased mobility.

Just days before the Trans-Pacific Partnership is scheduled to be signed by its 12 member governments, an official expert from the UN Human Rights Council released a statement criticizing the agreement for being incompatible with the goals of the UN human rights regime.  The criticism isn’t about the TPP in particular so much as the modern model of trade agreements as an inadequate vehicle for furthering wealth redistribution and massive regulatory intervention to pursue progressive goals.  That is, it’s a complaint about what the TPP doesn’t do.

There are, of course, lots of things the TPP doesn’t do.  Critics have complained that the TPP doesn’t prevent climate change, doesn’t eliminate human trafficking, and doesn’t reform repressive regimes in Vietnam and Brunei.  But these are not things the TPP was ever supposed to do.  It’s like complaining that Obamacare doesn’t end the drug war.

There are legitimate criticisms to be leveled against the TPP—things it does but shouldn’t and things it doesn’t do as well as it should.  There’s also a lot to like.  But debates over trade agreements often get bogged down with unrelated controversies that are easier to argue about.  Not one of the complaints the UN expert makes is explicitly about trade liberalization.  

The statement includes two specific criticisms of the TPP.  One is the secrecy of the negotiations, and the other is investor-state dispute settlement.  These are well-worn, standard complaints opponents of the TPP have been making for years.  The persuasiveness of both arguments relies on reflexive fear of the unknown—opponents can hint at what horrible things might happen from the TPP rather than looking at specific, measurable impacts.

These issues have become so controversial, in fact, that eliminating ISDS from future trade agreements and increasing transparency in negotiations would probably result in more free trade.

The proliferation and prominence of non-trade arguments against trade agreements show that agreements like the TPP have strayed too far away from their core mission.  Using “human rights” as an argument against trade agreements will be harder to do if they focus more on simply eliminating tariffs, quotas, and subsidies.  A debate over the value of protectionism in promoting national and global welfare sounds very appealing and would surely lead to better policy.

I’ve been quite hard on President Obama for his abuse of executive power – and will soon file another brief in the 26-state challenge to his immigration action – but there are certainly things that he or any president can do to protect and secure our liberty without violating the Constitution. One such executive action would be to “declassify” marijuana: remove it from the list of controlled substances (or at least move it further down the list, which would have significant positive legal effects). I explain in this video:

What the President Should Do: Declassify Marijuana

In case you don’t have time to watch, here’s a transcript:

While legalizing marijuana as a matter of federal law would take an act of Congress, President Obama can decriminalize it. He can do this by moving it out of Schedule I of the Controlled Substances Act, which is reserved for substances with no accepted medical use and a high potential for abuse, and which therefore carry high criminal penalties for mere possession.

Virtually all marijuana-related arrests are handled by state and local law enforcement. The federal Drug Enforcement Administration (DEA) simply lacks the resources to enforce the federal ban across all 50 states. That’s why the Justice Department decided not to fight the legalization of marijuana in the handful of states that have taken that step.

President Obama — without rewriting any laws or going outside of his constitutional authority — can direct the attorney general to start the process of reclassifying marijuana as a Schedule IV or V substance, or declassifying it altogether.

Reclassifying marijuana as a Schedule III substance or lower would have significant benefits for the budding marijuana industry and individual users, and declassifying it would solve these problems altogether.

But even merely reclassifying it would make it easier for legal businesses to access the full economy and reduce violent crime.

Marijuana deregulation sits squarely within the control of the executive. The president should use his executive powers to allow for intelligent enforcement of drug policy without eroding the rule of law.

I guarantee that if President Obama does this, he won’t be impeached for high times crimes and misdemeanors.

The majority of federally insured savings and loans failed in the 1980s, wiping out the Federal Savings and Loan Insurance Corporation in 1989.  The fiasco ultimately cost taxpayers around $150 billion to make savings depositors whole.  Two years later, the failures of hundreds of commercial banks put the Federal Deposit Insurance Corporation in the red.  (The FDIC got a bridge loan from the US Treasury, which it eventually repaid.)  It became clear that deposit insurance had fostered immense moral hazard, enabling the growth of unsound S&Ls and commercial banks.

For many reformers these events raised the question of how the core services of banks (intermediation and payments) might be provided without the expense of tax-funded guarantees, and yet without the danger of runs that had prompted the creation of the FSLIC and FDIC.  A number of economists (myself included) pointed to checkable money-market mutual funds (MMMFs) as an alternative to bank deposits that are not run-prone and therefore have no need for taxpayer-funded guarantees.

MMMFs, like other mutual funds and unlike banks, offer savers not debt claims promising specified dollar payouts on specified dates but rather equity claims (shares) in the dollar value of a portfolio.  Like other mutual funds, a MMMF buys back shares on demand at the current “net asset value” or NAV.  The modifier “money-market” means that a fund invests only in fixed-income securities with less than a year in remaining maturity, which means that present-value losses will be negligible from a rise in interest rates.  A fund can keep default and liquidity risks low by maintaining a diversified portfolio of highly rated securities with active secondary markets.

In 1976 Merrill Lynch introduced a MMMF that allowed customers to write checks against their account balances, an innovation which was quickly copied by other funds.  Money-market share accounts now combined the services of checking accounts with much higher returns, because they were not subject to the binding interest-rate ceiling (under the Fed’s Regulation Q) then constraining bank accounts.  To make them seem more like bank accounts, fund providers adopted the convention of pegging the share redemption value or NAV at $1, and varying the number of shares in an account, rather than varying the share price to reflect changes in the value of portfolio assets.  The popularity of MMMFs soared.  MMMFs that hold only Treasury obligations are called “government” funds.  Those that hold mostly commercial paper and jumbo bank CDs are called “prime” funds.

J. Huston McCulloch put the case for MMMFs not needing government guarantees well in a 1993 article: “[E]ven though MMMFs invest in financial instruments that may not come due for many weeks or months, they are entirely run-proof.  Should the volume of withdrawals be high enough” to require net sales that shrink the asset portfolio, “the fund’s liability to its remaining depositors simply falls in the same proportion.”  That is, each MMMF share is a claim not to a fixed dollar sum, but only to a percentage of the portfolio’s value.  A fall in the total value of the asset portfolio, whether from redemptions or from bad-news events that reduce assets’ market prices, immediately reduces the total value of shares so that they never over-claim the available assets.  Any bad-news net market value loss is immediately spread evenly over shareholders rather than being concentrated “on the last unlucky depositors in line, as occurs in a run on a traditional bank.”  With no greater losses falling on the person last in line to withdraw, there is no incentive to run to withdraw ahead of others.  Thus, “as long as MMMFs behave like true mutual funds,” continuously marking portfolio assets and shares to market value, the problem of the me-first incentive to run “cannot arise.”

I made essentially the same argument in chapter 6 of my text The Theory of Monetary Institutions.  There I argued that a run arises from the combination of three conditions: (1) claims are redeemable in pre-specified dollar amounts (i.e. are debts), (2) redemption is unconditionally available on demand, with a first-come first-served rule for meeting redemption demands, and (3) the last claim in line has a lower expected value.  Mutual funds eliminate the first element (claims are equity rather than debt), which is sufficient to eliminate the run problem.  It’s no use rushing to redeem when bad news about the asset portfolio arrives, because your account balance has already been marked down.  They also eliminate the third element (because every share redemption receives the same percentage of the portfolio value) when assets are liquid enough or the fund is small enough to make “fire-sale” losses from asset sales negligible.
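The contrast between debt-style and equity-style claims can be made concrete with a toy calculation. This is an illustrative sketch only (the numbers and function names are invented for the example): a portfolio worth $100 loses $10, and two claimants of $50 each redeem in turn.

```python
def bank_style_payouts(portfolio, claims):
    """Debt claims redeemed first-come first-served at face value
    until the portfolio is exhausted: the last in line bears the loss,
    so everyone has an incentive to redeem first."""
    payouts = []
    for claim in claims:
        paid = min(claim, portfolio)
        payouts.append(paid)
        portfolio -= paid
    return payouts

def fund_style_payouts(portfolio, claims):
    """Equity claims marked to market: every claimant receives the
    same proportion of the (diminished) portfolio value, so position
    in line is irrelevant."""
    total = sum(claims)
    return [portfolio * c / total for c in claims]

# A $100 portfolio falls to $90; two claimants hold $50 of claims each.
portfolio_after_loss = 90.0
claims = [50.0, 50.0]

print(bank_style_payouts(portfolio_after_loss, claims))  # [50.0, 40.0]
print(fund_style_payouts(portfolio_after_loss, claims))  # [45.0, 45.0]
```

Under the bank-style rule the first redeemer escapes whole and the second eats the entire $10 loss, which is exactly the me-first incentive; under the fund-style rule both receive $45, and rushing to redeem gains nothing.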

But wait — doesn’t this argument assume that MMMFs vary the price of their shares like ordinary mutual funds?  Doesn’t it matter that the share redemption value is pegged at $1?  McCulloch explained why it should not matter: “Some MMMFs offer investors a variable number of shares of fixed value instead of a fixed number of shares of variable value.  This is merely a cosmetic difference with no substance, however.”  The problem of claims exceeding portfolio value “arises [only] when funds try to offer investors a fixed number of shares of fixed value.”  In other words, so long as $1 shares are promptly subtracted from each account in proportion to any decline in total portfolio value, or alternatively promptly marked below $1 (an event called “breaking the buck”), there remains no incentive to run.
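McCulloch's point that the fixed-$1 convention is merely cosmetic can be verified in a couple of lines. A toy sketch with invented numbers: an account value is always shares times price, so a 2% portfolio loss can be passed through either way.

```python
# Two cosmetically different ways to pass a 2% portfolio loss to a
# shareholder whose account is worth $1,000. (Illustrative numbers.)
shares, price, loss = 1000, 1.00, 0.02

# (a) Variable share price: keep the share count, mark the NAV
#     below $1 ("breaking the buck").
value_variable_price = shares * (price * (1 - loss))

# (b) Variable share count: keep the $1 peg, subtract 2% of the shares.
value_variable_shares = (shares * (1 - loss)) * price

print(value_variable_price, value_variable_shares)  # identical account values
```

Either way the shareholder's claim falls in step with the portfolio, so neither convention by itself creates an incentive to run.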

In practice, subtracting $1 shares is not done (for reasons not immediately obvious), and breaking the buck has become an occasion to liquidate the fund.  Accordingly parent companies, to keep a MMMF alive and preserve its brand-name capital, almost always choose to eat losses and maintain the $1 share value.  A 2010 report by Moody’s identified 147 occasions over the period 1980-2007 when a MMMF suffered a net decline in portfolio assets that, without a rescue, would require breaking the buck.  Only one fund actually broke the buck.  (It was then liquidated, with shareholders receiving 96.1 cents per share.)  In 146 cases the parent firm stepped in, absorbing losses to keep the share value at $1.  If a parent firm acts immediately, upon news of critical asset losses, either to break the buck or instead to pitch in to preserve the par value, then running to get a better payoff than other shareholders remains either impossible or pointless.

Fast-forward to September 2008.  In the early hours of Monday the 15th, insolvent and without a rescuer, Lehman Brothers filed for bankruptcy.  A money-market fund called The Reserve Primary Fund was caught holding $785 million in Lehman paper, about 1.3% of its $62.5 billion in assets under management.  (This size put it in the top twenty, but outside the top ten.)  An immediate 20% write-down on Lehman paper meant that a $157 million gap needed to be filled immediately if the fund was to have the asset value necessary to maintain its $1 share price.  For the next 24 hours, shareholders ran on the fund.  They did not believe, for good reason as it turned out, reassurances from the fund’s sales reps, repeating what The Reserve’s ownership had said but not done, that the parent company would pitch in to support the price.  By 1pm Tuesday (the 16th) shareholders had redeemed a bit more than a quarter of their claims at $1 per share.  The ownership had dithered and did not fill the hole in the balance sheet.  The fund’s custodian State Street Bank finally refused to make further payouts, and the fund broke the buck.  The Reserve also imposed daily withdrawal limits on its other funds.
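The figures in that account are internally consistent, as a quick check confirms (numbers as reported above):

```python
# Reserve Primary Fund figures, as reported above.
lehman_paper = 785e6   # holding of Lehman commercial paper, in dollars
aum = 62.5e9           # total assets under management, in dollars
write_down = 0.20      # immediate write-down applied to Lehman paper

share_of_assets = lehman_paper / aum   # about 1.3% of the portfolio
hole = lehman_paper * write_down       # gap to fill to keep the $1 NAV

print(f"Lehman paper as share of AUM: {share_of_assets:.1%}")
print(f"Capital hole to fill:         ${hole / 1e6:.0f} million")
```

The $785 million position is about 1.3% of $62.5 billion, and a 20% write-down on it is the $157 million hole the parent never filled.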

Through that week, other prime funds experienced heavier-than-normal redemption outflows.  Other MMMF parent firms, in contrast to The Reserve, immediately supported their prime funds that had Lehman-related losses, and continued to redeem at $1 per share.  No other fund broke the buck.  By the 19th the industry-wide dollar value of assets under management by MMMFs was down by $247 billion, a bit less than 7 percent of the value held ten days earlier.

After these three days of relatively heavy net redemptions following the Lehman bankruptcy and Reserve Primary buck-breaking, on Friday the 19th, the US Treasury stepped in to stanch the redemptions, which it considered equivalent to runs, with something that it considered equivalent to federal deposit insurance.  It announced what Secretary Hank Paulson described as a “temporary guaranty program for the U.S. money market mutual fund industry,” assuring shareholders in participating funds that their shares would be redeemed at $1 even if their fund’s net asset value fell below par.  The Federal Reserve pitched in on September 22 by creating a special “Asset-Backed Commercial Paper Money Market Mutual Fund Liquidity Facility” to lend funds to banks for acquiring the commercial paper assets that MMMFs were shedding.

As later described by Philip Swagel, who was a Treasury official at the time, the MMMF guarantee program was initially funded, in an unprecedented and legally dubious move, from the Treasury’s Exchange Stabilization Fund:

The US Department of the Treasury (2008) used the $50 billion Exchange Stabilization Fund—originally established back in the 1930s to address issues affecting the exchange rate of the US dollar—to set up an insurance program to insure depositors in money market funds. … Use of the Exchange Stabilization Fund for this purpose was plausibly legal—after all, a panicked flight from US dollar-denominated securities could be seen as posing a threat to the exchange value of the dollar—but its use in this way was without precedent.

It should be noted that there was in fact no panicked flight from US dollar-denominated securities in general.  US Treasury securities rose in value during the crisis as investors worldwide considered them a safe haven.  The trade-weighted US dollar index actually rose sharply in the six months after Lehman fell and the Reserve Primary Fund broke the buck.  In its indifference to the rule of law, the US Treasury acted much like the Federal Reserve System did during the crisis.

After one year, the Treasury ended its MMMF guarantee program.  New pricing restrictions, liquidity requirements, and accounting rules have since been imposed on the funds in the name of reducing the problem of runs.  (I will discuss these regulatory changes in my next Alt-M post.)

So what happened in September 2008?  Is the run on Reserve Primary and heavy redemptions at other prime funds evidence that, contrary to McCulloch’s and my argument, prime MMMFs with a fixed $1 share price are in fact inherently fragile?

Stephen G. Cecchetti, former Director of Research at the Federal Reserve Bank of New York, and his co-blogger Kermit L. Schoenholtz have said so:

The fundamental problem facing U.S. regulators is that money market funds are banks in everything but their outward legal form.  They perform liquidity and credit functions that are identical to those of chartered banks; in particular, they offer the equivalent of bank checking deposits, making them vulnerable to a run.

This argument won’t do.  It completely fails to engage the basic counter-argument that checkable equity claims (MMMFs) are not run-prone because they distribute portfolio asset losses in an essentially different way from checkable debt claims (bank deposits).

Useful analysis of the run-proneness of MMMFs is provided by a 2013 comment on SEC rule proposals by the Squam Lake Group, a committee of 13 center-left to center-right financial economists.  They note that a MMMF (like a bank) will be run-prone whenever the aggregate redemption value of its shares or NAV exceeds the actual market value of the fund’s assets, so that early redeemers can expect to get more than late redeemers.  Under current accounting rules for money-market mutual funds (which they abbreviate MMFs), they point out, this can happen for two reasons:

First, mutual funds have the option to account for assets at amortized cost if they have a maturity of 60 days or less.  With that option, the [total redemption value of shares] is not a true reflection of the fair market value of fund assets.  Whenever investors can redeem at a NAV that is higher than the fair value of the assets, investors have incentives to run.

Second, and more fundamentally, prime MMFs invest substantially in assets without a liquid secondary market.  This creates an incentive for fund investors to run during a period of financial stress, because even “fair market value” may exceed by a significant amount the value at which the fund can quickly sell assets to meet investor redemptions.  Therefore, … the first MMF investors to redeem their shares during a crisis are likely to receive a higher price for their shares than those who follow once the fund is forced to meet redemption demands by selling assets that have not yet matured. … This first-to-redeem advantage, which is exacerbated by amortized cost accounting, creates an incentive for MMF shareholders to run.

In other words, MMMFs in September 2008 did not exhibit the immunity to runs that McCulloch and I expected, because the accounting rules did not, as we had assumed they generally would, rule out an excess of aggregate share redemption value over actual asset portfolio value.  Some funds used accounting rules that allowed them not to mark 60-days-or-fewer assets to market at all, and not to mark other assets to a market price corresponding to their actual immediate liquidation value.
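The Squam Lake mechanism can be sketched numerically. In this toy example (invented numbers, not any actual fund's), shares redeem at the $1 amortized-cost NAV even though fair value per share has slipped to $0.99, so each early redemption dilutes the shareholders who remain:

```python
def redeem_at_par(fair_value_total, shares_outstanding, shares_redeemed):
    """Redeem shares at $1 each (the amortized-cost NAV) and return the
    fair value per share left for the remaining shareholders."""
    remaining_value = fair_value_total - shares_redeemed * 1.00
    remaining_shares = shares_outstanding - shares_redeemed
    return remaining_value / remaining_shares

# 100 shares backed by assets with a fair value of $99, i.e. $0.99 per
# share, while the accounting NAV is still pegged at $1.  The first 50
# shares to redeem are paid $1 each:
leftover = redeem_at_par(99.0, 100, 50)
print(f"Fair value per remaining share: ${leftover:.2f}")
```

The first 50 redeemers get $1.00 while the fair value of a remaining share falls from $0.99 to $0.98, which is precisely the first-to-redeem advantage that makes waiting costly and running rational.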

In summary, we learned in September 2008 that MMMFs using certain accounting rules are not run-proof.  For 24 hours The Reserve Primary Fund carried a diminished asset portfolio without either topping it up or diminishing the claims against it, and consequently was rationally run upon.  We did not learn that MMMFs are inherently fragile, but rather that run-proneness depends on the accounting practices that a fund uses.

From this diagnosis, no policy intervention is indicated.  What follows is rather that in a market where losses remain private, investors can be expected to consider the relative fragility, under certain circumstances, of funds that opt to use potentially run-incentivizing accounting practices.  Such funds, if they do not offer some fully compensating advantage, should be expected to lose their market share.  Money-market mutual funds that instead credibly bind themselves to thoroughgoing mark-to-market accounting and other run-proofing practices (such as perhaps a pre-funded commitment by the parent company to shelter shareholders from losses), and advertise that fact, should be expected to flourish in the marketplace.  Such MMMFs remain an available payment mechanism that is not susceptible to runs and therefore has no need for guarantees at taxpayer expense.

To come in a later post: What to make of the US Treasury’s new restrictions on MMMFs?

______________

*Acknowledgment: I thank Kyle Davidson for research assistance.

[Cross-posted from Alt-M.org]

Global Science Report is a weekly feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Second only to high temperatures, supporters of government action to restrict energy choice like to cite “extreme” precipitation events–be they rain, sleet, snow, or hail falling from tropical cyclones, mid-latitude extratropical storms, or summer thunderstorm complexes–as evidence that greenhouse gas emissions from human activities make our climate and daily weather worse.

The federal government encourages and promotes such associations. Take, for example, the opening stanzas of its 2014 National Climate Assessment: Climate Change Impacts in the United States, a document regularly cited by President Obama in support of his climatic perseverations:

This National Climate Assessment concludes that the evidence of human-induced climate change continues to strengthen and that impacts are increasing across the country.

Americans are noticing changes all around them. Summers are longer and hotter, and extended periods of unusual heat last longer than any living American has ever experienced. Winters are generally shorter and warmer. Rain comes in heavier downpours.

President Obama often calls out the extreme rain meme when he is running through his list of climate change evils. His Executive Order “Preparing for the Impacts of Climate Change,” includes:

The impacts of climate change – including…more heavy downpours… – are already affecting communities, natural resources, ecosystems, economies, and public health across the Nation.

So, certainly the science must be settled demonstrating a strong greenhouse-gas altered climate signal in the observed patterns of extreme precipitation trends and variability across the United States in recent decades, right?

Wrong.

Here are the conclusions of a freshly minted study, titled “Characterizing Recent Trends in U.S. Heavy Precipitation,” from a group of scientists led by Dr. Martin Hoerling of NOAA’s Earth System Research Laboratory in Boulder, Colorado:

Analysis of the seasonality in heavy daily precipitation trends supports physical arguments that their changes during 1979-2013 have been intimately linked to internal decadal ocean variability, and less to human-induced climate change…Analysis of model ensemble spread reveals that appreciable 35-yr trends in heavy daily precipitation can occur in the absence of forcing, thereby limiting detection of the weak anthropogenic influence at regional scales [emphasis added].

Basically, after reviewing observations of heavy rains across the country and comparing them to climate model explanations/expectations, Hoerling and colleagues determined that natural variability acting through variations in sea surface temperature patterns, not global warming, is the main driver of the observed changes in heavy precipitation.

They summed up their efforts and findings this way (emphasis also added):

In conclusion, the paper sought to answer the question whether the recent observed trends in heavy daily precipitation constitute a strongly constrained outcome, either of external radiative forcing alone [i.e., greenhouse gas increase], or from a combination of radiative and internal ocean boundary forcing. We emphasized that the overall spatial pattern and seasonality of US trends has been more consistent with internally driven ocean-related forcing than with external radiative forcing. Yet, the magnitude of these forced changes since 1979 was at most equal to the magnitude of observed trends (e.g. over the Far West), and in areas such as the Far Northeast where especially large upward trends have occurred, the forced signals were several factors smaller. From the perspective of external forcing alone [i.e., changes in atmospheric carbon dioxide], the observed trends appear not to have been strongly constrained, and apparently much less so than the efficacy of an external driving mechanism surmised in the National Climate Assessment.

Hoerling’s team tried to say it nicely, but basically they’re saying that the federal government’s assessment of the impacts of climate change greatly overstates the case for linking dreaded carbon dioxide emissions to extreme precipitation events across the United States. (Note: We weren’t as nice when saying that, in fact, the National Assessment Report overstates the case for linking carbon dioxide emissions to darn near everything.)

This is not to say that Hoerling and colleagues deny that an increasing atmospheric concentration of carbon dioxide is supposed to lead to an enhancement of heavy precipitation over the course of the 21st century. (If they denied that, they’d probably be exiled to the federal climatologist rubber room.) Rather, they think that folks (including the president and the authors of the National Climate Assessment) are premature in linking observed changes to date to our reliance on coal, oil, and natural gas as primary fuels for our energy production.

Whether or not at some later date a definitive and sizeable (actionable) anthropogenic signal is identifiable in the patterns and trends in heavy precipitation occurrence across the United States is a question whose answer will have to wait—most likely until much closer to the end of the century or beyond.

Reference:

Hoerling, M., J. Eischeid, J. Perlwitz, X. Quan, K. Wolter, and L. Cheng, 2016. Characterizing Recent Trends in U.S. Heavy Precipitation. Journal of Climate. doi:10.1175/JCLI-D-15-0441.1, in press.

 

In the 1990s, the Clinton administration proposed restructuring our air traffic control (ATC) system, creating a self-funded organization outside of the Federal Aviation Administration (FAA). The idea went nowhere in Congress at the time.

Since then, numerous countries have successfully privatized their ATC systems, including Britain and Canada. Meanwhile, our ATC is still trapped inside the FAA bureaucracy, and it continues to fall short on crucial technology upgrade projects.

The good news is that major restructuring is back on the agenda in Congress. House Transportation and Infrastructure Committee Chairman Bill Shuster is expected soon to unveil a major reform proposal, perhaps along the lines of Canada’s nonprofit ATC corporation, Nav Canada. The FAA must be reauthorized by the end of March, which gives reform some momentum. If President Obama wants an important pro-growth legacy in his final year in office, he should get behind this effort.

Canada’s ATC privatization has been a huge success. In a recent Wall Street Journal interview, the head of Nav Canada, John Crichton, said, “This business of ours has evolved long past the time when government should be in it … Governments are not suited to run … dynamic, high-tech, 24-hour businesses.” Exactly—and for all the reasons I discuss here.

Please join us Thursday for a Capitol Hill forum to discuss these issues (Rayburn B-354, noon). We will hear from two top experts. Dorothy Robyn was a top economic advisor to both Presidents Clinton and Obama, and she wrote an excellent study on ATC reform for Brookings. Stephen Van Beek is a long-time aviation industry expert.  

A popular knock against vouchers and other school choice programs is that private schools do not serve many students with disabilities, whereas public schools serve everyone. If that’s true, then the vast majority of public schools in New York City must actually be private.

According to a federal investigation just rejected by the de Blasio administration, the large majority of New York City elementary schools – 83 percent – are not “fully accessible” to students with disabilities. That forces many disabled students to travel far afield from their local public schools, which are supposed to serve every zoned child. The U.S. Department of Justice’s letter to the city laying all this out contains this anecdote:

In the course of our investigation, we spoke to one family who went to extreme measures to keep their child enrolled in their zoned local school, rather than subject the child to a lengthy commute to the closest “accessible” school. A parent of this elementary school child was forced to travel to the school multiple times a day, every school day, in order to carry her child up and down stairs to her classroom, to the cafeteria, and to other areas of the school in which classes and programs were held.

Of course, it is unrealistic to expect that every school will be able to provide the best possible education for every child – all kids learn different things at different rates and have different strengths and weaknesses – and that is especially true for children with disabilities. Yet while the public schools often fall light-years short of that goal, it is the standard to which public schooling advocates love to hold schools in choice programs. And not only is the standard unrealistic, but vouchers are usually a fraction of the funding public schools get, averaging around $7,000, versus New York City’s nearly $19,000 per pupil.

The scope of NYC’s failure to live up to the ideal is sobering, but revelations of double standards on this front are not new. School districts often pay for kids with the most challenging disabilities to attend private institutions, and there are several choice programs that are, in fact, specifically designed for children with disabilities. But maybe now, before choice opponents attack private schools again, they’ll at least try to get their own house in order. Or in New York City, their hundreds of houses not fully serving disabled children.

Jeb Bush spent at least $14.9 million trying to win the Iowa Republican caucuses, the most of any candidate in either party. He finished sixth.

Will this persuade people that money does not buy elections? Probably not. The belief that “money buys elections” is not really falsifiable. It is a matter of faith.

But perhaps those who believe that money buys elections will now think it is somewhat less probable they are correct.

On Wednesday, February 3, the Senate Environment and Public Works committee will hold a hearing on a new “Stream Protection Rule” being proposed by the Department of the Interior’s Office of Surface Mining (OSM) that looks to be another nail being hammered into the coal industry’s coffin by the Obama Administration.

Energy and mineral resource development in the U.S. is being thwarted by a wave of agenda-driven federal agency rulemakings being rushed through before the end of this administration. Oil, natural gas, and coal have been targeted for replacement by renewable energy sources. For the coal industry, the vehicle is OSM’s fast-tracked proposed new “Stream Protection Rule” (SPR).

The new SPR would supersede the existing Stream Buffer Zone Rule, enacted in 2008 to control the diminishing negative effects of surface coal mining on aquatic environments in the nation’s three largest coal mining regions: Appalachia, the Illinois Basin/Midwest, and the Rocky Mountains/Northern Great Plains. But, as is so often the case in the world of environmental regulation, that was not sufficient for OSM, which over the past seven years has continued to press for more and stricter regulations on coal mining all across the United States.  It seems to prefer a nationwide, one-size-fits-all regulatory enforcement scenario, even though local geology, geochemistry, and terrain vary widely among states and basins.  As it is, these concerns are more efficiently addressed by the states and policed by the industry.

That aside, the real impacts of the SPR, openly acknowledged by OSM, leave tens of billions of dollars’ worth of coal in the ground with no chance of future development—“stranded reserves,” as OSM terms them in the rule. Those coal deposits, according to OSM, “…are technically and economically minable, but unavailable for production given new requirements and restrictions included in the proposed rule.”  Yet, OSM’s engineering analysis, cited by a Congressional Research Service study, states that there will be no increase in “stranded reserves” under the SPR. In other words, the same volume of coal will be mined under the proposed rule as under the current rule…an OSM oversight, no doubt.

The proposed rulemaking rests on questionable geoscience and mining engineering: it overemphasizes the importance of ephemeral streams in order to limit mining activities in all areas, requires needless increases in subsurface drilling and geologic sampling, redefines accepted technical terms such as “approximate original contour” and “material damage to hydrologic balance,” and creates unfamiliar new terms such as “hydrological form” and “ecological function.”

But OSM likely is not focused on technical issues as much as on its main concern: that the new rule be as much more stringent than the existing 2008 rule as possible, and that it apply nationally. Hence, the rule appears to be more for the benefit of regulators, and it places undue burden and expense on coal miners. Neither is OSM overly concerned with the three big tangible adverse impacts of its proposed rulemaking: lost jobs, lost resources, and lost tax revenue—with Appalachia being hit the hardest. Consensus estimates—not OSM’s—put the number of mining-related jobs lost nationally due to the SPR in excess of 100,000, and perhaps upwards of 300,000; the decrease in coal tonnage recovered at between roughly 30 and 65 percent; the annual value of coal left in the ground because of the rule at between $14 billion and $29 billion; and the decrease in federal and coal-state tax bases at between $3.1 billion and $6.4 billion. These are not encouraging statistics for an industry that currently supplies 40 percent of U.S. electrical power generation.

Interior’s Office of Surface Mining has failed to adequately justify its proposed Stream Protection Rule in light of the federal and state rules and regulations already in place. Rather, OSM has embarked on a seven-year odyssey of agenda-driven rulemaking that would force-fit the regional and local characteristics of coal mining operations to a nationwide template. However, Congress and the courts had already established that a uniform nationwide federal standard for coal mining would not be workable, given the significant differences in regional and local geology, hydrology, topography, and environmental factors related to mining operations everywhere. On the non-technical side, OSM does not retreat from its admission in the preamble to the proposed rule that the SPR is politically motivated. Press reports have quoted an OSM official as acknowledging that there was pressure to get the SPR done in this administration’s last year.

Enacting the new SPR would be an ominous threat to a coal mining industry that deserves much better from this or any other future administration. This is one reason why OSM’s proposed SPR has been tagged by the National Mining Association as “a rule in search of a problem.” However, to paraphrase a more appropriate quote: the voluminous Stream Protection Rule is not the solution to the coal industry’s problems—rather the Stream Protection Rule is the problem.

It will be interesting to see how this all plays out in the Senate on Wednesday.

The U.S. Department of Labor’s Occupational Safety and Health Administration (OSHA) is soon set to release new exposure limits for airborne silica dust. The rulemaking has been in the works for about three years, with a final rule scheduled to be announced this year. The silica industry is not enthused.

Silica dust is known to cause respiratory illnesses (e.g., silicosis, lung cancer, other airways diseases) that may contribute to or lead directly to death when it is breathed in high enough concentrations over long enough time periods.

OSHA explains that exposure to respirable silica “occurs in operations involving cutting, sawing, drilling and crushing of concrete, brick, block and other stone products and in operations using sand products, such as in glass manufacturing, foundries and sand blasting.”

OSHA’s proposal, generally, is to lower the existing permissible exposure limits (adopted in 1971) by about 50 percent, dropping them from around 0.1 mg/m3 to 0.05 mg/m3 (specific details here). OSHA explains:

The agency currently enforces 40-year-old permissible exposure limits (PELs) for crystalline silica in general industry, construction and shipyards that are outdated, inconsistent between industries and do not adequately protect worker health. The proposed rule brings protections into the 21st century.

And, as the government likes to claim with all of its regulations, the added restrictions will save lots of lives, and in doing so, will save lots of money:

OSHA estimates that the proposed rule will save nearly 700 lives and prevent 1,600 new cases of silicosis per year once the full effects of the rule are realized.

The proposed rule is estimated to provide average net benefits of about $2.8 to $4.7 billion annually over the next 60 years.

Interestingly, a visit to the Centers for Disease Control in search of deaths from silica inhalation produces this chart graphing silicosis mortality over time. The numbers have dropped considerably over the past 40+ years, and by 2010 had fallen to about 100 or so deaths per year (U.S. residents over the age of 15) attributed to silicosis as either the underlying or contributing cause.

Figure 1. Silicosis: Number of deaths, crude and age-adjusted death rates, U.S. residents age 15 and over, 1968–2010 (Source: CDC).

The CDC data show that silicosis deaths have been declining, and although the decline has slowed, the toll continues to drop under the current OSHA guidelines. Further, the 100 or so deaths occurring annually are only a fraction of the number of lives that OSHA predicts its new regulations will save each year. That’s a pretty neat trick—the new regs are going to save several times more lives than are actually lost!

This means not only that the OSHA mortality benefits from the new regulations are questionable, but so too must be the economic benefits (as they are tied directly to the mortality savings).

The silica industry isn’t taking this lightly.

They contend that the OSHA mortality projections are based on dose-response relationships that are not truly indicative of what is really going on, for several reasons, primarily that they are based upon poor and inadequate data and analyses.

Dose-response curves used by the federal government are notorious for producing forecasts of a much greater health benefit than actually occurs. One reason is that the federal dose-response curves often aren’t actually curves at all, but straight lines, which means the response is assumed to be the same for every dosage increment. This allows government regulatory agencies to claim that continually cranking down the exposure limits will continually produce positive health outcomes.

But more and more research shows that this is not the case. As the dose gets lower, the response often flattens out (i.e., there is a threshold) or may even reverse (low dosages are actually good for you). While this sounds like common sense to most of us (consider sunshine or alcohol), it is a groundbreaking notion in the field. Much of the research on this theory is being done by Dr. Edward Calabrese of the University of Massachusetts, an adjunct scholar at Cato’s Center for the Study of Science.
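The contrast between the two assumptions is easy to see in a toy model. The slope, threshold, and exposure figures below are invented for illustration; they are not taken from OSHA’s analysis or from any epidemiological study.

```python
# Two hypothetical dose-response models for an airborne contaminant.
# Slope and threshold values are invented for illustration only.

def linear_deaths(dose_mg_m3, slope=7000.0):
    """Linear no-threshold model: predicted annual deaths are
    proportional to dose at every level."""
    return slope * dose_mg_m3

def threshold_deaths(dose_mg_m3, threshold=0.09, slope=7000.0):
    """Threshold model: no predicted effect below the threshold dose."""
    return slope * max(dose_mg_m3 - threshold, 0.0)

old_pel, new_pel = 0.10, 0.05   # exposure limits, mg/m3

# The linear model credits the tighter limit with half the deaths...
saved_linear = linear_deaths(old_pel) - linear_deaths(new_pel)

# ...while the threshold model credits it with far fewer, because only
# exposure above the (hypothetical) 0.09 threshold does any harm.
saved_threshold = threshold_deaths(old_pel) - threshold_deaths(new_pel)

print(round(saved_linear), round(saved_threshold))
```

With these made-up numbers the linear model credits the tighter limit with 350 lives a year, the threshold model with only 70; cut the limit further still and the linear model keeps promising gains while the threshold model promises none.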

A leading industry group (the National Industrial Sand Association, NISA) makes a strong case that the existing OSHA standards are quite effective at greatly reducing, or even eliminating, the occurrence of silicosis. So really, all OSHA needs to do is unify the existing regulations and better ensure that they are enforced.

NISA has commissioned a scientific study of the dose-response behavior that is wider in scope and includes a greater and more detailed amount of epidemiological data than the existing studies relied upon by OSHA. From a NISA report:

While the association between silicosis and exposure to respirable crystalline silica is indisputable, there is still considerable uncertainty regarding the dose/response relationship of this association, particularly in the case of chronic simple silicosis, which is the most common form of silicosis.  It is unclear, for example, whether there is an effect threshold, or whether instead the dose/response curve is linear at even the lowest doses.  The slope of that curve is also uncertain.  As a result, there is uncertainty regarding the degree of risk remaining at various 8 hour time-weighted average exposures, including, most importantly, the current PEL of [0.10 mg/m3].  

In NISA’s view, this degree of uncertainty is unacceptable for a rulemaking of this magnitude.

The industry is being up-front about their commission (involving scientists at major universities involved in silica research) and has taken steps to be open and transparent about their involvement—or rather lack of involvement—in the study’s outcome. 

But the results of the new study, commissioned several years ago, are still forthcoming, and consequently the industry has asked OSHA to wait until they are available (expected later this year) before issuing its final rule.

It’ll be interesting to see how this plays out, but preliminarily this looks like another case of the government solving a problem that doesn’t really exist: the existing regulations are sufficient to address the health concerns if better applied and enforced, and the promise of the new regulations is greatly overplayed, claiming to save many times more lives than the CDC says silicosis takes away.

But that’s the government at work—if some regulations are good, more must be better. Sadly, for the taxpayers and the regulated industries, it doesn’t always work out that way.

As I recall from my time in the Senate, there’s nothing like an energy bill to attract misguided proposals.  This week the Senate begins consideration of S.2012 — the Energy Policy Modernization Act of 2015.  Among the almost two hundred filed amendments is a proposal (Amendment #3042) from Senator Isakson, a former real estate broker, to mandate that the Federal Housing Administration (FHA) reduce the quality of its loans in order to encourage more efficient energy use.

The two most concerning aspects of Amdt 3042 are: 1) it would allow “estimated energy savings” to be used to increase the allowable debt-to-income (DTI) ratio for the loan; and 2) it would require “that the estimated energy savings…be added to the appraised value…”

These changes might not be so bad in the abstract, but combined with existing FHA standards they set the borrower up for failure and leave the taxpayer holding the bag. Let’s recall that borrowers can already get an FHA mortgage at a loan-to-value (LTV) ratio of 96.5%, and that’s assuming an accurate appraisal.  If borrowers were required to put 20 percent down, this amendment would be a minor problem, but under existing standards borrowers would most likely leave the table with an LTV over 100%, that is, already underwater before they’ve even moved in.  Did Congress learn nothing from the crisis?
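A toy calculation (all figures hypothetical) shows how adding “estimated energy savings” to the appraised value can leave an FHA borrower underwater on day one:

```python
# Hypothetical FHA loan under Amdt 3042's appraisal adjustment.
# The house price, appraisal, and energy-savings figure are invented.

price = 200_000                 # what the buyer actually pays
appraisal = 200_000             # appraised value before any adjustment
energy_savings = 10_000         # "estimated energy savings" added to value

adjusted_value = appraisal + energy_savings   # 210,000 on paper
loan = 0.965 * adjusted_value                 # FHA's 96.5% maximum LTV

# Against the inflated paper value the loan shows a 96.5% LTV, but against
# the price the home actually sold for, the borrower owes more than 100%:
true_ltv = loan / price
print(round(true_ltv * 100, 1))   # LTV as a percentage of the real price
```

An accurate appraisal plus a 3.5% down payment leaves the borrower with a sliver of equity; folding a speculative savings estimate into the appraisal erases it before the first payment is made.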

The increase in DTI might not matter if FHA did not already allow a DTI as high as 43% of income.  Under Amdt 3042 borrowers could easily leave the closing table devoting over half their income to their mortgage.  Again, did Congress learn nothing from the crisis?

To illustrate that the intent of the proposal is to have the taxpayer take more risk, Amdt 3042 actually prohibits FHA from imposing any standards that would offset this risk.  If these new loans perform worse, as one would expect, FHA cannot put them back to the lenders.  And let’s not forget that FHA allows the borrower to have a credit history deep in subprime territory.  So you could have a subprime borrower (say, a FICO score down to 580) with an LTV over 100% and a DTI over 43%. What could go wrong?

If energy savings actually increased the value of the home, that would be reflected in the price; there would be no need to mandate it.  Not only does this proposal weaken FHA standards and expose the taxpayer to greater risk, it takes us further down the path of an already politicized housing policy, where instead of relying on market prices, values are dictated by Soviet-style bureaucratic guesswork.

Afghanistan is a bust. The Taliban is expanding its control. The number of “security incidents” was up a fifth in the last months of 2015 over the previous year. Popular confidence is at its lowest level in a decade. U.S. military officers now speak of a “goal line” defense of Kabul.

While the deadly geopolitical game is not yet over, it is hard to see how the current regime can survive without Washington’s continued combat support. The nation-building mission was always quixotic.

Indeed, the latest report from the Special Inspector General for Afghanistan Reconstruction shows how far this Central Asian land was and remains from developed status. And how ineffective U.S. aid programs have been in transforming it.

While Afghanistan enjoyed some boom years in the flood of Western cash, the foreign money also inflamed the problem of corruption. The Stockholm International Peace Research Institute explained: “The significant amount of aid and vast international military spending post-2001 has re-ingrained a culture of aid-rentierism: the Afghan elite competes internally for political rents from the international community.”

Tougher times have not increased honesty. In its latest quarterly report, SIGAR noted that a recent Afghan task force “reportedly found that millions of dollars were being embezzled while Afghanistan pays for numerous nonexistent ‘ghost’ schools, ‘ghost’ teachers, and ‘ghost’ students.”

Even worse, the same practice apparently afflicts the security forces. SIGAR cited an Associated Press investigation: “In that report, a provincial council member estimated 40% of the security forces in Helmand do not exist, while a former provincial deputy police chief said the actual number was ‘nowhere near’ the 31,000 police on the registers, and an Afghan official estimated the total ANDSF number at around 120,000—less than half the reported 322,638.”

Security never has been good during the conflict. Today it is worse than ever.

Explained SIGAR: “The Taliban now controls more territory than at any time since 2001. Vicious and repeated attacks in Kabul this quarter shook confidence in the national-unity government. A year after the Coalition handed responsibility for Afghan security to the Afghan National Defense and Security Forces (ANDSF), American and British forces were compelled on several occasions to support ANDSF troops in combat against the Taliban.”

Yet the failure of U.S. aid programs reaches well beyond insecurity. Despite pouring $113.1 billion into Afghanistan, Washington has surprisingly few sustainable, long-term benefits to show for it.

Citing just a few of its earlier audits, SIGAR reported on Afghan government agencies suffering from “divergent approaches and a lack of overall strategy, poor coordination and limited information sharing,” and unable to “handle contract research, awards, and management.” U.S.-funded “power and water systems [were] inoperable for lack of fuel” while an industrial park had minimal occupancy.

Its latest audits yielded little better results.

USAID devoted $488 million to develop Afghanistan’s oil, gas, and minerals industries. SIGAR found “limited progress overall.” Afghan ministries weren’t committed to reforms, “many mining operations are still controlled by political elites, warlords, military personnel, and the police,” transportation networks were inadequate, and several projects showed no results.

Tens of millions of dollars went for training and equipping an Afghan National Engineer Brigade. The NEB was hampered by “army staff on leave for holidays, political events, low literacy levels, and security concerns.” The brigade “lacked initiative” and “was not capable of carrying out its mission.”

Some $2.3 billion in USAID money went for stability programs, yet, said SIGAR, “villages that received USAID assistance showed a marked decrease in their stability scores relative to the overall decrease in stability scores for both villages that did and those that did not receive USAID assistance.”

The official line remains positive. On one of my visits to Afghanistan a Marine Corps officer warned me that “everyone is selling something.” Private reports were different from the glowing reviews offered by my NATO handlers.

As I point out on Forbes: “The U.S. has been fighting in Afghanistan for more than 14 years. It’s time to bring home the troops. No more Americans should die in Afghanistan for nothing.”

Secretary John Kerry went to Beijing to again lecture his hosts about the need for China to pressure North Korea over the latter’s nuclear program. As expected, his mission failed. The Xi government again proved unwilling to threaten the survival of the Kim dynasty.

Immediately after Pyongyang’s fourth nuclear test Kerry attacked Beijing’s policy: it “has not worked and we cannot continue business as usual.” Even before Kerry arrived the PRC made clear it disagreed. “The origin and crux of the nuclear issue on the Korean Peninsula has never been China,” said a Ministry of Foreign Affairs spokeswoman: “The key to solving the problem is not China.”

While he was in Beijing she cited the behavior of other parties as “one major reason why the denuclearization process on the peninsula has run into difficulties.” Beijing officialdom has shown plenty of irritation with the Democratic People’s Republic of Korea, but China has demonstrated it has yet to be convinced to destroy its own ally and strengthen America’s position in Northeast Asia.

Kerry made the best of an embarrassing situation when he announced that the two sides agreed to an “accelerated effort” by the UN Security Council to approve a “strong resolution that introduces significant new measures” against the DPRK. No one should hold their breath as to the nature of those “measures,” however.

Foreign Minister Wang Yi echoed Kerry in supporting passage of “a new resolution,” but added the devastating caveat: “In the meantime, we must point out that the new resolution should not provoke new tensions in the situation, still less destabilize the Korean peninsula.” Wang explained that “Sanctions are not an end in themselves” but should encourage negotiation, not punish.

As I point out in National Interest: “If Kerry wants the Chinese to follow U.S. priorities, he must convince them that America’s proposals advance Chinese interests. Which means explain to them why they should risk destroying their one military ally in the region, with the possibility of creating chaos and conflict next door and adding the entire peninsula to America’s anti-China alliance network.”

Good luck.

In 1950, the PRC went to war with the U.S. to preserve the North Korean state and prevent American forces from advancing to the Yalu River. Even today Beijing wants to see a united Korea allied with Washington about as much as it desires to have a nuclear North Korea.

Indeed, even without a U.S. garrison, a more powerful ROK would pose a challenge to the PRC. Moreover, Beijing’s favored economic position in the North would disappear as South Korean money swept away Chinese concessions.

Worse, the process of getting to a reunified Korea could well be disastrous. Nothing in the DPRK’s history suggests a willingness to gently yield to foreign dictates. In the late 1990s the regime allowed a half million or more people to starve to death. Whatever China threatens, Kim Jong-un might simply say no and cling to power, irrespective of the human cost.

If China ended up breaking a recalcitrant Kim dynasty by sanctioning oil and food, the result could be extraordinary hardship and armed factional combat followed by mass refugee flows across the Yalu—multiply the desperation and number of Syrians heading to Europe. Then toss in loose nuclear weapons and a possible South Korean/U.S. military push across the Demilitarized Zone to force reunification.

The result would be a first-rate nightmare for Chinese President Xi Jinping. So, explain to me again, Secretary Kerry, why my country should ruin its geopolitical position to further Washington’s ends?

If John Kerry’s private message was the same as his public pronouncements, he had no hope of winning Chinese support for taking decisive action against the DPRK. Next time he visits he should employ the art of persuasion—or stay home.

Possibly the strangest foreign policy decision the Obama administration has made was its decision to support the Saudi-led war in Yemen. The White House has made quiet counterterrorism operations a key plank of its foreign policy agenda, and the administration includes a number of officials best known for their work on human rights issues, most notably Samantha Power. As such, the president’s decision to supply logistical, intelligence, and targeting support for the Saudi-led coalition’s military campaign – a campaign that has been horrifically damaging to human rights inside Yemen, as well as detrimental to U.S. counterterrorism goals – was deeply surprising.

Less surprising was the fact that the conflict has turned into a disastrous quagmire. Yemen was already arguably a failed state when the intervention began in April 2015. The power transition negotiated in the aftermath of the Arab Spring was weak and failing, with Yemen’s perpetual insurgencies worsening the situation. Since the intervention began, the United Nations estimates that over 21 million Yemenis have been deprived of life’s basic necessities. Thousands have been killed. Even more concerning, United Nations monitors reported to the Security Council that they believed the Saudi-led coalition may be guilty of crimes against humanity for its indiscriminate air strikes on civilians.

Strategically, the coalition has made few gains. Despite the terrible loss of life, the coalition has stalled south of the capital, Sanaa. Further advances will be exceedingly difficult. At the same time, Al Qaeda inside Yemen has grown in strength and size, benefitting from the conflict, and even presenting itself as a viable partner for the Saudi coalition. It is hard to see how U.S. strategic interests – counterterrorism, human rights, or even regional stability – are being served by this conflict.

So what should the president do? In his last few months in office, President Obama should take advantage of his executive power to end U.S. support for the war in Yemen, and direct America’s diplomats to aggressively pursue a diplomatic settlement. This war is a humanitarian disaster and a strategic failure; ending our support for it should be a no-brainer.

What the President Should Do: U.S. Support in Yemen

Gabriel Roth, who turns 90 years young today, is a rock star among transportation economists, and a special inspiration for those of us who support reducing the federal government’s role in transportation. According to his C.V., Roth earned degrees in engineering from London’s Imperial College in 1948 and economics from Cambridge in 1954.

In 1959, he began research into improved road pricing systems. This led to his appointment to a Ministry of Transport commission that published a 1964 report advocating pricing congested roads in order to end that congestion.

In 1966, the Institute for Economic Affairs published his paper, A Self-Financing Road System, which argued that user fees should pay for all roads, and not just be used to relieve congestion. Roads should be expanded, Roth noted, wherever user fees exceeded the cost of providing a particular road, but not elsewhere.

In 1967, Roth moved to the United States to work for the World Bank, where he did road pricing studies for many developing countries and cities, including Bangkok, Manila, and Singapore. After leaving the World Bank in 1987, he continued to work as a consultant until 2000, among other things helping design the Dulles Toll Road and writing Roads in a Market Economy, a book published in 1996.

Since then, he has been a regular participant in transportation conferences, meetings, and hearings. He edited a 2006 book, Street Smart, co-authored a 2008 paper showing how electronic tolling could be done without invading people’s privacy, and made a presentation about tolling at the 2010 American Dream conference.

My home state of Oregon is now experimenting with mileage-based user fees, and I’m one of the volunteers in this experiment. If it goes well, we may see the realization of Roth’s ideas before he turns 100.

I hope to see Gabe on my next trip to DC. I know I’ll be able to find him by looking for the nearest transportation conference.

Quite a number of media fact-checkers tripped over Ted Cruz’s claim in last night’s debate that Barack Obama had “dramatically degraded our military,” and Marco Rubio’s related pledge to rebuild a U.S. military that is “being diminished.”

The Dallas Morning News noted that “amounts spent on weapons modernization are about the same as they were when Republican George W. Bush was president.” Meanwhile, to the extent that the military’s budget “is being squeezed,” they wrote, it is because of “the insistence of lawmakers in both parties that money be spent on bases and equipment that the Pentagon says it doesn’t need.”

Politico’s Bryan Bender (accessible to Politico Pro subscribers) concluded that while Cruz’s “facts may hold up to scrutiny…they are nonetheless misleading.” Bender pointed out that “Military technology has advanced significantly in the last quarter century and combat aircraft and warships are much more precise and pack a more powerful punch.” Politifact agreed, rating Cruz’s claim “Mostly False.”

Ultimately, alas, whether the U.S. military has been severely degraded is a judgment call. Relative to what? And when? And what does that mean for U.S. security?

But while the answers to such questions are subjective, the facts on spending are not. Undaunted by the realization that committed partisans are unlikely to be converted by them, I’m also doing my part to try to inject some facts into the debate over the Pentagon’s budget. A few weeks ago, I posed five sets of questions to the candidates at The National Interest’s Blog, The Skeptics, including, for those calling for more military spending:

Why would you spend more? What is the United States unable to do right now to preserve its security because it isn’t spending enough? To what extent is insufficient military strength the critical factor explaining America’s inability to achieve satisfactory results with respect to an array of challenges, from destroying ISIS, to repairing failed states, to halting North Korea’s nuclear program?

This morning at TNI, I offered my take on whether lower military spending as a share of GDP is to blame for the U.S. military’s supposed precipitous decline. I’m skeptical.

For one thing, the Pentagon’s base budget, excluding the costs of our recent wars, remains near historic highs. Under the bipartisan Budget Control Act passed in 2011, and as amended in 2013 and late 2015, U.S. taxpayers will spend more on the military in each of the next five years ($510 billion) than we spent, on average, during the Cold War ($460 billion). Those figures are adjusted for inflation. And the actual gap between what we spend now, and what we spent then, will be larger, because the BCA doesn’t cover war costs.

Meanwhile, it isn’t even true that spending under Barack Obama is lower than under George W. Bush. In inflation-adjusted dollars, military spending – both war and non-war – averaged $606 billion per year during Bush’s two terms in office; under Obama, it has averaged $668 billion. The United States will have spent nearly $500 billion more in the period 2009-2016 than from 2001-2008 ($5.3 trillion vs. $4.8 trillion).

So the most important question, it seems, is: why is more spending leading – in the estimation of Cruz, Rubio, Jeb Bush, and any other candidate who wants to spend more on the military – to less capability? A smaller Army. A smaller Navy. Fewer Air Force planes.

Do fewer troops and ships and planes imply that the military is dramatically degraded? Not necessarily. The troops are better trained than a generation ago. The ships are more capable. The weapons are more accurate.

We should not assume that less military spending – if spending did decline – would necessarily lead to a less capable military. Meanwhile, there are many possible explanations for why militaries degrade over time – for example, fighting foolish, unnecessary wars. Far fewer American troops are being killed and wounded in Iraq and Afghanistan now than in 2008.

I conclude at TNI:

it isn’t obvious that a more costly force is needed to preserve U.S. security and protect vital U.S. interests. That we are spending less as a share of GDP than at some points in U.S. history does not necessarily mean that we should spend more. It could also be true that we are spending less and getting more, or that we could safely get by with less. Once we get beyond the confusion over different ways to measure our spending, let’s examine what the U.S. military truly must do in order to keep Americans safe, and how much that will cost.

Read the whole thing here.

Did our message finally get through? (See “How ADA-for-the-Web Regulations Menace Online Freedom,” 2013). Or that of other commentators like Eric Goldman, who warned (of a related court case) that “all hell will break loose” if the law defines websites as public accommodations and makes them adopt “accessibility”? At any rate, the U.S. Department of Justice, after years of declaring that it was getting ready any day now to label your website and most others you encounter every day as out of compliance with the ADA, has suddenly turned around and done this:

In an astonishing move, the Department of Justice (DOJ) announced that it will not issue any regulations for public accommodations websites until fiscal year 2018 — eight years after it started the rulemaking process with an Advanced Notice of Proposed Rulemaking (ANPRM).

Yes, eight years is a very long time for a rulemaking, especially one pursuing issues that have been in play for many years (that link discusses testimony I gave in 2000). And predictably, some disability interest-group advocates are already charging that the latest delay is “outrageous” and shows “indifference.” More likely, it shows that even an administration that has launched many audacious and super-costly initiatives in regulation has figured out that this one is so audacious and super-costly that it should be – well, not dropped, but left as a problem for a successor administration.

Besides, as so often happens, for regulated parties the issue is (to borrow a phrase) not freedom from obligation, but freedom from specification as to what that obligation might be. Court decisions, which for years ran mostly against ADA advocates’ “public accommodations” claim, now point confusingly in both directions. And in the meantime both private litigants and DoJ itself continue to sue online providers and fasten new settlements and decrees on them, as when Amazon lately agreed to caption more videos for the deaf; Harvard and MIT, meanwhile, were still being sued for the audacity of having offered uncaptioned online courses to the public. Minh Vu and Kristina Launey of Seyfarth Shaw:

…since issuing that [2010] ANPRM, DOJ’s enforcement attorneys have investigated numerous [entities claimed to be] public accommodations, pressuring them to make their websites accessible. DOJ even intervened in recent lawsuits (e.g., here, here, and here) taking the position that the obligation to have an accessible website has existed all this time in the absence of any new regulations.

The next administration – or better yet Congress – should summon the courage to give a firm and final No.
