Wednesday, August 31, 2011

The Fed: what's inside counts, too

So I was reading through this Felix Salmon post about newfangled cash transfer systems, and a bit near the end caught my eye:
[I]t’s downright idiotic that the Fed doesn’t step up to the plate and take on its natural role as guardian of the national payment system. Why doesn’t it? I’m not sure, but I suspect it’s something to do with the fact that the Fed doesn’t really exist as a unified body: there’s just a network of regional federal reserve banks, with a board of governors in Washington.
Now maybe it's just me, but this seems like a much bigger, more important point than anything to do with payment systems. And I'll admit it's something I haven't really thought about before. He's right: when we talk about "the Fed," we think of a monolithic entity directing monetary policy from behind closed doors. But "the Fed" is just shorthand for "the Federal Reserve System." That's important to remember when you're trying to explain why it has or has not chosen to take a certain course.

Often, lately, the course taken by the Fed(eral Reserve System) has been "timidity." Which makes sense, really. Dispersal of authority doesn't typically lead to strong, forward-thinking leadership. Bernanke isn't the King of the Federal Reserve, he's the Chairman of the Board, and it's worth wondering how that changes his approach to issues like payments... and everything else. When you're talking about a committee of just seven people, psychology matters.

Congress has granted the Federal Reserve broad responsibilities and broad authority. It is subject to few external limitations when pursuing its many mandates. But might it face internal limitations that prevent it from exercising the full scope of its authority? Just as importantly, how would we know?

If the Fed's structure inhibits it from directing something important and obvious like the implementation of a national payment system, what else does its structure inhibit it from doing? Considering the Fed's centrality to the US economy, this seems like something we should figure out.

Monday, August 29, 2011

Facebook Halfheartedly Responds

I previously posted about Google+'s "circles" and the superiority of Facebook's "everyone sees all posts" system.

It seems that Facebook has added a feature to its status-update system that allows a user to limit who sees the update. This won't change Facebook in a meaningful way, for two reasons. First, unlike on Google+, very few people on Facebook have their "Friends" meaningfully divided into groups, which makes posting to a limited audience kind of cumbersome. Second, a big share of Facebook posts seem to be on other people's walls, which everyone can see.

As I argued in my previous post, Facebook's information sharing is superior to Google+'s. Normally I'd be a little concerned about Facebook shifting toward a more Google+-style model, but Facebook doesn't seem to be promoting this new posting option very hard, and at first glance it looks like one of their typical feints toward increased privacy. Next week they'll probably go back to secretly making our Social Security numbers public or something like that.

Sunday, August 28, 2011

Things that are still true: we don't have a borrowing problem

Yglesias on real interest rates for 5 and 7 year Treasuries, which have been and remain negative:
This is an extraordinary situation that ought to be dominating the public debate. What does it mean? Well it means that right now it’s much cheaper for the government to finance some undertaking by borrowing the money and paying for it out of taxes five or seven years from now than to pay for it with taxes. And yet right now not only does the federal government do lots of things, it collects a fair amount of taxes to pay for those things to be done. This is perverse. In your personal life, paying for all your consumption by wracking up huge amounts of credit card debt rather than working would be a terrible idea. But the reason it’s a terrible idea is that the credit card company charges you a high interest rate. Were the credit card company to instead charge you a negative interest rate, it would be borderline insane to pay your bill in a timely manner. But not only are we paying some of our bills on time even though it would be cheaper to not pay them, our present fiscal policy debate is pathologically focused on the idea that we’re borrowing too much money.
I don't have much to add, other than to re-emphasize how stupid a time this is to worry about balancing the budget. Still, it's unsurprising that conservatives remain obsessed with the deficit in the face of basic market facts: their position has nothing to do with intelligent fiscal policy and everything to do with a values-centric worldview in which borrowing is a sign of moral weakness. They've decided upfront that being a debtor is bad; everything else is negotiable.

Friday, August 26, 2011

How not to build a memorial

The Martin Luther King, Jr. Memorial, long in the works, finally opened in DC the other day. And while I obviously haven't seen it in person, if the pictures I've seen are any indication, it, well, kinda blows. The guy looks furious! Somewhere between his steely gaze and his arms-crossed, don't-screw-with-me posture, the whole thing radiates "non-violent protest" less than it conveys that King is about to break heads until Montgomery desegregates its bus system.

The Times agrees. It published a long "Memorial Review" yesterday, which spent a while dancing around the subject but ultimately found itself unable to avoid the truth:
And the mound’s isolation from any other tall objects, its enormity and Dr. King’s posture all conspire to make him seem an authoritarian figure, emerging full-grown from the rock’s chiseled surface, at one with the ancient forces of nature, seeming to claim their authority as his. You don’t come here to commune with him, let alone to attend to the ideas the memorial’s Web site insists are latent here: “democracy, justice, hope and love.” You come to tilt your head back and follow; he, clearly, has his mind elsewhere.

It is difficult to know precisely why all this went wrong, or why this memorial never alludes to the fundamental theme of Dr. King’s life, equal treatment for American blacks. It strives for a kind of ethereal universality, while opposing forces pull it in another direction.
And as the Times notes, the boneheadedness of the King memorial seems to be part of a larger trend:
Many recent memorials proliferating along the Mall have trivialized or mischaracterized their subjects. The World War II memorial seems almost phony, with its artificial allusions to antiquity; the Roosevelt Memorial diminishes that president and even implies that he was a pacifist (featuring his words “I hate war”) instead of a wartime leader responsible for building up the “arsenal of democracy.” Why shouldn’t Dr. King, too, be misread — turning the minister into a warrior or a ruler, as if caricaturing or trying too hard to resemble his company on the Mall?
I haven't seen the Roosevelt memorial in years, but I agree that the WWII memorial is quite stupid.

The thing is, all this misguided memorializing has a pretty obvious cause: it's the socially conservative opposition to modern or abstract art.

Complex past events and figures, almost by definition, are going to trigger a diverse range of memories and emotional responses from different people. That's how historical memory works. So while some people may remember King as an imposing figure, standing resolute over modern America, others view him as a kind and benevolent force for good. And still others see him as a flawed man who helped undo a very evil system. None of these viewpoints is incorrect, and a successful memorial would find some way to incorporate them all.

That's hard to do when your memorial must include a statue of the personage in question. I suppose the artist could strive for a Mona Lisa-style ambiguity of affect. But the process of chiseling a person into stone usually only makes them more one-dimensional.

Modern art, by contrast, is basically designed to create an interpretive, personalized experience. What you take away from a piece of abstract art is highly dependent on what you bring to it. As such, it is ideal for memorials and monuments.

Exhibit A: the Vietnam Wall, DC's most effective memorial. The Wall doesn't tell us what to think about Vietnam (if it did, it would be a certain failure, given the disagreement over that war) but people flock to it, and are deeply moved by it, anyway. (Sidebar: I suspect that the initial backlash to the Wall's strange design is largely to blame for the current mediocrity of DC memorials.)

Exhibit B: The Holocaust memorial in Berlin. This is actually the most emotionally affecting memorial I've ever seen. It's hard to describe, so I've added a picture. But the amazing thing about the Holocaust memorial is how malleable it is. It variously looks like grave stones, box cars, or coffins.

It gets much deeper towards the middle. Down in the center, the memorial forms an accidental maze, where it's easy to get separated from companions. That's an unnerving experience, because while you're not lost, people tend to cut in and out of your vision unexpectedly.

I've made it sound completely depressing, but it's not -- just somber and disorienting. And children are allowed to play around the edges, and do -- climbing the stones and jumping from one to the next. It's an amazing experience, but more importantly, it's a different experience for everyone who goes.

No one's ever going to say that about the King memorial. The King memorial isn't a place for reflection at all. It just tells us "This guy was important." It's propaganda.

Wednesday, August 24, 2011

Sub-national Entities the Right Way

I posted earlier about the inefficiency of having major American cities straddle multiple states. I hypothesized that things may be different in Europe. Turns out they are.

Germany is made up of 16 states. But check this out: the two biggest cities (Berlin and Hamburg) are each their own state. How cool is that? A brief look at a map of the German states also reveals that most major cities in Germany sit right in the center of their state and are usually its capital.

In the US, most state capitals are in the geographic center of the state. There's a certain logic to this: it's easiest for everyone in the state to reach. The problem is that most major cities sit in a corner of the state, at the mouth of a river, so you end up with situations where a government in Albany makes decisions for a state whose population resides overwhelmingly in a single city in the corner of the map.

Germany doesn't have this problem because the states are basically major cities and the surrounding area. If you look at France it's similar.

Today's best/most insufferable tweet

File this hilariously arrogant little gem under "Roubini being Roubini." But also, he's right. The general public is subjected to a lot of crank economic theory by laymen with political agendas. Among academic economists, opinion is a lot less diverse, and a lot more Keynesian -- partly because of all the evidence in Keynes' favor. One of the tragedies of the past two-odd years is the right's success in disguising this fact.

Anyway, the link is to Dylan Matthews's rundown of stimulus studies, which actually is great, and definitely is worth a perusal.

Guest Post: Redrawing the States

Matt Yglesias had an interesting post over the weekend about the Tappan Zee Bridge. What I found most interesting was this part at the end:
The fact that metropolitan areas in the northeastern United States are often divided among more than one state tends to complicate regional planning in an unfortunate way.
He's right, but I think the problem exists outside the northeastern United States too. Throughout history, humans have been better at moving resources over water than over land. As a result, we have tended to settle and build cities on rivers and in bays. In the US we also used rivers to help divide up the states. This left us with a number of cities that have suburbs in two or three different states. Many are in the Northeast (New York, DC, Philadelphia), but they also exist in the Midwest (Cincinnati, St. Louis, Kansas City).

I don't know enough about countries in Europe, but I imagine the problem exists there as well, though they might not have the same complex federalism problems. Canada is really big and has only a few major cities, so it's not really an issue there. Here's a cool map showing the US divided by population:

Tuesday, August 23, 2011

Random observation

This blog sure has taken a neo-liberal turn in the last few days. There's a post defending the market pricing of labor, a post waxing poetic about traditional property rights, a post extolling the virtues of military intervention, a post siding with producers and suppliers over consumers, and a post musing about whether American interests require the assassination of world leaders.

And all that, before the coming series of posts about how Minneapolis should ease the zoning and housing regulations currently frustrating Uptown landlords.

I'm almost embarrassed.

Guest Post: Gladwell + Grantland = about what you'd expect

Malcolm Gladwell wrote a post for Grantland recently discussing the NBA lockout situation. As with most Gladwell articles, the findings of a psychological study are cherry-picked and used to explain something else. Analogies are also used.

In this article Gladwell explains that owning a sports team isn't like other businesses, because the owners get enjoyment out of ownership. He says that owning a sports team has a "psychic value" above and beyond the "business value" of the team. Because of that, buying a team is more like buying a piece of art: part of the price one pays goes toward the enjoyment of it, and a purchaser shouldn't expect to make money off his investment:

But of course an owner is only losing money if he values the psychic benefits of owning an NBA franchise at zero — and if you value psychic benefits at zero, then you shouldn't own an NBA franchise in the first place. You should sell your "business" — at what is sure to be a healthy premium — to someone who actually likes basketball.

I would agree that owning a basketball team is really fun. But lots of jobs seem pretty fun. Steve Jobs seems to enjoy walking out on stage and showing off the new iPhone, but that doesn't mean he deserves to get paid less. I bet being a writer for the New Yorker would be pretty cool, but you don't see Gladwell giving his books away. I like writing this blog post, but I wouldn't be opposed if 4:17am editor Will paid me for my contribution. Most importantly, playing basketball for a living against the best players in the world seems like a lot of fun, but I don't see Gladwell arguing that the players should take a pay cut, or quit basketball and get jobs hanging light fixtures, painting ceilings, and helping people get goods off the highest shelves at the supermarket.

Some jobs are fun, some jobs suck. I'm sure this is incorporated in some way into compensation, but the idea that NBA owners are unique in this regard is wrong.

I'm generally against the owners in the lockout. The teams' values are increasing, and revenue losses stem from mismanagement, not player salaries. But "owning a team is fun, so suck it up and take the loss" isn't a good argument.

That DC earthquake

Hey, there was a small earthquake in DC and NYC! Did you hear?

Of course you heard, because news of the earthquake flooded onto Twitter and Facebook at an incredible volume.

By the way, according to the NYT, there was also a historically large earthquake in Colorado today. Did you hear about that one? Probably not.

The point I'm trying to make here is that one of the stranger effects of social media is the way it's made DC-NYC the nerve center of the global digital consciousness. Part of it is just because there are so many people in this area. But part of it is because this area is home to the vast majority of the nation's (and probably a decent-sized plurality of the world's) news organizations, reporters, media professionals, and the like. It's so disproportionately wired that anything that makes even the slightest bit of news in DC and NYC quickly gets catapulted into the eyes of the world. Stuff gets blown way out of proportion.

But the real problem is when this dynamic plays out in reverse. Stuff that doesn't affect DC or NYC, or affects these cities less, tends to get underplayed and forgotten. For instance, it's small wonder that unemployment, though historically bad through much of the country, is so often overlooked on the East Coast. The East Coast is doing relatively well and it's the East Coast that sets the tone for the rest of us.

Also, as a postscript, I'd like to point out that, incredibly, XKCD was proven correct today.

If only George W. Bush's middle name were Qaddafi

So apparently the war in Libya isn't quite over yet.

But predictably, the gnashing of teeth has begun. I'm going to start calling it PISD, for Post Iraq Stress Disorder: panic, unease, and irrationality among foreign policy pundits, when confronted with any situation that even slightly resembles Iraq in 2003. The archetypal case is this New York Times article from earlier today: Wars Against Qaddafi and Saddam Show Parallels.

Many of the "parallels" cited would be more at home on a Lincoln/Kennedy Coincidences Commemorative Plate than in the Grey Lady. For instance, Qaddafi and Hussein both lived in cities with gates:
Like the Iraqi leader in 2003, [Qaddafi] had vowed to defeat the enemy at the gates of his capital, only to find his outer defenses, including his son Khamis’s widely-feared paramilitary unit, the 32nd Brigade, crumbling under rebel assault and NATO bombs.
And both built bunkers:
Rebel commanders on the ground appeared to have concluded that Colonel Qaddafi, after months of NATO bombing that had obliterated almost everything above ground in the compound, had retreated into a vast underground complex beneath the ruins — a last-ditch refuge similar to those that Mr. Hussein had underneath several of his Baghdad palaces.
Their respective disappearances were tangentially connected to household pests:
[Hussein] then disappeared for eight months until he surfaced again, literally, into the custody of American troops standing over his spider hole. In Colonel Qaddafi’s last radio address, he dismissed the Libyan rebels as “rats,” before he, too, vanished.
They even both had two sons!
In 2003, two of Mr. Hussein’s sons, including his likely heir, fled Baghdad without firing a shot; on Sunday, two other sons of Colonel Qaddafi, including his chosen heir, Seif al-Islam, surrendered quickly to the rebels.

Anyway, through some careful investigative work, I've managed to identify at least one slight difference between the wars in Iraq and Libya.
In another respect, too, Colonel Qaddafi appeared to have emulated the former Iraqi leader. As tumult gripped his capital, he disappeared. As American tanks seized the center of Baghdad, Mr. Hussein stood atop a Volkswagen Passat outside one of Baghdad’s main Sunni mosques and promised to stand with his people.
Seems like this might be important... but... oh wait! Did you realize that the names Qaddafi and Hussein each have seven letters? What have we done?!

Consumer privacy is hogwash

One issue on which I do not see eye-to-eye with mainstream progressives is consumer privacy. I just don't see the problem. "Google is gathering your data!" they say. "The credit reporting agencies know everything about you!"

To which I respond, "Yes, but so what?" Try as I might, I don't see what harm Google is going to do to me with a smattering of my personal information. Waste less of my time advertising things I'll never buy? And I certainly don't know what privacy utopia these panicked progressives grew up in. The world dreamed up phone books and private eyes long before it invented AdSense.

So this Prospect post seemed almost tailor-made to rub me the wrong way. First, there's a Kevin Drum quote:
What’s most unnerving about this to me isn’t the technology itself, which is inevitable. It’s not even the obvious next step beyond just age and gender targeting, which has been common practice forever. It’s the fact that I know most people don’t even object to this. It’s just a better way of making sure that you only see ads for stuff you’re interested in, after all. And what’s wrong with that?
Pray tell, Kevin... what is wrong with that? Of course, he never says. That's pretty typical. Usually, privacy advocates just insinuate, but never explain. Their entire position seems built on the creepiness of having someone you haven't met know your name and address. But once you get past irrational, knee-jerk objections, it becomes clear that this sinister figure is actually just a computer database, most of the information it holds is pretty readily available through non-sinister means, and the worst thing anyone wants to do with it is send you credit card offers.

Then Paul Waldman doubles down on stupidity.
[P]eople who work in marketing are pretty smart. They understand the potential for backlash in these technologies, and they know that as long as they can convince us that we’re getting something in exchange, and they make relinquishing our privacy as easy and smooth as possible, we’ll assent, with our silence if nothing else. Sure, my phone is recording my every move if I forget to turn off the satellite tracking. But I have a friggin’ GPS right on my phone! And have you seen the 3-D building renderings on Google Maps? So cool.
You know what? Having a friggin' GPS right on my phone is awesome! And the 3-D building renderings on Google Maps make me feel like I'm living in the future!

Waldman and other progressives would have you believe that marketers and device-makers devised always-on, highly-integrated GPS gadgets as some sort of lure for consumers -- a carrot to entice consumers to give up privacy rights. They then moan and groan about how consumers have been deceived into giving up privacy in exchange for technology.

It's a bizarre rearrangement of the facts. Marketers and device-makers created always-on, highly-integrated GPS gadgets because people wanted always-on, highly-integrated GPS gadgets. They're incredibly useful in a huge number of situations, they're practical for navigation, they facilitate our social lives, they're straight-up cool, and they're just generally a boon to the people that buy them. How do we know this? Because people continue to buy them! In the millions!

Waldman and others act like the privacy-for-technology switcheroo has been forced onto consumers, but that's ridiculous. Consumers demanded technology that opened up gaps in privacy, and manufacturers built those technologies. Consumers liked those technologies and wanted more of them. Manufacturers obliged. There's no "potential for backlash" -- people are purchasing and using these products of their own volition, and if these products pass some invasion-of-privacy threshold, people will presumably stop using them of their own volition as well.

It's much more likely that consumers understand what privacy advocates don't: when you disregard irrational paranoia, diminished consumer privacy doesn't really hurt anyone very much. On the other hand, having an iPhone with GPS, or using a search engine custom-tuned for your personal interests, is pretty great. For the rest of us, it's not a hard decision to make. There's no conspiracy, we're just getting what we want.

Monday, August 22, 2011

Owning From Dust

The video game you see above is called From Dust, and it's representative of a digital revolution. Two of them, in fact.

The first revolution is one I'm happy with. After years in which every new video game was some variation of man-with-gun-shooting-terrorists, there's been an explosion of small-scale, creative development on the gaming scene. The lead designer of From Dust is something of a game designer auteur, and the game itself is a product of an unusual and novel vision.

It's about as unique a game as can be imagined: it's about geological sculpting. Players build mountains, watch forests emerge, guide rivers as they erode the landscape and eventually form gorgeous deltas when they reach the sea. I played a demo a few weeks ago and it was entrancing. It's every bit as stunning as this proof-of-concept video.

So it's all the more a shame that the game is now better known for the controversy it kicked off than for its originality or beautiful design. And the controversy points the way to the second revolution it's helping lead.

Explaining the controversy will require some background. It starts with Ubisoft, From Dust's publisher. Ubisoft is notorious for treating PC consumers in a manner charitably described as "flippant" and less charitably described as "utterly callous." (For instance, they've developed a habit of unaccountably delaying PC versions of their titles at the very last second -- sometimes literally hours before the scheduled release -- while other iterations are released as planned. Naturally, PC gamers who have preordered the game -- in other words, paid a premium in order to have it delivered to them as soon as possible -- are infuriated by this, because it totally obviates their investment.) Nowhere is Ubisoft's disregard for its customers more readily apparent than in its enthusiastic embrace of DRM.

For the uninitiated, DRM stands for "digital rights management," and is shorthand for any sort of technological lock-and-key placed on film, music, or games that prevents them from being used or copied by anyone who is not the rightful owner. Over the last decade, content publishers have evolved a veritable menagerie of DRM.

DRM comes in varying degrees of aggressiveness. It can range from relatively benign -- including an encrypted registration key with the product, which is used at installation and unlocks the content -- to the malicious. For instance, the mid-2000s saw the rise and fall of Starforce copy protection, which essentially rewrote the drivers of a computer's CD-ROM drive, sometimes destroying the computer in the process. Fittingly enough, it was largely utilized by shady fly-by-night Eastern European game development shops, and thankfully seems to have passed out of common usage.

The current trend in gaming DRM is online authentication keys. The advantage is obvious: unlike encryption keys, which rely on a client-side algorithm and can therefore be generated en masse by cryptographically-gifted hackers, online keys are checked against a database held by the publisher. As a result, no fake keys can be used, and real keys can only be used once.
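The contrast between the two schemes can be sketched in a few lines of Python. This is a toy illustration under my own assumptions (all names hypothetical; real DRM is far more involved), not any publisher's actual protocol:

```python
# Toy contrast between server-side and client-side key checks.
# VALID_KEYS stands in for the publisher's activation database.
VALID_KEYS = {"AB12-CD34": False, "EF56-GH78": False}  # key -> already activated?

def activate_online(key: str) -> bool:
    """Server-side check: the key must exist in the publisher's database
    and must not have been activated before. Forged keys fail; real keys
    can only be redeemed once."""
    if VALID_KEYS.get(key) is False:
        VALID_KEYS[key] = True
        return True
    return False

def checksum_ok(key: str) -> bool:
    """Client-side check: any key whose digits satisfy a fixed rule passes.
    Because the rule ships with the game, a keygen can mass-produce
    passing keys with no server involved."""
    digits = [int(c) for c in key if c.isdigit()]
    return sum(digits) % 7 == 0
```

The asymmetry is the whole story: the offline checksum accepts infinitely many forged keys, while the online lookup accepts each real key exactly once, which is why publishers moved their checks server-side.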

But online keys have a disadvantage, as well: they preclude any consumer without access to the internet (or any consumer suspicious about letting a product communicate with a third party) from using the product. And they're hardly foolproof -- enterprising pirates almost immediately devised ways to spoof confirmation from the publishers, fooling programs into thinking they'd been authenticated.

The response from publishers has been predictable: they've made their online authentication ever more draconian. Which brings us back to Ubisoft.

Ubisoft has taken this principle of online verification to its logical end, and devised what may be the most intrusive copyright protection of all time. Many of its games now require online verification every single time they're used. Meaning that if you've legitimately purchased a game, and your internet connection is down, it's useless. Or if you're on an airplane. Or if you can't get your router to forward its ports properly. Any number of common and predictable scenarios render Ubisoft games complete wastes of hard drive space.

And it gets worse -- Ubisoft has invented a second, even more aggressive variation of its DRM, which requires an internet connection at all times while the program is active. If at any point, for any reason, the user's internet suffers a hiccup -- as my wireless does about 30 times a day -- the user is booted back to the desktop, with any unsaved progress lost.

If this had been an accidental addition, it would constitute a massive software bug. "Network traffic causes random crashes" would be an unforgivable and laughable flaw. But because it's done in the name of combating piracy, consumers are expected to buckle under and accept it.

The great irony here is that Ubisoft's abusive DRM does very, very little to protect copyrights. Let's return to From Dust. It originally shipped with the latter, worst form of copyright protection (after assurances to the contrary, which mysteriously disappeared off Ubisoft's site the day of release); after a huge outcry, Ubisoft downgraded the DRM to the still-bad "authenticate at startup" variety. The pirates were undeterred, and a complete, working version of the game appeared on the internet, sans any DRM, within hours of release. In other words, the copyright raiders weren't bothered at all, stripped the game of all copy protection, and in the process created a far more functional product than the officially-sanctioned, artificially-crippled version Ubisoft had published.

In the meantime, Ubisoft has weathered a huge public relations storm. Big-time, commercial reviewers have recommended against buying the game, solely on the basis of its aggressive DRM, and smaller, independent reviewers have actively advocated that consumers pirate it instead. The situation is almost unheard of. The latest news is that Ubisoft has finally caved to pressure, and is removing all online authentication.

That's a small victory. But a bigger question remains: why did Ubisoft insist on using such obtrusive DRM in the first place? To be clear: it's no surprise that it failed to prevent piracy. Literally every single major game release is pirated quickly after publication, and nobody in the game industry labors under the illusion that DRM of any stripe will remain secure for very long. In the end, hardly a single person who wanted to pirate the game was prevented from doing so, and many more were probably driven to pirate the game in order to avoid dealing with the DRM in the first place. Worse still, many, many legitimate consumers, who had paid cash for the game, were forced to make do with a half-functional product. None of these things were unpredictable -- in fact, this whole dumb farce has played out many times before.

Ubisoft's behavior seems almost mind-bogglingly stupid: it treated its customers like criminals, while failing to prevent piracy in the first place! How does this keep happening? None of it makes sense at all.

Unless Ubisoft was never trying to prevent piracy in the first place.

And they weren't. This is the dirty little secret of DRM. It's not about stopping pirates. Pirates are here to stay. Every game, movie, and album finds its way onto the internet eventually. DRM is a response to internet piracy, but it doesn't work by stopping pirates. Instead, it makes up for piracy's losses by squeezing legitimate consumers for every last drop.

It does this by fundamentally changing the nature of the content consumers are buying. Prior to DRM, a video game, movie, or album was a piece of property. You bought it, and then you owned it, and you could resell it. But today, you more often buy the right to use content, in specific circumstances and on specific pieces of equipment. When you try to use your content outside these established parameters, however, it breaks. It doesn't work. That means if you want to sell it to your friend, you can't. They have to buy their own copy, for full price, from the publisher.

Of course, Ubisoft can't say this out loud. Can you imagine? "We've installed a system of locks and failsafes into your game, so that you can't sell it to your friend. You're stuck with it, and they have to pay us instead." So instead, everything is rationalized as copyright protection. Nobody can argue with anti-piracy measures. So anti-piracy measures we get, and they're omnipresent, even when everybody knows they don't work.

In this sense, From Dust, and games like it, signal a revolution, all right. But it's not a good kind of revolution. It's something much more ambiguous, where personal property is being transmuted into something cloudy and communal. And it's increasingly commonplace.
Ubisoft might be among the most flagrant and aggressive purveyors of DRM today, but it's not the first, it won't be the last, and it probably won't be the worst for long. Songs and movies cost less than games, but they're subject to the same restrictions. Ebooks, too. These tricks are pervasive.

This change has social and legal implications, which I'll probably talk about at a later date. But for now, I'll just leave off by pointing out how it disrupts something fundamental and simple and universally understood about ownership. Everyone understands the simple concept of owning an item: "I bought this, and now I can do what I like with it." Now, instead, it's, "I paid for this, but it's a tangle of legal obligations and electronic mechanisms that I don't really control." How is that desirable?

Sunday, August 21, 2011

Not smiling anymore

According to Twitter, this amazing picture was taken less than a year ago. That's Ben-Ali and Saleh on the left. Things change fast.

Exit Qaddafi

As you may have heard, something happened today in Libya.

While Qaddafi's whereabouts are a mystery, no one is pretending that he matters tonight. He might be in his palace. He might be in Algeria. He might be in Venezuela. What's clear at this point is that nobody in Libya is paying their former dictator any mind anymore. Instead, all eyes are on the cheering masses in Tripoli.

First and foremost, the collapse of the Qaddafi regime is good news for the Libyan people.

That's even true when you take all the uncertainties ahead into account. There's already a certain brand of naysayer running about, saying things to the effect of "But this was the easy part!" and reminding us that the whole country could still devolve into tribal violence. These people aren't totally wrong. Libya could still descend into protracted internecine conflict among the opposition forces. There are by all accounts serious historical tensions among the Libyan tribes, and particularly between the eastern and western halves of the country. Nobody knows what happens next.

With that said, the signs are as encouraging as they could be. The transitional council has announced its intention to pursue a peaceful transition to a new government, and signaled that it has already laid the groundwork for a new constitution and elections. That's a far cry from actually effecting the transition, but it's what you would want to hear at this stage. It's also very important that Tripoli seems to have finished off Qaddafi more or less on its own, rising up and sloughing off the dictator. This way, it joins the new government as an equal partner, not a subjugated region. If Tripoli's residents had resisted the rebellion, and the rebels had been forced to lay siege to the city, the prospects for a peaceful post-Qaddafi transition would have been much diminished.

But it's also worth noting that the transition would have been uncertain in any circumstance. Look at Egypt: its revolution was as peaceful as could be imagined, and happened totally independent of Western intervention. And yet the country's future remains cloudy, at best. No one, however, is arguing that Egypt would be better off under Mubarak. If the Libyan revolution ends in further bloodshed, that doesn't prove that the revolution was foolish, or that NATO was foolish to assist the rebels. It was always going to be a gamble, and with Qaddafi gone, it's now clear that the gamble was worth taking.

It might be unseemly, but I'm also going to take the briefest of victory laps here. I've been for intervention in Libya since the beginning, and have occasionally been pretty irritated with fellow progressives for their knee-jerk opposition to the war. Few of them seemed to believe this day could ever come, or that NATO could successfully backstop the rebels without eventually landing troops on Libyan soil or otherwise getting sucked deeper into the conflict. Few of them wanted to admit that saving Libyan lives was worth the cost of US bombs. So for me, today's news is a little bit of personal vindication.

But it should be heartening for everyone else, too: it's a reminder that the rich, democratic nations of the world have agency as a force for good. We don't always have to sit on the sidelines while dictators inflict atrocities on their own people. Not all interventions are good, smart, or successful, of course. Recently, however, many people on both sides of the political spectrum seem to have adopted the belief that any attempt to forcefully interfere in world affairs would inevitably backfire on us. These people seemed to believe the best possible outcome would always be the one that the West generally, and America specifically, had the smallest role in creating. Libya suggests these people are wrong, and that's something that should make any advocate for human rights happy.

Saturday, August 20, 2011

Assassination and foreign policy

My roommate and I were discussing what the US government should do about Syria, and he brought up political assassination. Maybe, he suggested, a few high-profile failures in the past had caused Western governments to unfairly neglect it as a viable foreign policy alternative.

His suggestion would no doubt elicit knee-jerk opposition from many -- myself included -- but the logic is hard to deny. Although the US government wants Assad to step down, it has precious little leverage over him, short of full-scale military invasion. As long as Assad remains convinced an attack is unlikely and infeasible, he can safely ignore the pronouncements of the US State Department. His calculus would have to change, however, if defying the US meant dodging a steady stream of assassination attempts.

It's an interesting thought. Eroding the international proscription against assassination would dramatically increase American leverage over small-time despots and geographically localized conflicts. Can anyone deny that the US would have more quickly achieved its ends in Libya if it had had the option of simply blowing up Qaddafi? Either Qaddafi would have stepped down, knowing he had no other option, or he'd probably just be dead and gone. We certainly wouldn't have spent the last six months bombing the Libyan desert in order to ensure that Libyan rebels survive long enough to do our dirty work for us.

(Some people, I'm sure, also have moral objections to state-directed killings. And fair enough. But when, as in Libya, the practical alternative might be a prolonged military conflict, assassination seems, if anything, the more humane option.)

If you carry the reasoning only this far, it does seem that the US would benefit from again pursuing political assassination as an extension of its foreign policy. And you might think that smaller nations would be generally opposed to a loosening of norms against assassination.

I think this reasoning has it backwards, however. It's somewhat counterintuitive, but proscriptions against assassination are actually most beneficial to large, rich nations like the US.

That's because hiring, training, and equipping a team of assassins is relatively cheap. So cheap, in fact, that most nations could probably do it. So while the spectre of the assassin's bullet might force Assad to take America's preferences more into account when making political decisions, it would also force Obama to take Syria's preferences more into account. He, like Assad, would have to live with the constant fear that some miffed international opponent had sent a team of killers his way.

In a world where assassination was an acceptable foreign policy tool, all world leaders would gain more leverage over all other world leaders. But right now, Syrian leaders have exactly zero leverage over American leaders, while the United States, with enormous economic influence and unrivaled military power, has significant influence over Syria. So the benefit to Syria (a voice in US policymaking!) would be considerably larger than the benefit to the United States (marginal at best). In other words, international norms against assassination are helpful in maintaining the status quo. They increase the influence of large, rich, militarily powerful nations, at the expense of small, poor, militarily weak nations.

It's true that the threats of military force and economic sanctions are awkward foreign policy tools. They're ill-suited for a lot of the foreign policy problems the US faces -- Syria is just the latest example -- and there's often no middle option that would work better. But for American policymakers, the key feature of military and economic force is not just their versatility, but their exclusivity. Few other nations have similar implements at their command. So American policymakers surely prefer to live in a world where other, less exclusive foreign policy options are foreclosed by law or custom.

Friday, August 19, 2011

With limited power comes limited responsibility

Yglesias, on Syria:
I don’t want to see a military intervention, but I think it would be perverse for the administration not to condemn Assad and his regime in the strongest possible terms. A side consequence of America’s global military posture, however, is that it’s hard for our government to respond in the normal way here... The President of the United States... in light of our bid for global military hegemony risks being made to look foolish if he doesn’t do what it takes to make his various pronouncements come true.
Is this remotely true? There are lots of things that happen in other countries that the US government openly opposes. Every time the administration expresses an international policy preference, is it implicitly backing up that preference with the threat of military force? It's not like a country as large or rich or influential as the United States lacks other means of expressing its disapproval in the international arena, even if ultimately those means are unlikely to unseat Assad. Surely most of the relevant political figures understand this.

For that matter, would the administration even want to convey that all international political developments are conducted under the aegis of the United States? One of the more insidious myths facing the United States, at least from a public diplomacy standpoint, is the idea that everything horrible in the world happened because Americans made it happen. People believe this because they can't imagine that the US could desire an outcome and then lack the capacity to achieve that outcome. The impression of American omnipotence in world affairs has led to a lot of very bad things being pinned on Americans. In the end, I'd rather remind people that the US can be one of the good guys than continue to pretend that the US always gets what it wants.

Wednesday, August 17, 2011

Stimulus and the deficit commission

Ezra Klein posted this morning about an idea that, I'll admit, hadn't occurred to me: using the deficit commission to pass additional stimulus.

Ezra focuses on the Democrats' willingness to make it happen -- whether they have the resolve to use the leverage of the deficit trigger to pursue job creation measures.

And that's all well and good. But the more important question here is, "Could it even work?" In other words, "Would the Republicans on the commission allow Democrats to include stimulus measures in the final package?"

I'm skeptical, but not totally pessimistic. Obviously, in a legislative context, the Republican Party has proven almost completely resistant to any form of compromise. There is basically a zero percent chance that anything resembling a stimulus measure could wend its way through Congress while Obama is still president.*

On the other hand, for whatever reason, small, bipartisan committees have tended to spit out much more moderate plans and compromises than the Congress as a whole. See: Simpson-Bowles, the Gang of Six.

Why is this? It's hard to say. It could be that, in a smaller setting, Republicans find it harder to ignore some of the major defects in their argument. (That tax hikes aren't a viable means of reducing the deficit, for instance.) In a one-on-one setting -- or close to it -- individual legislators might be called on to explain and defend their positions, and finding themselves unable to do so, forced to yield a bit. I also suspect diffusion of responsibility plays a role. In the debt ceiling fiasco, most Republicans got off pretty easy: they never personally had to threaten to blow up the economy, because a tiny fringe of right-wing nutjobs were doing the dirty work for them. And even among the nutjob caucus, no single member was responsible for the crisis. Everyone could point to some other group of people and say "It's really their fault." In a small committee, obstructionism will have a name and a face. That may create more incentive to compromise.

Also, in the past, committees have been able to labor in relative obscurity. Their negotiations could proceed without interference from party leadership and, especially, party activists. Each side could make concessions that, taken alone, would enrage the base, safe in the knowledge that the final plan wouldn't be presented to the public until it was in a form that was (somewhat) mutually tolerable. Congressional negotiations, by comparison, tend to attract more public scrutiny.

So which will it be? Will the deficit commission be more like a mini-legislature or a super-committee? If it's the former, we can probably rule stimulus out. If it's the latter, the range of possibilities could be much broader.

Are six Republicans enough to diffuse responsibility for obstructing a compromise? Will they be forced to defend their policy arguments? Will negotiations take place in secret? Or will the entire process play out in the public eye, with each side acting as a proxy for their entire party?

I don't know the answers yet, but they will determine the final composition of the commission's plan.

*If Mitt Romney becomes president, however, expect additional stimulus. It's very doubtful that most congressional Republicans have completely abandoned the mainstream economic theories that they subscribed to in the Bush years, and you can count on them readopting those theories in full once the nation's economic interests run parallel to their political interests.

Monday, August 15, 2011

A passage from my financial management textbook

I promise you it doesn't make any more sense in context.

And here's an editorial so nuts that I couldn't not post about it

I don't usually read the WSJ editorial page. Can someone who does tell me if this Norman Podhoretz editorial is indicative of its usual output? Because, wow.
[W]e villainous conservatives do not see Mr. Obama as conciliatory or as "a president who either does not know what he believes or is willing to take whatever position he thinks will lead to his re-election." On the contrary, we see him as a president who knows all too well what he believes. Furthermore, what Mr. Westen regards as an opportunistic appeal to the center we interpret as a tactic calculated to obfuscate his unshakable strategic objective, which is to turn this country into a European-style social democracy while diminishing the leading role it has played in the world since the end of World War II. The Democrats have persistently denied that these are Mr. Obama's goals, but they have only been able to do so by ignoring or dismissing what Mr. Obama himself, in a rare moment of candor, promised at the tail end of his run for the presidency: "We are five days away from fundamentally transforming the United States of America."
After which, he muttered, "¡Viva la Revolución!"

But please, Norman, continue. I'm sure you have so much left to say.
This statement, coming on top of his association with radicals like Bill Ayers, Jeremiah Wright and Rashid Khalidi, definitively revealed to all who were not wilfully blinding themselves that Mr. Obama was a genuine product of the political culture that had its birth among a marginal group of leftists in the early 1960s and that by the end of the decade had spread metastatically to the universities, the mainstream media, the mainline churches, and the entertainment industry... But whereas the communists had in their delusional vision of the Soviet Union a model of the kind of society that would replace the one they were bent on destroying, the new leftists only knew what they were against: America, or Amerika as they spelled it to suggest its kinship to Nazi Germany.
Oh god, I can't go on. There's so much more -- Obama only got elected because he was black, apparently -- but I'll spare you. The experience of reading this editorial is rather like having bucket after bucket of lunacy dumped on your head, over and over, until you feel like you're drowning in it. That's right: it's like Norman Podhoretz stepped into my room and waterboarded me with his own wild-eyed paranoia.

I don't really know which way to take this. First, it's incredible that "strengthening the social safety net" is now on par with communist revolution, in the mind of mainstream conservatives. This would be an excellent time to reflect on the fact that Obama's health care plan has its roots in a Heritage Foundation proposal.

I guess I could go through and debunk some of his claims, but is that really necessary? His argument is almost self-evidently wrong. Podhoretz is arguing that Obama's health care bill and other legislative initiatives (as well as Obama himself) are the conceptual offspring of 60s radicals who took over the party after the nomination of George McGovern in '72. But Presidents Roosevelt, Truman, and Johnson all came before McGovern, all pursued universal health care, and all three plans were more transformative than Obama's. Thesis destroyed.

Finally, notice that the only piece of evidence that Podhoretz marshals to prove that Obama is some sort of social democratic sleeper agent is one single quote. Imagine if Obama had said something like this: "As long as we remember our first principles and believe in ourselves, the future will always be ours. And something else we learned: Once you begin a great movement, there's no telling where it will end. We meant to change a nation, and instead, we changed a world." That's Soviet Shadow Premier Ronald Reagan, revealing, in a brief lapse, his affinity for Trotskyist permanent revolution.

It's all very demented, but that's not what's really important here. What's important is that this editorial has actually made its way onto the pages of the most important center-right paper in America. Along the way, nobody pulled Podhoretz aside and pointed out that nothing he'd written jibed with common sense or basic history. Podhoretz believes that the radicals have stormed the gates of mainstream progressivism, but his idea seems much more applicable to his own movement. He's the one, after all, that's taken to the pages of the Wall Street Journal to accuse the President of being a subversive.

Friday, August 12, 2011

You can't fix Europe by making some Europeans better human beings

Yglesias on German financial "prudence":
[I]f you look at the actual budget figures on the right, you’ll see that Spain — a country full of fun-loving Spanish people — was actually running extremely prudent budgets throughout the boom years. The same is true of Ireland and I believe Portugal as well. There was nothing particularly imprudent about German budget practices during this time, but the fact of the matter is that Germany was running a larger budget deficit than was Spain. Since the recession hit, that’s obviously flipped around. But again the issue here isn’t really one of “prudence.” Germany responded to the downturn with a pretty vigorous fiscal stimulus program while Spain has enacted repeated austerity packages.
As Matt points out, the tendency to characterize the Germans as a fiscally prudent people tends to mask what's actually happening in Europe. Everyone wants the Spanish to be more like the Germans, but many people seem to be mistaken about how the Germans actually are. The lesson from the German experience is not "adopt austerity measures," and may be quite the opposite of that.

But there's a more insidious error at play here, as well. "Be more like the Germans" suggests that the troubles in Spain and Italy are somehow an extension of how Spanish and Italian people tend to behave. It suggests that if you can change behavior, you can change the destiny of the respective countries.

But is there even one iota of evidence that this is true? Do Germans, presented similar economic opportunities, really act the slightest bit differently from Italians? I, for one, am incredibly skeptical. One of the real fundamental ideas of economics is that all people are basically motivated by the same stuff. And one corollary of that idea is that talking about "how people are" is a waste of time. People are how they are. What's different about the Germans is that they live in a bigger, richer country, with a more robust social safety net, more extensive industrial infrastructure, and more political clout. The only way to make an Italian more like a German, economically speaking, is to have him go live in Germany. Otherwise, we're going to need solutions that acknowledge that Madrid can't become Berlin, the Po isn't ever going to be the Danube, and the Italian economy can't be an exact facsimile of the German economy. In short, we need solutions that acknowledge that Italy and Spain exist as separate and unique places. All the withering lectures about spendthrift Spain can't change these basic facts, but you wouldn't know it from the tone of so much purported financial analysis.

Thursday, August 11, 2011

Someone wrote something smart

Lots of 2012 election news today! A new poll shows Obama competitive, but with Romney close behind. Meanwhile, Romney is doing his absolute best to widen the gap, telling a crowd that "corporations are people, my friends." On the airport TV behind me, Obama's giving a bit of a populist barn-burner, attacking Congress for its inaction. Zap! Zing! Pow!

And yet. None of this matters much right now. If you want the day's shortest, smartest, most accurate summary of the current state of American electoral politics, go read this Amanda Marcotte post. If they gave out awards for writing true things on the Internet, someone ought to be presenting her a little golden statuette right now. Here's my favorite bit, which can be summed up as "swing voters are dumb and you shouldn't trust them":
[T]o make it worse, the very small number of people whose votes are up for grabs are pretty much the polar opposite of the thoughtful citizen who has an open mind and spends the weeks before the election somberly reading up on the candidates before making a well-informed, well-considered opinion. Swing voters tend to be the most ignorant ones, which is probably why they manage to keep voting for Republicans, in between voting for Democrats, even though they basically never like the results of voting Republican.  The truth of the matter is that someone who actually pays a lot of attention to politics is going to become a partisan, and there's no shame in that.  It'd be like following sports or music intently without ever developing opinions about any teams or bands.  

If you don't read the whole thing, I'll think you're a bad person.

Wednesday, August 10, 2011

Guest Post: Watch the Throne Sucks

The new Kanye and Jay-Z album is mediocre. It’s probably Kanye’s worst album to date and easily his least creative. I haven’t listened to The Blueprint 2 or In My Lifetime Vol. 3 in a while, so I don’t know if it’s Jay’s worst, but it’s bad.

It’s hard to point out when Jay-Z became boring. I remember being really underwhelmed by his verse on “Run This Town” when Kanye showed him up. His embarrassingly uncool verse on “Monster” was a pretty clear indicator that he was getting old. After this album I don’t really think there is a plausible argument that Hov is one of the best in the game anymore. He has nothing interesting to say and even his flow feels off a lot of the time. Sometimes he seems to just be stringing words together without really having any ideas or theme.

Kanye feels like he’s not trying at all, which is pretty strange for him. For someone coming off the best album of his career, he sounds pretty uninterested in trying anything new. Every album he has released so far has felt pretty groundbreaking, except for this one. Most of his raps are complaints about critics and mainstream “White America.” They come off kind of silly coming on the heels of such a universally praised album. He’s not particularly funny or offensive either, just kind of boring.

Frank Ocean, the 23-year-old Odd Future member, is easily the best part of the album. He has two great choruses, on “Made in America” and “No Church in the Wild,” two of the strongest songs on the album. “No Church in the Wild” sounds a lot like “Pray,” from Jay-Z’s American Gangster, except the verses are far less haunting. “Otis” is a legitimate banger. The “Murder to Excellence” beat is great; the chanting gives it that “Power” feel. The beat co-produced by RZA (“New Day”) is pretty good too. None of these tracks are real classics, though, and they are probably akin to, or maybe slightly worse than, Jay and Ye’s former B+ material.

I originally planned on writing a long review of this album, but I really don’t have much to say.

On the DJ Khaled track “I’m On One,” released a few months ago, Drake raised some eyebrows with the line “I just feel like the throne is for the taking.” It is.

Sunday, August 7, 2011

Astronauts: still cool

I'm not one of those people who gets all weak in the knees about human space flight.

Look, I get why it happens. I too wanted to be an astronaut when I was a kid. I grew up reading science fiction and owned a telescope so I could look at Mars and Saturn. The Right Stuff is still one of my very favorite books, and I carried a Space Shuttle Columbia trading card in my wallet until I was about 24. The romance of space travel isn't lost on me.

But it's also undeniable that the most implacable foe of human space exploration is cost-benefit analysis. The things we need done in space, robots can do just as well as us. Instead, we've spent many years paying a premium of billions upon billions of dollars so that we could have a select few men and women perform those same tasks with their own two hands. Towards what end? Collective pride, I guess. And it's just not worth it. That money could help thousands -- even millions -- of genuinely suffering people. In the meantime, science will survive.

Still, sometimes I see things like this, and I get sad that things have to be that way.

S&P: Not wrong, but out of line

Felix Salmon has a post up this morning defending the S&P downgrade against criticism by Krugman and the administration. Felix's broad point is clearly correct: whatever fault you might find with S&P, US debt isn't risk-free and deserves to be downgraded. That's what matters most here, not the hypocrisy or incompetence of S&P.

That said, it does matter some whether S&P has an intellectual leg to stand on, and whether its decision was motivated by politics. And here, Felix's argument falls a bit short.
Instead, to understand S&P’s actions, you just need to understand two basic facts. The first is that S&P is not judging the quality of Treasury bonds as an investment. There’s a key difference between S&P, on the one hand, and Moody’s, on the other: when rating sovereigns, S&P doesn’t care about or look at the likely recovery in the event of default. If the US ever did default, investors would ultimately get back 100 cents on the dollar, interest included. Shorting Treasury bonds into that kind of a default wouldn’t make you much money. But it would still be a default — and S&P is trying to gauge the likelihood of such a thing happening.
Felix's point here is that S&P rates the likelihood of any sort of default -- even a short-lived technical default. It doesn't matter that the US would likely meet any missed commitments eventually -- that's beyond the scope of S&P's predictions. If the odds of a technical default increase, then the debt rating should fall.

And fair enough -- except his argument is undermined by the timing of the downgrade. The debt deal (and the precedent it set) might have increased the odds of a default in the long term. But the deal also dramatically decreased the odds of technical default in the near future. The risk that the US would commit some sort of technical default surely peaked in mid-July, as debt ceiling negotiations were imploding and Treasury was actively preparing to pass the August 2 deadline. An agency truly dedicated to rating the probability of default should have downgraded us then... and should have been reassured by the debt deal, which proved that most of the American political system has no real appetite for default, and will strike a pretty crappy bargain to avoid it.

The downgrade's timing hounds Felix's second argument, too:
Secondly, and more importantly, all sovereign defaults are political, not economic — especially defaults by countries which borrow exclusively in their own currency. S&P and Moody’s can look at all the econometric ratios they like, but ultimately sovereign ratings are always going to be a judgment as to the amount of political capital that a government is willing and able to spend in the service of its bonded obligations.
Again, his basic reasoning is perfectly sound. But it doesn't jibe especially well with what S&P actually did. US debt is threatened by political gridlock. But we didn't need to wait until August to understand how bad that gridlock actually was. If anything, the debt deal actually opens up a little space for the opposite argument: there are certain lengths to which Congress won't go, and certain catastrophes that it will work together to avoid.

Instead, by waiting until after the deal was struck to downgrade the US, S&P has signaled that they were taking the content of the deal into their considerations. And the content of the deal shouldn't be terribly important to them. The features of the US political system that endanger bondholders existed before the deal, and remain unchanged after it. The deal itself might alter the economic outlook for the US, but it didn't alter the country's political fundamentals. And as Felix himself points out, the nation's economic outlook should be of secondary importance for S&P.

Ultimately, I can't disagree with the S&P downgrade too much. But I can disapprove of the political game S&P is trying to play. It's inserted itself into American fiscal policy in a manner that both endangers its own mission and distorts the US political process. If recent history is any indication, the rating agencies aren't especially good at their own jobs. So I don't especially want to see what happens when they try to do Congress's job as well.

Saturday, August 6, 2011

On the S&P debt downgrade

I honestly can't decide whether to say this is unexpected or expected.

Why it's expected: S&P has made it very clear the downgrade occurred because of political gridlock, not because of the size of the debt or the state of the US economy. And they're correct! Nobody in their right mind could look at the last month of US politics and declare US debt almost perfectly safe. Countries rated AAA shouldn't come within two days of openly defaulting on their obligations, and the debt deal doesn't change that. What if Congress hadn't reached a deal in time? What if some quirk of parliamentary procedure delayed the deal past the deadline? When you're walking that close to the wire, there are simply too many things that can go wrong to say the debt is "risk-free." And Congress has made it quite clear that July's standoff will be the norm from here on out. Really, AA+ might be too good for us, even.

Why it's unexpected: on the other hand, it doesn't make any sense to downgrade the debt right now -- nothing's changed in the last few days. And while political gridlock is a perfectly legitimate reason for a downgrade, gridlock has been a reality for a while now. The timing of the downgrade makes me think S&P was waiting to see the final debt deal before passing judgment, which is silly of them. All the relevant facts were in place two months ago: the debt was large, no plausible congressional bargain could reduce it, Congress was a feeble, broken institution, and the economy was lousy.

By staging a debt ceiling standoff, Congress was playing Russian roulette with the economy. You don't need to wait for the result to know that Congress was engaging in some pretty risky behavior.

Which brings me to my final point: the real reason the downgrade happened is that S&P painted itself into a corner. It issued an edict: either Congress strikes a deal with $4 trillion in deficit reduction, or the US loses its AAA rating. Congress didn't, so S&P followed through.

It was always a bit strange to see an agency charged with passively rating debt trying to influence policy. And this is why: now, it's hard to know to what degree S&P's new rating reflects an honest evaluation of the facts, and to what degree it reflects a political position the agency had staked out in advance and now must defend. S&P might have damaged America's reputation with this downgrade, but it's damaged its own, too.

Coming soon: more Kanye West-related posting

Because I'm stuck without computer access for extended periods of time for the next week or so, and because blogging must go on, I've taken on a guest blogger: my roommate Charles. I haven't given him any specific instructions on what to write about, but it'll probably involve comparing Inception to the Fed. Brace yourself: he may even post about things that aren't related to politics at all! A novel concept indeed.

Friday, August 5, 2011

Guest Post: Graphs That Worry Me

Almost a month ago Ezra Klein posted this graph, and it's been bothering me ever since:

The basic lesson to take away from this is that Democrats prefer politicians who compromise to get things done and Republicans prefer politicians who stick to their principles no matter what.

It seems easy, looking at this graph, to figure out how the debt ceiling debate would play out. Democrats would offer a compromise, and the Republicans would reject it. Democrats would offer a compromise that was slightly more to the right, and the Republicans would reject it. This would happen a bunch more times. Eventually, something would change the incentives. For (most) Republicans, that something was the actual debt ceiling: compromising was a better option than the US defaulting. For Democrats, it seemed to be Cut, Cap and Balance, a compromise they felt their base wouldn't support.

It also seems pretty clear from this graph how the appropriations deal is going to play out. Except I'm unsure whether a government shutdown looks worse to Republicans than compromise does.

Recently the phrase "Congress is broken" has been thrown around a lot. But it looks more like Congress is responding appropriately to the stated desires of its electorate. It also doesn't help that the government's structure allows a minority to prevent legislation by refusing to compromise.

Wednesday, August 3, 2011

Why you can't negotiate with terrorists

Mitch McConnell:
What we have done, Larry, also is set a new template. In the future, any president, this one or another one, when they request us to raise the debt ceiling, it will not be clean anymore. This is just the first step. This, we anticipate, will take us into 2013. Whoever the new president is, is probably going to be asking us to raise the debt ceiling again. Then we will go through the process again and see what we can continue to achieve in connection with these debt ceiling requests of presidents to get our financial house in order.
More here.

Tuesday, August 2, 2011

A question for my readers, continued [and now updated]

Here's a slightly more systematic way of looking at what I'm saying in the previous post.

Here are the four potential outcomes given two variables -- the expiration or extension of the Bush tax cuts, and the acceptance or rejection of the deficit committee's plan to cut $1.5 trillion from the CBO baseline.

                           Tax cuts expire                Tax cuts extended
Committee cuts accepted    $1.5 T targeted cuts           $3.5 T+ targeted cuts
Committee cuts rejected    $1.2 T across-the-board cuts   $1.2 T across-the-board cuts
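To make the arithmetic behind the table explicit, here's a quick back-of-the-envelope sketch. The $2 trillion figure for the cost of extending the Bush tax cuts is my own rough assumption (it's simply whatever pushes the committee's $1.5 trillion target up to the $3.5 trillion+ cell); the other figures come from the deal itself.

```python
# Back-of-the-envelope sketch of the four outcomes in the table above.
# All figures are in trillions of dollars; TAX_CUT_COST is a rough assumption.
COMMITTEE_TARGET = 1.5  # targeted cuts the deficit committee must find
TRIGGER = 1.2           # across-the-board cuts if the committee's plan is rejected
TAX_CUT_COST = 2.0      # assumed added cost of extending the Bush tax cuts

def cuts_needed(committee_accepted, tax_cuts_extended):
    """Total cuts under each combination of the two variables."""
    if not committee_accepted:
        # The trigger fires regardless of what happens to the tax cuts.
        return TRIGGER
    # Extending the tax cuts widens the gap against the CBO baseline,
    # so the committee must find that much more in targeted cuts.
    return COMMITTEE_TARGET + (TAX_CUT_COST if tax_cuts_extended else 0.0)

for accepted in (True, False):
    for extended in (False, True):
        print(f"committee accepted={accepted}, tax cuts extended={extended}: "
              f"${cuts_needed(accepted, extended):.1f}T in cuts")
```

The point the sketch makes is the same as the table's: the only cell where the total changes is the one where the committee's plan passes and the tax cuts are extended.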

Now, assuming the tax cuts expire, the committee's cuts are probably more acceptable to Democrats than the cuts mandated by the trigger. (Although they'll be larger, they'll also -- hopefully -- be targeted in such a way as to avoid serious economic damage.)

But what happens if the Bush tax cuts are extended? Suddenly, the trigger seems by far the better option. There's simply no way to structure over $3 trillion in spending cuts so that they're more acceptable to liberals than $1.2 trillion in spending cuts, particularly if half of the latter figure comes from military spending. In other words, if the Bush tax cuts are extended, the trigger probably fires, no matter what.

And the Republicans, knowing this, would be forced to choose between preventing $600 billion in cuts to the military and the extension of the Bush tax cuts. Many would certainly choose the latter, but their incentive to preserve the tax cuts would be, at the least, reduced somewhat.

Of course, there's a third possibility: nobody in Congress wants to sunset the tax cuts or to accept military cuts, so they vote to dismantle the trigger instead. The deficit would continue to rise as expected, but for the exact same reason it's always risen: nobody in Congress actually cares about the deficit as much as they care about cutting taxes or preserving spending.

So tell me: how am I wrong about this?

Update: My brother just messaged me to tell me exactly why I'm wrong -- I've botched the timing. The debt commission's plan is due this year, before the tax cuts expire. Which means that, so long as no one tries to extend the tax cuts before the debt commission rolls out its plan, the trigger will become irrelevant long before anyone starts worrying about taxes. So, uh, just ignore the last two posts, I guess.

A question for my readers

To my chagrin, internet access is still limited, meaning that the US government is in the process of enacting an enormous, epochal budget deal and I know next to nothing about it. Argghhh!

So if any of you are able to follow this thing more closely than myself, I've got a question. My understanding is that the cuts mandated by the trigger -- a huge amount of which come out of military spending -- go into effect if $1.5 trillion in other cuts aren't made by a certain date in 2012.

These numbers are calculated with regard to a CBO baseline. But, and again, I could be wrong about this, it seems like the CBO baseline includes the expiration of the Bush tax cuts. In other words, if the Bush tax cuts are extended, the amount of cuts that must be found elsewhere in order to avoid the trigger goes up by the cost of the tax cuts themselves. That's a huge sum -- it would easily double (or even triple) the size of the cuts that need to be found.

So my question is about the incentives this creates. If my understanding is accurate, it seems that anyone who successfully extends the Bush tax cuts must also be prepared to accept absolutely draconian cuts to government OR the firing of the trigger. Likewise, anyone who wants to avoid draconian cuts to government and the firing of the trigger now has an enormous incentive to let the Bush tax cuts expire as scheduled.

So doesn't this basically force the Democrats to let the tax cuts expire? And it surely distorts the incentives for Republicans as well, although I'm less clear how. I have no doubt that many congressional Republicans would prefer to extend the tax cuts and then make up the difference by adding an additional $2.5 trillion in spending cuts to the debt committee plan. But that's simply not politically feasible -- Democrats aren't going to accept $4 trillion in additional cuts with no new revenue, particularly when both the president and the Senate have veto power over any final deal. Besides, it's not like Republicans themselves have shown much stomach for massive Medicare or entitlement cuts. What's the next-best option for Republicans? Would they risk a constituent revolt over Medicare cuts in order to stave off a return to Clinton-era tax levels? At what point does letting the Bush tax cuts expire simply become the path of least resistance for both parties?

Again, I could be totally misunderstanding all of this. Certainly, the small bits and pieces I've read so far treat the inclusion of the expiration of the tax cuts in the CBO baseline as a Republican victory. But I'm not seeing it at all.