So I was reading the Wonkettes (another place where I write words occasionally) and saw this review of a book about the idea of Southern secession, written by a Northwestern liberal armpit-ty hippie (I assume), which makes the argument that really, the rest of the country would probably be better off if we just gave the South back to wherever we got it from (hell).
Now, I am new around here, but if you don't know yet, I am a goddamned Southerner. Born in Arkansas (I am indeed known among the gays in my town as one of an amazing yet unorganized group of guys called the Arkansas Boys, The Best Guys in town), raised in Tennessee, lived in Georgia, back in Tennessee, etc. I love it down here. That being said, I see books like this and my immediate reaction is "oh yeah, hell yes, fucking get rid of the South. Turn it into the Banana Republic it so aspires to be. Just let me move North first." And it's weird, because as I was reading that book review, it occurred to me that a lot of Southern progressives are like this.
We are well fucking aware that so much of the history that has defined this nation happened on our lands. We feel it all the way back to the Native peoples who were driven out of or murdered on their rightful territory, to whom we pay homage by giving our wealthy white neighborhoods names like "Chickasaw Gardens." We who are obsessed with music know how much of our musical heritage comes from this place. We know what kind of magic comes from this part of the country. But at the same time, we deal with the fact that, as the book review describes, this part of the country has been the grown-up part of the country's proverbial bleeding, abscessed hemorrhoid pretty much since the nation's founding. We love it, yet we hate it. We cherish being Southern Liberals (Molly Ivins, helloooo.), yet there's a part of us that's right there with the rest of the country saying "Kick it into the Gulf of Mexico. All of it. No one will miss it."
It's a weird dichotomy, is all I'm saying.