When you’re a teenager, you have preconceived notions of what you want out of adulthood. You want to drive a Ford Explorer and live in Southern California.
That’s not what you wanted? Maybe it was just me. (And let’s be honest, that’s a pretty ridiculous combination: a gas-guzzling automobile in the land of the environmentally conscious? That never would have worked.)
I’ve been to Southern California a fair amount for someone with no family or real reason to be here. Each time I ventured west, I was thrilled to come out, and I mourned the day I’d have to head back east. It was exciting to see stores and restaurants we didn’t have back home. People here spoke Spanish, ate sushi, and were crazy about avocados. There were palm trees and more than three lanes of traffic. Visiting Los Angeles was as big an adventure as you could have when you were from a small town in Virginia.
It’s been a few years since I’ve been to LA. In that time, I’ve fallen pretty hard for New Orleans, a boy from New Orleans, and the swamp. I love southern hospitality, warm summer nights, soul food, an affordable cost of living, and saying “y’all”. It took me a while to accept that I’m from the South, but now I’m proud of it. It’s a lifestyle that I’m fond of, and I never want to leave.
Working in TV/film, I’ve definitely embraced the change of pace and the opportunity to get out of Nashville for a bit and learn from some great Los Angeles producers. But it’s the first time I’ve come to California and felt completely indifferent about it, the first time I’ve been able to look at it from an unbiased perspective. I see the beauty here, but I also see the flaws. It’s the first time I’ve been here and realized this isn’t the life I want. I’ve grown up.
Have you recently visited a place you once had blind adoration for? How have you changed?