Oh, Floridians. It never ceases to irk me how, to you, "west coast" doesn't mean California, Oregon, and Washington, but rather the west coast of Florida (um, pretentious much?). Also, stop saying "the Bay Area" when you mean Tampa Bay. That right is reserved for San Francisco!
It's like you've never lived outside your own state. ಠ_ಠ
Come to think of it, most Floridians I know haven't. What's up with that, anyway?
It's like Florida is a black hole that sucks you in and doesn't let you go.