I read recently that Florida is no longer considered part of the South, possibly because of the huge influx of "snowbirds" from the Northeast. Florida has also developed its own distinctive culture, separate from the rest of the South, so I can see how it's a bit different culturally too.

Do you consider Florida to be part of the South?