I’m aware that, at the moment, Florida is deemed unsafe for people to travel to, but what is generally the worst state to live in? Factors such as education, religious extremism, crime, cost of food, healthcare, and access to other resources are relevant. Will “Deep South” states remain among the worst places to live due to traditionalism, or are more progressive states bound to grow worse?
Mississippi, Alabama, and Arkansas are usually at the bottom of the rankings when it comes to the metrics you mentioned, especially education. Other southern states aren’t much better.
Given that modern conservatism has become little more than a culture war against the things that improve a population’s general well-being, yes, it will continue to be that way.