04-29-2023, 04:34 PM
The only nature-type places worth seeing in the east USA are in the eastern TN area where it meets Virginia. So the west of Virginia, not the state of West Virginia, lol. Other than that, Gulf coast... any of the Gulf along the southern states is worth seeing, and of course Florida has a diverse selection of beaches. Atlantic Florida is far different from Gulf Florida. The Gulf side is much prettier, but they're both worth seeing. The Keys are worth seeing too, but mostly for the coral and the pretty color of the water, as there's no sand.
I'd recommend hitting up Texas, Gulf coast, Florida. That's about it. Maybe take a route that brings you over to that western area of Virginia, just to see it. There's nothing over in the east USA that compares to the scenery out west on any level whatsoever.
Keep it moving through the south.
Once you get closer to the Gulf and Florida, the vibe changes and you'll be safer (ironically, even though there's a shitload of crime in those areas) because you're a tourist and that's normal there.
Texas has good camping on its coast, same for all those southern Gulf states. It won't break the bank to stay along those areas and those are about the only areas that have any free options whatsoever.