With the stark division between left and right (Democrats/Republicans, progressives/conservatives, whatever you want to call it), is the “United” States over? I might be pessimistic, but it seems the disconnect between citizens is too great to fix. I don’t think I’m alone in thinking this, but what are other opinions?
When I think of US history, unity and stability are probably the last adjectives I’d use. I don’t see this period as notably worse than any other transitional period. Alternatively, I’d put it like this: violence is the only true power in America, and as long as the government has a monopoly on it, the country will be forced into submission no matter the ruling party.