It seems to me like American society right now has a really negative view of the future. Personally speaking, most people I ask don't think the future will be better than the present, and many think it will be much worse. I've encountered this across nearly the entire political spectrum, and it seems much the same across different classes, ethnicities, and religions too. Do you think this pessimistic outlook is realistic? Why should things get better or worse (in your opinion)? Is it different where you live?
Personally, I tend to be unjustifiably optimistic and think that things generally work out for the best. Terrible things happen, sure, but so do beautiful and wonderful things. I think that focusing too much on the bad is unhealthy for a culture and can lead to paralysis and paranoia, just as focusing too much on the good (as I tend to) can lead to blind confidence and arrogance. I hope that things become more balanced as time goes on and that worldviews begin to brighten up.