I’ve been hearing this phrase all my life. It seems to be a uniquely Christian saying and one that is mostly used at Christmas (that pagan holiday co-opted by the early Christians because the pagans wouldn’t quit celebrating their holiday).
The problem is that, in my experience, no religion actually wants peace on Earth… unless they are the ones in charge. Here’s a story about a rabbi (in Israel, no less) who has started a campaign to make it illegal to rent rooms to Arab Israelis.
Of course, the GOP, long the party of fundamentalist Christians, has been a huge supporter of the US military machine (probably because of the amount of stock its members own in the industry). It has also routinely rejected attempts to ensure that all people in the US have adequate healthcare and are taught accurate information about things like birth control and science.
While praying for religious freedom (because, apparently, Christian persecution is at an all-time high (possibly with good reason)), they continue to pray that everyone across the globe find “Christ”.
You know what would make Peace on Earth a reality? The elimination of evangelical religions. If everyone were free to live as they wanted to, without being harassed or attacked for belonging to a certain religion (by other religions; atheists don’t care), then there wouldn’t be nearly as much strife.
People who use their religion to preach and support violence, the marginalization of out-groups, and hatred against those who just want to be allowed to think and believe as they wish should be the ones marginalized. Instead, these groups are promoted (via the media and internet sensationalist sites). Instead of everyone telling these people “That’s wrong!”, they are allowed, even encouraged, to promote their bigotry and hatred.
And that’s not a recipe for world peace.
And for those evangelical religions, here’s a hint: world peace does not mean everyone doing what you say.