Wednesday, November 15, 2017

The death of Christianity in the U.S.

Christianity has died in the hands of Evangelicals. Evangelicalism ceased being a religious faith tradition following Jesus’ teachings concerning justice for the betterment of humanity when it made a Faustian bargain for the sake of political influence. The beauty of the gospel message — of love …
