I have been looking at different societies throughout history for school and have seen how the rise and fall of various empires and countries has caused the rise and fall of the religions associated with them. My question is, if Christianity is the true religion, why hasn’t it stayed? Why have we fallen and become a ‘post-Christian’ nation?
True Christianity hasn’t gone anywhere, but many people have rejected it in favor of false teaching and mythology (1 Tim. 4:1-2) and have become self-centered and ungodly in the process (2 Tim. 3:1-5). They even question the Lord’s promise that He’ll return (2 Peter 3:3-5). Reading these passages, you can see that the Bible warned us these things would happen and says they’re a sign that the end times are upon us. It also tells us the Church will decline in strength and influence as the end approaches (Rev. 3:8). All this means that in a few years the Lord will return to set up His kingdom, and when He does, Christianity will be the “religion” of the entire world (Zech. 14:9).