A denominational leader asserted that the best thing the Church could do to handle the challenges of this cultural moment would be to "stay in its lane": that the so-called "culture wars" have been grueling, and that the Church is primarily called to spread the Gospel.
To counter the secularizing forces constantly at work, we must be deliberate and strategic about helping Christians think “Christianly.” In biblical terms, we must “destroy arguments and every lofty opinion raised against the knowledge of God, and take every thought captive to obey Christ” (2 Corinthians 10:5, my emphasis).
There’s no need to fight a culture war if your worldview is in the ascendancy. Christians in the Middle Ages didn’t go on crusades to lands full of churches. Upper-class men didn’t fight for the right to vote because they already had it. So why does it feel as if, in every sphere of society right now, a culture war is being fought by both sides? Why does seemingly everyone feel as if they are the victim?