Despite a common misconception, both within the Church and without, Christianity doesn’t teach that human bodies are evils to be controlled in order to avoid sin. Rather, it teaches that our physical bodies are part of who we are as beings made in the image of God. As part of the created order, then, our bodies are a form of natural revelation, designed by God to reveal Himself.