Actually, we never grew out of the fundamental beliefs of the Christian church (at least not in the West). It was the Church that instituted belief in the fundamental dignity and mental and bodily integrity of all human beings (beginning with women). Prior to that, in slave cultures like Rome, no one except the patriarch had any authority or power over his own person.
Although the Church as a political institution made some very grave errors that resulted in great evil, the fundamental philosophies of Christianity changed the world and led ultimately to the emancipation of women and the end of slavery (and serfdom). But people who don't study Ancient History (or Philosophy) usually think the Church begins and ends with buggering altar boys.
A great book called Dominion by Tom Holland tells the story very clearly as a history of ideas. Another is Inventing the Individual by Larry Siedentop.
I'd avoid slagging Christianity until I'd read at least one of them.
Have you read "The Pursuit of the Millennium" by Norman Cohn, Atelier? That's a great one regarding medieval Christianity, although it's more about heretical movements than the institution of the Church itself.
I'd recommend it to anyone interested in the subject, though. Cohn covers many of the movements from before the Crusades up until the Reformation and the Peasants' War, in a way that gives really interesting and unexpected insights into the period and the religious philosophies and movements circulating back then.