Church has gone from a staple of American culture to something deeply divisive. The more hostile our culture becomes to the Christian faith, the more important it is for Christians to understand what church is really all about, and to be part of a church family.