Who Defines Doctrine?
Back in the 1780s, Noah Webster fought to create an American language based on the way American people spoke, not on rules laid down by English aristocrats. His populist philosophy did not entirely appeal to people who bought dictionaries, however. During the 19th century, people bought dictionaries in order to get the authoritative word on words. Having a large dictionary in the parlor became a ticket to culture, writes David Skinner in The Story of Ain't. Thus dictionary companies marketed their products to a set of consumers more conservative than Webster himself.
Webster must have rested uneasily in his grave until 1961, when Webster's Third New International Dictionary startled speakers of American English. Drawing on the new science of linguistics, the dictionary returned to the authentic Webster tradition: rather than prescribing how people should speak, it described how they actually spoke. As its editor, Philip Babcock Gove, wrote, a dictionary needs to be "a faithful recorder … it cannot expect to be any longer appealed to as an authority."
In the controversy that followed, writes Skinner, detractors and defenders alike used moral language. A critic complained in the Saturday Review, for example, that "permissiveness, now on the wane in child-rearing, has caught up with the dictionary makers." Editor Gove celebrated that permissiveness, Skinner reports: He "compared the belief in one correct linguistic standard to a belief in revelation, in the Ten Commandments specifically," rejecting the notion that there is some language deity inhabiting a linguistic Sinai—some source and sanction for language other than usage.
Reading The Story of Ain't got me thinking about doctrine and ethics. Truth is eternal, but the language of truth—precisely what believers believe, how they summarize it, and what dimensions they emphasize—changes. Doctrine is conditioned by events and movements.
One example is the Rule of Faith—the words repeated by those about to be baptized into the early Christian church. It no doubt began as something like 1 Timothy 3:16 ("He appeared in a body, was vindicated by the Spirit, was seen by angels, was preached among the nations, was believed on in the world, was taken up in glory," NIV 1984), a summary of the Christ event. By the year 381, it had become what is now called the Nicene Creed, a careful delineation of the life and work of the Trinity. The language of truth had changed in reaction to at least four heresies—Arianism, Apollinarianism, Macedonianism, and Chiliasm.
Not all change is provoked by heresy. Christianity often takes new cultural forms in response to new contexts. The Protestant Reformation with its emphasis on sola scriptura moved with the speed of the newly invented printing press. The missionary movements of the 16th and 19th centuries followed the trails laid by explorers and colonizers. I grew up in the revivalist tradition—a spiritual stream ignited by the democratization of American religion.
And yet change can be heretical, schismatic, or just dangerously lopsided. So how should we relate to populist religious movements? Do we appeal to tradition, or do we say anything goes?
Theologian Richard Mouw has modeled a middle ground between theological elitism and cultural permissiveness. In a 1994 Christianity Today essay, he drew on Cardinal John Henry Newman's respect for the "sense of the faithful"—"a sort of instinct … deep in the bosom of the mystical body of Christ." If the church or the guild of theologians sets something forth as true, "it is a good sign of its truth that it is actually received by the membership."