In the United States and the United Kingdom, culture war refers to a conflict between traditionalist or conservative values and progressive or liberal values. Since the 1990s, culture wars have influenced debates over public school history and science curricula in the United States, along with many other issues.