Dr. Wilson

America v. The Faculty

Does it matter that American college professors are much more liberal than the country as a whole? To put it more precisely: does the DEGREE to which college professors lean further left than the country matter in the search for truth, knowledge, and perspective? (Another way of asking it: does the academic mission matter?)