asked 26.0k views
4 votes
When did sociology become established as an academic discipline in the United States? During the Middle Ages, about 1800, about 1900, or about 2000?

1 Answer

1 vote
Sociology became established as an academic discipline in the United States around 1900. The term "sociology" was coined by Auguste Comte (the father of sociology) in 1838 to describe the scientific study of human society. Sociology was first taught as a social science at Yale University in 1876, and the University of Chicago was the first institution to establish a graduate sociology department.

answered by Stephen Patten
8.4k points