How has Christianity played a major role in the culture of the United States?

1 Answer


Answer:

Christianity has played a major role in the United States because the majority of the population holds Christian beliefs, making Christianity far more common than any other religion in the country.

answered by Rohit Chemburkar (7.7k points)

