When did Christianity become the dominant religion in Europe?
asked Apr 20, 2018 · 44.1k views · 2 votes
Tags: History, high-school
asked by Kieran Foot (8.2k points)
2 Answers

3 votes
The Romans stopped persecuting it in the 4th century, when the Roman emperor embraced Christianity.
answered Apr 22, 2018 by Thesis (8.1k points)
4 votes
I think it was the 4th century.
answered Apr 25, 2018 by Inmyth (8.0k points)