asked · 219k views
2 votes
How did life in Europe change after the Germanic invasions?

1 Answer

5 votes
Assuming you're referring to the German invasions during World War II:
Europe became a battleground between the influences of communism and capitalism, which competed to spread their reach among European nations.
If you're referring instead to the Germanic invasions after the fall of the Roman Empire, Christianity spread throughout the lives of Europeans.

Hope this helps
answered by Mrpatg (7.9k points)