asked · 93.0k views · 5 votes
What does it mean when you say faith actually begins with God?

1 Answer

2 votes
When people say that faith actually begins with God, they mean that true faith comes from a belief in God rather than from societal pressure; in other words, true faith should not be influenced by people telling you that you should or should not be faithful.
answered by User IndusBull (8.4k points)

