Answer:
A.
Step-by-step explanation:
The West represented the frontier: a place of opportunity and adventure where Americans could start anew and pursue their dreams. It also symbolized the triumph of American civilization over the wilderness and Native American cultures.