California, a U.S. territory in the West, became a state in 1850.