Landslide
Well-Known Member
I came across this news, and I have to say I'm in shock...
Especially with this part:
The United States remains one of the few advanced societies where it is possible for most citizens to move from kindergarten to postgraduate life without any exposure to geography as an analytical science.
So geography is never a mandatory subject at school in the United States?