OK. I'm probably going to expose myself as a HUGE ignoramus here, but I really see no point in having a Geography department and major at a university.
Honestly. You study geography up through 8th grade or whatever, and you know where countries are and things--but like, that's not enough? You have to MAJOR in knowing where countries are?
I work as a TA, and our lab shares space with the Geography TAs, among others. I was sitting in my office hours today listening to some pompous young male person go on and on about the weather conditions in the British Isles--how you don't get snow and ice there, how in the winter it just rains and never gets below 40, and blah blah blah. So I stuck my head in. "Actually, you do," I said, and added some other random bit of information. The male person gave me this glare, said "That's why I said 'very often'," and proceeded to ignore me. So I went back to my chair--humiliated. A whole other group of people had heard the exchange--and you know what?
I'M RIGHT.
He's NOT.
Why do I know I'm right? Because I spent HOURS shivering in a tent in below-freezing conditions in JULY in SOUTHERN WALES, which is one of the vacation hot-spots of the British Isles, and it wasn't just some freak weather pattern. It was the norm.
So my opinion of Geography and its usefulness has taken yet another plunge.
The TA then continued to talk, spewing all sorts of stuff about different places, throwing out words like "nation-state" the same way an athlete would flex his muscles.
Pardon my French, but what the hell does a geography student know about nation-states? Leave that to the political scientists, the historians, and the anthropologists, hon, and go get a REAL major.
/end embittered rant