I don’t know what it is, but over the past year or so I have read and watched many things that have to do with World War II. Why is that? To be honest, I can’t even tell you what happened during World War I. I have no idea whatsoever. Does that make me American? Because I live in ignorance?
Anne Frank, Sarah’s Key, The Reader (GOOD stuff there), I’m currently watching Swing Kids, and I just finished No One Is Here Except All of Us… Why is there such an intense concentration on World War II in so many forms of art – literature, movies, and so on – yet World War I? Where’s that in history? Where are those stories?