Everyone has their own opinion on all of these issues. I did want to pose a question though...and it is something that I have been thinking about lately, especially now that I have children of my own.
Does it seem that our culture as a whole has become more and more violent and dark, and that this is being reflected in our literature, including YA literature? Everything from our movies, books, TV shows, and news has leaned further and further toward the dark side. And as a parent, naturally I take issue with that.
Teaching about westward expansion recently, I shared with my students The Leatherstocking Tales (which include The Last of the Mohicans), and we had a very productive discussion about how what a culture reads says a lot about it. One of my students pointed out that what a culture reads correlates directly with what is happening in that time period. During the 1800s, exploration, pioneers, and hard work were the dominant themes in literature...and now, vampires, abuse, and hard decisions are becoming more and more prevalent. That is worth noticing. It is also worth noting that killing has ALWAYS been a theme in literature, probably because it has always been a part of human culture.
Do you think that our culture and YA literature have taken a ridiculous turn toward the dark, or is it simply a reflection of our lives? And how do you feel about it either way?