I'm sorry if what I'm about to write is unpopular, but I have to write this. The teaching profession is one of the MOST important jobs in this society. Yes, there are many professions, and yes, so many in our society work hard, but how many people in America have not been taught by a teacher? Teachers guide individuals into professions. We are the ones who encourage, stimulate, and shape futures. How many other professions out there serve almost every American? A person can go almost their whole life and never see a doctor (until death), never meet the president of the US, never need a plumber, never use a bank, grow their own food, and so on - but how many people can honestly say that they have NOT been taught (at some point) by a teacher?
Yes, there are not-so-great teachers, and yes, some need some counseling themselves, but the majority of us work our behinds off. The people who get into the teaching profession do it because they care. We really want to make a difference and make this world a better place. It's surely not because of the money. Yes, we have summers off, winter breaks, and spring breaks - but many of us (me included) don't get paid during that summer break, so we have to stretch that poor salary all year round. With all of that said, many of you can compare how hard you work, how hard your husbands work, etc. But how many of you haven't been taught by a teacher at some point in your life? That's how important teachers are.
To the OP, teacher bashing comes into play because a few have taken that trust and respect and made a mess of things. Teachers used to have such a high moral status. Now we (I say we, but not me) are disrespecting kids. We talk to them crazy; we come down to a kid's level, we sleep with kids, and we become their friends instead of their mentors. Because of the indiscretions of some, people (some people) automatically don't trust teachers. Now our trust has to be earned.