Every week, it seems, there’s at least one article in a major publication explaining what’s wrong with public education. This week, I happened to read it in Newsweek. Now the blame falls on the education schools and the poor preparation most teachers receive at them.

It’s not that I disagree with the article’s premise, exactly. Most education schools don’t teach teachers how to teach well. Or even what to teach (community-building techniques aside, I have worked with teachers entrusted to teach reading who didn’t understand phonics or phonemic awareness). I majored in history as an undergrad, then took 14 months of graduate classes to get a Master’s and a teaching certificate. When I walked into the urban school district where I taught for 8 years, I knew more about how children learn than half of the veteran staff.

And I thought I didn’t know enough, so I kept studying. On my own. During my free time. I’m not trying to sound noble; I just wanted to be good at my job. Most of the people I know with corporate jobs had to do the same thing: they learned most of the skills their jobs required while actually working at their companies, not while they were in college.

Of course education schools need to do more than espouse progressive ideologies and teach future teachers not to fall victim to any of the “isms.” But does anyone really believe that perfectly educated teachers will solve all of education’s problems?
