Ah, the reality of accelerated graduate school classes. The cohort, which seems to change rapidly, moves from class to class in 8-week intervals. I have to say I was pleased with that until Action Research. I admit I wasn't big on this class at the beginning, but after learning the process and participating in several excellent conversations about PD, I was disappointed when it came time to conclude.
According to everything the class taught us, data analysis and the subsequent action plan take up the majority of the time in the project. We, the students, had three weeks to collect data and one week to analyze and conclude. I am not complaining, I get it. I get that this was an introduction to the process and that we can now implement the practice in our districts, but I really got into it. I am disappointed that I could not possibly put all of that data into a coherent form in just a few days. I can't believe it, but I am disappointed that I didn't have time to write a bigger, better paper. Looking at my schedule, odds are I will not get back to it for quite some time.
That said, here is a small piece. Please save your comments about my awful APA citation for an argument with a college professor. (FYI: InfoSource is an internet-based content delivery system that we are using to update MS Office skills.)
Question two will unfortunately remain unanswered until further research is conducted. It is my opinion that offering online, on-demand professional development is a more effective delivery method; however, the data collected from InfoSource (Pearce, 2007) suggests the contrary. A yearly assessment report showed over 300 failures, 54 incomplete tests, and only 45 passing scores in 2006-2007. A quick look at time spent showed that the average participant spends less than 30 minutes on each quiz and often logs off after less than 10 minutes because of a failed pretest.
This is the only online assessment tool that we have, but it is my belief that it is not an accurate reflection of the medium. In my personal experience, InfoSource is tedious and difficult, offering material as drill-and-skill rather than exploration and reflection. I point to the short amounts of time spent and the staggering number of failures as proof of this. As a side note, I also take issue with InfoSource's overall data reporting. To my knowledge, the system provides only one report, listing access times and successes or failures. There is no way to filter, import, or export data, making assessment beyond pass/fail difficult. Chalmers and Keown (2006) suggested that the success of any program is equal to the amount of support provided. InfoSource is a stand-alone quiz machine and, in my opinion, is proving to be an expensive experiment with very little reward.
Although it does not address professional development on a mass scale, another popular method of teacher development is the conference. With the assistance of the district office, I was able to access and manually record conference request data for the 2005-2006 and 2006-2007 school years (D.A.S.D., Appendix A). Over those two years, 1,086 individual requests were filed. Not having comparable data from other schools makes conclusions difficult, but cross-checking the requests against the given rationales yielded some interesting trends. Less than half of the requests appeared to be related to pedagogy. Only 79 of the requests involved educational technology, while 364 requests were related to the administration and guidance departments. The question that needs to be explored now is why teachers do not seek out more opportunities within their content areas.