Monday, December 7, 2009

Administrators'...New Teacher Mentorship Program Survey

The Administrators’…New Teacher Mentorship Program survey was designed to gain insight into how familiar principals in Saskatoon Public Schools are with the existing new teacher mentorship program. A second purpose is to gauge the level of importance principals place on such a program. Third, the results should indicate whether principals are familiar with what the research suggests, as well as how they see their role influencing new teachers. Novice teachers look foremost to principals for guidance and direction on how they should perform in schools (Wood, 2005). I conducted a pilot test of the survey with four principals, two female and two male, all of whom had varying years of experience in the principalship.

Revising the Survey
I received valuable feedback from all four principals who participated in the pilot. Although the feedback didn’t suggest any fundamental problems with the survey itself, the revision made the questions easier to understand, and the addition of a couple of questions ensured I was obtaining all the necessary information. The specific changes are described below; numbers refer to each question’s placement in Survey #2.

Questions #1 & 2
I changed these two questions to statements so they flow better with the rest of the survey. I also changed the response options for number 1 to include “don’t know,” as that fits the statement format better than the question format.

Questions 5 & 6
I added these two questions to get a sense of whether principals understand that mentoring programs support all stakeholders’ learning, not just student learning.

Question #7
I reworded this question because all four principals felt it was wordy and confusing. On a second look I agreed, and I simplified the wording for clarity.

Questions 9 – 11
I reordered these three questions because they share common content but did not follow one another in the original survey. Grouping them one after another keeps the survey succinct.

Questions 13 & 18
I agreed with the suggestion to change the numbers in the lists to letters. The original format may have confused readers; the break between numbers and letters makes it easier to distinguish the answer options from the initial question or statement.

Question 15
After reading the answers to #15, it was obvious I had worded the question in a way that confused readers. What I wanted to know was simply who selects the mentors.

Question 16
Again, I reworded this question to make it shorter and more concise.

Questions 20 & 21
These questions were added at the suggestion of one of the principals. Neither he nor I knows whether such a procedure is in place, but it makes sense to both of us that there be some way to evaluate the usefulness of the program as it exists.

Question #22
This is also an addition to the original survey, and I will be very interested to see how principals answer it. My personal belief is that there should be. I know there would be some degree of informal mentoring for all employees, but if it is not done with purpose and intention, can we get the results we need?

I found the pilot test extremely useful. When I initially sent the survey out to my peers, I felt confident that I had developed a valuable tool that needed little revision. I was wrong. A lack of concise wording turned out to be the greatest weakness. Having professionals with some familiarity with the content complete and critique the survey helped bring it to life. Their comments and questions forced me to rethink the formatting, succinctness and clarity, and led to a more valuable survey that I will use in the New Year.

You can view the surveys at the following links:
Survey #1
https://survey.usask.ca/survey.php?sid=18323
Survey #2
https://survey.usask.ca/survey.php?sid=18424

References
Wood, A. L. (2005). The importance of principals: Site administrators' roles in novice teacher induction. American Secondary Education, 33(2), 39.

Tuesday, October 27, 2009

SPS New Teacher Mentorship Program

Assignment #3

New Teacher Mentorship Program in Saskatoon Public Schools (SPS)

Over the last couple of years, Saskatoon Public Schools has changed the format of its new teacher mentorship program.

The purpose of the program is "To help all students learn and achieve at high levels." In this way, induction and mentoring serve a strategic purpose.

The program's goals are:
*to support the learning of the new job and reduce the stress that comes with it
*to improve instructional performance by modeling and collaboration
*to attract new staff in a very competitive recruiting environment
*to retain excellent veteran staff in a setting where their contributions are valued
*to promote the socialization of new staff into the school "family", values and priorities

The purpose of this evaluation is to determine the effectiveness of the program in meeting the above goals.

Contributors to the process will be:
*New teacher mentees
*Experienced teacher mentors
*School based administration
*Senior administration (central office)
*Member(s) of the SPS New Teacher Support Committee

The information from this evaluation will prove useful for several reasons.
*It will assist SPS and the New Teacher Support Committee in planning effective processes to support new teacher learning
*It will provide feedback and suggestions for teacher mentors and school-based administrators as to how they can work alongside their new teachers in a more purposeful or valuable way
*It will assist in determining whether or not to continue with the program

Questions
1. What processes/activities are in place to support new teacher learning and teaching?
2. What do people do differently as a result of the program?
3. Are new teachers satisfied with the support offered?
4. Do teacher mentors have an in-depth understanding of their roles?
5. Do school-based administrators have an in-depth understanding of their roles?
6. Are all new teachers across SPS receiving similar experiences, or are there inconsistencies depending on where one works?
7. What changes or modifications do those involved see as important?

Resources
Following are the sources of information needed:
*People identified to be involved
*Time to conduct interviews
*Survey
*Literature/Research that supports effective mentorship models

The two methods I will use for obtaining the data are interviews and surveys. These will be one-time surveys and interviews conducted in the middle of the school year. The interviews will be recorded for transcription purposes.

Analyzing the Data
Responses to the computer-generated survey will be tabulated automatically by the survey tool, and I will then review the output to organize the information. I will also collate and organize the interview data.
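
To make the tabulation step concrete, here is a minimal sketch in Python of how exported survey responses could be tallied. The file name survey2_responses.csv and the one-column-per-question layout are assumptions for illustration only; the actual export format of the university survey tool may differ.

```python
# Minimal sketch: tally how often each answer was chosen for every question.
# Assumes a hypothetical CSV export with one row per respondent and one
# column per question (Q1, Q2, ...); adjust to the tool's real export format.
import csv
from collections import Counter

def tabulate(path):
    """Return a {question: Counter of answers} mapping for a CSV of responses."""
    counts = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for question, answer in row.items():
                counts.setdefault(question, Counter())[answer.strip()] += 1
    return counts

if __name__ == "__main__":
    for question, tally in tabulate("survey2_responses.csv").items():
        print(question, dict(tally))
```

A summary like this would make it easy to see, for example, how many principals chose "don't know" on each statement before the findings are interpreted with the committee.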

Once I have the data organized and collated, I will collaborate with the SPS New Teacher Support Committee, school-based administrators and teachers to interpret the findings. It will be important to seek input from all stakeholders, as various views will lead to greater understanding for effective future planning.

The evaluation will be communicated to all those involved through a written report as well as through conversations with the decision makers connected to this program.

Saskatoon Public Schools strives to recruit and retain only the best teacher candidates. Expecting new teachers to perform at a high standard means that clear support plans must be in place. New teachers are charged with understanding and implementing successful instructional strategies, effective classroom management, community development and ongoing learning. Without guidance and structured support from colleagues and administration, the job of a new teacher may become daunting, and his or her ability to help students learn and achieve at high levels may be jeopardized.

Friday, September 25, 2009

Assignment #2

Program Evaluation...Case Study

The case study describing a government-funded program for children with severe/profound disabilities is not unfamiliar to Saskatchewan educators. Although this program focuses on children who may or may not be in school, those of us in elementary schools may be involved in providing programs such as this one. Funding availability hinges on similar criteria, such as diagnosis, documentation in the form of a personal program plan and, of course, the level of support being provided.

The formative evaluation approach is the method I would choose to assess the program in place to meet the needs of the children identified in the ECS Program described in the case study. Scriven defines formative evaluation as ongoing evaluation conducted with the intent to improve, while summative evaluation is conducted after a program is complete, usually for an external audience. The urgency to support young children in their early years of development cannot be overstated. The limited time allotted to each child in this program suggests that ongoing evaluation is needed so necessary changes can be made as the child's needs change. Waiting until the end of the year to evaluate a program that supports a child's emotional, social and cognitive growth could be detrimental both to the child and to the success of the program. The required minimum of four visits during the school year provides the opportunity to evaluate the child's progress and alter the program to support his or her ongoing development; given that, the formative approach seems the obvious choice.

Formative evaluation is a reflective method with three important attributes that support its implementation. First, it allows the opportunity for quick feedback to determine the effectiveness of the program and the strategies in place to meet the learning needs of the child. Second, cumulative documentation provides updated information on the techniques and resources being used and their effectiveness, the challenges encountered and the impacts made, not only at the end of the school year but early and midway through as well. Lastly, formative evaluation supports planning and allows for reconsideration of or recommitment to the plans; it also allows for the re-evaluation of goals and supports future planning and implementation. These three attributes align with the records the funders will need in deciding whether to continue or deny future financial support for a particular child with significant disabilities.

Another reason for choosing this method is the number of evaluation tools available as part of the process. Tools such as interviews, questionnaires, reports and student interaction allow the evidence of learning to be triangulated through observation, conversation and product. With this triangulation of information, the program provider can involve family and caregivers, medical personnel and the education team in gathering the necessary evidence. Involving the entire community of support is a necessary and important piece of a successful learning plan. In conclusion, I would suggest that formative evaluation, as described, offers a timely and effective way to select the right strategies for working with some of the youngest and neediest children.

Sunday, September 20, 2009

Assignment # 1...Program Evaluation Analysis

Summer Tutoring Program for Kids 2003…final report.

The Summer Tutoring Program for Kids 2003 final report was of interest to me because it showed similarities to reports I have seen written during my time working in community schools. Grant writing and lobbying for funding to run programs in community schools during the academic year as well as the summer months are common practice. Program evaluations are always required to justify the funded program and provide information about its delivery and success.

I reviewed the 2003 final report, but for interest's sake I also reviewed the 2008 final report and found the two to be significantly different. I will elaborate later in this assignment.

The 2003 final report can be accessed at:
http://www.gov.pe.ca/photos/original/edu_sum_tutor.pdf

The 2008 final report can be accessed at:
http://www.nald.ca/library/learning/stpk08/stpk-rpt.pdf

Background
Summer Tutoring Program for Kids is an 8-week tutoring opportunity for students in grades 1-6 across Prince Edward Island. The focus is on improving or maintaining reading levels over the summer. It began in 1998 with 5 tutors and 97 students and had grown to 21 tutors, one full-time coordinator and 600 students served in 2003.

The Summer Tutoring Program for Kids 2003 final report is a process-based evaluation that describes how the program has evolved but provides only a vague description of how it operated.

Weaknesses
I found many weaknesses in the 2003 program evaluation that left more questions than answers about the program.
1. Overall the document was vague and provided little detail in answering the expected process-based evaluation questions.
2. One of the objectives was to provide qualified students with summer employment, but how these tutors were deemed qualified was never described. The report stated they received two days of training from consultants and teachers, but what the training involved was omitted. I was left asking whether and how the tutors conducted pre- and post-assessments to determine the learning needs or successes of each child.
3. Details about the services offered to the children were not provided. The general process that the tutors and students went through is not found in this final report.
4. The training and role description for the program coordinator were also not addressed.
5. The only resources mentioned were the public library and levelled books, but how these were utilized is unknown. The author of this evaluation also mentions a tote of supplies and final forms for the school. I would have appreciated knowing what was in the totes and what information was provided to the school on the final forms.
6. If the main reason for this program is to help students build or maintain reading levels over the summer, where is the evidence of that? Perhaps on the final forms that are submitted to schools? Without this data, I question how informed decisions were made about the inclusion or omission of future products and services.

Strengths
Although outweighed by its weaknesses, the report did have strengths.
1. Qualified resource teachers recommended the students for the program. We can assume that these teachers used various assessments and had a complete understanding of each child's needs.
2. The report indicates that a survey was provided to parents and students, and a large part of the report presents feedback from both groups. The surveys included questions about how the clients felt about the program, what they would like to see changed and what they would recommend staying the same.
3. Lastly, the author provides a list of recommendations she deemed necessary for the future. I should note that in the 2008 final report, many of the 2003 recommendations were being enacted.

Conclusion
In reviewing a report such as this, I think there are a couple of key things to consider.
1. A final evaluation is only as good as the author who writes it. Without knowing the author's experience and training, it is difficult to know how effective the program really was. The 2008 final evaluation answers almost all of the questions that the 2003 report did not. Although I felt the program had many gaps, perhaps the gaps were only in the author's writing and level of detail.
2. Given the tremendous growth of the program between 1998 and 2003, we can assume that the Summer Tutoring Program for Kids was effective. I don't doubt the needs of the students, or that one-on-one tutoring can be very effective in supporting student learning.