Sunday, November 20, 2011

CATE OBPE

This week I began reading Dynamic Youth Services through Outcome-Based Planning and Evaluation, an ALA publication on the CATE OBPE programming model. As I'm reading, I'm trying to see if this type of outcome-based planning has been utilized in my own local library, in addition to considering how I could apply this model for my final project. Presently, I'm about two-thirds of the way through the book, and have thus far discovered many connections to material that I've learned in library school.

One such connection concerns how outcomes are measured, discussed on page 9: "One important difference between inputs and outputs, on the one hand, and outcomes, on the other, is that inputs and outputs are most frequently measured quantitatively, whereas outcome measures are sometimes quantitative but often qualitative." I happily discovered that I immediately understood the differences between the two types of measurements and the two types of research design methods. Reading the Creswell book, as well as published literature on research design, gave me a comprehensive understanding of qualitative and quantitative methods. Sometimes it is only when one is 'tested' in the real world that one can truly know whether one understood the material. That is definitely how I feel at this point.

The most interesting point that relates to the course material on research design is how closely the interactive OBPE model corresponds to action research. The Project CATE Outcome-Based Planning and Evaluation Model is very similar to action research in that it is continuously re-evaluated and evolves over time. When I think of action research, I find it most helpful to recall the spiral diagram and the article I read on the subject: Gordon, C. A. (2008). A never-ending story: Action research meets summer reading. Knowledge Quest, 37(2), 64-41. That piece showed me how action research works in practice, and it is helping me understand the ever-evolving OBPE model as well. Once a program is implemented based on the CATE OBPE model, it requires reassessment, and thereby remodeling, as time goes on. The factors that shaped the program's creation may change over time, necessitating that reassessment.

Another aspect I found fascinating relates to youth and parental permission, which coincides with the recent reading in Representing Youth. The authors of Dynamic Youth Services discuss how a parent or guardian needs to give permission for a child to participate in a study. This is, of course, the idea of potential gatekeepers at play, as well as what we covered in our IRB training. What the authors emphasized, however, was that this permission only needs to be given for an "extraordinary evaluation study". Children "filled out program evaluations and voluntarily answered questions about how they used the library as part of the library's ordinary responsibility of getting input from the public" without any consent being required. Yet when children were videotaped, even though the videos were shown only to library staff, parental consent was necessary. Somehow, the informal atmosphere of the first method of information gathering did not qualify as an "extraordinary evaluation study", and therefore did not require informed parental consent. I was previously unaware that such a distinction existed, and I wonder how the library came to this conclusion. That said, I do see the merits of informal evaluations not requiring parental permission.

The last connection I discovered while reading this book on the OBPE model was how much my 204 management class would come into play if I were spearheading such a program. Although the professor I had for 204 unfortunately did not connect much of our material to libraries, I can now see how to apply that management knowledge to creating and implementing a CATE OBPE type of program. For instance, page 54 raises the topic of "developing a culture of evaluation" as a prerequisite for even beginning an Outcome-Based Planning and Evaluation project. One suggestion the authors present to help staff feel comfortable with evaluation is to begin with a Level I project. Such a project is simple to implement and easy to understand, and it will therefore bolster confidence among workers and inspire them to attempt something more difficult, such as a Level II or Level III project. A good manager understands the need to get one's employees on board, and this is such a case. Another idea the book offers for reinforcing staff confidence is to share previously published literature on successful CATE OBPE projects at other libraries.
