Sunday, April 8, 2012

Private - Allison Robertson, April 9

Evaluating a technology plan...
     How do you know it is working?
Purposes and stages of formative evaluation of a technology plan.

    The purpose of formative evaluation is to measure progress toward the plan's goals while the plan is in progress, establishing the effectiveness of current programs and making needed changes along the way.
In the first stage, it is necessary to create the assessment instruments and set target levels of progress. The next stage is data collection and review. The final stages include making recommendations for future action and revising the technology plan.
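
As a minimal sketch of what tracking those target levels might look like in practice (the goal names and numbers here are hypothetical placeholders, not values from the plan itself), each goal can carry a target and the measurements collected so far:

```python
# A minimal sketch of formative-evaluation tracking: each goal carries a
# target level and the data points collected so far. Goal names and
# numbers are hypothetical placeholders, not values from the actual plan.
from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    target: float                 # target level of progress (e.g., % proficient)
    measurements: list = field(default_factory=list)

    def record(self, value: float) -> None:
        """Stage two: collect a data point for later review."""
        self.measurements.append(value)

    def on_track(self) -> bool:
        """Compare the latest data point to the target."""
        return bool(self.measurements) and self.measurements[-1] >= self.target

goals = [Goal("Student technology literacy", target=80.0),
         Goal("Parent communication satisfaction", target=75.0)]
goals[0].record(72.5)
goals[1].record(81.0)

for g in goals:
    status = "on track" if g.on_track() else "needs attention"
    print(f"{g.name}: latest {g.measurements[-1]} vs target {g.target} -> {status}")
```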


The technology plan by domains

Teaching with Technology
Student use of Technology 
   According to Garmire and Pearson (2006), assessing technology literacy is very complicated, and developing the proper tools to measure it is equally challenging. The purpose is to measure the effectiveness of efforts to improve technology literacy. The first stage is to determine clearly what is being assessed; they offer six guiding principles, and a small sketch of checking an assessment against the three dimensions in principle 4 follows the list.
  1. Assessments should be designed with a clear purpose in mind. The purpose must be clear to the developers of the assessment, as well as to test takers and the users of the results.
  2. Assessment developers should take into account research findings related to how children and adults learn, including how they learn about technology. Insights into how conceptual understanding of technology develops and the mental processes involved in solving technological problems can help assessment designers construct appropriate items and tasks.
  3. The content of an assessment should be based on rigorously developed learning standards. The knowledge and skills identified in learning standards reflect the judgments of technical experts and experienced educators about the development of technological literacy.
  4. Assessments should provide information about all three dimensions of technological literacy—knowledge, capabilities, and critical thinking and decision making. Meaningful conclusions about the state of technological literacy in the United States must reflect skills and knowledge in all three dimensions.


  5. Assessments should not reflect gender, culture, or socioeconomic bias. Because of the nature of technology, men and women, people from different cultures, and people from different economic backgrounds experience and value technology in different ways. Designers of assessments must take these differences into account to avoid including items and tasks that favor or disadvantage particular groups.


  6. Assessments should be accessible to people with mental or physical disabilities. In keeping with federal and state laws and regulations, assessments of technological literacy must be designed, as much as possible, to allow individuals with mental or physical disabilities to participate.
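
As a purely illustrative sketch (the item names and dimension tags below are hypothetical), a draft assessment blueprint can be checked against the three dimensions in principle 4 before any items are finalized:

```python
# Hypothetical sketch: verify that a draft assessment covers all three
# dimensions of technological literacy named in principle 4. Item names
# and dimension tags are illustrative placeholders.
DIMENSIONS = {"knowledge", "capabilities", "critical thinking and decision making"}

draft_items = [
    ("Explain how a simple circuit works",           "knowledge"),
    ("Troubleshoot a device that will not power on", "capabilities"),
    ("Weigh the trade-offs of two design solutions", "critical thinking and decision making"),
]

covered = {dimension for _, dimension in draft_items}
missing = DIMENSIONS - covered
print("All three dimensions covered." if not missing
      else "Blueprint is missing: " + ", ".join(sorted(missing)))
```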



Management Domain
Communication between school and home.
     This goal is also very complex to measure quantitatively. While parents are aware of what is going on at the school through weekly notes in folders, it is difficult for them to really see the products their children are creating or to participate in the learning dialog and carry the conversation home. A valid assessment would consider both increased parent involvement and increased student achievement resulting from having an authentic audience.
      I would consider administering an open-ended response survey at "Meet the Teacher" night at the start of the school year to establish a baseline of proficiency and satisfaction. Toward the end of the year, a second survey could compare what parents consider the important changes in communication over the year.
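
A minimal sketch of that baseline-versus-follow-up comparison, assuming each survey also includes a few shared 1-5 rating items (the question wording and scores below are hypothetical):

```python
# Hypothetical sketch: compare shared 1-5 rating items between a fall
# baseline parent survey and a spring follow-up. Question wording and
# scores are placeholders, not real survey data.
fall   = {"I know what my child is creating":    [2, 3, 2, 4],
          "Communication from school is timely": [3, 3, 4, 2]}
spring = {"I know what my child is creating":    [4, 4, 3, 5],
          "Communication from school is timely": [4, 3, 4, 4]}

def mean(xs):
    return sum(xs) / len(xs)

for question in fall:
    change = mean(spring[question]) - mean(fall[question])
    print(f"{question}: {mean(fall[question]):.2f} -> "
          f"{mean(spring[question]):.2f} (change {change:+.2f})")
```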




Funding domain
Creation of a technology fund
   Progress can be measured concretely by reviewing the steps in the fundraising plan as they occur and by the funds accrued. I anticipate that the funding goals may be the most challenging, so this goal may need several checkpoints to stay on target. Polling parents and community members about potential events that could be successful will generate a larger base of ideas. Caldwell is among a group of small towns with very popular festivals, and non-profit organizations are given preferential treatment at them. Assessing the success of previous groups' efforts can help narrow the field.
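
As a rough sketch of what those checkpoints could look like (the event names, dates, and dollar amounts below are invented for illustration):

```python
# Hypothetical sketch: track funds accrued against cumulative checkpoint
# targets through the year. Event names and dollar amounts are invented.
checkpoints = [("Fall festival booth", 1000),
               ("Winter raffle",       2500),
               ("Spring fun run",      5000)]   # cumulative targets ($)
reported = [900, 1800, 1200]                    # raised at each event

accrued = 0
for (event, target), raised in zip(checkpoints, reported):
    accrued += raised
    status = "on target" if accrued >= target else f"short by ${target - accrued}"
    print(f"After {event}: ${accrued} accrued vs ${target} target -> {status}")
```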

Instruments to be used.
      There are many possible instruments that can be used to evaluate a technology plan. I view this as a process, with some steps of the plan taking years to implement, so I recommend a rubric approach: a trained observer would review student and teacher products, interview stakeholders, and report to the school's steering committee.
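
A minimal sketch of how an observer's rubric scores might be rolled up for the steering committee (the criteria names and 1-4 scores below are hypothetical):

```python
# Hypothetical sketch: aggregate a trained observer's rubric scores
# (1-4 scale) across reviewed student and teacher products. Criteria
# and scores are illustrative placeholders.
rubric_scores = {
    "Alignment to best practices": [3, 2, 4],
    "Student engagement":          [4, 3, 3],
    "Depth of technology use":     [2, 2, 3],
}

for criterion, scores in rubric_scores.items():
    average = sum(scores) / len(scores)
    print(f"{criterion}: average {average:.1f} of 4 across {len(scores)} products")
```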
     The State of Texas uses the STaR Chart to measure long-term progress over a period of years. Teachers are required to participate in an online survey that gathers information about their own progress and the school's progress toward target technology goals. This method of self-reporting seems somewhat flawed: many districts do extensive coaching, and teachers do not want to report a lack of personal progress. This is evidenced by the numerous presentations available to teachers on the Internet about how to understand the STaR Chart. I suspect many teachers get tired of taking the same long survey and just click through it so that the Technology Department will stop sending them reminder emails.
   I am more inclined to agree with the Milken Family Foundation (famous for the TAP system), which acknowledges that progress of this nature is not measurable by merely taking a self-inventory, but rather by applying a web, or framework, of questions that evaluate in multi-faceted ways; a sketch of tallying such a framework across stakeholders follows the list of dimensions below.

The Milken Foundation suggests this:

Seven Dimensions for Gauging Progress
Asking The Right Questions

1. LEARNERS
Are students using technology in ways that deepen their understanding of academic content and advance their knowledge of the world around them?

2. LEARNING ENVIRONMENTS
Is the learning environment designed to achieve high academic performance by students?

3. PROFESSIONAL COMPETENCY
Are educators fluent with technology and do they effectively use technology to the learning advantage of students?

4. SYSTEM CAPACITY
Is the entire education system reengineering itself to meet the needs of students in this knowledge-based, global society?

5. COMMUNITY CONNECTIONS
Is the school-community relationship one of trust and respect, and is this translating into beneficial, sustainable partnerships in learning technology?

6. TECHNOLOGY CAPACITY
Are there adequate technologies, networks, electronic resources and support to reach the education system’s learning goals?

7. ACCOUNTABILITY
Is there agreement on what success with technology looks like? Are there measures in place to track progress and report results?
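
As a purely illustrative sketch of applying that framework in a multi-faceted way (all scores below are invented), each dimension could be rated on a 1-5 scale from several stakeholder perspectives, with disagreement flagged rather than averaged away:

```python
# Hypothetical sketch: rate each Milken dimension on a 1-5 scale from
# several stakeholder perspectives and examine agreement, not just the
# average. Scores are invented; three of the seven dimensions are shown
# for brevity.
ratings = {
    "Learners":                {"teachers": 3, "observer": 2, "parents": 3},
    "Professional Competency": {"teachers": 4, "observer": 2, "parents": 3},
    "Technology Capacity":     {"teachers": 2, "observer": 2, "parents": 2},
}

for dimension, scores in ratings.items():
    values = list(scores.values())
    average = sum(values) / len(values)
    spread = max(values) - min(values)
    flag = "  <- stakeholders disagree; investigate" if spread >= 2 else ""
    print(f"{dimension}: average {average:.1f}, spread {spread}{flag}")
```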



Data collected for a plan.
  Based on the data collected using the Shannon-Cooper self-assessment and on interviews with parents, students, teachers, tech support personnel, and the principal, the plan suggested in Reports I and II targets the needs of First Baptist School within the context of their funding and principles. As time progresses, it seems logical to check in again with these same groups to be sure the system is moving toward the instructional goals established in the plan. The plan itself calls for a showcase of products, on a monitor in the school and online in a blog. These products can be rated using a rubric to show progress toward effective incorporation into the classroom and inclusion of best practices. As this plan has not yet been approved or implemented, it is premature to collect data.



References

Garmire, E., & Pearson, G. (Eds.). (2006). Tech Tally: Approaches to Assessing Technological Literacy. Washington, DC: The National Academies Press. Retrieved April 8, 2012, from http://www.nap.edu/openbook.php?record_id=11691&page=1

Johnston, J., & Toms-Barker, L. (2002). Assessing the Impact of Technology in Teaching and Learning: A Sourcebook for Evaluators. Ann Arbor, MI: Institute for Social Research, University of Michigan, with funding from the U.S. Department of Education, OERI, Learning Technologies Division.

Milken Exchange on Education Technology. (1998). Technology in American Schools: Seven Dimensions for Gauging Progress - A Policymaker's Guide. Santa Monica, CA: Milken Family Foundation. Retrieved from http://www.mff.org/edtech/projects.taf?_function=detail&Content_uid1=152

6 comments:

  1. Allison,

    I have also wondered about the accuracy of the teacher STaR chart. I feel like some teachers may not accurately describe their abilities because it is one more survey they need to complete at the end of the year. I think they may just click through it to get it over with. However, since one of the ways it can be used is to determine professional development needs, maybe all teachers should take it seriously and try to answer accurately. (Texas Education Agency)

    Reference
    Texas Education Agency. Texas STaR Chart. Retrieved from http://starchart.epsilen.com/history.html

    Replies
    1. If you have ever read the story about the Emperor's New Clothes, then you will understand why teachers click a better score each year....

      Andersen, Hans Christian. The Emperor's New Clothes. http://deoxy.org/emperors.htm

  2. I see the STaR Chart as a valuable assessment tool for several measurable objectives (Leadership, Teaching & Learning, Professional Development, etc.) that make up the tech plan. I think it is fair to question the integrity of its results since it is a self-assessment, but the self-assessment itself serves a valuable purpose in reminding teachers of the objectives and encouraging them to identify professional development opportunities that could help them improve. As you and Monica point out, though, teachers need to take the survey seriously. The Milken Foundation topics are good, but they need to be fleshed out to a level of detail like the STaR Chart's and not left to the teachers' interpretation of progress on each topic.
    Although private schools are not required to follow TEA requirements, I think there are many parts of the STaR Chart that private schools could adopt and benefit from. Any thoughts about that?

    Replies
    1. True about the framework needing to be more detailed--the Foundation does not make the entire framework available without a commitment to the entire system. From my experience with TAP, with 19 areas of assessment graded on a 1-5 scale, I think the system in its entirety has been well thought out.

      Personally, I rated the school on the STaR Chart because it is an easy way to compare across so much diversity in systems.

  3. Allison,

    Very nice job! It's very important to make sure we adopt technology solutions that meet the needs of our users with disabilities. All too often we leave this user group out of the equation. Based on your post, I plan on changing my technology plan to address this very important group.

    Mike

    Replies
    1. I feel frustrated sometimes when I am helping work on machinery in the barn because I have no vocabulary for the tools, and for many of them I have no idea what they can do. I am from a generation of specialized experts (girls cooked, boys built), and it is hard to get beyond this kind of "in a box" thinking. Many of the readings from Dr. Ezell's classes have helped reshape my thinking.

