This page is in an alpha state; we are still figuring out what sections and elements it should contain.

--------------

This page is intended to be a starting point for people who may be evaluating the tool for use at their institution.

  1. Links to working instances of the tool (waiting to hear from rSmart about getting the tool installed on their server)
  2. Known requirements for institutions considering the tool: This is a place for tire-kickers to briefly list major gaps or functional areas missing from the current tool. It is not intended to capture detailed requirements, but rather to help the evaluation team when considering the tool's road map. Note also that features which cannot be controlled should be made configurable or removed so that they do not impose on other institutions; the intent is to prevent upgrades from causing problems for institutions running earlier versions.
  3. Evaluation Tool Newbie FAQ:
    • Who is running the tool? (see the Adopters page)
      • University of Maryland
        • CourseEvalUM website with detailed information on our process for approval on campus. Maryland had two things happening that resulted in deploying this tool: (1) a request from students that prompted a multi-year task force to call for unified items across the university (we were decentralized in terms of course evaluation), and (2) a request from students to deploy the course evaluation online and make the results available online (i.e., a Pick-a-Prof type service).
      • University of Michigan
      • University of Oxford
    • What are the scope and scale of the implementations already underway?
      • University of Maryland has been using the Evaluation Tool university-wide since Spring 2008. Currently, Maryland is using course/group or instructor/evaluatee items, multiple-instructor functionality, and the hierarchy for deploying a single evaluation institution-wide. Two levels of the hierarchy are currently used in the deployment (university and college). Seven colleges/schools have added college-level items (either course/group or instructor/evaluatee, or both).
        • Fall 2008: 1 template, 1 evaluation, 5,928 course/sections (groups), 139,704 possible evaluations
          • Maryland's hierarchy for Fall 2008 (pdf)
          • Maryland's template for Fall 2008 (pdf)
    • How do I learn more / who can answer questions about the tool?
      • Ellen Yu Borkowski, Director, Academic Support, Office of Information Technology, University of Maryland [Office: 301.405.2922; Email: eyb@umd.edu]
  4. Institution-specific documentation (not guaranteed to be fit for any particular use, but offered as a starting point for other institutions considering writing their own docs)
    • Cambridge documentation (pending OK from Hattie)
  5. What is the hierarchy for?
    • Lecturer must use evaluations from above in the hierarchy: This was intended to allow lecturers to opt in or out (as applicable) of surveys run by the University or their Faculty. For example, the University would run a survey, and the lecturers teaching the courses it was assigned to would receive a mail inviting them either to opt in or opt out. However, the functionality to opt in or out via a link in the email does not currently exist.
    • Lecturer may add this number of questions to evaluations from above in the hierarchy before the release date: This is an extension of the functionality above; lecturers may opt in to a University-wide survey and, in addition, add a certain number of their own questions. Again, this functionality does not currently exist. (Text from Hattie) See the sketch below for how these two settings might interact.
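
    The two settings above describe intended behavior rather than a documented API, so the following is only a minimal, hypothetical sketch (in Java, since Sakai is Java-based) of how an inherited parent-node evaluation, an opt-out decision, and a cap on lecturer-added questions might interact. All class, field, and method names here are invented for illustration and are not part of the Evaluation System.

        import java.time.LocalDate;
        import java.util.ArrayList;
        import java.util.List;

        // Hypothetical illustration only: none of these names come from the Evaluation System.
        public class HierarchySettingsSketch {

            // Setting 1: "Lecturer must use evaluations from above in the hierarchy".
            // If true, the lecturer cannot opt out of the university-wide survey.
            static final boolean MUST_USE_PARENT_EVALUATION = false;

            // Setting 2: "Lecturer may add this number of questions ... before the release date".
            static final int MAX_LECTURER_QUESTIONS = 3;

            public static void main(String[] args) {
                LocalDate releaseDate = LocalDate.now().plusDays(7); // arbitrary future release date
                boolean lecturerOptsOut = false;                     // would come from the opt-in/opt-out email link

                // Questions inherited from the parent (university/college) node.
                List<String> questions = new ArrayList<>(List.of(
                        "University-wide item 1",
                        "University-wide item 2"));

                if (!MUST_USE_PARENT_EVALUATION && lecturerOptsOut) {
                    System.out.println("Lecturer opted out; no evaluation assigned.");
                    return;
                }

                // Lecturer-added items are accepted only before the release date and up to the cap.
                String[] lecturerItems = {"My item A", "My item B", "My item C", "My item D"};
                int added = 0;
                for (String item : lecturerItems) {
                    if (LocalDate.now().isBefore(releaseDate) && added < MAX_LECTURER_QUESTIONS) {
                        questions.add(item);
                        added++;
                    }
                }

                System.out.println("Questions deployed to the lecturer's courses: " + questions);
            }
        }

    In the real tool, such checks would presumably be enforced server-side when the survey is assembled from the hierarchy; the sketch simply makes the described opt-out and question-cap rules concrete.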