01_28_2009 EvalSys Team Meeting (Sakai Bridge 002)

Objective: Provide a communication opportunity for members of the Sakai community who are interested in the Course Evaluation tool

Agenda items:

  1. Dinking around with audio, etc. until everyone can hear (5 minutes)
  2. Results of recent production and pilot usage (45 minutes)
    1. UMD (Ellen)
      1. Implementation overview: an external Group Provider feeds course, instructor, and enrollment data from our proprietary SIS; templates and evaluations are defined in Sakai; delivery is in Sakai; reporting is in an external proprietary system; using both multiple instructors and a two-level hierarchy (university and college) in deployment
      2. Deployment strategy: evaluation officially open two weeks from study day (the day before final exams begin); soft opening the day before the official open date
      3. F08 deployment numbers: 5,921 course sections, 139,644 possible evaluations, 7 colleges/schools added items
        1. Have been in production since Fall 2007 semester
      4. Outcomes
        1. Overall response rate: 61% of evaluations
        2. Email process: captured the "evaluation available" email from Sakai and sent out our own megamail note from the Provost so students would receive a single copy; reminder messages were sent via the Sakai tool, one email per evaluation not completed
        3. Submission pattern
        4. What went well
          1. No performance issues
          2. Small number of Help Desk tickets
          3. Communications out to student body - fliers, link on student portal, student newspaper ad, email from Provost to all students
        5. Problems encountered
          1. Discovered six courses that appeared in two different nodes in the hierarchy because these courses had multiple "homes" in the university data warehouse (a validation sketch follows at the end of this UMD section)
            1. This caused an error in the generation of the email reminder messages and discrepancies in our return-rate numbers
            2. This did not cause any problems with distributing the evaluation - the hierarchy node that appeared first in the tree "won" and its question set was distributed (i.e. students saw only one evaluation, with the items of the first college to appear in the hierarchy)
            3. We did have to address this issue when exporting the data for reporting
      5. Plans for 2009
        1. In process of hiring an additional Java developer for this project
        2. Upgrade to Sakai 2.5
        3. Update to current EvalSys code - hope to look at incorporating TA category items
        4. Still prioritizing functionality list with project owner (Office of Institutional Research, Planning, and Assessment)
        5. Would like to focus on completing the UIs for delegating administration of items to lower levels of the hierarchy (some initial drafts are here)
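
      A minimal sketch of the kind of validation pass that would have caught the duplicate-node problem above before launch: scan the (course, hierarchy node) assignments coming out of the data warehouse feed and flag any course that appears under more than one node. The class and identifiers here (HierarchyAssignmentCheck, the courseId/nodeId pairs) are illustrative, not actual EvalSys or UMD code:

        import java.util.*;

        /** Illustrative pre-launch check: flag courses assigned to more than one hierarchy node. */
        public class HierarchyAssignmentCheck {

            /** Each assignment is a {courseId, nodeId} pair from the data warehouse feed. */
            public static Map<String, Set<String>> findDuplicates(List<String[]> assignments) {
                Map<String, Set<String>> nodesByCourse = new HashMap<String, Set<String>>();
                for (String[] a : assignments) {
                    Set<String> nodes = nodesByCourse.get(a[0]);
                    if (nodes == null) {
                        nodes = new TreeSet<String>();
                        nodesByCourse.put(a[0], nodes);
                    }
                    nodes.add(a[1]);
                }
                // Keep only courses that appear under two or more hierarchy nodes.
                Iterator<Map.Entry<String, Set<String>>> it = nodesByCourse.entrySet().iterator();
                while (it.hasNext()) {
                    if (it.next().getValue().size() < 2) {
                        it.remove();
                    }
                }
                return nodesByCourse;
            }

            public static void main(String[] args) {
                List<String[]> feed = Arrays.asList(
                        new String[] {"BMGT110", "college-business"},
                        new String[] {"BMGT110", "college-engineering"},  // a second "home"
                        new String[] {"ENGL101", "college-arts"});
                for (Map.Entry<String, Set<String>> e : findDuplicates(feed).entrySet()) {
                    System.out.println(e.getKey() + " appears under " + e.getValue());
                }
            }
        }

      Run against each term's feed before the evaluation opens, a check like this would surface multi-home courses early enough to correct the warehouse data or pick a single node deliberately.
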
    2. U-M (Sean & Dick)
      1. Implementation overview: Defining templates & evals in PeopleSoft system, delivering in Sakai, reporting in PeopleSoft
      2. F08 deployment numbers: approx. 8600 courses, 35,000 students, 200,000 possible submissions
        1. Have been in Production since Sept with weekly loads
      3. Outcomes
        1. Overall response rates: 72+% of students; 62% of evaluations
        2. Paper vs. online comparison; F08 details by Group, by Category
        3. Single email process and submission pattern (http://tinyurl.com/5ts4bq)
        4. What went well
          1. Communication Team effort (MOTD ads, bus ads, etc.); daily status reporting
          2. Help Desk tickets & um_evaluations
          3. No usage load problems
        5. What didn't go so well
          1. Two-stage deployment (most problems in the second stage, with late eval orders)
          2. First email run failed due to query length; subsequent failures due to bad data, with a resulting scramble (see the batching sketch at the end of this section)
          3. Problems with bad data
          4. Reporting performance issues and shifting expectations
      4. Non-technical Issues
        1. Students: anonymity, data recipients, timing concerns
        2. Faculty: response rates, rating favorability, data recipients ("administration going straight to students")
      5. Outstanding requirements/continuing work plans for W09 and beyond
        1. Email receipt ("extra credit" use case)
        2. Faculty & staff preview of evaluations
        3. In-process response rate reporting
        4. Paging in admin view (had to turn off widget)
        5. Incremental commits to avoid rollbacks during data syncing (see the sketch at the end of this section) & other system robustness items
        6. Comment length limits (data syncing)
      6. Major outstanding challenges
        1. Data interface is killing us (data synchronization, data validation, Change Orders)
        2. Shifting requirements & limited resources
        3. Perpetual Production mode (Weekly loads, Feb 12-18 Midterms, April 12-21 Finals)
        4. Need to hand process off to business owner
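
      The query-length failure and the incremental-commit item above are both batch-size problems, so one sketch can cover them: chunk the id list so no single IN clause exceeds the database's limit (Oracle caps IN lists at 1,000 entries, for example), and commit after each chunk so one bad record rolls back at most one chunk rather than the whole run. The table and method names here (eval_assignments, sendReminders) are hypothetical, not the actual EvalSys schema:

        import java.sql.*;
        import java.util.*;

        /** Illustrative batching: bounded IN clauses and per-chunk commits. */
        public class ReminderBatcher {

            private static final int CHUNK_SIZE = 500; // stay well under e.g. Oracle's 1,000-item IN limit

            public static void sendReminders(Connection conn, List<String> userIds) throws SQLException {
                conn.setAutoCommit(false);
                for (int i = 0; i < userIds.size(); i += CHUNK_SIZE) {
                    List<String> chunk = userIds.subList(i, Math.min(i + CHUNK_SIZE, userIds.size()));
                    // Build "?,?,...,?" with one placeholder per id in this chunk.
                    StringBuilder in = new StringBuilder();
                    for (int j = 0; j < chunk.size(); j++) {
                        in.append(j == 0 ? "?" : ",?");
                    }
                    // eval_assignments is a hypothetical table of uncompleted evaluations.
                    PreparedStatement ps = conn.prepareStatement(
                            "SELECT user_id, email FROM eval_assignments WHERE user_id IN (" + in + ")");
                    try {
                        for (int j = 0; j < chunk.size(); j++) {
                            ps.setString(j + 1, chunk.get(j));
                        }
                        ResultSet rs = ps.executeQuery();
                        while (rs.next()) {
                            // queue one reminder per row here (delivery code omitted)
                        }
                    } finally {
                        ps.close();
                    }
                    // Commit per chunk: a failure loses at most one chunk, not the whole run.
                    conn.commit();
                }
            }
        }

      The same chunk-and-commit pattern applies to the weekly data loads: log and skip (or retry) the offending chunk on failure rather than rolling the entire sync back.
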
    3. UCT (Stephen)
      1. Pilots
        1. Ran two carefully managed pilots in Mar/Apr 08 and Aug/Sep 08 - around 30 courses
        2. Response rates from 30-50%.
        3. Currently a wide range of course evaluation formats and delivery mechanisms on campus (paper, T&Q, other systems)
        4. The pilots also introduced a more standardized course evaluation format which will be promoted campus-wide
        5. Final decisions about course evaluations are currently taken at the course or department level, i.e. there is no mandated university-wide process.
      2. Current work
        1. We are focusing on a self-service model that will enable course convenors (instructors) or Department admin staff to create and manage course evaluations for their courses, typically starting with a standard UCT template.
        2. We expect this to meet around 80% of course evaluation needs in 2009
        3. The other 20% are courses that don't want to run online evaluations at all, or where the course or teaching structure is complex and the evaluation system (questionnaire format) cannot model it adequately.
      3. 2009 plans
        1. Integrate the template editing UI work into trunk and QA: EVALSYS-633
        2. Complete a set of functionality changes (working with Aaron) and QA: EVALSYS-634
        3. Release a new production build to UCT by end of Feb 09
  3. Plans for Branch integration (5 minutes)
    1. Prework for conference?
    2. Consensus to hold the next conference call in early-to-mid May to share results of Winter semester usage and to plan for preconference and conference working sessions.
  4. Conference presentation of findings (including W09 updates?) (5 minutes)
  5. Next Steps/ Action Items (added by Kirk Alexander)
    1. UC Berkeley and UC Davis would like to hear how this tool has been accepted by faculty, as well as input on the various approaches (batch vs. provider) for getting course information and/or hierarchy into the tool.
    2. Are each institution's contributions to the tool available in the foundation SVN, or just locally? Documentation here on this would be helpful
      1. UCT: We have used branches (in contrib/evalsys or contrib/uct) for our local production versions plus local changes, then merged the changes to trunk in due course. The current UCT production branch (slightly old) is https://source.sakaiproject.org/contrib/uct/evaluation-pilot08/ and the outstanding work on the template editing UI, not yet merged into trunk (though we hope to do this imminently), is https://source.sakaiproject.org/contrib/uct/evaluation-pilot08-UI/
    3. Would be nice to get an update from Cambridge on what they're doing... anyone from Cambridge available to update us?