Faculty weren't ready for matrix collection and review
Students weren't ready for matrix submission and feedback
Raw program (site) management, covering matrix structure, instructors, and students, is difficult and carries real administrative overhead
Detail data must be understood in terms of the specific business case it fulfills before reports built on it become useful
Developers and implementers cannot necessarily conceive, on their own, of the reports that are significant to a program
Faculty and program personnel without an assessment or statistical background have trouble defining these significant reports in a way that is functional and insulated from change
What GMT is Solving at Saginaw
Provides a way to "bring it to" faculty and students through assignments (goal awareness in a familiar interaction model)
Mitigates confusion for both students and faculty
Reduces various maintenance, load, and overhead concerns with large program sites
Bundling activities into NCATE or other "assessments" (termed "assessment tools" in the new data model)
Need for wrapper/descriptor around activities
This descriptor is essentially the purpose or application of the activity (why are we assessing? what are we assessing? what competencies are we aiming for?); see the sketch below
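A minimal sketch of what such a wrapper/descriptor might look like, assuming a simple Java object; the class, field, and ID names here are illustrative guesses, not the actual GMT data model:

    // Hypothetical wrapper/descriptor around activities; names are illustrative only.
    import java.util.ArrayList;
    import java.util.List;

    public class AssessmentTool {
        private final String title;                                  // e.g., an NCATE assessment
        private final String purpose;                                // why are we assessing?
        private final List<String> goalIds = new ArrayList<>();      // competencies targeted
        private final List<String> activityIds = new ArrayList<>();  // wrapped activities

        public AssessmentTool(String title, String purpose) {
            this.title = title;
            this.purpose = purpose;
        }

        public void addGoal(String goalId) { goalIds.add(goalId); }
        public void addActivity(String activityId) { activityIds.add(activityId); }

        public String describe() {
            return title + ": " + purpose + " (goals=" + goalIds + ", activities=" + activityIds + ")";
        }

        public static void main(String[] args) {
            AssessmentTool tool = new AssessmentTool("NCATE Standard 1", "Assess candidate content knowledge");
            tool.addGoal("goal-content-knowledge");
            tool.addActivity("assignment-unit-plan");
            System.out.println(tool.describe());
        }
    }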
Staging prescribed, reusable assignments with attached goals
Master (or template) assignments live in a program site with real assignment instructions, etc.
Deployment of these activities into specific courses on a recurring, low-maintenance basis; see the sketch below
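A rough sketch of that deployment step under these assumptions; TemplateAssignment, copyFor, and deploy are hypothetical names used only to illustrate the master-copy flow, not the GMT API:

    // Illustrative only: copies a master/template assignment from the program site
    // into course sites, preserving the attached goals.
    import java.util.List;

    public class AssignmentDeployer {

        record TemplateAssignment(String title, String instructions, List<String> goalIds, String siteId) {
            TemplateAssignment copyFor(String targetSiteId) {
                // Goal links travel with the copy so ratings can roll up to the program.
                return new TemplateAssignment(title, instructions, goalIds, targetSiteId);
            }
        }

        static void deploy(TemplateAssignment master, List<String> courseSiteIds) {
            for (String siteId : courseSiteIds) {
                TemplateAssignment copy = master.copyFor(siteId);
                // A real tool would persist the copy and schedule recurrence; here we just log the intent.
                System.out.println("Deployed '" + copy.title() + "' to site " + siteId);
            }
        }

        public static void main(String[] args) {
            TemplateAssignment master = new TemplateAssignment(
                    "Unit Plan", "Submit a complete unit plan.", List.of("goal-planning-1"), "program-site");
            deploy(master, List.of("course-101-f24", "course-101-w25"));
        }
    }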
Can grade assignments (i.e., for individual assessment within a course) and rate assignments against a rubric (for program assessment); see the sketch below
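To make the grade/rate distinction concrete, a hedged sketch follows; both record shapes are assumptions for discussion, not the GMT schema:

    // Illustrative contrast between a course-level grade and a program-level
    // rubric rating on the same submission. Hypothetical shapes only.
    public class GradeVsRating {
        record Grade(String submissionId, double points, double pointsPossible) {}
        record RubricRating(String submissionId, String goalId, String criterion, int level) {}

        public static void main(String[] args) {
            // One submission can carry an individual grade for the course...
            Grade grade = new Grade("sub-42", 18, 20);
            // ...and one or more ratings against program rubric criteria.
            RubricRating rating = new RubricRating("sub-42", "goal-literacy-1", "Evidence of practice", 3);
            System.out.println(grade);
            System.out.println(rating);
        }
    }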
Reports can be tabulated against the assessments "down to" the individual ratings, rather than trying to extrapolate programmatic meaning from scattered goal attachments and ratings that lack hierarchical program mapping; see the sketch below.
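A minimal sketch of tabulating "down to" the ratings, assuming the hypothetical RubricRating shape above; the roll-up here is a plain average per goal, chosen only for illustration:

    // Illustrative roll-up of rubric ratings by goal for a program report.
    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;

    public class RatingReport {
        record RubricRating(String submissionId, String goalId, String criterion, int level) {}

        // Average rating level per goal across all rated submissions.
        static Map<String, Double> averageByGoal(List<RubricRating> ratings) {
            return ratings.stream().collect(Collectors.groupingBy(
                    RubricRating::goalId,
                    Collectors.averagingInt(RubricRating::level)));
        }

        public static void main(String[] args) {
            List<RubricRating> ratings = List.of(
                    new RubricRating("sub-1", "goal-literacy-1", "Evidence of practice", 3),
                    new RubricRating("sub-2", "goal-literacy-1", "Evidence of practice", 2),
                    new RubricRating("sub-1", "goal-assessment-2", "Uses data to inform instruction", 4));
            System.out.println(averageByGoal(ratings));
        }
    }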