Reporting Strategies

There are multiple strategies being employed for reporting on portfolio data at various institutions. There is some interest in describing them in one place, to observe where there are commonalities or differences between them, and to improve general awareness of what is working in practice. Anyone who has specific reporting needs or is doing specific work on reporting should feel free to add an account of their project here.

These may end up on multiple pages, depending on the amount of detail included. Initially, some basic information about what kind of reports are needed and the general approach being employed would be helpful. Ideally, we will have a clear picture of the types of data being used, how it is being extracted, what summary or analysis is applied, and the tools in use.

Indiana University

Standardized evaluation form elements for generating summary and detailed reports on assessment results.

See: Solving the OSP Reporting Conundrum  (Presentation from Boston 2009 Meeting)

University of Michigan

Custom online extraction, XSL, and Javascript
See: Form Data Extraction and Summary Data Presentation | SAK-13476

Pentaho Sakai Integration

Serensoft

How about being able to report on your form data – evaluation forms, rubrics, feedback, etc – using any standard off-the-shelf SQL-based reporting product?

Normally, all data that users put into forms is stored in XML files in resources (usually on the file system). This makes it unavailable to most off-the-shelf reporting tools (and very difficult for even the Sakai reporting tool). A new tool is being developed that will piggyback on the existing data warehousing infrastructure to pull data out of those XML files and store it in database tables, freeing the data for reporting by a variety of off-the-shelf tools (examples include Crystal Reports, Hyperion, Cognos, or the Sakai reports tool).
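The core of that extraction step can be sketched in a few lines: parse each stored form-response XML document and flatten its leaf elements into a generic name/value table that any SQL-based tool can query. This is a minimal illustration, not the actual tool; the table name, column names, and sample XML below are all hypothetical, and sqlite3 stands in for whatever database the warehouse actually uses.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Illustrative schema (not the real warehouse tables): one row per
# form field, keyed by the response it came from.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE form_field (
        response_id TEXT,
        field_name  TEXT,
        field_value TEXT
    )
""")

def load_form_xml(response_id: str, xml_text: str) -> None:
    """Flatten one form-response XML document: every leaf element
    with text becomes a (response_id, tag, value) row."""
    root = ET.fromstring(xml_text)
    for elem in root.iter():
        if len(elem) == 0 and elem.text and elem.text.strip():
            conn.execute(
                "INSERT INTO form_field VALUES (?, ?, ?)",
                (response_id, elem.tag, elem.text.strip()),
            )

# Hypothetical stored form response, as it might sit in resources.
sample = "<evaluation><score>4</score><comments>Strong reflection</comments></evaluation>"
load_form_xml("resp-001", sample)

rows = conn.execute(
    "SELECT field_name, field_value FROM form_field WHERE response_id = ?",
    ("resp-001",),
).fetchall()
print(rows)  # [('score', '4'), ('comments', 'Strong reflection')]
```

Once the fields are in rows like this, any reporting product that can speak SQL can summarize them.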

See also:

  • SlideShare: Reporting On Your Xml Field Data
  • PowerPoint: Reporting On Your Xml Field Data

    Change your WHERE clause, not your JOINs

    With the Serensoft approach, the schema for all "parsed" forms is broken up into a handful of tables, such that the joins will be (basically) the same for all queries – the part that changes will be in your "where" clause, to limit it to certain worksites, certain matrices, certain forms, certain fields...
    Here's a sample join, when incorporating data-warehouse with "live" database tables:

    Serensoft Reporting Gizmo – Sample SQL
    from
        `osp_review` OSP_REVIEW
            join
        `metaobj_form_def` METAOBJ
            on OSP_REVIEW.`review_device_id` = METAOBJ.`id`
            join
        `osptool_form_response` FORM_RESP
            on FORM_RESP.`reviewId` = OSP_REVIEW.`id`
            join
        `osptool_root_formitem_response` ROOT
            on ROOT.`responseId` = FORM_RESP.`id`
            join
        `osptool_formitem_response` FORMITEM
            on ROOT.`responseItemId` = FORMITEM.`parentId`
            join
        `osptool_long_formitem_response` LONGITEM
            on FORMITEM.`id` = LONGITEM.`responseItemId`
            and FORMITEM.`responseItemType` = 'LONG'
            join
        `SAKAI_USER` USER
            on USER.`USER_ID` = FORM_RESP.`userId`
            join
        `SAKAI_USER_ID_MAP` USERIDMAP
            on USER.`USER_ID` = USERIDMAP.`USER_ID`
            join
        `osptool_formitem_response` PARENT
            on PARENT.`id` = FORMITEM.`parentId`
    where
        METAOBJ.`description` = 'certain-form-name' or
        USERIDMAP.`EID` = 'certain-user-name' -- etc
    

    In this example, different forms or different users can be reported upon, based on different values in the "where" clause... leaving the joins as-is. Very handy!
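The same pattern can be shown end to end with a toy two-table schema (not the real OSP/metaobj tables above): write the join text once, and have each report supply only a different, parameterized filter. Everything here — table names, sample rows, the `report` helper — is illustrative, with sqlite3 standing in for the production database.

```python
import sqlite3

# Toy schema standing in for the form-definition and form-response tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE form_def (id INTEGER PRIMARY KEY, description TEXT);
    CREATE TABLE form_response (id INTEGER PRIMARY KEY, form_id INTEGER, user_eid TEXT);
    INSERT INTO form_def VALUES (1, 'midterm-eval'), (2, 'final-eval');
    INSERT INTO form_response VALUES (10, 1, 'alice'), (11, 2, 'alice'), (12, 1, 'bob');
""")

# The join is written once and shared by every report; only the
# WHERE clause varies per query.
BASE_QUERY = """
    SELECT r.id, d.description, r.user_eid
    FROM form_response r
    JOIN form_def d ON d.id = r.form_id
    WHERE {filter}
"""

def report(filter_sql: str, params: tuple) -> list:
    """Run the fixed join with a caller-supplied, parameterized filter."""
    return conn.execute(BASE_QUERY.format(filter=filter_sql), params).fetchall()

by_form = report("d.description = ?", ("midterm-eval",))
by_user = report("r.user_eid = ?", ("bob",))
print(by_form)  # [(10, 'midterm-eval', 'alice'), (12, 'midterm-eval', 'bob')]
print(by_user)  # [(12, 'midterm-eval', 'bob')]
```

Restricting to a certain worksite, matrix, or field works the same way: add another condition to the filter, never touch the joins.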

Virginia Tech

I'm happy to start the conversation about how we might begin assessing reporting needs and the multiple tools that are currently under development, and I can speak specifically from Virginia Tech's perspective.  We currently have 55 projects (utilizing a combination of matrix and/or presentation tools) spanning our entire university, and this number is growing at a rapid rate.  Some of our faculty are perfectly happy using matrices as a central place for electronically storing student materials and reflections on those materials.  These places offer useful spaces for organizing and maintaining materials, as well as guiding students to reflect on their learning over time. Most of these faculty are folks who have to read large quantities of student submissions, and they typically download a random sample of these materials and perform their assessment, and consequent data analysis, outside of the instance of Sakai.

However, we have a growing need for faculty and administrators to pull information, and data, out of these matrices.  The following is a list of some of the requests we hear most frequently from our faculty and administrators. I am also including some of the specific eP needs that Marc and I have identified:

  • syncing of rosters with portfolio sites (I realize this is currently under discussion)
  • summaries showing which users have submitted items in their matrices
  • easier interface for navigating matrices
  • easier interface for submitting evaluations
  • ability to export summaries as Excel or .csv files
  • ability to export evaluation data as Excel or .csv files

I realize some of these issues/requests are probably addressed in Sakai 2.6/2.7, but we are still running a version of 2.5 and will probably be on that instance for quite some time. That said, we would like to work with folks to implement these tools in our instance.

I wonder if it would be useful for other institutions to join in creating a list of behaviors or functions that are needed and then perhaps we could compare the three reporting strategies listed here to see how they meet the different functional needs?

I hope this is a helpful way to get this conversation started.