MISI Calls Summary - Feb 12 & 13
NEXT STEPS:
1. First, if you have not yet answered the survey that asks some quick questions about your institution and your implementation of Sakai, please fill it out as soon as possible: http://lessons.ummu.umich.edu/2k/misi/info
2. Second, based on the comments from the survey, Stephanie and I will be assembling a version of the Core questions. There will be "core" questions as well as "optional" questions in this version. I will send a note out when this version is ready for your comments, likely followed by another round of conference calls.
3. Finally, a few of you had some questions about IRB approval for your institution. This is from our IRB:
Hi Steve - from our IRB's perspective, we would not require seeing IRB approvals from other sites because the study is exempt. So each of your participating institutions should do what their own institution requires. Some international sites might not have an IRB or ethics committee that would review a minimal-risk survey such as yours.
So, please proceed with the IRB or Human Subjects requirements for your institution. If needed, our IRB exemption from Michigan is attached.
- - - - - - - - - - - - - - - - - - - - - - - - -
Conference Call Summary:
Participants: Steve & Stephanie (Michigan), Salwa (Texas State), Robin (Wyoming), Yitna & Stephanie (U Virginia), Jim (Marist), Mary & Jim (Mount Holyoke), Raul & David (Valencia - Spain), Kelly (Stanford), Angela (Rice), Lisa (Berkeley), Stephen (Georgia Tech), Anna & _?_ (Windsor - Canada), Danna (Missouri), Barb (Bradley)
-- my apologies if I missed you!
General Questions
- Yitna asked if institutions should collect user IDs. You should collect user identification information only for the purposes of your own data management (e.g., matching demographic information from the Registrar with the survey) or for survey-related incentives (e.g., prizes). You should NOT send user IDs to Michigan for the combined data analysis.
- Yitna also asked if there was a minimum N or response rate for participation. We have seen published results from online surveys with response rates as low as 10% and some higher than 30%. A good benchmark to shoot for is 20%, but we still want your data no matter what your N or response rate is.
- Stephanie reiterated that while our hope is that each institution will use the "core" questions that we all agree to as much as possible, we are not forcing anyone to do anything, and if you need to change the wording or omit some questions because of your institution's circumstances, that is okay. In the end, this is YOUR survey and you have to decide how to best obtain the information you need for your institution.
- Stephanie also reminded everyone that while we are happy to share our coding schemes, we do not have the resources at Michigan to code your qualitative data. We do hope, however, that you will share your aggregate qualitative findings as well as some examples to help explain the quantitative data.
Review of Core Survey Questions
General
- Danna suggested that we might want to use "Instructional Technology" or "Educational Technology" instead of "Information Technology" in the question prompts, since instructors and students may have a narrow sense of what counts as "technology". Steve mentioned that the UM survey begins with a prompt that specifies what IT means. The group concluded that the specific technology items should perhaps come before the general technology items, so that users understand we mean a wide variety of technologies, not just Sakai and PowerPoint.
Demographic Items
- If you cannot obtain gender, unit, and other demographic information from your list of students & instructors, you may want to add these to your instrument.
- Each institution should customize the list of schools / colleges for their local needs. To ease coding, respondents should be limited to one choice only.
- The student choices do not work for non-US institutions. Steve will rework these choices to reflect 1st year, 2nd year, etc. undergraduate with Sophomore, etc. in parentheses. For example: Second-year undergraduate (e.g., Sophomore). Steve will also add an option for Professional School students (e.g., Law, Nursing, etc.).
General Technology Experience / Proficiency
- For the descriptions of IT use / preference, omit the examples (e.g.....) and just use the scale None - Exclusive. Also, add the word "generally" or "for most of my courses" to the prompt -- see the Alternative Questions Confluence page for options.
- There was some discussion about the "preference" nature of the student IT question, but the final decision was to leave the question prompt as-is.
Specific Technology Experience / Proficiency
- There was a lot of discussion about the list of IT and whether it asks about enough different technologies OUTSIDE of Sakai. One option might be to simply ask "Do you use IT for course-related activities?" and ask the respondent to identify which ones. However, this removes the "how valuable" aspect of the question. We will try a couple of alternatives for comment.
- Laptop - dropped from core questions
- "Majority of courses I teach are conducted..." - change "distance" to "online" for the second option
Benefits of IT:
- Not a lot of discussion on these items. Steve advocated for the bottom-two options as a potential way to identify instructor & student differences in approaches to IT in general.
General Use of Sakai for Courses:
- How many courses - helps identify experience with the system. Could be used as a factor when analyzing findings in other survey items.
- How often visit - add "over the course of the semester" or similar text so users don't just think about the present time.
Specific Use of Sakai:
- The big list of value activities needs headings. Yitna & Robin volunteered to work on this. On Friday, the folks from Windsor suggested "Presenting Materials, Interaction, and Assessment".
- Each institution should customize the list of Sakai tools for their local needs.
Projects:
- Include if project sites available. Provides data for arguments that project sites are useful and important -- also that Sakai is more than just a CMS for courses.
Site Participants:
- Helps teach instructors that they can have non-institution folks on their sites.
- Probably not core questions, however.
Support Questions:
- Too institution-specific -- drop from core questions
Qualitative Questions:
- No real comments. Reminder that each institution is responsible for coding their own qualitative items.