KNOWLEDGE SURVEYS
| Knowledge Surveys: A Tool for All Reasons. Knowledge surveys provide a means to assess changes in specific content learning and intellectual development. More importantly, they promote student learning by improving course organization and planning. For instructors, the tool establishes a high degree of instructional alignment and, if properly used, can ensure that all seven "best practices" are employed during the enactment of the course. Beyond increasing the success of individual courses, knowledge surveys inform curriculum development to better achieve, improve, and document program success. So write Karl R. Wirth (Geology Department, Macalester College, St. Paul, MN) and Dexter Perkins (Department of Geology, University of North Dakota, Grand Forks, ND) in "Knowledge Surveys: An Indispensable Course Design and Assessment Tool." |
| What are knowledge surveys? |
Knowledge surveys are an approach to assessing 1) student preparedness and 2) teaching effectiveness. As described by Nuhfer & Knipp (2003), the surveys consist of numerous questions that together itemize the entire content of a course. When students take the surveys, they are not asked to provide the information the questions require. Rather, they are asked to rate their own confidence with respect to each question. Levels of confidence might include "I could answer this," "I could find the answer to this in ten minutes," "I could not answer this," and so on. Research findings indicate that student responses to the surveys correlate closely with other assessment indicators, such as tests, that require an actual display of knowledge. |
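To make the claimed link between self-reported confidence and actual performance concrete, here is a small illustrative sketch (not from the source): it correlates hypothetical per-student mean confidence ratings (on an assumed 1-3 scale) with hypothetical exam scores using a plain Pearson coefficient. All data, names, and the scale itself are invented for illustration.

```python
# Illustrative sketch: does self-reported survey confidence track test
# performance? Confidence uses a hypothetical 1-3 scale; scores are percentages.

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

# One entry per student: mean survey confidence (1-3) and final exam score (%).
confidence = [1.4, 2.1, 2.8, 1.9, 2.5]
exam_score = [55, 70, 92, 66, 81]

r = pearson(confidence, exam_score)
print(f"correlation between confidence and exam score: r = {r:.2f}")
```

With made-up data like this the correlation is of course arranged to be high; the point is only the shape of the comparison, not the number.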
| How are knowledge surveys useful? |
Developing and using a Knowledge Survey forces the faculty member to create a detailed map of expected knowledge and outcomes, for students and for him- or herself. It is difficult to get anywhere if you don't know where you are going; once armed with a map, however, the student can navigate toward the destination and the faculty member can precisely plot progress on the chart. |
| Are knowledge surveys suited better to some disciplines than to others? |
Any discipline whose course content can be expressed in words, numbers, images, or sounds (and that covers just about every discipline) should be able to benefit from knowledge surveys. The project leaders are developing software to allow this kind of flexibility via surveys that will be hosted online. |
| Does the cognitive level of my survey questions have to follow a distribution like that suggested by the Bloom taxonomy? |
No. The Bloom taxonomy is offered merely as a help and guide. Some courses may demand disproportionate numbers of purely factual questions, or purely analytical ones, or purely evaluative ones. The nature of the course content should dictate the type of question. However, survey-writing is something of an art and science (more familiar to social scientists than to people in the humanities), and Bloom may help those of us who are inexperienced at survey design avoid accidentally emphasizing only one kind of question and thus distorting the actual nature of the knowledge being taught in our classes. |
| What are knowledge surveys NOT? |
Knowledge surveys are not tests. Students are not asked to answer the questions in the survey -- they are asked to assess their own competence to answer the questions if they were to appear on an actual test. Of course, some questions in the survey may also appear on actual tests, but the advantage of surveys is that they cover the whole content of the course, not just a sample of it for grading purposes. Tests tell teachers how students did in the course; knowledge surveys additionally suggest to teachers how teachers did in the course, that is, how well the content got across. |
| Do all sections of a course have to use the same Knowledge Survey Items? |
Knowledge Surveys do not dictate course uniformity (Deans, Division Chairs, and Disciplines may). A common practice on other campuses that use Knowledge Surveys is to share at least a common core of items and then permit individual faculty who teach the same course to include course-specific expected learning outcomes. This facilitates programmatic evaluation without requiring complete course uniformity. |
| How and why will the development and use of a Knowledge Survey improve my student learning outcomes? |
To answer this question, we have included Ed Nuhfer's explanation to the Idaho State University faculty: Better Organization = Better Learning. When we think our organization is clear, students usually do not. Phil Sadler's (1992) Harvard videotape, "Thinking Together: Collaborative Learning in Science," explains how this occurs: "When you learn to teach a subject, just struggling with how to present it, where you're sort of relearning it yourself, that's when students gain the very most from a lecture. Once you've really got it down and you see all these beautiful connections that you didn't see before, you're well beyond the level of the student."

We see our organization; students don't, unless we bring it to their level. One way to bridge the gap is to present our organizational plan completely in writing and to let students engage it at their own pace in ways that promote their learning. The concept behind a knowledge survey is simple. It is a written document constructed through a logic that begins with course goals, then outcomes, fleshed out by what students should be able to do as a result of successfully meeting each outcome. The document discloses the entire course and takes detailed before/after snapshots of students' perceptions of their learning.

If you've taught a course before, the rudiments of your first crude knowledge survey are likely already in your computer. Copy all your quiz, test, and review questions into one large file, in the order you intend to cover those topics. Check whether the result is, in fact, organized so as to cover and make explicit your stated goals and outcomes; if not, make the needed changes and additions. You now have a "monster exam" that covers the entire course. Students don't merely retain it as a study guide; they interact with it and produce a scaled record based on their confidence in their present knowledge.
Students mark an "A" in response to an item if, with present knowledge, they could answer it or perform the skill for test purposes; a "B" if they have partial knowledge/skill or know how to find the information required to answer the question within a short time (say, 20 minutes); or a "C" if they could not presently answer the question for test purposes. You now have the basic idea. Next, go to the Center's web site for examples, details, and a long list of benefits to be gained. The paper there, published last February, represents our experience as of about two years ago. We now know more about how to use these surveys well, and there is certainly much more to be learned.

We have worked with ITRC over the past year to allow a knowledge survey prepared in a word processor to be given to students via WebCT, with the data returned to the professor as an Excel file so that pre/post records of the kind shown on the web site can be produced. We provide workshops on (1) constructing such surveys and (2) getting them up on WebCT, and we are happy to come to any unit or department to present this. But you need not wait: between the web site and what you intend to do for your course, you can construct a "first edition" immediately.

To make the best use of this tool, refer to it often through the course, align your lessons with your plan, and make certain students are using it too. For you, it will provide a detailed record that serves as a reality check on how fitting your plan is. If all goes well, better learning will be the outcome. Even if disaster occurs (you find the plan impractical and have to scrap it), take notes; the detailed record will reveal fully how you can design the course for success the next time. |
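The A/B/C scheme lends itself to simple pre/post tabulation. The following Python sketch is a hypothetical illustration only, assuming letter responses have already been collected per survey item (the WebCT/Excel workflow described above is not reproduced here): it maps A/B/C onto a 3/2/1 scale and reports each item's mean confidence before the course, after it, and the gain.

```python
# Hypothetical sketch: aggregating pre/post knowledge-survey responses.
# Letter responses map to an assumed numeric confidence scale:
#   A = 3 (could answer now), B = 2 (partial / could find quickly),
#   C = 1 (could not answer for test purposes).

CONFIDENCE_SCALE = {"A": 3, "B": 2, "C": 1}

def mean_confidence(responses):
    """Average numeric confidence for one survey item across students."""
    scores = [CONFIDENCE_SCALE[r] for r in responses]
    return sum(scores) / len(scores)

def pre_post_gains(pre, post):
    """Per-item mean confidence before and after the course, plus the gain.

    `pre` and `post` map item labels to lists of letter responses.
    """
    report = {}
    for item in pre:
        before = mean_confidence(pre[item])
        after = mean_confidence(post[item])
        report[item] = (round(before, 2), round(after, 2),
                        round(after - before, 2))
    return report

# Invented example: two survey items, three students.
pre = {"Q1": ["C", "C", "B"], "Q2": ["B", "A", "C"]}
post = {"Q1": ["A", "B", "A"], "Q2": ["A", "A", "B"]}
print(pre_post_gains(pre, post))
```

A per-item gain table like this is one plausible shape for the "before/after snapshots" the text describes; the real survey data would of course have many more items and students.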
| What is the HCC Strategy for the Inclusion of Knowledge Surveys in the Assessment of our SLOs? |
We will develop some pilot Knowledge Surveys this Fall and then launch a few pilot post-tests at the end of the semester. At the beginning of the Spring 06 semester, we plan to launch several pilot pre/post Knowledge Surveys. The findings from the pilot projects will be reviewed to determine the utility and appropriateness of Knowledge Surveys and to inform our decisions about their future use on the campus. You can access more information about Knowledge Surveys in the Workshop/Conference Information section of the Assessment section of the HCC Intranet.
| 11 Steps to develop/utilize a Knowledge Survey |