Publication Search Results

Search returned 65 results using Keyword: "Rubrics"



1. 2009. Peer Review Vol. 11, No. 1: Assessing Learning Outcomes: Lessons from the AAC&U's VALUE Project.
This edition of Peer Review, AAC&U's quarterly publication on noteworthy trends and debates within undergraduate education, addresses the development and use of emerging assessment approaches, including rubrics and e-portfolios, to assess learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
2. RCampus iRubric.
iRubric is a comprehensive rubric development, assessment, and sharing tool. Designed from the ground up, iRubric supports a variety of applications in an easy-to-use package. Best of all, iRubric is free to individual faculty and students. iRubric School-Edition empowers schools with an easy-to-use system for monitoring student learning outcomes and aligning with standards.
Link to Full Text | Show Similar Items | Show Associated Keywords
3. RubiStar.
RubiStar is a free online tool, developed through the Advanced Learning Technologies (ALTEC) project at the University of Kansas Center for Research on Learning, that helps visitors create rubrics.
Link to Full Text | Show Similar Items | Show Associated Keywords
4. Valid Assessment of Learning in Undergraduate Education (VALUE).
The VALUE project seeks to contribute to the national dialogue on the assessment of college student learning. It builds on a philosophy of learning assessment that privileges authentic assessment of student work, and a shared understanding of student learning outcomes on campuses, over reliance on standardized tests administered to samples of students outside of their required courses. This philosophy has resulted in the collaborative development of 15 rubrics by teams of faculty and academic professionals on campuses from across the country.
Link to Full Text | Show Similar Items | Show Associated Keywords
5. Winona State University sample rubrics.
This site is a compilation of sample rubrics collected from several colleges and universities divided by discipline and/or learning outcome.
Link to Full Text | Show Similar Items | Show Associated Keywords
6. Albertine, S. and Rhodes, T. 2012. Show me the learning.
This is a PowerPoint presentation by Susan Albertine and Terrel Rhodes about the AAC&U VALUE project and the use of VALUE rubrics.
Link to Full Text | Show Similar Items | Show Associated Keywords
7. Ammons, J. L., & Mills, S. K. 2005. Course-embedded assessments for evaluating cross-functional integration and improving the teaching-learning process.
This paper offers a case study of the process of defining a competency, specifying intended learning outcomes, selecting course-embedded assessment methods, evaluating the results, and using that information to guide changes in the teaching-learning process.
Link to Full Text | Show Similar Items | Show Associated Keywords
8. Association of American Colleges and Universities. 2009. Assessing learning outcomes: Lessons from AAC&U’s VALUE project.
The entire Winter 2009 edition of Peer Review addresses the VALUE project. Information presented includes an overview of the project, information on e-portfolios, application of rubrics, assessment process, and the use of assessment results for improvements.
Link to Full Text | Show Similar Items | Show Associated Keywords
9. Association of American Colleges and Universities. 2011, Fall/2012, Winter. Assessing liberal education outcomes using VALUE rubrics.
The use of AAC&U's VALUE Rubrics to assess student learning at colleges and universities around the nation is the central focus of this issue.
Link to Full Text | Show Similar Items | Show Associated Keywords
10. Association of American Colleges and Universities. 2002. Greater expectations: A new vision for learning as the nation goes to college.
This article provides an overview of the Greater Expectations Initiative, conducted by AAC&U from 2000 to 2006, which "articulated the aims and purposes of a twenty-first century liberal education and identified innovative models that improve campus practices and learning for all undergraduate students, and advocated for a comprehensive approach to reform." The results of this project helped to formulate AAC&U's current LEAP initiative.
Link to Full Text | Show Similar Items | Show Associated Keywords
11. Association of American Colleges and Universities. 2007. Rising to the challenge: Meaningful assessment of student learning.
This article provides an overview of VALUE, or Valid Assessment of Learning in Undergraduate Education, "a national project to advance our understanding around assessment of student learning outcomes."
Link to Full Text | Show Similar Items | Show Associated Keywords
12. Association of American Colleges and Universities. VALUE: Valid Assessment of Learning in Undergraduate Education.
AAC&U outlines 15 Essential Liberal Learning Outcomes. The rubrics were developed to help create a shared understanding of student learning at colleges and universities across the country.
Link to Full Text | Show Similar Items | Show Associated Keywords
13. Bailie, F., Marion, B. & Whitfield, D. 2010. How rubrics that measure outcomes can complete the assessment loop.
Program assessment of student learning includes the following steps: 1) involving all constituents to establish program goals, 2) developing measurable student learning outcomes for each of the goals, 3) developing measurable outcomes for each course that map to the student learning outcomes, 4) determining appropriate assessment methods in the courses, 5) creating assessment instruments (or rubrics) for each of the methods, 6) establishing benchmarks, 7) analyzing the data, and 8) using the results to improve student learning. This paper focuses on the last four steps by beginning with a generalized assessment plan for an undergraduate computer science program. A generalized rubric for computer programs is presented that measures selected student learning outcomes. This paper demonstrates how to apply the generalized rubric to two specific computer programming assignments. Benchmarks associated with the rubrics are suggested. Sample results are analyzed to identify problems and propose solutions, "closing the loop."
Link to Full Text | Show Similar Items | Show Associated Keywords
14. Banta, T.W., Griffin, M., Flateby, T.L., & Kahn, S. December 2009. Three promising alternatives for assessing college students' knowledge and skills.
In this paper, assessment experts Trudy Banta, Merilee Griffin, Theresa Flateby, and Susan Kahn describe the development of several promising authentic assessment approaches. The contributors draw on their rich assessment experience to illustrate how portfolios, common analytic rubrics, and online assessment communities can more effectively link assessment practices to pedagogy. In addition to discussing the strengths and limitations of each approach, the paper offers concrete examples of how these authentic approaches are being used to guide institutional improvement, respond to accountability questions, and involve more faculty, staff, and students in meaningful appraisals of learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
15. Beld, J. 2015, April. Building your assessment toolkit: Strategies for gathering actionable evidence of student learning.
This report explores the various assessment strategies that institutions, with a special focus on Minority-Serving Institutions (MSIs), can utilize. It offers various questions for institutions to ask themselves before beginning their assessment, an analysis of various assessment instruments, and advice on each approach.
Link to Full Text | Show Similar Items | Show Associated Keywords
16. Berg, J., Grimm, L. M., Wigmore, D., Cratsley, C. K., Slotnick, R. C., & Taylor, S. 2014, Summer. Quality Collaborative to Assess Quantitative Reasoning: Adapting the LEAP VALUE Rubric and the DQP.
Fitchburg State University (FSU) and Mount Wachusett Community College (MWCC) worked together to evaluate their rubrics for quantitative reasoning (and three other areas) and to compare them with the DQP's and LEAP's. The aim is to develop common rubrics that measure what students know and should be able to do, which in turn should help set common expectations for transfer students.
Link to Full Text | Show Similar Items | Show Associated Keywords
17. Bowling Green State University. Rubrics for the University Learning Outcomes.
BGSU has identified six university-level learning outcomes and developed rubrics for each, which are available on its Assessment website. Each rubric is designed using a developmental sequence, from beginner to advanced, in order to assess proficiency in course assignments, work duties, or co-curricular activities.
Link to Full Text | Show Similar Items | Show Associated Keywords
18. Broad, B. 2003. What we really value: Beyond rubrics in teaching and assessing writing.
This book offers a critical examination of traditional rubrics for evaluating student writing and presents Dynamic Criteria Mapping as a more flexible and contextual evaluative tool. Broad examines evaluation at work by detailing a study of an introductory composition program.
Link to Full Text | Show Similar Items | Show Associated Keywords
19. Buyarski, C.A., & Landis, C.M. 2014. Using an ePortfolio to assess the outcomes of a first-year seminar: student narrative and authentic assessment.
The authors analyzed 47 ePortfolios of first-year seminar students. Combining a rubric with the identification of authentic evidence, the results suggest that ePortfolios can thoroughly assess student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
20. Council of Independent Colleges. 2008. Evidence of learning: Applying the collegiate learning assessment to improve teaching and learning in the liberal arts college experience.
This report, sponsored by the Council of Independent Colleges, presents the experience of a consortium of 33 CIC member colleges and universities with the CLA over a period of three years.
Link to Full Text | Show Similar Items | Show Associated Keywords
21. Dalal, D. K., Hakel, M. D., Sliter, M. T., & Kirkendall, S. R. 2012. Analysis of a rubric for assessing depth of classroom reflections.

Link to Full Text | Show Similar Items | Show Associated Keywords
22. DeWitt, P. March 2012. What is satisfactory performance? Measuring students and measuring programs with rubrics.
Some assessment experts strongly recommend that a desired level of achievement be stated when measuring student performance on stated student learning outcomes. According to Nichols, the criteria should be stated in quantitative terms, as this example illustrates: “Eighty percent of those taking the CPA exam each year…will pass three of four parts of the exam” (Nichols, 1989, p. 178). In the era of rubrics, this can easily be translated to “Eighty percent of students…will score at least ‘satisfactory’ on three of the four rubric rows.”
Link to Full Text | Show Similar Items | Show Associated Keywords
23. Elrod, S. 2014, Summer. Quantitative Reasoning: The Next "Across the Curriculum" Movement.
The ability to think quantitatively, or quantitative reasoning (QR), clearly plays a central role in undergraduate education. But what do terms like quantitative reasoning, quantitative literacy, and quantitative fluency really mean for student learning, the curriculum, program development, faculty development, or accreditation? Why should QR be taught across the curriculum and in interdisciplinary contexts? In addition, this publication explores learning outcomes for QR.
Link to Full Text | Show Similar Items | Show Associated Keywords
24. Eubanks, D., & Gliem, D. 2015, May. Improving Teaching, Learning, and Assessment by Making Evidence of Achievement Transparent.
Technology can change higher education by empowering students to make an impact on the world as undergraduates. Done systematically, this would allow institutions to close the credibility gap with an increasingly dubious public. Authentic student achievements that are addressed to a real world audience can lead to richly detailed Resume 2.0 portfolios of work that add value to degrees and the granting institutions. A guide is provided for implementation of new high-impact practices, including structured assignment creation.
Link to Full Text | Show Similar Items | Show Associated Keywords
25. Ewell, P., Kinzie, J., Keith, J., & Love, M. B. January 2011. Down and in: A national perspective on program-level assessment.
Presentation at the Association of American Colleges and Universities (AAC&U) reviewing NILOA survey results and qualitative information on program assessment, with examples from two exemplary campuses.
Link to Full Text | Show Similar Items | Show Associated Keywords
26. Ewell, P., Mandell, C., Martin, E., & Hutchings, P. 2013, October. Mapping the curriculum: Learning outcomes and related assignments.
This presentation from the 2013 Assessment Institute discusses the implications of using the DQP for assessing learning outcomes, curriculum mapping, the use of rubrics, and designing an assignment library.
Link to Full Text | Show Similar Items | Show Associated Keywords
27. Finley, A. 2011, Fall/Winter. How Reliable Are the VALUE Rubrics?.
The effectiveness of assessment instruments is commonly evaluated by the degree to which validity and reliability can be established. The VALUE reliability study was developed to gather data on the usability and transferability of rubrics both within and across institutions. It was also designed to address the degree of reliability and consensus in scoring across faculty from different disciplinary backgrounds. Reliability data were gathered and analyzed for three of the fifteen existing VALUE rubrics: critical thinking, integrative learning, and civic engagement.
Link to Full Text | Show Similar Items | Show Associated Keywords
28. Fresno State. Sample rubrics from Fresno State.
CSU-Fresno houses a Rubric Library through its Office of Institutional Effectiveness. Sample rubrics are available for some of its programs as well as university-level learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
29. Gerretson, H., & Golson, E. 2005. Synopsis of the use of course-embedded assessment in a medium sized public university’s general education program.
Gerretson and Golson describe the use of a faculty-driven course-embedded assessment at a medium-size public university. The authors offer an overview on course-embedded assessment, implementing learning outcomes, rubrics, the use of data analysis, and evaluating the effectiveness of the course-embedded approach.
Link to Full Text | Show Similar Items | Show Associated Keywords
30. Gerretson, H., & Golson, E. 2005. Synopsis of the use of course-embedded assessment in a medium sized public university's general education program.
Gerretson and Golson describe the use of a faculty-driven course-embedded assessment at a medium-size public university. The authors offer an overview on course-embedded assessment, implementing learning outcomes, rubrics, the use of data analysis, and evaluating the effectiveness of the course-embedded approach.
Link to Full Text | Show Similar Items | Show Associated Keywords
31. Goff, L., Potter, M. K., Pierre, E., Carey, T., Gullage, A., et al. 2015, March. Learning Outcomes Assessment: A Practitioner’s Handbook.
This handbook from the Higher Education Quality Council of Ontario (HEQCO) serves as a resource for faculty and administrators to design and assess program-level learning outcomes. The handbook includes tips, examples and case studies, and recommendations on methods for developing program-level learning outcomes and assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
32. Harvard Initiative for Learning and Teaching. Concept maps: Are they good for assessment.
This is a PowerPoint that discusses the use of concept maps for assessment purposes. It provides a general understanding of using concept maps in determining student learning and possible outcomes. Included within the PowerPoint are slides about “Why create concept maps?,” “Concept Maps for Assessment,” and “Concept Map activity.” There is also an example rubric in addition to a list of the pros and cons of using concept maps for assessment purposes.
Link to Full Text | Show Similar Items | Show Associated Keywords
33. Harvey, V. & Avramenko, A. Mar/Apr2012. Video killed the radio star: Video created the student star!.
Abstract: The article explores the use of videos in course feedback and student assessment. Aside from enhancing student engagement, it is inferred that videos can improve digital literacy and promote student involvement in the video production process. The employment of video in learning activities is reported, noting that it can help students gain communication skills. It is also concluded that videos are useful for both self and peer assessment, allowing students to reflect on areas for improvement.
Link to Full Text | Show Similar Items | Show Associated Keywords
34. Howell, R.J. 2011. Exploring the impact of grading rubrics on academic performance: Findings from a quasi-experimental, pre-post evaluation.
The purpose of this pre-post, quasi-experimental evaluation was to explore the impact of grading rubric use on student academic performance. Cross-sectional data were derived from 80 undergraduates enrolled in an elective course at a research university during spring and fall 2009. The control group (n = 41), who completed the course's Assignment #2 without a grading rubric, scored significantly lower, on average, than the treatment group (n = 39), who completed the same assignment, but with access to a grading rubric. The grading rubric constituted an important predictor of assignment performance, the magnitude of which was stronger than college year, major, pre-test score, and gender. Suggestions are provided for future research.
Link to Full Text | Show Similar Items | Show Associated Keywords
35. Beld, J. 2015. Building Your Assessment Toolkit: Strategies for Gathering Actionable Evidence of Student Learning.
This resource explores the various assessment strategies that Minority-Serving Institutions (MSIs) can utilize. It offers various questions for MSIs to ask themselves before beginning their assessment, an analysis of various assessment instruments, and advice on each approach.
Link to Full Text | Show Similar Items | Show Associated Keywords
36. Jonsson, A., & Svingby, G. 2006, August. The use of scoring rubrics: Reliability, validity and educational consequences.
Seventy-five empirical research studies on rubrics were examined. The authors find that rubrics set clear expectations, which in turn facilitates feedback and self-assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
37. Judd, T. P., Secolsky, C., Allen, C. February 2012. Being confident about results from rubrics.
Using rubrics to assess student learning is increasingly common, and their use is almost certain to grow as the Association of American Colleges and Universities (AAC&U) essential learning outcomes become better known and the Lumina Degree Qualifications Profile gains traction. Both outcomes frameworks require something more than what available standardized instruments measure.
Link to Full Text | Show Similar Items | Show Associated Keywords
38. Khan, R., Khalsa, D., Klose, K., and Cooksey, Y. Winter 2012. Assessing graduate student learning in four competencies: Use of a common assignment and a combined rubric.
Abstract: Since 2001, the University of Maryland University College (UMUC) Graduate School has been conducting outcomes assessment of student learning. The current 3-3-3 Model of assessment has been used at the program and school levels providing results that assist refinement of programs and courses. Though effective, this model employs multiple rubrics to assess a wide variety of assignments and is complex to administer. This paper discusses a new outcomes assessment model called C2, currently being piloted in UMUC’s Graduate School. The model employs a single common activity (CoA) to be used by all Graduate School programs. It is designed to assess four of the five student learning expectations (SLEs) using one combined rubric (ComR). The assessment activity, scored by trained raters, displays pilot results supporting inter-rater agreement. Pilot implementation of the C2 model has advanced its reliability and its potential to streamline current assessment processes in the Graduate School.
Link to Full Text | Show Similar Items | Show Associated Keywords
39. Kinzie, J. August 2011. Colorado State University: A comprehensive continuous improvement system.
Colorado State University was determined to be an instructive case study because its innovative learning outcomes assessment and institutional improvement activities have been highlighted in various publications (see Bender, 2009; Bender, Johnson, & Siller, 2010; Bender & Siller, 2006, 2009; McKelfresh & Bender, 2009) and have been noted by experts in assessment and accreditation. CSU's assessment effort in student affairs is a model for bridging the work of academic affairs and student affairs through student learning outcomes assessment. Over the last dozen years, CSU has expanded its continuous improvement system for managing information sharing to serve the decision-making and reporting needs of various audiences. This system, known as the CSU Plan for Researching Improvement and Supporting Mission, or PRISM, provides information on the university's performance in prioritized areas, uses a peer review system for feedback, and emphasizes the importance of documenting institutional improvements informed by assessment data.
Link to Full Text | Show Similar Items | Show Associated Keywords
40. Kuh, G. November 2010. Learning outcomes assessment: A national perspective.
Presentation at Council of Graduate Schools on global competitiveness in degree attainment, the two paradigms of assessment, and Valid Assessment of Learning in Undergraduate Education (VALUE) Rubrics.
Link to Full Text | Show Similar Items | Show Associated Keywords
41. Kuh, G., & Ikenberry, S. October 2009. More than you think, less than we need: Learning outcomes assessment in American higher education.
The 2009 report from the National Institute of Learning Outcomes Assessment (NILOA) is based on information from more than 1,500 regionally accredited degree-granting institutions in the U.S. The NILOA study, titled “More Than You Think, Less Than We Need: Learning Outcomes Assessment in American Higher Education,” summarizes what colleges and universities are doing to measure student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
42. Leskes, A., & Wright, B. 2005. The art and science of assessing general education outcomes: A practical guide.
This guide offers practical recommendations for individuals involved with the assessment of general education programs and outcomes on campus. It includes a step-by-step assessment checklist, tips for better assessment, and examples of assessment tools, methods, and rubrics for assessing a variety of key outcomes of a quality general education.
Link to Full Text | Show Similar Items | Show Associated Keywords
43. Lough, W. 2012. Assessment Rubric.
This rubric from Longwood University allows faculty to assess student portfolios.
Link to Full Text | Show Similar Items | Show Associated Keywords
44. Lowood, J. 2013. Restructuring the Writing Program at Berkeley City College: Or how we learned to love assessment and use it to improve student learning.
The portfolio-based assessment program at BCC began in 2011. The college first examined the learning outcomes of its pre-transfer English and English as a Second Language (ESL) composition/reading classes and determined that portfolios were the best way to assess whether those outcomes were met. As a result, all students had to summarize readings, write an in-class essay based on a prompt, and write a research paper. About 500 students were assessed per semester. The endeavor grew to the point that the entire English and ESL department participated in scoring the portfolios using a common rubric design.
Link to Full Text | Show Similar Items | Show Associated Keywords
45. McKitrick, S. A., & Barnes, S. M. 2012. Assessment of critical thinking: An evolutionary approach.
Binghamton University was required by the SUNY Board of Trustees to adopt critical-thinking learning goals and to select a method of critical-thinking assessment. Campuses were also required to submit to SUNY a plan for assessing critical thinking, which SUNY would approve in collaboration with the General Education Assessment Review (GEAR) group. GEAR was formed by SUNY to develop a critical-thinking rubric with faculty help. Campuses were given the freedom to select from a narrow range of strategies for assessing critical thinking (Faculty Delphi study; NSSE surveys). Binghamton chose the GEAR rubric. The strategy was implemented in three stages: development, enculturation, and refinement.
Link to Full Text | Show Similar Items | Show Associated Keywords
46. Naser, C.R., Donoghue, K., & Burrell, S. (2012). The eyes and ears of engagement: Using RAs to assess resident engagement.
This article analyzes the effectiveness of an effort to assess the extent of student engagement at Fairfield University through the assistance of resident assistants (RAs) and the adaptation of a methodology used by the university's schools of engineering and education. Asking RAs to participate in an assessment of their residents provides several clear benefits: the assessment rubric sets out clear expectations to residents in plain language, and the assessment data appear to be a valid indicator of student engagement, allowing the institution to identify students who may benefit from additional counseling or attention.
Link to Full Text | Show Similar Items | Show Associated Keywords
47. Northern Arizona University. Sample rubrics for liberal skills studies.
Northern Arizona University provides sample rubrics for its Liberal Studies skills.
Link to Full Text | Show Similar Items | Show Associated Keywords
48. Puncochar, J., & Klett, M. (2013). A model for outcomes assessment of undergraduate science knowledge and inquiry processes.
To measure the efficacy of a Liberal Studies education, a Midwestern regional university developed a systematic, rubric-guided assessment based on nationally recognized science principles and inquiry processes to evaluate student work in undergraduate science laboratory courses relative to a liberal education. The rubric presented a direct measure of student understandings of science inquiry processes. The assessment procedure used stratified random sampling at confidence levels of 95% to select student work, maintained anonymity of students and faculty, addressed concerns of university faculty, and completed a continuous improvement feedback loop by informing faculty of assessment results to assess and refine science-inquiry processes of course content. The procedure resulted in an assessment system for benchmarking science inquiry processes evident in student work and offered insights into the effect of undergraduate science laboratory courses on student knowledge and understanding.
Link to Full Text | Show Similar Items | Show Associated Keywords
49. Pusecker, K. L., Torres, M. R., Crawford, I., Levia, D., Lehman, D., & Copic, G. 2011. ETS proficiency profile.
This article examines the use and value of the ETS Proficiency Profile (EPP) at the University of Delaware.
Link to Full Text | Show Similar Items | Show Associated Keywords
50. Reddy, M. 2007. Rubrics and the enhancement of student learning.
Empirical research on the effectiveness of rubrics has primarily concentrated on their contribution to improved academic performance, as reflected in the attainment of higher grades. Their role in assessing other dimensions of student learning, such as the attitudes, behaviours, and perceptions that affect students' inclination and ability to learn, has been largely unexplored. There is also a paucity of literature on how rubrics can be used to inform course delivery and course design. The objectives of the study are derived from these gaps in the literature.
Link to Full Text | Show Similar Items | Show Associated Keywords
51. Rhodes, T. L. 2010. Assessing outcomes and improving achievement: Tips and tools for using rubrics.
"This publication provides practical advice on the development and effective use of rubrics to evaluate college student achievement at various levels. Also included are the rubrics developed by faculty teams for fifteen liberal learning outcomes through AAC&U's Valid Assessment of Learning in Undergraduate Education (VALUE) project."
Link to Full Text | Show Similar Items | Show Associated Keywords
52. Springfield, E., Gwozdek, A., & Smiler, A.P. 2015. Transformation rubric for engaged learning: A tool and method for measuring life-changing experiences.
This paper shares how the Transformation Rubric for Engaged Learning is an effective assessment tool in relation to ePortfolios, including how it can be replicated and used in a variety of current assessment methods.
Link to Full Text | Show Similar Items | Show Associated Keywords
53. Stevens, D. D., & Levi, A. J. 2012. Introduction to rubrics.

Link to Full Text | Show Similar Items | Show Associated Keywords
54. The Outcomes Program. Waubonsee Community College guide for developing rubrics.
WCC provides a guide for developing rubrics.
Link to Full Text | Show Similar Items | Show Associated Keywords
55. The Writing Program at University of Massachusetts. Dynamic Criteria Mapping.
This page from the UMass Amherst’s Writing Program explains Dynamic Criteria Mapping and offers instructors tips on how to use the method with their students’ work.
Link to Full Text | Show Similar Items | Show Associated Keywords
56. Tierney, R., & Simon, M. 2004. What’s still wrong with rubrics: Focusing on the consistency of performance criteria across scale levels.
This article examines the guidelines and principles in current educational literature that relate to performance criteria in scoring rubrics. The focus is on the consistency of the language that is used across the scale levels to describe performance criteria for learning and assessment. The article aims to assist rubric developers in creating or adapting scoring rubrics with consistent performance criteria descriptors.
Link to Full Text | Show Similar Items | Show Associated Keywords
57. Truman State University. Truman State University's portfolio project.
One of Truman State's most well-developed assessment programs is the Truman Portfolio. Access to individual students' portfolios is password-protected; however, rubrics (referred to as descriptors) are listed for critical thinking and writing, interdisciplinary thinking, historical analysis, and intercultural thinking, and results for each of the descriptors are reported in the yearly Assessment Almanac. The Portfolio page also includes a FAQ list for interested faculty and students.
Link to Full Text | Show Similar Items | Show Associated Keywords
58. Turbow, D. J., Werner, T. P., Lowe, E., & Vu, H. Q. 2016, Fall. Norming a written communication rubric in a graduate health science course.
This study aimed to determine whether or not the norming of a written communication rubric improved scoring consistency among clinical faculty in a critical thinking course. The benefits of a formalized norming process are described.
Link to Full Text | Show Similar Items | Show Associated Keywords
59. Turbow, D., & Evener, J. 2016, July. Norming a VALUE rubric to assess graduate information literacy skills.
The study evaluated whether a modified version of the information literacy Valid Assessment of Learning in Undergraduate Education (VALUE) rubric would be useful for assessing the information literacy skills of graduate health sciences students.
Link to Full Text | Show Similar Items | Show Associated Keywords
60. University of Denver. University of Denver Portfolio Project.
According to its website, "The University of Denver Portfolio Community (DUPC) is a fully developed web-based application that supports the academic community with a searchable database of electronic portfolios for students, faculty, staff, and alumni, community discussion, academic program assessment based on student work, and an assessment rubric library." All of the documents/discussions are on a secure web server.
Link to Full Text | Show Similar Items | Show Associated Keywords
61. University of Hawai'i Manoa. University of Hawai'i Manoa's assessment how-to.
A how-to guide for creating and using rubrics.
Link to Full Text | Show Similar Items | Show Associated Keywords
62. Various. Rubric Assessment of Information Literacy Skills (RAILS).
In partnership with the Institute of Museum and Library Services, AAC&U, and the ACRL Assessment Immersion Program, the RAILS (Rubric Assessment of Information Literacy Skills) Project provides resources for academic librarians and faculty to enhance their skills in assessing information literacy outcomes. To this end, RAILS serves as a clearinghouse for information literacy rubrics.
Link to Full Text | Show Similar Items | Show Associated Keywords
63. Various. Stephen F. Austin State University Rubric Assessment Resource Page.
Examples of rubrics of both discipline-specific and college-wide outcomes are provided.
Link to Full Text | Show Similar Items | Show Associated Keywords
64. Various. University of Hawai'i Manoa's rubric bank.
This page provides rubrics for different learning outcomes designated as essential for the university.
Link to Full Text | Show Similar Items | Show Associated Keywords
65. Vorhees, R. A. 2001. Competency-based learning models: A necessary future.
This essay focuses on the present and proposed uses of competency-based learning models internationally and nationally. Furthermore, the author calls for a common lexicon for competencies and for their thoughtful use to encourage a paradigmatic shift.
Link to Full Text | Show Similar Items | Show Associated Keywords