Publication Search Results

Search returned 170 results using Keyword: "Current assessment activities"



1. New Directions for Evaluation.
New Directions for Evaluation, a quarterly thematic journal, is an official publication of the American Evaluation Association. The journal publishes empirical, methodological, and theoretical works on all aspects of evaluation. Each issue is devoted to a single topic, with contributions solicited, organized, reviewed, and edited by a guest editor or editors. Issues may take any of several forms, such as a series of related chapters, a debate, or a long article followed by brief critical commentaries.
Link to Full Text | Show Similar Items | Show Associated Keywords
2. AAC&U. Summer/Fall 2005. Integrative learning.
This edition of Peer Review focuses on the usage of integrative learning as a tool for building connectedness in student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
3. Abbas, Andrea; McLean, Monica. Nov 2007. Qualitative Research as a Method for Making Just Comparisons of Pedagogic Quality in Higher Education: A Pilot Study.
This article suggests alternatives to comparing pedagogy between universities in order to internationalize higher education.
Link to Full Text | Show Similar Items | Show Associated Keywords
4. Ackermann, E. Summer 2007. Program Assessment in Academic Libraries: An Introduction for Assessment Practitioners.
This paper addresses recent changes in the perception of libraries’ functions in higher education and developments in measurement tools. The report looks at three issues at the helm of library assessment: (1) the tradition of assessment in libraries; (2) the current state of affairs and challenges of assessing the following library components: instruction, services, and resources; and (3) implications for the future of library assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
5. Ammons, J. L., & Mills, S. K. 2005. Course-embedded assessments for evaluating cross-functional integration and improving the teaching-learning process.
This paper offers a case study of the process of defining a competency, specifying intended learning outcomes, selecting course-embedded assessment methods, evaluating the results, and using that information to guide changes in the teaching-learning process.
Link to Full Text | Show Similar Items | Show Associated Keywords
6. Angelo, T. A., & Cross, K. P. 1993. Classroom assessment techniques: A handbook for college teachers.
This revised and greatly expanded edition of the 1988 handbook offers teachers at all levels how-to advice on classroom assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
7. Ariovich, L., & Richman, W.A. 2013, October. All-in-one: Combining grading, course, program, and general education outcomes assessment.
In NILOA's nineteenth occasional paper, authors Laura Ariovich and W. Allen Richman discuss Prince George's Community College's All-in-One system, designed to integrate course, program, and general education assessment in order to connect outcomes assessment with grading.
Link to Full Text | Show Similar Items | Show Associated Keywords
8. Ash, S. and Clayton, P. Fall 2009. Generating, Deepening, and Documenting Learning: The Power of Critical Reflection in Applied Learning.
This article examines the meaning of Critical Reflection and presents a model for blending Critical Reflection with assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
9. Ash, S.L., & Clayton, P.H. Fall 2009. Generating, deepening, and documenting learning: The power of critical reflection in applied learning.
This article considers the meaning of critical reflection and principles of good practice for designing it effectively and will present a research-grounded, flexible model for integrating critical reflection and assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
10. Banta, T. W. 2009. Demonstrating the impact of changes based on assessment findings.
In this editor's note, Banta discusses her forthcoming co-authored book, Designing effective assessment: Principles and profiles of good practice, for which the authors interviewed individuals about assessment; the note provides a brief summary of their findings.
Link to Full Text | Show Similar Items | Show Associated Keywords
11. Banta, T. W. 2005. What draws campus leaders to embrace outcomes assessment?
This editor's note begins with the question asked of 11 top administrators: "What can we learn from the leaders of institutions noted for outstanding work in outcomes assessment?" (p. 3). The article summarizes her findings.
Link to Full Text | Show Similar Items | Show Associated Keywords
12. Banta, T. W. (Ed.). 2004. Hallmarks of effective outcomes assessment.
"This booklet brings together the best guidance and practices from Assessment Update to illustrate time-tested principles for all aspects of assessment from planning and implementing to sustaining and improving assessment efforts over time. Useful for those new to assessment as well as experienced practitioners, it details the specific hallmarks required for the success of any assessment program--from leadership and staff development to the assessment of process as well as outcomes, ongoing communication among constituents, and more."
Link to Full Text | Show Similar Items | Show Associated Keywords
13. Banta, T.W., Griffin, M., Flateby, T.L., & Kahn, S. December 2009. Three promising alternatives for assessing college students' knowledge and skills.
In this paper, assessment experts Trudy Banta, Merilee Griffin, Theresa Flateby, and Susan Kahn describe the development of several promising authentic assessment approaches. The contributors draw on their rich assessment experience to illustrate how portfolios, common analytic rubrics, and online assessment communities can more effectively link assessment practices to pedagogy. In addition to discussing the strengths and limitations of each approach, the paper offers concrete examples of how these authentic approaches are being used to guide institutional improvement, respond to accountability questions, and involve more faculty, staff, and students in meaningful appraisals of learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
14. Barrett, J.M. 2012. Writing assessment in the humanities: Culture and methodology.
This article examines methodological and institutional challenges for empirically measuring student performance on writing. Writing’s intrinsic subjectivity and the great variety of writing formats appropriate to diverse contexts raise fundamental questions about the empirical bias of the assessment culture taking root in U.S. higher education. At the same time, the academic training of humanist scholars, who typically have primary responsibility for writing pedagogy in universities, may predispose them to skepticism about assessment culture’s broader mission. This article narrates the process by which the Humanities Department at Lawrence Technological University implemented a writing assessment process designed to address these challenges and evaluates the data generated by this process.
Link to Full Text | Show Similar Items | Show Associated Keywords
15. Blaney, J., Filer, K., & Lyon, J. Summer 2014. Assessing High Impact Practices Using NVivo: An Automated Approach to Analyzing Student Reflections for Program Improvement.
Roanoke College developed a system to automate the qualitative coding process using NVivo, a software analysis tool, allowing them to identify patterns in student learning that indicate effective and ineffective aspects of applied learning experiences. The NVivo query approach led to increased efficiency in the assessment of most HIPs included in the experiential learning program at Roanoke College.
Link to Full Text | Show Similar Items | Show Associated Keywords
16. Blasi, L. December 2011. How assessment and institutional research staff can help faculty with student learning outcomes assessment.
Institutional researchers can provide support for faculty members as they seek to improve the attainment of student learning outcomes through assessment. Sometimes a few dedicated faculty members drive the process, but increased faculty support is needed to cultivate a culture of assessment on campus.
Link to Full Text | Show Similar Items | Show Associated Keywords
17. Bresciani, M. Summer 2011. Identifying barriers in implementing outcomes-based assessment program review: A grounded theory analysis.
While conversations proposing standardized testing within higher education abound (Allen & Bresciani, 2003; Department of Education (DOE), 2006; Ewell, 1997a, 1997b; Ewell & Jones, 1996; Maki, 2004; Palomba & Banta, 1999), proponents of outcomes-based assessment program review are still applauding the value and extent that the process can be used to inform decisions to improve student learning and development (Bresciani, 2006; Bresciani, Zelna, & Anderson, 2004; Huba & Freed, 2000; Maki, 2004; Mentkowski, 2000; Palomba & Banta, 1999; Suskie, 2004). As such, practitioners of outcomes-based assessment continue to seek various ways to meaningfully engage in outcomes-based assessment program review in order to find ways to improve student learning and development.
Link to Full Text | Show Similar Items | Show Associated Keywords
18. Bresciani, M. J. 2003. External partners in assessment of student development and learning in student affairs and external relations.
This chapter discusses the role of external partnerships in student development and learning outcomes assessment in the context of results from a national survey of senior student affairs officers.
Link to Full Text | Show Similar Items | Show Associated Keywords
19. Breslow, L., Lienhard, J., Masi, B., Seering, W., & Ulm, F. 2008. How do we know if students are learning?.
This Massachusetts Institute of Technology faculty newsletter reported the efforts by the Accreditation Board for Engineering and Technology (ABET), departments in the School of Engineering (SoE), and the School's Director of Education Innovation and Assessment toward assessing their students' learning outcomes. The newsletter covers the multi-perspective approach that was taken to account for student learning outcomes, including both top-down and bottom-up approaches to assessing student learning. Engineering faculty were also engaged in a process of determining some of the most effective methods for assessing their students' learning outcomes, including conducting experiments with the guidance of the school's Teaching and Learning Laboratory (TLL).
Link to Full Text | Show Similar Items | Show Associated Keywords
20. Brint, Steven. May-June 2008. The Spellings Commission and the Case for Professionalizing College Teaching.
This article examines the challenge of accountability as presented in Margaret Spelling's Commission on the Future of Higher Education, current measures for learning outcomes, the Collegiate Learning Assessment, and argues for bringing professionalism to higher education.
Link to Full Text | Show Similar Items | Show Associated Keywords
21. Brownell, J. and Swaner, L. 2009. High-impact practices: Applying the learning outcomes literature to the development of successful campus programs.
This article outlines a study by the Association of American Colleges and Universities (AAC&U) on four high-impact practices in the US: first-year seminars, learning communities, service learning, and undergraduate research.
Link to Full Text | Show Similar Items | Show Associated Keywords
22. Bryant, J. L. 2006. Assessing expectations and perceptions of the campus experience: The Noel-Levitz Student Satisfaction Inventory.
This chapter describes the content and implementation of the Noel-Levitz Student Satisfaction Inventory (SSI) and explains its importance and utility for community colleges.
Link to Full Text | Show Similar Items | Show Associated Keywords
23. Cain, T., & Hutchings, P. 2013, October. Faculty buy-in and engagement: Reframing the conversation around faculty roles in assessment.
This presentation from the 2013 Assessment Institute discusses faculty's engagement with assessment, including common challenges, sources of discontent, and solutions for overcoming these difficulties.
Link to Full Text | Show Similar Items | Show Associated Keywords
24. Cain, T., & Jankowski, N. 2013, November. Mapping the landscape of learning outcomes assessment: An update from the field.
This presentation from the Association for the Study of Higher Education (ASHE) 2013 Annual Conference reviews the results of NILOA's 2014 survey of provosts.
Link to Full Text | Show Similar Items | Show Associated Keywords
25. California State University Northridge. 2014. SUNY's General Education "tips" for closing the loop and frequently asked questions.
SUNY’s General Education Assessment “Tips” for Closing the Loop and Frequently Asked Questions
Link to Full Text | Show Similar Items | Show Associated Keywords
26. Cambridge, D. 2010. EPortfolios for lifelong learning and assessment.
This book takes on the topic of e-portfolios, student learning and student assessments from a deeply philosophical perspective. A guiding idea throughout the book is the question of a student’s agency in relation to the use of e-portfolios. Some of the themes covered included using e-portfolios authentically, with integrity and through deliberation. The author also addresses these themes in relation to the issue of assessing student learning, including lifelong learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
27. Case, S. 2007. Reconfiguring and realigning the assessment feedback processes for an undergraduate criminology degree.
The author conducted this study to ask how the assessment process could be streamlined while still maximizing student learning benefits, aiming to merge explicit engagement with assessment criteria and constructive feedback. The reconfigured system was adopted as the standard in the Criminology department.
Link to Full Text | Show Similar Items | Show Associated Keywords
28. Chan, C. K. Y., Tam, V. W. L., & Fok, W. T. T. 2013. Traditional and modern MCQ methods as in-class formative assessment.
This study was designed to compare three multiple-choice question (MCQ) delivery methods, namely clickers, pen-and-paper MCQs, and online e-learning MCQs, for their effectiveness in engaging students when used as in-class formative assessment. The results were also compared against instruction without any formative assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
29. CHEA International Quality Group. 2013, May. A Government Official's Guide: Quality Assurance of Higher Education in an International Setting.
This is the first issue of CHEA International Quality Group's Policy Brief series. This issue provides a succinct discussion of quality assurance of higher education in an international setting.
Link to Full Text | Show Similar Items | Show Associated Keywords
30. Colvin, J. 2012. Earn college credit for what you know.
“Earn College Credit for What You Know offers information on prior learning assessment (PLA) for adult learners, professionals, evaluators, administrators, faculty, and training managers.”
Link to Full Text | Show Similar Items | Show Associated Keywords
31. Cooper, T. & Terrell, T. 2013, August. What are institutions spending on assessment? Is it worth the cost?
In NILOA's eighteenth occasional paper, authors Tammi Cooper and Trent Terrell examine how much institutions are spending when it comes to assessment. Their paper presents findings from a survey of assessment professionals at institutions regarding the cost of assessment and the perceived benefit that institutions receive.
Link to Full Text | Show Similar Items | Show Associated Keywords
32. Council for Adult and Experiential Learning. 2012. Employer views on the value of the PLA.
This CAEL research brief, produced in partnership with Prometric, presents highlights from conversations with 19 U.S. employers representing a range of industries on the topic of PLA. The conversations address the value of PLA to workers and corporations, as well as employers’ views on PLA as an allowable expense within their tuition assistance programs.
Link to Full Text | Show Similar Items | Show Associated Keywords
33. Council for Adult and Experiential Learning. 2011. Moving the starting line through prior learning assessment (PLA).
This research brief provides an analysis of the average number of credits students earn for what they already know. It offers information on the average number of PLA credits earned by a subgroup of 4,905 students in our sample, looking at how the average number of credits differs by institution type and by students of various demographic groups.
Link to Full Text | Show Similar Items | Show Associated Keywords
34. Council for Adult and Experiential Learning. 2007. Prior learning assessment at home and abroad.
Excerpts from articles in the CAEL Forum and News.
Link to Full Text | Show Similar Items | Show Associated Keywords
35. Crosta, P. 2013, June. Intensity and attachment: How the chaotic enrollment patterns of community college students affect educational outcomes.
This study examines the relationship between community college enrollment patterns and two successful student outcomes—credential completion and transfer to a four-year institution.
Link to Full Text | Show Similar Items | Show Associated Keywords
36. Dalal, D. K., Hakel, M. D., Sliter, M. T., & Kirkendall, S. R. 2012. Analysis of a rubric for assessing depth of classroom reflections.

Link to Full Text | Show Similar Items | Show Associated Keywords
37. Deardorff, D. 2014, Spring. Outcomes Assessment in International Education.
This paper from International Higher Education offers international educators an introduction to outcomes assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
38. Doyle, T. 2008. Helping students learn in a learner-centered environment: A guide to facilitating learning in higher education.
Written with simplicity in mind, this book explains what a learner-centered environment is and how it can be fostered, used, and assessed for effectiveness.
Link to Full Text | Show Similar Items | Show Associated Keywords
39. Driscoll, A. & Wood, S. 2007. Developing outcomes-based assessment for learner-centered education: A faculty introduction.
This book explains how faculty can comfortably use outcomes-based assessment within their own instruction. The authors guide readers through articulating expectations, defining criteria and standards, and aligning course content with desired outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
40. Dugan, J. P. & Komives, S. R. 2007. Developing leadership capacity in college students: Findings from a national study.
This report of the Multi-Institutional Study of Leadership (MSL) reflects key findings from a multi-site, multi-year project. This report includes findings from over 50,000 students from 52 campuses who participated in this study in the Spring of 2006.
Link to Full Text | Show Similar Items | Show Associated Keywords
41. Durrant, M.B. & Dorius, C.R. 2007. Study abroad survey instruments: A comparison of survey types and experiences.
This study examines different survey instruments used to assess the experiences of U.S. study abroad participants. The intended audience is international and area study practitioners interested in assessing study abroad programs through postprogram interviews. An interview with the top 20 universities for number of students sent on study abroad reveals a broad picture of the type of survey instruments used across the United States to assess student experiences. Within this context and based on 19 years of data collection from study abroad participants with four data collection modes (a standard questionnaire with multiple choice and open-ended questions, a multiple choice bubble sheet response format, a scanned form, and a Web-based survey), one university’s experience is analyzed in depth to expand on the benefits and drawbacks of specific survey types. Lessons learned about when each type might be appropriate for different institutional goals and situations are presented.
Link to Full Text | Show Similar Items | Show Associated Keywords
42. Eubanks, David A. & Royal, Kenneth D. May/Jun 2011. A survey of attitudes about methods of assessment.
Abstract: The article discusses the results of a survey of attitudes about methods of assessment. It speculates that the underwhelming endorsement of purely psychometric methods among assessment professionals may come from pressures to implement a practical assessment program with limited means, lack of knowledge of theory and a preference for less formal methods. The article also mentions the need for a forum for open discussion about the theory and practice of assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
43. Evans, C. 2013, March. Making sense of assessment feedback in higher education.
This article presents a thematic analysis of the research evidence on assessment feedback in higher education (HE) from 2000 to 2012.
Link to Full Text | Show Similar Items | Show Associated Keywords
44. Ewell, P., Kinzie, J., Keith, J., & Love, M. B. January 2011. Down and in: A national perspective on program-level assessment.
Presentation at the Association of American Colleges and Universities (AAC&U) annual meeting reviewing NILOA survey results and qualitative information on program assessment, with examples from two exemplary campuses.
Link to Full Text | Show Similar Items | Show Associated Keywords
45. Ewell, P., Mandell, C., Martin, E., & Hutchings, P. 2013, October. Mapping the curriculum: Learning outcomes and related assignments.
This presentation from the 2013 Assessment Institute discusses the implications of using the DQP for assessing learning outcomes, curriculum mapping, the use of rubrics, and designing an assignment library.
Link to Full Text | Show Similar Items | Show Associated Keywords
46. Ewell, P., Paulson, K., & Kinzie, J. June 2011. Down and in: Assessment practices at the program level.
To follow up the 2009 National Institute for Learning Outcomes Assessment (NILOA) report on institutional assessment activity described by chief academic officers, NILOA surveyed program heads in the two and four-year sectors to gain a more complete picture of assessment activity at the program or department level.
Link to Full Text | Show Similar Items | Show Associated Keywords
47. Eynon, B., Gambino, L., & Torok, J. 2014. Outcomes assessment and institutional learning.
This page describes the role of ePortfolios in outcomes assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
48. Fain, P. 2012, June 8. The next big thing, almost.
An article on barriers to quality competency-based education in higher education. A white paper, "A 'Disruptive' Look at Competency-Based Education: How the Innovative Use of Technology Will Transform the College Experience," is also available from the Center for American Progress.
Link to Full Text | Show Similar Items | Show Associated Keywords
49. Fifolt, M. Jul/Aug 2013. Applying qualitative techniques to assessment in student affairs.
The article focuses on the application of qualitative techniques to assessment initiative in the student affairs division at the University of Alabama at Birmingham (UAB). The Vision Team utilized multiple verification techniques, including triangulation of data sources, member checking, an audit trail, and rich description. Examples of interview questions are presented. Recommendations for future assessment endeavors in student affairs are listed.
Link to Full Text | Show Similar Items | Show Associated Keywords
50. Finley, A. 2012. Making progress? What we know about the achievement of liberal education outcomes.
This report "presents comparative data on achievement over time across an array of liberal education outcomes such as critical thinking, writing, civic engagement, global competence, and social responsibility...[and] highlights new approaches to advancing meaningful assessment with effective pathways for learning and student success."
Link to Full Text | Show Similar Items | Show Associated Keywords
51. Frye, R. 1999. Assessment, accountability, and student learning outcomes.
This article explores the relationship between assessment, accountability and student learning outcomes. Examples of exemplary assessment programs are given as well as recommendations for improvement of student learning on college campuses.
Link to Full Text | Show Similar Items | Show Associated Keywords
52. Gerretson, H., & Golson, E. 2005. Synopsis of the use of course-embedded assessment in a medium size public university’s general education program.
This synopsis discusses how the institution implemented assessment on its campus and how it uses the data collected.
Link to Full Text | Show Similar Items | Show Associated Keywords
53. Gerretson, H., & Golson, E. 2005. Synopsis of the use of course-embedded assessment in a medium sized public university's general education program.
Gerretson and Golson describe the use of a faculty-driven course-embedded assessment at a medium-sized public university. The authors offer an overview of course-embedded assessment, implementing learning outcomes, rubrics, the use of data analysis, and evaluating the effectiveness of the course-embedded approach.
Link to Full Text | Show Similar Items | Show Associated Keywords
54. Gorlewski, E. 2010, October. Assessing assessment: Important new research in education abroad.
This article is a review of four recent research studies on study abroad.
Link to Full Text | Show Similar Items | Show Associated Keywords
55. Hardison, C. M., & Vilamovska, A. 2009. The collegiate learning assessment: Setting standards for performance at a college or university.
"This report illustrates how institutions can set their own standards on the CLA using a method that is appropriate for the CLA's unique characteristics. The authors examined evidence of reliability and procedural validity of a standard-setting methodology that they developed and applied to the CLA."
Link to Full Text | Show Similar Items | Show Associated Keywords
56. Harper Jr., Vernon B. May/Jun 2011. Program portfolio analysis: Evaluating academic program viability and mix.
Abstract: The article discusses the use of program portfolio analysis in evaluating academic program viability and mix. The article also states that portfolio analysis can produce actionable institutional effectiveness and can assist in the allocation of academic resources.
Link to Full Text | Show Similar Items | Show Associated Keywords
57. Hart Research Associates. Spring 2013. It takes more than a major: Employer priorities for college learning and student success.
This report provides a detailed analysis of employers’ priorities for the kinds of learning today’s college students need to succeed in today’s economy. It also reports on changes in educational and assessment practices that employers recommend.
Link to Full Text | Show Similar Items | Show Associated Keywords
58. Hart, D.M. and Hickerson, J.H. 2008. Prior learning portfolios: A representative collection.
This resource is a valuable reference tool for PLA assessors, administrators and PLA instructors, and can help train both faculty and students to understand portfolio assessment. Designed to provide you with institutional policies and procedures regarding prior learning portfolios, this text and accompanying CD-ROM contain thirteen sample portfolios from eleven different institutions, handbooks, guidelines, flow charts, and information about the location of portfolio assessment in the context of a degree.
Link to Full Text | Show Similar Items | Show Associated Keywords
59. Harvard Initiative for Learning and Teaching. Concept maps: Are they good for assessment?
This is a PowerPoint that discusses the use of concept maps for assessment purposes. It provides a general understanding of using concept maps in determining student learning and possible outcomes. Included within the PowerPoint are slides about “Why create concept maps?,” “Concept Maps for Assessment,” and “Concept Map activity.” There is also an example rubric in addition to a list of the pros and cons of using concept maps for assessment purposes.
Link to Full Text | Show Similar Items | Show Associated Keywords
60. Harvey, V. & Avramenko, A. Mar/Apr 2012. Video killed the radio star: Video created the student star!
Abstract: The article explores the use of videos in course feedback and student assessment. Aside from enhancing student engagement, it is inferred that videos can improve digital literacy and promote student involvement in the video production process. The employment of video in learning activities is reported, noting that it can help students gain communication skills. It is also concluded that videos are useful for both self and peer assessment, allowing them to reflect on areas for improvement.
Link to Full Text | Show Similar Items | Show Associated Keywords
61. Hatch, D. 2012. Unpacking the black box of student engagement: The need for programmatic investigation of high impact practices.
This article suggests that a more systematic investigation of high-impact practices is needed in order to further our understanding of student engagement within the community college.
Link to Full Text | Show Similar Items | Show Associated Keywords
62. Hawthorne, J., & Kelsch, A. 2012. Closing the loop: How an assessment project paved the way for GE reform.
Highlights a University of North Dakota (UND) assessment project rooted in five "actionable" principles: (a) scholarly credibility (assessment is perceived as scholarly in method and conception); (b) authenticity (the degree to which the data generated feel "real" or "true"); (c) keeping it local (grounded in a specific campus context); (d) faculty ownership of the project; and (e) being driven by genuine inquiry.
Link to Full Text | Show Similar Items | Show Associated Keywords
63. Higher Education Quality Council of Ontario (HEQCO). 2013, April 4. Quality: Shifting the focus. A report from the expert panel to assess the strategic mandate agreement submissions.
Government must play a more active, assertive and purposeful role to drive system-level planning and change, according to an Expert Panel convened by the Higher Education Quality Council of Ontario (HEQCO) that reviewed Strategic Mandate Agreement submissions from Ontario’s 44 public colleges and universities. The Panel’s report, Quality: Shifting the Focus: A Report from the Expert Panel to Assess the Strategic Mandate Agreement Submissions resulted from one of a series of initiatives by the Ministry of Training, Colleges and Universities (MTCU) to strengthen Ontario’s public postsecondary sector. MTCU launched the process to establish Strategic Mandate Agreements (SMA) with each of the postsecondary institutions “that will strongly inform future decisions, including allocation decisions and program approvals.”
Link to Full Text | Show Similar Items | Show Associated Keywords
64. Hobson, S.M., & Talbot, D. M. 2001. Understanding student evaluation: What all faculty should know.
A review of previous literature on student evaluation and its validity.
Link to Full Text | Show Similar Items | Show Associated Keywords
65. Hosch, B.J. 2012. Time on test, student motivation, and performance on the collegiate learning assessment: Implications for institutional accountability.
Using results from the Collegiate Learning Assessment (CLA) administered at Central Connecticut State University, a public Carnegie master's-larger programs university in the Northeast, this study demonstrates that time spent on the test, student motivation, and, to a lesser extent, local institutional administration procedures represent problematic intervening variables in the measurement of student learning. Findings from successive administrations of the instrument reveal wide year-to-year variations in student performance related to time on test and motivation. Significant additional study of these factors should be prioritized ahead of adopting accountability practices that rely on low-stakes testing to measure student learning and demonstrate institutional effectiveness.
Link to Full Text | Show Similar Items | Show Associated Keywords
66. Hrabowski, F.A., Suess, J., & Fritz, J. 2011. Assessment and analytics in institutional transformation.
This article discusses the impact that learning analytics can have on learning outcomes assessment. Aided by IT professionals who are familiar with analytics software, assessment professionals can make sense of overwhelming datasets and ensure that critical data are not overlooked. Hrabowski, Suess, and Fritz connect assessment to analytics to help institutions answer the call for accountability, potentially increase the number of students in STEM fields, and support students in reaching intended learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
67. Hutchings, P. 2014, July. DQP Case Study: Kansas City Kansas Community College.
KCKCC created an alternative system for documenting student achievement of Degree Qualifications Profile (DQP) proficiencies using an interactive curriculum mapping database that allows faculty to enter information about individual student performance on each learning outcome and competency in their courses.
Link to Full Text | Show Similar Items | Show Associated Keywords
68. Hutchings, P. 2014, January. DQP Case Study: Point Loma Nazarene University, San Diego, California.
Point Loma Nazarene University's engagement with the Degree Qualifications Profile began early and has been sustained over a number of years. PLNU's work with the DQP is now prompting conversations about how to more effectively assess learning in ways that are comparable across programs and how to continue to improve the experience of Point Loma students.
Link to Full Text | Show Similar Items | Show Associated Keywords
69. Hutchings, P. April 2010. Opening doors to faculty involvement in assessment.
Much of what has been done in the name of assessment has failed to induce large numbers of faculty to systematically collect and use evidence of student learning to improve teaching and enhance student performance. Pat Hutchings, a senior associate at The Carnegie Foundation for the Advancement of Teaching, examines the dynamics behind this reality, including the mixed origins of assessment, coming both from within and outside academe, and the more formidable obstacles that stem from the culture and organization of higher education itself. Then, she describes six ways to bring the purposes of assessment and the regular work of faculty closer together, which may make faculty involvement more likely and assessment more useful.
Link to Full Text | Show Similar Items | Show Associated Keywords
70. Hutchings, P., Ewell, P., & Humphreys, D. 2014, March 31. Where policies and practice meet: Assessment and the way we work.
This presentation connects the NILOA provost survey results to assignment design.
Link to Full Text | Show Similar Items | Show Associated Keywords
71. Jankowski, N. 2013, May. Assessment for learning research methods: A multi-faceted terrain.
This presentation from The Higher Education Academy (HEA) Social Science Conference offers an overview of NILOA, research method outcomes, and assessment practices.
Link to Full Text | Show Similar Items | Show Associated Keywords
72. Jankowski, N. 2013, October. How institutions use evidence of assessment: Does it really improve student learning? .
This presentation from the University of Illinois College of Education Higher Education Collaborative Series examines the ways in which institutions are using assessment data.
Link to Full Text | Show Similar Items | Show Associated Keywords
73. Jankowski, N. 2013, June. Showing an impact: Using assessment results to improve student learning.
This presentation from the 2013 Florida State Assessment Meeting describes how institutions are currently using assessments to improve student learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
74. Jankowski, N. The role of IR in assessing student learning: Managing shifting priorities.
This presentation from the Ohio Association for Institutional Research and Planning (OAIRP) Spring Conference examines the role of Institutional Research in assessing student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
75. Jankowski, N. 2013, June. Using assessment evidence to improve student learning: Can it be done?.
This presentation from the 2013 Assessment in Higher Education Conference describes how institutions are using assessment data.
Link to Full Text | Show Similar Items | Show Associated Keywords
76. Jankowski, N, & Kinzie, J. 2013, May. The role of IR in fostering good assessment practice.

Link to Full Text | Show Similar Items | Show Associated Keywords
77. Jankowski, N., & Makela, J. P. June 2010. Exploring the landscape: What institutional websites reveal about student learning outcomes assessment activities.
Despite persistent calls for colleges and universities to post student learning outcomes assessment information to their websites, the assessment information that can be found online falls considerably short of the activities reported by chief academic officers. The study finds that institutions are often not taking full advantage of their website to increase transparency regarding student learning outcomes assessment. The researchers share their findings and offer recommendations for institutions.
Link to Full Text | Show Similar Items | Show Associated Keywords
78. Jankowski, N., & Provezis, S. November 2011. Making student learning evidence transparent: The state of the art.
Making Student Learning Evidence Transparent: The State of the Art is composed of four sections. The sections cover 1) the impact of national transparency initiatives; 2) the changing landscape of transparency; 3) the display of assessment results and their subsequent use; and 4) a synthesis of the previous three sections.
Link to Full Text | Show Similar Items | Show Associated Keywords
79. Jankowski, N., Eggleston, T., Heyman, E. 2013, June. Engaging with assessment and the Degree Qualifications Profile: Institutions in practice.
This presentation from the 2013 Association for the Assessment of Learning in Higher Education (AALHE) conference describes how institutions are working with the DQP.
Link to Full Text | Show Similar Items | Show Associated Keywords
80. Jankowski, N., Hutchings, P., Slotnick, R., Cratsley, C., Fulton, S., Oates, S. 2014, January. What the DQP looks like on the ground: National trends and campus examples.

Link to Full Text | Show Similar Items | Show Associated Keywords
81. Jankowski, N., Kinzie, J., & Kuh, G. 2014, January 24. What provosts say about student learning outcomes assessment.
This presentation from the 2014 AAC&U Annual Meeting presents an overview of NILOA's 2014 provost survey.
Link to Full Text | Show Similar Items | Show Associated Keywords
82. Jaschik, S. October 2009. Turning surveys into reforms.
Inside Higher Ed captures the significance of the 10-year anniversary celebration of NSSE and raises questions about the future of assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
83. Jo Beld. 2015. Building Your Assessment Toolkit: Strategies for Gathering Actionable Evidence of Student Learning.
This resource explores the various assessment strategies that Minority-Serving Institutions (MSIs) can utilize. It offers various questions for MSIs to ask themselves before beginning their assessment, an analysis of various assessment instruments, and advice on each approach.
Link to Full Text | Show Similar Items | Show Associated Keywords
84. Kelly, R. 2011. Implementing high-impact learning practices that improve retention.
This article discusses the use and implementation of high-impact learning practices to improve student retention.
Link to Full Text | Show Similar Items | Show Associated Keywords
85. Kelly, R. 2011. Implementing high-impact learning.
This article provides insights from consultant Lynn Swaner on high-impact practices. Swaner suggests that assessment is a vital aspect of successful high-impact practices.
Link to Full Text | Show Similar Items | Show Associated Keywords
86. Kerka, S. & Wonacott, M.E. 2000. Assessing learners online.
A wide range of tools are available to enable teaching practitioners to create web-based educational materials from PowerPoint presentations, adding a variety of different digital media, such as audio and animation. The pilot study described in this paper compared three different systems for producing multimedia presentations from existing PowerPoint files.
Link to Full Text | Show Similar Items | Show Associated Keywords
87. Khan, R., Khalsa, D., Klose, K., and Cooksey, Y. Winter 2012. Assessing graduate student learning in four competencies: Use of a common assignment and a combined rubric.
Abstract: Since 2001, the University of Maryland University College (UMUC) Graduate School has been conducting outcomes assessment of student learning. The current 3-3-3 Model of assessment has been used at the program and school levels, providing results that assist in the refinement of programs and courses. Though effective, this model employs multiple rubrics to assess a wide variety of assignments and is complex to administer. This paper discusses a new outcomes assessment model, called C2, currently being piloted in UMUC’s Graduate School. The model employs a single common activity (CoA) used by all Graduate School programs and is designed to assess four of the five student learning expectations (SLEs) using one combined rubric (ComR). Pilot results from the assessment activity, scored by trained raters, support inter-rater agreement. Pilot implementation of the C2 model has advanced its reliability and its potential to streamline current assessment processes in the Graduate School.
Link to Full Text | Show Similar Items | Show Associated Keywords
88. Kinzie, J. 2014, March. DQP Case Study: University System of Georgia - Georgia State University and Georgia Perimeter College.
Georgia State University and Georgia Perimeter College collaborated on a project to explore the DQP proficiencies at the associate's and bachelor's degree levels. The project provided an opportunity for faculty and staff to work together to explore the creation of discipline-specific versions of the DQP, establish common learning outcomes between programs, and devise mechanisms for assessing the DQP and evaluating the strengths and weaknesses of individual students relative to the disciplinary DQPs.
Link to Full Text | Show Similar Items | Show Associated Keywords
89. Kinzie, J. October 2010. Perspectives from campus leaders on the current state of student learning outcomes assessment: NILOA focus group summary 2009-2010.
This paper highlights lessons from four focus group sessions with campus leaders--presidents, provosts, academic deans and directors of institutional research from a variety of two- and four-year institutions-- regarding their perspectives on the state of learning assessment practices on their campuses.
Link to Full Text | Show Similar Items | Show Associated Keywords
90. Kinzie, J., & Jankowski, N. 2013, October. Delving deeper into NILOA survey results: What we know about institutional assessment practice in 2013.
This presentation from the 2013 Assessment Institute discusses results of NILOA's 2013 survey of provosts.
Link to Full Text | Show Similar Items | Show Associated Keywords
91. Kinzie, J., & Lindsay, N. 2014, February 28. Assessment administrators anonymous: 12 steps for involving faculty in assessment.
This presentation from the 2014 AAC&U General Education and Assessment Meeting discusses the role of faculty in assessment and presents results of NILOA's 2009 and 2014 provost surveys.
Link to Full Text | Show Similar Items | Show Associated Keywords
92. Kinzie, J., Harper, I., Moeckel, D.L., Renick, T. 2013, July. Conversations about the Degree Qualifications Profile (DQP).
This presentation from the American Association of State Colleges and Universities (AASCU) Summer Meeting outlines the ways in which several institutions are working with the DQP.
Link to Full Text | Show Similar Items | Show Associated Keywords
93. Kinzie, J., Jankowski, N., Baker, G., Klages, M., & Martinez, V. 2012, October. Using assessment results: Promising practices of institutions that do it well (Presentation).
This presentation from the 2012 Assessment Institute provides an overview of NILOA's July 2012 report, "Using Assessment Results: Promising Practices of Institutions that Do It Well." To read the full report, click here: http://www.learningoutcomeassessment.org/UsingAssessmentResults.htm
Link to Full Text | Show Similar Items | Show Associated Keywords
94. Kinzie, J., Jankowski, N., Haak, B., Bender, K. October 2011. Advancing student learning outcomes assessment: Lessons from campuses doing good work.
Presentation at the Assessment Institute on the purpose of case studies and on colleges and universities presently engaged with assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
95. Klein-Collins, R. 2012. Competency-Based Degree Programs in the U.S.: Postsecondary Credentials for Measurable Student Learning and Performance.
CAEL's latest report which addresses using student assessment in competency-based degree programs.
Link to Full Text | Show Similar Items | Show Associated Keywords
96. Klein-Collins, R. 2013, October. Sharpening our focus on learning: The rise of competency-based approaches to degree completion.
In NILOA's twentieth occasional paper, author Rebecca Klein-Collins, Senior Director of Research and Policy Development for the Council for Adult and Experiential Learning (CAEL), discusses the methodology, practices, and policies surrounding competency-based education.
Link to Full Text | Show Similar Items | Show Associated Keywords
97. Klein-Collins, R., Ikenberry, S.O., & Kuh, G.D. 2014, January/February. Competency-Based Education: What the Board Needs to Know.
Increasingly, higher education is moving away from credit hours toward an approach that focuses on what students actually know and can do with what they learn; such as Competency-Based Education (CBE). This article discusses the basics of CBE, the role of assessment, and what governing boards need to know.
Link to Full Text | Show Similar Items | Show Associated Keywords
98. Kuh, G. 2008. High impact practices: What they are, who has access to them, and why they matter.
This report from AAC&U presents NSSE data on high-impact practices and explains how and why they benefit students.
Link to Full Text | Show Similar Items | Show Associated Keywords
99. Kuh, G. 2013. Promise in action: Examples of institutional success. .
This article presents three examples of high-impact practices on college campuses.
Link to Full Text | Show Similar Items | Show Associated Keywords
100. Kuh, G. 2013, February 28. Quality assurance implications of high-impact practices and related improvement efforts.
This presentation from the New Mexico Higher Education Assessment and Retention (NMHEAR) conference explores the use of high-impact practices in higher education.
Link to Full Text | Show Similar Items | Show Associated Keywords
101. Kuh, G. 2013, February 28. Toward a sea change in what counts as meaningful evidence of student learning.
This presentation accompanied the keynote speech at the 2013 New Mexico Higher Education Assessment and Retention (NMHEAR) Conference.
Link to Full Text | Show Similar Items | Show Associated Keywords
102. Kuh, G. 2013, May 15. What if the VSA Morphed into the VST?.
Blog post on transparency and student learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
103. Kuh, G. 2013, February 28. What matters to student success: The promise of high-impact practices.
This presentation from the 2013 New Mexico Higher Education Assessment and Retention Conference (NMHEAR) provides an overview of high-impact practices and their implications.
Link to Full Text | Show Similar Items | Show Associated Keywords
104. Kuh, G. 2009. What student affairs professionals need to know about student engagement.
This article includes a summary of research on the relationship between student engagement and high-impact practices.
Link to Full Text | Show Similar Items | Show Associated Keywords
105. Kuh, G. D., & Ewell, P. T. 2010. The state of learning outcomes assessment in the United States.
"This paper summarises the status of undergraduate student learning outcomes assessment at accredited colleges and universities in the United States" (p.1)
Link to Full Text | Show Similar Items | Show Associated Keywords
106. Kuh, G. D., Kinzie, J., Schuh, J. H., & Whitt, E. J. 2005. Student success in college: Creating conditions that matter.
"Student Success in College describes policies, programs, and practices that a diverse set of institutions have used to enhance student achievement. This book clearly shows the benefits of student learning and educational effectiveness that can be realized when these conditions are present. This book provides concrete examples from twenty institutions that other colleges and universities can learn from and adapt to help create a success-oriented campus culture and learning environment."
Link to Full Text | Show Similar Items | Show Associated Keywords
107. Kuh, G., & Ikenberry, S. October 2009. More than you think, less than we need: Learning outcomes assessment in American higher education.
The 2009 report from the National Institute of Learning Outcomes Assessment (NILOA) is based on information from more than 1,500 regionally accredited degree-granting institutions in the U.S. The NILOA study, titled “More Than You Think, Less Than We Need: Learning Outcomes Assessment in American Higher Education,” summarizes what colleges and universities are doing to measure student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
108. Kuh, G., & Jankowski, N. 2012, October 29. What you see is less than we need: Communicating and using evidence of student learning.
This presentation from the 2012 Assessment Institute outlines current NILOA projects, including work related to the DQP.
Link to Full Text | Show Similar Items | Show Associated Keywords
109. Kuh, G., Ikenberry, S., & Jankowski, N. 2013, October. From provosts' lips to NILOA's ear: What we know about institutional assessment practice in 2013.
This presentation from the 2013 Assessment Institute discusses results of NILOA's 2013 survey of provosts.
Link to Full Text | Show Similar Items | Show Associated Keywords
110. Kuh, G., Jankowski, N., Ikenberry, S., & Kinzie, J. 2014. Knowing what students know and can do: The current state of student learning outcomes assessment in US colleges and universities.
In a follow-up to the 2009 survey of chief academic officers, NILOA again asked institutions about practices and activities related to assessing student learning. This report showcases findings regarding institutional activities, uses, drivers, and areas of continued need to advance the assessment of student learning. In addition, the report examines changes and shifts over time in institutional assessment related activities.
Link to Full Text | Show Similar Items | Show Associated Keywords
111. Kuh, G.D. 2003. Assessing what really matters to student learning: Inside the National Survey of Student Engagement.
This article covers the history and current importance of NSSE. Access to article is through JSTOR, which may require login information.
Link to Full Text | Show Similar Items | Show Associated Keywords
112. Lai, K. 2012. Assessing participation skills: Online discussions with peers.
This article describes an online assignment with a set of participation criteria and a method for assessing the quality of students’ interactions with peers.
Link to Full Text | Show Similar Items | Show Associated Keywords
113. Lakin, M. B., Seymour, D., Nellum, C. J., & Crandall, J. R. 2015. Credit for Prior Learning: Charting Institutional Practice for Sustainability.
This report focuses on credit for prior learning (CPL) and addresses the barriers and successful strategies for incorporating CPL.
Link to Full Text | Show Similar Items | Show Associated Keywords
114. Lester, N., et al. 2003. Writing across the curriculum: A college snapshot.
This article describes a research project intended to yield data about the state of writing across the curriculum at one urban college campus site.
Link to Full Text | Show Similar Items | Show Associated Keywords
115. Light, T., Chen, H., Ittelson, J. 2012. Documenting learning with ePortfolios: A guide for college instructors.
In this book, the authors provide both a theoretical and practical understanding about e-portfolios. Some of the main themes covered include how e-portfolios are a form of documentation, the relation between faculty and students in the process of using e-portfolios, and how to design the implementation of e-portfolios. The books also includes discussions regarding how to engage students, staff, and faculty in the process of using e-portfolios in addition to discussions about selecting appropriate technologies and evaluating the use of e-portfolios.
Link to Full Text | Show Similar Items | Show Associated Keywords
116. Liu, A., Sharkness, J., & Pryor, J. H. 2008. Findings from the 2007 administration of Your First College Year (YFCY): National aggregates.
This document provides a historical overview of the YFCY survey, information on the administration of the survey, and numerous results from the 2007 national survey.
Link to Full Text | Show Similar Items | Show Associated Keywords
117. London, M., & Hall, M. 2011. Unlocking the value of Web 2.0 technologies for training and development: The shift from instructor-controlled, adaptive learning to learner-driven, generative learning.
Traditional instruction is adaptive-that is, instructor-driven, face-to-face and/or online training to teach skills and knowledge and convey information, policies, and procedures. In contrast, generative learning is learner-driven, collaborative, and problem-focused. Web 2.0 technologies can support both types of learning but are especially valuable for generative learning. This article reviews learning processes and Web 2.0 capabilities, describes two case examples, outlines ways to design Web 2.0 training applications, and discusses the changing role of learning professionals from delivering structured, one-way adaptive learning to designing and facilitating generative learning opportunities.
Link to Full Text | Show Similar Items | Show Associated Keywords
118. Lowood, J. 2013. Restructuring the Writing Program at Berkeley City College: Or how we learned to love assessment and use it to improve student learning.
The portfolio-based assessment program at BCC started in 2011. The department first examined the learning outcomes of its pre-transfer English and English as a Second Language (ESL) composition/reading classes and determined that portfolios were the best way to assess whether those outcomes were met. As a result, all students had to summarize readings, write an in-class essay based on a prompt, and complete a research paper. About 500 students were assessed per semester. The endeavor grew to the point that the entire English and ESL Department participated in scoring the portfolios using a shared rubric.
Link to Full Text | Show Similar Items | Show Associated Keywords
119. Lynn, S. A., & Robinson-Backmon, I. 2005. Course-level outcomes assessment: An investigation of an upper-division undergraduate accounting course and the factors that influence learning.
This study examined the association between a course-level embedded assessment tool, learning performance outcomes (i.e., final numerical course average), and factors that influence learning goal outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
120. Makela, J. P., & Rooney, G. S. June 2012. Learning outcomes assessment step-by-step: Enhancing evidence-based practice in career services.
This monograph, "Learning Outcomes Assessment Step-by-Step: Enhancing Evidence-Based Practice in Career Services," by Julia Panke Makela and Gail S. Rooney, examines learning outcomes assessment in career services offices. Examples of practical strategies are offered.
Link to Full Text | Show Similar Items | Show Associated Keywords
121. McKitrick, S. A., & Barnes, S. M. 2012. Assessment of critical thinking: An evolutionary approach.
Binghamton University was required by the SUNY Board of Trustees to adopt critical-thinking learning goals and to select a method of critical-thinking assessment. Campuses were also required to submit to SUNY a plan for assessing critical thinking, which SUNY approved in collaboration with the General Education Assessment Review (GEAR) group. GEAR was formed by SUNY to develop a critical-thinking rubric with faculty help. Campuses were given the freedom to select from a narrow range of strategies for assessing critical thinking (e.g., a faculty Delphi study or NSSE surveys). Binghamton chose the GEAR rubric. The strategy was implemented in three stages: development, enculturation, and refinement.
Link to Full Text | Show Similar Items | Show Associated Keywords
122. McNair, T. and Albertine, S. 2012. Seeking high-quality, high-impact learning: The imperative of faculty development and curricular intentionality.
This article discusses effective implementation of high-impact practices and references George Kuh’s 2008 article “High-Impact Educational Practices: What They Are, Who Has Access to Them, and Why They Matter.”
Link to Full Text | Show Similar Items | Show Associated Keywords
123. Miller, M., Lincoln, C., Goldberger, S., Kazis, R., Rothkoph, A. 2012, January. From denial to acceptance: The stages of assessment.
In some ways, the assessment movement over the last 25 years is similar to what individuals experience as they move through Kübler-Ross’s (1997) stages of grief: denial, anger, bargaining, depression, and acceptance. Eventually, reluctantly, slowly, and unevenly, many institutions have come to an acceptance of assessment and its role in higher education.
Link to Full Text | Show Similar Items | Show Associated Keywords
124. Miller, R. 2007. Assessment in cycles of improvement: Faculty designs for essential learning outcomes. .
This publication features a series of reports on how selected colleges and universities foster and assess student learning in twelve liberal education outcome areas, including writing, quantitative literacy, critical thinking, ethics, intercultural knowledge, and information literacy. Moving from goals to experiences, assessments, and improvements driven by assessment data, each institutional story illustrates how complex learning can be shaped over time and across programs to bring students to higher levels of achievement of these important outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
125. Montenegro, E., & Jankowski, N. A. January 2017. Equity and Assessment: Moving Towards Culturally Responsive Assessment.
As colleges educate a more diverse and global student population, there is increased need to ensure every student succeeds regardless of their differences. This paper explores the relationship between equity and assessment, addressing the question: how consequential can assessment be to learning when assessment approaches may not be inclusive of diverse learners? The paper argues that for assessment to meet the goal of improving student learning and authentically documenting what students know and can do, a culturally responsive approach to assessment is needed. In describing what culturally responsive assessment entails, this paper offers a rationale as to why change is necessary, proposes a way to conceptualize the place of students and culture in assessment, and introduces three ways to help make assessment more culturally responsive.
Link to Full Text | Show Similar Items | Show Associated Keywords
126. Montenegro, E., & Jankowski, N. A. 2015, April. Focused on What Matters: Assessment of Student Learning Outcomes at Minority-Serving Institutions.
This report features the assessment work being done at Minority-Serving Institutions (MSIs). Comparisons are made between assessment activities at MSIs and those underway at Predominantly White Institutions (PWIs) as well as those at different types of MSIs (Tribal Colleges, Historically Black Colleges and Universities, and others). Four main findings are discussed including the internal focus of MSIs, the emphasis on using assessment data for improvement, differences among different types of MSIs in their assessment approaches, and matching assessment approaches to student characteristics and learning needs. Implications are presented for understanding assessment activities in MSIs, and how such understandings can help advance assessment efforts at all postsecondary institutions.
Link to Full Text | Show Similar Items | Show Associated Keywords
127. Morningside College. Classroom assessment and course-embedded assessment: What’s the difference?.
A short article about the difference between course embedded assessment and course level assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
128. Musil, C. M., & Miller, R. AACU’s course level student learning assessment matrix.
Under AAC&U’s Shared Futures initiative, Musil and Miller developed a matrix for planning and implementing student learning assessments into a course. Sample matrices and ideas to consider are also included with the blank matrix tool.
Link to Full Text | Show Similar Items | Show Associated Keywords
129. Naser, C.R., Donoghue, K., & Burrell, S. (2012). The eyes and ears of engagement: Using RAs to assess resident engagement.
This article analyzes the effectiveness of an effort to assess the extent of student engagement at Fairfield University through the assistance of resident assistants (RAs) and the adaptation of a methodology used by the university’s schools of engineering and education. Asking RAs to participate in an assessment of their residents provides several clear benefits: the rubric sets out clear expectations for residents in plain language, and the assessment data appear to be a valid indicator of student engagement, allowing the institution to identify students who may benefit from additional counseling or attention.
Link to Full Text | Show Similar Items | Show Associated Keywords
130. National Assessment of Adult Literacy (NAAL). 2005, December. A first look at the literacy of America's adults in the 21st century.
This report presents data from the 2003 National Assessment of Adult Literacy (NAAL), which measures the English literacy of adults in America.
Link to Full Text | Show Similar Items | Show Associated Keywords
131. National Institute for Learning Outcomes Assessment (NILOA). 2009. 2009 survey questionnaire.
This survey examines institution-level assessment activities and campus assessment practices. Please contact us before using it for research or external purposes.
Link to Full Text | Show Similar Items | Show Associated Keywords
132. National Institute for Learning Outcomes Assessment (NILOA). 2010. 2010 survey questionnaire.
This survey examines assessment activities at the program and department level. Please contact us before using it for research or external purposes.
Link to Full Text | Show Similar Items | Show Associated Keywords
133. Norris, D. 2010, September 29. Learning from analytics best practices in other sectors.
This blog from Donald Norris explores topics related to action analytics.
Link to Full Text | Show Similar Items | Show Associated Keywords
134. Norris, D., Baer, L., Leonard, J., Pugliese, L., & Lefrere, P. 2008. Action analytics: Measuring and improving performance that matters in higher education.
The action analytics of the future will better assess students' competencies. Using individualized planning, advising, and best practices from cradle to career, these action analytics solutions will align interventions to facilitate retention and transitions and to maximize learners' success. Six primary actions are needed to evolve from the current generation of academic analytics (tools, solutions, and services) to action analytics.
Link to Full Text | Show Similar Items | Show Associated Keywords
135. Norris, D., Leonard, J., Pugliese, L., Baer, L., & Lefrere, P. 2008. Framing action analytics and putting them to work.
This article is a companion piece to the article “Action Analytics: Measuring and Improving Performance That Matters in Higher Education,” which describes the emergence of a new generation of tools, solutions, and behaviors that are giving rise to more powerful and effective utilities through which colleges and universities can measure performance and provoke pervasive actions to improve it.
Link to Full Text | Show Similar Items | Show Associated Keywords
136. Nunley, C., Bers, T., & Manning, T. July 2011. Learning outcomes assessment in community colleges.
As community colleges become increasingly important in educating students across the country, more emphasis is being placed on them to provide the public with information on the learning outcomes of their students. In this tenth NILOA Occasional Paper, Charlene Nunley, Trudy Bers, and Terri Manning describe the complex environment of community colleges as it relates to student learning outcomes assessment. Results from previous surveys of community college institutional researchers and chief academic officers are analyzed, along with short vignettes of good practices at various community colleges. Drawing on prior experience working with institutions or within their own, the authors offer suggestions to make student learning outcomes assessment more effective and transparent.
Link to Full Text | Show Similar Items | Show Associated Keywords
137. Otis, M. M. 2010. Listening to students.
An insightful article from Change magazine discussing assessment from a student's perspective.
Link to Full Text | Show Similar Items | Show Associated Keywords
138. Palomba, C. A., & Banta, T. W. 1999. Assessment essentials: Planning, implementing, and improving assessment in higher education.
"This book examines current assessment practices in higher education and offers suggestions on planning assessment programs, carrying them out, and using the results to improve academic programs. Examples from all types of institutions (community colleges, liberal arts colleges, and comprehensive, doctoral and research institutions) are used to illustrate various assessment activities."
Link to Full Text | Show Similar Items | Show Associated Keywords
139. Papp, R. n.d. Assessment and assurance of learning using e-portfolios.
This paper explores the use of e-portfolios in an educational environment to assess student performance, to provide an electronic repertoire or résumé of accomplishments, and to maintain a collection of student work that can be used for accreditation purposes and to measure student learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
140. Paulson, K. 2012. Faculty perceptions of general education and the use of high-impact practices.
This article focuses on the Compass Faculty Survey of the Association of American Colleges and Universities (AAC&U), which is related to the use of high-impact practices (HIPs) in general education.
Link to Full Text | Show Similar Items | Show Associated Keywords
141. Peterson, M. W., & Einarson, M. K. 2001. What are colleges doing about student assessment? Does it make a difference?
"The purpose of our study was to extend current understanding of how postsecondary institutions have approached, supported, and promoted undergraduate student assessment, and the institutional uses and impacts that have been realized from these assessment efforts" (p.630).
Link to Full Text | Show Similar Items | Show Associated Keywords
142. Peterson, M. W., Augustine, C. H., Einarson, M. K., & Vaughan, D. B. 1999. Designing student assessment to strengthen institutional performance in baccalaureate institutions.
This monograph will "provide a national profile of current student assessment practices and institutional support patterns" and compare it with similar institutions (p.1). It also provides practical advice for practitioners.
Link to Full Text | Show Similar Items | Show Associated Keywords
143. Pike, G. R. Jan/Feb 2012. Assessment measures.
The author argues that defining the uses to be made of the assessment data is the most important step in evaluating an assessment instrument.
Link to Full Text | Show Similar Items | Show Associated Keywords
144. Pinsent-Johnson, C., Howell, S., and King, R. 2013. Returning to high school in Ontario: Adult students, postsecondary plans and program supports.
This report from HEQCO reveals that most adults who return to high school do so because they want to pursue postsecondary education.
Link to Full Text | Show Similar Items | Show Associated Keywords
145. Pintrich, P. R., Smith, D. A. F., Garcia, T., & Mckeachie, W. J. 1993. Reliability and predictive validity of the motivated strategies for learning questionnaire (MSLQ).
This paper reports on the reliability and predictive capabilities of the Motivated Strategies for Learning Questionnaire (MSLQ). Development of the questionnaire began in the 1980s; it focuses on both the motivational and the learning strategies of students. The paper provides statistical evidence regarding the effectiveness of the questionnaire and concludes that it is both “relatively” reliable and demonstrates degrees of predictive validity.
Link to Full Text | Show Similar Items | Show Associated Keywords
146. Prior Learning Assessment Inside Out. 2012. PLA: Quality assurance and accountability.
The second issue of PLAIO focuses on a range of issues, problems and questions concerning quality assurance and accountability.
Link to Full Text | Show Similar Items | Show Associated Keywords
147. Prior Learning Assessment Inside Out. 2012. The legacy of PLA: 40 years of practice.
This inaugural issue of PLAIO focuses on the historical roots of prior learning assessment and examines how these foundations are connected to--or disconnected from--current trends in higher education.
Link to Full Text | Show Similar Items | Show Associated Keywords
148. Provezis, S., Jankowski, N. May 2011. NILOA transparency framework: A tool for transparent communication of assessment information.
Presentation at The Association for the Assessment of Learning in Higher Education (AALHE) on Student Learning Assessment Components with examples.
Link to Full Text | Show Similar Items | Show Associated Keywords
149. Puncochar, J., & Klett, M. 2013. A model for outcomes assessment of undergraduate science knowledge and inquiry processes.
To measure the efficacy of a Liberal Studies education, a Midwestern regional university developed a systematic, rubric-guided assessment based on nationally recognized science principles and inquiry processes to evaluate student work in undergraduate science laboratory courses relative to a liberal education. The rubric presented a direct measure of student understandings of science inquiry processes. The assessment procedure used stratified random sampling at confidence levels of 95% to select student work, maintained anonymity of students and faculty, addressed concerns of university faculty, and completed a continuous improvement feedback loop by informing faculty of assessment results to assess and refine science-inquiry processes of course content. The procedure resulted in an assessment system for benchmarking science inquiry processes evident in student work and offered insights into the effect of undergraduate science laboratory courses on student knowledge and understanding.
Link to Full Text | Show Similar Items | Show Associated Keywords
150. Ravishanker, G. 2011. Doing academic analytics right: Intelligent answers to simple questions (Research Bulletin 2).
This ECAR research bulletin explores the various factors that must come together for an institution to have an academic analytics infrastructure that is flexible, agile, appropriately structured, and cost-effective. It examines not only appropriate technologies but, more importantly, the critical roles that stakeholders and governance play in setting the stage for success. There is a tremendous amount at stake for American higher education right now. How can IT facilitate—and motivate—our institutions to adopt the tools that can help mine the gold that resides in our very own vaults?
Link to Full Text | Show Similar Items | Show Associated Keywords
151. Ridley, D. R., & Smith, E. D. 2006. Writing across the curriculum works: The impact of writing emphasis upon senior exit writing samples.
Seniors’ writing skills were assessed in 1998 at a medium-sized public university. Blind scoring, a standard scoring guide, and trained graders were used. Curricular writing emphasis was assessed through a syllabus study, yielding a Curricular Emphasis Score. Controlling for entry-level skill in writing, Writing Score and Curricular Emphasis were highly correlated.
Link to Full Text | Show Similar Items | Show Associated Keywords
152. Roszkowski, M. J., & Ricci, R. 2004. Measurement of importance in a student satisfaction questionnaire: Comparison of the direct and indirect methods for establishing attribute importance.
Members of a research- or measurement-minded audience may find this article interesting. The authors examine relationships between importance and satisfaction on dual-scale student satisfaction questionnaires (like the SSI).
Link to Full Text | Show Similar Items | Show Associated Keywords
153. Schermer, T. and Gray, S. 2012, July. The senior capstone: Transformative experiences in the liberal arts (The Teagle Foundation Final Report).
The four private liberal arts colleges participating in this study – Allegheny College, Augustana College, Washington College, and The College of Wooster – are distinctive in that they require all seniors to engage in an intensive mentored experience (“capstone”) that is designed and executed by the student using the theories, methods, and tools of a discipline, resulting in a scholarly or creative work. While we have long believed the experience to be transformative, the evidence has been largely anecdotal. This report presents some concrete findings on the impact of capstones on student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
154. Schneider, C.G. Winter 2013. Holding Courses Accountable for Competencies Central to the Degree.
This article explores the connections between typical college coursework and a recent focus on competencies--rather than credits--as the central requirement for earning degrees.
Link to Full Text | Show Similar Items | Show Associated Keywords
155. Schuh, J. H., Upcraft, M. L., & Associates. 1996. Assessment in student affairs: A guide for practitioners.
This is a "single-volume, practical resource on using assessment to develop and improve all facets of student affairs. It includes detailed guidance for student affairs staff on how to assess student needs, student satisfaction, campus environments, campus cultures, and student outcomes. And it explains how senior staff can employ assessment findings in strategic planning, policy development, and day-to-day decision making."
Link to Full Text | Show Similar Items | Show Associated Keywords
156. Seemiller, C., & Murray, T. 2013. The common language of leadership.
This article outlines the results of a comprehensive examination of learning outcomes of 475 academic programs within 72 academic accrediting organizations in regard to student leadership development.
Link to Full Text | Show Similar Items | Show Associated Keywords
157. Siemens, G., & Long, P. 2011. Penetrating the fog: Analytics in learning and education.
This article helps explain the role that analytics can play in higher education and learning outcomes, especially the ways in which analytics can help educators improve teaching techniques and student learning. The article also touches upon how analytics can help administrators address resource allocation and improve the quality of education in tumultuous economic times.
Link to Full Text | Show Similar Items | Show Associated Keywords
158. Smith, B. P. 2007. Student ratings of teaching effectiveness: An analysis of end-of-course faculty evaluations.
The purpose of this study was to describe student ratings of teaching effectiveness for faculty in the College of Education (COE) at a Research I institution in the Southern United States.
Link to Full Text | Show Similar Items | Show Associated Keywords
159. Southern Education Foundation. Advancing Excellence, Enhancing Equity: Making the Case for Assessment at Minority-Serving Institutions.
This publication from the Southern Education Foundation offers practical steps for initiating assessment programs at minority-serving institutions.
Link to Full Text | Show Similar Items | Show Associated Keywords
160. Southern Education Foundation. Advancing Excellence, Enhancing Equity: Making the Case for Assessment at Minority-Serving Institutions.
This brief considers practical and effective ways to improve student success and assessment at minority-serving institutions.
Link to Full Text | Show Similar Items | Show Associated Keywords
161. Spurlin, J., Rajala, S., Lavelle, J. 2008. Designing better engineering education through assessment: A practical resource for faculty and department chairs on using assessment and ABET criteria to improve student learning.
This book is written for engineering faculty and department chairs as a practical guide to improving the assessment processes for undergraduate and graduate engineering education in the service of improved student learning. It is written by engineering faculty and assessment professionals who have many years of experience in assessment of engineering education and of working with engineering faculty.
Link to Full Text | Show Similar Items | Show Associated Keywords
162. Stassen, M. L. A., Doherty, K., & Poe, M. 2001. Course-based review and assessment: Methods for understanding student learning.
This handbook provides an overview to assessment and use of assessment in the classroom, helps define your goals and objectives for the class, gives techniques of how to assess and finally, gives ways to understand and use the results gained.
Link to Full Text | Show Similar Items | Show Associated Keywords
163. Sullivan, T. A., Mackie, C., Massy, W. F., & Sinha, E. 2012. Improving measurement of productivity in higher education.
A report recently released by the National Research Council titled, "Improving Measurement of Productivity in Higher Education," discusses various ways to measure institutional quality and college productivity.
Link to Full Text | Show Similar Items | Show Associated Keywords
164. Terenzini, P. T. 1989. Assessment with open eyes: Pitfalls in studying student outcomes.
This article identifies some of the serious conceptual, measurement, organizational, and political problems likely to be encountered in the process of designing and implementing an assessment program and how some of them might be avoided.
Link to Full Text | Show Similar Items | Show Associated Keywords
165. Tweedell, C. Sep/Oct 2011. Assessment on a budget: Overcoming challenges of time and money.
Abstract: The article focuses on strategies for educational assessment. It says that having faculty develop program-level learning outcomes is one of the first steps in organizing program assessment. It adds that standardized tests are not designed for a specific institution and may have only a limited relationship with improving learning outcomes. It further suggests that assessing students' learning within their own educational program is the most effective assessment approach.
Link to Full Text | Show Similar Items | Show Associated Keywords
166. Van Middlesworth, C. L. 2003. Community college strategies: Assessing learning communities.
Learning communities are clusters of courses that are taught as an integrated unit. "Learning communities present unique challenges to an institution’s assessment program because they do not lend themselves to an off-the-shelf assessment design. Adequate assessment of learning communities requires viewing the initiative through several lenses: instruction; communication; social cohesion or interaction; student and faculty learning; student reflections on their learning; and faculty perceptions of learning activities, support, and instructional atmosphere. Assessing learning communities requires more than using standardized instruments to measure what students know; it also involves developing methodologies to find out how students learn" (p. 12). The article presents the case of the Metropolitan Community College District (MCCD), describing six quantitative and qualitative methods of learning communities assessment: (1) structured interviews of students and faculty, (2) classroom observation, (3) a student learning survey, (4) embedded
Link to Full Text | Show Similar Items | Show Associated Keywords
167. Virginia Assessment Group. 2013, Winter. Research & Practice in Assessment.
This issue focuses on assessment in the field of student affairs and showcases the scholarship of faculty from five major research universities. In a paper on assessing college student growth in student affairs, Nicholas Bowman of Bowling Green State University argues that reliance solely upon self-reported measures of student growth in cognitive domains such as learning and leadership can be “highly problematic and potentially misleading” to institutional decision-makers. Bowman then suggests five concrete ways to improve the quality of assessment data used to drive decision-making in student affairs.
Link to Full Text | Show Similar Items | Show Associated Keywords
168. Wagner, E., & Ice, P. 2012. Data changes everything: Delivering on the promise of learning analytics in higher education.
Analytics can aid educators across all sectors to create significant learning experiences for students, which can lead to increased student engagement and development. Seemingly miniscule aspects of professor and student online habits and personal preferences, combined with systematically tracking information, can come together, through patterns identified by analytics, to improve learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
169. White, E. 2007. Assigning, responding, evaluating: A writing teacher's guide.
Ed White's practical guide to designing writing assignments, writing tests, and evaluating student writing has been thoroughly updated for the fourth edition, including new sections on directed self-placement, computer scoring of writing, Phase 2 scoring of portfolios, and much more.
Link to Full Text | Show Similar Items | Show Associated Keywords
170. WICHE Cooperative for Educational Technologies (WCET). 2010. No significant difference.
This website has been designed to serve as a companion piece to Thomas L. Russell's book, "The No Significant Difference Phenomenon" (2001, IDECC, fifth edition). Mr. Russell's book is a fully indexed, comprehensive research bibliography of 355 research reports, summaries and papers that document no significant differences (NSD) in student outcomes between alternate modes of education delivery, with a foreword by Dr. Richard E. Clark. Previous editions of the book were provided electronically; the fifth edition is the first to be made available in print from IDECC (The International Distance Education Certification Center).
Link to Full Text | Show Similar Items | Show Associated Keywords