Publication Search Results

Search returned 152 results using Keyword: "Using assessment for improvement"



1. Association for the Assessment of Learning in Higher Education (AALHE).
AALHE is an organization of practitioners interested in using effective assessment practice to document and improve student learning. It also offers blog entries to stimulate online conversations about assessment. Membership in the AALHE is open to all who have an interest in assessing and improving student learning in higher education. Individual and institutional memberships are available.
Link to Full Text | Show Similar Items | Show Associated Keywords
2. Association of American Universities Data Exchange (AAUDE).
AAUDE is a public service organization whose purpose is to improve the quality and usability of information about higher education. Its membership comprises AAU institutions that participate in the exchange of data and information to support decision making at their institutions.
Link to Full Text | Show Similar Items | Show Associated Keywords
3. Building Engagement and Attainment for Minority Students (BEAMS).
Specifically focused on student engagement and learning, the BEAMS program worked with minority-serving institutions (MSIs) to improve student engagement and success. Over 100 baccalaureate MSIs were involved in the program.
Link to Full Text | Show Similar Items | Show Associated Keywords
4. Minority Serving Institutions (MSI) Student Learning Outcomes Institute.
As part of this program, the Southern Education Foundation (SEF) was granted $500,000 to "Increase MSIs' commitment to transparency and effectiveness in improving student learning outcomes," one of the objectives of Lumina's MSIs Models of Success agenda (Lumina, 2010). Through this grant, SEF's mission is "to enhance student learning, outcomes assessment, documentation, and use at HBCUs and HSIs" (SEF, 2011). The first institute was held in January 2011. SEF hosted its second MSI Student Learning Outcomes Institute February 2-4, 2012, at the Grand Hyatt Hotel in Atlanta, GA. The Association for Institutional Research (AIR) and SEF co-hosted an IPEDS workshop on February 2. Seven institutions have already been chosen to help lead the efforts.
Link to Full Text | Show Similar Items | Show Associated Keywords
5. MSIs models of success program.
Lumina Foundation for Education's MSIs Models of Success Program is a recent effort to promote student success at MSIs. With funding from Lumina, the Institute for Higher Education Policy (IHEP) managed the technical aspects of the program for grantees. Its five goals included: 1. Improve MSIs' capacity to collect, analyze, and use data to inform decisions that promote student success. 2. Strengthen policy and practice to improve developmental education. 3. Create a collective voice for policy advocacy on behalf of MSIs. 4. Increase MSIs' commitment to transparency and effectiveness in improving student outcomes. 5. Increase completion or graduation rates among underserved students, especially men of color.
Link to Full Text | Show Similar Items | Show Associated Keywords
6. New England Consortium on Assessment and Student Learning.
In collaboration with the New England Association of Schools and Colleges, NECASL initiated an innovative assessment project exploring how students learn and how they make important decisions about their academic programs.
Link to Full Text | Show Similar Items | Show Associated Keywords
7. New Leadership Alliance for Student Learning and Accountability.
The New Leadership Alliance for Student Learning and Accountability (the Alliance) was established to improve student learning at the undergraduate level and to find educationally valid ways of demonstrating that such improvement is taking place. The Alliance aims to improve student learning through voluntary and cooperative professional efforts to significantly improve assessment of, and accountability for, student learning outcomes. It also aims to convey to the higher education community and the larger public the importance of a quality college education in preparation for work, life, and responsible citizenship.
Link to Full Text | Show Similar Items | Show Associated Keywords
8. Presidents' Alliance for Excellence in Student Learning and Accountability.
Institutions joining the Presidents' Alliance, an initiative of the New Leadership Alliance for Student Learning and Accountability, are publicly making a commitment to significantly improve assessment of, and accountability for, student learning outcomes on their campuses. This involves committing to an Action Plan to build on previous work to assess, report on, and use evidence to improve student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
9. Valid Assessment of Learning in Undergraduate Education (VALUE).
The VALUE project seeks to contribute to the national dialogue on the assessment of college student learning. It builds on a philosophy of learning assessment that privileges authentic assessment of student work, and shared understanding of student learning outcomes on campuses, over reliance on standardized tests administered to samples of students outside of their required courses. The result of this philosophy has been the collaborative development of 15 rubrics by teams of faculty and academic professionals on campuses from across the country.
Link to Full Text | Show Similar Items | Show Associated Keywords
10. 21st-Century Commission on the Future of Community Colleges. 2012. Reclaiming the American dream: Community colleges and the nation's future.
This report urges community colleges to more effectively assess the learning outcomes of their students in order to build a culture of evidence. A brief overview of the Voluntary Framework of Accountability's work on assessment in community colleges is offered.
Link to Full Text | Show Similar Items | Show Associated Keywords
11. Ackermann, E. Summer 2007. Program Assessment in Academic Libraries: An Introduction for Assessment Practitioners.
This paper addresses recent changes in the perception of libraries' functions in higher education and developments in measurement tools. The report looks at three issues at the forefront of library assessment: (1) the tradition of assessment in libraries; (2) the current state of affairs and challenges of assessing the following library components: instruction, services, and resources; and (3) implications for the future of library assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
12. Allen, J., & Bresciani, M. J. 2003, January. Public institutions public challenges.
A discussion about the use of assessments and reports in transparent communication.
Link to Full Text | Show Similar Items | Show Associated Keywords
13. Alliance for Excellent Education. 2013. Expanding Education and Workforce Opportunities Through Digital Badges.
This report from the Alliance for Excellent Education explains how student learning outcomes can be improved through the use of digital badges.
Link to Full Text | Show Similar Items | Show Associated Keywords
14. Appling, J., Gancar, J., Hughes, S., & Saad, A. 2012. Class syllabi, general education, and ePortfolios.

Link to Full Text | Show Similar Items | Show Associated Keywords
15. Appling, J., Dippre, A., Gregory, E., Hembree, M., Kooi, K., Pazzo, K., Carson, S., & Shawen, A. 2015. General education and ePortfolios: Syllabi and the role of faculty.
Clemson University's General Education assessment plan was evaluated using student ePortfolios. Faculty had concerns about how the natural science requirement was communicated to students and determined that changes to their syllabi could be an effective response.
Link to Full Text | Show Similar Items | Show Associated Keywords
16. Arcario, P., Eynon, B., Klages, M., & Polnariev, B. A. 2013. Closing the loop: How we better serve our students through a comprehensive assessment process.
Outcomes assessment is often driven by demands for accountability. LaGuardia Community College's outcomes assessment model has advanced student learning, shaped academic program development, and created an impressive culture of faculty-driven assessment. Our inquiry-based approach uses ePortfolios to collect student work and demonstrates the importance of engaging faculty in the design of outcomes assessment to continually "close the assessment loop." This article outlines the steps, successes, and challenges involved in constructing an effective outcomes assessment model that deepens learning across the institution.
Link to Full Text | Show Similar Items | Show Associated Keywords
17. Association for Institutional Research. 2009. A ten-step process for creating outcomes assessment measures for an undergraduate management program: A faculty-driven process.
This paper offers a plan for involving department faculty members in the creation of outcomes assessment by borrowing from the current literature in the field, as well as literature from human resources development and organizational behavior.
Link to Full Text | Show Similar Items | Show Associated Keywords
18. Association of American Colleges and Universities. 2009. Assessing learning outcomes: Lessons from AAC&U’s VALUE project.
The entire Winter 2009 edition of Peer Review addresses the VALUE project. Information presented includes an overview of the project, information on e-portfolios, the application of rubrics, the assessment process, and the use of assessment results for improvement.
Link to Full Text | Show Similar Items | Show Associated Keywords
19. Baker, G. R. February 2012. North Carolina A&T State University: A culture of inquiry.
North Carolina A&T was selected for inclusion as a case study for NILOA due to its commitment to improving its campus by developing a "culture of inquiry"—specifically as this relates to student learning outcomes assessment activities. Three elements have been instrumental in A&T's drive to become a more data-driven institution: 1) administrative leadership that encourages discussions and collaboration around student learning outcomes assessment activities on campus; 2) the use of professional development opportunities to help foster the involvement and commitment of faculty members; and 3) the systematic and intentional use of student feedback.
Link to Full Text | Show Similar Items | Show Associated Keywords
20. Baker, G. R. April 2012. Texas A&M International University: A culture of assessment INTEGRATEd.
Texas A&M International University was selected as a NILOA case study institution due to 1) its commitment to choosing assessments and tools appropriate for its students, 2) its long history with and innovative approach to assessment, and 3) the influential role of professional development at the institution to help prepare “Assessment Champions” and expand the number of “pockets of excellence” in terms of assessment practices throughout the campus.
Link to Full Text | Show Similar Items | Show Associated Keywords
21. Baker, G. R., Jankowski, N., Provezis, S. & Kinzie, J. 2012, July. Using assessment results: Promising practices of institutions that do it well.
To learn more about what colleges and universities are doing to use assessment data productively to inform and strengthen undergraduate education, NILOA conducted nine case studies. This report synthesizes the insights from these individual studies to discern promising practices in using information about student learning. The report concludes with lessons learned and reflective questions to help institutions advance their own assessment efforts within their specific institutional contexts.
Link to Full Text | Show Similar Items | Show Associated Keywords
22. Banta, T. W. May/Jun 2011. Double loop learning in assessment.
Abstract: The article discusses double loop learning in assessment, a cycle that moves from planning to implementation, analysis of findings, and improvement, and then back to planning. It suggests that the double loop should also involve measuring again to check the effects of the improvements made. The article concludes that there is a critical need for measurement experts in developing the methods for the scholarship of assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
23. Banta, T. W. (Ed.). 2004. Hallmarks of effective outcomes assessment.
"This booklet brings together the best guidance and practices from Assessment Update to illustrate time-tested principles for all aspects of assessment from planning and implementing to sustaining and improving assessment efforts over time. Useful for those new to assessment as well as experienced practitioners, it details the specific hallmarks required for the success of any assessment program--from leadership and staff development to the assessment of process as well as outcomes, ongoing communication among constituents, and more."
Link to Full Text | Show Similar Items | Show Associated Keywords
24. Banta, T. W. (Ed.). 1999. Portfolio assessment: Uses, cases, scoring, and impact.
"This booklet's articles explore how portfolios, including Web-based portfolios, have been used at various institutions to assess and improve programs in general education, the major, advising, and overall institutional effectiveness. They describe ways portfolios can be scored, students' perspectives on portfolios, how portfolios changed the faculty culture at one college, and more."
Link to Full Text | Show Similar Items | Show Associated Keywords
25. Banta, T. W., Jones, E. A., & Black, K. E. 2009. Designing effective assessment: Principles and profiles of good practice.
Over 146 higher education institutions were profiled in order to identify the 13 most essential principles for good practice in assessing student learning outcomes. Three phases of assessment - planning, implementing, and improving and sustaining assessment on campus - are the focus.
Link to Full Text | Show Similar Items | Show Associated Keywords
26. Banta, T. W., Pike, G. R., Hansen, M. J. 2009. The use of engagement data in accreditation, planning and assessment.
This article provides a basis for the use of evidence in institutional decision making and planning. The authors identify four steps in creating a “culture of evidence”: goal setting, identifying assessment measures, tracking and analyzing the data collected, and applying the findings. NSSE results are used as examples to illustrate the four steps.
Link to Full Text | Show Similar Items | Show Associated Keywords
27. Banta, T.W., Griffin, M., Flateby, T.L., & Kahn, S. December 2009. Three promising alternatives for assessing college students' knowledge and skills.
In this paper, assessment experts Trudy Banta, Merilee Griffin, Theresa Flateby, and Susan Kahn describe the development of several promising authentic assessment approaches. The contributors draw on their rich assessment experience to illustrate how portfolios, common analytic rubrics, and online assessment communities can more effectively link assessment practices to pedagogy. In addition to discussing the strengths and limitations of each approach, the paper offers concrete examples of how these authentic approaches are being used to guide institutional improvement, respond to accountability questions, and involve more faculty, staff, and students in meaningful appraisals of learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
28. Baron, M. A., & Boschee, F. 1995. Authentic assessment: The key to unlocking student success.
"A review of authentic assessment that provides, in addition to a thorough grounding in the topic area, insightful thoughts on the purposes of evaluation, the nature of school planning, and the current status of efforts at school reform."
Link to Full Text | Show Similar Items | Show Associated Keywords
29. Bashford, J., & Slater, D. January 2008. Assessing and improving student outcomes: What we are learning at Miami Dade College.
This paper presents how Miami Dade College’s institutional effectiveness office uses data to make decisions about college operations in an attempt to improve student outcomes. Strategies are presented and examples of institutionalizing those strategies are examined.
Link to Full Text | Show Similar Items | Show Associated Keywords
30. Bamber, V., Trowler, P., Saunders, M., & Knight, P. 2009. Enhancing learning, teaching, assessment and curriculum in higher education.
Using case studies and theoretical frameworks, this book invites readers to conceptualize improvement within their institution.
Link to Full Text | Show Similar Items | Show Associated Keywords
31. Benjamin, R. February 2011. Avoiding a tragedy of the commons in postsecondary education.
At this moment in history, human capital -- the stock of knowledge and skills citizens possess -- is our country’s principal resource. To develop human capital requires a high-performing educational system, as education is the primary venue for preserving and enhancing human capital. But a storm is brewing in plain sight. Here’s a brief, incomplete, but ominous sketch of the problem and what it means for assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
32. Beyl, C. A. 2011. Still striving: Using a hypothetical university to demonstrate holistic assessment at the university, program and course level.
In this paper, WMU, a hypothetical university, is examined. Beginning with an academic audit, WMU used this information to assess student learning outcomes at the university, program, and course levels. Through intrusive institutional research and assessment, WMU was able to create a quality enhancement plan that fit its needs and addressed what it had learned about student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
33. Blaich, C. F. & Wise, K. S. January 2011. From gathering to using assessment results: Lessons from the Wabash national study.
Drawing from the Wabash Study, a multi-institutional longitudinal research and assessment project, Charlie Blaich and Kathy Wise, from the Center of Inquiry at Wabash College, share their field-tested findings and lessons learned about campus use of assessment results. The Wabash Study assists institutions in collecting, understanding and using data. The researchers at the Center of Inquiry found the last component to be the real challenge—using the data for improved student learning. In this Occasional Paper, Blaich and Wise describe the accountability movement, the history and purpose of the Wabash Study, and the reasons why institutions have a hard time moving from gathering data to using data, giving five practical steps to campus leaders for using the data collected.
Link to Full Text | Show Similar Items | Show Associated Keywords
34. Blaney, J., Filer, K., & Lyon, J. Summer 2014. Assessing High Impact Practices Using NVivo: An Automated Approach to Analyzing Student Reflections for Program Improvement.
Roanoke College developed a system to automate the qualitative coding process using NVivo, a software analysis tool, allowing them to identify patterns in student learning that indicate effective and ineffective aspects of applied learning experiences. The NVivo query approach led to increased efficiency in the assessment of most HIPs included in the experiential learning program at Roanoke College.
Link to Full Text | Show Similar Items | Show Associated Keywords
35. Blasi, L. December 2011. How assessment and institutional research staff can help faculty with student learning outcomes assessment.
Institutional researchers can provide support for faculty members as they seek to improve the attainment of student learning outcomes through assessment. Sometimes a few dedicated faculty members drive the process, but increased faculty support is needed to cultivate a culture of assessment on campus.
Link to Full Text | Show Similar Items | Show Associated Keywords
36. Bollag, B. 2006. Making an art form of assessment.
This article discusses Alverno College, a leader in assessment in higher education.
Link to Full Text | Show Similar Items | Show Associated Keywords
37. Bollag, B. 2006. Using quality benchmarks for assessing and developing undergraduate programs.
This book uses selected performance criteria benchmarks to assist undergraduate programs in defining their educational missions and goals as well as documenting their effectiveness. It helps faculty and administrators use benchmarks not only to assess student learning outcomes, but also to conduct program assessment, evaluate student learning, create meaningful faculty scholarship, ensure quality teaching, and forge connections to the community.
Link to Full Text | Show Similar Items | Show Associated Keywords
38. Brennan, R. L., Gao, X., & Colton, D. A. 1995. Generalizability analyses of work keys listening and writing tests.
An article on the psychometric characteristics of the listening and writing tests of the WorkKeys program.
Link to Full Text | Show Similar Items | Show Associated Keywords
39. Bresciani, M. Summer 2011. Identifying barriers in implementing outcomes-based assessments program review: A grounded theory analysis.
While conversations proposing standardized testing within higher education abound (Allen & Bresciani, 2003; Department of Education (DOE), 2006; Ewell, 1997a, 1997b; Ewell & Jones, 1996; Maki, 2004; Palomba & Banta, 1999), proponents of outcomes-based assessment program review are still applauding the value of the process and the extent to which it can be used to inform decisions to improve student learning and development (Bresciani, 2006; Bresciani, Zelna, & Anderson, 2004; Huba & Freed, 2000; Maki, 2004; Mentkowski, 2000; Palomba & Banta, 1999; Suskie, 2004). As such, practitioners of outcomes-based assessment continue to seek various ways to meaningfully engage in outcomes-based assessment program review in order to find ways to improve student learning and development.
Link to Full Text | Show Similar Items | Show Associated Keywords
40. Bresciani, M. J. August 2011. Making assessment meaningful: What new student affairs professionals and those new to assessment need to know.
As demands for assessment become more widespread throughout higher education institutions, knowledge about assessment is even more critical for new student affairs professionals. Marilee J. Bresciani provides a quick overview of how new student affairs professionals can contribute both effectively and meaningfully to assessment practices at their institutions.
Link to Full Text | Show Similar Items | Show Associated Keywords
41. Bresciani, M. J., & Uline, C. L. Mar/Apr 2012. Assessing Ed.D. programs for program evaluation and improvement and impact on PK-20 learning environments.
Abstract: The article discusses research on the Independent Doctoral Program in Educational Leadership of the California State University (CSU). It describes the high expectations for the graduates of the program, particularly their possible contributions to PK-20 student learning and success. The researchers reportedly evaluated how the graduates used their leadership and research skills in improving learning environments. Further design recommendations suggested to improve the program are cited.
Link to Full Text | Show Similar Items | Show Associated Keywords
42. Bridges, B. K., Kinzie, J., Nelson Laird, T. F., & Kuh, G. D. 2008. Student engagement and student success at historically Black and Hispanic-serving institutions.
This book chapter provides examples of the use of student engagement assessments and data to promote student success at MSIs.
Link to Full Text | Show Similar Items | Show Associated Keywords
43. Buente, W., Winter, J. S., & Kramer, H. 2015. Program-based assessment of capstone ePortfolios for a communication BA curriculum.
In 2013, the Department of Communication at the University of Hawaii at Manoa used ePortfolios to evaluate its program. Using ePortfolios as an assessment tool, the department found gaps in its curriculum and identified several improvements to its current processes.
Link to Full Text | Show Similar Items | Show Associated Keywords
44. California State University Northridge. 2014. SUNY's General Education "tips" for closing the loop and frequently asked questions.
Link to Full Text | Show Similar Items | Show Associated Keywords
45. Case, S. 2007. Reconfiguring and realigning the assessment feedback processes for an undergraduate criminology degree.
The author conducted this study to ask how the assessment process could be streamlined while still maximizing student learning benefits, aiming to merge explicit engagement with assessment criteria and constructive feedback. The reconfigured system was adopted as a standard in the Criminology department.
Link to Full Text | Show Similar Items | Show Associated Keywords
46. Connors, R. & Smith, T. 2011. Change the culture, change the game: The breakthrough strategy of energizing your organization and creating accountability for results.
This book offers advice for shaping and managing organizational culture.
Link to Full Text | Show Similar Items | Show Associated Keywords
47. Council of Independent Colleges. 2008. Evidence of learning: Applying the collegiate learning assessment to improve teaching and learning in the liberal arts college experience.
This report, sponsored by the Council of Independent Colleges, presents the experience of a consortium of 33 CIC member colleges and universities with the CLA over a period of three years.
Link to Full Text | Show Similar Items | Show Associated Keywords
48. Cunningham, A., & Leegwater, L. 2011. Minority-serving institutions: What can we learn?
The role that MSIs play in the lives of low-income students and students of color, with respect to institutional policies and practices particular to these institutions, is the focus of this chapter. Included are promising practices facilitating student success and ways to circumvent potential barriers for low-income students at MSIs.
Link to Full Text | Show Similar Items | Show Associated Keywords
49. Davis Sr., L. 2009. Still striving: The role of faculty and staff in the SACS accreditation process.
Involvement of faculty and staff in the accreditation process has never been more important. Without their involvement, student learning outcomes, and therefore quality education, cannot properly be addressed. The goal of this paper is “to encourage HBCU faculty and staff to embrace their roles in relation to accreditation and better understand SACS’ requirements and points of emphasis” (p. 3).
Link to Full Text | Show Similar Items | Show Associated Keywords
50. Del Rios, M., & Leegwater, L. 2008. Increasing student success at minority-serving institutions: Findings from the Beams project.
The primary purpose of Building Engagement and Attainment for Minority Students (BEAMS) is to help institutions cultivate data-driven initiatives that promote student learning, engagement, and success. The BEAMS report discusses findings from the MSIs that participated in the project from 2004 to 2008. Teams were given assistance in building a culture of evidence to help inform decision making at their institutions.
Link to Full Text | Show Similar Items | Show Associated Keywords
51. Denecke, D., Kent, J., & Wiener, W. 2011. Preparing future faculty to assess student learning.
Seeking to enhance teaching and student learning through assessment, this report looks at the efficacy of using programs similar to and including Preparing Future Faculty (PFF) to prepare graduate students entering the professoriate.
Link to Full Text | Show Similar Items | Show Associated Keywords
52. Diamond, R. M. 2008. Designing and assessing courses and curricula: A practical guide. (3rd ed.).
This updated book provides tools and examples for those interested in adopting a learner-centered approach in their courses.
Link to Full Text | Show Similar Items | Show Associated Keywords
53. Donahoo, S., & Lee, W. Y. 2008. The adversity of diversity: Regional associations and the accreditation of minority-serving institutions.
This book chapter examines recent regional accreditation decisions concerning MSIs and the resulting campus impact.
Link to Full Text | Show Similar Items | Show Associated Keywords
54. Driscoll, A. & Wood, S. 2007. Developing outcomes-based assessment for learner-centered education: A faculty introduction.
This book explains how faculty can comfortably use outcomes-based assessment within their own instruction. The authors navigate readers through the process of articulating expectations, defining criteria and standards, and aligning course content with desired outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
55. Dwyer, C. A., Millett, C. M. & Payne, D. G. 2006, June. A culture of evidence: Postsecondary assessment and learning outcomes.
To understand the value that a college experience adds to student inputs, three measurements must be addressed: student input measures (What were student competencies before college?), student output measures (What were student competencies after college?), and a measure of the change between inputs and outputs. This paper also briefly reviews principles of fair and valid testing that pertain to the assessments being recommended.
Link to Full Text | Show Similar Items | Show Associated Keywords
56. Elgie, S., Childs, R., Fenton, N. E., Levy, B. A., Lopes, V., Szala-Meneok, K., & Wiggers, R. D. 2012. Researching teaching and student outcomes in postsecondary education.
“The guide reflects a growing dedication to assessment and evaluation in teaching and learning, and more broadly to evidence-based practice in all issues related to student success.”
Link to Full Text | Show Similar Items | Show Associated Keywords
57. Erradi, A. 2012. EasyCapstone: A framework for managing and assessing capstone design projects.
To enhance student learning and satisfy ABET requirements, the Department of Computer Science and Engineering at Qatar University has undertaken significant enhancements to its senior design project course over the past few years. This work has produced a framework for managing and assessing capstone design projects, along with a web-based application named easyCapstone that eases adoption of the framework by automating key workflows, particularly project registration, submission of deliverables, scheduling of project presentations, assessment of student work, and timely, personalized feedback to students.
Link to Full Text | Show Similar Items | Show Associated Keywords
58. Erwin, T. D. Summer 2012. Intellectual college development related to alumni perceptions of personal growth.
Alumni self-ratings of their personal growth were linked to their intellectual development during college four to seven years earlier. Graduates that were satisfied with their personal growth in the arts, creative thinking, making logical inferences, learning independently, exercising initiative, and tolerating other points of view had higher intellectual scores in Commitment and Empathy as undergraduates years earlier. These findings support a relationship between college student intellectual development and alumni perceptions of their personal growth. The implications of this study support continuing the custom of querying graduates about their earlier education, a practice in wide use already; and add to the validity of the Scale of Intellectual Development as a measure of college impact upon personal dispositions.
Link to Full Text | Show Similar Items | Show Associated Keywords
59. Ewell, P. 2005. Across the grain: Learning from reform initiatives in undergraduate education.
This paper explores the question: “What makes a particular change initiative successful?” Features of successful reform initiatives are highlighted.
Link to Full Text | Show Similar Items | Show Associated Keywords
60. Ewell, P. T. 2009, November. Assessment, accountability, and improvement: Revisiting the tension.
Assessments of what students learn during college are typically used for either improvement or accountability, and occasionally both. For reasons carefully outlined by Peter Ewell in this NILOA Occasional Paper, since the early days of the “assessment movement” in the US, these two purposes of outcomes assessment have not rested comfortably together. No one is more qualified than Ewell to summarize what has changed and what has not over the past two decades in terms of student learning outcomes assessment and the shifting expectations and demands of policy makers, accreditors, higher education leaders, and government officials about student and institutional performance. After delineating how various kinds of information can and should be used for improvement and accountability, he points to ways that institutions can productively manage the persistent tensions associated with improvement and accountability as faculty and staff members do the important work of documenting, reporting, and using what students learn.
Link to Full Text | Show Similar Items | Show Associated Keywords
61. Flores, S.M. 2006. Benchmarking: An essential tool for assessment, improvement, and accountability.
"This volume provides the reader with an increased understanding of benchmarking in the community college sector through four examples of national benchmarking initiatives designed specifically for two-year institutions; describes how the data from those initiatives are being used for assessment, institutional improvement, planning, management, and decision making; and discusses benchmarking's costs, benefits, and limitations."
Link to Full Text | Show Similar Items | Show Associated Keywords
62. Fulcher, K. and Orem, C. Winter 2010. Evolving from quantity to quality: A new yardstick for assessment.
Higher education experts tout learning outcomes assessment as a vehicle for program improvement. To this end the authors share a rubric designed explicitly to evaluate the quality of assessment and how it leads to program improvement. The rubric contains six general assessment areas, which are further broken down into 14 elements. Embedded within the article are links to the full rubric, an example of an exemplary assessment report, and a how-to guide for conducting and reporting quality assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
63. Fulcher, K. H., Good, M. R., Coleman, C. M., & Smith, K. L. 2014, December. A Simple Model for Learning Improvement: Weigh Pig, Feed Pig, Weigh Pig.
Assessing learning does not by itself result in increased student accomplishment, much like a pig never fattened up because it was weighed. Indeed, recent research shows that while institutions are more regularly engaging in assessment, they have little to show in the way of stronger student performance. This paper clarifies how assessment results are related to improved learning – assess, effectively intervene, re-assess – and contrasts this process with mere changes in assessment methodology and changes to pedagogy and curriculum. It also explores why demonstrating improvement has proven difficult for higher education. The authors propose a solution whereby faculty, upper administration, pedagogy/curriculum experts, and assessment specialists collaborate to enhance student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
64. Garcia, A. E., & Pacheco, J. M. 1992, March. A student outcomes model for community colleges: Measuring institutional effectiveness.
This paper, presented to the North Central Association of Colleges and Schools commission in 1992, outlines Santa Fe Community College's Student Outcomes Model (SOM). Through a series of ongoing outcomes studies, the SOM seeks to identify what the college should be teaching, measure the extent to which the college is actually doing so, and collect information to help the college better fulfill its mission.
Link to Full Text | Show Similar Items | Show Associated Keywords
65. Gasman, M., Baez, B., & Turner, C. S. V. (Eds.). 2009. Understanding minority-serving institutions.
In this book, the authors address pertinent issues and ideas related to MSIs. A few of the chapter titles include "Minority Serving Institutions: A Historical Backdrop," "Student Engagement and Student Success at Historically Black and Hispanic-Serving Institutions," and "The Adversity of Diversity: Regional Associations and the Accreditation of Minority Serving Institutions." In addition, Minority Serving Institutions are defined and details about their particular characteristics are discussed.
Link to Full Text | Show Similar Items | Show Associated Keywords
66. Gawande, A. 2009. The checklist manifesto: How to get things right.
This book offers real-life examples of how using checklists can result in immediate improvement in organizations.
Link to Full Text | Show Similar Items | Show Associated Keywords
67. Gerretson, H., & Golson, E. 2005. Synopsis of the use of course-embedded assessment in a medium size public university’s general education program.
Discusses how the institution implemented assessment on its campus and how it is using the data collected.
Link to Full Text | Show Similar Items | Show Associated Keywords
68. Goldstein, P. J., & Katz, R. N. 2005. Academic analytics: The use of management information and technology in higher education—key findings.
Producing meaningful, accessible, and timely management information has long been the holy grail of higher education administrative technology. The last decade has seen institutions make substantial investments in enterprise computing infrastructure to meet this goal. But have we met it? Our information systems produce many reports, but are we getting the information we need?
Link to Full Text | Show Similar Items | Show Associated Keywords
69. Hamline University. 2014. Learning outcomes assessment. Reporting.
Reporting is the most important step in the continuous cycle of learning assessment. It is the collaborative process through which programs use evidence of student learning to gauge the efficacy of collective educational practices, and to identify and implement strategies for improving student learning. Responses can range from curricular or pedagogical change to new faculty/staff development or student learning activities, and from comprehensive revision to evidence-based affirmation of current practice.
Link to Full Text | Show Similar Items | Show Associated Keywords
70. Harris, D. 2011. Value-added measures in education: What every educator needs to know.
This book examines the use and misuse of value-added assessment measures in teacher evaluation and improvement, and in policy making.
Link to Full Text | Show Similar Items | Show Associated Keywords
71. Hawthorne, J., & Kelsch, A. 2012. Closing the loop: How an assessment project paved the way for GE reform.
Highlights a University of North Dakota (UND) assessment project rooted in five "actionable" principles: a) the need for scholarly credibility (assessment is perceived as scholarly in method and conception); b) authenticity (the degree to which the data generated feel "real" or "true"); c) keeping it local (grounded in a specific campus context); d) a faculty-owned project; and e) driven by genuine inquiry.
Link to Full Text | Show Similar Items | Show Associated Keywords
72. Heathfield, S. M. n.d. How to change your culture: Organizational culture change.
This article describes the steps required for changing and creating organizational culture.
Link to Full Text | Show Similar Items | Show Associated Keywords
73. Hecht, L. Achieving transparency, closing the loop.
This short case study discusses the use of transparent assessment strategies and technology to involve faculty in the assessment process, understand gaps in student learning, and foster solutions to close these gaps.
Link to Full Text | Show Similar Items | Show Associated Keywords
74. Hubert, D.A., & Lewis, K.J. 2014. A framework for general education assessment: Assessing information literacy and quantitative literacy with ePortfolios.
Examining 100 random student ePortfolios from General Education courses against two college-wide learning outcomes, the authors reflect on how the use of ePortfolios can effectively assess student work. Benefits of using ePortfolios, particularly in General Education, are also discussed.
Link to Full Text | Show Similar Items | Show Associated Keywords
75. Hurtado, S., & DeAngelo, L. 2012, Spring. Linking diversity and civic-minded practices with student outcomes: New evidence from national surveys.
This article examines national data to understand more about the impact of diversity and civic-related practices on specific student outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
76. Hutchings, P. April 2011. What new faculty need to know about assessment.
As a new faculty member, you will have questions about your students’ learning—as all thoughtful teachers do: Are they really learning what I’m teaching? How well do they understand the key concepts I’m focusing on? Can they apply what they’re learning in new contexts? What can I do better or differently to help students develop the skills and knowledge they need to be effective in this class, in subsequent courses, and in their future life and work? This assessment brief provides an introduction to the assessment of student learning for new faculty.
Link to Full Text | Show Similar Items | Show Associated Keywords
77. Hutchings, P., Ewell, P., Banta, T. 2012. AAHE principles of good practice: Aging nicely.
Twenty years ago, in 1992, the American Association for Higher Education’s Assessment Forum released its “Principles of Good Practice for Assessing Student Learning,” a document developed by twelve prominent scholar-practitioners of the movement. The principles have been widely used, studied, and written about (see for instance Banta, Lund, Black & Oblander, 1995), and adapted in other documents and statements. Their inclusion on the NILOA website is a welcome addition, for, like good wine, the AAHE Principles have aged quite nicely.
Link to Full Text | Show Similar Items | Show Associated Keywords
78. Minor, J. T. Minor details.
This blog provides insights from Dr. James T. Minor on the college completion agenda, higher education policy, and institutional performance, topics that are often of importance to MSIs.
Link to Full Text | Show Similar Items | Show Associated Keywords
79. Jankowski, N. August 2011. Capella University: An outcomes-based institution.
Capella University was selected for a case study due to its systematic, embedded student learning outcomes assessment process; its administrative support and vision of what assessment can do for individual learners; its transparency efforts such as Capella Results, which publicizes assessment results, and its help in developing Transparency By Design; and its use of assessment results to enhance learner success levels.
Link to Full Text | Show Similar Items | Show Associated Keywords
80. Jankowski, N. 2013, October. How institutions use evidence of assessment: Does it really improve student learning?
This presentation from the University of Illinois College of Education Higher Education Collaborative Series examines the ways in which institutions are using assessment data.
Link to Full Text | Show Similar Items | Show Associated Keywords
81. Jankowski, N. July 2011. Juniata College: Faculty led assessment.
Juniata College was identified as an example of good assessment practice for the faculty-led Center for the Scholarship of Teaching and Learning (SoTL Center) that champions and supports evidence-based teaching; an administration-supported accountability website that provides data and information about outcomes to multiple audiences; and the use of evidence of student learning to make improvements at the institution and individual course levels.
Link to Full Text | Show Similar Items | Show Associated Keywords
82. Jankowski, N. 2013, June. Showing an impact: Using assessment results to improve student learning.
This presentation from the 2013 Florida State Assessment Meeting describes how institutions are currently using assessments to improve student learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
83. Jankowski, N. April 2012. St. Olaf College: Utilization-Focused Assessment.
The National Institute for Learning Outcomes Assessment (NILOA) selected St. Olaf as a case study institution due to the institutional framing of assessment as inquiry in support of student learning and as meaningful, manageable, and mission-driven; the utilization-focus/backward-design approach employed in assessment; the integration of student learning outcomes assessment processes into faculty governance structures; and the collaborative involvement of multiple stakeholders and diverse ways in which evidence of student learning is utilized throughout the institution.
Link to Full Text | Show Similar Items | Show Associated Keywords
84. Jankowski, N. 2013, June. Using assessment evidence to improve student learning: Can it be done?
This presentation from the 2013 Assessment in Higher Education Conference describes how institutions are using assessment data.
Link to Full Text | Show Similar Items | Show Associated Keywords
85. Jankowski, N., & Provezis, S. November 2011. Making student learning evidence transparent: The state of the art.
Making Student Learning Evidence Transparent: The State of the Art is composed of four sections. The sections cover 1) the impact of national transparency initiatives; 2) the changing landscape of transparency; 3) the display of assessment results and their subsequent use; and 4) a synthesis of the previous three sections.
Link to Full Text | Show Similar Items | Show Associated Keywords
86. Jaschik, S. April 2009. 'Tuning' college degrees.
An Inside Higher Ed article on Tuning USA.
Link to Full Text | Show Similar Items | Show Associated Keywords
87. Beld, J. 2015. Building Your Assessment Toolkit: Strategies for Gathering Actionable Evidence of Student Learning.
This resource explores the various assessment strategies that Minority-Serving Institutions (MSIs) can utilize. It offers various questions for MSIs to ask themselves before beginning their assessment, an analysis of various assessment instruments, and advice on each approach.
Link to Full Text | Show Similar Items | Show Associated Keywords
88. Khan, R., Khalsa, D., Klose, K., and Cooksey, Y. Winter 2012. Assessing graduate student learning in four competencies: Use of a common assignment and a combined rubric.
Abstract: Since 2001, the University of Maryland University College (UMUC) Graduate School has been conducting outcomes assessment of student learning. The current 3-3-3 Model of assessment has been used at the program and school levels providing results that assist refinement of programs and courses. Though effective, this model employs multiple rubrics to assess a wide variety of assignments and is complex to administer. This paper discusses a new outcomes assessment model called C2, currently being piloted in UMUC’s Graduate School. The model employs a single common activity (CoA) to be used by all Graduate School programs. It is designed to assess four of the five student learning expectations (SLEs) using one combined rubric (ComR). The assessment activity, scored by trained raters, displays pilot results supporting inter-rater agreement. Pilot implementation of the C2 model has advanced its reliability and its potential to streamline current assessment processes in the Graduate School.
Link to Full Text | Show Similar Items | Show Associated Keywords
89. Tanner, K. D. 2012, Summer. Promoting Student Metacognition.
This paper discusses various ways instructors can integrate/teach metacognitive strategies in their class, and how metacognition can help faculty, as well. Metacognition can promote conceptual changes in students, improve thinking skills, and result in better academic performance.
Link to Full Text | Show Similar Items | Show Associated Keywords
90. Kinzie, J. June 2012. Carnegie Mellon University: Fostering assessment for improvement and teaching excellence.
Carnegie Mellon was selected as a case study for the National Institute for Learning Outcomes Assessment (NILOA) for having an approach to student learning outcomes assessment that reflects the institution’s commitment to interdisciplinarity and innovative teaching and learning. Three elements have been instrumental in CMU’s advances in program-level student learning outcomes assessment: 1) an institutionalized research-oriented and data-informed university decision-making process driven by deans and departments; 2) an organizational culture with established processes promoting continuous improvement; and 3) the elevation of a cross-campus faculty resource—the Eberly Center for Teaching Excellence—as the hub of assessment support. This case study broadly describes CMU’s approach to addressing the challenges of assessment, explores the salient elements of CMU’s culture for assessment and improvement, and then focuses on the positioning and role of the Eberly Center for Teaching Excellence in student learning outcomes assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
91. Kinzie, J. August 2011. Colorado State University: A comprehensive continuous improvement system.
Colorado State University was determined to be an instructive case study because its innovative learning outcomes assessment and institutional improvement activities have been highlighted in various publications (see Bender, 2009; Bender, Johnson, & Siller, 2010; Bender & Siller, 2006, 2009; McKelfresh & Bender, 2009) and have been noted by experts in assessment and accreditation. CSU's assessment effort in student affairs is a model for bridging the work of academic affairs and student affairs through student learning outcomes assessment. Over the last dozen years, CSU has expanded its continuous improvement system for managing information sharing to serve the decision-making and reporting needs of various audiences. This system—known as the CSU Plan for Researching Improvement and Supporting Mission, or PRISM—provides information on the university's performance in prioritized areas, uses a peer review system for feedback, and emphasizes the importance of documenting institutional improvements informed by evidence.
Link to Full Text | Show Similar Items | Show Associated Keywords
92. Kinzie, J. October 2010. Perspectives from campus leaders on the current state of student learning outcomes assessment: NILOA focus group summary 2009-2010.
This paper highlights lessons from four focus group sessions with campus leaders -- presidents, provosts, academic deans, and directors of institutional research from a variety of two- and four-year institutions -- regarding their perspectives on the state of learning assessment practices on their campuses.
Link to Full Text | Show Similar Items | Show Associated Keywords
93. Kinzie, J., Bers, T., Quinlan, M. K. January 2012. Student learning outcomes assessment at community colleges.
Presentation at Association of American Colleges and Universities (AAC&U) on approaches and uses of assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
94. Kinzie, J., Jankowski, N., Baker, G., Klages, M., & Martinez, V. 2012, October. Using assessment results: Promising practices of institutions that do it well (Presentation).
This presentation from the 2012 Assessment Institute provides an overview of NILOA's July 2012 report, "Using Assessment Results: Promising Practices of Institutions that Do It Well." To read the full report, click here: http://www.learningoutcomeassessment.org/UsingAssessmentResults.htm
Link to Full Text | Show Similar Items | Show Associated Keywords
95. Klein, S., Benjamin, R., & Bolus, R. 2007. The collegiate learning assessment: Facts and fantasies.
This white paper presents a model of what the CLA does and does not measure, an overview of the instruments used with examples of tasks, and information on how to interpret the value-added results.
Link to Full Text | Show Similar Items | Show Associated Keywords
96. Klein-Collins, R. 2012. Competency-Based Degree Programs in the U.S.: Postsecondary Credentials for Measurable Student Learning and Performance.
CAEL's latest report, which addresses the use of student assessment in competency-based degree programs.
Link to Full Text | Show Similar Items | Show Associated Keywords
97. Klemic, G. G., & Lovero, E. Jan/Feb 2011. Closing the loop: Assessing SLOs for quantitative and qualitative models in business courses.
Abstract: The article offers information on the assessment of student learning outcomes (SLOs) in the College of Business (COB) at Lewis University in Romeoville, Illinois, as part of COB's assessment plan and the Academic Quality Improvement Program (AQIP) of their regional accreditor. It mentions the development of an assessment device used for the assessment processes. It notes that the assessment project helps the faculty realize they need to adjust their teaching and courses to improve student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
98. Kotter, J.P. 2006. Leading change.
Using an eight-step process and examining the challenges of over 100 companies, Kotter presents readers with ideas for change for organizations trying to overcome their challenges. The eight steps include: 1) establishing a sense of urgency; 2) creating the guiding coalition; 3) developing a vision and strategy; 4) communicating the change vision; 5) empowering employees for broad-based action; 6) generating short-term wins; 7) consolidating gains and producing more change; and 8) anchoring new approaches in the culture.
Link to Full Text | Show Similar Items | Show Associated Keywords
99. Kuh, G. June 2010. NILOA: Tracking the status of outcomes assessment in the U.S..
Presentation at Association for Institutional Research (AIR) Targeted Affinity Group Opening Session on NILOA activities, accreditation as a driver of assessment, and findings from the NILOA 2010 Program-Level Survey.
Link to Full Text | Show Similar Items | Show Associated Keywords
100. Kuh, G. 2013, February 28. Quality assurance implications of high-impact practices and related improvement efforts.
This presentation from the New Mexico Higher Education Assessment and Retention (NMHEAR) conference explores the use of high-impact practices in higher education.
Link to Full Text | Show Similar Items | Show Associated Keywords
101. Kuh, G., & Ikenberry, S. October 2009. More than you think, less than we need: Learning outcomes assessment in American higher education.
The 2009 report from the National Institute for Learning Outcomes Assessment (NILOA) is based on information from more than 1,500 regionally accredited degree-granting institutions in the U.S. The NILOA study, titled “More Than You Think, Less Than We Need: Learning Outcomes Assessment in American Higher Education,” summarizes what colleges and universities are doing to measure student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
102. Lagotte, B. Summer 2012. Review of “good education in an age of measurement”.
In Good Education in an Age of Measurement, Gert J.J. Biesta argues that analysis about what constitutes a “good” education demands more than the evidence-based, “best practice” paradigm currently offers. Furthermore, the narrow perspective of assessing learning outcomes may prove detrimental for education towards a deeply democratic society. Although not exactly the type of insight assessment researchers might welcome, Biesta’s thoughtful critique can ultimately enhance the ways scholars evaluate the quality of education. Biesta reinvigorates discussions about what constitutes a good education, specifically the purpose of education. Concerned about a lack of attention to purposes in the research literature, Biesta puts this issue front and center. His inquiry includes a normative perspective rather than only a managerial focus on education as a technique. That is, he produces a conceptual framework for why we ought to focus on particular educational goals. To this end, Biesta provides a three-prong framework.
Link to Full Text | Show Similar Items | Show Associated Keywords
103. Lester, N., et al. 2003. Writing across the curriculum: A college snapshot.
This article describes a research project intended to yield data about the state of writing across the curriculum at one urban college campus site.
Link to Full Text | Show Similar Items | Show Associated Keywords
104. Lowood, J. 2013. Restructuring the Writing Program at Berkeley City College: Or how we learned to love assessment and use it to improve student learning.
The portfolio-based assessment program at BCC started in 2011. Faculty first examined the learning outcomes of their pre-transfer English and English as a Second Language (ESL) composition/reading classes and determined that portfolios were the best way to assess whether those outcomes were met. As a result, all students had to summarize readings, write an in-class essay based on a prompt, and complete a research paper. About 500 students were assessed per semester. The endeavor grew to the point that the entire English and ESL department participated in scoring the portfolios using a rubric.
Link to Full Text | Show Similar Items | Show Associated Keywords
105. Makela, J. P., & Rooney, G. S. June 2012. Learning outcomes assessment step-by-step: Enhancing evidence-based practice in career services.
This monograph, "Learning Outcomes Assessment Step-by-Step: Enhancing Evidence-Based Practice in Career Services," by Julia Panke Makela and Gail S. Rooney, examines learning outcomes assessment in career services offices. Examples of practical strategies are offered.
Link to Full Text | Show Similar Items | Show Associated Keywords
106. Malcom, L., Bensimon, E. M., & Dávila, B. 2010, Winter. (Re)constructing hispanic-serving institutions: Moving beyond numbers towards student success.
This brief highlights the purpose and need for Hispanic Serving Institutions (HSIs). One of the main questions asked is “What evidence is used to assess performance as a Hispanic-serving institution?” (p. 5). The major goals of the brief include: “1) attending to their mission and identity in order to develop programmatic initiatives that promote Latino/a student success, and 2) focusing on collecting data to assess the extent HSIs are meeting their mission to improve educational outcomes for Latino students” (p. 1).
Link to Full Text | Show Similar Items | Show Associated Keywords
107. McInerney, D. M., Brown, G. T. L., & Liem, G. A. D. 2009. Student perspectives on assessment: What students can tell us about assessment for learning.
Seeking diverse student voices on assessment? This book draws on feedback from American and international students about both formative and summative assessment across all levels of education.
Link to Full Text | Show Similar Items | Show Associated Keywords
108. McNeice-Stallard, B. E., & Stallard, C. M. 2012. Measuring sustainability of outcomes assessment.
This qualitative study evaluates how well faculty at a two-year community college in the United States used assessment of student learning outcomes (SLOs) for pedagogical/curricular change; how well the “use of results” from 1,200 courses demonstrates, from a qualitative perspective, the engagement of faculty in SLOs; and how well discussions with faculty and managers demonstrate the value of SLOs. The results indicated that some faculty planned to make curricular or pedagogical changes because of the assessment results.
Link to Full Text | Show Similar Items | Show Associated Keywords
109. Miller, R. 2007. Assessment in cycles of improvement: Faculty designs for essential learning outcomes.
This publication features a series of reports on how selected colleges and universities foster and assess student learning in twelve liberal education outcome areas, including writing, quantitative literacy, critical thinking, ethics, intercultural knowledge, and information literacy. Moving from goals to experiences, assessments, and improvements driven by assessment data, each institutional story illustrates how complex learning can be shaped over time and across programs to bring students to higher levels of achievement of these important outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
110. Millet, C. M., Payne, D. G., Dwyer, C. A., Stickler, L. M., & Alexiou, J. J. 2008. A culture of evidence: An evidence-centered approach to accountability for student learning outcomes.
This paper presents a framework that institutions of higher education can use to improve, revise and introduce comprehensive systems for the collection and dissemination of information on student learning outcomes. For faculty and institutional leaders grappling with the many issues and nuances inherent in assessing student learning, the framework offers a practical approach that allows them to meet demands for accountability in ways that respect the diverse attributes of students, faculty and the institutions themselves.
Link to Full Text | Show Similar Items | Show Associated Keywords
111. Montenegro, E., & Jankowski, N. A. January 2017. Equity and Assessment: Moving Towards Culturally Responsive Assessment.
As colleges educate a more diverse and global student population, there is increased need to ensure every student succeeds regardless of their differences. This paper explores the relationship between equity and assessment, addressing the question: how consequential can assessment be to learning when assessment approaches may not be inclusive of diverse learners? The paper argues that for assessment to meet the goal of improving student learning and authentically documenting what students know and can do, a culturally responsive approach to assessment is needed. In describing what culturally responsive assessment entails, this paper offers a rationale as to why change is necessary, proposes a way to conceptualize the place of students and culture in assessment, and introduces three ways to help make assessment more culturally responsive.
Link to Full Text | Show Similar Items | Show Associated Keywords
112. Morelon, C. 2006. Building institutional capacity for informed decision making to enhance student learning outcomes.
Although a good deal has been written on accountability, accreditation, assessment, and institutional effectiveness, there is a dearth of examples from Historically Black Colleges and Universities (HBCUs) about how they use these processes for institutional improvement. Given the press for institutions to provide evidence of their impact on student learning, resource-dependent HBCUs are challenged to meet such demands. The purpose of this research was to better understand factors that compelled one institution to become more data-centered in its decision making in order to affect student learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
113. Ndoye, A., & Parker, M. A. 2010. Creating and sustaining a culture of assessment.
Many institutions of higher education develop assessment systems to demonstrate evidence of value added and to meet accreditation requirements. The sustainability of such assessment systems is usually dependent on creating a culture of assessment, which entails establishing shared values and principles and implementing practices designed to meet organizational goals. The article also provides specific examples to help institutions move along the continuum or improve their current practices and concludes with a discussion of policy implications.
Link to Full Text | Show Similar Items | Show Associated Keywords
114. New Leadership Alliance for Student Learning and Accountability. 2012. Committing to quality: Guidelines for assessment and accountability in higher education .
This publication guides colleges and universities in improving the quality of a college degree. It asks colleges to take responsibility for assessing and improving student learning — to set clear goals for student achievement, regularly gather and use evidence that measures performance against those goals, report evidence of student learning, and continuously work to improve results.
Link to Full Text | Show Similar Items | Show Associated Keywords
115. Norris, D., Baer, L., Leonard, J., Pugliese, L., & Lefrere, P. 2008. Action analytics: Measuring and improving performance that matters in higher education.
The action analytics of the future will better assess students' competencies. Using individualized planning, advising, and best practices from cradle to career, these action analytics solutions will align interventions to facilitate retention and transitions and maximize learners' success. Six primary actions are needed to evolve from the current generation of academic analytics (tools, solutions, and services) to action analytics.
Link to Full Text | Show Similar Items | Show Associated Keywords
116. Norris, D., Leonard, J., Pugliese, L., Baer, L., & Lefrere, P. 2008. Framing action analytics and putting them to work.
This article is a companion piece to the article “Action Analytics: Measuring and Improving Performance That Matters in Higher Education,” which describes the emergence of a new generation of tools, solutions, and behaviors that are giving rise to more powerful and effective utilities through which colleges and universities can measure performance and provoke pervasive actions to improve it.
Link to Full Text | Show Similar Items | Show Associated Keywords
117. Nunley, C., Bers, T, & Manning, T. July 2011. Learning outcomes assessment in community colleges.
As community colleges become increasingly important in educating students across the country, more emphasis is being placed on them to provide the public with information on the learning outcomes of their students. In this tenth NILOA Occasional Paper, Charlene Nunley, Trudy Bers, and Terri Manning describe the complex environment of community colleges as it relates to student learning outcomes assessment. Results from previous surveys of community college institutional researchers and chief academic officers are analyzed, along with short vignettes of good practices at various community colleges. Drawing on prior experience working with institutions or within their own institutions, the authors offer suggestions for making student learning outcomes assessment more effective and transparent.
Link to Full Text | Show Similar Items | Show Associated Keywords
118. Nyamekye, A. September 2011. Putting myself to the test.
In a routine evaluation, my principal praised my organization, management, and facilitation, but posed the following question: “How do you know the kids are really getting it?” She urged me to develop more-rigorous assessments of student learning. Ego and uncertainty inspired me to measure the impact of my instruction. I thought I was effective, but I wanted proof.
Link to Full Text | Show Similar Items | Show Associated Keywords
119. O'Neill, N. 2012. Promising Practices for Personal and Social Responsibility: Findings from a National Research Collaborative .
Drawing on meetings of a distinguished group of educational researchers, Promising Practices highlights select national/multi-institutional data and major themes along five dimensions of personal and social responsibility. Importantly, the report also offers a set of evidence-based recommendations for improving campus practice in relation to educating students for personal and social responsibility.
Link to Full Text | Show Similar Items | Show Associated Keywords
120. Ortlieb, E., & Cheek, E. H. 2012, March. Using informative assessments towards effective literacy instruction.
Examples of effective literacy assessment practices for different student populations are offered in this book.
Link to Full Text | Show Similar Items | Show Associated Keywords
121. Palomba, C. A., & Banta, T. W. 1999. Assessment essentials: Planning, implementing, and improving assessment in higher education.
"This book examines current assessment practices in higher education and offers suggestions on planning assessment programs, carrying them out, and using the results to improve academic programs. Examples from all types of institutions (community colleges, liberal arts colleges, and comprehensive, doctoral and research institutions) are used to illustrate various assessment activities."
Link to Full Text | Show Similar Items | Show Associated Keywords
122. Peterson, M. W., & Einarson, M. K. 2001. What are colleges doing about student assessment? Does it make a difference?.
"The purpose of our study was to extend current understanding of how postsecondary institutions have approached, supported, and promoted undergraduate student assessment, and the institutional uses and impacts that have been realized from these assessment efforts" (p.630).
Link to Full Text | Show Similar Items | Show Associated Keywords
123. Pike, G. R. Jan/Feb2012. Assessment measures.
The author argues that defining the uses to be made of the assessment data is the most important step in evaluating an assessment instrument.
Link to Full Text | Show Similar Items | Show Associated Keywords
124. Pinker, S. 1997. How the Mind Works.
"This book is intended for anyone who is curious about how the mind works." Those interested in how the mind processes and uses information to make decisions will find this book useful. Computation, evolution, faculties of the mind (i.e., perception, reasoning, emotion, social relations, etc.) are the major foci of this book.
Link to Full Text | Show Similar Items | Show Associated Keywords
125. Provezis, S. July 2011. Augustana College: An assessment review committee's role in engaging faculty.
Over the last six years, Augustana has been active in assessing student learning and has become a leader in gaining faculty involvement. This involvement is due in part to the institutional type, which focuses on teaching and learning; the dynamic role of the Assessment Review Committee; and the college's communication strategies. These factors have allowed Augustana to make several improvements on campus based on its assessment activities.
Link to Full Text | Show Similar Items | Show Associated Keywords
126. Provezis, S. June 2012. LaGuardia Community College: Weaving assessment into the institutional fabric.
A federally designated Hispanic Serving Institution, LaGuardia Community College serves an overwhelmingly minority and first-generation college student population “from diverse cultures, ages, and educational and economic backgrounds.” Its students come from 160 different countries and speak more than 120 different primary languages. LaGuardia’s commitment to educational excellence has been acknowledged by Excelencia in Education, the Bellwether Award for Exemplary Instructional Programs, and the Community College Excellence Award from the MetLife Foundation. Because of its reputation as a leader in learning outcomes assessment, particularly through the use of electronic portfolios (ePortfolios), LaGuardia was selected by the National Institute for Learning Outcomes Assessment (NILOA) as an Example of Best Practice. This report features LaGuardia’s commitment to assessment, the collaboration across units at the college, the ePortfolio as the foundation of the assessment efforts, and the institution’s robust p
Link to Full Text | Show Similar Items | Show Associated Keywords
127. Puncochar, J., & Klett, M. (2013). A model for outcomes assessment of undergraduate science knowledge and inquiry processes.
To measure the efficacy of a Liberal Studies education, a Midwestern regional university developed a systematic, rubric-guided assessment based on nationally recognized science principles and inquiry processes to evaluate student work in undergraduate science laboratory courses relative to a liberal education. The rubric presented a direct measure of student understandings of science inquiry processes. The assessment procedure used stratified random sampling at confidence levels of 95% to select student work, maintained anonymity of students and faculty, addressed concerns of university faculty, and completed a continuous improvement feedback loop by informing faculty of assessment results to assess and refine science-inquiry processes of course content. The procedure resulted in an assessment system for benchmarking science inquiry processes evident in student work and offered insights into the effect of undergraduate science laboratory courses on student knowledge and understanding.
Link to Full Text | Show Similar Items | Show Associated Keywords
128. Ranellucci, J., Muis, K. R., Duffy, M., Wang, X., Sampasivam, L. and Franco, G. M. 2012. To master or perform? Exploring relations between achievement goals and conceptual change learning.
Looking at the relationship between the types of goals set by students and the learning that occurs in light of these goals, this article explains that students who set strong mastery-oriented goals are more likely to achieve deep learning and conceptual change. Students who pursue performance-related goals (approach or avoidance) are less likely to engage in deep learning or conceptual change.
Link to Full Text | Show Similar Items | Show Associated Keywords
129. Ravishanker, G. 2011. Doing academic analytics right: Intelligent answers to simple questions (Research Bulletin 2).
This ECAR research bulletin explores the various factors that must come together for an institution to have an academic analytics infrastructure that is flexible, agile, appropriately structured, and cost-effective. It examines not only appropriate technologies but, more importantly, the critical roles that stakeholders and governance play in setting the stage for success. There is a tremendous amount at stake for American higher education right now. How can IT facilitate—and motivate—our institutions to adopt the tools that can help mine the gold that resides in our very own vaults?
Link to Full Text | Show Similar Items | Show Associated Keywords
130. Rodgers, M. Jan/Feb2011. A call for student involvement in the push for assessment.
Abstract: The author discusses the importance of assessment practices for students in universities. She describes her attempt to help faculty, administrators, and the president of the Student Government Association (SGA) at her own institution recognize the importance of assessment for quality improvement and accountability, although her proposal to implement assessment was not adopted. She remains hopeful that awareness of assessment will not fade and encourages students to take action.
Link to Full Text | Show Similar Items | Show Associated Keywords
131. Rogers, E.M. 2003. Diffusion of innovations.
This book examines the ways in which new ideas are shared and transferred through organizations.
Link to Full Text | Show Similar Items | Show Associated Keywords
132. Salisbury, M. Delicious ambiguity.
This blog is the second generation of Delicious Ambiguity, a weekly column that started in the fall of 2011 in the faculty newsletter at Augustana College. The idea was to help the college and everyone who works with students think about student data as a means to improve student learning and the educational experience rather than as an end in itself. Although the first year of columns was published as separate pages in an online newsletter, all of them have been uploaded to this blog site.
Link to Full Text | Show Similar Items | Show Associated Keywords
133. Schuh, J. H. & Gansemer-Topf, A. M. December 2010. The role of student affairs in student learning assessment.
Student affairs professionals are expected to be knowledgeable about the student experience. Thus, it follows that they can and should play an important role in assessing student learning. We hope this paper will persuade faculty and institutional leaders that student affairs staff with the requisite expertise should be involved in collecting, interpreting, and using evidence of student learning for both accountability and improvement.
Link to Full Text | Show Similar Items | Show Associated Keywords
134. Schuh, J. H., Upcraft, M. L., & Associates. 1996. Assessment in student affairs: A guide for practitioners.
This is a "single-volume, practical resource on using assessment to develop and improve all facets of student affairs. It includes detailed guidance for student affairs staff on how to assess student needs, student satisfaction, campus environments, campus cultures, and student outcomes. And it explains how senior staff can employ assessment findings in strategic planning, policy development, and day-to-day decision making."
Link to Full Text | Show Similar Items | Show Associated Keywords
135. Schuh, J. H., Upcraft, M. L., & Associates. 2001. Assessment practice in student affairs: An applications manual.
A companion to the 1996 release, "this manual continues the work begun in their earlier book and provides a full range of tools for conducting effective assessments. The authors begin with an overview of the assessment process and then detail a range of methodologies, approaches, and issues--explaining how to use them and when to recruit expertise from other campus sources."
Link to Full Text | Show Similar Items | Show Associated Keywords
136. Senge, P. 1999. The dance of change: The challenges to sustaining momentum in learning organizations.
This book offers solutions for professionals seeking to implement change initiatives.
Link to Full Text | Show Similar Items | Show Associated Keywords
137. Senge, P.M. 1990. The fifth discipline: The art and practice of the learning organization.
The five disciplines of learning organizations discussed in this book include: personal mastery, mental models, building shared vision, team learning, and systems thinking. The fifth discipline, systems thinking, is the focus of the book and is used to help understand learning organizations.
Link to Full Text | Show Similar Items | Show Associated Keywords
138. Seybert, J. A. 2002. Assessing student learning outcomes.
"This chapter addresses assessment of student learning in general education, transfer programs, career and occupational programs, remedial and developmental courses and programs, and noncredit and continuing education offerings, as well as assessment of affective and noncognitive outcomes and the use of assessment results."
Link to Full Text | Show Similar Items | Show Associated Keywords
139. Silva, M. L., Delaney, S. A., Cochran, J., Jackson, R., & Olivares, C. 2015. Institutional assessment and the integrative core curriculum: Involving students in the development of an ePortfolio system.
Because students are often not involved or included in the assessment decision-making process, the authors piloted a project that brought students in as co-authors and research assistants to improve their ePortfolio design. The authors argue that students can and should be included in assessment decision-making processes.
Link to Full Text | Show Similar Items | Show Associated Keywords
140. Snider-Lotz, T. G. 2002. Designing an evidence-centered assessment program.
Additional information on evidence-centered assessment programs and design may be found here: http://ecd.sri.com/
Link to Full Text | Show Similar Items | Show Associated Keywords
141. Southern Education Foundation. Advancing Excellence, Enhancing Equity: Making the Case for Assessment at Minority-Serving Institutions.
This publication from the Southern Education Foundation offers practical steps for initiating assessment programs at minority-serving institutions.
Link to Full Text | Show Similar Items | Show Associated Keywords
142. Southern Education Foundation. 2010. Still striving: Trustees and presidents of historically black colleges and universities’ unprecedented dialogue about governance and accreditation.
As a result of a SACS meeting in which HBCU presidents, chancellors, trustees and education scholars were invited to speak on governance and accreditation, this paper "captures exchanges of ideas and information about matters such as Board/executive relations, financial management, policymaking and oversight strategies and is “must” reading for anyone who wants to learn about best practices in higher education governance and how accrediting agencies function" (SEF, 2011).
Link to Full Text | Show Similar Items | Show Associated Keywords
143. Stassen, M. L. A., Doherty, K., & Poe, M. 2001. Course-based review and assessment: Methods for understanding student learning.
This handbook provides an overview of assessment and its use in the classroom, helps define goals and objectives for a course, offers techniques for conducting assessment, and suggests ways to understand and use the results.
Link to Full Text | Show Similar Items | Show Associated Keywords
144. Stokes, P. August 2011. From uniformity to personalization: How to get the most out of assessment.
The potential for assessment to inform the improvement of curriculum, teaching, student performance, and institutional effectiveness has never been greater. So why aren’t our students performing better?
Link to Full Text | Show Similar Items | Show Associated Keywords
145. The Civic Learning and Democratic Engagement National Task Force. 2012. A crucible moment: College learning and democracy's future.
The importance of civic learning in colleges and universities is the focus of this report. It includes a civic institutional matrix for use by colleges and universities.
Link to Full Text | Show Similar Items | Show Associated Keywords
146. Volkwein, J. F. September 2011. Gaining ground: The role of institutional research in assessing student outcomes and demonstrating institutional effectiveness.
The work of institutional researchers is gaining importance on today's campuses. Among institutional researchers' wide range of duties is a significant role in student outcomes assessment. In this eleventh NILOA Occasional Paper, J. Fredericks Volkwein leads us through these roles. The paper includes an analysis of data from the Center for the Study of Higher Education at Penn State's “National Survey of Institutional Research Offices in 2008-09,” gathered from over 3,300 professional staff. Overall, this occasional paper helps readers better understand the roles, responsibilities, and challenges institutional researchers face in relation to student outcomes assessment on their campuses.
Link to Full Text | Show Similar Items | Show Associated Keywords
147. Voorhees, A. B. 2001. Creating and implementing competency-based learning models.
This article defines three models of competency-based learning: general education models, trait-based scales, and industry-based models. Following these descriptions, suggestions are provided for putting competency-based learning models into effect across different types of institutions.
Link to Full Text | Show Similar Items | Show Associated Keywords
148. Vuong, B., & Hairston, C. C. 2012, October. Using data to improve Minority-Serving Institution success.
This brief highlights how MSIs from the Lumina MSI-Models of Success project have used data to implement policy and programmatic changes on their campuses in support of student and institutional success.
Link to Full Text | Show Similar Items | Show Associated Keywords
149. Walter, C. K. 2012. Student outcomes assessment of a logistics and supply chain management major.
Assessment of specialized programs, such as the logistics and supply chain management program described here, may pose challenges because previous experience is less widely shared than in more mainstream subjects. This case study provides one model that may guide other faculties facing a similar assignment. The report details the steps followed to assess an undergraduate program in Logistics and Supply Chain Management. The starting point was a two-stage course mapping, which identified strengths and weaknesses of course coverage compared to college goals and to a set of topic areas recommended by faculty members. Next in the process was a survey of graduating students who responded to questions about basic concepts in their recently completed courses. The assessment was considered useful in providing the feedback necessary for faculty to "close the loop" in course design and teaching. In addition, this case study shows how a process from a small liberal arts college may be adapted to a narrowly focused business program in a larger and more diverse university setting.
Link to Full Text | Show Similar Items | Show Associated Keywords
150. Weiner, L. and Bresciani, M. Winter 2011. Can institutions have quality programming without utilizing a systemic outcomes-based assessment process?.
For many students, service learning not only expands their educational horizons but also makes them more aware of and compassionate toward those who live in communities very different from their own. Yet, despite the expressed benefits of service learning and the increasing number of institutions offering service-learning programs, it is not known whether such programs are truly successful unless evidence of their success is provided. The use of outcomes-based assessment is one process that generates evidence of program effectiveness. Few studies have examined whether all the components of effective outcomes-based assessment must be present in order for quality programs to be identified. Thus, the purpose of this cross-case comparative study was to identify the necessary components.
Link to Full Text | Show Similar Items | Show Associated Keywords
151. Weiss, Gregory L.; Cosbey, Janet R.; Habel, Shelly K.; Hanson, Chad M.; Larsen, Carolee. Jan 2002. Improving the Assessment of Student Learning: Advancing a Research Agenda in Sociology.
This paper summarizes current research on key components of assessment plans, discusses the history of assessment, and proposes research questions for sociologists relating to context, content, process, and effects of assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
152. Yousey-Elsener, K., Bentrim, E., & Henning, G.W. 2015. Coordinating Student Affairs Divisional Assessment: A Practical Guide.
This book is a practical guide for practitioners to lead and implement assessment efforts in student affairs.
Link to Full Text | Show Similar Items | Show Associated Keywords

Search Again