Publication Search Results

Search returned 126 results using Keyword: "NILOA"



1. Adelman, C. 2015, February. To Imagine a Verb: The Language and Syntax of Learning Outcomes Statements.
This Occasional Paper, focused on syntax and semantics, provides language-centered principles, guidelines, and tools for writing student learning outcome statements. Placing the verb at the center of every outcome statement, it distinguishes between active and operational verbs, favoring the latter on the grounds that they lead more naturally and logically to assignments that allow genuine judgment of student performance. As more constructive cores of student learning outcomes, it offers 20 sets of operational verbs corresponding to cognitive activities in which students engage and that faculty seek to elicit.
Link to Full Text | Show Similar Items | Show Associated Keywords
2. Ariovich, L., & Richman, W.A. 2013, October. All-in-one: Combining grading, course, program, and general education outcomes assessment.
In NILOA's nineteenth occasional paper, authors Laura Ariovich and W. Allen Richman discuss Prince George's Community College's All-in-One system, designed to integrate course, program, and general education assessment in order to connect outcomes assessment with grading.
Link to Full Text | Show Similar Items | Show Associated Keywords
3. Baker, G. R. February 2012. North Carolina A&T State University: A culture of inquiry.
North Carolina A&T was selected for inclusion as a case study for NILOA due to its commitment to improving its campus by developing a "culture of inquiry"—specifically as this relates to student learning outcomes assessment activities. Three elements have been instrumental in A&T's drive to become a more data-driven institution: 1) administrative leadership that encourages discussions and collaboration around student learning outcomes assessment activities on campus; 2) the use of professional development opportunities to help foster the involvement and commitment of faculty members; and 3) the systematic and intentional use of student feedback.
Link to Full Text | Show Similar Items | Show Associated Keywords
4. Baker, G. R. April 2012. Texas A&M International University: A culture of assessment INTEGRATEd.
Texas A&M International University was selected as a NILOA case study institution due to 1) its commitment to choosing assessments and tools appropriate for its students, 2) its long history with and innovative approach to assessment, and 3) the influential role of professional development at the institution to help prepare “Assessment Champions” and expand the number of “pockets of excellence” in terms of assessment practices throughout the campus.
Link to Full Text | Show Similar Items | Show Associated Keywords
5. Baker, G. R., Jankowski, N., Provezis, S. & Kinzie, J. 2012, July. Using assessment results: Promising practices of institutions that do it well.
To learn more about what colleges and universities are doing to use assessment data productively to inform and strengthen undergraduate education, NILOA conducted nine case studies. This report synthesizes the insights from these individual studies to discern promising practices in using information about student learning. The report concludes with lessons learned and reflective questions to help institutions advance their own assessment efforts within their specific institutional contexts.
Link to Full Text | Show Similar Items | Show Associated Keywords
6. Banta, T. W., Ewell, P. T., & Cogswell, C. A. October 2016. Tracing assessment practice as reflected in Assessment Update.
At some future point, when a definitive history of the assessment movement is written, one of the most frequently cited, influential publications will be Assessment Update (AU). Since 1989, this bimonthly newsletter has been published by Jossey-Bass in partnership with Indiana University-Purdue University Indianapolis (IUPUI). It is no coincidence that the two most frequent contributors to AU, Trudy Banta—AU’s founding editor and intellectual muse—and Peter Ewell, are also among the most prolific thinkers and writers shaping the scholarship and practice of student learning outcomes assessment. In this featured NILOA occasional paper, Banta and Ewell, with the assistance of Cynthia Cogswell, mine the pages of AU from 2000 through 2015 to distill the major themes and advances that characterize the evolution of assessment as a field of professional practice.
Link to Full Text | Show Similar Items | Show Associated Keywords
7. Banta, T.W., Griffin, M., Flateby, T.L., & Kahn, S. December 2009. Three promising alternatives for assessing college students' knowledge and skills.
In this paper, assessment experts Trudy Banta, Merilee Griffin, Theresa Flateby, and Susan Kahn describe the development of several promising authentic assessment approaches. The contributors draw on their rich assessment experience to illustrate how portfolios, common analytic rubrics, and online assessment communities can more effectively link assessment practices to pedagogy. In addition to discussing the strengths and limitations of each approach, the paper offers concrete examples of how these authentic approaches are being used to guide institutional improvement, respond to accountability questions, and involve more faculty, staff, and students in meaningful appraisals of learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
8. Bassis, M. March 2011. In search of a standard of quality.
Since the job of colleges and universities is to develop the talents of their students, quality should be a function not of how much talent the school had attracted but of how much talent it had developed. Since then, the issue of talent development - of how to promote more and better learning during the college years - has been at the heart of my work as a teacher, scholar, and administrator.
Link to Full Text | Show Similar Items | Show Associated Keywords
9. Benjamin, R. February 2011. Avoiding a tragedy of the commons in postsecondary education.
At this moment in history, human capital -- the stock of knowledge and skills citizens possess -- is our country’s principal resource. To develop human capital requires a high performing educational system, as education is the primary venue for preserving and enhancing human capital. But a storm is brewing in plain sight. Here’s a brief, incomplete, but ominous sketch of the problem and what it means for assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
10. Blaich, C. F. & Wise, K. S. January 2011. From gathering to using assessment results: Lessons from the Wabash national study.
Drawing from the Wabash Study, a multi-institutional longitudinal research and assessment project, Charlie Blaich and Kathy Wise, from the Center of Inquiry at Wabash College, share their field-tested findings and lessons learned about campus use of assessment results. The Wabash Study assists institutions in collecting, understanding and using data. The researchers at the Center of Inquiry found the last component to be the real challenge—using the data for improved student learning. In this Occasional Paper, Blaich and Wise describe the accountability movement, the history and purpose of the Wabash Study, and the reasons why institutions have a hard time moving from gathering data to using data, giving five practical steps to campus leaders for using the data collected.
Link to Full Text | Show Similar Items | Show Associated Keywords
11. Blaich, C., Keller, C., Phillippe, K., Kuh, G., Provezis, S. January 2011. Can you see me now? Taking the pulse of transparency efforts.
Presentation at the Association of American Colleges and Universities (AAC&U) Annual Meeting on NILOA web scan studies, the Voluntary System of Accountability (VSA), the Voluntary Framework of Accountability (VFA), and lessons learned from the Wabash Study.
Link to Full Text | Show Similar Items | Show Associated Keywords
12. Blasi, L. December 2011. How assessment and institutional research staff can help faculty with student learning outcomes assessment.
Institutional researchers can provide support for faculty members as they seek to improve the attainment of student learning outcomes through assessment. Sometimes a few dedicated faculty members drive the process, but increased faculty support is needed to cultivate a culture of assessment on campus.
Link to Full Text | Show Similar Items | Show Associated Keywords
13. Bresciani, M. J. August 2011. Making assessment meaningful: What new student affairs professionals and those new to assessment need to know.
As demands for assessment become more widespread throughout higher education institutions, knowledge about assessment is even more critical for new student affairs professionals. Marilee J. Bresciani provides a quick overview of how new student affairs professionals can contribute both effectively and meaningfully to assessment practices at their institution.
Link to Full Text | Show Similar Items | Show Associated Keywords
14. Cain, T., & Hutchings, P. 2013, October. Faculty buy-in and engagement: Reframing the conversation around faculty roles in assessment.
This presentation from the 2013 Assessment Institute discusses faculty's engagement with assessment, including common challenges, sources of discontent, and solutions for overcoming these difficulties.
Link to Full Text | Show Similar Items | Show Associated Keywords
15. Cain, T., & Jankowski, N. 2013, November. Mapping the landscape of learning outcomes assessment: An update from the field.
This presentation from the Association for the Study of Higher Education (ASHE) 2013 Annual Conference reviews the results of NILOA's 2014 survey of provosts.
Link to Full Text | Show Similar Items | Show Associated Keywords
16. Connor, R. June 2011. Navigating a perfect storm.
There’s good reason to think that higher education is about to confront a perfect storm, a convergence of troubles that are more than the usual bluster. The economy is not just slow to recover; it may be ‘hollowing out’ in ways that undermine the old claim that going to college guarantees a good job upon graduation. Confidence in higher education may also be waning, if not among the general public then among policy makers troubled by stagnant graduation rates and slippage in the rank order of percentage of adults with baccalaureate degrees compared to some other highly developed countries.
Link to Full Text | Show Similar Items | Show Associated Keywords
17. Cooper, T. & Terrell, T. 2013, August. What are institutions spending on assessment? Is it worth the cost?
In NILOA's eighteenth occasional paper, authors Tammi Cooper and Trent Terrell examine how much institutions are spending when it comes to assessment. Their paper presents findings from a survey of assessment professionals at institutions regarding the cost of assessment and the perceived benefit that institutions receive.
Link to Full Text | Show Similar Items | Show Associated Keywords
18. DeWitt, P. March 2012. What is satisfactory performance? Measuring students and measuring programs with rubrics.
Some assessment experts strongly recommend that a desired level of achievement be stated when measuring student performance on stated student learning outcomes. According to Nichols, the criteria should be stated in quantitative terms, as this example illustrates: “Eighty percent of those taking the CPA exam each year…will pass three of four parts of the exam” (Nichols, 1989, p. 178). In the era of rubrics, this can easily be translated to “Eighty percent of students…will score at least ‘satisfactory’ on three of the four rubric rows.”
Link to Full Text | Show Similar Items | Show Associated Keywords
19. Eubanks, D., & Gliem, D. 2015, May. Improving Teaching, Learning, and Assessment by Making Evidence of Achievement Transparent.
Technology can change higher education by empowering students to make an impact on the world as undergraduates. Done systematically, this would allow institutions to close the credibility gap with an increasingly dubious public. Authentic student achievements that are addressed to a real world audience can lead to richly detailed Resume 2.0 portfolios of work that add value to degrees and the granting institutions. A guide is provided for implementation of new high-impact practices, including structured assignment creation.
Link to Full Text | Show Similar Items | Show Associated Keywords
20. Ewell, P. T. 2009, November. Assessment, accountability, and improvement: Revisiting the tension.
Assessments of what students learn during college are typically used for either improvement or accountability, and occasionally both. For reasons carefully outlined by Peter Ewell in this NILOA Occasional Paper, since the early days of the “assessment movement” in the US, these two purposes of outcomes assessment have not rested comfortably together. No one is more qualified than Ewell to summarize what has changed and what has not over the past two decades in terms of student learning outcomes assessment and the shifting expectations and demands of policy makers, accreditors, higher education leaders, and government officials about student and institutional performance. After delineating how various kinds of information can and should be used for improvement and accountability, he points to ways that institutions can productively manage the persistent tensions associated with improvement and accountability as faculty and staff members do the important work of documenting, reporting, and using what students learn.
Link to Full Text | Show Similar Items | Show Associated Keywords
21. Ewell, P., Ikenberry, S. O., Kuh, G. D. April 2010. Using student learning outcomes for accountability and improvement: The NILOA agenda.
Presentation at the North Central Association of Colleges and Schools (NCA) Higher Learning Commission on NILOA's activities, the 2010 Program-Level Questionnaire and Survey, and forthcoming NILOA products on measurement of quality, the cost of assessment, accreditation, and transparency.
Link to Full Text | Show Similar Items | Show Associated Keywords
22. Ewell, P., Jankowski, N., & Provezis, S. September 2010. Connecting state policies on assessment with institutional assessment activity.
The coincidence of two national surveys—one at the state level and one at the institutional level—enabled researchers at the National Institute for Learning Outcomes Assessment (NILOA) to explore the relationships between state policies on student learning outcomes assessment and institutional approaches to assessing student learning and related phenomena. This report shows the findings of that study.
Link to Full Text | Show Similar Items | Show Associated Keywords
23. Ewell, P., Kinzie, J., Keith, J., & Love, M. B. January 2011. Down and in: A national perspective on program-level assessment.
Presentation at the Association of American Colleges and Universities (AAC&U) reviewing NILOA survey results and qualitative information on program assessment, with examples from two exemplary campuses.
Link to Full Text | Show Similar Items | Show Associated Keywords
24. Ewell, P., Mandell, C., Martin, E., & Hutchings, P. 2013, October. Mapping the curriculum: Learning outcomes and related assignments.
This presentation from the 2013 Assessment Institute discusses the implications of using the DQP for assessing learning outcomes, curriculum mapping, the use of rubrics, and designing an assignment library.
Link to Full Text | Show Similar Items | Show Associated Keywords
25. Ewell, P., Paulson, K., & Kinzie, J. June 2011. Down and in: Assessment practices at the program level.
To follow up the 2009 National Institute for Learning Outcomes Assessment (NILOA) report on institutional assessment activity described by chief academic officers, NILOA surveyed program heads in the two- and four-year sectors to gain a more complete picture of assessment activity at the program or department level.
Link to Full Text | Show Similar Items | Show Associated Keywords
26. Fulcher, K. H., Good, M. R., Coleman, C. M., & Smith, K. L. 2014, December. A Simple Model for Learning Improvement: Weigh Pig, Feed Pig, Weigh Pig.
Assessing learning does not by itself result in increased student accomplishment, much like a pig never fattened up because it was weighed. Indeed, recent research shows that while institutions are more regularly engaging in assessment, they have little to show in the way of stronger student performance. This paper clarifies how assessment results are related to improved learning – assess, effectively intervene, re-assess – and contrasts this process with mere changes in assessment methodology and changes to pedagogy and curriculum. It also explores why demonstrating improvement has proven difficult for higher education. The authors propose a solution whereby faculty, upper administration, pedagogy/curriculum experts, and assessment specialists collaborate to enhance student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
27. Gilchrist, D., & Oakleaf, M. April 2012. An essential partner: The librarian’s role in student learning assessment.
Debra Gilchrist and Megan Oakleaf, two leaders in librarianship and assessment, document the ways librarians contribute toward campus efforts of student learning assessment. The paper includes a variety of examples of institutions that have developed student learning assessment processes.
Link to Full Text | Show Similar Items | Show Associated Keywords
28. Gold, L., Rhoades, G., Smith, M. & Kuh, G. May 2011. What faculty unions say about student learning outcomes assessment.
This paper summarizes the views on student learning outcomes assessment held by the leadership of three major national faculty unions—the American Association of University Professors (AAUP), the American Federation of Teachers (AFT), and the National Education Association (NEA). Framed as a conversation, a spokesperson from each group talks about how organized faculties can contribute their ideas and fashion their practices to enhance student learning and educational attainment.
Link to Full Text | Show Similar Items | Show Associated Keywords
29. Hinds, T., & Jankowski, N. 2012, October 30. Voluntary System of Accountability and learning outcomes: An update.
This presentation from the 2012 Assessment Institute provides background information on NILOA, the Voluntary System of Accountability (VSA), and outlines the VSA's College Portrait.
Link to Full Text | Show Similar Items | Show Associated Keywords
30. Hutchings, P. 2014, July. DQP Case Study: Kansas City Kansas Community College.
KCKCC created an alternative system for documenting student achievement of Degree Qualifications Profile (DQP) proficiencies using an interactive curriculum mapping database that allows faculty to enter information about individual student performance on each learning outcome and competency in their courses.
Link to Full Text | Show Similar Items | Show Associated Keywords
31. Hutchings, P. 2014, January. DQP Case Study: Point Loma Nazarene University, San Diego, California.
Point Loma Nazarene University's engagement with the Degree Qualifications Profile began early and has been sustained over a number of years. PLNU's work with the DQP is now prompting conversations about how to more effectively assess learning in ways that are comparable across programs and how to continue to improve the experience of Point Loma students.
Link to Full Text | Show Similar Items | Show Associated Keywords
32. Hutchings, P. April 2010. Opening doors to faculty involvement in assessment.
Much of what has been done in the name of assessment has failed to induce large numbers of faculty to systematically collect and use evidence of student learning to improve teaching and enhance student performance. Pat Hutchings, a senior associate at The Carnegie Foundation for the Advancement of Teaching, examines the dynamics behind this reality, including the mixed origins of assessment, coming both from within and outside academe, and the more formidable obstacles that stem from the culture and organization of higher education itself. Then, she describes six ways to bring the purposes of assessment and the regular work of faculty closer together, which may make faculty involvement more likely and assessment more useful.
Link to Full Text | Show Similar Items | Show Associated Keywords
33. Hutchings, P. April 2011. What new faculty need to know about assessment.
As a new faculty member, you will have questions about your students’ learning—as all thoughtful teachers do: Are they really learning what I’m teaching? How well do they understand the key concepts I’m focusing on? Can they apply what they’re learning in new contexts? What can I do better or differently to help students develop the skills and knowledge they need to be effective in this class, in subsequent courses, and in their future life and work? This assessment brief offers new faculty an introduction to the assessment of student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
34. Hutchings, P. July 2016. Assessment and integrative learning.
At this LiveText Conference, Pat Hutchings discusses engaging faculty to make assessment matter to students.
Link to Full Text | Show Similar Items | Show Associated Keywords
35. Hutchings, P., Ewell, P., & Humphreys, D. 2014, March 31. Where policies and practice meet: Assessment and the way we work.
This presentation connects the NILOA provost survey results to assignment design.
Link to Full Text | Show Similar Items | Show Associated Keywords
36. Hutchings, P., Ewell, P., Banta, T. 2012. AAHE principles of good practice: Aging nicely.
Twenty years ago, in 1992, the American Association for Higher Education’s Assessment Forum released its “Principles of Good Practice for Assessing Student Learning,” a document developed by twelve prominent scholar-practitioners of the movement. The principles have been widely used, studied, and written about (see for instance Banta, Lund, Black & Oblander, 1995), and adapted in other documents and statements. Their inclusion on the NILOA website is a welcome addition, for, like good wine, the AAHE Principles have aged quite nicely.
Link to Full Text | Show Similar Items | Show Associated Keywords
37. Hutchings, P., Jordan-Fleming, M. K., & Green, K. October 2016. Using intentionally designed assignments to foster and assess student learning.
This session will explore the benefits – and some of the challenges – of bringing educators together to collaborate on the design and use of the projects, papers, exams, and presentations they require of their students.
Link to Full Text | Show Similar Items | Show Associated Keywords
38. Ikenberry, S., Kuh, G., Provezis, S., Jankowski, N., Jea, G., Goldfarb, J., Makela, J. December 2009. Mapping the landscape of learning outcomes assessment.
Presentation at the Higher Education Collaborative (HEC) at the University of Illinois at Urbana-Champaign on accreditation study questions and methods, schools with common learning outcomes, assessment types, and evaluation of survey and web scan reports.
Link to Full Text | Show Similar Items | Show Associated Keywords
39. Jankowski, N. 2013, May. Assessment for learning research methods: A multi-faceted terrain.
This presentation from The Higher Education Academy (HEA) Social Science Conference offers an overview of NILOA, research method outcomes, and assessment practices.
Link to Full Text | Show Similar Items | Show Associated Keywords
40. Jankowski, N. 2014, January. Assessment of student learning: An overview of the landscape.
This presentation was given at North Central Michigan College in Petoskey, MI, and discusses the history and mission of NILOA and current trends in higher education assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
41. Jankowski, N. 2014, January. Assignment design workshop.
This presentation was given at North Central Michigan College in Petoskey, MI, and discusses the factors involved in creating effective assignments.
Link to Full Text | Show Similar Items | Show Associated Keywords
42. Jankowski, N. August 2011. Capella University: An outcomes-based institution.
Capella University was selected for a case study due to its systematic, embedded student learning outcomes assessment process; its administrative support and vision of what assessment can do for individual learners; its transparency efforts such as Capella Results, which publicizes assessment results, and its help in developing Transparency By Design; and its use of assessment results to enhance learner success levels.
Link to Full Text | Show Similar Items | Show Associated Keywords
43. Jankowski, N. 2013, October. How institutions use evidence of assessment: Does it really improve student learning?
This presentation from the University of Illinois College of Education Higher Education Collaborative Series examines the ways in which institutions are using assessment data.
Link to Full Text | Show Similar Items | Show Associated Keywords
44. Jankowski, N. July 2011. Juniata College: Faculty led assessment.
Juniata College was identified as an example of good assessment practice for the faculty-led Center for the Scholarship of Teaching and Learning (SoTL Center) that champions and supports evidence-based teaching; an administration-supported accountability website that provides data and information about outcomes to multiple audiences; and the use of evidence of student learning to make improvements at the institution and individual course levels.
Link to Full Text | Show Similar Items | Show Associated Keywords
45. Jankowski, N. 2013, June. Providing evidence of student learning: NILOA's Transparency Framework.
This presentation from the 2013 Florida State Assessment Meeting examines the history and uses of NILOA's Transparency Framework.
Link to Full Text | Show Similar Items | Show Associated Keywords
46. Jankowski, N. 2013, June. Showing an impact: Using assessment results to improve student learning.
This presentation from the 2013 Florida State Assessment Meeting describes how institutions are currently using assessments to improve student learning outcomes.
Link to Full Text | Show Similar Items | Show Associated Keywords
47. Jankowski, N. April 2012. St. Olaf College: Utilization-Focused Assessment.
The National Institute for Learning Outcomes Assessment (NILOA) selected St. Olaf as a case study institution due to the institutional framing of assessment as inquiry in support of student learning and as meaningful, manageable, and mission-driven; the utilization-focus/backward-design approach employed in assessment; the integration of student learning outcomes assessment processes into faculty governance structures; and the collaborative involvement of multiple stakeholders and diverse ways in which evidence of student learning is utilized throughout the institution.
Link to Full Text | Show Similar Items | Show Associated Keywords
48. Jankowski, N. The role of IR in assessing student learning: Managing shifting priorities.
This presentation from the Ohio Association for Institutional Research and Planning (OAIRP) Spring Conference examines the role of Institutional Research in assessing student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
49. Jankowski, N. 2013, June. Using assessment evidence to improve student learning: Can it be done?
This presentation from the 2013 Assessment in Higher Education Conference describes how institutions are using assessment data.
Link to Full Text | Show Similar Items | Show Associated Keywords
50. Jankowski, N., & Hinds, T. 2013, October. Making the value argument by telling evidence-based stories: The voluntary system of accountability.
This presentation from the 2013 Assessment Institute discusses NILOA's work with the Voluntary System of Accountability.
Link to Full Text | Show Similar Items | Show Associated Keywords
51. Jankowski, N., & Kinzie, J. 2013, May. The role of IR in fostering good assessment practice.

Link to Full Text | Show Similar Items | Show Associated Keywords
52. Jankowski, N. July 2016. Assessing with students: Mapping learning experiences.
At this LiveText Conference, Natasha Jankowski discusses the curriculum mapping process and its implications.
Link to Full Text | Show Similar Items | Show Associated Keywords
53. Jankowski, N. A., Ikenberry, S. O., Kinzie, J., Kuh, G. D., Shenoy, G. F., & Baker, G. R. March 2012. Transparency & accountability: An evaluation of the VSA college portrait pilot.
The Voluntary System of Accountability (VSA) is a vehicle for public four-year universities to report comparable information about the undergraduate student experience via the College Portrait, a common web reporting template. NILOA evaluated the effectiveness of the student learning outcomes pilot project within the College Portrait resulting in this report: Transparency & Accountability: An Evaluation of the VSA College Portrait Pilot. The evaluation took place October 2011 through February 2012, drawing on a variety of data sources.
Link to Full Text | Show Similar Items | Show Associated Keywords
54. Jankowski, N., & Allen, C. 2014, February 28. General education outcomes and NILOA's Transparency Framework.
This presentation from the 2014 AAC&U General Education and Assessment Meeting includes examples of NILOA's Transparency Framework in the field and offers new ideas for thinking about communicating assessment data.
Link to Full Text | Show Similar Items | Show Associated Keywords
55. Jankowski, N., & Kinzie, J. October 2016. Future directions of assessment: Movement on the field.
This presentation explores three shifts in the field of assessment toward more cross-cutting, integrative initiatives and projects. Efforts to document student learning through co-curricular transcripts and active integration of academic affairs and student affairs will be discussed, followed by an overview of the importance of transparent communication to various audiences of our current initiatives and ongoing assessment activities. The presentation will conclude with an overview of what NILOA has been learning from institutions through the work of tracking and mapping involvement with the Degree Qualifications Profile.
Link to Full Text | Show Similar Items | Show Associated Keywords
56. Jankowski, N., & Makela, J. P. June 2010. Exploring the landscape: What institutional websites reveal about student learning outcomes assessment activities.
Despite persistent calls for colleges and universities to post student learning outcomes assessment information to their websites, the assessment information that can be found online falls considerably short of the activities reported by chief academic officers. The study finds that institutions are often not taking full advantage of their website to increase transparency regarding student learning outcomes assessment. The researchers share their findings and offer recommendations for institutions.
Link to Full Text | Show Similar Items | Show Associated Keywords
57. Jankowski, N., & Provezis, S. November 2011. Making student learning evidence transparent: The state of the art.
Making Student Learning Evidence Transparent: The State of the Art is composed of four sections. The sections cover 1) the impact of national transparency initiatives; 2) the changing landscape of transparency; 3) the display of assessment results and their subsequent use; and 4) a synthesis of the previous three sections.
Link to Full Text | Show Similar Items | Show Associated Keywords
58. Jankowski, N., Allen, C., Dumas, R., Bahr, C., and Sotherland, P. 2013, October. Transparent online communication: How to convey assessment information meaningfully.
This presentation from the 2013 Assessment Institute offers information on NILOA's Transparency Framework.
Link to Full Text | Show Similar Items | Show Associated Keywords
59. Jankowski, N., Dumas, R., & Allen, C. 2013, June. Transparent communication of assessment results: A revision to the NILOA Transparency Framework.
This presentation from the 2013 Association for the Assessment of Learning in Higher Education (AALHE) conference provides a history of NILOA's Transparency Framework and offers ideas for its revision.
Link to Full Text | Show Similar Items | Show Associated Keywords
60. Jankowski, N., Eggleston, T., Heyman, E. 2013, June. Engaging with assessment and the Degree Qualifications Profile: Institutions in practice.
This presentation from the 2013 Association for the Assessment of Learning in Higher Education (AALHE) conference describes how institutions are working with the DQP.
Link to Full Text | Show Similar Items | Show Associated Keywords
61. Jankowski, N., Hutchings, P., Dawn, S., Dodge, L. 2014, March 1. The degree qualifications profile: A framework for assessing general education.
This presentation from the 2014 AAC&U General Education and Assessment Meeting provides an introduction to the DQP, including NILOA's work with the DQP, resources, and campus examples.
Link to Full Text | Show Similar Items | Show Associated Keywords
62. Jankowski, N., Hutchings, P., Slotnick, R., Cratsley, C., Fulton, S., Oates, S. 2014, January. What the DQP looks like on the ground: National trends and campus examples.

Link to Full Text | Show Similar Items | Show Associated Keywords
63. Jankowski, N., Keller, C., Gore, P., & Kinzie, J. June 2012. The future of student learning outcomes assessment on the college portrait.
Presentation at the Association for Institutional Research (AIR) on the evaluation of the VSA College Portrait pilot.
Link to Full Text | Show Similar Items | Show Associated Keywords
64. Jankowski, N., Kinzie, J., & Kuh, G. 2014, January 24. What provosts say about student learning outcomes assessment.
This presentation from the 2014 AAC&U Annual Meeting presents an overview of NILOA's 2014 provost survey.
Link to Full Text | Show Similar Items | Show Associated Keywords
65. Jankowski, N., Kuh, G. June 4, 2012. Lessons from the Field: A NILOA Update.
Presentation at Association for Institutional Research (AIR) on NILOA's mission, current projects, and findings.
Link to Full Text | Show Similar Items | Show Associated Keywords
66. Jankowski, N., Pike, G. October 2011. Institutional transparency of student learning outcomes assessment: A framework.
Presentation at the Assessment Institute on the need for transparency, the transparency continuum, barriers to transparency, and examples of colleges and universities presently engaged in transparency.
Link to Full Text | Show Similar Items | Show Associated Keywords
67. Judd, T. P., Secolsky, C., Allen, C. February 2012. Being confident about results from rubrics.
Using rubrics to assess student learning is more and more common, and their use is almost certainly going to increase, as the Association of American Colleges and Universities (AAC&U) essential learning outcomes become better known and the Lumina Degree Qualifications Profile gains traction. Both outcomes frameworks require something more than what available standardized instruments measure.
Link to Full Text | Show Similar Items | Show Associated Keywords
68. Keller, C., Kuh, G., Phillippe, K., Provezis, S., Weiler, W. October 2011. National transparency initiatives: Where are they now?.
Presentation at Assessment Institute on transparency, influences of assessment, and the Voluntary System of Accountability (VSA).
Link to Full Text | Show Similar Items | Show Associated Keywords
69. Kinzie, J. June 2012. Carnegie Mellon University: Fostering assessment for improvement and teaching excellence.
Carnegie Mellon was selected as a case study for the National Institute for Learning Outcomes Assessment (NILOA) for having an approach to student learning outcomes assessment that reflects the institution’s commitment to interdisciplinarity and innovative teaching and learning. Three elements have been instrumental in CMU’s advances in program-level student learning outcomes assessment: 1) an institutionalized research-oriented and data-informed university decision-making process driven by deans and departments; 2) an organizational culture with established processes promoting continuous improvement; and 3) the elevation of a cross-campus faculty resource—the Eberly Center for Teaching Excellence—as the hub of assessment support. This case study broadly describes CMU’s approach to addressing the challenges of assessment, explores the salient elements of CMU’s culture for assessment and improvement, and then focuses on the positioning and role of the Eberly Center for Teaching Excellence in student learning outcomes assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
70. Kinzie, J. August 2011. Colorado State University: A comprehensive continuous improvement system.
Colorado State University was determined to be an instructive case study because its innovative learning outcomes assessment and institutional improvement activities have been highlighted in various publications (see Bender, 2009; Bender, Johnson, & Siller, 2010; Bender & Siller, 2006, 2009; McKelfresh & Bender, 2009) and have been noted by experts in assessment and accreditation. CSU's assessment effort in student affairs is a model for bridging the work of academic affairs and student affairs through student learning outcomes assessment. Over the last dozen years, CSU has expanded its continuous improvement system for managing information sharing to serve the decision-making and reporting needs of various audiences. This system—known as the CSU Plan for Researching Improvement and Supporting Mission, or PRISM—provides information on the university's performance in prioritized areas, uses a peer review system for feedback, and emphasizes the importance of documenting institutional improvements informed by assessment results.
Link to Full Text | Show Similar Items | Show Associated Keywords
71. Kinzie, J. 2014, March. DQP Case Study: University System of Georgia - Georgia State University and Georgia Perimeter College.
Georgia State University and Georgia Perimeter College collaborated on a project to explore the DQP proficiencies at the associate's and bachelor's degree levels. The project provided an opportunity for faculty and staff to work together to explore the creation of discipline-specific versions of the DQP, establish common learning outcomes between programs, and devise mechanisms for assessing the DQP and evaluating the strengths and weaknesses of individual students relative to the disciplinary DQPs.
Link to Full Text | Show Similar Items | Show Associated Keywords
72. Kinzie, J. October 2010. Perspectives from campus leaders on the current state of student learning outcomes assessment: NILOA focus group summary 2009-2010.
This paper highlights lessons from four focus group sessions with campus leaders -- presidents, provosts, academic deans, and directors of institutional research from a variety of two- and four-year institutions -- regarding their perspectives on the state of learning assessment practices on their campuses.
Link to Full Text | Show Similar Items | Show Associated Keywords
73. Kinzie, J. February 2015. DQP case study: American Public University System.
The American Public University System offers online education through American Public University and American Military University. Founded in 1991 by a retired Marine Corps officer who envisioned an innovative way to offer quality and affordable education to the U.S. armed forces, American Military University (AMU) later extended its reach to those in or seeking to enter public service related fields. NILOA selected APUS as a case study site for its unique mission and for the significant headway it made in experimenting with the Degree Qualifications Profile (DQP). In addition, APUS was one of more than 100 institutions that accepted the challenge of testing the DQP without the benefit of participating in a Lumina Foundation for Education funded initiative.
Link to Full Text | Show Similar Items | Show Associated Keywords
74. Kinzie, J., & Jankowski, N. 2013, October. Delving deeper into NILOA survey results: What we know about institutional assessment practice in 2013.
This presentation from the 2013 Assessment Institute discusses results of NILOA's 2013 survey of provosts.
Link to Full Text | Show Similar Items | Show Associated Keywords
75. Kinzie, J., & Lindsay, N. 2014, February 28. Assessment administrators anonymous: 12 steps for involving faculty in assessment.
This presentation from the 2014 AAC&U General Education and Assessment Meeting discusses the role of faculty in assessment and presents results of NILOA's 2009 and 2014 provost surveys.
Link to Full Text | Show Similar Items | Show Associated Keywords
76. Kinzie, J., Bers, T., Quinlan, M. K. January 2012. Student learning outcomes assessment at community colleges.
Presentation at Association of American Colleges and Universities (AAC&U) on approaches and uses of assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
77. Kinzie, J., Harper, I., Moeckel, D.L., Renick, T. 2013, July. Conversations about the Degree Qualifications Profile (DQP).
This presentation from the American Association of State Colleges and Universities (AASCU) Summer Meeting outlines the ways in which several institutions are working with the DQP.
Link to Full Text | Show Similar Items | Show Associated Keywords
78. Kinzie, J., Hutchings, P., Bailey, M., Helm, J., & Powell, K. 2013, October. Using the degree qualifications profile: Institutional examples of promising practices.
This presentation from the 2013 Assessment Institute introduces the DQP, discusses NILOA's work with the DQP, and presents three examples of the DQP in use.
Link to Full Text | Show Similar Items | Show Associated Keywords
79. Kinzie, J., Jankowski, N., Baker, G., Klages, M., & Martinez, V. 2012, October. Using assessment results: Promising practices of institutions that do it well (Presentation).
This presentation from the 2012 Assessment Institute provides an overview of NILOA's July 2012 report, "Using Assessment Results: Promising Practices of Institutions that Do It Well." To read the full report, click here: http://www.learningoutcomeassessment.org/UsingAssessmentResults.htm
Link to Full Text | Show Similar Items | Show Associated Keywords
80. Kinzie, J., Jankowski, N., Haak, B., Bender, K. October 2011. Advancing student learning outcomes assessment: Lessons from campuses doing good work.
Presentation at the Assessment Institute on the purpose of case studies and on colleges and universities presently engaged with assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
81. Kinzie, J., Paulson, K., Provezis, S. May 2011. Department and program-level assessment: Taking stock and identifying challenges.
Presentation at the Association for Institutional Research (AIR) reviewing NILOA survey results and qualitative information on program assessment, with a discussion of the challenges of program-level assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
82. Klein-Collins, R. 2013, October. Sharpening our focus on learning: The rise of competency-based approaches to degree completion.
In NILOA's twentieth occasional paper, author Rebecca Klein-Collins, Senior Director of Research and Policy Development for the Council for Adult and Experiential Learning (CAEL), discusses the methodology, practices, and policies surrounding competency-based education.
Link to Full Text | Show Similar Items | Show Associated Keywords
83. Kuh, G. November 2010. Learning outcomes assessment: A national perspective.
Presentation at Council of Graduate Schools on global competitiveness in degree attainment, the two paradigms of assessment, and Valid Assessment of Learning in Undergraduate Education (VALUE) Rubrics.
Link to Full Text | Show Similar Items | Show Associated Keywords
84. Kuh, G. June 2010. NILOA: Tracking the status of outcomes assessment in the U.S.
Presentation at Association for Institutional Research (AIR) Targeted Affinity Group Opening Session on NILOA activities, accreditation as a driver of assessment, and findings from the NILOA 2010 Program-Level Survey.
Link to Full Text | Show Similar Items | Show Associated Keywords
85. Kuh, G. 2013, February 28. Quality assurance implications of high-impact practices and related improvement efforts.
This presentation from the New Mexico Higher Education Assessment and Retention (NMHEAR) conference explores the use of high-impact practices in higher education.
Link to Full Text | Show Similar Items | Show Associated Keywords
86. Kuh, G. April 2012. Taking stock of student learning outcomes assessment.
Presentation at the Higher Education Quality Council of Ontario (HEQCO) on the need for and the economic impact of assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
87. Kuh, G. 2013, February 28. What matters to student success: The promise of high-impact practices.
This presentation from the 2013 New Mexico Higher Education Assessment and Retention Conference (NMHEAR) provides an overview of high-impact practices and their implications.
Link to Full Text | Show Similar Items | Show Associated Keywords
88. Kuh, G., & Ikenberry, S. October 2009. More than you think, less than we need: Learning outcomes assessment in American higher education.
The 2009 report from the National Institute for Learning Outcomes Assessment (NILOA) is based on information from more than 1,500 regionally accredited degree-granting institutions in the U.S. The NILOA study, titled “More Than You Think, Less Than We Need: Learning Outcomes Assessment in American Higher Education,” summarizes what colleges and universities are doing to measure student learning.
Link to Full Text | Show Similar Items | Show Associated Keywords
89. Kuh, G., & Jankowski, N. 2012, October 29. What you see is less than we need: Communicating and using evidence of student learning.
This presentation from the 2012 Assessment Institute outlines current NILOA projects, including work related to the DQP.
Link to Full Text | Show Similar Items | Show Associated Keywords
90. Kuh, G., Ewell, P. T., Wellman, J., Kinzie, J. January 2010. Using student learning outcomes for accountability and improvement.
Presentation at the Association of American Colleges and Universities (AAC&U) on the cost variables and outputs of assessment, and on accreditation as a driving force for assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
91. Kuh, G., Ikenberry, S., & Jankowski, N. 2013, October. From provosts' lips to NILOA's ear: What we know about institutional assessment practice in 2013.
This presentation from the 2013 Assessment Institute discusses results of NILOA's 2013 survey of provosts.
Link to Full Text | Show Similar Items | Show Associated Keywords
92. Kuh, G., Jankowski, N., Ikenberry, S., & Kinzie, J. 2014. Knowing what students know and can do: The current state of student learning outcomes assessment in US colleges and universities.
In a follow-up to the 2009 survey of chief academic officers, NILOA again asked institutions about practices and activities related to assessing student learning. This report showcases findings regarding institutional activities, uses, drivers, and areas of continued need to advance the assessment of student learning. In addition, the report examines changes and shifts over time in institutional assessment related activities.
Link to Full Text | Show Similar Items | Show Associated Keywords
93. Kuh, G.D. 2014, January. NILOA and the DQP.
This presentation from the 2014 AAC&U Annual Meeting discusses NILOA's work with the DQP and includes results from NILOA's 2014 provost survey.
Link to Full Text | Show Similar Items | Show Associated Keywords
94. Lingenfelter, P. E. May 2011. It is time to make our academic standards clear.
The seal of the United States of America bears the phrase, E Pluribus Unum, “out of many, one.” In education, however, e pluribus pluribus is a better description of our national character. We insist on “local control” in elementary and secondary education, which David Cohen and Susan Moffitt in The Ordeal of Equality suggest has impeded nearly a half-century of efforts to improve the education of poor children. In higher education “institutional autonomy” is the functional equivalent of “local control.” We resist “standardization” with every fiber of our being, while asserting our commitment to ever higher standards of scholarly achievement.
Link to Full Text | Show Similar Items | Show Associated Keywords
95. Bassis, M. July 2015. A Primer on The Transformation of Higher Education in America.
The collection covers a variety of topics: changing paradigms, early calls for change, prominent analyses and prescriptions, critical concepts, processes and tools, prominent transformation efforts in both the for-profit and not-for-profit sectors, barriers to change, critiques of “transformation,” influential websites, supportive foundations and other material of note.
Link to Full Text | Show Similar Items | Show Associated Keywords
96. Miller, M., Lincoln, C., Goldberger, S., Kazis, R., Rothkopf, A. 2012, January. From denial to acceptance: The stages of assessment.
In some ways, the assessment movement over the last 25 years is similar to what individuals experience as they move through Kübler-Ross’s (1997) stages of grief: denial, anger, bargaining, depression, and acceptance. Eventually, reluctantly, slowly, and unevenly, many institutions have come to an acceptance of assessment and its role in higher education.
Link to Full Text | Show Similar Items | Show Associated Keywords
97. Montenegro, E., & Jankowski, N. A. January 2017. Equity and Assessment: Moving Towards Culturally Responsive Assessment.
As colleges educate a more diverse and global student population, there is increased need to ensure every student succeeds regardless of their differences. This paper explores the relationship between equity and assessment, addressing the question: how consequential can assessment be to learning when assessment approaches may not be inclusive of diverse learners? The paper argues that for assessment to meet the goal of improving student learning and authentically documenting what students know and can do, a culturally responsive approach to assessment is needed. In describing what culturally responsive assessment entails, this paper offers a rationale as to why change is necessary, proposes a way to conceptualize the place of students and culture in assessment, and introduces three ways to help make assessment more culturally responsive.
Link to Full Text | Show Similar Items | Show Associated Keywords
98. Montenegro, E., & Jankowski, N. A. 2015, April. Focused on What Matters: Assessment of Student Learning Outcomes at Minority-Serving Institutions.
This report features the assessment work being done at Minority-Serving Institutions (MSIs). Comparisons are made between assessment activities at MSIs and those underway at Predominantly White Institutions (PWIs) as well as those at different types of MSIs (Tribal Colleges, Historically Black Colleges and Universities, and others). Four main findings are discussed including the internal focus of MSIs, the emphasis on using assessment data for improvement, differences among different types of MSIs in their assessment approaches, and matching assessment approaches to student characteristics and learning needs. Implications are presented for understanding assessment activities in MSIs, and how such understandings can help advance assessment efforts at all postsecondary institutions.
Link to Full Text | Show Similar Items | Show Associated Keywords
99. Naser, C. R. January 2012. What assessment personnel need to know about IRBs .
Because assessment projects across all disciplines are now employing systematic research methods that include access to students’ confidential data and artifacts, faculty need to be cognizant of our obligation to protect human subjects in our research. Beyond simple compliance, we want to be respectful of students and to be sure we are acting ethically. By the same token, it is easy to misunderstand the policies and procedures of Institutional Review Boards (IRBs). What is the proper role of IRBs in student learning assessment?
Link to Full Text | Show Similar Items | Show Associated Keywords
100. National Institute for Learning Outcomes Assessment (NILOA). 2009. 2009 survey questionnaire.
This survey examines institution-level assessment activities and campus assessment practices. Please contact us before using it for research or external purposes.
Link to Full Text | Show Similar Items | Show Associated Keywords
101. National Institute for Learning Outcomes Assessment (NILOA). 2010. 2010 survey questionnaire.
This survey examines assessment activities at the program and department level. Please contact us before using it for research or external purposes.
Link to Full Text | Show Similar Items | Show Associated Keywords
102. Nunley, C., Bers, T., & Manning, T. July 2011. Learning outcomes assessment in community colleges.
As community colleges become increasingly important in educating students across the country, more emphasis is being placed on them to provide the public with information on the learning outcomes of their students. In this tenth NILOA Occasional Paper, Charlene Nunley, Trudy Bers, and Terri Manning describe the complex environment of community colleges as it relates to student learning outcomes assessment. Results from previous surveys of community college institutional researchers and chief academic officers are analyzed, along with short vignettes of good practices at various community colleges. Drawing on their experience working with institutions or within their own, the authors offer suggestions to make student learning outcomes assessment more effective and transparent.
Link to Full Text | Show Similar Items | Show Associated Keywords
103. Nyamekye, A. September 2011. Putting myself to the test.
In a routine evaluation, my principal praised my organization, management, and facilitation, but posed the following question: “How do you know the kids are really getting it?” She urged me to develop more-rigorous assessments of student learning. Ego and uncertainty inspired me to measure the impact of my instruction. I thought I was effective, but I wanted proof.
Link to Full Text | Show Similar Items | Show Associated Keywords
104. Ochoa, E. M. April 2012. The state of assessment of learning outcomes.
My sense of assessment of learning outcomes in higher education is framed by what I think is its ultimate purpose and ideal end-state. Ideally, we would have a well-articulated, measurable set of desired educational outcomes associated with all our academic programs. Such measures would exhibit some commonalities in terms of capacities associated with different degree levels, as well as unique aspects by discipline and institutional mission. Student progress toward achieving those capacities would be gauged based on how far and how many of the desired outcomes have been attained using well-established metrics, rather than by seat time or actual hours of work.
Link to Full Text | Show Similar Items | Show Associated Keywords
105. Otis, M. M. 2010. Listening to students.
An insightful article from Change magazine discussing assessment from a student's perspective.
Link to Full Text | Show Similar Items | Show Associated Keywords
106. Parker, H. E. 2015. Digital badges as effective assessment tools.
This brief, published by the National Institute for Learning Outcomes Assessment, offers an overview of the purposes and uses of digital badges.
Link to Full Text | Show Similar Items | Show Associated Keywords
107. Prineas, M. & Cini, M. October 2011. Assessing learning in online education: The role of technology in improving student outcomes.
This paper focuses on how online education can impact how we understand and assess student learning outcomes. The authors begin by tracing the development of both online education and assessment practice, arguing that little crossover has occurred between the two even though opportunities to connect the movements abound, including data mining, program design, real-time program changes, and individualized analytics for students. This paper concludes with a discussion about the changing role for faculty in this new paradigm of online education and assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
108. Provezis, S. May 2011. A transparency framework: How to make student learning outcomes results accessible to external audiences.
Presentation at the Association for Institutional Research (AIR) Annual Forum on student learning assessment components, with examples.
Link to Full Text | Show Similar Items | Show Associated Keywords
109. Provezis, S. July 2011. Augustana College: An assessment review committee's role in engaging faculty.
Over the last six years, Augustana has been active in assessing student learning and has become a leader in gaining faculty involvement. This involvement is due in part to the institution's focus on teaching and learning, the dynamic role of its Assessment Review Committee, and its communication strategies. These factors have allowed the college to make several improvements on campus based on its assessment activities.
Link to Full Text | Show Similar Items | Show Associated Keywords
110. Provezis, S. June 2012. LaGuardia Community College: Weaving assessment into the institutional fabric.
A federally designated Hispanic Serving Institution, LaGuardia Community College serves an overwhelmingly minority and first-generation college student population “from diverse cultures, ages, and educational and economic backgrounds.” Its students come from 160 different countries and speak more than 120 different primary languages. LaGuardia’s commitment to educational excellence has been acknowledged by Excelencia in Education, the Bellwether Award for Exemplary Instructional Programs, and the Community College Excellence Award from the MetLife Foundation. Because of its reputation as a leader in learning outcomes assessment, particularly through the use of electronic portfolios (ePortfolios), LaGuardia was selected by the National Institute for Learning Outcomes Assessment (NILOA) as an Example of Best Practice. This report features LaGuardia’s commitment to assessment, the collaboration across units at the college, the ePortfolio as the foundation of the assessment efforts, and the institution’s robust professional development program.
Link to Full Text | Show Similar Items | Show Associated Keywords
111. Provezis, S. October 2010. Regional accreditation and student learning outcomes: Mapping the territory.
Regional accreditation in the American higher education system has been challenged in recent years over its approach to evaluating institutional quality, yet too little is known about the criteria and processes accreditors use. This paper carefully examines how regional accrediting groups go about making judgments about institutional quality. To do so, the policies and procedures of the seven regional accreditors as they relate to student learning outcomes assessment are examined for similarities and differences. In many ways, these organizations exhibit a degree of consistency across regions with regard to student learning outcomes assessment. However, more could be done to define useful approaches to assessment, to disseminate these approaches, and to address the cost of these additional expectations for institutions.
Link to Full Text | Show Similar Items | Show Associated Keywords
112. Provezis, S. April 2012. Student learning outcomes assessment: All signs point to accreditation.
Keynote presentation at the RosEvaluation Conference on the driving forces of assessment.
Link to Full Text | Show Similar Items | Show Associated Keywords
113. Provezis, S., & Jankowski, N. 2011. Presenting learning outcomes assessment results to foster use.
A chapter on NILOA's Transparency Framework regarding institutional transparency and public reporting.
Link to Full Text | Show Similar Items | Show Associated Keywords
114. Provezis, S., & Jankowski, N. May 2011. NILOA transparency framework: A tool for transparent communication of assessment information.
Presentation at the Association for the Assessment of Learning in Higher Education (AALHE) on student learning assessment components, with examples.
Link to Full Text | Show Similar Items | Show Associated Keywords
115. Provezis, S., Jankowski, N., Makela, J., & Santucci, D. June 2010. Learning outcomes assessment, transparency, and the internet: A critical examination of higher education institutions' web-based communication strategies.
Presentation at Association for Institutional Research (AIR) on NILOA's use of web scans and findings relating to Institutional Research websites.
Link to Full Text | Show Similar Items | Show Associated Keywords
116. Provezis, S., Kepple, T., Pugliesi, K., Beld, J., & Pike, G. January 2012. Show me the learning: Best practices in institutional transparency.
Presentation at the Association of American Colleges and Universities (AAC&U) on colleges and universities currently engaged in assessment and transparency work.
Link to Full Text | Show Similar Items | Show Associated Keywords
117. RiCharde, R. S. 2012. What to consider when selecting an assessment management system.
A few years ago, the primary reason for using a data management system arose from the need to manage large amounts of dynamic data more efficiently. But in the past few years, there’s been a tectonic shift in public policy that catapulted organizing assessment and institutional effectiveness data to mission-critical status.
Link to Full Text | Show Similar Items | Show Associated Keywords
118. Schuh, J. H. & Gansemer-Topf, A. M. December 2010. The role of student affairs in student learning assessment.
Student affairs professionals are expected to be knowledgeable about the student experience. Thus, it follows that they can and should play an important role in assessing student learning. We hope this paper will persuade faculty and institutional leaders that student affairs staff with the requisite expertise should be involved in collecting, interpreting, and using evidence of student learning for both accountability and improvement.
Link to Full Text | Show Similar Items | Show Associated Keywords
119. Shenoy, G. November 2011. Talking about assessment: An analysis of the measuring stick blog and the comments it elicited.
Presentation at American Evaluation Association Conference on The Chronicle of Higher Education's "Measuring Stick" blog and comments.
Link to Full Text | Show Similar Items | Show Associated Keywords
120. Shenoy, G. November 2011. Why assess student learning? What the measuring stick series revealed.
NILOA staff conducted a content analysis of the essays and readers’ comments. Three main findings emerged. First, general agreement does not exist as to how to define quality. In addition, who should be responsible for ensuring quality and how to measure it are unclear. In the absence of consensus on these important issues, we hope readers will use the NILOA website to continue the conversation about this important topic. And now, I offer more detail about what my analysis found.
Link to Full Text | Show Similar Items | Show Associated Keywords
121. Smith, V. July 2011. Transparency drives learning at Rio Salado College.
No doubt about it, higher education is under greater scrutiny. Such scrutiny is especially intense in the case of predominantly online academic programs. Documenting what students are learning and making that evidence transparent are common challenges. These expectations may only increase as higher education looks for cost-effective solutions to access, retention, and completion at both the institutional and program levels.
Link to Full Text | Show Similar Items | Show Associated Keywords
122. Stokes, P. August 2011. From uniformity to personalization: How to get the most out of assessment.
The potential for assessment to inform the improvement of curriculum, teaching, student performance, and institutional effectiveness has never been greater. So why aren’t our students performing better?
Link to Full Text | Show Similar Items | Show Associated Keywords
123. Swing, R. & Coogan, C. 2010. Valuing assessment: Cost-benefit considerations.
Nearly every accredited U.S. college and university allocates resources to support assessment of student learning outcomes, satisfaction, and other measures of institutional effectiveness. But with only limited data about best practices in budgeting for assessment, colleges are left guessing how much they should spend on assessment to achieve the best return on their investment.
Link to Full Text | Show Similar Items | Show Associated Keywords
124. Teitelbaum, E. October 2016. Involving students and their perspectives: A student panel discussion.
This session featured a student panel and addressed how to bridge the divide between the practice of assessment on the part of faculty and administrators and the lived experience of assessment on the part of students.
Link to Full Text | Show Similar Items | Show Associated Keywords
125. Volkwein, J. F. September 2011. Gaining ground: The role of institutional research in assessing student outcomes and demonstrating institutional effectiveness.
The work of institutional researchers is gaining importance on today's campuses. Among institutional researchers' wide range of duties is a significant role in student outcomes assessment. In this eleventh NILOA Occasional Paper, J. Fredericks Volkwein leads us through these roles. The paper includes an analysis of data from the Center for the Study of Higher Education at Penn State's 2008-09 National Survey of Institutional Research Offices, gathered from more than 3,300 professional staff. Overall, this occasional paper helps readers better understand the roles, responsibilities, and challenges faced by institutional researchers in relation to student outcomes assessment on their campuses.
Link to Full Text | Show Similar Items | Show Associated Keywords
126. Wellman, J. V. January 2010. Connecting the dots between learning and resources.
With all the talk about the need for more accountability, surprisingly little is known about what kind of resources an institution needs in order to produce a given level of student attainment. Jane Wellman charts this territory and discovers some surprises, such as how conclusions about cost-effectiveness change when the metric is cost-per-degree rather than the traditional cost-per-enrollment. One result is that, contrary to popular belief, community colleges are not cheap when it comes to cost-per-degree. Another important insight—again against the grain of conventional wisdom—is that simply investing more money does not appear to produce more or better outcomes. As Wellman points out, the key to productivity is intentionally targeted investments.
Link to Full Text | Show Similar Items | Show Associated Keywords

Search Again