Promoting Co-curricular Learning Success: How Sharing Assessment Results Can Strengthen On-campus Credibility and Help Recruit Students
Ken Waters, Ph.D., Professor of Journalism, Pepperdine University
Elizabeth R. Smith, Ed.D., Assistant Professor of Journalism, Pepperdine University
This paper examines the assessment procedures used for the extracurricular student newsroom at one small, liberal arts, Christian university. The assessment of this student newsroom provides an important context for general extracurricular assessment practices because the student newsroom is attached to the university’s Journalism major and includes students from all majors across the university. This model of extracurricular assessment is largely framed by George D. Kuh’s conceptualization of High Impact Learning and begins with developing learning outcomes for the co-curricular and collecting direct and indirect evidence. Additional assessment tools include connecting with program alumni, collecting feedback from content sources and administrators at the university, annual archival reports (generated outside of program assessments), and convocation and town hall programs specific to the extracurricular. The examination of assessment for this student newsroom provides a model for a wide range of extracurricular programs found at universities, particularly Christian universities.
Key words: assessment, high impact practices, extracurricular, co-curricular, student newspaper, student newsroom, Kuh
Cite as: Ken Waters and Elizabeth R. Smith, “Promoting Co-curricular Learning Success: How Sharing Assessment Results Can Strengthen On-campus Credibility and Help Recruit Students,” Journal of Christian Teaching Practice, 5(1), 2018, https://www.theccsn.com/promoting-co-curricular-learning-success-how-sharing-assessment-results-can-strengthen-on-campus-credibility-and-help-recruit-students/
In informal conversations and in formal conference presentations, faculty who advise student co-curricular activities note that their work is often misunderstood and, thus, targeted for elimination by administrators. This misunderstanding usually arises from a student newspaper editorial, a controversial choice of choral music, a debate topic that offends a donor, or a cuss word uttered on stage. At other times, faculty advisers may feel their co-curricular efforts are equated with vocational school teaching by overzealous liberal arts purists. Thankfully, higher education’s fixation on assessing student learning provides faculty advisers with a means to align their activities with preeminent learning strategies—often called authentic learning or High Impact Practices—and make a strong case for more budgetary support and credibility as an important learning activity.
This article explores ways in which advisers can better assess co-curricular activities and use those findings to make the case that their work is at the heart of “best practices” for student learning. This paper also discusses how that data can be selectively shared with administrators, parents, and students not only to improve credibility but also to aid in the recruitment of students. The following discussion considers the nature and practice of assessment, in addition to the latest theorizing on High Impact Practices. It also shows how the results of co-curricular assessment can be used to improve student learning in conjunction with classroom learning and further the visibility and credibility of what students learn.
What Is Assessment?
Assessment is a time-consuming reality for college and university professors. Assessment expectations include what students learn inside the four walls of a classroom and what they learn in activities that extend classroom learning into areas such as student government, intramurals, campus ministry, living-quarters management and mentoring, and a host of other opportunities provided by the college. Assessment activities take many forms. Books abound on various techniques that can help faculty learn whether students are achieving the intended learning outcomes. Learning outcomes are set not only by classroom educators and leaders of curricular units but by staff and administrators who oversee those out-of-the-classroom activities. The idea is that by learning leadership and people skills through employment on campus, or through volunteer service, students are gaining skills alongside (or co-, meaning “with”) their classroom knowledge.
Traditionally, those who have served as faculty advisers have often called these activities “co-curriculars.” This is particularly true at colleges and universities where students receive course credit for their learning engagement on the student newspaper, choir, debate squad, theater performances, artistic exhibitions, and other activities. In recent years, through accrediting agency and government pressure, universities have moved the locus of their assessment activities from what students learn in a classroom to a combination of classroom and outside-the-classroom experiences to prove that colleges and universities provide a strong learning environment that is a good financial value. While not new, “experiential learning” has been exalted to a higher plane of learning that colleges can feature as proof that students are learning critical thinking, other skills useful for success in life, and training that will help them secure a job upon graduation. The term co-curricular is now used to designate learning that takes place through student activities that may not include the classroom at all, such as student government service, membership in a fraternity or sorority, spiritual life leadership, student chaplaincy, and a host of other leadership roles. These, too, are subject to assessment. Often these types of student learning take place separate from the classroom and are under the supervision of staff rather than faculty. This leaves those who have traditionally called such efforts “co-curricular” searching for a new term for faculty-supervised activities of this kind. This article will refer to what these activities provide as experiential learning, extracurricular learning, authentic learning, or High Impact Practice.
In his seminal paper on High Impact Practices, George D. Kuh argues that students maximize their college learning experience when they are engaged in internships, senior capstone experiences, service learning, study abroad, living in a learning community, and conducting research with faculty. These experiences are transformative and authentic. He asserts that students involved in High Impact Practices “devote considerable time and effort to purposeful tasks; more require daily decisions that deepen students’ investment in the activity as well as their commitment to their academic program and college.” The authors of this paper assert, along with others, that serving as a staff member or executive on a student newspaper or television news program, a debate squad, a choir, theatre troupe, or orchestra is a worthy High Impact Practice as well. Through assessment data gleaned from these learning activities, we can provide strong evidence that students are bridging the gap between the classroom and their future professions by applying knowledge to practice in ways students and administrators should celebrate. It is our task as academics and passionate staff to demonstrate student learning through the co-curricular activities we advise.
The examples presented here are primarily from assessment of the student news activities at a medium-sized Christian university on the West Coast of the United States. They are, however, easily adaptable for advisers of forensics, student-run communication bureaus, theater, music, and other on-campus activities that blend classroom learning and practical application. The suggestions aim to assist Christian educators in their responsibility to maximize student potential as intellectual, emotional, and spiritual beings. Additionally, this paper seeks to help enhance the pedagogy of Christian universities in their overarching mission to foster both truth and academic excellence.
Most extracurricular faculty advisers recognize that these activities reinforce classroom knowledge and simultaneously demonstrate skills needed in future professions. While such authentic and experiential experiences are frequently transformational for students, faculty may struggle to create these experiences while meeting other contractual obligations and occasionally defending artistic or press choices. Those of us who advise these activities see this transformation firsthand, even as we struggle with how to achieve maximum student learning when our dedication is limited by sizeable teaching loads; a call to scholarly publication; tight budgets for equipment and scholarships; and occasional battles with administrators over a controversial article, work of art, swear word in a copyrighted play, or selection of music for a concert. This paper will provide assessment guidance for authentic learning activities that not only documents positive results but also suggests strategies for raising visibility and funding for our programs.
A Foundation for Effective Assessment
Assessment is the process of determining if students are learning what we say they are learning. Grades may measure an individual student’s grasp of material in a course, but grades are not useful in measuring the extent to which the learning goals of the university, college, and/or major are fulfilled. The process, importantly, includes a mechanism that encourages faculty to use assessment data to craft improvement plans for the program being assessed and to later determine if those plans were achieved as another round of assessment is carried out. The process is cyclical and sequential, with improvement plans building on improvement plans as each of the learning outcomes is assessed over an agreed-upon period of time.
The process begins with using the college or university’s mission statement and institutional learning outcomes to craft course-specific learning outcomes, aided by national standards for achievement if these exist for a given major or program. These are statements of expectation, asserting that by graduation, students in a particular major or extracurricular program will be able to demonstrate that they have learned or achieved the stated learning outcomes. This should signal to administrators and accrediting agencies that the program assessed is fulfilling its part in providing the education promised to the students; it should signal to parents and students that their tuition was well spent; and it should assure potential employers that students have some practical experience in the field for which they are seeking employment. Finally, it should reinforce to all stakeholders that faculty advisers are dedicated to improvement as they tweak learning strategies in response to any weaknesses found in student learning through the assessment process.
Writing Learning Outcomes
To properly assess learning, faculty or staff advisers need a set of learning outcomes that guide the program. If this co-curricular program is included in the curricular units of journalism, communication, fine arts, or the performing arts, the learning outcomes are a further refinement of the academic unit’s outcomes. At our university, the journalism major’s learning outcomes include the following:
- Knowledge: Explain the role of the free press in a democratic society, apply the principles and laws of free speech, identify the key events in the history of journalism, and identify the trends of the current media landscape and of the journalism profession.
- Skills: Conduct relevant research, identify and interview sources for news articles, evaluate source credibility, synthesize acquired information and opinions, and present the resulting news stories in clear and concise fashion using a variety of words, images, and sound.
- Collaboration: Collaborate with respect for others and make ethical choices in the production, management, funding, and promotion of media messages.
- Respect and values: Recognize insensitivity, disrespect, and injustice; develop practices to respect and include minority voices and perspectives.
Given these Program Learning Outcomes (PLOs) for the major, then, the construction of learning outcomes for the extracurricular opportunities tied to the major flows primarily from Learning Outcomes 2, 3, and 4. Some knowledge of the history and context of news media communication might be gleaned by students in dialogue with the faculty adviser when a question about the context of a news or editorial article is discussed, but the majority of that knowledge is gained in dedicated coursework. Skills, collaboration, and respect and values, in this example, are taught in classes but are applied more frequently and in greater depth through the process of creating a news budget, employing writers, editing copy, suggesting story improvement, consulting with editors about a news item’s placement and titling, design, advertising sales, and other skills. Conversations about the appropriateness of a student opinion article in relationship to the Christian mission of the university often take place late in the evening. The ensuing debate over whether to publish the article often involves multiple students debating the relationship between an editorial staff’s communication goals and the realities of their commitment to their “employer.” Additionally, in the process of producing and publishing a news program, students must interact with each other in constructive, albeit at times tense, discussion. Similar dynamics exist in the production activities leading to final musical or theatrical performances. All of these activities, often interwoven and unconsciously performed, are ripe with data for assessment.
Advisers of extracurricular activities in partnership with faculty in an academic major may choose not to create separate PLOs for the learning experience, instead relying on those of the major to guide their learning outcome creation. This also provides professors in the major the opportunity to include the rich data gained from assessing the co-curricular in their annual program assessment review. At our university, as at most colleges and universities, all students on campus are welcome to participate in co-curricular activities such as the student newspaper, so tying the paper’s learning outcomes solely to the journalism major’s outcomes was problematic for us. That led us to undertake a two-step process of learning outcome creation. First, we used the university’s mission statement as the starting point for crafting learning outcomes for the student-run news media. This can be particularly useful if the co-curricular is not tied to a major but overseen by an office of student life, student affairs, or student government. Here is an example of using the university’s mission statement to craft learning outcomes for a student news-media activity.
Pepperdine Graphic Media strengthens students for purpose, service, and leadership by developing their skills in writing, editing, and publication production, by providing a vehicle to integrate and implement their liberal arts education, and by developing students’ critical thinking through independent editorial judgment.
PGM participates in Pepperdine’s Christian mission and affirmations, especially the pursuit of truth, excellence, and freedom in a context of public service.
In this example, using the university PLOs is helpful because by nature the mission and learning outcomes of a university are broader and more encompassing than those of a particular major. In the mission/learning outcomes statement for our student news media, known as PGM, we also mirrored part of a university affirmation statement that it is “committed to the highest standards of academic excellence and Christian values, where students are strengthened for lives of purpose, service, and leadership.” We picked up on the idea of strengthening students for lives of purpose, service, and leadership and built our learning outcomes of writing, editing and production skills—added to critical thinking through independent editorial judgement—around these ideals. Additionally, we felt we needed to emphasize a university affirmation statement “[t]hat truth, having nothing to fear from investigation, should be pursued relentlessly in every discipline.” As we’ll discuss later, tying the learning outcomes of the student news media to the mission and learning outcomes of the university provides a strong common ground for discussions with administrators and faculty who question the purpose of the student news media when controversies over an article or editorial arise.
A second way to create extracurricular learning outcomes is to directly tie the learning outcomes of a major such as journalism (or theater, art, film, etc.) to those of its affiliated co-curricular. In this example, linking the journalism major’s learning outcomes to those of Pepperdine Graphic Media produced the following outcomes, each wedded to the major PLOs mentioned previously:
- To participate in the reporting, writing, editing, and presentation of [news] content in the publication process (ties into knowledge, skills, collaboration, respect, and values)
- To cultivate diverse sources and develop story ideas (collaboration, respect, and values)
- To understand our diverse audience and the concepts of audience engagement (knowledge, skills, and collaboration)
- To understand the production of multi-platform, multi-media content (knowledge and skills)
- To be able to articulate the importance of student journalism and the power and responsibility of that importance on a college campus (knowledge, skills, collaboration, respect, and values)
Another way to create learning outcomes for the co-curricular is to tie those outcomes to a national or international accrediting body. For instance, the National Association of Schools of Music lists more than a dozen professional and liberal arts undergraduate degree programs for which it has developed learning competencies. Schools seeking accreditation by NASM must mirror these competencies in their learning outcomes as one part of the accreditation process. This would be true for their coursework learning outcomes and for any learning expectations for the choir, orchestra, and other co-curricular activities involving students. While our Journalism faculty has chosen not to seek accreditation by the Association of Schools and Colleges of Journalism and Mass Media, we consult the group’s recommendations on an annual basis to ensure we are keeping up with best practices.
Beginning the Assessment Process
The process for assessing an academic program is fairly straightforward once the learning outcomes are written and approved by the faculty or staff of a given program and/or major. Traditional assessment practice suggests assessing one PLO per year. The PLO to be examined in a given year should be agreed upon by faculty or the administrative unit prior to the beginning of the academic year and arranged in a schedule. This assures that the sequencing of learning outcome assessment is clear a year or more in advance. Planning ahead allows for appropriate data to be gathered before an academic term begins. For example, at the beginning of an academic year, a pre-test can be given to students to determine what they know or have applied prior to the beginning of the new learning experience. Another benefit of early agreement on assessment goals is that rubrics can be modified, and assignments given, that allow for better targeted and successful data collection at the end of that academic year. In the example of this program’s news media learning outcomes, faculty can choose one year to assess writing, another year to assess editing, another year for production and/or design, and another year for considering critical thinking, collaboration and/or ethics and values.
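The agreed-upon rotation described above can be sketched as a simple lookup table. Everything in the sketch below is an illustrative assumption (the academic years, outcome names, and pre-test flags are hypothetical, not this program's actual schedule):

```python
# A minimal sketch of a multi-year assessment rotation: one program learning
# outcome (PLO) per academic year, agreed upon in advance so that pre-tests
# and rubric revisions can be arranged before the term begins.
# All years, outcomes, and flags here are hypothetical examples.
ROTATION = [
    # (academic year, outcome assessed, pre-test needed in week 1?)
    ("2024-25", "writing", True),
    ("2025-26", "editing", True),
    ("2026-27", "production and design", False),
    ("2027-28", "critical thinking, collaboration, and ethics", True),
]

def plo_for_year(year: str) -> tuple[str, bool]:
    """Return (outcome, needs_pretest) for the given academic year."""
    for yr, outcome, pretest in ROTATION:
        if yr == year:
            return outcome, pretest
    raise KeyError(f"no PLO scheduled for {year}")

# Example: planning data collection before the 2025-26 year begins.
outcome, needs_pretest = plo_for_year("2025-26")
print(outcome, needs_pretest)
```

Keeping the rotation explicit, even in so simple a form, makes it easy for faculty to see a year or more in advance which rubrics to revise and which students need a pre-test at the start of the term.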
Two types of assessment are available: direct and indirect. Direct assessment analyzes student work and assesses that work in a systematic fashion. Assessors can be a small group of professors, professionals, or even students. Expectations are agreed upon beforehand, and most often a rubric is used. Indirect evidence may include student surveys, focus groups, or other tools that involve student reporting rather than direct observation of student-produced artifacts.
In the case of our journalism program, faculty begin their annual assessment process by gathering relevant demographic data on students who held a staff or leadership position. Because a journalism major is offered, faculty gather data for both majors and for non-majors. This information includes but is not limited to gender diversity, ethnic and religious diversity, academic majors, and scholarship funding. This is especially important if faculty want to return to the data later to determine if majors perform better on learning measures at graduation as compared to students who did not have the benefit of a rigorous major and had to “learn on the job.”
The most common direct evidence one can gather is through assessing student performance on one or more of the outcomes expected. As previously noted, assessment experts argue that a subset of direct evidence, often called authentic or transformative assessment, is the sine qua non of assessment. According to Jon Mueller, “Authentic assessment is a form of assessment in which students are asked to perform real-world tasks that demonstrate meaningful application of essential knowledge and skills.” Authentic evidence also includes evidence such as awards or external scholarships earned by students and “skills that can be useful after college and in the workplace.” Our journalism faculty begin each assessment cycle by identifying the student learning outcomes they want to assess. Say, for instance, the faculty want to see how well students meet the writing expectations for print, online, and television news stories. First, faculty will gather a sample of news products in written and video form. The exact number may vary, but they attempt to gather stories throughout the school year and balance the sample by writer and perhaps by type of story—hard news, features, columns, etc. Faculty then use national rubrics they may modify for their own purposes, such as adapting a writing rubric to better reflect expectations for news writing rather than essay writing. In advance of the scoring, faculty set forth their expectations, such as “70 percent of the students should score satisfactory or above on the variety and diversity of sources consulted for their story.” The choice is somewhat arbitrary, but the faculty reason that if 70 percent is normally considered the bottom of the “average” scale in traditional grading, they want their students to at least be above average in their journalistic writing.
Once the faculty have identified the articles (for some it may be other forms of performance or exhibition), two or three professors in journalism or related areas score student writing by reading or viewing the selection of articles, news broadcasts, or scripts and marking the assessment rubric. Most years the faculty will ask a selection of professional journalists, some alumni and some not, to give an outside-of-academia perspective by using the same rubric to score the student work. (Colleagues in the arts ask professional artists to attend performances or exhibitions and provide feedback in a similar fashion as part of their assessment processes.) Student leaders, editors, and producers also score rubrics. The scoring usually asks assessors to check rubric boxes corresponding to three, four, or five choices, ranging from “insufficient” to “excellent.” Scores are then tabulated as means and percentages and presented to advisers to analyze, with special attention paid to any areas of weakness identified for improvement.
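Tabulating rubric scores as means and percentages against a pre-set benchmark is straightforward. The sketch below uses purely hypothetical scores and assessor groups (none of these numbers are the program's actual data) to show one way of checking the "70 percent at satisfactory or above" expectation described earlier:

```python
from statistics import mean

# Hypothetical rubric scores on a 5-point scale (1 = "insufficient",
# 3 = "satisfactory", 5 = "excellent"), grouped by assessor type.
scores = {
    "faculty": [4, 3, 5, 2, 4, 3],
    "professionals": [3, 4, 4, 3, 5, 2],
    "student_editors": [5, 4, 3, 4, 4, 3],
}

BENCHMARK = 3   # "satisfactory" on the rubric
TARGET = 0.70   # 70 percent of artifacts should meet or exceed the benchmark

# Per-group means and benchmark percentages.
for group, vals in scores.items():
    pct = sum(1 for s in vals if s >= BENCHMARK) / len(vals)
    print(f"{group}: mean={mean(vals):.2f}, at/above benchmark={pct:.0%}")

# Overall result across all assessor groups, compared to the target.
all_scores = [s for vals in scores.values() for s in vals]
overall = sum(1 for s in all_scores if s >= BENCHMARK) / len(all_scores)
print(f"overall: {overall:.0%} met benchmark; "
      f"target {'met' if overall >= TARGET else 'not met'}")
```

Advisers could extend this by breaking scores out by story type, or by major versus non-major, mirroring the demographic data gathered at the start of the assessment cycle.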
A key benefit of extracurricular learning is that it often takes place in groups. Student collaboration and learning communities help students gain “a set of aptitudes and competencies that increase the personal and interpersonal effectiveness of individuals who have them and, through their interpersonal relationships, the effectiveness of others as well.” Studies have attempted to quantify student leadership development. In one study, coaches, student housing advisers, and other staff leaders were asked to confidentially rate students on areas of leadership development, self-direction, and adaptability. This scale could be meaningful for faculty and staff at faith-based institutions because some of its elements, such as “Acts in ethical ways,” “Is able to admit when she or he has made a mistake,” and “Is able to come up with new ideas for how to deal with situations,” speak to our institutions’ Christian mission. Showing through assessment that students in the co-curricular score high on measures of leadership development is an excellent affirmation of the institution’s learning outcomes.
If majors receive unit credit for their experiential learning work, asking them to complete a written reflection paper can provide useful direct evidence of their learning. One or two prompt questions tied to learning outcomes are a useful way to acquire data that can provide feedback and quotes for later promotional material. We have found the reflection paper to be particularly useful as a way to encourage students to reflect on their learning in internships and as they interact with other creative students involved in the student-run strategic communications bureau on campus. Of course, direct evidence can be gathered in a variety of ways, including the use of pre-tests/post-tests and knowledge surveys.
In addition to this direct assessment, learning outcomes may be assessed by soliciting other feedback. Using a survey of participating students, and then an end-of-the-year focus group of graduating seniors, provides both a broad view of the program and a more nuanced, deeper view through the focus groups. It is best to conduct a written or online survey a few weeks before the end of the semester to allow time to tabulate and consider open-ended responses. Prior to graduation, a focus group of graduating seniors, plied with pizza and drinks (or healthier food if possible!), allows advisers to engage in question-and-answer sessions that can help with improving training, organization, or the elimination or creation of positions for the next academic year. Periodic surveys of alumni, in which they are asked to reflect on how their learning contributed to their current profession, also provide helpful input. A few years ago, nearly three-fourths of the alumni responding to our survey told us their co-curricular experience had prepared them “reasonably” or “extremely” well for their professional careers. Another survey suggested that alumni felt underprepared for jobs in web-based and technology journalism. This feedback encouraged us to move toward a digital-first model in the student newsroom and, in discussions with other Communication faculty, we created a mandatory course called Storytelling in the Media. The alumni survey data was a key component of gaining college-wide faculty approval for the new course. Alumni surveys also facilitate the gathering of former students’ email addresses, making it easier to keep them connected through a Facebook page or a periodic newsletter from the adviser. Faculty have also found that occasional communication with alumni helps us identify useful guest speakers who may be willing to speak to students in person or through Skype or FaceTime.
However, alumni involvement should always circle back to whether a program is achieving what it has promised.
Indirect assessment techniques also hold promise for gathering student perceptions about teamwork, collaboration, and success of mentorship programs in the co-curricular. In one of the assessment cycles, television news producers and directors filled out a Collaboration Mentorship form during the semester. The form asked students to reflect on their interactions with the “talent” and “crew” immediately after the live broadcast ended. This provided data to the faculty that tied back to the collaboration PLO. The results were encouraging.
Closing the Loop: Planning for Improvement
The assessment process may require educators impacted by the findings to explore the students’ educational journey throughout the curriculum or co-curriculum and identify learning goals in each course or each learning event. Then they propose new or modified learning modules to shore up areas of weakness. The process would then include a way to conduct additional assessment in a few years’ time to determine if the teaching and learning changes achieved the hoped-for improvements in student learning. Several years ago, for instance, assessment of student news stories in a selection of print, online, and broadcast news found a lack of diversity in sources, with too much reliance on a small group of student leaders, faculty, and administrators. The same rubric identified a glaring lack of sourcing from people of color on campus. With the help of student editors, advisers created a substantial article checklist for student reporters and editors that stresses the need for more diverse sources. The checklist is first introduced to students who attend Our Big Week. OBW begins a week before classes start in August as student news staff members attend intensive workshops on skill development in a variety of print, online, and broadcast journalism skills while producing a first-of-the-year newspaper that greets the student body when it arrives on campus for the first day of classes. The idea for OBW itself resulted from assessment data showing several weaknesses in student media skills.
Assessment results also noted that television news stories lacked acceptable sound quality. The sourcing deficiency fell under learning outcomes dealing with a comprehensive and accurate presentation of the news, while the sound deficiencies were traced to a lack of proper training, in our case instruction in a reporting course and in the orientation and student-led training reporters received when they joined the TV news staff.
The exercise of closing the loop is particularly important if the assessment being performed is summative, a five-to-seven-year program assessment of all the learning outcomes of the High Impact Practice. Not only will the report list three or four important improvements to be sought over the next few years, but it should also list recommendations that will form a Memorandum of Understanding (MOU) with the administrative supervisor over the program being assessed. The MOU is an agreement between the faculty and/or staff and the administrative supervisor (Director, Dean, Associate Dean, etc.) about the improvements to be achieved and a short strategy for how that will be accomplished. The MOU will also list any support that the administrative unit will provide in the form of increased budget, space, or staff additions to bring about those improvements.
The Benefits of Having Useful Assessment Data
When assessment results show students meeting or exceeding benchmarked standards, faculty advisers should have the opportunity to brag. Several years ago, the faculty adviser of our student newspaper was called into a meeting with the new provost of the university. The college dean was also in attendance. The provost said he had read several editions of the student newspaper and had found the subject matter, writing, and sourcing no different from the articles published in the daily newspaper of a nearby Research 1 university with several hundred student staff members. While the adviser secretly rejoiced at what he felt was a compliment, the provost’s comparison was a way of saying that he did not see the mission of the university reflected in the student articles or staff editorials. Instead, he said, the worldview reflected in the articles was secular rather than reflective of a Christian worldview. The adviser argued from the standpoint of what the assessment community calls “authentic” evidence: the outside-the-university confirmation of the student newspaper’s learning achievements. This included national awards and graduates working for the top newspaper and broadcast outlets in the nation. He also noted that several alumni had won Pulitzer Prizes and that many national rankings mentioned the university’s student news programs and journalism major. A wise, First-Amendment-savvy dean then intervened to explain that past university administrators (some reluctantly, it must be admitted) had blessed the student newspaper as a learning laboratory to train students to be light and life in the secular business of writing and broadcasting the news.
As a learning tool, the newspaper allowed students to "fail" (admittedly, within limits) and then to discuss their story with an offended coach, student, faculty member, or administrator, learning firsthand how their exercise of press freedom affected the people who were subjects of their reporting. The dean further explained that the adviser's role was to ensure students did not commit libel, to serve a consultant's role to ensure students maintained minimum standards of quality, and to engage student editors in conversations—sometimes before publication, but never with censorship as a possible action—about ethical issues that might arise from publishing a story or editorial in its current form. A key component of these values and ethics discussions was often a consideration of the university's mission and how it was tied to university learning outcomes.
A few months later the university president requested a meeting with the faculty adviser. The adviser made sure he brought with him assessment data that included the number of articles in the student newspaper that addressed religious issues and themes or included evangelical testimonial statements from students and teachers. He also presented recent assessment data showing how students were meeting liberal arts expectations for critical thinking and certain skills-related learning outcomes. The president appreciated the professional manner in which the faculty adviser had done his homework, and the credibility gained by sharing assessment data would be evident as the conversation continued. However, the president said he was concerned with some of the advertising in the paper and asked about the advertising policies. In particular, he had received negative feedback from a Board member regarding advertising for female "egg donors" and for a local pub's "study hours," which happened to correspond to the same time most restaurants hosted "happy hours." Professional advertising standards, and the student newspaper's advertising guidelines, permitted placement of the ads, but the president objected on theological grounds, noting that the ads encouraged the ethically questionable practice of egg donation and the drinking of alcohol. The adviser agreed that the learning outcomes of the student news program dealing with collaboration and sensitivity to community standards could be enhanced by discussing this issue with students. Not only did the president agree to meet with student editors for a further discussion, but he also provided students with additional leads for more "acceptable" advertising possibilities. That dialogue eventually led to the president and his top administrators meeting annually with student journalists for an "off the record" discussion about shared issues and concerns.
In this meeting, held a few days before the beginning of the school year, the president acknowledges the university's commitment to student learning and to allowing students to fail and grow. He also assures students that he and his administrators will return a student reporter's phone calls or answer emails as quickly as possible. This pre-emptive move has lessened tensions between students and administrators over a number of potentially controversial issues student journalists have tackled in the past few years.
Using Assessment Results for Recruitment and Public Relations
Some of the assessment data shared in those conversations with the provost, dean, and president was routinely reported each year on templates provided by our university's Office of Institutional Effectiveness. Many universities now compile assessment data into larger reports that are eventually posted online as evidence of student learning at the university. This transparency is said to help prospective students and their parents determine whether an academic unit offers the strong, rigorous teaching that will serve their life dreams and career choices. But is this enough? Do students and parents really visit our university's Institutional Effectiveness website to check on the learning outcomes of our High Impact programs? It is rare that prospective students and parents ask about the particulars an assessment report would explain, so the answer is, most likely, no.
Here’s another option, one that may help in student recruitment for extracurricular programs. Four years ago, Pepperdine’s faculty advisers decided to compile a State of the Student Media report. This proved to be a time-consuming undertaking, but one which has found favor among students, parents, faculty, and administrators. It is also available to potential students and serves as a recruiting tool. The State of the Student Media takes a narrative approach to highlighting program assessment data in its various forms. The introduction to the first issue succinctly states the purpose:
In this spirit, this document contains a wide spectrum of information, including background on each print and digital publication, staff demographic information, staff policies, personnel issues and challenges faced by the staff (both internal and external). Digital technology means monumental changes are often made efficiently and quickly. The State of [our student news media] document allows us to curate all of those changes (big and small) annually for future reference.
As an extracurricular program, [our student news media] functions independently of the classroom but is also an invaluable support of the journalism curriculum as a whole. In light of that complex relationship, the advisers wanted to create an official document that sums up the successes and challenges faced by the extracurricular. We hope that faculty and administrators can use this document as a tool and resource.
The document proceeds with a dozen pages of history of each of the extracurricular activities in the report: informal instruction in print and online news reporting and writing through the weekly printed newspaper and the 24/7 digital news site; a feature magazine published each semester; GNews, a short web video of the week's news posted along with the online news and analytics; The Pixel, a newsletter summary of the news and of events in the coming week delivered to student and staff subscribers each Monday morning; student use of social media tools to alert followers to breaking news and new editions of the printed paper; and advertising sales involving students in a variety of majors (for which students receive a commission). The report also highlights student awards from the Associated Collegiate Press, College Media Association, California College Media Association, and Society of Professional Journalists, plus any national scholarships or internships awarded to students during the past year, along with a short summary of the total number of such awards students have received since the beginning of the student journalism program. Strategically, the report provides a short summary of the equipment and technology students use to create and publish all of these award-winning news products. (The age and reliability of such tools is sometimes mentioned to prime the pump for future budget requests!) This summary is followed by charts listing the major equipment and software in use and their annual cost. Another section of the report lists the majors of students involved in the co-curricular, plus their ethnicity, gender, religious affiliation, and year in school. Assessment data and improvement plans for outcomes assessed through the co-curricular are included in the body of the document.
The document closes with reflections from the advisers, highlighting the best journalism of the year and the challenges faced by student editors and advisers as they reported on controversial stories on campus or provided opinions about controversial issues affecting the national dialogue. These issues, of course, include topics administrators would prefer not be discussed on a Christian college campus, such as abortion; theology and doctrine; support for Queer students, or lack thereof; political issues of all types; and on-campus controversies over student misdeeds, rising tuition, the lack of parking, cafeteria food quality, and a host of other newsworthy events. Appendices in the document include the primary responsibilities of the advisers, the number of hours they devote to the extracurricular in addition to teaching classes, their personal improvement goals for advising students, managing their time and budgets, and more.
After its completion each summer, the report is attached to a short email summary and sent to all university and college administrators and faculty leadership. Early in this 22-page document, the advisers note that the report is not confidential; it is meant to be read, discussed, and used for archiving as well as for assessment and planning. At his annual briefing with student journalists, the university president acknowledges that he has read the document and comments briefly on the issues he sees carrying over from the previous academic year. The document has also proven useful to Admissions Office recruiters seeking to impress potential students with news about student accomplishments.
Town Hall Meetings
In addition to the State of the Student News Media report, the co-curricular program at Pepperdine University began sponsoring town hall meetings on campus several times per semester. The idea arose from discussions among faculty and students about how the student media could help foster a campus-wide understanding of current events and student civic engagement. These goals were tied to student learning outcomes related to collaboration, critical thinking, problem solving, service learning, and civic engagement. The effort is also consistent with movements among professional journalists to consider ways reporters can help educate and heal the communities in which they live. Student news media leaders select a theme or issue, then assume responsibility for all aspects of execution and publicity. Themes have included: Deferred Action for Childhood Arrivals (DACA), Issues Facing Higher Education, Student Veterans, and Diversity on Campus. Costs associated with the town hall are minimal and covered by the student news media budget. The town hall receives extensive coverage in the student news media as a means of informing the student body about important civic and cultural issues.
Another way the faculty and student editors encourage leadership development among the students is to host what our university calls Club Convo, a small group Bible study for which students can receive chapel attendance credit. Topics often cover themes related to vocation and calling, encouraging student co-curricular leaders to consider not only the development of professional skills but the development of a worldview and ethical standing that helps them live out the university’s mission of purpose, service, and leadership.
Assessment holds a special place in the pedagogy of Christian universities, where professors and administrators are dedicated to educating the whole person, especially as spiritual beings who aim to be seekers of truth. The assessment of High Impact/extracurricular programs is an evolving process that can expand to include new ways of capturing evidence of learning and contract to eliminate methods that fail to yield helpful information. Through the examination of the assessment of one university's student newsroom, this paper shares a transferable model for general extracurricular assessment. Based on Kuh's conceptualization of High Impact Practices, this model begins with the formulation of learning outcomes that can be tied both to the most closely aligned major and, more broadly, to the university's mission statement. The model also includes the collection of direct, indirect, and authentic evidence and the inclusion of professionals and alumni in that process. Finally, this model examines how assessment evidence can be uniquely used in the recruitment of additional students and as a public relations tool throughout the university community.
Ken Waters, Ph.D., is professor of journalism at Pepperdine University. His research focuses primarily on evangelical publications, communication ethics and communication strategies of international nonprofit agencies.
Elizabeth Smith, Ed.D., is an assistant professor of Journalism at Pepperdine University. Her current research includes news literacy, communities of practice in student newsrooms, and social media and fake news. Elizabeth has worked in print, web, and broadcast journalism.
Notes
Molly Worthen, "No Way to Measure Students," New York Times, February 25, 2018, SR1.
See, for instance, A. W. Astin and A. L. Antonio, Assessment for Excellence: The Philosophy and Practice of Assessment and Evaluation in Higher Education, 2nd ed. (Lanham, MD: Rowman & Littlefield Publishers, 2012); Marilee J. Bresciani, Outcomes-Based Academic and Co-curricular Program Review (Sterling, VA: Stylus Publishing, 2006); Jayne E. Brownell and Lynn E. Swaner, Five High-Impact Practices (Washington, D.C.: Association of American Colleges and Universities, 2009); A. Herrington and J. Herrington, "What Is an Authentic Learning Environment?" in Authentic Learning Environments in Higher Education, eds. A. Herrington and J. Herrington (Hershey, PA: Information Science Publishing, 2007), 68-77; M. M. Lombardi, Authentic Learning for the 21st Century: An Overview, Educause Learning Initiative, 2007, retrieved from: http://net.educause.edu/ir/library/.
 Worthen, “No Way.”
 See, for instance, Jeffrey Scott Coker, Evan Heiser, Laura Taylor, and Connie Book, “Impacts of Experiential Learning Depth and Breadth on Student Learning Outcomes,” Journal of Experiential Education 40, no.1 (2017): 4-23.
George D. Kuh, High Impact Educational Practices: What They Are, Who Has Access to Them and Why They Matter (Washington, D.C.: Association of American Colleges and Universities, 2008), 14.
 Kuh does not dispute this contention, except to note that working on a student newspaper or debate squad or entertainment TV program, for instance, is not a college-wide learning opportunity and that research on the effectiveness of such programs is currently lacking. For more, see Kuh, page 19, and Sarah Stone Watt, “Authentic Assessment in Debate: An Argument for Using Ballots to Foster Talent-Development and Promote Authentic Learning,” Contemporary Argumentation & Debate (2012): 75-104.
 Ironically, a few years after creating the extracurricular learning outcomes based on the university’s PLOs, our assessment data showed that journalism majors who had worked in the student media as undergraduates developed far better skills than those who took only journalism classes on their way to graduation. We changed the graduation requirements to include at least two units of a practicum course but decided to leave the extracurricular learning outcomes linked to the university outcomes rather than create new learning outcomes that tied to the journalism major.
 Jon Mueller, “Authentic Assessment Toolbox,” accessed February 13, 2018, http://jfmueller.faculty.noctrl.edu/toolbox/whatisit.htm.
 Kuh, High Impact Practices, 27.
The best collection of nationally recognized rubrics is available from the Association of American Colleges and Universities at https://www.aacu.org/value-rubrics.
The process also calls for statistical calculation of inter-rater reliability, which is necessary for reporting results to accrediting agencies. We've found that a small group of professors teaching the same or similar content and scoring a manageable number of artifacts will naturally achieve an acceptable level of inter-rater reliability, so we compute this score selectively. Most books on statistical measures explain how to calculate inter-rater reliability.
 Matthew Feldmann, Jeffery P. Aper and Sam T. Meredith, “Co-curricular Assessment Scale Development,” Journal of General Education 60, no. 1 (2011): 18.
 Ibid., 16-42.
 See Karl R. Wirth and Dexter Perkins, “Knowledge Surveys: An Indispensable Course Design and Assessment Tool,” Proceedings: Innovations in the Scholarship of Teaching and Learning, St. Olaf College/Carleton College, April 1-3, 2005, https://www.macalester.edu/academics/geology/wirth/WirthPerkinsKS.pdf; Edward Nuhfer and Delores Knipp, “The Knowledge Survey: A Tool for All Reasons,” To Improve the Academy 21, no. 1 (2003): 59-78.