PACRAO Review September 2014 Edition

Volume 4 • Number 2 • September 2014

Download the entire issue: PACRAO Review September 2014

Moving Online: The Future of Course Evaluations
Laura Jacek, Ph.D., University of Oregon

“What are the concerns that loom so convincingly over the decision to go online with course evaluations? Are they valid? This article addresses areas about which institutions, faculty, and administrators may be concerned.”

 

Admissions Inside & Out: Tools for Continuous Growth
Michelle Taylor, California State University, Dominguez Hills

“Working in the Admissions Office presents a variety of situations and challenges. This article identifies tools that will complement current business processes, allow room for growth, and result in increased productivity.”

 

BYU’s Holistic Admissions Review: Getting Past Just Admitting Smart Students
Travis Blackwelder, Brigham Young University

“As the process of colleges establishing holistic admission policies continues to evolve, Brigham Young University places meaningful value on many nonacademic factors when rendering admission decisions. This article will explore two such components: the evaluation of extracurricular activities and the assessment of applicant essay content.”

 

Creating FERPA Training That is Fun, Educational, Responsive, Participatory, Assessable

Barry Allred and Jearlene Leishman, Brigham Young University

“While not all training programs need to look the same, the approach makes a difference in the learning process. This article discusses how Brigham Young University sought to increase FERPA compliance and awareness by leveraging key principles.”

 

 

Moving Online: The Future of Course Evaluations

Faculty and administrators can have long lists of concerns around online evaluations. These learned voices have a great deal of power to move an institution (or to keep it from moving), and though abundant literature with strong and consistent data on these issues exists, finding and compiling that data is time-consuming. This means that an institution often takes a leap into the unknown when it moves to online evaluations, or it may not make the switch at all.

So what are the concerns that loom so convincingly over the online decision? Are they valid? Each section below details one area about which institutions, faculty, and administrators may be concerned.

Myth #1: Paper Evaluations are More Accurate

There is a prevalent belief that paper evaluations come closer than online evaluations to a rating that most accurately reflects the quality of a faculty member’s teaching. If this were true, scores between paper and online evaluations would differ substantially. Burton et al. (2012) reviewed studies in the field of course evaluation and determined that, of the 18 studies they identified as measuring differences in quantitative feedback between paper and online evaluations, 14 reported no difference between the delivery methods and 2 reported slightly higher ratings online. Even in terms of qualitative feedback, studies tend to find that online formats garner more abundant, more positive, and more substantive comments than paper (Burton et al. 2012; Heath et al. 2007).

Myth #2: Paper Evaluation Populations are More Positive

Some faculty believe that by giving the evaluations in class, they are able to exclude poorer performing, lower-attendance students, who may evaluate them more harshly. They believe that poor students fill out evaluations more often in an online format, simply because they are given the opportunity to do so. An additional concern is that students who expect lower grades may rate a professor as less competent, less interesting, and so on, than do students expecting higher grades.

Neither of these ideas is supported in the literature. Course and teacher ratings are not related to student attendance (Ardalan et al. 2007; Perrett 2013), and students with a higher GPA complete online evaluations at over twice the rate of students with a poor GPA (Thorpe 2002). Students expecting higher grades also evaluate at a higher rate (Adams and Umbach 2012). Even if this were not the case, students expecting poor grades in a class are no more likely to score an instructor below the class mean than students expecting good grades (Avery et al. 2006; Thorpe 2002).

Myth #3: Paper Evaluations Result in More and Higher Quality Feedback

There is a perception that fewer comments are offered in an online format, and that those comments are of poorer quality. Additionally, because response rates are often lower in an online format, any drop in qualitative response would be especially detrimental.

Many studies have examined differences in the qualitative feedback provided by paper vs. online evaluations. Contrary to expectation, almost all show that a higher percentage of students who respond to online evaluations include qualitative feedback (Donovan et al. 2006; Heath et al. 2007; Kasiar et al. 2002; Laubsch 2006). The amount of that qualitative feedback is also greater online than on paper, often by a wide margin (Burton et al. 2012; Heath et al. 2007; Kasiar et al. 2002; Hardy 2003).

Few studies offer actual statistics on the percentage of students commenting online vs. on paper; most simply state that there are “more” words, or “7 times more words,” and so on. In the table below, those data that are available are shown and averaged. Note that Hardy’s 2003 study listed three online rates and two paper rates; those are averaged here.

[Chart: Percent of Students Adding Comments, Online vs. Paper Rates]

Considering the higher percentage of students commenting online, an institution’s paper response rate would have to drop by almost half in the move online before the sheer number of comments fell below its norm (assuming average response rates for that institution, and average comment rates as shown above). While response rates do tend to drop with a move to online, the average drop is nowhere near half. So in moving online, the probability that an institution will see a lower percentage of students commenting is almost nil. The probability that it will see a drop in the number of comments, regardless of response rate, is also very low.
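To make the arithmetic concrete, here is a minimal Python sketch of the break-even calculation. The rates below are hypothetical placeholders, not the averaged figures from the chart above; the point is simply that when online respondents comment at roughly twice the paper rate, the response rate can fall to about half the paper rate before total comment volume dips below the paper norm.

    # Back-of-the-envelope check of the comment-volume claim above.
    # All rates are HYPOTHETICAL placeholders; substitute your own figures.
    paper_response_rate = 0.75    # share of enrolled students returning paper forms
    online_response_rate = 0.55   # share responding online (a typical drop)
    paper_comment_rate = 0.30     # share of paper respondents who add comments
    online_comment_rate = 0.60    # share of online respondents who add comments

    # Expected comments per 100 enrolled students under each format.
    paper_comments = 100 * paper_response_rate * paper_comment_rate      # 22.5
    online_comments = 100 * online_response_rate * online_comment_rate   # 33.0

    # Online response rate at which comment volume merely matches paper.
    breakeven = paper_response_rate * paper_comment_rate / online_comment_rate
    print(paper_comments, online_comments, breakeven)   # 22.5 33.0 0.375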

Researchers have also examined the ratio of positive to negative statements in evaluations. Studies show that this ratio does not differ between paper and online formats (Ardalan et al. 2007; Hardy 2003; Venette et al. 2010). Perhaps most importantly, several studies have examined the quality of the comments submitted through both formats and found that online comments were more substantive and informative, as defined by more words per comment, more descriptive text, and more detailed feedback (Ballantyne 2003; Burton et al. 2012; Collings and Ballantyne 2004; Donovan et al. 2006; Johnson 2002).

Myth #4: Online Evaluations Have Lower Return Rates than Paper

Are return rates better on paper than online? Studies comparing online vs. paper evaluations find that online evaluations generally have lower response rates, barring incentives and interventions (e.g., reminder messages, rewards). How much lower is a matter of debate. In refereed academic papers published since 2005 where no incentives or interventions were explicitly listed, and where paper comparison rates were available, paper response rates averaged 11-14% higher than online rates. In a real-world comparison of universities around the US, the difference averaged 11% (with no incentives). Adding incentives can boost response rates, depending upon which incentives or interventions are used, by 7-25% (Ravenscroft and Enyeart 2009; Norris and Conn 2005; Johnson 2002). Considering that many institutions using online evaluations employ incentives, interventions, or both, the real-life difference in response rates between paper and online formats can be made negligible.

Myth #5: Online Response Rates Will Not Be High Enough to Have Statistical Validity

How much is enough? What response rates are necessary to achieve statistical validity? Online evaluations, in the absence of incentives and interventions, can on average achieve a 60% or higher response rate. Is a response from 60% or more of a course acceptable as a gauge of the entire enrollment? Nulty (2008) looked at exactly this issue. He used and justified an 80% confidence interval for his calculations, and through a number of assumptions and corrections for bias, concluded that classes under 20 students need a minimum 58% response rate to be considered valid. Courses with more than 50 enrollees can use 35% as their bar, and larger classes have even smaller acceptable rates. While Nulty outlines some cautions and has some confounding variables in his data, his overall conclusions support online evaluation response rates as statistically acceptable.
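For readers who want to experiment with the underlying math, the sketch below shows a standard finite-population sample-size calculation at roughly 80% confidence. This is a generic textbook formula, not Nulty’s exact procedure; his published thresholds (58% for a class of 20, 35% for 50+) rest on additional assumptions and bias corrections, so the outputs here differ somewhat.

    # Generic required-response-rate estimate for a class of a given size,
    # using the finite-population correction. NOT Nulty's exact method;
    # he layered further bias corrections on top of formulas like this.
    import math

    def required_response_rate(class_size, z=1.28, error=0.10, p=0.5):
        """Fraction of a class that must respond for +/-`error` precision;
        z = 1.28 corresponds to roughly 80% confidence."""
        n0 = (z ** 2) * p * (1 - p) / error ** 2   # infinite-population sample size
        n = n0 / (1 + (n0 - 1) / class_size)       # finite-population correction
        return min(1.0, math.ceil(n) / class_size)

    for size in (10, 20, 50, 100, 300):
        print(f"class of {size:>3}: {required_response_rate(size):.0%} needed")

As in Nulty’s tables, the required rate falls sharply as class size grows, which is why large courses can tolerate much lower response rates.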

Why Faculty Buy-In Matters

If the myths outlined above are false, and a college or university can realize adequate response rates, accurate feedback, substantive feedback, and statistically reliable data through online evaluations, why do some institutions still fail to do so?

The answer can be found in one word: attitude. At institutions where evaluation is taken seriously by the administration and the faculty, students feel that their feedback matters and respond accordingly. Student perception is key to gaining the participation of the student body, and it does not happen in the absence of faculty support. Many students surveyed believe that faculty do not take evaluations seriously and do not make changes as a result of the students’ reviews (Nasser and Fresko 2002). In their 2006 study, Anderson et al. achieved online rates in excess of 80% during their pilots. Those rates fell substantially when the project was opened campus-wide. In other words, when the pilot was conducted in classes where the faculty were involved in the development of the project and invested in its success, rates soared. When the general population of faculty was required to use the same system, rates fell.

Best Practices

When you’ve made your decision and feel you’re ready to move forward, here are some tips that will help increase the odds of a smooth transition and a good outcome.

Choose one person, or a small group of people, to champion this project and move it forward. They need to be the “face” of the project, willing to talk to administrators, visit chairs, speak at departmental staff meetings, and generally educate the faculty and administrators of your institution about online evaluations and the value they can bring.

Regardless of whether you’re going to create your own system or purchase one from a vendor, involve IT from the beginning. Figure out up front where the responsibility for course evaluations and the upkeep of the system will land. Plan for success, prepare for bumps. Plan for incentives, faculty training, and website improvement. Test thoroughly, and run at least one pilot prior to institution-wide roll-out.

Don’t rest on your laurels. Once your system is up and purring happily, don’t walk away. Watch your response rates. Work to tweak reminder notices and/or other incentives to best motivate your audiences – both students and faculty. Build a website for frequently asked questions, at the very least. Talk with your faculty and students to maintain and improve the culture of evaluation at your institution. To have a truly successful online system, work to keep your institution focused on the idea that course evaluations are valuable – regardless of whether you’re the person filling them out, or the one reading them.

References

  • Adams, M. and P. Umbach. 2012. Nonresponse and online student evaluations of teaching: Understanding the influence of salience, fatigue, and academic environments. Research in Higher Education, 53: 576-591.
  • Ardalan, A., R. Ardalan, S. Coppage, and W. Crouch. 2007. A comparison of student feedback obtained through paper‐based and web‐based surveys of faculty teaching. British Journal of Educational Technology, 38(6).
  • Avery, R. J., W. K. Bryant, A. Mathios, H. Kang, and D. Bell. 2006. Electronic Course Evaluations: Does an Online Delivery System Influence Student Evaluations? Journal of Economic Education, 37(1): 21-37.
  • Ballantyne, C.S. 2003. Online evaluations of teaching: An examination of current practice and considerations for the future. In D. L. Sorenson and T. D. Johnson (Eds.), New Directions for Teaching and Learning #96: Online students ratings of instruction (pp. 103-112). San Francisco, CA: Jossey-Bass.
  • Burton, W., A. Civitano, and P. Steiner-Grossman. 2012. Online vs. paper evaluations: differences in both quantitative and qualitative data. Journal of Computing in Higher Education, 24(1): 58-69.
  • Collings, D., and C. Ballantyne. 2004. Online student survey comments: A qualitative improvement? Paper presented at the 2004 Evaluation forum, Melbourne, Australia. Retrieved January 9, 2013 from: http://our.murdoch.edu.au/Educational-Development/_document/Publications/Eval_forum_paper.pdf
  • Donovan, J., C. E. Mader, and J. Shinsky. 2006. Constructive student feedback: Online vs. Traditional course evaluations. Journal of Interactive Online Learning, 5(3): 283-295.
  • Handwerk, P., C. Carson, and K. Blackwell. 2000. On-line vs. paper-and-pencil surveying of students: A case study. Paper presented at the 40th Annual Meeting of the Association of Institutional Research, May 2000. Retrieved July 7, 2014 from: http://files.eric.ed.gov/fulltext/ED446512.pdf
  • Hardy, N. 2003. Online ratings: fact and fiction. New Directions for Teaching and Learning, 96: 31-41. Retrieved July 2, 2012 from: https://wiki.albany.edu/download/attachments/20415113/_Online+Ratings+Fact+and+Fiction_Hardy_2003.pdf
  • Heath, N. M., S. R. Lawyer, and E. B. Rasmussen. 2007. Web-based vs. paper-and-pencil course evaluations. Teaching of Psychology, 34(4). Retrieved May 7, 2013 from: http://www.isu.edu/psych/Articles/Rasmussen/Web%20vs%20Paper%20Course%20Evals%202007.pdf
  • Johnson, T. 2002. Online student ratings: Will students respond? Paper presented at the annual meeting of the American Educational Research Association, New Orleans, 2002. Retrieved May 7, 2013 from: http://www.armstrong.edu/images/institutional_research/onlinesurvey_will_students_respond.pdf
  • Kasiar, J. B., S. L. Schroeder, and S. G. Holstad. 2002. Comparison of Traditional and Web-Based Course Evaluation Processes in a Required, Team-Taught Pharmacotherapy Course. American Journal of Pharmaceutical Education, 66: 268-270.
  • Laubsch, P. 2006. Online and in‐person evaluations: A literature review and exploratory comparison. Journal of Online Learning and Teaching, 2(2). Retrieved May 7, 2013 from: http://merlot.org/Vol2_No2_Laubsch.htm
  • Layne, B. H., J. R. DeCristoforo, and D. McGinty. 1999. Electronic vs. traditional student ratings of instruction. Research in Higher Education, 40: 221-232.
  • Nasser, F., and B. Fresko. 2002. Faculty Views of Student Evaluation of College Teaching. Assessment and Evaluation in Higher Education, 27(2): 187-198.
  • Norris, J., and C. Conn. 2005. Investigating Strategies for Increasing Student Response Rates to Online-Delivered Course Evaluations. Quarterly Review of Distance Education, 6: 13-29.
  • Nulty, D. 2008. The adequacy of response rates to online and paper surveys: what can be done? Assessment and Evaluation in Higher Education, 33(3): 301-314. Retrieved May 7, 2013 from: http://public.clunet.edu/~mondsche/misc/Nulty.pdf.
  • Perrett, J. 2013. Exploring graduate and undergraduate course evaluations administered on paper and online: A case study. Assessment & Evaluation in Higher Education, 38(1): 85-93.
  • Ravenscroft, M., and C. Enyeart. 2009. Online Student Course Evaluations: Strategies for Increasing Student Participation Rates: Custom Research Brief. Education Advisory Board, Washington D.C. Retrieved June 20, 2012 from: http://tcuespot.wikispaces.com/file/view/Online+Student+Course+Evaluations+-+Strategies+for+Increasing+Student+Participation+Rates.pdf.
  • Thorpe, S. W. 2002. Online student evaluation of instruction: An investigation of non-response bias. Paper presented at the 42nd annual Forum for the Association for Institutional Research, Toronto, Ontario, Canada.

 

Laura Jacek, Ph.D. is Assistant Registrar for Operations at the University of Oregon in Eugene, Oregon. She worked as an institutional researcher for 13 years before moving into the Registrar’s office at the University of Oregon, where she keeps the course evaluation system running smoothly. She has given several presentations on different aspects of course evaluations and continues to do research in that area. She earned her B.A. from California State University, Sacramento; M.A. from San Diego State University; and Ph.D. from Oregon State University, all in the field of Geography.

Editor’s Note: An expanded version of this article appeared previously in AACRAO’s College and University (Vol. 89 No. 2 Winter 2013, pp. 12-21).

Admissions Inside & Out: Tools for Continuous Growth

Working in the Admissions Office presents a variety of situations and challenges. For this reason, it is important to identify tools that will complement current business processes, allow room for growth, and result in increased productivity. This prompts the question, “How can we do more with less?”

This article highlights key tools I have used in the CSU-Dominguez Hills Admissions Office to manage workflow in a way that results in goal achievement as well as staff satisfaction. These tools are centered on the following areas: leadership, building bridges, inclusivity/transparency, problem-solving, resources, accountability, discipline, and professional development.

While it may seem as if each of these strategies by itself is very basic, I believe the combination of them in a unified approach is what makes them work extremely well. This structure allows the needs of the office to be identified while ensuring that each team member is aware of the importance of their contribution. This big-picture awareness allows the admissions staff to see how they contribute to the larger goal of accurately and efficiently admitting students while providing excellent customer service to the University as a whole.

General information regarding California State University – Dominguez Hills and the Office of Admissions & Records:

  • Part of the 23-campus California State University System
  • Located in Carson, CA (approximately 17 miles from downtown Los Angeles)
  • Public University with undergraduate and graduate programs
  • Approximately 12,000 undergraduate & 2,300 graduate students
  • 13 Full-Time Staff Members; 3 Part-Time Student Employees; 1-2 Occasional/Seasonal Full-Time Temporary Employees

Leadership: Building Office Morale

Leadership includes the effort to assure each employee that they are an integral part of the Admissions Team. Each year, the team gathers for an On-Campus Office Retreat. Time is spent in break-out groups brainstorming areas in which improvements can be made. The entire group then reconvenes to discuss suggestions from each small group, resulting in recommendations to management. The result of this effort is that team members feel their input is valued, especially when a recommendation is implemented. Additionally, the entire Admissions Team gathers during the lunch hour twice each year for an All Admissions Lunch, and once each year for a Management Thank You Lunch. These gatherings are rare opportunities for the entire team to convene at a single point in time to connect and share ideas. Additional recognition and encouragement efforts include birthday cards signed by each team member, group birthday song serenades, a Thanksgiving Potluck, and an annual Holiday Party.

Leadership also includes communication, communication, communication … empowering staff with knowledge and tools. The admissions team meets for a staff meeting every Monday at 10 am; meetings typically take approximately 45 minutes. The agenda always includes reports from our Staff Committees, Professional Development Reports, and a Motivational Reading or Ice Breaker, along with updates for each active term, including important dates, deadlines, activities, and other relevant information.

Also important for leadership is arming the team with the necessary data. I have created an Admissions Evaluator Report: an all-inclusive, sortable spreadsheet with live application data that includes each applicable field from the Application for Admission. It is used by the entire admissions team to prioritize workflow, make workflow decisions, and provide the status of progress toward the completion of admission reviews. This allows staff to see the big picture of all applicants, but drill down to the specific students they need to focus on at any given time. Having unlimited, on-demand access to the Admissions Evaluator Report gives each team member the often elusive feeling of control and empowerment.
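As an illustration of the kind of tool described above, here is a minimal sketch of a sortable evaluator report in Python with pandas. The field names and statuses are hypothetical; a real report would be fed by live application data from the student information system.

    # Minimal sketch of an "evaluator report"; fields are hypothetical.
    import pandas as pd

    applicants = pd.DataFrame([
        {"app_id": 1001, "level": "UG", "status": "complete",     "gpa": 3.4, "assigned_to": "Lee"},
        {"app_id": 1002, "level": "UG", "status": "missing_docs", "gpa": 2.9, "assigned_to": "Kim"},
        {"app_id": 1003, "level": "GR", "status": "complete",     "gpa": 3.8, "assigned_to": "Lee"},
    ])

    # Sortable view for prioritizing workflow (here: incomplete files first).
    report = applicants.sort_values(["status", "gpa"], ascending=[False, False])

    # Progress toward completion of admission reviews, per evaluator.
    progress = (applicants.groupby("assigned_to")["status"]
                .apply(lambda s: (s == "complete").mean()))
    print(report)
    print(progress)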

Building Bridges … Over 400 hours of instruction

In an effort to build bridges with other offices on campus, the admissions team has provided more than 400 annual hours of instruction regarding admissions processes. The admissions team has taken an active role in training the following offices so they can relay accurate information to both prospective and denied students regarding admission criteria and application procedures: Academic Advisement, Athletics, Counselors, Customer Service Representatives, Outreach Officers (Transfer Center), Student Information Services, and Student Success Coordinators. In addition, an Internal FAQ has been created with answers to the 100 Most Frequently Asked Admissions Questions. Further, we have taken time to develop relationships with the following areas: Academic Departments, Financial Aid, Housing, Information Technology, International Advisor, Student Groups, Orientation (NSO Help Desk), Student Affairs, Veterans Affairs (Vet Net Allies), Vice-President’s Office, and the campus Webmaster.

Inclusivity/Transparency

The Admissions Office does not have a hidden agenda. I believe it takes a village to serve the applicant pool. For this reason, invitations to the weekly admissions staff meetings are extended to members of the Customer Service Unit, Educational Opportunity Program (EOP), Outreach Office, and the University Advisement Center. Additional attendees are welcome, as applicable. An annual campus-wide admissions update presentation is also offered. It begins with an introduction and summary of the past year, followed by breakout sessions covering detailed information regarding Graduate Admissions, International Admissions, Residency, Transfer Credit, and Undergraduate Admissions. Finally, it is important to include Student Employees and Temporary Employees in training and professional development opportunities. This inclusion allows casual employees to feel like a genuine part of the office.

Proactive Problem Solving

Extraordinary efforts are made by the admissions team to make problem-solving easy through consistent policies and procedures. This includes cross-training for all admissions team members within each unit. This does not mean that all staff are responsible for the exact same workload. Instead, it means that if one area is more caught up than another, team members are able to pitch in (as necessary) to meet the annual goals of the office.

Additional efforts are placed toward providing clear and up-to-date website information (online information management tracking for approximately 20 web pages) and the development of more than 150 Business Process Guides (BPGs). BPGs are definitive written instructions that help employees rely on standard operating procedures rather than on word of mouth. Included in each BPG are a creation date, documented updates, and archived versions of prior guides. Consistent, documented policies and procedures assist employees with proactive problem solving by giving them the confidence necessary to make quality decisions.

Resources

We believe that a job can be done efficiently and accurately only if the entire staff is provided with the necessary tools. For this reason, the admissions team has been provided with many resources that are easily accessible. The list begins with a shared online drive containing an Admissions Calendar, a Department Calendar, and the more than 150 BPGs. The Admissions Calendar provides a day-to-day list of processes occurring throughout the office, along with an annual history of these items for the past 3 years. Next is a Department Calendar for the entire Admissions & Records Office, which primarily includes deadlines and mass mailings/notifications.

We also provide the Admissions Team with numerous additional resources. There is an internal library of resource books, accumulated over time; the library is reviewed each year, new books are ordered, and contents are cataloged on the shared drive. Access to online resources is also provided, especially publications that assist with international admission evaluations: AACRAO EDGE, IERF, WES, and recorded/written presentations & webinars. In addition, a plethora of online information from the Chancellor’s Office is available: CSU Admission Handbook, Residency Handbook, Coded Memos, and Executive Orders. Each of these resources further empowers the admissions team to make consistent decisions, resulting in a fair and accurate admissions review process. Finally, the CSU-Dominguez Hills website, along with reports for specific populations of students (PeopleSoft queries: Appeals, Exception Admits, Miscellaneous Documents, etc.), has allowed further development of an admission system that works.

Providing these resources and access to shared and online tools empowers staff to make thoughtful and consistent decisions.

Accountability & Discipline

The importance of both upward and downward communication, especially regarding professional expectations, cannot be overstated. For this reason, there are multiple opportunities for the admissions team to both hear and be heard. Management opens the door for communication, and employees are strongly encouraged to walk through it. Not all team members take advantage of every opportunity, but the hope is that, over time, a relationship will be built. The stronger the relationship, typically, the more hard-working and loyal the team member becomes to the team and the institution.

Accountability is attained via daily tallies. Each team member is provided with an identical standard list of responsibilities and is required to tally their productivity throughout the day, not unlike how a lawyer bills hours. The result allows management to know how much work a particular team member or unit is capable of performing over time. This history allows for more accurate workload planning and time management. Based on the tally totals, we are also able to report an annual summary of work that would otherwise not be easily tracked.
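A minimal sketch of how such daily tallies might be rolled up, assuming a simple log of (date, team member, task, count) records; the names and tasks below are invented for illustration.

    # Roll daily tally records up into per-person, per-task totals so
    # management can see throughput over time and plan workloads.
    from collections import defaultdict

    tallies = [  # (date, team_member, task, count) -- illustrative data
        ("2014-09-02", "Amber",  "transcripts_evaluated", 18),
        ("2014-09-02", "Amber",  "phone_calls", 7),
        ("2014-09-02", "Jordan", "transcripts_evaluated", 22),
        ("2014-09-03", "Amber",  "transcripts_evaluated", 20),
    ]

    totals = defaultdict(int)
    for date, member, task, count in tallies:
        totals[(member, task)] += count

    for (member, task), count in sorted(totals.items()):
        print(f"{member:<8} {task:<24} {count}")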

Team members are also required to meet with management individually every other month. During this meeting the discussion is centered on current workload/assignments, notable errors since the last meeting, professional development opportunities, as well as progress toward annual goals. Finally, this is an opportunity for management to give accolades for a job well done.

Another important source of feedback is probationary evaluations (at 3, 6, and 11 months during the first year) and annual evaluations. Evaluations reflect job strengths, job weaknesses, required improvements (with timelines provided), goals, and progress achieved since the last evaluation. These evaluations are more than perfunctory. They are meaningful and give the team member guidelines and the opportunity to correct possible negative qualities and prevent a write-up and/or further disciplinary action.

Finally, accountability and discipline must include well-documented disciplinary write-ups. Simply informing employees that they are not performing at an acceptable level is not enough. Admissions team members are provided with specific documentation and examples regarding areas that are not up to par. Copies of errors in student records are provided along with a written summary outlining the expectation and how it was not met. Team members are also given a timeline in which to make the required improvements. As a result, over the past five years, there has been only one case that escalated to the point where a grievance hearing with the team member’s union representative was required. All other circumstances have resulted in the necessary improvements to get performance back on track.

Professional Development

The final component, but certainly not the least important, is professional development. Each member of the Admissions Team is required to participate in at least one individual professional development opportunity each year. The main source for these individual opportunities is Lynda.com. Team members have also attended sessions coordinated by Fred Pryor Seminars/CareerTrack and the CSUDH Human Resources Office (Connections and New Horizon Software Workshops).

This is in addition to the many opportunities the Admissions Team participates in as a group. Collectively, the Admissions Team participated in approximately 600 training opportunities in 2013. Examples of group training sessions include Data Entry Proficiency, Data Security & Privacy, Excel, Information Security Training, Mental Health First Aid, Military Transcripts (JST), Quality Service Training for Admission, Transcript Authenticity & Verification, and Transfer Credit Update.

Summary

I hope this article has highlighted some key tools for managing workflow in a way that results in goal achievement as well as staff satisfaction. I hope the structure I have shared will complement your current processes and increase productivity… and ultimately allow your team to do more with less.

When management identifies the needs of the office while ensuring that each team member is aware of their contribution, success can be more easily achieved. This big-picture awareness allows staff to see how they contribute to the larger goal of accurately and efficiently admitting students while providing excellent customer service to the University as a whole.

Leadership, building bridges, inclusivity/transparency, problem-solving, resources, accountability, discipline, and professional development are keys that can lead to the success of your team.

Although it may seem that each of these strategies individually is very basic, I believe that combining them into a unified approach is what unlocks their potential and makes them work extremely well.

 

Michelle Taylor is a veteran admissions professional who has worked in the field for more than twenty-five years. She received her M.A. in Communication Management from the University of Southern California. Currently, Michelle is the Associate Director of Admissions at California State University – Dominguez Hills. She began her career in International Admissions and has since worked in graduate admissions, with professional academic programs, and has extensive admissions experience with both private and public education. In addition to her admissions career, Michelle is an outdoor enthusiast who enjoys golf, hiking, and tennis. She currently resides in Orange County, California and can be contacted at mtaylor@csudh.edu.

Author’s Note: I would like to acknowledge that Khaleah Bradshaw, who is no longer working in the CSU-Dominguez Hills Office of Admissions & Records, was a co-presenter for the original presentation at the AACRAO Annual Meetings in both San Francisco and Denver. 

BYU’s Holistic Admissions Review: More than Grades and Test Scores

The concept of “best fit” resonates across college and university campuses throughout the country. Research shows that 77% of college freshmen applied for admission to at least three colleges or universities, and more than 28% of students submitted seven or more applications (NACAC, 2014). But students are not alone in hoping for the best fit in a particular college environment. Increasingly, colleges are concerned with identifying, admitting, and enrolling students who will ultimately represent a particular campus. Growing attention to selectivity and yield rates has contributed to wide implementation of holistic admission review philosophies. Application essays, recommendations, extracurricular evaluations, student portfolios, and the like are just a few of the methodologies today’s admission offices utilize. The goal, ideally, is a dynamic merger of student and campus in which both parties are enriched.

BYU Institutional Background

Owned and operated by the Church of Jesus Christ of Latter-day Saints (LDS), Brigham Young University occupies a unique niche in the landscape of higher education. The Provo, Utah, campus is home to more than 30,000 students. While 98% of the student body are members of the LDS church, every student participates in ecclesiastical interviews as part of the initial application process and then annually as a condition of continuing enrollment. The endorsement is a fundamental factor in admission and subsequent matriculation. However, other academic and non-academic elements are included in BYU’s admission evaluation. While holistic admission policies have become fashionable and productive at many colleges and universities, BYU, established in 1875, has long been mindful of more than academic merit.

BYU’s Challenge

Quantity of Applications. As a private, faith-based university, “best fit” is an especially important component in admission selection. Specifically, adherence to the University’s Honor Code, academic excellence, and social influence are core areas wherein applicants can demonstrate an ability to assimilate and thrive at Brigham Young. It’s the charge of the University to attract, admit, and retain these students. With a firm enrollment cap, at a time when competition to attend BYU has never been greater (13,000 fall applicants, 54% admission rate, 28.8 ACT average, U.S. News’ Most Popular University title four of the last six years), the holistic review has taken on an increasingly significant role.

Over the last decade, BYU has seen over a 50% increase in applications, from roughly 8,500 in 2005 to 13,000 in 2014. Utah, BYU’s top feeder state, expects a 31% increase in the number of high school seniors by 2022, the second largest gain in the country (NACAC, 2014). Other states ranking high for an expected increase in high school seniors also place as top 10 feeders to BYU (Nevada, #1 for expected increase – 35%; Texas, #3 – 28%; Colorado, #4 – 25%). Additionally, there is similar growth expected in the U.S. LDS population of high school seniors. The current and projected rise in the number of BYU applicants places added emphasis on the admission selection process.

Quality of Applications. While recognizing the demand for admission to BYU, the quality of academic achievement appears to be mirroring the increase in application quantity. For example, Table 1 compares the fall 2007 admitted class with the recently admitted class of 2014.

[Table 1: Academic comparison of the fall 2007 and fall 2014 admitted classes]

While the increase in academic achievement is noticeable, BYU recognizes there is more to the college experience, especially at a faith-based institution where overall campus engagement is a priority. The University’s admissions process should – and does – reflect the need to consider other strengths and potential contributions outside of the classroom.

Non-Academic Admissions Factors

While both the sheer quantity and academic quality of applications to BYU continue to surge, the University’s mission remains closely tied to its sponsoring organization. Ecclesiastical fit is the primary factor in admission consideration. In this regard, grades and test scores matter little if the ecclesiastical endorsement indicates a worrisome match.

But perhaps more applicable to the general population of college and university campuses across the country, other important variables are strongly considered in BYU’s holistic admission evaluation. The balance of this article will focus on two specific factors: evaluating extracurricular activities, and assessing the essay section of the student application.

Extracurricular Activities. According to the NACAC 2013 State of College Admissions report, the top 10 most important factors in rendering college admission decisions were the following:

  1. Grades in college prep courses
  2. Strength of curriculum
  3. Admission test scores (ACT, SAT)
  4. Grades in all courses
  5. Essay or writing sample*
  6. Student’s demonstrated interest*
  7. Counselor recommendation*
  8. Teacher recommendation*
  9. Class rank
  10. Extracurricular activities*

While the top four factors are academic indicators, five of the next six elements (noted by asterisks) take very different variables into account. The weighting of extracurricular activities varies from campus to campus, as does the method of evaluating such activities. The Common Application, for example, asks applicants to list extracurricular activities in the order of interest to the applicant. This exercise is mostly free-flowing and dependent upon the applicant’s ability to prioritize and weight activities; more importantly, the applicant must also discern what would be most impressive to the colleges. BYU’s model differs in that the University lists 63 specific activities, divided by activity type (Athletics, Service, Employment, Military, Music/Performing Arts, etc.). The applicant is invited to review the list and simply check the appropriate box for each activity in which he or she has participated. The 63 selected activities are meant to align with the University’s aims and mission. Here is an example of the School Leadership section of the BYU extracurricular activities review:

School (These may apply to high school or college)

  • Served as an officer of an official school club
  • Served as Chief Editor of school website or other major publication
  • Served as captain of a varsity athletic team
  • Served as student body officer of entire school
  • Served as class president (of freshman, sophomore etc. class)
  • Served as student body president of entire school

For some activities, if the box is checked, a pop-up window will appear, asking for further information in an effort to provide additional context.

While it may seem an exercise in “whoever checks the most boxes wins”, BYU recognizes that not all activities are created equal. Depth will outweigh breadth. Focused and singular accomplishments in a particular domain will be more meaningful than surface involvement in several types of activities. Leadership opportunities are especially commendable.

In addition to simply checking boxes, the applicant is also asked to complete a Noteworthy Accomplishments section, with the following instructions:

“You may use the following boxes to either expand upon a listed activity or introduce an unlisted experience. This is by no means required, but it can only serve to your advantage to complete. We encourage you to write about experiences that are meaningful to you personally. Please include the months and years of participation and the total hours spent, and limit explanations to no more than 100 words or 500 characters. Anything in excess will not be visible.”

 

A series of five free-writing sections allows the applicant to provide, in 100 words or less, additional information regarding specific activities of their choosing. By combining the “checked boxes” approach with the “mini-essay” approach, BYU evaluates extracurricular involvement at both quantitative and qualitative levels. The University’s application readers see both the checked boxes and the written comments as the application is evaluated.
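To illustrate how the two approaches can sit side by side in a single application record, here is a minimal Python sketch; the field names and activity keys are hypothetical, not BYU’s actual schema.

    # Checked boxes (quantitative) plus noteworthy-accomplishment notes
    # (qualitative) in one record; all fields are hypothetical.
    application = {
        "activities": {
            "officer_school_club": True,
            "varsity_team_captain": True,
            "student_body_president": False,
        },
        "noteworthy": [  # up to five free-writing entries, <= 100 words each
            {"activity": "varsity_team_captain",
             "months": "2012-08 to 2014-05", "hours": 400,
             "note": "Organized and led a team service project."},
        ],
    }

    # Quantitative view: which boxes were checked.
    checked = [k for k, v in application["activities"].items() if v]
    print(len(checked), "activities checked:", checked)

    # Qualitative view: the applicant's own words, shown to readers.
    for item in application["noteworthy"]:
        print(f"- {item['activity']}: {item['note']}")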

The Importance of Essays. As previously noted in the NACAC study, applicant essays are the leading non-academic factor in a college’s admission decision. Of the responding colleges, 58% listed essays as having “considerable” or “moderate” importance. For BYU, the essays are used less as an indicator of college-writing proficiency and more as a glimpse into the applicant’s background and worldview. When reading essays, BYU instructs its evaluators on the principle of “content over form.”

BYU typically asks applying freshmen to answer three questions. During the fall 2014 cycle, freshmen answered an essay question about a trial or character-building experience they have shouldered. The second question addressed specific reasons for applying to BYU. The third essay simply asked if there was anything else the applicant wanted the admissions committee to know and consider.

Each essay provides its own meaning in the BYU admissions process. The character-building essay can, among other things, provide context for perceived deficiencies in the application. Perhaps there was a noticeable absence in the depth and breadth of extracurricular activities, but the essay reveals significant health struggles or multiple moves by the family. The last essay (the Anything Else? essay) allows the applicant to address a myriad of potential topics, depending on what’s important to the individual. Students have often provided more information about an endured trial. Some applicants share further insights into an absorbing extracurricular activity. Others have submitted poetry, original music compositions, a short story, and so on. Some of the admissions committee’s most memorable essays are derived from the Anything Else? prompt.

Regarding the second essay, which addresses why applicants are applying to BYU, early results from longitudinal institutional research (BYU, 2010) show a correlation between high levels of campus engagement and the reasons identified for selecting BYU as a preferred choice for college. While there may be many reasons for choosing to apply to BYU, some are more meaningful than others. As members of the admissions committee read and evaluate this particular essay, alignment with institutional values is stressed. When alignment occurs, there is an increased likelihood of a more robust campus experience, in and out of the classroom. When campus engagement is a goal, the admissions application can be an instrument to discern the student’s potential to contribute to that end.

Conclusion

The idea that the college years are more than simply earning grades and diplomas would likely resonate with most people. If the college experience itself is more than an exercise in academic prowess, the admissions process to attend college should reflect as much. The holistic nature of BYU’s admissions process has allowed access for students who normally would not be competitive for admission based on standard academic measures, yet are predicted to contribute to the campus in meaningful ways. While the student may feel fortunate to be admitted, the University certainly feels privileged to welcome those who manifest potential to make a significant impact on campus and to appreciate their time in its distinctive environment. Of the various rankings assigned to the BYU experience, it’s perhaps the institution’s positions in the top 10 of If I Could Do It All Over Again and Best Overall Student Experience (niche.com) that resonate most. Earning the Princeton Review’s #1 Most Stone Cold Sober title for the 17th straight year isn’t far behind!

A holistic admission review can mean more than selecting an applicant based on a variety of contributing factors; holistic can also denote examining the potential long-term relationship of the applicant to the university. An admissions committee is not only admitting a freshman class; it is admitting someone’s roommate, the following year’s sophomores, future graduates, an alumni base, and a cadre of students who may carry the college’s name for a lifetime. The time spent on the front end of the process is well worth the endeavor.

 

References

National Association for College Admission Counseling. (2014). 2013 State of College Admission. Retrieved from http://www.nacacnet.org/research/research-data/nacac-research/Pages/default.aspx

Niche.com. Brigham Young University Rankings. (2014). Retrieved from http://colleges.niche.com/brigham-young-university/rankings/

Smith-Barrow, Delece. (2014, January). National Universities Where Accepted Students Usually Enroll. U.S. News and World Report. Retrieved from http://www.usnews.com/education/best-colleges/articles/2014/01/30/national-universities-where-accepted-students-usually-enroll

Thompson, Carolyn. (2014, August). BYU keeps No. 1 ‘stone-cold sober’ title in Princeton Review; Syracuse is top party school. Salt Lake Tribune. Retrieved from http://www.sltrib.com/sltrib/news/58259349-78/university-college-princeton-review.html.csp

BYU Institutional Assessment & Analysis. (2010). BYU Freshman Surveys Combined Report. Provo, UT: Author.

Travis Blackwelder is the Associate Director of Admissions at Brigham Young University. He has been employed at BYU for 11 years. Travis earned his bachelor’s degree from BYU and a master’s degree from Harvard University.

Author’s Note: I would like to acknowledge that BYU’s Dr. Norman Finlinson, Executive Director of Student Academic and Advisement Services, and R. Kirk Strong, Director of Admission Services, were co-presenters at the recent AACRAO Annual Meeting in Denver.

Editor’s Note: An expanded version of this article, including specific implementation measures and how nonacademic factors are assimilated into the holistic admission review process, will appear in a future issue of AACRAO’s College and University.

Creating FERPA Training That is Fun, Educational, Responsive, Participatory, Assessable

Several months ago, the Registrar’s Office received a call from a department on campus. The caller was frustrated and expressed his concern about FERPA violations that were happening in his department. One such violation pertained to faculty members returning graded papers and homework outside their office doors without student permission. The caller claimed his faculty had been told many times this was a violation.

Does this scenario sound familiar? With yes most likely being the answer, the important question is: how do we help faculty and staff adhere more fully to FERPA? Training is a foundational piece of FERPA compliance. A well-trained campus reduces the risk of non-compliance. More importantly, it aids in the protection of student rights.

While not all training programs need to look the same, the approach matters. This article addresses how Brigham Young University sought to increase FERPA compliance and awareness through training. Our FERPA training approach is built upon the universal principles of Fun, Educational, Responsive, Participatory, and Assessable. Though resources vary across institutions, we hope to spark ideas about opportunities on your campus that may not have been considered previously.

Background

Brigham Young University is a private, not-for-profit, four-year teaching-based institution located in Provo, Utah (about an hour south of Salt Lake City). Undergraduate enrollment hovers around 30,000, with an additional 3,000 graduate students. There are around 1,600 full-time faculty and 2,500 full-time staff and administrative employees.

Brigham Young University developed and maintains a homegrown student information system. Access to this system is granted after needs and roles are assessed and users have reviewed FERPA policy and training. Since not all BYU faculty and staff need access to the system, some fall outside the bounds of required FERPA training.

FERPA training on BYU’s campus has evolved from 20-minute VHS tapes available for checkout, to widely distributed DVDs, to online streaming video. Currently, BYU has implemented an online modular training environment.

Principle: Fun

What type of emails or videos are the most popular among your office staff? Is it the latest policy change? Perhaps it’s the mission and vision statements? If your office is like many others, it is probably the emails or YouTube videos with animals posing in Star Wars attire, demotivational cat posters, or 100 creative things to do with duct tape. Why are these so popular? Because fun is memorable. Fun is refreshing. Fun gets people’s attention.

Because personnel have different learning styles, traditional methods of written materials, verbal communication, email correspondence, or website text do not have a significant impact on FERPA knowledge (Maycunich, 2002). Instead, seek to provide training that appeals to several different aspects of your audience’s learning approaches. Create fun scenarios in video format (or other training formats) that increase impact and help participants retain FERPA knowledge.

Imagine a scenario where a supervisor approaches a member of his staff and has the following conversation:

Supervisor: “Amber, can you look up a student for me? Here is her information.”

Amber: “Sure, is she a new student you’ll be advising?”

Supervisor: “No.”

Amber: “Do you need that information to finish your reports?”

Supervisor: “No. I think she is dating my son.”

Obviously, under FERPA it would not be appropriate to release student information to the supervisor for personal reasons. Dating is a culturally humorous topic at Brigham Young University, and building a scenario around it sparks humor for the participant. Embrace your culture in your training, especially when it lends itself to fun. It takes only a bit of observation and a little creativity to incorporate fun into your FERPA training.

What are some other ways fun can be introduced around a seemingly daunting training topic such as FERPA? Here are some approaches:

  • Use lively and interesting individuals. There is a great pool of talent on most campuses. Access to talent may be easier than you think. Does your campus have a film/theatre department? Often there is an eagerness for projects in these departments. This is a great resource for incorporating interesting and lively individuals into your video training environment.
  • Create humorous situations. It is possible to create humor around a serious training situation; however, do not go overboard. Making light of a FERPA regulation may diminish its intended training purpose. Use subtlety in your approach, and use others to collect ideas. Does someone in your office have a funny experience from helping a customer, faculty, or staff member? Could that experience translate into a training opportunity? An office brainstorm can quickly generate a long list of options. You may be surprised how much the right environment (and a little chocolate) gets the ideas flying.
  • Use animated and unexpected graphics. Our design team created several visuals and animated graphics for the introduction portion of the training, including boxing gloves that punch words and cartoon figures that demonstrate FERPA principles. Let’s face it, who doesn’t enjoy lively and animated cartoons? If you are not an artist or graphic designer, try contacting your campus animation or design department for help.

Principle: Educational

In the early stages of this project, we asked ourselves whether building a training environment would increase knowledge and awareness of FERPA. It has been suggested that self-perception of FERPA knowledge increases significantly after a FERPA tutorial is administered, regardless of the individual’s faculty or staff status or years at the institution (Turnage, 2007). However, for the training to provide as much educational punch as possible, a few concepts had to be incorporated.

First and foremost, the training topics have to address common and relevant situations. A team of individuals was sent around campus to interview faculty and staff on topics covered by FERPA. This research provided feedback on areas of FERPA that needed special attention, such as returning graded homework, grade privacy, determining legitimate need to know, parents’ roles (or lack thereof) in student grades, third-party access, and use of email. Obviously we couldn’t cover every FERPA policy, but at least we had something to work from.

The next consideration was how to keep training scenarios and situations current. One of the great challenges faced today is understanding and applying FERPA in the digital age. New apps, cloud storage, and email access have not only changed the FERPA policy landscape but also opened up new possibilities for FERPA training on these new platforms. Training that is structured modularly allows content to stay current and relevant: it is much easier to replace small individual modules than to redo a continuous 20-minute video.

Lastly, to maximize the educational value, we train on specific topics using smaller amounts of information. The training is not, and cannot be, all-encompassing. Keep training scenarios and information specific and to the point.

Principle: Responsive

Everyone has different training needs. So what are the needs faculty, staff, and students have when it comes to training? How quickly can you respond to these needs?

Develop a plan that will help identify user needs before you begin. Part of this plan should include the technological resources available on campus, a breakdown of both staff and faculty preferences, and so on. Training that is easily accessible, available on demand, and role-specific should account for a majority of the need.

Not too long ago, our FERPA training environment consisted of a continuous-play DVD that faculty and staff were required to watch before being granted access to our student information system. This method provided little flexibility. Providing training in a web-based environment has allowed greater flexibility in access and formatting. With the advent of responsive design, training environments can render on a variety of devices, creating easy access, greater mobility, and on-demand availability.

As is true on many campuses, faculty and staff usually know when a situation they encounter is potentially FERPA sensitive, but they do not know exactly how to respond. Training that is role-based, with specific examples of how to navigate common situations successfully, addresses needs before they arise. Additionally, provide quick access to these scenarios within a training environment specific to the user, so they can refresh their knowledge by jumping directly to relevant scenarios.
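As a sketch of what role-based delivery might look like under the hood, the snippet below maps roles to scenario modules; the roles and module names are invented for illustration, not BYU’s actual catalog.

    # Role-based module selection: each user sees the scenarios most
    # relevant to their work. Roles and module names are hypothetical.
    modules = {
        "faculty": ["returning graded work", "posting grades", "parent phone calls"],
        "advisor": ["legitimate need to know", "third-party access"],
        "staff": ["email use", "directory information requests"],
    }

    def training_plan(role):
        """Core modules everyone sees, plus the role-specific scenarios."""
        core = ["what FERPA protects", "directory vs. non-directory data"]
        return core + modules.get(role, [])

    print(training_plan("faculty"))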

Principle: Participatory

For many, training materials have included handbooks, pamphlets, and other written text explaining FERPA policy. Most could tell you, however, that simply posting information about FERPA in a faculty handbook or university policy document is insufficient to help campus personnel understand the law (Maycunich, 2002).

Engaging in the process, on the other hand, internalizes the information. Retention occurs at a higher level when participants are engaged, focused, and challenged. Here are some ideas for building stronger participation into your training:

  • Invite the user to react to a situation. After a short video segment sets the stage for the FERPA situation, we present the trainee with a “what would you do” question and ask them to select one of four options. Once the answer is submitted, they receive instant feedback on their response; if they answered incorrectly, they are told what the correct answer is. This should be a learning process for the participant, not just an exam or certification (see the sketch after this list).
  • Ask the user, at the time the information is presented, whether they understood the principle(s) being taught. If users have further questions or comments, allow for the selection of an “I have questions regarding this…” option, and have these questions collected and available within the system. In the case of BYU, our FERPA compliance coordinator reviews and responds to each inquiry regularly. Since its deployment in August of 2013, more than 80 questions have been posted and responded to on a variety of topics. A key aspect of this process is to watch for trends in questions so they can be accounted for in future modules.
  • Confirm or correct responses and explain why. No matter what was answered, our system confirms the answer as correct or informs the user that it was incorrect and then provides the right answer. In either case, the user is informed of the principles behind the answer. Understanding why something is the way it is can often add sustainability to compliance.
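Here is a minimal sketch of this react-feedback-question loop, assuming a simple in-memory structure; the scenario text, options, and function names are illustrative, not BYU’s actual module content.

    # Multiple-choice scenario with instant feedback and a logged
    # "I have questions" channel; all content here is illustrative.
    scenario = {
        "prompt": "A supervisor asks you to look up a student he suspects "
                  "is dating his son. What would you do?",
        "options": ["Look up the record", "Decline and explain FERPA limits",
                    "Ask your FERPA coordinator first", "Release directory info only"],
        "correct": 1,
        "why": "FERPA permits access only for legitimate educational need, "
               "not personal reasons.",
    }
    question_log = []  # follow-up questions for the compliance coordinator

    def take(choice, follow_up=""):
        if choice == scenario["correct"]:
            print("Correct.", scenario["why"])
        else:
            right = scenario["options"][scenario["correct"]]
            print(f"Incorrect. The right answer is: {right}.")
            print(scenario["why"])
        if follow_up:
            question_log.append(follow_up)  # reviewed regularly by a human

    take(0, follow_up="Does this change if the supervisor is the student's advisor?")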

Principle: Assessable

In some circles, the term “assessable” refers to a basis for taxation. The intent here is not to impose a tax on the system, but to determine the system’s value. It is important to identify mistakes in the questions posed and to determine whether concepts are frequently misunderstood. It is also important to make sure your campus is not at risk due to a lack of understanding of key concepts. Much of this can be ascertained by building a system that measures individual performance and system usage patterns. All of this measurement should lead back to an understanding of the risk of FERPA violations.

By tracking individual scores and responses, you can assess each individual’s comprehension. Because the authentication system we implemented ties into the university’s CAS authentication, we are able to determine user information from the user ID. Since its inception, over 700 faculty sessions have been completed with an average score of around 90% correct. Nearly 2,400 staff sessions have been completed with an average near 80% correct. Because the scenarios differ between faculty and staff, a direct comparison between the two groups cannot be made.

Not all tools are perfect, and not all questioning is sound. Build in a mechanism that tracks each question and shows the distribution of answers. Several months into the system’s usage, we noticed that one question was missed more frequently than others. We were curious whether this reflected a misunderstood concept or a poorly worded question. Evaluation of the issue led us to believe it was a poorly worded question. We modified the question, and as a result, the accuracy of responses to it increased significantly.
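Here is a minimal sketch of that kind of per-question check; the response data below is invented for illustration.

    # Flag questions with unusually low accuracy: they may reflect a
    # misunderstood concept or a badly worded item. Data is illustrative.
    from collections import Counter

    responses = {  # question_id -> correct option index and chosen options
        "q1": {"correct": 2, "answers": [2, 2, 1, 2, 2, 2, 0, 2]},
        "q2": {"correct": 0, "answers": [1, 3, 0, 1, 2, 1, 3, 1]},  # often missed
    }

    for qid, data in responses.items():
        dist = Counter(data["answers"])
        accuracy = dist[data["correct"]] / len(data["answers"])
        flag = "  <-- review wording?" if accuracy < 0.5 else ""
        print(f"{qid}: accuracy {accuracy:.0%}, distribution {dict(dist)}{flag}")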

Lastly, integrate your environment with an analytics tool. We chose Google Analytics, mostly because it’s free. It allows us to understand how the environment is being accessed (mobile/tablet vs. desktop), how long users stay, which areas are used most, and so on. Google Analytics offers many ways to slice and dice the information, all of which can lead to more informed decisions on content structuring and framework.

Conclusion

So what’s next? Watch and learn. The system created at BYU will continue to collect data that can be used for future development initiatives. Other topics of interest have surfaced that we plan to build into the system as training modules.

We are also working with the administration to require FERPA training on a regular basis, rather than only once when users first request access to the student information system. Now that the infrastructure is in place, most future initiatives can be managed in house and won’t require additional resource planning.

While the specific approach taken at BYU may be daunting to some, the FERPA training principles are universal: Fun, Educational, Responsive, Participatory, Assessable. Resources always seem to be the foremost concern for those approaching a solution similar to BYU’s. There is no denying that access to quality resources helped in this project; however, most campuses are equipped with the resources needed. Do you have computer science students who can help develop the framework as a class project? Registrar staff can write scripts and then use the film and theatre department to create videos, or have some students apply their YouTube skills. If you have a graphic design department, or an employee with those talents, they can create visuals, graphics, and marketing. Coordinating this may seem challenging, but the payoff in building relationships and creating awareness of FERPA is substantial.

If you’d like to see what we did and get a feel for our videos, quizzes and administrative site, go here: http://registrar.byu.edu/registrar/ferpaDemo/index.php

References

Maycunich, Ann. 2002. FERPA: An investigation of faculty knowledge levels and organization practices at three land-grant universities. Ph.D. thesis, Iowa State University, Ames.

Turnage, Casey Carlton. 2007. School officials’ knowledge of the Family Educational Rights and Privacy Act of 1974 at the University of Southern Mississippi. Dissertation, University of Southern Mississippi, Hattiesburg.

 

Barry K. Allred is the University Registrar at Brigham Young University. He currently serves as the chair of the university FERPA Compliance Committee. Prior to joining the registrar’s office, he worked as the Associate Director of Technology Applications and manager of data reporting at BYU.

Jearlene Leishman is the Senior Associate Registrar at Brigham Young University with responsibilities over Records, Registration, Transfer Evaluation, Petitions, Data Entry, FERPA and data access. Jearlene currently serves as the FERPA Compliance Coordinator for the university.

Brian Chantry is an Associate Registrar at Brigham Young University with responsibilities over technology, academic data reporting and other internal Registrar Office support. He has over 11 years of experience in higher education focused primarily around the development of technology initiatives that improve business processes.