PLACEMENT AND RANKINGS
The U.S. News rankings for law schools rest on a composite score built from a series of inputs purportedly linked to law school quality. Although those inputs have been revised over the past fifteen years, the composite score has consistently reflected four categories: (1) the quality of enrolled students, measured by acceptance rates, median LSAT scores, and undergraduate GPAs; (2) the school's reputation, based on surveys of deans, professors, judges, and lawyers; (3) faculty resources, such as expenditure per student, library holdings, and student-faculty ratios; and (4) post-graduation outcomes, including employment rates, interview statistics, and bar passage.
Brian Leiter's Law School Rankings are a valuable resource for students because they distinguish faculty quality from teaching effectiveness, a distinction that traditional academic reputation surveys tend to obscure. For prospective law students, that distinction can usefully inform the choice of school.
11 Library holdings account for 0.75% of a school's total composite score. Law Methodology, U.S. NEWS & WORLD REP., http://www.usnews.com/articles/education/best-graduate-schools/20080326/law-methodology.html.
Numerical rankings that combine multiple input variables can be misleading, as commentators such as Brian Leiter have emphasized. In his 2006 article, "How to Rank Law Schools," Leiter argues for evaluating law schools on relevant factors rather than on a single composite number.
Schools should be evaluated on individual criteria rather than combined under arbitrary ones. While it is possible to rank schools on particular factors, such as job placement rates, those metrics cannot be merged into a single meaningful assessment.
13 We also think students would benefit from disclosure of other law school data, but this paper is focused on post-graduation outcomes.
Since the U.S. News rankings expanded in 1990, the outcomes category has incorporated various measures of post-graduation success, such as law firm interviews, bar passage rates, graduate employment levels, and median starting salaries. Although these factors account for only twenty percent of a law school's total score, we argue that post-graduation outcomes significantly influence prospective students' enrollment decisions. This Article examines the evolution of the post-graduation outcomes category and its impact on the status and behavior of law schools.
Prospective law students look to the U.S. News rankings primarily as a gauge of post-graduation outcomes, particularly because rising tuition has outpaced starting salaries.14 For students seeking a way to assess the relative risk of competing admissions offers, the rankings play a crucial role in the decision. Employers also increasingly rely on the rankings in recruiting, which prompts students to weigh the benefits of attending a higher-ranked school. A student admitted to an elite law school, for example, may take on significant debt in exchange for access to prestigious job opportunities, while a student with lower credentials choosing among non-elite institutions may give greater weight to factors such as tuition and bar passage statistics.19
14 Despite well-publicized "salary wars" in large legal markets, see, for example, Gina Passarella, First-Year Pay Up 250 Percent in 20 Years at Big Firms, LEGAL INTELLIGENCER, Mar. 7, 2006, over the last fifteen years the cost of legal education has increased much faster than associate salaries. See Leigh Jones, Salary Raises Dwarfed by Law School Tuition Hikes, NAT'L L.J., Feb. 6, 2006 (reporting NALP data showing that average private sector pay increased 60% between 1990 and 2005 while private school tuition increased 130% and in-state public school tuition increased 267%).
One of the top attorney recruiting firms releases an annual guide that assesses transcripts, academic honors, and journal memberships at the first-tier law schools as ranked by U.S. News, and the guide is widely used to evaluate the credentials of law school graduates.
16 See RONIT DINOVITZER, BRYANT G. GARTH, RICHARD SANDER, JOYCE STERLING & GITA Z. WILDER, AFTER THE JD: FIRST RESULTS OF A NATIONAL STUDY OF LEGAL CAREERS 75 tbl. 10.3
The 2004 After the JD report documents that graduates of Top 10 law schools, as ranked by U.S. News, carry the highest median debt levels; the full report is available on the NALP Foundation website.
Prospective law students can readily find empirical studies of law school rankings and job placement. Notable examples include Brian Leiter's Law School Rankings, which evaluate national law schools by their placement rates in elite law firms, and Michael Sullivan's updated analysis of law school job placement data.
18 See, e.g., William D. Henderson & Andrew P. Morriss, Student Quality as Measured by
In other words, students are not evaluating educational quality so much as return on investment.20
Rankings and employment prospects are not perfectly aligned, and our research indicates that law students and legal employers often prioritize economic payoffs over rank. Specifically, students in the lower three-quarters of the law school hierarchy weigh the value of a higher U.S. News rank against increased tuition costs, and those with slightly better credentials may choose lower-ranked schools in strong legal markets because of the larger number of on-campus interviews offered by nearby corporate law firms. We turn now to how the post-graduation component of the U.S. News rankings has evolved over time.
LSAT Scores: Migration Patterns in the U.S. News Rankings Era, 81 IND. L.J. 163, 187-88 (2006)
That study reports regression results for the bottom three quartiles of the U.S. News law school rankings showing a significant relationship between cost factors, including tuition, student debt, and whether a school is a public law school, and changes in median LSAT scores over a twelve-year period. The authors argue that the rankings have had a detrimental effect on the holistic admissions process.
In 2006, a law school dean observed that students accepted to multiple law schools often base their enrollment choices on various factors, prominently including the financial aid and scholarship packages provided.
19 See, e.g., Melissa Nann, Over Seventy Percent Passed July Bar, Examiners Report, LEGAL INTELLIGENCER (reporting Temple University's improved success rate for bar exam takers, quoting the dean of Penn State Dickinson School of Law that bar passage rates are a crucial factor for prospective students evaluating law schools, and noting that the improved results contributed to a forty-six percent increase in applicants).
In a recent article on employer placement, a law student observes that while some applicants prioritize diversity, most focus primarily on career placement and the cost of attendance. With legal education often resulting in debt exceeding $80,000, prospective law students are likely to weigh the financial costs (tuition, fees, and opportunity costs) against the potential benefits of higher future earnings and enhanced job prestige.
Anthony Ciolli, The Legal Employment Market: Determinants of Elite Firm Placement and How Law Schools Stack Up, 45 JURIMETRICS 413, 414 (2005).
Henderson and Morriss find that geographic proximity matters more than law school rank in explaining the concentration of educational credentials within particular law firm offices. This finding is consistent with a working paper by Oyer and Schaefer emphasizing the role of location in the personnel economics of large U.S. law firms, available on the Stanford Business School website.
U.S. NEWS'S EVOLVING PLACEMENT METHODOLOGY
Since 1991, the detailed rankings have included indices of post-graduation success accounting for twenty percent of the overall U.S. News ranking. This component has evolved significantly over the years, as illustrated in Table 1.
This Part analyzes the significance of changes in the U.S. News rankings from 1991 to 1996, a period in which annual methodological adjustments produced notable year-to-year fluctuations even as the outcome data moved into closer alignment with the overall rankings. It also explores the market dynamics and equilibrium that emerged once the post-graduation measures stabilized.
Between 1997 and 2006, by contrast, the U.S. News editors made relatively few changes to the placement methodology. This section considers the editors' likely motivations and analyzes the implications of that stability, particularly its effect on competition among law schools.
Table 1. U.S. News Inputs and Weights for Post-Graduation Success
Bar Passage   Percent Employed   Employer Interviews   Starting Salary
The Evolutionary Record from 1991 to 1996
In 1991, U.S. News expanded its ranking methodology to include four measures of placement success: the percentage of 1990 graduates employed at graduation, the percentage employed three months after graduation, the ratio of graduates to employers recruiting on campus, and the average starting salary of the 1990 class, excluding bonuses and judicial clerkships. Together these factors accounted for twenty percent of the ranking. Students often favor schools where they expect to earn a higher class rank, believing that rank will give them a competitive edge in the job market.
24 Top 25 Law Schools, U.S. NEWS & WORLD REP., Apr. 29, 1991, at 74.
In 1991, U.S. News ranked only the Top 25 law schools. Roughly half of those schools had placement ranks better than their overall ranks, and the frequent divergence suggests only a loose correlation between the overall rankings and the entry-level legal job market in the early 1990s.
Cornell and Northwestern were the significant winners in the 1991 ranking, with placement scores seven or more positions better than their overall ranks: Cornell ranked thirteenth overall but sixth in placement, and Northwestern fourteenth overall but seventh in placement. Conversely, notable losers included Stanford, Georgetown, Minnesota, and UNC, each of which scored ten or more positions below its overall rank in placement. Stanford (fourth overall but seventeenth in placement) and Minnesota (twenty-second overall but forty-third in placement) suffered because they failed to provide sufficient data to U.S. News, which estimated their scores, and the estimates dragged down their overall rankings.
In 1992, the rankings were expanded to include "quartile" data for schools beyond the Top 25, offering a selection of detailed information for those institutions. The placement score continued to be based on the same subfactors as in 1991, with one notable change: the employment statistic was measured at six months rather than three months after graduation.30
In those early years, schools paid relatively little attention to their U.S. News rankings, which likely meant less effort was invested in collecting the necessary input data.
Cornell and Northwestern might have ranked below Texas, which was fifteenth overall and sixteenth in placement, but for their strong placement performance. Texas outperformed Cornell on two of the four placement metrics and tied on one, and it beat Northwestern on one metric and tied on another. Had Northwestern surpassed Cornell in placement rank, it would have moved ahead of Cornell in the overall ranking, as their scores were only 0.8 points apart.
If Stanford had received a placement rank in line with its other rankings, it likely would have finished third, surpassing Chicago by virtue of higher scores in three of the five categories; Stanford tied Chicago on Academic Reputation and trailed it only on Placement. Whether Stanford would have overtaken Harvard for second is less clear: it outscored Harvard on Faculty Resources but trailed on Student Selectivity and Lawyer/Judge Reputation. Similarly, had Minnesota's placement rank been in line with its other scores, it would have ranked higher, passing George Washington, Wisconsin, and Hastings and perhaps challenging Iowa, given its superior performance on most of the other criteria.
29 Best of the Rest, U.S. NEWS & WORLD REP., Mar. 23, 1992, at 80.
This time both Stanford and Minnesota provided data to the magazine, and both did substantially better on placement than they had the previous year, when their scores were based on U.S. News's estimates.
Stanford improved its placement rank by twelve positions and moved up one spot overall, while Minnesota climbed nine places in placement and also advanced one position overall. Perhaps the low-ball estimation method employed in 1991 provided an incentive for schools to cooperate with the magazine's data requests.31
The placement scores for 1992 also varied considerably from those for 1991: of the twenty-three schools appearing in the Top 25 in both years, eleven dropped in placement rank, ten improved, and only two held their positions. Changes in overall rank were only loosely correlated with placement scores because U.S. News also adjusted the weighting of faculty resources and student selectivity. The most volatile component of the placement score was salary data: ten schools reported declines, including notable drops at Minnesota (11%) and Texas (8%), while fifteen schools reported increases, such as Hastings (15%) and George Washington (9%). Given the difficult job market for lawyers at the time, these swings likely stemmed at least in part from differences in the share of graduates reporting their earnings, suggesting that missing data could inadvertently improve a school's standing in U.S. News.
In 1993, U.S. News again revised its post-graduation metrics by adjusting the weights within the placement score: "employed at graduation" accounted for 40% of the score, "employed at six months" for 30%, the interview-to-graduate ratio fell to 5%, and average starting salary was weighted at 25%.
Despite these changes, the Top 25 remained largely stable: the top five schools retained their positions, and only nine schools swapped places. Most of the Top 25, however, recorded placement scores below, and in some cases well below, their overall ranks.
30 See Law Schools: The Top 25, U.S. NEWS & WORLD REP., Mar. 23, 1992, at 78.
31 The editors at U.S. News eventually noticed these incentives and altered the methodology. See infra notes 37-41 and accompanying text.
In the early 1990s, law firm recruitment activity at law schools declined significantly relative to the late 1980s. See Claudia MacLachlan, Another Paltry Summer, NAT'L L.J., June 8, 1992 (discussing reduced rates of on-campus interview (OCI) activity and the difficult job market for aspiring lawyers during this period).
In 1991, the legal job market declined significantly compared to 1990, with many larger firms opting for smaller summer programs. A National Association for Law Placement (NALP) report documented the bleak picture, showing reductions in on-campus interviews and in offers for both summer associate and full-time positions. As a result, many graduates remained unemployed six months after graduation.
33 Note, however, that the total contribution of the score remained at twenty percent.
34 The "A " List: The Top 25, U.S NEws & WORLD REP., Mar 22, 1993, at 62, 62-63.
Only three schools had 1993 placement ranks better than their overall ranks: Boston College (eleven spots better), USC (four), and NYU (five). Several schools, by contrast, had placement ranks substantially below their overall ranks: Georgetown (seven spots lower), Texas (six), Notre Dame (nine), Hastings (forty-one), Iowa (eighteen), and Minnesota (thirty-four). Four of these schools (all but Georgetown and Hastings) may have been hurt by their distance from major legal employment markets, underscoring how much geographic proximity can boost factors such as the interview-to-graduate ratio.
The ranked list expanded to fifty schools in 1994, and the placement weights changed slightly, again shifting emphasis toward employment six months after graduation (increased to 40% of the placement score, 8% overall) and away from employment at graduation (reduced to 30%, 6% overall).36
One of the most important changes in employment methodology occurred in 1995.
In previous years, U.S. News had assumed that graduates who did not report their employment status were employed; starting in 1995, it assumed that only 25% of graduates with unknown status were employed. The change addressed a bias in favor of schools with low reporting rates, which previously had little incentive to verify the status of graduates they suspected were unemployed. The new approach encouraged schools to confirm their graduates' employment status, improving the quality of the data. The 1995 revisions also increased the weight of employment six months after graduation to 55% of the placement score, switched the salary measure to the median (weighted at 10% of placement), and changed the interview statistic from the ratio of on-campus interviewers to graduates to the ratio of interview appointments to graduates.
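To make the effect of the 1995 change concrete, here is a worked illustration using entirely hypothetical numbers, not drawn from any school's actual data. Suppose a class of 200 graduates in which 150 report being employed, 20 report being unemployed, and 30 do not respond:
\[
\text{Employed}_{\text{old rule}} = \frac{150 + 30}{200} = 90\%, \qquad
\text{Employed}_{\text{new rule}} = \frac{150 + 0.25 \times 30}{200} \approx 78.8\%.
\]
Under the pre-1995 assumption, all thirty non-responders counted as employed; under the 25% assumption only 7.5 of them do, so a school with many unverified graduates sees its reported rate fall sharply, which is precisely the incentive to track graduates that the revision created.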
Methodological Stability, 1997 to 2006
With the exception of some minor reweighting of the employment data in 2006,54 the U.S. News post-graduation methodology has remained the same for nearly a decade, as shown in Table 1.55 During this period, the 20% post-graduation component consisted of two employment measures (percentage employed at graduation and percentage employed nine months after graduation) and a bar passage ratio.
49 The data were reported school-by-school, with no tables facilitating comparison.
The new bar passage variable is calculated by dividing a school's first-time bar passage rate in the state where the largest number of its graduates took the bar exam by that state's overall first-time bar passage rate, according to the U.S. News & World Report methodology.
52 In other words, the 1991 to 1996 time period strongly corroborates the market coordination theory of Russell Korobkin. See Korobkin, In Praise of Law School Rankings, supra note 1 and accompanying text.
In the 1950s and 60s, the LSAT played a minimal role in law school admissions, contrary to today's rankings-driven perspective A survey conducted by Lunneborg and Radford revealed that among eighty-eight law schools, only one prioritized the LSAT over undergraduate GPA The findings indicated a prevalent skepticism towards formulaic admissions processes, despite evidence supporting the LSAT's effectiveness in predicting first-year law student performance This suggests that calls for more systematic evaluation in admissions were largely unpersuasive at the time.
54 In 2006, U.S. News reduced the weight given to employment at graduation (from 6% to 4%) and increased the weight given to employment at nine months (from 12% to 14%). See supra Table 1.
55 See, e.g., Methodology, U.S. NEWS & WORLD REP., Mar. 2, 1998, at 81; Methodology, U.S. NEWS & WORLD REP., Apr. 10, 2000, at 74; Methodology, U.S. NEWS & WORLD REP., Apr. 9, 2001, at 79; Methodology, U.S. NEWS & WORLD REP. (2002), at 61; Methodology, U.S. NEWS
From 1997 to 2006, the rankings were more stable than during the fluctuations of 1991 to 1996, primarily because the data collection methods stayed constant. During this period, nearly all ABA-approved law schools reported improved employment statistics, while higher-ranked schools increasingly separated themselves from lower-ranked schools on bar passage. This section examines the trends in employment outcomes and bar passage success over these years.
1. Employment at Graduation and at Nine Months
Employment data reported by law schools to the ABA and U.S. News suggest a significant improvement in job prospects for graduates over the past decade. According to calculations based on U.S. News input data, the percentage of graduates of ABA-approved law schools employed at graduation rose from 62.6% for the class of 1997 to 73.9% for the class of 2004. Similarly, the percentage of new lawyers employed within nine months of graduation increased from 83.9% in 1995 (used in the 1997 rankings) to 91.6% in 2004 (used in the 2006 rankings). Moreover, as shown in Table 2, large gains were posted by law schools throughout the U.S. News hierarchy.
Table 2. Mean U.S. News Employment Inputs, by 1997 U.S. News Rank
Employed at Graduation   Employed at Nine Months
The significant increase in post-graduation employment rates may reflect either healthy competition among law schools or manipulation of the data. The positive interpretation is that law schools, responding to the competitive pressure of the U.S. News rankings, have improved their data collection and job placement efforts.
The top fifty law schools are divided into the "Top 16" and the "Rest of Tier 1" because the Top 16 have consistently occupied the top positions since the U.S. News rankings began.
J. LEGAL EDUC. 568, 572 (1998) (noting that "the same sixteen schools have occupied the top sixteen spots in every survey" since 1990).
Indeed, there is evidence that U.S. News has spurred many schools to track their graduates aggressively.57
The recent increase in reported post-graduation employment figures may be misleading, however, because some law schools appear to be inflating these statistics. The inflation takes the form of short-term internships created to boost the employment rates reported to U.S. News and of survey strategies that prioritize the reported number over genuine improvement in employment. In addition, the statistics do not distinguish between legal jobs that require a law degree and non-legal positions taken out of necessity, which further clouds the accuracy of reported employment rates.
Skepticism is also fueled by the stark contrast between rising employment rates and declining bar passage rates. While employment nine months after graduation increased from 83.9% to 91.5%, the first-time bar passage rate for the same graduates fell from 83.0% in 1995 to 78.6% in 2004. This discrepancy suggests that many of the reported jobs may not require a valid law license, potentially misleading prospective law students about the true value of a legal education.
57 See, e.g., Leigh Jones, Law Schools Play the Rankings Game, NAT'L L.J., Apr. 18, 2005 (noting that schools can improve their rankings by tracking graduates' job placements more aggressively, given how the U.S. News methodology treats incomplete data, and that several law schools have adopted more systematic approaches to monitoring employment statistics).
In an article titled "The $8.78 Million Maneuver," published in The New York Times on July 31, 2005, Alex Wellen describes how two schools used short-term internships to boost the employment statistics they report to U.S. News. Dale Whitman's piece, "Doing the Right Thing," in the AALS Newsletter discusses the ethical implications of such practices.
An April 2002 AALS message describes a law school that sought to improve its reported nine-month employment rate by hiring unemployed graduates to work in its library.
In his article "Ya Gotta Pay the Pig," Richard A Matasar discusses how law schools enhance their placement statistics by hiring graduates immediately after graduation or by contacting them with messages indicating that a lack of response will be interpreted as employment.
The bar passage rates for 1995 and 2004 were the inputs U.S. News used for its 1997 and 2006 rankings, and the figures reflect averages across full-time and part-time entering classes. The National Conference of Bar Examiners reported a decline in the overall bar passage rate from 69.8% in 1995 to 63.6% in 2004. Although bar passage rates for individual law schools are difficult to assess precisely (U.S. News considers only the state in which the largest number of a school's graduates sit for the exam), these data underscore the importance of law schools transparently reporting bar passage statistics. Ideally, such statistics would be adjusted for LSAT score and undergraduate GPA so that prospective students could gauge their own likelihood of passing the bar in light of their entering credentials.
Evaluating the costs and benefits of a legal education is essential, and it raises questions about the adequacy of current reporting standards. The ABA's Accreditation Standard 509 requires law schools to publish "basic consumer information" in a fair and accurate manner reflective of actual practice, underscoring the need for transparency and accuracy in how legal education is presented to prospective students.
Between 1997 and 2006, post-graduation employment statistics rose steadily, while U.S. News bar passage scores fluctuated considerably. Since bar passage was incorporated into the U.S. News methodology, higher-ranked law schools have generally improved their scores while lower-ranked schools have seen theirs decline.
Table 3. Change in U.S. News Bar Score, 1997-2006, by Starting Position
1997 Rank   Median   Mean   S.E. Mean   Std. Dev.   Valid N
Assessing the Changes
Determinants of Employment Variables
Post-graduation employment data weigh heavily in the U.S. News rankings, accounting for 18% of the total score, with bar passage contributing an additional 2%. Specifically, employment at graduation counts for 4% and employment at nine months after graduation for 14%, making these variables critically important to law schools.
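Restated schematically in our own notation (as we understand the published methodology, U.S. News standardizes each indicator before combining them, so the expression below shows relative weights rather than literal arithmetic on raw percentages), the post-graduation contribution to the composite score is:
\[
0.04 \cdot z(\text{Employed at graduation}) \;+\; 0.14 \cdot z(\text{Employed at nine months}) \;+\; 0.02 \cdot z(\text{Bar passage ratio}),
\]
where \(z(\cdot)\) denotes the standardized value of each input and the weights sum to the 20% post-graduation share of the overall score.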
Research indicates that the market for students is divided into regional and national segments that operate under different rules. A school's initial standing strongly influences its drawing power, but students in regional markets tend to prize geographic proximity to active legal employment markets over rank, and they also favor lower tuition and lower law school debt over rank. Notably, higher reputation scores show little correlation with improvements in median LSAT scores, and there are a variety of strategies that can improve a school's LSAT profile.
98 MICHAEL LEWIS, MONEYBALL: THE ART OF WINNING AN UNFAIR GAME (2003). See also Paul L. Caron and Rafael Gely's 2004 book review, which draws out the lessons law schools can learn from Billy Beane and the Oakland Athletics and the book's relevance to the legal academy. Henderson and Morriss propose three data-driven, Moneyball-inspired strategies for improving a law school's median LSAT score, and the MoneyLaw blog has become a popular outlet for law professors interested in applying Moneyball principles to legal education.
Our analysis of the U.S. News employment data reveals three key patterns. First, the top law schools maintain very high employment rates at graduation, ranging from 90.0% to 99.8%, with the rates declining down the hierarchy. Second, there is much less variation in employment rates at nine months after graduation, with a far narrower gap between Tier 4 schools and the non-Top 16 schools. Third, many law schools, particularly in Tiers 2, 3, and 4, do not disclose employment-at-graduation figures at all; nearly fifty schools failed to report this figure in the 2006 edition. In contrast, the vast majority of law schools reported employment at nine months, presumably because those statistics were more favorable for attracting prospective students.
Table 6. U.S. News Employment Data by Tier, 2006
Rank   Mean   25th Percentile   Median   75th Percentile   S.E. Mean   Valid N
Law school employment statistics should be approached with skepticism, but the manipulation appears to be fairly uniform and systemic. Although the reported numbers may lack credibility in absolute terms, the methods schools use to enhance these figures are well known among administrators. Consequently, even if the underlying data are degraded, the relative ordering of employment outcomes across law schools may still be meaningful; if all institutions engage in similar tactics to inflate their rates, it remains possible to identify factors that meaningfully influence those outcomes.
We conducted a linear regression analysis to investigate the factors that influence employment rates at graduation, using the 2006 U.S. News "percent employed at graduation" figure as our dependent variable. Note that these employment statistics reflect a two-year lag.
The 2006 employment data thus reflect the law school graduating class of 2004. Because our previous research showed that elite law schools operate in a distinct market with its own dynamics, we limited the analysis to schools outside the Top 16.
Our independent variables included six factors that could plausibly influence employment outcomes (a schematic version of the regression appears after the list):
(1) The 2004 25th percentile LSAT score, as a proxy for a school's overall rank. We posited that a school's rank significantly affects employers' hiring choices, because employers may be more willing to consider candidates with average or below-average grades from more prestigious institutions. We focused on the 25th percentile score because students at the lower end of the class face the greatest difficulty securing employment by graduation.
(2) The natural log of the number of on-campus interviews by NALP employers.103
We included this variable on the hypothesis that more OCI interviews indicate greater employer interest in a school's graduates. We used the natural logarithm to account for the diminishing returns of additional interviews as the number of employers approaches or exceeds the number of job-seeking students.
(3) Part-time enrollment. Our hypothesis is that part-time students have greater flexibility in timing their graduation than full-time students because they can adjust their course load each semester. Once part-time students secure permanent employment, they have an incentive to finish their degrees quickly, increasing the likelihood that they are employed at graduation; in addition, many either continue with their current employer after graduation or have already secured a job while balancing work and law school.
(4) A dummy variable, coded 1 for historically black institutions and 0 otherwise.105
We hypothesized that legal employers' interest in improving their firms' racial diversity would lead to increased hiring at schools that offered firms the greatest opportunities to reach minority students.106
101 See Henderson & Morriss, supra note 18, at 182-86.
William C. Kidder's article, "Portia Denied: Unmasking Gender Bias on the LSAT and Its Relationship to Racial Diversity in Legal Education," documents the reliance of legal employers, including law firms and federal judges, on the LSAT in hiring decisions, with significant implications for gender bias and racial diversity in the legal profession.
103 These data were compiled by National Jurist magazine. See Colleen Gareau, Who's Hiring on Campus This Fall?, NAT'L JURIST, Sept. 2005, at 16, 20 tbl.
For example, at Notre Dame Law School, which typically enrolls a class of roughly 185 students, 248 employers participated in on-campus interviews (OCI); the marginal value of the 248th employer's visit was presumably much lower than that of the 100th.
To test whether minority-student representation correlates with employment rates at graduation, we also ran the analysis with the percentage of minority students as an independent variable; the results were inconclusive.
(5) A dummy variable coded 1 if a law school is located in one of the Top 10 cities for corporate law firms and 0 otherwise. We hypothesized that such cities draw law students from many schools, including schools outside the city, producing greater competition for jobs and thus lower employment rates at graduation.
(6) The 2004 U.S. News lawyer/judge reputation score. Although this variable is often dismissed as adding little to the rankings, it may capture employer perceptions of law schools within specific regional markets; if so, schools with higher reputation scores should place a larger share of their graduates.
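For readers who want to see the structure of this analysis, the following Python sketch shows how such a regression could be set up. The data file and column names are hypothetical placeholders rather than the authors' actual dataset; the specification simply mirrors the six variables described above.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical dataset: one row per ABA-approved school outside the Top 16,
# with columns corresponding to the six independent variables described above.
df = pd.read_csv("law_school_employment_2006.csv")  # placeholder file name

# Dependent variable: 2006 U.S. News "percent employed at graduation"
# (reflecting the class of 2004, given the two-year lag noted in the text).
y = df["pct_employed_at_graduation"]

X = pd.DataFrame({
    "lsat_25th_2004": df["lsat_25th_2004"],              # (1) 25th percentile LSAT
    "log_oci_interviews": np.log(df["oci_interviews"]),  # (2) natural log of OCI employers
    "part_time_share": df["part_time_share"],            # (3) part-time enrollment measure
    "hbcu": df["hbcu"],                                   # (4) historically black institution dummy
    "top10_legal_market": df["top10_legal_market"],       # (5) Top 10 corporate-law city dummy
    "lawyer_judge_reputation": df["lawyer_judge_rep"],    # (6) lawyer/judge reputation score
})
X = sm.add_constant(X)

# Ordinary least squares, dropping schools with missing inputs.
model = sm.OLS(y, X, missing="drop").fit()
print(model.summary())
```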
Bar Passage
In 1997, U.S. News incorporated bar exam success into its ranking methodology, measured as the ratio of a school's first-time bar passage rate in its plurality state (the state where the largest number of its graduates take the bar) to that state's overall first-time passage rate. This metric accounts for 10% of the placement score and 2% of the overall U.S. News ranking.
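Stated as a formula (our restatement of the measure just described):
\[
\text{Bar input score} = \frac{\text{school's first-time pass rate in its plurality state}}{\text{that state's overall first-time pass rate}}.
\]
A score above 1.0 indicates that the school's graduates outperform the statewide average of first-time takers; a score below 1.0 indicates the opposite.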
Bar exam success is a key measure that both prospective students and employers use in evaluating law schools, and the inclusion of bar passage data in 1997 was a welcome development. However, bar passage rates are not directly comparable across states, for two reasons: (1) systematic differences in the qualifications of the applicant pools sitting for the bar in different states, which can produce misleading rankings and perceptions of law school quality, and
(2) differences among primary jurisdictions in both the content of exams and the cut scores used for common elements like the Multistate Bar Exam (MBE).
We offer three examples that show how these factors interact to create intractable problems of commensurability. First, in the 2006 edition of the U.S. News rankings, the bar passage score for the sixth-ranked University of Chicago was 1.16, lower than New York University's 1.29.
A 1998 study of the U.S. News rankings of ABA law schools, available from the AALS, observed that little is known about the survey of judges and lawyers.
Harry T. Edwards identified a significant disconnect between legal education and the practical needs of the legal profession in his 1992 article, "The Growing Disjunction Between Legal Education and the Legal Profession," arguing that the work of elite legal educators is increasingly irrelevant to practicing lawyers and that legal education should better align with the realities of practice. A follow-up article at 2191 (1993) reiterates and further develops this critique.
The disparity in bar passage scores between New York University (1.29) and the University of Chicago (1.16) arises largely from the differing overall pass rates on their respective states' bar exams, 75% in New York versus 85% in Illinois, not from the schools' own pass rates, which were 97.1% for NYU and 98.7% for Chicago. Given the two schools' similar entering credentials and faculty quality, it seems unlikely that Chicago students would perform significantly worse on the New York bar exam; both are "national" law schools whose curricula place relatively little emphasis on local law. The variation in bar results captured by the U.S. News measure therefore says little about the quality of education at either school.
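The arithmetic, using the figures just quoted, makes the point plain:
\[
\text{NYU: } \frac{97.1\%}{75\%} \approx 1.29, \qquad \text{Chicago: } \frac{98.7\%}{85\%} \approx 1.16.
\]
Chicago's graduates passed at a slightly higher rate than NYU's, yet Chicago receives the lower U.S. News bar score simply because Illinois's statewide denominator is larger.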
Second, there are significant biases in both the percentage of a school's students who take the plurality bar exam and the relative strength of those students. For example, more than 20% of Am Law 200 lawyers are based in the New York City metropolitan area, so the strongest graduates of law schools outside New York are disproportionately likely to sit for the New York bar rather than the local exam. Because law school grades are the best predictor of bar exam performance, this exodus lowers the average quality of the graduates from those schools who take the local exam and thus depresses their bar passage rates. Conversely, New York law schools are likely to have a higher proportion of their top graduates take the New York exam, although the strong cohort of out-of-state takers raises the statewide pass rate, further complicating comparisons.
Third, the scaled MBE cut score in New York is 134, while Illinois's is 132. New York also weights non-MBE materials more heavily, at 60% of the exam versus 50% in Illinois, including a section of fifty multiple-choice questions on New York law.
122 In the 2006 edition, Chicago outscored NYU on most of the reported criteria, tied on one, and was outscored by NYU on only the undergraduate GPA numbers.
123 See supra note 107 (providing breakdown of Top 10 markets).
Many of the non-New Yorkers taking the New York bar exam are likely to be graduates who have secured positions in New York, one of the most sought-after locations for large-firm jobs, which suggests that these candidates tend to come from the top of their graduating classes. Moreover, most top law schools are located outside New York; seven of the sixteen non-New York schools ranked in the Top 20 are Yale, Harvard, Penn, Michigan, Duke, Georgetown, and George Washington, which further supports this point.
126 For each bar jurisdiction, we calculated the mean 2006 25th percentile LSAT score of the law schools sharing that plurality bar exam, weighted by average class size.
In some states, candidates from non-ABA-accredited law schools, or candidates whose legal training consists of a law office apprenticeship, are permitted to take the bar exam, and they fail at substantially higher rates than graduates of ABA-accredited schools. In California, for example, 2,160 applicants from non-accredited schools sat for the bar in 2004 and only 16% passed, compared with a 54% pass rate for the 8,230 graduates of ABA-accredited schools. A few other states also permit non-ABA-accredited graduates to take the bar, but participation is generally low. The combination of California's bar admission rules, the large number of candidates from unaccredited schools, and the state's high cut score substantially lowers the statewide first-time bar passage rate, which in turn inflates the U.S. News bar input scores of most California law schools.
As Table 9 indicates, median bar passage scores generally decline in the lower tiers. Notably, however, several schools in Tiers 2, 3, and 4 have higher U.S. News bar input scores than at least one of the Top 16 law schools. The phenomenon is most pronounced in California, where the overall first-time bar passage rate is 61%. In fact, five of the six highest bar input scores belong to California law schools: Stanford (1.505), UCLA (1.410), UC Berkeley (1.372), UC Hastings (1.323), and USC (1.320). These results reflect the distinctive characteristics of the applicant pool and its effect on bar exam outcomes.
New York has the strongest applicant pool, with a weighted score of 161.1, followed by Virginia (160.7), Utah (159.7), and Illinois (158.7); Wyoming (149.0), West Virginia (148.0), and South Dakota (147.0) have the weakest pools as measured by 25th percentile LSAT scores. These figures have risen significantly over the past six years: in 2000, New York, Virginia, and Utah also ranked highest, but with scores of only 157.3, 157.1, and 155.3, respectively.
According to the May 2005 issue of The Bar Examiner, California's first-time bar passage rate was the 61% figure reported by U.S. News. Note, however, that the data on "source of legal education" do not distinguish between first-time exam takers and others.
In a number of states, including Alabama, California, and Massachusetts, at least one graduate of a non-ABA-accredited law school took the bar exam. In Alabama, California, Massachusetts, Tennessee, and Washington, more than 10% of bar takers came from non-ABA-accredited institutions, and the performance gaps between those takers and ABA-accredited graduates were 50% in Alabama, 38% in California, 37% in Massachusetts, 25% in Tennessee, and 2% in Washington. These figures are based on the authors' calculations from the 2005 Statistics.
Ave Maria School of Law, a Tier 4 school, posted a 100% bar passage rate in a jurisdiction whose first-time passage rate is only 74%, yielding a bar input score of 1.35. Although 2006 was the first year Ave Maria was ranked by U.S. News, data from 2003 to 2006 show that it had the best bar passage performance of any Michigan law school in three of those four years, with perfect passage rates in both 2004 and 2006. Detailed bar exam results are available on the Ave Maria School of Law website.
Remarkably, nineteen law schools posted U.S. News bar input scores higher than those of Marquette University and the University of Wisconsin, even though both schools have 100% bar passage rates by virtue of Wisconsin's diploma privilege for in-state graduates.
Of these nineteen schools, sixteen had either California or New York as their plurality bar exam.
Table 9. 2006 U.S. News Bar Passage Scores, by 2006 Ranking
Rank Median Minimum Maximum Range Valid N
Salary and Employment Information
While on-campus interviews play a significant role in legal hiring, most graduates find permanent positions through other channels. Roughly two-thirds of the class of 2003, for example, had received a job offer before graduating from law school.
Students seeking jobs with smaller firms, in the public sector, or at non-profit organizations typically find them through traditional methods such as job postings, referrals, and direct outreach to employers. Offers are often extended only after bar exam results are released, because many of these employers prefer to hire licensed attorneys.
The market for entry-level lawyers has important school-specific dynamics.
Prospective law students often seek historical data on graduation rates, employment types, geographic placement, and starting salaries for graduates of specific law schools. NALP collects exactly this information through a comprehensive annual survey submitted by law schools and publishes it in its annual Jobs & JD's report.
The school-level employment and salary data collected for Jobs & JD's, however, are not publicly available; the only school-level figures released are the employment rates submitted to the ABA and U.S. News, which lump all job types, including non-legal positions, into a single number and can therefore misrepresent graduates' actual outcomes. Moreover, although the NALP survey covers roughly 92% of graduates each year, response rates vary significantly among law schools, and differences in the pool of non-responders can skew the data, inflating the apparent likelihood that a school's graduates obtain meaningful legal employment.
James G. Leipold, the Executive Director of NALP, reports that in 2003 nearly 25% of jobs were obtained through the fall on-campus interviewing (OCI) process, predominantly at large law firms.
157 Id.; see AFTER THE JD, supra note 16, at 82 & fig. 11.2, tbl. 11.2 (showing relative importance of various methods of finding a first job after law school).
158 See Leipold, supra note 155, at 34.
159 See AFTER THE JD, supra note 16, at 80-82 & fig. 11.2, tbl. 11.2 (summarizing how job search varies dramatically by relative rank of law school).
According to a NALP report, the employment rate for new law graduates climbed to nearly 90% for the first time since 2001. The figure comes from NALP's 32nd consecutive annual report of demographic, employment, and salary data, which drew on 178 ABA-accredited law schools reporting employment information for 92% of the Class of 2005; a similar report for the Class of 2004 indicates that the market for new law graduates was holding steady.
161 See NALP Press Releases, supra note 160.
The ABA-LSAC Official Guide to Law Schools reports the number of graduates whose employment status is unknown, but its sector-by-sector percentage breakdowns are calculated only from known respondents. The Guide accordingly warns that, for schools with a significant percentage of graduates whose status is unknown, the reported percentages may not accurately reflect the class's actual employment outcomes.
Law schools, acting through the ABA, could authorize NALP to gather and publish detailed salary and employment data at the school level. That transparency would let students realistically weigh their expected debt against likely post-graduation earnings. Salary information strongly shapes law students' expectations, and averages can be skewed by a small number of graduates who land high-paying positions at large firms, so a breakdown of employment types by school would provide far more accurate insight. Even the current NALP salary data, however, remain valuable to students weighing the substantial investment that a legal education now requires.
By providing detailed school-specific information on employment types and expected salaries, law schools would help prospective students evaluate the real tradeoffs of choosing a more prestigious and more expensive institution, and such transparency could foster a healthier relationship between students' career aspirations and the financial investment required. Differences in placement rates in particular sectors (criminal prosecution, government agencies, and non-profit organizations, for example) could steer students toward more affordable schools with strong records in their desired fields. Armed with this information, students could make better-informed choices about which schools fit their goals, and law schools could use the same data to develop focused long-term strategies aimed at particular practice areas.
Publishing reliable information on employment types, employment rates, geographic placement, and salaries would enable prospective students to make informed, more price-sensitive choices and would force high-cost law schools to justify their tuition, producing a more competitive market for legal education. Such disclosure would not benefit every institution, because it would pressure budgets and encourage comparison shopping, but it would protect students from misleading claims about job placement.
Modest increases in starting salaries are colliding with debt levels driven up by soaring tuition. Since 1990, for example, in-state public law school tuition has risen 267% and private tuition 130%, while associate salaries at private firms have risen only 60%.
Rachel F. Moran has examined the influence of the U.S. News & World Report rankings on legal education, arguing that the current ranking system rewards gaming strategies rather than genuine competitive innovation, and that the rigid, one-size-fits-all standards of ABA accreditation contribute to stagnation in legal education.
In recent years, the number of new law schools has increased significantly, leading to a rise in bar exam failures, as highlighted by Leigh Jones in the National Law Journal.
16 (noting that "[s]ince 2003, at least seven new law schools have popped up across the country" and are in the process of getting provisionally approved by the ABA) A
Prospective law students often hold unrealistic expectations about the financial returns of a legal education. Because students typically finance that education with loans borrowed against their future earnings, the legal academy has a responsibility to provide transparent and accurate disclosure of employment outcomes and financial prospects.
Bar Passage
Bar passage is a crucial consideration for prospective law students evaluating law schools, and although U.S. News has attempted to measure bar exam success, the current input scores offer limited guidance. To improve transparency, law schools, working with the Law School Admission Council (LSAC) and the National Conference of Bar Examiners (NCBE), could establish a data-pooling arrangement that reveals how effectively each school prepares its graduates for the bar. Most states use the Multistate Bar Examination (MBE) and the Multistate Professional Responsibility Examination (MPRE), both of which produce scaled scores that are comparable across jurisdictions. Linking MBE and MPRE scores to the entering credentials and demographic data already held by LSAC would yield a far clearer picture of law school performance and aid students in their decision-making.
Such a data-pooling arrangement would create a comprehensive dataset tracking law school performance on these standardized tests and would permit school-specific bar results to be reported conditional on entering credentials. The results could be presented in tables or through an interactive web interface that allows easy comparison among schools. Students could then evaluate schools according to their likelihood of reaching a target MBE score and decide whether a more prestigious or more expensive school is worth the added cost. To be sure, accurate consumer information of this kind would hasten the decline of underperforming schools, and the consequences of allowing such schools to fail deserve consideration.
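As an illustration only, and not a description of any existing dataset, the following sketch shows one way the pooled data could be used to produce credential-adjusted, school-specific bar results: regress scaled MBE scores on entering credentials, then report each school's average residual, that is, its performance above or below what its students' credentials would predict. The file and column names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical pooled dataset: one row per graduate, linking LSAC entering
# credentials to NCBE scaled MBE scores. No such shared dataset currently
# exists; this sketch only illustrates the proposal described above.
df = pd.read_csv("pooled_lsac_ncbe.csv")  # placeholder file name

# Predict scaled MBE scores from entering credentials.
model = smf.ols("mbe_scaled ~ lsat + ugpa", data=df).fit()
df["predicted_mbe"] = model.predict(df)
df["residual"] = df["mbe_scaled"] - df["predicted_mbe"]

# A school's credential-adjusted bar performance: mean amount by which its
# graduates exceed (or fall short of) the scores their credentials predict.
school_value_added = (
    df.groupby("law_school")["residual"]
      .mean()
      .sort_values(ascending=False)
)
print(school_value_added.head(10))
```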
166 See 2005 Statistics, supra note 127, at 37 (summarizing the MBE).
The Multistate Performance Test (MPT) and the Multistate Essay Examination (MEE) are used by roughly half the states, but they are graded by the individual state bar examiners, so their results are not directly comparable across jurisdictions.
Thirty years ago, large-scale analyses were conducted without the advantage of modern computing technology, as demonstrated in the study by Carlson and Werts on the relationships between law school predictors, law school performance, and bar exam outcomes.
169 Among the forty-eight states requiring the MBE, the cut scores range from 120 to 145 on a 200-point scale. See PMBR, supra note 121.
From the perspective of law schools, the perceived danger is that professors will teach to the bar exam at the expense of teaching methods that foster broader professional development, hindering the adoption of innovative and more effective instructional approaches. Yet data from the 2006 Law School Survey of Student Engagement suggest that greater faculty-student interaction, including prompt feedback and discussion outside of class, is associated with stronger analytical skills, which are precisely the competencies the bar exam tests. It is also plausible that the quality of law school instruction, along with LSAT scores and law school grades, plays a significant role in bar exam performance.
Law professors must also guard against elitism. Roughly 25% of bar applicants fail the exam each year, with failure rates that are substantially higher among minority candidates. If we are unwilling to work to improve those outcomes, or to collaborate with state bar officials to develop a better test, we should ask how well we are fulfilling our responsibilities to our students and to the broader community.
Although we share much of the skepticism about simple composite rankings, we believe that the underlying outcome data can enhance competition in legal education to the ultimate benefit of law students. Access to outcome data such as bar exam results and interview statistics helps students make better decisions about where to apply and which offer to accept. And if students had access to reliable, affordable data sources beyond what U.S. News provides, the resulting competition among law schools might even lower tuition for some students.
170 See, e.g., AALS Survey, supra note 74, at 455 (summarizing reactions of many law professors who worry about excessive emphasis on higher bar scores).
171 See ROBERT STEVENS, LAW SCHOOL: LEGAL EDUCATION IN AMERICA FROM THE 1850s
TO THE 1980s (1983) (discussing controversies about the bar exam and reasons why it has endured).
Randolph N. Jonakait explores the dichotomy in legal education and the rise and fall of local law schools in his paper, "The Two Hemispheres of Legal Education and the Rise and Fall of Local Law Schools," critically examining the influence of elite law schools and noting that a majority of law professors graduated from those institutions, which raises questions about the broader agenda of legal education. The paper is available at http://ssrn.com/abstract=913084.
The LSAC Bar Passage Study documents significant racial disparities in both first-time and eventual bar passage rates. See also Stephen P. Klein & Roger Bolus's study of initial and eventual passing rates for first-time takers of the July 2004 exam
(reporting essentially the same results for the Texas bar population), available at http://www.ble.state.tx.us/announcements/klein%20report%200606.doc.
174 In other words, to do more than gripe about bar exams around the coffee machine in the faculty lounge.