Those of you who follow the debate over the value of JDs (especially post-Lehman) are likely familiar with Michael Simkovic's research, which found a million-dollar lifetime earnings premium. Criticism, quite rightly, abounded. One of the most glaring problems was the claim about how many hours per week lawyers work compared to the rest of the workforce (warning, your sides may split):
We find that, after applying controls, law degree holders typically work 3.9 hours more per week, or about 45 minutes per day.
While the rest of the world is working 9-5 jobs, lawyers are working 9-5:45. Tell that to your boss today and see how things go...
This problem illustrates one of the biggest flaws in looking at the JD premium: it treats the JD like a passive investment. Put in your $150,000 of tuition and 3 years, (maybe) pass the bar, then just go about the rest of your life as usual and you'll see a $20,000 per year increase to your paycheck. Anyone who's ever worked a day as a lawyer (as Simkovic briefly did) should know that's not how it works. The JD will bring many people an increase in their hourly earnings, but most of the earnings premium has to be worked for. You work longer, more stressful hours. The true earnings bonus comes from removing the 40-hour cap on most white collar jobs, allowing you to work 60, 70, or 80 hours a week. Now, if you happen to enjoy legal work, that's going to be an awesome situation for you. If you don't, then the ability to work longer, harder hours isn't a bonus at all.
But that's all beside the point for what we want to discuss today. We can kinda see how, if you come across some not-so-reliable data and you've got a bunch of confirmation bias in your head, you won't question it too much and just land on the conclusion you already wanted to reach. That's just regular derp. In a recent post on Brian Leiter's blog, Simkovic went full derp.
Simkovic claimed that when schools report salary data, it is not unethical to withhold the response rates. He defends this first by saying it's standard practice. That argument is a non-starter: it's entirely possible that the standard is just to engage in unethical behavior. The "everyone else is doing it" argument doesn't fly.
His next line of reasoning is that prospective students are subject to information overload, so the data needs to be kept away from them:
Sometimes, too much information can be distracting. It’s often best to keep communication simple and focus only on the most important details.
Remember those lawsuits from a few years back when students claimed they were deceived by their schools about employment prospects? They lost because the courts found they were sophisticated consumers capable of seeing through the schools' puffery and other bologna. Simkovic is now arguing the opposite, that including a response rate would be too much and hurt their precious widdle bwains.
The issue here is the word "best." Yes, if your goal is to increase enrollments, especially among students paying full sticker price, it is "best" to keep your salary data as "simple" (read: favorable) as possible. If your goal is to help prospective students make a fully informed decision, then no. God fracking no, it's not "best" to exclude the friggin' response rate.
But wait, Simkovic hasn't gone full derp just yet.
His last defense of excluding response rates is that it doesn't matter because, well... we'll let you see it for yourself (emphasis added):
Nonresponse is not the same thing as nonresponse bias. Law school critics do not seem to understand this distinction. A problem only arises if the individuals who respond are systematically different from those who do not respond along the dimensions being measured. Weighting and imputation can often alleviate these problems. The critics’ claims about the existence, direction, and magnitude of biases in the survey data are unsubstantiated.
High non-response rates to questions about income are not a sign of something amiss, but rather are normal and expected. The U.S. Census Bureau routinely finds that questions about income have lower response rates (higher allocation rates) than other questions.
Law school critics claim that law school graduates who do not respond to questions about income are likely to have lower incomes than those who do respond. This claim is not consistent with the evidence. To the contrary, high-income individuals often value privacy and are reluctant to share details about their finances.
His claim is that people who don't answer salary questions might not be any different from those who do, so the low response rate doesn't affect the numbers. But even if they were different, non-responders probably earn more, so the salary data schools report is actually lower than the real earnings.
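To see why that first claim matters, here's a toy sketch with entirely made-up numbers (nothing to do with any actual NALP report): when grads skip the salary question at roughly the same rate everywhere, the reported average barely moves; when the small-firm grads are the ones staying quiet, the reported average balloons.

```python
# Toy illustration with made-up numbers: nonresponse only biases the
# reported figure when responding correlates with the value being measured.

def reported_mean(salaries, responded):
    """Average salary computed from respondents only."""
    reported = [s for s, r in zip(salaries, responded) if r]
    return sum(reported) / len(reported)

# 10 hypothetical grads: 4 in big firms at $160k, 6 in small firms at $50k.
salaries = [160_000] * 4 + [50_000] * 6
true_mean = sum(salaries) / len(salaries)  # $94,000

# Roughly even nonresponse: one grad in each group skips the question.
even_skip = [True, True, True, False] + [True, True, True, True, True, False]

# Systematic nonresponse: every big-firm grad answers, half the small-firm grads don't.
systematic_skip = [True] * 4 + [True, True, True, False, False, False]

print(f"true mean:        ${true_mean:,.0f}")                                # $94,000
print(f"even skips:       ${reported_mean(salaries, even_skip):,.0f}")       # $91,250
print(f"systematic skips: ${reported_mean(salaries, systematic_skip):,.0f}") # $112,857
```

Same overall response rate in both scenarios would be even cleaner, but the point survives the rough cut: random nonresponse moves the figure by a few thousand dollars, systematic nonresponse inflates it by nearly twenty grand.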
If you just stepped in something, it would be Simkovic's brains leaking out on to the floor.
He backs up this claim with a citation about how urban lawyers are less likely to respond than rural lawyers, and urban lawyers earn more, so blah blah blah. That's some evidence, but it's hardly compelling. It's even less compelling if you just sit down with a couple of NALP reports and actually look at the salary data and response rates. We'll grab a few at semi-random (focusing on schools that have a lot of students in both large and small firms, so we can see if there's a difference in response rates):
#1 American University 2013:
53 students were employed in firms with 101+ lawyers, 47 (89%) provided salary data, which ranged from $135,000 to $160,000.
49 students were employed in firms with 2-10 lawyers, and only 27 (55%) provided salary data, which ranged from $50,000 to $65,000, the lowest range for any firm size band.
#2 Boston College 2013:
75 students in 101+ firms, 74 (99%) provided salary data, ranging from $145,000-160,000.
26 students in 2-10 firms, 15 (58%) provided salary data, ranging from $53,000-65,000.
#3 Fordham 2013:
164 students in 101+ firms, 163 (99%) provided salary data, ranging from $132,500-160,000.
38 students in 2-10 firms, 23 (61%) provided salary data, ranging from $52,500-75,000.
#4 George Mason 2013:
23 students in 251+ firms*, 21 (91%) provided salary data, ranging from $135,000-160,000.
37 students in 2-10 firms, 19 (51%) provided salary data, ranging from $42,500-70,000.
*4 students were in 101-250 sized firms, but NALP does not report salary data when there are fewer than 5 people in the category.
#5 Pepperdine 2013:
16 students in 101-250 and 500+ firms*, 16 (100% without rounding!) provided salary data, ranging from $76,000-160,000.
50 students in 2-10 firms, 33 (66%) provided salary data, ranging from $52,000-75,000.
*Only 3 students in 251-500 firms
#6 Wayne State 2013:
14 students in 101-250 firms, 14 (100%) provided salary data, ranging from $100,000-100,000. (Maybe this is a good time to note the salary figures are the 25th to 75th percentiles.)
39 students in 2-10 firms, 11 (28%) provided salary data, ranging from $31,200-52,000.
Do you see a trend? We sure as hell do. People who do not respond are, in Simkovic's words, "systematically different" from those who do respond. Nearly everyone working in a large firm provides salary data. Only about half of those working in small firms do. That's a systematic difference. And now here comes the full derp:
Simkovic thinks that the people working in small firms not reporting are withholding the data because they're making bank.
At virtually every single school the lowest salaries are found in the 2-10 sized firms. But according to Simkovic, this is only because the high earners aren't reporting their salaries. But only at small firms! High earners at large firms aren't so shy about their salaries. The numbers are only off because there's a bunch of tiny firms paying $160,000 right out the gate, and no one is talking about them.
He's just one step away from claiming that the lack of news coverage about the Illuminati is proof that the Illuminati controls the media.