David L. Baumer, NC State University Academy of Legal Scholars in Business, August 2010 Annual Conference
Although not all legal scholarship requires empirical research, in some instances empirical research clearly enhances the credibility of contentions made in scholarly legal writings
– E.g., women and minorities are victimized by discriminatory employment practices
Empirical data reveal that, on average, white men earn more than women and minorities, and their unemployment rates are lower
Of course, it is possible that other factors account for much of these differences: education, experience, choices made by women and minorities
– Furthermore, establishing the facts through empirical analysis is important
Even if you are morally opposed to the death penalty, it is still relevant to determine whether there appears to be a negative relationship between executions and murders
What is the role of empirical research in legal scholarship?
– I contend that empirical data can be analogized to the capture of an enemy combatant
Let's imagine that we capture M. (Muhammad, Mike, Mary, take your pick) and (s)he appears eager to reveal his co-conspirators
– Perhaps because he fears additional retribution
– As lieutenants in the war on terror, we would not be doing our job if we refused to listen to M.
If I strongly suspected that M. possessed valuable information that might save U.S. lives, I would go several steps further in the interrogation of M.
– I would be willing to harshly question M. and make threats that may or may not be real
– I could see using sleep deprivation to disorient M.
– I could even be persuaded that some light waterboarding would be appropriate in cases of imminent danger to many
Clearly, along this continuum, the reliability of information obtained goes down as the interrogation techniques become more extreme
If M. dies, you cannot get more information from him, so there are some natural reasons for moderating the questioning
As John McCain made clear, every person has a breaking point, and after that, he or she will say whatever is needed to avoid further torture
Under extreme duress the value of information obtained approaches zero
I favor the use of light torture of data: techniques that are common in the industry
Let us review some polar cases and then work towards the middle
– To ignore data that either vindicate or repudiate the contentions you advance in your paper is unethical
In many instances, gathering and analyzing the data exceed the abilities of the authors or anyone else
– I have no problem when someone reports that data are potentially available but that gathering and analyzing them is beyond the scope of the paper or the capabilities of the authors
– I have a huge problem when researchers need to know the results in advance and refuse to report statistical analyses that undermine their contentions; this has happened
Begin with a theory
– Discrimination in employment reduces wages of victims
– Structural Equation: W = f(E, A, R); Wages are a function of Education, Age, and Race
Linear expression of the Structural Equation above:
W = B0 + B1·E + B2·A + B3·R + e, where R is 1 or 0
The focus is on the B's, since E, A, R, and W are known: they are the data
If B1 and B2 are not positive and significant, something is wrong with the data
– A positive and significant relationship between these variables has been shown in thousands of studies
– The real focus is upon B3. R = 1 is used when the observation involves a woman or a minority, 0 otherwise
» If B3 is negative and significant, it shows that after adjusting for other variables, such as Education and Age, wages of women and minorities are lower, and that the difference is statistically significant
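The wage equation above can be sketched as a small simulation. Everything below is hypothetical: the sample size, the "true" coefficients, and the 4-unit wage gap are invented purely to illustrate how the dummy-variable coefficient B3 is estimated by ordinary least squares.

```python
import numpy as np

# Hypothetical data: all coefficients and distributions are invented for
# illustration, not drawn from any real wage study.
rng = np.random.default_rng(42)
n = 1000
E = rng.uniform(10, 20, n)               # years of education
A = rng.uniform(20, 60, n)               # age
R = rng.integers(0, 2, n).astype(float)  # 1 = woman/minority, 0 otherwise

# "True" process: education and age raise wages; R lowers them by 4 units.
W = 5.0 + 2.0 * E + 0.5 * A - 4.0 * R + rng.normal(0, 3.0, n)

# Ordinary least squares fit of W = B0 + B1*E + B2*A + B3*R + e
X = np.column_stack([np.ones(n), E, A, R])  # design matrix with intercept
betas, _, _, _ = np.linalg.lstsq(X, W, rcond=None)
b0, b1, b2, b3 = betas
print(f"B1 (education) = {b1:.2f}, B2 (age) = {b2:.2f}, B3 (dummy) = {b3:.2f}")
```

With enough observations, the estimated B3 lands close to the simulated gap of -4: negative, as the discrimination theory predicts.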
Nowadays, output from statistical analyses is computed for you
– Typically there will be a table that contains the values of the B's and their statistical significance
There are three levels of statistical significance commonly used
– .10, which means that the chance of obtaining such a value of a B (calculated by a computer solving the regression equation) purely by chance is less than one in ten [I choose not to discuss one- and two-tailed issues]
» Generally, we start from the assumption that B = 0. We want to determine whether the computed value of a B is likely due to chance or whether we have a genuine independent variable [E, A, R]
– .05 is the most commonly used standard for determining statistical significance. If a B is significant at the .05 level, it means that there is less than a 1 in 20 chance that the recorded B value was due to chance
– .01 is a strong indication of statistical significance: less than a 1 in 100 chance that the recorded value is a false positive
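A sketch of where those significance levels come from in regression output. Using only numpy, the p-value check is approximated by comparing each coefficient's t-statistic to the familiar large-sample two-tailed critical values (1.645 for .10, 1.96 for .05, 2.576 for .01); the data are simulated as in the earlier example and entirely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
E = rng.uniform(10, 20, n)
A = rng.uniform(20, 60, n)
R = rng.integers(0, 2, n).astype(float)
W = 5.0 + 2.0 * E + 0.5 * A - 4.0 * R + rng.normal(0, 3.0, n)

X = np.column_stack([np.ones(n), E, A, R])
betas = np.linalg.lstsq(X, W, rcond=None)[0]

# Standard errors from the usual OLS variance estimate s^2 * (X'X)^-1
resid = W - X @ betas
k = X.shape[1]
s2 = resid @ resid / (n - k)
cov = s2 * np.linalg.inv(X.T @ X)
se = np.sqrt(np.diag(cov))
t = betas / se  # testing each B against the null hypothesis B = 0

# Star the coefficients by the .10 / .05 / .01 large-sample cutoffs
for name, ti in zip(["B0", "B1", "B2", "B3"], t):
    stars = "***" if abs(ti) > 2.576 else "**" if abs(ti) > 1.96 else "*" if abs(ti) > 1.645 else ""
    print(f"{name}: t = {ti:.2f} {stars}")
```

A statistics package prints this table for you; the point of the sketch is only that the stars are nothing more than the t-statistic crossing a threshold.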
There are other statistical measures of the goodness of the model
– R² is a measure of the percent of variation in the dependent variable explained by the independent variables
– The F statistic is a measure of whether the model as a whole is significant
If F is not significant, you should be rethinking the model
Let's assume that R² is adequate and that the F statistic is significant, but your B values are not significant; what can be done?
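Both whole-model measures can be computed directly from the residuals, as this minimal sketch on simulated (hypothetical) wage data shows.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400
E = rng.uniform(10, 20, n)
A = rng.uniform(20, 60, n)
W = 5.0 + 2.0 * E + 0.5 * A + rng.normal(0, 3.0, n)

X = np.column_stack([np.ones(n), E, A])
betas = np.linalg.lstsq(X, W, rcond=None)[0]
resid = W - X @ betas

# R^2: share of the variation in the dependent variable W explained
# by the regressors
ss_res = resid @ resid
ss_tot = ((W - W.mean()) ** 2).sum()
r2 = 1.0 - ss_res / ss_tot

# F statistic for the model as a whole, with k-1 and n-k degrees of freedom
k = X.shape[1]
f_stat = (r2 / (k - 1)) / ((1.0 - r2) / (n - k))
print(f"R^2 = {r2:.3f}, F = {f_stat:.1f}")
```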
Suppose there is reason to believe that the relationship between W and E and A is positive, but curved: Education has a positive but diminishing effect on wages
– W = B0 + B1·E + B2·E² + B3·A + B4·(dummy)
B1 would be positive and significant, and B2 would be negative
It is common to transform variables into logarithms:
– W = B0·E^B1·A^B2·R^B3, which in log form becomes
lnW = B0 + B1·lnE + B2·lnA + B3·lnR
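Both transformations can be sketched on simulated data. The data-generating process below is hypothetical, built so that education has a diminishing effect (a negative E² term), which the quadratic specification should recover; the log-log fit illustrates the elasticity interpretation of the coefficients.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 600
E = rng.uniform(8, 22, n)
A = rng.uniform(20, 60, n)

# Hypothetical process with diminishing returns to education:
# wages rise with E, but at a decreasing rate (negative E^2 term).
W = 10.0 + 6.0 * E - 0.12 * E**2 + 0.4 * A + rng.normal(0, 2.0, n)

# Quadratic specification: W = B0 + B1*E + B2*E^2 + B3*A
X = np.column_stack([np.ones(n), E, E**2, A])
b0, b1, b2, b3 = np.linalg.lstsq(X, W, rcond=None)[0]
print(f"B1 = {b1:.2f} (positive), B2 = {b2:.3f} (negative: diminishing returns)")

# Log-log specification: lnW = B0 + B1*lnE + B2*lnA
# (coefficients are then read directly as elasticities)
lnX = np.column_stack([np.ones(n), np.log(E), np.log(A)])
log_betas = np.linalg.lstsq(lnX, np.log(W), rcond=None)[0]
```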
Outliers
– In advance of the tests, sometimes you know that some years or some observations are going to give you misleading results
In time series analysis, the war years are sometimes thrown out
– The reported impact of capital punishment was achieved in part by including four years in the late 1960s
Some industries are outliers
– Should we include professional sports in wage equations?
If you are going to discard some data as outliers, you need a rationale other than that your model works better when you throw out certain years or observations
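One defensible practice is to commit to an outlier rule before looking at how it affects your results. The sketch below is hypothetical: it plants a few contaminated observations (say, recording errors), applies a pre-stated rule of dropping points whose residual exceeds three sample standard deviations, and refits.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 300
E = rng.uniform(10, 20, n)
W = 4.0 + 2.0 * E + rng.normal(0, 2.0, n)
W[:5] += 40.0  # a few contaminated observations (e.g., a recording error)

X = np.column_stack([np.ones(n), E])

def fit(Xm, y):
    """Ordinary least squares coefficients."""
    return np.linalg.lstsq(Xm, y, rcond=None)[0]

betas_all = fit(X, W)
resid = W - X @ betas_all

# Pre-committed rule, stated before seeing the results:
# drop observations whose residual exceeds 3 sample standard deviations.
keep = np.abs(resid) <= 3.0 * resid.std()
betas_trim = fit(X[keep], W[keep])
print(f"dropped {int((~keep).sum())} of {n}; "
      f"slope {betas_all[1]:.2f} -> {betas_trim[1]:.2f}")
```

The rationale ("residual beyond 3 sd") is stated and reproducible, rather than tuned after the fact to make the model work better.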
Robert Weisberg, JD, PhD, Edwin E. Huddleson, Jr. Professor of Law at Stanford University Law School, in his Dec. 2005 Annual Review of Law and Social Science article titled "The Death Penalty Meets Social Science: Deterrence and Jury Behavior Under New Scrutiny," wrote: "Social science has long played a role in examining the efficacy and fairness of the death penalty. Empirical studies of the deterrent effect of capital punishment were cited by the Supreme Court in its landmark cases in the 1970s; most notable was the 1975 Isaac Ehrlich study, which used multivariate regression analysis and purported to show a significant marginal deterrent effect over life imprisonment, but which was soon roundly criticized for methodological flaws. Decades later, new econometric studies have emerged, using panel data techniques, that report striking findings of marginal deterrence, even up to 18 lives saved per execution. Yet the cycle of debate continues, as these new studies face criticism for omitting key potential variables and for the potential distorting effect of one anomalously high-executing state (Texas). Meanwhile, other empiricists, relying mainly on survey questionnaires, have taken a fresh look at the human dynamics of death penalty trials, especially the attitudes and personal background factors that influence capital jurors."
Frequently the credibility of a story or analysis can be enhanced by data analysis. There are a number of do's and don'ts
Do:
– Make use of data when they are available
– Link up with a colleague who has training in statistics
Economics, Statistics, Finance, Marketing, Sociology
May or may not require co-authorship
– By vetting your statistical work through someone with advanced graduate training, you can avoid embarrassing mistakes
Alert readers to the steps you took to obtain your results
– You should reveal enough information so that a diligent reader, critic, or reviewer could reproduce your results
– You should be willing to share your dataset, even though you spent hours and money compiling it
– Your initial hypotheses should be revealed
– You should reveal data transformations: log transformations, elevation of variables to higher powers, discarding of outliers
– Relate your findings to prior studies
– Report unexpected results that conflict with your hypotheses
There is nothing improper about advancing arguments that are not supported by the data
– Opponents of the death penalty need not defeat deterrence arguments based on regression analysis of differences in murder rates across states
– The death penalty could be opposed on "moral" grounds
Data analysis can be your friend, and even if some studies report results contrary to your contentions, virtually every empirical study has weaknesses