Stand your ground/castle law studies

Jimro said:
1977 to 2005 is 28 years of data.

2000 to 2010 is 10 years of data.

An intellectually rigorous person would give more weight to the conclusion that was derived from the larger data set.

The first thing an intellectually rigorous person would consider is what a given study was designed to evaluate.

A comparison of the number of years of data is irrelevant, because they weren't studying the same thing as Lott. The whole point of the Texas A&M study was to examine the effects on crime rates of the liberalizations of castle doctrine that were passed in several states between 2005 and 2009. Lott was studying the effects of self-defense laws in general in a time frame that preceded that of the Texas A&M study. The laws in which Hoekstra and Cheng were interested had not been passed at that time.

As to the difference in the sizes of experimental and control groups which you raised in an earlier post, the difference-in-differences method handles that readily by using weighted averages. (See Table A1, for example, in which the regressions are weighted by state population.) It's not unusual in such studies for the size of the control group to exceed that of the experimental group by a factor of 10 (see, for example, this paper on using the method in health care research).
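For anyone unfamiliar with the method, a minimal sketch of a population-weighted difference-in-differences regression might look something like this. This is only an illustration of the technique, not the authors' code; the file and column names are hypothetical.

Code:
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical state-year panel with columns:
#   log_homicide_rate, treated (1 if the state passed a castle-doctrine law),
#   post (1 for observations after passage), population
df = pd.read_csv("state_panel.csv")  # hypothetical file

# The coefficient on the treated:post interaction is the
# difference-in-differences estimate. Weighting by population is how a small
# treatment group can be compared against a much larger control group.
model = smf.wls(
    "log_homicide_rate ~ treated * post",
    data=df,
    weights=df["population"],
).fit()
print(model.summary())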

Rather than attempting to find fault with the study because one doesn't like its conclusions, we might find it far more interesting to discuss the implications of such research from the perspective that its findings may be valid. If "stand your ground" laws indeed lead to increased homicide rates, that should interest us. As their conclusion notes, it's possible that the increase reflects a greater number of justifiable homicides:
A critical question is whether all the additional homicides that were reported as murders or non-negligent manslaughters could have been legally justified. Based on the results of various tests and exercises performed here, our view is that this is unlikely, albeit not impossible.
So it's an open question as to how much of the increase represents the lawful use of deadly force; there may be more meat here for proponents of "stand-your-ground" laws than for opponents.
.....
2damnold4this, thanks for the link.
 
A comparison of the number of years of data is irrelevant, because they weren't studying the same thing as Lott. The whole point of the Texas A&M study was to examine the effects on crime rates of the liberalizations of castle doctrine that were passed in several states between 2005 and 2009. Lott was studying the effects of self-defense laws in general in a time frame that preceded that of the Texas A&M study. The laws in which Hoekstra and Cheng were interested had not been passed at that time.

No, the whole point of the A&M study is right there in the title, which is a horrible title because it asks 4 questions.

Does Strengthening Self-Defense Law Deter Crime or Escalate Violence?

Let me translate that into 4 questions:
Does strengthening self-defense laws deter crime?
Does strengthening self-defense laws have no effect on crime deterrence? (null hypothesis)
Does strengthening self-defense laws escalate violence?
Does strengthening self-defense laws have no effect on violence? (null hypothesis)

Then they focus solely on "homicide" instead of "crime." In Florida they highlight the increase in homicide but ignore the dramatic decrease in overall violent crime.

And the larger data set is relevant because it gives an actual baseline of data that shows normal variation.

And here is why. Suppose you were asking, "Has the introduction of biotech corn caused an increase in water consumption?" If you didn't have a good baseline of data for a particular area, covering a number of years and showing average water use under different growing conditions, and instead chose a "control" group of different states with different growing conditions, looking only at the years after biotech corn was introduced, you would have designed a poor experiment.

And that is what Hoekstra and Cheng did: they eliminated the baseline fluctuation that is freely available in the very data sets they used in their study. So now their study is open to the criticism that they limited their data set to ignore "noise in the system" instead of accounting for it.
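To make that concrete, here is a rough sketch of the kind of baseline check I mean, using the longer pre-law history to see how much homicide rates normally bounce around before asking whether the post-law change stands out from that noise. This is my own illustration, not anything from the paper; the file name, column names, and cutoff year are made up.

Code:
import pandas as pd

# Hypothetical state-year panel with columns: state, year, homicide_rate
panel = pd.read_csv("state_homicide_panel.csv")  # hypothetical file

LAW_YEAR = 2006  # illustrative cutoff for the castle-doctrine wave

pre = panel[panel["year"] < LAW_YEAR]
post = panel[panel["year"] >= LAW_YEAR]

# Year-to-year changes in each state's homicide rate before the laws passed:
# this is the "normal variation" a longer baseline gives you.
baseline_changes = (
    pre.sort_values(["state", "year"])
       .groupby("state")["homicide_rate"]
       .diff()
       .dropna()
)
noise = baseline_changes.std()

# Average change in level after the laws, compared informally
# against that baseline noise band.
post_change = (post.groupby("state")["homicide_rate"].mean()
               - pre.groupby("state")["homicide_rate"].mean())
print(f"baseline year-to-year std dev: {noise:.2f}")
print(f"average post-law change:       {post_change.mean():.2f}")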

As far as particular state laws go, simply accounting for population density and urban distribution is only the beginning. Think about this: gun-friendly Vermont is in the "no stand your ground or castle doctrine" control group despite having some of the most gun-friendly laws in the country.

They limited their data set and narrowed the scope of "violence" to "homicide," and to me that is not good science, or good English.

Good science is observable and repeatable. Here the data set they used is freely available to the public, and I have already shown a different interpretation of it. I could repeat their work by following their steps exactly, but I could also refute it by using a larger portion of the very data set they used. That is not good science.

In terms of having a small experimental group, remember the scare the "vaccines cause autism" study brought about? That study was never replicated, suffered from a small sample size, and caused children across the world to get sick from preventable diseases. I'm very critical of bad science, because it has profound effects on policy.

Jimro
 
The Georgia State study differed from the A&M study in several ways. One is that the GS study looked at laws that removed the duty to retreat separately from laws that changed things like civil liability or a presumption of innocence. Another is that the GS study looked at race and gender. It's interesting that the GS study found an increase in murder rates associated with the removal of the duty to retreat but not with the other changes. It's also interesting that the increase in murder rates was only for white males.
 
Ok, just finished the first read through of the Georgia study.

Much more rigorous than the Texas A&M study.

However, their finding of 4.4 to 7.4 more homicides per month, spread across 18 states, is something I consider a non-issue.
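Just to put that figure in rough perspective, here is my own back-of-the-envelope arithmetic on their numbers, not anything from the study itself:

Code:
# Rough scale of 4.4-7.4 extra homicides per month spread across 18 states
low, high = 4.4, 7.4
states, months = 18, 12
print(low * months / states)   # ~2.9 additional homicides per state per year
print(high * months / states)  # ~4.9 additional homicides per state per year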

Also, it should be noted that Texas and Louisiana had some trouble with the weather in 2005, which caused social upheaval outside the "statistical norm" for those states. There is no mention of the mass human migration after the New Orleans flooding as a confounding factor.

I do appreciate that they showed the control rates both before and after, with similar curve patterns between the SYG and control populations. I also noticed that the curves diverged very little after the SYG zero point.

Jimro
 
I truly believe the violent crime rate depends on the age of the population. An older population is less violent than a younger one. America is getting older, so the violent crime rate is dropping. Half the population is over 40, and the average age of a Floridian is 42.
 