by Jim Gardner Oct 26, 2015 Lake Forest Patch
Yesterday we discussed the misrepresentation of data by Orange County officials, citing the Orange County Sheriff’s Department (OCSD) misstatement of the costs of police services and a deflection by Orange County Animal Care Services (OCAC) when asked about euthanasia rates. I also cited a large discrepancy between what OCAC considers the euthanasia rate for dogs (9.6%) and my own calculation (22.6%). Today we’ll look at how that discrepancy arises.
To get their numbers, OCAC has to fiddle with the data. They use the total number of dogs entering the shelter, which gives a very large number that helps keep the euthanasia rate down.
As difficult as it may be to believe, they include the number of dogs who arrive dead. These are classified as “deceased animal impound” and can be distinguished from the dogs who are picked up dead by the OCAC field services. In 2014 there were 925 dead dogs who arrived at the shelter. These 925 dogs are included as being “admitted”! This inflates the number of admissions and reduces the euthanasia rate, since the rate is based on the number killed divided by the number admitted. Needless to say, dead dogs are not euthanized. The higher the number admitted, the lower the euthanasia rate.
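The effect described above can be illustrated with a quick back-of-the-envelope sketch. The numbers here are hypothetical, chosen only to show the direction of the effect, not OCAC’s actual figures:

```python
# Hypothetical illustration: padding the admissions count lowers the
# apparent euthanasia rate without a single fewer dog being killed.
killed = 1000
live_admissions = 5000
dead_on_arrival = 900  # dogs that arrived dead, counted as "admitted"

rate_honest = killed / live_admissions                      # 20.0%
rate_padded = killed / (live_admissions + dead_on_arrival)  # about 16.9%

print(f"{rate_honest:.1%} vs {rate_padded:.1%}")  # 20.0% vs 16.9%
```

The killed count never changes; only the denominator grows, and the reported rate drops.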
In my study of dozens of shelters throughout California and the U.S. I have never encountered this policy (E-mail me for my latest report that shows the most recent figures in California).
Next they include those animals who get “lost” or “escape” or go “missing” or are otherwise unaccounted for. They add in some who have no outcome at the time that the statistics are gathered (e.g., they may be slated to be killed but not yet killed). In 2014 this came to another 75 dogs. Once again, by increasing the number admitted they lower the euthanasia rate. But because there is no outcome for these dogs, they should not be included in the computation of the euthanasia rate. Again, needless to say, missing or escaped dogs are not euthanized.
As stated above in regard to counting dead dogs, in my study of dozens of shelters throughout California and the U.S. I never found the policy of counting missing and escaped in the computation of outcome data regarding euthanasia.
Next, OCAC removes the number of dogs who are killed that were brought to the shelter by owners asking that the dogs be euthanized. Of course not all dogs brought to a shelter by an owner need to be euthanized. But OCAC assumes they do, and subtracts this number from their statistics.
What differences do all these machinations make?
Below are the underlying figures, followed by the OCAC calculation and my own. We both start from the same numbers. I’m using the 2014 figures, since that is the only year for which I have complete information.
- 11,987 dogs were impounded
  - 925 arrived dead
  - 50 had no outcome determined as of Dec 31, 2014
- 2,484 dogs were killed by shelter employees
  - 1,444 had been brought there by their owners
  - 1,040 were not owner relinquished and were picked up as strays
Using OCAC math, 11,987 dogs were admitted and 1,040 were killed who were not owner relinquished. They subtract the 1,444 (owner relinquished) from the 11,987 and come up with 10,543 dogs who were admitted (even the dead ones and the ones who were lost or unaccounted for). So 1,040/10,543 = 9.9%.
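OCAC’s arithmetic, as reconstructed above from the 2014 figures, can be written out as a short sketch:

```python
# OCAC's method: subtract owner-requested euthanasias from total impounds,
# then divide the remaining kills by that (inflated) denominator.
impounded = 11_987          # includes 925 dead on arrival and 50 with no outcome
owner_requested = 1_444     # owner-relinquished dogs that were killed
killed_non_owner = 1_040    # dogs killed that were not owner-relinquished

denominator = impounded - owner_requested   # 10,543
ocac_rate = killed_non_owner / denominator

print(f"{ocac_rate:.1%}")  # 9.9%
```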
FWIW, OCAC officials cited a rate of 9.6% even though the math shows that the rate is 9.9%.
Using my math, you subtract 975 dogs (the dead and the unaccounted for) from the 11,987, since you shouldn’t count dead animals and you can’t count animals for whom outcomes are not available. That gives us 11,012 dogs who entered alive and for whom outcomes were recorded. Of these 11,012 dogs, 2,484 were killed based on the decision of OCAC staff. Some of these had been brought to the shelter by their owners, and some had not, but OCAC staff made the decision to kill them. So 2,484/11,012 = 22.6% killed.
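The same figures run through my method look like this:

```python
# The author's method: count only live admissions with known outcomes,
# and count every dog the shelter staff decided to kill.
impounded = 11_987
dead_on_arrival = 925
no_outcome = 50
total_killed = 2_484        # includes the 1,444 owner-relinquished dogs

live_with_outcomes = impounded - dead_on_arrival - no_outcome  # 11,012
my_rate = total_killed / live_with_outcomes

print(f"{my_rate:.1%}")  # 22.6%
```

Same raw data, more than double the rate: the only differences are which dogs count as admitted and which deaths count as euthanasia.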
According to OCAC math, only 9.9% are being killed, but according to my math, more than twice as many (22.6%) are being killed.
Once again, County officials are doing the math that makes their position appear in a far better light.
I recently reviewed the data for more than 30 shelters in California, and none of them used OCAC’s method of computing the euthanasia rate. None include dead or missing animals. Putting those issues aside, more than half the shelters I examined used my method of dividing the total killed by the total admissions with outcomes. A minority do use the OCAC practice of subtracting out “owner requested euthanasia,” but most do not.
Examining the usage of “owner requested euthanasia,” I found that OCAC had the highest numbers, which might be expected since OCAC had the highest admissions. What disturbed me, however, was that OCAC had more than twice the average number of animals in this category. In other words, OCAC is getting far more owner requests for euthanasia than other shelters. This raises some disturbing questions about how animals get classified as “owner requested,” but I’ll have to save that for another day.
OCAC plays a switcheroo and gives the euthanasia rate for dogs instead of the overall rate, so that the number appears significantly lower. Then they report a rate that badly understates the reality: the actual figure (22.6%) is roughly 228% of the one they cite (9.9%). In both cases, County officials seem to be “cooking the books.”
Most elected officials have no real idea what they’re doing. They have very few facts at their disposal about anything, and little time to spend gathering those facts. They rely almost entirely on the data given to them by the staff. So when the information given them is distorted, the elected officials are going to make decisions based on inadequate facts.
If you think the euthanasia rate at the County Shelter is “less than 6%,” you’re not going to spend a lot of time trying to fix the shelter. 6% is a pretty good figure. If you think it’s above 40%, you’ll have a different attitude altogether.
It looks like County employees are playing with the data in a way that makes their job performance look better than it is. It’s not for me to say whether or not they do this deliberately, but it certainly appears that this is being done.
We have the same problem at the City. The quality of the data in many reports is so poor that had I received these reports while I was a University Professor, many of them would have received a “D” or an “F”. I’ve reported on this many times, and expect to continue to in the foreseeable future.
I shared my concerns with Supervisors Bartlett and Spitzer and am waiting for a reply. We’re all in this together trying to make our neighborhood, city, and county a better place to live. We’re not always going to make the best decision for everyone and we’re going to disagree because, quite frankly, we have different values and some have vested interests that put them in office. But we need to have a knowledge base that gives us the right information.
ABOUT THE AUTHOR
Dr. Jim Gardner is on the City Council for Lake Forest. You can check him out on LinkedIn and/or Facebook and you can share your thoughts about the City at Lake Forest Town Square on Facebook. His comments are not meant to reflect official City Policy.