
Six Experts Agreed the Bite Marks Matched. He Served 33 Years. The Method Was Never Tested.

Mechanism #15: Authority Without Foundation — Methods that never had scientific validation acquired the legal authority of science through courtroom repetition, institutional endorsement, and the trappings of expertise.

In 1982, a woman was raped and her husband beaten to death in their home in Newport News, Virginia. The assailant bit her on the legs. Six forensic analysts examined the bite marks and all six agreed: the marks matched the teeth of Keith Allen Harward, a sailor stationed nearby.

The victim never identified Harward. The only eyewitness identification came from a security guard — obtained under hypnosis. But six experts had spoken. The science was clear.

Harward was convicted and sentenced to life.

He served 33 years.

In 2016, DNA testing identified the actual perpetrator: Jerry Crotty, another sailor. Crotty had died in custody in 2006. Harward was exonerated, compensated $1 million by the state of Virginia, and released at age 60. He is now 67, living in rural North Carolina.

Here is what matters about this case: the bite mark analysis that convicted Keith Harward has no scientific basis. Not "was improperly applied." Not "had some methodological weaknesses." Four separate governmental scientific bodies have concluded that bite mark analysis does not meet the threshold of science. The method was never validated. It was used in courtrooms for decades before anyone checked whether it worked.

What Was Never Proven

In 2022, the National Institute of Standards and Technology released a draft report on bite mark analysis identifying three foundational premises — all unsupported by evidence:

- Human dentition is unique at the individual level: not proven
- That uniqueness transfers accurately to human skin: not proven
- Identifying characteristics can be accurately captured by analysis: not proven

Every link in the evidential chain is broken. False positive rates in studies typically exceed 10%. Yet the method was admitted in courtrooms across America for over 40 years. Only Texas has imposed a moratorium. In all other states, the decision rests with individual judges.
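The arithmetic behind "six experts agreed" is worth making explicit. A minimal, purely illustrative sketch, assuming a flat 10% false positive rate per examiner (the "typically exceed 10%" figure above, chosen here only for round numbers) and comparing two toy extremes, independent versus fully correlated errors:

```python
# Illustrative sketch only: why unanimous agreement among examiners using
# the same unvalidated method adds less assurance than it seems.
# The 10% rate is an assumed round number, not a measured figure.

fpr = 0.10  # assumed false positive rate per examination

# If six examiners' errors were statistically independent, a unanimous
# false match would be vanishingly rare:
p_unanimous_independent = fpr ** 6  # 0.1^6 = 1e-06

# But examiners view the same photographs, share the same training, and
# often know each other's conclusions. In the limit of fully correlated
# errors, six agreements are no more reliable than one examination:
p_unanimous_correlated = fpr  # still 1 in 10

print(f"independent errors: {p_unanimous_independent:.0e}")
print(f"correlated errors:  {p_unanimous_correlated:.0%}")
```

The reassurance of a unanimous panel rests entirely on the independence assumption, and examiners applying the same invalid criteria to the same evidence are anything but independent.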

At least 36 people have been exonerated after convictions involving bite mark evidence. Combined, they lost over 450 years.

The FBI's 96%

Bite marks are not an anomaly. They are the starkest example of a pattern that runs across forensic science.

In 2015, the FBI completed a review of microscopic hair analysis testimony from cases predating the year 2000. Of 268 cases in which FBI examiners testified to inculpate defendants, 257 — or 96% — contained erroneous statements. Twenty-six of twenty-seven FBI examiners had given flawed testimony. The errors reached 41 states.

From FBI hair analysis cases alone: 129 false convictions, 9 defendants already executed, 5 more who died on death row, and 1,918 combined years lost. More than half of the exonerees were Black.

Sources: FBI, National Registry of Exonerations, Innocence Project

Consider what the FBI was doing. An analyst would testify that a hair found at a crime scene was "microscopically similar" to the defendant's hair, then assign probabilities — one analyst told a jury there was "one chance in 10 million" that a hair did not belong to the defendant. That defendant, Santae Tribble, spent 28 years in prison before DNA exoneration. The hair turned out to be from a dog.

The Reckoning No One Listened To

The scale of the problem has been documented. Repeatedly. By the government's own scientific bodies.

In 2009, the National Academy of Sciences published a landmark report finding that most forensic disciplines lacked scientific foundation. In 2016, the President's Council of Advisors on Science and Technology went further, reviewing over 2,000 papers and establishing a framework for evaluating forensic methods based on "foundational validity" — whether a method has been tested and shown to actually work.

Their findings:

Foundational validity (PCAST 2016):
- DNA analysis (single source): valid
- DNA analysis (simple mixtures, ≤2 contributors): valid
- Latent fingerprint analysis: valid
- Firearms / toolmark analysis: not established
- Footwear analysis: not established
- Microscopic hair analysis: not established
- Bite mark analysis: not established

Three valid methods. Four without scientific foundation, all of which had been used in courts for decades. PCAST recommended that judges stop admitting evidence from unvalidated methods. The judicial system's response was, overwhelmingly, to keep admitting it, treating PCAST's findings as going to the "weight" of evidence rather than its admissibility. In other words: let the jury hear the junk science and decide how much to trust it.

The Gatekeeping That Doesn't Gate

The legal system has a mechanism for this. Daubert v. Merrell Dow Pharmaceuticals (1993) established that judges should serve as "gatekeepers" for scientific evidence, evaluating whether testimony is based on sufficient facts, reliable principles, and sound methodology.

In civil cases — where corporations have money to mount challenges — Daubert works. Expert testimony gets excluded regularly. Methodological scrutiny is real.

In criminal cases, Daubert is, in the words of legal scholars, "a near dead letter."

"By clothing itself in the trappings of 'science,' it conveys to judges and juries an undeserved impression of certainty."

— Federal judiciary, on forensic evidence lacking scientific validation

The structural reasons are clear. Criminal defendants are poor. Public defenders are overloaded. Daubert challenges require expensive expert witnesses. The prosecution controls almost all physical evidence — defense teams have no independent laboratory access. Judges, who are often elected, lack scientific training and face political incentives not to suppress evidence that might let defendants go free. Once a forensic method has been admitted by one court, subsequent courts cite that admission as precedent, creating a self-reinforcing loop that has nothing to do with the underlying science.

It's Still Happening

This is not history. In Colorado, former state Bureau of Investigation forensic scientist Missy Woods faces 102 felony charges for manipulating DNA evidence across 1,022 cases spanning her 29-year career.

Colorado Bureau of Investigation — The Woods Cases

Among the 1,022 cases:
- 472 sexual assault cases, including results reported as "No Male DNA Found" when male DNA was present
- 134 homicides
- 211 burglaries
- 58 assaults
- 47 robberies
- 19 kidnappings

A coworker raised concerns in 2014. Woods was formally accused in 2018, then reinstated. An intern uncovered the anomalies in 2023. Woods pleaded not guilty on February 11, 2026; trial is set for September 24 through October 30, 2026. The cost to the state so far exceeds $11 million.

In Illinois, the University of Illinois Chicago forensic toxicology lab used discredited THC testing methods and faulty machinery across 2,200+ cases. Lab management knew the equipment was unreliable and did not notify law enforcement. Eighteen people were exonerated in January 2025. The total number of affected defendants remains unknown because no systematic notification program exists.

That last detail is the structural tell. When a drug is found to be dangerous, the FDA issues a recall. When an airplane part is defective, the FAA mandates inspection of every aircraft that used it. When a forensic method is discredited or an analyst is found to have fabricated results, there is no equivalent mechanism. No program systematically identifies all cases touched by the bad science. No institution is required to notify defendants. No automated process flags convictions for review.

The System's Response to Its Own Failure

If this story were about a knowledge system correcting itself — slowly, painfully, but genuinely — it would end with increased funding for forensic science reform, mandatory validation standards, and retrospective case review.

Instead, the FY2026 federal budget proposes cutting Paul Coverdell Forensic Science Improvement Grants by 71% — from $35 million to $10 million. The Debbie Smith DNA Backlog Grant Program is funded below the cap Congress authorized. State crime labs are buckling under demand, losing analysts to the private sector because they can't compete on salary, while training replacements takes months to years.

The methods were never validated. The reports documenting this were ignored. The people convicted by junk science have no systematic path to review. And the funding to fix any of it is being cut.

What This Mechanism Is

My previous fourteen posts mapped how knowledge systems fail within science — definition manipulation, methodology creating findings, incentive amplification, unfalsifiable entrenchment. All of those involve a scientific apparatus that exists but malfunctions.

Forensic science reveals something starker. For bite marks, hair analysis, footwear comparison, and firearms toolmarks, the scientific apparatus never existed. The authority was constructed from the aesthetics of science — the lab coat, the microscope, the technical language, the "forensic" prefix, the expert witness designation — without its substance. A method used in a laboratory setting by a person called an expert who uses words that sound scientific becomes science in the eyes of a jury. No empirical validation required.

And unlike a flawed hypothesis that might eventually be tested and rejected, a method embedded in legal precedent has a self-preservation mechanism science doesn't: stare decisis. Once admitted, it is cited by future courts not because the science improved but because a previous judge admitted it. The legal system's own architecture makes correction structurally harder than error.

Keith Harward's 33 years weren't taken by a scientific mistake. They were taken by something that was never science at all — but looked enough like it to convince twelve people in a room.