In classical Greek and Roman societies the signs of death were the absence of a heartbeat and breathing, and the onset of putrefaction. In medieval times a candle was held to the mouth; a flicker of the flame was taken as a sign of life.
Lend me a looking glass;
If that her breath will mist or stain the stone,
Why then she lives.
Shakespeare, King Lear
However, these signs were rejected in 1740 by the anatomist Jacques-Bénigne Winslow, who recommended that resuscitation be attempted on seemingly lifeless patients by stimulating various parts of the body with the 'juices of onions, garlic and horse-radish . . . whips and nettles . . . and by hideous Shrieks and excessive Noises.' Pins were also inserted under the toenails.
In 1742 the French physician Jean-Jacques Bruhier documented fifty-two examples of supposed live burial in his book Dissertation sur l'incertitude des signes de la mort. This fed the public's fears of premature burial, and placed growing pressure on doctors to come up with more reliable 'signs of death' as a diagnostic tool. German doctors concluded that putrefaction was the only reliable indicator of death. A number of cultures include an interval between death and disposal of the body that allows time for putrefaction. For example, the Leichenhäuser (corpse houses) of 19th-century Germany provided a place where 'corpses' were kept under surveillance until putrefaction was apparent.
The 'safety coffin' highlights people's uncertainty about pinpointing the moment of death and their fear of being buried alive. It provided a means for the 'deceased' to signal the world above for help and rescue.
More immediate and drastic techniques have also been employed to ensure that the dead were really dead. Some people requested in their wills that their head be severed, or their heart pierced, prior to burial to ensure they were truly dead. The invention of the stethoscope in 1819 removed the need for such extreme measures.
But medical intervention has also increased uncertainty. The invention of the artificial respirator in the 1950s meant that the cells of the body could be kept alive in the absence of a natural heartbeat. By 1968, a year after the first heart transplant was performed, it was already clear that a diagnosis of death was needed that was not based on the heartbeat. A committee based at the Harvard Medical School in the USA came up with a set of diagnostic criteria for death, known as the brain death criteria.