An early-18th-century English mathematician and Presbyterian minister, Thomas Bayes never used the phrase "Bayesian analysis." He published only two papers in his lifetime: one theological, and one defending Newton's calculus against criticism from the philosopher George Berkeley.
But late in life, Bayes became interested in probability. He died in 1761, and two years later his friend Richard Price arranged for a public reading of a paper in which Bayes proposed what became known as Bayes' Theorem.
This formula prescribes how to update an initial belief about an event in light of new data. To illustrate with a modern example, suppose one assigns a 1-in-1,000 probability that a particular suspect committed a crime, and then the suspect's DNA is found to match DNA recovered from the crime scene. Bayes' rule provides a mathematical way to compute the probability that the suspect committed the crime given this new evidence.
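The update described above can be sketched in a few lines of Python. The 1-in-1,000 prior comes from the example; the two match probabilities (a guaranteed match if guilty, a one-in-a-million coincidental match if innocent) are assumed figures chosen purely for illustration:

```python
# Bayes' rule: P(guilty | match) = P(match | guilty) * P(guilty) / P(match)
# The two likelihoods below are hypothetical values used for illustration.

prior = 1 / 1000            # initial belief that the suspect is guilty
p_match_if_guilty = 1.0     # assume a guilty suspect's DNA always matches
p_match_if_innocent = 1e-6  # assumed chance of a coincidental match

# Overall probability of seeing a match (law of total probability)
p_match = p_match_if_guilty * prior + p_match_if_innocent * (1 - prior)

posterior = p_match_if_guilty * prior / p_match
print(f"{posterior:.4f}")  # updated probability of guilt given the match
```

Under these assumed numbers, the DNA match lifts the probability of guilt from 0.001 to roughly 0.999, showing how strongly Bayes' rule can revise an initial belief when the evidence would be very unlikely otherwise.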
Assigning probabilities to unknown truths was a controversial methodology, and Bayesian statistics only became widely accepted among statisticians in the last century. But it is now the basis for a wide variety of applications, from assigning authorship of the disputed Federalist Papers to discovering patterns in complicated biomedical data. Increasingly, it plays an important role in physicians' clinical decision making.
All of this came from work done after Bayes's death, and from the development of computers powerful enough to do the heavy number crunching. For that reason, it is speculated that he never imagined the usefulness of the theorem named after him.