A Short History of Drug Discovery

Drug Discovery in the 20th Century

The development of small-molecule therapeutic agents for the treatment and prevention of disease has played a critical role in the practice of medicine for many years. Indeed, the use of natural extracts for medicinal purposes goes back thousands of years; however, only in the past half century or so has the search for new drugs become a truly scientific endeavor. In 1900, one-third of all deaths in the U.S. were attributable to three general causes that are now largely preventable and/or treatable: pneumonia, tuberculosis, and diarrhea. By 1940, the chance of dying from one of these three causes had fallen to 1 in 11; by 2000, the odds were down to 1 in 25.

Of the three, only pneumonia remains on the list of the top ten causes of death, which is now led by more complex conditions such as cardiovascular disease and cancer. Other factors, such as improved sanitation and vaccination, certainly played a role in the increase in life expectancy during the twentieth century (from less than 50 years in 1900 to more than 77 years in 2000), but the availability of drugs to control infection, hypertension, and hyperlipidemia, and to some extent even cancer, also contributed substantially to the improvement in our collective health over that period.

The history of drug discovery in the pharmaceutical industry and academic laboratories over the past half-century shows a progression of discovery paradigms that began shortly after “miracle drugs” such as the penicillins became available to the public after World War II. That same decade also saw the rise of synthetic organic chemistry, which had progressed to the point that the large-scale preparation of “non-natural” drugs or drug candidates was economically feasible.

The Rise of Synthesis, Sophisticated Structure Determination Methods, and Powerful Computers

Synthetic organic chemistry was a very important advance at the time, particularly because bacteria had begun to develop resistance to the natural penicillins. Synthetic chemistry provided a solution: the ability to prepare analogs (retaining the natural β-lactam core but carrying a non-natural side chain) that proved to be active against resistant strains.

Still, there were many diseases for which no effective therapeutic interventions existed, and synthetic chemistry offered the promise that if a drug could be envisioned, it could be made. It would be a few more years, though, before biology would catch up by providing the detailed, molecular-level information needed to do so.

As synthesis became progressively more sophisticated, it assumed a leading role in drug development, particularly in refining or optimizing the activity of known drugs. The resulting “methyl, ethyl, isopropyl” analog model of drug discovery was highly empirical but quite successful for several decades, through the 1970s.

Another revolution began at about the same time, sparked by the commercial availability of extraordinarily powerful spectrometers (especially NMR and MS) and separation techniques (HPLC) for determining the structures of minute quantities of biologically active natural products. These compounds were isolated, identified, and then screened in panels of assays for various types of desired activity, e.g., toxicity against cancer cell lines.

Then, in the following decade, attention began to shift away from random searches for active natural products toward a new, “rational” computational paradigm for drug discovery: computer-aided drug design. This shift was driven not only by dramatic increases in computing power in the early 1980s, but also by significant concurrent advances in structural biology (especially protein crystallography) that provided a continuous stream of new protein structures on which to base computational drug design studies.

A Return to Empirical Methods

While so-called “rational drug design” produced a few notable successes, this discovery model performed surprisingly poorly overall and, beginning in the early 1990s, was supplanted rather quickly (though not completely) by a return to largely empirical methods: small-molecule library synthesis and high-throughput screening.

Again, it was technological advances that drove this shift: sophisticated robotics and biological techniques for simultaneously assaying thousands of compounds, in concert with improved methodology in synthetic chemistry. But now, less than 20 years later, the output of the big pharma drug pipeline appears to be at a new low, and there is a growing consensus that these high-throughput screening programs have also failed to deliver, perhaps because of a lack of true chemical diversity in the very large industrial libraries that have been synthesized and screened. New ideas for opening the pipeline are clearly needed.

Preparing for Drug Discovery Tomorrow

Developing the therapeutic agents of the future will clearly involve the same basic science disciplines that have always been at the core of drug discovery: structural biology, to provide information about the target biomacromolecules; chemistry, to design and synthesize drug candidates; and pharmacology, to determine the effects of the interaction between drug and target. Taking drug discovery to the next level may require an entirely new approach, but it will more likely result from the introduction of new disciplines and/or greatly improved technologies into the process.

It seems likely that integrating sophisticated new methods from computation, bioinformatics, pharmacogenomics, engineering, and/or nanotechnology into the process will drive the next stage of advances in drug discovery.