The Pharmaceutical Journal
Meetings and Conferences
World Congress of Pharmacy and Pharmaceutical Sciences 2002 summary
The human genome: a brave new world of chemical proteomics
A symposium entitled "Building in safety during drug development", jointly organised by the Industrial Pharmacy section and the Laboratories and Medicines Control Services section, was held on 5 September. It examined opportunities for the pharmaceutical industry to develop innovative new drugs for unmet medical needs more quickly and efficiently.
In her introduction, chairman Dr Linda Hakes (Schwarz-Pharma, Germany) noted that regulators, patients and the health care professions have variously expressed concern that safety might be compromised if new medicines are developed and approved too quickly. This fear could be allayed by the use of modern scientific methods and technologies to reduce the time needed to bring innovative new medicines to the market while maintaining the highest standards of safety.
Unlocking the pharmacology of the human genome
In a presentation of computer-driven videoclips portraying the virtual behaviour of prospective molecules and active sites, Dr David Bailey (Purely Proteins Ltd, Cambridge, UK) described the joint United States and European publication in 2001 of the first draft sequence of the human genome as "just the beginning of a voyage of discovery". Over 50 per cent of the "genomic landscape is still uncharted biochemical territory", and he observed that of almost 5,000 current drug "targets" identified by molecular mechanistics, fewer than one fifth fall into known "pharma" territory.
Dr Bailey presented a thematic integration of discovery technologies leading to a drug candidate. It seems that the traditional sieve of pharmacology screening is multiplying dramatically and, he suggested, the majority of future drugs will be large bio-molecules, while much of the remaining small molecule research would increasingly be driven by computers. Time to development for lead drug candidates is already near optimum and is not expected to get much shorter, he said.
Taking the topical example of the development of HIV protease inhibitors, he proceeded to identify mechanisms to improve understanding of three key discovery end-points: targets, leads and candidates. Using elegant videoclips, he portrayed the structure-based design of inhibitors of HIV protease, the most highly researched protein in the world, relating it to the metallo-, cysteine- and serine-subfamilies of proteases. His interactive 3D computer models facilitated prediction of the selectivity of binding sites, and he speculated on "what would happen if" a specific drug structure were to enter them. Fine mapping of sites and their associated ligands is essential to define accurately the available "pockets" and side-chain flexibility at the site, and to construct feasible drug scaffolds. However, as a Roche 3D screening program had shown, even with good binding and (seemingly) limited drug molecule movement, a scaffold could still predict a million potential molecules. He demonstrated virtual screening of existing compounds, matching fragments and complete molecules to specified sites, with energy and volume criteria for optimum binding. The docking procedure revealed how, over a range of targets, candidate molecules might "learn" to fit into sites by "stochastic tunnelling". Such sophisticated software could assess binding efficiency, identify new receptors and compare superimposed structural features, possibly at "unknown receptors and uncharted binding sites". He called this "chemical proteomics".
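The energy-and-volume filtering described can be caricatured as a simple screen over candidate fragments. This is an illustrative sketch only: the fragment names, pocket dimensions and thresholds below are invented, and real docking software scores binding far more elaborately.

```python
# Toy virtual-screening filter: keep fragments whose predicted binding
# energy and molecular volume both fit a target pocket.
# All names and numbers are hypothetical.
pocket = {"max_volume": 320.0,    # cubic angstroms
          "energy_cutoff": -7.0}  # kcal/mol; more negative = tighter binding

candidates = [
    {"name": "frag01", "volume": 290.0, "energy": -8.2},
    {"name": "frag02", "volume": 350.0, "energy": -9.1},  # too big for pocket
    {"name": "frag03", "volume": 300.0, "energy": -5.4},  # binds too weakly
]

hits = [c["name"] for c in candidates
        if c["volume"] <= pocket["max_volume"]
        and c["energy"] <= pocket["energy_cutoff"]]
print(hits)  # ['frag01']
```

In practice a filter like this is only the first sieve; surviving fragments would go on to full docking and scoring.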
Proceeding from the prediction of candidates at virtual sites, he came to the molecular synthesis phase. He identified the "chemical drivers" as combinatorial chemistry, high throughput parallel analysis and equipment miniaturisation. These drivers have served well for small molecules but molecules of the new era are much larger and it is, said Dr Bailey, "necessary to engineer selectivity" on a full scale running from low molecular weight to large, up to protein and, ultimately, to cellular level.
In the final section of this keynote presentation, Dr Bailey identified the "key technology hurdle": data integration. He speculated that chemical design might present 140,000 design strategies for an average site with 10,000 scaffolds per strategy, examined in 100-member combinatorial "libraries", accounting for some 10^11 possible enumerated compounds. In a screening experiment with 100,000 human genes per chip, 10 time points each with 10 doses and 10 replicates would collectively generate 10^8 data points per experiment.
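Dr Bailey's arithmetic can be checked directly. The figures are those quoted in the talk; the script itself is merely illustrative:

```python
# Design-space size: strategies x scaffolds per strategy x library size
strategies = 140_000            # design strategies for an average site
scaffolds_per_strategy = 10_000
library_size = 100              # members per combinatorial library

compounds = strategies * scaffolds_per_strategy * library_size
print(f"enumerated compounds: {compounds:.1e}")  # 1.4e+11, i.e. of order 10^11

# Screening data volume: genes x time points x doses x replicates
genes, time_points, doses, replicates = 100_000, 10, 10, 10
data_points = genes * time_points * doses * replicates
print(f"data points per experiment: {data_points:.0e}")  # 1e+08, i.e. 10^8
```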
What more, then, in this "brave new world"? With 300,000 proteins and 30,000 genes, "would there be a drug for every gene?" he asked. "No!" There are too many issues of selectivity, multiple sites and redundant functions. But quite possibly there would be drugs to suit a large number of genes. "Chemical proteomics will tell us soon," he said. This, together with ligand design, held the key to the design of many new, safe drugs flowing from the human genome project.
Modern technologies in analytical profiling
Professor Bill Dawson (Proteome Sciences, UK) noted that analytical specifications for new medicines have "changed dramatically over the past decade or so". With so many biotechnology products entering clinical trials, "technologies that were previously in the protein scientists' domain have been added to the analysts' armamentarium". "Complex analytical equipment," he went on, "has become user-friendly and more affordable." Concurrently, computer power has increased beyond all recognition, and the power of relational databases has provided the capacity for handling information from many data sets.
Classical spectroscopic techniques have been refined, and new computational techniques incorporated here too, which has increased sensitivity and shortened turn-round times. He instanced the advantages of near infrared (NIR) spectrometry for non-invasive identification of solids: the instruments are relatively cheap (ca £50,000) and can also be used online in production. To demonstrate NIR versatility, Professor Dawson displayed some 2D orthogonal plots that discriminated between species of digitalis and between the geographic origins of cannabis seizures.
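Two-dimensional "orthogonal plots" of this kind are commonly produced by projecting the spectra onto their first two principal components, although the talk did not specify the method. A minimal sketch under that assumption, using synthetic "spectra" (all data invented):

```python
import numpy as np

# Synthetic "NIR spectra": two groups of samples whose absorbance
# profiles differ slightly (all data invented for illustration).
rng = np.random.default_rng(0)
group_a = rng.normal(loc=0.50, scale=0.01, size=(10, 100))  # 10 samples, 100 wavelengths
group_b = rng.normal(loc=0.55, scale=0.01, size=(10, 100))
spectra = np.vstack([group_a, group_b])

# Principal-component projection: centre the data, then use the top two
# right singular vectors as the axes of the 2D plot.
centred = spectra - spectra.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ vt[:2].T  # shape (20, 2): one point per sample

# Samples from the two groups separate along the first component.
print(scores[:10, 0].mean(), scores[10:, 0].mean())
```

With real NIR data the same projection would cluster, for example, spectra of different digitalis species into distinct regions of the plot.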
Sophisticated separative technologies have also evolved, such as robotically driven 2D electrophoresis. Many traditional separative systems are now routinely coupled with a variety of output instruments, notably bench-top mass spectrometers (MS) but also nuclear magnetic resonance, ultraviolet and infrared spectrometers. Equipment miniaturisation has facilitated the examination of sub-microgram quantities, eg, a nano-scale high performance liquid chromatography system coupled to time-of-flight (TOF) MS, and sensor assemblies incorporating immunological reagents on novel supports.
Professor Dawson stressed that such technologies enable the analysis of complex biological large molecules, such as complete proteins. He reviewed some main areas of application of MS to proteins, including the identification of each peak of a protein digest, using the latest MALDI (matrix-assisted laser-desorption ionisation) TOF-MS, strand comparisons and "de novo" sequencing of hitherto unknown structures.
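Identifying each peak of a protein digest, as described, amounts to matching observed peptide masses against theoretical digest masses for candidate proteins. A toy sketch of that matching step; the masses, protein names and tolerance below are invented for illustration:

```python
# Toy peptide-mass fingerprint: match observed MALDI-TOF peak masses (Da)
# against theoretical tryptic-digest masses for candidate proteins.
# All names and numbers are hypothetical.
observed_peaks = [842.5, 1045.6, 1479.8, 2211.1]

theoretical_digests = {
    "protein_A": [842.5, 1045.6, 1479.8, 1638.9, 2211.1],
    "protein_B": [905.1, 1045.6, 1750.2],
}

def match_score(peaks, digest, tol=0.5):
    """Fraction of observed peaks within `tol` Da of a theoretical mass."""
    hits = sum(any(abs(p - m) <= tol for m in digest) for p in peaks)
    return hits / len(peaks)

best = max(theoretical_digests,
           key=lambda name: match_score(observed_peaks, theoretical_digests[name]))
print(best)  # protein_A: all four observed peaks match its digest
```

Real search engines add refinements (missed cleavages, modifications, statistical significance of the match), but the core idea is this mass comparison.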
The use of these analytical systems, both independently and in combination, also provides an opportunity to produce a meaningful specification much more rapidly than before, but it is essential to tailor the derived specification to the use of the product. Process control and the monitoring of successive production stages (synthesis, polymorph resolution, solution mixing, blending, granulation, drying and coating) can also be addressed through scanning electron microscopy, solid phase NMR or reflective NIR.
Professor Dawson recognised current trends to ensure that "regulatory frameworks should reflect specifications that are case-specific" and considered that the huge analytical advances are now sufficiently robust for acceptance in drug application dossiers. Some techniques, such as NIR, are "readily transferable to the third world" and, inter alia, useful for detecting adulteration of herbal products at source, at least qualitatively. He concluded that regulators should be "ready to accept new technologies earlier rather than later".
Discussing the mechanism of arrhythmia, Professor Luc Hondeghem (pharmaceutical consultant, Belgium) sought to resolve a paradox concerning the significance of prolongation of the Q-T interval (the time from the onset of ventricular depolarisation to the end of repolarisation). He distinguished between prolongation of APD (action potential duration), widely described as a major mechanism of anti-arrhythmic action, and Q-T prolongation, which is frequently regarded as a surrogate endpoint for pro-arrhythmia. He differentiated three potentially dangerous accompaniments of APD prolongation: reverse use dependence (R), beat-to-beat instability (I) and triangulation of the action potential (T). When over 700 chemicals were subjected to electrophysiological characterisation, it was found that prolongation of APD accompanied by R/I/T activity was associated with pro-arrhythmia. However, even for those agents that exhibited R/I/T activity, prolongation of APD itself correlated with a reduction of pro-arrhythmia.
He reported animal studies with sub-endocardial and epicardial recording of electrode potential differences: for statistical reasons, these tests required large numbers of animal hearts. He instanced increasing concentrations of almokalant in the rabbit heart, which led to increasing prolongation of APD and pro-arrhythmia. However, low concentrations of erythromycin that lengthened the APD without causing any R/I/T actually reduced the pro-arrhythmia of almokalant. More generally, when agents markedly prolonged the APD without any R/I/T action, arrhythmias were reduced.
Professor Hondeghem described a blinded study in which three pharmaceutical companies submitted 41 vials containing therapeutic agents to a SCREENIT computerised system, which automatically quantifies APD prolongation and the R/I/T parameters in Langendorff-perfused rabbit hearts. He reported successful identification of all 21 vials containing a drug known to prolong the Q-T interval, and all agents reported to be torsadogenic (causing the potentially fatal torsade de pointes) in the clinic were also identified by the test as potentially dangerous. He summarised that the SCREENIT system had proved highly reproducible: it had correctly recognised APD prolongation and accurately identified pro-arrhythmic drugs, without erroneously incriminating safe agents. The innovative technique that he had pioneered involved far fewer experiments (in this case 20 instead of 154), a much shorter assessment time (one day instead of six months) and spared many animals from experimentation.
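Two of the markers quantified by systems of this kind can be illustrated with a toy calculation on a series of action-potential durations. The SCREENIT algorithms themselves were not described in the talk; the measures, data and comparison below are invented for illustration only:

```python
# Toy illustration of two pro-arrhythmia markers discussed in the talk:
# triangulation (here caricatured as APD90 - APD30, reflecting slowed
# late repolarisation) and beat-to-beat instability (variation between
# successive APD90 values). All numbers are hypothetical.

def triangulation(apd90, apd30):
    """Difference between 90% and 30% repolarisation times, in ms."""
    return apd90 - apd30  # larger = more "triangular" action potential

def instability(apd90_series):
    """Mean absolute beat-to-beat change in APD90, in ms."""
    diffs = [abs(b - a) for a, b in zip(apd90_series, apd90_series[1:])]
    return sum(diffs) / len(diffs)

# Hypothetical APD90 recordings (ms): a stable action potential...
stable = [210, 212, 211, 210, 212]
# ...versus an unstable one, as might be seen with a torsadogenic agent
unstable = [250, 290, 255, 300, 260]

print(triangulation(apd90=212, apd30=160))  # 52 ms
print(instability(stable))                  # small beat-to-beat variation
print(instability(unstable))                # much larger variation
```

The point of the comparison is that APD prolongation alone (a longer but stable, square action potential) reads very differently from prolongation accompanied by instability or triangulation.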
In the final paper, Dr Jeroen Aerssens (University of Maastricht, Netherlands) discussed means to "identify patients at risk of adverse drug reactions". Asking "why do compounds fail during clinical drug development?", he instanced variability in drug metabolism (in pharmacokinetic safety studies), difficulty in identifying patients who would respond to the drug (an efficacy problem), or just plain unfavourable economics. How then could pharmacogenomics help, and would it revolutionise medical practice? Dr Aerssens offered "hope but also reality": "Today, drug development is based on undifferentiated treatment of large populations" and one drug "does not fit all patients. ... Tomorrow, however, drug development will take account of variation between individuals."
He recalled that an individual's genetic make-up largely determines his risk for adverse drug reactions and extent of drug effectiveness. He claimed that "advances in biotechnology and the human genome project have led to a new discipline of pharmacogenomics" and that this study aimed better to understand these inter-individual variations in drug responses. Dr Aerssens noted that variation between individuals means that a drug is often only effective in a sub-set of patients, depending on whether the relevant enzyme is metabolising "normally" or metabolism is by a mutant gene. He suggested that such genetic control might dramatically affect therapeutic ratios conventionally ranked into "toxic", "effective" and "ineffective" dosage.
He had examined adverse drug reaction literature from 1997 to mid-2002. Out of 27 drugs frequently indicted in ADR studies, almost 60 per cent were metabolised by at least one enzyme with known functional (potentially genetic) variation. He commented that genetic variation affects toxicity, efficacy and dosing. Taking one specific example, he examined various abnormalities for atypical patients in respect of the enzyme TPMT (thiopurine methyl transferase), which is involved in S-methylation of a series of purine-analogue drugs, such as azathioprine.
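The TPMT example translates naturally into a genotype-guided dosing decision: patients carrying two non-functional TPMT alleles methylate thiopurines poorly and are at high risk of toxicity on standard doses. A minimal sketch of that reasoning; the allele labels and dose fractions are illustrative only and are not clinical guidance:

```python
# Toy genotype-to-phenotype-to-dose sketch for a TPMT-metabolised
# thiopurine such as azathioprine. Values are hypothetical.
FUNCTIONAL_ALLELES = {"TPMT*1"}  # wild-type, fully functional

def tpmt_phenotype(allele1, allele2):
    """Classify a patient by the number of functional TPMT alleles."""
    functional = sum(a in FUNCTIONAL_ALLELES for a in (allele1, allele2))
    return {2: "normal metaboliser",
            1: "intermediate metaboliser",
            0: "poor metaboliser"}[functional]

DOSE_FRACTION = {  # fraction of standard dose -- hypothetical figures
    "normal metaboliser": 1.0,
    "intermediate metaboliser": 0.5,
    "poor metaboliser": 0.1,
}

genotype = ("TPMT*1", "TPMT*3A")  # one variant allele
phenotype = tpmt_phenotype(*genotype)
print(phenotype, DOSE_FRACTION[phenotype])  # intermediate metaboliser 0.5
```

The same pattern (genotype, then phenotype class, then adjusted regimen) underlies most proposals for applying pharmacogenomics to ADR prevention.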
He said that applications of pharmacogenomics relied on two main technologies: (i) identification of, and screening for, inherited variations by genotyping and (ii) monitoring the expression levels of thousands of genes by suitable microarray technology. He commented that these tools could be applied to identify patients at risk of ADRs during drug development. Taking a wide range of drugs, as different as cardiovascular and psychoactive agents, he reviewed their metabolism by a single genotype, remarking that such knowledge was available in clinical research but rarely in general practice. Dr Aerssens concluded with a list of factors to consider in applying pharmacogenomics to ADRs: severity, frequency and homogeneity of the reaction, the balance between the cost of the drug and the cost of ADR treatment, and the consequences of false positive or negative diagnoses.
Citation: The Pharmaceutical Journal URI: 20007786