Expert view: Optimising the hit-to-lead workflow
Posted: 13 December 2017 | Dr. Paul Wylie (Head of Applications at TTP Labtech)
Hits identified in high-throughput screening (HTS) are evaluated in the hit-to-lead phase of drug discovery, where an iterative optimisation process draws on a variety of techniques to identify promising lead compounds to take forward into lead optimisation.
Typically, the workflow starts by confirming the hits identified in the initial HTS, both by repeating the original assay and by running orthogonal assays based on different assay technologies. The confirmed hits can then be further triaged by screening concentration-response curves to evaluate the relative affinities, efficacies and selectivities of the compounds for the biological target of interest.
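To make this triage step concrete, the sketch below fits a four-parameter logistic (Hill) model to a concentration-response dataset to estimate a potency value (IC50). This is a minimal illustration, not a description of any particular instrument's analysis software: the compound data are invented, and SciPy is assumed to be available.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_param_logistic(conc, bottom, top, ic50, hill):
    """Four-parameter logistic (Hill) model for an ascending concentration-response curve."""
    return bottom + (top - bottom) / (1.0 + (ic50 / conc) ** hill)

# Hypothetical 8-point dilution series for one confirmed hit
# (concentration in uM, response in % inhibition)
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
response = np.array([2.0, 5.0, 12.0, 30.0, 55.0, 78.0, 92.0, 97.0])

# Initial guesses: bottom, top, IC50, Hill slope
p0 = [0.0, 100.0, 1.0, 1.0]
params, covariance = curve_fit(four_param_logistic, conc, response, p0=p0)

bottom, top, ic50, hill = params
print(f"IC50 ~ {ic50:.2f} uM, Hill slope ~ {hill:.2f}")
```

Fitting each confirmed hit in this way yields comparable potency values that can be used to rank compounds before committing to analogue synthesis.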
Analogues with structural profiles similar to those of the hit compounds are then tested against a variety of parameters, including toxicity, metabolic stability and solubility. These assays commonly include physiologically relevant cell-based assays, often run in cells from different species.
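As a sketch of how such analogue sets might be prioritised computationally before assay capacity is committed, the example below ranks candidate structures by Tanimoto similarity to a hit compound using Morgan fingerprints. The choice of the RDKit toolkit and all of the SMILES strings here are assumptions made for illustration, not part of the workflow described above.

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def morgan_fp(smiles):
    """Morgan (circular) fingerprint, radius 2, 2048 bits."""
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

# Hypothetical hit and analogue structures (SMILES invented for illustration)
hit_smiles = "c1ccc2c(c1)ncc(n2)N"          # an aminoquinoxaline-like hit
analogues = {
    "analogue_1": "c1ccc2c(c1)ncc(n2)NC",   # N-methylated amine
    "analogue_2": "c1ccc2c(c1)ncc(n2)O",    # amine swapped for hydroxyl
    "analogue_3": "c1ccccc1",               # benzene, a weak match
}

# Rank analogues by Tanimoto similarity to the hit, highest first
hit_fp = morgan_fp(hit_smiles)
ranked = sorted(
    ((name, DataStructs.TanimotoSimilarity(hit_fp, morgan_fp(smi)))
     for name, smi in analogues.items()),
    key=lambda pair: pair[1], reverse=True,
)
for name, sim in ranked:
    print(f"{name}: Tanimoto similarity {sim:.2f}")
```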
This process needs to be carried out systematically, and ideally in a relatively high-throughput manner, to allow the broadest possible range of chemical analogues to be tested. Where structural information about the target is available, structure-based drug design techniques using molecular modelling, together with methodologies such as X-ray crystallography and NMR, can also be applied to develop the SAR (structure-activity relationship) faster and in a more focused way, as well as potentially to identify new binding sites on the target protein.
An ideal hit-to-lead strategy should therefore be amenable to a high-throughput format, allowing the widest possible chemical space to be interrogated in order to establish optimal activity against the biological target.
TTP Labtech’s automation solutions are designed to streamline these workflows by addressing common bottlenecks in sample management, liquid handling and assay screening. Our innovative technologies focus on minimising costs, eliminating laborious or problematic processes, improving data quality and preserving sample integrity.