AI Model Predicts Dual Target Drugs

How AI may accelerate drug design for complex diseases.

by Cami Rosso · Psychology Today
Reviewed by Tyler Woods

Deep learning, a branch of artificial intelligence (AI), is rapidly transforming the biotechnology and pharmaceutical industries, and new AI models based on ChatGPT-like algorithms are emerging in many industry verticals. A new study published in Cell Reports Physical Science by University of Bonn researchers shows how a ChatGPT-like AI model can predict dual-target drugs that inhibit two enzymes at the same time, a capability that could help treat complex diseases and accelerate polypharmacology.

“Compounds with defined multi-target activity are candidates for the treatment of multi-factorial diseases,” wrote University of Bonn professor Dr. Jürgen Bajorath and co-author Sanjana Srinivasan.

Polypharmacology is the design or use of pharmaceutical agents that act on multiple drug targets at the same time. Most diseases are complex diseases, or multifactorial diseases, caused by the interaction of a combination of factors, such as genetic, lifestyle, and environmental factors. Examples of multifactorial diseases include bipolar disorder (manic depression), schizophrenia, migraine headaches, epilepsy, common cancers, Type 2 diabetes, Alzheimer’s disease, Parkinson’s disease, rheumatoid arthritis, osteoporosis, asthma, kidney disease, multiple sclerosis, and autoimmune disorders, among others.

Artificial intelligence is making significant inroads in the pharmaceutical and biopharmaceutical industries, especially in research and development. By 2032, the worldwide market for AI in drug discovery is expected to reach $13 billion in revenue, according to Statista.

“Here, we introduce transformer-based chemical language model variants for the generative design of dual-target compounds,” the research team wrote.

Large language models (LLMs) such as ChatGPT by OpenAI are a type of artificial neural network with a transformer deep learning architecture; the “T” in ChatGPT stands for transformer. Transformers make predictions by spotting patterns in massive amounts of text, and their self-attention mechanism lets them weigh the parts of a sequence that are most relevant while processing all parts of the sequence simultaneously. Google introduced transformers in the 2017 paper “Attention Is All You Need,” authored by Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, and Illia Polosukhin.
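The self-attention mechanism described above can be sketched in a few lines of NumPy. This is a minimal, illustrative version of scaled dot-product attention, not the study's model: the input tokens and weight matrices are random placeholders standing in for learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Project each token embedding into a query, key, and value vector
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Scores measure how relevant every token is to every other token
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Each row of weights is a probability distribution over the sequence
    weights = softmax(scores, axis=-1)
    # Output: each token's new representation is a weighted mix of values
    return weights @ V, weights

# 4 "tokens" with 8-dimensional embeddings; random weights for illustration
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape)              # (4, 8)
print(weights.sum(axis=-1))   # each row sums to 1
```

Because every token attends to every other token in one step, the model can relate distant parts of a sequence without processing it strictly left to right, which is what makes transformers effective on long strings, whether sentences or chemical notations.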

For this study, Bajorath and Srinivasan created a chemical language model that was trained on pairs of chemical notation strings of text called Simplified Molecular Input Line Entry System (SMILES) strings. These SMILES strings are a compact method of representing a molecule through sequences of letters and symbols. Their chemical language model was trained on over 75,000 pairs of strings where one string represents a molecule that works on one target protein, and the other string represents a compound that works on the same target protein as well as affects a different target protein. Then, the AI model was fine-tuned with pairs of strings so it could “learn” targetable compounds associated with various classes of proteins.
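To make the data format concrete, here is a small sketch of what SMILES strings and a single-target/dual-target training pair look like. The molecules shown are well-known drugs used purely for illustration, not compounds from the study, and the pairing itself is a placeholder showing only the shape of the data, not real pharmacology.

```python
# SMILES encodes a molecule's atoms and bonds as a line of text.
aspirin = "CC(=O)Oc1ccccc1C(=O)O"        # acetylsalicylic acid
caffeine = "Cn1cnc2c1c(=O)n(C)c(=O)n2C"  # caffeine

# A training pair as described in the study: (single-target compound,
# dual-target compound). The pairing below is illustrative only.
training_pair = (aspirin, caffeine)

def tokenize(smiles: str) -> list[str]:
    # Minimal character-level tokenization; real chemical language models
    # use multi-character tokens (e.g., "Cl", "Br", ring-closure digits).
    return list(smiles)

print(tokenize(aspirin)[:6])  # ['C', 'C', '(', '=', 'O', ')']
```

Once molecules are tokenized this way, a transformer can treat them like sentences: given the string for a single-target compound, it learns to generate the string for a related compound that also hits a second target.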

“The final models were found to exactly reproduce known dual-target compounds excluded from model derivation,” the scientists reported.

This research study demonstrates that an AI transformer model can predict chemical compounds that can target two proteins at the same time. This proof-of-concept is a step forward in accelerating drug design for many complex diseases and shortening the overall drug development lifecycle.

Copyright © 2024 Cami Rosso All rights reserved.