
Artificial intelligence against bacterial infections: the case of bacteriophages

"If we fail to act, we are looking at an almost unthinkable scenario where antibiotics no longer work and we are cast back into the dark ages of medicine" – David Cameron, former UK Prime Minister

Hundreds of millions of lives are at stake. The WHO has made antibiotic resistance a top global priority, warning that it could lead to more than 10 million deaths per year by 2050, and that it already causes around 700,000 deaths per year, including 33,000 in Europe. Among the various therapeutic strategies that can be implemented is the use of bacteriophages, an old and neglected alternative approach that Artificial Intelligence could bring back. Explanations below.

Strategies that can be put in place to fight antibiotic resistance

The first pillar of the fight against antibiotic resistance is the set of indispensable public health actions and recommendations aimed at reducing the overall use of antibiotics. For example:

  • The continuation of communication campaigns against the excessive prescription and consumption of antibiotics (a famous French slogan: "Antibiotics are not automatic").
  • Improving sanitary conditions to reduce the transmission of infections and therefore the need for antibiotics. This measure concerns many developing countries, whose inadequate drinking water supply causes, among other things, many cases of childhood diarrhea.
  • Reducing the use of antibiotics in animal husbandry, by banning the addition of certain antibiotics to the feed of food-producing animals.
  • Reducing environmental pollution with antibiotic molecules, particularly by establishing more stringent anti-pollution standards for manufacturing sites in the pharmaceutical industry.
  • The improvement and establishment of comprehensive structures for monitoring human and animal consumption of antibiotics and the emergence of multi-drug-resistant bacterial strains.
  • More frequent use of diagnostic tests, to limit the use of antibiotics and to select more precisely which molecule is needed.
  • Increased use of vaccination

The second pillar of the fight is innovative therapeutic strategies to combat multi-drug-resistant bacterial strains against which conventional antibiotics are powerless. We can mention:

  • Phage therapy: the use of bacteriophages, natural predatory viruses of bacteria. Phages can be used in therapeutic situations where they can be put directly in contact with bacteria (infected wounds, burns, etc.) but not where they would need to be injected into the body, as they would be destroyed by the patient’s immune system.
  • The use of enzybiotics: enzymes, mainly derived from bacteriophages, such as lysins, that can be used to destroy bacteria. At the time of writing, this approach is still at an experimental stage.
  • Immunotherapy, including the use of antibodies: many anti-infective monoclonal antibodies – specifically targeting a viral or bacterial antigen – are in development. Palivizumab, directed against the F protein of the respiratory syncytial virus, was approved by the FDA in 1998. The synergistic use of anti-infective antibodies and antibiotic molecules is also being studied.

Each of the proposed strategies – therapeutic or public health – can be implemented and greatly amplified with the help of technology. One of the most original uses of Artificial Intelligence concerns the automation of the design of new bacteriophages.

Introduction to bacteriophages

Bacteriophages are capsid viruses that infect only bacteria. They are naturally distributed throughout the biosphere and their genetic material can be DNA, in the vast majority of cases, or RNA. Their discovery is not recent and their therapeutic use has a long history: they were used as early as the 1920s in human and animal medicine. Their use was gradually abandoned in Western countries, mainly because of the ease of use of antibiotics and because relatively few clinical trials were conducted on phages, their use being based essentially on empiricism. In other parts of the world, such as Russia and other countries of the former USSR, the culture of using phages in human and animal health has remained very strong: they are often available without prescription and used as a first-line treatment.

The mechanism of bacterial destruction by lytic bacteriophages

There are two main types of bacteriophages:

  • On the one hand, lytic phages – the only ones used therapeutically, and those we will focus on – destroy the bacterium by hijacking the bacterial machinery in order to replicate.
  • On the other hand, temperate phages, which are not used therapeutically but are useful experimentally because they add genomic elements to the bacterium, potentially allowing it to modulate its virulence. This phage cycle is called lysogenic.

The diagram below shows the life cycle of a lytic phage:

This is what makes lytic phages so powerful: they are in a host–parasite relationship with bacteria and need to infect and destroy them in order to multiply. Bacterial evolution will therefore mainly select resistant strains, as in the case of antibiotic resistance. However, unlike antibiotics – which do not evolve, or rather “evolve” slowly, at the pace of human scientific discovery – phages can also adapt in order to survive and continue to infect bacteria, in a kind of evolutionary arms race between bacteria and phages.

The possible use of Artificial Intelligence

One of the particularities of phages is that, unlike some broad-spectrum antibiotics, they are usually highly specific to a bacterial strain. Thus, when one wishes to create or find appropriate phages for a patient, a complex and often long process must be followed, even as a race against time is usually underway for the patient’s survival: the bacterium must be identified – which requires culturing samples from the patient – its genome characterized, and the phage most likely to fight the infection determined. Until recently, this last stage was an iterative process of in vivo testing, which was very time-consuming; as Greg Merril, CEO of the start-up Adaptive Phage Therapeutics (a company developing a phage selection algorithm based on bacterial genomes), points out: “When a patient is severely affected by an infection, every minute is important.”

Indeed, to make phage therapy applicable on a very large scale, it is necessary to determine quickly and at lower cost which phage will be the most effective. This is what the combination of two technologies already allows and will increasingly allow: high-throughput sequencing and machine learning. The latter makes it possible to process the masses of data generated by genetic sequencing (of the bacteriophage or of the bacterial strain) and to detect patterns in an experimental database indicating that a phage with genome X was effective against a bacterium with genome Y. The algorithm can then estimate the chances of success of a whole library of phages against a given bacterium and determine which will be the best without performing long iterative tests. As in every test-and-learn domain, phage selection can be automated.
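
To make this idea more concrete, here is a minimal, purely illustrative sketch of how such a matching model could be built: genome-derived k-mer profiles of a phage and of a target bacterium are combined into one feature vector, a classifier is trained on past infection experiments, and a whole phage library is then ranked against a newly sequenced bacterium. The k-mer featurization and the random-forest model are assumptions chosen for readability; this is not the proprietary algorithm of Adaptive Phage Therapeutics or of any other company.

```python
# Illustrative sketch only: ranking a phage library against a target bacterium
# from genome-derived k-mer features. Featurization and model are assumptions.
from collections import Counter
from itertools import product

import numpy as np
from sklearn.ensemble import RandomForestClassifier

KMERS = ["".join(p) for p in product("ACGT", repeat=4)]  # all 256 possible 4-mers

def kmer_profile(genome: str) -> np.ndarray:
    """Normalized 4-mer frequency vector for a DNA sequence."""
    counts = Counter(genome[i:i + 4] for i in range(len(genome) - 3))
    total = max(sum(counts[k] for k in KMERS), 1)
    return np.array([counts[k] / total for k in KMERS])

def pair_features(phage_genome: str, bacterium_genome: str) -> np.ndarray:
    """Concatenate phage and host profiles into a single feature vector."""
    return np.concatenate([kmer_profile(phage_genome), kmer_profile(bacterium_genome)])

def train(history):
    """history: list of (phage genome, bacterium genome, lysis observed 1/0)."""
    X = np.stack([pair_features(p, b) for p, b, _ in history])
    y = np.array([outcome for _, _, outcome in history])
    return RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

def rank_library(model, phage_library, bacterium_genome):
    """Return phages sorted by predicted probability of lysing the target strain."""
    X = np.stack([pair_features(genome, bacterium_genome) for _, genome in phage_library])
    scores = model.predict_proba(X)[:, 1]
    return sorted(zip((name for name, _ in phage_library), scores),
                  key=lambda item: item[1], reverse=True)
```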

In addition to determining the best host for a given bacteriophage (and vice versa), discussed above, the main use cases described for artificial intelligence in phage therapy are:

  • Classification of bacteriophages: The body in charge of classification is the International Committee on Taxonomy of Viruses (ICTV). More than 5,000 different bacteriophages have been described, the main order being the Caudovirales. Traditional approaches to bacteriophage classification are based on the morphology of the virion proteins used to inject the genetic material into the target bacterium, and rely mainly on electron microscopy techniques. A growing body of scientific literature suggests that Machine Learning is a relevant alternative for a more functional classification of bacteriophages.
  • Predicting the functionality of bacteriophage proteins: Machine Learning can be useful to elucidate the precise mechanisms of PVPs (Phage Virion Proteins), involved, as mentioned above, in the injection of genetic material into the bacterium.
  • Determining the life cycle of bacteriophages: As discussed earlier in this article, there are two categories of phages, lytic and temperate. Traditionally, determining which of the two families a phage belongs to was done by culture and in vitro testing. The task is more difficult than one might think because, under certain stress conditions and in the presence of certain hosts, temperate phages have the ability to survive by performing lytic cycles. At present, PhageAI algorithms are able to determine a phage’s category with 99% accuracy; a minimal sketch of this kind of classifier is given after this list.
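
As announced in the last point above, here is what a lifecycle classifier can look like in its simplest form. It is emphatically not the PhageAI model: the features (presence of lysogeny-associated genes such as an integrase or a CI-like repressor, plus genome length) and the logistic-regression model are simplifying assumptions made for the example.

```python
# Illustrative sketch of lifecycle prediction (lytic vs. temperate); not the
# actual PhageAI model. Features are simplifying assumptions: lysogeny-associated
# genes are strong (but not perfect) markers of temperate phages.
import numpy as np
from sklearn.linear_model import LogisticRegression

def lifecycle_features(annotations: dict) -> list:
    """Turn a phage's gene annotations into a small numeric feature vector."""
    return [
        1.0 if annotations.get("integrase") else 0.0,
        1.0 if annotations.get("ci_like_repressor") else 0.0,
        1.0 if annotations.get("excisionase") else 0.0,
        annotations.get("genome_length_kb", 0.0) / 100.0,
    ]

def train_lifecycle_classifier(annotated_phages):
    """annotated_phages: list of (annotation dict, label), label 1 = temperate, 0 = lytic."""
    X = np.array([lifecycle_features(a) for a, _ in annotated_phages])
    y = np.array([label for _, label in annotated_phages])
    return LogisticRegression().fit(X, y)

def predict_temperate_probability(model, annotations: dict) -> float:
    """Probability that a newly sequenced phage is temperate rather than lytic."""
    return float(model.predict_proba([lifecycle_features(annotations)])[0, 1])
```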

It is also possible, as illustrated in the diagram below, for rare and particularly resistant bacteria, to combine the techniques seen above with synthetic biology and bio-engineering in order to rapidly create “tailor-made” phages. In this particular use case, Artificial Intelligence offers its full potential for the development of ultra-personalised medicine.

Conclusion

Despite its usefulness, phage therapy is still complicated to implement in many Western countries. In France, it is possible within the framework of a Temporary Authorisation for Use, on the conditions that the patient’s life is in danger or their functional prognosis is threatened, that the patient is at a therapeutic impasse, and that the infection is mono-microbial. The use of the therapy must also be validated by a Temporary Specialised Scientific Committee on Phage Therapy of the ANSM, and a phagogram – an in vitro test that studies the sensitivity of a bacterial strain to bacteriophages, analogous to an antibiogram – must be presented before treatment is started. Faced with these multiple difficulties, many patient associations are campaigning for simplified access to phage therapy. With the help of Artificial Intelligence, more and more phage therapies can be developed, as illustrated in this article, and given the urgency and scale of the problem of antibiotic resistance, it is essential to prepare the regulatory framework within which patients will be able to access the various alternative treatments, including bacteriophages. The battle is not yet lost, and Artificial Intelligence will be a key ally.



Towards virtual clinical trials?

Clinical trials are among the most critical and expensive steps in drug development. They are highly regulated by the various international health agencies, and for good reason: the molecule or new medical procedure being tested can potentially harm patients. To date, randomized controlled trials are the most valued by health authorities. However, even though studies are designed to generate as much data as possible while limiting bias and respecting patient safety, they are limited in the parameters they can test. For example, some molecules are designed to treat diseases affecting small numbers of patients; it is then very complicated and costly for clinical trial sponsors to recruit enough patients, and the statistical power generated is sometimes too low to be interpreted with confidence.

Could a mathematical, computerized model of a patient totally replace, or at least supplement, the data generated by humans in a clinical trial?

This short article will try to develop the concept of in silico clinical trials through some notions and examples from the scientific literature. We hope that it can teach you more about this exciting field.

In silico clinical trials use virtual patients, i.e. mathematical models generated by an algorithm, mimicking our physiology and capable of reproducing, for example, the pharmacokinetics of a drug X [1] or its associated toxicity [2]. They have many advantages: generating more confidence in the molecule being tested before any animal and/or human experiments; increasing the statistical power of trials carried out on small populations, such as when a molecule is tested in orphan diseases; and, finally, following the 3Rs rule of limiting the use of laboratory animals: Replace, Reduce, Refine.
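
To give a concrete flavour of what a “virtual patient” can be, the sketch below simulates the plasma concentration of a hypothetical oral drug in a small virtual population, using a standard one-compartment model with first-order absorption. The dose and the parameter distributions are arbitrary assumptions made for illustration; they do not correspond to any published trial or product.

```python
# Minimal sketch of an in silico PK experiment: a one-compartment model with
# first-order absorption, run over a virtual population whose parameters are
# drawn from arbitrary (illustrative) log-normal distributions.
import numpy as np

def concentration(t, dose_mg, ka, ke, vd_l):
    """Plasma concentration (mg/L) at times t after a single oral dose."""
    return (dose_mg * ka) / (vd_l * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

rng = np.random.default_rng(seed=42)
n_patients = 500
t = np.linspace(0, 24, 97)  # hours, on a 15-minute grid

# Virtual population: inter-individual variability on absorption, elimination, volume
ka = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=n_patients)    # 1/h
ke = rng.lognormal(mean=np.log(0.15), sigma=0.25, size=n_patients)  # 1/h
vd = rng.lognormal(mean=np.log(40.0), sigma=0.2, size=n_patients)   # L

profiles = np.stack([concentration(t, 200.0, ka[i], ke[i], vd[i]) for i in range(n_patients)])

cmax = profiles.max(axis=1)  # per-patient peak concentration
auc = np.sum((profiles[:, 1:] + profiles[:, :-1]) / 2 * np.diff(t), axis=1)  # trapezoid AUC
print(f"Cmax median [5th-95th pct]: {np.median(cmax):.2f} "
      f"[{np.percentile(cmax, 5):.2f}-{np.percentile(cmax, 95):.2f}] mg/L")
print(f"AUC 0-24h median: {np.median(auc):.1f} mg.h/L")
```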

Working with this concept, Sarrami-Foroushani et al. [1] modelled the therapeutic effect of stenting in the treatment of intracranial aneurysms. The first step of the project was to check whether it was possible to replicate the data from existing studies; the second was to explore situations for which it would have been much more complicated to recruit a suitable pool of patients.

Based on “virtual” carotid anatomies (but modelled from real patients), the researchers were able to apply a set of models to reproduce the different physical mechanisms (fluid dynamics for the blood, for example) involved in the evolution of the aneurysm, and to observe the effect of the prosthesis on the diseased vessel (in this case, its occlusion). The aim was also to generate a model capable of comparing the effect of the prosthesis in a normotensive patient and in a hypertensive patient.

The predicted outcomes were comparable to results already published in the literature and allowed the exploration of new scenarios – for example, aneurysms with more complex morphologies, for which patients are more difficult to recruit.

This example illustrates the strength that in silico technology will represent in the coming decades. Various health authorities, such as the FDA [2], are placing increasing emphasis on these predictions, as they reduce the cost and duration of clinical trials.

Another case study is the one developed by Gutiérrez-Casares et al. [3] in the treatment of ADHD, using two different small molecules, lisdexamfetamine and methylphenidate.

The team first had to characterize the pathology and the drugs tested at the molecular level: in ADHD, the expression of certain proteins is altered, and the two molecules have a different mode of action. Sensitivity and efficacy may therefore be different in a patient depending on the molecule studied. The activity of these proteins was then correlated to clinical efficacy criteria. They generated a virtual population, demographically similar to the populations observed in the pathology, describing different protein profiles according to the “healthy” or “sick” status of the patient. Finally, the team used this virtual population to generate their pharmacokinetic profiles and simulate the concentration that the drug would have in their body.

Based on these protein profiles and by cross-referencing them with the generated efficacy data, the researchers were able to identify the key proteins in the mechanism of action of both drugs. Thus, in silico testing can generate not only efficacy and safety data, but also fundamental insights into a drug’s mode of action.

To date, it is still complicated to adopt a holistic approach to the simulation of human physiology. As the article by Gutiérrez-Casares et al. shows, the reliability of models is limited to what is already known. The notion of digital twins [4] is applicable to many fields, but may never be fully applicable to health. However, with ever-increasing computing power and evolving clinical databases, models will come closer and closer to plausible outcomes. Where a phase 3 trial requires numerous patients, could in silico trials reduce this number and speed up the approval of new drugs?

On the public side, initiatives such as the VPH Institute [5] and the Avicenna Alliance [6] promote the use of in silico methods and provide multiple resources, available to all, to democratize the technology.

On the private side, companies such as InSilicoTrials [7], Novadiscovery [8] or Insilico Medicine with its InClinico tool [9] offer platforms accessible to the various players in the health sector, providing them with “ready-to-use” tools to initiate their own simulations.

Is it possible to imagine a future where these tools allow small and medium-sized biotechs to access phase 3 clinical trials without the financial means of Big Pharma? The ecosystem of the healthcare industries would then be more favourable to innovative and “risky” ideas, and not only to the historical players capable of absorbing the heavy cost of a phase 3 failure.


To go further:

  1. Sarrami-Foroushani, A. et al. In-silico trial of intracranial flow diverters replicates and expands insights from conventional clinical trials. Nat. Commun. 12, 3861 (2021).
  2. AltaThera Pharmaceuticals Announces FDA Approval for New Indications of Sotalol IV: A New and Faster Way to Initiate Sotalol Therapy for Atrial Fibrillation (AFib) Patients. 
  3. Gutiérrez-Casares, J. R. et al. Methods to Develop an in silico Clinical Trial: Computational Head-to-Head Comparison of Lisdexamfetamine and Methylphenidate. Front. Psychiatry 12, 1902 (2021).
  4. Marr, B. 7 Amazing Examples of Digital Twin Technology In Practice. Forbes 
  5. VPH Institute | Virtual Physiological Human – International non-profit organisation. 
  6. AVICENNA ALLIANCE. 
  7. InSilicoTrials – Modeling and simulation in drug development. InSilicoTrials 
  8. Novadiscovery. 
  9. InClinico | Insilico Medicine. 



Reshaping real-world data sharing with blockchain-based systems

“[Blockchain] is a complicated technology and one whose full potential is not necessarily understood by healthcare players. We want to demonstrate […] precisely that blockchain works when you work on the uses!” Nesrine Benyahia, Managing Director of DrData

***

Access to real-world health data is becoming an increasingly important issue for pharmaceutical companies, and facilitating the acquisition of these data could make the development of new drugs faster and less costly. After describing how the pharmaceutical industry currently acquires such data, and the initiatives aiming to facilitate this acquisition, this article will focus on projects that use the Blockchain to exchange, monetize and secure these precious data.

Use of real-world data by the Pharmaceutical Industry, where do we stand?

Real-world data are commonly defined as data that are not collected in an experimental setting and without intervening in the usual way patients are managed, with the aim of reflecting current care practice. These data can complement data from randomized controlled trials, which have the disadvantage of being true only in the very limited context of clinical trials. The use of real-world data is likely to grow for two key reasons. First, new technological tools allow us to collect them (connected medical devices, for example) while others allow us to analyze them (data science, text-mining, patient forums, exploitation of grey literature, etc.). Secondly, for a few years now, we have been observing a regulatory evolution that allows more and more early access programmes and accepts clinical evidence generated on small numbers of patients (especially in cancer drug trials), and that tends to move the evidence cursor towards real-world data.

The uses of real-world data are varied. They concern the development of new drugs – in particular to define new management algorithms or to discover unmet medical needs through the analysis of databases – but also the monitoring of products already on the market, with use cases such as safety and usage monitoring, market access with conditional financial support, or payment on performance. These data can inform the decisions of health authorities as well as the strategic decisions of pharmaceutical companies.

Current acquisition and use of real-world data: Data sources are varied, with varying degrees of maturity and availability, as well as varying access procedures. Some of these data come directly from healthcare, such as data from medico-administrative databases or hospital information systems, while others are produced directly by patients through social networks, therapy management applications and connected medical devices. The pharmaceutical industry accesses these data in various ways. Like many other countries, France is currently working to implement organizational and regulatory measures to facilitate access to real-world data and to organize their collection and use, notably with the creation of the Health Data Hub. However, to this day, in the French and European context, no platform allows patients to access all of their health data and to freely dispose of them in order to participate in a given research project.

Imagining a decentralized health data sharing system: the first steps

As a reminder, blockchain is a cryptographic technology developed in the late 2000s that allows information to be stored, authenticated and transmitted in a decentralized (without intermediaries or trusted third parties), transparent and highly secure way. For more information about how blockchain works, please refer to our previous article on this technology: “Blockchain, Mobile Applications: Will technology solve the problem of counterfeit drugs?” As we explained in that article, the young blockchain technology has so far mainly expressed its potential in the field of crypto-currencies, but many other applications can be imagined.

Thus, several research teams are working on how this technology could potentially address the major challenges of confidentiality, interoperability, integrity, and secure accessibility – among others – posed by the sharing of health data.

These academic research teams have envisioned blockchains that bring together different stakeholders: healthcare services, patients, and data users (who may be the patients themselves or other healthcare-producing organizations). These systems do not provide data to third parties (industrial players, for example); their only objectives are to improve the quality of care and to offer patients a platform that brings together their fragmented health data: in the United States, data are siloed because of the organization of the health system; in France, although the Social Security system has a centralizing role, the “Mon Espace Santé” service – which allows patients to access all of their data and is a descendant of the Shared Medical Record – has been slow to be implemented.

These academic projects propose, on the one hand, to store medical information on a private blockchain, and on the other hand to operate Smart Contracts with different uses. Smart Contracts are computerized equivalents of traditional contracts, but differ in that their execution requires neither a trusted third party nor human intervention: they are executed when the conditions written into the code are met. In these proposed real-world data sharing systems, they make it possible, among other things, to authenticate the identity of users, to guarantee the integrity and confidentiality of the data, and to control access to them (unauthorized persons cannot access patient data).
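
Smart Contracts are usually written in a dedicated language such as Solidity, but the logic they encode is easy to illustrate. The sketch below mimics, in plain Python, the kind of consent and access-control rules described above; it is a conceptual illustration under assumed names and rules, not the implementation of any of the academic projects or companies cited in this article.

```python
# Conceptual illustration (in Python, not Solidity) of the consent/access logic
# a smart contract could encode for sharing real-world health data.
import hashlib
import time

class ConsentRegistry:
    """Append-only registry mimicking on-chain consent and access records."""

    def __init__(self):
        self.consents = {}   # (patient_id, requester_id) -> consent expiry timestamp
        self.audit_log = []  # tamper-evident trail of every access decision

    def grant_consent(self, patient_id, requester_id, duration_days):
        """Patient authorizes a given requester (e.g. a research program) for a limited time."""
        self.consents[(patient_id, requester_id)] = time.time() + duration_days * 86400
        self._log("CONSENT_GRANTED", patient_id, requester_id)

    def request_access(self, patient_id, requester_id, record_bytes):
        """Return an integrity hash of the record if and only if consent is valid."""
        expiry = self.consents.get((patient_id, requester_id))
        if expiry is None or expiry < time.time():
            self._log("ACCESS_DENIED", patient_id, requester_id)
            raise PermissionError("No valid consent for this requester.")
        self._log("ACCESS_GRANTED", patient_id, requester_id)
        # The hash, stored on-chain, lets anyone verify the data were not altered.
        return hashlib.sha256(record_bytes).hexdigest()

    def _log(self, event, patient_id, requester_id):
        self.audit_log.append((time.time(), event, patient_id, requester_id))
```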

Despite their theoretical qualities, these academic projects do not give patients the possibility to share their data openly with different research projects. In the last part of this article, we will review two examples of start-ups seeking to address this issue using the Blockchain.

Examples of two blockchain projects that allow patients to share their health data:

Embleema is a start-up that offers a platform where patients can upload their health data, ranging from their complete genome to the results of their medical tests and data from connected medical devices. At the same time, pharmaceutical companies can express their needs, and an algorithm on the platform selects the patients who could correspond to a given need, based on their pathology or the treatments they are prescribed. These patients are then asked to sign a consent document to participate in an observational study, in exchange for which they are paid (in the USA) or may choose a patient association that will receive funding (in France). The data produced by patients are stored on centralized servers of specialized health data hosts, and only the companies that have purchased them have access to them. The Ethereum blockchain and its system of smart contracts are used in the Embleema model only to certify compliance and organize the sharing of documents related to the study (collection of patient consent, etc.). We can therefore question the added value of the blockchain in this model. Couldn’t these documents have been stored on centralized servers, and the actions triggered by the smart contracts carried out from a centralized database, with Embleema acting as a trusted third party? How much of this model is a marketing use of the term “blockchain”? In any case, the Patient Truth platform developed by Embleema has the great merit of proposing a model in which patients have control over their health data and the choice to get involved in a given academic or industrial research project.

***

The second company we will focus on is MedicalVeda, a Canadian start-up in which blockchain plays a more central role, notably through the launch of an ERC-20 token (a standard token on the Ethereum blockchain that can be programmed to participate in a Smart Contract). The workings of this company, which seeks to solve several problems at once – access to healthcare data by the healthcare industries, but also access to care on the patient side – are quite complex and conceptual, and we will try to simplify them as much as possible. MedicalVeda’s value proposition is based on several products:

  • The VEDA Health Portal, a platform that centralizes the patient’s health data for the benefit of caregivers and of the pharmaceutical industry research programs to which the patient can choose to grant access. Similar to the projects previously mentioned in this article, the goal is to overcome the challenge of data siloing. The data are secured by a private blockchain.
  • The Medical Veda Data Market Place, which aims to directly connect patients and pharmaceutical companies according to their needs. Transactions are made using the blockchain and are paid for in crypto-currencies.
  • Two other products are worth mentioning: the MVeda token, the cryptocurrency of the data sales platform, used to pay patients; and Medfi Veda, a decentralized finance system that allows American patients to borrow money to fund medical procedures by collateralizing their MVeda tokens. This collateralized lending system is classic in decentralized finance, though admittedly the details of the system developed by MedicalVeda remain unclear. Its stated objective is to allow patients to leverage their health data as collateral in order to facilitate their access to healthcare.
***

In conclusion, Blockchain is still a young technology. It aroused a very high level of interest in the healthcare world in 2018, before that interest gradually dried up, mainly due to a misunderstanding of its potential and a lack of education of healthcare professionals on the subject on the one hand, and to an excessive marketing use of what had become a “buzz-word” on the other. The intrinsic qualities of this technology make it possible to imagine creative and ambitious models for sharing health data, which may be the source of accelerated development of new drugs in the future. For the time being, and despite courageous and intelligent initiatives, some of which have already been commercialized, no solution is fully functional on a very large scale; everything remains to be built.




3D printing and artificial intelligence: the future of galenics?

“Ten years from now, no patient will take the same thing as another million people. And no doctor will prescribe the same thing to two patients.”

Fred Paretti from the 3D drug printing startup Multiply Labs.

3D printing – also known as additive manufacturing – is one of the technologies capable of transforming pharmaceutical development, and will certainly play a role in the digitalization of the drug manufacturing sector. This short article will attempt to provide an overview of how 3D printing works, its various use cases in the manufacture of personalized medicines, the current regulatory framework for this innovative technology, and the synergies that may exist with Artificial Intelligence.

3D printing, where do we stand?

The principle of 3D printing, developed since the early 2000s and now used in a large number of industrial fields, consists of superimposing layers of material according to coordinates distributed along three axes (in three dimensions), following a digital file. This 3D file is cut into horizontal slices and sent to the 3D printer, which prints one slice after another. The term “3D printing” covers techniques that are in fact very different from each other:

  • Fused filament deposition, or extrusion: a plastic filament is heated until it melts and is deposited at the points of interest, in successive layers, which bind together as the plastic solidifies on cooling. This is the technique most commonly used by consumer printers.
  • Resin photopolymerization: a photosensitive resin is solidified, layer by layer, with the help of a laser or a very concentrated light source. This is one of the techniques allowing the highest level of detail.
  • Sintering or powder fusion: a laser is used to agglomerate the powder particles with the energy it releases. This technique is used to produce metal or ceramic objects.

In the pharmaceutical industry, 3D printing is used in several ways, the main ones being:

  • The production of medical devices, using the classic techniques for printing plastic or metallic compounds, or more specific techniques that give medical devices original properties, such as the prostheses of the start-up Lattice Medical, which allow adipose tissue to regenerate.
  • Bio-printing, which, by printing with human cells, makes it possible to reconstitute organs or tissues such as skin or heart patches, as done by another French start-up, Poietis.
  • Finally – and this is the focus of this article – 3D printing also has a role to play in galenics, by making it possible to print an orally administered drug from a mixture of excipient(s) and active substance(s).

What are the uses of 3D printing of medicines? 

3D printing brings an essential feature to drug manufacturing: flexibility. This flexibility is important for:

  • Manufacturing small clinical batches: clinical phases I and II often require small batches of experimental drugs for which 3D printing is useful: it is sometimes economically risky to make large investments in drug manufacturing at this stage. Moreover, it is often necessary to modify the active ingredient content of the drugs used, and 3D printing would enable these batches to be adapted in real time. Finally, 3D printing can also be useful for offering patients placebos that are as similar as possible to their usual treatments.
  • Advancing towards personalized medicine: 3D printing of drugs allows the creation of “à la carte” drugs by mixing several active ingredients with different contents for each patient. In the case of patients whose weight and absorption capacities vary over time (children or the elderly who are malnourished, for example), 3D printing could also adapt their treatments in real time according to changes in their weight, particularly in terms of dosage and speed of dissolution.

To address these issues, most major pharmaceutical companies are increasingly interested in 3D printing of drugs. They are investing massively in this field or setting up partnerships, like Merck, which is cooperating with the company AMCM in order to set up a printing system that complies with good manufacturing practices. The implementation of this solution has the potential to disrupt the traditional manufacturing scheme, as illustrated in the diagram below.

Figure 1 – Modification of the manufacturing steps of a tablet by implementing 3D printing (Source: Merck)

Regulation

The first commercialized 3D printed drug was approved by the FDA in 2015. Its active ingredient is levetiracetam. The goal of using 3D printing for this drug was to achieve a more porous tablet that dissolves more easily and is more suitable for patients with swallowing disorders. Despite these initial approvals and market accesses, the regulatory environment has yet to be built, as it is still necessary to assess the changes in best practices that 3D printing technology may impose and determine what types of tests and controls should be implemented. Destructive quality controls are not particularly well suited to the small batches produced by the 3D printer technique. To our knowledge, there are currently no GMP-approved 3D printers for the manufacture of drugs.

Will the future of drug 3D printing involve artificial intelligence? 

A growing number of authors believe that 3D printing of drugs will only move out of the laboratory and become a mainstream industrial technology if artificial intelligence is integrated. Indeed, as things stand, because of the great flexibility mentioned above, the use of 3D printing requires a long iterative phase: thousands of factors must be tested, concerning in particular the excipients used, but also the printer parameters and the printing technique to be selected. The choice of these different factors is currently made by the galenics team according to its objectives and constraints: what is the best combination of factors to meet a given pharmacokinetic criterion? Which ones minimize production costs? Which ones comply with a given regulatory framework? Which ones allow rapid production? This iterative phase is extremely time-consuming and capital-intensive, which for the moment makes 3D printing of drugs incompatible with the imperatives of pharmaceutical development. Artificial Intelligence seems the most natural way to overcome this challenge and to make the multidimensional choice of parameters evidence-based with respect to the objectives. Artificial Intelligence could also be involved in the quality control of the batches thus manufactured.
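
To illustrate what such an “evidence-based” choice of parameters could look like, the sketch below trains a model to predict whether a formulation will print correctly from a handful of formulation and process parameters, in the spirit of the M3DISEEN work cited at the end of this article. The feature names, the toy data and the model choice are assumptions made for the example, not the published model.

```python
# Illustrative sketch, in the spirit of (but not identical to) published work such
# as M3DISEEN: predicting whether a formulation will print correctly from
# formulation and process parameters. Feature names and data are invented.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical historical print runs (one row per formulation / printer setting)
runs = pd.DataFrame({
    "drug_load_pct":    [10, 30, 5, 50, 20, 15, 40, 25],
    "polymer_mw_kda":   [90, 120, 45, 200, 90, 60, 150, 120],
    "plasticizer_pct":  [5, 10, 0, 15, 5, 2, 12, 8],
    "nozzle_temp_c":    [180, 200, 160, 220, 190, 170, 210, 195],
    "print_speed_mm_s": [40, 60, 30, 80, 50, 35, 70, 55],
    "printable":        [1, 1, 0, 0, 1, 1, 0, 1],   # observed outcome
})

X = runs.drop(columns="printable")
y = runs["printable"]

# In practice the dataset would contain hundreds of runs and the model would be
# validated by cross-validation; here we simply fit on the toy data.
model = GradientBoostingClassifier(random_state=0)
model.fit(X, y)

# Once trained, the model can screen candidate parameter combinations in silico
# before any filament is extruded.
candidate = pd.DataFrame([{"drug_load_pct": 20, "polymer_mw_kda": 100,
                           "plasticizer_pct": 6, "nozzle_temp_c": 185,
                           "print_speed_mm_s": 45}])
print("Predicted probability of a successful print:",
      model.predict_proba(candidate)[0, 1])
```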

The use of Artificial Intelligence to design new drugs raises new technical challenges, particularly regarding the availability of the data required for these Machine Learning models, which are often kept secret by pharmaceutical laboratories. One can imagine building databases by text-mining scientific articles and patents dealing with different galenic forms and different types of excipients, then completing them experimentally, which will require a significant amount of time. Beyond these technical challenges, more ethical questions will also have to be asked, particularly regarding the redistribution of responsibilities caused by these new technologies: who would be responsible if a non-compliant batch were released? The manufacturer of the 3D printer? The developer of the algorithm that designed the drug? The developer of the algorithm that validated the quality control? Or the pharmacist in charge of the laboratory?

All in all, 3D printing of medicines is a technology that is already well mastered, with a market growing by 7% each year towards a projected 440 million dollars in 2025, but whose usefulness has so far been limited to certain use cases. Tomorrow, by unlocking its potential through the combination with Artificial Intelligence, it could allow fully automated and optimized galenic development and manufacturing of oral forms, finally adapted to the ultra-customized medicine that is coming.



To go further:

  • Moe Elbadawi, Laura E. McCoubrey, Francesca K.H. Gavins, Jun J. Ong, Alvaro Goyanes, Simon Gaisford, and Abdul W. Basit ; Disrupting 3D Printing of medicines with machine learning ; Trends in Pharmacological Sciences, September 2021, Vol 42, No.9
  • Moe Elbadawi, Brais Muñiz Castro, Francesca K H Gavins, Jun Jie Ong, Simon Gaisford, Gilberto Pérez , Abdul W Basit , Pedro Cabalar , Alvaro Goyanes ; M3DISEEN: A novel machine learning approach for predicting the 3D printability of medicines ; Int J Pharm. 2020 Nov 30;590:119837
  • Brais Muñiz Castro, Moe Elbadawi, Jun Jie Ong, Thomas Pollard, Zhe Song, Simon Gaisford, Gilberto Pérez, Abdul W Basit, Pedro Cabalar, Alvaro Goyanes ; Machine learning predicts 3D printing performance of over 900 drug delivery systems ; J Control Release. 2021 Sep 10;337:530-545. doi: 10.1016/j.jconrel.2021.07.046
  • Les médicaments imprimés en 3D sont-ils l’avenir de la médecine personnalisée ? ; 3D Natives, le média de l’impression 3D ; https://www.3dnatives.com/medicaments-imprimes-en-3d-14052020/#!
  • Les médicaments de demain seront-ils imprimés en 3D ? ; Le mag’ Lab santé Sanofi ; https://www.sanofi.fr/fr/labsante/les-medicaments-de-demain-seront-ils-imprimes-en-3D
  • Press Releases – Merck and AMCM / EOS Cooperate in 3D Printing of Tablets ; https://www.merckgroup.com/en/news/3d-printing-of-tablets-27-02-2020.html


Why are we still conducting meta-analyses by hand?

"It is necessary, while formulating the problems of which in our further advance we are to find solutions, to call into council the views of those of our predecessors who have declared an opinion on the subject, in order that we may profit by whatever is sound in their suggestions and avoid their errors."

Aristotle, De anima, Book 1, Chapter 2

Systematic literature reviews and meta-analyses are essential tools for synthesizing existing knowledge and generating new scientific knowledge. Their use in the pharmaceutical industry is varied and will continue to diversify. However, they are particularly limited by the lack of scalability of their current methodologies, which are extremely time-consuming and prohibitively expensive. At a time when scientific articles are available in digital format and when Natural Language Processing algorithms make it possible to automate the reading of texts, should we not invent meta-analyses 2.0? Are meta-analyses boosted by artificial intelligence, faster and cheaper, allowing more data to be exploited, in a more qualitative way and for different purposes, an achievable goal in the short term or an unrealistic dream?

Meta-analysis: methods and presentation

A meta-analysis is basically a statistical analysis that combines the results of many studies. When done properly, it is the gold standard for generating scientific and clinical evidence, as the aggregation of samples and information provides significant statistical power. However, the way in which a meta-analysis is carried out can profoundly affect the results obtained.

Conducting a meta-analysis therefore follows a very precise methodology consisting of different stages:

  • Firstly, a search protocol will be established in order to determine the question to be answered by the study and the inclusion and exclusion criteria for the articles to be selected. It is also at this stage of the project that the search algorithm is determined and tested.
  • In a second step, the search is carried out using the search algorithm on article databases. The results are exported.
  • Articles are selected on the basis of titles and abstracts. The reasons for exclusion of an article are mentioned and will be recorded in the final report of the meta-analysis.
  • The validity of the selected studies is then assessed on the basis of the characteristics of the subjects, the diagnosis, and the treatment.
  • The various biases are controlled for in order to avoid selection bias, data extraction bias, conflict of interest bias and funding source bias.
  • A homogeneity test will be performed to ensure that the variable being evaluated is the same for each study. It will also be necessary to check that the data collection characteristics of the clinical studies are similar.
  • A statistical analysis as well as a sensitivity analysis are conducted (a minimal sketch of this pooling step is given after this list).
  • Finally, the results are presented from a quantitative and/or non-quantitative perspective in a meta-analysis report or publication. The conclusions are discussed.
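
To illustrate the statistical analysis step mentioned in the list above, here is a minimal sketch of the classic DerSimonian–Laird random-effects pooling of study-level effect sizes. The input effect sizes and variances are invented for the example.

```python
# Minimal sketch of the pooling step of a meta-analysis: DerSimonian-Laird
# random-effects combination of per-study effect sizes. Inputs are invented.
import numpy as np

# Per-study effect sizes (e.g. log odds ratios) and their variances
effects = np.array([-0.35, -0.10, -0.42, -0.05, -0.28])
variances = np.array([0.040, 0.055, 0.030, 0.070, 0.045])

# Fixed-effect (inverse-variance) weights and pooled estimate
w = 1.0 / variances
fixed = np.sum(w * effects) / np.sum(w)

# Heterogeneity: Cochran's Q and the between-study variance tau^2
q = np.sum(w * (effects - fixed) ** 2)
df = len(effects) - 1
tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
i2 = max(0.0, (q - df) / q) * 100  # share of variability due to heterogeneity

# Random-effects weights, pooled estimate and 95% confidence interval
w_re = 1.0 / (variances + tau2)
pooled = np.sum(w_re * effects) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)

print(f"Pooled effect: {pooled:.3f}  95% CI: [{ci[0]:.3f}, {ci[1]:.3f}]  I2: {i2:.0f}%")
```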

The systematic literature review (SLR), unlike the meta-analysis, with which it shares a certain number of methodological steps, does not have a quantitative dimension but aims solely to organize and describe a field of knowledge precisely.

The scalability problem of a powerful tool

The scalability problem is simple to state and will only get worse over time: the volume of data generated by clinical trials that must be processed in literature reviews is growing exponentially, while the methods used for extracting and processing these data have evolved little and remain essentially manual. The intellectual limits of humans are what they are, and humans cannot disrupt themselves.

As mentioned in the introduction to this article, meta-analyses are very costly in terms of human time. It is estimated that a minimum of 1,000 hours of highly qualified human labour is required for a simple literature review, and that 67 weeks are needed between the start of the work and its publication. Meta-analyses are therefore tools with a high degree of inertia, and their timescale is currently ill-suited to certain uses, such as strategic decision-making, which sometimes requires data to be available quickly. Publications have reported full literature reviews completed in 2 weeks and 60 working hours using AI-based automation tools.

“Time is money”, they say. Academics have calculated that, on average, each meta-analysis costs about $141,000. The same team determined that the 10 largest pharmaceutical companies each spend about $19 million per year on meta-analyses. While this may not seem like much compared to the other costs of generating clinical evidence, it is not insignificant, and a lower cost would make it possible to conduct more meta-analyses – including meta-analyses of pre-clinical data – and potentially to reduce the failure rate of clinical trials: currently, 90% of compounds entering clinical trials fail to demonstrate sufficient efficacy and safety to reach the market.

Reducing the scalability problem in the methodology of literature reviews and meta-analyses would make it easier to work with data from pre-clinical studies. These data have a number of specificities that make their use in systematic literature reviews and meta-analyses more complex: the volumes of data are extremely large and evolve particularly rapidly, and the designs of pre-clinical studies, as well as the form of reports and articles, are highly variable, making the analyses and the evaluation of study quality particularly complex. Yet systematic literature reviews and meta-analyses of pre-clinical data have various uses: they can identify gaps in knowledge and guide future research, and inform the choice of a study design, a model or an endpoint, or the decision whether to start a clinical trial. Different methodologies for exploiting pre-clinical data have been developed by academic groups, and each of them relies heavily on automation techniques involving text-mining and artificial intelligence in general.

Another recurring problem with meta-analyses is that they are conducted at a given point in time and can become obsolete very quickly after publication, once new data have been published and new clinical trials completed. Much time and energy is thus spent to present conclusions that, sometimes after only a few weeks or months, are inaccurate or partially false. One can imagine that automating meta-analyses would allow their results to be updated in real time.

Finally, the automation of meta-analyses would contribute to a more uniform assessment of the quality of the clinical studies included in the analyses. Indeed, many publications show that the quality of the selected studies, as well as the biases that may affect them, are rarely evaluated, and that when they are, it is done according to various scores that take few parameters into account – the Jadad score, for example, considers only 3 methodological characteristics – and this is quite understandable: collecting this information, even when it is limited, requires additional data extraction and processing effort.

Given these scalability problems, what are the existing or possible solutions?

Many tools already developed

The automation of the various stages of meta-analyses is a field of research for many academic groups, and some tools have already been developed. With all due respect to these tools, some examples of which are given below, one may wonder why they are not currently used more widely. Is the market not yet mature enough? Are the tools, which are very fragmented in their value proposition, unsuitable for carrying out a complete meta-analysis? Do these tools, developed by research laboratories, have sufficient marketing? Do they have sufficiently user-friendly interfaces?

As mentioned above, most of the tools and prototypes developed focus on a specific task in the meta-analysis methodology. Examples include Abstrackr, which specialises in article screening, ExaCT, which focuses on data extraction, and RobotReviewer, which is designed to automatically assess bias in reports of randomised controlled trials.
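
To illustrate the screening task that tools such as Abstrackr automate, the sketch below trains a very simple text classifier to rank unscreened abstracts by predicted relevance, so that human reviewers read the most promising ones first. It is a generic example with invented abstracts, not a re-implementation of any of the tools mentioned.

```python
# Generic illustration of automated abstract screening (the task addressed by
# tools such as Abstrackr), not a re-implementation of any of them: a TF-IDF
# text classifier ranks unscreened abstracts by predicted relevance.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# A few manually screened abstracts (1 = include, 0 = exclude) to seed the model
labelled_abstracts = [
    ("Randomised trial of drug X versus placebo in chronic pain", 1),
    ("Retrospective cohort of drug X safety in elderly patients", 1),
    ("In vitro study of an unrelated compound on cell lines", 0),
    ("Narrative review of surgical techniques without drug X", 0),
]
texts, labels = zip(*labelled_abstracts)

screener = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1),
                         LogisticRegression())
screener.fit(texts, labels)

# Rank the remaining, unscreened abstracts so reviewers read the likeliest hits first
unscreened = [
    "Double-blind study of drug X dosing in chronic pain patients",
    "Case report of a rare dermatological condition",
]
scores = screener.predict_proba(unscreened)[:, 1]
for abstract, score in sorted(zip(unscreened, scores), key=lambda t: t[1], reverse=True):
    print(f"{score:.2f}  {abstract}")
```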

Conclusion: improvement through automation?

When we consider the burgeoning field of academic research on automated meta-analysis, as well as the various entrepreneurial initiatives in this area (notably the very young start-up Silvi.ai), it is hard not to come away with the strong conviction that meta-analysis will increasingly become a task delegated to machines, with the role of humans limited to defining the research protocol, assisted by software that helps make the best possible choices of scope and search algorithms. Thus, beyond the direct savings from automating meta-analyses, many indirect savings can be envisaged, in particular those made possible by better decisions, such as whether or not to start a clinical trial. All in all, the automation of meta-analyses will contribute to a more efficient and faster invention of new drugs.

Resolving Pharma, whose project is to link reflection and action, will invest in the coming months in the concrete development of meta-analysis automation solutions.



To go further:
  • Marshall, I.J., Wallace, B.C. Toward systematic review automation: a practical guide to using machine learning tools in research synthesis. Syst Rev 8, 163 (2019). https://doi.org/10.1186/s13643-019-1074-9
  • Clark J, Glasziou P, Del Mar C, Bannach-Brown A, Stehlik P, Scott AM. A full systematic review was completed in 2 weeks using automation tools: a case study. J Clin Epidemiol. 2020 May;121:81-90. doi: 10.1016/j.jclinepi.2020.01.008. Epub 2020 Jan 28. PMID: 32004673.
  • Beller, E., Clark, J., Tsafnat, G. et al. Making progress with the automation of systematic reviews: principles of the International Collaboration for the Automation of Systematic Reviews (ICASR). Syst Rev 7, 77 (2018). https://doi.org/10.1186/s13643-018-0740-7
  • Lise Gauthier, L’élaboration d’une méta-analyse : un processus complexe ! ; Pharmactuel, Vol.35 NO5. (2002) ; https://pharmactuel.com/index.php/pharmactuel/article/view/431
  • Nadia Soliman, Andrew S.C. Rice, Jan Vollert ; A practical guide to preclinical systematic review and meta-analysis; Pain September 2020, volume 161, Number 9, http://dx.doi.org/10.1097/j.pain.0000000000001974
  • Matthew Michelson, Katja Reuter, The significant cost of systematic reviews and meta-analyses: A call for greater involvement of machine learning to assess the promise of clinical trials, Contemporary Clinical Trials Communications, Volume 16, 2019, 100443, ISSN 2451-8654, https://doi.org/10.1016/j.conctc.2019.100443
  • Vance W. Berger, Sunny Y. Alperson, A general framework for the evaluation of clinical trial quality; Rev Recent Clin Trials. 2009 May ; 4(2): 79–88.
  • A start-up specializing in meta-analysis enhanced by Artificial Intelligence: https://www.silvi.ai/
  • And finally, the absolute bible of meta-analysis: The Handbook of Research Synthesis and Meta-Analysis, Harris Cooper, Larry V. Hedges and Jeffrey C. Valentine
