Insights
- Drug discovery is ripe for an AI-powered reboot. AI is accelerating drug discovery, but progress depends on well-governed data infrastructure and explainable models that support validation and regulatory compliance.
- Clinical trials are costly, lengthy and prone to failure. AI innovations can reshape every phase of the trial process and deliver better efficiency and greater chances for success.
- Customer engagement is rapidly evolving from sales-focused customer relationship management (CRM) platforms to enterprise-wide engagement systems. A split between industry-leading CRM providers and the advance of AI has forced life sciences companies to re-evaluate their engagement architectures.
- Pharmaceutical supply chains have been rattled by a range of new developments and the persistent tension between speed and quality. Life sciences companies can use AI and enhanced data governance to build greater resilience in their supply chains.
- Mergers and acquisitions are a constant in the life sciences industry. The rising importance of digital systems makes information technology integration vital for mergers to pay off.
- The shift to patient-centered healthcare is changing how life sciences companies operate. Patients and other stakeholders demand consumer-grade engagement, but pharmaceutical businesses must carefully manage data and privacy considerations.
Foreword: A compelling moment in the business of breakthroughs
Life sciences companies have always been in the business of breakthroughs.
What sets this industry apart is the pairing of its relentless pursuit of new discovery with the deeply human mission behind it: to heal, restore, and improve lives. Researchers, scientists, and executives in life sciences organizations carry out this mission with intensity. They know their work touches individuals and families in profoundly personal ways.
The tools and discipline required to develop, test, and formulate new therapies are established and well-honed. And the industry’s emphasis on innovation has given it remarkable capacity to adopt new methods and technologies to create breakthroughs.
This is what makes artificial intelligence in life sciences uniquely compelling. AI has disruptive potential perhaps on par with the printing press or electrification. And now it is being put to work by organizations built for structured inquiry, rigorous experimentation, and development of complex processes. The opportunity for new breakthroughs is immense.
All this occurs in a fast-changing world. Geopolitical shifts, evolving expectations from stakeholders, and the rise of personalized medicine are driving life sciences businesses to change their operating models. Companies must address these changes while still answering to healthcare professionals, shareholders, regulators, partners, and their own employees.
The business of breakthroughs can still follow the traditional discover-test-deliver sequence, but companies must work faster, work more equitably, and operate with greater transparency. Pharmaceutical supply chains must continue to deliver and grow more resilient and flexible. Commercial models must evolve as companies rethink how they engage in marketing, interact with consumers, and select innovations to pursue.
This journal reflects the insights of more than 50 Infosys life sciences leaders and subject matter experts exploring how life sciences businesses can navigate this moment of transformation. It highlights practical examples of how Infosys and its partners are using AI across the life sciences value chain, and how new technologies can help companies meet the challenges of a changing world.
Our hope is that this journal will spark new ideas, deepen conversations and inspire leaders across the industry to continue the essential work at the heart of life sciences: the business of breakthroughs.
Subhro Mallik
Executive vice president and head of life sciences, Infosys
Executive summary
Life sciences executives believe breakthroughs will keep coming. They expect their own companies and the industry overall will continue discovering new drugs and ways to treat patients. This is according to a survey of 150 life sciences executives conducted for this research journal.
Pharmaceutical companies and medical device manufacturers are investing heavily to prove their therapies are safe and effective, and to deliver them to patients efficiently. Executives surveyed by Infosys most frequently identified clinical trials as their top investment priority, followed by drug discovery and supply chain (Figure 1).
Figure 1. Top investment priorities for life sciences executives
Source: Infosys Knowledge Institute
By contrast, industry executives say mergers and acquisitions (M&A) remain largely a human endeavor: half of respondents perceive no measurable impact of AI on M&A. Infosys leadership, however, believes AI will assist and speed up the M&A process, both in the early assessment and decision-making phase and during the merger integration and transition services phase.
Our inaugural research journal, Life Sciences: Trends for the Future, published in 2024, noted strong enthusiasm for and heavy investment in generative AI. This 2026 edition highlights the development of rentosertib, the first named drug discovered and designed with the use of generative AI.
In drug discovery, our experts describe an AI-powered reboot that holds the promise to replenish development pipelines that have been at risk of growing less productive, less efficient, and more expensive. This AI-powered reboot will only deliver if companies establish robust data governance and trustworthy AI systems with explainable models, continuous validation, and clear regulation. Similarly, digital transformation and sound data strategies are vital to improve clinical trials. AI and other digital enhancements such as the automation of routine processes promise to make drug testing more efficient and more successful.
Drug discovery and clinical trials may be focal points for life sciences AI investment, but other areas also demand attention and should benefit from new tech.
Geopolitical shifts demand that life sciences businesses re-examine their supply chains. Global pharmaceutical operations built their supply chains for scale and efficiency; those networks must now be retooled for resilience and flexibility, with production located closer to the consumer. Medical advances such as personalized medicine demand still greater agility from supply chains and production systems.
Figure 2. Where AI has improved life sciences
Source: Infosys Knowledge Institute
The mandate for resilience and flexibility extends into manufacturing, with the additional requirement for unprecedented visibility. This journal describes how one pharmaceutical manufacturer has built a layered program that balances efficiency and resilience through standardized enterprise resource planning. These standards cover its own manufacturing processes and extend to partners and contract manufacturers as well.
Strategic decisions by enterprise software companies have caused pharmaceutical makers to take a fresh look at what they need from their CRM systems. Beyond forcing enterprises to choose between competing platforms, CRM companies are expanding from their origin in sales and marketing to offer a broader engagement platform for reaching health care professionals and patients.
Patent expirations continue to compel pharmaceutical companies to pursue mergers and acquisitions, but our expert panel stresses that M&A strategy has changed. Companies seek more specific therapy lines and precision medicine capabilities. This evolution underscores the critical role of integration and IT systems management to unlock value from mergers and spinoffs.
Our research shows drug discovery, clinical trial management, supply chain, manufacturing, CRM systems and M&A are prominent life sciences topics.
The rise of the consumer in life sciences did not rank alongside these areas as an investment priority or critical differentiator. However, the shift to patient-centered healthcare has already changed the structure of the life sciences business, as demonstrated by the divestitures of the consumer products businesses of Johnson & Johnson and GSK. Informed, empowered patients expect more, ask different questions, and demand a wider variety of services from life sciences businesses. Consumers now have a clear perspective on what they want from drugmakers, and these companies need to develop better ways to get closer to their customers.
The entire life sciences value chain is poised to benefit from AI, but only with thorough study and integration. This requires improvements in data, security, governance, integration, and cultural alignment. Breakthroughs in life sciences will build on the strength of established industry practices, combined with innovation in digital, technological, laboratory, and cultural contexts.
An AI-powered reboot for drug discovery
- Artificial intelligence (AI) is accelerating drug discovery through capabilities such as large-scale triage of promising compounds, which reduces failures and identifies strong candidates faster.
- Well-governed, open, and federated data infrastructures are prerequisites for AI to be clinically useful and equitable.
- Trustworthy AI with explainable models, continuous validation, and clear regulation is essential to translate computational promise into safer, fairer medicines.
Drug discovery is slow, costly, and inefficient, making it ripe for an AI-powered reboot. More than 10 years and $2 billion are often required to introduce a single new medicine to patients, and only between 10% and 20% of experimental drugs make it out of trials and to market approval.
Drug discovery is also critical. Nine in 10 life sciences executives surveyed expect drug discovery to be a major differentiator in the next 18 to 24 months (Figure 1).
Figure 1. Drug discovery is key
Source: Infosys Knowledge Institute
Researching, developing, testing, and bringing a drug to market is also a complicated process. Traditional early-stage testing in animals and laboratory dishes is a poor predictor for how humans will respond, leading to failures and associated higher costs. Beyond this phase, success rates remain low. As of 2025, more than 10,000 new drugs are in clinical trials, with about a quarter targeting cancer and the rest focused on metabolic disorders such as diabetes and a range of infectious diseases. Even when these drugs are registered, clinical trials often take four or five years from enrollment to publication, and only about half are ultimately published as full peer-reviewed journal articles. This significantly limits data awareness and knowledge sharing.
AI can help by identifying the most promising drug targets, designing new molecules, and speeding up the development process. Models can analyze massive quantities of public and private data, explore different mechanisms of drug behavior, and identify repurposed applications for medicines that failed in earlier trials against their previous targets. This data-driven triage enables more informed decisions before making large financial commitments, effectively reducing drug development risks. Recent analyses show that publication and patent activity in AI-driven drug discovery have increased steadily, by 34% and 17% respectively, reflecting AI's rapid integration across the discovery pipeline. This is critical, as 62% of industry executives surveyed by Infosys said their companies are likely or certain to face a shortage of new drug candidates in the next 18 to 24 months (Figure 2).
Figure 2. A shortage of new drug candidates is likely
Source: Infosys Knowledge Institute
For now, progress is led by large pharmaceutical companies that can afford to create new data and access high-performance computing power. These companies are also investing in biotech start-ups and national-level biobanks to explore computational applications to design molecules, simulate drug behavior, or develop personalized treatments.
Companies such as Insilico Medicine use vast multiomic datasets, drawing upon information from different biology specialisms and global genomic partnerships, to design therapies with broader applicability, including for rare diseases. Researchers are also increasingly pairing AI with more representative population datasets, like the US National Institutes of Health’s All of Us program and regional efforts such as GenomeAsia100K to reduce bias and improve translatability across diverse patient groups.
At the time of this writing, no AI-designed molecule has received full regulatory approval yet, but progress is accelerating. For example, Insilico Medicine now has multiple AI-designed drug candidates in human studies. Notable examples include rare lung-disease treatment rentosertib, the first drug discovered and designed by generative AI, which is in Phase 2 clinical trials with patients, and the cancer drug ISM3091, which is now in a Phase 1 clinical trial. Exscientia has likewise advanced several AI-designed drug candidates.
Major bottlenecks remain: Models are often trained on biased or limited data, highly ranked AI hits sometimes fail early when tested in the lab (for example, Exscientia’s DSP1181 was discontinued after Phase 1), and other AI-originated candidates such as EXS21546 have been deprioritized despite progressing into clinical studies.
Preclinical discovery reengineered
AI is challenging the classical drug discovery workflows. Scientists can now virtually test thousands of compounds prior to lab experiments by combining patient genetic, protein, chemical, and clinical datasets to inform simulations on how a drug will behave in the body. This fail-early, fail-fast approach helps researchers focus on promising drug candidates and quickly identify early safety risks, saving time and money before expensive lab or clinical testing.
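The fail-early, fail-fast triage described above can be sketched in a few lines. This is a hedged illustration only: the property names (`predicted_binding`, `predicted_toxicity`), fixed weights, and cutoff are hypothetical stand-ins for the trained predictive models that real discovery platforms use.

```python
# Illustrative sketch of fail-early compound triage before lab testing.
# The scoring weights, property names, and compound IDs are invented for
# illustration; real pipelines score compounds with trained models.

def triage_score(compound):
    """Higher is better: reward predicted binding, penalize predicted toxicity."""
    return compound["predicted_binding"] - 2.0 * compound["predicted_toxicity"]

def fail_fast_filter(library, top_n=3, toxicity_cutoff=0.5):
    """Drop compounds above a toxicity cutoff, then keep the top-scoring few."""
    survivors = [c for c in library if c["predicted_toxicity"] < toxicity_cutoff]
    survivors.sort(key=triage_score, reverse=True)
    return survivors[:top_n]

library = [
    {"id": "CMP-001", "predicted_binding": 0.9, "predicted_toxicity": 0.1},
    {"id": "CMP-002", "predicted_binding": 0.8, "predicted_toxicity": 0.7},  # fails early
    {"id": "CMP-003", "predicted_binding": 0.6, "predicted_toxicity": 0.2},
    {"id": "CMP-004", "predicted_binding": 0.4, "predicted_toxicity": 0.4},
]

shortlist = fail_fast_filter(library)
print([c["id"] for c in shortlist])  # only candidates worth lab budget
```

The design point is that the expensive step (lab or clinical testing) only ever sees the survivors of the cheap computational step, which is where the time and cost savings come from.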
Some new technologies include miniaturized human tissues, tiny lab-grown organs, and single-cell measurements to predict how individual patients might respond to a drug. These advances make lab results more relevant to real humans. Dr. Kat Arney, a biologist, author and science communicator, discussed how AI approaches can now generate new biological sequences, which are then iteratively tested and refined in the lab, on the Infosys Knowledge Institute’s AI Interrogator podcast.
Through collaboration with AI providers, pharmaceutical companies can redirect investment to discover new drugs, find promising targets, and repurpose existing drugs for new uses. To create new drugs more likely to reach patients, researchers and companies are using AI in a range of ways, including:
- Using real-world patient information and data-driven predictions to select drug targets and candidate molecules that have a high likelihood of working in humans.
- Learning lessons from past failed experiments and publicly available data to avoid repeating mistakes.
- Harnessing historical data to repurpose drugs that do not appear to work for their original use, but may be effective for other conditions.
- Adjusting drug formulation and timing to make therapies safer, more effective, and commercially viable for longer periods.
These tactics show that AI use in drug discovery workflows goes beyond analyzing historic data to making proactive decisions on quality and translatability. For example, some companies are using AI to assess large bodies of previously generated research data to predict how new drug projects are most likely to succeed, suggest alternative strategies, and eventually guide executives on which projects to prioritize. AI also enables companies to first experiment safely on computers, reducing costs and optimizing early-stage decision-making in drug development.
A new approach
Certain disease groups may benefit disproportionately from AI application. Applications in oncology, infectious disease, and central nervous system (CNS) disorders illustrate its potential to address long-standing challenges in precision medicine.
For example, each cancer tumor is genetically unique, which makes it extremely challenging to design treatments that hit the right targets for every patient. AI models tackle patient-specific complexity by integrating datasets derived from DNA, tissue scans, and maps of molecular interactions. Integrating and making sense of these datasets through AI models guides researchers to promising drug candidates for specific patients, cutting through the biological noise underlying the data.
Companies such as Recursion and Exscientia exemplify this shift, relying on detailed cell imaging, AI-driven molecule design, and clinical datasets to rapidly identify and test new drug candidates. Recursion uses high-content cellular imaging with machine learning to generate and analyze massive datasets from its oncology program, moving from biological discovery to developing a lead drug candidate in under 18 months. Exscientia has advanced multiple small‑molecule candidates designed via generative AI built on Amazon Web Services into human trials, demonstrating that AI‑enabled chemistry can accelerate the move from concept to clinic.
During the Covid-19 pandemic, data-driven AI was used to predict antiviral compounds and optimize monoclonal therapies, ultimately accelerating effective treatment options. Knowledge-graph models, which are AI tools that map relationships between drugs, diseases, and genes, can select thousands of compounds for rapid testing to identify treatments quickly.
This method was used by BenevolentAI during the pandemic to identify baricitinib as a novel therapy to reduce lung cell infection. Similarly, AbCellera's AI platform screened millions of human immune cells and applied computational triage to select one of the world's first monoclonal antibody treatments for Covid-19.
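The knowledge-graph approach described above can be illustrated with a toy graph. Everything here is a hypothetical miniature: the gene sets, the `drug_x`/`drug_y` entries, and the disease-gene links are placeholders, whereas real systems mine millions of literature-derived edges and use far richer scoring than a shared-node count.

```python
# Minimal sketch of knowledge-graph-style drug repurposing: connect drugs to
# diseases through shared gene/protein nodes. All edges below are invented
# placeholders for illustration, not curated biomedical facts.

KNOWLEDGE_GRAPH = {
    # drug -> genes/proteins it modulates (hypothetical edges)
    "baricitinib": {"JAK1", "JAK2", "AAK1"},
    "drug_x": {"GENE_A"},
    "drug_y": {"GENE_B", "GENE_C"},
}

DISEASE_GENES = {
    # disease -> genes implicated in its mechanism (hypothetical edges)
    "viral_lung_infection": {"AAK1", "GENE_B"},
}

def rank_repurposing_candidates(disease):
    """Rank drugs by how many of the disease's genes they already modulate."""
    targets = DISEASE_GENES[disease]
    scored = [
        (drug, len(genes & targets))  # count shared gene nodes
        for drug, genes in KNOWLEDGE_GRAPH.items()
    ]
    return sorted((d for d in scored if d[1] > 0),
                  key=lambda pair: pair[1], reverse=True)

print(rank_repurposing_candidates("viral_lung_infection"))
```

A drug already connected to a disease's mechanism through existing graph edges becomes a cheap, fast repurposing candidate, which is the intuition behind surfacing an approved drug for a new indication.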
While AI sped up infectious-disease drug discovery during the pandemic, it is now being deployed to fight antimicrobial resistance. Drug-resistant infections kill about 1.3 million people every year and contribute to nearly 5 million deaths worldwide. Bacteria and other microbes can quickly mutate or acquire drug-resistant genes, making many antibiotics ineffective. AI modeling has been used to discover novel antibiotics, identifying halicin as a drug effective against numerous bacterial strains. Combining results from data-driven machine-learning models with rapid lab testing refined drug candidates and identified those most effective against resistant bacteria. These developments move antibiotic discovery from experimental serendipity toward systematic, algorithm-driven design.
CNS diseases, such as Alzheimer’s and Parkinson’s, are tough to treat because drugs must cross the brain’s protective barrier without affecting other organs. However, generative AI models can design brain-penetrant small molecules with favorable safety profiles. Specifically, companies such as WiseCube are using AI to design small molecules that enter the brain safely. As another example, Insilico Medicine uses generative AI models in a range of approaches, including developing an oral drug that targets brain inflammation, reaching early testing milestones faster than traditional methods.
High-tech approaches
Biobanks of structured datasets that account for the complexity of human diversity and follow findable, accessible, interoperable, reusable (FAIR) principles can become the foundation of AI-driven discovery. In this sense, biobanks are not just static data repositories, but essential AI enablers. Well-engineered biobanks can ensure that predictive AI models are grounded in real human biology by combining high-quality, harmonized multiomic and clinical data, implementing rigorous data curation standards, and continuously updating datasets to reflect longitudinal health outcomes. This allows companies and researchers to benchmark what good looks like, validate AI predictions against real-world observations, and iteratively refine models to improve accuracy and translatability for personalized drug development.
Digital twins are computationally reconstructed copies of patients, built from genetic, protein, metabolic, imaging, and long-term health data. Well-designed digital twins can allow researchers to run millions of virtual experiments to find promising drug targets, predict who will respond to specific treatment regimens, and spot safety risks before clinical testing in humans. While this approach is exciting, it only works if supported by robustly engineered algorithms and efficient data infrastructure networks.
Biobanks are rapidly becoming the engine rooms for developing digital twins. Projects such as the UK Biobank, Biobank Japan, and FinnGen now pool genetics, proteins, imaging, and long-term health data from millions of volunteers.
AI tools use this data to spot disease patterns and link them to real-world outcomes, turning static archives into living research systems. For example, AstraZeneca and the University of Oxford use data from the UK Biobank, which holds health records and genetic sequencing data from half a million individuals, together with the MILTON AI platform to map genetic risk factors for disease and predict patient outcomes. Similarly, Finland’s FinnGen project, which is analyzing genomic and health data from 500,000 participants, combines protein and genetic data to identify new biomarkers of disease that can be used to guide treatment strategies. BioBank Japan, meanwhile, has obtained genetic and clinical data from approximately 260,000 participants and has partnered with Google Japan to build AI‑based risk score models with the aim of improving prevention and disease management. A machine learning study of participants from the UK and Japanese biobanks found that the relationship between lifestyle factors and cardiometabolic disease varied widely across genetic risk backgrounds, highlighting how well-designed biobank data plus AI methods can reveal nuances invisible in traditional studies.
By linking public, academic, and private resources, these initiatives create the large, standardized datasets necessary for devising regulatory-quality AI-driven drug development workflows.
The Chan Zuckerberg Initiative (CZI) and NVIDIA, for example, are collaborating to build one of the largest high‑performance computing clusters in nonprofit life sciences research. This enables virtual cell system models that allow researchers to simulate how cells and tissues respond to treatments at scale. Through collaborations with organizations like the European Molecular Biology Laboratory’s European Bioinformatics Institute, CZI is also applying multiomics and clinical data to build patient‑level digital twins for rare diseases, using simulation to fill data gaps in small populations and to propose novel therapeutic interventions.
However, turning these vast and varied datasets into accurate digital twins of real patients will depend on having the right technical systems and clear rules for data use. Designing digital twins that enable personalized drug development will require strong technological foundations, including:
- Centralized, curated data lakes: Biobanks rely on well-organized bioinformatic tools that can combine genetic, protein, metabolic, imaging, and clinical data at scale, while keeping formats clean and traceable.
- FAIR data standards: Researchers and stakeholders must ensure that organized datasets are easy to find, share, and reuse across research groups.
- Secure federated learning: Strong regulatory oversight and governance are necessary to create a secure regulatory framework for training future AI models on multiple hospital and biobank datasets without sharing private patient information.
- Powerful cloud computing: Running advanced AI models at population scale, across diverse patients, will require robust and efficient computing resources.
- Continuous validation: Without inclusion of explainable outputs and uncertainty checks, results will be difficult for both clinicians and regulators to trust.
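The secure federated learning item in the list above can be reduced to a toy round of federated averaging. This is a deliberately minimal sketch: the "model" is a two-element weight vector, the cohorts are fabricated numbers, and production systems layer on secure aggregation, encryption, and differential privacy. The essential property it demonstrates is that only parameters leave each site, never patient records.

```python
# Toy sketch of federated learning: each site trains locally on private data
# and shares only model parameters with a central server. Data values and the
# single-weight-vector "model" are fabricated for illustration.

def local_update(weights, site_data, lr=0.1):
    """One gradient-like step per site on private data (simple mean shift)."""
    site_mean = sum(site_data) / len(site_data)
    return [w + lr * (site_mean - w) for w in weights]

def federated_average(site_weights):
    """Server averages parameter vectors; raw data never leaves the sites."""
    n = len(site_weights)
    return [sum(ws[i] for ws in site_weights) / n
            for i in range(len(site_weights[0]))]

global_model = [0.0, 0.0]
private_cohorts = [[1.0, 1.2, 0.8], [2.0, 1.8, 2.2]]  # stay on-site, never pooled

updates = [local_update(global_model, cohort) for cohort in private_cohorts]
global_model = federated_average(updates)
print(global_model)
```

In a real deployment each hospital or biobank would run many local training steps per round, but the privacy contract is the same: the server sees updates, not records.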
Clinicians should be able to interact with digital twins through intuitive user interfaces. Clear presentations of AI predictions, data visualizations, and explanations of how AI reached its conclusions are most useful for clinicians who do not regularly interact with AI models. These adjustments can empower doctors and clinicians to explore virtual patient results, test ideas, and make evidence-based treatment decisions by working with AI user interfaces, rather than against them.
In this system of digital-twin translatability, biobanks become dynamic discovery engines. By continuously supplying high-quality, standardized, and ethically managed data to digital twins, they help create accurate models of human biology that can speed up preclinical validation and advance precision medicine on a global scale.
AI drug discovery success
The transition to an AI-enabled precision medicine ecosystem for creating novel drugs requires coordinated action across academia, industry, and government.
To realize AI’s potential in drug discovery, stakeholders should prioritize several strategic actions to ensure transparency, equity, and clinical impact across populations, such as:
- Turn biobanks into active data hubs: Instead of simply storing samples, biobanks should continuously supply high-quality data to AI systems. By combining genetic, protein, metabolite, imaging, and clinical information, these living datasets can drive predictive models and digital twins that better reflect the diversity of real patients.
- Connect academia, healthcare, and industry: AI models can only reach their full potential when researchers, hospitals, and companies share data and tools openly. Standardizing data formats and governance makes insights easier to use and reproduce, accelerating discoveries and setting the stage for the next generation of breakthroughs. The UK Biobank-DeepMind partnership demonstrates how collaborative frameworks can responsibly combine biomedical and AI expertise to generate scalable insights. Multi-stakeholder efforts such as the Accelerating Medicines Partnership led by the US National Institutes of Health, Food and Drug Administration, and pharmaceutical companies such as Ultromics and Anumana, also exemplify how shared governance and interoperable data ecosystems drive research.
- Create clear and trustworthy AI outputs: Clinicians should understand how AI makes its predictions. Safe and responsible use in drug development requires that data and methods are transparent, and that people remain closely involved in the process. When AI supports rather than replaces human judgment, it builds trust, strengthens clinical decisions, and protects patient well-being.
- Make AI accessible worldwide: If the greatest AI advances are centralized within a few countries or companies, then AI in drug development could worsen global health gaps. Governments and funders should expand digital infrastructure, train talent, and provide open-source AI tools, especially in lower-income regions. Platforms like the Global Alliance for Genomics and Health, which sets standards to expand genomic data use within a human rights framework, and the UK’s eLIXIR, which links maternity, neonatal, and maternal mental health records, have shown how open standards can democratize access to biomedical data.
- Update rules and ethics: Laws and guidelines must keep pace with AI advances. Clearly established rules on data use, accountability, and decision-making, combined with explainable outputs, can help earn public trust. Without this, AI for designing patient-specific drug treatments risks being more promise than practical application.
Bringing AI into drug discovery is more than developing new tools through the quickest means possible. It also requires a complete rethink of how this technology is developed and applied. Linking open biobanks with transparent AI and real collaboration holds the promise of making drug development faster, more accurate, and more equitable, and of supporting breakthroughs for more diseases and hard-to-treat patient groups.
The promise of better clinical trial outcomes
- Clinical trials are costly, lengthy, and failure-prone, with nearly 90% of drugs trialed never reaching the market.
- Trial-specific innovations related to artificial intelligence are reshaping every stage of the trial process.
- Organizations that try to bolt AI onto an outdated operating model struggle to scale.
- Robust data governance, human oversight, and organization culture changes can lead to more efficient and successful drug trials.
Clinical trials create the most cost, delay, and risk in drug development. The cost of bringing a new drug to market has climbed to $2 billion, up from about $1 billion at the turn of the century. Clinical trials consume between six and seven years of a decade-plus research-and-development timeline.
The majority of life sciences executives surveyed by Infosys called clinical trials their top investment priority (Figure 1). Even so, some nine in 10 drugs and medical devices fail in the trial phase, never achieving a return on their hefty investments.
Figure 1. Clinical trials are the top investment priority
Source: Infosys Knowledge Institute
Drug discovery is a prime mission for life sciences companies, and rich with promise, despite not being a top investment priority for most. Universities, biotech firms, and pharma startups are flush with potential molecules that may ultimately be developed into viable therapeutics. The problem is not a lack of promising compounds. The problem is that even a compound researchers believe in must be proven to work in a randomized clinical trial; until then, its promise counts for little.
Industry-wide adoption of software-as-a-service tools has given life sciences companies cloud-based benefits including real-time collaboration and enhanced agility. But this adoption has not led to innovation in critical business processes. This is due to the complexity, scale, and regulatory rigor of clinical trials. Covid vaccine development was a notable exception where decentralized trial approaches took a sudden leap forward. Advanced technologies now being deployed have the potential to address other process bottlenecks and reshape clinical trials.
Generative artificial intelligence (AI), predictive modeling, machine learning (ML), and natural language processing (NLP) applied to operations and decision support are no longer theoretical. They are already shortening clinical trial timelines, reducing risk, and helping bring vital therapies to patients faster.
The key question is how AI can reduce risk, improve the probability of success, and increase the speed of clinical trials.
Investment in drug development
Life sciences executives say they spend more on clinical trials than on any other area, and they intend to increase that spending significantly (Figure 2).
Figure 2. Clinical trials already top spending
Source: Infosys Knowledge Institute
As noted in chapter one of this journal, no AI-designed molecule has full regulatory approval yet, but Insilico’s rare lung-disease treatment rentosertib is in Phase 2 clinical trials, and other AI-involved candidates are in earlier stages. And DeepMind’s AlphaFold has proven very successful in predicting the atomic structures of proteins, paving the way for better understanding of disease and faster design of new therapies.
Meanwhile, French pharma company Sanofi successfully used digital-twin technology to make virtual copies of organs in simulating the performance of its asthma drug lunsekimig, and found that it accurately predicted the results of the clinical trials that followed. During Covid-19, American multinational Pfizer used AI-powered virtual screening to compress the discovery and preclinical stages for its antiviral drug Paxlovid, chopping years off its development so that it received Food and Drug Administration (FDA) authorization in just 21 months.
Decentralized and siteless trials offer new solutions to longstanding challenges in clinical trial execution and patient participation. Trial participants are easier to recruit and retain in decentralized and hybrid trial settings.
Wearable devices, remote sensors, and televisits allow clinicians to monitor and collect data periodically or continuously from patients electronically in their own homes or at other remote sites. For example, Infosys subsidiary Kaleidoscope Innovation designed a wearable sensor that analyzes sweat in real time and sends the data to a research center for analysis. Such work can transform data gathering because trial participants are often dealing with debilitating illnesses, and clinical trial centers can be hundreds of miles away. Traveling can be onerous, leading potential participants to decline, datasets to become incomplete due to missed appointments, or enrollees to drop out entirely.
Decentralized trials also improve representation. Regulators require trials to reflect the racial, ethnic, and genetic diversity of real-world populations, yet white and Asian populations are often over-represented while Black people and Hispanics are underrepresented. AI-driven matching can help correct these imbalances when paired with intentional data governance, ensuring models expand access rather than reinforce historical bias, although datasets must first be audited to identify and mitigate any bias toward race or sex.
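The dataset audit described above can be illustrated with a short sketch that compares enrollment shares against a reference population before any AI-driven matching is applied. This is a minimal, hypothetical example: the group labels, counts, and the 5% tolerance are invented for illustration, not drawn from any regulatory guidance.

```python
# Sketch of a dataset representation audit: flag groups whose enrolled share
# deviates from the reference population share by more than a tolerance.
# All figures below are hypothetical.

def representation_gaps(enrolled, reference, tolerance=0.05):
    """Return groups whose enrolled share differs from the reference
    population share by more than `tolerance` (absolute difference)."""
    total = sum(enrolled.values())
    gaps = {}
    for group, ref_share in reference.items():
        share = enrolled.get(group, 0) / total
        if abs(share - ref_share) > tolerance:
            gaps[group] = round(share - ref_share, 3)
    return gaps

# Hypothetical trial: white and Asian participants over-enrolled,
# Black and Hispanic participants under-enrolled versus the population.
enrolled = {"White": 700, "Asian": 150, "Black": 70, "Hispanic": 80}
reference = {"White": 0.60, "Asian": 0.06, "Black": 0.13, "Hispanic": 0.19}

print(representation_gaps(enrolled, reference))
```

Positive values indicate over-representation and negative values under-representation; a real audit would also stratify by sex, age, and genetic ancestry.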
Data and AI are increasingly shaping how patients are identified, engaged, and represented across clinical trials. Predictive analytics surface high-risk dropout patterns, refine inclusion criteria, and identify responsive sub-populations by learning from prior trials and real-world patient data. When these datasets are integrated, cleaned, and governed, they provide a foundation for new approaches such as synthetic control arms, reducing reliance on traditional placebo groups while preserving scientific rigor. Linking real-world data with legacy clinical datasets also allows sponsors to extract greater value from existing evidence, accelerating insights, improving trial feasibility, and enabling more targeted development strategies.
Studies show AI-assisted design can improve recruitment rates by up to 65% and forecast accuracy by 85%. AI trial design also accelerates trial outcomes by between 30% and 50%, and reduces costs by 40%, the studies show.
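The dropout-risk analytics described above can be sketched with a simple logistic score. The features, weights, and threshold here are invented for illustration; a production model would learn them from prior trials and real-world patient data.

```python
import math

# Illustrative dropout-risk sketch: score each participant from simple burden
# signals and flag those above a threshold for retention outreach.
# The weights are assumptions, not learned parameters.

WEIGHTS = {"distance_km": 0.02, "missed_visits": 0.8, "visit_burden": 0.3}
BIAS = -3.0

def dropout_risk(participant):
    """Logistic score in (0, 1): a probability-like risk of dropout."""
    z = BIAS + sum(WEIGHTS[k] * participant[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def flag_high_risk(participants, threshold=0.5):
    """IDs of participants whose risk meets or exceeds the threshold."""
    return [p["id"] for p in participants if dropout_risk(p) >= threshold]

cohort = [
    {"id": "P001", "distance_km": 10,  "missed_visits": 0, "visit_burden": 2},
    {"id": "P002", "distance_km": 250, "missed_visits": 2, "visit_burden": 4},
]
print(flag_high_risk(cohort))  # only the distant, visit-burdened participant
```

Even a toy model like this shows the operational pattern: convert burden signals into a ranked list so site staff can intervene before a dropout occurs.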
Linking real-world data such as electronic health records with traditional clinical datasets promises faster trials and richer insights. In the context of clinical trial control arms, real-world data can reduce the size of control groups that do not receive experimental treatment. For example, a leading pharmaceutical company worked with Infosys to target resources and re-use clinical and real-world data for a trial. The company aligned evidence from electronic health records with traditional trial datasets to drive richer insights and more efficient clinical development.
Beyond trial design and execution, similar advances are now reshaping medical writing and clinical operations. Foundational AI, combined with agentic reasoning, is driving step-change improvements in data quality, document accuracy, and operational efficiency. Automated document classification, quality checks, and intelligent content validation ensure that evidence is structured, traceable, and inspection-ready from the moment it is created. At the same time, agentic AI systems can reason across protocols, standard operating procedures, case report forms, and trial master files to identify inconsistencies, generate regulatory-grade content, and accelerate review cycles. This has the potential to reduce manual effort and enable clinical teams to focus on higher-value scientific and operational decisions.
Obstacles on the road ahead
Despite early evidence that life sciences companies have been relatively successful in putting AI to work, most AI initiatives in the industry still fail to scale beyond proof of concept. Pilots routinely collapse under real-world variability: models trained on curated datasets fail when data becomes incomplete, inconsistent, and continuously evolving as projects move from the laboratory to open environments. Scaling also requires companies to confront the persistent fact that historical data contains inequities that ungoverned algorithms can entrench and amplify. Data governance and continuous monitoring are required to prevent this variability from blocking sustained AI adoption.
The true barriers to adoption turn out to be fragmented workflows and uneven data readiness, not model quality. Proofs of concept that function in a controlled clinical setting struggle in the messy, practical conditions of full-scale production. When an AI model attempts to integrate across organizational silos, the underlying fragmentation becomes visible: different departments follow different rules, data definitions vary, and critical inputs are not standardized or captured reliably.
Further, return on investment from AI initiatives remains elusive. Companies often lack a benchmark understanding of their most critical business processes, skewing their views of where the actual bottlenecks are. Enterprises rarely tie strategic initiatives to quantifiable outcomes, or measure against them consistently after an initiative is launched. That said, a growing number of companies and their advisors recognize this issue and are working on improvements.
The gap between AI demos and AI business impact continues to widen.
And these constraints are tightening. Models are cheaper and more powerful, but deployment and integration have become the bottlenecks; regulation is intensifying; data complexity is rising; and economic pressure is increasing. The gap between AI demos and AI business impact continues to widen, and tech investments are under greater scrutiny for value realization. If companies are unable to estimate both costs and opportunities, it becomes difficult to make a case to move from the status quo. Companies and their advisors need very strong evidence of an opportunity to make the case for wide adoption of AI in appropriate use cases. Without the right insight and data, that case becomes even harder to make.
This reveals a deeper truth in life sciences: organizations are not failing to adopt AI; they are attempting to scale it across operating environments shaped by fragmentation, variability, and historically independent evolution. Scaling AI therefore demands not only new tools but also coherent governance, aligned incentives, shared accountability, and operating models designed for continuous variability rather than static control.
Realizing the value of AI requires a cultural shift away from manual, familiarity-based ways of working toward data-driven decision-making, shared accountability, and trust in continuously evolving technology. The cultural change is about learning new ways to decide, not about learning new tools.
Finally, trust in the integrity of clinical trials has always been, and will always be, a prerequisite for both regulators and participants.
Winning their trust in the output from the application of any new technology is vital. This can only come with transparency about how outputs were produced, and whether those outputs are related to trial success or failure.
Heightened cybersecurity is also paramount, both to protect participant privacy and to prevent hackers from tampering with results.
While none of these challenges necessarily need to slow adoption, they do demand continuous and vigilant governance, discipline, and human oversight.
The AI prescription
The prescription is an AI-ready operating model designed for modern clinical development. This includes:
- Harmonized processes and governed data,
- Structured, explainable AI frameworks rather than isolated prompt-based tools,
- Value-first design with measurable outcomes such as reduced amendments, faster document cycles, and improved enrollment, and
- Enterprise-wide readiness built on responsible AI governance, change management, and integration with core clinical systems.
When these elements are in place, AI can reliably accelerate clinical development, reduce cost, and bring therapies to patients faster.
For example, Novartis has published a detailed framework for responsible use of AI systems that describes governance, transparency, bias mitigation, data quality and life cycle management standards for its entire organization. This is a concrete example of big pharma treating AI not as a tool but as a governed system.
Six traits of AI maturity
Infosys research suggests that organizations leading in AI maturity share six common practices:
- Establish strong data foundations. Invest in data curation, interoperability, and lineage tracking before scaling AI. Adopt FAIR standards and metadata catalogs, which are centralized libraries of data assets, to make data findable and reusable.
- Design for diversity. Integrate demographic and genomic data early to ensure patient recruitment is representative and algorithms are fair. Include bias testing as a standard operating procedure.
- Balance automation with human oversight. Embed human-in-the-loop review for all critical decisions to reduce the risk of propagating errors. This also helps prevent AI deskilling.
- Adopt hybrid trial models. Combine in-person and remote elements to improve participant retention and reduce site costs. Use digital twins to run virtual control arms where appropriate.
- Upskill continuously. Ensure clinicians, statisticians, and operations staff are all AI-literate. Make training modular and set up cross-functional teams to create an environment in which information flows freely between experts.
- Implement ethical and transparent AI governance. Define accountability, audit trails, and escalation paths. Infosys experts advocate an AI-assurance framework built on traceability, interpretability, and fairness.
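To make the first trait concrete, a metadata catalog can be sketched as a small registry that supports findability and lineage tracking. The field names and dataset identifiers below are illustrative assumptions, not a formal FAIR schema.

```python
# Minimal sketch of a metadata catalog: a central registry that makes datasets
# findable by keyword and records lineage via a derived_from pointer.
# Dataset names and fields are hypothetical.

catalog = {}

def register(dataset_id, description, keywords, derived_from=None):
    """Add a dataset's metadata to the central catalog."""
    catalog[dataset_id] = {
        "description": description,
        "keywords": set(keywords),
        "derived_from": derived_from,  # lineage pointer to a parent dataset
    }

def find(keyword):
    """Findability: look up dataset IDs by keyword."""
    return sorted(d for d, meta in catalog.items() if keyword in meta["keywords"])

def lineage(dataset_id):
    """Walk the derived_from chain back to the original source."""
    chain = [dataset_id]
    while catalog[chain[-1]]["derived_from"]:
        chain.append(catalog[chain[-1]]["derived_from"])
    return chain

register("ehr_raw", "Raw EHR extract", ["ehr", "raw"])
register("ehr_clean", "Curated EHR cohort", ["ehr", "curated"],
         derived_from="ehr_raw")

print(find("ehr"))           # both datasets are findable by keyword
print(lineage("ehr_clean"))  # curated cohort traces back to the raw extract
```

The point of the sketch is the pattern, not the implementation: reuse becomes possible only when every derived dataset is registered with a pointer back to its source.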
Compound the savings
The cost of AI in clinical development is often framed in familiar terms: platform investment, integration effort, governance, and change enablement. These costs are tangible, planned, and visible.
What is far less visible — and far more material — is the cumulative cost of operating without an AI-ready enterprise model. Fragmented data estates, independently evolved processes, manual coordination, and retrospective oversight quietly compound across studies and portfolios, creating delay, rework, and preventable risk that rarely appears on a single line item.
Companies must reframe this equation by treating AI as a managed, enterprise capability rather than a series of experiments. Through life cycle management, responsible-by-design governance, and platform-level integration, AI becomes reusable, monitorable, and scalable. Value is found in the ability to repeatedly apply intelligence across decisions, workflows, and portfolios, not isolated use cases. In this model, savings emerge through earlier risk detection, reduced late-stage disruption, fewer reconciliation cycles, and the release of expert capacity from manual tracking to higher-value oversight. Just as importantly, these savings compound as organizational data and knowledge are retained and reused. This leads directly to better protocol design, more predictable execution, and stronger sponsor oversight over time.
The real question, therefore, is whether companies can continue to absorb the hidden and recurring costs of the fragmented status quo when an AI-first, responsibly governed enterprise model makes those costs increasingly avoidable.
The institutional intelligence link
AI and advanced digital technologies are no longer peripheral innovations in clinical development. They are becoming the connective tissue that links discovery, trial design, execution, oversight, and evidence generation into a coherent whole. When historical trial data, real-world evidence, and operational insights are reused rather than isolated, each study contributes to the next, strengthening scientific understanding and improving predictability.
Realizing this potential requires an AI-ready enterprise model built for continuous variability: one that aligns processes across therapeutic areas and regions, governs data and models responsibly, and embeds intelligence directly into decision-making workflows.
Cultural change is central to this shift. Teams must learn to operate with probabilistic insight, trust early signals, share accountability beyond traditional boundaries, and view compliance, oversight, and learning as continuous disciplines rather than episodic events.
In this context, the promise of AI is not speed alone, but institutional learning at scale. The result is a clinical development system that is more adaptive, more transparent, and more equitable. AI-enhanced clinical trials are capable of delivering safer therapies faster while retaining the rigor and trust that life sciences demand.
The direction forward is clear. The question is whether organizations will evolve their cultures, governance, and operating assumptions fast enough to realize the full benefit of AI applied to clinical trials.
Case study: Compliance and reporting considerations
Regulatory documentation is one of the most time-consuming aspects of clinical research. Designing a trial and its associated protocols, and securing all the required permissions, can take six to 12 months, and substantially longer for international trials. Generative AI is already being used to draft required reports on safety issues with individual patients, assemble submission dossiers, and rapidly translate documents, reducing this timeframe. A major drug developer worked with its technology provider to automate clinical safety reports. Generating those reports with AI and automation tools cuts turnaround times and produces more consistent and traceable documentation.
In practice, Infosys research found that automating safety reports can increase productivity by up to 60%, particularly in Phase 2 and Phase 3 drug studies with large patient populations. These reports are a crucial part of clinical studies that must be submitted to regulators.
Before putting automation and AI tools to work, the client established clarity on standards, starting with goals. For the patient outcome reports required by regulators, this came down to faster production, less manual intervention, and more consistent narratives in terms of language and structure.
The second critical standard is data. Before automating anything, our developers worked to be sure that data sets were fully available, up to date and adhered to standards, such as those set by the Clinical Data Interchange Standards Consortium.
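Pre-automation data checks of this kind can be sketched as simple validation rules run over each record before anything is generated. The field names and date rule below are illustrative assumptions, not actual CDISC variable names or conformance rules.

```python
import re

# Sketch of pre-automation data validation: confirm required fields are
# present and dates follow one consistent format before narrative generation.
# Field names are hypothetical, not CDISC variables.

REQUIRED = {"subject_id", "event_term", "onset_date", "severity"}
DATE_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # accept ISO 8601 dates only

def validate_record(record):
    """Return a list of problems; an empty list means the record passes."""
    problems = [f"missing field: {f}" for f in sorted(REQUIRED - record.keys())]
    date = record.get("onset_date", "")
    if date and not DATE_RE.match(date):
        problems.append(f"bad date format: {date}")
    return problems

good = {"subject_id": "S01", "event_term": "Headache",
        "onset_date": "2024-03-01", "severity": "Mild"}
bad = {"subject_id": "S02", "event_term": "Nausea",
       "onset_date": "03/01/2024"}

print(validate_record(good))  # passes cleanly
print(validate_record(bad))   # missing severity, inconsistent date format
```

Running such checks up front keeps format inconsistencies out of the generation step, where they would otherwise surface as narrative errors.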
With standards in place, the client ingested the study's reporting plan to establish rules for when a study subject's data would trigger a report. These rules were then used to build outlines and create content components from the study data.
A simple user interface was developed to present the content components to keep a human in the loop and allow for configuration and flexibility before generating narratives.
Finally, the client used natural language processing tools trained on pharmaceutical-specific data sources, including the Medical Dictionary for Regulatory Activities, to generate and refine narratives.
This allowed researchers to rapidly convert tabular study data into consistent safety reports.
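The final step, converting tabular study data into consistent narratives, can be sketched with a fixed template. A production system would use NLP models and MedDRA-coded terms as described above; the template and fields here are illustrative assumptions.

```python
# Sketch of template-based narrative generation: every row of study data is
# rendered through one fixed template, guaranteeing consistent language and
# structure. The template and field names are hypothetical.

TEMPLATE = (
    "Subject {subject_id}, a {age}-year-old {sex}, experienced {event_term} "
    "({severity}) on {onset_date}. The event was {outcome}."
)

def generate_narrative(row):
    """Render one tabular record into a safety narrative sentence."""
    return TEMPLATE.format(**row)

row = {
    "subject_id": "S014", "age": 62, "sex": "female",
    "event_term": "dizziness", "severity": "moderate",
    "onset_date": "2024-05-12", "outcome": "resolved without intervention",
}
print(generate_narrative(row))
```

Because every narrative is produced by the same template, reviewers compare content rather than style, which is one source of the consistency gains described in the case study.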
Life sciences CRM: Who really owns the customer?
- A split between life sciences CRM providers Salesforce and Veeva is forcing organizations to choose between platforms and prompting a rethink of engagement architecture.
- CRM is evolving from a sales tool to an enterprise-wide engagement platform supporting personalized, multichannel interaction between drug developers, healthcare professionals, patients, and payers.
- AI and regulation are reshaping the CRM landscape: As expectations rise for intelligent orchestration and data auditability, platform selection becomes a strategic choice.
Customer relationship management (CRM) platforms have become essential tactical tools for pharmaceutical and biotech companies, and they play a critical role in meeting the challenges facing life sciences businesses going forward.
These enterprises are confronting increased commercial complexity and heightened expectations from patients. Managing those expectations requires CRM ecosystems to evolve from basic sales coordination tools into strategic platforms that support multichannel engagement.
This transformation is critical. A significant gap is emerging between outreach efforts and customer needs: while most life sciences executives are confident in their engagement strategies, only 28% of healthcare professionals (HCPs) say those strategies meet their needs. This gap suggests that CRMs require a rethink, not merely an upgrade. The key question is this: What kind of customer relationship do life sciences organizations want?
The answer is complicated by significant software platform disruption. In November 2022, Veeva, a cloud services company specializing in life sciences, announced it would end its 15-year-plus collaboration with Salesforce. This was followed by the announcement of an expanded global partnership between Salesforce and healthcare IT company IQVIA to leverage IQVIA’s Orchestrated Customer Engagement suite to develop and enhance Salesforce’s Life Sciences Cloud. The Salesforce offering is now branded as Agentforce Life Sciences. Existing Veeva customers must now choose between migrating to Veeva’s Vault CRM platform or Agentforce. IQVIA customers must decide between a transformation to Veeva’s platform or a migration to Agentforce.
The Veeva-Salesforce split is forcing CRM decisions to the top of the strategic agenda. Customers on Veeva-Salesforce stacks must now make a choice. Roche, Merck, and GSK are among those who have committed to Veeva’s new commercial Vault platform. Salesforce, meanwhile, is expanding its Agentforce Life Sciences Cloud, bringing together HCPs, patient services, trial management, and data integration. Pfizer, Novartis, Takeda, Haleon, and others have been named as early adopters.
Earlier, some companies experimented with using both platforms. In the last few months, most have settled on one. Using a single platform evidently allows them to manage operations more effectively and to unify typically siloed functions and affiliates.
Ready or not, life sciences organizations must reassess their customer engagement strategies and address the challenges of process and organization changes, platform migration, service agreements, and integration of legacy software systems.
A costly and complex choice
Most pharmaceutical organizations would prefer to avoid choosing a side. For all its limitations, the legacy stack was deeply integrated into sales, compliance, and medical workflows. Breaking that architecture introduces both direct and indirect costs, from migration planning and integration rebuilds to retraining field teams, regulatory validation, and system governance.
CRMs are typically integrated with master data management, data lakes, analytics platforms, content management systems, and increasingly with external tools like consent hubs or AI engines. Re-platforming even part of the stack has a ripple effect across downstream and upstream systems. These concerns explain why some companies are opting to run both platforms concurrently, at least in the short term.
Nonetheless, the situation provides an opportunity for a reset at companies that have postponed major CRM upgrades or failed to modernize beyond field automation.
The choice hinges on the strategic difference between the two platforms. Veeva promises deep life sciences specialization, and tighter integration of medical and commercial teams with established delivery partners like Infosys, Accenture, Deloitte, and Cognizant. Salesforce positions itself as an end-to-end, data-focused solution coupled with its Agentforce AI agents, which aims to unify clinical trials, commercial engagement, and patient services, with a deep partner ecosystem that includes technology consultancies Infosys, ZS, Accenture, Cognizant, Capgemini, and Deloitte. Both target the same transformation goal: Move CRMs from a sales enablement tool to a more integrated customer experience engine.
CRM’s evolving purpose
Pharmaceutical businesses originally deployed CRM systems to track face-to-face visits to HCPs. Over time, the platforms evolved to support engagement by phone, email, and other digital channels, as well as to handle medical content delivery, consent management, and even some patient services. But for all their technical growth, CRM platforms have often under-delivered on strategic impact.
Still, an Infosys survey of life sciences executives found that 61% were satisfied with their CRM systems (Figure 1). Deloitte found a similar disconnect: in the survey mentioned above, where 28% of HCPs called customer engagement by life sciences companies effective, 82% of life sciences executives said they were satisfied with their customer engagement strategies.
Fragmented data, poor integration with marketing, sales, and customer engagement tools, and inconsistent workflows all contribute to this shortfall. Furthermore, in many emerging markets, the personal relationship between HCPs and sales representatives has a greater influence on outcomes than the technical capabilities of CRM systems. As a result, many sales teams tend to use their corporate CRM system only at a minimal level.
Figure 1. Most execs are satisfied with CRM
Source: Infosys Knowledge Institute
The strategic CRM
The key is not the platform choice, but how it supports the company’s broader engagement architecture and data benefits. Life sciences organizations are increasingly seeking to rationalize fragmented digital estates. CRMs must be part of that strategy, not a siloed toolset maintained for legacy reasons.
This shift is also reshaping how platform value is measured. The general trend is a move to a CRM organized around the needs of the customer or patient, rather than the organization’s internal processes. For example, a customer-focused CRM might identify that a healthcare provider isn’t buying a particular medicine because it doesn’t have trained staff to give injections. The system can then pass this information from the sales team to the medical team to resolve the problem. In a process-driven system, with siloed sales and medical teams, it would take longer to identify the problem.
Finally, new challengers are emerging in the form of regional players and organizations targeting mid-tier pharmaceutical organizations. While niche alternatives are unlikely to displace Salesforce or Veeva, they reflect a growing recognition that CRMs cannot remain static. Whether a company standardizes on one vendor, adopts a hybrid approach, or experiments with niche platforms, there's a fundamental question: Can the CRM accommodate the engagement model of the future?
Patient-driven engagement
Patients are driving changes in CRM systems from the outside. In fields such as obesity, oncology, dermatology, and rare to ultra-rare diseases, patients often initiate treatment conversations after seeing therapies discussed on social media or appearing in consumer advertising. When demand precedes prescription, companies seek ways to coordinate engagement across functions, linking medical, marketing, field, and patient services into a coherent process.
CRMs are being reconfigured to meet these use cases. Salesforce, for example, has positioned its Life Sciences Cloud as a cross-functional platform supporting not just field reps but also patient engagement teams and other functions.
Veeva, historically centered on HCP engagement, is also expanding its footprint. In 2024, it launched Campaign Manager and Service Center, two modules that extend Vault CRM into post-prescription and patient support use cases. Campaign Manager allows companies to build multichannel marketing journeys across brands and regions, while Service Center supports contact center workflows and patient service activities by integrating the CRM with other software, including the Microsoft 365 suite. Veeva has also integrated its CRM Events system into the Vault platform.
These additions reflect a growing consensus: CRMs must now serve as the operational core of customer experience in life sciences. That means supporting patients of all kinds, all available channels, and tailored journeys, while maintaining compliance and leveraging real-world data.
When demand precedes prescription, companies seek ways to coordinate engagement across functions, linking medical, marketing, field, and patient services into a coherent process.
The rise of AI
AI is becoming key to this new CRM model, with almost unanimous agreement among those in the sector that it is shaping their strategy. Rather than present someone with information and leave them to decide what to do next, AI can analyze that information, find subtle patterns, and suggest the next best content or channel.
Salesforce has embedded agentic AI across its platform, allowing for the automation of tasks such as deploying approved marketing materials without delay or preparing field teams for the next engagement. New AI-powered tools promise to reduce administrative workload and improve responsiveness for staff. Veeva Systems moved more slowly in deploying native AI capabilities, launching its offering, Veeva AI Agents, in December 2025.
Third-party providers are helping bridge the AI innovation gap. Several companies have developed sales and marketing decision support tools for life sciences, using AI-based orchestration. One top 10 pharma organization used such a tool to activate next-best-action recommendations, increasing revenue by as much as $30 million. Next-best-action is a discipline that draws on data to recommend a specific next step for a sales rep to move a health professional closer to a sale, recommendation, or other optimal outcome. Advocates say next-best-action increases trust and engagement with HCPs.
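A next-best-action engine can be sketched as a scoring function over simple engagement signals. The actions, signals, and weights below are invented for illustration; commercial orchestration tools use far richer models and many more channels.

```python
# Illustrative next-best-action sketch: score each candidate action for an
# HCP from two signals (time since last contact, content engagement) and
# recommend the highest-scoring one. All names and weights are hypothetical.

ACTIONS = {
    "send_clinical_update": {"recency_weight": 0.6, "interest_weight": 0.4},
    "schedule_visit":       {"recency_weight": 0.2, "interest_weight": 0.8},
}

def next_best_action(hcp):
    """Return the action with the highest score for this HCP's signals."""
    def score(weights):
        return (weights["recency_weight"] * hcp["days_since_contact"] / 30
                + weights["interest_weight"] * hcp["content_engagement"])
    return max(ACTIONS, key=lambda a: score(ACTIONS[a]))

# Hypothetical HCP: contacted recently but highly engaged with content,
# so an in-person visit outscores another content push.
hcp = {"days_since_contact": 5, "content_engagement": 0.9}
print(next_best_action(hcp))
```

The design choice worth noting is that the recommendation is explainable: each score decomposes into named signals, which matters when outputs must be auditable.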
Next-best-action is just the beginning. Marketing and sales consultants are urging life sciences companies to explore next-best experience. This concept uses AI to bring in revenue-maximizing concepts like return on experience to measure a patient’s optimal interaction with an HCP. In contrast with a specific next-best action, next-best experience takes a broader view of the customer relationship over time.
For some, that might be information about a new product, while for others it could be an offer of training on products they already use. This reframes CRMs as a system of orchestration — coordinating personalized interactions for different customer needs.
Veeva has showcased its own Vault CRM AI agents delivering long-awaited capabilities such as plain-language preparation for the next engagement with an HCP. The demonstration at Veeva's November 2025 Commercial Summit keynote in Madrid used conversational prompts such as "Generate an engagement plan" and "Summarize" for an existing engagement plan. Life sciences executives surveyed by Infosys generally agree that AI has improved CRM platforms in their industry, but only one in five said AI had delivered significant improvement (Figure 2).
Figure 2. AI helps CRM platforms
Source: Infosys Knowledge Institute
Data and regulation
CRMs must also meet rising expectations around data governance and regulatory compliance. Today’s CRMs contain a growing volume of highly sensitive data relating to patients, HCPs, insurers, and internal stakeholders.
A key turning point is the EU Artificial Intelligence Act, adopted in 2024, which imposes strict requirements on high-risk AI systems affecting health, safety, or fundamental rights, including systems used in life sciences.
While this legislation is Europe-focused, its influence is global. Life sciences businesses operating in multiple jurisdictions must ensure that AI-driven CRMs meet standards for transparency, explainability, human oversight, and data auditability.
Beyond AI regulation, another area of growing complexity is consent management, for activities such as sending marketing materials and tracking customer engagement with digital content. As CRMs evolve to support multichannel and patient-facing contact, managing consent at the individual level (for example, by channel, content type, and topic) becomes significantly more challenging.
When capturing consent data is spread across multiple systems, the potential for inconsistencies and compliance risk greatly increases. As more pharma businesses adopt digital tools for HCP and patient engagement, the need for real-time, system-wide consent governance has grown sharply.
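Individual-level consent governance of the kind described above can be sketched as a single shared record, keyed by channel and content type, checked before any outreach. The channels, content types, and default-deny policy below are illustrative assumptions.

```python
# Sketch of system-wide consent governance: one shared store, keyed by
# (person, channel, content_type), consulted before every contact.
# Identifiers and categories are hypothetical.

consents = {}  # (person_id, channel, content_type) -> True/False

def record_consent(person_id, channel, content_type, granted):
    """Capture an explicit opt-in or opt-out in the shared store."""
    consents[(person_id, channel, content_type)] = granted

def may_contact(person_id, channel, content_type):
    """Default deny: outreach is allowed only with an explicit opt-in."""
    return consents.get((person_id, channel, content_type), False)

record_consent("hcp_42", "email", "clinical_updates", True)
record_consent("hcp_42", "email", "marketing", False)

print(may_contact("hcp_42", "email", "clinical_updates"))  # opted in
print(may_contact("hcp_42", "sms", "clinical_updates"))    # no record: deny
```

Centralizing the check in one store is the point: when every system consults the same record, the inconsistencies that arise from consent data spread across multiple systems cannot occur.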
But a consent management system alone won't address these challenges. Customer master data management platforms connected with data providers such as IQVIA's OneKey or Veeva OpenData offer a rapidly improving solution. In August 2025, IQVIA and Veeva ended an eight-year legal battle and signed a data-sharing agreement that makes it easier for customers to use software and data together. This allows companies to choose a preferred player in each key market rather than picking sides on a global level. Data strategy and global governance should address data model discrepancies and differing ways of working.
Salesforce encourages integration with specialist providers, such as H1, Definitive Healthcare, and Athena Health, to create a golden record and track consent consistently across systems. Multiple systems support richer customer profiling but introduce new governance challenges. Veeva, by contrast, promotes a vertically integrated model, with built-in tools for consent and data governance as part of its native architecture.
Strategic data architecture
Platform choice also has implications for enterprise data strategy. Some industry experts argue that life sciences companies should evaluate platform options through the lens of data readiness, ensuring their system will enable them to use data as they wish, and not prove a constraint to the way they organize their business in the future.
Salesforce offers flexibility through open application programming interfaces (APIs) and ecosystem integrations. This makes it attractive for companies pursuing an architecture where CRMs are one part of a modular engagement platform. However, this can also introduce complexity in integrating third-party data sources, expanding data reconciliation and validation processes, and increasing governance oversight.
Veeva, on the other hand, offers tighter native integration between CRM, content, data, and analytics. Its OpenData solution provides HCP data directly within the Veeva ecosystem, reducing dependency on third-party sources and offering a more controlled compliance environment.
A chance at transformation
- CRM transformation is a challenge and an opportunity. Life sciences companies must pursue thoughtful and effective next steps to make the most of this required change.
- Treat CRM transformation not as a technical upgrade, but as a strategic reset touching on business strategy, organizational processes, data management, platform governance, and customer experience. That means rethinking engagement architecture around customer needs rather than company structure. A modern CRM strategy should unify functions, including medical, marketing, market access, and field operations through shared data, coordinated workflows, and consistent content governance, enabling engagement across all channels and external touchpoints.
- Define a clear CRM purpose before choosing a platform. What does success look like in business terms? Is it delivering more personalized HCP engagement, improving onboarding and adherence in patient support programs, or enabling real-time orchestration of cross-functional teams? Without strategic clarity, companies risk repeating old patterns and replicating outdated workflows in new technology, squandering the full potential of platform change.
- Assess platforms on more dimensions than functional capabilities and cost. Consider data strategy alignment, integration flexibility, roadmap maturity, and the vendor’s vision.
- CRM ecosystems should be treated as strategic partnerships that define future engagement, rather than standalone products.
- Treat CRM migration as a business transformation program. That means going beyond technical implementation to invest in organizational change management, user training, content governance, and new performance metrics. It may also require rethinking incentive structures and field roles to support more collaborative, insight-driven engagement. Success depends not only on what technology is deployed, but on whether teams are equipped to use it to drive meaningful outcomes, for customers, the business, and the patients.
As the Veeva-Salesforce breakup, followed by the IQVIA-Salesforce partnership, forces life sciences organizations to reassess their technology stacks, it also brings into focus a broader question: What kind of customer relationships do pharmaceutical and biotech companies want to own? Setting up the right performance metrics and program objectives will help define the correct trajectory of the CRM review, select the right vendor, and harvest the commercial upsides of next-generation CRM technologies.
In a market where pipelines are increasingly specialized and value is determined not just by efficacy but also by patient experience, CRM platforms are no longer optional. They are foundational to how life sciences companies engage, adapt, and lead.
How to forge a resilient supply chain
- Amid geopolitical volatility, the resilience, agility, and speed of life sciences supply chains matter as much as cost. Cutting-edge cell and gene therapies are fragile and time-sensitive. They deliver more, and require more from their supply chains.
- Companies must rebalance their operations to reduce risk and bring production closer to customers. Advanced technologies can help offset the additional costs.
- End-to-end visibility is essential. AI and enhanced data access must now be at the core of supply chain management.
Rising geopolitical tensions, from proposed US tariffs to the conflict in Ukraine, along with the physical supply-chain disruptions caused by the Covid-19 pandemic, have made resilient, optimized supply chains even more critical in life sciences. This resilience will help companies mitigate the risks posed by future disruptions and meet new post-pandemic expectations for faster patient access to drugs.
This requires both physical and digital transformation. Tariff risks and pharmaceutical industry policies that prioritize security of supply are already reshaping global production decisions toward localized manufacturing and contract manufacturing agreements.
Meanwhile, AI assumes a growing role in supply-chain forecasting and risk modeling to keep execution tight yet reliable.
AI and machine learning models are helping planners manage uncertainty, optimize inventory, and improve responsiveness. In effect, AI helps planners shift the focus of supply chain management from cost savings to diversification, predictive modeling, and preparing for uncertainty.
Budgets and attention are being recalibrated toward operational resilience as a competitive necessity. One in three life sciences executives surveyed by Infosys said supply chain is the single most significant differentiator for their organization. A total of 71% of respondents called it most significant or major (Figure 1).
Figure 1. Supply chain is a major differentiator
Source: Infosys Knowledge Institute
Take the medical device sector. This now encompasses smart medical implants, robotic surgery tools, scanning machines, and 3D-printed medical devices, as well as drug-eluting stents and emerging drug-digital therapeutic combinations. These require particularly resilient supply arrangements. Globally this market sub-sector is set to be worth $955 billion by 2030, spurring accelerating growth in medical device contract manufacturing. Health authorities are now taking steps to protect patients from shortages in critical equipment supply.
Pressure points
Based on the relative emphasis of recent earnings calls, pharmaceutical industry leaders appear less concerned about the impact of US tariffs than peers in other industries such as automotive. This might be partly because critical medicines enjoy particular protections, while the industry has a relatively diverse spread of activities geographically.
Still, geopolitical changes, economic uncertainty, and the time and money required to establish new operational locations cannot be ignored. Tariffs also have an impact on ingredient sourcing and supplier reliability, adding new costs, complexity, and risk.
Johnson & Johnson has warned about the risk to drug availability posed by tariffs. Other life sciences companies have announced substantial investments in US-based manufacturing to reduce their exposure to tariffs. In Europe, a new Critical Medicines Act is incentivizing manufacturers to invest in EU-based production and resilience.
Research suggests that more than $150 billion globally has been committed to new capital projects through 2030. These costs will need to be offset through operational cost efficiencies and sales growth.
Aside from geopolitics, the expansion of personalized medicine is straining traditional supply chains. In 2025, new drug modalities accounted for $197 billion, or 60% of the total projected pharma pipeline value. This includes cell and gene therapies (CGTs) that are now coming to market at an accelerating pace. The global market for these therapies, which offer treatments for people living with rare or difficult-to-treat conditions, including cancers, blood disorders, and neurological conditions, is projected to reach $108 billion by 2032. Most major pharma companies are making sizeable plays for this sector now, via strategic acquisition, contract partnerships, or investment in their own R&D and manufacturing facilities.
Establishing specialist, compliant, local manufacturing facilities is costly. Cell and gene therapy companies have invested between $80 million and $155 million in new facilities. In addition, human biologics — medicines made from living organisms, such as cells, proteins, or sugars used to treat complex diseases — require special handling considerations.
Individualized treatments require special considerations as well. Delivering batches of single, individualized live-cell-based therapies to a specific patient — at a cost per treatment that can be several million dollars — requires a strictly controlled, time-sensitive, real-time traceable, closed-loop supply chain. The risk from cold-chain failures alone is estimated to be more than $35 billion annually, and that does not account for the human cost if a patient’s time-critical treatment has to be destroyed.
Resilience in operating models
To weather the high costs of resilience in these contexts, life sciences companies must adapt their operating models and invest in smart operational capabilities. Visibility across the manufacturing and supply ecosystem is crucial, allowing greater anticipation of potential issues and the ability to pivot swiftly.
One supporting technology option is the digital control tower: integrated systems and data sources brought together in a cloud-based platform with dashboards; digital twin and scenario simulation capabilities; network-connected internet of things (IoT) sensors to enable real-time tracking; and AI-powered data analytics and process automation.
In addition, advanced supply chain planning suites can empower companies to optimize the cost of resilience. This is achieved by combining demand planning, inventory optimization, and network design with scenario modeling and what-if simulations. This allows companies to model service levels, potential disruptions and costs simultaneously, and make policy choices to maximize profit and minimize risk-adjusted cost.
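The what-if logic described above can be sketched as a small Monte Carlo simulation. All parameters below (daily demand, lead times, disruption odds, delay length) are illustrative assumptions, not figures from any company or planning suite.

```python
import random

# Illustrative what-if simulation: estimate the service level achieved by a
# given safety-stock policy when supply lead times are occasionally disrupted.
# All parameters are assumed values for the sketch.
def simulate_service_level(safety_stock, daily_demand=100, lead_time_days=5,
                           disruption_prob=0.05, extra_delay_days=10,
                           trials=10_000, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducible scenario runs
    served = 0
    for _ in range(trials):
        # A disruption extends the replenishment lead time.
        delayed = rng.random() < disruption_prob
        lead = lead_time_days + (extra_delay_days if delayed else 0)
        # Stock is positioned for a normal lead time, plus the safety buffer.
        available = daily_demand * lead_time_days + safety_stock
        if available >= daily_demand * lead:
            served += 1
    return served / trials
```

Running the same policy at different safety-stock levels lets planners trade the cost of holding inventory against the risk-adjusted cost of a stockout, mirroring the policy choices described above.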
Prescriptive analytics tools can combine forecasts, pricing data, and capacity constraints to guide drugmakers on key decisions such as how much inventory to hold in particular locations and what mix of suppliers to use. This is another way to align resilience needs with financial targets.
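As a simplified illustration of such prescriptive guidance, the sketch below chooses a supplier mix that meets demand at the lowest cost while capping any single supplier's share; the names, costs, and capacities are hypothetical.

```python
# Hypothetical prescriptive-analytics sketch: allocate demand across suppliers
# at minimum cost while capping any one supplier's share to limit concentration
# risk. A greedy fill by unit cost is optimal for this simple constraint set.
def plan_supplier_mix(demand, suppliers, max_share=0.7):
    """suppliers: list of dicts with 'name', 'unit_cost', and 'capacity'."""
    cap = max_share * demand          # resilience policy: no over-reliance
    remaining = demand
    allocation, total_cost = {}, 0.0
    for s in sorted(suppliers, key=lambda s: s["unit_cost"]):
        qty = min(remaining, s["capacity"], cap)
        if qty > 0:
            allocation[s["name"]] = qty
            total_cost += qty * s["unit_cost"]
            remaining -= qty
    if remaining > 1e-9:
        raise ValueError("demand cannot be met under current constraints")
    return allocation, total_cost
```

Note the design choice the share cap encodes: with 100 units of demand and a 70% cap, the cheapest supplier is deliberately held below its full capacity so a second source stays active, which is exactly the resilience-versus-cost trade-off described above.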
Modern supply chain management and enterprise resource planning (ERP) systems add AI-driven forecast and risk-scoring inputs to inform how to adjust plans when disruptions or demand shifts arise. With these systems, life sciences companies can access real-time data, analyze supplier performance, and automate re-planning decisions.
The upshot is a need for highly optimized supply chains that adapt rapidly and operate with near-real-time transparency. Legacy ERP and siloed planning systems don’t support these requirements.
A popular alternative to building new manufacturing facilities is to outsource production. This can enhance speed and agility, improve resilience by spreading risk, and reduce costs. Research predicts that supplier spending on both contract manufacturing and integrated contract development and manufacturing operations relationships will rise.
However, this shift intensifies the need for visibility across the cycle of planning, sourcing, making, and delivering. This increases the importance of having a digital infrastructure capable of supporting dynamic planning and real-time interventions.
Roche, an Infosys client, devised a multilayered measurement program to increase its supply-chain resilience without harming efficiency. Roche will balance distributed production, which offers speed, flexibility, and resilience, with cost and compliance management. Near-shoring is part of the plan, but transparency and close orchestration will be critical to bring the strategy to life.
To maintain visibility as operations extend into contract development and manufacturing operations, Roche is implementing standardized SAP solutions, including tools for exchanging data with its partners. This will facilitate process or stock optimization. It is also harnessing AI wherever possible, such as to automatically reflect status updates, including deviation reports, across systems.
Control towers and digital twins
The potential of digital control towers to facilitate real-time monitoring and zero-touch planning has been widely highlighted. These towers aren’t just dashboards, but cloud-based nerve centers. They ingest data from ERP systems, manufacturing execution systems, logistics, IoT sensors, laboratory information management systems, quality management systems, and partner operations, and make it easier to visualize what could happen. Digital control towers can also simulate scenarios and incorporate AI agents into supply chain planning.
AI agents can play predictive, prescriptive, and executive roles in supply chain management. Predictive agents analyze historical data, seasonal patterns, and other seemingly disconnected information to predict demand spikes. Prescriptive agents monitor internal operations and recommend stock reallocation or expedited shipments to optimize inventory. Finally, executive agents act autonomously to address known scenarios, such as a delayed supply shipment.
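The three roles can be sketched as simple rule-based functions over shared supply chain data. The thresholds, site names, and event shapes below are illustrative assumptions, not any vendor's implementation.

```python
# Illustrative sketches of the three agent roles; thresholds are assumed.

def predictive_agent(demand_history, spike_factor=1.5):
    """Predictive: flag a likely demand spike when recent demand
    outpaces the longer-run baseline."""
    baseline = sum(demand_history[:-3]) / max(len(demand_history) - 3, 1)
    recent = sum(demand_history[-3:]) / 3
    return recent > spike_factor * baseline

def prescriptive_agent(stock, demand):
    """Prescriptive: recommend stock moves from surplus sites to deficit
    sites, as (donor, recipient, quantity) tuples."""
    surplus = {site: stock[site] - demand[site] for site in stock}
    moves = []
    for short, gap in ((s, -v) for s, v in surplus.items() if v < 0):
        for donor in (s for s, v in surplus.items() if v > 0):
            qty = min(gap, surplus[donor])
            if qty > 0:
                moves.append((donor, short, qty))
                surplus[donor] -= qty
                gap -= qty
    return moves

def executive_agent(event):
    """Executive: act autonomously on a known scenario (shipment delay)."""
    if event["type"] == "shipment_delay" and event["days_late"] > 2:
        return {"action": "expedite_backup_shipment", "order": event["order"]}
    return {"action": "monitor", "order": event["order"]}
```

In a digital control tower, such agents would run continuously over the ingested ERP, logistics, and sensor feeds, escalating from prediction to recommendation to autonomous action as confidence and policy allow.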
Digital control towers provide visibility and real-time data for life sciences supply chains, making it easier to coordinate, resolve issues proactively, and improve efficiency. Their role is to centralize information to monitor shipments, manage inventory, and enhance resilience against disruptions.
This should ultimately improve patient care by ensuring products arrive intact and on time. Potential benefits include reduced delays and waste, enhanced regulatory compliance, improved risk management, and a more agile and responsive supply chain.
During Covid-19, Pfizer relied on its advanced supply-chain control towers to enhance its response capabilities for manufacturing and supplying vaccines. They enabled it to continuously monitor the flow of materials and products in real time, meet regulatory requirements, and ensure the integrity of medical supplies.
Bristol Myers Squibb uses digital-twin technology to identify potential risks in its pharma supply chain, and balance the dual priorities of production capacity and the ability to deliver, ensuring delivery to patients.
As companies start to look beyond resilience toward antifragile supply chains — involving more adaptive, probabilistic planning — they will also need to connect more seamlessly with their suppliers — especially if they are relying increasingly on contract manufacturing and contract development and manufacturing operations.
This is particularly true for biologics. Biologics are more susceptible to supply-chain issues than small-molecule drugs due to their inherent complexity, fragile nature, and specialized handling requirements. When partners are contracted to perform the manufacturing, the line of sight across the ecosystem could be obscured, yet in these cases it is even more critical to manage resources and ensure supply resilience and speed.
Roche is harnessing both digital control-room capabilities and digital-twin simulations as part of its ongoing supply-chain optimization. At Roche Diagnostics, the technologies are helping to coordinate medical device installation for large diagnostic laboratory equipment, for example, scanning machines in hospitals. This equipment needs to be assembled and tested in situ, requiring careful coordination by suppliers and engineers.
Using a digital twin to simulate assembly, implementation, and testing in advance has cut installation time from 25 days to between 10 and 12 days.
Simulating entire pharma supply chains to optimize planning from raw material procurement to final product distribution can help life sciences organizations anticipate disruptions, optimize inventory levels, and improve logistics coordination.
Tech interventions
Nine in 10 life sciences executives are extremely confident in their ability to achieve supply chain resilience (Figure 2).
Figure 2. Life sciences believe supply chains are resilient
Source: Infosys Knowledge Institute
Given that confidence and so many technological options, focus is important. Targeted opportunities include:
Cold chain and IoT
More than 30% of biopharma products require cold-chain logistics because even minor temperature changes can cause losses.
The right algorithms monitor pressure, vibration, sunlight exposure, temperature, and time out of refrigeration, and ensure this combination stays within the limits required to keep sensitive products in optimal condition.
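A minimal version of such a monitoring check might look like the sketch below. The temperature limits and time allowance are hypothetical, loosely modeled on ultra-cold shipping requirements, not any product's actual specification.

```python
# Hypothetical cold-chain excursion check for an ultra-cold shipment.
# Limits and allowances are illustrative, not a real product specification.
def check_excursions(readings, low=-80.0, high=-60.0, max_minutes_out=30):
    """readings: (minute, temperature_c) samples from an IoT data logger,
    assumed to arrive at one sample per minute."""
    alerts = []
    minutes_out = 0
    for minute, temp in readings:
        if not (low <= temp <= high):
            alerts.append(("excursion", minute, temp))
        if temp > high:
            minutes_out += 1  # cumulative time above the safe range
    if minutes_out > max_minutes_out:
        alerts.append(("time_out_of_range", minutes_out))
    return alerts
```

In practice such checks run continuously in a control tower, combining the temperature channel shown here with pressure, vibration, and light-exposure channels to trigger intervention before a batch is lost.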
Sensor-enabled cold-chain management solutions have cut spoilage and product-loss costs by around 20% in pilot deployments. Such savings are substantial given the tens of billions of dollars in annual losses associated with cold-chain failures; when lives are at stake, the implications go far beyond cost.
For Gilead’s Kite Pharma, innovation in the cold chain for its CAR-T therapy starts with tight asset management – the ability to track every item all the way across the cycle.
Pfizer, whose Covid-19 vaccine needed to be kept at minus 70°C, relied on IoT-enabled barcodes, integrated with existing logistics and supply-chain management systems, to monitor environmental conditions during the pandemic.
Moderna, meanwhile, digitally tracks mRNA vaccine shipments across more than 50 countries, claiming 98% on-time delivery even during global logistics disruption. During the pandemic, its AI-powered supply chain, fed by real-time logistics data to hone forecasting, proved a lifeline.
Quicker onboarding for partners
As life sciences companies rebalance their operating models, AI helps hone the selection, vetting, and engagement of contract development and manufacturing operations to reduce costly delays and compliance failures.
Adaptable frameworks can help bring new contract delivery partners online more quickly. Using an SAP-based contract manufacturing framework and templates has cut setup times by 20% to 30%.
When Moderna established a long-term manufacturing collaboration with contract delivery partner Lonza for mRNA vaccine manufacturing early in the pandemic, rapid set-up was crucial, and involved transfer of electronic batch records and equipment automation between the two companies to accelerate scale-up. Lonza produced the first commercial batch of the Moderna vaccine in November 2020, five months after the partnership had been announced.
Faster regulatory compliance
Regulatory requirements can also threaten resilience. Generative AI regulatory compliance agents like those from Infosys have compressed compliance cycles from nine to 12 months to just two or three weeks, accelerating the process of bringing new operations or suppliers online.
Patients at the center
For companies to deliver the patient-centricity they have promised — and to continue to operate commercially viable businesses — they must accelerate the overhaul of their supply chains.
The closed-loop CGT supply chain implemented by Novartis is one such example, placing strong emphasis on the patient experience. As part of the closed loop, patient comfort is reinforced through mobile notifications from care providers, further emphasizing the shift from product-oriented to patient-oriented life cycle management.
Greener logistics
A more efficient and flexible supply chain means less wasted energy and fewer miles traveled, which reduces a company’s carbon footprint. Tech-enhanced visibility into this system will enable life sciences businesses to get ahead of Scope 3 greenhouse gas protocol standards, which aim to capture the carbon footprint of a company’s entire upstream and downstream value chain. Resilience, accountability and efficiency are all interconnected.
Supply chain resilience and speed
To realize these capabilities, life sciences companies should:
- Rethink their network model: Map exposure scenarios and model alternative setups. AI-powered analytics and digital simulations can help.
- Invest in the digital core: Unify siloed systems via modern cloud platforms and make quality data available for AI decision-making.
- Operationalize visibility: Implement a digital control tower with live monitoring via IoT, AI-driven analytics, proactive risk management, and real-time collaboration support.
- Pilot, then scale: Start with measurable use cases for advanced technology deployment, and scale successful models. Look for standardized, market-ready capabilities that provide 80% of the core functionality and can be finessed for any particular requirements.
- Partner for end-to-end transformation: Extend essential interdisciplinary collaboration across manufacturing and supply-chain operations.
What’s next in supply chain
Digital twins are expected to be among the biggest levers for improving resilience in life sciences over the next five years, driven by increasing AI orchestration, live end-to-end visibility, greater cold-chain automation, decentralized manufacturing, advanced analytics and adoption of AI agents in supply chain management. Many organizations are already piloting priority technologies today and expect to scale them up in the next few years.
To avoid falling behind and hone resilience and responsiveness through geopolitical uncertainty, companies must keep advancing and innovating their global supply-chain strategies. Too much is at stake in this shift from efficient “just in time” to resilient “just in case.”
Case study: A critical personal supply chain
Personalized medicine crystallizes many of the challenges that modern life sciences supply chain solutions need to address. Patients undergoing cell and gene therapies have generally exhausted other treatment avenues, so time could be running out for them. Delivery delays would be critical for the patient and cost the company if manufactured products are spoiled and have to be recreated.
Infosys client Novartis has deployed a rapid manufacturing platform to optimize the delivery of its CAR-T therapy, Kymriah. Novartis redesigned its closed-loop, patient-specific manufacturing supply chain, reducing the cycle time from 28 days to 20. Those eight days saved can be the difference between life and death for a patient.
The software solution is a cell and gene therapy orchestration (CGTO) platform that connects every stage of the patient life cycle: cell collection at a clinic; monitored cold-chain transport to the manufacturing site; genetic modification; quality release; and reinfusion back to the patient. IoT sensors log temperature and location throughout the supply chain, while digital records capture auditable evidence for regulators.
Novartis has similarly modernized its global supply chain for radioligand therapy (RLT). It established end-to-end visibility, automated coordination, and data-driven decision-making across the RLT value chain. This overcomes silos from Novartis’s multiple biotech acquisitions by using cloud integration and intelligent data management. Novartis has reported a 40% reduction in manufacturing cycle time, 25% lower operational costs, and faster delivery of precision cancer therapies.
IT makes mergers pay off
- Mergers and acquisitions (M&A) in life sciences are driven by a focus on acquiring capabilities in higher-margin areas such as precision medicine, as patents on blockbuster drugs expire.
- The rising importance of digital technology in the industry makes value creation from mergers and spin-offs increasingly reliant on effective post-deal IT integration.
- Companies need a repeatable playbook that covers IT integration, transition service agreements, and organizational change management.
M&A is a strategic lever in the life sciences industry. The M&A market has picked up pace, with 454 deals in the first three quarters of 2025 versus 324 during the same period last year, according to DealForma data.
Global deal value has surged to around $159 billion in the first three quarters of 2025, driven by a big increase in transaction size in the third quarter.
In most cases, companies are looking to make smaller acquisitions that expand their therapeutic pipeline in high-value areas such as precision medicine, as the industry goes through a concentrated period of patent expiries between now and the end of the decade. A recent deal of this kind was GSK’s acquisition of IDRx, a Boston-based biopharma company developing precision medicines for gastrointestinal cancers, for about $1.15 billion.
Life sciences companies are constantly evaluating which capabilities they need to scale up or acquire, and which therapeutic areas in their portfolio no longer match their strategic goals or margin ambitions. This requires a playbook for integrating new businesses and spinning off older ones, particularly as the technical complexity of post-merger integration work has increased with digitalization. The largest opportunity for M&A in life sciences is in IT, according to Infosys Consulting. Life sciences executives surveyed by Infosys express confidence in their integration capabilities. Some 88% say their companies are good or very good at integrating acquisitions (Figure 1).
Figure 1. Merger integration is a strength
Source: Infosys Knowledge Institute
The complexity of merging or untangling data platforms, enterprise resource planning (ERP) software, and other essential business processes risks diluting the promised value of the deal and has even caused some transactions to be shelved.
What does a successful playbook for IT M&A look like? It should enable life sciences companies to minimize disruption by protecting key business processes and data, execute a structured end-to-end IT integration or divestiture program, and maximize synergies while controlling costs.
The desired level of IT integration is a key consideration, shaped by strategic intentions: how long the acquirer intends to keep the acquired business, how it will manage transition service agreements (TSAs), and how it wants to handle organizational change.
M&A drivers and trends
Between 2022 and 2030, 190 drugs, including 69 blockbusters, will lose exclusivity. To make up for hundreds of billions in lost revenue after patents expire, life sciences companies are taking a focused and proactive approach to M&A, looking to acquire artificial intelligence capabilities (and relevant data), and expand in higher-margin areas such as niche therapeutic areas and personalized medicines, rather than simply getting bigger.
Recent examples include Novartis, which announced in October that it would buy Avidity Biosciences, a precision medicine company developing RNA treatments for rare neuromuscular diseases, for $12 billion. That will be its biggest acquisition in many years.
In the same month, Merck completed its $10 billion acquisition of respiratory disease-focused biotech company Verona Pharma, strengthening its pipeline of cardiopulmonary treatments.
Spin-offs grow in prominence
Spin-offs and demergers have also been an important feature of M&A in recent years, as life sciences companies shift their strategy to refocus their biopharma businesses and build portfolios around specific treatment areas. This reverses the expansion trend of the previous two decades. Demerger examples include GSK and Johnson & Johnson. Both spun off their consumer health businesses into separate publicly traded companies, creating Haleon and Kenvue respectively (Kenvue agreed to a $40 billion takeover offer from Kimberly-Clark in November).
Johnson & Johnson’s shares have climbed by around 30% this year and in October the company raised its full-year sales forecast. GSK’s shares have not enjoyed a similar lift, but the company earlier this year increased its sales forecast for 2031 to £40 billion ($52.4 billion), promising that 15 large pipeline opportunities will launch by 2031.
As part of the same trend to focus on higher-margin businesses, Pfizer and Novartis spun off their generic drug businesses in 2020 and 2023 respectively. In the past these would likely have been acquired, but the appetite to buy generic drug-makers has shrunk, as prices for generics come under pressure, and typical buyers for generics have focused on building their own capabilities.
The manufacturing motivation
Another M&A driver in the near term will be the need for life sciences companies to expand their manufacturing capacity in the US. The introduction of tariffs on imported drugs this year is spurring decisions on whether to build or buy factories in the country. One example is the $6.7 billion merger of Mallinckrodt and Endo earlier this year, creating a combined company with a large US manufacturing capacity.
The growing importance of data and the application of AI are also creating different types of deals and partnerships. For example, pharmaceutical company Lilly and chipmaker Nvidia announced in October they would build what they described as the most powerful supercomputer in the pharmaceuticals industry, to accelerate AI-enabled drug discovery. Lilly has also launched TuneLab, a platform that shares AI models trained on years of the company’s data with selected smaller biotech companies, while Novartis and Merck also have data-sharing initiatives with startups.
For now, no new medicines discovered or designed entirely by AI have reached the market, but among the furthest along is rentosertib, a drug for the lung condition idiopathic pulmonary fibrosis, trialed in a Phase 2 study with results published in Nature Medicine in June. Generative AI was used to find both a possible cause of the disease and the therapeutic compound.
IT unlocks deal value
IT now plays a decisive part in unlocking the value of a deal, in mitigating risks, and in ensuring business continuity.
Getting this wrong can be costly. A notable example in a similarly highly regulated industry followed the acquisition of the British retail bank TSB by the Spanish banking group Sabadell.
In 2018, when the merged company tried moving 1.3 billion customer records from an older system belonging to the former parent company (Lloyds) onto Sabadell’s software, almost two million online banking customers were locked out of their accounts, some for weeks, bill payments failed, and customers were mistakenly given access to the confidential records of others. The issues took eight months to resolve fully, and the bank said it incurred £330 million in costs. It was also fined almost £49 million by regulators, the Financial Conduct Authority and the Prudential Regulation Authority.
To avoid such regulatory, reputational, and financial risk, and realize the full potential of a transaction, it is vital to involve IT teams as soon as a deal is agreed — changes take an average of six to 18 months, based on Infosys research and experience. The major challenges during integration or separation are:
- Minimizing disruption by protecting key business processes and data
Protecting data integrity and ensuring regulatory compliance are essential, because clinical trial data, pharmacovigilance systems, and regulatory submissions are mission-critical and subject to strict regulatory oversight, including by the US Food and Drug Administration and the European Medicines Agency. IT integration or separation must preserve full compliance to avoid costly delays or penalties.
Intellectual property (IP) and R&D systems, including laboratory information management systems (LIMS), electronic lab notebooks (ELNs), and proprietary AI and machine learning (ML) models for drug discovery, must be protected, migrated, and integrated without loss of function or data.
Finally, M&A events can make companies more vulnerable to cyberthreats. IT acts as the first line of defense to safeguard patient data, trade secrets, and manufacturing controls. - Executing a structured end-to-end IT integration or divestiture program
The life cycle of IT integration or separation starts with an assessment of existing IT systems in both companies to identify risks, challenges, and dependencies between systems and processes. It then moves into design, planning, and execution, accounting for factors such as the need to migrate applications together because of dependencies. A well-structured post-merger integration program spans the entire enterprise (Figure 2).
- Maximizing the realization of synergies while controlling costs
While mitigating IT risks and ensuring business continuity are essential to successful integration or separation, value is created by quickly integrating enterprise resource planning (ERP) software, which manages core company business processes; customer relationship management (CRM) software; and supply-chain platforms. This allows owners to achieve operational synergies more quickly, especially in global distribution and manufacturing.
Digital transformation can also multiply the value of a deal. Strategic use of AI can optimize portfolios, undertake predictive maintenance in manufacturing, and enable automation and advanced market analytics for the combined or separated entity. This is an opportunity for improvement in the industry. Half of life sciences executives surveyed by Infosys say AI has had no impact on mergers and acquisitions activity (Figure 3).
Figure 2. Post-merger integration phases
Source: Infosys Knowledge Institute
Figure 3. Half say AI has no impact on M&A
Source: Infosys Knowledge Institute
Where to create post-deal value
For life sciences companies, M&A is an ongoing process. The portfolio of treatments and capabilities is continually reviewed to ensure the company has the right balance to support its strategic goals as the market evolves. To maximize the chances that the integration and separation of businesses create value, four focus areas are particularly important for an IT integration playbook:
- Align IT integration to strategy: Acquirers can pursue different levels of IT integration post-merger, from full assimilation to keeping the new company as a standalone entity. This decision should be driven by strategy. For example, do leaders expect to own the business for a limited number of years or for the long term? Putting in place defined processes for different levels of integration means that companies can turn to the playbook time and again when M&A opportunities arise.
- Accelerate the transition service agreement (TSA) exit: In spin-off or demerger situations, accelerating the exit from the TSA reduces costs. Achieving this requires a detailed plan and constant tracking (see the Biocon case study at the end of this chapter).
- Engage experts for integration management: The relationship between an acquired organization and its buyer can be turbulent. Enlisting a neutral expert to manage and advise the integration management office (IMO) ensures that all applications, data, HR functions, and processes are properly merged into the parent organization. This has long been true, and it is growing more critical as the infrastructure, applications, and data workstreams add more robust capabilities and grow more complex.
- Prioritize strong organizational change management: Change management is essential for keeping key IT talent in the new business, either post-merger or post-spin-off. The IT landscape of a newly merged or separated company will inevitably be messy, with too many technologies and people working in the same roles but using different tools. People know some degree of rationalization will happen, so establishing clear communication and the right incentives is essential to dispel uncertainty and make sure the right talent stays with the company. Consolidating teams is also important to make sure the old and newly acquired companies are pulling together.
Effective IT integration or separation is central to transaction success, and this importance is only likely to grow as life sciences companies embrace AI, automation, and data analytics in every part of their business. Companies that define their post-merger integration processes in a playbook they can deploy each time a deal is struck increase the opportunity for IT M&A to create value, achieve progress, and contain risks.
Case study: Biocon’s accelerated integration
A recent example of realizing deal value through IT integration is Biocon Biologics, which in 2022 bought the biosimilar medicines business of Viatris, the US-based pharma company created from the merger of Pfizer’s Upjohn generic drugs division with Mylan. As a result of the $3.3 billion deal, India-based Biocon has one of the broadest portfolios of biosimilars in the life sciences industry and gained a direct presence in the US, Europe, Canada, Japan, Australia, and New Zealand.
Biocon’s priorities for the post-merger integration were to achieve a successful Day One as a merged company, accelerate the exit from the two-year TSA that was part of the deal, and lay the foundations of the future digital enterprise.
The challenges ranged from the everyday and practical — planning for how the Viatris team would switch over their company phones and laptops, for example — to strategic regulatory risk. Biocon’s lean in-house IT team had no expertise in the new markets where the enlarged company would be operating, such as the US and Europe, and so had limited knowledge of the relevant regulations and compliance laws.
During a rigorous integration planning exercise before and after the deal closed in November 2022, Biocon defined its Day One, Day 100, and target state technology blueprints. It identified more than 100 IT work packages and defined its technology vendor selection process. It then set up a project management office for IT integration, rolled out an IT governance structure, and defined its exit charter from the TSA to track data migration, verification, and acknowledgment.
Over the course of 2023, the company defined cutover strategies with hourly action plans for switching from the old company’s systems to the newly merged ones, put in place a 24-hour command center for a smooth transition over 12 days, and tracked and logged each element of the integration project in real time. Templates for different countries were executed in parallel to ensure the TSA exit stayed on schedule.
The result? Biocon concluded its TSA after one year rather than two, saving time and money. It transferred more than 20 terabytes of data through cloud platforms and from data centers through physical drives. In the meantime, it implemented 18 greenfield software systems and integrated nine third-party logistics providers across 15 countries.
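A TSA exit charter of the kind Biocon used tracks data migration, verification, and acknowledgment. As a minimal sketch of the verification step only (the manifest format and file names are illustrative assumptions, not details of the Biocon program), migrated files can be checked against source-side checksums:

```python
import hashlib
from pathlib import Path


def file_sha256(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_migration(source_manifest: dict[str, str], target_dir: Path) -> list[str]:
    """Compare a source-side manifest of {relative_path: sha256} against
    files landed in target_dir; return paths that are missing or corrupted."""
    failures = []
    for rel_path, expected in source_manifest.items():
        target = target_dir / rel_path
        if not target.exists() or file_sha256(target) != expected:
            failures.append(rel_path)
    return failures
```

In practice, the manifest would be generated on the source system before transfer, and an empty failure list would feed the acknowledgment step of the exit charter.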
The shift closer to patients
- The shift to patient-centered healthcare is changing the operating models and actual structures of life sciences companies.
- Personalization demands a new approach to drug development, marketing, and relationship management from life sciences companies.
- Data management and privacy are emerging as key issues.
Patients have been looking up symptoms on Google for decades.
The Covid-19 pandemic turbo-charged the readiness of individuals to turn to technology to find out more about their health and manage it more proactively. That shift, together with the increasing availability of low-cost, direct-to-consumer medical product channels, means that consumers now have access to pharmaceutical and healthcare information that goes well beyond what they can find on Google. They can also use online channels such as the Apple Health monitoring suite, diagnostic tools such as Cardio AI’s cardiovascular health platform, and Teladoc’s virtual consultation services to keep track of their health indices and explore therapeutic options. In response to this growing trend, life sciences organizations must reshape their business models to focus on the consumer. They must become patient-centered, with growing emphasis on personalization in drug development and therapeutic delivery.
Life sciences executives are aware of the importance of getting closer to consumers. Half of leaders surveyed by Infosys call patient-centricity a major differentiator. But it is rarely an executive’s top priority. Just under one in five deemed it their most significant differentiator (Figure 1).
Figure 1. Patient-centricity is a major differentiator
Source: Infosys Knowledge Institute
The patient-centered mindset
Where patients lead, life sciences companies must follow. This shift demands a mindset change. It requires and enables new approaches to R&D, marketing, and data collection and management, as well as greater collaboration with data platform providers. For example, AstraZeneca and Haleon both have standing invitations for external innovation and collaboration. The goal is to make stronger connections among researchers and product developers, medical professionals, and patients and consumers. Medical tech company Medtronic and Microsoft have launched a collaboration to use artificial intelligence (AI) to manage patient care through remotely monitored medical devices, ranging from consumer wearables to prescribed therapeutic devices and implants. The Medtronic partnership aims to create a unified ecosystem that is home to actionable data and better solutions for customers.
AstraZeneca and Medtronic aren’t the only ones, though. The consumer-focused approach to a business that has traditionally relied on business-to-business sales and relationships in a highly regulated professional setting is apparent across the sector. Since August 2024, Lilly, Novo Nordisk, and Pfizer have launched direct-to-patient purchase portals, with the Lilly and Novo Nordisk portals (LillyDirect and NovoCare) becoming a battleground for the companies’ competing GLP-1 weight-loss drugs. PfizerForAll is an information- and advice-focused portal, but it also allows patients to access prescription drugs, as well as over-the-counter medications and diagnostic tests.
New business models
The introduction of direct-to-consumer sales for both over-the-counter and prescription medications is only the most visible aspect of deep structural changes underway across the industry. Patient-centricity is also prompting life sciences companies to adopt new business models. In some cases, this means creating a dual model that separates pharmaceuticals and medical devices for patients from consumer health products for consumers.
In 2022, life sciences giant GSK demerged its consumer healthcare division to create Haleon as a new listed business. In the same year, Johnson & Johnson also spun off its consumer healthcare business to create standalone company Kenvue, which has recently moved closer to a pure consumer goods model through its planned acquisition by Kimberly-Clark, best known for its tissues, diapers, and other personal care products.
In 2018, pharmaceutical company Merck sold its remaining consumer healthcare businesses to consumer industry leader Procter & Gamble, having already disposed of another part of its consumer products portfolio to Bayer in 2014.
This change of model has implications for R&D life cycles, risk, compliance, and product validation. Being run separately from pharmaceutical businesses gives these companies a freer hand to develop products and services on a fast-moving consumer goods model.
A new kind of relationship
In addition to a new relationship with consumers, life sciences companies have evolved their relationship with their traditional customers, the healthcare providers (HCPs). Whether the customer is an individual physician, a hospital, or a larger healthcare business, the traditional model of face-to-face sales visits and relationship-based support is giving way to an omnichannel model, where sales and marketing efforts shift to online knowledge sources and direct sales portals.
According to research published in the journal Mayo Clinic Proceedings: Digital Health, omnichannel strategy in healthcare services improves patient engagement, satisfaction, and access to care, especially for underserved populations. Other advantages include cost savings, improved patient satisfaction, and better adherence to treatment plans.
A critical factor in adherence — the extent to which a patient’s behavior corresponds to established healthcare goals — is the use of AI-enabled digital support channels, for example a wearable device or mobile app that monitors patient behavior and issues alerts when adherence dips. Some adherence tracking tools also feature educational resources that help patients understand the importance of their medications and the potential consequences of nonadherence, reinforcing their commitment to their treatment plans.
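As an illustration of the alerting logic behind such a tool (the 14-day window and 80% threshold are invented parameters, not any vendor’s algorithm), the core dip-detection step can be sketched as:

```python
from collections import deque


class AdherenceMonitor:
    """Track recorded doses over a sliding window of days and flag
    when the adherence rate dips below a threshold.

    The window length and threshold here are illustrative assumptions.
    """

    def __init__(self, window_days: int = 14, threshold: float = 0.8):
        self.window = deque(maxlen=window_days)  # 1 = dose taken, 0 = missed
        self.threshold = threshold

    def record_day(self, dose_taken: bool) -> bool:
        """Log one day's dose event; return True if an alert should fire."""
        self.window.append(1 if dose_taken else 0)
        rate = sum(self.window) / len(self.window)
        # Only alert once a full window has accumulated, so the rate is meaningful
        return len(self.window) == self.window.maxlen and rate < self.threshold
```

In a real product, the alert would route to the patient’s app and, with consent, to the prescribing HCP, closing the feedback loop described below.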
This can strengthen the feedback loop with HCPs. Such approaches can improve the evidence base for treatment outcomes for chronic diseases, as well as improve and standardize progress monitoring.
To support this shift to a more patient-centered model, life sciences companies are increasingly turning to specialized pharmaceutical commercialization consultants and enablers. For example, pharmaceutical commercialization services provider Eversana specializes in helping patients better understand their therapeutic regimen. Many patients drop out of therapy midway, especially in the case of rare diseases and cancers, because they can no longer afford it. Eversana helps patients understand the cost factors involved in a therapy beforehand, so they can arrange their finances to see the treatment course through or, where possible, enroll in patient assistance programs, mostly run by the life sciences companies themselves. Those companies stand to benefit, as improved adherence means better patient outcomes and greater market success for their products.
Another provider, Medistrava, supports life sciences companies in engaging with patient communities and implementing clinical trials. Haleon and others offer portals for healthcare professionals to learn more about diseases and products so they can serve their patients and consumers better.
The rise of personalization
Pharmaceutical companies, medical-device manufacturers, and consumer healthcare companies are evaluating ways to personalize the delivery of products and healthcare as expectations and treatments evolve.
One factor is the development of pharmaceutical technologies such as personalized medicine, nanotechnology, and novel drug delivery methods. These allow medical treatments to be tailored to individuals according to their genetic makeup, lifestyle, and environmental factors.
For example, some people with lung cancer have a change or mutation in the gene for the epidermal growth factor receptor (EGFR), a protein found on the surface of cells, and can benefit from drugs such as gefitinib, an EGFR inhibitor. Other lung cancer patients, who have a change in a gene called anaplastic lymphoma kinase, could potentially benefit from another drug, crizotinib.
This personalization of drugs based on genetics is driven by breakthroughs in genomics and the ability of analytics techniques to sift and evaluate the vast amounts of data generated by genomic research and clinical trials.
For all the advantages of this shift toward a more patient-centered model, personalization requires a reworking of logistics, supply chains, and the management of direct-to-patient therapeutic services. It is widely recognized that some patient outcomes can be improved by personalized therapies, created by adjusting drug properties according to individual genomic profiles to allow targeted drug delivery to specific tissues or cells. However, the logistical challenges associated with these innovations are considerable.
A paper in the International Journal of Multidisciplinary Research argues that technology can help meet these challenges when companies concentrate on improving supply-chain efficiency and demand forecasting through AI-supported predictive analytics. But success also requires more decentralized manufacturing and better collaboration between pharmaceutical companies, logistics providers, and HCPs. For more on the supply chain retooling required by personalized medicine, see Chapter 4 in this journal.
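One common, basic form of the AI-supported demand forecasting the paper points to is exponential smoothing of historical demand. As a minimal sketch (the demand figures in the usage note are invented for illustration):

```python
def exponential_smoothing_forecast(demand: list[float], alpha: float = 0.3) -> float:
    """Forecast next-period demand as an exponentially weighted average of
    past observations. alpha controls how quickly old data is discounted:
    higher alpha reacts faster to recent demand shifts."""
    if not demand:
        raise ValueError("need at least one observation")
    level = demand[0]
    for observation in demand[1:]:
        # Blend the newest observation with the running smoothed level
        level = alpha * observation + (1 - alpha) * level
    return level
```

For example, four months of unit demand `[100, 120, 110, 130]` yields a next-month forecast of about 114 units. Production systems layer seasonality, promotions, and external signals on top of this kind of baseline, but the smoothing step above is the core idea.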
Data, AI, and your data
Patient-centricity and personalization are changing the way life sciences companies use data and AI applications. By its nature, personalization demands personal data, and increasingly this data will be captured by mobile apps and home devices operated by patients. At present, the majority of life sciences executives say AI has slightly improved patient-centricity in the industry (Figure 2).
Figure 2. IoT, connected devices top investment priority
Source: Infosys Knowledge Institute
Some companies are using AI’s analytical capabilities to derive new treatment programs from existing databases — one example is the work of Johnson & Johnson Innovative Medicine in Japan on hypertension. The company used data from a Japanese insurance claims database to evaluate the relative performance of hypertension drugs already in the market.
The company then partnered with patient-analytics data specialist Prospection, which uses a proprietary AI application to analyze longitudinal patient journeys — a patient’s interaction with the healthcare system over time. This data helped the pharmaceutical company differentiate its products and collaborate more effectively with HCPs.
Other companies are also working with technology providers to capture data from consumer wearables. The US startup Rune Labs, for example, is collaborating with Apple to improve precision care for Parkinson’s disease. It does this by using its data platform StrivePD to combine patient-reported outcomes and device-level data from the Apple Watch to record and predict the trajectory of a disease, turning passive data into clinical signals.
Elsewhere, healthcare technology company MC10 has partnered with pharma company AbbVie to use wearable technology to investigate outcomes in multiple sclerosis, while Novo Nordisk teamed with digital health specialist Glooko to create a personalized tool for diabetes monitoring, allowing patients to track their blood sugar and receive personal recommendations for diabetes management.
However, companies need to be aware of the limitations of consumer-grade wearables: A recent academic paper published by PLOS Digital Health identifies many issues that must be addressed for the benefits of personalized medicine to be realized in full, ranging from data quality and poor system interoperability to varying digital literacy among patients and unequal access to wearables.
A shift in mindset
The increasing interest from patients and consumers in their care and wellness journeys is a societal change. All life sciences and consumer health companies will have to confront this challenge, which requires a shift in mindset as much as changing the technology or the business model. In particular:
- Companies need to prepare direct-to-patient/consumer strategies in light of margin pressure from growth in this area, insurance coverage issues, and potential regulatory limits to this model.
- Patient-centricity requires additional investments and prioritization of resources across the product life cycle.
- The shift toward personalization creates challenges in logistics, patient communication, and management of direct-to-patient therapeutic services, as well as in relationships with large HCP organizations. Companies need to meet these challenges to differentiate themselves in the market and achieve competitive advantage in an increasingly patient-centric environment.
- The shift from conventional to precision medicine creates specific data challenges: Companies will need to address issues of siloed data, regulatory compliance, interoperability, trust, and data ownership.
To succeed and thrive in a rapidly changing environment, life sciences and consumer health companies will have to re-wire their capabilities through continuous transformation.
Case study: The cost of revolutionary devices
Personalized healthcare remains an emerging discipline. Advances in biosensing, wearable devices, and related data analytics promise extraordinary possibilities for life sciences companies.
Yet the costs of research and development in fields where many technological hurdles remain, and the complexities of meeting regulatory, data management, and privacy challenges, are formidable.
A team at Infosys subsidiary Kaleidoscope Innovation has been exploring the feasibility of a wearable device that can collect and monitor health biomarkers in cancer patients’ sweat or blood to identify any early warning signs that treatment needs to be adjusted.
The team looked at how a wearable device that measures biomarkers called aptamers and cytokines and uses that data to monitor a patient’s response to cancer treatment could be built, approved, and marketed. It does not yet exist — the closest researchers have got to creating such a wearable is a 25kg unit, more than 10 times the weight it would need to be. But the Kaleidoscope team concluded that in principle such a device could be developed, revolutionizing cancer care in the process.
That would leave many operational hurdles to adoption, though. If the device uses an app on a patient’s phone, how is the data managed from that point? How does a doctor treating hundreds of patients manage the volume of incoming data? What are the implications for HIPAA, the US federal law that governs privacy and security for patient health? As Kaleidoscope Innovation points out, it is not enough to simply create new products. Each product is likely to need a new ecosystem.
The team concluded that the development costs for bringing this device to life would be at least $1 billion. That is not an unreasonable budget for what could be a transformative medical breakthrough, but it is also a reminder that pushing the boundaries in personalized medicine remains costly, risky, and fraught with uncertainty.
