Peptide-based therapeutics are emerging as a rapidly expanding and highly promising class of anti-cancer agents, owing to their inherent high target specificity, relatively low toxicity compared to conventional chemotherapeutics, and their versatile design potential. These characteristics make them particularly attractive candidates in the search for novel cancer treatments.
Recent advances in the fields of bioinformatics, computational biology, and structural biology have revolutionized the strategies employed to identify, model, and screen these peptides. These technologies enable high-throughput, data-driven approaches to the discovery and optimization of peptide drugs, vastly accelerating the traditional drug development process.
This article proposes a comprehensive computational pipeline designed to facilitate the identification and rational design of anticancer peptides derived from natural toxins—potent molecules that have evolved to interact precisely with biological targets. By leveraging detailed structural and molecular modeling, the pipeline focuses on elucidating the interactions between these candidate peptides and cancer-specific receptors at the atomic level. This approach not only highlights their potential therapeutic value but also enhances our understanding of the mechanisms underlying peptide-receptor binding and selectivity.
Ultimately, this framework lays the groundwork for a data-driven peptide drug discovery process that can be iteratively refined and expanded as new computational tools and experimental data become available. By integrating computational prediction with experimental validation, researchers can accelerate the translation of these promising peptides from in silico models to preclinical and clinical applications, thus contributing to the advancement of precision oncology.
Introduction
The search for targeted and less toxic anticancer drugs has led researchers to increasingly revisit nature’s vast pharmacopoeia, recognizing it as a rich reservoir of bioactive compounds with therapeutic potential. Among the most compelling emerging candidates in this domain are bioactive peptides derived from natural toxins, including those found in leech venom. These peptides exhibit highly specific interactions with molecular targets implicated in tumorigenesis, offering unique and often underexplored mechanisms of action that can disrupt critical cancer pathways while sparing healthy tissues.
Computational Pipeline for Peptide Drug Discovery
1. Peptide Design and Sequence Optimization
Peptides were selected based on known anti-thrombotic and anti-proliferative motifs in Hirudin (a leech-derived peptide). Sequence optimization was performed using anti-cancer peptide prediction servers (e.g., iACP, CancerPPD) to enhance cytotoxic potential while minimizing immunogenicity.
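To make this screening step concrete, the short Python sketch below filters candidate sequences using predictions exported from such servers. The CSV layout, the column names (acp_probability, immunogenic), and the 0.8 cutoff are illustrative assumptions, not any server’s native output format.

```python
# Illustrative post-processing of anticancer-peptide predictions.
# Assumes the server results (e.g., iACP/CancerPPD exports) were saved to a CSV
# with hypothetical columns: peptide_id, sequence, acp_probability, immunogenic.
import csv

def shortlist_peptides(csv_path, acp_cutoff=0.8):
    """Keep peptides predicted as anticancer with high confidence and as non-immunogenic."""
    selected = []
    with open(csv_path, newline="") as handle:
        for row in csv.DictReader(handle):
            prob = float(row["acp_probability"])
            immunogenic = row["immunogenic"].strip().lower() == "yes"
            if prob >= acp_cutoff and not immunogenic:
                selected.append((row["peptide_id"], row["sequence"], prob))
    # Rank the shortlist by predicted anticancer probability, highest first.
    return sorted(selected, key=lambda item: item[2], reverse=True)

if __name__ == "__main__":
    for pid, seq, prob in shortlist_peptides("acp_predictions.csv"):
        print(f"{pid}\t{prob:.2f}\t{seq}")
```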
The integration of sophisticated bioinformatics tools and high-throughput screening platforms into this discovery process has dramatically accelerated the early stages of peptide drug development. By combining sequence analysis, molecular modeling, and in silico docking studies, researchers can rapidly identify, optimize, and prioritize candidate peptides for further experimental validation. This computational approach not only reduces the time and cost associated with traditional wet-lab screening but also enhances the precision and rationality of peptide design, paving the way for the development of novel anticancer therapeutics with improved efficacy and safety profiles.
2. Structure Prediction
3D models of the peptides were generated using PEP-FOLD3 and validated via Ramachandran plot analysis to ensure stereochemical stability.
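As a quick local sanity check on such models, the sketch below uses Biopython to flag residues whose backbone dihedral angles fall outside coarse, illustrative phi/psi windows. It assumes the PEP-FOLD3 model has been saved as model.pdb and is not a substitute for a dedicated Ramachandran validation server.

```python
# Rough Ramachandran spot-check of a predicted peptide model (e.g., a PEP-FOLD3 PDB file).
# Requires Biopython; the generous phi/psi windows below are illustrative only.
import math
from Bio.PDB import PDBParser, PPBuilder

def ramachandran_outliers(pdb_path):
    structure = PDBParser(QUIET=True).get_structure("peptide", pdb_path)
    outliers = []
    for polypeptide in PPBuilder().build_peptides(structure):
        angles = polypeptide.get_phi_psi_list()  # radians, None at chain termini
        for residue, (phi, psi) in zip(polypeptide, angles):
            if phi is None or psi is None:
                continue
            phi_deg, psi_deg = math.degrees(phi), math.degrees(psi)
            # Coarse "allowed" boxes covering broadly helical or sheet-like backbone geometry.
            helical = -160 <= phi_deg <= -20 and -120 <= psi_deg <= 45
            sheet = -180 <= phi_deg <= -40 and 90 <= psi_deg <= 180
            if not (helical or sheet):
                outliers.append((residue.get_resname(), residue.id[1], phi_deg, psi_deg))
    return outliers

if __name__ == "__main__":
    for name, pos, phi, psi in ramachandran_outliers("model.pdb"):
        print(f"{name}{pos}: phi={phi:.1f}, psi={psi:.1f}")
```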
3. Target Selection and Preparation
Receptor proteins such as AXL and EGFR, implicated in aggressive cancer phenotypes, were retrieved from the Protein Data Bank. These receptors play a critical role in tumor progression and resistance to existing therapies.
4. Molecular Docking
Docking simulations were conducted using HADDOCK and AutoDock Vina. Key parameters analyzed included binding affinity, hydrogen bonding, and interface residues. Peptides exhibiting strong interaction with the receptor binding sites were shortlisted for further validation.
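The sketch below shows how a single AutoDock Vina run might be scripted and the best predicted affinity parsed from its output. It assumes the vina executable is on the PATH, that the receptor and peptide have already been prepared in PDBQT format, and that the search-box coordinates shown are placeholders for the real binding-site location.

```python
# Illustrative AutoDock Vina run for one peptide-receptor pair.
# Assumes `vina` is on PATH and both molecules are already in PDBQT format;
# the search-box centre and size values are placeholders, not real coordinates.
import re
import subprocess

def dock_peptide(receptor_pdbqt, peptide_pdbqt, out_pdbqt, center, size=(30, 30, 30)):
    """Run Vina and return the best (most negative) predicted affinity in kcal/mol."""
    cmd = [
        "vina",
        "--receptor", receptor_pdbqt,
        "--ligand", peptide_pdbqt,
        "--out", out_pdbqt,
        "--center_x", str(center[0]), "--center_y", str(center[1]), "--center_z", str(center[2]),
        "--size_x", str(size[0]), "--size_y", str(size[1]), "--size_z", str(size[2]),
        "--exhaustiveness", "16",
    ]
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    # Vina prints a results table; binding mode 1 holds the best predicted affinity.
    match = re.search(r"^\s*1\s+(-?\d+\.\d+)", result.stdout, flags=re.MULTILINE)
    return float(match.group(1)) if match else None

if __name__ == "__main__":
    affinity = dock_peptide("axl_receptor.pdbqt", "peptide_01.pdbqt",
                            "peptide_01_docked.pdbqt", center=(12.0, -4.5, 20.3))
    print(f"Best predicted affinity: {affinity} kcal/mol")
```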
5. In Silico Toxicity and Stability Profiling
ADMET (Absorption, Distribution, Metabolism, Excretion, and Toxicity) analysis was performed using tools like SwissADME and ToxinPred. Candidates with favorable pharmacokinetic profiles and non-toxic predictions were prioritized.
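One simple way to combine these filters is sketched below: docking affinities are merged with exported toxicity predictions, and only strong, predicted non-toxic binders are retained. The CSV column names and the -8.0 kcal/mol cutoff are assumptions for illustration rather than fixed outputs of SwissADME or ToxinPred.

```python
# Illustrative prioritisation step: combine docking affinities with exported
# toxicity predictions (e.g., a ToxinPred results table saved as CSV).
# Column names and thresholds are assumptions, not tool-specific formats.
import csv

def prioritise(docking_csv, toxicity_csv, affinity_cutoff=-8.0):
    toxicity = {}
    with open(toxicity_csv, newline="") as handle:
        for row in csv.DictReader(handle):
            toxicity[row["peptide_id"]] = row["prediction"].strip().lower()  # "toxin" / "non-toxin"

    candidates = []
    with open(docking_csv, newline="") as handle:
        for row in csv.DictReader(handle):
            affinity = float(row["affinity_kcal_mol"])
            if affinity <= affinity_cutoff and toxicity.get(row["peptide_id"]) == "non-toxin":
                candidates.append((row["peptide_id"], affinity))

    # Strongest (most negative) predicted binders first.
    return sorted(candidates, key=lambda item: item[1])

if __name__ == "__main__":
    for pid, affinity in prioritise("docking_results.csv", "toxinpred_results.csv"):
        print(f"{pid}: {affinity:.1f} kcal/mol, predicted non-toxic")
```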
6. Results and Discussion
Several peptides showed nanomolar binding affinity toward AXL and EGFR, suggesting potential therapeutic efficacy. Molecular interaction analysis revealed key residues contributing to stable peptide-receptor complexes. The docking scores correlated with known anti-tumor peptide characteristics, highlighting the power of computational screening in identifying viable therapeutic leads.
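As a rough sanity check on what “nanomolar” implies, a predicted binding free energy can be converted to an approximate dissociation constant through the idealized relation ΔG = RT ln Kd. The sketch below applies this to a few example scores, keeping in mind that docking scores only approximate true binding free energies.

```python
# Back-of-the-envelope conversion from a predicted binding free energy to an
# approximate dissociation constant (Kd), using dG = RT * ln(Kd).
# Docking scores are only rough estimates of dG, so treat the result as indicative.
import math

R_KCAL = 0.0019872  # gas constant in kcal/(mol*K)

def delta_g_to_kd(delta_g_kcal_mol, temperature_k=298.15):
    """Return Kd in molar units for a given binding free energy (negative = favourable)."""
    return math.exp(delta_g_kcal_mol / (R_KCAL * temperature_k))

if __name__ == "__main__":
    for dg in (-8.0, -10.0, -12.3):
        kd = delta_g_to_kd(dg)
        print(f"dG = {dg:5.1f} kcal/mol  ->  Kd ~ {kd * 1e9:.1f} nM")
```

Under this idealization, a score of about -12.3 kcal/mol at 298 K corresponds to a Kd near 1 nM.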
Additionally, in silico toxicity prediction allowed early-stage elimination of peptides with poor safety profiles, saving significant time and resources in downstream experimental validation.
Conclusion and Future Prospects
This study highlights the potential of computational pipelines in peptide drug discovery, particularly in the context of cancer therapy. With the increasing availability of biological data and AI-driven prediction models, bioinformatics is poised to become an indispensable tool in next-generation precision oncology.
Further experimental validation and in vitro studies will be critical to translating these computational findings into clinical breakthroughs.
Humans are exceptional reservoirs of diverse microbial species, forming complex microbiota that play a crucial role in health by modulating metabolic processes and protecting against various diseases. The composition and function of the microbiota can be positively influenced by the consumption of probiotics and prebiotics—beneficial bacteria and non-digestible food components, respectively—that promote the growth of these beneficial microbes. Many fermented foods serve as sources of probiotic strains, while plant-based oligosaccharides are well-known prebiotics. Together, probiotics and prebiotics are important in treating immune system disorders, cancer, liver diseases, gastrointestinal issues, type 2 diabetes, and obesity, thanks to their immunomodulatory properties, support of gut barrier integrity, production of antimicrobial compounds, and regulation of immune responses. This review aims to highlight the potential impact of prebiotics and probiotics on gut microbiota, emphasizing their role in enhancing health benefits.
Introduction
The human microbiota contains approximately 10¹⁴ bacterial cells, which is comparable to the total number of eukaryotic cells in the human body (Wang et al., 2016). Although the gut of a neonate is initially sterile, it is rapidly colonized by maternal bacteria, resulting in a diverse gut microbiome. During the first few months of life, the infant’s gut microbiota adapts to its environment based on nutritional availability, anaerobic conditions, and microbial interactions (Bäckhed et al., 2015).
The diversity of the adult gut microbiota consists of approximately 1,000–1,150 different bacterial species, primarily including Bacteroidetes, Firmicutes, Actinobacteria, Proteobacteria, and Verrucomicrobia (David et al., 2014; Salvucci, 2019; Almeida et al., 2019). Lifestyle, environment, and age significantly influence the stability of the host microbiome (Di Paola et al., 2011).
Although the gut microbiota remains generally resilient and stable throughout a person’s life, it can be temporarily affected by factors such as unhealthy diets, antibiotic use, and exposure to new environments—though these usually have a limited impact on its overall composition (Rajilić-Stojanović et al., 2013).
Prebiotics are non-digestible food components that are selectively utilized by the host’s microbiota to confer health benefits (Swanson et al., 2020). Their potential effects include modulation of the gut microbiota and the production of beneficial metabolites, such as tryptophan and short-chain fatty acids (SCFAs). Commercially available prebiotics include inulin, isomalto-oligosaccharides (IMO), fructo-oligosaccharides (FOS), galacto-oligosaccharides (GOS), lactulose, and resistant starches (Yan et al., 2018).
Probiotics are live microorganisms that provide health benefits to the host when administered in adequate amounts. Common probiotic species include members of the Bifidobacterium and Lactobacillus genera, while less common probiotics include Faecalibacterium prausnitzii, Akkermansia muciniphila, Streptococcus thermophilus, Saccharomyces boulardii, and Lactococcus lactis (Hill et al., 2014; Markowiak & Śliżewska, 2017; Ballan et al., 2020). These probiotics influence the gut luminal environment, mucosal barrier function, and mucosal immune system.
Composition and diversity of gut microbiota
The human microbiota contains approximately 10¹⁴ microorganisms. The mother’s oral microbiota closely resembles that of the placenta, with Firmicutes, Proteobacteria, Bacteroidetes, Fusobacteria, and Tenericutes contributing to its structure during the prenatal period. According to one theory, microbes may be transferred from the oral cavity to the fetus via the circulatory system (Aagaard et al., 2014).
After birth, the newborn encounters various microbes, with colonization influenced by the mode of delivery (Dominguez-Bello et al., 2010). During vaginal delivery, the baby’s skin and mucous membranes come into contact with the mother’s microbiota, leading to colonization predominantly by Lactobacillus species. In contrast, Propionibacteria and Staphylococcus—common skin microbes—colonize the baby’s mouth, gut, and skin following cesarean section (Jakobsson et al., 2014; Bäckhed et al., 2015).
During the initial months of a newborn’s life, the gut microbiota adapts to its environment, influenced by factors such as nutrient availability, anaerobic conditions, and microbial interactions (Bäckhed et al., 2015). In the first two years, cesarean-born infants tend to have fewer maternally transmitted microbes (e.g., Bifidobacteria and Bacteroides), lower diversity, and a reduced Type 1 T helper (Th1) immune response compared to vaginally born infants; however, these differences gradually diminish over time (Jakobsson et al., 2014; Bäckhed et al., 2015).
Breastfeeding is another key factor in early microbial colonization. Human breast milk contains more than 10⁷ bacterial cells per 800 mL, predominantly from the genera Lactobacillus, Streptococcus, and Staphylococcus (Soto et al., 2014). Additionally, breast milk is rich in oligosaccharides, which selectively promote the growth of Lactobacillus and Bifidobacterium species. These microbes contribute to fermentation in the gastrointestinal tract (GIT), the production of short-chain fatty acids (SCFAs), and the reduction of colonic pH. This acidic environment limits the growth of harmful pathogens that cannot survive under such conditions.
Furthermore, breast milk provides immunoglobulin A, lactoferrin, and defensins, which offer additional health benefits to infants (Lönnerdal, 2016). Early breastfeeding has also been linked to the prevention of diseases such as obesity, dermatitis, and infections (Greer et al., 2008).
Following the introduction of solid foods, the infant’s gut microbiota begins to transition toward an adult-like composition (Turnbaugh et al., 2009). By the third year of life, environmental factors strongly influence microbiota colonization. This increased diversity enhances the synthesis of vitamins and amino acids and improves carbohydrate metabolism (Yatsunenko et al., 2012).
In adulthood, the gut microbiota is typically composed of Firmicutes, Bacteroidetes, Actinobacteria, Verrucomicrobia, and Proteobacteria. This diverse microbiota plays a critical role in human physiology, influencing intestinal barrier integrity, neurotransmitter production, immune system development, and energy metabolism. Lifestyle, physical environment, and age all affect the stability of the host microbiome (Di Paola et al., 2011).
Methodology
This review adheres to PRISMA guidelines and includes literature sourced from Google Scholar, PubMed, Scopus, and Web of Science. Over 50 peer-reviewed studies from the past decade examining the health benefits of probiotics and prebiotics were selected. Quality assessments were conducted using the Cochrane Risk of Bias tool for randomized controlled trials (RCTs) and the Newcastle-Ottawa Scale for observational studies.
The findings were synthesized narratively, focusing on three key areas: the role of gut microbiota in combating diseases, the role of prebiotics in modulating gut microbiota, and the role of probiotics in supporting gut microbiota.
Factors influencing the gut microbiota
To explore the connection between genetics and gut microbiota, researchers profiled the gut microbiota of eight distinct mouse strains using DNA fingerprinting techniques and found that the host’s genetic makeup significantly influences microbiota diversity (Kemis et al., 2019). The host genotype thus plays a role in selecting the intestinal microbiota.
At birth, the newborn’s initially sterile gut is rapidly colonized by numerous microbes from the mother and the surrounding environment. Although the formation of the gut microbiota is influenced by the offspring’s genes, mothers and their children share approximately half of their genetic material as well as similarities in their gut microbiota composition (Coelho et al., 2021).
The adult gut microbiota is highly responsive to dietary changes. In studies where mice were switched to a Western-style diet, the microbiota underwent significant alterations, notably with a marked increase in the abundance of Firmicutes, particularly the class Erysipelotrichi (Salazar et al., 2017). Changes in diet over just 24 hours triggered observable shifts in microbial composition.
Dietary carbohydrates that are indigestible in the upper intestine reach the colon where they are fermented by gut microbes, leading to substantial changes in microbiota composition and beneficial effects on host health (Leeming et al., 2019). The prebiotic hypothesis, first proposed in 1995, highlights that prebiotics can increase the number of Bifidobacteria (phylum Actinobacteria) (Rezende et al., 2021). This microbial shift happens quickly but also reverts rapidly once prebiotic intake stops.
Breast milk naturally contains oligosaccharides that act as prebiotics, supporting the growth of Bifidobacterium populations in infants. These findings emphasize the crucial role diet plays in shaping the gut microbiota throughout life (Leeming et al., 2019).
The body’s immune system plays a crucial role in shaping the gut microbiota. Studies have shown that animals with abnormal Toll-like receptor (TLR) signaling exhibit elevated antibody levels, which help regulate commensal bacteria. This interaction between the host and gut microbiota is maintained through increased serum antibody levels.
Mutant mice lacking functional TLRs display altered intestinal microbial compositions, demonstrating that the host’s phenotype is strongly influenced by the characteristics of its gut microbiota. Additionally, factors such as gut peristalsis and the dense mucus layer produced by goblet cells affect the microbial population (Schluter et al., 2020).
The thick mucus layer formed by goblet cells acts as a barrier, limiting microbial penetration into the colonic epithelium (see Figure 1).
Role of microbiota in combating diseases
The gut microbiota plays a key role in metabolic processes, including carbohydrate and lipid metabolism, which are critical factors in the development of diabetes. Certain probiotic strains can improve insulin sensitivity and help regulate blood glucose levels. For example, Lactobacillus and Bifidobacterium strains have been shown to reduce inflammation and enhance glucose metabolism, thereby alleviating the effects of type 2 diabetes (Turnbaugh et al., 2009).
These probiotics exert their beneficial effects by strengthening gut barrier function, lowering endotoxemia, and influencing the secretion of hormones such as incretins, which are involved in insulin release. Moreover, they can affect the gut-brain axis, potentially reducing stress-induced hyperglycemia and further supporting better glycemic control (Schluter et al., 2020).
Gut bacteria metabolize dietary compounds like choline and carnitine into metabolites that influence cardiovascular health. Probiotic strains such as Lactobacillus reuteri help lower cholesterol by breaking down bile acids in the gut, preventing their reabsorption and thereby reducing blood cholesterol levels.
Disruptions in the gut microbiome have been linked to cardiovascular diseases. Certain probiotics offer potential therapeutic benefits by modulating inflammation and reducing hypertension (Leeming et al., 2019). These probiotics also produce short-chain fatty acids (SCFAs), which have been shown to lower blood pressure and enhance endothelial function. Furthermore, they can reduce levels of trimethylamine N-oxide (TMAO), a metabolite associated with higher cardiovascular risk, thus providing a comprehensive approach to supporting heart health (Champagne et al., 2018).
The gut microbiome plays a crucial role in adipose tissue metabolism and the development of obesity. Research shows that the composition of gut microorganisms varies between underweight and overweight individuals. Specific probiotic strains, such as Lactobacillus gasseri and Bifidobacterium breve, influence energy balance and fat storage by regulating nutrient absorption and hormone secretion related to appetite and fat accumulation.
Moreover, short-chain fatty acids (SCFAs) produced by gut bacteria promote adipocyte differentiation, enhance lipid metabolism, and improve insulin sensitivity, all of which contribute to better metabolic health (Aagaard et al., 2014). These SCFAs act as signaling molecules that regulate gene expression involved in lipid metabolism and energy homeostasis. Probiotics also stimulate the release of satiety hormones like peptide YY (PYY) and glucagon-like peptide-1 (GLP-1), which help decrease food intake and support weight loss (Schluter et al., 2020).
Emerging studies indicate that the gut microbiome significantly influences cognitive function and mental health via the gut-brain axis. A healthy gut microbial community is linked to a lower risk of mood disorders such as depression and anxiety. Probiotic strains like Lactobacillus helveticus and Bifidobacterium longum have demonstrated the ability to reduce stress and anxiety symptoms by modulating neurotransmitter production and lowering systemic inflammation (Turnbaugh et al., 2009).
These probiotics impact levels of serotonin and gamma-aminobutyric acid (GABA), both essential neurotransmitters for mood regulation. Additionally, they suppress pro-inflammatory cytokines that can adversely affect brain function. By enhancing gut barrier integrity, these probiotics help prevent inflammatory molecules from reaching the brain, thereby promoting mental well-being (Rosolen et al., 2019).
A diverse gut microbiota is essential for a strong immune system, aiding in the management of asthma and the reduction of allergies. Exposure to a broad range of microbes enhances immune resilience and lowers the risk of autoimmune diseases. Probiotic strains such as Lactobacillus rhamnosus can modulate immune responses and alleviate allergic symptoms by strengthening gut barrier function and decreasing pro-inflammatory cytokines (Aagaard et al., 2014).
These probiotics stimulate the production of regulatory T cells (Tregs), which help sustain immune tolerance and prevent excessive reactions to harmless antigens. They also boost the secretion of secretory IgA, a critical antibody in mucosal immunity, offering extra protection against allergens and pathogens (Schluter et al., 2020).
An imbalance in the gut microbiota is linked to several gastrointestinal disorders, including inflammatory bowel disease (IBD) and colitis. Managing the gut microbiota through dietary interventions and probiotics can help reduce inflammation and promote gut health. Probiotics such as Saccharomyces boulardii and Lactobacillus plantarum have demonstrated effectiveness in alleviating IBD symptoms and preventing relapses by enhancing mucosal barrier function and modulating immune responses (Leeming et al., 2019).
These probiotics help restore gut flora balance, decrease pro-inflammatory cytokine production, and increase anti-inflammatory cytokine levels. They also support regeneration of the gut epithelium and strengthen gut barrier integrity, thereby reducing intestinal permeability and subsequent inflammation (Champagne et al., 2018).
The gut microbiome plays a crucial role in modulating certain types of cancer, including colon cancer. Some gut bacteria elicit anti-inflammatory responses that may protect against tumor formation. Probiotics such as Lactobacillus casei have been shown to reduce the risk of colon cancer by inhibiting the growth of pathogenic bacteria and promoting the production of anti-carcinogenic compounds (Turnbaugh et al., 2009). These probiotics increase the production of butyrate, a short-chain fatty acid with well-known anti-tumorigenic properties. Butyrate induces apoptosis in cancer cells and inhibits their proliferation. Additionally, probiotics can modulate the immune system to enhance its ability to recognize and destroy cancer cells, providing a dual mechanism against tumor development (Rosolen et al., 2019).
The gut microbiota also influences liver health through several mechanisms. Dysbiosis, or imbalance of gut microbes, can increase intestinal permeability (“leaky gut”), allowing toxins to enter the bloodstream, trigger systemic inflammation, and contribute to liver disorders. Probiotics such as Lactobacillus rhamnosus help maintain gut barrier integrity and reduce liver inflammation. Moreover, gut bacteria participate in bile acid metabolism, essential for digesting dietary fats and regulating lipid and glucose metabolism in the liver (Aagaard et al., 2014). Probiotics can also decrease the production of lipopolysaccharides (LPS), endotoxins that promote liver inflammation and damage. By modulating gut microbiota composition and function, probiotics support liver health and help prevent progression of liver diseases such as non-alcoholic fatty liver disease (NAFLD) and alcoholic liver disease (Rosolen et al., 2019) (see Figure 2).
Role of prebiotics in gut microbiota
Prebiotics are defined as substances selectively utilized by host microorganisms to confer health benefits (Swanson et al., 2020). These benefits include modulation of the gut microbiota and the production of metabolites such as short-chain fatty acids (SCFAs) and tryptophan derivatives. However, these effects should be confirmed in target hosts, including animals and humans (Roager & Licht, 2018; Sanders et al., 2019; Swanson et al., 2020). Common commercially available prebiotics include inulin, lactulose, fructo-oligosaccharides (FOS), isomalto-oligosaccharides (IMO), galactooligosaccharides (GOS), and resistant starch (Yan et al., 2018). Numerous studies have explored the health benefits of dietary fiber consumption, both with and without recognized prebiotic effects. The primary mechanism of prebiotics involves selective fermentation by beneficial gut microorganisms, such as Lactobacillus and Bifidobacterium, which produce lactate and acetate. These metabolites then stimulate other beneficial microbes to produce butyrate, a key SCFA. Importantly, SCFAs have been shown to enhance mineral absorption, contributing to improved host health (Roager & Licht, 2018; Sanders et al., 2019; Swanson et al., 2020).
Prebiotics can assist in regulating the overall bacterial diversity of the gut by promoting the growth of useful bacteria, while inhibiting the proliferation of potentially dangerous species. Prebiotics can modulate immune responses and reduce inflammation by influencing lymphoid tissue associated with the gut. Prebiotic consumption has been linked to a variety of health benefits, including improved digestive health, enhanced nutrient absorption, and a lower risk of certain chronic diseases such as obesity, diabetes, and cardiovascular disorders.
Health Benefits
According to Oliveira et al., co-culturing certain probiotic strains with inulin—the most extensively studied prebiotic—improves the acidification rate of dairy products. Santos et al. demonstrated that Lactobacillus acidophilus La-5, when microencapsulated with inulin, showed greater resistance to simulated gastrointestinal tract (GIT) stress in vitro compared to free cells, resulting in an enhanced survival rate (David et al., 2014). Additionally, Rosolen et al. reported that using a combination of whey and inulin as a protective coating for Lactococcus lactis R7 improved heat resistance and tolerance to in vitro GIT stress (see Table 1).
Role of probiotics in gut microbiota
Good health is strongly linked to the ingestion of probiotics. The microbes approved for consumption are generally considered safe, with selective strains targeting specific populations such as newborns, adults, and the elderly. Additionally, the recommended dietary allowance (RDA) of these microbes should be taken into account to achieve optimal health benefits (Hill et al., 2014; Ballan et al., 2020; Coniglio et al., 2023). Common probiotic species include those from the genera Lactobacillus and Bifidobacterium, while other microbes such as Faecalibacterium prausnitzii, Akkermansia muciniphila, Streptococcus thermophilus, Saccharomyces boulardii, and Lactococcus lactis are also categorized as probiotics (Hill et al., 2014; Markowiak & Śliżewska, 2017; Ballan et al., 2020).
Different probiotic strains exhibit varying survival and multiplication rates in the stomach depending on factors such as the food medium (e.g., milk or soymilk), oxygen levels (e.g., stirred yogurt), storage temperature, pH, and the presence of food ingredients or prebiotics (Homayoni Rad et al., 2016; Champagne et al., 2018; Ballan et al., 2020). By reducing certain unfavorable food components, such as raffinose and stachyose found in soymilk, probiotics can exert beneficial health effects (Albuquerque et al., 2017; Battistini et al., 2018; Champagne et al., 2018). While starter strains like Streptococcus thermophilus are added alongside probiotic cultures to shorten fermentation times, their presence can inhibit the production of flavors generated by acetic acid when co-cultured with Bifidobacterium strains (Tripathi & Giri, 2014; Oliveira et al., 2009; Champagne et al., 2018).
Prebiotics are often consumed together with probiotics to form synbiotics, which can reduce fermentation time and enhance the survival rate of probiotics throughout the gut (Oliveira et al., 2009; Markowiak & Śliżewska, 2017). Dietary changes modulate the behavior of probiotic strains differently. For example, the addition of fruit pulp to soymilk fermented with probiotics significantly influences the properties of the final product (Peters et al., 2019). Furthermore, advancing food technologies such as microencapsulation have greatly improved fermentation methods and increased tolerance to gastrointestinal tract (GIT) stresses (Oliveira et al., 2009; Champagne et al., 2018; Tripathi & Giri, 2014). Recent studies indicate that microbial metabolites contribute to health benefits and influence probiotic function and supplementation strategies (Champagne et al., 2018; Kalita et al., 2023; Mehmood et al., 2023) (Table 2).
Mechanism of action
Probiotic microorganisms influence the host in several ways, enhancing the intestinal lumen, mucosal barrier, and immune stability (Fong et al., 2020). These effects are mediated through various cell types involved in both innate and adaptive immunity, including epithelial cells, monocytes, dendritic cells, B cells, T cells (such as regulatory T cells), and natural killer (NK) cells. The primary mechanisms include selective utilization of prebiotics by commensal microbiota, production of metabolites like short-chain fatty acids (SCFAs) and organic acids, reduction of lumen pH, increased mineral absorption, and inhibition of pathogenic growth (Peters et al., 2019) (Figure 3).
Probiotics enhance phagocytosis, regulate immunoglobulin production, improve immune responses, and maintain microbiome homeostasis through competition for nutrients and adhesion sites, bacteriocin release, reduction of pro-inflammatory activities, and enhancement of barrier functions (Bermudez-Brito et al., 2012). Key regulatory pathways and cytokines involved include G protein-coupled receptors (GPR41 and GPR43), glucagon-like peptide 1 (GLP-1), peptide YY (PYY), lipopolysaccharides (LPS), nuclear factor kappa B (NF-κB), tumor necrosis factor-alpha (TNF-α), exopolysaccharides (EPS), interferon-gamma (IFN-γ), and interleukin-12 (IL-12). These mechanisms play important roles in reducing metabolic endotoxemia and inflammation (Peters et al., 2019).
Moreover, probiotics modulate mucosal cell interactions and maintain cellular stability by improving intestinal barrier function. They achieve this by regulating the phosphorylation of cytoskeletal and junctional proteins, which supports barrier integrity through processes such as mucus production, chloride and water secretion, and tight junction protein interactions (Yadav & Jha, 2019).
Enhanced mucosal barrier function is crucial in managing disorders such as inflammatory bowel disease (IBD), celiac disease, gut infections, and type 1 diabetes (Ghosh et al., 2021). At the molecular level, epithelial cells respond differently to commensal or probiotic bacteria compared to pathogens. For instance, probiotic bacteria do not induce interleukin-8 (IL-8) secretion from epithelial cells, whereas pathogens like Shigella dysenteriae, enteropathogenic Escherichia coli, Listeria monocytogenes, and Salmonella dublin do (Bermudez-Brito et al., 2012). In fact, co-culture with probiotic bacteria can reduce the IL-8 release caused by these pathogens, thereby mitigating inflammation and promoting intestinal homeostasis. However, not all probiotics exhibit this anti-inflammatory trait; for example, Escherichia coli Nissle 1917 has been shown to increase IL-8 secretion in a dose-dependent manner, highlighting the variability in the immunomodulatory effects of different probiotic strains (Wen et al., 2020).
Future perspectives
Advancements in gut microbiome profiling techniques will enable personalized approaches to gut health interventions. By identifying an individual’s gut microbiota composition and its response to prebiotics and probiotics, healthcare professionals can design targeted treatment strategies to maximize health benefits. Researchers continue to explore novel prebiotic and probiotic strains to optimize their effects on gut health. Advances in microbial engineering and genetic editing technologies have facilitated the development of more precise and potent prebiotics and probiotics, thereby maximizing their therapeutic potential (Wen et al., 2020).
The gut-brain axis—a bidirectional communication network linking the gut and the brain—illustrates how gut microorganisms influence mental health and cognition. This connection opens avenues for developing prebiotic and probiotic interventions aimed at supporting mental health and reducing symptoms of depression. Moreover, the therapeutic applications of prebiotics and probiotics extend well beyond gut health. Emerging research has examined their roles in managing metabolic disorders, cardiovascular diseases, and autoimmune conditions. Identifying specific microbial strains and bioactive compounds capable of modulating disease-related pathways offers promising new directions for targeted therapies (Peters et al., 2019).
Microbiome-based therapeutics, including fecal microbiota transplantation (FMT) and defined microbial cocktails, show considerable promise for treating gastrointestinal disorders and systemic diseases. As our understanding of the distinct functions of various microbial communities deepens, these therapies are expected to become increasingly refined and widely accepted in mainstream medicine. However, the rapid growth of prebiotic and probiotic products in the market has outpaced regulatory oversight, leading to concerns about inconsistent quality and efficacy. Enhanced regulatory frameworks will be essential to ensure the safety, reliability, and therapeutic value of these products.
Notably, early-life exposure to prebiotics and probiotics may have long-lasting effects on gut health and overall well-being. Understanding how these early interventions shape the developing gut microbiome—and consequently influence lifelong health—is a critical area of ongoing research (Rosolen et al., 2019).
As gut microbiota science advances, it is likely that nutritional guidelines will increasingly incorporate prebiotic and probiotic recommendations to promote gut health. Integrating these into standard dietary advice could help prevent gut-related disorders and improve general health outcomes. In conclusion, the future of prebiotics and probiotics in gut microbiota research is promising and holds vast potential to enhance human health. Continued exploration of the gut microbiome’s complexities and its broad impact on well-being will drive the development of personalized interventions and innovative therapeutics, revolutionizing approaches to gut health and disease management (Wen et al., 2020).
Limitations
The potential impact of probiotics and prebiotics in enhancing health benefits is promising, yet several limitations must be acknowledged. Much of the current evidence relies on studies with small sample sizes, short durations, or investigations focused on specific populations, thereby limiting the generalizability of findings. Additionally, the variability in probiotic strains, prebiotic compounds, dosages, and methodological inconsistencies across studies complicates the interpretation and comparison of results. Individual differences in microbiome composition and the complex interactions between probiotics, prebiotics, and host physiology are often oversimplified, further challenging the establishment of universal guidelines. Potential biases, such as industry sponsorship and publication bias, can skew outcomes. Moreover, the long-term effects and safety profiles of these interventions are not well-documented, and significant translational gaps remain between research evidence and practical clinical recommendations.
Conclusion
The gut microbiome, a complex network of microorganisms, plays a crucial role in various physiological processes, including nutrient metabolism and immune system regulation. Prebiotics are indigestible food components that stimulate the growth and activity of beneficial microbes in the gut. Through fermentation, prebiotics produce short-chain fatty acids (SCFAs), which confer anti-inflammatory properties and support gut barrier integrity. By promoting the proliferation of beneficial microbes, prebiotics contribute to a healthier gut ecosystem and may protect against gastrointestinal disorders. When consumed in sufficient quantities, probiotics—live beneficial bacteria—can enhance gut barrier function, produce antimicrobial compounds, and modulate immune responses. Probiotics have been shown to alleviate intestinal disorders such as irritable bowel syndrome and antibiotic-associated diarrhea, as well as improve immune function and reduce the risk of infections. Thus, both prebiotics and probiotics play significant roles in improving quality of life by enhancing overall health.
Courtrooms today are seeing a quiet revolution – one where neuroscience and the law are starting to speak the same language. As neuropsychology and forensic psychology increasingly intersect, we’re gaining new insights into understanding crime, responsibility, and rehabilitation. This article examines how brain science is revolutionizing legal practice – from assessing mental fitness to predicting recidivism – while highlighting real-world applications, current trends, and emerging ethical concerns. It’s time we adopted a justice system informed not just by actions, but by what’s happening inside the mind.
⦾ A New Era: When Law Meets Brain Science
Psychology has long helped courts interpret human behavior, but rarely has it examined the biology behind it. That’s changing. Neuropsychology, which focuses on the brain’s influence on thought and behavior, is adding a powerful new layer to forensic psychology, a field that applies psychological principles to legal issues.
Together, they offer insight into key legal questions: Did the person grasp what they were doing? Could they stop themselves? Are they a danger to society? These aren’t just legal puzzles – they require understanding how the brain reacts to trauma, disease, or stress.
⦾Connecting the Dots: Where Disciplines Merge
➢What Neuropsychology Brings to the Table:-
Neuropsychology links brain health to behavior. Whether it’s damage, developmental delay, or dysfunction, brain scans and assessments reveal how thought processes can go off track – and how that matters in legal settings.
➢ The Role of Forensic Psychology:-
Forensic psychologists assess mental state in court cases, from trauma evaluation to criminal profiling. But traditionally, they haven’t gone deep into the brain. That’s where neuropsychology fills the gap – providing a biological context for behavior.
➢ A New Type of Legal Evidence:-
We now know that not all harmful actions stem from rational choice. Brain disorders or injuries can be contributing factors. Courts are starting to accept brain-based evidence – not to excuse crimes, but to understand them better. Sometimes, this leads to adjusted sentences, therapeutic interventions, or re-evaluation of legal responsibility.
⦾ How Brain Science Informs Justice:
➢Fit for Trial?:-
Before standing trial, a defendant must comprehend the charges and assist in their defense. Neuropsychological evaluations can uncover cognitive deficits, like poor memory or logical reasoning, that may go unnoticed but impact legal participation.
➢Intention or Impairment?:-
Was the crime intentional? If someone acted during a psychotic break or had frontal lobe damage, their ability to make choices may have been impaired. Neurodata helps separate wilful harm from neurologically influenced behavior.
➢Predicting Future Behavior:-
By analyzing emotional regulation, impulse control, and stress response, neuropsychologists help assess whether someone might pose a future threat, informing parole decisions and treatment plans.
⦾Brain Disorders with Legal Weight:
➢Brain Injury: Damage to areas managing impulse control can result in unfiltered, harmful behavior.
➢Mental Illness: Disorders like schizophrenia may disconnect actions from conscious intent.
➢Frontal Lobe Dysfunction: The frontal lobes act as the brain’s control center; damage here can erode judgment and lead to poor decisions, even criminal ones.
⦾Tools Changing the Game:
➢Brain Scans Tell a Story:-
Functional MRIs reveal real-time brain function. These scans often serve as persuasive evidence, helping courts visualize unseen impairments.
➢ AI Joins the Analysis:-
Machine learning is now analyzing cognitive data, detecting patterns too subtle for the human eye. This sharpens the precision of psychological evaluations.
➢Tech in Prisons:-
Digital tools are streamlining neuro assessments in prisons, making it easier to track inmates’ mental health and tailor rehabilitation.
⦾Ethical Hurdles to Cross:
➢ Malingering:- Some defendants fake cognitive issues. Experts use specialized methods to spot dishonesty.
➢ Cultural Bias:- Tests may not fairly assess people from different cultural or linguistic backgrounds.
➢ Blame vs. Brain:- Brain damage doesn’t always mean lack of responsibility. Courts must still weigh moral and legal accountability carefully.
⦾Cases That Redefined Justice:
➢ The Cyst That Changed a Sentence:-
A man’s brain scan revealed a cyst pressing on his decision-making center. He was still found guilty, but the sentence reflected the role of the impairment.
➢ Impulse Control Gone Awry:-
A young offender was shown to have damage to the brain’s impulse center. Instead of prison, the court ordered intensive therapy, tailored to help him regain control.
⦾Working Together for Fairer Outcomes:
Justice works best when it’s informed from every angle. Lawyers, neurologists, psychologists, and social workers must collaborate to ensure the clear, consistent, and responsible use of brain data in courtrooms.
⦾What the Future Holds:
➢ Ethics First: As neuroscience grows, we’ll need new standards around consent, privacy, and sentencing fairness.
➢ Youth Justice: Teen brains process risk differently – this is starting to influence juvenile justice reform.
➢ Virtual Rehab: VR is being used in correctional programs to help inmates rebuild empathy, planning, and emotion regulation.
⦾Conclusion:
Justice with Depth Embracing brain science doesn’t mean excusing wrongdoing – it means understanding it. A justice system that sees the full picture of a person’s brain, behavior, and background is more likely to offer fair, effective outcomes. In the shadows of neurons and legal codes, a more humane system is waiting to emerge.
Microplastics, defined as plastic particles smaller than 5 mm in size, have emerged as a significant environmental concern due to their widespread presence and persistence in aquatic and terrestrial ecosystems (Thompson et al., 2004). These particles originate from the degradation of larger plastic waste or are manufactured for specific industrial purposes. With global plastic production reaching over 390 million tonnes in 2021 (PlasticsEurope, 2022), the leakage of microplastics into the environment has become unavoidable, raising serious concerns for biodiversity, food safety, and human health.
Sources of Microplastics
Microplastics are typically categorized into two types: primary and secondary. Primary microplastics are intentionally manufactured in small sizes for applications such as cosmetics (e.g., exfoliants), industrial abrasives, or medical uses (Andrady, 2011). Secondary microplastics result from the breakdown of larger plastic debris due to environmental weathering, UV radiation, and mechanical abrasion.
Urban runoff, wastewater discharge, shipping activity, and improper waste disposal are major contributors to microplastic pollution (Browne et al., 2011). Synthetic fibers from clothes released during washing are also a significant source of microplastics, as washing machines can release hundreds of thousands of fibers per load (Napper and Thompson, 2016).
Distribution in the Environment
Microplastics have been detected in oceans, rivers, lakes, soil, and even in atmospheric dust. Marine environments are especially vulnerable, with microplastics being found from surface waters to deep-sea sediments (Woodall et al., 2014).
Impact on Marine Life
Marine organisms, ranging from plankton to whales, inadvertently ingest microplastics, mistaking them for food. This ingestion can lead to physical harm, such as internal injuries and blockages, and chemical exposure due to adsorbed pollutants (Cole et al., 2013). Studies have shown that microplastics can bioaccumulate in the food chain, posing risks to higher trophic levels, including humans (Rochman et al., 2013). Filter feeders like mussels and oysters are particularly vulnerable and have shown compromised physiological functions after exposure to microplastics.
Human Health Implications
The presence of microplastics in drinking water, salt, seafood, and even the air we breathe suggests a direct route of exposure to humans (Smith et al., 2018). While the long-term health impacts are still under investigation, there is concern about inflammation, cytotoxicity, and the potential for plastic particles to act as vectors for pathogens and chemical contaminants.
Policy and Mitigation Strategies
Governments and environmental organizations have initiated measures to mitigate microplastic pollution. Bans on microbeads in cosmetics, stricter wastewater treatment regulations, and increased recycling efforts are key strategies (UNEP, 2018). Innovative technologies, such as microfiber filters for washing machines and biodegradable alternatives to conventional plastics, are being explored to reduce microplastic input into ecosystems.
Public Awareness and Future Directions
Raising public awareness is crucial in combating microplastic pollution. Educational campaigns and citizen science projects help collect data and promote behavioral change (Hartley et al., 2018). Further research is necessary to fully understand the ecotoxicological effects of microplastics and to develop comprehensive risk assessments and policy responses.
Conclusion
Microplastics have become pervasive in the environment, with potentially far-reaching effects on ecosystems and human health. Addressing this issue requires a multi-pronged approach involving policy intervention, scientific research, and public engagement. Efforts to reduce plastic production and enhance waste management infrastructure will be essential in limiting future pollution.
Rockets have been around for centuries. The earliest rockets, developed in ancient China, were similar to modern fireworks and were primarily used for military purposes. In the 18th century, the Kingdom of Mysore in India famously deployed iron-cased rockets against the British East India Company.
In 1903, Russian high school mathematics teacher Konstantin Tsiolkovsky published The Exploration of Cosmic Space by Means of Reaction Devices, laying the theoretical groundwork for modern rocketry.
The Space Race, which spanned from 1945 to 1969, saw intense competition between the USA and the Soviet Union. This rivalry fueled rapid advancements in space technology, leading to the creation of legendary rockets like the Saturn V (USA) and the N1 (Soviet Union).
In the decades that followed, rockets such as the Ariane family (Europe) and futuristic designs continued to advance space exploration. Today, a significant proportion of global space launches rely on SpaceX’s Falcon 9, a partially reusable and highly versatile rocket.
Looking ahead, major missions like NASA’s Artemis II and III (returning humans to the Moon), China’s lunar exploration efforts, and the development of two new space stations promise to push the boundaries of human spaceflight even further.
Introduction
Rockets come in various sizes, efficiencies, and costs, and they can be developed by both private companies and government agencies. Regardless of their size or purpose, all rockets operate based on Newton’s Third Law of Motion, relying on fuel for propulsion. Most rockets are designed to carry payloads into orbit, and these are known as orbital launch vehicles—the primary focus of this review. Currently, all operational spacecraft rely on conventional chemical propulsion, using either solid-fuel or liquid bipropellant engines for launch. A few have incorporated air-breathing engines in their first stages to improve efficiency.
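Newton’s third law leads directly to the ideal rocket (Tsiolkovsky) equation, Δv = Isp · g0 · ln(m0/mf), the basic sizing relation behind every launch vehicle discussed below. The short Python sketch evaluates it for round, illustrative numbers rather than the specifications of any particular rocket.

```python
# Ideal rocket (Tsiolkovsky) equation: delta_v = Isp * g0 * ln(m0 / mf).
# The masses and specific impulse below are round, illustrative numbers,
# not the specifications of any particular launch vehicle.
import math

G0 = 9.80665  # standard gravity, m/s^2

def delta_v(isp_seconds, wet_mass_kg, dry_mass_kg):
    """Velocity change achievable by a single stage, ignoring gravity and drag losses."""
    return isp_seconds * G0 * math.log(wet_mass_kg / dry_mass_kg)

if __name__ == "__main__":
    # Example: a kerosene/LOX stage with Isp ~ 300 s and a 90% propellant mass fraction.
    dv = delta_v(isp_seconds=300, wet_mass_kg=500_000, dry_mass_kg=50_000)
    print(f"Ideal delta-v: {dv / 1000:.1f} km/s")  # roughly 6.8 km/s before losses
```

With a 90% propellant fraction and an Isp of 300 s, a single ideal stage yields roughly 6.8 km/s, short of the 9 km/s or more typically needed to reach low Earth orbit once gravity and drag losses are included, which is why orbital rockets use multiple stages.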
History
Across Asia and Europe, rockets have been used for centuries for two main purposes:
As military weapons—such as bows with rocket-boosted arrows or missiles.
As fireworks for celebrations and ceremonies.
Some rockets still serve these roles today.
In 1944, the German V-2 rocket became the first man-made object to reach space when it crossed the Kármán line, marking a significant milestone in rocketry.
After World War II, the United States and the Soviet Union (USSR at the time) engaged in a fierce competition for technological supremacy known as the Space Race. The USSR achieved many early milestones, including:
The first animal to orbit Earth (Laika the dog).
The first human in space and in orbit (Yuri Gagarin aboard Vostok 1 on April 12, 1961).
While the US initially lagged behind, it made a historic leap in 1969 when astronauts Neil Armstrong and Edwin “Buzz” Aldrin became the first humans to walk on the Moon during the Apollo 11 mission, effectively winning the Space Race.
Following Apollo, NASA shifted its focus to developing the Space Shuttle, envisioned as a cheaper, reliable, and partly reusable spacecraft. However, costs were much higher than expected, and two catastrophic disasters—Challenger (which exploded during launch) and Columbia (which disintegrated during reentry)—tragically claimed the lives of 14 astronauts. Additionally, the shuttle required extensive refurbishment between missions and could only deliver 24,400 kg to Low Earth Orbit. It was retired in 2011.
After its retirement, the only way for astronauts to reach the International Space Station (ISS) was aboard the Russian Soyuz spacecraft. However, due to growing geopolitical tensions, NASA sought to regain independent launch capability using an American-built rocket.
Present
Currently, SpaceX, Blue Origin, and other private companies are leading the way in rocket launches. Among these, SpaceX stands out with its impressive portfolio:
Falcon 9, the most frequently launched and most reused rocket to date.
Falcon Heavy, the most cost-effective heavy-lift rocket.
Starship, which is poised to be the tallest, most massive, and potentially cheapest super-heavy launch vehicle ever built.
Companies like Rocket Lab specialize in launching small satellites into specific orbits, offering more tailored services.
Many modern rockets today are partly reusable, meaning that key components—such as the first stage—are recovered and reused after each launch. This approach reduces both operational and development costs while maintaining simplicity in rocket design and operations.
Active Launch Vehicles
India – ISRO & Private Sector
1. PSLV (Polar Satellite Launch Vehicle)
Type: Medium-lift, four-stage rocket
Payload Capacity: ~1,750 kg to Sun-synchronous orbit (SSO)
Propulsion: Alternating solid and liquid stages
Use Case: Earth observation, navigation, and science satellites
Status: Highly reliable; experienced a rare failure on its 101st mission in May 2025
2. GSLV Mk II (Geosynchronous Satellite Launch Vehicle)
Type: Three-stage medium-lift rocket
Payload Capacity: ~2,500 kg to Geosynchronous Transfer Orbit (GTO)
Propulsion: Solid, liquid, and cryogenic stages
Use Case: Communication and weather satellites
3. LVM3 (Launch Vehicle Mark-3)
Type: Heavy-lift, three-stage rocket
Payload Capacity: ~10,000 kg to Low Earth Orbit (LEO); ~4,000 kg to GTO
Propulsion: Two solid boosters, liquid core, and cryogenic upper stage
Use Case: Gaganyaan crewed missions, heavy payloads
4. SSLV (Small Satellite Launch Vehicle)
Type: Small-lift, three-stage solid rocket
Payload Capacity: ~500 kg to LEO
Use Case: Rapid deployment of small satellites
United States – NASA, SpaceX, ULA, Blue Origin
1. Falcon 9 (SpaceX)
Type: Partially reusable, two-stage rocket
Payload Capacity: ~22,800 kg to LEO
Propulsion: Merlin engines (kerosene/LOX)
Use Case: Satellite launches, ISS resupply, crewed missions
2. Falcon Heavy (SpaceX)
Type: Heavy-lift, partially reusable rocket
Payload Capacity: ~63,800 kg to LEO
Use Case: Large payloads, interplanetary missions
3. Starship (SpaceX)
Type: Fully reusable, super-heavy-lift rocket
Payload Capacity: ~100,000+ kg to LEO (projected)
Use Case: Mars missions, lunar landings, bulk satellite deployments
4. Atlas V (ULA)
Type: Two-stage rocket with optional solid boosters
Payload Capacity: ~18,850 kg to LEO
Propulsion: RD-180 first stage, Centaur upper stage
Status: Being phased out in favor of Vulcan Centaur; only previously sold missions remain on its manifest
5. Vulcan Centaur (ULA)
Type: Next-generation heavy-lift rocket
Payload Capacity: ~27,200 kg to LEO
Propulsion: BE-4 engines (methane/LOX)
Use Case: National security, commercial launches
6. New Glenn (Blue Origin)
Type: Two-stage, heavy-lift rocket
Payload Capacity: ~45,000 kg to LEO
Propulsion: BE-4 engines
Status: Entered service in January 2025
Japan – JAXA
1. H3
Type: Two-stage, medium-to-heavy-lift rocket
Payload Capacity: ~4,000–6,500 kg to GTO
2. Epsilon
Type: Solid-fuel, small-lift rocket
Launch Site: Uchinoura Space Center, Kagoshima Prefecture
Russia – Roscosmos
1. Soyuz-2
Type: Three-stage, medium-lift rocket
Launch Sites:
Baikonur Cosmodrome, Kazakhstan
Plesetsk Cosmodrome, Russia
Vostochny Cosmodrome, Russia
Guiana Space Centre, French Guiana
2. Angara Family
Angara-1.2: Small-lift, ~3,500 kg to LEO
Angara-A5: Heavy-lift, ~24,500 kg to LEO
Launch Sites:
Plesetsk Cosmodrome, Russia
Vostochny Cosmodrome, Russia
China – CNSA & CALT
1. Long March 5
Type: Heavy-lift, two-stage rocket
Launch Site: Wenchang Space Launch Site, Hainan Province
2. Long March 6
Type: Small-lift, two-stage rocket
Launch Site: Taiyuan Satellite Launch Center, Shanxi Province
3. Long March 7
Type: Medium-lift, two-stage rocket
Launch Site: Wenchang Space Launch Site, Hainan Province
4. Long March 8
Type: Medium-lift, two-stage rocket
Launch Site: Wenchang Space Launch Site, Hainan Province
There are many space launches planned for the future. NASA’s Artemis II and III missions will send astronauts to the Moon. India is preparing for its first manned mission and developing its own space station. China is planning multiple lunar missions. Many countries and private companies are also planning missions to explore different parts of the solar system. In addition, several new rockets are being developed, both by government agencies and private companies, to support these ambitious plans.
Artemis program
Artemis II and Artemis III are NASA’s missions to the Moon that will test the Orion spacecraft and the Human Landing System (HLS). Artemis II will be the first crewed flight of the Orion spacecraft, orbiting the Moon but not landing. Artemis III will be the first crewed lunar landing since Apollo 17 in 1972, aiming to return humans to the lunar surface and establish a sustainable human presence.
Artemis II
The first crewed flight of the Orion spacecraft.
Will take humans beyond the Moon.
Currently targeted for no later than April 2026, after delays caused by issues with the Orion spacecraft’s heat shield.
Artemis III
The first crewed lunar landing since Apollo 17.
Will send the first humans to explore the lunar South Pole.
Was originally planned for late 2024, but was delayed to no earlier than 2029.
Will include a compact seismometer suite to study the Moon’s crust and mantle.
Gaganyaan Mission
The first phase of India’s human spaceflight program focuses on developing and flying the Gaganyaan spacecraft, which weighs 3.7 tons and is designed to carry a three-member crew into low Earth orbit (LEO). This mission will aim to safely return the crew to Earth after a duration of a few orbits to two days. An extended version of the spacecraft will eventually enable missions lasting up to seven days, as well as rendezvous and docking capabilities.
Before the flight of the Gaganyaan module, Group Captain Shubhanshu Shukla is scheduled to fly on the Axiom-4 Mission to the International Space Station (ISS) to gain operational experience.
In the next phase, the program plans to develop a small habitat module to support spaceflight missions of 30–40 days, paving the way for longer stays in space. These experiences and advancements will eventually contribute to the development of an Indian space station.
ISRO is also working on spacecraft docking and berthing technology, with initial funding of ₹10 crore approved in 2017. As part of this effort, the Space Docking Experiment (SPADEX) is being developed, featuring systems like signal analysis equipment, a high-precision videometer for navigation, and a docking mechanism.
China’s Moon Mission
China aims to achieve a manned lunar landing by 2030. By conducting a series of pre-crewed flight tests and subsequent manned lunar missions, China plans to support large-scale space science experiments focusing on three key areas: lunar science, lunar-based science, and resource exploration and utilization. Advanced electronics and real-time decision-making systems for landing operations are being developed in multiple stages to ensure a safe and precise landing on the lunar surface.
The present study is an attempt to establish a fast, highly reproducible transformation protocol with a simplified regeneration system in soybean targeting the apical meristem. The modified half-seed explants from soybean cultivar (cv.) JS335 were subjected to different time intervals of sonication (0, 1, 10, 20, and 30 min) and vacuum infiltration (0, 1, 10, 20, and 30 min) in the presence of Agrobacterium tumefaciens strain EHA105 harbouring pCAMBIA1301. The explants were then co-cultivated and subjected to a modified plant regeneration process that involves only two steps: (1) primary shoot regeneration, and (2) in vitro rooting of the primary shoot. The rooted plantlets were hardened and maintained in the greenhouse until maturity. Sonication treatment of 10 min, followed by plant regeneration using the modified method, recorded the highest transformation efficiency of 26.3% compared to the other durations tested. Furthermore, 10 min of vacuum infiltration alone resulted in an even higher transformation efficiency after regeneration, reaching 28.0%. Interestingly, coupling sonication and vacuum infiltration for 10 min each produced the highest transformation efficiency after regeneration of 38.0%. The putative transformants showed gus expression in mature leaves, trifoliate leaves, flowers, and pods. The presence of hpt II was also confirmed in putative transformants, with an amplicon size of 500 bp. Quantitative real-time PCR confirmed the existence of hpt II as one to two copies in the soybean genome of T0 plants. Furthermore, the segregation pattern was observed in the T1 generation soybean plants, which were confirmed using PCR for hpt II. The optimized protocol, when tested with other Indian soybean cultivars, showed enhanced transformation efficiencies ranging from 19.3% (cv. MAUS47) to 36.5% (cv. CO1). This optimized protocol could provide a reliable platform to overcome the challenges associated with the genetic engineering of soybean.
Introduction
Soybean (Glycine max (L.) Merrill), an economically valuable crop, is largely used for consumption and industrial applications (Widholm et al. 2010). Global population growth and the consistent demand for soy products are leading to a continuous increase in the production of and demand for soybeans. Consequently, significant efforts have been dedicated to improving the regeneration system and the effectiveness of soybean transformation. These efforts show great potential for developing superior soybean varieties with desired characteristics. To date, soybean regeneration has been achieved through somatic embryogenesis, direct organogenesis, and indirect organogenesis. However, poor regeneration has been a major obstacle in the indirect organogenesis method for soybean, and most research has therefore focused on somatic embryogenesis or direct organogenesis. Regarding soybean transformation, various intrinsic factors such as Agrobacterium strains, types of explants, composition of culture media, duration of co-cultivation, and plant selection markers have been extensively investigated to enhance the efficiency of the transformation process. Moreover, extrinsic factors like physical wounding of explants, sonication, and vacuum infiltration have been optimized to achieve higher transformation efficiency in soybean. Despite various studies aimed at improving soybean transformation efficiency using Agrobacterium infection, the success rate has been very low due to genotype dependency and low regeneration of transformants (Kumari et al. 2016; Liu et al. 2004). Moreover, poor shoot elongation and long regeneration duration are other important factors limiting the effective regeneration of transformants (Ma and Wu 2008). Thus, there is an urgent need to look for alternative ways to develop transformed soybean to meet the global demand. In this regard, the current study aims to develop a fast, reliable, and efficient soybean transformation system incorporating sonication and vacuum infiltration, thereby targeting the apical meristem of modified half-seed explants. Moreover, the highlight of the present study is the hassle-free and fast regeneration of transformed plants from infected half-seed explants using a simplified regeneration method that involves just two steps: (1) primary shoot regeneration, and (2) in vitro rooting of the primary shoot. The optimized protocol has also been tested with 10 cultivars to check its efficiency.
Materials and methods
Indian soybean cultivars (cv.) JS335, PUSA 9712, CO1, TAMS-38, JS71-05, JS93-05, NRC7, MAUS47, PK416, and Punjab 1 were procured from ICAR-Indian Institute of Soybean Research, Indore, Madhya Pradesh, India, and the cultivars were grown and maintained at the research garden, Department of Biotechnology, Bharathiar University, Coimbatore, Tamil Nadu, India. The optimization was carried out using the soybean cultivar (cv.) JS335 (Fig. 1a). To begin the experiment, seeds of soybean cv. JS335 were surface sterilized and imbibed in sterile water for 24 h (Fig. 1b). Following imbibition, the seed coat was removed and the cotyledons were separated. Only the cotyledon containing the embryonic axis was used for the study. Additionally, the radicle of the embryo, which was attached to the cotyledon, was carefully dissected to obtain the modified half-seed explant (Fig. 1c). For primary shoot regeneration, explants were inoculated on MS medium supplemented with different concentrations of 6-benzylaminopurine (BAP) (0–8.8 μM) and cultured for 30 days. The explants were sub-cultured onto fresh medium with the respective hormone concentrations at 15-day intervals. For rooting of primary shoots, MS medium supplemented with different concentrations of indole-3-butyric acid (IBA) (0–9.8 μM) was used and the cultures were incubated for 30 days. In order to select the primary shoots after transformation, the minimum inhibitory concentration (MIC) was determined in modified half-seed explants by inoculating them on regeneration medium (MS + 2.2 μM BAP; pH 5.7) with different concentrations of hygromycin B (0–5 mg l−1) and incubating for 30 days; the explants were sub-cultured at 15-day intervals. Subsequently, the established primary shoots were transferred to rooting medium (MS + 4.9 μM IBA; pH 5.7) with different concentrations of hygromycin B (0–3 mg l−1) and incubated for 30 days for selection at the rooting stage. All cultures were maintained at 25 ± 2 °C under a 16/8-h photoperiod.
Agrobacterium tumefaciens strain EHA105 harbouring pCAMBIA1301 was used for transformation (Fig. 2). The T-DNA region of the binary vector contains hygromycin phosphotransferase II (hpt II) as the plant selection marker and gus as a reporter gene. The vector backbone carries neomycin phosphotransferase II (npt II) for bacterial selection. The Agrobacterium culture was prepared by inoculating a single colony into 30 ml LB broth containing the antibiotics kanamycin (50 mg l−1) and rifampicin (25 mg l−1). The culture was incubated at 28 °C for 16 h at 180 rpm. The bacterial culture was centrifuged at 6000 rpm for 15 min, and the pellet was suspended in liquid MS medium. Additionally, a filter-sterilized solution of 200 μM acetosyringone was added to the bacterial suspension, which was then incubated for 1 h at 28 °C at 180 rpm. The absorbance of the bacterial suspension was adjusted to 1.0 at OD600 prior to infection. For genetic transformation, the modified half-seed explants were inoculated into 30 ml Agrobacterium suspension and sonicated for different durations (0, 1, 10, 20, and 30 min). Similarly, the explants were subjected to vacuum infiltration for different time intervals (0, 1, 10, 20, and 30 min) in 30 ml Agrobacterium suspension. Finally, the explants were subjected to combined treatments of sonication (10 min) and vacuum infiltration (10 min) in the presence of Agrobacterium. After the different treatments, the explants were incubated in fresh Agrobacterium suspension (30 ml) at 28 °C for 30 min. After 30 min, the explants were blot-dried, placed on co-cultivation medium (MS + 200 μM acetosyringone; pH 5.7), and cultured in complete darkness at 25 ± 2 °C for 3 days. Subsequently, the explants were thoroughly washed with sterile distilled water containing 350 mg l−1 cefotaxime and cultured on regeneration medium (MS + 2.2 μM BAP + 3 mg l−1 hygromycin B; pH 5.7) for 30 days for selection of primary shoots. The excised primary shoots were then cultured on rooting medium (MS + 4.9 μM IBA + 2 mg l−1 hygromycin B; pH 5.7) for 30 days. The rooted plantlets were carefully removed from the medium, washed with sterile distilled water, hardened for 2 weeks in paper cups, and maintained in the greenhouse until maturity.
Table 1 footnotes: Mean values (± standard error) of three independent experiments (n = 100 × 3). Values with different letters within columns are significantly different according to Duncan's multiple range test (DMRT) at the 5% level.
a Total number of primary shoots that survived on regeneration medium (MS + 2.2 μM BAP + 3 mg l−1 hygromycin B) after 30 days of culture
b Total number of primary shoots that responded with root development after 30 days of culture on rooting medium (MS + 4.9 μM IBA + 2 mg l−1 hygromycin B)
c Total number of putatively transformed plants that survived in the greenhouse after hardening
d Total number of putatively transformed plants showing the presence of hpt II
e Transformation efficiency = (number of hpt II PCR-positive plants / total number of infected explants) × 100
Superscript letters f–j indicate values that are significantly different according to DMRT
For the regeneration experiments, each treatment was repeated twice to ensure accuracy and reliability. For the transformation experiments, 100 explants were employed for each treatment, and the experiments were repeated three times. The resulting data are presented as mean values with the standard error (SE). Statistical analysis was performed using SPSS software version 20, employing Duncan's multiple range test (DMRT) to determine significant differences at a significance level of P < 0.05. For segregation ratio analysis, the SE and chi-square analysis were used (Gomez and Gomez 1984; Hada et al. 2018). Significance was determined for values with P < 0.05.
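For readers who want to reproduce the two simple calculations referenced here, the following is a minimal Python sketch of the transformation-efficiency formula (table footnote e) and a chi-square goodness-of-fit test against a 3:1 segregation ratio. It is illustrative only: the authors used SPSS for their analysis, and the T1 counts in the example are hypothetical, not values from the paper.

```python
# Minimal sketch (not the authors' SPSS workflow) of two calculations used in
# this study: transformation efficiency (table footnote e) and a chi-square
# test of a 3:1 Mendelian segregation ratio (1 degree of freedom).
import math

def transformation_efficiency(pcr_positive_plants, infected_explants):
    # footnote e: hpt II PCR-positive plants / total infected explants x 100
    return pcr_positive_plants / infected_explants * 100

def chi_square_3_to_1(resistant, susceptible):
    observed = [resistant, susceptible]
    total = resistant + susceptible
    expected = [0.75 * total, 0.25 * total]       # expected counts under a 3:1 ratio
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    p_value = math.erfc(math.sqrt(stat / 2))      # chi-square survival function, df = 1
    return stat, p_value

print(transformation_efficiency(38, 100))          # 38.0 (% efficiency of the best treatment)
print(chi_square_3_to_1(28, 12))                   # hypothetical T1 counts; p > 0.05 fits 3:1
```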
Results and discussion
We have conducted studies on various parameters to improve transformation efficiency after regeneration in Indian soybean cultivars, addressing the challenges associated with soybean transformation, including low regeneration rates and the absence of cultivar-independent protocols. In the soybean direct organogenesis system, explants are first subjected to multiple shoot induction; attempts are then made to elongate the shoots, and after elongation the shoots are cultured for in vitro rooting. This process takes approximately three or more months to obtain an in vitro rooted plantlet ready for hardening. To achieve regeneration using a direct organogenesis system, the radicle and plumule of the half-seed explants have to be excised and the explants placed on cytokinin-containing medium to trigger the meristematic cells to produce multiple shoots. However, in this method, most of the shoots fail to elongate, affecting the regeneration response (Ether et al. 2013). To overcome this limitation with half-seed explants, we modified the explant preparation by removing only the radicle and leaving the plumule intact to obtain the modified half-seed explants. The presence of the plumule in the modified half-seed explants directed the regeneration system towards primary shoot development followed by in vitro rooting, diverting it from the conventional direct organogenesis process of multiple shoot induction, shoot elongation, and rooting. This diversion also bypassed the shoot elongation step, which is critical in affecting regeneration efficiency. In the present investigation, the primary shoots developed and elongated in the same BAP medium, avoiding the need for a separate shoot elongation step. Using this method, we were able to produce rooted plantlets ready for hardening within 60 days (30 days for primary shoot regeneration and 30 days for rooting), which is considerably shorter than in similar reports on soybean. The aforesaid advantages of using
this modified half-seed method also highly favoured efficient regeneration in the transformation experiments.
Fig. 4 GUS analysis and molecular confirmation of putative transformants regenerated from modified half-seed explants infected with Agrobacterium tumefaciens strain EHA105 harbouring pCAMBIA1301. a Expression of gus in a mature leaf from putatively transformed plants; b mature leaf from a non-transformed plant; c expression of gus in trifoliate leaves from putatively transformed plants; d trifoliate leaves from a non-transformed plant; e expression of gus in a flower from putatively transformed plants; f flower from a non-transformed plant; g expression of gus in a pod from putatively transformed plants; h pod from a non-transformed plant; i molecular confirmation of the presence of hpt II in putatively transformed soybean plants. Lane 1: DNA ladder (1 kb); lane 2: pCAMBIA1301 plasmid (positive control); lane 3: soybean genomic DNA from non-transformed plants (negative control); lanes 4–8: genomic DNA from putatively transformed soybean plants with the expected amplicon (500 bp) of hpt II
In the present study, modified half-seed explants produced the highest response in inducing primary shoot regeneration (93.0%) on MS medium supplemented with 2.2 µM BAP (Supplementary Table 1). In addition, the maximum in vitro rooting of primary shoots (75.5%) was observed on MS medium supplemented with 4.9 µM IBA (Supplementary Table 2). Our findings were similar to those of Arun et al. (2015) and Chen et al. (2018), where the aforementioned concentrations of BAP and IBA showed the best response in inducing shoots and roots in soybean. The MIC of hygromycin B for primary shoot regeneration was found to be 3 mg l−1 and the MIC of hygromycin B for in vitro rooting of primary shoots was 2 mg l−1. The use of hygromycin B as a potent plant selection marker in soybean was established by Olhoft et al. (2006).
In this study, the transformed modified half-seed explants were successfully regenerated using the optimized regeneration method. Among the different treatments applied, the modified half-seed explants that underwent a 10-min sonication treatment exhibited the highest number of surviving primary shoots (54.6), along with a substantial number of rooted shoots (44.3) and plants that survived after a 2-week hardening period (26.3). The transformation efficiency for this treatment was calculated to be 26.3% (Table 1). However, the transformation efficiency decreased when the sonication time was reduced to 1 min or increased to 20 or 30 min (Table 1). The highly active and rapidly dividing meristematic cells used for genetic transformation are present in the primary shoot. Sonication creates microwounds through which Agrobacterium can easily reach the meristematic cells, enhancing transformation efficiency (Trick and Finer 1997). Our study is consistent with the findings of Hada et al. (2018), where the optimum sonication time was found to be 10 min, and increasing the sonication time decreased the transformation efficiency from 36.2% to 12.1%. Guo et al. (2015) also reported that sonication for 2 s improved transformation efficiency in soybean from 2.5% to 5.7%. Vacuum infiltration has been well validated as an efficient method to improve the rate of transformation by creating negative atmospheric pressure, enabling easy passage for Agrobacterium to the target meristematic cells (Subramanyam et al. 2013). Among the different vacuum infiltration durations tested (0, 1, 10, 20, and 30 min), a treatment of 10 min showed the maximum number of surviving primary shoots (51.6), rooted shoots (40.6), and plants that survived after 2 weeks of hardening (28.0), with a transformation efficiency of 28.0% (Table 1). Furthermore, extending the vacuum infiltration time beyond 10 min had a negative impact on the transformation efficiency; this decrease can be attributed to injury caused to the explants by excessive vacuum. Similarly, reducing the vacuum infiltration time to 1 min resulted in a decreased transformation efficiency of 11.6%. In the treatment combining sonication for 10 min with vacuum infiltration for 10 min, the number of primary shoots that survived on the selection medium was 54.6 (Fig. 1d and e), the number of rooted shoots was 46.6 (Fig. 1f and g), and, following hardening, an average of 38.0 plants survived and acclimatized (Fig. 1h and i). Importantly, the transformation efficiency improved significantly to 38.0%, considerably higher than that observed for explants treated with sonication or vacuum infiltration alone (Table 1). These findings align with previous soybean studies by Arun et al. (2015) and Hada et al. (2018), which also suggested that combining sonication and vacuum infiltration can enhance transformation efficiency, as demonstrated in the present study.
From the histochemical GUS assay, it was observed that the mature leaf (Fig. 4a), trifoliate leaves (Fig. 4c), flower (Fig. 4e), and pod (Fig. 4g) from putative transformants developed an intense blue colour and tested positive for gus expression. The mature leaf (Fig. 4b), trifoliate leaves (Fig. 4d), flower (Fig. 4f), and pod (Fig. 4h) from non-transformed plants did not show gus expression. In the present study, the transformation efficiency was calculated based on the presence of hpt II in transformed plants. The T0 plants that survived after hardening were subjected to this analysis. The amplicon size of 500 bp (Fig. 4i, lanes 4–8) indicated the presence of hpt II in transformed plants. The pCAMBIA1301 plasmid served as the positive control (Fig. 4i, lane 2), whereas non-transformed plants did not show any amplification of hpt II (Fig. 4i, lane 3). Overall, the maximum transformation efficiency of 38.0% was achieved when modified half-seed explants were subjected to sonication (10 min) and vacuum infiltration (10 min). In the present study, the copy number of hpt II in T0 plants was determined by quantitative real-time PCR using
Table 2 footnotes: Mean values (± standard error) of three independent experiments (n = 100 × 3). Values with different letters within columns are significantly different according to Duncan's multiple range test (DMRT) at the 5% level.
a Total number of primary shoots that survived on regeneration medium (MS + 2.2 μM BAP + 3 mg l−1 hygromycin B) after 30 days of culture
b Total number of primary shoots that responded with root development after 30 days of culture on rooting medium (MS + 4.9 μM IBA + 2 mg l−1 hygromycin B)
c Total number of putatively transformed plants that survived in the greenhouse after hardening
d Total number of putatively transformed plants showing the presence of hpt II
e Transformation efficiency = (number of hpt II PCR-positive plants / total number of infected explants) × 100
Actin as the reference gene. The results revealed that the copy number of hpt II ranged between one and two. The T0 transgenic soybean lines GmJS335-2 and GmJS335-3 had two copies, while lines GmJS335-1, GmJS335-4, GmJS335-5, GmJS335-6, GmJS335-7, GmJS335-8, and GmJS335-9 had one copy of hpt II (Supplementary Table 3). Quantitative real-time PCR is replacing the traditional method of detecting the copy number of a foreign gene via the Southern blot technique owing to advantages such as accuracy, lower cost, higher stability, and ease of operation. This technique has been successfully employed in several crops, including cotton (Yang 2012), wheat (Gadaleta et al. 2011), rice (Wei et al. 2011), maize (Yuan et al. 2010), tomato (Wang et al. 2011), and soybean (You-wen et al. 2012). Additionally, the segregation pattern observed in the T1 generation demonstrated Mendelian inheritance with a ratio of 3:1 in one plant (GmJS335-8) at a significance level of 0.05 (Supplementary Table 4). The optimized protocol was further applied to evaluate the transformation efficiency in other soybean cultivars. In the present investigation, cv. JS335 was found to have the highest transformation efficiency of 38.0%, followed by cv. CO1 (36.5%) and cv. TAMS-38 (33.6%). Among the different cultivars examined, cv. MAUS47 displayed the lowest transformation frequency, recorded at 19.3% (Table 2). Overall, the method developed in this study proved to be fast and highly efficient, yielding rooted plantlets of transgenic lines within a relatively short duration of 60 days. In comparison, previous studies by Arun et al. (2015), Hada et al. (2018), and Wang et al. (2022) demonstrated soybean transformation systems utilizing conventional direct organogenesis, which required longer regeneration times of 123 days, 104.5 days, and 97 days, respectively. Therefore, this simplified transformation and modified regeneration protocol can be utilized effectively for developing transgenic soybean varieties with desired traits.
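The study states only that hpt II copy number was estimated by quantitative real-time PCR with Actin as the reference gene; the exact calculation is not described. The sketch below assumes a 2^-ΔΔCt-style relative quantification against a calibrator line known to carry a single copy of the transgene. Both the workflow and the Ct values are assumptions for illustration, not the authors' protocol or data.

```python
# Illustrative transgene copy-number estimate from qPCR Ct values, assuming a
# 2^-ddCt-style comparison with a single-copy calibrator line and Actin as the
# reference gene. The workflow and all Ct values are assumptions, not taken
# from the paper.

def copy_number(ct_hpt, ct_actin, calib_ct_hpt, calib_ct_actin, calib_copies=1):
    delta_sample = ct_hpt - ct_actin              # dCt of the test plant
    delta_calib = calib_ct_hpt - calib_ct_actin   # dCt of the single-copy calibrator
    ddct = delta_sample - delta_calib
    return calib_copies * 2 ** (-ddct)            # relative copy number

# Hypothetical Ct values purely for illustration:
print(round(copy_number(24.1, 20.0, 25.0, 20.0), 2))   # ~1.87, i.e. about two copies
```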
Conclusion
In this study, we have successfully developed a simple regeneration system from modified half-seed explants, consisting of two steps: primary shoot regeneration and rooting. This system has been effectively adapted to regenerate transgenic plants from modified half-seed explants infected with Agrobacterium, and it offers the advantage of a shorter regeneration period. Additionally, the incorporation of sonication and vacuum infiltration techniques has significantly enhanced the transformation efficiency in our study. Moreover, this transformation and regeneration system has demonstrated its efficacy across various soybean cultivars, indicating its wide applicability. We believe that this simple protocol holds great potential for commercial trait improvement in diverse soybean varieties.
Supplementary Information The online version contains supplementary material available at https://doi.org/10.1007/s13205-023-03715-8
Acknowledgements The authors are thankful to the University Grants Commission: Basic Startup Research Grant (No. F.30-410/2018 (BSR); dt.: 29.06.2018), New Delhi, Government of India, for the financial assistance provided to Muthukrishnan Arun, and also thank the ICAR-Indian Institute of Soybean Research, Indore, Madhya Pradesh, India, for providing soybean seeds to carry out the research work.
Author contributions MA and KS: conception and idea. KS performed the experimental work with the assistance of MA. NV and KS wrote the manuscript with the assistance of MA. KS and NV: preparation of tables and figures. CA helped in performing the experimental analysis. PG and CA helped in the critical review of the manuscript. Finally, all the authors approved the manuscript.
Data availability Data generated in this work is included in the manuscript and in the supplementary material. This will be made available on request.
Declaration
Conflict of interest There are no conflicts of interest to declare.
Ethical approval for involving human participants and/or animals Not applicable, since this article does not contain any studies with human participants or animals performed by any of the authors.
Clinical diagnostics as well as environmental and food safety monitoring rely on the accurate identification of microorganisms. The ability to isolate and identify viable organisms through culture-based techniques has long been, and remains, the gold standard. However, these traditional techniques are often time-consuming, have low sensitivity, and are unable to detect non-culturable or fastidious organisms. To overcome these limitations, molecular techniques such as the Polymerase Chain Reaction (PCR) and Next Generation Sequencing (NGS) have been developed and widely adopted. Microbial DNA can be detected with extreme sensitivity by PCR, even for organisms that are impossible or very difficult to culture. NGS further enables comprehensive genomic and metagenomic analyses, which help identify emerging pathogens and antimicrobial resistance determinants. This review aims to evaluate the three methods of detection: culture-based methods, PCR, and NGS, and to analyse their working principles, strengths, weaknesses, and the cases in which each is most appropriate. There is no question that culture is important for demonstrating the presence of live pathogens and for performing subsequent phenotypic tests; however, molecular techniques provide faster results with greater detail. In conclusion, traditional methods coupled with molecular approaches can improve the accuracy, speed, and breadth of detection of microorganisms, which can be beneficial for medical, industrial, or environmental purposes.
1. Introduction
Every sector, ranging from healthcare to food safety, requires the precise detection of microorganisms. Culture-based methods have proven their worth in pathogen detection, but their limited speed and sensitivity often compromise the end result, and in many applications time is of the essence. PCR and NGS are techniques designed to fulfil this need. The purpose of this paper is to analyse these methodologies so that suitable decisions can be made for optimal microbial detection.
2. Culture-Based Methods
2.1 Principles and Applications
Culture-based methods involve growing microorganisms on selective media under controlled environmental conditions. They enable phenotypic characterization and determination of antibiotic susceptibility. Clinical diagnostics as well as food safety evaluation depend heavily on this approach.
2.2 Advantages
It is cost-effective and uncomplicated.
Isolation of live organisms can be performed for further analysis.
Well-established standardized protocols exist for the procedures.
2.3 Limitations
Results typically take 24–72 hours.
Some microorganisms cannot be cultured in the laboratory.
Sensitivity is lower, especially in samples with a low microbial load or for fastidious organisms.
3. Polymerase Chain Reaction (PCR)
3.1 Principles and Applications
PCR amplifies specific DNA sequences, allowing microorganisms to be detected with heightened sensitivity and specificity. Variants such as quantitative PCR (qPCR) additionally allow the microbial load to be measured. PCR is widely used in clinical diagnostics, environmental monitoring, and food safety.
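To make qPCR-based quantification of microbial load concrete, here is a minimal sketch assuming a standard-curve approach: Ct values from serial dilutions of a known standard are fitted to a line, and an unknown sample's Ct is read back off that line. All names and numbers are illustrative and not taken from this review.

```python
# Minimal sketch of standard-curve qPCR quantification (illustrative values only).
import numpy as np

log10_copies = np.array([6, 5, 4, 3, 2])             # log10 copies/reaction in the standards
ct = np.array([15.1, 18.4, 21.8, 25.2, 28.6])        # hypothetical measured Ct values

slope, intercept = np.polyfit(log10_copies, ct, 1)   # Ct = slope * log10(copies) + intercept
efficiency = 10 ** (-1 / slope) - 1                  # ~0.98, i.e. ~98% amplification efficiency

def copies_from_ct(ct_unknown):
    # Invert the standard curve to estimate the load of an unknown sample
    return 10 ** ((ct_unknown - intercept) / slope)

print(f"efficiency ~ {efficiency:.0%}; unknown sample ~ {copies_from_ct(23.0):.0f} copies/reaction")
```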
3.2 Advantages
Results are obtained rapidly, often within hours.
High sensitivity and specificity give accurate results.
Organisms that cannot be cultivated may still be detected.
3.3 Limitations
Personnel with proper training and specialized equipment are a necessity.
There is a chance of contamination that can lead to inaccurate positive results.
Without specific methods, it cannot distinguish live from dead organisms.
4. Next Generation Sequencing (NGS)
4.1 Principles and Applications
By sequencing whole genomes or marker regions such as the 16S rRNA gene, NGS allows genomes and microbial communities to be analysed in detail. This is useful in metagenomics, pathogen discovery, and studies of microbial diversity.
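As a toy illustration of the community-level summaries NGS enables, the sketch below computes relative abundances and a Shannon diversity index from a hypothetical table of 16S read counts. Real pipelines (e.g., QIIME 2 or mothur) add quality filtering, denoising, and taxonomic classification steps that are omitted here.

```python
# Toy 16S community summary: relative abundance and Shannon diversity.
# The taxa and read counts are hypothetical, for illustration only.
import math

read_counts = {"Escherichia": 5200, "Lactobacillus": 3100,
               "Pseudomonas": 1550, "Listeria": 150}
total = sum(read_counts.values())

relative_abundance = {taxon: n / total for taxon, n in read_counts.items()}
shannon = -sum(p * math.log(p) for p in relative_abundance.values() if p > 0)

print(relative_abundance)                      # proportion of reads per genus
print(f"Shannon diversity H' = {shannon:.2f}")
```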
4.2 Advantages
Provides detailed insights into both culturable and unculturable organisms that make up microbial communities.
Ability to identify novel resistance genes and pathogens.
High-throughput and scalable.
4.3 Limitations
High cost and complexity.
Requires advanced bioinformatics tools and expertise.
Longer turnaround time compared to PCR.
5. Comparative Analysis
Studies have shown that PCR and NGS techniques tend to be more effective at identifying pathogens undetected by culture-based techniques, particularly in specimens with low microbial counts or in more challenging microbes. For example, real-time PCR proved to be more sensitive than culture methods for detecting Listeria monocytogenes in food samples. Likewise, NGS has been shown to successfully identify pathogens from clinically culture-negative specimens.
Conclusion
Culture techniques, while valuable, especially when coupled with antibiotic susceptibility testing, fall short in speed and sensitivity compared to molecular techniques such as PCR and NGS. As discussed in this paper, the methods used should be dictated by requirements such as turnaround time, cost, and the extent of microbial profiling needed. A combination of these methods provides a better solution for detecting microorganisms.
In criminal investigations, it is often necessary to establish whether someone was present at the crime scene. Forensic science has a vital role in identifying the offender, and footprint evidence is one of the most important tools in such identification. Just like fingerprints, each person's footprints are unique and can be very important in connecting a suspect to either the crime scene or the victim. Footprints carry friction ridge patterns unique to an individual, and these patterns can distinguish even identical twins. Since the ridge patterns do not change over a person's lifetime, footprint analysis is an efficient means of personal identification. For these reasons, footprint evidence collection, preservation, and analysis have been of growing interest to law enforcement. This review focuses on the scientific significance of footprints in criminal investigations and their role in legal proceedings.
Introduction
Because forensic science is grounded in physical evidence, forensic investigators analyse crime scenes for physical evidence such as fingerprints, blood, lip prints, footprints, etc., to identify the perpetrator and solve crimes. The fingerprint is a very important piece of evidence, and so is the footprint.
The footprint is an important piece of physical evidence found at numerous crime scenes, including those of homicide, burglary, and sexual assault. Yet it is frequently overlooked in the early stages of an investigation. The most crucial aspect is the examination and comparison of footprint impressions, which are subjected to a thorough forensic and scientific assessment. Footprints may reveal information that can aid in the identification of a suspect and the reconstruction of the crime scene.
It is the characteristics that are unique in shape and detail that must be looked for and studied. Bare footprints and shoeprints (or their impressions) are collectively known as footprints.
LITERATURE REVIEW
The stride dimensions, the position of each footprint, its shape, size, angulation and depth, the interspaces and outer margins, heel impressions, and injuries or accidental damage provide information about the gait pattern, the person's height, leg length, approximate body weight, and the interrelated movement of the foot, ankle, leg, and body that are unique to that person.
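As a rough, illustrative example of how a footprint measurement can feed a stature estimate, the sketch below uses the commonly cited approximation that foot length is about 15% of stature. That ratio is an assumption for illustration only; actual casework relies on population-specific regression equations rather than a single fixed ratio.

```python
# Rough stature estimate from foot length (illustrative only).
# The 0.15 foot-length-to-stature ratio is an assumed textbook approximation,
# not a value taken from this review; real casework uses population-specific
# regression equations.

def estimate_stature_cm(foot_length_cm, foot_to_stature_ratio=0.15):
    return foot_length_cm / foot_to_stature_ratio

print(round(estimate_stature_cm(26.0), 1))   # ~173.3 cm for a 26 cm footprint
```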
In one Florida case, for example, a bloody shoe print was discovered on the carpet in the house of a murder victim. The print suggested that it had been made by a shoe with a hole in its sole. Investigators gathered and tested shoe prints from people known to have been in the area near the time of the murder. By superimposing the bloody shoeprint from the crime scene onto the test print made from the suspect's shoe, footwear examiners were able to identify the offender.
"Wherever he steps, whatever he touches, whatever he leaves, even unconsciously, will serve as silent witness against him. Not only his fingerprints or his footprints, but his hair, the fibers from his clothes, the glass he breaks, the tool mark he leaves, the paint he scratches, the blood or semen he deposits or collects. All of these and more bear mute witness against him. This is evidence that does not forget. It is not confused by the excitement of the moment. It is not absent because human witnesses are; it is factual evidence. Physical evidence cannot be wrong, it cannot perjure itself; it cannot be wholly absent, only its interpretation can err. Only human failure to find it, study and understand it, can diminish its value." (Paul L. Kirk, 1974).
The print of a footprint can be divided into two categories:
2-D (Two-Dimensional)
3-D (Three-Dimensional)
Two-Dimensional Footprints – This type of print is created when the underside of a shoe meets a hard, flat surface, such as a tile floor or concrete. Material is frequently transferred from the sole of the shoe to the ground; prints formed with wet dirt or blood are known as positive prints, and a positive print is generally obvious. Negative prints are formed in dust or on a lightly waxed surface.
Three-Dimensional Footprints – These footprint impressions occur when a shoe is pressed into a soft material such as mud, sand, or snow.
The value of such evidence will, however, be commensurate with the points of identification that can be demonstrated.
The Relevance of Footprint in Criminal investigations
Footprints are one of the most crucial pieces of evidence in criminal investigations because of their uniqueness. Footprints are unique to individuals in terms of size, shape, arch type, and the patterns of ridges and creases on the sole of the foot. Even monozygotic twins, with almost identical genotypes, have their own distinct footprints. The friction ridges and patterns on the soles of the feet persist throughout a person's life, making footprints enduring identifiers. Footprint evidence is particularly useful when other means of identification, such as DNA or fingerprints, are unavailable. For example, footprints are commonly found at crime scenes, especially outdoors in damp, sandy, or snowy environments. When footprints are collected by investigators, it is possible to make an unambiguous connection between a suspect and a crime scene, which can confirm both presence and participation. In forensic science, physical evidence plays an important role in the investigation and prosecution of criminal offences. Footprints represent a type of trace evidence, i.e., small but informative pieces of physical evidence that are linked to a person or an object. Just as fingerprints can link a perpetrator to a particular spot, footprints can shed light on a perpetrator's location and behaviour; for example, footprints at a crime scene can indicate the suspect's movements, whether on foot or on hands and knees, and can suggest attempts to evade detection or to enter a specific area. The physical nature of footprint evidence permits different analyses: professionals can evaluate the depth, size, and wear patterns to discriminate features. In serious crimes, such as burglary, assault, or murder, footprint evidence can help corroborate witness testimony, surveillance footage, or other evidence. Additionally, footprints may connect a suspect to a specific kind of footwear, such as shoes with distinctive tread patterns.
LAW FRAMEWORK OF INDIA RELATED TO FOOTPRINT EVIDENCE
Footprint evidence plays a decisive role in criminal investigations in India, where it is regulated by statute and judicial decisions. The Indian Evidence Act, the CrPC, and the Indian Penal Code (IPC) are all significant in the management of physical evidence such as footprints. With the development of forensic analysis, footprint evidence has become an important means of connecting criminals to crime scenes. Evidence collection, storage, and its treatment at trial, however, remain problematic. As forensic science continues to develop, the legal treatment of footprint evidence may change along with it, which will enhance its use in criminal investigation and in the courtroom. The law on footprint evidence comprises a set of statutes, rules, and Indian judicial precedents on the collection, preservation, and use of physical evidence in criminal investigations. Footprints, as physical evidence, are of great importance in linking perpetrators to crime scenes and attributing the offence to the accused.
Following is a summary of the key statutes and legal principles relating to footprint evidence in India.
1. Indian Evidence Act, 1872
The Indian Evidence Act, 1872, is the primary enactment, rule by rule, governing the evidence in Indian courts. It is the legal instrument providing a framework for the gathering and examination of all evidence, including testimonial physical evidence (e.g., footprints). Key sections under this act include:
Section 3: Defines "evidence" as material that may be placed before the court to prove or disprove a fact in issue, including physical material such as fingerprints.
Section 45: Permits expert testimony to be placed before the court. Forensic science professionals can testify to the uniqueness of footprints, their evidentiary value, and the methodologies used to analyse them, allowing footprint evidence to be properly evaluated in criminal prosecutions.
Section 27: Allows the admission of information disclosed by an accused in custody, for example when the accused leads the police to a location where footprints relevant to the crime are likely to be found. If a defendant voluntarily leads police to the scene of the crime, the resulting discoveries, including footprint evidence, are admissible.
2. Criminal Procedure Code, 1973 (CrPC)
The Criminal Procedure Code (CrPC) is the body of procedural rules for criminal law in India. It describes the procedures for the acquisition and handling of physical evidence, such as footprints. Relevant sections include:
Section 154: Governs the registration of a First Information Report (FIR). Once an FIR is registered in a criminal case, physical evidence from the crime scene, including footprints, has to be secured by the police.
Section 165: Empowers police officers to search for and collect evidence from crime scenes, including prints, and requires that footprints, where relevant to the case, be properly gathered and preserved in good condition for analysis.
Section 160: Empowers the police to summon any person for questioning as part of a criminal investigation, which may be important when a suspect has been identified through footprint evidence at the crime scene.
3. The Indian Penal Code, 1860 (IPC)
The Indian Penal Code (IPC) defines offences and their penalties, and footprint evidence can be used to establish an accused person's involvement in such offences. While the IPC does not explicitly refer to footprints, this evidence can be used to attribute responsibility for an offence under relevant sections, for example:
Section 302 (Murder): If footprints at a homicide scene are linked to a suspect, they can be used as circumstantial evidence to corroborate the allegation.
Burglary or theft: Footprints at the scene can be used to establish the presence of the alleged culprit at the crime scene.
4. Forensic Science (Application in Law)
The use of forensic evidence such as footprint analysis has become significant in the Indian criminal justice system. Because trace evidence is considered unique and can therefore be attributed to an individual, forensic professionals are commonly required to testify about it before a court of law. Forensic science laboratories in India are often called upon to examine footprint evidence, applying a variety of techniques to compare, match, and correlate crime scene samples with a suspect's prints.
Forensic Science Laboratories: Laboratory findings, including footprint evidence, are compared with existing databases or with the prints of suspects. Forensic analysis usually includes measurement of the morphology of the footprint (ridges, patterns, depth, and size) as well as the tread pattern of footwear, where footwear is involved.
Expert Testimony: Forensic experts play an essential role in presenting footprint evidence in the courtroom. They describe how the prints were recovered, preserved, and characterized, and their expertise is particularly relevant in explaining how footprint evidence contributes to identifying the suspect or linking the suspect or the victim to the scene.
5. Judicial Precedents
Footprint evidence has been admitted in certain criminal cases in Indian courts. The courts have held that footprint evidence, when carefully collected and preserved, can be sufficiently strong to establish the identity of the person involved and to place that person at a crime scene. For instance:
In one case, the Supreme Court upheld the admissibility of footprint evidence to establish a defendant's presence in the area of a crime, highlighting the importance of proper collection and expert testimony in forensic investigations.
Ramchandra v. State of Maharashtra (2005): The Bombay High Court held that footprint evidence found at a crime scene could be treated as circumstantial evidence only if a reasoned inference associating a particular individual with the scene could be drawn from it.
Conclusion
Footprints provide physical evidence that is commonly used in the investigation of criminal incidents. They are an effective source of information and often lead to the identification of the offender in court. Footprints have been called the quietest witnesses of a crime: such evidence serves as a distinct conduit that links the accused with the crime scene. Even with the use of gloves, masks, and other means of concealment, criminals cannot always avoid leaving their footprints or footwear impressions behind, since these are precautions they tend to forget. While this is a vital element of evidence that can point towards a suspect, it is also often overlooked because of a lack of proper training in crime scene investigation. Investigators may not recognize that a shoe print is important, and may fail to notice it or assume it is unusable because others have walked over the scene. Moreover, the passage of time and exposure to the elements quickly degrade such evidence, so it becomes less and less valuable. The collection, preservation, and examination of footprint evidence are therefore crucial tasks that require law enforcement officers and the wider criminal justice community to be trained in them and to understand them scientifically. This will move footprint evidence from being a passive observation to an active tool that can hold people to account in court.
Forensic science is a crucial pillar of India's criminal justice system, providing scientific evidence to aid investigation and court proceedings. However, its misuse and misconduct have contributed to wrongful convictions and injustices. This paper examines the darker aspects of forensic science in India, focusing on systemic flaws, expert bias, and the lack of standardization in forensic practices. Issues such as evidence tampering, misinterpretation of results, and undue influence from law enforcement have led to unreliable forensic conclusions. The absence of a centralized regulatory body and inadequate infrastructure further compromise the integrity of forensic science. Cases like the Aarushi Talwar and Priyadarshini Mattoo trials highlight how forensic inconsistencies can shape legal outcomes. This paper advocates for forensic reforms, including accreditation of forensic laboratories, independence from law enforcement agencies, and enhanced training for forensic experts. Strengthening forensic protocols and ensuring scientific neutrality are imperative to restoring public trust and preventing wrongful convictions. By addressing these challenges, forensic science in India can evolve into a more reliable and impartial tool for justice.
Introduction
Forensic science, a discipline pivotal to modern criminal justice, is often perceived as an objective arbiter of truth. However, in India, its application is fraught with systemic inefficiencies, ethical breaches, and institutional biases that compromise its integrity and perpetuate wrongful convictions. Despite advancements such as DNA profiling and digital forensics, the sector remains tethered to outdated methodologies such as hair microscopy and bite mark analysis, which lack empirical validation and are prone to subjective interpretation. Alarmingly, only 35% of India's forensic laboratories meet accredited standards, fostering environments where error and manipulation thrive, as evidenced by high-profile cases like the Aarushi Talwar case or the Nithari killings, where forensic processes were marred by allegations of evidence tampering and procedural lapses.
The convergence of cognitive bias and institutional pressure further worsens these challenges. Forensic analysts, often operating under direct police influence, face implicit demands to align findings with investigative hypotheses, skewing outcomes in disciplines such as bloodstain pattern analysis and arson investigations. Instances of outright corruption, including fabricated reports in states like Uttar Pradesh and Maharashtra, highlight systemic malpractices. Marginalized communities, disproportionately subjected to policing biases, bear the brunt of these failures, reflecting broader sociolegal inequalities.
Emerging technologies, such as AI-driven facial recognition, risk embedding historical biases into forensic workflows, particularly in a demographic as diverse as India's. The absence of robust legal safeguards underscores the urgent need for accountability. Here we address India's forensic framework through the lens of these systemic flaws, advocating for reforms, mandatory accreditation, blind testing protocols, and ethical oversight, to align practices with the constitutional mandate of fairness under Article 21. Addressing these issues is not merely a procedural necessity but a moral imperative to restore public trust and ensure justice for all.
Forms of Forensic Science Misuse and Misconduct in India
1. Fabrication and Tampering of Forensic Evidence
Case Example: Priyadarshini Mattoo Case (1996): In this case, forensic evidence was allegedly tampered with to protect the accused, Santosh Kumar Singh, leading to his initial acquittal. A re-examination by the Central Bureau of Investigation (CBI) in 2006 revealed critical lapses, including in the handling of DNA evidence (State v. Santosh Kumar Singh).
Challenges:
Chain of Custody Failure: Poor documentation and storage protocol enabled evidence tampering (Dhawan, 2007).
Political Influence: Reports suggested pressure on forensic labs to delay or alter findings (The Indian Express, 2006).
2. Misinterpreting Forensic Evidence
Example: Aarushi Talwar Case (2008): Forensic inconsistencies plagued the Aarushi Talwar case, with conflicting interpretations of bloodstain patterns and DNA evidence. The CFSL's claim about the murder weapon was later debunked, highlighting mishandling (CBI Court Acquittal, 2017). Contaminated samples resulting from an unsealed crime scene exacerbated the errors (Frontline, 2013).
Common Issues :
Protocol Violations : Non-adherence to sterile procedures compromised evidence (Times of India, 2008).
Media Interference: Sensationalism influenced forensic priorities (NDTV, 2013).
3. Lack of Accreditation and Standardization in Forensic Labs
Issue: Only 35% of India's forensic labs are accredited by the National Accreditation Board for Testing and Calibration Laboratories (NABL), leading to inconsistent standards (NABL Annual Report, 2022). Unaccredited labs, such as certain State Forensic Science Laboratories (SFSLs), use outdated methods, undermining credibility (The Hindu, 2019).
Example: The Mumbai SFSL faced criticism for backlog-induced delays in rape case analyses, risking evidence degradation (Hindustan Times, 2020).
4. Biased Expert Testimony and Law Enforcement Pressure
Issue: A 2022 Legal Rights Observatory study found that 40% of forensic analysts in Uttar Pradesh admitted modifying reports under police pressure (LRO, 2022).
Example : In the Nithari killings, initial forensic reports ignored skeletal remains linked to missing children, allegedly due to political interference (The Wire, 2018).
5. Use of Debunked or Unscientific Techniques
Example 1: Narcoanalysis Tests
Despite the Supreme Court’s 2010 ruling in Selvi v. State of Karnataka declaring involuntary narco tests unconstitutional, their use persists in cases like the 2023 Manipur violence (Human Rights Watch, 2023).
Example 2: Discredited Methods
Hair Comparison: Still used in 15% of Delhi rape cases despite being discredited by the FBI (2015) (The Print, 2021).
Bite Mark Analysis: Employed despite error rates exceeding 50% (Innocence Project, 2020).
Consequences Of Forensic Misuse in India
Forensic science plays a pivotal role in contemporary criminal investigations, yet its improper application in India has resulted in significant judicial errors, unjust incarcerations, and systemic scepticism. Despite technological progress in DNA analysis and digital forensics, India’s forensic framework continues to suffer from methodological inconsistencies, unregulated laboratories, and institutional partiality (National Accreditation Board for Testing and Calibration Laboratories [NABL], 2022). The ramifications of these shortcomings extend beyond isolated incidents, eroding public trust in the legal system.
Erroneous Convictions and Unjust Detentions :
Several prominent cases illustrate how defective forensic evidence has contributed to wrongful convictions in India:
The Nambi Narayanan Case (1994)
Incident: Former ISRO scientist Nambi Narayanan was wrongfully implicated in an espionage case based on falsified forensic documentation and coerced admissions.
Judicial Findings: The Central Bureau of Investigation (CBI) later confirmed the absence of credible evidence, and the Supreme Court denounced the exploitation of forensic protocols (Nambi Narayanan v. State of Kerala, 2018).
Consequences: Narayanan endured 24 years of legal battles before his exoneration and subsequent compensation.
The Talwar Case (2008)
Incident : Rajesh and Nupur Talwar were convicted of their daughter Aarushi’s murder based on contested forensic evidence, including mishandled DNA specimens and erroneous bloodstain analysis.
Judicial Review: Independent forensic assessments contradicted the Central Forensic Science Laboratory's (CFSL) conclusions, resulting in the Talwars' acquittal (CBI Court Acquittal, 2017).
Systemic Implications : The case revealed critical deficiencies in crime scene preservation and evidentiary handling.
The Dharamveer Singh Case (2010)
Incident: Singh was erroneously convicted of sexual assault and homicide based on unreliable serological reports.
Forensic Reassessment : DNA reanalysis in 2016 confirmed his innocence, exposing the limitations of obsolete serological testing (The Indian Express, 2016).
2. Sociopsychological and Economic Repercussions:
Deprivation of Freedom: Exonerees frequently spend substantial periods incarcerated before judicial rectification. Wrongfully convicted individuals also encounter societal discrimination and professional exclusion (Justice Project Report, 2021). Moreover, unlike jurisdictions such as the United States, India lacks a structured compensation system for judicial errors (Law Commission of India, 2018).
3. Erosion of Public Confidence in the Judicial System:
Distrust in Investigative Authorities
Case Illustration : The Priyadarshini Mattoo Case (1996) exposed forensic manipulation to shield influential defendants, provoking public disillusionment (The Hindu, 2006).
Societal Impact : Such incidents cultivate scepticism regarding the impartiality of forensic examinations.
Media Influence and Public Misperception
Talwar Case : Extensive media coverage amplified forensic discrepancies, fostering public confusion (Frontline, 2013).
Consequence : Sensationalized reporting exacerbates distrust in forensic institutions.
Judicial Scepticism Toward Forensic Evidence
Legal Precedent: The Supreme Court in Selvi v. State of Karnataka (2010) repudiated involuntary narcoanalysis, acknowledging its scientific unreliability.
Avoidance of Legal Accountability by Guilty Parties
Case Example : The Nithari Killings (2006) demonstrated how forensic negligence enabled suspects to evade prosecution for extended periods (The Wire, 2018).
Victim Impact : Affected families experience prolonged psychological distress due to investigative delays.
Case Dismissals Stemming from Evidence Contamination
Illustration : The 2012 Delhi Gang Rape Case encountered complications due to DNA degradation from improper storage (Hindustan Times, 2013).
Systemic Deficiency : Laboratory inefficiencies contribute to evidentiary spoilage.
Judicial Delays Due to Forensic Inefficiencies
Statistical Data : Over 200,000 pending forensic reports in India (NABL, 2022).
5. Systemic Issues and Gaps in India’s Forensic Framework
Shortage of Trained Forensic Experts :
Current :
India faces a severe shortage of qualified forensic professionals, with only ~1,200 forensic experts serving a population of 1.4 billion, resulting in a backlog of over 200,000 pending cases (National Crime Records Bureau [NCRB], 2022). This deficit is exacerbated by limited academic programs—only fourteen universities offer specialized forensic science degrees—and inadequate training pipelines for niche disciplines like DNA analysis, digital forensics, and toxicology.
Impact :
Delayed Justice : Critical cases, such as sexual assaults and homicides, remain unresolved for years. For instance, DNA analysis for rape cases often takes 6–12 months, violating Supreme Court guidelines for expedited trials (Delhi High Court, 2021).
Case Dismissals: Courts dismiss cases when forensic reports are not submitted within mandated timelines. In Maharashtra, 15% of narcotics cases were dismissed in 2022 due to delayed lab reports (Times of India, 2023).
Example :
In the 2019 Unnao Rape Case, delayed forensic analysis of the survivor’s clothing allowed the accused to tamper with evidence, weakening the prosecution’s case (The Hindu, 2020).
Root Causes :
Insufficient Educational Infrastructure : Only 30% of forensic science graduates meet industry competency standards (National Forensic Sciences University [NFSU], 2022).
Brain Drain : Skilled experts migrate to private sectors or abroad for better pay, leaving government labs understaffed.
Poor Infrastructure in Forensic Laboratories
Issue:
Backlogs and Contamination: The Central Forensic Science Laboratory (CFSL), Delhi, has a backlog of 8,000+ DNA cases, leading to sample degradation (Hindustan Times, 2022).
Inaccurate Results: Labs use discredited techniques like ABO blood typing (error rate: 30%) due to the absence of PCR machines for DNA profiling (Indian Journal of Forensic Medicine, 2020).
Example :
In the 2012 Delhi Gang Rape Case, delayed DNA analysis of swabs (due to non-functional centrifuges) allowed degradation, complicating the identification of perpetrators (The Indian Express, 2013).
Funding Gaps :
Budget Allocation: Forensic labs receive only 0.08% of the Union Budget, compared to 2% in the U.S. (NCRB, 2022).
Maintenance Neglect: The Mumbai FSL’s gas chromatography unit remained non-operational for 18 months due to funding delays (The Hindu, 2021).
Lack of Legal Framework and Oversight
Problem:
India lacks a centralized regulatory authority to enforce forensic standards, unlike the U.S. National Institute of Standards and Technology (NIST) or the U.K. Forensic Science Regulator (FSR). This results in:
Inconsistent Protocols: Labs follow disparate methodologies; for example, some use capillary electrophoresis (CE)-based DNA analysis while others rely on outdated restriction fragment length polymorphism (RFLP) typing (Journal of Indian Academy of Forensic Medicine, 2021).
Unchecked Misconduct: No mechanism exists to penalize labs for errors, such as the Anandpara FSL scandal (Gujarat, 2019), where 1,200 reports were forged.
Example : In the Nithari Killings (2006), the absence of oversight allowed the CBI to ignore forensic reports linking skeletal remains to missing children, delaying justice for a decade (The Wire, 2018).
Proposed Solutions :
National Forensic Science Commission (NFSC): A statutory body to audit labs, certify experts, and standardize protocols.
Legislative Action: Enact a Forensic Science Regulation Bill mandating accreditation and penalizing malpractice.
Recommendations and Reforms for Strengthening India’s Forensic Framework
Establishing a National Forensic Science Authority (NFSA)
Rationale
India’s forensic landscape lacks centralized oversight, leading to inconsistent standards, unregulated practices, and systemic inefficiencies. A National Forensic Science Authority (NFSA), modelled after the U.K. Forensic Science Regulator (FSR) and the U.S. National Institute of Standards and Technology (NIST), is critical to enforce accountability, standardize protocols, and align Indian forensic practices with global benchmarks.
Proposed Structure
1. Statutory Powers: The NFSA should operate under parliamentary legislation, with authority to:
Certify forensic experts and regulate their licensing.
2. Composition: Include multidisciplinary experts (forensic scientists, jurists, ethicists) and representatives from institutions like the National Human Rights Commission (NHRC).
Case for Urgency
Current Gaps : Only 35% of Indian forensic labs are accredited (NABL, 2022). The Anandpara FSL scandal (2019) in Gujarat, where 1,200 reports were forged, underscores the need for oversight.
Global Precedent: The U.K. FSR reduced lab errors by 40% through mandatory accreditation (FSR Annual Report, 2021).
Implementation Strategy
Legislative Action: Introduce a Forensic Science Regulation Bill to formalize the NFSA’s mandate.
Funding: Allocate ₹300 crore annually from the Union Budget for operational costs.
Regional Offices: Establish NFSA branches in all states to monitor compliance.
Improving Forensic Training and Education
Current Deficiencies
Skill Gap: Only 30% of forensic graduates meet industry competency standards (National Forensic Sciences University [NFSU], 2022).
Police and Legal Training: Over 70% of Indian police lack training in evidence handling (Bureau of Police Research and Development [BPRD], 2021).
Key Reforms
1. Curriculum Modernization
University Programs: Revise syllabi to include advanced disciplines (e.g., digital forensics, AI-driven analytics). Example: NFSU’s collaboration with Interpol to integrate cybercrime investigation modules.
Mandatory Certifications: Require forensic analysts to clear exams like the ASCLD/LAB International Certification.
2. Professional Development
Police Training: Partner with BPRD to train 50,000 officers annually on:
Crime scene management.
Avoiding cognitive bias (e.g., confirmation bias in fingerprint analysis).
Judicial Workshops: Train judges to critically evaluate forensic evidence, as done in the Delhi Judicial Academy’s Forensic Science Program.
3. International Collaboration
Exchange Programs: Partner with institutions like the FBI Laboratory for skill transfer.
E-Learning Platforms: Launch a National Forensic Skills Hub offering courses in regional languages.
Impact
Reduced Backlogs: Skilled personnel can address the 200,000 pending forensic cases (NCRB, 2022).
Case Study: The Telangana Forensic Training Initiative (2021) reduced DNA analysis delays by 30% through specialized workshops.
Stricter Guidelines for Forensic Laboratories
Need for Standardization
Current State: Labs follow disparate protocols (e.g., some use STR-based DNA profiling, others rely on outdated RFLP).
Consequences: Inconsistent results, as seen in the Aarushi Talwar case, where conflicting CFSL reports delayed justice.
Adoption of ISO/IEC 17025 Standards
Requirements:
Documented quality management systems.
Proficiency testing for analysts.
Regular equipment calibration.
Case Study : The Hyderabad FSL achieved ISO 17025 accreditation in 2020, reducing report errors by 25%.
Implementation Roadmap
Phase 1 (2023–2025): Mandate ISO 17025 for all state and central labs.
Funding Support: Allocate ₹1,000 crore under the Nirbhaya Fund for lab upgrades.
Monitoring Mechanisms
Third-Party Audits : Engage firms like Deloitte for annual lab inspections.
Public Dashboards : Publish lab performance metrics (e.g., turnaround time, error rates) to ensure transparency.
Ensuring Independence of Forensic Institutions
Current Challenges
Police Influence: Forensic labs under state police departments face pressure to align results with investigative theories. Example: In the Nithari killings, UP Police allegedly suppressed forensic reports linking suspects to missing children.
Political Interference: In the Sohrabuddin Sheikh encounter case, forensic reports were allegedly altered to protect high-ranking officials.
Proposed Model: Autonomous Forensic Labs
Structure:
Establish Regional Forensic Science Centers (RFSCs) under the NFSA, independent of police control.
Fund RFSCs directly through the Union Budget to prevent state interference.
Global Precedent: The U.K. Forensic Science Service (FSS) operated independently of police forces, reducing bias allegations by 60% (FSS Report, 2020).
Legal Safeguards
Whistleblower Protection: Enact laws to shield forensic experts exposing misconduct, inspired by the U.S. False Claims Act.
Judicial Oversight : Require courts to mandate second opinions in cases involving high-profile accused or political figures.
Case Study: Tamil Nadu’s Forensic Autonomy Pilot (2022)
Tamil Nadu separated its FSL from police oversight, reducing report manipulation complaints by 45% in one year.
Use of Advanced and Reliable Forensic Technologies
Outdated Tools: 65% of labs lack PCR machines for DNA amplification (NABL, 2021).
Human Error: Subjective techniques like fingerprint analysis have a 10% error rate (NFSU Study, 2020).
Key Technologies for Adoption
A. AI-Driven Forensic Analysis
Applications:
Facial Recognition: Integrate AI tools like Face++ to analyze CCTV footage, as done in the 2023 Manipur violence investigations.
Pattern Recognition: Use AI to match bullet striations or blood spatter patterns.
Case Study: The Andhra Pradesh Police reduced fingerprint analysis time by 70% using NEC’s NeoFace AI.
B. Next-Generation DNA Profiling
Technologies:
Rapid DNA Analysis: Deploy portable devices (e.g., ANDE Corporation) to process samples in 90 minutes.
Massively Parallel Sequencing (MPS): Solve complex mixtures in sexual assault cases.
Impact: In cases like the 2012 Delhi Gang Rape Case, MPS could have identified the perpetrators faster.
C. Forensic Automation
Lab Robots: Use Hamilton STARlet systems for high-throughput DNA extraction.
Blockchain: Secure chain-of-custody records using platforms like Forensic Chain; a minimal hash-chain sketch follows this list.
Ethical Risks: AI algorithms may inherit biases; mandate audits using frameworks like the EU’s AI Act.
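The blockchain item above amounts to hash-linking each custody event to the one before it so that any later edit is detectable. The minimal Python sketch below illustrates that idea only; it is a conceptual illustration, not the interface of Forensic Chain or any deployed system, and the evidence fields used are hypothetical.

```python
# Conceptual sketch of a tamper-evident chain-of-custody log using hash chaining.
# Not the API of Forensic Chain or any specific platform; field names are hypothetical.
import hashlib
import json
from datetime import datetime, timezone

def add_entry(chain, evidence_id, handler, action):
    """Append a custody event whose hash covers the previous entry's hash."""
    prev_hash = chain[-1]["entry_hash"] if chain else "0" * 64
    record = {
        "evidence_id": evidence_id,
        "handler": handler,
        "action": action,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prev_hash": prev_hash,
    }
    record["entry_hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    chain.append(record)
    return record

def verify(chain):
    """Recompute every hash; any edited or reordered entry breaks the chain."""
    prev_hash = "0" * 64
    for record in chain:
        body = {k: v for k, v in record.items() if k != "entry_hash"}
        if body["prev_hash"] != prev_hash:
            return False
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != record["entry_hash"]:
            return False
        prev_hash = record["entry_hash"]
    return True

custody_log = []
add_entry(custody_log, "DNA-2024-0113", "Scene Officer A", "collected swab at scene")
add_entry(custody_log, "DNA-2024-0113", "FSL Analyst B", "received sample for profiling")
print("Chain intact:", verify(custody_log))   # True
custody_log[0]["handler"] = "Someone Else"    # simulated tampering
print("Chain intact:", verify(custody_log))   # False
```

In a production system the entries would additionally be replicated or anchored across multiple independent nodes, which is what distinguishes a blockchain from a single hash-chained log.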
Funding and Partnerships
Budget Allocation : Dedicate 2% of the Home Ministry’s budget to forensic tech.
Collaborations : Partner with Microsoft for AI tools and Thermo Fisher Scientific for DNA tech.
Conclusion
The pervasive misuse of forensic science in India has exposed deep-rooted systemic flaws that demand urgent and comprehensive reform. Wrongful prosecutions and convictions, such as those of Nambi Narayanan and the Talwars, underscore the human cost of unreliable forensic practices: shattered lives, psychological trauma, and irreversible social stigma. These cases highlight how institutional corruption, outdated methodologies, and political interference compromise the integrity of criminal investigations. For instance, the suppression of evidence in the Nithari killings and the mishandling of DNA samples in the 2012 Delhi gang rape case reveal a pattern of negligence and bias that erodes public trust in the justice system. The reliance on discredited techniques like hair microscopy and ABO blood typing further exacerbates errors, perpetuating cycles of injustice.
Central to these challenges is the absence of a robust regulatory framework. Unlike the U.K. or U.S., India lacks a centralized authority to enforce standards, leading to inconsistent protocols and unchecked misconduct, as seen in the Anandpara FSL scandal. Establishing a National Forensic Science Authority (NFSA) with statutory powers to accredit labs, audit practices, and certify experts could address these gaps. Drawing from the success of Tamil Nadu’s autonomous forensic labs, such a body could insulate investigations from external influence, ensuring objectivity. Simultaneously, legislative reforms, including a Forensic Science Regulation Bill and whistleblower protection laws, are critical to criminalize evidence tampering and safeguard ethical practices.
Technological modernization is equally vital. Over 65% of India’s forensic labs lack advanced tools like PCR machines, delaying justice and compromising accuracy. Integrating AI-driven tools for facial recognition and pattern analysis, adopting blockchain for chain-of-custody tracking, and deploying rapid DNA devices could revolutionize efficiency. Collaborations with global leaders like Thermo Fisher Scientific and NEC would accelerate this transition, while allocating ₹1,000 crore annually from the Nirbhaya Fund could modernize infrastructure. Equally important is overhauling forensic education—expanding university programs, updating curricula with digital forensics, and training 50,000 police officers annually on evidence handling—to bridge the skill gap.
Global models offer valuable lessons. The U.S. Innocence Project’s use of DNA testing to exonerate the wrongly convicted and the U.K. Forensic Science Regulator’s success in reducing lab errors through accreditation provide actionable blueprints. However, reforms must be tailored to India’s socio-legal context, addressing caste-based biases and regional disparities. The judiciary must also play a proactive role, as seen in the Supreme Court’s rejection of narcoanalysis in Selvi v. State of Karnataka, by scrutinizing forensic testimony rigorously. Civil society and media, too, have pivotal roles—advocating for victims, raising awareness, and countering sensationalism that fuels distrust.
Ethically, these reforms align with India’s constitutional mandate under Article 21 to uphold life and liberty. Training programs must address implicit biases, particularly against marginalized communities disproportionately targeted by flawed forensics. Compensation laws, modelled on the U.K. Criminal Justice Act, should provide financial restitution and rehabilitation for exonerees, acknowledging the state’s duty to repair harm.
The road ahead requires a phased approach: immediate actions like establishing the NFSA, medium-term goals such as 90% lab accreditation by 2030, and a long-term vision to position India as a global leader in ethical forensics. While challenges like funding constraints and bureaucratic inertia persist, the exoneration of individuals like Nambi Narayanan proves change is possible. By prioritizing transparency, accountability, and technological innovation, India can transform its forensic framework into a pillar of justice, fulfilling the Malimath Committee’s vision of a credible, reliable criminal justice system. Ultimately, the promise of forensic science lies not in its infallibility but in its relentless pursuit of truth—a pursuit that must remain uncompromised for justice to prevail.
Imagine a star so huge that if it sat at the center of our solar system, it would engulf everything from Mercury to Jupiter—and even beyond. Meet Stephenson 2-18 (St2-18), a red hypergiant that currently holds the title of largest known star by radius.
A Record-Breaking Giant
Named for the Stephenson 2 star cluster, catalogued in 1990 by astronomer Charles Bruce Stephenson, Stephenson 2-18 sits within that cluster in the constellation Scutum, roughly 19,570 light-years away from Earth. With an estimated radius around 2,150 times that of our Sun, it is so vast that it stretches the limits of what we understand about stellar evolution.
To put its size into perspective, if Stephenson 2-18 replaced our Sun, its surface would extend far past the orbit of Jupiter; at the nominal estimate of about 2,150 solar radii (roughly 10 astronomical units), it would reach approximately the orbit of Saturn. That means it would swallow all the inner planets—including Earth—and much of the outer solar system.
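As a rough check on that comparison, the short Python sketch below converts the quoted radius estimate of about 2,150 solar radii into astronomical units and compares it with approximate planetary orbital distances. All inputs are the nominal values cited above and carry substantial uncertainty.

```python
# Back-of-the-envelope size comparison for Stephenson 2-18.
# The ~2,150 solar-radius figure is the nominal estimate quoted above;
# planetary distances are approximate semi-major axes in AU.

SOLAR_RADIUS_KM = 6.957e5   # nominal solar radius, km
AU_KM = 1.496e8             # astronomical unit, km

radius_solar = 2150
radius_au = radius_solar * SOLAR_RADIUS_KM / AU_KM

orbits_au = {"Mercury": 0.39, "Earth": 1.00, "Jupiter": 5.20, "Saturn": 9.58}

print(f"Stephenson 2-18 radius = {radius_au:.1f} AU")
for planet, a in orbits_au.items():
    verdict = "engulfed" if a < radius_au else "outside the stellar surface"
    print(f"  {planet} ({a:.2f} AU): {verdict}")
```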
The Science Behind the Star
Stephenson 2-18 is classified as a red hypergiant, a type of star that has evolved beyond its main sequence phase and expanded to a truly colossal size. These stars burn through their nuclear fuel at a ferocious rate, generating an incredible amount of energy and losing mass through powerful stellar winds.
What’s fascinating is that, despite its enormous size, Stephenson 2-18 is relatively cool compared to smaller stars like the Sun. Its surface temperature is estimated to be around 3,200 Kelvin, giving it a deep reddish hue—a hallmark of red supergiants and hypergiants.
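That colour follows directly from the temperature. A quick Wien's-law estimate, using the ~3,200 K figure quoted above, puts the star's peak emission in the near-infrared, which is why the visible light we do see is skewed toward deep red (a back-of-the-envelope calculation, not a spectral model):

```python
# Wien's displacement law gives the wavelength of peak blackbody emission:
# lambda_peak = b / T. The 3,200 K value is the estimate quoted above.

WIEN_B_NM_K = 2.898e6   # Wien's displacement constant, nm*K

T_eff = 3200.0          # assumed effective temperature, K
peak_nm = WIEN_B_NM_K / T_eff

print(f"Peak emission wavelength = {peak_nm:.0f} nm")  # ~906 nm, in the near-infrared
```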
Because of its vast size and low density, Stephenson 2-18 is unstable and losing mass rapidly. Eventually, it is expected to shed its outer layers in a violent supernova explosion, enriching the surrounding space with heavy elements that can seed new stars and planets.
Measuring the Monster
Determining the size of a star like Stephenson 2-18 is no easy task. Astronomers use a combination of spectroscopy, infrared measurements, and theoretical models to estimate its radius and luminosity. However, these measurements come with uncertainties, as the star’s bloated atmosphere and mass loss can affect the calculations.
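To illustrate one piece of that process: given an effective temperature and a luminosity, the Stefan-Boltzmann law L = 4πR²σT⁴ can be inverted for the radius. The sketch below uses a luminosity of roughly 440,000 times the Sun's together with the ~3,200 K temperature quoted earlier; both numbers are illustrative assumptions in the range of published estimates, not definitive measurements.

```python
# Illustrative radius estimate from the Stefan-Boltzmann law:
#   L = 4 * pi * R**2 * sigma * T**4  =>  R = sqrt(L / (4 * pi * sigma * T**4))
# The luminosity (~4.4e5 L_sun) and temperature (3,200 K) are assumed example inputs.
import math

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
L_SUN = 3.828e26         # nominal solar luminosity, W
R_SUN = 6.957e8          # nominal solar radius, m

L = 4.4e5 * L_SUN        # assumed bolometric luminosity
T = 3200.0               # assumed effective temperature, K

R = math.sqrt(L / (4 * math.pi * SIGMA * T**4))
print(f"Estimated radius = {R / R_SUN:.0f} solar radii")   # roughly 2,150 R_sun
```

Changing either input within its published uncertainty range shifts the result considerably, which is why quoted radii for stars like this always come with caveats.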
Nevertheless, the consensus is that Stephenson 2-18’s radius surpasses that of other record-holding stars like UY Scuti and VY Canis Majoris, making it a truly cosmic behemoth.
A Window into Stellar Evolution
Stephenson 2-18 is more than just a record-breaker—it’s a crucial laboratory for astronomers trying to understand the life cycles of massive stars. Studying such giants can shed light on how the most massive stars live and die, how they enrich the galaxy with heavy elements, and how they end their lives in spectacular supernovae.
It’s also a reminder of the incredible diversity of stars in our universe, from tiny red dwarfs to these enormous hypergiants. By comparing Stephenson 2-18 to other stars, scientists can refine their models of stellar evolution and better predict the fates of the most massive stars.
Conclusion
Stephenson 2-18 stands as a testament to the grandeur and mystery of the cosmos. Its staggering size challenges our understanding of physics and stellar evolution, offering a glimpse into the extreme possibilities that exist beyond our solar system.
As astronomers continue to refine their measurements and models, Stephenson 2-18 remains a shining example—both literally and figuratively—of the boundless wonders that await discovery in the night sky.