Prototype for Stress-Based Viral Fever Prediction

A. Srivatsa*1, Kavya A.P.2, F. Fareesha1, Lekhana G.1, Gowthami V.1 and Chandanashree Y.K.1

1Department of Electronics and Communication, The National Institute of Engineering, Mysuru, India
2Department of Electronics and Communication, Vidyavardhaka College of Engineering, Mysuru, India

Submitted on 25 August 2025; Accepted on 11 November 2025; Published on 17 December 2025

To cite this article: A. Srivatsa, Kavya A.P., F. Fareesha, Lekhana G., Gowthami V. and Chandanashree Y.K., “Prototype for Stress-Based Viral Fever Prediction,” Trans. Appl. Sci. Eng. Technol., vol. 1, no. 2, pp. 1-12, 2025.

Abstract

This paper presents the development of a prototype device for predicting viral fever by monitoring stress levels through cortisol detection. Research indicates a negative correlation between the immune system’s strength and prolonged high stress levels. As stress increases, the immune system weakens, making individuals more susceptible to viral fevers and infections. The primary goal of this study is to design a device that tracks stress levels by measuring cortisol, a neuroendocrine hormone released during stress. Our approach utilizes a synthetic form of cortisol to construct the prototype, employing infrared (IR) sensing technology with IR LEDs and photodiodes to detect cortisol levels. This innovative device aims to provide an early warning system for viral infections by continuously monitoring and analyzing stress-induced hormonal changes.

Keywords: infrared; stress levels; cortisol; biomimicry; immune system

Abbreviations: IR: infrared; PHA: phytohemagglutinin; Con A: concanavalin A; NK: natural killer; HSE: Health and Safety Executive; ACTH: adrenocorticotropic hormone; ELISA: enzyme-linked immunosorbent assay; NIRS: near-infrared spectroscopy; PLS: partial least squares; ANN: artificial neural network; OGTT: oral glucose tolerance test; FPG: fasting plasma glucose; SARS: severe acute respiratory syndrome; MERS-CoV: Middle East respiratory syndrome coronavirus; MHC: major histocompatibility complex; CRH: corticotropin-releasing hormone; HPA: hypothalamic-pituitary-adrenal; RIA: radioimmunoassay; UFC: urinary free cortisol

1. Introduction

Nature has always been a great inspiration for sustainable and innovative solutions. The practice of drawing on nature to build such solutions is called biomimicry. Rooted in recognizing nature’s efficiency, resilience, and adaptability, biomimicry offers a pathway to address complex challenges across domains ranging from engineering and architecture to materials science and medicine. COVID-19, the pandemic that shut down the whole world, is caused by a member of a group of viruses called coronaviruses, commonly found in bats [1]. Bats host several viruses yet are rarely harmed by them, a resilience attributed to their remarkably strong immune system [2]. When bats are exposed to stressful situations, however, their immune system deteriorates and the viruses they carry begin to damage their health. This promotes the spillover of viruses from bats into the environment, where they can affect other organisms [3]. A similar phenomenon is observed in humans: prolonged stress diminishes the strength of a person’s immune system, making them susceptible to viral infections. This project leverages that finding to implement a device that predicts viral fever by monitoring a person’s stress levels.

The primary objective of the project is to create a prototype device capable of accurately tracking an individual’s susceptibility to viral infections, facilitating early detection and proactive management. We plan to incorporate stress monitoring capabilities into this device, providing users with comprehensive health insights, particularly in scenarios where stress may compromise immune function. Through the use of advanced non-invasive sensor technologies, the aim is to minimize discomfort and simplify the diagnostic process, reducing reliance on invasive techniques. Additionally, we are developing an algorithm that takes recorded stress levels as input to predict an individual’s vulnerability to viral infections. Beyond its diagnostic capabilities, our wearable device aims to empower individuals with actionable insights into their health status. Through user-friendly interfaces and mobile applications, individuals can easily access and interpret their health data, enabling them to make informed decisions about their health and well-being.

2. Literature Survey

According to the results of a meta-analysis, stress is reliably related to a decline in immune function. The methodology involved a comprehensive search for studies on stress and immune function, utilizing both computerized searches and reference list inspections. Inclusion criteria focused on English-language peer-reviewed studies reporting data from independent samples, with stress measured using self-report measures or common stressors, and participants being physically healthy. Additionally, studies were categorized based on stressor duration and type, and meta-analytic techniques, including effect size estimation and fail-safe N computation, were employed to analyze the data. Functional immune parameters included in the analysis were lymphocyte proliferation in response to phytohemagglutinin (PHA) and concanavalin A (Con A), as well as natural killer (NK) cell cytotoxic activity. These parameters were treated as outcomes to assess the effects of stress on immune function. Specifically, they were among the measures used to evaluate the impact of stress on the activity and function of lymphocytes (such as T cells) and NK cells, which are key components of the immune system involved in defending the body against pathogens and abnormal cells. The analysis revealed that stress reliably correlates with reductions in the proliferative response to both PHA and Con A, as well as with diminished NK cell activity [4].

Another study, conducted from January to December 2014, involved 294 subjects. Clinical and anamnestic data, including age, occupational history, medical background, and lifestyle habits (smoking and alcohol consumption), were collected for each participant. The research focused on evaluating the impact of stress on the neuroendocrine system, particularly its association with elevated glucose levels via cortisol-mediated mechanisms. Individuals with a history of diabetes were excluded from the study. During the visits, a 10 mL venous blood sample was obtained from each subject to analyze blood glucose levels on the same day as the administration of the Health and Safety Executive (HSE) questionnaire, which assessed various stress-related factors such as workload, control, support, relations, role, and change. Statistical analyses involved calculating the mean, standard deviation, median, and range for blood glucose values, questionnaire scores, and confounding factors. Pearson correlation coefficients, after log transformation of the data, were used to gauge the correlation between stress levels and blood glucose levels. Multiple linear regression analysis was employed to account for major confounding variables. Statistical significance was considered at p-values < 0.05. The findings suggested a potential, albeit statistically nonsignificant, increase in blood glucose levels associated with higher workplace stress levels [5].

Stress triggers a cascade of physiological responses in the body, including the release of cortisol, a key stress hormone. Cortisol acts through specific receptors located in the cytoplasm of target cells, influencing various metabolic and immune processes. Traditionally, cortisol levels have been assessed through blood sampling, which has drawbacks such as inducing additional stress and measuring total cortisol rather than the biologically active form. However, cortisol response to stress typically lags behind the secretion of adrenocorticotropic hormone (ACTH) by 5–20 minutes, with peak blood levels achieved in 10–30 minutes. Interestingly, cortisol transfer from blood to saliva occurs rapidly, within 2–3 minutes, offering a potential alternative method for cortisol measurement. While studies validating the correlation between saliva cortisol concentrations and free cortisol levels in blood are limited, saliva cortisol presents a promising avenue for stress biomarker research due to its ease of collection and potential wide applicability [6].

The paper "Highly sensitive and non-invasive electrochemical immunosensor for salivary cortisol detection" [7] introduces a non-invasive electrochemical immunosensor designed for the highly sensitive and selective detection of cortisol in saliva. This sensor achieves a low detection limit and high sensitivity, indicating its potential for creating portable and efficient sensing devices. The study utilized a NiO thin film (NiO/ITO) along with EDC-NHS chemistry to immobilize the anti-cortisol antibody. Saliva samples were collected, stored at 4°C, and diluted to reduce interference. Structural and morphological analyses were conducted using XRD, SEM, AFM, and FTIR spectroscopy. The sensor achieved a detection limit of 0.32 pg/mL with minimal interference. It demonstrated stable responses over several weeks and produced reproducible results across multiple electrodes. Additionally, the sensor showed a good correlation with ELISA results after the samples were diluted [7].

In a recent study, researchers detected glucose levels non-invasively by measuring the dielectric properties of samples at microwave frequencies. They developed a correlation between glucose levels and the measured dielectric properties using Cole-Cole polynomial equations. The dielectric properties of blood plasma were measured for various glucose concentrations within the frequency range of 0.5 GHz to 20 GHz. A significant change in dielectric properties was noted when glucose levels increased from 2000 mg/dL to 4000 mg/dL. While promising for cortisol detection through the development of similar Cole-Cole polynomial models, this method poses challenges in miniaturization and is highly sensitive to measurement conditions, making it less practical for portable applications [8].

A study aimed at developing a non-invasive and rapid method for in vivo blood glucose measurement focused on near-infrared spectroscopy (NIRS), which is considered more promising than other techniques. In this research, NIR spectra were recorded at specific intervals, and blood glucose levels were measured using a biochemical analyzer. Calibration models for the NIR data were constructed using partial least squares (PLS) and artificial neural network (ANN). The robustness and repeatability of these models were confirmed through t and F tests at a significance level of 0.05. Additionally, diabetes induction was validated using the oral glucose tolerance test (OGTT), with diabetic rats showing significantly higher fasting plasma glucose (FPG) and 2-hour post-OGTT plasma glucose (2h-PG) levels compared to normal rats [9].

3. Methodology

3.1. Biomimicry

Biomimicry, derived from the Greek words bios (life) and mimesis (to imitate), is the practice of imitating nature to solve human problems. It is a relatively new discipline that involves studying nature’s solutions and applying them to human design challenges [10]. The term "biomimicry" was coined in 1982 and popularized by scientist Janine Benyus in her influential 1997 book, "Biomimicry: Innovation Inspired by Nature." Benyus defined biomimicry as "the new science that studies nature’s models and imitates these designs to solve human problems." She advocated for viewing nature as a "model, measure, and mentor," emphasizing that the main goal of biomimicry is sustainability. Biomimicry looks to nature as a model for innovative designs, a measure of sustainability, and a mentor for learning valuable lessons. Nature’s principles, captured in life’s principles, serve as a guide for evaluating the sustainability of human innovations. Biomimicry represents a shift towards valuing nature for what we can learn from it, rather than what we can extract from it [10].

Michael Pawlyn once rightly said: “You could look at nature as being like a catalogue of products, and all of those have benefited from a 3.8-billion-year research and development period. And given that level of investment, it makes sense to use it.” [11]

Throughout history, humans have drawn inspiration from nature’s creatures. In ancient Greek mythology, Daedalus and his son, Icarus, famously attempted to emulate birds’ flight to escape their island prison. During the Renaissance, Leonardo da Vinci drew on his observations of birds and bats to design a flying machine, showcasing nature’s influence on his inventions. In the 20th century, Swiss engineer George de Mestral was inspired by the burrs that clung to his dog’s fur during a walk in the Alps. This observation led him to invent Velcro, which mimicked the mechanism of attachment found in nature [11].

3.2. Purpose of choosing biomimicry

The fundamental approach of biomimicry is to comprehend the core principles of a biological process or adaptation and then apply these principles to create bio-inspired products or address specific technical challenges [12]. The growing interest in biomimicry in recent times has led to numerous product innovations [13].

Biomimicry is poised to revolutionize various fields in the 21st century by transforming the way professionals think and inspire their ideas. This approach influences the materials used in different industries, from building to technology, and their impact on the environment and users. Despite its seemingly small role in daily life, biomimicry significantly impacts the world we live in. The adaptation of biomimetic technology in the latter half of the 20th century has led to new conventions in different areas, challenging traditional ways of thinking. Design biomimetics serves as a bridge between different professions, linking design with environmental considerations. This approach encourages thinking and designing practices that prioritize environmental and biological considerations, resulting in more responsive and safer products. Biomimetic technology offers solutions to environmental issues, such as reducing emissions and purifying the surrounding environment. Understanding and utilizing this technology will be vital for humanity’s development in the 21st century.

3.2.1. Influence from bats

Bats are known to be the primary source of zoonotic viruses worldwide. Molecular studies have confirmed that bats serve as natural reservoirs for rabies and other Lyssaviruses [14]. They have also been identified as major reservoirs for fatal zoonotic viruses such as the severe acute respiratory syndrome (SARS) coronavirus [15], Middle East respiratory syndrome coronavirus (MERS-CoV) [16], Ebola and Marburg hemorrhagic fever filovirus [14], and paramyxoviruses like Nipah and Hendra [17]. Bats’ ability to fly and their vast number of species provide ideal breeding sites for viruses. Their unique flying ability enables direct and indirect interactions with various animal species across different geographic areas, increasing interspecies viral transmission. The physiological effects of flight on bats’ immune response are notable, as it raises their metabolic rate significantly, leading to elevated body temperatures akin to fever. This "fever" state enhances the immune response by increasing the activity of both innate and adaptive immune systems, which helps suppress viral proliferation [2] (Figure 1).


FIGURE 1: Stress weakens a bat’s antiviral response, allowing viruses to replicate in their cells. Source: https://asknaturestage.wpengine.com/strategy/batsavoid-viral-infection-with-super-immunity/the-strategy.

Additionally, some bat species can undergo hibernation and enter a torpor state, lowering their body temperature and metabolic rate. This decrease in metabolic activity and body temperature reduces their immune response, which may prevent viral destruction within bat populations [14, 18].

Bats’ roosting habits, characterized by high-density roosts with diverse species, further increase the likelihood of intra- and interspecies viral transmission [19]. Zoonotic viruses found in bats are predominantly RNA viruses that replicate in the cytoplasm, exhibiting high potential for replication, mutation, and recombination [19]. These viruses are highly adaptable and persistently infect bat populations, potentially serving as a defense mechanism against predators and acting as a biological weapon for bats. Bats possess immune elements similar to other mammals, including pattern recognition receptors, interferons, complement activity, immunoglobulins, antibody response, major histocompatibility complex (MHC), cytokines, and T-cell-dependent immunity. However, they also have unique immune characteristics that support their ability to resist viral infections [20]. Wang et al. [20] summarized the immune system of bats, hypothesizing that bats can resist viral infections early on through their innate immune system.

The bat is a fascinating source of inspiration for biomimicry, as it serves as a natural reservoir for numerous viruses without being harmed by them. Bats spread these viruses to other organisms through activities like urinating, shedding, or consuming fruits. Bats only succumb to these viruses when their immune systems weaken.

3.3. Biomimicry steps

Define: The first stage in the biomimicry process is define. This stage involves identifying the problem or challenge that needs to be addressed. It requires a comprehensive understanding of the context, constraints, and goals of the project. During this stage, stakeholders are engaged to gather detailed information about the problem, ensuring that the defined problem is precise and accurately reflects the needs and limitations of the environment in which the solution will be implemented. By thoroughly defining the problem, the stage sets a solid foundation for the rest of the biomimicry process.

Biologize: The biologize stage translates the defined problem into biological terms. This involves identifying the essential functions and processes required to solve the problem and finding analogous challenges and solutions in nature. Biologizing helps to bridge the gap between human challenges and biological strategies, encouraging the exploration of how organisms and ecosystems solve similar problems. During this stage, a thorough understanding of biological systems and their underlying principles is developed, enabling designers to frame their problem in a way that is compatible with natural processes and solutions.

Discover: In the discover stage, extensive research is conducted to uncover relevant biological models that exemplify potential solutions. This involves studying a wide range of organisms, ecosystems, and natural phenomena to gather insights and strategies that nature employs to solve problems analogous to the defined challenge. This stage emphasizes biodiversity and the vast array of solutions evolved over millions of years. By documenting and analyzing these biological strategies, designers can compile a rich source of inspiration and potential blueprints for their projects.

Abstract: The abstract stage focuses on distilling the biological strategies discovered into general principles and concepts that can be applied to human design challenges. This involves identifying the core mechanisms and functions that make the biological solution effective and translating these into a form that can be utilized in the design process. Abstraction allows the extraction of essential insights without getting bogged down by the complexities of the entire biological system, making it easier to apply these principles across different contexts and scales.

Emulate: In the emulate stage, the abstracted principles are integrated into the design of a solution. This involves creating prototypes and models that mimic the identified biological strategies. Emulation requires innovative thinking to adapt these principles to the constraints and requirements of human design, while preserving the efficiency and sustainability found in nature. This stage often involves iterative testing and refinement, ensuring that the solution not only mimics the biological inspiration but also meets the practical needs of the defined problem.

Evaluate: The final stage, evaluate, involves assessing the performance and impact of the biomimetic design. This stage ensures that the solution effectively addresses the defined problem while adhering to sustainability and efficiency criteria. Evaluation may include testing the solution in real-world conditions, analyzing its environmental impact, and comparing its performance to traditional solutions. Feedback from this stage is crucial for making necessary adjustments and improvements, ensuring that the biomimetic design is both practical and aligned with the principles of nature. This stage also involves reflecting on the entire process to identify lessons learned and opportunities for further innovation (Figure 2).


FIGURE 2: Biomimicry steps.

3.4. Stress

Despite its often-negative connotations, stress is a familiar and widespread aspect of life, acting as a stimulant for some and a burden for many others. Definitions of stress vary, focusing on internal or external challenges, how organisms perceive stimuli, or physiological responses. Stress is understood as a chain of events: a stressor triggers a brain reaction, leading to bodily fight-or-flight responses. Key mediators of stress effects include norepinephrine, epinephrine, corticotropin-releasing hormone (CRH), adrenocorticotropin, and cortisol. These hormones can induce changes in cells and tissues, signaling the presence of a stressor [21, 22].

An integrated definition states that stress is a constellation of events, consisting of a stimulus (stressor), that precipitates a reaction in the brain (stress perception), that activates physiological fight-or-flight systems in the body (stress response) [23]. It’s crucial to note that a stressor affects the brain or body solely through the biological stress response. The major mediators of stress effects are norepinephrine and epinephrine from the sympathetic nervous system, and CRH, adrenocorticotropin, and cortisol from the hypothalamic–pituitary–adrenal axis. Since almost all cells in the body have receptors for these factors, stress hormones can induce changes in nearly all cells and tissues, signaling the presence of a stressor. Chronic or long-lasting stress can be harmful [24–26], but it’s often overlooked that a stress response can have beneficial adaptive effects in the short term [27, 28]. One major distinguishing characteristic of stress is the duration of its biological effects. Short-term stress typically lasts for minutes to hours, while chronic stress persists for several hours per day for weeks or months [23]. Dysregulation of the circadian cortisol rhythm is one marker that appears to coincide with the deleterious effects of chronic stress [23, 29, 30]. The intensity of stress can be gauged by the peak levels of stress hormones, neurotransmitters, and other physiological changes, such as increases in heart rate and blood pressure. This intensity could affect how long these changes persist during and after stress.

The impact of stress on immune function and subsequent health outcomes depends on several key factors. These include how stress affects the distribution of immune cells in the body, the duration of the stress experienced, and the type and concentration of stress hormones involved. Additionally, factors such as timing of stress exposure relative to immune system activation, as well as individual differences like gender, genetics, and age, can all play a role. It’s important to note that whether stress enhances or suppresses immune function, the ultimate impact on health depends on the overall effect on the immune response. Understanding these complex interactions is crucial for managing stress and its effects on health. Increasing stress can weaken the human immune system, making individuals more vulnerable to viral infections [31].

3.5. Parameters of stress

The two parameters that indicate stress in the body are glucose and cortisol. Glucose is a type of sugar that serves as a primary source of energy for cells in the human body. It is produced when carbohydrates in food are broken down during digestion. Glucose is absorbed into the bloodstream and transported to cells, where it is used for energy production. It is particularly important for the brain and red blood cells, which rely heavily on glucose for fuel. Any excess glucose not immediately needed for energy is stored in the liver and muscles as glycogen for later use.

In an Italian healthcare company, 241 employees were assessed for work-related stress using the HSE Indicator Tool. After excluding certain individuals based on specific criteria, the final sample consisted of 149 males and 92 females aged 27–67 years. The participants’ data were anonymized and analyzed for scientific purposes using a 35-item questionnaire that examined seven organizational dimensions related to stress and well-being. Statistical analysis of the questionnaire was performed using software provided by HSE. The study found a significant association between perceived work-related stress, particularly in dimensions such as support from managers, support from colleagues, quality of relationships, and professional changes, and increased levels of circulating glucose in healthcare workers. The analysis revealed a negative correlation between the HSE questionnaire scores, in which lower scores indicate higher perceived stress, and blood glucose values, even after accounting for confounding factors such as age, sex, length of employment, smoking, and alcohol consumption. This suggests that higher levels of work-related stress are associated with increased blood glucose levels in the studied population [5].

Cortisol (C21H30O5) is a steroid hormone with a molecular weight of 362.46 g/mol, produced by the adrenal glands, located above the kidneys, as part of the body’s stress response system, the hypothalamic-pituitary-adrenal (HPA) axis. It is widely recognized as a biomarker of both psychological and physiological stress [32, 33]. When triggered by various factors, the hypothalamus releases CRH, which stimulates the pituitary gland to release ACTH into the bloodstream. ACTH then prompts the adrenal cortex to increase cortisol production. Cortisol plays a crucial role in regulating blood pressure, carbohydrate metabolism, and glucose levels, while also contributing significantly to the homeostasis of various bodily systems, including the cardiovascular, renal, immune, endocrine, and skeletal systems [34, 35]. Abnormally elevated cortisol levels can disrupt blood amino acid and fatty acid levels, leading to immune suppression and inflammation, and can result in symptoms such as obesity, bone fragility, and fatigue [20]. Conversely, decreased cortisol levels, as seen in Addison’s disease, can manifest as arterial hypotension, weight loss, and darkened scars and skin folds [21]. Because cortisol’s primary effects are closely linked to emotional or psychological stress, it has earned the moniker ‘stress hormone’ [36].

Glucose levels, while varying among individuals, are not ideal for stress detection due to their susceptibility to external factors such as diet and physical activity. This variability can lead to inaccurate assessments of stress levels.

3.6. Invasive approach to detecting cortisol and associated risks

Blood testing is the primary, invasive method of measuring cortisol, the essential steroid hormone produced by the adrenal glands. These tests are essential for diagnosing and monitoring conditions such as Cushing’s syndrome, Addison’s disease, and adrenal insufficiency, which require precise and reliable results. To draw blood, a medical professional must insert a needle into a vein, an operation that can cause mild pain, a pricking or stinging sensation, and possible bruising or throbbing afterwards. Although generally safe, the procedure carries certain minor hazards: heavy bleeding, fainting, multiple puncture attempts, hematoma, and infection. Because of these risk factors and the discomfort they cause, there is an increasing trend towards non-invasive methods of monitoring cortisol levels, such as saliva and urine testing, which offer patients a more pleasant and less invasive option.

3.7. Embracing a non-invasive approach

Transitioning to non-invasive approaches for diagnosis, particularly those that eliminate the need for needles and invasive procedures, is crucial for improving patient comfort and compliance. Non-invasive approaches offer a promising solution to this challenge by providing simple and comfortable monitoring of stress levels, making them accessible across different age groups, from children to the elderly.

Current clinical assays typically measure total cortisol levels, which include both protein-bound and free fractions. However, only free cortisol is biologically active and responsible for the body’s cortisol-related effects. This active fraction can be detected in various biological fluids, such as blood (serum and plasma), saliva, and urine [37]. Unfortunately, determining free cortisol levels traditionally involves labor-intensive laboratory methods that are unsuitable for point-of-care diagnostics due to their reliance on large samples and time-consuming processes [38].

To overcome these challenges, we are embracing non-invasive approaches to revolutionize cortisol monitoring. These methods have the potential to detect cortisol levels in real time with minimal intrusion, providing continuous insights into an individual’s stress response throughout the day. By offering timely interventions and personalized stress management strategies, this approach can significantly improve healthcare outcomes.

Moreover, the accessibility and user-friendly nature of these non-invasive approaches make them suitable for individuals of all ages. Their non-intrusive design ensures comfort and compliance, facilitating widespread adoption and ultimately enhancing the quality of care for everyone. By transitioning towards a more patient-centered approach to diagnosis and monitoring, we can empower individuals to take control of their health and well-being.

3.8. Non-invasive approaches to detect cortisol

Cortisol is present in measurable amounts in various body fluids, such as urine, sweat, and saliva, which are all easily accessible without surgical devices [7].

3.8.1. Using saliva

Salivary cortisol is a biologically active form of cortisol that can be easily collected and monitored without causing distress to the person involved in the test. Because of these advantages, it has become an area of promising research. However, the levels of salivary cortisol are much lower (up to 100-fold) than those in serum, ranging from 0.1 ng/mL to 10 ng/mL, making it important to use highly sensitive and selective immunosensors with low detection limits for efficient and non-invasive detection of cortisol. Currently, techniques like radioimmunoassay (RIA), chromatography, and ELISA are used to determine cortisol levels [7].

In recent years, salivary cortisol detection has become popular for developing stress monitoring systems because saliva is a convenient biofluid to collect. There is strong evidence in the literature of a correlation between blood and salivary cortisol concentrations, making saliva an important medium to consider [37, 39]. An important advantage of salivary cortisol is that it is entirely in the free state. The process of collecting saliva samples for measuring cortisol is non-invasive and painless, and a standard operating procedure has been established for saliva collection, which reduces the variability of measurements. While there are several advantages to using salivary cortisol, there are also some drawbacks. The diurnal cycle results in nominal cortisol values ranging from 0.05 µg/dL to 0.5 µg/dL in saliva [37, 40], so a highly sensitive assay is required to detect these low cortisol ranges. Moreover, salivary cortisol is highly unstable at room temperature, which makes it difficult to store during on-site sampling and processing. The concentration of salivary cortisol is highest in the morning, ranging from 3.6 nmol/L to 8.3 nmol/L, and drops to between 2.1 nmol/L and 2.95 nmol/L late at night [37, 41].
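To relate the two unit systems quoted above, the µg/dL and nmol/L figures can be interconverted using cortisol’s molecular weight of 362.46 g/mol (Section 3.5). A minimal Python sketch of the conversion (our illustration, not part of the cited works):

    # Convert salivary cortisol concentrations between ug/dL and nmol/L
    # using cortisol's molecular weight of 362.46 g/mol (Section 3.5).
    CORTISOL_MW = 362.46  # g/mol

    def ug_per_dl_to_nmol_per_l(c: float) -> float:
        # 1 ug/dL = 10 ug/L; (ug/L) / (g/mol) = umol/L; x1000 -> nmol/L
        return c * 10.0 / CORTISOL_MW * 1000.0

    def nmol_per_l_to_ug_per_dl(c: float) -> float:
        return c / 1000.0 * CORTISOL_MW / 10.0

    # The 0.05-0.5 ug/dL diurnal range maps to roughly 1.4-13.8 nmol/L,
    # which brackets the 3.6-8.3 nmol/L morning peak quoted above.
    print(ug_per_dl_to_nmol_per_l(0.05))  # ~1.38
    print(ug_per_dl_to_nmol_per_l(0.5))   # ~13.79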

3.8.2. Using urine

Cortisol levels measured in urine, known as 24-hour urinary free cortisol (UFC), reflect the free and active cortisol present in the body, with normal ranges between 36 µg/24 h and 137 µg/24 h [42]. Although the 24-hour urine collection for UFC testing is non-invasive and painless, it has practical issues regarding convenience and reliability. Patients must carry a special container and be relatively stationary for 24 hours, with the container stored in a refrigerator until lab delivery. Additionally, creatinine levels must be measured to verify complete collection. Factors like pregnancy and certain medications (ketoconazole, adrenalux, and metyrapone) can alter cortisol concentrations. These requirements make UFC measurement unsuitable for real-time cortisol detection in point-of-care settings [37].

3.8.3. Using sweat

Cortisol can be detected using an electrochemical sensor functionalized with a pseudoknot-assisted aptamer and paired with a flexible microfluidic sweat sampling system. The microfluidic sampler, worn on the skin, enables rapid collection of sweat while separating old and new sweat. The conformation-switching aptamer provides high specificity towards cortisol and can be regenerated, allowing continuous monitoring of temporal changes [37, 43].

3.9. Detection mechanism

Detecting cortisol levels is essential for comprehending stress responses and managing various health conditions. Two primary methods are used for this purpose: microwave detection and IR radiation analysis, specifically NIRS. Both of these methods offer unique advantages and challenges that determine their suitability for different applications and settings.

3.9.1. Microwave detection

Microwave detection is the process of identifying and evaluating different materials or objects using microwave radiation. This technology depends on the way that microwaves interact with the target material, which can provide important details about its composition or attributes. Non-invasive detection of glucose levels has been achieved by measuring the dielectric properties of a sample at microwave frequencies; a correlation between glucose levels and the measured dielectric properties was developed using Cole-Cole polynomial equations [44].
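For context, the Cole-Cole model referred to here expresses the complex relative permittivity of a material as a function of angular frequency ω. A standard single-pole form (a textbook formulation given for illustration; the cited studies may fit multi-pole variants) is

    \[ \varepsilon^{*}(\omega) = \varepsilon_{\infty} + \frac{\varepsilon_{s} - \varepsilon_{\infty}}{1 + (j\omega\tau)^{1-\alpha}} + \frac{\sigma_{s}}{j\omega\varepsilon_{0}} \]

where ε_s and ε_∞ are the static and infinite-frequency permittivities, τ is the relaxation time, α characterizes the broadening of the relaxation, and σ_s is the static ionic conductivity. Fitting these parameters to dielectric spectra measured at different analyte concentrations yields the concentration-to-permittivity correlation described above.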

In microwave detection, an emitter typically sends microwave radiation toward the target substance or object. The interaction of the microwaves with the target produces characteristic changes or responses, which a detector picks up. By measuring these changes, the target’s properties can be analyzed and interpreted.

With benefits including rapid analysis, non-destructive testing, and comparatively low sensitivity to environmental influences, microwave detection is appropriate for a wide range of applications. However, even with power-efficient components, continuous operating expenses may arise, and miniaturization and specialized components may raise the initial cost. Furthermore, reliability depends on the calibration remaining accurate in the face of environmental changes. Miniaturization brings obstacles of its own, such as reduced analytical capability and connectivity, which could rule out some applications. Nevertheless, if these challenges are overcome, small, lightweight designs are possible, especially for wearable electronics.

3.9.2. Near-infrared spectroscopy

Near-infrared spectroscopy is an optical method that studies scattered, transmitted, or reflected light from an illuminated surface. NIR waves occupy the 700–2500 nm region of the electromagnetic spectrum. NIRS has various applications in fields like medicine, pharmaceutics, food analysis, quality control of chemical products, material sciences, astronomy, and agriculture [45].

NIRS is a scientific technique that uses the near-infrared region of the electromagnetic spectrum to analyze how light interacts with matter. It accurately measures the absorption of specific wavelengths corresponding to molecular vibrations of C-H, N-H, and O-H bonds, providing invaluable insights into the chemical composition of samples. NIRS emits near-infrared light onto a sample and precisely measures the intensity of light that is transmitted, absorbed, or scattered by the sample. Researchers can determine the chemical composition of the sample by analyzing the absorption of light at specific wavelengths, enabling them to gain a comprehensive understanding of the properties of the sample.
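The quantity typically derived from these intensity measurements is the absorbance, which the Beer-Lambert law relates to the incident and transmitted intensities. A minimal Python sketch of the general principle (our illustration, not the apparatus used in this work):

    import math

    def absorbance(transmitted: float, incident: float) -> float:
        # Beer-Lambert absorbance A = -log10(I / I0), where I is the
        # intensity after the sample and I0 the intensity without it.
        return -math.log10(transmitted / incident)

    # Example: a sample transmitting 60% of the incident NIR light has an
    # absorbance of about 0.22; higher analyte concentration generally
    # raises absorbance at the analyte's characteristic wavelengths.
    print(absorbance(0.60, 1.00))  # ~0.2218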

NIRS has several advantages that make it a popular scientific technique. It is non-destructive, meaning that it can analyze samples without destroying them, and it is versatile, meaning that it can be used in various fields. NIRS can also be miniaturized, enabling the creation of compact and portable devices, and it can be connected to different data platforms. These benefits make NIRS an excellent choice for chemical composition analysis, as it provides real-time or near-real-time analysis for timely interventions and low operational costs due to energy-efficient designs.

However, there are also some challenges associated with NIRS. For example, there are concerns about the reliability of sensor quality and calibration, and the limited penetration depth restricts its applicability. In addition, data interpretation can be complex, requiring expertise to understand the results. Finally, NIRS is sensitive to environmental factors such as temperature and ambient light, requiring careful control measures.

3.10. Near-infrared spectroscopy: the optimal method for cortisol detection

NIRS is a superior method for detecting cortisol levels when compared to microwave detection. NIRS is a non-destructive technique that allows for the analysis of samples without damaging or altering them. This is particularly important for sensitive biological samples such as cortisol. Additionally, NIRS is highly sensitive and specific in analyzing molecular vibrations, making it ideal for identifying specific chemical bonds associated with cortisol molecules.

NIRS devices can be miniaturized, enabling the development of compact and portable devices for point-of-care testing or wearable applications. This is particularly advantageous for continuous monitoring of cortisol levels in real-time. Lastly, NIRS offers the potential for real-time or near-real-time analysis, allowing for timely interventions in managing stress-related conditions.

Overall, the versatility, non-destructiveness, sensitivity, and potential for miniaturization make NIRS the optimal method for cortisol detection.

3.11. Block diagram

Figure 3 shows the block diagram representation of the suggested prototype. The hardware implementation uses components like an Arduino Uno microcontroller, an IR LED, and an IR sensor. The dashboard for the prototype, where all the data is displayed in a user-friendly manner, is built using the Django framework. SQLite is used to build the database.
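As an illustration of how the dashboard’s persistence layer could be organized (the paper does not specify its database schema, so the model and field names below are hypothetical), a minimal Django model stored in the default SQLite backend might look like this:

    # models.py -- hypothetical schema for storing prototype readings;
    # Django persists this in SQLite, its default database backend.
    from django.db import models

    class CortisolReading(models.Model):
        taken_at = models.DateTimeField(auto_now_add=True)   # capture timestamp
        normalized_voltage = models.FloatField()             # sample voltage / empty-box voltage
        stress_level = models.FloatField()                   # complement of the normalized voltage
        predicted_at_risk = models.BooleanField(default=False)  # verdict from the prediction algorithm

        class Meta:
            ordering = ["-taken_at"]  # newest readings first on the dashboard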


FIGURE 3: Block level representation of the prototype.

4. Implementation

Our project centers around predicting viral infections based on stress levels using a novel device and prediction algorithms. At the heart of our implementation is a custom-designed rectangular box measuring 4 × 3 × 3 cm. This box features a sliding tray in the middle where samples, specifically synthetic cortisol in our case, are placed. An IR LED is positioned above the tray to emit radiation, which interacts with the cortisol sample. The interaction results in a change in light properties, with the remaining light being captured by a photodiode, generating a small current. Leveraging an Arduino microcontroller, we control the excitation of the IR LED and record the photodiode readings corresponding to various cortisol concentrations.

Once the readings are obtained, they serve as inputs to a prediction algorithm model. This model is pivotal in predicting whether an individual is experiencing high levels of stress, as stress weakens the immune system and renders individuals more susceptible to viral infections. Our Django-based dashboard plays a crucial role in visualizing these predictions. We have integrated the system by implementing code that triggers the IR LED and collects the photodiode readings. Before every trial, readings are first taken for the empty box so that environmental bias can be removed; once this initial calibration is done, the user is prompted to insert the sample. Subsequently, the system collects readings for a specified duration before halting. This data feeds into the machine learning model, and the resulting predictions are displayed on a webpage accessible via a link generated by the IDE upon command execution. The dashboard provides users with insights into their susceptibility to viral infections based on stress levels. This implementation combines hardware and software components and harnesses machine learning and web development to offer a comprehensive solution for predicting and visualizing stress-based viral infection susceptibility (Figures 4-6).
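A condensed sketch of this host-side acquisition sequence is shown below. It assumes the pyserial package and a simple, hypothetical serial protocol in which the Arduino streams one photodiode voltage per line; the port name and capture window are placeholders:

    import time
    import serial  # pyserial

    PORT, BAUD = "COM3", 9600   # placeholder port settings
    CAPTURE_SECONDS = 10        # assumed capture window per run

    def read_voltages(link: serial.Serial, seconds: float) -> list[float]:
        # Collect one voltage reading per serial line for `seconds` seconds.
        readings, deadline = [], time.time() + seconds
        while time.time() < deadline:
            line = link.readline().decode(errors="ignore").strip()
            if line:
                readings.append(float(line))
        return readings

    with serial.Serial(PORT, BAUD, timeout=1) as link:
        # 1) Calibrate: capture the empty box to record the environmental baseline.
        input("Ensure the tray is empty, then press Enter...")
        baseline = read_voltages(link, CAPTURE_SECONDS)

        # 2) Prompt for the sample and capture the measurement run.
        input("Insert the sample and press Enter...")
        sample = read_voltages(link, CAPTURE_SECONDS)

        # 3) Normalize against the baseline to remove environmental bias.
        normalized = (sum(sample) / len(sample)) / (sum(baseline) / len(baseline))
        print(f"Normalized voltage: {normalized:.4f}")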

Figure 7 shows the flow diagram of the program that inserts data. The program starts by opening the COM port to receive serialized input from the Arduino, then processes the readings by normalizing them and removing the environmental bias of that setup. After the cortisol values have been normalized, the relative stress levels are calculated by taking the complement. Finally, the prediction is made based on the stress values; the prediction algorithm takes into consideration the stress levels of the previous two days to give a verdict.
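A minimal sketch of the decision step in Figure 7 follows. The paper does not state the stress cutoff or the exact verdict rule, so the threshold and the three-consecutive-days condition below are assumptions:

    STRESS_THRESHOLD = 0.5  # assumed cutoff; not specified in the paper

    def stress_from_voltage(normalized_voltage: float) -> float:
        # Stress is taken as the complement of the normalized voltage,
        # since the measured voltage falls as cortisol concentration rises.
        return 1.0 - normalized_voltage

    def predict_at_risk(today: float, history: list[float]) -> bool:
        # Flag susceptibility when today's stress and the stress levels of
        # the previous two days all exceed the threshold (assumed rule).
        last_three = history[-2:] + [today]
        return len(last_three) == 3 and all(s > STRESS_THRESHOLD for s in last_three)

    # Example: three consecutive high-stress days trigger a warning.
    print(predict_at_risk(0.72, [0.61, 0.68]))  # True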


FIGURE 4: Hardware implementation.


FIGURE 5: Hardware connections.

FIGURE 6: Isolated environment for cortisol monitoring using IR detection.

FIGURE 7: Data flow diagram.

5. Results

Table 1 shows the values of voltages that were obtained at the current-to-voltage converter. The current flowing through the negative end of the photodiode was converted to a voltage before normalizing it. The values were normalized by first measuring the voltage obtained for the empty box with nothing between the photodiode and IR LED, then measuring the voltage values obtained for the known cortisol concentration, and finally dividing the latter by the former. This helped in removing the environmental bias for the same setting.
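In symbols (our notation), with V_empty the voltage measured for the empty box and V_sample the voltage measured for a known cortisol quantity, the normalization and the stress value used later for Figure 9 are

    \[ V_{\mathrm{norm}} = \frac{V_{\mathrm{sample}}}{V_{\mathrm{empty}}}, \qquad S = 1 - V_{\mathrm{norm}} \]

For example, the first entry of Table 1 (V_norm = 0.718 at 2 mg) corresponds to a stress value of 1 - 0.718 = 0.282.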

TABLE 1: Normalized voltage values captured per iteration for each cortisol quantity.

Iteration/quantity    2 mg      4 mg      6 mg      8 mg
Iteration 1           0.718     0.606     0.547     0.46
Iteration 2           0.7023    0.743     0.5225    0.426
Iteration 3           0.6912    0.594     0.4861    0.477
Iteration 4           0.5248    0.514     0.513     0.4975
Iteration 5           0.5219    0.543     0.516     0.505
Iteration 6           0.666     0.627     0.579     0.5866

Figure 8 shows the plot of cortisol levels, in terms of normalized voltage values, versus the iteration for each concentration of cortisol. Iterations 1, 2, and 3 were done in the same environmental setting, that is, during the nighttime. Iterations 4 and 5 were done during the daytime, with iteration 5 performed on an isolated bench to mitigate vibration interference.

It was observed that, for the same environmental setting, the values obtained for a particular concentration level were consistent to within approximately ±0.02. Some deviations from the expected values were observed, such as the voltage recorded for the 4 mg concentration in iteration 2: ideally it should have been lower than the value obtained for the 2 mg concentration in the same environment, but the observed value was higher.

These variations can be explained by human error and the inaccuracy of the photodiode. However, since the photodiode provided consistent values in the subsequent iterations, it was concluded that the variation was caused by human error.
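This attribution can be checked directly against Table 1 by grouping iterations by environment (iterations 1-3 at night, iterations 4-5 during the day). A short Python sketch (our illustration) that reproduces the per-environment spreads, including the iteration-2 outlier at 4 mg:

    # Within-environment spread of the normalized voltages in Table 1.
    from statistics import mean

    table = {  # quantity (mg) -> normalized voltage per iteration 1..6
        2: [0.718, 0.7023, 0.6912, 0.5248, 0.5219, 0.666],
        4: [0.606, 0.743, 0.594, 0.514, 0.543, 0.627],
        6: [0.547, 0.5225, 0.4861, 0.513, 0.516, 0.579],
        8: [0.46, 0.426, 0.477, 0.4975, 0.505, 0.5866],
    }

    for mg, values in table.items():
        groups = {"night (iters 1-3)": values[0:3], "day (iters 4-5)": values[3:5]}
        for label, group in groups.items():
            spread = (max(group) - min(group)) / 2  # half-range, i.e. +/- spread
            print(f"{mg} mg, {label}: mean={mean(group):.3f} +/-{spread:.3f}")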

Figure 9 shows the plot of cortisol concentration against the calculated stress levels. Since the normalized voltage across the photodiode decreases as the amount of cortisol increases, while stress in humans is positively correlated with cortisol, the stress levels for the prototype model were calculated by taking the complement of the normalized cortisol voltage values. The exact correlation factors may differ in actual human beings, but we have built a proof of concept that can be customized to the true correlation between stress and cortisol.

As expected, the stress levels for the 8 mg of cortisol concentration are the highest, and the stress levels for the 2 mg of cortisol concentration are the lowest.

As all of the operation is controlled by the program that inserts data, all output values are sent to that program, which performs the normalization and calibration and then computes the prediction that is displayed on the dashboard (Figure 10).


FIGURE 8: Plot of cortisol levels and output voltage across the photodiode.

FIGURE 9: Plot of cortisol amount and stress levels.

FIGURE 10: Dashboard.

6. Future Scope

Expanding on our project’s trajectory, the future holds the development of a wearable device akin to a smartwatch, aimed at continuously monitoring stress levels. This device represents a significant advancement, offering personalized stress tracking and real-time alerts to users. The device’s initial phase involves a calibration period lasting one or two days, during which it acclimates to the user’s unique physiological responses. This calibration process enables the device to adapt its stress detection algorithms to the individual, enhancing accuracy and reliability. Once calibrated, the wearable device operates autonomously, monitoring stress levels through sweat analysis. By leveraging advanced sensors and algorithms, it detects subtle changes in sweat composition associated with stress. Upon detecting elevated stress levels, the device promptly alerts the user, enabling timely intervention and stress management strategies. By embracing wearable technology and cutting-edge data analytics, our project envisions a sophisticated stress monitoring solution that empowers individuals to proactively manage their well-being and lead healthier, more resilient lives.

7. Conclusion

In conclusion, our project introduces a pioneering method for predicting viral infections by monitoring stress levels, utilizing an integrative approach that combines hardware development, software engineering, and algorithmic analysis. The core innovation lies in the creation of a bespoke device designed to measure cortisol levels, which serve as a key indicator of stress. This device employs advanced IR sensing technology with IR LEDs and photodiodes to detect cortisol concentrations non-invasively.

To complement the hardware, we have developed a Django-based dashboard for real-time data visualization and analysis. This user-friendly interface provides individuals with valuable insights into their stress levels and potential susceptibility to infections. The integration of advanced algorithms further enhances the predictive capabilities of the system, analyzing patterns in cortisol fluctuations to identify periods of heightened risk for viral infections.

However, due to the current unavailability of lab-tested databases, the validation of our predictive model remains incomplete. This limitation underscores the need for future work to secure comprehensive datasets that can validate and refine the model’s accuracy.

Looking ahead, our vision includes the development of a wearable stress-monitoring device. This future iteration aims to offer personalized tracking and real-time alerts, significantly enhancing user convenience and engagement. The wearable device will continuously monitor cortisol levels, providing users with immediate feedback on their stress status and potential health risks.

With ongoing improvements in both accuracy and functionality, our project aspires to empower individuals to take proactive steps in managing their health and well-being. By offering a reliable early warning system for viral infections based on stress monitoring, we are paving the way for a new era of proactive healthcare solutions. This advancement promises to reduce the incidence of stress-related health issues and improve overall public health outcomes through timely interventions and personalized health management strategies.

References

  1. D. M. Morens, et al., “The origin of COVID-19 and why it matters,” The American Journal of Tropical Medicine and Hygiene, vol. 103, no. 3, pp. 955-959, 2020.
  2. T. J. O’Shea, et al., “Bat flight and zoonotic viruses,” Emerging Infectious Diseases, vol. 20, no. 5, pp. 741-745, May 2014.
  3. S. Subudhi, N. Rapin, and V. Misra, “Immune system modulation and viral persistence in bats: understanding viral spillover,” Viruses, vol. 11, no. 2, pp. 192, Feb 2019.
  4. T. B. Herbert and S. Cohen, “Stress and immunity in humans: a meta-analytic review,” Biopsychosocial Science and Medicine, vol. 55, no. 4, pp. 364-379, July 1993.
  5. A. Sancini, et al., “Work related stress and blood glucose levels,” Annali di Igiene, Medicina Preventiva e di Comunità, vol. 29, no. 2, pp. 123–133, 2017.
  6. D. Bozovic, M. Racic, and N. Ivkovic, “Salivary cortisol levels as a biological marker of stress reaction,” Journal of Academy of Medical Science of Bosnia and Herzegovina, vol. 67, no. 5, pp. 374-377, 2013.
  7. N. Dhull, G. Kaur, V. Gupta, and M. Tomar, “Highly sensitive and non-invasive electrochemical immunosensor for salivary cortisol detection,” Sensors and Actuators B: Chemical, vol. 293, pp. 281-288, 2019.
  8. T. Yilmaz, R. Foster, and Y. Hao, “Radio-frequency and microwave techniques for non-invasive measurement of blood glucose levels,” Diagnostics, vol. 9, no. 1, pp. 6, Jan 2019.
  9. X. Jintao, Y. Liming, L. Yufei, L. Chunyan, and C. Han, “Noninvasive and fast measurement of blood glucose in vivo by near-infrared (NIR) spectroscopy,” Spectrochimica Acta Part A: Molecular and Biomolecular Spectroscopy, vol. 179, pp. 250-254, May 2017.
  10. S. Pathak, “Biomimicry: (innovation inspired by nature),” International Journal of New Technology and Research, vol. 5, no. 6, pp. 34-38, June 2019.
  11. S. D. Salvo, “Biomimicry in architecture (book by Michael Pawlyn),” FORUM A+P Interdisciplinary Journal of Architecture and Built Environment, vol. 26, Jan 2023.
  12. I. Agnarsson, A. Dhinojwala, V. Sahni, and T. A. Blackledge, “Spider silk as a novel high performance biomimetic muscle driven by humidity,” Journal of Experimental Biology, vol. 212, no. 13, pp. 1990-1994, 2009.
  13. E. L. Luke, “Product and technology innovation: What can biomimicry inspire?” Biotechnology Advances, vol. 32, no. 8, pp. 1494–1505, Dec 2014.
  14. D. Hanadhita, A. S. Satyaningtijas, and S. A. Priyono, “Bats as a viral reservoir: A short review of their ecological characters and immune system,” in Proceedings of the 1st International Conference in One Health (ICOH 2017), Indonesia, 2017, pp. 124-128.
  15. L. F. Wang, Z. Shi, S. Zhang, H. Field, P. Daszak, and B. T. Eaton, “Review of bats and SARS,” Emerging Infectious Diseases, vol. 12, no. 12, pp. 1834-1840, Dec 2006.
  16. H. A. Mohd, J. A. A. Tawfiq, and Z. A. Memish, “Middle East respiratory syndrome coronavirus (MERS-CoV) origin and animal reservoir,” Virology Journal, vol. 13, no. 87, pp. 1-7, 2016.
  17. I. Smith and L. F. Wang, “Bats and their virome: an important source of emerging viruses capable of infecting humans,” Current Opinion in Virology, vol. 3, no. 1, pp. 84-91, Feb 2013.
  18. C. H. Calisher, J. E. Childs, H. E. Field, K. V. Holmes and T. Schountz, “Bats: important reservoir hosts of emerging viruses,” Clinical Microbiology Reviews, vol. 19, no. 3, pp. 531-545, July 2006.
  19. A. D. Luis, et al., “A comparison of bats and rodents as reservoirs of zoonotic viruses: are bats special?” Proceedings of the Royal Society B: Biological Sciences, vol. 280, no. 1756, pp. 20122753, Apr 2013.
  20. L. F. Wang and C. Cowled, Bats and viruses: a new frontier of emerging infectious diseases, John Wiley & Sons, June 2015.
  21. D. S. Goldstein and B. McEwen, “Allostasis, homeostats, and the nature of stress,” Stress, vol. 5, no. 1, pp. 55-58, Feb 2002.
  22. R. M. Sapolsky, “The influence of social hierarchy on primate health,” Science, vol. 308, no. 5722, pp. 648-652, Apr 2005.
  23. F. S. Dhabhar and B. S. Mcewen, “Acute stress enhances while chronic stress suppresses cell-mediated immunity in vivo: A potential role for leukocyte trafficking,” Brain, Behavior, and Immunity, vol. 11, no. 4, pp. 286-306, Dec 1997.
  24. B. S. McEwen, “Protective and damaging effects of stress mediators,” The New England Journal of Medicine, vol. 338, no. 3, pp. 171-179, 1998.
  25. M. Irwin, et al., “Reduction of immune function in life stress and depression,” Biological Psychiatry, vol. 27, no. 1, pp. 22-30, Jan 1990.
  26. G. P. Chrousos and T. Kino, “Glucocorticoid action networks and complex psychiatric and/or somatic disorders,” Stress, vol. 10, no. 2, pp. 213-219, June 2007.
  27. F. S. Dhabhar and K. Viswanathan, “Short-term stress experienced at time of immunization induces a long-lasting increase in immunologic memory,” American Journal of Physiology-Regulatory, Integrative and Comparative Physiology, vol. 289, no. 33, pp. R738–R744, Sep 2005.
  28. F. S. Dhabhar and B. S. Mcewen, “Bi-directional effects of stress on immune function: possible explanations for salubrious as well as harmful effects,” in Psychoneuroimmunology, 4th ed., Academic Press, 2007, pp. 723-760.
  29. S. Sephton and D. Spiegel, “Circadian disruption in cancer: a neuroendocrine-immune pathway from stress to disease?” Brain, Behavior, and Immunity, vol. 17, no. 5, pp. 321-328, Oct 2003.
  30. A. N. Saul, et al., “Chronic stress and susceptibility to skin cancer,” Journal of the National Cancer Institute, vol. 97, no. 23, pp. 1760–1767, Dec 2005.
  31. F. S. Dhabhar, “Effects of stress on immune function: the good, the bad, and the beautiful,” Immunologic Research, vol. 58, no. 2-3, pp. 193-210, May 2014.
  32. J. H. Lee, Y. Hwang, K. A. Cheon, and H. I. Jung, “Emotion-on-a-chip (EOC): Evolution of biochip technology to measure human emotion using body fluids,” Medical Hypotheses, vol. 79, no. 6, pp. 827-832, Dec 2012.
  33. M. Debono, et al., “Modified-Release Hydrocortisone to Provide Circadian Cortisol Profiles,” The Journal of Clinical Endocrinology & Metabolism, vol. 94, no. 5, pp. 1548-1554, May 2009.
  34. R. Gatti, G. Antonelli, M. Prearo, P. Spinella, E. Cappellin, E. F. D. Palo, “Cortisol assays and diagnostic laboratory procedures in human biological fluids,” Clinical Biochemistry, vol. 42, no. 12, pp. 1205-1217, Aug 2009.
  35. E. R. D. Kloet, M. Joëls and F. Holsboer, “Stress and the brain: from adaptation to disease,” Nature Reviews Neuroscience, vol. 6, no. 6, pp. 463-475, June 2005.
  36. F. Holsboer and M. Ising, “Stress hormone regulation: biological role and translation into therapy,” Annual Review of Psychology, vol. 61, pp. 81-109, 2010.
  37. T. Iqbal, A. Elahi, W. Wijns, and A. Shahzad, “Cortisol detection methods for stress monitoring in connected health,” Health Sciences Review, vol. 6, pp. 100079, Mar 2023.
  38. M. Frasconi, M. Mazzarino, F. Botrè, and F. Mazzei, “Surface plasmon resonance immunosensor for cortisol and cortisone determination,” Analytical and Bioanalytical Chemistry, vol. 394, no. 8, pp. 2151-2159, Aug 2009.
  39. M. D. VanBruggen, A. C. Hackney, R. G. McMurray, and K. S. Ondrak, “The relationship between serum and salivary cortisol levels in response to different intensities of exercise,” International Journal of Sports Physiology and Performance, vol. 6, no. 3, pp. 396-407, Sep 2011.
  40. J. Bakusic, S. D. Nys, M. Creta, L. Godderis and R. C. Duca, “Study of temporal variability of salivary cortisol and cortisone by lc-ms/ms using a new atmospheric pressure ionization source,” Scientific Reports, vol. 9, no. 19313, 2019.
  41. N. E. Farhan, D. A. Rees, and C. Evans, “Measuring cortisol in serum, urine and saliva–are our assays good enough?” Annals of Clinical Biochemistry, vol. 54, no. 3, pp. 308-322, May 2017.
  42. A. Ghemigian, “Cushing’s disease–same condition, different scenarios,” Archives of the Balkan Medical Union, vol. 53, no. 1, pp. 135-139, Mar 2018.
  43. T. Yilmaz, R. Foster, and Y. Hao, “Radio-frequency and microwave techniques for non-invasive measurement of blood glucose levels,” Diagnostics, vol. 9, no. 1, pp. 6, Jan 2019.
  44. A. Sakudo, “Near-infrared spectroscopy for medical applications: Current status and future perspectives,” Clinica Chimica Acta, vol. 455, pp. 181-188, Apr 2016.
  45. D. Cozzolino, “The ability of near infrared (NIR) spectroscopy to predict functional properties in foods: Challenges and opportunities,” Molecules, vol. 26, no. 22, pp. 6981, Nov 2021.