Research Article
Open Access
Office and Ambulatory Blood Pressure in Obese and Abdominally Obese Hypertensive Patients
Pages 1 - 10

Research Article
Open Access
QRS complex findings in patients following out-of-hospital cardiac arrest with particular focus on their coronary status
Pages 11 - 20

Abstract
Background: There is still a lack of knowledge about the clinical relevance of electrocardiographic findings in patients following out-of-hospital cardiac arrest (OHCA). Methods: All victims of OHCA admitted to our hospital between January 1st 2008 and December 31st 2013 were identified, and their QRS complexes were analyzed for QRS duration and QRS morphology, the latter measured with the simplified Selvester Score. Results: A total of 147 of 204 OHCA patients were included in our study, of whom 76 received coronary angiography. The first 12-lead ECG showed a mean QRS duration of 108.0 ± 22.1 ms and a mean simplified Selvester Score of 4.3 ± 3.5 points. Among patients whose OHCA was due to an initially shockable rhythm, QRS complexes were significantly wider in those discharged alive (114.0 ± 23.8 ms) than in those who died in-hospital (98.9 ± 18.1 ms) (p=0.016), and patients who survived until the follow-up examination showed a significant reduction in QRS duration (p=0.001), whereas the simplified Selvester Score showed no such change. Subgroup analyses revealed that this reduction in QRS duration was most pronounced in patients with coronary artery disease (CAD) who received percutaneous coronary intervention (PCI). Conclusion: Neither QRS duration nor QRS morphology can reliably predict the prognosis of all patients following OHCA. However, since QRS duration decreases over time, especially in patients with CAD who receive PCI, standardized QRS monitoring could be a useful tool for following the hemodynamics of patients after OHCA.
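As an editorial aside for readers reproducing the group comparison above (QRS 114.0 ± 23.8 ms in survivors vs 98.9 ± 18.1 ms in non-survivors), a t statistic can be recovered from the reported summary statistics alone. A minimal Python sketch of Welch's t; the group sizes are illustrative assumptions, since the abstract does not report them:

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic for two independent groups, computed from
    summary statistics (mean, SD, n) rather than raw data."""
    se = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)  # standard error of the difference
    return (m1 - m2) / se

# Means and SDs from the abstract; n=30 per group is a hypothetical choice.
t = welch_t(114.0, 23.8, 30, 98.9, 18.1, 30)
```

A p-value would additionally require the Welch-Satterthwaite degrees of freedom and the t distribution; the sketch stops at the statistic itself.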
Research Article
Open Access
Proximate Assessment of Physicochemical and Microbial Parameters of Five Different Bottled Waters in Kano
Pages 9 - 15

Abstract
The study sampled five bottled water brands consumed in Kano: Sona, Santana, Aquafina, Eva and Swan (popularly consumed in eateries, suya spots and motor garages, among other places, in Kano). Analysis was done using standard methods, and the results obtained for the physicochemical and microbial assays were below WHO guideline limits, indicating the suitability of the bottled water analyzed with respect to the aforementioned parameters. Hence the popular belief that bottled water is safe for consumption is supported; nevertheless, thorough monitoring surveys are needed to sustain quality and safety.
Research Article
Open Access
Eclampsia – Present Scenario in a Teaching Hospital – A Two-Year Study
Pages 65 - 69

Abstract
Introduction: Eclampsia has been recognized as a clinical entity since the time of Hippocrates and has been a nightmare to healthcare providers ever since. It is defined as the occurrence of generalised convulsions associated with preeclampsia during pregnancy, labour or within 7 days of delivery, not caused by epilepsy or other convulsive disorders. The incidence of eclampsia has often been viewed as an index of civilization in a country. There is low utilization of both antenatal and intrapartum care services, and patients may present to the hospital only as a last resort. Materials and Methods: This prospective study was carried out in the Department of Obstetrics and Gynecology of a tertiary care teaching hospital over a period of 2 years. All patients presenting with eclampsia during the said period were recruited into the study; eclamptics are usually admitted directly into the labour ward. Patients diagnosed with other causes of convulsions in pregnancy, such as cerebral malaria and epilepsy, were excluded. A total of 821 pregnant mothers with eclampsia admitted to the inpatient department were recruited, irrespective of their previous antenatal check-up history. Results: The majority (66%) of the patients had between 2 and 5 episodes of convulsion. The minimum was 1 episode, seen in 13% of patients; the maximum was 40. Of the 66 patients who had more than 10 convulsions, 30 had not received any treatment prior to referral, while no patient had more than 10 convulsions after receiving the loading dose of MgSO4. For patients having fewer than 5 convulsions, the number who had received only the IM dose of MgSO4 was about 1.5 times the number who had received the loading dose (228/154 = 1.48). In 29% of patients, hypertension was controlled by delivery alone; those who failed to achieve blood pressure control by delivery alone were administered Calcigard (nifedipine). Conclusion: Eclampsia was noted to be more common among young primigravida patients. The importance of this finding is that this group of patients deserves extra surveillance during antenatal care, with monitoring of blood pressure and urine screening for proteinuria to detect pre-eclampsia. It is hoped that such interventions will have a positive impact on maternal and child care. However, all this will go in vain unless healthcare providers at the grassroots level are sensitised regarding the early diagnosis of pre-eclampsia and the prompt, appropriate initiation of treatment.
Research Article
Open Access
Accuracy of a Mobile 12-Lead ECG Device for Assessment of QTc Interval in Arrhythmia Patients: A Prospective and Retrospective Validation Study
Pages 206 - 214

Abstract
Background: Ambulatory assessment of the heart rate–corrected QT interval (QTc) in arrhythmia patients can be of diagnostic value when these patients are on QTc-prolonging medication. Repeating sequential 12-lead electrocardiograms (ECGs) to monitor the QTc is cumbersome, but the Spandan smartphone ECG device can potentially solve this problem. Objective: The objective of this prospective and retrospective, cross-sectional, within-patient diagnostic validation study was to validate the measurement of the QTc interval on the Spandan 12-lead ECG and to assess the accuracy of the 12-lead Spandan smartphone ECG device in measuring QTc intervals in the general cardiology outpatient population with normal ECGs and arrhythmias. Materials and Methods: This single-center study was carried out at Shri Mahant Indresh Hospital (SMIH), Dehradun, Uttarakhand, India from August 2022 to October 2022. All patients (n=1168) visiting the electrocardiogram (ECG) room at the Department of Cardiology of SMIH during the study period were enrolled after the purpose of the study was explained and written consent obtained. Results: Mean (SD) age was 54.36±4.9 years. Males (n=783, 67.03%) outnumbered females. Primary coronary intervention was noted in 426 (36.4%) of the study population. All four parameters showed a positive Pearson correlation between the 12-lead standard ECG and the Spandan smartphone ECG. The maximum mean difference between the two devices was noted for the QTc parameter in the overall participants. Conclusion: The 12-lead Spandan smartphone ECG allows QTc assessment with good accuracy and can be used safely in ambulatory QTc monitoring. This may improve patient satisfaction and reduce healthcare costs.
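For context, the heart-rate correction referenced above is most commonly done with Bazett's formula, QTc = QT/√RR. The abstract does not state which correction the Spandan device applies, so the Python sketch below is a generic illustration of the formula, not the device's documented method:

```python
import math

def qtc_bazett(qt_ms, heart_rate_bpm):
    """Heart-rate-corrected QT interval (Bazett's formula).
    QTc = QT / sqrt(RR), with QT in milliseconds and RR in seconds."""
    rr_s = 60.0 / heart_rate_bpm  # RR interval in seconds
    return qt_ms / math.sqrt(rr_s)

# At 60 bpm the RR interval is 1 s, so the correction is neutral:
# QTc equals the measured QT.
```

Other corrections (Fridericia, Framingham, Hodges) substitute different functions of RR; the choice matters at heart rates far from 60 bpm.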
Research Article
Open Access
Comparative study of safety and efficacy of Programmed Labour against natural progression of labour in primigravida women at a tertiary hospital
Pages 522 - 527

Abstract
Background: The programmed labour protocol was developed on the principles of ensuring adequate uterine contractions, providing optimum pain relief, and close clinical monitoring of labour events. The present study was conducted to evaluate the safety and efficacy of the programmed labour protocol in a study group as against spontaneous progression of labour in primigravida patients. Material and Methods: The present study was a hospital-based randomized prospective clinical study conducted in primigravidae at term with cephalic presentation, adequate liquor and no high-risk factors, in the active phase of the first stage of labour (cervical dilatation ≥ 3 cm, ≥ 80% effacement) with intact membranes and a reactive stress test. 200 primigravidas were alternately allocated into two groups: a study group (100 women who received the programmed labour protocol) and a control group (100 women observed expectantly who underwent spontaneous labour). Results: The mean age of patients was 23.13 ± 2.46 years in the study group and 23.74 ± 2.58 years in the control group. The period of gestation was 38.87 ± 1.00 weeks in the study group and 38.74 ± 1.12 weeks in the control group. We compared various labour-related parameters between the study and control groups: duration of the active phase of labour (hours), rate of cervical dilatation (cm/hr), duration of the 2nd stage of labour (mins), duration of the 3rd stage of labour (mins), total duration of labour (mins) and average blood loss (ml). All of the above parameters were favourable in the study group, and the difference was highly statistically significant (p<0.001). The perceived degree of pain relief also differed highly significantly (p<0.001), i.e. pain relief was much greater among patients of the study group than in the control group. The difference in degree of maternal satisfaction between the study and control groups was statistically significant (p<0.001).
Conclusion: Programmed labour is safe and effective, providing labour analgesia, facilitating cervical dilatation and shortening the duration of labour with good maternal and fetal outcomes.
Research Article
Open Access
A Study on the Prevalence of Peripheral Neuropathy among Known Type 2 Diabetic Patients in an Urban Population of Chidambaram
Pages 584 - 588

Abstract
Diabetes is a public health problem, and its prevalence is progressively on the rise; the International Diabetes Federation estimates that the prevalence of diabetes mellitus will double by 2035 from the 541 million recorded in 2022. Objectives: To find out the prevalence of peripheral neuropathy among known type 2 diabetics and to correlate peripheral neuropathy with selected socio-demographic variables. Materials and Methods: A descriptive, cross-sectional, community-based study was done among known type 2 diabetes mellitus individuals aged 30 years and above. The study was carried out over a period of 10 months after approval from the institutional ethics committee. Peripheral neuropathy was classified using the Toronto Clinical Scoring System. Data collected were entered into a Microsoft Excel 2010 spreadsheet, compiled and analyzed using the IBM SPSS Version 22 statistical package. Results: The prevalence of peripheral neuropathy was found to be 12.6% among the study subjects. The frequency of peripheral neuropathy increased with the duration of diabetes, and a significant association was found between duration of diabetes and peripheral neuropathy. A significant association was also found between increased RBS values and peripheral neuropathy. Conclusion: Maintaining proper blood glucose control is the key to primary prevention of diabetes-related complications. Regular monitoring of blood glucose levels must be done for the management of diabetic peripheral neuropathy.
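As a brief illustration of the classification instrument named above, the Toronto Clinical Scoring System yields a total between 0 and 19 (symptom, reflex and sensory components), and published work commonly grades severity by score bands. A hedged Python sketch; the cut-offs below follow commonly cited bands and should be confirmed against the version this study used:

```python
def tcss_severity(score):
    """Grade diabetic peripheral neuropathy from a Toronto Clinical
    Scoring System total (0-19). Cut-offs are the commonly cited bands:
    0-5 none, 6-8 mild, 9-11 moderate, 12-19 severe (an assumption here,
    not taken from the abstract)."""
    if not 0 <= score <= 19:
        raise ValueError("TCSS total must be between 0 and 19")
    if score <= 5:
        return "no neuropathy"
    if score <= 8:
        return "mild"
    if score <= 11:
        return "moderate"
    return "severe"
```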
Research Article
Open Access
Relationship of Cardiotocography and Umbilical Artery Doppler Findings with Perinatal Outcome in Low-Risk Pregnancies with Decreased Fetal Movements
Pages 702 - 720

Abstract
Introduction: Fetal movement tracking may be used to identify deterioration in the fetus's condition. Fetal movement is described as any kick, flutter, swish, or roll perceived by the pregnant woman and is considered evidence of the integrity of the musculoskeletal and central nervous systems. Decreased fetal movement has been linked to poor pregnancy outcomes such as intrauterine growth restriction, fetal death and preterm delivery. Clinical data on the association between decreased fetal movements and perinatal outcome are insufficient. Methodology: Ethical clearance was obtained from Srimanta Sankaradeva University of Health Sciences for this study of decreased fetal movements at Gauhati Medical College and Hospital. Doppler studies were conducted using 3-dimensional ultrasound machines in the ANOPD and departmental indoor USG room, and a 2-dimensional ultrasound machine in the observation room of the Department of Obstetrics & Gynaecology, Gauhati Medical College & Hospital. Patients were placed in the supine position with a left lateral tilt, and umbilical artery waveforms were recorded in the mid position from free-floating loops. The indices noted were the S(systolic)/D(diastolic) ratio, resistance index (RI), pulsatility index (PI), and reversal of blood flow in diastole. CTG monitoring was done in the departmental observation room using a CTG machine (labelled FETAL MONITOR, SN-EATB8L1732, manufacturer BPL, model no. FM 9854). Each selected patient was monitored for a period of 20 minutes with a paper speed of 3 cm/minute during the antepartum or intrapartum period. The following information was noted: baseline FHR, beat-to-beat variability, FHR accelerations, presence of decelerations, and reactivity. Results: A prospective observational study was conducted at Gauhati Medical College & Hospital, Guwahati, Assam over a period of one year. 150 antenatal women at term gestation with decreased fetal movements and no other high-risk conditions were monitored for fetal wellbeing by CTG and Doppler.
Patients were categorised into four groups: Group I (CTG reactive and umbilical artery Doppler normal), Group IIA, Group IIB, and Group III. The findings of each group were compared across different modes of delivery and different parameters of perinatal outcome. Conclusion: Maternal perception of fetal movements is the most widely used technique to evaluate fetal wellbeing. Low-risk pregnancies with decreased fetal movements warrant close antenatal fetal monitoring and appropriate, prompt intervention. A non-reactive CTG alone or in combination with abnormal Doppler results is a better predictor of poor perinatal outcome and can indicate whether neonatal resuscitation will be required. These two tools can be used together for fetal monitoring and timely intervention to improve perinatal outcomes.
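The three waveform indices named in the methodology above have standard definitions in terms of the peak systolic velocity (S), end-diastolic velocity (D) and time-averaged mean velocity of the spectral envelope. A minimal Python sketch of those textbook formulas (the velocity values in the usage comment are illustrative, not study data):

```python
def doppler_indices(psv, edv, tamv):
    """Standard umbilical artery Doppler indices.
    psv: peak systolic velocity (S), edv: end-diastolic velocity (D),
    tamv: time-averaged mean velocity over the cardiac cycle."""
    return {
        "sd_ratio": psv / edv,      # S/D ratio
        "ri": (psv - edv) / psv,    # resistance (Pourcelot) index
        "pi": (psv - edv) / tamv,   # pulsatility index
    }

# Example with made-up velocities of 60, 20 and 35 cm/s.
idx = doppler_indices(60.0, 20.0, 35.0)
```

Absent or reversed end-diastolic flow (D ≤ 0) drives the S/D ratio and PI sharply upward, which is why reversal of diastolic flow is recorded as a separate finding.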
Research Article
Open Access
Effect of Impaired Blood Glucose Level on Cognitive Functions
Pages 966 - 970

Abstract
Introduction: Diabetes mellitus is one of the most common diseases, and its prevalence is on the rise. It is believed that within the next 30 years the number of diabetic patients will double in comparison to the year 2000. Aims: To study the effect of impaired blood glucose levels on cognitive functions. Materials and Methods: The present study was a prospective cross-sectional study conducted from December 2019 to June 2021 at the Department of Physiology, in association with the Department of Medicine, at a tertiary care hospital. Results: In the present study, the reduction in mean MMSE score in type 2 DM subjects with disease duration above 5 years was significant when compared to those with duration less than 5 years, showing that longer duration has an effect on cognitive function. The correlation coefficient between MMSE score and HbA1c was negative in this study, from which it is inferred that an increase in HbA1c levels is associated with a decrease in MMSE scores. Conclusion: Effective control needs a proper diet, regular exercise, self-monitoring of blood glucose and management of medications. A person's cognitive ability to carry out the above-mentioned tasks is thus crucial for self-management of diabetes.
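The negative correlation reported above is a Pearson correlation coefficient. As a small, self-contained Python sketch of how such a coefficient is computed (the HbA1c/MMSE pairs below are invented for illustration and are not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two equal-length
    numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: MMSE tends to fall as HbA1c rises, so r is negative.
hba1c = [6.0, 7.2, 8.1, 9.5, 11.0]
mmse = [29, 27, 26, 23, 20]
r = pearson_r(hba1c, mmse)
```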
Research Article
Open Access
Evaluation of Lipid Profile Changes Before, During, and After Low-Dose Continuous/Intermittent Isotretinoin Therapy for Acne Vulgaris
Pages 1006 - 1009

Abstract
The purpose of the current study was to evaluate the impact of low-dose isotretinoin therapy on changes in total cholesterol, triglycerides, HDL cholesterol, and LDL cholesterol in acne vulgaris patients. Methodology: Fifty patients with moderate to severe acne, aged between 15 and 45 years and attending the dermatology department, were treated with 20 mg of isotretinoin daily for four months. Blood samples were taken on day 0, at the second week, and at the first, second, third, and fourth months. Results: Baseline cholesterol levels in the continuous therapy group were 116.86 ± 23.55, and they rose above baseline at each subsequent interval (4 weeks, 8 weeks, 12 weeks, 16 weeks, and at the end of treatment); the p-value was significant compared to baseline. At all time points there was a statistically significant increase in cholesterol, triglycerides, and LDL compared to baseline and above the normal limit, together with a statistically significant decline in HDL levels. Conclusion: Low-dose continuous isotretinoin therapy brought on an increase in cholesterol, triglycerides, and LDL above the usual range, while HDL values fluctuated mildly (grade 1 increases and decreases). Side effects were mild and well tolerated and did not require discontinuation of therapy, so it is crucial to raise awareness of the implications. We advise that low-dose isotretinoin for moderate to severe acne can be used with little worry, although close monitoring is crucial.
Research Article
Open Access
A Study on the Significance of Computed Tomographic Evaluation of Acute Pancreatitis
Pages 82 - 85

Abstract
Introduction: Pancreatitis is one of the most complex and clinically challenging of all abdominal disorders. USG and abdominal CT are the most commonly used diagnostic imaging modalities for evaluation of the pancreas. Computed tomography (CT) is more accurate and sensitive than USG both in diagnosing the disease and in demonstrating its extent. Early assessment of the cause and severity of acute pancreatitis is of utmost importance for prompt treatment and close monitoring of patients with severe disease. CT is the imaging method of choice for assessing the extent of acute pancreatitis and for evaluating complications. Materials and Methods: This prospective study was conducted in the Department of Radiology at Dr. VRK Women's Medical College, Teaching Hospital and Research Centre, Hyderabad, among 70 cases of acute pancreatitis. All cases of acute pancreatitis referred to the department of radiology that fulfilled the set inclusion criteria and consented to participate were included in the present study. Ethical approval was taken from the college ethics committee. Results: In our study, a total of 70 patients suspected of having acute pancreatitis were studied using CT scan. Among them, 50 (71.5%) were males and 20 (28.5%) were females. Necrosis of the pancreatic gland parenchyma was seen in 17 (24.3%) patients; 12 patients (17.1%) showed <30% necrosis, 8 patients (11.4%) showed 30-50% necrosis, and 10 patients (14.3%) showed more than 50% necrosis. Conclusion: CECT was found to be an excellent imaging modality for diagnosis, for establishing the extent of the disease process and for grading its severity.
The Modified CT Severity Index is a simpler scoring tool and more accurate than the Balthazar CT Severity Index. In this study it had a stronger statistical correlation with clinical outcome, whether length of hospital stay, development of infection, occurrence of organ failure or overall mortality, and it could also predict the need for interventional procedures.
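For orientation, the Modified CT Severity Index referred to above sums three pre-scored components and grades severity by the total. The Python sketch below uses the component weights and grade bands as commonly published for the Mortele modified index; these are stated here as background knowledge, not taken from the abstract, and should be checked against the scoring version the study applied:

```python
def mctsi(inflammation, necrosis, extrapancreatic):
    """Modified CT Severity Index (Mortele). Each argument is the
    pre-scored component: pancreatic inflammation 0/2/4, pancreatic
    necrosis 0/2/4 (none / <=30% / >30%), extrapancreatic
    complications 0/2. Returns (total, severity grade)."""
    if inflammation not in (0, 2, 4) or necrosis not in (0, 2, 4) \
            or extrapancreatic not in (0, 2):
        raise ValueError("invalid component score")
    total = inflammation + necrosis + extrapancreatic
    grade = "mild" if total <= 2 else "moderate" if total <= 6 else "severe"
    return total, grade
```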
Research Article
Open Access
Combination of Novel Diagnostic Biomarkers for Prostate Cancer Prognostication: A Prospective Study
Pages 529 - 533

Abstract
Background: Prostate cancer (PCa) is a major cause of morbidity and mortality among men in the United States and globally. However, many men with prostate cancer have slow-growing tumors and experience an indolent course even without curative therapy. The increasing incidence may be due to increased PSA measurement and other diagnostic efforts; the associated differential diagnosis is not addressed here. The biological heterogeneity that characterizes this disease also causes decision issues unique to prostate cancer: low-grade cancer diagnosed late in life may have no impact on the quality or length of life. Materials and Methods: A cross-sectional study was conducted to test the hypothesis of an association of IFN-γ, IL-6 and PSA with obesity parameters and the severity of prostate cancer. A total of 90 participants were included, and anthropometric examination and hormonal tests were performed simultaneously. Of the 90 participants, 45 were grouped as benign prostatic hyperplasia (BPH) and 45 as PCa. Serum samples of men with suspicion of prostate cancer based on high prostate-specific antigen (PSA) and/or abnormal DRE were withdrawn before biopsy between 8 a.m. and 11 a.m. The serum was separated, aliquoted and kept frozen at -80ºC; serum PSA, IFN-γ and IL-6 levels were estimated using ELISA on the same day. Results: Waist-hip ratio was significantly (p<0.0001) higher in PCa patients (2.9 ± 1.43) than in BPH patients (1.92 ± 1.20). The level of IFN-γ was significantly (p<0.0001) higher in PCa patients (144.6 ± 49.9) than in BPH patients (61.8 ± 11.9). Similarly, the interleukin-6 level was significantly (p<0.0001) higher in PCa patients (36.95±11.37) than in BPH patients (13.7±9.47). The age of the patients was similar in the lower-grade (68.59±12.15) and higher-grade (67.70±12.70) groups.
BMI was significantly (p=0.008) higher in higher-grade patients (28.35±7.99) than in lower-grade patients (26.85±5.89); higher-grade patients had a greater risk of being overweight than lower-grade patients (unadjusted OR=1.14, 95% CI=1.03-1.16). Similarly, the waist-hip ratio was significantly (p=0.03) higher in higher-grade patients (2.33±1.53) than in lower-grade patients (2.09±1.38). Conclusion: The introduction of total PSA into clinical practice has resulted in early detection of and reduced mortality from PCa. However, PCa screening remains controversial because of the risk of overdiagnosis and overtreatment and the inability to detect a significant proportion of dangerous tumors. A large concerted effort has been made to improve monitoring of the activity of PCa, to guide molecular targeted therapy and to assess therapeutic response. An integrated approach with blood-based measurement of different molecular forms of PSA, in combination with genetic and urine biomarkers, holds the promise of improving screening for and diagnosis of PCa. Analysis of panels of blood-based biomarkers will be a significant step towards fingerprinting the tumor's biologic behavior.
Research Article
Open Access
A Comparative Study of Intravenous Labetalol and Oral Nifedipine for Control of Blood Pressure in Severe Pre-Eclampsia in a Tertiary Care Hospital
Pages 688 - 697

Abstract
Background: Hypertension is the most common medical disorder in pregnancy, complicating 6-10% of pregnancies [1]. Treatment of severely increased blood pressure is widely recommended to reduce the risk of maternal and fetal complications. Regimens for acute treatment of severe hypertension in pre-eclampsia include intravenous medications; although effective, these drugs require venous access and careful fetal monitoring and might not be feasible in busy or low-resource environments. Therefore, this study aimed to compare the efficacy of intravenous labetalol and oral nifedipine for control of hypertension in severe pre-eclampsia. Objective: To compare the efficacy of intravenous labetalol and oral nifedipine when used to rapidly lower high blood pressure in mothers with severe pre-eclampsia. Methodology: This is a hospital-based prospective randomized interventional comparative trial conducted at Midnapore Medical College and Hospital from April 2021 to September 2022. The study had a sample size of 100 patients divided randomly into two groups: group A received intravenous labetalol injections (in escalating doses of 20, 40, 80 and 80 mg every 30 minutes, maximum dose 220 mg) and group B received oral nifedipine (10 mg tablet orally, up to 5 doses) every 30 minutes [2]. The target BP was ≤ 150/90 mm Hg; after the target BP was reached, further antihypertensives were given as per choice.
Results: In the labetalol group, 18 (36%) patients achieved the target blood pressure with 1 dose, 10 (20%) with 2 doses, 14 (28%) with 3 doses and 8 (16%) with 4 doses, while in the nifedipine group 16 (32%) achieved the target with 1 dose, 12 (24%) with 2 doses, 10 (20%) with 3 doses, 8 (16%) with 4 doses and 4 (8%) with 5 doses; the p-value was non-significant (0.29). The mean reduction in systolic and diastolic blood pressure 30 minutes after drug administration was 6.04±7.38 mmHg and 6.88±4.8 mmHg for labetalol, and 4.32±4.22 mmHg and 5.12±3.9 mmHg for nifedipine, with a non-significant p-value of 0.469. The mean time required to achieve the target BP was 67.2±33.168 minutes in group A and 73.2±38.475 minutes in group B (p=0.405, non-significant). Conclusion: Oral nifedipine and intravenous labetalol regimens are almost equally effective in acute control of blood pressure in severe pre-eclampsia.
Research Article
Open Access
Biomedical Waste Disposal: How Knowledgeable Are We?
Pages 790 - 798

Abstract
Biomedical waste disposal has become a subject of great concern in the modern healthcare system. It is vital for maintaining public health and preventing transmission of certain infectious diseases, and knowledge of and attitudes towards safe disposal of biomedical waste are key to successful implementation of the program, particularly in healthcare facilities. 200 healthcare professionals were interviewed to understand the ground realities of their knowledge and attitudes. While all of them agreed that knowledge about biomedical waste disposal is essential, only 55% had knowledge of colour coding; 74% had knowledge of segregation and 86% used protective gear while segregating. 99% had a favourable attitude score. Knowledge among medical personnel was high, whereas that among housekeeping staff was low. Providing continuous education, monitoring its implementation, and strict law enforcement are some of the suggestions made to achieve complete and meaningful biomedical waste management.
Research Article
Open Access
A Cross-Sectional Comparative Study of T3, T4 & TSH Levels in Altered Thyroid Status in Premenopausal Women
Pages 952 - 956

Abstract
Introduction: A hypothyroid or hyperthyroid state affects all the physiological systems, including the cardiovascular system, central nervous system, digestive system and blood. Despite increasing knowledge of thyroid physiology and better means of investigating thyroid function, we are still at a preliminary stage in understanding the pathophysiology of these disorders. Objectives: The present study was carried out to compare T3, T4 and TSH levels in newly diagnosed patients with hypothyroidism and hyperthyroidism and in age- and gender-matched euthyroid subjects. Materials and Methods: The present study was carried out in 90 female subjects in the age group of 30 to 45 years. Diagnosis of hypothyroidism and hyperthyroidism was based on both clinical and biochemical criteria. Subjects were divided into euthyroid, hypothyroid and hyperthyroid groups of 30 subjects each, and T3, T4 and TSH levels were measured in all groups. Results: The hyperthyroid group had significantly higher T3 and T4 levels than the euthyroid and hypothyroid groups. The hypothyroid group had significantly higher TSH than the euthyroid and hyperthyroid groups. Conclusion: T3 and T4 levels are significantly higher in hyperthyroidism and significantly lower in hypothyroidism compared to euthyroid premenopausal women. TSH levels are significantly higher in hypothyroid subjects and significantly lower in hyperthyroid subjects. Regular monitoring of T3, T4 and TSH, especially in women, is recommended.
Research Article
Open Access
Association of HbA1c and Neutrophil-To-Lymphocyte Ratio in Type 2 Diabetic Patients: An Observational Study
Pages 1350 - 1354

Abstract
Background: An elevated neutrophil-to-lymphocyte ratio (NLR) can function as an indicator and prognosticator for a range of cardiac and non-cardiac ailments. The aim of our study was to examine the correlation between NLR and different levels of glycemic regulation in individuals with type 2 diabetes. Methods: An observational study was conducted at a teaching hospital in Central India, in which 90 patients diagnosed with type 2 diabetes were purposively selected and categorised into three groups based on their level of diabetes control, as per the standards set by the American Diabetes Association (ADA): group A consisted of patients with HbA1c ≤ 7% indicating good control, group B included patients with HbA1c 7.0-9.0% indicating poor control, and group C comprised patients with HbA1c ≥ 9% indicating the worst control. The patients underwent evaluation of their complete blood count. Results: In comparison to Group A, which exhibited favourable control, Group C, with the poorest control, manifested a significantly elevated leukocyte count (p<0.001), an increased neutrophil count (p=0.003), and a decreased lymphocyte count (p=0.44). No significant difference was observed for Group B. The NLR exhibited a statistically significant increase in Group C (worst control) compared with Group B (poor control) and Group A (best control); the values were 4.3±2.8, 2.7±1.0, and 2.0±0.5, respectively (p<0.001). The NLR, in conjunction with fasting blood sugar, was identified as an independent predictor of suboptimal diabetes control.
The odds ratio for NLR was 1.809 with a 95% confidence interval of 1.459-2.401, while the odds ratio for fasting blood sugar was 0.938 with a 95% confidence interval of 0.995-0.982. Conclusion: Patients diagnosed with type 2 diabetes mellitus who exhibit an elevated neutrophil-to-lymphocyte ratio (NLR) are also observed to have elevated glycated haemoglobin (HbA1c) and suboptimal glycemic control. In the post-treatment management of individuals with diabetes, the NLR may be employed as a means of closely monitoring their overall wellbeing.
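The two quantities this study pivots on are both simple to compute from routine results: the NLR is the absolute neutrophil count divided by the absolute lymphocyte count, and the group assignment follows the HbA1c cut-offs stated in the abstract. A Python sketch (the handling of the 9.0% boundary, which the abstract leaves ambiguous, is an assumption):

```python
def nlr(neutrophils, lymphocytes):
    """Neutrophil-to-lymphocyte ratio from absolute differential counts
    (any consistent unit, e.g. cells per microlitre)."""
    return neutrophils / lymphocytes

def hba1c_group(hba1c_percent):
    """Assign the study's control group from HbA1c (%), per the abstract:
    A <= 7%, B 7.0-9.0%, C >= 9%. Placing exactly 9.0% in group C is an
    assumption, since the abstract's bands overlap at that value."""
    if hba1c_percent <= 7.0:
        return "A (good control)"
    if hba1c_percent < 9.0:
        return "B (poor control)"
    return "C (worst control)"
```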
Research Article
Open Access
To Evaluate the Usefulness of the Pulsatility Index (PI) of the Umbilical Artery (UA) and the Pulsatility Index of the Fetal Middle Cerebral Artery (MCA)
Pages 1402 - 1413

Abstract
|
Introduction- Doppler is a noninvasive method for evaluation of fetoplacental circulation without any disturbance to human pregnancy. It gives valuable information about hemodynamic situation of the fetus and is an efficient diagnostic test of fetal jeopardy that helps in management of high-risk pregnancy. Doppler ultrasound technology evaluates umbilical artery (and other fetal arteries) waveforms to assess fetal well-being in the third trimester of pregnancy. Aims and objectives- To evaluate the usefulness of the pulsatility index (PI) of the umbilical artery (UA) and the pulsatility index of fetal middle cerebral artery (MCA). Also, emphasize on the importance of altered cerebroplacental ratio in predicting the adverse perinatal outcome in patients with abnormal cerebroplacental ratio and timely intervention in these fetus to prevent adverse perinatal outcome. Material and methods- This study, Prospective observational study, was conducted in the Department of Obstetrics & Gynecology at Tata Main Hospital, Jamshedpur, Jharkhand, periods of 1 Year and 6 Months, from January 2018 to June 2019. Patients those were attended OPD & got admitted as IPD to Tata Main Hospital at 30-36 weeks of gestation comprised the study population. Only those women who fulfilled the inclusion criteria and were willing to participate in the study voluntarily were included in the study after taking an informed consent. Results and conclusion - In our study, 58% and 42% patients in control group were primigravida and multigravida respectively which was comparable to patients in Case group 56% and 44% respectively. Doppler flow velocity analysis can be valuable in antenatal assessment of SGA, FGR and even in AGA for prediction of late onset growth restriction and perinatal adverse outcome. 
Noninvasive hemodynamic monitoring of the umbilical arteries (fetoplacental circulation) and middle cerebral arteries (fetal circulation) has been a great help in improving perinatal outcome in pregnancies with comorbidities. For the prediction of adverse perinatal outcome in women with high-risk pregnancies, the best Doppler index in our study was the cerebroplacental ratio (MCA PI/UA PI). In cases with abnormal Doppler findings, timely intervention led to improved perinatal outcome. Hence, repeated Doppler studies in these pregnancies can help reduce perinatal morbidity and mortality in high-risk cases. This study also suggested that the CP ratio has value in identifying fetuses whose weight is above the 10th centile but who are nonetheless at risk of adverse outcome or late-onset FGR because their CP ratio is abnormal or below the age-specific 50th-percentile cutoff value.
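The cerebroplacental ratio used above is simply the quotient of the two pulsatility indices. A minimal sketch (illustrative only; the function names are our own, and the cutoff of 1.0 is a commonly quoted threshold, whereas the study itself uses gestational-age-specific percentile cutoffs):

```python
def cerebroplacental_ratio(mca_pi: float, ua_pi: float) -> float:
    """Cerebroplacental ratio (CPR) = MCA PI / UA PI."""
    if ua_pi <= 0:
        raise ValueError("UA PI must be positive")
    return mca_pi / ua_pi

def is_abnormal(cpr: float, cutoff: float = 1.0) -> bool:
    # A CPR below the cutoff suggests brain-sparing redistribution.
    # The default cutoff of 1.0 is illustrative; the study applies
    # gestational-age-specific percentile cutoffs instead.
    return cpr < cutoff
```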
Research Article
Open Access
An Analysis of Maternal Mortality Trends in a Tertiary Care Hospital
Pages 1435 - 1441

View PDF
Abstract
Background: Maternal mortality serves as an indicator of the standard of healthcare within a community. The maternal mortality ratio (MMR) is a critical indicator of the standard of reproductive healthcare afforded to expectant mothers. The study aimed to investigate institutional maternal mortality and its underlying causes. Methods: A hospital-based retrospective study was conducted on 1174 cases of maternal mortality over a four-year period from January 2018 to December 2021 in a tertiary care centre. Data pertaining to all mortalities were gathered from individual case records, facility-based maternal death review forms, and MDR case summaries. Results: The study analysed a total of 1174 deaths. During the study period, the MMR was 1465 per 1 lakh live births. The age bracket of 20-30 years exhibited the highest incidence of maternal mortality. A significant proportion of maternal deaths occurred in primiparous women (77.17%), in contrast to multiparous (10.7%) and grand multiparous (12.09%) women. The majority of the subjects (52.8%) were not booked, and a significant proportion (59.2%) resided in rural regions. Direct and indirect causes together accounted for over 98% of maternal deaths, while non-obstetric causes were responsible for approximately 1.2%. The predominant direct factors were haemorrhage (18.2%), encompassing post-partum haemorrhage, ante-partum haemorrhage, and abortion-related haemorrhage, and hypertensive disorders of pregnancy, including eclampsia, severe preeclampsia, and HELLP syndrome, which were the most significant contributor, accounting for 33.9% of cases.
Conclusions: The timely detection of high-risk pregnancies, consistent antenatal monitoring, adequate training of healthcare professionals, and prompt referral to tertiary care facilities can significantly decrease mortality rates. There was a rise in the maternal mortality ratio (MMR) during the COVID-19 pandemic from 2020 to 2021.
Research Article
Open Access
Comparative Evaluation of Equipotent Dose of Cisatracurium and Atracurium in Patients Undergoing General Surgeries
Pages 1693 - 1697

View PDF
Abstract
Introduction: Atracurium is a benzylisoquinolinium, non-depolarizing neuromuscular blocking agent of intermediate duration of action. It has revolutionized anesthetic practice by providing muscle relaxation with faster onset and a more rapid, measurable recovery. Cisatracurium is a recently introduced benzylisoquinolinium non-depolarizing neuromuscular blocking drug; it is a stereoisomer of atracurium, constitutes about 15% of commercially produced atracurium, and has a potency three to four times greater than that of atracurium. Materials and methods: This single-centre study was conducted in the Department of Anesthesiology at Maheshwara Medical College and Hospital over a period of 1 year and enrolled a total of 120 patients aged 18-60 years. Patients were randomly divided into two groups: Group C received cisatracurium 0.1 mg/kg IV as muscle relaxant and Group A received atracurium 0.3 mg/kg IV. The mean onset time and duration of action in the two groups were assessed according to the Stockholm guidelines for pharmacodynamic investigations of muscle relaxants. Intubating conditions, hemodynamic changes, and safety profile were noted. Result: In the atracurium group, jaw opening was easy in 41 patients; in the cisatracurium group, jaw opening was easy in 45 patients and moderate in five. There was no statistically significant difference between the two groups (P>0.05). In Group A, the vocal cords were moving in about 35 patients, while in 25 patients they were open, easing intubation. In Group C, vocal cord movement was seen in twenty patients. Vocal cord relaxation was better in Group C, which was statistically significant (P<0.05). In Group A, 34 patients had slight diaphragmatic movement and 16 patients showed complete relaxation, whereas in Group C, 38 patients showed complete relaxation and only seven patients had a mild cough reflex.
Conclusion: Intubating conditions were better with a 3ED95 dose of cisatracurium than with a 2ED95 dose of atracurium. None of the participants showed signs of histamine release. Hence, cisatracurium can be considered more efficacious than atracurium.
Research Article
Open Access
Assessing magnitude of hypertension: A community based study in the rural field practice of a Medical college
Pages 86 - 94

View PDF
Abstract
Background: Hypertension is one of the most important risk factors for cardiovascular diseases, particularly ischemic heart disease and stroke. According to a nationally representative study on the burden of high blood pressure in India, 70% of people suffering from hypertension are not aware of it. Deaths due to hypertension are largely preventable. In comparison with other evidence-based interventions for non-communicable diseases, control of hypertension has the largest potential to save lives. Objectives: 1. To estimate the magnitude of hypertension in a rural community. 2. To determine the significance of factors associated with hypertension. Design and methodology: A community-based cross-sectional study was conducted over six months in two pre-selected villages near Kakaramanahalli, the rural field practice area of RajaRajeswari Medical College, among people aged 18 years and above. A person was considered hypertensive if he/she was an already diagnosed case of hypertension and/or on treatment, or had a current SBP ≥ 140 mm Hg or DBP ≥ 90 mm Hg; a person was considered pre-hypertensive if he/she had a current SBP of 120-139 mm Hg or DBP of 80-89 mm Hg. Results: Among the 101 participants, the mean age was 52.13±16 years. The largest age group was 60 years and above, accounting for 44 (43%). Females outnumbered males, accounting for 66 (65%). Illiterates predominated among the study participants, accounting for 54 (53.5%). The majority were agriculturists, accounting for 51 (50.5%). Overall, the magnitude of hypertension among the study population was found to be 33.7%, and 32.7% fell under the category of pre-hypertensive. There was no statistically significant association between blood pressure and age, gender, type of family, BMI or waist circumference.
Conclusion: Our study concluded that more screening activities should be implemented at the rural level for the population above the age of 40 years. There is a need for frequent monitoring of blood pressure, irrespective of BMI and waist circumference, in the population above the age of 40 years.
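The study's case definitions map directly onto a small classifier. This sketch encodes only the thresholds stated in the abstract (≥140/90 mm Hg hypertensive, 120-139/80-89 mm Hg pre-hypertensive, or a known diagnosed/treated case); the function and parameter names are our own:

```python
def classify_bp(sbp: float, dbp: float, known_hypertensive: bool = False) -> str:
    """Classify a person using the study's definitions.

    Hypertensive: already diagnosed/on treatment, or SBP >= 140 or DBP >= 90.
    Pre-hypertensive: SBP 120-139 or DBP 80-89.
    """
    if known_hypertensive or sbp >= 140 or dbp >= 90:
        return "hypertensive"
    if sbp >= 120 or dbp >= 80:
        return "pre-hypertensive"
    return "normotensive"
```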
Research Article
Open Access
Comparison of Mixed Venous Oxygen Saturation vs. Serum Lactate Monitoring in the Management of Patients in Septic Shock
Pages 422 - 429

View PDF
Abstract
Background: This study was conducted to test early resuscitation targeting lactate levels as the marker of adequacy of oxygen delivery, to compare the return of normal ScvO2 in predicting 7th-day mortality, and to compare the clearance of serum lactate (at least 10%) in predicting 7th-day mortality. Methods: This was a hospital-based randomized prospective study among 120 adult patients admitted with septic shock to the medical ICU at Rajarajeswari Medical College and Hospital from November 2015 to August 2017, after obtaining clearance from the institutional ethics committee and written informed consent from the study participants. Results: 7th-day mortality was 21.7% in group L and 33.3% in group S (p=0.152), a difference that was not statistically significant. However, lactate clearance in group L was 57.50±25.82 at 24 hours and 76.87±14.80 at 72 hours, which was strongly statistically significant, and although ScvO2 at the time of death was around 70.6%, group S still had a 7th-day mortality of 33.3%, higher than in the lactate group. Conclusion: Goal-directed therapy provided in the initial stages of severe sepsis and septic shock has significant short-term and long-term benefits. Using lactate as an indicator of sepsis can aid early diagnosis and risk stratification, and repeated measurements at regular intervals can help track the progress of treatment.
Research Article
Open Access
A Study on Prognostic Implications of Hyponatremia in Elderly Hospitalised Patients
Pages 782 - 790

View PDF
Abstract
Background: Hyponatremia, a relatively prevalent electrolyte problem in clinical medicine, most commonly affects elderly in-patients [1-3]. It occurs in 15-30% of hospitalised patients and is defined by a serum Na+ level of less than 135 mEq/L. However, it has been noted that the occurrence rate in older people might reach 50% [3-5]. Objectives:
1. To classify severity of hyponatremia in hospitalized elderly and to correlate with outcome following treatment.
2. To study clinical feature and etiology of hyponatremia in elderly hospitalized patients.
Material & Methods: Study design: Hospital-based prospective observational study. Study area: Department of General Medicine, Subbaiah Institute of Medical Sciences, Shivamogga. Study period: July 2022 - June 2023. Study population: Elderly patients (60 years and older) admitted to the medical ICU. Sample size: The study consisted of a total of 100 subjects. Sampling technique: Simple random sampling. From all elderly patients admitted to the medical ICU, 3-5 ml of venous blood was collected in a yellow-top vacutainer and 5-10 ml of urine (spontaneous void or catheter specimen) was collected in a clean bottle. Routine blood and urine investigations appropriate to the diagnosis, such as complete blood count, renal function tests, electrolytes, liver function tests, urine routine, chest radiograph and other imaging studies, were done as needed. When the electrolyte reports became available, patients were enrolled in the study if their serum sodium was less than 125 mmol/L, and plasma and urine samples were sent for measurement of serum and urine osmolality by freezing-point depression osmometry. Serum electrolytes and urine spot sodium were measured by the ion-sensitive electrode method. Results: Among the 80 patients who improved, 50 were female and 30 were male, and among the 20 patients who expired, 15 were male and 5 were female. Thus, of the 45 male patients admitted, 30 (66.67%) improved and 15 (33.33%) expired, while of the 55 female patients, 50 (90.91%) improved and 5 (9.09%) expired. This shows that although severe hyponatremia was more frequent among females, the response to treatment and survival were better among females than among males (p=0.0026). Conclusion: Clinicians need to be aware of the common occurrence of hyponatremia in acutely sick elderly patients; early identification and adherence to a standardized correction protocol are essential to avoid complications and reduce mortality.
Meticulous monitoring of the dosing of multiple drugs in the elderly population would help prevent hyponatremia.
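The severity classification in the study's first objective can be sketched as a simple banding function. This assumes the conventionally used cutoffs (mild 130-134, moderate 125-129, severe <125 mmol/L), which are consistent with the abstract's definition of hyponatremia as Na+ < 135 and its enrolment threshold of < 125; the function name is our own:

```python
def hyponatremia_severity(serum_na_mmol_l: float) -> str:
    """Band serum sodium into commonly used severity categories.

    Assumed cutoffs (not stated explicitly in the abstract):
    >= 135 normal, 130-134 mild, 125-129 moderate, < 125 severe.
    """
    if serum_na_mmol_l >= 135:
        return "normal"
    if serum_na_mmol_l >= 130:
        return "mild"
    if serum_na_mmol_l >= 125:
        return "moderate"
    return "severe"
```

Note that the study enrolled only patients below 125 mmol/L, i.e. the "severe" band of this scheme.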
Research Article
Open Access
Comparative Analysis of the Effects of Administration of Pregabalin and Duloxetine on Postoperative Analgesic Requirement Following Lower Extremity Trauma Surgeries
Pages 904 - 910

View PDF
Abstract
Background: The induction of hyperalgesia is a well-documented consequence of surgical trauma, and inadequate pain management has been identified as a contributing factor in the development of persistent postoperative pain. The primary objective of this study was to assess and compare the impact of pregabalin and duloxetine on post-operative pain scores. Methods: In this observational study, a total of 120 patients with American Society of Anesthesiologists physical status I-II scheduled for lower limb trauma surgery were randomly assigned to two groups. One group received oral pregabalin at a dosage of 150 mg per day, while the other received duloxetine at a dosage of 60 mg per day. The medications were administered 2 hours prior to surgery and continued once daily for 2 days after surgery. The surgical procedure was conducted using a standardized spinal anesthesia technique. The investigator was blinded to treatment allocation; postoperative analgesia consisted of oral paracetamol at a dosage of 1 g every 6 hours, with intravenous diclofenac at a dosage of 75 mg as rescue analgesic. The main outcome was the response rate in relation to the need for rescue analgesia. Secondary outcomes encompassed the total amount of rescue analgesia administered, visual analogue scale scores at rest and during movement, haemodynamics, anxiety and depression levels, patient satisfaction, and adverse effects. Results: In the pregabalin group, 60% of patients required the initial administration of rescue analgesia within the first 72 hours after surgery, compared with 50% in the duloxetine group. Within the pregabalin group, 6.6% of patients required a second dose of rescue analgesia after an average of 24 hours.
Conversely, in the duloxetine group, 10% of patients required a second dose after an average of 40 hours. Visual analogue scale scores, the time until the first rescue intervention, and cumulative rescue analgesic use were similar in both groups. Conclusion: Patients receiving pregabalin or duloxetine after lower limb trauma surgery required rescue analgesia at equivalent rates.
Research Article
Open Access
Assessment of Biomedical Waste Management in Government Health Care Facilities of Ganjam District, Odisha
Pages 1141 - 1148

View PDF
Abstract
Introduction: Hospital waste is "any waste which is generated in the diagnosis, treatment or immunization of human beings or animals or in research" in a hospital. It is a special type of waste, produced in small quantities, that carries a high potential for infection and injury and a high potential to transmit infection to others. From a public health standpoint, there are serious health effects if hospital waste is not handled properly. The terms medical waste, hospital waste, infectious waste and regulated medical waste are often used interchangeably, since there is no universally accepted definition for them. Material and Methods: This is a facility-based cross-sectional study conducted at health care facilities at various levels in Ganjam district. In each health care facility, the medical officer, the pharmacist, the staff nurse and the attendant comprised our study population; only those health care providers who gave consent participated in the study. Using an observation checklist, the facilities were assessed for infrastructure, logistics and the practices of the stakeholders. The respondents were then interviewed for knowledge using a structured questionnaire. A value of 1 or 0 was assigned for correct and incorrect practices respectively, and likewise for correct and incorrect knowledge responses. The total knowledge and practice scores for each facility were calculated, and then the mean scores were computed. Respondents were asked for their feedback and thanked for their support. Results: Only 12 (46.1%) of the doctors agreed that their facilities generate biomedical wastes, 15 (57.7%) opined that biomedical wastes are associated with health hazards, 17 (65.4%) were concerned about needle-stick injury, and 15 (57.7%) believed that wearing PPE reduces infection.
Questions on colour coding for waste segregation could be answered by 17 (65.4%) doctors, 13 (50%) agreed that BMW containers need to be labelled, and 16 (61.5%) agreed that wastes need to be segregated at the point of generation. Regarding colour-coded bins, 19 (73.1%) doctors practiced putting wastes in colour-coded bins, 20 (76.9%) displayed segregation instructions at their workplace, 16 (61.5%) properly segregated wastes and aided their proper transport, and 18 (69.2%) did not allow dustbins to be filled more than three-quarters full. Conclusion: Findings from our study reveal that although the participants have fair knowledge of biomedical waste management, there is still much scope not only for improving knowledge but also for changing attitudes and inculcating more rational practices. The majority of attendants had poor knowledge and practice regarding BMWM. Thus, there should be regular training programmes on biomedical waste management and its hazards for all healthcare workers, including group D workers. Along with educational intervention, strict implementation of biomedical waste management guidelines, with monitoring at all levels, is also essential.
Research Article
Open Access
The role of Ambulatory blood pressure measurement in patients with End Stage Renal Disease (ESRD) with an aim to improve Renal and Cardiovascular outcomes
Pages 1355 - 1363

View PDF
Abstract
Background: Ambulatory blood pressure (BP) measurement, compared with office blood pressure measurement, provides better risk stratification in essential hypertension, but its prognostic role in non-dialysis chronic kidney disease has not been well studied. Methods: In 436 consecutive individuals with chronic kidney disease, the prognostic value of daytime and nighttime systolic blood pressure (SBP) and diastolic blood pressure (DBP), in contrast with office measurements, was assessed. Time to renal death (end-stage renal disease or death) and time to fatal and nonfatal cardiovascular events were the primary end points. Patients were categorised using BP quintiles. Results: The patients had a mean (SD) age of 65.1 (13.6) years and a glomerular filtration rate of 42.9 (19.7) mL/min/1.73 m2. Of the participants, 41.7% were female, 36.5% had diabetes, and 30.5% had cardiovascular disease. Office SBP/DBP values were 146 (19)/82 (12) mm Hg; daytime values were 131 (17)/75 (11) mm Hg, and nighttime values were 122 (20)/66 (10) mm Hg. During follow-up (median, 4.2 years), 155 and 103 patients reached the renal and cardiovascular end points, respectively. Patients with a daytime SBP of 136 to 146 mm Hg and those with a daytime SBP greater than 146 mm Hg had an increased adjusted risk of the cardiovascular end point (hazard ratio [HR], 2.23; 95% confidence interval [CI], 1.13-4.41 and HR, 3.07; 95% CI, 1.54-6.09) and of renal death (HR, 1.72; 95% CI, 1.02-2.89 and HR, 1.85; 95% CI, 1.11-3.08) compared with those with a daytime SBP of 126 to 135 mm Hg. In comparison with the reference SBP of 106-114 mm Hg, nighttime SBPs of 125 to 137 mm Hg and higher than 137 mm Hg also raised the risk of the cardiovascular end point (HR, 2.52; 95% CI, 1.11-5.71 and HR, 4.00; 95% CI, 1.77-9.02) and the renal end point (HR, 1.87; 95% CI, 1.03-3.43 and HR, 2.54; 95% CI, 1.41-4.57). Office blood pressure measurement did not predict the risk of the renal or cardiovascular end points. Non-dippers and reverse dippers were more likely to experience both outcomes.
Conclusion: In chronic kidney disease, ambulatory blood pressure monitoring, particularly at night, provides a more precise prognosis of renal and cardiovascular risk, whereas office blood pressure monitoring has no prognostic value.
Research Article
Open Access
Renal Parameters and Serum Electrolytes Level in Newborns with Birth Asphyxia- A case Control Study
Pages 1825 - 1830

View PDF
Abstract
Background: Birth asphyxia is defined by the occurrence of hypoxia, hypercapnia, and acidosis, resulting in systemic disruptions, potentially including electrolyte imbalances, in newborn infants. Knowledge of these electrolyte disturbances is of significant worth, as they are a crucial determinant of perinatal morbidity, mortality, and the subsequent course of treatment. Material and Methods: This was a one-year prospective case-control study conducted in a Department of Pediatrics in central India. A total of 80 newborns, 40 in the study group and 40 in the control group, were included. The diagnosis of birth asphyxia was made using the APGAR score, while hypoxic ischemic encephalopathy was staged using the SARNAT system. Renal parameters, including serum creatinine, blood urea nitrogen (BUN) and serum electrolytes from blood samples, and urine sodium and urine potassium from urine samples, were assessed in all newborns. Results: Of the 40 asphyxiated newborns, 25 (62.5%) were male and 15 (37.5%) female, indicating a higher incidence in male babies. BUN levels were 28 ± 8.98 in the asphyxiated newborns compared with 20.3 ± 2.65 in controls, a statistically significant difference. Mean serum creatinine levels were 1.7 ± 0.29 in the case group and 1.12 ± 0.4 in the control group, also a statistically significant difference. Conclusion: Perinatal asphyxia is an important cause of neonatal renal failure. Monitoring of blood urea, serum creatinine, serum calcium and urine output helps in the early diagnosis and management of renal failure in birth asphyxia.
Serum electrolyte levels and renal parameters had a linear correlation with the severity of birth asphyxia.
Research Article
Open Access
Glycosylated Hemoglobin and Lipid Profile Changes in Gestational Diabetes: A Comparative Study with Normoglycemic Pregnant Women
Pages 1621 - 1625

View PDF
Abstract
Introduction: Gestational diabetes affects a significant proportion of pregnant women and can have adverse health effects for both mother and baby. Monitoring blood glucose levels and lipid profiles is crucial in managing this condition. This comparative study examines how glycosylated hemoglobin (HbA1c) and lipid profile parameters change in women with gestational diabetes compared with normoglycemic pregnant women, with the goal of improving diagnostic and management strategies. The aim of this study was to determine whether HbA1c is an independent marker of dyslipidaemia among GDM cases and to emphasize the link between these parameters among pregnant women in Bihar. Materials and Methods: In this comparative study, we included fifty patients diagnosed with gestational diabetes during pregnancy; all were in their third trimester. We also included another fifty pregnant women as controls, who did not have gestational diabetes or any other pregnancy complication in their third trimester. Both cases and controls were randomly selected from the age group of 20 to 45 years. We measured serum lipid profile parameters, oral glucose tolerance test blood glucose levels, and glycosylated haemoglobin in patients with gestational diabetes, and compared them with those of healthy pregnant women. Results: The 50 pregnant women with GDM had a mean age of 31.2 years, while the 50 pregnant women in the healthy control group had a mean age of 29.3 years. Serum triglycerides were 193.12±10.12 mg/dL in GDM cases and 150.76±8.54 mg/dL in the control group, while serum total cholesterol was 211.43±14.34 mg/dL in GDM cases and 168.83±18.19 mg/dL in the control group; serum triglycerides and serum cholesterol were statistically significantly higher in GDM cases than in controls.
Serum HDL cholesterol was 57.98±5.78 mg/dL in GDM cases and 55.12±6.67 mg/dL in the control group, while serum LDL cholesterol was 92.13±13.45 mg/dL in GDM cases and 82.03±10.16 mg/dL in the control group; there was no statistically significant difference in HDL or LDL cholesterol between cases and controls. Fasting blood glucose was 116±9.65 mg/dL in GDM cases and 89±5.89 mg/dL in the control group; in the oral glucose tolerance test, the blood glucose level 1 hour after administration of 75 g of oral glucose was 198.13±12.74 mg/dL in GDM cases and 158.33±9.34 mg/dL in the control group, while the 2-hour level was 174.38±11.48 mg/dL in GDM cases and 140.11±7.87 mg/dL in the control group. These differences between cases and controls were statistically significant. The mean HbA1c of the case and control groups was 8.15±1.12% and 6.02±0.18% respectively; this difference between healthy pregnant women and women with GDM was statistically significant. Conclusion: The study's findings demonstrate that triglyceride, high-density lipoprotein, glycated haemoglobin and blood glucose levels all play a significant role in the development of dyslipidemia in gestational diabetes mellitus (GDM). Although lipid parameters are known to increase during a healthy pregnancy, the pattern of increase in GDM is different.
Research Article
Open Access
Correlation between different cardiovascular risk factors with insulin resistance in psoriasis
Pages 1691 - 1695

View PDF
Abstract
Background: Psoriasis vulgaris (PV) is a chronic, recurrent inflammatory skin disease that occurs in genetically predisposed individuals, influenced by various endogenous and exogenous triggering factors. Objectives: The present study aimed to assess the relationship between cardiovascular risk factors, such as the pattern of dyslipidaemia and body fat deposition, and insulin resistance in psoriatic patients. Methods: Body mass index (BMI) and waist circumference (WC) were measured in 40 psoriatic patients and matched controls. Fasting blood glucose (FBG), triglyceride (TG), cholesterol (CHOL), low-density lipoprotein (LDL), very-low-density lipoprotein (VLDL), high-density lipoprotein (HDL) and hsCRP were measured by spectrophotometry. Homocysteine was measured by immunofluorescence. Insulin resistance was assessed using HOMA-IR values. Results: FBG, HDL and WC did not differ significantly between the two groups (p=0.271, 0.21 and 0.72 respectively). On the other hand, BMI, HOMA-IR, TG, CHOL, LDL, VLDL, hsCRP and homocysteine levels were significantly higher in the case group (p<0.05). Bivariate correlation analysis showed HOMA-IR to be significantly associated with FBG, BMI, WC, total CHOL and LDL, but not with VLDL, TG, HDL, hsCRP or homocysteine. Conclusions: Metabolic derangement in psoriatic patients in this region is reflected mainly by increased insulin resistance, which is directly related to significantly elevated abdominal obesity and LDL cholesterol levels. We suggest regular monitoring of psoriatic patients for these parameters to avert the impending cardiovascular risks.
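HOMA-IR, used above to quantify insulin resistance, is conventionally computed from fasting glucose and fasting insulin. A sketch of the standard mg/dL formulation (divisor 405; for glucose in mmol/L the divisor is 22.5), with illustrative names; the abstract does not state which formulation the authors used:

```python
def homa_ir(fasting_glucose_mg_dl: float, fasting_insulin_uU_ml: float) -> float:
    """HOMA-IR with glucose in mg/dL:
    (fasting glucose * fasting insulin) / 405."""
    if fasting_glucose_mg_dl <= 0 or fasting_insulin_uU_ml <= 0:
        raise ValueError("glucose and insulin must be positive")
    return fasting_glucose_mg_dl * fasting_insulin_uU_ml / 405.0
```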
Research Article
Open Access
Correlation between Serum Uric Acid Levels and Kidney Function in Hypertensive Patients: A Cross-sectional Assessment
Pages 1731 - 1735

View PDF
Abstract
Background: Hypertension is known to be accompanied by various renal and metabolic anomalies. The exact relationship between serum uric acid (SUA) levels and kidney function, especially in hypertensive patients, requires elucidation. Objective: To understand the correlation between SUA levels and kidney function, gauged by the estimated glomerular filtration rate (eGFR), in a sample of 342 hypertensive individuals. Methods: We employed a cross-sectional design involving 342 hypertensive participants. SUA was determined using the enzymatic colorimetric technique, while the CKD-EPI equation was used to estimate eGFR. Statistical methods were used to identify correlations. Results: A notable inverse correlation between SUA and eGFR was established (r = -0.67, p < 0.001). After accounting for confounding factors, increased SUA was identified as an independent predictor of diminished eGFR. Conclusion: In a sample of 342 hypertensive patients, elevated SUA levels were significantly related to a decline in kidney function. Regular monitoring of SUA may be integral to the management of hypertensive patients, but additional research is required to validate these outcomes and explore potential therapeutic directions.
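The abstract states that eGFR was estimated with the CKD-EPI equation. As a hedged illustration, this sketch implements the 2009 CKD-EPI creatinine equation; the exact variant and coefficients the authors used are not specified in the abstract, and the function name is our own:

```python
def ckd_epi_2009(scr_mg_dl: float, age_years: float,
                 female: bool, black: bool = False) -> float:
    """eGFR (mL/min/1.73 m^2) via the 2009 CKD-EPI creatinine equation:
    141 * min(Scr/k, 1)^a * max(Scr/k, 1)^-1.209 * 0.993^age
    * 1.018 (if female) * 1.159 (if black),
    where k = 0.7 (female) / 0.9 (male), a = -0.329 / -0.411."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141.0
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age_years)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr
```

The inverse SUA-eGFR correlation reported above operates on eGFR values computed this way: higher serum creatinine, like higher age, drives the estimate down.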
Research Article
Open Access
Segmental Spinal Anaesthesia for Routine Surgeries: Efficacy and Safety in ASA 1 & 2 Patients – A Case Series Study
Pages 1736 - 1747

View PDF
Abstract
This case series study aims to evaluate the efficacy and safety of Segmental Spinal Anaesthesia (SSA) in ASA 1 & 2 patients undergoing a variety of routine surgeries. A total of 115 cases were analyzed, with patients ranging in age from 18 to 80 years, and a male-to-female ratio of 37:63. The study assessed the intervertebral space used and the drugs administered for each surgery, along with monitoring sensory anesthesia, motor block, postoperative nausea and vomiting (PONV), urinary retention, respiratory depression, hemodynamic changes, shoulder tip pain, abdominal discomfort, conversion to general anesthesia (GA), time to mobilization and postoperative analgesia requirements.
Research Article
Open Access
Depression and Morbidity Profile among the children in orphanages of Bhubaneswar city: A Cross Sectional study
Pages 1848 - 1853

View PDF
Abstract
Background: Orphan children are a vulnerable group in society; the lack of affection, care, love, emotional attachment and psychological support from parents leads to depression and malnutrition, and the decreased immunity due to malnutrition in turn leads to infections. Hence, the present study was done with the objectives of assessing the morbidity profile and estimating the prevalence of depression among children residing in orphanages. Methods: A community-based cross-sectional study was carried out in 5 selected orphanages; 210 children residing in those orphanages were selected by simple random sampling. Data were collected by interview using a pretested semi-structured schedule. Results: The prevalence of depression was 38.6%, and it was higher among girls than boys, a statistically significant difference. The major morbidities observed were dental caries (55.2%), underweight (55.2%), stunting (53.2%), and pallor (20.5%). Conclusion: Depression, malnutrition and dental caries were the major health problems among children in the orphanages and need to be addressed; regular monitoring of nutritional status, improving oral hygiene through health education, and regular counselling will help the children cope with depression and other health problems.
Research Article
Open Access
Prediction of Induction to Delivery Interval in Vaginal Dinoprostone Induced Labour
Pages 1883 - 1889

Abstract
Introduction: The aim of successful induction of labour is to reduce the risks associated with continuing the pregnancy. Application of Dinoprostone gel for induction of labour is the gold-standard practice in obstetrics. Induction of labour should be safe and effective, as well as convenient for both the patient and medical staff. The induction-to-delivery time interval therefore has a determinable effect on its success.
Aim: To observe the induction-to-delivery time interval in labour induced with Dinoprostone gel and the factors associated with it.
Material and methods: This is a retrospective observational study conducted from December 2020 to May 2021 at GMERS Hospital, Sola. Labour induction with Dinoprostone gel was studied in 210 women. Pregnant women fulfilling the inclusion criteria were induced with 0.5 mg Dinoprostone gel intracervically after the baseline Bishop score was recorded and fetal wellbeing was assessed with NST.
Vigilant labour monitoring was done, and a second gel instillation and labour augmentation with oxytocin were performed as and when required.
Induction was considered failed when there was no progressive cervical dilatation and/or uterine activity was inefficient. Primary and secondary outcomes were observed and then analysed.
Results: Of the 210 pregnant women induced with Dinoprostone gel, 83.80% delivered vaginally, with a mean induction-to-delivery interval of 13.6 ± 1.1 hours in primigravidae and 8.9 ± 0.9 hours in multigravidae.
A maternal complication rate of only 7.61% and a NICU admission rate of 0.9% suggest a good maternal and perinatal outcome in this study.
Conclusion: Intracervical Dinoprostone gel application is associated with a successful outcome and relatively shortens the duration of labour, improving its acceptance worldwide.
Clinical significance: Induction of labour with cervical prostaglandin application, such as Dinoprostone, is a common and routine procedure.
Not only induction of labour but also timely delivery plays an important role in successful labour: the shorter the duration of labour, the better and more acceptable the outcome for both the woman and the doctor.
In this study we assessed the time required from Dinoprostone gel application to successful induction and delivery, and the factors associated with it.
Research Article
Open Access
To Analyze the Factors Predicting Failure of Non-Invasive Ventilation in COPD Patients
Pages 2120 - 2128

Abstract
Background: In the Emergency Department, COPD patients are assessed clinically and categorized into different grades of disease severity. Aim: To analyze the factors predicting failure of non-invasive ventilation in the Emergency Department among patients with acute exacerbation of chronic obstructive pulmonary disease. Methodology: This was a prospective cohort study carried out from July 2022 to August 2023. A total of 82 patients with acute exacerbation of chronic obstructive pulmonary disease requiring NIV and attending the Department of Emergency Medicine were included. Results: In the present study, 71.95% of the patients were male and 28.05% were female, a male-to-female ratio of 2.56:1. 52.44% of the patients had hypertension and 42.68% had diabetes mellitus; the other comorbid conditions are as shown in table 5.3 and graph 5.3. All patients had shortness of breath and cough (100.00%), while fever was noted in 70.73% of the patients. Failure of NIV and requirement of intubation were noted in 4.88% of the patients. NIV failure was significantly associated with pre-existing cor pulmonale (p=0.017) and hypothyroidism (p=0.025). Temperature (p=0.042), PO2 on the second ABG analysis (p=0.023), NIV tidal volume (p=0.031) and hospital stay (p=0.001) differed significantly between patients with and without NIV failure. Conclusion: Based on the findings of this study, it may be concluded that the rate of NIV failure was low (4.88%) in a carefully selected patient population with timely intervention and strict monitoring.
Research Article
Open Access
Assessment of Knowledge of Pediatricians in Provision of Quality Immunization Services in Private Sector in Central India
Pages 2139 - 2145

Abstract
Background: Paediatricians in the private sector in India can play an important role in vaccine service delivery and immunization coverage. Standards and systems for the service quality of private providers should be established by countries. Standards should cover practices in all facilities delivering vaccines, including proper storage and handling, appropriate use of injections, proper recording, adherence to safety measures, and waste management and disposal. [1] There is also a need for co-partnership and communication with private providers to improve the performance of the health system in the long term. Methodology: A cross-sectional study was conducted in the urban area of Bhopal city. Knowledge of, and adherence to, standard guidelines related to vaccination practices were assessed. A total of 110 paediatricians were found eligible for the present study. After excluding 10 paediatricians who refused to participate, 100 paediatricians finally responded, giving a response rate of 90.9%. A pre-designed, pretested questionnaire was used for data collection. Results: The study population comprised paediatricians providing vaccination in private clinics and private hospitals. Most of the private providers (50%) were in the 41 to 60 age group, 81% of paediatricians were trained, and 82% of immunization clinics were registered with the Government sector. Of the total respondents, 76% (76/100) answered all knowledge item questions correctly. Most respondents (76%) had a complete knowledge score on the cold chain vaccine items. The mean (SD) knowledge score was 96.3 (±7.61), ranging from 70 to 100. Conclusion: The knowledge of the majority of paediatricians was good; for the success of the NIP (National Immunization Program), it is necessary to increase private-sector involvement in immunization delivery.
Research Article
Open Access
Laboratory profile in serologically proven dengue in children
Pages 50 - 55

Abstract
Background: Dengue fever is a significant global health concern, particularly affecting children in tropical regions. This study aimed to comprehensively analyze the laboratory profiles of serologically proven dengue cases in children and their associations with clinical outcomes. Materials and Methods: A prospective observational study was conducted on 300 pediatric patients with suspected dengue fever. Demographic data, clinical symptoms, serological markers (IgM and IgG antibodies, NS1 ELISA), and laboratory parameters were analyzed. Associations with disease severity and clinical outcomes were explored. Results: A high prevalence of dengue-specific IgM antibodies (91.7%) and IgG antibodies (63.3%) was observed, with 50% of cases indicating secondary infections. NS1 antigen ELISA was positive in 40% of cases. Clinical symptoms included fever (91.7%), headache (80%), myalgia (60%), and bleeding manifestations (16.7%). Severe forms of dengue (DHF/DSS) accounted for 30% of cases. Hemoglobin levels were lower in DHF/DSS cases (10.5 g/dL) than in non-severe cases (9.8 g/dL). Platelet counts were significantly lower in DHF/DSS cases (110 × 10^3/µL) compared to ICU admissions (85 × 10^3/µL). Serum creatinine levels were slightly elevated in ICU admission cases (1.1 mg/dL) compared to DHF/DSS cases (0.9 mg/dL). Conclusion: This study highlights the importance of serological markers and laboratory parameters in diagnosing dengue and assessing disease severity in pediatric cases. Early diagnosis and monitoring of these markers are crucial for timely clinical intervention. Further research is needed to validate these findings and enhance our understanding of pediatric dengue pathophysiology.
Research Article
Open Access
Liver Function Tests in Dengue Patients in A Tertiary Care Hospital of South Odisha: A Hospital Based Cross Sectional Study
Pages 123 - 127

Abstract
Background: Dengue is an arthropod-borne viral disease of significant public health importance. In the context of the rising burden of Dengue in South Odisha, this hospital-based cross-sectional study conducted at MKCG Medical College Hospital, Berhampur, aims to evaluate liver function tests in 100 Dengue patients. Recognizing the significance of hepatic involvement in Dengue, the study seeks to contribute valuable insights into the hepatic manifestations of the disease, potentially enhancing diagnostic and management strategies in this region. Methodology: Utilizing a systematic approach, this study employed a cross-sectional design at MKCG Medical College Hospital, Berhampur, enrolling 100 Dengue patients. Standardized liver function tests were conducted, including serum bilirubin, alanine transaminase (ALT), aspartate transaminase (AST), and alkaline phosphatase. Data analysis involved descriptive statistics, providing a comprehensive overview of hepatic parameters in Dengue patients within the specified tertiary care setting. Results: Among the 100 Dengue patients, 40% exhibited less than a 2-fold increase in AST levels, while 28% showed a 2-10-fold rise, and 10% demonstrated more than a 10-fold increase. Overall, 22% maintained normal SGOT values, with 78% displaying elevated levels. Regarding SGPT, 20% had normal values, 35% presented with less than a 2-fold increase, 25% displayed a 2-10-fold rise, and 20% had more than a 10-fold increase from normal levels. Early-stage symptoms like vomiting and abdominal pain correlated with hepatic involvement, with statistically higher AST and ALT levels in patients developing complications such as DHF, DSS, hepatic failure, ARDS, AKI, and encephalopathy. Conclusion: In summary, our study reveals a notable prevalence of hepatic involvement in Dengue patients, emphasizing the significance of vigilant monitoring, particularly in cases with early symptoms and those at risk of complications.
These findings offer crucial insights for tailored interventions and enhanced patient care within the tertiary care context of South Odisha.
Research Article
Open Access
Correlation between mixed venous oxygen saturation, central venous oxygen saturation and cerebral oxygen saturation measured by near-infrared spectroscopy during off pump coronary artery bypass grafting
Pages 246 - 257

Abstract
Introduction: OPCAB was designed to reduce complications resulting from cardiopulmonary bypass, such as stroke, renal complications and myocardial ischemia, and to reduce hospital stay, morbidity and mortality. It involves various anatomical displacements of the heart using stabilizers and suspensions, which require extensive monitoring techniques. To improve its efficiency, neurological monitoring such as NIRS, together with PA cannulation, could play a significant role in further reducing such complications. Mixed venous oxygen saturation (SvO2) remains the accepted standard during anesthesia for evaluating the balance of oxygen delivery and consumption, especially during cardiac surgery. Monitoring the ScvO2-SvO2 gradient with a conventional PAC gives indirect evidence of myocardial ischemia, after excluding other causes of ischemia in the lower body. Materials and Methods: In this single-centre prospective interventional study, 60 patients undergoing elective off-pump CABG between March 2018 and March 2020 were included. Institutional ethical and scientific committee approval was obtained (UNMICRC/ANESTH/2017/09), along with written informed consent from the patients. Results: A total of 360 patients were enrolled in the study for comparative analysis of regional cerebral oxygen saturation (rScO2), central venous oxygen saturation (ScvO2) and mixed venous oxygen saturation (SvO2) during off-pump CABG. Table 1 shows the general characteristics of the patients. Mean ejection fraction was 45.92 ± 9.23%. Fifty patients had triple vessel disease and 10 had double vessel disease, for which 60, 53 and 49 patients underwent left anterior descending (LAD), obtuse marginal (OM) or diagonal (DG), and posterior descending artery (PDA) or right coronary artery (RCA) grafting, respectively. Conclusion: Positioning of the heart for distal anastomoses on the lateral and posterior walls was associated with greater hemodynamic alteration, an increased inotropic and vasopressor requirement, and a significant decrease in rScO2, ScvO2 and SvO2.
There was a significant positive correlation on the measured gradients between ScvO2 and SvO2, rScO2 and SvO2, and rScO2 and ScvO2. ΔrScO2 was found to be highest compared with ΔSvO2, followed by ΔScvO2.
Research Article
Open Access
Evaluation of Arrhythmias in Patients with Acute Coronary Syndrome in the First 24 hours of hospitalization
Pages 286 - 293

Abstract
Introduction: Acute MI is one of the leading causes of death, and the majority of deaths are due to arrhythmias. The aim of this study was to evaluate the incidence, risk factors and outcomes of fatal arrhythmias. Early revascularization reduces the risk of fatal arrhythmias. Most arrhythmias cause death within 48 hours; they include bradyarrhythmias, heart blocks, atrial fibrillation, and ventricular tachycardia and fibrillation. Aims and objectives: This study examines arrhythmias in acute coronary syndrome patients in the first 24 hours following presentation. Method: We collected hospital data from June 2022 to June 2023, with a 3-month follow-up after discharge. The study assessed the clinical presentation and ECG monitoring of 900 ACS patients. Data were rigorously collected, including demographics, clinical information, and follow-up outcomes. Inclusion criteria: adults over 18 years with acute MI. Exclusion criteria: contraindications for monitoring, severe arrhythmias, communication issues. Result: Patients with acute myocardial infarction (AMI) were 2.21 times more likely to have >50 PVCs per hour. Those over 65 had a 2.41 times higher risk. The model fit well (chi-square value 14.79, p = 0.0004). Length of stay strongly correlated with AMI diagnosis (F value 35.41, p < 0.0001). Various arrhythmias were found, including PVCs (44.44%), non-sustained VT (20.44%), supraventricular arrhythmias (11.33%), and atrial fibrillation (6.55%). Sustained VT (2.44%), asystole (2.22%), torsade de pointes (1.11%), and ventricular fibrillation (1.11%) were less common but serious. Right bundle branch block was the most frequent conduction deficit (5.77%), followed by left anterior fascicular block (2.88%), 2nd degree AV block (2.66%), left bundle branch block (1.77%), and left posterior fascicular block (0.88%). These findings emphasize the range of arrhythmias and conduction issues, highlighting the need for tailored therapeutic and monitoring approaches.
Conclusion: This study concluded that life-threatening arrhythmias were less common than benign ventricular ectopics and supraventricular tachycardia in ACS patients in the PCI era. Patients had a favourable outcome if they received timely PCI. Isolated PVCs affected approximately 25% of the sample; they independently increased hospital stay but did not affect other outcomes.
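Relative-risk figures such as the 2.21-fold likelihood of >50 PVCs per hour reported above are typically derived from a 2×2 exposure-outcome table. As a minimal illustrative sketch only, with made-up counts that are not this study's data, the odds ratio can be computed like this:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    return (a / b) / (c / d)

# Hypothetical counts: AMI vs non-AMI patients, >50 PVCs/hour as the outcome.
or_pvc = odds_ratio(30, 70, 20, 80)  # odds 30/70 in AMI vs 20/80 in non-AMI
```

An odds ratio above 1 indicates the outcome is more likely in the exposed group; the study's reported values would have been accompanied by confidence intervals from the fitted model.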
Research Article
Open Access
A Study On Prevalence, Severity Scoring and Causality Assessment of Adverse Drug Reactions in Pediatric Patients
Pages 720 - 732

Abstract
Introduction: Adverse drug reactions (ADRs) are an important aspect of drug therapy and can be a major setback in clinical practice. An ADR is defined by the World Health Organization (WHO) as ‘a response to a medicine which is noxious and unintended, and which occurs at doses normally used in man’. The safety of drugs used in adult patients cannot be extrapolated to the pediatric age group: the pharmacokinetics and pharmacodynamics of many commonly used drugs vary significantly between these two age groups of patients [2]. ADRs in children can have a relatively more severe effect than in adults and can thus lead to significant morbidity among children [3]. An increase in the number of drugs and self-medication with various medications have enhanced the occurrence of adverse drug reactions in recent times, especially in the pediatric population. Material & Methods: This was a prospective, observation-based, non-interventional study conducted in the Dept. of Pediatrics, SCB Medical College and SVPPGIP, Cuttack, which are two institutions under one department. This department is a tertiary care center for pediatric patients in our state. Our institution is an approved ADR Monitoring Center (AMC) under the Pharmacovigilance Programme of India (PvPI). ADRs were confirmed by the clinicians based on the temporal relationship between the start of the drug and the reaction, withdrawal of the drug leading to decreased severity or abolition of the reaction (dechallenge), exclusion of other causes, etc. Doctors were sensitized in various seminars to spontaneously report ADRs in Suspected Adverse Drug Reaction Reporting Forms. Results: Out of the total 350 cases, the dermatological system was most commonly involved, with 207 cases (59.14%). This was followed by involvement of the central nervous system in 46 cases (13.14%) and the GI system in 34 cases (9.71%).
Most of the ADRs were due to antibiotics, which were involved in 198 (56.57%) cases. The commonest antibiotic causing ADRs was ofloxacin, involved in 26 cases (13.13% of antibiotics), followed by ceftriaxone and cefixime, comprising 22 cases (11.11%) and 14 cases (7.07%) respectively of the total antibiotics used. A single drug was the possible causative agent of the ADR in 177 reported cases, constituting 50.57% of the total ADRs; sometimes these agents were used with other drugs, but the dechallenge test ruled out the probability of other drugs' involvement. Of these 177, in 110 cases the drug caused the ADR when used alone, i.e. 31.4% of drug reactions were caused by monotherapy. Conclusion: Our study showed a varied range of ADRs, with more reports in male children than in females, and the maximum reports of ADRs in the age group 5-10 years. Dermatological ADRs had the highest incidence of all ADRs, and FDE was the most frequent among dermatological ADRs. Antibiotics were the commonest suspected agents in the reported ADRs. This study also exposed a high occurrence of over-the-counter medication use in the pediatric age group, causing 20.87% of total ADRs and 12% of total serious ADRs. The incidence of serious ADRs was higher where multiple drugs were the suspected causative agents. Various atypical ADRs were also observed owing to active monitoring. Hence, this study further emphasizes the need for proactive pharmacovigilance, restriction of over-the-counter medications, and increasing awareness among healthcare professionals, patients and the public about the rational use of antibiotics and the avoidance of multidrug therapy and FDCs, to reduce the incidence of ADRs, especially in pediatric age groups.
Research Article
Open Access
Pulmonary Function in Thalassemia Major Patients Receiving Regular Blood Transfusion
Pages 750 - 763

Abstract
Introduction: In thalassemia major there is a decrease in, or total suppression of, hemoglobin polypeptide chain synthesis. Patients require regular blood transfusions to maintain a normal Hb level greater than 10 gm% [17]. An inevitable, important and potentially lethal complication of administering repeated blood transfusions to a child with thalassemia is the gradual overloading of the body with iron. Iron deposition in various organs affects their function, including that of the lungs. Pulmonary deposition of iron ultimately leads to a decrease in lung function, which can be assessed by spirometry, allowing its correlation with iron overload to be derived. From these facts one can appreciate the burden of associated complications in thalassemia major children receiving regular blood transfusions. Aim & Objectives: To determine pulmonary function status in beta-thalassemia major patients receiving regular blood transfusions; to study the pattern of respiratory impairment using spirometry; and to estimate iron overload status by measuring serum ferritin levels and correlate respiratory impairment with iron overload in thalassemia patients. Methodology: A hospital-based observational cross-sectional prospective study with a sample size of 81 patients diagnosed with thalassemia major, aged between 6 and 14 years. Just before transfusion, venous samples were collected from all participants and serum ferritin levels were assessed. Serum ferritin levels were recorded in the patients' charts every 6 months, and serum ferritin was derived by averaging the measurements over a 2-year period for each patient. Patients were then categorized into population group A (serum ferritin ≥ 2500 ng/ml) and population group B (serum ferritin < 2500 ng/ml). PFT was performed on the day scheduled for blood transfusion, and results were expressed as a percentage of normal.
To compare clinical and biochemical parameters, the chi-square test of association was used. For comparison of study variables, the independent Student's t test was used. For the correlation of the number of blood transfusions with serum ferritin, the Pearson correlation coefficient was used. The results were analysed using SPSS software version 17. Results: Patients with a higher number of transfusions (≥ 140) showed an increased chance of pulmonary abnormality, with a mean FEV1 (91.82 ± 3.556) that was significantly higher than that of patients with fewer transfusions (< 140; 86.23 ± 2.224). When the pulmonary function parameter FEV1 was compared with serum ferritin level, the mean FEV1 (91.06 ± 3.564) of the population with serum ferritin below 2500 was significantly higher than the mean FEV1 (81.18 ± 4.177) of the population with serum ferritin at or above 2500, as evidenced by a p-value of 0.001. Conclusion: Patients with a higher number of transfusions showed an increased chance of pulmonary abnormality. The severity of the restrictive disease increases with older age and greater transfusion iron burden, indicative of a central role of iron in the pathogenesis of the pulmonary function abnormality associated with thalassemia major. This study emphasises that patients with thalassemia major on regular blood transfusion need monitoring throughout treatment to avoid future pulmonary complications.
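The group comparisons above rest on an independent-samples t test. As a minimal sketch only, using hypothetical FEV1 values (not the study's data) and Welch's unequal-variance form of the statistic:

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances allowed)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb          # group means
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Hypothetical FEV1 (% predicted) for low- vs high-ferritin groups.
low_ferritin = [92.0, 90.5, 91.8, 89.7, 93.1]
high_ferritin = [82.4, 80.9, 81.6, 79.8, 83.0]
t_stat = welch_t(low_ferritin, high_ferritin)  # positive: first group's mean is higher
```

In practice a statistics package (such as the SPSS software the study used) also supplies the degrees of freedom and p-value that accompany the t statistic.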
Research Article
Open Access
A Study of 25-Hydroxy Vitamin D Levels in Type 2 Diabetes Mellitus with and without Nephropathy
Pages 885 - 891

Abstract
Background: This study was conducted to evaluate serum vitamin D levels in patients with type 2 diabetes mellitus with and without nephropathy. Methods: This was a hospital-based cross-sectional case-control study conducted among 100 patients who attended the OPD and IPD at the Department of Medicine, Dr. B.R. Ambedkar Medical College and Hospital, Bangalore, over a period of 18 months from December 2020 to May 2022, after obtaining clearance from the institutional ethics committee and written informed consent from the study participants. Results: In the comparison between the diabetic nephropathy and non-diabetic nephropathy groups, a statistically significant difference (p<0.001) was noted with regard to serum creatinine, eGFR, UACR and vitamin D levels: the diabetic nephropathy group had increased creatinine levels, markedly reduced eGFR, markedly elevated UACR and significantly decreased vitamin D levels, which were not present in the non-diabetic nephropathy patients. The eGFR and vitamin D levels were compared among the diabetic nephropathy subjects; eGFR was split into three groups, and by the ANOVA test a significant association was obtained between them, suggesting that vitamin D levels fall as eGFR declines. On correlating vitamin D levels with serum creatinine and UACR, a statistically significant (p<0.001) strong negative correlation was obtained (r = -0.85 and -0.91, respectively). Conclusion: The study found that individuals with diabetic nephropathy had a higher prevalence of vitamin D insufficiency, and that the insufficiency is more severe at more advanced stages of diabetic kidney disease. Patients with CKD (Chronic Kidney Disease) should receive information from healthcare providers on vitamin D monitoring and its dietary sources.
Research Article
Open Access
Clinical and Epidemiological Study of Scorpion Sting Envenomation in a Tertiary Care Teaching Hospital
Pages 952 - 958

Abstract
Background: Scorpion envenomation is a potentially fatal public health risk in tropical and subtropical places around the world. [1] However, morbidity and mortality from venomous animal bites or stings have received little attention in poor nations, including India; this is evident in the absence of a system for reporting venomous bites or stings. Objectives: 1. To study the clinical presentation, course, complications and outcome of scorpion sting envenomation. 2. To study the epidemiology and circumstances leading to scorpion sting in the community. Material & Methods: Study design: Hospital-based prospective cross-sectional study. Study area: The study was conducted in the Department of Paediatrics. Study period: 1 year. Study population: All children admitted to hospital for scorpion sting. Sample size: The study consisted of a total of 54 subjects. Sampling technique: Simple random technique. Study tools and data collection procedure: On admission, a detailed clinical history was taken, including the time of the sting, symptomatology, and details of treatment received before admission. A description of the scorpion and details of the circumstances leading up to the sting were also obtained. All patients were subjected to a detailed clinical examination at admission and at frequent intervals thereafter, as necessary in each case. Hourly monitoring of heart rate, respiratory rate, blood pressure, urine output, and cardiovascular and respiratory status was done. Results: The commonest complications were peripheral circulatory failure, pulmonary edema, myocarditis and congestive cardiac failure (15% of cases). One child developed popliteal artery thrombosis 76 hours after admission to hospital. Three patients presented with encephalopathy, two of whom had massive pulmonary edema and succumbed within 5 hours of admission. One child had left-sided hemiparesis and encephalopathy secondary to a left MCA territory infarct, with mild pulmonary edema.
Conclusion: In India, cardiovascular complications are the most common and life-threatening. However, anticipation of and close monitoring for other, uncommon complications are critical for effective management. Prazosin has revolutionized the management of scorpion sting envenomation; its administration, as early as possible, is probably the single most effective intervention for preventing complications following scorpion sting.
Research Article
Open Access
A Study of the Clinical Profile and Outcome of Respiratory Distress in the Neonatal Period in a Tertiary Care Centre
Pages 1146 - 1151

Abstract
The management of respiratory distress has advanced significantly in recent years. Various ventilatory therapy modes, including continuous positive airway pressure, conventional mechanical ventilation, ultra-high-frequency jet ventilation, liquid ventilation, surfactant replacement therapy, sophisticated monitoring, and extracorporeal membrane oxygenation, have all improved the outcomes for babies with respiratory distress. The mortality rate for neonates experiencing respiratory distress is 2-4 times higher than that of those without such distress. Material and Method: The Sardar Vallavbhai Patel Post Graduate Institute of Paediatrics (SVPPGIP) and SCBMCH, Cuttack were the study sites. This unit cares for neonates brought straight from home as well as those delivered in smaller hospitals in Orissa and referred for neonatal care. The study participants were 282 consecutive newborns hospitalised with respiratory distress who met the inclusion criteria. Result: All infants had their progress monitored until death or discharge. Each neonate's outcome was documented on release from the newborn nursery unit, and those with sepsis were divided into two groups: those who survived and those who did not. For estimating haemoglobin, total white blood cell count, absolute neutrophil count and platelet count, 0.5 millilitres of blood was drawn. Before antibiotics were administered, a blood sample, preferably 1 millilitre, was obtained from a peripheral vein after the site was cleaned with 70% alcohol and allowed to dry; the samples were then cultured both aerobically and anaerobically. A further 0.5 ml of blood was drawn into a plain tube without EDTA, and the latex agglutination method was used to estimate CRP. Conclusion: Of the 282 cases of respiratory distress, the majority of the neonates were male, and the majority were delivered by normal vaginal delivery. The majority of the newborns were healthy for their gestational age.
Most newborns developed respiratory difficulty within the first 24 hours of life, i.e. in the early neonatal phase. The most prevalent diagnosis was pneumonia. A positive blood culture and a positive CRP exhibited high sensitivity in the diagnosis of pneumonia, and the study group's total mortality rate was 24.11%.
Research Article
Open Access
Genomic Sequencing of Variants in SARS-CoV-2 in Symptomatic Individuals at a Tertiary Care Hospital
Pages 1909 - 1913

Abstract
Introduction: COVID-19 is an acute viral illness caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). Since the onset of the SARS-CoV-2 pandemic, multiple new variants of concern have emerged that are associated with enhanced transmissibility and increased virulence. This study also highlights the role of clinical interprofessional teams, public health agencies, and community participation in improving patient care. Aim: To analyse the genomic sequencing variants of SARS-CoV-2 in symptomatic patients during the 2nd and 3rd waves of the pandemic by next-generation sequencing (NGS). Materials and Methods: Throat/nasopharyngeal swabs were collected from a total of 200 symptomatic patients for real-time reverse transcription-polymerase chain reaction (RT-PCR) at a tertiary care hospital, Guntur. The specimens were transported under cold chain, according to guidelines, to the Centre for Cellular & Molecular Biology (CCMB), Hyderabad, for genome sequence analysis by NGS. Study period: 2nd wave, i.e. March 2021 to November 2021, and 3rd wave, i.e. December 2021 to March 2022, according to the WHO. Result: Of the 200 samples analysed, 132 were from the 2nd wave and 68 from the 3rd wave. Of the 132 samples, 57 were Delta (B.1.617.2) and 75 were Delta sub-lineages. Of the 68 samples, 41 were Omicron (B.1.1.529), 11 were Omicron lineage BA.1, and 16 were Omicron BA.2. Conclusion: During the 2nd wave, symptomatic patients were detected with more Delta and Delta sub-lineages, showing a high mortality rate. During the 3rd wave, Omicron and Omicron sub-lineages were detected more than Delta, showing very high transmissibility and lower mortality. Continuous monitoring and analysis of sequence variants are needed to understand the genetic heterogeneity.
Research Article
Open Access
Pattern of Ocular Manifestations in Pregnancy and Labour: From the Benign to the Vision-Threatening
Pages 1297 - 1302

Abstract
Ocular manifestations during pregnancy and labor are multifaceted, ranging from benign fluctuations in visual acuity to potentially vision-threatening conditions. Understanding these manifestations is essential for comprehensive maternal healthcare. Objective: This study aims to elucidate the patterns of ocular manifestations in pregnant women, investigate associated risk factors, assess their clinical significance, and classify them into benign and vision-threatening categories. Methods: A retrospective analysis of the medical records of n = 200 pregnant women was conducted, with data collected on ocular symptoms, preexisting ocular conditions, and pregnancy-related complications from January 2020 to September 2023. Ophthalmological examinations included visual acuity assessment, intraocular pressure measurement, and fundus evaluation. Results: Among the participants, 48.5% reported mild fluctuations in visual acuity, primarily attributed to hormonal changes. Preexisting ocular conditions were exacerbated in 12.3% of cases, with dry eye syndrome being the most prevalent. Elevated intraocular pressure (>21 mmHg) was observed in 6.8% of participants, necessitating further evaluation for glaucoma. Rare but severe conditions, including central serous chorioretinopathy (1.5%) and central retinal vein occlusion (0.6%), were identified, often associated with hypertensive disorders. Psychological distress due to ocular symptoms was reported in 22.7% of cases. Conclusions: Ocular manifestations during pregnancy and labor are common, with fluctuations in visual acuity and exacerbation of preexisting conditions being the most prevalent. Regular ophthalmological monitoring during pregnancy is crucial to identify and manage potentially vision-threatening conditions. Addressing the psychosocial impact of ocular symptoms is also essential for holistic maternal care.
Research Article
Open Access
Evaluation of Pulmonary Sequelae in Covid-19 Patients
Pages 1491 - 1497

View PDF
Abstract
The long-term pulmonary sequelae in COVID-19 patients remain a crucial area of investigation. This study aims to evaluate the resolution of pulmonary abnormalities in COVID-19 survivors through serial CT scans. Methods: An observational study was conducted on 80 COVID-19 patients, with CT scans performed during hospitalization and at two follow-up intervals. Ground glass opacities, consolidation, interstitial septal thickening, and fibrous bands were among the evaluated radiological findings. Results: At baseline, ground glass opacities were present in all patients (100%), with a significant resolution by the second follow-up (complete resolution in 51.2%). Consolidation was observed in 78.8% of patients initially, with 84.1% showing complete resolution at the second follow-up. Interstitial septal thickening and fibrous bands also showed considerable resolution over time. A significant correlation was found between higher CRP levels and increased CTSI scores (p=0.0001). Conclusion: The study demonstrates a significant resolution of initial pulmonary abnormalities in COVID-19 patients over time. The findings highlight the potential for lung recovery post-COVID-19, while also emphasizing the importance of monitoring for long-term sequelae, especially in patients with severe initial presentations.
Research Article
Open Access
A Study of Correlation of Quantitative C–Reactive Protein With CD4 Count in Patients of HIV on ART at KIMS, Hubli, Karnataka
Pages 1509 - 1519

View PDF
Abstract
Since the beginning of the epidemic, 76 million people have been infected with HIV and about 33 million have died of HIV/AIDS. Globally, 38.0 million people were living with HIV at the end of 2019 according to the WHO report. In developing nations, the ever-growing incidence of HIV infection has placed a huge burden on the economy, so there is a growing need for cheaper alternatives for monitoring disease activity. Infections in people living with HIV reflect the immune suppression of the host; hence, CRP can be used as a marker of the degree of immune suppression and of the severity and type of opportunistic infections. Material and Methods: 144 HIV patients admitted to the General Medicine department of KIMS Hubballi were studied in a single-centre, prospective observational study carried out over a period of 2 years. Patients with opportunistic infections were studied with respect to CD4 count and CRP levels, and statistical analysis was used to find the correlation between the two. Results: The mean age of our study population was 36 years, and 59% of the patients were male. Oral candidiasis was the most common opportunistic infection. The mean CD4 count was 228.03 and the mean serum CRP level was 22.98. As the severity of opportunistic infection increased, CRP levels increased and CD4 counts decreased. Our study found a significant correlation (Pearson correlation, r = −0.781, p < 0.0001) between CD4 count and CRP levels. Conclusions: As CRP levels show a significant negative correlation with CD4 count and a significant positive correlation with the type and severity of opportunistic infections, CRP can be used as a marker of immunosuppression in place of CD4 count in resource-limited settings in patients with opportunistic infections.
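The abstract above summarizes its key finding as a Pearson correlation coefficient (r = −0.781) between CD4 count and CRP. As a minimal sketch of how such a coefficient is computed, here is a self-contained implementation run on made-up CD4/CRP pairs (illustrative values only, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical patients: CD4 count (cells/uL) falling as serum CRP (mg/L) rises
cd4 = [450, 380, 290, 210, 150, 90]
crp = [5.0, 9.5, 18.0, 26.0, 35.5, 48.0]
r = pearson_r(cd4, crp)  # strongly negative, mirroring the reported inverse trend
```

A value near −1 indicates that higher CRP accompanies lower CD4, the pattern the authors report; their r = −0.781 was computed on the actual 144-patient dataset.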
Research Article
Open Access
Clinical Outcomes and Management Strategies in a Critical Care Setting During COVID-19: A Detailed Analysis of Patient Progress and Response to Treatment in the ICU of Government General Hospital, Nizamabad
Dr Kiran Madhala, Dr Ch Subash Kumar, Dr Suresh Babu Sayana, Dr B. Vishwanath
Pages 51 - 57

View PDF
Abstract
Background: This study examines the clinical outcomes and efficacy of management strategies for patients in the ICU of Government General Hospital, Nizamabad. It focuses on evaluating the impact of therapeutic interventions like oxygen therapy and CPAP adjustments in a critical care setting, particularly during the challenging period of the COVID-19 pandemic. Methods: A retrospective observational study was conducted involving 50 patients admitted to the ICU. The evaluation criteria included monitoring changes in oxygen saturation levels, the usage and adjustment of CPAP, and the presence of comorbid conditions. The study aimed to categorize patient outcomes into three groups: improvement, stability, and deterioration during their ICU stay. Results: Among the patients studied, 60% (30 patients) demonstrated clinical improvement, marked by increased oxygen saturation, reduced respiratory distress, and stabilized vital signs. 20% (10 patients) maintained a stable condition with no significant change in their health status. In contrast, another 20% (10 patients) experienced a deterioration in their condition, necessitating enhanced respiratory support. The study also found a high prevalence of comorbidities; 40% (20 patients) had hypertension, and 30% (15 patients) had diabetes mellitus. Conclusion: This study offers a comprehensive analysis of the clinical outcomes and management strategies in an ICU setting during a critical period. The findings highlight the effectiveness of personalized treatment approaches, the impact of comorbidities on patient outcomes, and the challenges posed by the COVID-19 pandemic. These insights are crucial for enhancing patient care in critical settings and for guiding future research in the field of critical care medicine.
Research Article
Open Access
Role of Transcranial Doppler in Early Diagnosis and Monitoring of Cerebral Vasculopathy in Pediatric Tuberculous Meningitis
Abinashi Sabyasachi Sethy
Pages 191 - 195

View PDF
Abstract
Background: Neurotuberculosis, particularly tuberculous meningitis (TBM), poses a significant threat to pediatric populations, often leading to severe morbidity and mortality. Timely diagnosis and intervention are critical for improved outcomes. Neuroimaging, including CT and MRI, plays a crucial role in identifying characteristic features of TBM, such as basal hyperdensities, hydrocephalus, and periventricular infarcts. Transcranial Doppler (TCD) is an emerging tool offering real-time, non-invasive assessment of cerebral hemodynamics, but limited research has explored its role in TBM-related vasculopathy. Methodology: A prospective study conducted from August 2019 to July 2020 included 60 pediatric TBM patients. Diagnosis followed the consensus clinical case definition. TCD was performed serially on days 1, 3, and 7, and findings were compared with CT. Disease severity was graded using the Modified British Medical Research Council (MRC) staging. Statistical analysis was performed with the significance level set at p < 0.05. Results: The study identified a positive correlation between TCD findings and disease stage, with 52.5% of subjects exhibiting normal Doppler studies. Abnormal findings included stenosis in 37% of cases, primarily involving the middle cerebral artery (MCA). The correlation between TCD and CT angiography (CTA) was highly positive, with TCD demonstrating a sensitivity of 91.7%, specificity of 85.7%, and overall accuracy of 87.5%. Discussion: The findings underscore TCD's effectiveness in the early diagnosis and monitoring of cerebral vasculopathy in pediatric TBM, particularly in identifying stenotic areas. The positive correlation between TCD and disease stage supports its utility as a reliable tool for assessing disease progression.
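The sensitivity, specificity, and accuracy figures quoted for TCD against CTA come from a standard 2×2 diagnostic table. A short sketch of those formulas, evaluated on hypothetical cell counts (the abstract does not give the study's actual 2×2 table):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and accuracy from a 2x2 diagnostic table,
    where the reference standard (here CTA) defines disease status."""
    sensitivity = tp / (tp + fn)                # positives detected among diseased
    specificity = tn / (tn + fp)                # negatives among the non-diseased
    accuracy = (tp + tn) / (tp + fp + fn + tn)  # all correct calls over all cases
    return sensitivity, specificity, accuracy

# Hypothetical cell counts, for illustration only
sens, spec, acc = diagnostic_metrics(tp=11, fp=2, fn=1, tn=12)
```

With these illustrative counts, sensitivity is 11/12 ≈ 91.7% and specificity 12/14 ≈ 85.7%; the study's own figures would follow from its actual confusion-table counts in the same way.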
Research Article
Open Access
Prevalence of common risk factors for colorectal cancers and cardiovascular diseases in the general population of the wilaya of Bejaia
Pages 196 - 204

View PDF
Abstract
In Algeria, as at the global level, the incidence of cancers differs by sex: the most common cancer in men is colorectal cancer and in women breast cancer. All sexes combined, lung, colorectal, breast and prostate cancer represent the four deadliest cancers, with an overall cancer mortality estimated at 32,802,4116 (12.5%). Cardiovascular diseases represent the leading cause of death. All of these diseases have known risk factors, and mortality can be reduced significantly by acting on these risk factors alone. Despite their very different clinical manifestations, the development of cancer and of cardiovascular disease involves a common phenomenon: chronic inflammation. This was particularly well illustrated by the results of the CANTOS study, which showed that an antibody neutralizing the inflammatory protein interleukin-1β reduced both the risk of cardiovascular mortality and the risk of cancer. This common reliance of cancer and cardiovascular disease on chronic inflammation also explains why lifestyle habits that promote inflammatory conditions (smoking, physical inactivity, poor diet, obesity, diabetes) are risk factors common to both. Cancer and cardiovascular diseases are respectively the first and second causes of death in Algeria, together responsible for about 60% of all annual deaths. The aim of our study was to determine the common risk factors for colorectal cancer and cardiovascular disease in subjects over 50 years old in a representative sample of the population of Bejaia. During the study period, a total of 3002 citizens were included: 1735 (43.38%) from the daïra of Souk el Tenine, 375 (12.5%) from the daïra of Adekar and 892 (29.7%) from the daïra of Amizour. The largest age group in the study population was 50–54 years, followed by 60–64 years.
The comorbidities found in our target population were hypertension in 29.3%, diabetes in 22.2%, obesity in 32.35%, and dysthyroidism. This survey made it possible to highlight the extent of the risk factors for cardiovascular disease, diabetes, obesity, and colorectal cancer in the population studied. The results obtained will serve as baseline data for monitoring the most prevalent non-communicable disease prevention and control indicators in this region.
Research Article
Open Access
Prevalence of Persistent Pleuritic Chest Pain, its Risk Factors and Association with Treatment Outcome in Patients of Pleural Effusion on Antitubercular Treatment
Shashikant Bhaskar,
Omprakash Dipak Jalamkar,
Amarawatin Kurre,
Ivona Lobo,
Gopalsing Namdeosing Solanke
Pages 878 - 885

View PDF
Abstract
Background: Tuberculous pleural effusion represents a significant manifestation of extrapulmonary tuberculosis, with pleuritic chest pain being a common symptom that affects patient quality of life and treatment outcomes. Methods: This retrospective study analyzed 100 patients with tuberculous pleural effusion undergoing antitubercular treatment at a tertiary care teaching hospital. Data on demographic and clinical characteristics, treatment outcomes, and the prevalence of pleuritic chest pain were collected and analyzed. Results: The study found that 82% of patients presented with pleuritic chest pain, which significantly reduced to 8% by the end of treatment. The majority of patients were male (57%), with a mean age of 37.46 ± 14.2 years. Malnutrition was prevalent, with 44% of patients having a BMI of less than 18.5 kg/m². Fever (93%) and cough (72%) were the most common symptoms at presentation. Treatment outcomes were positive, with 94% of patients completing treatment. Conclusion: The significant reduction in pleuritic chest pain highlights the efficacy of antitubercular treatment. The study underscores the importance of addressing nutritional needs and monitoring for potential drug resistance, especially in patients with persistent symptoms. Future research should focus on a comprehensive care approach, including the role of adjunct therapies in managing TB pleural effusion.
Research Article
Open Access
Evaluation of prescription pattern of analgesics and antimicrobial agents and their adverse drug reactions reported from an institutional dental hospital in North India
Ramsha Ahsan,
Md. Kalim Ansari,
Sharique Alam,
Irfan Ahmad Khan
Pages 970 - 976

View PDF
Abstract
Background: Analgesics and antimicrobial agents are among the most commonly prescribed drugs in dental patients; therefore, monitoring their use and adverse reactions is very important. Aims and Objectives: To evaluate the prescription patterns and associated adverse drug reactions of analgesics and antimicrobial agents in dental patients. Materials and Methods: This was an observational study conducted in a tertiary care centre in northern India from July 2022 to September 2022. The study assessed a total of 100 prescriptions from dental practitioners. The majority of patients visiting were in the age group of 21-30 years. The standard ADR reporting forms of the CDSCO were used to record all adverse events experienced by the patients. Causality assessment of adverse drug reactions was done using Naranjo's scale and severity assessment using the Modified Hartwig and Siegel scale. Results: The majority of patients visiting the dental practitioners were male (53%), and the most common dental conditions for which antimicrobials and analgesics were prescribed were acute/chronic caries (33%), acute/chronic periodontitis (30%), pericoronitis (19%), periapical abscess (4%), post RCT (5%), trismus (2%), sialadenitis (1%), postoperative TMJ ankylosis (1%), oral cancer (1%), oroantral fistula (1%), ameloblastoma of the jaw (1%), and cellulitis (2%). The antimicrobials most frequently used for the management of the observed conditions were amoxycillin + potassium clavulanate (67%), cefixime (27%), cefixime + clavulanic acid (2%), ceftriaxone (inj.) (1%), amikacin sulphate (1%), and metronidazole (2%); the most common analgesics prescribed were aceclofenac + paracetamol (71%), paracetamol (5%), diclofenac sodium (23%), and Dynapar AQ (1%). Out of 100 patients, 19 reported adverse drug reactions (ADRs); among these 19 patients, 52 adverse drug reactions of different types were reported.
Conclusion: Our findings suggest caries as the most frequently reported condition for which antimicrobials and analgesics were prescribed. Amoxicillin + potassium clavulanate, followed by cefixime, was the most commonly used antimicrobial. 19% of patients reported ADRs, of which nausea and diarrhea were the most frequent.
Research Article
Open Access
Predictors of Mortality in Patients with Concomitant and Sequential Covid – 19 Associated Mucormycosis – A Cross Sectional Study in a Tertiary Care Centre
Govinda Balappa,
Ramesh S Maddimani,
Rakshitha N S,
Sachin K S
Pages 1012 - 1018

View PDF
Abstract
Introduction: Mortality rates for COVID-19-associated mucormycosis vary greatly in reported studies. A systematic evaluation of 101 cases revealed a fatality rate of 30.7 percent. However, research on the determinants of death in COVID-19-associated mucormycosis is insufficient. The purpose of this study was to find out what factors contributed to in-hospital mortality in patients with COVID-19-related mucormycosis. Objectives: To study the clinical profile and the haematological, biochemical and radiological changes associated with mortality in patients with COVID-19-associated mucormycosis. Methodology: In this single-center observational study, 130 patients diagnosed with COVID-19-associated mucormycosis were recruited from a tertiary-level intensive care unit at Bowring and Lady Curzon Hospital, Bangalore, India. Results: The proportions of hypertension (HTN), ischaemic heart disease (IHD), chronic kidney disease (CKD) and HIV were significantly higher in non-survivors than in survivors. ICU admission and oxygen requirement were significantly higher in non-survivors and had a significant association with the outcome. There was no significant difference in the levels of Hb, neutrophils, lymphocytes, monocytes, eosinophils, and platelets (p > 0.05). Total count (17191 ± 7764), ESR (57.6 ± 12.4), CRP levels (199.0 ± 69.5), and serum ferritin (624.6 ± 268.0) were significantly higher among the non-survivors. Serum LDH (355.7 ± 108.9), serum free iron (51.7 ± 13.3), HbA1c (11.4 ± 2.4), and serum urea (36.9 ± 35.3) were also significantly higher among the non-survivors. Conclusion: The current study highlights that a multidisciplinary approach in COVID-19-associated mucormycosis patients that includes timely and effective surgical debridement coupled with appropriate antifungal therapy and diligent sugar monitoring with in-hospital glycemic control may help to lower mortality.
Research Article
Open Access
Ultrasound-Guided Implantable Chamber under Ambulatory Anesthesia “Comfort and Safety Regarding a Series at Bejaia University Hospital”
Chahira Mazouzi,
Amel Mekroud,
Adjia Kachenoura,
Reda Fehri Boubzari,
Radia Benyahia,
Salim Belkherchi
Pages 1046 - 1052

View PDF
Abstract
Care along the cancer patient's therapeutic course must meet standards of safety and, above all, comfort, for a population that is particularly fragile both physically and psychologically. Our work highlights the benefit of combining ultrasound guidance with ambulatory sedation for the placement of an implantable port in patients with active neoplasia. A series of 23 patients was collected in the Intensive Care Department of Bejaia University Hospital. The patients were referred by the medical oncology team for the installation of an implantable port for IV chemotherapy. The protocol consists of: verification of feasibility; preparation of the patient; a simplified explanation of the procedure; installation of monitoring equipment; titrated ambulatory sedation based on ketamine, with or without midazolam; local anesthesia with injectable lidocaine; and placement of the implant after ultrasound-guided catheterization. At the end of the procedure, the dressing is applied and the patient is reunited with his or her companion. This protocol made it possible to optimize the use of awake-sedation drugs, keeping the patient comfortable and making the procedure safer through ultrasound guidance. The initiation of this new protocol in adults, with very satisfactory results for permanent implantable venous access, will open the door to the care of the pediatric population for the safe placement of implantable catheter ports.
Research Article
Open Access
Identification of Ocular Structural and Functional Markers for Pre-diabetes and Diabetes Mellitus
Pick Ling Marinette Leong,
Kirtika Shrivastava,
Kokkula Vishal Kumar,
Pooja Agrawal
Pages 7 - 17

View PDF
Abstract
Introduction: Ocular structural and functional markers are important for the early detection and monitoring of pre-diabetes and diabetes mellitus. Diabetes is a persistent metabolic condition marked by high levels of glucose in the blood, which can lead to complications affecting different organs, including the eyes. Aims and Objectives: The primary objective of this research is to identify ocular structural and functional markers associated with pre-diabetes and diabetes mellitus. Methodology: Adults aged 25-45 underwent comprehensive eye and health assessments at a tertiary care centre, utilizing advanced tools such as the Omron body fat device and the A1C Now+ test. Ocular evaluations employed sophisticated methods, including the Cochet-Bonnet esthesiometer and Zeiss OCT. The investigation included 59 participants. Results: The study's findings reveal a distinct connection between diabetes, HbA1c levels, and various ocular parameters. Individuals with diabetes show elevated average HbA1c levels, more advanced age, decreased amplitude of accommodation, and increased presbyopic addition. Significant variations were noted in contrast sensitivity function (CSF) values, pain sensitivity reaction time, and various ocular surface measures in individuals with diabetes or prediabetes, suggesting possible effects on both systemic and ocular health. Conclusion: Functional markers such as the contrast sensitivity function and the photo stress recovery test were notably reduced in prediabetes cases, suggesting their value as visual indicators. Further investigation into the contrast sensitivity function is advised because of its negative relationship with blood sugar levels. Photo stress recovery test delays indicate early macular changes prior to diabetes diagnosis, highlighting the significance of proactive screening.
Research Article
Open Access
A Study on Early Complications of Cemented Bipolar Prosthesis in Fracture neck Femur in Elderly
Amit Rahangdale,
Puneet Kumar Acharya,
Ritesh Parteti,
Anita Harinkhede
Pages 183 - 188

View PDF
Abstract
Background: Fracture of the femur is a common reason for hospital admission among the elderly population, with increasing frequency due to factors such as longer life expectancy, osteoporosis, and sedentary lifestyles. Conservative treatment approaches often lead to complications and are not suitable for many patients. Hemiarthroplasty, particularly using bipolar endoprostheses, has emerged as an effective surgical intervention for displaced femoral neck fractures in elderly individuals, offering pain relief and improved mobility. Method: This prospective study evaluated 36 patients over the age of 50 with intra-capsular femoral neck fractures treated with hemiarthroplasty using bipolar endoprostheses. The study aimed to assess functional outcomes and quality of life using the Harris Hip Score. Patients underwent preoperative planning, medical evaluations, and surgical management with cemented bipolar hemiarthroplasty. Postoperative monitoring was conducted at regular intervals for up to six months, with clinical, functional, and radiological evaluations performed during follow-up appointments. Result: Among the study participants, 44.4% were aged 50-65, while 55.6% were over 65, with a mean age of 64.2 years. Females comprised 55.6% of the cohort. Evaluation of Harris Hip Scores showed that 50.0% of patients achieved excellent outcomes, 38.9% had good outcomes, and smaller proportions fell into fair and poor categories. Most participants reported no pain and exhibited favorable outcomes in terms of limping, support required, distance walked, range of motion, leg length discrepancy, and post-operative complications. Radiological assessments indicated satisfactory stem positions in the majority of cases. Conclusion: Bipolar hemiarthroplasty with cement fixation proves to be a beneficial treatment option for elderly patients with fractured neck of femur, offering good to satisfactory functional outcomes and low complication rates. 
This procedure facilitates early mobilization and restores pre-injury functional status in most patients, highlighting its effectiveness in addressing femoral neck fractures in the elderly population.
Research Article
Open Access
Maternal and fetal outcomes of dengue fever in pregnancy in a Tertiary care hospital of Eastern India
Dipnarayan Sarkar,
Sannyasi Charan Barman,
Rajat Kumar Das,
Kajal Kumar Patra,
Kishore P Madhwani,
Rituparna Mukhopadhyay
Pages 209 - 213

View PDF
Abstract
Background: Dengue is a vector-borne disease with various grades of severity. Pregnant women are a high-risk group prone to the complications of dengue haemorrhagic fever, and dengue fever has rapidly emerged as the most common arboviral infection globally. Objectives: The primary objective of the study was to assess the maternal and fetal outcomes of pregnancies affected by dengue fever. Materials and methods: This was an institution-based prospective observational study conducted in the Department of Gynaecology and Obstetrics, College of Medicine & Sagar Dutta Hospital, Kamarhati, Kolkata, West Bengal, India. After clearance from the ethics committee, the study was conducted from June 2022 to December 2022. All pregnant patients reporting to the hospital with fever and serologically confirmed dengue infection (40 confirmed cases) were included in the study. Clinical and laboratory data were collected, and the cases were followed up until delivery to monitor the effects of dengue. The data were entered in an MS Excel spreadsheet and analysed using the Statistical Package for the Social Sciences (SPSS) version 21.0. Results: In the present study, the platelet count of 9 patients (22%) was below 25,000 and that of 13 patients (33%) was between 15,000 and 50,000. Three patients (7.50%) needed ICU care, 9 (22.50%) needed platelet transfusion, 7 (17.50%) needed CPAP, 8 (20%) had postpartum haemorrhage (PPH), 7 (17.50%) had abortion, and 2 (5%) had placental abruption. Five fetuses (12.5%) suffered fetal distress, and 2 (5%) pregnancies were complicated by oligohydramnios. Of the neonates, 4 (35%) were normal, 8 (20%) needed SNCU admission, and 2 (5%) needed NICU admission. Conclusion: Pregnancy-related dengue illness progressed quickly and caused serious consequences. For a positive outcome for both mother and fetus, close materno-fetal monitoring and prompt obstetric care are necessary.
Research Article
Open Access
A Study on Functional Outcomes of Serial Cast Correction in Infant with Club Foot Deformity by Ponseti Method
Amit Rahangdale,
Krutika Shekhawat,
Soumitra Sethia,
Anita Harinkhede,
Ritesh Parteti
Pages 1703 - 1708

View PDF
Abstract
Background: Congenital Idiopathic Talipes Equinovarus (CTEV), commonly known as clubfoot, is a complex foot deformity that requires meticulous management to achieve optimal outcomes. The Ponseti method, characterized by serial casting and, if necessary, percutaneous tendoachilles tenotomy, has emerged as the preferred non-operative treatment approach for clubfoot. However, the traditional Ponseti protocol may pose logistical challenges for patients living far from medical centers. This study explores the feasibility and effectiveness of an accelerated Ponseti protocol, involving weekly casting sessions over a shorter duration, to alleviate the burden on patients and families. Methods: A prospective observational study was conducted at a tertiary care institute in Central India, involving infants with idiopathic clubfoot deformity aged between birth and 12 months. Patients underwent weekly manipulation and casting according to the accelerated Ponseti protocol. Pirani scoring system was used for initial assessment and monitoring of deformity correction. Data on demographic variables, treatment modalities, complications, and Pirani scores were collected and analyzed. Results: Among 60 included patients, the majority were male (75%) with bilateral involvement (55%). Most cases (87.10%) underwent casting combined with heel cord tenotomy, with 51.62% requiring 5-6 casts for correction. Complications were minimal, with only 3.23% experiencing superficial blisters. Significant improvement was observed in Pirani scores from a mean of 5.016 before treatment to 0.103 after treatment (p < 0.001), indicating successful deformity correction. Conclusion: The accelerated Ponseti protocol demonstrated feasibility and effectiveness in correcting idiopathic clubfoot deformity, with satisfactory outcomes and minimal complications. 
This approach offers a practical solution to reduce the treatment duration and logistical challenges associated with traditional Ponseti casting, particularly for patients living in remote areas. The study underscores the importance of early intervention and standardized assessment tools like the Pirani scoring system in guiding clubfoot management.
Research Article
Open Access
The Validation of a Mobile Based Ambulatory Heart Rhythm Monitoring Solution - Vigo Heart
Sowjanya Patibandla,
Kiran Kumar,
Rajani Adepu,
Rajiv Kumar Bandaru,
B. Maduri
Pages 374 - 385

View PDF
Abstract
Background: The need for mobile-based ambulatory heart rhythm monitoring arises from its potential to provide convenient and continuous tracking of heart rhythms, improving the early detection and management of cardiac issues while accommodating patients' active lifestyles. The present study aims to compare and validate 24 h ECG monitoring between the traditional Holter and the Vigo Heart wearable patch. Method: One hundred and nineteen patients with a workup of pre-diagnosed arrhythmias or suspicious arrhythmic episodes were evaluated. Each participant wore both devices simultaneously, and the cardiac rhythm was monitored for 24 h. Selected ECG parameters were compared between the two devices, and a cardiologist independently compared the diagnoses from each device. Results: The most common indication for ECG monitoring in the present study was suspicious arrhythmia-related symptoms (47.8%). The Vigo Heart ECG showed a negligible mean noise percentage (1.94 ± 6.68%) compared with 17.84 ± 23.95% of the total recording time for the traditional Holter. For the maximum heart rate, there was a significant correlation between Holter monitoring and the Vigo Heart patch (129.69 ± 22.5 vs. 113.31 ± 23.6 beats/min, p = 0.02). The results also showed significant correlations for the average heart rates (74.85 ± 10.8 vs. 76 ± 10.3 beats/min, p = 0.02) and minimum heart rates (47.94 ± 9.5 vs. 59.21 ± 8.9 beats/min, p = 0.02) for the Vigo Heart ECG and Holter monitoring, respectively. The cardiologist made coherent clinical diagnoses for all 119 study participants using both ECG monitoring devices. The findings also revealed comparably coherent detection of cardiac arrhythmias by both devices. Conclusion: The single-lead adhesive device presents itself as a viable and acceptable alternative for ambulatory ECG monitoring in individuals with arrhythmia or symptoms suspicious for arrhythmia.
Research Article
Open Access
Prevalence of Thyroid Disorders in Pregnancy
Neetu Singh Sikarwar,
Farhat Kazim
Pages 451 - 457

View PDF
Abstract
Background: Thyroid dysfunction during pregnancy is associated with adverse outcomes for both mother and child. This study aimed to investigate the prevalence of thyroid dysfunction among pregnant women and its correlation with obstetric outcomes and risk factors. Methods: A prospective observational study was conducted on 500 pregnant women. Thyroid function tests were performed at enrollment and during each trimester. Data on obstetric outcomes and compliance with treatment were collected. Results: The prevalence of thyroid dysfunction was 5.0%, comprising hypothyroidism (2.0%), hyperthyroidism (1.0%), subclinical hypothyroidism (1.6%), and subclinical hyperthyroidism (0.4%). No significant association was found between thyroid dysfunction and adverse obstetric outcomes such as preterm birth (20% vs. 9%, OR 2.5, p=0.08) and low birth weight (16% vs. 8%, OR 2.1, p=0.18). Age over 30 years (OR 2.0, p=0.02) and a family history of thyroid disease (OR 3.5, p=0.001) were significant risk factors. Follow-up results showed a progressive worsening of thyroid function during pregnancy. High compliance with levothyroxine treatment was observed (80%). Conclusion: While the prevalence of thyroid dysfunction in this cohort is in line with global rates, the study highlights the critical need for routine monitoring and management of thyroid function in pregnancy. The findings also emphasize the role of specific risk factors in identifying women at higher risk for thyroid dysfunction.
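The odds ratios quoted above (e.g. OR 2.5 for preterm birth) follow the usual cross-product form OR = (a/b)/(c/d). A minimal sketch with hypothetical counts chosen only to be consistent with the reported 20% vs. 9% preterm rates (the abstract does not give the actual cell counts):

```python
def odds_ratio(events_exposed, total_exposed, events_control, total_control):
    """Odds ratio: odds of the event in the exposed group over the control group."""
    a, b = events_exposed, total_exposed - events_exposed    # exposed: event / no event
    c, d = events_control, total_control - events_control    # control: event / no event
    return (a / b) / (c / d)

# Hypothetical reconstruction: 5/25 preterm births with thyroid dysfunction (20%)
# vs. 9/100 in a euthyroid comparison group (9%); 25 is 5% of the 500-woman cohort
or_preterm = odds_ratio(5, 25, 9, 100)  # ~2.5, matching the reported OR
```

Note that an OR of 2.5 can still be non-significant (here p=0.08) when cell counts are small, which is why the abstract reports no significant association despite the elevated point estimate.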
Research Article
Open Access
Comparison of Haemodynamic Stress Response to Endotracheal Intubation Using Direct Laryngoscopy Versus Intubating Laryngeal Mask Airway in Adult Patients with Normal Airway
Sunil Hosalli Rajanna,
Sandhya Dakshinamurthy,
Hanuman Srinivas Murthy,
Pooja Shah
Pages 511 - 524

View PDF
Abstract
Background: In this study, we wanted to compare the haemodynamic responses to endotracheal intubation using the intubating laryngeal mask airway (ILMA Fastrach™) and direct laryngoscopy in adult patients with a normal airway. Methods: This was a hospital-based study conducted among 60 ASA grade I and II patients undergoing elective lumbar spine surgeries under general anaesthesia. They were divided into two groups: group I, the laryngoscopy group, and group II, the ILMA Fastrach™ group. The circulatory response to intubation was recorded in both groups by an invasive arterial blood pressure (BP) monitoring device placed before induction of anaesthesia. Values were recorded at pre-induction, at induction, every minute post-induction for 3 minutes, at ILMA insertion/laryngoscopy, every 10 seconds post-intubation for 2 minutes, and then every minute for the next 3 minutes. The maximum values and the maximum increases in BP and HR were recorded in both groups. Results: Intubation through the ILMA Fastrach™ was associated with significantly lower cardiovascular responses compared with direct laryngoscopy and intubation. There was a significant increase in blood pressure and heart rate from baseline in both groups. With laryngoscopy and intubation, the maximum values were at or above pre-induction values, whereas in the ILMA Fastrach™ group the maximum blood pressure values never exceeded pre-induction values. The maximum increases in blood pressure and heart rate from the respective baseline values were similar between the two groups, despite the longer time required for intubation in the ILMA Fastrach™ group compared with the laryngoscopy group. Conclusion: Intubation through the ILMA Fastrach™ was associated with a lesser haemodynamic response to intubation in adult normotensive patients with a normal airway.
Research Article
Open Access
The Ten Group Robson Classification: A Single Centre Approach Identifying Strategies to Optimise Caesarean Section Rates
Dr. Siftie Banga,
Dr. Tanya Mahindra,
Dr. Vandana Singh
Pages 1746 - 1751

Abstract
Background: The escalating rates of cesarean sections (CS) globally necessitate evidence-based strategies to mitigate unnecessary surgeries. Robson's ten-group classification system offers a standardized approach to assess CS rates across diverse healthcare settings. Understanding the factors contributing to CS rates is crucial for effective obstetric management. Methods: This study analyzed 346 cases of CS using Robson's classification scheme, incorporating data on demographics, gestational age, parity, onset of labor, and indications for CS. The sample size was determined based on previous research, and data were collected from case records using a proforma. The primary objective was to identify the group exerting the most significant influence on CS rates and evaluate rates within each category. Results: Group 5, comprising women with a history of previous CS, contributed notably to overall CS rates. Fetal distress emerged as a predominant indication for primary CS, followed by malpresentation of the fetus and failed induction. Groups 1 and 3, involving spontaneous labor, exhibited moderate contributions to CS rates. Conversely, Groups 6, 7, and 10 represented smaller proportions, indicating lesser impact on CS rates. Conclusion: The study underscores the significance of Robson's classification in assessing CS rates and identifying key contributors. Women with prior CS represent a substantial proportion of CS cases, highlighting the importance of offering trial of labor after cesarean section (TOLAC) where feasible. Efforts to enhance obstetric care should include improved training in fetal monitoring interpretation and reinstating skills in assisted vaginal breech birth and external cephalic version. By addressing these factors, healthcare facilities can strive towards optimizing CS rates while ensuring safe maternal and neonatal outcomes.
Research Article
Open Access
Cardiovascular Ramifications In Post-Acute COVID-19 Syndrome: A Comprehensive Investigation Into Pathophysiological Mechanisms And The Impact Of Diet
Saim Hasan,
Nidhi Sharma,
Abhishek Sharma,
Faiza Ismail
Pages 633 - 638

Abstract
The COVID-19 pandemic highlights the complex interplay between viral infections and cardiovascular disease (CVD), with dietary factors emerging as a critical modifiable risk factor. Examining the intricate relationship between COVID-19 and circulatory risk factors, this review highlights the growing role cardiologists play in managing chronic heart problems that arise after immunization and in assessing the prognosis of myocardial injury. Overshadowing myocarditis, acute myocardial damage is a prominent consequence of COVID-19 and is often linked with illness severity and viral load. There is concern about possible long-term cardiac effects, exacerbated by the Post-Acute Sequelae of SARS-CoV-2 (PASC), whose frequency is influenced by the severity of the acute disease. Several cardiac manifestations are emphasized, such as myocarditis, ischemic heart disease, and arrhythmias. Several heart problems, including myocardial ischemia, thrombosis, and inflammation, have also been related to several mRNA COVID-19 vaccines. Treatment approaches differ; therefore, early detection and care are required to lower cardiovascular mortality. A multidisciplinary strategy that includes ambulatory monitoring and stress testing is recommended for chronic symptom management. After the pandemic, diet will play a critical role in reducing disease risk, as it influences lifestyle, mental health, and access to healthcare. With a particular emphasis on nutrition as a significant modifiable risk factor, this review investigates the connections between COVID-19 infection and cardiovascular health.
Research Article
Open Access
Correlation Between Renal Function Tests And Thyroid Hormones In Patients With Thyroid Disorders
Rimpy Charak,
Ruhi Charak,
Shreya Nigoskar,
Ashutosh Kumar
Pages 661 - 668

Abstract
Introduction: Renal function is evidently modified in both hypothyroidism and hyperthyroidism. However, clinical data on the relationship between thyroid disease and renal function are scarce. The objective of this study was to evaluate alterations in biochemical indicators of renal function in individuals with thyroid dysfunction and to correlate these measurements with the patients' thyroid hormones. Material and Methods: A total of 25 patients with primary hyperthyroidism and 294 patients with primary hypothyroidism were included as cases, and 100 healthy persons were selected as controls. Immunoassay was used to evaluate thyroid-stimulating hormone (TSH), free thyroxine (FT4), and free triiodothyronine (FT3). Serum levels of urea, creatinine, and uric acid were measured using an EM-360 autoanalyzer. The estimated glomerular filtration rate (eGFR) was calculated using the Modification of Diet in Renal Disease (MDRD) formula. Renal function tests were evaluated in all cases. Results: Our study showed a statistically significant increase (p < 0.001) in the average levels of serum urea (36.26 ± 3.69) and uric acid (6.55 ± 0.34) in patients with hypothyroidism. Similarly, we observed a significant increase (p < 0.001) in serum urea (29.98 ± 2.17) and uric acid (6.59 ± 0.34) levels in patients with hyperthyroidism. Nevertheless, hyperthyroid patients exhibited a decrease in serum creatinine levels (0.70 ± 0.04) compared with the control group, resulting in an increase in estimated glomerular filtration rate (eGFR) (121.55 ± 5.79). Conversely, the hypothyroid group showed a significant increase in creatinine levels (1.04 ± 0.05) (p < 0.001), leading to a decrease in eGFR (102.05 ± 5.38) compared with the control group. Conclusion: Thyroid dysfunction is linked to abnormal renal function. Clinicians should recognize this association and consider thyroid function testing in patients with slightly raised biochemical indicators of renal function during treatment. Monitoring creatinine levels is necessary in people with thyroid disease.
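The abstract states that eGFR was calculated with the MDRD formula. As an illustration, here is a minimal sketch of the widely used 4-variable (IDMS-traceable) MDRD study equation; the abstract does not specify which MDRD variant or coefficient set the study used, so this is an assumption, not the study's exact implementation:

```python
def egfr_mdrd(scr_mg_dl: float, age_years: float, female: bool, black: bool = False) -> float:
    """4-variable MDRD study equation (IDMS-traceable), in mL/min/1.73 m^2.

    scr_mg_dl: serum creatinine in mg/dL; age_years: age in years.
    """
    egfr = 175.0 * (scr_mg_dl ** -1.154) * (age_years ** -0.203)
    if female:
        egfr *= 0.742  # sex correction factor
    if black:
        egfr *= 1.212  # race correction factor in the original MDRD equation
    return egfr

# Illustration: the mean hypothyroid creatinine reported above (1.04 mg/dL),
# for a hypothetical 45-year-old woman
print(round(egfr_mdrd(1.04, 45, female=True), 1))  # ~57.3
```

Note the strong dependence on creatinine, age, and sex; the mean eGFR values reported in the abstract reflect the study population's particular age and sex distribution.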
Research Article
Open Access
Assessing Perfusion Index Correlation between Right Toe P.I and Index Finger P.I in Lower Segment Caesarean Section Hypotension
Vishwanath K G,
Shivanagouda B Patil,
Arun M A,
Manjunath M H
Pages 858 - 864

Abstract
Objective: To determine the correlation between the perfusion index in the index finger and right toe and hypotension during spinal anesthesia in parturients undergoing LSCS. Methods: A cohort of 75 parturients (aged 18 to 35 years), scheduled for elective LSCS and classified as ASA-II, underwent intraoperative vital sign monitoring including heart rate, NIBP, respiratory rate, SpO2, and perfusion index measurements from both finger and toe sites. Readings were taken at two-minute intervals until the 15th minute, followed by five-minute intervals until surgery completion. Data were analyzed using SPSS 20.0. Results: Mean age was 25.91 years (±3.82), heart rate 95.55 bpm (±14.9), finger PI 5.64 (±3.49), toe PI 3.38 (±3.36), and MAP 98.28 mmHg (±8.70). ROC analysis identified baseline cutoffs for predicting hypotension: finger PI 3.55 and toe PI 1.85. Spearman's rank correlation analysis revealed significant correlations between baseline finger perfusion index (>3.55) and hypotension episodes (rs = 0.400, P < 0.001), and between baseline toe perfusion index (>1.85) and hypotension episodes (rs = 0.549, P < 0.001), suggesting moderate agreement. Conclusion: Perfusion Index (PI) is a valuable predictor of hypotension in healthy parturients undergoing elective LSCS under subarachnoid block. Continuous toe PI monitoring during spinal anesthesia induction may aid in predicting post-spinal hypotension and assessing aortocaval compression by the gravid uterus.
Research Article
Open Access
Right Toe and Index Finger Perfusion Index in Clinical Prediction of Post-Subarachnoid Block Hypotension in Lower Segment Caesarean Section: An Observational Study
Vishwanath K G,
Arun M A,
Shivanagouda B Patil,
Neelam Meena
Pages 865 - 869

Abstract
Objective: To determine the sensitivity and specificity of the perfusion index in the right toe and index finger as a predictor of post-spinal hypotension in parturients undergoing lower segment caesarean section (LSCS). Methods: 75 parturients aged 18 to 35 years, of ASA grade II, undergoing elective LSCS were included in the study. Intraoperative vital parameters, including heart rate, NIBP, respiratory rate, SpO2, and the perfusion index at both the finger and the toe, were recorded every 2 minutes until the 15th minute and then every 5 minutes until the end of the surgery. Data were analyzed using the SPSS 20.0 package. Results: In this study, the mean age was 25.91 years (±3.82), heart rate 95.55 bpm (±14.9), finger PI 5.64 (±3.49), toe PI 3.38 (±3.36), and MAP 98.28 mmHg (±8.70). Using ROC analysis, baseline cutoffs for predicting hypotension were identified: finger PI 3.55 (83.3% sensitivity, 51.1% specificity) and toe PI 1.85 (80% sensitivity, 56% specificity). Conclusion: The Perfusion Index (PI) can be used as an effective tool for predicting hypotension in healthy parturients scheduled for elective caesarean section under subarachnoid block. Continuous monitoring of toe PI during induction of spinal anaesthesia might help to predict the development of post-spinal hypotension and reflect aorto-caval compression by the gravid uterus.
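The sensitivity and specificity figures above come from dichotomizing baseline PI at the ROC-derived cutoffs. As a minimal sketch of how such figures are computed once a cutoff is chosen, using hypothetical toy data (not the study's measurements):

```python
def sens_spec(baseline_pi, hypotension, cutoff):
    """Sensitivity/specificity of the test 'baseline PI > cutoff' for predicting hypotension.

    baseline_pi: list of baseline perfusion-index values
    hypotension: parallel list of booleans (True = post-spinal hypotension occurred)
    cutoff: PI threshold above which the test is called positive
    """
    tp = fp = fn = tn = 0
    for pi, hypo in zip(baseline_pi, hypotension):
        positive = pi > cutoff
        if positive and hypo:
            tp += 1          # true positive
        elif positive and not hypo:
            fp += 1          # false positive
        elif not positive and hypo:
            fn += 1          # false negative
        else:
            tn += 1          # true negative
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative (hypothetical) data, not the study's cohort:
pi_values  = [4.2, 1.1, 3.9, 2.0, 5.0, 1.5, 3.6, 1.2]
hypo_flags = [True, False, True, False, True, False, True, False]
sens, spec = sens_spec(pi_values, hypo_flags, cutoff=3.55)
print(sens, spec)  # 1.0 1.0 on this toy data
```

In practice, an ROC analysis evaluates every candidate cutoff this way and selects the one that best balances sensitivity and specificity.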
Research Article
Open Access
Effectiveness and Safety of Rosuvastatin in Reducing LDL Cholesterol Levels: An Observational Study
G. Neeraja Rani,
Rekala Karunakar,
Dubbasi Praveen Kumar,
Prashanth Kumar Patnaik
Pages 935 - 940

Abstract
Background: Dyslipidemia, characterized by elevated levels of low-density lipoprotein (LDL) cholesterol, is a significant risk factor for cardiovascular diseases. Rosuvastatin, a statin medication, is commonly prescribed to lower LDL cholesterol levels. However, comprehensive observational studies assessing its effectiveness and safety in real-world settings are limited. This observational study aimed to evaluate the effectiveness and safety of rosuvastatin in reducing LDL cholesterol levels among individuals with dyslipidemia.
Methods: A sample of 100 participants with dyslipidemia, aged 45-65 years, was enrolled in a 12-week observational study. Baseline characteristics, including age, gender distribution, and baseline LDL cholesterol levels, were recorded. Participants received rosuvastatin therapy as per standard clinical practice. The primary outcome measure was the change in LDL cholesterol levels from baseline to the end of the 12-week treatment period. Safety assessments included monitoring for adverse events, liver function tests, and creatine kinase levels. Compliance and adherence to medication were also evaluated.
Results: Following 12 weeks of rosuvastatin therapy, a significant reduction in LDL cholesterol levels was observed across the sample (mean reduction: 30 mg/dL ± 5 mg/dL). Subgroup analysis based on baseline LDL cholesterol levels demonstrated consistent reductions, with greater reductions observed in participants with higher baseline LDL cholesterol levels. Rosuvastatin therapy was well-tolerated, with no serious adverse events reported. Common adverse effects were mild and transient, including muscle aches, gastrointestinal discomfort, and headache. Compliance with therapy was high, with 95% of participants completing the treatment period and adherence rates exceeding 90%. Secondary outcomes indicated improvements in total cholesterol (mean reduction: 35 mg/dL ± 6 mg/dL), triglycerides (mean reduction: 25 mg/dL ± 4 mg/dL), and HDL cholesterol (mean increase: 5 mg/dL ± 2 mg/dL).
Conclusion: This observational study provides evidence supporting the effectiveness and safety of rosuvastatin in reducing LDL cholesterol levels among individuals with dyslipidemia. Rosuvastatin therapy was well-tolerated and associated with improvements in lipid profiles. These findings highlight the potential of rosuvastatin as a therapeutic option for managing dyslipidemia and reducing cardiovascular risk.
Research Article
Open Access
A Retrospective Study by the General Surgery Department at FM MCH Examining the Clinical Characteristics and Management of Hydatid Cysts of the Liver
Narayan Chandra Behera, MS(Surgery), Mch (Urology),
Rukmani Jena,
Arvind Ranjan Mickey,
Abhishek Patro
Pages 1069 - 1074

Abstract
Introduction: Hydatid cysts of the liver (HCL) are a severe yet under-appreciated public health concern in developing nations such as India. HCL is mostly caused by the tapeworm Echinococcus granulosus [1]. In 2010, a World Health Organisation (WHO) study estimated the incidence of cystic echinococcosis in Southeast Asia at 0.8 per 100,000 individuals (95% confidence interval: 0.2-2) [2]. Estimating the impact of HCL in India is difficult for a variety of reasons. First, the overall frequency of the illness is greatly underreported in many epidemiological studies and series owing to a lack of thorough research and surveys covering the whole endemic population. Furthermore, the Health Management Information System, the Government of India's monitoring system, tends to underreport.
Aim: To evaluate the clinical symptoms, treatment, and sociodemographic characteristics of hepatic hydatid cyst patients in a developing country.
Materials and Methods: This retrospective, descriptive study was undertaken over 24 months by the Department of General Surgery at FM Medical College and Hospital, Balasore. It included 23 patients diagnosed with a liver hydatid cyst on the basis of clinical symptoms, imaging, or serology. The data were assessed and statistically analysed using IBM SPSS 23.0 for Windows.
Results: The age group of 25 to 45 years was the most commonly affected (10, 43.47%), and the average patient age was 36 years. Female patients made up 56.5% of the total. Abdominal discomfort (21, 91.3%) and a palpable liver (7, 30.4%) were the most common findings. Abdominal ultrasonography and computed tomography were the two primary imaging techniques used to establish the diagnosis. Anechoic, unilocular cystic lesions were the most prevalent kind, and most liver cysts were found in the right lobe. In 44.4% of the patients, hydatid cysts were surgically removed; the most common surgical operation was pericystectomy.
Conclusion: In India, hepatic hydatid cysts are a frequent source of illness. For most patients, surgery remains the primary course of treatment; diagnosis requires a clinical examination accompanied by imaging investigations.
Research Article
Open Access
Use of Oxygen Saturation Index for monitoring of patients with hypoxic respiratory failure and role in predicting success of extubation in mechanically ventilated patients
Asha Prakash Mohapatra,
Gayatri Ray,
Pusparaj Aditinandan Pradhan,
Deshish Kumar Panda,
Saroj Shekhar Rath
Pages 1163 - 1169

Abstract
Background: Hypoxemic respiratory failure is an important cause of intensive care unit (ICU) admissions. Oxygen index (OI) and Oxygen saturation index (OSI) are important parameters used for diagnosing and monitoring critically ill children with hypoxic respiratory failure in ICU.
Objectives: To find out the correlation between OI and OSI and to determine the reliability of OSI in predicting the success of extubation.
Methods: This prospective study included children aged 1 month to 14 years requiring mechanical ventilation at a tertiary care teaching hospital over a period of 2 years. Arterial blood gas analysis was done; OI and OSI values were calculated as per protocol.
Results: A total of 148 children were included (boys:girls = 2:1). A mean (± SD) OI of 4.9 ± 2.3 and OSI of 5.7 ± 2.8 were recorded, with a mean difference of 0.75 ± 1.90. A good correlation was found between OI and OSI (r = 0.73). The regression equation obtained was OI = 1.5 + (0.6 × OSI). At an OSI cutoff of 4.15 (equivalent to an OI of 4), the sensitivity for diagnosing P-ARDS was 89.7%. A modest correlation was found between OSI and the success of extubation (r = 0.32).
Conclusions: Although good correlation exists between OI and OSI, many factors significantly affect the difference between the two. Therefore, OSI can be used as a reliable monitoring method in controlled settings after ensuring good patient selection, proper method of sampling and sample handling, good quality electronic devices and invasive monitoring facilities.
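For reference, OI and OSI are conventionally derived from FiO2, mean airway pressure, and PaO2 or SpO2. The sketch below uses those conventional definitions (the study's exact measurement protocol is not given, so they are an assumption) together with the regression equation reported in the abstract:

```python
def oxygenation_index(fio2: float, mean_airway_pressure: float, pao2: float) -> float:
    """Conventional OI = (FiO2 x mean airway pressure x 100) / PaO2.

    FiO2 as a fraction, mean airway pressure in cmH2O, PaO2 in mmHg.
    """
    return fio2 * mean_airway_pressure * 100 / pao2

def oxygen_saturation_index(fio2: float, mean_airway_pressure: float, spo2: float) -> float:
    """Conventional OSI = (FiO2 x mean airway pressure x 100) / SpO2 (SpO2 in %),
    the non-invasive surrogate for OI."""
    return fio2 * mean_airway_pressure * 100 / spo2

def oi_from_osi(osi: float) -> float:
    """Regression reported in the abstract: OI = 1.5 + (0.6 x OSI)."""
    return 1.5 + 0.6 * osi

# Consistency check: the abstract's OSI cutoff of 4.15 maps to roughly OI = 4
print(round(oi_from_osi(4.15), 2))  # 3.99
```

This makes the abstract's parenthetical "(equivalent to an OI of 4)" explicit: substituting the OSI cutoff into the regression yields an OI of about 4.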
Research Article
Open Access
Clinical Profiling of Portal Venous Thrombosis in Patients with Chronic Liver Disease: A Cross-Sectional Study
Rohit Dubey,
Anand Rajput,
Varsha Patel,
Rajkishore Singh
Pages 1 - 6

Abstract
Introduction: The incidence of portal vein thrombosis can vary depending on factors such as age, underlying liver condition, portal venous blood flow rate, and the patient's pro- or anticoagulant status. This study aimed to describe the clinical profile of patients with chronic liver disease and to assess the correlation between portal vein thrombosis and color Doppler findings. Materials & Methods: A total of 145 patients diagnosed with chronic liver disease, of both alcoholic and non-alcoholic etiologies and both genders, were included in the study. Detailed medical histories, clinical examinations, and laboratory evaluations were conducted for all patients, including fasting glucose levels, liver function tests, and coagulation profiles (prothrombin time, INR). Results: Among the 145 participants, the majority belonged to the 36-55 year age group, and the average age was 44.78±12.51 years. Most participants had hemoglobin levels below 11 gm%. Serum bilirubin levels were above normal in 68 participants, and elevated SGPT/SGOT values were observed in 54. 48 participants had below-normal serum albumin levels, and 44 had low platelet counts. Subjects with portal vein thrombosis exhibited a significantly larger mean portal vein diameter than those without. Biochemical markers showed a significant association between the Child-Pugh score and platelet count, INR, and total bilirubin. Conclusion: Portal vein thrombosis can exacerbate hepatic decompensation and affect the survival of patients with cirrhosis, although its prognostic significance in cirrhosis remains uncertain. Early detection, appropriate treatment, and regular monitoring can help prevent portal vein thrombosis in liver cirrhosis, leading to improved liver function and survival.
Research Article
Open Access
Fetomaternal outcome in cardiac disease complicating pregnancy: A retrospective study
Ramya Palani,
Preetha Gunasegaran,
Deepa Shanmugham
Pages 84 - 89

Abstract
Background: Cardiac disease in pregnancy is an important cause of maternal morbidity and mortality. It complicates 1-3% of all pregnancies and is considered a leading cause of indirect maternal deaths. Pregnancies with cardiac disease are considered high risk, and their management is challenging. Aim: To evaluate the fetomaternal outcome in cardiac disease complicating pregnancy. Objectives: To evaluate fetal and maternal outcomes in pregnancy with cardiac disease, and to measure the prevalence of cardiac disease in pregnancy. Materials & Methods: In this retrospective observational study, all women with heart disease complicating pregnancy who delivered at a tertiary care centre from 2011 to 2018 were included. Their details were collected from case records and registers using a data collection proforma, and the outcomes were studied. Results: The prevalence of cardiac disease was 0.66%. The most common heart disease in pregnancy was rheumatic heart disease (72%), and among these the most common lesion was mitral stenosis (35%). Conclusion: Cardiac disease in pregnancy is high risk and has a major effect on fetal and maternal morbidity and mortality. Proper antenatal monitoring, involvement of a multidisciplinary team, and delivery in a tertiary care setup with ICU and cardiac care facilities will improve fetal and maternal outcomes in cardiac disease complicating pregnancy.
Research Article
Open Access
Mode of Delivery in Breech Presentation From 28 Weeks of Gestation and Its Perinatal Outcome
B. Neelima,
Padmavathi,
Dhanireddy Salini Sakuntala,
P. Rabbani Begum
Pages 179 - 192

Abstract
Aim: To study the mode of delivery in breech presentation from 28 weeks of gestation and its perinatal outcome.
Methodology: This study was conducted in the Department of Obstetrics and Gynaecology, GMC Kadapa, from February 2021 to July 2022.
Results: In this study, 100 cases of breech presentation were studied. 42% belonged to the age group of 20-25 years. 54% were multigravida and the remainder primigravida. 77% were beyond 36 weeks of gestation. 58% of cases were booked; the remainder registered late in pregnancy. 9% of cases had oligohydramnios as a risk factor, 9% had PIH disorders, and 6% had other medical disorders. The indications for caesarean section were FPD (20%), followed by oligohydramnios, footling presentation, and fetal distress. 51% of cases were frank breech, 33% flexed, and the remainder footling. 21 cases had a uterine anomaly, most commonly unicornuate uterus followed by septate uterus. Caesarean section reduced the risk of adverse perinatal outcome at term, during both labour and delivery, for singleton breech presentation compared with vaginal delivery. Fetal morbidity was lower and APGAR scores were better in fetuses delivered by lower segment caesarean section, while perinatal mortality was higher with vaginal delivery.
Conclusion: The present study found that vaginal delivery is not always a completely safe option but may be considered safe for babies in breech as long as the selection criteria are fulfilled and delivery is performed by a skilled, trained obstetrician with continuous fetal monitoring. A balanced decision about the mode of delivery should therefore be taken on a case-by-case basis, as it differs with each case and with gestational age; training in assisted breech delivery will, in the long run, optimise the outcome of breech presentation.
Research Article
Open Access
Pulse Oximetry Saturation in Comparison to PaO2 in ABG in Respiratory Distress in NICU and PICU
Ritika Singh Chandel,
Monisha Sahai
Pages 317 - 322

Abstract
Background: Pulse oximetry is widely used in the NICU and PICU to monitor oxygenation in newborns and children with respiratory distress. This study aimed to evaluate the relationship between arterial partial pressure of oxygen (PaO2) and pulse oxygen saturation (SpO2) values in this patient population. Methods: A total of 50 newborns and children with respiratory distress admitted to the NICU and PICU were included in this observational study. PaO2 and SpO2 values were obtained simultaneously, and their relationship was analyzed using correlation, linear regression, and agreement analyses. Results: A strong positive correlation was found between PaO2 and SpO2 (r = 0.78, p < 0.001). The linear regression equation was PaO2 = 21.5 + 0.46 × SpO2 (R-squared = 0.61, adjusted R-squared = 0.60, p < 0.001). The mean difference between PaO2 and SpO2 was 2.8 (SD = 8.2), with 95% limits of agreement ranging from -13.3 to 18.9. The sensitivity and specificity of SpO2 for detecting hypoxemia (PaO2 < 60 mmHg) were 85.7% and 91.2%, respectively. Conclusions: SpO2 is a reliable tool for monitoring oxygenation in newborns and children with respiratory distress, showing a strong correlation with PaO2. However, its accuracy may be influenced by factors such as the FiO2 level and the severity of hypoxemia. Clinicians should use SpO2 in conjunction with other clinical parameters and diagnostic tools when assessing and managing this patient population.
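The regression and agreement statistics above can be checked directly. The sketch below encodes the reported PaO2-from-SpO2 equation and the standard Bland-Altman 95% limits of agreement (mean difference ± 1.96 × SD), which recover the reported limits of -13.3 to 18.9:

```python
def pao2_from_spo2(spo2: float) -> float:
    """Linear regression reported above: PaO2 = 21.5 + 0.46 x SpO2 (SpO2 in %, PaO2 in mmHg)."""
    return 21.5 + 0.46 * spo2

def limits_of_agreement(mean_diff: float, sd_diff: float):
    """Bland-Altman 95% limits of agreement: mean difference +/- 1.96 x SD of the differences."""
    return mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff

# Reported mean difference 2.8 with SD 8.2
lower, upper = limits_of_agreement(2.8, 8.2)
print(round(lower, 1), round(upper, 1))  # -13.3 18.9, matching the abstract
print(round(pao2_from_spo2(90), 1))      # 62.9
```

The wide limits of agreement illustrate the abstract's caveat: although SpO2 tracks PaO2 well on average, the difference for an individual reading can be substantial.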
Research Article
Open Access
Anaesthesia Quality Assessment in the Recovery Room
Bhushan Nagarkar,
Vijaykumar Khandale,
Kailash Sharma
Pages 553 - 563

Abstract
Background: In the modern era of medical science, patients' post-anaesthesia recovery has improved, mainly because of better monitoring, intraoperative measures to avoid postoperative complications, and better immediate post-anaesthesia care. The present study aimed to determine the incidence of postoperative nausea and vomiting, hypothermia, and worst pain score in post-surgical patients in the recovery room. Materials and Methods: This prospective observational study was carried out, after approval from the institutional review board, on postoperative patients in the post-anaesthesia recovery room at Tata Memorial Hospital, Mumbai, over a period of two months. Data were collected from 1,007 of 1,191 patients undergoing elective surgical procedures. The incidences of postoperative nausea and vomiting (PONV), postoperative pain, and hypothermia were assessed in the postoperative recovery room. Results: With a cut-off value of 35°C, the incidence of hypothermia at ICU admission was 31.4%. There was a significant correlation between duration of anaesthesia and hypothermia (p=0.04). The incidence of hypothermia was 26.3% (162/617) in surface surgeries (breast, head and neck, bone and soft tissue services), 39.5% (154/390) in cavity surgeries (gastrointestinal, genitourinary, gynaecology and thoracic services), and 35.5% (11/31) in children <12 years. 6.6% of patients (66/1007) had nausea and 2% (20/1007) had vomiting on ICU admission. There was no correlation of severe nausea and vomiting with the use of intraoperative antiemetics or the duration of anaesthesia. 9.8% (99/1007) had moderate to severe pain on admission to ICU, 12.1% (122/1007) after one hour of admission, and 2% had severe pain during the ICU stay. There was no significant correlation between intraoperative analgesia and postoperative pain score.
The study did not find any correlation of hypothermia, PONV, or worst pain with increased duration of ICU or hospital stay, or with outcome (p > 0.05). Conclusion: The incidence of hypothermia in our study is similar to that in previous studies. The incidence of severe pain is similar to that reported in cancer patients but lower than in patients undergoing general surgical procedures. The rates of readmission and PONV in post-surgical patients are very low in our ICU compared with other studies. We need to take further steps to improve temperature monitoring and to control severe postoperative pain and PONV. Despite pharmacological advances and known risk factors, the incidence of postoperative complications remains high.
Research Article
Open Access
Observational Study on the Role of Doppler Ultrasound in Assessing Placental Insufficiency in High-Risk Pregnancies
Ritu Raj (MS),
Rajeev Ranjan (MD),
Palash Majumdar (MS),
Prof Somajita Chakraborty (MD)
Pages 1204 - 1212

Abstract
Background: Placental insufficiency is a significant cause of perinatal morbidity and mortality in high-risk pregnancies. Doppler ultrasound has emerged as a potential tool for early detection and management of this condition. Objective: To evaluate the role of Doppler ultrasound in assessing placental insufficiency and predicting adverse outcomes in high-risk pregnancies. Methods: In this study, we enrolled 100 high-risk pregnant women and performed Doppler ultrasound examinations of the umbilical artery (UA), middle cerebral artery (MCA), and uterine artery (UtA). Pregnancy outcomes and management changes were recorded. Results: Abnormal Doppler findings were observed in 35% of UA, 28% of MCA, and 32% of UtA examinations. UA Doppler showed high diagnostic accuracy for placental insufficiency (sensitivity 82.5%, specificity 96.7%). Abnormal UA Doppler was associated with increased odds of preterm delivery (OR 3.8, 95% CI: 2.1-6.9). Abnormal MCA Doppler correlated with low birth weight (OR 2.9, 95% CI: 1.7-5.2), while abnormal UtA Doppler was associated with pre-eclampsia (OR 4.2, 95% CI: 2.3-7.6). Doppler findings led to management changes in 45% of cases, including increased fetal monitoring (45%), antenatal corticosteroid administration (30%), and early delivery (22%). Conclusion: Doppler ultrasound is an effective tool for assessing placental insufficiency and predicting adverse outcomes in high-risk pregnancies, often guiding management decisions.
Research Article
Open Access
A Comparative Study of Oral and IV Magnesium in Reducing Hypomagnesemia and Arrhythmia
Preeti Bala Gautam,
Aman Kumar,
Bhupendra Tiwari
Pages 1248 - 1252

Abstract
Introduction: Cardiac arrhythmias are a prevalent issue following surgery, and hypomagnesemia is often associated with this complication. Prophylactic administration of intravenous magnesium has been standard practice for patients admitted to the ICU. This study aimed to compare the efficacy of oral versus intravenous magnesium in preventing hypomagnesemia and arrhythmias. Methods: In this interventional clinical study, 98 patients were randomly allocated into two groups. Baseline serum magnesium levels and arrhythmias were assessed for all patients. One group received 1.6 g of oral magnesium hydroxide via nasogastric (NG) tube, while the other was administered 2 g of magnesium sulfate at the induction of anesthesia. Serum magnesium levels were monitored for 48 hours postoperatively. Results: The difference in preoperative hypomagnesemia between the groups was not statistically significant. During surgery, serum magnesium levels peaked at approximately 4 mg/dL, with no hypomagnesemia observed in any patient. Serum magnesium levels in the oral group decreased in parallel with, but remained below, those in the intravenous (IV) group; no significant differences were observed during postoperative monitoring. Additionally, the prevalence of arrhythmias was 14.60% in the IV group and 6.83% in the oral group (OR=0.44). Conclusion: Administering 1.6 g of oral magnesium hydroxide is as effective as 2 g of intravenous magnesium sulfate in preventing hypomagnesemia and arrhythmias. This study suggests that oral magnesium supplementation is a promising, cost-effective alternative.
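The reported odds ratio can be checked from the two arrhythmia rates. A minimal sketch computing an odds ratio from group event probabilities: recomputing from the rounded percentages gives approximately 0.43, consistent with the reported OR of 0.44 (the small gap is rounding):

```python
def odds_ratio(p_exposed: float, p_control: float) -> float:
    """Odds ratio comparing the event probability in the exposed group vs. the control group."""
    odds_exposed = p_exposed / (1 - p_exposed)  # odds = p / (1 - p)
    odds_control = p_control / (1 - p_control)
    return odds_exposed / odds_control

# Arrhythmia rates from the abstract: 6.83% (oral group) vs. 14.60% (IV group)
print(round(odds_ratio(0.0683, 0.1460), 2))  # 0.43, close to the reported OR of 0.44
```

An OR below 1 indicates lower odds of arrhythmia with oral magnesium, in line with the abstract's conclusion.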
Research Article
Open Access
Cardiac Complications in Patients with Dengue Fever
Noorussaba Arfeen,
Devendra Kumar Sinha,
Kaushal Kishore
Pages 1223 - 1229

Abstract
Background: Dengue fever, a mosquito-borne viral infection caused by the dengue virus, presents a significant public health challenge, particularly in tropical and subtropical regions. While primarily known for its febrile and hemorrhagic manifestations, dengue fever can also lead to severe cardiac complications. This study aims to systematically investigate the incidence, clinical profile, and outcomes of cardiac complications in patients with dengue fever, providing critical insights into their management and prognostication. Materials and Methods: This prospective observational study was conducted at Patna Medical College and Hospital, Patna, from January to November 2023. It included 78 patients with a confirmed diagnosis of dengue fever, excluding those with pre-existing cardiac conditions. Detailed clinical assessments, electrocardiographic (ECG) monitoring, and echocardiographic evaluations were performed to identify cardiac complications. Routine laboratory investigations included cardiac biomarkers such as troponin I and creatine kinase-MB (CK-MB). Data were analyzed using SPSS software version 25, with logistic regression analyses to identify potential risk factors. Statistical significance was set at p<0.05. Results: The study included 78 patients with an average age of 35.4 ± 15.2 years; 66.7% were male. Cardiac complications were observed in 19.2% of patients, including myocarditis (7.7%), arrhythmias (5.1%), pericarditis (3.8%), and heart failure (2.6%). Patients with cardiac complications were more likely to have hemorrhagic manifestations (53.3% vs. 19%, p=0.018) and shock (33.3% vs. 7.9%, p=0.011). ECG abnormalities, such as arrhythmias and conduction defects, and echocardiographic findings, including reduced left ventricular ejection fraction and pericardial effusion, were prevalent. Elevated troponin I and CK-MB levels were noted in 66.7% and 53.3% of patients with cardiac complications, respectively. 
These patients had longer hospital stays (12.5 ± 4.2 days vs. 8.3 ± 2.1 days, p<0.001), higher intensive care needs (66.7% vs. 12.7%, p<0.001), and increased in-hospital mortality (13.3% vs. 1.6%, p=0.032). Conclusion: Cardiac complications in dengue fever are associated with significant morbidity and mortality. Hemorrhagic manifestations and shock are strong predictors of cardiac involvement. Routine cardiac monitoring using ECG and echocardiography, alongside the measurement of cardiac biomarkers, is essential for early detection and management. Addressing these complications promptly can improve patient outcomes and reduce the disease burden.
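The between-group comparisons above (e.g. hemorrhagic manifestations in 53.3% of patients with cardiac complications vs. 19% of those without) can be sketched with a Pearson chi-square on a 2x2 table. The counts below are reconstructed from the reported percentages (roughly 8/15 vs. 12/63) and are illustrative only; the authors used SPSS v25, and a different test such as Fisher's exact would give a somewhat different p-value than the 0.018 reported.

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Reconstructed, illustrative counts: hemorrhagic manifestations in
# 8 of 15 cardiac-complication patients vs. 12 of 63 of the rest.
stat = chi2_2x2(8, 7, 12, 51)
print(round(stat, 2))  # exceeds 3.84, the 5% critical value at 1 df
```

A statistic above 3.84 corresponds to p < 0.05 with one degree of freedom, consistent with the significance the abstract reports.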
Research Article
Open Access
Glycosylated Hemoglobin levels correlate with Carotid Intima Medial Thickness in young adults with thyroid dysfunction
Glycosylated Hemoglobin levels correlate with Carotid Intima Medial Thickness in young adults with thyroid dysfunction
Rhea Ratan,
Sandeep Garg,
Shreya Sehgal,
Jyoti Kumar,
Pragya Sharma,
Ruchir Rustagi,
Bhawna Mahajan
Pages 1269 - 1277

View PDF
Abstract
Background: To explore the association of carotid intima medial thickness (CIMT) with TSH and other biochemical parameters among young adults with thyroid dysfunction. Materials and methods: Our study included 50 young subjects, aged 13-39 years, attending the endocrinology clinic of our centre for thyroid dysfunction with no associated co-morbidities. BMI, thyroid and biochemical profiles were assessed for all. All subjects underwent measurement of right and left CIMT using sonography (linear transducer, 7 MHz frequency). Statistical methods were then used to analyse the data. Results: CIMT values in our 50 subjects [hypothyroid: n=37 and hyperthyroid: n=13; age: 27.6±7.1 years] fell in the normal range (right=0.53±0.10 mm; left=0.52±0.11 mm). Hypothyroid subjects had significantly higher HbA1c (p=0.038) and serum cholesterol (p=0.028) levels compared to hyperthyroid subjects. When the values for the entire group were studied, CIMT did not correlate with either TSH or BMI [24.66±4.14 kg/m²], though it correlated positively with age and HbA1c (particularly right CIMT, correlation coefficient 0.50). Hyperthyroid subjects had a significant positive correlation of TSH with right CIMT (0.750) and serum creatinine (0.780) and a negative correlation with cholesterol (-0.700). On the other hand, in hypothyroid subjects, TSH levels did not correlate significantly with any parameter other than age (-0.38). Conclusion: Higher HbA1c levels (even in the non-diabetic range) are associated with higher CIMT among young patients with thyroid dysfunction, making HbA1c a useful tool for monitoring cardiovascular risk in conjunction with CIMT, especially in those with hypothyroidism.
Research Article
Open Access
Effects of haemodialysis on biochemical and endocrinological parameters in patients with renal failure attending a tertiary care hospital (PMCH, Patna)
Dr. Nayana Deb,
Dr. Madhu Sinha,
Dr. Satyendu Sagar
Pages 382 - 383

View PDF
Abstract
Objective: The present study was conducted to evaluate the effects of hemodialysis on biochemical and endocrinological parameters in patients with renal failure. Materials and methods: A cohort of 50 patients with end-stage renal disease (ESRD) undergoing hemodialysis was analyzed. Blood samples were collected pre- and post-dialysis to measure biochemical (electrolytes, urea, creatinine) and endocrinological (parathyroid hormone, erythropoietin, insulin) parameters. Statistical analysis was conducted to determine the significance of changes. Results: Hemodialysis significantly reduced serum urea and creatinine levels. Electrolyte imbalances, such as hyperkalemia, were corrected. Endocrinological changes included a significant reduction in parathyroid hormone and an increase in erythropoietin levels, while insulin levels showed variable responses. Conclusions: Hemodialysis effectively normalizes several biochemical imbalances in ESRD patients. Endocrinological parameters also show significant changes, highlighting the need for ongoing monitoring and management in these patients.
Research Article
Open Access
Comparative Study between Low Dose Ketamine and Ondansetron on Prevention of Hypotension in Patients Posted for Laparoscopic Cholecystectomy under General Anesthesia: A Randomized Double-Blind Study
Prashant Kumar Mishra,
Atit Kumar,
Purva Kumrawat,
Awadhesh Singh,
Amit Kumar Singh,
Matendra Singh Yadav
Pages 431 - 439

View PDF
Abstract
Background: This study was conducted to compare the efficacy of ketamine and ondansetron, two commonly used drugs, on blood pressure among patients undergoing laparoscopic cholecystectomy under general anesthesia. Methods: This was a prospective randomized double-blind study conducted among 56 patients coming for elective laparoscopic cholecystectomy under general anesthesia at UPUMS Saifai, Etawah, from November 2018 to April 2020, after obtaining clearance from the institutional ethics committee and written informed consent from the study participants. The study assessed preoperative patient conditions and randomly allocated the 56 patients into two groups. Group A (n = 28) received Inj. Ketamine 10 mg diluted up to 5 ml in normal saline, while Group B received Inj. Ondansetron 4 mg diluted up to 5 ml in normal saline before induction. All patients were premedicated and induced with standard drugs. Vital signs were recorded just after giving the study drug, at the time of induction, immediately after intubation, and every minute after intubation up to 10 minutes. Patients were monitored closely; any deviations from baseline were noted, and hypotension was managed with fluid resuscitation and rescue drugs if necessary. Heart rate changes were also recorded. The study aimed to evaluate the effects of ketamine versus ondansetron on hemodynamic stability during anesthesia induction, employing rigorous monitoring and treatment protocols for any adverse events. Results: In the comparison of systolic blood pressure (SBP) at baseline and at different follow-up intervals between the two study groups, just after administration of the study drug mean SBP was 135.82±12.14 mmHg in group A (ketamine) as compared to 122.82±11.16 mmHg in group B (ondansetron), a statistically significant difference between the two groups (p<0.001).
Immediately after intubation, mean SBP was 132.86±14.78 mmHg in group A (ketamine) as compared to 125.75±10.78 mmHg in group B (ondansetron); the difference between the two groups was statistically significant (p = 0.045). At all follow-up intervals, mean values were higher in group A (ketamine) than in group B (ondansetron), and the difference was also statistically significant at the 3 min and 10 min post-intubation intervals (p<0.05). Conclusion: The incidence of post-induction hypotension was higher in the ondansetron group than in the ketamine group; however, the difference was not statistically significant. It appears that pressor responses following intubation superseded the hypotensive effect of anesthesia induction.
Research Article
Open Access
Assessment of Cardiovascular Risk Factors in Middle-Aged Adults: A Longitudinal Observational Study
Akshaya Kumar Samal,
Deepak Narayan Lenka
Pages 485 - 493

View PDF
Abstract
Introduction: Cardiovascular diseases (CVDs) are the leading cause of death globally, with middle-aged adults particularly vulnerable to developing risk factors that can lead to serious health complications. Understanding the dynamics of these risk factors is crucial for effective intervention and prevention. Objective: This study aims to assess the progression of cardiovascular risk factors in middle-aged adults through a longitudinal observational approach, providing insights into the prevalence, trends, and potential early indicators for reducing CVD incidence. Method: In a longitudinal observational design, 522 middle-aged adults were selected through stratified random sampling from the Department of Cardiology, Hi-Tech Medical College & Hospital, Bhubaneswar, India. Baseline data collection, beginning in June 2019, included comprehensive health assessments, biochemical analyses, and lifestyle questionnaires. Follow-up assessments were conducted annually until June 2024. Key variables measured were blood pressure, lipid profiles, fasting glucose levels, body mass index (BMI), smoking status, and physical activity levels. Statistical analysis was performed using paired t-tests to compare baseline and follow-up data, with a p-value of <0.05 considered statistically significant. Results: Preliminary results indicate a high prevalence of hypertension (55%, p<0.01), dyslipidemia (47%, p<0.01), obesity (40%, p<0.01), and diabetes (30%, p<0.01) among participants. Over the five years, the incidence of hypertension increased to 60% (p<0.01), dyslipidemia to 52% (p<0.01), and obesity to 45% (p<0.01). Diabetes prevalence rose to 35% (p<0.01). Smoking rates slightly decreased from 25% to 22% (p=0.04), while physical inactivity remained high at 60% (p=0.03). Among urban populations, the increase in risk factors was more pronounced, with hypertension rising from 50% to 65% (p<0.01) and obesity from 35% to 50% (p<0.01).
The interrelationship between obesity, hypertension, and diabetes was significant, suggesting a compounded risk for cardiovascular events. Conclusions: The study highlights the escalating prevalence of cardiovascular risk factors in middle-aged adults, emphasizing the need for early and targeted intervention strategies. Public health initiatives must focus on lifestyle modifications and continuous monitoring to mitigate these risks and reduce the burden of CVDs.
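The baseline-vs-follow-up comparisons in this study rely on paired t-tests. A minimal sketch of the statistic on hypothetical systolic blood-pressure readings (the values below are invented for illustration and are not study data):

```python
import math

def paired_t(before, after):
    """Paired t statistic: mean within-subject difference over its standard error."""
    diffs = [b - a for a, b in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance of diffs
    return mean / math.sqrt(var / n)                     # df = n - 1

# Hypothetical systolic BP (mmHg) at baseline and follow-up for 4 subjects
baseline  = [120, 130, 125, 140]
follow_up = [128, 135, 131, 146]
t = paired_t(baseline, follow_up)  # compare against the t critical value at 3 df
```

The pairing matters: each subject serves as their own control, so the test is on within-subject change rather than on two independent group means.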
Case Report
Open Access
Carpopedal Spasm: A Diagnostic Dilemma
Monica Chhikara,
Monika,
Vaishali Gupta,
Bharti Singla,
Raj Bhagavan
Pages 816 - 818

View PDF
Abstract
Non-invasive blood pressure monitoring (NIBP) is a commonly used standard ASA monitor in the operation theatre. Despite being non-invasive, it can lead to various complications such as petechial rash, ecchymosis, venous stasis, thrombophlebitis, infection, hematoma formation in patients on blood thinners, compartment syndrome, compression neuropathy and skin necrosis. These kinds of complications are not routinely suspected by the anesthesiologist. Most are seen in diabetic patients, patients on anticoagulation therapy and elderly debilitated patients, owing to frequent monitoring. We report a case of mechanical trauma caused by NIBP monitoring in a patient posted for excision of a bladder cyst. Intraoperatively, the patient presented with carpopedal spasm distal to the BP cuff due to repeated cycling. We ruled out other causes of carpopedal spasm. The patient was managed for pain, and calcium gluconate was given for the spasm. The patient recovered and was shifted to the PACU. Thorough knowledge of complications and vigilance during the perioperative period can help the anesthesiologist avoid them in the future course of perioperative care.
Research Article
Open Access
Comparative evaluation of MRI sequences for optimal visualization of joint cartilage in osteoarthritis.
Pages 895 - 899

View PDF
Abstract
Osteoarthritis (OA) is a leading cause of disability, with its incidence rising in tandem with obesity rates. Traditional imaging methods, such as radiography, are limited in their ability to detect early cartilage changes, necessitating the exploration of advanced imaging techniques. MRI offers a non-invasive method to visualize joint structures, with various sequences providing different insights into cartilage morphology and composition. Methods: We conducted a comparative study involving 100 OA patients, utilizing multiple MRI sequences to assess joint cartilage. Each patient underwent imaging with the following sequences: T2 mapping, T2* mapping, T1 rho, dGEMRIC, gagCEST, sodium imaging, and DWI. Image quality, cartilage visualization, and sensitivity to cartilage degeneration were evaluated for each sequence. Quantitative measurements were taken to assess cartilage thickness, composition, and structural integrity. Results: • T2 Mapping: Effective in assessing cartilage hydration and collagen network integrity. Provided clear images of cartilage structure but was less sensitive to early biochemical changes. • T2* Mapping: Similar to T2 mapping but offered improved sensitivity to iron and other paramagnetic substances within the cartilage. • T1 Rho: Excellent for detecting early biochemical changes in cartilage, particularly proteoglycan content. • dGEMRIC: Provided detailed information on glycosaminoglycan (GAG) concentration, a key marker of cartilage health. • gagCEST: Offered high specificity for GAG concentration, though image acquisition times were longer. • Sodium Imaging: Directly measured sodium content, correlating with GAG concentration; however, it required specialized equipment and longer scan times. • DWI: Sensitive to changes in the microstructure of cartilage, offering insights into early degeneration processes.
Conclusion: Advanced compositional MRI techniques, particularly T1 rho and dGEMRIC, hold significant promise for the early detection and monitoring of OA. While traditional morphological sequences like T2 mapping remain valuable for structural assessment, integrating these advanced techniques can enhance the diagnostic accuracy and treatment planning for OA patients. Further research is needed to streamline these techniques for widespread clinical adoption.
Research Article
Open Access
Impact of Moderate Exercise on Cardiac Function in a Healthy Population
NilayKumar B Patel,
Harsiddh Thaker,
Nayan Mali,
Bhupendra Varlekar
Pages 78 - 82

View PDF
Abstract
Introduction: The present study evaluated the cardiovascular response to moderate physical activity in an Indian population, which is under-reported in the world literature and can provide unique insights. The aim of this research was to assess the cardiovascular response to moderate physical exercise in a healthy Indian population. Methods: Participants undertook a standardized submaximal exercise protocol, and cardiac output was evaluated continuously throughout the procedure using non-invasive methods such as cardiography in a hospital setting. These approaches were designed to capture the cardiovascular adjustments during moderate exercise without putting subjects under substantial stress. The sample size was n=100 in the resting-stage group and n=100 in the exercise group (total n=200). Participants were aged 19-50 years. Results: The results indicated a significant rise in cardiac output following exercise, and the data aligned with the work bout. This study showed that cardiovascular adjustment to physical stress is particularly efficacious in the Indian population. Heart function was found to be generally healthy in this group. Moreover, there were no significant gender differences, indicating that both sexes in this population have a similar cardiovascular response. There was a significant effect on cardiovascular activity among the people engaged in exercise in comparison to the control group. Conclusion: The study provides useful data on the cardiovascular fitness of young Indian people and supports regular cardiovascular testing for those engaged in physical activity.
The results indicate that this type of monitoring may become a valuable tool for better identifying populations at cardiac risk, eventually leading to greater health benefits over time.
Research Article
Open Access
A Combinative Study of Abnormal Fetal Doppler Ultrasound and Umbilical Cord Blood Gas Analysis in Detecting Fetal Acidemia
Dr Sumayya Tabassum M,
Dr Nimma Pooja Reddy,
Dr Nemakallu Sarala Reddy
Pages 158 - 164

View PDF
Abstract
Background: Intrapartum hypoxia causes fetal suffocation, acidosis, newborn brain damage, long-term morbidity, and mortality [1]. As a result, intrapartum fetal monitoring is performed to detect early indicators of fetal hypoxia and to take appropriate action as soon as possible to avoid the complications of fetal hypoxia. Objectives: 1. To study the correlation between fetal Doppler ultrasound and umbilical cord blood gas analysis in detecting fetal acidemia. 2. To identify the maternal and obstetric determinants associated with fetal acidemia. Materials & Methods: Study Design: Hospital-based prospective observational study. Study area: The study was conducted in the Department of Obstetrics and Gynaecology. Study Period: 1 year. Sample size: Using a sample size calculator, expecting a correlation coefficient of r=0.300, the required sample size was calculated to be 85. Study population: During the study period, a total of 92 pregnant women with abnormal fetal Doppler ultrasound findings, meeting the inclusion and exclusion criteria, were eventually recruited into this study. Sampling Technique: Simple random technique. Study tools and data collection procedure: Institutional ethics committee clearance was obtained. The design and nature of the clinical study were explained to the patients and their significant relatives. Informed consent was obtained from the patients. Socio-demographic data were collected and recorded in a specially designed proforma. Socioeconomic class was assessed using the Socio-Economic Status Schedule. Obstetric history data were gathered and recorded in the proforma. All patients in the sample group (n=92) underwent thorough clinical examination, including general physical, systemic and pelvic examination. Maternal blood samples for routine laboratory investigations (including tests for acidosis) were sent.
Results: There was a significant correlation between gestational age and fetal acidemia (p=0.0096). A gestational age of 41 weeks or more had a lower incidence of fetal acidemia. The middle cerebral artery pulsatility index (MCAPI) was significantly associated with fetal acidemia (p<0.0001). The umbilical artery pulsatility index (UAPI) had a highly significant correlation (p<0.0001) with the umbilical cord blood gas analysis. Conclusion: Based on the statistical analysis of the data, it is safe to assume that fetal Doppler ultrasound can predict the development of fetal acidemia. Two indicators, the middle cerebral artery pulsatility index and the umbilical artery pulsatility index, are strong predictors of fetal acidemia. However, determinants such as obstetric score, medical comorbidity, general physical examination and status of labour on per vaginal examination were not found to be significantly associated with fetal acidemia.
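The sample-size figure quoted in the Methods (n = 85 for an expected r = 0.300) is consistent with the standard Fisher z-transformation approximation, assuming two-sided alpha = 0.05 and 80% power — values the abstract does not state, so they are assumptions here:

```python
import math

def sample_size_for_correlation(r, z_alpha=1.96, z_beta=0.8416):
    """Approximate n needed to detect correlation r via Fisher's z-transformation.

    Defaults assume two-sided alpha = 0.05 (z = 1.96) and 80% power (z = 0.8416).
    Formula: n = ((z_alpha + z_beta) / C)^2 + 3, where C = 0.5 * ln((1+r)/(1-r)).
    """
    c = 0.5 * math.log((1 + r) / (1 - r))  # Fisher z of the target correlation
    return math.ceil(((z_alpha + z_beta) / c) ** 2 + 3)

print(sample_size_for_correlation(0.300))  # → 85, matching the abstract
```

Under those assumed alpha/power values the formula reproduces the reported n = 85 exactly.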
Research Article
Open Access
"Assessment of Cirrhotic Cardiomyopathy in Liver Cirrhosis Patients Using ECG Parameters and Echocardiographic Findings: A Cross-Sectional Study”.
Kamala Rajeswari Gollamudi,
Raghava Reddy Yarram,
Anil Kumar Bethapudi,
Hani Rajesh Akula
Pages 390 - 394

View PDF
Abstract
Background: Liver cirrhosis significantly affects health outcomes, with rising cases linked to nonalcoholic steatohepatitis (NASH) in addition to chronic alcohol abuse. "Cirrhotic cardiomyopathy" is defined by systolic and diastolic dysfunction along with electrophysiological abnormalities, absent other cardiac disease. Patients are at risk of heart failure under stress, diagnosed through electrocardiography, 2D echocardiography, and biomarkers such as BNP. Key diagnostic indicators include a resting ejection fraction < 55%, diminished cardiac output under stress, and an E/A ratio < 1.0, while additional supportive features like electrophysiological changes and elevated biomarkers may be helpful but are not required. Methods: This cross-sectional study at Dr. PSIMS & RF Hospital included 50 cirrhosis patients, assessed using Child-Pugh and MELD scores. Evaluations included QTc interval assessment, 2D echocardiography, and cirrhotic cardiomyopathy criteria from the 2005 World Congress of Gastroenterology, Montreal. Inclusion criteria were hospitalized patients with cirrhosis, while those under 18 years of age, with COPD, or with co-existing heart disease were excluded. Statistical analysis used SPSS version 21, with significance set at p < 0.05. Results: Patients with QTc intervals ≤ 440 ms generally exhibited better liver function, with 65.5% in Child-Pugh Class A and 37.9% with MELD scores ≤ 9. Conversely, those with QTc intervals > 440 ms often had more severe liver impairment, with 71.4% in Child-Pugh Class C and 42.9% with MELD scores ≥ 30, showing significant differences (p < 0.05). Ejection fractions > 55% were associated with better liver function, while ejection fractions ≤ 55% indicated more severe impairment, with significant differences (p < 0.05). 
Cardiac parameters, including right atrial size, left atrial size, and ejection fraction, differed significantly across Child-Pugh classes, with Class C patients having larger right and left atrial sizes and lower ejection fractions compared to Classes A and B (p < 0.05). Conclusion: In conclusion, the study reveals that in liver cirrhosis patients, prolonged QTc intervals are strongly correlated with Child-Pugh and MELD scores, while an ejection fraction ≤ 55% indicates more severe impairment, highlighting the critical need for continuous cardiac monitoring.
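The QTc cutoff of 440 ms used above is conventionally obtained with Bazett's formula (QTc = QT / sqrt(RR)); the abstract does not state which correction formula the authors applied, so the sketch below is illustrative rather than a description of their method:

```python
import math

def qtc_bazett(qt_ms, rr_s):
    """Heart-rate-corrected QT by Bazett's formula: QTc = QT / sqrt(RR).

    qt_ms: measured QT interval in milliseconds.
    rr_s:  RR interval in seconds (60 / heart rate in bpm).
    """
    return qt_ms / math.sqrt(rr_s)

def is_prolonged(qtc_ms, cutoff_ms=440):
    """Classify against the 440 ms cutoff used in the study above."""
    return qtc_ms > cutoff_ms

# Example: QT = 400 ms at 75 bpm (RR = 60/75 = 0.8 s) → QTc ≈ 447 ms
qtc = qtc_bazett(400, 60 / 75)
print(round(qtc), is_prolonged(qtc))
```

Note that Bazett's correction overcorrects at high heart rates, one reason alternative formulas (Fridericia, Framingham) exist.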
Case Report
Open Access
Optimizing Anaesthesia for Concurrent Carotid Endarterectomy and Off-Pump Coronary Artery Bypass: Insights from a Case Series
Lakshmanarajan,
Deepika,
Shanmugapriya V,
Yuvaraj M,
Karthikeyan D
Pages 582 - 587

View PDF
Abstract
Introduction: This case series study evaluates the optimization of anesthesia protocols for concurrent carotid endarterectomy (CEA) and off-pump coronary artery bypass (OPCAB). The study aims to create awareness about the significance of optimizing anesthesia for these combined high-risk procedures. The case series included fifteen patients, aged 54 to 72 years, with body mass indices (BMI) ranging from 24.2 to 28.7. Hospital stays ranged from 7 to 10 days, with minimal anesthesia-related complications observed. Key findings highlight the importance of individualized anesthesia management and meticulous postoperative monitoring to ensure favourable outcomes in patients undergoing concurrent CEA and OPCAB. The results contribute to the existing body of knowledge by demonstrating that a multidisciplinary approach and tailored anaesthetic protocols can enhance patient safety and surgical success. Future research should focus on larger, prospective studies to validate these findings and further refine anesthesia protocols. These insights underscore the critical role of optimized anesthesia in managing complex cardiovascular and cerebrovascular pathologies, ultimately improving patient care and outcomes in this high-risk population.
Research Article
Open Access
To Determine the Role of Antihypertensive Chronotherapy in Diurnal Blood Pressure Patterns
Dr. Parminder Singh,
Dr Gaurav Mohan,
Dr Rahat Sharma
Pages 622 - 631

View PDF
Abstract
Background: Chronotherapy in hypertension is considered to provide better control of nocturnal blood pressure patterns. This study aimed to examine the relation between diurnal blood pressure patterns and the timing of antihypertensive medication. Method: Hypertensive patients aged 19 to 65 years who had been on antihypertensives for a minimum period of one month and were free of any cardiovascular complication or chronic kidney disease were included in the study. After routine workup, they underwent 24-hour ambulatory blood pressure monitoring. Results: In this study of 105 patients (mean age 44±10.9 years), morning administration of antihypertensive medication resulted in significantly higher blood pressure surges and less nighttime BP decrease compared to nighttime administration (p=0.001). No significant difference was found across medication classes for nocturnal dipping. Conclusion: Taking antihypertensive drugs at night appears to result in better control of hypertension.
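Nocturnal "dipping" in ambulatory BP monitoring studies such as this one is conventionally the percentage fall in mean systolic BP from daytime to nighttime, with a fall of at least 10% classifying a "dipper". The thresholds below follow that common convention; the paper's exact definition is not given in the abstract, so this is an illustrative sketch:

```python
def nocturnal_dip_percent(day_sbp, night_sbp):
    """Percent fall in mean systolic BP from daytime to nighttime."""
    return 100 * (day_sbp - night_sbp) / day_sbp

def dipping_status(dip_percent):
    """Classify by the widely used >=10% dipping convention (assumed here)."""
    if dip_percent >= 10:
        return "dipper"
    if dip_percent >= 0:
        return "non-dipper"
    return "reverse dipper"   # nighttime BP higher than daytime

# Example: daytime mean SBP 140 mmHg, nighttime mean 120 mmHg
dip = nocturnal_dip_percent(140, 120)   # ≈ 14.3% → "dipper"
print(round(dip, 1), dipping_status(dip))
```

Some classifications add an "extreme dipper" category (fall of 20% or more); the two-way split above is the minimal version.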
Research Article
Open Access
Maternal and Perinatal outcomes of pregnancies complicated by cardiac disease at tertiary hospital
Bullu Priya Oraon,
Shashi Bala Singh
Pages 648 - 652

View PDF
Abstract
Introduction: Cardiac disease is a leading cause of maternal morbidity and mortality during pregnancy. Effective management strategies are crucial for improving outcomes in this high-risk population. Aim: This study aimed to evaluate the outcomes of pregnant women with cardiac disease managed at the Rajendra Institute of Medical Sciences (RIMS), Ranchi, to refine treatment protocols. Methods: A prospective cohort study was conducted over one year, enrolling 35 pregnant women with either congenital or acquired heart disease. Participants underwent regular monitoring with echocardiography, and data were collected on maternal and perinatal outcomes, including delivery methods and postpartum complications. Results: The study highlights significant maternal and perinatal complications in pregnant women with cardiac disease, with anemia (31.4%) and preterm birth (25.7%) as prevalent issues. The findings underscore the need for careful monitoring and management tailored to the severity of cardiac dysfunction to improve outcomes for both mothers and newborns. Conclusion: Effective cardiac and obstetric management in a tertiary care setting allowed for predominantly vaginal deliveries and highlighted the importance of echocardiography in monitoring. Recommendation: Tailored antibiotic prophylaxis and comprehensive postpartum contraceptive counseling should be integrated into care protocols for pregnant women with cardiac disease.
Research Article
Open Access
Methemoglobinemia Unmasked: A Deep Dive into Poisoning Cases and Treatment Strategies
Dr Varnan Chandrawanshi,
Dr Aanchal Goyal,
Dr Divyansh Badole,
Dr Manoj Gupta,
Dr Namrata Sharma,
Dr Puneet Goyal
Pages 749 - 753

View PDF
Abstract
Background: In India, poisoning is one of the common methods of attempting suicide, especially among farmers. Presentation varies among these patients; one uncommon presentation is methemoglobinemia. Methemoglobinemia is a rare but potentially life-threatening condition characterized by the oxidation of hemoglobin to methemoglobin (MetHb), rendering it incapable of effectively transporting oxygen and resulting in tissue hypoxia. This condition can be congenital or acquired, often due to exposure to certain chemicals, drugs, or toxins. Acquired methemoglobinemia is frequently seen in cases of poisoning, as highlighted in this case series of three patients admitted to a tertiary care hospital in central India following suicidal ingestion of toxic substances. Material and Methods: This observational study of 6 months' duration was conducted in a tertiary care teaching hospital of central India, where patients with toxin-induced methemoglobinemia were recruited. Informed consent was obtained and duly signed by the patient or next of kin. History, physical examination, routine investigations, ABG and co-oximetry studies were done. Patients were managed as per established protocol, with no additional out-of-protocol investigation or treatment pertaining to this study. Patient confidentiality was maintained throughout. Results: This case series involved three patients who developed toxic methemoglobinemia following ingestion of different toxic substances. Effectiveness of methylene blue: In cases with moderate to severe methemoglobinemia (MetHb 33.5% to 41.4%), methylene blue proved effective in reducing MetHb levels and improving clinical outcomes. The initial treatment significantly improved MetHb levels and patient symptoms, with favorable outcomes observed in two of the three cases. Severe outcomes: The third case, with an exceptionally high MetHb level of 74%, demonstrated the limits of methylene blue treatment.
Despite multiple doses, the patient's condition deteriorated, indicating that extremely high MetHb levels and delayed treatment can lead to poor outcomes. Complications and mortality: The case with the highest MetHb level resulted in mortality, underscoring the critical importance of early diagnosis and intervention. The other two cases, despite initial severe symptoms, responded well to timely methylene blue treatment and supportive care. Conclusion: This case series illustrates the clinical variability and challenges in managing toxin-induced methemoglobinemia. The condition should be suspected in cases of poisoning, particularly when there is a mismatch between oxygen saturation and arterial blood gas measurements. Early intervention with methylene blue, guided by co-oximetry, is essential for improving outcomes. Severe cases, especially those with MetHb levels exceeding 70%, carry a poor prognosis despite aggressive treatment, highlighting the need for early recognition, monitoring, and advanced supportive care.
Research Article
Open Access
A Hospital-Based Study of Serum Electrolytes in Acute Exacerbation of COPD in the Koshi Region
Dr. Samique Ahmad,
Dr. Pramod Kumar Agrawal,
Dr. Mrityunjay Pratap Singh,
Dr. Nusrat Jahan,
Dr. Helal Ahmed khan,
Dr. Akash Sharma,
Dr. Zeeshan Ali khan,
Dr. Sharqua Zaheen
Pages 83 - 88

View PDF
Abstract
Introduction: Dyspnoea, coughing, and increased production and purulence of sputum are symptoms of chronic obstructive pulmonary disease (COPD), which can sometimes deteriorate rapidly. Acute exacerbation of chronic obstructive pulmonary disease (AECOPD) describes this sudden, severe worsening of symptoms. Aims: To study serum electrolyte levels in patients with acute exacerbation of COPD, and to assess acute exacerbation of COPD based on severity of dyspnoea using the modified Medical Research Council (mMRC) dyspnoea scale, clinical examination and pulmonary function tests. Materials and Methods: The present study was a prospective hospital-based study, conducted from July 2022 to December 2023 (spanning 18 months) at Katihar Medical College and Hospital in Bihar, India. Result: In our study, 63 (63%) of the patients had fever, 100 (100%) had cough, 73 (73%) had crepitations, and 90 (90%) had wheeze. SpO2 was 85-94% in 77 (77%) patients, 75-84% in 22 (22%), and below 75% in 1 (1%). On the mMRC scale, 25 (25%) patients were grade 1, 35 (35%) grade 2, 25 (25%) grade 3, and 15 (15%) grade 4. Serum sodium was <135 mEq/L (hyponatremia) in 44 (44%) patients and 135-145 mEq/L (normal) in 56 (56%). Serum potassium was <3.5 mEq/L (hypokalemia) in 49 (49.0%) and 3.5-5.0 mEq/L (normal) in 51 (51.0%). Conclusion: The findings suggest that serum electrolyte imbalances are common in acute exacerbations of COPD and may exacerbate respiratory symptoms. Monitoring and managing electrolyte levels could be essential in improving patient outcomes during acute exacerbations. Further studies are recommended to explore the therapeutic implications of these findings.
Research Article
Open Access
To Evaluate the Effectiveness of Prophylactic Use of Intravenous Ketamine, Clonidine and Tramadol in Control of Shivering in Patient Undergoing Elective Surgeries Under Spinal Anaesthesia
Dr. Arpan Kumar Jain,
Dr. Vikas Kumar Sahu,
Dr. Apoorva Garhwal,
Dr. Arish Sadaf
Pages 206 - 217

View PDF
Abstract
Background- Shivering is distressing for the patient’s undergoing surgery under both regional and after general anaesthesia. Shivering increases expenditure of cardiac and systemic energy, resulting in increased oxygen consumption and carbon dioxide production, lactic acidosis and raises the intraocular and intracranial pressure. It also interferes with haemodynamic monitoring intra operatively Aims- To evaluate the effectiveness of prophylactic use of intravenous ketamine, clonidine and tramadol in control of shivering in patient undergoing elective surgeries under spinal anaesthesia. Materials and methods- A Prospective, Randomized, Double Blind Comparative Study, done in Tertiary Care Superspeciality Hospital with in a span of 1 year. Adult patients posted for various elective surgeries under Spinal Anaesthesia. Patients scheduled for elective surgeries under spinal anaesthesia, Age group of 18-60 years of both sexes and ASA grades I or II included in study. The subjects were randomized in to 4 groups by using computer generated SPSS 16 software in to random numbers to receive ketamine, tramadol or clonidine. The patients were randomized into four groups of 42 patients each. Results- The age range was 18-60 years for all the groups. There was no significant difference between the groups for age and sex distribution (p>0.05). There was no significant difference observed in the duration of surgery (p=0.46). There was no significant difference observed in the median level of spinal anaesthesia in the four groups (p=0.052). There was significant difference observed in the distribution of grade of shivering in normal saline group compared to tramadol, ketamine and clonidine group (p=0.0499). There was no significant surface temperature difference between the groups (p=0.67). 
A statistically significant difference was observed in the ketamine group with respect to heart rate compared with the tramadol, clonidine and normal saline groups until 40 min after spinal anaesthesia (p<0.001), except for the baseline values (p=0.93). After 40 min, there was no statistically significant difference among the groups. A statistically significant difference was also observed in the ketamine group with respect to mean blood pressure compared with the tramadol, clonidine and normal saline groups until 50 min after spinal anaesthesia (p<0.001), except for the baseline value (p=0.870) and the value at 5 min (p=0.0012). After 50 min, there was no statistically significant difference among the four groups. Conclusion- We conclude that giving either ketamine 0.5 mg/kg, clonidine 75 mcg or tramadol 0.5 mg/kg i.v. prophylactically just before neuraxial blockade significantly decreases the incidence of shivering without causing any major side-effects. Ketamine may be more beneficial, as it improves the hemodynamic profile through its sympathomimetic effects and sedates the patient effectively, which increases patient comfort during surgery, maintains cardiorespiratory stability and prevents recall of unpleasant events during the surgery.
Research Article
Open Access
Endovascular Emergency Venous Code Stroke Salvage for Cerebral Venous Sinus Thrombosis in the Covid Era: Direct Jugular Vein Access Intervention with a Technical Note Utilizing Peripheral Hardware in Neuro Intervention
Dr Abhinav Mohan,
Dr Shweta D ,
Dr Jayshree Chidanand Awalaker,
Dr Palange Pankaj Bindusar,
Dr Rohan Patil,
Dr Shahaji Vishwasrao Deshmukh,
Dr Ashwin Valsangkar
Pages 223 - 228

Abstract
Venous thrombosis is an uncommon cause of stroke compared with arterial occlusions, but it is an important consideration because of its potential morbidity and its increasing incidence, especially in the current COVID era. Historically, the incidence of cerebral venous thrombosis (CVT) has been comparatively low, at approximately 0.2 to 0.5 per 100,000 per year, while the reported mortality of CVT varies between 20% and 50%. Standard medical management for CVT is hydration and systemic anticoagulation with heparin at therapeutic dosage, even in patients with an intracranial hemorrhage (ICH), i.e., a venous hemorrhagic infarct at baseline, along with watchful monitoring for seizures and raised intracranial tension (ICT) and fundoscopy to monitor papilledema. A few cases do not respond to this standard of care, and progressive CVT leads to poor outcomes with resultant ischemic and hemorrhagic stroke, cerebral edema, mass effect and death. Endovascular options have come into vogue in the recent decade, including intra-venous application of thrombolytic agents and/or mechanical thrombectomy, for patients with major venous sinus thrombosis without a large hematoma or significant midline shift necessitating emergency decompressive craniotomy, and for those with altered sensorium (Glasgow Coma Scale < 10), progressive disease or neurologic deterioration refractory to anticoagulation therapy, new deterioration of symptoms, or worsening of ICH despite standard medical management. We present our unique experience of venous stroke patients in the COVID era who underwent endovascular salvage for major cortical venous sinus thrombosis, with a technical note on direct jugular vein access intervention utilizing peripheral hardware.
Research Article
Open Access
Antimicrobial Susceptibility Among Cardiac Implantable Electronic Device Site Infections: A Prospective Observational Study
Dr. Kirti Parmar,
Dr. Abhishek Sharma,
Dr. Saurabh Rattan
Pages 454 - 464

Abstract
Introduction: Implantation of cardiac implantable electronic devices (CIEDs), including permanent pacemakers and implantable cardioverter-defibrillators, has been on the rise over the past years, largely due to expanded indications for CIED implantation for primary prevention. Infection associated with implantable devices is a serious complication with high morbidity and mortality. The importance of appropriate empirical antibiotic coverage is illustrated by studies documenting the association between inappropriate selection and increased mortality in patients with permanent pacemaker implantation. The increasing multidrug resistance problem could be due to mutations, overuse of broad-spectrum antibiotics, over-the-counter availability of antibiotics, and the lack of infection control policies in hospital settings. Methodology: A prospective observational study was conducted at the major tertiary care centre of the State of Himachal Pradesh for a duration of one year. Patients who had undergone an interventional cardiology procedure and developed any sign or symptom of general or systemic infection were included. Results: On direct Gram staining of clinical samples, microorganisms were seen in 12 (70.58%) samples, while in 5 (29.41%) samples no microorganisms were seen. Of the 12 positive samples, Gram-positive cocci were isolated from ten samples, accounting for 83.33% of total isolates; Gram-negative bacilli were isolated from one sample (8.33%); and both Gram-positive cocci and Gram-negative bacilli were isolated from a single sample, accounting for 8.33% of total isolates. The majority of the isolates were S. aureus (46%), followed by S. epidermidis (38%); Pseudomonas aeruginosa and Achromobacter spp. accounted for 8% each. Of the 11 Staphylococcus isolates, 6 (54.54%) were identified as Staphylococcus aureus (S. aureus) and 5 (45.45%) were Staphylococcus epidermidis (S. epidermidis). There was 100% sensitivity to vancomycin, daptomycin and linezolid.
Almost 64% of isolates were resistant to oxacillin, cefoxitin, cefazolin and erythromycin; 45% were resistant to co-trimoxazole and clindamycin; 18% were resistant to gentamicin; and 9% were resistant to teicoplanin and rifampicin. All the isolates were resistant to ampicillin and penicillin G. Of the 6 isolates of S. aureus, 3 (50%) were MRSA, and of the 5 isolates of S. epidermidis, 4 (80%) were methicillin resistant. All the MRSA isolates were sensitive to daptomycin, teicoplanin, vancomycin, linezolid and rifampicin, but they were resistant to erythromycin, ampicillin and penicillin G; 67% of isolates were sensitive to gentamicin, co-trimoxazole and clindamycin. All methicillin-resistant S. epidermidis were sensitive to daptomycin, vancomycin and linezolid; 75% were sensitive to gentamicin, teicoplanin and rifampicin, and only 25% were sensitive to co-trimoxazole; however, all were resistant to erythromycin and clindamycin. In the single isolate of Achromobacter spp., resistance was observed to gentamicin, imipenem, meropenem, ciprofloxacin, levofloxacin and tetracycline; this isolate was sensitive to ceftazidime, piperacillin-tazobactam and co-trimoxazole. The single isolate of Pseudomonas aeruginosa was sensitive to amikacin, gentamicin, imipenem, meropenem, ceftazidime, cefepime, aztreonam, ciprofloxacin, levofloxacin and piperacillin-tazobactam. Conclusion: The present study indicated an infection rate of 8.1% following permanent pacemaker implantation. Staphylococcus species accounted for 84% of the causative organisms, of which 64% were methicillin resistant. Staphylococcus has been reported as a major cause of community- and hospital-acquired infections. Infections caused by Staphylococcus used to respond to β-lactam and related groups of antibiotics, and vancomycin has been used as the drug of choice for treating MRSA infections.
Further, regular surveillance of hospital-associated infections, including monitoring the antibiotic sensitivity pattern of MRSA, together with the formulation of a definite antibiotic policy, may be useful for reducing the incidence of MRSA infection.
Research Article
Open Access
Haemodynamic Changes from Supine to Prone Position in General Anaesthesia
Dr. H. Riaz Fathima,
Dr. Krishna Prasad Patla,
Dr. Abraham A A
Pages 520 - 526

Abstract
Background: Prone positioning during general anesthesia for spine surgery can induce significant hemodynamic changes. This study aimed to quantify these changes and their time course. Methods: Fifty-four patients undergoing elective spine surgery were included. Hemodynamic parameters were measured in supine position and at three time points after prone positioning: immediately, 5 minutes, and 10 minutes. Results: Significant decreases were observed in systolic blood pressure (mean difference 12.926 mmHg, p<0.001), diastolic blood pressure (mean difference 3.778 mmHg, p<0.001), and mean arterial pressure (mean difference 6.574 mmHg, p<0.001) immediately after prone positioning. Peak airway pressure increased significantly (mean difference 1.630 cmH2O, p<0.001). These changes persisted at 5 and 10 minutes, though some recovery was noted. Heart rate, end-tidal CO2, and oxygen saturation showed minimal changes. Conclusions: Prone positioning under general anesthesia leads to significant reductions in blood pressure and increases in airway pressure, with partial recovery over 10 minutes. These findings highlight the need for careful monitoring and management during prone positioning.
Research Article
Open Access
A Study of Electrocardiographic Abnormalities and Cardiac Markers in Patients with Acute Cerebrovascular Accidents in First 24 Hours
Chodavarapu Dheeraj Daya Sagar,
Battula Venkatesh
Pages 124 - 127

Abstract
Background: Electrocardiographic (ECG) changes are reported frequently after acute strokes. The cardiovascular effects of stroke appear to be modulated by concomitant or pre-existing cardiac disease and are also related to the type of cerebrovascular disease and its localization. We aimed to determine the pattern of ECG changes associated with pathophysiologic categories of acute stroke among patients with and without cardiovascular disease, and to determine whether specific ECG changes are related to the location of the lesion. Every year, more than half a million people in the world suffer acute cerebrovascular events, including ischemic stroke, intracerebral hemorrhage and subarachnoid hemorrhage. Materials and methods: This was a prospective, observational study conducted in the Department of Medicine of a tertiary care teaching hospital over a period of 1.5 years. After admission, a presumptive diagnosis was made based on clinical history and physical examination, and patients then underwent serial ECGs after informed consent. Included were patients admitted to the NICU and various medical wards within 24 hours of the onset of neurological deficit, as well as patients who developed stroke during their hospital stay. Result: We recruited 90 stroke patients, most of them male; the major type was ischemic stroke. In total, 62 (68.89%) stroke patients had some form of ECG change. The majority, 35 (38.89%) patients, had QTc prolongation, followed by 32 (35.56%) patients with T wave changes. QTc prolongation and atrial fibrillation were significantly more common among hemorrhagic stroke patients (p<0.05), while T wave changes and ST changes (elevation or depression) were significantly more common among ischemic stroke patients (p<0.05). Conclusion: P-wave dispersion (PWDis) and P-wave terminal force in lead V1 (PTFV1) are independent predictors of paroxysmal atrial fibrillation (PAF) in patients with acute ischemic stroke.
These simple and easily accessible predictors, detectable on the surface ECG, may be used as a guide to identify patients who require longer rhythm monitoring to better detect occult PAF, thereby preventing recurrent strokes.
Research Article
Open Access
Surgical Study of Various Causes and Symptomatology of Intestinal Obstruction in Paediatric age Group at A Tertiary Hospital
Keerti Mali Patil,
Upendra Pawar,
Sharanbasappa Gubbi,
Kiran Mali Patil
Pages 622 - 628

Abstract
Introduction: Intestinal obstruction in the paediatric age group differs from that in adults in presentation, etiology and management. Its management in children differs from that in adults in terms of fluid requirements, electrolytes, drug dosages, mode of anesthesia, surgical technique, post-operative monitoring and complications. The present study aimed to examine the various causes, symptomatology and management of intestinal obstruction in the paediatric age group at a tertiary hospital. Material and Methods: This was a single-center, prospective, observational study conducted in patients of both genders, aged less than 16 years, who presented with intestinal obstruction and underwent surgical intervention. Results: During the study period, 100 patients satisfied the study criteria: 75 (75.0%) male and 25 (25.0%) female. The maximum numbers of cases were in the age groups 1-12 months and 1-5 years (27.0% each), followed by the age group < 1 month (26.0%) and the age group 5-10 years (20%). Common clinical features observed were distention (83%), vomiting (66%), constipation (44%), and abdominal pain and bleeding per rectum (24% each). Intussusception (25%) was the most common diagnosis, followed by imperforate anus (21%), volvulus (10%), CHPS (10%), Hirschsprung's disease (8%), Meckel's band (6%) and postoperative adhesions (6%). The study revealed a statistically highly significant difference in mortality between complicated and uncomplicated surgeries (P < 0.001); all 11 deaths (100.0%) occurred in complicated surgeries. Common post-operative complications observed were septicemia (9%), fever (8%), wound infection (6%) and respiratory distress (4%). Conclusion: The most important conclusion drawn from this study is that with early diagnosis and intervention, the mortality of these children can be reduced considerably.
Research Article
Open Access
A Study of Dexmedetomidine with Bupivacaine Vs Bupivacaine Alone in Erector Spinae Plane Block for Post-Operative Pain Control of Spine Fixation Surgeries
Dr. Kiran Janwe,
Dr. Tejaswini Chaudhary,
Dr. Pranay Gandhi
Pages 648 - 653

Abstract
Since posterior lumbosacral spine fixation surgeries are now common spine procedures performed for various reasons and are usually accompanied by moderate to severe postoperative pain, it is necessary to find effective postoperative analgesia for these patients. This study aimed to compare the analgesic effect of dexmedetomidine in combination with bupivacaine versus bupivacaine alone in erector spinae plane block (ESPB) for postoperative pain relief in posterior lumbosacral spine fixation surgeries. Methodology: Seventy-five patients were randomly divided into 3 groups (25 patients each): dexmedetomidine in combination with bupivacaine (group DB), bupivacaine alone (group B), and saline control (group S). Primary clinical outcomes were active (during mobilization) and passive (at rest) visual analog scale (VAS) pain scores in the first 24 hours, measured every 2 hours, opioid consumption (number of PCA presses), and the need for rescue analgesia. Other clinical outcomes included active and passive VAS pain scores at 24 hours measured every 4 hours, opioid consumption, need for rescue analgesia, postoperative opioid side effects, and intraoperative side effects of dexmedetomidine such as bradycardia and hypotension. Observation and Results: Active and passive VAS pain scores, postoperative opioid consumption, need for rescue analgesia, and postoperative opioid side effects were significantly lower in the DB group than in the other groups (B and S). There were no additional intraoperative dexmedetomidine side effects such as bradycardia or hypotension.
Research Article
Open Access
Evaluation of Risk Factors and Prognostic Indicators in Pediatric Status Epilepticus
Saheli Dasgupta,
Asha Mukherjee,
Gautam Guha,
Suparna Guha
Pages 685 - 689

Abstract
Background: Status epilepticus (SE) occurs when a child has repeated or prolonged seizures without regaining consciousness. Owing to its high morbidity and mortality, prompt diagnosis and treatment are crucial. Few risk factors and prognostic indicators of paediatric SE have been established, and its outcome depends on etiology, treatment response, and prompt medical intervention. This study examined risk factors and prognostic indicators of pediatric SE. Objective: To identify the risk factors, prognostic indicators, and outcomes in pediatric status epilepticus and to explore the relationship between clinical and diagnostic findings and recovery. Methods: A descriptive study was conducted in the pediatrics department of the Vivekananda Institute of Medical Sciences, Kolkata. The study included 50 children aged 2-12 years admitted with SE. Each patient's medical history and physical status were assessed, along with EEG, blood investigations, and CT and MRI brain scans. Data included neurological recovery, mortality, complications, risk factors, and prognostic indicators. Results: A majority of the 50 children studied were boys. The most common risk factors were infections (42%) and metabolic disturbances (30%). Most children had generalized tonic-clonic seizures (70%) and seizures lasting over 5 minutes (80%). Longer time to seizure control, poor treatment response, and abnormal EEG or brain imaging were associated with worse outcomes. Eighty percent of children recovered neurologically, 4% died, and 10% had long-term developmental sequelae. Conclusion: This study emphasizes the importance of early diagnosis and intervention in pediatric status epilepticus. Early seizure control and normal EEG/brain imaging were favourable prognostic indicators, while infections and metabolic disturbances were the biggest risk factors. Starting treatment early and closely monitoring children with SE reduces the risk of neurological impairment and death.
Research Article
Open Access
A Comparative Study of Oral Misoprostol with Intravaginal Misoprostol for Induction of Labour
Sangeeta Dubey Bhargava,
Yogita Raj Dubey,
Anmol Bhargava
Pages 725 - 728

Abstract
Background – Misoprostol is a promising agent for cervical ripening and induction of labour, though the route of administration and doses are not standardized. Aims and objective – The objective of our study was to compare the efficacy and safety of the oral and intravaginal routes of misoprostol for induction of labour. Materials and method – This was a prospective comparative study carried out at the department of obstetrics and gynecology, Gandhi Medical College and Sultania Zanana Hospital, Bhopal, on 200 pregnant women over a period of one year. Result – In our study, no significant difference was observed between oral and intravaginal misoprostol with respect to the amount of drug required, induction-delivery interval, mode of delivery and neonatal outcome. Regarding maternal side effects and complications, nausea, vomiting and diarrhea were noted more often with oral misoprostol, while cervical tears, vaginal tears and lacerations were more common with the intravaginal route of administration. Conclusion – Both oral and intravaginal misoprostol, in a dose of 50 micrograms every four hours to a maximum of four doses, are safe and efficacious for induction of labour in closely supervised hospital settings with adequate monitoring.
Research Article
Open Access
The Prevalence of Complications After Spinal Anesthesia in Post-Surgical Patients – An Observational Study
Nitin Gautam,
Akhilendra Chopra
Pages 826 - 829

Abstract
Spinal anaesthesia is widely used because it temporarily abolishes pain sensation without affecting the patient's consciousness. However, it can cause several complications. The present study was a descriptive observational study conducted on 200 patients aged 12-65 years in the post-operative unit who had undergone spinal anesthesia, to determine the prevalence of complications on the basis of a self-designed questionnaire covering back pain, headache, urinary retention, hypotension, and other complications related to spinal anesthesia recorded in the medical records of the department of anesthesia and critical care; statistical analysis was performed on the collected data. The results showed that the most prevalent complications in the recovery unit were shivering, followed by hypotension, nausea, back pain, delirium and vomiting. The high prevalence of complications in the post-anesthetic care unit can be considered an alarm and highlights the need for skilled personnel and monitoring equipment in the critical care unit.
Research Article
Open Access
Exposure Of Petrol Pump Personnel to Fuels and Its Effects on Pulmonary Function Tests in And Around Pune City
Sheetal R Salvi,
Nikhil J Bhandari
Pages 20 - 25

Abstract
Introduction: Due to the fast growth of cities and economies, health risks at work have become a significant public health issue. Several segments of society face an increased likelihood of negative health outcomes because of their work conditions. One such group is petrol pump workers, who are consistently exposed to harmful chemical compounds found in gasoline as a result of their vocation. Hence, this cross-sectional research was conducted to examine the influence of workplace exposure to petrol vapours, diesel, and automobile emissions on pulmonary function tests. Methods: The study group consisted of thirty male petrol pump personnel, while the control group consisted of thirty matched healthy males. Pulmonary functions were assessed using a handheld spirometer. The mean ± standard deviation (SD) values of each parameter were calculated for both the study and control groups, and these values were compared using an unpaired 't' test. Results: The study group (petrol pump operators) exhibited a significant decrease (p <0.05) in Forced Vital Capacity (FVC) and Forced Expiratory Flow between 25-75% (FEF 25-75%) compared with the control group. Conclusions: This study indicates that petrol pump workers face an increased risk of developing pulmonary impairment, specifically a restrictive pattern of lung disease, over time. It also highlights the importance of medical monitoring and the enforcement of occupational safety measures to prevent work-related illnesses.
Research Article
Open Access
Systematic Review: Managing Obesity with Multidisciplinary Approaches
Sundaravadivel. V. P,
Kamal Kishore Bishnoi,
Savita Wawage,
Dhawal Vyas
Pages 26 - 30

Abstract
Obesity is a growing global health crisis that significantly contributes to chronic diseases such as type 2 diabetes, cardiovascular disorders, hypertension, and certain cancers. It is recognized as a multifactorial condition influenced by genetic, environmental, behavioral, and psychological factors. Traditional obesity management approaches, which predominantly focus on dietary modifications and increased physical activity, often fail to produce sustainable long-term results. As a result, there is an increasing emphasis on multidisciplinary approaches that integrate dietary interventions, physical activity, behavioral therapy, pharmacological treatments, and bariatric surgery to address obesity more comprehensively.
Multifaceted in nature, the management of obesity requires teamwork involving health professionals from diverse fields, such as dietitians, exercise physiologists, psychologists, endocrinologists, and bariatric surgeons. Such teams are best placed to offer individualized, comprehensive approaches that address the lifestyle challenges and psychosocial issues affecting weight-loss success. This approach emphasizes behavioural and psychological strategies, including evidence-based methods such as cognitive-behavioural therapy (CBT) and mindfulness-based stress reduction for managing stress and other triggers that lead to emotional eating, and for achieving sustainable lifestyle changes.
Pharmacological interventions are a critical component in obesity management, especially in those patients who do not respond to lifestyle changes alone. GLP-1 receptor agonists and orlistat are two examples of medications shown to help with weight loss. Bariatric surgery is the most effective intervention for patients with severe obesity, resulting in durable and clinically meaningful weight loss, improved metabolic control, and resolution of obesity-related comorbidities. Nevertheless, surgical solutions demand complete support pre-operatively and post-operatively to be successful in the long run.
This systematic review synthesized evidence from 30 studies to assess the effectiveness of multidisciplinary approaches for managing obesity. The results show that combining different modalities yields superior and longer-lasting weight loss than any single modality. Moreover, multidisciplinary care enhances patients' psychological well-being, quality of life, and metabolic health. While the results are encouraging, adherence, access, and long-term feasibility remain challenges for widespread implementation.
The review also discusses future directions in obesity management, including the potential for mobile health applications, telemedicine, and wearable technology to promote patient engagement and monitoring. Such multidisciplinary approaches can transform obesity care by tackling the underlying causes of the disease and delivering personalized, patient-centred interventions. These results highlight the need for multidisciplinary approaches that focus on preventive care and holistic treatment models as healthcare systems move to alleviate the global burden of obesity and improve long-term health outcomes.
Research Article
Open Access
A Comparative Study of Prevalence and Antimicrobial Susceptibility Pattern of Clinical Isolates of Methicillin-Resistant Staphylococcus aureus and Methicillin-Resistant Coagulase Negative Staphylococcus in a Tertiary Care Hospital of West Bengal
Minakshi Das,
Tapajyoti Mukherjee,
Biswajit Sarkar,
Aniruddha Das
Pages 830 - 836

Abstract
Background: Nosocomial infections are a significant global concern, with an increasing prevalence of antibiotic-resistant strains, such as methicillin-resistant Staphylococcus aureus (MRSA), reported worldwide. Both MRSA and methicillin-resistant coagulase-negative staphylococci (MRCoNS) play crucial roles in healthcare-associated infections. The objective of this study was to determine and compare the prevalence and antibiotic susceptibility patterns of MRSA and MRCoNS. Materials and Methods: In this cross-sectional, hospital-based study, clinical samples submitted to the bacteriology laboratory of the Microbiology Department at Burdwan Medical College over a nine-month period were screened for Staphylococcus species. The isolates were identified as Staphylococcus aureus and coagulase-negative staphylococci (CoNS) using standard microbiological techniques. Methicillin resistance in all isolates was tested with a 30 μg cefoxitin disc and further confirmed through an automated system by measuring the Minimum Inhibitory Concentration (MIC). Antibiotic susceptibility patterns were determined using the modified Kirby-Bauer disc diffusion method following the guidelines of the Clinical and Laboratory Standards Institute (CLSI). The collected data were recorded and analyzed using Microsoft Excel (version 2010). Results: A total of 830 Staphylococcus strains were isolated from various clinical samples, including 694 (84%) Staphylococcus aureus and 136 (16%) coagulase-negative staphylococci (CoNS). Among the Staphylococcus aureus isolates, 285 (41.1%) were methicillin-resistant, while 54 (39.7%) of the CoNS isolates showed methicillin resistance. Among methicillin-resistant Staphylococcus aureus (MRSA) strains, the highest resistance was observed against ceftriaxone (96.1%), and the lowest against linezolid (1.05%) and teicoplanin (0%).
In methicillin-resistant coagulase-negative staphylococci (MRCoNS) strains, the highest resistance was observed against ceftriaxone (90.7%), and the lowest was noted for vancomycin (1.8%), linezolid (0%), and teicoplanin (0%). Conclusions: Continuous monitoring of the antimicrobial susceptibility patterns of methicillin-resistant Staphylococcus aureus (MRSA) and methicillin-resistant coagulase-negative staphylococci (MRCoNS) is essential for selecting appropriate therapies, developing antibiotic policies, and minimizing the use of reserved antibiotics.
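The resistance-prevalence figures above are simple proportions of isolate counts. As an illustrative sketch only (not part of the study's methodology, which used Microsoft Excel), the reported percentages can be reproduced from the raw counts; the function and variable names below are hypothetical:

```python
def prevalence_pct(resistant: int, total: int) -> float:
    """Percentage of resistant isolates among `total`, rounded to one decimal."""
    return round(100 * resistant / total, 1)

# Raw counts reported in the study
s_aureus_total, cons_total = 694, 136   # isolates by species (830 overall)
mrsa_count, mrcons_count = 285, 54      # methicillin-resistant counts

print(prevalence_pct(mrsa_count, s_aureus_total))   # 41.1 (MRSA among S. aureus)
print(prevalence_pct(mrcons_count, cons_total))     # 39.7 (MRCoNS among CoNS)
```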
Research Article
Open Access
A Study on the Relationship Between Organomegaly, Dengue Severity, and Dengue Seropositivity in a Rural Tertiary Care Hospital in Western Maharashtra.
Dr. Jayashree P Jadhav,
Dr. Lakhan Khurana,
Dr. Sanjay Krishnan S
Pages 116 - 119

Abstract
Background: Dengue fever, caused by arthropod-borne viruses, presents as a mild illness with fever, muscle pain, rash, and swollen lymph nodes, while its severe form, dengue hemorrhagic fever, can be fatal due to increased capillary permeability and impaired hemostasis. Infants aged 4-9 months face higher risks of severe dengue, with symptoms such as convulsions and liver dysfunction being more common and more often fatal. Aim & Objectives: To study the relationship between organomegaly, dengue severity, and dengue seropositivity in a rural tertiary care hospital in Western Maharashtra. Methodology: This descriptive longitudinal study was conducted over a period of two years, from June 2022 to May 2024, in the Department of Paediatrics at Dr. Balasaheb Vikhe Patil Rural Medical College, Loni. The inclusion criteria consisted of all male and female patients under one year of age with a laboratory-confirmed diagnosis of dengue fever, provided their parents gave written informed consent. The exclusion criteria included infants presenting with other viral exanthematous fevers or dengue-like illnesses with a negative laboratory test. Result: The study examined 79 infants admitted with dengue between June 2022 and May 2024, accounting for 12% of all dengue cases. The average age of the infants was 7.5 months. Dengue fever without warning signs constituted 55.7% of the cases, while 26.6% had dengue with warning signs, and 17.7% were diagnosed with severe dengue. Severe outcomes were linked to hepatomegaly and splenomegaly. Conclusion: Infants with dengue frequently exhibit non-specific symptoms, making diagnosis challenging. While IgM positivity was commonly observed, NS1 positivity was associated with increased severity and mortality, and early detection of NS1 was vital for effective management. Hepatosplenomegaly was found to be linked to greater severity and higher mortality rates.
Timely diagnosis, close monitoring, and proper supportive care are crucial for improving outcomes in the management of infantile dengue.
Research Article
Open Access
Transthoracic Echocardiography: A real time hemodynamic monitoring tool during induction of anaesthesia in patients undergoing coronary artery bypass grafting surgery
Thiruvenkadam Selvaraj,
Vijayakumar Natarajan,
Arun Thilak E,
Aishwarya Ramesh
Pages 354 - 362

Abstract
Objective: To evaluate the effectiveness of transthoracic echocardiography as a hemodynamic monitoring tool during induction of anesthesia and endotracheal intubation. Design: Prospective, single-center, observational study. Setting: Medical college teaching hospital. Participants: Sixteen patients undergoing elective coronary artery bypass surgery. Interventions: Patients were monitored with transthoracic echocardiography and a pulmonary artery catheter. Measurements and Main Results: Baseline pre-induction transthoracic echocardiography was performed to calculate fractional shortening and fractional area change, while cardiac output and systemic vascular resistance were calculated from left ventricular outflow and mitral inflow Doppler. At the same time, baseline pulmonary artery catheter measurements of cardiac output and calculated systemic vascular resistance were recorded. Measurements with both transthoracic echocardiography and the pulmonary artery catheter were repeated after induction and one minute after endotracheal intubation. The percent difference between baseline and post-induction values (Group A data) and between post-induction and post-intubation values (Group B data) was calculated for all parameters. In both the Group A and Group B data, the estimated percent changes in cardiac output and systemic vascular resistance correlated between the two techniques, which also allows the change in contractility during induction and endotracheal intubation to be predicted. The change in cardiac output as estimated by the mitral inflow Doppler and the left ventricular outflow Doppler correlated well. Conclusion: Transthoracic echocardiography can be used as a non-invasive replacement for the pulmonary artery catheter to predict changes in blood pressure, afterload and cardiac output during induction of anaesthesia.
Research Article
Open Access
Evaluation of Tailored Anesthetic Strategies in High-Risk Cardiovascular and Geriatric Patients: A Prospective Observational Study on Perioperative Challenges and Outcomes
Jignesh M Trivedi,
Jitendra J Patel
Pages 363 - 367

Abstract
Background: High-risk patients, including those with cardiovascular conditions and geriatric individuals, present significant challenges in anesthetic management due to their increased susceptibility to perioperative complications. Cardiovascular diseases (CVDs) are a leading cause of perioperative morbidity, while the aging population experiences unique physiological changes that complicate surgical outcomes. Objective: This study aims to evaluate the outcomes and effectiveness of tailored anesthetic strategies for high-risk cardiovascular and geriatric patients undergoing surgical procedures. Methods: A prospective observational study was conducted involving 500 high-risk patients, comprising 250 cardiovascular and 250 geriatric individuals. Data on perioperative challenges, anesthetic techniques, intraoperative monitoring, and postoperative outcomes were collected and analyzed. Results: Cardiovascular patients demonstrated increased risks of hemodynamic instability, arrhythmias (12%), and myocardial ischemia (8%). Effective management included preoperative cardiac optimization and advanced intraoperative monitoring. Geriatric patients exhibited heightened incidences of postoperative cognitive dysfunction (14%) and delayed recovery (10%), with age-specific protocols such as regional anesthesia and multimodal analgesia showing positive outcomes. Conclusion: Tailored anesthetic approaches are crucial for high-risk patients to mitigate complications and improve surgical outcomes. Multidisciplinary collaboration and the integration of advanced monitoring technologies play pivotal roles in enhancing patient safety. This study provides evidence supporting the need for personalized anesthetic strategies to address the unique challenges faced by cardiovascular and geriatric patients
Research Article
Open Access
Systematic Review: Risk Factors for Developing Type 2 Diabetes Mellitus
Anamika Chakraborty Samant,
Hemali Jha,
Parul Kamal
Pages 382 - 390

Abstract
Type 2 Diabetes Mellitus (T2DM) is a multifactorial metabolic disorder characterized by insulin resistance, impaired glucose regulation, and progressive beta-cell dysfunction. The global prevalence of T2DM has been rising at an alarming rate, influenced by genetic, lifestyle, environmental, and socio-economic factors. This systematic review examines the key risk factors associated with the development of T2DM, including obesity, physical inactivity, unhealthy diet, genetic predisposition, psychosocial stress, environmental toxins, and socioeconomic determinants. The review synthesizes evidence from epidemiological studies, clinical trials, and meta-analyses to provide a comprehensive understanding of the complex interplay of risk factors that contribute to T2DM onset. Identifying and addressing these risk factors through preventive strategies is crucial for reducing the burden of diabetes globally. Moreover, this review highlights the importance of personalized lifestyle interventions and early screening methods to mitigate risk and improve long-term health outcomes. Addressing disparities in healthcare access and developing targeted public health strategies are essential in reducing diabetes prevalence and improving patient quality of life. Future research should focus on innovative prevention programs, technological advancements in monitoring glucose levels, and community-based interventions that promote sustainable lifestyle changes
Research Article
Open Access
Indications and Rate of Caesarean Delivery in a Zonal Hospital in Kanpur, Uttar Pradesh: a Retrospective Study
Ritam Bhattacharya,
Roshni Abichandani,
Arunav Sharma
Pages 20 - 23

Abstract
Background: Caesarean delivery is one of the most commonly performed surgeries in the world. However, the past decades have witnessed a gradual rise in the caesarean section rate in India as well as worldwide. The objective of the present study is to analyse the rate and indications of caesarean delivery over a five-year period at 7 Air Force Hospital, Kanpur, Uttar Pradesh. Methods: This is a retrospective study that analysed the rate and indications of caesarean deliveries performed over five years, from 01 Jan 2019 to 31 Dec 2023. Patient data were obtained from hospital records and statistical analysis was done. Results: There was an overall rise in the rate of caesarean delivery from 21.3% in 2019 to 32.2% in 2023. Previous caesarean status was the most common indication. There was also an increase in primary caesarean sections, from 10% in 2019 to 24% in 2023. At the same time, there was a reduction in the incidence of neonatal birth asphyxia, from 1.28% in 2019 to 0.5% in 2023. Conclusion: Instead of trying to achieve a specific caesarean delivery rate, efforts should be made to ensure that every caesarean delivery is medically justified and that every patient who needs one receives it on time. At the same time, patient education, better intrapartum care, improved monitoring of labour and regular audits can help minimize the caesarean delivery rate over time.
Research Article
Open Access
Evaluation of Imaging Features of Drug-Sensitive and Drug-Resistant Pulmonary Tuberculosis
Anurag Shukla,
Sarajuddin Ansari,
Vivek Arora
Pages 84 - 88

Abstract
Background: Pulmonary tuberculosis (PTB) remains a significant public health concern, with drug-resistant tuberculosis (DR-TB) complicating treatment and prognosis. Radiological imaging plays a crucial role in the early detection and differentiation of drug-sensitive tuberculosis (DS-TB) and DR-TB. Objective: This study aims to evaluate the radiological features of DS-TB and DR-TB and identify distinguishing characteristics to facilitate early diagnosis and improved clinical decision-making. Methods: A prospective observational study was conducted from December 2023 to November 2024 at the Department of Respiratory Medicine, RKDF Medical College, Bhopal, and Maharshi Devraha Baba Autonomous State Medical College, Deoria. Patients aged ≥18 years with microbiologically confirmed DS-TB or DR-TB were included. Extrapulmonary TB cases and those with comorbid pulmonary conditions affecting imaging interpretation were excluded. Chest X-rays (CXR) and high-resolution computed tomography (HRCT) scans were analyzed for imaging patterns such as cavitation, consolidation, nodular opacities, fibrosis, pleural effusion, and bronchiectasis. Statistical analysis included descriptive statistics, chi-square tests, and logistic regression to determine significant differences. Results: DR-TB cases demonstrated a higher prevalence of cavitation (75.0% vs. 29.2%, p<0.001), bronchiectasis (50.0% vs. 12.5%, p<0.001), fibrosis (68.8% vs. 25.0%, p<0.001), and pleural effusion (31.3% vs. 16.7%, p=0.021) compared to DS-TB. Additional findings such as tree-in-bud appearance (81.3% vs. 33.3%, p<0.001) and lymphadenopathy (62.5% vs. 20.8%, p<0.001) were more frequent in DR-TB. Conclusion: Imaging serves as a critical tool in differentiating DS-TB from DR-TB. The distinct radiological patterns observed in this study can aid clinicians in early diagnosis, treatment planning, and monitoring of TB cases, thereby improving patient outcomes.
Research Article
Open Access
Rheumatic Mitral Stenosis: Long-Term Follow-Up of Adult Patients with Nonsevere Initial Disease
Sudhakar Singh,
Dheeraj Kela
Pages 157 - 162

Abstract
Background: Rheumatic mitral stenosis remains an important clinical problem, especially in developing regions where rheumatic heart disease prevails. While severe lesions are a strong indication for urgent intervention, non-severe lesions in mildly symptomatic or asymptomatic subjects warrant close long-term follow-up to understand their clinical course and guide management strategies. Methods: This was a longitudinal cohort study of 140 adult patients with non-severe rheumatic mitral stenosis and a mitral valve area ≥1.0 but <2.0 cm² at Heritage Hospitals Lanka. Baseline characteristics, symptom progression, echocardiographic parameters, and quality of life were assessed over a three-year follow-up. The primary outcome measure was progression of mitral stenosis, defined as a reduction in the mitral valve area to less than 1.0 cm². The secondary outcomes were incident atrial fibrillation and changes in quality of life as measured with the Kansas City Cardiomyopathy Questionnaire. Results: During the median follow-up period of 20 months, mitral stenosis progressed in 35.7% of patients, with a median time to progression of 18 months. The risk of progression was significantly greater in patients with NYHA Class III at baseline (p < 0.01). In addition, 25% of patients developed atrial fibrillation, mostly those with a mitral valve area <1.5 cm² (p < 0.001). Quality-of-life scores improved significantly from baseline in all patients (p < 0.001), reflecting effective symptom management despite disease progression in some patients. Conclusion: The study underscores the value of long-term follow-up in patients with non-severe rheumatic mitral stenosis, through which patients at higher risk of disease progression and complications such as atrial fibrillation can be identified. Individualized care strategies and regular monitoring improve outcomes and enhance quality of life in these patients.
Research Article
Open Access
A Study on the Visual Outcomes of Cataract Surgery in Diabetic Patients and Assessment of Post-operative Complications Compared to Non-Diabetic Patients
Md. Obaidur Rahman,
Sudhir Kumar
Pages 183 - 189

Abstract
Background: Diabetes is a major cause of vision loss worldwide, with cataracts being a common complication. Cataract surgery in diabetic patients accounts for about 20% of total procedures and helps improve visual acuity while facilitating retinal examination. However, there is a potential risk of worsening diabetic retinopathy. This study aims to evaluate and compare visual outcomes and postoperative complications in diabetic and non-diabetic patients undergoing cataract surgery. Materials and Methods: This prospective observational study included 50 diabetic and 50 non-diabetic patients undergoing phacoemulsification or SICS with PCIOL implantation. Preoperative assessments included HbA1c levels, visual acuity, intraocular pressure, cataract grading, and fundus evaluation. Postoperative evaluations were conducted on postoperative days 1 and 2 and at 6 weeks, assessing BCVA, intraocular pressure, and fundus changes. Central foveal thickness (CFT) was measured using OCT preoperatively and six weeks postoperatively. Statistical analysis was performed using GraphPad version 8.4.3, with P-values < 0.05 considered significant. Results: This study compared visual outcomes, intraocular pressure, and retinal changes in diabetic and non-diabetic patients undergoing cataract surgery. Diabetic patients had a higher prevalence of cortical cataracts, while nuclear cataracts were slightly more common in non-diabetics. Preoperative glycaemic control, measured by HbA1c levels, significantly influenced postoperative visual recovery, with better-controlled diabetics (HbA1c <7%) achieving superior vision. Both groups showed significant improvement in visual acuity post-surgery, with no major differences between them. Postoperative complications, including iritis and Descemet’s membrane folds, were more frequent in diabetics, but the difference was not statistically significant.
A significant increase in central foveal thickness was observed in both groups, with a greater rise in diabetics, indicating a higher risk of subclinical macular edema. These findings highlight the importance of glycaemic control and close retinal monitoring in diabetic patients undergoing cataract surgery. Conclusion: Cataract surgery improves vision in diabetic patients, though outcomes are slightly better in non-diabetics. Complications like iritis and Descemet's membrane folds were more common in diabetics but not statistically significant. Poor glycaemic control and diabetic retinopathy affected recovery, with a greater increase in central foveal thickness post-surgery. Preoperative diabetic retinopathy remains a key factor in visual outcomes.
Research Article
Open Access
A Study on Complication During Therapeutic Plasma Exchange in A Tertiary Care Hospital of Central India
Devesh Kumar Bulbake,
Sachin Sharma,
Amrita Tripathi,
Ashok Yadav,
Ramu Thakur,
Tamil Priya L
Pages 190 - 194

Abstract
Background: Therapeutic Plasma Exchange (TPE) is a life-saving procedure used to treat various autoimmune, hematological, and neurological disorders. While effective, TPE is associated with a range of complications that can affect patient safety and treatment outcomes, especially in tertiary care settings managing critically ill patients. Objective: This study aimed to analyze the types, frequencies, severities, and timing of complications associated with TPE in a tertiary care hospital in Central India. Methods: A cross-sectional observational study was conducted at MGM Medical College and M.Y. Hospitals, Indore, from December 2020 to December 2024. Data were collected from 400 TPE sessions involving 160 patients. Complications were categorized as mild, moderate, or severe and recorded during the procedure and within 24 hours post-treatment. Statistical analyses, including chi-square tests, were used to assess associations between complications and patient or procedural characteristics. Results: Complications were observed in 37.75% of TPE sessions. Mild complications, such as hypotension, fever, and pruritus, accounted for the majority (69.5%) of the total complications and were primarily transient. Moderate complications included hypocalcemic symptoms (7.3%) and catheter-related issues (5.3%), while severe complications, such as deep vein thrombosis (3.3%) and sepsis, were rare but critical. Most complications occurred during the procedure (61.59%), followed by within one hour post-TPE (28.48%), and least after one hour (9.93%). A significant association was noted between the timing and severity of complications (p = 0.004). Conclusion: TPE is generally safe when performed with proper monitoring, but complications remain a concern, emphasizing the importance of individualized care and vigilance. Advanced technology, aseptic techniques, and adherence to evidence-based protocols are essential to minimizing risks. This study provides valuable insights into optimizing TPE protocols in tertiary care settings and improving patient outcomes.
Research Article
Open Access
Study of Drug Utilization Pattern in OPD Patients at a Tertiary Care Teaching Hospital in North India
Manoj Kumar,
Dheeraj Kumar,
Smriti Chawla,
Prashant Harit,
Naresh Jyoti,
Gurleen Kaur
Pages 447 - 451

Abstract
Background: Drug utilization studies (DUS) are an important resource for stakeholders in drug and health policies. DUS covers the marketing, distribution, prescription and use of drugs in society and the resulting medical, social and economic consequences; it therefore encompasses the prescribing, dispensing, administration and intake of medicines and related events. DUS is mainly aimed at analyzing drug therapy problems and monitoring their consequences, in an attempt to improve the quality of drug therapy. Drug utilization research promotes rational drug use by encouraging prescribers to choose the correct drug at the correct dose and an affordable price, assessing whether drugs are prescribed and used appropriately, providing feedback to doctors on prescription rationality, and evaluating the effectiveness of interventions aimed at improving rational drug use. Objective: To evaluate drug utilization in outpatient department (OPD) patients at a tertiary care teaching hospital in North India. Methods: An observational, prospective study was conducted in the OPD of a tertiary care hospital. Patients' demographic data and prescription details were recorded. Results: A total of 650 patients were included in the study. The findings revealed that 89% of drugs were prescribed by brand name, while 11% were prescribed by generic name; antibiotics were prescribed in 6.50% of cases; injections were prescribed in 6.30% of cases; and polypharmacy was observed in 20.6% of patients. Conclusion: This study highlights the need for rational use of drugs in OPD patients. The findings can inform policy design, education, and awareness programs to motivate physicians to use drugs rationally.
Research Article
Open Access
Evaluation of Postoperative Recovery with or without Endotracheal Tube Cuff Pressure Measurement Intraoperatively
Prathibha Krishna Pillai,
Vandana Trivedi,
Aalap Trivedi
Pages 483 - 487

Abstract
Background: The most common laryngo-tracheal complaints following general anesthesia with tracheal intubation in the postoperative period are sore throat and hoarseness, with an incidence ranging from 24% to 90%, which may hamper the quality of postoperative recovery. This study was designed to assess whether intraoperative monitoring of endotracheal tube cuff pressure can help reduce the incidence of sore throat and hoarseness. Aims & Objectives: The main aim of our study was to compare the quality of postoperative recovery, hemodynamic stability, smoothness of extubation, postoperative airway-related complications and patient satisfaction between patients in whom endotracheal cuff pressure was measured and monitored intraoperatively and those in whom it was not. Materials & Methods: 60 patients scheduled for elective procedures under general anesthesia with orotracheal intubation were recruited through simple random sampling and divided into two groups of 30 each: Group A and Group B. All patients received general anesthesia following a standard protocol. In Group A, cuff pressure monitoring was performed, whereas Group B served as the control group with no such monitoring. The incidence and severity of sore throat and hoarseness were recorded for both groups. Results: All patients were analyzed for the outcomes without any dropouts. Basic parameters such as age, sex, BMI, and duration of surgery were statistically comparable between the two groups. The incidence and severity of sore throat, and the incidence of hoarseness of voice, did not differ significantly between the groups, although outcomes were evidently better in Group A. Conclusion: We conclude that intraoperative monitoring of cuff pressure using a cuff pressure monitor reduced the incidence and severity of sore throat and the incidence of hoarseness of voice in patients undergoing orotracheal intubation, although the differences did not reach statistical significance, with an evident improvement in the quality of postoperative recovery.
Research Article
Open Access
Pulmonary Function Abnormalities Among Treated Cases of Pulmonary Tuberculosis
Bhavya Mehta,
Lokesh Maan,
Mahesh Mishra,
Jitendra Jalutharia,
Mit Mehta,
Tushar Vashist,
Jalpit Patel,
Apurv Mathur,
Niharika Jha
Pages 488 - 492

Abstract
Background: Tuberculosis (TB) is a significant global health challenge, with India accounting for 27% of global cases. Despite improved treatment success rates, the long-term impact of post-pulmonary TB sequelae remains inadequately studied, particularly in high-burden countries. This study aimed to evaluate pulmonary function abnormalities in post-pulmonary tuberculosis cases. Methods: A hospital-based observational study was conducted on 300 post-pulmonary TB patients at a tertiary care center from September 2022 to March 2024. The study utilized comprehensive assessment methods including clinical evaluation and pulmonary function testing using computerized spirometry. Risk factors were systematically evaluated, and statistical analysis was performed with significance set at p≤0.05. Results: The study population (n=242; mean age 59.33±12.18 years) showed male predominance (76.03%) and primarily rural residence (79.8%). A history of smoking was present in 52.1% and biomass fuel exposure in 19.8% of cases. Common clinical manifestations included shortness of breath (87.6%), cough (77.7%), and expectoration (59.5%). Spirometry was successfully performed in 242 of the 300 cases and revealed pulmonary function abnormalities in 89.26% of cases: restrictive pattern in 90 cases (37.19%), mixed pattern in 64 cases (26.45%), and obstructive pattern in 62 cases (25.62%). 98 cases (40.5%) had severe to very severe pulmonary function abnormalities. Conclusion: Post-TB pulmonary function abnormalities are prevalent and often severe, underscoring the need for comprehensive post-TB care. Strategies should include pulmonary rehabilitation, regular lung function monitoring, and interventions targeting modifiable risk factors such as smoking and incomplete treatment adherence to improve long-term outcomes.
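The obstructive/restrictive/mixed breakdown reported above reflects conventional spirometry interpretation. A simplified sketch using common fixed-ratio criteria (FEV1/FVC < 0.70 for obstruction; FVC < 80% of predicted suggesting restriction) — these thresholds are an illustrative assumption, not necessarily the cut-offs used by the authors:

```python
def spirometry_pattern(fev1_fvc_ratio: float, fvc_pct_predicted: float) -> str:
    """Classify a spirometry result using simplified fixed-ratio criteria.

    fev1_fvc_ratio: FEV1/FVC expressed as a fraction (e.g. 0.65)
    fvc_pct_predicted: FVC as a percentage of the predicted value
    """
    obstruction = fev1_fvc_ratio < 0.70      # airflow obstruction
    restriction = fvc_pct_predicted < 80.0   # reduced FVC suggests restriction
    if obstruction and restriction:
        return "mixed"
    if obstruction:
        return "obstructive"
    if restriction:
        return "restrictive"
    return "normal"

print(spirometry_pattern(0.62, 70))  # mixed
print(spirometry_pattern(0.85, 65))  # restrictive
```

Confirmation of a restrictive pattern formally requires lung volume measurement; spirometry alone, as in this study, can only suggest it.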
Research Article
Open Access
A Study of Aneurysms of Arteriovenous Fistula in Chronic Kidney Disease Patients at a Tertiary Care Centre in Eastern India
Shilpa Basu Roy,
Aparna Basumatary,
Subesha Basu Roy,
Birupaksha Biswas,
Debtanu Hazra
Pages 554 - 559

Abstract
Background: Regular puncture for dialysis, treatment with anticoagulation and abnormal hemodynamics make infections, hematoma, thrombosis, limb oedema, cellulitis of the limb, bleeding, pseudoaneurysms and true aneurysms relatively common complications in patients with an arteriovenous fistula (AVF) for hemodialysis. Aims: We aim to describe the presentations, treatment modalities and probable causative factors of true and pseudoaneurysms in CKD patients with arteriovenous access. Materials and Methodology: This was a retrospective observational study in the Department of Cardiothoracic and Vascular Surgery at IPGMER and SSKM Hospital, Kolkata, during the period July 2022 to July 2024. Results: In our study, 34.03% of patients were in the age group 51-60 and 23.15% were in the age group 41-50. Of those studied, 61.4% were male and the rest were female. All the patients had Stage V CKD. 68.77% of patients had aneurysms of the brachiocephalic fistula, while the rest had aneurysms of the radiocephalic fistula. Among comorbidities, 67.01% of patients had Type 2 diabetes mellitus (T2DM), 86.31% were hypertensive, 64.21% had peripheral arterial disease, 36.14% had heart failure and 82.80% had dyslipidemia. In our study, 108 (37.80%) patients had Type Ia aneurysm, 142 (49.82%) had Type Ib, 21 (7.36%) had Type IIa and 14 (4.91%) had Type IIb. 44.91% of patients who presented were asymptomatic, while 40% presented with a bleeding fistula and 15.09% with a hematoma. Among the treatment modalities undertaken, ligation of the fistula was performed in a significant 77.55% of cases, excision of the aneurysm and repair in 16.84%, and endovascular repair in 5.61% of patients.
Conclusions: Frequent monitoring of the arteriovenous access, avoiding repeated punctures in same site for dialysis, regular dressing and antibiotics to prevent infection may help identify and prevent aneurysms early and provide prompt treatment to avoid potentially fatal consequences like rupture, hemorrhage, thrombosis and stenosis. To determine the ideal treatment strategy and the appropriate time for intervention, studies outlining the etiology, natural history and development of aneurysms are necessary.
Research Article
Open Access
Evaluation of Predictors, Complications & Outcome of Ventilator-Associated Pneumonia
Reddy Sanskar,
Maniram Kumhar,
Ravindra Kumar Tiwari,
Haku Anshau,
Vibha Vinayaka,
Harsh Tak
Pages 587 - 591

Abstract
Background: According to studies, ventilator-associated pneumonia (VAP) is a serious complication, and its outcome can vary depending on patient health, the type and severity of the pneumonia, and the effectiveness of treatment. The goal of our study was therefore to evaluate its predictors, complications and outcomes. Material and Method: 144 patients were randomly selected and, after undergoing routine investigation, were evaluated for various predictor parameters, complications and outcomes in a questionnaire format. Result: We found that complications were significantly more frequent in late-onset than in early-onset VAP, that the majority of patients presenting with VAP showed MTR, and that early predictors were significantly more frequent. Conclusion: We conclude that VAP that starts early may respond well to more targeted therapies, but VAP that starts late needs continuous monitoring and care due to the increased risk of significant complications.
Research Article
Open Access
Observational Study of the Relationship Between Serum Lipid Profiles and Risk of Atherosclerotic Cardiovascular Disease (ASCVD)
Prasanti Ponnamalla,
Kandavalli Raja Ravikanth,
Kamarajugadda Vagdevi,
Bharathi Gangumalla,
Sannapu Prasanna Kumar
Pages 40 - 45

Abstract
Background: Atherosclerotic cardiovascular disease (ASCVD) remains a leading cause of morbidity and mortality globally. Serum lipid profiles play a crucial role in ASCVD risk assessment, but the predictive value of traditional and non-traditional lipid markers requires further investigation. Objectives: This study aims to determine the relationship between serum lipid profiles (total cholesterol, LDL-C, HDL-C, triglycerides) and ASCVD risk. Secondary objectives include analyzing lipid ratios (TC/HDL-C, LDL-C/HDL-C) as predictors, evaluating the role of non-traditional lipid markers, and identifying demographic and lifestyle factors influencing lipid profiles and ASCVD risk. Methods: This observational study included 100 adults (30–70 years) without pre-existing ASCVD, recruited from a single-center healthcare facility. Baseline demographic, lifestyle, and biochemical parameters were recorded. Lipid profiles, lipoprotein(a), apolipoproteins, fasting glucose, and HbA1c were assessed. Participants were followed for six weeks to monitor incident ASCVD events. Data were analyzed using SPSS and R software, with logistic regression applied for risk assessment. Results: The mean ASCVD risk score was 10.3 ± 4.7%. Elevated LDL-C (132.5 ± 21.6 mg/dL) and unfavorable lipid ratios correlated with higher ASCVD risk. Incident ASCVD events occurred in 15% of participants, including myocardial infarction (7%), stroke (4%), and peripheral arterial disease (4%). Lipoprotein(a) and apolipoproteins showed potential value in risk stratification. Conclusion: Dyslipidemia and unfavorable lipid ratios significantly predict ASCVD risk. Non-traditional lipid markers may enhance risk assessment. Routine lipid monitoring and targeted interventions are essential for early prevention.
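The lipid ratios named as secondary objectives above are simple quotients of the standard panel values. A small illustrative helper (mg/dL inputs; the example patient values are hypothetical, chosen near the study's mean LDL-C of 132.5 mg/dL):

```python
def lipid_markers(tc: float, hdl: float, ldl: float) -> dict:
    """Derived markers from a standard lipid panel (all values in mg/dL)."""
    return {
        "TC/HDL-C": tc / hdl,          # total cholesterol to HDL ratio
        "LDL-C/HDL-C": ldl / hdl,      # LDL to HDL ratio
        "non-HDL-C": tc - hdl,         # everything except HDL, in mg/dL
    }

# Hypothetical patient, not study data
markers = lipid_markers(tc=210.0, hdl=42.0, ldl=132.5)
print(markers["TC/HDL-C"])   # 5.0
print(markers["non-HDL-C"])  # 168.0
```

Lipoprotein(a) and apolipoprotein levels, also assessed in the study, are measured directly rather than derived from the panel.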
Research Article
Open Access
Ambulatory blood pressure monitoring for measuring blood pressure pattern in patients admitted with acute heart failure in a tertiary care centre: An Observational Study
Kumar Shubham,
Shashi Mohan Sharma,
Dinesh Kumar Gautam,
Pradeep Meena,
Dhananjay Shekhawat
Pages 110 - 117

Abstract
Background: Ambulatory blood pressure monitoring (ABPM) is increasingly recognized for its ability to capture circadian variations in blood pressure, which are pivotal for managing patients with acute heart failure (AHF). This observational study investigates the utility of ABPM in a clinical setting to correlate blood pressure patterns with clinical outcomes in patients admitted with AHF. Methodology: This prospective observational cohort study was conducted at a tertiary care center, encompassing a sample of 100 patients diagnosed with AHF. After initial stabilization, ABPM was employed over the 24 hours prior to discharge to monitor blood pressure fluctuations. Data were analyzed to correlate these fluctuations with clinical parameters including heart failure severity and cardiac structural changes, as evidenced by echocardiographic data. Results: The study found that the proportion of patients in NYHA Class III or IV at admission was significantly higher among HFmrEF risers (96.2%) than non-risers (88.9%) (p = 0.02). ABPM measurements showed that HFpEF patients had the highest average 24-hour SBP (124.9 ± 17.8 mmHg), followed by HFmrEF (112.4 ± 15.2 mmHg) and HFrEF (102.8 ± 13.9 mmHg). HFpEF patients had the highest prevalence of nocturnal hypertension (52.7%), followed by HFmrEF (34.1%) and HFrEF (27.4%); these differences were significant (p=0.01). The differences in LVEF between the AHF groups were statistically significant, with HFpEF showing the best heart function and HFrEF showing the worst. Conclusion: ABPM provides valuable insights into the prognostic implications of blood pressure variability in patients with AHF. The data suggest that ABPM should be considered as part of routine assessment in AHF patients to better tailor therapeutic interventions and potentially improve clinical outcomes.
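The riser/non-riser grouping in this abstract derives from the nocturnal systolic BP fall computed from daytime and night-time ABPM averages. A sketch using the conventional dipping categories (the study's exact cut-offs are not stated in the abstract, so the thresholds below are the standard ones):

```python
def nocturnal_fall_pct(day_sbp: float, night_sbp: float) -> float:
    """Percent fall in SBP from daytime to night-time ABPM averages."""
    return (day_sbp - night_sbp) / day_sbp * 100.0

def dipping_category(fall_pct: float) -> str:
    """Conventional ABPM dipping categories by nocturnal SBP fall."""
    if fall_pct < 0:
        return "riser"            # night-time SBP exceeds daytime SBP
    if fall_pct < 10:
        return "non-dipper"
    if fall_pct <= 20:
        return "dipper"
    return "extreme dipper"

# Night-time SBP higher than daytime -> riser pattern (illustrative values)
print(dipping_category(nocturnal_fall_pct(day_sbp=120.0, night_sbp=128.0)))  # riser
```

Nocturnal hypertension, reported separately in the study, is usually defined by absolute night-time thresholds (e.g. mean night-time BP ≥ 120/70 mmHg) rather than by the dipping percentage.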
Research Article
Open Access
Utility of Complete Blood Count and Peripheral Blood Picture in Assessing Dengue Severity and Outcomes
Divya Srivastava,
Praveen Kumar
Pages 137 - 141

Abstract
Background: A Complete Blood Count (CBC) and Peripheral Blood Picture (PBP) are essential diagnostic tools in assessing the severity and outcomes of dengue infection. Dengue, a mosquito-borne viral illness caused by the dengue virus (DENV), can range from mild dengue fever (DF) to severe forms such as dengue hemorrhagic fever (DHF) and dengue shock syndrome (DSS). The CBC and PBP provide critical information about hematological changes that correlate with disease progression and severity. The objective is to observe the trends of recovery of white blood cells (WBCs) and platelets in dengue fever. Materials and Methods: This is a prospective, cross-sectional study conducted in the Department of Pathology at Uma Nath Singh Autonomous State Medical College, Jaunpur, Uttar Pradesh, over a period of 1 year. Patients diagnosed with dengue fever, dengue hemorrhagic fever (DHF), or dengue shock syndrome (DSS) based on WHO criteria were included, with cases confirmed by serological tests (NS1 antigen, IgM/IgG ELISA) or RT-PCR. Patients with recorded laboratory parameters, including hemoglobin, hematocrit, white blood cell count, platelet count, and peripheral blood smear findings, were included. Results: Data from a total of 560 patients were analyzed in this study, comprising 280 males and 280 females. The mean age of the study population was 34.2 years, with a standard deviation of 13.7, and the age range was between 16 and 84 years. Among these patients, 245 did not develop DHF during hospitalization, while 315 progressed to DHF in the ward. During the acute febrile phase (Days 2–3) of the illness, leucopenia (WBC < 5000 cells/mm³) was observed in 72.4% of the patients. The average WBC counts for DF and DHF patients were 4.22 and 4.57, respectively. Neutrophil counts showed mean values of 2.85 in DF patients and 3.21 in DHF patients. Lymphocyte counts were lower in DHF patients, with a mean of 0.92 compared to 1.07 in DF patients. Conclusion: DF is a growing global problem with an expanding footprint on millions of lives. At this time, monitoring decreases in hemoglobin and increases in WBC counts, particularly neutrophils, through routine CBC testing in hospitalized patients with suspected DF may identify patients at higher risk of severe disease.
Research Article
Open Access
A prospective study of incidence and outcome of arrhythmias in patients with Acute Myocardial Infarction (AMI)
Dr. Venkata Harish,
Dr. V K Manasa,
Dr. Chennakesavulu Dara
Pages 267 - 270

Abstract
Background: Acute Myocardial Infarction (AMI) is a leading cause of morbidity and mortality globally, with arrhythmias representing a frequent and significant complication. These arrhythmias, which can range from benign to life-threatening, are closely associated with the severity of myocardial injury and contribute to poor short- and long-term outcomes. This prospective observational study aimed to investigate the incidence, types, and outcomes of arrhythmias in patients with AMI admitted to the intensive coronary care unit at SVRRGGH, Tirupati. A cohort of 100 patients was assessed based on clinical features, ECG evidence, blood biomarkers, and imaging. The study found that the majority of patients were aged 41-70 years, with a significant male predominance (83%). Lifestyle factors such as smoking and alcohol use were common, and hypertension and diabetes were prevalent comorbidities. The most common type of myocardial infarction was Inferior Wall Myocardial Infarction (INFWMI). Arrhythmias, particularly ventricular premature contractions (VPCs), were observed in a significant number of patients, with spontaneous resolution noted in some cases. The study highlighted the relationship between the location of the myocardial infarction and the occurrence of specific arrhythmias, with no significant correlation found between MI type and mortality. Additionally, factors such as age, gender, and comorbidities influenced arrhythmic patterns and outcomes. These findings suggest that arrhythmias in AMI patients can often resolve spontaneously but require careful monitoring and timely intervention. The study underscores the importance of personalized treatment strategies and further research to refine management techniques and improve patient outcomes, particularly for those with high-risk factors such as comorbidities and lifestyle behaviors.
Research Article
Open Access
Evaluating the Role of Perioperative Goal-Directed Fluid Therapy in Preserving Postoperative Renal Function in High-Risk Surgical Patients: A Prospective Study
Yogesh Kumar Chhetty,
Vinamra Tiwari,
Himanshu Jangid
Pages 755 - 760

View PDF
Abstract
Background: Postoperative acute kidney injury (AKI) remains a significant complication in high-risk surgical patients, contributing to increased morbidity, prolonged hospital stays, and higher mortality rates. Fluid management during the perioperative period plays a critical role in maintaining renal perfusion and preventing ischemic kidney injury. Goal-Directed Fluid Therapy (GDFT), an individualized approach using hemodynamic monitoring to optimize fluid administration, has been proposed as a strategy to improve renal perfusion and organ function in surgical patients. However, its impact on postoperative renal function, particularly in high-risk populations, remains a subject of ongoing investigation. This study aims to evaluate the effectiveness of perioperative GDFT in preserving renal function and reducing the incidence of postoperative AKI in high-risk surgical patients. Objective: To assess the impact of perioperative Goal-Directed Fluid Therapy (GDFT) on postoperative renal function, determine its role in reducing AKI incidence, and compare it with standard fluid management protocols in high-risk surgical patients. Methods: This prospective observational study was conducted over six months in the surgical and critical care units of a tertiary care hospital, enrolling 50 high-risk surgical patients undergoing major non-cardiac surgery. Patients were divided into two groups based on intraoperative fluid management strategy:
- GDFT Group: Patients managed using non-invasive hemodynamic monitoring (stroke volume variation, cardiac output, dynamic fluid responsiveness) to guide fluid administration.
- Standard Fluid Therapy (SFT) Group: Patients managed using a fixed, weight-based fluid administration approach.
Preoperative renal function was assessed using serum creatinine, estimated glomerular filtration rate (eGFR), and urine output.
Postoperative renal function was evaluated using Acute Kidney Injury Network (AKIN) criteria, comparing serum creatinine changes, urine output, and AKI incidence between the two groups at 24 hours and 72 hours postoperatively. Secondary outcomes included length of hospital stay, need for renal replacement therapy (RRT), and overall morbidity and mortality rates. Results: The incidence of postoperative AKI was significantly lower in the GDFT group (12%) compared to the SFT group (32%) (p < 0.05). Patients in the GDFT group maintained better renal function, as indicated by lower serum creatinine elevation (mean increase of 0.2 ± 0.1 mg/dL vs. 0.5 ± 0.2 mg/dL in SFT, p < 0.05) and higher urine output (mean 1.2 ± 0.4 mL/kg/hr vs. 0.7 ± 0.3 mL/kg/hr, p < 0.05). The requirement for renal replacement therapy (RRT) was lower in the GDFT group (4%) compared to the SFT group (12%), although this difference was not statistically significant. Additionally, the length of ICU stay and total hospital stay were significantly shorter in the GDFT group, suggesting an overall improved recovery trajectory. Conclusion: The findings of this study suggest that perioperative Goal-Directed Fluid Therapy (GDFT) is associated with improved renal function, reduced incidence of postoperative AKI, and shorter hospital stays in high-risk surgical patients. The use of dynamic hemodynamic monitoring for individualized fluid administration appears to be superior to standard fixed-volume resuscitation strategies, potentially leading to better renal perfusion and organ protection. These results support the implementation of GDFT protocols in high-risk surgical populations to improve postoperative outcomes. Further multi-center trials with larger patient cohorts are needed to establish standardized GDFT guidelines for optimizing perioperative renal protection strategies.
Research Article
Open Access
Incidence of Thyroid Function Abnormality in Newly Diagnosed MDR/RR TB Patients: A Retrospective Observational Study in Central India
Shailesh Agrawal,
Debasish Chakraborty,
Salil Bhargava,
Sanjay Avashia,
Deepak Bansal
Pages 489 - 492

View PDF
Abstract
Background and objective: Thyroid function abnormalities are a recognized comorbidity associated with tuberculosis. The effect of second-line anti-tuberculosis drugs on thyroid function has been studied; however, there is very limited research on baseline thyroid dysfunction in newly diagnosed MDR/RR TB patients. Method: Baseline thyroid function reports of 51 microbiologically confirmed, newly diagnosed MDR/RR TB patients, including both pulmonary and extrapulmonary cases, who reported to MY Hospital, Indore, between September 2022 and June 2024 were documented from the DRTB register. All patients were older than 18 years. Result: 17.64% of the study population showed hypothyroidism: 13.72% of patients had subclinical hypothyroidism and 3.91% had clinical hypothyroidism. Conclusion: The incidence of hypothyroidism was significantly higher in MDR/RR TB patients than in the general population, and subclinical hypothyroidism in particular was more commonly associated with MDR/RR TB. Hypothyroidism, especially subclinical hypothyroidism, is therefore a serious concern in MDR/RR TB patients and may deteriorate further with second-line anti-tubercular therapy, so proper monitoring and management of this issue are very important.
Research Article
Open Access
Cardiovascular Risk in Type 2 Diabetes Patients
Thota Abhinav,
Mohammed Abdul Aleem Sagri,
J Prathyusha Rao
Pages 603 - 606

View PDF
Abstract
Background: Diabetes mellitus (DM) is a chronic metabolic disorder with a rapidly increasing global prevalence, contributing significantly to morbidity and mortality. Poor glycemic control is a key factor leading to severe complications, particularly cardiovascular diseases (CVD). This study assesses demographic distribution, glycemic control, diabetes duration, management strategies, and cardiovascular risk factors among diabetic patients. Aim: To evaluate glycemic control, management strategies, and the prevalence of cardiovascular risk factors among diabetic patients attending a tertiary care hospital. Methods: A cross-sectional study was conducted among 100 diabetic patients. Data on demographic characteristics, mean HbA1c levels, duration of diabetes, management approaches, and cardiovascular risk factors were collected and analyzed using SPSS software. Results: Of the study population, 63% were male and 57% were aged 41–60 years. The mean HbA1c was 8.2%, indicating poor glycemic control. Most patients (61%) had diabetes for over five years, and 56% required both oral hypoglycemic agents (OHA) and insulin. The most prevalent cardiovascular risk factors were dyslipidemia (63%), hypertension (41%), and a high-risk CVD category (37%). Tobacco use and alcohol consumption were observed in 19% and 29% of the patients, respectively. Conclusion: The study highlights poor glycemic control and a high prevalence of cardiovascular risk factors among diabetic patients, emphasizing the urgent need for targeted interventions. Multidisciplinary diabetes management, including early lifestyle modifications, optimal pharmacological strategies, and regular monitoring, is crucial in reducing diabetes-related complications. Future research should explore individualized intervention strategies and their long-term impacts on glycemic control and cardiovascular risk reduction.
Research Article
Open Access
A prospective study on Microalbuminuria among Chronic Kidney Disease Patients
Jayabalakrishnan Subburaja,
Manila Jain
Pages 629 - 634

View PDF
Abstract
Introduction: Chronic kidney disease (CKD) is classified into five stages based on the estimated glomerular filtration rate (eGFR), with stage 1 being the mildest and stage 5 representing end-stage renal disease (ESRD). Microalbuminuria is typically detected in the early stages of CKD and is defined as a urinary albumin-to-creatinine ratio (ACR) between 30 and 300 mg/g. It is indicative of glomerular injury and endothelial dysfunction, reflecting increased permeability of the glomerular filtration barrier. The pathophysiology of microalbuminuria in CKD involves multiple factors, including glomerular hypertension, podocyte injury, and inflammation. Persistent microalbuminuria is associated with a decline in renal function and an increased risk of cardiovascular events, making it an important marker for risk stratification and therapeutic monitoring in CKD patients. Material and Methods: This observational, cross-sectional study was conducted among CKD patients attending the outpatient clinics and hospital of Index Medical College. Patients diagnosed with CKD stages 1–5, based on the Kidney Disease Improving Global Outcomes (KDIGO) guidelines, were included. Demographic and clinical data were collected on age, gender, duration of CKD, comorbidities, medications, and lifestyle factors. Urinary albumin excretion was quantified from spot urine samples or 24-hour urine collections, with results normalized to urinary creatinine concentration; urinary albumin concentration was measured using an immunoturbidimetric assay. Results: Mean urinary albumin was 145.6 mg/g creatinine (SD: 85.3) and mean 24-hour urinary albumin was 320.4 mg/day (SD: 150.2). Both urinary albumin measures increase with CKD progression: urinary albumin (mg/g creatinine) rises from 45.2 mg/g (Stage 1) to 380.5 mg/g (Stage 5).
24-hour urinary albumin excretion shows a similar increase, from 85.3 mg/day (Stage 1) to 600.5 mg/day (Stage 5). The rate of albumin increase is mild in Stages 1 and 2 but becomes steep in Stages 3–5. Urinary albumin (mg/g creatinine) has a correlation of r = 0.65 (p < 0.001); 24-hour urinary albumin (mg/day) has an even stronger correlation of r = 0.70 (p < 0.001). Conclusion: This study demonstrates a significant negative correlation between antioxidant status and microalbuminuria in CKD patients, consistent with previous research. The findings highlight the role of oxidative stress in CKD progression and suggest that interventions targeting oxidative stress may help reduce microalbuminuria and slow disease progression.
Research Article
Open Access
A Study of Hematological Profile in Patients with Tuberculosis
Hetal Dineshbhai Asari,
Jaysukh Bhanabhai Berani,
Parth Navinkumar Patel,
Deepa Parmanand Jehwani
Pages 648 - 655

View PDF
Abstract
Introduction: Tuberculosis can significantly affect the hematopoietic system, leading to various hematological abnormalities such as anemia, leukocytosis, and changes in platelet counts, which can be valuable in diagnosis and in monitoring treatment response. Method: This prospective study was conducted at PDU Medical College and Hospital, Rajkot, Gujarat from April 2021 to March 2022. Blood samples were sent to the clinical laboratory, Department of Pathology, where peripheral smears were prepared from EDTA samples and the data were evaluated. A total of 850 patients' samples were studied, including patients diagnosed with pulmonary tuberculosis, extrapulmonary tuberculosis, and MDR tuberculosis. Hematological parameters, namely hemoglobin (HB), RBC count, RDW, mean corpuscular volume (MCV), mean corpuscular hemoglobin (MCH), mean corpuscular hemoglobin concentration (MCHC), total leucocyte count (TLC), differential leukocyte count (DLC), and platelet count, were measured with the help of an automated hematology analyzer. Result: The maximum number of cases was found in the third decade of life, followed by the fifth and fourth decades. Anemia was frequently encountered in patients with tuberculosis (70.24%). Among anemic patients, most (51.09%) had a moderate degree of anemia, with hemoglobin levels between 7 and 10 gm/dl, and 38.02% had a mild degree of anemia, with hemoglobin levels between 10.1 and 12.9 gm/dl for males and 10.1 and 11.9 gm/dl for females. Only a few patients (10.89%) had a severe degree of anemia, with hemoglobin levels below 7 gm/dl. Normocytic normochromic anemia was the most common type of anemia (52.60%), followed by hypochromic microcytic anemia (42.04%). Increased ESR was the commonest finding associated with tuberculosis (92.35%). Leucocytosis occurred in 43.18% of cases; among them, 72.20% showed neutrophilia while 24.79% showed lymphocytosis. Most cases had a normal platelet count, but thrombocytosis was seen in 32.47% of cases.
Conclusion: These hematological abnormalities are quite common in patients with tuberculosis, and physicians must maintain a high index of suspicion for the diagnosis of tuberculosis in patients presenting with such abnormalities.
Research Article
Open Access
An Observational Study on Assessment of Pregnancy Outcome in Women with Thalassemia Carrier State in a Tertiary Care Centre
Nabanita Dasgupta,
Ayesha Sadaf,
Kajal Kumar Patra,
Rajib De,
Tanaya Ghosh
Pages 683 - 690

View PDF
Abstract
Background: Thalassemia minor is a common hereditary hemoglobinopathy that may impact pregnancy outcomes despite being traditionally considered a benign carrier state. This study evaluates the maternal and neonatal complications associated with thalassemia carrier pregnancies in a tertiary care setting. Methods: A prospective observational study was conducted at a tertiary care hospital, comparing 100 pregnant women with thalassemia minor to 100 non-carrier controls. Maternal outcomes, including anaemia, gestational diabetes mellitus (GDM), hypertensive disorders, postpartum haemorrhage (PPH), and mode of delivery, were assessed. Neonatal outcomes such as low birth weight (LBW), intrauterine growth restriction (IUGR), preterm birth, NICU admissions, and perinatal mortality were evaluated. Logistic regression analysis adjusted for maternal BMI, gestational age, and anaemia severity.
Results:
- Anaemia was significantly more prevalent in thalassemia carriers (78% vs. 18%, p < 0.001), with a fourfold increased risk of severe anaemia (OR = 4.52, p < 0.001).
- Caesarean section rates were significantly higher in carriers (42% vs. 30%, p = 0.040).
- IUGR risk was significantly elevated in thalassemia carriers (24% vs. 10%, OR = 2.88, p = 0.010), and LBW was more frequent (38% vs. 22%, p = 0.020).
- NICU admissions were higher among carrier neonates (15% vs. 8%), though not statistically significant (p = 0.080).
Conclusion: Thalassemia minor is associated with a higher risk of anaemia, IUGR, LBW, and caesarean section, emphasizing the need for enhanced prenatal screening, haematological monitoring, and individualized obstetric care. Early detection and multidisciplinary management can mitigate adverse pregnancy outcomes in this population.
Research Article
Open Access
Study of Pre and Post Dialysis Serum Electrolytes and ECG Changes in Patient with Chronic Kidney Disease
Mohammed Ubaidulla Mohammed Ataull,
Aditya Patil,
Amitkumar Potulwar,
A.R. Farooqui,
Tejasri Koorapati,
Subhash More
Pages 799 - 803

View PDF
Abstract
Background: Chronic kidney disease (CKD) is associated with significant electrolyte imbalances and cardiac complications. Hemodialysis plays a crucial role in correcting these abnormalities; however, rapid shifts in serum electrolytes can lead to ECG changes, increasing the risk of arrhythmias. This study aims to evaluate the frequency of different ECG abnormalities in CKD patients and analyze electrolyte changes after dialysis. Methods: A cross-sectional study was conducted at a tertiary care medical center on 200 patients with end-stage renal disease (ESRD) undergoing hemodialysis. Patients above 12 years of age meeting the inclusion criteria were enrolled. Exclusion criteria included ischemic heart disease, atrial fibrillation, left ventricular hypertrophy, left bundle branch block (LBBB), and antiarrhythmic medication use. Pre- and post-dialysis blood samples were analyzed for serum levels of potassium, calcium, magnesium, sodium, bicarbonate, urea, and creatinine. A 12-lead ECG was recorded before and after dialysis to assess changes in P wave amplitude, QRS complex, T wave, PR interval, QT interval, ST depression, and QT dispersion. Results: The majority of patients were males (66%), hypertensive (65%), and aged 51-60 years (22%). Hemodialysis led to significant changes in serum sodium (p<0.001), calcium (p<0.05), potassium (p<0.001), magnesium (p<0.05), bicarbonate (p<0.001), urea (p<0.001), and creatinine (p<0.05). Significant ECG changes included reductions in QT interval (p<0.001) and QT dispersion (p<0.001), and increased QRS amplitude (p<0.001).
Conclusion: Hemodialysis significantly alters electrolyte levels and induces ECG changes, highlighting the need for continuous cardiac monitoring in CKD patients undergoing dialysis.
Research Article
Open Access
Study of ischemic stroke patients with special emphasis on the relationship with lipid profile and carotid artery plaque as evaluated by Doppler ultrasound study
Vivek Kumar Singh,
Ataul Haque,
Vikrant Kumar
Pages 810 - 813

View PDF
Abstract
Background: Ischemic stroke is a major cause of morbidity and mortality worldwide, with a strong association with atherosclerosis and dyslipidemia. Carotid artery plaque formation is a critical factor in stroke pathophysiology, and its evaluation through Doppler ultrasound provides valuable insights into disease progression. This study aims to assess the relationship between ischemic stroke, lipid profile, and carotid artery plaque characteristics. Materials and Methods: A total of 100 ischemic stroke patients, aged 45–75 years, were included in this hospital-based cross-sectional study. Patients underwent lipid profile analysis, including total cholesterol (TC), low-density lipoprotein (LDL), high-density lipoprotein (HDL), and triglycerides (TG). Carotid Doppler ultrasound was performed to assess plaque presence, morphology, and degree of stenosis. Statistical analysis was conducted to determine correlations between lipid parameters and carotid plaque severity. Results: Among the patients, 70% had hyperlipidemia, with a mean LDL level of 150 ± 20 mg/dL and HDL of 38 ± 5 mg/dL. Carotid artery plaques were detected in 65% of cases, with 40% exhibiting significant stenosis (>50%). A strong positive correlation (r = 0.72, p < 0.01) was observed between LDL levels and plaque severity. Patients with TC > 200 mg/dL had a 3.5-fold increased risk of severe carotid plaque formation. Conclusion: The study highlights a significant association between dyslipidemia and carotid artery plaque formation in ischemic stroke patients. Routine lipid monitoring and carotid Doppler evaluation can aid in early detection and risk stratification, potentially reducing stroke recurrence through targeted lipid-lowering therapies.
Research Article
Open Access
Prevalence of Iron Deficiency Anemia Among Blood Donors: A Cross-Sectional Study
Vinay Changdeorao Nalpe,
Vaibhav Vilas Deshmukh,
Dinesh Vishwanath Swami,
Arvind N Bagate
Pages 840 - 843

View PDF
Abstract
Introduction: Iron deficiency anemia (IDA) is a significant concern among blood donors due to the potential impact on donor health and blood supply quality. This study assesses the prevalence of IDA in a cohort of blood donors, with a focus on gender differences and the efficacy of current screening practices. Methods: This cross-sectional study was conducted at a tertiary care center, including 74 blood donors (56 females and 18 males). Participants underwent screening for iron deficiency using standard hematological parameters, including hemoglobin and serum ferritin levels. Results: The prevalence of iron deficiency among female donors was 39.29% (95% CI: 26.79% - 51.79%), significantly higher than the 33.33% (95% CI: 11.11% - 55.56%) observed in male donors. The overall effectiveness of pre-donation screening in identifying iron deficiency was high, with a detection rate of 99.56% (95% CI: 90.91% - 100.00%) among those screened. Conclusions: Iron deficiency remains a prevalent issue among blood donors, particularly in females. The high rate of detection through pre-donation screening suggests that current methods are effective, but continuous monitoring and tailored interventions, such as iron supplementation and adjusted donation intervals, are recommended to manage iron levels in blood donors effectively. Further research is needed to refine screening techniques and develop gender-specific strategies to address this issue.
Research Article
Open Access
Neonatal Surgery and Its Association with Developmental and Psychiatric Disorders in Early Childhood: A Cohort Study
Vanama Lavya Kumar,
Gorre Jagadish Kumar,
C V S Lakshmi,
Sivasankar Nunna
Pages 648 - 652

View PDF
Abstract
Background: Neonatal surgery is often performed to correct life-threatening conditions in newborns. However, little is known about its long-term impact on developmental and psychiatric outcomes in early childhood. This cohort study aimed to examine the association between neonatal surgery and the occurrence of developmental delays and psychiatric disorders in children. Methods: A cohort of children who underwent neonatal surgery was compared with a control group. Data on demographic characteristics, developmental delays at age 3, and psychiatric disorders at age 5 were collected. Statistical analyses included chi-square tests and multivariate regression models. Results: There were no significant differences between the two groups in terms of gender, gestational age, or birth weight. At age 3, 30% of children in the neonatal surgery group exhibited developmental delays, compared to 12% in the control group (p = 0.02). Specifically, motor delays were more prevalent in the neonatal surgery group (20% vs. 8%, p = 0.04). At age 5, 20% of children in the neonatal surgery group had psychiatric disorders, compared to 8% in the control group (p = 0.03). Anxiety disorders were more common in the surgery group (12% vs. 4%, p = 0.09). Multivariate analysis revealed that neonatal surgery was significantly associated with both developmental delays (OR = 2.8, p = 0.02) and psychiatric disorders (OR = 2.5, p = 0.03). Conclusion: Neonatal surgery is associated with a higher risk of developmental delays and psychiatric disorders in early childhood. These findings highlight the importance of early monitoring and intervention for children who undergo neonatal surgery.
Research Article
Open Access
Outcomes of Medical versus Surgical Management of Cesarean Scar Pregnancy: A Randomized Controlled Trial
Dipika Kadu,
Vivek R Panara,
Niharika Dilipbhai Barasara
Pages 42 - 45

View PDF
Abstract
Background: Cesarean Scar Pregnancy (CSP) is a rare form of ectopic pregnancy where the embryo implants within the scar of a previous cesarean section. Effective management is crucial to prevent severe complications, including uterine rupture and life-threatening hemorrhage. This study aimed to compare the clinical outcomes of medical versus surgical management of CSP in a randomized controlled trial. Materials and Methods: A total of 60 patients diagnosed with Cesarean Scar Pregnancy were randomly allocated into two groups: Medical Management (n = 30) and Surgical Management (n = 30). The medical group received intramuscular methotrexate (MTX) at a dose of 50 mg/m², followed by serial monitoring of β-hCG levels until normalization. The surgical group underwent hysteroscopic resection of the gestational sac. Primary outcomes assessed included treatment success rate, time to β-hCG normalization, blood loss, hospital stay duration, and complication rates. Data were analyzed using appropriate statistical methods, with significance set at p < 0.05. Results: The treatment success rate was significantly higher in the Surgical Management group (93.3%) compared to the Medical Management group (76.7%) (p = 0.04). The mean time to β-hCG normalization was shorter in the surgical group (28.3 ± 5.2 days) compared to the medical group (45.7 ± 7.4 days) (p < 0.001). Blood loss was notably higher in the surgical group (210 ± 50 mL) compared to the medical group (120 ± 35 mL) (p = 0.02). However, hospital stay duration was shorter in the surgical group (2.1 ± 0.6 days) compared to the medical group (4.5 ± 1.2 days) (p < 0.001). Complication rates were higher in the medical group (20%) than in the surgical group (10%). Conclusion: Surgical management of Cesarean Scar Pregnancy offers a higher success rate and faster resolution compared to medical management, though it is associated with higher blood loss. 
Medical management remains a viable alternative for patients in whom surgery is contraindicated or who seek conservative treatment. Further studies with larger samples are warranted to confirm these findings.
Research Article
Open Access
Retrospective Evaluation of Antipsychotic-Induced Metabolic Side Effects in Schizophrenia Patients
Gorre Jagadish Kumar,
Prashanth Kumar Patnaik
Pages 69 - 72

View PDF
Abstract
Background: Antipsychotic medications are essential in managing schizophrenia but are frequently associated with metabolic side effects. These adverse effects increase the risk of cardiovascular disease, diabetes, and mortality in affected patients. Objective: To evaluate the prevalence and progression of metabolic side effects in schizophrenia patients undergoing antipsychotic treatment over a six-month period. Methods: A retrospective observational study was conducted on 100 schizophrenia patients receiving antipsychotic therapy at a tertiary care hospital. Demographic details, medication history, and metabolic parameters were collected from patient records at baseline and after 6 months of treatment. Parameters assessed included body mass index (BMI), fasting blood glucose, lipid profile, and blood pressure. The presence of metabolic syndrome was determined using NCEP-ATP III criteria. Statistical significance was assessed using paired comparisons and chi-square tests. Results: Among 100 patients (mean age: 36.2 ± 9.4 years; 57 males, 43 females), 74% were on atypical antipsychotics. Olanzapine (32%) and Risperidone (24%) were the most frequently prescribed. Significant increases were observed in weight gain (14% to 38%), BMI >25 (22% to 49%), fasting glucose >100 mg/dL (18% to 41%), and triglycerides >150 mg/dL (27% to 46%) (p < 0.01). Atypical antipsychotics were associated with a higher incidence of metabolic abnormalities. The prevalence of metabolic syndrome rose from 8% to 28% over the treatment period (p < 0.001). Conclusion: Antipsychotic therapy, particularly with atypical agents, is strongly associated with metabolic side effects in schizophrenia patients. Routine monitoring and early intervention are essential to mitigate long-term health risks.
Research Article
Open Access
Effect of Levothyroxine Dose Titration on Quality of Life and Serum TSH Levels in Hospital-Initiated Hypothyroid Patients: A 6-Month Follow-up Study
Kaushik Ghanshyambhai Khatrani,
Ujval R. Patel,
Hardik kumar Manojbhai Patel,
Hardik Ashokbhai Savaliya,
Siddharth Patel,
Ravindrapal Singh
Pages 223 - 226

View PDF
Abstract
Background: Hypothyroidism is a common endocrine disorder characterized by elevated serum thyroid-stimulating hormone (TSH) and decreased thyroid hormone levels. Timely initiation and appropriate titration of levothyroxine are crucial for symptomatic relief and metabolic balance. This study evaluates the impact of levothyroxine dose adjustment on serum TSH levels and quality of life (QoL) in newly diagnosed hypothyroid patients over six months. Materials and Methods: A prospective observational study was conducted on 60 newly diagnosed hypothyroid patients aged 20–55 years at a tertiary care hospital. Levothyroxine therapy was initiated based on body weight and titrated every 6 weeks to achieve target TSH levels (0.5–4.5 µIU/mL). Serum TSH was measured at baseline, 3 months, and 6 months. QoL was assessed using the Thyroid-Specific Patient-Reported Outcome (ThyPRO) questionnaire at the same intervals. Results: The mean baseline TSH was 18.7 ± 5.4 µIU/mL, which significantly decreased to 6.1 ± 2.3 µIU/mL at 3 months and reached 2.9 ± 1.1 µIU/mL at 6 months (p < 0.001). QoL scores showed marked improvement, with the mean ThyPRO score improving from 72.4 ± 8.2 at baseline to 48.3 ± 7.5 at 3 months and 31.6 ± 6.4 at 6 months (p < 0.001). Most patients reached euthyroid status by the end of the study with individualized titration. Conclusion: Levothyroxine dose titration over a 6-month period significantly improves thyroid function and quality of life in patients with newly diagnosed hypothyroidism. Regular monitoring and individualized dosing are key to achieving optimal therapeutic outcomes.
Research Article
Open Access
Quantifying C-Reactive Protein in Clinically Stable Chronic Obstructive Pulmonary Disease Patients
Sameer Chandratre,
Bharat Trivedi,
Akhilesh Omprakash Somani
Pages 266 - 269

View PDF
Abstract
Background: Chronic Obstructive Pulmonary Disease (COPD) is associated with chronic systemic inflammation, and C-reactive protein (CRP) is a key biomarker. This study evaluates CRP levels in stable COPD patients compared to healthy controls. Methods: A case-control study included 40 stable COPD patients (GOLD stages 1–4) and 40 age- and sex-matched healthy controls. Serum CRP was measured using high-sensitivity CRP (hs-CRP) assay. Spirometry confirmed COPD severity. Statistical analysis was performed using SPSS v26. Results: Mean CRP was significantly higher in COPD patients (5.2 ± 2.1 mg/L) vs. controls (1.8 ± 0.9 mg/L) (p < 0.001). CRP increased with GOLD stages (Stage 1: 3.1 ± 1.2 mg/L, Stage 4: 7.5 ± 2.4 mg/L; p < 0.01). No significant difference was found between current and ex-smokers (p = 0.45). Conclusion: Elevated CRP in stable COPD suggests persistent systemic inflammation, correlating with disease severity. CRP may aid in monitoring disease progression and guiding therapy.
Research Article
Open Access
Oligohydramnios and Fetal Growth Restriction as Indicators of Adverse Pregnancy Outcomes in Patients with Hypertensive Disorders in Pregnancy: A Retrospective Study
Megha M.N.,
Krupa B.M.,
Ashwini Nayak,
Tejaswini R
Pages 270 - 274

View PDF
Abstract
Background: Hypertensive disorders complicate 5-10% of pregnancies worldwide, and their incidence in India was found to be 10.08% as per data of the National Eclampsia Registry (NEP). Aim: To compare the perinatal outcome of oligohydramnios or fetal growth restriction (FGR) with that of normal amniotic fluid index (AFI) and fetal growth in hypertensive disorders in pregnancy (HDP), and to compare the outcomes of only oligohydramnios, only FGR, and oligohydramnios with FGR in the HDP groups. Study Design: This retrospective study included 234 pregnant women after 20 weeks of gestation with HDP, from May 2022 to May 2024. Patients were divided into two groups: HDP with oligohydramnios or FGR (n = 48) and HDP with normal AFI and fetal growth (n = 186). The first group was then subdivided into only oligohydramnios (n = 16), only FGR (n = 20), and oligohydramnios with FGR (n = 12). Perinatal outcomes were recorded. Results: The study found no significant differences in maternal characteristics or complications between the HDP group with oligohydramnios/FGR and the group with normal AFI. However, the HDP group with oligohydramnios/FGR had higher rates of impaired Doppler findings and cesarean section (p = 0.004). Neonatal birth weight was lower in the HDP group with oligohydramnios/FGR (p = 0.001), but no significant differences were found in APGAR scores, NICU admissions, or neonatal death. Subgroup analysis showed higher rates of cesarean section, NICU admission, and acute fetal distress in the combined oligohydramnios/FGR group (p = 0.05). These findings suggest more severe complications in pregnancies with both oligohydramnios and FGR. Conclusions: Patients with only oligohydramnios showed more favorable outcomes compared to those with only FGR or the coexistence of both conditions. Close monitoring of patients with FGR and of those with both conditions is recommended to improve pregnancy outcomes.
Research Article
Open Access
Impact of Diabetes Mellitus on the Development and Progression of Benign Prostatic Hyperplasia (BPH)
B. Rajasekhar,
Cheviti Sreeharsha,
Neerukatti Sheliya Dainy
Pages 1001 - 1005

View PDF
Abstract
Background: Benign Prostatic Hyperplasia (BPH) is a common condition in aging men, characterized by the non-malignant enlargement of the prostate gland, which leads to lower urinary tract symptoms (LUTS) such as frequency, urgency, and nocturia. Objective: This study aimed to evaluate the impact of Diabetes Mellitus (DM) on the development and progression of Benign Prostatic Hyperplasia (BPH) in male patients. Methods: An observational cohort study was conducted with 100 male participants, 50 with DM and 50 without, aged 50-80 years. Participants were monitored over a 12-month period. Prostate volume and International Prostate Symptom Score (IPSS) were measured at baseline and after 12 months. Multivariate logistic regression and hazard ratio analysis were used to evaluate the relationship between DM and BPH development and progression. Results: The prevalence of BPH was significantly higher in the Diabetic group (72%) compared to the Non-Diabetic group (52%, p = 0.04). Diabetes was associated with a faster progression of prostate volume increase (1.4 ± 0.8 cm³ vs 0.9 ± 0.6 cm³, p = 0.02) and a greater increase in IPSS (5.2 points vs 3.1 points, p = 0.03). Multivariate analysis revealed diabetes mellitus (OR = 2.7, 95% CI 1.4–5.3, p = 0.01) and age (OR = 1.5, 95% CI 1.1–2.0, p = 0.03) as independent factors contributing to BPH risk. The Diabetic group also experienced more medical interventions and had a higher incidence of acute urinary retention (14% vs 6%, p = 0.08). Conclusion: Diabetes Mellitus significantly increases the likelihood of developing and accelerating the progression of BPH in men. Early monitoring and proactive management of BPH may be essential for diabetic patients.
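The group-level prevalences reported in this abstract (72% of 50 diabetic vs. 52% of 50 non-diabetic participants) imply a 2×2 table from which an unadjusted odds ratio with a Woolf 95% confidence interval can be sketched. The cell counts below are reconstructed from the reported percentages for illustration, not taken from the study's raw data; the result differs from the paper's multivariate OR of 2.7, as expected once covariates such as age are adjusted for.

```python
import math

# Cell counts reconstructed from reported prevalences (illustrative):
# diabetic group: 72% of 50 -> 36 with BPH, 14 without
# non-diabetic group: 52% of 50 -> 26 with BPH, 24 without
a, b = 36, 14   # diabetic: BPH yes / no
c, d = 26, 24   # non-diabetic: BPH yes / no

odds_ratio = (a * d) / (b * c)                 # unadjusted odds ratio
se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)      # Woolf SE of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log)
print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

The unadjusted estimate lands near 2.4; a multivariate model is still needed to separate the diabetes effect from age, which the study reports as an independent risk factor.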
Research Article
Open Access
Comparison of Cardiac Output Assessment with Less Invasive (FloTrac) and Invasive (PAC CCO) Methods in Patients Undergoing Off-Pump Coronary Artery Bypass Grafting
Yogesh N. Zanwar,
Saurabh B. Tiwari,
Amol B. Thakare,
Ashutosh Vijay Jaiswal
Pages 287 - 292

View PDF
Abstract
Background: Cardiac output assessment plays a crucial role in managing patients undergoing CABG (Coronary Artery Bypass Grafting). Reliable measurement is essential for optimizing hemodynamic stability. This study compares the less invasive FloTrac method with the invasive PAC-CCO (Pulmonary Artery Catheter Continuous Cardiac Output) method in patients undergoing off-pump CABG. Methods: A prospective observational study was conducted in the cardiac surgery unit of a tertiary care hospital. Thirty-three patients undergoing elective off-pump CABG over a period of one year were included in this study. The less invasive cardiac output was measured using FloTrac attached to a dedicated left femoral line, while the invasive cardiac output was measured using a 7.5 Fr Swan-Ganz catheter inserted through the right internal jugular vein. Both measurements were recorded simultaneously at 10-minute intervals. Results: A total of 3,620 data points were analyzed. Among these, 66 data sets showed identical readings between the two methods. FloTrac provided lower estimates in 586 cases, while it overestimated cardiac output in 2,968 cases. The less invasive FloTrac method demonstrated a statistically moderate correlation with the invasive PAC-CCO method, with a tendency toward higher readings. Conclusion: Cardiac output assessed with the FloTrac method showed both underestimation and overestimation when compared to the PAC-CCO method, with a higher likelihood of overestimation. While FloTrac provides a less invasive alternative, its moderate correlation with PAC-CCO suggests that clinical judgment is essential when interpreting its values in off-pump CABG patients.
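Agreement between two cardiac-output methods is conventionally summarised with Bland-Altman bias and 95% limits of agreement rather than correlation alone. A minimal numpy sketch on synthetic paired readings follows; the arrays are invented to mimic a FloTrac-like upward bias and are not the study's 3,620 data points.

```python
import numpy as np

rng = np.random.default_rng(0)
pac = rng.normal(4.5, 0.8, 200)            # synthetic PAC-CCO readings (L/min)
flotrac = pac + rng.normal(0.4, 0.5, 200)  # FloTrac tending to read higher

diff = flotrac - pac
mean_pair = (flotrac + pac) / 2            # x-axis of a Bland-Altman plot
bias = diff.mean()                         # systematic offset between methods
loa = 1.96 * diff.std(ddof=1)              # half-width of 95% limits of agreement
print(f"bias = {bias:.2f} L/min, LoA = ({bias - loa:.2f}, {bias + loa:.2f})")
```

A positive bias with wide limits of agreement would reproduce the abstract's finding of a tendency toward overestimation with only moderate agreement; whether the limits are clinically acceptable is a judgment the correlation coefficient alone cannot settle.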
Research Article
Open Access
The Impact of Early Diagnosis on the Management and Prognosis of Rheumatic Heart Valve Disease
Pages 330 - 332

View PDF
Abstract
Background: Rheumatic heart valve disease (RHVD) remains a significant cause of cardiovascular morbidity, particularly in developing countries. Timely diagnosis plays a crucial role in preventing complications, optimizing treatment strategies, and improving patient outcomes. This study aims to evaluate the effect of early diagnosis on the management protocols and long-term prognosis of patients with RHVD. Materials and Methods: A prospective, observational study was conducted in the CVTS Department, Sanjay Gandhi Postgraduate Institute of Medical Sciences, Lucknow, over a period of 18 months. A total of 120 patients diagnosed with RHVD were categorized into two groups: Group A (early diagnosis, n=60) and Group B (delayed diagnosis, n=60). Patients were assessed for clinical outcomes, surgical interventions, hospital readmissions, and mortality. Echocardiography, ECG, and serological markers were utilized for diagnostic confirmation and monitoring. Results: In Group A, 82% of patients showed clinical improvement with medical management alone, compared to 53% in Group B. The need for valve replacement surgery was significantly lower in Group A (12%) versus Group B (35%). Hospital readmissions within one year were reduced in Group A (18%) compared to Group B (44%). The one-year survival rate was also higher in the early-diagnosed group (95%) compared to the delayed group (81%) (p<0.05). Conclusion: Early diagnosis of rheumatic heart valve disease significantly improves clinical management and prognosis. Prompt identification allows for timely initiation of medical therapy, reduces the need for surgical interventions, minimizes readmissions, and enhances survival rates.
Research Article
Open Access
Impact of COVID-19 on Oxygen Saturation and Exercise Tolerance in Young Adults: An Observational Analysis
Mudduluru Revathi,
Gunti Durga Devi
Pages 43 - 46

View PDF
Abstract
Background: Post-COVID sequelae in young adults have garnered significant attention, particularly regarding cardiopulmonary recovery. This study aims to evaluate the impact of COVID-19 on resting oxygen saturation and exercise tolerance in young adults. Methods: An observational study was conducted between March 2020 and June 2020 involving 100 young adults (aged 18–35 years) who had recovered from mild to moderate COVID-19. Baseline demographic data, resting oxygen saturation (SpO₂), and 6-minute walk test (6MWT) performance were recorded. Post-exercise desaturation (≥4% drop in SpO₂), fatigue scores, and heart rate changes were analyzed. Symptomatology was assessed via self-reported outcomes. Results: The mean age was 26.8 ± 4.9 years with 58% males. Mean BMI was 24.6 ± 3.2 kg/m². Average resting SpO₂ was 96.4% ± 1.8; 12 participants (12%) had SpO₂ < 95%. The mean 6MWT distance was 465.3 ± 54.7 meters. A ≥4% SpO₂ drop was observed in 28% of participants. These individuals exhibited lower resting SpO₂, reduced walk distance (430.6 ± 48.1 meters vs. 478.2 ± 50.3 meters, p < 0.01), and higher fatigue scores (6.3 ± 1.7 vs. 4.5 ± 1.5, p < 0.01). Persistent fatigue and exertional dyspnea were reported in 37% and 29% of participants, respectively. Conclusion: A significant proportion of young adults exhibit post-COVID impairments in oxygen saturation and exercise tolerance, even after mild to moderate infection. These findings highlight the need for post-recovery monitoring and rehabilitation strategies in this population.
Research Article
Open Access
To study clinical and arterial blood gas parameter changes in spontaneous pneumothorax before and after tube thoracostomy
G. Peter Praveen Herald,
H. Krishna Murthy
Pages 62 - 65

View PDF
Abstract
Introduction and Background: Collapse of the lung and difficulty breathing are symptoms of spontaneous pneumothorax (SP), which occurs when air enters the pleural cavity. However, little is known about the effects of tube thoracostomy on clinical variables and arterial blood gas (ABG) readings. This study compares pre- and post-tube thoracostomy vital signs, respiratory status, and arterial blood gas characteristics in SP patients. Materials and Methods: A prospective observational study of 50 patients who required tube thoracostomy for spontaneous pneumothorax was conducted from January 2018 to December 2018 at the Department of Pulmonary Medicine, Viswabharathi Medical College, Penchikalapadu, Kurnool, Andhra Pradesh, India, a tertiary care hospital. The inclusion criteria were age of at least 18 years, radiologically confirmed SP, and an indication for tube thoracostomy. Patients with significant cardiopulmonary disorders or a history of traumatic or tension pneumothorax were excluded. The following baseline clinical parameters were obtained before and 6–12 hours after the procedure: blood pressure, oxygen saturation, heart rate, respiratory rate, and ABG values. Statistical analysis was carried out with SPSS Version 22 using Wilcoxon signed-rank tests or paired t-tests as appropriate. Results: There were 50 patients in all, with a mean age of 35.6 ± 10.2 years and an M:F ratio of 4:1. Chest discomfort (75%) and dyspnea (90%) were the most frequent presenting symptoms. The mean respiratory rate before the procedure was 28.4 ± 4.5 breaths per minute; after the thoracostomy, it improved considerably to 18.2 ± 3.1 breaths per minute (p < 0.001). Heart rate decreased from 110.3 ± 12.7 bpm to 89.6 ± 10.5 bpm (p = 0.002), and oxygen saturation rose from a pre-procedure mean of 86.5 ± 5.4% to 97.2 ± 2.3% (p < 0.001). 
ABG analysis showed that PaO2 improved significantly (62.4 ± 9.1 mmHg to 85.7 ± 8.3 mmHg, p < 0.001) and PaCO2 decreased significantly (52.1 ± 7.5 mmHg to 41.8 ± 6.2 mmHg, p = 0.005), indicating improved ventilation and oxygenation. Conclusion: In patients with spontaneous pneumothorax, tube thoracostomy greatly enhances respiratory function as well as arterial blood gas values. The procedure alleviates symptoms by improving oxygenation and ventilation. Regular monitoring of ABG readings and early diagnosis of complications are key to optimal patient outcomes. Future post-intervention research should examine how patients' physiology changes over time and how often symptoms recur.
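The pre/post comparisons in this abstract are paired (each patient is their own control), which is why the authors used paired t-tests or Wilcoxon signed-rank tests. A minimal scipy sketch on illustrative respiratory-rate data follows; the values are synthetic, generated only to mirror the reported group means, not the study's raw measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pre = rng.normal(28.4, 4.5, 50)          # pre-thoracostomy respiratory rate (breaths/min)
post = pre - rng.normal(10.2, 2.0, 50)   # post values, paired per patient

t, p_t = stats.ttest_rel(pre, post)      # paired t-test on the same 50 patients
w, p_w = stats.wilcoxon(pre - post)      # non-parametric alternative on the differences
print(f"paired t: t={t:.1f}, p={p_t:.3g}; Wilcoxon p={p_w:.3g}")
```

Using an unpaired test here would discard the within-patient pairing and understate the effect; the Wilcoxon variant is the fallback when the paired differences are not plausibly normal.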
Research Article
Open Access
Comparative Analysis of Pulmonary Function in Urban and Rural Adolescents Exposed to Varying Air Quality Levels
Roopali Mittal,
Kavita Singh,
Prashant V Kariya
Pages 596 - 598

View PDF
Abstract
Background: Air pollution is a major environmental health concern, particularly affecting lung development in adolescents. Urban populations are often more exposed to higher levels of air pollutants compared to their rural counterparts. This study aims to evaluate and compare pulmonary function in adolescents residing in urban and rural regions with varying air quality indices (AQI). Materials and Methods: A cross-sectional observational study was conducted involving 120 adolescents aged 13–18 years, with 60 participants each from urban and rural areas. Participants underwent spirometry to measure Forced Vital Capacity (FVC), Forced Expiratory Volume in one second (FEV1), and FEV1/FVC ratio. AQI levels were monitored over a 3-month period in both regions. Exclusion criteria included known respiratory illnesses, smoking, and recent infections. Statistical analysis was performed using unpaired t-tests and ANOVA. Results: The mean FEV1 among urban adolescents was 2.48 ± 0.42 L, significantly lower than the rural group (2.91 ± 0.37 L, p < 0.001). Similarly, FVC was reduced in the urban group (3.12 ± 0.45 L) compared to rural participants (3.45 ± 0.39 L, p = 0.004). The mean FEV1/FVC ratio was also lower in urban subjects (79.4%) versus rural (84.3%), indicating early signs of obstructive airway changes. Average AQI in urban areas was 186 (moderate to poor), while rural areas recorded an average AQI of 72 (good). Conclusion: Adolescents living in urban areas with higher air pollution levels demonstrate significantly reduced pulmonary function compared to their rural counterparts. These findings highlight the need for improved air quality monitoring and public health interventions to protect vulnerable age groups.
Research Article
Open Access
Clinical-Hematological Profile of Patient with Acute Dengue Infection
Vijay Sagar,
Sanjay Kumar,
Asim Mishra
Pages 860 - 864

View PDF
Abstract
Background: Dengue fever, caused by the dengue virus (DENV) and transmitted by Aedes aegypti mosquitoes, is a rapidly growing public health concern, particularly in tropical regions like India. The disease exhibits a wide clinical spectrum from mild febrile illness to severe forms such as dengue hemorrhagic fever (DHF) and dengue shock syndrome (DSS). Early identification of hematological abnormalities is crucial for effective diagnosis, risk stratification, and timely intervention. This study aimed to evaluate the clinical and haematological profiles of patients with acute dengue infection and examine the correlation between laboratory parameters and disease severity. Materials and Methods: This hospital-based observational study was conducted at Anugrah Narayan Magadh Medical College, Gaya from July 2018 to March 2020. A total of 280 patients with serologically confirmed dengue (NS1 antigen and/or IgM antibody positive) were included. Demographic, clinical, and haematological data were recorded. Complete blood counts were analyzed using an automated hematology analyzer, and serial monitoring was performed in severe cases. Dengue severity was categorized as Dengue Fever (DF), DHF, or DSS based on WHO criteria. Statistical analysis was conducted using standard software, with p<0.05 considered significant. Results: The mean age of patients was 33.8 ± 14.5 years, with a predominance of young adults (18–40 years, 50%) and males (60%). Urban residents accounted for 70% of cases. Common clinical features included fever (100%), myalgia (80%), and headache (70%), with bleeding manifestations present in 30% of patients. Thrombocytopenia (<150,000/µL) and leukopenia (<4,000/µL) were observed in 82% and 58% of patients, respectively. Elevated hematocrit (>40%) was seen in 46%. Significant trends were noted across severity groups, with DSS patients showing the lowest platelet counts (mean 32,000/µL) and highest hematocrit (mean 46.0%; p<0.001). 
Serial monitoring showed platelet recovery by Day 7 in most severe cases. Patients with bleeding had significantly lower platelet counts and higher hematocrit compared to those without bleeding (p<0.001). Conclusion: Thrombocytopenia, hemoconcentration, and leukopenia are prominent haematological markers in dengue and are strongly associated with disease severity and bleeding risk. Routine monitoring of these parameters can guide early diagnosis, clinical management, and risk stratification. This study emphasizes the importance of localized data in shaping regional dengue control strategies and reinforces the value of simple haematological tests in the effective management of dengue, especially in resource-limited settings.
Research Article
Open Access
The Role of Artificial Intelligence in Modern Healthcare: Advances, Challenges, and Future Prospects
K. Akila,
R. Gopinathan,
J. Arunkumar,
B. Sree Bavai Malar
Pages 615 - 624

View PDF
Abstract
Artificial intelligence (AI) is transforming the medical industry by improving diagnostic accuracy, optimizing treatment plans, and streamlining healthcare processes. AI-powered algorithms analyze massive medical databases to diagnose diseases early, tailor treatment plans, and aid in clinical decision-making. AI enhances diagnostic accuracy in radiology, pathology, dermatology, and ophthalmology by analyzing images using deep learning algorithms. AI-driven treatment planning in oncology, cardiology, and neurology allows for precision medicine by predicting disease progression and optimizing drug selection. Furthermore, AI improves healthcare operations through robotic-assisted surgeries, AI-powered virtual assistants, and electronic health record (EHR) automation, which improves patient management while reducing clinician labour. Despite these advantages, issues such as data privacy, algorithmic bias, model transparency, and system integration must be resolved. Future AI developments in precision medicine, robotic nursing, wearable health monitoring, and federated learning will significantly improve patient care. With ethical principles and regulatory frameworks in place to ensure safer, more efficient, and tailored healthcare solutions, AI has the potential to transform modern medicine.
Research Article
Open Access
Assessment of Serum Magnesium and Lipid Profile Alterations in Hypertensive Disorders of Pregnancy
Swarna Sudha Pullemalla,
Murali Mohan. P
Pages 1405 - 1408

View PDF
Abstract
Background: Hypertensive disorders during pregnancy, including gestational hypertension and preeclampsia, are significant contributors to maternal and fetal morbidity and mortality. Emerging evidence suggests that alterations in serum magnesium and lipid profiles may play a role in the pathophysiology of these conditions. Objective: To evaluate and compare serum magnesium levels and lipid profiles among normotensive pregnant women and those with hypertensive disorders of pregnancy (HDP). Methods: A prospective case-control study was conducted involving 100 pregnant women beyond 32 weeks of gestation. Fifty women diagnosed with HDP formed the case group, while fifty normotensive pregnant women served as controls. Fasting blood samples were analyzed for serum magnesium, total cholesterol, triglycerides, HDL-C, LDL-C, and VLDL-C. Statistical analysis was performed using SPSS version 25. Results: Women with HDP exhibited significantly lower serum magnesium levels and higher levels of total cholesterol, triglycerides, LDL-C, and VLDL-C compared to controls. HDL-C levels were notably lower in the HDP group. These findings suggest a correlation between dysregulated mineral and lipid metabolism and the development of hypertensive disorders during pregnancy. Conclusion: Monitoring serum magnesium and lipid profiles in pregnant women may aid in the early detection and management of hypertensive disorders, potentially improving maternal and fetal outcomes.
Research Article
Open Access
Prospective Evaluation of Serum Prolactin as a Biomarker for TB Severity
Daksh Sharma,
Krishna Gopal Singh,
Shilpi Raikwar
Pages 667 - 671

View PDF
Abstract
Background: Tuberculosis (TB) continues to be a major health challenge in India. The identification of biomarkers that reflect disease severity can assist in patient management and treatment response monitoring. Prolactin, a pituitary hormone with immunomodulatory functions, may serve such a role. Objectives: To evaluate serum prolactin levels in TB patients, assess its correlation with disease severity, and study changes following anti-TB therapy. Methods: This was a prospective observational study conducted over 12 months at a tertiary care center in Central India. One hundred newly diagnosed TB patients were enrolled. Serum prolactin levels were measured at diagnosis, 2 months, and 6 months. Disease severity was graded based on clinical, radiological, and microbiological criteria. ROC analysis was performed to determine the diagnostic utility of prolactin for severe TB. Results: Mean serum prolactin levels were significantly higher in patients with severe TB (34.2 ± 8.1 ng/mL) compared to moderate (24.6 ± 7.0 ng/mL) and mild disease (17.5 ± 6.2 ng/mL; p < 0.001). Prolactin positively correlated with sputum AFB grade (r = 0.62), radiographic extent (r = 0.58), and symptom severity (r = 0.66). ROC analysis showed an AUC of 0.88 for detecting severe TB at a cut-off of 29.5 ng/mL. Follow-up data revealed a significant decline in prolactin levels with treatment. Conclusion: Serum prolactin is a promising biomarker of TB severity and may assist in prognosis and treatment monitoring. Further studies are needed to validate its clinical utility.
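The ROC analysis reported above (AUC 0.88 at a 29.5 ng/mL cut-off) can be reproduced in principle from prolactin values and a severe/non-severe label. Below is a self-contained numpy sketch that builds the empirical ROC curve, computes the trapezoid AUC, and picks the Youden-optimal cut-off; the synthetic values are only loosely matched to the reported group means and are not the study's data.

```python
import numpy as np

def roc_auc_youden(scores, labels):
    """Empirical ROC curve: trapezoid AUC and Youden-optimal cut-off."""
    thresholds = np.sort(np.unique(scores))[::-1]          # high -> low
    pos, neg = (labels == 1).sum(), (labels == 0).sum()
    tpr = np.array([((scores >= t) & (labels == 1)).sum() / pos for t in thresholds])
    fpr = np.array([((scores >= t) & (labels == 0)).sum() / neg for t in thresholds])
    tpr, fpr = np.concatenate([[0.0], tpr]), np.concatenate([[0.0], fpr])
    auc = float(np.sum((fpr[1:] - fpr[:-1]) * (tpr[1:] + tpr[:-1]) / 2))  # trapezoid rule
    cut = thresholds[(tpr[1:] - fpr[1:]).argmax()]         # maximizes Youden's J = TPR - FPR
    return auc, cut

rng = np.random.default_rng(2)
severe = rng.normal(34.2, 8.1, 40)       # prolactin, severe TB (ng/mL), per reported mean
nonsevere = rng.normal(21.0, 7.0, 60)    # pooled mild/moderate group (assumed spread)
scores = np.concatenate([severe, nonsevere])
labels = np.concatenate([np.ones(40), np.zeros(60)]).astype(int)
auc, cut = roc_auc_youden(scores, labels)
print(f"AUC = {auc:.2f}, Youden cut-off = {cut:.1f} ng/mL")
```

The Youden index is only one way to choose a cut-off; a clinical application weighting sensitivity over specificity (or vice versa) could justify a different threshold on the same curve.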
Case Report
Open Access
Unexpected hemorrhage: A young female’s stroke unveils Fibromuscular Dysplasia
Manmadha Rao K,
Palash Shah,
Matta Sashi Kiran,
Satish Kumar Ampolu,
Sachin Chavan
Pages 672 - 677

View PDF
Abstract
Background: This case report examines the diagnostic challenges and management of a young female patient who presented with hemorrhagic stroke, later determined to be due to renal artery stenosis likely caused by fibromuscular dysplasia (FMD). FMD is a hyperplastic arterial disorder primarily affecting medium-sized and small arteries, though larger arteries can also be involved. It is more common in young females and typically targets the renal and carotid/vertebral arteries but can also affect other arteries. While FMD usually presents as renovascular hypertension, it can also manifest as a stroke in young adults. Case Description: A 27-year-old female presented with an acute hemorrhagic stroke. Upon admission to Divisional Railway Hospital, Kharagpur, South Eastern Railway, an extensive diagnostic evaluation was conducted. Imaging studies confirmed the presence of hemorrhagic stroke. Given her young age and atypical presentation, further investigations were necessary. Angiographic studies revealed abnormalities consistent with fibromuscular dysplasia, characterized by the "string of beads" appearance in the renal arteries. The renal artery stenosis, likely due to FMD, was a significant factor contributing to her stroke. Her management involved acute stroke treatment, long-term blood pressure control, and regular monitoring. Conclusion: This case highlights the importance of considering FMD in the differential diagnosis of young stroke patients, especially in the absence of traditional risk factors. Early recognition and timely treatment of FMD can significantly improve the patient's quality of life and ensure a favorable long-term prognosis. The case illustrates the need for thorough investigation and a high index of clinical suspicion in diagnosing and managing such conditions.
Research Article
Open Access
A Prospective Observational Study on Adverse Drug Reactions in Elderly Patients Receiving Polypharmacy in a Tertiary Care Hospital
Uma Maheswari Nagireddy,
Palaparthi Srinivas,
K. Vishnuvardhan Babu
Pages 1807 - 1812

View PDF
Abstract
Background: Elderly patients are particularly vulnerable to adverse drug reactions (ADRs) due to age-related physiological changes and polypharmacy. This study aimed to assess the incidence, pattern, severity, causality, and preventability of ADRs among elderly patients receiving polypharmacy in a tertiary care hospital. Methods: A prospective observational study was conducted over six months among 100 elderly patients (aged ≥60 years) on ≥5 medications. Patients were followed throughout their hospital stay for the development of ADRs. Data were recorded using standardized formats, and causality was assessed using WHO-UMC criteria, severity using the Modified Hartwig and Siegel Scale, and preventability using Schumock and Thornton criteria. Results: Out of 100 patients, 42% experienced at least one ADR, with a total of 58 ADRs recorded. Gastrointestinal (31.0%) and central nervous system (22.4%) manifestations were most common. The major drug classes implicated included antihypertensives (24.1%), NSAIDs (19.0%), and antidiabetics (17.2%). Most ADRs were moderate (50.0%) in severity, and causality assessment classified them as probable (46.6%), possible (43.1%), or certain (10.3%). Preventability analysis indicated that 19.0% of ADRs were definitely preventable and 36.2% were probably preventable. Patients on ≥10 medications (28%) had a higher incidence of ADRs, with an average of 7.8 ± 2.1 drugs per patient. Conclusion: ADRs are common among elderly patients receiving polypharmacy, with a significant proportion being preventable. Regular medication reviews, deprescribing, and vigilant monitoring are essential strategies to enhance drug safety in this population.
Research Article
Open Access
Evaluation of Muscle Fatigue Using Surface Electromyography during Isometric Contractions in Athletes and Non-Athletes
Shyam Prasad Parimala,
Pranoti P Shinde,
Sumalatha Naitham
Pages 682 - 685

View PDF
Abstract
Background: Muscle fatigue is a critical parameter influencing athletic performance and daily functionality. Surface electromyography (sEMG) is a non-invasive technique that helps quantify muscle fatigue by monitoring electrical activity during sustained contractions. This study aimed to evaluate and compare muscle fatigue patterns during isometric contractions in athletes and non-athletes using sEMG. Materials and Methods: A total of 40 participants were recruited, comprising 20 athletes and 20 non-athletes aged 18–30 years. sEMG recordings were obtained from the biceps brachii during a sustained isometric contraction at 60% of the participant’s maximum voluntary contraction (MVC) for 60 seconds. Parameters analyzed included median frequency (MF) shift and root mean square (RMS) amplitude. The rate of decline in MF and increase in RMS were used as indicators of fatigue. Results: Athletes demonstrated a slower rate of MF decline (−0.45 Hz/sec) compared to non-athletes (−0.89 Hz/sec), indicating better fatigue resistance. RMS amplitude increased by 18.4% in athletes and 31.7% in non-athletes over the 60-second contraction period. Statistical analysis revealed significant differences between groups in both MF decline (p=0.002) and RMS increase (p=0.015). Conclusion: Athletes exhibited superior muscular endurance during isometric contractions, reflected by a more gradual MF reduction and lower RMS increment. These findings suggest that sEMG can effectively differentiate fatigue resistance levels in trained and untrained individuals, making it a useful tool in sports science and rehabilitation monitoring.
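Median frequency and RMS amplitude, the two fatigue indices used in the sEMG study above, are straightforward to compute from a raw sEMG epoch: RMS from the time-domain signal, and median frequency as the point that splits the power spectrum in half. A minimal scipy sketch on a synthetic band-limited signal follows; the 1000 Hz sampling rate and the signal itself are illustrative assumptions, not the study's recordings.

```python
import numpy as np
from scipy.signal import welch

fs = 1000  # Hz, a typical sEMG sampling rate (assumed)
rng = np.random.default_rng(3)
# Synthetic "EMG": sum of sinusoids in the 50-150 Hz band plus low-level noise
t = np.arange(fs) / fs
emg = sum(np.sin(2 * np.pi * f0 * t + rng.uniform(0, 2 * np.pi))
          for f0 in rng.uniform(50, 150, 40)) + 0.1 * rng.normal(size=fs)

rms = np.sqrt(np.mean(emg ** 2))           # amplitude index (rises with fatigue)
f, psd = welch(emg, fs=fs, nperseg=256)    # power spectral density estimate
cum = np.cumsum(psd)
mf = f[np.searchsorted(cum, cum[-1] / 2)]  # median frequency: splits total power in half
print(f"RMS = {rms:.2f}, MF = {mf:.1f} Hz")
```

Tracking MF and RMS over successive short windows of a sustained contraction yields the MF-decline slope (Hz/sec) and RMS-increase percentage that the study compares between athletes and non-athletes.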
Research Article
Open Access
To Determine the Effects of Chronic Liver Disease on Bone Health
Alankrat Kumar Singh,
Rajendra Dhar,
Asrar Ahmed
Pages 843 - 847

View PDF
Abstract
Aim: The aim of the present study was to determine the effects of chronic liver disease on bone health. Methods: The present study was conducted in the General Medicine and Gastroenterology Department at NIMS Hospital, Jaipur, for a period of 18 months, and 171 patients were included in the study. Results: The mean age of the participants was 53.79 ± 11.79 years. Of the total sample, 85 (49.7%) were female and 86 (50.3%) were male. In terms of occupation, the largest group of participants were laborers (35, 20.5%), followed by self-employed individuals (33, 19.3%) and farmers (32, 18.7%). Regarding dietary habits, 88 (51.5%) of the participants followed a vegetarian diet, while 83 (48.5%) followed a non-vegetarian diet. In terms of alcohol consumption, 88 (51.5%) reported consuming alcohol; similarly, 97 (56.7%) were smokers. The duration of liver disease among the participants ranged from 3 to 8 years, with a median duration of 5 years. Regarding the provisional diagnosis, the most common diagnosis was non-alcoholic fatty liver disease (NAFLD). For osteoporosis, a larger proportion of the participants, 143 patients (83.6%), did not have osteoporosis, and a history of fractures was reported by a small number of participants, with 4 patients (2.3%) having a fracture history. Conclusion: This study underscores the critical importance of closely monitoring bone health in patients suffering from chronic liver disease (CLD). The findings reveal a significant prevalence of osteopenia and a noteworthy presence of osteoporosis, highlighting the detrimental impact that liver dysfunction can have on skeletal health. Key contributing factors, such as suboptimal vitamin D levels, hypocalcemia, and altered bone metabolism, were identified, emphasizing the interconnectedness of liver function and bone health.
Research Article
Open Access
Physiological Benefits of Smoking Cessation Among Employees at a Tertiary Health Care Institution, Kanchipuram District, Tamil Nadu
Subamalani,
B. Vasanthi,
Sasi Kumar,
Dharani
Pages 891 - 895

View PDF
Abstract
Background: Smoking is a well-established risk factor for various chronic diseases, including cardiovascular and respiratory conditions. While long-term cessation benefits are well-documented, this study evaluates the short-term physiological effects of smoking cessation among employees in a tertiary health care setting. Objectives: To determine the physiological benefits of smoking cessation by evaluating changes in cardiovascular and pulmonary parameters using spirometry and vital sign monitoring. Methods: A cross-sectional follow-up study was conducted among 100 consenting employees at a tertiary health care institution in Kanchipuram District, Tamil Nadu, who had a history of smoking for more than one year. Baseline assessments of body weight, heart rate, blood pressure (systolic and diastolic), and pulmonary function (FVC, FEV1, FEF) were performed using a digital spirometer and standard clinical instruments. Following one month of smoking cessation, all parameters were reassessed. Data were analyzed using descriptive statistics. Results: No significant changes were observed in body weight (73.05 ± 1.21 kg) and heart rate (74.54 ± 0.45 bpm) pre- and post-cessation. However, systolic (124.08 ± 0.45 to 122.81 ± 0.15 mmHg) and diastolic blood pressures (82.36 ± 0.24 to 81.94 ± 0.25 mmHg) showed mild reductions. Pulmonary function showed substantial improvements: FVC (3.63 ± 0.09 to 4.76 ± 0.12 L), FEV1 (2.57 ± 0.04 to 3.88 ± 0.15 L), and FEF (2.54 ± 0.24 to 3.27 ± 0.25 L/sec). Conclusion: Short-term smoking cessation significantly improves pulmonary function but has limited immediate effects on cardiovascular parameters. These findings highlight the rapid respiratory benefits of quitting smoking.
Research Article
Open Access
A Study of Lipid Profile in Pre-Dialysis Chronic Kidney Disease Patients in Tertiary Care Hospital, South Gujarat
Ajaykumar Patel,
Rudra Goyani,
Riddhi Dudhrejiya,
Vansh Varma,
Gareema Naik
Pages 912 - 917

View PDF
Abstract
Objective: This hospital-based cross-sectional study aimed to estimate the prevalence and pattern of dyslipidaemia in pre-dialysis chronic kidney disease (CKD) patients and to evaluate its association with the stages of CKD. The study sought to determine the extent of lipid abnormalities and their correlation with disease progression. Methods: The study included 50 adult pre-dialysis CKD patients admitted to a tertiary care centre between May 2022 and January 2024. Patients were enrolled using purposive sampling. CKD staging was classified according to KDIGO guidelines. Lipid profiles were assessed, including total cholesterol, LDL, HDL, and triglycerides. Statistical analysis was performed using unpaired t-tests and chi-square tests, with significance at p<0.05. Results: Of the 50 pre-dialysis CKD patients (60% male), 48% had dyslipidaemia. It was more common in males (53%) than females (40%) and in those aged >50 years (64%) than in younger age groups (p = 0.06). Most patients (76%) were in Stage 5 CKD, where abnormal lipid levels were markedly higher. Significant associations were found between advanced CKD stage and elevated total cholesterol, LDL-C, and triglycerides (p = 0.03, 0.04, and 0.04, respectively), while low HDL-C was not statistically significant (p = 0.21). These findings suggest a worsening lipid profile with CKD progression. Conclusions: The study highlights the high prevalence of dyslipidaemia in pre-dialysis CKD patients, with lipid abnormalities worsening as CKD progresses. These findings emphasize the importance of early lipid monitoring and intervention to mitigate cardiovascular risk in this population.
Research Article
Open Access
Assessment of Long-Term Post-COVID Complications in Patients with Pre-Existing Metabolic Syndrome: A Prospective Cohort Study
Atul Bhoraniya,
Mihir Patel,
Priyanka Malaviya,
Minaxi Kushwah
Pages 935 - 938

View PDF
Abstract
Background: The COVID-19 pandemic has posed unprecedented challenges, especially for individuals with underlying comorbidities. Among these, metabolic syndrome (MetS) — characterized by central obesity, dyslipidemia, hypertension, and insulin resistance — has emerged as a key determinant of adverse outcomes. This study aims to prospectively assess the long-term post-COVID complications in patients with pre-existing MetS, focusing on cardiometabolic, respiratory, and neuropsychiatric sequelae. Materials and Methods: A prospective cohort study was conducted across three tertiary healthcare centres in India. A total of 300 patients aged 30–65 years with laboratory-confirmed COVID-19 and pre-existing MetS (as per IDF criteria) were enrolled. Follow-up assessments were conducted at 3, 6, and 12 months post-recovery. Clinical outcomes including new-onset type 2 diabetes, exacerbation of hypertension, pulmonary fibrosis, persistent fatigue, and cognitive decline were evaluated using structured clinical assessments, laboratory tests, and imaging modalities. A control group of 200 COVID-19-recovered patients without MetS was also followed for comparison. Results: At the 12-month follow-up, 42.3% of patients in the MetS group reported persistent fatigue compared to 18.5% in the control group (p<0.01). New-onset type 2 diabetes was observed in 26.7% of MetS patients versus 8.0% in controls (p<0.001). Pulmonary complications such as reduced DLCO and fibrotic changes were documented in 33.1% of MetS cases and 14.5% of controls (p<0.05). Neurocognitive issues, including memory deficits and anxiety, were more prevalent in the MetS group (38.5%) than in controls (21.0%) (p=0.02). Conclusion: Individuals with pre-existing metabolic syndrome are at significantly increased risk of developing long-term post-COVID complications, including cardiometabolic dysfunction, chronic respiratory impairment, and neuropsychiatric disturbances.
These findings highlight the need for tailored post-COVID monitoring and management strategies in this high-risk population.
Research Article
Open Access
Spatiotemporal Analysis of Anemia Burden among Pregnant Women: A GIS-Based Epidemiological Study
Chaitanyakumar Mahadevbhai Aghara,
Nihar Sayariya,
Swarnim Rathod
Pages 939 - 942

View PDF
Abstract
Background: Anemia remains a major public health challenge among pregnant women, particularly in low- and middle-income countries, where it contributes significantly to maternal and fetal morbidity. Spatial and temporal mapping of anemia prevalence enables the identification of regional disparities and the targeting of interventions. This study aimed to assess the spatiotemporal burden of anemia among pregnant women using Geographic Information System (GIS) tools for improved policy formulation and resource allocation. Materials and Methods: A retrospective cross-sectional study was conducted using secondary data collected from antenatal clinics across 10 districts. Hemoglobin levels of pregnant women were categorized as per WHO guidelines. Spatial data were geo-referenced using ArcGIS 10.8. Hotspot analysis (Getis-Ord Gi*) and temporal trend evaluation were employed to identify regions with high anemia burden and observe changes over time. Statistical significance was set at p<0.05. Results: Out of 25,600 pregnant women assessed, 57.3% were found to be anemic (Hb <11 g/dL), with a higher prevalence in rural and tribal regions. The year-wise distribution showed a declining trend from 61.2% in 2018 to 52.8% in 2022. GIS-based hotspot analysis revealed consistent high-burden clusters in Districts A, D, and G, with cold spots observed in urban centres of Districts B and E (p<0.05). Seasonal peaks in anemia prevalence were noted during the monsoon months. Accessibility to healthcare services and nutritional supplementation programs showed a spatial correlation with reduced anemia burden. Conclusion: This GIS-based spatiotemporal study highlights significant geographic and temporal variations in anemia prevalence among pregnant women. The identification of persistent hotspots can guide localized interventions and strengthen antenatal care services in vulnerable regions. 
Integration of spatial tools in public health monitoring offers a robust framework for addressing maternal anemia.
Research Article
Open Access
A Prospective Comparative Study Between Stapled and Conventional Haemorrhoidectomy
Hersh Nath Agrawal,
Anuj Kumar Gupta,
Harshit Gupta,
Akash Sachan,
Alok Agarwal,
Nanu Ram Prajapati,
Vivek Bhardwaj
Pages 1063 - 1068

View PDF
Abstract
Background: Haemorrhoids, or piles, are a common condition affecting many adults, characterized by swollen vascular structures in the anal canal, leading to discomfort and bleeding. Treatment options range from conservative management to surgical interventions, with haemorrhoidectomy being a definitive surgical option for advanced cases. This study compares two surgical techniques: conventional haemorrhoidectomy and stapled haemorrhoidectomy. Objective: The objective of this study is to conduct a prospective comparative analysis of conventional and stapled haemorrhoidectomy, focusing on their effectiveness, patient acceptance, postoperative outcomes, complications, and cost-benefit analysis. Methods: This prospective observational study was conducted over 18 months at Maharaja Agrasen Hospital, New Delhi, involving 60 patients with symptomatic Grade II/III haemorrhoids. Patients were randomly assigned to either stapled or conventional haemorrhoidectomy. Data were collected through clinical examinations, interviews, and standardized assessments, with postoperative outcomes evaluated using the Visual Analogue Scale (VAS) for pain and monitoring for complications. Results: The study found that stapled haemorrhoidectomy had significantly shorter operative times (22.27 vs. 25.00 minutes), less blood loss (30.47 vs. 78.30 mL), shorter hospital stays (1.53 vs. 2.77 days), and quicker return to work (5.2 vs. 15.4 days). Pain scores were significantly lower in the stapled group at all measured intervals. Late complications, such as delayed wound healing, were also less frequent in the stapled group. Conclusion: Stapled haemorrhoidectomy is shown to be a superior option for managing advanced haemorrhoids, offering benefits such as reduced operative time, lower blood loss, faster recovery, and fewer complications.
While further long-term studies are needed to assess recurrence rates, the findings support the adoption of stapled haemorrhoidectomy in clinical practice for its efficiency and patient-centered outcomes.
Research Article
Open Access
Impact of Laparoscopic Ovarian Drilling on Ovarian Reserve, Hormonal Profile, and Fertility Outcomes in Clomiphene Citrate Resistant Women with Polycystic Ovarian Syndrome
Bongi Vivekanand,
Lingudu Brahmanandam,
Kandregula Appala Venkata Subrahmanyam
Pages 15 - 18

View PDF
Abstract
Objective: To prospectively evaluate the impact of laparoscopic ovarian drilling on ovarian reserve, hormonal changes, ovulation, and menstrual regularity in women with polycystic ovarian syndrome (PCOS) resistant to clomiphene citrate therapy. Methods: This prospective study included 48 women diagnosed with PCOS according to the Rotterdam criteria, who had previously failed treatment with clomiphene citrate. Participants underwent laparoscopic ovarian drilling using electrocautery to create multiple ovarian punctures. Hormonal profiles (Anti-Mullerian Hormone [AMH], luteinizing hormone [LH], follicle-stimulating hormone [FSH], and testosterone) were assessed at baseline and at 3 and 6 months after surgery. Ovulation was confirmed through serum progesterone levels and ultrasound follicular monitoring, while menstrual regularity was tracked over 6 months. Statistical analysis was performed using paired t-tests or Wilcoxon signed-rank tests, with p-values <0.05 considered statistically significant. Results: Post ovarian drilling, significant reductions were observed in AMH levels (16% at 3 months, 25% at 6 months), LH levels (28% at 3 months, 35% at 6 months), and testosterone levels (30% at 3 months, 33% at 6 months). FSH levels remained relatively stable throughout the follow-up. Ovulation was restored in 78% of patients, and menstrual regularity returned in 72% of participants within 6 months post ovarian drilling. Conclusion: Laparoscopic ovarian drilling significantly improves hormonal balance, ovulatory function, and menstrual regularity in clomiphene-resistant PCOS patients. However, the procedure is associated with a considerable reduction in ovarian reserve markers, particularly AMH, indicating a trade-off between immediate reproductive benefits and potential long-term fertility implications. Therefore, patient counseling regarding potential impacts on ovarian reserve is crucial when considering laparoscopic ovarian drilling as a treatment option.
Research Article
Open Access
Correlation Between Assisted Reproductive Technology and the Risk of Congenital Heart Disease
Mousumi Acharya,
Subasis Mishra
Pages 444 - 447

View PDF
Abstract
Background: Assisted Reproductive Technology (ART) has revolutionized infertility treatment, leading to an increasing number of ART-conceived births worldwide. However, concerns have emerged regarding the potential risks associated with ART, particularly the increased incidence of congenital anomalies, including congenital heart disease (CHD). CHD is one of the most common birth defects and a leading cause of neonatal morbidity and mortality. While several studies have suggested a potential link between ART and CHD, findings remain inconsistent, necessitating further investigation. Aim: This study aimed to evaluate the correlation between ART and the risk of CHD in neonates by comparing the incidence and types of CHD among ART-conceived infants and naturally conceived infants. Methods: One hundred neonates—50 ART-conceived and 50 naturally conceived—were included in a retrospective cohort analysis. Information about maternal factors, neonatal outcomes, and echocardiographic results was taken from hospital records. SPSS version 23.0 was used for the statistical analysis, with independent t-tests and chi-square tests used to analyze continuous and categorical data, respectively. Confounding variables such as birth weight, gestational age, and maternal age were accounted for using multivariate logistic regression. P-values less than 0.05 were regarded as statistically significant. Results: The incidence of CHD was significantly higher in ART-conceived infants (54%) compared to naturally conceived infants (24%) (p = 0.001). Atrial septal defects (16% vs. 8%), ventricular septal defects (20% vs. 10%), and patent ductus arteriosus (12% vs. 4%) were more frequent in the ART group. On logistic regression analysis, low birth weight (OR = 0.67, p = 0.015), advanced maternal age (OR = 1.12, p = 0.003), and ART (OR = 3.45, p = 0.001) were all independent risk factors for CHD.
Conclusion: This study showed a strong correlation between ART and a higher risk of congenital heart disease. Regardless of maternal age or birth weight, the frequency of CHD was greater in infants conceived via ART. These findings highlight the necessity of focused prenatal and postnatal cardiac assessment in ART pregnancies. Recommendations: Regular fetal echocardiographic monitoring should be considered for ART-conceived pregnancies. Understanding the underlying mechanisms and possible long-term cardiovascular hazards associated with ART will require more extensive, longitudinal research.
Research Article
Open Access
A comparative study of the anatomy of the heart in patients with hypertension and normotensive individuals
Sindhu K S,
Satyanath Reddy Kodidala
Pages 1181 - 1184

View PDF
Abstract
Background: Hypertension is a major risk factor for cardiovascular diseases and can lead to significant changes in the heart's anatomy. This study aims to compare the anatomical variations of the heart between hypertensive and normotensive individuals and to evaluate the correlation of hypertension duration and severity with cardiac structural changes. Objectives: To assess cardiac anatomical parameters, such as left ventricular wall thickness, left ventricular mass, and chamber dimensions, in hypertensive individuals. To compare these anatomical parameters between hypertensive and normotensive individuals. To evaluate the correlation of hypertension duration and severity with observed cardiac anatomical variations. Methods: A total of 200 participants, including 100 hypertensive individuals and 100 normotensive controls, were included in this comparative study. The study employed echocardiographic measurements to assess left ventricular wall thickness, left ventricular mass, and chamber dimensions. The data were analyzed using independent t-tests, chi-square tests, and Pearson correlation to evaluate the significance of differences and associations. Results: Hypertensive individuals showed significantly higher left ventricular wall thickness, left ventricular mass, and chamber dimensions compared to normotensive controls (p<0.001 for all parameters). Hypertension duration and severity correlated positively with left ventricular wall thickness (r=0.58–0.64), left ventricular mass (r=0.62–0.67), and chamber dimensions (r=0.47–0.52), indicating that longer duration and greater severity of hypertension are associated with more pronounced cardiac structural changes. Conclusion: This study confirms that hypertension leads to significant anatomical alterations in the heart, including increased left ventricular wall thickness, mass, and chamber dimensions.
The findings highlight the importance of early intervention and regular monitoring to prevent further cardiac remodeling and adverse outcomes in hypertensive patients.
Research Article
Open Access
Intraoperative Scar Condition and Fetomaternal Outcomes in Patients with Previous LSCS with Scar Tenderness
Uzma Yasmeen,
Veena B.T,
Smitha K,
Reethu Varadarajan
Pages 195 - 198

View PDF
Abstract
Background: With cesarean section (CS) rates rising globally, concerns about complications like uterine scar dehiscence (USD) have also increased. USD, where a previous cesarean scar weakens or separates, can pose serious risks to both mother and baby. This study explores how intraoperative scar conditions relate to maternal and fetal outcomes in women with a previous lower-segment cesarean section (LSCS) and scar tenderness. Aims & Objectives: To assess intraoperative uterine scar conditions (intact, thinned out, dehiscence, or rupture) in women with previous LSCS presenting with scar tenderness and to analyze the associated maternal and fetal outcomes to improve clinical management strategies. Methodology: This prospective observational study included 46 women with a history of LSCS and clinically assessed scar tenderness over 12 months. Scar tenderness was evaluated by gently palpating the suprapubic area. During surgery, scars were categorized as intact, thinned-out, dehiscent, or ruptured, and these findings were linked to maternal and fetal outcomes. Results: Most women (65.2%) were between 26 and 35 years old, and 76.08% delivered at term. Scar thinning (17.9%) and dehiscence (5.6%) were more common in those with multiple prior LSCS, though no cases of rupture occurred. Women with two prior LSCS had a higher rate of complications, including urinary issues (38.9%) and wound infections (27.8%). Among newborns, 19.5% had meconium-stained fluid, 6.5% had an APGAR score below 7, and 30.4% required immediate NICU admission. These findings suggest that repeat LSCS increases risks for both mother and baby, especially after multiple surgeries. Conclusion: Scar tenderness can serve as an important warning sign for complications in women with a history of LSCS. Most had intact scars, suggesting a trial of labor could be an option when there is no strong reason for another cesarean. 
However, thinned-out scars were linked to higher maternal and newborn risks, highlighting the need for close monitoring and timely decisions. Larger studies are needed to strengthen these findings and improve care for mothers and babies.
Research Article
Open Access
C-Reactive Protein Levels in Preterm Premature Rupture of Membranes (PPROM): Impact on Maternal and Fetal Outcomes
Pages 252 - 255

View PDF
Abstract
Background: PPROM, occurring in 2–4.5% of pregnancies, is a major contributor to preterm births and perinatal mortality, with microbial invasion increasing maternal and neonatal risks. CRP, an inflammatory biomarker, may help predict adverse outcomes in PPROM cases. Methodology: A prospective observational study of 78 PPROM cases analyzed CRP levels and their association with maternal and neonatal outcomes, categorizing participants into CRP-positive and CRP-negative groups. Results: Elevated CRP levels correlated with higher maternal complications (sepsis, UTI, atonic PPH), increased neonatal morbidity (lower APGAR scores, perinatal depression), and longer NICU stays, with more C-sections and labor inductions in the CRP-positive group. Conclusion: While CRP is a useful inflammatory marker, its predictive value for chorioamnionitis remains uncertain, and routine serial monitoring may not significantly alter clinical management. Further research is required to refine its role in PPROM care.
Research Article
Open Access
Clinical Study of Incidence of Hypoglycemia in Breastfed Late Preterm Neonate
Nithin S Shagale,
Vinodkumar M K
Pages 792 - 798

View PDF
Abstract
Background: Hypoglycemia is the most common metabolic abnormality in infancy and childhood. When prolonged or recurrent, it is a potent cause of irreversible brain damage leading to cognitive impairment, recurrent seizure activity, cerebral palsy, and autonomic dysregulation. Late preterms are at higher risk for a number of problems including poor feeding, hypoglycemia, hypocalcemia, jaundice, infections, respiratory distress, failure to thrive, and hospital readmission. This study therefore evaluated the incidence of hypoglycemia in late preterm, appropriate-for-gestational-age babies who were on breast feeding. Material and methods: This hospital-based observational study was conducted in the Department of Pediatrics, Subbaiah Institute of Medical Sciences, Shimoga, Karnataka. A total of 120 consecutive late preterm babies of appropriate weight for gestational age were monitored for glucose levels. Babies not meeting the inclusion and exclusion criteria were not considered for the study. When hypoglycemia was noted, the glucose level was assessed and managed according to the standard AIMS NICU protocols, and hypoglycemia was confirmed by laboratory diagnosis. Results: A total of 120 late preterm babies were assessed, of which 61 were female and 59 were male. In our study, the overall incidence of hypoglycemia was 15.83%. The majority of hypoglycemic episodes occurred on the first day (84.21%) and the second day (15.78%), with no episodes on the third day of life. Of the 19 hypoglycemic babies, 8 (42.1%) were symptomatic and 11 (57.89%) were asymptomatic. Hypoglycemia was slightly more common in male babies. Of the babies born to 82 multiparous mothers, 9 developed hypoglycemia, and of the babies born to 38 primiparous mothers, 10 developed hypoglycemia. Considering the mode of delivery, 8 of 53 babies born by the normal vaginal route and 11 of 67 babies born by caesarean section had hypoglycemia.
Conclusions: The incidence of asymptomatic hypoglycemia is much higher than that of symptomatic hypoglycemia. The highest incidence of hypoglycemia (84.2%) was noted in the first twenty-four hours of life, with 15.8% in the next twenty-four hours. Hence, blood glucose should be monitored regularly in postnatal wards, even in healthy late preterms, during the first 2 days of life.
Research Article
Open Access
Status of Contralateral Ear in Unilateral Chronic Otitis Media
Preeti Kumari,
Shalini Jadia,
Sadat Qureshi,
Sandeep Sharma
Pages 300 - 308

View PDF
Abstract
Background: Chronic Otitis Media (COM) is a persistent inflammatory condition of the middle ear and mastoid cavity that significantly impacts patients' quality of life. While clinical attention is often focused on the affected ear, the contralateral ear (CLE), defined as the asymptomatic or less affected ear, also plays a crucial role. The objective of this study was to assess the status of the CLE in cases of COM and evaluate its clinical implications. Methodology: This cross-sectional observational study was conducted after obtaining approval from the institutional ethical committee. A total of 120 individuals diagnosed with unilateral COM were included. Patients with an intact tympanic membrane (TM) in the CLE from all age groups were enrolled. Exclusion criteria included prior ear surgery, head/ear trauma, or refusal to participate. Results: Among the 120 patients, 69 (57.50%) were male and 51 (42.50%) were female; the mean age was 37.48 ± 12.13 years. The primary symptoms were ear discharge and hearing impairment. Otoscopic examination revealed large central perforation (LCP) in 34.2% of cases and medium central perforation (MCP) in 25%. Posterior superior quadrant (PSQ) retraction with attic retraction was observed in 6.70% of cases. The CLE showed Grade 1 TM retraction in 25% of cases and tympanosclerotic patches in 15.00%, followed by Grade 2 TM retraction in 12.5%. Pure tone audiometry (PTA) revealed that 117 patients (97.5%) had conductive hearing loss in the diseased ear, with an average hearing loss of 43.65 ± 14.16 dB. In the CLE, 84 patients (70%) had normal hearing, while 28.3% had mild hearing loss. Conclusion: This study highlights the significant impact of chronic otitis media (COM) on both the diseased and contralateral ears; evaluation of both ears is essential for accurate diagnosis, disease monitoring, and timely therapeutic intervention.
Regular assessments help determine the progression and potential impact of COM on the contralateral ear, allowing for early management and better patient outcomes. Proper patient education and continuous monitoring are crucial for effective treatment planning and prevention of further deterioration.
Research Article
Open Access
Antibiotic Resistance Patterns of Outpatient Paediatric Urinary Tract Infections
Pages 53 - 56

View PDF
Abstract
Background: Urinary tract infections (UTIs) are among the most common bacterial infections in pediatric patients. The emergence of antibiotic-resistant uropathogens has complicated empirical treatment strategies, necessitating continuous surveillance of resistance patterns. Objectives: To evaluate the distribution of uropathogens and their antibiotic resistance profiles in pediatric UTIs over a 12-month period. Methods: A longitudinal observational study was conducted over one year in a tertiary care hospital. A total of 100 pediatric patients (aged 1 month to 12 years) clinically diagnosed with UTI were enrolled. Midstream urine samples were collected and processed for culture and sensitivity. The isolated organisms were identified, and antibiotic susceptibility testing was performed using the Kirby-Bauer disk diffusion method according to CLSI guidelines. Multidrug resistance (MDR) was defined as resistance to three or more antibiotic classes. Results: Escherichia coli was the most common isolate (66%), followed by Klebsiella pneumoniae (14%), Proteus mirabilis (8%), Enterococcus faecalis (6%), and Pseudomonas aeruginosa (4%). High resistance rates were observed for ampicillin (84.8% in E. coli, 92.9% in Klebsiella), cotrimoxazole, and ciprofloxacin. Nitrofurantoin and imipenem retained better sensitivity. Overall, 38% of isolates exhibited multidrug resistance. A rising trend in resistance to ampicillin and cotrimoxazole was noted over the study period. Conclusion: This study highlights the alarming prevalence of antimicrobial resistance in pediatric UTIs, particularly among Gram-negative organisms. Regular monitoring of resistance patterns is essential to guide empirical therapy and limit the spread of MDR pathogens.
Research Article
Open Access
Study of Functional Echocardiography in Neonates with Septic Shock
Nilesh Sadhwani,
L. S. Deshmukh,
Atul Londhe,
Amol Joshi
Pages 346 - 352

View PDF
Abstract
Background: Sepsis is a major cause of morbidity and death in the neonatal period. The diagnosis and management of shock in the newborn present many challenges to neonatologists. Functional echocardiography is rational, noninvasive, readily available, performed at the bedside, and provides information in real time, making it an ideal tool to evaluate hemodynamics and to acquire physiological and anatomical information in critically ill patients. This study was conducted with the aim of assessing the correlation of functional echocardiography with clinical parameters in neonates with septic shock. Material and Methods: This was a single-centre, prospective, cross-sectional study conducted among 140 neonates in the Department of Neonatology (NICU), GMC Aurangabad. The study population included all newborns in the NICU, GMCH, Aurangabad, who developed septic shock, as diagnosed by signs and symptoms of sepsis and confirmed by clinical and laboratory parameters. Neonates with diagnosed or suspected congenital and/or cardiac malformation, those diagnosed with shock of an etiology other than septic shock, and those already on inotropic support were excluded from the study. Results: The gestational age of the neonates ranged from 27 to 41 weeks, with a mean of 33.08 ± 3.17 weeks. The most frequent blood culture finding was K. pneumoniae (24.29%). Total leukocyte count was lower than normal in 70% of neonates. Pearson's correlation test was used to assess the correlation of the hemodynamic parameters with cardiac function at the time of diagnosis, after resuscitation with inotropes at 2 hours, and after stabilization. Conclusion: Functional echocardiography aids in the clinical evaluation of neonatal shock and in monitoring the effectiveness of treatment.
Case Report
Open Access
Cesarean Delivery in a Pregnant Patient with Congenital Complete Heart Block: Anaesthetic Challenges with Review of Literature—a Case Report
Dr Sukriti Atram,
Dr Jenin Arul Michael,
Dr Shreyash Gosavi,
Dr Archita Singh
Pages 383 - 391

View PDF
Abstract
Background: Anaesthesiologists and obstetricians face specific challenges when managing pregnant patients with congenital complete heart block (CCHB) who require cesarean delivery; pregnancy-induced physiological changes demand precise planning to ensure maternal and fetal hemodynamic stability and a good outcome. This case involves a 20-year-old primigravida, weighing 52 kg and measuring 141 cm in height, pregnant at 38 weeks and 2 days, with a stable fixed heart rate between 48 and 53 bpm due to congenital complete heart block. The patient had previously undergone pacemaker implantation, but the device was removed after an infection. She underwent an emergency LSCS under spinal anaesthesia, during which the anaesthesiologists anticipated and effectively treated bradycardia and hypotension through close monitoring, pre-emptive transcutaneous pacing support, targeted fluid therapy, and vasopressor use. A healthy 2.6 kg female baby with good Apgar scores was delivered. The patient demonstrated few sustained episodes of bradycardia during the perioperative period, which were successfully managed thanks to effective multidisciplinary preoperative planning and constant monitoring during surgery and after delivery, underscoring the care required to ensure the wellbeing of patients with congenital cardiac conduction disorders.
Research Article
Open Access
Telehealth versus In-Person Care for Diabetes and Hypertension Co-management: A Randomized Controlled Trial
Akshay Jayantibhai Prajapati,
Keval Rajendrakumar Acharya,
Anantraj M Dixit,
Jaykumar Ganpatbhai Sahani
Pages 487 - 490

View PDF
Abstract
Background: The dual burden of type 2 diabetes mellitus (T2DM) and hypertension is a major contributor to cardiovascular morbidity and mortality. Integrated care models are essential for effective management. With the growing adoption of digital health technologies, telehealth has emerged as a potential alternative to conventional care. This study aimed to evaluate the clinical effectiveness of telehealth versus in-person care in the co-management of T2DM and hypertension. Materials and Methods: A total of 120 patients diagnosed with both T2DM and hypertension were randomly assigned to two groups: the Telehealth Group (n=60) and the In-Person Care Group (n=60). Inclusion criteria were age between 30 and 65 years, HbA1c ≥ 7%, and systolic BP ≥ 140 mmHg at baseline. The telehealth group received virtual consultations via a dedicated platform every 2 weeks, with remote monitoring of blood glucose and BP. The in-person group attended physical consultations at similar intervals. Primary outcomes were change in HbA1c and systolic blood pressure at 6 months. Secondary outcomes included medication adherence, patient satisfaction, and frequency of emergency visits. Results: At the end of 6 months, the telehealth group showed a mean reduction in HbA1c from 8.5% ± 1.1 to 7.2% ± 0.9 (p < 0.001), while the in-person group improved from 8.4% ± 1.0 to 7.5% ± 0.8 (p < 0.01). The reduction in systolic BP was also significant in both groups: from 148.2 ± 7.5 mmHg to 132.6 ± 6.3 mmHg in the telehealth group (p < 0.001), and from 147.9 ± 8.1 mmHg to 135.4 ± 7.1 mmHg in the in-person group (p < 0.01). Medication adherence was slightly higher in the telehealth group (92% vs. 87%, p = 0.04), and patient satisfaction scores were also greater (mean 4.5 vs. 3.9 on a 5-point Likert scale). No significant difference was observed in the number of emergency visits between the groups.
Conclusion: Telehealth is a feasible and effective modality for the co-management of diabetes and hypertension, showing comparable or slightly superior outcomes in glycemic and blood pressure control compared to traditional in-person care. Improved adherence and satisfaction highlight the potential of remote monitoring in chronic disease management, particularly in resource-limited or rural settings.
Research Article
Open Access
The Efficacy of Wearable Cardiovascular Monitoring Devices in Real-Time Arrhythmia Detection: Systematic Review
Saim Ali Khan,
Pallavi Sharma,
Rajender Singh,
Mohammed Majid Hussain,
Rahul Tiwari,
Heena Dixit
Pages 491 - 499

View PDF
Abstract
Background: Wearable cardiovascular monitoring devices have emerged as promising tools for real-time arrhythmia detection and patient-managed care. Their diagnostic value, usability, and impact on clinical outcomes remain areas of active investigation. Objective: To systematically evaluate the diagnostic accuracy, clinical utility, and user acceptability of wearable devices in detecting arrhythmias, particularly atrial fibrillation (AF). Data Sources: A systematic search was conducted in PubMed (2018–2025) using terms related to “wearables,” “arrhythmia,” and “cardiac monitoring.” Filters applied included free full-text availability and original human studies. Study Selection: Studies were included if they assessed wearable, non-invasive devices (e.g., smartwatches, ECG patches) for arrhythmia detection and reported diagnostic performance or clinical outcomes. Data Extraction and Synthesis: Twelve studies were included. Data on study design, population, device type, diagnostic accuracy, intervention changes, and usability were extracted and narratively synthesized. Main Outcomes and Measures: Primary outcomes were AF detection rate, sensitivity, specificity, and clinical intervention changes. Results: Wearables demonstrated sensitivity ranging from 84% to 95% and specificity up to 93%. Intervention changes occurred in up to 35% of cases. High patient satisfaction and adherence were reported. Conclusions and Relevance: Wearable cardiac monitors provide accurate, patient-friendly arrhythmia detection and support timely clinical intervention, reinforcing their role in modern cardiovascular care.
Research Article
Open Access
Effect of Pre-Pregnancy Body Mass Index on Mode of Delivery: A Comprehensive Observational Study
Pages 588 - 594

View PDF
Abstract
Background: Pre-pregnancy body mass index (BMI) is a crucial determinant of maternal and neonatal health, significantly influencing the mode of delivery, maternal complications, and neonatal outcomes. With the increasing prevalence of maternal obesity and undernutrition, obstetricians face challenges in managing pregnancy-related risks. Obesity has been linked to gestational diabetes mellitus (GDM) [1], hypertensive disorders, macrosomia, prolonged labor [4], and an increased likelihood of cesarean delivery, while underweight mothers are more prone to intrauterine growth restriction (IUGR) [6], low birth weight (LBW), and neonatal intensive care unit (NICU) admissions [8]. Understanding the relationship between BMI and delivery outcomes is essential for improving antenatal care, risk stratification, and maternal-fetal health management. Materials and Methods: This study was conducted as a prospective observational study at the Department of Obstetrics and Gynecology, Kempegowda Institute of Medical Sciences, Bangalore, from August 1, 2024, to October 31, 2024. A total of 40 term pregnant women were categorized into four BMI groups based on the WHO classification: underweight (<18.5 kg/m²), normal (18.5–24.9 kg/m²), overweight (25–29.9 kg/m²), and obese (≥30 kg/m²). Data collection included patient demographics, obstetric history, mode of delivery, maternal complications, and neonatal outcomes. Statistical analysis was performed using SPSS v23, with chi-square tests, logistic regression, and Pearson’s correlation coefficient applied to evaluate associations between BMI and pregnancy outcomes. A p-value <0.05 was considered statistically significant. Results: Cesarean section rates increased with maternal BMI: 100% of obese women underwent cesarean delivery, compared to 62.5% of overweight women, 25% of normal-BMI women, and 16.7% of underweight women.
Vaginal delivery was most frequent in normal-BMI (75%) and underweight (83.3%) women, whereas obese women had the highest incidence of labor complications, including prolonged labor (50%) and gestational diabetes (50%). Hypertensive disorders were significantly more common in overweight (37.5%) and obese (50%) women, indicating an increased risk of metabolic and vascular dysfunction in these groups. Neonatal outcomes were also significantly affected by maternal BMI. Low birth weight (50%) was most common in underweight mothers, suggesting nutritional insufficiency and placental insufficiency [9]. Conversely, macrosomia (25%) was prevalent in obese women, aligning with higher rates of gestational diabetes and excessive fetal growth [10]. NICU admissions were highest among neonates of underweight (3.3%) and obese (50%) mothers, emphasizing the importance of BMI regulation before pregnancy to minimize neonatal morbidity. Statistical analysis confirmed that BMI was positively correlated with cesarean section rates (p < 0.001, OR = 4.2), while underweight mothers had a significantly higher risk of delivering low-birth-weight neonates (p < 0.001). Additionally, gestational diabetes was strongly associated with obesity (p < 0.001), reinforcing the need for early glucose screening in overweight pregnancies. Conclusion: This study demonstrates that both underweight and obese women face increased pregnancy-related risks, emphasizing the importance of achieving an optimal BMI before conception. Obese women are at a significantly higher risk of cesarean delivery, gestational diabetes, and hypertensive disorders, while underweight women are more likely to deliver low-birth-weight infants and experience increased NICU admissions. These findings highlight the need for preconception weight management programs, targeted antenatal monitoring, and early interventions for high-risk pregnancies.
Future research should explore larger-scale studies to evaluate long-term neonatal outcomes and assess the effectiveness of maternal weight optimization programs in reducing pregnancy-related complications.
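The WHO BMI cut-offs used in the study above (<18.5, 18.5–24.9, 25–29.9, ≥30 kg/m²) map directly onto a small classification helper. The Python sketch below is illustrative only; the function name is our own, not part of the study.

```python
def bmi_category(bmi: float) -> str:
    """Classify pre-pregnancy BMI (kg/m^2) using the WHO cut-offs cited in the study."""
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal"
    if bmi < 30.0:
        return "overweight"
    return "obese"

# Example: a BMI of 31.2 kg/m^2 falls in the obese group.
print(bmi_category(31.2))  # obese
```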
Research Article
Open Access
Study of Inflammatory Markers - CRP, D-dimer, and Ferritin in COVID-19 Positive patients - A Retrospective Study
Mahesh Kumar C.H,
Shiv Kumar Chabba,
Shivakumarswamy Udasimath,
Ravishankar G,
Sushma MKM,
Nagaraj V Gadwal
Pages 699 - 703

View PDF
Abstract
Background: The COVID-19 pandemic, caused by SARS-CoV-2, has been associated with a wide range of clinical presentations, ranging from asymptomatic cases to severe respiratory failure. Inflammatory biomarkers such as C-reactive protein (CRP), D-dimer, and ferritin have been recognized as important indicators of disease severity and prognosis. This study aimed to evaluate the levels of these biomarkers in COVID-19-positive patients and correlate them with demographic parameters and clinical outcomes. Methods: A retrospective observational study was conducted in the Department of Pathology, Central Laboratory, RGSSH, OPEC, RIMS, Raichur, from June 2021 to May 2022. A total of 400 COVID-19-positive patients were included. Data on CRP, D-dimer, and ferritin levels were collected and analyzed with respect to gender, age group, and clinical outcome (ICU vs. ward admission). Results: Of the 400 patients, 58.3% were male and 41.7% were female. The majority belonged to the age group of 41–60 years. Significantly higher levels of CRP (p = 0.02) and D-dimer (p < 0.001) were observed in ICU patients compared to ward patients, while the difference in ferritin levels was not statistically significant (p = 0.142). There was no significant association of biomarker levels with gender. However, D-dimer levels showed a significant correlation with age (p = 0.004), with the highest levels in patients above 80 years. Conclusion: Elevated CRP and D-dimer levels are significantly associated with severe COVID-19 infection and ICU admission. These biomarkers may serve as valuable tools for the early identification of high-risk patients, aiding in timely clinical decision-making. Regular monitoring of these markers is recommended to improve patient outcomes.
Research Article
Open Access
The Clinico-Microbiological Spectrum of Urosepsis in CKD patients: A Hospital-Based Study
Subhashree Mohapatra,
Naveen Kumar Medi,
Sudipti Sahu,
Satyaram Satapathy,
Nirupama Chayani,
Nikunja Kumar Das
Pages 729 - 733

View PDF
Abstract
Introduction: The prevalence of chronic kidney disease (CKD) is rising globally because of numerous contributing factors, such as lower urinary tract obstruction, urinary stones, co-morbidities, and sepsis. Monitoring is required for several indicators, including blood pressure, blood sugar, and renal function tests. The high mortality rate of urosepsis in CKD necessitates the early identification of the sepsis-causing organisms and the determination of antibiotic sensitivity to identify resistant species. The present study was conducted with this in mind. Aim and Objectives: To study the microbiological spectrum, antimicrobial resistance pattern, and treatment involved in urosepsis in CKD patients. Materials and Methods: A total of 100 CKD patients were included in the study; a detailed history was obtained, and clinical examination was done. Blood and urine samples were collected and sent to the microbiology laboratory for further processing. After culture and sensitivity testing, empirical treatment was changed accordingly, and results were observed. Results: The most frequently isolated organism in blood and urine cultures was E. coli. The organism most frequently associated with death was Candida. The drugs to which patients most commonly responded were cefoperazone-sulbactam and meropenem. Conclusion: Microbiological tests such as blood and urine cultures are crucial for the early detection of urosepsis in patients with chronic kidney disease and the precise delivery of antibiotics.
Research Article
Open Access
Physiological Parameters in the Diagnosis and Management of Ocular Disorders
Katta Sreenivas Reddy,
P. Jayanth Kumar,
Penjarla H Priyamvada
Pages 748 - 752

View PDF
Abstract
Background: Ocular physiology plays a central role in the detection and progression of many eye diseases. Physiological parameters such as intraocular pressure (IOP), tear film break-up time (TBUT), central corneal thickness (CCT), ocular blood flow (OBF), and pupillary reflex responses offer objective metrics essential for accurate diagnosis and patient monitoring. This study aims to assess the clinical utility of these parameters in diagnosing glaucoma, dry eye syndrome, and optic neuritis. Materials and Methods: A hospital-based, cross-sectional study was conducted on 200 subjects categorized into four groups: glaucoma, dry eye syndrome, optic neuritis, and healthy controls (n=50 each). Each participant underwent detailed ophthalmic evaluation, including IOP measurement, TBUT testing, pachymetry, ocular blood flow assessment via color Doppler imaging, and pupillary reflex testing. Statistical analysis was performed using SPSS v26.0 with ANOVA and Pearson’s correlation tests. Results: Significant intergroup differences were observed. Glaucoma patients exhibited the highest mean IOP (23.32 mmHg) and lowest OBF (9.06 cm/s). Dry eye patients showed markedly reduced TBUT (5.98 seconds). Central corneal thickness was thinnest in glaucoma (519.03 µm), while optic neuritis patients had the most prolonged pupillary reflex times (351.52 ms). Control subjects had normal physiological ranges across all parameters. Conclusion: Physiological parameters are vital tools in diagnosing and managing ocular disorders. Integrating these objective measures into routine clinical assessments can enhance early detection, guide treatment, and improve patient outcomes.
Research Article
Open Access
A Prospective Observational Study of Autonomic Dysfunction in Cirrhosis of Liver and Its Correlation with Electrocardiography and Echocardiography
Dr. Mudireddy Bindu Bhavani,
Dr. R. M. Honnutagi
Pages 827 - 832

View PDF
Abstract
Background: Cirrhosis is a chronic liver condition characterized by hepatic fibrosis, anatomical distortion, and compromised liver function. Autonomic dysfunction (AD) is a significant concern due to its impact on cardiovascular stability, hemodynamic modulation, and patient prognosis. AD is characterized by irregularities in heart rate variability, impaired blood pressure regulation, and abnormal reflex reactions, which can increase the risk of cardiac events. Cirrhotic cardiomyopathy, characterized by compromised ventricular contractility and electromechanical dysfunction, is linked to autonomic abnormalities. ECG and echocardiography (ECHO) are vital tools for assessing heart function in cirrhosis patients, revealing anatomical and functional cardiac alterations. Objective: This study aims to evaluate autonomic dysfunction in individuals with liver cirrhosis, its impact on ECG abnormalities, heart rate variability, blood pressure regulation, and cardiovascular reflexes, and its distribution across Child-Pugh and MELD score groups. It also seeks to identify potential predictors of autonomic dysfunction in cirrhosis, which could aid in early risk assessment and therapeutic management. Methods: A retrospective analysis was performed on clinical data from 100 patients admitted with cirrhosis over an 18-month period, from May 2023 to December 2024, at Shri B M Patil Medical College and Research Center, Vijayapura. The information gathered included patient demographics, clinical condition at admission, ECG results (QTc interval), echocardiographic findings, and signs of autonomic dysfunction. Results: The study examined the age distribution and physiology of the patients, most of whom were 20-60 years old. Pulse rates were categorized into three ranges: 81-100 bpm, 60-80 bpm, and 101-130 bpm. The Valsalva maneuver showed a similar distribution, with 52% falling in the 81-100 bpm range and 36% in the 60-80 bpm range.
Blood pressure was measured using a blood pressure cuff, with higher pressures indicating a higher risk of heart failure. Readings were taken under three conditions: supine BP (lying down), standing BP, and hand-grip BP. The Child-Pugh classification was used to assess the severity of chronic liver disease, with most patients being middle-aged. The study found a strong link between autonomic dysfunction, cardiovascular abnormalities, and liver disease progression. Conclusion: The study reveals a significant gender disparity in the population, with 95% being male. Cardiovascular assessments show normal physiological responses in most individuals, but some show signs of autonomic dysfunction. ECG analysis reveals abnormalities in sinus rhythm, highlighting the need for continuous monitoring. Liver function assessments reveal a high prevalence of severe liver disease, necessitating urgent medical intervention. Early detection and management of these health issues are crucial for improving outcomes. Future research should focus on lifestyle modifications, targeted treatments, and long-term monitoring.
Research Article
Open Access
Role of Intraoperative Parathyroid Hormone Monitoring in Primary Hyperparathyroidism Patients Undergoing Surgery
Pages 53 - 55

View PDF
Abstract
Background: Primary hyperparathyroidism (PHPT) is an endocrine disorder characterized by autonomous production of parathyroid hormone (PTH). We planned the present study to evaluate the level of PTH intraoperatively and postoperatively and determine the outcome of the surgery. Materials & Methods: A total of 36 patients scheduled to undergo parathyroidectomy for hyperparathyroidism were involved in the present study. Complete physical examination of all the subjects was carried out. Pre-surgical assessment of all the subjects was done. Minimally invasive parathyroidectomy (MIP) was done in all the patients. A 50% reduction in PTH level from baseline was used as an indication that the exploration was successful. If a parathyroid adenoma was not found or if the PTH did not drop sufficiently after the removal of the gland, the incision was extended and bilateral neck exploration was done. Results: MIP was carried out in 33 patients, while bilateral neck exploration was required in 3 patients. A significant decline in the mean PTH concentration was seen during surgery and postoperatively. Also, we observed a significant fall in the postoperative calcium levels in comparison to the preoperative calcium levels. Conclusion: Intraoperative PTH monitoring plays a significant and crucial role in assessing the surgical treatment of primary hyperparathyroidism.
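The 50%-reduction criterion described above reduces to a one-line check. The Python sketch below is illustrative only: the function name and the PTH values are our own, not taken from the study.

```python
def exploration_successful(baseline_pth: float, intraop_pth: float) -> bool:
    """Apply the study's criterion: a fall in PTH to half of baseline or less
    indicates that the hyperfunctioning gland was successfully removed."""
    if baseline_pth <= 0:
        raise ValueError("baseline PTH must be positive")
    return intraop_pth <= 0.5 * baseline_pth

# Hypothetical values (pg/mL): baseline 180, post-excision 70.
print(exploration_successful(180.0, 70.0))  # True, since 70 <= 90
```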
Research Article
Open Access
Role of NASG (Non-pneumatic Anti-Shock Garment) in Managing Hemorrhagic Shock in Postpartum Hemorrhage: A Systematic Review
Nannuri Viswa Samatha,
Heena Dixit
Pages 458 - 462

View PDF
Abstract
Background: Postpartum hemorrhage (PPH) remains the leading cause of maternal mortality globally, particularly in low-resource settings. The Non-Pneumatic Anti-Shock Garment (NASG) is a first-aid compression device endorsed by WHO and FIGO to stabilize women in hypovolemic shock while awaiting definitive care. Despite its potential, utilization remains suboptimal in many countries. Objectives: To systematically assess the role of NASG in managing hemorrhagic shock in PPH cases, with a focus on utilization rates, associated factors, and outcomes in low-resource settings. Methods: This systematic review followed PRISMA guidelines and was registered in PROSPERO (CRD42023412128). Electronic databases including PubMed, Embase, AJOL, and Google Scholar were searched up to May 2023. Observational and interventional studies reporting NASG utilization and associated outcomes were included. Data extraction and quality assessment were independently performed. Meta-analysis was conducted using a random-effects model. Results: Eight studies involving 2,690 healthcare providers were included. The pooled utilization rate of NASG was 43.2% (95% CI: 35.88–50.52; I² = 93.5%). Utilization was significantly associated with three factors: training (OR = 5.43), availability (OR = 7.78), and provider knowledge (OR = 4.61). Sensitivity analysis confirmed the robustness of the pooled estimate. Conclusion: Despite proven efficacy, NASG utilization remains limited in real-world settings. Structured training, consistent availability, and improved provider awareness are essential to scale up usage and reduce PPH-related mortality. Strengthening policy integration and monitoring systems will further enhance implementation outcomes.
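As an illustration of the random-effects pooling of utilization rates described above, the Python sketch below implements the standard DerSimonian-Laird estimator for proportions. The four (events, sample size) tuples are hypothetical placeholders, not data from the review.

```python
import math

def dl_pooled_proportion(studies):
    """DerSimonian-Laird random-effects pooling of raw proportions.
    `studies` is a list of (events, sample_size) tuples."""
    p = [e / n for e, n in studies]                      # per-study proportions
    v = [pi * (1 - pi) / n for pi, (e, n) in zip(p, studies)]  # binomial variances
    w = [1 / vi for vi in v]                             # fixed-effect weights
    p_fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, p))  # Cochran's Q
    k = len(studies)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                   # between-study variance
    w_star = [1 / (vi + tau2) for vi in v]               # random-effects weights
    pooled = sum(wi * pi for wi, pi in zip(w_star, p)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Four hypothetical studies (NASG users, providers surveyed):
pooled, ci = dl_pooled_proportion([(120, 300), (90, 250), (200, 400), (50, 180)])
print(round(pooled, 3), [round(x, 3) for x in ci])
```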
Research Article
Open Access
To Estimate the Correlation between Serum Uric Acid to Creatinine Ratio and Proteinuria in Diabetes Mellitus Patients
Naveenkumar V.K.,
Vandana Balgi,
Kavya D
Pages 836 - 838

View PDF
Abstract
Background: Type 2 Diabetes Mellitus (T2DM) is a chronic condition that has reached epidemic proportions worldwide, affecting millions of individuals and representing a significant public health burden. The ratio of serum uric acid to creatinine (SUA/Cr ratio) has been proposed as a novel marker for assessing the risk of kidney damage and other metabolic disturbances, including in diabetic patients. Elevated SUA/Cr ratios have been linked to the early stages of diabetic nephropathy, including proteinuria, and may help identify individuals at risk before significant kidney dysfunction develops. Objectives: To estimate the correlation between the serum uric acid to creatinine ratio and proteinuria in diabetes mellitus patients. Methods: A cross-sectional study was conducted on 60 diabetic patients visiting K R Hospital, Mysuru, from April 2023 to October 2024. Serum uric acid and creatinine were measured, the SUA/Cr ratio was calculated, and proteinuria was quantified; these values were then correlated. Results: The analysis revealed a positive correlation (r = 0.42) between the serum uric acid to creatinine ratio and proteinuria in patients with type 2 diabetes mellitus, with a statistically significant p-value of 0.05. This suggests that as the serum uric acid to creatinine ratio increases, the level of proteinuria also tends to rise, indicating a potential link between this biochemical ratio and renal involvement in diabetic individuals. Conclusion: This study highlights the importance of early detection and monitoring of kidney dysfunction in individuals with type 2 DM, particularly by using the SUA/Cr ratio and proteinuria as potential markers, and demonstrates a clear relationship between the SUA/Cr ratio and proteinuria.
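The correlation reported above (r = 0.42) is a Pearson product-moment coefficient. A minimal Python sketch of its computation follows; the paired SUA/Cr and proteinuria values are hypothetical, not the study's data.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient for paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired observations: SUA/Cr ratio vs. urinary protein (mg/day).
ratio   = [3.1, 4.2, 5.0, 5.8, 6.5, 7.1]
protein = [150, 220, 300, 280, 410, 460]
print(round(pearson_r(ratio, protein), 2))
```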
Research Article
Open Access
A Cross-Sectional Study on the Efficacy of Preoperative Antibiotic Prophylaxis Versus No Prophylaxis in Reducing Surgical Site Infections in Clean Surgeries
Sanjeev R Navalyal,
Prafullachandra Hoogar,
Praveen Kumar K H,
Lata K Mankani
Pages 1142 - 1146

View PDF
Abstract
Introduction: Surgical site infections (SSIs) are common postoperative complications that increase morbidity and healthcare costs. The role of preoperative antibiotic prophylaxis in clean surgical procedures remains controversial. Aim: To evaluate the efficacy of preoperative antibiotic prophylaxis versus no prophylaxis in reducing SSIs in clean surgeries. Methods: A cross-sectional comparative study was conducted involving 120 patients undergoing clean surgeries at a tertiary care center. Patients were divided equally into two groups: one receiving a single dose of preoperative intravenous ceftriaxone and the other receiving no prophylaxis. Outcomes measured included incidence of SSI, postoperative complications, hospital stay, and antibiotic-related side effects. Data were analyzed using appropriate statistical tests with significance set at p < 0.05. Results: SSI incidence was significantly lower in the antibiotic group (6.7%) compared to the no prophylaxis group (25.0%) (p = 0.004). Postoperative fever and wound discharge were also significantly reduced (p = 0.04). The antibiotic group experienced shorter hospital stays (mean 4.1 vs. 5.3 days, p < 0.001) and lower pain scores (p = 0.006). However, antibiotic-related side effects occurred in 11.7% of patients receiving prophylaxis. Conclusion: Preoperative antibiotic prophylaxis significantly reduces surgical site infections and improves postoperative outcomes in clean surgeries, supporting its routine use with cautious monitoring for adverse effects.
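The SSI comparison above can be illustrated with a Pearson chi-square statistic on a 2x2 table. In the Python sketch below, the counts are reconstructed from the reported percentages under the assumption of 60 patients per arm (6.7% of 60 is about 4; 25% of 60 is 15); this is our reconstruction, not the authors' raw data.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Reconstructed counts (SSI yes/no): prophylaxis 4/56, no prophylaxis 15/45.
chi2 = chi_square_2x2(4, 56, 15, 45)
print(round(chi2, 2))  # 7.57, well past the 3.84 critical value at 1 df (p < 0.01)
```

The exact p-value reported in the abstract (0.004) may reflect a different test variant (e.g. with or without continuity correction), but the direction and significance agree.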
Research Article
Open Access
Connective Tissue Disorders Associated Interstitial Lung Disease – Evaluation by High Resolution Computed Tomography and Fibrosis Scoring System
Umer Ahmed Syed,
Dondha Shravani,
Bingi Vishwanath,
V Venkateswara Rao
Pages 70 - 75

View PDF
Abstract
Background: Connective tissue disorders (CTDs) are systemic autoimmune diseases that frequently involve the lungs, leading to interstitial lung disease (ILD), which is a major cause of morbidity and mortality. High-resolution computed tomography (HRCT) has emerged as a critical tool for the non-invasive assessment of ILD patterns and severity. This study aims to evaluate the HRCT imaging spectrum of CTD-associated ILD and correlate imaging findings with pulmonary function test (PFT) results. Objectives: To determine the predominant HRCT patterns in ILD associated with CTDs, quantify fibrosis severity using a scoring system, and correlate these scores with spirometric parameters. Methods: A prospective observational study was conducted at the Departments of Respiratory Medicine and Radiodiagnosis at Government Medical College and Hospital, Nizamabad and Nirmal. A total of 40 patients diagnosed with CTDs and suspected ILD underwent HRCT imaging. Fibrosis severity was scored based on zonal involvement (upper, middle, lower zones). PFTs were performed, and parameters such as FVC, FEV₁, and FEV₁/FVC ratio were recorded. Pearson’s correlation was used to assess relationships between HRCT scores and PFT values. Results: Among the 40 participants, systemic sclerosis (42.5%) and rheumatoid arthritis (37.5%) were the most common CTDs. NSIP was the predominant HRCT pattern, seen in 80% of cases. Quantitative fibrosis scores averaged 26.5 in systemic sclerosis and 28.8 in rheumatoid arthritis. A weak to moderate positive correlation was observed between fibrosis scores and FEV₁/FVC ratio (r = 0.43), suggesting that HRCT grading reflects pulmonary functional impairment. Conclusion: HRCT is a reliable tool for detecting and classifying ILD in CTD patients, with NSIP being the most common radiological pattern. The fibrosis scoring system offers a semi-quantitative method to estimate disease burden and demonstrates correlation with functional parameters. 
These findings support HRCT’s role in disease monitoring and prognosis in CTD-associated ILD.
Research Article
Open Access
Effect Of Antiepileptic Drugs on Liver Enzymes
Pages 60 - 62

View PDF
Abstract
Objective: To study the effect of antiepileptic drugs (AEDs) on liver enzymes. Study Design: Cross-sectional study. Materials & Methods: The study was conducted on 108 patients at Santosh Medical College & Hospital, Ghaziabad, between Sep 14 and Oct 14. Patients were divided into 3 groups of 36 patients each, receiving phenytoin, carbamazepine, and valproate respectively. Results: Of the 108 patients, most belonged to the age groups of >40–50 years (36; 33.33%) and >50 years (32; 29.62%). Raised SGPT was seen in 5 (13.89%), 3 (8.33%), and 3 (8.33%) patients in the phenytoin, carbamazepine, and sodium valproate groups respectively. SGOT was raised in the sodium valproate group. Alkaline phosphatase was raised in 10 (27.78%), 20 (55.56%), and 22 (61.11%) patients in the phenytoin, carbamazepine, and sodium valproate groups respectively. Conclusion: From the present study we conclude that sodium valproate is more hepatotoxic than carbamazepine, which in turn is more hepatotoxic than phenytoin. A baseline liver function test (LFT) is essential before starting AEDs, and regular LFT monitoring should be done during the course of treatment.
Research Article
Open Access
Lipid Profile Abnormalities in Metabolic Syndrome Patients: A Comparative Cross-Sectional Study
Amol Chaudhari,
Pallavi Prabhu,
Mukund Tayade,
Khilchand Bhangale
Pages 1152 - 1156

View PDF
Abstract
Introduction: Metabolic syndrome (MetS) is a cluster of metabolic abnormalities that predispose individuals to increased cardiovascular risk. Dyslipidemia is a core component of MetS and plays a crucial role in its pathogenesis. This study aimed to compare lipid profile abnormalities between metabolic syndrome patients and healthy controls. Methods: A comparative cross-sectional study was conducted involving 200 participants (100 MetS patients and 100 healthy controls). Anthropometric measurements, blood pressure, fasting blood glucose, and lipid profiles—including total cholesterol, triglycerides, LDL cholesterol, and HDL cholesterol—were assessed. Statistical analysis was performed to compare lipid parameters between groups. Results: Metabolic syndrome patients demonstrated significantly higher mean total cholesterol (220.6 ± 38.5 mg/dL vs. 182.4 ± 29.7 mg/dL, p < 0.001), triglycerides (186.9 ± 54.3 mg/dL vs. 111.3 ± 41.5 mg/dL, p < 0.001), and LDL cholesterol (140.4 ± 31.2 mg/dL vs. 108.7 ± 26.1 mg/dL, p < 0.001) compared to controls. HDL cholesterol was significantly lower in MetS patients (38.7 ± 8.9 mg/dL) than controls (52.3 ± 9.6 mg/dL, p < 0.001). Dyslipidemia prevalence was high among MetS patients, with 91% showing at least one abnormal lipid parameter. Conclusion: Significant dyslipidemia is prevalent in metabolic syndrome patients compared to healthy controls, underscoring the importance of lipid monitoring and management in this high-risk group to reduce cardiovascular complications.
Research Article
Open Access
Elevation Of Liver Enzymes and Its Correlation with Type 2 Diabetes Mellitus in A Tertiary Care Hospital
Dr. Shashank Tyagi,
Dr. Priyank Jain,
Dr. Chandan Pandurang Wani
Pages 209 - 212

View PDF
Abstract
Background: Diabetes mellitus, one of the most common chronic diseases, has been related to various liver illnesses such as liver enzyme derangements, non-alcoholic fatty liver disease, hepatocellular carcinoma, and cirrhosis. There has been increased interest in the contribution of liver enzymes to the prediction of diabetes and glycemic control. Aims and Objectives: The aim of this study was to compare liver enzymes between type 2 diabetes mellitus (T2DM) patients and non-diabetic individuals. Materials and Methods: Diabetic patients seen in the outpatient department or admitted as inpatients were included in this study. Information was collected and a detailed history taken using a pre-formed proforma at the time of admission. Liver function tests were performed for all participants, and HbA1c values were measured. Liver enzymes were correlated with HbA1c values. Results: The majority of the participants were male (64% of cases and 60% of controls). The mean age was 53.5 ± 9.3 years among cases and 49.8 ± 5.6 years among controls. The mean duration of diabetes was 7.86 ± 5.38 years and the mean HbA1c was 8.48 ± 3.25%. Mean fasting and post-prandial blood sugars were 169.5 ± 91.3 and 242.3 ± 133.6, respectively. Liver enzymes including aspartate aminotransferase (AST), alanine aminotransferase (ALT), alkaline phosphatase (ALP), and gamma-glutamyl transferase (GGT) were significantly raised in diabetes mellitus cases (p < 0.05) compared to non-diabetic controls. Conclusion: We found a significant association of AST, ALT, ALP, and GGT with type 2 diabetes mellitus; all were negatively correlated with HbA1c level. Hence, monitoring of liver function tests in uncontrolled T2DM patients is essential.
Research Article
Open Access
Hypertriglyceridemia-Induced Acute Pancreatitis Triggering Diabetic Ketoacidosis with Multi-Organ Dysfunction in an Undiagnosed Type 1 Diabetic: A Case Report
Rohan Ghosh,
Debashis Sadhukhan,
Tarapada Das,
Hiranmoy Barman,
Sumit Kr. Agarwal
Pages 259 - 260

View PDF
Abstract
Background: Diabetic ketoacidosis (DKA) is a serious complication of diabetes, occasionally triggered by acute pancreatitis due to hypertriglyceridemia. We report a rare case of a young male with undiagnosed T1DM presenting with DKA, hypertriglyceridemia-induced pancreatitis, AKI, and ARDS. Case Presentation: A 23-year-old male presented with abdominal pain, respiratory distress, and altered consciousness. He was hypotensive, hyperglycemic (CBG 428 mg/dL), with a GCS of E2V1M3. Investigations revealed high anion gap metabolic acidosis, triglycerides of 730 mg/dL, elevated lipase, positive urinary ketones, and renal impairment. Imaging confirmed pancreatitis with renal parenchymal disease. He was intubated and treated in ICU with insulin infusion, fluids, vasopressors, antibiotics, and dialysis. He later developed ARDS, which was managed appropriately. The patient recovered fully and was discharged with counselling on lifelong insulin therapy, glucose monitoring, and dietary adherence. Conclusion: Early multidisciplinary management in severe DKA cases with organ dysfunction can result in full recovery.
Research Article
Open Access
Flap Edge Capillary Blood Glucose Monitoring as a Predictor of Flap Survival in Transposition Flap for Pilonidal Sinus Surgery in a Tertiary Care Hospital
Dr. M. Muralidharan,
Dr. P. Sumitra,
Dr. M. Allwyn sudhagar,
Dr. M. Vennila,
Dr. M. Gogan
Pages 709 - 713

View PDF
Abstract
Background: Pilonidal sinus is treated with wide local excision and primary closure, which carries high recurrence, post-operative morbidity, and cosmetic implications. To overcome these limitations, a Limberg transposition flap is performed. Post-operative flap survival is crucial for proper healing, hence the need for a tool that predicts flap survival as early as possible. Serial flap capillary blood glucose monitoring is such a tool: a basic clinical procedure that causes no harm to the patient or the flap. Methods: This prospective cohort study aimed to validate serial flap capillary blood glucose monitoring as an indicator of flap survival after Limberg flap surgery for pilonidal sinus. It was conducted among 30 patients who underwent wide local excision and Limberg flap for pilonidal sinus in the Department of General Surgery, Government Medical College & ESI Hospital, between June 2022 and November 2022. Capillary blood glucose was estimated at the edge of the flap, 3 mm from the incision site, using a glucometer at 0, 6, and 24 hours following surgery. Results: Of the 30 patients who underwent Limberg flap surgery, 1 patient developed postoperative flap necrosis due to venous thrombosis. 1 patient developed a flap infection, which was treated with appropriate antibiotics after culture and sensitivity testing, and the flap survived. The remaining 28 patients had healthy flaps and an uneventful postoperative period. The 29 patients in whom the flap survived all had flap glucose levels above 62 mg/dL in the first 24 hours. Conclusion: Serial monitoring of flap capillary blood glucose in the postoperative period can identify flaps at risk, allowing early goal-directed therapy to improve flap survival. This simple and inexpensive technique can be used for routine monitoring of Limberg flaps alongside routine clinical evaluation.
Research Article
Open Access
A Comparative Study of Recombinant Erythropoietin Injectables Versus the Oral Formulation of Desidustat in Treating Patients with Anemia of Chronic Kidney Disease
Birupaksha Biswas,
Suhena Sarkar,
Subesha Basu Roy,
Shilpa Basu Roy,
Nupur Ghosh,
Soumyajit Mallick,
Paramita Adhikary,
Debtanu Hazra,
Aparna Basumatary
Pages 354 - 362

View PDF
Abstract
Background: Anemia in chronic kidney disease (CKD) arises from insufficient erythropoietin production and functional iron deficiency, significantly impairing quality of life and disease prognosis. Recombinant human erythropoietin (rh-EPO) remains the cornerstone of therapy, though associated with parenteral administration burdens and resistance. Desidustat, an oral hypoxia-inducible factor prolyl hydroxylase inhibitor (HIF-PHI), offers a novel mechanism by stimulating endogenous erythropoiesis and enhancing iron metabolism. Aims and Objectives: To evaluate and compare the efficacy and safety of oral Desidustat monotherapy against injectable rh-EPOs in CKD-associated anemia in terms of hematological indices, iron profile, renal function (eGFR), and cardiovascular risks, within a multidisciplinary framework. Materials and Methods: A prospective interventional cohort study was conducted on 150 CKD stage 3b–5 patients (October–December 2024) already on rh-EPO, who were switched to Desidustat. Regular monitoring of hemoglobin, reticulocyte parameters (ARC, ARI), ferritin, transferrin saturation (TSAT), eGFR, serum erythropoietin, and HEART score (MACE prediction) was done. Statistical analysis involved paired sample t-tests using SPSS v17, with p<0.05 considered significant. Results: Desidustat significantly improved hemoglobin (mean increase: 0.609 g/dL, p<0.05), eGFR (mean increase: 1.828 mL/min/1.73m², p<0.05), ARC (mean increase: 20.9×10⁹/L, p<0.05), ARI (mean increase: 0.16%, p<0.05), ferritin (mean increase: 7.86 ng/mL, p<0.05), and TSAT (mean increase: 4.91%, p<0.05). Peripheral smear confirmed effective erythropoiesis. No significant increase in MACE risk was observed. Compared to rh-EPOs, Desidustat demonstrated superior tolerability, oral convenience, and reduced need for adjunctive iron therapy. Conclusion: Desidustat presents a compelling oral alternative to rh-EPO injectables in managing CKD-associated anemia. 
Beyond hematological improvements, it shows promise in renal function stabilization and iron metabolism enhancement, with a favorable safety and compliance profile. This study reinforces the therapeutic potential of HIF-PHIs in nephrology and warrants further multicenter validation.
Research Article
Open Access
Prevalence and Risk Factors of Non-Alcoholic Fatty Liver Disease (NAFLD) in Type 2 Diabetic Patients
Bhanpratap Ahirwar,
Vinay Verma,
Delux Godghate,
Hansraj Parmar
Pages 415 - 419

View PDF
Abstract
Background: Non-Alcoholic Fatty Liver Disease (NAFLD) is a common liver condition associated with Type 2 Diabetes Mellitus (T2DM). The high prevalence of NAFLD in diabetic patients is a growing concern due to its potential to progress to more severe liver conditions, such as non-alcoholic steatohepatitis (NASH), fibrosis, and cirrhosis. This study aimed to assess the prevalence and risk factors associated with NAFLD in T2DM patients in Central India. Methods: A cross-sectional observational study was conducted over 12 months at a tertiary care hospital in Central India. A total of 100 adult T2DM patients aged 40-70 years were included. Data were collected through structured questionnaires, physical examinations, and biochemical tests. Liver ultrasound was used to diagnose NAFLD and assess its severity. The risk factors for NAFLD, including age, gender, obesity, hypertension, dyslipidemia, family history of NAFLD, and duration of diabetes, were analyzed. Results: The prevalence of NAFLD in T2DM patients was found to be 65%. Significant risk factors for NAFLD included obesity (69.2%), hypertension (76.9%), dyslipidemia (73.8%), a family history of NAFLD (61.5%), and longer duration of diabetes (mean 8.5 years). Liver ultrasound showed that 46.2% of patients had mild NAFLD, 30.8% had moderate NAFLD, and 23.1% had severe NAFLD. Biochemical markers, including elevated ALT and AST levels, and higher fasting blood glucose, were significantly associated with NAFLD. Lifestyle modifications, weight loss, and regular monitoring of liver function were the primary management approaches. Conclusion: This study highlights the high prevalence of NAFLD in T2DM patients and identifies key risk factors such as obesity, hypertension, and dyslipidemia. 
Early detection and management through lifestyle interventions, regular monitoring of liver function, and appropriate pharmacotherapy are crucial for mitigating the adverse effects of NAFLD and improving long-term outcomes in diabetic patients.
Research Article
Open Access
Electrocardiographic changes in people living with HIV and its correlation with serum CRP levels
Manjiri Naik,
Rajiv Naik,
Swapnil Sapre,
Sonali Bhattu
Pages 613 - 616

View PDF
Abstract
Background: With the advent of antiretroviral therapy (ART) and improved follow-up strategies through ICTC services, the incidence of opportunistic infections in people living with HIV (PLHIV) has declined. However, there has been a concurrent rise in non-communicable diseases, particularly cardiovascular disorders. Electrocardiography (ECG) is a non-invasive tool for detecting early cardiac abnormalities, while serum C-reactive protein (CRP), an inflammatory biomarker, has been increasingly recognized for its prognostic role in cardiovascular risk among PLHIV. Materials and Methods: This prospective observational study was conducted at a tertiary care hospital from December 2020 to October 2022. A total of 50 HIV-positive patients aged above 18 years attending the ART center were enrolled after obtaining informed consent. Patients with known cardiovascular diseases, hypertension, or chronic alcoholism were excluded. All participants underwent 12-lead ECG and serum CRP testing to identify potential associations between ECG abnormalities and inflammatory status. Results: Out of the 50 participants, ECG abnormalities were noted in 48 (96%) patients. T wave inversions were the most prevalent, observed in 29 patients (58%), followed by ST-segment elevations in 12 (24%). Other findings included sinus tachycardia (4%), irregular rhythm (4%), left ventricular hypertrophy (4%), and electrical alternans (2%). Only 2 patients (4%) had normal ECGs. Elevated serum CRP levels were observed in 93% of patients with T wave inversions and in all cases with ST-segment elevations, where 60% had CRP levels exceeding 100 mg/L. Among patients with ST-T changes, 95% demonstrated elevated CRP levels, indicating a strong correlation (p < 0.001). In contrast, all patients with normal ECGs had normal CRP levels, suggesting lower cardiovascular risk. Conclusion: Cardiovascular abnormalities are emerging complications among PLHIV in the ART era. 
ECG serves as an effective preliminary tool for assessing these complications. Elevated serum CRP levels show a significant association with electrocardiographic abnormalities and can be used as a predictive marker for cardiovascular morbidity and mortality in this population. Regular ECG and CRP monitoring may enhance early detection and management of cardiovascular risks in PLHIV.
Research Article
Open Access
A Study on Catheter-Associated Urinary Tract Infections (CAUTI) and Antibiotic Sensitivity Pattern of Uropathogens Causing CAUTI
Albert Dawn,
T.N. Lahiri Mazumder
Pages 627 - 629

View PDF
Abstract
Background: Catheter-associated urinary tract infection (CAUTI) remains the most frequent hospital-acquired infection (HAI). This highlights the necessity of implementing and monitoring effective infection-control measures in order to lower the risk of CAUTI. Aims: The objectives of the current study were to compute the CAUTI rate and identify the etiological agents along with their drug susceptibility. Materials & Methods: Catheter-associated urinary tract infections (CAUTIs) are a common healthcare-associated infection caused by prolonged catheter use. Effective prevention methods include maintaining sterile insertion techniques, ensuring proper catheter care, minimizing catheter use, and prompt removal when no longer needed. The antibiotic sensitivity pattern of uropathogens causing CAUTIs highlights the importance of monitoring local resistance trends to guide treatment. Common uropathogens include Escherichia coli, Klebsiella spp., and Pseudomonas aeruginosa, which often exhibit resistance to multiple antibiotics. Empirical therapy should be tailored based on antibiograms to avoid resistance development and ensure effective treatment. Result: Antimicrobial resistance (AMR) among uropathogens is a significant global challenge, with varying resistance profiles across pathogens. E. coli demonstrated moderate susceptibility to ampicillin and piperacillin-tazobactam but showed high resistance to amoxicillin-clavulanate and ceftriaxone. K. pneumoniae exhibited extensive resistance, including 100% resistance to several antibiotics, though partial susceptibility to some aminoglycosides and ceftazidime was observed. P. aeruginosa displayed multidrug resistance, with susceptibility limited to carbapenems such as imipenem and meropenem. Acinetobacter species showed pan-resistance to all tested antibiotics, highlighting a severe clinical threat. These findings stress the urgent need for antimicrobial stewardship, novel therapies, and robust resistance surveillance systems.
Conclusion: We concluded that CAUTI continues to pose a serious threat to patient safety and a challenge to the infection control team. Implementation of appropriate care bundles and ongoing training for healthcare professionals are major factors in lowering CAUTI rates, which in turn reduces patient morbidity and hospital stay. Multidrug-resistant uropathogens such as E. coli and Klebsiella spp. remain a particular cause for concern.
Research Article
Open Access
Elevated CA125 Reflects Disrupted Albumin-Sodium Homeostasis in Maintenance Hemodialysis
Asmita Hazra,
Jayati Chakraborty,
Saptarshi Mandal
Pages 630 - 640

View PDF
Abstract
Background: CA125 (Cancer Antigen 125) is a tumor marker associated with ovarian cancer. It is gradually becoming apparent that it is also elevated in various conditions associated with inflammation or fluid overload. As in heart failure patients (Núñez et al., 2016),1 CA125 may potentially serve as a prognostic marker in chronic kidney disease patients on hemodialysis, who experience both fluid overload and inflammation. Experimental evidence demonstrates that mechanical stretch directly upregulates expression of MUC16, the gene encoding CA125, in mesothelial cells (Huang et al., 2013),2 providing a mechanistic basis for CA125 elevation in volume overload states. Objective: To investigate the correlation between CA125 levels and routine biochemical parameters (albumin and sodium) in maintenance hemodialysis patients, explore its potential as an early warning biomarker for patients at risk of developing hypoalbuminemia, and explore gender-based differences in these relationships. Methods: This cross-sectional study analyzed 122 maintenance hemodialysis patients at a tertiary care government hospital in Eastern India. Complete data for CA125, albumin, and sodium were available for 87 patients. Serum albumin and sodium were measured using a Beckman Coulter AU480, while CA125 was measured by chemiluminescent immunoassay. Statistical analysis included correlation analysis and ANOVA using R software version 3.6.1. To assess the diagnostic value of CA125 in identifying hypoalbuminemia, a receiver operating characteristic (ROC) curve analysis was conducted (cutoff 4.0 g/dL based on KDOQI). Multiple regression examined factors independently associated with log(CA125). Results: The study population had a mean age of 47.9 ± 12.8 years with balanced sex distribution (M:F ratio 1.1:1). CA125 showed a log-normal distribution (p<10⁻¹⁶) with median 12.45 U/ml. Albumin demonstrated a significant negative correlation with log CA125 (r=-0.47, p=0.000006).
Among patients with elevated CA125 (>35 U/ml), 22.2% (4/18) had hypoalbuminemia (<3.5 g/dl) compared to only 1.4% (1/69) in those with normal CA125 (p<0.05). The albumin-sodium correlation was positive overall (r=0.20, p<0.00001) but showed progressive strengthening across CA125 tertiles, with only the high CA125 group showing significant correlation. Patients with hypoalbuminemia or hyponatremia had 5-fold higher mean CA125 (129.8 vs 26.1 U/ml). Gender analysis revealed a stronger albumin-sodium correlation in females (r=0.35) compared to males (r=0.12). ROC analysis revealed that CA125 had excellent diagnostic accuracy for predicting hypoalbuminemia (<4.0 g/dL), with an AUC of 0.861 (95% CI: 0.791-0.932, p<0.001). The optimal cutoff value of 12.25 U/ml yielded a sensitivity of 77.1% and specificity of 80.7%. In multiple regression analysis, albumin remained significantly associated with log(CA125) after adjusting for confounders. Conclusion: High CA125 levels in hemodialysis patients correlate with dysregulation of albumin and sodium homeostasis, suggesting its potential as an early warning biomarker rather than a diagnostic replacement for albumin. CA125 reflects underlying pathophysiological processes (mesothelial stress, inflammation, volume overload) that precede overt hypoalbuminemia, making it valuable for risk stratification and targeted intervention. The stronger correlations observed in females warrant further investigation into gender-specific applications of CA125 monitoring in dialysis patients.
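The 2×2 comparison reported in this abstract (hypoalbuminemia in 4/18 patients with elevated CA125 versus 1/69 with normal CA125, p<0.05) can be checked with a one-sided Fisher exact test computed directly from the hypergeometric distribution. The sketch below is illustrative only; the counts come from the abstract, but the paper's exact statistical test is not stated, and the function name is ours:

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """Right-tail Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    P(X >= a) under the hypergeometric null with fixed margins."""
    n1 = a + b            # size of first group (elevated CA125: 18)
    col1 = a + c          # total events (hypoalbuminemia: 5)
    n = a + b + c + d     # total patients (87)
    denom = comb(n, col1)
    p = 0.0
    for k in range(a, min(n1, col1) + 1):
        p += comb(n1, k) * comb(n - n1, col1 - k) / denom
    return p

# Counts reconstructed from the abstract: 4/18 vs 1/69
p = fisher_one_sided(4, 14, 1, 68)  # ≈ 0.0059, consistent with the reported p < 0.05
```

The computed p-value of roughly 0.006 agrees with the abstract's claim of significance at the 0.05 level.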
Research Article
Open Access
Evaluate Maternal Microbiome and Their Association with Adverse Pregnancy Outcomes: A Prospective Longitudinal Study
Kavya Patel,
Jay Jagdish Pathak,
Mahammed Mubin Sikandarbhai Manva
Pages 695 - 698

View PDF
Abstract
Background: Emerging evidence highlights the critical role of the maternal microbiome in modulating immune, metabolic, and hormonal functions during pregnancy. Alterations in microbial communities may contribute to adverse pregnancy outcomes such as preeclampsia, preterm birth, gestational diabetes mellitus (GDM), and intrauterine growth restriction (IUGR). This longitudinal study investigates the association between maternal microbiome composition and pregnancy outcomes across the three trimesters. Materials and Methods: A prospective cohort of 120 pregnant women aged 20–35 years was recruited at <12 weeks gestation and followed through delivery. Vaginal, oral, and fecal microbiome samples were collected at each trimester. 16S rRNA gene sequencing was used for microbial profiling. Pregnancy outcomes assessed included gestational age at delivery, incidence of GDM, hypertensive disorders, and neonatal birth weight. Alpha and beta diversity indices were calculated, and associations with outcomes were analyzed using multivariate regression models. Results: Out of 120 participants, 112 completed the study. Women who developed preeclampsia (n=14) showed significantly lower vaginal microbial diversity in the second trimester (Shannon index mean: 2.1±0.4) compared to normotensive women (3.5±0.6; p<0.001). Higher relative abundance of Prevotella and Gardnerella in the vaginal microbiome was significantly associated with preterm birth (n=11; OR=2.8, 95% CI: 1.4–5.6). Gut microbial dysbiosis characterized by a lower Firmicutes/Bacteroidetes ratio was observed in GDM cases (n=16) during the third trimester (p=0.02). No significant changes were observed in oral microbiome patterns across groups. Conclusion: This study underscores the dynamic nature of the maternal microbiome and its potential predictive value for pregnancy complications. 
Specific microbial shifts, particularly in the vaginal and gut environments, are associated with adverse outcomes such as preeclampsia, preterm birth, and GDM. Monitoring maternal microbiome profiles may serve as a non-invasive tool for early identification of at-risk pregnancies and inform targeted interventions.
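The Shannon index used in the microbiome abstract above is H' = -Σ pᵢ ln pᵢ over taxon proportions; higher values indicate greater diversity. A minimal sketch with illustrative abundance data (not the study's data):

```python
from math import log

def shannon_index(abundances):
    """Shannon diversity H' = -sum(p_i * ln p_i) over taxon proportions.
    Accepts raw counts or proportions; zero counts are skipped."""
    total = sum(abundances)
    return -sum((n / total) * log(n / total) for n in abundances if n > 0)

# An even community of 4 taxa attains the maximum, ln(4) ≈ 1.386
even = shannon_index([25, 25, 25, 25])
# A community dominated by one taxon scores lower, consistent with the
# reduced vaginal diversity reported in the preeclampsia group
skewed = shannon_index([85, 5, 5, 5])
```

This makes concrete why a drop from a Shannon index of 3.5 to 2.1 represents a substantial loss of microbial diversity.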
Research Article
Open Access
Clinicodemographic Profile of Adult Sickle Cell Disease Patients: A Cross-Sectional Study from Eastern India
Surabhi Mishra,
Sushant Kumar Bhuyan,
Soumya Kumar Acharya,
Namita Mohapatra
Pages 5 - 9

View PDF
Abstract
Background: Sickle cell disease (SCD) is a chronic hemoglobinopathy with diverse clinical manifestations and regional variation in India. Understanding its clinicodemographic distribution and crisis-related risk factors is crucial for effective management. This study aimed to assess the demographic profile and clinical presentation of adult SCD patients and examine factors associated with vaso-occlusive crises (VOC). Materials and Methods: A hospital-based cross-sectional study was conducted among 157 adult SCD patients at a tertiary care center in Eastern India from May 2023 to December 2024. Data on demographics, geographic origin, clinical features, and treatment history were collected. Clinical manifestations including VOC, hemolytic crisis, cholelithiasis, and acute chest syndrome were documented. Associations between VOC and demographic or clinical factors were analyzed using appropriate statistical tests. Results: Most participants were aged 21–30 years (49%), with a slight female predominance (52.9%). A significant proportion hailed from Nayagarh, Angul, and Khordha districts. VOC was the most common complication (61.8%), followed by hemolytic crisis (25.5%). VOC was significantly associated with younger age (p < 0.001), presence of splenomegaly (p = 0.001), and history of blood transfusions (p < 0.001). No significant association was observed with gender or hydroxyurea therapy. Other acute complications were infrequent. Conclusion: VOC is the most prevalent clinical manifestation among adult SCD patients in Eastern India, particularly affecting younger individuals with splenomegaly and a transfusion history. Geographical clustering suggests the need for targeted regional interventions. Identification of VOC-associated factors can guide individualized patient monitoring and resource allocation.
Research Article
Open Access
Assessment of Muscle Oxygen Saturation Using Near-Infrared Spectroscopy During Progressive Exercise in Trained and Untrained Individuals
Darshil V Korat,
Piyushkumar Harsukhlal Kaneriya,
Vishvesh Kiritbhai Lakhani,
Janvi Bhanjibhai Panchotiya
Pages 72 - 75

View PDF
Abstract
Background: Muscle oxygen saturation (SmO₂) reflects local oxygen utilization during exercise and is a vital indicator of muscular and cardiovascular efficiency. Near-infrared spectroscopy (NIRS) offers a non-invasive method to monitor real-time changes in SmO₂. This study aimed to compare SmO₂ dynamics during graded exercise in trained and untrained individuals to evaluate the impact of physical conditioning on oxygen kinetics. Materials and Methods: A total of 30 participants (15 trained athletes and 15 untrained healthy individuals) aged 18–30 years were recruited. All participants underwent a standardized incremental cycling protocol on an ergometer. Muscle oxygen saturation was continuously measured using portable NIRS devices placed on the vastus lateralis muscle. Heart rate, perceived exertion (RPE), and SmO₂ were recorded at baseline, each workload stage, and immediately post-exercise. Data were analyzed using repeated-measures ANOVA with significance set at p<0.05. Results: At rest, the mean SmO₂ was significantly higher in trained individuals (78.4% ± 3.2) compared to untrained individuals (72.1% ± 4.5). During peak exercise, SmO₂ decreased to 42.5% ± 5.3 in the trained group and 35.2% ± 6.1 in the untrained group (p=0.01). Trained participants demonstrated faster recovery in SmO₂ values post-exercise (return to baseline in 90 ± 12 seconds) compared to untrained individuals (130 ± 18 seconds). Heart rate and RPE were also significantly lower in the trained group at comparable workloads. Conclusion: Trained individuals exhibit higher baseline muscle oxygen saturation, reduced desaturation during exercise, and faster post-exercise recovery, indicating more efficient oxygen utilization. NIRS can serve as a reliable tool for assessing training status and monitoring exercise performance
Research Article
Open Access
An Observational Descriptive Study on the Correlation of Renal Cortical Thickness, Renal Echogenicity and Renal Size with Estimated Glomerular Filtration Rate in Chronic Kidney Disease Patients in a Tertiary Care Hospital
Aalaya Haridas,
Dr Siddharth Pugalendhi,
Dr Shayilendranath. V,
Dr Bhargav Kiran Gaddam,
Dr Thokala Sivaiah MD,
Pages 109 - 115

View PDF
Abstract
Background: Chronic Kidney Disease (CKD) is a significant public health concern, and early detection is crucial for preventing disease progression. Ultrasound is an ideal imaging modality for CKD due to its non-invasive nature and accessibility. This study aimed to evaluate the correlation between renal cortical thickness, renal echogenicity, and renal size with eGFR in CKD patients. Research Question: Is there a correlation between renal cortical thickness, renal echogenicity, and renal size with eGFR in CKD patients? Methods: A one-year observational study was conducted at the Department of General Medicine, Mahatma Gandhi Medical College and Research Institute, Pondicherry, from January 2024 to December 2024. Ninety-one CKD stage 1-5 patients attending the General Medicine OPD were included in the study. Socio-demographic profiles, hemoparameters (eGFR, hemoglobin, serum sodium, serum potassium, serum calcium, serum phosphorus, serum creatinine, and blood urea), and ultrasound parameters (renal cortical thickness, renal echogenicity, and renal size) were studied. Results: The study population had a mean age of 55 years, with a higher burden of disease between 41-60 years (66%). Males (77%) had higher morbidity compared to females (23%). The mean values of hemoparameters were: hemoglobin (8.87 ± 1.94 g/dl), serum calcium (7.75 ± 1.04 mg/dl), serum phosphorus (3.89 ± 1.04 mg/dl), serum sodium (134.3 ± 3.14 mmol/L), serum potassium (6.17 ± 13.04 mmol/L), serum creatinine (3.57 ± 2.33 mg/dl), and blood urea (59.57 ± 23.16 mg/dl). The mean eGFR value was 24.92 ± 13.77 ml/min/1.73m². Significant abnormal values of hemoparameters were observed in relation to decreased eGFR and grading of renal echogenicity. There was a significant (P<0.01) reduction in mean values of eGFR, renal size, and renal cortical thickness with increasing renal echogenicity grade. 
A significant correlation was observed between renal cortical thickness, renal echogenicity, and renal size with eGFR. Conclusion: This study demonstrates a significant correlation between renal cortical thickness, renal echogenicity, and renal size with eGFR in CKD patients. These ultrasound parameters can be useful in monitoring disease progression and predicting renal function decline.
Research Article
Open Access
A Study to Correlate Postoperative Serum Albumin Levels in Predicting Postoperative Complications in Major Open Abdominal Surgeries
CH Naga Harsha Vardhan,
Sai Mayur Datta,
Surabhi Singh,
Sunil Kumar Patanaik,
G Lokesh Abhinav,
P Naga Praveen
Pages 205 - 209

View PDF
Abstract
Background: Major open abdominal surgeries are associated with significant postoperative morbidity. Serum albumin, traditionally considered a nutritional marker, is also a dynamic indicator of surgical stress and systemic inflammation. While preoperative hypoalbuminemia is a known predictor of poor outcomes, the prognostic value of a postoperative drop in serum albumin remains underexplored. This study aimed to correlate postoperative serum albumin decline with the development of postoperative complications. Materials and Methods: A prospective observational study was conducted in the Department of General Surgery, KIMS, Bhubaneswar, from February 2023 to January 2025. Eighty-four adult patients undergoing elective major open abdominal surgeries (>2 hours duration) were included. Serum albumin levels were recorded preoperatively and at 6 hours, postoperative day (POD) 1, and POD 3. A drop ≥1 g/dL in serum albumin on POD1 was considered significant. Postoperative complications such as wound infections, sepsis, and anastomotic leaks were recorded. Statistical analyses included chi-square, t-test, ANOVA, and diagnostic accuracy metrics. Results: Postoperative complications were observed in 45 patients (53.6%). Significant associations were found between complications and age >60 years (p=0.018), male gender (p<0.001), BMI >25 (p=0.027), presence of comorbidities (p<0.001), surgery duration >3 hours (p<0.001), and blood loss >200 mL (p<0.001). A ≥1 g/dL drop in serum albumin on POD1 occurred in 71.1% of patients with complications, versus only 2.6% in those without (p<0.001). This drop showed strong predictive power: sensitivity 71.1%, specificity 97.4%, positive predictive value (PPV) 97%, negative predictive value (NPV) 74.5%, and diagnostic accuracy 83.3%. Moreover, a high correlation was observed between decreased albumin and elevated MPASS scores (p<0.001), reinforcing its role as a surgical stress marker. 
Conclusion: A postoperative serum albumin decline of ≥1 g/dL within 24 hours is a strong, early, and cost-effective predictor of postoperative complications in major open abdominal surgeries. Routine monitoring of serum albumin levels can significantly aid in identifying high-risk patients and guide timely interventions to improve clinical outcomes.
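The diagnostic metrics reported in the albumin study above can be reproduced from the stated group sizes (45 patients with complications, 39 without) and event rates (the ≥1 g/dL drop occurred in 71.1% and 2.6% respectively). The 2×2 counts below are a reconstruction from those rates, since the raw table is not given in the abstract:

```python
# Reconstructed confusion matrix for the ≥1 g/dL albumin-drop criterion:
# 71.1% of 45 complicated cases -> 32 true positives;
# 2.6% of 39 uncomplicated cases -> 1 false positive.
tp, fn = 32, 13   # complications: drop present / absent
fp, tn = 1, 38    # no complications: drop present / absent

sensitivity = tp / (tp + fn)                 # 32/45 ≈ 71.1%
specificity = tn / (tn + fp)                 # 38/39 ≈ 97.4%
ppv = tp / (tp + fp)                         # 32/33 ≈ 97.0%
npv = tn / (tn + fn)                         # 38/51 ≈ 74.5%
accuracy = (tp + tn) / (tp + fn + fp + tn)   # 70/84 ≈ 83.3%
```

All five values match the figures reported in the abstract, which supports the internal consistency of the stated results.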
Research Article
Open Access
A Prospective Study on the Pharmacotherapy of Bronchial Asthma in Paediatric Patients and Emphasis on Adverse Drug Reactions at a Tertiary Care Hospital
Suvarna Chavan,
Tushar Chavan,
G Sreedhar
Pages 468 - 471

View PDF
Abstract
Background: Bronchial asthma is one of the most prevalent chronic respiratory disorders affecting children globally. Effective pharmacotherapy is crucial in minimizing symptoms, improving quality of life, and preventing exacerbations. However, adverse drug reactions (ADRs) associated with asthma medications, especially in the pediatric population, pose a significant clinical challenge. Through systematic monitoring of prescribed medications and vigilant documentation of any adverse events, this study intends to provide insights into drug safety, optimize therapeutic regimens, and enhance overall asthma care in pediatric populations. Materials and Methods: This prospective observational study was conducted over a period of 12 months in the Departments of Pediatrics and Pharmacology of a tertiary care hospital. Children aged 1–14 years diagnosed with bronchial asthma and receiving pharmacological treatment were included. Patients were followed throughout their hospital stay or during monthly outpatient visits for up to three months. During each visit, adherence, drug tolerance, and the emergence of new ADRs were assessed. Specific emphasis was placed on detecting ADRs related to long-term corticosteroid use (e.g., oral candidiasis, growth suppression) and β2-agonists (e.g., tachycardia, tremors). Results: Of the 150 pediatric patients enrolled, the majority were male (63.3%), and most cases fell in the 7–10 years age group (30%). A total of 47 ADRs were reported, with salbutamol being the most common offending agent. SABA (especially salbutamol) was the most frequently prescribed class (93.3%), followed by inhaled corticosteroids. Most ADRs were mild to moderate (93.6%), indicating manageable side effects, and most were classified as "Probable" (42.5%) or "Possible" (38.3%), indicating a moderate level of evidence linking the ADR to the drug.
Only 10.6% were confirmed as “Certain,” emphasizing the diagnostic challenges in pediatric ADR surveillance. Conclusion: Pharmacotherapy in pediatric asthma is effective but not without risks. Active monitoring for ADRs, especially in long-term corticosteroid and β2-agonist use, is imperative. Patient and caregiver education, proper inhalation technique, and routine follow-ups are vital components of asthma management in children.
Research Article
Open Access
A Study of Serum Electrolytes and Uric Acid Among Psoriasis Patients in ACSR Government Hospital of SPSR Nellore District: A Case-Control Study
Pasupurekula Laxmi,
P. Pullaiah,
M. Prasanth,
K. Madhavi,
G. Sarvari
Pages 956 - 959

View PDF
Abstract
Background: Psoriasis is a chronic inflammatory skin disorder associated with systemic metabolic changes. Serum electrolytes and uric acid levels have been implicated in the pathophysiology of psoriasis. This study aims to evaluate and compare serum electrolyte levels (sodium, potassium) and uric acid levels in psoriasis patients and healthy controls, and to correlate serum electrolytes with uric acid in psoriasis patients. Materials and Methods: A case-control study was conducted at ACSR Government Medical College, Nellore, including 30 psoriasis patients and 30 age- and sex-matched healthy controls. Exclusion criteria were chronic alcoholism or smoking, hypertension, diabetes, a personal or family history of metabolic disease, use of oral contraceptives or any other medication, pregnancy, and postmenopausal status. Serum sodium and potassium were analysed on an ISE electrolyte analyser, and uric acid was measured on a semi-auto analyser. Results: Mean serum sodium was higher in cases (146.8 ± 4.5) than in controls (138.6 ± 3.8), and mean serum potassium was higher in cases (5.2 ± 0.5) than in controls (4.1 ± 0.4). Mean serum uric acid was also higher in psoriasis patients (6.8 ± 1.4) than in controls (5.2 ± 1.1), with p < 0.0001, considered highly significant, and serum electrolytes showed a positive correlation with uric acid (p < 0.0001, highly significant). Conclusion: Increased serum sodium and potassium levels may reflect systemic inflammation, increased epidermal turnover, and metabolic disturbances (hyperuricemia) in psoriasis patients. These findings emphasize the need for regular biochemical monitoring to identify potential complications early.
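The group comparisons in the psoriasis abstract above report only means, SDs, and a p-value. A Welch two-sample t statistic can be recomputed from those summary statistics as an illustrative consistency check; the paper's exact test is not stated, so this is an assumption, not the authors' method:

```python
from math import sqrt

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch two-sample t statistic from group means, SDs, and sizes:
    t = (m1 - m2) / sqrt(s1^2/n1 + s2^2/n2)."""
    return (m1 - m2) / sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)

# Reported summary statistics (cases vs controls, n = 30 each)
t_uric = welch_t(6.8, 1.4, 30, 5.2, 1.1, 30)        # uric acid, t ≈ 4.92
t_sodium = welch_t(146.8, 4.5, 30, 138.6, 3.8, 30)  # sodium, t ≈ 7.63
```

Both statistics are far into the tail of a t distribution with ~55 degrees of freedom, consistent with the reported p < 0.0001.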
Research Article
Open Access
Idiopathic Left Fascicular Ventricular Tachycardia in Pregnancy with Newly Diagnosed Bicuspid Aortic Valve, Moderate Aortic Regurgitation, and Mitral Valve Prolapse – Successful Management with Verapamil and Postpartum Ablation
Ankita C Vaghani,
Chetankumar Vaghani,
Nilesh Parshottam,
Bhavesh Talaviya
Pages 489 - 495

View PDF
Abstract
Background: Idiopathic left fascicular ventricular tachycardia (ILFVT) is an uncommon, verapamil-sensitive arrhythmia that typically occurs in structurally normal hearts. Its presentation during pregnancy is rare, and management becomes more complex when accompanied by underlying congenital valvular abnormalities such as bicuspid aortic valve (BAV), aortic regurgitation (AR), and mitral valve prolapse (MVP). Prompt recognition and individualized multidisciplinary care are critical to ensure optimal maternal and fetal outcomes. Case Summary: We present the case of a 26-year-old primigravida at 32 weeks gestation who presented with palpitations and dizziness. Electrocardiogram revealed regular monomorphic ventricular tachycardia with right bundle branch block morphology and right axis deviation, consistent with ILFVT. Echocardiography incidentally revealed a bicuspid aortic valve with moderate aortic regurgitation and mitral valve prolapse. Initial pharmacological therapy with verapamil, amiodarone, and beta-blockers was ineffective; however, transcutaneous pacing successfully terminated the arrhythmia, followed by a positive response to oral verapamil. The patient was managed medically throughout pregnancy, with continuous fetal monitoring and serial echocardiography. At 38 weeks, she underwent elective cesarean section. Postpartum electrophysiological study confirmed reentrant ILFVT localized to the left posterior fascicle. Successful radiofrequency ablation was performed using CARTO 3D mapping with irrigated-tip catheter technology. She remained arrhythmia-free during a subsequent pregnancy and at three-year follow-up. Conclusion: This case highlights the rare occurrence of ILFVT in pregnancy complicated by valvular abnormalities. Transcutaneous pacing and verapamil can be effective in acute management. Definitive ablation postpartum offers excellent long-term outcomes. 
Early recognition, multidisciplinary coordination, and timing of intervention are key to successful management in such complex scenarios.
Research Article
Open Access
A Study on The Changes of Sodium and Potassium Level in CKD (ESRD) Patients After Hemodialysis in Tertiary Hospital JLNMCH, Bhagalpur Bihar
Kumar Abhisek,
Varsha Sinha,
Rolly Bharty
Pages 515 - 524

Abstract
Background: Electrolyte imbalance is a common and critical complication in patients with end-stage renal disease (ESRD), particularly concerning sodium and potassium levels. Hemodialysis serves to correct these imbalances and improve clinical outcomes. Aim and Objective: To assess the pre- and post-hemodialysis levels of sodium and potassium among ESRD patients and analyze additional biochemical parameters including urea, creatinine, calcium, and phosphorus in a tertiary care setting. Material and Methods: This was a cross-sectional study carried out in the Department of Biochemistry over a period of 24 months. A total of 101 ESRD patients undergoing routine hemodialysis were enrolled. Pre- and post-dialysis sodium and potassium levels were recorded. Associated parameters such as urea, creatinine, calcium, and phosphorus were analyzed. Statistical analysis was done using SPSS version 26.0, employing t-tests and chi-square tests. Results: A significant decrease in potassium levels was observed post-dialysis (mean pre: 5.58±0.66 vs. post: 4.23±0.63, p < 0.001), while sodium levels showed a non-significant change (mean pre: 145.67±7.68 vs. post: 144.98±6.48, p = 0.140). Significant alterations were also found in serum urea, creatinine, calcium, and phosphorus. Conclusion: Hemodialysis significantly reduces hyperkalemia in ESRD patients but does not produce a statistically significant change in sodium levels. Continued biochemical monitoring is essential to mitigate complications and optimize treatment.
Research Article
Open Access
Lipid Profile and Atherogenic Indices in Children with Transfusion-Dependent Beta-Thalassemia: A Comparative Study
Harshita C Shekar,
Madhura K L,
Harish H N,
Bharat Kumar G N,
Shivani B Shrigiri
Pages 564 - 569

Abstract
Background: Beta-thalassemia major (B-TM) is a hereditary hemoglobinopathy requiring lifelong transfusions, predisposing patients to iron overload and metabolic disturbances. Emerging evidence links B-TM to early vascular dysfunction and atherosclerosis, traditionally considered age-related. Altered lipid profiles and atherogenic indices may serve as early cardiovascular risk markers, yet comprehensive studies in pediatric populations remain limited. This study aims to assess lipid abnormalities and cardiovascular risk markers in transfusion-dependent B-TM children, enabling early intervention to mitigate long-term atherosclerotic complications and improve risk stratification in this vulnerable group. Methods: This prospective observational study was conducted at Hassan Institute of Medical Sciences (HIMS), Hassan, over five months (May–September 2023), involving 70 pediatric participants (35 beta-thalassemia major, 35 healthy controls). Eligibility criteria included children aged 2–18 years with transfusion-dependent beta-thalassemia receiving at least eight transfusions annually. Demographic, clinical, and biochemical parameters (CBC, lipid profile, serum ferritin, CRP, liver enzymes, blood glucose) were assessed. Results: Children with transfusion-dependent beta-thalassemia presented with significant hematological abnormalities, including lower hemoglobin (8.06 g/dL vs. 11.37 g/dL, p < 0.001) and severe iron overload (serum ferritin: 2080.70 ng/mL vs. 128.03 ng/mL, p < 0.001). They also showed altered lipid metabolism, with higher total cholesterol, LDL, triglycerides, and VLDL, and lower HDL, contributing to increased cardiovascular risk. Atherogenic indices (TG/HDL ratio, Castelli’s Risk Index, AIP) were significantly elevated, indicating a greater propensity for premature cardiovascular complications. Liver dysfunction markers were also raised.
These findings underscore the need for regular monitoring, early interventions, and optimized management strategies to mitigate long-term complications in beta-thalassemia patients. Conclusion: Children with transfusion-dependent B-TM exhibit significant lipid profile abnormalities with higher LDL, triglycerides, and atherogenic indices, and lower HDL, increasing cardiovascular risk. These findings justify the need for regular lipid monitoring and early interventions to mitigate long-term cardiovascular complications in affected children.
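The atherogenic indices named in the abstract above (TG/HDL ratio, Castelli's Risk Indices, and the atherogenic index of plasma, AIP) are standard ratios computed from a fasting lipid profile. A minimal illustrative sketch follows; the function name is our own, and the unit-conversion constants are the usual mg/dL-to-mmol/L factors (triglycerides ÷ 88.57, cholesterol ÷ 38.67), since AIP is conventionally defined on molar concentrations:

```python
import math

def atherogenic_indices(tc, ldl, hdl, tg):
    """Common atherogenic indices from a fasting lipid profile.

    Inputs tc (total cholesterol), ldl, hdl, and tg (triglycerides)
    are in mg/dL. AIP uses molar units, so TG and HDL are converted
    to mmol/L before taking the log ratio.
    """
    tg_mmol = tg / 88.57    # triglycerides: mg/dL -> mmol/L
    hdl_mmol = hdl / 38.67  # cholesterol:   mg/dL -> mmol/L
    return {
        "TG/HDL": tg / hdl,                    # TG-to-HDL ratio
        "CRI-I": tc / hdl,                     # Castelli's Risk Index I
        "CRI-II": ldl / hdl,                   # Castelli's Risk Index II
        "AIP": math.log10(tg_mmol / hdl_mmol), # atherogenic index of plasma
    }
```

For example, a profile of TC 200, LDL 130, HDL 40, TG 150 mg/dL yields CRI-I = 5.0 and a positive AIP, both in the elevated-risk direction.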
Research Article
Open Access
Study Of Serum Lipid Profile and Renal Dysfunction in Patients with Heart Failure at Tertiary Care Hospital, Gujarat
Lakavath Vijay Kumar,
Meenakshi R Shah,
Aniket Kumar Shankar Bhai Ganvit,
Harsh Patel,
Kaushik Kumar R Damor
Pages 581 - 586

Abstract
Background: Heart failure is often complicated by renal dysfunction and lipid abnormalities, conditions that frequently share underlying risk factors such as hypertension, diabetes mellitus, obesity, and advancing age. Dyslipidemia, characterized by abnormal levels of cholesterol and triglycerides, is prevalent in patients with heart failure owing to these shared risk factors and the metabolic syndrome. Objective: This study aimed to assess the relationship between serum lipid profiles and renal dysfunction in individuals newly diagnosed with heart failure at a tertiary care hospital in Gujarat. Methods: A total of 168 patients diagnosed with heart failure were enrolled in a cross-sectional study conducted at GMERS Medical College and Hospital, Gotri, Vadodara. Renal function was evaluated using the Cockcroft-Gault equation, while lipid abnormalities were identified based on established clinical criteria. Data analysis was performed using SPSS software, with a p-value of less than 0.05 considered statistically significant. Results: Among the participants, 47.02% were found to have renal dysfunction, and 51.79% had dyslipidemia. Notably, 72.15% of those with renal impairment also exhibited lipid abnormalities. A statistically significant association was observed between renal dysfunction and dyslipidemia (χ² = 24.77; p < 0.00001). Common comorbid conditions included hypertension and diabetes, and most patients were classified as overweight or obese. Conclusion: The study findings suggest a strong association between renal dysfunction and dyslipidemia in heart failure patients. Monitoring lipid profiles may play a critical role in identifying patients at greater risk for renal complications, potentially guiding more targeted interventions.
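The Cockcroft-Gault equation used in the abstract above to grade renal function is a standard bedside formula: CrCl (mL/min) = (140 − age) × weight(kg) / (72 × serum creatinine in mg/dL), multiplied by 0.85 for women. A minimal sketch (the function name is our own, not from the study):

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, is_female):
    """Estimated creatinine clearance (mL/min) by Cockcroft-Gault.

    CrCl = (140 - age) * weight / (72 * SCr), with a 0.85
    correction factor applied for female patients.
    """
    crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if is_female else crcl
```

For a 40-year-old, 72 kg man with serum creatinine 1.0 mg/dL this gives exactly 100 mL/min, a convenient sanity check.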
Research Article
Open Access
Correlation of Absolute Eosinophil Count with Severity of Respiratory Allergy: A Cross-Sectional Study
Vivek Arora,
Omkar Mishra,
Anurag Shukla
Pages 800 - 804

Abstract
Introduction: Respiratory allergies such as allergic rhinitis and bronchial asthma are major global health burdens, with rising prevalence, particularly in developing countries. Eosinophils play a pivotal role in the pathophysiology of allergic diseases, and Absolute Eosinophil Count (AEC) is considered a potential biomarker reflecting disease activity and severity. Objectives: To evaluate the correlation between Absolute Eosinophil Count and the clinical severity of respiratory allergic conditions. Materials and Methods: This cross-sectional observational study was conducted over 1.5 years at RKDF Medical College and Maharishi Devraha Baba Autonomous State Medical College. A total of 200 patients clinically diagnosed with respiratory allergies (allergic rhinitis and bronchial asthma) were included. AEC was measured and compared across mild, moderate, and severe grades of disease severity. Statistical analyses including correlation coefficients were applied. Results: The mean AEC values were significantly higher in patients with severe allergic symptoms (mean AEC 650 ± 82 cells/µL) compared to those with moderate (410 ± 65 cells/µL) and mild (270 ± 49 cells/µL) symptoms (p < 0.001). A strong positive correlation (r = 0.81) was observed between AEC and clinical severity score. Additionally, asthmatic patients demonstrated higher AEC values than those with isolated allergic rhinitis. Conclusion: AEC is significantly correlated with the clinical severity of respiratory allergy, particularly bronchial asthma. It serves as a simple, cost-effective, and readily available marker that may aid in disease monitoring and stratification of patients for appropriate therapeutic intervention.
Research Article
Open Access
Feto-Maternal Outcomes in Intrahepatic Cholestasis of Pregnancy in A State Teaching Hospital
Ashis Kumar Mukhopadhyay,
Maya Mukhopadhyay,
Nigar Anjum
Pages 862 - 866

Abstract
Introduction: Intrahepatic cholestasis of pregnancy is the most common pregnancy-related liver disorder and is unique to pregnancy. It is characterised by pruritus with onset in the 2nd or 3rd trimester of pregnancy without skin rash, elevated serum aminotransferases and bile acid levels, and spontaneous relief of signs and symptoms within 2-3 weeks after delivery. Aims: To find out the adverse effects of intrahepatic cholestasis of pregnancy on feto-maternal outcomes. Materials & Methods: The present study was a prospective observational study conducted over 1½ years (March 2018 to July 2019) at the department of obstetrics and gynaecology, Chittaranjan Seva Sadan College of Obstetrics & Gynecology and Child Health. A total of 100 patients were included in this study. Result: At 1 minute, 20% of newborns had an APGAR score <6, while 80% had scores ≥6, indicating good initial adaptation. The occurrence of low APGAR scores was relatively limited but clinically significant. The result was statistically significant with a P value < 0.0001, suggesting a non-random distribution. At 5 minutes, 8.5% of newborns had an APGAR score <6, while 91.5% showed improved scores ≥6, indicating better postnatal recovery. Though fewer newborns had low scores at this stage, it remains clinically relevant. The distribution was statistically significant with a P value < 0.0001. Conclusion: Intrahepatic cholestasis of pregnancy (ICP) significantly impacts feto-maternal outcomes, predominantly affecting women in their prime reproductive age. This study highlights that ICP is associated with increased risks of adverse fetal events, including low birth weight, low APGAR scores, and a higher rate of NICU admissions, emphasizing the need for vigilant fetal monitoring.
Research Article
Open Access
A Study of Etiological and Clinical Profile of Gastro-Intestinal Perforation in Tribal Population in Central India.
Megh Malhar Nagori,
Prasad P. Upganlawar,
Vidhey Tirpude,
Sandeep Ambedkar,
Swapnil Rangari
Pages 112 - 117

Abstract
Purpose: This study was conducted to explore the etiological and clinical profile of GI perforation in a tertiary care center located in a tribal region of Central India. Methods: A prospective observational study was conducted at the Department of Surgery in a tertiary hospital in Central India between January 2024 and December 2024 on patients who presented with GI perforation. Patients underwent detailed clinical evaluation, relevant investigations, surgical intervention, and postoperative monitoring. Data were analyzed using chi-square, Fisher’s exact test, and Student’s t-test to determine statistical significance. Results: A total of 105 patients were included in the study. The most affected age group was 21–30 years, and 76.2% of patients were male. Common etiologies included peptic ulcer (60%) and appendicitis (32.4%). The most frequent sites of perforation were the prepyloric and appendicular regions. Conclusion: GI perforations remain a major surgical challenge in tribal populations, with peptic ulcers and appendicitis as the leading causes.
Research Article
Open Access
An Analytical Study of Caesarean Section Based on Modified Robson Ten Group Classification System in A Tertiary Care Centre of West Bengal Near Sundarban Area
Abhishek Rajakumar,
Sanjana Haldar,
Amrita Chatterjee,
Shailasree Sinha,
Deblina Chowdhury
Pages 226 - 230

Abstract
Background: Caesarean section has classically been defined as an operative procedure performed for the delivery of the fetus through a surgical incision made on the abdominal wall and an intact uterine wall after the period of viability. Aims: Analysis of leading groups contributing to the high caesarean section rates in a tertiary care hospital of West Bengal near Sundarbans area based on Modified Robson Ten Group Classification System (MRTGCS). Materials & Methods: The present study was conducted at Diamond Harbour Government Medical College and Hospital, West Bengal, over a period of 18 months, comprising 12 months of data collection followed by 6 months dedicated to data analysis. It was designed as an observational descriptive cross-sectional study, aiming to capture a snapshot of relevant clinical and demographic parameters within the study population at a single point in time. This design facilitated the assessment of prevalence and distribution patterns without manipulating study variables, thereby ensuring a comprehensive and unbiased representation of the existing scenario. Result: In our study, 8 (0.12%) patients with all abnormal lies (including previous Caesarean section but excluding breech) were in spontaneous labour, 3 (0.04%) such patients were in induced labour, and 14 (0.22%) underwent Caesarean section before labour. The association of all abnormal lies (including previous Caesarean section but excluding breech) with group was statistically significant (p<0.0001). Conclusion: Enhanced antenatal counseling, standardized labor management, and adherence to clinical guidelines can help optimize caesarean delivery rates. This classification system proves to be a valuable tool in monitoring, auditing, and guiding clinical practices to improve maternal and neonatal outcomes.
Research Article
Open Access
Prospective Assessment of the Efficacy of Wearable Technology in Postoperative Patient Monitoring and Early Complication Detection
Reegan Jose Mathias,
Karthick Govindarajan,
Najeem Fazil M
Pages 243 - 249

Abstract
Background: Postoperative complications such as surgical site infections, deep vein thrombosis (DVT), and respiratory distress remain a major source of morbidity, prolonged hospitalization, and unplanned ICU admissions. Standard ward monitoring, performed intermittently, may miss early signs of deterioration. Advances in wearable sensor technology enable continuous, multiparameter monitoring, potentially allowing earlier recognition and intervention. Objective: To evaluate the efficacy of continuous wearable health-monitoring devices compared with standard intermittent monitoring in detecting early postoperative complications, and to assess their impact on clinical outcomes. Methods: This single-center, prospective, randomized controlled trial enrolled 200 adult postoperative patients undergoing major surgery. Participants were randomized 1:1 to wearable continuous monitoring (heart rate, respiratory rate, SpO₂, skin temperature, activity) or standard intermittent monitoring every 4–6 hours. The primary outcome was early complication detection rate. Secondary outcomes included time from physiological change to recognition, length of stay, ICU transfers, and 30-day readmissions. Results: Wearable monitoring nearly doubled overall early complication detection compared to standard care (30% vs 13%, p=0.01), with significant increases for surgical site infection (14% vs 7%, p=0.04), DVT (8% vs 3%, p=0.05), and respiratory distress (12% vs 5%, p=0.03). Mean detection was 7–11 hours earlier across complications (p<0.001). Wearable devices detected abnormal heart rate (+14 bpm), respiratory rate (+5 bpm), SpO₂ (-4%), and skin temperature (+1.2°C) several hours before clinical recognition. The wearable group had shorter hospital stays (5.6 ± 1.8 vs 7.4 ± 2.1 days, p=0.002), fewer ICU transfers (6% vs 14%, p=0.04), and fewer readmissions (8% vs 16%, p=0.03). Mortality differences were not statistically significant. 
Conclusions: Continuous wearable monitoring significantly improves early detection of postoperative complications, provides a substantial lead time for intervention, and is associated with reduced ICU transfers, readmissions, and hospital stay.
Research Article
Open Access
Optimizing Outcomes in Pancreaticoduodenectomy: Insights from a Newly Established Hepato-Pancreato-Biliary and Liver Transplant Surgery Unit
Dinesh Kumar Bharti,
Ashutosh Pancholi,
Anupama Nagar
Pages 840 - 846

Abstract
Background: Pancreaticoduodenectomy (PD) is a complex procedure with significant morbidity, particularly in newly established Hepato-Pancreato-Biliary and Liver Transplant (HPB & LT) units. Methods: A retrospective cohort study analyzed 59 patients undergoing PD at a new HPB & LT department. Variables included demographics, diagnosis, intraoperative blood loss, hospital stay, and complications (Clavien-Dindo classification, ISGPS-defined postoperative pancreatic fistula [POPF] and delayed gastric emptying [DGE], post-pancreatectomy hemorrhage [PPH], surgical site infection [SSI] per Southampton Wound Scoring System, intra-abdominal abscess, bile leak, chyle leak). Statistical analyses used non-parametric tests, correlations, and Fisher’s Exact Test (p < 0.05). Results: Mean age was 51.8 ± 14.0 years; 57.6% were male. Periampullary carcinoma predominated (55.9%). PPH occurred in 1 patient (1.7%), POPF in 4 (6.8%), DGE in 22 (37.3%), SSI in 71.2%, intra-abdominal abscess, bile leak, and chyle leak in 3.4% each. Mean hospital stay was 8.1 ± 3.8 days. PPH was associated with a 25-day hospital stay vs. 7.8 ± 3.3 days for no PPH (p = 0.017). DGE (p = 0.001) and POPF (p = 0.045) prolonged hospital stays (9.8 ± 4.7 and 11.8 ± 6.7 vs. 7.8 ± 3.4 and 7.8 ± 3.4 days, respectively). SSI presence (p = 0.002) and severity (ρ = 0.60, p < 0.001) correlated with longer stays. Head of pancreas mass and IPMN trended toward higher complication severity (p = 0.08). Conclusion: SSI and DGE are primary morbidity drivers in PD, with POPF and PPH contributing significantly despite lower incidence. Enhanced infection control and complication monitoring are critical in new HPB units.
Research Article
Open Access
Urinary Albumin Creatinine Ratio in Adults with Sickle Cell Anaemia (Trait and Disease) in Tertiary Care Hospital
Souvik Pramanik,
M Das,
Pratim Gupta
Pages 330 - 332

Abstract
Introduction: Sickle cell anaemia (SCA) is a hereditary haemoglobin disorder characterized by abnormal haemoglobin S (HbS), leading to chronic haemolysis, vaso-occlusion, and a variety of systemic complications. The urinary albumin-to-creatinine ratio (ACR) is a sensitive marker for detecting microalbuminuria, a precursor to overt kidney disease. In this study we determined urinary ACR in adults with both sickle cell disease (SCD) and sickle cell trait (SCT) and compared the values between these two groups. Materials and Methods: In this cross-sectional study, a total of 40 spot urine samples (20 trait and 20 disease) from patients attending the outpatient department (OPD)/inpatient department (IPD) of medicine at Assam Medical College and Hospital were collected over a period of 6 months. Spot urine microalbumin and urine creatinine analyses were done on a Vitros 5600 autoanalyzer. Results: The mean ± SD urinary ACR in patients with SCD (61.04±42.79) was significantly higher (p = 0.0023) than that of patients with SCT (26.33±20.74). Additionally, the urinary ACR was higher in patients aged 31–40 (71.66±50.96) than in those aged 21–30 (41.74±6.83) or ≤20 (19.00±6.73) across all study participants. Conclusion: Urinary ACR serves as a valuable marker for assessing renal dysfunction in adults with sickle cell disease and trait. Since ACR increases with age, it could be a useful monitoring tool in adults with sickle cell-related hemoglobinopathies. Monitoring urinary ACR in such patients may improve the overall management of patients affected by sickle cell-related renal complications.
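The urinary ACR discussed above is a simple ratio: spot urine albumin divided by urine creatinine, conventionally reported in mg of albumin per g of creatinine, with 30–300 mg/g the usual microalbuminuria range. A minimal sketch (function name and input units are our illustrative choices, not from the study):

```python
def urinary_acr(albumin_mg_dl, creatinine_mg_dl):
    """Urinary albumin-to-creatinine ratio in mg/g.

    Spot urine albumin (mg/dL) divided by urine creatinine
    converted from mg/dL to g/dL (1 g = 1000 mg).
    """
    return albumin_mg_dl / (creatinine_mg_dl / 1000.0)
```

For example, urine albumin 3 mg/dL with urine creatinine 100 mg/dL gives an ACR of 30 mg/g, the lower bound of the microalbuminuria range.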
Research Article
Open Access
Correlation of Procalcitonin and C-reactive Protein Levels with Blood Culture Positivity and Cardiovascular Dysfunction in Suspected Sepsis: A Prospective Observational Study in a Tertiary Care Hospital
Harish Thummala,
Usha Rani Vadlamanu,
Leena Chacko,
Ramesh Kandimalla
Pages 1967 - 1974

Abstract
Background: Sepsis is a life-threatening condition with significant cardiovascular implications, including sepsis-induced myocardial dysfunction, arrhythmias, and circulatory collapse. Although blood culture is the gold standard for etiological diagnosis, delayed results limit early targeted intervention. Biomarkers such as procalcitonin (PCT) and C-reactive protein (CRP) have gained prominence for early sepsis diagnosis, but their relationship with both microbiological confirmation and cardiac involvement in septic patients warrants further exploration. Objectives: To evaluate the correlation between serum PCT and CRP levels with blood culture positivity in suspected sepsis patients and to examine their association with cardiovascular dysfunction as assessed by clinical and echocardiographic parameters. Materials and Methods: A prospective observational study was conducted over 12 months in the Departments of Microbiology, Biochemistry, and Cardiology at a tertiary care hospital. Adults (≥18 years) fulfilling Sepsis-3 criteria were enrolled. Blood samples were collected prior to antibiotic administration for culture and measurement of serum PCT (chemiluminescent immunoassay) and CRP (immunoturbidimetry). Cardiac function was assessed using echocardiography and hemodynamic monitoring. Statistical analyses included ROC curve evaluation, correlation coefficients, and multivariate logistic regression. Results: Of the 210 patients, 92 (43.8%) had positive blood cultures. Gram-negative bacilli (56.5%) predominated. Median PCT and CRP levels were significantly higher in culture-positive patients (p < 0.001 and p < 0.01, respectively). Cardiovascular dysfunction, defined as ejection fraction <50% and/or elevated NT-proBNP, was present in 48.9% of cases and showed a strong positive correlation with elevated PCT (r = 0.68) and CRP (r = 0.54) levels. Combined biomarker assessment improved diagnostic accuracy for predicting both culture positivity and cardiac involvement. 
Conclusion: PCT demonstrated superior predictive value over CRP for culture-confirmed sepsis and cardiovascular dysfunction. Incorporating these biomarkers into early sepsis workup, alongside echocardiographic evaluation, can guide timely antimicrobial and hemodynamic interventions.
Research Article
Open Access
Utility of Serial Serum Ferritin and C-Reactive Protein Measurements in Early Detection of Inflammatory Progression in Hospitalized Patients
P.M. Sasikala,
T Rameswari,
S. Uma Maheswari
Pages 497 - 503

Abstract
Introduction: Early detection of inflammatory progression in hospitalized patients is vital for timely intervention. Serum ferritin and C-reactive protein (CRP) are key acute-phase reactants, but the prognostic value of their serial measurements is underexplored. Aim: To assess the utility of serial ferritin and CRP measurements in predicting inflammatory progression during hospitalization. Methods: A prospective observational study was conducted among 73 adult inpatients with inflammatory conditions. Patients were classified into progression (n = 25) and no-progression (n = 48) groups based on clinical outcomes. Serum ferritin and CRP levels were measured on Days 1, 3, and 5. Intergroup comparisons, temporal trends, and independent predictors were analyzed using repeated measures statistics and logistic regression. Results: Baseline ferritin and CRP levels were significantly higher in the progression group (median [IQR]: 418.0 [184.8–549.5] ng/mL and 88.2 [62.2–103.3] mg/L) than in the no-progression group (128.6 [71.8–226.9] ng/mL and 23.1 [16.4–27.9] mg/L; p < 0.001 for both). CRP percent change from Day 1 to Day 5 showed excellent discrimination for progression (AUC 0.996; sensitivity 96%; specificity 100%; cut-off +33.6%). Logistic regression identified baseline CRP (OR 14.45, 95% CI 3.00–69.61, p = 0.001) and ferritin (OR 4.17, 95% CI 1.28–13.58, p = 0.018) as independent predictors. Conclusion: Serial monitoring of ferritin and CRP enhances early detection of inflammatory progression. A >33.6% CRP increase over five days is a strong predictor, warranting closer surveillance and potential intervention.
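The abstract's discriminator is a plain percent-change calculation applied to serial CRP values, flagged against the reported +33.6% cut-off. A minimal sketch (function names are our own; the cut-off value is the one reported in the abstract):

```python
def crp_percent_change(day1_crp, day5_crp):
    """Percent change in CRP from Day 1 to Day 5."""
    return (day5_crp - day1_crp) / day1_crp * 100.0

def flags_progression(day1_crp, day5_crp, cutoff=33.6):
    """Flag inflammatory progression when the Day 1 -> Day 5 CRP
    rise exceeds the study's reported cut-off (+33.6%)."""
    return crp_percent_change(day1_crp, day5_crp) > cutoff
```

So a rise from 50 to 70 mg/L (+40%) would be flagged, while a rise from 50 to 60 mg/L (+20%) would not.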
Research Article
Open Access
Pharmacovigilance Assessment of Chemotherapy-Induced Adverse Effects in Patients with Tuberculosis
Pages 783 - 786

Abstract
Background: Chemotherapy for tuberculosis (TB), though highly effective, is often associated with adverse drug reactions (ADRs) that may impact treatment adherence and outcomes. Pharmacovigilance studies are essential to identify and characterize these reactions in clinical practice. Objectives: To assess the incidence, pattern, causality, severity, and outcomes of chemotherapy-induced ADRs in patients with TB. Methods: This prospective observational study included 100 adult TB patients receiving anti-tubercular chemotherapy. Baseline demographics, clinical characteristics, and regimen details were recorded. ADRs were identified through active monitoring and evaluated for organ system involvement, causality (WHO–UMC criteria), and severity (Modified Hartwig scale). Outcomes were documented at follow-up. Data were analyzed descriptively. Results: The mean age was 42.6 ± 13.2 years, with a male predominance (62%). Pulmonary TB accounted for 72% of cases, and 84% received the standard HRZE regimen. ADRs occurred in 68% of patients, totaling 92 events, with a median onset of 14 days (IQR: 9–21). Multiple ADRs were noted in 19% of patients. Gastrointestinal (26.1%), hepatobiliary (21.7%), cutaneous (15.2%), and neurological (13.0%) systems were most frequently affected. Causality was probable in 47.8%, possible in 40.2%, and certain in 12.0% of cases. Severity grading showed 45.6% mild, 42.4% moderate, and 12.0% severe ADRs. Recovery occurred in 91.3%, residual effects in 6.5%, and mortality in 1.1%. Conclusions: ADRs to TB chemotherapy are common, predominantly gastrointestinal and hepatobiliary, and require early detection and management to optimize adherence and outcomes. Strengthened pharmacovigilance systems are vital in TB control programs.
Research Article
Open Access
Role of Echocardiography in Diagnosis and Management of Cardiovascular Emergencies in the ICU
Ahmed H. Awad,
Ahmed M. Abdelbaky,
Wael G. Elmasry
Pages 529 - 535

Abstract
Background: Echocardiography (Echo.), through transthoracic (TTE) and transoesophageal (TEE) approaches, is a pivotal bedside tool in the intensive care unit (ICU) for rapid diagnosis and monitoring of cardiovascular emergencies. Its ability to provide real-time anatomical and functional assessment supports early recognition of life-threatening conditions and timely therapeutic interventions. Methods: A qualitative secondary data analysis was conducted using peer-reviewed literature, clinical guidelines, and retrospective case reports on ICU or high-dependency patients (≥18 years) evaluated with Echo. for suspected myocardial infarction. Data on echocardiographic findings, clinical presentation, and management outcomes were synthesised to assess diagnostic yield, therapeutic impact, and detection of post-MI complications. Results: Echo. identified key pathologies including LV systolic dysfunction, segmental wall motion abnormalities, acute valvular lesions, pulmonary embolism, and pericardial tamponade. ACS evaluation frequently revealed LAD territory hypokinesia, prompting urgent interventions. Clinical management changed in 51.2% of studies, with immediate changes in 41.5%. Interventions included fluid therapy adjustment, vasoactive drug titration, urgent cardiac procedures, and pericardiocentesis. No procedure-related complications occurred. Conclusion: Echo. enables early diagnosis of MI in ICU patients by detecting SWMAs and mechanical complications before clinical or biochemical confirmation. Its real-time integration into decision-making significantly influences treatment strategies, improving timeliness and precision of care.
Research Article
Open Access
A Case-Control Study on the Influence of Ketogenic Diet on Immunity in Central Indian Subjects
Chelikam Rohini,
Ashutosh Jain
Pages 1093 - 1097

Abstract
Background: The ketogenic diet (KD), characterized by high-fat, moderate-protein, and very low-carbohydrate intake, has garnered widespread attention for its therapeutic potential in various clinical conditions. Traditionally used in the management of refractory epilepsy, KD has evolved to demonstrate beneficial effects in metabolic disorders such as obesity and type 2 diabetes, and even in neurological conditions like Alzheimer's disease. The core mechanism of KD involves shifting the body's primary energy source from glucose to ketone bodies, resulting in altered metabolic and physiological processes. Materials and Methods: This case-control study was conducted in the Department of Physiology at Index Medical College. Data were collected from consenting participants attending the outpatient departments (OPD) of General Medicine and Physiology at Index Medical College and Hospital from January 2023 to December 2024. Participants were recruited after meeting inclusion criteria and providing informed consent. Participants in the KD group followed a monitored KD consisting of <10% carbohydrates, ~70% fats, and ~20% proteins, while controls continued a balanced Indian diet based on standard dietary recommendations. Results: The KD group demonstrated substantial reductions in CRP (−0.8 mg/L), IL-6 (−0.9 pg/mL), TNF-α (−1.3 pg/mL), and fecal calprotectin (~15% decline). These improvements are in agreement with several clinical and preclinical studies. Conclusion: While the anti-inflammatory and motility effects may hold clinical promise, caution is warranted regarding microbiota diversity and distal transit changes. Personalization and careful monitoring should guide KD implementation for gastrointestinal and immunological health optimization.
Research Article
Open Access
Mucocutaneous Manifestations of Human Immunodeficiency Virus Infection in Children
Rik Goswami,
Saswati Halder,
Projna Biswas
Pages 580 - 584

Abstract
Introduction: Human Immunodeficiency Virus (HIV) infection in children is associated with a broad spectrum of mucocutaneous manifestations, which are often the first clinical indicators of underlying immunodeficiency. These manifestations range from common infections to neoplastic and inflammatory conditions and significantly affect quality of life, morbidity, and sometimes mortality in pediatric patients. Early recognition of these skin and mucous membrane lesions is crucial for timely diagnosis, initiation of antiretroviral therapy (ART), and prevention of further complications. Methods: This prospective study was conducted over a period of one year at the Calcutta School of Tropical Medicine. A total of 60 patients with confirmed Human Immunodeficiency Virus (HIV) infection presenting with mucocutaneous manifestations were included. Relevant demographic and clinical data, including age, gender, and body mass index (BMI), were recorded. Detailed evaluation of associated endocrine disorders, including thyroid, adrenal, pituitary disorders, and polycystic ovarian syndrome (PCOS), was performed. Comprehensive dermatological examination was carried out to document the type of skin manifestations and specific skin features. All data were systematically collected, compiled, and analyzed to assess the pattern and association of mucocutaneous lesions with clinical and laboratory parameters. Results: In this study of 60 HIV-infected children, most were aged 6–10 years, with no significant age or gender differences. Seventy percent were on ART, and shorter ART duration was significantly associated with higher lesion prevalence. Oral candidiasis was the most common mucocutaneous manifestation, followed by seborrheic dermatitis, bacterial infections, molluscum contagiosum, and herpes zoster. Lower CD4 counts and shorter ART duration were significantly linked to higher rates of oral candidiasis and other lesions. 
Conclusion: Mucocutaneous manifestations are highly prevalent in children with HIV infection and often reflect the degree of immunosuppression. Recognizing these lesions can aid in early diagnosis, monitoring disease progression, and guiding timely initiation of antiretroviral therapy. Pediatricians and dermatologists should maintain a high index of suspicion for HIV in children presenting with recurrent or atypical mucocutaneous lesions.
Research Article
Open Access
A Comparative Study to Assess the Efficacy of Antenatal Corticosteroids for Women at Risk of Late Preterm Delivery
Swapna Lakhotia,
Anusree Aich,
Kajal Kumar Patra
Pages 617 - 623

View PDF
Abstract
Background: Late preterm infants (34⁰–36⁶ weeks) are at increased risk of respiratory morbidity and other complications compared to term neonates. Antenatal corticosteroids have proven efficacy in improving neonatal outcomes in early preterm births, but their role in late preterm deliveries remains an area of active investigation. Objective: To evaluate the impact of antenatal corticosteroid administration on neonatal and maternal outcomes in women at risk of late preterm delivery. Methods: This prospective observational study included 120 women with late preterm pregnancies, divided into two groups: those who received a complete course of dexamethasone (n = 60) and those who did not (n = 60). Maternal and neonatal characteristics were recorded, and outcomes were compared using appropriate statistical tests. Results: Neonates exposed to corticosteroids required significantly less respiratory support (11.7% vs. 18.3%, p = 0.0022) and had fewer NICU admissions (11.7% vs. 20.0%, p = 0.018) compared with the non-steroid group. However, the incidence of neonatal hypoglycemia was higher in the corticosteroid group (23.3% vs. 15.0%, p = 0.0048). No significant differences were observed in neonatal resuscitation, surfactant use, APGAR scores, sepsis, or maternal complications such as chorioamnionitis. Mode of delivery was unaffected by corticosteroid use. Conclusion: Antenatal corticosteroid administration in late preterm pregnancies significantly reduces neonatal respiratory morbidity and NICU admissions but increases the risk of hypoglycemia. The therapy appears safe for mothers and should be considered in clinical practice, with appropriate neonatal glucose monitoring.
Research Article
Open Access
Efficacy of Freshly Collected Human Amniotic Membrane Dressings in Non-Diabetic Chronic Burn Wounds: An Integrated Analysis of Biochemical Mediators, Cytokine Profiles and Histopathology
Indranil Roy,
Raj Gupta,
Mollinath Mukherjee,
Nirmal Polle,
Prof. Dr. Niranjan Bhattacharya
Pages 631 - 638

View PDF
Abstract
Background - Chronic burn wounds are a worldwide challenge, causing morbidity and mortality and imposing a substantial socio-economic burden. Conventional dressings for chronic burn wounds aim to reduce fluid exudation and electrolyte loss but lack any cell therapy-based supportive function. In this scenario, freshly collected human amniotic membrane (HAM), which contains progenitor cells, cytokines, and growth factors, may enhance regenerative capacity. Objective - To evaluate the sequential histological, biochemical, and cytokine changes in chronic burn wounds treated with freshly collected HAM, focusing on markers and mediators of wound healing, and to correlate these findings with the phases of wound healing. Method - Placentas were collected from sero-negative mothers after proper consent. After institutional ethical permission and informed consent, freshly collected amniotic membrane was applied to 10 non-diabetic patients (sample group), while 10 patients in the control group received conventional burn wound dressing. Investigations including biochemical and inflammatory marker studies, cytokine profiling, and histology were conducted to study the healing outcomes. Pain and wound closure assessment, wound infection (if any), wound size measurement, and duration of epithelialization were routinely monitored. Result - Dressing with freshly collected amniotic membrane showed a decrease in wound size, rapid epithelialization, lowering of pain scores, and rare graft rejection. Histology demonstrated a clear transition from inflammation-rich granulation tissue without epidermal covering, to early/mid proliferation with active angiogenesis and fibroblast activity, late proliferation with organized collagen synthesis, and finally early remodeling with complete epithelialization.
Serum albumin increased from 36→43.5 g/L and wound fluid albumin from 16→26 g/L; total protein rose in serum from 66.5→79.8 g/L and in wound fluid from 30→47 g/L. Wound fluid LDH (lactate dehydrogenase) decreased from 3820→2650 U/L, indicating reduced local tissue injury. TGF-β1 (transforming growth factor beta-1) peaked mid-phase, supporting fibroblast proliferation, while IL-1β (interleukin-1 beta) declined markedly by three months, reflecting resolution of inflammation. Overall, freshly collected amniotic membrane, considered hypo-antigenic, shows great potential to improve chronic burn wounds through regeneration with minimal scarring. Conclusion - Sequential histology, combined with biochemical and cytokine markers, provided objective evidence of progressive healing in chronic burn wounds treated with HAM. The observed modulation of inflammatory and regenerative pathways supports HAM’s role as an effective, low-cost, and biologically active dressing, particularly beneficial in resource-limited settings. This integrated assessment approach may serve as a valuable model for monitoring chronic wound healing.
Research Article
Open Access
Closed-Loop Anaesthesia Delivery Systems and Their Impact on Intraoperative Haemodynamic Stability in High-Risk Surgical Patients
Tanmay Kuvarsinh Varma,
Bharatkumar Mansinhbhai Chaudhari,
Maitri Harshadkumar Vashist
Pages 655 - 658

View PDF
Abstract
Background: High-risk surgical patients frequently experience intraoperative haemodynamic fluctuations, which increase perioperative morbidity and mortality. Closed-loop anaesthesia delivery systems (CLADS), which automatically titrate anaesthetic drugs based on real-time feedback, are increasingly studied as tools to improve haemodynamic stability compared to conventional manual administration. However, evidence in high-risk populations remains limited. Materials and Methods: This prospective, randomized controlled trial included 120 high-risk surgical patients (ASA III–IV) undergoing major abdominal and thoracic procedures. Patients were divided into two groups: Group A (n=60) managed with CLADS, and Group B (n=60) managed with manual anaesthesia delivery. Haemodynamic parameters (mean arterial pressure [MAP], heart rate [HR]), intraoperative hypotensive episodes, total vasopressor requirement, anaesthetic drug consumption, and recovery times were recorded. Data were analyzed using Student’s t-test and chi-square test, with p<0.05 considered statistically significant. Results: Patients in Group A had significantly fewer episodes of hypotension (2.1 ± 1.4 vs. 5.7 ± 2.3 per case; p<0.001) and reduced vasopressor use (12.6 ± 4.3 mg vs. 24.2 ± 6.5 mg; p<0.01) compared to Group B. Mean MAP deviations from baseline were smaller in Group A (8.5 ± 3.2 mmHg) versus Group B (15.4 ± 5.8 mmHg; p<0.001). Anaesthetic drug consumption was lower in CLADS patients (propofol: 540 ± 110 mg vs. 670 ± 150 mg; p=0.02). Recovery times were shorter in Group A (12.4 ± 4.1 min vs. 18.7 ± 6.3 min; p=0.03). No major adverse events were noted. Conclusion: Closed-loop anaesthesia delivery systems significantly improved intraoperative haemodynamic stability, reduced vasopressor requirements, and shortened recovery time in high-risk surgical patients compared to conventional manual techniques. 
These findings support wider clinical adoption of CLADS to optimize outcomes in vulnerable patient populations.
Research Article
Open Access
The Clinical Significance of ABO Blood Groups in Dengue Fever: A Prospective Analysis
B Satish Babu,
B Magesh,
G M Prakash,
Chinmay R Hosamani
Pages 691 - 695

View PDF
Abstract
Introduction: Dengue fever, a mosquito-borne viral illness, ranges in severity from dengue fever (DF) to dengue haemorrhagic fever (DHF) and dengue shock syndrome (DSS). This prospective study examines 80 dengue cases to explore associations between demographic factors, laboratory parameters, clinical severity, and outcomes. Materials and Methods: Eighty patients with confirmed dengue (NS1 and/or IgM positive) were prospectively enrolled at a hospital. Data included age, gender, ABO blood group, fever duration, platelet count, haematocrit, severity, bleeding, hospital stay, and outcome. Severity was classified per WHO guidelines. Comparative analysis evaluated links between severity, bleeding, platelet count, haematocrit, and blood group. Trends in laboratory markers were assessed across severity groups. Results: Of 80 patients (53.75% male, 46.25% female, mean age 34.4 years), 47 had DF, 23 had DHF, and 10 had DSS. DSS cases had the lowest platelet counts (mean 28.8x10³/µL) and the highest haematocrit (mean 51.4%). Bleeding occurred in all DHF and DSS cases but in none with DF. DSS required the longest hospital stays (mean 11.9 days). Blood group distribution was O (37.5%), A (28.8%), B (22.5%), and AB (11.2%), with greater severity associated with blood group O (DSS: 40% O, 30% A, 20% B, 10% AB). One DSS patient (an 85-year-old female) died of coronary heart disease; the others recovered. Conclusion: Low platelet counts, high haematocrit, and bleeding strongly predicted dengue severity. Blood group O was associated with greater severity. Prospective monitoring of key markers can guide early interventions, reducing severe outcomes, including mortality tied to comorbidities such as coronary heart disease.
Research Article
Open Access
Evaluation of Two Doses of Intrathecal Clonidine as An Adjunct to Hyperbaric Ropivacaine in Orthopaedic Surgeries of Lower Limb
Manu Singh,
Sumeet Rajshekhar,
M Sarath Chandra,
Ashok Pal Paikra
Pages 796 - 799

View PDF
Abstract
Background: Adjuvant agents are commonly added to local anesthetic agents in subarachnoid block to prolong both analgesia and the duration of anesthesia. Aim: The present study aimed to compare the efficacy of two doses (30 µg and 50 µg) of intrathecal clonidine as an adjuvant to hyperbaric ropivacaine in subjects undergoing lower limb surgeries. Methods: The study assessed 180 subjects who were randomly divided into two groups of 90 each. Group I received 3 mL of 0.75% hyperbaric ropivacaine with 30 µg clonidine, diluted with normal saline to a total volume of 3.5 mL. Group II received 3 mL of 0.75% hyperbaric ropivacaine with 50 µg clonidine, diluted with normal saline to a total volume of 3.5 mL. The study compared hemodynamic changes, complications, side effects, sensory and motor blockade, and analgesia duration. Results: With 30 µg clonidine, the time to reach the highest spinal level was longer (12.4±1.24 minutes) than with 50 µg clonidine (11.6±1.4 minutes; p=0.003). Time to two-segment regression was significantly longer with 50 µg clonidine than with 30 µg clonidine (p<0.001). Durations of motor and sensory block were also longer with 50 µg clonidine (p<0.001), as was analgesia duration (p<0.001). A higher incidence of hypotension and bradycardia was seen with 50 µg clonidine; these episodes were managed with standard interventions. Conclusion: The present study concludes that 50 µg of intrathecal clonidine provides greater analgesic effect than 30 µg clonidine but is linked with a higher risk of hypotension and bradycardia. Timely intervention and careful monitoring are vital when clonidine is used at the higher dose.
Research Article
Open Access
Incidence of Venous Thromboembolism in the Cases of Free Flap in Head & Neck Cancer in A Tertiary Institution-Based Study in Eastern India
Poulomi Saha,
Saidul Islam,
Arunabha Sengupta
Pages 812 - 816

View PDF
Abstract
Introduction: Venous thromboembolism (VTE), encompassing both deep vein thrombosis (DVT) and pulmonary embolism (PE), is a significant postoperative complication in patients undergoing major oncologic surgeries, particularly those involving microvascular free flap reconstruction. Head and neck cancer patients are considered at elevated risk due to prolonged operative times, malignancy-associated hypercoagulability, and perioperative immobilization. However, the incidence of VTE in this specific surgical subset within the Indian population remains underreported. Objective: To evaluate the incidence of VTE in patients undergoing free flap reconstruction for head and neck cancer in a tertiary care institution in Eastern India, and to identify potential risk factors associated with its occurrence. Methods: This retrospective observational study was conducted over a period of one year, from January 2024 to January 2025, at the Department of Head & Neck Surgery, IPGMER and SSKM Hospital, Kolkata, West Bengal, India. A total of 50 patients who underwent oral oncologic surgery with simultaneous reconstruction were included in the study. Clinical, surgical, and postoperative data were collected and analyzed to evaluate the incidence and associated risk factors for venous thromboembolism (VTE) in this high-risk surgical population. Results: The comparative analysis between VTE-positive (Group 1) and VTE-negative (Group 2) patients revealed no statistically significant differences in demographic, oncologic, or surgical variables. Although Group 1 showed slightly higher mean age, BMI, and rates of comorbidities, smoking, alcohol use, and prior VTE, none of these reached statistical significance. Tumor site, TNM stage, histological type and grade, as well as prior chemotherapy or radiotherapy, were also comparable between the groups.
Surgical factors—including type of free flap used, duration of surgery, intraoperative blood loss, transfusion requirement, and number of vascular anastomoses—did not differ significantly between the two groups. However, certain postoperative factors showed significant associations with VTE occurrence. The use of DVT prophylaxis was significantly lower in Group 1 (p = 0.002), and postoperative mobility was also reduced (p = 0.035). Additionally, ICU stay was significantly longer in VTE-positive patients (p = 0.029). Although postoperative complications were more frequent in Group 1 and hospital stay was longer, these differences were not statistically significant. The mean time to VTE diagnosis was 6.1 ± 3.2 days postoperatively, underscoring the importance of early monitoring. Conclusion: This institution-based study highlights a notable incidence of VTE (5.9%) in patients undergoing free flap reconstruction for head and neck cancer. The findings underscore the need for vigilant perioperative thromboprophylaxis, early mobilization, and high clinical suspicion for early detection and management of VTE in this high-risk surgical cohort. Tailoring VTE prevention protocols based on individualized risk assessment may further reduce associated morbidity.
Research Article
Open Access
Cardiac Health in the Diabetic Population of India: Awareness of Risk, Preventive Behaviors, and Clinical Outcomes
Deepak Basia,
Maninder Hariya,
Amrita Kulhria
Pages 1182 - 1187

View PDF
Abstract
Background: Individuals with diabetes mellitus (DM) face a two- to four-fold higher risk of cardiovascular disease (CVD) compared to the general population, making cardiac health awareness and preventive practices critical. India, home to over 77 million diabetics, is experiencing a surge in diabetes-related cardiovascular morbidity and mortality. However, data on awareness, preventive behaviors, and emergency preparedness in this group remain limited. Materials and Methods: A descriptive, cross-sectional study was conducted using a structured, validated questionnaire distributed via Google Forms. The survey assessed socio-demographic characteristics, medical history, knowledge of cardiac risks and emergency measures (20-item questionnaire), and adoption of preventive practices. Participants (n = 400) were adults with self-reported diabetes residing in India. Knowledge scores were categorized as Excellent (16–20), Good (12–15), Fair (8–11), and Poor (0–7). Statistical analysis employed chi-square tests and multivariate logistic regression to identify determinants of good knowledge (SPSS v 25; p < 0.05 considered significant). Results: The majority of respondents were aged 45–54 years (34.5%), male (55%), and urban residents (61%). Type 2 diabetes predominated (90.5%), with 67% reporting hypertension and 48.5% dyslipidemia. While 77% recognized diabetes as a major cardiovascular risk factor and 71.5% understood the role of hypertension, only 38.5% knew optimal BP targets and 43% knew HbA1c goals. Awareness of CPR and aspirin use during emergencies was poor (46.5% and 42%, respectively). Preventive behaviors were inconsistent: blood glucose monitoring (93.5%) and medication adherence (84%) were high, but only 42% underwent regular cardiac check-ups and 46% engaged in daily physical activity. Overall, 17% achieved excellent knowledge, while 34.5% scored fair and 18% poor. 
Education (p < 0.001), urban residence (p = 0.002), and occupation (p = 0.008) were significantly associated with higher knowledge levels. Multivariate analysis confirmed education and prior CPR awareness as strong predictors. Conclusion: Cardiac health awareness among Indian diabetics remains suboptimal, with critical gaps in practical knowledge and emergency preparedness. Despite good adherence to basic diabetes management, comprehensive cardiovascular risk reduction strategies are inadequately practiced. Targeted interventions, emphasizing lifestyle modification, structured education, and community-based CPR training, are essential to mitigate the rising burden of diabetes-related cardiovascular complications in India.
Research Article
Open Access
To Evaluate the Safety and Efficacy of Repaglinide Plus Voglibose Combination in T2DM Patients
Pages 835 - 837

View PDF
Abstract
Background: Type 2 diabetes mellitus (T2DM) requires effective glycemic control to prevent complications. Combining agents with complementary mechanisms may improve outcomes. Objective: To evaluate the safety and efficacy of the repaglinide plus voglibose combination in T2DM patients over one year. Methods: A prospective study was conducted involving 180 T2DM patients receiving repaglinide plus voglibose. Glycemic parameters, including fasting blood glucose (FBG), postprandial blood glucose (PPBG), and HbA1c, were measured at baseline and after 12 months. Safety was assessed by monitoring adverse events and hypoglycemic episodes. Results: Significant reductions were observed in mean FBG (156.8 ± 28.5 to 120.5 ± 22.3 mg/dL), PPBG (240.3 ± 35.7 to 170.7 ± 28.1 mg/dL), and HbA1c (8.2 ± 0.7 to 6.9 ± 0.6%) after 12 months (p < 0.001 for all). The combination was well tolerated, with mild hypoglycemia reported in 6.7% of patients and mild gastrointestinal side effects. Conclusion: Repaglinide plus voglibose combination therapy is effective and safe for improving glycemic control in T2DM patients, offering a viable option for optimizing diabetes management.
Research Article
Open Access
A Study of Perinatal Outcome of Meconium-Stained Liquor in Term, Preterm and Postterm Pregnancy in A District Hospital
Deblina Chowdhury,
Sandhya Das,
Ankit Panja ,
Abhishek Rajakumar
Pages 838 - 841

View PDF
Abstract
Introduction: Meconium-stained amniotic fluid (MSAF) is a common obstetric complication associated with increased risk of adverse perinatal outcomes, including meconium aspiration syndrome (MAS), low Apgar scores, neonatal intensive care unit (NICU) admissions, and perinatal morbidity and mortality. The incidence of MSAF increases with gestational age, but it may also occur in preterm pregnancies. Understanding its impact across term, preterm, and postterm pregnancies is essential for timely obstetric interventions. Aims: To evaluate the perinatal outcomes of MSAF in term, preterm, and postterm pregnancies and to determine the association of MSAF with mode of delivery, MAS, Apgar scores, and neonatal morbidity. Materials and Methods: This was a prospective, comparative study conducted over one year, from 1st December 2019 to 30th November 2020, in the Department of Obstetrics and Gynecology at MR Bangur Hospital, Tollygunge, Kolkata, which serves both rural and urban populations of South 24 Parganas district. The study included 108 booked antenatal cases attending the hospital’s antenatal clinic, enrolled according to predefined inclusion and exclusion criteria. Results: In our study, there was no statistically significant difference between cases and controls in terms of maternal age and gravidity (p = 0.546 and p = 0.841, respectively). However, birth weight and fetal heart rate showed significant differences between the two groups. A higher proportion of cases had birth weight <2 kg compared to controls (20.4% vs. 3.7%, p = 0.015). Similarly, fetal heart rate distribution differed significantly, with more cases having heart rate >120/min and fewer cases with heart rate <100/min compared to controls (p = 0.023). Conclusion: MSAF is associated with adverse perinatal outcomes, particularly in postterm pregnancies. Close fetal monitoring, timely obstetric intervention, and preparedness for neonatal resuscitation are crucial to improve neonatal outcomes.
Early recognition and management of MSAF can reduce the risk of MAS and other complications.
Research Article
Open Access
AI-Assisted Diagnosis Patterns in Chronic Illness Management
Anupama Abhilasha Murmu,
Jayakrishnan B,
Bhanupriya Singh,
Angshuman De
Pages 46 - 50

View PDF
Abstract
Background: Artificial intelligence (AI) has become more relevant in healthcare, especially in the management of chronic diseases, where precise diagnosis, long-term monitoring, and individualized interventions are paramount. Despite its potential, questions remain about its integration into clinical practice, its ethical implications, and equitable access. Objective: This review sought to integrate current evidence concerning AI-supported diagnostic trends in chronic disease management, emphasizing technological developments, human aspects, and clinical implications. Methods: A narrative review approach was adopted, with literature sourced from PubMed, Scopus, and Web of Science. Studies published in the past decade were included if they evaluated AI applications in chronic illness diagnosis, decision support, or patient engagement. Data were thematically synthesized into domains of diagnostic accuracy, human–technology interaction, and access to care. Results: Evidence shows that AI models improve diagnostic accuracy across conditions including diabetes, hypertension, and cardiovascular disease, frequently outperforming traditional techniques. Clinical decision support systems enhanced workflow productivity and treatment customization. Conversational agents and remote monitoring devices improved patient engagement, especially in rural and under-resourced environments. Yet provider trust, transparency in systems, and ethical governance remain essential drivers of adoption. Comparative analysis with previous studies across oncology, osteoporosis, and pandemic response further affirmed AI’s cross-domain utility, while underscoring the importance of regulatory and methodological rigor. Conclusion: AI holds substantial promise in transforming chronic illness management, but its effectiveness will depend on transparent design, ethical integration, and alignment with human-centered care values.
Research Article
Open Access
A Silent Heart in a Febrile Storm: Sequential Leptospirosis and Scrub Typhus Unmasking Pediatric Bradycardia
M. Mohnish Darshan,
Rajkumar Kundavaram,
Disha Pandya,
Amber Kumar,
Girish Chandra Bhatt,
Shikha Malik
Pages 1 - 5

View PDF
Abstract
Leptospirosis and scrub typhus, while significant causes of acute febrile illness in endemic regions, rarely lead to cardiac complications in children. This case report presents a unique instance of a 13-year-old previously healthy female who developed hypotension and sinus bradycardia during an episode of leptospirosis, which was successfully treated with doxycycline. However, the weeks following this initial episode saw the persistence of bradycardia, syncope, and left-sided chest pain, leading to a cardiology evaluation. Despite the absence of conduction block in serial ECGs and Holter monitoring, her symptoms continued until a second febrile episode—scrub typhus—again triggered symptomatic bradycardia. The patient responded well to doxycycline and a short course of corticosteroids, with complete resolution of symptoms and normalization of heart rate. This case highlights the potential of sequential tropical infections to precipitate functional sinus node disturbances in children and underscores the importance of including reversible infectious causes in the differential diagnosis of pediatric bradyarrhythmias.
Research Article
Open Access
Assessment of Cardiac Manifestations in Dengue Patients and Their Association with Disease Warning Signs
Pages 220 - 223

View PDF
Abstract
Background: Dengue fever is a common mosquito-borne infection that may present with systemic and organ-specific complications. Cardiac manifestations, particularly electrocardiographic (ECG) changes, are increasingly recognized in patients with dengue. Correlation of these cardiac abnormalities with established warning signs may provide critical prognostic information. Aim: To study cardiac manifestations in patients presenting with dengue infection and to observe electrocardiographic changes, with special emphasis on correlation with warning signs of dengue. Material and Methods: This observational study included 120 patients with laboratory-confirmed dengue infection. Baseline clinical data, warning signs, and 12-lead ECGs were obtained and analyzed. ECG abnormalities were correlated with clinical warning signs using appropriate statistical tests. Results: The most frequent ECG abnormality was sinus bradycardia, followed by sinus tachycardia and conduction disturbances. ECG abnormalities showed significant correlations with abdominal pain, mucosal bleeding, fluid accumulation, shock, respiratory distress, and ARDS, whereas persistent vomiting, lethargy, and hepatomegaly did not show significant associations. Conclusion: ECG monitoring should be considered an essential part of the evaluation of dengue patients, especially those presenting with warning signs, as it can facilitate early identification of cardiac involvement and improve management outcomes.
Research Article
Open Access
Evaluation of Renal Function among Term Neonates with Perinatal Asphyxia
Bhavi Shah,
Sachin Patel,
Harshida Vagadoda,
Pages 224 - 227

View PDF
Abstract
Background: Perinatal asphyxia is a major cause of neonatal morbidity and mortality, often associated with hypoxic-ischemic encephalopathy (HIE) and multi-organ dysfunction. Among affected organs, the kidneys are particularly vulnerable, leading to acute kidney injury (AKI). Aim: To evaluate renal function among term neonates with perinatal asphyxia and assess its correlation with the degree of HIE. Material and Methods: A prospective case–control study was conducted including 120 term neonates, of which 60 had perinatal asphyxia (cases) and 60 were healthy controls. Renal function was assessed using serum creatinine, creatinine clearance, urine output, and urinary indices. Data were analyzed to compare renal function parameters between groups and correlated with the severity of HIE. Results: Asphyxiated neonates demonstrated significantly reduced creatinine clearance and urine output compared to controls. Urinary indices including FeNa, renal failure index, and osmolality were markedly altered. Severity of renal dysfunction was positively correlated with the stage of HIE. Conclusion: Perinatal asphyxia significantly impairs renal function in term neonates, with dysfunction correlating with HIE severity. Early recognition and monitoring are crucial to reduce morbidity and prevent long-term renal complications.
Research Article
Open Access
Freshly Collected Amniotic Membrane Therapy in Chronic Non-Healing Ulcers: A Regenerative Approach to Wound Healing Mechanisms and Vascular Regeneration
Raj Gupta,
Indranil Roy,
Mollinath Mukherjee,
Niranjan Bhattacharya
Pages 240 - 246

View PDF
Abstract
Background: Non-healing ulcers are chronic wounds that fail to progress through normal healing due to factors like poor circulation, infection, or underlying conditions. Freshly collected human amniotic membrane (HAM) has emerged as an effective biological dressing due to its anti-inflammatory, anti-microbial, and regenerative properties. Rich in growth factors and extracellular matrix components, HAM promotes tissue repair, reduces scarring, and accelerates healing, making it a promising treatment for managing non-healing ulcers. Objective: The objective is to assess the role of freshly collected human amniotic membrane as a biological dressing in chronic non-healing ulcers by evaluating its impact on pro-angiogenic growth factor expression, endothelial and vascular markers, histopathological tissue regeneration, and modulation of cytokine levels. Method: Fresh human amniotic membrane was aseptically collected from consenting mothers undergoing elective cesarean section after screening for infections. The membrane was washed in sterile saline, trimmed, and immediately applied to chronic non-healing ulcers in 15 patients (study group). The control group (15 patients) received conventional dressing. All procedures followed ethical guidelines with informed consent from both donors and recipients, ensuring sterility and prompt application to preserve bioactivity. Healing outcomes were assessed through biochemical analysis, growth factor and cytokine profiling, and histological examination. Regular monitoring included pain score, wound size measurement, infection status, and duration of epithelialization. Results: After applying HAM as a biological dressing on chronic non-healing ulcers in 15 patients, significant improvements were observed compared to 15 control patients treated with conventional dressings.
Clinically, the HAM group showed a greater wound size reduction (60% vs. 30%, p<0.01), enhanced granulation tissue formation, decreased pain scores, and reduced exudate levels. Histopathological analysis revealed increased neovascularization, demonstrated by higher microvessel density along with thicker epithelialization and reduced inflammatory infiltrate in the HAM group. VEGF levels in wound tissue and exudate were significantly elevated (4.2-fold increase, p<0.01) in the HAM group, alongside increased basic fibroblast growth factor (bFGF) and platelet-derived growth factor (PDGF), supporting enhanced angiogenesis and tissue regeneration. Blood parameters showed reduced systemic inflammation markers, including lower CRP and normalized white blood cell counts in the HAM group. These findings collectively indicate that fresh HAM promotes accelerated vascular regeneration and healing compared to conventional dressings. Conclusion: Fresh human amniotic membrane significantly improved healing outcomes in non-healing ulcer patients compared to conventional dressings. It accelerated wound closure, enhanced granulation tissue formation, and reduced pain and exudate. Improved histopathology and increased angiogenic growth factors like VEGF supported better vascular regeneration. Reduced inflammatory markers and normalized blood parameters further confirmed its effectiveness as a superior biological dressing.
Research Article
Open Access
Neonatal Hypoglycaemia and Bradycardia in Newborns of Gestational Hypertensive Mothers Treated with Labetalol
Dr. Suseender Durairaj,
Dr. A. Agneeswaran,
Dr. Bennie James Christine
Pages 259 - 270

Abstract
Background: Gestational hypertension is a common complication of pregnancy, often requiring antihypertensive medication. Labetalol, a combined alpha- and beta-blocker, is frequently used. However, its use has been associated with potential neonatal complications, including hypoglycemia and bradycardia, due to its ability to cross the placenta. This study aims to investigate the prevalence and characteristics of neonatal hypoglycemia and bradycardia in newborns born to gestational hypertensive mothers treated with labetalol at Trichy SRM Medical College. Methods: This was a prospective observational study conducted at Trichy SRM Medical College by collecting data from newborns born to gestational hypertensive mothers. The study population included all newborns of mothers diagnosed with gestational hypertension, with a specific focus on those exposed to maternal labetalol therapy. Data on maternal demographics, gestational hypertension characteristics, labetalol usage (dose, duration), and neonatal outcomes (birth weight, APGAR scores, presence of hypoglycemia and bradycardia, levels, NICU admission, duration of stay) were collected and analyzed. Detailed descriptive statistics, including frequencies, percentages, means, and standard deviations, were calculated. Graphical representations were used to visualize key findings. Results: The study included 50 newborns born to gestational hypertensive mothers. Of these, 22 (44%) were exposed to maternal labetalol therapy. Neonatal hypoglycemia was observed in 30 (60%) of the total newborns, with an average blood glucose of 37.84 mg/dL. Neonatal bradycardia was present in 20 (40%) of the total newborns, with an average heart rate of 94.74 bpm. In the labetalol-exposed group, 14 (63.6%) experienced hypoglycemia and 12 (54.5%) experienced bradycardia. Further detailed statistics are presented in the results section.
Conclusion: The findings suggest a notable prevalence of neonatal hypoglycemia and bradycardia in newborns of gestational hypertensive mothers, including those exposed to maternal labetalol. While this observational study cannot establish causality, the observed trends warrant further investigation into the precise relationship between maternal labetalol use and these neonatal adverse events. Close monitoring of blood glucose and heart rate is recommended for newborns of mothers receiving labetalol for gestational hypertension.
Research Article
Open Access
Evaluation of Renal Hemodynamics in Diabetic Kidney Disease by Doppler Ultrasound and Its Association with Biochemical Parameters
Dr. Vishruth Rasa,
Dr. Shruti Dharmadas Barki
Pages 400 - 406

Abstract
Introduction: Diabetic kidney disease (DKD) is a leading cause of chronic kidney disease and end-stage renal failure. Early detection of renal hemodynamic changes is crucial as functional impairment often precedes overt biochemical derangements. Doppler ultrasound provides a non-invasive means of assessing intrarenal vascular resistance indices, particularly the resistive index (RI) and pulsatility index (PI). However, limited consensus exists on their diagnostic value in DKD when compared with conventional biochemical markers. Material and Methods: This cross-sectional study was conducted in the Department of Radiology, including 100 patients with type 2 diabetes mellitus and evidence of DKD. All participants underwent renal ultrasound and Doppler evaluation using a high-resolution machine with a 3.5–5 MHz curvilinear probe. RI and PI were measured in the main, segmental, and interlobar arteries. Biochemical investigations included fasting blood glucose, glycated hemoglobin (HbA1c), serum creatinine, blood urea, estimated glomerular filtration rate (eGFR, CKD-EPI), and urinary albumin excretion. Correlations between Doppler and biochemical parameters were analyzed using Pearson’s correlation. Results: The mean RI was 0.73 ± 0.06, and the mean PI was 1.35 ± 0.18. RI correlated positively with serum creatinine (r = 0.46, p < 0.001) and urinary albumin excretion (r = 0.42, p < 0.01), and negatively with eGFR (r = –0.41, p < 0.001). HbA1c showed a mild but significant correlation with RI (r = 0.32, p < 0.05). Violin and box plots demonstrated progressive increases in RI across albuminuria categories (normo-, micro-, macroalbuminuria) and CKD stages (G1–G5). Bland–Altman analysis confirmed good repeatability of RI measurement. Conclusion: Renal Doppler indices, especially RI, are strongly associated with key biochemical markers of renal function and disease severity in DKD. 
Doppler ultrasound offers a reliable, non-invasive adjunct to biochemical assessment and may facilitate earlier detection and monitoring of DKD progression. Larger longitudinal studies are warranted to establish prognostic thresholds and validate the role of RI/PI in risk stratification.
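The correlation analysis described above reduces to Pearson's r between each Doppler index and each biochemical marker. As a minimal sketch, using invented paired measurements rather than the study's data, the computation is:

```python
# Hedged sketch of the Pearson correlation used in the abstract above to
# relate the resistive index (RI) to biochemical markers such as serum
# creatinine. The paired values below are invented for illustration only.
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

ri = [0.65, 0.70, 0.72, 0.75, 0.80]        # resistive index (hypothetical)
creatinine = [0.9, 1.1, 1.3, 1.6, 2.0]     # serum creatinine, mg/dL (hypothetical)
print(f"r = {pearson_r(ri, creatinine):.2f}")
```

A positive r near 1, as here, corresponds to the reported pattern of RI rising with creatinine; the study's weaker coefficients (r = 0.32–0.46) reflect the scatter in real clinical data.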
Research Article
Open Access
Use of the Brice Questionnaire to Assess Intraoperative Awareness: A Comparison of Propofol and Dexmedetomidine in Open Cholecystectomy in Resource-Limited Settings
Prasadula Sarah Monica,
Krishna Chaitanya Bevara,
Patta Saroj,
B. Annapurna Sarma
Pages 431 - 436

Abstract
Background: Intraoperative awareness under general anaesthesia, though rare, can lead to severe psychological consequences. This study compares the efficacy of propofol and dexmedetomidine in preventing intraoperative awareness in patients undergoing open cholecystectomy. Objectives: To evaluate and compare the incidence of intraoperative awareness and explicit recall using propofol versus dexmedetomidine infusions during open cholecystectomy surgeries under general anaesthesia, and to assess their effects on intraoperative hemodynamic parameters. Methods: A prospective, randomized, single-blind study was conducted on 60 ASA I & II patients aged 18–65 years undergoing open cholecystectomy in a resource-limited setting. Patients were randomized into two groups (n=30): Group P received propofol (2 mg/kg induction, 0.25 mg/kg/hr infusion), and Group D received dexmedetomidine (1 mcg/kg bolus over 10 minutes, followed by 0.5 mcg/kg/hr infusion). Hemodynamic parameters and intraoperative awareness were assessed using the Brice Questionnaire 24 hours post-extubation. Results: Intraoperative awareness was reported in three patients (two definite, one possible) in the propofol group and none in the dexmedetomidine group. Hemodynamic parameters (MAP, HR) were more stable in the dexmedetomidine group at key surgical milestones. Mild bradycardia occurred in three patients in Group D. Statistical analysis showed a significant difference in awareness incidence (p = 0.04) and MAP changes during intubation and incision (p < 0.005). Conclusion: Both dexmedetomidine and propofol reduce intraoperative awareness, but dexmedetomidine demonstrated superior effectiveness and hemodynamic stability. In resource-limited settings where BIS monitors are not available, tools like the Brice Questionnaire offer a viable and accessible method for assessing awareness.
Research Article
Open Access
Study of Fetomaternal Hemorrhage in Late Pregnancy and the Early Postpartum Period
Pages 383 - 385

Abstract
Introduction: Feto-maternal haemorrhage (FMH) is the transplacental passage of fetal blood into the maternal circulation, which can lead to significant clinical consequences such as fetal anemia, alloimmunization, and even fetal demise. This study aims to evaluate the incidence, volume, and clinical implications of FMH during the third trimester of pregnancy and the immediate postpartum period in a sample size of 90 women. Materials and Methods: A prospective observational study was conducted on 90 pregnant women in their third trimester and immediate postpartum period. Inclusion criteria included singleton pregnancies with no known placental abnormalities, while exclusion criteria included multiple pregnancies, placental abruption, and preeclampsia. FMH was detected using the Kleihauer-Betke (KB) test and flow cytometry. Results: The incidence of FMH was 11.1% (10/90) during the third trimester and 16.7% (15/90) in the immediate postpartum period. The mean volume of FMH was 4.2 mL in the third trimester and 14.8 mL postpartum. Significant associations were found between FMH and maternal age, parity, and mode of delivery. Five tables summarize the findings, including demographic data, FMH incidence, volume, and clinical outcomes. Conclusion: FMH is a significant clinical event with a higher incidence in the immediate postpartum period. Early detection and monitoring are crucial to prevent adverse fetal outcomes. Further research is needed to establish standardized protocols for FMH screening and management.
Research Article
Open Access
Prevalence of Pulmonary Hypertension in Patients with Interstitial Lung Disease: A Cross-Sectional Echocardiographic Study
Dr. Ganesh Gore,
Dr. Ravindranath Sahay
Pages 883 - 888

Abstract
Introduction: Pulmonary hypertension (PH) is a serious complication of interstitial lung disease (ILD) that worsens functional status and survival. While right heart catheterization is the diagnostic gold standard, echocardiography provides a practical non-invasive screening tool. This study aimed to determine the prevalence of PH among ILD patients using echocardiography and to evaluate associated clinical and echocardiographic correlates. Methods: A cross-sectional observational study was conducted on 140 patients with ILD at a tertiary care hospital. Patients underwent detailed clinical assessment, high-resolution computed tomography, and echocardiographic evaluation. PH was defined as an estimated systolic pulmonary artery pressure (sPAP) >35 mmHg. Echocardiographic parameters, demographic data, and functional indices were compared between patients with and without PH. Statistical analyses included t-tests, chi-square tests, and logistic regression where appropriate. Results: Of 140 ILD patients, 48 (34.3%; 95% CI: 26.9-42.5) had echocardiographic evidence of PH. Prevalence was highest in idiopathic pulmonary fibrosis (51.3%) and lowest in sarcoidosis (16.7%). Patients with PH had significantly higher mean sPAP (52.7 ± 8.1 vs. 28.9 ± 4.6 mmHg, p<0.0001), larger RV basal diameter, reduced TAPSE, and higher frequency of right atrial enlargement and RV dysfunction. Clinically, PH patients were older, had longer ILD duration, poorer 6-minute walk distance, lower resting oxygen saturation, reduced DLCO, and greater need for long-term oxygen therapy. Conclusion: PH is present in one-third of ILD patients, with higher frequency in fibrotic subtypes. Echocardiography remains a valuable tool for early detection and risk stratification. Recognition of PH in ILD should prompt closer monitoring, supportive interventions, and consideration of advanced therapies.
Review Article
Open Access
Role of Artificial Intelligence in Anesthesia
Gopal Singh,
Manju Bansal,
Dheeraj Singha,
Desh Raj
Pages 521 - 524

Abstract
Artificial intelligence (AI) is becoming an important part of modern healthcare, and anesthesiology is one of the fields where it can make a big difference. AI uses computer methods such as machine learning (ML), deep learning (DL), natural language processing (NLP), and computer vision to support doctors in their work. In anesthesia, these tools can improve patient safety, make drug delivery more accurate, reduce errors, and improve the efficiency of the operating room. AI can help before surgery by predicting risks, during surgery by monitoring depth of anesthesia, blood pressure, and breathing, and after surgery by predicting complications like nausea, delirium, or death. Closed-loop drug delivery systems, robotic airway management, and AI-based monitoring are new areas where progress is happening fast. Clinical decision support systems (CDSS) and AI-based intensive care monitoring also show promise. Despite many advantages, there are challenges. Data privacy, algorithm bias, medico-legal issues, high cost, and lack of training remain big concerns. For AI to be widely used, it must be safe, fair, cost-effective, and well-integrated into hospital systems. In the future, AI may allow fully personalized anesthesia, autonomous systems that can deliver anesthesia on their own, and the use of virtual and augmented reality for better guidance and training. Federated learning and continuous learning systems will also make AI safer and more reliable. With responsible use and teamwork between doctors and engineers, AI can make anesthesia safer, more effective, and more patient-centered.
Research Article
Open Access
Correlation of Total Serum Calcium and Ionic Calcium Levels in Severity of Birth Asphyxia: A Prospective Study
Ashish Solanki,
Lalit Kumar Chauhan
Pages 538 - 543

Abstract
Background: The correlation between total serum calcium, ionic calcium, and the severity of birth asphyxia is an important area of research in neonatal care. Birth asphyxia remains a significant contributor to neonatal morbidity and mortality. Among the various biochemical disturbances associated with hypoxic-ischemic events, calcium imbalance—especially hypocalcemia—plays a crucial role in worsening clinical outcomes. Aim: This study evaluates the correlation of total serum calcium and ionic calcium levels with the severity of birth asphyxia. Material and Methods: This prospective observational study was conducted over one year in the Department of Paediatrics at a tertiary care teaching hospital. A total of 80 term neonates were enrolled and divided into two groups: Group A (n=40) included neonates with birth asphyxia, and Group B (n=40) included healthy term neonates. Total serum calcium was measured using the Arsenazo III method, and ionized calcium was assessed using an ion-selective electrode technique. Blood samples were collected within six hours of life. Data were analyzed using SPSS version 25, with p<0.05 considered statistically significant. Results: The mean total serum calcium in the asphyxiated group was 7.12 ± 0.65 mg/dL, significantly lower than 8.42 ± 0.52 mg/dL in controls (p<0.001). Hypocalcemia (<7 mg/dL) was observed in 45% of Group A versus 5% in Group B. The mean ionized calcium level in Group A was 0.92 ± 0.14 mmol/L, significantly lower than 1.15 ± 0.12 mmol/L in Group B (p<0.001), with 60% of Group A showing ionized hypocalcemia compared to 10% in controls. A significant positive correlation was found between Apgar scores at 5 minutes and both total calcium (r=0.48, p=0.002) and ionized calcium (r=0.52, p<0.001). Conclusion: Neonates with birth asphyxia exhibit significantly lower levels of both total and ionized calcium compared to healthy neonates.
The high prevalence of hypocalcemia and its association with lower Apgar scores highlight the need for routine calcium monitoring and timely correction to improve neonatal outcomes.
Research Article
Open Access
Artificial Intelligence–Assisted ECG Interpretation versus Conventional Reporting in Predicting Arrhythmias in Acute Coronary Syndrome: A Diagnostic Accuracy Study
Pages 643 - 649

Abstract
Background: Accurate and timely arrhythmia detection in acute coronary syndrome (ACS) is critical for improving outcomes. Artificial intelligence (AI)–enabled ECG interpretation may offer advantages over conventional physician reporting. Objective: To compare the diagnostic performance of AI-assisted versus conventional ECG interpretation in predicting arrhythmias among ACS patients. Methods: In this prospective diagnostic accuracy study, 1,000 ACS patients underwent ECG evaluation using both an AI-based system and physician interpretation. Confirmed arrhythmic events from continuous cardiac monitoring served as the reference standard. Diagnostic metrics (sensitivity, specificity, AUC), agreement (Cohen’s κ), and time to diagnosis were assessed. Results: AI interpretation achieved higher sensitivity (97.5% vs. 86.7%), specificity (91.8% vs. 81.7%), and diagnostic accuracy (93.4% vs. 83.1%) compared to physician reporting (p < 0.001). The AUC was significantly greater for AI (0.991; 95% CI: 0.987–0.995) than for conventional methods (0.919; 95% CI: 0.899–0.937). AI also reduced time to diagnosis (1.8 ± 0.6 min vs. 6.5 ± 1.2 min; p < 0.001). Agreement with the reference standard was higher for AI (κ = 0.85) than for physicians (κ = 0.67). Conclusion: AI-assisted ECG interpretation demonstrated superior diagnostic accuracy and efficiency in detecting arrhythmias in ACS patients. Its integration into acute cardiac care may enhance early triage and treatment, though further validation is warranted.
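The diagnostic metrics reported above (sensitivity, specificity, accuracy, and Cohen's κ) all derive from a 2×2 confusion matrix against the reference standard. A minimal sketch, assuming illustrative counts rather than the study's actual data:

```python
# Hedged sketch: how sensitivity, specificity, accuracy, and Cohen's kappa
# (the metrics reported in the abstract) follow from a 2x2 confusion matrix.
# The counts below are invented for illustration, not the study's data.

def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, accuracy, and Cohen's kappa."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)          # true-positive rate
    specificity = tn / (tn + fp)          # true-negative rate
    accuracy = (tp + tn) / n              # overall agreement
    # Cohen's kappa: observed agreement corrected for chance agreement
    p_obs = accuracy
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)
    p_no = ((fn + tn) / n) * ((fp + tn) / n)
    kappa = (p_obs - (p_yes + p_no)) / (1 - (p_yes + p_no))
    return sensitivity, specificity, accuracy, kappa

sens, spec, acc, kappa = diagnostic_metrics(tp=195, fp=16, fn=5, tn=184)
print(f"Sens {sens:.1%}  Spec {spec:.1%}  Acc {acc:.1%}  kappa {kappa:.2f}")
```

The AUC, by contrast, cannot be read off the confusion matrix; it requires the ranked prediction scores across all patients.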
Research Article
Open Access
A Study on Clinical Profile, Comorbid Condition, Radiological Findings and Outcome of RT-PCR Confirmed H1N1 Positive Patients in a Tertiary Care Hospital
Mamta Yadav,
Mahesh Kumar Yadav,
Ajay Gupta
Pages 58 - 63

Abstract
Background: Influenza A (H1N1) continues to cause significant morbidity and mortality worldwide, with variable clinical presentations and outcomes. Periodic evaluation of its clinical profile, associated comorbidities, radiological features, and outcomes is essential to improve case management and identify high-risk groups. Objective: To study the clinical profile, comorbidities, radiological findings, and outcomes of RT-PCR–confirmed H1N1 patients admitted to a tertiary care hospital. Methods: This prospective cross-sectional analytical study was conducted in the Department of Internal Medicine, Max Super Specialty Hospital, Saket, New Delhi, from May 2019 to April 2020. A total of 207 adult patients with RT-PCR–confirmed H1N1 infection were included. Data regarding demographics, clinical features, comorbidities, laboratory investigations, chest radiographs, ICU admission, ventilatory support, and outcomes were recorded. Statistical analysis was performed using SPSS version 21.0, with the chi-square test applied; p < 0.05 was considered statistically significant. Results: Males constituted 64% of cases, and the majority were in the 51–60 years age group (32%). Fever (88%) and dry cough (80%) were the most common symptoms. Breathlessness, myalgia, sore throat, coryza, and diarrhea showed significant associations with mortality (p < 0.001). Hypertension (37%) and diabetes mellitus (33%) were the leading comorbidities, with all comorbidities significantly associated with death. Laboratory abnormalities including anemia, leukocytosis, thrombocytopenia, and deranged liver and kidney function were significantly more frequent among non-survivors. Abnormal chest radiographs were present in 61% of patients and were significantly associated with mortality (p = 0.011). Overall mortality was 6%. Conclusion: H1N1 infection most commonly affected middle-aged adults, with comorbidities, abnormal laboratory parameters, and radiological findings strongly predicting adverse outcomes.
Early identification and close monitoring of high-risk patients are crucial to improve survival.
Research Article
Open Access
An Observational Study of Radiological and Electrophysiological Profile of Post Stroke Seizures in A Tertiary Care Centre in North India
Amitabh Dwivedi,
Ankita Sharma
Pages 37 - 41

Abstract
Background: Post-stroke seizures are a notable complication of cerebrovascular events, particularly in elderly patients, with significant impact on prognosis and quality of life. This study aims to analyze the characteristics and patterns of post-stroke seizures, examining associations with demographic factors, lesion characteristics, stroke classification, and stroke severity. Aim: To assess the demographic and clinical characteristics of post-stroke seizures, examining associations with stroke subtypes, lesion location, seizure types, and stroke severity as per NIHSS and Oxfordshire classifications. Material and Methods: This prospective, cross-sectional study was conducted at the Neurology Department, St. Stephen’s Hospital, Delhi, over a period of 19 months. Sixty patients presenting with a first episode of post-stroke seizure were included. Inclusion criteria were based on clinical findings, neuroimaging (MRI or CT), and EEG. Data were analyzed using descriptive statistics, and correlations were assessed using Chi-square tests. Results: Among the study cohort, 62% were male, with a mean age of 65.8 years; 81% were over 50 years. Focal seizures were more common (56.6%) than generalized seizures (38.4%), with immediate onset seizures occurring in 55% of cases. Cortical lesions were more associated with seizures (65%) than subcortical lesions (11.6%), particularly in the left hemisphere (55%). PACI was the most common ischemic stroke type (64%) associated with seizures, while larger volume ICH presented greater risk in hemorrhagic stroke. Significant associations were found between NIHSS severity and stroke subtype (p = 0.0001), as well as with Oxfordshire classification (p = 0.00001). Conclusion: Post-stroke seizures exhibit distinct demographic and clinical patterns, with focal seizures predominating in cortical and left-sided lesions. PACI and cardio-embolic strokes were linked to higher seizure risk. 
Stroke severity, as measured by the NIHSS and the Oxfordshire classification, was significantly associated with seizure onset, underscoring the importance of targeted monitoring in high-risk groups.
Research Article
Open Access
Longitudinal assessment of glycemic variability and its association with microvascular complications in Type 2 Diabetes Mellitus patients
Kishan Zalariya,
Jaykumar Jakasaniya,
Ayushkumar Kandiya
Pages 60 - 64

Abstract
Background: While glycated hemoglobin (HbA1c) is the established standard for glycemic control, growing evidence suggests that glycemic variability (GV)—the amplitude, frequency, and duration of glucose fluctuations—may be an independent risk factor for diabetic complications. However, long-term data linking GV to the incidence and progression of microvascular complications are limited. Methods: We conducted a prospective cohort study of 352 T2DM patients recruited from a tertiary diabetes center. At baseline and annually for five years, participants underwent 14-day continuous glucose monitoring (CGM) to calculate GV metrics, primarily the Mean Amplitude of Glycemic Excursions (MAGE). Comprehensive assessments for retinopathy, nephropathy, and neuropathy were performed at baseline and at the end of the study. Patients were stratified into tertiles based on their 5-year average MAGE (Low, Moderate, High GV). Cox proportional hazards regression was used to analyze the association between GV tertiles and a composite microvascular outcome. Results: Over a median follow-up of 5.1 years, 102 patients (29.0%) developed the composite microvascular outcome. The incidence was significantly higher in the high GV tertile (45.3%) compared to the moderate (26.5%) and low GV (15.4%) tertiles (p<0.001). The mean 5-year HbA1c was similar across groups (7.4% ± 0.6% vs. 7.6% ± 0.7% vs. 7.7% ± 0.8% for low, moderate, and high GV, respectively; p=0.112). After adjusting for mean HbA1c, age, sex, diabetes duration, and other confounders, the high GV tertile was associated with a significantly increased risk of the composite outcome (Hazard Ratio [HR] 2.92, 95% CI 1.68–5.08, p<0.001) compared to the low GV tertile. Conclusion: Long-term high glycemic variability is a potent and independent predictor of the incidence and progression of microvascular complications in patients with T2DM. 
These findings suggest that targeting GV, in addition to achieving HbA1c goals, may be a crucial strategy for preventing long-term diabetic complications.
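The MAGE metric central to the study above summarizes glucose fluctuation as the mean height of "large" swings in the CGM trace. A minimal sketch, assuming an invented ten-sample glucose series and a simplified variant that averages swings in both directions (the classical definition counts excursions in only one direction per analysis):

```python
# Hedged sketch of the Mean Amplitude of Glycemic Excursions (MAGE) named in
# the abstract: the mean amplitude of glucose swings between successive local
# turning points that exceed one standard deviation of the whole trace.
# The series below is invented; real CGM data would span 14 days at ~5-minute
# sampling. This is a simplified bidirectional variant, not the canonical one.
from statistics import pstdev

def mage(glucose):
    sd = pstdev(glucose)
    # local turning points (peaks and nadirs), plus the two endpoints
    turns = [glucose[0]] + [
        glucose[i] for i in range(1, len(glucose) - 1)
        if (glucose[i] - glucose[i - 1]) * (glucose[i + 1] - glucose[i]) < 0
    ] + [glucose[-1]]
    # amplitudes of successive swings exceeding 1 SD
    swings = [abs(b - a) for a, b in zip(turns, turns[1:]) if abs(b - a) > sd]
    return sum(swings) / len(swings) if swings else 0.0

trace = [100, 140, 180, 150, 95, 130, 170, 120, 90, 125]  # mg/dL, hypothetical
print(f"MAGE = {mage(trace):.1f} mg/dL")
```

Because the 1-SD threshold filters out minor noise, two patients with the same mean glucose (and hence similar HbA1c) can have very different MAGE values, which is the dissociation the study exploits.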
Research Article
Open Access
Predicting Postoperative Complications Using Biomarkers: Investigating the Utility of Biomarkers in General Surgery Patients
Pages 1204 - 1208

Abstract
Background: Postoperative complications remain a leading cause of morbidity and mortality in general surgery patients. Early prediction could guide timely interventions. Objective: To evaluate the predictive utility of selected serum biomarkers for postoperative complications in patients undergoing elective and emergency general surgery. Methods: A prospective observational study was conducted on 300 patients undergoing general surgery procedures between January 2023 and June 2024. Preoperative and postoperative biomarkers including C-reactive protein (CRP), procalcitonin (PCT), neutrophil-to-lymphocyte ratio (NLR), and interleukin-6 (IL-6) were measured and correlated with 30-day postoperative complications. Receiver operating characteristic (ROC) curves assessed predictive performance. Results: Postoperative complications occurred in 92 patients (30.6%), most commonly surgical site infection (14.6%) and sepsis (7.3%). Elevated postoperative PCT (>2 ng/mL) and CRP (>150 mg/L on day 3) showed strong predictive value for infectious complications (AUC: 0.86 and 0.79, respectively). NLR >5 was independently associated with cardiopulmonary complications (p <0.05). IL-6 demonstrated early elevation within 24 hours, correlating with systemic inflammatory response. Conclusion: Biomarkers such as PCT, CRP, NLR, and IL-6 show strong potential for early prediction of postoperative complications in general surgery patients. Integration of biomarker monitoring into perioperative protocols may improve patient outcomes.
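The AUC values quoted above have a direct probabilistic reading: the chance that a randomly chosen patient who developed a complication has a higher biomarker value than one who did not (the Mann-Whitney interpretation of ROC AUC). A minimal sketch with invented biomarker values, not the study's data:

```python
# Hedged sketch: ROC AUC computed as the fraction of correctly ordered
# (complication, uneventful) pairs, i.e. the Mann-Whitney interpretation.
# The CRP values below are invented for illustration only.

def auc(pos, neg):
    """AUC as the fraction of (pos, neg) pairs ranked correctly; ties count 0.5."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos for n in neg
    )
    return wins / (len(pos) * len(neg))

crp_complication = [180, 210, 160, 145, 190]  # day-3 CRP, mg/L (hypothetical)
crp_uneventful = [90, 120, 150, 110, 100]     # day-3 CRP, mg/L (hypothetical)
print(f"AUC = {auc(crp_complication, crp_uneventful):.2f}")
```

An AUC of 0.5 means the marker ranks patients no better than chance, while values approaching 1 (such as the 0.86 reported for procalcitonin) indicate strong discrimination.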
Research Article
Open Access
Screening of methicillin-resistant Staphylococcus aureus (MRSA) in patients and health care workers and its susceptibility to mupirocin and vancomycin in a tertiary care teaching hospital of West Bengal
Tapajyoti Mukherjee,
Minakshi Das,
Himanshu Agarwal,
Binita Kangsabanik,
Aritra Bhattacharyya
Pages 982 - 990

Abstract
Background: Methicillin-resistant Staphylococcus aureus (MRSA) is a significant global concern, particularly in healthcare settings. This study aimed to screen asymptomatically colonized patients and healthcare workers for MRSA and assess the susceptibility of those MRSA isolates to mupirocin and vancomycin. Methods: In this cross-sectional study, nasal swabs were collected from 100 inpatients, 50 outpatients, and 50 healthcare workers. MRSA isolates were identified using standard microbiological techniques and confirmed using the cefoxitin disk diffusion method. High-level mupirocin resistance was detected using 200 μg mupirocin disks and vancomycin susceptibility was assessed using the E-test. Demographic and clinical data were collected and analyzed. Results: The prevalence of MRSA carriage was 33.0% among inpatients, 24.0% among outpatients, and 28.0% among healthcare workers, with no statistically significant differences between the groups. High-level mupirocin resistance was highest among healthcare workers (42.9%), followed by inpatients (27.3%) and outpatients (25.0%). Vancomycin-intermediate Staphylococcus aureus (VISA) was detected in one inpatient and one healthcare worker isolate, whereas no vancomycin-resistant Staphylococcus aureus (VRSA) was observed. Conclusions: This study revealed substantial MRSA colonization among patients and healthcare workers, with notably high-level mupirocin resistance in healthcare workers. These findings underscore the importance of routine MRSA surveillance, targeted decolonization, and judicious antibiotic use in informing infection control strategies in tertiary care settings. Regular monitoring of resistance trends and further molecular epidemiological studies are recommended to elucidate the transmission dynamics and guide interventions.
Research Article
Open Access
Salivary Biomarkers in Oral Squamous Cell Carcinoma
Manoj Upadhyay,
Saurabh Jain,
Rahul Puri
Pages 894 - 898

Abstract
Introduction: Oral squamous cell carcinoma (OSCC) is one of the most common malignancies of the oral cavity, representing approximately 3% of all cancers worldwide and more than 90% of all oral cancers. It remains a major public health challenge due to its high incidence, morbidity, and mortality. Aim: To assess salivary biomarkers in oral squamous cell carcinoma. Methodology: This was a case–control study conducted in the Department of Dentistry, Government Medical College, Dungarpur, between December 2022 and November 2023. A total of 58 participants were enrolled, consisting of 29 histopathologically confirmed cases of oral squamous cell carcinoma (OSCC) and 29 age- and sex-matched healthy controls. Results: In this study, most OSCC cases were middle-aged males with a high prevalence of smoking and alcohol use. More than half presented with lymph node metastasis and advanced-stage tumors (T III–IV). Salivary biomarkers (IL-8, IL-6, TNF-α, MMP-9, CYFRA 21-1, LDH) were significantly elevated in cases, while AZGP1 was reduced compared to controls. Conclusion: Salivary biomarkers such as IL-8, IL-6, TNF-α, MMP-9, CYFRA 21-1, and LDH were significantly elevated in OSCC, while AZGP1 was reduced. These findings suggest that salivary biomarkers can serve as reliable, non-invasive tools for early detection and monitoring of OSCC.
Research Article
Open Access
Comparative Evaluation of Ultrasonography and Plain Radiography in the Assessment of Small Joint Arthritis of the Hands and Wrists: A Prospective Observational Study
Vinod Kumar M,
Shashikant Kumar,
Charulatha
Pages 906 - 911

Abstract
Background: Early and accurate assessment of small joint arthritis is critical for diagnosis, monitoring, and management of inflammatory and degenerative arthropathies. Ultrasonography (USG) and plain radiography are widely employed imaging modalities, yet their relative diagnostic utility in small joint evaluation remains underexplored. Aim: To compare ultrasonographic and radiographic findings in patients presenting with arthritis of the small joints of the hands and wrists and to correlate imaging parameters with inflammatory biomarkers. Methods: A prospective observational study was conducted on 100 patients presenting with clinical evidence of small joint arthritis. Each patient underwent ultrasonography and plain radiography. Parameters including joint effusion, synovial hypertrophy, hyperemia, osteophytes, erosions, and joint space narrowing were evaluated. Laboratory investigations (ESR, CRP, and rheumatoid factor) were correlated with imaging findings. Statistical analysis was performed using Chi-square and correlation tests, with p < 0.05 considered significant. Results: Rheumatoid arthritis was the predominant diagnosis (48%), followed by osteoarthritis (26%), psoriatic arthritis (16%), and gouty arthritis (10%). Ultrasonography detected more abnormalities than X-ray across all parameters, particularly for synovial hypertrophy (60 vs. 48) and joint effusion (65 vs. 52), showing strong modality agreement (p < 0.001). ESR and CRP were significantly associated with ultrasonographic findings of hyperemia and synovial thickening. Grade III changes in hypertrophy and hyperemia were the most frequent, indicating active inflammation. Conclusion: Ultrasonography proved superior in detecting early inflammatory changes compared to plain radiography, while radiographs remained indispensable for evaluating structural joint damage.
Research Article
Open Access
Restrictive vs. liberal fluid strategies in management of sepsis
Vinayshree R Harsoor,
Kailash Reddy,
Nusrat Anjum
Pages 912 - 916

Abstract
Background: Fluid resuscitation is a cornerstone of sepsis management, yet the optimal strategy regarding the volume and rate of fluid administration remains debated. Objective: To assess the clinical outcomes, including mortality, length of ICU stay, duration of mechanical ventilation, and incidence of complications, associated with restrictive versus liberal fluid strategies in septic patients. Methods: A prospective observational cohort study was conducted at the Department of Anaesthesia, Gulbarga Institute of Medical Sciences, Kalaburagi. A total of 235 patients diagnosed with sepsis were enrolled. Patients were divided into two groups: the restrictive fluid group (n=117) and the liberal fluid group (n=118). Fluid resuscitation strategies were guided by either a restrictive protocol using dynamic monitoring tools or a liberal protocol following the Surviving Sepsis Campaign guidelines. Results: The 28-day mortality rate was significantly lower in the restrictive fluid group (18.8%) compared to the liberal fluid group (27.1%) (p = 0.03). The restrictive group also had a shorter ICU stay (8.2 ± 3.5 days vs. 10.4 ± 4.1 days, p = 0.02) and a shorter duration of mechanical ventilation (5.7 ± 2.8 days vs. 7.3 ± 3.2 days, p = 0.01). Additionally, the restrictive fluid strategy resulted in fewer cases of acute kidney injury (15.4% vs. 23.7%, p = 0.04) and pulmonary edema (9.4% vs. 16.9%, p = 0.05). The incidence of fluid overload was also lower in the restrictive group (26.5% vs. 42.4%, p = 0.03). Conclusion: The restrictive fluid strategy in sepsis management was associated with lower mortality, fewer complications, and improved recovery outcomes compared to the liberal fluid strategy.
Research Article
Open Access
Cardiopulmonary bypass induced electrolyte imbalance, correction and implications in operative and intensive care unit management in patients undergoing on pump cardiac surgery
Arpita Saxena,
Ratnesh Kumar,
Narendra Nath Das
Pages 507 - 510

View PDF
Abstract
Introduction: Electrolytes such as potassium, magnesium, calcium, and phosphate play an important role in cell membrane potential regulation. Dyselectrolytemia in the setting of cardiac disease and subsequent on-pump cardiac surgery may be life threatening. Aim: To evaluate the incidence of electrolyte depletion in cardiac surgery patients, its correction, and its implications in intraoperative and intensive care unit management. Methods: We serially measured serum levels of magnesium, phosphate, potassium, calcium and sodium in 100 consecutive patients: 50 who were undergoing cardiac surgical interventions (group 1) and 50 who underwent lung/peripheral vascular surgery (group 2). Results: Group 1 (cardiac surgery patients) had levels of potassium, magnesium, phosphate and calcium that were significantly lower than group 2 (control individuals). Potassium supplementation rates were 10.4 ± 4.6 mmol/hour for group 1 versus only 1.6 ± 1.4 mmol/hour for group 2 (P < 0.001). A similar pattern was observed for magnesium, with 38 patients in group 1 receiving an average amount of 2.1 g due to arrhythmias as opposed to only one patient in group 2 (P < 0.001). Eight patients in the cardiac surgery group received calcium as a treatment or preventive measure compared to one in the control group (P < 0.001). Group 1 had 23 patients (46%) whose magnesium levels were below 0.70 mmol/l, compared to 8 patients (16%) in group 2 (P < 0.001). In group 1, 42 patients (84%) had phosphate levels lower than 0.60 mmol/l, compared to 6 patients (12%) in group 2 (P < 0.001). Conclusion: In patients undergoing cardiopulmonary bypass cardiac surgery, the risk of electrolyte depletion is high. This study partly explains the elevated risk of tachyarrhythmia after cardiac surgery. Hence, we recommend frequent measurement of electrolytes, mainly magnesium, potassium, phosphorus and calcium levels, in the perioperative period.
Careful and frequent monitoring, with meticulous and even preemptive correction of electrolytes, should be beneficial in preventing postoperative tachyarrhythmia. This will result in better outcomes in patients undergoing cardiac surgery. Prophylactic administration of potassium, magnesium and phosphate should be considered intraoperatively and postoperatively in all cardiac patients.
Research Article
Open Access
Hematological and Biochemical Abnormalities in Dengue Infection: Impact of Diabetes Mellitus as a Comorbidity
Jainam Dilipbhai Shah,
Shekh Shahajoddin Minajoddin,
Vishwas Arora
Pages 533 - 536

View PDF
Abstract
Background: Dengue causes characteristic hematological and biochemical abnormalities during acute infection. The presence of diabetes mellitus may modify these laboratory patterns through underlying metabolic and endothelial dysfunction. Objectives: To compare hematological and biochemical parameters between dengue patients with and without diabetes mellitus and assess whether diabetes is associated with distinct laboratory abnormalities during acute dengue infection. Methods: This cross-sectional analytical study was conducted in the Department of General Medicine at K. B. Bhabha Municipal General Hospital, Mumbai, over one year (June 2023–June 2024). A total of 100 confirmed dengue patients (25 diabetics and 75 non-diabetics) were included. Hematological indices (hemoglobin, hematocrit, leukocyte count, and platelet count) and biochemical parameters (liver function tests, renal markers, and electrolytes) were measured at presentation. Diabetes mellitus was defined by previous diagnosis or HbA1c ≥ 6.5%. Statistical comparisons were made using the t-test and Chi-square test, with p < 0.05 considered significant. Results: Both groups exhibited comparable hemoglobin levels, leukocyte counts, and overall platelet reduction. Severe thrombocytopenia was marginally more frequent among people with diabetes, though not statistically significant. Biochemical profiles showed more pronounced elevation of AST and ALT levels in diabetic patients (p < 0.05), along with a mild trend toward higher serum urea and creatinine and lower sodium levels. These differences indicate greater hepatic and metabolic stress in the diabetic subgroup. Conclusion: While hematological abnormalities were similar in dengue patients irrespective of diabetes status, biochemical derangements—particularly liver enzyme elevation—were more prominent among people with diabetes. 
This suggests that diabetes may potentiate hepatic vulnerability during dengue infection, warranting closer biochemical monitoring in this comorbid population.
Research Article
Open Access
Integrated surveillance and audit of hand hygiene practices in intensive care unit: Insights from a Government Medical College and Hospital in West Bengal
Binita Kangsabanik,
Tapajyoti Mukherjee,
Minakshi Das,
Aniruddha Das,
B. R. Shamanna
Pages 1000 - 1005

View PDF
Abstract
Background: Hand hygiene (HH) is a critical infection prevention strategy in intensive care units (ICUs); however, compliance remains suboptimal in resource-limited settings. This prospective observational study aimed to evaluate HH practices among healthcare workers (HCWs) in the ICU of a government medical college hospital in West Bengal, India. Methods: Using the WHO Hand Hygiene Observation Tool, trained observers directly recorded a total of 380 HH opportunities (HHOs) and the corresponding HH actions of ICU healthcare workers across the WHO's "Five Moments" framework. HH compliance was calculated by dividing the number of times hand hygiene was performed by the total number of hand hygiene opportunities. Results: A total of 380 HHOs were observed over two months, with an overall adherence rate of 27.1% (103/380). Compliance declined slightly from January (28.7%) to February (26.0%), despite ongoing surveillance. Profession-specific analysis revealed the highest adherence among nurses (40.6%), followed by housekeeping staff (30.2%) and physiotherapists (25.9%), while doctors showed lower compliance (9.1%–20.8%). Logistic regression confirmed significantly lower adherence odds among resident medical officers (OR=0.23, 95%CI: 0.11–0.46) and interns (OR=0.38, 95%CI: 0.20–0.75) than among nurses. Moment-specific adherence was highest before aseptic procedures (42.9%) and after body fluid exposure (42.6%) but lower before (26.6%) and after patient contact (18.6%) or contact with patient surroundings (17.3%). Conclusions: The findings highlight behavioral and contextual barriers to consistent HH adherence, emphasizing the need for targeted education, leadership engagement, and continuous monitoring to strengthen infection prevention practices in governmental healthcare settings. Future research should evaluate multimodal interventions and electronic monitoring to support evidence-based policies that optimize patient safety in resource-limited ICUs.
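The compliance metric described above is a simple proportion of performed actions over observed opportunities. A minimal sketch (the helper name is ours; the figures used below are the overall counts quoted in the abstract):

```python
def hh_compliance(actions_performed, opportunities):
    """Hand-hygiene compliance: performed HH actions / observed HH opportunities, as a percent."""
    if opportunities == 0:
        raise ValueError("no hand-hygiene opportunities observed")
    return 100.0 * actions_performed / opportunities

# Overall figure reported in the abstract: 103 actions over 380 opportunities.
overall = hh_compliance(103, 380)  # ~27.1%
```

The same calculation applies per profession or per WHO "moment" by substituting the subgroup counts.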
Research Article
Open Access
Bacteriological profile and antibiotic sensitivity pattern in patients with community-acquired pneumonia
Winnie Elizabeth Jose,
Tony Luke Baby,
R C Krishna Kumar
Pages 2389 - 2393

View PDF
Abstract
Background: Community-acquired pneumonia is one of the leading reasons for morbidity and mortality worldwide, particularly in less developed countries. Early identification of the causative organisms and their antimicrobial susceptibility can guide effective management and minimize complications. In many parts of the world, antibiotic resistance is on the rise, and it is necessary to maintain continuous, local-level surveillance to develop a case management strategy. Objectives: To identify the bacteriological profile and assess the antibiotic sensitivity pattern among patients diagnosed with community-acquired pneumonia in a tertiary care hospital in India. Methods: This was a prospective cross-sectional observational study conducted over one year from January 2024 to December 2024. A total of 130 patients with a clinical and radiological diagnosis of community-acquired pneumonia were included. Sputum samples, blood cultures, and other respiratory specimens were collected before initiation of antibiotics when possible. Organism identification and antibiotic susceptibility testing were performed using standard microbiological methods. Relevant demographic and clinical data were recorded. Results: A positive bacterial culture was obtained in 74.6 percent of patients. Streptococcus pneumoniae was the most common pathogen, followed by Haemophilus influenzae, Staphylococcus aureus, and Klebsiella pneumoniae. Most isolates showed high sensitivity to amoxicillin-clavulanate, ceftriaxone, and azithromycin. Resistance to fluoroquinolones and third-generation cephalosporins was more common in Gram-negative organisms, particularly Klebsiella species. Methicillin-resistant Staphylococcus aureus was identified in a small but relevant proportion of cases.
Conclusion: Streptococcus pneumoniae continues to be the leading bacterial cause of pneumonia acquired in the community, but increasing resistance among Gram-negative organisms emphasizes the importance of judicious choice of antibiotics. Monitoring local bacterial patterns and susceptibility shifts can guide rational antimicrobial choice and enhance patient outcomes.
Research Article
Open Access
Impact of Aspirin Dose on Warfarin Anticoagulation Control After Mechanical Valve Replacement: A Prospective Observational Study
Sandeep Singh,
Panmeshwar Rathia,
Cheena Singh,
Shoranki Pardhan,
Shamsher Singh Lohchab
Pages 614 - 618

View PDF
Abstract
Background: Effective anticoagulation after mechanical heart valve replacement is essential to mitigate thromboembolic and hemorrhagic complications. The combination of warfarin and low-dose aspirin provides enhanced protection; however, the optimal dosage of aspirin is still a subject of debate. This study compared the clinical outcomes of warfarin combined with 75 mg versus 150 mg of aspirin following mechanical valve replacement. Methods: A prospective observational study was performed at Pt. B. D. Sharma PGIMS, Rohtak, involving 60 patients who underwent mechanical valve replacement. Patients were assigned randomly to two groups: Group A received warfarin in combination with 75 mg of aspirin, while Group B received warfarin with 150 mg of aspirin. The quality of anticoagulation was evaluated through Time in Therapeutic Range (TTR) using the Rosendaal method. Thromboembolic and bleeding events, the necessity for fluoroscopy, and mortality were assessed over a six-month period. Results: The majority of patients demonstrated moderate INR control, with TTR values of 64.1% for Group A and 73.4% for Group B (p = 0.53). Prosthetic valve thrombosis was observed in 3 patients (10%) in Group A, while none were reported in Group B (p = 0.05). All cases were associated with TTR < 50%. Bleeding complications were similar (8.3% overall; p = 0.53), and no significant hemorrhage was observed. Fluoroscopy was necessary for 5% of patients, all of whom were in Group A. The mortality rate was 1.7%, confined to patients experiencing valve thrombosis. Conclusion: The combination of warfarin and 150 mg aspirin offers enhanced thromboembolic protection and improved time in therapeutic range (TTR) without elevating the risk of bleeding. It is crucial to maintain a TTR greater than 60% to avert valve thrombosis. Individualized anticoagulation, accompanied by consistent INR monitoring and patient education, is essential for achieving optimal postoperative outcomes.
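The Rosendaal method used above linearly interpolates the INR between successive measurements and credits each interval day according to whether the interpolated value falls within the therapeutic range. A minimal sketch (the function name is ours; a 2.0–3.0 target range is assumed for illustration and is configurable):

```python
def rosendaal_ttr(days, inrs, low=2.0, high=3.0):
    """Time in Therapeutic Range (%) by linear interpolation between INR measurements.

    days: measurement days in ascending order; inrs: INR value at each day.
    """
    in_range = total = 0.0
    for d0, i0, d1, i1 in zip(days, inrs, days[1:], inrs[1:]):
        span = d1 - d0
        total += span
        if i0 == i1:
            frac = 1.0 if low <= i0 <= high else 0.0
        else:
            # Times (as fractions of the segment) at which the interpolated
            # INR crosses the lower and upper bounds of the target range.
            t_lo = (low - i0) / (i1 - i0)
            t_hi = (high - i0) / (i1 - i0)
            t0, t1 = sorted((t_lo, t_hi))
            frac = max(0.0, min(t1, 1.0) - max(t0, 0.0))
        in_range += frac * span
    return 100.0 * in_range / total
```

For example, an INR rising linearly from 1.5 to 3.5 over ten days spends half that time in the 2.0–3.0 range, so the sketch returns 50%.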
Research Article
Open Access
Comparative Analysis of Novel Predictive Markers: Monocyte-to-HDL Ratio (MHR) versus Hemoglobin-to-RDW Ratio (HRR) in Post-PCI Coronary Heart Disease
Dhruba Jyoti Manna,
Kripasindhu Maurya
Pages 850 - 855

View PDF
Abstract
Background: A few studies in the literature have reported that MHR and HRR have high predictive value in post-PCI and heart failure patients. Concerning PCI, HRR can be predictive, with low HRR signifying increased cardiovascular events and mortality in subjects with CAD. HRR monitoring can help identify subjects at higher risk of complications for better management and follow-up. Aim: The present study aimed to compare the novel predictive markers monocyte to high-density lipoprotein ratio (MHR) and haemoglobin to red-cell distribution width ratio (HRR) in coronary heart disease post-PCI. Methods: The study assessed 138 subjects with CAD. The outcomes assessed were major adverse cardiovascular events (MACE), manifestation of mild signs and symptoms (such as anxiety, fatigue, and chest pain not amounting to a hospital visit), mortality, and restenosis. Demographic and clinical data were obtained from case files and recorded on a separate sheet, along with the findings of laboratory investigations. Patients were followed till discharge or final outcome. Results: HRR was evaluated for prediction of post-PCI mortality at a cut-off, with a lower value indicating a positive result. The area under the curve for HRR was 0.838 (indicating a projected accuracy of 83.8%). A cut-off of HRR ≤0.8116 was found to be 75% sensitive and 85.1% specific. MHR was evaluated for prediction of post-PCI mortality at a cut-off, with a higher value indicating a positive result. The area under the curve for MHR was 0.863 (indicating a projected accuracy of approximately 86.3%). A cut-off of MHR ≥0.1017 was found to be 100% sensitive and 88.9% specific. Conclusion: The findings of the study showed scope for the utilization of MHR and HRR for prognostic purposes in post-PCI coronary heart disease patients. Despite the limitations of a short-term follow-up and a limited sample size, both markers depicted some predictive value.
Further studies on a larger sample size and longer duration of follow-up are recommended to validate and substantiate the findings of the present study.
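Both markers are plain ratios of routine laboratory values, applied against the ROC-derived cutoffs reported above. A minimal sketch (helper names are ours; input units are assumed to match those used to derive the study's cutoffs, e.g. haemoglobin in g/dL and RDW in %):

```python
def hrr(haemoglobin, rdw):
    """Haemoglobin-to-RDW ratio; lower values indicate higher risk."""
    return haemoglobin / rdw

def mhr(monocytes, hdl):
    """Monocyte-to-HDL ratio; higher values indicate higher risk."""
    return monocytes / hdl

def mortality_flags(hrr_value, mhr_value, hrr_cutoff=0.8116, mhr_cutoff=0.1017):
    """Apply the study's reported cutoffs: a positive flag predicts post-PCI mortality risk."""
    return {
        "hrr_positive": hrr_value <= hrr_cutoff,  # low HRR is the positive direction
        "mhr_positive": mhr_value >= mhr_cutoff,  # high MHR is the positive direction
    }
```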
Research Article
Open Access
Study of Maternal and Perinatal Outcome in Twin Pregnancy – A Tertiary Care Hospital Based Cross Sectional Study
Ritu,
Manideepa Roy,
Purabi Das,
Rumen Chandra Boro
Pages 35 - 40

View PDF
Abstract
Background: Twin pregnancies are associated with increased maternal and perinatal risks compared to singleton gestations. They contribute significantly to maternal morbidity, obstetric complications, and adverse neonatal outcomes. This study aimed to evaluate the maternal and perinatal outcomes of twin pregnancies at a tertiary care hospital in Tezpur, Assam. Objectives: To determine the incidence of twin pregnancies and analyse maternal complications, maternal morbidity and mortality, as well as neonatal morbidity and mortality. Methods: This study was conducted over one year at Tezpur Medical College and Hospital. A total of 100 women with twin pregnancies beyond 28 weeks of gestation were included, fulfilling inclusion and exclusion criteria. Data were collected through structured proformas, clinical examinations, ultrasonography, and perinatal monitoring. Maternal and neonatal outcomes were analysed using descriptive statistics. Results: The incidence of twin pregnancy was 0.7% among 13,737 deliveries. Most women were aged 20–29 years (73%), with nearly equal distribution between primigravida (49%) and multigravida (51%). Preterm delivery occurred in 84% of cases, and anemia (72%) was the most common maternal risk factor. Premature labour (76%) was the leading complication, with maternal mortality recorded in 2%. Caesarean delivery was slightly more common (54%) than vaginal birth (46%). Perinatal outcomes were marked by low birth weight (<2.5 kg in 80–90% of twins), intrauterine deaths (3–5%), and a high NICU admission rate (52–56%). Conclusion: Twin pregnancies are high-risk with significant maternal and perinatal complications. Strengthening antenatal care, early risk identification, and skilled intrapartum management are essential to improve outcomes.
Research Article
Open Access
Impact of Obesity on Autonomic Modulation, Heart Rate, and Blood Pressure in Obese Young People
R. Aravind Kumar,
P. Palanivel,
R. Madhubala,
D. Rajkumar
Pages 114 - 119

View PDF
Abstract
Background: With changing lifestyle patterns, obesity is increasingly being seen even among young adults who are otherwise presumed to be healthy. Medical students, despite their health literacy, often experience irregular routines and stress-related habits that quietly affect their cardiovascular system. This study set out to explore how excess body weight might be linked to early changes in heart rate regulation and blood pressure among first-year MBBS students in South India. Objective: To examine the association between obesity and resting cardiovascular parameters, specifically heart rate, blood pressure, and short-term heart rate variability (RMSSD), as indicators of autonomic modulation in young adults. Methods: Over eight months (September 2024 to April 2025), 100 undergraduate medical students were recruited at Dhanalakshmi Srinivasan Medical College, Perambalur. Participants were grouped using Asia-Pacific BMI guidelines into normal weight (18.5–22.9 kg/m², n = 50) and obese (≥25 kg/m², n = 50) categories. Resting heart rate and blood pressure readings were obtained under standardized morning conditions. Autonomic function was assessed noninvasively through RMSSD derived from lead II ECG. Statistical comparisons between groups were made using unpaired t-tests, and associations were explored through Pearson correlation. Findings: Students in the obese category showed a consistently higher mean heart rate (83.87 ± 4.67 bpm), elevated systolic (129.69 ± 8.12 mmHg) and diastolic blood pressure (85.91 ± 6.55 mmHg), and reduced parasympathetic tone (RMSSD: 25.16 ± 3.87 ms) when compared to their normal-BMI counterparts (heart rate: 72.09 ± 4.37 bpm; SBP: 118.59 ± 6.26 mmHg; DBP: 74.89 ± 5.40 mmHg; RMSSD: 41.09 ± 5.91 ms). All differences were statistically significant (p < 0.001). Conclusion: The findings highlight an early pattern of autonomic imbalance in obese young adults, suggesting increased cardiovascular strain even in the absence of clinical disease. 
These subtle shifts, detectable through noninvasive screening, underscore the need for proactive monitoring and health counseling tailored to medical students’ unique academic pressures.
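RMSSD, the parasympathetic index used in this study, is the root mean square of successive differences between adjacent RR intervals. A minimal sketch (RR intervals in milliseconds; the values in the test are illustrative, not study data):

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between adjacent RR intervals (ms).

    Higher RMSSD reflects greater parasympathetic (vagal) modulation of heart rate.
    """
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two RR intervals")
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

In practice the RR series would come from the lead II ECG recording described in the Methods; artifact and ectopic-beat filtering (not shown) precedes the calculation.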
Research Article
Open Access
Comparative Analysis of Maternal and Fetal Outcomes in Spontaneous versus Induced Labour among Post-Dated Pregnancies: A Prospective Interventional Study
Rameshwari Malshetty,
Suman Umeshchandra
Pages 149 - 153

View PDF
Abstract
Background: Post-dated pregnancy, defined as gestation extending beyond 40 weeks, is associated with an increased risk of maternal and perinatal morbidity. The optimal management of such pregnancies, whether to await spontaneous onset or to induce labour, remains a critical obstetric consideration. Aim: To compare the maternal and fetal outcomes between spontaneous and induced labour among post-dated pregnancies. Methods: This prospective interventional study included 100 women with post-dated singleton pregnancies admitted to the Department of Obstetrics and Gynaecology at a tertiary care centre. Participants were divided into two groups: Group I (spontaneous onset of labour, n=50) and Group II (induced labour, n=50). Induction was performed using prostaglandin E₂ gel followed by oxytocin as needed. Maternal outcomes such as mode and duration of delivery, perineal injuries, and postpartum haemorrhage were compared. Fetal outcomes assessed included Apgar scores, meconium aspiration, NICU admissions, and perinatal mortality. Statistical analysis was performed using SPSS version 20.0, with p<0.05 considered significant. Results: Caesarean section rates were significantly higher in the induced group (50%) than in the spontaneous group (16%) (p<0.001). Vaginal delivery was more common in spontaneous labour (70% vs 42%; p=0.003). The mean duration of labour was longer in induced cases (10.48 ± 3.50 h vs 8.72 ± 3.81 h; p=0.018). Maternal complications and neonatal outcomes, including Apgar <7 at 5 minutes (12% vs 10%), meconium aspiration (10% each), and NICU admission (12% vs 10%), did not differ significantly between groups. Conclusion: Induction of labour in post-dated pregnancies is associated with an increased caesarean delivery rate and prolonged labour duration but does not adversely impact maternal or fetal outcomes when managed appropriately. Vigilant monitoring and individualized decision-making are essential for optimizing perinatal results.
Research Article
Open Access
Comparison of Butorphanol and Dexmedetomidine as adjuvants to Propofol for ease of Baska Mask insertion for short procedures – A Prospective Double Blinded Randomized controlled study
Suresh Palanisamy,
Ranjan RV,
Nagalakshmi Palanisamy
Pages 856 - 864

View PDF
Abstract
Introduction: Supraglottic airway devices (SADs) have replaced endotracheal intubation for many elective surgeries requiring general anaesthesia. Insertion of SADs requires an adequate depth of anaesthesia in a spontaneously breathing patient, and propofol with adjuvants is commonly used to facilitate insertion. With this background, this study was conducted to compare the ease of insertion of a newer-generation SAD, the Baska mask, with either dexmedetomidine or butorphanol added to propofol, in short elective surgeries done under general anaesthesia. Aim & Objectives: To compare butorphanol and dexmedetomidine as adjuvants to propofol on the insertion conditions of the Baska mask for short surgical procedures, and to assess the ease of insertion and the incidence of complications such as cough and laryngospasm during insertion. Material and Methods: A total of 88 adult patients of ASA I or II of either sex, scheduled for elective surgery under general anaesthesia, were allocated randomly to receive either dexmedetomidine 0.5 µg/kg IV (Group A) or butorphanol 20 µg/kg IV (Group B). All patients were uniformly pre-medicated and induced, and the Baska mask was inserted as per standard protocol. The ease of insertion score was determined by the modified Lund & Stovner grading scheme, and the time taken for insertion was noted. Intraoperative monitoring of HR, systemic arterial pressures, SpO2 and EtCO2 was recorded at baseline, after induction, and at 1, 3, 5, 10 and 15 mins after insertion of the Baska mask. Results: There were no statistically significant differences in the demographic characteristics or duration of insertion of the Baska mask (P > 0.05). Successful insertion was significantly better with respect to various ease-of-insertion characteristics and the number of attempts required to insert the Baska mask in the butorphanol group as compared to the dexmedetomidine group (P < 0.05).
Conclusion: The study concluded that the addition of butorphanol to propofol as an adjuvant, compared to dexmedetomidine, reduces the dose of propofol required and provides superior insertion conditions and good jaw relaxation for ease of insertion of the Baska mask. The first-pass success rate was greater in the butorphanol group than in the dexmedetomidine group. We recommend butorphanol at 20 µg/kg as an adjuvant to propofol for Baska mask insertion, without hemodynamic compromise, when compared to dexmedetomidine at 0.5 µg/kg.
Research Article
Open Access
Cross-Sectional Study of Anemia Patterns and Iron Indices in Chronic Kidney Disease Stages 3-5
Jamadar Mallikarjun Andappa
Pages 214 - 219

View PDF
Abstract
Background: Anemia is a common and clinically significant complication of chronic kidney disease (CKD), particularly in stages 3-5, where progressive renal impairment leads to reduced erythropoietin production, iron dysregulation, and chronic inflammation. Understanding anemia patterns and iron profile alterations is essential for effective management. Aim: To evaluate anemia patterns and iron indices among patients with CKD stages 3-5. Methods: A hospital-based cross-sectional study was conducted among 120 adults diagnosed with CKD stages 3-5. Detailed demographic, clinical, and laboratory data were collected. Hemoglobin levels, anemia prevalence, anemia morphologic types, and iron indices (serum iron, ferritin, TSAT, TIBC) were assessed. Statistical analyses included chi-square tests, t-tests, ANOVA, and correlation coefficients, with p < 0.05 considered significant. Results: The mean age of the study population was 56.7 ± 11.3 years, with 59.2% males. Anemia prevalence was 78.3%, increasing significantly with CKD stage (68.4% in stage 3, 80.5% in stage 4, 85.4% in stage 5; p = 0.041). Normocytic normochromic anemia was the most common type (55.3%), followed by microcytic hypochromic (22.3%) and mixed-pattern anemia (12.8%). Anemic patients had significantly lower serum iron (42.8 ± 15.7 µg/dL) and TSAT (18.7 ± 7.3%) than non-anemic patients (p < 0.001). Ferritin was lower in anemic individuals but remained elevated overall, suggesting combined absolute and functional iron deficiency. Hemoglobin showed a significant positive correlation with eGFR (r = +0.41, p < 0.001) and TSAT (r = +0.36, p < 0.001). Age ≥60 years was associated with significantly lower hemoglobin levels (p = 0.023). Conclusion: Anemia is highly prevalent in CKD stages 3-5 and becomes more severe as renal function declines. Normocytic normochromic anemia predominates, but a substantial proportion of patients exhibit iron-deficiency patterns. 
The significant alterations in iron indices highlight the importance of early detection, regular monitoring, and individualized management of anemia in CKD patients.
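Transferrin saturation (TSAT), one of the iron indices reported above, is derived from serum iron and TIBC. A minimal sketch, with an illustrative absolute-versus-functional deficiency split using commonly cited CKD thresholds (TSAT < 20% with ferritin below or above 100 ng/mL; these thresholds are our assumption for illustration, not values taken from this study):

```python
def tsat(serum_iron_ug_dl, tibc_ug_dl):
    """Transferrin saturation (%): serum iron divided by TIBC, times 100."""
    return 100.0 * serum_iron_ug_dl / tibc_ug_dl

def iron_status(tsat_pct, ferritin_ng_ml):
    """Rough absolute-vs-functional iron deficiency split (assumed thresholds)."""
    if tsat_pct >= 20.0:
        return "iron replete"
    # Low ferritin points to depleted stores; preserved ferritin with low TSAT
    # suggests inflammation-driven functional deficiency.
    return "absolute deficiency" if ferritin_ng_ml < 100.0 else "functional deficiency"
```

Applied to the anemic-group means above (serum iron ~42.8 µg/dL), a TIBC of 300 µg/dL would give a TSAT of about 14%, in the deficient range.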
Research Article
Open Access
Assessing the Relationship Between Thiazide Use and Syncope or Fall in Hypertensive Indian Subjects Admitted to the Tertiary Care Hospital
Kaza Srajan,
Tilak Raj Gajendra,
Manjusha Sori
Pages 1010 - 1014

View PDF
Abstract
Background: Thiazides are commonly used to manage hypertension but can also increase the risk of falls and syncope in their users. However, this issue remains understudied and under-reported. Aim: The present study aimed to assess the relationship between thiazide use and syncope or fall in hypertensive Indian subjects admitted to a tertiary care hospital. Methods: The present study assessed 472 subjects who presented with hypertension to the Institute within the defined study period and were on thiazides, and 472 control subjects who were not on thiazides. Data for all subjects, including demographic characteristics, clinical data, laboratory findings, and outcomes, were gathered from the Institute's previous records. The main outcome assessed was the occurrence of a syncope or fall episode. Associations between syncope/fall risk and various factors were also assessed. Results: The study results showed a higher prevalence of chronic kidney disease, acute kidney injury, metabolic alkalosis, hypercalcemia, hypokalaemia, and hyponatremia in subjects on thiazides compared to control subjects (p < 0.05). A significantly higher prevalence of syncope and fall was also seen in subjects on thiazides compared to controls (p = 0.002). Increased risk of syncope/fall was seen in subjects with decreased eGFR, acute kidney injury, metabolic alkalosis, hypokalaemia, hyponatremia, longer thiazide duration, congestive heart failure, and older age. Conclusion: The use of thiazide diuretics in subjects with hypertension is associated with syncope, mainly mediated by renal impairment and electrolyte disturbances. These results underscore the vital role of individualized treatment and careful monitoring when prescribing thiazide diuretics to subjects with hypertension.
Research Article
Open Access
Post-Dated Pregnancy: A Study on Its Effects on Maternal and Fetal Well-Being
Pages 278 - 287

View PDF
Abstract
Background: Post-dated pregnancy, defined as gestation extending beyond 42 completed weeks (294 days), poses significant risks to both maternal and fetal health. Despite advances in obstetric care, post-term pregnancies continue to be associated with increased perinatal morbidity and mortality. This study aimed to evaluate the effects of post-dated pregnancy on maternal and fetal outcomes in a tertiary care setting. Methods: A prospective observational study was conducted from June 2023 to July 2024 at the Department of Obstetrics and Gynaecology, Tezpur Medical College and Hospital. A total of 106 pregnant women with gestational age beyond 40 weeks were included. Detailed maternal and fetal assessments were performed, including biophysical profile, non-stress test, and Doppler studies. Maternal outcomes assessed included mode of delivery, induction of labour, and maternal complications. Fetal outcomes evaluated were birth weight, Apgar scores, meconium-stained liquor, neonatal intensive care unit (NICU) admissions, and perinatal mortality. Results: The mean gestational age at delivery was 41.2 ± 0.8 weeks. The caesarean section rate was 48.1%, with fetal distress being the most common indication (35.3%). Labour induction was required in 67.9% of cases. Meconium-stained amniotic fluid was observed in 42.5% of deliveries. Macrosomia (birth weight >4000 g) occurred in 16.0% of neonates. Low Apgar scores (<7 at 5 minutes) were documented in 13.2% of newborns. The NICU admission rate was 28.3%, significantly higher compared to term pregnancies. Maternal complications included postpartum haemorrhage (11.3%), perineal trauma (23.6%), and operative delivery morbidity. Conclusion: Post-dated pregnancy is associated with increased maternal and fetal complications. Higher rates of operative delivery, meconium aspiration, macrosomia, and neonatal morbidity were observed.
Active management with timely induction of labour and continuous intrapartum monitoring are essential to improve perinatal outcomes in post-term pregnancies.
Research Article
Open Access
A Rare Case of Cardiac Lymphoma Presenting as Pulmonary Embolism
Mohammad Manzar Baig,
Anoop Purkayastha,
Aabid Hussain Dar,
Aditi Shukla,
Monawar Sultan,
Devsena Jha
Pages 319 - 322

View PDF
Abstract
Background: Primary cardiac lymphoma (PCL) is a rare extranodal lymphoma comprising <2% of all cardiac tumors. Clinical manifestations are nonspecific and often mimic pulmonary embolism (PE) or intracardiac thrombus. We report a rare case of right atrial (RA) diffuse large B-cell lymphoma (DLBCL) initially suspected to be a large intracardiac thrombus/clot-in-transit with clinical features resembling PE. Materials and Methods: A 69-year-old male presented with progressive dyspnea, presyncope, and tachycardia. Emergency evaluation with ECG, 2D echocardiography, CT pulmonary angiography, venous Doppler, Holter monitoring, and laboratory investigations was performed. The patient underwent right atrial mass excision through midline sternotomy. Postoperative recovery, complications, atrial fibrillation episodes, anticoagulation management, and oncologic referral were documented. All data were prospectively collected from hospital records (Feb–Mar 2025). Results: Echocardiography showed a 4×3×5 cm RA mass protruding into the right ventricle (RV) causing tricuspid inflow obstruction with a mean gradient of 9 mmHg. CT angiography revealed a lobular RA lesion suspicious for clot-in-transit versus neoplasm, without pulmonary artery thrombosis. Surgical excision achieved complete removal of the mass. Histopathology confirmed DLBCL (CD20+, BCL6+, Ki-67 ~70%). Postoperatively, the patient developed transient atrial fibrillation managed medically and was discharged in stable condition on antiplatelet and anticoagulation therapy. Conclusion: This case highlights that primary cardiac lymphoma may masquerade as pulmonary embolism. Early multimodal imaging, high clinical suspicion, and urgent surgical exploration in obstructive cases are lifesaving. Combined cardiology–cardiac surgery–oncology management is essential for optimal outcomes.
Research Article
Open Access
Study of Association of Abnormal Cardiotocography in High-Risk Pregnancies and Perinatal Outcome – A Cross-Sectional Study in a Tertiary Care Centre of Assam
Muhammad Sameer Hussain ,
Mridusmita Majumdar ,
Rumen Chandra Boro
Pages 354 - 359

View PDF
Abstract
Background: High-risk pregnancies contribute significantly to perinatal morbidity and mortality. Cardiotocography (CTG) remains a vital tool for intrapartum fetal monitoring, enabling early detection of distress and timely obstetric intervention. However, its predictive accuracy and effect on perinatal outcomes require further evaluation. Methods: A prospective cross-sectional study was conducted in the Department of Obstetrics and Gynaecology, Tezpur Medical College and Hospital, from September 2023 to August 2024. A total of 180 antenatal women with ≥37 weeks of gestation and one or more high-risk factors were included. Admission CTG was performed for 20 minutes and categorized as reactive, non-reactive, or pathological. Maternal outcomes (mode of delivery) and neonatal outcomes (Apgar score, NICU admission, perinatal mortality) were recorded. Statistical analysis was done using SPSS v20.0, with p < 0.05 considered significant. Results: Reactive CTG was observed in 70.55% of cases, non-reactive in 19.44%, and pathological in 10%. A significant association existed between CTG findings and mode of delivery (p < 0.0001), with 90.5% of reactive CTG cases delivering vaginally, while 97.47% of pathological CTG cases required caesarean section. Pathological CTG correlated with low Apgar scores (<7) and increased NICU admissions. CTG showed 80% sensitivity and 80.74% specificity, with a high negative predictive value (99.09%). Conclusion: Abnormal CTG patterns are strongly linked to adverse perinatal outcomes, especially in conditions like PIH and IUGR. Although CTG is a sensitive tool for detecting fetal distress, its low positive predictive value necessitates adjunctive methods for accurate fetal assessment and minimizing unnecessary interventions.
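The diagnostic-accuracy figures reported above (sensitivity, specificity, negative predictive value) follow from the standard 2×2-table formulas. A minimal sketch; the example counts are illustrative assumptions, not data from the study:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-accuracy metrics from a 2x2 contingency table.

    tp/fp/fn/tn: true-positive, false-positive, false-negative,
    true-negative counts of the screening test against the outcome.
    """
    return {
        "sensitivity": tp / (tp + fn),  # detected among true positives
        "specificity": tn / (tn + fp),  # ruled out among true negatives
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts (not from the study): 8 of 10 adverse outcomes
# flagged, 140 of 170 normal outcomes correctly cleared.
m = diagnostic_metrics(tp=8, fp=30, fn=2, tn=140)
# m["sensitivity"] == 0.8; m["npv"] == 140/142 (about 0.986)
```

A high NPV, as in the study, means a reactive (negative) CTG reliably excludes fetal distress, while a modest PPV explains why abnormal traces need confirmatory assessment.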
Research Article
Open Access
Association of Neutrophil-to-Lymphocyte and Monocyte-to-Lymphocyte Ratios with Glycated Hemoglobin in Controlled and Uncontrolled Type 2 Diabetes Mellitus: A Cross-Sectional Study from Eastern India.
Jagnyaseni Panda,
Ajaya Bhatta,
Girija Shankar Prasad Patro
Pages 454 - 457

View PDF
Abstract
Background: Type 2 diabetes mellitus (T2DM) is a chronic metabolic disorder associated with persistent hyperglycemia, chronic low-grade inflammation, and an increased risk of cardiovascular and microvascular complications. Glycated hemoglobin (HbA1c) remains the gold standard for assessing long-term glycemic control but does not fully capture the inflammatory state that contributes to disease progression. Methodology: A hospital-based cross-sectional observational study was conducted in the Department of Medicine at MKCG Medical College and Hospital, Berhampur, Odisha, India. A total of 180 adults aged 18–70 years with T2DM, diagnosed according to ADA criteria, were enrolled using a simple random sampling method. Patients with inflammatory, hepatic, renal, or hematological disorders were excluded. Clinical data, hematological parameters, HbA1c, NLR, and MLR were recorded. Results: The mean age of participants was 54.1 ± 8.3 years. Most patients in both groups were aged 50–59 years, and no significant association was found between gender and glycemic control. Elevated NLR (>2) was present in 62.99% of uncontrolled patients compared to 42.31% of controlled patients (p < 0.05). Similarly, elevated MLR (>2) was observed in 63.64% of uncontrolled patients versus 34.62% of controlled patients (p < 0.05). Both NLR and MLR showed significant positive correlations with HbA1c, indicating their potential as markers of poor glycemic control. Discussion: The findings highlight that subclinical inflammation, as reflected by elevated NLR and MLR, is closely associated with poor glycemic control in T2DM. These results support previous studies demonstrating that inflammatory pathways play a pivotal role in diabetes progression and complications. Conclusion: This study demonstrates a significant positive association between elevated NLR and MLR and poor glycemic control in patients with T2DM. 
Incorporating these inflammatory markers into clinical practice alongside HbA1c could enhance patient monitoring, risk stratification, and early detection of complications.
Research Article
Open Access
Evaluation of Matrix Metalloproteinases-3 As A Possible Biomarker For Oral Sub Mucous Fibrosis
Dattatray Kale,
Somnath Salgar
Pages 471 - 474

View PDF
Abstract
Background: Oral submucous fibrosis (OSMF), a chronic and insidious disease of the buccal mucosa, has a strong predisposition to malignant transformation. In OSMF, disturbance of the equilibrium between matrix metalloproteinases (MMPs) and tissue inhibitors of matrix metalloproteinases (TIMPs) results in elevated deposition of extracellular matrix (ECM). MMPs play a crucial role in fibrosis of the oral cavity. The present study was conducted with the aim of evaluating MMP-3 as a possible biomarker for OSMF. Methods: The study included a total of 118 clinically diagnosed OSMF patients. Additionally, 118 healthy individuals were recruited as controls. MMP-3 levels in serum samples were quantified by ELISA. Results: Poor oral hygiene, chronic alcoholism, and consumption of spicy food were common confounders, besides tobacco chewing and areca nut consumption, found to be associated with OSMF. The mean serum level of MMP-3 was higher in OSMF patients than in the control group, and the difference was highly statistically significant. The mean MMP-3 of OSMF patients increased with the severity of the disease. Conclusion: Elevated serum MMP-3 levels can be used as a screening tool in the early detection of OSMF. Additionally, MMP-3 may be an ideal tool for monitoring the prognosis of these patients.
Research Article
Open Access
Serial Serum Albumin Measurements as a Prognostic Indicator in Critically Ill Patients: A MICU/ICCU-Based Study
Jay K Patel,
Tejas Amin,
Mayur J Patel,
Pages 491 - 494

View PDF
Abstract
Background: Serum albumin is a well-recognized biomarker reflecting nutritional and inflammatory status, and its prognostic relevance in critically ill patients continues to be explored. This study evaluated the association between baseline and serial serum albumin levels with survival and morbidity indicators in patients admitted to intensive care. Material and Methods: A prospective, hospital-based observational study was conducted over two years in the MICU and ICCU of a tertiary care center. One hundred critically ill adult patients meeting the inclusion criteria were enrolled using purposive sampling. Detailed clinical assessment and routine laboratory investigations were performed at admission. Serum albumin levels were measured on Day 0, 1, 3, 5, 7, and 10. Patients were categorized as survivors or non-survivors, and comparisons were made with respect to mortality, need for mechanical ventilation, and duration of hospitalization. Statistical analyses were conducted using SPSS version 22, with significance set at p < 0.05. Results: Demographic factors such as age distribution and sex did not differ significantly between survivors and non-survivors. Admission serum albumin levels showed a significant association with mortality, with lower values more frequent among non-survivors. Serial follow-up measurements demonstrated consistently reduced albumin levels in the non-survivor group at all time points, with highly significant differences throughout the observation period. Lower albumin levels at presentation were also associated with an increased likelihood of requiring mechanical ventilation. Conclusion: Both baseline and early serial serum albumin measurements serve as meaningful prognostic indicators in critically ill patients. Persistently reduced albumin levels are linked to higher mortality and increased need for ventilatory support, highlighting the utility of albumin monitoring in risk assessment and clinical management.
Review Article
Open Access
Short-Term and Long-Term Health Effects of High Air Quality Index Exposure: A Systematic Review of Multisystem Outcomes
Ananya Rakshit ,
Parul Horo ,
Sana Ahsan ,
Anupam Tyagi ,
Sameer Srivastava
Pages 592 - 599

View PDF
Abstract
Background: Air pollution is a leading environmental risk factor for global morbidity and mortality, and the Air Quality Index (AQI) serves as a critical measure of exposure severity. This systematic review evaluates the short-term and long-term effects of high AQI on human health by synthesizing findings from epidemiological and clinical studies published between 2000 and 2024. Major databases including PubMed, Scopus, and Web of Science were searched for studies linking elevated AQI levels to health outcomes across respiratory, cardiovascular, neurological, and reproductive systems. A total of 78 studies met the inclusion criteria, encompassing over 40 million individuals across diverse geographic regions. The evidence consistently indicates that short-term exposure (hours to days) to elevated AQI primarily induces respiratory distress, asthma exacerbations, acute cardiovascular events, and hospital admissions, particularly in children and the elderly. Long-term exposure (months to years) is associated with chronic obstructive pulmonary disease (COPD), ischemic heart disease, stroke, metabolic dysfunction, neuroinflammation, cognitive decline, and increased all-cause mortality. The findings reveal a dose–response relationship, with sustained exposure above AQI 150 posing significant cumulative health risks. This review underscores the urgent need for stricter air quality regulations, continuous monitoring, and public health interventions aimed at reducing exposure and improving urban air management.
Research Article
Open Access
Longitudinal Growth Patterns of Femoral Length (FL1 & FL2) in South Indian Human Fetuses and Their Correlation with Gestational Age
K. Vidulatha ,
K Sangeetha ,
N Mythily
Pages 600 - 606

View PDF
Abstract
Background: Fetal femur length (FL) is one of the most reliable skeletal indicators for estimating gestational age (GA) in clinical, anatomical, and forensic contexts. Direct anatomical measurement provides higher accuracy than ultrasound, especially because cartilaginous epiphyses are not fully visualized in imaging. Aim: To evaluate the longitudinal growth of femur length measured laterally (FL1) and medially (FL2) in South Indian fetuses and to determine the strength of correlation between these measurements and gestational age. Materials and Methods: Thirty spontaneously delivered normal fetal specimens ranging from 11 to 30 weeks of gestation were included. Following standardized dissection, both FL1 (greater trochanter to lateral condyle) and FL2 (fovea capitis to medial condyle) were measured using a high-precision digital Vernier caliper. Fetuses were categorized into four gestational groups. Mean values, standard deviations, and weekly growth rates were calculated. Correlation analysis and ANOVA were performed. Results: Mean FL1 increased from 2.4 cm (Group A: 11–15 weeks) to 6.73 cm (Group D: 26–30 weeks). FL2 increased from 2.06 cm to 6.91 cm in the same groups. Weekly growth rates were 2.16 mm/week for FL1 and 2.42 mm/week for FL2. FL2 consistently exceeded FL1 due to inclusion of the femoral head and neck. A strong linear correlation was observed between both measurements and gestational age. Conclusion: Femur length shows a robust, predictable increase with gestational age. FL2, due to its anatomical breadth, serves as a stronger growth indicator. The study provides population-specific reference data for Indian fetuses, valuable for anatomical research, intrauterine growth monitoring, forensic identification, and medicolegal evaluation.
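The weekly growth rates quoted above can be reproduced from the reported group means. A minimal sketch, assuming the rates were averaged over the full 11–30-week gestational span (20 weeks); the span choice is an inference from the reported figures, not stated in the abstract:

```python
def weekly_growth_rate_mm(start_cm, end_cm, span_weeks):
    """Average linear growth rate in mm/week across a gestational span."""
    # Convert the length gain from cm to mm, then average per week.
    return (end_cm - start_cm) * 10.0 / span_weeks

# Reported group means: FL1 2.40 -> 6.73 cm, FL2 2.06 -> 6.91 cm
fl1_rate = weekly_growth_rate_mm(2.40, 6.73, 20)  # about 2.16 mm/week
fl2_rate = weekly_growth_rate_mm(2.06, 6.91, 20)  # about 2.42 mm/week
```

Both values match the abstract's 2.16 and 2.42 mm/week, consistent with the near-linear FL-versus-GA relationship the correlation analysis reports.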
Research Article
Open Access
Incidence of Significant Renal Artery Stenosis in Patients Undergoing Primary PTCA
Yathish B.E ,
Pratham Mathur
Pages 615 - 619

View PDF
Abstract
Background: Renal artery stenosis is a frequently under-recognized comorbidity in patients presenting with acute coronary syndromes undergoing primary percutaneous transluminal coronary angioplasty. Objective: This study aimed to determine the incidence of significant renal artery stenosis in patients undergoing primary PTCA. Methodology: This cross-sectional observational study was conducted at the Department of Cardiology, Adichunchanagari Medical College, Bellur, Karnataka, India, and included 200 consecutive adult patients undergoing primary PTCA for ST-elevation myocardial infarction or high-risk non-ST elevation myocardial infarction. Following coronary intervention, selective renal angiography was performed during the same procedural session. Results: Out of 200 patients, 32 (16 percent) had significant renal artery stenosis. Among these, 24 (75 percent) had unilateral stenosis, while 8 (25 percent) had bilateral involvement. Patients with stenosis were older (63.1 ± 9.3 years) compared with those without stenosis (56.3 ± 10.9 years). Hypertension was more prevalent in the stenosis group (87.5 percent vs. 64.3 percent), and median serum creatinine was higher (1.24 mg/dL vs. 1.05 mg/dL). Differences in age and hypertension were statistically significant (p = 0.01 and p = 0.03, respectively). No procedural complications related to renal angiography were observed. Conclusion: Significant renal artery stenosis is present in a substantial proportion of patients undergoing primary PTCA, particularly among older and hypertensive individuals. Early detection during coronary angiography may support better contrast management, guide blood pressure control, and improve renal monitoring strategies.
Case Report
Open Access
Intravenous Amiodarone - Induced Acute Liver Injury: Early Recognition and Management with N-Acetylcysteine
Sowmya Manjari Siddenthi,
Siva Keerthana Suddapalli,
Naga Naveen Bobbala,
Ariosto Rosado
Pages 1 - 5

View PDF
Abstract
Introduction: Amiodarone, a class III antiarrhythmic agent, is widely employed in the management of supraventricular and ventricular tachyarrhythmias. While hepatotoxicity related to chronic oral administration is a well-recognized complication, acute hepatic injury following intravenous (IV) amiodarone is exceedingly rare and potentially fatal. The mechanism is multifactorial, often attributed to the solvent polysorbate 80, which may induce mitochondrial dysfunction and hepatic ischemia. We present a case of acute, reversible hepatocellular injury following IV amiodarone infusion, successfully managed with early discontinuation of the drug and administration of intravenous N-acetylcysteine (NAC). Case Presentation: A 74-year-old male with a history of hypertension and dyslipidemia presented with acute abdominal pain and was diagnosed with a perforated duodenal ulcer. Following emergency laparotomy and primary repair, the patient developed postoperative respiratory failure requiring ICU care and mechanical ventilation. During his ICU stay, he developed new-onset atrial fibrillation with rapid ventricular response, for which IV amiodarone was initiated (150 mg loading followed by continuous infusion). Within 24 hours, the patient reverted to sinus rhythm but exhibited a sharp rise in hepatic transaminases (AST 5024 U/L, ALT 1393 U/L) with mild hyperbilirubinemia (1.6 mg/dL) and normal alkaline phosphatase levels. No hypotension, hypoxia, or exposure to other hepatotoxic drugs was noted. Viral and autoimmune hepatitis panels were negative, and abdominal ultrasound revealed normal hepatic architecture. The diagnosis of IV amiodarone-induced acute hepatocellular injury was made based on clinical chronology and exclusion of alternative causes. Amiodarone was discontinued, and IV N-acetylcysteine was initiated using a standard 5-day infusion protocol (150 mg/kg loading dose followed by 50 mg/kg and 100 mg/kg maintenance doses). 
Remarkable biochemical improvement occurred within 48 hours, with normalization of liver enzymes by day five. The patient recovered completely and was discharged in stable condition, maintaining sinus rhythm on oral beta-blocker therapy. Discussion: Acute hepatocellular injury following IV amiodarone infusion is rare but potentially severe, with an onset typically within hours of drug administration. The hepatotoxic component is likely related to polysorbate 80, an emulsifying agent in the IV formulation, which induces mitochondrial damage, circulatory collapse, and hepatic ischemia. The biochemical pattern of massive aminotransferase elevation with mild bilirubin rise mimics ischemic hepatitis, albeit in the absence of hypotension. N-acetylcysteine, originally developed for acetaminophen toxicity, has demonstrated efficacy in non-acetaminophen acute liver failure by replenishing glutathione stores, scavenging reactive oxygen species, and improving hepatic microcirculation. In this case, early NAC administration led to rapid enzyme normalization and clinical recovery, supporting its hepatoprotective role in IV amiodarone-induced hepatic injury. Conclusion: This case highlights that intravenous amiodarone can cause acute, severe but reversible hepatocellular injury even in patients with normal baseline liver function. Early recognition, immediate discontinuation of the drug, and timely administration of N-acetylcysteine can result in complete hepatic recovery and prevent progression to acute liver failure. Vigilant monitoring of liver function tests within the first 24 hours of infusion is crucial to ensure patient safety.
Research Article
Open Access
Association of Glycated Hemoglobin (HbA1c) Levels with Risk of Ischemic Stroke in Diabetic and Non-Diabetic Patients: A Case–Control Study from PGIMER & Capital Hospital, Odisha
Nagula Prasanna ,
Premakanta Mohanty ,
Susanta Kumar Bhuyan,
Jibanjyoti Das ,
Namita Mohapatra
Pages 120 - 124

View PDF
Abstract
Background: Stroke is one of the leading causes of mortality and disability worldwide, with diabetes mellitus recognized as a major modifiable risk factor. Glycated hemoglobin (HbA1c) reflects long-term glycemic control and has been proposed as a potential marker for predicting stroke risk even in non-diabetic individuals. This study aimed to evaluate the relationship between HbA1c levels and ischemic stroke in diabetic and non-diabetic patients admitted to a tertiary care hospital in eastern India. Materials and Methods: A hospital-based, descriptive, case–control study was conducted in the Department of General Medicine, PGIMER & Capital Hospital, Bhubaneswar, Odisha, over six months (October 2024–March 2025). A total of 240 inpatients with acute ischemic stroke were included—120 diabetics and 120 non-diabetics—aged ≥18 years. Detailed history, clinical examination, and investigations including HbA1c, fasting and random blood glucose, lipid profile, and blood pressure were recorded. Data were analyzed using R software (version 4.3.2), with p < 0.05 considered statistically significant. Results: The majority of stroke cases (58%) occurred in patients aged 51–70 years. Based on glycemic status, 50% were known diabetics, 5.4% were newly diagnosed, 12.9% had stress hyperglycemia, and 31.7% were euglycemic. The mean HbA1c level was significantly higher among diabetic patients (7.85 ± 2.30%) compared to non-diabetics (6.25 ± 2.10%, p < 0.01). Although diabetics also had higher mean values of fasting blood sugar, lipid parameters, and blood pressure, these differences were not statistically significant. Discussion: The study demonstrates that elevated HbA1c levels are strongly associated with ischemic stroke, independent of other risk factors. These findings are consistent with previous studies that identified HbA1c as a reliable predictor of vascular risk and subclinical atherosclerosis. 
Routine HbA1c assessment in acute stroke cases can identify undiagnosed diabetes or prediabetes and guide early preventive interventions. Conclusion: Higher HbA1c levels are significantly correlated with the occurrence of ischemic stroke. Poor long-term glycemic control may contribute to increased cerebrovascular vulnerability in diabetics and may also signal higher risk among non-diabetics. Monitoring and management of hyperglycemia should therefore form an integral part of acute and long-term stroke care.
Research Article
Open Access
A Prospective Comparison of Hemodynamic Responses Following Spinal Anesthesia in Hypertensive versus Normotensive Patients Undergoing Infraumbilical Surgery
Dr. Ramya DV ,
Dr. Haripriya Ramachandran ,
Dr. Naga Seshu Kumari Vasantha
Pages 173 - 177

View PDF
Abstract
Background: Spinal anesthesia (SAB) is a common technique for infraumbilical surgeries, but it frequently causes hypotension and bradycardia. Hypertensive patients are thought to be more vulnerable to these hemodynamic perturbations due to altered vascular autoregulation. This study aimed to compare the hemodynamic changes and incidence of hypotension following SAB between hypertensive and normotensive patients. Methods: In this prospective, observational study, 100 patients (ASA I & II) aged 40-65 years scheduled for elective surgery below the umbilicus under SAB were enrolled. They were allocated into two groups: Group H (n=50, hypertensive on medication) and Group N (n=50, normotensive). All patients received preloading with 10 ml/kg isotonic saline. Spinal block was performed with 3.5 ml of 0.5% hyperbaric bupivacaine. Systolic (SBP) and diastolic (DBP) blood pressure, mean arterial pressure (MAP), and heart rate (HR) were recorded at baseline, after fluid loading, and at 1, 3, 5, 10, 20, 30, 40, 50, and 60 minutes post-SAB. Hypotension was defined as a >25% decrease from baseline SBP. Results: The incidence of hypotension was significantly higher in Group H (36%) compared to Group N (14%) (p = 0.012). The mean maximum decrease in SBP, DBP, and MAP was also significantly greater in Group H at multiple time intervals (p < 0.05). There was no statistically significant difference in the incidence of bradycardia between the groups (Group H: 10%, Group N: 6%; p = 0.717). Conclusion: Hypertensive patients experience a significantly greater incidence and magnitude of hypotension following spinal anesthesia compared to normotensive patients, underscoring the need for intensified hemodynamic monitoring and proactive management in this population.
Research Article
Open Access
Association of Malondialdehyde and total antioxidant capacity in Iron deficiency anemia
N. Sridevi ,
K. Balu Mahendran
Pages 704 - 706

View PDF
Abstract
Background: Iron deficiency anemia (IDA) is the most established nutritional deficiency disorder and a primary cause of anemia, particularly in developing countries. Despite its high prevalence, there is no single standard definition of anemia; per WHO criteria, a hemoglobin level <11 g/dL is considered anemia. About 50% of anemia is attributable to iron deficit alone. The prevalence of IDA rises among females after puberty because of menstrual bleeding, and women predominantly suffer severe iron deficiency during pregnancy. IDA in men may be considered a red flag for the possible presence of serious inflammatory disease, as well as for a considerable proportion of asymptomatic gastric issues, colorectal diseases, and precancerous lesions. The present study therefore focused especially on oxidative stress parameters and their imbalance in iron deficiency anemia. Objectives: To estimate malondialdehyde (MDA) and total antioxidant capacity (TAC) levels in IDA patients compared with healthy volunteers, and to determine their association with hemoglobin levels. Materials and methods: Fifty IDA patients of all age groups, both men and women, were selected, and 50 healthy age-matched subjects served as controls. Serum TAC and MDA were estimated by spectrophotometric methods, while hemoglobin and the rest of the complete blood picture were analyzed with a hematology analyzer. Results: MDA levels were significantly increased in IDA patients compared with healthy controls. TAC and hemoglobin values were significantly decreased in IDA patients. Hemoglobin levels correlated positively with TAC and negatively with MDA. Conclusion: Reduced total antioxidant capacity and increased lipid peroxidation are vital risk factors responsible for increased oxidative stress in IDA patients. 
Regular monitoring and supplementation of iron and other multivitamins are beneficial for reducing oxidative stress and iron deficiency anemia complications.
Research Article
Open Access
A Study of Clinical Profile of Hypertensive Crisis Patients Admitted in Intensive Care Unit
Dr. Arpit Jaiswal ,
Dr. Gautam Sharma ,
Dr. Shailendra Kumar
Pages 242 - 247

View PDF
Abstract
Introduction: Hypertensive crisis (HC) is a life-threatening emergency characterized by severely elevated blood pressure, which can lead to organ damage, stroke, heart failure, and acute renal failure. These patients often require ICU admission for intensive monitoring and immediate intervention. Aims and objectives: This study aims to evaluate the clinical profile of hypertensive crisis patients admitted to the ICU, including demographic factors, clinical presentations, and management strategies. Materials and Methods: A cross-sectional, descriptive study was conducted in the Intensive Care Unit of St. Stephens Hospital, Delhi, from February 2023 to July 2024. The study included 81 patients admitted with hypertensive crisis. Key variables examined included age, systolic and diastolic blood pressure, and diagnoses such as acute coronary syndrome (ACS), acute kidney injury (AKI), acute decompensated heart failure (ADHF), hypertensive retinopathy, and acute ischaemic stroke. Results: The majority of patients were middle-aged to elderly, with 28% in the 51–60 age group and 27% in the 61–70 age group. Blood pressure improved significantly, with systolic blood pressure decreasing from 198.29 mm Hg at 0 hours to 131.25 mm Hg at discharge. Ventilation needs varied, with 32 patients requiring non-invasive ventilation (NIV) and 17 requiring invasive ventilation (IV). Conclusion: Hypertensive crisis is a severe condition requiring ICU care. Elderly patients with comorbidities like chronic hypertension and renal disease are at higher risk. Early recognition and aggressive management with intravenous antihypertensive therapy are critical for improving outcomes and preventing complications. Future research should focus on optimizing treatment protocols and targeting high-risk patients.
Research Article
Open Access
Effectiveness of Thrombolytic Therapy in Treating Mechanical Prosthetic Valve Thrombosis: A Comprehensive Observational Study
Dr. Shubham Rajkishor Patel
Pages 288 - 294

View PDF
Abstract
Background: Mechanical prosthetic valve thrombosis (PVT) is a serious condition with high morbidity and mortality. While emergency surgery has long been the standard treatment, thrombolytic therapy (TT) using agents like streptokinase has become a valuable alternative, particularly for patients with high surgical risk or in resource-limited settings, showing variable success across different populations. Objective: To evaluate the effectiveness and safety of thrombolytic therapy in patients with mechanical prosthetic valve thrombosis at a tertiary care center. Methods: This retrospective study included 27 mechanical PVT patients diagnosed via clinical assessment, echocardiography, and fluoroscopy. All received intravenous streptokinase. Demographics, clinical features, causes, and outcomes were documented. Thrombolysis success was defined by clinical improvement and imaging resolution. Complications such as embolism, bleeding, and mortality were assessed. Results: Among the 27 patients (mean age 48.1 ± 14.1 years; 77.8% female), the mitral prosthesis was most commonly affected. The leading cause of PVT was non-compliance with oral anticoagulation and subtherapeutic INR levels. Thrombolysis was successful in 85.2% of patients, with restoration of valve function and clinical recovery. Complications included minor bleeding and embolic events, with mortality in 7.4% (2 patients). Prognostic analysis indicated worse outcomes in older patients and in those presenting with severe NYHA functional class. Conclusion: Thrombolytic therapy with streptokinase is a safe and effective treatment for mechanical prosthetic valve thrombosis, especially in settings where surgical intervention is not immediately available. Strict anticoagulation monitoring and patient compliance are critical to its prevention.
Research Article
Open Access
A Cross-Sectional Clinical Study to Evaluate the Correlation Between HbA1c Levels and Grades of Diabetic Retinopathy in Diabetic Patients at a Tertiary Care Hospital
Rajeshwari M ,
Kavita Patil ,
Shruti B ,
Sagar H ,
Supriya C ,
Rohini Kallur ,
Shivalee A
Pages 342 - 346

View PDF
Abstract
Background: Diabetic retinopathy (DR) is a long-term complication of diabetes characterized by progressive damage to the small blood vessels of the retina, which can ultimately threaten vision. Glycated hemoglobin (HbA1c) is widely recognized as a reliable marker for monitoring long-term glycemic control and may serve as an early indicator for identifying individuals at higher risk of developing DR. Aim: To evaluate the correlation between HbA1c level and grade of diabetic retinopathy, to study the awareness of diabetic retinopathy in diabetic patients, and to identify the systemic risk factors associated with diabetic retinopathy. Materials and methods: This cross-sectional, hospital-based study enrolled a total of 206 individuals attending the non-communicable disease (NCD) clinic at a tertiary care hospital in Kalaburagi. Data collection was carried out over a 12-month period, from November 2023 to November 2024. Results: Of the 206 participants, diabetic retinopathy was detected in 133 patients, giving a prevalence of 64.5%. The condition was observed more frequently in males, with the highest proportion of cases falling in the 50–60 year age group. Distribution by severity revealed that 53 patients (25.75%) had mild non-proliferative DR (NPDR), 32 (16.47%) had moderate NPDR, 4 (1.94%) showed severe NPDR, and 6 (2.91%) had very severe NPDR. Proliferative DR (PDR) was present in 8 patients (3.88%), another 8 (3.88%) showed advanced diabetic eye disease, and 22 (10.7%) exhibited clinically significant macular edema (CSME). A progressive increase in HbA1c levels was noted with higher grades of retinopathy, and correlation analysis confirmed a significant positive relationship (r = 0.523, 0.687, 0.872; p < 0.05). Awareness about DR was generally poor, with nearly two-thirds of patients (67.48%) unaware of the condition. Conclusion: This study demonstrates a clear positive association between elevated HbA1c levels and the severity of diabetic retinopathy. 
Limited awareness about the condition among patients appears to contribute to delayed detection, resulting in advanced retinal changes and substantial visual impairment. These findings highlight the importance of regular eye screening, better patient education, and effective control of both blood glucose and blood pressure in reducing the burden of diabetic retinopathy.
Research Article
Open Access
AI-Driven Predictive Continuous Glucose Monitoring for Hypoglycemia Prevention: A Systematic Review and Meta-Analysis of Randomized Evidence
Roshan Rajesh Menon ,
Challagonda Ranjith Rao ,
Sriram Arumugam ,
Akshay Talpe ,
Sakshi Malav ,
Harshawardhan Dhanraj Ramteke ,
Pages 391 - 402

View PDF
Abstract
Background: Hypoglycaemia remains a major limitation in insulin-treated diabetes despite advances in continuous glucose monitoring (CGM). AI-predictive CGM systems aim to reduce hypoglycaemia by forecasting impending low glucose events and enabling earlier preventive interventions. The overall effectiveness of these technologies has not been comprehensively synthesized. Methods: We conducted a systematic review and meta-analysis of randomized controlled trials comparing AI-predictive CGM with standard CGM or usual care. Electronic databases were searched from inception to the final search date. Primary outcomes included hypoglycaemia burden (time below range), with secondary outcomes comprising severe hypoglycaemia, time in range, HbA1c, mean glucose, and CGM wear duration. Random-effects meta-analyses using REML were performed. Risk of bias was assessed using RoB 2.0, and certainty of evidence was evaluated using GRADE. The review was prospectively registered in PROSPERO (CRD420251270444). Results: Eleven randomized trials involving 1,164 participants were included. AI-predictive CGM significantly reduced hypoglycaemia burden and severe hypoglycaemia events compared with comparators. Improvements were also observed in time in range, HbA1c, and mean glucose levels, without a consistent effect on CGM wear duration. Substantial heterogeneity was noted across outcomes, but the direction of effect consistently favored AI-predictive CGM. Conclusions: AI-predictive CGM is associated with clinically meaningful reductions in hypoglycaemia and improved overall glycaemic control. These findings highlight the added value of predictive intelligence beyond conventional CGM and support its role in contemporary diabetes management.
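The pooling step described in the Methods can be sketched as an inverse-variance random-effects meta-analysis. The review used REML; the simpler DerSimonian-Laird estimator of between-study variance is shown here instead, and the per-trial effects and variances are invented for illustration:

```python
# Minimal random-effects meta-analysis (DerSimonian-Laird tau^2).
# Effect sizes (e.g., mean differences in time below range, %) and
# variances below are hypothetical, not the review's actual data.

def random_effects_pool(effects, variances):
    """Pool study effects with inverse-variance random-effects weights."""
    k = len(effects)
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                    # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * ei for wi, ei in zip(w_re, effects)) / sum(w_re)
    se = (1.0 / sum(w_re)) ** 0.5
    return pooled, se, tau2

effects = [-1.2, -0.8, -2.0, -0.5]      # hypothetical per-trial effects
variances = [0.10, 0.15, 0.25, 0.12]
pooled, se, tau2 = random_effects_pool(effects, variances)
print(round(pooled, 3), round(se, 3), round(tau2, 3))
```

A negative pooled estimate with a confidence interval excluding zero would correspond to the reduction in hypoglycaemia burden the abstract reports.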
Research Article
Open Access
Evaluating lipometabolic and clinical parameters in subclinical hypothyroidism: Evidence from a tertiary care centre in North India
Mohammad Obaid ,
Mohammad Ashraf
Pages 423 - 426

View PDF
Abstract
Objective: Subclinical hypothyroidism (ScH) has been increasingly explored for its potential lipometabolic and vascular implications. Evidence remains inconsistent, with some studies suggesting adverse effects on lipid metabolism and vascular function, while others show no significant associations. With regional variation in thyroid disorder prevalence and metabolic patterns across India, this study evaluated lipometabolic and related clinical markers in individuals with ScH in Kashmir. Methodology: A cross-sectional study was conducted on fifty patients with ScH and fifty age- and gender-matched euthyroid controls attending a tertiary care centre in Srinagar. Standardized procedures were used for anthropometry, blood pressure measurement, and biochemical analysis, including thyroid profile, lipid parameters, and fasting glucose. Statistical analysis employed independent t-tests, chi-square tests, and Pearson’s correlations, with p < 0.05 considered significant.
Results: Baseline demographics were similar between groups. Mean systolic and diastolic blood pressures in the ScH group were not significantly different from controls (p > 0.05). Lipid and glycaemic parameters—including fasting glucose, triglycerides, HDL, and LDL—also showed no significant differences (p > 0.05). Wide variability in BMI, triglyceride, and LDL levels suggested notable inter-individual differences. Findings indicate that ScH did not produce measurable alterations in the lipometabolic profile within this cohort. Conclusion: ScH was not associated with significant changes in blood pressure, lipid parameters, or BMI in this Kashmiri sample. Although these results support studies reporting minimal metabolic impact, conflicting evidence elsewhere highlights the importance of ongoing monitoring. Larger longitudinal studies are needed to clarify whether progression of ScH contributes to future lipometabolic risk.
Research Article
Open Access
Comparison of Serum Vitamin D Levels in Children with First Episode of Unprovoked Seizures and Those on Long-Term Anti-Epileptic Drug Therapy
Dr. Spoorthy D N ,
Dr. Bharat Kumar G N ,
Dr. Manuprakash S K
Pages 543 - 545

View PDF
Abstract
Background: Epilepsy is one of the most common chronic neurological disorders in children requiring long-term anti-epileptic drug (AED) therapy. Enzyme-inducing AEDs (EIAEDs) can accelerate vitamin D metabolism, leading to hypovitaminosis D and bone health concerns. This study compares serum vitamin D levels in children with first-episode unprovoked seizures and those on AED therapy for over 12 months. Methods: A cross-sectional comparative study was conducted in the Department of Paediatrics, Hassan Institute of Medical Sciences, Karnataka. One hundred children aged 2–18 years were enrolled: Group A (first-episode unprovoked seizures, n=50) and Group B (on AED therapy >12 months, n=50). Serum vitamin D, calcium, phosphorus, and alkaline phosphatase (ALP) levels were measured and analyzed using independent t-test and chi-square test, with p<0.05 considered significant. Results: Mean serum vitamin D levels were 30.28 ± 5.94 ng/mL in Group A and 31.00 ± 6.15 ng/mL in Group B (p=0.55). Serum calcium, phosphorus, and ALP levels showed no significant difference. Levetiracetam was the most common drug in Group A, while phenytoin predominated in Group B. No significant differences were found in demographic or vital parameters. Total leukocyte count was significantly higher in Group A (p=0.001), indicating postictal stress response. Conclusion: Long-term AED use did not significantly affect serum vitamin D or bone metabolism markers. Routine monitoring of vitamin D levels is advisable in children on chronic AED therapy, though current regimens appear metabolically safe and well-tolerated.
Research Article
Open Access
Postoperative Blood Pressure Variability and Major Adverse Cardiac Events in Adults After Non-Cardiac Surgery: A Retrospective Cohort Study
Neerukatti Sheliya Dainy ,
Annareddy Gangadhara Reddy ,
Madigonda Ganesh
Pages 1200 - 1205

View PDF
Abstract
Background: Blood pressure (BP) derangements after non-cardiac surgery are common and have been linked to myocardial injury and early mortality in perioperative cohorts. Whether early postoperative BP variability itself predicts major adverse cardiac events (MACE) is less clear. Objectives: To assess BP variability during the first 48 postoperative hours and its association with in-hospital MACE. Methods: A retrospective cohort study was conducted on eighty adults undergoing non-cardiac surgery under general or regional anesthesia. Postoperative BP readings over 48 hours were extracted from records. Variability was summarized by standard deviation (SD) and coefficient of variation (CV). High systolic BP variability was defined as SD ≥12 mmHg (median). MACE included myocardial injury, new-onset arrhythmia, acute heart failure, and cardiac death. Results: Mean age was 57.4 ± 10.2 years and 65.0% were men. High systolic BP variability was present in 42 (52.5%). Eighteen patients (22.5%) had ≥1 MACE; myocardial injury was most frequent (12.5%). MACE occurred in 33.3% of the high-variability group versus 10.5% of the low-variability group (unadjusted odds ratio 4.25, 95% CI 1.26–14.38). After adjustment for age, hypertension, diabetes, and surgical risk, high variability remained an independent predictor (adjusted odds ratio ≈ 3.1). Conclusion: High early postoperative BP variability was common and independently associated with higher in-hospital MACE after non-cardiac surgery.
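The variability summaries (SD, CV) and the unadjusted odds ratio can be sketched as follows. The BP readings are hypothetical; the 2×2 counts (14/42 vs 4/38 events) are chosen to be consistent with the reported group sizes, event proportions, and unadjusted OR of 4.25:

```python
import statistics

# Sketch of the variability metrics and unadjusted odds ratio described
# above; the BP readings are illustrative, not study data.

def bp_variability(readings):
    """Return SD and coefficient of variation (%) of systolic BP readings."""
    mean = statistics.mean(readings)
    sd = statistics.stdev(readings)          # sample SD, as typically reported
    cv = 100.0 * sd / mean
    return sd, cv

def odds_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Unadjusted OR from a 2x2 table."""
    a, b = events_exposed, n_exposed - events_exposed
    c, d = events_unexposed, n_unexposed - events_unexposed
    return (a * d) / (b * c)

sbp = [132, 150, 118, 145, 126, 160, 138]   # hypothetical 48-h readings, mmHg
sd, cv = bp_variability(sbp)
high_var = sd >= 12                          # study's median-based cutoff

# 14/42 MACE (high variability) vs 4/38 (low variability) reproduces
# the reported unadjusted OR of 4.25.
print(round(odds_ratio(14, 42, 4, 38), 2))
```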
Research Article
Open Access
Hemodynamic and Electrocardiographic Responses to Laryngoscopy and Endotracheal Intubation in Adults with Hypertension or Coronary Artery Disease: A Prospective Observational Study
Annareddy Gangadhara Reddy ,
Madigonda Ganesh ,
Neerukatti Sheliya Dainy
Pages 1206 - 1210

View PDF
Abstract
Background: Laryngoscopy and tracheal intubation trigger a sympathetic surge that produces short-lived tachycardia and hypertension, which can precipitate myocardial ischemia in patients with hypertension and coronary artery disease (CAD). Objectives: To quantify peri-intubation hemodynamic changes and describe electrocardiographic (ECG) alterations in adults with hypertension and/or CAD undergoing elective surgery under general anesthesia. Methods: This prospective observational study enrolled eighty adult patients classified as ASA physical status II–III, all of whom had established hypertension and/or coronary artery disease (CAD). Heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), mean arterial pressure (MAP), and ECG (lead II and V5) were recorded at baseline, during laryngoscopy, immediately after intubation, and at 3 and 5 minutes after intubation. Maximum percentage change from baseline was calculated and the proportion with >20% rise was documented. Results: Mean age was 56.8 ± 9.4 years; 65% were male; 47.5% had hypertension alone and 30% had both hypertension and CAD. HR increased from 76.4 ± 8.9 to 98.8 ± 11.6 beats/min immediately after intubation; SBP rose from 138.6 ± 12.8 to 170.8 ± 18.2 mmHg, with gradual decline by 5 minutes. More than half had >20% rise in HR (57.5%) and SBP (52.5%). Transient ECG changes occurred in 22.5%; ST-segment depression was seen in 7.5% and no patient developed sustained arrhythmia or required intervention. Conclusion: Adults with hypertension and/or CAD demonstrated a pronounced but transient pressor response to laryngoscopy and intubation, with infrequent, self-limited ECG changes. Close monitoring and proactive attenuation strategies during airway instrumentation are essential in high-risk patients.
Research Article
Open Access
ROLE OF C-REACTIVE PROTEIN AND PROCALCITONIN IN THE EARLY DIAGNOSIS OF INTRAABDOMINAL INFECTIONS FOLLOWING GASTROINTESTINAL SURGERY
Dr. G. Madhusudhana ,
Dr. P. Chandana Priyanka ,
Dr. J. Vaishnavi
Pages 36 - 38

View PDF
Abstract
Background: Intraabdominal infections remain one of the most serious postoperative complications following gastrointestinal surgery, contributing significantly to morbidity, prolonged hospital stay, and mortality. Early diagnosis is often challenging as clinical signs may be nonspecific in the immediate postoperative period. Biomarkers such as C-reactive protein (CRP) and procalcitonin (PCT) have been increasingly studied for their role in the early detection of infective complications. Aim: To evaluate the role of serum C-reactive protein and procalcitonin levels in the early diagnosis of intraabdominal infections in patients undergoing gastrointestinal surgery. Materials and Methods: This prospective observational study was conducted at Government Medical College, Kadapa, over a period of one year. Adult patients undergoing elective or emergency gastrointestinal surgery and requiring postoperative intensive care monitoring were included. Serum CRP and PCT levels were measured at 1st, 24th, 48th, and 72nd postoperative hours. Patients were monitored clinically and radiologically for the development of intraabdominal infections such as anastomotic leaks and intraabdominal abscesses. Diagnostic accuracy of CRP and PCT was assessed using sensitivity, specificity, and receiver operating characteristic (ROC) curve analysis. Results: A significant proportion of patients developed postoperative intraabdominal infections. Both CRP and PCT levels were significantly higher in infected patients compared to non-infected patients, particularly at 48 and 72 hours postoperatively. Procalcitonin demonstrated higher sensitivity and specificity than CRP at these time points, indicating superior predictive value for early diagnosis of intraabdominal infections. Conclusion: Serial measurement of serum procalcitonin and C-reactive protein is valuable in the early detection of postoperative intraabdominal infections. 
Procalcitonin, especially at 48 and 72 hours, is a more reliable biomarker than CRP and can aid clinicians in early diagnosis and timely intervention.
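The diagnostic-accuracy measures used in this study (sensitivity, specificity, ROC analysis) can be sketched as below; the PCT values and infection labels are invented for illustration and happen to separate perfectly, unlike real data:

```python
# Sketch of the diagnostic-accuracy metrics described above; the PCT
# values and infection labels are hypothetical, not the study's data.

def sens_spec(values, labels, cutoff):
    """Sensitivity/specificity of 'value >= cutoff' predicting infection."""
    tp = sum(1 for v, y in zip(values, labels) if v >= cutoff and y)
    fn = sum(1 for v, y in zip(values, labels) if v < cutoff and y)
    tn = sum(1 for v, y in zip(values, labels) if v < cutoff and not y)
    fp = sum(1 for v, y in zip(values, labels) if v >= cutoff and not y)
    return tp / (tp + fn), tn / (tn + fp)

def roc_auc(values, labels):
    """AUC via the rank (Mann-Whitney) formulation."""
    pos = [v for v, y in zip(values, labels) if y]
    neg = [v for v, y in zip(values, labels) if not y]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

pct_48h = [0.3, 2.1, 0.5, 4.8, 0.2, 3.5, 0.9, 6.0]   # ng/mL, hypothetical
infected = [False, True, False, True, False, True, False, True]
sens, spec = sens_spec(pct_48h, infected, cutoff=2.0)
print(sens, spec, roc_auc(pct_48h, infected))
```

In practice the ROC curve is traced by sweeping the cutoff and plotting sensitivity against 1 − specificity; the study's finding that PCT outperforms CRP corresponds to PCT's curve enclosing a larger area.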
Research Article
Open Access
Arrhythmias in Thrombolysed Patients of Acute ST Elevation Myocardial Infarction – A Prospective Observational Study
Dr. Devarsh Sanghavi ,
Dr Krishna K Lakhani ,
Dr. Nisha Lalwani
Pages 153 - 159

View PDF
Abstract
Background: Myocardial infarction (MI) remains a leading cause of morbidity and mortality worldwide. Arrhythmias are common complications following acute MI, particularly after thrombolytic therapy, and significantly influence clinical outcomes. Objectives: To evaluate the demographic and clinical profile of patients with acute MI, assess the incidence and pattern of arrhythmias following thrombolytic therapy, and analyze their association with infarct location, comorbidities, and in-hospital mortality. Methods: This prospective observational study included 160 patients with acute MI admitted to a tertiary care center. Demographic variables, comorbid conditions, type and site of MI, occurrence and type of arrhythmias after thrombolysis, and clinical outcomes were recorded and analyzed. Results: The majority of patients were aged 46–60 years, with a mean age of 54.54 ± 11.09 years, and males constituted 65% of the study population. Hypertension and diabetes mellitus were the most common comorbidities. Anterior wall MI was the predominant infarct type. Arrhythmias occurred in 63.13% of patients, with ventricular premature complexes (22.77%), sinus tachycardia (19.80%), and idioventricular rhythm (14.85%) being the most frequent. Ventricular arrhythmias were associated with higher mortality. The overall in-hospital mortality rate was 13%, predominantly observed in patients with anterior and anteroinferior wall infarctions. Conclusion: Arrhythmias are common following thrombolytic therapy in acute MI, particularly in patients with anterior wall involvement and associated comorbidities. Early detection, continuous cardiac monitoring, and timely management of arrhythmias, along with effective control of cardiovascular risk factors, are essential to improve patient outcomes and reduce mortality.
Research Article
Open Access
CLINICAL UTILITY OF 24-HOUR AMBULATORY BLOOD PRESSURE MONITORING IN HOSPITALISED PATIENTS WITH CHRONIC KIDNEY DISEASE - A CROSS-SECTIONAL STUDY
Ammu Roy ,
Rojith karandode Balakrishnan ,
Dr Neeraj Manikath ,
Dr NK Thulaseedharan
Pages 160 - 166

View PDF
Abstract
Background: Hypertension is highly prevalent in patients with chronic kidney disease (CKD) and is a major contributor to cardiovascular morbidity and progression of renal dysfunction. Accurate assessment of blood pressure (BP) in CKD patients is challenging due to altered circadian BP patterns and poor reliability of office blood pressure measurements. Ambulatory blood pressure monitoring (ABPM) provides comprehensive evaluation of BP variability, nocturnal BP behavior, and overall BP burden, which may have important clinical implications in CKD. Objectives: To study various blood pressure parameters obtained by ambulatory blood pressure monitoring in patients with chronic kidney disease, to compare office blood pressure monitoring with ambulatory blood pressure monitoring, to determine the prevalence of resistant, masked, and white-coat hypertension, and to evaluate altered circadian BP patterns and their association with CKD stages and target organ damage. Methods: This hospital-based observational cross-sectional study was conducted among 88 hospitalized CKD patients aged more than 12 years admitted to the general medicine wards of Government Medical College, Kozhikode, between January 2022 and December 2022. Office BP was measured using a mercury sphygmomanometer, and all participants underwent 24-hour ABPM using a validated device. Ambulatory parameters including daytime and nighttime systolic and diastolic BP, nocturnal dipping status, hyperbaric index, percent time elevation, and BP phenotypes were analyzed. Data were expressed as mean ± standard deviation and percentages. Results: Hypertension was detected in 80.7% of patients by ABPM compared to 76.1% by office BP measurement. Resistant hypertension was observed in 34.1% of the study population and in 44.7% of hypertensive patients, with prevalence increasing with advancing CKD stage. 
A high prevalence of nocturnal non-dipping pattern was noted (73.9%), and all patients with resistant hypertension were non-dippers. Nighttime hyperbaric index and percent time elevation were significantly higher than daytime values, particularly in advanced CKD stages. Target organ damage was common, with left ventricular hypertrophy present in 44.5% and hypertensive retinopathy in 37.5% of patients, while resistant hypertension patients showed markedly higher prevalence of these complications. Conclusion: Ambulatory blood pressure monitoring provides superior diagnostic and prognostic information compared to office BP measurement in hospitalized CKD patients. ABPM enables accurate identification of resistant and masked hypertension, detects abnormal nocturnal BP patterns, and reveals increased nighttime BP burden associated with target organ damage. Routine use of ABPM in CKD patients may improve blood pressure management, risk stratification, and prevention of cardiovascular complications.
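The nocturnal dipping status analyzed above follows the conventional ABPM definition: a dipper shows a fall of at least 10% in mean systolic BP from daytime to nighttime. A minimal sketch with illustrative values:

```python
# Nocturnal dipping classification used in ABPM studies: a "dipper"
# shows >=10% fall in mean systolic BP from day to night.
# The example values below are illustrative, not study data.

def dipping_percent(day_sbp_mean, night_sbp_mean):
    """Percent fall in mean systolic BP from daytime to nighttime."""
    return 100.0 * (day_sbp_mean - night_sbp_mean) / day_sbp_mean

def is_non_dipper(day_sbp_mean, night_sbp_mean):
    """True when the nocturnal fall is below the 10% dipping threshold."""
    return dipping_percent(day_sbp_mean, night_sbp_mean) < 10.0

print(is_non_dipper(140.0, 134.0))   # ~4.3% fall -> non-dipper
```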
Research Article
Open Access
A Prospective Observational Study on the Incidence and Predictors of Hypoglycemia in Neonates Born to Diabetic Mothers at a Tertiary Care Center in Konkan, Maharashtra
Murughesh Patil ,
Lakshmi Paragannavar ,
Santosh Kumar Karamasi
Pages 181 - 187

View PDF
Abstract
Background: Infants of diabetic mothers (IDM) are at increased risk of neonatal hypoglycemia, contributing to interventions and NICU admissions. Objectives: To estimate the incidence of hypoglycemia in IDM, identify predictors, and describe timing and short-term outcomes. Methods: Prospective observational study (18 months) at SSPM Medical College and Lifetime Hospital, Padve, Maharashtra. Consecutive IDM (n=220) underwent scheduled glucose screening (2, 6, 12, 24, 48, 72 hours and if symptomatic). Hypoglycemia was defined using Indian guideline–based operational thresholds. Predictors were assessed using bivariate analysis and multivariable logistic regression. Results: Hypoglycemia occurred in 69/220 (31.4%) (95% CI 25.6–37.8). First episodes were most frequent at 6 hours (43.5%) and 12 hours (27.5%). Among affected neonates, 36.2% were symptomatic, 20.3% had severe hypoglycemia (<25 mg/dL), and 7.2% had recurrent episodes. IV dextrose was required in 37.7% and NICU admission was higher with hypoglycemia (40.6% vs 23.2%, p=0.009). On multivariable analysis, HbA1c (aOR 1.67, p=0.062) and LGA status (aOR 2.13, p=0.063) showed borderline association with hypoglycemia. Conclusion: Nearly one-third of IDM developed hypoglycemia, predominantly within the first 12 hours, with increased need for IV therapy and NICU admission. Early feeding and protocol-based monitoring are essential.
Research Article
Open Access
A Comparative Study of the Effect of Dexmedetomidine and Fentanyl on Hemodynamic Stress Response during Laryngoscopy and Pneumoperitoneum in Laparoscopic Surgery
Ekta Kakdiya ,
Kruti Patel ,
Sujata Patel
Pages 1261 - 1266

View PDF
Abstract
Background: Laryngoscopy, tracheal intubation, and pneumoperitoneum during laparoscopic surgery provoke significant sympathetic stimulation, resulting in tachycardia and hypertension. Pharmacological attenuation of this hemodynamic stress response is essential to improve perioperative stability. This study compared dexmedetomidine and fentanyl for their effectiveness in controlling hemodynamic responses during these critical periods. Material and methods: This prospective, randomized, double-blind study included 100 adult patients of ASA physical status I and II undergoing elective laparoscopic surgery under general anaesthesia. Patients were randomly allocated into two groups (n = 50 each). Group D received dexmedetomidine (1 µg/kg loading dose followed by 0.2 µg/kg/h infusion), while Group F received fentanyl in an equivalent dosing regimen. Heart rate, systolic, diastolic, and mean arterial blood pressure, and oxygen saturation were recorded at baseline, during airway manipulation, throughout pneumoperitoneum, at extubation, and during the postoperative period up to 6 hours. Demographic variables were also compared. Results: Baseline demographic characteristics and initial hemodynamic parameters were comparable between the two groups. Dexmedetomidine produced a significantly greater attenuation of heart rate and blood pressure responses following the loading dose, during laryngoscopy and intubation, throughout pneumoperitoneum, and at extubation compared with fentanyl. The differences were most pronounced during periods of maximal surgical stress. Hemodynamic parameters gradually returned toward baseline values in both groups during late postoperative monitoring, with no significant intergroup differences at 3 and 6 hours. Oxygen saturation remained stable and comparable between the groups at all time points. 
Conclusion: Dexmedetomidine provides superior control of hemodynamic stress responses compared with fentanyl during laparoscopic surgery, without compromising oxygenation. Its use contributes to improved perioperative hemodynamic stability during airway manipulation and pneumoperitoneum.
Research Article
Open Access
Pulmonary Function Abnormalities in Children with Beta Thalassemia Major and Their Association with Age and Serum Ferritin Levels: A Cross-sectional Study in Eastern India
Dr Jyotiranjan Satapathy ,
Dr Santosh Kumar Pradhan ,
Dr Rajesh Das
Pages 269 - 273

View PDF
Abstract
Background: Beta-thalassemia major is a common hereditary hemoglobin disorder requiring lifelong blood transfusions. Advances in transfusion therapy and iron chelation have improved survival; however, chronic complications involving various organ systems continue to emerge. Pulmonary dysfunction in thalassemia major is often subclinical and remains under-recognized, with conflicting evidence regarding its pattern and association with iron overload. Objectives: To evaluate pulmonary function abnormalities in children with β-thalassemia major and to assess their association with age and serum ferritin levels. Methods: A hospital-based cross-sectional study was conducted among 70 children aged 5–15 years with confirmed β-thalassemia major admitted for periodic blood transfusion. Children with pre-existing pulmonary disease or congenital/rheumatic heart disease were excluded. Clinical details, transfusion history, and chelation status were recorded. Pre-transfusion hemoglobin and serum ferritin levels were measured. Pulmonary function tests were performed using spirometry within 24 hours of transfusion. Pulmonary patterns were classified as normal, restrictive, obstructive, or combined. Statistical analysis included descriptive statistics and correlation analysis, with p < 0.05 considered significant. Results: Pulmonary function abnormalities were observed in 54 (77.1%) children. The restrictive pattern was the most common abnormality (88.9%), followed by obstructive and combined patterns. Reduced FVC and FEV₁ were frequently noted, while the FEV₁/FVC ratio remained normal in most children. A significant negative correlation was observed between age and FVC%, FEV₁%, and PEFR, indicating progressive pulmonary involvement with advancing age. Although spirometric values were lower in children with higher serum ferritin levels, no statistically significant association was found between serum ferritin and pulmonary dysfunction. 
Discussion: The predominance of restrictive lung disease suggests impaired lung growth or chronic parenchymal involvement in thalassemia major. Progressive decline in pulmonary function with age highlights the cumulative impact of disease duration and chronic transfusion therapy. The lack of a strong association with serum ferritin indicates that pulmonary dysfunction is likely multifactorial rather than solely related to iron overload. Conclusion: Pulmonary dysfunction is highly prevalent in children with β-thalassemia major, predominantly presenting as restrictive ventilatory impairment. Routine pulmonary function monitoring should be incorporated into the standard care of thalassemic children to enable early detection and timely intervention, even in the absence of respiratory symptoms.
Research Article
Open Access
A CROSS-SECTIONAL STUDY OF THE RELATIONSHIP BETWEEN CD4 COUNT AND MUCOCUTANEOUS MANIFESTATIONS IN HIV POSITIVE PATIENTS ATTENDING TERTIARY CARE CENTER
Dr. Manasa Pulala ,
Dr. V. Kishore Kumar
Pages 264 - 271

View PDF
Abstract
INTRODUCTION: HIV is a virus that attacks the immune system, especially CD4 cells. It leads to progressive immunodeficiency and can result in AIDS if untreated. HIV spreads through blood, sexual fluids, and from mother to child. HIV infection often causes skin, hair, nail, and mouth problems early on. Mucocutaneous manifestations may indicate HIV infection and disease severity. Opportunistic infections like bacterial, viral, and fungal infections are common in HIV. Non-infectious skin conditions, drug reactions, and malignancies are also seen. This study focuses on identifying mucocutaneous lesions in HIV patients in relation to CD4 lymphocyte count as a marker of immune status. AIM: The study aims to correlate CD4 counts with the occurrence of mucocutaneous manifestations in HIV patients. MATERIALS AND METHODS: This is a cross-sectional study which includes 140 HIV-positive patients visiting the OPD of the Department of Dermatology and Venereology at Government General Hospital, Ananthapuramu, from April 2023 to Oct 2024. RESULTS: Among 140 HIV patients with mucocutaneous manifestations, there were 44 males and 56 females, most commonly aged 31–40 years. Heterosexual transmission was the predominant route (94%), and the mean CD4 count was 462.9 cells/µl, with 46% in WHO stage 2 and 8% severely immunosuppressed (CD4 <200). Non-infectious dermatoses were slightly more prevalent, with Pruritic Papular Eruption (PPE) being the most common (n=19, mean CD4 379), followed by Xanthelasma palpebrarum and vitiligo. Infectious manifestations included oral candidiasis (n=17), tinea corporis, tinea versicolor, vulval candidiasis, HPV warts, molluscum contagiosum, leprosy, and furunculosis. Nail changes (longitudinal melanonychia), hair changes (chronic telogen effluvium), and drug reactions (FDE, lichenoid eruptions) were also observed. 
Infectious skin conditions like tinea and scabies occurred in patients with moderate to preserved immunity (CD4 >490), whereas severe infections such as oral candidiasis and genital herpes were associated with advanced immunosuppression (CD4 <200), highlighting their role as clinical markers of immune status. These findings underscore the importance of regular dermatological assessment in HIV care, particularly in resource-limited settings, as mucocutaneous manifestations can aid early diagnosis, monitoring, and evaluation of HAART effectiveness. CONCLUSION: Cutaneous and mucosal lesions are key clinical indicators of HIV infection. These manifestations can occur at any stage and correlate with immunological status. Certain dermatoses are associated with profound immunosuppression, suggesting advanced HIV disease. Common dermatological conditions may present with atypical morphology or increased severity in HIV-positive patients. Unusual or recalcitrant presentations can complicate diagnosis, particularly in undiagnosed individuals.
Research Article
Open Access
Comparative Evaluation of Serum CK-Total, CK-MB, and Lactate Dehydrogenase Levels in Type 2 Diabetes Mellitus and Their Association with Cardiovascular Risk
Dr. Dilipkumar M. Kava ,
Dr. Kalpeshkumar C. Nakarani ,
Dr. Vilas U. Chavan
Pages 287 - 292

View PDF
Abstract
Introduction: Diabetes mellitus markedly increases cardiovascular morbidity and mortality due to the combined effects of hyperglycemia, insulin resistance, hypertension, obesity, and dyslipidemia, which accelerate atherosclerosis and vascular dysfunction. Diabetic patients often develop silent myocardial ischemia, leading to late presentation with severe cardiac events. Chronic subclinical elevation of cardiac biomarkers such as CK total, CK-MB, and LDH in diabetes suggests ongoing myocardial injury. This study evaluates the association of these biomarkers with cardiovascular risk factors in type 2 diabetes compared with non-diabetic individuals. Material and methods: This cross-sectional study was conducted at a tertiary care hospital in Surat, India. Serum creatine kinase total (CK total), CK-MB, and lactate dehydrogenase (LDH) were analyzed in 149 diabetic and 149 non-diabetic individuals using an Erba XL-640 fully automated chemistry analyzer. Height and weight were recorded to calculate body mass index (BMI). Statistical analysis was performed using SPSS version 16. Results and discussion: CK total was significantly higher (p<0.001) in diabetic (158 ± 75.10 U/l) than in non-diabetic (127.5 ± 73.9 U/l) individuals. CK-MB was significantly higher (p<0.001) in diabetic (41.18 ± 14.6 U/l) than in non-diabetic (29.19 ± 19.1 U/l) individuals. LDH was significantly higher (p<0.001) in diabetic (687 ± 180 U/l) than in non-diabetic (428 ± 197 U/l) individuals. Significantly higher levels of CK total, CK-MB, and LDH were also observed across all age and BMI groups in diabetic subjects compared to non-diabetic individuals. Conclusion: This study demonstrates that serum CK total, CK-MB, and LDH are significantly elevated in type 2 diabetes mellitus, reflecting early, subclinical myocardial injury driven by chronic hyperglycemia, oxidative stress, and metabolic inflammation. 
Despite the availability of high-sensitivity troponins, CK-MB and LDH remain reliable, cost-effective markers of cardiac stress and respond to metabolic control and antioxidant therapy. Routine monitoring of these enzymes, alongside glycemic and lipid parameters, can facilitate early cardiovascular risk detection and guide preventive strategies to reduce future cardiac events in diabetic patients.
Research Article
Open Access
SERUM SODIUM AS A PROGNOSTIC FACTOR IN DECOMPENSATED CHRONIC LIVER DISEASE IN A TERTIARY CARE HOSPITAL
Dr Satya Prakash Yadav ,
Dr Ajit Yadav ,
Dr. Kapil Yadav ,
Dr Manisha Yadav ,
Dr Anil Yadav
Pages 416 - 421

View PDF
Abstract
Aim: The study aimed at assessing serum sodium levels as a prognostic factor in “decompensated liver disease” (DCLD). Methods: The study was conducted in the General Medicine Department of SGT Hospital, utilizing the inpatient (IPD) settings over a period of 18 months. Patients presenting with clinical symptoms of decompensated cirrhosis, such as ascites, hepatic encephalopathy, and gastrointestinal bleeding, were screened and enrolled in the study. A total of 60 patients diagnosed with decompensated liver disease were included in the study. Results: The mean age of participants was 46.3 ± 6.8 years, with an age range spanning from 18 to 55 years, reflecting the inclusion criteria. A significant proportion of patients, approximately 55%, were aged 41 to 55 years, while 40% were within the 30 to 40 years category. Gender distribution revealed a marked male predominance, with 78.3% (47/60) of the patients being male, while 21.7% (13/60) were female. A significant proportion (20%) belonged to the lower class group, while 45% were from the lower middle class group. Only 15% of the patients belonged to the middle class category, with 3% in upper middle and 1.6% in upper class category. In this study, alcohol consumption emerged as the most prominent risk factor, with 70% (42/60) of patients having a documented history of chronic alcohol use. Conclusion: In conclusion, this study provides compelling evidence that serum sodium is an independent and reliable predictor of disease severity, complications, and mortality in decompensated liver disease. The strong associations observed between hyponatremia and adverse outcomes reinforce the need for early detection, continuous monitoring, and targeted therapeutic interventions. 
The integration of serum sodium assessment into routine liver disease management and transplantation evaluation has the potential to enhance risk stratification, guide therapeutic decision-making, and ultimately improve patient outcomes.
Research Article
Open Access
Effect of Recurrent Urinary Tract Infections on Renal Function Parameters and Blood Pressure Regulation in Adult Women: A Comparative Study
K. P. Prasad Babu,
P. Sumangali,
K. Akshitha
Pages 1437 - 1441

View PDF
Abstract
Background: Recurrent urinary tract infection (rUTI) is common in adult women, and recurrent inflammation can influence renal homeostasis and blood pressure regulation. Objectives: To compare renal function parameters and blood pressure profiles between adult women with rUTI and age-matched controls, and to examine associations between rUTI burden, albuminuria, and blood pressure indices. Methods: A hospital-based comparative study enrolled one hundred adult women (50 rUTI; 50 controls). Serum creatinine, blood urea, estimated glomerular filtration rate (eGFR), and urine albumin–creatinine ratio (UACR) were assessed. Blood pressure was measured using a standardized protocol. Group comparisons, correlation analysis, and multivariable regression were performed. Results: Women with rUTI had higher serum creatinine (0.92 ± 0.18 vs 0.81 ± 0.14 mg/dL) and blood urea (28.6 ± 7.9 vs 24.9 ± 6.8 mg/dL), with lower eGFR (89.4 ± 16.8 vs 98.8 ± 14.9 mL/min/1.73 m²). UACR was higher in the rUTI group (median 24 vs 10 mg/g), and microalbuminuria was more frequent (32.0% vs 8.0%). Systolic/diastolic blood pressure was higher in the rUTI group (128.6 ± 12.9/82.1 ± 8.7 vs 121.4 ± 11.2/77.6 ± 7.9 mmHg). rUTI status remained independently associated with lower eGFR, higher systolic blood pressure, and microalbuminuria after adjustment. Conclusion: Adult women with rUTI demonstrated subtle deterioration in renal functional markers and higher blood pressure levels, supporting periodic renal and blood pressure monitoring in this population.
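The abstract reports eGFR in mL/min/1.73 m² alongside serum creatinine but does not state which estimating equation was used. As an illustration only, the widely used race-free CKD-EPI 2021 creatinine equation can be sketched as follows; the function name and the choice of equation are assumptions, not details taken from the study:

```python
def ckd_epi_2021(scr_mg_dl: float, age_years: float, female: bool) -> float:
    """Estimate GFR (mL/min/1.73 m^2) with the race-free CKD-EPI 2021
    creatinine equation:
    142 * min(Scr/k, 1)^a * max(Scr/k, 1)^-1.200 * 0.9938^age * (1.012 if female)
    """
    k = 0.7 if female else 0.9          # sex-specific creatinine threshold
    a = -0.241 if female else -0.302    # sex-specific exponent below threshold
    ratio = scr_mg_dl / k
    egfr = (142
            * min(ratio, 1.0) ** a
            * max(ratio, 1.0) ** -1.200
            * 0.9938 ** age_years)
    return egfr * 1.012 if female else egfr

# Illustrative values only (the study reports group means, not individual data):
# a hypothetical 45-year-old woman with serum creatinine 0.92 mg/dL
print(round(ckd_epi_2021(0.92, 45, female=True), 1))
```

A value in the high-70s for this hypothetical patient is consistent with the mildly reduced group-mean eGFR the abstract reports for the rUTI arm.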
Research Article
Open Access
Exploring Biochemical Markers in the Diagnosis and Management of Type 2 Diabetes Mellitus
Manoranjan Mallick,
Jagyanprava Dalai,
Rajanikanta Sahoo,
Rajesh Senapati
Pages 1354 - 1358

View PDF
Abstract
Background: Type 2 diabetes mellitus (T2DM) is a chronic metabolic disorder characterized by hyperglycemia due to insulin resistance and/or impaired insulin secretion. This study aimed to evaluate the association between various biochemical markers and glycemic control in T2DM patients. A cross-sectional analysis was conducted on 200 patients recruited from a tertiary care hospital. Data collection included demographic information, anthropometric measurements, and laboratory tests for fasting plasma glucose (FPG), glycated hemoglobin (HbA1c), lipid profile, adipokines (adiponectin, leptin), inflammatory markers (CRP, IL-6), and oxidative stress markers (MDA, TAC). Results indicated significant correlations between several biochemical markers and HbA1c levels. FPG, total cholesterol, LDL-C, CRP, IL-6, and MDA were positively correlated with HbA1c, indicating poorer glycemic control. Conversely, adiponectin and TAC showed negative correlations with HbA1c, suggesting a protective role. Multiple linear regression analysis identified FPG, CRP, and adiponectin as significant independent predictors of HbA1c levels. The findings underscore the critical role of a comprehensive panel of biochemical markers in the management of T2DM. Incorporating inflammatory and oxidative stress markers into routine monitoring could enhance the precision of glycemic control strategies and improve patient outcomes. Future research should focus on longitudinal studies and intervention trials to further elucidate these associations and develop targeted therapeutic interventions.
Research Article
Open Access
Prevalence and Determinants of Electrolyte Imbalances Among Hospitalized Patients in the Department of Internal Medicine: An Observational Study
Dr G. Mounika Reddy,
Dr Pranav Vijay Deore
Pages 396 - 399

View PDF
Abstract
Background: Electrolyte disturbances are common in hospitalized adults and are linked to adverse clinical outcomes. Objectives: To estimate the prevalence and pattern of electrolyte imbalances among adult inpatients and to describe key clinical determinants. Methods: This hospital-based observational study enrolled 100 consecutive adults admitted to the Department of General Medicine, Prathima Institute of Medical Sciences, Karimnagar, Telangana, India, from January 2022 to June 2022. Admission serum sodium, potassium, and calcium were reviewed along with comorbidities and selected clinical exposures. Electrolyte imbalance was defined as any sodium, potassium, or calcium value outside the institutional reference range. Results: Electrolyte imbalance was present in 64% of admissions. Hyponatremia was the most frequent abnormality, followed by hypokalemia. Imbalance occurred more often among patients receiving diuretics and among those with chronic kidney disease, gastrointestinal fluid loss, diabetes mellitus, and sepsis or severe infection. Conclusion: Electrolyte disturbances affected nearly two-thirds of medical inpatients. Early identification and risk-based monitoring of vulnerable groups are essential to improve inpatient safety.
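The abstract defines electrolyte imbalance as any sodium, potassium, or calcium value outside the institutional reference range. A minimal sketch of that screening rule, assuming typical adult reference ranges (the study's institutional ranges are not reported in the abstract, so the bounds below are illustrative):

```python
# Assumed typical adult reference ranges; the study's institutional
# ranges are not given in the abstract.
REFERENCE_RANGES = {
    "sodium": (135.0, 145.0),   # mmol/L
    "potassium": (3.5, 5.0),    # mmol/L
    "calcium": (8.5, 10.5),     # mg/dL (total serum calcium)
}

def has_imbalance(labs: dict) -> bool:
    """Flag an admission if any measured electrolyte falls outside its
    reference range, mirroring the study's definition of imbalance."""
    for analyte, value in labs.items():
        low, high = REFERENCE_RANGES[analyte]
        if not (low <= value <= high):
            return True
    return False

print(has_imbalance({"sodium": 128.0, "potassium": 4.1}))                  # hyponatremia
print(has_imbalance({"sodium": 139.0, "potassium": 4.1, "calcium": 9.2}))  # all in range
```

Note the rule is an "any abnormality" flag, matching the abstract's 64% prevalence figure, which counts a patient once regardless of how many analytes are deranged.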
Research Article
Open Access
Role of intraoperative fluid management in reducing postoperative complications in gastrointestinal surgery
Pages 507 - 510

View PDF
Abstract
Background: Intraoperative fluid management plays a critical role in gastrointestinal (GI) surgery, as both fluid overload and hypovolemia can contribute to postoperative complications. Goal-directed fluid therapy (GDFT) has been proposed to optimize tissue perfusion and reduce morbidity. Aim: To evaluate the impact of intraoperative fluid management strategies on postoperative outcomes in patients undergoing major gastrointestinal surgery. Methods: This prospective observational study included 160 adult patients undergoing elective gastrointestinal surgery. Patients were divided into two groups: GDFT (n=80), managed with individualized fluid therapy guided by dynamic hemodynamic monitoring, and Liberal Fluid (LF) group (n=80), managed with conventional fluid protocols. Demographic data, intraoperative fluid volume, and postoperative outcomes, including surgical site infection, anastomotic leak, postoperative ileus, acute kidney injury, and length of hospital stay, were recorded. Statistical analysis was performed using SPSS 26, with p<0.05 considered significant. Results: Baseline demographics were comparable between groups. Total intraoperative fluid administered was significantly lower in the GDFT group (2,350 ± 450 mL vs. 3,200 ± 500 mL, p<0.001). The incidence of surgical site infection (5% vs. 15%, p=0.03) and postoperative ileus (8% vs. 18%, p=0.04) was significantly reduced in the GDFT group. Patients in the GDFT group experienced faster gastrointestinal recovery, with earlier time to first flatus (2.1 ± 0.6 vs. 3.0 ± 0.8 days, p<0.001) and shorter hospital stay (6.2 ± 1.5 vs. 8.1 ± 2.0 days, p<0.001). No significant differences were observed in anastomotic leaks or pulmonary complications. Conclusion: Goal-directed intraoperative fluid therapy reduces postoperative complications, accelerates gastrointestinal recovery, and shortens hospital stay compared to standard liberal fluid administration in patients undergoing gastrointestinal surgery. Incorporating GDFT into routine perioperative care may improve surgical outcomes and enhance recovery pathways.
Research Article
Open Access
Clinical Profile, Angiographic Characteristics, and Short- to Mid-Term Outcomes of Patients With In-Stent Restenosis
Kunal Parwani,
Pankaj Singh,
Jayal Shah,
Krutika Patel,
Meghkumar Shah
Pages 15 - 20

View PDF
Abstract
Background: The optimal treatment strategy for in-stent restenosis (ISR) remains under debate, and data on ISR treatment in Indian patients are scarce. Hence, the present study was undertaken to examine the clinical profile and angiographic patterns of patients admitted with ISR, who were followed for 6 months for outcomes associated with different treatment modalities. Method: A total of 200 patients were enrolled in the study during the period from January 2022 to March 2023. All patients underwent a general and cardiac evaluation. We evaluated the types of clinical presentation, angiographic characteristics of ISR, laboratory parameters, treatment modalities used for stenting ISR lesions, and the outcomes that occurred within a minimum period of 6 months from the date of clinical presentation. Results: Unstable angina (39.5%) and NSTEMI (36%) were the common clinical presentations in ISR cases. Single-vessel disease (46%) with LAD involvement (60%) was common, with uncontrolled DM (90%) a frequently noted factor. Non-focal ISR (65%) involving both previous DES and BMS stents was observed. Most patients with a previous BMS stent had neo-atherosclerosis (56.25%), while in previous DES cases the commonest IVUS finding was underexpansion (53.84%). The commonest treatment modality adopted for ISR was PTCA (64%), which led to a significant increase in both stent length and stent diameter after management. The overall outcomes of ISR management were promising, with 97.5% of cases discharged from hospital after management and remaining well at 6-month follow-up. Conclusion: This study highlights the effectiveness of current ISR management strategies and underscores the need for ongoing monitoring and tailored treatment approaches in the Indian population.
Research Article
Open Access
A Prospective Study of Adverse Drug Reactions in a Tertiary Care Hospital
V. Prabhanjan Kumar,
Kathi Madhu Chandra
Pages 75 - 79

View PDF
Abstract
Background: Adverse drug reactions (ADRs) represent a significant burden on healthcare systems globally, contributing to increased morbidity, prolonged hospitalisation, and substantial economic costs. Systematic pharmacovigilance in tertiary care settings provides essential safety data for clinical decision-making. Objective: To prospectively identify, assess, and characterise ADRs in patients admitted to a tertiary care teaching hospital over a 12-month period, with a focus on causality, severity, preventability, and drug class involvement. Methods: A prospective, observational study was conducted over 12 months (January–December 2020) across medicine, surgery, and allied departments. ADRs were collected via spontaneous reporting, active ward monitoring, and prescription review. Causality was assessed using the Naranjo Algorithm and WHO-UMC criteria. Severity was graded by the Modified Hartwig and Siegel scale, and preventability was evaluated using the Schumock and Thornton criteria. Results: Of 3,842 patients monitored, 312 ADRs were identified (incidence: 8.12%). Female patients were more frequently affected (58.3%). The most common drug classes implicated were antimicrobials (27.6%), NSAIDs (18.3%), and cardiovascular drugs (15.7%). The gastrointestinal system was the most frequently affected organ (32.4%). Causality assessment revealed 41.0% probable, 38.5% possible, and 18.3% definite ADRs. Overall, 72.4% were mild-to-moderate, and 34.9% were considered preventable. Conclusion: The high proportion of preventable ADRs underscores the urgent need for structured pharmacovigilance programmes, therapeutic drug monitoring, and multidisciplinary medication safety teams in tertiary hospitals.
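The Naranjo Algorithm cited in the abstract yields a numeric total score that is conventionally binned into the causality categories the study reports (definite, probable, possible). A minimal sketch of that mapping, using the standard published cut-offs; the study's item-by-item scoring is not reproduced here:

```python
def naranjo_category(score: int) -> str:
    """Map a Naranjo total score to its conventional causality category:
    >= 9 definite, 5-8 probable, 1-4 possible, <= 0 doubtful."""
    if score >= 9:
        return "definite"
    if score >= 5:
        return "probable"
    if score >= 1:
        return "possible"
    return "doubtful"

print(naranjo_category(7))  # a mid-range score maps to "probable"
```

Under this binning, the study's predominance of "probable" (41.0%) and "possible" (38.5%) assessments corresponds to total scores clustering in the 1 to 8 range.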