Background: Chronic obstructive pulmonary disease (COPD) is a global health problem with high morbidity and mortality. Tai Ji and Qigong are traditional Chinese meditative movement practices that benefit the physical and mental health of COPD patients. Methods: We searched the following twelve databases from inception to July 2023: Web of Science, EBSCO, Medline, EMBASE, Scopus, PubMed, PsycArticles, Psychology and Behavioral Sciences Collection, PsycInfo, CINAHL, the Cochrane Library, and ClinicalTrials.gov. RCTs of Tai Ji and/or Qigong in patients with stable COPD were eligible, with no restrictions on age, publication language, or comparison intervention. Outcome measures comprised pulmonary function, the incidence of acute exacerbation, 6MWD, chronic pain, physical and/or cognitive function, and any assessment of QoL. Results: Our research will update evidence summaries and provide a quantitative and standardized assessment of the effect of Tai Ji and/or Qigong on patients with stable COPD. Conclusion: Our research will generate the latest evidence for determining whether Tai Ji and/or Qigong is equivalent to conventional pulmonary rehabilitation (PR). Abbreviations: RCTs = randomized controlled trials; 6MWD = 6-minute walking distance; QoL = quality of life; CIs = confidence intervals; CAT = COPD assessment test.
While healthy individuals have redundant degrees of freedom of the joints, they coordinate their multi-joint movements such that the redundancy is effectively reduced. Achieving high inter-joint coordination may be difficult for upper limb prosthesis users due to the lack of proprioceptive feedback and limited motion of the terminal device. This study compared inter-joint coordination between prosthesis users and individuals without limb loss during different upper limb activities of daily living (ADLs). Nine unilateral prosthesis users (five males) and nine age- and sex-matched controls without limb loss completed three unilateral and three bilateral ADLs. Principal component analysis was applied to the three-dimensional motion trajectories of the trunk and arms to identify coordinative patterns. For each ADL, we quantified the cumulative variance accounted for (VAF) of the first five principal components (PCs), which was the lowest number of PCs that could achieve 90% VAF in control limb movements across all ADLs (5 ≤ n ≤ 9). The VAF was lower for movements involving a prosthesis compared to those completed by controls across all ADLs (p < 0.001). The PC waveforms were similar between movements involving a prosthesis and movements completed by control participants for PC1 (r > 0.78, p < 0.001). The magnitude of the relationship for PC2 and PC3 differed between ADLs, with the strongest correlation for symmetric bilateral ADLs (0.67 ≤ r ≤ 0.97, p < 0.001). Collectively, this study demonstrates that activities of daily living are less coordinated for prosthesis users compared to individuals without limb loss. Future work should explore how device features, such as the availability of sensory feedback or motorized wrist joints, influence multi-joint coordination.
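The cumulative VAF criterion described above can be sketched numerically. This is a minimal illustration on synthetic random data (NumPy assumed), not the study's actual motion-capture pipeline:

```python
import numpy as np

def cumulative_vaf(trajectories, n_components):
    """Cumulative variance accounted for (VAF) by the first
    n_components principal components of a (samples x channels)
    trajectory matrix. Illustrative sketch only."""
    X = trajectories - trajectories.mean(axis=0)   # center each channel
    # Singular values give per-component variance up to a constant factor
    s = np.linalg.svd(X, compute_uv=False)
    var = s ** 2
    return var[:n_components].sum() / var.sum()

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 9))            # e.g. 9 joint-angle channels
print(round(cumulative_vaf(X, 9), 3))    # all 9 components -> 1.0
```

For structured movement data, a few components typically capture most of the variance, so the number of PCs needed to reach a 90% threshold is a compact coordination measure.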
Objective: This review took a previous study, "Cancer family caregivers during the palliative, hospice, and bereavement phases: A review of the descriptive psychosocial literature," as its methodological template, restricted to the most recent decade. The purpose of this review was to organize the literature and compare it with the findings of the previous study. Method: As a systematic review, major databases were searched for non-intervention descriptive studies. Studies of psychosocial variables of family caregivers to adults with cancer during these phases were included. Result: The 23 studies reviewed were conducted in ten countries and varied considerably in samples, outcome measures, and results. Despite several restrictive conditions, results for factors such as age, gender, and relationship to the patient were inconsistent. Across the 23 studies, 53 unique instruments were used, 13 of which had no psychometric testing. Family caregivers who were younger and faced a higher level of daily life impairment tended to be burdened, anxious, and depressed. Among the factors influencing caregivers' status, complicated grief was consistently related to their situation. Conclusion: Compared with the previous study, this review demonstrated inconsistent results for spousal relationship, gender, and age as factors affecting family caregivers' status. However, the use of measurement instruments was much more rigorous than before. The major study sites and the number of studies had also changed. As a consequence of their physical and psychosocial status, family caregivers constitute a high-risk population.
Background: Self-harm and suicide are relatively overrepresented in incarcerated populations, especially in female prisons. Identifying those most at risk of significant self-harm could provide opportunities for effective, targeted interventions. Aims: To develop and validate a machine learning-based algorithm capable of achieving a clinically useful level of accuracy when predicting the risk of self-harm in female prisoners. Method: Data were available on 31 variables for 286 female prisoners from a single UK-based prison. These included sociodemographic factors, the nature of the index offence, and responses to several psychometric assessment tools used at baseline. At 12-month follow-up, any self-harm incidents were reported. A machine learning algorithm (CatBoost) to predict self-harm at one year was developed and tested. To quantify uncertainty about the accuracy of the algorithm, the model building and evaluation process was repeated 2000 times and the distribution of results summarised. Results: The mean Area Under the Curve (AUC) for the model on unseen (validation) data was 0.92 (SD 0.04). Sensitivity was 0.83 (SD 0.07), specificity 0.94 (SD 0.03), positive predictive value 0.78 (SD 0.08) and negative predictive value 0.95 (SD 0.02). If the algorithm were used in this population, for every 100 women screened, this would equate to approximately 17 true positives and five false positives. Conclusions: The accuracy of the algorithm was superior to those previously reported for predicting future self-harm in general and prison populations and is likely to provide clinically useful levels of prediction. Research is needed to evaluate the feasibility of implementing this approach in a prison setting.
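The reported operating point can be translated into expected counts per 100 women screened. The prevalence below is an assumed value chosen to reproduce the abstract's figures, not a statistic reported by the study:

```python
def screening_counts(n, prevalence, sensitivity, specificity):
    """Expected confusion-matrix counts when screening n people.
    Illustrative arithmetic only; prevalence is an assumed input."""
    positives = n * prevalence
    negatives = n - positives
    tp = sensitivity * positives
    fp = (1 - specificity) * negatives
    fn = positives - tp
    tn = negatives - fp
    return tp, fp, fn, tn

# Reported operating point: sens 0.83, spec 0.94. An assumed prevalence
# of ~20% reproduces the abstract's ~17 true and ~5 false positives.
tp, fp, fn, tn = screening_counts(100, 0.20, 0.83, 0.94)
print(round(tp), round(fp))        # -> 17 5
print(round(tp / (tp + fp), 2))    # PPV -> 0.78
```

This kind of back-calculation is a quick consistency check between a paper's sensitivity/specificity and its stated positive predictive value.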
Adolescents exhibit remarkable heterogeneity in the structural architecture of brain development. However, due to the lack of large-scale longitudinal neuroimaging studies, existing research has largely focused on population averages and the neurobiological basis underlying individual heterogeneity remains poorly understood. Using structural magnetic resonance imaging from the IMAGEN cohort (n=1,543), we show that adolescents can be clustered into three groups defined by distinct developmental patterns of whole-brain gray matter volume (GMV). Genetic and epigenetic determinants of group clustering and long-term impacts of neurodevelopment in mid-to-late adulthood were investigated using data from the Adolescent Brain Cognitive Development (ABCD), IMAGEN and UK Biobank cohorts. Group 1, characterized by continuously decreasing GMV, showed generally the best neurocognitive performances during adolescence. Compared to Group 1, Group 2 exhibited a slower rate of GMV decrease and worsened neurocognitive development, which was associated with epigenetic changes and greater environmental burden. Further, Group 3 showed increasing GMV and delayed neurocognitive development during adolescence due to a genetic variation, while these disadvantages were attenuated in mid-to-late adulthood. In summary, our study revealed novel clusters of adolescent structural neurodevelopment and suggested that genetically-predicted delayed neurodevelopment has limited long-term effects on mental well-being and socio-economic outcomes later in life. Our results could inform future research on policy interventions aimed at reducing the financial and emotional burden of mental illness.
Adolescence is a critical developmental period with increased vulnerability to mental disorders. While the positive impact of physical exercise on adult mental health is well-established, dose-response relationships and the underlying neural and genetic mechanisms in adolescents remain elusive. Leveraging data from >11,000 pre-adolescents (9-10 years, ABCD Study), we examined associations between seven different measures of exercise dosage across 15 exercises and psychopathology, and the roles of brain function and structure and psychiatric genetic risks. Five specific exercises (basketball, baseball/softball, soccer, football, and skiing) were associated with better mental health, while the beneficial effects varied with exercise type, dosage measure, and dimension of psychopathology. Interestingly, more exercise does not always translate to better mental health, whereas earlier initiation was consistently advantageous. Communication between attention and default-mode brain networks mediated the beneficial effect of playing football. Crucially, exercise mitigated the detrimental effects of psychiatric genetic risks on mental health. We offer a nuanced understanding of exercise effects on adolescent mental health to promote personalized exercise-based interventions in youth.
Aims: Nitazoxanide is a broad-spectrum antiviral with potential application in a number of viral infections. Its use is limited by gastrointestinal side effects associated with increasing dose. In this study, we investigated the possibility of enhancing the exposure of its active metabolite, tizoxanide, through pharmacokinetic interaction with atazanavir/ritonavir. Method: This was a crossover drug-drug interaction study in which 18 healthy participants received a single dose of 1000 mg of nitazoxanide alone in period 1 and in combination with 300/100 mg atazanavir/ritonavir in period 2, after a washout period of 21 days. On both days, blood samples for intensive pharmacokinetic analyses were collected before and at 0.25, 0.5, 1, 2, 4, 6, and 12 h after dose. To explore the utility of dried blood spots (DBS) as an alternative to plasma for tizoxanide quantification, 50 µL of blood from some participants was spotted on DBS cards. Pharmacokinetic parameters were derived by non-compartmental analysis and compared between periods 1 and 2. The correlation between tizoxanide concentrations in plasma and DBS was also evaluated. Results: Co-administration of nitazoxanide with atazanavir/ritonavir resulted in a significant increase in tizoxanide plasma exposure. The geometric mean ratios (90% CI) of tizoxanide AUC0-12h, Cmax and C12h were 1.872 (1.870-1.875), 2.029 (1.99-2.07) and 3.14 (2.268-4.352), respectively; all were outside the 0.8-1.25 interval, implying a clinically significant interaction. DBS concentrations (%CV) were 46.3% (5.6%) lower than plasma concentrations, with a strong correlation (R = 0.89, P < 0.001). Similarly, DBS-derived plasma concentrations and measured plasma concentrations displayed a very strong linear correlation (R = 0.95, P < 0.001). Conclusion: Co-administration with atazanavir/ritonavir enhanced tizoxanide exposure, with no adverse events reported in healthy volunteers.
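The geometric mean ratio underlying these comparisons can be sketched as follows, using made-up paired AUC values rather than the study's data:

```python
import math
from statistics import mean

def geometric_mean_ratio(test, reference):
    """Point estimate of the geometric mean ratio (GMR) from paired
    exposures, as used in bioequivalence-style comparisons: the
    exponential of the mean within-subject log difference.
    Sketch with invented numbers, not the study's dataset."""
    diffs = [math.log(t) - math.log(r) for t, r in zip(test, reference)]
    return math.exp(mean(diffs))

# Hypothetical paired AUC0-12h values (combination vs. nitazoxanide alone)
auc_combo = [60.1, 55.4, 71.9, 64.3]
auc_alone = [31.8, 30.2, 38.5, 34.0]
gmr = geometric_mean_ratio(auc_combo, auc_alone)
print(round(gmr, 2))   # ratio well above the 0.8-1.25 no-effect interval
```

A 90% CI on the GMR is then built from the standard error of the paired log differences and back-transformed, which is why a CI lying entirely outside 0.8-1.25 is read as a clinically significant interaction.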
We studied the impact of microstructural abnormalities in the corpus callosum on language development in 348 infants born very prematurely. We discovered that the fractional anisotropy of the corpus callosum anterior midbody was a significant predictor of standardized language scores at two years, independent of clinical and social risk factors.
Problem: The past decades have yielded an explosion of research using artificial intelligence for cancer detection and diagnosis in the field of computational pathology. Yet, an often unspoken assumption of this research is that a glass microscopy slide faithfully represents the underlying disease. Here we show that systematic failure modes may dominate the slides digitized from a given medical center, such that neither the whole slide images nor the glass slides are suitable for rendering a diagnosis. Methods: We quantitatively define high quality data as a set of whole slide images where the type of surgery the patient received may be accurately predicted by an automated system such as ours, called "iQC". We find that iQC accurately distinguished biopsies from nonbiopsies, e.g. prostatectomies or transurethral resections (TURPs, a.k.a. prostate chips), only when the data qualitatively appeared to be high quality, e.g. vibrant histopathology stains and minimal artifacts. Crucially, prostate needle biopsies appear as thin strands of tissue, whereas prostatectomies and TURPs appear as larger rectangular blocks of tissue. Therefore, when the data are of high quality, iQC (i) accurately classifies pixels as tissue, (ii) accurately generates statistics that describe the distribution of tissue in a slide, and (iii) accurately predicts surgical procedure. Results: While we do not control any medical center's protocols for making or storing slides, we developed the iQC tool to hold all medical centers and datasets to the same objective standard of quality. We validate this standard across five Veterans Affairs Medical Centers (VAMCs) and the Automated Gleason Grading Challenge (AGGC) 2022 public dataset. For our surgical procedure prediction task, we report an Area Under Receiver Operating Characteristic (AUROC) of 0.9966-1.000 at the VAMCs that consistently produce high quality data and an AUROC of 0.9824 for the AGGC dataset.
In contrast, we report an AUROC of 0.7115 at the VAMC that consistently produced poor quality data. An attending pathologist determined that poor data quality was likely driven by faded histopathology stains and protocol differences among VAMCs. Specifically, this VAMC produced slides entirely by hand, whereas all other VAMCs leveraged automated methods to produce slides. Conclusion: Our surgical procedure prediction AUROC may be a quantitative indicator positively associated with high data quality at a medical center or for a specific dataset. To produce high quality data, we recommend producing slides using robotics or other forms of automation whenever possible. We recommend scanning slides digitally before the glass slide has time to develop signs of age, e.g. faded stains and acrylamide bubbles. To our knowledge, iQC is the first automated system in computational pathology that validates data quality against objective evidence, e.g. surgical procedure data available in the EHR or LIMS, which requires zero effort or annotation from anatomic pathologists.
Irritable Bowel Syndrome (IBS) is characterized by abdominal pain and alterations in bowel pattern, such as constipation (IBS-C), diarrhea (IBS-D), or mixed (IBS-M). Since malabsorption of ingested carbohydrates (CHO) can cause abdominal symptoms that closely mimic those of IBS, identifying genetic mutations in CHO digestive enzymes associated with IBS symptoms is critical to ascertain IBS pathophysiology. Through candidate gene association studies, we identify several common variants in TREH, SI, SLC5A1 and SLC2A5 that are associated with IBS symptoms. By investigating rare recessive Mendelian or oligogenic inheritance patterns, we identify case-exclusive rare deleterious variation in known disease genes (SI, LCT, ALDOB, and SLC5A1) as well as candidate disease genes (MGAM and SLC5A2), providing potential evidence of monogenic or oligogenic inheritance in a subset of IBS cases. Finally, our data highlight that moderate to severe IBS-associated gastrointestinal symptoms are often observed in IBS cases carrying one or more deleterious rare variants.
Purpose: Diagnosed depression is prevalent in prisons of affluent countries; literature on depression screening in prisons of low-resource nations is sparse. Haiti has experienced multiple recent disasters, which could have both somatic and mental health consequences. To surveil its prisons for depression, ethnoculturally appropriate scales could be helpful. Design/methodology/approach: We performed a cross-sectional analysis of symptoms of depression and their associations among participants in a 2019-2020 tuberculosis treatment adherence project across six Haitian prisons. To measure depression, we piloted the use of the Zanmi-Lasante Depression Symptom Inventory (ZLDSI) scale in a carceral setting. We calculated its Cronbach's alpha in this setting and generated binary logistic models to study the associations of depression with basic demographic variables; use of cigarettes, marijuana, and alcohol; and incarceration history. We then performed a multivariate logistic regression to determine whether substance use and education predicted depression after adjusting for age. Findings: Fifty subjects were recruited; age ranged from 18 to 59 years. Adherence to TB medication was recorded as above 99% in all subjects. The Cronbach's alpha for the ZLDSI scale in this population was 0.77, indicating acceptable internal consistency. A ZLDSI score ≥13.0 has been associated with depression; 66% of participants had scores of 13.0 or greater (mean 13.9, S.D. 8.2). Multivariate analysis showed significant associations between depression and alcohol consumption, age, and income. Originality/Conclusion: We believe this study represents the first measurement of depressive symptoms in a Haitian prison population; depressive symptoms were common.
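Cronbach's alpha, the internal-consistency statistic reported above, follows a standard formula: the ratio of summed item variances to total-score variance, rescaled by k/(k-1). A minimal sketch with invented item scores (not the ZLDSI data):

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns
    (each column holds all respondents' scores on one item).
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Toy data: 3 items answered by 5 respondents
items = [
    [2, 3, 4, 4, 5],
    [1, 3, 4, 5, 5],
    [2, 2, 4, 4, 5],
]
print(round(cronbach_alpha(items), 2))
```

Values around 0.7-0.8, like the 0.77 reported here, are conventionally read as acceptable internal consistency for a screening scale.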
In Rhineland-Palatinate, Germany, a system of three data sources has been established to track the Covid-19 pandemic. These sources are the number of Covid-19-related hospitalizations, the Covid-19 gene copies in wastewater, and the prevalence derived from a cohort study. This paper presents an extensive comparison of these parameters. It investigates whether wastewater data and a cohort study can be valid surrogate parameters for the number of hospitalizations and thus serve as predictors for coming Covid-19 waves. We observe that this is generally possible for the cohort study prevalence, while the wastewater data exhibit too much variability to allow quantitative predictions by a purely data-driven approach. However, both the wastewater data and the cohort study prevalence are able to detect hospitalization waves in a qualitative manner. Furthermore, a detailed comparison of different normalization techniques for wastewater data is provided.
Background Understanding the role of circulating proteins in prostate cancer risk can reveal key biological pathways and identify novel targets for cancer prevention. Methods We investigated the association of 2,002 genetically predicted circulating protein levels with risk of prostate cancer overall, and of aggressive and early onset disease, using cis-pQTL Mendelian randomization (MR) and colocalization. Findings for proteins with support from both MR, after correction for multiple testing, and colocalization were replicated using two independent cancer GWAS, one of European and one of African ancestry. Proteins with evidence of prostate-specific tissue expression were additionally investigated using spatial transcriptomic data in prostate tumor tissue to assess their role in tumor aggressiveness. Finally, we mapped risk proteins to drug and ongoing clinical trial targets. Results We identified 20 proteins genetically linked to prostate cancer risk (14 for overall [8 specific], 7 for aggressive [3 specific], and 8 for early onset disease [2 specific]), of which a majority were novel and replicated. Among these were proteins associated with aggressive disease, such as PPA2 [Odds Ratio (OR) per 1 SD increment = 2.13, 95% CI: 1.54-2.93], PYY [OR = 1.87, 95% CI: 1.43-2.44] and PRSS3 [OR = 0.80, 95% CI: 0.73-0.89], and those associated with early onset disease, including EHBP1 [OR = 2.89, 95% CI: 1.99-4.21], POGLUT3 [OR = 0.76, 95% CI: 0.67-0.86] and TPM3 [OR = 0.47, 95% CI: 0.34-0.64]. We confirm an inverse association of MSMB with prostate cancer overall [OR = 0.81, 95% CI: 0.80-0.82], and also find an inverse association with both aggressive [OR = 0.84, 95% CI: 0.82-0.86] and early onset disease [OR = 0.71, 95% CI: 0.68-0.74]. Using spatial transcriptomics data, we identified MSMB as the genome-wide top predictive gene for distinguishing benign regions from high-grade cancer regions, which had five-fold lower MSMB expression.
Additionally, ten proteins that were associated with prostate cancer risk mapped to existing therapeutic interventions. Conclusion Our findings emphasize the importance of proteomics for improving our understanding of prostate cancer etiology and of opportunities for novel therapeutic interventions. Additionally, we demonstrate the added benefit of in-depth functional analyses to triangulate the role of risk proteins in the clinical aggressiveness of prostate tumors. Using these integrated methods, we identify a subset of risk proteins associated with aggressive and early onset disease as priorities for investigation for the future prevention and treatment of prostate cancer.
The COVID-19 pandemic has exposed a number of key challenges that need to be urgently addressed. In particular, rapid identification and validation of prognostic markers is required. Mass spectrometric studies of blood plasma proteomics provide a deep understanding of the relationship between the severe course of infection and activation of specific pathophysiological pathways. Analysis of plasma proteins in whole blood may also be relevant for the pandemic as it requires minimal sample preparation. Here, for the first time, frozen whole blood samples were used to analyze 189 plasma proteins using multiple reaction monitoring (MRM) mass spectrometry and stable isotope-labeled peptide standards (SIS). A total of 128 samples (FRCC, Russia) from patients with mild (n=40), moderate (n=36) and severe (n=19) COVID-19 infection and healthy controls (n=33) were analyzed. Levels of 114 proteins were quantified and compared. Significant differences between all of the groups were revealed for 61 proteins. Changes in the levels of 30 reproducible COVID-19 markers (SERPING1, CRP, C9, ORM1, APOA1, SAA1/SAA2, LBP, AFM, IGFALS, etc.) were consistent with studies performed with serum/plasma samples. Levels of 70 proteins correlated between whole blood and plasma samples. The best-performing classifier, built with 13 of the significantly different proteins, achieved ROC-AUC of 0.93-0.95 and accuracy of 0.87-0.93, and distinguished patients from controls, as well as patients by severity and risk of mortality. Overall, the results support the use of frozen whole blood for MRM analysis of plasma proteins and assessment of the status of patients with COVID-19.
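The reported ROC-AUC can be illustrated with its rank-based definition: the probability that a randomly chosen patient scores above a randomly chosen control, with ties counting half. The scores below are hypothetical, not the study's classifier outputs:

```python
def auroc(scores_pos, scores_neg):
    """ROC-AUC as the fraction of (positive, negative) pairs where the
    positive example receives the higher score (ties count 0.5).
    Toy illustration of the metric, not the study's model."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical classifier scores for patients vs. healthy controls
patients = [0.91, 0.84, 0.78, 0.66, 0.59]
controls = [0.72, 0.41, 0.35, 0.30, 0.22]
print(auroc(patients, controls))   # -> 0.92
```

This pairwise formulation makes clear why AUROC is insensitive to any monotone rescaling of the classifier's scores.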
Since its resurgence in 2017, Yellow fever (YF) outbreaks have continued to occur in Nigeria despite routine immunization and implementation of several reactive mass vaccination campaigns, resulting in substantial morbidity and mortality. Nigeria is considered a high-priority country for implementing the WHO EYE strategy, which is targeted at eliminating YF outbreaks by 2026. This retrospective observational study was conducted to describe the epidemiological profile of reported cases, trends, and seasonality of YF incidence, and to identify factors associated with Yellow fever disease (YFD) and barriers to YF vaccination in Nigeria. Univariate, bivariate and multivariate binary logistic regression analyses were done. Of 13,014 suspected YF cases, 7,640 (58.7%) had laboratory confirmation for Yellow fever virus (YFV). Predictors of YFD were male sex (aOR 2.36, 95% CI: 1.45-3.91) compared to female; age group 15-29 years (aOR 4.13, 95% CI: 1.59-13.00) compared to under-five; residing in the Derived Savannah (aOR 30.10, 95% CI: 11.50-104.00), Lowland/Mangrove/Freshwater rainforest (aOR 8.84, 95% CI: 3.24-31.10), or Guinea Savannah/Jos Plateau (aOR 6.13, 95% CI: 1.90-23.50) compared to the Sahel/Sudan savannah; working in outdoor settings compared to indoor (aOR 1.76, 95% CI: 0.96-3.22); and vomiting (aOR 2.62, 95% CI: 1.39-4.83). The rainy season was protective against YFD (aOR 0.32, 95% CI: 0.19-0.52) compared to the dry season. Because being unvaccinated emerged as a protective factor (aOR 0.51, 95% CI: 0.25-1.00) compared to those with unknown vaccination status, the data were further disaggregated by vaccination status. Predictors with higher odds ratios were found among the unvaccinated. Predictors of YFD among the vaccinated were the first quarter compared to the second quarter of the year (aOR 4.04, 95% CI: 1.48-12.95) and residing in the southern region compared to the north (aOR 14.03, 95% CI: 4.09-88.27).
Barriers to YF vaccination were the rainy season compared to the dry season (aOR 1.29, 95% CI: 1.05-1.57); being 15 years or older (15-29: aOR 2.06, 95% CI: 1.51-2.83; 30-44: aOR 2.11, 95% CI: 1.45-3.07; 45-59: aOR 2.72, 95% CI: 1.63-4.58; 60+: aOR 6.55, 95% CI: 2.76-17.50); residing in the northern region (aOR 3.71, 95% CI: 3.01-4.58) compared to the south; and occupation as butcher/hunter/farmer (aOR 2.30, 95% CI: 1.52-3.50) compared to home-based/office workers. Being a student was protective against being unvaccinated (aOR 0.62, 95% CI: 0.47-0.83). Several factors were associated with YFD, which were aggravated by lack of vaccination. Although barriers to vaccination were elucidated, inadequate vaccination coverage alone may not account for the recurrent outbreaks of YF in Nigeria. These findings are critical for planning public health interventions and for guiding further research that would enable Nigeria to end YF epidemics.
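The bivariate step of such an analysis can be sketched as an unadjusted odds ratio with a Wald confidence interval from a 2x2 table. The counts below are hypothetical, and the study's aORs come from multivariate logistic regression, which this sketch does not reproduce:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    OR = ad/bc; SE of log(OR) = sqrt(1/a + 1/b + 1/c + 1/d)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: laboratory-confirmed YF by sex
or_, lo, hi = odds_ratio_ci(a=450, b=310, c=210, d=340)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A CI excluding 1.0, as here, is what flags an exposure as a candidate predictor before adjustment for confounders.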
Background: Atherosclerotic Cardiovascular Disease (ASCVD) is a leading cause of death globally, and early detection of high-risk individuals is essential for initiating timely interventions. The authors aimed to develop and validate a deep learning (DL) model to predict an individual's elevated 10-year ASCVD risk score based on retinal images and limited demographic data. Methods: The study used 89,894 retinal fundus images from 44,176 UK Biobank participants (96% non-Hispanic White, 5% diabetic) to train and test the DL model. The DL model was developed using retinal images plus age, race/ethnicity, and sex at birth to predict an individual's 10-year ASCVD risk score using the Pooled Cohort Equation (PCE) as the ground truth. This model was then tested on the US EyePACS 10K dataset (5.8% non-Hispanic White, 99.9% diabetic), composed of 18,900 images from 8,969 diabetic individuals. Elevated ASCVD risk was defined as a PCE score of ≥7.5%. Results: In the UK Biobank internal validation dataset, the DL model achieved an area under the receiver operating characteristic curve (AUROC) of 0.89, sensitivity of 84%, and specificity of 90% for detecting individuals with elevated ASCVD risk scores. In the EyePACS 10K dataset, with the addition of a regression-derived diabetes modifier, it achieved sensitivity of 94%, specificity of 72%, mean error of -0.2%, and mean absolute error of 3.1%. Conclusion: This study demonstrates that DL models using retinal images can provide an additional approach to estimating ASCVD risk, and highlights the value of applying DL models to different external datasets and the opportunities for ASCVD risk assessment in patients living with diabetes.
Background: In October 2019, cannabis edibles were legalized for sale in Canada. This move was intended to improve public safety by regulating contents (including a maximum of 10 mg tetrahydrocannabinol (THC) per package) and packaging to prevent accidental ingestion or overconsumption. This study aimed to explore consumer preferences for cannabis edibles to inform cannabis policy. Methods: We explored the relative importance of, and trade-offs consumers make between, attributes of cannabis edibles using a discrete choice experiment. Attributes included type of edible, price, THC content, cannabis taste, package information, product consistency, product recommendations, and Health Canada regulation. Participants lived in Canada, were 19 years of age or older, and had purchased a cannabis edible in the last 12 months. A multinomial logit (MNL) model was used as the base model, with latent class analysis to assess preference sub-groups. Results: Among 684 participants, the MNL model showed that THC potency was the most important attribute, followed by edible type. A two-group latent class model revealed two very distinct preference patterns. Preferences for group 1 (~65% of the sample) were driven primarily by edible type, while those for group 2 (~35% of the sample) were driven almost entirely by THC potency. Conclusion: This study found that the preferences of ~65% of consumers of cannabis edibles are being met through regulated channels. The remaining ~35% are driven by THC potency at levels that are not currently available on the licensed market. Attracting this market segment will require reviewing the risks and benefits of restricting THC package content.
Background: Acute kidney injury (AKI) is a frequent complication in critically ill patients, leading to worse prognosis. Although the consequences of AKI are worse among critical patients, AKI is also associated with less favorable outcomes in non-critical patients. Understanding the magnitude of the problem in these patients is therefore crucial, yet there is a scarcity of evidence from non-critical settings, especially in resource-limited countries. Hence, this study aimed to determine the incidence and predictors of hospital-acquired acute kidney injury (HAAKI) in non-critical medical patients admitted to a large tertiary hospital in Ethiopia. Methods: A retrospective chart review study was conducted among 232 hospitalized non-critical medical patients admitted to St. Paul's Hospital Millennium Medical College between January 2020 and January 2022. Data were characterized using frequencies and medians with interquartile ranges. To identify predictors of HAAKI, a log binomial regression model was fitted at a p value of ≤ 0.05. The magnitude of association was measured using the adjusted relative risk (ARR) with its 95% CI. Results: During a median follow-up of 11 days (IQR, 6-19 days), the incidence of HAAKI was estimated to be 6.0 per 100 person-days of observation (95% CI = 5.5 to 7.2). Significant predictors of HAAKI were type 2 diabetes mellitus (ARR = 2.36, 95% CI = 1.03, 5.39, p-value = 0.042), and taking vancomycin (ARR = 3.04, 95% CI = 1.38, 6.72, p-value = 0.006) or proton pump inhibitors (ARR = 3.80, 95% CI = 1.34, 10.82, p-value = 0.012). Conclusions: HAAKI is a common complication in hospitalized non-critical medical patients and is associated with a common medical condition and with commonly prescribed medications. Therefore, it is important to remain vigilant in the prevention and timely identification of these cases and to establish a system of rational prescribing habits.
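An incidence rate per 100 person-days is computed directly from the event count and the accumulated follow-up time. The counts below, and the log-scale normal-approximation CI, are assumptions for illustration; the abstract does not specify the study's exact CI method:

```python
import math

def incidence_per_100_person_days(events, person_days):
    """Incidence rate per 100 person-days with an approximate 95% CI
    built on the log rate (SE of log rate ~ 1/sqrt(events)).
    Rough sketch, not the study's computation."""
    rate = events / person_days * 100
    se_log = 1 / math.sqrt(events)
    lo = rate * math.exp(-1.96 * se_log)
    hi = rate * math.exp(1.96 * se_log)
    return rate, lo, hi

# Hypothetical follow-up: 150 HAAKI events over 2,500 person-days
rate, lo, hi = incidence_per_100_person_days(150, 2500)
print(round(rate, 1))   # -> 6.0 per 100 person-days
```

Expressing incidence per person-time, rather than per patient, is what lets cohorts with very different lengths of stay be compared on a common scale.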
Background: Homelessness persists as a critical global issue despite myriad interventions. This study analyzed state-level differences in homelessness rates across the United States to identify influential societal factors to help guide resource prioritization. Methods: Homelessness rates for 50 states and Washington D.C. were compared using the most recent data from 2020-2023. Twenty-five variables representing potential socioeconomic and health contributors were examined. Given non-normal distributions, nonparametric statistical techniques, including correlation and predictive modeling, identified significant factors. Results: The cost of living index, mainly influenced by housing, transportation, and grocery costs, showed the strongest positive correlation with homelessness rates (all p <0.001). Unemployment, alcohol binging, taxes, and poverty were also influential factors. Opioid prescription rates demonstrated an unexpected negative correlation. Random forest classification emphasized the cost of living index as the primary contributor, with housing costs presenting the largest influence. Conclusion: This state-level analysis revealed the cost of living index, predominantly driven by housing expenses, as the foremost factor associated with homelessness rates, greatly outweighing other variables. These findings can help inform resource allocation to mitigate homelessness through targeted interventions.
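A nonparametric correlation of the kind this analysis relies on can be sketched with Spearman's rho, i.e. the Pearson correlation of the ranks. The state-level values below are invented for illustration, not the study's data:

```python
def rankdata(xs):
    """Ranks starting at 1; tied values share their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman's rho: Pearson correlation computed on the ranks,
    which makes it robust to non-normal, skewed distributions."""
    rx, ry = rankdata(xs), rankdata(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

cost_index = [88, 95, 103, 110, 128, 135]   # hypothetical state values
homeless_rate = [8, 10, 9, 14, 30, 25]      # hypothetical, per 10,000
print(round(spearman(cost_index, homeless_rate), 2))   # -> 0.89
```

Because it depends only on ranks, Spearman's rho is a natural choice when, as here, the variables are non-normally distributed across states.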
Background: The long-term impact of coronavirus disease 2019 (COVID-19) on many aspects of society emphasizes the importance of vaccination and of nucleic acid conversion time as markers of prevention and diagnosis. However, little research has been conducted on the immunological effects of vaccines and the factors influencing virus clearance. Epidemiological characteristics and factors related to disease prognosis and nucleic acid conversion time need to be explored. Design and participants: We reviewed the published literature to create an initial draft; the data were then statistically evaluated to determine associations. Given that a Chongqing shelter hospital is typical in terms of the hospital management and treatment of COVID-19 patients, a retrospective analysis was conducted on 4,557 cases of COVID-19 infection in a shelter hospital in Chongqing in December 2022, comprising 2,291 males and 2,266 females. The variables included age, medical history, nucleic acid conversion time, vaccination status, and clinical symptoms. Results: Univariate survival analysis using the log-rank test (P < 0.05) showed that factors such as age significantly affected nucleic acid conversion time. Cox regression analysis indicated a significant association between a history of hypertension and nucleic acid conversion time, with a hazard ratio of 0.897 (95% CI: 0.811 to 0.992). A statistically significant difference was observed between vaccinated and unvaccinated infected individuals in the presence of symptoms such as cough and sensory system manifestations (P < 0.05). Conclusion: The effect of vaccination against COVID-19 on symptoms such as coughing, nasal congestion, muscle aches, runny nose, and sensory system symptoms in COVID-19 patients was determined. Typical symptoms, such as runny nose, were generally more frequent in vaccinated than in unvaccinated patients; previous hypertension was an influential factor in nucleic acid conversion time in patients with COVID-19 infection.