At our community teaching hospital, end-of-life orders often lacked instructions to titrate opioids according to evidence-based principles and failed to address nonpain symptoms. An order set and a nursing-driven opioid titration protocol were implemented in August 2016 after extensive education. The purpose of this retrospective preintervention and postintervention study was to evaluate the impact of this intervention on the quality of end-of-life orders. We evaluated 69 patients with terminal illness receiving morphine infusions. After implementation, more morphine infusion orders included an as-needed bolus dose with an objective indication and appropriate instructions on when and how to titrate the infusion compared with before the intervention (94.6% vs 18.8%, P < .0001). Morphine infusion orders were also significantly more likely to include a maximum dose (P = .041) and an initial bolus dose (P < .0001). In addition, prescribers were more likely to order additional medications to manage nausea/vomiting, constipation, anxiety, or pain using a nonopioid (P < .05 for all). In this study, implementation of a standardized opioid titration protocol and symptom management order set improved the quality of morphine infusion orders for pain management at the end of life and increased the use of medications to manage nonpain symptoms in dying patients.

We previously reported that 2,5-dimethylcelecoxib (DM-celecoxib), a celecoxib derivative that cannot inhibit cyclooxygenase-2, prevented cardiac remodeling induced by sarcomeric gene mutation, left ventricular pressure overload, or β-adrenergic receptor stimulation. This effect appeared to be mediated by inhibition of the canonical Wnt/β-catenin signaling pathway, which has been suggested to play a key role in the development of chronic kidney disease and chronic heart failure.
We investigated the effect of DM-celecoxib on cardiac remodeling and kidney injury in a mouse model of hypertension induced by angiotensin II infusion with or without high-salt load. DM-celecoxib prevented cardiac remodeling and markedly reduced urinary albumin excretion without altering blood pressure in these mice. Moreover, DM-celecoxib prevented podocyte injury, glomerulosclerosis, and interstitial fibrosis in the kidneys of mice subjected to angiotensin II infusion and high-salt load. DM-celecoxib reduced the phosphorylation level of Akt and activated glycogen synthase kinase-3, leading to suppression of Wnt/β-catenin signaling in the heart and kidney. DM-celecoxib also reduced the expression of Snail, a key transcription factor for the epithelial-mesenchymal transition whose gene is a target of Wnt/β-catenin signaling. These results suggest that DM-celecoxib could be beneficial for patients with hypertensive heart and kidney diseases.

Data about the safety and efficacy of direct-acting antivirals (DAAs) in the treatment of hepatitis C virus (HCV) patients with concomitant rheumatoid arthritis (RA) are scarce. We assessed the impact and safety of DAA treatment of hepatitis C on rheumatoid arthritis disease activity. We prospectively enrolled 65 patients with RA and HCV. A clinico-laboratory evaluation was done at baseline, including liver assessment and the RA disease activity score-28 (DAS28). At 12 weeks post-DAA treatment, sustained virologic response (SVR12) and DAS28 were reevaluated. SVR12 was achieved in 59 (90.8%) patients. RA control was achieved in 47 (79.9%) patients. The post-SVR12 DAS28 score was significantly lower than at baseline (3.32 ± 0.93 vs. 4.37 ± 0.90; P < 0.001).
There was a significant decline in the mean values of serum anti-cyclic citrullinated peptide antibody, rheumatoid factor, erythrocyte sedimentation rate, and C-reactive protein after achieving SVR12 (30.47 ± 12.37 vs. 57.61 ± 15.91 U/ml; 29.78 ± 19.58 vs. 55.14 ± 16.89 IU/ml; 17.13 ± 10.84 vs. 29.68 ± 14.32 mm/h; and 5.76 ± 1.57 vs. 11.44 ± 4.13 mg/l, respectively; P < 0.05). RA activity declined and antirheumatic drugs were stepped down [12 (20.3%) and 35 (59.3%) patients showed good and moderate RA responses, respectively]. The baseline viral load, absence of cirrhosis, and SVR12 were the only predictors of disease control (P < 0.05). There were no drug-related adverse events or drug-related discontinuations. Unlike interferon, HCV elimination by DAAs significantly improves RA activity and treatment outcomes, with high safety and efficacy.

Under-recognition of alpha-1 antitrypsin deficiency (AATD) is well documented in AATD-related lung disease but is rarely reported in patients with liver cirrhosis requiring liver transplantation. This report examines the frequency of newly diagnosed AATD based on pathologic examination of explanted livers following liver transplantation, trends in diagnosis over time, and the prognostic correlates of under-recognition on outcomes following liver transplantation. This study retrospectively reviewed 1473 pathology reports from adult patients (>18 years) undergoing liver transplantation at Cleveland Clinic between 2004 and 2017. Pathology reports of explanted livers exhibiting periodic acid-Schiff-positive, diastase-resistant inclusion bodies (PAS+D) suggestive of AATD were included, and medical records were reviewed for demographics, AATD genotype, alternative etiologies of cirrhosis, presence of emphysema, and survival outcomes.
Kaplan-Meier estimates of survival outcomes were compared between patients diagnosed with AATD pre-liver transplantation and those diagnosed post-liver transplantation. The observed trend toward higher survival in patients diagnosed with AATD before liver transplantation suggests an opportunity to improve outcomes through earlier recognition of AATD.

In patients with cirrhosis, there is a clinical concern that the development of protein-calorie malnutrition will affect the immune system and predispose these patients to worse infectious outcomes. In this study, we evaluated the effects of malnutrition on the infectious outcomes of patients admitted with cirrhosis. This study used the 2011-2017 National Inpatient Sample to identify patients with cirrhosis. These patients were stratified by malnutrition (protein-calorie malnutrition, cachexia, and sarcopenia) and matched on age, gender, and race using 1:1 nearest-neighbor matching. The endpoints included mortality and infectious outcomes. After matching, there were 96,842 patients in the malnutrition cohort and an equal number of controls. In univariate analysis, the malnutrition cohort had higher hospital mortality [10.40 vs. 5.04%, P < 0.01, odds ratio (OR) 2.18, 95% confidence interval (CI) 2.11-2.26]. In multivariate models, malnutrition was associated with increased mortality [P < 0.01, adjusted odds ratio (aOR) 1.