Hepatology: official journal of the American Association for the Study of Liver Diseases
-
Hepatitis B immunization has been documented to prevent fulminant hepatic failure (FHF) and hepatocellular carcinoma (HCC) in historical comparison studies in Taiwan. This study aimed to assess the long-term risks and predictors of various liver diseases associated with incomplete immunization in 3.8 million vaccinees. Profiles of the National Hepatitis B Immunization Registry, National Cancer Registry, and National Death Certification Registry were linked to ascertain newly diagnosed cases of HCC and deaths from FHF and chronic liver diseases (CLDs) from infancy to early adulthood among 3,836,988 newborn vaccinees. Cox proportional hazards models were used to estimate hazard ratios (HRs) for various risk predictors. There were 49 newly developed cases of HCC, 73 deaths from FHF, and 74 deaths from CLDs during 41,854,715 person-years of follow-up. There were striking differences between unvaccinated newborns and those vaccinated after the launch of the national immunization program in HCC incidence (0.293 vs. 0.117 per 100,000 person-years), FHF mortality (0.733 vs. 0.174 per 100,000 person-years), and CLD mortality (2.206 vs. 0.177 per 100,000 person-years). Among vaccinees, incomplete immunization was the most important risk predictor of HCC, FHF, and CLDs, with HRs (95% confidence interval; P value) of 2.52 (1.25-5.05; P = 0.0094), 4.97 (3.05-8.11; P < 0.0001), and 6.27 (3.62-10.84; P < 0.0001), respectively, after adjustment for maternal hepatitis B serostatus. ⋯ Hepatitis B immunization significantly reduces the long-term risk of HCC, FHF, and CLDs from infancy to early adulthood. Incomplete immunization with hepatitis B immunoglobulin or vaccines was the most important risk predictor of these liver diseases among vaccinees.
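As a rough illustration of the modeling approach named above (and not the authors' code), the following sketch fits a Cox proportional hazards model to simulated person-level data and reads off adjusted hazard ratios with 95% confidence intervals; the variable names, effect sizes, and the use of the lifelines library are assumptions made purely for the example.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulated cohort (illustrative only, not registry data): follow-up time in
# years, an event indicator (e.g., newly diagnosed HCC), and two binary risk
# predictors mirroring those discussed in the abstract.
rng = np.random.default_rng(42)
n = 5_000
incomplete_immunization = rng.binomial(1, 0.10, n)
maternal_hbsag_positive = rng.binomial(1, 0.15, n)

# Event times from an exponential model whose hazard rises with each predictor;
# follow-up is administratively censored between 15 and 25 years.
hazard = 0.001 * np.exp(0.9 * incomplete_immunization + 0.5 * maternal_hbsag_positive)
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.uniform(15, 25, n)
followup_years = np.minimum(event_time, censor_time)
event_observed = (event_time <= censor_time).astype(int)

df = pd.DataFrame({
    "followup_years": followup_years,
    "event": event_observed,
    "incomplete_immunization": incomplete_immunization,
    "maternal_hbsag_positive": maternal_hbsag_positive,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="event")

# exp(coef) is the adjusted hazard ratio for each predictor.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%", "p"]])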
-
Comparative Study
Cost analysis of sofosbuvir/ribavirin versus sofosbuvir/simeprevir for genotype 1 hepatitis C virus in interferon-ineligible/intolerant individuals.
Treatment guidance for chronic hepatitis C (CHC) released by the American Association for the Study of Liver Diseases (AASLD) and the Infectious Diseases Society of America (IDSA) offers two options for interferon (IFN)-ineligible/intolerant individuals with genotype 1 infection: sofosbuvir/ribavirin (SOF/RBV) for 24 weeks or sofosbuvir/simeprevir (SOF/SMV) for 12 weeks. A 24-week course of SOF/RBV costs approximately US$169,000, with sustained virologic response (SVR) rates ranging from 52% to 84%; 12 weeks of SOF/SMV costs approximately $150,000, with SVR rates between 89% and 100%. Because SOF/SMV is currently used off-label, debate exists among physicians and payers about whether it should be prescribed and covered. This article presents a cost-effectiveness analysis of these two treatment regimens, accounting for the costs of drugs, treatment-related medical care, retreatment for individuals who do not achieve SVR, and the natural history of continued HCV infection after failed retreatment. The analysis uses a Markov model with a lifetime horizon and a societal perspective. In the base-case scenario, SOF/SMV dominated SOF/RBV in a modeled cohort of treatment-naïve and -experienced 50-year-old subjects, excluding those who had failed earlier therapy with telaprevir or boceprevir. SOF/SMV yielded lower costs and more quality-adjusted life years (QALYs) for the average subject than SOF/RBV ($165,336 and 14.69 QALYs vs. $243,586 and 14.45 QALYs, respectively). In the base-case cost analysis, the SOF/SMV strategy saved $91,590 per SVR achieved compared with SOF/RBV, and SOF/SMV remained dominant and cost-saving under all one-way sensitivity scenarios. ⋯ These results suggest that a 12-week course of SOF/SMV is a more cost-effective treatment for genotype 1 CHC than 24 weeks of SOF/RBV among IFN-ineligible/intolerant individuals, supporting the AASLD/IDSA guidance and carrying implications for clinical and regulatory decision making as well as pharmaceutical pricing.
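The full Markov cost-effectiveness model is not reproduced here, but the dominance claim in the base case can be checked with simple arithmetic on the per-subject figures quoted above. The short sketch below (an illustration, not the published model) compares costs and QALYs between the two strategies and falls back to an incremental cost-effectiveness ratio when neither strategy dominates.

# Per-subject base-case results quoted in the abstract.
sof_smv = {"cost": 165_336, "qalys": 14.69}   # 12 weeks of SOF/SMV
sof_rbv = {"cost": 243_586, "qalys": 14.45}   # 24 weeks of SOF/RBV

delta_cost = sof_smv["cost"] - sof_rbv["cost"]    # negative: SOF/SMV is cheaper
delta_qaly = sof_smv["qalys"] - sof_rbv["qalys"]  # positive: SOF/SMV adds QALYs

if delta_cost <= 0 and delta_qaly >= 0:
    # Lower cost and at least as many QALYs: the comparator is dominated.
    print(f"SOF/SMV dominates SOF/RBV: ${-delta_cost:,.0f} saved "
          f"and {delta_qaly:.2f} QALYs gained per subject")
else:
    # Otherwise report the incremental cost-effectiveness ratio.
    print(f"ICER = ${delta_cost / delta_qaly:,.0f} per QALY gained")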
-
Randomized Controlled Trial Comparative Study
Withholding parenteral nutrition during critical illness increases plasma bilirubin but lowers the incidence of biliary sludge.
Cholestatic liver dysfunction (CLD) and biliary sludge often occur during critical illness and are allegedly aggravated by parenteral nutrition (PN). Delaying initiation of PN beyond day 7 in the intensive care unit (ICU) (late PN) accelerated recovery compared with early initiation of PN (early PN). However, the impact of nutritional strategy on biliary sludge and CLD has not been fully characterized. This was a preplanned subanalysis of a large randomized controlled trial of early PN versus late PN (n = 4,640). In all patients, plasma bilirubin was quantified daily and liver enzymes (alanine aminotransferase [ALT], aspartate aminotransferase [AST], gamma-glutamyl transpeptidase [GGT], and alkaline phosphatase [ALP]) twice weekly (n = 3,216). In a random, predefined subset of patients, plasma bile acids (BAs) were also quantified at baseline and on days 3, 5, and the last ICU day (n = 280). Biliary sludge was ultrasonographically evaluated on ICU day 5 (n = 776). From day 1 after randomization until the end of the 7-day intervention window, bilirubin was higher in the late PN than in the early PN group (P < 0.001). In the late PN group, bilirubin fell as soon as PN was started on day 8, and the two groups became comparable. Maximum levels of GGT, ALP, and ALT were lower in the late PN group (P < 0.01). Glycine/taurine-conjugated primary BAs increased over time in the ICU (P < 0.01), similarly in the two groups. Fewer patients in the late PN than in the early PN group developed biliary sludge on day 5 (37% vs. 45%; P = 0.04). ⋯ Tolerating a substantial caloric deficit by withholding PN until day 8 of critical illness increased plasma bilirubin but reduced the occurrence of biliary sludge and lowered GGT, ALP, and ALT. These results suggest that hyperbilirubinemia during critical illness does not necessarily reflect cholestasis and instead may be an adaptive response that is suppressed by early PN.
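For readers who want to see how a between-group comparison like the biliary-sludge finding (37% vs. 45%, P = 0.04) is typically computed, the sketch below runs a two-proportion z-test. The equal split of the 776 ultrasound patients across arms and the rounded event counts are assumptions for illustration only, so the resulting P value will not exactly reproduce the published one.

import numpy as np
from statsmodels.stats.proportion import proportions_ztest

# Assumed arm sizes: the 776 patients with a day-5 ultrasound split evenly
# between late PN and early PN (illustrative only; actual arm sizes differ).
n_late, n_early = 388, 388
sludge_late = round(0.37 * n_late)    # ~37% with biliary sludge under late PN
sludge_early = round(0.45 * n_early)  # ~45% with biliary sludge under early PN

stat, pval = proportions_ztest(
    count=np.array([sludge_late, sludge_early]),
    nobs=np.array([n_late, n_early]),
)
print(f"z = {stat:.2f}, two-sided P = {pval:.3f}")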