
American Journal of Epidemiology

American Journal of Epidemiology, February 1, 2012, 175(3). Racial and Geographic Factors in the Incidence of Legg-Calvé-Perthes’ Disease: A Systematic Review.


American Journal of Epidemiology

An Image/Link below is provided (as is) to download presentation Download Policy: Content on the Website is provided to you AS IS for your information and personal use and may not be sold / licensed / shared on other websites without getting consent from its author. Content is provided to you AS IS for your information and personal use only. Download presentation by click this link. While downloading, if for some reason you are not able to download a presentation, the publisher may have deleted the file from their server. During download, if you can't get a presentation, the file might be deleted by the publisher.

E N D

Presentation Transcript


  1. American Journal of Epidemiology, February 1, 2012, 175(3)

  2. Racial and Geographic Factors in the Incidence of Legg-Calvé-Perthes’ Disease: A Systematic Review
  Legg-Calvé-Perthes’ disease (Perthes’ disease) is a childhood osteonecrosis of the hip for which the disease determinants are poorly understood. In this review, the authors identify studies of Perthes’ disease incidence published up to December 2010 and make denominator populations comparable in order to allow meaningful between-study evaluation. Incidence rates and confidence intervals were determined, and, where appropriate, denominator populations were obtained from national statistical offices. Poisson regression was used to determine the influence of race and geography. The review included 21 studies that described 27 populations in 16 countries, with 124 million person-years of observation. The annual incidence among children under age 15 years ranged from 0.2 per 100,000 to 19.1 per 100,000. Race was a key determinant, with East Asians being least affected and whites most affected, though data were insufficient to consider incidence among blacks (for South Asians vs. East Asians, incidence rate ratio = 2.9, 95% confidence interval (CI): 2.4, 3.5; for whites vs. East Asians, incidence rate ratio = 8.8, 95% CI: 8.2, 9.6). Latitude was a strong predictor of disease, even after adjustment for race. Each 10° increase in latitude was associated with a 1.44-fold (95% CI: 1.30, 1.58) increase in incidence. While much of the international variation appears to be a function of race, latitude demonstrates a strong association. This observation may offer new epidemiologic insights into the determinants of Perthes’ disease.
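
A minimal sketch of the kind of Poisson model described above, with log person-years as the offset so that exponentiated coefficients are incidence rate ratios. The study-level data frame, column names, and numbers below are invented for illustration, and statsmodels is assumed as the fitting library; this is not the review's actual code.

```python
# Illustrative only: hypothetical study-level data with assumed column names.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

studies = pd.DataFrame({
    "cases":        [15, 60, 90, 300, 410, 25],                  # hypothetical case counts
    "person_years": [8.0e6, 5.5e6, 3.0e6, 4.0e6, 2.2e6, 1.0e6],  # hypothetical denominators
    "race":         ["East Asian", "East Asian", "South Asian",
                     "White", "White", "South Asian"],
    "latitude":     [35, 23, 20, 52, 60, 28],
})
studies["latitude10"] = studies["latitude"] / 10.0  # coefficient is then per 10 degrees

model = smf.glm(
    "cases ~ C(race, Treatment(reference='East Asian')) + latitude10",
    data=studies,
    family=sm.families.Poisson(),
    offset=np.log(studies["person_years"]),  # log person-years offset
).fit()

# exp(beta): race contrasts vs. East Asians and the multiplicative change in
# incidence per 10 degrees of latitude, with 95% confidence intervals
print(np.exp(model.params))
print(np.exp(model.conf_int()))
```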

  3. Inequalities in Body Mass Index and Smoking Behavior in 70 Countries: Evidence for a Social Transition in Chronic Disease Risk
  Despite the growing burden of chronic disease globally, few studies have examined the socioeconomic patterning of risk across countries. The authors examined differences in the social patterning of body mass index (BMI) and current smoking by urbanicity among 70 countries from the 2002–2003 World Health Surveys. Age-adjusted, gender-stratified ordinary least squares and logistic regression analyses were conducted in each country to assess the relation between education and BMI or smoking. Meta-analytic techniques were used to assess heterogeneity between countries in the education-risk factor relations. Meta-regression was used to determine whether the heterogeneity could be explained by country-level urbanicity. In the least urban countries, persons with higher education had a higher BMI, while the opposite pattern was seen in the most urban countries, with this pattern being especially pronounced among women. In contrast, among men in all countries and among women in the least urban countries, smoking was consistently concentrated among persons of lower education. For women in the most urban countries, higher education was associated with higher odds of smoking, although there was substantial variability in this relation. These results highlight a global trend toward an increasing burden of chronic disease risk among persons of lower socioeconomic position as countries become more urban.
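
A rough sketch of this two-step approach, assuming a person-level survey data frame with columns country, bmi, education, and age, plus a country-to-urbanicity lookup (all names invented here): step 1 estimates the age-adjusted education-BMI slope within each country, and step 2 meta-regresses those slopes on urbanicity with inverse-variance weights.

```python
# Illustrative outline only; column names and the urbanicity lookup are assumed.
import pandas as pd
import statsmodels.formula.api as smf

def education_bmi_slope(df_country):
    """Age-adjusted OLS of BMI on education within one country."""
    fit = smf.ols("bmi ~ education + age", data=df_country).fit()
    return pd.Series({"slope": fit.params["education"], "se": fit.bse["education"]})

def meta_regression(survey, urbanicity):
    """Meta-regress country-specific education-BMI slopes on urbanicity."""
    slopes = survey.groupby("country").apply(education_bmi_slope).reset_index()
    slopes["urban"] = slopes["country"].map(urbanicity)  # e.g. percent urban per country
    weights = 1.0 / slopes["se"] ** 2                    # inverse-variance weights
    # The `urban` coefficient tests whether the education-BMI slope shifts
    # (e.g. from positive to negative) as countries become more urban.
    return smf.wls("slope ~ urban", data=slopes, weights=weights).fit()
```

Gender stratification and the logistic version for current smoking would follow the same pattern, with the within-country model swapped accordingly.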

  4. Testing Gene-Environment Interaction in Large-Scale Case-Control Association Studies: Possible Choices and Comparisons
  Several methods for screening gene-environment interaction have recently been proposed that address the issue of using gene-environment independence in a data-adaptive way. In this report, the authors present a comparative simulation study of power and type I error properties of 3 classes of procedures: 1) the standard 1-step case-control method; 2) the case-only method that requires an assumption of gene-environment independence for the underlying population; and 3) a variety of hybrid methods, including empirical-Bayes, 2-step, and model averaging, that aim at gaining power by exploiting the assumption of gene-environment independence and yet can protect against false positives when the independence assumption is violated. These studies suggest that, although the case-only method generally has maximum power, it has the potential to create substantial false positives in large-scale studies even when a small fraction of markers are associated with the exposure under study in the underlying population. All the hybrid methods perform well in protecting against such false positives and yet can retain substantial power advantages over standard case-control tests. The authors conclude that, for future genome-wide scans for gene-environment interactions, major power gain is possible by using alternatives to standard case-control analysis. Whether a case-only type scan or one of the hybrid methods should be used depends on the strength and direction of gene-environment interaction and association, the level of tolerance for false positives, and the nature of replication strategies.
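
As a point of reference for the procedures being compared, the sketch below writes out the two baseline tests for a single marker: the standard 1-step case-control interaction term, and the case-only regression of exposure on genotype among cases. Variable names are assumed (binary disease and exposure, numerically coded genotype), and this is not the authors' simulation code; the hybrid empirical-Bayes and 2-step methods build on these same ingredients.

```python
# Illustrative only: assumed columns `disease` (0/1), `exposure` (0/1), `genotype` (0/1/2).
import statsmodels.formula.api as smf

def case_control_interaction(df):
    """Standard 1-step test: logistic regression of disease on G, E, and G x E;
    the interaction coefficient is the quantity of interest."""
    fit = smf.logit("disease ~ genotype * exposure", data=df).fit(disp=0)
    return fit.params["genotype:exposure"], fit.pvalues["genotype:exposure"]

def case_only_interaction(df):
    """Case-only test: among cases, regress exposure on genotype. Under
    gene-environment independence in the source population this estimates the
    multiplicative interaction with greater power, but it inflates false
    positives when G and E are associated in the population."""
    cases = df[df["disease"] == 1]
    fit = smf.logit("exposure ~ genotype", data=cases).fit(disp=0)
    return fit.params["genotype"], fit.pvalues["genotype"]
```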

  5. Gene-Environment Interactions in Genome-Wide Association Studies: A Comparative Study of Tests Applied to Empirical Studies of Type 2 Diabetes
  The question of which statistical approach is the most effective for investigating gene-environment (G-E) interactions in the context of genome-wide association studies (GWAS) remains unresolved. By using 2 case-control GWAS (the Nurses’ Health Study, 1976–2006, and the Health Professionals Follow-up Study, 1986–2006) of type 2 diabetes, the authors compared 5 tests for interactions: standard logistic regression-based case-control; case-only; semiparametric maximum-likelihood estimation; an empirical-Bayes shrinkage estimator; and 2-stage tests. The authors also compared 2 joint tests of genetic main effects and G-E interaction. Elevated body mass index was the exposure of interest and was modeled as a binary trait to avoid an inflated type I error rate that the authors observed when the main effect of continuous body mass index was misspecified. Although both the case-only and the semiparametric maximum-likelihood estimation approaches assume that the tested markers are independent of exposure in the general population, the authors did not observe any evidence of inflated type I error for these tests in their studies with 2,199 cases and 3,044 controls. Both joint tests detected markers with known marginal effects. Loci with the most significant G-E interactions using the standard, empirical-Bayes, and 2-stage tests were strongly correlated with the exposure among controls. Study findings suggest that methods exploiting G-E independence can be efficient and valid options for investigating G-E interactions in GWAS.
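
One of the joint tests mentioned above can be sketched as a 2-degree-of-freedom likelihood-ratio test of the SNP main effect and the SNP x BMI interaction, comparing nested logistic models. The column names (diabetes, snp, high_bmi, age, sex) and the adjustment set are assumed for illustration; this is not the analysis code used in the two cohorts.

```python
# Illustrative only: assumed columns `diabetes` (0/1), `snp` (0/1/2 allele count),
# `high_bmi` (0/1), plus example covariates.
import scipy.stats as st
import statsmodels.formula.api as smf

def joint_2df_test(df):
    """Likelihood-ratio test of the SNP main effect and SNP x BMI interaction."""
    null = smf.logit("diabetes ~ high_bmi + age + sex", data=df).fit(disp=0)
    full = smf.logit("diabetes ~ snp * high_bmi + age + sex", data=df).fit(disp=0)
    lr = 2 * (full.llf - null.llf)   # likelihood-ratio statistic
    return lr, st.chi2.sf(lr, df=2)  # 2 df: SNP main effect + SNP x BMI term
```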

  6. Invited Commentary: GE-Whiz! Ratcheting Gene-Environment Studies up to the Whole Genome and the Whole Exposome
  One goal in the post-genome-wide association study era is characterizing gene-environment interactions, including scanning for interactions with all available polymorphisms, not just those showing significant main effects. In recent years, several approaches to such “gene-environment-wide interaction studies” have been proposed. Two contributions in this issue of the American Journal of Epidemiology provide systematic comparisons of the performance of these various approaches, one based on simulation and one based on application to 2 real genome-wide association study scans for type 2 diabetes. The authors discuss some of the broader issues raised by these contributions, including the plausibility of the gene-environment independence assumption that some of these approaches rely upon, the need for replication, and various generalizations of these approaches.

  7. Dealing With Missing Outcome Data in Randomized Trials and Observational Studies
  Although missing outcome data are an important problem in randomized trials and observational studies, methods to address this issue can be difficult to apply. Using simulated data, the authors compared 3 methods to handle missing outcome data: 1) complete case analysis; 2) single imputation; and 3) multiple imputation (all 3 with and without covariate adjustment). Simulated scenarios focused on continuous or dichotomous missing outcome data from randomized trials or observational studies. When outcomes were missing at random, single and multiple imputations yielded unbiased estimates after covariate adjustment. Estimates obtained by complete case analysis with covariate adjustment were unbiased as well, with coverage close to 95%. When outcome data were missing not at random, all methods gave biased estimates, but handling missing outcome data by means of 1 of the 3 methods reduced bias compared with a complete case analysis without covariate adjustment. Complete case analysis with covariate adjustment and multiple imputation yield similar estimates in the event of missing outcome data, as long as the same predictors of missingness are included. Hence, complete case analysis with covariate adjustment can and should be used as the analysis of choice more often. Multiple imputation, in addition, can accommodate the missing-not-at-random scenario more flexibly, making it especially suited for sensitivity analyses.
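
Two of the compared strategies can be sketched directly, assuming a data frame with a partially missing continuous outcome and fully observed predictors (the column names outcome, treatment, x1, and x2 are invented): complete case analysis with covariate adjustment versus multiple imputation by chained equations, here via statsmodels' MICE.

```python
# Illustrative only: assumed columns `outcome` (partly missing), `treatment`, `x1`, `x2`.
import statsmodels.api as sm
from statsmodels.imputation.mice import MICE, MICEData

def complete_case(df):
    """Complete case analysis with covariate adjustment."""
    cc = df.dropna(subset=["outcome"])
    return sm.OLS.from_formula("outcome ~ treatment + x1 + x2", data=cc).fit()

def multiply_imputed(df, n_imputations=20):
    """Multiple imputation by chained equations; estimates are combined
    across imputed data sets by Rubin's rules."""
    imp = MICEData(df)
    mice = MICE("outcome ~ treatment + x1 + x2", sm.OLS, imp)
    return mice.fit(n_burnin=10, n_imputations=n_imputations)
```

Both routes adjust for the same covariates, which is the condition under which the abstract reports that the two approaches give similar estimates.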

  8. Is Illicit Drug Use Harmful to Cognitive Functioning in the Midadult Years? A Cohort-based Investigation
  From March to July of 2011, the authors investigated the prospective association between illicit drug use and cognitive functioning during the midadult years. A total of 8,992 participants who were surveyed at 42 years of age in the National Child Development Study (1999–2000) were included. The authors analyzed data on 3 cognitive functioning measures (memory index, executive functioning index, and overall cognitive index) when the participants were 50 years of age (2008–2009). Illicit drug use at 42 years of age was based on self-reported current or past use of any of 12 illicit drugs. Multivariable regression analyses were performed to estimate the association between different illicit drug use measures at 42 years of age and cognitive functioning at 50 years of age. A positive association was observed between ever (past or current) illicit drug use and cognitive functioning (β = 0.62, P < 0.001), although the effect size was small. Even though there was no clear evidence against the null hypothesis, drug dependence (β = −0.27, P = 0.58) and long-term illicit drug use (β = −0.04, P = 0.87) tended to be negatively associated with cognitive functioning. At the population level, it does not appear that current illicit drug use is associated with impaired cognitive functioning in early middle age. However, the authors cannot exclude the possibility that some individuals and groups, such as those with heavier or more prolonged use, could be harmed.
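
The adjusted associations quoted above correspond to multivariable regressions of the age-50 cognitive index on each drug-use measure. The snippet is a minimal sketch with invented variable names and an illustrative adjustment set, not the NCDS analysis code.

```python
# Illustrative only: invented column names and example covariates.
import statsmodels.formula.api as smf

ADJUSTMENT = "sex + education + childhood_ability + social_class"  # example adjustment set

def adjusted_association(df, exposure):
    """Adjusted beta and P value for one drug-use measure (e.g. ever_use,
    dependence, long_term_use) against the overall cognitive index at age 50."""
    fit = smf.ols(f"cognitive_index_50 ~ {exposure} + {ADJUSTMENT}", data=df).fit()
    return fit.params[exposure], fit.pvalues[exposure]
```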

  9. Self-rated Health Compared With Objectively Measured Health Status as a Tool for Mortality Risk Screening in Older Adults: 10-Year Follow-up of the Bambuí Cohort Study of Aging
  Interest in self-rated health (SRH) as a tool for use in disease and mortality risk screening is increasing. The authors assessed the discriminatory ability of baseline SRH to predict 10-year mortality rates compared with objectively measured health status. Principal component analysis was used to create a health score that included systolic blood pressure, presence of diabetes mellitus, body mass index, electrocardiographic parameters, B-type natriuretic peptide, and other biochemical and hematologic measures. From 1997 to 2007, a total of 474 of the 1,388 baseline participants died and 81 were lost to follow-up, yielding 11,833 person-years of observation. The adjusted hazard ratio for death was 1.74 (95% confidence interval (CI): 1.32, 2.29) for persons reporting poor health versus those reporting good health. When combined with age and sex, SRH had a C statistic to predict death equal to 0.69 (95% CI: 0.67, 0.71), which was comparable to that of the inclusive health score (C = 0.69, 95% CI: 0.67, 0.72). The addition of other parameters, such as lifestyle, physical functioning, mental symptoms, and physical symptoms, had little effect on these 2 predictive models (C = 0.71 (95% CI: 0.69, 0.73) and C = 0.71 (95% CI: 0.69, 0.74), respectively). The abilities of the SRH and the health score models to predict death decreased in parallel fashion over time. These results suggest that older adults who report poor health warrant particular attention as persons who have accumulated biologic markers of disease.
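
The SRH screening model described here is essentially a Cox regression of time to death on self-rated health plus age and sex, with Harrell's C as the discrimination measure. The sketch assumes the lifelines library and invented, numerically coded column names; the objective health score model would substitute the first principal component of the measured health variables for poor_srh.

```python
# Illustrative only: assumed numeric columns `followup_years`, `died` (0/1),
# `poor_srh` (0/1), `age`, `sex` (0/1).
from lifelines import CoxPHFitter

def srh_screening_model(df):
    """Cox model of mortality on self-rated health, age, and sex; returns the
    hazard ratio for poor self-rated health and Harrell's C statistic."""
    cph = CoxPHFitter()
    cph.fit(df[["followup_years", "died", "poor_srh", "age", "sex"]],
            duration_col="followup_years", event_col="died")
    return cph.hazard_ratios_["poor_srh"], cph.concordance_index_
```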

  10. Association Between Plasma 25-Hydroxyvitamin D and Colorectal Adenoma According to Dietary Calcium Intake and Vitamin D Receptor Polymorphism
  The anticarcinogenic potential of vitamin D might be mediated by not only calcium metabolism but also other mechanisms initiated by vitamin D receptor (VDR). The authors measured plasma 25-hydroxyvitamin D in healthy volunteer examinees who underwent total colonoscopy in Tokyo, Japan, 2004–2005, and evaluated its influence on colorectal adenoma, both alone and in interaction with VDR polymorphisms, which correspond to the FokI and TaqI restriction sites. The main analysis of plasma 25-hydroxyvitamin D included 737 cases and 703 controls. Compared with the lowest quintile of plasma 25-hydroxyvitamin D, only the highest was related to a significantly decreased odds ratio of colorectal adenoma (odds ratio = 0.64, 95% confidence interval: 0.45, 0.92). In contrast, all but the lowest quintile of dietary calcium intake presented similarly reduced odds ratios (odds ratio for the highest = 0.67, 95% confidence interval: 0.47, 0.95). Of note, the association between plasma 25-hydroxyvitamin D levels and colorectal adenoma was modified by the TaqI polymorphism of the VDR gene (P for interaction = 0.03) but not by dietary calcium intake (P for interaction = 0.93). These observations highlight the importance of vitamin D in colorectal tumorigenesis. Vitamin D might protect against colorectal neoplasia, mainly through mechanisms other than the indirect mechanism via calcium metabolism.
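
The quintile odds ratios and the effect-modification test can be sketched as a logistic regression with quintile indicators plus a likelihood-ratio test for the 25-hydroxyvitamin D by TaqI interaction. Column names (adenoma, plasma_25ohd, taqi, age, sex) are assumed and the adjustment set is illustrative, not the authors' full model.

```python
# Illustrative only: assumed columns `adenoma` (0/1), `plasma_25ohd`,
# `taqi` (genotype category), `age`, `sex`.
import numpy as np
import pandas as pd
import scipy.stats as st
import statsmodels.formula.api as smf

def quintile_odds_ratios(df):
    """Odds ratios for quintiles of plasma 25-hydroxyvitamin D vs. the lowest."""
    df = df.assign(vitd_q=pd.qcut(df["plasma_25ohd"], 5, labels=False))  # 0 = lowest quintile
    fit = smf.logit("adenoma ~ C(vitd_q) + age + sex", data=df).fit(disp=0)
    return np.exp(fit.params), np.exp(fit.conf_int())

def taqi_interaction_p(df):
    """Likelihood-ratio P value for modification of the vitamin D association
    by TaqI genotype."""
    base = smf.logit("adenoma ~ plasma_25ohd + C(taqi) + age + sex", data=df).fit(disp=0)
    full = smf.logit("adenoma ~ plasma_25ohd * C(taqi) + age + sex", data=df).fit(disp=0)
    lr = 2 * (full.llf - base.llf)
    return st.chi2.sf(lr, df=full.df_model - base.df_model)
```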
