KDOQI (Kidney Disease Outcomes Quality Initiative)


Clinical Practice Guidelines and Clinical Practice Recommendations
2006 Updates
Hemodialysis Adequacy
Peritoneal Dialysis Adequacy
Vascular Access



Nephrologists in the United States in general are savvy physicians who respond quickly to public information about care of their patients. Even before the Kidney Disease Clinical Studies Initiative Hemodialysis (HEMO) Study was concluded, average dialysis doses were increasing in the United States, perhaps stimulated by the study itself, which was widely publicized to promote enrollment among the 72 participating clinics.1,2 The original National Kidney Foundation (NKF)-Dialysis Outcomes Quality Initiative (DOQI) guidelines for hemodialysis (HD) in 1997 probably also fueled the dose increase. At the time the study was completed, the average single-pool fractional urea clearance Kt/V (spKt/V) in the United States was 1.52 per dialysis given 3 times per week.3 This was and continues to be significantly greater than the minimum of 1.2 established originally in 1994 by a consortium of nephrologists.4,5 The original minimum recommended dose was based mostly on opinions generated from observational studies and was reiterated by the Kidney Disease Outcomes Quality Initiative (KDOQI) in 2001.6

The HEMO Study showed that the minimum dose established by the previous KDOQI guidelines is appropriate when dialysis is performed 3 times per week for 2.5 to 4.5 hours.1 Dialysis providers no longer need to focus on providing more dialysis by using bigger dialyzers and higher flow rates, but they cannot sit back and relax because the yearly mortality rate for patients with chronic kidney disease (CKD) stage 5 remains unacceptably high in the United States (>20% per year in 2002, and 17% per year in the HEMO Study). This ongoing high mortality rate has served as an incentive for investigators seeking better alternative solutions for dialysis-dependent patients and has spurred interest in alternative therapies and modes of therapy, such as hemofiltration, daily dialysis, sorbent therapy, better volume control, use of ultrapure water, and other interventions. Mortality differences among countries are now explained partially by differences in patient selection and comorbidity, but a considerable gap remains, especially when statistics in the United States are compared with those in Japan, where annual mortality rates are less than 10%. The Dialysis Outcomes and Practice Patterns Study (DOPPS) analyses show that these differences are not caused by different methods for gathering statistics.7 The HEMO Study showed that the differences are not caused by higher doses in Japan.1 Better survival in the Japanese may be caused by genetic differences that enhance survival of Asian dialysis patients, whether treated in the United States or Japan.8,9 Some consolation can be gained from the most recent data published by the United States Renal Data System (USRDS) and Centers for Medicare & Medicaid Services (CMS) that show a reduction in mortality rates during the past 2 decades.10

The HEMO Study broadened the scope of interest and opened the eyes of the dialysis health care industry to the issue of dialysis adequacy. It did not settle the question of small-solute toxicity, but it served to redirect attention to other possible causes of morbidity, mortality, and poor quality of life (QOL). These include retention of solutes that are poorly removed by diffusion or convection because of their large size or binding to serum proteins, solute sequestration, physiological stress caused by either the dialysis itself or the intermittent schedule of dialyses that causes fluctuations in fluid balance and solute concentrations, or accumulation of such non–uremia-associated toxins as drug metabolites that are known to accumulate in dialyzed patients. In the latter case, reducing or stopping antihypertensive drug therapy may have hidden benefits. The caregiver can be a source of the problem, as evidenced by past experience with aluminum toxicity.

The enormous risk for cardiovascular disease (CVD) in patients with CKD stage 5 compared with patients with normal renal function suggests a toxic phenomenon. Perhaps alternate pathways for toxin removal are damaged in patients with CKD, causing accumulation of toxins not normally eliminated by the kidneys. Other possible explanations for the high risk for CVD and cerebrovascular disease include a yet-to-be-discovered renal effect that may protect the vascular endothelium. The role of kidney disease in patients with heart failure and the “cardiorenal syndrome” may be related to cardiovascular risks in patients with renal disease.11 It is worth noting that the loss of hormones normally produced by the kidney is a well-established cause of disability and mortality that is not responsive to dialysis. The strong association of survival with residual native kidney function in both HD and peritoneal dialysis (PD) patients is consistent with such an effect.

The potential for inflammation caused by contaminated dialysate or soft-tissue reactions to calcium deposits may contribute to the observed strong relationship among inflammatory markers, CVD, and renal disease. It is possible that the high morbidity and mortality rates are not related to dialysis at all. If so, more attention should be given to comorbidity and QOL and less attention to the adequacy of dialysis. At this juncture in the search for answers and solutions, both imagination and science are needed.

New issues addressed in these updated guidelines include the timeline for initiation of dialysis therapy, which also is addressed by the PD and Vascular Access Work Groups. Emphasis was placed on patients destined for HD therapy, but efforts also were made to coordinate these guidelines with the initiation guidelines generated by the other work groups that recommended stepped increases in the prescribed dialysis dose, early referral, and early access placement.

Predialysis blood urea nitrogen (BUN) is easy to measure, but the postdialysis concentration is a moving target. Its decrease during dialysis is sharply reversed when the treatment ceases; thus, timing of the postdialysis blood sample is critical. The Work Group determined that markedly slowing blood flow at the end of dialysis before sampling the blood is the safest and simplest technique for achieving the uniformity needed for reliable and reproducible values of Kt/V.

The delivered Kt/V determined by single-pool urea kinetic modeling continues to be preferred as the most precise and accurate measure of dialysis. Simplified formulas are acceptable within limits, and urea reduction ratio (URR) continues to be viable, but with pitfalls. Conductivity (ionic) clearance also is accepted, but tends to underestimate dialyzer urea clearance.

The Work Group believed that more attention should be given to residual kidney function (RKF) in light of recent evidence linking outcomes more closely to RKF than to dialysis dose. Although we do not recognize a state of “overdialysis,” patient QOL is compromised by dialysis; therefore, giving unnecessary treatment should be avoided, especially now that we recognize a ceiling dose above which morbidity and mortality are not improved. Pitfalls and controversies about methods for adding RKF to dialyzer clearance were reviewed, but were considered too complex for the average dialysis clinic to manage. Implementation was simplified by setting a cutoff urea clearance of 2 mL/min, above which inclusion of residual native kidney urea clearance (Kr) is recommended and below which it can be ignored. Although the cutoff value is somewhat arbitrary, it serves to separate patients into 2 groups: 1 group in which the trouble and expense of measuring RKF can be avoided, and the other group in which more attention should be focused on RKF to potentially improve QOL. In the latter group are patients for whom recovery of renal function may be anticipated. Patients in the group with RKF greater than 2 mL/min (~10% to 30%) should have regular measurements of native kidney clearance to avoid underdialysis as function is lost and to avoid prolonging dialysis if function recovers. Twice-weekly dialysis may be permissible in a few patients within the group with RKF greater than 2 mL/min who have stable function and do not have excessive fluid gains. Because RKF is preserved better in current HD patients compared with the past, a separate guideline was established to encourage preservation of RKF.
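As background to the dose measures discussed above, the widely used Daugirdas second-generation formula estimates spKt/V from routine treatment data, and URR is simpler still. The sketch below is illustrative only (the helper names and example values are hypothetical, not part of the guideline text):

```python
import math

def urr(pre_bun, post_bun):
    """Urea reduction ratio, expressed as a percentage."""
    return 100.0 * (1.0 - post_bun / pre_bun)

def sp_ktv(pre_bun, post_bun, session_hours, uf_liters, post_weight_kg):
    """Daugirdas second-generation estimate of single-pool Kt/V.

    spKt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF/W
    where R = post/pre BUN ratio, t = session length (hours),
    UF = ultrafiltrate volume (L), W = postdialysis weight (kg).
    """
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * session_hours) + (4.0 - 3.5 * r) * uf_liters / post_weight_kg

# Hypothetical treatment: pre-BUN 70, post-BUN 20 mg/dL, 4 h, 3 L removed, 75 kg
print(round(urr(70, 20), 1))             # 71.4 (% reduction)
print(round(sp_ktv(70, 20, 4, 3, 75), 2))  # 1.49
```

The second term of the formula credits the convective clearance from ultrafiltration, which a raw URR ignores; this is one of the "pitfalls" of relying on URR alone.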

More frequent dialysis is becoming more common; thus, methods for measuring the dose are required. Partially controlled studies suggest that QOL improves, hypertension is alleviated, left ventricular hypertrophy (LVH) regresses, and sleep disturbances abate with daily or nocturnal HD. The Work Group reviewed current methods and gave practice recommendations for measuring the dose in these patients. More definitive recommendations may come from the National Institutes of Health (NIH) Frequent HD Network Study that currently is enrolling patients.

The Work Group focused more intently on the target dose and its relationship with the minimum dose which, in light of HEMO Study findings, remains 1.2 Kt/V units per dialysis for patients dialyzed 3 times per week. Data from the HEMO Study also revealed a coefficient of variation within patients of approximately 0.1 Kt/V units; therefore, the previous target of 1.3 was considered too low. To grant 95% confidence that the dose will not decrease to less than 1.2 per dialysis, the target dose was increased to 1.4 per dialysis. This is in keeping with current practice and is consistent with the target spKt/V of approximately 1.4 set by the European Standards Group.12 The Work Group favored high-flux membranes. The HEMO Study did not provide definitive answers, but data suggested that dialysis vintage and flux are related and CVD might be affected favorably by the use of high-flux dialysis.1 The issue of sex also was addressed by the Work Group, which believed that dialysis doses and targets should remain the same in women compared with men. However, in light of suggestive findings from the HEMO Study and observational studies, clinicians should be aware of a possible increased responsiveness to dialysis in females compared with males.13
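The arithmetic behind raising the target can be sketched with a one-sided 95% margin, treating the reported within-patient variation of approximately 0.1 Kt/V units as the standard deviation of a normal distribution (a simplifying assumption for illustration):

```python
from statistics import NormalDist

minimum = 1.2            # minimum spKt/V per dialysis, thrice weekly
within_patient_sd = 0.1  # within-patient variation reported by the HEMO Study

# Target chosen so that only ~5% of treatments are expected
# to fall below the 1.2 minimum.
z_95 = NormalDist().inv_cdf(0.95)          # one-sided 95% quantile, ~1.645
target = minimum + z_95 * within_patient_sd
print(round(target, 2))                    # 1.36, supporting the rounded target of 1.4
```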

Concern was raised by the Work Group about malnourished patients with respect to both the initiation and adequacy of HD. Initiation is confounded by errors in calculation of glomerular filtration rate (GFR) for patients with diminishing muscle mass, and adequacy is confounded by the effect of malnutrition on patients' water volume (V), the denominator of the integrated urea clearance expression (Kt/V). Estimation equations for calculating GFR before starting dialysis therapy are based on serum creatinine level, but are adjusted for sex, size, race, and other factors that tend to alter the relationship between concentration and clearance. Most of these factors either increase or decrease the generation of creatinine, but the patient's state of nutrition—which is well known to affect creatinine generation—is not a variable in these equations. Because muscle wasting lowers creatinine generation, the consequent error in malnourished patients would tend to overestimate GFR and thus endanger the patient through the ill consequences of delayed initiation of dialysis therapy. In addition, if the patient is malnourished, dialysis probably is better started early.

After a patient starts dialysis therapy, loss of weight because of malnutrition will decrease V, increasing the Kt/V, potentially to values higher than the desired target range. Reducing the dialysis dose (Kt/V) in such patients may lead to potential harm from inadequate dialysis. The Work Group addressed this problem in Clinical Practice Recommendation (CPR) 4.6, which calls for an increase in Kt/V when signs of malnutrition are present. The magnitude of the increase is left to the clinician, who might take into consideration the absolute level of Kt/V and cause of the malnutrition. If Kt/V is already much greater than the minimum, an additional increase probably would not benefit the patient. Similarly, if malnutrition is caused by a condition other than uremia, increasing the dose may have no effect. This issue will require revisiting in the future, hopefully with more available hard data.
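A toy numeric example (all values hypothetical) shows why a shrinking V inflates Kt/V even though the delivered clearance is unchanged:

```python
# Same delivered clearance (K*t) before and after weight loss from malnutrition;
# only the urea distribution volume V changes.
k_times_t = 50.0   # delivered urea clearance x time, liters (assumed)
v_before = 38.0    # urea distribution volume, liters (assumed)
v_after = 33.0     # volume after loss of lean mass (assumed)

print(round(k_times_t / v_before, 2))  # 1.32
print(round(k_times_t / v_after, 2))   # 1.52: the dose looks higher, yet nothing improved
```

This is why CPR 4.6 treats a rising Kt/V in a malnourished patient as a warning rather than a reason to cut the prescription.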

The importance of missed dialysis treatments was emphasized repeatedly by the Work Group. Although difficult to quantify in terms of a guideline, patient cooperation and compliance are major determinants of survival.14-16 To ensure compliance, efforts should be made to maintain the patient's confidence in the health care system at all levels. However, patient satisfaction in general and patient encounters with physicians have not shown a strong correlation with survival.17

Other aspects of dialysis adequacy were addressed, including fluid balance, blood pressure control, and membrane biocompatibility. Reuse has moved to the background among issues of concern in dialysis clinics for 2 reasons: (1) many clinics in the United States no longer reuse dialyzers, and (2) risks associated with reuse were examined and found to be very small. Monitoring outcome goals within each dialysis clinic is vitally important for quality assurance and quality improvement, and this issue has been added as a Clinical Practice Guideline (CPG) for HD and PD adequacy. This outcomes-monitoring guideline is not intended to guide individual patient care, but is intended for the dialysis clinic as a whole.

More data are available regarding adequacy in pediatric HD patients, but the numbers thankfully remain small, so definitive evidence is lacking. The greater metabolic rate per unit of surface area in children has been invoked by some to justify a higher dose. Use of V as a denominator (see previous discussion of V) also may endanger smaller patients. In other respects, for younger, smaller patients, we have little evidence to support a different dosing regimen than that delivered to adults.

Since the last issuance of the KDOQI Guidelines, the Standards Group of the European Renal Association in 2002 published adequacy guidelines for HD measurement, dosing, and minimum standards.12 The HD adequacy group chose urea-equilibrated Kt/V (eKt/V), recommending the Daugirdas method69 for converting spKt/V to eKt/V, with a target of 1.2 per dialysis (spKt/V ~ 1.4). The target was higher than that previously recommended by KDOQI (spKt/V = 1.3 per dialysis), but the rationale for increasing the target was not clearly delineated. The group recommended using the mean of creatinine and urea clearance as a measure of RKF and discouraged twice-weekly dialysis.
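The spKt/V-to-eKt/V conversion referred to above is commonly performed with the Daugirdas rate equation; the coefficients below are the usually cited values for arteriovenous and venous access, shown as an illustrative sketch rather than a definitive implementation:

```python
def ektv_from_spktv(sp_ktv, session_hours, venous_access=False):
    """Daugirdas rate equation: estimate equilibrated Kt/V from spKt/V.

    Arteriovenous access: eKt/V = spKt/V - 0.6*(spKt/V/t) + 0.03
    Venous catheter:      eKt/V = spKt/V - 0.47*(spKt/V/t) + 0.02
    where t is session length in hours.
    """
    rate = sp_ktv / session_hours
    if venous_access:
        return sp_ktv - 0.47 * rate + 0.02
    return sp_ktv - 0.6 * rate + 0.03

# A 4-hour treatment at the new spKt/V target of 1.4:
print(round(ektv_from_spktv(1.4, 4.0), 2))  # 1.22, close to the European eKt/V target of 1.2
```

This illustrates why an eKt/V target of 1.2 and an spKt/V target of approximately 1.4 describe essentially the same prescription for a typical 4-hour session.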

In the United States, we have come a long way, from marveling about how HD can snatch patients from the jaws of death and keep them alive indefinitely to coping with 0.1% of the population depending on HD for life support. Nephrologists have learned that, although numbering more than 300,000, these patients represent a small segment of approximately 20 million people in the United States with kidney disease who have survived tremendous risks for CVD and other morbid diseases to develop CKD stage 5. They often arrive in the dialysis clinic with a legacy of diabetes, CVD, and inflammatory diseases that continue to progress. The challenge for today's health care workers and the dialysis industry is to provide an opportunity for these patients to live long and comfortably with freedom to pursue their dreams, even if for only a relatively short length of time in those at high risk. We need to be all things for these patients, but first and foremost, we must deliver the best dialysis therapy we can with available technology. These new KDOQI HD CPGs, CPRs, and Research Recommendations are designed to provide a clearer pathway and help everyone move in that direction.