Having just concluded a long driving trip, I was reminded again just how helpful the technology under the hood of today’s automobiles can be: the warning lights when you’ve still got enough gas left to find a gas station, the low tire pressure light that tells you of a slow leak before you have a flat tire, the binging that lets you know you’ve left your headlights on (again), and my personal favorite, the klaxon-like bell that alerts you that your parking brake is still engaged (since that red “brake” light on the dashboard clearly wasn’t sufficient notice).
Before such warning signals were standard features, I had the decidedly unpleasant experiences of running out of gas in the middle of nowhere and of finding myself driving on (two) flat tires—and, many years ago, I’m pretty sure I was responsible for ruining the brakes on my Dad’s car simply because I didn’t realize that the smell of burning rubber was coming from the car I was driving.
For some time now, the Federal Reserve has held short-term interest rates near zero in an effort to support an economic recovery—and has, in fact, announced its intention to maintain that policy until such time as the recovery seems to have taken hold. However, as many retirees and workers have discovered, those historically low interest rates are crimping their retirement savings—and a new study by the Employee Benefit Research Institute (EBRI)¹ quantifies the impact of a sustained low-interest rate environment on America’s retirement readiness.
Using EBRI’s unique Retirement Security Projection Model® (RSPM), we found that more than a quarter of Baby Boomers and Gen Xers who would have had adequate retirement income under an assumption that historical average market returns would prevail are instead simulated to end up running short of money in retirement if today’s historically low interest rates are assumed to be a permanent condition (assuming retirement income/wealth must cover 100 percent of simulated retirement expenses²).
Not that everyone is affected to the same degree. In fact, the analysis reveals that the potential impact varies by income levels: The low-yield-rate environment appears to have a limited impact on retirement income adequacy for those in the lowest preretirement income quartile, since they have relatively small levels of defined contribution and IRA assets and since they rely more heavily on Social Security income in retirement. However, the research found there is a very significant impact for the top three income quartiles.³
The research found that the impact is lessened if the current low rates are temporary, but magnified by additional years of future eligibility for participation in a defined contribution plan. For example, moving from the historical-return assumption to a zero-real-interest-rate assumption results in an 11 percentage-point decrease in simulated retirement readiness for Gen Xers who have one to nine years of future eligibility, but that gap widens to a 15 percentage-point decrease for those with 10 or more years of future eligibility.
In recent days, word that the Federal Reserve sees an end to its current policies has brought some volatility to the stock market. While several sectors of the economy have benefitted from the U.S. Federal Reserve holding short-term interest rates near zero to support a recovery, there are warning signs in the EBRI analysis about longer-term consequences that policy makers should also consider—and implications for retirement readiness should historical averages return.
Nevin E. Adams, JD
¹ The full report is published in the June 2013 EBRI Notes, “What a Sustained Low-yield Rate Environment Means for Retirement Income Adequacy: Results From the 2013 EBRI Retirement Security Projection Model®,” online here.
² When 80 percent of simulated retirement expenses must be covered, only 5‒8 percent are simulated to run short of money.
³ This is in sharp contrast to a recent report by the Center for Retirement Research at Boston College that claimed that the lower interest rates had only a minor impact for ALL income categories. However, as we have noted previously, CRR’s National Retirement Risk Index, on which the conclusions are based, continues to assume that all retirees annuitize all of their defined contribution and IRA balances; continues to ignore the impact of long-term care and nursing home costs or assumes that they are insured against by everyone; and also seems to rely on an outdated perspective of 401(k)-plan designs and savings trends, essentially ignoring the impact of automatic enrollment, auto-escalation of contributions, and the diversification impact of qualified default investment alternatives. Additionally, given the way their model assumes assets accumulate during the preretirement period, the change in interest rates has NO impact on accumulations at retirement age. See “Rely Able?”
Sunday, June 30, 2013
Sunday, June 23, 2013
Balancing "Acts"
Last week we looked at how the trends in employment-based retirement plans and employment-based health plans seem to be heading in opposite directions: fewer choices for workers to make in the former, more in the latter (see “Consumer ‘Driven?’” online here). Recent EBRI research suggests a potential divergence in other areas as well.
According to the EBRI/MGA Consumer Engagement in Health Care Survey, 26–40 percent of respondents reported some type of access-to-health-care issue for either themselves or family members last year. “Access” in this case refers not to availability, per se, but is broadly defined as not filling prescriptions due to cost, skipping doses to make medication last longer, or delaying or avoiding getting health care due to cost.
Not surprisingly, access is more of a problem among those with lower incomes, who appear to be forgoing spending on health care. In fact, regardless of health plan type, individuals in households with less than $50,000 in annual income were more likely than those in households with $50,000 or more in annual income to report access issues. In sum, a number of individuals, notably lower-income workers, were restricting their spending on healthcare.¹
Another recent EBRI analysis found that lower-income workers were withdrawing money from their individual retirement accounts in much greater numbers, earlier, and at much larger percentages, than other workers. In fact, the report noted that nearly half (48 percent) of the bottom-income quartile of those between the ages of 61 and 70 had made such an IRA withdrawal, and that their average annual percentage of account balance withdrawn (17.4 percent) was higher than the rest of the income distribution. In sum, a number of individuals, again, notably lower-income workers, were withdrawing more from their retirement savings accounts than those in higher income groups.
One of the great hopes behind a growing emphasis on consumer-directed health plans is that individuals would make different, perhaps more efficient decisions about their health care. Of course, one of the looming concerns is that individuals would make ill-informed decisions influenced by short-term personal economic (rather than health) factors. Similarly, there have been concerns expressed that, left to their own devices, individuals will withdraw too much money too soon from their retirement accounts—that their decisions too will be motivated by short-term needs, rather than by a full appreciation for the longer-term consequences of those actions.
As previous EBRI research has documented, the availability of health insurance may not only affect retirement decisions, but the costs of health care and long-term care can have a very real impact on retirement income adequacy.² The trends highlighted in the EBRI analyses suggest that some—notably lower-income individuals—could be spending less on healthcare than they might, and perhaps drawing more from their retirement accounts than they should.
What’s not yet clear—and what future research may shed light on—is whether these actions are borne of necessity, are simply random and potentially ill-considered, or are the result of conscious (and perhaps conscientious) choice.
Nevin E. Adams, JD
¹ Some additional evidence of the trend was highlighted by EBRI research recently published in Health Affairs, specifically that consumer-directed health plans (CDHPs) were shown to reduce the long-term use of outpatient physician visits and prescription drugs. Link is online here.
² See “Views on Health Coverage and Retirement: Findings from the 2012 Health Confidence Survey” and “Savings Needed for Health Expenses for People Eligible for Medicare: Some Rare Good News.”
See also “Lessons From the Evolution of 401(k) Retirement Plans for Increased Consumerism in Health Care: An Application of Behavioral Research,” online here.
Sunday, June 16, 2013
Consumer "Driven?"
Over the last several years, the trend in employment-based retirement plans has been to put in place structures to make more decisions for workers through the expansion of automatic enrollment plan designs.(1) At the same time, the trend in employment-based health plans has been to look for ways to better manage costs, while providing workers more choice, and flexibility in those choices, by looking to so-called consumer-driven health plans, or CDHPs.(2)
In the case of the former, the shift has been to help workers make better decisions: to boost savings not only by increasing plan participation, but by directing contributions to more diversified investment options than many seem to choose of their own accord. In the case of health care plan designs, the expansion of choice is similarly rooted in a desire to help workers make “better” decisions, albeit with a slightly different emphasis.
The addition of CDHP designs (and in many cases it is an addition to, rather than a replacement for, traditional health plan designs) has arguably been motivated in no small part by a desire to constrain the costs of employment-based health care programs by giving workers some “skin in the game” beyond whatever premiums may be associated with the benefit. A recent EBRI report notes that, by 2012, 31 percent of employers offered some version of a CDHP (either a health reimbursement arrangement or a health savings account-eligible plan), with about 25 million people (about 14.6 percent of the privately insured market) covered by these types of plans.
The concept is relatively straightforward: CDHPs combine high deductibles with tax-preferred savings or spending accounts (that can be funded by employer and/or employee contributions, or both) that workers and their families can use to pay their out-of-pocket health care expenses. The theory is that individuals may spend money from their own account(s) more judiciously. However, there have been concerns that individuals may prove to be too frugal, choosing to defer needed and perhaps even necessary healthcare just to avoid spending money.
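As a simple sketch of those cost-sharing mechanics, consider how a year’s expenses might be split under a stylized high-deductible plan with an attached account (the deductible, account balance, and expense figures below are hypothetical, and the sketch deliberately ignores coinsurance and out-of-pocket maximums):

```python
def out_of_pocket(expenses, deductible, account_balance):
    """Split a year's medical expenses under a simplified high-deductible
    plan: costs up to the deductible come from the worker (drawing down
    the savings/spending account first), and the plan pays the rest.
    Coinsurance and out-of-pocket maximums are ignored for simplicity."""
    worker_share = min(expenses, deductible)          # worker's portion
    from_account = min(worker_share, account_balance) # paid from HSA/HRA
    from_pocket = worker_share - from_account         # paid out of pocket
    plan_pays = expenses - worker_share               # plan's portion
    return from_account, from_pocket, plan_pays

# Hypothetical year: $3,000 in expenses, $2,500 deductible, $1,000 in the account
print(out_of_pocket(3_000, 2_500, 1_000))  # (1000, 1500, 500)
```

The point of the design is visible in the middle term: once the account is exhausted, further spending up to the deductible comes directly out of the worker’s pocket, which is precisely the “skin in the game” the plans rely on.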
Additionally, while the theory that consumerism would lead to less (and perhaps better) spending appeared sound, and while prior research in this area has generally found low-to-moderate reductions in measures such as services used, conclusions have also been limited by the potential for selection bias, in that workers were often given a choice between CDHPs and more traditional options—and it was possible that individuals of a particular profile might be more inclined to opt for the CDHP.
A recent EBRI study, published in the June issue of Health Affairs,(3) was able to examine health care services utilization trends by using data from two large employers—one that adopted a CDHP in 2007 and another with no CDHP. That research(4) found that after four years under an HSA plan, there were 0.26 fewer physician office visits per enrollee per year and 0.85 fewer prescriptions filled, although there were 0.018 more emergency department visits. The likelihood of receiving recommended cancer screenings was also lower under the HSA plan after one year and, even after recovering somewhat in later years, remained below the baseline at the study’s conclusion. While small, these differences are statistically significant.
Ultimately, the longer-term impact of CDHPs on health status, outcomes, and spending remains to be established. On the other hand, and as the research paper notes, if CDHPs succeed in getting workers to make decisions that are more cost-sensitive, employers may well want to ensure not only that the plan designs work to incentivize the right choices, but that their work force is educated on the expanded choices that lie ahead.
Nevin E. Adams, JD
(1) See “The Impact of PPA on Retirement Savings for 401(k) Participants,” online here.
(2) CDHPs combine high deductibles with tax-preferred savings or spending accounts that workers and their families can use to pay their out-of-pocket health care expenses. These accounts allow people to accumulate funds on a tax-preferred basis—the funds may include contributions from the employer, the employee, or both, depending on the plan’s structure. Employees can choose between using the funds for their health care cost sharing or saving the money for the future. Employers began offering CDHPs in 2001 when a handful started offering health reimbursement arrangements (HRAs). They then started offering health savings account (HSA)-eligible plans after the Medicare Prescription Drug, Improvement, and Modernization Act of 2003 included a provision to allow individuals with certain high-deductible health plans to contribute to an HSA. See also “What Do We Really Know About Consumer-Driven Health Plans?” and “Characteristics of the CDHP Population, 2005–2010.”
(3) See “Consumer-Directed Health Plans Reduce The Long-Term Use Of Outpatient Physician Visits And Prescription Drugs,” online here.
(4) This work was conducted through the EBRI Center for Research on Health Benefits Innovation (EBRI CRHBI). The following organizations provided the funding for EBRI CRHBI: American Express, BlueCross BlueShield Association, Boeing, CVS Caremark, General Mills, Healthways, IBM, John Deere & Co., JP Morgan Chase, Mercer, and Pfizer.
Sunday, June 09, 2013
“Drawing” Board?
While drawing boards have been used by engineers and architects for more than two centuries, the phrase “back to the drawing board” is of much more recent origin, coined by Peter Arno in a cartoon first published in the March 1, 1941 issue of The New Yorker.(1) The cartoon features a crashed plane in the background, a parachute in the distance, several military officials and rescue workers rushing to help/investigate—and one remarkably nonchalant individual, walking in the opposite direction with a rolled-up document tucked under his arm as he comments, “Well, back to the old drawing board.”
For all the much-deserved focus on retirement savings accumulations, a growing amount of attention is now directed to how those already in (and fast-approaching) retirement are actually investing and drawing down those savings.
A recent EBRI analysis(2) found that at age 61, only 22.2 percent of households with an individual retirement account (IRA) took a withdrawal from that account. That share slowly increases to 40.5 percent by age 69 before jumping to 54.1 percent at age 70, and by age 79, almost 85 percent of households with an IRA had taken a distribution.
IRAs are, of course, a vital component of U.S. retirement savings, holding more than 25 percent of all retirement assets in the nation, according to a recent EBRI report. A substantial and growing portion of these IRA assets originated in other employment-based tax-qualified retirement plans, such as defined benefit (pension) and 401(k) plans.
The EBRI analysis also found that the percentage of households with an IRA making a withdrawal from that account not only increased with age, but also spiked around ages 70 and 71, a trend that appears to be a direct result of the required minimum distribution (RMD) rules in the Internal Revenue Code.(3) Those rules require that traditional IRA account holders begin to take at least a specified amount from their IRA no later than April 1 of the year following the year in which they reach age 70½, or else suffer a fairly harsh tax penalty.
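The RMD arithmetic itself is simple: the year’s minimum is the prior year-end balance divided by a life-expectancy factor from the IRS Uniform Lifetime Table. The sketch below uses a few illustrative factors approximating the table then in effect, and a hypothetical balance; consult current IRS tables for actual values:

```python
# Illustrative life-expectancy factors approximating the IRS Uniform
# Lifetime Table in effect at the time (not authoritative values).
UNIFORM_LIFETIME_FACTORS = {
    70: 27.4, 71: 26.5, 72: 25.6, 73: 24.7, 74: 23.8, 75: 22.9,
}

def required_minimum_distribution(balance, age):
    """Minimum amount that must be withdrawn for the year: the prior
    year-end balance divided by the age-based life-expectancy factor."""
    return balance / UNIFORM_LIFETIME_FACTORS[age]

# A hypothetical $200,000 IRA at age 71:
rmd = required_minimum_distribution(200_000, 71)
print(f"RMD at 71: ${rmd:,.2f}")  # roughly 3.8% of the balance
```

Because the factor shrinks each year, the mandated percentage rises slowly with age, which helps explain why so many older households withdraw only the RMD amount and no more.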
In fact, at age 71, 71.1 percent of households owning an IRA that took a withdrawal reported that they took only the RMD amount, increasing to 77.4 percent at age 75, 83.2 percent at age 80, and 91.1 percent by age 86.
However, the EBRI report noted that IRA-owning households not yet subject to the RMD—those headed by individuals between the ages of 61 and 70—made larger withdrawals than older households, both in absolute dollar amounts as well as a percentage of IRA account balance. Indeed, the bottom-income quartile of this age group had a very high percentage (48 percent) of households that made an IRA withdrawal—and their average annual percentage of account balance withdrawn (17.4 percent) was higher than the rest of the income distribution. Moreover, those younger households that made IRA withdrawals spent most of it.
While a significant percentage of those in the sample are drawing out only what the law mandates, the data indicate that more of those in the lower-income groups not only draw money out sooner, but also draw out a higher percentage of their savings—perhaps too early to sustain them throughout retirement.(4)
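To see why an average draw rate of 17.4 percent is hard to sustain, a simple depletion sketch helps (the starting balance and constant-return assumption are hypothetical; actual outcomes depend on market returns and spending patterns):

```python
def years_until_depleted(balance, withdrawal_rate, annual_return=0.05):
    """Count years until the balance runs out, withdrawing a fixed
    fraction of the *initial* balance each year and growing the
    remainder at a constant assumed annual return (capped at 100 years)."""
    withdrawal = balance * withdrawal_rate
    years = 0
    while balance > 0 and years < 100:
        balance = (balance - withdrawal) * (1 + annual_return)
        years += 1
    return years

# Hypothetical $100,000 balance, 5% assumed annual return:
print(years_until_depleted(100_000, 0.174))  # depleted in about 7 years
print(years_until_depleted(100_000, 0.04))   # never depletes under these assumptions
```

Even under a generous constant-return assumption, a 17.4 percent draw exhausts the account in well under a decade, while a 4 percent draw is sustained indefinitely—which is the essence of the “too early to sustain them throughout retirement” concern.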
Planning and preparation matters—not only for retirement savings, but in retirement withdrawals. Because for those whose retirement resources run short too soon, it’s generally also too late to go “back to the drawing board.”
Nevin E. Adams, JD
(1) You can see the original cartoon online here.
(2) The data for this study come from the University of Michigan’s Health and Retirement Study (HRS), which is sponsored by the National Institute on Aging. See “IRA Withdrawals: How Much, When, and Other Saving Behavior,” online here.
(3) As noted in a previous post (see “Means Tested”), there are advantages to a drawdown strategy based on the schedule provided by the Internal Revenue Service (IRS) for required minimum distributions, or RMDs. See also Withdrawal “Symptoms.”
(4) The EBRI Retirement Readiness Ratings™ indicate that approximately 44 percent of the Baby Boomer and Gen-Xer households are simulated to be at-risk of running short of money in retirement, assuming they retire at age 65 and retain any net housing equity in retirement until other financial resources are depleted. Those individuals may well become part of the significant percentage of retirees who eventually must depend on Social Security for all of their retirement income. See “All or Nothing? An Expanded Perspective on Retirement Readiness.”
Sunday, June 02, 2013
“Half” Baked?
I’ve never been much good in the kitchen. I’ve neither the patience/discipline to follow most recipes, nor the innate sense for the right balance of ingredients that those with culinary talent seem to have. That said, I learned the hard way years ago that if you mix the right items in the wrong order, use the wrong amounts of the right items, or leave something to bake too long—or not long enough—the results can be disastrous.
A recent report by the Pew Charitable Trusts posed the question, “Are Americans Prepared for Their Golden Years?” Perhaps not surprisingly, the report indicated that many are not. What was surprising, however, was the assertion that Gen-Xers (defined in the Pew analysis as those born between 1966 and 1975) looked to be in even worse shape than either early or late Boomers.
Previous EBRI research has found that approximately 44 percent of simulated lifepaths for Baby Boomer and Gen-Xer households are projected to run short of money in retirement, assuming they retire at age 65 and retain any net housing equity in retirement until other financial resources are depleted. However, that includes a wide range of personal circumstances, from individuals projected to run short by as little as a dollar to those projected to fall short by tens of thousands of dollars. Looking specifically at Gen Xers, many of whom have decades of saving accumulations still ahead of them, nearly one-half (49.1 percent) of the simulated lifepaths of that demographic are projected to have retirement resources that are at least 20 percent more than is simulated to be needed, while approximately one-third (31.4 percent) are projected to have between 80 percent and 120 percent of the financial resources necessary to cover retirement expenses and uninsured health care costs (see Retirement Income Adequacy for Boomers and Gen Xers: Evidence from the 2012 EBRI Retirement Security Projection Model).
However, in reviewing the Pew report and its associated methodology, several key differences in approach emerge. On the one hand, the Pew report assumes that workers will receive credit for a full career in the accrual of Social Security benefits, and it also imputes a full-career accrual of defined benefit pension benefits – though many individuals don’t wait until full retirement age to collect on the former (accepting lower benefits), and many don’t accumulate enough service to be entitled to the latter (see “The Good Old Days,” “Employee Tenure Trends, 1983–2012”). This assumption likely exaggerates the retirement readiness of older workers, who are more likely to have some defined benefit accrual.
On the other hand, the Pew report appears to assume no further contributions, either by employer or employee, to the defined contribution balances as of 2010. That’s right, no further contributions beyond the self-reported participant balances of 2010, and no earnings projection on those assumed non-existent contributions, either. This assumption likely serves to understate the future retirement readiness of younger workers, who have years, and in many cases decades, of savings ahead of them.
Based on the combination of those assumptions and the well-documented trend away from defined benefit plans and toward a greater reliance on defined contribution designs, it’s little wonder that the Pew report concludes that Gen Xers will be worse off than Boomers.
In sum, whether you’re baking a cake or evaluating research conclusions, if it seems a bit “off,” it’s generally a good idea to carefully review the recipe – and double check the ingredients.
Nevin E. Adams, JD
The Pew Charitable Trusts report, “Retirement Security Across Generations” is available online here.