Journal Issue: Home Visiting: Recent Program Evaluations Volume 9 Number 1 Spring/Summer 1999
Why Were There No Program Impacts?
The results reported above—almost no positive impacts for parents or children on any measure or in any site—are disappointing. This section posits some possible explanations of these disappointing findings, focusing on the definition and implementation of the program, its underlying strategy and theory, and the quality of services provided.
Definition and Implementation
Past programs have foundered when they were poorly defined or poorly implemented, but that does not appear to have been the case with CCDP. ACYF clearly and carefully defined the CCDP program so that it could be understood and implemented by staff at the local level. ACYF provided a detailed definition of the program, strong centralized management and oversight, and associated programmatic regulations and guidance. A management information system was put in place to help monitor service provision and to identify technical assistance needs. Project monitoring included monthly telephone calls, grantee meetings three times a year, quarterly progress reports, and annual week-long site visits conducted by staff from ACYF and CSR, Inc., to assess compliance and provide technical assistance. Compared with other demonstration projects and other federal programs, there is little question that the CCDP model was well defined at the federal level, clearly communicated to local grantees in a variety of settings, and closely monitored.
Given a well-defined program, it is still possible that local grantees were unable to do a high-quality job of implementing the program. However, as reported in the CCDP process evaluation, CCDP served the families that it was intended to serve, worked with thousands of service agencies nationwide, and delivered or obtained a wide range of services for a high proportion of participating families. CCDP intended to provide up to five years of continuous service to low-income families, and families recruited for the CCDP demonstration and evaluation participated for an average of more than three years. Compared with other demonstration programs, which often have annual dropout rates of 50% or more,6 CCDP was relatively successful in retaining substantial numbers of families from a traditionally difficult-to-serve segment of the population.
Did CCDP Choose the Right Intervention Strategy?
If the program was clearly defined and fairly well implemented, perhaps its poor results were due to a more fundamental problem. Perhaps the underlying assumptions about appropriate intervention strategies (that is, case management as the means to serve parents and changing parent behavior as the means to promote children's development) were incorrect.
CCDP chose to provide case management as the primary intervention with parents, assuming that what poor families need most is a guide and negotiator as they navigate the web of services in the community. It is probably true that many CCDP families came to rely on the help and advice they received from their case managers, and during the first two years of the program, larger percentages of CCDP families than control group families reported that they received most services.21 However, in most cases the differences were not large—certainly not as large as might be expected for a program that spent about $11,000 per family per year to ensure that services were delivered. More importantly, by the end of the study, there were no important differences between CCDP and control group families in terms of the amount of services received,6 suggesting either that families in the control group found similar guides and negotiators at one or more of the service providers they used, or that most families, even poor ones, know what services they need and how to find them. What the CCDP case managers could not do, even with the help of specialist staff, was obtain needed services that did not exist or were in short supply. They could not, for example, produce adequate housing, jobs that paid a living wage, or outpatient mental health services in communities that did not have them.
The case management model has been tried in other fields, with similar outcomes. For example, the Fort Bragg Child and Adolescent Mental Health Demonstration project, funded by the U.S. Army, was an $80 million program that delivered mental health and substance-abuse services using a coordinated case management approach to involve various service agencies. An evaluation of this program reached many of the same conclusions as the current study—the demonstration had a systematic and comprehensive approach to planning treatments, more parental involvement, strong case management, more individualized services, fewer treatment dropouts, a greater range of services, enhanced continuity of care, more services in less restrictive environments, and a better match between services and needs.22 Despite these positive implementation findings, no positive effects were found on a wide range of child-level outcome measures. Comparison group children who participated in a less expensive, fragmented system of care without case management did as well clinically as children in the demonstration. This pattern of findings—good implementation of an integrated case management service-delivery system, followed by no effects on program participants—has been seen in other recent studies of child and adolescent mental health services.23-25 (See the article by Wagner and Clayton in this journal issue for a discussion of case management for adolescent parents.)
CCDP chose to focus its early childhood component on parents, seeing them as the conduits of children's developmental experiences. This approach reflects the views of many developmental psychologists and is a strategy common to many programs that seek to intervene with very young children. However, literature about the effects of parenting education on child development casts doubt on the efficacy of this approach.26,27 At the same time, there is substantial research evidence that the best way to achieve large effects on children is to provide intensive services directly to children for an extended period.19,20 This research does not dismiss the importance of the parents' role in child development. In fact, there is widespread agreement that competent parenting is related to positive child development. What is not available, as Alison Clarke-Stewart pointed out in her review of a decade of research about parenting education, is evidence about what parenting education should consist of, how it should be delivered, who should provide it, what kinds of parents are likely to benefit from it, what effects on parents might be expected from a widely used intervention rather than a small research study, and what the timing of such interventions must be if they are to have any hope of influencing children's outcomes.28 In the absence of such evidence, intervention programs have little guidance when they choose parenting education as a strategy.
There are several reasons why parenting education as implemented in CCDP might have been ineffective. First, as noted earlier, parents who were present for every possible early childhood education home visit received only 13 hours of instruction in the course of a year. This is unlikely to be a sufficient amount of exposure to new ideas. Second, the parenting curriculum was often delivered by case managers who may not have internalized or agreed with the ideas they were communicating. Many parents had older children and may have already formulated their own parenting and teaching strategies; their day-to-day interactions with children may have been influenced little, if at all, by the home visitors. Finally, parenting education simply may not be a very effective way of enhancing children's development.
Service Quality
CCDP was developed under the assumptions that (1) most of the services needed by low-income families already existed in most communities, and (2) these services were of adequate quality to address the needs of low-income families. It is possible that these assumptions were incorrect and that the problem lay with the services obtained by CCDP—perhaps local services were of poor quality, were not the services participating families needed, or were not sufficiently intensive.
While there is no information about the quality of services provided through CCDP, there are data on the extent to which parents reported that services allowed them to meet the goals that they and CCDP staff set for themselves. Although CCDP families set many different goals, only a small percentage of parents reported that they attained those goals. For example, 37% reported that they obtained adequate housing, 11% reported that they improved their parenting skills, 24% reported that they obtained health care, 13% reported that they obtained social support, 17% reported that they furthered their education, 14% reported that their children had enhanced cognitive and social development, and so on.7 This suggests that the great majority of participating parents did not think that CCDP helped them achieve the goals they set for themselves. These perceptions of program noneffectiveness on the part of CCDP parents are especially striking given the high satisfaction with which participants view most social programs.29