Municipal community centers as healthy settings: evaluation of a real-world health promotion intervention in Jerusalem

Program overview of this evaluation research

The health-promoting community center (HPCC) program was conducted by the “Jerusalemites Choose Health” (JCH) program, a joint effort of the Jerusalem Municipality, the Ministry of Health and the Jerusalem District Health Bureau. The JCH further collaborated with additional partners, including the Jerusalem Municipality Sports Authority, the Jerusalem Health Education and Promotion Department, and the Linda Joy Pollin Cardiovascular Wellness Center for Women at Hadassah University Medical Center. The Pollin Center was asked to evaluate the program.

Criteria for a HPCC were established by consensus and incorporated into an agreement signed by participating DMCCs (see Additional file 1, Appendix A).

It was hypothesized that DMCCs receiving the intervention would develop better skills for building sustainable, participatory and effective HP initiatives and would provide a healthy setting for the community within their jurisdiction. Thus, the study questions guiding this evaluation research were: ‘To what extent do the intervention process and components increase the knowledge of DMCC staff members in establishing HPPs?’ and ‘How does this intervention foster the planning and implementation of HPPs according to the HPCC criteria, as evaluated by the EQUIHP and RE-AIM frameworks?’.

DMCCs were invited to participate in this program via a call for proposals circulated to all 31 DMCCs in Jerusalem and re-issued after the first year of the program. The invitation included four pre-requisites: 1) the DMCC director was requested to sign a letter of intent defining the center as a health-promoting setting and to act in accordance with the declaration, 2) assignment of a HP coordinator who would participate in a 12-session HP course, 3) submission of a questionnaire regarding HP policy and actions being taken within the community center and the surrounding neighborhood, 4) submission of an accompanying letter explaining why the DMCC was interested in joining the project and elaborating on actions currently being taken by the DMCC in specific HP areas. These pre-requisites were the inclusion criteria for participating in the program. Funding was made available for HPPs that met the criteria and included goals, objectives, activities, a budget and a plan for evaluation. The call for proposals was re-issued to all DMCCs after completion of the first year of the program, without the requirement to participate in the seminar. The structure of the program is presented in Fig. 1.

Fig. 1

Program timeline and description of the preparation phase, first- and second-year activities

Training seminar

During Year 1, all health coordinators, one from each participating DMCC, were required to attend a twelve-session weekly training seminar. Each session lasted 5 h. The seminar coordinator was a community project manager from the Pollin Center and a trained researcher in the HP field. She was one of the speakers and moderated panel discussions and workshop sessions. Additional professional speakers delivered presentations and workshops during the seminar, including public health physicians, HP specialists from the Ministry of Health, a physiotherapist and a nutritionist. The seminar taught the principles of HP, and trained and supervised coordinators in the knowledge, skills and methods for planning and implementing HPPs. Coordinators learned the standards defining a HPCC. The themes delivered during the seminar included basic knowledge in nutrition, physical activity, smoking and healthy environments. Coordinators also learned about leadership, planning principles, writing skills, implementation of an HP program, and building internal and external partnerships. Participating coordinators were trained and guided to perform health needs assessments in their community and to write an annual work plan. The training seminar also encouraged and enabled the creation of a peer network of HPCCs. In the final class, each coordinator presented his or her HP program work plan. See Supplemental Table 1 for the curriculum outline.

Educational kit

In Year 2, a HP online resource kit was distributed for coordinators’ self-education and program planning. The kit included HP materials, ideas for evidence-based HP initiatives, and manuals for activities with various target populations (children, the elderly, families, immigrants, etc.). Topics included healthy workplaces, health-promoting settings, HPPs developed by the ministries of education and health, and lists of potential partners and service providers. Coordinators were encouraged to use these materials for self-education and program planning. New coordinators who joined in Year 2 received one-on-one mentoring and telephone follow-up, as they had not participated in the group seminar. During April 2019, each coordinator presented their HPP work plan at a formal group meeting and received feedback, based on HP practices, from the program steering committee and leading advisors.

Peer group and mentoring

To facilitate the creation of an ongoing peer network, a social media portal discussion group, which included the research team, was created. This provided a platform wherein coordinators could discuss challenges, problem-solve and share successes. Trained researchers mentored coordinators via monthly phone calls, discussing programs implemented, challenges and issues that required guidance or assistance.

Evaluation

Baseline and training seminar

Each DMCC completed a questionnaire assessing the baseline characteristics of the community center and its coordinator.

Furthermore, at the end of the course, seminar attendees filled out an anonymous questionnaire addressing participants’ knowledge, skills and health behaviors.

Educational kit, peer group and mentoring

A process evaluation for assessing kit use and the social media portal group participation was performed at the end of year 2. Participants were asked the following questions: “Have you used this kit? What materials were the most useful for you? What would you suggest adding to this kit?”

The research team recorded details on HP initiatives, new partnerships, challenges and mentoring concerns through the monthly mentoring phone calls.

EQUIHP evaluation

At the end of Year 1 and during Year 2, coordinators assessed each HP activity using an online 28-item tool adapted from the European Quality Instrument for Health Promotion (EQUIHP) [8], which was culturally adapted and translated. The EQUIHP tool is typically used to assess HPPs, for quality improvement or as a checklist for self-evaluation [6]. The tool is divided into four dimensions covering the factors important for effective HP: 1) Framework of HP principles, 2) Project development and implementation, 3) Project management, and 4) Sustainability.

Final RE-AIM evaluation

For a more complete and detailed assessment, the final evaluation at the end of Year 2 was based on the RE-AIM framework [7], an accepted and robust model in the public health domain used for planning, implementing, and evaluating several types of HP interventions. Its dimensions are reach (R), effectiveness (E), adoption (A), implementation (I) and maintenance (M) [9]. The RE-AIM framework has been used for reporting internal and external validity [10, 11]. The measures used to evaluate the inclusion of RE-AIM components are presented in Table 1. Relevant elements of the EQUIHP tool [8] were also incorporated into the finalized RE-AIM tool by an expert committee, resulting in a final 29-item tool. Our RE-AIM evaluation examined each DMCC and evaluated adherence to HP principles and practices in the proposed work plans and in project implementation.

Table 1 Evaluation within RE-AIM dimensions

Data analysis

To summarize and describe the results, the characteristics of the data were analyzed using descriptive statistics, assessing the frequency distribution, central tendency, and variability of the dataset. For ordinal data, the median and interquartile range (IQR) were used [12]. The data are presented graphically, providing a useful visual summary of the results; box plots enabled the researchers to quickly identify and compare the dispersion of the data set. The evaluation data were analyzed using IBM SPSS Statistics 25 (Chicago, IL, USA).
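
As a minimal illustration of these descriptive summaries (not the SPSS workflow actually used in the study; the scores below are invented for demonstration), the following Python sketch computes a median and IQR and draws a box plot:

```python
# Illustrative sketch only; the study used IBM SPSS Statistics 25.
# The scores below are invented 0-1 scale values for a handful of centers.
import numpy as np
import matplotlib.pyplot as plt

scores = np.array([0.50, 0.75, 0.60, 0.90, 0.45, 0.80, 0.70])

median = np.median(scores)                 # central tendency for ordinal data
q1, q3 = np.percentile(scores, [25, 75])   # interquartile range (IQR)
print(f"median = {median:.2f}, IQR = {q1:.2f}-{q3:.2f}")

# A box plot gives a quick visual summary of the dispersion of the data set.
plt.boxplot(scores)
plt.ylabel("Score (0-1 scale)")
plt.title("Example score distribution")
plt.show()
```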

For the baseline and training seminar evaluation, descriptive statistics were calculated to describe and quantify the questionnaire variables.

For evaluation of the educational kit, peer group and mentoring, a researcher organized and summarized the data by calculating frequencies and by qualitative analysis. The content of the social media portal discussions was assessed by thematic analysis.

For the EQUIHP evaluation, each item of the adapted tool was coded online by the coordinators using three response options, adhering to EQUIHP instructions: not relevant or absent, partially present, or present. The research team assisted coordinators in reporting and coding as necessary. Two items regarding the planned and actual number of participants were open questions and were categorized by a researcher. Then, to obtain a comparative index and based on a previous study [13], each item was recoded on a 0–1 scale according to the following rules: 0 = not done, 0.5 = partially complete, 1 = complete. The sum of valid scores was divided by the number of valid answers. Absolute and median scores on a 0–1 scale were calculated for each EQUIHP dimension of the reported HP initiatives.
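
The recoding and scoring rule described above can be sketched as follows (a hypothetical example, not the study’s analysis code; the item responses are invented, and the handling of missing answers is an assumption):

```python
# Hypothetical sketch of the EQUIHP scoring rule described in the text.
# Item responses and the treatment of missing answers are illustrative assumptions.

RECODE = {"not done": 0.0, "partially complete": 0.5, "complete": 1.0}

def dimension_score(responses):
    """Sum of valid item scores divided by the number of valid answers (0-1 scale)."""
    valid = [RECODE[r] for r in responses if r is not None]  # None = no valid answer
    return sum(valid) / len(valid) if valid else None

# Example: a four-item dimension with one missing answer
items = ["complete", "partially complete", None, "not done"]
print(dimension_score(items))  # (1.0 + 0.5 + 0.0) / 3 = 0.5
```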

For the final RE-AIM evaluation, to improve the accuracy of data collection, data were collected via in-depth telephone interviews (rather than online) at the end of Year 2 and were coded and interpreted by the research team. The in-depth interviews were intended to avoid possible bias from the online reports and subjective self-coding by coordinators. After transcription of all interviews by a trained researcher, the data were coded and interpreted by the research team only: first by one researcher separately, and then, to ensure accuracy and reduce bias, two additional independent researchers reviewed and discussed the coding and conclusions. Each item was coded on a 0–1 scale (0 = not relevant or absent, 0.5 = partially present, 1 = present). Absolute and median scores were calculated for each DMCC and each RE-AIM component.
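
A similar, purely illustrative sketch of how item codes could be aggregated into per-component and per-DMCC summaries follows; the DMCC names, components shown and scores are invented, and interpreting the “absolute” score as a simple sum of item codes is an assumption:

```python
# Hypothetical aggregation of RE-AIM item codes (0 / 0.5 / 1); all data are invented.
from statistics import median

# coded[dmcc][component] -> list of item scores on the 0-1 scale
coded = {
    "DMCC_A": {"Reach": [1.0, 0.5, 1.0], "Effectiveness": [0.5, 0.5]},
    "DMCC_B": {"Reach": [0.5, 0.5, 0.0], "Effectiveness": [1.0, 0.5]},
}

# Median score per RE-AIM component, across all centers
components = sorted({c for scores in coded.values() for c in scores})
for comp in components:
    comp_scores = [s for scores in coded.values() for s in scores.get(comp, [])]
    print(comp, "median =", median(comp_scores))

# Absolute (summed) score per DMCC across all its coded items
for dmcc, comps in coded.items():
    item_scores = [s for scores in comps.values() for s in scores]
    print(dmcc, "absolute =", sum(item_scores), "of", len(item_scores), "items")
```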