The evaluation of CURANT was conducted by the Centre for Migration and Intercultural Studies (CeMIS) of the University of Antwerp. CeMIS was present from the project’s outset as part of the partnership, overseeing the project evaluation component. At the same time, it also tried to maintain a level of independence within the partnership, balancing its insider and outsider position for the benefit of the evaluation. This balancing act was not always easy, as is further explored below in the context of the specific challenges that presented themselves.
While CeMIS was the evaluation focal point, the other partners, responsible for the implementation of services, were also involved in the evaluation throughout the project. In particular, they were consulted in the process of developing the evaluation approach through six group interviews, with participation from project designers and coordinators, social and education workers, and psychotherapists. Once CeMIS had finalised the approach based on partner input, including the project’s theory of change, the material was presented to the delivery partners, who decided on further evaluation steps.
We collected the data to see what the wavelengths are, who has what types of goals and ideas […] we presented this analysis; we think these are the indicators, we think maybe here and there there’s a bit of a mismatch, and so on. But then, we passed it on over to the partners and to the project manager to decide how to go further.
Source: CURANT project hearing
Delivery partners also provided data for the evaluation and otherwise supported the research process (for example, by facilitating access to respondents, especially among young refugees).
All partners were really interested in providing inputs on the things they were doing, the activities they were organising and the potential impact.
Source: CURANT project hearing
Overall, the evaluation benefited from a cooperative spirit within the partnership. Various platforms were created for the evaluation partner and delivery partners to meet and exchange feedback to improve the project.
We had a lot of meetings. We did reporting at several moments during the project […] We were continuously involved, and from the beginning also, and there was a lot of interaction.
Source: CURANT project hearing
Complementary to the group interviews it organised with delivery partners, CeMIS also participated in major preparatory meetings during the initial stages of the project and presented evaluation results at different points throughout its duration. This was possible because the evaluation was structured into four phases:
- Phase one: Preparatory and start-up phase, which aimed to develop the programme theory and a baseline measurement.
- Phases two and three: Two rounds of fieldwork, which aimed to gather data through in-depth interviews and focus groups, followed by data analysis and evaluation (two evaluation reports were produced).
- Phase four: The final evaluation and policy-oriented final report.
Want to learn more about the evaluation results? Check out the ‘Policy recommendations’ report and the first and second evaluation reports.
After one year, we had a big partnership meeting, like a one-year evaluation. CeMIS got to present their results. All the partners could say what they felt went wrong, what went good, they could see that CeMIS had the same results.
Source: CURANT project hearing
The ongoing nature of the evaluation allowed findings to be integrated into the implementation of activities. One example was the inclusion of shorter, tailor-made educational trajectories for those young refugees who were eager to start working as soon as possible. When evaluation results indicated that the refugees were overburdened with activities and appointments, their trajectories were adapted and made less intense. And when CeMIS and the other partners realised that the project concentrated more on the refugees, the delivery partners shifted more attention to the buddies.
Buddies also needed attention and a different approach sometimes […] It came up during the partnership meeting how we are going to guarantee those things and change the procedure […] Everybody wanted to have refugees more strengthened and buddies more aware about the refugees.
Source: CURANT project hearing
This flexibility to alter activities shows that the project embraced a learning mindset beyond the evaluation itself. The results obtained by CeMIS often converged with the observations made by delivery partners, which the evaluators considered logical given the high level of interaction within the evaluation. While observations often overlapped, CeMIS was perceived as more objective, which may have added weight to the evaluation results.
It’s really important to have CeMIS on board to have objective results and to see whether what we thought was really right; to have some leverage to sustain the project and to disseminate project results.
Source: CURANT project hearing
In addition to bringing academic objectivity, CeMIS enjoyed a holistic view of the project thanks to its involvement in consultations with all partners and its continuous focus on examining the intervention. This allowed it to play an integrative role in answering questions as to whether, how and why the project worked.
While in a partnership every partner had some expertise […] nobody had a helicopter view. And that’s what CeMIS had, CeMIS had a helicopter view from all the partners.
Source: CURANT project hearing
CeMIS’s integrative function may have been strengthened by the choice of a theory-based evaluation approach, which entailed the development of a shared theory of change. At the same time, theory development tested the limits of CeMIS’s role in this respect: the process revealed some differences in how partners viewed specific aspects of the project, and CeMIS reflected on whether it wanted to be the one to bring the partners to a common denominator.
Is this our role, to get everybody on the same page from the very beginning? We didn’t see it necessarily like that.
Source: CURANT project hearing
In conducting the evaluation, CeMIS had to balance its position within the project. As an insider, it had stronger interactions with project partners, which allowed for easier access to information, sharing of results and feeding findings back into the project. As an outsider not implementing the activities, it had to establish enough presence to build the necessary relationship of trust with young research participants. At the same time, too strong a presence could have blurred the line between CeMIS and the delivery partners in the eyes of young respondents, especially refugees. This, in turn, could have led respondents to answer questions in a way they believed to be desirable and, therefore, safe.
Sometimes we also went into activities. We tried to observe at some times. I think we did that more at the beginning, but then the line between being a partner like the other partners and being an outside partner was not so clear for the participants, so in the end we took a bit more distance for them to know that we were really more an outside partner.
Source: CURANT project hearing
Another challenge faced by the evaluators was that they could not measure the results of the whole project. The evaluation needed to finish six months before the end of the project so that the evaluation report could be written, presented and disseminated before the end of the implementation period (when financing for the project stopped).