Evaluation Services

Evaluation work grounded in real material

Structured interpretation of unstructured material to answer a fundamental question: what does your data reveal about the goals and expectations that defined your initiative?

What we deliver

AI-supported analytical procedures examine extensive datasets and draw conclusions that give organisations a clear, accountable record of achievement.

  • Evaluation of programmes, projects, and events

  • Assessment of goal attainment and KPI alignment

  • Analysis of unstructured data as evidence of performance

  • Development of evaluative criteria when none exist

  • Visual dashboards that show results with clarity

Evaluations grounded in real material

We assess the extent to which programmes, projects, and events reach their intended outcomes. The process draws on the material that organisations already hold: reports, comment fields, strategic documents, interviews, internal communication, public feedback, media coverage, and other qualitative sources. Quantitative indicators, such as counts of likes, reactions, or published articles, show attention but reveal little about how an initiative was discussed or understood. Our analysis identifies what was said, what shaped the conversation, and how the initiative was perceived. The outcome shows where goals were reached, where results fell short of expectations, and where unintended effects emerged.

Goal- and KPI-aligned evaluation

Clients who operate with predefined goals or KPIs receive an evaluation tied directly to those targets. When organisations lack established indicators, we define the evaluative frame together and create criteria that reflect the project's purpose and context. The outcome remains transparent and grounded in clear logic so internal and external stakeholders can trust the conclusions.

Unstructured data as a reliable source of performance

Unstructured data often provides depth and context that reveal additional indicators of success or failure. Human readers experience drift, fatigue, and reduced consistency when handling material of this volume and complexity. AI-supported structured analysis removes these constraints and applies consistent criteria across large bodies of information. The result is an evaluative foundation that strengthens internal discussions and external communication.

Visualisation of results

Evaluation outputs include dashboards and visual representations that show distributions, trends, and outcome patterns across the full body of material. These forms make complex findings accessible for boards, project teams, and stakeholders who need a clear account of achievements and areas that require attention.

Frequently Asked Questions

Which kinds of programmes, projects, and events do you evaluate?

We support evaluations of programmes, projects, and events where understanding public perception is important and where large amounts of unstructured material exist that cannot realistically be processed by hand. This applies to cultural programmes, public policy initiatives, social programmes, festivals, city-wide events, and similar large-scale projects. We have direct experience supporting European Capital of Culture evaluations, but the same methods work wherever organisations need to understand how an initiative was actually received, discussed, and understood by the public. The common thread is that these evaluations generate vast amounts of material - newspaper articles, social media posts, public comments, feedback - that contains valuable insight but sits unexamined because no team has the capacity to read through it all systematically.

Which part of an evaluation do you take on?

We take responsibility for the unstructured data component of an evaluation. This means we analyse the newspaper coverage and social media discourse surrounding your programme or event. Your own evaluation team continues to handle the aspects they know best: defining goals and KPIs, coordinating with stakeholders, managing surveys, and compiling the final evaluation report. What we contribute is a rigorous evidence layer showing what citizens and media actually said about your initiative, how they discussed it, what themes emerged, and how perception developed over the course of the programme. This complements traditional evaluation methods. Where surveys tell you what people say when asked directly, and visitor statistics tell you how many people showed up, our analysis tells you what people said when they were talking among themselves - in newspaper opinion pieces, on social media, in comment sections. Together, this gives a much fuller picture than any single method alone.

What does social media analysis add to an evaluation?

Social media gives us access to something that traditional evaluation methods struggle to capture: what citizens genuinely think when they are not being asked directly. When someone responds to a survey, they know they are being evaluated and may adjust their answers accordingly. When someone posts on social media, they are expressing themselves to their own network, often without any awareness that their opinion might be studied. This makes social media a window into authentic public reflection. We have developed structured, reproducible methods to analyse this discourse at scale. Rather than reading a sample of posts and forming impressions, we systematically code thousands of posts according to consistent criteria. The result is concrete, defensible findings: which themes citizens raised on their own initiative, how different groups in the community responded, what concerns or criticisms emerged organically, what praise or enthusiasm appeared, and how the overall conversation evolved from the beginning of the programme to its conclusion. This is evidence of what the public actually thought - not what we assumed they thought, and not what they told us when we asked.

Do you define the goals and KPIs for our evaluation?

No, and this is an important distinction. The goals and KPIs for an evaluation should be defined by the organisation's own team - the people who understand the programme's strategic intentions, know its history, and will be accountable for the results. We are not in a position to tell a city what their European Capital of Culture programme should have achieved, or to define success criteria for a social initiative we were not involved in designing. What we do is work within whatever evaluation framework you have established. Once you tell us what questions you are trying to answer and what goals you set out to achieve, we provide evidence from public discourse that speaks to those questions. Our role is to add depth, rigour, and scale to the unstructured data component of your evaluation - to show you what citizens actually said - not to design the evaluation itself.

How is this different from counting media mentions or tracking hashtags?

Counting media mentions tells you how much attention your programme received, but it tells you nothing about what was actually said. A programme mentioned in a thousand articles might have been praised in all of them, criticised in all of them, or discussed in ways that had nothing to do with your goals. The number alone reveals nothing. Similarly, tracking hashtag volume shows you that people were talking, but not what they were saying or whether the conversation was positive, negative, substantive, or superficial. Our approach is fundamentally different. We analyse the actual content of the coverage - the arguments made, the themes raised, the sentiment expressed, the criticisms and the praise. We do this systematically across thousands of articles and posts, applying consistent criteria so that the findings are reproducible and defensible. The result is not a metric but an account of what your initiative meant to people, how it was understood, and how that understanding developed over time. This is the kind of evidence that can genuinely inform future decisions and demonstrate impact in a way that simple counts cannot.

A disciplined evaluation process

Do you want a disciplined account of how your project or event performed against its goals? We support you through the full evaluative process.

Evaluation Services | NordAIConsult AS