The responses to the individual questions from our questionnaire were tallied within each of the 6
interoperability groups. The results were then presented in a matrix showing the 6 individual services in the
columns of the matrix and the 6 interoperability groups in the rows. An average compliance
level was calculated by dividing the sum of results by the number of services. This was presented on the
last row of the matrix. Similarly, an average compliance level for each of the 6 interoperability groups was
calculated and shown in the last column.
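The averaging described above can be sketched as follows. The service names, group names and scores here are hypothetical placeholders, not the survey's actual data; the real matrix has 6 services and 6 groups.

```python
# Sketch of the row/column averaging described above, using
# hypothetical compliance scores (fractions of 'yes' answers).
services = ["Service A", "Service B", "Service C"]  # hypothetical names
groups = ["Core principles", "User needs"]          # subset of the 6 groups

# scores[group][service] = compliance level for one cell of the matrix
scores = {
    "Core principles": {"Service A": 0.38, "Service B": 0.50, "Service C": 0.75},
    "User needs":      {"Service A": 0.83, "Service B": 0.67, "Service C": 0.61},
}

# Last row of the matrix: average compliance per service, across groups.
avg_per_service = {
    s: sum(scores[g][s] for g in groups) / len(groups) for s in services
}

# Last column of the matrix: average compliance per group, across services.
avg_per_group = {
    g: sum(scores[g][s] for s in services) / len(services) for g in groups
}
```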
As shown in Figure 2 below, the compliance levels differ greatly from service to service and between our 6
interoperability groups.

Figure 2: A matrix showing the calculated degree of compliance by each service in relation to the
interoperability principles

The matrix shown above presents the calculated degree of compliance based on the responses to our
questionnaire. The individual result for each service was calculated from its responses to all questions
within the individual interoperability group. For instance, the result of 38% for “Core
interoperability principles” for “UT-Rocket” was calculated by summing all of UT-Rocket's positive (i.e. ‘yes’)
responses to the questions under the group “Core interoperability principles” and then dividing this sum by
the number of questions (8) covered by that group.
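The per-cell calculation is a simple share of 'yes' answers. The answer list below is hypothetical, chosen so that 3 'yes' responses out of 8 questions reproduce the 38% figure (37.5% before rounding):

```python
# Share of positive ('yes') answers to the 8 questions of one
# interoperability group; the answers themselves are hypothetical.
answers = ["yes", "no", "yes", "no", "no", "yes", "no", "no"]

compliance = sum(1 for a in answers if a == "yes") / len(answers)
print(f"{compliance:.0%}")  # 3/8 -> prints "38%"
```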

4.1.1 Core interoperability principles

The services achieved a medium overall result on the core interoperability principles, with a high degree of
compliance on use of open APIs but a low degree of compliance on questions pertaining to
reusability of data and reproducibility of results. This may be due to misinterpretations of questions and to the
use of proprietary tools and formats that are determined by user behaviour and not by the functionality of the
service itself.
In a few cases, the low degree of compliance on reusability and reproducibility of results was due to the
service provider’s awareness of potential user behaviour as a barrier to reusability of data and
reproducibility of results. This could be the case if users could choose to make use of proprietary formats
and tools. Our questions were aimed at identifying the degrees of compliance, and fortunately, the
comments from the service providers were also able to identify user behaviour as a potential root cause for
the low degree of compliance.
This raises questions of whether the individual service providers should discourage or even disable use of
proprietary data and tools. On the other hand, regulating such behaviour could easily be seen as a radical
intervention that reduces usability and flexibility for its users. However, reusability and reproducibility are
core elements of interoperability in general and FAIR in particular. Therefore, identifying, analyzing and
making balanced recommendations on reusability and reproducibility are highly recommended for further work.

4.1.2 Generic user needs and expectations

The services achieved a high degree of compliance with these interoperability principles. This is especially
the case on questions pertaining to the administrative setup of the service. The questions included in this
group covered several interoperability issues concerning user-centricity, security and privacy, inclusion and
accessibility and multilingualism. The highest levels of compliance were measured on user-centricity (83%).
The lowest levels of compliance were measured on questions regarding security and privacy (50%).
Responses to questions regarding inclusion and accessibility achieved a good level of compliance (67%), and
multilingualism also received a good level of compliance (61%).

The services achieved a high degree of compliance with these principles. The questions were mostly
directed towards enabling digital sovereignty for users. All services gather user consent when relevant,
while only half of the services transparently describe issues pertaining to digital sovereignty.

4.1.4 Technical and organizational interoperability

The services achieved a high degree of compliance with these principles. This is reflected in compliance with
questions on formal organizational relationships: questions regarding SLAs resulted in almost full compliance,
while questions regarding the use of APIs resulted in lower compliance.

4.1.5 Semantic interoperability

The services achieved 50% compliance on both questions regarding semantic interoperability. Each service was
either fully compliant or non-compliant with the questions. The type of service seemed to be the deciding
factor: repositories were generally compliant, while the other types of services were not.

4.1.6 Conceptual model

This category contained the highest number of questions. The results were very mixed. Two services were
mostly compliant with the questions while four services were mostly non-compliant. The services were mostly
compliant when the questions were directed towards specific security measures. When questions were
directed at data formats and licensing terms, the services turned out to be mostly non-compliant.
These mixed results may be partially due to differences in service types and due to the inherent difficulties
in distinguishing between interoperability compliance that is determined by the service and interoperability
outcomes that are determined by potential end user behaviour.
As mentioned in the Core interoperability principles chapter, several comments from the service providers
indicated that some of them included user behaviour in their responses, while others did not. The same
difficulties apply to some of the questions covered by the conceptual model.
Our questions were mainly aimed at measuring compliance, not the root causes of compliance or
non-compliance. Therefore, some of our questions may have been seen as ambiguous by the service
providers. We did not specifically ask the service providers to include or exclude user behaviour in their
responses. Therefore, some service providers may fully enable users to be compliant with all questions
regarding reusable data models, data formats and metadata, but still respond as non-compliant because a
single user of their service chose non-compliant behaviour. This raises several
interesting questions:

  1. To what degree should the service provider entice its end users to behave in ways that ensure
    compliance?

  2. Where would a service provider gather recommendations for compliance that would be
    understandable and operational for end users?

  3. Can non-compliant end user behavior even be identified or measured?

  4. Should or could such behaviour be enforced by the service provider or by a neutral governing body?

  5. Would it even be possible to examine user compliance while respecting data integrity and privacy?

These questions should be addressed in any further work concerning end-to-end compliance with the service
interoperability framework. Clarifications on whether end user behaviour should be in scope for service
providers ought to be made before embarking on future surveys on IF compliance.