Document Type

Report

Publication Date

9-2009

Abstract

One of the key aspects in the development of LarKC is how to evaluate the performance of the platform and its constituent components in order to guarantee that the execution of a pipeline will match the user’s needs and provide the desired solutions (answers) to the user’s queries. Therefore, in this deliverable, the first in a series of three documents concerned with the definition of a Framework for Measuring and Evaluating Heuristic Problem Solving, we take the first steps towards defining such a framework. We consider the theoretical foundations and principles of evaluation and measurement theory, discuss several important aspects related to the process of evaluating LarKC and its platform, and report on several dimensions and methods by which the platform and its components can be evaluated. Our primary interest in the context of this deliverable is a formative evaluation framework, designed by the members of the LarKC project, that aims to identify project strengths and weaknesses, focus managerial and research efforts, and measure progress towards achieving the project goals. Such a framework can also serve as the foundation for a subsequent summative evaluation.

Comments

More information about the Large Knowledge Collider can be found at: http://www.larkc.eu/.
