River Publishers Series in Software Engineering

Engineering Agile Big-Data Systems

Editors:
Kevin Feeney, Trinity College Dublin, Ireland
Jim Davies, Oxford University, United Kingdom
James Welch, Oxford University, United Kingdom
Sebastian Hellmann, University of Leipzig, Germany
Christian Dirschl, Wolters Kluwer, Germany
Andreas Koller, Semantic Web Company, Austria
Pieter Francois, Oxford University, United Kingdom
Arkadiusz Marciniak, Adam Mickiewicz University, Poland

Authors:
University of Oxford, UK; Trinity College Dublin, Ireland


To be effective, data-intensive systems require extensive ongoing customisation to reflect changing user requirements, organisational policies, and the structure and interpretation of the data they hold. Manual customisation is expensive, time-consuming, and error-prone. In large, complex systems, the value of the data can be such that exhaustive testing is necessary before any new feature can be added to the existing design. In most cases, the precise details of requirements, policies, and data will change during the lifetime of the system, forcing a choice between expensive modification and continued operation with an inefficient design.

Engineering Agile Big-Data Systems outlines an approach to dealing with these problems in software and data engineering, describing a methodology for aligning these processes throughout product lifecycles. It discusses tools which can be used to achieve these goals, and, in a number of case studies, shows how the tools and methodology have been used to improve a variety of academic and business systems.
Keywords: software engineering, data engineering, big data, Semantic Web, software engineering methodology, RDF, OWL

Chapter 3: Methodology
by James Welch, Jim Davies, Kevin Feeney, Pieter Francois, Jeremy Gibbons and Seyyed Shah
