A set of Common Software Quality Assurance Baseline Criteria for Research Projects

A DOI-citable version of this manuscript is available at http://hdl.handle.net/10261/160086.

This manuscript (permalink) was automatically generated from indigo-dc/sqa-baseline@f63f2e8 on January 18, 2021 with the use of https://gitlab.com/manubot/rootstock/.

Authors

Abstract

The purpose of this document is to define a set of quality standards, procedures and best practices that make up a Software Quality Assurance (SQA) plan, serving as a reference within the European research ecosystem and its related projects for the adequate development and timely delivery of software products.

Copyright © Members of the INDIGO-DataCloud, DEEP Hybrid-DataCloud, eXtreme DataCloud and EOSC-Synergy collaborations, 2015-2020.

Acknowledgements

The INDIGO-DataCloud, DEEP-Hybrid-DataCloud, eXtreme-DataCloud and EOSC-Synergy projects have received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement numbers 653549, 777435, 777367 and 857647, respectively.

Document Log

Issue   Date         Comment
v1.0    31/01/2018   First draft version
v2.0    05/02/2018   Updated criteria
v3.0    20/12/2019   Code management section, metadata for software
v3.1    05/03/2020   Add tags/names for each criterion
v3.2    23/04/2020   Add EOSC-Synergy to copyright
v3.3    15/10/2020   Fix issues: #32, #46, #47, #48, #49, #51

Introduction and Purpose

This document builds upon the recommendations and requirements found in the Initial Plan for Software Management and Pilot Services deliverable [1], produced by the INDIGO-DataCloud project. These guidelines evolved throughout the project’s lifetime and have been extended in the subsequent EOSC-Synergy, DEEP-Hybrid-DataCloud and eXtreme DataCloud projects. The result is a consolidated Software Quality Assurance (SQA) baseline criteria, emanating from the European Open Science Cloud (EOSC), that outlines the SQA principles to be considered in upcoming software development efforts within the European research community and that will continuously evolve to stay aligned with future software engineering practices and security recommendations.

Goals

  1. Set the base of minimum SQA criteria that any software developed within an EOSC project MUST fulfill.
  2. Enhance the visibility, accessibility and distribution of the produced source code through alignment with the Open Source Definition [2].
  3. Promote code style standards to deliver good quality source code emphasizing its readability and reusability.
  4. Improve the quality and reliability of software by covering different testing methods at development and pre-production stages.
  5. Propose a change-driven scenario where every new update to the source code is continuously validated by the automated execution of the relevant tests.
  6. Adopt an agile approach to effectively produce timely and audience-specific documentation.
  7. Lower the barriers to software adoption by delivering quality documentation and by using automated deployment solutions.
  8. Encourage secure coding practices and security static analysis at the development phase while providing recommendations on external security assessment.

Notational Conventions

The keywords “MUST”, “MUST NOT”, “REQUIRED”, “SHALL”, “SHALL NOT”, “SHOULD”, “SHOULD NOT”, “RECOMMENDED”, “MAY”, and “OPTIONAL” in this document are to be interpreted as described in RFC 2119 [3].

Quality Criteria

The following sections describe the quality conventions and best practices that apply to the development phase of a software component within the EOSC ecosystem. These guidelines governed the software development process of the former European Commission-funded project INDIGO-DataCloud, where they proved valuable for improving the reliability of software produced in the European scientific arena.

The next sections describe a development process driven by a change-based strategy and followed by a continuous integration approach. Changes in the source code trigger automated builds that analyse the new contributions in order to validate them before they are added to the software component’s code base. As a result, software components become better candidates for deployment in production infrastructures, reducing the likelihood of service disruption.
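Such a change-based validation gate can be sketched as follows. This is a minimal illustration, not a prescribed implementation: the check functions are hypothetical stand-ins for the real automated jobs (style analysis, test execution) that a continuous integration system would run on each change.

```python
# Sketch of a change-based validation gate: a proposed change is merged
# into the code base only if every automated check passes. The checks
# below are illustrative stand-ins for real CI jobs.

def run_style_check(change):
    # Hypothetical style analysis over the change's recorded flags.
    return "style_violations" not in change["flags"]

def run_unit_tests(change):
    # Hypothetical unit test execution for the change.
    return "unit_failures" not in change["flags"]

CHECKS = [run_style_check, run_unit_tests]

def validate_change(change):
    """Return True only if the change passes every configured check."""
    return all(check(change) for check in CHECKS)

print(validate_change({"id": 42, "flags": []}))                 # True
print(validate_change({"id": 43, "flags": ["unit_failures"]}))  # False
```

In a real pipeline the list of checks would grow with the criteria below (unit, integration and functional tests, security scans), while the gating logic stays the same: all checks must pass before the change reaches the code base.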

Code Accessibility [QC.Acc]

Licensing [QC.Lic]

Code Workflow [QC.Wor]

A change-based approach is accomplished with a branching model.

Code Management [QC.Man]

Code Style [QC.Sty]

Code style requirements pursue the proper maintenance of the source code through the common agreement of a series of style conventions, which vary depending on the programming language being used.
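As an illustration, for Python code the PEP 8 conventions are widely adopted, and automated linters enforce them on every change. The toy check below (function names in snake_case) is only a sketch of what such style tooling automates:

```python
import re

# PEP 8-style naming convention for Python functions: lowercase words
# separated by underscores (snake_case).
SNAKE_CASE = re.compile(r"^[a-z_][a-z0-9_]*$")

def check_function_names(names):
    """Return the function names that violate the snake_case convention."""
    return [name for name in names if not SNAKE_CASE.match(name)]

violations = check_function_names(["load_data", "ParseInput", "run"])
print(violations)  # ['ParseInput']
```

In practice this kind of check is delegated to established tools for the language at hand rather than written by hand.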

Code metadata [QC.Met]

Metadata for the software component provides a way to achieve its full identification, thus making software citation viable [6]. It allows the assignment of a Digital Object Identifier (DOI) and is key to the preservation, discovery, reuse, and attribution of the software component.
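One established way to record such metadata is a CodeMeta descriptor (a codemeta.json file) kept at the repository root. The sketch below builds a minimal record; all field values are illustrative placeholders, not real identifiers:

```python
import json

# Minimal CodeMeta-style metadata record; every value here is an
# illustrative placeholder for the component's real details.
metadata = {
    "@context": "https://doi.org/10.5063/schema/codemeta-2.0",
    "@type": "SoftwareSourceCode",
    "name": "example-component",
    "version": "1.0.0",
    "license": "https://spdx.org/licenses/Apache-2.0",
    "codeRepository": "https://example.org/example-component",
    "identifier": "https://doi.org/10.0000/example",  # DOI placeholder
}

print(json.dumps(metadata, indent=2))
```

Keeping such a file under version control alongside the code means the identification data evolves together with each release.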

Unit Testing [QC.Uni]

Unit testing evaluates all the possible flows in the internal design of the code, so that its behaviour becomes apparent. It is a key type of testing for early detection of failures in the development cycle.
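A minimal example using Python's built-in unittest framework, exercising the flows of a small hypothetical function:

```python
import unittest

def celsius_to_fahrenheit(celsius):
    """Example unit under test: convert Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

class TestCelsiusToFahrenheit(unittest.TestCase):
    def test_freezing_point(self):
        self.assertEqual(celsius_to_fahrenheit(0), 32)

    def test_boiling_point(self):
        self.assertEqual(celsius_to_fahrenheit(100), 212)

    def test_scales_cross(self):
        # -40 is the point where both scales coincide.
        self.assertEqual(celsius_to_fahrenheit(-40), -40)

# Run the suite programmatically so the example is self-contained.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestCelsiusToFahrenheit)
result = unittest.TextTestRunner().run(suite)
```

Suites like this are the natural payload for the automated builds described above: a change is only validated once every unit test passes.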

Integration Testing [QC.Int]

Integration testing refers to the evaluation of the interactions among coupled software components or parts of a system that cooperate to achieve a given functionality.
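A sketch of what an integration test exercises, with two invented toy components (a record parser and an in-memory store) checked working together rather than in isolation:

```python
# Integration test sketch: two small components (a parser and an
# in-memory store) are exercised together through the glue code that
# connects them.

def parse_record(line):
    """Parse 'key=value' text into a (key, value) pair."""
    key, _, value = line.partition("=")
    return key.strip(), value.strip()

class MemoryStore:
    """Minimal key-value store used as the second component."""

    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value

    def get(self, key):
        return self._data[key]

def ingest(lines, store):
    """Glue code under test: feed parsed records into the store."""
    for line in lines:
        key, value = parse_record(line)
        store.put(key, value)

store = MemoryStore()
ingest(["name = example-component", "version = 1.0"], store)
assert store.get("version") == "1.0"  # the two components cooperate correctly
```

A unit test would cover parse_record and MemoryStore separately; the integration test above covers the interaction between them.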

Functional Testing [QC.Fun]

Functional testing involves the verification of the software component’s identified functionality, based on requested requirements and agreed design specifications. This type of software testing focuses on evaluating the functionality that the software component exposes, leaving aside any internal design analysis or side effects on external systems.
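The black-box nature of functional testing can be sketched as follows; the function and its requirement are invented for illustration, and only the agreed behaviour of the exposed interface is checked:

```python
# Functional (black-box) test sketch: checks are derived from the
# agreed requirement only; the internal design is never inspected.

def deduplicate_tags(tags):
    """Public interface: return the input tags sorted, without duplicates."""
    return sorted(set(tags))

# Requirement: "the component returns a sorted list of unique tags".
assert deduplicate_tags(["b", "a", "b"]) == ["a", "b"]
assert deduplicate_tags([]) == []
```

Because the assertions depend only on the specification, the implementation can be refactored freely without changing the functional tests.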

Test-Driven Development (TDD)

Test-Driven Development [7] is a software development process that relies on converting software requirements into test cases before the software is fully developed, and on tracking all development by repeatedly testing the software against those test cases. This is in contrast to software being developed first and test cases created later.
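The cycle can be sketched in miniature: the test is written first from an invented requirement ("slugify a title for use in a URL"), and only then is the minimal code added to make it pass:

```python
# TDD sketch: the test below is written first, from the requirement;
# at that point it fails (red) because slugify does not exist yet.

def test_slugify():
    assert slugify("Hello World") == "hello-world"
    assert slugify("  Trim Me  ") == "trim-me"

# Minimal implementation written afterwards to satisfy the test (green):
def slugify(title):
    return "-".join(title.split()).lower()

test_slugify()  # now passes: red -> green
```

Further behaviour (e.g. stripping punctuation) would be added the same way: extend the test first, watch it fail, then implement.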

Documentation [QC.Doc]

Security [QC.Sec]

Code Review [QC.Rev]

Code review implies the informal, non-automated, human-based peer revision of any change in the source code [12]. It is the last step in the change management pipeline, applied once the candidate change has successfully passed the required set of change-based tests.

Automated Deployment [QC.Aud]

Glossary

API
Application Programming Interface
CLI
Command Line Interface
DAST
Dynamic Application Security Testing
EOSC
European Open Science Cloud
IAST
Interactive Application Security Testing
OWASP
Open Web Application Security Project
SAST
Static Application Security Testing
SCM
Software Configuration Management
SQA
Software Quality Assurance
TDD
Test-Driven Development
VCS
Version Control System

References

1. INDIGO-DataCloud collaboration, Initial Plan for Software Management and Pilot Services
Members of the INDIGO-DataCloud collaboration
(2015) https://owncloud.indigo-datacloud.eu/index.php/s/yDklCrWjKnjutVA

2. The Open Source Definition
Open Source Initiative
https://opensource.org/osd

3. Key words for use in RFCs to Indicate Requirement Levels
S. Bradner
(1997) https://www.rfc-editor.org/info/rfc2119

4. Licenses & Standards
Open Source Initiative
https://opensource.org/licenses

5. Semantic Versioning 2.0.0
Tom Preston-Werner
https://semver.org

6. Software citation principles
Arfon M. Smith, Daniel S. Katz, Kyle E. Niemeyer, FORCE11 Software Citation Working Group
PeerJ Computer Science (2016-09-19) https://doi.org/bw3g
DOI: 10.7717/peerj-cs.86

7. Test-driven development: by example
Kent Beck
Addison-Wesley (2003)
ISBN: 9780321146533

8. OWASP Secure Coding Practices - Quick Reference Guide
The OWASP Foundation
https://owasp.org/www-project-secure-coding-practices-quick-reference-guide/migrated_content.html

9. Source Code Analysis Tools
The OWASP Foundation
https://owasp.org/www-community/Source_Code_Analysis_Tools

10. Dynamic Application Security Testing
The OWASP Foundation
https://www.owasp.org/index.php/Category:Vulnerability_Scanning_Tools

11. What is IAST? Interactive Application Security Testing
Veracode
https://www.veracode.com/security/interactive-application-security-testing-iast

12. OWASP Code Review Guide
The OWASP Foundation
https://owasp.org/www-project-code-review-guide/migrated_content.html