
software-quality-guidelines's Introduction

Software Quality Guidelines

This repository contains guidelines for software quality and sustainability for CLARIAH. It has been composed as part of WP2 task 54.100. Please consult the PDF document in this repository for the latest version. The guidelines are also available in the form of an interactive web survey.

The document came about in response to the need for greater attention to software quality and sustainability in the academic world: software is an essential component of our research, yet good software development practice is often not followed to a satisfactory degree, and sustainability is often problematic. The guidelines are an instrument for both developers and adopters of software to assess its quality and sustainability.

The guidelines are currently at the proposal stage, and we would very much like to receive input. Please see the Request for Comment section at the end of the PDF.

Use our GitHub issue tracker for comments if possible.

software-quality-guidelines's People

Contributors

jauco, proycon, reinierdevalk

Forkers

jauco, rbirkelbach

software-quality-guidelines's Issues

Move indicators to e.g. a Google Form

I think it would be hugely useful to have a Google Form version of the criteria. They are currently formulated as a survey or questionnaire.

Having to print the document and tick the boxes by hand is not a very sustainable model, especially if we want to analyze the results (semi-)automatically.
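
A minimal sketch of what such automatic analysis could look like, assuming a hypothetical CSV export of the form responses in which each column after the timestamp is one indicator and each cell holds "Yes", "No" or "Not applicable"; the file name and column names are assumptions, not part of any actual form.

    # Sketch: tally answers per indicator from a hypothetical CSV export
    # ("responses.csv") of a Google Form version of the criteria.
    import csv
    from collections import Counter

    def tally_responses(path):
        """Return a Counter of answers for every indicator column."""
        tallies = {}
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                for column, answer in row.items():
                    if column == "Timestamp":  # skip the form's metadata column
                        continue
                    tallies.setdefault(column, Counter())[answer.strip()] += 1
        return tallies

    for indicator, counts in tally_responses("responses.csv").items():
        print(indicator, dict(counts))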

Static program analysis not mentioned yet

A commenter observed that the following is missing:

The analysis of computer software known as 'static program analysis' is missing. For each language and paradigm there are a number of tools that analyse code and produce metrics that can be readily used in a quality assessment.
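
To illustrate how such metrics could feed into an assessment, the sketch below runs pylint (chosen purely as an example of a static analyser for Python) and extracts the 0 to 10 rating from its summary output; the target path is hypothetical, and any analyser that emits a numeric metric could be substituted.

    # Sketch: use a static analyser's output as a quality metric.
    # pylint prints a summary of the form "Your code has been rated at X/10",
    # which is matched here with a regular expression.
    import re
    import subprocess

    def pylint_score(target):
        """Run pylint on `target` and return its rating out of 10, or None."""
        result = subprocess.run(
            ["pylint", target],
            capture_output=True,
            text=True,
            check=False,  # pylint exits non-zero whenever it reports issues
        )
        match = re.search(r"rated at ([-\d.]+)/10", result.stdout)
        return float(match.group(1)) if match else None

    print(pylint_score("mypackage"))  # hypothetical package name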

Consumer vs developer perspective

We have to strike a balance between a consumer and a developer perspective. This was worded nicely in the following comment:

The guidelines are envisioned from a consumer point of view. There is nothing wrong with that, but the document lacks an explicit statement that it is envisioned from this viewpoint, and there should be a justification for doing so. After all, CLARIAH is not only a consumer but also a producer of software. Why is the document consumer-centric? The words architecture and design are very sparse in the document: architecture pops up two times, design three times, once in conjunction with architecture. A consumer-centric approach looks at software as 'the thing that is'; a producer-centric approach looks at software as the result of a process in which architecture and design play a prominent role.

Too many criteria?

Some of the feedback we got relates to the large number of criteria. Here are some of the comments we received:

  • The guidelines are very detailed and seem to lean towards an after-the-fact assessment rather than guidelines to be used during the work process. For the latter, we might want to point to the document of the eScience Center (https://nlesc.gitbooks.io/guide/content/software/software_overview.html).
  • For use in assessment it would also be better to have fewer criteria.
  • It is now quite a long list to read through.

I think Jauco's "actionable steps" initiative already mitigates most of these concerns by offering a different perspective on the criteria.

Open vs closed questions for the indicators

All indicators are currently closed questions, each of which is to be answered affirmatively. Some comments we got:

  • The checklist is now yes/no/not available; [...] For the time being the questions are rather closed. The DSA operates with more open questions that ask you to justify your efforts.

From my point of view, the argument for closed questions is one of easier measurability: if all questions have to be answered affirmatively, it is relatively easy to compute a score for each dimension and for the whole. It would also make the guidelines easy to transform into a survey form.
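
A minimal sketch of that scoring idea, assuming illustrative dimension names and answers (only the indicator codes D4, PF1, PF2, PF3 and CH1 appear elsewhere in this document; everything else below is made up):

    # Sketch: with closed yes/no indicators, per-dimension and overall scores
    # are simply fractions of affirmative answers. Example data only.
    answers = {
        "Documentation": {"D1": True, "D2": True, "D3": False, "D4": True},
        "Portability":   {"PF1": True, "PF2": False, "PF3": False},
        "Change":        {"CH1": True},
    }

    def dimension_scores(answers):
        """Fraction of affirmatively answered indicators per dimension."""
        return {dim: sum(vals.values()) / len(vals) for dim, vals in answers.items()}

    def overall_score(answers):
        """Fraction of affirmatively answered indicators over all dimensions."""
        flat = [ok for vals in answers.values() for ok in vals.values()]
        return sum(flat) / len(flat)

    print(dimension_scores(answers))  # {'Documentation': 0.75, 'Portability': 0.33..., 'Change': 1.0}
    print(overall_score(answers))     # 0.625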

Expand descriptions

Various indicators need to be worked out more thoroughly. Some comments we got:

  • A lot of questions could be merged and also made more qualitative, with more descriptions.
  • The following items should, I think, be worked out further and made more concrete. At the moment I would not know how to write software that satisfies them: D4, PF1, PF2, PF3, CH1.

Prioritization of indicators

How do we prioritize the various indicators? This is missing from the current draft, and we rightly received some comments on it; a sketch of how priorities could feed into the scoring follows the comment below:

  • Prioritizing: must have/should have/nice to have.
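
A minimal sketch of how such a prioritization could feed into the scoring. The must have/should have/nice to have labels come from the comment above; the weights, indicator codes and answers are purely illustrative assumptions.

    # Sketch: weight indicators by priority; a failed must-have caps the score at 0.
    WEIGHTS = {"must": 3, "should": 2, "nice": 1}

    indicators = [
        # (code, priority, answered affirmatively?) -- example values only
        ("D1", "must", True),
        ("D4", "should", False),
        ("PF1", "nice", True),
    ]

    def weighted_score(indicators):
        """Weighted fraction of affirmative answers; 0.0 if any must-have fails."""
        if any(prio == "must" and not ok for _, prio, ok in indicators):
            return 0.0
        total = sum(WEIGHTS[prio] for _, prio, _ in indicators)
        achieved = sum(WEIGHTS[prio] for _, prio, ok in indicators if ok)
        return achieved / total

    print(weighted_score(indicators))  # 4/6, roughly 0.67, for the example above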
