MTAT Software Engineering


MTAT.03.094 Software Engineering Lecture 14: Measurement Dietmar Pfahl Fall 2015 email: dietmar.pfahl@ut.ee

Schedule of Lectures Week 01: Introduction to SE Week 02: Requirements Engineering I Week 03: Requirements Engineering II Week 04: Analysis Week 05: Development Infrastructure I Week 06: Development Infrastructure II Week 07: Architecture and Design Week 08: No lecture Week 09: Refactoring Week 10: Verification and Validation I Week 11: Industry Lecture (Testing) Week 12: Verification and Validation II Week 13: Agile/Lean Methods Week 14: Industry Lecture (Agile) Week 15: Measurement / Course wrap-up, review and exam preparation Week 16: no lecture

Structure of Lecture 14 Measurement Measurement Basics Example Measures

Some Quotes Peter Drucker: "If you can't measure it, you can't manage it." McKinsey: "If you can measure it, you can manage it."

BUT: software measures can be misleading. So either you don't use them, or you had better know what they mean and how to use them.

Definitions: Measurement and Measure, Scale & Unit
Measurement: Measurement is the process through which values (e.g., numbers) are assigned to attributes of entities of the real world.
Measure: A measure is the result of the measurement process, i.e., the assignment of a value to an entity with the goal of characterizing a specified attribute.
Source: Sandro Morasca, "Software Measurement", in Handbook of Software Engineering and Knowledge Engineering, Volume 1: Fundamentals, pp. 239-276, Knowledge Systems Institute, Skokie, IL, USA, 2001, ISBN 981-02-4973-X. See also: http://onlinestatbook.com/2/introduction/levels_of_measurement.html
[Figure: programs a-e (Entity: Program, Attribute: Size) mapped to values 0-4 on a size measure scale, LOC (lines of code)]

Software Measurement Challenge
Measuring physical properties (attributes):
entity: Human, attribute: Height, unit*: cm, scale (type): ratio, value: 178, range*: (1, 300)
entity: Human, attribute: Temperature, unit*: °C, scale (type): interval, value: 37, range*: (30, 45)
Measuring non-physical properties (attributes):
entity: Human, attribute: Intelligence/IQ, unit*: index, scale (type): ordinal, value: 135, range*: [0, 200]
entity: Program, attribute: Modifiability, unit*: ?, scale (type): ?, value: ?, range*: ?
Software properties are usually non-physical: size, complexity, functionality, reliability, maturity, portability, flexibility, understandability, maintainability, correctness, testability, coupling, cohesion, interoperability, ...
(*) unit and range are sometimes used synonymously with scale

Measurable Entities in a SW Project
An entity can represent any of the following:
Process/Activity: any activity (or set of activities) related to software development and/or maintenance (e.g., requirements analysis, design, testing); these can be defined at different levels of granularity.
Product/Artifact: any artifact produced or changed during software development and/or maintenance (e.g., source code, software design documents).
Resources: people, time, money, hardware or software needed to perform the processes.
[Diagram: Product in -> Activity -> Product out, with Resources attached to the Activity in two forms: role (people) and tool]

Structure of Lecture 14 Measurement Measurement Basics Example Measures

Examples of Software Product Attributes
Size: Length, Churn, Complexity, Functionality
Modularity: Cohesion, Coupling
Quality, Value (Price), ...
Quality (-> ISO 9126): Functionality, Reliability, Usability, Efficiency, Maintainability, Portability (each internal vs. external)

Lines Of Code (LOC): Product Size
[Example: the same program counted as 12, 14, or 18 LOC, depending on the counting rules]

Lines Of Code

Lines Of Code: Summary
Accurate and easy to measure. But how to interpret empty lines, comments, and several statements on one line? Language dependent, and doesn't take complexity into account. Useful?
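The counting ambiguities above can be made concrete in a small sketch (a hypothetical helper, not part of the lecture material): two counting rules give two different sizes for the same fragment.

```java
import java.util.List;

// Sketch: how LOC depends on the counting rule. The class and method
// names are invented for illustration.
public class LocCounter {

    // Physical lines: every line, including blanks and comments.
    public static int physicalLines(List<String> lines) {
        return lines.size();
    }

    // Source lines: skip blank lines and whole-line "//" comments
    // (block comments are ignored here to keep the sketch short).
    public static int sourceLines(List<String> lines) {
        int count = 0;
        for (String line : lines) {
            String t = line.trim();
            if (!t.isEmpty() && !t.startsWith("//")) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        List<String> program = List.of(
            "int x = 0;",
            "",
            "// increment",
            "x++;"
        );
        System.out.println(physicalLines(program)); // 4
        System.out.println(sourceLines(program));   // 2
    }
}
```

The same four-line fragment is size 4 or size 2, which is exactly why a LOC figure is meaningless without its counting rule.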

McCabe's Cyclomatic Complexity
Complexity of a program: the number of linearly independent paths through a function, usually calculated using the control flow graph.
MC = e - n + 2p, where e = number of edges, n = number of vertices, p = number of connected components of the control flow graph.
MC = d + 1, where d = number of decision (branching) points. This second form works only for single-component analysis (not several connected components; i.e., p = 1 above).
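The MC = d + 1 form can be illustrated with a rough sketch that simply counts branching keywords and short-circuit operators in the source text. This is a hypothetical approximation, not the lecture's tool: a real analyzer builds the control flow graph and would not be fooled by keywords inside strings or comments.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: approximate MC = d + 1 by counting decision points as
// branching keywords (if, for, while, case, catch) plus the
// short-circuit operators && and ||.
public class McCabeSketch {

    private static final Pattern DECISION =
        Pattern.compile("\\b(if|for|while|case|catch)\\b|&&|\\|\\|");

    public static int complexity(String source) {
        Matcher m = DECISION.matcher(source);
        int d = 0; // number of decision points found
        while (m.find()) {
            d++;
        }
        return d + 1; // MC = d + 1
    }

    public static void main(String[] args) {
        // The for + if snippet from the slides: d = 2, so MC = 3.
        String snippet =
            "for (Client c : clients) { ... }\n" +
            "if (clients.size() == 0) { ... }\n";
        System.out.println(complexity(snippet)); // 3
    }
}
```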

McCabe's Cyclomatic Complexity

McCabe's Cyclomatic Complexity

System.out.println("----------");
for (Client c : clients)
    System.out.println(c.getId() + " " + c.getFirstName());
if (clients.size() == 0)
    System.out.println("\tNothing");
System.out.println("----------");

e = 7, n = 6, p = 1, so MC = e - n + 2p = 7 - 6 + 2 = 3

Cyclomatic Complexity: Summary
Automated (available in any modern IDE). Related to testing notions: MC is an upper bound on the number of tests needed for branch coverage (each control structure evaluated to both true and false), and a lower bound on the number of tests needed for path coverage (all linearly independent paths executed). Related to maintainability and defects: for MC > 10, the probability of defects rises. But to be used with care: N. Nagappan, T. Ball, A. Zeller, "Mining Metrics to Predict Component Failures", ICSE 2006.

Exercise Calculate McCabe's cyclomatic complexity of the following code snippet: http://tinyurl.com/3gzh28j

private void drawSelectClientDialog() {
    List<Client> allClients = domainController.loadAllClients();
    List<String> clients = new ArrayList<String>();
    for (Client client : allClients) {
        clients.add(client.getId() + ". " + client.getFirstName());
    }
    String selectedClient = (String) JOptionPane.showInputDialog(
        this,
        Translations.getString("main.chooseCustomer"),
        Translations.getString("main.chooseCustomer"),
        JOptionPane.OK_CANCEL_OPTION,
        null,
        clients.toArray(),
        0);
    Client currentClient = null;
    try {
        if (selectedClient != null) {
            currentClient = domainController.getClientById(
                Long.parseLong(selectedClient.split(" ")[0]
                    .replaceAll("\\.", "")));
        }
    } catch (NumberFormatException e) {
        log.error("failed to parse client id,"
            + " probably no client was selected");
    }
    if (currentClient != null) {
        log.info("client " + currentClient.getFirstName()
            + " with ID=" + currentClient.getId() + " got selected.");
    } else {
        log.info("no client selected");
    }
    model.setCurrentClient(currentClient);
}

MC = ???

private void drawSelectClientDialog() {
    List<Client> allClients = domainController.loadAllClients();
    List<String> clients = new ArrayList<String>();
    for (Client client : allClients) {
        clients.add(client.getId() + ". " + client.getFirstName());
    }
    String selectedClient = (String) JOptionPane.showInputDialog(
        this,
        Translations.getString("main.chooseCustomer"),
        Translations.getString("main.chooseCustomer"),
        JOptionPane.OK_CANCEL_OPTION,
        null,
        clients.toArray(),
        0);
    Client currentClient = null;
    try {
        if (selectedClient != null) {
            currentClient = domainController.getClientById(
                Long.parseLong(selectedClient.split(" ")[0]
                    .replaceAll("\\.", "")));
        }
    } catch (NumberFormatException e) {
        log.error("failed to parse client id,"
            + " probably no client was selected");
    }
    if (currentClient != null) {
        log.info("client " + currentClient.getFirstName()
            + " with ID=" + currentClient.getId() + " got selected.");
    } else {
        log.info("no client selected");
    }
    model.setCurrentClient(currentClient);
}

MC = d + 1 = 4 + 1 = 5 (decision points: for, if, catch, if)

Common OO Code Measures (Measure: Desirable Value)
Coupling: Lower
Cohesion: Higher
Cyclomatic Complexity: Lower
Method Hiding Factor: Higher
Attribute Hiding Factor: Higher
Depth of Inheritance Tree: Low (tradeoff)
Number of Children: Low (tradeoff)
Weighted Methods Per Class: Low (tradeoff)
Number of Classes: Higher (with identical functionality)
Lines of Code (net and total; comment): Lower (with identical functionality)
Churn (new + changed LoC): Lower (with identical functionality)

Coupling & Cohesion
Coupling between object classes (CBO): number of classes referenced by a given class (FanOut).
Lack of cohesion in methods (LCOM): number of method pairs that do not share instance variables minus number of method pairs that share at least one instance variable. By convention, LCOM := 0 if the above definition gives a negative number.
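The LCOM definition above can be sketched as follows, assuming each method has already been reduced to the set of instance variables it accesses. The class, method, and field names below are invented for illustration.

```java
import java.util.List;
import java.util.Set;

// Sketch: LCOM = (pairs sharing no instance variable) - (pairs sharing
// at least one), floored at 0, computed from per-method field-use sets.
public class LcomSketch {

    public static int lcom(List<Set<String>> methodFieldUse) {
        int share = 0;   // pairs sharing at least one instance variable
        int noShare = 0; // pairs sharing none
        for (int i = 0; i < methodFieldUse.size(); i++) {
            for (int j = i + 1; j < methodFieldUse.size(); j++) {
                boolean sharesField = methodFieldUse.get(i).stream()
                    .anyMatch(methodFieldUse.get(j)::contains);
                if (sharesField) share++; else noShare++;
            }
        }
        // LCOM := 0 if the difference is negative, by convention.
        return Math.max(noShare - share, 0);
    }

    public static void main(String[] args) {
        // Three methods and the instance variables each one touches:
        List<Set<String>> methods = List.of(
            Set.of("a", "b"),  // method 1
            Set.of("b", "c"),  // method 2 (shares "b" with method 1)
            Set.of("d")        // method 3 (shares nothing)
        );
        System.out.println(lcom(methods)); // 2 - 1 = 1
    }
}
```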

Coupling Example... Traveller: CBO =?

Coupling Example... CBO = 1 (one other class referenced)

(Lack of) Cohesion Example
Number of method pairs that do not share instance variables minus number of method pairs that share at least one instance variable:
LCOM = 1 - 2 = -1, so LCOM := 0 (negative results are set to 0 by convention)

(Lack of) Cohesion Example

public class PersonDetails {
    private String _firstname;
    private String _surname;
    private String _street;
    private String _city;

    public PersonDetails() {}

    public void setName(String f, String s) {
        _firstname = f;
        _surname = s;
    }

    public void setAddress(String st, String c) {
        _street = st;
        _city = c;
    }

    public void printAddress() {
        System.out.println(_street);
        System.out.println(_city);
    }

    public void printName() {
        System.out.println(_firstname + " " + _surname);
    }
}

LCOM = ?

(Lack of) Cohesion Example
4 methods give 6 method pairs; 2 pairs share at least one instance variable, so 6 - 2 = 4 pairs do not.
LCOM = 4 - 2 = 2

Simple Quality Measures (Examples)
Correctness: Entity: Document (e.g., Code); Attribute: Quality (Correctness); Unit: Defect (found during QA activity); Range: [0, ∞); Scale type: ratio; Characterisation: Direct, Quantitative, Objective/Subjective???
Defect Density: Entity: Document (e.g., Code); Attribute: Quality (Defect Density); Unit: Defect/LOC; Range: [0, ∞); Scale type: ratio; Characterisation: Indirect, Quantitative, Objective/Subjective???
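As a sketch of how the indirect measure is derived from two direct measures (defect count and size in LOC), normalized to defects per thousand lines; the numbers are invented:

```java
// Sketch: defect density as an indirect measure computed from two
// direct measures. Not from the lecture; values are made up.
public class DefectDensity {

    // Defects per KLOC (thousand lines of code).
    public static double perKloc(int defects, int loc) {
        return defects * 1000.0 / loc;
    }

    public static void main(String[] args) {
        // 12 defects found in 4000 LOC -> 3 defects per KLOC
        System.out.println(perKloc(12, 4000)); // 3.0
    }
}
```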

Example Performance Requirements

Example Performance Requirements How to test: - Define standard work load - Expose system to standard work load for a defined period of time - Measure CPU usage Q: Should we do this for different kinds of CPUs?

Example Usability Requirements

Example Usability Requirements How to test: - Define several (typical) usage scenarios involving tasks Q and R - Select test users and classify them as novice and experienced - Let 5 (or better 10 or 15) novices perform the scenarios - Observe what problems they encounter - Classify and count the observed problems

Examples of Software Process and Resource Attributes that can be measured Process-related: Efficiency: How fast (time, duration), how much effort (effort, cost), how much quantity/quality per time or effort unit (velocity, productivity)? Effectiveness: Do we get the results (quantity/quality) we want? e.g., test coverage Capability: CMMI level Resource-related: People: Skill, knowledge, experience, learning, motivation, personality Organisation: Maturity Method/Technique/Tool: Effectiveness, efficiency, learnability, cost

Time versus Effort
Time: Entity: Some Activity (e.g., Test); Attribute: Time (or Duration); Unit: Year, Month, Week, (Work) Day, Hour, Minute, Second, ...; Range: [0, ∞); Scale type: ratio; Characterisation: Direct, Quantitative, Objective/Subjective???
Effort: Entity: Some Activity (e.g., Test); Attribute: Effort; Unit: Person-Year, Person-Month, Person-Day, Person-Hour, ...; Range: [0, ∞); Scale type: ratio; Characterisation: Direct, Quantitative, Objective/Subjective???

Time versus Effort (cont'd) What does it mean when I say: "This task (e.g., testing) takes 4 days" vs. "This task (e.g., testing) needs 4 person-days"?
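The difference can be sketched numerically: effort in person-days only translates into calendar days once the number of people is fixed, and the naive division below ignores communication overhead, so it is an optimistic lower bound (assumed values):

```java
// Sketch: duration (calendar time) vs. effort (person-time).
// "4 person-days" is 4 calendar days for one person, but only
// 2 calendar days for two people working fully in parallel.
public class TimeVsEffort {

    public static double durationDays(double effortPersonDays, int people) {
        return effortPersonDays / people;
    }

    public static void main(String[] args) {
        System.out.println(durationDays(4.0, 1)); // 4.0 calendar days
        System.out.println(durationDays(4.0, 2)); // 2.0 calendar days
    }
}
```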

Agile Measurement: Burn-Down & Burn-Up Both can be used to calculate (average) team velocity = story points (or: stories) per team per sprint.
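A minimal sketch of the velocity calculation, averaging the story points completed per sprint as read off a burn-up chart; the sprint totals below are invented:

```java
// Sketch: average team velocity = completed story points per sprint.
public class Velocity {

    public static double average(int[] storyPointsPerSprint) {
        int sum = 0;
        for (int sp : storyPointsPerSprint) {
            sum += sp;
        }
        return (double) sum / storyPointsPerSprint.length;
    }

    public static void main(String[] args) {
        int[] sprints = {21, 18, 24}; // story points completed per sprint
        System.out.println(average(sprints)); // 21.0
    }
}
```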

What Next? For you to do: Finish and submit Lab Task 7 on time! Deadline: 1 day earlier than usual! Next week: Lab Task 7 Assessment. Exam: 08-Jan-2016: 87 (of 90) seats taken; 15-Jan-2016: 25 (of 90) seats taken.