
Author (dc.contributor.author): Sandoval Alcocer, Juan Pablo
Author (dc.contributor.author): Bergel, Alexandre
Author (dc.contributor.author): Valente, Marco Tulio
Admission date (dc.date.accessioned): 2020-04-22T15:42:00Z
Available date (dc.date.available): 2020-04-22T15:42:00Z
Publication date (dc.date.issued): 2020
Item citation (dc.identifier.citation): Science of Computer Programming 191 (2020) 102415
Identifier (dc.identifier.other): 10.1016/j.scico.2020.102415
Identifier (dc.identifier.uri): https://repositorio.uchile.cl/handle/2250/174011
Abstract (dc.description.abstract): Context: Software performance may suffer regressions caused by source code changes. Measuring performance at each new software version is useful for early detection of performance regressions. However, systematically running benchmarks is often impractical (e.g., long execution times, prioritizing functional correctness over non-functional properties). Objective: In this article, we propose Horizontal Profiling, a sampling technique to predict when a new revision may cause a regression by analyzing the source code and using run-time information from a previous version. The goal of Horizontal Profiling is to reduce the performance testing overhead by benchmarking only software versions that contain costly source code changes. Method: We present an evaluation in which we apply Horizontal Profiling to identify performance regressions in 17 software projects written in the Pharo programming language, totaling 1,288 software versions. Results: Horizontal Profiling detects more than 80% of the regressions by benchmarking less than 20% of the versions. In addition, our experiments show that, on our benchmarks, Horizontal Profiling has better precision and executes the benchmarks on fewer versions than state-of-the-art tools. Conclusions: We conclude that by adequately characterizing the run-time information of a previous version, it is possible to determine whether a new version is likely to introduce a performance regression. As a consequence, a significant fraction of the performance regressions can be identified by benchmarking only a small fraction of the software versions.
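The prioritization idea the abstract describes can be sketched in a few lines. This is a minimal, hypothetical illustration of the general approach (benchmark a new version only when its changes touch code that was costly in a profile of an earlier version); all function names, the cost metric, and the threshold are assumptions for illustration, not the authors' actual Horizontal Profiling implementation.

```python
# Hypothetical sketch: decide whether to benchmark a new version by
# summing the profiled cost of the methods it modifies, using run-time
# data collected on a previous version. Names and thresholds are
# illustrative only.

def estimate_change_cost(changed_methods, previous_profile):
    """Sum the profiled cost (here, call counts) of the methods a new
    version modifies, using the profile of an earlier version."""
    return sum(previous_profile.get(m, 0) for m in changed_methods)

def should_benchmark(changed_methods, previous_profile, threshold):
    """Run the benchmarks only for versions whose changes look costly."""
    return estimate_change_cost(changed_methods, previous_profile) >= threshold

# Example: a profile of version N mapping method names to call counts,
# and the methods modified by two later versions.
profile_v1 = {"Parser>>parse:": 120_000, "Logger>>log:": 40}
changes_v2 = ["Parser>>parse:"]   # touches a hot method -> benchmark
changes_v3 = ["Logger>>log:"]     # touches a cold method -> skip

print(should_benchmark(changes_v2, profile_v1, threshold=1_000))  # True
print(should_benchmark(changes_v3, profile_v1, threshold=1_000))  # False
```

Under this kind of scheme, only versions whose changed methods exceed the cost threshold trigger a benchmark run, which is how a large share of regressions can be caught while benchmarking a small fraction of versions.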
Sponsor (dc.description.sponsorship): Lam Research 4800054170, 4800043946; STIC-AmSud project 14STIC-02; Comision Nacional de Investigacion Cientifica y Tecnologica (CONICYT), FONDECYT 1200067; European Smalltalk User Group
Language (dc.language.iso): en
Publisher (dc.publisher): Elsevier
Type of license (dc.rights): Attribution-NonCommercial-NoDerivs 3.0 Chile
Link to license (dc.rights.uri): http://creativecommons.org/licenses/by-nc-nd/3.0/cl/
Source (dc.source): Science of Computer Programming
Keywords (dc.subject): Performance regression
Keywords (dc.subject): Software performance
Keywords (dc.subject): Software evolution
Keywords (dc.subject): Performance regression prediction
Keywords (dc.subject): Regression benchmarking
Title (dc.title): Prioritizing versions for performance regression testing: the Pharo case
Document type (dc.type): Journal article
Access rights (dcterms.accessRights): Open Access
Cataloguer (uchile.catalogador): crb
Indexation (uchile.index): Article in ISI publication
Indexation (uchile.index): Article in SCOPUS publication

