Page 1 of results: 20 digital items found in 0.003 seconds

## Criterios restritos de teste de software : uma contribuição para gerar dados de teste mais eficazes

Silvia Regina Vergilio
Source: Biblioteca Digital da Unicamp Publisher: Biblioteca Digital da Unicamp
Type: Doctoral Thesis Format: application/pdf
Search Relevance: 26.1%

## Criterios potenciais usos de integração : definição e analise

Plinio Roberto Souza Vilela
Source: Biblioteca Digital da Unicamp Publisher: Biblioteca Digital da Unicamp
Type: Doctoral Thesis Format: application/pdf
Search Relevance: 26.1%

## Criterios potenciais usos : uma contribuição ao teste estrutural de Software

Source: Biblioteca Digital da Unicamp Publisher: Biblioteca Digital da Unicamp
Type: Doctoral Thesis Format: application/pdf
Search Relevance: 26.36%

## Uma abordagem evolutiva multiobjetivo para geração automática de casos de teste a partir de máquinas de estados; A multi-objective evolutionary approach for automatic generation of test cases from state machines

Thaise Yano
Source: Biblioteca Digital da Unicamp Publisher: Biblioteca Digital da Unicamp
Type: Doctoral Thesis Format: application/pdf
Search Relevance: 16.34%
Automatic test case generation improves productivity and reduces effort and cost in the software development process. This work proposes an approach, called MOST (Multi-Objective Search-based Testing approach from EFSM), to generate test cases from an Extended Finite State Machine (EFSM) by applying an optimization technique. In EFSM-based testing, an input sequence must be found that exercises a path in the model in order to cover a test criterion (e.g., all transitions). Since these sequences can have different lengths, the algorithm M-GEOvsl (Multi-Objective Generalized Extremal Optimization with variable string length) was developed to generate solutions of varying length. Moreover, being a multi-objective algorithm, M-GEOvsl also allows more than one criterion to be used to evaluate solutions. By applying this algorithm in MOST, both coverage of the target transition and sequence length are taken into account when generating test cases. Information about the model's dependencies is used to guide the search. The algorithm generates the input sequences...
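MOST scores each candidate input sequence on more than one objective at once (e.g., covering the target transition and keeping the sequence short). A minimal Pareto-dominance helper (our illustration in Python, not code from the thesis) shows how such multi-objective solutions are compared:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the solutions not dominated by any other solution."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]
```

With objective vectors such as (uncovered-target distance, sequence length), variable-length sequences can be compared without collapsing the two criteria into a single weighted score.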

## Modeling assembly program with constraints. A contribution to WCET problem

Kafle, Bishoksan
Search Relevance: 16.41%
Dissertation for the degree of Master in Computational Logic; Model checking with program slicing has been successfully applied to compute the Worst Case Execution Time (WCET) of a program running on given hardware. This method lacks path feasibility analysis and suffers from the following problems: the model checker (MC) explores an exponential number of program paths irrespective of their feasibility, which limits the scalability of the method to multiple-path programs; and the witness trace returned by the MC for the WCET may not be feasible (executable), which can yield a solution that is not tight, i.e., one that overestimates the actual WCET. This thesis complements the above method with path feasibility analysis and addresses these problems. To achieve this, we first validate the witness trace returned by the MC and generate test data if it is executable. For this we generate constraints over a trace and solve a constraint satisfaction problem. Experiments show that 33% of these traces (obtained while computing WCET on standard WCET benchmark programs) are infeasible. Second, we use a constraint solving technique to compute an approximate WCET based solely on the program (without taking hardware characteristics into account)...
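The trace-validation step described above amounts to asking whether any input satisfies every branch constraint collected along the witness trace. A toy sketch (ours; a brute-force search over a small input domain stands in for a real constraint solver):

```python
def trace_feasible(constraints, domain):
    """A trace is executable iff some input in the domain satisfies all of
    the branch constraints collected along the trace."""
    return any(all(c(x) for c in constraints) for x in domain)
```

For example, a trace whose branches require both `x > 10` and `x < 5` is reported infeasible, so its WCET estimate would be discarded as not executable.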

## Infeasible paths in the context of data flow based testing criteria: identification, classification and prediction

Type: Scientific Journal Article Format: text/html
Search Relevance: 67.02%
Infeasible paths constitute a bottleneck for the complete automation of software testing, one of the most expensive activities of software quality assurance. Research effort has been spent on infeasible paths along three main approaches: prediction, classification and identification of infeasibility. This work reports the results of experiments on data flow based criteria and of studies aimed at the three approaches above. Identification, classification, and prediction of infeasible paths are revisited in the context of data flow based criteria (Potential Uses Criteria, PU). Additionally, these aspects are also addressed in the scope of integration and object-oriented testing. Implementation aspects of mechanisms and facilities to deal with infeasibility are presented, taking into consideration Poketool, a tool that supports the application of the Potential Uses Criteria family. The results and ideas presented contribute to reducing the effort spent on infeasible paths during the testing activity.
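As a concrete illustration (our example, not one from the paper), a definition-use association can require a path whose branch conditions contradict each other, making the path infeasible for every input:

```python
def f(x: int) -> int:
    y = 0
    if x > 10:
        y = x       # definition of y
    if x < 5:
        return y    # use of y: covering this def-use pair needs a path with
                    # x > 10 AND x < 5, which no input can execute
    return -1
```

Without infeasibility analysis, a data-flow criterion such as all-uses would still demand coverage of this pair, inflating the apparent testing effort.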

## Design optimization methods for genomic DNA tiling arrays

Bertone, Paul; Trifonov, Valery; Rozowsky, Joel S.; Schubert, Falk; Emanuelsson, Olof; Karro, John; Kao, Ming-Yang; Snyder, Michael; Gerstein, Mark
Source: Cold Spring Harbor Laboratory Press Publisher: Cold Spring Harbor Laboratory Press
Type: Scientific Journal Article
Search Relevance: 16.1%
A recent development in microarray research entails the unbiased coverage, or tiling, of genomic DNA for the large-scale identification of transcribed sequences and regulatory elements. A central issue in designing tiling arrays is that of arriving at a single-copy tile path, as significant sequence cross-hybridization can result from the presence of non-unique probes on the array. Due to the fragmentation of genomic DNA caused by the widespread distribution of repetitive elements, the problem of obtaining adequate sequence coverage increases with the sizes of subsequence tiles that are to be included in the design. This becomes increasingly problematic when considering complex eukaryotic genomes that contain many thousands of interspersed repeats. The general problem of sequence tiling can be framed as finding an optimal partitioning of non-repetitive subsequences over a prescribed range of tile sizes, on a DNA sequence comprising repetitive and non-repetitive regions. Exact solutions to the tiling problem become computationally infeasible when applied to large genomes, but successive optimizations are developed that allow their practical implementation. These include an efficient method for determining the degree of similarity of many oligonucleotide sequences over large genomes...

## QoSPL: A QoS-Driven Software Product Line Engineering Framework for Distributed Real-time and Embedded Systems

Liu, Shih-Hsi; Bryant, Barrett R.; Gray, Jeff; Raje, Rajeev; Tuceryan, Mihran; Olsen, Andrew; Auguston, Mikhail
Type: Scientific Journal Article
Search Relevance: 16.1%
The current synergy of Component-Based Software Engineering (CBSE) and Software Product Line Engineering (SPLE) requires evolution to facilitate Distributed Real-time and Embedded (DRE) system construction. Such evolution is driven by the Quality of Service (QoS) characteristics inherent in DRE systems. This paper introduces a QoS-driven SPLE framework (QoSPL) as an analysis and design paradigm for constructing a set of DRE systems as a product line. Leveraging separation of concerns, DRE systems are analyzed and designed as a collection of QoS systemic paths, each of which individually determines how well the service performs along the path, and which as a whole represent a behavioral view of the software architecture. The paradigm reduces the construction workload arising from tangled functional and QoS requirements and from abundant infeasible design alternatives, and offers a less subjective QoS evaluation. The adopted formalisms also facilitate high-confidence DRE product line construction.

## Design, simulation, and implementation of a vision-based vehicle-following system

Gehrig, Stefan
Source: Universität Tübingen Publisher: Universität Tübingen
Type: Dissertation; info:eu-repo/semantics/doctoralThesis
Language: DE_DE
Search Relevance: 16.1%
The car of the future is becoming increasingly intelligent. Systems such as ABS are taken for granted today, and the brake assistant, introduced to the market by Daimler-Benz in 1997, indicates the trend towards more intelligence in the car. As a next step, assistance systems will reach the market that partially relieve the driver of driving tasks; such systems work, among other things, with camera sensors. This dissertation deals with image understanding of road scenes based on sensor data from a stereo camera system. The main application targets autonomous driving in stop-and-go traffic. Within the dissertation, a situation analysis of the road scene is carried out for a vehicle-following scenario in order to obtain data for the longitudinal and lateral control of the vehicle. The work is divided into three central parts. To obtain suitably abstracted data, new image processing algorithms are designed at several points. To guarantee safe vehicle following, an algorithm is first designed that enables exact following of the path of the preceding vehicle. For temporal integration of the acquired road-scene data, a map of the vehicle's environment is built, which in turn serves as the basis for a planning and decision layer...

## Caminhos não executaveis : caracterização, previsão e determinação para suporte ao teste de programas

Silvia Regina Vergilio
Source: Biblioteca Digital da Unicamp Publisher: Biblioteca Digital da Unicamp
Type: Master's Thesis Format: application/pdf
Search Relevance: 26.95%

## Estrategias de seleção de caminhos no contexto de criterios estruturais de teste de software

Leticia Mara Peres
Source: Biblioteca Digital da Unicamp Publisher: Biblioteca Digital da Unicamp
Type: Master's Thesis Format: application/pdf
Search Relevance: 26.87%

## Enfoques de programación entera para el Problema del Viajante de Comercio con costos en función del tiempo; Integer programming approaches to the Time Dependent Travelling Salesman Problem

Miranda Bront, Juan José
Type: info:eu-repo/semantics/doctoralThesis; doctoral thesis; info:eu-repo/semantics/publishedVersion Format: application/pdf
Search Relevance: 26.1%
The Time Dependent Travelling Salesman Problem (TDTSP) is a generalization of the classical Travelling Salesman Problem (TSP) in which a minimum-cost circuit visiting all customers is sought, where the travel times (or costs) between them are not constant and may vary depending on several factors. One motivation for considering time dependence is that it gives a better approximation to many real-life applications, both in industry and in the service sector. In the related literature there are different versions of the TDTSP, which assume that the variations arise from different causes and affect different parameters. Some stipulate that the variations apply to the travel times (and/or costs), while others assume they apply to the travel speed. Moreover, in the first case, one variant assumes that the travel time depends on the position of the arc in the circuit, while another assumes it depends on the instant at which traversal of the arc begins. Like the TSP, the TDTSP belongs to the NP-hard class, and therefore no polynomial-time algorithm is known for it. In this thesis we address two of the variants mentioned above by formulating Integer Linear Programming models. For each formulation...
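For the position-dependent variant mentioned above, a brute-force baseline (our sketch; realistic instances require the integer-programming models developed in the thesis) makes the time dependence explicit: the cost of arc (i, j) depends on its position p in the circuit:

```python
from itertools import permutations

def tdtsp_bruteforce(cost, n):
    """cost[i][j][p]: cost of travelling i -> j as the p-th leg of the tour."""
    best_tour, best_cost = None, float("inf")
    for perm in permutations(range(1, n)):          # node 0 is the fixed depot
        tour = (0,) + perm + (0,)
        total = sum(cost[tour[p]][tour[p + 1]][p] for p in range(n))
        if total < best_cost:
            best_tour, best_cost = tour, total
    return best_tour, best_cost
```

Because an arc's cost changes with its position, a tour that is optimal under constant costs can become suboptimal here, which is exactly what makes the problem harder than the classical TSP.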

## Domain-Type-Guided Refinement Selection Based on Sliced Path Prefixes

Beyer, Dirk; Löwe, Stefan; Wendler, Philipp
Type: Scientific Journal Article
Search Relevance: 16.81%
Abstraction is a successful technique in software verification, and interpolation on infeasible error paths is a successful approach to automatically detect the right level of abstraction in counterexample-guided abstraction refinement. Because the interpolants have a significant influence on the quality of the abstraction, and thus, the effectiveness of the verification, an algorithm for deriving the best possible interpolants is desirable. We present an analysis-independent technique that makes it possible to extract several alternative sequences of interpolants from one given infeasible error path, if there are several reasons for infeasibility in the error path. We take as input the given infeasible error path and apply a slicing technique to obtain a set of error paths that are more abstract than the original error path but still infeasible, each for a different reason. The (more abstract) constraints of the new paths can be passed to a standard interpolation engine, in order to obtain a set of interpolant sequences, one for each new path. The analysis can then choose from this set of interpolant sequences and select the most appropriate, instead of being bound to the single interpolant sequence that the interpolation engine would normally return. For example...

## ATLAS: A geometric approach to learning high-dimensional stochastic systems near manifolds

Crosskey, Miles; Maggioni, Mauro
Type: Scientific Journal Article
Search Relevance: 16.27%
When simulating multiscale stochastic differential equations (SDEs) in high dimensions, separation of timescales, stochastic noise and high dimensionality can make simulations prohibitively expensive. The computational cost is dictated by microscale properties and interactions of many variables, while the behavior of interest often occurs at the macroscale level and at large time scales, often characterized by few important, but unknown, degrees of freedom. For many problems, bridging the gap between the microscale and macroscale by direct simulation is computationally infeasible. In this work we propose a novel approach to automatically learn a reduced model with an associated fast macroscale simulator. Our unsupervised learning algorithm uses short parallelizable microscale simulations to learn provably accurate macroscale SDE models, which are continuous in space and time. The learning algorithm takes as input: the microscale simulator, a local distance function, and a homogenization spatial or temporal scale, which is the smallest time scale of interest in the reduced system. The learned macroscale model can then be used for fast computation and storage of long simulations. We prove guarantees that relate the number of short paths requested from the microscale simulator to the accuracy of the learned macroscale simulator. We discuss various examples...

## Abstraction-driven Concolic Testing

Daca, Przemysław; Gupta, Ashutosh; Henzinger, Thomas A.
Type: Scientific Journal Article
Search Relevance: 16.1%
Concolic testing is a promising method for generating test suites for large programs. However, it suffers from the path-explosion problem and often fails to find tests that cover difficult-to-reach parts of programs. In contrast, model checkers based on counterexample-guided abstraction refinement explore programs exhaustively, but fail to scale to large programs with high precision. In this paper, we present a novel method that iteratively combines concolic testing and model checking to find a test suite for a given coverage criterion. If concolic testing fails to cover some test goals, the model checker refines its program abstraction to prove more paths infeasible, which reduces the search space for concolic testing. We have implemented our method on top of the concolic-testing tool CREST and the model checker CPAchecker. We evaluated our tool on a collection of programs and a category of SV-COMP benchmarks. In our experiments, we observed an improvement in branch coverage over CREST from 48% to 63% in the best case, and from 66% to 71% on average.

## Online Regenerator Placement

Mertzios, George B.; Shalom, Mordechai; Wong, Prudence W. H.; Zaks, Shmuel
Type: Scientific Journal Article
Search Relevance: 16.34%
Connections between nodes in optical networks are realized by lightpaths. Due to the decay of the signal, a regenerator has to be placed on every lightpath after at most $d$ hops, for some given positive integer $d$. A regenerator can serve only one lightpath. The placement of regenerators has become an active area of research during recent years, and various optimization problems have been studied. The first such problem is the Regenerator Location Problem (RLP), where the goal is to place the regenerators so as to minimize the total number of nodes containing them. We consider two extreme cases of online RLP regarding the value of $d$ and the number $k$ of regenerators that can be used in any single node. (1) $d$ is arbitrary and $k$ is unbounded. In this case a feasible solution always exists. We show an $O(\log |X| \cdot \log d)$-competitive randomized algorithm for any network topology, where $X$ is the set of paths of length $d$. The algorithm can be made deterministic in some cases. We also show a corresponding deterministic lower bound in terms of the edge set $E$. (2) $d=2$ and $k=1$. In this case there is not necessarily a solution for a given input. We distinguish between feasible inputs (for which there is a solution) and infeasible ones. In the latter case...
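For case (1) on a single lightpath, the per-path constraint is simple: no segment between consecutive regeneration points may exceed $d$ hops. A greedy sketch (our illustration, not the paper's online algorithm) that satisfies this with the fewest regenerators on one path:

```python
def place_regenerators(path, d):
    """path: ordered node list. Returns the internal nodes that receive a
    regenerator so that at most d hops separate consecutive regeneration
    points (endpoints terminate the lightpath and need no regeneration)."""
    placed, hops = [], 0
    for node in path[1:-1]:
        hops += 1
        if hops == d:         # d hops since the last regeneration point
            placed.append(node)
            hops = 0
    return placed
```

The online difficulty studied in the paper comes from lightpaths arriving one by one and regenerators being shareable across node locations but not across lightpaths; the single-path case above is the easy building block.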

## A simpler proof for O(congestion + dilation) packet routing

Rothvoss, Thomas
Type: Scientific Journal Article
Search Relevance: 16.27%
In the store-and-forward routing problem, packets have to be routed along given paths such that the arrival time of the latest packet is minimized. A groundbreaking result of Leighton, Maggs and Rao says that this can always be done in time O(congestion + dilation), where the congestion is the maximum number of paths using an edge and the dilation is the maximum length of a path. However, the analysis is quite arcane and complicated and works by iteratively improving an infeasible schedule. Here, we provide a more accessible analysis which is based on conditional expectations. Like [LMR94], our easier analysis also guarantees that constant size edge buffers suffice. Moreover, it was an open problem stated e.g. by Wiese, whether there is any instance where all schedules need at least (1 + epsilon)*(congestion + dilation) steps, for a constant epsilon > 0. We answer this question affirmatively by making use of a probabilistic construction.
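The two quantities in the O(congestion + dilation) bound are purely combinatorial properties of the given paths, computed here for illustration (the paths and node names are ours):

```python
from collections import Counter

def congestion_dilation(paths):
    """paths: node sequences. Congestion = max number of paths sharing a
    (directed) edge; dilation = max path length in edges."""
    edge_load = Counter()
    for path in paths:
        for u, v in zip(path, path[1:]):
            edge_load[(u, v)] += 1
    return max(edge_load.values()), max(len(p) - 1 for p in paths)
```

Both quantities are trivial lower bounds on the schedule length (an edge forwards one packet per step, and a packet needs one step per hop), which is what makes O(congestion + dilation) asymptotically optimal.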

## Large neighborhoods with implicit customer selection for vehicle routing problems with profits

Vidal, Thibaut; Maculan, Nelson; Penna, Puca Huachi Vaz; Ochi, Luis Satoru
Type: Scientific Journal Article
Search Relevance: 16.1%
We consider several Vehicle Routing Problems (VRP) with profits, which seek to select a subset of customers, each one being associated with a profit, and to design service itineraries. When the sum of profits is maximized under distance constraints, the problem is usually called the team orienteering problem. The capacitated profitable tour problem seeks to maximize profits minus travel costs under capacity constraints. Finally, in the VRP with private fleet and common carrier, some customers can be delegated to an external carrier subject to a cost. Three families of combined decisions must be taken: customer selection, assignment to vehicles, and sequencing of deliveries for each route. We propose a new neighborhood search for these problems which explores an exponential number of solutions in pseudo-polynomial time. The search is conducted with standard VRP neighborhoods on an "exhaustive" solution representation, visiting all customers. Since visiting all customers is usually infeasible or sub-optimal, an efficient "Select" algorithm, based on resource-constrained shortest paths, is repeatedly used on any new route to find the optimal subsequence of visits to customers. The good performance of these neighborhood structures is demonstrated by extensive computational experiments with a local search...

## Social network support for data delivery infrastructures

Sastry, Nishanth Ramakrishna
Source: University of Cambridge; Faculty of Computer Science and Technology; Computer Laboratory Publisher: University of Cambridge; Faculty of Computer Science and Technology; Computer Laboratory
Type: Thesis; doctoral; PhD
Language: EN
Search Relevance: 16.36%
Network infrastructures often need to stage content so that it is accessible to consumers. The standard solution, deploying the content on a centralised server, can be inadequate in several situations. Our thesis is that information encoded in social networks can be used to tailor content staging decisions to the user base and thereby build better data delivery infrastructures. This claim is supported by two case studies, which apply social information in challenging situations where traditional content staging is infeasible. Our approach works by examining empirical traces to identify relevant social properties, and then exploits them. The first study looks at cost-effectively serving the "Long Tail" of rich-media user-generated content, which needs to be staged close to viewers to control latency and jitter. Our traces show that a preference for the unpopular tail items often spreads virally and is localised to some part of the social network. Exploiting this, we propose Buzztraq, which decreases replication costs by selectively copying items to locations favoured by viral spread. We also design SpinThrift, which separates popular and unpopular content based on the relative proportion of viral accesses, and opportunistically spins down disks containing unpopular content...

## Atlas Simulation: A Numerical Scheme for Approximating Multiscale Diffusions Embedded in High Dimensions

Crosskey, Miles Martin