Page 1 of results: 46 digital items found in 0.014 seconds

Algoritmos para avaliação da qualidade de vídeo em sistemas de televisão digital; Video quality assessment algorithms in digital television applications.

Fonseca, Roberto Nery da
Source: Biblioteca Digital de Teses e Dissertações da USP Publisher: Biblioteca Digital de Teses e Dissertações da USP
Type: Master's Thesis Format: application/pdf
Published on 15/10/2008 PT
Search Relevance
25.97%
This dissertation addresses the topic of quality assessment of video signals, specifically the full-reference objective assessment of standard-definition video signals. The most reliable way to measure the quality difference between two video scenes is to use a panel of viewers, which yields a subjective measure of the quality difference. This methodology demands a long period of time and a high operational cost, which makes it impractical to use. This work presents the relevant aspects of the human visual system, of the methodologies for video assessment in standard-definition digital television applications, and of the validation of these methodologies. The goal of this dissertation is to test low-computational-cost metrics such as the one that evaluates the peak signal-to-noise ratio (PSNR: Peak Signal-to-Noise Ratio), the one that measures structural similarity (SSIM: Structural SIMilarity), and the one that measures differences in the three color components defined by the CIE (Commission Internationale de l'Eclairage), represented by L*, a* and b*, over a given spatial extent (S-CIELAB: Spatial-CIELAB). A methodology for validating these metrics is presented...
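The PSNR metric mentioned in this abstract has a standard closed form, 10·log10(MAX²/MSE). As a minimal illustration (not the dissertation's own code), it can be computed over two equal-length sequences of pixel values like so:

```python
import math

def psnr(reference, test, max_value=255.0):
    """Peak Signal-to-Noise Ratio in dB: 10 * log10(MAX^2 / MSE).

    `reference` and `test` are equal-length sequences of pixel values
    (e.g. flattened 8-bit grayscale frames, so MAX = 255).
    """
    assert len(reference) == len(test)
    # Mean squared error between the reference and the degraded frame.
    mse = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    if mse == 0:
        return float("inf")  # identical frames: infinite PSNR
    return 10.0 * math.log10(max_value ** 2 / mse)

# Hypothetical frames for illustration: a uniform offset of 10 gray levels
# gives MSE = 100, hence PSNR = 10 * log10(255^2 / 100) ≈ 28.13 dB.
print(psnr([100, 100, 100, 100], [110, 110, 110, 110]))
```

Higher PSNR means the test frame is closer to the reference; full-reference metrics like PSNR and SSIM always require the undistorted original for comparison.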

SystEM-PLA: um método sistemático para avaliação de arquitetura de linha de produto de software baseada em UML; SystEM-PLA: a systematic evaluation method for UML-based software product line architecture

Oliveira Junior, Edson Alves de
Source: Biblioteca Digital de Teses e Dissertações da USP Publisher: Biblioteca Digital de Teses e Dissertações da USP
Type: Doctoral Thesis Format: application/pdf
Published on 03/09/2010 PT
Search Relevance
46.1%
The software product line (PL) approach aims mainly at promoting the generation of specific products of a given domain based on the reuse of a central infrastructure, called the core asset base. One of the main artifacts of a PL core is the PL Architecture (PLA), which represents the abstraction of all the single-system architectures that can be generated for a specific domain. PLA evaluations are important because they make it possible to increase the productivity and the quality of PL products, and their results support the analysis of business goals and return on investment. This work proposes a systematic method for PLA evaluation, SystEM-PLA (a Systematic Evaluation Method for Software Product Line Architectures). The method considers PLA models in UML, a widely known and consolidated notation. SystEM-PLA is composed of an evaluation metaprocess, guidelines that direct the user in how to evaluate a PLA, and basic metrics for UML models and quality attributes. The method uses the SMarty approach (Stereotype-based Management of Variability) to manage variability in UML-based PLs. Trade-off analyses aimed at prioritizing quality attributes for the development and evolution of PL products are performed based on applying and collecting the SystEM-PLA metrics over PLA configurations. The metrics proposed for the quality attributes complexity and extensibility were validated by means of an experimental study. Evidence indicated the feasibility of applying the SystEM-PLA method in industry, based on an experimental study carried out with practitioners from a large software development company...

Management Information Base (MIB) de gerenciamento de confiança em redes ad-Hoc; Management Information Base (MIB) for trust management in ad-hoc networks

Santana, Beatriz Campos
Source: Universidade de Brasília Publisher: Universidade de Brasília
Type: Master's Thesis
POR
Search Relevance
15.95%
Master's dissertation, Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2012. Derived from the study of the phenomenon of the same name among humans, a concept of trust specific to computing has been taking shape. With diversified applications, this concept has been widely used to evaluate the behavior of various components of computer networks and, above all, to classify nodes (network components, e.g. routers) in order to identify those with malicious and dangerous intentions. This dissertation proposes the creation of a MIB (Management Information Base) for trust management in ad hoc networks, to collect data on the behavior of the nodes of a network and, from these data, to apply a trust calculation, making it possible to observe whether the behavior of each node is adequate for the proper functioning of the network. To carry out this work, it was necessary to survey trust metrics applied to ad hoc networks for the implementation of the trust MIB. To validate the proposal...
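The abstract does not detail the trust calculation itself; one common choice in the ad hoc networking literature, shown here purely as an illustrative sketch and not as this dissertation's method, is the Beta reputation model, where a node's trust is the expected value of a Beta distribution over observed good and bad interactions:

```python
def beta_trust(successes, failures):
    """Expected trust under the Beta reputation model.

    With `successes` good interactions (e.g. packets forwarded) and
    `failures` bad ones (e.g. packets dropped), the trust score is
    E[Beta(s + 1, f + 1)] = (s + 1) / (s + f + 2), which starts at a
    neutral 0.5 for an unknown node.
    """
    return (successes + 1) / (successes + failures + 2)

# A node observed forwarding 8 packets and dropping 2:
print(beta_trust(8, 2))  # 0.75
```

A trust MIB of the kind proposed here would store the per-node counters (the `successes`/`failures` inputs) so a manager can recompute such scores over SNMP.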

Component-based software engineering: a quantitative approach

Goulão, Miguel Carlos Pacheco Afonso
Source: FCT - UNL Publisher: FCT - UNL
Type: Doctoral Thesis
Published on //2008 ENG
Search Relevance
16.31%
Dissertation presented to obtain the degree of Doctor in Informatics at Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia. Background: Often, claims in Component-Based Development (CBD) are supported only by qualitative expert opinion, rather than by quantitative data. This contrasts with the normal practice in other sciences, where sound experimental validation of claims is standard practice. Experimental Software Engineering (ESE) aims to bridge this gap. Unfortunately, it is common to find experimental validation efforts that are hard to replicate and compare, hindering the build-up of the body of knowledge in CBD. Objectives: In this dissertation our goals are (i) to contribute to the evolution of ESE with respect to the replicability and comparability of experimental work, and (ii) to apply our proposals to CBD, thus contributing to its deeper and sounder understanding. Techniques: We propose a process model for ESE, aligned with current experimental best practices, and combine this model with a measurement technique called Ontology-Driven Measurement (ODM). ODM aims to improve the state of practice in metrics definition and collection by making metrics definitions formal and executable, without sacrificing their usability. ODM uses standard technologies that can be well adapted to current integrated development environments. Results: Our contributions include the definition and preliminary validation of a process model for ESE and the proposal of ODM for supporting metrics definition and collection in the context of CBD. We use both the process model and ODM to perform a series of experimental works in CBD...

Identifying Neural Drivers with Functional MRI: An Electrophysiological Validation

David, Olivier; Guillemain, Isabelle; Saillet, Sandrine; Reyt, Sebastien; Deransart, Colin; Segebarth, Christoph; Depaulis, Antoine
Source: Public Library of Science Publisher: Public Library of Science
Type: Journal Article
EN
Search Relevance
25.72%
Whether functional magnetic resonance imaging (fMRI) allows the identification of neural drivers remains an open question of particular importance to refine physiological and neuropsychological models of the brain, and/or to understand neurophysiopathology. Here, in a rat model of absence epilepsy showing spontaneous spike-and-wave discharges originating from the first somatosensory cortex (S1BF), we performed simultaneous electroencephalographic (EEG) and fMRI measurements, and subsequent intracerebral EEG (iEEG) recordings in regions strongly activated in fMRI (S1BF, thalamus, and striatum). fMRI connectivity was determined from fMRI time series directly and from hidden state variables using a measure of Granger causality and Dynamic Causal Modelling that relates synaptic activity to fMRI. fMRI connectivity was compared to directed functional coupling estimated from iEEG using asymmetry in generalised synchronisation metrics. The neural driver of spike-and-wave discharges was estimated in S1BF from iEEG, and from fMRI only when hemodynamic effects were explicitly removed. Functional connectivity analysis applied directly on fMRI signals failed because hemodynamics varied between regions, rendering temporal precedence irrelevant. This paper provides the first experimental substantiation of the theoretical possibility to improve interregional coupling estimation from hidden neural states of fMRI. As such...

Generating One Biometric Feature from Another: Faces from Fingerprints

Ozkaya, Necla; Sagiroglu, Seref
Source: Molecular Diversity Preservation International (MDPI) Publisher: Molecular Diversity Preservation International (MDPI)
Type: Journal Article
Published on 28/04/2010 EN
Search Relevance
15.95%
This study presents a new approach based on artificial neural networks for generating one biometric feature (faces) from another (only fingerprints). An automatic and intelligent system was designed and developed to analyze the relationships between fingerprints and faces and to model them. The proposed system is the first study to generate all parts of the face, including eyebrows, eyes, nose, mouth, ears and face border, from only fingerprints. It is also unique and different from similar studies recently presented in the literature, with some superior features. The parameter settings of the system were obtained with the help of the Taguchi experimental design technique. The performance and accuracy of the system were evaluated with the 10-fold cross-validation technique, using qualitative evaluation metrics in addition to expanded quantitative evaluation metrics. The results were presented as a combination of these objective and subjective metrics, illustrating the qualitative properties of the proposed methods as well as a quantitative evaluation of their performance. Experimental results have shown that one biometric feature can be determined from another. These results indicate once more that there is a strong relationship between fingerprints and faces.
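The 10-fold cross-validation procedure mentioned above partitions the data into ten disjoint folds, each used exactly once as the test set while the other nine form the training set. A minimal index-splitting sketch (a generic illustration, not the authors' implementation):

```python
import random

def k_fold_indices(n_samples, k=10, seed=0):
    """Yield (train, test) index lists for k-fold cross-validation.

    The sample indices are shuffled once, dealt into k disjoint folds,
    and each fold serves in turn as the held-out test set.
    """
    rng = random.Random(seed)
    indices = list(range(n_samples))
    rng.shuffle(indices)
    folds = [indices[i::k] for i in range(k)]  # round-robin deal into k folds
    for i in range(k):
        test = folds[i]
        train = [idx for j, fold in enumerate(folds) if j != i for idx in fold]
        yield train, test

splits = list(k_fold_indices(100, k=10))
```

Averaging an accuracy metric over the ten held-out folds gives a less optimistic performance estimate than a single train/test split, which is why it is standard for evaluating learned models like the fingerprint-to-face network here.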

De Novo Design and Experimental Characterization of Ultrashort Self-Associating Peptides

Smadbeck, James; Chan, Kiat Hwa; Khoury, George A.; Xue, Bo; Robinson, Robert C.; Hauser, Charlotte A. E.; Floudas, Christodoulos A.
Source: Public Library of Science Publisher: Public Library of Science
Type: Journal Article
Published on 10/07/2014 EN
Search Relevance
26.04%
Self-association is a common phenomenon in biology and one that can have positive and negative impacts, from the construction of the architectural cytoskeleton of cells to the formation of fibrils in amyloid diseases. Understanding the nature and mechanisms of self-association is important for modulating these systems and in creating biologically-inspired materials. Here, we present a two-stage de novo peptide design framework that can generate novel self-associating peptide systems. The first stage uses a simulated multimeric template structure as input into the optimization-based Sequence Selection to generate low potential energy sequences. The second stage is a computational validation procedure that calculates Fold Specificity and/or Approximate Association Affinity (K*association) based on metrics that we have devised for multimeric systems. This framework was applied to the design of self-associating tripeptides using the known self-associating tripeptide, Ac-IVD, as a structural template. Six computationally predicted tripeptides (Ac-LVE, Ac-YYD, Ac-LLE, Ac-YLD, Ac-MYD, Ac-VIE) were chosen for experimental validation in order to illustrate the self-association outcomes predicted by the three metrics. Self-association and electron microscopy studies revealed that Ac-LLE formed bead-like microstructures...

Development and Validation of an Haemophilus influenzae Supragenome Hybridization (SGH) Array for Transcriptomic Analyses

Janto, Benjamin A.; Hiller, N. Luisa; Eutsey, Rory A.; Dahlgren, Margaret E.; Earl, Joshua P.; Powell, Evan; Ahmed, Azad; Hu, Fen Z.; Ehrlich, Garth D.
Source: Public Library of Science Publisher: Public Library of Science
Type: Journal Article
Published on 07/10/2014 EN
Search Relevance
25.72%
We previously carried out the design and testing of a custom-built Haemophilus influenzae supragenome hybridization (SGH) array that contains probe sequences for 2,890 gene clusters identified by whole-genome sequencing of 24 strains of H. influenzae. The array was originally designed as a tool to interrogate the gene content of large numbers of clinical isolates without the need for sequencing; however, the data obtained are quantitative and thus suitable for transcriptomic analyses. In the current study, RNA was extracted from H. influenzae strain CZ4126/02 (which was not included in the design of the array), converted to cDNA, and labelled and hybridized to the SGH arrays to assess the quality and reproducibility of data obtained from these custom-designed chips as a tool for transcriptomics. Three types of experimental replicates were analyzed, all showing very high degrees of correlation, thus validating both the array and the methods used for RNA profiling. A custom filtering pipeline for two-condition unpaired data using five metrics was developed to minimize variability within replicates and to maximize the identification of the most significant true transcriptional differences between two samples. These methods can be extended to transcriptional analysis of other bacterial species utilizing supragenome-based arrays.

Classifying metrics for assessing object-oriented software maintainability: a family of metrics’ catalogs

Saraiva, Juliana de Albuquerque Gonçalves; Soares, Sérgio Castelo Branco (Advisor); L. Filho, Fernando J. C. de (Co-advisor)
Source: Universidade Federal de Pernambuco Publisher: Universidade Federal de Pernambuco
Type: Doctoral Thesis
EN
Search Relevance
36.09%
Currently, Object-Oriented Programming (OOP) is one of the most widely used paradigms. Complementarily, software maintainability is considered a software attribute that plays an important role in quality. In this context, Object-Oriented Software Maintainability (OOSM) has been studied for years, and many researchers have proposed a large number of metrics to measure it. As a consequence of the number and diversity of metrics, in addition to the lack of standardization in metric definition and naming, deciding which metrics to adopt in experiments on OOSM, or even which to use in software companies, is a difficult task. Therefore, a systematic mapping study was conducted to find which metrics are used as indicators in OOSM assessments. From an initial selection of 5175 primary studies, 138 were selected, yielding 568 metrics. Analyzing the 568 metrics, inconsistencies in metric naming were found: there were metrics with the same names but different meanings (8 cases involving 17 metrics), and also metrics with different names but similar meanings (32 cases involving 214 metrics). Moreover, a categorization of the metrics has been proposed to facilitate the decision-making process about which ones should be adopted...

Cardiac MRI: Improved Assessment of Left Ventricular Function, Wall Motion, and Viability

Krishnamurthy, Ramkumar
Source: Rice University Publisher: Rice University
ENG
Search Relevance
15.99%
Heart failure is the clinical syndrome accompanying the inability of the heart to maintain the cardiac output required to meet metabolic requirements and accommodate venous return, and is one of the leading causes of mortality in the United States. Accurate imaging of the heart and its failure is important for successful patient management and treatment. Multiple cardiac imaging modalities provide complementary information about the heart: LV function and wall motion, anatomy, myocardial viability and ischemia. In many instances, a patient must undergo multiple imaging sessions to obtain diagnostic clinical information with confidence. It would benefit the individual and the health care system if a single imaging modality could yield reliable clinical information about the heart, leading to reduced cost and anxiety and increased diagnostic confidence. This thesis proposes methods that improve the assessment of LV function, wall motion, and viability with cardiac MRI, taking cardiac MRI one step closer to being a one-stop solution for imaging of the heart. Conventional cardiac MR imaging is performed at a temporal resolution of around 40 ms per cardiac phase. While global left ventricular (LV) function can be reliably established at this temporal resolution...

Utilización de métricas de complejidad para la clasificación estética de fotografías; Using complexity metrics for the aesthetic classification of photographs

Carballal Mato, Adrián
Source: Universidade da Coruña Publisher: Universidade da Coruña
Type: Doctoral Thesis
SPA
Search Relevance
15.97%
[Abstract] The use of various complexity estimators in aesthetics-related tasks is explored. Identifying a set of metrics that allow images to be classified according to aesthetic criteria would be an especially important contribution to the understanding of the aesthetic phenomenon and of visual experience, providing clear contributions to the fields of aesthetics, visual perception and art theory, pattern recognition, computer graphics and computer vision. Likewise, the aim is to check whether the relationship between aesthetics and color observed in previous work using ad-hoc metrics is also present when using concepts as general and a priori independent as complexity. A series of metrics based on complexity estimation is formulated. To this end, two compression methods, JPEG and Fractal, as well as Zipf's Law and the Fractal Dimension are used. All of these metrics are used together with the Canny and Sobel edge-detection filters on the HSV color model. Various experiments are carried out addressing two different tasks: image ordering and classification according to aesthetic criteria. Two distinct approaches are followed...

Dynamic adaptive search based software engineering needs fast approximate metrics (Keynote)

Harman, Mark; Clark, John; Ó Cinnéide, Mel
Source: Association for Computing Machinery Publisher: Association for Computing Machinery
Type: info:eu-repo/semantics/conferenceObject; all_ul_research; ul_published_reviewed
ENG
Search Relevance
35.95%
peer-reviewed; Search Based Software Engineering (SBSE) uses fitness functions to guide an automated search for solutions to challenging software engineering problems. The fitness function is a form of software metric, so there is a natural and close interrelationship between software metrics and SBSE. SBSE can be used as a way to experimentally validate metrics, revealing startling conflicts between metrics that purport to measure the same software attributes. SBSE also requires new forms of surrogate metrics. This topic is less well studied and, therefore, remains an interesting open problem for future work. This paper overviews recent results on SBSE for experimental metric validation and discusses the open problem of fast approximate surrogate metrics for dynamic adaptive SBSE.
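The core SBSE idea described above, a fitness function (a software metric) steering an automated search, can be sketched with a generic hill climb. The problem, fitness function and mutation operator below are toy placeholders for illustration only, not anything from the keynote:

```python
import random

def hill_climb(fitness, initial, mutate, iterations=1000, seed=42):
    """Generic hill climbing: accept a mutation only if it improves fitness.

    In SBSE the fitness function is a software metric (e.g. coverage,
    coupling, cohesion) that guides the search toward better solutions;
    a cheap surrogate metric could stand in for an expensive one here.
    """
    rng = random.Random(seed)
    best = initial
    best_fit = fitness(best)
    for _ in range(iterations):
        candidate = mutate(best, rng)
        candidate_fit = fitness(candidate)
        if candidate_fit > best_fit:  # greedy acceptance
            best, best_fit = candidate, candidate_fit
    return best, best_fit

# Toy problem: recover a hidden integer vector; fitness is the negated
# total distance to it, so the optimum has fitness 0.
target = [3, 1, 4, 1, 5]
fitness = lambda v: -sum(abs(a - b) for a, b in zip(v, target))
mutate = lambda v, rng: [x + rng.choice([-1, 0, 1]) for x in v]
solution, fit = hill_climb(fitness, [0, 0, 0, 0, 0], mutate)
```

Swapping `fitness` for a fast approximation while keeping the search loop unchanged is exactly where the surrogate-metric question raised in the paper arises.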

Experimental Validation of an Elastic Registration Algorithm for Ultrasound Images

Leung, Corina
Source: Queen's University Publisher: Queen's University
Type: Doctoral Thesis Format: 5705954 bytes; application/pdf
EN
Search Relevance
36.15%
Ultrasound is a favorable tool for intra-operative surgical guidance due to its fast imaging speed and non-invasive nature. However, deformations of the anatomy caused by breathing, heartbeat, and movement of the patient make it difficult to track the location of anatomical landmarks during intra-operative ultrasound-guided interventions. While elastic registration can be used to compensate for image misalignment, its adaptation for clinical use has only been gradual due to the lack of standardized guidelines to quantify the performance of different registration techniques. Evaluation of elastic registration algorithms is a difficult task since the point to point correspondence between images is usually unknown. This poses a major challenge in the validation of non-rigid registration techniques for performance comparisons. Current validation guidelines for non-rigid registration algorithms exist for the comparison of techniques for magnetic resonance images of the brain. These frameworks provide users with standardized brain datasets and performance measures based on brain region alignment, intensity differences between images, and inverse consistency of transformations. These metrics may not all be suitable for ultrasound registration algorithms due to the different properties of the imaging modalities. Furthermore...

Utilisation du concept de connectivité en hydrologie : définitions, approches expérimentales et éléments de modélisation; Using the concept of connectivity in hydrology: definitions, experimental approaches and modelling elements

Ali, Geneviève
Source: Université de Montréal Publisher: Université de Montréal
Type: Electronic Thesis or Dissertation
FR
Search Relevance
25.81%
While some mechanisms considered crucial for the transformation of rainfall into streamflow remain poorly understood, the concept of hydrological connectivity has recently been proposed to explain why certain processes are triggered episodically depending on the characteristics of rainfall events and on soil water content prior to the event. Adopting this new concept in hydrology remains difficult, however, since there is no consensus on the definition of connectivity, its measurement, its integration into hydrological models, and its behaviour across spatial and temporal scale transfers. The goal of this doctoral work is therefore to refine the definition, measurement, aggregation and prediction of processes related to hydrological connectivity, focusing on the following questions: 1) What methodological framework should be adopted for a study of hydrological connectivity? 2) How can the degree of hydrological connectivity of watersheds be assessed from field data? 3) To what extent should our knowledge of hydrological connectivity lead us to modify the postulates of hydrological modelling? Three study approaches are distinguished...

A Web Resource for Standardized Benchmark Datasets, Metrics, and Rosetta Protocols for Macromolecular Modeling and Design

Ó Conchúir, Shane; Barlow, Kyle A.; Pache, Roland A.; Ollikainen, Noah; Kundert, Kale; O'Meara, Matthew J.; Smith, Colin A.; Kortemme, Tanja
Source: Public Library of Science Publisher: Public Library of Science
Type: Journal Article
Published on 03/09/2015 EN
Search Relevance
25.92%
The development and validation of computational macromolecular modeling and design methods depend on suitable benchmark datasets and informative metrics for comparing protocols. In addition, if a method is intended to be adopted broadly in diverse biological applications, there needs to be information on appropriate parameters for each protocol, as well as metrics describing the expected accuracy compared to experimental data. In certain disciplines, there exist established benchmarks and public resources where experts in a particular methodology are encouraged to supply their most efficient implementation of each particular benchmark. We aim to provide such a resource for protocols in macromolecular modeling and design. We present a freely accessible web resource (https://kortemmelab.ucsf.edu/benchmarks) to guide the development of protocols for protein modeling and design. The site provides benchmark datasets and metrics to compare the performance of a variety of modeling protocols using different computational sampling methods and energy functions, providing a “best practice” set of parameters for each method. Each benchmark has an associated downloadable benchmark capture archive containing the input files, analysis scripts...

A validation metrics based model calibration applied on stranded cables

Castello, Daniel Alves; Matt, Carlos Frederico Trotta
Source: Associação Brasileira de Engenharia e Ciências Mecânicas - ABCM Publisher: Associação Brasileira de Engenharia e Ciências Mecânicas - ABCM
Type: Journal Article Format: text/html
Published on 01/12/2011 EN
Search Relevance
36.07%
The present work is aimed at building a computational model for a typical stranded cable based on the basic principles of Verification and Validation. The model calibration and model tracking are guided by a pool of validation metrics suitable for data commonly used in structural dynamics. The estimator used for the associated inverse problem is the Maximum a Posteriori estimator, and the parameter estimation process is performed sequentially over experiments. Experimental tests were performed at CEPEL's (Electric Power Research Center) laboratory span with the overhead conductor Grosbeak in order to provide the measured data. The predictive capacity of the computational model is assessed by means of frequency- and time-domain validations through FRFs, band-limited white-noise and sine-sweep excitations. We also present novel and reliable estimates for the bending stiffness and damping parameters of a widely used transmission line conductor.
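The Maximum a Posteriori estimator used for the inverse problem has the standard general form (a textbook definition, not a formula specific to this paper):

```latex
\hat{\theta}_{\text{MAP}}
  = \arg\max_{\theta}\, p(\theta \mid y)
  = \arg\max_{\theta}\, p(y \mid \theta)\, p(\theta)
```

Here $y$ stands for the measured data (e.g. the FRFs and time histories from the laboratory span) and $\theta$ for the model parameters being calibrated, such as the conductor's bending stiffness and damping; the prior $p(\theta)$ is what distinguishes MAP from plain maximum likelihood, and updating the posterior experiment by experiment yields the sequential estimation the abstract describes.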

An overview of metrics-based approaches to support software components reusability assessment

Goulão, Miguel; Abreu, Fernando Brito e
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 30/09/2011
Search Relevance
25.89%
Objective: To present an overview on the current state of the art concerning metrics-based quality evaluation of software components and component assemblies. Method: Comparison of several approaches available in the literature, using a framework comprising several aspects, such as scope, intent, definition technique, and maturity. Results: The identification of common shortcomings of current approaches, such as ambiguity in definition, lack of adequacy of the specifying formalisms and insufficient validation of current quality models and metrics for software components. Conclusions: Quality evaluation of components and component-based infrastructures presents new challenges to the Experimental Software Engineering community.

Experimental validation of large-scale simulations of dynamic fracture along weak planes

Chalivendra, Vijaya B.; Hong, Soosung; Arias, Irene; Knap, Jaroslaw; Rosakis, Ares; Ortiz, Michael
Source: Elsevier Publisher: Elsevier
Type: Article; PeerReviewed Format: application/pdf
Published on //2009
Search Relevance
25.91%
A well-controlled and minimal experimental scheme for dynamic fracture along weak planes is specifically designed for the validation of large-scale simulations using cohesive finite elements. The role of the experiments in the integrated approach is two-fold. On the one hand, careful measurements provide accurate boundary conditions and material parameters for a complete setup of the simulations without free parameters. On the other hand, quantitative performance metrics are provided by the experiments, which are compared a posteriori with the results of the simulations. A modified Hopkinson bar setup in association with notch-face loading is used to obtain controlled loading of the fracture specimens. An inverse problem of cohesive zone modeling is performed to obtain accurate mode-I cohesive zone laws from experimentally measured deformation fields. The speckle interferometry technique is employed to obtain the experimentally measured deformation field. Dynamic photoelasticity in conjunction with high-speed photography is used to capture experimental records of crack propagation. The comparison shows that both the experiments and the numerical simulations result in very similar crack initiation times and produce crack tip velocities which differ by less than 6%. The results also confirm that the detailed shape of the non-linear cohesive zone law has no significant influence on the numerical results.

Local Motion And Local Accuracy In Protein Backbone

Davis, Ian Wheeler
Source: Duke University Publisher: Duke University
Type: Dissertation Format: 18399336 bytes; 333 pages; application/pdf
EN_US
Search Relevance
25.81%
Proteins are chemically simple molecules, being unbranched polymers of uncomplicated organic compounds. Nonetheless, they fold up into a dazzling variety of complex and beautiful configurations with a dizzying array of structural, regulatory, and catalytic functions. Despite great progress, we still have very limited ability to predict the folded conformation of an amino acid sequence, and limited understanding of its dynamics and motions. Thus, this work presents a quartet of interrelated studies that address some aspects of the detailed local conformations and motions of protein backbone. First, I used a density-dependent smoothing algorithm and a high-quality, B-filtered data set to construct highly accurate conformational distributions for protein backbone (Ramachandran plots) and sidechains (rotamers). These distributions are the most accurate and restrictive produced to date, with improved discrimination between rare-but-real conformations and artifactual ones. Second, I analyzed hundreds of alternate conformations in atomic resolution crystal structures, and discovered that dramatic conformational change in a protein sidechain is often coupled to a subtle but very common mode of conformational change in its backbone -- the backrub motion. Examination of other biophysical data further supports the ubiquity of this motion. Third...

Dynamic Electron Arc Radiotherapy (DEAR): A New Conformal Electron Therapy Technique

Rodrigues, Anna Elisabeth
Source: Duke University Publisher: Duke University
Type: Dissertation
Published on //2015
Search Relevance
15.99%

Electron beam therapy represents an underutilized area in radiation therapy. While electron radiation therapy has existed for many decades and electron beams with multiple energies are available on linear accelerators – the most common device to deliver radiation therapy – efforts to advance the field have been slow. In contrast, photon beam therapy has seen rapid advancements in the past decade, and has become the main modality for radiation therapy treatment.

This doctoral research project comprises the development of a novel treatment modality, dynamic electron arc radiotherapy (DEAR) that seeks to address challenges to clinical implementation of electron beam therapy by providing a technique that may be able to treat specific patient subsets with better outcomes than current techniques. This research not only focused on the development of DEAR, but also aimed to improve upon and introduce new tools and techniques that could translate to current clinical electron beam therapy practice.

The concept of DEAR is presented. DEAR represents a new conformal electron therapy technique with synchronized couch motion. DEAR utilizes the combination of gantry rotation, couch motion, and dose rate modulation to achieve desirable dose distributions in the patient. The electron applicator is kept in place to minimize scatter and maintain a narrow penumbra. The couch motion is synchronized with the gantry rotation to avoid collisions between the patient and the electron cone.

First...