Page 1 of results: 31264 digital items found in 0.060 seconds

Conditional random field for 3D point clouds with adaptive data reduction.

Lim, E.; Suter, D.
Source: IEEE (online). Publisher: IEEE (online)
Type: Conference paper
Published: 2007. Language: EN
Search relevance: 55.7%
We propose using Conditional Random Fields with adaptive data reduction for the classification of 3D point clouds acquired from a Riegl terrestrial laser scanner. Training and inference on the large outdoor urban datasets acquired can be time-consuming. We approach the problem by computing an adaptive support region for each data point using 3D scale theory. For training and inference of the discriminative Conditional Random Fields, a smaller set of data samples containing the relevant information within the support region is selected instead of the full point cloud. We tested the algorithm on synthetically generated data and on urban point clouds acquired from the laser scanner. The computed support region is also used in feature extraction for the urban point cloud data. The results show improved training and inference rates while maintaining comparable classification accuracy.; http://dx.doi.org/10.1109/CW.2007.30; E. H. Lim and D. Suter
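
The abstract names the ingredients (a per-point support region from 3D scale selection, then subsampling within it) but not the algorithm itself. A minimal Python sketch of one plausible reading follows; the k-th-nearest-neighbour radius and the per-point sample cap are our illustrative assumptions, not the authors' method.

import numpy as np

def adaptive_support_radius(points, k=10):
    # Hypothetical stand-in for 3D scale selection: use the distance to the
    # k-th nearest neighbour as a per-point support radius.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d.sort(axis=1)
    return d[:, k]  # column 0 is each point's zero self-distance

def reduce_for_crf(points, k=10, keep=5):
    # Keep at most `keep` samples inside each support region instead of the
    # full neighbourhood: the adaptive data reduction idea.
    radii = adaptive_support_radius(points, k)
    reduced = []
    for i, r in enumerate(radii):
        d = np.linalg.norm(points - points[i], axis=1)
        idx = np.where((d > 0) & (d <= r))[0]
        reduced.append(idx[np.argsort(d[idx])][:keep])
    return reduced  # per-point neighbour lists, e.g. for CRF edges

points = np.random.rand(200, 3)
neighbours = reduce_for_crf(points)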

Mitigating Inconsistencies by Coupling Data Cleaning, Filtering, and Contextual Data Validation in Wireless Sensor Networks

Bakhtiar, Qutub A
Source: FIU Digital Commons. Publisher: FIU Digital Commons
Type: Journal article. Format: application/pdf
Search relevance: 55.81%
With the advent of peer-to-peer networks and, more importantly, sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. For example, in tele-health applications, sensor-based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, the requirements of such applications necessitate the cleaning and filtering of continuous, corrupted and incomplete data streams gathered wirelessly in dynamically varying conditions. Yet, existing data stream cleaning and filtering schemes are incapable of capturing the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware, and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged. This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To simplify the complexity of the validation process, the developed solution maps the requirements of the application onto a geometric space and identifies the potential sensor nodes of interest. Additionally...

Hyperspectral discrimination of tropical dry forest lianas and trees: Comparative data reduction approaches at the leaf and canopy levels

Kalacska, Margaret; Bohlman, Stephanie; Sanchez-Azofeifa, G Arturo; Castro-Esau, Karen L; Caelli, Terry
Source: Elsevier. Publisher: Elsevier
Type: Journal article
Search relevance: 65.73%
A dataset of spectral signatures (leaf level) of tropical dry forest trees and lianas and an airborne hyperspectral image (crown level) are used to test three hyperspectral data reduction techniques (principal component analysis, forward feature selection...
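
The abstract is cut off mid-list, but principal component analysis, the first technique named, is standard enough to sketch. Below is a minimal numpy version; the sample and band counts are invented for illustration, and forward feature selection would instead greedily add whichever single band most improves a classifier.

import numpy as np

def pca_reduce(spectra, n_components=10):
    # spectra: (n_samples, n_bands) matrix of reflectance signatures
    X = spectra - spectra.mean(axis=0)            # centre each band
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T                # scores on leading components

spectra = np.random.rand(500, 200)                # e.g. 500 leaves x 200 bands
scores = pca_reduce(spectra)                      # (500, 10), fed to a classifier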

The pipeline for the GOSSS data reduction

Sota, Alfredo; Maíz Apellániz, Jesús
Source: Cornell University. Publisher: Cornell University
Type: Journal article
Published: 18/01/2011
Search relevance: 45.8%
The Galactic O-Star Spectroscopic Survey (GOSSS) is an ambitious project that is observing all known Galactic O stars with B < 13 in the blue-violet part of the spectrum at R ~ 2500. It is based on version 2 of the most complete catalog to date of Galactic O stars with accurate spectral types (v1, Maíz Apellániz et al. 2004; v2, Sota et al. 2008). Given the large amount of data that we are getting (more than 150 nights of observations at three different observatories in the last 4 years), we have developed an automatic spectroscopic reduction pipeline. This pipeline has been programmed in IDL and automates the process of data reduction. It can operate in two modes: automatic data reduction (quicklook) or semi-automatic data reduction (full). In "quicklook" mode, we are able to get rectified and calibrated spectra of all the stars from a full night just minutes after the observations. The pipeline automatically identifies the type of image and applies the standard reduction procedure (bias subtraction, flat-field correction, application of a bad-pixel mask, ...). It also extracts all spectra of the stars in one image (including close visual binaries) and aligns and merges all spectra of the same star (to increase the signal-to-noise ratio and to correct defects such as cosmic rays)...
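
The pipeline itself is written in IDL; as a compact illustration of the standard reduction steps the abstract lists (bias subtraction, flat-field correction, bad-pixel masking), here is a Python sketch. The array names are ours, and the flat field is assumed to be already normalised.

import numpy as np

def reduce_frame(raw, bias, flat, bad_pixel_mask):
    frame = raw - bias                       # remove the electronic offset
    frame /= np.where(flat > 0, flat, 1.0)   # correct pixel-to-pixel sensitivity
    frame[bad_pixel_mask] = np.nan           # flag unusable pixels
    return frame

raw = np.random.poisson(1000.0, (64, 64)).astype(float)
clean = reduce_frame(raw, np.full((64, 64), 300.0),
                     np.ones((64, 64)), np.zeros((64, 64), dtype=bool))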

The Challenge of Data Reduction for Multiple Instruments on Stratospheric Observatory For Infrared Astronomy (SOFIA)

Charcos-Llorens, M. V.; Krzaczek, R.; Shuping, R. Y.; Lin, L.
Source: Cornell University. Publisher: Cornell University
Type: Journal article
Published: 10/01/2011
Search relevance: 45.88%
SOFIA presents a number of interesting challenges for the development of a data reduction environment which, in its initial phase, will have to incorporate pipelines from seven different instruments. The SOFIA data reduction software must therefore run code that has been developed in a variety of dissimilar environments, a variety that will only increase with future generations of instrumentation. We investigated three distinctly different situations for performing pipelined data reduction in SOFIA: automated data reduction after data archival at the end of a mission, re-pipelining of science data with updated calibrations or optimum parameters, and interactive, user-driven local execution and analysis of data reduction by an investigator. These different modes would traditionally result in very different software implementations of the algorithms used by each instrument team, in effect tripling the amount of data reduction software that would need to be maintained by SOFIA. We present here a unique approach that enfolds all the instrument-specific data reduction software in the observatory framework and satisfies the needs of all three reduction scenarios while also providing the standard visualization tools. The SOFIA data reduction structure would host the different algorithms and techniques that the instrument teams develop in their own environments. Ideally...

Data Reduction for Graph Coloring Problems

Jansen, Bart M. P.; Kratsch, Stefan
Source: Cornell University. Publisher: Cornell University
Type: Journal article
Search relevance: 55.64%
This paper studies the kernelization complexity of graph coloring problems with respect to certain structural parameterizations of the input instances. We are interested in how well polynomial-time data reduction can provably shrink instances of coloring problems, in terms of the chosen parameter. It is well known that deciding 3-colorability is already NP-complete, hence parameterizing by the requested number of colors is not fruitful. Instead, we pick up on a research thread initiated by Cai (DAM, 2003), who studied coloring problems parameterized by the modification distance of the input graph to a graph class on which coloring is polynomial-time solvable; for example, parameterizing by the number k of vertex deletions needed to make the graph chordal. We obtain various upper and lower bounds for kernels of such parameterizations of q-Coloring, complementing Cai's study of the time complexity with respect to these parameters. Our results show that the existence of polynomial kernels for q-Coloring parameterized by the vertex-deletion distance to a graph class F is strongly related to the existence of a function f(q) which bounds the number of vertices needed to preserve the NO-answer to an instance of q-List-Coloring on F.; Comment: Author-accepted manuscript of the article that will appear in the FCT 2011 special issue of Information & Computation

The JCMT Gould Belt Survey: A Quantitative Comparison Between SCUBA-2 Data Reduction Methods

Mairs, S.; Johnstone, D.; Kirk, H.; Graves, S.; Buckle, J.; Beaulieu, S. F.; Berry, D. S.; Broekhoven-Fiene, H.; Currie, M. J.; Fich, M.; Hatchell, J.; Jenness, T.; Mottram, J. C.; Nutter, D.; Pattle, K.; Pineda, J. E.; Salji, C.; Di Francesco, J.; Hogerh
Source: Cornell University. Publisher: Cornell University
Type: Journal article
Published: 21/09/2015
Search relevance: 45.84%
Performing ground-based submillimetre observations is a difficult task, as the measurements are subject to absorption and emission from water vapour in the Earth's atmosphere and to time variations in weather and instrument stability. Removing these features and other artifacts from the data is a vital process which affects the characteristics of the recovered astronomical structure we seek to study. In this paper, we explore two data reduction methods for data taken with the Submillimetre Common-User Bolometer Array-2 (SCUBA-2) at the James Clerk Maxwell Telescope (JCMT). The JCMT Legacy Reduction 1 (JCMT LR1) and the Gould Belt Legacy Survey Legacy Release 1 (GBS LR1) reductions both use the same software, Starlink, but differ in their choice of data reduction parameters. We find that the JCMT LR1 reduction is suitable for determining whether or not compact emission is present in a given region, while the GBS LR1 reduction is tuned in a robust way to uncover more extended emission, which better serves more in-depth physical analyses of star-forming regions. Using the GBS LR1 method, we find that compact sources are recovered well, even at a peak brightness of only 3 times the noise, whereas the reconstruction of larger objects requires much care when drawing boundaries around the expected astronomical signal in the data reduction process. Incorrect boundaries can lead to false structure identification or cause structure to be missed. In the JCMT LR1 reduction...

Towards a Data Reduction for the Minimum Flip Supertree Problem

Böcker, Sebastian
Source: Cornell University. Publisher: Cornell University
Type: Journal article
Published: 22/04/2011
Search relevance: 55.85%
In computational phylogenetics, the problem of constructing a supertree of a given set of rooted input trees can be formalized in different ways, to cope with contradictory information in the input. We consider the Minimum Flip Supertree problem, where the input trees are transformed into a 0/1/?-matrix, such that each row represents a taxon, and each column represents an inner node of one of the input trees. Our goal is to find a perfect phylogeny for the input matrix requiring a minimum number of 0/1-flips, that is, corrections of 0/1-entries in the matrix. The problem is known to be NP-complete. Here, we present a parameterized data reduction with polynomial running time. The data reduction guarantees that the reduced instance has a solution if and only if the original instance has a solution. We then make our data reduction parameter-independent by using upper bounds. This allows us to preprocess an instance, and to solve the reduced instance with an arbitrary method. Different from an existing data reduction for the consensus tree problem, our reduction allows us to draw conclusions about certain entries in the matrix. We have implemented and evaluated our data reduction. Unfortunately, we find that the Minimum Flip Supertree problem is also hard in practice: The amount of information that can be derived during data reduction diminishes as instances get more "complicated"...
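
For readers unfamiliar with the target condition: a binary matrix admits a (rooted) perfect phylogeny exactly when the 1-sets of every two columns are disjoint or nested, i.e. no column pair exhibits all three row patterns (1,1), (1,0) and (0,1). The checker below illustrates what the flips must repair; it is the classical test for the fully resolved 0/1 case, our illustration rather than the paper's data reduction, and it ignores '?' entries.

from itertools import combinations

def is_perfect_phylogeny(matrix):
    # Rows are taxa, columns are characters (inner nodes of the input trees).
    cols = list(zip(*matrix))
    for a, b in combinations(cols, 2):
        pats = {(x, y) for x, y in zip(a, b) if (x, y) != (0, 0)}
        if {(1, 1), (1, 0), (0, 1)} <= pats:
            return False   # conflicting pair: some entry must be flipped
    return True

m = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1]]                      # every column pair conflicts
print(is_perfect_phylogeny(m))       # False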

Polynomial Time Data Reduction for Dominating Set

Alber, Jochen; Fellows, Michael R.; Niedermeier, Rolf
Source: Cornell University. Publisher: Cornell University
Type: Journal article
Published: 16/07/2002
Search relevance: 55.77%
Dealing with the NP-complete Dominating Set problem on undirected graphs, we demonstrate the power of data reduction by preprocessing from a theoretical as well as a practical side. In particular, we prove that Dominating Set restricted to planar graphs has a so-called problem kernel of linear size, achieved by two simple and easy-to-implement reduction rules. Moreover, having implemented our reduction rules, first experiments indicate the impressive practical potential of these rules. This work thus seems to open up a promising new way to cope with one of the most important problems in graph theory and combinatorial optimization.; Comment: 25 pages, 4 figures (using 8 files), extended abstract entitled "Efficient Data Reduction for Dominating Set: A Linear Problem Kernel for the Planar Case" appeared in the Proceedings of the 8th SWAT 2002, LNCS 2368, pages 150-159, Springer-Verlag, 2002
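
The two rules are not reproduced in this record, but the neighbourhood partition behind the single-vertex rule is easy to sketch. In the version below (our Python rendering, with a dict-of-sets graph), N1 holds neighbours of v with edges leaving the closed neighbourhood N[v], N2 the remaining neighbours adjacent to N1, and N3 the rest; when N3 is non-empty, v can be committed to the dominating set and N2 and N3 removed.

def rule1_partition(adj, v):
    closed = adj[v] | {v}
    n1 = {u for u in adj[v] if adj[u] - closed}    # have edges escaping N[v]
    n2 = {u for u in adj[v] - n1 if adj[u] & n1}   # adjacent to N1, cannot escape
    n3 = adj[v] - n1 - n2                          # dominated only via v
    return n1, n2, n3

# Star with centre 0: all leaves land in N3, forcing 0 into the solution.
adj = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}
print(rule1_partition(adj, 0))   # (set(), set(), {1, 2, 3, 4})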

Gemini Planet Imager Observational Calibrations I: Overview of the GPI Data Reduction Pipeline

Perrin, Marshall D.; Maire, Jérôme; Ingraham, Patrick; Savransky, Dmitry; Millar-Blanchaer, Max; Wolff, Schuyler G.; Ruffio, Jean-Baptiste; Wang, Jason J.; Draper, Zachary H.; Sadakuni, Naru; Marois, Christian; Rajan, Abhijith; Fitzgerald, Michael P.; M
Source: Cornell University. Publisher: Cornell University
Type: Journal article
Published: 08/07/2014
Search relevance: 45.83%
The Gemini Planet Imager (GPI) has as its science instrument an infrared integral field spectrograph/polarimeter (IFS). Integral field spectrographs are scientifically powerful but require sophisticated data reduction systems. For GPI to achieve its scientific goals of exoplanet and disk characterization, IFS data must be reconstructed into high-quality, astrometrically and photometrically accurate datacubes in both spectral and polarization modes, via flexible software that is usable by the broad Gemini community. The data reduction pipeline developed by the GPI instrument team to meet these needs is now publicly available following GPI's commissioning. This paper, the first of a series, provides a broad overview of GPI data reduction, summarizes key steps, and presents the overall software framework and implementation. Subsequent papers describe in more detail the algorithms necessary for calibrating GPI data. The GPI data reduction pipeline is open source, available from planetimager.org, and will continue to be enhanced throughout the life of the instrument. It implements an extensive suite of task primitives that can be assembled into reduction recipes to produce calibrated datasets ready for scientific analysis. Angular, spectral...
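
The "primitives assembled into recipes" design reduces to a simple pattern: each primitive maps a dataset to a dataset, and a recipe is an ordered list of them. The sketch below shows the pattern only; the primitive names are invented, and the real primitives and recipes are documented at planetimager.org.

def subtract_dark(data):    return {**data, "dark_removed": True}
def extract_datacube(data): return {**data, "cube": True}
def calibrate_flux(data):   return {**data, "flux_calibrated": True}

recipe = [subtract_dark, extract_datacube, calibrate_flux]

def run_recipe(data, recipe):
    for primitive in recipe:
        data = primitive(data)   # each step consumes the previous result
    return data

print(run_recipe({"raw": "IFS frame"}, recipe))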

"Advanced" data reduction for the AMBER instrument

Millour, Florentin; Valat, Bruno; Petrov, Romain; Vannier, Martin
Source: Cornell University. Publisher: Cornell University
Type: Journal article
Published: 02/07/2008
Search relevance: 45.8%
The amdlib AMBER data reduction software is meant to produce AMBER data products from the raw data files that are sent to the PIs of different proposals or that can be found in the ESO data archive. The calibration scheme defined by ESO is to calibrate one science data file with one calibration file, observed as close in time as possible. This scheme, as implemented in the current AMBER data processing software, amdlib, therefore does not take into account instrumental drifts, atmospheric variations or visibility-loss corrections. In this article, we present our approach to complement this default calibration scheme, to perform the final steps of data reduction, and to produce fully calibrated AMBER data products. These additional steps include: an overnight view of the data structure and data quality, the production of night transfer functions from the calibration stars observed during the night, the correction of additional effects not taken into account in the standard AMBER data reduction software, such as the so-called "jitter" effect and the visibility spectral coherence loss, and, finally, the production of fully calibrated data products. All these new features are being implemented in the modular pipeline script amdlibPipeline, written to complement the amdlib software.; Comment: 10 pages...
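
The transfer-function step lends itself to a short sketch: measure the instrument-plus-atmosphere response on calibrators through the night, interpolate it to the time of each science observation, and divide it out. The Python below illustrates that idea only; amdlibPipeline itself is a separate script, and the numbers are invented.

import numpy as np

def calibrate_v2(t_sci, v2_sci, t_cal, tf_cal):
    # tf_cal: squared visibilities measured on calibrators divided by their
    # predicted values, i.e. samples of the night's transfer function.
    tf_at_sci = np.interp(t_sci, t_cal, tf_cal)
    return v2_sci / tf_at_sci

t_cal = np.array([0.0, 2.0, 4.0])       # hours since start of night
tf_cal = np.array([0.80, 0.75, 0.78])   # transfer function samples
print(calibrate_v2(1.0, 0.39, t_cal, tf_cal))   # ~0.50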

Decentralized Data Reduction with Quantization Constraints

Xu, Ge; Zhu, Shengyu; Chen, Biao
Source: Cornell University. Publisher: Cornell University
Type: Journal article
Search relevance: 45.82%
A guiding principle for data reduction in statistical inference is the sufficiency principle. This paper extends the classical sufficiency principle to decentralized inference, i.e., settings where data reduction needs to be achieved in a decentralized manner. We examine the notions of local and global sufficient statistics and the relationship between the two for decentralized inference under different observation models. We then consider the impact of quantization on decentralized data reduction, which is often needed when communications among sensors are subject to finite capacity constraints. The central question we intend to ask is: if each node in a decentralized inference system has to summarize its data using a finite number of bits, is it still optimal to implement data reduction using global sufficient statistics prior to quantization? We show that the answer is negative using a simple example and proceed to identify conditions under which sufficiency-based data reduction followed by quantization is indeed optimal. These include the well-known case when the data at decentralized nodes are conditionally independent, as well as a class of problems with conditionally dependent observations that admit conditional independence structure through the introduction of an appropriately chosen hidden variable.; Comment: Revised manuscript submitted to IEEE Transactions on Signal Processing
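
A textbook instance of the conditionally independent case may help (our illustration, not an example from the paper). If node $k$ observes $x_{k,1},\dots,x_{k,n_k}$, i.i.d. $\mathcal{N}(\theta,1)$ given $\theta$, with observations independent across nodes given $\theta$, then

    T_k = \sum_{i=1}^{n_k} x_{k,i}

is locally sufficient at node $k$, and the global sufficient statistic

    T = \sum_{k=1}^{K} T_k

is a function of the local ones, so reducing to $T_k$ at each node before quantization discards nothing the fusion center needs.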

The VVDS data reduction pipeline: introducing VIPGI, the VIMOS Interactive Pipeline and Graphical Interface

Scodeggio, M.; Franzetti, P.; Garilli, B.; Zanichelli, A.; Paltani, S.; Maccagni, D.; Bottini, D.; Brun, V. Le; Contini, T.; Scaramella, R.; Adami, C.; Bardelli, S.; Zucca, E.; Tresse, L.; Ilbert, O.; Foucaud, S.; Iovino, A.; Merighi, R.; Zamorani, G.; Ga
Source: Cornell University. Publisher: Cornell University
Type: Journal article
Published: 10/09/2004
Search relevance: 45.87%
The VIMOS VLT Deep Survey (VVDS), designed to measure 150,000 galaxy redshifts, requires a dedicated data reduction and analysis pipeline to process the large amount of spectroscopic data being produced in a timely fashion. This requirement has led to the development of the VIMOS Interactive Pipeline and Graphical Interface (VIPGI), a new software package designed to greatly simplify the task of reducing astronomical data obtained with VIMOS, the imaging spectrograph built by the VIRMOS Consortium for the European Southern Observatory and mounted on Unit 3 (Melipal) of the Very Large Telescope (VLT) at Paranal Observatory (Chile). VIPGI provides the astronomer with specially designed VIMOS data reduction functions, a VIMOS-centric data organizer, and dedicated data browsing and plotting tools that can be used to verify the quality and accuracy of the various stages of the data reduction process. The quality and accuracy of the data reduction pipeline are comparable to those obtained using well-known IRAF tasks, but the speed of the data reduction process is significantly increased, thanks to the large set of dedicated features. In this paper we discuss the details of the MOS data reduction pipeline implemented in VIPGI...

Automated data reduction workflows for astronomy

Freudling, W.; Romaniello, M.; Bramich, D. M.; Ballester, P.; Forchi, V.; Garcia-Dablo, C. E.; Moehler, S.; Neeser, M. J.
Source: Cornell University. Publisher: Cornell University
Type: Journal article
Published: 21/11/2013
Search relevance: 45.85%
Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts in any specific instrument. The efficiency of data reduction can be improved by using automatic workflows to organise data and execute the sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and that has facilities for inspection of, and interaction with, the data. The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable, flowchart-like representation of the data reduction process. Key features of Reflex are a rule-based data organiser, infrastructure to re-use results, thorough book-keeping, data progeny tracking, interactive user interfaces, and a novel concept to exploit information created during data organisation for the workflow execution. Reflex includes novel concepts to increase the efficiency of astronomical data processing. While Reflex is a specific implementation of astronomical scientific workflows within the Kepler workflow engine...
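
The "rule-based data organiser" is the piece most easily illustrated: classify incoming files by header keywords and route each class to the matching workflow step. The sketch below shows the pattern with invented keywords and categories; Reflex's actual classification rules are defined per instrument pipeline.

RULES = [
    (lambda h: h.get("OBSTYPE") == "BIAS",    "master_bias"),
    (lambda h: h.get("OBSTYPE") == "FLAT",    "master_flat"),
    (lambda h: h.get("OBSTYPE") == "SCIENCE", "science_reduction"),
]

def organise(headers):
    # headers: {filename: FITS-header-like dict}; returns a reduction plan.
    plan = {}
    for name, hdr in headers.items():
        for matches, target in RULES:
            if matches(hdr):
                plan.setdefault(target, []).append(name)
                break
    return plan

print(organise({"a.fits": {"OBSTYPE": "BIAS"},
                "b.fits": {"OBSTYPE": "SCIENCE"}}))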

Keck Interferometer Nuller Data Reduction and On-Sky Performance

Colavita, M. M.; Serabyn, E.; Millan-Gabet, R.; Koresko, C. D.; Akeson, R. L.; Booth, A. J.; Mennesson, B. P.; Ragland, S. D.; Appleby, E. C.; Berkey, B. C.; Cooper, A.; Crawford, S. L.; Creech-Eakman, M. J.; Dahl, W.; Felizardo, C.; Garcia-Gathright, J.
Source: Astronomical Society of the Pacific. Publisher: Astronomical Society of the Pacific
Type: Article; peer-reviewed. Format: application/pdf
Published: 10/2009
Search relevance: 45.81%
We describe the Keck Interferometer nuller theory of operation, data reduction, and on-sky performance, particularly as it applies to the nuller exozodiacal dust key science program that was carried out between 2008 February and 2009 January. We review the nuller implementation, including the detailed phasor processing involved in implementing the null-peak mode used for science data and the sequencing used for science observing. We then describe the Level 1 reduction to convert the instrument telemetry streams to raw null leakages, and the Level 2 reduction to provide calibrated null leakages. The Level 1 reduction uses conservative, primarily linear processing, implemented consistently for science and calibrator stars. The Level 2 processing is more flexible, and uses diameters for the calibrator stars measured contemporaneously with the interferometer’s K-band cophasing system in order to provide the requisite accuracy. Using the key science data set of 462 total scans, we assess the instrument performance for sensitivity and systematic error. At 2.0 Jy we achieve a photometrically-limited null leakage uncertainty of 0.25% rms per 10 minutes of integration time in our broadband channel. From analysis of the Level 2 reductions...
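
For context on why contemporaneous calibrator diameters are needed: even a perfect nuller leaks light from a star of finite angular size. The standard leading-order estimate for a Bracewell-type nuller observing a uniform disk of angular diameter $\theta_{\mathrm{UD}}$ on baseline $B$ at wavelength $\lambda$ is

    N_\star \simeq \left( \frac{\pi B \theta_{\mathrm{UD}}}{4\lambda} \right)^2 ,

so measured diameters convert directly into each calibrator's expected astrophysical null. This is textbook background, not a formula quoted from the paper.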

The ACS Virgo Cluster Survey IV: Data reduction procedures for surface brightness fluctuation measurements with the Advanced Camera for Surveys

Mei, Simona; Blakeslee, John; Tonry, John; Jordan, Andres; Peng, Eric; Cote, Patrick; Ferrarese, Laura; Merritt, David; Milosavljevic, Milos; West, Michael
Source: Astrophysical Journal Supplement Series. Publisher: Astrophysical Journal Supplement Series
Type: Journal article. Format: 469999 bytes; application/pdf
Language: EN_US
Search relevance: 55.64%
The Advanced Camera for Surveys (ACS) Virgo Cluster Survey is a large program to image 100 early-type Virgo galaxies using the F475W and F850LP bandpasses of the Wide Field Channel of the ACS instrument on the Hubble Space Telescope (HST). The scientific goals of this survey include an exploration of the three-dimensional structure of the Virgo Cluster and a critical examination of the usefulness of the globular cluster luminosity function as a distance indicator. Both of these issues require accurate distances for the full sample of 100 program galaxies. In this paper, we describe our data reduction procedures and examine the feasibility of accurate distance measurements using the method of surface brightness fluctuations (SBF) applied to the ACS Virgo Cluster Survey F850LP imaging. The ACS exhibits significant geometrical distortions due to its off-axis location in the HST focal plane; correcting for these distortions by resampling the pixel values onto an undistorted frame results in pixel correlations that depend on the nature of the interpolation kernel used for the resampling. This poses a major challenge for the SBF technique, which normally assumes a flat power spectrum for the noise. We investigate a number of different interpolation kernels and show, through an analysis of simulated galaxy images having realistic noise properties, that it is possible...
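
For context, the SBF measurement fits the azimuthally averaged power spectrum of the galaxy-subtracted, normalised image as

    P(k) = P_0 \, E(k) + P_1 ,

where $E(k)$ is the expectation power spectrum set by the PSF and the mask window, $P_0$ is the fluctuation signal, and $P_1$ is a noise floor normally assumed white (Tonry & Schneider 1988). Resampling-induced pixel correlations make that noise term non-white, which is precisely the complication described above.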

Analysis of H/W & S/W techniques for data reduction in high speed digital image processing

DeSanctis, Paul
Source: Rochester Institute of Technology. Publisher: Rochester Institute of Technology
Type: Doctoral thesis
Language: EN_US
Search relevance: 65.85%
With the widespread utilization of charge-coupled devices, there is much interest in methods to efficiently process images. The processing, manipulation, and storage of photographic-quality digital images place significant demands on today's computers. Even with today's high-performance bus structures and real-time operating systems, manipulating full-resolution image data may quickly overwhelm computer hardware and software. In response to this, data reduction techniques have been developed to aid in resolving this problem. Two common data reduction techniques are data sub-sampling and data averaging. The data sub-sampling approach is simplistic in nature and perhaps the easiest to implement in hardware and/or software. This approach involves sub-sampling the full-resolution image data to a lower resolution. Selection of the sub-sampled elements of the full-resolution image is random in nature. This random selection makes sub-sampling an effective technique for flat image fields but degrades or softens edge information. The data averaging approach is more difficult and complex to implement in both hardware and software than the sub-sampling approach. It involves a two-dimensional averaging function to sub-sample the full-resolution image data to a lower resolution. Averaging area parameters may be chosen to average X consecutive pixels...
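
Both techniques reduce to a few lines of numpy, which makes the trade-off concrete. Note one simplification: the thesis describes random selection of the sub-sampled elements, while the sketch below uses a regular stride for brevity.

import numpy as np

def subsample(img, f):
    # Keep every f-th pixel: cheap, but aliases edges.
    return img[::f, ::f]

def block_average(img, f):
    # Average f x f blocks: costlier, but low-pass filters before decimating.
    h, w = img.shape
    img = img[: h - h % f, : w - w % f]
    return img.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

img = np.arange(64, dtype=float).reshape(8, 8)
print(subsample(img, 2).shape, block_average(img, 2).shape)  # (4, 4) (4, 4)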

The ACS Fornax Cluster Survey. I. Introduction to the survey and data reduction procedures

Jordan, Andres; Blakeslee, John; Cote, Patrick; Ferrarese, Laura; Infante, Leopoldo; Mei, Simona; Merritt, David; Peng, Eric; Tonry, John; West, Michael
Source: University of Chicago Press: Astrophysical Journal Supplement Series. Publisher: University of Chicago Press: Astrophysical Journal Supplement Series
Type: Journal article
Language: EN_US
Search relevance: 55.64%
The Fornax Cluster is a conspicuous cluster of galaxies in the southern hemisphere and the second largest collection of early-type galaxies within ≲ 20 Mpc after the Virgo Cluster. In this paper, we present a brief introduction to the ACS Fornax Cluster Survey — a program to image, in the F475W (g475) and F850LP (z850) bandpasses, 43 early-type galaxies in Fornax using the Advanced Camera for Surveys (ACS) on the Hubble Space Telescope. Combined with a companion survey of Virgo, the ACS Virgo Cluster Survey, this represents the most comprehensive imaging survey to date of early-type galaxies in cluster environments in terms of depth, spatial resolution, sample size and homogeneity. We describe the selection of the program galaxies, their basic properties, and the main science objectives of the survey, which include the measurement of luminosities, colors and structural parameters for globular clusters associated with these galaxies, an analysis of their isophotal properties and surface brightness profiles, and an accurate calibration of the surface brightness fluctuation distance indicator. Finally, we discuss the data reduction procedures adopted for the survey.; Also archived in: arXiv:astro-ph/0702320 v1 Feb 13 2007

The ACS Virgo Cluster Survey II. Data reduction procedures

Jordan, Andres; Blakeslee, John; Peng, Eric; Mei, Simona; Cote, Patrick; Ferrarese, Laura; Tonry, John; Merritt, David; Milosavljevic, Milos; West, Michael
Source: Astrophysical Journal Supplement Series. Publisher: Astrophysical Journal Supplement Series
Type: Journal article. Format: 587836 bytes; application/pdf
Language: EN_US
Search relevance: 65.69%
The ACS Virgo Cluster Survey is a large program to carry out multi-color imaging of 100 early-type members of the Virgo Cluster using the Advanced Camera for Surveys (ACS) on the Hubble Space Telescope. Deep F475W and F850LP images (≈ SDSS g and z) are being used to study the central regions of the program galaxies, their globular cluster systems, and the three-dimensional structure of Virgo itself. In this paper, we describe in detail the data reduction procedures used for the survey, including image registration, drizzling strategies, the computation of weight images, object detection, the identification of globular cluster candidates, and the measurement of their photometric and structural parameters.; Also archived in: arXiv: astro-ph/0406433 v1 18 Jun 2004; A.J. extends his thanks to the Oxford Astrophysics Department for their hospitality. Support for program GO-9401 was provided through a grant from the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555. A.J. acknowledges additional financial support provided by the National Science Foundation through a grant from the Association of Universities for Research in Astronomy...

UPb.age, a fast data reduction script for LA-ICP-MS U-Pb geochronology

Solari, Luigi A.; Tanner, Martin
Source: Instituto de Geología, UNAM. Publisher: Instituto de Geología, UNAM
Type: Journal article. Format: text/html
Published: 01/04/2011. Language: EN
Search relevance: 55.73%
Laser ablation-inductively coupled plasma mass spectrometry (LA-ICP-MS) is a rapidly growing technique that allows many different types of in-situ microanalyses to be performed on geological materials. One of the most used methodologies is U-Pb isotopic dating of accessory minerals such as zircon. These analyses can be performed at the scale of tens of micrometers, in a rapid, cost-effective way and with very good precision and accuracy. It is critical, however, to perform the data reduction in a fast, transparent and customizable way that takes into account the specific analytical procedures employed in various laboratories and the outputs of different instruments. UPb.age is a freely available data reduction script, written and developed in R, a free statistical environment. The software can read, correct and reduce U-Pb isotopic data obtained with several LA-ICP-MS instruments. Its main strengths are transparency, robustness, speed, and the ability to be readily customized and adapted to the specific analytical procedures used in different laboratories.
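
UPb.age itself is written in R; purely to illustrate the core computation every U-Pb reduction script performs after its drift and background corrections (not the script's own code), here is the 206Pb/238U age equation in Python:

import numpy as np

LAMBDA_238 = 1.55125e-10   # 238U decay constant in 1/yr (Jaffey et al. 1971)

def pb206_u238_age(ratio):
    # Invert D/P = exp(lambda * t) - 1 for the age t.
    return np.log(1.0 + ratio) / LAMBDA_238

print(pb206_u238_age(0.0169) / 1e6)   # ~108 Myr for 206Pb*/238U = 0.0169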