POSTER PROGRAM

Poster Abstracts

Name Organization Title
Anastasia Alexov University of Amsterdam, Astronomical Institute Anton Pannekoek (API) The LOFAR Pulsar Data Pipeline PDF
Christophe Joel Barache Observatoire de Paris, Laboratoire Syrte The construction of the Large Quasar Astrometric Catalogue (LQAC) PDF
Paul E Barrett US Naval Observatory From Start to Finish: Python for Space Missions PDF
Monica Fernandez-Barreiro Science Archives Team (SAT) - ESAC/ESA ESA New Generation Science Archives: New technologies applied to Graphical User Interface creation PDF
Benjamin Robert Barsdell Swinburne University, Centre for Astrophysics and Supercomputing Advancing Computational Astronomy on Advanced Architectures PDF
Stephane Beland University of Colorado, Center for Astrophysics and Space Astronomy Preliminary Wavelength Calibration for Cosmic Origins Spectrograph
David Stuart Berry Joint Astronomy Centre, Hawaii Using the AST library to create and use STC-S region descriptions PDF
Juergen Berwein Max Planck Institute for Astronomy Rapid SOA frontend design and prototyping for LINC-NIRVANA
Thomas Boch CDS, Observatoire de Strasbourg The CDS Portal, a unified way to access CDS services PDF
Florian Briegel Max Planck Institute for Astronomy Management of astronomical software projects with open source tools. PDF
C. M. Hubert Chen California Institute of Technology A distributed, real-time data monitoring system as ground support equipment for balloon-borne astronomy experiments PDF
Fabien Chereau ESO A meta-data layer for astronomical archives PDF
Andre Csillaghy U. of Applied Sciences North Western Switzerland, Inst. of 4D Technologies The Heliophysics Integrated Observatory HELIO PDF
Lindsey E. Davis National Radio Astronomy Observatory The ALMA Pipeline Heuristics Package Interface PDF
Lander de Bilbao ESO / FECYT Multithreading for ESO Pipelines PDF
Arancha Delgado European Southern Observatory Instrumental Provenance of ESO Archival Data PDF
Rosa I Diaz Space Telescope Science Institute The HST Exposure Time Calculators: Estimating accurate observing times for HST Observations PDF
Rick Ebert California Institute of Technology, NED NED Spectra Data Service
Alessandro E. Ederoclite Instituto de Astrofisica de Canarias The Data Reduction System for GTC/OSIRIS. PDF
Satoshi Eguchi Kyoto University, Department of Astronomy Development of Image Analysis Software of MAXI PDF
Michael J Fitzpatrick National Optical Astronomy Observatory DTS: The NOAO Data Transport System. PDF
Niall Gaffney Space Telescope Science Institute History and future of the STScI DADS archive system. PDF
Erik Edward Gottschalk Fermilab A Concept for JDEM Science Computing and Operations PDF
Gretchen R Greene STScI HLA Footprints for Multi-Purpose Science PDF
Michael Allan Kalmar Gross Universities Space Research Association Pointing the SOFIA Telescope
Jonas Haase ST-ECF HST Cache: Update
Kirill A. Halin Stavropol State University, Department of Physics The study of the mechanism of cumulative generation of streams for the model of active astronomical object. PDF
Amr Hassan Centre for Astrophysics and Supercomputing, Swinburne University of Technology, Melbourne, Australia GPU-Based Volume Rendering of Noisy Multi-Spectral Astronomical Data PDF
Arturo A. Hoffstadt Universidad Tecnica Federico Santa Maria, Computer Systems Research Group Reusable state machine code generator PDF
Mark S Holliman University of Edinburgh, Institute for Astronomy, Wide Field Astronomy Unit Virtual Observatory Services at WFAU PDF
Wolfgang Hovest Max-Planck-Institute for Astrophysics Planck Surveyor Mission: Methods for Optimizing the Data Analysis with the Software Infrastructure ProC
Aitor L Ibarra XMM-Newton SOC. ESAC/ESA XMM-Newton Science Analysis Software: Further development and maintenance... and also thinking about the future. PDF
Norio Ikeda Institute of Space and Astronautical Science/Japan Aerospace Exploration Agency Cube FITS Analyzer FAZZ in IDL PDF
Ryoji Ishiwata Nihon University, Department of Physics MAXI Nova Search and Alert System
Emmanuel Joliet ESAC/ESA A new generic way to define astrometric calibration for Gaia data processing. PDF
Mitsuhiro Kohama Cosmic Radiation lab. RIKEN / MAXI ISS ISAS JAXA How to get MAXI data from http://maxi.riken.jp?
Martin Kuemmel Space Telescope - European Coordinating Facility aXeTwoZero: The next generation of aXe PDF
Uwe Lammers European Space Agency Faster, better, cheaper: News on seeking Gaia's Astrometric Solution with AGIS PDF
Andrea Laruelo ESAC ESA Archives and VO tools: without frontiers. PDF
Ignacio Leon Science Archives Team (SAT) - ESAC/ESA ESA New Generation Science Archives: State of the art data management techniques for SOHO and EXOSAT Science Archives PDF
Lan Lin Universities Space Research Association An End-to-End Solution for Archiving, Monitoring, Retrieval, and Post-Processing Archive Files for SOFIA PDF
Joao S. Lopez Universidad Tecnica Federico Santa Maria A Reference Architecture Specification of a Generic Telescope Control System PDF
Sin'itirou Makiuti Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency Data processing for AKARI Far-Infrared All-Sky Catalogue
Areg Martin Mickaelian Byurakan Astrophysical Observatory (BAO) Spectra extraction and analysis software for the Digitized First Byurakan Survey (DFBS) and research projects PDF
Arik William Mitschang SAO TGCat, The Chandra Transmission Grating Catalog and Archive PDF
Marco Molinaro INAF - Trieste Astronomical Observatory VO compliant visualization of theoretical data PDF
Koh-Ichiro Morita National Astronomical Observatory Japan Optimizing Spatial Frequency Data Weights for High Precision Imaging with ALMA
Eric Hildau Neilsen Fermilab A Prototype Data Processing System for JDEM PDF
Jon G. Nielsen Australian National University, Mount Stromlo Observatory MSOTCS: A new Telescope Control System for the Australian National University's 2.3m telescope at Siding Spring Observatory PDF
Se-Jin Oh Korea Astronomy and Space Science Institute High-speed Korea-Japan Joint VLBI Correlator (KJJVC) development and its current progress PDF
Luigi Paioro INAF-IASF Milano Toward a reference implementation of a standardized astronomical software environment. PDF
Sunyoup Park Korea Astronomy & Space Science Institute (KASI) Development of the software for high-speed data transfer of the high-speed, large-capacity data archive system for the storage of the correlation data from the Korea-Japan Joint VLBI Correlator (KJJVC)
Sergio Pascual Madrid Complutense University, Astrophysics Department Data Reduction Pipeline for EMIR, the GTC Near-IR Multi-Object Spectrograph PDF
Fabio Pasian INAF Integrated e-infrastructures for astrophysics PDF
Alexey Pavlov MPIA LOPS - towards a science driven observation preparation tool
William D. Pence NASA/GSFC Optimal Compression Methods for Floating-point Format Images PDF
Francois-Xavier Pineau Observatoire Astronomique de Strasbourg Efficiencies of various classification methods applied to XMM-Newton sources PDF
Nor Pirzkal STScI NICMOS Temperature Dependent Calibration PDF
Daniel Pomarede CEA/IRFU Visualization of ASH Simulations of Stellar MHD with SDvision PDF
Bruno Correa Quint University of Sao Paulo; Institute of Astronomy, Geophysics and Atmosphere Sciences; Department of Astronomy Illusion - A Fabry-Perot Data-Cube Synthesizer. PDF
Frederic Raison ESA/ESAC Implementation of the global parameters determination in Gaia's Astrometric Solution (AGIS). PDF
Johnny William Reveco Associated Universities, Inc. (AUI) Extending the device support for the ALMA Control subsystem code generation framework
Tsuyoshi Sakamoto Japan Spaceguard Association A fast asteroid detection algorithm PDF
Juande Santander Vela European Southern Observatory Data Provenance: Use Cases for the ESO archive, and interactions with the Virtual Observatory PDF
Yuji Shirasaki NAOJ, ADC Environment Study of AGNs at z = 0.3 to 3.0 using the Japanese Virtual Observatory PDF
Dan Starr UC Berkeley, Department of Astronomy Real-Time Discovery and Classification of Sparsely Sampled Science Using Berkeley's Transient Classification Pipeline PDF
Ian M. Stewart University of Cape Town, Department of Astronomy. Parallel CLEAN: beyond the frequency domain. PDF
Elizabeth B Stobie National Optical Astronomy Observatory User Support in the Virtual Astronomical Observatory
Felix Stoehr ST-ECF/ESO The HST Cache - Metadata and Services PDF
Ole Streicher Astrophysikalisches Institut Potsdam The next generation MUSE 3D spectroscopy visualization and analysis tool
Kanako Sugimoto NAOJ Single dish observation simulator in CASA
Shigeru Takahashi Nobeyama Radio Observatory, National Astronomical Observatory of Japan Doppler Shift Correction for 2SB Receivers of the 45m Telescope at the Nobeyama Radio Observatory PDF
Satoshi Takita ISAS/JAXA Development of AKARI reduction tools for IRC slow-scan
Takayuki Tamura ISAS/JAXA Data Archive and Transmission System (DARTS) of ISAS/JAXA
Masahiro Tanaka University of Tsukuba, Center for Computational Sciences Impact of Gfarm, a Wide-area Distributed File System, upon Astronomical Data Analysis and Virtual Observatory. PDF
Harry Isaac Teplitz IRSA Building the Spitzer Source List PDF
Rodrigo Javier Tobar Universidad Tecnica Federico Santa Maria Adding support to ACS for Real-Time operations through the usage of a POSIX-compliant RTOS PDF
Albert Torrent University of Girona - ESQ-6750002-E A Boosting approach for the detection of faint compact sources in wide field aperture synthesis radio images PDF
Tomofumi Umemoto National Astronomical Observatory of Japan Concept of VSOP-2 Science Operation Center (SOC: tentative)
Vladimir V. Vitkovskiy Special Astrophysical Observatory of RAS, Informatics Department 6D visualization of multidimensional data by means of cognitive technology PDF
Marc Wenger Strasbourg Observatory, C.D.S. Data mining in the SIMBAD database log files PDF
Andreas Wicenec European Southern Observatory The ALMA Front-end Archive Setup and Performance PDF
Tom Winegar Subaru Telescope - NAOJ STARS 2 - 2nd generation open-source archiving and query software for the Subaru Telescope PDF
Sherry L Winkelman Smithsonian Astrophysical Observatory BibCat: The Chandra Data Archive Bibliography Cataloging System PDF
Masafumi YAGI National Astronomical Observatory of Japan An algorithm of refinement of image alignment for image subtraction PDF
Chisato Yamauchi Japan Aerospace Exploration Agency, Center of Science-satellite Operation and Data Archive SFITSIO -- A next-generation FITS I/O library for C/C++ users PDF
Honglin Ye National Radio Astronomy Observatory A Simple Implementation of a 3D Data Cube viewer PDF
Mauricio Alejandro Zambrano Associated Universities, Inc. Experiences Virtualizing ALMA Software PDF
Nelson R Zarate Gemini Observatory NICI Python Data Reduction PDF
Ivan Zolotukhin Sternberg Astronomical Institute, Moscow State University Grown up with the VO / The VO-powered PhD thesis PDF

The LOFAR Pulsar Data Pipeline
Anastasia Alexov University of Amsterdam, Astronomical Institute Anton Pannekoek (API)
The LOw Frequency ARray (LOFAR) for radio astronomy is being built by ASTRON in The Netherlands together with several European collaborators. The project is an interferometric array of radio telescopes arranged in clusters spread over an area 350 km in diameter, observing at frequencies below 250 MHz. LOFAR will be a breakthrough in the low-frequency radio astronomy domain. Transient radio phenomena and pulsars form one of the six LOFAR Key Science Projects (KSPs). As part of the Transients KSP, the Pulsar Working Group has been developing the LOFAR Pulsar Data Pipeline both to observe known pulsars and to search for new pulsars in an all-sky survey. The pipeline is being developed for the Blue Gene/P supercomputer and a large Linux cluster in order to exploit their enormous computational capabilities and handle data streams of 23 TB/hour. The pipeline output will use the Hierarchical Data Format 5 (HDF5) to store large amounts of numerical data efficiently. We will present an overview of the LOFAR Pulsar Data Pipeline, the pulsar beam-formed data format, the status of the pipeline processing, and our plans for developing additional transient pipelines.
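
A minimal Python sketch (using h5py and NumPy) illustrates the kind of chunked, compressed HDF5 layout such a pipeline might use; the group, dataset and attribute names below are invented for illustration and do not reproduce the actual LOFAR beam-formed format.

    # Hypothetical sketch: writing beam-formed data to HDF5 with h5py.
    # Group/dataset/attribute names are illustrative, not the LOFAR format.
    import h5py
    import numpy as np

    n_subbands, n_samples = 16, 4096
    data = np.random.normal(size=(n_subbands, n_samples)).astype(np.float32)

    with h5py.File("beamformed.h5", "w") as f:
        beam = f.create_group("Beam000")
        beam.attrs["TELESCOPE"] = "LOFAR"
        beam.attrs["SAMPLING_TIME_S"] = 5.12e-6
        # Chunked, compressed dataset: suited to large streaming writes.
        dset = beam.create_dataset("STOKES_I", data=data,
                                   chunks=(1, 1024), compression="gzip")
        dset.attrs["UNITS"] = "arbitrary"

Chunked, compressed datasets of this kind are what make HDF5 attractive for numerical data at such rates.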


The construction of the Large Quasar Astrometric Catalogue (LQAC)
Christophe Joel Barache Observatoire de Paris, Laboratoire Syrte
We gather the 12 largest quasar catalogues (4 from radio interferometry programs, 8 from optical surveys) and carry out systematic cross-identifications of the objects to obtain their best position estimates and to provide physical information at both optical and radio wavelengths. This catalogue compilation, designated LQAC, gives equatorial coordinates of 113666 quasars with magnitudes in 9 bands, 5 radio fluxes, redshift and absolute magnitude. We made use of VO tools such as Aladin for preliminary studies. For cross-identification, data processing and validation, we made use of two different software packages with the same parameters and strategy: VO Topcat with Stilts, and home-made Fortran programs.
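
The core operation, positional cross-identification, can be sketched in Python with astropy; this is a stand-in for illustration only (the authors used Topcat/Stilts and Fortran), and the coordinates are made up.

    # Toy cross-match of two small quasar lists by sky position.
    import numpy as np
    from astropy import units as u
    from astropy.coordinates import SkyCoord

    cat_a = SkyCoord(ra=np.array([10.68, 150.10]) * u.deg,
                     dec=np.array([41.26, 2.20]) * u.deg)
    cat_b = SkyCoord(ra=np.array([10.6801, 83.80]) * u.deg,
                     dec=np.array([41.2601, -5.40]) * u.deg)

    # For each source in A, find its nearest neighbour in B.
    idx, sep2d, _ = cat_a.match_to_catalog_sky(cat_b)
    matched = sep2d < 1.0 * u.arcsec  # accept pairs closer than 1 arcsec
    print(idx, sep2d.to(u.arcsec), matched)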


From Start to Finish: Python for Space Missions
Paul Barrett US Naval Observatory
The software development process for many space observatories is often disjoint and inefficient due to the use of multiple languages during the different phases of mission development. Code and algorithms that are developed using an interactive array language during the pathfinding efforts of Phase A are often rewritten in a non-interactive, compiled language for use in the production code of Phase C. This approach leads to inefficiency in both development time and cost, and can introduce errors during the rewriting process. Python is one programming language that can be used both as a high-level array language and as an efficient production language. This paper shows how Python will be used during the different phases of development of the Joint Milli-Arcsecond Pathfinder Survey (JMAPS) space mission, with an emphasis on code and algorithm reuse from one phase to the next.


ESA New Generation Science Archives: New technologies applied to Graphical User Interface creation
Monica Fernandez-Barreiro Science Archives Team (SAT) - ESAC/ESA
The Science Archives and VO Team (SAT) has undertaken the effort to build state-of-the-art sub-systems for its new generation of archives. At the time of writing, the new technology has already been applied to the creation of the SOHO and EXOSAT Science Archives and will be used to re-engineer some of the existing ESA Science Archives in the future. The Graphical User Interface sub-system has been designed and developed on the premise of building a lightweight rich-client application to query and retrieve scientific data quickly and efficiently; special attention has been paid to the usability and ergonomics of the interface. The system architecture relies on the Model View Controller pattern, which isolates logic from the graphical interface. Multiple window layout arrangements are possible using a docking-windows framework with virtually no limitations (InfoNode). New graphical components have been developed to fulfil project-specific user requirements; for example, video animations can be generated at runtime from image data requests matching specific search criteria. In addition, interoperability with other tools is achieved for data visualization purposes using internationally approved standards (cf. IVOA SAMP), a messaging protocol already adopted by several analysis tools (ds9, Aladin, Gaia). To avoid the increasingly common network constraints affecting end users' daily work, the system has been designed to cope with restrictive firewall setups; ESA New Generation archives are therefore accessible from any place where standard port-80 HTTP connections are available.
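
The SAMP side of such interoperability might look like the following Python sketch, written with astropy's SAMP client rather than the archive's own code; the FITS URL is a placeholder and a running SAMP hub is assumed.

    # Broadcast a FITS image to all connected SAMP applications
    # (e.g. ds9, Aladin). Assumes a SAMP hub is already running.
    from astropy.samp import SAMPIntegratedClient

    client = SAMPIntegratedClient(name="archive-gui-sketch")
    client.connect()
    try:
        client.notify_all({
            "samp.mtype": "image.load.fits",
            "samp.params": {
                "url": "http://example.org/archive/obs123.fits",  # placeholder
                "name": "obs123",
            },
        })
    finally:
        client.disconnect()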


Advancing Computational Astronomy on Advanced Architectures
Benjamin Robert Barsdell Swinburne University, Centre for Astrophysics and Supercomputing
Astronomers have come to rely on the increasing performance of computers to reduce, analyse, simulate and visualise their data. In this environment, faster computation can mean more science outcomes or the opening up of new parameter spaces for investigation. If we are to avoid major issues when implementing codes on advanced architectures, it is important that we have a solid understanding of our algorithms. A recent addition to the high-performance computing scene that highlights this point is the graphics processing unit (GPU). Hardware originally designed for speeding up graphics rendering in video games is now achieving speed-ups of O(100) in general-purpose computation, a level of performance that cannot be ignored. We are using a generalised approach, based on the analysis of astronomy algorithms, to identify the optimal problem types and techniques for taking advantage of both current GPU hardware and future developments in computing architectures. Several specific astronomy implementations will be described in addition to the generalised analysis methods.


Preliminary Wavelength Calibration for Cosmic Origins Spectrograph
Stephane Beland University of Colorado, Center for Astrophysics and Space Astronomy
During the Hubble Space Telescope (HST) Servicing Mission Observatory Verification (SMOV), standard stars were observed with the Cosmic Origins Spectrograph (COS) and wavelength solutions were obtained for various gratings and central wavelength settings of both the NUV and FUV detectors. We present preliminary results of these observations and a comparison with the results obtained during the thermal-vacuum tests on the ground before launch.


Using the AST library to create and use STC-S region descriptions
David Berry Joint Astronomy Centre, Hawaii
AST is a general purpose library for manipulating coordinate systems stored within astronomical data in various forms (including FITS-WCS). It now includes support for reading and writing WCS information in the IVOA STC-S format, thus making all the power of AST available for manipulating STC-S regions and coordinate systems. These new facilities have been used within the CUPID clump finding package to allow clump catalogues created by CUPID to be more easily used within a VO context.


Rapid SOA frontend design and prototyping for LINC-NIRVANA
Juergen Berwein Max Planck Institute for Astronomy
LINC-NIRVANA is a German-Italian Fizeau (imaging) interferometer for the Large Binocular Telescope (LBT) on Mt. Graham in Arizona, USA. For laboratory testing and integration, a large number of engineering applications are needed, and the process of engineering, testing and integration has to go hand in hand with agile software development of data display and configuration frontends. We therefore implemented software packages that enable rapid design and prototyping of engineering applications within a service-oriented architecture (SOA) environment. Because only precompiled software is used and the workflow is easy to handle, neither compilation nor programming knowledge is required. We will present the current development status, usage and advantages of our software, which was realized at the Max Planck Institute for Astronomy in Heidelberg, Germany.


The CDS Portal, a unified way to access CDS services
Thomas Boch CDS, Observatoire de Strasbourg
The CDS portal is a newly released Web application which aims at providing a uniform search interface to the CDS services (Simbad, VizieR and Aladin). For a given position or object name, the portal returns a summary of the information and data available in the various services. Following the Virtual Observatory (VO) paradigm of "shifting the results, not the data", we also provide each user with a private virtual storage space where they can save results obtained from Simbad or VizieR, or upload their own local tables. Stored data can later be reused as input to other services, cross-identified or saved in VO-compatible formats. The portal has been built as a lightweight application able to run in any modern browser without the need to install a dedicated plugin. It relies upon the Google Web Toolkit technology, an open source framework for Web applications, which allowed us to reuse or adapt as much as possible of the existing HTTP services.


Management of astronomical software projects with open source tools.
Florian Briegel Max Planck Institute for Astronomy
In this article we offer an innovative approach to managing the software development process with free open source tools: building and automated testing (autotools, http://sources.redhat.com/autobook/); a system to automate the compile/test cycle on a variety of platforms to validate code changes (Bitten, http://bitten.edgewall.org/); virtualization to compile in parallel on various operating system platforms (Xen, http://www.xen.org/); version control and change management (Subversion, http://subversion.tigris.org/); an enhanced wiki and issue tracking system for online documentation and reporting (Trac, http://trac.edgewall.org/); and groupware tools such as a blog, discussions and a calendar (Trac plugins, http://trac-hacks.org/).


A distributed, real-time data monitoring system as ground support equipment for balloon-borne astronomy experiments
C. M. Hubert Chen California Institute of Technology
We present a real-time data-monitoring software suite that we developed for the High Energy Focusing Telescope (HEFT). HEFT was one of the first projects to develop focusing mirrors and detectors for hard X-ray astronomy. We deployed these new technologies on the scientific ballooning platform. In scientific ballooning, we launch the payload above the atmosphere for a continuous duration of about 30 hours, during which we control and communicate with the payload in real time via a two-way radio link. During flight, software commonly known as ground support equipment (GSE) allows experiment conductors to monitor the physical condition of the payload and to display preliminary science data in real time. To this end, GSEs provide capabilities to display tables of frequently updated quantities and their averages, time-series plots, histograms, spectra, and images, all in real time. Unlike previous implementations of GSEs for other experiments, our system is a server-client network that utilises TCP/IP unicast and UDP multicast to enable multiple, concurrent and independent display clients. We wrote most of the code in the platform-independent Java language, and we verified that the software suite works on Linux, Mac OS X and Windows XP. We deployed the software in two flight campaigns of HEFT in 2004 and 2005, for use during on-site calibration, pre-launch practice drills, and an observation flight of 24 hours. This system, and individual ideas from its implementation, can be adapted for use in future experiments requiring sophisticated real-time monitoring and data display.
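
The HEFT GSE itself is written in Java, but the UDP multicast mechanism it relies on can be sketched briefly in Python; the group address and port below are illustrative, not those of the HEFT system.

    # One telemetry sender, many independent display clients.
    import socket
    import struct

    MCAST_GRP, MCAST_PORT = "224.1.1.1", 5007

    def send(payload: bytes) -> None:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
        sock.sendto(payload, (MCAST_GRP, MCAST_PORT))

    def receive() -> bytes:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", MCAST_PORT))
        # Join the multicast group so datagrams reach this client too.
        mreq = struct.pack("4sl", socket.inet_aton(MCAST_GRP),
                           socket.INADDR_ANY)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        data, _addr = sock.recvfrom(65535)
        return data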


A meta-data layer for astronomical archives
Fabien Chereau ESO
In many large observatories, meta-data is spread across a variety of sources (databases, text files, personal knowledge), has varying degrees of quality (in terms of completeness, correctness and precision), and operational constraints abound (one cannot risk locking the database with large queries, and incorrect meta-data cannot always be corrected). Hence we propose the concept of a "meta-data layer" service which will shield the query services from the complexity of the data sources. Based on a simple but carefully defined standard, this service aims to be generic enough to be adopted across the VO community, allowing interoperability of the archives at a level not yet possible in the VO.


The Heliophysics Integrated Observatory HELIO
Andre Csillaghy U. of Applied Sciences North Western Switzerland, Inst. of 4D Technologies
HELIO is a new Europe-wide, FP7-funded distributed network of services that will address the needs of a broad community of researchers in heliophysics. This new research field explores the "Sun-Solar System Connection" and requires the joint exploitation of solar, heliospheric, magnetospheric and ionospheric observations. HELIO will provide the most comprehensive integrated information system in this domain; it will coordinate access to the distributed resources needed by the community, and will provide access to services to mine and analyse the data. HELIO will be designed as a service-oriented architecture. The initial infrastructure will include services based on metadata and data servers deployed by the European Grid of Solar Observations (EGSO). We will extend these to address observations from all the disciplines of heliophysics; differences in the way the domains describe and handle the data will be resolved using semantic mapping techniques. Processing and storage services will allow the user to explore the data and create products that meet stringent standards of interoperability. These capabilities will be orchestrated with the data and metadata services using the Taverna workflow tool. HELIO will address the challenges along the lines of the FP7 I3 activities model: (1) Networking: we will cooperate closely with the community to define new standards for heliophysics and the required capabilities of the HELIO system. (2) Services: we will integrate the services developed by the project and other groups to produce an infrastructure that can easily be extended to satisfy the growing and changing needs of the community. (3) Joint Research: we will develop search tools that span disciplinary boundaries and explore new types of user-friendly interfaces. HELIO will be a key component of a worldwide effort to integrate heliophysics data, and will coordinate closely with international organizations to exploit synergies with complementary domains.


The ALMA Pipeline Heuristics Package Interface
Lindsey E. Davis National Radio Astronomy Observatory
The ALMA (Atacama Large Millimeter Array) Pipeline Heuristics System (PHS) is designed to automatically reduce data taken in standard observing modes and produce standard data products. The PHS is implemented as a set of Python recipes, each consisting of a list of high-level reduction stages or tasks, for processing single dish and interferometry data sets and combinations thereof. These scripts run in the CASA (Common Astronomy Software Applications) data reduction environment. The PHS must be available to pipeline core developers, testers and maintainers, to the pipeline commissioning team, to staff at the ALMA regional centers, and to end users who may wish to execute parts of the PHS on their desktops, resources permitting. To support this diverse user group, the user interface must ensure ease of use, yet support complex processing setups for experts, e.g. parallelization of processing steps by source, time, frequency band, etc., where appropriate. The initial low-level PHS interfaces for single dish and interferometry were developed independently and in parallel, to accommodate the different requirements in each area and to get some basic heuristics algorithms working quickly. Although both interfaces support reduction recipes, they diverged in style and in the degree to which they supported customization. In this poster we describe a new PHS user interface which supports both single dish and interferometry reductions in a more uniform way and facilitates pipeline customization. This interface is targeted at ALMA staff who will develop new and improve existing PHS recipes for the standard ALMA pipeline, and at astronomers who may be interested in developing their own reduction pipelines in CASA. The new user commands correspond to the existing PHS stages but are easier to execute interactively and to tune. Technically, the user commands are implemented as CASA Python tasks, and look to the user exactly the same as core CASA tasks. Lower-level Python commands that can be used to construct new tasks are also available for the expert user.


Multithreading for ESO Pipelines
Lander de Bilbao ESO
The second generation of instruments for ESO's VLT on Paranal will be installed in the near future. These instruments will dramatically increase the volume of raw data per night, which in turn leads to very high computational needs that can be addressed, up to some level, by existing multi-core, shared-memory computers. To fully utilize these multi-processor systems we need a programming environment supporting multi-threaded execution of applications. Such parallel execution needs to be introduced at the level of pipeline recipes as well as within the Common Pipeline Library (CPL), on which all operational VLT pipelines of ESO are based. We describe our approach to providing such a new, multithreaded pipeline setup and evaluate possible implementation solutions with some performance measurements.
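
CPL and the ESO pipeline recipes are C code, so the following Python fragment is only an illustration of the recipe-level pattern being described: independent raw frames processed in parallel on a shared-memory machine.

    # Illustrative frame-parallel processing; reduce_frame is a placeholder
    # for per-frame work (bias subtraction, flat fielding, ...).
    from concurrent.futures import ProcessPoolExecutor

    def reduce_frame(filename):
        return filename + ".reduced"

    raw_frames = ["raw_%03d.fits" % i for i in range(8)]  # hypothetical inputs

    if __name__ == "__main__":
        with ProcessPoolExecutor(max_workers=4) as pool:
            reduced = list(pool.map(reduce_frame, raw_frames))
        print(reduced)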


Instrumental Provenance of ESO Archival Data
Arancha Delgado European Southern Observatory
The ESO Archive contains one of the largest collections of ground-based astronomical data in the world, coming from a wide variety of telescopes and other sources, and is evolving into a full research facility capable of optimizing the scientific return of the data. In the Virtual Observatory (VO) era it is critical that the holdings of the ESO Archive are made available to the user community through VO tools. This requires, among other things, good knowledge of the instrumental provenance of the data and the identification of metadata from the otherwise inhomogeneous collection to form a VO layer and a provenance database. Specifications and transmission curves for all optical elements present in the archived observations are also being collected to make them available to the astronomical community. As an example, this homogenisation work makes it possible to mine the ESO Archive for objects observed through specific filters, allowing the creation of new outreach images.


The HST Exposure Time Calculators: Estimating accurate observing times for HST Observations
Rosa I Diaz Space Telescope Science Institute
The Exposure Time Calculators (ETCs) form a web application that assists users of the Hubble Space Telescope in predicting observing times and signal-to-noise ratios for almost any object that can be observed by the telescope. Accurate predictions are key to determining the time needed to achieve the scientific goals of the observers. The ETCs are also used by contact scientists at STScI to verify the safety and health of instruments that could be damaged by observations of bright targets. The system is a key tool for proposing and planning HST observations; as a result it must be able to accurately predict observing times or expected signal-to-noise ratios for proposed observations. We discuss the issues that complicate developing a general tool that shares as much commonality as possible in computing exposure times for different instruments while handling all the special issues that individual instruments have. Particular attention will be given to the new features and capabilities added to support the new HST instruments, COS and WFC3.


NED Spectra Data Service
Jeffery D Jacobson California Institute of Technology, NED
As part of its ongoing contribution to the U.S. Virtual Astronomical Observatory (VAO), the NASA/IPAC Extragalactic Database (NED) has implemented a prototype system to provide access to its growing collection of published spectra of extragalactic objects. The service leverages NED's existing databases, Apache Tomcat, and the U.S. National Radio Astronomy Observatory Data Access Layer server (DALserver) prototype provided to the VAO. We present the adaptations made to NED to support the VAO Simple Spectral Access Protocol, the lessons learned in the prototype implementation, and plans for NED's future Spectra Data Service. NED is a service of the California Institute of Technology/Infrared Processing and Analysis Center, provided to the scientific community and the public under contract to the U.S. National Aeronautics and Space Administration.


The NDF Data Provenance System and Your Archive
Frossie Economou Joint Astronomy Centre
End users really like to know what was done to their data, especially when they obtain it from semi-anonymous archives or the VO. One way to achieve this is to include metadata in the files indicating which ancestor files were used to produce them, and what the history of processing has been. Because such a system can only work effectively if there is a high level of trust in it, we have integrated provenance and history handling at the infrastructure level, with any Starlink application used to process our data automatically honouring and amending the provenance and history structures. When our data are ingested into the JCMT Science Archive, a subsection of the provenance can be used to enhance search functionality, such as asking for all data that contributed to a particular product.


The Data Reduction System for GTC/OSIRIS.
Alessandro E. Ederoclite Instituto de Astrofisica de Canarias
I present the online and offline data reduction systems for OSIRIS, the optical multi-mode instrument for GranTeCan. The software is written in Python and invokes PyRAF tasks which have been optimized for the instrument. I review the characteristics of the instrument and of the software, and present the improvements foreseen for both. Finally, I give an overview of the integration with GranTeCan's software. The flexibility of the software makes it easy to adapt to other telescopes and instruments.


Development of Image Analysis Software of MAXI
Satoshi Eguchi Kyoto University, Department of Astronomy
Monitor of All-sky X-ray Image (MAXI) is an X-ray all-sky monitor attached to the Japanese experiment module Kibo on the International Space Station. The main scientific goals of the MAXI mission include the discovery of X-ray novae followed by prompt alerts to the community (Negoro et al., in this conference), and the production of X-ray all-sky maps and new source catalogs with unprecedented sensitivities. To bring out the full capabilities of the MAXI mission, we are developing detailed image analysis tools. We use maximum-likelihood fitting to a projected sky image, taking account of the complicated detector responses, such as the background and the point spread functions (PSFs). The modeling of the PSFs, which depend strongly on the orbit and attitude of MAXI, is a key element of the image analysis. In this poster, we present the status of our software development, focusing on the methods for computing the PSFs and their application to actual data.
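
A toy Python illustration of the maximum-likelihood approach follows, with an invented Gaussian PSF and a known flat background; the real MAXI responses depend on orbit and attitude and are far more complicated.

    # Fit a single source flux to a Poisson counts image.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import poisson

    ny, nx = 32, 32
    y, x = np.mgrid[:ny, :nx]
    psf = np.exp(-((x - 16) ** 2 + (y - 16) ** 2) / (2 * 3.0 ** 2))
    psf /= psf.sum()
    background = 0.5  # counts/pixel, assumed known here

    rng = np.random.default_rng(0)
    counts = rng.poisson(200.0 * psf + background)  # simulated observation

    def neg_log_like(params):
        model = params[0] * psf + background
        return -poisson.logpmf(counts, model).sum()

    fit = minimize(neg_log_like, x0=[100.0], bounds=[(0.0, None)])
    print("fitted flux:", fit.x[0])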


DTS: The NOAO Data Transport System.
Michael Fitzpatrick NOAO
The Data Transport System (DTS) provides automated, reliable, high-throughput data transfer between the telescopes, archives and pipeline processing systems used by the NOAO centers in the northern and southern hemispheres. DTS uses an XML-RPC architecture to eliminate the need for persistent connections between the sites, allowing each site to provide or consume services within the network only as needed. The RPC architecture also permits remote control (e.g. to enable a new data queue or manually transmit a file) and monitoring of each system, and for client applications to be language-independent (e.g. a web interface to display transfer status or a task to queue data which is more tightly coupled with the acquisition system being used). The DTS service daemon is highly multi-threaded and capable of managing many different data paths and scheduling priorities, all of which can be easily configured or extended as needed. Bulk data transport is independent of the primary command-and-control methods; a variety of transfer protocols are supported to take best advantage of the bandwidth or properties of the data being moved (e.g. large image size versus many small files). The default transport method uses parallel TCP/IP sockets to "stripe" the data to a remote machine, providing a significant improvement in throughput over slow or busy networks. Additional transport protocols will be added in the future.
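
The striping idea behind the default transport can be sketched in Python as follows; host names and ports are invented, and the framing and handshaking of a real DTS transfer are omitted.

    # Split a byte string into N stripes and send each over its own socket.
    import socket
    from concurrent.futures import ThreadPoolExecutor

    def send_stripe(host, port, chunk):
        with socket.create_connection((host, port)) as sock:
            sock.sendall(chunk)

    def send_striped(data, host, ports):
        n = len(ports)
        stripe = (len(data) + n - 1) // n
        chunks = [data[i * stripe:(i + 1) * stripe] for i in range(n)]
        with ThreadPoolExecutor(max_workers=n) as pool:
            for port, chunk in zip(ports, chunks):
                pool.submit(send_stripe, host, port, chunk)

    # send_striped(open("image.fits", "rb").read(),
    #              "archive.example.org", [7001, 7002, 7003, 7004])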


History and future of the STScI DADS archive system.
Niall Ives Gaffney Space Telescope Science Institute
For more than 15 years, the archive at STScI has revolved around DADS (Data Archive and Distribution System). Originally a software product delivered by Loral, this system has evolved to keep up with new technologies and users' needs. This evolution has encompassed changes in the core language from C++ to Java, changes of operating system including moving from VMS to *nix, several transitions of the archive storage media, and the inclusion of on-the-fly processing for current instruments as well as reprocessing for heritage instruments. The evolution has also been driven by other missions archived by DADS (FUSE and Kepler) and extends into the future with our JWST development. Currently we are transitioning from Sybase to SQL Server as our database platform and from Solaris to RHEL as our base OS. From this history and our near-future plans, we will present lessons learned and share ideas for building more flexible, and perhaps shareable, archive components for the astronomical community.


A Concept for JDEM Science Computing and Operations
Erik Edward Gottschalk Fermilab
We describe the status of a conceptual design for science computing and operations for the Joint Dark Energy Mission (JDEM). An overview of the design will be presented, with emphasis on a framework for software development, data management, data processing, and data analysis in today's distributed computing environment. Initial results from R&D on evaluating a large, scalable database product and on quality control in a grid computing environment will be discussed. The successful deployment of grid computing and remote operations centers in high-energy physics will be explored as a possible model for JDEM.


HLA Footprints for Multi-Purpose Science
Gretchen R Greene STScI
Footprints of the science observations of the Hubble Space Telescope are defined by a set of hierarchical geometric regions of instrument coverage: exposures, combined observations, high-level science products, and mosaics. In the growing global community of networked applications, the science end user has several use cases for visualizing and accessing footprint data, including scientific proposal preparation, research and analysis of generated science products, and interoperability between archives for correlation of coverage. The Hubble Legacy Archive (HLA) at the Space Telescope Science Institute, in coordination with ESO-ECF and CADC, has developed a web-based science user interface, built on a VO service-oriented architecture, to enable varying levels of astronomical community access to science products derived from the HST archive. In this ADASS poster paper we describe new features and technologies of the HLA footprint component web-browser visualization tool and the underlying footprint services used by the HST Astronomer's Proposal Tool (APT), in compliance with an IVOA standard data access protocol. The service infrastructure is based on a high-performance spherical geometric model developed by Johns Hopkins University (JHU) and database search algorithms co-developed by STScI and JHU.


Pointing the SOFIA Telescope
Michael A. K. Gross Universities Space Research Association
SOFIA is an airborne, gyroscopically stabilized 2.5m infrared telescope mounted on a spherical bearing. Unlike its predecessors, SOFIA will work in absolute coordinates, despite its continually changing position. To manage this, the telescope must relate equatorial coordinates to its gyroscopes using a combination of avionics data and star identification, manage field rotation, and track sky images. As a telescope mounted on a platform in continuous three-dimensional motion during observation, SOFIA presents unique pointing challenges. We describe the algorithms and systems required to acquire and maintain the equatorial reference frame, to relate it to the tracking imagers and the science instrument, to set up the oscillating secondary mirror, and to aggregate pointings into larger relocatable building blocks such as nods and dithers.
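
A toy Python sketch of one building block, rotating an equatorial unit vector into a hypothetical fixed telescope frame, follows; the real system fuses gyroscope, avionics and star-identification data, none of which is modelled here.

    import numpy as np

    def radec_to_unit(ra_deg, dec_deg):
        # Unit vector for an equatorial direction (RA, Dec) in degrees.
        ra, dec = np.radians([ra_deg, dec_deg])
        return np.array([np.cos(dec) * np.cos(ra),
                         np.cos(dec) * np.sin(ra),
                         np.sin(dec)])

    def rot_z(angle_deg):
        a = np.radians(angle_deg)
        return np.array([[np.cos(a), -np.sin(a), 0.0],
                         [np.sin(a),  np.cos(a), 0.0],
                         [0.0,        0.0,       1.0]])

    # Hypothetical instantaneous attitude: a single yaw rotation.
    attitude = rot_z(30.0)
    telescope_vec = attitude @ radec_to_unit(83.6, 22.0)
    print(telescope_vec)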


HST Cache: Update
Jonas Haase ST-ECF
The HST Cache system at CADC and ESO/ST-ECF has been in operation for a little more than a year now. In that time the Cache system has fulfilled our expectations and has proven itself an excellent vehicle for re-processing and serving HST data. This poster reports lessons learned, covers the status of the HST Cache holdings, and details new developments in the associated software.


The study of the mechanism of cumulative generation of streams for the model of active astronomical object.
Kirill A. Halin Stavropol State University, Department of Physics
This study presents research on a purely hydrodynamic model of an active astronomical object put forward by Vladimir Vitkovskiy in 2001, and defines the principles for building, on its basis, a mathematical model for numerical experiments that promote the study and understanding of physical processes in active objects of different types. The basic mechanism of activity in the model is the cumulative process arising from the accretion of surrounding matter onto the poles of a rotating object in a cone-shaped funnel. Hydrogen dominates the accreted matter. Falling onto the stellar surface, hydrogen accumulates and heats up to the temperature at which the thermonuclear reaction converting hydrogen into helium begins. When the rate of heat production by the nuclear reaction exceeds the rate at which heat is conducted away, a thermal instability develops and an explosion occurs. Such a scheme is possible provided there is a gas cloud around the star whose density is much greater than that of the interstellar medium, or the star is part of a close binary system in which the overflow mechanism operates. Moreover, as research on effectively unlimited cumulative action shows (E. Zababahin and I. Zababahin, 1988), this mechanism can characterise the accumulation of matter needed to describe the proposed model (V. Vitkovskiy and K. Halin, 2008).


GPU-Based Volume Rendering of Noisy Multi-Spectral Astronomical Data
Amr Hassan Centre for astrophysics and supercomputing, Swinburne University of Technology - Melbourne , Australia
Traditional analysis techniques may not be sufficient for astronomers to make the best use of the data sets that current and future instruments, such as the Square Kilometre Array and its Pathfinders, will produce. By utilizing the remarkable pattern-recognition ability of the human mind, scientific visualization provides an excellent opportunity for astronomers to gain valuable new insight into and understanding of their data, particularly when used interactively in 3D. The goal of our work is to establish the feasibility of a real-time 3D monitoring system for data going into the Australian SKA Pathfinder archive. Based on CUDA, an increasingly popular development tool, our work utilizes the massively parallel architecture of modern graphics processing units (GPUs) to provide astronomers with interactive 3D volume rendering for multi-spectral data sets. Unlike other approaches, we are targeting real-time interactive visualization of datasets larger than GPU memory, while giving special attention to data with low signal-to-noise ratio: two critical aspects for astronomy that are missing from most existing scientific visualization software packages. Our framework enables astronomers to interact with the geometrical representation of the data, to control the volume rendering process to generate a better representation of their datasets, and to use cut-plane and probing functionality.


Reusable state machine code generator
Arturo A. Hoffstadt Universidad Tecnica Federico Santa Maria, Computer Systems Research Group
The State Machine model is frequently used to represent the behavior of a system, allowing one to express and execute this behavior in a deterministic way. A graphical representation such as a UML State Chart diagram tames the complexity of the system, thus facilitating changes to the model and communication between developers and domain experts. We present a reusable state machine code generator, developed by the Universidad Tecnica Federico Santa Maria and the European Southern Observatory. The generator itself is based on the open source project openArchitectureWare, and uses UML State Chart models as input. This allows for a modular design and a clean separation between the generator and the generated code. The generated state machine code has well-defined interfaces that are independent of implementation artifacts such as the middleware. This allows the generator to be used in the substantially different observatory software of the Atacama Large Millimeter Array and the ESO Very Large Telescope. A project-specific mapping layer for event and transition notification connects the state machine code to its environment, which can be the Common Software of these projects, or any other project. This approach even makes it possible to automatically create tests for a generated state machine, using software-testing techniques such as path coverage.
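
The following Python sketch suggests the shape of such generated code: a state machine with a well-defined interface and a pluggable notifier standing in for the project-specific mapping layer. All names are invented and are not the generator's actual output.

    class TransitionNotifier:
        # Project-specific mapping layer: subclass to hook into ACS,
        # the VLT common software, or plain logging.
        def on_transition(self, src, event, dst):
            print("%s --%s--> %s" % (src, event, dst))

    class GeneratedStateMachine:
        TRANSITIONS = {
            ("Idle", "start"): "Running",
            ("Running", "pause"): "Paused",
            ("Paused", "start"): "Running",
            ("Running", "stop"): "Idle",
        }

        def __init__(self, notifier):
            self.state = "Idle"
            self.notifier = notifier

        def fire(self, event):
            key = (self.state, event)
            if key not in self.TRANSITIONS:
                raise ValueError("event %r not allowed in state %r"
                                 % (event, self.state))
            dst = self.TRANSITIONS[key]
            self.notifier.on_transition(self.state, event, dst)
            self.state = dst

    sm = GeneratedStateMachine(TransitionNotifier())
    sm.fire("start")
    sm.fire("pause")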


Virtual Observatory Services at WFAU
Mark S Holliman University of Edinburgh, Institute for Astronomy, Wide Field Astronomy Unit
The Wide Field Astronomy Unit (WFAU) hosts a large number of Virtual Observatory (VO) services that provide access to both data and processing applications housed on our servers in Edinburgh. These services give astronomers a powerful set of tools for obtaining and processing data in ways unattainable through conventional access methods. The services offered include cone-search and ADQL access to a number of major databases developed by our data centre, such as UKIDSS, the SuperCOSMOS Science Archive, and the 6dF Galaxy Survey, as well as many mirrors of important databases developed elsewhere, such as SDSS, IRAS, and 2XMM. Images for UKIDSS and SuperCOSMOS are accessible through SIA services. There are useful data processing tools such as the STILTS library for table manipulation, a data mining tool for classification using kernel density analysis, and a service for converting VOTables into KML for use in Google Sky. Also hosted are a number of VO infrastructure services, such as a full registry and VOSpace, that enable users to find resources and store data in an online accessible location. WFAU provides secured VO services to the proprietary UKIDSS releases, the first secured VO services for a major proprietary data resource in the entire VO. With a limited knowledge of Python and a copy of the VODesktop software, astronomers can script workflows that use these services to perform complex operations such as cross-matching disparate datasets or extracting catalogues from images remotely. Since many of our databases are too large to be downloaded and accessed locally, these services make it possible to accomplish complicated tasks online and on dedicated hardware. WFAU's list of VO services will continue to grow as new IVOA standards are implemented and new datasets such as the VISTA surveys are added.
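
As a hint of what such scripting can look like, here is a hedged Python sketch using the pyvo package; the cone-search URL is a placeholder, not an actual WFAU endpoint.

    # Query a Simple Cone Search service and get an astropy Table back.
    import pyvo

    service = pyvo.dal.SCSService("http://example.org/wfau/ukidss/cone")
    results = service.search(pos=(180.0, 0.0), radius=0.05)  # degrees
    table = results.to_table()
    print(table.colnames, len(table))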


Planck Surveyor Mission: Methods for Optimizing the Data Analysis with the Software Infrastructure ProC
Wolfgang Hovest Max-Planck-Institute for Astrophysics
Scientific workflows are usually controlled by many parameters, and assigning near-optimal values to these is often critical for efficiently finding solutions to goal-oriented problems. Such problems are typically solved by running sophisticated simulation or data analysis workflows. We present two techniques to support scientists, both integrated into the Process Coordinator (ProC), the general-purpose scientific workflow engine originally developed for the Planck Surveyor satellite mission. The resume functionality allows workflows to be executed repeatedly without recalculating already-obtained results, for example after changing the parameters of one part of the workflow, or after a crash, e.g. due to network problems. The sampling framework supports the exploration of high-dimensional parameter spaces for function representation, optimization, or integration purposes. Complemented by one of several pluggable sampling algorithms, a sampler control element (SCE) drives the exploration process in multiple cycles. The whole sampling framework has been tested with different sampling algorithm plug-ins, and is ready for use in astrophysical research.


XMM-Newton Science Analysis Software: Further development and maintenance... and also thinking about the future.
Aitor L. Ibarra XMM-Newton SOC. ESAC/ESA
The Science Analysis Software (SAS) is a robust software package designed to analyse XMM-Newton data. Coded mainly in C++ and F90 and now almost ten years old, it is still evolving, both by extending functionality and by keeping in line with evolving platforms, compilers and third-party software. At the same time, the XMM-Newton SAS team is working to apply new technologies around SAS to offer observers a true twenty-first-century application in the years to come, keeping XMM-Newton data accessible and minimizing maintenance costs while preserving full analysis capabilities. A modern web application based on SAS is in development under the name RISA (Remote Interface for Science Analysis), combining Virtual Observatory techniques, the use of grid and cloud computing technologies, and virtual-machine encapsulation concepts.


MAXI Nova Search and Alert System
Ryoji Ishiwata Nihon University
Monitor of All-sky X-ray Image (MAXI) is the first astronomical observatory on the International Space Station (ISS). Its cameras, with wide fields of view, continuously scan the entire sky every 96 minutes, synchronized with the ISS orbit. Each X-ray photon event downlinked from the ISS is stored in a PostgreSQL database and simultaneously processed by the Nova Search and Alert System, in order to discover X-ray novae or transients and report these phenomena to astronomers worldwide for follow-up observations. In the Nova Search systems, time variability is investigated on various timescales for each celestial pixel defined with HEALPix. In addition to the algorithm for detecting significant changes in X-ray counts, the system has a graphical user interface, employing GTK+, that displays dynamically updated images and light curves. Events detected by the Nova Search systems are verified in the alert system, where the individual detections received from the Nova Search systems through socket connections are assembled. After investigation with additional information not considered by the Nova Search systems, such as X-ray counts in neighbouring pixels, known objects in catalogs, and the effect of the ISS solar paddle, the system or a duty scientist makes a final judgment on whether the event is a true transient. In automatic data processing, the alert will be transmitted less than 30 seconds after the on-board detection. Detailed studies of the data are possible through access to the database. We will present the operations and performance of the system.
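
A toy Python sketch of the HEALPix-based variability check follows; the resolution, event list, history and threshold are invented, not the MAXI values.

    # Bin events onto HEALPix pixels and flag excesses over past scans.
    import numpy as np
    import healpy as hp

    nside = 64
    npix = hp.nside2npix(nside)

    def scan_to_map(ra_deg, dec_deg):
        pix = hp.ang2pix(nside, ra_deg, dec_deg, lonlat=True)
        return np.bincount(pix, minlength=npix)

    mean = np.full(npix, 2.0)  # per-pixel mean counts from earlier scans (toy)
    rng = np.random.default_rng(1)
    current = scan_to_map(np.array([83.6]), np.array([22.0])) \
        + rng.poisson(2.0, npix)

    # Simple significance: excess of this scan over the historical mean.
    sigma = (current - mean) / np.sqrt(np.maximum(mean, 1.0))
    alerts = np.where(sigma > 5.0)[0]
    print("candidate transient pixels:", alerts[:10])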


Starlink Software Developments: The Nanahope Release
Tim Jenness Joint Astronomy Centre
We discuss recent developments to the Starlink Software Collection that were part of the Nanahope release in the summer of 2009. The Starlink Software is very portable and can run on a number of operating systems, including Linux, Mac OS X and Solaris, in both 32- and 64-bit variants. CUPID, our clump detection application, has been extended to support the creation of VO tables containing STC-S region definitions, and we have made a number of enhancements to our automatic data provenance tracking system. GAIA, our data visualisation tool, is now VO-enabled and has new facilities for inspecting irregular clumps of emission in 2D and 3D. SPLAT-VO and GAIA have also been enhanced to understand the SAMP protocol.


A new generic way to define astrometric calibration for Gaia data processing.
Emmanuel Joliet ESAC/ESA
Gaia is ESA's ambitious space astrometry mission with a foreseen launch date in early 2012. Its main objective is to perform a stellar census of the 1000 million brightest objects in our galaxy (complete to V = 20 mag), from which an astrometric catalog of micro-arcsecond level accuracy will be constructed. A key element in this endeavor is the Astrometric Global Iterative Solution (AGIS), the mathematical and numerical framework for combining the approximately 70 available observations per star obtained during Gaia's 5-year lifetime into a single global astrometric solution. The fundamental working principles of AGIS were shown (O4.1) at last year's ADASS XVIII. This time we present a new generic astrometric calibration scheme recently implemented in AGIS. During development of the data processing software, the traditional astrometric calibration scheme is a heavy burden, as each change in the model requires changes in the code, new simulation data, new validation tests, etc. The new scheme allows the calibration of the astrometric instrument to be specified in a more generic and flexible manner: the entire model is defined in an external configuration file that can be modified at any time with no, or only minimal, impact on the software. The implementation incurs acceptable run-time overheads compared to the direct approach with a fixed, hard-coded calibration model. This new approach can be a starting point for converting other fixed, hard-coded schemes into a similar, more analytical breakdown.
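
The following Python sketch illustrates the idea of an externally configured calibration model: the functional form is read from a configuration file rather than hard-coded, so the model can change without touching the software. The configuration format here is invented for illustration and is not the AGIS one.

    import json
    import numpy as np

    # In practice this would be read from an external file.
    CONFIG = json.loads("""
    { "effects": [ {"name": "offset", "basis": "poly", "degree": 0},
                   {"name": "tilt",   "basis": "poly", "degree": 1} ] }
    """)

    def design_matrix(x):
        cols = []
        for effect in CONFIG["effects"]:
            if effect["basis"] == "poly":
                cols.append(x ** effect["degree"])
        return np.column_stack(cols)

    # Fit the configured model to simulated observations.
    x = np.linspace(-1.0, 1.0, 100)
    obs = 0.3 + 0.7 * x + np.random.normal(0.0, 0.01, x.size)
    params, *_ = np.linalg.lstsq(design_matrix(x), obs, rcond=None)
    print(dict(zip([e["name"] for e in CONFIG["effects"]], params)))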


How to get MAXI data from http://maxi.riken.jp?
Mitsuhiro Kohama Cosmic Radiation lab. RIKEN
Monitor of All-sky X-ray Image (MAXI) was launched successfully on the Space Shuttle Endeavour (STS-127) at 06:03 EDT on July 15. MAXI will observe the entire sky continuously in the X-ray band from the International Space Station. MAXI products will report the dynamic variability of various X-ray sources in near real time. For that purpose, the MAXI data are automatically processed at the ground station, and new MAXI products are updated on the web site http://maxi.riken.jp. When a transient is discovered, an alert will be issued to observers world-wide within one minute. This is a tutorial paper on how to get the MAXI products.


aXeTwoZero: The next generation of aXe
Martin Kuemmel Space Telescope - European Coordinating Facility
The aXe spectral extraction software was designed to extract spectra from slitless data such as those taken with the Advanced Camera for Surveys (ACS) on board Hubble. Since its first release in 2002, aXe has been significantly extended and developed into the standard reduction package for all ACS slitless modes. Wide Field Camera 3, which was installed on Hubble during Servicing Mission 4 in May 2009, also offers slitless spectroscopic modes on both the IR and the UVIS arms, and aXe is the recommended tool for reducing its slitless images. Rather than 'only' adjusting the current version aXe-1.71 to WFC3, we have chosen to re-factor the current aXe to build aXeTwoZero (aXeTZ), the next generation of the aXe software. Besides being capable of reducing WFC3 data, aXeTZ is characterized by backward compatibility with aXe-1.71, a homogeneous and 'pythonic' software design, complete 64-bit compatibility, and extended functionality. In this contribution we introduce aXeTZ and explain its design criteria and new features. aXeTZ contains a new version of aXedrizzle, which allows the rejection of cosmic ray hits based solely on spectroscopic criteria, and its properties are discussed in detail. The reduction of WFC3 data with aXeTZ is demonstrated using early in-orbit calibration data. aXeTZ is scheduled to be released as part of the next STSDAS release at the end of 2009.


Faster, better, cheaper: News on seeking Gaia's Astrometric Solution with AGIS
Uwe Lammers European Space Astronomy Centre of ESA
Gaia is ESA's ambitious space astrometry mission with a foreseen launch date in early 2012. Its main objective is to perform a stellar census of the 1000 million brightest objects in our galaxy, from which an astrometric catalog of micro-arcsecond level accuracy will be constructed. The Astrometric Global Iterative Solution (AGIS) is the mathematical and numerical framework for creating the astrometric core solution. At last year's ADASS XVIII we presented (O4.1) the working principles of AGIS, its development status, and selected results obtained with large-scale simulated data sets. We present here the latest developments around AGIS, highlighting in particular a much improved algebraic solving method that has recently been implemented. This Conjugate Gradient scheme improves the convergence behavior in significant ways and leads to a solution of much higher scientific quality.
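
For readers unfamiliar with the method, the conjugate-gradient idea can be illustrated on a tiny symmetric positive-definite system; AGIS of course applies it to a vastly larger, sparse, block-structured problem that is not reproduced here.

    import numpy as np
    from scipy.sparse.linalg import cg

    A = np.array([[4.0, 1.0],
                  [1.0, 3.0]])  # toy SPD normal-equations matrix
    b = np.array([1.0, 2.0])

    # Iterative solve; for large sparse systems this avoids ever
    # forming or factorizing the full matrix inverse.
    x, info = cg(A, b, atol=1e-10)
    print("solution:", x, "converged:", info == 0)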
|top|


ESA Archives and VO tools: without frontiers.
Andrea Laruelo ESAC/ESA
The Science Archives and VO Team (SAT) at ESAC, Madrid, is in charge of the challenging task of developing tools for easy, fast and friendly access to scientific data. The scientific archives and VO tools developed by the SAT are continuously evolving with flexibility, extensibility and adaptability in mind, and the resulting infrastructure is usable for data coming from a wide variety of space-based missions. One of the objectives of the SAT within the VO context is to implement interoperability among services using message brokers, such as the International Virtual Observatory Alliance (IVOA) SAMP protocol. This allows users to explore a wide variety of scientific data and analyze them further with suitable analysis tools. Services developed by the SAT already show examples of this evolution: the spectra visualization and analysis tool VOSpec is integrated with the Herschel Interactive Processing Environment (HIPE) through the aforementioned SAMP protocol, and the first Herschel spectra have already been displayed using this VO tool. The recently reengineered SOHO archive can also send images to any SAMP-compatible application and will soon be able to send 1D spectra to VOSpec through SAMP. As an example of the progress of VO tools in the context of interoperability and highly distributed data, we also present the Best Fit algorithm. This functionality is already integrated in VOSpec and helps scientists find the theoretical model that best fits a given Spectral Energy Distribution.
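A minimal sketch of this kind of SAMP interoperability, using the astropy SAMP client (a modern library that postdates this abstract); a running SAMP hub, e.g. from TOPCAT, and the example URL are assumptions:

    # Send a VOTable to all connected VO tools via SAMP; illustrative only.
    from astropy.samp import SAMPIntegratedClient

    client = SAMPIntegratedClient(name="archive-demo")
    client.connect()                              # requires a running SAMP hub
    client.notify_all({
        "samp.mtype": "table.load.votable",       # standard SAMP message type
        "samp.params": {
            "url": "http://example.org/spectrum.vot",  # hypothetical product URL
            "name": "demo spectrum",
        },
    })
    client.disconnect()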
|top|


ESA New Generation Science Archives: State of the art data management techniques for SOHO and EXOSAT Science Archives
Ignacio Leon Science Archives Team (SAT) - ESAC/ESA
The Science Archives and VO Team (SAT) has undertaken the effort to build state-of-the-art sub-systems for its new generation of archives. At the time of writing, the new technology has already been applied to the creation of the SOHO and EXOSAT Science Archives, and it will be used to reengineer some of the existing ESA Science Archives in the future. The Data Layer infrastructure required for the archives has very precise requirements, including a high granularity of data, with millions of searchable elements that require short response times. To deliver efficient and reliable access, open-source technology has been used, including PostgreSQL (database), Spring (application framework) and Hibernate (persistence framework). On top of these, the SAT has added its own software, which binds these technologies together and provides additional functionality to access both data and metadata. Once the data are prepared for distribution, access rights are controlled by a virtual ftp access, redirected through ftp (to avoid typical network-restriction problems). In its role as data provider, the new archives can create animations from image files, delivered to clients over http; these animations are created on the fly, on a per-request basis. Access to metadata is done through state-of-the-art software providing connection pooling, statement pooling, distributed queries and an object cache. Additionally, the SAT has applied Dijkstra's graph-search algorithm, executed at run time, to optimize the table-join ordering in the database. Where geometrical data are involved, spherical data types, functions and operators (indexed by GiST R-Tree) are used for geometrical searches (cf. PgSphere).
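The abstract gives no implementation detail for the join-ordering step, but the idea can be rendered as a toy shortest-path search in which tables are nodes and candidate joins are edges weighted by an estimated cost:

    # Toy Dijkstra search over a join graph; purely schematic, not the SAT code.
    import heapq

    def cheapest_join_path(graph, start, goal):
        """graph: {table: [(neighbour_table, estimated_join_cost), ...]}"""
        queue = [(0.0, start, [start])]
        seen = set()
        while queue:
            cost, table, path = heapq.heappop(queue)
            if table == goal:
                return cost, path           # cheapest chain of joins
            if table in seen:
                continue
            seen.add(table)
            for nxt, w in graph.get(table, []):
                if nxt not in seen:
                    heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
        return float("inf"), []

    catalog = {"observation": [("instrument", 1.0), ("proposal", 5.0)],
               "instrument": [("detector", 2.0)],
               "proposal": [("detector", 1.0)]}
    print(cheapest_join_path(catalog, "observation", "detector"))
    # -> (3.0, ['observation', 'instrument', 'detector'])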
|top|


An End-to-End Solution for Archiving, Monitoring, Retrieval, and Post-Processing Archive Files for SOFIA
Lan Lin Universities Space Research Association
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is an airborne astronomical observatory comprising a 2.5-meter infrared telescope mounted in the aft section of a Boeing 747SP aircraft that will fly at operational altitudes between 37,000 and 45,000 feet, above 99% of atmospheric water vapor. During each mission, the Mission Control and Communication System (MCCS), Telescope Assembly (TA) subsystems, Flight Management Subsystems, Water Vapor Monitor, Cavity Door Drive Systems, and tracker cameras generate large amounts of housekeeping data. These subsystems reside on different hosts and record data to local disks at frequencies as high as 50 Hz. The TA Tracker Subsystem alone logs more than 2500 data items at 20 Hz each, creating 400 kilobytes of data per second. It is imperative to archive the housekeeping data from the observatory subsystems post-flight in order to assess observatory performance and to support observatory troubleshooting and improvements. In addition, access to the data must be straightforward so that observatory staff can conduct data analysis tasks quickly and easily. The Archive Manager is designed as an integrated tool performing four major functions: first, monitoring the archiving status; second, controlling the archiving progress; third, making backups at a separate location; and fourth, managing a data manifest for end-of-flight processing and future data retrieval. The Archive Manager must carry out these functions without impeding the response time of the system as a whole (the round-trip command/response time must be less than 20 ms). This performance requirement is achieved through a backup design that moves large blocks of data at low frequency instead of small chunks at high frequency. In addition, the Archive Manager itself serves as a subsystem data source that distributes the overall archive information throughout the system, ensuring that the mission director, telescope operator, and observers all share the same mission view during the flight. The Archive Manager was implemented using C++, CORBA and Java as part of the now-retired Mission Control Subsystem for SOFIA; nonetheless, it should be possible to deploy it into the MCCS software environment with some updates. The Archive Manager automatically prepares the housekeeping data files for ingestion into the SOFIA Data Cycle System (DCS) Archive. At the end of a flight, the archive data files are transferred off the aircraft and ingested into the archive, where they are made available to SOFIA Science and Mission Operations staff via a simple web page interface. Once downloaded to the user's desktop, the files can be opened and analyzed using the Archive Post-Processing Tool, which provides a number of data analysis functions. Together, the Archive Manager, DCS Archive, and Archive Post-Processing Tool provide an integrated solution for capturing, archiving, retrieving, and analyzing the huge volumes of housekeeping data produced by the SOFIA observatory subsystems.
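The "large blocks at low frequency" pattern might look like the sketch below; the real Archive Manager is written in C++, CORBA and Java, so this Python rendering is purely illustrative:

    # High-frequency samples accumulate in memory; disk sees rare, large writes.
    import io

    class BlockArchiver:
        def __init__(self, path, block_size=1 << 20):   # flush in ~1 MB blocks
            self.buf = io.BytesIO()
            self.block_size = block_size
            self.out = open(path, "ab")

        def record(self, sample: bytes):
            """Called at high frequency; only touches the in-memory buffer."""
            self.buf.write(sample)
            if self.buf.tell() >= self.block_size:
                self.flush()

        def flush(self):
            """Called rarely; one large sequential write to disk."""
            self.out.write(self.buf.getvalue())
            self.out.flush()
            self.buf = io.BytesIO()

    archiver = BlockArchiver("housekeeping.bin")
    for _ in range(100_000):                 # e.g. 20 Hz tracker samples
        archiver.record(b"x" * 400)
    archiver.flush()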
|top|


A Reference Architecture Specification of a Generic Telescope Control System
Joao S. Lopez Universidad Tecnica Federico Santa Maria
A Telescope Control System (TCS) is the software responsible for controlling the hardware that an astronomical observation needs. The automation and sophistication of these observations has produced complex systems: a modern TCS is composed of software components that interact with several users and even with other systems and instruments. Each observatory has successfully developed a wide spectrum of TCS solutions for its telescopes. Regardless of the mount, there are common patterns in the software components that all these telescopes use, yet because almost every telescope is custom designed, these patterns are reimplemented again and again for each telescope. This is a clear opportunity for reuse and collaboration. The Generic Telescope Control System (gTCS) aims to be a base distributed framework for the development and deployment of the TCS of any telescope, independent of its physical structure, the type of mount, and the instrumentation used. This work presents an architecture specification explained through two complementary approaches: the layers perspective and the deployment perspective. The first defines a set of layers, one on top of the other, offering different levels of abstraction, while the deployment perspective illustrates how the system could be deployed, focusing on the distributed nature of the devices.
|top|


Data processing for AKARI Far-Infrared All-Sky Catalogue
Sin'itirou Makiuti Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency
AKARI is the second Japanese space mission for infrared astronomy. It was launched by an M-V rocket on February 22, 2006 (JST). The satellite has a 68.5 cm telescope cooled down to 6 K and observes with two instruments: the Infrared Camera (IRC; 1.8-26.5 micron) and the Far-Infrared Surveyor (FIS; 50-180 micron). One of the key objectives of the AKARI mission is to carry out an all-sky survey in the mid- and far-infrared and to construct point source catalogues. IRAS (the Infrared Astronomical Satellite, launched in 1983 by the USA, UK, and the Netherlands) carried out the first all-sky survey at four infrared wavelengths and provided infrared source catalogues; the AKARI All-Sky Survey surpasses the IRAS survey in spatial resolution, sensitivity, and wavelength coverage. Data processing for the AKARI far-infrared survey has been built on IDL (Interactive Data Language, ITT Visual Information Solutions): we developed pipeline programs to handle the time-line signals obtained by AKARI/FIS, as well as tools such as a data browser, on the IDL-based platform. AKARI/FIS has two far-infrared detectors, SW for shorter (<= 100 micron) and LW for longer (>= 100 micron) wavelengths, each divided into two photometric bands. The SW and LW detectors have 100 and 75 pixels, respectively, and each pixel generates a time-line signal independently along the scan paths on the sky. The FIS data need to be calibrated and corrected so that they can be treated as uniform, continuous data in the source extraction and photometry process. Semiconductor detectors for far-infrared observations are easily affected by cosmic rays and by the incoming photon flux itself, and show significant responsivity variations in the space environment. Several essential processing steps are therefore needed: removal of erratic signals caused by incessant cosmic-ray hits, correction of the non-linear characteristics of the read-out circuit, time-dependent dark subtraction and flat-fielding, and so on. We have been developing a software pipeline system to perform these reductions. The system consists of a sequence of separate programs, each with a specific function in the data processing, and we have designed a dedicated data file format as the interface between pipeline modules. In this paper, we outline the data flow of the pipeline system, explain some details of the essential parts of the processing, and introduce the far-infrared All-Sky Point Source Catalogue (beta version). The revised version of the catalogue will be made public in autumn 2009.
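The modular structure described above can be sketched as a chain of independent steps applied to a time-line signal; the step functions below are simple placeholders, not the actual AKARI algorithms:

    # Placeholder pipeline over a fake FIS time-line signal.
    import numpy as np

    def deglitch(ts):          # crude cosmic-ray spike rejection (illustrative)
        med = np.median(ts)
        mad = np.median(np.abs(ts - med)) + 1e-12
        return np.where(np.abs(ts - med) > 8 * 1.4826 * mad, med, ts)

    def linearize(ts):         # placeholder read-out non-linearity correction
        return ts + 0.01 * ts**2

    def subtract_dark(ts):     # placeholder time-dependent dark subtraction
        return ts - np.median(ts[:100])

    PIPELINE = [deglitch, linearize, subtract_dark]
    signal = np.random.normal(100.0, 1.0, 10_000)
    for step in PIPELINE:
        signal = step(signal)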
|top|


Spectra extraction and analysis software for the Digitized First Byurakan Survey (DFBS) and research projects
Areg Martin Mickaelian Byurakan Astrophysical Observatory (BAO)
The Digitized First Byurakan Survey (DFBS) is the largest Armenian astronomical database and is unique in the world for the specific requirements of extracting and analyzing its low-dispersion prism spectra. The project is a collaboration between the Byurakan Astrophysical Observatory, "La Sapienza" Universita di Roma, Cornell University, and VO-Paris. A dedicated software package, bSpec, was created by one of the authors (GC) to extract and measure all spectra in any field of the DFBS. More accurate software, EXATODS (Extraction and Analysis TOol of DFBS Spectra), was recently written by another author (AS) and has successfully been used for the extraction and study of asteroid spectra from the DFBS. It scans the full plate to find bright spectra, measures the rotation angle of each individual spectrum, and follows the dispersion direction to obtain the correct wavelength calibration. We have also developed a dedicated workflow in the VO framework. We will describe this software and present future research possibilities based on the DFBS spectra.
|top|


TGCat, The Chandra Transmission Grating Catalog and Archive
Arik William Mitschang SAO
The newly released Chandra Transmission Grating Catalog and Archive, TGCat, presents a fully dynamic online catalog allowing users to browse and categorize Chandra gratings observations quickly and easily, to generate custom plots of response-corrected spectra online without the need for special software, and to download analysis-ready products from multiple observations in one convenient operation. TGCat has been designed to take advantage of the convenience and power of modern browsers and servers, not only to enrich the features available but also to provide better access to helpful resources relating to searches and returned values. TGCat has been registered as a VO resource with the NVO, providing direct access to the catalog's interface. Currently under development are a Simple Cone Search and a Simple Image Access interface compliant with NVO standards, and we intend to provide spectra via the Simple Spectral Access protocol. The catalog is supported by a back-end designed to automatically fetch newly public data, then process, archive, and catalog them, utilizing an advanced queue system integrated into the archive's MySQL database; this allows large processing projects (including reprocessing, several planned feature additions, and recovery from catastrophic data loss) to take advantage of an unlimited number of CPUs across a network. A unique feature of the catalog is that all of the high-level functions used to retrieve inputs from the Chandra archive and to generate the final data products are available to the user in an S-Lang library with detailed documentation. Here we present a structural overview of the systems, design, and accessibility features of the catalog and archive.
|top|


VO compliant visualization of theoretical data
Marco Molinaro INAF - Trieste Astronomical Observatory
In the evolving scenario for the inclusion of theoretical and simulated data in the VO, we present some improvements within the Italian Theoretical Virtual Observatory (ITVO) project. They include cosmological simulation data archives and services (at the Trieste and Catania Astronomical Observatories and at the CINECA supercomputing centre in Bologna) and stellar simulation data archives and services (BaSTI, a Bag of Stellar Tracks and Isochrones, maintained at the Teramo Astronomical Observatory). Following an upgrade of the BaSTI database and a new Web service endpoint for VisIVOWeb in Trieste, several improvements in data visualization are presented: new plots for BaSTI tracks emphasizing the key points of stellar evolution and their comparison with observational data, and direct 3D visualization of galaxy clusters together with direct online, server-side data manipulation using the VO-compliant VisIVOWeb service. Finally, we present a new unified Web portal for the ITVO project under VObs.it, the Italian effort in the VO world, which collects all information regarding the various components of this theoretical virtual observatory project spread all over Italy.
|top|


Optimizing Spatial Frequency Data Weights for High Precision Imaging with ALMA
Koh-Ichiro Morita National Astronomical Observatory Japan
One of the top-level technical goals of ALMA (Atacama Large Millimeter/submillimeter Array) is to provide high-precision images at millimeter and submillimeter wavelengths. To this end, ALMA is designed to provide complete sampling of the spatial frequency domain from zero to the maximum spacing. Since interferometric observations with the 50 12-m antennas (the major part of ALMA) cannot easily measure spatial frequencies shorter than 15 m, two additional observing modes will be used to measure very short spacing visibility data. Zero-spacing data will be measured in single-dish mode with several 12-m antennas. For the short spacings from 6 m to 15 m, we plan to use interferometric observations with the ACA (a subsystem of ALMA), which consists of twelve 7-m antennas. The imaging process for these data is more complex than that for a simple interferometer; in particular, we need to apply appropriate weights to these data in the spatial frequency domain. This report discusses the imaging procedure, how to optimize these weights, and their effect on image quality.
|top|


A Prototype Data Processing System for JDEM
Eric Hildaur Neilsen Fermilab
Fermilab is developing a prototype data processing system for the Joint Dark Energy Mission (JDEM) Science Operations Center (SOC). The SOC requires tools for systematic and timely execution of computing jobs; data management; and systematic record keeping for provenance, workflow organization, and quality control. Drawing on our experience in processing large volumes of data, we have constructed a prototype system using a variety of existing tools, including Fermilab's distributed computing and mass data storage infrastructure. This system provides an environment within and against which potential architectures and components of the production system can be evaluated. We describe the prototype and its operation, and discuss future directions.
|top|


MSOTCS: A new Telescope Control System for the Australian National University's 2.3m telescope at Siding Spring Observatory
Jon G. Nielsen Australian National University, Research School of Astronomy and Astrophysics, Mount Stromlo Observatory
A new telescope control system, MSOTCS, has been written for the ANU's 2.3m telescope at Siding Spring Observatory. The system has now been in use for approximately 12 months. An overview of the software, whilst in the early stages of development, was presented at ADASS 2004. In this paper we reflect upon those initial design decisions, discuss the development process and subsequent deployment of the system, and look at the experiences of running the system over the past year. Two areas have been chosen for more detailed discussion. Firstly, one of the key features designed into MSOTCS, support for remote and automated observing, is considered. The implications of supporting this style of observing are followed from the initial design, to the implementation, and finally through to the deployment and ongoing support of the system. Secondly, we discuss the choice of deployment platform for MSOTCS and contrast our choice, QNX, with other alternatives.
|top|


High-speed Korea-Japan Joint VLBI Correlator (KJJVC) development and its current progress
Se-Jin Oh Korea Astronomy and Space Science Institute
The development of the Korea-Japan Joint VLBI Correlator (KJJVC) is progressing in close cooperation between the Korea Astronomy and Space Science Institute and the National Astronomical Observatory of Japan, for the Korean VLBI Network (KVN), the Korea-Japan Joint VLBI Network (KJJVN), and the East-Asian VLBI Network (EAVN) including VSOP-2. KJJVC can handle a data rate of 8 Gbps/station for 16 stations, with 8192 output channels. It consists of various playback systems, the Raw VLBI Data Buffer (RVDB) system, the VLBI Correlation Subsystem (VCS), the Peta-scale Epoch Data Archive (PEDA) system, and control and operation software. Several different types of playback system are in use in the East-Asian VLBI Network, such as Mark5B, VERA2000, and K5; the RVDB system will play back the observed data to the VCS synchronously from these various playback systems. The VCS is the core product of KJJVC, processing the correlation of the data. PEDA is the massive data storage that saves the correlated results from the VCS; it can also support e-VLBI through the next-generation gigabit network. The factory and field inspections of the VCS were performed, and the VCS was installed in the VLBI correlator room, in June and August 2009, respectively. The overall system integration will be completed by the end of 2009, and the Korea-Japan Joint Correlation Center will open in Korea next year. In this paper, KJJVC development and its current status are described in detail.
|top|


Toward a reference implementation of a standardized astronomical software environment.
Luigi Paioro INAF-IASF Milano
The OPTICON Network 3.6 (FP6) and the US NVO, coordinating with international partners and the Virtual Observatory, have already identified high-level requirements and a global architectural design for a future astronomical software environment. In order to continue this project and demonstrate the concepts outlined, the new OPTICON Network 9.2 (FP7) was created and is working on a concrete prototype, which is meant to contribute to the development of an eventual reference implementation of the basic core system to be jointly developed by the major partners. As the reference implementation stabilizes, we plan to work with selected groups within the astronomical community to port software to the new environment and provide feedback for its further evolution. These groups will include both producers of new software and the major legacy systems (e.g. AIPS, CASA, IRAF/PyRAF, Starlink and the ESO Common Pipeline Library).
|top|


Development of the software for high-speed data transfer of the high-speed, large-capacity data archive system for the storage of the correlation data from the Korea-Japan Joint VLBI Correlator (KJJVC)
Sunyoup Park Korea Astronomy & Space Science Institute (KASI)
The Korea-Japan Joint VLBI Correlator (KJJVC), to be used for the Korean VLBI Network (KVN) at the Korea Astronomy & Space Science Institute (KASI), is a high-speed correlator that outputs correlation results at a maximum rate of 1.4 GB/s. To receive and record these data at full speed with no loss, the design of the software running on the data archive system that receives and records the correlator output is very important. A simple single-threaded program that alternately receives data from the network and records them to disk can become a bottleneck at high data rates, causing probable data loss, and cannot exploit multi-core or hyper-threading hardware, or the operating systems that support it. In this presentation we summarize the design of the data transfer software for KJJVC and the high-speed, large-capacity data archive system, built with standard socket programming and multi-threading techniques, and the results of testing the purchased storage system with this software.
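A minimal sketch of the receiver/recorder split argued for above: one thread reads large blocks from the network while another writes them to disk, decoupled by a bounded queue (illustrative Python; a real 1.4 GB/s system would use native code):

    # Producer-consumer decoupling of network receive and disk write.
    import queue, socket, threading

    buf_queue: "queue.Queue[bytes]" = queue.Queue(maxsize=256)

    def receiver(sock: socket.socket):
        while True:
            block = sock.recv(1 << 20)     # large reads, not tiny chunks
            if not block:
                buf_queue.put(b"")         # sentinel: stream closed
                return
            buf_queue.put(block)

    def recorder(path: str):
        with open(path, "ab") as f:
            while True:
                block = buf_queue.get()
                if not block:
                    return
                f.write(block)

    # Wiring (server socket setup omitted for brevity):
    # threading.Thread(target=receiver, args=(conn,), daemon=True).start()
    # threading.Thread(target=recorder, args=("corr.out",), daemon=True).start()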
|top|


Data Reduction Pipeline for EMIR, the GTC Near-IR Multi-Object Spectrograph
Sergio Pascual Madrid Complutense University, Astrophysics Department
EMIR is a near-infrared wide-field camera and multi-object spectrograph being built for the 10.4m Spanish telescope (Gran Telescopio Canarias, GTC) at La Palma Observatory (Canary Islands, Spain). The Data Reduction Pipeline (DRP) will be optimized for handling and reducing the near-infrared data acquired with EMIR. Both reduced data and associated error frames will be delivered to the end users as a final product. The DRP is being designed and built by the EMIR group at the Universidad Complutense de Madrid.
|top|


Integrated e-infrastructures for astrophysics
Fabio Pasian INAF
In astrophysics, as elsewhere, the capability of performing "Big Science" requires the availability of large HPC facilities. But computational resources alone are far from enough for the community: the whole set of e-infrastructures (network, computing nodes, data repositories, applications) needs to work in an interoperable way. This implies the development of common (or at least compatible) user interfaces to computing resources, transparent access to observations and numerical simulations through the Virtual Observatory, integrated data processing pipelines, and data mining and semantic web applications. Achieving this interoperability goal is a must for building a real "Knowledge Infrastructure" in the astrophysical domain.
|top|


LOPS - towards a science-driven observation preparation tool.
Alexey Pavlov MPIA
LINC-NIRVANA is the near-infrared homothetic imaging camera for the Large Binocular Telescope. Once operational, it will provide an unprecedented combination of angular resolution, sensitivity and field of view, thanks to the large collecting area of the two 8.4m LBT mirrors and by means of Multi-Conjugate Adaptive Optics (MCAO) and a Fringe and Flexure Tracker System (FFTS). The LINC-NIRVANA Observation Preparation Software (LOPS) is a tool that supports astronomers in constructing a full observing project for the LINC-NIRVANA instrument. The preparation of such observing projects is strongly shaped by the scientific goals and by the complexity of the LINC-NIRVANA instrument setup. LOPS helps the observer prepare and assemble high-level science programs using a graphical representation of the components; it allows automatic propagation of elements among these components and provides a mechanism to explore the parameter space in support of the scientific objectives. The prepared information is later translated by the instrument control software into low-level observation control sequences. At the current phase of development we present a pilot version, a candidate for the LOPS beta release.
|top|


Optimal Compression Methods for Floating-point Format Images
William D. Pence NASA/GSFC
We report the results of a comparison study of different techniques for compressing FITS images that have floating-point (real*4) pixel values. Standard file compression methods like GZIP are generally ineffective in this case (with typical compression ratios only in the range 1.2-1.6), so instead we use a technique that converts the floating-point values into quantized scaled integers, which are then compressed using the Rice algorithm. The compressed data stream is stored in FITS format using the tiled-image compression convention. This is technically a lossy compression method, since the pixel values are not exactly reproduced; however, all the significant photometric and astrometric information content of the image can be preserved while achieving greatly enhanced compression ratios over those provided by GZIP. We also show that introducing dithering, or randomization, when assigning the quantized pixel values can significantly improve the photometric and astrometric precision of the stellar images in the compressed file without adding additional noise. We quantify our results by comparing the stellar magnitudes and positions measured in the original uncompressed image with those derived from the same image after applying successively greater amounts of compression. This compression method may be applied to any FITS image using the FPACK and FUNPACK compression utility programs, available from the HEASARC web site at http://heasarc.gsfc.nasa.gov/fitsio/fpack/.
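The quantization-with-dithering step can be sketched as follows; the real FPACK implementation (tiled, Rice-coded, with a seeded dither that FUNPACK can reproduce) differs in detail:

    # Sketch of quantizing float pixels to scaled integers with subtractive
    # dithering; illustrative only, not the FPACK code.
    import numpy as np

    def quantize(pixels, q):
        """q is the quantization step, e.g. sigma/16 of the image noise.

        Dithering randomizes the rounding so that the mean photometry is
        preserved without systematic bias.
        """
        rng = np.random.default_rng(12345)    # a fixed seed makes the dither reproducible
        dither = rng.random(pixels.shape)     # uniform in [0, 1)
        ints = np.floor(pixels / q + dither).astype(np.int32)
        restored = (ints - dither) * q        # analogous to what decompression recovers
        return ints, restored

    img = np.random.normal(1000.0, 5.0, (64, 64)).astype(np.float32)
    ints, restored = quantize(img, q=5.0 / 16)
    print(np.abs(restored - img).max())       # bounded by one quantization step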
|top|


Efficiencies of various classification methods applied to XMM-Newton sources
Francois-Xavier Pineau Observatoire Astronomique de Strasbourg
The statistical identification of all serendipitous X-ray sources detected by the EPIC camera is one of the tasks devoted to the Survey Science Centre (SSC) of XMM-Newton. Using a probabilistic cross-correlation of the 2XMMi catalogue with others such as the SDSS DR7 or 2MASS, we have built several samples of multiwavelength data for which various thresholds on the number of spurious associations can be applied. We created a learning sample of classified XMM sources from the SDSS spectroscopy and from the Archival Catalogue and Database Subsystem, the remote part of the SSC pipeline that performs the cross-correlation of EPIC sources against a large collection of archival data including Simbad. This allowed us to apply both supervised and unsupervised classification methods. We tested a range of classification algorithms such as simple k-Nearest Neighbours, Mean Shift, Kernel Density Classification and Support Vector Machines. The advantages and disadvantages of each method are reviewed, and their respective performances are compared.
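As a minimal illustration of the simplest method listed, k-Nearest Neighbours, here is a toy classification with scikit-learn (not the SSC pipeline; the features and labels below are fabricated):

    # Toy kNN classification on fake multiwavelength features.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    # e.g. X-ray hardness ratio plus two optical colours, two toy classes:
    X = np.vstack([rng.normal(0, 1, (200, 3)), rng.normal(2, 1, (200, 3))])
    y = np.array([0] * 200 + [1] * 200)       # 0 = star, 1 = AGN (toy labels)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))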
|top|


NICMOS Temperature Dependent Calibration
Nor Pirzkal STScI
We present our latest calibration work for the NICMOS instrument on the Hubble Space Telescope. This new calibration uses a novel method to determine the temperature of the detector directly from the science exposure itself. As we show here, this allows us to better monitor the temperature behavior of the instrument's three detectors and to improve the calibration of the instrument by providing calibration files that account for the temperature sensitivity of the photometry, the flat-fields, and the different components of the dark signal, including the shading and the amplifier glow. The temperature is determined from the bias level of the data, measured in the first read of these non-destructive, multi-accum datasets. The method determines the temperature of each of the three NICMOS detectors to within 0.05 K and offers a significant improvement over the mounting-cup temperature sensor used in the past.
|top|


Visualization of ASH Simulations of Stellar MHD with SDvision
Daniel Pomarede CEA/IRFU
With the growing power of high-performance computing centers, simulation is playing a leading role in the study of astrophysical objects. The ASH (Anelastic Spherical Harmonics) program is used to perform high-resolution three-dimensional simulations on massively parallel mainframes with up to several thousand processors, with a special emphasis on the MHD processes occurring in the convection zone of the Sun and other stars. The size and complexity of the data produced in these simulations require special software tools at the post-processing, visualization and analysis stages. The need for interactive and immersive visualization of these data has motivated the development of the SDvision graphical interface. This tool is deployed in the framework of IDL Object Graphics and offers several ways to visualize the scalar and vector fields produced in the simulations. Scalar fields are visualized through ray-casting volume rendering, isosurface reconstruction, a texture-mapping algorithm, or volume slicing. In the volume rendering implementation, the RGB color and transparency lookup tables can be tuned interactively to enhance the visualization of the turbulent structures that characterize these data. Vector fields are visualized by hedgehog displays or by streamlines that can be seeded interactively. The SDvision tool renders the scene through either OpenGL hardware-based rendering or a pure software computation. It is used to produce stereo renderings of the data distribution, with a module producing the input for the new generation of autostereoscopic high-definition screens.
|top|


Illusion - A Fabry-Perot Data-Cube Synthesizer.
Bruno Correa Quint Universidade de Sao Paulo, SP, Brazil
Illusion is a software package that synthesizes Fabry-Perot data for the BTFI project. BTFI stands for Brazilian Tunable Filter Imager, an instrument being developed at the Universidade de Sao Paulo in collaboration with several other institutes in Brazil, Canada and France; it will be installed on the SOAR telescope in Chile. Illusion synthesizes data to help in the development of data-reduction routines, to train observers, and to simulate observations in different contexts. The synthesized data are obtained by considering different kinds of sources located in the telescope's image plane, and are described essentially by the Airy profile. The results are data-cubes in FITS format, which have been extremely important for developing new data-reduction routines and for training the BTFI software group.
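The Airy profile mentioned above is the standard Fabry-Perot transmission function, T(delta) = 1 / (1 + F sin^2(delta/2)) with finesse coefficient F = 4R/(1-R)^2 and delta the round-trip phase. A toy evaluation (parameters illustrative, not BTFI values):

    # Evaluate the Fabry-Perot Airy transmission profile across a band.
    import numpy as np

    def airy_transmission(wavelength_nm, gap_nm, reflectivity):
        delta = 4.0 * np.pi * gap_nm / wavelength_nm   # normal incidence, n = 1
        F = 4.0 * reflectivity / (1.0 - reflectivity) ** 2
        return 1.0 / (1.0 + F * np.sin(delta / 2.0) ** 2)

    wl = np.linspace(650.0, 660.0, 2000)               # wavelengths in nm
    profile = airy_transmission(wl, gap_nm=20000.0, reflectivity=0.9)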
|top|


Implementation of the global parameters determination in Gaia's Astrometric Solution (AGIS).
Frederic Raison European Space Astronomy Centre of ESA, Madrid, Spain
Gaia is ESA's space astrometry mission with a foreseen launch date in early 2012. Its main objective is to perform a stellar census of the 1000 million brightest objects in our galaxy (completeness to V=20 mag) from which an astrometric catalog of micro-arcsec level accuracy will be constructed. A key element in this endeavor is the Astrometric Global Iterative Solution (AGIS). A core part of AGIS is to determine the accurate spacecraft attitude, geometric instrument calibration and astrometric model parameters for a well-behaved subset of all the objects (the 'primary stars'). In addition, a small number of global parameters will be estimated, one of these being PPN γ. We present here the implementation of the algorithms dedicated to the determination of the global parameters.
|top|


Extending the device support for the ALMA Control subsystem code generation framework
Johnny William Reveco Associated Universities, Inc. (AUI)
The existing code generation framework in the ALMA Control subsystem provides basic, functional code for ALMA antenna devices that use the CAN bus as their communication interface. There are also devices which use Ethernet communication. The purpose of this work is to extend the code generation framework, based on openArchitectureWare, to include the generation of Ethernet control components working on top of the ALMA Common Software (ACS) distributed control framework. To achieve this, a new data model and new class templates are needed, and the design of the device components has to be adapted.

With the approach presented in this paper, which was implemented in early 2009 and put into production in late 2009, it was possible to build a common base for the code-generator data models of both Ethernet and CAN devices, and to define a new interface for controlling Ethernet devices. The new model is expected to be a generic approach, since it should be simple to extend to other device bus types. In general terms, this is a step towards generic device support through code generation.
|top|


A fast asteroid detection algorithm
Tsuyoshi Sakamoto Japan Spaceguard Association
We present an asteroid detection algorithm that performs fast and automatic processing of the large volumes of data produced by wide-field surveys. Asteroids have typical daily motions of 15 arcminutes or more across the sky, and accurate orbit determination requires follow-up observations as soon as they are detected. In particular, asteroids passing near the Earth's orbit (Near Earth Asteroids, hereafter NEAs) move a few degrees per day, and starting follow-up observations within a day is essential. The advent of mosaic CCD cameras has enabled wide-field surveys for NEAs covering about one thousand square degrees per night, such as the Catalina Sky Survey and the Panoramic Survey Telescope & Rapid Response System. It is therefore imperative to construct algorithms that detect faint asteroids with various motions quickly and automatically from large volumes of data. Recent works have constructed algorithms that detect very faint asteroids automatically, such as the matched-filter method and the method that combines multiple frames along the motion of main-belt asteroids. However, detecting asteroids in large volumes of data with these algorithms is too time-consuming, because they rely only on a velocity-matching technique that looks for objects shifting in a linear fashion across a temporal sequence of frames; this requires a search over a very wide area, in particular within a circle of radius larger than a few arcminutes in all directions for the detection of NEAs. The light profiles of asteroids are good indicators for the search because they are elongated in the direction of motion if the motion during the exposure exceeds the seeing. The light profiles are also distorted by the telescope, instrument and other optics, particularly near the edge of the image, so we take into account the light profiles of stars near each asteroid candidate. We thus confine the search area by predicting the direction of motion from the position angle of the ellipse of the light profile. We construct a fast asteroid detection algorithm that combines the light-profile analysis with a velocity-matching technique. We check the run-time efficiency and the detection rate for asteroids of various motions and magnitudes by applying our algorithm to imaging data from the 2KCCD camera on the 1-m telescope at Kiso Observatory, the Mosaic camera on the 0.9-m telescope at the WIYN observatory, and other telescopes.
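One standard way to realize the light-profile analysis described above is to estimate a trail's position angle from second image moments, as in the sketch below (the authors' exact method may differ):

    # Position angle of an elongated source from second image moments,
    # measured from the x axis; illustrative only.
    import numpy as np

    def position_angle(cutout):
        y, x = np.mgrid[:cutout.shape[0], :cutout.shape[1]]
        w = np.clip(cutout, 0, None)           # use positive flux as weights
        tot = w.sum()
        xc, yc = (w * x).sum() / tot, (w * y).sum() / tot
        mxx = (w * (x - xc) ** 2).sum() / tot
        myy = (w * (y - yc) ** 2).sum() / tot
        mxy = (w * (x - xc) * (y - yc)).sum() / tot
        return 0.5 * np.degrees(np.arctan2(2 * mxy, mxx - myy))

    # A fake trail elongated at ~45 degrees:
    yy, xx = np.mgrid[:33, :33]
    trail = np.exp(-(((xx - yy) ** 2) / 4.0 + ((xx + yy - 32) ** 2) / 100.0))
    print(position_angle(trail))               # prints ~45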
|top|


Data Provenance: Use Cases for the ESO archive, and interactions with the Virtual Observatory
Juande Santander Vela European Southern Observatory
In the Virtual Observatory era, where we intend to expose scientists, or software agents acting on their behalf, to a stream of observations from all existing facilities, the ability to access, and further interpret, the origin, relationships, and processing steps of archived astronomical assets (their provenance) is a requirement for proper observation selection and quality assessment. In this poster we present the different use cases for which data provenance is needed, the many challenges inherent in the ESO archive, and their links with ongoing work in the IVOA.
|top|


Environment Study of AGNs at z = 0.3 to 3.0 using the Japanese Virtual Observatory
Yuji Shirasaki NAOJ, ADC

We present a science use case of the Virtual Observatory, actually carried out to examine the environments of AGNs up to a redshift of 3.0. We used the Japanese Virtual Observatory (JVO) to obtain Subaru Suprime-Cam images around known AGNs.

It is thought that the origin of AGN activity is the accretion of matter onto a massive black hole at the center of the galaxy. One possible mechanism for causing the rapid inflow of gas into the central region is a major merger between gas-rich galaxies. If this is the case, AGNs are expected to be found in environments of higher galaxy density than typical galaxies, so AGNs produced at higher redshifts should be observed in high galaxy-density environments.

Current observations, however, indicate that AGNs do not reside in particularly high-density environments. We must nevertheless be cautious with the results obtained at high redshift (z > 0.6): all the high-redshift studies are based on galaxies selected by a color cut, so some kinds of galaxies are missing from the samples, and the observations are limited to a few specific fields, so the results are strongly affected by cosmic variance.

We investigated ~1000 AGNs, distributed over a wide area of the sky, without applying any color selection. The number of AGNs examined is about ten times larger than in other studies covering redshifts above 0.6. We found a significant excess of galaxies around AGNs at redshifts of 0.3 to 1.8.

If this work were done in the classical manner, with raw data retrieved from the archive through an interactive web interface and reduced on a modest local machine, it might take several years to finish. We therefore developed a parallel computing system that communicates with the Subaru data archive at a bandwidth of up to 32 Gbps. All the public Suprime-Cam data were reduced with this system and are provided through the JVO system, so an environment study can be carried out for any type of object using the deep Suprime-Cam images with very little effort.

We have constructed a pipeline for calculating the galaxy number density profile around a given coordinate. The analysis procedure performed in the pipeline is: search the Suprime-Cam images and UKIDSS catalog data for a given AGN field, retrieve the data from the archive, analyze them on the local machine, and create a radial distribution of galaxy counts and the effective area around the specified coordinate. This procedure was executed in parallel on ~10 quad-core PCs, and it took only one day to obtain the final result. Our result implies that the Japanese Virtual Observatory can be a powerful tool for investigating the large-scale structure of the intermediate-redshift Universe.
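The density-profile step might be sketched as follows, counting catalogued galaxies in annuli around a position and normalizing by annulus area (flat-sky approximation; schematic only, not the JVO pipeline):

    # Radial galaxy number density around a coordinate, in galaxies/arcmin^2.
    import numpy as np

    def radial_density(ra, dec, ra0, dec0, edges_arcmin):
        dx = (np.asarray(ra) - ra0) * np.cos(np.radians(dec0)) * 60.0
        dy = (np.asarray(dec) - dec0) * 60.0
        r = np.hypot(dx, dy)                           # separation in arcmin
        counts, _ = np.histogram(r, bins=edges_arcmin)
        area = np.pi * np.diff(np.asarray(edges_arcmin) ** 2)
        return counts / area

    edges = np.arange(0.0, 10.5, 0.5)
    rng = np.random.default_rng(1)
    dens = radial_density(rng.uniform(149.9, 150.1, 500),
                          rng.uniform(1.9, 2.1, 500), 150.0, 2.0, edges)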

|top|


Real-Time Discovery and Classification of Sparsely Sampled Science Using Berkeley's Transient Classification Pipeline
Dan Starr UC Berkeley, Department of Astronomy
The Berkeley Transients Classification Pipeline is a parallelized source identification, classification, and broadcast pipeline which currently identifies transient and variable science observed by the Palomar Transient Factory's all-sky survey. We will discuss successful science classifiers and light-curve characterizing algorithms which are currently applied to the PTF survey, as well as solutions to challenges which arise when filtering out spurious detections from a 100-Megapixel, 7.8-degree field subtraction pipeline. One such solution is a 2D-image classifier which we've trained to exclude poorly subtracted sources using input that has been "crowd sourced" from a dozen users of our GroupThink web application. The TCP's science classifications are made using a weighted combination of machine learning algorithms which have been trained using data from DotAstro.org's light-curve warehouse.
|top|


Parallel CLEAN: beyond the frequency domain.
Ian M. Stewart University of Cape Town, Department of Astronomy.
The CLEAN algorithm, despite its problematic convergence properties and its need for heuristic intervention and tuning, is the most commonly used means of attenuating sampling artifacts from images made through radio interferometry. A subsequent modification (Sault and Wieringa 1994) adapted CLEAN to observations made using a wide fractional bandwidth, which are not well handled by the original algorithm. Here a generalized version of the Sault-Wieringa algorithm is applied to other cases where standard CLEAN also fails, including time-variable and off-pixel-centre sources. A 1-dimensional version of the latter technique can also be applied to extract non-integer frequency values from periodograms.
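For context, the sketch below is the textbook 1-D Hogbom CLEAN loop that the Sault-Wieringa method and this work generalize; it is illustrative only:

    # Minimal 1-D Hogbom CLEAN: iteratively subtract scaled, shifted copies
    # of the dirty beam at the residual peak.
    import numpy as np

    def hogbom_clean(dirty, psf, gain=0.1, n_iter=200, threshold=1e-3):
        residual = dirty.copy()
        model = np.zeros_like(dirty)
        centre = len(psf) // 2
        for _ in range(n_iter):
            peak = np.argmax(np.abs(residual))
            if np.abs(residual[peak]) < threshold:
                break
            flux = gain * residual[peak]
            model[peak] += flux
            lo, hi = peak - centre, peak - centre + len(psf)
            s = slice(max(lo, 0), min(hi, len(residual)))
            residual[s] -= flux * psf[max(0, -lo):len(psf) - max(0, hi - len(residual))]
        return model, residual

    x = np.zeros(256)
    x[100] = 1.0
    x[130] = 0.5
    psf = np.sinc(np.linspace(-8, 8, 65))    # toy dirty beam
    dirty = np.convolve(x, psf, mode="same")
    model, resid = hogbom_clean(dirty, psf)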
|top|


User Support in the Virtual Astronomical Observatory
Elizabeth B Stobie National Optical Astronomy Observatory
As the U.S. Virtual Astronomical Observatory (VAO) enters its operational phase, support for its growing number of users will become ever more important. The VAO User Support group has responsibility in three key areas: (1) for maintaining and expanding the VAO Help Desk and Web Site, which will give access to all the catalogs, services, and tools that the VAO provides; (2) for conducting aggressive Testing and Readiness Reviews, so that those tools and services will work reliably and will be well documented; and (3) for pursuing diverse opportunities for Training and Advocacy, including tutorials, seminars, and support for advanced use of the VAO. We will present our implementation plans with the expectation of receiving feedback from the community in order to develop a more effective User Support system for the VAO.
|top|


Single dish observation simulator in CASA
Kanako Sugimoto National Astronomical Observatory of Japan
Simulated observational data for complex astronomical instruments such as ALMA show users what sensitivities and dynamic ranges are achievable with an observation, and help them plan observations. In addition, simulation enables us to test data reduction procedures and their reliability by comparing reduced simulated data with the input model. To implement the capability of simulating single-dish observations in CASA (Common Astronomy Software Applications), a task for single-dish observation simulation (the SD simulator) is under development, and the first prototype will be available in the next release, scheduled for the end of 2009. The CASA SD simulator task predicts observational data and an image from an input model image representing the sky to be observed. The task functionality is realized by binding functions in C++ libraries (tools) with a Python script (see the abstract by Nakazato et al. for details of the CASA architecture). Taking advantage of the simulation, single-dish data reduction, and imager tools, users can simulate observations with specified settings, e.g., telescope, date, integration time, spatial gridding, etc. The CASA SD simulator can so far simulate data from pointed observations; it is planned to implement all observational modes of the ALMA (ACA) total-power antennas in the future, including On-The-Fly observations in combination with either position switching or frequency switching. It is also planned to enable the addition of various errors to corrupt the simulated data, such as thermal noise, pointing errors, band characteristics, and atmospheric emission and absorption. SD/total-power simulation will also be integrated at the Python level with the existing synthesis simulation in CASA (the simdata task). When completed, users will be able to seamlessly simulate a dataset containing 12-m ALMA visibilities, 7-m ACA visibilities, and total power, using a single sky model and corrupted with consistent atmospheric and instrumental effects. We report the current status of development, future plans, and perspectives of the single-dish observation simulator in CASA.
|top|


Doppler Shift Correction for 2SB Receivers of the 45m Telescope at the Nobeyama Radio Observatory
Shigeru Takahashi Nobeyama Radio Observatory, National Astronomical Observatory of Japan
Since the radial velocity between an object and an observer varies with time due to the rotation and revolution of the Earth, spectral-line observations with a radio telescope need Doppler-shift corrections of the data obtained. Converting observed frequencies into rest frequencies allows us to compare the observed spectra with those measured in the laboratory and to identify the molecules present in the observed objects. There are several methods of Doppler correction, and a method is chosen to suit each receiver and observing technique. To date, when observations are performed with the position-switching method, the receivers (HEMT22, SIS80/100, BEARS, etc.) of the 45m telescope at Nobeyama Radio Observatory (NRO) have adopted a local-frequency (Lof) correction method: the correction is done by changing the Lof (synthesizer frequency) at every integration. Since the 2008 season, two new units of a two-sideband-separating (2SB) receiver (T100H/T100V) have been installed for public use. These receivers observe both sidebands simultaneously, so the conventional NRO correction method is not suitable: it can be applied to only one sideband, and the correction of the other sideband is then inaccurate. To solve this problem, we have developed a new Doppler-shift correction method for the 2SB receivers: the Lof is set to the value of the first integration and kept fixed throughout an observation, and all time-varying corrections are done in software. In this poster, we present this software Doppler-shift correction method for the 2SB receivers and compare it with the conventional Lof correction method.
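In software, such a correction amounts to mapping each observed (topocentric) channel frequency to a rest-frame frequency using the time-varying radial velocity; a first-order sketch (the sign convention and all NRO pipeline details are assumptions here):

    # First-order Doppler mapping of observed to rest-frame frequencies.
    import numpy as np

    C_KMS = 299792.458

    def to_rest_frequency(f_obs_ghz, v_radial_kms):
        """v_radial_kms > 0 for a receding source; valid to first order in v/c."""
        return f_obs_ghz * (1.0 + v_radial_kms / C_KMS)

    channels = np.linspace(115.25, 115.30, 512)      # observed frequencies (GHz)
    rest = to_rest_frequency(channels, v_radial_kms=-12.3)  # illustrative velocity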
|top|


Development of AKARI reduction tools for IRC slow-scan
Satoshi Takita ISAS/JAXA
The first Japanese infrared astronomical satellite AKARI has made observations with two instruments: the Infrared Camera (IRC; 1.7--26.5 micron) and the Far-Infrared Surveyor (FIS; 50--180 micron). AKARI carried out an all-sky survey in six wavelength bands (9, 18, 65, 90, 140, 160 micron) from May 2006 to August 2007. In addition, thousands of pointed observations of selected targets have been performed. In the pointed observation mode, the IRC offers three observation modes (imaging, spectroscopy and slow-scan) and the FIS two (slow-scan and spectroscopy). In the slow-scan mode, data acquisition runs continuously while the telescope scans the sky at a much slower speed (8, 15 or 30 arcsec/sec) than that of the all-sky survey (215 arcsec/sec). We present the details of the data reduction software for the IRC slow-scan observations. The data are prepared in a dedicated format called TSD (Time-Series Data), originally designed for the FIS; the format makes it possible to provide all necessary information for each data acquisition in a row of a table in a FITS file. Basic data processing such as flat-fielding is applied to the TSD-format data. The highlight of the processing is the 'self pointing reconstruction': the scan position information provided by the satellite telemetry is not accurate enough for the IRC observations, so we developed software that makes a time-dependent correction based on the observed sources by comparison with a position reference catalogue. With this system we achieved a position accuracy of the reduced images and point-source information as good as one arcsecond. The details of the algorithm and software system will be described.
|top|


Data Archive and Transmission System (DARTS) of ISAS/JAXA
Takayuki Tamura ISAS/JAXA
The Data ARchives and Transmission System (DARTS) is a versatile space science data archive for astrophysics, solar physics, solar-terrestrial physics, and lunar and planetary science. In the astrophysics part we have archived and released data from ASCA and Suzaku (X-ray astronomy), IRTS and AKARI (infrared astronomy), and other missions. Recent new services include the Suzaku and AKARI query pages, JUDO (sky navigation), UDON (quick-look analysis of X-ray data), and public education pages. Our current and future data archive services will be introduced.
|top|


Impact of Gfarm, a Wide-area Distributed File System, upon Astronomical Data Analysis and Virtual Observatory.
Masahiro Tanaka University of Tsukuba, Center for Computational Sciences
While 100 TB-scale astronomical data are available through Virtual Observatories, several issues remain for large-scale data analysis, including transferring large amounts of data and securing sufficient storage capacity. We therefore propose a VO-capable file system offering easy access to astronomical data, utilizing Gfarm, a wide-area distributed file system developed as an e-Science infrastructure. Gfarm federates storage systems over a wide area and is designed to achieve high reliability and high performance by exploiting file replicas and distributed file access. These features facilitate large-scale astronomical data analysis in research collaborations between multiple distant organizations. We discuss a file system structure and search method compliant with VO standards, and the initial performance of data analysis on this system.
|top|


Building the Spitzer Source List
Harry Isaac Teplitz IRSA
The Spitzer Science Center will produce a source list (SL) of photometry for a large subset of the imaging data in the Spitzer Heritage Archive (SHA). The list will enable a wide range of science projects. The primary requirement on the SL is very high reliability, with areal coverage, completeness and limiting depth being secondary considerations. The SHA at the NASA Infrared Science Archive (IRSA) will serve the SL as an enhanced data product. The SL will include data from the four channels of IRAC (3-8 microns) and the 24 micron channel of MIPS. The Source List will include image products (mosaics) and photometric data for Spitzer observations of about 1500 square degrees, comprising around 30 million sources. We describe the plans and timeline for the development of the Spitzer Source List, demonstrate the verification of the Source List pipeline using Spitzer Legacy catalogs as "truth tables", and discuss the range of use cases which will be supported.
|top|


Adding support to ACS for Real-Time operations through the use of a POSIX-compliant RTOS
Rodrigo Javier Tobar Universidad Tecnica Federico Santa Maria
The ALMA project consists of a group of about 60 antennas working in synchronization to observe celestial bodies. The distributed control of this project will be done using the ALMA Common Software (ACS). Nevertheless, this framework lacks the real-time capabilities needed to control the antennas' instrumentation, as has been shown by previous work, and this has led to non-portable workarounds. Indeed, the behavior of the time service used in ACS, which is based on the Container/Component model, confirms the lack of support for real-time operations in ACS. This paper addresses the problem of designing and integrating a real-time service for ACS, providing the framework with an implementation such that control operations over the different instruments can be done within real-time constraints. This implementation is measured with the same metrics as the current time service, showing the difference between the two systems when subjecting them to common scenarios. The new implementation follows the POSIX specification, ensuring interoperability and portability across time and different operating systems.
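At the operating-system level, requesting POSIX real-time scheduling is a small operation; the sketch below shows it from Python on Linux (root or CAP_SYS_NICE required) and is purely illustrative, whereas the work described here concerns C++ components on an RTOS:

    # Ask the kernel for FIFO real-time scheduling (POSIX); Linux, privileged.
    import os

    param = os.sched_param(50)                      # static priority, 1..99
    os.sched_setscheduler(0, os.SCHED_FIFO, param)  # 0 = the calling process
    print("policy:", os.sched_getscheduler(0))      # SCHED_FIFO from now on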
|top|


A Boosting approach for the detection of faint compact sources in wide field aperture synthesis radio images
Albert Torrent University of Girona
Several thresholding techniques have been proposed to perform faint compact-source detection in wide-field interferometric radio images. Due to their low intensity-to-noise ratio, some objects can easily be missed by these automatic detection methods. In this paper we present a novel approach to overcome this problem. Our proposal is based on local features extracted from a bank of filters, which provide a description of different types of faint source structures. Our approach performs an initial training step to automatically learn and select the most salient features, which are then used in a Boosting classifier to perform the detection. The validity of our method is demonstrated using 19 images composing a 2.5 deg x 2.5 deg radio mosaic, obtained with the Giant Metrewave Radio Telescope, centered on the MGRO J2019+37 peak of gamma-ray emission in the Cygnus region. A comparison with two previously published radio catalogues of this region (obtained with the AIPS task SAD and with SExtractor) is also provided.
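A toy version of the approach, filter-bank features fed to a boosting classifier, can be sketched with scikit-learn's AdaBoost (not the authors' implementation; the features and training data below are fabricated):

    # Boosted classification of image patches using simple filter responses.
    import numpy as np
    from scipy import ndimage
    from sklearn.ensemble import AdaBoostClassifier

    def filter_bank_features(patch):
        """A few simple local features per patch (illustrative)."""
        return np.array([
            patch.mean(),
            patch.std(),
            ndimage.gaussian_filter(patch, 1.0).max(),
            ndimage.laplace(patch).min(),
        ])

    rng = np.random.default_rng(0)
    yy, xx = np.mgrid[:9, :9]
    bump = 0.8 * np.exp(-(((yy - 4) ** 2 + (xx - 4) ** 2) / 4.0))  # faint source
    noise = [rng.normal(0, 1, (9, 9)) for _ in range(300)]
    faint = [rng.normal(0, 1, (9, 9)) + bump for _ in range(300)]

    X = np.array([filter_bank_features(p) for p in noise + faint])
    y = np.array([0] * 300 + [1] * 300)       # 0 = empty sky, 1 = faint source
    clf = AdaBoostClassifier(n_estimators=100).fit(X, y)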
|top|


Concept of VSOP-2 Science Operation Center (SOC: tentative)
Tomofumi Umemoto National Astronomical Observatory of Japan
NAOJ, along with ISAS/JAXA, plans the construction of a Science Operation Center (SOC; tentative name) whose role is the scheduling of the space and ground telescopes, the management of the science data, and the assistance of users worldwide. The SOC is indispensable for the success of the VSOP-2 project: it is the point of contact for researchers who use VSOP-2, and it plays an important role in enabling researchers to obtain the maximum scientific output. For this, the SOC must provide functions for organizing the ground telescopes and scheduling observations; managing and supporting correlation processing; automatic calibration and analysis of the data with pipeline processing; archiving the huge data volumes and building the database; detailed assistance with data analysis for visiting users; and education and public outreach. In this poster, we introduce the computing system and software development for the VSOP-2 data analysis.
|top|


Flexible Scientific Processing within the WSO-UV observatory
Rafael Vazquez-Osorio GMV, Aerospace and Defence, S.A.
The World Space Observatory/Ultraviolet (WSO/UV) is a major international collaboration involving researchers from several countries, with Russia playing the leading role through the Russian Federal Space Agency, ROSCOSMOS. It aims to study the Universe in the 100-320 nm ultraviolet wavelength range with a space observatory based on a primary mirror of 170 cm diameter. Its expected lifetime may extend over nearly 10 years, with a launch date set in 2012. To maximize the scientific return over such a long lifetime, the ground segment architecture is based on a modular approach, relying on a common framework able to run together different subsystems developed by different agencies and institutes, which may be fully upgraded or even replaced over the years. Another key point for successful operations is a very flexible approach to the scientific processing of data within the WSO/UV Ground Segment. In the framework of the current design and development activities, we are working on the design and prototyping of the Scientific Data Processing Centre (SDPC), which will be in charge of processing (and reprocessing) Observation Data Sets (ODS), providing end users with the mission's final science products, ready for scientific use. From a design point of view, the goal is a fully modular and configurable SDPC. The processing is driven by configuration files covering all the processing steps, from the highest one, the observation, down to the lowest one, the atomic step, where a single algorithm is applied to a set of data inputs. Algorithms can be provided as integrated functionality in the form of callable functions (e.g. Java/C classes) or as any runnable module, including interpreted languages (e.g. Python, IDL, shells). In the future, this SDPC framework will support not only reduction steps but also high-level processing steps, providing extended products via cross-checks with internal and external catalogues and quality assessment.


6D visualization of multidimensional data by means of cognitive technology
Vladimir V. Vitkovskiy Special Astrophysical Observatory of RAS, Informatics Department
The problems of analyzing huge volumes of multidimensional data can be addressed with modern information technologies and, above all, with cognitive computer graphics. A new methodology must ensure the successful application of software for the visualization of multidimensional data and of visual programming systems. New procedures and tools for working with cognitive graphic figures give impetus to the development of fundamentally new algorithms and software for the visualization of experimental data. Cognitive machine drawing can be developed to generate visual representations of the content of modern databases, system archives and data banks. On the basis of the cognitive graphics concept, we developed the Space Walker system for visualization and analysis. The system dynamically generates three-dimensional projections of multidimensional data in the form of moving three-dimensional images on the computer screen. It allows researchers to train and sharpen their intuition, to raise their interest and motivation for creative scientific cognition, and to engage in a direct dialogue with the problem itself. The Space Hedgehog system is the next step in cognitive tools for multidimensional data analysis. Its technique and technology for cognitive 6D visualization of multidimensional data were developed on the basis of our research into cognitive visualization. The Space Hedgehog system allows direct dynamic visualization of 6D objects, and was developed using the experience gained in creating the Space Walker program and its applications. We emphasize that the content of terabyte-scale multidimensional arrays can be represented and analyzed with such cognitive tools.
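The abstract gives no implementation details of Space Walker or Space Hedgehog; the sketch below only illustrates the underlying idea of dynamic projection, smoothly rotating a 6-D point cloud and viewing three coordinates at a time. The rotation scheme and data are our own assumptions.

import numpy as np

def rotation_6d(t):
    """6-D rotation mixing axis pairs (0,3), (1,4), (2,5) by angle t."""
    R = np.eye(6)
    c, s = np.cos(t), np.sin(t)
    for i in range(3):
        R[i, i], R[i, i + 3] = c, -s
        R[i + 3, i], R[i + 3, i + 3] = s, c
    return R

points = np.random.normal(size=(1000, 6))  # placeholder 6-D data set

# A 'dynamic projection': as t advances, the visible 3-D scatter changes
# smoothly, exposing structure that lives in the hidden dimensions.
for t in np.linspace(0.0, np.pi / 2, 5):
    xyz = (points @ rotation_6d(t).T)[:, :3]  # rotate, keep 3 coordinates
    print(f"t={t:.2f}  spread per axis: {xyz.std(axis=0)}")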


Data mining in the SIMBAD database log files
Marc Wenger Strasbourg Observatory, C.D.S.
The SIMBAD database software logs information about every query, both from the web server and from the SIMBAD software itself. In recent years, the number of queries has increased from 30,000 to 300,000 per day, giving a good basis for studying this information. Several kinds of results will be presented: how people work, from session durations to preferred working hours by country; combining the query IP addresses with an IP geolocation database to highlight the most active places in astronomy in the world, as well as some more surprising ones; and analyzing the types of objects and the sky areas queried to show trends in astronomers' interests. Finally, the SIMBAD log files sometimes allow particular events to be visualized.
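The abstract does not describe the log format or tooling; the toy sketch below merely illustrates the kind of aggregation involved, mapping query IPs to countries through a lookup table that stands in for a real geolocation database, then counting activity per country and per hour. All records and table entries are made up.

from collections import Counter

# Hypothetical pre-parsed log records: (ip_address, query_hour) tuples.
log_records = [("130.79.128.5", 14), ("192.33.115.2", 9), ("130.79.128.5", 15)]

# Hypothetical IP-prefix -> country table standing in for a geolocation DB.
GEO_TABLE = {"130.79": "FR", "192.33": "CH"}

def country_of(ip):
    return GEO_TABLE.get(".".join(ip.split(".")[:2]), "unknown")

queries_by_country = Counter(country_of(ip) for ip, _ in log_records)
hours_by_country = Counter((country_of(ip), hour) for ip, hour in log_records)

print(queries_by_country.most_common())   # most active places
print(hours_by_country)                   # preferred working hours by country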


The ALMA Front-end Archive Setup and Performance
Andreas Wicenec European Southern Observatory
The ALMA front-end archive system has to capture up to 64 MB/s for periods of several days, plus the data of about 100,000 monitor points from all 66 antennas and the correlators. The main science data is delivered through CORBA-based audio/video streams and finally stored on SATA disk arrays hosted on 6 computers and controlled by 12 daemons. All data are collected by software components running on computers in the antennas and then sent through dedicated fiber links to the Array Operations Site at 5000 m elevation and from there to the Operations Support Facility (OSF) at 3000 m. The various hardware and software components have been tuned and tested to meet the performance requirements. This paper describes the setup and the various components in more detail and gives the results of various test runs.
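To make the stated peak rate concrete, the following back-of-envelope arithmetic (ours, not a figure from the abstract) shows the daily volume it implies:

# Volume implied by sustained capture at the stated peak rate.
rate_mb_s = 64                                   # peak science data rate, MB/s
seconds_per_day = 86_400
tb_per_day = rate_mb_s * seconds_per_day / 1e6   # MB -> TB (decimal units)
print(f"{tb_per_day:.1f} TB per day")            # ~5.5 TB/day at peak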


STARS 2 - 2nd generation open-source archiving and query software for the Subaru Telescope
Tom Winegar Subaru Telescope
The Subaru Telescope is in the process of replacing STARS 1, the first-generation software for archiving and querying all FITS files and telescope status logs produced by the telescope. STARS 1 has been operational for almost 10 years and currently manages approximately 3.5 million frames and 100,000 status logs, totalling ~30 TB of combined data. STARS 1 was designed and is supported under a yearly contract with a vendor, using Oracle databases and the C and Perl programming languages. STARS 1, located in Hilo, is matched by a sister archive in Japan, MASTARS, serving the Japanese astronomy community from Mitaka. In the interest of reducing operating expenses and transitioning to in-house support, development of STARS 2 began in 2006 to duplicate the functionality of STARS 1 with open-source databases and modern programming languages. After 3 years of development, STARS 2 is currently in beta-testing with Subaru staff, in preparation for release to observers later this year. Using MySQL databases together with Python and PHP, STARS 2 is intended to run in parallel with STARS 1 for several years, eventually replacing it. Since the actual FITS files and telescope status logs are stored in native format in a ~100 TB archive, both the STARS 1 and STARS 2 databases contain only FITS keyword=value pairs, administrative information, and current archive file locations. In this way, STARS 1 and STARS 2 can operate in parallel for query functions, simultaneously accessing the same archive. STARS registration functions will be transitioned to STARS 2 when required. STARS 1 used HTTP/HTTPS as the transfer mechanism for delivery of files to offsite observers. STARS 2 keeps HTTP/HTTPS as the preferred transfer mechanism and adds outgoing FTP and outgoing SCP as additional protocols for offsite transfer. In addition, STARS 2 benefits from our experience with the limitations of STARS 1 and has additional features requested by users, including:
- Stored Queries: user queries and results listings are stored, and file-by-file transfer-status information is detailed.
- Stored Destinations: FTP and SCP login parameters are stored per user.
- Scheduled Transfers: users can schedule a future time for automatic transfer of files.
- Calibration Sets: calibration frames may be assigned to Calibration Sets, which can then be assigned to more than one ProposalID.
- Improved System Administration: all user activity is logged, and the current status of user queries and transfers is easily monitored.
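As an illustration of the metadata-only design described above, the sketch below keeps FITS keyword=value pairs and archive paths in a small relational table and answers a query without touching the FITS files themselves. SQLite stands in for MySQL here, and the schema is our own assumption, not the actual STARS 2 design.

import sqlite3

# Only metadata lives in the database; the FITS files stay in the archive.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE frame_keywords (
    frame_id TEXT, keyword TEXT, value TEXT, archive_path TEXT)""")
db.executemany(
    "INSERT INTO frame_keywords VALUES (?, ?, ?, ?)",
    [("SUPA0001", "OBJECT", "M31", "/archive/2009/SUPA0001.fits"),
     ("SUPA0001", "FILTER", "W-J-V", "/archive/2009/SUPA0001.fits"),
     ("SUPA0002", "OBJECT", "M33", "/archive/2009/SUPA0002.fits")])

# A query returns archive locations, so a delivery layer (HTTP/FTP/SCP)
# can fetch the native files without the DB ever storing pixel data.
rows = db.execute("""SELECT DISTINCT frame_id, archive_path
                     FROM frame_keywords
                     WHERE keyword = 'OBJECT' AND value = 'M31'""").fetchall()
print(rows)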


BibCat: The Chandra Data Archive Bibliography Cataloging System
Sherry L Winkelman Smithsonian Astrophysical Observatory
The Chandra Data Archive (CDA) has been tracking publications in refereed journals and on-line conference proceedings that are based on Chandra observations since early in the mission. Over the years this database and its associated tools have expanded dramatically. In this paper we describe our newly renovated bibliography architecture, with an emphasis on newly added features, including: auto-scan capabilities, which reduce in an automated fashion the number of papers that need to be manually classified and flag keywords (such as observatory names or surveys) used within papers; multi-user classification, allowing quality-assurance checks; multi-observatory capabilities, allowing multiple facilities to use the same database independently; and plug-in support, allowing access to associated observatory data to describe data links in papers more fully. The usefulness of some of these features speaks for itself, but others are less obvious. As an example, we intend to use the multi-observatory functionality to apply separate classification schemes to papers relating to the CDA and the Chandra Source Catalog, and potentially to other observatories at the Center for Astrophysics. The data-mining aspects of the auto-scanning capabilities can serve many purposes, such as improving searches for Chandra-related papers from both ADS and our bibliography search pages, or linking papers to grants for internal use.
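A toy illustration of the keyword-flagging side of the auto-scan idea: count occurrences of observatory and survey names in a paper's text to triage it for manual classification. The keyword list and scoring are made up; the CDA's actual vocabulary and classifier are not described in the abstract.

import re

# Illustrative keyword list only (longest phrases matched first).
KEYWORDS = ["Chandra Source Catalog", "Chandra", "ACIS", "HRC", "XMM-Newton"]
pattern = re.compile("|".join(re.escape(k) for k in KEYWORDS))

def flag_keywords(paper_text):
    """Return keyword -> occurrence count, to triage papers for review."""
    counts = {}
    for match in pattern.finditer(paper_text):
        counts[match.group(0)] = counts.get(match.group(0), 0) + 1
    return counts

print(flag_keywords("We analyzed ACIS data from Chandra and compared with XMM-Newton."))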


An algorithm of refinement of image alignment for image subtraction
Masafumi YAGI National Astronomical Observatory of Japan
I will present an algorithm for estimating the shift between a pair of almost aligned images. Though a rough estimate of the flux scale and position alignment is available from catalog-matching techniques, even a small residual shift produces many positive-negative (bipolar) patterns around objects in the difference image of the two. The idea of this study is to use these patterns to find the best position. Moving one image around, we can draw many vectors toward the best position, and the intersection of the vectors gives the solution. A future application to rotation refinement will also be discussed.
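The sketch below is not the vector-intersection method of the poster; it merely illustrates the problem setting with a brute-force sub-pixel grid search that minimizes the bipolar residual power in the difference image.

import numpy as np
from scipy import ndimage

def refine_shift(ref, img, search=0.5, step=0.1):
    """Brute-force sub-pixel search: shift 'img' over a small grid and keep
    the offset minimizing the residual power of the difference image."""
    offsets = np.arange(-search, search + step / 2, step)
    best = (0.0, 0.0, np.inf)
    for dy in offsets:
        for dx in offsets:
            diff = ref - ndimage.shift(img, (dy, dx), order=3)
            score = np.sum(diff**2)
            if score < best[2]:
                best = (dy, dx, score)
    return best[:2]

# Synthetic check: a Gaussian blob displaced by a known sub-pixel offset.
yy, xx = np.mgrid[:64, :64]
ref = np.exp(-((yy - 32.0)**2 + (xx - 32.0)**2) / 8.0)
img = ndimage.shift(ref, (-0.3, 0.2), order=3)
print(refine_shift(ref, img))  # should recover approximately (0.3, -0.2)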


SFITSIO -- A next-generation FITS I/O library for C/C++ users
Chisato Yamauchi Japan Aerospace Exploration Agency, Center of Science-satellite Operation and Data Archive
SFITSIO is a newly developed FITS I/O library designed to minimize user effort. This next-generation library supports C-language coding, and users can write their FITS I/O code intuitively. The APIs mirror the structure of FITS, so users do not forget how to use them. With SFITSIO, users can avoid the nuisance of repeatedly consulting the manual while writing code, as is often necessary with current FITS libraries. SFITSIO supports reading and writing of Images, ASCII Tables and Binary Tables, as well as some CFITSIO extensions. Access to compressed files over the network is also supported. We will prepare a tutorial document for this library and distribute it at our poster booth.
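The abstract does not show SFITSIO's API, so we cannot reproduce it here. Purely as a point of comparison, the sketch below performs the same kind of minimal-effort FITS I/O (an image plus a binary table, written and read back) using Python's astropy.io.fits; SFITSIO itself is a C/C++ library with its own interface.

from astropy.io import fits
import numpy as np

# Write an image HDU plus a binary table, then read them back.
image = np.arange(100.0).reshape(10, 10)
table = fits.BinTableHDU.from_columns([
    fits.Column(name="ID", format="J", array=np.array([1, 2])),
    fits.Column(name="FLUX", format="E", array=np.array([3.5, 4.5])),
])
fits.HDUList([fits.PrimaryHDU(image), table]).writeto("demo.fits", overwrite=True)

with fits.open("demo.fits") as hdul:
    print(hdul[0].data.shape, hdul[1].data["FLUX"])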


A Simple Implementation of a 3D Data Cube viewer
Honglin Ye National Radio Astronomy Observatory
One of the most important end products of a spectral line observation is the brightness B as a function of right ascension RA, declination Dec and frequency F. B(RA, Dec, F) is commonly referred to as a data cube or image cube. In practice, the brightness is obtained as an image for each frequency channel separately. While rendering a stack of such images in various ways can illuminate some important aspects of the data cube, finding relations between channels can be difficult and time-consuming. New instruments, such as ALMA and EVLA, that can observe over a broad wavelength range and with up to thousands of channels, will present enormous challenges for the visualization and analysis of such large numbers of paged images. It is often desirable to render the data cube as a whole, to examine it from an arbitrary viewing angle and for any clip region of interest. To meet some of these needs in CASA, I developed a GUI based on the Qt OpenGL module for viewing three-dimensional data. I will describe the design and implementation of this tool and demonstrate its use.
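The cube-as-a-whole view can be sketched with plain NumPy indexing; the example below (not the CASA/Qt OpenGL tool itself) shows how channel images, spectra, moment maps and clip regions are all just slices of one 3-D array.

import numpy as np

# Stack per-channel images into a single cube B(RA, Dec, F); random data
# stands in for a real observation here.
n_ra, n_dec, n_chan = 64, 64, 128
cube = np.random.normal(size=(n_ra, n_dec, n_chan))

channel_map = cube[:, :, 40]   # one frequency channel (an 'image page')
spectrum = cube[32, 32, :]     # spectrum through a single sky position
moment0 = cube.sum(axis=2)     # intensity integrated over frequency

# A 'clip region of interest': a sub-cube in all three axes at once.
subcube = cube[16:48, 16:48, 30:60]
print(channel_map.shape, spectrum.shape, moment0.shape, subcube.shape)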


Experiences Virtualizing ALMA Software
Mauricio Alejandro Zambrano Associated Universities, Inc.
One of the new topics in system administration in recent years has been virtualization, with all the benefits it implies. Hardware and operating systems now offer new features that bring the performance of such environments close to that of real hardware. ALMA software is tested, among other setups, in simulation on a controlled Standard Test Environment (STE). An STE consists of a set of computers and network hardware: a general network services server, a general ALMA services server, a support server, an archive server, real-time computers, a switch and a firewall. We took an open-source hypervisor and built a set of tools to build, manage and deploy complete sets of STEs to test the ALMA Common Software. For many years ALMA has provided pre-installed virtual machines (VMs) for developers. We took this approach one step further, allowing the ALMA software to run in simulation on a fully virtualized STE on a single server. The work we present here shows the goals, scope and policies for virtualization and the benefits of its usage since its adoption in late 2008.


NICI Python Data Reduction
Nelson R Zarate Gemini Observatory
NICI, the new adaptive-optics-supported Near-Infrared Coronagraphic Imager at Gemini Observatory (South), has recently been commissioned and offered to the astronomical community. The reduction software package has been written in Python using numerical routines from NumPy, SciPy and ndimage, as well as the Gemini Astrodata module, which hides the low-level details of the NICI FITS file structure. This is a preliminary release for early public NICI users. We discuss the design of the different Python modules, the science data preparation and basic reduction steps, as well as the implementation of the Angular and Spectral Differential Imaging (ADI/SDI) reduction algorithms and the LOCI method (Locally Optimized Combination of Images), producing the final set of reduced science FITS files for high-contrast imaging applications.
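As background for the ADI step mentioned above, the sketch below shows the bare principle only: subtract the quasi-static PSF, derotate each residual by its parallactic angle, and combine. It is not the NICI pipeline or the LOCI algorithm, and the data are synthetic.

import numpy as np
from scipy import ndimage

def simple_adi(frames, parallactic_angles):
    """Bare-bones ADI: subtract the median (quasi-static) PSF from each
    frame, rotate each residual to a common sky orientation, combine."""
    psf_model = np.median(frames, axis=0)        # quasi-static stellar PSF
    derotated = [
        ndimage.rotate(frame - psf_model, -angle, reshape=False)
        for frame, angle in zip(frames, parallactic_angles)
    ]
    return np.median(derotated, axis=0)          # companions add up; speckle noise averages down

# Synthetic usage: 10 frames with field-rotation angles in degrees.
frames = np.random.normal(size=(10, 64, 64))
angles = np.linspace(0.0, 30.0, 10)
residual = simple_adi(frames, angles)
print(residual.shape)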


Grown up with the VO /The VO-powered PhD thesis/
Ivan Zolotukhin Sternberg Astronomical Institute, Moscow State University
The Virtual Observatory has reached sufficient maturity for routine scientific exploitation by astronomers. To prove this statement, we present a complete VO-powered PhD thesis comprising 4 science cases covering several aspects of Galactic and extragalactic research. These include: (1) a homogeneous search for, and measurement of the main physical parameters of, Galactic open star clusters in huge multi-band photometric surveys; (2) interpretation of the UV-to-NIR colours of nearby galaxies using a large homogeneous dataset including spectroscopy and photometry from SDSS, UKIDSS, and GALEX; (3) a study of the population of faint low-mass X-ray binaries in modern observational archives, imposing physical constraints on this poorly studied type of object; (4) a search for optical counterparts of unidentified X-ray objects with large positional uncertainties in the Galactic Plane. All these studies make heavy use of VO technologies and tools and would not have been achievable without them. The refereed papers published in the framework of this thesis can therefore be added undoubtedly to the growing list of VO-based research works.