
The Palomar Transient Factory


February 9, 2015, Electronic Imaging Conference, San Francisco: Peter Nugent of Lawrence Berkeley National Laboratory described efforts to apply big-data techniques to detecting and observing supernovae. The two conflicting requirements are surveying the whole sky for changes and using higher-power telescopes, which have smaller fields of view, to look farther away.

A supernova explosion is possible in 10-20 percent of the larger stars, and in one month it can emit more energy than our sun will generate over its entire lifetime. Across the observable universe, a supernova occurs roughly once a second, while our galaxy, the Milky Way, sees about one per century. As a result, most supernovae are more than a billion light-years away from us.

In 1054, a supernova occurred in our galactic neighborhood and was observed and recorded; its remnants now comprise the Crab Nebula. It turns out that all of the elements except hydrogen, helium, and possibly lithium are the products of supernovae, since a normal star only fuses the lightest elements.

The availability of high-grade visual analysis tools enabled the first automated searches in 1995. A network of telescopes around the world started with the 4 m Víctor Blanco as the first detector. If an anomaly is determined to be something other than an asteroid or a multiple star system, its coordinates are forwarded to the Keck in Hawaii, to the Hubble, or to Palomar.

One reason for the sequencing of telescopes is that a narrow field of view gives greater precision on nearby objects and can resolve fainter, more distant sources. Astrophysicists get more information from distant stars due to their red shift. The telescopes capture a large amount of data per photograph, and the network generates about 50-100 images a night. The Palomar telescope's camera has 92 M pixels and 1 arc-second resolution. The 4 m telescope has a field of view of about 2.5° x 4°, with an image resolution of about 0.5 arc-second per pixel.
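As a sanity check, pixel scale ties field of view to detector size. The short Python sketch below works through that arithmetic using only the figures quoted above; the function name is just for illustration.

```python
# Back-of-envelope check relating field of view, pixel scale, and
# detector size (illustrative only; numbers are those quoted above).

DEG_TO_ARCSEC = 3600.0

def pixels_needed(fov_deg: float, pixel_scale_arcsec: float) -> int:
    """Detector pixels needed to span a field of view at a given pixel scale."""
    return round(fov_deg * DEG_TO_ARCSEC / pixel_scale_arcsec)

# 4 m telescope: ~2.5 deg x 4 deg field at ~0.5 arcsec per pixel
width = pixels_needed(2.5, 0.5)    # 18,000 pixels
height = pixels_needed(4.0, 0.5)   # 28,800 pixels
print(f"{width} x {height} pixels, {width * height / 1e6:.0f} Mpx total")
# -> roughly 518 Mpx, i.e. a much larger mosaic than the 92 Mpx Palomar camera
```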

The data are transferred between the various locations via FedEx, which is less expensive and faster than wire or fiber connections. The move from manual to computer analysis started with a boosted decision tree. The network can link the smaller telescopes and the Palomar for faster response times. The system detected the initial phases of a supernova in 2009 that resulted from the core collapse of a star of between 10 and 200 solar masses.
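The economics of shipping disks are easy to see with rough numbers. The sketch below compares transfer times; the payload size and link speed are assumptions for illustration, not figures from the talk.

```python
# Rough comparison of shipping disks versus a network link.
# The payload size and link speed below are assumptions, not
# figures from the talk.

def transfer_hours_network(terabytes: float, gbps: float) -> float:
    """Hours to move `terabytes` over a sustained `gbps` link."""
    bits = terabytes * 8e12
    return bits / (gbps * 1e9) / 3600

def transfer_hours_shipping(shipping_days: float) -> float:
    """Hours for a courier shipment, independent of data volume."""
    return shipping_days * 24

payload_tb = 10.0  # assumed nightly payload
print(f"network:  {transfer_hours_network(payload_tb, 1.0):.0f} h")  # ~22 h at 1 Gb/s
print(f"shipping: {transfer_hours_shipping(1.0):.0f} h")             # ~24 h, any size
```

Past that crossover point, every additional terabyte favors the courier, since shipping time does not grow with volume.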

The photometric search starts with a single-color, single-point detection, which is enhanced by the shared resources of 20 different projects. The networking and shared resources allow a quick response to any detected object. The Palomar control center links to the San Diego facility as well as IPAC, LBNL, and Caltech for follow-up marshaling.

The underlying processing starts with a classifier that identifies an object as real and forwards its coordinates to the network. Currently, they move the data to a 225 TB store, where it is available for further processing and inclusion in the database. The data then move to a substation for distribution to the partners. The processing takes 2 M images down to 40 k reference sets and 50 k potential objects. On average, they get about 1.8 k objects of possible interest from 0.5 M subjects.
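A minimal sketch of that reduction funnel, with each stage filtering the candidate list, might look like the following; the stage names and filter predicates are hypothetical placeholders, not the actual pipeline code.

```python
# Hypothetical sketch of a nightly reduction funnel: each stage keeps
# only the candidates that pass, logging how many survive.

def run_funnel(candidates, stages):
    """Apply each (name, keep) stage in order, printing survivor counts."""
    survivors = list(candidates)
    for name, keep in stages:
        survivors = [c for c in survivors if keep(c)]
        print(f"{name}: {len(survivors)} remain")
    return survivors

# Placeholder stages loosely mirroring the funnel described above.
stages = [
    ("subtract reference image", lambda c: c["residual"] > 5.0),
    ("real/bogus classifier",    lambda c: c["rb_score"] > 0.5),
    ("reject asteroids",         lambda c: not c["moving"]),
]

example = [{"residual": 7.2, "rb_score": 0.9, "moving": False}]
interesting = run_funnel(example, stages)
```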

The first and most important question is whether the object is real; of the roughly 1.5 M candidates per night, most are artifacts. The 2k x 4k CCD imagers capture 3 k images per night, and the data are converted for a browser interface. The initial ground-truth process called for qualifying the human scanners, since people have biases toward accepting or rejecting candidate objects. The results of those analyses were used to train the program, which evaluates about 60 parameters per candidate.
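The talk names a boosted decision tree trained on roughly 60 parameters per candidate. A minimal real/bogus training sketch along those lines, using scikit-learn on synthetic data (the library choice, the synthetic features, and the labels are my assumptions, not the project's actual code), could look like this:

```python
# Minimal real/bogus classifier sketch using a boosted decision tree,
# the technique named in the talk. The data here are synthetic stand-ins
# for the ~60 human-vetted evaluation parameters per candidate.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, n_features = 5000, 60           # ~60 evaluation parameters per candidate
X = rng.normal(size=(n, n_features))
y = rng.integers(0, 2, size=n)     # 1 = real (scanner-vetted), 0 = artifact

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

On real features the held-out score, not training accuracy, is what qualifies the classifier to replace human scanners.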

The move to robotic interpretation requires multiple databases that include historical data. They also had members of the general public help populate the databases: 15 k people signed up in the first two weeks of the 2009-2010 effort. The Galaxy Zoo takes 1080p images from the telescopes and lets the crowd help with evaluations. The more highly automated system can now process 95 percent of all images within an hour, and has cut the processing time from 30 minutes to under 15.

As an example of the capabilities, SN 2011fe was caught within 11 hours of initial detection. The groups used gamma-ray bursts as initial indicators and added different sensors as quickly as possible, capturing the orphan afterglow and its fast fade. The enhanced network is now able to detect many new types of events. New shape classifiers enable real-time detection of star-galaxy activity, asteroids, and historical light-curve generation, with alerts to the partners for collaboration. In addition, their real-bogus identification capabilities are much improved.

Optical and imager upgrades will bring online a wide-field imager with a 46° x 46° field of view. The ATLAS instrument will have a 0.5 m telescope with a 10k x 10k detector for asteroid tracking and detection. Some basic limits remain: the imagers cannot detect an object below the detection threshold, and more interlinking of the databases is needed to map changes against historical data. The 8 m LSST will be coming online and will generate 15 TB per night. Any objects found by the LSST will need a new 30 m telescope for better resolution.
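For scale, the quoted LSST rate compounds quickly; a one-line estimate, assuming roughly 300 observing nights per year (the duty cycle is my assumption):

```python
# Quick arithmetic on LSST data volume from the 15 TB/night figure above.
tb_per_night = 15
observing_nights_per_year = 300  # assumed duty cycle, not from the talk
print(f"{tb_per_night * observing_nights_per_year / 1000:.1f} PB per year")
# -> ~4.5 PB per year
```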