Deep-HiTS: Rotation Invariant Convolutional Neural Network for Transient Detection
Author
dc.contributor.author
Cabrera Vives, Guillermo
Author
dc.contributor.author
Reyes, Ignacio
Author
dc.contributor.author
Förster, Francisco
Author
dc.contributor.author
Estévez Valencia, Pablo
Author
dc.contributor.author
Maureira, Juan Carlos
Admission date
dc.date.accessioned
2019-05-29T13:10:08Z
Available date
dc.date.available
2019-05-29T13:10:08Z
Publication date
dc.date.issued
2017
Item citation
dc.identifier.citation
Astrophysical Journal, Volume 836, Issue 1, 2017
Identifier
dc.identifier.issn
1538-4357
Identifier
dc.identifier.issn
0004-637X
Identifier
dc.identifier.other
10.3847/1538-4357/836/1/97
Identifier
dc.identifier.uri
https://repositorio.uchile.cl/handle/2250/168773
Abstract
dc.description.abstract
We introduce Deep-HiTS, a rotation invariant convolutional neural network (CNN) model for classifying images of transient candidates into artifacts or real sources for the High cadence Transient Survey (HiTS). CNNs have the advantage of learning the features automatically from the data while achieving high performance. We compare our CNN model against a feature engineering approach using random forests (RF). We show that our CNN significantly outperforms the RF model, reducing the error by almost half. Furthermore, for a fixed number of approximately 2,000 allowed false transient candidates per night, we are able to reduce the misclassified real transients by approximately 1/5. To the best of our knowledge, this is the first time CNNs have been used to detect astronomical transient events. Our approach will be very useful when processing images from next generation instruments such as the Large Synoptic Survey Telescope (LSST). We have made all our code and data available to the community for the sake of allowing further developments and comparisons at https://github.com/guille-c/Deep-HiTS
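As a rough illustration of what "rotation invariant" means in this context, the sketch below averages a small CNN's class scores over the four 90° rotations of each candidate stamp, one common way to make the artifact/real decision independent of stamp orientation. It is written in PyTorch as an assumption of convenience; the layer sizes, 21x21 stamp dimensions, and channel layout are hypothetical and do not reproduce the published Deep-HiTS architecture, whose actual code is available at the repository linked above.

```python
# Illustrative sketch only: a small CNN whose class scores are averaged over
# the four 90-degree rotations of each candidate stamp. Architecture details
# here are assumptions, not the published Deep-HiTS model.
import torch
import torch.nn as nn

class RotationInvariantCNN(nn.Module):
    def __init__(self, in_channels=4, n_classes=2):
        super().__init__()
        # Each candidate is assumed to be a multi-channel image stamp
        # (e.g. template, science, difference, and SNR images).
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        # Average class scores over the four 90-degree rotations of the input,
        # so the prediction does not depend on stamp orientation.
        logits = 0.0
        for k in range(4):
            rotated = torch.rot90(x, k, dims=(2, 3))
            feats = self.features(rotated).flatten(1)
            logits = logits + self.classifier(feats)
        return logits / 4.0

# Example usage with a batch of hypothetical 21x21 candidate stamps.
model = RotationInvariantCNN()
stamps = torch.randn(8, 4, 21, 21)
scores = model(stamps)  # shape: (8, 2) -> artifact vs. real source
```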