
Kaggle LANL Earthquake Prediction Challenge Project

inzva AI Projects #2 - Earthquake Prediction Kaggle Challenge [1]

Project Team Members

  • Eylül Yalçınkaya (GitHub: eylulyalcinkaya)
  • Cemre Efe Karakaş (GitHub: cemreefe)
  • Macit Giray Gökırmak (GitHub: giraygokirmak)

Project Description

In this project, we try to predict the remaining time until the next earthquake occurs in laboratory conditions, using a model proposed by Andrew Ng in the sequence-models part of his Coursera specialization [2]. The original model detects trigger words (such as "hello google" for Google or "你好百度" for Baidu) in smart devices from acoustic input. That model has a binary output; we modify it to output a float representing the time until the next earthquake, using the acoustic data recorded by the sensors in the laboratory experiment.


The dataset provided by LANL consists of 2 columns and approximately 630 million rows. One column is the acoustic data output by the sensors in the laboratory earthquake experiment, given as integers; the other is the time until the next earthquake, which is computed during preprocessing rather than output by any device.

               acoustic_data   time_to_failure
count           6.291455e+08      6.291455e+08
mean            4.519468e+00      4.477084e-01
std             1.073571e+01      2.612789e+00
min            -5.515000e+03      9.550396e-05
max             5.444000e+03      1.610740e+01
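The summary above is pandas `describe()` output. When loading the training file it helps to pass explicit narrow dtypes, since ~630 million rows at the default int64/float64 would need roughly 10 GB of memory versus under 4 GB with int16/float32. The sketch below uses a tiny in-memory CSV as a stand-in for Kaggle's `train.csv` (pandas is assumed; it is not in the dependency list above).

```python
# Sketch: reading the training data with explicit narrow dtypes.
# A tiny in-memory CSV stands in for Kaggle's ~630M-row train.csv.
import io
import pandas as pd

sample = io.StringIO(
    "acoustic_data,time_to_failure\n"
    "12,1.4690999\n"
    "-5,1.4690998\n"
)
train = pd.read_csv(
    sample,  # replace with "train.csv" for the real dataset
    dtype={"acoustic_data": "int16", "time_to_failure": "float32"},
)
print(train.dtypes)
```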

Project Dependencies

  • TensorFlow 1.12
  • NumPy
  • Keras


Baseline Model

The baseline follows the Trigger Word Detection model described in Ng's Coursera course [2]. A simpler gradient-boosting baseline (Simple baseline with XGBoost.ipynb) placed 457th in the competition, within the top 11%.

How to Run:

Download the dataset from the Kaggle competition page and run the notebook to train the model.
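Training requires cutting the continuous recording into fixed-length examples first, since the raw file is one long series with a per-row label. A minimal sketch, assuming each segment is labeled by the time_to_failure at its last sample (the 150,000-sample default matches the length of the competition's test segments; the function name is illustrative):

```python
import numpy as np

def make_segments(acoustic, ttf, seg_len=150_000):
    """Cut the continuous recording into fixed-length training examples.

    Each segment's label is the time_to_failure at its last sample,
    mirroring how the competition's test segments are scored.
    """
    n = len(acoustic) // seg_len
    X = acoustic[: n * seg_len].reshape(n, seg_len, 1)
    y = ttf[seg_len - 1 :: seg_len][:n]
    return X, y

# Toy demonstration on a 12-sample series with 4-sample segments.
X, y = make_segments(np.arange(12, dtype=np.float32),
                     np.linspace(1.0, 0.0, 12, dtype=np.float32),
                     seg_len=4)
print(X.shape, y.shape)  # (3, 4, 1) (3,)
```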



[2] Ng, A., "Trigger Word Detection", Sequence Models, Coursera Deep Learning Specialization
