Creating/Updating Documents
===========================
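
The examples below all operate on an ``es_client`` instance. As a minimal
setup sketch (the connection URL is illustrative, and this relies on the
bulk parameters shown later being optional), the client can be created
like this:

.. code-block:: python

    from mozdef_util.elasticsearch_client import ElasticsearchClient

    # Connect to Elasticsearch; the URL here is an example value
    es_client = ElasticsearchClient("http://127.0.0.1:9200")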

Create a new event
------------------

.. code-block:: python

    event_dict = {
        "example_key": "example value"
    }
    es_client.save_event(body=event_dict)

Update an existing event
------------------------

.. code-block:: python

    event_dict = {
        "example_key": "example new value"
    }
    # Assuming 12345 is the id of the existing entry
    es_client.save_event(body=event_dict, doc_id="12345")

Create a new alert
------------------

.. code-block:: python

    alert_dict = {
        "example_key": "example value"
    }
    es_client.save_alert(body=alert_dict)

Update an existing alert
------------------------

.. code-block:: python

    alert_dict = {
        "example_key": "example new value"
    }
    # Assuming 12345 is the id of the existing entry
    es_client.save_alert(body=alert_dict, doc_id="12345")

Create a new generic document
-----------------------------

.. code-block:: python

    document_dict = {
        "example_key": "example value"
    }
    es_client.save_object(index='randomindex', doc_type='randomtype', body=document_dict)

Update an existing document
---------------------------

.. code-block:: python

    document_dict = {
        "example_key": "example new value"
    }
    # Assuming 12345 is the id of the existing entry
    es_client.save_object(index='randomindex', doc_type='randomtype', body=document_dict, doc_id="12345")

Bulk Importing
--------------

.. code-block:: python

    from mozdef_util.elasticsearch_client import ElasticsearchClient
    es_client = ElasticsearchClient("http://127.0.0.1:9200", bulk_amount=30, bulk_refresh_time=5)
    es_client.save_event(body={'key': 'value'}, bulk=True)
- ``bulk_amount`` (defaults to 100): how many messages can sit in the bulk queue before they are written to Elasticsearch.
- ``bulk_refresh_time`` (defaults to 30): the number of seconds after which a flush of the bulk queue is forced, even if it is not yet full.
- ``bulk`` (defaults to False): whether the event is added to the bulk queue rather than written immediately.
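
Putting those parameters together, here is a sketch of a bulk import
loop. The event bodies and the count of 100 are illustrative; only
``save_event`` with ``bulk=True`` is taken from the example above:

.. code-block:: python

    from mozdef_util.elasticsearch_client import ElasticsearchClient

    # Flush the queue every 30 events, or every 5 seconds, whichever comes first
    es_client = ElasticsearchClient("http://127.0.0.1:9200", bulk_amount=30, bulk_refresh_time=5)

    for num in range(100):
        # Each call appends to the bulk queue instead of writing immediately
        es_client.save_event(body={"event_number": num}, bulk=True)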