Yr xsup 22806 pan os fetching issues (multiple devices) #26226

Merged

84 commits
74da7cf
new helping func
RosenbergYehuda Apr 13, 2023
99a3fd9
typing
RosenbergYehuda Apr 13, 2023
16eb057
remove auto formated lines
RosenbergYehuda Apr 17, 2023
22c15cc
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda Apr 17, 2023
fd83caa
replace 'seqno' with '@gobid'
RosenbergYehuda Apr 19, 2023
954fb4a
remove other changes
RosenbergYehuda Apr 19, 2023
6492fbd
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda Apr 19, 2023
617c2c5
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda Apr 19, 2023
cfbe927
revert
RosenbergYehuda Apr 19, 2023
dc838ea
add note for the user to narrow down the query
RosenbergYehuda Apr 23, 2023
34b1f96
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda Apr 23, 2023
6730313
remove the Dev
RosenbergYehuda Apr 23, 2023
7a23acf
remove code and add a max id func
RosenbergYehuda Apr 23, 2023
62b658b
try
RosenbergYehuda Apr 23, 2023
4625949
adding a remove duplicates func
RosenbergYehuda Apr 23, 2023
15e48a8
adding support to store a limit per log type
RosenbergYehuda Apr 24, 2023
7ecd468
fixes
RosenbergYehuda Apr 24, 2023
393fae7
using last run directly insted of passing it
RosenbergYehuda Apr 27, 2023
d4d2729
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda Apr 27, 2023
ab3ae5e
prepare to cr
RosenbergYehuda Apr 30, 2023
e889b5e
mypy
RosenbergYehuda Apr 30, 2023
db85a07
add int
RosenbergYehuda Apr 30, 2023
bbd100c
mypy
RosenbergYehuda May 1, 2023
7a7fd29
BC
RosenbergYehuda May 1, 2023
c0e2faa
mypy
RosenbergYehuda May 1, 2023
46c5304
mypy
RosenbergYehuda May 1, 2023
afc24e7
fix previus tests
RosenbergYehuda May 1, 2023
88bc5bd
test
RosenbergYehuda May 1, 2023
b8b204e
test
RosenbergYehuda May 1, 2023
f3a6954
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 1, 2023
409c7b5
test
RosenbergYehuda May 1, 2023
2572ce8
conflict
RosenbergYehuda May 1, 2023
4e91d8f
docker image
RosenbergYehuda May 1, 2023
f067d18
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 2, 2023
3191899
flake 8
RosenbergYehuda May 2, 2023
6b55237
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 2, 2023
1c7ec4a
Shirley fixes
RosenbergYehuda May 2, 2023
90c0e49
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 2, 2023
aadd7da
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 3, 2023
3f04613
Tal's CR
RosenbergYehuda May 3, 2023
ae2121b
mypy
RosenbergYehuda May 3, 2023
a90d0cb
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 3, 2023
55c4944
fix a falling test and a mistake in fixing the func after CR
RosenbergYehuda May 3, 2023
59f6d74
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 3, 2023
ec625c4
CR
RosenbergYehuda May 7, 2023
2f235d8
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 7, 2023
e834484
mypy
RosenbergYehuda May 7, 2023
94eebcf
docker image
RosenbergYehuda May 7, 2023
78afeda
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 7, 2023
616c154
Shachars CR
RosenbergYehuda May 7, 2023
31feaa8
tal katzir CR
RosenbergYehuda May 7, 2023
849318b
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 7, 2023
a1746b9
fix failing unit tests
RosenbergYehuda May 8, 2023
b8875be
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 8, 2023
8ff2297
flake 8
RosenbergYehuda May 8, 2023
6fe2bda
Guy afik CR
RosenbergYehuda May 8, 2023
ed94b81
fix a failed test
RosenbergYehuda May 8, 2023
6501a6f
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 8, 2023
f589711
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 9, 2023
2216618
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 9, 2023
efcd0a7
adding notes for debugging, and fixing a test
RosenbergYehuda May 9, 2023
1c5be0c
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 9, 2023
6486621
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 9, 2023
d563a29
note
RosenbergYehuda May 9, 2023
4b95190
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 9, 2023
093711a
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 9, 2023
06cac1c
remove the note from yesterday
RosenbergYehuda May 10, 2023
1684a5a
adding the 'forward' param to the request
RosenbergYehuda May 10, 2023
99c2309
adding the notes
RosenbergYehuda May 10, 2023
fb138a5
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 10, 2023
a29a60b
docker
RosenbergYehuda May 10, 2023
be8f664
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 10, 2023
adff01f
change debug message
RosenbergYehuda May 11, 2023
3eef476
fixn readme note
RosenbergYehuda May 11, 2023
2501d80
avoid devices from previous cycles to be deleted
RosenbergYehuda May 11, 2023
bc83206
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 11, 2023
a82f62b
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 11, 2023
6516738
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 11, 2023
36b1ce0
typo
RosenbergYehuda May 11, 2023
d6d188b
Update Packs/PAN-OS/ReleaseNotes/1_17_0.md
ShacharKidor May 11, 2023
a65d1e9
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 11, 2023
79c2b40
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 12, 2023
47c05ac
Merge remote-tracking branch 'origin/master' into YR--XSUP-22806]-PAN…
RosenbergYehuda May 12, 2023
85df12e
docker
RosenbergYehuda May 12, 2023
138 changes: 98 additions & 40 deletions Packs/PAN-OS/Integrations/Panorama/Panorama.py
@@ -13147,38 +13147,90 @@ def add_time_filter_to_query_parameter(query: str, last_fetch: datetime) -> str:
     return query + time_generated


-def add_unique_id_filter_to_query_parameter(query: str, last_id: str) -> str:
-    """append unique id filter parameter to original query parameter.
-    'seqno' is a 64-bit log entry identifier incremented sequentially; each log type has unique number space.
-    by adding this filter which gives us only entries that have larger id, we insure not have duplicates in our request.
+def find_largest_id_per_device(incident_entries: List[Dict[str, Any]]) -> Dict[str, str]:
+    """
+    This function finds the largest sequence id per device in the incident entries list.
+    Args:
+        incident_entries (List[Dict[str, Any]]): list of dictionaries representing raw incident entries
+    Returns:
+        new_largest_id: a dictionary of the largest sequence id per device
+    """
+    new_largest_id: Dict[str, str] = {}
+    for entry in incident_entries:
+        device_name: str = entry.get('device_name', '')
+        incident_id: str = entry.get('seqno', '')
+
+        # Upsert the device's id if it's a new device, or it's a larger id
+        if device_name not in new_largest_id.keys() or int(incident_id) > int(new_largest_id[device_name]):
+            new_largest_id[device_name] = incident_id
+    return new_largest_id
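Reviewer's note: a self-contained sketch of what this helper returns, with the helper restated from the diff above and made-up entries (real PAN-OS log entries carry many more fields):

```python
from typing import Any, Dict, List

def find_largest_id_per_device(incident_entries: List[Dict[str, Any]]) -> Dict[str, str]:
    # Track the largest 'seqno' seen so far for each 'device_name'.
    new_largest_id: Dict[str, str] = {}
    for entry in incident_entries:
        device_name: str = entry.get('device_name', '')
        incident_id: str = entry.get('seqno', '')
        # Compare numerically: '99' < '205' as integers, though not as strings.
        if device_name not in new_largest_id or int(incident_id) > int(new_largest_id[device_name]):
            new_largest_id[device_name] = incident_id
    return new_largest_id

# Hypothetical entries from two firewalls sharing one 'Traffic' query:
entries = [
    {'device_name': 'fw-1', 'seqno': '99'},
    {'device_name': 'fw-1', 'seqno': '205'},
    {'device_name': 'fw-2', 'seqno': '17'},
]
print(find_largest_id_per_device(entries))  # {'fw-1': '205', 'fw-2': '17'}
```

The numeric comparison matters because seqno values arrive as strings: a plain string max would rank '99' above '205', which is exactly the class of bug the removed `max({entry.get('seqno', '') ...})` call below was prone to across devices.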


+def remove_duplicate_entries(entries_dict: Dict[str, List[Dict[str, Any]]], id_dict: Dict[str, Dict[str, str]]):
+    """
+    This function removes entries that have already been fetched in the previous fetch cycle.
     Args:
-        query (str): a string representing a query
-        last_id (str): largest unique log entry id from last fetch (for a specific log type)
+        entries_dict (Dict[str, List[Dict[str, Any]]]): a dictionary of log type and its raw entries
+        id_dict (Dict[str, Dict[str, str]]): a dictionary of devices and their largest id so far
+    Returns:
+        new_entries_dict (Dict[str, List[Dict[str, Any]]]): a dictionary of log type and its raw entries, without entries that were already fetched in the previous fetch cycle
+    """
+    new_entries_dict: Dict = {}
+    for log_type in entries_dict:
+        for log in entries_dict[log_type]:
+            device_name = log.get("device_name", '')
+            # The latest id stored for that device; 0 if the device was not seen before.
+            latest_id_per_device = id_dict.get(log_type, {}).get(device_name, 0)
+            if not log.get("seqno") or arg_to_number(log["seqno"]) > arg_to_number(latest_id_per_device):  # type: ignore
+                new_entries_dict.setdefault(log_type, []).append(log)
+    return new_entries_dict
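To make the dedup rule concrete, a minimal sketch under stated assumptions (the XSOAR helper `arg_to_number` is approximated with `int()`, and the entry dicts are made up): a log is kept only if its seqno is strictly greater than the stored per-device maximum, or if it has no seqno at all.

```python
from typing import Any, Dict, List

def remove_duplicate_entries(entries_dict: Dict[str, List[Dict[str, Any]]],
                             id_dict: Dict[str, Dict[str, str]]) -> Dict[str, List[Dict[str, Any]]]:
    # Drop logs whose seqno is at or below the stored maximum for their device.
    new_entries_dict: Dict[str, List[Dict[str, Any]]] = {}
    for log_type, logs in entries_dict.items():
        for log in logs:
            latest_id = id_dict.get(log_type, {}).get(log.get('device_name', ''), 0)
            if not log.get('seqno') or int(log['seqno']) > int(latest_id):
                new_entries_dict.setdefault(log_type, []).append(log)
    return new_entries_dict

fetched = {'Traffic': [{'device_name': 'fw-1', 'seqno': '205'},   # already seen last cycle
                       {'device_name': 'fw-1', 'seqno': '206'}]}  # new
last_ids = {'Traffic': {'fw-1': '205'}}
print(remove_duplicate_entries(fetched, last_ids))
# {'Traffic': [{'device_name': 'fw-1', 'seqno': '206'}]}
```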


+def create_max_fetch_dict(queries_dict: Dict[str, str], configured_max_fetch: int):
+    """
+    This function creates a dictionary of log type and its max fetch value - AKA the max number of entries to fetch.
+    Args:
+        queries_dict (Dict[str, str]): a dictionary of log type and its query
+        configured_max_fetch (int): the max fetch value for the first fetch cycle
     Returns:
-        str: a string representing a query with added time filter parameter
+        max_fetch_dict (Dict[str, int]): a dictionary of log type and its max fetch value
     """
-    if last_id:
-        last_id_int = arg_to_number(last_id)
-        if isinstance(last_id_int, int):
-            # last_id is can be filtered only by '>=' so we need to add 1 to it.
-            last_id_int += 1
-            unique_id_filter = f" and (seqno geq '{last_id_int}')"
-            return query + unique_id_filter
-        else:
-            return query
-    else:
-        return query
+    max_fetch_dict = {}
+    for key in queries_dict.keys():
+        max_fetch_dict.update({key: configured_max_fetch})
+    return max_fetch_dict


+def update_max_fetch_dict(configured_max_fetch: int, max_fetch_dict: Dict[str, int], last_fetch_dict: Dict[str, str]) -> Dict[str, int]:
+    """ This function updates the max fetch value for each log type according to the last fetch timestamp.
+    Args:
+        configured_max_fetch (int): the max fetch value for the first fetch cycle
+        max_fetch_dict (Dict[str, int]): a dictionary of log type and its max fetch value
+        last_fetch_dict (Dict[str, str]): a dictionary of log type and its last fetch timestamp
+    Returns:
+        max_fetch_dict (Dict[str, int]): a dictionary of log type and its updated max fetch value
+    """
+    for log_type in last_fetch_dict:
+        previous_fetch_timestamp = demisto.getLastRun().get("last_fetch_dict", {}).get(log_type)
+        # If the latest timestamp of the current fetch equals the previous fetch timestamp,
+        # we did not get all the logs for that timestamp; in that case, increase the limit to last limit + configured limit.
+        demisto.debug(
+            f"previous_fetch_timestamp: {previous_fetch_timestamp}, last_fetch_dict.get(log_type): {last_fetch_dict.get(log_type)}")
+        if previous_fetch_timestamp and previous_fetch_timestamp == last_fetch_dict.get(log_type):
+            max_fetch_dict[log_type] += configured_max_fetch
+        else:
+            max_fetch_dict[log_type] = configured_max_fetch
+    demisto.debug(f"max_fetch_dict: {max_fetch_dict}")
+    return max_fetch_dict
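
The effect over consecutive cycles, as a self-contained sketch (`demisto.getLastRun()` is replaced by an explicit `previous_timestamps` argument so it runs outside the platform; names here are hypothetical):

```python
from typing import Dict

def next_max_fetch(configured: int, max_fetch: Dict[str, int],
                   last_fetch: Dict[str, str], previous_timestamps: Dict[str, str]) -> Dict[str, int]:
    for log_type, timestamp in last_fetch.items():
        if timestamp and previous_timestamps.get(log_type) == timestamp:
            # The timestamp did not advance: the page was likely full of
            # same-timestamp logs, so widen the next request.
            max_fetch[log_type] += configured
        else:
            # Progress was made: reset to the configured page size.
            max_fetch[log_type] = configured
    return max_fetch

limits = {'Traffic': 100}
stuck = {'Traffic': '2023/05/10 10:00:00'}
print(next_max_fetch(100, limits, stuck, stuck))  # {'Traffic': 200}
print(next_max_fetch(100, limits, {'Traffic': '2023/05/10 10:00:05'}, stuck))  # {'Traffic': 100}
```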


 def fetch_incidents_request(queries_dict: Optional[Dict[str, str]],
-                            max_fetch: int, fetch_start_datetime_dict: Dict[str, datetime], last_id_dict: Dict[str,str]) -> Dict[str, List[Dict[str,Any]]]:
+                            max_fetch_dict: Dict, fetch_start_datetime_dict: Dict[str, datetime]) -> Dict[str, List[Dict[str,Any]]]:
     """get raw entries of incidents according to provided queries, log types and max_fetch parameters.

     Args:
         queries_dict (Optional[Dict[str, str]]): chosen log type queries dictionaries
-        max_fetch (int): max incidents per fetch parameter
+        max_fetch_dict (Dict): max incidents per fetch parameter per log type dictionary
         fetch_start_datetime_dict (Dict[str,datetime]): updated last fetch time per log type dictionary

     Returns:
@@ -13187,23 +13239,22 @@ def fetch_incidents_request(queries_dict: Optional[Dict[str, str]],
     entries = {}
     if queries_dict:
         for log_type, query in queries_dict.items():
+            max_fetch = max_fetch_dict.get(log_type, MAX_INCIDENTS_TO_FETCH)
             fetch_start_time = fetch_start_datetime_dict.get(log_type)
             if fetch_start_time:
                 query = add_time_filter_to_query_parameter(query, fetch_start_time)
-            if id := last_id_dict.get(log_type):
-                query = add_unique_id_filter_to_query_parameter(query, id)
             entries[log_type] = get_query_entries(log_type, query, max_fetch)
     return entries


-def parse_incident_entries(incident_entries: List[Dict[str, Any]]) -> Tuple[str | None ,datetime | None, List[Dict[str, Any]]]:
+def parse_incident_entries(incident_entries: List[Dict[str, Any]]) -> Tuple[Dict[str, str] | None, datetime | None, List[Dict[str, Any]]]:
     """parses raw incident entries of a specific log type query into basic context incidents.

     Args:
         incident_entries (list[dict[str,Any]]): list of dictionaries representing raw incident entries

     Returns:
-        (str | None ,datetime | None, List[Dict[str, Any]]): (updated last fetch time, parsed incident list) tuple
+        Tuple[Dict[str, str], Optional[datetime], List[Dict[str, Any]]]: a tuple of the largest id, the latest fetch time, and a list of parsed incidents
     """
     # if no new incidents are available, return empty list of incidents
     if not incident_entries:
@@ -13214,8 +13265,7 @@ def parse_incident_entries(incident_entries: List[Dict[str, Any]]) -> Tuple[str
     new_fetch_datetime = dateparser.parse(last_fetch_string, settings={'TIMEZONE': 'UTC'})

     # calculate largest unique id for each log type query
-    new_largest_id = max({entry.get('seqno', '') for entry in incident_entries})
-
+    new_largest_id = find_largest_id_per_device(incident_entries)
     # convert incident entries to incident context and filter any empty incidents if exists
     parsed_incidents: List[Dict[str, Any]] = [incident_entry_to_incident_context(incident_entry) for incident_entry in incident_entries]
     filtered_parsed_incidents = list(filter(lambda incident: incident, parsed_incidents))
Expand All @@ -13237,7 +13287,7 @@ def incident_entry_to_incident_context(incident_entry: Dict[str, Any]) -> Dict[s
incident_context = {}
if occurred_datetime:
incident_context = {
'name': incident_entry.get('seqno'),
'name': f"{incident_entry.get('device_name')} {incident_entry.get('seqno')}",
'occurred': occurred_datetime.strftime(DATE_FORMAT),
'rawJSON': json.dumps(incident_entry),
'type': incident_entry.get('type')
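Why the name changed: seqno is only unique per device, so with several firewalls behind one Panorama, two incidents could previously collide on the same name. A tiny illustration with a made-up entry:

```python
# Hypothetical entry; 'fw-1' may reuse seqno 7304 as well, so the device
# name is prefixed to keep incident names unique across devices.
entry = {'device_name': 'fw-2', 'seqno': '7304', 'type': 'TRAFFIC'}
print(f"{entry.get('device_name')} {entry.get('seqno')}")  # fw-2 7304
```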
@@ -13308,14 +13358,14 @@ def log_types_queries_to_dict(params: Dict[str, str]) -> Dict[str, str]:
     return queries_dict


-def get_parsed_incident_entries(incident_entries_dict: Dict[str, List[Dict[str, Any]]], last_fetch_dict: Dict[str,str], last_id_dict: Dict[str,str]) -> Dict[str,Any]:
+def get_parsed_incident_entries(incident_entries_dict: Dict[str, List[Dict[str, Any]]], last_fetch_dict: Dict[str, str], last_id_dict: Dict[str, Dict]) -> Dict[str, Any]:
     """for each log type incident entries array, parse the raw incidents into context incidents.
     if necessary, update the latest fetch time and last ID values in their corresponding dictionaries.

     Args:
         incident_entries_dict (Dict[str, List[Dict[str, Any]]]): list of dictionaries representing raw incident entries
-        last_fetch_dict (Dict[str,str]): last fetch dictionary
-        last_id_dict (Dict[str,str]): last id dictionary
+        last_fetch_dict (Dict[str, str]): last fetch dictionary
+        last_id_dict (Dict[str, Dict]): last id dictionary

     Returns:
         Dict[str,Any]: parsed context incident dictionary
@@ -13337,14 +13387,14 @@ def get_parsed_incident_entries(incident_entries_dict: Dict[str, List[Dict[str,


 def fetch_incidents(last_run: dict, first_fetch: str, queries_dict: Optional[Dict[str, str]],
-                    max_fetch: int) -> Tuple[Dict[str, str], Dict[str,str], List[Dict[str, list]]]:
+                    max_fetch_dict: Dict) -> Tuple[Dict[str, str], Dict[str,str], List[Dict[str, list]]]:
     """run one cycle of fetch incidents.

     Args:
         last_run (Dict): contains last run information
         first_fetch (str): first time to fetch from (First fetch timestamp parameter)
         queries_dict (Optional[Dict[str, str]]): queries per log type dictionary
-        max_fetch (int): max incidents per fetch parameter
+        max_fetch_dict (Dict): max incidents per fetch parameter per log type dictionary

     Returns:
         (Dict[str, str], Dict[str,str], List[Dict[str, list]]): last fetch per log type dictionary, last unique id per log type dictionary, parsed incidents tuple
@@ -13357,10 +13407,14 @@ def fetch_incidents(last_run: dict, first_fetch: str, queries_dict: Optional[Dic
     fetch_start_datetime_dict = get_fetch_start_datetime_dict(last_fetch_dict, first_fetch, queries_dict)
     demisto.debug(f'updated last fetch per log type: {fetch_start_datetime_dict=}.')

-    incident_entries_dict = fetch_incidents_request(queries_dict, max_fetch, fetch_start_datetime_dict, last_id_dict)
-    demisto.debug('raw incident entries fetching has completed.')
-
-    parsed_incident_entries_dict = get_parsed_incident_entries(incident_entries_dict, last_fetch_dict, last_id_dict)
+    incident_entries_dict = fetch_incidents_request(queries_dict, max_fetch_dict, fetch_start_datetime_dict)
+    demisto.debug('raw incident entries fetching has completed.')
+
+    # remove duplicated incidents from incident_entries_dict
+    unique_incident_entries_dict = remove_duplicate_entries(entries_dict=incident_entries_dict, id_dict=last_id_dict)
+
+    parsed_incident_entries_dict = get_parsed_incident_entries(unique_incident_entries_dict, last_fetch_dict, last_id_dict)

     # flatten incident_entries_dict to a single list of dictionaries representing context entries
     parsed_incident_entries_list = [incident for incident_list in parsed_incident_entries_dict.values()
@@ -13407,12 +13461,16 @@ def main(): # pragma: no cover
     elif command == 'fetch-incidents':
         last_run = demisto.getLastRun()
         first_fetch = params.get('first_fetch') or FETCH_DEFAULT_TIME
-        max_fetch = arg_to_number(params.get('max_fetch')) or MAX_INCIDENTS_TO_FETCH
+        configured_max_fetch = arg_to_number(params.get('max_fetch')) or MAX_INCIDENTS_TO_FETCH
         queries_dict = log_types_queries_to_dict(params)

-        last_fetch_dict, last_id_dict, incident_entries_list = fetch_incidents(last_run, first_fetch, queries_dict, max_fetch)
-
-        demisto.setLastRun({'last_fetch_dict': last_fetch_dict, 'last_id_dict': last_id_dict})
+        max_fetch_dict = last_run.get('max_fetch_dict') or create_max_fetch_dict(queries_dict=queries_dict, configured_max_fetch=configured_max_fetch)
+
+        last_fetch_dict, last_id_dict, incident_entries_list = fetch_incidents(last_run, first_fetch, queries_dict, max_fetch_dict)
+        next_max_fetch_dict = update_max_fetch_dict(configured_max_fetch=configured_max_fetch,
+                                                    max_fetch_dict=max_fetch_dict,
+                                                    last_fetch_dict=last_fetch_dict)
+
+        demisto.setLastRun({'last_fetch_dict': last_fetch_dict, 'last_id_dict': last_id_dict, 'max_fetch_dict': next_max_fetch_dict})
         demisto.incidents(incident_entries_list)

     elif command == 'panorama' or command == 'pan-os':
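Net effect on persisted state: the last-run object now tracks the largest seqno per device (rather than a single id per log type) and a per-log-type fetch limit. Illustrative values only:

```python
# Shape of what demisto.setLastRun() stores after this change (made-up values):
last_run = {
    'last_fetch_dict': {'Traffic': '2023/05/10 10:00:00'},       # newest log timestamp per log type
    'last_id_dict': {'Traffic': {'fw-1': '205', 'fw-2': '17'}},  # largest seqno per device, per log type
    'max_fetch_dict': {'Traffic': 200},                          # current page size per log type
}
```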
18 changes: 9 additions & 9 deletions Packs/PAN-OS/Integrations/Panorama/Panorama.yml
@@ -154,56 +154,56 @@ configuration:
   advanced: true
 - display: Traffic Log Type Query
   name: traffic_query
-  additionalinfo: "Traffic Log Type query example: (addr.src in {source}) and (addr.dst in {destination}) and (action eq {action})"
+  additionalinfo: "Traffic Log Type query example: (addr.src in {source}) and (addr.dst in {destination}) and (action eq {action}).\nIn the case of multiple devices, it is recommended to narrow the query to a specific device for the sake of speed.\nFor example: (device_name eq dummy_device)"
   required: false
   type: 12
   section: Collect
   advanced: true
 - display: Threat Log Type Query
   name: threat_query
-  additionalinfo: "Threat Log Type query example: (severity geq high)"
+  additionalinfo: "Threat Log Type query example: (severity geq high).\nIn the case of multiple devices, it is recommended to narrow the query to a specific device for the sake of speed.\nFor example: (device_name eq dummy_device)"
   required: false
   type: 12
   section: Collect
   advanced: true
 - display: URL Log Type Query
   name: url_query
-  additionalinfo: "URL Log Type query example: ((action eq block-override) or (action eq block-url)) and (severity geq high)"
+  additionalinfo: "URL Log Type query example: ((action eq block-override) or (action eq block-url)) and (severity geq high).\nIn the case of multiple devices, it is recommended to narrow the query to a specific device for the sake of speed.\nFor example: (device_name eq dummy_device)"
   required: false
   type: 12
   section: Collect
   advanced: true
 - display: Data Log Type Query
   name: data_query
-  additionalinfo: "Data Log Type query example: ((action eq alert) or (action eq wildfire-upload-success) or (action eq forward)) and (severity geq high)"
+  additionalinfo: "Data Log Type query example: ((action eq alert) or (action eq wildfire-upload-success) or (action eq forward)) and (severity geq high).\nIn the case of multiple devices, it is recommended to narrow the query to a specific device for the sake of speed.\nFor example: (device_name eq dummy_device)"
   required: false
   type: 12
   section: Collect
   advanced: true
 - display: Correlation Log Type Query
   name: correlation_query
-  additionalinfo: "Correlation Log Type query example: (hostid eq {host_id}) and (match_time in {last_x_time}) and (objectname eq {object_name}) and (severity geq '{severity}'') and (src in {source_address})"
+  additionalinfo: "Correlation Log Type query example: (hostid eq {host_id}) and (match_time in {last_x_time}) and (objectname eq {object_name}) and (severity geq '{severity}'') and (src in {source_address}).\nIn the case of multiple devices, it is recommended to narrow the query to a specific device for the sake of speed.\nFor example: (device_name eq dummy_device)"
   required: false
   type: 12
   section: Collect
   advanced: true
 - display: System Log Type Query
   name: system_query
-  additionalinfo: "System Log Type query example: (subtype eq {sub_type}) and (severity geq {severity})"
+  additionalinfo: "System Log Type query example: (subtype eq {sub_type}) and (severity geq {severity}).\nIn the case of multiple devices, it is recommended to narrow the query to a specific device for the sake of speed.\nFor example: (device_name eq dummy_device)"
   required: false
   type: 12
   section: Collect
   advanced: true
 - display: Wildfire Submission Log Type Query
   name: wildfire_query
-  additionalinfo: "Wildfire Submission Log Type query example: ((action eq wildfire-upload-fail) or (action eq wildfire-upload-skip) or (action eq sinkhole))"
+  additionalinfo: "Wildfire Submission Log Type query example: ((action eq wildfire-upload-fail) or (action eq wildfire-upload-skip) or (action eq sinkhole)).\nIn the case of multiple devices, it is recommended to narrow the query to a specific device for the sake of speed.\nFor example: (device_name eq dummy_device)"
   required: false
   type: 12
   section: Collect
   advanced: true
 - display: Decryption Log Type Query
   name: decryption_query
-  additionalinfo: "Decryption Log Type query example: (app eq {application}) and (policy_name geq {policy_name}) and ((src in {source}) or (dst in {destination}))"
+  additionalinfo: "Decryption Log Type query example: (app eq {application}) and (policy_name geq {policy_name}) and ((src in {source}) or (dst in {destination})).\nIn the case of multiple devices, it is recommended to narrow the query to a specific device for the sake of speed.\nFor example: (device_name eq dummy_device)"
   required: false
   type: 12
   section: Collect
@@ -9559,7 +9559,7 @@ script:
     description: Deletes an application-group
     name: pan-os-delete-application-group
     outputs: []
-  dockerimage: demisto/pan-os-python:1.0.0.56449
+  dockerimage: demisto/pan-os-python:1.0.0.57650
   feed: false
   isfetch: true
   longRunning: false
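For context on the query-narrowing note above: the integration appends its own time filter to whatever the user configures, so a device-narrowed query ends up looking roughly like the string below. The exact appended format comes from add_time_filter_to_query_parameter, which is only partially visible in this diff, so treat it as an assumption:

```python
# Illustrative only; the appended time filter format is assumed, not shown in full in this PR.
user_query = "(addr.src in 192.0.2.1) and (device_name eq dummy_device)"
time_filter = " and (time_generated geq '2023/05/10 10:00:00')"
print(user_query + time_filter)
# (addr.src in 192.0.2.1) and (device_name eq dummy_device) and (time_generated geq '2023/05/10 10:00:00')
```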