Version 3.0.0 (final)
The n6 Development Team authored and zuo committed Dec 1, 2021
1 parent 9ce94c1 commit 0e63269
Showing 22 changed files with 1,159 additions and 91 deletions.
2 changes: 1 addition & 1 deletion .n6-version
@@ -1 +1 @@
3.0.0b3
3.0.0
40 changes: 36 additions & 4 deletions CHANGELOG.md
@@ -1,13 +1,45 @@
# Changelog

Starting with the 3.0.0 release, all notable changes applied to
the code of _n6_ will be continuously documented in this file.
Starting with the 3.0.0 release, all notable changes applied to the
[code of _n6_](https://github.com/CERT-Polska/n6) are continuously
documented here.

The format of this file is based, to much extent, on
[Keep a Changelog](https://keepachangelog.com/).


## 3.0.0b... (beta releases...) - since 2021-10-13...
## [3.0.0] - 2021-12-01

TBD in the description of the 3.0.0 final release (soon...).
**This release is a big milestone.** It includes, among others:

* migration to Python 3
* in the *n6* data pipeline infrastructure: optional integration
with [IntelMQ](https://github.com/certtools/intelmq)
* in the *n6 Portal:* a new frontend (implemented using
[React](https://reactjs.org/)), two-factor authentication
(based on [TOTP](https://datatracker.ietf.org/doc/html/rfc6238)),
user's/organization's own data management (including config update
and password reset forms, with related e-mail notices), and other
goodies...
* in the *n6 REST API:* API-key-based authentication
* and many, many more improvements, a bunch of fixes, as well as
some refactorization, removals and cleanups...
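The two-factor authentication mentioned above is based on TOTP (RFC 6238). As a rough illustration of the underlying algorithm (an editor's sketch using only the standard library, not *n6*'s actual implementation):

```python
import hmac
import struct
import time

def totp(secret, for_time=None, step=30, digits=6):
    """Compute an RFC 6238 TOTP code (HMAC-SHA1 variant); `secret` is raw bytes."""
    if for_time is None:
        for_time = int(time.time())
    counter = for_time // step              # number of elapsed time steps
    msg = struct.pack('>Q', counter)        # 8-byte big-endian counter
    digest = hmac.new(secret, msg, 'sha1').digest()
    offset = digest[-1] & 0x0f              # dynamic truncation (RFC 4226)
    code = struct.unpack('>I', digest[offset:offset + 4])[0] & 0x7fffffff
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238's test vectors use the ASCII secret '12345678901234567890';
# at T=59 seconds the 6-digit SHA-1 code is '287082'.
```

A real deployment would, of course, use a vetted library and constant-time comparison when verifying submitted codes.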

Beware that many of the changes are *not* backwards-compatible.

Note that most of the main elements of *n6* -- namely:
`N6DataPipeline`, `N6DataSources`, `N6Portal`, `N6RestApi`,
`N6AdminPanel`, `N6BrokerAuthApi`, `N6Lib` and `N6SDK` -- are now
*Python-3-only* (more precisely: are compatible with CPython 3.9).

The legacy, *Python-2-only* stuff -- most of which consists of
*collectors* and *parsers* (external-data-sources-related components) --
resides in `N6Core` and `N6CoreLib`; the collectors and parsers placed
in `N6Core`,
if related to non-obsolete external data sources, will be gradually
migrated to *Python-3-only* `N6DataSources` (so that, finally, we will
be able to get rid of `N6Core` and `N6CoreLib`). There are also
*Python-2-only* variants of `N6Lib` and `N6SDK`: `N6Lib-py2` and
`N6SDK-py2` (needed only as dependencies of `N6Core`/`N6CoreLib`).


[3.0.0]: https://github.com/CERT-Polska/n6/compare/v2.0.6a2-dev1...v3.0.0
2 changes: 1 addition & 1 deletion N6BrokerAuthApi/n6brokerauthapi/__init__.py
@@ -1,5 +1,5 @@
# Copyright (c) 2013-2021 NASK. All rights reserved.
#TODO: Module modernized to Python 3, but no changes detected, comment to be deleted after MR

"""
This package provides a REST API implementation intended to cooperate
with `rabbitmq-auth-backend-http` -- the RabbitMQ AMQP message broker's
16 changes: 9 additions & 7 deletions N6Core/README.md
@@ -1,8 +1,10 @@
**Note:** `N6Core` contains legacy *Python-2-only* stuff. Typically,
you will want to use -- instead of it -- the new, *Python-3-only* stuff
residing in `N6DataPipeline`.
**Note:** `N6Core` contains legacy *Python-2-only* stuff.

Then it comes to data sources -- i.e., collectors and parsers --
`N6DataSources` is the place where new sources should be implemented
(in Python 3). The collectors and parsers residing in `N6Core` will
be gradually migrated to `N6DataSources` (if not obsolete).
When it comes to the basic *n6* pipeline components, please use the new,
*Python-3-only* stuff residing in `N6DataPipeline`.

When it comes to the data-sources-related components -- i.e., collectors
and parsers -- `N6DataSources` is the place where any new stuff is to be
implemented (in Python 3). The collectors and parsers residing in
`N6Core` will be gradually migrated to `N6DataSources` (for those data
sources that are not obsolete).
58 changes: 29 additions & 29 deletions N6Core/n6/archiver/archive_raw.py
@@ -5,11 +5,11 @@
Component archive_raw -- adds raw data to the archive database (MongoDB).
A new source is added as a new collection.
"""
from __future__ import print_function #3--

import datetime
import hashlib
import itertools
import math
import os
import socket
import subprocess
@@ -20,10 +20,10 @@

import gridfs
import pymongo
from gridfs import GridFS
from bson.json_util import loads
from bson.json_util import dumps

from n6lib.common_helpers import open_file
from n6lib.config import Config
from n6.base.queue import QueuedBase, n6QueueProcessingException
from n6lib.log_helpers import get_logger, logging_configured
@@ -39,16 +39,16 @@


def backup_msg(fname, collection, msg, header):
with open(fname, 'w') as f:
if isinstance(msg, basestring):
payload = (msg.encode('utf-8') if isinstance(msg, unicode)
with open_file(fname, 'w') as f: #3: add 'b'-mode flag
if isinstance(msg, (bytes, unicode)): #3: unicode->str
payload = (msg.encode('utf-8') if isinstance(msg, unicode) #3: unicode->str
else msg)
else:
payload = (repr(msg).encode('utf-8') if isinstance(repr(msg), unicode)
payload = (repr(msg).encode('utf-8') if isinstance(repr(msg), unicode) #3: unicode->str
else repr(msg))

hdr = (repr(header).encode('utf-8') if isinstance(repr(header), unicode)
else repr(header))
hdr = (repr(header).encode('utf-8') if isinstance(repr(header), unicode) #3: unicode->str
else repr(header))
f.write('\n'.join(( collection, hdr, payload )))
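The `#3:` comments in this hunk are the team's migration notes (`N6Core` itself stays Python-2-only). Purely to illustrate what those notes point at, a fully ported Python 3 version of this helper might look like this (an editor's sketch, not code from the repository):

```python
def backup_msg(fname, collection, msg, header):
    # Binary mode replaces the implicit Py2 text/bytes mixing; every
    # non-bytes value is encoded to UTF-8 explicitly.
    if isinstance(msg, bytes):
        payload = msg
    elif isinstance(msg, str):
        payload = msg.encode('utf-8')
    else:
        payload = repr(msg).encode('utf-8')
    hdr = repr(header).encode('utf-8')      # repr() always returns str in Py3
    with open(fname, 'wb') as f:
        f.write(b'\n'.join((collection.encode('utf-8'), hdr, payload)))
```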


@@ -58,8 +58,8 @@ def timed(*args, **kw):
result = method(*args, **kw)
stop = datetime.datetime.now()
delta = stop - start
print '%r %r (%r, %r) %r ' % \
(str(datetime.datetime.now()), method.__name__, args, kw, str(delta))
print('%r %r (%r, %r) %r ' % \
(str(datetime.datetime.now()), method.__name__, args, kw, str(delta))) #3: __name__ -> __qualname__
return result
return timed
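The `#3:` note on this hunk suggests switching to `__qualname__`; an idiomatic Python 3 version of this timing decorator would also typically use `functools.wraps` so the wrapped function keeps its metadata (again, an editor's sketch, not the committed code):

```python
import datetime
import functools

def timing(method):
    # functools.wraps copies __name__, __qualname__, __doc__, etc. from
    # the wrapped function onto the wrapper.
    @functools.wraps(method)
    def timed(*args, **kw):
        start = datetime.datetime.now()
        result = method(*args, **kw)
        delta = datetime.datetime.now() - start
        print('%r %r (%r, %r) %r' % (
            str(datetime.datetime.now()), method.__qualname__,
            args, kw, str(delta)))
        return result
    return timed
```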

@@ -106,11 +106,11 @@ def __init__(self, connection, db_name, collection_name):
docs = collection.find().sort("ns", pymongo.ASCENDING)
for i in docs:
coll = Collection(i['ns'].replace(''.join((db_name, '.')), ''))
self.add_to_storage(coll, i['key'].keys()[0])
self.add_to_storage(coll, list(i['key'].keys())[0])

@staticmethod
def add_to_storage(collection, index):
if collection.name not in IndexesStore._collections_tmp_store.keys():
if collection.name not in list(IndexesStore._collections_tmp_store.keys()):
# new collection, add index, and initialize key in store dict
collection.indexes.append(index)
IndexesStore._collections_tmp_store.update({collection.name: collection})
@@ -122,7 +122,7 @@ def add_to_storage(collection, index):
def name_of_indexed_collection_n6():
# simple select a collection, no system and no tip chunks
# to check the amount of indexes
return [name for name in IndexesStore._collections_tmp_store.keys()
return [name for name in list(IndexesStore._collections_tmp_store.keys())
if ('.chunks' not in name) and name not in ('n6.system.namespaces')]

@staticmethod
@@ -573,7 +573,7 @@ def input_callback(self, routing_key, body, properties):
# Add to archive
if writing:
type_ = properties.type
payload = (body.encode('utf-8') if isinstance(body, unicode)
payload = (body.encode('utf-8') if isinstance(body, unicode) #3: unicode -> str
else body)

if type_ == 'stream':
@@ -591,7 +591,7 @@ def input_callback(self, routing_key, body, properties):
"Unknown message type: {0}, source: {1}".format(type_, routing_key))
#finally:
# self.__tf.append(time.time() - t0)
# if next(self.__count) % 5000 == 0:
# if next(self.__count) % 5000 == 0: #spr.
# try:
# LOGGER.critical('ARCHIVE-RAW INPUT CALLBACK TIMES: min %s, avg %s',
# min(tf),
@@ -686,14 +686,14 @@ def init_files(self):
self.list_tmp_files.append((self.tempfilefd_ver, self.tempfile_ver))

# save orig init file
with open(self.tempfile_file_init, 'w') as fid:
with open_file(self.tempfile_file_init, 'w') as fid:
LOGGER.debug('WTF: %r', type(self.payload))
fid.write(self.payload)
self.file_init = self.tempfile_file_init

for fd, fn in self.list_tmp_files:
os.close(fd)
os.chmod(fn, 0644)
os.chmod(fn, 0o644)
LOGGER.debug('run blacklist init tmp files')

@safe_mongocall
@@ -750,7 +750,7 @@ def get_patches(self):
"marker": self.marker_db_init}
).sort("received", pymongo.DESCENDING).limit(1)

row = cursor.next()
row = next(cursor)
date = row["received"]
first_file_id = row["_id"]

@@ -773,23 +773,23 @@ def save_diff_in_db(self, files):
Return: None
"""
file1, file2 = files
f_sout = open(self.tempfile_patch_u, "w")
f_sout = open_file(self.tempfile_patch_u, "w")
if BlackListCompacter.init:
BlackListCompacter.init = 0
subprocess.call("diff -u " + file1 + " " + file2,
stdout=f_sout, stderr=subprocess.STDOUT, shell=True)
f_sout.close()

self.save_file_in_db(self.marker_db_init,
open(self.tempfile_patch_u, 'r').read())
open_file(self.tempfile_patch_u, 'r').read())
LOGGER.debug(' marker init in db:%s ', self.marker_db_init)
else:
subprocess.call("diff -u " + file1 + " " +
file2, stdout=f_sout, stderr=subprocess.STDOUT, shell=True)
f_sout.close()

self.save_file_in_db(self.marker_db_diff,
open(self.tempfile_patch_u, 'r').read())
open_file(self.tempfile_patch_u, 'r').read())
LOGGER.debug('marker in period in db :%s ', self.marker_db_diff)
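The `diff -u` invocations in this hunk build a command line by string concatenation with `shell=True`. A pure-Python alternative that produces the same unified-diff format without spawning a shell (an editor's sketch, not part of this commit) is:

```python
import difflib

def write_unified_diff(file1, file2, out_path):
    # difflib.unified_diff yields the familiar ---/+++/@@ lines, so file
    # names containing spaces or shell metacharacters need no quoting.
    with open(file1) as f1, open(file2) as f2:
        old, new = f1.readlines(), f2.readlines()
    with open(out_path, 'w') as out:
        out.writelines(difflib.unified_diff(old, new,
                                            fromfile=file1, tofile=file2))
```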

def generate_orig_file(self, cursor, file_id):
@@ -807,7 +807,7 @@ def generate_orig_file(self, cursor, file_id):
# generate first file
files_count = 1
# stdout in file
f_sout = open(self.tempfile_patch_u, "w")
f_sout = open_file(self.tempfile_patch_u, "w")
# first diff file post init in GENERATE_ALL_FILE mode
if cursor.count() > 0 and BlackListCompacter.generate_all_file:
out = subprocess.call("patch " + self.tempfile_file + " -i " +
@@ -818,7 +818,7 @@ def generate_orig_file(self, cursor, file_id):
self.list_tmp_files.append((self.tempfile_ver,
self.tempfile_ver +
str(files_count - 1)))
os.chmod(self.tempfile_ver + str(files_count - 1), 0644)
os.chmod(self.tempfile_ver + str(files_count - 1), 0o644)

# # first diff file post init in GENERATE_ONE_FILE mode
elif cursor.count() > 0 and (not BlackListCompacter.generate_all_file):
@@ -830,7 +830,7 @@ def generate_orig_file(self, cursor, file_id):

else:
file_db = self.dbm.get_file_from_db_raw(file_id)
patch_file = open(self.tempfile_patch_tmp, 'w')
patch_file = open_file(self.tempfile_patch_tmp, 'w')
patch_file.write(file_db)
patch_file.close()

@@ -844,7 +844,7 @@ def generate_orig_file(self, cursor, file_id):
# set prev id in current doc.
self.prev_id = id_dba
file_db = self.dbm.get_file_from_db_raw(id_dba)
patch_file = open(self.tempfile_patch_tmp, 'w')
patch_file = open_file(self.tempfile_patch_tmp, 'w')
patch_file.write(file_db)
patch_file.close()

@@ -860,7 +860,7 @@ def generate_orig_file(self, cursor, file_id):
self.tempfile_patch_tmp + " -o " +
self.tempfile_ver + str(files_count),
stdout=f_sout, stderr=subprocess.STDOUT, shell=True)
os.chmod(self.tempfile_ver + str(files_count), 0644)
os.chmod(self.tempfile_ver + str(files_count), 0o644)
LOGGER.debug('patch_all_files(return code): %r', out)

else:
@@ -890,7 +890,7 @@ def start(self):

self.marker_db_diff = files_count
LOGGER.debug('files_count: %r', files_count)
except StopIteration, exc:
except StopIteration as exc:
# first file
LOGGER.warning('First file, initialize: %r', exc)
BlackListCompacter.init = 1
@@ -936,12 +936,12 @@ def main():
except socket.timeout as exc:
# at the moment need to capture sys.exit tool for monitoring
LOGGER.critical('socket.timeout: %r', exc)
print >> sys.stderr, exc
print(exc, file=sys.stderr)
sys.exit(1)
except socket.error as exc:
# at the moment need to capture sys.exit tool for monitoring
LOGGER.critical('socket.error: %r', exc)
print >> sys.stderr, exc
print(exc, file=sys.stderr)
sys.exit(1)


2 changes: 1 addition & 1 deletion N6Core/n6/data/conf/00_pipeline.conf
@@ -25,4 +25,4 @@ comparator = enriched
filter = enriched, compared
anonymizer = filtered
recorder = filtered
counter= recorded
counter = recorded
2 changes: 1 addition & 1 deletion N6DataPipeline/console_scripts
@@ -1,4 +1,4 @@
#n6archiveraw = n6datapipeline.archive_raw:main
n6archiveraw = n6datapipeline.archive_raw:main
n6aggregator = n6datapipeline.aggregator:main
n6enrich = n6datapipeline.enrich:main
n6comparator = n6datapipeline.comparator:main
