Commit: Merge pull request #4 from dbarrosop/revamp

rewritten from scratch. Check docs

Showing 179 changed files with 13,388 additions and 1,968 deletions.
@@ -1,4 +1,4 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
@@ -1,20 +1,40 @@
SDN Internet Router (sir)
=========================

The SDN Internet Router, abbreviated SIR, is an agent that you can add to your router. The agent exposes information that your router can't expose by itself, like the BGP table, traffic per BGP prefix or traffic per ASN. This data is available both via a Web UI and an API.

The agent is vendor agnostic, as it gathers data using both BGP and netflow/sflow/ipfix. This means it can be attached to any router or switch that supports those protocols.

Features
========

The agent exposes a Web UI and an API that allow you to do things like:

* Retrieve top ASNs based on bandwidth usage.
* Retrieve top prefixes based on bandwidth usage.
* Simulate what would happen if you had only the top N prefixes in your FIB instead of the full routing table.
* Store and retrieve arbitrary data.
* Get raw BGP data from your router.
* Get raw flow data from your router.
* Find all the prefixes that traverse or originate in a particular ASN.
* Check all the prefixes in the router that allow you to reach certain prefixes or IPs.

You can read the full list of features in the following [link](http://sdn-internet-router-sir.readthedocs.org/en/latest/features/index.html).

Applications
============

This agent gives you visibility into your network. You can use this data to better choose your network equipment, do traffic engineering, capacity planning, peering decisions... anything you want. You can see some use cases in the following [link](http://sdn-internet-router-sir.readthedocs.org/en/latest/use_cases/index.html).

Here is a list of tools that leverage SIR:

* [pySIR](https://github.com/dbarrosop/pySIR) - A Python library that helps you interact with the API. It implements all the calls the API supports, so you can write code without dealing with raw requests or error handling.
* [sir_tools](https://github.com/dbarrosop/sir_tools) - A collection of tools that take advantage of the SIR API to perform several operations.

Documentation
=============

You can find the documentation on [Read the Docs](http://sdn-internet-router-sir.readthedocs.org/en/latest/).
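As a quick illustration of talking to the API described above, here is a minimal sketch of building a `top_prefixes` query URL. The endpoint path and parameters come from the code in this commit; the host and port (`127.0.0.1:5000`) are illustrative assumptions, not fixed by the project.

```python
try:
    from urllib.parse import urlencode  # Python 3
except ImportError:
    from urllib import urlencode        # Python 2

# Assumed base URL; the agent's actual host/port depend on your deployment.
BASE_URL = 'http://127.0.0.1:5000/api/v1.0'


def top_prefixes_url(start_time, end_time, limit_prefixes=10):
    # Build the query string for the top_prefixes API call.
    params = urlencode({
        'start_time': start_time,
        'end_time': end_time,
        'limit_prefixes': limit_prefixes,
    })
    return '{0}/top_prefixes?{1}'.format(BASE_URL, params)


print(top_prefixes_url('2015-07-13T14:00', '2015-07-14T14:00'))
```

You could then fetch that URL with `curl` or any HTTP client; the pySIR library mentioned above wraps these calls for you.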
File renamed without changes.
@@ -0,0 +1,69 @@
import helpers.api
from flask import g


def top_prefixes(request):
    # Example: curl http://127.0.0.1:5000/api/v1.0/top_prefixes\?limit_prefixes=10\&start_time\=2015-07-13T14:00\&end_time\=2015-07-14T14:00
    db = getattr(g, 'db')
    start_time = request.args.get('start_time')
    end_time = request.args.get('end_time')
    limit_prefixes = int(request.args.get('limit_prefixes', 0))
    net_masks = request.args.get('net_masks', '')
    exclude_net_masks = request.args.get('exclude_net_masks', False)

    result = db.aggregate_per_prefix(
        start_time, end_time,
        limit=limit_prefixes,
        net_masks=net_masks,
        exclude_net_masks=exclude_net_masks)

    parameters = {
        'limit_prefixes': limit_prefixes,
        'start_time': start_time,
        'end_time': end_time,
        'net_masks': net_masks,
        'exclude_net_masks': exclude_net_masks,
    }
    return helpers.api.build_api_response(result, error=False, **parameters)


def top_asns(request):
    # Example: curl http://127.0.0.1:5000/api/v1.0/top_asns\?start_time=2015-07-13T14:00\&end_time=2015-07-14T14:00
    db = getattr(g, 'db')
    start_time = request.args.get('start_time')
    end_time = request.args.get('end_time')

    result = db.aggregate_per_as(start_time, end_time)
    parameters = {
        'start_time': start_time,
        'end_time': end_time,
    }
    return helpers.api.build_api_response(result, error=False, **parameters)


def find_prefix(request, prefix):
    # Looks up ``prefix`` in the stored BGP data for the given date.
    fs = getattr(g, 'fs')
    date = request.args.get('date')
    result = fs.find_prefix(prefix, date)
    parameters = {
        'prefix': prefix,
        'date': date,
    }
    return helpers.api.build_api_response(result, error=False, **parameters)


def find_prefixes_asn(request, asn):
    # Finds the prefixes that traverse or originate in ``asn`` for the given date.
    fs = getattr(g, 'fs')
    date = request.args.get('date')
    origin_only = request.args.get('origin_only', False)

    result = fs.find_prefixes_asn(asn, date, origin_only)
    parameters = {
        'asn': asn,
        'date': date,
        'origin_only': origin_only,
    }
    return helpers.api.build_api_response(result, error=False, **parameters)
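Every view above funnels its result through `helpers.api.build_api_response`, which is imported but not part of this diff. A plausible sketch of what such a helper could look like follows; this is an assumption for illustration, not the project's actual implementation.

```python
def build_api_response(result, error=False, **parameters):
    # Hypothetical sketch: wrap the query result together with the
    # parameters that produced it, so API consumers can see exactly
    # what was asked. The real helper lives in helpers/api.py.
    return {
        'meta': {
            'error': error,
            'parameters': parameters,
        },
        'result': result,
    }


response = build_api_response([{'prefix': '10.0.0.0/8'}],
                              error=False, limit_prefixes=1)
```

Echoing the request parameters back in the response makes the API self-describing, which is handy when defaults (like `limit_prefixes=0`) were filled in server-side.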
@@ -0,0 +1,128 @@
from flask import render_template
from flask import g


def _init_context_dates(db, request):
    context = dict()
    dates = db.get_dates()

    starting_time = min(len(dates), 25)

    context['avail_start_time'] = dates[0].strftime('%Y-%m-%dT%H:%M')
    context['avail_end_time'] = dates[-1].strftime('%Y-%m-%dT%H:%M')
    context['start_time'] = request.form.get('start_time', dates[-starting_time].strftime('%Y-%m-%dT%H:%M'))
    context['end_time'] = request.form.get('end_time', context['avail_end_time'])
    return context


def start_page(request):
    return render_template('analytics/start_page.html')


def offloaded_traffic(request):
    db = getattr(g, 'db', None)
    context = _init_context_dates(db, request)

    context['num_prefixes'] = int(request.form.get('num_prefixes', 1000))

    if request.method == 'GET':
        context['total_bytes'] = 0
        context['offloaded_bytes'] = 0
        context['percentage'] = 0.0
    elif request.method == 'POST':
        context['total_bytes'] = db.get_total_traffic(context['start_time'], context['end_time'])
        context['offloaded_bytes'] = db.offloaded_bytes(context['num_prefixes'], context['start_time'], context['end_time'])
        # Guard against division by zero when there is no traffic in the range.
        if context['total_bytes']:
            context['percentage'] = float(context['offloaded_bytes']) * 100.0 / float(context['total_bytes'])
        else:
            context['percentage'] = 0.0

    return render_template('analytics/offloaded_traffic.html', **context)


def aggregate(request, field):
    db = getattr(g, 'db', None)
    context = _init_context_dates(db, request)

    context['flow_aggr'] = list()
    context['time_series'] = dict()
    context['time_series_times'] = list()

    if field == 'as':
        aggregate_method = db.aggregate_per_as
        timeseries_method = db.timeseries_per_as
        context['title'] = 'ASNs'
    elif field == 'prefix':
        aggregate_method = db.aggregate_per_prefix
        timeseries_method = db.timeseries_per_prefix
        context['title'] = 'Prefixes'

    if request.method == 'POST':
        context['time_series_times'] = db.get_dates_in_range(context['start_time'], context['end_time'])

        context['flow_aggr'] = aggregate_method(context['start_time'], context['end_time'])
        time_series = dict()
        for a in context['flow_aggr'][0:10]:
            time_series[a['key']] = timeseries_method(context['start_time'], context['end_time'], a['key'])

        context['time_series'] = time_series

    return render_template('analytics/analytics_aggregate.html', **context)


def simulate(request):
    db = getattr(g, 'db', None)
    context = _init_context_dates(db, request)

    context['num_prefixes'] = int(request.form.get('num_prefixes', 1000))
    context['time_series'] = dict()
    context['time_series_times'] = list()

    if request.method == 'POST':
        context['time_series_times'] = db.get_dates_in_range(context['start_time'], context['end_time'])

        time_series = dict()
        time_series['total_bytes'] = list()
        time_series['offloaded_bytes'] = list()

        for timestamp in context['time_series_times']:
            time_series['total_bytes'].append(db.get_total_traffic(timestamp, timestamp))
            time_series['offloaded_bytes'].append(db.offloaded_bytes(context['num_prefixes'], timestamp, timestamp))

        context['time_series'] = time_series

    return render_template('analytics/simulate.html', **context)


def find_prefix(request):
    fs = getattr(g, 'fs', None)
    context = dict()
    context['available_dates'] = [d.strftime('%Y-%m-%dT%H:%M:01') for d in fs.get_available_dates()]

    context['query_name'] = 'Prefix'
    if request.method == 'GET':
        context['query'] = ''
        context['prefixes'] = dict()
        context['date'] = context['available_dates'][-1]
    elif request.method == 'POST':
        context['date'] = request.form.get('date')
        context['query'] = request.form.get('query')
        context['prefixes'] = fs.find_prefix(context['query'], context['date'])
    return render_template('analytics/find_prefix.html', **context)


def find_prefix_asn(request):
    fs = getattr(g, 'fs', None)
    context = dict()
    context['available_dates'] = [d.strftime('%Y-%m-%dT%H:%M:01') for d in fs.get_available_dates()]

    context['query_name'] = 'ASN'
    if request.method == 'GET':
        context['query'] = ''
        context['prefixes'] = dict()
        context['origin_only'] = True
        context['date'] = context['available_dates'][-1]
    elif request.method == 'POST':
        context['date'] = request.form.get('date')
        context['query'] = request.form.get('query')
        # Parse the form value explicitly instead of calling eval() on user input.
        context['origin_only'] = request.form.get('origin_only', 'True') == 'True'
        context['prefixes'] = fs.find_prefixes_asn(context['query'], context['date'], context['origin_only'])
    return render_template('analytics/find_prefix.html', **context)
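The `offloaded_traffic` and `simulate` views above both reduce to the same arithmetic: what share of total traffic the top N prefixes would carry. That calculation can be sketched as a standalone function (the function name here is illustrative, not part of the project):

```python
def offload_percentage(offloaded_bytes, total_bytes):
    # Percentage of total traffic covered by the offloaded prefixes;
    # returns 0.0 when there is no traffic, to avoid dividing by zero.
    if not total_bytes:
        return 0.0
    return float(offloaded_bytes) * 100.0 / float(total_bytes)


print(offload_percentage(750, 1000))  # -> 75.0
```

The explicit zero check matters in practice: on a fresh deployment, or for a time range with no flow samples, `get_total_traffic` can legitimately return 0.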
File renamed without changes.
@@ -0,0 +1,12 @@
from flask import render_template
from flask import g

import yaml


def start_page(request):
    context = dict()
    with open('api/api_documentation.yaml', 'r') as stream:
        # safe_load avoids executing arbitrary object tags from the YAML file.
        context['documentation'] = yaml.safe_load(stream)

    return render_template('api/start_page.html', **context)
This file was deleted.