This repository has been archived by the owner on May 28, 2021. It is now read-only.

Commit 29f1b00

Work on code based on Code Audit, fixes #290, #493, #494, #446, #449, #447
Signed-off-by: Valentin Kuznetsov <vkuznet@gmail.com>


git-svn-id: svn+ssh://svn.cern.ch/reps/CMSDMWM/DAS/trunk@10584 4525493e-7705-40b1-a816-d608a930855b
valya committed Oct 6, 2010
1 parent a117491 commit 29f1b00
Showing 52 changed files with 375 additions and 662 deletions.
4 changes: 2 additions & 2 deletions bin/das_admin
@@ -1,7 +1,7 @@
-#!/usr/bin/env bash
+#!/bin/sh
 
 # setup environment
 #. `dirname $0`/setup.sh
 
 # run actual script
-$DAS_ROOT/src/python/DAS/tools/das_admin.py "$@"
+$DAS_ROOT/src/python/DAS/tools/das_admin.py ${1+"$@"}
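The wrapper scripts switch from "$@" to ${1+"$@"}: on some historical Bourne shells, "$@" expands to a single empty argument when a script is invoked with no arguments, whereas ${1+"$@"} expands to nothing at all in that case. A minimal sketch of the idiom (a hypothetical wrapper, not part of this commit):

    #!/bin/sh
    # Forward whatever arguments were given to the wrapped command.
    # ${1+"$@"} expands to "$@" only when at least one positional parameter
    # is set, so legacy /bin/sh implementations do not pass a spurious
    # empty argument when the wrapper is called with none.
    exec /bin/echo ${1+"$@"}

Modern POSIX shells already handle an empty "$@" correctly, so the change mainly protects these wrappers on older /bin/sh implementations.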
4 changes: 2 additions & 2 deletions bin/das_analytics
@@ -1,3 +1,3 @@
-#!/usr/bin/env bash
+#!/bin/sh
 
-$DAS_ROOT/src/python/DAS/analytics/analytics_controller.py "$@"
+$DAS_ROOT/src/python/DAS/analytics/analytics_controller.py ${1+"$@"}
4 changes: 2 additions & 2 deletions bin/das_bench
@@ -1,7 +1,7 @@
-#!/usr/bin/env bash
+#!/bin/sh
 
 # setup environment
 #. `dirname $0`/setup.sh
 
 # run actual script
-$DAS_ROOT/src/python/DAS/tools/das_bench.py "$@"
+$DAS_ROOT/src/python/DAS/tools/das_bench.py ${1+"$@"}
4 changes: 2 additions & 2 deletions bin/das_cacheclient
@@ -1,7 +1,7 @@
-#!/usr/bin/env bash
+#!/bin/sh
 
 # setup environment
 #. `dirname $0`/setup.sh
 
 # run actual script
-$DAS_ROOT/src/python/DAS/tools/das_cache_client.py "$@"
+$DAS_ROOT/src/python/DAS/tools/das_cache_client.py ${1+"$@"}
7 changes: 0 additions & 7 deletions bin/das_cleaner

This file was deleted.

4 changes: 2 additions & 2 deletions bin/das_cli
@@ -1,7 +1,7 @@
-#!/usr/bin/env bash
+#!/bin/sh
 
 # setup environment
 #. `dirname $0`/setup.sh
 
 # run actual script
-$DAS_ROOT/src/python/DAS/tools/das_cli.py "$@"
+$DAS_ROOT/src/python/DAS/tools/das_cli.py ${1+"$@"}
51 changes: 0 additions & 51 deletions bin/das_code_quality.sh

This file was deleted.

7 changes: 0 additions & 7 deletions bin/das_couchdb

This file was deleted.

4 changes: 0 additions & 4 deletions bin/das_doc.sh

This file was deleted.

7 changes: 0 additions & 7 deletions bin/das_file

This file was deleted.

8 changes: 6 additions & 2 deletions bin/das_map
@@ -1,11 +1,15 @@
-#!/usr/bin/env bash
+#!/bin/sh
 
 # usage: das_map
 
 # setup environment
 #. `dirname $0`/setup.sh
 
-dir=$DAS_ROOT/src/python/DAS/services/maps
+if [ `hostname -d` == "cern.ch" ]
+dir=/data/projects/das/config/maps
+else
+dir=$DAS_ROOT/src/python/DAS/services/maps
+fi
 cmd=$DAS_ROOT/src/python/DAS/tools/das_mapping_db.py
 
 # run actual script
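As displayed, the new hostname branch has no `then` and compares with ==, which inside [ ] is a bash extension rather than POSIX sh. A strictly POSIX rendering of the same map-directory selection would look roughly like this (a sketch, not the committed script):

    #!/bin/sh
    # CERN production hosts read the service maps from the shared /data area;
    # everywhere else fall back to the source tree under $DAS_ROOT.
    if [ "`hostname -d`" = "cern.ch" ]; then
        dir=/data/projects/das/config/maps
    else
        dir=$DAS_ROOT/src/python/DAS/services/maps
    fi
    echo "using maps from $dir"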
4 changes: 2 additions & 2 deletions bin/das_map_admin
@@ -1,7 +1,7 @@
-#!/usr/bin/env bash
+#!/bin/sh
 
 # setup environment
 #. `dirname $0`/setup.sh
 
 cmd=$DAS_ROOT/src/python/DAS/tools/das_mapping_db.py
-$cmd "$@"
+$cmd ${1+"$@"}
4 changes: 2 additions & 2 deletions bin/das_mapreduce
@@ -1,7 +1,7 @@
-#!/usr/bin/env bash
+#!/bin/sh
 
 # setup environment
 #. `dirname $0`/setup.sh
 
 # run actual script
-$DAS_ROOT/src/python/DAS/tools/das_map_reduce.py "$@"
+$DAS_ROOT/src/python/DAS/tools/das_map_reduce.py ${1+"$@"}
4 changes: 2 additions & 2 deletions bin/das_queryspammer
@@ -1,4 +1,4 @@
-#!/usr/bin/env bash
+#!/bin/sh
 
 # run actual script
-$DAS_ROOT/src/python/DAS/tools/queryspammer/queryspammer.py "$@"
+$DAS_ROOT/src/python/DAS/tools/queryspammer/queryspammer.py ${1+"$@"}
2 changes: 1 addition & 1 deletion bin/das_server
@@ -1,4 +1,4 @@
-#!/usr/bin/env bash
+#!/bin/bash
 
 # setup environment
 #. `dirname $0`/setup.sh
27 changes: 0 additions & 27 deletions bin/das_test.sh

This file was deleted.

10 changes: 0 additions & 10 deletions bin/dassh

This file was deleted.

4 changes: 2 additions & 2 deletions bin/pdump
@@ -1,7 +1,7 @@
-#!/usr/bin/env bash
+#!/bin/sh
 
 # setup environment
 #. `dirname $0`/setup.sh
 
 # run actual script
-$DAS_ROOT/src/python/DAS/tools/read_profile.py "$@"
+$DAS_ROOT/src/python/DAS/tools/read_profile.py ${1+"$@"}
24 changes: 0 additions & 24 deletions bin/test_cachesrv.py

This file was deleted.

30 changes: 30 additions & 0 deletions doc/sphinx/changelog.rst
@@ -7,6 +7,36 @@ The most significant part of this release is new plug-and-play mechanism
 to add new data-services. This is done via data-service map creation. Each
 map is represented data-service URI (URL, input parameters, API, etc.).
 
+- 0.5.3
+
+  - Clean up %post and do not package docs there
+  - All names in bin are adjusted to one schema: das_<task>.
+  - All scripts in bin are changed to use /bin/sh or /bin/bash
+    and use ${1+"$@"} instead of "$@"
+  - The bin area has been cleaned up, e.g. das_doc and dassh are removed, etc.
+  - Remove runsum_keys in runsum_service.py since it is obsolete code
+  - Fix issue w/ root.close() for runsum_service.py (parser function)
+  - Remove session from plotfairy
+  - Remove encode4admin
+  - Add urllib.quote(param) for das_services.tmpl and das_tables.tmpl
+  - fix #446
+  - das_jsontable.tmpl is removed since it is obsolete and no one is using it
+  - Remove das_help.tmpl and /das/help since they are obsolete
+  - Remove das_admin.py since it is obsolete
+  - Reviewed decorators in web/tools.py and commented out the unused ones, exposexml
+    and exposeplist; keep them around until they become relevant for DAS long term.
+  - Fix issue with wrap2das methods and made them internal
+  - Add checkargs decorator to validate input parameters for das_web
+  - Change socket_queue_size to 100
+  - Set engine.autoreload_on=False, request.show_tracebacks=False.
+    Verified that the server runs in production mode by default.
+  - Add parameter validation for das_web/das_expert
+  - fix #493, allow relocation of PLY parsertab.py
+  - fix #494, use HTTP Expires when data-services provide it
+  - change eval(x) into eval(x, { "__builtins__": None }, {}) for those cases
+    when json.load(x) fails (a sketch follows this diff). Some data-services
+    are not fully compliant and the issue needs to be resolved on their end.
+
 - 0.5.0 till 0.5.2
 
   - based on Gordon series of patches the following changes has been
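The guarded eval in the last 0.5.3 item is a fallback for data-services whose payloads are not quite valid JSON (single-quoted keys and the like); supplying an empty __builtins__ and empty locals keeps the evaluated string from reaching any built-in functions. A minimal sketch of the pattern, with a hypothetical helper name (this is not the exact DAS code):

    import json

    def loads_with_fallback(payload):
        """Parse a JSON payload, falling back to a restricted eval for sloppy services."""
        try:
            return json.loads(payload)
        except ValueError:
            # With no builtins and no locals the payload can only construct
            # literals (dicts, lists, strings, numbers); it cannot call functions.
            return eval(payload, {"__builtins__": None}, {})

    # e.g. a service that (incorrectly) quotes strings with single quotes:
    print(loads_with_fallback("{'dataset': '/a/b/c', 'nevents': 10}"))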
9 changes: 4 additions & 5 deletions etc/das.cfg
@@ -1,26 +1,24 @@
[cache_server]
thread_pool = 30
socket_queue_size = 15
socket_queue_size = 100
loglevel = 0
n_worker_threads = 4
host = 0.0.0.0
log_screen = True
queue_limit = 100
logfile = /tmp/das_cache.log
port = 8211
pid = /tmp/das_cache_server.pid

[web_server]
thread_pool = 30
socket_queue_size = 15
socket_queue_size = 100
cache_server_url = http://localhost:8211
loglevel = 0
host = 0.0.0.0
log_screen = True
url_base = /das
logfile = /tmp/das_web.log
port = 8212
pid = /tmp/das_web_server.pid

[mongodb]
dbport = 27017
@@ -39,10 +37,11 @@ attempt = 3
dbname = analytics

[das]
parserdir = /tmp
services = ['ip_service', 'google_maps', 'postalcode']
logformat = %(asctime)s - %(name)s - %(levelname)s - %(message)s
logfile = /tmp/das.log
verbose = 0
services = dbs,phedex,dashboard,monitor,runsum,sitedb,tier0,google_maps,postalcode,ip_service,xwho

[mapping_db]
dbport = 27017
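The [das] section now carries parserdir (where PLY may write its parser tables) and a comma-separated services string instead of a Python list literal. A sketch of reading those options with the stdlib ConfigParser of that era (read_das_section is a hypothetical helper, not the actual DAS config reader):

    from ConfigParser import SafeConfigParser  # Python 2 stdlib, current at the time

    def read_das_section(cfgfile='etc/das.cfg'):
        """Return (parserdir, services) from the [das] section of a DAS-style config."""
        cfg = SafeConfigParser()
        cfg.read(cfgfile)
        # parserdir tells PLY where to place its generated parsertab.py (fix #493)
        parserdir = cfg.get('das', 'parserdir')
        # services switched from a Python list literal to a comma-separated string
        services = [s.strip() for s in cfg.get('das', 'services').split(',')]
        return parserdir, services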
3 changes: 2 additions & 1 deletion src/python/DAS/core/das_mongocache.py
@@ -18,8 +18,9 @@
 import re
 import time
 import types
-import itertools
 import random
+import itertools
+import traceback
 
 # DAS modules
 from DAS.utils.utils import genkey, convert_dot_notation, aggregator
3 changes: 2 additions & 1 deletion src/python/DAS/core/das_parser.py
@@ -282,7 +282,8 @@ def __init__(self, config):
         for val in self.daskeysmap.values():
             for item in val:
                 self.daskeys.append(item)
-        self.dasply = DASPLY(self.daskeys, self.dasservices,
+        parserdir = config['das']['parserdir']
+        self.dasply = DASPLY(parserdir, self.daskeys, self.dasservices,
                 verbose=self.verbose)
 
     def parse(self, query, add_to_analytics=True):
10 changes: 3 additions & 7 deletions src/python/DAS/core/das_ply.py
@@ -33,7 +33,7 @@ class DASPLY(object):
     """
     DAS QL parser based on PLY lexer/parser.
     """
-    def __init__(self, daskeys, dassystems,
+    def __init__(self, parserdir, daskeys, dassystems,
                  operators=None, filters=None,
                  aggregators=None, verbose=0):
         self.daskeys = daskeys
@@ -42,13 +42,9 @@ def __init__(self, daskeys, dassystems,
         self.parser = None # defined at run time using self.build()
 
         self.dassystems = dassystems
-        if not os.environ.has_key('DAS_ROOT'):
-            msg = 'Unable to locate DAS_ROOT environment'
-            raise Exception(msg)
-        self.parsertab_dir = os.path.join(os.environ['DAS_ROOT'], \
-                'src/python/parser')
+        self.parsertab_dir = parserdir
         if not os.path.isdir(self.parsertab_dir):
-            msg = 'Directory $DAS_ROOT/src/python/parser does not exists'
+            msg = 'Directory %s does not exists' % self.parsertab_dir
             raise Exception(msg)
 
         # test if we have been given a list of desired operators/filters
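Threading parserdir through to DASPLY is what allows parsertab.py to be relocated (fix #493). PLY itself takes the target directory via the outputdir argument of yacc.yacc(); the sketch below shows that mechanism with a toy one-token grammar (assuming the ply package is installed; DASPLY's real build() is more involved):

    import ply.lex as lex
    import ply.yacc as yacc

    tokens = ('WORD',)      # toy token set, just enough for yacc to produce tables
    t_WORD = r'\w+'
    t_ignore = ' '

    def t_error(tok):
        tok.lexer.skip(1)

    def p_query(p):
        'query : WORD'
        p[0] = p[1]

    def p_error(p):
        pass

    def build(parserdir='/tmp', verbose=0):
        """Build the lexer/parser, writing parsertab.py into parserdir."""
        lexer = lex.lex()
        # outputdir tells PLY where to drop parsertab.py (and parser.out in debug mode)
        parser = yacc.yacc(debug=verbose, outputdir=parserdir)
        return lexer, parser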
7 changes: 6 additions & 1 deletion src/python/DAS/services/abstract_service.py
@@ -378,7 +378,7 @@ def parser(self, query, dformat, data, api):
                 counter += 1
                 yield row
         elif dformat.lower() == 'json' or dformat.lower() == 'dasjson':
-            gen = json_parser(data)
+            gen = json_parser(data, self.logger)
             das_dict = {}
             for row in gen:
                 if dformat.lower() == 'dasjson':
@@ -517,6 +517,11 @@ def api(self, query):
                     api, args, expire)
         headers = make_headers(dformat)
         data = self.getdata(url, args, headers)
+        try: # get HTTP header and look for Expires
+            e_time = data.info().__dict__['dict']['expires']
+            expire = expire_timestamp(e_time)
+        except:
+            pass
         rawrows = self.parser(query, dformat, data, api)
         dasrows = self.translator(api, rawrows)
         dasrows = self.set_misses(query, api, dasrows)
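The new try/except block implements fix #494: when a data-service sends an HTTP Expires header, its value overrides the computed cache lifetime. A rough sketch of turning such a header into an absolute timestamp, assuming the Python 2-era urllib2 response object and the stdlib date parser (DAS's expire_timestamp helper presumably does something similar):

    import time
    import email.utils

    def expires_to_timestamp(response, default_expire=3600):
        """Return an absolute expiration time (epoch seconds) for an HTTP response."""
        header = response.info().getheader('Expires')    # None if the header is absent
        if header:
            parsed = email.utils.parsedate_tz(header)    # RFC 1123 date, with timezone
            if parsed:
                return email.utils.mktime_tz(parsed)     # -> seconds since the epoch
        return time.time() + default_expire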
(the remaining changed files in this commit are not shown)

0 comments on commit 29f1b00
