
Commit

updated maintainer to scrapinghub
pablohoffman committed May 2, 2012
1 parent 2681be5 commit abcac4f
Showing 7 changed files with 18 additions and 14 deletions.
3 changes: 2 additions & 1 deletion AUTHORS
@@ -1,7 +1,8 @@
Scrapy was brought to life by Shane Evans while hacking a scraping framework
prototype for Mydeco (mydeco.com). It soon became maintained, extended and
improved by Insophia (insophia.com), with the initial sponsorship of Mydeco to
-bootstrap the project.
+bootstrap the project. In mid-2011, Scrapinghub became the new official
+maintainer.

Here is the list of the primary authors & contributors:

2 changes: 1 addition & 1 deletion debian/changelog
@@ -2,4 +2,4 @@ scrapy-SUFFIX (0.11) unstable; urgency=low

* Initial release.

- -- Insophia Team <info@insophia.com> Thu, 10 Jun 2010 17:24:02 -0300
+ -- Scrapinghub Team <info@scrapinghub.com> Thu, 10 Jun 2010 17:24:02 -0300
2 changes: 1 addition & 1 deletion debian/control
@@ -1,7 +1,7 @@
Source: scrapy-SUFFIX
Section: python
Priority: optional
-Maintainer: Insophia Team <info@insophia.com>
+Maintainer: Scrapinghub Team <info@scrapinghub.com>
Build-Depends: debhelper (>= 7.0.50), python (>=2.6), python-twisted, python-w3lib, python-lxml
Standards-Version: 3.8.4
Homepage: http://scrapy.org/
6 changes: 3 additions & 3 deletions debian/copyright
@@ -1,10 +1,10 @@
-This package was debianized by the Insophia <info@insophia.com>.
+This package was debianized by the Scrapinghub team <info@scrapinghub.com>.

It was downloaded from http://scrapy.org

Upstream Author: Scrapy Developers

-Copyright: 2007-2010 Scrapy Developers
+Copyright: 2007-2012 Scrapy Developers

License: bsd

@@ -36,5 +36,5 @@ ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

-The Debian packaging is (C) 2010, Insophia <info@insophia.com> and
+The Debian packaging is (C) 2010-2012, Scrapinghub <info@scrapinghub.com> and
is licensed under the BSD, see `/usr/share/common-licenses/BSD'.
4 changes: 2 additions & 2 deletions docs/conf.py
@@ -42,7 +42,7 @@

# General information about the project.
project = u'Scrapy'
-copyright = u'2008-2011, Insophia'
+copyright = u'2008-2012, Scrapinghub'

# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
@@ -173,7 +173,7 @@
# (source start file, target name, title, author, document class [howto/manual]).
latex_documents = [
('index', 'Scrapy.tex', ur'Scrapy Documentation',
-ur'Insophia', 'manual'),
+ur'Scrapinghub', 'manual'),
]

# The name of an image file (relative to this directory) to place at the top of
4 changes: 2 additions & 2 deletions docs/topics/ubuntu.rst
@@ -6,7 +6,7 @@ Ubuntu packages

.. versionadded:: 0.10

-`Insophia`_ publishes apt-gettable packages which are generally fresher than
+`Scrapinghub`_ publishes apt-gettable packages which are generally fresher than
those in Ubuntu, and more stable too since they're continuously built from
`Github repo`_ (master & stable branches) and so they contain the latest bug
fixes.
@@ -54,5 +54,5 @@ keyring as follows::

curl -s http://archive.scrapy.org/ubuntu/archive.key | sudo apt-key add -

-.. _Insophia: http://insophia.com/
+.. _Scrapinghub: http://scrapinghub.com/
.. _Github repo: https://github.com/scrapy/scrapy
11 changes: 7 additions & 4 deletions extras/makedeb.py
@@ -1,4 +1,4 @@
-import sys, os, glob
+import sys, os, glob, shutil
from subprocess import check_call

def build(suffix):
@@ -21,9 +21,12 @@ def build(suffix):
    check_call('debuild -us -uc -b', shell=True)

def clean(suffix):
-    for f in glob.glob("debian/scrapy-%s.*" % suffix) + \
-            glob.glob("debian/scrapyd-%s.*" % suffix):
-        os.remove(f)
+    for f in glob.glob("debian/python-scrapy%s*" % suffix) + \
+            glob.glob("debian/scrapyd%s*" % suffix):
+        if os.path.isdir(f):
+            shutil.rmtree(f)
+        else:
+            os.remove(f)

def main():
    cmd = sys.argv[1]
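The `clean()` change above replaces a bare `os.remove()` with logic that also handles directories, since `os.remove()` raises an error on a directory while `shutil.rmtree()` handles the whole tree. A minimal standalone sketch of that pattern (the helper name `remove_path` is hypothetical, not part of the commit):

```python
import os
import shutil
import tempfile

def remove_path(path):
    """Remove either a regular file or a directory tree."""
    if os.path.isdir(path):
        shutil.rmtree(path)  # directories need rmtree; os.remove() would raise
    else:
        os.remove(path)

# Demonstration against throwaway paths in a temp directory.
base = tempfile.mkdtemp()
d = os.path.join(base, "debian-build")
os.makedirs(d)
f = os.path.join(base, "stray.deb")
open(f, "w").close()

for p in (d, f):
    remove_path(p)

print(os.path.exists(d), os.path.exists(f))  # -> False False
```

This mirrors why the commit adds `shutil` to the imports: the new glob patterns can now match build directories, not just files.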
