Ticket3252 port web directory.remaining.1 #672

Merged
98 commits
fd4d270
search-replace "n:" -> "t:"
meejah Aug 22, 2019
9e0e25c
WIP FIXME porting directory, need child_ refactor thing
meejah Aug 23, 2019
ace73a9
whitespace
meejah Sep 3, 2019
66c0c71
a thing works
meejah Sep 3, 2019
b51f144
another thing works
meejah Sep 10, 2019
79c3f91
cleanup
meejah Sep 10, 2019
c2ff7e2
more ctx -> req
meejah Sep 11, 2019
8f32936
/file works
meejah Sep 11, 2019
71c9736
fix getchild for directory
meejah Sep 11, 2019
f4c8eb5
not required
meejah Sep 11, 2019
d381d3e
better style
meejah Sep 11, 2019
8b7e1c3
remove debug
meejah Sep 11, 2019
d12078a
fix form
meejah Sep 11, 2019
e230432
fix rename-form
meejah Sep 11, 2019
7a36bdc
delete 'move to different dir' because it also doesn't work on trunk
meejah Sep 11, 2019
b4d9bfe
fix deep-stats results
meejah Sep 11, 2019
247fc54
explicit pack/unpack test
meejah Sep 13, 2019
9219262
Revert "fix getchild for directory"
meejah Sep 18, 2019
410f013
correctly register top-level directory node page
meejah Sep 23, 2019
66f392f
remove debug
meejah Sep 23, 2019
dc4669b
fix name decoding
meejah Sep 24, 2019
19baba2
Resource needs to be new-style
meejah Sep 24, 2019
effaa65
filename goes in the dict
meejah Sep 24, 2019
800b690
the error-message changed
meejah Sep 24, 2019
a29f1be
doubly-quoted
meejah Sep 24, 2019
98005ba
addSlash isn't a thing in twisted.web
meejah Sep 24, 2019
543a948
shadowed test name; reveal both tests
meejah Sep 27, 2019
c4f49bb
children must be None on error
meejah Oct 5, 2019
45bfe74
improve comment wording
meejah Oct 5, 2019
e894795
redirects don't have to be absolute
meejah Oct 13, 2019
8268f76
quote output, render GET
meejah Oct 13, 2019
d26ce3b
some things that fail, but not in quite the right way .. for reasons
meejah Oct 25, 2019
83a70bb
some fixes
meejah Oct 25, 2019
bc04c9b
detech empty pathname components, hopefully the same way as Nevow
meejah Nov 17, 2019
97a3f61
irrelevant comment
meejah Nov 17, 2019
d0c47fe
irrelevant comment
meejah Nov 17, 2019
6415873
undo change
meejah Nov 18, 2019
044d1cf
correct error
meejah Nov 18, 2019
74ce166
use twisted-web APIs, not nevow
meejah Nov 18, 2019
2509671
flake8
meejah Nov 18, 2019
9732955
spelling
meejah Nov 18, 2019
203066e
better error-code
meejah Nov 18, 2019
178c45e
remove print
meejah Nov 19, 2019
bf3c71c
change not required
meejah Dec 22, 2019
757bcc5
cleanup
meejah Dec 22, 2019
65ec212
unused imports
meejah Dec 22, 2019
7c27f29
news
meejah Dec 22, 2019
d5ef65d
beautifulsoup, not re
meejah Dec 23, 2019
c489c61
refactor
meejah Dec 23, 2019
76516fe
use soup, not re
meejah Dec 23, 2019
bd1cbde
re/string checks -> soup
meejah Dec 28, 2019
b4fab44
use soup, not strings
meejah Dec 28, 2019
b71d499
more soup
meejah Dec 28, 2019
af35483
soup not re
meejah Dec 28, 2019
685aaf1
soup, not re
meejah Dec 28, 2019
956d67b
irrelevant comment
meejah Dec 28, 2019
965fadb
irrelevant comment
meejah Dec 28, 2019
b81589c
irrelevant comment
meejah Dec 28, 2019
db7939f
irrelevant comment
meejah Dec 28, 2019
98d8c52
comment
meejah Dec 28, 2019
48f859f
document internal callback
meejah Jan 31, 2020
d12fd57
assert for noreferrer
meejah Jan 31, 2020
9ccbe56
add break
meejah Jan 31, 2020
355c78f
irrelevant comment
meejah Jan 31, 2020
f9956f4
remove asserts for static text
meejah Jan 31, 2020
e4d556b
assert about connected storage servers
meejah Jan 31, 2020
b44980c
get rid of assert_
meejah Jan 31, 2020
fdb3399
just delete test_welcome
meejah Jan 31, 2020
ef5e18b
unused
meejah Jan 31, 2020
d533bec
native strings only
meejah Jan 31, 2020
3d3feec
encode for fail() / Exception
meejah Jan 31, 2020
f9e3fdf
use % instead of format
meejah Feb 13, 2020
ace99a1
add clarifying comment
meejah Feb 13, 2020
733b793
clarify further; remove unused user of RenderMixin
meejah Feb 13, 2020
3c332fe
empty-string, not None
meejah Feb 13, 2020
d425bae
remove irrelevant comment
meejah Feb 13, 2020
95e5029
more comment
meejah Feb 13, 2020
b64f90b
simplify
meejah Feb 13, 2020
8db16ff
old-style class
meejah Feb 13, 2020
c6f4f0b
betterize comment
meejah Feb 18, 2020
088fcff
better formatting
meejah Feb 24, 2020
8c47b8e
don't need object
meejah Feb 24, 2020
b0c138f
remove empty segments instead of making multiple URI's valid
meejah Feb 24, 2020
2e9463b
compute 'is this a terminal request' differently
meejah Feb 25, 2020
fc4aec7
remove comment
meejah Feb 25, 2020
6ea6abd
no .format yet
meejah Feb 25, 2020
7019157
make unpack/pack test use Hypothesis
meejah Feb 25, 2020
c246b3e
self.assertEqual not assert
meejah Feb 26, 2020
8f35f78
temporary fix for hypothesis test
meejah Feb 26, 2020
bc2f5f8
functions shouldn't be named like classes
meejah Feb 28, 2020
0acf0d2
no addslash here
meejah Feb 29, 2020
882c63d
methods shouldn't be named like classes
meejah Feb 29, 2020
018e161
don't allow trailing slashes
meejah Mar 3, 2020
349aefe
fail() takes a message, not Exception
meejah Apr 18, 2020
73d0151
typo
meejah Apr 18, 2020
f8a78c9
URL -> DecodedURL
meejah Apr 18, 2020
8df1ed1
link to tickets
meejah Apr 18, 2020
c385e95
Merge branch 'master' into ticket3252-port-web-directory.remaining.1
meejah Apr 23, 2020
Empty file added newsfragments/3263.other
7 changes: 5 additions & 2 deletions src/allmydata/scripts/tahoe_backup.py
@@ -51,8 +51,8 @@ def mkdir(contents, options):
return dircap

def put_child(dirurl, childname, childcap):
assert dirurl[-1] == "/"
url = dirurl + urllib.quote(unicode_to_url(childname)) + "?t=uri"
assert dirurl[-1] != "/"
url = dirurl + "/" + urllib.quote(unicode_to_url(childname)) + "?t=uri"
resp = do_http("PUT", url, childcap)
if resp.status not in (200, 201):
raise HTTPError("Error during put_child", resp)
@@ -105,6 +105,9 @@ def run(self):

archives_url = to_url + "Archives/"

archives_url = archives_url.rstrip("/")
to_url = to_url.rstrip("/")

# first step: make sure the target directory exists, as well as the
# Archives/ subdirectory.
resp = do_http("GET", archives_url + "?t=json")
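
The net effect of this hunk is that directory URLs are handled without trailing slashes: put_child() now asserts the slash is absent and adds the separator itself, and run() strips any trailing slash from to_url and archives_url. A minimal sketch of the new joining convention (illustrative only; the URL and child name below are made up, not taken from the diff):

import urllib

def join_child(dirurl, childname):
    # dirurl is expected *without* a trailing slash; the separator is added here
    assert not dirurl.endswith("/")
    return dirurl + "/" + urllib.quote(childname) + "?t=uri"

# e.g. join_child("http://127.0.0.1:3456/uri/URI%3ADIR2%3Aexample", "backup.1")
#      -> "http://127.0.0.1:3456/uri/URI%3ADIR2%3Aexample/backup.1?t=uri"
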
8 changes: 7 additions & 1 deletion src/allmydata/test/cli/test_cli.py
@@ -911,8 +911,14 @@ def _check(args, st):
self.failUnlessReallyEqual(err, "")
self.failUnlessIn(st, out)
return out

def _mkdir(ign, mutable_type, uri_prefix, dirname):
d2 = self.do_cli("mkdir", "--format="+mutable_type, dirname)
"""
:param str mutable_type: 'sdmf' or 'mdmf' (or uppercase versions)
:param str uri_prefix: kind of URI
:param str dirname: the directory alias
"""
d2 = self.do_cli("mkdir", "--format={}".format(mutable_type), dirname)
d2.addCallback(_check, uri_prefix)
def _stash_filecap(cap):
u = uri.from_string(cap)
29 changes: 29 additions & 0 deletions src/allmydata/test/common.py
@@ -4,13 +4,15 @@
"SyncTestCase",
"AsyncTestCase",
"AsyncBrokenTestCase",
"TrialTestCase",

"flush_logged_errors",
"skip",
"skipIf",
]

import os, random, struct
import six
import tempfile
from tempfile import mktemp
from functools import partial
@@ -57,6 +59,7 @@
IReactorSocket,
)
from twisted.internet.endpoints import AdoptedStreamServerEndpoint
from twisted.trial.unittest import TestCase as _TrialTestCase

from allmydata import uri
from allmydata.interfaces import IMutableFileNode, IImmutableFileNode,\
@@ -1242,3 +1245,29 @@ class AsyncBrokenTestCase(_TestCaseMixin, TestCase):
run_tests_with = EliotLoggedRunTest.make_factory(
AsynchronousDeferredRunTestForBrokenTwisted.make_factory(timeout=60.0),
)


class TrialTestCase(_TrialTestCase):
"""
A twisted.trial.unittest.TestCase with Tahoe-required fixes
applied. Currently these are:

- ensure that .fail() passes a bytes msg on Python2
"""

def fail(self, msg):
"""
Ensure our msg is a native string on Python2. If it was Unicode,
we encode it as utf8 and hope for the best. On Python3 we take
no action.

This is necessary because Twisted passes the 'msg' argument
along to the constructor of an exception; on Python2,
Exception will accept a `unicode` instance but will fail if
you try to turn that Exception instance into a string.
"""

if six.PY2:
if isinstance(msg, six.text_type):
return super(TrialTestCase, self).fail(msg.encode("utf8"))
return super(TrialTestCase, self).fail(msg)
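
A brief illustration (not part of this diff) of the Python 2 failure mode this override guards against: constructing an exception from a unicode message succeeds, but turning that exception into a string later raises UnicodeEncodeError for non-ASCII text.

# Python 2 only; illustrative sketch, not part of the change above.
e = AssertionError(u"caf\xe9 is not caf\xe9")
try:
    str(e)  # Exception.__str__ implicitly ascii-encodes the unicode argument
except UnicodeEncodeError:
    print("str() on an Exception built from unicode fails on Python 2")

# Encoding the message first (as TrialTestCase.fail() does) avoids the problem:
msg = u"caf\xe9 is not caf\xe9"
str(AssertionError(msg.encode("utf8")))  # works: the argument is already bytes
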
31 changes: 31 additions & 0 deletions src/allmydata/test/test_dirnode.py
@@ -9,6 +9,7 @@
from allmydata import uri, dirnode
from allmydata.client import _Client
from allmydata.immutable import upload
from allmydata.immutable.literal import LiteralFileNode
from allmydata.interfaces import IImmutableFileNode, IMutableFileNode, \
ExistingChildError, NoSuchChildError, MustNotBeUnknownRWError, \
MustBeDeepImmutableError, MustBeReadonlyError, \
@@ -27,6 +28,9 @@
from base64 import b32decode
import allmydata.test.common_util as testutil

from hypothesis import given
from hypothesis.strategies import text

if six.PY3:
long = int

@@ -1460,6 +1464,33 @@ def _make_kids(self, nm, which):
kids[unicode(name)] = (nm.create_from_cap(caps[name]), {})
return kids

@given(text(min_size=1, max_size=20))
def test_pack_unpack_unicode_hypothesis(self, name):
"""
pack -> unpack results in the same objects (with a unicode name)
"""
nm = NodeMaker(None, None, None, None, None, {"k": 3, "n": 10}, None, None)
fn = MinimalFakeMutableFile()

# FIXME TODO: we shouldn't have to do this out here, but
# Hypothesis found that a name with "\x2000" does not make the
# round-trip properly .. so for now we'll only give the packer
# normalized names.
# See also:
# https://tahoe-lafs.org/trac/tahoe-lafs/ticket/2606
# https://tahoe-lafs.org/trac/tahoe-lafs/ticket/1076
name = unicodedata.normalize('NFC', name)

kids = {
name: (LiteralFileNode(uri.from_string(one_uri)), {}),
}
packed = dirnode.pack_children(kids, fn.get_writekey(), deep_immutable=False)
write_uri = "URI:SSK-RO:e3mdrzfwhoq42hy5ubcz6rp3o4:ybyibhnp3vvwuq2vaw2ckjmesgkklfs6ghxleztqidihjyofgw7q"
filenode = nm.create_from_cap(write_uri)
dn = dirnode.DirectoryNode(filenode, nm, None)
unkids = dn._unpack_contents(packed)
self.assertEqual(kids, unkids)

def test_deep_immutable(self):
nm = NodeMaker(None, None, None, None, None, {"k": 3, "n": 10}, None, None)
fn = MinimalFakeMutableFile()
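
For context on the normalization comment in the new Hypothesis test above, a small illustration (not part of the diff) of why un-normalized names can fail to round-trip: some characters are not stable under NFC, so the name read back out of a packed directory need not be identical to the one that went in unless it is normalized first.

import unicodedata

# U+2000 (EN QUAD) canonically decomposes to U+2002 (EN SPACE), so it changes
# under NFC; combining sequences are composed, e.g. "e" + COMBINING ACUTE -> U+00E9.
assert unicodedata.normalize("NFC", u"\u2000") == u"\u2002"
assert unicodedata.normalize("NFC", u"e\u0301") == u"\u00e9"
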
58 changes: 58 additions & 0 deletions src/allmydata/test/web/common.py
@@ -18,6 +18,64 @@ def assert_soup_has_favicon(testcase, soup):
any(t[u'href'] == u'/icon.png' for t in links), soup)


def assert_soup_has_tag_with_attributes(testcase, soup, tag_name, attrs):
"""
Using a ``TestCase`` object ``testcase``, assert that the passed
in ``BeautifulSoup`` object ``soup`` contains a tag ``tag_name``
(unicode) which has all the attributes in ``attrs`` (dict).
"""
tags = soup.find_all(tag_name)
for tag in tags:
if all(v in tag.attrs.get(k, []) for k, v in attrs.items()):
return # we found every attr in this tag; done
testcase.fail(
u"No <{}> tags contain attributes: {}".format(tag_name, attrs)
)


def assert_soup_has_tag_with_attributes_and_content(testcase, soup, tag_name, content, attrs):
"""
Using a ``TestCase`` object ``testcase``, assert that the passed
in ``BeautifulSoup`` object ``soup`` contains a tag ``tag_name``
(unicode) which has all the attributes in ``attrs`` (dict) and
contains the string ``content`` (unicode).
"""
assert_soup_has_tag_with_attributes(testcase, soup, tag_name, attrs)
assert_soup_has_tag_with_content(testcase, soup, tag_name, content)


def _normalized_contents(tag):
"""
:returns: all the text contents of the tag with whitespace
normalized: all newlines removed and at most one space between
words.
"""
return u" ".join(tag.text.split())


def assert_soup_has_tag_with_content(testcase, soup, tag_name, content):
"""
Using a ``TestCase`` object ``testcase``, assert that the passed
in ``BeautifulSoup`` object ``soup`` contains a tag ``tag_name``
(unicode) which contains the string ``content`` (unicode).
"""
tags = soup.find_all(tag_name)
for tag in tags:
if content in tag.contents:
return

# make these "fuzzy" options?
for c in tag.contents:
if content in c:
return

if content in _normalized_contents(tag):
return
testcase.fail(
u"No <{}> tag contains the text '{}'".format(tag_name, content)
)


def assert_soup_has_text(testcase, soup, text):
"""
Using a ``TestCase`` object ``testcase``, assert that the passed in
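
A short usage sketch for the new soup helpers (illustrative only; the markup and the `self` TestCase instance are assumed, not taken from the diff):

from bs4 import BeautifulSoup

soup = BeautifulSoup('<a href="/x" rel="noreferrer">lonely</a>', "html5lib")
# the <a> tag carries rel="noreferrer" (bs4 exposes rel as a list of tokens)
assert_soup_has_tag_with_attributes(self, soup, u"a", {u"rel": u"noreferrer"})
# the <a> tag's text contains "lonely"
assert_soup_has_tag_with_content(self, soup, u"a", u"lonely")
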
66 changes: 39 additions & 27 deletions src/allmydata/test/web/test_grid.py
@@ -3,6 +3,9 @@
import os.path, re, urllib
import json
from six.moves import StringIO

from bs4 import BeautifulSoup

from nevow import rend
from twisted.trial import unittest
from allmydata import uri, dirnode
@@ -325,8 +328,8 @@ def test_unknown(self, immutable=False):

def _stash_root_and_create_file(n):
self.rootnode = n
self.rooturl = "uri/" + urllib.quote(n.get_uri()) + "/"
self.rourl = "uri/" + urllib.quote(n.get_readonly_uri()) + "/"
self.rooturl = "uri/" + urllib.quote(n.get_uri())
self.rourl = "uri/" + urllib.quote(n.get_readonly_uri())
if not immutable:
return self.rootnode.set_node(name, future_node)
d.addCallback(_stash_root_and_create_file)
@@ -386,7 +389,7 @@ def _check_info(res, expect_rw_uri, expect_ro_uri):

d.addCallback(lambda ign: self.GET(expected_info_url))
d.addCallback(_check_info, expect_rw_uri=False, expect_ro_uri=False)
d.addCallback(lambda ign: self.GET("%s%s?t=info" % (self.rooturl, str(name))))
d.addCallback(lambda ign: self.GET("%s/%s?t=info" % (self.rooturl, str(name))))
d.addCallback(_check_info, expect_rw_uri=False, expect_ro_uri=True)

def _check_json(res, expect_rw_uri):
@@ -410,7 +413,7 @@ def _check_json(res, expect_rw_uri):
# TODO: check metadata contents
self.failUnlessIn("metadata", data[1])

d.addCallback(lambda ign: self.GET("%s%s?t=json" % (self.rooturl, str(name))))
d.addCallback(lambda ign: self.GET("%s/%s?t=json" % (self.rooturl, str(name))))
d.addCallback(_check_json, expect_rw_uri=not immutable)

# and make sure that a read-only version of the directory can be
@@ -425,7 +428,7 @@ def _check_json(res, expect_rw_uri):
d.addCallback(lambda ign: self.GET(self.rourl+"?t=json"))
d.addCallback(_check_directory_json, expect_rw_uri=False)

d.addCallback(lambda ign: self.GET("%s%s?t=json" % (self.rourl, str(name))))
d.addCallback(lambda ign: self.GET("%s/%s?t=json" % (self.rourl, str(name))))
d.addCallback(_check_json, expect_rw_uri=False)

# TODO: check that getting t=info from the Info link in the ro directory
@@ -492,7 +495,7 @@ def _created(dn):
self.failUnlessIn("CHK", cap.to_string())
self.cap = cap
self.rootnode = dn
self.rooturl = "uri/" + urllib.quote(dn.get_uri()) + "/"
self.rooturl = "uri/" + urllib.quote(dn.get_uri())
return download_to_data(dn._node)
d.addCallback(_created)

@@ -534,19 +537,28 @@ def _check_kids(children):
# Make sure the lonely child can be listed in HTML...
d.addCallback(lambda ign: self.GET(self.rooturl))
def _check_html(res):
soup = BeautifulSoup(res, 'html5lib')
self.failIfIn("URI:SSK", res)
get_lonely = "".join([r'<td>FILE</td>',
r'\s+<td>',
r'<a href="[^"]+%s[^"]+" rel="noreferrer">lonely</a>' % (urllib.quote(lonely_uri),),
r'</td>',
r'\s+<td align="right">%d</td>' % len("one"),
])
self.failUnless(re.search(get_lonely, res), res)

# find the More Info link for name, should be relative
mo = re.search(r'<a href="([^"]+)">More Info</a>', res)
info_url = mo.group(1)
self.failUnless(info_url.endswith(urllib.quote(lonely_uri) + "?t=info"), info_url)
found = False
for td in soup.find_all(u"td"):
if td.text != u"FILE":
continue
a = td.findNextSibling()(u"a")[0]
self.assertIn(urllib.quote(lonely_uri), a[u"href"])
self.assertEqual(u"lonely", a.text)
self.assertEqual(a[u"rel"], [u"noreferrer"])
self.assertEqual(u"{}".format(len("one")), td.findNextSibling().findNextSibling().text)
found = True
break
self.assertTrue(found)

infos = list(
a[u"href"]
for a in soup.find_all(u"a")
if a.text == u"More Info"
)
self.assertEqual(1, len(infos))
self.assertTrue(infos[0].endswith(urllib.quote(lonely_uri) + "?t=info"))
d.addCallback(_check_html)

# ... and in JSON.
@@ -573,7 +585,7 @@ def test_deep_check(self):
d = c0.create_dirnode()
def _stash_root_and_create_file(n):
self.rootnode = n
self.fileurls["root"] = "uri/" + urllib.quote(n.get_uri()) + "/"
self.fileurls["root"] = "uri/" + urllib.quote(n.get_uri())
return n.add_file(u"good", upload.Data(DATA, convergence=""))
d.addCallback(_stash_root_and_create_file)
def _stash_uri(fn, which):
@@ -747,7 +759,7 @@ def test_deep_check_and_repair(self):
d = c0.create_dirnode()
def _stash_root_and_create_file(n):
self.rootnode = n
self.fileurls["root"] = "uri/" + urllib.quote(n.get_uri()) + "/"
self.fileurls["root"] = "uri/" + urllib.quote(n.get_uri())
return n.add_file(u"good", upload.Data(DATA, convergence=""))
d.addCallback(_stash_root_and_create_file)
def _stash_uri(fn, which):
@@ -960,7 +972,7 @@ def test_deep_add_lease(self):
def _stash_root_and_create_file(n):
self.rootnode = n
self.uris["root"] = n.get_uri()
self.fileurls["root"] = "uri/" + urllib.quote(n.get_uri()) + "/"
self.fileurls["root"] = "uri/" + urllib.quote(n.get_uri())
return n.add_file(u"one", upload.Data(DATA, convergence=""))
d.addCallback(_stash_root_and_create_file)
def _stash_uri(fn, which):
@@ -1027,8 +1039,8 @@ def test_exceptions(self):
DATA = "data" * 100
d = c0.create_dirnode()
def _stash_root(n):
self.fileurls["root"] = "uri/" + urllib.quote(n.get_uri()) + "/"
self.fileurls["imaginary"] = self.fileurls["root"] + "imaginary"
self.fileurls["root"] = "uri/" + urllib.quote(n.get_uri())
self.fileurls["imaginary"] = self.fileurls["root"] + "/imaginary"
return n
d.addCallback(_stash_root)
d.addCallback(lambda ign: c0.upload(upload.Data(DATA, convergence="")))
@@ -1044,14 +1056,14 @@ def _stash_bad(ur):
d.addCallback(lambda ign: c0.create_dirnode())
def _mangle_dirnode_1share(n):
u = n.get_uri()
url = self.fileurls["dir-1share"] = "uri/" + urllib.quote(u) + "/"
url = self.fileurls["dir-1share"] = "uri/" + urllib.quote(u)
self.fileurls["dir-1share-json"] = url + "?t=json"
self.delete_shares_numbered(u, range(1,10))
d.addCallback(_mangle_dirnode_1share)
d.addCallback(lambda ign: c0.create_dirnode())
def _mangle_dirnode_0share(n):
u = n.get_uri()
url = self.fileurls["dir-0share"] = "uri/" + urllib.quote(u) + "/"
url = self.fileurls["dir-0share"] = "uri/" + urllib.quote(u)
self.fileurls["dir-0share-json"] = url + "?t=json"
self.delete_shares_numbered(u, range(0,10))
d.addCallback(_mangle_dirnode_0share)
@@ -1330,8 +1342,8 @@ def _get_dircap(dn):
self.dir_si_b32 = base32.b2a(dn.get_storage_index())
self.dir_url_base = "uri/"+dn.get_write_uri()
self.dir_url_json1 = "uri/"+dn.get_write_uri()+"?t=json"
self.dir_url_json2 = "uri/"+dn.get_write_uri()+"/?t=json"
self.dir_url_json_ro = "uri/"+dn.get_readonly_uri()+"/?t=json"
self.dir_url_json2 = "uri/"+dn.get_write_uri()+"?t=json"
self.dir_url_json_ro = "uri/"+dn.get_readonly_uri()+"?t=json"
self.child_url = "uri/"+dn.get_readonly_uri()+"/child"
d.addCallback(_get_dircap)
d.addCallback(lambda ign: self.GET(self.dir_url_base, followRedirect=True))
Expand Down