

initial demo of using jsonp to allow the client.js to get a page from another wiki-server #47

wants to merge 1 commit into from

3 participants


initial demo of using jsonp to allow the client.js to get a page from another wiki-server - add a link of the style [[remoteserver:1111|welcome-visitors]] - but you must make sure both servers are running a jsonp-enabled wiki
note that this makes the /remote URI redundant.. (though i've only tested viewing)

please don't merge into master - this is really just to show you what I'm talking about wrt using JSONP for REST and Fork or remote wiki pages.

a lot of client code still points to the wrong URLs - 'show source', for example - a little refactoring to extract out the URL-building code should simplify that though
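
The URL-building refactor suggested here could look something like this sketch - one shared helper for the `site?`/local branch that the diff below repeats in several places. The function name and shape are illustrative, not from the patch:

```javascript
// Hypothetical sketch of the suggested refactor: a single helper that both
// the page fetch and the "show source" paths could share. Not from the patch.
function buildResourceUrl(slug, site) {
  // a remote page is fetched directly from the other wiki server;
  // a local page keeps its leading slash
  return site != null
    ? 'http://' + site + '/' + slug + '.json'
    : '/' + slug + '.json';
}
```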

@SvenDowideit SvenDowideit initial demo of using jsonp to allow the client.js to get a page from…
… another wiki-server - add alink of the style [remoteserver:1111|welcome-visitors] - but you must make sure both servers are running a jsonp enabled wiki

  note that this makes the /remote uri redundant.. (though i've only tested viewing)

I like this. I think it would be sweet if a server and the client code it offers could have the freedom to choose either approach:

  • client goes straight to server with the desired page
  • client asks server to retrieve the desired page

As you point out, to make this work all servers would need to understand the jsonp protocol. I am a little afraid of jsonp when applied to urls found on the internet. When I've used jsonp I've been in control of all sides of the dialog so I had no issues with eval of untrusted code.

Allen Wirfs-Brock tells me that he has been trying to get browser authors to relax the same-origin policy for application/json text, as simply parsing this is much safer than eval of random jsonp expressions. That would be ideal.


Well, I've talked to some Mozilla people about my idea, but it isn't something I've really pushed on. Also, Crockford said he liked the idea. We'd probably need to get broader community support before the browser implementers would really jump on it.

The basic idea would be to recognize application/json or application/jsonp as distinct mime types in script tags.

So to load a json data file you would say something like:
<script src="whatever" type="application/json"> </script>

and the user agent would recognize "application/json" as a distinct scripting language which it would process using the JavaScript JSON.parse functionality it already has. The resulting objects would then be hung off the DOM node corresponding to the script tag, with no cross-site restrictions.
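
The inert-data half of this proposal already works today: a page can parse a script tag's text itself, since JSON.parse accepts only data literals. A sketch, assuming markup like `<script id="page-data" type="application/json">{"title":"welcome-visitors"}</script>` (the id is illustrative):

```javascript
// Sketch: JSON.parse accepts only a data literal, so nothing in the payload
// can execute - the safety property the proposal above relies on.
function loadInlineJson(scriptText) {
  return JSON.parse(scriptText);  // throws on anything that isn't pure data
}
```

In a browser this would be fed `document.getElementById('page-data').textContent`; the missing piece, as described above, is letting the script's `src` cross origins.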

I have a writeup that I have never broadly circulated that describes this in more detail and addresses issues like how to make it work with existing jsonp servers that expose files that would normally not parse with JSON.parse.


goodness me, let's see.

non-cross site restricted access to import data from anywhere, into your local browser's persistent datastore.

I would love it, and so would the blackhats - yes, it's much better than random code injection, but it is still random data injection.

so Allen - oh, yes please, making a web data -> DOM mechanism would really improve data application developers' lives.

if we're talking about pushing browsers into a sanity-land where they actually implement what should have been there in the first place, how about fixing authentication? why are browsers not essentially using my ssh keys (though i don't quite grok why ssh keys are separate from gpg keys ...)


Note that my proposal is far safer than current jsonp approaches, which allow completely arbitrary JavaScript code to be injected and executed. My proposal only allows valid JSON data trees to be loaded. Such trees are completely inert. No execution would be involved. At worst you get random strings, numbers, and interior tree node objects hanging off specific script nodes. You are going to get that from any form of json data load, so I don't see where your concern lies. If it is that the data becomes accessible by anyone via dom traversal, then stash the object reference in a local var and delete the script node as soon as it is loaded. However, you probably aren't really protecting much if anybody can already access the data using jsonp and a script node with type="text/javascript".
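
The distinction being argued can be made concrete: a jsonp response is arbitrary script that a `<script>` tag executes wholesale, while JSON.parse rejects anything that is not a data literal. A minimal illustration (the payloads are invented):

```javascript
// A <script> tag would run a jsonp response in full, so a hostile server can
// piggyback extra statements on it; JSON.parse refuses the same text because
// it is not a pure data literal.
const jsonpResponse = 'cb({"ok":true}); doSomethingHostile()';  // would run as script
const jsonResponse = '{"ok":true}';

const data = JSON.parse(jsonResponse);  // inert tree, no execution

let rejected = false;
try {
  JSON.parse(jsonpResponse);  // not valid JSON - throws
} catch (e) {
  rejected = true;
}
```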


yes, exactly, it'll be safer, but still as unsafe as the rest of the web.

Way too many of us do DSL / data is code like work, so 'no execution of js' does not mean no execution.

Really, it all comes back to the sad fact that web technology has not worked on webs of trust (gpg style), so that I can use type=application/json, and have some assurance (by checking the signature against corruption, and that it comes from a source that I have decided to trust) that it's data from someone I know.

imo we don't really want to end up with a git-like collaboration method - it leads back to single webmaster syndrome, with the added confusion of many duplicate forks with spelling changes. instead, if I sign your public key as 'i trust you', then your changes could be auto-merged...

I do wish this was rw federated wiki accessible, that way it'd be easier to refactor.


Ward - I'm playing with my static-server implementation, and I think I can make it serve jsonp using mod_include (ie, server side includes).

if that's the case, can we change the definition of a federated server to only talk jsonp? from there, we can later add mirror/read non-federated wiki sources separately..
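
Whatever serves it - mod_include or a tiny script - the jsonp wrapping itself is small, as the node server changes in the diff below show. The core is just:

```javascript
// The wrapping in isolation: if the request carried ?callback=fn, wrap the
// JSON body in a call to fn and serve it as text/javascript so a <script>
// tag on another origin can load it; otherwise serve plain application/json.
function wrapJsonp(json, callbackName) {
  if (callbackName) {
    return callbackName + '( ' + json + ' )';
  }
  return json;
}
```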


We'd need to work out a way to prevent javascript injection before I'd feel comfortable with building on jsonp.

As I reread your previous comments on this issue I realize I don't completely understand your vision of how computation might be spread over a federation. Would you say more?

After pondering this for a while more I've realized I'm operating from some unstated assumptions. I've imagined a federation of servers creating a medium where higher level structures could emerge. I'd also imagined that this emergent structure would be visible to the web and that this would be a strong motivator for participation. Participation would require provisioning some computational resource which would be under the sole control of the participant. I'm happy to elaborate why I think this to be a strong model. But let me instead echo what I now hear you saying.

You suggest that the federation provide a medium upon which structural subsets are formed as webs of trust. In this there need not be a distinction of client and server. All participants within the web are equals. Admission and expulsion from this web would be handled at a higher level. Jsonp is important to erase the distinction between client and server. A cryptographically sound web of trust makes any security concerns moot. A feature of this form is that I can unilaterally deploy distributed computations to any web where I am trusted.

Let me summarize the distinction as read-only trust v. read-write trust. The read-only model supports innovation because anyone can deploy arbitrary computational resources and use the federation only for retrieving and distributing inert information between them. The read-write model supports innovation because uncommitted computational resource is contributed to a trusted subset of the federation to be used freely by innovators.

For either model, the participation of static sites is only a boundary case and not likely to inform architectural decision making. I hope I've captured our respective interests. If not, help me understand what I might be missing. As always, I thank you for your participation.


I'm sorry for ignoring this for so long - I need to get some serious foswiki work done before the Foswiki Camp and general assembly at the end of the month - After that, well - I'm somehow expecting that figuring out how to give the girls their first snowy xmas will hit, and then it'll be time to jetlag back to Brisbane :)

I really don't feel I can put in enough time to work through my fuzzy ideas - but while they're drawing on new paper together (collaboration between young twins is fascinating)

my focus on what is currently the static site has more to do with simplifying - if we can use zero code to serve reading, we leverage the nature of http. To add write, all we need to do is write a simple bit of server code to allow PUT (er, or use DAV?)

This then makes me feel that we can focus on the meat of the innovation - the interaction and data dynamics that allow users to innovate.

so when we say 'static' i just think fast and lightweight, not read only.

btw, I keep staring at the elephant - what happens if you have millions of edit events on a page? Especially as iirc, we only get ~5MB in the browser datastore.
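
One way to cope - assuming the page keeps its edit events in a journal-style array, with the field name and cap purely illustrative - is to keep full history server-side and truncate what the browser stores:

```javascript
// Sketch of coping with an unbounded edit history under a ~5MB browser
// storage budget: store only the most recent actions client-side.
// The journal field and the cap are assumptions for the sketch.
function truncateJournal(page, maxActions) {
  if (page.journal.length <= maxActions) return page;
  return Object.assign({}, page, {
    journal: page.journal.slice(-maxActions)  // newest actions are at the end
  });
}
```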

I was hoping that we would have a federation already, so I could work on our combined and separate visions, but real life makes quiet moments harder :)

mmm, time's up.


We'll have good stuff waiting for you.

Commits on Oct 15, 2011
  1. @SvenDowideit

    initial demo of using jsonp to allow the client.js to get a page from…

    SvenDowideit committed
Showing with 72 additions and 19 deletions.
  1. +28 −7 client/
  2. +25 −9 client/client.js
  3. +12 −1 node.js/
  4. +7 −2 server/server.rb
35 client/
@@ -15,9 +15,23 @@ $ ->
randomBytes = (n) -> (randomByte() for [1..n]).join('')
renderInternalLink = (match, name) ->
+ #if there is a | in the link, treat it as a remote wiki server link
+ site=null
+ m = name.match(/^(.*)\|(.*)$/)
+ if m
+ site = m[1]
+ name = m[2]
# spaces become 'slugs', non-alpha-num get removed
slug = name.replace(/\s/g, '-').replace(/[^A-Za-z0-9-]/g, '').toLowerCase()
- "<a class=\"internal\" href=\"/"+slug+".html\" data-page-name=\""+slug+"\">"+name+"</a>"
+ link = $('<a>'+name+'</a>')
+ .addClass('internal')
+ .attr('href', '/'+slug+'.html')
+ .attr('data-page-name', slug)
+ .data("slug", slug)
+ if site?
+ link.attr("site", site)
+"site", site)
+ $('<div>').append(link).remove().html()
resolveLinks = (string) ->
@@ -258,7 +272,7 @@ $ ->
$(pageElement).data("data", data)
if site?
- $(pageElement).append "<h1><a href=\"//#{site}\"><img src = \"/remote/#{site}/favicon.png\" height = \"32px\"></a> #{page.title}</h1>"
+ $(pageElement).append "<h1><a href=\"//#{site}\"><img src = \"http://#{site}/favicon.png\" height = \"32px\"></a> #{page.title}</h1>"
$(pageElement).append "<h1><a href=\"/\"><img src = \"/favicon.png\" height = \"32px\"></a> #{page.title}</h1>"
@@ -297,8 +311,9 @@ $ ->
buildPage JSON.parse(json)
- resource = if site? then "remote/#{site}/#{slug}" else slug
- $.get "/#{resource}.json?random=#{randomBytes(4)}", "", (page) ->
+ resource = if site? then "http://#{site}/#{slug}" else '/'+slug
+ ####TODO: need to know if the server supports jsonp - and if not, use a normal get
+ $.getJSON "#{resource}.json?random=#{randomBytes(4)}&callback=?", (page) ->
buildPage page
@@ -320,8 +335,8 @@ $ ->
slug = $(pageElement).attr('id')
site = $(pageElement).data('site')
- resource = if site? then "remote/#{site}/#{slug}" else slug
- $.get "/#{resource}.json?random=#{randomBytes(4)}", "", (page) ->
+ resource = if site? then "http://#{site}/#{slug}" else '/'+slug
+ $.get "#{resource}.json?random=#{randomBytes(4)}", "", (page) ->
window.dialog.html('<pre>'+JSON.stringify(page, null, 2)+'</pre>')
window.dialog.dialog( "option", "title", "Source for: "+slug );
@@ -329,8 +344,14 @@ $ ->
.delegate '.internal', 'click', (e) ->
name = $('pageName')
+ site = $('site')
$('.page').nextAll().remove() unless e.shiftKey
- $("<div/>").attr('id', name).addClass("page").appendTo('.main').each refresh
+ newPage = $("<div/>")
+ .attr('id', name)
+ .addClass("page")
+ if site
+ newPage ='site', site)
+ newPage.appendTo('.main').each refresh
if History.enabled
pages = $.makeArray $(".page").map (_, el) ->
34 client/client.js
@@ -26,9 +26,20 @@
renderInternalLink = function(match, name) {
- var slug;
+ var link, m, site, slug;
+ site = null;
+ m = name.match(/^(.*)\|(.*)$/);
+ if (m) {
+ site = m[1];
+ name = m[2];
+ }
slug = name.replace(/\s/g, '-').replace(/[^A-Za-z0-9-]/g, '').toLowerCase();
- return "<a class=\"internal\" href=\"/" + slug + ".html\" data-page-name=\"" + slug + "\">" + name + "</a>";
+ link = $('<a>' + name + '</a>').addClass('internal').attr('href', '/' + slug + '.html').attr('data-page-name', slug).data("slug", slug);
+ if (site != null) {
+ link.attr("site", site);
+"site", site);
+ }
+ return $('<div>').append(link).remove().html();
resolveLinks = function(string) {
return string.replace(/\[\[([^\]]+)\]\]/gi, renderInternalLink).replace(/\[(http.*?) (.*?)\]/gi, "<a class=\"external\" href=\"$1\">$2</a>");
@@ -358,7 +369,7 @@
page = $.extend(empty, data);
$(pageElement).data("data", data);
if (site != null) {
- $(pageElement).append("<h1><a href=\"//" + site + "\"><img src = \"/remote/" + site + "/favicon.png\" height = \"32px\"></a> " + page.title + "</h1>");
+ $(pageElement).append("<h1><a href=\"//" + site + "\"><img src = \"http://" + site + "/favicon.png\" height = \"32px\"></a> " + page.title + "</h1>");
} else {
$(pageElement).append("<h1><a href=\"/\"><img src = \"/favicon.png\" height = \"32px\"></a> " + page.title + "</h1>");
@@ -398,8 +409,8 @@
return initDragging();
} else {
- resource = site != null ? "remote/" + site + "/" + slug : slug;
- return $.get("/" + resource + ".json?random=" + (randomBytes(4)), "", function(page) {
+ resource = site != null ? "http://" + site + "/" + slug : '/' + slug;
+ return $.getJSON("" + resource + ".json?random=" + (randomBytes(4)) + "&callback=?", function(page) {
return initDragging();
@@ -419,21 +430,26 @@
pageElement = $(this).parent().parent();
slug = $(pageElement).attr('id');
site = $(pageElement).data('site');
- resource = site != null ? "remote/" + site + "/" + slug : slug;
- return $.get("/" + resource + ".json?random=" + (randomBytes(4)), "", function(page) {
+ resource = site != null ? "http://" + site + "/" + slug : '/' + slug;
+ return $.get("" + resource + ".json?random=" + (randomBytes(4)), "", function(page) {
window.dialog.html('<pre>' + JSON.stringify(page, null, 2) + '</pre>');
window.dialog.dialog("option", "title", "Source for: " + slug);
return window.dialog.dialog('open');
}).delegate('.internal', 'click', function(e) {
- var name, page, pages;
+ var name, newPage, page, pages, site;
name = $('pageName');
+ site = $('site');
if (!e.shiftKey) {
- $("<div/>").attr('id', name).addClass("page").appendTo('.main').each(refresh);
+ newPage = $("<div/>").attr('id', name).addClass("page");
+ if (site) {
+ newPage ='site', site);
+ }
+ newPage.appendTo('.main').each(refresh);
if (History.enabled) {
pages = $.makeArray($(".page").map(function(_, el) {
13 node.js/
@@ -1,6 +1,7 @@
http = require 'http'
fs = require 'fs'
qs = require 'querystring'
+url = require 'url'
port = 8888
@@ -28,6 +29,11 @@ filetype = {
process.serve_url = (req, res) ->
file = req.url[1..]
+ urlparams = url.parse(req.url, true)
+ console.log urlparams
+ jsonpCallback = urlparams.query.callback
+ console.log 'callback='+jsonpCallback
if req.method == 'PUT'
#nasty hack, as we're only putting pages atm, and the URI's are :(
@@ -81,8 +87,13 @@ process.serve_url = (req, res) ->
status = 404
data = ""
console.log 'status: '+ status
- #console.log ' contentType: '+ contentType
+ if jsonpCallback
+ contentType = 'text/javascript'
+ console.log ' contentType: '+ contentType
if status is 200
+ if jsonpCallback
+ data = jsonpCallback+'( '+data+' )'
res.writeHead 200,
'Content-Type': contentType
'Content-Length': data.length + 1
9 server/server.rb
@@ -63,8 +63,13 @@ def resolve_links string
get %r{^/([a-z0-9-]+)\.json$} do |name|
- content_type 'application/json'
- JSON.pretty_generate(get_page(name))
+ if defined? params['callback'] then
+ content_type 'text/javascript'
+ params['callback'] + '( ' + JSON.pretty_generate(get_page(name)) + ' )'
+ else
+ content_type 'application/json'
+ JSON.pretty_generate(get_page(name))
+ end
put %r{^/page/([a-z0-9-]+)/action$} do |name|