Drawing geojson-based polygons on Google Maps

This post will demonstrate how to draw polygons on Google Maps v3 using GeoJSON-encoded data from GeoDjango. The most common method for displaying polygons on Google Maps seems to be KML, but Google Maps requires the KML file to be available on a public website, which is kind of a bore for debugging. This approach uses only JSON and the standard Maps drawing API. To display the polygons, you have to loop over them and call setMap() with your map.

I’m assuming the browser is getting GeoJSON from a GeoDjango model object like this: model.area.geojson, but it should work for data from other sources. Note that I’m using the excellent Underscore.js library to do really terse functional programming. Also note that the function takes an optional bounds object, which gets expanded as polygon points are added.

The code seems to perform very well in modern browsers, even for fairly large and complex polygons. Unfortunately there is no live demo, but the site I’m working on should go up soon.

function createPolygons(areajson, bounds){
  // areajson is a GeoJSON MultiPolygon: [polygon][ring][point]
  var coords = areajson.coordinates;
  // Note: current Underscore takes the iteratee first and the memo second
  var polygons = _(coords).reduce(function(memo_n, n) {
    var polygonpaths = _(n).reduce(function(memo_o, o) {
      var polygoncoords = _(o).reduce(function(memo_p, p) {
        // GeoJSON points are [lng, lat]; LatLng wants (lat, lng)
        var mylatlng = new google.maps.LatLng(p[1], p[0]);
        if(bounds){
          bounds.extend(mylatlng);
        }
        memo_p.push(mylatlng);
        return memo_p;
      }, new google.maps.MVCArray());
      memo_o.push(polygoncoords);
      return memo_o;
    }, new google.maps.MVCArray());
    var polygon = new google.maps.Polygon({
      paths: polygonpaths,
      strokeColor: "#808080",
      strokeOpacity: 0.8,
      strokeWeight: 2,
      fillColor: "#C0C0C0",
      fillOpacity: 0.35
    });
    memo_n.push(polygon);
    return memo_n;
  }, []);
  return polygons;
}
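For illustration, here is the same triple-nested traversal without Underscore or the Maps API, collecting plain {lat, lng} objects instead of LatLngs. multiPolygonToPaths is a hypothetical helper of mine, not part of the code above, but it makes the MultiPolygon nesting easy to see:

```javascript
// Sketch of the same nested reduce over a GeoJSON MultiPolygon,
// using plain arrays instead of MVCArrays and LatLngs.
function multiPolygonToPaths(areajson) {
  return areajson.coordinates.map(function(polygon) {
    return polygon.map(function(ring) {
      return ring.map(function(point) {
        // GeoJSON stores [lng, lat]
        return { lat: point[1], lng: point[0] };
      });
    });
  });
}

// Example: a MultiPolygon with one polygon made of one triangular ring
var area = {
  type: "MultiPolygon",
  coordinates: [[[[12.5, 55.6], [12.6, 55.7], [12.4, 55.7], [12.5, 55.6]]]]
};
var paths = multiPolygonToPaths(area);
console.log(paths[0][0][0]); // { lat: 55.6, lng: 12.5 }
```

With the real API you would build MVCArrays and Polygons from these paths and then call setMap() on each polygon.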

Creating Narwhal/CommonJS packages

I really like the way Javascript is moving from being this annoying thing you have to deal with when doing web development to becoming a proper server-side programming language with standard libraries and fast VMs. Yesterday I cloned the Narwhal git repo and tried to create my own package with a few simple collection types. Narwhal is done by the guys at 280 North and conforms to CommonJS, an attempt at a cross-platform standard by the various Javascript platform implementors.

I haven’t quite figured out what goes into creating a proper package, but if you check out and configure Narwhal, you can dump the collection implementation in /lib and the tests in /tests and see it all work. First the collections:

// -- friism Michael Friis

exports.Stack = function() {
  return new stack();
};

exports.Queue = function() {
  return new queue();
};

/**
 * Stack implementation, using native Javascript array
 */

function stack() {
  this.data = [];
}

stack.prototype.pop = function() {
  return this.data.pop();
};

stack.prototype.push = function(o) {
  this.data.push(o);
};

/**
 * Queue implementation. Mechanics lifted from Wikipedia
 * Could be optimised to do fewer slices, see here: 
 * http://safalra.com/web-design/javascript/queues/Queue.js
 */

function queue() {
  this.data = [];
  this.length = 0;
}

queue.prototype.isEmpty = function() {
  return (this.data.length === 0);
};

queue.prototype.enqueue = function(obj) {
  this.data.push(obj);
  this.length = this.data.length;
};

queue.prototype.dequeue = function() {
  var ret = this.data[0];
  this.data.splice(0, 1);
  this.length = this.data.length;
  return ret;
};

queue.prototype.peek = function() {
  return this.data[0];
};

queue.prototype.clear = function() {
  this.length = 0;
  this.data = [];
};

… and the tests:

var assert = require("test/assert");

var js5 = require("js5");

exports.testStackSimple = function() {
  var mystack = new js5.Stack();
  var myobject = "1";
  mystack.push(myobject);
  var popped = mystack.pop();
  assert.isEqual(myobject, popped);
};

exports.testStack = function() {
  var mystack = new js5.Stack();
  for(var i = 99; i >= 0; i--) {
    mystack.push(i.toString());
  }

  for(var j = 0; j < 100; j++) {
    assert.isEqual(mystack.pop(), j.toString());
  }
};

exports.testQueue = function() {
  var myqueue = new js5.Queue();
  for(var i = 0; i < 100; i++) {
    myqueue.enqueue(i.toString());
  }

  for(var j = 0; j < 100; j++) {
    assert.isEqual(myqueue.dequeue(), j.toString());
  }
};

if (module == require.main) {
    require("os").exit(require("test").run(exports));
}
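The "fewer slices" optimisation hinted at in the queue comment can be sketched like so. This is my own illustration (not the linked Safalra implementation): keep a head index instead of splicing the front off on every dequeue, and compact the backing array only when most of it is dead space, giving O(1) amortised dequeues.

```javascript
// Queue with O(1) amortised dequeue: advance a head index rather than
// splicing, and compact only when the dead prefix dominates the array.
function FastQueue() {
  this.data = [];
  this.head = 0;
}

FastQueue.prototype.enqueue = function(obj) {
  this.data.push(obj);
};

FastQueue.prototype.dequeue = function() {
  if (this.head >= this.data.length) return undefined;
  var ret = this.data[this.head++];
  // Compact when at least half the array is already-dequeued slots
  if (this.head * 2 >= this.data.length) {
    this.data = this.data.slice(this.head);
    this.head = 0;
  }
  return ret;
};

FastQueue.prototype.isEmpty = function() {
  return this.head >= this.data.length;
};

var q = new FastQueue();
q.enqueue("a");
q.enqueue("b");
console.log(q.dequeue()); // prints a
```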

I asked for help on the Narwhal IRC channel and Kris Kowal pointed me to Chiron, a module library he's working on that already contains set and dictionary implementations. I recommend checking out the code; it highlights some of the interesting challenges of implementing collections in Javascript.

Also, the "Javascript, the Good Parts" talk by Doug Crockford (author of the book of the same name) is really good.

Book: State of the eUnion

Last fall, I wrote a chapter for a book titled “State of the eUnion”. My chapter is called “Democracy 2.0” and is about how sites like Folkets Ting, OpenCongress and TheyWorkForYou get built and what features should go into them. The other chapters are about the challenges and possibilities of governments and the Internet in general. They are written by people like Tim O’Reilly, Lawrence Lessig and David Weinberger — very humbling company. You can download a PDF or buy a copy on Amazon.

Post with videos of me saying words

The whole Folkets Ting business has turned out rather well (even though the site is not currently updated — we’re working on it!) and I’ve been invited to speak on a few occasions. Some of the talks were recorded, and in the interest of self-aggrandizement they are included below in chronological order (except for the last one).

Short interview at Halvandet, the day before Reboot11 started:

Talk at Reboot11:

Talk (in Danish) at HeadStart morning inspiration-session in Århus:

Short blurb (in Danish) on what I think about the usefulness of public data at the ODIS conference:

Speech on “Political Data API” after a project of mine won a competition promoting reuse of public data (winners were announced at the conference mentioned above):

You can watch the same video with slides here

And finally, a non-Folkets Ting video where I talk about TEDBot, recorded at the “Berlin in October” un-conference:

Famous Danish Programmers

Denmark somehow seems to have hatched more programmers and language designers of note than one would expect of a country of 6 million. Since almost none of them live in Denmark, it is kind of easy to forget. Here’s a partial list (alphabetical, inclusion determined by my completely whimsical notions of famousness, reasons for inclusion may be somewhat exaggerated):

News Essay

(This summer I applied for the “Coders Wanted” Knight Foundation Scholarship at the Medill School of Journalism. In case anyone’s interested, I’m uploading the essays I wrote for my application.)

Question: In the new media landscape, it’s possible for anyone to do the things that professional journalists do: for instance, dig up information other people are interested in, shoot photos or video of newsworthy events, and publish their work for others to see. What is the role of the professional journalist in this world where anyone can publish? What should be the relationship between the professionals and “citizen journalists”?

The relationship between citizen and professional journalists was put to the test by media coverage of the protests and disturbances that followed in the wake of the recent Iranian election. Most professional journalists and photographers from western media had been ejected from Iran or were otherwise prevented from filing stories from the country. A lot of media coverage ended up being built on information from Iranian bloggers and Twitter users and on videos posted to YouTube and similar online services.

While Iran is certainly an extreme case, events there underscore the trend that breaking news coverage is increasingly handled by so-called “citizen journalists”. There are several reasons professional journalists are less likely to be around when news happens. First, repressive governments, aware of the explosive role of media, may simply ban journalists. This is what happened in Iran and also — arguably — in Gaza in 2008-2009. Second, news organizations today do not have the resources to support a large and dispersed network of journalists deployed around the world. Last, the probability that a citizen will be on the spot with an Internet-connected videophone when something newsworthy happens is just much greater than that of a news team being nearby.

Because “breaking news” and setting the agenda with “exclusive” stories have traditionally been points of competition among professional journalists, this development seems to be causing cases of Twitter envy at some news organizations. The result can be thinly sourced news based on random tweets and undated YouTube videos. Indeed, the news hounds on Twitter and elsewhere demand that journalists pick up these stories, as coverage on, say, CNN is considered a validation of the seriousness of what is going on. Journalists and editors at CNN caught an earful from Twitter users and bloggers for a perceived lack of Iran coverage immediately after the election, in spite of there being precious little verifiable information to report at the time.

In my opinion, it would behoove journalists and editors to refrain from propagating largely unsubstantiated news found on social media platforms, even when — as is typically done now — it is presented with large disclaimers. The trouble with these stories is that they add value for no one: the news junkie with an interest in the topic at hand will invariably already be well informed, while the casual observer will only understand that, apparently, someone on Twitter is writing about an event alleged to have happened just now. Worse, even with disclaimers (and partly because of them), the credibility of professional news organisations suffers when some stories turn out to be false or outright scams.

This is not to say that professionals cannot draw on citizen journalists when piecing together stories. A foreign correspondent analyzing the situation in Iran could very well discuss third-hand reports from citizens, but in a critical manner, and not rely on them alone as the primary source. Reports could also be carefully augmented with video and pictures shot by citizens. This critical and cautious approach may sometimes be construed by opinionated citizens as arrogance and aloofness on the part of professional journalists. To avoid this, professionals should reach out and educate citizens about their need for credibility and their commitment to fair and balanced reporting.

The optimal relationship would have professional journalists continually being kept to task by engaged citizens who, in turn, are encouraged by those same journalists to file credible and (if possible) verifiable photos, videos and eyewitness accounts. This would give media users access to a range of news sources, from reasoned analysis by journalists, corroborated and augmented by citizen reports, to drinking straight from the pipe of raw and opinionated coverage flowing out of Twitter, YouTube or whatever other platform is in vogue. Some professional journalists already embrace this development, and I think Rick Sanchez of CNN says it particularly well when defending that channel’s Iran coverage in the last 30 seconds of this clip.

An excellent example of citizen and professional journalists working together is found in a recent unravelling of a string of cases of medical malpractice in Denmark. Two journalists were contacted by a couple whose infant child had died some time after swallowing a battery. The parents had pleaded with doctors to examine their child, to no avail. Their complaint about the lack of treatment had also been turned down (Denmark has a single-provider health care system which is sometimes not very receptive to criticism). The journalists saw they had a powerful and emotional story, but wanted to find out if it was part of a trend or just a lone case. To that end, they created a Facebook group where people could volunteer similar stories. This unearthed a string of malpractice cases where complaints had also fallen on deaf ears. These were duly investigated and yielded a series of articles on medical negligence and ignored complaints. The journalists continued to use the Facebook group as a sounding board for new article angles and ideas and for soliciting feedback. Investigating and building this sort of story, while not impossible, would certainly have been very time consuming without the active participation of involved citizens.

While labeling people who volunteer stories on a Facebook group “citizen journalists” may be a bit thick, they do form part of a continuum that extends from Twitter users and YouTubers to bloggers. In the end, the professional journalists could write a string of explosive articles, citizens got their previously ignored stories told, and all Danes will hopefully get better health care as a result.

What, then, is the role of the professional journalist confronted with wired, media-savvy and outspoken citizens? Journalists should insist on their commitment to provide fair and balanced reporting with integrity, even in the face of demands for speedy coverage of events that may or may not be breaking right now. They should also reach out and tap into the wealth of information and opinion provided by citizen journalists and use it to augment and improve the stories they create.

Knight Foundation Scholarship Essay

(This summer I applied for the “Coders Wanted” Knight Foundation Scholarship at the Medill School of Journalism. In case anyone’s interested, I’m uploading the essays I wrote for my application.)

Question: How do journalism and technology relate to one another in the digital age?

Technology relates to journalism in two different ways: It is a topic of coverage (“science journalism”) and a driver of change. The subject of science and tech journalism is an interesting one, but this essay will focus on technology as an enabler and driver of change in the practice of journalism.

Ever since the invention of movable type, technological progress has gradually decreased the amount of money and time required to distribute information. The advent of digital technology has lowered the cost to (almost) zero and made distribution instantaneous. As Chris Anderson argues in his recent book “Free”, this final drop to zero marks a discontinuity, and it has some profound implications.

The speed and ease of digital publishing now make it possible for everyone to write news reports, shoot photos and record video of news events — endeavours that used to be the exclusive privilege of journalists and photographers. The Internet has also greatly increased the scope for reader feedback and debate on stories created by traditional journalists. Taken together, this has led to an interesting integration of newsgathering where professional and so-called “citizen” journalists collaborate and compete to dig up, investigate and publish news.

An extreme example of this is The Guardian’s (a British newspaper) recent attempt at making sense of UK parliament members’ expense claims. The expense records were released under a freedom of information request as more than two million scanned documents. To investigate these, the newspaper enlisted its readers (and the Internet at large) to wade through the documents, sift out the interesting claims and determine the amounts and exactly what items were claimed.

The Internet has led to the development of a range of interesting platforms, similar to the one mentioned, where journalism-related activities take place outside the confines of traditional media organizations. The author, for example, has created a web site called Folkets Ting (“The People’s Parliament”) which — in the tradition of sites like OpenCongress (US) and The Public Whip (UK) — makes legislation, votes and debates from the Danish parliament available for public scrutiny and debate. It used to be the responsibility of journalists to hold elected politicians to account, but tools like these enable interested citizens to join in. It is the author’s hope that such sites will increase the scope of debate beyond the often narrow attention span of traditional media and lead to a greater breadth of opinion being voiced (even if the result is also likely to be a lot messier).

Unfortunately, digital technology and the Internet have also seriously undermined the business model of many traditional media companies. The decline of newspapers is a particular worry, partly because theirs has been such a rapid fall (several renowned American newspapers have already shut down and more are teetering on the brink of bankruptcy), partly because they seem to play an outsize role in digging up and investigating the agenda-setting stories that other types of media then pick up.

The traditional newspaper business model was based on the fact that printing technology was expensive and building a subscriber base required time and large investments. Once these had been secured, however, the newspaper could make a mint on classifieds and other ads, and that revenue subsidized newsroom activities. The Internet rudely killed off this model because there is now nothing stopping sites like Craigslist and eBay from publishing classifieds (and auctions) to large audiences without donating the proceeds to deserving journalists.

Publishers have variously called on readers, governments and Google to do something, where “do something” usually means “give us more money” in some shape or form. News has become a commodity that readers are in most cases unwilling to pay for. A large decline in journalism may represent a failure of the market warranting government intervention, but that is a path fraught with danger. Demanding that money be redistributed from a successful part of the value chain looks like zero-sum thinking and reveals an unwillingness to reconsider one’s own business. It is the opinion of this aspiring journalist (and of Chris Anderson) that the old business model, or something like it, is unlikely to return.

What, then, of journalism? Some forms (business coverage most prominently) are prospering in spite of the Internet. Other forms may shrink somewhat or find themselves augmented or supplanted by enthusiastic citizen journalists using technology and global connectivity to their advantage. An area such as public oversight of politicians and institutions could expand greatly if good tools for improving transparency and reporting are developed.

The author believes that journalism in the digital age is more exciting than ever. To be sure, there are challenges to overcome, but the advantages are many: journalists can reach wider audiences, both faster and cheaper, and they can involve, solicit feedback from and collaborate with more people than at any time before. The author can’t wait to develop the platforms and systems that will form the foundations of new kinds of digital journalism, and hopes, with the help of the Knight Foundation, to get a chance to do so at Medill.

Exchange Rate data

As part of our ongoing efforts to make sense of the Tenders Electronic Daily procurement contracts, I had to get hold of historical exchange rates to convert the values of all the contracts into a comparable form. Professor Werner Antweiler at the University of British Columbia maintains a very impressive, free database of exactly this data. Unfortunately, he doesn’t let you export it in (great) bulk. I wrote a small script to get the monthly data for the currencies I wanted; the important parts (in C#) are included below. Note that the site may throttle you. Also, please don’t use this to try to scrape all the data and republish it, or in other ways make a fool of yourself.

string url = "http://fx.sauder.ubc.ca/cgi/fxdata";
string curr = "YOURCURRENCY";
// this uses Euros as the base currency
string requeststring =
	string.Format(
	"b=EUR&c={0}&rd=&fd=1&fm=1&fy=2003&ld=31&lm=12&ly=2008&y=monthly&q=volume&f=csv&o=",
	curr);

HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);

req.ContentType = "application/x-www-form-urlencoded";
req.Expect = null;
req.Method = "POST";

byte[] reqData = Encoding.UTF8.GetBytes(requeststring);
req.ContentLength = reqData.Length;
Stream reqStream = req.GetRequestStream();
reqStream.Write(reqData, 0, reqData.Length);
reqStream.Close();

HttpWebResponse WebResp = (HttpWebResponse)req.GetResponse();
var resp = WebResp.GetResponseStream();
StreamReader answer = new StreamReader(resp);
string res = answer.ReadToEnd();

if (res.Contains("Error"))
{
	throw new Exception(string.Format("Bad currency: {0}", curr));
}

if (res.Contains("Access"))
{
	// You're being throttled
}

var lines = res.Split(new char[] { '\n' });

// ignore the first two lines and the last two ones
for (int i = 2; i < lines.Length - 2 ; i++)
{
	var line = lines[i];
	var vals = line.Split(new char[] { ',' });

	// parse the vals
	var month = GetMonth(vals[0]);
	var year = GetYear(vals[0]);

	var rate = decimal.Parse(vals[1], CultureInfo.InvariantCulture);
}

// Util Methods
private static int GetMonth(string s)
{
	var month = s.Substring(1, 3);
	switch (month)
	{
		case "Jan": return 1;
		case "Feb": return 2;
		case "Mar": return 3;
		case "Apr": return 4;
		case "May": return 5;
		case "Jun": return 6;
		case "Jul": return 7;
		case "Aug": return 8;
		case "Sep": return 9;
		case "Oct": return 10;
		case "Nov": return 11;
		case "Dec": return 12;
		default: throw new Exception("crap");
	}
}

private static int GetYear(string s)
{
	var year = s.Substring(5, 4);
	return int.Parse(year);
}
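The month-name switch above is essentially a table lookup; the same mapping can be expressed more compactly. A sketch in JavaScript (illustrative only, not a port of the post's C#), assuming the same "Jan" through "Dec" abbreviations:

```javascript
// Month-abbreviation lookup, equivalent to the switch in GetMonth
var MONTHS = {
  Jan: 1, Feb: 2, Mar: 3, Apr: 4, May: 5, Jun: 6,
  Jul: 7, Aug: 8, Sep: 9, Oct: 10, Nov: 11, Dec: 12
};

function getMonth(abbrev) {
  var m = MONTHS[abbrev];
  if (!m) throw new Error("Unknown month: " + abbrev);
  return m;
}

console.log(getMonth("Sep")); // 9
```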

Folkets Ting beta launched

I’ve created a new web site on Danish politics in the tradition of The Public Whip and OpenCongress (although it’s not yet nearly as good as those guys). It’s called Folkets Ting and comes with a complimentary blog (both in Danish). Go check it out.

Transatlantic Facebook application performance woes

Someone I follow on Twitter reported having problems getting a Facebook application to perform. I don’t know what they are doing, so this post is just guessing at their problem, but the fact is that — if you’re not paying attention — you can easily shoot yourself in the foot when building and deploying Facebook apps. The diagram below depicts a random fbml Facebook app, deployed to a server located in Denmark, being used by a user also situated in Denmark. Note that Facebook doesn’t yet have a datacenter in Europe (they have one on each coast in the US).

(Diagram: request flow between the user, Facebook’s US datacenters and the app server in Denmark)

The following exchange takes place:

  1. User requests some page related to the application from Facebook
  2. Facebook realizes that serving this request requires querying the application and sends a request for fbml to the app
  3. The app gets the request and decides that in order to respond, it has to query the Facebook API for further info
  4. The Facebook API responds to the query
  5. The application uses the query results and the original request to create a fbml response that is sent to Facebook
  6. Facebook gets the fbml, validates it and macroexpands various fbml tags
  7. Facebook sends the complete page to the user

… so that adds up to 6 transatlantic requests per page requested by the user. Assuming a 250ms ping time from the Danish app server to the Facebook datacenter, this is a whopping 1.5s of latency on top of whatever processing time your server needs AND the time taken by Facebook to process your API request and validate your fbml.
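The back-of-the-envelope arithmetic can be sketched like this (the leg counts and ping time mirror the assumptions above; fbmlLatencyMs is just an illustrative helper):

```javascript
// Estimate latency added by transatlantic hops for an fbml app,
// assuming each leg costs one ping time, as in the model above.
function fbmlLatencyMs(transatlanticLegs, pingMs) {
  return transatlanticLegs * pingMs;
}

console.log(fbmlLatencyMs(6, 250)); // 1500, i.e. 1.5s
// An iframe app saves one round trip (two legs):
console.log(fbmlLatencyMs(4, 250)); // 1000
```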

So what do you do? Steps 3 and 4 can usually be eliminated through careful use of fbml and by taking advantage of the fact that Facebook includes the ids of all the requesting user’s friends. Going for an iframe app also helps because it eliminates one transatlantic roundtrip and spares Facebook from having to validate any fbml. A very effective measure, if you insist on fbml, is simply getting a server stateside — preferably someplace with low ping times to the Facebook datacenters. There are plenty of cheap hosting options around; Joyent will even do it for free (I’m not affiliated in any way).
