Monday, November 24, 2008
A useful warning
Sometimes I want a bit of confirmation that a message actually made it through, so I request a Read Receipt. That way I get a notification when the recipient, or at least some computer associated with that recipient, opens the message.
This is a Return Receipt for the mail that you sent to....
Note: This Return Receipt only acknowledges that the message was displayed on the recipient's computer. There is no guarantee that the recipient has read or understood the message contents.
I really like the disclaimer in there. Maybe we should all configure our email software to automatically reply to all spam with such a receipt?
Saturday, November 15, 2008
A face lift
If you are reading this, you have already noticed the new look and feel of this blog. It now uses the same style as the Lokkilok business site. At the same time the address was changed to the simpler http://betweenthekeys.lokkilok.com; the old address should still work, but you may want to update any bookmarks and subscriptions.
A similar face lift was applied to the famous Lokkilok blog on paragliding and to the blog on other non-flying topics named La Testa Millimetrata.
Happy reading!
Friday, November 14, 2008
New Nokia Marketing Strategy?
Today Nokia announced a new release of the phone application that works with their Share on Ovi service. Interestingly, Nokia states that:
The new features in Share Online 4.0 make use of technical enablers present only in the very latest versions of S60. Because of this, we are making this beta available initially only to a very limited number of devices: N96, 6220 Classic, 6210 Navigator and 5320 XpressMusic.
Looks like Nokia really wants to move in a direction where services, together with the phone applications for those services, drive device sales. If Share on Ovi were a really great service, you'd want to go and buy a new phone. As it is, I think the service is quite nice, but not special enough to justify dumping my only one-year-old, very nice E65.
Update: This application actually makes me want to buy an N95 (or an iPhone). But maybe I should dust off my sax instead. Or do both?
Thursday, November 13, 2008
Another interesting exercise: MapTz, a GAE-hosted service to enhance Google Maps
The rains continue; actually it is almost slush coming down here this morning. This week I spent quite some time developing another tiny service to solve a single problem: MapTz.
The problem was that in Zipiko we often need the timezone of particular locations. We use GeoNames a lot, which nicely gives us detailed location information, including the timezone. Alas, GeoNames can be slow at times and is sometimes actually down. Google Maps offers a similar service and seems faster and more reliable. It has its cons though, and one is that it does not provide the timezone for geocoded locations. With a means to get the time zone for a given location we would be able to use either GeoNames or Google Maps, and even switch automatically from one to the other based on responsiveness. So the issue was how to get the time zone when given coordinates and a bit of other info.
Some googling turned up a couple of services that seemed able to return a UTC offset for a given point, but I really wanted an identifier in the Olson database of timezones. This database, often known as tz, is implemented in most operating systems and environments, and it allows proper localization of a given UTC time, taking daylight saving, historical changes and what have you into account. So here is what I ended up doing.
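To make the difference concrete, here is a minimal sketch (not MapTz code) of what an Olson identifier buys you, using the pytz library: given a zone name, a UTC instant can be localized with the correct daylight-saving rules for that date.

# A minimal sketch (not MapTz code) of what an Olson identifier buys you:
# pytz applies the right daylight-saving rules for the date in question.
from datetime import datetime
import pytz

utc_instant = pytz.utc.localize(datetime(2008, 7, 1, 12, 0))
helsinki = pytz.timezone('Europe/Helsinki')      # an Olson identifier
print(utc_instant.astimezone(helsinki))          # 2008-07-01 15:00:00+03:00 (EEST)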
It turns out there is a very nice computer-readable map of all the world's time zones, at MapStars. Here the time zones are the literal bands on the world map where clocks show the same time, i.e. share a UTC offset. The map is in KML, an XML-based format, so it can be parsed by a program. Essentially one ends up with a set of polygons, each with an associated offset from UTC. The next ingredient is an algorithm that determines whether a point is inside a polygon; I used a simple Python implementation of the ray-casting algorithm.
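For illustration, a point-in-polygon test along these lines might look as follows; this is a generic ray-casting sketch, not the exact MapTz code, with polygons given as lists of (x, y) vertex pairs.

# Generic ray-casting (even-odd rule) test: does the point (x, y) lie inside
# the polygon given as a list of (x, y) vertex pairs? Illustrative only.
def point_in_polygon(x, y, vertices):
    inside = False
    j = len(vertices) - 1
    for i in range(len(vertices)):
        xi, yi = vertices[i]
        xj, yj = vertices[j]
        # Count how often a horizontal ray from (x, y) crosses an edge.
        if ((yi > y) != (yj > y)) and \
           (x < (xj - xi) * (y - yi) / (yj - yi) + xi):
            inside = not inside
        j = i
    return inside

square = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
print(point_in_polygon(5.0, 5.0, square))    # True
print(point_in_polygon(15.0, 5.0, square))   # False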
Now within one UTC-offset region, a "band" on the world map, there are many actual time zones (in the Olson sense). To find the right one I ended up using knowledge of the country (code) of the point. We can look up all time zones that are in use in a particular country and then pick the one whose UTC offset matches the offset of the found region. This worked, but I needed this code to run on the highly distributed Google App Engine, and that required a couple of tricks as GAE, rightfully, restricts the amount of resources available to an application.
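A rough sketch of that disambiguation step, assuming the pytz library is available; the function name and arguments here are illustrative, not the MapTz API.

# Sketch of the country-code disambiguation step with pytz; names are
# illustrative, not the MapTz API.
from datetime import datetime, timedelta
import pytz

def olson_zone_for(country_code, region_offset_hours, when=None):
    # Among the zones used in this country, pick the one whose UTC offset
    # at the given moment matches the offset of the enclosing map region.
    when = when or datetime.utcnow()
    wanted = timedelta(hours=region_offset_hours)
    for name in pytz.country_timezones.get(country_code.upper(), []):
        if pytz.timezone(name).utcoffset(when) == wanted:
            return name
    return None

print(olson_zone_for('fi', 2, datetime(2008, 12, 1)))   # Europe/Helsinki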
First, the set of polygons is a largish data structure. I spent most of my time figuring out how to set up this data structure and make it available quickly to each (HTTP) request. Parsing the KML file takes 4-5 seconds on my local machine, so you really want to do that only once. GAE offers essentially two ways to make some data available to any server that ends up running your app to serve a request: a database and a memory cache. The polygon set was too large for the memory cache, and fetching largish things from that cache is, understandably, not very fast. Each polygon could easily be stored in the database, but there are over 70 of them, some quite large/complex, and they might all have to be fetched for a single request. A third approach is to use a global variable to hold the polygons. Such a variable will not be available to all possible instances of the application, but it does persist between subsequent requests to the same instance, if those requests follow each other within a short period.
I went with this approach, as follows. The code parses the KML file "off-line" and generates Python code that instantiates the polygons and places them in a global variable. When running as a GAE web app the code simply imports that generated Python. One more complication was that global variables are limited in memory footprint, which is in itself a good feature, but the polygon set went a bit over that limit. Another issue was that the Python file instantiating the polygons was too large. To overcome both issues I split the set into chunks of 10 regions, with a global variable and a Python file for each chunk, and zipped the files into one archive. Instantiation (to serve the "first" request) only takes a couple of hundred milliseconds, and now I had something that worked quite well.
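Roughly, the idea looks like the sketch below; all file, module and variable names (tz_polygons_N, REGIONS, load_regions) are made up for illustration and the real code differs in detail.

# Offline: turn the parsed KML regions into importable Python modules.
def write_chunks(regions, chunk_size=10):
    for n, start in enumerate(range(0, len(regions), chunk_size)):
        out = open('tz_polygons_%d.py' % n, 'w')
        out.write('# Generated from the KML time-zone map; do not edit.\n')
        out.write('REGIONS = %r\n' % (regions[start:start + chunk_size],))
        out.close()

# On GAE: a module-level global, filled once per runtime instance and
# reused by later requests that happen to hit the same instance.
_REGIONS = None

def load_regions(chunk_count):
    global _REGIONS
    if _REGIONS is None:
        _REGIONS = []
        for n in range(chunk_count):
            module = __import__('tz_polygons_%d' % n)
            _REGIONS.extend(module.REGIONS)
    return _REGIONS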
So, time for a couple more optimizations. First, if a country only uses one time zone, there is no need to search for the right polygon at all. Second, when creating the polygons, i.e. looping over the coordinates that bound a region, I could record the northern-, southern-, eastern-, and westernmost points of that region. If a given point is not within that bounding box it certainly is not within the polygon. These optimizations are in the current service, and response times and use of computational quota are quite satisfactory. One more optimization that would probably have a big impact is to move from floating-point numbers to integers for all coordinates. I may do that one of these days and update this blog with the result. Meanwhile, feel free to use the service if you have a need for getting those timezones!
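The bounding-box shortcut might be sketched like this, reusing the point_in_polygon function from the earlier sketch; again illustrative names, not the actual MapTz code.

# Cheap rejection before the (more expensive) ray-casting test.
class Region(object):
    def __init__(self, vertices, utc_offset):
        self.vertices = vertices
        self.utc_offset = utc_offset
        xs = [x for x, y in vertices]
        ys = [y for x, y in vertices]
        # Extremes recorded once, while looping over the boundary points.
        self.min_x, self.max_x = min(xs), max(xs)
        self.min_y, self.max_y = min(ys), max(ys)

    def contains(self, x, y):
        # Outside the bounding box means certainly outside the polygon.
        if not (self.min_x <= x <= self.max_x and
                self.min_y <= y <= self.max_y):
            return False
        return point_in_polygon(x, y, self.vertices)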
Update: as was to be expected, the above-mentioned optimization to use integers indeed really speeds things up. The GAE logs report an approximately 10-fold speed increase and likewise 10 times less use of computational cycles.
Sunday, October 12, 2008
Does anybody care?
"Will anybody care", a thought about social media in my blog for less-, or non-, IT-technical subjects might be of interest to the (couple of) readers of Between the keys.
Monday, October 06, 2008
Battling the Platforms
It's autumn, rainy and windy, so paragliding is, well, over. Inspired by some recent encounters with interesting startups and a very nice presentation on experiences with the new Google App Engine, I decided to do a tiny project. And as all of a sudden lots of friends and acquaintances seem to have discovered Facebook, I decided to make an app for that.
With so many people on Facebook, I've noticed that it is a good source of interesting events: whatever one thinks of the rest of Facebook, I personally really enjoy checking out the events that my friends are interested in (the new Zipiko service is centered around this use case). Very frequently I want to attend the event too. But it just so happens that I use Google to host my master calendar, and from there I sync to my laptop. Facebook offers "export" of events as iCal files, but that's fairly primitive and above all cumbersome. So here I had a good candidate for a little project: make a Facebook app that reads the future events I've signed up for and adds them to my Google Calendar. And host this app on Google App Engine (GAE). Although the functionality would be minimal, the project would use both the Facebook and Google Data APIs and libraries, and use the Google App Engine web app development and hosting platform.
Doing a simple GAE app turned out to be fairly straightforward. A GAE app must be written in Python, and that was initially my tallest hurdle; I'd never done anything in Python before. But I have years of Smalltalk experience and Python was supposed to be somewhat similar, and in any case not very hard. My Linux/Fedora laptop already had Python; all I needed was to add the PyDev plugin to my Eclipse install. Then I downloaded the GAE development kit, essentially a couple of Python libraries and a little, pure-Python server that implements the same restrictions as GAE. Stefano of Zipipop's recent presentation on GAE discusses those restrictions at length; for this tiny project they were no issue at all. I found it easy enough to run the development server from within Eclipse, but I could not get debugging to work either inside or outside Eclipse. It seems to be some issue with the Python version I'm using, but GAE wouldn't work with a newer version. Anyway, the project was going to be tiny, so I figured I could do without debugging. In Smalltalk development it was very common to write most code in the debugger, so I was curious to try that out with Python.
So once I had the obligatory "hello world" GAE app working on my laptop, it was time to turn it into a Facebook app that could get my (Facebook) events. Just creating a page that lists the events was very easy with the Facebook Python library. The most important aspect was to build the app such that all requests could (and should) come from Facebook. In other words, the browser/user is at all times directed to Facebook, which then asks the app to generate content for the so-called "canvas".
In the app, a few lines of library code parse the HTTP body that Facebook POSTs. If the user is logged in (to Facebook) this body contains a bunch of frequently needed user information. In my tiny project all I really needed was the Facebook user id, which was used in a single Facebook API call to fetch the future events for that user. Facebook also offers FBML, a set of markup elements that make it easy to present content in the Facebook style.
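For the curious, this is roughly what that parsing amounts to under the 2008-era canvas protocol with its fb_sig_* parameters; the real app relied on the Facebook Python library, so treat the helper below as an illustrative sketch, not code from the app.

# Illustrative sketch of the fb_sig_* canvas convention, not the app's code.
import hashlib

def parse_canvas_post(post_params, app_secret):
    fb_params = dict((key[len('fb_sig_'):], value)
                     for key, value in post_params.items()
                     if key.startswith('fb_sig_'))
    # Facebook signs the sorted key=value pairs concatenated with the secret.
    payload = ''.join('%s=%s' % (k, fb_params[k]) for k in sorted(fb_params))
    if hashlib.md5(payload + app_secret).hexdigest() != post_params.get('fb_sig'):
        return None                     # not a genuine request from Facebook
    return fb_params                    # includes e.g. 'user' and 'session_key'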
At this point I had the Facebook events in the GAE memcache, so next was to insert them into the Google Calendar. Most of Google's services offer interfaces that use the same unified approach: the Google Data APIs. Google offers a Python library that nicely encapsulates most of the boring work. However, it is important to more or less know what has to happen at the HTTP level, as things have to happen in the right order.
What needs to happen is essentially an OAuth-style dance. To get some real work done the app needs to present a (use) token, and to get that token it has to present an authorization token. It gets this (one-time) authorization token by redirecting the user to the Google Accounts service with some parameters; the essential ones are a URI that indicates the scope of the actions the app wants to perform (i.e. "calendar") and a URL to which Google should redirect the user back. As explained above, in my case this URL should actually point to Facebook, with my app as a URL path component. Upon success Google adds the one-time authorization token as a parameter to that URL. Luckily Facebook passes this parameter on in its request to the application. So in my little app I would catch the token and then, with a call to Google, upgrade it to a "use" token. That token is supposedly somewhat permanent, so I decided to save it in a little User object. GAE offers object database functionality that deals with distribution completely behind the prying eyes of the developer, a very nice idea. Of course it didn't quite work right away...
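A hedged sketch of that dance with the gdata Python library and the GAE datastore might look like the following; the User model and helper functions are illustrative, not the app's actual code.

# Sketch of the AuthSub-style token dance; illustrative names throughout.
from google.appengine.ext import db
import gdata.calendar.service

class User(db.Model):
    facebook_uid = db.StringProperty()
    session_token = db.StringProperty()

def authsub_url(next_url):
    # Step 1: send the user off to Google, asking for calendar scope and a
    # session (multi-use) token; Google redirects back to next_url.
    client = gdata.calendar.service.CalendarService()
    return client.GenerateAuthSubURL(
        next_url, 'http://www.google.com/calendar/feeds/',
        secure=False, session=True)

def upgrade_and_store(one_time_token, facebook_uid):
    # Step 2: trade the one-time token for a longer-lived session token
    # and remember it for this Facebook user.
    client = gdata.calendar.service.CalendarService()
    client.SetAuthSubToken(one_time_token)
    client.UpgradeToSessionToken()
    User(facebook_uid=facebook_uid,
         session_token=client.GetAuthSubToken()).put()
    return client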
This was because, as the GAE documentation on the use of the Google Data services states:
Note: With Google App Engine, you must use the URLFetch API to request external URLs. In our Google Data Python client library, gdata.service does not use the URLFetch API by default. We have to tell the service object to use URLFetch by calling gdata.alt.appengine.run_on_appengine on the service object, like this: gdata.alt.appengine.run_on_appengine(self.client)
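With the client patched that way, inserting an event is indeed a short affair; the snippet below is an illustrative sketch patterned on the classic gdata calendar examples (field values and the stored-token variable are placeholders, not the app's code).

# Illustrative event insertion via the gdata calendar library on GAE.
import atom
import gdata.calendar
import gdata.calendar.service
import gdata.alt.appengine

session_token = '...'                               # the token saved earlier
client = gdata.calendar.service.CalendarService()
gdata.alt.appengine.run_on_appengine(client)        # route HTTP via URLFetch
client.SetAuthSubToken(session_token)

event = gdata.calendar.CalendarEventEntry()
event.title = atom.Title(text='Friday party')       # from the Facebook event
event.when.append(gdata.calendar.When(
    start_time='2008-10-10T20:00:00.000Z',          # already converted to UTC
    end_time='2008-10-10T23:00:00.000Z'))
client.InsertEvent(event, '/calendar/feeds/default/private/full')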
Now upgrading to a use token worked, and with such a token inserting an event is a simple Google Data services library call, but there were some caveats. Most disturbing was that when I ran the app on the GAE development server I always got a "Not Implemented" error. I assumed this was because I'd done something wrong, but when, after quite a while, I decided to try the app on the actual GAE infrastructure, this part immediately worked. The other caveat was that the Facebook API returns events with timing information in Pacific time. Sigh. After another hour or so I'd figured out how to use the Python libraries to move those back into UTC. I'm still not sure if I got this right for Daylight Saving Time situations.
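For reference, the conversion itself can be done with pytz roughly as follows; whether the is_dst handling around the switch-over dates matches what Facebook intends is exactly the part I am unsure about.

# Pacific-to-UTC conversion sketch with pytz.
from datetime import datetime
import pytz

pacific = pytz.timezone('US/Pacific')

def pacific_to_utc(naive_dt):
    # localize() attaches the Pacific zone with the DST rules for that date;
    # ambiguous or non-existent local times would need an is_dst hint.
    return pacific.localize(naive_dt).astimezone(pytz.utc)

print(pacific_to_utc(datetime(2008, 11, 21, 18, 0)))   # 2008-11-22 02:00:00+00:00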
A day later I suddenly noticed that it no longer worked, and quickly found out that this was because my code would sometimes get a trailing slash in the value of a URL request parameter. Easy to strip out, but it hadn't happened before. I think this started when Google released an upgrade of the GAE kit, but I'm not completely sure that this problem co-occurred with that upgrade.
All in all it was an interesting exercise indeed! It would be tempting to compare the various platforms in detail, but I've only scratched the surface. A couple of remarks can be made though. Google App Engine seems a very useful hosting platform for not-too-complex websites. It forces one to structure the app in such a way that it will scale, but this is done in a positive, instructive manner.
The Facebook APIs and the Google Data APIs are more or less comparable; they offer various identity-based services. Facebook provides the profile of the user, the friend list, the events, and a couple of other things, e.g. a facility to send notifications and SMS messages. Google provides access to the calendar, documents, and contacts of the user. Unfortunately the Facebook and Google services have different interfaces, although they are quite similar: simple HTTP requests with JSON or XML as the service response content. Overall the provided libraries look good and are quite effective in hiding the complexity. The disadvantage seems to be that the APIs are not specified with as much rigor as I would like.
Oh, and if you want to copy Facebook events into your Google Calendar you can try it at http://apps.facebook.com/toogcal. It might work, but don't count on it. Looks like tomorrow it will be flyable after all...