Building the City Cloud part 2: WCF Data Services and JSON

When we developed our OData web service, we came across a situation in which we needed the service to return JSON instead of XML. According to the specification, OData supports both JSON and XML output, selected by including the $format query option. Unfortunately, WCF Data Services does not support the $format query option and returns the following error when it is provided:

<error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
    <code /> 
    <message xml:lang="en-US">
        The query parameter '$format' begins with a system-reserved 
        '$' character but is not recognized.
    </message> 
</error>

The service tells you that it indeed saw the $format query option but that it didn't recognize it as a valid query option. In this blog post I'm going to show you the different options you have for resolving this little problem.

JSON

JSON stands for JavaScript Object Notation and is a format used to interchange data. It is an alternative to XML and became very popular because of its simplicity and the fact that it is lightweight, easily readable and supported on almost all platforms. It was originally derived from JavaScript, but it is a language-independent technology. It works especially well with JavaScript, because no extra parser is needed there. JSON is primarily used to exchange data between web services and web applications.
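
As a quick illustration, a simple (made-up) JSON object looks like this:

{
    "name": "John Doe",
    "age": 30,
    "languages": ["Dutch", "English"]
}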

Enable support for the $format query option

Although WCF Data Services does not support the $format query option, it does support JSON as an output format. All you need to do is specify "application/json" in the Accept header of your request. But sometimes it is not easy to modify the request headers, and sometimes you just want to build an OData-compliant data service that supports the $format query option. The basic solution to this problem comes down to removing the $format query option from the request and changing the Accept header to application/json.
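
To show what the Accept header approach looks like from the client side, here is a minimal sketch; the service URL is just a placeholder:

using System;
using System.IO;
using System.Net;

class Program
{
    static void Main()
    {
        // Placeholder URL; point this at your own OData service.
        var request = (HttpWebRequest)WebRequest.Create(
            "http://localhost/CityCloud.svc/DataSets");

        // Ask the data service for JSON via the Accept header
        // instead of the unsupported $format query option.
        request.Accept = "application/json";

        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}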

You don't have to do this all by yourself. There are two pieces of code out there that do the job for you: DataServicesJSONP and the WCF Data Services Toolkit.

Both of these pieces of code come with the nice advantage that they also add support for JSONP callbacks. JSONP is a technique commonly used to circumvent the same-origin policy of modern browsers. The same-origin policy prevents scripts running on domain1.com from communicating with domain2.com.

DataServicesJSONP

Using the DataServicesJSONP library is very simple. You just need to add the JSONPSupportBehavior attribute to your data service definition, as in the sketch below, and you have full support for JSON and JSONP in your OData web service.
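
A minimal sketch of what that looks like, assuming the attribute shipped with the DataServicesJSONP download (CityEntities is a placeholder entity context):

using System.Data.Services;
using System.Data.Services.Common;

// The JSONPSupportBehavior attribute comes with the DataServicesJSONP download.
[JSONPSupportBehavior]
public class CityDataService : DataService<CityEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Expose all entity sets read-only; adjust to your own needs.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}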

WCF Data Services Toolkit

Using the WCF Data Services Toolkit not only gives you support for JSON and JSONP, but also adds caching capabilities and offers solutions for a lot of different data sources. To use the toolkit in your data service, simply inherit your service from ODataService instead of DataService, as in the sketch below, and you have full support for JSON and JSONP in your OData web service.
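
Again a minimal sketch, assuming the toolkit's ODataService base class; the exact namespace may differ per toolkit version, and CityEntities is again a placeholder context:

using System.Data.Services;
using Microsoft.Data.Services.Toolkit;

// Inheriting from ODataService<T> instead of DataService<T> enables
// JSON and JSONP output (plus the toolkit's other features).
public class CityDataService : ODataService<CityEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
    }
}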

I created a solution that contains examples of both techniques. You can download it here. The solution contains two WCF data services, one that uses DataServicesJSONP and one that uses the toolkit. To compile the code you need to have SQL Server Compact 4.0 installed. You can install it via the Web Platform Installer or directly via the Microsoft Download Center.

Enabling Smart DJ in the Zune software outside the US

Today I started the Zune software for the first time on my computer. I like the Zune software because of the way it looks and how it works. When you start the software for the first time, it shows an awesome video on the features it provides and how cool it looks. It also shows off Smart DJ, which I find a really great feature, because it is able to generate dynamic mixes based on an artist, album or song. Unfortunately, when I opened the Zune software I couldn't find this feature anywhere. It appears that with Zune 4.2 this feature was removed for international users and is only available in the US. However, there are two workarounds available to re-enable Smart DJ outside of the US.

  1. Change the system locale in Windows to English (United States).
  2. Add a specific registry key by following these steps, or use this archive, which contains a registry file that, when executed, adds the necessary keys automatically (a sketch of its contents follows this list).
    • Close Zune
    • Open the Registry Editor by clicking Start, typing regedit in the search box and then clicking regedit.exe
    • Navigate to HKEY_CURRENT_USER\Software\Microsoft\Zune
    • Right-click the Zune key and choose New -> Key
    • Name the key "FeaturesOverride" (without the quotation marks)
    • Select the newly created key and right-click in the right pane
    • Choose New -> DWORD (32-bit) Value and name it "QuickMixLocal" (again without the quotation marks)
    • Double-click "QuickMixLocal" and give it the value 1
    • Close the Registry Editor
    • Start Zune. You should now be able to use Smart DJ.
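
For reference, here is a sketch of what such a registry file contains, based on the steps above (save it as a .reg file and double-click it to import):

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Zune\FeaturesOverride]
"QuickMixLocal"=dword:00000001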

Building the City Cloud part 1: Overall system architecture

For the last couple of months I've been working on my bachelor project to finish my bachelor's degree in computer science. Together with Tom Verhoeff, Jos Kraaijeveld, Jochem Toolenaar and Oana Nitu, I participated in the Imagine Cup 2011. We formed team O!ife and our project was called the City Cloud. The City Cloud is a cloud computing platform that allows different kinds of data generated by a city, its inhabitants, its companies and its government to be easily accessed by developers to create new, innovative technologies and solutions. You can find more about the City Cloud on the O!ife blog.

This post is one in a series on how we built the City Cloud and everything around it. The focus of these posts will be on the technical aspects of the City Cloud. They will also cover some problems we encountered during development and the solutions we used to tackle them. At this moment I have no idea how many parts will follow. We've come across lots of different problems and used lots of technologies, so I will write another part whenever I have the time and think the topic is interesting enough for other developers. This first part covers the overall architecture of the system to establish a background for the other posts.

Goal

When we started the City Cloud project we thought of it as a developer platform for a city. The main goal was to offer the right tools for developers to build next-generation ubiquitous applications: it should be easy to expose new datasets through the system, and the development of applications using these datasets should not require large investments in terms of both infrastructure and programming skills.

In the design phase we divided the system into three parts: the cloud framework, the mobile platform and the website/web application. The framework was the heart of the system and was designed to run in the Azure cloud. Its responsibility was to present the data that is part of the system in an easy and consistent way and to offer an all-in-one solution for data and app storage. The mobile platform and the website/web application functioned as two different portals to the data and the apps. The website should give general information on cities and show the datasets and applications available. It should also provide developers with documentation and best practices on how to make the best possible use of the platform's capabilities. The mobile platform should be the main portal for the end user in a city. It should give a user specific information on data and apps based on his current location. With more and more smartphones sold every day, they offered a great opportunity to be the number one entry point to the City Cloud.

Framework

The framework should expose data in an easy way. To achieve this we decided to go with an OData web service running in an Azure web role. Besides being a Microsoft technology, which counts for the Imagine Cup, OData offers an excellent way to expose datasets and great possibilities to perform transformations and filtering on the returned data. Another advantage is the great tooling available to develop OData web services and to consume them on almost all major platforms. To get data into the City Cloud framework we invented "connectors". To expose a new dataset through the framework, all a developer would need to do is build a connector that allows the City Cloud to retrieve data from their data source. This connector should implement some interface, provide the system with some description of the data structures, and it should all magically come to life.
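
To give an idea of what such a connector could look like, here is a purely hypothetical sketch; the names and types below are illustrative only, not the actual interface from our framework:

using System.Collections.Generic;
using System.Data.Services.Providers;
using System.Linq;

// Hypothetical connector contract; the real interface differs.
public interface IDataConnector
{
    // Name under which the dataset is exposed in the OData service.
    string DataSetName { get; }

    // Describes the structure of the data so the framework can build
    // the corresponding OData entity types.
    IEnumerable<ResourceType> DescribeDataStructure();

    // Retrieves the actual data from the underlying source.
    IQueryable<object> GetData();
}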

Mobile platform

To reach as many users as possible we wanted the mobile platform to run not only on Windows Phone, but also on Android and iOS. But we didn't want developers to write the same application three times for three different platforms. To accomplish this we decided to go with HTML5 and JavaScript as the tools to build the applications for our mobile platform. On every major smartphone platform it is possible to embed some kind of web browser control into a native application and load web pages in it, and with the Mango update for Windows Phone coming this fall, HTML5 will be broadly supported on all major mobile platforms. We call these HTML5 applications "building blocks". These building blocks could be hosted in the City Cloud framework and could use datajs to interact with the OData web service via JavaScript.

Website and webapplication

To give developers and users an overview of the data and applications already in the system, we built the website, including a Silverlight web application. The web application was meant to be used as an explorer for the web service. In this way everybody could take a quick look into the system to browse data. It also offered the possibility to plot these datasets on a map.

Extensibility

Developers could use the website to submit new applications and new connectors to the system. Because of this extensibility model we felt the need to build a validation step to verify that newly submitted parts work within the framework. To validate new submissions we made use of a core feature of the Azure platform: the worker role. In Windows Azure a worker role is meant to be used for background processing and other long-running tasks. In the cloud framework we used a worker role to validate a submission and to make the required changes to the framework to add a new connector or building block.
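
A rough sketch of that pattern, assuming the standard Azure RoleEntryPoint; the queue handling and validation logic are placeholders:

using System;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class ValidationWorkerRole : RoleEntryPoint
{
    public override void Run()
    {
        while (true)
        {
            // In the real system this pulls a submitted connector or building
            // block from a queue, validates it and, on success, registers it
            // with the framework. The method below is only a placeholder.
            ProcessPendingSubmissions();
            Thread.Sleep(TimeSpan.FromSeconds(30));
        }
    }

    private void ProcessPendingSubmissions()
    {
        // Placeholder for the actual validation logic.
    }
}
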
At the end of the design phase the overall system architecture looked a lot like a high-level Model-View-Controller architecture. The website and the mobile platform functioned as the views, while the framework held the current state of the system and functioned as the model. The worker role was used to change the state of the framework and can thus be seen as the controller. The following image shows the high-level architecture in a graphical way:

I hope this gives a good overview of the system we built over the last couple of months. Unfortunately no code yet, but I will save that for the next part.
