
VorlonJS - A Journey to DevOps - New blog post series

I have just started publishing a new blog post series that details how we implemented DevOps practices such as continuous integration, testing, continuous deployment, release management, and testing in production on the Vorlon.JS project, using Visual Studio Team Services and Microsoft Azure.

VorlonJS is a Node.js-based tool for web developers that lets you remotely test, inspect and debug a web application. Because it’s not easy to use traditional “F12-like” web debugging tools on devices such as mobile phones or embedded devices, Vorlon.JS offers a way to improve the debugging experience, using a remote dashboard.

The series starts here. Enjoy, and don’t hesitate to provide any feedback!

read more

Deploying a Docker Swarm cluster in Azure

Docker Swarm is a tool that orchestrates the deployment of Docker containers across a cluster. The idea is that you run Docker commands exactly as you would against a single Docker host, and Swarm automatically distributes them across the different nodes of the cluster.

Azure Resource Manager (ARM) lets you, among other things, deploy a solution composed of several machines and/or Microsoft Azure services from the Preview portal, based on a JSON file that describes its structure. Many templates are available on GitHub to help you get started faster or to build your own. (If you are not yet familiar with ARM, I invite you to read up on it first.)
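Although this post uses the portal’s Deploy to Azure button, an ARM template can also be deployed from the command line. With the cross-platform Azure CLI of the time, it might look like the following sketch (the resource group, deployment and file names are hypothetical):

```shell
# Switch the CLI to Resource Manager mode, create a resource group,
# then deploy the template together with its parameters file.
azure config mode arm
azure group create swarm-rg "West Europe"
azure group deployment create -f azuredeploy.json -e azuredeploy.parameters.json swarm-rg swarm-deployment
```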

Recently, a template that deploys a Docker Swarm cluster was made available (here). It deploys two availability sets within the same virtual network:

  • One for the Swarm masters (3 in total), behind a load balancer
  • One for the Swarm nodes (the number of machines is chosen when the template is deployed), also behind a load balancer, but not exposed to the Internet

You will find a complete description of the cluster deployed by the template directly on its GitHub page.

Before deploying it, make sure you have an SSH key so you can connect to the cluster afterwards. If you are on Linux or Mac OS X, this step is trivial. If you are on Windows, I invite you to download Git for Windows, which automatically installs the tools you need to generate the key and then open an SSH connection from the command line.

Once it is installed, simply launch Git Bash, type the following command, and follow the prompts:

>  ssh-keygen.exe

A private key and a public key will then be generated and placed in the .ssh folder of your user directory. You will be asked for the public key when creating the Swarm cluster.
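For reference, the key pair can also be generated non-interactively (the file name demo_key below is just an example; by default ssh-keygen writes to ~/.ssh/id_rsa):

```shell
# Generate an RSA key pair with an empty passphrase into ./demo_key;
# demo_key.pub contains the public key that the template asks for.
ssh-keygen -t rsa -b 2048 -N "" -f ./demo_key -q
cat ./demo_key.pub
```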

Go to the template’s page on GitHub and click the Deploy to Azure button.

image

When prompted, sign in with your Azure account; you should then land on a page asking you to enter several parameters:

image

Concretely, you need to enter:

  • the name of a storage account to create
  • the administrator username that will be used to connect to the cluster
  • your SSH public key (prefixed with ssh-rsa)
  • the number of Swarm nodes you want to create

Choose or create a resource group in which to deploy the cluster, accept the terms, and confirm by clicking the Create button.

Once the deployment is finished, you should see the various resources that were created for you:

image

image

The template is configured so that the Swarm nodes are not directly reachable from the Internet (no public IP). To connect to them, you therefore have to go through one of the masters (port 2200, 2201 or 2202), using the DNS name <dnsname>-manage.<region>.cloudapp.azure.com. For example, if you chose the DNS name swarmcluster at creation time and deployed the cluster in the West Europe data center, you can connect to a master with the command:

ssh -A <username>@swarmcluster-manage.westeurope.cloudapp.azure.com

You can then run the following command to get information about the Docker nodes that are available:

docker -H tcp://localhost:2375 info

image

All that’s left is to deploy containers into the cluster. Each time you request the deployment of a container, Swarm automatically selects a node of the cluster to deploy it on. For example, if you type the following command twice:

docker -H tcp://localhost:2375 run -d -p 80:80 nginx

Then:

docker -H tcp://localhost:2375 ps -a

You will see that two containers based on the Nginx image have been started, on two different nodes of the cluster:

image

And there you have it: your Docker Swarm cluster is online!

Cheers,

Julien

read more

[Build 2015] What’s new for Microsoft Azure?

Microsoft’s strategy has been clear for a while now: “mobile first, cloud first”, that is, permanent access to our data, at home as well as on the move, and on any device (from the Raspberry Pi to the Windows Phone, by way of the PC and the Xbox), thanks to Windows 10!
Obviously, enabling this mobility requires a backend capable of storing the data, but also of hosting applications, APIs, services and more, in order to make them accessible anywhere, at any time: that is where Microsoft Azure comes in!

It was therefore quite logical for Azure to take a very prominent place at this new edition of //Build! Many announcements were made, particularly during the first day’s keynote.

Cheers,

Julien

read more

Azure Media Services : enable dynamic PlayReady content protection on your assets

As you may already know, Azure Media Services (AMS) has a feature called dynamic packaging that lets you stream a multi-bitrate asset as Smooth Streaming, HLS or MPEG-DASH dynamically, without storing a separate asset for each packaging format. Recently, Microsoft added a new feature called dynamic content protection that lets you protect content dynamically using AES or PlayReady DRM. In this post, I am going to explain how to enable PlayReady encryption, publish the asset, and play it with the Silverlight player that the AMS team has updated to support PlayReady.
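To give an idea of what dynamic packaging does, the same published asset can be streamed in each protocol just by changing the suffix of the manifest URL (the account, locator and file names below are hypothetical):

```
http://myaccount.streaming.mediaservices.windows.net/<locator-id>/video.ism/manifest                      -> Smooth Streaming
http://myaccount.streaming.mediaservices.windows.net/<locator-id>/video.ism/manifest(format=m3u8-aapl)    -> HLS
http://myaccount.streaming.mediaservices.windows.net/<locator-id>/video.ism/manifest(format=mpd-time-csf) -> MPEG-DASH
```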

I wrote this post using the many samples the Azure Media Services team provides on GitHub (thanks to them!), so some of the code is taken directly from those samples.

All the code of this blog post is available on my GitHub.

Overview of dynamic content protection

The feature is currently in preview on Microsoft Azure, so you have to enable it for your subscription before you can use it. To do that, go to http://azure.microsoft.com/en-us/services/preview/ and search for Content Protection. Then click the Try it button and follow the steps to enable the feature:

image

Once done, you are ready to play with PlayReady!

There are two ways to work with PlayReady in AMS. The first is to use an external PlayReady license server to deliver the licenses (maybe you already have one for your services, so you can keep using it!). The second is to use the license server hosted in Azure Media Services and protect access to the service via a secure token service (like Azure Access Control, or any other STS that can deliver SWT tokens). I am going to present the second option in this post.

The workflow to enable dynamic protection on a media asset is the following:

    - Upload your asset to Azure Blob storage
    - Encode the asset as a multi-bitrate file to use dynamic packaging
    - Generate content keys and a content delivery policy on the asset, and configure PlayReady
    - Create a locator to be able to stream the asset dynamically

Configure your media service in the Azure portal

Before starting to work with PlayReady, you need to configure your media service to work with the STS that will deliver the tokens. If you want to use ACS, you can follow this post. In this article I will use a simple class to deliver tokens in my application.

Go to http://manage.windowsazure.com, select your media service, and go to the CONTENT PROTECTION tab. In the middle of the page you should see the PlayReady configuration:

image

For the license authorization policy, choose Token. This option tells Azure Media Services that it should validate a bearer token when the client calls the PlayReady license server.

The issuer represents the name or URI of the STS that will generate the tokens, and the scope (or audience) represents the scope given to the token when it is issued.

Click the manage verification keys button. In the primary verification key field, put the key that the STS will use to sign the token:

image

Optionally, you can add a secondary verification key in case the first one gets compromised.

All this information enables Media Services to validate the token that your player will pass in a header when calling the PlayReady license server.

Your media service is ready to work with PlayReady.

Deliver Simple Web Token (SWT)

As their name indicates, SWTs are really simple tokens that are easy to exchange over HTTP. They should contain at least:

  • An issuer
  • An audience (scope)
  • An expiration date
  • An HMAC of all the parameters, to ensure the token has not been tampered with

If needed, you can also add a list of claims to the token.

All these parameters are URL encoded and concatenated into a simple query string: Issuer=your_issuer&Audience=your_audience etc.
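For illustration, a complete token (with made-up values, the signature truncated, and without the URL encoding that would be applied in practice) looks like this:

```
Issuer=https://mysts.example.com&urn:myclaim=somevalue&ExpiresOn=1430000000&Audience=urn:mediaservices&HMACSHA256=kZ1P...
```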

The following class is a sample Simple Web Token implementation:

```csharp
/// <summary>
/// Represents a simple web token
/// </summary>
public class SimpleWebToken
{
    private static readonly DateTime epochStart =
        new DateTime(1970, 01, 01, 0, 0, 0, 0, DateTimeKind.Utc);

    private NameValueCollection claimsCollection;
    private byte[] signingKey = null;

    /// <summary>
    /// Initializes a new instance of the <see cref="SimpleWebToken"/> class.
    /// </summary>
    /// <param name="base64Key">The base64 key used to sign the token.</param>
    /// <param name="ttl">The time to live.</param>
    public SimpleWebToken(string base64Key, TimeSpan ttl)
    {
        TimeSpan ts = DateTime.UtcNow - epochStart + ttl;
        this.ExpiresOn = Convert.ToUInt64(ts.TotalSeconds);
        this.claimsCollection = new NameValueCollection();

        var securityKey = new InMemorySymmetricSecurityKey(Convert.FromBase64String(base64Key));
        signingKey = securityKey.GetSymmetricKey();
    }

    /// <summary>
    /// Gets or sets the issuer.
    /// </summary>
    public string Issuer { get; set; }

    /// <summary>
    /// Gets or sets the audience.
    /// </summary>
    public string Audience { get; set; }

    /// <summary>
    /// Gets or sets the signature.
    /// </summary>
    public byte[] Signature { get; set; }

    /// <summary>
    /// Gets the expiration date, in seconds since the Unix epoch.
    /// </summary>
    public ulong ExpiresOn { get; private set; }

    /// <summary>
    /// Gets the claims.
    /// </summary>
    public IList<Claim> Claims
    {
        get
        {
            return this.claimsCollection.AllKeys
                .SelectMany(key =>
                    this.claimsCollection[key].Split(',')
                        .Select(value => new Claim(key, value)))
                .ToList();
        }
    }

    /// <summary>
    /// Adds the claim.
    /// </summary>
    /// <param name="name">The name.</param>
    /// <param name="value">The value.</param>
    public void AddClaim(string name, string value)
    {
        this.claimsCollection.Add(name, value);
    }

    /// <summary>
    /// Returns a <see cref="System.String" /> that represents this instance.
    /// </summary>
    public override string ToString()
    {
        StringBuilder content = new StringBuilder();

        content.Append("Issuer=").Append(this.Issuer);

        foreach (string key in this.claimsCollection.AllKeys)
        {
            content.Append('&').Append(key)
                .Append('=').Append(this.claimsCollection[key]);
        }

        content.Append("&ExpiresOn=").Append(this.ExpiresOn);

        if (!string.IsNullOrWhiteSpace(this.Audience))
        {
            content.Append("&Audience=").Append(this.Audience);
        }

        using (HMACSHA256 hmac = new HMACSHA256(signingKey))
        {
            byte[] signatureBytes = hmac.ComputeHash(
                Encoding.ASCII.GetBytes(content.ToString()));

            string signature = HttpUtility.UrlEncode(
                Convert.ToBase64String(signatureBytes));

            content.Append("&HMACSHA256=").Append(signature);
        }

        return content.ToString();
    }
}
```

In our case, we will use the same issuer, audience and signing key as the ones defined in the Azure Media Services portal.

It’s now really easy to issue an SWT using this information:

```csharp
Uri issuerName = new Uri(ConfigurationManager.AppSettings["IssuerName"]);
Uri scope = new Uri(ConfigurationManager.AppSettings["Scope"]);
byte[] signingKey = Convert.FromBase64String(ConfigurationManager.AppSettings["SigningKey"]);

var swt = new SimpleWebToken(ConfigurationManager.AppSettings["SigningKey"], TimeSpan.FromMinutes(10));
swt.Audience = scope.ToString();
swt.Issuer = issuerName.ToString();

string token = swt.ToString();
```

Upload and encode the asset

In this part you will see how to upload an asset to Azure Blob storage and launch an encoding job to generate a multi-bitrate asset. First you have to add the Media Services NuGet package to your solution:

image

Be sure to also add the SDK Extensions package, which gives you a set of really cool extension methods for the Media Services SDK!

Creating a Media Services asset from a file is now really simple using the CreateFromFile extension method on the assets collection:

```csharp
string mediaServiceName = ConfigurationManager.AppSettings["MediaServiceName"];
string mediaServiceKey = ConfigurationManager.AppSettings["MediaServiceKey"];

var context = new CloudMediaContext(mediaServiceName, mediaServiceKey);

var asset = context.Assets.CreateFromFile(videoFilePath, AssetCreationOptions.None, (sender, progressChanged) =>
{
    Console.WriteLine("{0} of {1} bytes uploaded", progressChanged.BytesUploaded, progressChanged.TotalBytes);
});
```

Once the asset has been uploaded, you can create a job that contains the encoding task:

```csharp
IJob job = context.Jobs.CreateWithSingleTask(
    MediaProcessorNames.AzureMediaEncoder,
    MediaEncoderTaskPresetStrings.H264AdaptiveBitrateMP4Set720p,
    asset,
    "Adaptive Bitrate MP4",
    AssetCreationOptions.None);

job.Submit();
job = job.StartExecutionProgressTask(j =>
{
    Console.WriteLine("Job state: {0}", j.State);
    Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
}, CancellationToken.None).Result;
```

If you want more explanations about these extensions for media services, you can follow this post.

Configure dynamic PlayReady on the asset

Once the job is done, an output asset is available on it. This asset represents the multi-bitrate file that has been encoded. You can now generate everything needed to enable PlayReady.

First you have to create a content key of the CommonEncryption type:

```csharp
private static async Task<IContentKey> CreateCommonTypeContentKeyAsync(IAsset asset, CloudMediaContext _context)
{
    // Create a common encryption content key
    Guid keyId = Guid.NewGuid();
    byte[] contentKey = GetRandomBuffer(16);

    IContentKey key = await _context.ContentKeys.CreateAsync(
        keyId,
        contentKey,
        "ContentKey CENC",
        ContentKeyType.CommonEncryption);

    // Associate the key with the asset.
    asset.ContentKeys.Add(key);

    return key;
}

private static byte[] GetRandomBuffer(int size)
{
    byte[] randomBytes = new byte[size];
    using (RNGCryptoServiceProvider rng = new RNGCryptoServiceProvider())
    {
        rng.GetBytes(randomBytes);
    }

    return randomBytes;
}
```

Next, you have to generate the token restrictions to associate the issuer, the audience and the signing key of your STS with the content key:

```csharp
private static List<ContentKeyAuthorizationPolicyRestriction> GetTokenRestrictions(string name, Uri issuer, Uri scope, byte[] signingKey)
{
    TokenRestrictionTemplate tokenTemplate = new TokenRestrictionTemplate();
    tokenTemplate.Issuer = issuer;
    tokenTemplate.Audience = scope;
    tokenTemplate.PrimaryVerificationKey = new SymmetricVerificationKey(signingKey);

    string requirements = TokenRestrictionTemplateSerializer.Serialize(tokenTemplate);

    List<ContentKeyAuthorizationPolicyRestriction> restrictions = new List<ContentKeyAuthorizationPolicyRestriction>()
    {
        new ContentKeyAuthorizationPolicyRestriction()
        {
            KeyRestrictionType = (int)ContentKeyRestrictionType.TokenRestricted,
            Requirements = requirements,
            Name = name
        }
    };

    return restrictions;
}
```

Now you have to get the template of the PlayReady license you will use. The Media Services SDK offers a set of classes that make this easy:

```csharp
private static string ConfigurePlayReadyLicenseTemplate()
{
    PlayReadyLicenseResponseTemplate responseTemplate = new PlayReadyLicenseResponseTemplate();
    PlayReadyLicenseTemplate licenseTemplate = new PlayReadyLicenseTemplate();
    licenseTemplate.PlayRight.AllowPassingVideoContentToUnknownOutput = UnknownOutputPassingOption.Allowed;
    responseTemplate.LicenseTemplates.Add(licenseTemplate);

    return MediaServicesLicenseTemplateSerializer.Serialize(responseTemplate);
}
```

These helper methods are used as follows:

```csharp
private static async Task ConfigurePlayReadyDynamicEncryptionAsync(
    CloudMediaContext cloudMediaContext,
    IAsset outputAsset,
    Uri issuerName,
    Uri scope,
    byte[] signingKey)
{
    IContentKey contentKey = null;
    var assetContentKeys = outputAsset.ContentKeys
        .Where(c => c.ContentKeyType == ContentKeyType.CommonEncryption);

    if (!assetContentKeys.Any())
    {
        contentKey = await CreateCommonTypeContentKeyAsync(outputAsset, cloudMediaContext);
    }
    else
    {
        contentKey = assetContentKeys.First();
    }

    var tokenRestrictions = GetTokenRestrictions(
        "My STS Token Restrictions",
        issuerName,
        scope,
        signingKey);

    var playReadyLicenseTemplate = ConfigurePlayReadyLicenseTemplate();

    IContentKeyAuthorizationPolicyOption policyOption = await
        cloudMediaContext.ContentKeyAuthorizationPolicyOptions.CreateAsync(
            "Option with Token Restriction",
            ContentKeyDeliveryType.PlayReadyLicense,
            tokenRestrictions,
            playReadyLicenseTemplate);

    IContentKeyAuthorizationPolicy policy = await cloudMediaContext
        .ContentKeyAuthorizationPolicies
        .CreateAsync("ACS Authorization Policy");

    policy.Options.Add(policyOption);
    await policy.UpdateAsync();

    contentKey.AuthorizationPolicyId = policy.Id;
    await contentKey.UpdateAsync();

    var licenseAcquisitionUrl = await contentKey.GetKeyDeliveryUrlAsync(ContentKeyDeliveryType.PlayReadyLicense);
    string strLicenseAcquisitionUrl = System.Security.SecurityElement.Escape(licenseAcquisitionUrl.ToString());

    Dictionary<AssetDeliveryPolicyConfigurationKey, string> assetDeliveryPolicyConfiguration =
        new Dictionary<AssetDeliveryPolicyConfigurationKey, string>
        {
            { AssetDeliveryPolicyConfigurationKey.PlayReadyLicenseAcquisitionUrl, strLicenseAcquisitionUrl },
        };

    var assetDeliveryPolicy = await cloudMediaContext.AssetDeliveryPolicies.CreateAsync(
        "PlayReady Delivery Policy",
        AssetDeliveryPolicyType.DynamicCommonEncryption,
        AssetDeliveryProtocol.SmoothStreaming,
        assetDeliveryPolicyConfiguration);

    outputAsset.DeliveryPolicies.Add(assetDeliveryPolicy);
    await outputAsset.UpdateAsync();
}
```

As you can see, once you have the content key, the token restrictions and the PlayReady license template to use, you can easily create the content key authorization policy and associate it with the content key. The next step is to create the asset delivery policy, so that the Media Services PlayReady license server can generate the PlayReady license for your content.

Dynamic encryption with PlayReady is now configured!

Publish the asset

The last step is the asset’s publication. If you have already done this without PlayReady, you will see that it works in exactly the same way. First, you have to create an access policy in your media service:

```csharp
var accessPolicy = context.AccessPolicies
    .Where(a => a.Name == "catchup_policy")
    .AsEnumerable()
    .FirstOrDefault();

if (accessPolicy == null)
{
    accessPolicy = context.AccessPolicies
        .Create("catchup_policy", TimeSpan.FromDays(100 * 365), AccessPermissions.Read);
}
```

Next, you use this access policy to publish the asset via a locator:

```csharp
var locator = context.Locators
    .CreateLocator(LocatorType.OnDemandOrigin, asset, accessPolicy);

Console.WriteLine(locator.GetSmoothStreamingUri());
```

Et voilà !

Test the asset with PlayReady encryption

To test your new dynamically PlayReady-encrypted asset, you can go to http://sltoken.azurewebsites.net (developed by the AMS team, I think). Paste the Smooth Streaming URL of your content and the token your STS delivered, and it just works:

image

Note: to make dynamic packaging work, you have to enable at least one streaming unit in the streaming endpoint configuration of your media service.

Conclusion

In this blog post we have seen how to upload, encode and protect content with PlayReady DRM, the new feature available in Azure Media Services. This feature is not yet supported on assets recorded from live channels, but in that case you can use AES encryption.

Hope this helps!

Julien

read more

Develop a blog using ASP.NET vNext, Azure Website, DocumentDB and Search services - Part 2

Introduction

This post is part of a series of articles about developing a blog with ASP.NET vNext, Azure Website, DocumentDB and search services. If you would like to read another article in the series:

- Develop a blog using ASP.NET vNext, Azure Website, DocumentDB and Search services - Part 1
- Develop a blog using ASP.NET vNext, Azure Website, DocumentDB and Search services - Part 2

In the first article I explained what ASP.NET vNext is and how to create the Azure resources we’ll need via the preview portal. Today, I’m going to explain what Azure DocumentDB is and start the implementation of the data layer of the blog, using this service.

NOTE: the alpha-4 version of ASP.NET vNext was released with Visual Studio CTP 4 a few days ago, so you can switch to this new version! Thanks to Adrien, who reviewed this blog post before I published it!

## Creation of the Visual Studio Solution

Before getting into Azure DocumentDB, we are going to create the blog solution in Visual Studio 14 CTP. In the web section, be sure to choose the new ASP.NET vNext project templates:

image

I chose to create an empty ASP.NET vNext application. Now, add a new ASP.NET vNext class library to the solution. It’s in this library that we will create the domain, queries and commands that work with DocumentDB. As you can see, both projects have a project.json file with some dependency configuration. The project.json file defines two kinds of dependencies: external libraries such as MVC, Entity Framework or DocumentDB, and framework dependencies:

image

aspnet50 represents the full .NET Framework and aspnetcore50 represents the small, cloud-optimized framework that comes with ASP.NET vNext. Referencing both frameworks ensures that your application builds for both of them, but also that all the libraries you work with are supported by the cloud-optimized framework. In our case that’s not possible, because some of the Azure SDK libraries are not supported by this version, so you should remove the aspnetcore50 line from all project.json files. Now you can add a very simple C# class in the library project, to represent a blog post:

```csharp
namespace MetroBlog.Domain.Entities
{
    using Newtonsoft.Json;
    using System;
    using System.Collections.Generic;
    using System.Collections.ObjectModel;

    /// <summary>
    /// Represents a blog post
    /// </summary>
    public class Post
    {
        /// <summary>
        /// Creates a new instance of the <see cref="Post"/> class
        /// </summary>
        public Post()
        {
            this.Tags = new Collection<string>();
        }

        /// <summary>
        /// Gets or sets the post identifier
        /// </summary>
        [JsonProperty("id")]
        public Guid Id { get; set; }

        /// <summary>
        /// Gets or sets the publication date
        /// </summary>
        [JsonProperty("publishDate")]
        public DateTime PublishDate { get; set; }

        /// <summary>
        /// Gets or sets the title
        /// </summary>
        [JsonProperty("title")]
        public string Title { get; set; }

        /// <summary>
        /// Gets or sets the permalink
        /// </summary>
        [JsonProperty("permalink")]
        public string Permalink { get; set; }

        /// <summary>
        /// Gets or sets the summary
        /// </summary>
        [JsonProperty("summary")]
        public string Summary { get; set; }

        /// <summary>
        /// Gets or sets the content of the post
        /// </summary>
        [JsonProperty("content")]
        public string Content { get; set; }

        /// <summary>
        /// Gets or sets the collection of tags
        /// </summary>
        [JsonProperty("tags")]
        public Collection<string> Tags { get; set; }

        /// <summary>
        /// Gets or sets a boolean that indicates if the post is published
        /// </summary>
        [JsonProperty("isPublished")]
        public bool IsPublished { get; set; }

        /// <summary>
        /// Gets or sets the username of the author
        /// </summary>
        [JsonProperty("username")]
        public string Username { get; set; }
    }
}
```

## Using Azure DocumentDB to store the blog posts

### Configure the project to use Azure Document DB


DocumentDB is a NoSQL database that stores documents described as JSON. One of the big advantages of this kind of database is that the schema does not have to be fixed: it can evolve with the application, and the data pushed into the database is continuously indexed to provide the best performance. Because documents are stored as JSON, it is really easy to push data to and extract data from DocumentDB. Microsoft provides a SQL-like language to query the database, so if you are not used to working with document databases, don’t be afraid: it’s simple!
To reference the NuGet package in your project, open the project.json file, start typing the name of the package (Microsoft.Azure.Documents.Client) in the dependencies section, and click the package in the IntelliSense list:

![image](/images/develop-a-blog-using-aspnet-vnext-azure-website-documentdb-and-search-services-part-2/image_213DBC0F.png)

As soon as you save the file, the package will be downloaded and the project references updated:
![image](/images/develop-a-blog-using-aspnet-vnext-azure-website-documentdb-and-search-services-part-2/image_19B24CA2.png)

To start using Azure DocumentDB with this SDK, you’ll need three pieces of information:

- The endpoint of the instance you have created
- The authorization key to access the database
- The id of the database to work with

The first two are available in the preview Azure portal:

![image](/images/develop-a-blog-using-aspnet-vnext-azure-website-documentdb-and-search-services-part-2/image_02F30519.png)

### Create a base class for commands and queries

I chose to work with queries and commands to read posts from and create posts in the document database. I also made a base class for all my commands and queries.
To work with DocumentDB in your application, you need to create an instance of the DocumentClient class, which comes with the NuGet package and takes the endpoint URL and the authorization key as parameters:
```csharp
public DocumentDbCommandQueryBase(DocumentDbOptions options)
{
    this.options = options;
    this.DocumentClient = new DocumentClient(new System.Uri(options.EndpointUrl), options.AuthorizationKey);
}
```

After that, you can get the database to work with, or create it if it does not exist yet:

```csharp
protected async Task<Microsoft.Azure.Documents.Database> GetDatabaseAndCreateIfNotExists()
{
    var database = this.DocumentClient.CreateDatabaseQuery()
        .Where(d => d.Id == this.options.DatabaseId)
        .AsEnumerable()
        .FirstOrDefault();

    if (database == null)
    {
        database = await this.DocumentClient.CreateDatabaseAsync(
            new Microsoft.Azure.Documents.Database()
            {
                Id = this.options.DatabaseId
            });
    }

    return database;
}
```

Relational databases like SQL Azure use tables to store data. In DocumentDB, documents are stored in document collections, which you can get or create using the DocumentClient:

```csharp
protected async Task<Microsoft.Azure.Documents.DocumentCollection> GetDocumentCollectionAndCreateIfNotExists(string collectionId)
{
    var database = await this.GetDatabaseAndCreateIfNotExists();

    var documentCollection = this.DocumentClient.CreateDocumentCollectionQuery(database.SelfLink)
        .Where(d => d.Id == collectionId)
        .AsEnumerable()
        .FirstOrDefault();

    if (documentCollection == null)
    {
        documentCollection = await this.DocumentClient.CreateDocumentCollectionAsync(
            database.SelfLink,
            new Microsoft.Azure.Documents.DocumentCollection()
            {
                Id = collectionId
            });
    }

    return documentCollection;
}
```

Query the document database to list all posts

It’s possible to query the database with a pseudo-SQL language or using LINQ. I chose the latter. You can find more information about querying DocumentDB here. To create a query, you also need to use the DocumentClient class and call the CreateDocumentQuery method:

```csharp
var documentCollection = await this.GetDocumentCollectionAndCreateIfNotExists("Posts");

var postsQuery = this.DocumentClient
    .CreateDocumentQuery<Domain.Entities.Post>(documentCollection.DocumentsLink)
    .Where(p => p.IsPublished)
    .AsQueryable();
```
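As an aside, the same query expressed in DocumentDB’s pseudo-SQL syntax (against the Posts collection, using the isPublished JSON property of the Post entity shown earlier) would look something like this:

```sql
SELECT p.id, p.title, p.publishDate
FROM Posts p
WHERE p.isPublished = true
```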

You can call the AsDocumentQuery extension method on your LINQ query to get the underlying document query, and execute it asynchronously:

```csharp
var documentQuery = postsQuery.AsDocumentQuery();
var result = await documentQuery.ExecuteNextAsync<Domain.Entities.Post>();
```

Like Azure Table storage, DocumentDB uses continuation tokens when returning documents:

```csharp
var posts = new List<Domain.Entities.Post>();
posts.AddRange(result);
while (documentQuery.HasMoreResults)
{
    result = await documentQuery.ExecuteNextAsync<Domain.Entities.Post>();
    posts.AddRange(result);
}
```

Add a post document in the database

Adding a document to the database is pretty simple! You have to get the document collection of your database and then call the CreateDocumentAsync method with the entity as a parameter:

```csharp
var documentCollection = await this.GetDocumentCollectionAndCreateIfNotExists("Posts");
await this.DocumentClient.CreateDocumentAsync(documentCollection.SelfLink, this.post);
```

Cool, isn’t it?

Conclusion

In this post we have seen how to use the Visual Studio 14 CTP to create new ASP.NET vNext projects, handle the new project.json dependencies file, and use the DocumentDB SDK to read and write data in a document collection.

The code is now available on GitHub, so feel free to browse it if you want.

In the next post we will start to really use ASP.NET vNext to display the posts!

Stay tuned

Julien

read more