Had enough.  One thing is for sure: you can't depend on other people to be reliable when it comes to things that matter to you.  The old geeks.netindonesia.net was great, but that no longer seems to be the case.  I've lost contact with the maintainer, and a previous attempt to get help retrieving the blog contents from the server did not go well at all.

So, I am going to migrate the content here the only way I know how…

The Internet Archive's Wayback Machine (web.archive.org), as suggested in this article: http://webmasters.stackexchange.com/questions/33346/recovering-a-lost-website-with-no-backup

I won't migrate outdated or irrelevant content; only content that is still of interest to me will be brought over to this new site.  I will try to maintain the chronological order.

Enough said… now let’s get to work…


I have to confess that I don't religiously practice TDD / unit testing all the time.  Most of the time when prototyping I use a regular C# console app or, better yet, fire up jsfiddle.net and hack away at the things I want to prove using JavaScript.   Most of the time this workflow does what I need it to do.  I can get the result I want and be pretty confident that the code works before I migrate it back to real production code in C#.   Perhaps afterwards I'll code up some unit tests to get some coverage.

Digressing a bit… I love jsfiddle and things like dotnetfiddle since they allow me to prototype stuff without having to go through the code / wait for compile / run cycle to see what comes out at the other end, but that's a different story. *Digression mode OFF*

I found that the times when I really feel I need to fall back to TDD / up-front unit testing are when dealing with complicated logic, like rules that depend on multiple if statements… like…

If it’s Tuesday and it is raining and you’re wearing red socks then do this

If it’s Tuesday and it is raining and you’re wearing blue socks then do that

If it’s Tuesday but it’s a leap year and it’s between January and March, and you are wearing either black, blue or red socks but are also wearing a sweater, then do something else.

You get my point…. some code is just harder to hold in your head and get right.   Complex boolean logic is one such thing.  This type of code is a perfect candidate for starting with TDD.
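To make it concrete, here is what pinning down such a rule with tests might look like, sketched in Python for brevity (the rule and its names are made up, mirroring the socks examples above; the real code would be C#):

```python
# A hypothetical business rule like the ones above. Order matters:
# the most specific rule (leap year + sweater) is checked first.
def sock_rule(is_tuesday, is_raining, sock_color,
              is_leap_year=False, month=6, wearing_sweater=False):
    """Return the action dictated by the (made-up) rule."""
    if (is_tuesday and is_leap_year and month in (1, 2, 3)
            and sock_color in ("black", "blue", "red") and wearing_sweater):
        return "something else"
    if is_tuesday and is_raining and sock_color == "red":
        return "this"
    if is_tuesday and is_raining and sock_color == "blue":
        return "that"
    return "nothing"

# One test per branch of the rule, so every path is covered:
assert sock_rule(True, True, "red") == "this"
assert sock_rule(True, True, "blue") == "that"
assert sock_rule(True, False, "red",
                 is_leap_year=True, month=2, wearing_sweater=True) == "something else"
assert sock_rule(False, True, "red") == "nothing"
```

Writing the assertions first forces you to enumerate every branch of the rule before the boolean soup gets a chance to confuse you.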

I had to work on some brownfield code for a client that has this kind of logic to determine whether a node in a treeview or a screen entry is enabled… what a headache :).   Unit testing came to the rescue.  The result: I am more confident now that the boolean logic I put into the code works as expected, since I wrote enough tests to cover every conceivable scenario of the business rule / requirement.

Agree?  Disagree?

What do you do?


How I think the Azure .NET SDK upgrade tool works behind the scenes

I am not sure if this is 100% correct, but from the behavior observed at work, upgrading the Azure SDK (say from 1.8 to 2.2, or from 2.2 to 2.3, etc.) using the tooling integrated into Visual Studio seems to only update the references belonging to projects that are referenced in the Azure Cloud Service project being upgraded.  Any other Azure DLL references elsewhere in your solution WILL NOT be updated automatically.

For example, your solution looks like so:

Solution A (created using SDK 2.2)

  • Cloud Service Project
    • Web Project 1
    • Web Project 2
  • Cloud Library Component (has Azure DLL references i.e. Storage, ServiceRuntime)
  • Data Access / Repository Component (has Azure DLL references i.e. Storage, Cache)
  • Web Project 1 (has Azure DLL references i.e. ServiceRuntime, Configuration, Diagnostics)
  • Web Project 2 (has Azure DLL references i.e. ServiceRuntime, Configuration, Diagnostics, Cache)

I know… this is probably not how it should be done, but for the sake of argument, this is a brownfield project that you inherited and were tasked to upgrade to the newer and shinier Azure SDK.  Ideally, it would be nice if you could centralize the Azure dependencies in a single project, but in reality, all sorts of weird legacy things happen…

Given the solution structure, what we think the Azure upgrade tool does under the covers is something like:

  1. Figure out which projects are referenced in the Cloud Service project.  In this case, it’s Web Project 1 and Web Project 2.
  2. Update all .cscfg and .csdef files to use the new SDK schema, etc.
  3. For each project found in the Cloud Service project, look at all Azure DLL references among the project references and update them to point to the new SDK DLLs (i.e. in C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.3\ref).  Also look in the respective web.config (maybe in app.config as well, not sure) and update any references to the new Azure SDK DLLs.
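For example, after step 3 an upgraded reference in Web Project 1’s .csproj might end up looking something like this (assembly, version and path are illustrative):

```xml
<Reference Include="Microsoft.WindowsAzure.ServiceRuntime">
  <SpecificVersion>False</SpecificVersion>
  <HintPath>C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.3\ref\Microsoft.WindowsAzure.ServiceRuntime.dll</HintPath>
</Reference>
```

The Cloud Library and Data Access projects keep whatever HintPath they had before the upgrade, which is the root of the problem described below.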

To save cost, we use a trick where we have a webrole.cs inside our web project that does roughly what additional separate worker roles might otherwise do.  This needs its own app.config inside the web project, in addition to web.config, since webrole.cs does not run in the same application domain / process as the web application itself and therefore will not get its configuration from web.config.  Rather, it pulls its configuration from the app.config in that same web project.  What we found is that this app.config is not updated by the Azure .NET SDK upgrade tool.
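Concretely, if web.config carries assembly binding redirects for the Azure DLLs, the app.config next to webrole.cs needs the same redirects bumped by hand after an upgrade.  Something along these lines (assembly name, public key token and versions are illustrative, not taken from our actual project):

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Microsoft.WindowsAzure.Storage"
                          publicKeyToken="31bf3856ad364e35" culture="neutral" />
        <!-- Must match the redirect the upgrade tool wrote into web.config -->
        <bindingRedirect oldVersion="0.0.0.0-3.0.0.0" newVersion="3.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```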

In this case, only the Azure DLL references in Web Project 1 and Web Project 2 are updated.  The Cloud Library and Data Access components ARE NOT updated, even though they might be referenced by Web Project 1 / 2.


Azure .NET SDK upgrade choices

Given the situation, you can choose to:

  1. Spend the needed time to refactor the solution and try to centralize everything that relates to Azure in one project, abstracted behind some sort of interface.  This could be a nice thing to have.  An added benefit is that you can, in theory, abstract away all Azure dependencies in that particular project and deal with a higher-level abstraction that gives you an agnostic way to access cloud-related functionality… perhaps useful as well if you ever want to move your cloud dependencies to another cloud vendor such as AWS.
  2. Bite the bullet on the technical debt and manually re-reference the Azure DLLs that are skipped by the upgrade tool.  Perhaps figure out a way to automate this as well…
  3. Hope that Microsoft fixes the Azure SDK upgrade tool in Visual Studio to crawl Azure dependencies from the entry points and upgrade those references as well.


What We Did

What we did at work was go with option 2.  We manually updated the Azure .NET SDK references in the Cloud Library and Data Access components to be on par with the tool-updated web projects.  It would be nice if the tool that comes with the SDK did this automatically... wishful thinking.

We also toyed with the idea of pulling the Azure DLL dependencies via NuGet packages and scripting them somehow, so that the next time we need to upgrade the Azure SDK, we just make a minor change to a PowerShell script and run it, updating all the Azure DLL references to the new SDK level.
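The core of such a script is just "find every .csproj and rewrite the SDK version segment in its references".  A minimal sketch of that idea, written in Python here purely for illustration (we imagined doing it in PowerShell; the version strings and project layout are assumptions):

```python
# Sketch: bump the Azure SDK version segment in every .csproj under a root.
# This naively replaces the old version string wherever it appears
# (HintPath elements included); a real script would parse the XML properly.
import pathlib

def bump_sdk_refs(root, old="v2.2", new="v2.3"):
    """Rewrite old -> new in each .csproj; return the names of changed files."""
    changed = []
    for proj in pathlib.Path(root).rglob("*.csproj"):
        text = proj.read_text()
        if old in text:
            proj.write_text(text.replace(old, new))
            changed.append(proj.name)
    return changed
```

Run it once per solution root and every project, not just the ones the upgrade tool visits, gets its references moved to the new SDK level.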


Official Azure NuGet Package Inconsistencies

Another weird thing we found… it’s not very clear which way we should go, since half of the Azure DLLs can be found (officially) as NuGet packages, such as Storage and Cache, but some others, like ServiceRuntime, are referenced from the ref folder inside the SDK installation and have no official NuGet package associated with them.  We did find unofficial NuGet packages for some, though, such as Unofficial.Microsoft.WindowsAzure.Diagnostics.


Not too long ago, we moved our repository to www.visualstudio.com

Some reasons behind the moves were:

  1. The Visual Studio Online repository is free for up to 5 developers (which suits our small team).
  2. We want to take advantage of the continuous delivery workflow that Azure and VS online offer.
  3. VS Online has a nice Scrum portal that includes feature, backlog and task management, a drag-able Kanban board, burn-down charts and cumulative flow diagrams. The “virtual” project chat room is nice, but we mostly use Skype to communicate with our remote team.

Since then, we’ve been working toward enabling continuous deployment of our Azure solution, similar to what is described in this article: Continuous delivery to Azure using Visual Studio Online

The purpose of this post is not to teach you how to do this.  For that, please follow the steps described in the linked article above.  What I wish to talk about here is more about what’s missing from the “happy day” scenario of doing continuous delivery as described in that article.

Azure Continuous Deployment of Visual Studio solution with multiple cloud service projects from Visual Studio Online

Caveat: I am the accidental build master here, without in-depth knowledge of VS Build Process Template customization (not yet, anyhow), so the solution described here might not be the most technically correct, but it works for us.  If you know a better way to do this, by all means, please share.

Originally, our Visual Studio solution contained multiple cloud service projects, which we found does not work with the normal continuous deployment scenario that comes out of the box with Visual Studio Online and Azure.


Why the out-of-the-box build process will not work with Visual Studio solutions containing multiple cloud service projects

Apparently, the standard continuous deployment workflow that comes out of the box expects a solution with only a single cloud service project. You can see this in the build definition in the following section: Process/6.Deployment/Deployment/Windows Azure Deployment Environment. Basically, you can only deploy a single cloud service if you are doing it the standard way. You might be able to tweak the build definition workflow, but we decided not to waste too much time researching this.


Move each cloud service project into its own Visual Studio solution

So, we moved the extra cloud service projects from our main Visual Studio solution into their own Visual Studio solutions and re-established any project / binary references as necessary.  This, however, presents a different challenge when doing an Azure .NET SDK upgrade, which I will talk about in a different post.  Each of these solutions is then associated with its own continuous delivery build definition.  We found this setup to work quite nicely.

Original VS solution:

My Awesome VS Solution

  • Cloud Service Project 1 (i.e. main web + worker role that will always need to be deployed as pair)
  • Cloud Service Project 2 (i.e. incoming email handler worker role, infrequently deployed)
  • Supporting library project 1 (i.e. Azure specific library)
  • Supporting library project 2 (i.e. SQL Data Access library)
  • Actual web project
  • Worker role project
  • Email worker project

Refactored VS solutions:

Main Web + Worker Role VS solution

  • Cloud Service Project 1 (main web + worker role)
  • Supporting library project 1
  • Supporting library project 2
  • Web project
  • Worker role project

Email Handler VS solution

  • Cloud Service Project 2 (email handler)
  • Supporting library project 1
  • Supporting library project 2
  • Email worker project

Additional Build Definition Configuration

The build definitions themselves need to be tweaked for the following reasons:

  1. We have multiple cloud service project configurations: development (local), staging, production, etc.
  2. We have multiple build configurations: debug, staging, production, etc.  We use build configurations to include or exclude certain code paths using #if / #endif directives, as well as to do configuration transforms on web.config or app.config (using the SlowCheetah Visual Studio extension).

Therefore, we need to do extra build definition customization in the following sections:

  • Process/Build/Projects: this needs to point to the Visual Studio solution you wish to build (it must only contain a single cloud service project).
  • Process/Build/Configuration: this needs to match the build configuration and platform you are building for (i.e. Mixed Platforms|Staging or Any CPU|Production).
  • Process/Build/Advanced/MSBuild Arguments: you need to add /p:TargetProfile={your cloud project configuration target}, for example /p:TargetProfile=Production if you are building for a Production deployment, where the configuration settings come from the ServiceConfiguration.Production.cscfg file in the Cloud Service project.
  • Process/Deployment/Deployment/Windows Azure Deployment Environment: you need to tweak this to suit your needs.  For example, which Storage Account you wish the continuous deployment to use (it will upload the built package to the vsdeploy container in blob storage), whether you wish it to go directly to the Production slot or stay in the Staging slot for a manual VIP swap, the Azure subscription, which cloud service to deploy to, etc.
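Putting those settings together, the build definition effectively invokes something equivalent to the following (solution name, configuration and profile are placeholders matching the examples above, not our actual values):

```shell
msbuild MainWebWorker.sln /p:Configuration=Production /p:Platform="Any CPU" /p:TargetProfile=Production
```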

I hope this post helps if you ever find yourself in a similar situation.


Further notes on Visual Studio solution splitting

The following has nothing to do with continuous deployment, but we realized something else after having tons of issues with the Azure .NET SDK upgrade and the way we split our original VS solution (the one with multiple cloud service projects) into multiple VS solutions.

The problem originates with how we structured our solution and how the Azure .NET SDK upgrade tool and NuGet work.  We are using project dependencies, not binary dependencies, when establishing references between our own VS projects in the solution.  We also have Azure DLLs (pulled via NuGet packages) referenced in multiple projects, not just the web project.  For some reason, when we update a NuGet package such as WindowsAzure.Storage in one of the refactored VS solutions, things start breaking in the other refactored VS solution, and vice versa (build failures, etc.).  This has to do with how reference dependencies are resolved by Visual Studio.  When shared supporting project 1 is updated (via NuGet) in VS solution 1, NuGet injects a path relative to that particular solution.  When the same supporting project 1 is loaded in VS solution 2, it won’t be able to find the referenced DLL in that solution-specific packages folder, and it will fall back to an older Azure DLL in the C:\Program Files\Microsoft SDKs\Windows Azure\…\ref folder.
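As an illustration of the failure mode, the NuGet-injected reference in the shared project’s .csproj ends up relative to whichever solution performed the update (folder names and package version here are illustrative):

```xml
<!-- Written by NuGet relative to Solution 1's packages folder; this path
     does not resolve when the same project is loaded from Solution 2. -->
<Reference Include="Microsoft.WindowsAzure.Storage">
  <HintPath>..\..\Solution1\packages\WindowsAzure.Storage.3.0.2\lib\net40\Microsoft.WindowsAzure.Storage.dll</HintPath>
</Reference>
```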

So, to solve this issue, we think the following setup should work nicely.

Say you have 3 cloud services in your product.  You would create the following VS solutions:

  1. A solution whose sole purpose is development.  This solution can have multiple cloud service projects inside of it (as per the original).  The purpose here is to minimize Azure DLL and other NuGet package issues when upgrading the Azure .NET SDK in the future.  Any Azure .NET SDK and related NuGet package upgrades should be done in this solution.
  2. Solution 1, which only contains cloud service project 1 and all its dependencies (i.e. cloud library, data access, etc.).  This is the designated build solution when deploying cloud service 1.  You should not perform any NuGet upgrade in this solution.  As a matter of fact, you should not load this in Visual Studio to do any code alteration.  This solution is purely for build and continuous deployment purposes.
  3. Solution 2, which only contains cloud service project 2 and all its dependencies (i.e. cloud library only).  This is the designated build solution when deploying cloud service 2, with the same rules: no NuGet upgrades, no code alteration, purely for build and continuous deployment purposes.
  4. Solution 3 for deploying cloud service 3, again with the same rules: no NuGet upgrades, no code alteration, purely for build and continuous deployment purposes.

I think you get the idea.

We think having multiple solutions like so will simplify both Continuous Delivery and Azure SDK updates in the future.


It’s been a while since my last blog post.  My old blog is still located at http://geeks.netindonesia.net/blogs/jimmy and, for whatever reason, about 99% of the time I am unable to connect to it from Australia to post anything new or even read it.

After giving it a lot of thought, I finally decided to start over.  This time around I’m taking things into my own hands instead of hosting the blog on a community site where I have no control over its accessibility.

Starting fresh also gives me a chance to try new stuff, such as Mads Kristensen’s MiniBlog engine, which I am hosting on Azure Web Sites.  I toyed around with Orchard CMS a bit, but I think MiniBlog is a better fit for this.

Slowly, I’ll try to move over what I deem to be interesting content from my old blog.  So far, my attempts to gain access to my posts as an export of some sort have not worked.