Wintellect Blogs

It’s Deja Vu All Over Again… Timing… Timing… Timing… Azure Storage Keys and Your Computer’s Clock

Paul Mehner

24 Nov , 2012  

The demo code for my upcoming book on Windows Azure storage inexplicably
began reporting that “The remote server returned an error: (403) Forbidden” on
every operation. Investigating the problem using the integrated Visual Studio
2012 Server Explorer to view objects in cloud storage was also problematic…
Visual Studio reported that the “current storage account key is invalid”, and
instructed me to create a new one. The new storage account key also failed, and
so did associating a new management X.509 certificate to the subscription.

Remembering conversations I had shared with Jeffrey Richter about clock skew and credentials in the cloud, I checked my notebook computer’s clock and discovered that it was slow by several minutes. My computer was set to synchronize with time.nist.gov, but when I attempted to perform a manual update I received a connection error from the service. In an attempt to rectify this secondary condition, I switched my notebook’s time source to the time.windows.com time server and synchronized again. Voilà!… with my notebook’s clock more closely aligned with the Windows Azure storage service’s clock… my credentials began working again.

This can be a tricky condition to diagnose because every
error message given by the client library’s interpretation of the 403 status
code points you towards something having to do with your storage account or its
storage keys rather than the real trouble, which is your computer’s clock.… Read more

Wintellect Blogs

Comodo SSL Certificate Breach’s Potential Impact on Security Token Services and their Identity Providers

Paul Mehner

25 Mar , 2011  

Recently, Iranian crackers used a username and password to make certificate requests from the Comodo Certificate Authority. These requests were successful, and certificates were issued for 9 domains, which are published on the Comodo Fraud Incident Report page: http://www.comodo.com/Comodo-Fraud-Incident-2011-03-23.html

This issue is of particular importance to me because SSL is the primary mechanism by which integrity and confidentiality are assured for Security Tokens and Security Token Requests. My latest blog post provides instructions on how to add Yahoo and Google as Identity Providers to Windows Azure AppFabric Access Control Service v2.0. The fraudulent certificates are for the major Identity Provider sources on the Internet (e.g. mail.google.com, www.google.com, login.yahoo.com, login.skype.com, addons.mozilla.org, login.live.com, global trustee). These certificates may be used to spoof content, perform phishing attacks, or perform man-in-the-middle attacks against all Internet application users (in my view, the impact potentially extends beyond applications accessible via web browsers). Although the sky is far from falling, this breach does illuminate some pretty significant vulnerabilities in our Internet security infrastructure, which need to be tightened.

Revocation of your computer’s trust in these certificates can be obtained via a web browser update (which is also very unfortunate, as it makes the procedure for responding to such security threats extremely cumbersome and hard to orchestrate).… Read more

Wintellect Blogs

Programmatically Adding Google or Yahoo as an Identity Provider to the Windows Azure AppFabric Labs v2.0 Access Control Service

Paul Mehner

22 Mar , 2011  

This blog post assumes that the reader knows the basics of Identity Providers and Security Token Services. Its purpose is to illustrate how to programmatically add Google or Yahoo as an Identity Provider, because there isn’t much information available on how to do this. For further information about using the ManagementService proxy, I suggest downloading the CodePlex ACS Management examples from http://acs.codeplex.com/releases/view/57595

We manage the Windows Azure AppFabric Access Control Service v2.0 through code using the ManagementService proxy and data types, which are generated when we add a service reference to the ACS metadata endpoint located at https://{yournamespace}.accesscontrol.appfabriclabs.com/v2/mgmt/service. You can do this using either the Visual Studio “Add Service Reference” menu option or manually using the svcutil.exe utility. There are examples of this in the code samples mentioned above.

To begin, we will use the management service proxy to retrieve a list of the IdentityProviders that have already been installed for the targeted namespace. By default, Windows Live ID will already be present and cannot be removed. The management service API requires that all requests be accompanied by a SWT token, which is also covered in the previously mentioned code samples.
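
As a rough sketch of that first step, the following assumes a ManagementService proxy generated from the mgmt/service endpoint; the proxy type name, the IdentityProviders entity set, the DisplayName property, the "Bearer" header scheme, and the GetSwtTokenFromAcs helper are assumptions drawn from the CodePlex samples rather than anything guaranteed by this post.

    using System;
    using System.Net;

    class ListIdentityProviders
    {
        static void Main()
        {
            // GetSwtTokenFromAcs is a stand-in for the SWT-acquisition code in the samples.
            string swtToken = GetSwtTokenFromAcs();

            // ManagementService is the WCF Data Services proxy generated by
            // "Add Service Reference" against .../v2/mgmt/service.
            var svc = new ManagementService(
                new Uri("https://yournamespace.accesscontrol.appfabriclabs.com/v2/mgmt/service/"));

            // Every management request must carry the SWT in its Authorization header.
            svc.SendingRequest += (s, e) =>
                e.RequestHeaders.Add(HttpRequestHeader.Authorization, "Bearer " + swtToken);

            foreach (var idp in svc.IdentityProviders)
                Console.WriteLine(idp.DisplayName);   // Windows Live ID will always be listed
        }

        static string GetSwtTokenFromAcs()
        {
            // See the CodePlex ACS management samples for the real implementation.
            return "...";
        }
    }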

To create a new IdentityProvider, we need to establish an Issuer for tokens coming from that Identity.… Read more
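
Continuing the sketch above with the same generated proxy and SWT plumbing, creating the Issuer and the IdentityProvider that references it might look roughly like this; every entity and property name here (Issuer.Name, AddToIssuers, DisplayName, LoginLinkName, WebSSOProtocolType, IssuerId) is an assumption based on the CodePlex samples, not something taken from this post.

    // Establish the Issuer first so the IdentityProvider can reference it.
    var issuer = new Issuer { Name = "Google" };
    svc.AddToIssuers(issuer);
    svc.SaveChanges();

    var idp = new IdentityProvider
    {
        DisplayName = "Google",
        LoginLinkName = "Google",
        WebSSOProtocolType = "OpenId",   // Google and Yahoo federate with ACS over OpenID
        IssuerId = issuer.Id
    };
    svc.AddToIdentityProviders(idp);
    svc.SaveChanges();

    // The samples also add an IdentityProviderAddress pointing at the provider's
    // OpenID sign-in endpoint before the new provider is usable.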

Wintellect Blogs

Scaling Up Or Scaling Out In The Cloud

Paul Mehner

19 Dec , 2010  

Windows Azure provides us the ability to scale our application up by specifying how many CPU cores we want in our service instances, or to scale out by specifying how many single-core instances we require. Both strategies can accomplish the same scaling objective for the same price (eight 1-core machines @ 12 cents/hour, or one 8-core machine @ 96 cents/hour), but in smaller deployment scenarios (under 8 CPU cores) there are a couple of advantages that clearly favor a greater number of small-VM instances over a single VM instance with an equivalent number of cores.
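
As a hedged illustration of where each knob lives (role names, sizes, and counts below are placeholders): the VM size, the "scale up" dimension, is declared in ServiceDefinition.csdef, while the instance count, the "scale out" dimension, is declared in ServiceConfiguration.cscfg.

    <!-- ServiceDefinition.csdef: scale UP by choosing a larger VM size -->
    <WorkerRole name="MyWorker" vmsize="Small" />

    <!-- ServiceConfiguration.cscfg: scale OUT by raising the instance count -->
    <Role name="MyWorker">
      <Instances count="8" />
    </Role>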

The Windows Azure Service Level Agreement (SLA) guarantees 99.95% service uptime. To receive this benefit, the SLA requires that you deploy a minimum of two service instances. Another important feature is the Rolling Upgrade. A rolling upgrade is a deployment feature of Windows Azure that allows service instances to be stopped and upgraded individually without bringing all of your instances down at the same time. This allows your service to remain operational during upgrade periods (albeit in a degraded state).… Read more

Wintellect Blogs

Installing Windows Azure SDK v1.3 Breaks Support for Visual Studio 2008

Paul Mehner

18 Dec , 2010  

Be aware that installing the November 2010 Windows Azure SDK v1.3 will break support for cloud projects running under Visual Studio 2008. To the best of my knowledge this was not widely announced (in fact, I first learned of it during installation of the SDK). If you have Visual Studio 2008 Windows Azure projects, you’ll want to ensure that you have Visual Studio 2010 and a plan for migrating your projects before installing this new SDK.

When running the SDK setup on a machine with Visual Studio 2008 installed, you’ll receive a warning that “Setup has detected that Windows Azure Tools for Visual Studio 2008 is installed. As Windows Azure Tools 1.3 does not support Visual Studio 2008, if you continue to install this software, Windows Azure Tools for Visual Studio 2008 will stop working due to incompatible Windows Azure SDK version. Do you want to continue?” … Read more

Wintellect Blogs

Using The AsyncEnumerator To Improve Throughput of I/O-Bound Windows Azure Worker Roles

Paul Mehner

15 Dec , 2010  

The Windows Azure Worker Role is a perfect place to put code that you want to run continuously in the background to process work as it becomes available. The information presented here is also useful in web roles.

If you’re writing cloud applications, it’s likely you are targeting high levels of performance and scalability. It is reasonable to expect that you want to get the most out of your investment in cloud computing, and making the best use of your purchased resources will save you money. It is therefore also reasonable to expect that most non-trivial applications you deploy to a production cloud environment would be written to perform I/O operations asynchronously.

In a Windows Azure Worker Role, a single thread is dispatched to your worker process’s Run method by the Windows Azure AppFabric. The rest of the threading model is left up to you. This is very much like a Windows service or a console application. If you want to make maximum use of the cores available in your service instances, it is highly recommended that you leverage the CLR thread pool.
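
The post uses Wintellect’s AsyncEnumerator for this; as a simpler stand-in (not the post’s code), the sketch below shows a Run method that keeps its single dispatched thread free by issuing I/O with the classic Begin/End pattern and letting thread-pool threads run the completions. The work-item URL and the polling delay are placeholders, and real queue-polling logic is omitted.

    using System;
    using System.Net;
    using System.Threading;
    using Microsoft.WindowsAzure.ServiceRuntime;   // Azure SDK assembly

    public class WorkerRole : RoleEntryPoint
    {
        public override void Run()
        {
            // Azure dispatches exactly one thread here; keep it free by starting
            // I/O asynchronously so thread-pool threads handle the completions.
            while (true)
            {
                var request = (HttpWebRequest)WebRequest.Create("http://example.com/work-item");

                request.BeginGetResponse(ar =>
                {
                    using (var response = (HttpWebResponse)request.EndGetResponse(ar))
                    {
                        // Process the result on a thread-pool thread.
                        Console.WriteLine("Completed with status {0}", response.StatusCode);
                    }
                }, null);

                Thread.Sleep(TimeSpan.FromSeconds(1));   // pace the polling loop
            }
        }
    }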

Using the .NET Task Parallel Library (TPL) is an option if your worker roles are compute-bound, but it won’t help you much for I/O bound operations.… Read more

Wintellect Blogs

Updated FAQ for SQL Azure

Paul Mehner

10 May , 2010  

Microsoft published an updated FAQ (May 3, 2010) for SQL Azure, available here.

The FAQ is very thorough and is a “must read” for any organization planning a relational database migration or a new cloud application.

“This paper provides an architectural overview of SQL Azure Database, and describes how you can use SQL Azure to augment your existing on-premises data infrastructure or as your complete database solution.”

Read more

Wintellect Blogs

Improving Windows Azure Storage Throughput Using the Content Delivery Network

Paul Mehner

4 May , 2010  

Windows Azure Content Delivery Network (CDN) caches your Windows Azure Data Storage blobs at strategically placed locations around the world (18 at the time of this blog post). The purpose of the CDN is to provide maximum bandwidth for delivery of content to our applications and users. Building massively scalable applications requires squeezing every ounce of juice possible from the infrastructure and machinery. The CDN significantly improves retrieval performance for our most frequently used anonymously accessible read-only data.

The CDN works by caching the first request made to retrieve a blob from Windows Azure Data Storage through a specialized URL that maps to our data storage account. It then keeps the result in that geographically localized cache so that subsequent requests for the same blob can be served from the cache, which is much faster than the original trip to fetch the blob from the more geographically distant data center. Any blob requested through a special CDN URL will be served from the local cache until its Time To Live (TTL) has expired, at which point a fresh copy of the blob will be retrieved from data center blob storage with a fresh TTL. Because the first request still requires retrieval from data center storage, frequently used blobs receive the greatest performance boost.… Read more
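
As a hedged sketch (assuming the v1.x Microsoft.WindowsAzure.StorageClient library, with placeholder credentials, container, and blob names), the blob’s Cache-Control header is the main lever you have over how long an edge node keeps its copy:

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    class SetCdnCacheLifetime
    {
        static void Main()
        {
            // Placeholder connection string; CDN must also be enabled on the storage account.
            var account = CloudStorageAccount.Parse(
                "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=...");
            CloudBlobClient client = account.CreateCloudBlobClient();

            CloudBlob blob = client.GetBlobReference("public-assets/logo.png");

            // The CDN honors the blob's Cache-Control header when deciding how long to cache it.
            blob.Properties.CacheControl = "public, max-age=7200";   // two hours
            blob.SetProperties();

            // Requests made through the CDN URL (e.g. http://<identifier>.vo.msecnd.net/public-assets/logo.png)
            // are then served from the nearest edge node until that lifetime expires.
        }
    }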

Wintellect Blogs

Learning Windows Azure platform Resources

Paul Mehner

20 Apr , 2010  

I’ve assembled a short list of training materials and utilities that are helpful in learning the Windows Azure platform.

Read more

Wintellect Blogs

Understanding Windows Azure platform AppFabric Access Control Service Resources

Paul Mehner

6 Apr , 2010  

Before we can begin using the Windows Azure AppFabric Access Control Service (ACS) to decouple our applications from security concerns and enable claims-based identities, we need to understand the Resources contained in the Service Namespace and what role they play in the authentication and authorization infrastructure. This brief blog entry is meant to provide you with the basic understanding and vocabulary required to get started.

Service Namespace

The Service Namespace is an abstraction for the collection of ACS Resources including Token Policies, Scopes, Issuers, and Rules (which are described in more detail below).

The Service Namespace is composed of a hierarchy of related entities. At the root of this hierarchy is the AppFabric Service Account Project. The Service Namespace can be broken into three constituent parts as shown in Figure 1: the Token Policy, the Scope, and the Issuer.

Figure 1 – Service Namespace Object Hierarchy

Token Policy

A Token Policy defines token expiration periods and digital signing keys. A Token Policy may be shared across Service Namespaces and is used by the ACS to sign the response tokens and to set their expiration periods.

Issuer

An issuer is a party that will issue requests for tokens from the ACS.… Read more