In the last several years, NoSQL data stores have emerged as viable and useful alternatives to traditional relational databases. While relational technologies like SQL Server will likely always have a place in the enterprise, the ability to store data as documents or connected graphs rather than classic “rows and columns” is powerful and should occupy a first-class position in any modern enterprise developer’s toolbox.
The modern enterprise generates huge amounts of data and requires state-of-the-art tools to store, analyze, and derive actionable meaning from that data. Mainstream open-source options like Apache Hadoop, Spark, and Storm are powerful but come with a steep learning curve and impose significant ongoing operational burdens.
Azure Data Lake combines these powerful tools with flexible, approachable alternatives into a unified, cloud-enabled analytics-as-a-service platform that scales to enterprises of any size.
Attend this one-hour, demo-filled webinar from Wintellect to learn the major components of Azure Data Lake and how you can start using it today to gain deep insight into your business.
The growing universe of smart devices provides us with an immense amount of data that can be analyzed to improve productivity and, ultimately, our quality of life. But what’s the best way to reliably and securely collect that data? In an insightful new webinar, Wintellect Principal Architect Josh Lane provides an overview of the technology behind Microsoft’s Azure IoT Suite, a set of cloud-powered tools for effectively managing and communicating with connected devices at scale.
In the one-hour session, Lane walks through general principles for building a robust IoT platform, the various elements of the Azure IoT ecosystem, some real-world scenarios for applying them, and subscription options for businesses considering the suite.
I’m preparing some material for a webinar on Azure IoT in mid-October (you are signed up, aren’t you?) and thought I’d do a quick intro to the basic concepts and moving parts.
Azure IoT Hub is a cloud-scale service for managing and securely communicating with large numbers of field devices (potentially millions at once); communication can occur from device to cloud, and also from cloud to device (for issuing commands or queries to devices). It’s standards-based so it works with many device types, a number of communications protocols and guest operating systems, and supports various network topologies. It also supports custom gateways for edge analytics, traffic optimization, etc. Finally, it integrates with a number of existing Azure services like Stream Analytics, Machine Learning, and Event Hubs to maximize scale and minimize time to insight.
Let’s walk through a sample Azure IoT Hub-based solution and see it in action. I live in Atlanta, GA USA and our public transit system provides geolocation and other metadata about buses and trains; we’ll use the bus data to simulate device activity in the field (each bus == a device) and use IoT Hubs, Stream Analytics, Azure Storage, DocumentDB, and some custom code to collect and present that data in a meaningful way.
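To make the “each bus == a device” idea concrete, here’s a minimal node.js sketch of the shaping step: turning a raw transit-feed record into a device-to-cloud telemetry payload. The field names (busId, route, lat, lng, observedAt) are hypothetical stand-ins for whatever the real feed provides, and in a real solution you’d hand the resulting message to the Azure IoT device SDK rather than just logging it.

```javascript
// Shape one raw bus record into a telemetry message for IoT Hub.
// Each bus is treated as a distinct device (deviceId = "bus-<id>").
function toTelemetryMessage(busRecord) {
  return {
    deviceId: 'bus-' + busRecord.busId,
    timestamp: new Date(busRecord.observedAt).toISOString(),
    location: { lat: busRecord.lat, lng: busRecord.lng },
    route: busRecord.route
  };
}

// A real implementation would send this via the device SDK's
// client.sendEvent(...); here we just emit the JSON payload.
var sample = { busId: 1407, route: '110', lat: 33.749, lng: -84.388,
               observedAt: 1443712800000 };
console.log(JSON.stringify(toTelemetryMessage(sample)));
```

Downstream, Stream Analytics can then query these JSON messages as they arrive and fan results out to Azure Storage and DocumentDB.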
In my last post I discussed the pervasive issue of relational modeling as a (poor) substitute for proper domain modeling, the reasons why (almost) everyone continues to build software that way, and the resulting problems that arise for those with ambitions to move their relationally-modeled software system to the cloud.
Given the prevalence of RDBMS-backed enterprise software and the accelerating pace of public cloud adoption, chances are pretty good you’re faced with this scenario today. Let’s discuss what you can do about it.
My first piece of advice is to make sure you really need the public cloud. The cloud has well-documented scalability, elasticity, and agility benefits, but most of those arise only for software designed intentionally to take advantage of them. They also tend to prove most cost-effective when amortized across a relatively long application life span, or across a relatively large (and, ideally, increasing) number of transaction requests. If yours is a legacy app with modest and relatively static resource needs, the cost justification for a public cloud deployment may not be so obvious.
For CTOs looking to squeeze new life out of legacy enterprise applications, the cloud offers tantalizing prospects. Pushbutton scalability, reduced capital costs, outsourcing of non-core IT functions, potentially greater monitoring and health management capabilities, and almost certainly greater overall uptime… even with the potential downsides, it’s no wonder senior management is tempted.
And yet those downsides are more than nagging problems; in many cases they pose significant barriers to a successful move from private data center to public cloud. An existing enterprise app might work fine running on internal hardware, with a modest user base… but move it blindly to a VM in Azure or AWS and suddenly that clever little accounting app grinds to a halt (as does your business). But why is that, exactly?
I just got home from Devlink (unfortunately I had to bail out a day early) but I wanted to take a moment to say how impressed I am with the event, facilities, staff, and most important… the content! There were some excellent sessions throughout the week and my only regret in giving two talks of my own is that it left less time to soak up knowledge from everyone else. This was my first Devlink… it definitely won’t be my last. Kudos and sincere thanks to John Kellar and the Devlink board for putting on a great conference.
I had the pleasure of delivering two talks… “Node.js for .NET Developers” and “AWS vs. Microsoft Azure”. Both had great audience engagement and were lots of fun to deliver. I also did tag-team delivery of the all-day Microsoft Azure Pre-Con session with fellow Wintellectual John Garland, himself a fountain of Azure knowledge and all-around smart dude. It’s almost enough for me to forgive the fact that he’s a Florida Gator. Almost.
So I was watching Twitter this morning in anticipation of interesting news from TechEd 2014. TechEd isn’t traditionally known as the place where Microsoft drops a lot of big, bold announcements so I wasn’t expecting too much. But then I saw Scott Hanselman post this little nugget:
Devs of #msteched: Everything changes in 90min. Join me at my Foundational Session at 11am for a parade of awesome and a big announcement.
— Scott Hanselman (@shanselman) May 12, 2014
Scott’s certainly not prone to baseless hyperbole… so this was interesting. A bit later, he followed up with this:
— Scott Hanselman (@shanselman) May 12, 2014
Okay, let’s go have a look. Hmm, yes… stuff from the last Build about native compilation, open language compilers, and better JIT… cloud-optimized CLR, that’s interesting… deploy my own CLR and .NET Framework with each app, okay that’s nice I guess WAIT WHAT.
Why in the world would I want to do that? (more on that in a moment)
Okay, moving on… VS.NET and IIS and self-host options, yes of course… what’s this? NuGet goo and .csproj are going away in favor of a project.json file?!?! Hellooooo, node and npm!
I’m a big fan of both node.js for building web-enabled servers and of Microsoft Azure for hosting them… and the combination of the two is pretty compelling, if not necessarily widespread (yet). I recently dug into the gory details of authenticating node applications against Azure Active Directory, and thought I’d walk through how to set this up. Now that it’s done it seems straightforward enough… but digging in at the outset was not for the faint of heart or weak of constitution. Nor was it particularly well-documented. Hopefully this post helps that a bit.
Web applications that expose sensitive data and/or functionality need to authenticate users against some security store. Hundreds of years ago cavemen rubbed sticks together, stored cleartext passwords in their database, and invoked all kinds of unholiness in the name of “security”. We’ve evolved a bit (thank goodness), and nowadays this typically involves federated identity and delegation of authentication details to trusted third parties like Facebook, Google, Twitter, or Microsoft, using identity protocols like WS-Federation, OpenID, and OAuth/OAuth2. We’ll see an example of how this works, below.
In this walkthrough I’ll specifically demonstrate the following:
To make the most of this, you’ll need basic comfort with both node.js and Azure.
Last night I worked my way through some dev tool updates and happened to install both SQL Server 2014 and the new Azure SDK 2.3. I prefer to host the databases for the storage emulator in my primary SQL Server instance instead of the default LocalDB, and had grown accustomed to running the command-line DSInit.exe tool to accomplish this.
It turns out that as of version 2.3, DSInit is no longer included in the Azure SDK tooling (a fact that might have been useful to include in the release notes). Instead, the storage emulator functionality has been unified into a single command-line executable, WAStorageEmulator.exe, found by default at c:\Program Files (x86)\Microsoft SDKs\Windows Azure\Storage Emulator. WAStorageEmulator includes switches for initializing the storage database, starting and stopping the emulator, etc.
No problem, I think to myself… I’ll just run the ‘init’ command for WAStorageEmulator.exe and point it toward my default SQL instance:
WAStorageEmulator.exe init -sqlinstance .
This immediately resulted in a crash… not very user-friendly. Debugging in Visual Studio showed the problem to be an unhandled ArgumentException… hmm, maybe it doesn’t like the period specifier for the local instance. Let’s try (local) instead:
WAStorageEmulator.exe init -sqlinstance (local)
Okay, no crash this time, but now the process just hangs and seemingly does nothing.