Automating Analyzing Tons of Minidump Files with WinDBG and PowerShell

When debugging a nasty problem in your code, one of the most helpful things you can get is a minidump. With that picture of what your app was doing at the time of the crash, hang, or when the memory started spiking, you’ve got a big hint to jumpstart your exploring. While there are plenty of tools out there to create minidumps, such as the wonderful ProcDump, and the debuggers themselves, the real moment of truth is when you have to look at those minidumps. That’s easy to do with one or two, but what happens if you have 200? In my line of work, where I debug others’ software problems (and will be glad to help you with yours), I’m routinely faced with hundreds of dumps from a client. As much as I would like to carefully open each minidump and lovingly type the same commands over and over for the greatest consulting billing statement ever, I just can’t get my clients to pay for that.

What I really need is a way to say, “Here’s a bunch of .DMP files; go run these WinDBG commands across all of them.” It turns out that accomplishing that basic task is not hard at all when you combine a little WinDBG knowledge with a little PowerShell magic. You can find all this goodness in my Get-DumpAnalysis cmdlet that’s part of my WintellectPowerShell open-source module. You can grab everything from GitHub: https://github.com/Wintellect/WintellectPowerShell. Before jumping into using the script, I need to talk about how it works. That way you’ll understand the usage better.

In order to “script” WinDBG so that it executes a set of commands from a file, the little-used $$< command does the trick. That command reads in a text file and executes each line in turn as though you had typed it into the command area. There are other variants of $$< you can look up in the WinDBG help that offer slightly different features, but this command is enough for my purposes.

If I wanted to run !analyze and get the list of loaded modules, the following debugging script text file, called BasicAnalysis.txt, shows those commands. Lines starting with an asterisk are treated as comments by $$<.

  * Run !analyze
  !analyze -v
  * Get the list of loaded modules
  lmv

Therefore, in the WinDBG command area, if you execute $$<BasicAnalysis.txt, you have your two commands run automatically. That solves the first step, but what would be nice is if we could tell WinDBG on startup that we want to run some commands immediately. Luckily for us, the -c command-line switch tells WinDBG to do exactly that. So to tie everything together, if I execute the following command, it will open the minidump (-z), execute the commands in my text file, and, by stacking the Q command, immediately quit.

  windbg -z test.dmp -c "$$<BasicAnalysis.txt;Q"

That’s a start, but there are two problems. The first is that I need to capture the output of the commands. The second is that WinDBG is a GUI application, so if I want to run a script across a bunch of minidumps, I’m going to have many instances of WinDBG starting and stopping all over the place. I can solve both of these problems the same way: don’t use WinDBG!

The solution is to use CDB.EXE, WinDBG’s console-only brother that’s installed into the same directory as WINDBG.EXE. Since almost no one knows about CDB.EXE, I didn’t want to put it in the title of the blog entry to avoid confusion. What’s wonderful about CDB.EXE is that it’s basically the exact same command window as WinDBG, with the exact same startup options, too, but strictly inside a console window. Thus, to execute the same command line with CDB, but redirect the output to a file, here you go:

  cdb -z test.dmp -c "$$<BasicAnalysis.txt;Q" > log.log

You might be wondering why I’m using output redirection instead of .logopen to do the logging to a file. It’s all about capturing the earliest output. If you use .logopen, you miss the first couple of lines of normal output, which contain some extremely important information, such as the operating system version the dump was created on. Also, tools like ProcDump put the command line they were executed with to create the minidump into the dump’s comments. Because those lines appear before .logopen has a chance to run, you miss them.

Now that you see how we can execute CDB to do the analysis work, let me mention some of the other requirements I wanted in Get-DumpAnalysis. The first was full pipeline support, so that if I have a bunch of .DMP files in a directory tree, I can process them all at once. The second was that the log file for each minidump be named <minidump>-<debuggingscript>.log and placed in the same directory as the minidump. That keeps everything in the same tree, and you can do multiple runs with different debugging scripts without losing any previous runs of Get-DumpAnalysis.
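To make the naming scheme concrete, here is a hypothetical sketch of how that <minidump>-<debuggingscript>.log name could be composed in PowerShell. This is my illustration of the convention, not code lifted from Get-DumpAnalysis, and the paths are made up:

```powershell
# Hypothetical sketch of the <minidump>-<debuggingscript>.log naming scheme;
# illustrative only, not the actual Get-DumpAnalysis implementation.
$dump   = Get-Item "C:\dumps\test.dmp"
$script = Get-Item ".\BasicAnalysis.txt"

# test.dmp + BasicAnalysis.txt -> C:\dumps\test-BasicAnalysis.log,
# kept right next to the minidump it came from.
$logName = "{0}-{1}.log" -f $dump.BaseName, $script.BaseName
$logFile = Join-Path -Path $dump.DirectoryName -ChildPath $logName
```

Because the script name is baked into the log name, running a second debugging script over the same tree produces a second set of logs instead of overwriting the first.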

There are three parameters to Get-DumpAnalysis:

  • Files: the file or pipeline set of files to process
  • DebuggingScript: the text file of CDB commands you want to run against each minidump
  • CdbProgramPath: by default, the script uses the CDB.EXE found in the PATH environment variable, but if you want to specify a different CDB.EXE, such as when analyzing 32-bit .NET dumps on your x64 workstation, list the full path in this parameter.
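Under the hood, the core step for each file boils down to the CDB command line shown earlier. Here is a minimal PowerShell sketch of that step, assuming CDB.EXE is on the PATH; the variable names are illustrative, and this is not the actual Get-DumpAnalysis implementation:

```powershell
# Minimal sketch (not the real Get-DumpAnalysis code) of running a
# debugging script against one dump with CDB and capturing the output.
$cdb        = "cdb.exe"                    # assumes CDB.EXE is on the PATH
$dump       = "C:\dumps\test.dmp"          # made-up example paths
$scriptFile = "C:\scripts\BasicAnalysis.txt"
$logFile    = "C:\dumps\test-BasicAnalysis.log"

# The backticks escape the dollar signs so PowerShell passes a literal
# $$< to the debugger instead of expanding the $$ variable.
& $cdb -z $dump -c "`$`$<$scriptFile;Q" > $logFile
```

Note the escaping: inside a double-quoted PowerShell string, $$ has its own meaning, so the backticks are what keep the $$< intact on its way to CDB.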

Once you’ve loaded WintellectPowerShell (Import-Module WintellectPowerShell for you PowerShell beginners), just pipeline your way to minidump analysis glory:

  Get-ChildItem -Path C:\dumps -Filter *.dmp -Recurse | Get-DumpAnalysis -DebuggingScript .\AnalysisModsAndVersion.txt

It’s not every day that I get to write a blog entry that works for .NET, C++, and even driver developers! After running a ton of minidumps through Get-DumpAnalysis, I’ve saved a lot of time for my clients. I hope you’ll find it useful as well. If you have any ideas or feedback, I’d love to hear it at the GitHub site for WintellectPowerShell. If you want to learn more about debugging better, check out Wintellect’s instructor-led training Debugging and Performance classes that can take your team to the superstar debugging level. We also offer those classes at WintellectNOW.

