Handy PowerShell script to crawl file shares and collect information on their structure


Hi Community,

I’ve been a bit busy lately (work and family related, with the arrival of my second child), but it’s always great to share information with you. A couple of weeks ago, I was working on an engagement to decide whether to move to the cloud or not (it was a no-brainer for me, taking into consideration my customer’s existing technology stack, but more importantly I needed to collect evidence and facts in order to help them make a decision).

My customer has quite a few file shares spread across Windows servers and multiple NAS devices, so I had to write a PowerShell script (see below) for this task. Personally, I’m not a fan of scripting languages, mainly because I expect my code to be compiled and linked, besides being able to tweak my code to perform as I want it to, but like I said, that’s my personal opinion.

I hope you’ll find it useful.



param([string]$targetServer)

$shareCount = 0
$startAnalysis = (Get-Date)
$loggedOnUser = (Get-WmiObject -Class Win32_ComputerSystem).UserName

Write-Host "**************************************************************************"
Write-Host "**        File Share Report - Shares on: $targetServer"
Write-Host "**        Currently logged on as: $loggedOnUser"
Write-Host "**        Analysis start: $startAnalysis"
Write-Host "**************************************************************************"

If (-Not [string]::IsNullOrEmpty($targetServer)) {
    $existingShares = Get-WmiObject -Class Win32_Share -ComputerName $targetServer `
        -Filter "Type=0 And Name Like '%[^$]' And Name <> 'NETLOGON' And Name <> 'SYSVOL'"

    foreach ($share in $existingShares) {
        $uncPath = "\\$targetServer\$($share.Name)"

        # Let's display information on file shares on the target server (summary)
        $stats = Get-ChildItem $uncPath -Recurse -ErrorAction SilentlyContinue |
            Where-Object { -Not $_.PSIsContainer } |
            Measure-Object -Property Length -Sum
        $results = New-Object -TypeName PSObject -Property @{
            ComputerName = $targetServer
            LocalPath    = $share.Path
            UNCPath      = $uncPath
            SizeKB       = [math]::Round($stats.Sum / 1KB, 2)
            NumberFiles  = $stats.Count
        }
        $results | Format-Table ComputerName, LocalPath, UNCPath, SizeKB, NumberFiles

        # Let's display a breakdown of file types and their sizes
        Get-ChildItem -Path $uncPath -Recurse -ErrorAction SilentlyContinue |
            Where-Object { !$_.PSIsContainer } |
            Group-Object Extension |
            Select-Object @{n='Extension';e={$_.Name -replace '^\.'}},
                @{n='Size(MB)';e={[math]::Round((($_.Group | Measure-Object Length -Sum).Sum / 1MB), 2)}},
                @{n='Average Size(MB)';e={[math]::Round((($_.Group | Measure-Object Length -Average).Average / 1MB), 2)}},
                @{n='Maximum Size(MB)';e={[math]::Round((($_.Group | Measure-Object Length -Maximum).Maximum / 1MB), 2)}},
                Count | Format-Table

        # Let's display a breakdown of files in the share by age bucket (based on last write time)
        $files = Get-ChildItem $uncPath -Recurse -ErrorAction SilentlyContinue |
            Where-Object { -Not $_.PSIsContainer } |
            Select-Object FullName, CreationTime, LastWriteTime, Length,
                @{n='Days';e={((Get-Date) - $_.LastWriteTime).Days}}
        $summary = @{
            Path      = $uncPath
            OverAYear = ($files | Where-Object {$_.Days -gt 365} | Measure-Object).Count
            _365Days  = ($files | Where-Object {$_.Days -gt 180 -AND $_.Days -le 365} | Measure-Object).Count
            _180Days  = ($files | Where-Object {$_.Days -gt 90 -AND $_.Days -le 180} | Measure-Object).Count
            _90Days   = ($files | Where-Object {$_.Days -gt 30 -AND $_.Days -le 90} | Measure-Object).Count
            _30Days   = ($files | Where-Object {$_.Days -gt 7 -AND $_.Days -le 30} | Measure-Object).Count
            _7Days    = ($files | Where-Object {$_.Days -gt 0 -AND $_.Days -le 7} | Measure-Object).Count
        }
        $ageBucket = New-Object -TypeName PSObject -Property $summary |
            Select-Object Path, OverAYear, _365Days, _180Days, _90Days, _30Days, _7Days
        [Console]::Write("`n*****************************`nFiles grouped by age (Count)`n*****************************`nPath: {0}`nOver a year: {1}`n365 days: {2}`n", $ageBucket.Path, $ageBucket.OverAYear, $ageBucket._365Days)
        [Console]::Write("180 days: {0}`n90 days: {1}`n30 days: {2}`n", $ageBucket._180Days, $ageBucket._90Days, $ageBucket._30Days)
        [Console]::Write("7 days: {0}`n", $ageBucket._7Days)
        $shareCount += 1
        Write-Host "------------------------------------------------------------------------------"
    }

    $endAnalysis = (Get-Date) - $startAnalysis
    Write-Host "Total Shares: $shareCount"
    Write-Host "Analysis Execution Time: $endAnalysis"
} Else {
    Write-Host "**  targetServer needs to be specified.    **" -ForegroundColor Red
}


.NET Crash Dump and live process analysis via clrmd

Application debugging and analysis can be a daunting task, even more so when neither source code nor symbols are available. Visual Studio provides developers with powerful debugging capabilities, but the problem developers often face is that Visual Studio is not installed on the target computer, which is fair enough if it is a production environment.

There are a few tools available, WinDbg being the most powerful and one of my favorites. WinDbg allows developers to debug native code (in kernel and user mode) and managed code through SOS (a debugging extension). The options available are more powerful than the ones provided by the Visual Studio debugger; however, it might not be very user friendly, for the following reasons:

  • SOS must be loaded via the CLR or mscorwks (depending on the version of the framework)
  • Commands must be typed into WinDbg to perform the analysis. These commands are powerful if the developer knows them and has a good understanding of how the CLR works; the only thing is that many of these commands’ names are not very user friendly, as shown below
.load C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\sos.dll

!dumpheap -type MyBusinessObject
PDB symbol for mscorwks.dll not loaded
 Address       MT     Size
027437e4 01d6683c       12
02743830 01d6683c       12
0274387c 01d6683c       12
...
02747d6c 01d6683c       12
02747db8 01d6683c       12
02747e04 01d6683c       12
02747e50 01d6683c       12
02747e9c 01d6683c       12
02747ee8 01d6683c       12
total 30 objects

      MT    Count    TotalSize Class Name
01d6683c       30          360 FinalizerProblem.MyBusinessObject
Total 30 objects

!gcroot 02747d6c
Note: Roots found on stacks may be false positives. Run "!help gcroot" for
more info.
Error during command: warning! Extension is using a feature which Visual Studio does not implement.
Scan Thread 7092 OSTHread 1bb4
Scan Thread 6864 OSTHread 1ad0
Finalizer queue:Root:02747d6c(FinalizerProblem.MyBusinessObject)

SyncBlocks to be cleaned up: 0
MTA Interfaces to be released: 0
STA Interfaces to be released: 0

generation 0 has 0 finalizable objects (002906d0->002906d0)
generation 1 has 36 finalizable objects (00290640->002906d0)
generation 2 has 0 finalizable objects (00290640->00290640)
Ready for finalization 69 objects (002906d0->002907e4)

      MT    Count    TotalSize Class Name
7b47f8f8        1           20 System.Windows.Forms.ApplicationContext
...
7910b694       10          160 System.WeakReference
7b47ff4c        4          224 System.Windows.Forms.Control+ControlNativeWindow
01d6683c       22          264 FinalizerProblem.MyBusinessObject
01d65a54        1          332 FinalizerProblem.Form1
7b4827e8        2          336 System.Windows.Forms.Button
7ae78e7c        8          352 System.Drawing.BufferedGraphics
...
Total 105 objects

There is a great article on MSDN on this subject – Debugging Managed Code using the Windows Debugger.

So, WinDbg is very powerful, but some developers might not find it user friendly. What options do we have then? Well, the good news is that Microsoft has produced and released a library to diagnose and analyze CLR applications, called ClrMD (CLR Memory Diagnostics). It is currently in beta and available to download from NuGet.


Image 1 – Install nuget package

Therefore, I have built a utility to showcase some of the features in the library. The utility is a WPF C# application which uses the ClrMD library and implements the MVVM pattern. The whole idea is to make developers’ lives easier, by providing an easy-to-use UI and encapsulating some of the SOS commands as operations that can be selected in the user interface.


Image 2 – Options available in the utility

The utility, as of now, provides only 3 operations:

  • Dump Heap
  • Heap Stats
  • Threads and StackTrace

These options can be expanded by making changes to the DebuggerOption ViewModel to add a new option, and by implementing the required code in the CorDbg.Operations class. The code for collecting thread and stack trace information from the attached process is shown below.

public ObservableCollection<Thread> GetThreadsAndStackTrace() {
    var retval = new ObservableCollection<Thread>();

    if (Session != null) {
        Session.Runtime.Threads.ToList().ForEach(x => {
            var newItem = new Thread() {
                ThreadId = string.Format("{0:X}", x.OSThreadId).Trim(),
                ThreadExecutionBlock = string.Format("{0:X}", x.Teb).Trim()
            };

            x.StackTrace.ToList().ForEach(z => newItem.StackTrace.Add(new StackTrace() {
                InstructionPtr = string.Format("{0,12:X}", z.InstructionPointer),
                StackPtr = string.Format("{0,12:X}", z.StackPointer),
                Method = (z.Method != null ? z.Method.GetFullSignature() : string.Empty)
            }));

            retval.Add(newItem);
        });
    }

    return retval;
}


The operation workflow is as follows:

  • Refresh the process list, if required (i.e., if the target application was launched after the utility was already running).
  • Select the mode to attach to the target process.
  • Select the operation and click the Go button… that simple!

In this example, the target application was another instance of Visual Studio.


Image 3 – Managed Heap Stats


Image 4 – Running threads and their stack traces

I hope you’ll find this utility useful, please feel free to download it and extend it.
[office src="https://onedrive.live.com/embed?cid=2FE1291768841ACE&resid=2FE1291768841ACE!5826&authkey=!ADRH2VtXOaWWuRU" width="98" height="120"]

NDepend 5.3

I have always considered software development an art and a science at the same time: developers translate business requirements into computer instructions, so creativity, innovation and technology are amalgamated into one. But the road from an idea’s inception to design and then development is long and far from perfect; hence we developers sometimes rush things to meet deadlines, or just build the functionality with the promise of refactoring and/or improving it later.

As a software developer and architect, I must conduct code reviews for new and existing codebases. One of the challenges I have always faced is the need to query codebases, in order to find dependencies and ensure that the code and/or changes being produced will not break OOP principles nor introduce dependency issues.

There are a few tools out there I use to check source code quality, but none of them is as powerful or flexible as NDepend. I have been an NDepend user for a long time now, and I even posted an article about it a few years back. Besides having a refreshed user interface, it now supports CQLinq as well as CQL (Code Query Language), the latter kept for legacy and compatibility reasons.
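To give a feel for what CQLinq looks like, here is a query modeled on one of NDepend’s stock rules, flagging overly long methods (the 30-line threshold is just a common convention, not a product default I can vouch for); it reads like ordinary LINQ over the code model:

```
// <Name>Avoid methods too big</Name>
warnif count > 0
from m in Application.Methods
where m.NbLinesOfCode > 30
orderby m.NbLinesOfCode descending
select new { m, m.NbLinesOfCode }
```

Queries like this can be edited and run live inside Visual Studio, with matches listed as rule violations.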

Once we start NDepend, the start page is displayed. From it, we can start analyzing our code as well as installing add-ins for Reflector and Visual Studio.


NDepend integrates smoothly into Visual Studio (since version 2008), making life easier for developers.


There are many features in the product, but just to mention a few:

  • Multi VS solutions wide-analysis and collaboration
  • Rich Code Search in VS
  • Multi Query Edition in VS
  • Reflector disassembly’s comparison
  • Continuous comparison with a base line in VS
  • Code visualization in VS
    - Dependency Matrix
    - Dependency Graph and Metric View


The Dashboard allows developers to take a quick look at how their codebase is structured, and more importantly, at the rule violations incurred in the code. Information is displayed in a succinct and clear way, not to mention the graphics that make our code easier to interpret.


I have briefly described some of the features and benefits of NDepend. Before I forget to mention it: NDepend is available to download as a 14-day trial, so you can use it and get a better idea of the product’s capabilities.



Read XML config files with Visual C++

Hi Community,

I am currently working on an add-in for Visual Studio to expose some functionality available in the Debugger Engine and Extension APIs. The development is in the very early stages, but I have already completed one feature required by the add-in: the ability to read XML configuration files, similarly to how .NET and the CLR do, but in this case using Visual C++. This add-in comprises native and managed code, where .NET helps me interact with the IDE and the core functionality of the add-in is encapsulated in a DLL. This blog entry describes the ConfigReader class.

A sample configuration file is shown below

<?xml version="1.0" encoding="utf-8"?>
<configuration>
    <sendOutputToVSWindow enabled="true" />
    <extensions>
        <extension name="ext" path="C:\Program Files (x86)\Windows Kits\8.1\Debuggers\x86\winext\ext.dll" />
        <extension name="wow64exts" path="C:\Program Files (x86)\Windows Kits\8.1\Debuggers\x86\WINXP\wow64exts.dll" />
        <extension name="exts" path="C:\Program Files (x86)\Windows Kits\8.1\Debuggers\x86\WINXP\exts.dll" />
        <extension name="uext" path="C:\Program Files (x86)\Windows Kits\8.1\Debuggers\x86\winext\uext.dll" />
        <extension name="ntsdexts" path="C:\Program Files (x86)\Windows Kits\8.1\Debuggers\x86\WINXP\ntsdexts.dll" />
    </extensions>
</configuration>

The configuration file must reside in the same folder as the library, and it is located from the constructor of the class

ConfigReader::ConfigReader() {
    ReadConfig();
}

void ConfigReader::ReadConfig() {
    if (!LocateConfigFile())
        throw std::exception("Config file not found. Unable to proceed.");
}


Since the DLL can be anywhere (unlike .NET, where it is either in the GAC or the application’s bin folder), I needed to find the configuration file in the same path as the library, and this is done by enumerating the loaded modules.

BOOL ConfigReader::LocateConfigFile() {
    auto retval = FALSE;
    DWORD nModuleCount = 0;
    HANDLE hToken = NULL, hProcess = NULL;
    HMODULE hLoadedModules[Max_Loaded_Modules];

    if ((hToken = GetThreadToken()) != NULL && SetPrivilege(hToken, SE_DEBUG_NAME, TRUE)) {
        if ((hProcess = OpenProcess(PROCESS_ALL_ACCESS, TRUE, GetCurrentProcessId())) != NULL) {
            if ((EnumProcessModulesEx(hProcess, hLoadedModules, sizeof(hLoadedModules), &nModuleCount, LIST_MODULES_ALL)) != NULL) {
                auto modules = std::vector<HMODULE>(std::begin(hLoadedModules), std::end(hLoadedModules));

#ifdef _WIN64
                nModuleCount = Item_Count(nModuleCount) / 2;
#else
                nModuleCount = Item_Count(nModuleCount);
#endif

                auto config = DoesConfigFileExist(hProcess, modules, TargetImageName);

                if (!config.empty())
                    retval = ParseConfigFile(config);
            }

            CloseHandle(hProcess);
        }

        SetPrivilege(hToken, SE_DEBUG_NAME, FALSE);
        CloseHandle(hToken);
    }

    return retval;
}



std::wstring ConfigReader::DoesConfigFileExist(const HANDLE& hProcess, std::vector<HMODULE>& hModules, const wchar_t* targetImage) {
    BOOL found = FALSE;
    std::wstring retval;
    wchar_t szDir[_MAX_DIR];
    wchar_t szExt[_MAX_EXT];
    wchar_t szBuffer[MAX_PATH];
    wchar_t szFName[_MAX_FNAME];
    wchar_t szDrive[_MAX_DRIVE];

    std::find_if(hModules.begin(), hModules.end(), [&, this](HMODULE hModule) {
        auto ret = FALSE;

        if (!found && hModule != nullptr && (GetModuleFileNameEx(hProcess, hModule, szBuffer, Array_Size(szBuffer))) != NULL) {
            size_t cntConverted;
            char szAnsiPath[MAX_PATH];
            _wsplitpath_s(szBuffer, szDrive, Array_Size(szDrive), szDir, Array_Size(szDir), szFName, Array_Size(szFName), szExt, Array_Size(szExt));
            auto imageName = std::wstring(szFName).append(szExt);
            auto configPath = std::wstring(szDrive).append(szDir).append(ConfigFileName);
            wcstombs_s(&cntConverted, szAnsiPath, configPath.data(), configPath.size());
            std::ifstream configFile(szAnsiPath);

            if (wcscmp(targetImage, imageName.data()) == 0 && configFile.good()) {
                retval = configPath;
                found = TRUE;
                ret = TRUE;
            }
        }

        return ret;
    });

    return retval;
}



HANDLE ConfigReader::GetThreadToken() {
    HANDLE retval = NULL;
    // Access rights needed by SetPrivilege below (the original post elided this declaration)
    const DWORD flags = TOKEN_ADJUST_PRIVILEGES | TOKEN_QUERY;

    if (!OpenThreadToken(GetCurrentThread(), flags, FALSE, &retval)) {
        retval = NULL;

        if (GetLastError() == ERROR_NO_TOKEN) {
            if (ImpersonateSelf(SecurityImpersonation) &&
                !OpenThreadToken(GetCurrentThread(), flags, FALSE, &retval))
                retval = NULL;
        }
    }

    return retval;
}


BOOL ConfigReader::SetPrivilege(HANDLE& hToken, LPCTSTR Privilege, BOOL bEnablePrivilege) {
    LUID luid;
    auto retval = FALSE;
    TOKEN_PRIVILEGES tp = {0};
    DWORD cb = sizeof(TOKEN_PRIVILEGES);

    if (LookupPrivilegeValue(NULL, Privilege, &luid)) {
        tp.PrivilegeCount = 1;
        tp.Privileges[0].Luid = luid;
        tp.Privileges[0].Attributes = bEnablePrivilege ? SE_PRIVILEGE_ENABLED : 0;
        AdjustTokenPrivileges(hToken, FALSE, &tp, cb, NULL, NULL);

        if (GetLastError() == ERROR_SUCCESS)
            retval = TRUE;
    }

    return retval;
}


A few things worth mentioning:

  • Always use STL containers and algorithms when possible (e.g., std::vector instead of a C-style array).
  • To query information about a process, it has to be opened, and most of the time (depending on what is required) a few flags must be set. Also remember to close any handle after it has been used.
  • Use std::wstring (Unicode) instead of wchar_t*; it is safer and easier to use.
  • Use lambdas in conjunction with algorithms (e.g., std::find_if).
  • Get to know the functions that convert between multibyte (ANSI) and wide-character (Unicode) strings (e.g., wcstombs_s).
  • Pass references (and use const whenever possible).

Once the configuration file is found, it is parsed using the MSXML parser, which is COM-based; therefore the use of smart pointers is strongly suggested.

BOOL ConfigReader::ParseConfigFile(const std::wstring& configFile) {
    auto retval = FALSE;
    VARIANT_BOOL success;
    IXMLDOMDocumentPtr pDocPtr;
    IXMLDOMNodePtr selectedNode;

    // The DOM document instance creation was elided in the original post;
    // MSXML 6 (DOMDocument60) is assumed here.
    if (FAILED(pDocPtr.CreateInstance(__uuidof(DOMDocument60))))
        return retval;

    if (SUCCEEDED(pDocPtr->load(_variant_t(configFile.c_str()), &success))) {
        if (SUCCEEDED(pDocPtr->selectSingleNode(_bstr_t(XmlRootNode), &selectedNode))) {
            ProcessElementRecursively(selectedNode);
            retval = TRUE;
        }
    }

    return retval;
}


void ConfigReader::ProcessElementRecursively(IXMLDOMNodePtr& node) {
    long childrenCount = 0;
    IXMLDOMNodePtr childNode;
    IXMLDOMNodeListPtr children;

    ExtractInformationFromElement(node);

    if (SUCCEEDED(node->get_childNodes(&children)) && SUCCEEDED(children->get_length(&childrenCount)) && childrenCount > 0) {
        for (auto nCount = 0; nCount < childrenCount; nCount++) {
            if (SUCCEEDED(children->get_item(nCount, &childNode))) {
                ProcessElementRecursively(childNode);
            }
        }
    }
}

void ConfigReader::ExtractInformationFromElement(IXMLDOMNodePtr& node) {
    size_t nSize;
    VARIANT value;
    std::wstring key;
    BSTR nodeContent;
    DOMNodeType nodeType;
    char szBuffer[MAX_PATH] = {0};

    if (SUCCEEDED(node->get_nodeType(&nodeType)) && nodeType == DOMNodeType::NODE_ELEMENT) {
        node->get_nodeName(&nodeContent);
        auto pElement = (IXMLDOMElementPtr)node;

        if (wcscmp(nodeContent, L"sendOutputToVSWindow") == 0) {
            pElement->getAttribute(_bstr_t(L"enabled"), &value);

            if (value.vt != VT_NULL)
                Properties.insert(std::make_pair(nodeContent, value.bstrVal));
        } else if (wcscmp(nodeContent, L"extension") == 0) {
            pElement->getAttribute(_bstr_t(L"name"), &value);

            if (value.vt != VT_NULL)
                key = value.bstrVal;

            pElement->getAttribute(_bstr_t(L"path"), &value);

            if (value.vt != VT_NULL && !key.empty()) {
                Properties.insert(std::make_pair(key.c_str(), value.bstrVal));
                wcstombs_s(&nSize, szBuffer, key.c_str(), key.size());
                std::string name(szBuffer);
                wcstombs_s(&nSize, szBuffer, value.bstrVal, wcslen(value.bstrVal));
                std::string path(szBuffer);
                m_extensions.push_back(ExtInformation(name, path));
            }
        }

        SysFreeString(nodeContent);
    }
}

Our ConfigReader object has two main fields (Extensions and Properties), depicted below.


We can also retrieve any property from the configuration file, in a similar way to how we do it in .NET. This is accomplished via the GetSetting method

const std::wstring ConfigReader::GetSetting(const wchar_t* key) {
    std::wstring retval;

    if (Properties.size() > 0 && key != nullptr && wcslen(key) > 0) {
        typedef std::pair<const std::wstring, const std::wstring> item;

        std::find_if(Properties.begin(), Properties.end(), [&](item i) {
            auto ret = FALSE;

            if (retval.size() == 0) {
                if (wcscmp(i.first.data(), key) == 0) {
                    retval = i.second;
                    ret = TRUE;
                }
            }

            return ret;
        });
    }

    return retval;
}





Visual Studio “14” CTP – General Availability

Hi Community,

Microsoft has announced the general availability of Visual Studio “14” CTP today. There are quite a few interesting features, in both native and managed languages.

Please find below some resources about this release:




Changes to the Storage Emulator in Azure SDK 2.3


I have always initialized my Azure Storage Emulator database on my local SQL Server instance by executing the DSInit command-line tool, but that has changed with the latest release of the SDK that came out last week (3rd April 2014).


Why? Well… Microsoft has decided to deprecate it, that simple. So if you find yourself in the same situation I’ve just found myself in, the new command that can save your life is “WAStorageEmulator.exe”, which can be found in “C:\Program Files (x86)\Microsoft SDKs\Windows Azure\Storage Emulator”. There’s something else Microsoft has deprecated, too: the UI for the storage emulator; now all we have is a command line.
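For reference, managing the emulator from the new location looks roughly like this (run from an elevated prompt; this is from memory, so double-check the exact sub-commands with WAStorageEmulator.exe /?):

```
cd "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\Storage Emulator"
WAStorageEmulator.exe start
WAStorageEmulator.exe status
WAStorageEmulator.exe stop
```

The init sub-command is what replaces the old DSInit behaviour of creating the backing database.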