<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.10.0">Jekyll</generator><link href="https://www.woodcp.com/feed.xml" rel="self" type="application/atom+xml" /><link href="https://www.woodcp.com/" rel="alternate" type="text/html" /><updated>2026-03-02T20:43:18+00:00</updated><id>https://www.woodcp.com/feed.xml</id><title type="html">Wood Consulting Partners</title><subtitle>Expert technology consulting, solution architecture, and custom development services</subtitle><author><name>Todd A. Wood</name><uri>https://www.woodcp.com</uri></author><entry><title type="html">React UI Library Comparison: 8 Libraries, 1 Dashboard, Side by Side</title><link href="https://www.woodcp.com/2026/03/react-ui-library-comparison/" rel="alternate" type="text/html" title="React UI Library Comparison: 8 Libraries, 1 Dashboard, Side by Side" /><published>2026-03-02T06:00:00+00:00</published><updated>2026-03-02T06:00:00+00:00</updated><id>https://www.woodcp.com/2026/03/react-ui-library-comparison</id><content type="html" xml:base="https://www.woodcp.com/2026/03/react-ui-library-comparison/"><![CDATA[<p>I wanted a definitive answer to a question that comes up at the start of every React project: <strong>which UI component library should we use?</strong></p>

<p>Blog posts and Reddit threads give opinions. I wanted code. So I had Claude Code build the same dashboard — stat cards, charts, data tables, activity feeds, dark mode toggle — eight times, once with each of the most popular React UI libraries available today. Same Vite + React 19 + TypeScript + Recharts stack across the board. The only variable is the UI library.</p>

<p>The result is a like-for-like set of reference implementations I can pull from when starting future projects, instead of wondering “what does a data table actually look like in Mantine vs. MUI?”</p>

<h2 id="the-libraries">The Libraries</h2>

<p>Here’s what made the cut, with production bundle sizes from the identical dashboard build:</p>

<table>
  <thead>
    <tr>
      <th>Library</th>
      <th>Styling Approach</th>
      <th>Bundle Size</th>
      <th>Dark Mode</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td><strong>Material UI (MUI) v6</strong></td>
      <td>CSS-in-JS (Emotion)</td>
      <td>709 KB</td>
      <td>Built-in</td>
    </tr>
    <tr>
      <td><strong>Tailwind CSS v4</strong></td>
      <td>Utility classes</td>
      <td>573 KB</td>
      <td>Manual</td>
    </tr>
    <tr>
      <td><strong>shadcn/ui</strong></td>
      <td>Tailwind + Radix</td>
      <td>577 KB</td>
      <td>Built-in</td>
    </tr>
    <tr>
      <td><strong>Ant Design v5</strong></td>
      <td>CSS-in-JS</td>
      <td>1.3 MB</td>
      <td>Built-in</td>
    </tr>
    <tr>
      <td><strong>Chakra UI v3</strong></td>
      <td>CSS-in-JS (Emotion)</td>
      <td>793 KB</td>
      <td>Built-in</td>
    </tr>
    <tr>
      <td><strong>Mantine v8</strong></td>
      <td>CSS Modules + PostCSS</td>
      <td>865 KB</td>
      <td>Built-in</td>
    </tr>
    <tr>
      <td><strong>React Bootstrap v2</strong></td>
      <td>Bootstrap CSS</td>
      <td>821 KB</td>
      <td>Built-in</td>
    </tr>
    <tr>
      <td><strong>HeroUI (NextUI) v2</strong></td>
      <td>Tailwind + CSS-in-JS</td>
      <td>1.1 MB</td>
      <td>Built-in</td>
    </tr>
  </tbody>
</table>

<h2 id="what-stood-out">What Stood Out</h2>

<p><strong>Bundle size tells a clear story.</strong> Tailwind CSS (573 KB) and shadcn/ui (577 KB) produce the smallest builds by a wide margin. Utility CSS tree-shakes aggressively. Ant Design at 1.3 MB is more than double — its CSS-in-JS runtime and comprehensive component set add real weight.</p>
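<p>For the curious, the numbers in the table come from measuring each production build. Here is a quick sketch for comparing them, assuming a layout with one folder per library, each containing a Vite <code>dist/</code> output (the directory names are illustrative, not the repo’s actual structure):</p>

```shell
# Print the production build size for each dashboard implementation.
# Assumes one folder per library, each with a Vite `dist/` directory
# (names are an assumption for illustration).
report_sizes() {
  for dist in "$1"/*/dist; do
    [ -d "$dist" ] || continue
    printf '%s\t%s KB\n' "$(basename "$(dirname "$dist")")" \
      "$(du -sk "$dist" | cut -f1)"
  done
}

report_sizes .   # run from the folder holding the eight dashboards
```

<p>Apparent size differences below a few kilobytes are noise; the interesting gaps, like Tailwind vs. Ant Design, show up at the hundreds-of-kilobytes scale regardless of how you measure.</p>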

<p><strong>shadcn/ui is the sweet spot for most new projects.</strong> Near-smallest bundle, beautiful defaults, you own the component code, built on Radix for accessibility. It’s rapidly becoming the de facto standard for modern React apps, and after building this comparison I understand why.</p>

<p><strong>MUI is still the safe enterprise pick.</strong> The most mature, battle-tested option with the largest ecosystem. If Material Design aesthetic works for your use case, the documentation and community support are unmatched.</p>

<p><strong>Mantine is the hidden gem.</strong> 100+ components, 50+ hooks, form handling, notifications, rich text editor — all free, no paid tiers. CSS Modules instead of CSS-in-JS means zero runtime overhead. If you need MUI-level completeness with better performance, Mantine deserves a serious look.</p>

<p><strong>React Bootstrap shows its age.</strong> Still the fastest path for teams with Bootstrap experience, but the aesthetic feels dated and the component set is limited compared to modern alternatives.</p>

<h2 id="recommendations-by-use-case">Recommendations by Use Case</h2>

<ul>
  <li><strong>Enterprise / Internal Tools</strong> → MUI or Ant Design for comprehensive component sets and enterprise support</li>
  <li><strong>Modern SaaS / Startup</strong> → shadcn/ui for owned components, beautiful defaults, tiny bundle</li>
  <li><strong>Performance-Critical</strong> → Tailwind CSS or shadcn/ui for smallest bundles, zero CSS-in-JS runtime</li>
  <li><strong>Full-Featured + Free</strong> → Mantine for the most complete free library available</li>
  <li><strong>Rapid Prototyping</strong> → Mantine or React Bootstrap for familiar patterns and fast iteration</li>
  <li><strong>Design-Forward Consumer App</strong> → HeroUI or Chakra UI for polished aesthetics and smooth animations</li>
</ul>

<h2 id="the-process">The Process</h2>

<p>The whole comparison — eight fully functional dashboard implementations, side-by-side screenshots in light and dark mode, and a detailed report — took about an afternoon with Claude Code doing the heavy lifting. Each implementation runs on its own port (5001-5008) so you can compare them live in the browser.</p>

<p>Having identical reference implementations removes the guesswork. Next time a project needs a data table with sorting, or a dashboard layout with responsive cards, I have working code in eight different libraries to pull from instead of reading docs and hoping the API works how I think it does.</p>

<p>The full report with screenshots and detailed pros/cons analysis for each library is available at the <a href="https://github.com/woodcp/react-ui-examples">project repository</a>.</p>]]></content><author><name>Todd A. Wood</name><uri>https://www.woodcp.com</uri></author><category term="development" /><category term="react" /><category term="ui" /><category term="typescript" /><category term="frontend" /><category term="architecture" /><summary type="html"><![CDATA[I wanted a definitive answer to a question that comes up at the start of every React project: which UI component library should we use?]]></summary></entry><entry><title type="html">Clipboard Images in Claude Code CLI</title><link href="https://www.woodcp.com/2026/01/clipboard-images-in-claude-code-cli/" rel="alternate" type="text/html" title="Clipboard Images in Claude Code CLI" /><published>2026-01-12T06:00:00+00:00</published><updated>2026-01-12T06:00:00+00:00</updated><id>https://www.woodcp.com/2026/01/clipboard-images-in-claude-code-cli</id><content type="html" xml:base="https://www.woodcp.com/2026/01/clipboard-images-in-claude-code-cli/"><![CDATA[<p>Anyone using Claude Code CLI in a terminal has probably hit the friction of sharing an image with it to see a problem. I would have to take a screenshot, save it somewhere, get the file path, and tell Claude in the prompt to look at it.</p>

<p>On Windows (and other platforms) you can grab images directly off the clipboard. That finally clicked for me today, and I set up a custom Claude command with Claude’s help.</p>

<h2 id="setup">Setup</h2>

<p>Create two files in your <code class="language-plaintext highlighter-rouge">~/.claude/</code> folder:</p>

<p><strong>commands/clip.md</strong></p>
<div class="language-markdown highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="nn">---</span>
<span class="na">allowed-tools</span><span class="pi">:</span> <span class="s">Bash(powershell.exe:*), Read</span>
<span class="na">description</span><span class="pi">:</span> <span class="s">Grab an image from clipboard for analysis</span>
<span class="nn">---</span>

Capture the clipboard image by running the clipboard-image.ps1 script:

powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:<span class="se">\U</span>sers<span class="se">\Y</span>OUR_USERNAME<span class="se">\.</span>claude<span class="se">\c</span>lipboard-image.ps1"

The script outputs the saved image path (with timestamp) to stdout. Read the image file at that path to analyze it.

After analyzing the image, address the user's request: $ARGUMENTS
</code></pre></div></div>

<p><strong>clipboard-image.ps1</strong></p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c"># clipboard-image.ps1</span><span class="w">
</span><span class="c"># Saves clipboard image to temp folder with timestamp</span><span class="w">

</span><span class="n">Add-Type</span><span class="w"> </span><span class="nt">-AssemblyName</span><span class="w"> </span><span class="nx">System.Windows.Forms</span><span class="w">

</span><span class="nv">$img</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">[</span><span class="n">System.Windows.Forms.Clipboard</span><span class="p">]::</span><span class="n">GetImage</span><span class="p">()</span><span class="w">

</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="nv">$img</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w">
    </span><span class="nv">$timestamp</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="n">Get-Date</span><span class="w"> </span><span class="nt">-Format</span><span class="w"> </span><span class="s2">"yyyyMMdd_HHmmss"</span><span class="w">
    </span><span class="nv">$path</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"</span><span class="nv">$</span><span class="nn">env</span><span class="p">:</span><span class="nv">TEMP</span><span class="s2">\clipboard_</span><span class="nv">$timestamp</span><span class="s2">.png"</span><span class="w">
    </span><span class="nv">$img</span><span class="o">.</span><span class="nf">Save</span><span class="p">(</span><span class="nv">$path</span><span class="p">)</span><span class="w">
    </span><span class="n">Write-Host</span><span class="w"> </span><span class="nv">$path</span><span class="w">
</span><span class="p">}</span><span class="w"> </span><span class="kr">else</span><span class="w"> </span><span class="p">{</span><span class="w">
    </span><span class="n">Write-Error</span><span class="w"> </span><span class="s2">"No image in clipboard"</span><span class="w">
    </span><span class="kr">exit</span><span class="w"> </span><span class="mi">1</span><span class="w">
</span><span class="p">}</span><span class="w">
</span></code></pre></div></div>

<h2 id="usage">Usage</h2>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>/clip can you fix the button alignment shown here to be below the cancel button
</code></pre></div></div>

<h2 id="what-happens">What Happens</h2>

<ol>
  <li>Claude runs the PowerShell script to capture the clipboard image</li>
  <li>Saves it to <code class="language-plaintext highlighter-rouge">%TEMP%\clipboard_TIMESTAMP.png</code></li>
  <li>Reads the image file</li>
  <li>Analyzes it and responds to your prompt</li>
</ol>

<h2 id="try-it">Try It</h2>

<ol>
  <li>Copy an image to your clipboard (e.g., take a screenshot with <code class="language-plaintext highlighter-rouge">Win+Shift+S</code>)</li>
  <li>Type <code class="language-plaintext highlighter-rouge">/clip describe what you see</code></li>
</ol>

<p>The command is stored at <code class="language-plaintext highlighter-rouge">~/.claude/commands/clip.md</code> so it works in any Claude Code session.</p>

<h2 id="update-january-13-2026">Update: January 13, 2026</h2>

<p>I refactored to use an external PowerShell script instead of an inline command. On Windows, being more explicit with <code class="language-plaintext highlighter-rouge">powershell.exe</code> and the full script path prevents Claude from iterating on a working script each time the command runs.</p>

<p>Real-world test: My wife had knee surgery yesterday and I copied one of the doctor-supplied arthroscopic images to the clipboard:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>❯ /clip what is this an image of

● Bash(powershell.exe -NoProfile -ExecutionPolicy Bypass -File "...")
  ⎿  C:\Users\toddw\AppData\Local\Temp\clipboard_20260113_103327.png

● Read(~\AppData\Local\Temp\clipboard_20260113_103327.png)
  ⎿  Read image (1.5MB)

● Arthroscopic surgery image. Looks like a view inside a joint
  (likely shoulder or knee) through a scope. You can see:

  - Smooth cartilage surface (left side, cream-colored dome)
  - Torn/damaged tissue (white fibrous material in center)
  - Surgical instrument (metal probe on right)
  - Typical circular arthroscope view with black border

  Possibly a labral tear repair or rotator cuff procedure based
  on the anatomy visible.
</code></pre></div></div>

<p>Claude nailed it.</p>]]></content><author><name>Todd A. Wood</name><uri>https://www.woodcp.com</uri></author><category term="ai" /><category term="development" /><category term="claude" /><category term="cli" /><category term="productivity" /><summary type="html"><![CDATA[Anyone using Claude Code CLI in a terminal has probably hit the friction of sharing an image with it to see a problem. I would have to take a screenshot, save it somewhere, get the file path, and tell Claude in the prompt to look at it.]]></summary></entry><entry><title type="html">How-To Fork Git Repositories on Visual Studio Online</title><link href="https://www.woodcp.com/2014/01/how-to-fork-git-repositories-on-visual-studio-online/" rel="alternate" type="text/html" title="How-To Fork Git Repositories on Visual Studio Online" /><published>2014-01-03T06:00:00+00:00</published><updated>2014-01-03T06:00:00+00:00</updated><id>https://www.woodcp.com/2014/01/how-to-fork-git-repositories-on-visual-studio-online</id><content type="html" xml:base="https://www.woodcp.com/2014/01/how-to-fork-git-repositories-on-visual-studio-online/"><![CDATA[<h2>Visual Studio Online</h2>
<p>Microsoft rebranded Team Foundation Service as
    <a title="http://www.visualstudio.com/products/visual-studio-online-overview-vs"
        href="http://www.visualstudio.com/products/visual-studio-online-overview-vs" target="_blank">Visual Studio Online</a> with the launch of Visual Studio 2013 last fall. Visual Studio Online enables team collaboration
    using either Git or TFS source code control directly inside Visual Studio 2013, web browser, or the command line. Microsoft
    is entering a space dominated by
    <a href="https://github.com/" target="_blank">GitHub</a> &amp;
    <a href="https://bitbucket.org/" target="_blank">Bitbucket</a>, and developers will seek familiar workflows and features offered by these platforms.</p>
<h2>GitHub Fork</h2>
<p>GitHub established the paradigm of forking, upstream repositories, and pull requests for a lot of developers.
    <a href="http://stackoverflow.com/users/6309/vonc"
        target="_blank">VonC</a> put together a nice graphic to illustrate the relationship of repositories in his response to a
    <a href="http://stackoverflow.com/questions/6286571/git-fork-is-git-clone"
        target="_blank">Stack Overflow question</a> on the topic.</p>
<p>
    <a href="http://stackoverflow.com/questions/6286571/git-fork-is-git-clone" target="_blank">
        <img title="Fork" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; float: none; padding-top: 0px; padding-left: 0px; margin-left: auto; display: block; padding-right: 0px; border-top-width: 0px; margin-right: auto"
            border="0" alt="Fork" src="/assets/images/Fork.png" width="450" height="469">
    </a>
</p>
<p>GitHub simplifies the process of forking an upstream repository with the simple click of the Fork button:</p>
<p>
    <img title="ForkButtonCropped" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; float: none; padding-top: 0px; padding-left: 0px; margin-left: auto; display: block; padding-right: 0px; border-top-width: 0px; margin-right: auto"
        border="0" alt="ForkButtonCropped" src="/assets/images/ForkButtonCropped.png" width="533" height="188">
</p>
<h2>Visual Studio Online Fork</h2>
<p>Visual Studio Online might not currently have a Fork button but the process of forking Git repositories between accounts
    is quite straightforward and involves:</p>
<ol>
    <li>Creating the target project to contain the fork repository</li>
    <li>Cloning the upstream repository</li>
    <li>Adding a remote origin link to the fork</li>
    <li>Pushing the cloned repository to the fork origin</li>
</ol>
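<p>Before walking through the Visual Studio Online screens, the four steps can be rehearsed end to end against two local bare repositories standing in for the upstream and fork projects (all paths and the seed commit here are illustrative, not from the original VSO session):</p>

```shell
# Simulate the fork workflow: two local bare repos stand in for the
# Visual Studio Online upstream and fork repositories.
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/upstream.git"
git init -q --bare "$tmp/fork.git"            # 1. create the empty fork repo

# Seed the upstream with one commit so there is something to fork.
git clone -q "$tmp/upstream.git" "$tmp/seed"
( cd "$tmp/seed" && echo demo > README.md && git add . &&
  git -c user.name=demo -c user.email=demo@example.com commit -qm initial &&
  git push -q origin HEAD )

# 2. clone the upstream, naming the remote "upstream"
git clone -q --origin=upstream "$tmp/upstream.git" "$tmp/work"
cd "$tmp/work"
git remote add origin "$tmp/fork.git"         # 3. the fork becomes "origin"
git push -q -u origin --all                   # 4. publish everything to the fork
git remote -v                                 # shows both upstream and origin
```

<p>Swap the <code>$tmp</code> paths for the two visualstudio.com clone URLs and the commands are exactly the ones used in the sections that follow.</p>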
<p>In this scenario, I will fork a Git repository named
    <em>VisualStudioOnlineFork</em> from an upstream master repository located in
    <em>woodcp.visualstudio.com</em> to a fork project located in
    <em>toddwood.visualstudio.com</em> using the owner account of the fork repository. The fork project owner account had access
    granted to the upstream repository by the upstream owner account.</p>
<p>
    <a href="/assets/images/UpstreamRepo.png">
        <img title="UpstreamRepo" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px"
            border="0" alt="UpstreamRepo" src="/assets/images/UpstreamRepo_thumb.png" width="644" height="398">
    </a>
</p>
<h3>Create Fork Repository</h3>
<p>Login to the fork account and create a new team project named the same as the upstream repository. Technically the names
    do not need to match but that is the convention established by the GitHub Fork button.</p>
<p>
    <a href="/assets/images/CreateTargetFork.png">
        <img title="CreateTargetFork" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px"
            border="0" alt="CreateTargetFork" src="/assets/images/CreateTargetFork_thumb.png" width="644" height="442">
    </a>
</p>
<p>Once the project is created, Visual Studio Online will display the empty repository and instructions on how to set the remote
    and push the upstream or any existing repository:</p>
<p>
    <a href="/assets/images/BareRepository.png">
        <img title="BareRepository" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px"
            border="0" alt="BareRepository" src="/assets/images/BareRepository_thumb.png" width="552" height="484">
    </a>
</p>
<h3>Clone Upstream Repository</h3>
<p>With the target and empty fork repository created, we can clone the upstream repository to our local computer. Access the
    upstream repository in Visual Studio Online if you need to identify the clone URL:</p>
<p>
    <a href="/assets/images/CloneUrl.png">
        <img title="CloneUrl" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px"
            border="0" alt="CloneUrl" src="/assets/images/CloneUrl_thumb.png" width="552" height="484">
    </a>
</p>
<p>A nice feature of the
    <font face="Courier New">git clone</font> command is to set the origin name to upstream when cloning the master repository using the
    <font face="Courier New">--origin=upstream</font> directive:</p>
<pre>C:\&gt; mkdir fork
    Directory: C:\
Mode                LastWriteTime     Length Name
----                -------------     ------ ----
d----          1/3/2014   8:58 PM            fork

C:\&gt; cd fork

C:\fork&gt; git clone --origin=upstream https://woodcp.visualstudio.com/DefaultCollection/_git/VisualStudioOnlineFork
Cloning into 'VisualStudioOnlineFork'...
remote:
remote:                    fTfs
remote:                  fSSSSSSSs
remote:                fSSSSSSSSSS
remote: TSSf         fSSSSSSSSSSSS
remote: SSSSSF     fSSSSSSST SSSSS
remote: SSfSSSSSsfSSSSSSSt   SSSSS
remote: SS  tSSSSSSSSSs      SSSSS
remote: SS   fSSSSSSST       SSSSS
remote: SS fSSSSSFSSSSSSf    SSSSS
remote: SSSSSST    FSSSSSSFt SSSSS
remote: SSSSt        FSSSSSSSSSSSS
remote:                FSSSSSSSSSS
remote:                  FSSSSSSs
remote:                    FSFs    (TM)
remote:
remote:  Microsoft (R) Visual Studio (R) Team Foundation Server
remote:
Unpacking objects: 100% (3/3), done.
Checking connectivity... done
</pre>
<h3>Configure Remote Origin</h3>
<p>At this point, we have cloned the repository with only a remote named upstream pointing to the master repository. Referencing
    the bare repository instructions in the fork repository, we add the remote origin pointing to the fork repository location:</p>
<pre>C:\fork&gt; cd .\VisualStudioOnlineFork

C:\fork\VisualStudioOnlineFork [master]&gt; git remote add origin https://toddwood.visualstudio.com/DefaultCollection/_git/VisualStudioOnlineFork
</pre>
<h3>Push to Fork</h3>
<p>Once again, using the bare repository instructions we perform a
    <font face="Courier New">git push -u origin --all</font> to complete the fork process:</p>
<pre>C:\fork\VisualStudioOnlineFork [master]&gt; git push -u origin --all
Counting objects: 3, done.
Writing objects: 100% (3/3), 250 bytes | 0 bytes/s, done.
Total 3 (delta 0), reused 0 (delta 0)
remote: Analyzing objects (3/3) (2 ms)
remote: Storing pack file and index...  done (800 ms)
To https://toddwood.visualstudio.com/DefaultCollection/_git/VisualStudioOnlineFork
 * [new branch]      master -&gt; master
Branch master set up to track remote branch master from origin.
</pre>
<p>The fork of the upstream repository is now complete and available in Visual Studio Online:</p>
<p>
    <a href="/assets/images/ForkRefreshView.png">
        <img title="ForkRefreshView" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px"
            border="0" alt="ForkRefreshView" src="/assets/images/ForkRefreshView_thumb.png" width="644" height="322">
    </a>
</p>
<h2>Visual Studio Online Merge Upstream Changes</h2>
<p>As time progresses the upstream repository will diverge from the forked repository. The fork maintainer can easily merge
    the upstream changes by fetching from upstream, merging master, and pushing to origin. This process is no different than
    when dealing with GitHub hosted repositories.</p>
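<p>Condensed, the sync is just fetch, merge, push. This sketch rehearses it end to end with throwaway local repositories standing in for the Visual Studio Online URLs (all paths and commits are illustrative):</p>

```shell
# Rehearse the downstream merge: the upstream repo gains a commit and
# the fork is brought up to date with fetch / merge / push.
set -e
base=$(mktemp -d)
git init -q --bare "$base/upstream.git"
git init -q --bare "$base/fork.git"

# Seed the upstream and set up the fork clone with both remotes.
git clone -q "$base/upstream.git" "$base/seed"
( cd "$base/seed" && echo one > README.md && git add . &&
  git -c user.name=demo -c user.email=demo@example.com commit -qm one &&
  git push -q origin HEAD )
git clone -q --origin=upstream "$base/upstream.git" "$base/local"
cd "$base/local"
git remote add origin "$base/fork.git"
git push -q -u origin --all

# Upstream moves ahead...
( cd "$base/seed" && echo two >> README.md && git add . &&
  git -c user.name=demo -c user.email=demo@example.com commit -qm two &&
  git push -q origin HEAD )

# ...and the fork maintainer merges it downstream.
git fetch -q upstream
git merge -q "upstream/$(git rev-parse --abbrev-ref HEAD)"
git push -q origin HEAD
```
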
<h3>Upstream Changes</h3>
<pre>C:\WCP\VisualStudioOnlineFork [master]&gt; echo Downstream Merge &gt;&gt; .\README.md
C:\WCP\VisualStudioOnlineFork [master +0 ~1 -0]&gt; echo MIT &gt;LICENSE.md
C:\WCP\VisualStudioOnlineFork [master +1 ~1 -0 !]&gt; git add -A
C:\WCP\VisualStudioOnlineFork [master +1 ~1 -0]&gt; git commit -m "Added LICENSE and updated README"
[master 7f7d2ab] Added LICENSE and updated README
 2 files changed, 0 insertions(+), 0 deletions(-)
 create mode 100644 LICENSE.md
C:\WCP\VisualStudioOnlineFork [master]&gt; git push
Counting objects: 6, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (3/3), done.
Writing objects: 100% (4/4), 368 bytes | 0 bytes/s, done.
Total 4 (delta 0), reused 0 (delta 0)
remote: Analyzing objects (4/4) (2 ms)
remote: Storing pack file and index...  done (374 ms)
To https://woodcp.visualstudio.com/DefaultCollection/_git/VisualStudioOnlineFork
   d2c3b1a..7f7d2ab  master -&gt; master
C:\WCP\VisualStudioOnlineFork [master]&gt;
</pre>
<h3>Downstream Merge</h3>
<pre>C:\WCP\VisualStudioOnlineFork [master]&gt; cd C:\fork\VisualStudioOnlineFork
C:\fork\VisualStudioOnlineFork [master]&gt; git fetch upstream
remote: Microsoft (R) Visual Studio (R) Team Foundation Server
Unpacking objects: 100% (4/4), done.
From https://woodcp.visualstudio.com/DefaultCollection/_git/VisualStudioOnlineFork
   d2c3b1a..7f7d2ab  master     -&gt; upstream/master
C:\fork\VisualStudioOnlineFork [master]&gt; git merge master upstream/master
Updating d2c3b1a..7f7d2ab
Fast-forward
 LICENSE.md | Bin 0 -&gt; 12 bytes
 README.md  | Bin 26 -&gt; 64 bytes
 2 files changed, 0 insertions(+), 0 deletions(-)
 create mode 100644 LICENSE.md
C:\fork\VisualStudioOnlineFork [master]&gt; git push
Counting objects: 6, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (3/3), done.
Writing objects: 100% (4/4), 368 bytes | 0 bytes/s, done.
Total 4 (delta 0), reused 0 (delta 0)
remote: Analyzing objects (4/4) (2 ms)
remote: Storing pack file and index...  done (496 ms)
To https://toddwood.visualstudio.com/DefaultCollection/_git/VisualStudioOnlineFork
   d2c3b1a..7f7d2ab  master -&gt; master
C:\fork\VisualStudioOnlineFork [master]&gt;
</pre>
<h2>Conclusion</h2>
<p>I am a big fan of GitHub, but when your clients decide to use Visual Studio Online, that is what you use. However, this
    does not mean you need to abandon Git repository workflows and setups that are proven to work.</p>
<p>I have been using Visual Studio Online for the past three months without any complaints. The integration with Visual Studio
    2013 is very nice and the agile management online is well done. The lack of a Fork button on Visual Studio Online is
    a minor inconvenience considering the process of forking repositories is still possible as I demonstrated.</p>]]></content><author><name>Todd A. Wood</name><uri>https://www.woodcp.com</uri></author><category term="git" /><category term="github" /><category term="VSO" /><summary type="html"><![CDATA[Visual Studio Online Microsoft rebranded Team Foundation Service as Visual Studio Online with the launch of Visual Studio 2013 last fall. Visual Studio Online enables team collaboration using either Git or TFS source code control directly inside Visual Studio 2013, web browser, or the command line. Microsoft is entering a space dominated by GitHub &amp; Bitbucket, and developers will seek familiar workflows and features offered by these platforms. GitHub Fork GitHub established the paradigm of forking, upstream repositories, and pull requests for a lot of developers. VonC put together a nice graphic to illustrate the relationship of repositories in his response to a Stack Overflow question on the topic. GitHub simplifies the process of forking an upstream repository with the simple click of the Fork button: Visual Studio Online Fork Visual Studio Online might not currently have a Fork button but the process of forking Git repositories between accounts is quite straightforward and involves: Creating the target project to contain the fork repository Cloning the upstream repository Adding a remote origin link to the fork Pushing the cloned repository to the fork origin In this scenario, I will fork a Git repository named VisualStudioOnlineFork from an upstream master repository located in woodcp.visualstudio.com to a fork project located in toddwood.visualstudio.com using the owner account of the fork repository. The fork project owner account had access granted to the upstream repository by the upstream owner account. 
Create Fork Repository Login to the fork account and create a new team project named the same as the upstream repository. Technically the names do not need to match but that is the convention established by the GitHub Fork button. Once the project is created, Visual Studio Online will display the empty repository and instructions on how to set the remote and push the upstream or any existing repository: Clone Upstream Repository With the target and empty fork repository created, we can clone the upstream repository to our local computer. Access the upstream repository in Visual Studio Online if you need&nbsp; to identify the clone Url: A nice feature of the git clone command is to set the origin name to upstream when cloning the master repository using the --origin=upstream directive: C:\&gt; mkdir fork Directory: C:\ Mode LastWriteTime Length Name ---- ------------- ------ ---- d---- 1/3/2014 8:58 PM fork]]></summary></entry><entry><title type="html">Git Unite - Fix Case Sensitive File Paths on Windows</title><link href="https://www.woodcp.com/2013/01/git-unite-fix-case-sensitive-file-paths-on-windows/" rel="alternate" type="text/html" title="Git Unite - Fix Case Sensitive File Paths on Windows" /><published>2013-01-13T06:00:00+00:00</published><updated>2013-01-13T06:00:00+00:00</updated><id>https://www.woodcp.com/2013/01/git-unite-fix-case-sensitive-file-paths-on-windows</id><content type="html" xml:base="https://www.woodcp.com/2013/01/git-unite-fix-case-sensitive-file-paths-on-windows/"><![CDATA[<p>
    <a title="git-unite on GitHub" href="https://github.com/tawman/git-unite" target="_blank">Git Unite</a> is a utility that fixes case sensitive file paths present in a git repository index on Windows. Since Windows
    is not case sensitive, the git index case sensitivity issue does not manifest itself until browsing the code repository
    on GitHub or cloning the repository to a case sensitive file system on Linux.</p>
<p>Introducing case sensitive file paths into the git index on a case insensitive operating system like Windows is easier than
    you think. A simple
    <em>git mv .\Where\Waldo where\is\Waldo</em> is all you need to create two separate paths in the git index, but the Windows
    working directory will only report one. There might be
    <em>git config</em> settings that help avoid this problem, but controlling the settings and behavior of 20+ contributors
    on a project team is nearly impossible.</p>
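<p>One reliable way to see what the index actually tracks is <em>git ls-files</em>, which prints the exact case of every indexed path regardless of how the working directory displays it. A throwaway repository (paths borrowed from the Waldo example) demonstrates the case change a single move records:</p>

```shell
# The git index records exact path case; `git ls-files` shows what
# GitHub and Linux clones will see, whatever the working directory shows.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
mkdir Where && touch Where/Waldo
git add . && git -c user.name=demo -c user.email=demo@example.com commit -qm initial
mkdir -p where
git mv Where/Waldo where/Waldo   # case-divergent move
git ls-files                     # prints: where/Waldo
```

<p>On Windows the working directory would still appear to hold a single folder after such a move, which is exactly why the divergence goes unnoticed until the repository is viewed on a case sensitive system.</p>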
<p>The problem is exacerbated when hundreds of files are moved during a repository layout reorganization. If the user moving
    the files is not careful, these case sensitive path names will pollute the git index but appear fine in the working directory.
    Cleaning up these case sensitive file path issues on Windows is tedious, and this is where Git Unite helps out.</p>
<p>Git Unite will search the git repository index for file paths that do not match the same case that Windows is using. For
    each git index path case mismatch found, Git Unite will update the git index entry with the case reported by the Windows
    file system.</p>
<h2>Usage</h2>
<pre>Usage: Git.Unite [OPTIONS]+ repository
Unite the git repository index file paths with current Windows case usage.
If no repository path is specified, the current directory is used.

Options:
      --dry-run              dry run without making changes
  -h, --help                 show this message and exit
</pre>
<h2>History</h2>
<p>I work on a project that has one particular git repository tracking over 7,000 files. The repository contains a mixture of
    ASP.NET MVC3 code, SQL Server SSIS ETL packages, and PowerShell scripts. It all started one day when an ETL developer
    could not locate the package she developed on the GitHub web site.</p>
<p>I took a look at the git repository on her machine and the ETL package was clearly there under an
    <em>Etl\Some\Dir\Path</em> folder. The repository reported being up to date with origin/master, but it took several minutes
    before I noticed an
    <em>etl</em> and
    <em>Etl</em> folder on the GitHub web site.</p>
<p>It turns out that the ETL team was in the process of reorganizing the ETL packages into a new directory structure layout. I
    booted up a VM running Ubuntu and cloned the repository down to a case-sensitive file system. I found 694 ETL files that
    were tracked in the git index with a directory path case different from the one reported by the Windows file system.</p>
<p>I fixed the problem by using a combination of
    <em>find</em>,
    <em>sort</em>, and
    <em>awk</em> to build a bash script that ran the 694 git mv commands. This was a painful process that I did not want to repeat,
    so I decided to build a tool that anyone on the team could use on Windows to fix the problem.</p>
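<p>That one-off fix amounted to generating one <em>git mv</em> per mismatched file. A minimal Python sketch of such a script generator (the actual fix used <em>find</em>, <em>sort</em>, and <em>awk</em> in bash; this function name is hypothetical):</p>

```python
def build_git_mv_script(mismatches):
    """Turn (index_path, correct_path) pairs into a list of git mv
    commands; sorting keeps the generated script deterministic."""
    return [f'git mv "{wrong}" "{right}"' for wrong, right in sorted(mismatches)]
```

<p>Piping the resulting lines into a shell (or a .sh file) replays every rename in one pass.</p>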
<p>In fact, two months later the same issue appeared again in a different repository. This time I was able to install the Git
    Unite utility on the user’s machine and fix the issue in a couple of minutes. We traced the source of the problem
    to a developer who had hand-typed the target directory of a git mv command in all lowercase.</p>
<h2>Example Scenario</h2>
<p>Here is a representative example, using Posh-Git on Windows 7, of how someone can introduce case-sensitive file paths on
    a case-insensitive file system.</p>
<h3>Step 1 – Create a new git repository and push it to GitHub</h3>
<pre>C:\demo&gt; mkdir Where
C:\demo&gt; touch .\Where\Waldo
C:\demo&gt; touch .\Where\IsHere
C:\demo&gt; git init .
Initialized empty Git repository in C:/demo/.git/
C:\demo [master +1 ~0 -0 !]&gt; git add .
C:\demo [master +2 ~0 -0]&gt; git commit -m initial
[master (root-commit) 42ea0fc] initial
 0 files changed, 0 insertions(+), 0 deletions(-)
 create mode 100644 Where/IsHere
 create mode 100644 Where/Waldo

C:\demo [master]&gt; git remote add origin git@github.com:tawman/waldo.git
C:\demo [master]&gt; git push -u origin master
Counting objects: 4, done.
Delta compression using up to 6 threads.
Compressing objects: 100% (2/2), done.
Writing objects: 100% (4/4), 265 bytes, done.
Total 4 (delta 0), reused 0 (delta 0)
To git@github.com:tawman/waldo.git
 * [new branch]      master -&gt; master
Branch master set up to track remote branch master from origin.
</pre>
<p>When we look on GitHub, the repository appears as expected:
    <a href="/assets/images/Step1.png">
        <img style="background-image: none; border-right-width: 0px; padding-left: 0px; padding-right: 0px; display: inline; border-top-width: 0px; border-bottom-width: 0px; border-left-width: 0px; padding-top: 0px"
            title="Initial repository as seen on GitHub" border="0" alt="Initial repository as seen on GitHub" src="/assets/images/Step1_thumb.png"
            width="644" height="256">
    </a>
</p>
<h3>Step 2 – Start asking some questions</h3>
<pre>C:\demo [master]&gt; mkdir .\Where\Is
C:\demo [master]&gt; touch .\Where\Is\He
C:\demo [master +1 ~0 -0 !]&gt; git add -A
C:\demo [master +1 ~0 -0]&gt; git commit -m "Good Question"
[master 3d9006e] Good Question
 0 files changed, 0 insertions(+), 0 deletions(-)
 create mode 100644 Where/Is/He
</pre>
<p>Keep a close eye on
    <em>where</em> Waldo is going...</p>
<pre>C:\demo [master]&gt; git mv .\Where\Waldo where\is\Waldo
C:\demo [master +0 ~1 -0]&gt; git commit -m "Find Me"
[master 35f843b] Find Me
 1 files changed, 0 insertions(+), 0 deletions(-)
 rename {Where =&gt; where/is}/Waldo (100%)

C:\demo [master]&gt; find Where
Where
Where/Is
Where/Is/He
Where/Is/Waldo
Where/IsHere
C:\demo [master]&gt; ls


    Directory: C:\demo


Mode                LastWriteTime     Length Name
----                -------------     ------ ----
d----         1/12/2013  10:54 PM            Where
</pre>
<p>Seems quite obvious
    <em>Where</em> Waldo is, but let’s check what GitHub thinks:</p>
<pre>C:\demo [master]&gt; git push
Counting objects: 11, done.
Delta compression using up to 6 threads.
Compressing objects: 100% (5/5), done.
Writing objects: 100% (9/9), 683 bytes, done.
Total 9 (delta 1), reused 0 (delta 0)
To git@github.com:tawman/waldo.git
   42ea0fc..35f843b  master -&gt; master
</pre>
<p>It would appear that git and GitHub have narrowed Waldo down to one of two possible locations:
    <a href="/assets/images/Step2.png">
        <img style="background-image: none; border-right-width: 0px; padding-left: 0px; padding-right: 0px; display: inline; border-top-width: 0px; border-bottom-width: 0px; border-left-width: 0px; padding-top: 0px"
            title="GitHub is not exactly sure where he is at" border="0" alt="GitHub is not exactly sure where he is at" src="/assets/images/Step2_thumb.png"
            width="644" height="132">
    </a>
</p>
<h3>Step 3 – Let the confusion begin</h3>
<pre>C:\demo [master]&gt; ls .\Where\Is\Waldo


    Directory: C:\demo\Where\Is


Mode                LastWriteTime     Length Name
----                -------------     ------ ----
-a---         1/12/2013  10:50 PM          0 Waldo
</pre>
<p>According to Windows, Waldo should be hanging out right here:
    <a href="/assets/images/Step3a.png">
        <img style="background-image: none; border-right-width: 0px; padding-left: 0px; padding-right: 0px; display: inline; border-top-width: 0px; border-bottom-width: 0px; border-left-width: 0px; padding-top: 0px"
            title="Is he here?" border="0" alt="Is he here?" src="/assets/images/Step3a_thumb.png" width="644" height="133">
    </a>Unfortunately, according to git he is hanging out over there:</p>
<p>
    <a href="/assets/images/Step3b.png">
        <img style="background-image: none; border-right-width: 0px; padding-left: 0px; padding-right: 0px; display: inline; border-top-width: 0px; border-bottom-width: 0px; border-left-width: 0px; padding-top: 0px"
            title="Or is he here?" border="0" alt="Or is he here?" src="/assets/images/Step3b_thumb.png" width="644" height="129">
    </a>
</p>
<h3>Step 4 – Get everyone back on the same page with Git Unite</h3>
<pre>C:\demo [master]&gt; Git.Unite.exe C:\demo
C:\demo [master +0 ~1 -0]&gt; git status
# On branch master
# Changes to be committed:
#   (use "git reset HEAD &lt;file&gt;..." to unstage)
#
#       renamed:    where/is/Waldo -&gt; Where/Is/Waldo
#
C:\demo [master +0 ~1 -0]&gt; git commit -m fixed
[master 4495f40] fixed
 1 files changed, 0 insertions(+), 0 deletions(-)
 rename {where/is =&gt; Where/Is}/Waldo (100%)
C:\demo [master]&gt; git push
Counting objects: 7, done.
Delta compression using up to 6 threads.
Compressing objects: 100% (3/3), done.
Writing objects: 100% (4/4), 354 bytes, done.
Total 4 (delta 0), reused 0 (delta 0)
To git@github.com:tawman/waldo.git
   35f843b..4495f40  master -&gt; master
</pre>
<p>Git Unite clears up the confusion by reconciling the git index file path with the same case Windows is using. When I go back
    and look at the repository on GitHub, there is only one place
    <em>Where</em> Waldo could be:
    <a href="/assets/images/Step4a.png">
        <img style="background-image: none; border-right-width: 0px; padding-left: 0px; padding-right: 0px; display: inline; border-top-width: 0px; border-bottom-width: 0px; border-left-width: 0px; padding-top: 0px"
            title="Everyone is back Where expected" border="0" alt="Everyone is back Where expected" src="/assets/images/Step4a_thumb.png"
            width="644" height="105">
    </a> As far as Windows was concerned, Waldo was here the whole time:
    <a href="/assets/images/Step4b.png">
        <img style="background-image: none; border-right-width: 0px; padding-left: 0px; padding-right: 0px; display: inline; border-top-width: 0px; border-bottom-width: 0px; border-left-width: 0px; padding-top: 0px"
            title="I knew he was here the whole time" border="0" alt="I knew he was here the whole time" src="/assets/images/Step4b_thumb.png"
            width="644" height="152">
    </a>
</p>
<p>
    <a title="Fork me on GitHub" href="https://github.com/tawman/git-unite" target="_blank">Fork me on GitHub</a>
</p>]]></content><author><name>Todd A. Wood</name><uri>https://www.woodcp.com</uri></author><category term="git" /><category term="git" /><category term="github" /><category term="windows" /><summary type="html"><![CDATA[Git Unite is a utility that fixes case sensitive file paths present in a git repository index on Windows. Since Windows is not case sensitive, the git index case sensitivity issue does not manifest itself until browsing the code repository on GitHub or cloning the repository to a case sensitive file system on Linux. Introducing case sensitive file paths into the git index on a case insensitive operating system like Windows is easier than you think. A simple ‘ git mv .\Where\Waldo where\is\Waldo' is all you need to create two separate paths in the git index, but the Windows working directory will only report one. There might be git config settings that help avoid this problem, but controlling the settings and behavior of 20+ contributors on a project team is nearly impossible. The problem is exacerbated when hundreds of files are moved during a repository layout reorganization. If the user moving the files is not careful, these case sensitive path names will pollute the git index but appear fine in the working directory. Cleaning up these case sensitive file path issues on Windows is tedious, and this is where Git Unite helps out. Git Unite will search the git repository index for file paths that do not match the same case that Windows is using. For each git index path case mismatch found, Git Unite will update the git index entry with the case reported by the Windows file system. Usage Usage: Git.Unite [OPTIONS]+ repository Unite the git repository index file paths with current Windows case usage. 
If no repository path is specified, the current directory is used.]]></summary></entry><entry><title type="html">PowerShell Object Pipeline</title><link href="https://www.woodcp.com/2012/04/powershell-object-pipeline/" rel="alternate" type="text/html" title="PowerShell Object Pipeline" /><published>2012-04-15T06:00:00+00:00</published><updated>2012-04-15T06:00:00+00:00</updated><id>https://www.woodcp.com/2012/04/powershell-object-pipeline</id><content type="html" xml:base="https://www.woodcp.com/2012/04/powershell-object-pipeline/"><![CDATA[<p>On .NET projects today, a common theme for setup and build scripts is
    <a href="https://github.com/psake/psake" target="_blank">
        <em>psake</em>
    </a> and
    <a href="http://technet.microsoft.com/en-us/dd793612" target="_blank">Windows PowerShell</a>. A development team member on my current project is a PowerShell Ninja, which prompted me to learn
    more about PowerShell this weekend. I watched an excellent five-part webcast series by Ed Wilson (
    <a href="https://twitter.com/#!/ScriptingGuys" target="_blank">@ScriptingGuys</a>) on
    <a href="http://technet.microsoft.com/en-us/scriptcenter/dd742419" target="_blank">Scripting with Windows PowerShell</a> that got me up to speed in a couple hours.</p>
<p>These webcasts changed my viewpoint of PowerShell from being a VBScript replacement (that I never used) to a powerful scripting
    framework and shell. I have modified and extended existing
    <em>psake</em> build scripts, but I did not realize some of the key concepts lurking just below the surface.</p>
<p>After the many years I spent developing shell scripts on UNIX/Linux, I did not consider it possible that PowerShell pipes
    whole objects between commands instead of strings. Once Ed explained this key concept, I discarded my preconceived understanding
    of PowerShell and watched the entire series to learn more.</p>
<p>After witnessing object pipelining in PowerShell, I can only imagine how useful the equivalent would be in UNIX shell scripts.
    For example, getting basic process information from the bash prompt on the Mac can be done with the
    <em>ps</em> command, whose output can only be manipulated as plain ASCII text:</p>
<script src="https://gist.github.com/2394435.js?file=BashShell.txt"></script>
<p>Conversely, the same process information in PowerShell passes along the entire process object to the next command to inspect:</p>
<script src="https://gist.github.com/2394445.js?file=PowerShell.txt"></script>
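<p>The difference is easy to demonstrate outside PowerShell. In this Python sketch (the process data is made up for illustration, not real <em>ps</em> output), the text pipeline must re-parse columns at every stage, while the object pipeline hands structured records straight to the next stage:</p>

```python
# Text pipeline (UNIX style): downstream stages must re-parse columns.
ps_text = """PID COMMAND %CPU
101 python  12.5
102 bash     0.3"""

def text_pipeline(text):
    # fragile: depends on column order and whitespace layout
    rows = [line.split() for line in text.splitlines()[1:]]
    return [r[1] for r in rows if float(r[2]) > 1.0]

# Object pipeline (PowerShell style): whole objects flow between stages.
processes = [
    {"pid": 101, "command": "python", "cpu": 12.5},
    {"pid": 102, "command": "bash", "cpu": 0.3},
]

def object_pipeline(procs):
    # each stage receives structured objects; no re-parsing needed
    return [p["command"] for p in procs if p["cpu"] > 1.0]
```

<p>Both pipelines answer the same question, but only the text version breaks the moment the upstream output format changes.</p>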
<p>Object passing between commands in PowerShell is just a minor fact hardly worth blogging about, but it is a fundamental reason
    you should ignore decades of preconceived ideas about shell scripting and give PowerShell an honest look.</p>]]></content><author><name>Todd A. Wood</name><uri>https://www.woodcp.com</uri></author><category term="PowerShell" /><category term="PowerShell" /><summary type="html"><![CDATA[On .NET projects today, a common theme with setup and builds scripts is psake and Windows PowerShell. A development team member on my current project is a PowerShell Ninja, which prompted me to learn more about PowerShell this weekend. I watched an excellent five-part webcast series by Ed Wilson ( @ScriptingGuys) on Scripting with Windows PowerShell that got me up to speed in a couple hours. These webcasts changed my viewpoint of PowerShell from being a VBScript replacement (that I never used) to a powerful scripting framework and shell. I have modified and extended existing psake build scripts, but I did not realize some of the key concepts lurking just below the surface. I did not consider it possible that PowerShell pipelines objects between commands instead of strings due to the many years I spent developing shell scripts on UNIX/Linux. Once Ed explained this key concept, I discarded my preconceived understanding of PowerShell and watched the entire series to learn more. I can only imagine how useful object pipelining in UNIX shell scripts would be after witnessing the equivalent in PowerShell. For example, getting basic process information from the bash prompt on the Mac can be done with the ps command. 
The output of the ps command can be manipulated as standard ASCII text: Conversely, the same process information in PowerShell passes along the entire process object to the next command to inspect: Object passing between commands in PowerShell is just a minor fact hardly worth blogging about, but it is a fundamental reason you should ignore decades of preconceived ideas about shell scripting and give PowerShell an honest look.]]></summary></entry><entry><title type="html">PetaPoco Custom SQL Paging</title><link href="https://www.woodcp.com/2012/03/petapoco-custom-sql-paging/" rel="alternate" type="text/html" title="PetaPoco Custom SQL Paging" /><published>2012-03-24T06:00:00+00:00</published><updated>2012-03-24T06:00:00+00:00</updated><id>https://www.woodcp.com/2012/03/petapoco-custom-sql-paging</id><content type="html" xml:base="https://www.woodcp.com/2012/03/petapoco-custom-sql-paging/"><![CDATA[<p>
    I blogged previously about
    <a href="http://www.woodcp.com/2012/02/server-side-paging-with-petapoco-and-datatables/" target="_blank">Server-Side Paging with PetaPoco and DataTables</a>, which works well for about 80% of the SQL statements that require
    server-side paging. In fact, I already submitted a
    <a href="https://github.com/toptensoftware/PetaPoco/pull/94" target="_blank">pull request</a> in December 2011 to modify the
    <em>rxOrderBy</em> Regex used by
    <a href="http://www.toptensoftware.com/petapoco/" target="_blank">PetaPoco</a> during
    <em>Page&lt;T&gt;</em> query rewrite to support sub-select ordering containing
    <em>ROW_NUMBER() PARTITION BY</em>.</p>
<p>The latest query from the 20% unsupported category prompted my
    <a href="https://github.com/toptensoftware/PetaPoco/pull/104"
        target="_blank">pull request</a> to bypass the PetaPoco query rewrite logic entirely and allow the caller to provide the
    <em>count</em> and
    <em>paging</em> queries. After all, PetaPoco is a single-file micro-ORM that already expects the user to understand SQL.
    The secret sauce is the low-friction data access layer between the database and your POCOs, not the SQL itself.</p>
<p>Currently, PetaPoco supports server-side paging with the following
    <em>Page&lt;T&gt;</em> method overloads that generate
    <em>count</em> and
    <em>page</em> results queries from a single SQL statement:</p>
<ul>
    <li>
        <font face="Consolas">public Page&lt;T&gt; Page&lt;T&gt;(long page, long itemsPerPage, Sql sql)</font>
    </li>
    <li>
        <font face="Consolas">public Page&lt;T&gt; Page&lt;T&gt;(long page, long itemsPerPage, string sql, params object[] args)</font>
    </li>
</ul>
<p>My
    <a href="https://github.com/toptensoftware/PetaPoco/pull/104" target="_blank">pull request</a> simply adds another
    <em>Page&lt;T&gt;</em> method overload that accepts the two SQL statements to use for the count and page results without
    calling the
    <em>BuildPageQueries&lt;T&gt;</em> method:
    <font face="Consolas">public Page&lt;T&gt; Page&lt;T&gt;(long page, long itemsPerPage, Sql sqlCount, Sql sqlPage)</font>
</p>
<p>The following gist provides a sample set of queries using the new Page&lt;T&gt; overload:</p>
<div>
    <script src="https://gist.github.com/2126002.js?file=SamplePetaPocoPageCustomSql.cs"></script>
</div>
<p>There has not been much activity in the
    <a href="https://github.com/toptensoftware/PetaPoco" target="_blank">PetaPoco GitHub repository</a> for over 8 months, so you might want to pull this change into your own fork if faced with
    similar server-side paging requirements.</p>
Currently, PetaPoco supports server-side paging with the following Page&lt;T&gt; method overloads that generate count and page results queries from a single SQL statement: public Page&lt;T&gt; Page&lt;T&gt;(long page, long itemsPerPage, Sql sql) public Page&lt;T&gt; Page&lt;T&gt;(long page, long itemsPerPage, string sql, params object[] args) My pull request simply adds another Page&lt;T&gt; method overload that accepts the two SQL statements to use for the count and page results without calling the BuildPageQueries&lt;T&gt; method: public Page&lt;T&gt; Page&lt;T&gt;(long page, long itemsPerPage, Sql sqlCount, Sql sqlPage) The following gist provides a sample set of queries using the new Page&lt;T&gt; overload: There has not been much activity in the PetaPoco GitHub repository in over 8 months, so you might want to pull this change into your own fork if faced with similar server-side paging requirements.]]></summary></entry><entry><title type="html">Server-Side Paging with PetaPoco and DataTables</title><link href="https://www.woodcp.com/2012/02/server-side-paging-with-petapoco-and-datatables/" rel="alternate" type="text/html" title="Server-Side Paging with PetaPoco and DataTables" /><published>2012-02-18T06:00:00+00:00</published><updated>2012-02-18T06:00:00+00:00</updated><id>https://www.woodcp.com/2012/02/server-side-paging-with-petapoco-and-datatables</id><content type="html" xml:base="https://www.woodcp.com/2012/02/server-side-paging-with-petapoco-and-datatables/"><![CDATA[<p>TL;DR – Server-side paging of table data is not only faster and more responsive, but very easy to implement with
    <a href="http://www.toptensoftware.com/petapoco/" target="_blank">PetaPoco</a> and
    <a href="http://datatables.net/" target="_blank">DataTables</a>. I will demonstrate how millisecond response times are still possible even when dealing with a million
    rows of data.</p>
<p>This blog post can be forked on
    <a href="https://github.com/tawman/PetaPocoPage" target="_blank">GitHub: tawman / PetaPocoPage</a>.</p>
<h2>Sample Web Site and Data</h2>
<p>For the sample web site, I used the
    <a href="http://spawner.sourceforge.net/" target="_blank">Spawner Data Generator</a> tool to generate 1,000,000 rows of test data for use in this example. As you can see from
    the timings, each table grid page request is limited to approximately 1.25 KB of data and a response time around 75 milliseconds.
</p>
<p>
    <img style="background-image: none; border-right-width: 0px; padding-left: 0px; padding-right: 0px; display: inline; border-top-width: 0px; border-bottom-width: 0px; border-left-width: 0px; padding-top: 0px"
        title="SiteTimings" border="0" alt="SiteTimings" src="/assets/images/SiteTimings.jpg" width="630" height="586">
</p>
<p>On my current project, we initially used
    <a href="http://tablesorter.com/docs/" target="_blank">jQuery tablesorter</a> with a table loading between 1,000 and 1,500 detail records for a report. There was a noticeable
    delay of a couple of seconds while the data loaded. Obviously, we needed to implement server-side paging, and I knew
    <a href="http://www.toptensoftware.com/petapoco/" target="_blank">PetaPoco</a> could help on the backend. However, tablesorter had some other issues with searching paged tables that led
    me to discover
    <a href="http://datatables.net/" target="_blank">DataTables</a>.
</p>
<h2>DataTables</h2>
<p>DataTables is a great, extensible jQuery plug-in that satisfied our searching, paging, and server-side processing requirements.
    The only negative I found is cosmetic, since DataTables uses
    <a href="http://en.wikipedia.org/wiki/Hungarian_notation" target="_blank">Hungarian notation</a>. The other features and extensibility options shine through, making DataTables an excellent client-side
    UI choice.</p>
<h3>Client Side Setup</h3>
<p>Wiring up a view table with DataTables is very simple, and adding the server-side callbacks and Twitter Bootstrap formatting
    is also very straightforward. In my example view, there is very little markup required and actually no model is provided:</p>
<script src="https://gist.github.com/1861582.js"></script>
<p>The sample project also includes some additional JavaScript to set up a 500 ms filtering delay before sending the search term
    back to the server, pagination styling, and CSS styling. Fortunately, the DataTables library is thoroughly documented
    on its web site along with these extension configurations.</p>
<h3>Server Side Setup</h3>
<p>DataTables will POST a JSON request back to the server Controller that I process with a custom
    <em>IModelBinder</em> to keep the Controller code clean and remove the Hungarian notation:</p>
<script src="https://gist.github.com/1861621.js"></script>
<p>The combination of the
    <em>IModelBinder</em> and a helper method to format the DataTable response keeps the Controller action code simple:</p>
<script src="https://gist.github.com/1861623.js"></script>
<p>The DataTable response formatter simply combines the
    <em>DataTablesPageRequest</em> object with the PetaPoco
    <em>Page&lt;Customer&gt;</em> object into a JSON response expected by DataTables:</p>
<script src="https://gist.github.com/1861658.js"></script>
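<p>For reference, the legacy DataTables 1.x server-side protocol expects an echo token, the total and filtered row counts, and the row array. A minimal Python sketch of such a formatter (the field names follow the DataTables 1.x documentation; the helper itself is my own illustration, not the C# code in the gist):</p>

```python
def format_datatables_response(echo, total_items, filtered_items, rows):
    """Assemble the JSON body the legacy DataTables 1.x client expects
    from a server-side paging data source."""
    return {
        "sEcho": echo,                           # draw token echoed back to the client
        "iTotalRecords": total_items,            # row count before filtering
        "iTotalDisplayRecords": filtered_items,  # row count after filtering
        "aaData": rows,                          # one entry per visible table row
    }
```
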
<p>Fortunately, the DataTables request and response object formats are very compatible with how PetaPoco supports paging.</p>
<h2>PetaPoco</h2>
<p>PetaPoco makes it very simple and efficient to issue paged queries to SQL Server and it handles the query rewrite to use
    <em>ROW_NUMBER</em> for you. The resulting PetaPoco
    <em>Page&lt;T&gt;</em> object makes it easy to interface with the DataTables response object:</p>
<pre><code>
public class Page&lt;T&gt; where T:new()
{
    public long CurrentPage { get; set; }
    public long ItemsPerPage { get; set; }
    public long TotalPages { get; set; }
    public long TotalItems { get; set; }
    public List&lt;T&gt; Items { get; set; }
}</code></pre>
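<p>The same result object is trivial to model in any language. A Python sketch of the paging arithmetic behind <em>Page&lt;T&gt;</em> (PetaPoco computes the page in SQL via <em>ROW_NUMBER</em>; this sketch simply slices an in-memory list, so treat it as an illustration of the math, not of PetaPoco's internals):</p>

```python
import math
from dataclasses import dataclass, field

@dataclass
class Page:
    """Python analogue of PetaPoco's Page<T> result object."""
    current_page: int
    items_per_page: int
    total_items: int
    items: list = field(default_factory=list)

    @property
    def total_pages(self) -> int:
        # last partial page still counts as a page
        return math.ceil(self.total_items / self.items_per_page)

def page(all_rows, page_number, items_per_page):
    """Server-side paging in miniature: return one page of rows plus
    the counts the client grid needs to render its pager."""
    start = (page_number - 1) * items_per_page
    return Page(page_number, items_per_page,
                total_items=len(all_rows),
                items=all_rows[start:start + items_per_page])
```

<p>Only the current page's rows ever travel to the client, which is why response sizes stay flat no matter how large the table grows.</p>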
<p>To support the DataTables searching and ordering in my sample project, I used some helper methods to apply the search criteria
    against all columns and order by the columns selected. The resulting repository method to handle the PetaPoco Page&lt;T&gt;
    query request is quite straightforward:</p>
<script src="https://gist.github.com/1861686.js"></script>
<p>In the end, it does not matter if you are dealing with 100, 1,000, or 1,000,000 rows of data. You can still achieve fast
    page response times and low latency calls to your backend database to serve data up one page at a time. The following
    timings are from pressing the Next page link 7 times:</p>
<p>
    <img style="background-image: none; border-right-width: 0px; padding-left: 0px; padding-right: 0px; display: inline; border-top-width: 0px; border-bottom-width: 0px; border-left-width: 0px; padding-top: 0px"
        title="Timings" border="0" alt="Timings" src="/assets/images/Timings.png" width="634" height="351">
</p>
<h2>Conclusion</h2>
<p>The million-row sample data set is not a practical use case and is merely used to illustrate a point. In fact, I reduced the sample
    data down to 4,000 rows for publishing on GitHub, as a 250 MB database is not very clone-friendly. Have a look at the
    project source on
    <a href="https://github.com/tawman/PetaPocoPage" target="_blank">GitHub: tawman / PetaPocoPage</a> and I hope it provides a useful reference for someone faced with a similar challenge.</p>
<p>DataTables and PetaPoco were made to page together.</p>]]></content><author><name>Todd A. Wood</name><uri>https://www.woodcp.com</uri></author><category term="DataTables" /><category term="PetaPoco" /><category term="DataTables" /><category term="PetaPoco" /><summary type="html"><![CDATA[TL;DR – Server-side paging of table data is not only faster and more responsive, but very easy to implement with PetaPoco and DataTables. I will demonstrate how millisecond response times are still possible even when dealing with a million rows of data. This blog post can be forked on GitHub: tawman / PetaPocoPage. Sample Web Site and Data For the sample web site, I used the Spawner Data Generator tool to generate 1,000,000 rows of test data for use in this example. As you can see from the timings, each table grid page request is limited to approximately 1.25 KB of data and a response time around 75 milliseconds. On my current project, we initially used jQuery tablesorter with a table loading between 1,000 and 1,500 detail records for a report. There was a noticeable delay of a couple seconds while the data loaded. Obviously, we needed to implement server-side paging and I knew PetaPoco could help on the backend. However, tablesorter had some other issues with searching paged tables that lead me to discover DataTables DataTables DataTables is a great extensible jQuery plug-in and satisfied our searching, paging, and server-side processing requirements. The only negative issue I had is cosmetic since DataTables uses Hungarian notation. The other features and extensibility options shine through making DataTables an excellent client side UI choice. Client Side Setup Wiring up a view table with DataTables is very simple, and adding the server-side callbacks and Twitter Bootstrap formatting is also very straight forward. 
In my example view, there is very little markup required and actually no model is provided: The sample project also includes some additional JavaScript to setup a 500 ms filtering delay before sending the search term back to the server, pagination styling, and CSS styling. Fortunately, the DataTables library is thoroughly documented on its web site along with these extension configurations. Server Side Setup DataTables will POST a JSON request back to the server Controller that I process with a custom IModelBinder to keep the Controller code clean and remove the Hungarian notation: The combination of the IModelBinder and a helper method to format the DataTable response keeps the Controller action code simple: The DataTable response formatter simply combines the DataTablesPageRequest object with the PetaPoco Page&lt;Customer&gt; object into a JSON response expected by DataTables: Fortunately, the DataTable request and response object formats are very compatible with how PetaPoco support paging. PetaPoco PetaPoco makes it very simple and efficient to issue paged queries to SQL Server and it handles the query rewrite to use ROW_NUMBER for you. The resulting PetaPoco Page&lt;T&gt; object makes it easy to interface with the DataTables response object: public class Page&lt;T&gt; where T:new() { public long CurrentPage { get; set; } public long ItemsPerPage { get; set; } public long TotalPages { get; set; } public long TotalItems { get; set; } public List&lt;T&gt; Items { get; set; } } To support the DataTables searching and ordering in my sample project, I used some helper methods to apply the search criteria against all columns and order by the columns selected. The resulting repository method to handle the PetaPoco Page&lt;T&gt; query request is quite straightforward: In the end, it does not matter if you are dealing with 100, 1,000, or 1,000,000 rows of data. 
You can still achieve fast page response times and low latency calls to your backend database to serve data up one page at a time. The following timings are from pressing the Next page link 7 times: Conclusion The million row sample data is not a practical use case and merely used to illustrate a point. In fact, I reduced the sample data down to 4,000 rows for publishing on GitHub as a 250 MB database is not very clone friendly. Have a look at the project source on GitHub: tawman / PetaPocoPage and I hope it provides a useful reference for someone faced with a similar challenge. DataTables and PetaPoco were made to page together.]]></summary></entry><entry><title type="html">PetaPoco DoddleReport Dynamic PIVOT Query</title><link href="https://www.woodcp.com/2012/02/petapoco-doddlereport-dynamic-pivot-query/" rel="alternate" type="text/html" title="PetaPoco DoddleReport Dynamic PIVOT Query" /><published>2012-02-11T06:00:00+00:00</published><updated>2012-02-11T06:00:00+00:00</updated><id>https://www.woodcp.com/2012/02/petapoco-doddlereport-dynamic-pivot-query</id><content type="html" xml:base="https://www.woodcp.com/2012/02/petapoco-doddlereport-dynamic-pivot-query/"><![CDATA[<p>
	TL;DR – Sample ASP.NET MVC3 application demonstrating how to load PIVOT query results as a dynamic object with
	<a href="http://www.toptensoftware.com/petapoco/" target="_blank">PetaPoco</a> and export the data to Excel using
	<a href="http://doddlereport.codeplex.com/" target="_blank">DoddleReport</a>.
</p>
<p>
	This blog post can be forked on
	<a href="https://github.com/tawman/PetaPocoPivot" target="_blank">GitHub: tawman / PetaPocoPivot</a>.
</p>
<h2>Dynamic Data Structures with PIVOT</h2>
<p>SQL Server 2008 R2 provides the capability to
	<a href="http://msdn.microsoft.com/en-us/library/ms177410.aspx" target="_blank">PIVOT and UNPIVOT</a> table data depending on your need to flatten rows into columns or columns into rows. In our case,
	we needed to transform multiple rows of employee expiration dates into a single employee row with columns for each requirement
	expiration date.</p>
<h3>SQL PIVOT Query Example</h3>
<p>The following source data is used to illustrate a SQL PIVOT query in action:
	<table border="1">
		<thead>
			<tr>
				<th>Employee</th>
				<th>Requirement</th>
				<th>Expiration Date</th>
			</tr>
		</thead>
		<tbody>
			<tr>
				<td>John Doe</td>
				<td>Alpha</td>
				<td>01/12/2011</td>
			</tr>
			<tr>
				<td>John Doe</td>
				<td>Bravo</td>
				<td>11/07/2012</td>
			</tr>
			<tr>
				<td>John Doe</td>
				<td>Charlie</td>
				<td>
					<em>NULL</em>
				</td>
			</tr>
			<tr>
				<td>John Doe</td>
				<td>Delta</td>
				<td>10/11/2009</td>
			</tr>
			<tr>
				<td>Sally Smith</td>
				<td>Bravo</td>
				<td>05/10/2012</td>
			</tr>
			<tr>
				<td>Sally Smith</td>
				<td>Charlie</td>
				<td>07/04/2012</td>
			</tr>
			<tr>
				<td>Sally Smith</td>
				<td>Echo</td>
				<td>08/31/2012</td>
			</tr>
		</tbody>
	</table>
</p>
<p>A simple SQL PIVOT query to transform the data:</p>
<script src="https://gist.github.com/tawman/4fb482eeefe3d91c2616c3e1ae5abbdf.js"></script>
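<p>For readers who cannot load the embedded gist, the query takes roughly this shape. The table and column names are assumptions based on the sample data above; the gist remains the authoritative version:</p>
<pre><code>-- Sketch: pivot requirement rows into one column per requirement.
-- Table and column names are illustrative placeholders.
SELECT Employee, [Alpha], [Bravo], [Charlie], [Delta], [Echo]
FROM (
    SELECT Employee, Requirement, ExpirationDate
    FROM EmployeeRequirement
) AS src
PIVOT (
    MAX(ExpirationDate)
    FOR Requirement IN ([Alpha], [Bravo], [Charlie], [Delta], [Echo])
) AS pvt;
</code></pre>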
<p>
	The following table illustrates the results of the SQL PIVOT query:
</p>
<table border="1">
	<thead>
		<tr>
			<th>Employee</th>
			<th>Alpha</th>
			<th>Bravo</th>
			<th>Charlie</th>
			<th>Delta</th>
			<th>Echo</th>
		</tr>
	</thead>
	<tbody>
		<tr>
			<td>John Doe</td>
			<td>01/12/2011</td>
			<td>11/07/2012</td>
			<td>&nbsp;</td>
			<td>10/11/2009</td>
			<td>&nbsp;</td>
		</tr>
		<tr>
			<td>Sally Smith</td>
			<td>&nbsp;</td>
			<td>05/10/2012</td>
			<td>07/04/2012</td>
			<td>&nbsp;</td>
			<td>08/31/2012</td>
		</tr>
	</tbody>
</table>
<h3>Data Driven Object Structure</h3>
<p>Obviously, defining a POCO object ahead of time to hold the PIVOT data is not possible when the pivoted column names are
	not known until runtime. Our initial implementation bypassed PetaPoco entirely and used a
	<em>DataTable</em> to export the data to Excel using DoddleReport. Working around PetaPoco did not seem right, so I looked
	for an approach that kept PetaPoco in the loop.</p>
<h2>PetaPoco and Dynamic Types</h2>
<p>Fortunately, C# provides a
	<a href="http://msdn.microsoft.com/en-us/library/dd264741.aspx" target="_blank">dynamic type</a>, but I did not know if PetaPoco and DoddleReport could support this option. Every time I think we have pushed
	PetaPoco to its limit, I learn that it has more tricks up its sleeve. I decided to not even consult the documentation and
	wired it up to see if we could return an
	<em>IEnumerable&lt;dynamic&gt;</em> instead of a
	<em>DataTable</em>.</p>
<p>The answer came as no surprise once I coded the PIVOT query into the Repository: PetaPoco couldn&apos;t care less whether you
	actually have a POCO:</p>
<script src="https://gist.github.com/1806105.js"></script>
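<p>The essence of the repository call, sketched here rather than copied from the gist (the connection name and SQL variable are placeholders), is that PetaPoco will happily materialize each row as a <em>dynamic</em> when you ask for one:</p>
<pre><code>// Sketch only - the embedded gist is authoritative.
public IEnumerable&lt;dynamic&gt; GetExpirationReport()
{
    var db = new PetaPoco.Database("DefaultConnection");
    // Fetch&lt;dynamic&gt; maps each result row to a dynamic object,
    // so the PIVOT columns need no predefined POCO properties.
    return db.Fetch&lt;dynamic&gt;(pivotSql);
}
</code></pre>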
<h2>DoddleReport and Dynamic Types</h2>
<p>Now that PetaPoco was returning a dynamic data set, I needed to figure out what DoddleReport could do with it. I had not
	worked with DoddleReport up to this point, but quickly located the code in our Controller that performed the export to Excel
	functionality. I checked the
	<em>ToReportSource()</em> method overloads and saw it had one for an
	<em>IEnumerable</em>.</p>
<p>A typical
	<em>ReportResult</em> Action with DoddleReport does not require much code, but it sure does a lot behind the scenes:</p>
<script src="https://gist.github.com/1806138.js"></script>
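<p>For reference, a minimal Action of this kind looks something like the following sketch (the repository call and report title are placeholders; the gist shows the real code):</p>
<pre><code>// Sketch: DoddleReport infers the output format from the URL extension.
public ReportResult PivotReport()
{
    var results = _repository.GetExpirationReport();
    var report = new Report(results.ToReportSource());
    report.TextFields.Title = "Requirement Expirations";
    return new ReportResult(report);
}
</code></pre>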
<p>My first attempt to run the export with no change to the DoddleReport code was met with a blank export. I
	<a href="https://twitter.com/#!/iToddWood/status/167737343623696387" target="_blank">tweeted</a> out my status of trying this and
	<a href="http://www.matthidinger.com/" target="_blank">Matt Hidinger</a> quickly pointed me to the
	<a href="http://doddlereport.codeplex.com/SourceControl/changeset/view/840affd7397e#src%2fDoddleReport%2fDynamic%2fDynamicReportSource.cs"
	 target="_blank">DynamicReportSource.cs on CodePlex</a>. I added the
	<em>DynamicReportSource</em> code as a Helper in my project (note: there is a
	<a href="http://nuget.org/packages/DoddleReport.Dynamic" target="_blank">NuGet package</a> for it) and tried to use it. The first issue was that the code used an
	<em>ExpandoObject</em>, so I changed it to use
	<em>dynamic</em> instead. The second was the handling of NULL values, which I worked around by substituting a
	<em>new object()</em> when
	<em>t.Value</em> is null.</p>
<p>Modified DoddleReport
	<em>DynamicReportSource.cs</em> file used:</p>
<script src="https://gist.github.com/1784079.js"></script>
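<p>The NULL workaround amounts to substituting a placeholder when a field value is missing, along these lines (variable name taken from the description above):</p>
<pre><code>// Sketch: DoddleReport cannot render a null field value,
// so fall back to an empty object placeholder.
var fieldValue = t.Value ?? new object();
</code></pre>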
<h2>Dynamic PetaPoco and DoddleReport</h2>
<p>With the final changes to DoddleReport in place, our project now supports exporting PIVOT reports to Microsoft Excel. I put
	together a
	<a href="https://github.com/tawman/PetaPocoPivot" target="_blank">sample ASP.NET MVC3 application</a> that demonstrates the flow of the dynamic data from SQL Server via PetaPoco and DoddleReport
	to Excel.</p>
<h3>Standard MVC ActionResult View</h3>
<p>The default Index action displays the dynamic data in a normal table view by iterating over the dynamic results:
	<img style="background-image: none; border-right-width: 0px; padding-left: 0px; padding-right: 0px; display: inline; border-top-width: 0px; border-bottom-width: 0px; border-left-width: 0px; padding-top: 0px"
	 title="PivotWebExample" border="0" alt="PivotWebExample" src="/assets/images/PivotWebExample.png" width="630" height="395">
</p>
<h3>Exporting Data to Excel</h3>
<p>The
	<em>DoddleReport Excel</em> link in the header calls the <em>/Home/PivotReport.xlsx</em> ReportResult Action to trigger the
	export to Excel. The .xlsx extension is what tells DoddleReport we want an Excel result:
</p>
<p>
	<img style="background-image: none; border-right-width: 0px; padding-left: 0px; padding-right: 0px; display: inline; border-top-width: 0px; border-bottom-width: 0px; border-left-width: 0px; padding-top: 0px"
	 title="PivotExcelExample" border="0" alt="PivotExcelExample" src="/assets/images/PivotExcelExample.png" width="630" height="435">
</p>
<h3>Generating a Default HTML View</h3>
<p>If you leave off the extension like the DoddleReport HTML header link does, then DoddleReport will generate the HTML view
	similar to the Index view I coded:
	<img style="background-image: none; border-right-width: 0px; padding-left: 0px; padding-right: 0px; display: inline; border-top-width: 0px; border-bottom-width: 0px; border-left-width: 0px; padding-top: 0px"
	 title="PivotDoddleHtmlExample" border="0" alt="PivotDoddleHtmlExample" src="/assets/images/PivotDoddleHtmlExample.png" width="630"
	 height="395">
</p>
<p>DoddleReport supports additional formats like PDF, but I did not add the necessary NuGet packages to support it. With the
	proper library references in place, all it takes to generate a PDF is to add the .pdf extension to the ReportResult Action
	URL and DoddleReport will happily generate it for you.</p>
<h2>Conclusion</h2>
<p>Yet again I learned to never underestimate PetaPoco. I now have the same respect for DoddleReport, which makes a dynamic
	teammate for PetaPoco in delivering Excel exports based on our PIVOT query results.</p>
<p>I hope you found this blog post helpful. Be sure to check out the full source code on
	<a href="https://github.com/tawman/PetaPocoPivot" target="_blank">GitHub: tawman / PetaPocoPivot</a> to see this dynamic pair in action.</p>]]></content><author><name>Todd A. Wood</name><uri>https://www.woodcp.com</uri></author><category term="DoddleReport" /><category term="PetaPoco" /><category term="DoddleReport" /><category term="PetaPoco" /><summary type="html"><![CDATA[TL;DR – Sample ASP.NET MVC3 application demonstrating how to load PIVOT query results as a dynamic object with PetaPoco and exporting the data to Excel using DoddleReport. This blog post can be forked on GitHub: tawman / PetaPocoPivot. Dynamic Data Structures with PIVOT SQL Server 2008 R2 provides the capability to PIVOT and UNPIVOT table data depending on your need to flatten rows into columns or columns into rows. In our case, we needed to transform multiple rows of employee expiration dates into a single employee row with columns for each requirement expiration date. SQL PIVOT Query Example The following source data is used to illustrate a SQL PIVOT query in action: Employee Requirement Expiration Date John Doe Alpha 01/12/2011 John Doe Bravo 11/07/2012 John Doe Charlie NULL John Doe Delta 10/11/2009 Sally Smith Bravo 05/10/2012 Sally Smith Charlie 07/04/2012 Sally Smith Echo 08/31/2012 A simple SQL PIVOT query to transform the data: The following table illustrates the results of the SQL PIVOT query: Employee Alpha Bravo Charlie Delta Echo John Doe 01/12/2011 11/07/2012 &nbsp; 10/11/2009 &nbsp; Sally Smith &nbsp; 05/10/2012 07/04/2012 &nbsp; 08/31/2012 Data Driven Object Structure Obviously, defining a POCO object to load the PIVOT data ahead of time is not possible if the aggregated column values are not known until runtime. Our initial implementation bypassed PetaPoco entirely and used a DataTable to export the data to Excel using DoddleReport. Working around PetaPoco did not seem right so I looked for an alternate approach involving PetaPoco. 
PetaPoco and Dynamic Types Fortunately, C# provides a dynamic type but I did not know if PetaPoco and DoddleReport could support this option. Every time I think we pushed PetaPoco to its limit, I learn that it has more tricks up its sleeve. I decided to not even consult the documentation and wired it up to see if we could return an IEnumerable&lt;dynamic&gt; instead of a DataTable. The answer came as no surprise when coding the PIVOT query into the Repository that PetaPoco could care less if you actually had a POCO: DoddleReport and Dynamic Types Now that PetaPoco was returning a dynamic data set, I needed to figure out what DoddleReport could do with it. I had not worked with DoddleReport up to this point, but quickly located the code in our Controller that performed the export to Excel functionality. I checked the ToReportSource() method overloads and saw it had one for an IEnumerable. Typical ReportResult Action with DoddleReport does not require much code, but it sure does a lot behind the scenes: My first attempt to run the export with no change to the DoddleReport code was met with a blank export. I tweeted out my status of trying this and Matt Hidinger quickly pointed me to the DynamicReportSource.cs on CodePlex. I added the DynamicReportSource code as a Helper in my project (Note: There is a Nuget package for it) and tried to use it. The first issue I ran into was that the code used an ExpandoObject, but I changed it to use dynamic instead. The second issue I ran into was the handling of NULL values, but I worked around this by using a new object() when the t.Value is null. Modified DoddleReport DynamicReportSource.cs file used: Dynamic PetaPoco and DoddleReport With the final changes to DoddleReport in place, our project now supports exporting PIVOT reports to Microsoft Excel. I put together a sample ASP.NET MVC3 application that demonstrates the flow of the dynamic data from SQL Server via PetaPoco and DoddleReport to Excel. 
Standard MVC ActionResult View The default Index action displays the dynamic data in a normal table view by iterating over the dynamic results: Exporting Data to Excel The DoddleReport Excel link in the header will call the /Home/PivotReport .xlsx ReportAction to trigger the export to Excel. The .xlsx extension it what tells DoddleReport we want an Excel result: Generating a Default HTML View If you leave off the extension like the DoddleReport HTML header link does, then DoddleReport will generate the HTML view similar to the Index view I coded: DoddleReport supports additional formats like PDF, but I did not add the necessary Nuget packages to support it. With the proper library references in place, all it takes to generate a PDF is to add the .PDF extension to the ReportResult Action URL and DoddleReport will happily generate it for you. Conclusion Yet again I learned to never underestimate PetaPoco. I now have the same respect for DoddleReport as it makes a dynamic teammate with PetaPoco to deliver Excel exports based on our PIVOT query results. I hope you found this blog post helpful and be sure to check out the full source code on GitHub: tawman / PetaPocoPivot to see this dynamic pair in action.]]></summary></entry><entry><title type="html">Hierarchical Data with PetaPoco Multi-Poco Mapping</title><link href="https://www.woodcp.com/2012/02/hierarchical-data-with-petapoco-multi-poco-mapping/" rel="alternate" type="text/html" title="Hierarchical Data with PetaPoco Multi-Poco Mapping" /><published>2012-02-01T06:00:00+00:00</published><updated>2012-02-01T06:00:00+00:00</updated><id>https://www.woodcp.com/2012/02/hierarchical-data-with-petapoco-multi-poco-mapping</id><content type="html" xml:base="https://www.woodcp.com/2012/02/hierarchical-data-with-petapoco-multi-poco-mapping/"><![CDATA[<p>
    TL;DR – Sample ASP.NET MVC3 application demonstrating how to load hierarchical table data with a single SQL
    <em>SELECT</em> statement using
    <a href="http://www.toptensoftware.com/petapoco/" target="_blank">PetaPoco&apos;s</a>&nbsp;
    <a href="http://www.toptensoftware.com/Articles/115/PetaPoco-Mapping-One-to-Many-and-Many-to-One-Relationships"
        target="_blank">Multi-Poco mapping feature</a>.
</p>
<p>
    This blog post can be forked on
    <a href="https://github.com/tawman/PetaPocoHierarchy" target="_blank">GitHub: tawman / PetaPocoHierarchy</a>.
</p>
<h2>PetaPoco – A Tiny ORM-ish Thing</h2>
<p>
    We are using
    <a href="http://www.toptensoftware.com/petapoco/" target="_blank">PetaPoco</a> on a current project implementation since our data management needs do not
    <a href="http://ayende.com/blog/136195/when-should-you-use-nhibernate" target="_blank">rise to the level of NHibernate</a>. I learn something new each time I think we have reached the limits of what PetaPoco can
    achieve for us. The most recent scenario involved loading hierarchical data without introducing a
    <a href="http://nhprof.com/Learn/Alerts/SelectNPlusOne" target="_blank">SELECT N+1 problem</a>. I will demonstrate how to use the PetaPoco Multi-Poco Mapping Feature to load hierarchical table
    data with a single SELECT statement.
</p>
<h3>Multi-Poco Mapping Feature</h3>
<p>
    <img style="background-image: none; border-right-width: 0px; padding-left: 0px; padding-right: 0px; display: inline; border-top-width: 0px; border-bottom-width: 0px; border-left-width: 0px; padding-top: 0px"
        title="PetaPocoInfoGraphicDisplay" border="0" alt="PetaPocoInfoGraphicDisplay" src="/assets/images/PetaPocoInfoGraphicDisplay.png"
        width="630" height="298">
</p>
<p>
    The
    <a href="http://www.toptensoftware.com/Articles/115/PetaPoco-Mapping-One-to-Many-and-Many-to-One-Relationships" target="_blank">PetaPoco – Mapping One-to-Many and Many-to-One Relationships</a>
    blog article described using the Multi-Poco Mapping Feature for loading one-to-many relationships, but I found the same technique
    works well with hierarchical data.
</p>
<h4>Organization Hierarchical Data Example</h4>
<p>
    I built a simple ASP.NET MVC3 website involving an organization hierarchy modeled as a single table with a self-referencing
    column for the parent organization. The site contains a single page and table to display the hierarchical organization
    tree.
</p>
<p>
    The DDL used to generate the table in SQL Server 2008 R2 Express is shown in the following gist:
</p>
<script src="https://gist.github.com/1700953.js"></script>
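<p>The table boils down to a self-referencing design along these lines (the column types here are assumptions; the gist is authoritative):</p>
<pre><code>-- Sketch: single table with a self-referencing parent column.
CREATE TABLE Organization (
    Id               UNIQUEIDENTIFIER NOT NULL PRIMARY KEY,
    ParentId         UNIQUEIDENTIFIER NULL
                     REFERENCES Organization (Id),
    OrganizationCode NVARCHAR(10)  NOT NULL,
    Name             NVARCHAR(100) NOT NULL
);
</code></pre>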
<p>
The following table shows the sample data loaded into the Organization table for this demo; I replaced the Guid values with integers to make the parent-child relationships easier to identify:
</p>
<table border="1" cellspacing="0" cellpadding="2" width="606">
    <thead>
        <tr>
            <th width="50">Id</th>
            <th width="50">ParentId</th>
            <th width="50">OrganizationCode</th>
            <th width="450">Name</th>
        </tr>
    </thead>
    <tbody>
        <tr>
            <td>1</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
        </tr>
        <tr>
            <td>2</td>
            <td>1</td>
            <td>MW</td>
            <td>Midwest</td>
        </tr>
        <tr>
            <td>3</td>
            <td>1</td>
            <td>NE</td>
            <td>Northeast</td>
        </tr>
        <tr>
            <td>4</td>
            <td>1</td>
            <td>NW</td>
            <td>Northwest</td>
        </tr>
        <tr>
            <td>5</td>
            <td>1</td>
            <td>SE</td>
            <td>Southeast</td>
        </tr>
        <tr>
            <td>6</td>
            <td>1</td>
            <td>SW</td>
            <td>Southwest</td>
        </tr>
        <tr>
            <td>7</td>
            <td>2</td>
            <td>MI</td>
            <td>Michigan</td>
        </tr>
        <tr>
            <td>8</td>
            <td>2</td>
            <td>IN</td>
            <td>Indiana</td>
        </tr>
        <tr>
            <td>9</td>
            <td>2</td>
            <td>IL</td>
            <td>Illinois</td>
        </tr>
        <tr>
            <td>10</td>
            <td>2</td>
            <td>OH</td>
            <td>Ohio</td>
        </tr>
        <tr>
            <td>11</td>
            <td>7</td>
            <td>WC</td>
            <td>Wayne</td>
        </tr>
        <tr>
            <td>12</td>
            <td>7</td>
            <td>WA</td>
            <td>Washtenaw</td>
        </tr>
        <tr>
            <td>13</td>
            <td>11</td>
            <td>DE</td>
            <td>Detroit</td>
        </tr>
        <tr>
            <td>14</td>
            <td>11</td>
            <td>CA</td>
            <td>Canton</td>
        </tr>
        <tr>
            <td>15</td>
            <td>11</td>
            <td>PM</td>
            <td>Plymouth</td>
        </tr>
        <tr>
            <td>16</td>
            <td>12</td>
            <td>AA</td>
            <td>Ann Arbor</td>
        </tr>
        <tr>
            <td>17</td>
            <td>12</td>
            <td>YP</td>
            <td>Ypsilanti</td>
        </tr>
        <tr>
            <td>18</td>
            <td>16</td>
            <td>UM</td>
            <td>University of Michigan</td>
        </tr>
        <tr>
            <td>19</td>
            <td>17</td>
            <td>EMU</td>
            <td>Eastern Michigan University</td>
        </tr>
    </tbody>
</table>
<p>The website uses the following Organization POCO object to store the hierarchical data from the Organization table:</p>
    <script src="https://gist.github.com/1701086.js"></script>
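<p>In outline, the POCO mirrors the table columns plus a children collection for the hierarchy (property names are inferred from the tables above; see the gist for the actual class):</p>
<pre><code>// Sketch of the Organization POCO.
public class Organization
{
    public Guid Id { get; set; }
    public Guid? ParentId { get; set; }
    public string OrganizationCode { get; set; }
    public string Name { get; set; }

    // Populated by the relator helper, not mapped from a column.
    public List&lt;Organization&gt; Children { get; set; }
}
</code></pre>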
<p>Loading the data from SQL Server is accomplished with a single CTE query whose results PetaPoco transforms with a custom
    helper method. The Organization Repository
    <em>GetAll()</em> method invokes the PetaPoco Multi-Poco Mapping feature as follows:</p>
    <script src="https://gist.github.com/1721103.js"></script>
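<p>The call shape is PetaPoco&apos;s multi-poco <em>Fetch</em> overload, which hands each pair of mapped POCOs to a callback. Roughly (the SQL variable and connection name are placeholders; the gist is authoritative):</p>
<pre><code>// Sketch: each result row maps two Organization POCOs (the node and
// one of its ancestors); the relator callback stitches the hierarchy.
public List&lt;Organization&gt; GetAll()
{
    var db = new PetaPoco.Database("DefaultConnection");
    return db.Fetch&lt;Organization, Organization, Organization&gt;(
        new OrganizationParentRelator().BuildHierarchy, cteSql);
}
</code></pre>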
<p>The CTE query prepares the Organization data for processing by the custom Multi-Poco Mapping Helper
    <em>OrganizationParentRelator().BuildHierarchy</em> with the following query result set
    (<em>only the last four characters of each Guid are shown</em>):</p>
<table style="font-size: 90%" border="1" cellspacing="0" cellpadding="0">
    <thead>
        <tr>
            <th>Id</th>
            <th>Parent</th>
            <th>Code</th>
            <th>Name</th>
            <th>Id</th>
            <th>Parent</th>
            <th>Code</th>
            <th>Name</th>
            <th>Level</th>
            <th>LinkId</th>
        </tr>
    </thead>
    <tbody>
        <tr>
            <td>5D6B</td>
            <td>26A5</td>
            <td>SW</td>
            <td>Southwest</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>1</td>
            <td>5D6B</td>
        </tr>
        <tr>
            <td>5D6B</td>
            <td>26A5</td>
            <td>SW</td>
            <td>Southwest</td>
            <td>5D6B</td>
            <td>26A5</td>
            <td>SW</td>
            <td>Southwest</td>
            <td>0</td>
            <td>5D6B</td>
        </tr>
        <tr>
            <td>FACC</td>
            <td>BF51</td>
            <td>WC</td>
            <td>Wayne</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>3</td>
            <td>FACC</td>
        </tr>
        <tr>
            <td>FACC</td>
            <td>BF51</td>
            <td>WC</td>
            <td>Wayne</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>2</td>
            <td>FACC</td>
        </tr>
        <tr>
            <td>FACC</td>
            <td>BF51</td>
            <td>WC</td>
            <td>Wayne</td>
            <td>BF51</td>
            <td>88BB</td>
            <td>MI</td>
            <td>Michigan</td>
            <td>1</td>
            <td>FACC</td>
        </tr>
        <tr>
            <td>FACC</td>
            <td>BF51</td>
            <td>WC</td>
            <td>Wayne</td>
            <td>FACC</td>
            <td>BF51</td>
            <td>WC</td>
            <td>Wayne</td>
            <td>0</td>
            <td>FACC</td>
        </tr>
        <tr>
            <td>BF51</td>
            <td>88BB</td>
            <td>MI</td>
            <td>Michigan</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>2</td>
            <td>BF51</td>
        </tr>
        <tr>
            <td>BF51</td>
            <td>88BB</td>
            <td>MI</td>
            <td>Michigan</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>1</td>
            <td>BF51</td>
        </tr>
        <tr>
            <td>BF51</td>
            <td>88BB</td>
            <td>MI</td>
            <td>Michigan</td>
            <td>BF51</td>
            <td>88BB</td>
            <td>MI</td>
            <td>Michigan</td>
            <td>0</td>
            <td>BF51</td>
        </tr>
        <tr>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>0</td>
            <td>26A5</td>
        </tr>
        <tr>
            <td>3661</td>
            <td>FACC</td>
            <td>PM</td>
            <td>Plymouth</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>4</td>
            <td>3661</td>
        </tr>
        <tr>
            <td>3661</td>
            <td>FACC</td>
            <td>PM</td>
            <td>Plymouth</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>3</td>
            <td>3661</td>
        </tr>
        <tr>
            <td>3661</td>
            <td>FACC</td>
            <td>PM</td>
            <td>Plymouth</td>
            <td>BF51</td>
            <td>88BB</td>
            <td>MI</td>
            <td>Michigan</td>
            <td>2</td>
            <td>3661</td>
        </tr>
        <tr>
            <td>3661</td>
            <td>FACC</td>
            <td>PM</td>
            <td>Plymouth</td>
            <td>FACC</td>
            <td>BF51</td>
            <td>WC</td>
            <td>Wayne</td>
            <td>1</td>
            <td>3661</td>
        </tr>
        <tr>
            <td>3661</td>
            <td>FACC</td>
            <td>PM</td>
            <td>Plymouth</td>
            <td>3661</td>
            <td>FACC</td>
            <td>PM</td>
            <td>Plymouth</td>
            <td>0</td>
            <td>3661</td>
        </tr>
        <tr>
            <td>D24E</td>
            <td>88BB</td>
            <td>IL</td>
            <td>Illinois</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>2</td>
            <td>D24E</td>
        </tr>
        <tr>
            <td>D24E</td>
            <td>88BB</td>
            <td>IL</td>
            <td>Illinois</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>1</td>
            <td>D24E</td>
        </tr>
        <tr>
            <td>D24E</td>
            <td>88BB</td>
            <td>IL</td>
            <td>Illinois</td>
            <td>D24E</td>
            <td>88BB</td>
            <td>IL</td>
            <td>Illinois</td>
            <td>0</td>
            <td>D24E</td>
        </tr>
        <tr>
            <td>DCAC</td>
            <td>BF51</td>
            <td>WA</td>
            <td>Washtenaw</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>3</td>
            <td>DCAC</td>
        </tr>
        <tr>
            <td>DCAC</td>
            <td>BF51</td>
            <td>WA</td>
            <td>Washtenaw</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>2</td>
            <td>DCAC</td>
        </tr>
        <tr>
            <td>DCAC</td>
            <td>BF51</td>
            <td>WA</td>
            <td>Washtenaw</td>
            <td>BF51</td>
            <td>88BB</td>
            <td>MI</td>
            <td>Michigan</td>
            <td>1</td>
            <td>DCAC</td>
        </tr>
        <tr>
            <td>DCAC</td>
            <td>BF51</td>
            <td>WA</td>
            <td>Washtenaw</td>
            <td>DCAC</td>
            <td>BF51</td>
            <td>WA</td>
            <td>Washtenaw</td>
            <td>0</td>
            <td>DCAC</td>
        </tr>
        <tr>
            <td>FD9B</td>
            <td>DCAC</td>
            <td>YP</td>
            <td>Ypsilanti</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>4</td>
            <td>FD9B</td>
        </tr>
        <tr>
            <td>FD9B</td>
            <td>DCAC</td>
            <td>YP</td>
            <td>Ypsilanti</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>3</td>
            <td>FD9B</td>
        </tr>
        <tr>
            <td>FD9B</td>
            <td>DCAC</td>
            <td>YP</td>
            <td>Ypsilanti</td>
            <td>BF51</td>
            <td>88BB</td>
            <td>MI</td>
            <td>Michigan</td>
            <td>2</td>
            <td>FD9B</td>
        </tr>
        <tr>
            <td>FD9B</td>
            <td>DCAC</td>
            <td>YP</td>
            <td>Ypsilanti</td>
            <td>DCAC</td>
            <td>BF51</td>
            <td>WA</td>
            <td>Washtenaw</td>
            <td>1</td>
            <td>FD9B</td>
        </tr>
        <tr>
            <td>FD9B</td>
            <td>DCAC</td>
            <td>YP</td>
            <td>Ypsilanti</td>
            <td>FD9B</td>
            <td>DCAC</td>
            <td>YP</td>
            <td>Ypsilanti</td>
            <td>0</td>
            <td>FD9B</td>
        </tr>
        <tr>
            <td>18F8</td>
            <td>FACC</td>
            <td>DE</td>
            <td>Detroit</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>4</td>
            <td>18F8</td>
        </tr>
        <tr>
            <td>18F8</td>
            <td>FACC</td>
            <td>DE</td>
            <td>Detroit</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>3</td>
            <td>18F8</td>
        </tr>
        <tr>
            <td>18F8</td>
            <td>FACC</td>
            <td>DE</td>
            <td>Detroit</td>
            <td>BF51</td>
            <td>88BB</td>
            <td>MI</td>
            <td>Michigan</td>
            <td>2</td>
            <td>18F8</td>
        </tr>
        <tr>
            <td>18F8</td>
            <td>FACC</td>
            <td>DE</td>
            <td>Detroit</td>
            <td>FACC</td>
            <td>BF51</td>
            <td>WC</td>
            <td>Wayne</td>
            <td>1</td>
            <td>18F8</td>
        </tr>
        <tr>
            <td>18F8</td>
            <td>FACC</td>
            <td>DE</td>
            <td>Detroit</td>
            <td>18F8</td>
            <td>FACC</td>
            <td>DE</td>
            <td>Detroit</td>
            <td>0</td>
            <td>18F8</td>
        </tr>
        <tr>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>1</td>
            <td>88BB</td>
        </tr>
        <tr>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>0</td>
            <td>88BB</td>
        </tr>
        <tr>
            <td>947A</td>
            <td>BB56</td>
            <td>UM</td>
            <td>University of Michigan</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>5</td>
            <td>947A</td>
        </tr>
        <tr>
            <td>947A</td>
            <td>BB56</td>
            <td>UM</td>
            <td>University of Michigan</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>4</td>
            <td>947A</td>
        </tr>
        <tr>
            <td>947A</td>
            <td>BB56</td>
            <td>UM</td>
            <td>University of Michigan</td>
            <td>BF51</td>
            <td>88BB</td>
            <td>MI</td>
            <td>Michigan</td>
            <td>3</td>
            <td>947A</td>
        </tr>
        <tr>
            <td>947A</td>
            <td>BB56</td>
            <td>UM</td>
            <td>University of Michigan</td>
            <td>DCAC</td>
            <td>BF51</td>
            <td>WA</td>
            <td>Washtenaw</td>
            <td>2</td>
            <td>947A</td>
        </tr>
        <tr>
            <td>947A</td>
            <td>BB56</td>
            <td>UM</td>
            <td>University of Michigan</td>
            <td>BB56</td>
            <td>DCAC</td>
            <td>AA</td>
            <td>Ann Arbor</td>
            <td>1</td>
            <td>947A</td>
        </tr>
        <tr>
            <td>947A</td>
            <td>BB56</td>
            <td>UM</td>
            <td>University of Michigan</td>
            <td>947A</td>
            <td>BB56</td>
            <td>UM</td>
            <td>University of Michigan</td>
            <td>0</td>
            <td>947A</td>
        </tr>
        <tr>
            <td>AB85</td>
            <td>26A5</td>
            <td>NW</td>
            <td>Northwest</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>1</td>
            <td>AB85</td>
        </tr>
        <tr>
            <td>AB85</td>
            <td>26A5</td>
            <td>NW</td>
            <td>Northwest</td>
            <td>AB85</td>
            <td>26A5</td>
            <td>NW</td>
            <td>Northwest</td>
            <td>0</td>
            <td>AB85</td>
        </tr>
        <tr>
            <td>24D5</td>
            <td>FACC</td>
            <td>CA</td>
            <td>Canton</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>4</td>
            <td>24D5</td>
        </tr>
        <tr>
            <td>24D5</td>
            <td>FACC</td>
            <td>CA</td>
            <td>Canton</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>3</td>
            <td>24D5</td>
        </tr>
        <tr>
            <td>24D5</td>
            <td>FACC</td>
            <td>CA</td>
            <td>Canton</td>
            <td>BF51</td>
            <td>88BB</td>
            <td>MI</td>
            <td>Michigan</td>
            <td>2</td>
            <td>24D5</td>
        </tr>
        <tr>
            <td>24D5</td>
            <td>FACC</td>
            <td>CA</td>
            <td>Canton</td>
            <td>FACC</td>
            <td>BF51</td>
            <td>WC</td>
            <td>Wayne</td>
            <td>1</td>
            <td>24D5</td>
        </tr>
        <tr>
            <td>24D5</td>
            <td>FACC</td>
            <td>CA</td>
            <td>Canton</td>
            <td>24D5</td>
            <td>FACC</td>
            <td>CA</td>
            <td>Canton</td>
            <td>0</td>
            <td>24D5</td>
        </tr>
        <tr>
            <td>B338</td>
            <td>88BB</td>
            <td>IN</td>
            <td>Indiana</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>2</td>
            <td>B338</td>
        </tr>
        <tr>
            <td>B338</td>
            <td>88BB</td>
            <td>IN</td>
            <td>Indiana</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>1</td>
            <td>B338</td>
        </tr>
        <tr>
            <td>B338</td>
            <td>88BB</td>
            <td>IN</td>
            <td>Indiana</td>
            <td>B338</td>
            <td>88BB</td>
            <td>IN</td>
            <td>Indiana</td>
            <td>0</td>
            <td>B338</td>
        </tr>
        <tr>
            <td>F352</td>
            <td>FD9B</td>
            <td>EMU</td>
            <td>Eastern Michigan University</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>5</td>
            <td>F352</td>
        </tr>
        <tr>
            <td>F352</td>
            <td>FD9B</td>
            <td>EMU</td>
            <td>Eastern Michigan University</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>4</td>
            <td>F352</td>
        </tr>
        <tr>
            <td>F352</td>
            <td>FD9B</td>
            <td>EMU</td>
            <td>Eastern Michigan University</td>
            <td>BF51</td>
            <td>88BB</td>
            <td>MI</td>
            <td>Michigan</td>
            <td>3</td>
            <td>F352</td>
        </tr>
        <tr>
            <td>F352</td>
            <td>FD9B</td>
            <td>EMU</td>
            <td>Eastern Michigan University</td>
            <td>DCAC</td>
            <td>BF51</td>
            <td>WA</td>
            <td>Washtenaw</td>
            <td>2</td>
            <td>F352</td>
        </tr>
        <tr>
            <td>F352</td>
            <td>FD9B</td>
            <td>EMU</td>
            <td>Eastern Michigan University</td>
            <td>FD9B</td>
            <td>DCAC</td>
            <td>YP</td>
            <td>Ypsilanti</td>
            <td>1</td>
            <td>F352</td>
        </tr>
        <tr>
            <td>F352</td>
            <td>FD9B</td>
            <td>EMU</td>
            <td>Eastern Michigan University</td>
            <td>F352</td>
            <td>FD9B</td>
            <td>EMU</td>
            <td>Eastern Michigan University</td>
            <td>0</td>
            <td>F352</td>
        </tr>
        <tr>
            <td>4EE6</td>
            <td>26A5</td>
            <td>SE</td>
            <td>Southeast</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>1</td>
            <td>4EE6</td>
        </tr>
        <tr>
            <td>4EE6</td>
            <td>26A5</td>
            <td>SE</td>
            <td>Southeast</td>
            <td>4EE6</td>
            <td>26A5</td>
            <td>SE</td>
            <td>Southeast</td>
            <td>0</td>
            <td>4EE6</td>
        </tr>
        <tr>
            <td>FCFC</td>
            <td>88BB</td>
            <td>OH</td>
            <td>Ohio</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>2</td>
            <td>FCFC</td>
        </tr>
        <tr>
            <td>FCFC</td>
            <td>88BB</td>
            <td>OH</td>
            <td>Ohio</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>1</td>
            <td>FCFC</td>
        </tr>
        <tr>
            <td>FCFC</td>
            <td>88BB</td>
            <td>OH</td>
            <td>Ohio</td>
            <td>FCFC</td>
            <td>88BB</td>
            <td>OH</td>
            <td>Ohio</td>
            <td>0</td>
            <td>FCFC</td>
        </tr>
        <tr>
            <td>22E8</td>
            <td>26A5</td>
            <td>NE</td>
            <td>Northeast</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>1</td>
            <td>22E8</td>
        </tr>
        <tr>
            <td>22E8</td>
            <td>26A5</td>
            <td>NE</td>
            <td>Northeast</td>
            <td>22E8</td>
            <td>26A5</td>
            <td>NE</td>
            <td>Northeast</td>
            <td>0</td>
            <td>22E8</td>
        </tr>
        <tr>
            <td>BB56</td>
            <td>DCAC</td>
            <td>AA</td>
            <td>Ann Arbor</td>
            <td>26A5</td>
            <td>NULL</td>
            <td>US</td>
            <td>United States</td>
            <td>4</td>
            <td>BB56</td>
        </tr>
        <tr>
            <td>BB56</td>
            <td>DCAC</td>
            <td>AA</td>
            <td>Ann Arbor</td>
            <td>88BB</td>
            <td>26A5</td>
            <td>MW</td>
            <td>Midwest</td>
            <td>3</td>
            <td>BB56</td>
        </tr>
        <tr>
            <td>BB56</td>
            <td>DCAC</td>
            <td>AA</td>
            <td>Ann Arbor</td>
            <td>BF51</td>
            <td>88BB</td>
            <td>MI</td>
            <td>Michigan</td>
            <td>2</td>
            <td>BB56</td>
        </tr>
        <tr>
            <td>BB56</td>
            <td>DCAC</td>
            <td>AA</td>
            <td>Ann Arbor</td>
            <td>DCAC</td>
            <td>BF51</td>
            <td>WA</td>
            <td>Washtenaw</td>
            <td>1</td>
            <td>BB56</td>
        </tr>
        <tr>
            <td>BB56</td>
            <td>DCAC</td>
            <td>AA</td>
            <td>Ann Arbor</td>
            <td>BB56</td>
            <td>DCAC</td>
            <td>AA</td>
            <td>Ann Arbor</td>
            <td>0</td>
            <td>BB56</td>
        </tr>
    </tbody>
</table>

<p>The custom helper is supplied as a parameter to the PetaPoco Multi-Poco <em>Fetch</em> call:</p>

<script src="https://gist.github.com/tawman/b02c543fad69ddfb48772a01457d51cb.js"></script>

<p>The repository query is structured so that each resulting row contains an Organization (T1) and its Parent Organization
    (T2) at one level of the hierarchy. The query returns one row for each level of an Organization's ancestry, from the root
    ancestor down to the Organization itself at level 0, and the rows are sorted by Organization Hierarchy Level in descending order.</p>

<p>The
    <em>OrganizationParentRelator.BuildHierarchy(organization, parent)</em> helper method pushes each parent organization upward
    as the rows are processed and returns the hierarchical representation of the Organization (TRet) once
    no more parents remain:</p>
<p>
<script src="https://gist.github.com/1721109.js"></script>
</p>
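<p>For readers who do not want to click through, here is a minimal, self-contained sketch of how such a relator can be written. The property names and the exact null-handling are illustrative assumptions on my part; the authoritative implementation lives in the gist above.</p>

```csharp
using System;

public class Organization
{
    public Guid Id { get; set; }
    public string OrganizationCode { get; set; }
    public string Name { get; set; }
    // Illustrative navigation property linking each node to its parent.
    public Organization Parent { get; set; }
}

public class OrganizationParentRelator
{
    // Ancestor chain accumulated so far for the organization being built.
    private Organization _chain;

    // Called once per result row. Rows for a given organization arrive
    // sorted by hierarchy level descending, root ancestor first.
    public Organization BuildHierarchy(Organization organization, Organization parent)
    {
        // Level 0 row: the organization joined to itself, so the chain is complete.
        if (parent.Id == organization.Id)
        {
            organization.Parent = _chain;
            _chain = null;
            return organization;   // emitted into the Fetch result list
        }

        // Ancestor row: push the previously seen (higher) ancestor upward.
        parent.Parent = _chain;
        _chain = parent;
        return null;               // incomplete rows produce no result item
    }
}
```

<p>Returning null from the callback is the standard PetaPoco relator convention: it signals that the row did not complete a result item, so only fully assembled organizations land in the fetched list. Walking <em>result.Parent</em>, <em>result.Parent.Parent</em>, and so on then yields the full ancestry up to the root.</p>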
<h4>Resulting Organization Structure Displayed Online</h4>
<p>
    <img title="WebPageView" alt="WebPageView" src="/assets/images/WebPageView.png" width="630" height="492">
</p>

<h3>Conclusion</h3>
<p>I hope you found this blog post helpful. Be sure to check out the full source code on
    <a href="https://github.com/tawman/PetaPocoHierarchy"
        target="_blank">GitHub: tawman / PetaPocoHierarchy</a>. The PetaPoco project page over at
    <a href="http://www.toptensoftware.com/petapoco/"
        target="_blank">Topten Software</a> has more examples in its blog posts section for using the Multi-Poco Mapping feature in other
    ways.</p>]]></content><author><name>Todd A. Wood</name><uri>https://www.woodcp.com</uri></author><category term="PetaPoco" /><category term="PetaPoco" /><summary type="html"><![CDATA[TL;DR – Sample ASP.NET MVC3 application demonstrating how to load hierarchical table data with a single SQL SELECT statement using PetaPoco&apos;s&nbsp; Multi-Poco mapping feature. This blog post can be forked on GitHub: tawman / PetaPocoHierarchy. PetaPoco – A Tiny ORM-ish Thing We are using PetaPoco on a current project implementation since our data management needs do not rise to the level of NHibernate. I learn something new each time I think we reached the limits of what PetaPoco can achieve for us. The most recent scenario involved loading hierarchical data without introducing a SELECT N+1 problem. I will demonstrate how to use the PetaPoco Multi-Poco Mapping Feature to load hierarchical table data with a single SELECT statement. Multi-Poco Mapping Feature The PetaPoco – Mapping One-to-Many and Many-to-One Relationships blog article described using the Multi-Poco Mapping Feature for loading one-to-many relationships, but I found the same technique works well with hierarchical data. Organization Hierarchical Data Example I built a simple ASP.NET MVC3 website involving an organization hierarchy modeled as a single table with a self-referencing column for the parent organization. The site contains a single page and table to display the hierarchical organization tree. 
The DDL used to generate the table in SQL Server 2008 R2 Express is shown in the following gist: The following table illustrates the sample data loaded in the Organization table and used for this demo, but I replaced the Guid values with integers to make the parent-child relationships easier to identify: Id ParentId OrganizationCode Name 1 NULL US United States 2 1 MW Midwest 3 1 NE Northeast 4 1 NW Northwest 5 1 SE Southeast 6 1 SW Southwest 7 2 MI Michigan 8 2 IN Indiana 9 2 IL Illinois 10 2 OH Ohio 11 7 WC Wayne 12 7 WA Washtenaw 13 11 DE Detroit 14 11 CA Canton 15 11 PM Plymouth 16 12 AA Ann Arbor 17 12 YP Ypsilanti 18 16 UM University of Michigan 19 17 EMU Eastern Michigan University The website uses the following Organization POCO object to store the hierarchical data from the Organization table: Loading the data from SQL Server is accomplished with a single CTE query, but processed by PetaPoco with a custom helper method to transform the query results. The Organization Repository GetAll() method invokes the PetaPoco Multi-Poco Mapping feature as follows: The CTE query prepares the Organization data for processing by the custom Multi-Poco Mapping Helper OrganizationParentRelator().BuildHierarchy with the following query result set (only last 4 of Guid shown): Id Parent Code Name Id Parent Code Name Level LinkId 5D6B 26A5 SW Southwest 26A5 NULL US United States 1 5D6B 5D6B 26A5 SW Southwest 5D6B 26A5 SW Southwest 0 5D6B FACC BF51 WC Wayne 26A5 NULL US United States 3 FACC FACC BF51 WC Wayne 88BB 26A5 MW Midwest 2 FACC FACC BF51 WC Wayne BF51 88BB MI Michigan 1 FACC FACC BF51 WC Wayne FACC BF51 WC Wayne 0 FACC BF51 88BB MI Michigan 26A5 NULL US United States 2 BF51 BF51 88BB MI Michigan 88BB 26A5 MW Midwest 1 BF51 BF51 88BB MI Michigan BF51 88BB MI Michigan 0 BF51 26A5 NULL US United States 26A5 NULL US United States 0 26A5 3661 FACC PM Plymouth 26A5 NULL US United States 4 3661 3661 FACC PM Plymouth 88BB 26A5 MW Midwest 3 3661 3661 FACC PM Plymouth BF51 88BB 
MI Michigan 2 3661 3661 FACC PM Plymouth FACC BF51 WC Wayne 1 3661 3661 FACC PM Plymouth 3661 FACC PM Plymouth 0 3661 D24E 88BB IL Illinois 26A5 NULL US United States 2 D24E D24E 88BB IL Illinois 88BB 26A5 MW Midwest 1 D24E D24E 88BB IL Illinois D24E 88BB IL Illinois 0 D24E DCAC BF51 WA Washtenaw 26A5 NULL US United States 3 DCAC DCAC BF51 WA Washtenaw 88BB 26A5 MW Midwest 2 DCAC DCAC BF51 WA Washtenaw BF51 88BB MI Michigan 1 DCAC DCAC BF51 WA Washtenaw DCAC BF51 WA Washtenaw 0 DCAC FD9B DCAC YP Ypsilanti 26A5 NULL US United States 4 FD9B FD9B DCAC YP Ypsilanti 88BB 26A5 MW Midwest 3 FD9B FD9B DCAC YP Ypsilanti BF51 88BB MI Michigan 2 FD9B FD9B DCAC YP Ypsilanti DCAC BF51 WA Washtenaw 1 FD9B FD9B DCAC YP Ypsilanti FD9B DCAC YP Ypsilanti 0 FD9B 18F8 FACC DE Detroit 26A5 NULL US United States 4 18F8 18F8 FACC DE Detroit 88BB 26A5 MW Midwest 3 18F8 18F8 FACC DE Detroit BF51 88BB MI Michigan 2 18F8 18F8 FACC DE Detroit FACC BF51 WC Wayne 1 18F8 18F8 FACC DE Detroit 18F8 FACC DE Detroit 0 18F8 88BB 26A5 MW Midwest 26A5 NULL US United States 1 88BB 88BB 26A5 MW Midwest 88BB 26A5 MW Midwest 0 88BB 947A BB56 UM University of Michigan 26A5 NULL US United States 5 947A 947A BB56 UM University of Michigan 88BB 26A5 MW Midwest 4 947A 947A BB56 UM University of Michigan BF51 88BB MI Michigan 3 947A 947A BB56 UM University of Michigan DCAC BF51 WA Washtenaw 2 947A 947A BB56 UM University of Michigan BB56 DCAC AA Ann Arbor 1 947A 947A BB56 UM University of Michigan 947A BB56 UM University of Michigan 0 947A AB85 26A5 NW Northwest 26A5 NULL US United States 1 AB85 AB85 26A5 NW Northwest AB85 26A5 NW Northwest 0 AB85 24D5 FACC CA Canton 26A5 NULL US United States 4 24D5 24D5 FACC CA Canton 88BB 26A5 MW Midwest 3 24D5 24D5 FACC CA Canton BF51 88BB MI Michigan 2 24D5 24D5 FACC CA Canton FACC BF51 WC Wayne 1 24D5 24D5 FACC CA Canton 24D5 FACC CA Canton 0 24D5 B338 88BB IN Indiana 26A5 NULL US United States 2 B338 B338 88BB IN Indiana 88BB 26A5 MW Midwest 1 B338 B338 88BB IN Indiana 
B338 88BB IN Indiana 0 B338 F352 FD9B EMU Eastern Michigan University 26A5 NULL US United States 5 F352 F352 FD9B EMU Eastern Michigan University 88BB 26A5 MW Midwest 4 F352 F352 FD9B EMU Eastern Michigan University BF51 88BB MI Michigan 3 F352 F352 FD9B EMU Eastern Michigan University DCAC BF51 WA Washtenaw 2 F352 F352 FD9B EMU Eastern Michigan University FD9B DCAC YP Ypsilanti 1 F352 F352 FD9B EMU Eastern Michigan University F352 FD9B EMU Eastern Michigan University 0 F352 4EE6 26A5 SE Southeast 26A5 NULL US United States 1 4EE6 4EE6 26A5 SE Southeast 4EE6 26A5 SE Southeast 0 4EE6 FCFC 88BB OH Ohio 26A5 NULL US United States 2 FCFC FCFC 88BB OH Ohio 88BB 26A5 MW Midwest 1 FCFC FCFC 88BB OH Ohio FCFC 88BB OH Ohio 0 FCFC 22E8 26A5 NE Northeast 26A5 NULL US United States 1 22E8 22E8 26A5 NE Northeast 22E8 26A5 NE Northeast 0 22E8 BB56 DCAC AA Ann Arbor 26A5 NULL US United States 4 BB56 BB56 DCAC AA Ann Arbor 88BB 26A5 MW Midwest 3 BB56 BB56 DCAC AA Ann Arbor BF51 88BB MI Michigan 2 BB56 BB56 DCAC AA Ann Arbor DCAC BF51 WA Washtenaw 1 BB56 BB56 DCAC AA Ann Arbor BB56 DCAC AA Ann Arbor 0 BB56]]></summary></entry><entry><title type="html">Light the Afterburners with WP Engine</title><link href="https://www.woodcp.com/2012/01/light-the-afterburners-with-wp-engine/" rel="alternate" type="text/html" title="Light the Afterburners with WP Engine" /><published>2012-01-27T06:00:00+00:00</published><updated>2012-01-27T06:00:00+00:00</updated><id>https://www.woodcp.com/2012/01/light-the-afterburners-with-wp-engine</id><content type="html" xml:base="https://www.woodcp.com/2012/01/light-the-afterburners-with-wp-engine/"><![CDATA[<p>
<a href="http://wpengine.com"><img style="background-image: none; margin: 0px 0px 0px 20px; padding-left: 0px; padding-right: 0px; display: inline; float: right; padding-top: 0px; border-width: 0px;" title="wpe-banner-125x125" src="/assets/images/wpe-banner-125x125.jpg" alt="wpe-banner-125x125" width="129" height="129" align="right" border="0" /></a>
TL;DR – You will enjoy rolling your own <a href="http://wordpress.org/download/" target="_blank">WordPress</a>
instance on <a href="https://www.windowsazure.com/en-us/" target="_blank">Azure</a> until you
see the cost of scaling and maintaining the site. I decided to spend more time blogging and
less time and money managing the infrastructure by leaving the speed, scale, security, and
support of a fully managed <em>WordPress</em> installation to Austin&apos;s very own
<a href="http://wpengine.com/" target="_blank">WP Engine</a>.
</p>
<h2>Turbulence</h2>
<p>
It did not take long before I hit my first patch of turbulence in the
<a href="https://www.windowsazure.com/en-us/" target="_blank">Windows Azure Cloud</a>. My
second blog post about using <a href="http://www.woodcp.com/2012/01/heterogeneous-development-environment-with-parallels-desktop-7-for-mac/" target="_blank">Parallels Desktop® 7 for Mac</a>
for a development machine was retweeted by <a href="https://twitter.com/#!/parallelsmac" target="_blank">@ParallelsMac</a>
to its 18K+ followers. I checked my blog entry for any comments, but was greeted by the
message: <strong>Error establishing a database connection</strong>.
</p>
<h2>The Cathedral and the Bazaar in the Sky</h2>
<p>
<img style="background-image: none; padding-left: 0px; padding-right: 20px; display: inline; float: left; padding-top: 0px; border-width: 0px;" title="Windows Azure" src="/assets/images/Azure.png" alt="Windows Azure" width="207" height="175" align="left" border="0" />
I had <em>WordPress 3.3.1</em> configured to run on two Extra Small Azure server instances
(Shared CPU Cores and 768 MB Memory) with a <em>SQL Azure</em> backend. I figured this
configuration would suffice for a single blog site with no following.
</p>
<p>
The first action I took during the outage was to start up an additional two instances via the
web management interface. This immediately resolved the issue, but I assumed the traffic that
caused it was long gone and never to return. Since I was only 3 days into my 90 day free trial
of Azure, there was no cost for spinning up additional instances. However, projecting the cost
forward under normal conditions after the trial period gave me cause for concern.
</p>
<h3>Cloud with a Silver Lining</h3>
<p>
Windows Azure is quite up front about its pricing levels and provides an excellent
<a href="https://www.windowsazure.com/en-us/pricing/calculator/" target="_blank">Pricing Calculator</a>
to check out the cost of various configurations. It took me several days, but I finally
spotted the <em>Billing</em> link on my management portal page. I checked today to see the
net effect of running five Extra Small Azure instances over the past couple of days. The 6 days' worth
of statistics made it clear I would not be needing the full 90 day <em>free</em> trial period:
<ul>
  <li>Database (DB / Month) – Web Edition: 19.355% of 1 db/month</li>
  <li>Compute Hours –   Windows Azure Compute: 17.2% of 750 Unit</li>
  <li>Storage Transactions (in 10,000s) – Windows Azure Storage: 3.112% of 5 10,000s</li>
  <li>Data Transfer Out: 0.17 GB of 80 GB</li>
  <li>Data Transfer In: 0.39 GB</li>
</ul>
</p>
<p>
The free trial period includes 750 hours of compute time per month, but that is only enough
for a single Extra Small instance. Azure's recommended configuration is two instances,
and even that was not enough for my small burst of traffic.
</p>
<h3>Seeding Your Own Cloud</h3>
<p>
I thoroughly enjoyed rolling my own instance of <em>WordPress</em> in the Azure cloud, but the
price of scaling it is too much for the average blogger. Change management of the
<em>WordPress</em> site is not as straightforward either when running your own cloud. I found
this out quickly when using the <em>WordPress</em> administration interface.
</p>
<p>
I knew from the start that running multiple instances would isolate certain aspects of the
framework, but I did not realize by how much. The <a href="https://github.com/Interop-Bridges/Windows-Azure-PHP-Scaffolders" target="_blank">Windows-Azure-PHP-Scaffolders</a> project includes a Windows Azure Storage plug-in to make images and static content available across all instances. In addition, the <em>WordPress</em> database keeps your content and plug-in settings available to all instances. The problem is with the other items that you frequently change when first setting up a new <em>WordPress</em>
site:
<ul>
  <li>Plugins – You cannot install or uninstall plugins using the online admin interface unless you understand the fine art of round-robin load-balance dancing (F5 or Command-R) to install them on all nodes</li>
  <li>Themes – You cannot install or modify themes using the online admin interface without the same issue</li>
  <li>You could install certificates and use Remote Desktop to connect to the instances directly, but Azure provisions any new instance from the originally uploaded base image, so all component and template changes will be lost on the new instance.</li>
</ul>
</p>
<p>
The remedy is to configure the <em>WordPress</em> plugins and theme modifications on your local
computer, repackage the changes into a new Azure deployment, and use the Azure management
console to stage the upgrade into the Cloud. Azure makes the process quite simple for someone
like me, but it is more complicated than simply using the WP-Admin interface to make the
changes online.
</p>
<h2>The Specialists</h2>
<p>
Reliability and performance are rarely free or inexpensive, so I went back to where I started 6
days ago looking for hosted <em>WordPress</em> providers. I had no interest in the standard web
hosting shared / VPS offerings thereby limiting my search to true SaaS providers like
<a href="https://en.wordpress.com/signup/">WordPress.com</a> and
<a href="http://wpengine.com">WP Engine</a>. After seeing the cost of Azure, I did not even
bother with exploring Amazon AWS.
</p>
<h3>WordPress .COM</h3>
<p>
My new review of <em>WordPress.com</em> left me with the same concerns as before. It needs to
control your DNS zone file, it supports only short-form domain name URLs, and from what I read
you cannot use any plugin that generates or uses JavaScript. There was also some confusing
verbiage about ads, and I am not looking to run ads on my blog no matter how much I mention
the products I use and find valuable.
</p>
<h3>WP Engine</h3>
<p>
<a href="http://wpengine.com/"><img style="background-image: none; margin: 0px 0px 0px 20px; padding-left: 0px; padding-right: 0px; display: inline; float: right; padding-top: 0px; border-width: 0px;" title="engine" src="http://www.woodcp.com/wp-content/uploads/2012/01/engine.png" alt="engine" width="244" height="244" align="right" border="0" /></a>WP Engine, on the other hand, provides exactly what I was attempting to accomplish on my own, but without the change management overhead and scaling costs. I also like the idea of supporting a <a href="http://wpengine.com/about/">locally operated Austin business</a>.
If I am not mistaken, I first heard of WP Engine by word of mouth after they sponsored or
helped out at an Austin technical user group meeting. This just goes to prove
that more tech companies should sponsor local tech user groups with beer and food.
</p>
<p>
The best part: it took me only 43 minutes from the time I received the
<em>Welcome to WP Engine!</em> email until the moment I logged into
<a href="http://www.namecheap.com/">Namecheap.com</a> to update my DNS settings to point my
CNAME record directly to WP Engine.
</p>
<h2>Closing Thought</h2>
<p>
In the end, Azure is fun but WP Engine will get it done. I enjoyed the experience of
understanding the finer points of packaging and deploying to Azure, but I am not willing to
spend the dollars necessary to operate successfully on it. Stay tuned; I have only been on WP
Engine for 5 hours now.
</p>
<h2>References</h2>
<ul>
  <li>Eric S. Raymond (1999). <a class="external text" href="http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/" rel="nofollow"><em>The Cathedral &amp; the Bazaar</em></a>. O&apos;Reilly. ISBN 1-56592-724-9</li>
</ul>]]></content><author><name>Todd A. Wood</name><uri>https://www.woodcp.com</uri></author><category term="Azure" /><category term="Azure" /><category term="Parallels" /><category term="WP Engine" /><summary type="html"><![CDATA[TL; DR – You will enjoy rolling your own WordPress instance on Azure until you see the cost of scaling and maintaining the site. I decided to spend more time blogging and less time and money managing the infrastructure by leaving the speed, scale, security, and support of a fully managed WordPress installation to Austin&apos;s very own WP Engine. Turbulence It did not take long before I hit my first patch of turbulence in the Windows Azure Cloud. My second blog post about using Parallels Desktop® 7 for Mac for a development machine was retweeted by @ParallelsMac to its 18K+ followers. I checked my blog entry for any comments, but was greeted by the message: Error establishing a database connection. The Cathedral and the Bazaar in the Sky I had WordPress 3.3.1 configured to run on two Extra Small Azure server instances (Shared CPU Cores and 768 MB Memory) with a SQL Azure backend. I figured this configuration would suffice for a single blog site with no following. The first action I took during the outage was to start up an additional two instances via the web management interface. This immediately resolved the issue, but I assumed the traffic that caused it was long gone and never to return. Since I was only 3 days into my 90 day free trial of Azure, there was no cost for spinning up additional instances. However, projecting the cost forward under normal conditions after the trial period gave me cause for concern. Cloud with a Silver Lining Windows Azure is quite up front about its pricing levels and provides an excellent Pricing Calculator to check out the cost of various configurations. It took me several days, but I finally spotted the Billing link on my management portal page. 
I checked today to see the net effect of running five Extra Small Azure instances the past couple days. The 6 days worth of statistics made it clear I would not be needing the full 90 day free trial period: Database (DB / Month) – Web Edition: 19.355% of 1 db/month Compute Hours – Windows Azure Compute: 17.2% of 750 Unit Storage Transactions (in 10,000s) – Windows Azure Storage: 3.112% of 5 10,000s liData Transfer Out: 0.17 GB of 80GB Data Transfer In: 0.39 GB The free trial period includes 750 hours of compute time per month, but that is only enough for a single Extra Small instance. The recommended configuration by Azure is two instances and that was not even enough for my small burst of traffic. Seeding Your Own Cloud I thoroughly enjoyed rolling my own instance of WordPress in the Azure cloud, but the price of scaling it is too much for the average blogger. Change management of the WordPress site is not as straight forward either when running your own cloud. I found this out quickly when using the WordPress administration interface. I knew from the start that running multiple instances would isolate certain aspects of the framework but not by how much. The Windows-Azure-PHP-Scaffolders included a Windows Azure Storage plug-in to make images and static content available across all instances. In addition, the WordPress database will keep your content and plug-in settings available for all instances. The problem is with the other items that you frequently change when first setting up a new WordPress site: Plugins – You cannot install or uninstall plugins using the online admin interface unless you understand the fine art of round robin load balance dancing (F5 or Command-R) to install them on all nodes Themes – You cannot install or modify themes using the online admin interface without the same issue You could install the certificates to Remote Desktop into the instances, but Azure will use the base image uploaded when starting up any new instances. 
All component and template changes will be lost on the new instance. The remedy is to configure the WordPress plugins and theme modifications on your local computer, repackage the changes into a new Azure deployment, and use the Azure management console to stage the upgrade into the Cloud. Azure makes the process quite simple for someone like myself, but it is more complicated than simply using the WP-Admin interface to make the changes online. The Specialists Reliability and performance is rarely free or inexpensive, so I went back to where I started 6 days ago looking for hosted WordPress providers. I had no interest in the standard web hosting shared / VPS offerings thereby limiting my search to true SaaS providers like WordPress.com and WP Engine. After seeing the cost of Azure, I did not even bother with exploring Amazon AWS. WordPress .COM My new review of WordPress.com left me with the same concerns as before. It needs to control your DNS zone file, only short form domain name URL, and from what I read you cannot use any plugin that generates / uses javascript. There was some confusing verbiage on ads and I am not looking to run ads on my blog no matter how much I mention the products I use and find value. WP Engine WP Engine on the other hand, provides exactly what I was attempting to accomplish on my own, but without the change management opportunity and scaling costs. I also like the idea of supporting a locally operated Austin business. If I am not mistaken, I only know of WP Engine by word of mouth of them either sponsoring or helping out at some Austin technical user group meeting or something. This just goes to prove that more tech companies should be sponsoring local tech user groups with beer and food. The best part, it literally only took me 43 minutes from the time I received the Welcome to WP Engine! email until the moment I logged into Namecheap.com to update my DNS settings to point my CNAME record directly to WP Engine. 
Closing Thought In the end, Azure is fun but WP Engine will get it done. I enjoyed the experience of understanding the finer points of packaging and deploying to Azure, but I am not willing to spend the dollars necessary to operate successfully on it. Stay tuned, I have only been on WP Engine for 5 hours now. References Eric S. Raymond (1999). The Cathedral &amp; the Bazaar. O&apos;Reilly. ISBN 1-56592-724-9]]></summary></entry></feed>