Tip: Out-ExcelClipboard in PowerShell

From time to time I want to get some result from a PowerShell command into Excel. Unless I’m missing something, there isn’t a great option to do this in standard PowerShell.

I know that you can use Export-Csv or Export-Clixml and then open the file in Excel. I know you can use Out-GridView and then copy and paste from there into Excel (albeit without the column headings). You can also use .NET types directly, but I just want to pipe my result to a function and then paste into Excel. I want <some output of previous functions> | Out-ExcelClipboard

function Out-ExcelClipboard {
    param(
        [Parameter(ValueFromPipelineByPropertyName, ValueFromPipeline)]
        $InputObject
    )
    begin {
        $Objects = @()
    }
    process {
        $Objects += $InputObject
    }
    end {
        $Objects | ConvertTo-Csv -Delimiter "`t" -NoTypeInformation | Set-Clipboard
    }
}

Export-ModuleMember -Function Out-ExcelClipboard

I’ve added this function to my module which I load by default in my PowerShell profile. This took more code than I was expecting. For the curious, ValueFromPipeline tells PowerShell to bind the output from the pipeline to this parameter. It then calls the process block for each value in the pipeline.

You can add begin and end blocks to do extra stuff before and after all the values have been processed. In my case, stuff them all into a new collection, convert to tab-delimited text without including any type information and then pass that value to the clipboard. Lovely.
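For example, piping a few properties of Get-Process output through the function puts a tab-delimited table, headings included, on the clipboard ready to paste straight into Excel (Get-Process is just stand-in data here; substitute whatever you were going to export):

Get-Process | Select-Object Name, Id, CPU | Out-ExcelClipboard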

Tip: Test for Tables Missing from Permission Sets

In PowerShell:

$tablesInPermissionSets = @()

$permissionSets = gci . -Recurse -Filter '*.al' | ? {(gc $_.FullName).Item(0).startsWith('permissionset')}
$permissionSets | % {
    $content = gc $_.FullName -Raw
    [Regex]::Matches($content, '(?<=tabledata ).*(?= =)') | % {
        $tablesInPermissionSets += $_.Value
    }
}

$tablesInTables = @()

$tables = gci . -Recurse -Filter '*.al' | Where-Object {(Get-Content $_.FullName).Item(0).StartsWith('table ')}
$tables | % {
    $content = gc $_.FullName -Raw
    [Regex]::Matches($content, "(?<=table \d+ ).*(?=$([Environment]::NewLine))") | % {
        $tablesInTables += $_.Value
    }
}

$missingTables = ""

Compare-Object $tablesInTables $tablesInPermissionSets | ? SideIndicator -eq '<=' | % {
    $missingTables += $_.InputObject + [Environment]::NewLine
}

if ('' -ne $missingTables) {
    throw "Missing table permissions: $missingTables"
}

In English:

  1. Find all the files in the current folder, and child folders, with a filename ending in .al and which have a first line starting with “permissionset”
  2. Build a collection of the tabledata objects that are referenced in those permission sets
  3. Find all the files in the current folder, and child folders, with a filename ending in .al and which have a first line starting with “table ” (with a space to avoid matching “tableextension”)
  4. Build a collection of the names of the tables
  5. Use Compare-Object to compare the collections and find names which appear in the list of tables but not in tabledata permissions (see the small example after this list)
  6. Build an error message of missing table permissions
  7. Throw the error
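
The Compare-Object step is the one doing the real work: with the table names as the reference collection and the permission set entries as the difference collection, anything flagged with a '<=' side indicator is a table with no corresponding tabledata permission. A minimal illustration with made-up table names:

Compare-Object @('"Table A"', '"Table B"') @('"Table A"')

InputObject SideIndicator
----------- -------------
"Table B"   <=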

PowerShell Profile:

Like most small PowerShell scripts that I write, I’ve just added it to my PowerShell profile. Run code $profile in a PowerShell prompt to open the profile file in VS Code.

function Test-Permissions() {
  #...all of the above code
}

Maybe there is already a VS Code extension that checks for this? It would make sense, but I’m pretty minimalist with the extensions that I have installed anyway. I run it from the terminal in VS Code.

Tip: Install all Apps in a Folder to a Container

Intro

I do this fairly often to prep a new (local) container for development. The script needs to be run on the Docker host and assumes that the PowerShell prompt is already at the folder containing the apps (or apps in child folders) you want to install.

Use the dev endpoint if you are going to want to publish the same apps from VS Code into the container. Or publish into the global scope if you prefer – maybe if you are creating an environment for testing.

Publishing to Dev Scope Using Dev Endpoint

$container = Read-Host -Prompt 'Container'; $credential = Get-Credential; Sort-AppFilesByDependencies -appFiles (gci . -Recurse -Filter *.app | % {$_.FullName}) | % {try{Publish-BcContainerApp $container -appFile $_ -sync -upgrade -skipVerification -useDevEndpoint -credential $credential} catch {Publish-BcContainerApp $container -appFile $_ -sync -install -skipVerification -useDevEndpoint -credential $credential}}

Publishing to Global Scope

$container = Read-Host -Prompt 'Container'; Sort-AppFilesByDependencies -appFiles (gci . -Recurse -Filter *.app | % {$_.FullName}) | % {try{Publish-BcContainerApp $container -appFile $_ -sync -upgrade -skipVerification} catch {Publish-BcContainerApp $container -appFile $_ -sync -install -skipVerification}}

Tip: List-Commits

function List-Commits {
  cd 'C:\Git'
  $Commits = @()
  Get-ChildItem . -Directory | % {
    cd "$_"
    # only folders which contain a .git folder are Git repositories
    if (Test-Path (Join-Path (Get-Location) '.git')) {
      # tag each commit with the folder name and parse the delimited output into objects
      $Commits += git log --all --format="$($_)~%h~%ai~%s~%an" | ConvertFrom-Csv -Delimiter '~' -Header ('Project,Hash,Date,Message,Author'.Split(','))
    }
    cd ..
  }
  # keep only your own commits and show them, newest first
  $Commits | ? Author -EQ "$(git config --get user.name)" | sort Date -Descending | Out-GridView -Title 'Commits'
}

This function iterates through Git repositories under the same parent folder (C:\Git in my case), builds a list of all the commits that you’ve authored (i.e. that match your user.name in Git config) and displays them in descending date order in a grid view.

Change the path in the second line to suit, or just remove it to have it search for repositories under the current directory.
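If you would rather not edit the function each time, one small variation (just a sketch, using a hypothetical $Path parameter) is to take the parent folder as a parameter and default it to the current directory:

function List-Commits {
  param ($Path = (Get-Location))  # hypothetical parameter: the folder containing your repositories
  Push-Location $Path
  # ...rest of the function as above, without the cd 'C:\Git' line...
  Pop-Location
}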

Sample output in a PowerShell grid view

I use it to remind myself what I’ve been working on over the last few days. Mostly for fun and only occasionally because I’m late filling in my timesheets… 🙄

Tip: Remove-BranchesWithUpstreamGone

Wait, I Thought I Deleted That Branch?

One of the things that I found counter-intuitive when I was getting started with Git is that when branches are deleted on the server they are still present in your local repository, even after you have fetched from the server.

We typically delete the source branch when completing a pull request, so this happens a lot. Usually, once the PR has been completed I want to:

  1. remove the reference to the deleted remote branch
  2. remove the corresponding local branch

Removing Remote Reference

The reference to the remote branch is removed when you run git fetch with the prune switch.

git fetch --prune
Fetching origin
From <remote url>
- [deleted] (none) -> origin/test

Removing Local Branches

Local branches can be removed with the git branch command. The -d switch first checks for unmerged commits and will not delete the branch if it has any commits which exist only in the branch being deleted. The -D switch overrides the check and deletes the branch anyway.

git branch -d test
Deleted branch test (was <commit hash>)

Remove-BranchesWithUpstreamGone

I’ve added a couple of PowerShell functions to my profile file – which means they are always available in my terminal. If I’m working on an app and I know that some PRs have been merged I can clean up my workspace by running Remove-BranchesWithUpstreamGone in VS Code’s terminal.

As a rule, I don’t need to keep any branches which used to have an upstream copy on the server but don’t any more (indicated by [gone] in the list of branches). Obviously, local branches which have never been pushed to the server won’t be deleted.

function Remove-BranchesWithUpstreamGone {
  (Get-BranchesWithUpstreamGone) | ForEach-Object {
    Write-Host "Removing branch $_"
    git branch $_ -D
  }
}

function Get-BranchesWithUpstreamGone {
  git fetch --all --prune | Out-Null
  # branches whose upstream has been deleted are flagged with [gone] in the verbose listing
  $Branches = git branch -v | Where-Object { $_.Contains('[gone]') }
  $BranchNames = $Branches | ForEach-Object {
    # the current branch is prefixed with '* ', other branches with two spaces,
    # so the branch name is either the second or third element when split on spaces
    if ($_.Split(' ').Item(1) -ne '') {
      $_.Split(' ').Item(1)
    }
    else {
      $_.Split(' ').Item(2)
    }
  }

  $BranchNames
}
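
If you just want to preview which local branches would be removed, without deleting anything, you can call the helper function on its own (note that it still runs git fetch --all --prune):

Get-BranchesWithUpstreamGone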