With the recent GA release of the Power Platform Build Tools, we added support for creating an environment on demand and using that environment in subsequent tasks, for example to generate a build artifact and then delete the environment afterwards.
Note that the URL defined in my service connection is overridden; only the auth in the service connection is used, to ensure that the credentials passed have sufficient privileges to create the environment. The Build Tools tasks that deal with environment actions such as create, backup, and delete currently only support username/password auth. Support for service principals in those tasks is just around the corner...
Happy CI/CD'ing 😊
One more detail: I do not use the reset task that comes with Power Platform Build Tools, or the UI, because there is a bug (I do not know if it persists) that installs the Dynamics app in the environment.
Brilliant, thank you!
As a practice, I use a sandbox environment with username and password, and after each build I reset the environment using PowerShell.
Install-Module -Name Microsoft.PowerApps.Administration.PowerShell -Confirm:$False -Force
Install-Module -Name Microsoft.PowerApps.PowerShell -AllowClobber -Confirm:$False -Force

# Here is how you can pass in credentials from pipeline variables
$pass = ConvertTo-SecureString "$(Pass)" -AsPlainText -Force
Add-PowerAppsAccount -Username "$(Username)" -Password $pass

# Definition of the reset request for the environment
$resetRequest = [pscustomobject]@{
    FriendlyName     = "bla"
    DomainName       = "bla"
    Purpose          = "purpose"
    BaseLanguageCode = 1033
    Currency         = [pscustomobject]@{
        Code   = "USD"
        Name   = "USD"
        Symbol = "$"
    }
    Templates        = @('D365_CDS')
}

# Reset the environment (pass your environment id)
Reset-PowerAppEnvironment -EnvironmentName "<environment-id>" -ResetRequestDefinition $resetRequest
I typically use a service principal where I can, but I also use username/password. I don't think I articulated my question well enough. What I was hoping to do was to pass a parameter to the pipeline to determine the name of the environment, so I could specify the environment name at runtime, e.g. "May2021DEV". However, it appears that I need to hardcode a name into the service connection for this to work, and for each subsequent step to be able to do its thing. 🙂
@Mikkelsen2000: Are the Tasks that deal with Environments already supported for connections with a service principal? I guess Nick and I are experiencing the same problem now.
I have been experimenting a bit, trying to build some pipelines with "just in time" environments. What I can't figure out (and it is likely obvious) is how to establish the service connection for subsequent steps, e.g. create an environment followed by a task that imports a solution. The docs refer to BuildTools.EnvironmentUrl, but I am unsure how to use that. Is there a place where I can see an example? THANKS!
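For context on how this chaining works: per the Build Tools docs, the create-environment task writes the new environment's URL into the pipeline variable BuildTools.EnvironmentUrl, which later tasks can consume as their environment URL. The sketch below is a hedged illustration, not an official sample; task and input names are assumed from version 2 of the Build Tools tasks, and the service connection name, domain name, and artifact path are placeholders — verify everything against your installed task versions.

```yaml
# Sketch: create a JIT environment, then import a solution into it.
steps:
  - task: PowerPlatformToolInstaller@2

  - task: PowerPlatformCreateEnvironment@2
    inputs:
      authenticationType: 'PowerPlatformEnvironment'
      PowerPlatformEnvironment: 'MyServiceConnection'   # placeholder connection
      DisplayName: 'JIT Build'
      DomainName: 'jit-build'
      EnvironmentSku: 'Sandbox'
      LocationName: 'unitedstates'
      LanguageName: '1033'
      CurrencyName: 'USD'
    # On success this task sets the pipeline variable $(BuildTools.EnvironmentUrl).

  - task: PowerPlatformImportSolution@2
    inputs:
      authenticationType: 'PowerPlatformEnvironment'
      PowerPlatformEnvironment: 'MyServiceConnection'
      # Target the just-created environment instead of the connection's URL:
      Environment: '$(BuildTools.EnvironmentUrl)'
      SolutionInputFile: '$(Pipeline.Workspace)/drop/MySolution.zip'
```

The key point is that the auth comes from the service connection, while the URL of the freshly created environment comes from $(BuildTools.EnvironmentUrl).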
Hi @Mikkelsen2000, two questions:
1. IIUC, the process of importing and exporting the solution in a "fresh" environment has the advantage of verifying that the solution doesn't depend on other components, correct? Currently we export the solution unmanaged and managed from our DEV environment directly, without an additional build environment.
2. When importing a solution containing a Power Automate Flow, the owner of the Flow changes to the app user that is used in the pipeline. This means that the owner of the Flow doesn't match the owner of the connection in the PROD environment. This owner mismatch seems to cause the Flow to be deactivated. The correct owner can go into the PROD environment and activate the Flow, but that's not really automated. Is there anything wrong on my side, or are you aware of this issue?
Thanks for your post.
When I use the 'create an environment' task I get this error:
Exception calling "AcquireToken" with "2" argument(s): "AADSTS90002: Tenant '***' not found. This may happen if there are no active subscriptions for the tenant. Check to make sure you have the correct tenant ID. Check with your subscription administrator."
What does this mean? Could it be that only generic service connections are supported?
edit: My service connection wasn't configured correctly.
Hello @damienjs, check this blog: https://benediktbergmann.eu/2020/02/10/cds-basic-alm-process/
and the guide from Microsoft: https://github.com/microsoft/PowerApps-Samples/tree/master/build-tools
This process follows the recommendation and best practice from Microsoft. Microsoft's approach, and the approach they recommend to everyone, is the "source-control centric" approach. This means that you should always store a functional version of your Solution in your source control. I see the following main reasons for this approach:
Build Managed Solution
Sub-Process – Build Managed Solution
The Solution we stored with the first sub-process is changed into a Managed Solution in this sub-process. To do so, we have to import the Solution into a JIT Build (Just-In-Time Build) environment, export it as Managed, and store the zip file as an artifact inside of DevOps.
We need the extra JIT Build environment because this process runs independently from the first one. This could result in our development environment holding a different version than the version we would like to package. Ideally, we would create a blank new environment when starting this process. I will not go into detail in this article but will publish another one specifically on this topic.
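This sub-process could be sketched as a pipeline like the following. This is a hedged sketch under assumptions, not the exact pipeline from the article: task and input names follow version 2 of the Power Platform Build Tools, and the connection name, solution name, and paths are placeholders you would replace with your own.

```yaml
# Sketch: import the unmanaged Solution into a JIT Build environment,
# export it as Managed, and publish the zip as a build artifact.
steps:
  - task: PowerPlatformToolInstaller@2

  - task: PowerPlatformImportSolution@2
    inputs:
      authenticationType: 'PowerPlatformEnvironment'
      PowerPlatformEnvironment: 'JitBuildConnection'   # placeholder connection
      SolutionInputFile: '$(Build.SourcesDirectory)/out/MySolution.zip'

  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: 'PowerPlatformEnvironment'
      PowerPlatformEnvironment: 'JitBuildConnection'
      SolutionName: 'MySolution'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MySolution_managed.zip'
      Managed: true

  # Store the Managed zip as an artifact for the release process.
  - publish: '$(Build.ArtifactStagingDirectory)/MySolution_managed.zip'
    artifact: 'managed-solution'
```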
@JaRiv You could use $(Build.BuildId) in the name of the build environment that you create to ensure each time the environment name is unique and knowable.
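As an illustration of that naming trick (all values here are placeholders, and the input names are assumed from the v2 create-environment task):

```yaml
- task: PowerPlatformCreateEnvironment@2
  inputs:
    authenticationType: 'PowerPlatformEnvironment'
    PowerPlatformEnvironment: 'MyServiceConnection'   # placeholder connection
    # $(Build.BuildId) makes the name unique per run and easy to trace
    # back to the pipeline run that created the environment.
    DisplayName: 'Build $(Build.BuildId)'
    DomainName: 'build-$(Build.BuildId)'
    EnvironmentSku: 'Sandbox'
    LocationName: 'unitedstates'
    LanguageName: '1033'
    CurrencyName: 'USD'
```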