Replies: 6 comments
-
Hi @cmeeren , So, dotnet-affected works by analyzing your dotnet projects. If the migrations (or smoke test, but I'll just say migrations for short) project doesn't have a
To build what has changed whenever a push happens, your pipeline needs to:
Here's an untested Azure Pipelines example. Unfortunately, I don't have any public real-use example to share for Azure Pipelines (we do have one for GitHub Actions):

```yaml
pr:
- main

variables:
  CI: 'true'
  DEFAULT_COMMIT_SOURCE: $(git rev-parse HEAD~1)
  TARGET_COMMIT: $(git rev-parse $SYSTEM_PULLREQUEST_TARGETBRANCH)

jobs:
- job: main
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  # Credits to https://nx.dev/recipes/other/azure-last-successful-commit
  # Set Azure DevOps CLI default settings
  - bash: az devops configure --defaults organization=$(System.TeamFoundationCollectionUri) project=$(System.TeamProject)
    displayName: 'Set default Azure DevOps organization and project'
  # Get last successful commit from the Azure DevOps CLI
  - bash: |
      LAST_SHA=$(az pipelines build list --branch $(Build.SourceBranchName) --definition-ids $(System.DefinitionId) --result succeeded --top 1 --query "[0].triggerInfo.\"ci.sourceSha\"")
      if [ -z "$LAST_SHA" ]
      then
        LAST_SHA=$DEFAULT_COMMIT_SOURCE
      fi
      echo "Last successful commit SHA: $LAST_SHA"
      echo "##vso[task.setvariable variable=COMMIT_SOURCE]$LAST_SHA"
    displayName: 'Get last successful commit SHA'
    env:
      AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
  - task: UseDotNet@2
    displayName: 'Install .NET SDK'
    inputs:
      version: 7.x
  # Restore tools
  - script: dotnet tool restore --tool-manifest .config/dotnet-tools.json
    displayName: Restore tools
  # Run dotnet-affected
  - script: |
      dotnet affected --verbose \
        -p $(System.DefaultWorkingDirectory) \
        --from $(COMMIT_SOURCE) \
        --to $(TARGET_COMMIT) \
        --format traversal text
      exitCode="$?"
      if [ "$exitCode" -eq 0 ]; then
        # Changes happened; we'll need to build them
        cat affected.proj
        echo "##vso[task.setvariable variable=HAS_CHANGES]true"
      elif [ "$exitCode" -eq 166 ]; then
        # Nothing changed; no need to build
        echo "##vso[task.complete result=Succeeded;]DONE"
        exit 0
      else
        exit $exitCode
      fi
    displayName: Run dotnet-affected
  # Restore
  - script: dotnet restore affected.proj
    displayName: Run dotnet restore
    condition: and(succeeded(), eq(variables.HAS_CHANGES, 'true'))
  # Build
  - script: |
      dotnet build affected.proj \
        --no-incremental \
        /clp:NoSummary \
        /nologo \
        /WarnAsError
    displayName: Run dotnet build
    condition: and(succeeded(), eq(variables.HAS_CHANGES, 'true'))
```

Regarding your last point:
You can use the … We have plans for a JSON formatter as well (#35). Finally, I would like to mention that you might want to take a look at …
I hope I haven't missed any of your questions. Feel free to ask more; I'll be happy to answer as soon as I find the time. Thanks for the feedback and for trying out the tool. Kind regards,
-
Thank you so much! This is immensely helpful. 🙏 There's a lot of other work currently, but hopefully I'll find time to work on this sometime during the next couple of weeks. When I've landed on something that seems to work, I'll report back with my findings. In the meantime, I may come back with more questions.
-
Can you explain why you set … Also, it seems like your example pipeline only triggers for PRs. Do you know of any modifications I need to make to the rest of the script if I want a pipeline that triggers for all commits to …
-
Hi @cmeeren
So, I copy/pasted from the NX example. Instead of using the first commit, I would add a step to the pipeline that builds everything whenever the last successfully built commit is not found. To build everything, you can use a Traversal project with glob pattern expressions to select all your projects.
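For example, a hand-written "build everything" Traversal project might look like this (a sketch; the file name `build-all.proj` and the glob patterns are assumptions about your repo layout):

```xml
<!-- build-all.proj (hypothetical name): selects every project under src/ -->
<Project Sdk="Microsoft.Build.Traversal/3.2.0">
  <ItemGroup>
    <ProjectReference Include="src/**/*.csproj" />
    <ProjectReference Include="src/**/*.fsproj" />
  </ItemGroup>
</Project>
```

The fallback step could then simply run `dotnet build build-all.proj` instead of `dotnet build affected.proj`.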
To have a single pipeline that builds both PRs and your main branch, you'll need a step at the beginning of the pipeline that determines the source and target commits. You can use the …
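As a rough, untested sketch of such a step (the `Build.Reason` check and the `HEAD~1` fallback are my assumptions; the variable names match the earlier example):

```yaml
- bash: |
    if [ "$BUILD_REASON" = "PullRequest" ]; then
      # PR build: compare against the PR's target branch
      echo "##vso[task.setvariable variable=TARGET_COMMIT]origin/$SYSTEM_PULLREQUEST_TARGETBRANCH"
    else
      # CI build on a branch: compare the pushed commit against its parent
      echo "##vso[task.setvariable variable=TARGET_COMMIT]$(git rev-parse HEAD)"
      echo "##vso[task.setvariable variable=COMMIT_SOURCE]$(git rev-parse HEAD~1)"
    fi
  displayName: Determine source and target commits
```

Later steps can then pass `$(COMMIT_SOURCE)` and `$(TARGET_COMMIT)` to `dotnet affected` unchanged, regardless of how the build was triggered.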
-
Hi @cmeeren , I'm converting this into a discussion. Feel free to continue the conversation there. Leo.
-
Hi again. I have now experimented a bit and just want to report back. As a reminder, my requirements are listed in the OP. Additionally, I would like to add this requirement:
Unfortunately, given my requirements, I found out that dotnet-affected is not the right solution for me:
Particularly for the last reason above, the only real solution for me is to build/pack/deploy the different APIs in parallel on different agents. Therefore, the best dotnet-affected can give me is the ability to determine (using quite a bit of custom code and parsing its output) exactly which pipelines to trigger. But this is already partially supported out of the box in Azure Pipelines by using trigger paths. My repo structure is simple enough that I can just include all source files in the trigger:

```yaml
trigger:
  branches:
    include:
    - '*'
  paths:
    include:
    - src/MyApi/*
    - src/Common/**/*.fsproj
    - src/Common/**/*.fs
    exclude:
    - src/Common/**/*Tests/*
    - src/Common/**/*TestApp/*
    - src/Common/**/*Benchmarks/*
```

While not ideal (in particular, this could become outdated after refactoring libraries), it is so much simpler that I'll stick to that instead. (For the record, I am also using pipeline templates to reduce pipeline step duplication.) I also looked into nx-dotnet, but I'm not willing to adjust my workflow by having to create new projects through nx (or at least remember to register them in a JSON file), change my repo structure (apps vs. libs), manage nx tags, etc.
-
Not sure this is the best place to ask, but it is at least somewhat related to dotnet-affected, and given the helpful responses I have received so far here, I figured I'd try.
I have a monorepo with many APIs. Each API has a smoke test project and a DB migrations project. These do not have a project reference to the API project (I could add that if needed as a workaround), but they should still always be built and packaged (as separate artifacts) along with the API, so that when the API is deployed, the release pipeline can run the DB migrations and smoke tests.
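If adding such a reference as a workaround, a `ProjectReference` with `ReferenceOutputAssembly="false"` creates the dependency in the project graph without actually referencing the API's output assembly (a sketch; the path is an assumption):

```xml
<!-- In the migrations (or smoke test) project file; path is illustrative -->
<ItemGroup>
  <ProjectReference Include="../MyApi/MyApi.fsproj"
                    ReferenceOutputAssembly="false" />
</ItemGroup>
```

This keeps the migrations project's compilation independent of the API while still making tools that walk project references treat it as a dependent.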
Our migration to a monorepo has just been completed, and there is still work left to do. Currently, I am not using dotnet-affected. Each API still has a separate Azure DevOps pipeline for build (YAML) and deploy (UI). I have set up path triggers in the build pipelines so that the APIs are rebuilt for changes to files I know may influence the build. This gives many false positives; it's fairly high-maintenance in the face of refactors (since project references are not picked up and have to be manually added to the path trigger list); and of course it means that common libraries are restored/rebuilt many times (once for each pipeline).
I would like to experiment with dotnet-affected, but I'm not sure how best to accomplish what I need, which is this:
How can this best be accomplished using dotnet-affected?