
Understanding .NET: How It's Made and Released

In software development, it's important to know how popular platforms like .NET are put together. .NET is a big part of building many kinds of apps for the desktop, the web, and more.

This guide will show you what goes on behind the scenes of .NET. We'll talk about how the people who make .NET write the code, test it, and then put it out there for everyone to use. We'll go step by step, explaining everything from how the code flows to how they keep everything safe and working well.

Table of Contents:

Multi-Repository Development in .NET

Challenges in Managing Dependencies Across Repositories

Dependency Consumption Strategies in .NET

Dependency Tracking (Product and Toolset Dependencies)

Automated Dependency Management with Maestro

Ensuring Coherency in Dependency Graph

Preparing for Release

Validation Process

Release Day Procedures

Code Flow and Security Patch Management

Shipping and Non-Shipping Assets

Multi-Repository Development in .NET

.NET development is structured around a constellation of repositories, each responsible for distinct aspects of the framework's ecosystem. For example, the core .NET runtime and associated NuGet packages are built within the dotnet/runtime repository. However, these outputs aren't self-contained; they're utilized by other repositories such as dotnet/aspnetcore, dotnet/installer, and dotnet/extensions. This inter-repository dependency network forms the backbone of .NET's development landscape.

Challenges in Managing Dependencies Across Repositories

While the multi-repository approach offers advantages in modularity and team autonomy, it introduces challenges in dependency management.

Key challenges include:

  1. Dependency Versioning: Repositories within the .NET ecosystem often rely on specific versions of dependencies. Coordinating these version requirements across repositories is crucial to ensure compatibility and stability.

  2. Dependency Resolution: As dependencies evolve, repositories must adapt to incorporate these changes seamlessly. However, updating dependencies can trigger cascading updates across interconnected repositories, leading to complex dependency resolution processes.

  3. Build Synchronization: Unlike a monolithic codebase where changes propagate uniformly, updates in one repository may necessitate corresponding adjustments in others. Coordinating these changes and synchronizing builds across repositories demand careful orchestration.

Dependency Consumption Strategies in .NET

.NET employs several strategies for managing dependencies across repositories:

  1. API Consumption: Repositories consume dependencies by interacting with their public APIs. While major releases maintain API stability, ongoing development may introduce API changes, requiring continuous adaptation.

  2. Version Referencing: Repositories reference specific version numbers of assets produced in other repositories. This ensures consistency and predictability in the dependency chain, enabling reproducible builds.

  3. Redistribution: Dependencies may be redistributed within the .NET ecosystem. For example, while certain packages are available on public repositories like NuGet, they may also be repackaged and distributed within larger frameworks or SDKs.

  4. Source Code Building: In cases where prebuilt binaries are unsuitable, dependencies are built from source code. This approach ensures compatibility across diverse environments and adheres to distribution policies, particularly in Linux distributions.

Dependency Tracking

Dependency tracking in .NET development involves a detailed process of keeping tabs on various dependencies utilized across repositories. This includes employing metadata files and specific MSBuild property names to effectively manage dependency relationships and version control strategies.

In each repository, such as dotnet/installer, dependency information is stored in a file named eng/Version.Details.xml. This file meticulously outlines the names, versions, and sources of dependencies utilized within that specific repository. For instance, it might specify that dotnet/installer relies on a particular version of Microsoft.NET.Sdk sourced from a specific commit in the dotnet/sdk repository.

Here's a glimpse of what such a file might look like:

<?xml version="1.0" encoding="utf-8"?>
<Dependencies>
  <ProductDependencies>
    <!-- Other dependencies... -->
    <Dependency Name="Microsoft.NET.Sdk" Version="5.0.100-rc.1.20403.9">
      <Uri>https://github.com/dotnet/sdk</Uri>
      <Sha>...</Sha>
    </Dependency>
  </ProductDependencies>
</Dependencies>

These dependency names correspond to specific MSBuild property names found in another file called eng/Versions.props, also within the same repository. These MSBuild properties are then utilized throughout the repository to specify versions or paths associated with those dependencies.

Here's a snippet showcasing such properties:

<!-- Dependencies from https://github.com/dotnet/sdk -->
<PropertyGroup>
  <MicrosoftNETSdkPackageVersion>5.0.100-rc.1.20403.9</MicrosoftNETSdkPackageVersion>
</PropertyGroup>
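The naming convention that links the two files can be sketched in a few lines. The rule below (strip the dots from the dependency name and append PackageVersion) mirrors the common Arcade convention, but treat it as an illustration rather than a specification:

```python
def version_property_name(dependency_name: str) -> str:
    """Map a Version.Details.xml dependency name to the MSBuild
    property conventionally used for it in eng/Versions.props."""
    return dependency_name.replace(".", "") + "PackageVersion"

print(version_property_name("Microsoft.NET.Sdk"))  # MicrosoftNETSdkPackageVersion
```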

By meticulously organizing dependency information in this structured manner, it becomes feasible to construct a comprehensive graph detailing all dependencies across the entire product. This graph can then be systematically traversed, commencing from the source repositories and commits, to ensure that all dependencies are properly accounted for.
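As a rough sketch of that traversal, the graph can be modeled as a map from each repository to the repositories it depends on and walked breadth-first (the repository slice below is hypothetical and simplified):

```python
from collections import deque

def transitive_dependencies(graph: dict, root: str) -> set:
    """Collect every repository reachable from `root` by following
    declared dependencies, breadth-first."""
    seen = set()
    queue = deque([root])
    while queue:
        repo = queue.popleft()
        for dep in graph.get(repo, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

# Hypothetical slice of the .NET dependency graph.
graph = {
    "dotnet/installer": ["dotnet/sdk", "dotnet/aspnetcore"],
    "dotnet/sdk": ["dotnet/runtime"],
    "dotnet/aspnetcore": ["dotnet/runtime", "dotnet/extensions"],
}

print(sorted(transitive_dependencies(graph, "dotnet/installer")))
# ['dotnet/aspnetcore', 'dotnet/extensions', 'dotnet/runtime', 'dotnet/sdk']
```

Real tooling would start from a repository at a specific commit and read each eng/Version.Details.xml along the way, but the traversal itself is this simple.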

Dependencies are typically categorized into two primary types:

  1. Product dependencies, which are indispensable for the product to function as intended

  2. Toolset dependencies, which are not bundled with the product but are utilized for testing or build-related tasks.
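In eng/Version.Details.xml these two categories appear as separate <ProductDependencies> and <ToolsetDependencies> sections. A minimal sketch of reading them apart (the file contents and versions here are illustrative):

```python
import xml.etree.ElementTree as ET

# Illustrative Version.Details.xml contents; names and versions are examples.
SAMPLE = """\
<Dependencies>
  <ProductDependencies>
    <Dependency Name="Microsoft.NET.Sdk" Version="5.0.100-rc.1.20403.9" />
  </ProductDependencies>
  <ToolsetDependencies>
    <Dependency Name="Microsoft.DotNet.Arcade.Sdk" Version="5.0.0-beta.20411.9" />
  </ToolsetDependencies>
</Dependencies>
"""

def dependencies_by_category(xml_text: str) -> dict:
    """Return {category: [(name, version), ...]} from Version.Details.xml text."""
    root = ET.fromstring(xml_text)
    return {
        section.tag: [(d.get("Name"), d.get("Version"))
                      for d in section.findall("Dependency")]
        for section in root
    }

deps = dependencies_by_category(SAMPLE)
print(deps["ProductDependencies"])  # [('Microsoft.NET.Sdk', '5.0.100-rc.1.20403.9')]
```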

Product Dependencies:

Product dependencies are integral to the core functionality of the software product. These dependencies are essential for the product to operate as intended and to provide its primary features and capabilities. They directly contribute to the functionality and behavior of the end-user-facing aspects of the application.

Examples of product dependencies in a .NET project might include:

  • Libraries or frameworks that provide essential functionality directly utilized by the application.

  • Third-party components or packages that are critical for specific features or functionalities of the product.

  • Internal modules or libraries developed within the organization that are shared across multiple projects and are fundamental to the product's operation.

Product dependencies are typically included in the final product distribution and are essential for its successful deployment and operation. They are carefully managed to ensure compatibility, reliability, and consistency across different product versions.

Toolset Dependencies:

Toolset dependencies, on the other hand, are not directly part of the product's runtime or end-user functionality. Instead, they are used during the development, testing, build, or deployment processes to support various tooling and infrastructure requirements.

Examples of toolset dependencies in a .NET project might include:

  • Build tools or compilers required to compile source code into executable binaries.

  • Testing frameworks or libraries used to automate testing procedures and ensure software quality.

  • Packaging tools or scripts used to create distributable packages or artifacts for deployment.

Toolset dependencies are essential for the development and maintenance of the software product but are not included in the final distribution or deployment package. They are primarily used by developers, build engineers, or automated build systems to facilitate the software development lifecycle.

While toolset dependencies are not directly visible to end-users, they are crucial for ensuring the efficiency, reliability, and maintainability of the development process. Managing toolset dependencies effectively can streamline development workflows, improve build and deployment automation, and enhance overall productivity.

This meticulous approach to dependency tracking is fundamental for maintaining the coherence and functionality of .NET projects, enabling efficient version control and streamlined development processes.

Automated Dependency Management with Maestro

In our development ecosystem, we employ a custom service called Maestro to automatically update our tracked dependencies. This service, implemented in the Arcade Services repository (dotnet/arcade-services on GitHub), plays a crucial role in ensuring the smooth flow of dependencies across our projects.

In simpler terms, Maestro is a tool for managing dependencies in .NET projects. It can be used to automatically update dependencies, resolve conflicts, and generate NuGet packages. 

Here are some of the benefits of using Maestro for automated dependency management:

  • It can save developers time and effort by automating the process of managing dependencies.

  • It can help to ensure that projects are built with the correct dependencies, which can help to prevent build errors.

  • It can help to improve the build reproducibility of projects.

  • It can help to make it easier to manage complex dependency relationships.

Channels and Subscriptions:

Channels: Not all builds serve the same purpose. A build from the main branch of a repository may be intended for day-to-day development, while a build from a test branch may not be intended to flow anywhere. Channels are used to tag builds with specific intents, signifying where their outputs should be directed.

Subscriptions: Subscriptions map builds of a source repository, assigned to a particular channel, onto a target branch in another repository. When a build is assigned to a channel, Maestro modifies the state of the target branch by updating its dependencies, typically through modifications to the eng/Version.Details.xml and eng/Versions.props files. It then initiates a pull request with these changes.
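A subscription can be thought of as a mapping from (source repository, channel) to (target repository, target branch). The sketch below models that lookup; it is a simplified illustration, not Maestro's actual data model, and the channel and repository names are examples:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Subscription:
    source_repo: str
    channel: str
    target_repo: str
    target_branch: str

def targets_for_build(subscriptions, source_repo, channel):
    """Find every (repo, branch) whose dependencies should be updated
    when `source_repo` publishes a build to `channel`."""
    return [
        (s.target_repo, s.target_branch)
        for s in subscriptions
        if s.source_repo == source_repo and s.channel == channel
    ]

# Example subscriptions (hypothetical channel names).
subs = [
    Subscription("dotnet/runtime", ".NET 5 Dev", "dotnet/aspnetcore", "main"),
    Subscription("dotnet/runtime", ".NET 5 Dev", "dotnet/installer", "main"),
    Subscription("dotnet/runtime", ".NET 5 Servicing", "dotnet/installer", "release/5.0"),
]

print(targets_for_build(subs, "dotnet/runtime", ".NET 5 Dev"))
# [('dotnet/aspnetcore', 'main'), ('dotnet/installer', 'main')]
```

For each target returned, Maestro would then edit that branch's eng/Version.Details.xml and eng/Versions.props and open a pull request.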

Consider the below diagram that shows an example of how Maestro can be used to manage the dependencies of a .NET project.

[Diagram: example Maestro dependency flow across .NET repositories, with the longest build path highlighted in red]

The red path in the diagram represents the longest path through the graph from a build-time standpoint. This means that it is the longest chain of dependencies that must be built to build the final project. The length of this path is determined by the number of dependencies that each project has.

For example, the project dotnet-runtime has two dependencies:

  • dotnet-extensions

  • dotnet-winforms

To build dotnet-runtime, Maestro must first build dotnet-extensions and dotnet-winforms.

The project dotnet-winforms also has two dependencies:

  • dotnet-wpf

  • dotnet-wpf-int

So, to build dotnet-runtime, Maestro must build dotnet-extensions, dotnet-winforms, dotnet-wpf, and dotnet-wpf-int.

The red path in the diagram shows the order in which Maestro would build these projects. First, it would build dotnet-extensions, dotnet-wpf, and dotnet-wpf-int, which have no dependencies of their own. Then it would build dotnet-winforms. Finally, it would build dotnet-runtime.
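The "longest path" idea can be made concrete with a small recursive walk over the build graph (the edge list below mirrors the example above and is hypothetical):

```python
# Edges point from a project to the dependencies that must be built first.
DEPS = {
    "dotnet-runtime": ["dotnet-extensions", "dotnet-winforms"],
    "dotnet-winforms": ["dotnet-wpf", "dotnet-wpf-int"],
}

def longest_chain(project: str) -> int:
    """Number of projects on the longest build chain ending at
    `project`, counting the project itself."""
    deps = DEPS.get(project, [])
    return 1 + (max(longest_chain(d) for d in deps) if deps else 0)

print(longest_chain("dotnet-runtime"))  # 3 (wpf-int -> winforms -> runtime)
```

The longer this chain, the longer a change takes to flow from the bottom of the graph into the final product build.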

Impact of Channels on Publishing:

Channels also play a significant role in determining the publishing of a build's assets. Builds not assigned to any channel do not publish their outputs anywhere. However, assignment to a channel triggers the publishing of these assets, with the channel dictating the desired endpoints.

For example, a channel designated for day-to-day development of .NET 5 may indicate that certain files should be pushed to a specific storage account, while packages should be pushed to a designated Azure DevOps NuGet feed.

Flexibility and Scalability:

As our product development workflows evolve and the number of repositories grows, the traditional model of assigning build intent based on branches becomes less effective.

Different teams may have varying development practices and branching strategies, leading to complexity in managing dependencies. By leveraging subscriptions tied to channels, we establish a cleaner producer-consumer model for the flow of dependencies.

This approach allows teams to focus on their specific tasks without needing to closely track the branching structures of other repositories.

Ensuring Coherency in Dependency Graph

The concepts of coherence and incoherence within the dependency graph play a pivotal role in determining the overall health and functionality of the software ecosystem.

What is Coherence?

Coherence means everything in the dependency graph works together smoothly. It's like all the pieces of a puzzle fitting perfectly. In a coherent system, all parts use compatible versions, making everything predictable and consistent. This is important, especially when releasing software, as it reduces the chances of things going wrong and ensures a hassle-free experience for users.

Navigating Incoherence:

On the flip side, incoherence is when things don't match up in the dependency graph. It's like trying to build with mismatched Lego bricks. This creates confusion and can cause problems like errors or things not working as expected. Dealing with incoherence is crucial to keep everything running smoothly and prevent issues in the software system.

Imagine you're putting together a jigsaw puzzle, but some pieces don't quite fit. That's like incoherency in a software system – it's a possible error state. Let's consider the .NET shared framework runtime as an example. This runtime has a specific set of rules (like the pieces of a puzzle) that it follows to work properly. Now, in our software world, different versions of this runtime might be floating around in different parts of the system. But here's the catch: when we're getting ready to release our software, we need everything to work smoothly together.

Here's where the trouble comes in: if different pieces of our software rely on different versions of the runtime, there's a chance that something might not work as expected. Maybe one piece expects the runtime to behave one way, while another piece expects something different. This mismatch could lead to errors or bugs, like pieces of the puzzle not fitting together correctly.

So, in an incoherent system where different parts aren't on the same page with which runtime version they're using, there's a risk that important changes or problems might slip through the cracks, causing headaches down the line. That's why it's crucial to have strategies in place to keep everything in sync and prevent these kinds of issues from happening.
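Detecting incoherence is mechanical in principle: collect every version each dependency is pinned to across the graph, and flag any dependency that appears with more than one. A toy check, using hypothetical pins rather than the actual .NET tooling:

```python
def find_incoherent(pins: dict) -> dict:
    """Given {repo: {dependency: version}}, return the dependencies
    pinned to more than one distinct version across repos."""
    versions = {}
    for repo, deps in pins.items():
        for name, version in deps.items():
            versions.setdefault(name, set()).add(version)
    return {name: sorted(v) for name, v in versions.items() if len(v) > 1}

# Hypothetical pins: two repos disagree on the runtime version.
pins = {
    "dotnet/aspnetcore": {"Microsoft.NETCore.App.Runtime": "5.0.0-rc.1.20451.14"},
    "dotnet/installer":  {"Microsoft.NETCore.App.Runtime": "5.0.0-rc.1.20449.3"},
}

print(find_incoherent(pins))
# {'Microsoft.NETCore.App.Runtime': ['5.0.0-rc.1.20449.3', '5.0.0-rc.1.20451.14']}
```

An empty result means the graph is coherent: every repository agrees on a single version of each shared dependency.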

Maintaining Coherence:

Achieving and maintaining coherence within the dependency graph requires a proactive and systematic approach. Several strategies can be employed to ensure that all repositories reference consistent versions of dependencies:

  • Version Management: Implementing robust version control mechanisms ensures that dependencies are locked to specific versions or version ranges. By enforcing version consistency, developers mitigate the risk of unintentional upgrades or compatibility conflicts.

  • Dependency Tracking Tools: Leveraging sophisticated dependency tracking tools facilitates the seamless management of dependencies across repositories. These tools provide insights into dependency relationships, version compatibility, and resolution of conflicts, empowering developers to maintain coherence effectively.

  • Automated Testing and Integration: Integrating automated testing frameworks and continuous integration pipelines enables early detection of incoherence. Through rigorous testing and continuous integration, developers can validate the compatibility and functionality of dependencies, preempting potential issues before they manifest in production environments.

  • Controlled Dependency Flow: Implementing controlled mechanisms for propagating dependency updates ensures synchronized and coherent updates across repositories. Tools like Maestro automate the flow of dependency outputs, streamlining the process of dependency management and minimizing the risk of incoherence.

Preparing for Release

The process of preparing for a release involves a structured workflow aimed at building and releasing the product efficiently.

Here's an overview of the workflow:

Maestro Subscriptions as Flow Graph:

Think of Maestro subscriptions as forming a flow graph, where each edge represents the flow of changes between repositories. If no additional 'real' changes are made in any repository in the graph, the flow of changes will eventually cease. This leads to the product reaching an unchanging, coherent state where there is a single version of each dependency. This state is crucial for preview and servicing releases.

Goal of Reaching a Coherent State:

The ultimate objective of the workflow is to achieve a coherent state within the product. This means ensuring that all repositories reference compatible versions of dependencies, minimizing the risk of compatibility issues. Striving for coherence is a regular practice, especially during preview and servicing releases.

General Flow for Each Release:

The workflow follows a structured approach for each release, which typically includes the following steps:

Preparation for the new release:

This initial step involves updating various branding elements to prepare for the upcoming release. This includes adjusting version numbers, preview identifiers, and determining which packages will be shipped with the release. If the release is a preview version, new branches are typically created from the main development branches to isolate the stabilization efforts for the upcoming features.

Committing changes:

Once the necessary adjustments and preparations have been made, the next step is to commit these changes across all relevant repositories. This includes incorporating approved bug fixes, implementing feature enhancements, and any other modifications required for the release.

Monitoring Builds and Dependency Flows:

As changes are committed, builds are triggered in each repository to generate updated artifacts. Continuous monitoring of build pipelines and dependency flows is crucial to ensure that all components are aligned and compatible with each other.

Iterative Approach to achieve coherency:

Achieving coherence across the entire dependency graph is critical for ensuring a stable and reliable release. This involves allowing new builds to complete and dependencies to flow between repositories until a state of coherence is reached. In this coherent state, there is only one version of each dependency throughout the entire ecosystem, minimizing the risk of compatibility issues.

Preparation of dotnet/source-build release:

Once coherence is achieved, the source code dependencies of the coherent product are processed through the dotnet/source-build project. This step ensures that the source code is properly prepared for the release and can be built successfully in various environments.

Validating and Testing:

Additional validation testing is performed to ensure the quality and reliability of the release. This validation goes beyond the testing conducted during the pull request (PR) and continuous integration (CI) phases and may include comprehensive testing of various scenarios within Visual Studio or other development environments.

Send dotnet/source-build to partners:

After successful validation, the product's source code is shared with partners who may need to build .NET from source in their own continuous integration (CI) environments. This step facilitates collaboration and ensures that partners have access to the latest version of the source code.

Addressing Issues:

If any issues or discrepancies are identified during the validation or partner testing phases, additional changes may be required. These changes are prepared, committed, and integrated into the release branch, and the coherence process may need to be revisited to ensure that the fixes are properly incorporated.

Final Release:

Once all preparations are complete and the product has been thoroughly validated, it is packaged into the appropriate formats for each supported operating system. These packaged assets are then ready for distribution and release to users. Depending on the timing of the release, assets may be held in storage until the scheduled ship date or released immediately.

Sometimes, the release preparation process finishes well before the planned release date, which is always the ideal scenario. In such cases, the finalized assets are stored in Azure blob storage and various package feeds, awaiting the scheduled release date. However, there are instances when the process is completed close to the scheduled ship date, leaving minimal room for adjustments. This situation can be especially challenging, particularly if the release date is fixed, such as being tied to the commencement of a significant conference.

Validation Process

The validation procedures conducted before finalizing a release are crucial to ensure its integrity and stability. These procedures involve a series of checks and tests to verify that the software meets quality standards and performs as expected.

The validation process typically includes:

  1. Functional Testing: This involves testing the software's functionalities to ensure they behave as intended. Test cases are executed to validate various features, user interactions, and workflows.

  2. Performance Testing: Performance tests assess the software's responsiveness, scalability, and resource usage under different conditions. This ensures that the software meets performance requirements and can handle the expected workload.

  3. Compatibility Testing: Compatibility tests verify that the software works correctly across different platforms, operating systems, browsers, and devices. It ensures seamless operation and user experience across diverse environments.

  4. Security Assessment: Security assessments are conducted to identify and address potential vulnerabilities or security weaknesses in the software. This involves security testing, vulnerability scanning, and code review to mitigate security risks.

  5. Regression Testing: Regression tests ensure that the changes introduced in the release do not negatively impact existing functionalities. It involves retesting previously validated features to ensure their continued reliability.

  6. User Acceptance Testing (UAT): UAT involves testing the software with real users to validate its usability, functionality, and overall user experience. Feedback from users is collected and incorporated to improve the software further.

  7. Compliance Checks: Compliance checks ensure that the software complies with relevant standards, regulations, and industry best practices. This includes accessibility standards, data privacy regulations, and software development guidelines.

  8. Documentation Review: Documentation, including user guides, release notes, and technical documentation, is reviewed to ensure accuracy, completeness, and clarity.

By conducting these validation procedures, organizations can ensure that the release meets quality standards, performs reliably, and delivers a positive user experience.

Release Day Procedures

Release day is a critical milestone in the software development lifecycle, marking the culmination of development efforts and the delivery of new features, enhancements, or bug fixes to users. It involves a series of carefully orchestrated activities to ensure a smooth and successful release.

Below is an examination of the activities typically carried out on release day:

  1. Final Build Verification: Before initiating the release process, the final build is verified to ensure that it meets all quality criteria and is ready for deployment. This involves performing a final round of testing, including functional, performance, and compatibility checks.

  2. Pushing Builds to Distribution Endpoints: Once the final build is validated, it is pushed to distribution endpoints such as download servers, package repositories, or cloud storage. This ensures that users can access and download the latest version of the software.

  3. Publishing Release Notes: Release notes are prepared and published to inform users about the changes, enhancements, and bug fixes included in the release. Release notes provide valuable information about new features, known issues, and any required actions for users.

  4. Tagging Commits: Commits associated with the release are tagged to create a snapshot of the codebase at the time of release. This facilitates version control and enables developers to track changes associated with specific releases.

  5. Merging Branches: Branches containing features, fixes, or enhancements targeted for the release are merged into the main or release branch. This consolidates all changes and prepares the codebase for deployment.

  6. Deployment: The final step in the release process is deploying the new version of the software to production environments. Deployment procedures may vary depending on the nature of the application and the infrastructure setup. It may involve automated deployment scripts, manual configuration changes, or cloud-based deployment tools.

  7. Monitoring and Rollback Planning: Following deployment, the system is closely monitored for any issues or anomalies. Monitoring tools and alerts are used to detect performance issues, errors, or security breaches. Additionally, rollback procedures are prepared in case of unforeseen issues that require reverting to a previous version of the software.

  8. Communication: Throughout the release process, clear and timely communication is essential to keep stakeholders informed about the progress, status, and any issues encountered. This includes internal teams, customers, partners, and other relevant stakeholders.

By following these release day procedures, organizations can ensure a successful and seamless release, delivering valuable updates to users while minimizing disruptions and mitigating risks.

Code Flow and Security Patch Management

The day-to-day development of .NET primarily occurs on GitHub, providing transparency and collaboration within the developer community. However, for releases containing critical security fixes, a different approach is necessary to prevent early disclosure of vulnerabilities that could compromise customer applications. To address this requirement, the .NET team maintains a parallel branch structure in an Azure DevOps repository, allowing for secure handling of security patches while maintaining public development visibility.

Parallel Branch Structure:

Public Branches:

These branches directly mirror corresponding branches on GitHub. For instance, if there's a release/5.0 branch on GitHub's dotnet/runtime repository, there's an identical release/5.0 branch in the Azure DevOps dotnet-runtime repository. Changes made in GitHub are seamlessly pulled into these public branches via fast-forward merges.

Internal Branches:

Alongside the public branches, there are internal counterparts prefixed with internal/, such as internal/release/5.0. Commits made to GitHub's release/5.0 branch are automatically merged into the corresponding internal branch. These internal branches facilitate the incorporation of security fixes without compromising public visibility.

Handling Releases:

Public Releases:

For releases without security fixes, development primarily occurs on the public GitHub branches. All commits and dependency flows are directed towards these branches.

Security Patch Releases:

When security fixes are required, they are committed to the appropriate internal branch in Azure DevOps. Dependency flow is adjusted to target the internal branches for these releases. Non-security-related changes continue to be committed to the corresponding public GitHub branches and are automatically merged into the internal branches.

Release Management:

Merge Back to Public Branch:

On release day, a pull request is opened to merge the changes from the internal release branch back into the public release branch (e.g., from internal/release/5.0 to release/5.0). This synchronization ensures that the internal branch state aligns with the public branch, preparing it for subsequent releases.

Mitigating Dependency Flow Conflicts:

Selective Dependency Flow:

To avoid conflicts during the merging process, dependency flow is selectively enabled/disabled based on the type of release being built. For internal releases, public dependency flow is largely disabled except for essential components. Similarly, internal dependency flow is limited for public releases, ensuring smoother integration and reducing the need for manual intervention.

Future Considerations:

Improving Parallel Build Maintenance:

The .NET team acknowledges the challenges associated with maintaining both public and internal builds in parallel. Future efforts will focus on developing robust mechanisms to streamline this process and minimize discrepancies between public and internal releases.

By implementing this parallel branch structure and release management workflow, the .NET team ensures the seamless integration of security patches while maintaining transparency and efficiency in the development process.

Shipping and Non-Shipping Assets


In the .NET ecosystem, the outputs of each build are categorized into two main types: Shipping and Non-Shipping assets. This categorization is crucial for managing versioning strategies and ensuring stable builds.

Shipping Assets:

  • These are assets intended for external distribution on public platforms such as NuGet.org or the .NET download sites.

  • Shipping assets typically receive stable versions (e.g., 5.0.6) rather than non-stable versions (e.g., 5.0.6-servicing.20364.11) during RTM (Release to Manufacturing) and servicing releases.

  • They are integral parts of official releases and are expected to be stable and reliable.

Non-Shipping Assets:

  • These assets are not meant for external distribution and do not appear on public platforms such as NuGet.org.

  • They are primarily used for internal purposes, such as inter-repo transport.

  • Non-shipping assets always have non-stable versions to maintain uniqueness across builds.

  • Examples include transport packages like "Microsoft.NET.Sdk" which are redistributed internally.
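The stable versus non-stable distinction follows SemVer-style versioning: a hyphenated prerelease suffix (such as -servicing.20364.11) marks a version as non-stable. A trivial classifier illustrates the rule:

```python
def is_stable(version: str) -> bool:
    """A SemVer-style version is stable when it carries no prerelease
    suffix, i.e. no '-' segment: '5.0.6' is stable,
    '5.0.6-servicing.20364.11' is not."""
    return "-" not in version

print(is_stable("5.0.6"))                     # True
print(is_stable("5.0.6-servicing.20364.11"))  # False
```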

Stable Builds and Dependency Flow

In a multi-repo open-source software (OSS) development environment like .NET, managing stable builds and ensuring proper dependency flow is crucial due to strong ties between repositories. Consider a scenario where a component like "Microsoft.Extensions.Logging" is built out of "dotnet/runtime" and is also referenced independently. When building a new version of "Microsoft.Extensions.Logging" for a release, maintaining stable and predictable versioning is essential. However, building "dotnet/runtime" multiple times for a given release could result in duplicate versions of "Microsoft.Extensions.Logging".

To address this, the .NET team employs three sets of NuGet feeds within their build infrastructure:

Shipping Feeds:

  • Purpose: These feeds house non-stable, shipping packages generated during day-to-day builds. They are crucial for distributing components intended for external consumption.

  • Variants: There are both internal and public variants of shipping feeds for each major product version. Internal feeds facilitate builds within the organization, while public feeds cater to external consumers.

  • Examples: The .NET team may have feeds like "dotnet5" and "dotnet5-internal" to differentiate between public and internal consumption.

Non-Shipping Feeds:

  • Purpose: Unlike shipping feeds, these feeds store non-stable, non-shipping packages produced in day-to-day builds. They ensure that packages not intended for external distribution have unique version numbers.

  • Variants: Similar to shipping feeds, there are internal and public variants for each major product version. This ensures consistency in versioning across different environments.

  • Examples: Non-shipping feeds could be named "dotnet5-transport" and "dotnet5-internal-transport" to distinguish between public and internal versions.

Isolated Feeds:

  • Purpose: These feeds serve a specific role in maintaining a clean build environment and ensuring proper versioning during dependency updates.

  • Generation: Isolated feeds are dynamically created for each build and contain stable, shipping packages. They are named based on the repository and SHA of the build to maintain uniqueness.

  • Usage: Isolated feeds are primarily used to avoid conflicts in the NuGet cache and ensure that dependency updates pull in the correct package versions.

  • Release Day: On release day, assets from isolated feeds are pushed to nuget.org. This ensures that the official NuGet repository becomes the single source of truth for consumers.
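As a rough illustration of the naming described above, an isolated feed name of the kind that appears in the NuGet.config example later in this article (e.g. "darc-int-dotnet-corefx-059a4a1") could be derived from the repository name and the short commit SHA. The exact derivation is an assumption; this sketch only shows the repo-plus-SHA uniqueness property:

```python
def isolated_feed_name(repo: str, commit_sha: str) -> str:
    """Derive a per-build isolated feed name from a repository name
    like 'dotnet/corefx' and its commit SHA. The 'darc-int-' prefix
    matches the internal feed names seen in dependency-flow-managed
    NuGet.config files; the slash-to-dash and 7-character-SHA rules
    are assumptions for illustration."""
    return f"darc-int-{repo.replace('/', '-')}-{commit_sha[:7]}"

print(isolated_feed_name("dotnet/corefx", "059a4a1deadbeef"))
# darc-int-dotnet-corefx-059a4a1
```

Because the name embeds both the repository and the commit, two builds of different commits can never publish to the same isolated feed.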

Consider the XML below as an example of a NuGet configuration file (NuGet.config); the feed URLs have been omitted from the value attributes. NuGet is the package manager for .NET projects that lets developers consume and manage dependencies.

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <!--Begin: Package sources managed by Dependency Flow automation. Do not edit the sources below.-->
    <add key="darc-int-dotnet-core-setup-0c2e69c" value="" />
    <add key="darc-int-dotnet-corefx-059a4a1" value="" />
    <!--End: Package sources managed by Dependency Flow automation. Do not edit the sources above.-->
    <add key="dotnet-core" value="" />
    <add key="dotnet-tools" value="" />
    <add key="dotnet3" value="" />
    <!-- Add more package sources if needed -->
  </packageSources>
</configuration>

The provided XML configuration represents the NuGet.config file, which specifies package sources for NuGet to use during package restore and installation. This file is crucial for managing NuGet package dependencies in .NET projects.

In the context of the .NET team's build infrastructure and the described dependency flow, this NuGet.config file would typically be included in the root directory of each repository within their multi-repo OSS development environment.

Here's how it fits into the overall process:

Shipping Feeds and Non-Shipping Feeds:

  • The <add> elements within the <packageSources> section specify the URLs of various package feeds, including internal and public variants for shipping and non-shipping packages.

  • These package sources are referenced in the NuGet.config files of product repositories, allowing developers to restore and consume packages from the specified feeds during development.

Isolated Feeds:

  • While the provided XML configuration doesn't directly represent isolated feeds, these feeds would be dynamically generated for each build, as described above.

  • During dependency updates or builds, Maestro (or a similar build automation tool) would modify the NuGet.config files in dependent repositories to include the URLs of isolated feeds specific to the current build's repository and SHA.

  • This modification ensures that dependency resolution during builds pulls in stable packages from the isolated feeds, maintaining a clean build environment.

The NuGet.config file containing the provided XML configuration would be utilized in each repository of the .NET team's multi-repo OSS development environment. It serves as a central configuration for specifying package sources, including both internal and public feeds for shipping and non-shipping packages. Additionally, the file is updated dynamically by build automation tools to include isolated feed URLs during dependency updates, ensuring proper versioning and dependency resolution during builds.
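The dynamic update described above can be sketched as a string rewrite of the region between the Begin/End dependency-flow markers in NuGet.config. This is a simplified model of what a tool like Maestro does, not its actual implementation; the marker comments are the real ones from the XML example above.

```python
BEGIN = ("<!--Begin: Package sources managed by Dependency Flow automation. "
         "Do not edit the sources below.-->")
END = ("<!--End: Package sources managed by Dependency Flow automation. "
       "Do not edit the sources above.-->")

def set_managed_sources(config_text: str, sources: dict[str, str]) -> str:
    """Replace everything between the Begin/End dependency-flow markers
    with <add> entries for the given feeds (name -> URL). Stale entries
    from a previous build are dropped; everything outside the markers
    is left untouched. A sketch only, assuming well-formed markers."""
    head, rest = config_text.split(BEGIN, 1)
    _, tail = rest.split(END, 1)
    adds = "\n".join(
        f'    <add key="{key}" value="{url}" />' for key, url in sources.items()
    )
    return f"{head}{BEGIN}\n{adds}\n    {END}{tail}"
```

On each dependency update, the automation would call this with the isolated feed names and URLs for the current build's repository and SHA, replacing whatever feeds the previous update had written.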

Internal Isolated Feeds and Public Builds Resolution

Recently, a developer hit a roadblock when attempting to build the tagged v3.1.300 commit. The 3.1.300 SDK release was built internally, so the isolated "darc-int-" feeds automatically included in the NuGet.config file were not publicly accessible. Although these feeds become redundant after release day, their mere presence in the NuGet.config file causes the NuGet restore step to fail. This has been a persistent grievance among the .NET teams. Consequently, on release day, manual intervention is required during internal-to-public merge PRs to remove any "darc-int-" feeds, ensuring successful builds.

Fortunately, a recent change by the .NET Engineering Services team promises to alleviate this issue for future releases. Maestro now disables the "darc-int-" NuGet sources by default. During public builds these internal sources are unnecessary, both before and after release day, while internal builds automatically re-enable them as required. This adjustment ensures that commits tagged for release remain consistently buildable without any source modification on the day of release.
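A minimal sketch of the "disabled by default" behavior, using NuGet.config's standard <disabledPackageSources> element: every "darc-int-" source stays listed under <packageSources> but is marked disabled, so a public restore skips it while an internal build can flip it back on. This illustrates the behavior described above, not Maestro's actual code.

```python
import xml.etree.ElementTree as ET

def disable_internal_sources(config_xml: str) -> str:
    """List every 'darc-int-' package source under
    <disabledPackageSources> (a standard NuGet.config element) so that
    public restores ignore the internal feeds without removing them.
    Illustrative sketch; the real change lives in Maestro's tooling."""
    root = ET.fromstring(config_xml)
    sources = root.find("packageSources")
    disabled = root.find("disabledPackageSources")
    if disabled is None:
        disabled = ET.SubElement(root, "disabledPackageSources")
    for add in ([] if sources is None else sources.findall("add")):
        key = add.get("key", "")
        if key.startswith("darc-int-"):
            # value="true" marks the source as disabled in NuGet.config.
            ET.SubElement(disabled, "add", key=key, value="true")
    return ET.tostring(root, encoding="unicode")
```

Because the sources are disabled rather than deleted, the tagged release commit builds publicly as-is, and no internal-to-public merge edit is needed on release day.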

This proactive measure not only streamlines the release process but also reinforces the stability and reliability of .NET project builds moving forward.

Prep and Validation

Following the completion of development tasks and ensuring a coherent product with all desired fixes, we initiate a series of steps aimed at preparing the product for release and conducting thorough validation. This process is orchestrated through an Azure DevOps pipeline, augmented by Maestro for uniquely handling dependencies.

Dependency Graph Traversal:

Starting at the dotnet/installer repository, Maestro traverses the dependency graph from the designated commit intended for shipping, identifying all unreleased dependencies that contribute to the graph.
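The traversal can be sketched as a breadth-first walk over a repo-to-repo dependency graph. The edges below are a hypothetical simplification of the real .NET dependency graph, used only to show the shape of the walk:

```python
from collections import deque

# Hypothetical dependency edges: repo -> repos whose builds it consumes.
DEPENDENCIES = {
    "dotnet/installer": ["dotnet/sdk", "dotnet/aspnetcore"],
    "dotnet/sdk": ["dotnet/runtime"],
    "dotnet/aspnetcore": ["dotnet/runtime", "dotnet/extensions"],
    "dotnet/runtime": [],
    "dotnet/extensions": ["dotnet/runtime"],
}

def unreleased_dependencies(root: str, released: set[str]) -> set[str]:
    """Breadth-first walk of the dependency graph from the shipping
    commit's repository, collecting every contributing repo whose build
    has not yet been released. A simplified model of the traversal
    Maestro performs (the real graph is keyed by commit, not just repo)."""
    seen, queue = set(), deque([root])
    while queue:
        repo = queue.popleft()
        for dep in DEPENDENCIES.get(repo, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen - released
```

Everything the walk returns still needs its assets collected and published before the release can go out.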

Asset Collection:

Maestro then locates all individual repository builds associated with these dependencies and retrieves the complete set of Microsoft CI assets they generated. Each asset carries a set of endpoints that facilitate its publication.

Asset Preparation:

Maestro downloads and organizes these assets, placing them in a directory ready for the subsequent validation and release stages.

Additional Preparation and Validation:

  • NuGet Package Consistency Checks: We examine the consistency and integrity of NuGet packages, ensuring they meet quality standards and adhere to versioning conventions.

  • Signing Checks: Verification processes are in place to ensure that all components and artifacts are properly signed, enhancing security and trustworthiness.

  • Integration with Development Environments: We integrate the .NET product into Visual Studio and Visual Studio for Mac, preparing for seamless deployment via these popular development environments.

  • Bug Verification: Rigorous verification processes are conducted to confirm the resolution of reported bugs expected to be addressed in the release. This ensures the delivery of a robust and stable product.
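To ground the consistency checks above, here is a small sketch of one plausible versioning check: every package version must be well-formed, and a single shipping set should not mix stable and prerelease versions. NuGet versions broadly follow SemVer 2.0; the regex here is a simplification, and the team's actual checks are more extensive:

```python
import re

# Simplified SemVer 2.0 shape: MAJOR.MINOR.PATCH with an optional
# prerelease label such as '-preview.1'. Build metadata is omitted.
SEMVER = re.compile(r"^\d+\.\d+\.\d+(-[0-9A-Za-z.\-]+)?$")

def check_package_versions(packages: dict[str, str]) -> list[str]:
    """Return a list of problems found in a {name: version} set:
    malformed versions, or a mix of stable and prerelease versions in
    one shipping set. A hypothetical check for illustration only."""
    problems = []
    prerelease_flags = set()
    for name, version in packages.items():
        if not SEMVER.match(version):
            problems.append(f"{name}: malformed version '{version}'")
            continue
        prerelease_flags.add("-" in version)
    if len(prerelease_flags) > 1:
        problems.append("mixed stable and prerelease versions in one set")
    return problems
```

A release-day run would expect an empty problem list before the assets move on to signing checks and integration.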

Issue Resolution:

Any bugs or issues identified during the preparation and validation phase are promptly evaluated for severity. Necessary changes are implemented in the respective product repositories to address these issues effectively.


Upon successful completion of the preparation and validation phase, the product is deemed ready for release and proceeds to the release stage.

This comprehensive approach to preparation and validation ensures the delivery of a high-quality, thoroughly tested product, ready to meet the needs and expectations of our users.


In exploring the ins and outs of .NET development, we've uncovered the careful steps taken to create and release software. From managing code flow to handling security patches, we've seen the dedication behind .NET's reliability. By understanding this process, we appreciate the hard work of the developers who make it all possible.

