European ASP.NET 4.5 Hosting BLOG

BLOG about ASP.NET 4, ASP.NET 4.5 Hosting and Its Technology - Dedicated to European Windows Hosting Customer

European ASP.NET Core 9.0 Hosting - HostForLIFE :: LINQ and Data View in C# and VB.NET

clock July 8, 2025 07:29 by author Peter

Here we provide a base class example and explore DataView and LINQ (Language-Integrated Query) with a DataTable query for retrieving unique values. LINQ is compatible with the most recent version, .NET 9, as well as .NET Core 3.1, .NET 5, and .NET 6. Using a Book class as an example, we will retrieve data from a DataTable into a DataView. Below is an example of a model-based class.

Book.cs
public class Book
{
    public int BookID { get; set; }
    public string Title { get; set; }
    public string Author { get; set; }
    public string Version { get; set; }
    public decimal Price { get; set; }
}

Book Class Output Example without using LINQ.

BookID Title Author Version Price
1 XXX xxx 1.1 45.00
2 YYY yyy 2.1 50.00
3 ZZZZ Hobbit zz 1.1 30.00
4 AAA aa 1.1 42.00

To retrieve the books of a particular version sorted by price, example code is provided below for C# and VB.NET.

using System;
using System.Collections.Generic;
using System.Linq;

class Program
{
    static void Main()
    {
        List<Book> books = new List<Book>
        {
            new Book { BookID = 1, Title = "XXX", Author = "xxx", Version = "1.1", Price = 45.00m },
            new Book { BookID = 2, Title = "YYY", Author = "yyy", Version = "2.1", Price = 50.00m },
            new Book { BookID = 3, Title = "ZZZZ Hobbit", Author = "zz", Version = "1.1", Price = 30.00m },
            new Book { BookID = 4, Title = "AAA", Author = "aa", Version = "1.1", Price = 42.00m }
        };

        // LINQ: get all version 1.1 books sorted by price
        var filteredBooks = books
            .Where(b => b.Version == "1.1")
            .OrderBy(b => b.Price);

        Console.WriteLine("Books (sorted by price):");
        foreach (var book in filteredBooks)
        {
            Console.WriteLine($"{book.Title} by {book.Author} - ${book.Price}");
        }
    }
}

class Book
{
    public int BookID { get; set; }
    public string Title { get; set; }
    public string Author { get; set; }
    public string Version { get; set; }
    public decimal Price { get; set; }
}

VB.NET DataView Example
Dim bookTable As New DataTable()
bookTable.Columns.Add("BookID", GetType(Integer))
bookTable.Columns.Add("Title", GetType(String))
bookTable.Columns.Add("Author", GetType(String))
bookTable.Columns.Add("Genre", GetType(String))
bookTable.Columns.Add("Price", GetType(Decimal))

' Sample data
bookTable.Rows.Add(1, "XXX", "xxx", "1.1", 45.0D)
bookTable.Rows.Add(2, "YYY", "yyy", "1.2", 45.0D)
bookTable.Rows.Add(3, "ZZZ", "zzz", "2.1", 50.0D)
bookTable.Rows.Add(4, "AAA", "aa", "1.1", 30.0D)

Dim view As New DataView(bookTable)
Dim distinctBooks As DataTable = view.ToTable(True, "Title", "Author")

For Each row As DataRow In distinctBooks.Rows
    Console.WriteLine($"Title: {row("Title")}, Author: {row("Author")}")
Next

Output
Title: XXX, Author: xxx
Title: YYY, Author: yyy
Title: ZZZ, Author: zzz
Title: AAA, Author: aa



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Base Class Library (BCL) for .NET

clock June 30, 2025 09:58 by author Peter

One crucial component of the .NET platform is the Base Class Library (BCL): a basic collection of pre-built classes and types provided by Microsoft. These are the fundamental components of nearly all .NET applications, be they desktop, web, console, or API projects. The BCL is part of the Framework Class Library (FCL) and is closely linked with the Common Language Runtime (CLR). It functions like the standard toolkit included with the .NET runtime, ready to tackle common programming tasks such as:

  • Working with strings, numbers, and dates
  • Reading and writing files
  • Managing collections of data
  • Performing network operations
  • Handling threads and asynchronous tasks
  • Encrypting and securing data
  • Parsing XML and JSON
  • Reflecting on types and assemblies

It’s implemented in assemblies like System.Runtime.dll, System.Private.CoreLib.dll, and others, and is shared across all .NET implementations—whether you're building with .NET Core, .NET Framework, or modern .NET (6/7/8).

Think of the BCL as the foundation toolkit that every .NET developer uses — just like a workshop full of power tools and templates to build robust applications without reinventing the wheel.

Framework Class Library (FCL)
The Framework Class Library includes the BCL but also adds broader APIs for developing web apps (ASP.NET), desktop apps (WinForms/WPF), database access (ADO.NET), and more.

Common Language Runtime (CLR)
The CLR is the execution engine of .NET. It handles memory management, type safety, garbage collection, exception handling, and security.

In my opinion as a developer, we use many components from the Base Class Library in our day-to-day projects. We often see some libraries being used more commonly than others. It's helpful to understand what they are and how they work with real examples.

We build our house (Application) on a strong foundation (CLR), construct it using tools like List<T>, File, and Console (BCL), and then decorate it with features like ASP.NET, ADO.NET, and WinForms (FCL).

Why Use Base Class Library (BCL) in .NET?

  • Saves Development Time - No need to write common functionality from scratch. Built-in classes like List<T>, File, DateTime make tasks faster and easier.
  • Reduces Bugs - Microsoft’s BCL classes are thoroughly tested and reliable. Reduces the risk of writing error-prone custom implementations.
  • Increases Productivity - Developers can focus on business logic instead of low-level utilities. Example: File.ReadAllText("file.txt") reads an entire file in one line.
  • Cross-Platform Compatible - Works consistently across Windows, Linux, and macOS using .NET Core / .NET 5+. Ensures platform-independent development.
  • Improves Code Readability - Standardized APIs make code easier to read, understand, and maintain. Using familiar classes like StringBuilder, Dictionary, and Console helps teamwork.
  • Boosts Performance - Many BCL classes (like Stopwatch, Span<T>) are optimized for speed and memory. Useful for performance-critical tasks.
  • Consistent Design - All .NET languages (C#, VB.NET, F#) use the same BCL. APIs follow predictable naming and behavior patterns.


Commonly Used Base Class Libraries
1. System

System is the core namespace in the .NET Base Class Library (BCL). It contains the fundamental types and classes that almost every .NET application uses — including data types, console I/O, math functions, exceptions, and more. It is the foundation layer of all your C# programs, like the basic bricks and cement used in every building project. It includes essential types like:

  • Primitive types: System.Int32, System.Boolean, System.String, etc.
  • Base object model: System.Object, System.Type, System.Exception
  • Utilities: System.Math, System.Convert, System.Environment
  • Date and time: System.DateTime, System.TimeSpan
  • Console I/O: System.Console
  • Nullable types: System.Nullable<T>
  • Tuples and value types: System.ValueTuple, System.Enum

Example code:
using System;

class Program
{
    static void Main()
    {
        string name = "C# Community";
        int year = DateTime.Now.Year;

        Console.WriteLine($"Hello, {name}! Welcome to the year {year}.");
    }
}


We used:
    System.String
    System.DateTime
    System.Console


All from the System namespace—no extra libraries needed.

Why Is It Important?

It’s automatically referenced in most .NET projects, it provides the building blocks for all other namespaces, and it’s the default namespace for many language features (e.g., int is an alias for System.Int32).
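A quick way to see the alias relationship for yourself (a tiny sketch):

```csharp
using System;

// `int` is just the C# keyword for System.Int32 — they are the same type.
Console.WriteLine(typeof(int) == typeof(Int32));      // True
// Likewise, `string` is an alias for System.String.
Console.WriteLine(typeof(string) == typeof(String));  // True
```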

2.  System.IO
The System.IO namespace in .NET is your go-to toolkit for handling files, directories, and data streams. It’s part of the Base Class Library (BCL) and provides everything you need to read, write, and manage data on disk or in memory. It enables you to:

  • Read and write files (File, FileStream, StreamReader, StreamWriter)
  • Work with directories (Directory, DirectoryInfo)
  • Handle paths (Path)
  • Monitor file system changes (FileSystemWatcher)
  • Read/write binary data (BinaryReader, BinaryWriter)
  • Use memory streams (MemoryStream)


Example Code
using System;
using System.IO;

public class Program
{
    public static void Main()
    {
        string path = "myFile.txt";
        File.WriteAllText(path, "Hi! C# community, I am Miche!");
        string content = File.ReadAllText(path);
        Console.WriteLine("File content:\n" + content);
    }
}


We used using System; and using System.IO;

These two using statements import the namespaces required.
System: Gives access to Console, String, etc.
System.IO: Gives access to file-related classes like File.


Common Classes in System.IO

Class Purpose
File / FileInfo Create, delete, copy, move files
Directory / DirectoryInfo Manage folders
StreamReader / StreamWriter Read/write text
BinaryReader / BinaryWriter Read/write binary data
FileStream Low-level file access
Path Manipulate file paths
MemoryStream Work with data in memory
FileSystemWatcher Watch for file changes (great for logging or syncing apps)
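A few of the classes above in action — a short sketch combining Path, Directory, and File (the folder and file names here are arbitrary):

```csharp
using System;
using System.IO;

// Path builds platform-correct paths; Directory and File do the actual I/O.
string folder = Path.Combine(Path.GetTempPath(), "BclDemo");
Directory.CreateDirectory(folder); // no-op if it already exists

string file = Path.Combine(folder, "notes.txt");
File.WriteAllText(file, "hello from System.IO");

Console.WriteLine(Path.GetFileName(file)); // notes.txt
Console.WriteLine(File.ReadAllText(file)); // hello from System.IO
```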

Why Is It Important?
System.IO matters whenever you need to save data locally (logs, configs, user input), read configuration files or resources in your application, upload or download files in web apps, or interact with the file system from background jobs, APIs, or desktop tools. System and System.IO are commonly used by developers from beginner to advanced levels. In this article, we introduced them briefly; in upcoming articles, we will explore other important libraries, explaining them in simple terms along with detailed examples.

Essential .NET Base Class Libraries (BCL)

Namespace What It Is  Why It’s Important Use Case / Class
System Core types and base functionality Fundamental types used in all .NET code Console, String, DateTime, Math
System.IO File and stream handling Enables file read/write, stream data File, FileInfo, StreamReader, Path
System.Net.Http HTTP communication Connects to REST APIs or web servers HttpClient for GET/POST requests
System.Collections.Generic Generic data collections Manage dynamic data with type safety List<T>, Dictionary<K,V>
System.Linq LINQ querying over collections Simplifies filtering, sorting, and projections Where, Select, FirstOrDefault
System.Threading.Tasks Asynchronous programming Perform non-blocking background operations Task, async/await, Task.Run()
System.Text Text processing and manipulation Efficient string handling and encoding StringBuilder, Encoding.UTF8
System.Text.RegularExpressions Regex-based pattern matching Validate input, extract data from text Regex.IsMatch, Regex.Replace
System.Globalization Culture-aware formatting Supports localization for global users CultureInfo, DateTime.ToString("D", culture)
System.Diagnostics Logging, tracing, performance Debug, benchmark, trace app behavior Stopwatch, Debug.WriteLine()
System.Reflection Runtime type inspection Enables plugins, dependency injection, metadata access Type, Assembly, PropertyInfo
System.Security.Cryptography Cryptographic services Secure hashing, encryption, certificates SHA256, Aes, RNGCryptoServiceProvider
System.ComponentModel Metadata for components Data annotations, property binding INotifyPropertyChanged, attributes
System.Timers Timer-based scheduling Execute code at intervals Timer.Elapsed, auto-repeating logic
System.Configuration App config settings access Read/write app configuration files ConfigurationManager.AppSettings[]
System.Data Core data access tools Work with databases or tabular data DataTable, DataSet, DbConnection

How to Use Base Class Library (BCL) in .NET

  • BCL is built-in: No need to install — it comes with .NET automatically.
  • Use using statements: Add namespaces like System, System.IO, System.Collections.Generic at the top of your file.
  • Call built-in classes: Use BCL types like Console, DateTime, File, List<T>, etc.
  • Works in all project types: Console apps, web apps, APIs, desktop apps, etc.
  • Saves time: Gives you ready-to-use tools for common tasks like reading files, formatting dates, or handling collections.
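The points above in practice — one small sketch that leans on several BCL namespaces at once (collections, LINQ, and text):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

// List<T> holds the data, LINQ filters and sorts it, StringBuilder formats it.
var scores = new List<int> { 90, 72, 85, 60 };
var passing = scores.Where(s => s >= 70)
                    .OrderByDescending(s => s)
                    .ToList();

var sb = new StringBuilder();
foreach (var score in passing)
    sb.Append(score).Append(' ');

Console.WriteLine(sb.ToString().Trim()); // 90 85 72
```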

Conclusion
The BCL is more than just a library—it’s the beating heart of .NET development. Mastering these foundational namespaces allows us to build clean, efficient, and maintainable software across a wide range of applications. From System.Net.Http for HTTP communication to System.Text.Json for serialization, these tools empower us to write less boilerplate and focus more on business logic.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Troubleshooting Memory Leaks in .NET Core

clock June 25, 2025 07:51 by author Peter

Memory leaks can subtly impair the performance of your .NET Core applications, resulting in excessive memory usage, slow responses, or even crashes. Thankfully, .NET has a comprehensive toolkit for efficiently identifying, evaluating, and repairing memory leaks. Using practical tools and a real-world example, this post walks through finding and fixing memory leaks step by step.

 

Prerequisites
Before diving into debugging memory leaks, ensure you have the following installed and set up.

  • .NET SDK 6.0+
  • dotnet-counters tool
  • dotnet-dump tool
  • Basic understanding of C# and ASP.NET Core

To install tools.

  • dotnet tool install --global dotnet-counters
  • dotnet tool install --global dotnet-dump

Creating a Sample ASP.NET Core Application
Add a Leaky Controller
In the Controllers folder, create a new file called LeakController.cs.
[Route("api/[controller]")]
[ApiController]
public class LeakController : ControllerBase
{
    private static Dictionary<int, byte[]> _leakyCache = new();

    [HttpGet("{kb}")]
    public IActionResult LeakMemory(int kb)
    {
        var buffer = new byte[kb * 1024];
        _leakyCache[DateTime.UtcNow.Ticks.GetHashCode()] = buffer;

        return Ok(new
        {
            Message = $"Leaked {kb} KB. Cache size: {_leakyCache.Count}"
        });
    }
}


Each request allocates a byte array and stores it in a static dictionary using the current timestamp as a key. Since we never remove old items, this creates a memory leak.
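A standalone sketch of the same effect: as long as a collection holds the buffers, even a forced full garbage collection cannot reclaim them — the memory only drops once the root releases them.

```csharp
using System;
using System.Collections.Generic;

var cache = new Dictionary<int, byte[]>();
for (int i = 0; i < 20; i++)
    cache[i] = new byte[1024 * 1024]; // ~20 MB, all strongly rooted by the dictionary

// A forced full collection reclaims nothing while the root exists.
long withCache = GC.GetTotalMemory(forceFullCollection: true);

cache.Clear(); // drop the root
long afterClear = GC.GetTotalMemory(forceFullCollection: true);

Console.WriteLine(withCache > afterClear); // the ~20 MB is only freed after Clear()
```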

Running the Application
dotnet run

The API should be available at https://localhost:7261/api/leak/{kb}. For example, calling /api/leak/1000 leaks 1000 KB of memory. To simulate repeated leaking:

for i in {1..50}; do curl http://localhost:7261/api/leak/1000; done

Monitoring with dotnet-counters
Use dotnet-counters to monitor real-time performance metrics.

Get the process ID (PID) of your running app.
dotnet-counters ps

Now, run the command below to monitor the different counters, including memory size:
dotnet-counters monitor --refresh-interval 1 -p <PID>

What to Watch.

  • GC Heap Size: grows steadily with each request
  • Gen 0/1/2 GC Count: collections keep occurring, yet the heap never shrinks

This confirms that objects aren’t being collected – a classic memory leak.

Capturing Memory Dumps

We capture a snapshot of the memory to analyze it.
dotnet-dump collect -p <PID> -o before_leak.dmp

Trigger more leaks.
for i in {1..100}; do curl http://localhost:7261/api/leak/1000; done

Then collect another snapshot.
dotnet-dump collect -p <PID> -o after_leak.dmp

Fixing the Leak
One simple fix is to limit the size of the dictionary.
if (_leakyCache.Count > 100)
    _leakyCache.Clear();
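Clearing the entire cache works but throws away useful entries too. A gentler alternative (a sketch of the idea, not from the original post) is to evict the oldest entry once a size limit is reached:

```csharp
using System;
using System.Collections.Generic;

// Bounded cache sketch: a queue remembers insertion order so the oldest
// entry can be evicted when the limit is hit. (Illustrative; not thread-safe.)
const int limit = 100;
var cache = new Dictionary<long, byte[]>();
var insertionOrder = new Queue<long>();

void Add(long key, byte[] value)
{
    if (cache.Count >= limit)
        cache.Remove(insertionOrder.Dequeue()); // evict oldest instead of clearing all

    cache[key] = value;
    insertionOrder.Enqueue(key);
}

for (long i = 0; i < 250; i++)
    Add(i, new byte[1024]);

Console.WriteLine(cache.Count); // never exceeds the limit
```

For production code, a library cache with built-in eviction (for example, an expiring memory cache) is usually a better fit than a hand-rolled structure like this.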


Validating the Fix
Restart the app, monitor with dotnet-counters, and send repeated requests. You should now see memory usage stabilize.

Conclusion
Memory leaks in .NET Core are subtle yet damaging. You can identify and resolve them with dotnet-counters, dotnet-dump, and analysis tools such as SOS and Visual Studio. This post walked you through building a leaking application, spotting the symptoms, examining heap dumps, and repairing the leak. To guarantee smooth performance, make routine diagnostics and profiling part of your development and deployment cycle. Thank you, and stay tuned for more!



European ASP.NET Core 9.0 Hosting - HostForLIFE :: The Difference Between Controller APIs and Minimal APIs in .NET Core

clock June 23, 2025 08:02 by author Peter

We will go over how Minimal APIs and controller-based APIs differ in .NET 9. Introduced in .NET 6.0 and improved in .NET 9, the Minimal API is a lightweight, fast approach to API development. It works well for microservices, small APIs, and lightweight services. A Minimal API has no [HttpGet] or other routing attributes; all the logic is defined in Program.cs, which keeps it simpler. The controller-based API is structured, scalable, and based on the Model-View-Controller (MVC) design, and works well for complex, sizable applications. Its logic is organized into separate controller classes.

 

Minimal API (.NET 9) Controller API (.NET 9)
Quick APIs, small and compact for micro services Complex APIs, larger, and enterprise apps
Lambda expressions are used and in-line based logic development Routing is based on the Attribute and Controller APIs
Dependency Injection is supported Dependency Injection is supported and more structured
Less structured; testing and maintainability are harder as apps grow Unit testing and maintainability are easier thanks to separation of concerns
Complex routing or filter options are limited Complex routing or filters are supported and structured

We will see how to create and run both a Minimal API and a Controller API. First, to create the Minimal API, run the following command in the command line.

dotnet new web -n MinimalApiDemo
cd MinimalApiDemo


In the Program.cs file we map URLs directly to handler functions; there are no controllers or attributes, just route mappings.

Minimal API Example
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

var products = new List<Product>
{
    new(1, "Laptop", 999.99m),
    new(2, "Phone", 499.99m)
};

app.MapGet("/products", () => products);

app.MapGet("/products/{id}", (int id) =>
    products.FirstOrDefault(p => p.Id == id) is Product product
        ? Results.Ok(product)
        : Results.NotFound());

app.MapPost("/products", (Product product) =>
{
    products.Add(product);
    return Results.Created($"/products/{product.Id}", product);
});

app.Run();

// Record type used by the endpoints above (required for the sample to compile)
record Product(int Id, string Name, decimal Price);


To check the output, visit 'https://localhost:5001/products/1'.

Controller API Example
To create an MVC controller-based API, run the below command.
dotnet new webapi -n ControllerApiDemo
cd ControllerApiDemo


Here is a simple product API example using [HttpGet] attribute routing, given below.
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/[controller]")]
public class ProductsController : ControllerBase
{
    private static readonly List<Product> Products = new()
    {
        new Product { Id = 1, Name = "Laptop", Price = 1200 },
        new Product { Id = 2, Name = "Smartphone", Price = 800 }
    };

    [HttpGet]
    public ActionResult<IEnumerable<Product>> GetAll()
    {
        return Ok(Products);
    }

    [HttpGet("{id}")]
    public ActionResult<Product> GetById(int id)
    {
        var product = Products.FirstOrDefault(p => p.Id == id);
        if (product == null) return NotFound();
        return Ok(product);
    }
}

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}


To check the output, visit 'https://localhost:5001/api/products'



European ASP.NET Core 9.0 Hosting - HostForLIFE :: A Complete Guide to .NET 8 Web API Health Checks

clock June 19, 2025 08:55 by author Peter

In today's fast-paced digital landscape, keeping your web applications reliable and responsive is more critical than ever. Health checks continuously monitor vital components, from databases and external APIs to network connections, providing real-time insights into potential issues. Whether you're managing a small service or a large-scale microservices architecture, effective health checks help you detect problems early, streamline diagnostics, and automate recovery processes.

This post will demonstrate how to use .NET 8 to create reliable health checks in an ASP.NET Web API. In addition to monitoring an external API and running a custom ping test for network connectivity, you'll learn how to check the condition of your SQL Server database and produce a clean, JSON-formatted response that works in unison with contemporary monitoring tools. By making sure every crucial component of your application is regularly examined, you increase resilience and ensure dependable, seamless operations.

In this article, we will:

  • Create a new ASP.NET Web API project.
  • Implement health checks for a SQL Server database, an external API, and a custom ping test.
  • Convert the health check response to JSON.
  • Discuss the benefits and potential drawbacks of using health checks.

Prerequisites
Before you begin, make sure you have:

  • .NET 8 SDK installed.
  • A basic understanding of ASP.NET Web API and C#.
  • A valid connection string for your SQL Server database.
  • An external API URL to monitor (e.g., C# Corner API).

Step 1. Create the ASP.NET Web API Project
Open your command prompt (or terminal) and run the following bash command to create a new ASP.NET Web API project named HealthCheckDemo:
dotnet new webapi -n HealthCheckDemo
cd HealthCheckDemo


This command creates a basic Web API project using the minimal hosting model found in .NET 8.
You will also need the following NuGet packages:
dotnet add package AspNetCore.HealthChecks.SqlServer
dotnet add package AspNetCore.HealthChecks.Uris

Step 2. Implement a Custom Ping Test Health Check
A custom ping test can help verify network connectivity by pinging a specific host. Create a new file called PingHealthCheck.cs in your project and add the following code:
using System;
using System.Net.NetworkInformation;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Diagnostics.HealthChecks;

public class PingHealthCheck : IHealthCheck
{
    private readonly string _host;

    public PingHealthCheck(string host)
    {
        _host = host;
    }

    public async Task<HealthCheckResult> CheckHealthAsync(
        HealthCheckContext context,
        CancellationToken cancellationToken = default)
    {
        using var ping = new Ping();
        try
        {
            // Send a ping with a 2-second timeout
            var reply = await ping.SendPingAsync(_host, 2000);
            if (reply.Status == IPStatus.Success)
            {
                return HealthCheckResult.Healthy(
                    $"Ping to {_host} succeeded with roundtrip time {reply.RoundtripTime}ms");
            }
            return HealthCheckResult.Unhealthy(
                $"Ping to {_host} failed with status {reply.Status}");
        }
        catch (Exception ex)
        {
            return HealthCheckResult.Unhealthy(
                $"Ping to {_host} resulted in an exception: {ex.Message}");
        }
    }
}


This custom health check pings the host (such as "8.8.8.8") and returns a healthy status if the ping is successful.

Step 3. Configure Health Checks in Program.cs
Open your Program.cs file and update it to register health checks for your SQL Server database, the external API, and the custom ping test. In addition, configure the endpoint to output a JSON response.
using System.Text.Json;
using Microsoft.AspNetCore.Diagnostics.HealthChecks;
using Microsoft.Extensions.Diagnostics.HealthChecks;

var builder = WebApplication.CreateBuilder(args);

// Add services for controllers (if you plan to use controllers)
builder.Services.AddControllers();

// Retrieve the connection string from configuration (set in appsettings.json)
string connectionString = builder.Configuration.GetConnectionString("DefaultConnection");

// Register health checks.
builder.Services.AddHealthChecks()
    // SQL Server Health Check: Execute a simple query to evaluate connectivity.
    .AddSqlServer(connectionString,
                  name: "SQL Server",
                  healthQuery: "SELECT 1;",
                  failureStatus: HealthStatus.Unhealthy)
    // External API Health Check: Verify that an external API (e.g., C#-Corner) is reachable.
    .AddUrlGroup(new Uri("https://www.c-sharpcorner.com/"),
                 name: "C# Corner API",
                 failureStatus: HealthStatus.Degraded)
    // Custom Ping Test Health Check: Ping an external host (e.g., Google's public DNS).
    .AddCheck("Ping Test", new PingHealthCheck("8.8.8.8"));

var app = builder.Build();

app.MapControllers();

// Map the health check endpoint with a custom JSON response writer.
app.MapHealthChecks("/health", new HealthCheckOptions
{
    ResponseWriter = async (context, report) =>
    {
        context.Response.ContentType = "application/json";

        var response = new
        {
            status = report.Status.ToString(),
            // Provide details about each health check.
            checks = report.Entries.Select(entry => new
            {
                key = entry.Key,
                status = entry.Value.Status.ToString(),
                description = entry.Value.Description,
                duration = entry.Value.Duration.ToString()
            }),
            totalDuration = report.TotalDuration.ToString()
        };

        // Serialize the response in indented JSON format.
        await context.Response.WriteAsync(JsonSerializer.Serialize(response, new JsonSerializerOptions
        {
            WriteIndented = true
        }));
    }
});

app.Run();

Explanation of the Code
Service Registration

  • SQL Server Check: Executes a simple query (e.g., "SELECT 1;") to ensure that the database is reachable.
  • External API Check: Checks the availability of an external API (such as C# Corner's API).
  • Ping Test: Uses the custom PingHealthCheck class to verify network connectivity by pinging "8.8.8.8".
  • Custom JSON Response Writer: The response writer builds a structured JSON object that includes overall health status, details of each health check (name, status, description, and duration), plus the total duration of the process.

Step 4. Configure the Connection String
In your appsettings.json file, add your SQL Server connection string under the "DefaultConnection" key:

{
  "ConnectionStrings": {
    "DefaultConnection": "Server=YOUR_SERVER;Database=YOUR_DATABASE;User Id=YOUR_USER;Password=YOUR_PASSWORD;"
  }
}

Replace the placeholders with your actual SQL Server credentials.

Step 5. Testing the Health Check Endpoint

Run Your Application: In your command prompt, run:
dotnet run

Access the Health Endpoint: Open your browser or use a tool like Postman and navigate to:
http://localhost:<port>/health

A typical JSON response might look like this:
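For illustration, a response in the shape produced by the writer above might be (all values here are invented):

```json
{
  "status": "Healthy",
  "checks": [
    {
      "key": "SQL Server",
      "status": "Healthy",
      "description": null,
      "duration": "00:00:00.1234567"
    },
    {
      "key": "C# Corner API",
      "status": "Healthy",
      "description": null,
      "duration": "00:00:00.3456789"
    },
    {
      "key": "Ping Test",
      "status": "Healthy",
      "description": "Ping to 8.8.8.8 succeeded with roundtrip time 12ms",
      "duration": "00:00:00.0123456"
    }
  ],
  "totalDuration": "00:00:00.5012345"
}
```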

This JSON response gives a clear overview of the health status of each dependency and ensures easy integration with monitoring tools.

Benefits of Using Health Checks

  • Proactive Monitoring: Regular health checks allow for early detection of issues, enabling quick corrective actions or automated recovery processes.
  • Simplified Diagnostics: Aggregating multiple checks into a single endpoint simplifies troubleshooting by providing clear, structured output via JSON.
  • Enhanced Resilience: Health checks are integral to modern orchestration systems like Kubernetes, which use them to manage service restarts and traffic routing based on real-time status.
  • Easy Integration: A JSON-formatted health response is readily consumable by dashboards and third-party monitoring tools, providing comprehensive insights over time.

Drawbacks and Considerations

  • Performance Overhead: Detailed or numerous health checks might add overhead, particularly if they perform resource-intensive operations. Balance thorough monitoring with performance.
  • False Positives: Overly aggressive or sensitive checks might incorrectly flag transient issues as failures, resulting in unnecessary alerts.
  • Maintenance Effort: As your application evolves, updating health checks to reflect new dependencies or architectural changes may require additional maintenance.
  • Security Concerns: Exposing detailed internal health information can be risky. Consider securing the health endpoint, especially in production environments.

Conclusion
Implementing comprehensive health checks in your ASP.NET Web API using .NET 8 significantly enhances your system's resilience and reliability. These checks—whether it's verifying SQL Server connectivity, ensuring external API availability, or performing a custom ping test—offer proactive monitoring that simplifies diagnostics and facilitates automated recovery in orchestration environments. Although careful consideration is needed to balance monitoring detail with performance overhead and potential security concerns, fine-tuning these checks can lead to a robust infrastructure that minimizes downtime and improves user experience. In essence, health checks serve as a crucial asset, empowering you to maintain a healthy, high-performing application in an ever-changing digital landscape.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Manage Transactions in the Entity Framework

clock June 16, 2025 09:36 by author Peter

We will study Entity Framework transactions in this article. Let's begin.

Understanding Transactions

  • Transactions treat several activities as a single unit, ensuring data integrity.
  • They are crucial for preserving consistency, particularly in complex applications.

Built-in Transaction Management
Entity Framework (EF) provides built-in transaction management through the DbContext.
EF automatically wraps each SaveChanges call in a transaction.
using (var context = new AppDbContext())
{
    // Operations
    context.SaveChanges(); // EF handles the transaction automatically
}


Handling Multiple Operations as a Unit
Transactions are crucial for ensuring data integrity when performing multiple related operations. For instance, transferring money between accounts involves debiting one account and crediting another. Both operations should be part of a single transaction.

Example
using (var context = new AppDbContext())
{
    using (var transaction = context.Database.BeginTransaction())
    {
        try
        {
            // Debit one account
            // Credit another account

            context.SaveChanges();
            transaction.Commit();
        }
        catch (Exception ex)
        {
            transaction.Rollback();
            // Handle the exception
        }
    }
}


Rollback Strategies

Rollbacks revert all changes if any operation fails.
Essential for maintaining data integrity in case of errors.
using (var context = new AppDbContext())
{
    using (var transaction = context.Database.BeginTransaction())
    {
        try
        {
            // Operations
            transaction.Commit();
        }
        catch (Exception ex)
        {
            transaction.Rollback();
        }
    }
}


Commit Strategies

Use Commit to finalize transactions after successful operations.
Always wrap Commit in a try-catch block to handle exceptions.
using (var context = new AppDbContext())
{
    using (var transaction = context.Database.BeginTransaction())
    {
        try
        {
            // Operations
            transaction.Commit();
        }
        catch (Exception ex)
        {
            transaction.Rollback();
            // Log or handle the exception
        }
    }
}

Combining Explicit Transactions with Multiple Contexts
Handle transactions across multiple DbContext instances.
Ensure consistency across different data sources; note that UseTransaction requires both contexts to share the same database connection.
using (var context1 = new AppDbContext1())
using (var context2 = new AppDbContext2())
{
    using (var transaction = context1.Database.BeginTransaction())
    {
        try
        {
            // Operations on context1
            context1.SaveChanges();

            // Operations on context2
            context2.Database.UseTransaction(transaction.GetDbTransaction());
            context2.SaveChanges();

            transaction.Commit();
        }
        catch (Exception ex)
        {
            transaction.Rollback();
        }
    }
}


Key Takeaways

  • Transactions are critical for data integrity.
  • EF provides built-in and explicit transaction management.
  • Rollbacks and commits ensure consistency and handle errors effectively.

Conclusion
In this article, I have tried to cover how to handle Entity Framework Transactions.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: ASP.NET Core Clean and Reliable Code Testing with Moq using C# 13 and xUnit

clock June 9, 2025 08:34 by author Peter

C# 13 and .NET 8 have greatly enhanced ASP.NET Core development capabilities. However, building scalable and maintainable systems requires robust testing in addition to feature implementation. Using xUnit, Moq, and the latest C# 13 features, you will learn how to write clean, reliable, and testable code.

This guide will walk you through testing a REST API or a service layer:

  • Creating a test project
  • Using xUnit to write clean unit tests
  • Using Moq to mock dependencies
  • Using best practices for test architecture and maintainability

Setting Up Your ASP.NET Core Project with C# 13
Start with .NET 8 and an ASP.NET Core Web API project, using C# 13 language features.
dotnet new sln -n HflApi

dotnet new web -n Hfl.Api
dotnet new classlib -n Hfl.Domain
dotnet new classlib -n Hfl.Core
dotnet new classlib -n Hfl.Application
dotnet new classlib -n Hfl.Infrastructure


dotnet sln add Hfl.Api/Hfl.Api.csproj
dotnet sln add Hfl.Domain/Hfl.Domain.csproj
dotnet sln add Hfl.Core/Hfl.Core.csproj
dotnet sln add Hfl.Application/Hfl.Application.csproj
dotnet sln add Hfl.Infrastructure/Hfl.Infrastructure.csproj


Create a Test Project with xUnit and Moq
Add a new test project:
dotnet new xunit -n HflApi.Tests
dotnet add HflApi.Tests/HflApi.Tests.csproj reference Hfl.Api/Hfl.Api.csproj
dotnet add HflApi.Tests package Moq


Use Case: Testing a Service Layer
Domain, Service, Repository, Interface, and API
namespace HflApi.Domain;
public record Order(Guid Id, string Status);


using HflApi.Domain;

namespace HflApi.Core.Interfaces;

public interface IOrderRepository
{
    Task<Order?> GetByIdAsync(Guid id);
}


using HflApi.Core.Interfaces;

namespace HflApi.Application.Services;

public class OrderService
{
    private readonly IOrderRepository _repository;

    public OrderService(IOrderRepository repository)
    {
        _repository = repository;
    }

    public async Task<string> GetOrderStatusAsync(Guid orderId)
    {
        var order = await _repository.GetByIdAsync(orderId);
        return order?.Status ?? "Not Found";
    }
}


using HflApi.Core.Interfaces;
using HflApi.Domain;

namespace Hfl.Infrastructure.Repositories
{
    public class InMemoryOrderRepository : IOrderRepository
    {
        private readonly List<Order> _orders = new()
    {
        new Order(Guid.Parse("7c3308b4-637f-426b-aafc-471697dabeb4"), "Processed"),
        new Order(Guid.Parse("5aee5943-56d0-4634-9f6c-7772f6d9c161"), "Pending")
    };

        public Task<Order?> GetByIdAsync(Guid id)
        {
            var order = _orders.FirstOrDefault(o => o.Id == id);
            return Task.FromResult(order);
        }
    }
}

using Hfl.Infrastructure.Repositories;
using HflApi.Application.Services;
using HflApi.Core.Interfaces;


var builder = WebApplication.CreateBuilder(args);


builder.Services.AddScoped<IOrderRepository, InMemoryOrderRepository>();
builder.Services.AddScoped<OrderService>();

var app = builder.Build();

app.MapGet("/orders/{id:guid}", async (Guid id, OrderService service) =>
{
    var status = await service.GetOrderStatusAsync(id);
    return Results.Ok(new { OrderId = id, Status = status });
});

app.Run();


Unit Testing with xUnit and Moq
Test Class
using Moq;
using HflApi.Application.Services;
using HflApi.Core.Interfaces;
using HflApi.Domain;


namespace HflApi.Tests;

public class OrderServiceTests
{
    private readonly Mock<IOrderRepository> _mockRepo;
    private readonly OrderService _orderService;

    public OrderServiceTests()
    {
        _mockRepo = new Mock<IOrderRepository>();
        _orderService = new OrderService(_mockRepo.Object);
    }

    [Fact]
    public async Task GetOrderStatusAsync_ReturnsStatus_WhenOrderExists()
    {
        var orderId = Guid.NewGuid();
        _mockRepo.Setup(r => r.GetByIdAsync(orderId))
                 .ReturnsAsync(new Order(orderId, "Processed"));

        var result = await _orderService.GetOrderStatusAsync(orderId);

        Assert.Equal("Processed", result);
    }

    [Fact]
    public async Task GetOrderStatusAsync_ReturnsNotFound_WhenOrderDoesNotExist()
    {
        var orderId = Guid.NewGuid();
        _mockRepo.Setup(r => r.GetByIdAsync(orderId))
                 .ReturnsAsync((Order?)null);

        var result = await _orderService.GetOrderStatusAsync(orderId);

        Assert.Equal("Not Found", result);
    }

    [Theory]
    [InlineData("Processed")]
    [InlineData("Pending")]
    [InlineData("Shipped")]
    public async Task GetOrderStatus_ReturnsCorrectStatus(string status)
    {
        var orderId = Guid.NewGuid();
        _mockRepo.Setup(r => r.GetByIdAsync(orderId))
                 .ReturnsAsync(new Order(orderId, status));

        var result = await _orderService.GetOrderStatusAsync(orderId);

        Assert.Equal(status, result);
    }
}


Best Practices
1. Use Dependency Injection for Testability
Inject all dependencies; avoid static classes and service locator patterns.

2. Keep Tests Isolated
Use Moq to stub out database and network I/O so that each test exercises only the unit under test.

3. Use Theory for Parameterized Tests
[Theory]
[InlineData("Processed")]
[InlineData("Pending")]
[InlineData("Shipped")]
public async Task GetOrderStatus_ReturnsCorrectStatus(string status)
{
    var orderId = Guid.NewGuid();
    _mockRepo.Setup(r => r.GetByIdAsync(orderId))
             .ReturnsAsync(new Order(orderId, status));

    var result = await _orderService.GetOrderStatusAsync(orderId);

    Assert.Equal(status, result);
}

4. Group Tests by Behavior (Not CRUD)
Organize tests by what the system does, not by how it is implemented. For example:

  • GetOrderStatus_ShouldReturnCorrectStatus
  • CreateOrder_ShouldSendNotification

5. Use Records for Test Data in C# 13
public record Order(Guid Id, string Status);

Immutable, concise, and readable test data objects can be created using records.
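Records also support non-destructive mutation via `with` expressions, which keeps test data setup concise. A small sketch (the values and helper class are illustrative):

```csharp
public record Order(Guid Id, string Status);

public static class OrderTestData
{
    public static readonly Order Baseline = new(Guid.NewGuid(), "Pending");

    // Copy the baseline, changing only the property under test
    public static Order Shipped() => Baseline with { Status = "Shipped" };
}
```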

Test Coverage Tips

  • To measure test coverage, use Coverlet or JetBrains dotCover.
  • Target business rules and logic at the service layer.
  • Avoid over-testing third-party libraries or trivial getters/setters.
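Coverlet, for example, plugs into the normal test run via its collector (assuming the coverlet.collector NuGet package is added to the test project):

```shell
dotnet add HflApi.Tests package coverlet.collector
dotnet test --collect:"XPlat Code Coverage"
```

The run writes a coverage report (Cobertura XML by default) under the test project's TestResults folder.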

Recommended Tools

  • xUnit: Unit testing framework
  • Moq: Mocking dependencies
  • FluentAssertions: Readable assertions
  • Coverlet: Code coverage

Summary

Use xUnit, Moq, and C# 13 capabilities to test ASP.NET Core applications. Clean architecture, isolated unit tests, and descriptive test names make applications dependable. By combining DI, mocking, and xUnit assertions, developers can find and address problems earlier in the development cycle, which means quicker feedback, more confidence, and more maintainable systems. Isolated unit tests guarantee that each part functions on its own, improving the overall reliability of the system, while a clear design and meaningful test names keep the codebase easy to understand as it grows. Well-defined test cases and modular designs also ease onboarding and debugging for team members, ultimately producing more scalable and resilient applications.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Using Blazor Server to Call REST APIs: An Introduction with Example

clock June 5, 2025 08:27 by author Peter

Prerequisites

  • Basic knowledge of Blazor Server
  • Visual Studio or VS Code
  • .NET 6 or .NET 8 SDK
  • A sample REST API (we'll use JSONPlaceholder)

Step 1. Create a Blazor Server App

  • Open Visual Studio
  • Create a new Blazor Server App
  • Name it BlazorRestClientDemo

Step 2. Create the Model

public class Post
{
    public int UserId { get; set; }
    public int Id { get; set; }
    public string Title { get; set; }
    public string Body { get; set; }
}

Step 3. Register HttpClient in Program.cs
builder.Services.AddHttpClient("API", client =>
{
    client.BaseAddress = new Uri("https://jsonplaceholder.typicode.com/");
});


Step 4. Create a Service to Call the API

public class PostService
{
    private readonly HttpClient _http;

    public PostService(IHttpClientFactory factory)
    {
        _http = factory.CreateClient("API");
    }

    public async Task<List<Post>> GetPostsAsync()
    {
        var response = await _http.GetFromJsonAsync<List<Post>>("posts");
        return response!;
    }

    public async Task<Post?> GetPostAsync(int id)
    {
        return await _http.GetFromJsonAsync<Post>($"posts/{id}");
    }

    public async Task<Post?> CreatePostAsync(Post post)
    {
        var response = await _http.PostAsJsonAsync("posts", post);
        return await response.Content.ReadFromJsonAsync<Post>();
    }
}


Step 5. Register the Service in Program.cs

builder.Services.AddScoped<PostService>();

Step 6. Use in a Razor Component
@page "/posts"
@inject PostService PostService

<h3>All Posts</h3>

@if (posts == null)
{
    <p>Loading...</p>
}
else
{
    <ul>
        @foreach (var post in posts)
        {
            <li><b>@post.Title</b> - @post.Body</li>
        }
    </ul>
}

@code {
    private List<Post>? posts;

    protected override async Task OnInitializedAsync()
    {
        posts = await PostService.GetPostsAsync();
    }
}

Add a simple form and call CreatePostAsync().
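A minimal sketch of such a form, bound to the Post model and submitting through the service (the markup and handler names are one possible shape):

```razor
@page "/create-post"
@inject PostService PostService

<h3>New Post</h3>

<input placeholder="Title" @bind="newPost.Title" />
<textarea placeholder="Body" @bind="newPost.Body"></textarea>
<button @onclick="SubmitAsync">Create</button>

@if (created is not null)
{
    <p>Created post with Id @created.Id</p>
}

@code {
    private Post newPost = new() { UserId = 1 };
    private Post? created;

    private async Task SubmitAsync()
    {
        created = await PostService.CreatePostAsync(newPost);
    }
}
```

JSONPlaceholder fakes the POST and echoes the new resource back with an Id, so the confirmation message appears without any real persistence.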

Conclusion
Blazor Server apps can easily consume REST APIs using HttpClient and typed models. In this article, you learned how to,

  • Register and inject HttpClient
  • Call GET and POST endpoints
  • Display data in the UI

Blazor is a powerful front-end technology, and now you know how to connect it with real-world APIs.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Using C# 13, EF Core, and DDD to Create a Clean ASP.NET Core API

clock June 3, 2025 07:30 by author Peter

We'll show you how to create a scalable, modular, and testable RESTful API using the newest .NET technology and industry best practices. The solution is designed with maintainability and clean architecture principles in mind, making it a good fit for real-world enterprise development. HTTP requests and responses will be handled by ASP.NET Core 8, and C# 13 (with preview features) will let us write clearer, more expressive code using the most recent language improvements. To access and save data, we'll use Entity Framework Core 8 with SQL Server.

By prioritizing key domain logic and dividing concerns across layers, the design complies with Domain-Driven Design (DDD). We'll use the Repository and Unit of Work patterns to abstract data access and guarantee transactional consistency. The project also includes structured logging, centralized exception management, and FluentValidation for input validation for stability and traceability.

  • ASP.NET Core 8
  • C# 13 (Preview features)
  • Entity Framework Core 8
  • MS SQL Server
  • Domain-Driven Design (DDD)
  • Repository + Unit of Work Patterns
  • Dependency Injection
  • Validation, Error Handling & Logging


Project Setup & Structure
Clean Folder Structure (Clean Architecture & DDD-Aligned)

In order to promote separation of concerns, maintainability, and testability, the project uses Clean Architecture principles and Domain-Driven Design principles. The src directory is divided into well-defined layers, each with a specific responsibility.

  • CompanyManagement.API: As the interface between the application layer and the outside world, CompanyManagement.API contains API controllers, dependency injection configurations, and middleware such as error handling and logging.
  • CompanyManagement.Application: In CompanyManagement.Application, the business logic is encapsulated in services (use cases), data transfer objects (DTOs), and command/query handlers. Without concern for persistence or infrastructure, this layer coordinates tasks and enforces application rules.
  • CompanyManagement.Domain: CompanyManagement.Domain defines the business model through entities, interfaces, enums, and value objects. This layer is completely independent of any other project and represents the domain logic.
  • CompanyManagement.Infrastructure: The CompanyManagement.Infrastructure class implements the technical details necessary to support the application and domain layers. This includes Entity Framework Core configurations, the DbContext, repository implementations, and database migrations.
  • CompanyManagement.Tests: CompanyManagement.Tests contains unit and integration tests to help maintain code quality and prevent regressions by testing each component of the system in isolation or as part of a broader workflow.

As a result of this layered structure, the application is able to evolve and scale while keeping the codebase clean, decoupled, and easy to test.
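On disk, this layering might look like the following (an illustrative sketch; the folders inside each project are a matter of convention):

```
src/
├── CompanyManagement.API/             API endpoints, DI configuration, middleware
├── CompanyManagement.Application/     Services (use cases), DTOs, handlers
├── CompanyManagement.Domain/          Entities, interfaces, enums, value objects
├── CompanyManagement.Infrastructure/  EF Core DbContext, repositories, migrations
└── CompanyManagement.Tests/           Unit and integration tests
```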

Step-by-Step Implementation
Define the Domain Model

Company.cs – Domain Entity
In Domain-Driven Design (DDD), the domain model captures the core business logic of your application. In the CompanyManagement.Domain.Entities namespace, the Company entity encapsulates key business rules and states of a real-world company.

There are three primary properties in the Company class:

  • Id: A unique identifier (Guid) generated at the time of creation.
  • Name: The company's name, immutable from outside the class.
  • EstablishedOn: The date the company was founded.

As Entity Framework Core (EF Core) requires a parameterless constructor for materialisation, the constructor is intentionally private. To ensure that Id is always generated and that required fields (Name, EstablishedOn) are always initialised during creation, a public constructor is provided.

This method contains a guard clause to ensure the new name is not null, empty, or whitespace, enforcing business rules directly within the domain model.

Rich domain modelling in DDD adheres to encapsulation, immutability (where appropriate), and self-validation, which are key principles.

Company.cs

namespace CompanyManagement.Domain.Entities;

public class Company
{
    public Guid Id { get; private set; }
    public string Name { get; private set; } = string.Empty;
    public DateTime EstablishedOn { get; private set; }

    private Company() { }

    public Company(string name, DateTime establishedOn)
    {
        Id = Guid.NewGuid();
        Name = name;
        EstablishedOn = establishedOn;
    }

    public void Rename(string newName)
    {
        if (string.IsNullOrWhiteSpace(newName))
            throw new ArgumentException("Name cannot be empty");

        Name = newName;
    }
}


Define the Repository Interface
In Domain-Driven Design (DDD), the repository pattern provides an abstraction over data persistence, enabling the domain layer to remain independent from infrastructure concerns like databases or external APIs. The ICompanyRepository interface, defined in the CompanyManagement.Domain.Interfaces namespace, outlines a contract for working with Company aggregates.

This interface declares the fundamental CRUD operations required to interact with the Company entity:

  • Task<Company?> GetByIdAsync(Guid id): Asynchronously retrieves a company by its unique identifier. Returns null if no match is found.
  • Task<List<Company>> GetAllAsync(): Returns a list of all the companies in the system.
  • Task AddAsync(Company company): Asynchronously adds a new Company to the data store.
  • void Update(Company company): Updates an existing company, typically after domain methods like Rename modify the entity's state.
  • void Delete(Company company): Removes a company from the data store.

Because this interface is defined in the domain layer, the rest of the application depends only on abstractions, not implementations. This follows the Dependency Inversion Principle (DIP), a key tenet of Clean Architecture, and promotes loose coupling and testability (e.g., mocking the repository in unit tests).

ICompanyRepository.cs

using CompanyManagement.Domain.Entities;

namespace CompanyManagement.Domain.Interfaces;

public interface ICompanyRepository
{
    Task<Company?> GetByIdAsync(Guid id);
    Task<List<Company>> GetAllAsync();
    Task AddAsync(Company company);
    void Update(Company company);
    void Delete(Company company);
}

EF Core Implementation
As the heart of the Entity Framework Core data access layer, AppDbContext represents a session with the SQL Server database, allowing us to query and save domain entity instances. Input into the constructor is handled through dependency injection, which allows the application to configure the context externally, such as setting the connection string or enabling SQL Server-specific functionality.

The Companies property is a strongly typed DbSet<Company>, which EF Core uses to track and manage Company entities in the database. It abstracts the Companies table and allows you to perform operations like querying, inserting, updating, and deleting data through LINQ and asynchronous methods.

The OnModelCreating method is overridden to configure entity mappings using the Fluent API. Here, we map the Company entity to the Companies table explicitly using modelBuilder.Entity<Company>().ToTable("Companies"). This approach provides flexibility for configuring additional constraints, relationships, and database-specific settings in the future — while keeping the domain model clean and free of persistence concerns.

We follow the Separation of Concerns principle by isolating all database configurations within AppDbContext, ensuring our domain remains pure and focused only on business logic.

AppDbContext.cs
using CompanyManagement.Domain.Entities;
using Microsoft.EntityFrameworkCore;

namespace CompanyManagement.Infrastructure.Data;

public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }

    public DbSet<Company> Companies => Set<Company>();

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Company>().ToTable("Companies");
    }
}
 

For managing Company entities in the database, the CompanyRepository class implements the ICompanyRepository interface concretely. The application leverages Entity Framework Core and is injected with an AppDbContext instance, so that it can interact directly with the database context. By using ToListAsync(), the GetAllAsync method retrieves all Company records from the database asynchronously. The GetByIdAsync method locates a specific company using its unique identifier (Guid id), returning null otherwise.

With AddAsync, a new Company entity is asynchronously added to the context's change tracker, preparing it for insertion into the database when SaveChangesAsync() is called. Update marks an existing company entity as modified, and Delete flags an entity for deletion.

Using the Repository Pattern, the data access logic is abstracted from the rest of the application, which keeps the underlying persistence mechanism hidden. As a result, we can swap or enhance persistence strategies without affecting the domain or application layers, promoting separation of concerns, testability, and maintainability.

CompanyRepository.cs

using CompanyManagement.Domain.Entities;
using CompanyManagement.Domain.Interfaces;
using CompanyManagement.Infrastructure.Data;
using Microsoft.EntityFrameworkCore;

namespace CompanyManagement.Infrastructure.Repositories;

public class CompanyRepository(AppDbContext context) : ICompanyRepository
{
    public Task<List<Company>> GetAllAsync() => context.Companies.ToListAsync();
    public Task<Company?> GetByIdAsync(Guid id) => context.Companies.FindAsync(id).AsTask();
    public Task AddAsync(Company company) => context.Companies.AddAsync(company).AsTask();
    public void Update(Company company) => context.Companies.Update(company);
    public void Delete(Company company) => context.Companies.Remove(company);
}


Application Layer – Services & DTOs
As part of the Application Layer, Data Transfer Objects (DTOs) are crucial to separating external inputs from internal domain models. With C# 9+, the CreateCompanyDTO is an immutable and simple data structure that encapsulates only the necessary data for creating a new company, such as its name and its establishment date.

Using a DTO like CreateCompanyDto ensures that the API or service layer receives only the necessary information while maintaining a clear boundary from domain entities. It improves maintainability, validation, and security, guards against over-posting, simplifies testing, and supports serialization/deserialization out of the box.

It is a clean, minimal contract aligned with Clean Architecture and Single Responsibility Principle by protecting the domain from direct external access.

CreateCompanyDto.cs

namespace CompanyManagement.Application.DTOs;

public record CreateCompanyDto(string Name, DateTime EstablishedOn);

In this class, business operations related to Company entities are handled by the core application service. When persisting changes, it relies on two key abstractions: ICompanyRepository for data access and IUnitOfWork for transactional consistency management.

In the CreateAsync method, a new company is created by instantiating a new Company domain entity based on the CreateCompanyDto provided, then delegating to the repository and adding it asynchronously. Then it commits the transaction to the database by calling the SaveChangesAsync method on the unit of work. Finally, the caller receives the unique identifier (Id) for the newly created company that can be referenced in the future.

The GetAllAsync method asynchronously retrieves all existing companies by invoking the corresponding repository method, returning a list of Company entities.

The service contains the business logic and coordinates domain operations, while the repository abstracts persistence details. It follows best practices for asynchronous programming and dependency injection, and its interface-based dependencies make it straightforward to unit test.

CompanyService.cs
using CompanyManagement.Application.DTOs;
using CompanyManagement.Domain.Entities;
using CompanyManagement.Domain.Interfaces;

namespace CompanyManagement.Application.Services;

public class CompanyService(ICompanyRepository repository, IUnitOfWork unitOfWork)
{
    public async Task<Guid> CreateAsync(CreateCompanyDto dto)
    {
        var company = new Company(dto.Name, dto.EstablishedOn);
        await repository.AddAsync(company);
        await unitOfWork.SaveChangesAsync();
        return company.Id;
    }

    public async Task<List<Company>> GetAllAsync() => await repository.GetAllAsync();
}

Unit of Work
IUnitOfWork represents a fundamental pattern for managing data persistence and transactional consistency in an application. It abstracts the concept of a "unit of work," which encapsulates a series of operations that should be treated as a single atomic transaction. The SaveChangesAsync() method commits all pending changes to the underlying data store asynchronously. This method returns an integer indicating the number of state entries committed.

By relying on this abstraction, the application ensures that all modifications across multiple repositories can be coordinated and saved together, preserving data integrity and consistency. Furthermore, implementations can be mocked or swapped without changing the consuming code, improving testability and separation of concerns.

IUnitOfWork.cs
namespace CompanyManagement.Domain.Interfaces;

public interface IUnitOfWork
{
    Task<int> SaveChangesAsync();
}

The UnitOfWork class implements the IUnitOfWork interface, encapsulating the transaction management logic for the application. It depends on the AppDbContext, which represents the Entity Framework Core database context.

It simply delegates the call to context.SaveChangesAsync(), asynchronously persisting all tracked changes. This ensures that any changes made through repositories within a unit of work are saved as one atomic operation.

Using the UnitOfWork class to centralize the commit logic makes it easier to coordinate multiple repository operations under one transaction, while also supporting dependency injection and testing.

UnitOfWork.cs
using CompanyManagement.Domain.Interfaces;

namespace CompanyManagement.Infrastructure.Data;

public class UnitOfWork(AppDbContext context) : IUnitOfWork
{
    public Task<int> SaveChangesAsync() => context.SaveChangesAsync();
}


API Layer
ASP.NET Core's Dependency Injection (DI) container is used to wire up essential services in Program.cs. In order to register the AppDbContext with the DI container, AddDbContext is first used. The connection string is retrieved from the application's configuration under the key "Default" and configures Entity Framework Core to use SQL Server as the database provider. As a result, the application can access the database seamlessly.

Through AddScoped, the repository and unit of work abstractions are then mapped to their concrete implementations. The result is that a new instance of CompanyRepository and UnitOfWork is created and shared for every HTTP request, providing scoped lifetime management for database operations.

Also registered as a scoped service is the CompanyService, which contains the business logic for managing companies. With this setup, controllers and other components receive these dependencies via constructor injection, thereby promoting loose coupling, testability, and separation of concerns in the API layer.

ServiceRegistration.cs
using CompanyManagement.Application.Services;
using CompanyManagement.Domain.Interfaces;
using CompanyManagement.Infrastructure.Data;
using CompanyManagement.Infrastructure.Repositories;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

namespace CompanyManagement.Infrastructure.DependencyInjection;

public static class ServiceRegistration
{
    public static IServiceCollection AddInfrastructure(this IServiceCollection services, IConfiguration configuration)
    {
        services.AddDbContext<AppDbContext>(options =>
            options.UseSqlServer(configuration.GetConnectionString("Default")));

        services.AddScoped<ICompanyRepository, CompanyRepository>();
        services.AddScoped<IUnitOfWork, UnitOfWork>();
        services.AddScoped<CompanyService>();

        return services;
    }
}

SwaggerConfiguration.cs
using Microsoft.OpenApi.Models;

namespace CompanyManagement.API.Configurations;

public static class SwaggerConfiguration
{
    public static IServiceCollection AddSwaggerDocumentation(this IServiceCollection services)
    {
        services.AddEndpointsApiExplorer();

        services.AddSwaggerGen(options =>
        {
            options.SwaggerDoc("v1", new OpenApiInfo
            {
                Title = "Company API",
                Version = "v1",
                Description = "API for managing companies using Clean Architecture"
            });

            // Optional: Include XML comments
            var xmlFilename = $"{System.Reflection.Assembly.GetExecutingAssembly().GetName().Name}.xml";
            var xmlPath = Path.Combine(AppContext.BaseDirectory, xmlFilename);
            if (File.Exists(xmlPath))
            {
                options.IncludeXmlComments(xmlPath);
            }
        });

        return services;
    }
}


SwaggerMiddleware.cs
namespace CompanyManagement.API.Middleware;

public static class SwaggerMiddleware
{
    public static IApplicationBuilder UseSwaggerDocumentation(this IApplicationBuilder app)
    {
        app.UseSwagger();

        app.UseSwaggerUI(options =>
        {
            options.SwaggerEndpoint("/swagger/v1/swagger.json", "Company API V1");
        });

        return app;
    }
}

Program.cs
using CompanyManagement.API.Configurations;
using CompanyManagement.API.Endpoints;
using CompanyManagement.API.Middleware;
using CompanyManagement.Infrastructure.DependencyInjection;
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddInfrastructure(builder.Configuration);
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerDocumentation();

var app = builder.Build();

if (app.Environment.IsDevelopment())
{
    app.UseSwaggerDocumentation();
}

app.UseHttpsRedirection();

app.MapCompanyEndpoints();

app.Run();


appsettings.json
{
  "ConnectionStrings": {
    "Default": "Server=localhost;Database=CompanyDb;Trusted_Connection=True;MultipleActiveResultSets=true"
  },
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "AllowedHosts": "*"
}

CompanyEndpoints is the REST API entry point that handles HTTP requests related to company management. Rather than a traditional controller, it uses ASP.NET Core minimal APIs: the MapCompanyEndpoints extension method registers the routes directly on the WebApplication, and each handler receives an instance of CompanyService via parameter injection, delegating business logic operations to the application layer.

The POST /api/companies endpoint accepts a CreateCompanyDto from the request body, calls the service's CreateAsync method, and returns a 201 Created response with the new resource's location header.
The GET /api/companies/{id} endpoint retrieves all companies via the service, searches for one with the matching ID, and returns 200 OK if a match is found, or 404 Not Found otherwise.
The GET /api/companies endpoint returns a 200 OK response with all companies.

By offloading business logic to the service layer, these endpoints adhere to the Single Responsibility Principle and remain maintainable and testable.

CompanyEndpoints.cs
using CompanyManagement.Application.DTOs;
using CompanyManagement.Application.Services;

namespace CompanyManagement.API.Endpoints;

public static class CompanyEndpoints
{
    public static void MapCompanyEndpoints(this WebApplication app)
    {
        app.MapPost("/api/companies", async (CreateCompanyDto dto, CompanyService service) =>
        {
            var id = await service.CreateAsync(dto);
            return Results.Created($"/api/companies/{id}", new { id });
        });

        app.MapGet("/api/companies", async (CompanyService service) =>
        {
            var companies = await service.GetAllAsync();
            return Results.Ok(companies);
        });

        app.MapGet("/api/companies/{id:guid}", async (Guid id, CompanyService service) =>
        {
            var companies = await service.GetAllAsync();
            var match = companies.FirstOrDefault(c => c.Id == id);
            return match is not null ? Results.Ok(match) : Results.NotFound();
        });
    }
}
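To expose these endpoints, the extension method is typically called from the application's composition root. The sketch below shows one minimal way to wire this up in Program.cs; the service lifetime and any repository registrations are assumptions, since the real project's startup code is not shown here.

```csharp
using CompanyManagement.Application.Services;
using CompanyManagement.API.Endpoints;

var builder = WebApplication.CreateBuilder(args);

// Register the application service so parameter injection can resolve it.
// (Any repository dependencies would be registered here as well.)
builder.Services.AddScoped<CompanyService>();

var app = builder.Build();

// Wire up the minimal API routes defined in CompanyEndpoints.
app.MapCompanyEndpoints();

app.Run();
```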

Best Practices
Several key best practices are followed in this project to ensure that the code is clean, maintainable, and scalable. For instance, Data Transfer Objects (DTOs) are used to avoid exposing domain entities directly to external clients, improving security and abstraction. The Single Responsibility Principle (SRP) is applied by ensuring each layer has a distinct, focused responsibility, resulting in improved code organisation and clarity.

The project uses Dependency Injection to promote flexibility, ease testing, and maintain separation of concerns. For I/O-bound operations such as database access, asynchronous programming with async/await improves application responsiveness and scalability.

As a result of exception handling middleware, error logging is consistent, and maintenance is simplified. Finally, the project includes comprehensive unit testing using xUnit and Moq frameworks for each layer, which ensures code quality and reliability in the future.
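The article mentions xUnit and Moq for layer-by-layer testing. As a self-contained illustration of the same idea, the sketch below substitutes a hand-rolled fake repository for a Moq mock; the CreateCompanyDto, CompanyDto, ICompanyRepository, and CompanyService shapes here are simplified assumptions, not the project's actual definitions.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical shapes -- the real project's types may differ.
public record CreateCompanyDto(string Name);
public record CompanyDto(Guid Id, string Name);

public interface ICompanyRepository
{
    Task<Guid> AddAsync(string name);
    Task<IReadOnlyList<CompanyDto>> GetAllAsync();
}

public class CompanyService
{
    private readonly ICompanyRepository _repo;
    public CompanyService(ICompanyRepository repo) => _repo = repo;
    public Task<Guid> CreateAsync(CreateCompanyDto dto) => _repo.AddAsync(dto.Name);
    public Task<IReadOnlyList<CompanyDto>> GetAllAsync() => _repo.GetAllAsync();
}

// Hand-rolled fake standing in for a Moq mock.
public class FakeCompanyRepository : ICompanyRepository
{
    private readonly List<CompanyDto> _items = new();

    public Task<Guid> AddAsync(string name)
    {
        var id = Guid.NewGuid();
        _items.Add(new CompanyDto(id, name));
        return Task.FromResult(id);
    }

    public Task<IReadOnlyList<CompanyDto>> GetAllAsync() =>
        Task.FromResult<IReadOnlyList<CompanyDto>>(_items);
}

public class Program
{
    public static async Task Main()
    {
        // Arrange: service under test with the fake dependency.
        var service = new CompanyService(new FakeCompanyRepository());

        // Act: create one company, then read everything back.
        var id = await service.CreateAsync(new CreateCompanyDto("Acme"));
        var all = await service.GetAllAsync();

        // Assert: exactly one company exists and its ID matches.
        Console.WriteLine(all.Count == 1 && all[0].Id == id ? "PASS" : "FAIL");
    }
}
```

In a real xUnit test, the Console.WriteLine check would become Assert.Single and Assert.Equal calls, and the fake would typically be a Mock&lt;ICompanyRepository&gt; with Setup/ReturnsAsync.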



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Use Cases and Performance Comparison of PLINQ vs LINQ in C#

clock May 26, 2025 08:20 by author Peter

Large datasets and computationally demanding tasks are increasingly common in software applications, so developers need effective tools for processing data. PLINQ (Parallel LINQ) and LINQ (Language Integrated Query) are two popular choices in C#. Although their syntax is nearly identical, their query execution models differ significantly. With the help of real-world examples and performance comparisons, this article examines the main differences, use cases, and performance characteristics of LINQ and PLINQ.

What is LINQ?
LINQ (Language Integrated Query) is a feature of C# that enables developers to perform data querying in a syntax integrated into the language. Introduced in .NET Framework 3.5, LINQ provides a consistent method to work with different data sources like collections, databases, XML, and more. It executes queries sequentially, processing each item in turn.

LINQ Example

var numbers = new List<int> { 1, 2, 3, 4, 5 };
var evenNumbers = numbers.Where(n => n % 2 == 0).ToList();

foreach (var number in evenNumbers)
{
    Console.WriteLine(number); // Output: 2, 4
}

LINQ is straightforward to use and works well for small-to-medium-sized datasets or queries that are not computationally intensive.

What is PLINQ?

PLINQ (Parallel LINQ) was introduced with .NET Framework 4.0 and extends LINQ by enabling parallel query execution. Built on the Task Parallel Library (TPL), PLINQ uses multiple CPU cores to process large datasets or computationally expensive operations more efficiently. It partitions data into chunks and executes them concurrently using threads.

PLINQ Example
var numbers = Enumerable.Range(1, 10_000);
var evenNumbers = numbers.AsParallel()
                         .Where(n => n % 2 == 0)
                         .ToList();

Console.WriteLine(evenNumbers.Count); // Output: 5000


The AsParallel() method enables parallel execution of the query, leveraging all available processor cores.

Performance Comparison Between LINQ and PLINQ

To better understand how LINQ and PLINQ differ in performance, let’s process a large dataset and measure the time taken for each.

Example: LINQ vs PLINQ Performance
The following code processes a dataset of numbers from 1 to 5,000,000 and filters prime numbers using both LINQ and PLINQ. We also measure execution time.
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

class Program
{
    static void Main()
    {
        // Prepare a large dataset
        var largeDataSet = Enumerable.Range(1, 5_000_000).ToList();

        // LINQ benchmark
        var stopwatch = Stopwatch.StartNew();
        var linqPrimes = largeDataSet.Where(IsPrime).ToList();
        stopwatch.Stop();
        Console.WriteLine($"LINQ Time: {stopwatch.ElapsedMilliseconds} ms");
        Console.WriteLine($"LINQ Prime Count: {linqPrimes.Count}");

        // PLINQ benchmark
        stopwatch.Restart();
        var plinqPrimes = largeDataSet.AsParallel().Where(IsPrime).ToList();
        stopwatch.Stop();
        Console.WriteLine($"PLINQ Time: {stopwatch.ElapsedMilliseconds} ms");
        Console.WriteLine($"PLINQ Prime Count: {plinqPrimes.Count}");
    }

    static bool IsPrime(int number)
    {
        if (number <= 1) return false;
        // Only trial-divide up to the square root, computed once outside the loop.
        int limit = (int)Math.Sqrt(number);
        for (int i = 2; i <= limit; i++)
        {
            if (number % i == 0) return false;
        }
        return true;
    }
}

Explanation of Benchmark

  • Dataset: A large range of numbers (1 to 5,000,000) serves as the input.
  • LINQ: The query is processed sequentially, examining each number to determine if it is prime.
  • PLINQ: The query runs in parallel, dividing the dataset into chunks for multiple threads to process concurrently.

Expected Output
On a multi-core machine, PLINQ typically completes in a fraction of the LINQ time, while both queries report the same prime count (348,513 primes below 5,000,000). Exact timings depend on your CPU and core count, so run the benchmark on your own hardware.

Ordered vs Unordered Processing in PLINQ
By default, PLINQ processes data in unordered mode to maximize performance. However, if your application requires results to be in the same order as the input dataset, you can enforce order using .AsOrdered().

Example: Using .AsOrdered() in PLINQ
var numbers = Enumerable.Range(1, 10);
var orderedResult = numbers.AsParallel()
                       .AsOrdered()
                       .Where(n => n % 2 == 0)
                       .ToList();
Console.WriteLine(string.Join(", ", orderedResult)); // Output: 2, 4, 6, 8, 10

If maintaining the order doesn’t matter, you can use .AsUnordered() to further optimize performance.

Benchmark: Ordered vs Unordered PLINQ
var numbers = Enumerable.Range(1, 1_000_000).ToList();

var stopwatch = Stopwatch.StartNew();

// Ordered PLINQ
var orderedPrimes = numbers.AsParallel()
                       .AsOrdered()
                       .Where(IsPrime)
                       .ToList();
stopwatch.Stop();
Console.WriteLine($"AsOrdered Time: {stopwatch.ElapsedMilliseconds} ms");

stopwatch.Restart();

// Unordered PLINQ
var unorderedPrimes = numbers.AsParallel()
                         .AsUnordered()
                         .Where(IsPrime)
                         .ToList();
stopwatch.Stop();
Console.WriteLine($"AsUnordered Time: {stopwatch.ElapsedMilliseconds} ms");

Expected Output
AsOrdered Time: 210 ms
AsUnordered Time: 140 ms

Use Cases for LINQ and PLINQ

When to Use LINQ?

  • Small datasets where sequential processing is efficient.
  • Tasks requiring strict order preservation.
  • Easy debugging and simple queries.
  • Real-time systems where lower latency matters more than raw throughput.

When to Use PLINQ?

  • Large datasets where parallel execution can reduce runtime.
  • Computationally intensive tasks, such as processing images or mathematical operations.
  • Bulk operations where order doesn’t matter, e.g., statistical analysis of logs.
  • Applications running on multi-core machines that can take advantage of the available CPU resources.
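On the last point, PLINQ also lets you control how many cores a query may use via WithDegreeOfParallelism, which is useful when you want to leave CPU capacity free for other work. A small self-contained example:

```csharp
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        var numbers = Enumerable.Range(1, 1_000_000);

        // Cap the query at (up to) four worker threads; PLINQ may use fewer.
        var evenCount = numbers.AsParallel()
                               .WithDegreeOfParallelism(4)
                               .Count(n => n % 2 == 0);

        Console.WriteLine(evenCount); // Output: 500000
    }
}
```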

Summary Table of Insights
Key Differences Between LINQ and PLINQ

| Feature            | LINQ                               | PLINQ                                              |
|--------------------|------------------------------------|----------------------------------------------------|
| Execution          | Sequential                         | Parallel                                           |
| Performance        | Best suited for small datasets     | Designed for large datasets                        |
| Utilization        | Uses a single CPU core             | Utilizes multiple CPU cores and threads            |
| Order Preservation | Preserves element order by default | Unordered by default (order can be enforced)       |
| Error Handling     | Simple error propagation           | Requires handling of thread-specific exceptions    |
| Control            | Limited control over execution     | Offers options like cancellation and partitioning  |
| Overhead           | No additional overhead             | Thread management and partitioning may add overhead |
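The error-handling difference deserves a concrete example: because PLINQ runs work on multiple threads, failures from the worker threads are surfaced as a single AggregateException wrapping the individual exceptions, rather than propagating directly as in sequential LINQ.

```csharp
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        var numbers = Enumerable.Range(-5, 11); // -5 .. 5, includes 0

        try
        {
            // 100 / n throws DivideByZeroException when n == 0;
            // PLINQ wraps worker-thread failures in an AggregateException.
            var results = numbers.AsParallel()
                                 .Select(n => 100 / n)
                                 .ToList();
        }
        catch (AggregateException ex)
        {
            foreach (var inner in ex.InnerExceptions)
                Console.WriteLine(inner.GetType().Name); // DivideByZeroException
        }
    }
}
```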

Conclusion
In C#, LINQ and PLINQ are both excellent tools for querying data. LINQ is appropriate for smaller, simpler workloads, whereas PLINQ performs best when large datasets or computationally expensive operations can benefit from parallelism.

PLINQ offers both ordered and unordered processing, depending on whether result ordering or raw throughput is your first priority. Benchmarking your queries against real-world data is the best way to find the optimal approach for your use case.

By balancing order sensitivity, performance, and application complexity, you can get the most out of LINQ and PLINQ and write code that is both efficient and maintainable.



About HostForLIFE.eu

HostForLIFE.eu is European Windows Hosting Provider which focuses on Windows Platform only. We deliver on-demand hosting solutions including Shared hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes.

We have offered the latest Windows 2016 Hosting, ASP.NET Core 2.2.1 Hosting, ASP.NET MVC 6 Hosting and SQL 2017 Hosting.

