European ASP.NET 4.5 Hosting BLOG

BLOG about ASP.NET 4, ASP.NET 4.5 Hosting and Its Technology - Dedicated to European Windows Hosting Customer

European ASP.NET Core 9.0 Hosting - HostForLIFE :: Base Class Library (BCL) for .NET

clock June 30, 2025 09:58 by author Peter

One crucial component of the .NET platform is the Base Class Library (BCL): a core collection of pre-built classes and types provided by Microsoft. These are the fundamental building blocks of nearly all .NET applications, whether desktop, web, console, or API projects. The BCL is closely integrated with the Common Language Runtime (CLR) and forms the core of the Framework Class Library (FCL). It functions like the standard toolkit included with the .NET runtime, ready to tackle common programming tasks such as:

  • Working with strings, numbers, and dates
  • Reading and writing files
  • Managing collections of data
  • Performing network operations
  • Handling threads and asynchronous tasks
  • Encrypting and securing data
  • Parsing XML and JSON
  • Reflecting on types and assemblies

It’s implemented in assemblies like System.Runtime.dll, System.Private.CoreLib.dll, and others, and is shared across all .NET implementations—whether you're building with .NET Core, .NET Framework, or modern .NET (6/7/8).

Think of the BCL as the foundation toolkit that every .NET developer uses — just like a workshop full of power tools and templates to build robust applications without reinventing the wheel.

Framework Class Library (FCL)
The Framework Class Library includes the BCL but also adds broader APIs for developing web apps (ASP.NET), desktop apps (WinForms/WPF), database access (ADO.NET), and more.

Common Language Runtime (CLR)
The CLR is the execution engine of .NET. It handles memory management, type safety, garbage collection, exception handling, and security.

As developers, we use many components from the Base Class Library in our day-to-day projects, and some libraries appear far more often than others. It's helpful to understand what they are and how they work, with real examples.

We build our house (Application) on a strong foundation (CLR), construct it using tools like List<T>, File, and Console (BCL), and then decorate it with features like ASP.NET, ADO.NET, and WinForms (FCL).

Why Use Base Class Library (BCL) in .NET?

  • Saves Development Time - No need to write common functionality from scratch. Built-in classes like List<T>, File, DateTime make tasks faster and easier.
  • Reduces Bugs - Microsoft’s BCL classes are thoroughly tested and reliable. Reduces the risk of writing error-prone custom implementations.
  • Increases Productivity - Developers can focus on business logic instead of low-level utilities. Example: File.ReadAllText("file.txt") reads an entire file in one line.
  • Cross-Platform Compatible - Works consistently across Windows, Linux, and macOS using .NET Core / .NET 5+. Ensures platform-independent development.
  • Improves Code Readability - Standardized APIs make code easier to read, understand, and maintain. Using familiar classes like StringBuilder, Dictionary, and Console helps teamwork.
  • Boosts Performance - Many BCL classes (like Stopwatch, Span<T>) are optimized for speed and memory. Useful for performance-critical tasks.
  • Consistent Design - All .NET languages (C#, VB.NET, F#) use the same BCL. APIs follow predictable naming and behavior patterns.


Commonly Used Base Class Libraries
1. System

System is the core namespace in the .NET Base Class Library (BCL). It contains the fundamental types and classes that almost every .NET application uses — including data types, console I/O, math functions, exceptions, and more. It is the foundation layer of all your C# programs, like the basic bricks and cement used in every building project. It includes essential types like:

  • Primitive types: System.Int32, System.Boolean, System.String, etc.
  • Base object model: System.Object, System.Type, System.Exception
  • Utilities: System.Math, System.Convert, System.Environment
  • Date and time: System.DateTime, System.TimeSpan
  • Console I/O: System.Console
  • Nullable types: System.Nullable<T>
  • Tuples and value types: System.ValueTuple, System.Enum

Example code:
using System;

class Program
{
    static void Main()
    {
        string name = "C# Community";
        int year = DateTime.Now.Year;

        Console.WriteLine($"Hello, {name}! Welcome to the year {year}.");
    }
}


We used:
    System.String
    System.DateTime
    System.Console


All from the System namespace—no extra libraries needed.

Why Is It Important?

It’s automatically referenced in most .NET projects, it provides the building blocks for all other namespaces, and it’s the default namespace behind many language features (e.g., int is an alias for System.Int32).

2.  System.IO
The System.IO namespace in .NET is your go-to toolkit for handling files, directories, and data streams. It’s part of the Base Class Library (BCL) and provides everything we need to read, write, and manage data on disk or in memory. It enables you to:

  • Read and write files (File, FileStream, StreamReader, StreamWriter)
  • Work with directories (Directory, DirectoryInfo)
  • Handle paths (Path)
  • Monitor file system changes (FileSystemWatcher)
  • Read/write binary data (BinaryReader, BinaryWriter)
  • Use memory streams (MemoryStream)


Example Code
using System;
using System.IO;

public class Program
{
    public static void Main()
    {
        string path = "myFile.txt";
        File.WriteAllText(path, "Hi! C# community, I am Miche!");
        string content = File.ReadAllText(path);
        Console.WriteLine("File content:\n" + content);
    }
}


We used using System; and using System.IO;

These two using statements import the namespaces required.
System: Gives access to Console, String, etc.
System.IO: Gives access to file-related classes like File.


Common Classes in System.IO

  • File / FileInfo - Create, delete, copy, move files
  • Directory / DirectoryInfo - Manage folders
  • StreamReader / StreamWriter - Read/write text
  • BinaryReader / BinaryWriter - Read/write binary data
  • FileStream - Low-level file access
  • Path - Manipulate file paths
  • MemoryStream - Work with data in memory
  • FileSystemWatcher - Watch for file changes (great for logging or syncing apps)
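As a quick illustration of a few of these classes working together, the sketch below (folder and file names are arbitrary, chosen for the example) uses Path to build a platform-independent path, Directory to create a folder, and File to write and read text:

```csharp
using System;
using System.IO;

class Program
{
    static void Main()
    {
        // Path.Combine handles the separator character for each OS.
        string folder = Path.Combine(Path.GetTempPath(), "BclDemo");
        string file = Path.Combine(folder, "notes.txt");

        // CreateDirectory is a no-op if the folder already exists.
        Directory.CreateDirectory(folder);
        File.WriteAllText(file, "Hello from System.IO");

        Console.WriteLine(Path.GetFileName(file));   // notes.txt
        Console.WriteLine(File.ReadAllText(file));   // Hello from System.IO
    }
}
```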

Why Is It Important?
System.IO lets you save data locally (such as logs, configs, and user input), read configuration files or resources in your application, upload and download files in web apps, and interact with the file system in background jobs, APIs, or desktop tools.

System and System.IO are libraries commonly used by developers from beginner to advanced levels. In this article, we introduced them briefly; in upcoming articles, we will explore other important libraries, explaining them in simple terms along with detailed examples.

Essential .NET Base Class Libraries (BCL)

  • System - Core types and base functionality; fundamental types used in all .NET code (Console, String, DateTime, Math)
  • System.IO - File and stream handling; enables file read/write and streaming data (File, FileInfo, StreamReader, Path)
  • System.Net.Http - HTTP communication; connects to REST APIs or web servers (HttpClient for GET/POST requests)
  • System.Collections.Generic - Generic data collections; manage dynamic data with type safety (List<T>, Dictionary<K,V>)
  • System.Linq - LINQ querying over collections; simplifies filtering, sorting, and projections (Where, Select, FirstOrDefault)
  • System.Threading.Tasks - Asynchronous programming; perform non-blocking background operations (Task, async/await, Task.Run())
  • System.Text - Text processing and manipulation; efficient string handling and encoding (StringBuilder, Encoding.UTF8)
  • System.Text.RegularExpressions - Regex-based pattern matching; validate input and extract data from text (Regex.IsMatch, Regex.Replace)
  • System.Globalization - Culture-aware formatting; supports localization for global users (CultureInfo, DateTime.ToString("D", culture))
  • System.Diagnostics - Logging, tracing, performance; debug, benchmark, and trace app behavior (Stopwatch, Debug.WriteLine())
  • System.Reflection - Runtime type inspection; enables plugins, dependency injection, metadata access (Type, Assembly, PropertyInfo)
  • System.Security.Cryptography - Cryptographic services; secure hashing, encryption, certificates (SHA256, Aes, RandomNumberGenerator)
  • System.ComponentModel - Metadata for components; data annotations and property binding (INotifyPropertyChanged, attributes)
  • System.Timers - Timer-based scheduling; execute code at intervals (Timer.Elapsed, auto-repeating logic)
  • System.Configuration - App config settings access; read/write app configuration files (ConfigurationManager.AppSettings[])
  • System.Data - Core data access tools; work with databases or tabular data (DataTable, DataSet, DbConnection)
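Several of these namespaces are commonly used together. The short sketch below (sample data invented for illustration) filters a List<T> with LINQ and assembles the output with StringBuilder:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

class Program
{
    static void Main()
    {
        // Invented sample data for illustration.
        var scores = new List<int> { 42, 95, 73, 88, 51 };

        // System.Linq: filter and sort without manual loops.
        var passing = scores.Where(s => s >= 70).OrderByDescending(s => s).ToList();

        // System.Text: build the report string efficiently.
        var sb = new StringBuilder();
        sb.Append("Passing scores: ");
        sb.Append(string.Join(", ", passing));

        Console.WriteLine(sb.ToString()); // Passing scores: 95, 88, 73
    }
}
```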

How to Use Base Class Library (BCL) in .NET

  • BCL is built-in: No need to install — it comes with .NET automatically.
  • Use using statements: Add namespaces like System, System.IO, System.Collections.Generic at the top of your file.
  • Call built-in classes: Use BCL types like Console, DateTime, File, List<T>, etc.
  • Works in all project types: Console apps, web apps, APIs, desktop apps, etc.
  • Saves time: Gives you ready-to-use tools for common tasks like reading files, formatting dates, or handling collections.

Conclusion
The BCL is more than just a library—it’s the beating heart of .NET development. Mastering these foundational namespaces allows us to build clean, efficient, and maintainable software across a wide range of applications. From System.Net.Http for HTTP communication to System.Text.Json for serialization, these tools empower us to write less boilerplate and focus more on business logic.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Troubleshooting Memory Leaks in .NET Core

clock June 25, 2025 07:51 by author Peter

Memory leaks have the potential to subtly impair the functionality of your .NET Core apps, resulting in excessive memory usage, slow performance, or even crashes. Thankfully, .NET has a comprehensive toolkit for efficiently identifying, evaluating, and repairing memory leaks. Using useful tools and a real-world example, we'll go over how to find and fix memory leaks step-by-step in this post.

 

Prerequisites
Before diving into debugging memory leaks, ensure you have the following installed and set up.

  • .NET SDK 6.0+
  • dotnet-counters tool
  • dotnet-dump tool
  • Basic understanding of C# and ASP.NET Core

To install tools.

  • dotnet tool install --global dotnet-counters
  • dotnet tool install --global dotnet-dump

Creating a Sample ASP.NET Core Application
Add a Leaky Controller
In the Controllers folder, create a new file called LeakController.cs.
using Microsoft.AspNetCore.Mvc;

[Route("api/[controller]")]
[ApiController]
public class LeakController : ControllerBase
{
    private static Dictionary<int, byte[]> _leakyCache = new();

    [HttpGet("{kb}")]
    public IActionResult LeakMemory(int kb)
    {
        var buffer = new byte[kb * 1024];
        _leakyCache[DateTime.UtcNow.Ticks.GetHashCode()] = buffer;

        return Ok(new
        {
            Message = $"Leaked {kb} KB. Cache size: {_leakyCache.Count}"
        });
    }
}


Each request allocates a byte array and stores it in a static dictionary using the current timestamp as a key. Since we never remove old items, this creates a memory leak.

Running the Application
dotnet run

The API should be available at https://localhost:7261/api/leak/{kb}; for example, requesting /api/leak/1000 leaks 1000 KB of memory. To simulate repeated leaking:

for i in {1..50}; do curl http://localhost:7261/api/leak/1000; done

Monitoring with dotnet-counters
Use dotnet-counters to monitor real-time performance metrics.

Get the process ID (PID) of your running app.
dotnet-counters ps

Now, run the command below to watch the various counters, including memory size:
dotnet-counters monitor --refresh-interval 1 -p <PID>

What to Watch.

  • GC Heap Size: Grows steadily with each request
  • Gen 0/1/2 GC Count: Garbage collections occur, but the heap never shrinks

This confirms that objects aren’t being collected – a classic memory leak.

Capturing Memory Dumps

We capture a snapshot of the memory to analyze it.
dotnet-dump collect -p <PID> -o before_leak.dmp

Trigger more leaks.
for i in {1..100}; do curl http://localhost:7261/api/leak/1000; done

Then collect another snapshot.
dotnet-dump collect -p <PID> -o after_leak.dmp
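To compare the two snapshots, the dumps can be opened in dotnet-dump's interactive analyzer. The SOS commands below sketch a typical workflow; `<object-address>` is a placeholder for an address taken from the dumpheap output:

```
dotnet-dump analyze after_leak.dmp

# Inside the interactive session:
> dumpheap -stat          # live objects grouped by type; look for growing byte[] counts
> gcroot <object-address> # shows what keeps an object alive (here: the static dictionary)
```

Running dumpheap -stat against both dumps and comparing the byte[] totals makes the growth between snapshots visible.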

Fixing the Leak
Limit the size of the dictionary.
if (_leakyCache.Count > 100)
    _leakyCache.Clear();
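Calling Clear() works for the demo, but it throws away the entire cache at once. As a gentler alternative (a sketch with hypothetical names, not part of the article's code), a small bounded cache can evict only the oldest entries once a size limit is reached:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical bounded cache: evicts the oldest entries once MaxEntries is reached.
public class BoundedCache<TKey, TValue> where TKey : notnull
{
    private const int MaxEntries = 100;
    private readonly Dictionary<TKey, TValue> _map = new();
    private readonly Queue<TKey> _insertionOrder = new();

    public int Count => _map.Count;

    public void Add(TKey key, TValue value)
    {
        if (!_map.ContainsKey(key))
        {
            // Evict oldest keys so the cache never exceeds MaxEntries.
            while (_map.Count >= MaxEntries)
            {
                var oldest = _insertionOrder.Dequeue();
                _map.Remove(oldest);
            }
            _insertionOrder.Enqueue(key);
        }
        _map[key] = value;
    }
}
```

For production code, a library cache with expiration (such as Microsoft.Extensions.Caching.Memory) is usually a better fit than a hand-rolled structure.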


Validating the Fix
Restart the app, monitor with dotnet-counters, and send repeated requests. You should now see memory usage stabilize.

Conclusion
In .NET Core, memory leaks are subtle yet damaging. You can successfully identify and resolve them with dotnet-counters, dotnet-dump, and analysis tools like SOS and Visual Studio. This post walked you through building a leaky application, detecting the problem, capturing heap dumps, and repairing the leak. To guarantee smooth performance, your development and deployment cycle should include routine diagnostics and profiling. Thank You, and Stay Tuned for More!



European ASP.NET Core 9.0 Hosting - HostForLIFE :: A Complete Guide to .NET 8 Web API Health Checks

clock June 19, 2025 08:55 by author Peter

In today's fast-paced digital landscape, keeping your web applications reliable and responsive is more critical than ever. Health checks continuously monitor vital components—from databases and external APIs to network connections—providing real-time insights into potential issues. Whether you're managing a small service or a large-scale microservices architecture, effective health checks help you detect problems early, streamline diagnostics, and automate recovery processes.

This post will demonstrate how to use .NET 8 to create reliable health checks in an ASP.NET Web API. In addition to monitoring an external API and running a custom ping test for network connectivity, you'll learn how to check the condition of your SQL Server database and provide a clean, JSON-formatted response that works in unison with contemporary monitoring tools. You may increase resilience and ensure dependable, seamless operations by making sure every crucial component of your application is regularly examined.

In this article, we will:

  • Create a new ASP.NET Web API project.
  • Implement health checks for a SQL Server database, an external API, and a custom ping test.
  • Convert the health check response to JSON.
  • Discuss the benefits and potential drawbacks of using health checks.

Prerequisites
Before you begin, make sure you have:

  • .NET 8 SDK installed.
  • A basic understanding of ASP.NET Web API and C#.
  • A valid connection string for your SQL Server database.
  • An external API URL to monitor (e.g., C# Corner API).

Step 1. Create the ASP.NET Web API Project
Open your command prompt (or terminal) and run the following bash command to create a new ASP.NET Web API project named HealthCheckDemo:
dotnet new webapi -n HealthCheckDemo
cd HealthCheckDemo


This command creates a basic Web API project using the minimal hosting model found in .NET 8.
You will also need the following NuGet packages:
dotnet add package AspNetCore.HealthChecks.SqlServer
dotnet add package AspNetCore.HealthChecks.Uris

Step 2. Implement a Custom Ping Test Health Check
A custom ping test can help verify network connectivity by pinging a specific host. Create a new file called PingHealthCheck.cs in your project and add the following code:
using System;
using System.Net.NetworkInformation;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Diagnostics.HealthChecks;

public class PingHealthCheck : IHealthCheck
{
    private readonly string _host;

    public PingHealthCheck(string host)
    {
        _host = host;
    }

    public async Task<HealthCheckResult> CheckHealthAsync(
        HealthCheckContext context,
        CancellationToken cancellationToken = default)
    {
        using var ping = new Ping();
        try
        {
            // Send a ping with a 2-second timeout
            var reply = await ping.SendPingAsync(_host, 2000);
            if (reply.Status == IPStatus.Success)
            {
                return HealthCheckResult.Healthy(
                    $"Ping to {_host} succeeded with roundtrip time {reply.RoundtripTime}ms");
            }
            return HealthCheckResult.Unhealthy(
                $"Ping to {_host} failed with status {reply.Status}");
        }
        catch (Exception ex)
        {
            return HealthCheckResult.Unhealthy(
                $"Ping to {_host} resulted in an exception: {ex.Message}");
        }
    }
}


This custom health check pings the host (such as "8.8.8.8") and returns a healthy status if the ping is successful.

Step 3. Configure Health Checks in Program.cs
Open your Program.cs file and update it to register health checks for your SQL Server database, the external API, and the custom ping test. In addition, configure the endpoint to output a JSON response.
using System.Text.Json;
using Microsoft.AspNetCore.Diagnostics.HealthChecks;
using Microsoft.Extensions.Diagnostics.HealthChecks;

var builder = WebApplication.CreateBuilder(args);

// Add services for controllers (if you plan to use controllers)
builder.Services.AddControllers();

// Retrieve the connection string from configuration (set in appsettings.json)
string connectionString = builder.Configuration.GetConnectionString("DefaultConnection")
    ?? throw new InvalidOperationException("Connection string 'DefaultConnection' not found.");

// Register health checks.
builder.Services.AddHealthChecks()
    // SQL Server Health Check: Execute a simple query to evaluate connectivity.
    .AddSqlServer(connectionString,
                  name: "SQL Server",
                  healthQuery: "SELECT 1;",
                  failureStatus: HealthStatus.Unhealthy)
    // External API Health Check: Verify that an external API (e.g., C#-Corner) is reachable.
    .AddUrlGroup(new Uri("https://www.c-sharpcorner.com/"),
                 name: "C# Corner API",
                 failureStatus: HealthStatus.Degraded)
    // Custom Ping Test Health Check: Ping an external host (e.g., Google's public DNS).
    .AddCheck("Ping Test", new PingHealthCheck("8.8.8.8"));

var app = builder.Build();

app.MapControllers();

// Map the health check endpoint with a custom JSON response writer.
app.MapHealthChecks("/health", new HealthCheckOptions
{
    ResponseWriter = async (context, report) =>
    {
        context.Response.ContentType = "application/json";

        var response = new
        {
            status = report.Status.ToString(),
            // Provide details about each health check.
            checks = report.Entries.Select(entry => new
            {
                key = entry.Key,
                status = entry.Value.Status.ToString(),
                description = entry.Value.Description,
                duration = entry.Value.Duration.ToString()
            }),
            totalDuration = report.TotalDuration.ToString()
        };

        // Serialize the response in indented JSON format.
        await context.Response.WriteAsync(JsonSerializer.Serialize(response, new JsonSerializerOptions
        {
            WriteIndented = true
        }));
    }
});

app.Run();

Explanation of the Code
Service Registration

  • SQL Server Check: Executes a simple query (e.g., "SELECT 1;") to ensure that the database is reachable.
  • External API Check: Checks the availability of an external API (such as C# Corner's API).
  • Ping Test: Uses the custom PingHealthCheck class to verify network connectivity by pinging "8.8.8.8".
  • Custom JSON Response Writer: The response writer builds a structured JSON object that includes overall health status, details of each health check (name, status, description, and duration), plus the total duration of the process.

Step 4. Configure the Connection String
In your appsettings.json file, add your SQL Server connection string under the "DefaultConnection" key:

{
  "ConnectionStrings": {
    "DefaultConnection": "Server=YOUR_SERVER;Database=YOUR_DATABASE;User Id=YOUR_USER;Password=YOUR_PASSWORD;"
  }
}

Replace the placeholders with your actual SQL Server credentials.

Step 5. Testing the Health Check Endpoint

Run Your Application: In your command prompt, run:
dotnet run

Access the Health Endpoint: Open your browser or use a tool like Postman and navigate to:
http://localhost:<port>/health

A typical JSON response might look like this:
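An illustrative example, with placeholder values (the actual statuses, descriptions, and durations will vary with your environment):

```json
{
  "status": "Healthy",
  "checks": [
    {
      "key": "SQL Server",
      "status": "Healthy",
      "description": null,
      "duration": "00:00:00.0520000"
    },
    {
      "key": "C# Corner API",
      "status": "Healthy",
      "description": null,
      "duration": "00:00:00.2210000"
    },
    {
      "key": "Ping Test",
      "status": "Healthy",
      "description": "Ping to 8.8.8.8 succeeded with roundtrip time 12ms",
      "duration": "00:00:00.0150000"
    }
  ],
  "totalDuration": "00:00:00.2300000"
}
```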

This JSON response gives a clear overview of the health status of each dependency and ensures easy integration with monitoring tools.

Benefits of Using Health Checks

  • Proactive Monitoring: Regular health checks allow for early detection of issues, enabling quick corrective actions or automated recovery processes.
  • Simplified Diagnostics: Aggregating multiple checks into a single endpoint simplifies troubleshooting by providing clear, structured output via JSON.
  • Enhanced Resilience: Health checks are integral to modern orchestration systems like Kubernetes, which use them to manage service restarts and traffic routing based on real-time status.
  • Easy Integration: A JSON-formatted health response is readily consumable by dashboards and third-party monitoring tools, providing comprehensive insights over time.

Drawbacks and Considerations

  • Performance Overhead: Detailed or numerous health checks might add overhead, particularly if they perform resource-intensive operations. Balance thorough monitoring with performance.
  • False Positives: Overly aggressive or sensitive checks might incorrectly flag transient issues as failures, resulting in unnecessary alerts.
  • Maintenance Effort: As your application evolves, updating health checks to reflect new dependencies or architectural changes may require additional maintenance.
  • Security Concerns: Exposing detailed internal health information can be risky. Consider securing the health endpoint, especially in production environments.

Conclusion
Implementing comprehensive health checks in your ASP.NET Web API using .NET 8 significantly enhances your system's resilience and reliability. These checks—whether it's verifying SQL Server connectivity, ensuring external API availability, or performing a custom ping test—offer proactive monitoring that simplifies diagnostics and facilitates automated recovery in orchestration environments. Although careful consideration is needed to balance monitoring detail with performance overhead and potential security concerns, fine-tuning these checks can lead to a robust infrastructure that minimizes downtime and improves user experience. In essence, health checks serve as a crucial asset, empowering you to maintain a healthy, high-performing application in an ever-changing digital landscape.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Manage Transactions in the Entity Framework

clock June 16, 2025 09:36 by author Peter

We shall study Entity Framework Transactions in this article. Let's begin.

Comprehending Transactions

  • Transactions treat several activities as a single unit, ensuring data integrity.
  • They are crucial for preserving uniformity, particularly in intricate applications.

Built-in Transaction Management
Entity Framework (EF) provides built-in transaction management through the DbContext.
EF automatically handles transactions for single SaveChanges calls.
using (var context = new AppDbContext())
{
    // Operations
    context.SaveChanges(); // EF handles the transaction automatically
}


Handling Multiple Operations as a Unit
Transactions are crucial for ensuring data integrity when performing multiple related operations. For instance, transferring money between accounts involves debiting one account and crediting another. Both operations should be part of a single transaction.

Example
using (var context = new AppDbContext())
{
    using (var transaction = context.Database.BeginTransaction())
    {
        try
        {
            // Debit one account
            // Credit another account

            context.SaveChanges();
            transaction.Commit();
        }
        catch (Exception ex)
        {
            transaction.Rollback();
            // Handle the exception
        }
    }
}


Rollback Strategies

Rollbacks revert all changes if any operation fails.
Essential for maintaining data integrity in case of errors.
using (var context = new AppDbContext())
{
    using (var transaction = context.Database.BeginTransaction())
    {
        try
        {
            // Operations
            context.SaveChanges();
            transaction.Commit();
        }
        catch (Exception ex)
        {
            transaction.Rollback();
        }
    }
}


Commit Strategies

Use Commit to finalize transactions after successful operations.
Always wrap Commit within a try-catch block to handle exceptions.
using (var context = new AppDbContext())
{
    using (var transaction = context.Database.BeginTransaction())
    {
        try
        {
            // Operations
            context.SaveChanges();
            transaction.Commit();
        }
        catch (Exception ex)
        {
            transaction.Rollback();
            // Log or handle the exception
        }
    }
}

Combining Explicit Transactions with Multiple Contexts
Handle transactions across multiple DbContext instances.
Ensure consistency across different data sources.
using (var context1 = new AppDbContext1())
using (var context2 = new AppDbContext2())
{
    using (var transaction = context1.Database.BeginTransaction())
    {
        try
        {
            // Operations on context1
            context1.SaveChanges();

            // Operations on context2
            context2.Database.UseTransaction(transaction.GetDbTransaction());
            context2.SaveChanges();

            transaction.Commit();
        }
        catch (Exception ex)
        {
            transaction.Rollback();
        }
    }
}


Key Takeaways

  • Transactions are critical for data integrity.
  • EF provides built-in and explicit transaction management.
  • Rollbacks and commits ensure consistency and handle errors effectively.

Conclusion
In this article, I have tried to cover how to handle Entity Framework Transactions.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: ASP.NET Core Clean and Reliable Code Testing with Moq using C# 13 and xUnit

clock June 9, 2025 08:34 by author Peter

C# 13 and .NET 8 have greatly enhanced ASP.NET Core development capabilities. However, building scalable and maintainable systems requires robust testing in addition to feature implementation. Using xUnit, Moq, and the latest C# 13 features, you will learn how to write clean, reliable, and testable code.

This guide will walk you through testing a REST API or a service layer:

  • Creating a test project
  • Using xUnit to write clean unit tests
  • Using Moq to mock dependencies
  • Using best practices for test architecture and maintainability

Setting Up Your ASP.NET Core Project with C# 13
Start with a .NET 8 solution and an ASP.NET Core Web API project, using C# 13.
dotnet new sln -n HflApi

dotnet new web -n Hfl.Api
dotnet new classlib -n Hfl.Domain
dotnet new classlib -n Hfl.Core
dotnet new classlib -n Hfl.Application
dotnet new classlib -n Hfl.Infrastructure


dotnet sln add Hfl.Api/Hfl.Api.csproj
dotnet sln add Hfl.Domain/Hfl.Domain.csproj
dotnet sln add Hfl.Core/Hfl.Core.csproj
dotnet sln add Hfl.Application/Hfl.Application.csproj
dotnet sln add Hfl.Infrastructure/Hfl.Infrastructure.csproj


Create a Test Project with xUnit and Moq
Add a new test project:
dotnet new xunit -n HflApi.Tests
dotnet add HflApi.Tests/HflApi.Tests.csproj reference Hfl.Application/Hfl.Application.csproj
dotnet add HflApi.Tests package Moq


Use Case: Testing a Service Layer
Domain, Service, Repository, Interface, and API
namespace HflApi.Domain;
public record Order(Guid Id, string Status);


using HflApi.Domain;

namespace HflApi.Core.Interfaces;

public interface IOrderRepository
{
    Task<Order?> GetByIdAsync(Guid id);
}


using HflApi.Core.Interfaces;

namespace HflApi.Application.Services;

public class OrderService
{
    private readonly IOrderRepository _repository;

    public OrderService(IOrderRepository repository)
    {
        _repository = repository;
    }

    public async Task<string> GetOrderStatusAsync(Guid orderId)
    {
        var order = await _repository.GetByIdAsync(orderId);
        return order?.Status ?? "Not Found";
    }
}


using HflApi.Core.Interfaces;
using HflApi.Domain;

namespace Hfl.Infrastructure.Repositories
{
    public class InMemoryOrderRepository : IOrderRepository
    {
        private readonly List<Order> _orders = new()
    {
        new Order(Guid.Parse("7c3308b4-637f-426b-aafc-471697dabeb4"), "Processed"),
        new Order(Guid.Parse("5aee5943-56d0-4634-9f6c-7772f6d9c161"), "Pending")
    };

        public Task<Order?> GetByIdAsync(Guid id)
        {
            var order = _orders.FirstOrDefault(o => o.Id == id);
            return Task.FromResult(order);
        }
    }
}

using Hfl.Infrastructure.Repositories;
using HflApi.Application.Services;
using HflApi.Core.Interfaces;


var builder = WebApplication.CreateBuilder(args);


builder.Services.AddScoped<IOrderRepository, InMemoryOrderRepository>();
builder.Services.AddScoped<OrderService>();

var app = builder.Build();

app.MapGet("/orders/{id:guid}", async (Guid id, OrderService service) =>
{
    var status = await service.GetOrderStatusAsync(id);
    return Results.Ok(new { OrderId = id, Status = status });
});

app.Run();


Unit Testing with xUnit and Moq
Test Class
using Moq;
using HflApi.Application.Services;
using HflApi.Core.Interfaces;
using HflApi.Domain;


namespace OrderApi.Tests;

public class OrderServiceTests
{
    private readonly Mock<IOrderRepository> _mockRepo;
    private readonly OrderService _orderService;

    public OrderServiceTests()
    {
        _mockRepo = new Mock<IOrderRepository>();
        _orderService = new OrderService(_mockRepo.Object);
    }

    [Fact]
    public async Task GetOrderStatusAsync_ReturnsStatus_WhenOrderExists()
    {
        var orderId = Guid.NewGuid();
        _mockRepo.Setup(r => r.GetByIdAsync(orderId))
                 .ReturnsAsync(new Order(orderId, "Processed"));

        var result = await _orderService.GetOrderStatusAsync(orderId);

        Assert.Equal("Processed", result);
    }

    [Fact]
    public async Task GetOrderStatusAsync_ReturnsNotFound_WhenOrderDoesNotExist()
    {
        var orderId = Guid.NewGuid();
        _mockRepo.Setup(r => r.GetByIdAsync(orderId))
                 .ReturnsAsync((Order?)null);

        var result = await _orderService.GetOrderStatusAsync(orderId);

        Assert.Equal("Not Found", result);
    }

    [Theory]
    [InlineData("Processed")]
    [InlineData("Pending")]
    [InlineData("Shipped")]
    public async Task GetOrderStatus_ReturnsCorrectStatus(string status)
    {
        var orderId = Guid.NewGuid();
        _mockRepo.Setup(r => r.GetByIdAsync(orderId))
                 .ReturnsAsync(new Order(orderId, status));

        var result = await _orderService.GetOrderStatusAsync(orderId);

        Assert.Equal(status, result);
    }
}


Best Practices
1. Use Dependency Injection for Testability
All dependencies should be injected; avoid static classes and service locator patterns.

2. Keep Tests Isolated
Use Moq to keep database and network I/O out of your tests, so each test verifies only the behavior of the unit under test.

3. Use Theory for Parameterized Tests
[Theory]
[InlineData("Processed")]
[InlineData("Pending")]
[InlineData("Shipped")]
public async Task GetOrderStatus_ReturnsCorrectStatus(string status)
{
    var orderId = Guid.NewGuid();
    _mockRepo.Setup(r => r.GetByIdAsync(orderId))
             .ReturnsAsync(new Order(orderId, status));

    var result = await _orderService.GetOrderStatusAsync(orderId);

    Assert.Equal(status, result);
}

4. Group Tests by Behavior (Not CRUD)
Organize tests by the behavior the system exposes, not by CRUD operation. For example:

  • GetOrderStatus_ShouldReturnCorrectStatus
  • CreateOrder_ShouldSendNotification

5. Use Records for Test Data
public record Order(Guid Id, string Status);

Immutable, concise, and readable test data objects can be created using records.

Test Coverage Tips

  • Use Coverlet or JetBrains dotCover to measure test coverage.
  • Target business rules and logic at the service layer.
  • Avoid over-testing third-party libraries or trivial getters/setters.
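Assuming the coverlet.msbuild package is installed in the test project, coverage can be collected straight from the test command:

```shell
# Collect line coverage while running the test suite (requires coverlet.msbuild)
dotnet test /p:CollectCoverage=true /p:CoverletOutputFormat=opencover
```

The resulting report can then be fed to tools such as ReportGenerator for an HTML summary.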

Recommended Tools

  • xUnit: Unit Testing Framework
  • Moq: Mocking Dependencies
  • FluentAssertions: Readable Assertions
  • Coverlet: Code Coverage

Summary

Use xUnit, Moq, and modern C# features to test ASP.NET Core applications. A clean architecture, isolated unit tests, and descriptive test names keep your apps dependable. By combining DI, mocking, and xUnit assertions, developers can find and fix problems earlier in the development cycle, which means faster feedback, more confidence, and more maintainable systems. Test isolation guarantees that each component functions on its own, improving overall reliability, while a clear design and meaningful test names keep the codebase easy to understand and maintain over time. Well-defined test cases and modular design also make onboarding and debugging easier for the whole team. Together, these practices reduce regressions, raise code quality, and lead to more scalable, resilient applications.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Using C# with .NET 9 for Advanced Data Warehouse Modeling and Querying

clock May 14, 2025 07:25 by author Peter

Data warehouses exist to support business intelligence, analytics, and reporting. While SQL and ETL tools receive most of the attention, C# is a useful complement for model definition, metadata management, warehouse schema design, and large-scale querying. This article covers data warehousing with C# 14 and .NET 9.

Why C# Data Warehousing?

  • Automate the creation of warehouse schemas (fact/dimension tables).
  • Validate and govern source-to-target mapping models.
  • Build dimensional models programmatically.
  • Generate surrogate keys and SCD Type 2 rows.
  • Integrate with Azure Synapse, Snowflake, and BigQuery APIs.
  • Execute high-performance warehouse queries through parameterization and batching.

Data warehouse maintainability relies heavily on metadata programming, code generation, and system integration, all areas where C# excels.

Programmatic Warehouse Modeling in C#
Let's define a straightforward dimensional model in modern C#:

public record CustomerDim(string CustomerKey, string CustomerName, string Country);
public record OrderFact(
    string OrderKey,
    string CustomerKey,
    DateTime OrderDate,
    decimal TotalAmount,
    string CurrencyCode);


You can map the source data to these types before loading, or even generate the equivalent SQL CREATE TABLE scripts from attributes.
[WarehouseTable("dw.CustomerDim")]
public record CustomerDim(string CustomerKey, string CustomerName, string Country);


Use source generators or reflection to derive the DDL from annotated classes.
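As an illustration, here is a reflection-based sketch that derives a CREATE TABLE statement from the annotated record. The WarehouseTable attribute and the CLR-to-SQL type mapping are assumptions for demonstration, not a real library:

```csharp
using System;
using System.Linq;
using System.Reflection;

// Hypothetical attribute, matching the annotated record shown above.
[AttributeUsage(AttributeTargets.Class)]
public sealed class WarehouseTableAttribute(string name) : Attribute
{
    public string Name { get; } = name;
}

[WarehouseTable("dw.CustomerDim")]
public record CustomerDim(string CustomerKey, string CustomerName, string Country);

public static class DdlGenerator
{
    // Maps a handful of common CLR types to T-SQL column types (illustrative only).
    private static string SqlType(Type t) => t switch
    {
        _ when t == typeof(string)   => "NVARCHAR(200)",
        _ when t == typeof(int)      => "INT",
        _ when t == typeof(decimal)  => "DECIMAL(18,2)",
        _ when t == typeof(DateTime) => "DATETIME2",
        _ => throw new NotSupportedException(t.Name)
    };

    public static string CreateTable<T>()
    {
        // Read the table name from the attribute, falling back to the type name.
        var table = typeof(T).GetCustomAttribute<WarehouseTableAttribute>()?.Name
                    ?? typeof(T).Name;
        var cols = typeof(T).GetProperties()
            .Select(p => $"    {p.Name} {SqlType(p.PropertyType)}");
        return $"CREATE TABLE {table} (\n{string.Join(",\n", cols)}\n);";
    }
}
```

A source generator could produce the same DDL at compile time; the reflection version is simply easier to sketch.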

Warehouse Querying from C#

Instead of running raw SQL ad hoc, wrap parameterized warehouse queries in reusable methods.
// Requires Dapper (QueryAsync) and Microsoft.Data.SqlClient
public async Task<List<OrderFact>> GetSalesByDateAsync(DateTime from, DateTime to)
{
    const string sql = @"
        SELECT
            OrderKey,
            CustomerKey,
            OrderDate,
            TotalAmount,
            CurrencyCode
        FROM dw.OrderFact
        WHERE OrderDate BETWEEN @from AND @to";
    using var conn = new SqlConnection(_warehouseConn);
    var results = await conn.QueryAsync<OrderFact>(sql, new { from, to });
    return results.ToList();
}


This design pattern allows you to:

  • Develop C# console applications and analytics APIs that feed dashboards or reports
  • Export to Excel, Power BI, CSV, or PDF
  • Run batch summaries or ML feature generation jobs

High-Level Features

Surrogate Key Generation


C# can handle surrogate keys either in-process or through sequences.
int nextKey = await conn.ExecuteScalarAsync<int>(
    "SELECT NEXT VALUE FOR dw.CustomerKeySeq"
);

Slowly Changing Dimensions (SCD Type 2)
Use EF Core or Dapper to insert new rows for updated attributes with validity ranges.
// Note: this assumes CustomerDim is mapped as a mutable EF Core entity
// with StartDate/EndDate validity columns, rather than the positional record above.
if (existing.Name != updated.Name)
{
    // End old record
    existing.EndDate = DateTime.UtcNow;

    // Add new record
    var newVersion = new CustomerDim
    {
        // Assign necessary properties here
    };
    await conn.ExecuteAsync("INSERT INTO dw.CustomerDim ...", newVersion);
}

Query Materialization
Convert warehouse query results into in-memory tables with a ToDataTable() extension method (a common helper, not built into .NET).
var table = queryResults.ToDataTable();
ExportToCsv(table, "output/sales.csv");
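ToDataTable() is not part of the BCL; a minimal reflection-based version of such an extension might look like this (SaleRow is a hypothetical row type for the example):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;

public record SaleRow(string Region, decimal Amount);

public static class DataTableExtensions
{
    // Builds a DataTable whose columns mirror the public properties of T.
    public static DataTable ToDataTable<T>(this IEnumerable<T> rows)
    {
        var table = new DataTable(typeof(T).Name);
        var props = typeof(T).GetProperties();

        foreach (var p in props)
            table.Columns.Add(p.Name,
                Nullable.GetUnderlyingType(p.PropertyType) ?? p.PropertyType);

        foreach (var row in rows)
            table.Rows.Add(props
                .Select(p => (object?)p.GetValue(row) ?? DBNull.Value)
                .ToArray());

        return table;
    }
}
```

From there, the table can be handed to a CSV exporter, bound to a grid, or bulk-copied elsewhere.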

BI Tool and API Integration

C# can:

  • Feed Power BI through REST or tabular model APIs
  • Push metrics to dashboards
  • Develop REST APIs that wrap SQL with business-oriented endpoints
  • Automate report sharing via email, Teams, or Slack

Conclusion

With .NET 9 and C# 14, you can take a hands-on, flexible approach to data warehouse modeling and querying. Whether modeling dimensions, building APIs, or filling dashboards, C# gives you control, performance, and maintainability that SQL scripts alone cannot match.




European ASP.NET Core 9.0 Hosting - HostForLIFE :: Using .NET 9 to Build a Face Detection System

clock April 21, 2025 09:25 by author Peter

Face detection has evolved from a cool sci-fi concept to a practical tool in everything from photo tagging to biometric authentication. In this blog post, we’ll explore how to build a real-time face detection system using the latest .NET 9 and OpenCV via Emgu.CV, a popular .NET wrapper around the OpenCV library.


Prerequisites
To follow along, you'll need:

✅ .NET 9 SDK
✅ Visual Studio 2022+ or VS Code
✅ Basic knowledge of C#
✅ Emgu.CV NuGet packages
✅ A working webcam (for real-time detection)

Step 1. Create a New .NET Console App
Open a terminal and run:
dotnet new console -n FaceDetectionApp
cd FaceDetectionApp


Step 2. Install Required Packages
Add the Emgu.CV packages:
dotnet add package Emgu.CV
dotnet add package Emgu.CV.runtime.windows

Step 3. Add the Haar Cascade File
OpenCV uses trained XML classifiers for detecting objects. Download the Haar cascade for frontal face detection (haarcascade_frontalface_default.xml, available in the OpenCV GitHub repository).
Place this file in your project root and configure it to copy to the output directory:
FaceDetectionApp.csproj
<ItemGroup>
  <None Update="haarcascade_frontalface_default.xml">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>


Step 4. Write the Real-Time Detection Code
Replace Program.cs with:
using Emgu.CV;
using Emgu.CV.Structure;
using System.Drawing;

var capture = new VideoCapture(0);
var faceCascade = new CascadeClassifier("haarcascade_frontalface_default.xml");

if (!capture.IsOpened)
{
    Console.WriteLine("Could not open the webcam.");
    return;
}

Console.WriteLine("Press ESC to exit...");

while (true)
{
    using var frame = capture.QueryFrame()?.ToImage<Bgr, byte>();

    if (frame == null)
        break;

    var gray = frame.Convert<Gray, byte>();
    var faces = faceCascade.DetectMultiScale(gray, 1.1, 10, Size.Empty);

    foreach (var face in faces)
    {
        frame.Draw(face, new Bgr(Color.Green), 2);
    }

    CvInvoke.Imshow("Real-Time Face Detection", frame);

    if (CvInvoke.WaitKey(1) == 27) // ESC key
        break;
}

CvInvoke.DestroyAllWindows();


Output
Once you run the app using dotnet run, your webcam will open, and it will start detecting faces in real time by drawing rectangles around them.

Conclusion

With the power of .NET 9 and the flexibility of OpenCV, building a real-time face detection system has never been easier. Whether you're developing a hobby project or prototyping a serious application, this foundation sets you on the path to mastering computer vision in the .NET ecosystem.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: MAUI.NET 9: Adding Objects to a Local Database

clock April 17, 2025 08:07 by author Peter

Step 1. Now that we have our local database set up, let's use it to store game reviews.

Step 1.1. Code:
using System.ComponentModel.DataAnnotations;

namespace Models.DTOs;

public class DTOBase
{
    [Key]
    public int? Id { get; set; }
    public int? ExternalId { get; set; }
    public DateTime CreatedAt { get; set; }
    public DateTime UpdatedAt { get; set; }
    public bool Inactive { get; set; }
}


Step 1.2. In the same folder, let's create GameDTO.
namespace Models.DTOs;

[Microsoft.EntityFrameworkCore.Index(nameof(IGDBId), IsUnique = true)]
public class GameDTO : DTOBase
{
    public int? IGDBId { get; set; }
    public int UserId { get; set; }
    public required string Name { get; set; }
    public string? ReleaseDate { get; set; }
    public string? Platforms { get; set; }
    public string? Summary { get; set; }
    public string? CoverUrl { get; set; }
    public required GameStatus Status { get; set; }
    public int? Rate { get; set; }
}

public enum GameStatus
{
    Want,
    Playing,
    Played
}

Step 2. In DbCtx.cs, add the Games table

Step 2.1. Code:
public virtual required DbSet<GameDTO> Games { get; set; }

Step 3. In the repo project, create the GameRepo.cs file.

Step 4. Implement the CreateAsync method in GameRepo:
using Microsoft.EntityFrameworkCore;
using Models.DTOs;

namespace Repo
{
    public class GameRepo(IDbContextFactory<DbCtx> DbCtx)
    {
        public async Task<int> CreateAsync(GameDTO game)
        {
            using var context = DbCtx.CreateDbContext();
            await context.Games.AddAsync(game);
            return await context.SaveChangesAsync();
        }
    }
}


Step 4.1. Extract the interface from this class.
Step 5. Save the image locally so that, after adding the game, we can retrieve it without relying on an internet connection. In the ApiRepo project, inside the IGDBGamesAPIRepo class, create the GetGameImageAsync function.
public static async Task<byte[]> GetGameImageAsync(string imageUrl)
{
    try
    {
        using HttpClient httpClient = new();
        var response = await httpClient.GetAsync(imageUrl);

        if (!response.IsSuccessStatusCode)
            throw new Exception($"Exception downloading image URL: {imageUrl}");

        return await response.Content.ReadAsByteArrayAsync();
    }
    catch (Exception ex)
    {
        throw new Exception($"Error fetching image: {ex.Message}", ex);
    }
}


Step 5.1. In the Services project, inside the IGDBGamesApiService class, create the SaveImageAsync function.
public static async Task SaveImageAsync(string imageUrl, string fileName)
{
    try
    {
        byte[] imageBytes = await IGDBGamesAPIRepo.GetGameImageAsync(imageUrl);
        string filePath = Path.Combine(GameService.ImagesPath, fileName);

        if (File.Exists(filePath))
            return;

        await File.WriteAllBytesAsync(filePath, imageBytes);
    }
    catch (Exception)
    {
        throw; // Consider logging the exception instead of rethrowing it blindly
    }
}

Step 6. Create a function to add the repositories in MauiProgram.
public static IServiceCollection Repositories(this IServiceCollection services)
{
    services.AddScoped<IGameRepo, GameRepo>();
    return services;
}

Step 7. Call this function inside CreateMauiApp()

Step 7.1. Add a function to create the image path.

Step 7.2. Code:
if (!System.IO.Directory.Exists(GameService.ImagesPath))
    System.IO.Directory.CreateDirectory(GameService.ImagesPath);

Step 8. In Models.Resps, create an object to handle the service response for the mobile project.

Step 8.1. Code:
namespace Models.Resps;

public class ServiceResp
{
    public bool Success { get; set; }

    public object? Content { get; set; }
}


Step 9. Create the GameService.

Step 9.1. Code:
using Models.DTOs;
using Models.Resps;
using Repo;

namespace Services
{
    public class GameService(IGameRepo GameRepo)
    {
        public static readonly string ImagesPath = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
            "Inventory");

        public async Task<ServiceResp> CreateAsync(GameDTO game)
        {
            game.CreatedAt = game.UpdatedAt = DateTime.Now;

            // User id fixed to 1
            game.UserId = 1;
            await GameRepo.CreateAsync(game);
            return new ServiceResp { Success = true };
        }
    }
}


Step 9.2. Press [Ctrl + .] to extract the interface, then add the service to the Dependency Injection (DI). In MauiProgram, inside the Services function, add the line.

Step 9.3. Code:
services.AddScoped<IGameService, GameService>();

Step 10. In AddGameVM, add a variable to control the accessibility of the confirm button:

Step 10.1. Code:
private bool confirmIsEnabled = true;

public bool ConfirmIsEnabled
{
    get => confirmIsEnabled;
    set => SetProperty(ref confirmIsEnabled, value);
}

Step 11. Remove gameStatus from AddGameVM so that it uses the GameStatus enum from the DTO model.

Step 12. Add IGameService to the AddGameVM constructor.

Step 13. Create the command that adds the game to the local database with the information that will be displayed in the app. Simultaneously attempts to save the image for local use. After insertion, displays a message and returns to the search screen.

[RelayCommand]
public async Task Confirm()
{
    // Disable button to prevent multiple requests
    ConfirmIsEnabled = false;
    string displayMessage;

    int? _rate = null;

    try
    {
        if (GameSelectedStatus == GameStatus.Played)
        {
            displayMessage = "Game and rate added successfully!";
            _rate = Rate;
        }
        else
        {
            displayMessage = "Game added successfully!";
        }

        if (IsOn && CoverUrl is not null)
        {
            _ = IGDBGamesApiService.SaveImageAsync(CoverUrl, $"{IgdbId}.jpg");
        }

        GameDTO game = new()
        {
            IGDBId = int.Parse(Id),
            Name = Name,
            ReleaseDate = ReleaseDate,
            CoverUrl = CoverUrl,
            Platforms = Platforms,
            Summary = Summary,
            Status = GameSelectedStatus.Value,
            Rate = _rate
        };

        var resp = await gameService.CreateAsync(game);
        if (!resp.Success) displayMessage = "Error adding game";

        // Display message and navigate back
        bool displayResp = await Application.Current.Windows[0].Page.DisplayAlert("Notice", displayMessage, null, "Ok");

        if (!displayResp)
        {
            await Shell.Current.GoToAsync("..");
        }
    }
    catch (Microsoft.EntityFrameworkCore.DbUpdateException)
    {
        bool displayResp = await Application.Current.Windows[0].Page.DisplayAlert("Error", "Game already added!", null, "Ok");

        if (!displayResp)
        {
            await Shell.Current.GoToAsync("..");
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine($"Error: {ex.Message}");
        throw;
    }
    finally
    {
        ConfirmIsEnabled = true;
    }
}


Step 14. Bind the command to the button:

Step 14.1. Code:
Command="{Binding ConfirmCommand}"

Step 15. Increment the local database version so the schema is recreated with the current structure.

Step 16. Run the system and add a game, then return to the search:

If the same game is saved again, a message notifies you that it has already been added.

Now, our game is successfully saved in the local database without duplication.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Scalar-Based API Documentation for ASP.NET Core

clock March 6, 2025 05:54 by author Peter

Well-organized, visually appealing API documentation is essential in modern web development. Scalar is a great tool for generating such documentation for ASP.NET Core applications. This post covers integrating Scalar into an ASP.NET Core project and producing API documentation with it.

Scalar: What is it?
Scalar is an open-source API documentation tool that offers a clean, modern interface for exploring and testing APIs. It is a straightforward, user-friendly alternative to Swagger UI and Redoc.

Why Use Scalar for API Documentation?

  • User-friendly Interface: Scalar provides a clean and modern UI.
  • Interactive API Testing: Allows developers to test APIs directly from the documentation.
  • Easy Integration: Simple setup process with minimal configuration.
  • Open Source: Free to use and modify according to project needs.

Setting Up Scalar in ASP.NET Core
Step 1. Install Scalar Package
To integrate Scalar into your ASP.NET Core project, you need to install the Scalar NuGet package. Run the following command in the terminal.
Install-Package Scalar.AspNetCore

Step 2. Configure Scalar in Startup.cs.
Modify the Startup.cs file to add Scalar middleware.
using Scalar.AspNetCore;

public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();
    services.AddScalar();
}

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.UseRouting();

    app.UseEndpoints(endpoints =>
    {
        endpoints.MapControllers();
    });

    app.UseScalar(); // Enable Scalar documentation
}

Step 3. Access API Documentation.
Once the configuration is complete, run your ASP.NET Core application and navigate to the following URL.
https://localhost:<port>/scalar

You should see a beautifully generated API documentation interface.

Customizing Scalar Documentation

Scalar provides customization options to enhance the API documentation experience.

1. Adding API Metadata
You can customize API metadata such as title, version, and description by modifying the Scalar configuration.
services.AddScalar(options =>
{
    options.DocumentTitle = "My API Documentation";
    options.ApiVersion = "v1.0";
    options.Description = "This is a sample API documentation using Scalar.";
});


2. Securing API Documentation
You can use authentication middleware to secure the Scalar endpoint and limit access to your API documentation.

Conclusion

Scalar is a powerful tool for producing visually appealing, interactive API documentation with ASP.NET Core. Its ease of integration, intuitive interface, and customization options make it a strong alternative to traditional documentation tools like Swagger. By following the steps in this article, you can set up Scalar quickly and improve your API documentation experience.



About HostForLIFE.eu

HostForLIFE.eu is European Windows Hosting Provider which focuses on Windows Platform only. We deliver on-demand hosting solutions including Shared hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes.

We have offered the latest Windows 2016 Hosting, ASP.NET Core 2.2.1 Hosting, ASP.NET MVC 6 Hosting and SQL 2017 Hosting.

