European ASP.NET 4.5 Hosting BLOG

BLOG about ASP.NET 4, ASP.NET 4.5 Hosting and Its Technology - Dedicated to European Windows Hosting Customer

European ASP.NET Core 9.0 Hosting - HostForLIFE :: Using C# with .NET 9 for Advanced Data Warehouse Modeling and Querying

clock May 14, 2025 07:25 by author Peter

Data warehouses exist to support business intelligence, analytics, and reporting. While SQL and ETL tools receive most of the attention, C# can be a valuable complement for model definition, metadata management, warehouse schema design, and large-scale querying. This article covers data warehousing with C# 14 and .NET 9.

Why C# for Data Warehousing?

  • Automate the creation of warehouse schemas (fact and dimension tables).
  • Validate and govern source-to-target mapping models.
  • Build dimensional models automatically.
  • Generate surrogate keys and SCD Type 2 rows.
  • Integrate with Azure Synapse, Snowflake, and BigQuery APIs.
  • Execute high-performance warehouse queries through parameterization and batching.

Data warehouse maintainability relies heavily on metadata programming, code generation, and system integration, all areas where C# excels.

Programmatic Warehouse Modeling in C#
Let's create a straightforward dimensional model in modern C#:

public record CustomerDim(string CustomerKey, string CustomerName, string Country);
public record OrderFact(
    string OrderKey,
    string CustomerKey,
    DateTime OrderDate,
    decimal TotalAmount,
    string CurrencyCode);


You can map the source data to those types before loading, or even generate the equivalent SQL CREATE TABLE scripts from attributes.
[WarehouseTable("dw.CustomerDim")]
public record CustomerDim(string CustomerKey, string CustomerName, string Country);


Use source generation or introspection to obtain the DDL based on annotated classes.
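As a sketch of the introspection approach, the following hypothetical WarehouseTableAttribute and generator derive a CREATE TABLE script from an annotated record. The attribute definition and the CLR-to-SQL type mappings are assumptions for illustration, not code from a library:

```csharp
using System;
using System.Linq;
using System.Reflection;

// Hypothetical attribute naming the target warehouse table.
[AttributeUsage(AttributeTargets.Class)]
public sealed class WarehouseTableAttribute : Attribute
{
    public string Name { get; }
    public WarehouseTableAttribute(string name) => Name = name;
}

[WarehouseTable("dw.CustomerDim")]
public record CustomerDim(string CustomerKey, string CustomerName, string Country);

public static class DdlGenerator
{
    // Maps a handful of CLR types to SQL column types; extend as needed.
    static string SqlType(Type t) =>
        t == typeof(string) ? "NVARCHAR(200)" :
        t == typeof(int) ? "INT" :
        t == typeof(decimal) ? "DECIMAL(18,2)" :
        t == typeof(DateTime) ? "DATETIME2" :
        throw new NotSupportedException(t.Name);

    public static string CreateTable(Type type)
    {
        var table = type.GetCustomAttribute<WarehouseTableAttribute>()?.Name ?? type.Name;
        var columns = type.GetProperties()
            .Select(p => $"    {p.Name} {SqlType(p.PropertyType)}");
        return $"CREATE TABLE {table} (\n{string.Join(",\n", columns)}\n);";
    }
}
```

Calling `DdlGenerator.CreateTable(typeof(CustomerDim))` yields a CREATE TABLE script for dw.CustomerDim; a source generator can produce the same output at compile time instead of via runtime reflection.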

Warehouse Querying from C#

Instead of scattering raw SQL, wrap parameterized warehouse queries in reusable methods (the example below uses Dapper's QueryAsync):
public async Task<List<OrderFact>> GetSalesByDateAsync(DateTime from, DateTime to)
{
    const string sql = @"
        SELECT
            OrderKey,
            CustomerKey,
            OrderDate,
            TotalAmount,
            CurrencyCode
        FROM dw.OrderFact
        WHERE OrderDate BETWEEN @from AND @to";
    using var conn = new SqlConnection(_warehouseConn);
    var results = await conn.QueryAsync<OrderFact>(sql, new { from, to });
    return results.ToList();
}


This design pattern allows you to:

  • Build C# console applications or analytics APIs to display dashboards or reports
  • Export to Excel, Power BI, CSV, or PDF
  • Run batch summaries or ML feature generation jobs

High-Level Features

Surrogate Key Generation


C# can handle surrogate keys either in-process or through database sequences.
int nextKey = await conn.ExecuteScalarAsync<int>(
    "SELECT NEXT VALUE FOR dw.CustomerKeySeq"
);

Slowly Changing Dimensions (SCD Type 2)
Use EF Core or Dapper to insert new rows for updated attributes with validity ranges.
if (existing.Name != updated.Name)
{
    // Close out the current version (assumes the dimension row
    // carries StartDate/EndDate validity columns)
    existing.EndDate = DateTime.UtcNow;

    // Insert a new version with an open-ended validity range
    var newVersion = new CustomerDim
    {
        // Assign necessary properties here
    };
    await conn.ExecuteAsync("INSERT INTO dw.CustomerDim ...", newVersion);
}
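The versioning rule itself can be illustrated without a database. This self-contained sketch (type and property names are illustrative, not from the snippet above) closes the current row and appends a new one when a tracked attribute changes:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative SCD Type 2 row: validity range plus one tracked attribute.
public class CustomerRow
{
    public int CustomerKey { get; set; }       // surrogate key
    public string CustomerId { get; set; } = ""; // business key
    public string Name { get; set; } = "";
    public DateTime StartDate { get; set; }
    public DateTime? EndDate { get; set; }     // null = current version
}

public static class Scd2
{
    public static void ApplyChange(List<CustomerRow> dim, string customerId,
                                   string newName, DateTime asOf)
    {
        var current = dim.Single(r => r.CustomerId == customerId && r.EndDate is null);
        if (current.Name == newName) return;   // no change, nothing to version

        current.EndDate = asOf;                // close the old version
        dim.Add(new CustomerRow
        {
            CustomerKey = dim.Max(r => r.CustomerKey) + 1,
            CustomerId = customerId,
            Name = newName,
            StartDate = asOf,
            EndDate = null
        });
    }
}
```

The same close-then-insert sequence is what the Dapper version performs against dw.CustomerDim, ideally inside a single transaction.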

Query Materialization
Use a ToDataTable() extension method to materialize warehouse query results as in-memory tables.
var table = queryResults.ToDataTable();
ExportToCsv(table, "output/sales.csv");
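ToDataTable() is not part of the BCL; a minimal reflection-based version of such an extension might look like this (the Sale record is just a demo row type, and column typing unwraps nullable property types):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;

public static class EnumerableExtensions
{
    // Builds a DataTable whose columns mirror T's public properties.
    public static DataTable ToDataTable<T>(this IEnumerable<T> rows)
    {
        var props = typeof(T).GetProperties();
        var table = new DataTable(typeof(T).Name);
        foreach (var p in props)
            table.Columns.Add(p.Name,
                Nullable.GetUnderlyingType(p.PropertyType) ?? p.PropertyType);
        foreach (var row in rows)
            table.Rows.Add(props.Select(p => p.GetValue(row) ?? (object)DBNull.Value).ToArray());
        return table;
    }
}

// Demo row type for the usage below.
public record Sale(string Region, decimal Amount);
```

With that in place, `new[] { new Sale("EU", 10m) }.ToDataTable()` yields a one-row DataTable ready for CSV export or binding.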

BI Tool and API Integration

C# can:

  • Feed Power BI through REST or tabular model APIs
  • Push metrics to dashboards
  • Develop REST APIs that wrap SQL with business-oriented endpoints

Conclusion
You can also automate report sharing via email, Teams, or Slack. With .NET 9 and C# 14, you can be hands-on and flexible in data warehouse modeling and querying. Whether modeling dimensions, building APIs, or filling dashboards, C# gives you control, performance, and maintainability that you simply can't get from SQL scripts alone.




European ASP.NET Core 9.0 Hosting - HostForLIFE :: Using C# Examples to Implement the Saga Pattern for Distributed Transactions Across Services

clock May 9, 2025 08:28 by author Peter

Distributed transactions are challenging in microservices because each service has its own data store. The Saga pattern provides a way to handle transactions across services without a global transaction manager. Instead of one atomic transaction, a saga splits the work into a sequence of local transactions, each with a compensating transaction for failure. In this post, I describe how to implement the Saga pattern in C#, with real examples to demonstrate the flow.

Saga Concepts
A saga consists of:

  • A series of operations (local transactions)
  • Compensating actions to undo steps in case anything goes wrong

Sagas can be dealt with in two basic ways:

  • Choreography: all services subscribe to events and respond accordingly (no central coordinator).
  • Orchestration: a central saga orchestrator guides the flow.

Example Scenario: Order Processing Saga
Consider an e-commerce process:

  • Create Order (Order Service)
  • Reserve Inventory (Inventory Service)
  • Charge Payment (Payment Service)

If payment fails, we need to:

  • Cancel payment (if partial)
  • Release inventory
  • Cancel the order

Orchestration Example (C#)
We'll use a basic saga orchestrator.

Saga Orchestrator

public class OrderSagaOrchestrator
{
    private readonly IOrderService _orderService;
    private readonly IInventoryService _inventoryService;
    private readonly IPaymentService _paymentService;

    public OrderSagaOrchestrator(
        IOrderService orderService,
        IInventoryService inventoryService,
        IPaymentService paymentService)
    {
        _orderService = orderService;
        _inventoryService = inventoryService;
        _paymentService = paymentService;
    }

    public async Task<bool> ProcessOrderAsync(OrderData order)
    {
        try
        {
            await _orderService.CreateOrderAsync(order);
            await _inventoryService.ReserveInventoryAsync(order.OrderId, order.Items);
            await _paymentService.ChargeAsync(order.OrderId, order.TotalAmount);
            return true;
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Saga failed: {ex.Message}. Starting compensation...");
            // Compensations must be idempotent: each call should be a
            // no-op if its forward step never ran.
            await _paymentService.RefundAsync(order.OrderId);
            await _inventoryService.ReleaseInventoryAsync(order.OrderId);
            await _orderService.CancelOrderAsync(order.OrderId);
            return false;
        }
    }
}

Example Interfaces
public interface IOrderService
{
    Task CreateOrderAsync(OrderData order);
    Task CancelOrderAsync(string orderId);
}

public interface IInventoryService
{
    Task ReserveInventoryAsync(string orderId, List<Item> items);
    Task ReleaseInventoryAsync(string orderId);
}

public interface IPaymentService
{
    Task ChargeAsync(string orderId, decimal amount);
    Task RefundAsync(string orderId);
}

public class OrderData
{
    public string OrderId { get; set; }
    public List<Item> Items { get; set; }
    public decimal TotalAmount { get; set; }
}

public class Item
{
    public string ProductId { get; set; }
    public int Quantity { get; set; }
}

Choreography Example
In a choreography-based saga, all services listen to events. When, for example, the OrderCreated event is published, the Inventory Service hears it and reserves inventory.

Example RabbitMQ consumer for Inventory Service.
consumer.Received += async (model, ea) =>
{
    var message = Encoding.UTF8.GetString(ea.Body.ToArray());
    var orderCreated = JsonConvert.DeserializeObject<OrderCreatedEvent>(message);
    await _inventoryService.ReserveInventoryAsync(
        orderCreated.OrderId,
        orderCreated.Items
    );
    // Then publish InventoryReserved event
};
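The OrderCreatedEvent referenced above isn't defined in the article; a minimal shape consistent with the OrderData model from the orchestration example might be (an assumption — adjust to your actual message contract):

```csharp
using System.Collections.Generic;

// Hypothetical event contracts for the choreography flow; property
// names mirror the OrderData model used in the orchestration example.
public class OrderCreatedEvent
{
    public string OrderId { get; set; } = "";
    public List<Item> Items { get; set; } = new();
    public decimal TotalAmount { get; set; }
}

// Published by the Inventory Service after a successful reservation.
public class InventoryReservedEvent
{
    public string OrderId { get; set; } = "";
}

public class Item
{
    public string ProductId { get; set; } = "";
    public int Quantity { get; set; }
}
```

Each service deserializes only the events it cares about and publishes the next event in the chain, so the saga's state lives in the message flow itself.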


Best Practices

  • Ensure compensating transactions are always idempotent.
  • Employ reliable messaging (such as RabbitMQ, Kafka) to prevent lost events.
  • Log saga progress for traceability.
  • Research using libraries such as MassTransit (saga support) for production.
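Idempotency (the first bullet) can be as simple as recording which compensations have already run. A sketch, with an in-memory set standing in for durable saga state:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Sketch: guard each compensation with recorded saga state so that
// retries (e.g. redelivered messages) do not undo work twice.
// In production the set would be a database table keyed by saga id.
public class CompensationLog
{
    private readonly HashSet<string> _done = new();

    public async Task RunOnceAsync(string orderId, string step, Func<Task> compensation)
    {
        var key = $"{orderId}:{step}";
        if (!_done.Add(key))
            return; // already compensated: make the retry a no-op
        await compensation();
    }
}
```

Usage would look like `await log.RunOnceAsync(order.OrderId, "release-inventory", () => _inventoryService.ReleaseInventoryAsync(order.OrderId));`, so a redelivered failure event cannot release the same reservation twice.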

Conclusion
The Saga pattern enables your C# services to orchestrate complex workflows without distributed transactions. With orchestration or choreography, sagas ensure data consistency between services and handle failures elegantly. By using these principles carefully, you can create scalable and resilient distributed systems.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Partial Properties and Indexers Simplified

clock May 6, 2025 07:53 by author Peter

Code cleanliness and scalability are critical, particularly in large projects or when numerous developers are collaborating. One golden rule we always hear is “separation of concerns.” To follow that, many of us use partial classes to split different responsibilities nicely. However, there was one irksome restriction up until C# 12: we were unable to make properties or indexers partial. As a result, we occasionally had to combine several logics in one location, which caused the code to become disorganized. It was particularly annoying when using tools like code generators and Entity Framework.

Now in C# 13.0, this small but powerful feature has come with Partial Properties and Indexers. It may look like a small update, but in real development, it’s actually a big one.

In this blog, let’s see how we were managing earlier, what’s new in C# 13, and how this feature can help us keep our code clean, readable, and more maintainable.

How was it earlier?

Let's say you're working with a code generator (like EF Core) that creates a class like this.
public partial class Product
{
    public string Name { get; set; }
}


Now, suppose you want to add some custom logic, like trimming extra spaces or checking that the name is not empty or null. But since the Name property already lives in the generated code, you can't change it directly. You're left with only a few not-so-nice options.

  • Shadow property: make a new property with a similar name and manually sync values. A bit messy.
  • Backing fields: override the generated class and handle it yourself, but then you might break the generated code. Risky.
  • Partial methods or extension methods: they can work, but they feel like a jugaad (hack), not clean or natural.

End result? Code becomes untidy, logic is scattered in multiple files, and worst part — some developers stop following best practices just because tools don’t support it well.

How can we resolve this in C# 13.0?

In C# 13.0, we have a new feature called partial properties and partial indexers. This allows you to declare a property in one part of a partial class and implement it in another.

This update makes property logic more like how we write methods — now we can do things like this.
// Part 1
public partial void DoSomething();

// Part 2
public partial void DoSomething()
{
    // logic
}

We can now do the same with properties.
// Part 1
public partial string Name { get; set; }

// Part 2
public partial string Name
{
    get => _name;
    set => _name = value.Trim();
}
private string _name;

Simple and powerful! Now let's see a real-life example where we can apply this.

Real-World example - Email validation in a generated model

Let’s say we have a model generated by EF Core or a source generator.

// File: Customer.Generated.cs
public partial class Customer
{
    public partial string Email { get; set; }
}

Above is the auto-generated code, where we can't do much. Hence, we create another partial class and extend it as below.

// File: Customer.cs
public partial class Customer
{
    public partial string Email
    {
        get => _email;
        set
        {
            if (!value.Contains("@"))
                throw new ArgumentException("Invalid email address.");

            _email = value.Trim().ToLower();
        }
    }
    private string _email;
}


This is now clean, separated, and maintainable.

How does it work with Collections?
Earlier, indexers also had the same restriction. Now with partial indexers, working with custom data structures becomes much easier. Let’s see how.

// File: Matrix.Generated.cs
public partial class Matrix
{
    public partial double this[int row, int col] { get; set; }
}


// File: Matrix.cs
public partial class Matrix
{
    public partial double this[int row, int col]
    {
        get => _data[row, col];
        set => _data[row, col] = value;
    }
    private double[,] _data = new double[3, 3];
}


Now your indexer logic can be maintained cleanly alongside internal data handling.

Benefits we are getting out of it

So, what do we really get with these new Partial Properties and Indexers?

  • Cleaner Code Separation: Now your business logic stays in your own file, and the generated code stays untouched. No more mixing things up.
  • Better Tooling Support: Works smoothly with Entity Framework, Blazor, source generators, and even your own scaffolding tools, if you use any.
  • No More Jugaad (Hacks): No need to hide properties or write duplicate logic. Just write behavior where it actually belongs.
  • Teamwork Becomes Easy: The UI team can handle things like trimming or formatting, while the backend team focuses on saving data. Clean division.
  • Easier to Maintain: Just open the file, and at a glance, you know what’s written by you and what’s coming from code generation.


European ASP.NET Core 9.0 Hosting - HostForLIFE :: Developing an ASP.NET E-Commerce Chatbot with SQL Server

clock May 2, 2025 07:45 by author Peter

On digital platforms, chatbots are effective tools for enhancing user experience. In this blog post, I developed a basic e-commerce chatbot using ASP.NET (Web Forms) and SQL Server that gives users immediate answers about digital products such as planners, templates, and other file types, without the need for third-party AI libraries.

What does this Chatbot do?

  • Responds to product-related user queries using predefined keywords
  • Displays a friendly chat UI with user-bot interaction
  • Stores each conversation in a chat log table for analysis
  • Uses SQL queries to retrieve responses based on keyword match
  • Fully functional without JavaScript or AJAX

Tech Stack

  • Frontend: ASP.NET Web Forms, Bootstrap 5
  • Backend: C# (.NET Framework), ADO.NET
  • Database: Microsoft SQL Server
  • Tables Used
    • BootResponses: Stores bot responses
    • ChatLogs: Stores chat history

UI Design (ChatBoot.aspx)
The chatbot is placed inside a styled Bootstrap card. Here's the code.

<div class="chat-wrapper">
    <div class="chat-header">
        <h5>E-Commerce Chatbot - Ask Me About Products!</h5>
    </div>

    <div class="chat-body" id="chatBody" runat="server">
    </div>

    <div class="chat-footer">
        <asp:TextBox
            ID="txtUserInput"
            runat="server"
            CssClass="form-control"
            placeholder="Type your question...">
        </asp:TextBox>

        <asp:Button
            ID="btnSend"
            runat="server"
            Text="Send"
            CssClass="btn btn-primary"
            OnClick="btnSend_Click" />
    </div>
</div>



Backend Logic (ChatBoot.aspx.cs)
On Page_Load, I show a friendly welcome message and sample questions.
chatBody.InnerHtml += @"
<div class='bot-msg'>
    Welcome! I'm your assistant bot for digital product shopping and selling.<br/>
    Try asking:
    <ul>
        <li>What templates do you have?</li>
        <li>How to list my product?</li>
        <li>Steps to buy a planner?</li>
    </ul>
</div>";


When the user clicks "Send", the bot checks the database for matching keywords.
string query = @"
    SELECT TOP 1 ResponseText
    FROM BootResponses
    WHERE @msg LIKE '%' + QuestionKeyword + '%'";


If a match is found, the bot replies with that message. Else, it shows a fallback message.

Also, both user messages and bot replies are logged into an SQL table ChatLogs.
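The SQL LIKE match and fallback can be illustrated in plain C#. This self-contained sketch (table data inlined as a list; names are illustrative, not the article's exact code-behind) mirrors the lookup-then-fallback logic of btnSend_Click:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class KeywordBot
{
    // In-memory stand-in for the BootResponses table.
    private static readonly List<(string Keyword, string Response)> Responses = new()
    {
        ("price", "Our digital product prices start at just $5."),
        ("template", "We offer a variety of templates including resumes, portfolios, and business plans."),
        ("hello", "Hi there! Welcome to our Digital Marketplace. How can I assist you today?")
    };

    // Mirrors: WHERE @msg LIKE '%' + QuestionKeyword + '%'
    public static string Reply(string userMessage) =>
        Responses.FirstOrDefault(r =>
                userMessage.Contains(r.Keyword, StringComparison.OrdinalIgnoreCase))
            .Response ?? "I'm sorry, I don't understand.";
}
```

In the real page the lookup runs as the parameterized SQL shown above, and both the question and the chosen reply are then inserted into ChatLogs.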


Database Structure

BootResponses Table

Id | QuestionKeyword | ResponseText
1  | Price    | Our digital product prices start at just $5.
2  | Template | We offer a variety of templates including resumes, portfolios, and business plans.
3  | Hello    | Hi there! Welcome to our Digital Marketplace. How can I assist you today?
4  | Hi       | Hello! I'm your assistant bot. Ask me anything about our digital products.

It supports flexible keywords like "Price", "Template", "Upload", "Download", "Support", "Account", and many more.

ChatLogs Table

This logs every user query and bot reply for future review or enhancement.

Id | UserMessage | BotResponse | Timestamp
1  | How to buy and sell the digital product | I'm sorry, I don't understand. | 2025-04-29 16:09:58
2  | How to create an Account? | You can register a free account to manage your purchases and downloads. | 2025-04-29 16:22:15

Highlights


  • Keyword Matching: Uses SQL LIKE to match partial questions.
  • Data Logging: Every conversation is stored in ChatLogs.
  • Responsive UI: Built with Bootstrap for a clean mobile-friendly interface.
  • Expandable: Easily add more QuestionKeyword and ResponseText entries.

Future Enhancements

  • Add AJAX to make the chatbot work without page refresh.
  • Train a basic ML model to better understand fuzzy queries.
  • Show suggestions dynamically as the user types.
  • Include voice-based input for accessibility.

Final Thoughts
This chatbot project demonstrates how even a basic keyword-based bot can serve as an interactive assistant for digital product platforms. It’s a great starting point if you're building your own chatbot and want to connect frontend UI with backend logic and a real database.





European ASP.NET Core 9.0 Hosting - HostForLIFE :: ScopedValueChanger&lt;T&gt;: A Handy Generic Helper Class for Temporarily Changing Values

clock April 28, 2025 08:11 by author Peter

In software development, handling transient changes can be difficult, particularly when it's imperative to return to the initial state as soon as the modified state is no longer required. The ScopedValueChanger&lt;T&gt; class offers a reliable, organized way to manage transient value changes in .NET. It guarantees that a value is changed for the duration of a specific scope and is safely returned to its initial state when the operation completes.

Functionality and Design
Leveraging the dispose pattern in .NET, ScopedValueChanger&lt;T&gt; implements the IDisposable interface, automatically resetting the affected value when the instance exits scope. The class has the following main characteristics and functions:

Structure
Type Parameter T: The class is generic with a type parameter T, allowing it to be used with any data type. This provides flexibility in handling various types of properties or fields.

Constructor Parameters:

  • Func<T> getter: A function to retrieve the current value.
  • Action<T> setter: An action to set the value.
  • T newValue: The new value to be temporarily set.

Behavior
Upon instantiation, the ScopedValueChanger records the current value through the getter delegate and applies the newValue using the setter delegate. When the Dispose method is called—either explicitly or at the end of a using block—the value is reverted to its original state, ensuring state consistency.

Exception Handling

The constructor throws an ArgumentNullException if either the getter or setter is null, preventing undefined behavior from invalid arguments.
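The article describes the class but does not show it; a minimal implementation matching the described constructor and behavior might look like this (a sketch, not necessarily the author's exact code):

```csharp
using System;

// Sketch of the described class: capture the current value, apply the
// new one, and restore the original on Dispose.
public sealed class ScopedValueChanger<T> : IDisposable
{
    private readonly Action<T> _setter;
    private readonly T _originalValue;

    public ScopedValueChanger(Func<T> getter, Action<T> setter, T newValue)
    {
        if (getter is null) throw new ArgumentNullException(nameof(getter));
        _setter = setter ?? throw new ArgumentNullException(nameof(setter));

        _originalValue = getter();  // remember the value to restore
        _setter(newValue);          // apply the temporary value
    }

    public void Dispose() => _setter(_originalValue);
}
```

Because restoration happens in Dispose, a using block guarantees the rollback even if the scoped code throws.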

Practical Use Cases
The ScopedValueChanger<T> is quite versatile and can be employed in various scenarios. Here are some examples:

Example 1. Temporarily Modifying a Global Setting

Imagine a scenario where an application has a global setting controlling whether logging is enabled. In certain sections of code, such as during performance-sensitive operations, it might be beneficial to temporarily disable logging:
using (var changer = new ScopedValueChanger<bool>(() => Logger.IsEnabled, v => Logger.IsEnabled = v, false))
{
    // Perform operations without logging.
}
// The logging state is automatically restored here.

Example 2. UI Element Visibility
In UI applications, it might be necessary to hide certain elements temporarily, for instance, during a screen transition:
UIElement myPanel = FindPanel();
using (var changer = new ScopedValueChanger<bool>(() => myPanel.IsVisible, v => myPanel.IsVisible = v, false))
{
    // Perform UI updates without the panel being visible.
}
// Panel visibility is restored automatically.


Example 3. Database Transactions
While handling database transactions, it is crucial to ensure that parameters like connection strings or cache settings are accurately restored post-transaction to maintain integrity:
string currentConnectionString = Database.ConnectionString;
using (var changer = new ScopedValueChanger<string>(() => Database.ConnectionString, s => Database.ConnectionString = s, "TemporaryConnection"))
{
    // Perform database operations using the temporary connection string.
}
// The original connection string is restored automatically.


Conclusion
The ScopedValueChanger<T> class is an excellent utility for managing temporary alterations, specifically in contexts requiring resource cleanup or state management. Its reliance on the dispose pattern underscores both safety and simplicity in handling such operations, providing developers with an effective tool for maintaining application integrity across diverse application scenarios.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Using .NET 9 to Build a Face Detection System

clock April 21, 2025 09:25 by author Peter

Face detection has evolved from a cool sci-fi concept to a practical tool in everything from photo tagging to biometric authentication. In this blog post, we’ll explore how to build a real-time face detection system using the latest .NET 9 and OpenCV via Emgu.CV, a popular .NET wrapper around the OpenCV library.


Prerequisites
To follow along, you'll need:

✅ .NET 9 SDK
✅ Visual Studio 2022+ or VS Code
✅ Basic knowledge of C#
✅ Emgu.CV NuGet packages
✅ A working webcam (for real-time detection)

Step 1. Create a New .NET Console App
Open a terminal and run:
dotnet new console -n FaceDetectionApp
cd FaceDetectionApp


Step 2. Install Required Packages
Add the Emgu.CV packages:
dotnet add package Emgu.CV
dotnet add package Emgu.CV.runtime.windows

Step 3. Add the Haar Cascade File
OpenCV uses trained XML classifiers for detecting objects. Download the Haar cascade for frontal face detection (the haarcascade_frontalface_default.xml file from the OpenCV distribution).
Place this file in your project root and configure it to copy to the output directory:
FaceDetectionApp.csproj
<ItemGroup>
  <None Update="haarcascade_frontalface_default.xml">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>


Step 4. Write the Real-Time Detection Code
Replace Program.cs with:
using Emgu.CV;
using Emgu.CV.Structure;
using System.Drawing;

var capture = new VideoCapture(0);
var faceCascade = new CascadeClassifier("haarcascade_frontalface_default.xml");

if (!capture.IsOpened)
{
    Console.WriteLine("Could not open the webcam.");
    return;
}

Console.WriteLine("Press ESC to exit...");

while (true)
{
    using var frame = capture.QueryFrame()?.ToImage<Bgr, byte>();

    if (frame == null)
        break;

    var gray = frame.Convert<Gray, byte>();
    var faces = faceCascade.DetectMultiScale(gray, 1.1, 10, Size.Empty);

    foreach (var face in faces)
    {
        frame.Draw(face, new Bgr(Color.Green), 2);
    }

    CvInvoke.Imshow("Real-Time Face Detection", frame);

    if (CvInvoke.WaitKey(1) == 27) // ESC key
        break;
}

CvInvoke.DestroyAllWindows();


Output
Once you run the app using dotnet run, your webcam will open, and it will start detecting faces in real time by drawing rectangles around them.

Conclusion

With the power of .NET 9 and the flexibility of OpenCV, building a real-time face detection system has never been easier. Whether you're developing a hobby project or prototyping a serious application, this foundation sets you on the path to mastering computer vision in the .NET ecosystem.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: MAUI.NET 9: Adding Objects to a Local Database

clock April 17, 2025 08:07 by author Peter

Step 1. Now that we have our local database set up, let's use it to store game reviews.

Step 1.1. Code:
using System.ComponentModel.DataAnnotations;

namespace Models.DTOs;

public class DTOBase
{
    [Key]
    public int? Id { get; set; }
    public int? ExternalId { get; set; }
    public DateTime CreatedAt { get; set; }
    public DateTime UpdatedAt { get; set; }
    public bool Inactive { get; set; }
}


Step 1.2. In the same folder, let's create GameDTO.
namespace Models.DTOs;

[Microsoft.EntityFrameworkCore.Index(nameof(IGDBId), IsUnique = true)]
public class GameDTO : DTOBase
{
    public int? IGDBId { get; set; }
    public int UserId { get; set; }
    public required string Name { get; set; }
    public string? ReleaseDate { get; set; }
    public string? Platforms { get; set; }
    public string? Summary { get; set; }
    public string? CoverUrl { get; set; }
    public required GameStatus Status { get; set; }
    public int? Rate { get; set; }
}

public enum GameStatus
{
    Want,
    Playing,
    Played
}

Step 2. In DbCtx.cs, add the Games table

Step 2.1. Code:
public virtual required DbSet<GameDTO> Games { get; set; }

Step 3. In the repo project, create the GameRepo.cs.

Step 4. Implement GameRepo.cs:
using Microsoft.EntityFrameworkCore;
using Models.DTOs;

namespace Repo
{
    public class GameRepo(IDbContextFactory<DbCtx> DbCtx)
    {
        public async Task<int> CreateAsync(GameDTO game)
        {
            using var context = DbCtx.CreateDbContext();
            await context.Games.AddAsync(game);
            return await context.SaveChangesAsync();
        }
    }
}


Step 4.1. Extract the interface from this class.
Step 5. Save the image locally so that, after adding the game, we can retrieve it without relying on an internet connection. In the ApiRepo project, inside the IGDBGamesAPIRepo class, create the GetGameImageAsync function.
public static async Task<byte[]> GetGameImageAsync(string imageUrl)
{
    try
    {
        using HttpClient httpClient = new();
        var response = await httpClient.GetAsync(imageUrl);

        if (!response.IsSuccessStatusCode)
            throw new Exception($"Exception downloading image URL: {imageUrl}");

        return await response.Content.ReadAsByteArrayAsync();
    }
    catch (Exception ex)
    {
        throw new Exception($"Error fetching image: {ex.Message}", ex);
    }
}


Step 5.1. In the Services project, inside the IGDBGamesApiService class, create the SaveImageAsync function.
public static async Task SaveImageAsync(string imageUrl, string fileName)
{
    try
    {
        byte[] imageBytes = await IGDBGamesAPIRepo.GetGameImageAsync(imageUrl);
        string filePath = Path.Combine(GameService.ImagesPath, fileName);

        if (File.Exists(filePath))
            return;

        await File.WriteAllBytesAsync(filePath, imageBytes);
    }
    catch (Exception)
    {
        throw; // Consider logging the exception instead of rethrowing it blindly
    }
}

Step 6. Create a function to add the repositories in MauiProgram.
public static IServiceCollection Repositories(this IServiceCollection services)
{
    services.AddScoped<IGameRepo, GameRepo>();
    return services;
}

Step 7. Call this function inside CreateMauiApp()

Step 7.1. Add a function to create the image path.

Step 7.2. Code:
if (!System.IO.Directory.Exists(GameService.ImagesPath))
    System.IO.Directory.CreateDirectory(GameService.ImagesPath);

Step 8. In Models.Resps, create an object to handle the service response for the mobile project.

Step 8.1. Code:
namespace Models.Resps;

public class ServiceResp
{
    public bool Success { get; set; }

    public object? Content { get; set; }
}


Step 9. Create the GameService.

Step 9.1. Code:
using Models.DTOs;
using Models.Resps;
using Repo;

namespace Services
{
    public class GameService(IGameRepo GameRepo)
    {
        public static readonly string ImagesPath =
            Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData), "Inventory");
        public async Task<ServiceResp> CreateAsync(GameDTO game)
        {
            game.CreatedAt = game.UpdatedAt = DateTime.Now;

            //User id fixed to 1
            game.UserId = 1;
            await GameRepo.CreateAsync(game);
            return new ServiceResp { Success = true };
        }
    }
}


Step 9.2. Press [Ctrl + .] to extract the interface, then add the service to the Dependency Injection (DI). In MauiProgram, inside the Services function, add the line.

Step 9.3. Code:
services.AddScoped<IGameService, GameService>();

Step 10. In AddGameVM, add a variable to control the accessibility of the confirm button:

Step 10.1. Code:
private bool confirmIsEnabled = true;

public bool ConfirmIsEnabled
{
    get => confirmIsEnabled;
    set => SetProperty(ref confirmIsEnabled, value);
}

Step 11. Remove gameStatus from AddGameVM so that it uses the one from the DTO model.

Step 12. Add IGameService to the AddGameVM constructor.

Step 13. Create the command that adds the game to the local database with the information that will be displayed in the app. It simultaneously attempts to save the image for local use and, after insertion, displays a message and returns to the search screen.

[RelayCommand]
public async Task Confirm()
{
    // Disable button to prevent multiple requests
    ConfirmIsEnabled = false;
    string displayMessage;

    int? _rate = null;

    try
    {
        if (GameSelectedStatus == GameStatus.Played)
        {
            displayMessage = "Game and rate added successfully!";
            _rate = Rate;
        }
        else
        {
            displayMessage = "Game added successfully!";
        }

        if (IsOn && CoverUrl is not null)
        {
            _ = IGDBGamesApiService.SaveImageAsync(CoverUrl, $"{IgdbId}.jpg");
        }

        GameDTO game = new()
        {
            IGDBId = int.Parse(Id),
            Name = Name,
            ReleaseDate = ReleaseDate,
            CoverUrl = CoverUrl,
            Platforms = Platforms,
            Summary = Summary,
            Status = GameSelectedStatus.Value,
            Rate = _rate
        };

        var resp = await gameService.CreateAsync(game);
        if (!resp.Success) displayMessage = "Error adding game";

        // Display message and navigate back
    bool displayResp = await Application.Current.Windows[0].Page.DisplayAlert("Notice", displayMessage, null, "Ok");

        if (!displayResp)
        {
            await Shell.Current.GoToAsync("..");
        }
    }
    catch (Microsoft.EntityFrameworkCore.DbUpdateException)
    {
        bool displayResp = await Application.Current.Windows[0].Page.DisplayAlert("Error", "Game already added!", null, "Ok");

        if (!displayResp)
        {
            await Shell.Current.GoToAsync("..");
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine($"Error: {ex.Message}");
        throw;
    }
    finally
    {
        ConfirmIsEnabled = true;
    }
}


Step 14. Bind the command to the button:

Step 14.1. Code:
Command="{Binding ConfirmCommand}"

Step 15. Update the database version to be recreated with the current structure.

Step 16. Run the system and add a game, then return to the search:

If the same game is saved again, display a message notifying that it has already been added.

Now, our game is successfully saved in the local database without duplication.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: .NET Aspire Service Orchestration

clock April 14, 2025 08:15 by author Peter

.NET Aspire's orchestration helps everything sing, turning dispersed services into a coordinated symphony. In this chapter, we will get deeper into orchestration with .NET Aspire.

  • Introduction to Orchestration in .NET Aspire.
  • Understanding the .NET Aspire Orchestrator.
  • Setting Up Your Aspire Orchestration Project.
  • Comparison with Docker Compose.

Introduction to Orchestration in .NET Aspire
As modern applications become more modular and distributed, orchestration becomes essential to tame the complexity. Whether it's a set of APIs, background workers, message queues, or third-party integrations, every moving part needs to be started, connected, and observed in a predictable way. That’s where .NET Aspire's orchestration capabilities shine.

Unlike traditional infrastructure-first orchestrators like Kubernetes, Aspire brings orchestration closer to the developer experience. It helps you declare and manage dependencies between projects and services within your solution, ensuring everything runs smoothly—without needing YAML or external tooling during development.

With Aspire, orchestration is,

  • Declarative: You define what services exist and how they relate to each other.
  • Integrated: It’s built right into the Aspire tooling—no need for Docker Compose or Kubernetes just to test things locally.
  • Observable: Thanks to the built-in dashboard, you can monitor services, view logs, check health statuses, and trace failures all in one place.

The orchestration layer essentially acts as a central brain that boots up your projects in the right order, injects configuration and secrets, and ties everything together using environment variables and DI.

Understanding the .NET Aspire Orchestrator

At the heart of every .NET Aspire solution is a project typically named something like “YourApp.AppHost”. This isn't just another project in your solution—it's the orchestration engine that wires up everything else.

Think of it as the composition root for your distributed application.

What Does the Orchestrator Do?
The orchestrator’s job is to,

  • Register services like your Web APIs, workers, Blazor apps, databases, caches, and even external services.
  • Configure dependencies between these services so they can talk to each other.
  • Inject environment variables and secrets automatically where needed.
  • Control the service startup order, ensuring dependencies are initialized in the right sequence.
  • Launch and monitor the services as a single unit during development.

Here’s a quick snippet showing how services are wired up inside the “Program.cs” of an Aspire Orchestrator project.
var builder = DistributedApplication.CreateBuilder(args);

var redis = builder.AddRedis("cache");

var productsApi = builder.AddProject<Projects.MyApp_ProductsApi>("products-api")
                         .WithReference(redis);

var frontend = builder.AddProject<Projects.MyApp_Blazor>("frontend")
                      .WithReference(productsApi);

builder.Build().Run();

This simple and intuitive code accomplishes quite a bit.

  • AddProject registers each project in your solution.
  • WithReference defines dependencies between services (injected as environment variables).
  • Services are automatically exposed via the Aspire Dashboard.

How Does Dependency Injection Work?
Aspire takes care of dependency wiring by exporting connection strings and configuration values as environment variables. For example, if your Products API needs Redis, Aspire will:

  • Automatically assign the Redis service a connection string.
  • Export it as ConnectionStrings__cache (or similar).

Your API can then consume it using standard configuration binding in .NET.
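As a minimal sketch of that consumption side (the key "cache" is the resource name from the App Host example above, and AddStackExchangeRedisCache assumes the Microsoft.Extensions.Caching.StackExchangeRedis package is referenced):

```csharp
var builder = WebApplication.CreateBuilder(args);

// "cache" matches the resource name declared in the App Host;
// Aspire surfaces it through the ConnectionStrings__cache environment variable.
var redisConnectionString = builder.Configuration.GetConnectionString("cache");

// Wire the distributed cache against the injected connection string
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = redisConnectionString;
});

var app = builder.Build();
app.Run();
```

No hard-coded host or port anywhere—the App Host decides those at launch time.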

What Services Can You Orchestrate?

  • ASP.NET Core APIs
  • Worker services
  • Blazor apps
  • Background jobs
  • Redis, PostgreSQL, SQL Server, MongoDB, RabbitMQ, etc.
  • External HTTP or gRPC endpoints.
  • Containerized services via AddContainer(...)

This makes it easy to emulate a realistic production environment locally, with minimal setup.
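For a service that has no built-in Add* helper, AddContainer covers the gap. A hedged sketch (the resource name "mail", the MailHog image, and the port are illustrative choices, not part of any Aspire default):

```csharp
var builder = DistributedApplication.CreateBuilder(args);

// Illustrative: run MailHog as a dev-time SMTP sink next to the projects
var mail = builder.AddContainer("mail", "mailhog/mailhog")
    .WithHttpEndpoint(targetPort: 8025, name: "ui"); // expose the MailHog web UI

builder.Build().Run();
```

The container then shows up on the Aspire Dashboard like any other registered resource.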

Setting Up Aspire Orchestration Project

If you are following along with this .NET Aspire quick guide, you should already have the Blazor weather app from the last chapter, where we added resiliency support using the .NET Aspire Service Defaults template.

To get started with service orchestration in your Aspire-powered solution, we first need to add an Aspire App Host Project.

  • Right-click the solution file in Solution Explorer.
  • Choose Add > New Project.
  • Search for .NET Aspire App Host and add it to your solution.

This project serves as the orchestration hub for your distributed application. In my example, I now have four projects in the solution, including the newly added project named BlazorWeatherApp.AppHost, following the standard Aspire naming convention.

Registering Projects with the App Host
Once the App Host project is added, the first step is to register all the required services — such as microservices, background workers, or frontend apps by referencing them inside the App Host project.


You can double-check the .csproj file of your App Host project to ensure all project references are correctly added.

What is the App Host and Why Use It?
In .NET Aspire, the App Host acts as the central coordinator for all services. It not only runs all your apps together but also provides features like:

  • Orchestration and centralized launching
  • Monitoring of all services
  • Built-in service discovery

Let’s look at the code to understand how services are registered.

What does builder.AddProject<>() do?
This method registers an application (like a microservice or a frontend project) with the App Host. It ensures:

  • The project is included in orchestration.
  • It’s monitored and tracked during execution.
  • It can discover and communicate with other registered services.

What role do these string parameters play here ("api", "myweatherapp")?
That string is the resource name or service identifier.

  • It's used for
    • Service discovery (e.g., resolving internal URLs)
    • Dashboard visualization (the names you see in Aspire Dashboard)
    • Logging and diagnostics labels
  • Think of it as a logical name for each project in the distributed system.

For example
builder.AddProject<Projects.BlazorWeatherApp_API>("api");

This means

  • Register the BlazorWeatherApp_API project
  • In service maps, configs, and communication, it's known as "api."
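That logical name is what other services use to find it. A sketch of the consuming side (this assumes service discovery is enabled via AddServiceDefaults() from the previous chapter; "https+http://" is the Aspire scheme meaning "prefer HTTPS, fall back to HTTP"):

```csharp
// In the consuming project (e.g., the Blazor frontend):
builder.Services.AddHttpClient("weather", client =>
{
    // "api" is resolved to the real endpoint by Aspire service discovery,
    // so no host or port is hard-coded here
    client.BaseAddress = new Uri("https+http://api");
});
```

Rename the resource in the App Host and this reference follows automatically.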

Setting App Host as Startup Project
Since the App Host handles orchestration, you no longer need to start multiple projects individually. Just set the App Host project as the startup project — it will launch and manage all other registered services automatically.

Run and Explore the Aspire Dashboard
After building the solution, run the App Host project. You’ll notice that a built-in dashboard UI launches with it.

On the Resources tab, you’ll see all your registered services listed — these are the ones added via the DistributedApplication.CreateBuilder() API.

This dashboard provides a single place to,

  • View the status of your services
  • Access service endpoints
  • Preview, test, or manage them in real time.

We’ll dive deeper into all the features of the Aspire Dashboard in the next chapter, but feel free to explore it now and get a feel for how Aspire centralizes the management of your distributed apps.

How Does .NET Aspire Orchestration Improve Manageability?
Orchestration isn't just about launching services—it's about taming complexity with structure, predictability, and control.

What Do We Mean by "Manageability"?
Manageability in this context refers to how easily a developer or team can,

  • Configure and run multi-service applications
  • Handle dependencies between services
  • Inject configuration (env vars, ports, secrets)
  • Maintain consistency across environments (local/dev/test)
  • Onboard new team members quickly

.NET Aspire Orchestration helps on all fronts.

Comparing .NET Aspire Orchestration with Docker Compose
Docker Compose builds containers. Aspire builds confidence by making local orchestration feel like part of the codebase.

Why Does This Comparison Matter?
Both Docker Compose and .NET Aspire Orchestration aim to solve a similar problem:
“How do I spin up multiple dependent services locally in a predictable, repeatable way?”

But they approach it very differently.

Side-by-Side Breakdown

 

Feature                  | Docker Compose                             | .NET Aspire Orchestration
Setup Style              | YAML (declarative)                         | C# code (imperative, type-safe)
Primary Use Case         | Dev/prod container orchestration           | Dev-time orchestration for .NET apps
Learning Curve           | Familiar to DevOps teams                   | Friendly for .NET devs
Environment Setup        | Manual port/env config                     | Automatic wiring + port/env injection
Telemetry Integration    | Requires manual OpenTelemetry setup        | Built-in with AddServiceDefaults()
Service Discovery        | Use of service names and ports             | Seamless via .WithReference()
Startup Order Management | Controlled via depends_on                  | Inferred through dependency references
Dashboard/Insights       | None (requires external tools)             | Built-in Aspire Dashboard
Production Readiness     | More production-aligned (with Docker/K8s)  | Dev-time only (not a prod orchestrator)
Tooling Integration      | Works across languages & stacks            | Tight integration with .NET ecosystem

DX (Developer Experience) with Docker Compose and Aspire

With Aspire (C#)

var redis = builder.AddRedis("cache");

var api = builder.AddProject<Projects.MyApp_Api>("api")
                 .WithReference(redis);

With Docker Compose (YAML)
services:
  redis:
    image: redis
    ports:
      - "6379:6379"
  api:
    build: ./api
    depends_on:
      - redis

Aspire gives you,

  • IntelliSense
  • Strong typing
  • Easier refactoring
  • Familiar project references

Where Docker Compose Still Shines?

  • Aspire is dev-time only, so for actual deployment scenarios, Docker Compose or Kubernetes is still necessary.
  • If you're working in a polyglot environment (e.g., Python + Node + .NET), Compose might still be the glue.
  • Aspire currently only supports orchestration of .NET projects and known service types (e.g., Redis, PostgreSQL, etc.).

What About Kubernetes?
Kubernetes is the heavyweight champion of production orchestration—but it’s overkill for your laptop.
While Kubernetes is the industry standard for production orchestration, it’s intentionally not used for local development in most cases. Here’s why

Kubernetes vs .NET Aspire Orchestration

 

Feature             | Kubernetes                                 | .NET Aspire Orchestration
Target Environment  | Production / Staging                       | Local Development
Complexity          | High (requires YAML, Helm, CLI tooling)    | Low (C# code + dotnet run)
Startup Time        | Slower, may need Minikube / k3s            | Instant via .NET CLI
Learning Curve      | Steep for app devs                         | Low, friendly for .NET developers
Tooling Integration | External dashboards (Grafana, Prometheus)  | Built-in Aspire Dashboard

Takeaway

  • Use Kubernetes when deploying to the cloud at scale.
  • Use Docker Compose for multi-container, multi-stack local dev.
  • Use .NET Aspire Orchestration for pure .NET solutions where developer experience and manageability matter.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: How to Use Blazor Server to Convert Images to Text (Easy OCR Example)?

clock April 11, 2025 08:49 by author Peter

Step 1. Create a Blazor Server Application
Launch Visual Studio and create a new Blazor Server App project.


Step 2. Install the Required NuGet Package
Open the Package Manager Console and install the Tesseract wrapper:

Install-Package Tesseract

Step 3. Download Language Data

  • Go to the Tesseract tessdata GitHub repo
  • Download eng.traineddata (for English)
  • Create a folder in your project named tessdata
  • Add the downloaded file to that folder
  • Set its properties: Copy to Output Directory → Copy if newer


Step 4. Create the OCR Page
@page "/"
@rendermode InteractiveServer
@using Tesseract
@inject IWebHostEnvironment Env

<PageTitle>Image OCR</PageTitle>

<h1>Image to Text (OCR)</h1>

<InputFile OnChange="HandleFileSelected" />
<br />
<button class="btn btn-primary mt-2" @onclick="ExtractTextFromImage">Extract Text</button>

@if (!string.IsNullOrEmpty(extractedText))
{
    <h3>Extracted Text:</h3>
    <pre>@extractedText</pre>
}

@code {
    private string? imagePath;
    private string? extractedText;

    async Task HandleFileSelected(InputFileChangeEventArgs e)
    {
        var file = e.File;
        var uploads = Path.Combine(Env.WebRootPath, "uploads");

        if (!Directory.Exists(uploads))
            Directory.CreateDirectory(uploads);

        imagePath = Path.Combine(uploads, file.Name);

        using var stream = File.Create(imagePath);
        // Allow uploads up to 10 MB (Blazor's default max is 512 KB)
        await file.OpenReadStream(maxAllowedSize: 10 * 1024 * 1024).CopyToAsync(stream);
    }

    void ExtractTextFromImage()
    {
        if (string.IsNullOrWhiteSpace(imagePath))
            return;

        var tessDataPath = Path.Combine(Env.ContentRootPath, "tessdata");

        using var engine = new TesseractEngine(tessDataPath, "eng", EngineMode.Default);
        using var img = Pix.LoadFromFile(imagePath);
        using var page = engine.Process(img);

        extractedText = page.GetText();
    }
}

Step 5. Run the Application

  • Run your Blazor Server app.
  • Navigate to the page in the browser (the component above is mapped to the root route "/").
  • Upload an image file with text.
  • View the extracted text displayed below.


Original Image


Output



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Understanding ASP.NET Core Web API Eager Loading

clock April 7, 2025 07:42 by author Peter

Consider that you are developing a blogging application. When a user calls your API to view a blog post, they expect to see not only the content but also the comments, and possibly the name of each commenter. Now consider this:

Should I retrieve each piece of data individually, or should I make a single, efficient request to the database?
If your instinct says, "One and done," you're thinking of eager loading. Let's examine what it is, how it works, and when to apply it in your ASP.NET Core Web API.

What Is Eager Loading?
Eager loading is a technique used with Entity Framework Core (EF Core) where you load related data alongside your main data in a single query. It's like ordering a combo meal instead of going back to the counter for fries and then again for your drink.

By using .Include() and .ThenInclude(), you’re telling EF Core:
Hey, while you’re grabbing that Blog, go ahead and pull in the Posts and their Comments too.

Why Use Eager Loading?

Here’s the deal

  • Performance boost: One query instead of many.
  • Avoid the N+1 problem: Imagine fetching 1 blog and 50 posts with separate queries—EF will hit the database 51 times!
  • Cleaner API results: Users don’t have to make extra requests to get the full picture.
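To make the N+1 problem concrete, here is a sketch of the anti-pattern without eager loading (the entities and _context are the ones defined in this article; blogId stands in for a request parameter):

```csharp
// N+1 anti-pattern: one query for the posts, then one more query per post
var posts = await _context.Posts
    .Where(p => p.BlogId == blogId)
    .ToListAsync(); // query #1

foreach (var post in posts)
{
    // each iteration is another round trip to the database
    post.Comments = await _context.Comments
        .Where(c => c.PostId == post.PostId)
        .ToListAsync(); // queries #2..#51 for 50 posts
}
```

Eager loading collapses all of those round trips into a single query, as shown next.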

Real-World Example. Blog, Posts, and Comments
Let’s build a sample project:
public class Blog
{
    public int BlogId { get; set; }
    public string Title { get; set; }
    public List<Post> Posts { get; set; }
}

public class Post
{
    public int PostId { get; set; }
    public string Content { get; set; }
    public int BlogId { get; set; }
    public Blog Blog { get; set; }
    public List<Comment> Comments { get; set; }
}

public class Comment
{
    public int CommentId { get; set; }
    public string Message { get; set; }
    public string AuthorName { get; set; }
    public int PostId { get; set; }
    public Post Post { get; set; }
}


This setup gives us:

  • A Blog with many Posts
  • Each Post with many Comments

DbContext Setup
public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }

    public DbSet<Blog> Blogs { get; set; }
    public DbSet<Post> Posts { get; set; }
    public DbSet<Comment> Comments { get; set; }
}


Sample Seed Data
Let’s seed some mock data for testing:
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Blog>().HasData(new Blog { BlogId = 1, Title = "Tech with Jane" });

    modelBuilder.Entity<Post>().HasData(
        new Post { PostId = 1, Content = "ASP.NET Core is awesome!", BlogId = 1 },
        new Post { PostId = 2, Content = "Entity Framework Tips", BlogId = 1 }
    );

    modelBuilder.Entity<Comment>().HasData(
        new Comment { CommentId = 1, Message = "Loved it!", AuthorName = "John", PostId = 1 },
        new Comment { CommentId = 2, Message = "Very helpful!", AuthorName = "Alice", PostId = 1 },
        new Comment { CommentId = 3, Message = "Great explanation.", AuthorName = "Sara", PostId = 2 }
    );
}


Implementing Eager Loading in API
Here’s how to fetch a blog along with all its posts and the comments under each post:

[HttpGet("{id}")]
public async Task<IActionResult> GetBlogDetails(int id)
{
    var blog = await _context.Blogs
        .Include(b => b.Posts)
            .ThenInclude(p => p.Comments)
        .FirstOrDefaultAsync(b => b.BlogId == id);

    if (blog == null)
        return NotFound();

    return Ok(blog);
}


What’s happening here?
.Include(b => b.Posts) loads the Posts for the Blog.
.ThenInclude(p => p.Comments) loads the Comments for each Post.

All of this is done in one SQL query under the hood. No multiple round trips.
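If you want to verify that claim yourself, EF Core can print the SQL it plans to run. A small sketch using ToQueryString() (available since EF Core 5):

```csharp
var query = _context.Blogs
    .Include(b => b.Posts)
        .ThenInclude(p => p.Comments)
    .Where(b => b.BlogId == id);

// Prints the single SELECT (with JOINs) that EF Core generates for this query
Console.WriteLine(query.ToQueryString());
```

This is also a handy debugging trick whenever a LINQ query performs unexpectedly.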

Best Practice. Use DTOs (Data Transfer Objects)
Returning full entities (especially with nested objects) is risky—you might expose internal or sensitive data. So, let’s clean it up using a DTO:
public class BlogDto
{
    public string Title { get; set; }
    public List<PostDto> Posts { get; set; }
}

public class PostDto
{
    public string Content { get; set; }
    public List<CommentDto> Comments { get; set; }
}

public class CommentDto
{
    public string Message { get; set; }
    public string AuthorName { get; set; }
}

Then in your controller
[HttpGet("{id}")]
public async Task<IActionResult> GetBlogDto(int id)
{
    var blog = await _context.Blogs
        .Where(b => b.BlogId == id)
        .Include(b => b.Posts)
            .ThenInclude(p => p.Comments)
        .Select(b => new BlogDto
        {
            Title = b.Title,
            Posts = b.Posts.Select(p => new PostDto
            {
                Content = p.Content,
                Comments = p.Comments.Select(c => new CommentDto
                {
                    Message = c.Message,
                    AuthorName = c.AuthorName
                }).ToList()
            }).ToList()
        })
        .FirstOrDefaultAsync();

    if (blog == null) return NotFound();
    return Ok(blog);
}


Now your API returns only what the client needs. Clean. Efficient. Secure.
Important Things to Remember

Use eager loading when:

  • You know you’ll need related data.
  • You want to optimize performance with fewer database hits.

Avoid eager loading when:

  • The related data is huge, and you don’t need all of it.
  • It slows down the response or affects performance negatively.
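When the related data is large, a middle ground is a filtered include (EF Core 5+), which eagerly loads only a slice of the related rows in the same single query. A sketch:

```csharp
// Load the blog with only its 10 most recent posts, each with its comments
var blog = await _context.Blogs
    .Include(b => b.Posts
        .OrderByDescending(p => p.PostId)
        .Take(10))
        .ThenInclude(p => p.Comments)
    .FirstOrDefaultAsync(b => b.BlogId == id);
```

You keep the single-round-trip benefit without dragging the entire history over the wire.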

Output

 

Conclusion
Eager loading in ASP.NET Core Web API is a powerful technique that lets you efficiently fetch related data with Entity Framework Core. By leveraging .Include() and .ThenInclude(), you can solve the N+1 problem, eliminate unnecessary database round trips, and improve API performance. Eager loading must be used carefully, though: include only the data you need, and consider using DTOs to keep responses secure and tidy. Used properly, eager loading helps you build faster, better-organized, and more scalable APIs.



About HostForLIFE.eu

HostForLIFE.eu is European Windows Hosting Provider which focuses on Windows Platform only. We deliver on-demand hosting solutions including Shared hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes.

We have offered the latest Windows 2016 Hosting, ASP.NET Core 2.2.1 Hosting, ASP.NET MVC 6 Hosting and SQL 2017 Hosting.

