European ASP.NET 4.5 Hosting BLOG

BLOG about ASP.NET 4, ASP.NET 4.5 Hosting and Its Technology - Dedicated to European Windows Hosting Customer

European ASP.NET Core Hosting - HostForLIFE.eu :: How to Add Localisation to ASP.NET Core Application

clock January 25, 2017 06:07 by author Scott

In this post I'll walk through the process of adding localisation to an ASP.NET Core application using the recommended approach with resx resource files.

Introduction to Localisation

Localisation in ASP.NET Core is broadly similar to the way it works in ASP.NET 4.X. By default you would define a number of .resx resource files in your application, one for each culture you support. You then reference resources via a key, and depending on the current culture, the appropriate value is selected from the closest matching resource file.

While the concept of a .resx file per culture remains in ASP.NET Core, the way resources are used has changed quite significantly. In the previous version, when you added a .resx file to your solution, a designer file would be created, providing static strongly typed access to your resources through calls such as Resources.MyTitleString.

In ASP.NET Core, resources are accessed through two abstractions, IStringLocalizer and IStringLocalizer<T>, which are typically injected where needed via dependency injection. These interfaces have an indexer that allows you to access resources by a string key. If no resource exists for the key (i.e. you haven't created an appropriate .resx file containing the key), then the key itself is used as the resource.

Consider the following example:

using Microsoft.AspNetCore.Mvc; 
using Microsoft.Extensions.Localization;

public class ExampleClass 
{
    private readonly IStringLocalizer<ExampleClass> _localizer;
    public ExampleClass(IStringLocalizer<ExampleClass> localizer)
    {
        _localizer = localizer;
    }

    public string GetLocalizedString()
    {
        return _localizer["My localized string"];
    }
}

In this example, calling GetLocalizedString() will cause the IStringLocalizer<T> to check the current culture, and see if we have an appropriate resource file for ExampleClass containing a resource with the name/key "My localized string". If it finds one, it returns the localised version; otherwise, it returns "My localized string".

The idea behind this approach is to allow you to design your app from the beginning to use localisation, without having to do up front work to support it by creating the default/fallback .resx file. Instead, you can just write the default values, then add the resources in later.

Personally, I'm not sold on this approach - it makes me slightly twitchy to see all those magic strings around which are essentially keys into a dictionary. Any changes to the keys may have unintended consequences, as I'll show later in the post.
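
One small mitigation for that concern: the indexer actually returns a LocalizedString rather than a plain string, so you can at least detect when a lookup has fallen back to the key. A minimal sketch (the method name here is just for illustration):

public string GetLocalizedStringChecked()
{
    // The indexer returns a LocalizedString, not a plain string
    LocalizedString result = _localizer["My localized string"];

    if (result.ResourceNotFound)
    {
        // No resource matched the key for the current culture, so
        // result.Value is simply the key itself - log or assert here if needed
    }

    return result.Value;
}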

Adding localisation to your application

For now, I'm going to ignore that concern, and dive in using Microsoft's recommended approach. I've started from the default ASP.NET Core Web application without authentication.

The first step is to add the localisation services in your application. As we are building an MVC application, we'll also configure View localisation and DataAnnotations localisation. The localisation packages are already referenced indirectly by the Microsoft.AspNetCore.Mvc package, so you should be able to add the services and middleware directly in your Startup class:

public void ConfigureServices(IServiceCollection services) 
{
    services.AddLocalization(opts => { opts.ResourcesPath = "Resources"; });

    services.AddMvc()
        .AddViewLocalization(
            LanguageViewLocationExpanderFormat.Suffix,
            opts => { opts.ResourcesPath = "Resources"; })
        .AddDataAnnotationsLocalization();
}

These services allow you to inject the IStringLocalizer service into your classes. They also allow you to have localised View files (so you can have Views with names like MyView.fr.cshtml) and inject the IViewLocalizer, to allow you to use localisation in your view files. Calling AddDataAnnotationsLocalization configures the validation attributes to retrieve resources via an IStringLocalizer.

The ResourcesPath property on the options object specifies the folder of our application in which resources can be found. So if the root of our application is found at ExampleProject, we have specified that our resources will be stored in the folder ExampleProject/Resources.

Configuring these classes is all that is required to allow you to use the localisation services in your application. However you will typically also need some way to select what the current culture is for a given request.

To do this, we use the RequestLocalizationMiddleware. This middleware uses a number of different providers to try and determine the current culture. To configure it with the default providers, we need to decide which cultures we support, and which is the default culture.

Note that the configuration example in the documentation didn't work for me, though the Localization.StarterWeb project they reference did, and is reproduced below.

public void ConfigureServices(IServiceCollection services) 
{
    // ... previous configuration not shown

    services.Configure<RequestLocalizationOptions>(
        opts =>
        {
            var supportedCultures = new[]
            {
                new CultureInfo("en-GB"),
                new CultureInfo("en-US"),
                new CultureInfo("en"),
                new CultureInfo("fr-FR"),
                new CultureInfo("fr"),
            };

            opts.DefaultRequestCulture = new RequestCulture("en-GB");
            // Formatting numbers, dates, etc.
            opts.SupportedCultures = supportedCultures;
            // UI strings that we have localized.
            opts.SupportedUICultures = supportedCultures;
        });
}

public void Configure(IApplicationBuilder app) 
{
    app.UseStaticFiles();
    var options = app.ApplicationServices.GetService<IOptions<RequestLocalizationOptions>>();
    app.UseRequestLocalization(options.Value);

    app.UseMvc(routes =>
    {
        routes.MapRoute(
            name: "default",
            template: "{controller=Home}/{action=Index}/{id?}");
    });
}

Using localisation in your classes

We now have most of the pieces in place to start adding localisation to our application. We don't yet have a way for users to select which culture they want to use, but we'll come to that shortly. For now, let's look at how we go about retrieving a localised string.

Controllers and services

Whenever you want to access a localised string in your services or controllers, you can inject an IStringLocalizer<T> and use its indexer property. For example, imagine you want to localise a string in a controller:

public class HomeController: Controller 
{
    private readonly IStringLocalizer<HomeController> _localizer;

    public HomeController(IStringLocalizer<HomeController> localizer)
    {
        _localizer = localizer;
    }

    public IActionResult Index()
    {
        ViewData["MyTitle"] = _localizer["The localised title of my app!"];
        return View(new HomeViewModel());
    }
}

Calling _localizer[] will look up the provided string based on the current culture and the type HomeController. Assuming we have configured our application as discussed previously, the HomeController resides in the ExampleProject.Controllers namespace, and we are currently using the fr culture, the localizer will look for either of the following resource files:

  • Resources/Controllers.HomeController.fr.resx
  • Resources/Controllers/HomeController.fr.resx

If a resource exists in one of these files with the key "The localised title of my app!" then it will be used, otherwise the key itself will be used as the resource. This means you don't need to add any resource files to get started with localisation - you can just use the default language string as your key and come back to add .resx files later.

Views

There are two kinds of localisation of views. As described previously, you can localise the whole view, duplicating it and editing as appropriate, and providing a culture suffix. This is useful if the views need to differ significantly between different cultures.

You can also localise strings in a similar way to that shown for the HomeController. Instead of an IStringLocalizer<T>, you inject an IViewLocalizer into the view. This handles HTML encoding a little differently, in that it allows you to store HTML in the resource and it won't be encoded before being output. Generally you'll want to avoid that however, and only localise strings, not HTML.

The IViewLocalizer uses the name of the View file to find the associated resources, so for the HomeController's Index.cshtml view, with the fr culture, the localiser will look for:

  • Resources/Views.Home.Index.fr.resx
  • Resources/Views/Home/Index.fr.resx

The IViewLocalizer is used in a similar way to IStringLocalizer<T> - pass in the string in the default language as the key for the resource:

@using Microsoft.AspNetCore.Mvc.Localization
@model AddingLocalization.ViewModels.HomeViewModel
@inject IViewLocalizer Localizer
@{
    ViewData["Title"] = Localizer["Home Page"];
}
<h2>@ViewData["MyTitle"]</h2> 

DataAnnotations

One final common area that needs localisation is DataAnnotations. These attributes can be used to provide validation, naming and UI hints of your models to the MVC infrastructure. When used, they provide a lot of additional declarative metadata to the MVC pipeline, allowing selection of appropriate controls for editing the property etc.

Error messages for DataAnnotation validation attributes all pass through an IStringLocalizer<T> if you configure your MVC services using AddDataAnnotationsLocalization(). As before, this allows you to specify the error message for an attribute in your default language in code, and use that as the key to other resources later.

public class HomeViewModel 
{
    [Required(ErrorMessage = "Required")]
    [EmailAddress(ErrorMessage = "The Email field is not a valid e-mail address")]
    [Display(Name = "Your Email")]
    public string Email { get; set; }
}

Here you can see we have three DataAnnotation attributes, two of which are ValidationAttributes, and the DisplayAttribute, which is not. The ErrorMessage specified for each ValidationAttribute is used as a key to lookup the appropriate resource using an IStringLocalizer<HomeViewModel>. Again, the files searched for will be something like:

  • Resources/ViewModels.HomeViewModel.fr.resx
  • Resources/ViewModels/HomeViewModel.fr.resx

A key thing to be aware of is that the DisplayAttribute is not localised using the IStringLocalizer<T>. This is far from ideal, but I'll address it in my next post on localisation.

Allowing users to select a culture

With all this localisation in place, the final piece of the puzzle is to actually allow users to select their culture. The RequestLocalizationMiddleware uses an extensible provider mechanism for choosing the current culture of a request, but it comes with three providers built in:

  • QueryStringRequestCultureProvider
  • AcceptLanguageHeaderRequestCultureProvider
  • CookieRequestCultureProvider

These allow you to specify a culture in the querystring (e.g. ?culture=fr-FR), via the Accept-Language header in a request, or via a cookie. Of the three approaches, using a cookie is the least intrusive, as it is seamlessly sent with every request, and does not require the user to set the Accept-Language header in their browser or add to the querystring with every request.

Again, the Localization.StarterWeb sample project provides a handy implementation that shows how you can add a select box to the footer of your project to allow the user to set the language. Their choice is stored in a cookie, which is handled by the CookieRequestCultureProvider for each request. The provider then sets the CurrentCulture and CurrentUICulture of the thread for the request to the user's selection.

To add the selector to your application, create a partial view _SelectLanguagePartial.cshtml in the Shared folder of your Views:

@using System.Threading.Tasks
@using Microsoft.AspNetCore.Builder
@using Microsoft.AspNetCore.Localization
@using Microsoft.AspNetCore.Mvc.Localization
@using Microsoft.Extensions.Options

@inject IViewLocalizer Localizer
@inject IOptions<RequestLocalizationOptions> LocOptions

@{
    var requestCulture = Context.Features.Get<IRequestCultureFeature>();
    var cultureItems = LocOptions.Value.SupportedUICultures
        .Select(c => new SelectListItem { Value = c.Name, Text = c.DisplayName })
        .ToList();
}

<div title="@Localizer["Request culture provider:"] @requestCulture?.Provider?.GetType().Name"> 
    <form id="selectLanguage" asp-controller="Home"
          asp-action="SetLanguage" asp-route-returnUrl="@Context.Request.Path"
          method="post" class="form-horizontal" role="form">
        @Localizer["Language:"] <select name="culture"
                                        asp-for="@requestCulture.RequestCulture.UICulture.Name" asp-items="cultureItems"></select>
        <button type="submit" class="btn btn-default btn-xs">Save</button>

    </form>
</div> 

We want to display this partial on every page, so update the footer of your _Layout.cshtml to reference it:

<footer> 
    <div class="row">
        <div class="col-sm-6">
            <p>&copy; 2016 - Adding Localization</p>
        </div>
        <div class="col-sm-6 text-right">
            @await Html.PartialAsync("_SelectLanguagePartial")
        </div>
    </div>
</footer> 

Finally, we need to add the controller code to handle the user's selection. This currently maps to the SetLanguage action in the HomeController:

[HttpPost]
public IActionResult SetLanguage(string culture, string returnUrl) 
{
    Response.Cookies.Append(
        CookieRequestCultureProvider.DefaultCookieName,
        CookieRequestCultureProvider.MakeCookieValue(new RequestCulture(culture)),
        new CookieOptions { Expires = DateTimeOffset.UtcNow.AddYears(1) }
    );

    return LocalRedirect(returnUrl);
}

And that's it! If we fire up the home page of our application, you can see the culture selector in the bottom right corner. At this stage, I have not added any resource files, but if I trigger a validation error, you can see that the resource key is used for the resource itself.

My development flow is not interrupted by having to go and mess with resource files; I can just develop the application using the default language and add .resx files later in development. If I later add appropriate resource files for the fr culture, and a user changes their culture via the selector, I can see the effect of localisation in the validation attributes and other localised strings.

As you can see, the validation attributes and page title are localised, but the label field 'Your Email' has not, as that is set in the DisplayAttribute.

Summary

In this post I showed how to add localisation to your ASP.NET Core application using the recommended approach of providing resources for the default language as keys, and only adding additional resources as required later.

In summary, the steps to localise your application are roughly as follows:

1. Add the required localisation services
2. Configure the localisation middleware and, if necessary, a culture provider
3. Inject IStringLocalizer<T> into your controllers and services to localise strings
4. Inject IViewLocalizer into your views to localise strings in views
5. Add resource files for non-default cultures
6. Add a mechanism for users to choose their culture



European ASP.NET Core Hosting - HostForLIFE.eu :: Dependency Injection in ASP.NET Core

clock January 16, 2017 11:12 by author Scott

One of the nice things that the new ASP.NET Core stack brings to the table, is Dependency Injection (DI) as a first-class citizen, right out of the box. DI is nothing new, even for ASP.NET, but in the earlier versions, it wasn't baked into the platform, and developers were forced to jump through hoops in order to enable it.

Let's look at the status quo and how things are changing for the better with the new DI system in ASP.NET Core...

Status quo

Because of the history of ASP.NET, and the timelines and factoring of its different products - WebForms, MVC, SignalR, Katana (OWIN) and Web API - they've each had their own way of doing DI. Some products have extensibility points that you can leverage in order to plug in an Inversion of Control (IoC) container:

  • Web API: System.Web.Http.Dependencies.IDependencyResolver and System.Web.Http.Dependencies.IDependencyScope
  • MVC: System.Web.Mvc.IDependencyResolver
  • SignalR: Microsoft.AspNet.SignalR.IDependencyResolver

While others, like WebForms and Katana, don't. Some will argue that the IDependencyResolver-type abstraction, which is essentially an implementation of the Service Locator pattern, is an anti-pattern and should be avoided, but that's a discussion for another day.

There are also other ways of achieving DI within some of the frameworks; MVC has IControllerFactory and IControllerActivator, Web API has IHttpControllerActivator etc. All of these are extensibility points that you can implement in order to leverage DI in your controllers.

Implementing these abstractions yourself isn't something that you typically want or should have to do. Most IoC containers have already implemented these adapters for you and ship them as NuGet packages. If we take Autofac as an example, some adapters include:

  • Autofac.Mvc4
  • Autofac.Mvc5
  • Autofac.Owin
  • Autofac.WebApi
  • Autofac.WebApi2
  • Autofac.SignalR
  • Autofac.Web (WebForms)

As you can see, it quickly starts to add up - and this is just for a single container! Imagine if I'd compiled a list for the gazillion different IoC containers in the .NET space. Each of the adapters needs to be maintained, updated, versioned etc. That's a big burden on the adapter maintainers and the community in general.

On the consuming side of this, for a typical web application using MVC, SignalR and Web API, you'd end up needing three (or more) of these adapters, in order to leverage DI across the application.

The future

Even though a lot of ideas and code have been carried forward from Katana, ASP.NET Core is by all means a re-imagining, re-write, re-EVERYTHING of the entire, current ASP.NET stack. Hell, it's even triggered a re-jigging of the entire .NET (Core) platform and tooling. This means that it's a perfect time to bring DI into the platform itself, and make all components on top benefit from a single, unified way of doing DI.

Say hello to IServiceProvider! Even though the interface itself isn't new (it's been living in mscorlib under the System namespace since .NET 1.1), it's found new life in the ASP.NET Core DI system. It's also accompanied by a couple of new interfaces: IServiceCollection, which is essentially a builder for an IServiceProvider, and IServiceScope, which is intended for resolving services within a specific lifetime scope, like per-request.

In order for things to Just Work™, out of the box, Microsoft have implemented a lightweight IoC container that ships with the ASP.NET Core hosting layer. It's in the Microsoft.Extensions.DependencyInjection NuGet package.
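
To make the relationship between these three abstractions concrete, here is a minimal sketch using that default container (the IDatabaseSession types are placeholder examples, mirroring the registrations shown further below):

using System;
using Microsoft.Extensions.DependencyInjection;

public interface IDatabaseSession { }
public class DatabaseSession : IDatabaseSession { }

public static class ServiceProviderSketch
{
    public static void Run()
    {
        // IServiceCollection is the builder: register services and their lifetimes here
        IServiceCollection services = new ServiceCollection();
        services.AddScoped<IDatabaseSession, DatabaseSession>();

        // IServiceProvider is the built container that resolves services
        IServiceProvider provider = services.BuildServiceProvider();

        // IServiceScope marks a lifetime boundary, e.g. per-request;
        // scoped services are disposed when the scope is disposed
        using (IServiceScope scope = provider.CreateScope())
        {
            var session = scope.ServiceProvider.GetRequiredService<IDatabaseSession>();
            // use the session within this scope...
        }
    }
}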

When ASP.NET Core is bootstrapped, it creates an instance of IServiceCollection and passes it to user code using the ConfigureServices method of the Startup class:

public class Startup 
{
    public void ConfigureServices(IServiceCollection services)
    {
        // This method gets called by the runtime.
        // Use this method to add services to the container.

         // Adds the services MVC requires to run.
        services.AddMvc();

        // Add some custom services
        services.AddSingleton<ICache, Cache>();
        services.AddScoped<IDatabaseSession, DatabaseSession>();
    }

    // ...
}

In this method, you're free to add whatever services your application needs, and they will magically be available for constructor injection across the board. Different components in the stack also ship with extension methods to conveniently add the services the component needs to the collection, like AddMvc (shown above), AddCors, AddEntityFramework etc.
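
For example, once ICache has been registered as shown above, a controller (or any other service) can simply ask for it in its constructor. The following is a minimal sketch rather than code from the original post:

using Microsoft.AspNetCore.Mvc;

public class ProductsController : Controller
{
    private readonly ICache _cache;

    // The container sees the ICache parameter and supplies the singleton
    // Cache instance registered in ConfigureServices
    public ProductsController(ICache cache)
    {
        _cache = cache;
    }

    public IActionResult Index()
    {
        // use _cache to read or store data here...
        return View();
    }
}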

Now, it's important to note that the default implementation, living in Microsoft.Extensions.DependencyInjection, is a deliberately lightweight, feature-poor (is that a word?), fast implementation of an IoC container. It has just the features needed for the runtime/platform/framework to compose itself and run. A "lowest common denominator" feature set, if you will. If you want more advanced features, like many do, Microsoft actively encourages you to Bring Your Own Container (BYOC), or layer the functionality on top, which I've done with Scrutor. This brings us back to IoC container adapters.

If you want to use a third party container, you have to, like before, implement your own version of IServiceProvider (and its accompanying interfaces), or use an adapter that someone in the community has already provided. There are already several of these available, such as the Autofac and StructureMap adapters.

The difference this time is that you only need a single adapter to enable DI across the board. To plug in the adapter, you have to change the return type of the ConfigureServices method to IServiceProvider and return the adapter implementation. Using StructureMap.Dnx as an example, let's look at our startup class again:

public class Startup 
{
    public IServiceProvider ConfigureServices(IServiceCollection services)
    {
        // This method gets called by the runtime.
        // Use this method to add services to the container.

        // Adds the services MVC requires to run.
        services.AddMvc();

        // Add some custom services
        services.AddSingleton<ICache, Cache>();
        services.AddScoped<IDatabaseSession, DatabaseSession>();

        // Create an instance of a StructureMap container.
        var container = new Container();

        // Here we can add stuff to container, using StructureMap-specific APIs...

        // Populate the StructureMap container with
        // services from the IServiceCollection.
        container.Populate(services);

        // Resolve the StructureMap-specific IServiceProvider
        // and return it to the runtime.
        return container.GetInstance<IServiceProvider>();
    }

    // ...
}

By doing this, all components will resolve their services from the StructureMap container, and you'll be able to utilize the full feature set of StructureMap, like its awesome diagnostics, property injection, convention based registrations, profiles, decoration etc.

This post turned out longer than I expected, just to show a couple of lines of code at the end, but I thought it would be interesting to put everything in perspective, and hopefully you found it interesting too. As you can see, the DI story has been greatly simplified in this new world, while still allowing you, as an application, library or framework developer, to utilize DI across the board with minimal effort.

 

 



European ASP.NET SignalR Hosting - HostForLIFE.eu :: Using ASP.NET SignalR for Chat Application

clock December 5, 2016 10:04 by author Scott

ASP.NET SignalR is one of the major revolutions in the development technology world these days for creating real-time applications. Consider an application with a page where you are required to update the user interface with the latest data as soon as it is available. Such applications are said to be real-time applications, where the UI is updated as soon as the latest data is available for the user.

The best example is a stock market application that must keep the user interface updated with the data as soon as the stock rates change. Another example is a chat application that updates the receiver with the latest message from the sender. Some techniques use timer-based requests to the server for getting the data and some use polling to get the data. A good alternative to these is SignalR.

As we know, our model of the web applications mainly consists of a server that hosts the application and the client, representing the end users of the applications. For creating SignalR based applications, we need to create a hub on the server that is responsible for updating the client or the end users with the latest data. This hub is nothing but a simple class inheriting from the Hub class. At the client end, we get an automatically generated JavaScript based proxy file (at run time) that can be used by the client for connecting with the hub. The client code also contains a JavaScript function that the hub uses to provide them the latest data.

To explain this process in detail, our hub directly calls the client JavaScript functions to provide the latest data. On the other hand, by using the auto-generated JavaScript proxy file at the client end, the client code can use this proxy to call the methods of the server hub, if required. The term 'if required' is used here deliberately, because the client may not be required to call the hub. For example, in an application where we have a user dashboard with some data from a database, we can use SqlDependency and SignalR to keep the user interface updated whenever there is any change in the database records. In this case, the client is not required to make any calls to the server for getting the updates. On the other hand, if we have a chat application, the client code will call the server hub and forward the message. The hub will then broadcast this message to the users of the application, by calling the JavaScript method of the clients.

One very important point from the preceding paragraph is that the client never calls the hub for getting the latest data. The client may only call the hub so that the hub can forward the message to the other connected clients. If the client code needs to call the server for the latest data, then the entire purpose of using SignalR fails and we could have used the old concepts of a timer or page refresh for this.

Build Simple Group Chat Application in 15 minutes Using SignalR

Yes, that's correct. Once you are familiar with the basic concept/flow of SignalR, you can do it very easily. We will be now creating a group chat, without the use of a database. If we think about the flow of the application, this entire process requires a client to send a message to the server that will broadcast this message to all the connected client users. So the server needs to have a component that can broadcast the message to the clients. This role is played by the hub class.

Create a new project named SignalRChat. Add the references to the SignalR libraries using NuGet. It will automatically add the references to the OWIN hosting option libraries that allow addition of the SignalR application to the OWIN pipeline. Apart from the server libraries, it also adds the client libraries required for using SignalR.

Create OWIN host for the application

We will be using OWIN based hosting to host this application. Without going into depth about OWIN hosting, let's add a class named Startup.cs. The name must be Startup as per the OWIN hosting specifications, and the assembly must be decorated with an attribute specifying that this Startup class is the starting point of the application. Next we define a method named Configuration and register SignalR in the OWIN pipeline using app.MapSignalR(), as sketched below.
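
This is roughly what that Startup class looks like (the SignalRChat namespace comes from the project name above):

using Microsoft.Owin;
using Owin;

[assembly: OwinStartup(typeof(SignalRChat.Startup))]

namespace SignalRChat
{
    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            // Register SignalR (and its /signalr/hubs endpoint) in the OWIN pipeline
            app.MapSignalR();
        }
    }
}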

Create the Hub on the Server

Our next step is to create the hub on the server, which is nothing but a class file. We will name it SignalRChatHub and derive it from the Hub class. It will contain a method named BroadCastMessage with two parameters. The client code will use this method (via the proxy generated at its end) to communicate with the hub, and the parameters carry the data to be sent to the hub. Inside this method, use the Clients property of the Hub class to call the client-side function. This function is a simple JavaScript function that will receive the data sent by the hub and update the chat window of the user. We will define a function with the name receiveMessage at the client end (later in the code). So for now, we will use this method name.

A point to be noted here is that we will not get any IntelliSense help for the client methods, of course. The client method will be resolved dynamically. A sketch of such a hub is shown below.
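
Based on the description above, the hub would look roughly like this (the parameter names are assumptions for illustration, not the original code):

using Microsoft.AspNet.SignalR;

public class SignalRChatHub : Hub
{
    // Called by the client through the generated proxy; the parameters are
    // the data sent by the client.
    public void BroadCastMessage(string userName, string message)
    {
        // Invoke the JavaScript function 'receiveMessage' on all connected clients.
        // Client methods are resolved dynamically, so there is no IntelliSense here.
        Clients.All.receiveMessage(userName, message);
    }
}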

Setup the Client code

The server setup is done. We will now add an HTML page named ChatWindow.html that will be our chat window for the user. We also add references to jquery-1.6.4.min.js and jquery.signalR-2.2.0.min.js on this page, which were added by the NuGet package manager. Earlier we discussed that SignalR automatically generates a proxy class at run time for the client code to connect with the hub, so we also need to reference this file. Since this file is generated at run time, it does not exist physically for us. As the SignalR tutorials on the official ASP.NET website state:

SignalR creates the JavaScript code for the proxy on the fly and serves it to the client in response to the "/signalr/hubs" URL.

So we need to add this reference also.

We also have the option to disable this auto-generated proxy file generation (that is out of scope for this discussion) and create the proxy ourselves. In that case, we need to reference that file accordingly. Next, let's add some HTML mark-up to generate the chat window and design it using CSS styling.

Now it's time for the client code to connect with the hub and send the message, so that the hub can broadcast it to all the users. For this, we first get the proxy instance in a variable named chatProxy. Note the camel case syntax used in the client code. This is the convention to be followed when creating SignalR applications. For detailed specifications, I recommend you check out the official ASP.NET SignalR website. Without going further into the details, let's move forward with the code. Here, signalRChatHub is the name of the hub on the server (that we created on the server earlier).

Next, we attach the button click event (the button to send the message to the users) once the connection to the hub has successfully started. This event calls the hub method, using the proxy instance, which receives the message and broadcasts it to the users.

We also declare the function which the hub will call when it needs to update all the users with the message received. This is the function name we referred to from the hub method at the start of the discussion, so this function acts as a kind of callback function here.

All the client code is now set up and we can run the application.

We can now run the application. Copy the URL and open another instance of the browser or any other browser and we can start chatting.

 



European ASP.NET Core 1.0 Hosting - HostForLIFE.eu :: How to Publish Your ASP.NET Core in IIS

clock November 3, 2016 09:27 by author Scott

When you build ASP.NET Core applications and you plan on running your applications on IIS you'll find that the way that Core applications work in IIS is radically different than in previous versions of ASP.NET.

In this post I'll explain how ASP.NET Core runs in the context of IIS and how you can deploy your ASP.NET Core application to IIS.

Setting Up Your IIS and ASP.NET Core

The most important thing to understand about hosting ASP.NET Core is that it runs as a standalone, out of process Console application. It's not hosted inside of IIS and it doesn't need IIS to run. ASP.NET Core applications have their own self-hosted Web server and process requests internally using this self-hosted server instance.

You can however run IIS as a front end proxy for ASP.NET Core applications, because Kestrel is a raw Web server that doesn't support all the features a full server like IIS supports. This is actually a recommended practice on Windows in order to provide port 80/443 forwarding, which Kestrel doesn't support directly. On Windows, IIS (or another reverse proxy) will continue to be an important part of the server even with ASP.NET Core applications.

Run Your ASP.NET Core Site

Running your ASP.NET Core site is quite different from previous ASP.NET versions. ASP.NET Core runs its own web server using the Kestrel component. Kestrel is a .NET Web Server implementation that has been heavily optimized for throughput performance. It's fast and functional in getting network requests into your application, but it's 'just' a raw Web server. It does not include Web management services like a full featured server such as IIS does.

If you run on Windows you will likely want to run Kestrel behind IIS to gain infrastructure features like port 80/443 forwarding via Host Headers, process lifetime management and certificate management to name a few.

ASP.NET Core applications are standalone Console applications invoked through the dotnet runtime command. They are not loaded into an IIS worker process, but rather loaded through a native IIS module called AspNetCoreModule that executes the external Console application.
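
To make that concrete, the Main method of a typical ASP.NET Core 1.0 application self-hosts Kestrel along these lines (a standard template sketch, where Startup is the application's own startup class):

using System.IO;
using Microsoft.AspNetCore.Hosting;

public class Program
{
    public static void Main(string[] args)
    {
        var host = new WebHostBuilder()
            .UseKestrel()                      // the self-hosted web server
            .UseContentRoot(Directory.GetCurrentDirectory())
            .UseIISIntegration()               // lets IIS act as a reverse proxy in front of Kestrel
            .UseStartup<Startup>()
            .Build();

        host.Run();
    }
}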

Once you've installed the hosting bundle (or you install the .NET Core SDK on your Dev machine) the AspNetCoreModule is available in the IIS native module list:

The AspNetCoreModule is a native IIS module that hooks into the IIS pipeline very early in the request cycle and immediately redirects all traffic to the backend ASP.NET Core application. All requests - even those mapped to top level handlers like ASPX - bypass the IIS pipeline and are forwarded to the ASP.NET Core process. This means you can't easily mix ASP.NET Core and other frameworks in the same Site/Virtual directory, which feels a bit like a step back given that you could easily mix frameworks before in IIS.

While the IIS Site/Virtual still needs an IIS Application Pool to run in, the Application Pool should be set to use No Managed Code. Since the App Pool acts merely as a proxy to forward requests, there's no need to have it instantiate a .NET runtime.

The AspNetCoreModule's job is to ensure that your application gets loaded when the first request comes in and that the process stays loaded if for some reason the application crashes. You essentially get the same behavior as classic ASP.NET applications that are managed by WAS (Windows Activation Service).

Once running, incoming Http requests are handled by this module and then routed to your ASP.NET Core application.

So, requests come in from the Web into the kernel mode http.sys driver, which routes them into IIS on the primary port (80) or SSL port (443). The request is then forwarded to your ASP.NET Core application on the HTTP port configured for your application, which is not port 80/443. In essence, IIS acts as a reverse proxy, simply forwarding requests to your ASP.NET Core application running the Kestrel Web server on a different port.

Kestrel picks up the request and pushes it into the ASP.NET Core middleware pipeline which then handles your request and passes it on to your application logic. The resulting HTTP output is then passed back to IIS which then pushes it back out over the Internet to the HTTP client that initiated the request - a browser, mobile client or application.

The AspNetCoreModule is configured via the web.config file found in the application's root, which points at the startup command (dotnet) and argument (your application's main DLL) used to launch the .NET Core application. The configuration in the web.config file points the module at your application's root folder and the startup DLL that needs to be launched.

Here's what the web.config looks like:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <!--
    Configure your application settings in appsettings.json. Learn more at http://go.microsoft.com/fwlink/?LinkId=786380
  -->
  <system.webServer>
    <handlers>
      <add name="aspNetCore" path="*" verb="*"
        modules="AspNetCoreModule" resourceType="Unspecified" />
    </handlers>
    <aspNetCore processPath="dotnet"
                arguments=".\AlbumViewerNetCore.dll"
                stdoutLogEnabled="false"
                stdoutLogFile=".\logs\stdout"
                forwardWindowsAuthToken="false" />
  </system.webServer>
</configuration>

You can see that the module references dotnet.exe and the compiled entry point DLL that holds your Main method in your .NET Core application.

IIS is Recommended!

We've already discussed that when running ASP.NET Core on Windows, it's recommended you use IIS as a front end proxy. While it's possible to directly access Kestrel via an IP Address and available port, there are a number of reasons why you don't want to expose your application directly this way in production environments.

First and foremost, if you want to have multiple applications running on a single server that all share port 80 and port 443 you can't run Kestrel directly. Kestrel doesn't support host header routing which is required to allow multiple port 80 bindings on a single IP address. Without IIS (or http.sys actually) you currently can't do this using Kestrel alone (and I think this is not planned either).

The AspNetCoreModule running through IIS also provides the necessary process management to ensure that your application gets loaded on the first access, ensures that it stays up and running and is restarted if it crashes. The AspNetCoreModule provides the required process management to ensure that your AspNetCore application is always available even after a crash.

It's also a good idea to run secure SSL requests through IIS proper by setting up certificates through the IIS certificate store and letting IIS handle the SSL authentication. The backplane HTTP request from IIS can then simply fire a non-secure HTTP request to your application. This means only the front end IIS server needs a certificate, even if you have multiple servers on the backplane serving the actual HTTP content.

IIS can also provide static file serving, gzip compression of static content, static file caching, Url Rewriting and a host of other features that IIS provides natively. IIS is really good and efficient at processing non-application requests, so it's worthwhile to take advantage of that. You can let IIS handle the tasks that it's really good at, and leave the dynamic tasks to pass through to your ASP.NET Core application.

The bottom line for all of this is if you are hosting on Windows you'll want to use IIS and the AspNetCoreModule.

How to Publish ASP.NET Core in IIS

In order to run an application with IIS you have to first publish it. There are two ways that you can do this today:

1. Use dotnet publish

Using dotnet publish builds your application and copies a runnable, self-contained version of the project to a new location on disk. You specify an output folder where all the files are published. This is not so different from classic ASP.NET which ran Web sites out of temp folders. With ASP.NET Core you explicitly publish an application into a location of your choice - the files are no longer hidden away and magically copied around.

A typical publish command may look like this:

dotnet publish
      --framework netcoreapp1.0
      --output "c:\temp\AlbumViewerWeb"
      --configuration Release

If you open this folder you'll find that it contains your original application structure plus all the nuget dependency assemblies dumped into the root folder:

Once you've published your application and you've moved it to your server (via FTP or other mechanism) we can then hook up IIS to the folder.

After that, just make sure you set the Application Pool to No Managed Code as shown above.

And that's really all that needs to happen. You should be able to now navigate to your site or Virtual and the application just runs.

You can now take this locally deployed Web site, copy it to a Web Server (via FTP or direct file copy or other publishing solution), set up a Site or Virtual and you are off to the races.

2. Publish Using Visual Studio

The dotnet publish step works to copy the entire project to a folder, but it doesn't actually publish your project to a Web site (currently - this is likely coming at a later point).

In order to get incremental publishing to work, which is really quite crucial for ASP.NET Core applications because there are so many dependencies, you need to use MsDeploy which is available as part of Visual Studio's Web Publishing features.

Currently the Visual Studio Tooling UI is very incomplete, but the underlying functionality is supported. I'll point out a few tweaks that you can use to get this to work today.

When you go into Visual Studio in the RC2 Web tooling and the Publish dialog, you'll find that you can't create a publish profile that points at IIS. There are options for file and Azure publishing but there's no way through the UI to create a new Web site publish.

However, you can cheat by creating your own .pubxml file and putting it into the \Properties\PublishProfiles folder in your project.

To create a 'manual profile' in your ASP.NET Core Web project:

  • Create a folder \Properties\PublishProfiles
  • Create a file <MyProfile>.pubxml

You can copy an existing .pubxml from a non-ASP.NET Core project or create one. Here's an example of a profile that works with IIS:

<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <WebPublishMethod>MSDeploy</WebPublishMethod>
    <LastUsedBuildConfiguration>Release</LastUsedBuildConfiguration>
    <LastUsedPlatform>Any CPU</LastUsedPlatform>
    <SiteUrlToLaunchAfterPublish>http://samples.west-wind.com/AlbumViewerCore/index.html</SiteUrlToLaunchAfterPublish>
    <LaunchSiteAfterPublish>True</LaunchSiteAfterPublish>
    <ExcludeApp_Data>False</ExcludeApp_Data>
    <PublishFramework>netcoreapp1.0</PublishFramework>
    <UsePowerShell>True</UsePowerShell>
    <EnableMSDeployAppOffline>True</EnableMSDeployAppOffline>
    <MSDeployServiceURL>https://publish.west-wind.com</MSDeployServiceURL>
    <DeployIisAppPath>samples site/albumviewercore</DeployIisAppPath>
    <RemoteSitePhysicalPath />
    <SkipExtraFilesOnServer>True</SkipExtraFilesOnServer>
    <MSDeployPublishMethod>RemoteAgent</MSDeployPublishMethod>
    <EnableMSDeployBackup>False</EnableMSDeployBackup>
    <UserName>username</UserName>
    <_SavePWD>True</_SavePWD>
    <ADUsesOwinOrOpenIdConnect>False</ADUsesOwinOrOpenIdConnect>
    <AuthType>NTLM</AuthType>
  </PropertyGroup>
</Project>

Once you've created a .pubxml file you can now open the publish dialog in Visual Studio with this Profile selected:

At this point you should be able to publish your site to IIS on a remote server and use incremental updates with your content.

And it's a Wrap

Currently IIS hosting and publishing is not particularly well documented and there are some rough edges around the publishing process. Microsoft knows of these issues and this will get fixed by RTM of ASP.NET Core.

In the meantime I hope this post has provided the information you need to understand how IIS hosting works and a few tweaks that let you use the publishing tools available to get your IIS applications running on your Windows Server.



European ASP.NET Core 1.0 Hosting - HostForLIFE.eu :: Setup Angular 2 in ASP.NET Core 1.0

clock October 12, 2016 00:11 by author Scott

This tutorial aims to get you started with Angular 2 in ASP.NET Core using Visual Studio 2015. With the release of Angular 2 and ASP.NET Core RC, building SPAs is becoming interesting.

I have compiled the steps involved in starting to learn Angular 2. This is a detailed explanation, and things will feel much easier by the end of the article.

Create Your ASP.NET Core Project

Open Visual Studio 2015 Community Edition Update 3, select a new Web project, name it "ngCoreContacts" and select the "Empty" project template. Don't forget to install the new web tools for ASP.NET Core 1.0.

I used Visual Studio 2015 Community Edition Update 3 (must update), TypeScript 2.0 (must), the latest NPM, and Gulp.

Setup ASP.NET Core to Serve Static Files

ASP.NET Core is designed as a pluggable framework that includes and uses only the necessary packages, instead of pulling in too much up front.

Let's create an HTML file named "index.html" under the wwwroot folder.

Right click on the wwwroot folder, select Add New Item and create the index.html file. This HTML page will act as the default page.

<!DOCTYPE html>
<html>
<head>
    <meta charset="utf-8" />
    <title>Angular 2 with ASP.NET Core</title>
</head>
<body>
    <h1>Demo of Angular 2 using ASP.NET Core with Visual Studio 2015</h1>
</body>
</html>

For ASP.NET Core to serve static files, we need to add the StaticFiles middleware in the Configure method of the Startup.cs class. Ensure that the packages are restored properly.

project.json has been redesigned to make it better; the Microsoft.AspNetCore.StaticFiles package provides the middleware that serves static assets like HTML and JS files.

public void Configure(IApplicationBuilder app)
        {
            app.UseDefaultFiles();
            app.UseStaticFiles();
        }

For reference, here is the full project.json with the Microsoft.AspNetCore.StaticFiles package added:

{
  "dependencies": {
    "Microsoft.NETCore.App": {
      "version": "1.0.1",
      "type": "platform"
    },
    "Microsoft.AspNetCore.Diagnostics": "1.0.0",
    "Microsoft.AspNetCore.Server.IISIntegration": "1.0.0",
    "Microsoft.AspNetCore.Server.Kestrel": "1.0.1",
    "Microsoft.Extensions.Logging.Console": "1.0.0",
    "Microsoft.AspNetCore.StaticFiles": "1.0.0"
  },

  "tools": {
    "Microsoft.AspNetCore.Server.IISIntegration.Tools": "1.0.0-preview2-final"
  },

  "frameworks": {
    "netcoreapp1.0": {
      "imports": [
        "dotnet5.6",
        "portable-net45+win8"
      ]
    }
  },

  "buildOptions": {
    "emitEntryPoint": true,
    "preserveCompilationContext": true,
    "compile": {
      "exclude": [ "node_modules" ]
    }
  },

  "runtimeOptions": {
    "configProperties": {
      "System.GC.Server": true
    }
  },

  "publishOptions": {
    "include": [
      "wwwroot",
      "web.config"
    ]
  },

  "scripts": {   
    "postpublish": [ "dotnet publish-iis --publish-folder %publish:OutputPath% --framework %publish:FullTargetFramework%" ]
  }
}

Run the application now and ASP.NET Core renders the static HTML page.

Delete this index.html page; we will be injecting it dynamically later. Until now, you have seen "wwwroot" acting as the root folder for ASP.NET Core web apps.

Setup Angular 2 in ASP.NET Core

Angular 2 famously claims to be ONE framework for MOBILE and DESKTOP apps. There won't be any breaking changes after the final release.

This tutorial refers to the 5 MIN QUICKSTART for getting started; it's more focused on other lightweight code editors, but here we are using Visual Studio 2015 Community Edition Update 3 for its built-in TypeScript tooling and other features.

We will be using Webpack as the module bundler; it's an excellent alternative to the SystemJS approach. To learn more about the inner details, read "webpack and Angular 2".

The majority of the webpack scripting is based on AngularClass's angular2-webpack-starter. I have modified it for ASP.NET Core web apps.

Adding NPM Configuration file for Angular 2 Packages

The Angular 2 team is publishing its code changes using NPM rather than a CDN or any other source, so we need to add an NPM configuration file (package.json) to this ASP.NET Core application.

Right click on "ngCoreContacts" and add a new "NPM Configuration File"; by default, package.json is added to the ASP.NET Core project. This acts as the Node Package Manager (NPM) file, a must for adding packages for Angular 2.

From the Angular 2 quick start referenced above, we need to add the dependencies required for Angular 2 in the ASP.NET Core application. Copy and paste the configuration below into the package.json file:

{
    "version": "1.0.0",
    "description": "ngcorecontacts",
    "main": "wwwroot/index.html",
  "scripts": {
    "build:dev": "webpack --config config/webpack.dev.js --progress --profile",   
    "build:prod": "webpack --config config/webpack.prod.js  --progress --profile --bail",
    "build": "npm run build:dev",   
    "server:dev:hmr": "npm run server:dev -- --inline --hot",
    "server:dev": "webpack-dev-server --config config/webpack.dev.js --progress --profile --watch --content-base clientsrc/",
    "server:prod": "http-server dist --cors",
    "server": "npm run server:dev",
    "start:hmr": "npm run server:dev:hmr",
    "start": "npm run server:dev",
    "version": "npm run build",
    "watch:dev:hmr": "npm run watch:dev -- --hot",
    "watch:dev": "npm run build:dev -- --watch",
    "watch:prod": "npm run build:prod -- --watch",
    "watch:test": "npm run test -- --auto-watch --no-single-run",
    "watch": "npm run watch:dev",   
    "webpack-dev-server": "webpack-dev-server",
    "webpack": "webpack"
  },
  "dependencies": {
    "@angular/common": "~2.0.1",
    "@angular/compiler": "~2.0.1",
    "@angular/core": "~2.0.1",
    "@angular/forms": "~2.0.1",
    "@angular/http": "~2.0.1",
    "@angular/platform-browser": "~2.0.1",
    "@angular/platform-browser-dynamic": "~2.0.1",
    "@angular/router": "~3.0.1",
    "@angular/upgrade": "~2.0.1",
    "angular-in-memory-web-api": "~0.1.1",
    "@angularclass/conventions-loader": "^1.0.2",
    "@angularclass/hmr": "~1.2.0",
    "@angularclass/hmr-loader": "~3.0.2",
    "@angularclass/request-idle-callback": "^1.0.7",
    "@angularclass/webpack-toolkit": "^1.3.3",
    "assets-webpack-plugin": "^3.4.0",
    "core-js": "^2.4.1",
    "http-server": "^0.9.0",
    "ie-shim": "^0.1.0",
    "rxjs": "5.0.0-beta.12",
    "zone.js": "~0.6.17",
    "@angular/material": "^2.0.0-alpha.9",
    "hammerjs": "^2.0.8"
  },
  "devDependencies": {
    "@types/hammerjs": "^2.0.33",
    "@types/jasmine": "^2.2.34",
    "@types/node": "^6.0.38",
    "@types/source-map": "^0.1.27",
    "@types/uglify-js": "^2.0.27",
    "@types/webpack": "^1.12.34",
    "angular2-template-loader": "^0.5.0",
    "awesome-typescript-loader": "^2.2.1",
    "codelyzer": "~0.0.28",
    "copy-webpack-plugin": "^3.0.1",
    "clean-webpack-plugin": "^0.1.10",
    "css-loader": "^0.25.0",
    "exports-loader": "^0.6.3",
    "expose-loader": "^0.7.1",
    "file-loader": "^0.9.0",
    "gh-pages": "^0.11.0",
    "html-webpack-plugin": "^2.21.0",
    "imports-loader": "^0.6.5",
    "json-loader": "^0.5.4",
    "parse5": "^1.3.2",
    "phantomjs": "^2.1.7",
    "raw-loader": "0.5.1",
    "rimraf": "^2.5.2",
    "source-map-loader": "^0.1.5",
    "string-replace-loader": "1.0.5",
    "style-loader": "^0.13.1",
    "sass-loader": "^3.1.2",   
    "to-string-loader": "^1.1.4",
    "ts-helpers": "1.1.1",
    "ts-node": "^1.3.0",
    "tslint": "3.15.1",
    "tslint-loader": "^2.1.3",
    "typedoc": "^0.4.5",
    "typescript": "2.0.3",
    "url-loader": "^0.5.7",
    "webpack": "2.1.0-beta.22",
    "webpack-dev-middleware": "^1.6.1",
    "webpack-dev-server": "^2.1.0-beta.2",
    "webpack-md5-hash": "^0.0.5",
    "webpack-merge": "^0.14.1"
  }
}

Right after saving this, ASP.NET Core starts restoring the packages. It will download all the packages mentioned in the dependencies section of the package.json above.

Sometimes in Solution Explorer you might see 'Dependencies – not installed'; don't worry, this is a bug in the tooling. All the npm packages are installed.

Add a TypeScript configuration file – a must for Angular 2 in ASP.NET Core using TypeScript

We are building the Angular 2 app in ASP.NET Core using TypeScript, so we need to include a TypeScript configuration file, which takes care of transpiling to JavaScript, module loading, and targeting the ES5 standard.

Add "tsconfig.json" to the project and copy and paste the configuration below:

{
  "compilerOptions": {
    "target": "es5",
    "module": "commonjs",
    "moduleResolution": "node",
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "allowSyntheticDefaultImports": true,
    "sourceMap": true,
    "noEmitHelpers": true,
    "strictNullChecks": false,
    "baseUrl": "./clientsrc",
    "paths": [],
    "lib": [
      "dom",
      "es6"
    ],
    "types": [
      "hammerjs",     
      "node",     
      "source-map",
      "uglify-js",
      "webpack"
    ]
  },
  "exclude": [
    "node_modules",
    "dist"
  ],
  "awesomeTypescriptLoaderOptions": {
    "forkChecker": true,
    "useWebpackText": true
  },
  "compileOnSave": false,
  "buildOnSave": false,
  "atom": { "rewriteTsconfig": false }
}

It’s mandatory to install TypeScript 2.o for working with Angular 2.

At present typings.json is not required because we are using @types with TypeScript. However if your using any other packages which don’t have entries in @types then typings.json has to be added.

Use Webpack as Module Bundler

Webpack is a powerful module bundler. A bundle is a JavaScript file that incorporates assets that belong together and should be served to the client in response to a single file request. A bundle can include JavaScript, CSS styles, HTML, and almost any other kind of file.

Webpack roams over your application source code, looking for import statements, building a dependency graph, and emitting one (or more) bundles. With plugin “loaders” Webpack can preprocess and minify different non-JavaScript files such as TypeScript, SASS, and LESS files.

In package.json, we have added the "webpack" packages as "devDependencies". They will perform all the bundling work.

What webpack does is described in a JavaScript configuration file known as webpack.config.js. As always, applications are run in Development, Test and Production environments.

There is some common functionality and some that is specific to each environment. We will focus on the development and production environments and write the configuration accordingly.

The development environment should have source maps for debugging TypeScript files; minifying the JS, CSS and other bundles is not necessary.

The production environment should minify bundles to reduce loading time and should not include source maps. Webpack 2 also does tree shaking, i.e. it eliminates unused code to further reduce bundle sizes.

webpack.config.js – based on the environment set in process.env.NODE_ENV, it runs either the dev or prod configuration.

webpack.common.js – before bundling environment specific files, it performs the tasks common to both environments.

  • Webpack splits Angular 2 apps into three files: polyfills (to maintain backward compatibility with older browsers), vendor (all JS, CSS, HTML, SCSS, images, JSON etc. in one file) and boot (application specific files)
  • It resolves modules based on various file extensions
  • Webpack itself doesn't know what to do with a non-JavaScript file. We teach it to process such files into JavaScript with loaders. For this, we have written loaders for TS, HTML, JSON, fonts and images
  • Any static assets placed in "clientsrc/assets" will be copied to the assets folder using CopyWebpackPlugin
  • CleanWebpackPlugin cleans the "wwwroot/dist" folder every time we run it, so that we get a fresh set of files.
  • I told you above to delete the index.html file; now clientsrc/index.html will be moved to wwwroot using HtmlWebpackPlugin. Webpack also injects the bundle files, i.e. the polyfills, vendor and boot JS files, and includes script references to them in the HTML.

Now let’s see webpack.dev.js for development purpose

  • Running “webpack-dev-server” – this runs entire application in memory, any changes to source file gets applied immediately
  • Loads application in debug mode with source map. Everything run in memory i.e. html, js, static files are loaded in memory.
  • Runs the application on localhost 3000 port. Port can be changed as your convenience

Now let’s see webpack.prod.js for production purpose

  • Merges all the bundle files and copies to wwwroot.
  • Minifies all files to load faster using UglifyJsPlugin plugin

Writing Angular 2 application

Until now we created the ASP.NET Core app and added the tsconfig file and webpack configuration. Now it's time to write the Angular 2 application.

In the GitHub repo, you can see the "clientsrc" folder. This contains the Angular 2 app which gets bundled using the webpack configurations we wrote.

The "clientsrc" folder has index.html, polyfills.browser.ts, vendor.browser.ts and, most importantly, boot.ts.

We have the app folder containing HTML, Angular 2 components and the root level module (app.module.ts), which gets loaded while bootstrapping the application.

Some of the files might not be interesting now; I will focus on them in separate articles later.

Running the application

Before running, make sure you have run the command "npm install". This might not be needed, but it will ensure all packages are installed.

Now let’s run the application in development mode

  • From the command line (in the same directory as package.json), type “npm start” and hit enter. It will start webpack-dev-server, which loads the application and listens on localhost:3000.
  • When the console says “bundle is now VALID”, open a browser and navigate to http://localhost:3000 to see the application load.

Notice that no files are copied to the wwwroot folder, because everything is running in memory.

Now that the application runs properly in the browser, let's understand how the Angular 2 app loads:

  • When the browser starts rendering the index.html page, it encounters the <my-app>Loading…</my-app> tag.
  • Angular’s platformBrowserDynamic module then bootstraps clientsrc/app/AppModule through the line platformBrowserDynamic().bootstrapModule(AppModule).
  • AppModule then loads app.component.ts, which is listed as the bootstrap entry in @NgModule.
  • AppComponent then matches the <my-app> tag via its selector and renders the UI from the TypeScript code.

When we enter “npm start” in the console to run the application, execution points to the scripts section of package.json, which maps to the command below:

webpack-dev-server --config config/webpack.dev.js --progress --profile --watch --content-base clientsrc/

This invokes webpack-dev-server, runs the development config and watches for changes in the clientsrc folder; any change there reloads the application with the changes applied.

Running the application in Production mode

Assuming the application is now ready to be deployed, we need a production (PROD) build. For this, run the command:

// builds the app and copies the output to wwwroot
npm run build:prod

Now if you look at the wwwroot folder, you will see the HTML and JS bundle files. This wwwroot folder can be deployed on any web server such as IIS or nginx.

You can either press F5 to run it from the Visual Studio IDE, or run the command npm run server:prod.



European ASP.NET Core Hosting - HostForLIFE.eu :: How to Fix Error 502.5 - Process Failure in ASP.NET Core

clock October 4, 2016 19:35 by author Scott

This is an issue that you sometimes face when deploying your ASP.NET Core application in a shared hosting environment.

Problem

HTTP Error 502.5 - Process Failure

Common causes of this issue:

* The application process failed to start
* The application process started but then stopped
* The application process started but failed to listen on the configured port

Although your hosting provider has set up .NET Core for you, you can still face the error above. So, how do you fix this problem?

Solution

There are two common issues that cause the above error.

#1. ASP.NET Core Version Mismatch

1. Your ASP.NET Core version may differ from the version installed on the server. Microsoft has just released the newest ASP.NET Core 1.0.1, and you may still be using ASP.NET Core 1.0. In that case, you must upgrade your .NET Core version.

2. When you upgrade your app, make sure that the ‘Microsoft.NETCore.App’ dependency in your project.json is changed to:

"Microsoft.NETCore.App": "1.0.1",

3. Also fix your app’s ‘type’ setting:

"Microsoft.NETCore.App": { "version": "1.0.1", "type": "platform" },

#2. Set the path to dotnet.exe in web.config

Here is an example of how to change your web.config file:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <!--
    Configure your application settings in appsettings.json. Learn more at https://go.microsoft.com/fwlink/?LinkId=786380
  -->
  <system.webServer>
    <handlers>
      <add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModule" resourceType="Unspecified" />
    </handlers>
    <aspNetCore processPath="C:\Program Files\dotnet\dotnet.exe" arguments=".\CustomersApp.dll" stdoutLogEnabled="true" stdoutLogFile=".\logs\stdout" forwardWindowsAuthToken="false" />
  </system.webServer>
</configuration>

Conclusion

We hope the tutorial above helps you fix this ASP.NET Core problem. It is based on our own experience solving the error, and we hope it helps you too. If you need ASP.NET Core hosting, you can visit our site at http://www.hostforlife.eu. We support the latest ASP.NET Core 1.0.1 hosting on our environment and we can make sure that it works fine.



European Entity Framework Core 1.0 Hosting - HostForLIFE.eu :: Speed Up Large Data Sets with .AsNoTracking

clock September 29, 2016 20:04 by author Scott

If you are experiencing performance issues with Entity Framework Core 1.0 when you try to include related objects in your LINQ and lambda expressions on record sets of just a few thousand records, then check out these examples on how to improve it. Of the many ways, the best way that I have found to negate those performance problems and to optimize your requests is by using .AsNoTracking()

What is AsNoTracking()
When you use .AsNoTracking() you are basically telling the context not to track the retrieved information. Sure, this turns off a useful EF feature, but in many cases (like building an API, in my case) this feature is not needed.

In summary, tracking is turned on by default and is great if you want to retrieve and save objects within the same context. But if this is not required, you should use AsNoTracking(). AsNoTracking() decouples the returned records from the context, making your queries much faster!

Example

IList<Player> playerList = db.Player.AsNoTracking().ToList();

Example with Joins

IList<Player> playerList = (from p in db.Player
                        select p)
                        .Include(p => p.Team)
                        .ThenInclude(t => t.League)
                        .AsNoTracking()
                        .ToList();
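
If nearly every query in a given context is read-only, repeating .AsNoTracking() on each query becomes noisy. As a rough sketch (assuming an EF Core DbContext called PlayerContext that exposes the Player set used above), tracking can also be switched off once for the whole context instance via ChangeTracker.QueryTrackingBehavior:

using Microsoft.EntityFrameworkCore;

// PlayerContext is a hypothetical DbContext exposing DbSet<Player> Player,
// matching the db.Player set used in the examples above.
using (var db = new PlayerContext())
{
    // Every query issued by this context instance now behaves as a no-tracking query.
    db.ChangeTracker.QueryTrackingBehavior = QueryTrackingBehavior.NoTracking;

    IList<Player> playerList = db.Player.ToList(); // no per-query .AsNoTracking() needed
}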

It is also worth noting that there is another way to decouple your objects from EF: adding .AsEnumerable(). That would look like this:

var decoupledList = from x in db.TableA.AsEnumerable() select x;

 



European ASP.NET 4.5 Hosting - UK :: How to Improve Your Entity Framework Performance

clock November 23, 2015 23:46 by author Scott

LINQ to Entity is a great ORM for querying and managing databases. There are some points that we should consider while designing and querying databases using the Entity Framework ORM.

Disable change tracking for entity if not needed

In the Entity Framework, the context is responsible for tracking changes in entity objects. Whenever we are only reading data, there is no need to track the entity object. We can disable entity tracking using the MergeOption, as in:

AdventureWorksEntities e = new AdventureWorksEntities();
e.EmployeeMasters.MergeOption = System.Data.Objects.MergeOption.NoTracking;

Use Pre-Generated Views to reduce response time for the first request

While working with Entity Framework, view generation may take a long time for a large and complicated model. We can use pre-generated views to save the run-time cost of generating views on the first request of a page. Pre-generating views helps us improve start-up performance at run time.

Avoid fetching all the fields if not required

Avoid fetching fields from the database that are not really required. For example, suppose we have an EmployeeMaster table and for some operation we require only the EmployeeCode field; the query should then be:

AdventureWorksEntities e = new AdventureWorksEntities();
string code = e.EmployeeMasters.Where(g => g.EmployeeId == 1000)
                               .Select(s => s.EmployeeCode)
                               .FirstOrDefault();

Retrieve only required number of records (rows)

Do not fetch all the data while binding a data grid. For example, if we have a data grid on a page and show only one page of records on the screen, do not fetch all records from the database; fetch only the current page, as in:

AdventureWorksEntities e = new AdventureWorksEntities();
int pageSize = 30, startingPageIndex = 3;
// Skip/Take require a sorted sequence in LINQ to Entities, and Skip must come before Take.
List<EmployeeMaster> lstemp = e.EmployeeMasters
                               .OrderBy(emp => emp.EmployeeId)
                               .Skip(startingPageIndex * pageSize)
                               .Take(pageSize)
                               .ToList();

Avoid using Contains

Avoid using the Contains method in LINQ to Entities queries where possible. When Entity Framework generates the equivalent SQL, Contains is converted into a "WHERE ... IN" clause, and the "IN" clause can have performance issues with large value lists.
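
For illustration, here is a hypothetical query against the EmployeeMaster set used in the earlier examples; the Contains call below is translated into a WHERE ... IN (...) clause, which can become expensive as the list of values grows:

AdventureWorksEntities e = new AdventureWorksEntities();
List<int> employeeIds = new List<int> { 1000, 1001, 1002 };

// Translated roughly to: SELECT ... FROM EmployeeMasters WHERE EmployeeId IN (1000, 1001, 1002)
List<EmployeeMaster> employees = e.EmployeeMasters
                                  .Where(emp => employeeIds.Contains(emp.EmployeeId))
                                  .ToList();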

Avoid using Views

Database views can degrade LINQ query performance, so avoid using views in LINQ to Entities where possible.

Use Compiled Query

If you are using Entity Framework 4.1 or below, you should use compiled queries. The CompiledQuery class provides compilation and caching of queries for reuse. Entity Framework 5.0 supports a new feature called auto-compiled LINQ queries: with EF 5.0, LINQ to Entities queries are compiled automatically and placed in EF's query cache when executed.

static Func<AdventureWorksEntities, IQueryable<EmployeeDetail>> getEmpList =
    CompiledQuery.Compile((AdventureWorksEntities db) => db.EmployeeDetails);

AdventureWorksEntities e = new AdventureWorksEntities();
List<EmployeeDetail> empList = getEmpList(e).ToList();

Must examine Queries before being sent to the Database

When we are investigating performance issues, it is sometimes helpful to know the exact SQL command text that Entity Framework is sending to the database. I suggest examining each LINQ to Entities query to avoid performance issues.
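
One way to see that SQL (a sketch, assuming the ObjectContext-based AdventureWorksEntities model from the earlier examples) is to cast the LINQ query to ObjectQuery<T> and call ToTraceString(); DbContext-based models in EF 6 can instead assign a logger to Database.Log.

AdventureWorksEntities e = new AdventureWorksEntities();
IQueryable<EmployeeMaster> query = e.EmployeeMasters.Where(g => g.EmployeeId == 1000);

// LINQ to Entities queries against an ObjectContext are ObjectQuery<T> instances,
// and ToTraceString() returns the SQL that will be sent to the database.
var objectQuery = query as System.Data.Objects.ObjectQuery<EmployeeMaster>;
if (objectQuery != null)
{
    Console.WriteLine(objectQuery.ToTraceString());
}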

Efficiently Loading Related Data

Entity Framework supports loading of related data, and the Include method helps us load it. Before using Include, make sure the related data is really required for the page, i.e. do not include a navigation property when it is not needed.
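
As a small illustration (Department here is a hypothetical navigation property on EmployeeMaster), load the related entity only when the page really needs it:

AdventureWorksEntities e = new AdventureWorksEntities();

// Employees and their departments are fetched in a single query.
List<EmployeeMaster> employees = e.EmployeeMasters
                                  .Include("Department")
                                  .ToList();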

Select appropriate Loading type while defining model

Entity Framework supports three ways to load related data: eager loading, lazy loading and explicit loading. Eager loading is the process in which a query for one type of entity also loads related entities as part of the query; it is done with the Include method. Lazy loading is the process in which an entity or collection of entities is automatically loaded from the database the first time a property referring to it is accessed. Even with lazy loading disabled, it is possible to load related entities on demand by calling the Load method on the related entity's entry; this is called explicit loading. Depending on the type of application and its requirements, we can select the appropriate loading strategy for the entity model.
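
For example, explicit loading with a DbContext-based model looks roughly like the sketch below (AdventureWorksContext, Department and Addresses are hypothetical names used only for illustration):

using (var db = new AdventureWorksContext())
{
    var employee = db.EmployeeMasters.First(g => g.EmployeeId == 1000);

    // Explicitly load a reference and a collection navigation property on demand.
    db.Entry(employee).Reference(emp => emp.Department).Load();
    db.Entry(employee).Collection(emp => emp.Addresses).Load();
}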

Conclusion

Using the above tips, we can definitely improve the performance of LINQ to Entities queries.



European ASP.NET 4.5 Hosting - UK :: How to Check that .NET has been Installed on the Server

clock November 16, 2015 21:51 by author Scott

Most newcomers, and a few administrators handling the deployment of their company's ASP.NET applications on Windows Server, know the basics of how ASP.NET is associated with the IIS web server. Here's a quick tip to check whether the exact version of the .NET Framework has been installed on the server, and whether it has been registered with IIS.

In this scenario, I'm going to consider a fresh installation of Windows Server (2008 R2 / 2012 R2), so make sure you have the configuration mentioned below done accordingly.

How to Find The Existence of the .NET Framework and Its Files

Navigate to the following location to make sure the .NET Framework version you are looking for is installed.

File location: (32-bit): C:\Windows\Microsoft.NET\Framework\ (By default)
For 64-bit: C:\Windows\Microsoft.NET\Framework64\ (By default)
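
If you prefer to script this check, the following small C# sketch lists the version folders found under both locations:

using System;
using System.IO;

class FrameworkFolderCheck
{
    static void Main()
    {
        string[] roots =
        {
            @"C:\Windows\Microsoft.NET\Framework",
            @"C:\Windows\Microsoft.NET\Framework64"
        };

        foreach (string root in roots)
        {
            Console.WriteLine(root);
            if (!Directory.Exists(root))
            {
                Console.WriteLine("  (folder not found)");
                continue;
            }

            // Each installed framework version has a folder such as v2.0.50727 or v4.0.30319.
            foreach (string dir in Directory.GetDirectories(root, "v*"))
            {
                Console.WriteLine("  " + Path.GetFileName(dir));
            }
        }
    }
}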

There are several .NET Framework versions available. Some are included in certain Windows versions by default, and all are available to download from the Microsoft website as well.

Following is a list of all released versions of .NET Framework:

- .NET Framework 1.0 (Obsolete)
- .NET Framework 1.1 (Obsolete, comes installed in Windows Server 2003)
- .NET Framework 2.0* (Obsolete, comes as a standalone installer)
- .NET Framework 3.0* (comes pre-installed in Windows Vista and Server 2008)
- .NET Framework 3.5* (comes pre-installed in Windows 7 and Server 2008 R2)
- .NET Framework 4.0*
- .NET Framework 4.5* (comes pre-installed in Windows 8/8.1 and Server 2012/2012 R2)

Note: The frameworks marked with (*) have later updates available as service packs, which can be downloaded from Microsoft's download site.

To check ASP.NET application version compatibility, refer to the MSDN article on .NET Framework version compatibility.

If you find the respective folder, say v4.0.xxx, it means that Framework 4.0 or above has been installed: X:\Windows\Microsoft.NET\Framework\v4.0.30319 (where X:\ is the OS drive).

You will also find a similar folder for the 64-bit .NET Framework and its associated files at X:\Windows\Microsoft.NET\Framework64\v4.0.30319 (where X:\ is the OS drive).

This can also be confirmed by another method: navigating the Windows Registry to find a key whose presence confirms the installation.

How to Find the .NET Framework Versions by Viewing the Registry

1. On the Start menu, choose Run.

2. In the Open box, enter regedit.exe.

You must have administrative credentials to run regedit.exe.

In the Registry Editor, open the following subkey:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full

The presence of the Release value under this key confirms that .NET Framework 4.5 or above (4.5 / 4.5.1 / 4.5.2) is installed.
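
If you want to confirm this programmatically, the following sketch reads the Release value under that key; a value of 378389 or higher indicates that .NET Framework 4.5 or later is installed:

using System;
using Microsoft.Win32;

class DotNetReleaseCheck
{
    static void Main()
    {
        const string subKey = @"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full";

        using (RegistryKey baseKey = RegistryKey.OpenBaseKey(RegistryHive.LocalMachine, RegistryView.Registry64))
        using (RegistryKey ndpKey = baseKey.OpenSubKey(subKey))
        {
            object release = (ndpKey != null) ? ndpKey.GetValue("Release") : null;

            if (release == null)
            {
                Console.WriteLine(".NET Framework 4.5 or later is not detected.");
            }
            else
            {
                // 378389 is the documented minimum Release value for .NET Framework 4.5.
                Console.WriteLine((int)release >= 378389
                    ? ".NET Framework 4.5 or later is installed (Release " + release + ")."
                    : ".NET Framework 4.0 is installed, but not 4.5.");
            }
        }
    }
}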

In most cases people ignore this step, assuming they have already configured ASP.NET by selecting it from the server roles, or because another .NET application was already working earlier.

This step will guide you in checking whether the ASP.NET component required by your new application (with respect to .NET Framework compatibility) has been registered or not.

Simply Check That ASP.NET 4.5 Has Been Installed via IIS

To check that .NET 4.5 has been installed on the server, simply create a website in IIS and hit the “Select” button to see the .NET Framework versions available for the application pool.

If you find v4.5 listed instead of v4.0, it confirms that .NET Framework 4.5 or above (4.5 / 4.5.1 / 4.5.2) has been installed and is available to run any website that requires that framework version.

If you don't find it, navigate to the framework's root folder (refer to pic 2) and run “aspnet_regiis.exe”, which will register the ASP.NET component with IIS.

Once you have registered the ASP.NET component, restart IIS from the command prompt by typing “iisreset”, then launch IIS Manager and repeat the step above to check the versions listed among the available frameworks in the “Add Website” window shown earlier.

Now you are all set to start deploying your web application through Web Deploy, FTP or any other method.



European ASP.NET 4.5 Hosting - France :: Memory Leak in .NET 4.5

clock November 9, 2015 23:57 by author Scott

One of the common issues with events and event handlers is memory leaks. In applications, handlers attached to event sources are not destroyed if we forget to unregister them from the event, and this can lead to memory leaks. .NET 4.5 introduces a weak event pattern that can be used to address this issue.

Problem: Without using weak event pattern.

Let's understand the problem with an example. If we understand the problem then the solution is very easy.

public class EventSource  
{  
        public event EventHandler<EventArgs> Event = delegate { };    

        public void Raise()  
        {  
            Event(this, EventArgs.Empty);  
        }          
}    
  

public class EventListener   
{  
        public int Counter { get; set; }
        EventSource eventSourceObject;    

        public EventListener(EventSource source)  
        {  
            eventSourceObject= source;  
            eventSourceObject.Event += MyHandler;  
        }           

        private void MyHandler(object source, EventArgs args)  
        {  
            Counter++;  
            Console.WriteLine("MyHandler received event "+ Counter);  
        }    

        public void UnRegisterEvent()  
        {  
            eventSourceObject.Event -= MyHandler;  
        }  
}  
static void Main(string[] args)  
{              
         EventSource source = new EventSource();  
         EventListener listener = new EventListener(source);  
         Console.WriteLine("Raise event first time");  
         Console.Read();  
         source.Raise();    

         //Set listener to null and call GC to release the  
        //listener object from memory.    

         Console.WriteLine("\nSet listener to null and call GC");  
         Console.Read();  
        // the developer forgot to unregister the event

        //listener.UnRegisterEvent();  
         listener = null;  
         GC.Collect();  
         Console.WriteLine("\nRaise event 2nd time");  
         Console.Read();  
         source.Raise();  
         Console.ReadKey();                  
}  

Output

The developer expects that setting listener = null is sufficient to release the object, but this is not the case: the event listener object is still in memory.

Since the listener was set to null, its handler should not run when the event is raised a second time; but it is still called.

In this example the developer forgot to unregister the event. Because of this mistake, the listener object is not released from memory and is still active. If we called the unregister method, the EventListener would be released from memory, but the developer cannot always know when to unregister the event.

This is the problem!

Solution: Weak event pattern.

The weak event pattern is designed to solve this memory leak problem. The weak event pattern can be used whenever a listener needs to register for an event, but the listener does not explicitly know when to unregister. 

In the old way of attaching a handler to an event source, the listener is tightly coupled to the source:

eventSourceObject.Event += MyHandler;  

Using Weak event pattern

Replace the preceding registration with the following code and run the program.

WeakEventManager<EventSource, EventArgs>.AddHandler(source, "Event", MyHandler);

After the event is raised a second time, the listener does nothing, which means the listener is no longer in memory.

Now the developer does not need to worry about unregistering the event.

If, in a specific scenario, we do want to unregister the event, we can use the RemoveHandler method as follows:

WeakEventManager<EventSource, EventArgs>.RemoveHandler(source, "Event", MyHandler);
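
Putting it together, the EventListener from the earlier example could be rewritten roughly as the sketch below (WeakEventManager<TEventSource, TEventArgs> lives in the System.Windows namespace of WindowsBase in .NET 4.5), with no unregister bookkeeping at all:

using System;
using System.Windows;   // WeakEventManager<TEventSource, TEventArgs>

public class EventListener
{
    public int Counter { get; set; }

    public EventListener(EventSource source)
    {
        // The weak event manager holds only a weak reference to this listener,
        // so forgetting to unregister no longer keeps the listener alive.
        WeakEventManager<EventSource, EventArgs>.AddHandler(source, "Event", MyHandler);
    }

    private void MyHandler(object source, EventArgs args)
    {
        Counter++;
        Console.WriteLine("MyHandler received event " + Counter);
    }
}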



About HostForLIFE.eu

HostForLIFE.eu is a European Windows hosting provider which focuses on the Windows platform only. We deliver on-demand hosting solutions including Shared Hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes.

We have offered the latest Windows 2016 Hosting, ASP.NET Core 2.2.1 Hosting, ASP.NET MVC 6 Hosting and SQL 2017 Hosting.

