Archive

Archive for November, 2010

Internet Connectivity Check and Reconnection via Batch File

November 24, 2010 Leave a comment

Here is a batch (.bat) file which will automatically check for network/internet connectivity and, if none is found, will automatically reconnect. This will come in very handy for people whose internet link keeps dropping and who suffer frequent disconnections.

A) Copy this code.
B) Open notepad and paste.
C) Customize.
D) Save as test.bat and schedule it (e.g., with Windows Task Scheduler)


@echo off
ECHO Checking connection, please wait...
PING -n 1 www.google.com|find "Reply from " >NUL
IF NOT ERRORLEVEL 1 goto :SUCCESS
IF     ERRORLEVEL 1 goto :TRYAGAIN

:TRYAGAIN
ECHO -------------------------------------------------------
ECHO -------------------------------------------------------
ECHO FAILURE!
ECHO Let me try a bit more, please wait...
@echo off
PING -n 3 www.google.com|find "Reply from " >NUL
IF NOT ERRORLEVEL 1 goto :SUCCESS2
IF     ERRORLEVEL 1 goto :FAILURE

:SUCCESS
ECHO -------------------------------------------------------
ECHO -------------------------------------------------------
ECHO You have an active Internet connection
pause
goto :END

:SUCCESS2
ECHO -------------------------------------------------------
ECHO -------------------------------------------------------
ECHO You have an active internet connection but some packet loss was detected.
pause
goto :END

:FAILURE
ECHO -------------------------------------------------------
ECHO -------------------------------------------------------
ECHO You do not have an active Internet connection
ECHO Reconnecting... Please Wait
ECHO -------------------------------------------------------
ECHO -------------------------------------------------------
@echo off
rasdial ConnectionName /DISCONNECT
rasdial ConnectionName USERNAME PASSWORD
ECHO Re-checking Connection, please wait...
PING -n 3 www.google.com|find "Reply from " >NUL
IF NOT ERRORLEVEL 1 goto :SUCCESS
IF     ERRORLEVEL 1 goto :FAILURE2

:FAILURE2
ECHO -------------------------------------------------------
ECHO -------------------------------------------------------
ECHO Reconnection Failed.
ECHO Please try later.
pause
goto :END

:END
Categories: Internet

Redirect from HTTP to HTTPS in ASP.NET

November 21, 2010 6 comments

Security Switch 4.1
===================
Security Switch enables various ASP.NET applications to automatically switch requests for pages/resources between the HTTP and HTTPS protocols without the need to write absolute URLs in HTML markup.

With deprecated support for ASP.NET 1.1 (via version 2.x) and full support for ASP.NET 2 and higher, you can easily configure what pages/resources should be secured via your website’s SSL certificate. This is accomplished through the configuration of an ASP.NET module (IHttpModule).

Configuration
————-
Configuring Security Switch is a simple process. Open the web.config file for your web application or website and add the following lines where indicated.

<configuration>
  <configSections>
    <section name="securitySwitch" type="SecuritySwitch.Configuration.Settings, SecuritySwitch" />
  </configSections>

  <securitySwitch mode="RemoteOnly">
    <paths>
      <add path="~/Login.aspx" />
    </paths>
  </securitySwitch>

  <system.web>
    <httpModules>
      <!-- for IIS <= 6.x, IIS 7.x + Classic Mode, and Web Development Server (Cassini) -->
      <add name="SecuritySwitch" type="SecuritySwitch.SecuritySwitchModule, SecuritySwitch" />
    </httpModules>
  </system.web>

  <system.webServer>
    <validation validateIntegratedModeConfiguration="false" />
    <modules>
      <!-- for IIS 7.x + Integrated Mode -->
      <add name="SecuritySwitch" type="SecuritySwitch.SecuritySwitchModule, SecuritySwitch" />
    </modules>
  </system.webServer>
</configuration>

First, add a new section definition to the configSections element collection. This tells ASP.NET that it can expect to see a section further down named "securitySwitch". Next, add the aforementioned section. The securitySwitch section is where you will actually configure the module. For now, we set mode to "RemoteOnly" and add an entry to paths for the Login.aspx page (more on these settings later). Finally, add the module entry to either system.web/httpModules (for IIS <= 6.x, IIS 7.x with Classic Mode enabled, and the Web Development Server/Cassini), system.webServer/modules (for IIS 7.x with Integrated Mode enabled), or both. The excerpt above adds the module to both sections and adds the system.webServer/validation element to prevent IIS from complaining about the entry added to system.web/httpModules.

Another important step that many people forget is to include the SecuritySwitch assembly. Just copy the SecuritySwitch.dll assembly into your site’s bin folder, or add a reference to the assembly in your project.

The securitySwitch Section
--------------------------
Configuration of the module is done via the securitySwitch section of a web.config file. The main element has several attributes itself, but none are required. The following section declaration is perfectly valid and will enable the module with all defaults. Note, the paths element and at least one add element entry within it are required.

<securitySwitch>
<paths>

</paths>
</securitySwitch>

The securitySwitch element may have the following attributes set to an allowed value, as also defined below.

Attribute Name            Data Type   Default Value   Allowed Values
—————————————————————————————–
baseInsecureUri           string      [null]          any valid URI
baseSecureUri             string      [null]          any valid URI
bypassSecurityWarning     bool        false           true, false
ignoreAjaxRequests        bool        false           true, false
ignoreSystemHandlers      bool        true            true, false
mode                      Mode        On              On, RemoteOnly, LocalOnly, Off
offloadedSecurityHeaders  string      [null]          query string like name/value pairs

Set baseSecureUri to a valid URI when you do not have an SSL certificate installed on the same domain as your standard site (accessed via HTTP) or if your server is set up to serve HTTPS on a non-standard port (a port other than 443). Setting baseSecureUri instructs the module to redirect any requests that need to switch from HTTP to HTTPS to a URI that starts with the baseSecureUri. For example, if baseSecureUri is "https://secure.mysite.com" and a request for http://www.mysite.com/Login.aspx is made (and Login.aspx is configured to be secure), the module will redirect visitors to https://secure.mysite.com/Login.aspx. Similarly, if baseSecureUri is "https://secure.somehostingsite.com/mysite", visitors would be redirected to https://secure.somehostingsite.com/mysite/Login.aspx.

Likewise, set baseInsecureUri to a valid URI when you have supplied a value for baseSecureUri. This ensures the module will send visitors back to your standard site when switching from HTTPS to HTTP. To build on the previous example, if baseInsecureUri is "http://www.mysite.com", a visitor requesting https://secure.somehostingsite.com/mysite/Info/ContactUs.aspx would be redirected to http://www.mysite.com/Info/ContactUs.aspx.

If either baseSecureUri or baseInsecureUri are set, you must provide both values. The module needs to know how to switch back when necessary and will use the other base URI to accomplish that.

Set bypassSecurityWarning to true when you wish to attempt to avoid browser warnings about switching from HTTPS to HTTP. Many browsers alert visitors when a server issues a redirect request that would remove the user from HTTPS. This is not necessarily a bad feature in browsers. However, some website owners/developers wish to avoid such security warnings when possible. When bypassSecurityWarning is true, the module will forgo the usual practice of issuing a formal redirect and, instead, will output a “Refresh” header followed by some JavaScript to change the visitor’s location. A refresh header is not a standard HTTP header. However, many browsers do honor it and “refresh” the current location with the specified URL after a timeout. The module sets the URL to the appropriate redirect location with a timeout of 0 (immediately). In addition, a small JavaScript block is output to the browser as backup. If the browser does not honor the refresh header, the script will set the window’s location to the appropriate URL.

Setting ignoreAjaxRequests to true will have the module ignore all AJAX requests, regardless of the request’s path. When true, this setting overrides any matching path’s settings if the request is made via AJAX. If false, the module will process the request like all others by checking for any matching path.

When ignoreSystemHandlers is true (the default), the module will automatically add a special path that effectively ensures requests for .axd handlers are ignored during processing. This is most likely desirable, because ASP.NET makes ample use of the WebResource.axd handler. Likewise, Trace.axd and any other handler with the .axd extension will be ignored when the module evaluates the need to redirect the request. This avoids browser warnings about mixed security, which occur when a page is requested via one protocol (e.g. HTTPS) and resources referenced by the page are requested via a different protocol (e.g. HTTP). Without this setting, when a request for WebResource.axd is made via HTTPS on a secure page, the module would see that no path entry matching the request is found. Therefore, the module would redirect the request to use HTTP, causing the mixed security alert. Note, you can disable this setting and manually add path entries for WebResource.axd and any others you specifically want the module to ignore.

The mode attribute determines under what circumstances the module evaluates requests. A value of "On" enables the module for all requests, regardless of their origin. "RemoteOnly" will instruct the module to only consider requests that are made from a remote computer. If a request is made on the actual Web server (i.e. localhost, 127.0.0.1, etc.), the module will not act. Likewise, setting the mode to "LocalOnly" will enable the module only when a request is made from the Web server. Finally, "Off" disables the module entirely. Disabling the module is great for troubleshooting issues with SSL and/or protocols, because it takes the Security Switch module out of the equation.

Use offloadedSecurityHeaders to designate request headers that may be present from an offloaded security device (such as a dedicated SSL server/accelerator; e.g., ISA Server). The value of this attribute should look like a query string without the leading "?", with name/value pairs (e.g., SSL=Yes). If there is more than one header the module should consider, delimit each pair with an ampersand (e.g., SSL=Yes&HTTPS=on).

Paths
~~~~~
Within the securitySwitch section element, there should be a paths element. The paths element is a collection of entries that tell the module how to handle certain requests. Adding path entries should be familiar to most ASP.NET developers. Each element in the paths collection is an "add" element with its own attributes. Below is an example of a few path entries.

<securitySwitch>
  <paths>
    <add path="~/Info/Contact.aspx" matchType="Exact" />
    <add path="~/Login.aspx" />
    <add path="~/Manage" />

    <add path="~/Admin(/|/[Dd]efault\.aspx)?$" matchType="Regex" ignoreCase="false" security="Insecure" />
    <add path="~/Admin/" />

    <add path="~/Media/" security="Ignore" />

    <add path="~/Cms/Default\.aspx\?([a-zA-Z0-9\-%_= ]+&amp;)*pageId=2(&amp;[a-zA-Z0-9\-%_= ]+)*$" matchType="Regex" />
  </paths>
</securitySwitch>

The first entry will ensure that any request for the Contact.aspx page in the Info sub-directory of the site will be secured via HTTPS. The matchType is “Exact” and that means that only an exact request for that path will be matched. In other words, if there is any tail, query string, or bookmark included in a request, it will not be redirected (e.g. /Info/Contact.aspx?ref=email, /Info/Contact.aspx#form).

The next two entries will secure requests for the Login.aspx page and any path starting with /Manage. Since no matchType is specified, the default, “StartsWith”, is used. This works better for these two, because often requests for the login page will have a query string attached to it with the return URL (e.g. /Login.aspx?ReturnUrl=%2fManage). Likewise, anything in the /Manage sub-directory will be secured. Note, however, that a request for /ManagementInfo.aspx will also be secured because that request starts with /Manage.

The fourth and fifth entries are all about the /Admin sub-directory. The fifth entry ensures that any request to the /Admin sub-directory is secured. However, the fourth entry preempts the fifth, because it is listed beforehand. It instructs the module to access the default page in the /Admin sub-directory insecurely (via HTTP). It uses a matchType of "Regex" to catch the various possible ways a request may be made for the default page (e.g. /Admin, /Admin/, /Admin/Default.aspx). Also, the ignoreCase attribute is set to false to prove a point: /Admin/Default.aspx and /Admin/default.aspx are separate requests, and the regex accounts for both. If we omit ignoreCase, or set it to true (the default), the regex path could be rewritten as just "~/Admin(/|/Default\.aspx)?$" and either request will be matched.

The sixth entry forces the module to ignore any requests for resources in the /Media sub-directory. This is especially important if you are running a website on IIS 7.x in Integrated Mode or if you have a wildcard handler set up in IIS to process all requests through the ASP.NET pipeline. In these cases, a request for /Media/Images/Title.jpg will use the same protocol as the page it is referenced in. If this entry were left out and a page secured via HTTPS referenced that image, the image request would be redirected to HTTP by the module, causing mixed security warnings in the browser.

The final entry uses regex to secure a particular query string value when requested with the /Cms/Default.aspx page. If an insecure request for /Cms/Default.aspx?pageId=2 is made, it will be redirected by the module in order to secure it via HTTPS. This entry even accounts for the pageId=2 parameter being anywhere within the query string. It can be the first parameter, the only parameter, or the third parameter; it doesn’t matter (e.g. /Cms/Default.aspx?cache=On&pageId=2&author=Matt).

Finally, if no path entry matches a request, the module will ensure it is accessed insecurely (via HTTP). This prevents “getting stuck in HTTPS”. That is, accessing the site via HTTPS and continuing to request resources via HTTPS. Such behavior would result in more CPU usage on the server (SSL requires a bit more processing for the encryption) and more bandwidth consumption (encrypted data is inherently larger than raw data). Either could end up costing you or your client quite a bit more in hosting bills!

Take care when ordering your path entries. Order definitely matters. In the example above, entries four and five are ordered specifically to achieve the desired results. If the fourth entry (the one that sets security to “Insecure”) were below the fifth entry, the module would never get to it. The module processes entries in the order you specify them, and once it finds a matching entry, it acts on it. In fact, the only reason there is an option to set the security attribute to “Insecure” is to override more general entries below. As in this example, anything in the /Admin sub-directory would be secured if it were not for the fourth entry overriding such behavior for the default page.

IntelliSense and the securitySwitch Section Schema
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
To enable IntelliSense while editing the securitySwitch section in a web.config file, add an xmlns attribute to the section and include the provided schema file in your solution. Below is an example of the section with the necessary attribute.

<securitySwitch xmlns="http://SecuritySwitch-v4.xsd" ...>
<paths>

</paths>
</securitySwitch>

Be sure to either include the SecuritySwitch-v4.xsd file in your solution or, better still, install the schema file for Visual Studio. If Visual Studio does not automatically detect the schema file in your solution, you can add it to the Schemas property in the Properties window while the web.config file is open. To install the schema file so Visual Studio always finds it in all your projects, copy the .xsd file to the appropriate directory, as shown below ([version] indicates the version of Visual Studio you are installing to).

* for 32-bit systems: %ProgramFiles%\Microsoft Visual Studio [version]\Xml\Schemas
* for 64-bit systems: %ProgramFiles(x86)%\Microsoft Visual Studio [version]\Xml\Schemas

Dynamic Evaluation of Requests
——————————
There may be times when you cannot configure the paths that need to be secured, because your application generates URLs/paths dynamically. This is especially true for Content Management Systems (CMS). In those cases, you can leave the paths element out of the configuration section and provide an event handler for the module's EvaluateRequest event. To do this, add an event handler named "SecuritySwitch_EvaluateRequest" to your site's Global.asax file with the following signature:

protected void SecuritySwitch_EvaluateRequest(object sender, EvaluateRequestEventArgs e) {
// TODO: Update e.ExpectedSecurity based on the current Request.
}

Set the event argument’s ExpectedSecurity property to one of the RequestSecurity values and the module will honor it instead of attempting to figure out how the request should be handled through the configuration of paths.
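For example, a minimal handler might look like the sketch below. The ~/Members path rule and the use of Request.AppRelativeCurrentExecutionFilePath are invented for illustration; EvaluateRequestEventArgs and the RequestSecurity values come from the SecuritySwitch assembly described above.

protected void SecuritySwitch_EvaluateRequest(object sender, EvaluateRequestEventArgs e) {
  // Hypothetical rule: secure anything under ~/Members; everything else stays on HTTP.
  string path = Request.AppRelativeCurrentExecutionFilePath;

  if (path.StartsWith("~/Members", StringComparison.OrdinalIgnoreCase)) {
    e.ExpectedSecurity = RequestSecurity.Secure;
  } else {
    e.ExpectedSecurity = RequestSecurity.Insecure;
  }
}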

Additional Resources
——————–

* Download sample source code and website from the Security Switch project site.

* Transport Layer Security (TLS) and Secure Sockets Layer (SSL) on Wikipedia
http://en.wikipedia.org/wiki/Transport_Layer_Security

* Tip/Trick: Enabling SSL on IIS 7.0 Using Self-Signed Certificates (by the Gu)
http://weblogs.asp.net/scottgu/archive/2007/04/06/tip-trick-enabling-ssl-on-iis7-using-self-signed-certificates.aspx

* How to Set Up SSL on IIS 7
http://learn.iis.net/page.aspx/144/how-to-set-up-ssl-on-iis-7/

Categories: ASP.Net, IIS

Performance Tips for .NET Applications

November 16, 2010 1 comment

Performance Tips for ASP.NET Applications

Cache Aggressively

When designing an app using ASP.NET, make sure you design with an eye on caching. On server versions of the OS, you have a lot of options for tweaking the use of caches on the server and client side. There are several features and tools in ASP.NET that you can make use of to gain performance.

Output Caching—Stores the static result of an ASP.NET request. Specified using the <%@ OutputCache %> directive:

  • Duration—Time the item exists in the cache
  • VaryByParam—Varies cache entries by Get/Post parameters
  • VaryByHeader—Varies cache entries by HTTP header
  • VaryByCustom—Varies cache entries by browser; override it to vary by whatever you want

Fragment Caching—When it is not possible to store an entire page (privacy, personalization, dynamic content), you can use fragment caching to store parts of it for quicker retrieval later.

  • VaryByControl—Varies the cached items by the values of a control

Cache API—Provides extremely fine granularity for caching by keeping a hash table of cached objects in memory (System.Web.Caching). It also:

  • Includes dependencies (key, file, time)
  • Automatically expires unused items
  • Supports callbacks

Caching intelligently can give you excellent performance, and it's important to think about what kind of caching you need. Imagine a complex e-commerce site with several static pages for login, and then a slew of dynamically generated pages containing images and text. You might want to use Output Caching for those login pages, and then Fragment Caching for the dynamic pages. A toolbar, for example, could be cached as a fragment. For even better performance, you could cache commonly used images and boilerplate text that appear frequently on the site using the Cache API. For detailed information on caching (with sample code), check out the ASP.NET website.
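As a rough illustration of the Cache API, the sketch below caches a piece of boilerplate text with a file dependency and a sliding expiration. The file path, cache key, and helper class are invented for the example.

using System;
using System.Web;
using System.Web.Caching;

public static class BoilerplateCache
{
  public static string GetFooterText(HttpContext context)
  {
    // Try the cache first.
    string footer = context.Cache["SiteFooter"] as string;
    if (footer == null)
    {
      // Hypothetical source file for the boilerplate text.
      string path = context.Server.MapPath("~/App_Data/footer.txt");
      footer = System.IO.File.ReadAllText(path);

      // Cache it with a file dependency and a 20-minute sliding expiration,
      // so edits to the file invalidate the entry automatically.
      context.Cache.Insert("SiteFooter", footer, new CacheDependency(path),
                           Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(20));
    }
    return footer;
  }
}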

Use Session State Only If You Need To

One extremely powerful feature of ASP.NET is its ability to store session state for users, such as a shopping cart on an e-commerce site or a browser history. Since this is on by default, you pay the cost in memory even if you don't use it. If you're not using session state, turn it off and save yourself the overhead by adding <%@ Page EnableSessionState="false" %> to your page. This comes with several other options, which are explained on the ASP.NET website.

For pages that only read session state, you can choose EnableSessionState="ReadOnly". This carries less overhead than full read/write session state, and is useful when you need only part of the functionality and don't want to pay for the write capabilities.

Use View State Only If You Need To

An example of View State might be a long form that users must fill out: if they click Back in their browser and then return, the form will remain filled. When this functionality isn't used, this state eats up memory and performance. Perhaps the largest performance drain here is that a round-trip signal must be sent across the network each time the page is loaded to update and verify the cache. Since it is on by default, you will need to specify that you do not want to use View State with <%@ Page EnableViewState="false" %>. You should read more about View State on the ASP.NET website to learn about some of the other options and settings to which you have access.

Avoid STA COM

Apartment COM is designed to deal with threading in unmanaged environments. There are two kinds of Apartment COM: single-threaded and multithreaded. MTA COM is designed to handle multithreading, whereas STA COM relies on the messaging system to serialize thread requests. The managed world is free-threaded, and using Single Threaded Apartment COM requires that all unmanaged threads essentially share a single thread for interop. This results in a massive performance hit, and should be avoided whenever possible. If you can't port the Apartment COM object to the managed world, use <%@ Page AspCompat="true" %> for pages that use it.

Batch Compile

Always batch compile before deploying a large application to the Web. Batch compilation can be initiated by making one request to a page per directory and waiting until the CPU idles again. This prevents the Web server from being bogged down with compilations while also trying to serve pages.

Remove Unnecessary Http Modules

Depending on the features you use, remove unused or unnecessary HTTP modules from the pipeline. Reclaiming the added memory and wasted cycles can give you a small speed boost.

Avoid the AutoEventWireup Feature

Instead of relying on AutoEventWireup, override the events from Page. For example, instead of writing a Page_Load() method, try overriding the OnLoad() method. This saves the run time from having to call CreateDelegate() for every page.
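A minimal sketch of this pattern in a code-behind class; the class name is arbitrary, and the body stands in for whatever initialization your page needs:

using System;
using System.Web.UI;

public partial class HomePage : Page
{
  // Override OnLoad directly instead of letting AutoEventWireup
  // bind a Page_Load handler via reflection at run time.
  protected override void OnLoad(EventArgs e)
  {
    base.OnLoad(e); // still raises the Load event for any subscribers
    // ... page initialization goes here ...
  }
}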

Encode Using ASCII When You Don’t Need UTF

By default, ASP.NET comes configured to encode requests and responses as UTF-8. If ASCII is all your application needs, eliminating the UTF-8 overhead can give you back a few cycles. Note that this can only be done on a per-application basis.

Use the Optimal Authentication Procedure

There are several different ways to authenticate a user, and some are more expensive than others (in order of increasing cost: None, Windows, Forms, Passport). Make sure you use the cheapest one that best fits your needs.

Tips for Database Access

The philosophy of tuning for database access is to use only the functionality that you need, and to design around a ‘disconnected’ approach: make several connections in sequence, rather than holding a single connection open for a long time. You should take this change into account and design around it.

Microsoft recommends an N-Tier strategy for maximum performance, as opposed to a direct client-to-database connection. Consider this as part of your design philosophy, as many of the technologies in place are optimized to take advantage of a multi-tiered scenario.

Use the Optimal Managed Provider

Make the correct choice of managed provider, rather than relying on a generic accessor. There are managed providers written specifically for many different databases, such as SQL Server (System.Data.SqlClient). If you use a more generic interface such as System.Data.Odbc when you could be using a specialized component, you will lose performance dealing with the added level of indirection. Using the optimal provider can also have you speaking a different language: the managed SQL client speaks TDS to a SQL Server database, providing a dramatic improvement over the generic OleDb protocol.

Pick Data Reader Over Data Set When You Can

Use a data reader whenever you don't need to keep the data lying around. This allows a fast read of the data, which can be cached if the user desires. A reader is simply a stateless stream that allows you to read data as it arrives, and then drop it without storing it to a dataset for more navigation. The stream approach is faster and has less overhead, since you can start using data immediately. You should evaluate how often you need the same data to decide if the caching for navigation makes sense for you.

Use Stored Procedures Whenever Possible

Stored procedures are highly optimized tools that result in excellent performance when used effectively. Set up stored procedures to handle inserts, updates, and deletes with the data adapter. Stored procedures do not have to be interpreted, compiled or even transmitted from the client, and they cut down on both network traffic and server overhead. Be sure to use CommandType.StoredProcedure instead of CommandType.Text.
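A short sketch of calling a stored procedure through SqlClient; the procedure name, parameter names, and connection string are placeholders for the example:

using System.Data;
using System.Data.SqlClient;

public static int UpdateCustomerName(string connectionString, int customerId, string name)
{
  using (SqlConnection connection = new SqlConnection(connectionString))
  using (SqlCommand command = new SqlCommand("usp_UpdateCustomerName", connection))
  {
    // Tell ADO.NET this is a stored procedure, not ad-hoc SQL text.
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.AddWithValue("@CustomerId", customerId);
    command.Parameters.AddWithValue("@Name", name);

    connection.Open();
    return command.ExecuteNonQuery();
  }
}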

Be Careful About Dynamic Connection Strings

Connection pooling is a useful way to reuse connections for multiple requests, rather than paying the overhead of opening and closing a connection for each request. It’s done implicitly, but you get one pool per unique connection string. If you’re generating connection strings dynamically, make sure the strings are identical each time so pooling occurs. Also be aware that if delegation is occurring, you’ll get one pool per user.

Avoid Auto-Generated Commands

When using a data adapter, avoid auto-generated commands. These require additional trips to the server to retrieve meta data, and give you a lower level of interaction control. While using auto-generated commands is convenient, it’s worth the effort to do it yourself in performance-critical applications.

Keep Your Datasets Lean

Only put the records you need into the dataset. Remember that the dataset stores all of its data in memory, and that the more data you request, the longer it will take to transmit across the wire.

Use Sequential Access as Often as Possible

With a data reader, use CommandBehavior.SequentialAccess. This is essential for dealing with blob data types, since it allows data to be read off of the wire in small chunks. While you can only work with one piece of the data at a time, the latency for loading a large data type disappears. If you don't need to work with the whole object at once, using SequentialAccess will give you much better performance.
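For instance, a blob column can be streamed to disk in chunks roughly like this; the table, column, and connection string are invented for the sketch:

using System.Data;
using System.Data.SqlClient;
using System.IO;

public static void SavePhoto(string connectionString, int photoId, string outputPath)
{
  using (SqlConnection connection = new SqlConnection(connectionString))
  using (SqlCommand command = new SqlCommand(
      "SELECT ImageData FROM Photos WHERE PhotoId = @Id", connection))
  {
    command.Parameters.AddWithValue("@Id", photoId);
    connection.Open();

    // SequentialAccess reads the blob off the wire in small chunks
    // instead of buffering the entire value in memory.
    using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
    using (FileStream output = File.Create(outputPath))
    {
      if (reader.Read())
      {
        byte[] buffer = new byte[8192];
        long offset = 0;
        long bytesRead;
        while ((bytesRead = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
        {
          output.Write(buffer, 0, (int)bytesRead);
          offset += bytesRead;
        }
      }
    }
  }
}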

Best Practices

There are a few tips to remember when working on the CLR in any language. These are relevant to everyone, and should be the first line of defense when dealing with performance issues.

Throw Fewer Exceptions

Throwing exceptions can be very expensive, so make sure that you don’t throw a lot of them. Use Perfmon to see how many exceptions your application is throwing. It may surprise you to find that certain areas of your application throw more exceptions than you expected. For better granularity, you can also check the exception number programmatically by using Performance Counters.

Finding and designing away exception-heavy code can result in a decent perf win. Bear in mind that this has nothing to do with try/catch blocks: you only incur the cost when the actual exception is thrown. You can use as many try/catch blocks as you want. Using exceptions gratuitously is where you lose performance. For example, you should stay away from things like using exceptions for control flow.

Here’s a simple example of how expensive exceptions can be: we’ll simply run through a For loop, generating thousands of exceptions and then terminating. Try commenting out the throw statement to see the difference in speed: those exceptions result in tremendous overhead.

public static void Main(string[] args){
  int j = 0;
  for(int i = 0; i < 10000; i++){
    try{
      j = i;
      throw new System.Exception();
    } catch {}
  }
  System.Console.Write(j);
  return;
}

Design with ValueTypes

Use simple structs when you can, and when you don’t do a lot of boxing and unboxing. Here’s a simple example to demonstrate the speed difference:

using System;

namespace ConsoleApplication{

 

  public struct foo{
    public foo(double arg){ this.y = arg; }
    public double y;
  }
  public class bar{
    public bar(double arg){ this.y = arg; }
    public double y;
  }
  class Class1{
    static void Main(string[] args){
      System.Console.WriteLine("starting struct loop...");
      for(int i = 0; i < 50000000; i++)
      {foo test = new foo(3.14);}
      System.Console.WriteLine("struct loop complete.
                                starting object loop...");
      for(int i = 0; i < 50000000; i++)
      {bar test2 = new bar(3.14); }
      System.Console.WriteLine("All done");
    }
  }
}

When you run this example, you’ll see that the struct loop is orders of magnitude faster. However, it is important to beware of using ValueTypes when you treat them like objects. This adds extra boxing and unboxing overhead to your program, and can end up costing you more than it would if you had stuck with objects! To see this in action, modify the code above to use an array of foos and bars. You’ll find that the performance is more or less equal.

Tradeoffs: ValueTypes are far less flexible than objects, and end up hurting performance if used incorrectly. You need to be very careful about when and how you use them.

Try modifying the sample above, and storing the foos and bars inside arrays or hashtables. You’ll see the speed gain disappear, just with one boxing and unboxing operation.

You can keep track of how heavily you box and unbox by looking at GC allocations and collections. This can be done using either Perfmon externally or Performance Counters in your code.

Use AddRange to Add Groups

Use AddRange to add a whole collection, rather than adding each item in the collection iteratively. Nearly all Windows controls and collections have both Add and AddRange methods, and each is optimized for a different purpose. Add is useful for adding a single item, whereas AddRange has some extra overhead but wins out when adding multiple items (a short usage sketch follows the list). Here are just a few of the classes that support Add and AddRange:

  • StringCollection, TraceCollection, etc.
  • HttpWebRequest
  • UserControl
  • ColumnHeader
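For example, using the StringCollection class from the list above:

using System.Collections.Specialized;

public static StringCollection BuildMonths()
{
  string[] months = {
    "January", "February", "March", "April", "May", "June",
    "July", "August", "September", "October", "November", "December"
  };

  StringCollection collection = new StringCollection();

  // One AddRange call instead of twelve individual Add calls.
  collection.AddRange(months);
  return collection;
}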

Use For Loops for String Iteration—version 1

In C#, the foreach keyword allows you to walk across items in a list, string, etc. and perform operations on each item. This is a very powerful tool, since it acts as a general-purpose enumerator over many types. The tradeoff for this generalization is speed, and if you rely heavily on string iteration you should use a For loop instead. Since strings are simple character arrays, they can be walked using much less overhead than other structures. The JIT is smart enough (in many cases) to optimize away bounds-checking and other things inside a For loop, but is prohibited from doing this on foreach walks. A For loop on strings is up to five times faster than using foreach.

Here’s a simple test method to demonstrate the difference in speed. Try running it, then removing the For loop and uncommenting the foreach statement. On my machine, the For loop took about a second, compared with about 3 seconds for the foreach statement.

 

public static void Main(string[] args) {
  string s = "monkeys!";
  int dummy = 0;

  // Build a long string to iterate over.
  System.Text.StringBuilder sb = new System.Text.StringBuilder(s);
  for (int i = 0; i < 1000000; i++)
    sb.Append(s);
  s = sb.ToString();

  // foreach (char c in s) dummy++;   // try this in place of the For loop below
  for (int i = 0; i < s.Length; i++)
    dummy++;
  return;
}

Tradeoffs: foreach is far more readable, and in the future it will become as fast as a For loop for special cases like strings. Unless string manipulation is a real performance hog for you, the slightly messier code may not be worth it.

Use StringBuilder for Complex String Manipulation

When a string is modified, the run time will create a new string and return it, leaving the original to be garbage collected. Most of the time this is a fast and simple way to do it, but when a string is being modified repeatedly it begins to be a burden on performance: all of those allocations eventually get expensive. Here’s a simple example of a program that appends to a string 50,000 times, followed by one that uses a StringBuilder object to modify the string in place. The StringBuilder code is much faster, and if you run them it becomes immediately obvious.
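A minimal sketch of that comparison, assuming 50,000 append operations on an arbitrary piece of text:

using System;
using System.Text;

class StringAppendComparison
{
  static void Main()
  {
    const int iterations = 50000;

    // Naive approach: each += allocates a brand new string,
    // leaving the old one for the garbage collector.
    string concatenated = string.Empty;
    for (int i = 0; i < iterations; i++)
      concatenated += "x";

    // StringBuilder approach: appends into an internal buffer in place.
    StringBuilder builder = new StringBuilder();
    for (int i = 0; i < iterations; i++)
      builder.Append("x");
    string built = builder.ToString();

    Console.WriteLine("{0} / {1} characters built", concatenated.Length, built.Length);
  }
}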

Image Resizing in C#

November 16, 2010 Leave a comment

Once upon a time there were objects like AspJpeg which made it possible to manipulate image files – resize, convert and much more…

Lucky for us those times are gone and now we have a great framework called .NET, so why not use it?

Some Image extension methods that will make your life easier.

using System;
using System.Drawing;
using System.Drawing.Drawing2D;
using System.Drawing.Imaging;
using System.Linq;

public static class ImageExtensionMethods
{

static private ImageCodecInfo GetEncoder(ImageFormat format)
{
return ImageCodecInfo.GetImageEncoders().SingleOrDefault(c => c.FormatID == format.Guid);
}

public static void SaveAsJpeg(this Image Img, string FileName, Int64 Quality)
{
ImageCodecInfo jgpEncoder = GetEncoder(ImageFormat.Jpeg);
Encoder QualityEncoder = Encoder.Quality;

using (EncoderParameters EP = new EncoderParameters(1))
{
using (EncoderParameter QualityEncoderParameter = new EncoderParameter(QualityEncoder, Quality))
{
EP.Param[0] = QualityEncoderParameter;
Img.Save(FileName, jgpEncoder, EP);
}
}
}

public static void SaveAsGif(this Image Img, string FileName, Int64 Quality)
{
ImageCodecInfo gifEncoder = GetEncoder(ImageFormat.Gif);
Encoder QualityEncoder = Encoder.Quality;

using (EncoderParameters EP = new EncoderParameters(1))
{
using (EncoderParameter QualityEncoderParameter = new EncoderParameter(QualityEncoder, Quality))
{
EP.Param[0] = QualityEncoderParameter;
Img.Save(FileName, gifEncoder, EP);
}
}
}

public static Image Resize(this Image Img, int Width, int Height, InterpolationMode InterpolationMode)
{
Image CropedImage = new Bitmap(Width, Height);
using (Graphics G = Graphics.FromImage(CropedImage))
{
G.SmoothingMode = SmoothingMode.HighQuality;
G.InterpolationMode = InterpolationMode;
G.PixelOffsetMode = PixelOffsetMode.HighQuality;
G.DrawImage(Img, 0, 0, Width, Height);
}

return CropedImage;
}

public static Image Resize(this Image Img, int Width, int Height)
{
return Img.Resize(Width, Height, InterpolationMode.HighQualityBicubic);
}

private static Rectangle EnsureAspectRatio(this Image Image, int Width, int Height)
{
float AspectRatio = Width / (float)Height;
float CalculatedWidth = Image.Width, CalculatedHeight = Image.Height;

if (Image.Width >= Image.Height)
{
if (Width > Height)
{
CalculatedHeight = Image.Width / AspectRatio;
if (CalculatedHeight > Image.Height)
{
CalculatedHeight = Image.Height;
CalculatedWidth = CalculatedHeight * AspectRatio;
}
}
else
{
CalculatedWidth = Image.Height * AspectRatio;
if (CalculatedWidth > Image.Width)
{
CalculatedWidth = Image.Width;
CalculatedHeight = CalculatedWidth / AspectRatio;
}
}
}
else
{
if (Width < Height)
{
CalculatedHeight = Image.Width / AspectRatio;
if (CalculatedHeight > Image.Height)
{
CalculatedHeight = Image.Height;
CalculatedWidth = CalculatedHeight * AspectRatio;
}
}
else
{
CalculatedWidth = Image.Height * AspectRatio;
if (CalculatedWidth > Image.Width)
{
CalculatedWidth = Image.Width;
CalculatedHeight = CalculatedWidth / AspectRatio;
}
}
}
return Rectangle.Ceiling(new RectangleF((Image.Width - CalculatedWidth) / 2, 0, CalculatedWidth, CalculatedHeight));
}

public static Image ResizeToCanvas(this Image Img, int Width, int Height, out Rectangle CropRectangle)
{
return Img.ResizeToCanvas(Width, Height, InterpolationMode.HighQualityBicubic, out CropRectangle);
}

public static Image ResizeToCanvas(this Image Img, int Width, int Height,InterpolationMode InterpolationMode, out Rectangle CropRectangle)
{
CropRectangle = Img.EnsureAspectRatio(Width, Height);
Image CropedImage = new Bitmap(Width, Height);

using (Graphics G = Graphics.FromImage(CropedImage))
{
G.SmoothingMode = SmoothingMode.HighQuality;
G.InterpolationMode = InterpolationMode;
G.PixelOffsetMode = PixelOffsetMode.HighQuality;
G.DrawImage(Img, new Rectangle(0, 0, Width, Height), CropRectangle, GraphicsUnit.Pixel);
}

return CropedImage;
}

public static Image ResizeToCanvas(this Image Img, int Width, int Height, RectangleF CR)
{
return Img.ResizeToCanvas(Width, Height, InterpolationMode.HighQualityBicubic, CR);
}

public static Image ResizeToCanvas(this Image Img, int Width, int Height, InterpolationMode InterpolationMode, RectangleF CR)
{
Image CropedImage = new Bitmap(Width, Height);
using (Graphics G = Graphics.FromImage(CropedImage))
{
G.SmoothingMode = SmoothingMode.HighQuality;
G.InterpolationMode = InterpolationMode;
G.PixelOffsetMode = PixelOffsetMode.HighQuality;
G.DrawImage(Img, new Rectangle(0, 0, Width, Height), CR, GraphicsUnit.Pixel);
}

return CropedImage;
}

}
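A quick usage sketch for these extension methods; the file paths, target size, and quality value are arbitrary:

using System.Drawing;
using System.Drawing.Drawing2D;

class Program
{
  static void Main()
  {
    using (Image original = Image.FromFile(@"C:\temp\photo.jpg"))
    using (Image thumbnail = original.Resize(200, 150, InterpolationMode.HighQualityBicubic))
    {
      // Save the resized copy as a JPEG at 80% quality.
      thumbnail.SaveAsJpeg(@"C:\temp\photo_thumb.jpg", 80L);
    }
  }
}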

Categories: C#

Migrating ASP.NET Web Services to WCF

November 15, 2010 Leave a comment

ASP.NET provides .NET Framework class libraries and tools for building Web services, as well as facilities for hosting services within Internet Information Services (IIS). Windows Communication Foundation (WCF) provides .NET Framework class libraries, tools and hosting facilities for enabling software entities to communicate using any protocols, including those used by Web services. Migrating ASP.NET Web Services to WCF allows your applications to take advantage of new features and improvements that are unique to WCF.

WCF has several important advantages relative to ASP.NET Web services. While ASP.NET Web services tools are solely for building Web services, WCF provides tools that can be used when software entities must be made to communicate with one another. This will reduce the number of technologies that developers are required to know in order to accommodate different software communication scenarios, which in turn will reduce the cost of software development resources, as well as the time to complete software development projects.

Even for Web service development projects, WCF supports more Web service protocols than ASP.NET Web services support. These additional protocols provide for more sophisticated solutions involving, amongst other things, reliable sessions and transactions.

WCF supports more protocols for transporting messages than ASP.NET Web services. ASP.NET Web services only support sending messages by using the Hypertext Transfer Protocol (HTTP). WCF supports sending messages by using HTTP, as well as the Transmission Control Protocol (TCP), named pipes, and Microsoft Message Queuing (MSMQ). More important, WCF can be extended to support additional transport protocols. Therefore, software developed using WCF can be adapted to work together with a wider variety of other software, thereby increasing the potential return on the investment.

WCF provides much richer facilities for deploying and managing applications than ASP.NET Web services do. In addition to a configuration system, which ASP.NET also has, WCF offers a configuration editor, activity tracing from senders to receivers and back through any number of intermediaries, a trace viewer, message logging, a vast number of performance counters, and support for Windows Management Instrumentation.

Given these potential benefits of WCF relative to ASP.NET Web services, if you are using, or are considering using, ASP.NET Web services, you have several options:

Continue to use ASP.NET Web services, and forego the benefits proffered by WCF.

Keep using ASP.NET Web services with the intention of adopting WCF at some time in the future. The topics in this section explain how to maximize the prospects for being able to use new ASP.NET Web service applications together with future WCF applications. The topics in this section also explain how to build new ASP.NET Web services so as to make it easier to migrate them to WCF. However, if securing the services is important, or reliability or transaction assurances are required, or if custom management facilities will have to be constructed, then it is a better option to adopt WCF. WCF is designed for precisely such scenarios.

Adopt WCF for new development, while continuing to maintain your existing ASP.NET Web service applications. This choice is very likely the optimal one. It yields the benefits of WCF, while sparing the cost of modifying the existing applications to use it. In this scenario, new WCF applications can co-exist with existing ASP.NET applications. New WCF applications will be able to use existing ASP.NET Web services, and WCF can be used to program new operational capabilities into existing ASP.NET applications by virtue of WCF ASP.NET compatibility mode.
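For reference, a service opts into WCF's ASP.NET compatibility mode roughly as sketched below; the contract and service names are invented for illustration, and the hosting application also needs aspNetCompatibilityEnabled="true" on the serviceHostingEnvironment element in web.config:

using System.ServiceModel;
using System.ServiceModel.Activation;

[ServiceContract]
public interface ICustomerService
{
  [OperationContract]
  string GetCustomerName(int customerId);
}

// Allows the service to run inside the ASP.NET pipeline and share things like
// HttpContext and session state with the existing ASP.NET application.
[AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
public class CustomerService : ICustomerService
{
  public string GetCustomerName(int customerId)
  {
    // ... reuse the existing application's data access code here ...
    return "example";
  }
}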

Adopt WCF and migrate existing ASP.NET Web service applications to WCF. You may choose this option to enhance the existing applications with features provided by WCF, or to reproduce the functionality of existing ASP.NET Web services within new, more powerful WCF applications.

Note:
Care must be taken if a WCF service is hosted by IIS 5.x and ASP.NET is uninstalled. In that situation, the code for the service can be requested: a file with the .svc extension is treated as a text file and its contents, including any source code, are returned to the requester.

Categories: Others