Speed up changes in Azure Web Roles

4. June 2012 00:37 by Mrojas in Azure

A common question I get from people who have just moved to Windows Azure Web Roles is:

 

I just want to make a little change to a page, but publishing takes too long. What can I do?

 

Well, there are several things you can do. Usually I find that publishing takes too long because a lot of data has to be uploaded. But why? The database is already in Azure, so you don't have to upload it, and an ASP.NET web site is usually just some binaries and some text files.

 

To answer this question I love to use a tool called HDgraph (http://www.hdgraph.com/).

 

 

This tool provides a graphical representation of your hard drive, allowing you to easily identify which folders consume the most space.

 

What I usually find is a lot of graphics (.jpg, .png, .gif), videos (.flv, .avi), presentations (.pptx), and PDF files that are part of the site. No wonder uploading site changes takes so long.

 

If you package all of that content inside your web role, you are not really taking advantage of the Azure platform's features.

Azure provides a massive storage mechanism, and you should take advantage of it.

 

But how do I do that?

 

First, download a tool like Azure Storage Explorer: http://azurestorageexplorer.codeplex.com/

 

 

Create a folder (well, a container), for example content, in Azure Storage and upload your files to that container.
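
As a rough sketch, here is what that looks like in code. This assumes the 1.x StorageClient library that shipped with the Azure SDK of this era, and the account name, key, and file paths are placeholders of mine:

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Placeholder credentials: use your own storage account name and key.
var account = CloudStorageAccount.Parse(
    "DefaultEndpointsProtocol=https;AccountName=mycutesite;AccountKey=...");
var client = account.CreateCloudBlobClient();

// "content" is the container; mark its blobs publicly readable so the
// site can link to them directly.
var container = client.GetContainerReference("content");
container.CreateIfNotExist();
container.SetPermissions(new BlobContainerPermissions
{
    PublicAccess = BlobContainerPublicAccessType.Blob
});

// Upload one of the heavy assets that used to live inside the web project.
var blob = container.GetBlockBlobReference("images/logo.png");
blob.UploadFile(@"C:\MySite\images\logo.png");

(Azure Storage Explorer does exactly this through its UI, so the code is only useful if you want to script the upload.)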

 

And now it is time to set your CNAME record to point to your storage. Normally it will be something like making a CNAME entry for content or media point to mycutesite.blob.core.windows.net.

(If you don't remember what a CNAME is: it is something you configure with your registrar, the people you pay for your domain, to map names to other names. For example, you can say that if the browser navigates to www.mycutesite.com, requests go to mycutesite.cloudapp.net, and if it navigates to media.mycutesite.com, requests go to mycutesite.blob.core.windows.net.)

 

 

I recommend reading Brandon Werner's excellent and very detailed quick intro to Windows Azure, which covers these CNAME entries:

 http://blogs.msdn.com/b/brandonwerner/archive/2009/11/28/how-to-host-your-site-and-content-on-azure-quickly-and-easily.aspx

 

 

 

Again, the idea is that you remove most of that content from your ASP.NET solution and change your HTML so all of those resources are downloaded from media.mycutesite.com/content, for example <img src="http://media.mycutesite.com/content/logo.png" />.

 

And finally, take advantage of Windows Azure's new Web Deploy feature, which provides a fast way to modify your pages directly. See http://blogs.msdn.com/b/cloud/archive/2011/04/19/enabling-web-deploy-for-windows-azure-web-roles-with-visual-studio.aspx

 

Getting the IP Address of the client in a Windows Azure Role

Are you getting null or empty values from some Request.ServerVariables?

When you convert your ASP application to run on Windows Azure, it is a good idea
to pay attention to the methods used to get the user's IP address.
Normally the recommendation is to use Request.UserHostAddress; however,
our friend Alex found that this property can return null or empty.

After some research, Alex found that there are several scenarios in which
you must check both the REMOTE_ADDR and the HTTP_X_FORWARDED_FOR server variables:

More info:
http://forums.asp.net/t/1138908.aspx and
http://meatballwiki.org/wiki/AnonymousProxy

A possible code snippet that can provide a value for the client address is:

public static string ReturnIP()
{
    var request = System.Web.HttpContext.Current.Request;
    var forwardedFor = (string)request.ServerVariables["HTTP_X_FORWARDED_FOR"];
    var remoteAddr = (string)request.ServerVariables["REMOTE_ADDR"];
    string ip = "127.0.0.1"; // fallback when neither variable has a value

    if (!string.IsNullOrEmpty(forwardedFor) &&
        !forwardedFor.ToLower().Contains("unknown"))
    {
        // HTTP_X_FORWARDED_FOR may contain a comma-separated chain of
        // addresses when proxies are involved; the first one is the client.
        string[] ipRange = forwardedFor.Trim().Split(',');
        ip = ipRange[0].Trim();
    }
    else if (!string.IsNullOrEmpty(remoteAddr))
    {
        ip = remoteAddr.Trim();
    }
    return ip;
}

In the previous code the HTTP_X_FORWARDED_FOR value is examined first; if it is not null or "unknown", the client's IP address
is taken from there (specifically from the first entry of the comma-separated list). Otherwise REMOTE_ADDR is used.
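
For example, assuming the helper lives in a class called NetworkHelpers (a name I made up for illustration), you can log the caller's address from any page:

protected void Page_Load(object sender, EventArgs e)
{
    // Falls back to 127.0.0.1 when neither server variable has a value.
    string clientIp = NetworkHelpers.ReturnIP();
    System.Diagnostics.Trace.TraceInformation("Request from " + clientIp);
}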

Windows Azure and Websites in a Flux

1. May 2011 07:47 by Mrojas in Azure

Windows Azure is a great platform, the scalability opportunities are great,
and deployment time is also great.
You can have your whole website up and running in just 10-15 minutes.

But… and yes there is always a but.

Sometimes you have a website that is not that static; as a matter of fact,
you are changing its views constantly, especially while some ideas are not finished.
And yes, you can test locally, but there are also situations where you want that same flexibility in the cloud.

Well, looking around I found a very interesting solution by
Maarten Balliauw: http://blog.maartenballiauw.be/post/2009/06/09/A-view-from-the-cloud-(or-locate-your-ASPNET-MVC-views-on-Windows-Azure-Blob-Storage).aspx

What he proposes is to use Windows Azure storage as a virtual file system, so that with simple tools
like Azure Storage Explorer you can modify your web pages without going through a lengthy republish process.
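
The core of his trick is an ASP.NET VirtualPathProvider that resolves view paths against blob storage instead of the local disk. The following is only a minimal sketch of that idea, assuming the 1.x StorageClient library and a container named views (both assumptions of mine); see Maarten's post for the real implementation:

using System.IO;
using System.Web.Hosting;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Serves files from blob storage, falling back to the regular
// file system when no blob with the requested path exists.
public class BlobVirtualPathProvider : VirtualPathProvider
{
    private readonly CloudBlobContainer container;

    public BlobVirtualPathProvider(string connectionString)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        container = account.CreateCloudBlobClient()
                           .GetContainerReference("views");
    }

    public override bool FileExists(string virtualPath)
    {
        return FindBlob(virtualPath) != null || Previous.FileExists(virtualPath);
    }

    public override VirtualFile GetFile(string virtualPath)
    {
        var blob = FindBlob(virtualPath);
        return blob != null
            ? (VirtualFile)new BlobVirtualFile(virtualPath, blob)
            : Previous.GetFile(virtualPath);
    }

    private CloudBlob FindBlob(string virtualPath)
    {
        try
        {
            var blob = container.GetBlobReference(virtualPath.TrimStart('~', '/'));
            blob.FetchAttributes(); // throws if the blob does not exist
            return blob;
        }
        catch (StorageClientException)
        {
            return null;
        }
    }
}

public class BlobVirtualFile : VirtualFile
{
    private readonly CloudBlob blob;

    public BlobVirtualFile(string path, CloudBlob blob) : base(path)
    {
        this.blob = blob;
    }

    public override Stream Open()
    {
        return new MemoryStream(blob.DownloadByteArray());
    }
}

You register the provider once, for example in Application_Start, with HostingEnvironment.RegisterVirtualPathProvider(new BlobVirtualPathProvider(connectionString)); after that, editing a view is just replacing a blob.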

So go ahead and keep enjoying Azure.

Windows Azure Migration: Database Migration, Post 1

2. April 2011 18:14 by Mrojas in Azure

When you are doing an Azure migration, one of the first things you must do is
collect all the information you can about your database.
At some point in your migration process you will also have to decide between
SQL Azure, Azure Storage, and Azure Tables.

To make all the appropriate decisions you need to collect at least basic data like:

  • Database Size
  • Table Size
  • Row Size
  • User Defined Types or any other code that depends on the CLR
  • Extended Properties

Database Size

You can use a script like this to collect some general information:

create table #spaceused (
  databasename varchar(255),
  size varchar(255),
  owner varchar(255),
  dbid int,
  created varchar(255),
  status varchar(255),
  level int)

insert #spaceused (databasename, size, owner, dbid, created, status, level)
exec sp_helpdb

select * from #spaceused for xml raw

drop table #spaceused
 

When you run this script you will get XML output like:

<row databasename="master" 
size=" 33.69 MB" 
owner="sa" 
dbid="1" 
created="Apr 8 2003" 
status="Status=ONLINE, ..." 
level="90"/>
<row databasename="msdb" 
size=" 50.50 MB" 
owner="sa" 
dbid="4" 
created="Oct 14 2005" 
status="Status=ONLINE, ..." 
level="90"/>
<row databasename="mycooldb" 
size=" 180.94 MB" 
owner="sa" 
dbid="89" 
created="Apr 22 2010" 
status="Status=ONLINE, ..." 
level="90"/>
<row databasename="cooldb" 
size=" 10.49 MB" 
owner="sa" 
dbid="53" 
created="Jul 22 2010" 
status="Status=ONLINE, ..." 
level="90"/>
<row databasename="tempdb" 
size=" 398.44 MB" 
owner="sa" dbid="2" 
created="Feb 16 2011" 
status="Status=ONLINE, ..." 
level="90"/>

And yes, I know there are several other scripts that can give you more detailed information about your database,
but this one answers simple questions like:

Does my database fit in SQL Azure?
What is an appropriate SQL Azure DB size?

Also remember that SQL Azure is based on SQL Server 2008 (level 100).

80 = SQL Server 2000

90 = SQL Server 2005

100 = SQL Server 2008


If you are migrating from an older database (level 80 or 90) it might be necessary to upgrade first.

This post might be helpful: http://blog.scalabilityexperts.com/2008/01/28/upgrade-sql-server-2000-to-2005-or-2008/

Table Size

Table size is also important. There is a great script for that:

http://vyaskn.tripod.com/sp_show_biggest_tables.htm

If you plan to migrate to Azure Storage (specifically Azure Tables) there are certain constraints. For example, consider looking at the number of columns in each table.

You can use this script: http://www.novicksoftware.com/udfofweek/vol2/t-sql-udf-vol-2-num-27-udf_tbl_colcounttab.htm (I just had to change the ALTER to CREATE).
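
If you prefer not to install a UDF, a quick client-side count works too. This is a sketch of my own (plain ADO.NET, hypothetical connection string); the column counts matter because an Azure Table entity can only hold a limited number of properties (on the order of 250):

using System;
using System.Data.SqlClient;

class ColumnCounts
{
    static void Main()
    {
        // Hypothetical connection string: point it at the database under analysis.
        using (var conn = new SqlConnection(
            "Server=.;Database=mycooldb;Integrated Security=true"))
        {
            conn.Open();
            var cmd = new SqlCommand(
                @"SELECT TABLE_NAME, COUNT(*) AS ColumnCount
                  FROM INFORMATION_SCHEMA.COLUMNS
                  GROUP BY TABLE_NAME
                  ORDER BY ColumnCount DESC", conn);
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1} columns", reader[0], reader[1]);
                }
            }
        }
    }
}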

Row Size

I found this script on a forum (thanks to Lee Dice and Michael Lee):

DECLARE @sql        VARCHAR (8000)
        , @tablename  VARCHAR (255)
        , @delim      VARCHAR (3)
        , @q          CHAR (1)

  SELECT @tablename = '{table name}'
       , @q         = CHAR (39)

  SELECT @delim = ''
       , @sql   = 'SELECT '

  SELECT @sql   = @sql
                + @delim
                + 'ISNULL(DATALENGTH ([' + name + ']),0)'
       , @delim = ' + '
  FROM   syscolumns
  WHERE  id = OBJECT_ID (@tablename)
  ORDER BY colid

  SELECT @sql = @sql + ' rowlength'
              + ' FROM [' + @tablename + ']'
       , @sql =  'SELECT MAX (rowlength)'
              + ' FROM (' + @sql + ') rowlengths'
  PRINT @sql
  EXEC (@sql)

Remember to replace {table name} with the name of the table you need.

User Defined Types or any other code that depends on the CLR

Just look at your DB scripts and determine whether there are any CREATE TYPE statements that reference an assembly.
Also determine whether CLR is enabled with a query like:

select * from sys.configurations where name = 'clr enabled'

If the value column of this query returns 1, then CLR is enabled.

Extended Properties

Look for calls to sp_addextendedproperty, sp_dropextendedproperty, OBJECTPROPERTY, and sys.extended_properties in your scripts.

Doing Backups in Windows Azure

17. March 2011 03:47 by Mrojas in Azure

When we migrate our customers to Azure, we want them to take advantage of
this rich platform.

Even for a simple deployment you get a Windows Azure storage account,
and that account means up to 100 TB of storage! So take advantage of that.

One common thing that any enterprise needs is backups.
You need backups of your email, files, databases, documents, etc.

Sometimes you can have a dedicated server for storing that data, but
all hard drives can fail, so you will need to make several copies of your
backup information: probably use a RAID array, make backups of your backups
on tape, DVD, etc.

What if you could just use your cloud storage, which is replicated three times inside secure datacenters?

Well you can!

There are currently several solutions.

For example, on CodePlex you will find projects like MyAzureBackup: http://myazurebackup.codeplex.com

MyAzureBackup provides a simple-to-use web user interface.
I have even extended some of its functionality with a Windows Service that uses
a FileSystemWatcher to upload files from a directory (see the sketch below).
It is also easy to use this application as a base for your backup infrastructure, adding
security functionality like encrypting files.
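
Here is a minimal sketch of that watcher-plus-upload idea. The folder path, account credentials, and container name are placeholders of mine, it again assumes the 1.x StorageClient library, and a real service should also wait until a file is fully written before uploading it:

using System;
using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class BackupWatcher
{
    static void Main()
    {
        // Placeholder credentials: use your own storage account name and key.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=mybackups;AccountKey=...");
        var container = account.CreateCloudBlobClient()
                               .GetContainerReference("backups");
        container.CreateIfNotExist();

        var watcher = new FileSystemWatcher(@"C:\BackupDrop")
        {
            IncludeSubdirectories = true,
            EnableRaisingEvents = true
        };

        // Every file dropped into the folder gets copied to blob storage.
        watcher.Created += (sender, e) =>
        {
            var blob = container.GetBlockBlobReference(e.Name.Replace('\\', '/'));
            blob.UploadFile(e.FullPath);
            Console.WriteLine("Backed up " + e.Name);
        };

        Console.WriteLine("Watching... press Enter to stop.");
        Console.ReadLine();
    }
}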

So go ahead and explore all the new possibilities you have with cloud computing.