Updating Multiple ASP.NET Sites by FTP

Imagine a scenario where you have around 100 ASP.NET web sites, all using the same code base, each consuming site-specific data from the same database, and each serving several thousand dynamic pages. Sounds daunting, doesn't it? Once all these sites are configured you are fine for the time being, but what happens when you want to roll out a new version of the code to all of them? Some of you will already be shouting "bad architecture", but if a client wants a dedicated IP address for each site, and you intend to do caching in any meaningful way, then separate sites with separate resources, potentially on different web servers and even at different hosting companies, does make some sense.

I first looked into using an FTP library such as the one at http://www.codeproject.com/KB/IP/ftplib.aspx, but soon realized that any hand-rolled application would need to be robust enough to handle connection problems, update failures, and so on. It was then that I realized I already owned a tool that could, with a little effort, take care of the job.
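To illustrate the kind of robustness a hand-rolled tool would need, here is a minimal Python sketch of a retry wrapper around an ftplib upload. The helper names, retry count, and delay are my own illustrative choices, not anything from the library linked above:

```python
import ftplib
import time

def with_retry(action, retries=3, delay=5):
    # Run action(); on an FTP/network error, wait and try again,
    # re-raising the error once the retry budget is exhausted.
    for attempt in range(1, retries + 1):
        try:
            return action()
        except ftplib.all_errors:
            if attempt == retries:
                raise
            time.sleep(delay)

def upload_file(host, user, password, local_path, remote_name):
    # Hypothetical helper: upload one file inside a fresh connection.
    # A real tool would also need logging and per-site error reporting.
    def action():
        with ftplib.FTP(host, user, password) as ftp:
            with open(local_path, "rb") as f:
                ftp.storbinary("STOR " + remote_name, f)
    return with_retry(action)
```

Multiply this by 100 sites, plus folder navigation, deletes, and scheduling, and the appeal of an off-the-shelf tool becomes clear.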

WS_FTP Professional

With any tool there is going to be some setup required, and to make this solution work I did in fact create all 100 or so sites in the WS_FTP GUI. While you can save a site's individual FTP settings within the Visual Studio IDE, it certainly cannot handle the scenario I am describing in this post. The cool thing about WS_FTP Professional is that it ships with a scripting tool; it is also a great way to manage a myriad of FTP connections, letting you name them and group them into categories of your choosing. The scripting language itself is easy to use (and remember) and is very powerful. Once you create a script you can save it, and you can use the WS_FTP scheduler to run scripts at convenient times (in the middle of the night, to cause the least disruption!).
A simple example:

CONNECT "My Sites!Mysitename.info"
LCD "C:\NewFiles"
CD "bin"
RDEL "*"
RPUT "*"
CLOSE

Here the script simply connects to a site in the My Sites group (think of the exclamation mark as a /), changes to the local folder where the new files are located, changes to the bin folder of the remote site, deletes its contents, uploads the new files, and finally closes the connection.

This is a simple example, but you can target specific files and navigate folders during the same session, and you can download files too. Even so, writing 100 of these for a single update is still a little annoying, and sadly the scripting language has no foreach loop to iterate through the sites in the "My Sites" group. What worked for me was to write a small WinForms app which helps me write the scripts using placeholders. This application reads the site names from a small data store and uses syntax like:

CONNECT "#SiteName#"
LCD "#LocalSiteFolder#"
DEL "web.config"
PUT "web.config"

When I click the Go button in the WinForms app, it generates a script for each of the sites, replacing the placeholders (e.g. #SiteName#) with the appropriate text. This also works well when, as in my scenario, the web.config file for each site has site-specific settings: in the example above I am able to target the folder for that specific site (which contains site-specific files) and specify a file in it. The same approach works well for CSS files, as each site has its own theme.
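The placeholder substitution itself is just string replacement applied once per site. Here is a minimal sketch of the idea in Python (my actual app was WinForms; the site list and field names below are invented for illustration):

```python
# Template using the same #Placeholder# convention described above.
TEMPLATE = """CONNECT "#SiteName#"
LCD "#LocalSiteFolder#"
DEL "web.config"
PUT "web.config"
"""

def generate_scripts(sites):
    # Expand the template once per site, replacing each #Key# placeholder
    # with that site's value. `sites` is a list of dicts read from a
    # small data store in my case.
    scripts = []
    for site in sites:
        script = TEMPLATE
        for key, value in site.items():
            script = script.replace("#" + key + "#", value)
        scripts.append(script)
    return "\n".join(scripts)
```

The combined output is one long script, ready to paste into the WS_FTP scripting utility or save for the scheduler.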

All I need to do is save the script and schedule it to run, or simply paste it directly into the WS_FTP scripting utility and let it go about its business. The WS_FTP scripting tool also has a log panel, so you can see if an update has failed for whatever reason.

As you can see with a little effort and the right tool you can turn a maintenance nightmare into a relatively painless exercise.
