Windows 10 on Mac Bootcamp – fixes

Windows 10 on Bootcamp (13-inch MacBook Pro, Bootcamp 5.1) has some teething issues as of build 10162.

SSD Powering down problems

You might notice Windows hanging for extended periods or blue screening – the SSD is literally powering down underneath Windows. The Bootcamp drivers don’t properly support Windows 10’s powering down of the SSD to save battery. Your Event Log might have references to “Event 129, storahci – Reset to device, \Device\RaidPort0, was issued.” To fix this, you need to disable AHCI Link Power Management and prevent storahci from entering low power mode.

1. Copy and paste the following into a new text file called “enable-hipm.reg” and save it:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Power\PowerSettings\0012ee47-9041-4b5d-9b77-535fba8b1442\0b2d69d7-a2a1-449c-9680-f91c70521c60]
"Attributes"=dword:00000002

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Power\PowerSettings\0012ee47-9041-4b5d-9b77-535fba8b1442\dab60367-53fe-4fbc-825e-521d069d2456]
"Attributes"=dword:00000002

2. Double click the file to import the records into the registry. Setting these “Attributes” values to 2 unhides the AHCI Link Power Management options in the Power Options UI.

3. Right click on the Battery icon in the Taskbar, select “Power Options”. Click “Change plan settings” under the “Balanced” option. Then click “Change advanced power settings”.

4. Expand the “Hard disk” node and you’ll see “AHCI Link Power Management – HIPM/DIPM”. You need to set the value to “Active” as seen below:

[Screenshot: “AHCI Link Power Management – HIPM/DIPM” set to “Active” in advanced power settings]

5. Create another registry file called “storahci.reg” with the following content:

Windows Registry Editor Version 5.00
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\storahci\Parameters\Device]
"SingleIO"=hex(7):2a,00,00,00
"NoLPM"=hex(7):2a,00,00,00

6. Double click the file to import the registry entries. This stops storahci from going into Low Power Mode.

A restart should then solve the SSD freezing problems.

System Restore, Restore Points and Windows 7 style backups do not work

Again, if you are getting messages such as “check the event log for VSS errors” when trying to back up or create a restore point, and then find event log messages like:

Volume Shadow Copy Service error: Unexpected error CreateFileW(\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy48\,0x80000000,0x00000003,…).  hr = 0x80070001, Incorrect function.

Operation:
   Processing PreFinalCommitSnapshots

Context:
   Execution Context: System Provider

Then you’ll find that this is another Bootcamp driver problem – specifically AppleHFS.sys, the driver that allows read-only access to HFS volumes. You need to stop it from loading at startup:

1. Download Sysinternals Autoruns and run it as an Administrator.

2. Search for “apple” and you’ll see “applehfs.sys”.

[Screenshot: Autoruns showing the AppleHFS driver entry]

3. Disable it by unchecking AppleHFS and restart. You should now be able to create System Restore images and Windows 7 style backups.

Hopefully Apple updates Bootcamp for Windows 10. If I find any other issues I’ll update this post.


.NET web app cloud deployments in 2015

.NET web applications tend to get treated very poorly in the real world – some people still think that copying and pasting the contents of their /bin/Release/ directory (lovingly referred to as “DLLs”) over Remote Desktop to a web server and manually setting up IIS is acceptable – but this is now 2015 and the world has moved on. Here are my thoughts on some of the various ways you can deploy .NET apps to the cloud.

First things first – keeping your .NET app cloud ready

Real cloud environments are stateless. You must treat the web servers you use as ephemeral. DevOps practitioners treat virtual servers as cattle, not pets, and don’t nurse servers back to health if there is a problem. Instead they take them out back, shoot them in the head and spin up a new one.

The .NET Framework does not make building cloud-ready, stateless, scalable applications easy by default, especially if you are still shaking off decade-old WebForms habits. Here is some advice:

  • Never use Session State. If you type HttpContext.Current.Session you lose. Using Session State either forces you to run a “Session State Server”, building a single point of failure into your architecture, or forces you to use sticky load balancers so that users continuously hit the same web node where their in-memory session lives.
  • You’ll need to synchronize your MachineKey settings between machines, so all nodes use the same keys for crypto.
  • Multiple nodes will break ASP.NET MVC’s TempData (typically used for Flash messages) – try CookieTempData instead.
  • For configuration values, only use web.config AppSettings and ConnectionStrings (see the sketch after this list). Sticking to this rule will give you maximum compatibility with the various cloud deployment platforms I’ll outline below. And no, don’t use Environment Variables, despite what The Twelve-Factor App enthuses – Windows apps do not use Environment Variables for application configuration.
  • Do not rely on any pre-installed software. All dependencies should be pulled from NuGet and distributed with your application package. If you use a vendor’s “solution” (custom PDF components? Using Office to create Excel files? CrystalReports?) insist on a NuGet package or remove the vendor’s software. This is 2015.
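As a minimal sketch of that configuration rule – the “Api.BaseUrl” and “MainDb” names are made up for illustration – funnel everything through ConfigurationManager so any of the platforms below can rewrite the values at deployment time:

using System.Configuration; // requires a reference to System.Configuration

public static class AppConfig
{
    // Deployment platforms (Azure Websites, Octopus etc.) substitute these
    // AppSettings and ConnectionStrings values in web.config at deploy time.
    public static string ApiBaseUrl
    {
        get { return ConfigurationManager.AppSettings["Api.BaseUrl"]; }
    }

    public static string MainDbConnectionString
    {
        get { return ConfigurationManager.ConnectionStrings["MainDb"].ConnectionString; }
    }
}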


Azure Websites

The granddaddy of .NET Platform as a Service and the cornerstone of almost every Azure demo. Azure Websites is a very high level abstraction over IIS and .NET web farms, supports lots of very cool deployment mechanisms and is easily scalable.

  • Deploy from GitHub, TFS, Mercurial and more: the very clever software under the hood (Kudu) monitors your branches for changes, runs MSBuild for you and deploys your app.
  • Lots of features – staging slots (with DNS switch over for zero-downtime deploys), scaling with a slider, monitoring and logging all included
  • You don’t get access to the underlying Windows VM that the sites are running on – even if you pay to have dedicated VMs for your sites. This does mean that you get auto-patching, but if you have any exotic requirements (I’ve seen third party APIs have such broken SSL implementations you need to install their Root CA certificate on your web server) you’ll be out of luck as there is no way to run scripts on the servers.
  • To configure your app, you can set variables that replace AppSettings or ConnectionStrings in your web.config at deployment time.
  • Azure Websites also supports PHP, Java, node.js and more, if you are happy to run those frameworks on Windows. This blog is WordPress backed, so PHP, and running on Azure Websites!

An honorable mention goes out to AppHarbor – they technically got there first by providing a Heroku-like experience for .NET developers. Also note that Azure has “Azure Cloud Services” – this is significantly more complex than Azure Websites and ties you deeply into the Azure platform. Azure Cloud Services are typically chosen for long running cloud systems rather than transactional web sites (think Xbox Live rather than a high traffic blog).


Amazon Web Services Elastic Beanstalk

Amazon are by far the biggest cloud provider out there and they try to tick as many Windows feature boxes as possible to woo enterprises. Elastic Beanstalk is a Platform as a Service deployment platform, similar to Azure Websites, but completely platform agnostic. Since it uses all the existing EC2 APIs underneath (Elastic Load Balancing, Auto Scaling Groups etc.), language and OS support is much broader than Azure Websites’, at the expense of not being optimised for Windows/.NET workloads.

  • There is no cheap, shared tier. Your application runs on a dedicated VM that you have access to. This makes costs a bit higher (unless you are crazy and want to try to run .NET on micro instances) but gives you more control. As part of your deployment package you can include Powershell scripts that can execute on your VM.
  • The user interface is very limited – when I last checked the only configuration values you could set via the UI were named “PARAM1”, “PARAM2”, “PARAM3” etc, which limited your AppSettings to using those names unless you wanted to completely script your deployment.
  • If you want a SQL Server as a Service, you are limited to RDS which charges for the whole VM and SQL Server license. Azure’s SQL Server service charges for CPU time and disk space, which can work out quite a bit cheaper.
  • Docker container support is available – this will become important for .NET developers when ASP.NET 5 is out of beta and CoreCLR is ready.


Opscode Chef + Azure or AWS VMs

Opscode Chef is a favorite of the “infrastructure as code” crowd, and it can be made to work on Windows. Given standard virtual machines on either AWS or Azure, you can install the Chef service on your nodes and execute Chef recipes.

  • Chef recipes are written in Ruby. This may or may not be a problem depending on your team (I can count the number of .NET developers I know who are also good at Ruby on one hand) but it definitely adds an extra skill requirement. It is possible to use Chef recipes to bootstrap Powershell scripts, but then you have a Rube Goldberg machine of pain.
  • Ruby is simply not designed to run on Windows, let alone for long-running processes. The Chef service had a long-standing bug on Windows where Ruby would simply run out of memory. Anybody who has tried getting every gem in a typical Ruby on Rails Gemfile to compile on Windows knows the pain I am talking about. Windows support for Ruby is an afterthought.
  • One thing Azure has over AWS for Chef deployments is the ability to pre-install the Chef Client onto a VM when you start it, all from the UI. AWS requires you to manually apt-get the client.
  • Chef recipes are based on the concept of convergence – the desired state of the server is described and then a policy is calculated to bring the server to that state. Coincidentally, this is exactly what Powershell Desired State Configuration does. Chef have plans to integrate with Powershell DSC.


Octopus Deploy + Azure or AWS VMs

Octopus Deploy is quickly becoming one of my favourite parts of the .NET ecosystem. Built by some of the finest .NET developers in the land, for .NET developers, it provides the Platform as a Service ease of Azure Websites with the power of running your own VMs. I think of it as bringing your own platform layer to infrastructure you might get elsewhere – I’ve dealt with a big deployment of Octopus on AWS.

  • VMs can be assigned to environments, enabling a fully customisable Test-UAT-Staging-Production workflow with release promotion.
  • Your build server needs to create “OctoPack” packages – a NuGet package variant. These packages then get pushed to the Octopus server’s NuGet feed and can be deployed.
  • A deployment agent called a “Tentacle” is deployed on each VM. A single MSI command can install and enroll the node.
  • Elastic scaling is not included – Octopus does not manage your environment for you.
  • Deployment steps are fully customisable – you can create IIS sites, AppPools, run custom scripts or even install Windows Services
  • Configuration settings for your application are set as variables that apply to AppSettings and ConnectionStrings in your web.config when you deploy.

The Octopus Deploy team is currently working on version 3.0, which will replace the RavenDB database with SQL Server. I’m very much looking forward to it. Octopus isn’t limited to cloud-deployments either – it can be used equally well for on-premise datacenters.


In summary then, I’d choose Azure Websites if the application is simple enough to work within its constraints. Given an application with multiple tiers (microservices etc.) or special deployment requirements (third party software, certificates), I’d go for Octopus Deploy on top of whichever is your organisation’s favored cloud provider.

If you have any thoughts on the above, or can point out a mistake I’ve made, please drop me an email or leave a comment.


Windows 8.1’s user-hostile backup story

Update 13th June 2015: It looks like Microsoft has reversed course and reinstated the Windows 7 Backup and Restore feature in Windows 10! Success!

Update 16th Sept 2013: What follows is a rant written after upgrading to 8.1 and seeing my automated backups just stop working and the backup restoration function also removed. An afternoon was wasted faffing about with the new File History feature, which still refuses to back up files not on my system drive and deliberately ignores files in the SkyDrive folder, preventing you from having a local backup. I now use CrashPlan, which backs up files the way Windows 7 and 8.0 used to.

Windows 8.1 is now released to manufacturing and those with MSDN or TechNet subscriptions can download it now.

I have the RTM version now set up on home and work machines and have been running the Preview versions on both my Surface RT and Surface Pro. Windows 8.1 has some glaring errors.

SkyDrive integration now built in, removes features compared to the old Desktop client

You no longer have to install the SkyDrive app separately as it is now built into the OS. Windows 8.1 makes a concerted effort to force you to use SkyDrive.

[Screenshot: a screen from the Windows 8.1 upgrade process promoting SkyDrive]

The above is a screenshot of a screen displayed during the upgrade process. Can you see the error?

“automatically backed up to the cloud”

This is nonsense – if you delete a file from your local machine, it will be deleted from the cloud. This is NOT a backup. Live mirroring is NOT a backup. If a sync goes wrong – your file disappears completely. There is no backup.

Edit: Turns out SkyDrive has a “Recycle Bin” for deleted files, but it removes files once it is full or after 30 days. So, you might randomly be able to get your file back. Version History also only works for Office documents.

[Screenshots: Windows 8 SkyDrive integration via the Desktop app (left) and Windows 8.1’s built-in SkyDrive integration (right)]

On the left is Windows 8’s SkyDrive integration after installing the SkyDrive Desktop app. On the right is Windows 8.1’s built-in SkyDrive integration. This is now a system-level folder and doesn’t even have the syncing icons available. This folder is now virtualized and you aren’t guaranteed that the actual file will be present. Opening the file will sometimes download it from SkyDrive.

The “Windows 7 File Recovery” backup system has been removed in 8.1

Windows 8.0 actually contained two backup systems: the new “File History” and the awesome old backup system from Windows 7, threateningly renamed “Windows 7 File Recovery” as a warning that it would be removed. Let’s compare the systems:

Windows 8 File History               | Windows 7 File Recovery
Files in Libraries and desktop only  | All files in all locations supported
No system image support              | Full system image support
No progress bar                      | Progress bar

File History is Microsoft’s attempt to copy OS X’s Time Machine, except Time Machine actually backs up all your files and lets you restore the entire OS partition, just like Windows 7 did! At least you had the choice in 8.0 to use the old system. In 8.1, Windows 7 File Recovery has been removed completely – you can’t even restore your old backups!

Edit: You can create a System Image in 8.1 (Click “System Image Backup” on the File History screen) but this doesn’t work on an automated schedule, so is not an automatic backup.

It gets worse. File History is even more useless in 8.1.

“File History” no longer backs up your SkyDrive folder in 8.1

Microsoft really don’t want you to have a local backup of your SkyDrive files. Take a look at this:

[Screenshot: Windows 8 File History]

[Screenshot: Windows 8.1 File History]

No problem you think? Just add your SkyDrive folder to a Library and it should back up? Nope – all the SkyDrive files are ignored completely, even if you manually add them to a Library.

The response from Microsoft on this is beyond tragic (from here):

[Screenshot: Microsoft’s forum response]

Your files “are protected by the cloud in case user lose/damage their device”. What about protection from user error or viruses or badly written programs? If your files get corrupted the corrupted files will sync to the cloud and then sync to all your other devices.

Conclusions

It appears that Microsoft are desperate to push SkyDrive, even at the expense of the computing safety of their customers – customers you’d hope were being educated about safe computing. Now that I am on 8.1, I am personally stuck with no built-in backup system. My experience with File History has been awful – it appears to even ignore an additional Library I’ve created to include non-library files. I literally cannot get it to back up files on my computer – it is useless. I am going to have to go with a third-party backup system like CrashPlan now.

Windows 8.1 was Microsoft’s chance to undo the wrongs of Windows 8. Users are now faced with the prospect of upgrading and being left with no backup solution or, even worse, their existing backups just stopping working with no warning.

Sort it out Microsoft.

Edit: Some excellent discussion on this over at Hacker News: https://news.ycombinator.com/item?id=6388431


Evernote has no patience, drops WPF over fixed issues

Much noise has been made about Evernote’s new Windows client. For version 4, they dropped WPF/.NET and released a C++ native application.

They were pretty damning with their reasoning:

Evernote 4 is a major departure from Evernote 3.5 in every way. While 3.5 added tons of great new features, there were some problems we simply couldn’t fix: the blurry fonts, slow startup times, large memory footprint, and poor support for certain graphics cards were all issues that the technology behind 3.5 (Windows .net and WPF) was incapable of resolving. As a result, we ended up chasing down platform bugs rather than adding the great features our users wanted.

So we decided to start over from scratch, with fast, native C++ that we knew we could rely on. As you’ll see, the results are amazing. This new version will set a foundation for rapid improvement.

Evernote 4 is designed to give you a great experience on any computer that you use, whether you’re on a netbook, a five year old Windows XP machine or a super fast top-of-the-line Windows 7 computer.

On our test hardware, Evernote 4 starts five times faster, and uses half the memory of Evernote 3.5.

You cannot make statements like “issues that the technology behind 3.5 … was incapable of resolving” without providing more information on the problems they faced and the solutions they tried. For all we know, their Windows client development team could have tripled in size to get the native version out the door.

Evernote 3.5 was a textbook example of reasons to immediately upgrade to .NET 4.0 if you are building WPF applications. Visual Studio’s UI layer is now in WPF, presumably after fixing all these issues.

“The text is blurry”

This is fixed in WPF 4.0, which can render text almost exactly as Windows does if developers request it. The default behaviour is a DPI-independent, accurate representation of the font on screen (the same way OS X renders text, and also the reason why text looks “blurry” on Macs).

This is a one-line fix in .NET 4.0 – just apply TextOptions.TextFormattingMode="Display" to your root XAML element. Asian text will also now use bitmap fonts, so customers on Japanese XP machines will get consistent text rendering between your app and their vintage OS (of course Vista/Win7 should be using Meiryo).
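If you prefer to do it from code-behind, the same attached property can be set on the root window and inherits down the visual tree – a minimal sketch:

using System.Windows;
using System.Windows.Media;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();
        // Equivalent to TextOptions.TextFormattingMode="Display" in XAML:
        // glyphs snap to whole pixels, like classic GDI text rendering.
        TextOptions.SetTextFormattingMode(this, TextFormattingMode.Display);
    }
}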

“The download size is too large”

The runtime for .NET 3.5 is 65MB or so. .NET 4.0 has reduced this to a 28MB download. Ironically, the Evernote 4.0 installer is 40MB – most likely larger than an installer for 3.5 with the .NET 4.0 runtime bundled in would have been. Users with .NET 4.0 already installed would have got an even smaller download, as the installer would not need to fetch the runtime.

Evernote 4.0 appears to include Chromium (the Chrome web browser base), Foxit Reader’s PDF libraries, SQLite and libxml, which could be replaced with the built-in WebBrowser control, XPS rendering, SQL CE and the built-in XML libraries of .NET 4.0.

“It uses too much memory”

If your application has hundreds of threads and handles complex local operations, it will require memory. Evernote’s statement about version 4.0 using “half” the memory of the managed 3.5 version is pretty damning considering that the application is not very complicated.

While you may never be able to match the memory consumption of a “native” application in .NET, there are some things you need to know to improve your memory use:

  • Use .NET 4.0 as it has background garbage collection
  • Keep your visual tree small (no endless nested Grids in XAML)
  • Virtualize your ItemsControls! This might be the most important issue. You cannot bind a XAML WrapPanel to 1000 images without virtualization and expect scrolling to be smooth or your memory consumption to be low.
  • Use the ThreadPool and BackgroundWorkers instead of manually creating threads (see the sketch after this list). ThreadPool-based tasks intelligently limit the number of simultaneous tasks based on the number of processor cores you have. Allowing your code to create an infinite number of threads is a recipe for disaster, as each thread needs its own stack space of at least 1MB. Ideally use .NET 4.0’s new parallel programming and async functions.
  • Read the correct memory numbers in Task Manager (Private Working Set).
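To illustrate the threading point, here is a minimal sketch – LoadThumbnail and the folder path are made-up placeholders for whatever per-item work your app does:

using System;
using System.IO;
using System.Threading.Tasks;

class ThumbnailLoader
{
    // Hypothetical per-item work: decode, resize, cache, etc.
    static void LoadThumbnail(string path) { /* ... */ }

    static void Main()
    {
        // Parallel.ForEach hands work to the ThreadPool, which picks a sensible
        // degree of parallelism from the processor count - no hand-rolled
        // threads, no 1MB of stack reserved per work item.
        var files = Directory.EnumerateFiles(@"C:\Images", "*.png");
        Parallel.ForEach(files, LoadThumbnail);
    }
}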

“The application is slow to start up”

Managed applications do not have to start slowly. A new application from a VS template will start up instantly – it is when you start adding references to other libraries, interop hooks and modules (for composite apps) that startup begins to slow down. Perhaps the most important thing to do is use NGEN to generate a native image of your application at install time, instead of waiting for JIT compilation when the user launches the application. The application will also slow down when loading modules for the first time if they have not been pre-compiled. There is some great information on improving start up times on MSDN, as usual.

In general, try to show some part of your application immediately. A splash screen might be okay for application start up times of less than a few seconds, but there is no reason why the main window cannot be displayed and relevant data start loading in the background. This is mainly a perception issue. If you only show your main window after loading all the data that your application could possibly ever need, of course start up will appear to be slow.
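As a minimal sketch of that pattern – FetchNotes and ShowNotes are made-up placeholders for your data layer and UI update:

using System.ComponentModel;
using System.Windows;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent(); // the window is on screen almost immediately

        var worker = new BackgroundWorker();
        worker.DoWork += (s, e) => e.Result = FetchNotes();         // runs on a ThreadPool thread
        worker.RunWorkerCompleted += (s, e) => ShowNotes(e.Result); // marshalled back to the UI thread
        worker.RunWorkerAsync();
    }

    // Hypothetical placeholders.
    object FetchNotes() { return null; }
    void ShowNotes(object notes) { }
}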

Did Evernote even try .NET 4.0?

The .NET development community is waiting for some sort of postmortem from the Evernote team. Most of the above problems would have been solved by upgrading to .NET 4.0 and running some decent profilers on the executable.

Rewriting your application natively will no doubt use less memory and start up faster, but it will also look worse, take longer, be more expensive to develop and your UI will break when high DPI screens start to be used. Portability is not a reason: if you want an application to work on Windows/Mac/Linux/iOS/Android, you write a web application.

If you want an example of an amazing WPF 4.0 application designed for the Windows platform, check out MetroTwit.


No native Japanese text in Windows Phone 7 … yet

The first preview version of the Windows Phone 7 SDK is out and it doesn’t support Japanese (or non-Latin) text in the English ROM. This is a huge disappointment.

[Screenshot: Japanese text rendering as empty squares in the Windows Phone 7 emulator]

One of the major advantages the iPhone has over almost every other smartphone platform (and the major reason I bought mine in the first place – I needed Japanese language support) is the built in support for non-Latin languages and their input methods. This allows Apple to provide one single worldwide firmware edition.

Previous versions of Windows Mobile required users to hack in Japanese fonts using the registry and rely on some awful third party hacks to get Japanese IMEs working. I seriously hope that before WP7 is finished, Microsoft just installs worldwide fonts and IMEs, like they started to do with Vista. There is no excuse. We are unlikely to get low-level access to the registry this time around to hack the support into non-Japanese ROMs ourselves.

One of their slides is supposed to imply that the Metro theme “Celebrates Typography”

[Slide: “Celebrates Typography”]

– more like it totally ignores it. I suppose screens of square boxes fit the “Authentically Digital” principle.


Vista net problems? Disable auto-tuning

I have been having a problem where I would get “This page cannot be displayed” errors the first time I tried to access a website. The odd thing was that hitting Refresh would solve the problem and I would get through to the site immediately. At first I thought it was a DNS problem due to the Internet Connection Sharing setup, but it turns out to be Vista messing around with the net connection via a feature called “Receive Window Auto-Tuning” (this continuously resizes the TCP receive window based on network conditions).

To turn this off, fire up an elevated command prompt window and type in “netsh interface tcp set global autotuninglevel=disabled” without the quotes. This solved all the problems I was having with my net connection in Vista SP1. Strange.



XNA2.0 Dependency checking

After a lot of trial and error using Process Monitor and Virtual PC, I have finally sussed out exactly what XNA2.0 games on Windows need to run. The requirements are slightly different to XNA 1 games.

DirectX runtimes

There are four files that need to be installed in the system32 folder for XNA to initialise properly. They are:

  • xinput1_3.dll
  • x3daudio1_3.dll
  • d3dx9_31.dll, and
  • xactengine2_9.dll

The first three can be placed alongside the application exe and then load fine, but xactengine2_9.dll does not load this way for some reason, and has to be present in the system directory. Distributing these files alongside the application breaks the DirectX EULA, so they have to be installed using dxsetup.exe.

To check for their presence in your XNA game, just put this code in Program.cs before game.Run() is called:

bool HasAllPrereqs = true;
// check all the required files; if any are missing, bail out later
string sysDir = System.Environment.SystemDirectory;
if (!System.IO.File.Exists(System.IO.Path.Combine(sysDir, "xactengine2_9.dll")))
    HasAllPrereqs = false;
if (!System.IO.File.Exists(System.IO.Path.Combine(sysDir, "d3dx9_31.dll")))
    HasAllPrereqs = false;
if (!System.IO.File.Exists(System.IO.Path.Combine(sysDir, "x3daudio1_3.dll")))
    HasAllPrereqs = false;
if (!System.IO.File.Exists(System.IO.Path.Combine(sysDir, "xinput1_3.dll")))
    HasAllPrereqs = false;

If HasAllPrereqs is false after those lines, exit the application before it crashes horribly when XNA tries to initialise.
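Putting it together, the wiring in Program.cs might look like this – Game1 is the default template game class, the message text is my own, and the MessageBox assumes a reference to System.Windows.Forms:

using System;
using System.Windows.Forms;

static class Program
{
    static void Main(string[] args)
    {
        bool HasAllPrereqs = true;
        // ... the four File.Exists checks from above go here ...

        if (!HasAllPrereqs)
        {
            MessageBox.Show("Required DirectX components are missing. " +
                            "Please run the DirectX installer (dxsetup.exe) and try again.");
            return; // bail out before XNA tries (and fails) to initialise
        }

        using (Game1 game = new Game1())
        {
            game.Run();
        }
    }
}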

Visual C++ 2005 SP1 runtimes

Even a fresh Vista install doesn’t have these. They are provided when updating Visual Studio 2005 to SP1, or when installing SP1 of the .NET Framework 2.0. However, Vista comes with SP0 of .NET 2.0, meaning 99% of machines you come across will be lacking what XNA 2.0 needs. There is a 2.5MB standalone installer on the MS download site here which installs what you need even on .NET 2.0 SP0 machines. .NET 3.5 installs .NET 2.0 SP1.

So if any of the following are installed, we safely have the right Visual C++ 2005 runtimes:

  • .NET Framework 2.0 SP1
  • .NET Framework 3.5
  • Visual C++ 2005 SP1 Redistributable

By looking up the product codes in the registry, we can also check at runtime if we have the runtimes (again, before game.Run()):

[DllImport("msi.dll")] // requires using System.Runtime.InteropServices;
public static extern Int32 MsiQueryProductState(string szProduct);

…goes before the main application entry point, and

bool vccOK = false;
// MsiQueryProductState returns 5 (INSTALLSTATE_DEFAULT) when the product is installed
// check for VC++ 2005 SP1 redist (very rare in the wild)
if (MsiQueryProductState("{7299052b-02a4-4627-81f2-1818da5d550d}") == 5) vccOK = true;
// check for .NET Framework 2.0 SP1
if (MsiQueryProductState("{2FC099BD-AC9B-33EB-809C-D332E1B27C40}") == 5) vccOK = true;
// check for .NET Framework 3.5 (includes 2.0 SP1)
if (MsiQueryProductState("{B508B3F1-A24A-32C0-B310-85786919EF28}") == 5) vccOK = true;

If vccOK is still false, exit the application before you call game.Run().


Taskbar invisible over Remote Desktop

I frequently use Remote Desktop (RDP/Terminal Services) to access my machines running Vista SP1 RTM. 90% of the time, after connecting, I get this annoying problem with the Start bar:

[Screenshot: the taskbar rendered invisible, with only the Start button showing]

Only the Start button itself is visible.

  • This happens on all my Vista machines, both SP0 and SP1
  • This happens no matter what version of Remote Desktop I am using, the XP version or the “new” Vista one
  • It only appears to happen when the computer was originally using the Aero DWM composition engine and was running at a different resolution from the one I am asking the Remote Desktop session to render
  • It happens whether the taskbar is at the top or the bottom of the screen

So far the only way I have found to get the taskbar back is to click the lonely Start button, click Windows Security, choose Start Task Manager, kill the explorer.exe process and start it again from Task Manager via File > New Task (Run…).

Subsequent Remote Desktop connections are then fine, but logging back into the machine from the console then back into Remote Desktop makes it disappear again. A bit frustrating, to say the least.
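If you get tired of doing that by hand, the same workaround can be scripted. A minimal sketch – this simply automates the kill-and-restart steps above, nothing more:

using System.Diagnostics;

class RestartExplorer
{
    static void Main()
    {
        // Kill every explorer.exe instance, then start a fresh one -
        // the same steps as the Task Manager workaround above.
        foreach (Process p in Process.GetProcessesByName("explorer"))
        {
            p.Kill();
            p.WaitForExit();
        }
        Process.Start("explorer.exe");
    }
}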


Windows Live ID Return URL banned words

UPDATE: No need to do this now, it’s fixed!

For edngames.com I use Facebook, Yahoo! and Windows Live as sign-on solutions. However, Windows Live is the only system with a restriction on the domain names you can register. For instance, because of the word “games” in my domain, I get the error message “The Return URL field contains a forbidden word or domain. Please use a different Return URL and enter the HIP solution again.”

[Screenshot: the Windows Live ID “forbidden word or domain” error]

Facebook and Yahoo!, competing single sign-on solutions, do not have this restriction. I assume the word “game” is blocked to keep gambling sites away from the authentication system.

To get around this, I have had to set up a dummy domain (edslife.co.uk) without the banned words and perform authentication on that – you cannot simply redirect, because the signature returned by the Windows Live server is invalid for a different return URL. I then have to create my own authentication (I use a hash function based on the time and a secret word) to move between the dummy domain and the real one securely.
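The hand-off token is nothing exotic. A minimal sketch of the idea – the key, the payload layout and the HandoffToken name are all illustrative, not my production code:

using System;
using System.Security.Cryptography;
using System.Text;

static class HandoffToken
{
    // Shared secret known to both domains (illustrative placeholder).
    static readonly byte[] Key = Encoding.UTF8.GetBytes("replace-with-a-long-random-secret");

    // The dummy domain signs "userId|timestamp"; the real domain recomputes the
    // signature and rejects the token if it doesn't match or the timestamp is stale.
    public static string Sign(string userId, DateTime utcNow)
    {
        string payload = userId + "|" + utcNow.Ticks;
        using (var hmac = new HMACSHA256(Key))
        {
            byte[] sig = hmac.ComputeHash(Encoding.UTF8.GetBytes(payload));
            return payload + "|" + Convert.ToBase64String(sig);
        }
    }
}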


Although this works, and I reckon is just as secure as authenticating on the target site, it provides a pretty shoddy user experience because I have to explain that there is another domain name involved. You also cannot use this method to get data from the Windows Live server, such as contact information, because the authentication is invalid when made from a different domain.