.NET web app cloud deployments in 2015

.NET web applications tend to get treated very poorly in the real world – some people still think that copying and pasting the contents of their /bin/Release/ directory (lovingly referred to as “DLLs”) over Remote Desktop to a webserver and manually setting up IIS is acceptable – but this is now 2015 and the world has moved on. Here are my thoughts on some of the various ways you can deploy .NET apps to the cloud.

First things first – keeping your .NET app cloud ready

Real cloud environments are stateless. You must treat the web servers you use as ephemeral. DevOps practitioners treat virtual servers as cattle, not pets, and don’t nurse servers back to health if there is a problem. Instead they take them out back, shoot them in the head and spin up a new one.

The .NET Framework does not make building cloud-ready, stateless, scalable applications easy by default, especially if you are still shaking off decade-old WebForms habits. Here is some advice:

  • Never use Session State. If you type HttpContext.Current.Session you lose. Using Session State either forces you to have a “Session State Server”, building a single point of failure into your architecture, or forces you to use sticky load balancers so that users continuously hit the same web node where their in-memory session lives.
  • You’ll need to synchronize your MachineKey settings between machines, so all nodes use the same keys for crypto.
  • Multiple nodes will break ASP.NET MVC’s TempData (typically used for flash messages) – try a cookie-based provider such as CookieTempData.
  • For configuration values, only use web.config AppSettings and ConnectionStrings. Sticking to this rule will give you maximum compatibility with the various cloud deployment platforms I’ll outline below. And no, don’t use Environment Variables, despite what The Twelve-Factor App enthuses – Windows apps do not use Environment Variables for application configuration. UPDATE Jan 2016: ASP.NET 5 has embraced Environment Variables as a first-class configuration option, bringing it in line with other web frameworks – if you are using ASP.NET 5 you can now use Environment Variables as an alternative to local config files. Don’t bother for ASP.NET 4.6 apps.
  • Do not rely on any pre-installed software. All dependencies should be pulled from NuGet and distributed with your application package. If you use a vendor’s “solution” (custom PDF components? Using Office to create Excel files? CrystalReports?) insist on a NuGet package or remove the vendor’s software. This is 2015.
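To make the list above concrete, here is a minimal sketch of what a cloud-ready web.config might look like. The key and setting names are illustrative, and the machineKey values are placeholders – generate your own keys rather than copying anything here:

```xml
<configuration>
  <appSettings>
    <!-- Deployment platforms can override these values at deploy time -->
    <add key="Storage.ContainerName" value="uploads" />
  </appSettings>
  <connectionStrings>
    <add name="Default"
         connectionString="Server=.;Database=App;Integrated Security=true"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
  <system.web>
    <!-- Stateless web farm: no session state, and every node shares the same keys -->
    <sessionState mode="Off" />
    <machineKey validationKey="[hex key, identical on every node]"
                decryptionKey="[hex key, identical on every node]"
                validation="HMACSHA256" decryption="AES" />
  </system.web>
</configuration>
```

With explicit, identical machineKey values on every node, forms authentication tickets and anti-forgery tokens issued by one server remain valid on any other.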


Azure Websites

The granddaddy of .NET Platform as a Service and the cornerstone of almost every Azure demo. Azure Websites is a very high level abstraction over IIS and .NET web farms, supports lots of very cool deployment mechanisms and is easily scalable.

  • Deploy from GitHub, TFS, Mercurial etc. by monitoring branches. The very clever software under the hood (Kudu) monitors branches for changes, runs MSBuild for you and deploys your app.
  • Lots of features – staging slots (with DNS switch over for zero-downtime deploys), scaling with a slider, monitoring and logging all included
  • You don’t get access to the underlying Windows VM that the sites are running on – even if you pay to have dedicated VMs for your sites. This does mean that you get auto-patching, but if you have any exotic requirements (I’ve seen third party APIs have such broken SSL implementations you need to install their Root CA certificate on your web server) you’ll be out of luck as there is no way to run scripts on the servers.
  • To configure your app, you can set variables that replace AppSettings or ConnectionStrings in your web.config at deployment time.
  • Azure Websites also supports PHP, Java, node.js and more, if you are happy to run those frameworks on Windows. This blog is WordPress backed, so PHP, and running on Azure Websites!

An honorable mention goes out to AppHarbor – they technically got there first by providing a Heroku-like experience for .NET developers. Also note that Azure has “Azure Cloud Services” – this is significantly more complex than Azure Websites and ties you much more tightly into the Azure platform. Azure Cloud Services are typically chosen for long-running cloud systems rather than transactional web sites (think Xbox Live rather than a high traffic blog).


Amazon Web Services Elastic Beanstalk

Amazon are by far the biggest cloud provider out there and they try to tick as many Windows feature boxes as possible to woo enterprises. Elastic Beanstalk is a Platform as a Service deployment platform, similar to Azure Websites, but completely platform agnostic. Since it uses all the existing EC2 APIs underneath (Elastic Load Balancing, Auto Scaling Groups etc), language and OS support is much broader than Azure Websites, at the expense of not being optimised for Windows/.NET workloads.

  • There is no cheap, shared tier. Your application runs on a dedicated VM that you have access to. This makes costs a bit higher (unless you are crazy and want to try to run .NET on micro instances) but gives you more control. As part of your deployment package you can include Powershell scripts that can execute on your VM.
  • The user interface is very limited – when I last checked the only configuration values you could set via the UI were named “PARAM1”, “PARAM2”, “PARAM3” etc, which limited your AppSettings to using those names unless you wanted to completely script your deployment.
  • If you want SQL Server as a service, you are limited to RDS, which charges for the whole VM and the SQL Server license. Azure’s SQL Database service charges for CPU time and disk space, which can work out quite a bit cheaper.
  • Docker container support is available – this will become important for .NET developers when ASP.NET 5 is out of beta and CoreCLR is ready.
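Because you get full access to the VM, deployment-time customisation is done with `.ebextensions` config files bundled into your deployment package. The sketch below shows the exotic-requirement case mentioned earlier – installing a vendor’s root CA certificate – and is hedged: the file path, store and download URL are illustrative, and Import-Certificate assumes Windows Server 2012 or later:

```yaml
# .ebextensions/install-root-ca.config
# Runs on each Windows instance as part of deploying an application version.
files:
  "C:\\certs\\vendor-root-ca.cer":
    source: https://example.com/vendor-root-ca.cer   # hypothetical vendor URL
container_commands:
  01_install_root_ca:
    command: powershell.exe -Command "Import-Certificate -FilePath C:\certs\vendor-root-ca.cer -CertStoreLocation Cert:\LocalMachine\Root"
```

New instances launched by the Auto Scaling Group get the same treatment automatically, which is exactly what the cattle-not-pets model requires.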


Opscode Chef + Azure or AWS VMs

Opscode Chef is a favorite of the “infrastructure as code” crowd, and it can be made to work on Windows. Given standard virtual machines on either AWS or Azure, you can install the Chef service on your nodes and execute Chef recipes.

  • Chef recipes are written in Ruby. This may or may not be a problem depending on your team (I can count the number of .NET developers I know who are also good at Ruby on one hand) but it is definitely an extra skills requirement. It is possible to use Chef recipes to bootstrap Powershell scripts, but then you have a Rube Goldberg machine of pain.
  • Ruby is simply not designed to run on Windows, let alone for long-running processes. The Chef service had a long-standing bug on Windows where Ruby would simply run out of memory. Anybody who has tried getting every gem in a typical Ruby on Rails Gemfile to compile on Windows knows the pain I am talking about. Windows support for Ruby is an afterthought.
  • One thing Azure has over AWS for Chef deployments is the ability to pre-install the Chef client onto a VM when you start it, all from the UI. AWS requires you to install the client yourself, typically via a bootstrap script.
  • Chef recipes are based on the concept of convergence – the desired state of the server is described and then a policy is calculated to bring the server to that state. Coincidentally, this is exactly what Powershell Desired State Configuration does. Chef have plans to integrate with Powershell DSC.
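For flavour, here is a hedged sketch of what a converged Windows web node might look like as a Chef recipe. It assumes the community “windows” and “iis” cookbooks are available, and the site name and path are illustrative:

```ruby
# recipes/webserver.rb - describe the desired state of a Windows web node.
# Chef only acts when the server has drifted from this state.

# Enable IIS and the ASP.NET 4.5 integration.
%w(IIS-WebServerRole IIS-ASPNET45).each do |feature|
  windows_feature feature do
    action :install
  end
end

# Create and start an IIS site (hypothetical name and path).
iis_site 'MyApp' do
  protocol :http
  port 80
  path 'C:\inetpub\MyApp'
  action [:add, :start]
end
```

Note this is exactly the kind of thing Powershell DSC expresses natively, which is why the planned Chef/DSC integration matters for Windows shops.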


Octopus Deploy + Azure or AWS VMs

Octopus Deploy is quickly becoming one of my favourite parts of the .NET ecosystem. Built by some of the finest .NET developers in the land, for .NET developers, it provides the Platform as a Service ease of Azure Websites with the power of running your own VMs. I think of it as bringing your own platform layer to infrastructure you might get elsewhere – I’ve dealt with a big deployment of Octopus on AWS.

  • VMs can be assigned to environments, enabling a fully customisable Test-UAT-Staging-Production workflow with release promotion.
  • Your build server needs to create “OctoPack” packages – a NuGet package variant. These packages are then pushed to the Octopus server’s NuGet feed and can be deployed.
  • A deployment agent called a “Tentacle” is deployed on each VM. A single MSI command can install and enroll the node.
  • Elastic scaling is not included – Octopus does not manage your environment for you.
  • Deployment steps are fully customisable – you can create IIS sites and AppPools, run custom scripts or even install Windows Services.
  • Configuration settings for your application are set as variables that apply to AppSettings and ConnectionStrings in your web.config when you deploy.
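To illustrate that last point, suppose your package contains the web.config fragment below (the key names are illustrative). Octopus can substitute any AppSetting or ConnectionString whose key matches a variable name defined in the project, with different values per environment:

```xml
<appSettings>
  <!-- Define an Octopus variable named "PaymentApi.Url", scoped per
       environment (Test, UAT, Staging, Production). At deploy time the
       placeholder value below is replaced with the scoped value. -->
  <add key="PaymentApi.Url" value="http://localhost/fake-payments" />
</appSettings>
<connectionStrings>
  <!-- Same mechanism: an Octopus variable named "Default" overrides this -->
  <add name="Default" connectionString="Server=.;Database=Dev" />
</connectionStrings>
```

The nice property is that the package itself is environment-agnostic: the exact same OctoPack artifact is promoted from Test through to Production, with only the variable values changing.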

The Octopus Deploy team is currently working on version 3.0, which will replace the RavenDB database with SQL Server. I’m very much looking forward to it. Octopus isn’t limited to cloud-deployments either – it can be used equally well for on-premise datacenters.


In summary then, I’d choose Azure Websites if the application is simple enough to work within its constraints. Given an application with multiple tiers (microservices etc.) or special deployment requirements (third party software, certificates), I’d go for Octopus Deploy on top of your organisation’s favored cloud provider.

If you have any thoughts on the above, or can point out a mistake I’ve made, please drop me an email or leave a comment.


Why use a Macbook Pro as a Windows .NET Software Developer (Updated)

Update December 2016: Apple has released a new version of the MacBook Pro featuring the new Touch Bar feature. I do not recommend buying this model.

  • Worse battery life
  • Worse keyboard
  • Touch Bar feature is pretty useless for Bootcamp or virtualization – you will miss the usual function keys
  • USB-C only. Expect to spend hundreds on adapters.
  • The original Boot Camp drivers physically blew the speakers when running Windows

Luckily Apple still sell the 2015 model without Touch Bar. I would recommend buying one of those.

Original post continues below…

I’ve been using OS X alongside Windows for almost 8 years now. In this post I will outline why a Mac is hands-down the best development laptop you can buy, even if you are primarily a Windows or .NET developer. I use a Retina MacBook Pro 13 inch at home and for side projects, plus a Retina MacBook Pro 15 inch at work.


Reason 1: You need access to OS X as a professional software developer

If you care about maximizing the value you can provide as a practitioner of building software, you need access to OS X. OS X is the only way to build native iOS applications – and as a .NET developer without access to OS X, you’ll never even be able to use Xamarin to run your C# code natively on iOS. You’ll also miss out on the wider community and the rest of the development world – JavaScript, Ruby, Python and Scala developers all use Unix-based operating systems to do their work, not Windows, and most of the time their preferred platform is OS X. If you want to dabble in Ruby over the weekend, or even teach yourself a new skill, OS X on a Mac is the only way to get first class support, as most non-.NET/Java developers run their stuff on Macs (a small minority use a Linux distribution). No laptop other than a MacBook Pro or Air will give you access to this world, and you will find yourself increasingly isolated professionally if you can only use Windows.

Reason 2:  It’s the only laptop that lets you run OS X and Windows at the same time

Since Apple put Intel chips in their machines, you’ve always been able to run Windows natively on a Mac by creating a second partition using the built-in Bootcamp utility. Since the middle of 2014, this has gotten even better with the introduction of native UEFI support. Gone is the 80s-era BIOS emulation, and now Windows boots just as fast as OSX itself. The Windows 8 startup circle animation even starts rendering before the Mac bootup sound finishes playing.

You don’t have to reboot when you want to use Windows – you can attach that same native partition within a virtual machine using Parallels or VMWare Fusion. That means you can run both OS X and Windows side by side, rebooting into the native Windows partition when you need the full power of the machine. Plus, it’s really cool to be able to do this:

Swiping between OSX and Windows

(As a side note, I get much better performance out of Parallels Desktop 10 than VMWare Fusion when running Windows 8.1 or the Windows 10 Tech Preview – the Parallels virtual display driver is WDDM 1.2 compatible, rather than VMWare’s WDDM 1.0 compatible version. WDDM 1.0 is from the Vista era.)

Reason 3: Multiple monitor support is amazing

All the Retina MacBook Pros have two Thunderbolt ports, which double up as Mini DisplayPort ports, and an HDMI port. The Retina MacBook Pro 13 inch can support two external monitors under OS X, and three under Linux or native Windows. The 15 inch version can support three external monitors and the internal screen at the same time. Both these limits can be extended by using USB 3.0 “DisplayLink” adapters or docks at the cost of CPU power and graphics quality. With virtualisation, you can set Windows up to use any number of monitors.

Reason 4: You can test your work on multiple retina display implementations

Both the 13 inch and 15 inch Retina MacBook Pros have amazing high resolution screens. If you are building web applications, you need to be able to test your work on “retina” displays, and this is the quickest way of doing it without buying a 4K monitor. Most retina displays in the wild are on Apple devices too (iPad, iPhone etc). The ridiculous resolution of the 15 inch model (2880×1800!) even enables you to test your apps and sites in Windows at up to 200% DPI scaling without an external monitor.

Reason 5: The Apple Store retail support network

Say what you want about the “cult of Mac”, Apple have retail support available in almost every major city on earth through their Apple Store network. If you need a new charger or accessory, you can walk in and buy one from an actual shop. If you have a problem, you can go in and (sometimes pay for) a repair – not phone an offshore support line and get a box posted to you. Acer, Dell, Samsung etc do not have the meatspace reach of Apple (unless you like to shop at PC World). The thought of having to buy a replacement AC adaptor for an “Acer Aspire S3-392G” machine at short notice is quite scary. A Mac is also the only laptop stocked in retail stores in a selection of keyboard layouts – when in Tokyo, Apple were the only people in the whole city that stocked laptops with US keyboards.

Reason 6: The .NET Framework is becoming multiplatform

In case you missed the news, Microsoft have committed to making the core of the next .NET Framework version work on both Linux and OS X, instead of leaving it up to Mono to provide an implementation. This is a direct result of the leaders in the .NET space stretching C# out of its comfort zone of Windows and Visual Studio. ASP.NET vNext supports development using Sublime Text on a Mac. The OmniSharp project brings C# support to Sublime Text, Emacs and Atom. Visual Studio is not required. From 2016 onwards, I expect ASP.NET vNext to start featuring in C# developer job ads, and employers are going to expect you to be able to at least run applications without Visual Studio. Deployment of greenfield applications to Linux servers using Docker containers will start becoming the norm from next year.
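To give a flavour of Visual Studio-free development, here is a hedged sketch of a beta-era ASP.NET vNext project.json. The package names, versions and command line are illustrative of the preview tooling and were changing from release to release:

```json
{
  "dependencies": {
    "Microsoft.AspNet.Mvc": "6.0.0-beta1",
    "Kestrel": "1.0.0-beta1"
  },
  "commands": {
    "web": "Microsoft.AspNet.Hosting --server Kestrel --server.urls http://localhost:5000"
  }
}
```

With the runtime installed, running the "web" command from a terminal starts the site on any OS – dependencies are restored from NuGet, there is no .csproj, and any text editor will do.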

In Summary

I haven’t even touched on the other reasons why this is now my preferred setup – the now native SMB 2.0 support in OS X, OneNote finally on Mac, the quality of the keyboards and trackpads – but using non-Apple laptops is sometimes painful. I was once issued the 2nd generation of the fabled Lenovo ThinkPad X1 Carbon, which got rid of the function keys and replaced them with comedy touch “context sensitive media buttons” (the 3rd generation reversed this bonkers choice). My last two companies have eventually managed to sort out a top of the line 15 inch Retina MacBook Pro as my corporate machine, and thanks to the proliferation of Macs in the corporate setting, IT departments are slowly warming up to the idea.

If you have any questions about how I use the above, drop me a line in the comments or send me an email and I’ll be happy to respond.


East Village London E20 – Get Living London Review

Update: I’ve posted an updated review after being here for 18 months.

The following is a review of my thoughts so far about living at East Village, the former Athletes’ Village of the 2012 Olympic Games. My landlord is Get Living London, and this will be from the perspective of someone I imagine is a typical renter here – a young professional working in the technology field at a firm in Canary Wharf. I have moved back to the UK after spending half a decade in Japan, so my taste in living space might be a bit skewed towards modern city life.

The surroundings


This really is now a gorgeous part of London. The Olympic Park is just next door and there are acres of parklands available for public use. There is even a wetlands area with ducks! The site is looked after by a security team in East Village fleeces, plus gardeners (although I haven’t seen them yet).

Nice things nearby

The following is all available within walking distance of my flat:


Sainsbury’s Local – 30 seconds walk


The Neighborhood Pub that serves a mean rotisserie chicken and Asahi beer! – 35 seconds walk


The Stratford City Westfield Shopping mall – one of the largest urban shopping centers in Europe (with a cinema, casino and an Apple Store!) – 4 minutes walk


“Stratford International stn entrance” by Sunil060902 – Own work. Licensed under CC BY-SA 3.0 via Wikimedia Commons.

Stratford International Train station – you can get to Paris in 2.5 hours – plus a DLR station that gets me to work in 20 minutes – 3 minutes walk

I have not included the Olympic Park legacy facilities above – the swimming pool, cycling velodrome etc – simply because I haven’t had the chance to go yet! There is also an NHS doctor’s surgery and a school on the site.

Commuting to work

On a good day I can get to work door to door in 30 minutes on the DLR. This is in stark contrast to the hour-long slog through the Tokyo rush hour that I used to endure. The DLR is amazing – daylight throughout, mobile phone signal all the way (unlike the Tube) and no drivers. I spend about 80 GBP a month on pay as you go tickets via my auto top-up Oyster card (1.50 GBP each way) – this is cheaper than the Zone 3-2 travelcard, which is over 90 GBP per month. If you work from home even one day a week, the travelcard doesn’t make sense.

In Japan, commuting costs are paid for by your employer, not you – quite rightly, as the cost of getting to work is a necessary, deductible business expense. No such luck in the UK. If you decide to buy or rent a house in Sussex, say Brighton, you will have to pay an eye-watering 460 GBP per month out of your own pocket for the privilege of getting to work on the worst train in Britain, one that was late every day for a year. No thank you. For those that say renting is “throwing money away” – this annual 4,800 GBP is definitely tossing money down the drain, and could at least be put towards rent or a mortgage somewhere closer to work.

Renting through Get Living London

When I first made enquiries into renting in London I was absolutely appalled by the state of the market – companies like Foxtons appear to be almost deliberately misleading, in that there is no way of knowing up front how much it will cost to move into somewhere when you view an ad, because the total rental cost does not include fees. Between “Admin Charges”, “Contract charges”, “Check in fees” and all sorts of other nonsense that of course vary between rental agencies, there is no way to properly compare the price of listings on places such as Zoopla and Rightmove. This is likely by design. Scotland has outlawed rental agency fees, and even airlines are forced to show all hidden costs upfront in the advertised price to protect consumers.


The Get Living London management office

Get Living London is not your standard landlord, but is part of the growing Private Rental Sector (another private rental company in other London locations is Fizzy Living). They own the buildings you are renting. There is no middleman taking a cut. And with Get Living London – there are NO FEES to move in. At all. No check in fees, no admin fees, no debit card fees, no inventory fees, nothing. You pay the rent and the deposit and that is it.

Let’s compare how much this actually changes the effective cost over an example 12-month tenancy agreement, where someone moves in with you after six months and thus needs to be added to the tenancy agreement:

                                                        Foxtons (source)   Get Living London
Administration fee for creating the tenancy agreement   420 GBP            0 GBP
Admin fee for adding someone to the tenancy agreement   210 GBP            0 GBP
“Inventory Check Out Fee” when checking out             150 GBP            0 GBP
Total fees for the year                                 780 GBP            0 GBP
Cost per month spread over 12 months                    65 GBP             0 GBP

The fees alone at Foxtons would make your rent the equivalent of an extra 65 pounds a month in this example. Who knows what this value is for other rental agencies.

What are you getting for that 0 pounds a month at Get Living? Very good service from my experience so far – I emailed the dedicated property manager about our heating the other day and he phoned me back in about 3 minutes. Try getting that sort of service from your amateur Buy To Let landlord.

Let’s look at the flat

I love the place. It is modern, with underfloor heating, an awesome “winter garden” balcony area and an en-suite even in two-bedroom flats. Plus, Get Living London do not charge any extra for a furnished flat – the furnishings are pretty awesome and were very welcome after moving halfway around the world. My only gripe is that it’s apparently not possible for them to take, say, a bed away, so you’ll have to find storage yourself for the bits you don’t need.


An example of some of the furnishings – plus the friendly chap who showed us round.

There is no boiler and no gas mains (so no British Gas to deal with) – the heating and hot water are supplied by East London Energy, who “constructed a centrally-managed energy centre that produces all of the area’s heating and hot water requirements and distributes it to homes and businesses via a network of highly insulated pipes” (or so I’m told). The kitchen hobs are therefore induction. Electricity has been super cheap so far, at about 30 GBP per month. In Japan it cost three times as much because of the air conditioning we needed.

The technology

All the East Village flats are served by Hyperoptic broadband. This means:

  • 1 Gbps Fibre To The Premises – (not to a cabinet up the road and a dodgy bit of 1960s copper between you and the box like BT claim their “fibre rollout” is)
  • 1 Gbit Ethernet cabling throughout the flat
  • A router is provided but you can swap in your own – there is a pure Ethernet jack in the wall that dishes out public IPv4 addresses
  • No landline (there is a VoIP service available)
  • No BT line rental required (saving you about 15 GBP a month!)

I thought my broadband in Japan was good at 100Mbps, but the 1Gbps that I get from Hyperoptic here is absolutely nuts – and this is going through a Thunderbolt Ethernet adapter and a router. (I’ve seen it go faster, but I think I am saturating the upstream on the test servers in London.)

[Speedtest screenshot]

Note that Get Living London residents get a free 20Mbps service from Hyperoptic, with the 100Mbps and 1Gbps costing 10 GBP and 20 GBP a month respectively. Even the free 20Mbps tier saves you at least 30 pounds a month that you’d otherwise need in a home wired by BT (ISP charge plus line rental).

Overall

Living here is awesome – the location is perfect for work, my family loves the area, the facilities are brilliant, and the cost is very reasonable when you factor in how much I save in commuting charges, let alone letting agency fees, copper phone line rental and other nuisance costs that should just not exist today. Compared to the nightmares I had about moving back to the UK – old 60s terraced council “housing stock”, begging BT for 4Mbps ADSL, an expensive awful commute, being shafted by letting agents – it’s been a wonderful surprise. For anyone working in East London and looking to rent, I don’t think you can find anywhere better.

If you have any questions about living in East Village or renting through Get Living London, drop me an email and I’ll be happy to respond.


Sayonara Tokyo – returning to the UK

After almost six years living in Japan, I have made my way back to the UK. I am now based in London, working for one of the Big 4 Management Consulting firms as a Lead Developer.

I occasionally get emails from readers asking what it is like to be a software developer in Tokyo, some flat out asking me if I know of any jobs going. I tell them all the same thing – Tokyo is a hard place to be a non-native Japanese speaker doing software. I was very lucky to have a pretty sweet job working for a branch of a US company, but ultimately the job market there is not healthy. I very much needed a more senior role, but there was nowhere to grow within the Japan R&D office – my boss was not going anywhere. Whilst foreign companies can pay pretty decently by Japan standards (expect anything from 5M JPY for a junior to 9M JPY for a senior development role – but this is only 50k-90k USD, not even US graduate level), Japanese companies take a very traditional view of setting compensation (think tying job rank to age, and salary to “seniority” rather than skill or value). Don’t expect to make anything like US or even UK salaries unless you are in senior management.

My primary skillset is the whole .NET stack – but .NET is not popular in Japan. The .NET community there is so small that at the time of writing this blog post I was at the top of Google for “ASP.NET MVC Tokyo”:

[Screenshot: Google search results for “ASP.NET MVC Tokyo”]

If that isn’t a reason to move on, I don’t know what is.

Before the move to the UK, I flew over for a week to interview. A bit of planning beforehand lined me up with eight job interviews which resulted in five job offers, with the worst offer being 20% more than I was making in Tokyo. This blew my mind, but apparently it is normal in London for skilled developers. It is nice to be wanted. I also now have a family – my son was born this January and he deserves a successful father. Back in my homeland there are far more opportunities and I am not at a disadvantage in any way – there are no handicaps to being as successful as I can be.

This isn’t to say I don’t still love Japan. It is a lovely place to live, but a pretty crap place to work unless you are very senior or running your own business. The job market illiquidity (especially as a foreigner) means you will be paid far less, have to commute further and have less chance to grow than in the same level software development jobs overseas. Without the ability to instantly get job interviews (something you can do anywhere in the western world if you are good), you end up feeling completely trapped and at the mercy of your employer all the while knowing at the back of your mind that you could actually be having a career elsewhere. Not a good recipe for happiness. If and when I move back it will not be as a salaryman.

Relocating to the UK has been quite an adventure and I plan to blog some more about what it is like to come back after half a decade. Just the level of customer service in retail stores has made me miss my old home but I am sure the reverse culture shock will wear off in time.

To the next six years!


.NET Live Coding Talk in London UK at Medidata

On December 5th, I delivered a live coding demo at Medidata’s UK office, going over some of the newer stuff for .NET web developers. It’s 48 minutes long and covers MVC5, VS2013, EF6, SignalR 2 and some other bits while I build a rudimentary Twitter clone called “MediTwit”. NuGet blew up about half way through but we recovered. The full video is below (visit the full post page to view it).


Open Sourcing “Cognition”

One of the perks of working in R&D at Medidata is “Innovation Time”. Twice a year, engineers get to spend two weeks building whatever they want. Ideally some of the things we build will go back into direct product development but if not, we are encouraged to open source the work.

Fresh from attending Microsoft’s Build 2013 conference in San Francisco, I wanted an excuse to use all the latest Microsoft web stack tools. Namely:

  • Visual Studio 2013 Preview
  • ASP.NET MVC5 Beta
  • Entity Framework 6 Beta
  • SignalR 2.0 Beta

I also wanted to test out CouchDB, which was made infinitely easier by the awesome MyCouch library by Daniel Wertheim. The result is “Cognition” – a cross between a wiki and a CMS that can be used as a basic CRM, a job board or a knowledge base. The two weeks’ work can be seen on the company repo at https://github.com/mdsol/cognition, and a test site is hosted on Azure at http://cognition-demo.cloudapp.net. There is quite a lot of rough code with scant test coverage, unfortunately necessary to get the experiment out the door in two weeks. It does however show what is possible with the .NET web stack.


Here is a short video showing the current state of the app (no audio – visit the full post page to view it).


Windows 8.1’s user-hostile backup story

Update 13th June 2015: It looks like Microsoft has reversed course and reinstated the Windows 7 Backup and Restore feature in Windows 10! Success!

Update 16th Sept 2013: What follows is a rant written after upgrading to 8.1 and seeing my automated backups just stop working, with the backup restoration function also removed. An afternoon was wasted faffing about with the new File History feature, which still refuses to back up files not on my system drive and deliberately ignores files in the SkyDrive folder, preventing you from having a local backup. I now use CrashPlan, which backs up files the way Windows 7 and 8.0 used to be able to.

Windows 8.1 is now released to manufacturing and those with MSDN or Technet Subs can download it now.

I have the RTM version now set up on home and work machines and have been running the Preview versions on both my Surface RT and Surface Pro. Windows 8.1 has some glaring errors.

SkyDrive integration is now built in, but removes features compared to the old desktop client

You no longer have to install the SkyDrive app separately as it is now built into the OS. Windows 8.1 makes a concerted effort to force you to use SkyDrive.

[Screenshot from the Windows 8.1 upgrade process]

The above is a screenshot of a screen displayed during the upgrade process. Can you see the error?

“automatically backed up to the cloud”

This is nonsense – if you delete a file from your local machine, it will be deleted from the cloud. This is NOT a backup. Live mirroring is NOT a backup. If a sync goes wrong – your file disappears completely. There is no backup.

Edit: Turns out SkyDrive has a “Recycle Bin” for deleted files, but it removes files once it is full or after 30 days. So, you might randomly be able to get your file back. Version History also only works for Office documents.

[Screenshots: Windows 8 SkyDrive folder (left) and Windows 8.1 SkyDrive folder (right)]

On the left is Windows 8’s SkyDrive integration after installing the SkyDrive Desktop app. On the right is Windows 8.1’s built-in SkyDrive integration. This is now a system-level folder and doesn’t even have the syncing icons available. This folder is now virtualized and you aren’t guaranteed that the actual file will be present. Opening the file will sometimes download it from SkyDrive.

The “Windows 7 File Recovery” backup system has been removed in 8.1

Windows 8.0 actually contained two backup systems: the new "File History" and the awesome old backup system from Windows 7, threateningly renamed "Windows 7 File Recovery" as a warning that it would be removed. Let's compare the systems:

| Windows 8 File History | Windows 7 File Recovery |
| --- | --- |
| Files in Libraries and desktop only | All files in all locations supported |
| No system image support | Full system image support |
| No progress bar | Progress bar |

File History is Microsoft’s attempt to copy OS X’s Time Machine, except Time Machine actually backs up all your files and lets you restore the entire OS partition, just like Windows 7 did! At least in 8.0 you had the choice to use the old system. In 8.1, Windows 7 File Recovery has been removed completely; you can’t even restore your old backups!

Edit: You can create a System Image in 8.1 (click “System Image Backup” on the File History screen), but this doesn’t run on an automated schedule, so it is not an automatic backup.

It gets worse. File History is even more useless in 8.1.

“File History” no longer backs up your SkyDrive folder in 8.1

Microsoft really don’t want you to have a local backup of your SkyDrive files. Take a look at this:

[Image: Windows 8 File History]

[Image: Windows 8.1 File History]

No problem you think? Just add your SkyDrive folder to a Library and it should back up? Nope – all the SkyDrive files are ignored completely, even if you manually add them to a Library.

The response from Microsoft on this is beyond tragic (from here):

[Image: Microsoft’s forum response]

Your files “are protected by the cloud in case user lose/damage their device”. What about protection from user error, viruses, or badly written programs? If your files get corrupted, the corrupted files will sync to the cloud and then on to all your other devices.

Conclusions

It appears that Microsoft are desperate to push SkyDrive, even at the expense of the computing safety of their customers, the very customers you’d hope were being educated about safe computing. Now that I am on 8.1, I am personally stuck with no built-in backup system. My experience with File History has been awful: it even appears to ignore an additional Library I’ve created to include non-Library files. I literally cannot get it to back up the files on my computer; it is useless. I am going to have to go with a third-party backup system like CrashPlan.

Windows 8.1 was Microsoft’s chance to undo the wrongs of Windows 8. Instead, users upgrading now face either having no backup solution at all or, even worse, their existing backups silently stopping with no warning.

Sort it out Microsoft.

Edit: Some excellent discussion on this over at Hacker News: https://news.ycombinator.com/item?id=6388431


ASP.NET MVC Basics Part 2: ViewModel to Model Mapping and Editing

In Part 1, I walked through creating a simple form with a backing ViewModel and Validation. In Part 2, I’ll walk through creating a backing Model and Edit functionality.

To follow along, load up the code from Part 1: https://github.com/edandersen/mvcformtutorial/tree/part1

The final code for Part 2 will be uploaded here: https://github.com/edandersen/mvcformtutorial/tree/part2

Model Mapping flow

[Diagram: viewmodel-model-mapping]

When editing via a ViewModel, we need to prepopulate the view with real data from a domain Model. This could be a database table mapped to an object via an ORM, or data from another source. In the diagram above, the GET Edit action requests the Model with an ID of 123 from the Repository holding the model data, creates a ViewModel that represents the Model, and passes it on to the View to render. POSTing the data back is similar to the Create POST method in Part 1, except we load the existing Model from the Repository, update it with the validated data from the ViewModel, and save the updated Model back to the Repository.
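The flow above can be sketched in plain C#, stripped of the MVC framework plumbing. This is a minimal sketch only: the Product, ProductEditViewModel and ProductRepository types are hypothetical stand-ins for your own domain model, ViewModel and data access code, not part of any library.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical domain model (what the ORM maps to a database row).
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Hypothetical ViewModel representing the Model for the Edit view.
public class ProductEditViewModel
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Hypothetical in-memory stand-in for a real repository.
public class ProductRepository
{
    private readonly Dictionary<int, Product> _store = new Dictionary<int, Product>();
    public void Add(Product product) { _store[product.Id] = product; }
    public Product GetById(int id) { return _store[id]; }
}

public static class EditFlow
{
    // GET Edit: load the Model by ID and map it to a ViewModel for the View to render.
    public static ProductEditViewModel GetEdit(ProductRepository repo, int id)
    {
        Product model = repo.GetById(id);
        return new ProductEditViewModel { Id = model.Id, Name = model.Name };
    }

    // POST Edit: load the *existing* Model, copy the validated ViewModel values
    // onto it and save it back, rather than constructing a fresh Model.
    public static void PostEdit(ProductRepository repo, ProductEditViewModel viewModel)
    {
        Product model = repo.GetById(viewModel.Id);
        model.Name = viewModel.Name;
        repo.Add(model); // persist the update
    }
}
```

In a real controller, GetEdit and PostEdit would be the bodies of the [HttpGet] and [HttpPost] Edit actions, with ModelState.IsValid checked before the POST mapping runs.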

Continue reading…


ASP.NET MVC Basics Part 1: View Model binding

I’m going to walk through the basics of form submission with ASP.NET MVC, showing some best practices. This set of tutorials will be useful for developers moving away from ASP.NET WebForms into ASP.NET MVC, or even Rails developers curious about how we do things in .NET.

You can download the code for Part 1 at: https://github.com/edandersen/mvcformtutorial/tree/part1

Form submission flow

If you have come from WebForms, you’ll be used to pulling form values out by simply referencing variables in your code behind. These magically map to elements on the page, and most of the time you are blissfully unaware of how the data gets there. With MVC, we don’t have the same abstraction. You can access POSTed variables directly with FormsCollection (or params in Rails), but with the ViewModel pattern we can simulate the binding that WebForms provides and access our form variables in a strongly typed manner.
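To see what that buys you, here is a rough sketch of binding raw posted key/value pairs onto a strongly typed ViewModel by property name, which is approximately what MVC’s DefaultModelBinder does for simple types. The SignUpViewModel class and MiniBinder helper are hypothetical illustrations, not part of ASP.NET MVC:

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

// Hypothetical ViewModel: one strongly typed property per form field.
public class SignUpViewModel
{
    public string Email { get; set; }
    public int Age { get; set; }
}

public static class MiniBinder
{
    // Copy posted string values onto matching properties of a new ViewModel,
    // converting each value to the property's type. A real model binder also
    // handles nested objects, collections and validation.
    public static T Bind<T>(IDictionary<string, string> form) where T : new()
    {
        var model = new T();
        foreach (PropertyInfo prop in typeof(T).GetProperties())
        {
            string raw;
            if (form.TryGetValue(prop.Name, out raw))
            {
                prop.SetValue(model, Convert.ChangeType(raw, prop.PropertyType), null);
            }
        }
        return model;
    }
}
```

Compare this with FormsCollection, where every lookup hands you back a raw string keyed by a magic name; with the ViewModel the compiler checks the field names and types for you.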

[Diagram: aspnet-form-submission-1]

Continue reading…