Friday, February 27, 2015

Using SSH with Azure Linux Virtual Machines

Accessing Windows VMs in Azure is pretty straightforward – after you create the VM, you can download an RDP file from the Azure portal and remotely administer the VM. You can also access your VM via PowerShell, although that is a bit more complex due to the need for certificates. But what if you are using a Linux VM?

Turns out – it's pretty easy. The Azure documentation article at http://azure.microsoft.com/en-gb/documentation/articles/virtual-machines-linux-use-ssh-key/ shows you how to do it. The steps are pretty simple, but vary a little with the Linux distro you are using.

The basic approach is to first generate the SSH keys. The easiest way to do this is to load OpenSSL and use it to generate an X.509 certificate, then connect with PuTTY.
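For reference, the documented approach uses openssl to create the key pair and certificate – something like this (the file names are just examples; run it from any shell with openssl on the path):

# Generate a 2048-bit key pair and a self-signed X.509 certificate
openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout myPrivateKey.key -out myCert.pem

You then convert myPrivateKey.key with PuTTYgen to get a .ppk file that PuTTY can use.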

You can also forgo creating keys and just use a password. You set the password when you create the VM, then just log in using PuTTY.


So for those of you who a) want to learn Linux and b) struggle with loading it on your own hardware – Azure provides a simple way to create and then use a Linux VM.

Updated Azure Module Is Released

I am just back from teaching Azure and Office 365 in Belgium, where we used the latest version of the Azure cmdlets. The new version, 0.8.14, contains a huge number of updates and improvements. The changes include:
  • New StorSimple commands in AzureServiceManagement mode.
  • Updated HDInsight cmdlets
  • New Azure Insights cmdlets in AzureResourceManager mode
  • A new Azure VM cmdlet, Get-AzureVmDSCExtensionStatus, to get the DSC status for a VM
  • A number of new Azure Automation cmdlets in AzureResourceManager mode
I like that Azure is becoming more DSC aware – I am really excited about seeing DSC being fully implemented in both Windows and Azure.
To get this new version, you can either use the Web Platform Installer (which allows you to install more than just the new module), or go to GitHub and get the stand-alone version from the Azure-PowerShell repository (http://az412849.vo.msecnd.net/downloads04/azure-powershell.0.8.14.msi). The latter is an MSI that installs the updated module. Note that if you are running PowerShell already, the MSI may ask you either to close those windows or to reboot to get the new module fully installed.
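Once the MSI has finished, a quick check from a fresh PowerShell window confirms the version you now have:

# Check the installed Azure module version
Get-Module -Name Azure -ListAvailable | Select-Object Name, Version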
This update shows the sheer pace at which Azure is being updated. I find it staggering when you compare it to some earlier MS development cycles (e.g. the 5 years between NT4 and Windows 2000). The really good news is that Azure is getting richer and better by the month. The downside is the sheer difficulty IT Pros may have keeping up with this rapid pace of change. All in all, I think this is really not a bad problem to have!

Monday, February 23, 2015

Your Nearest Azure Data Centre

When designing a solution involving any cloud vendor, you need to be aware of the network latency between your users and the cloud. For Azure, this means putting your objects (VMs, web sites, storage, etc.) in the data centre closest to you. But which one is that?

I just came upon a neat page that can help: http://linkis.com/azurewebsites.net/Jaybw. This page plots a nice-looking graph of the latency between the client and the 15 existing Azure data centres around the world. After a few tests, my graph looks like this:

image

There's also a nice table that goes along with the graph:

image

As you can see, the latency between my office and Azure favours the Western Europe data centre in Amsterdam. I had expected Dublin (North Europe) to be faster, or at least very close, but it was in fact slower. Not surprisingly, Brazil, Japan and Australia are a lot further away. I was also surprised that the times to South East Asia and East Asia were faster than both West US and East US.
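If you fancy a rough DIY version of the same test, a sketch like the following times an HTTP round trip to a blob endpoint in each region. NB: the storage account names below are made up – substitute endpoints you know exist in each region:

# Time a simple HTTP round trip to a blob endpoint per region
$EndPoints = @{
  'West Europe'  = 'http://mytestaccountwe.blob.core.windows.net'
  'North Europe' = 'http://mytestaccountne.blob.core.windows.net'
}
Foreach ($Region in $EndPoints.Keys) {
  $ms = (Measure-Command {
    Try {Invoke-WebRequest -Uri $EndPoints[$Region] -TimeoutSec 5} Catch {}
  }).TotalMilliseconds
  '{0,-15} {1,6:n0} ms' -f $Region, $ms
}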

What do your tests show?


Saturday, February 21, 2015

Azure Cmdlets and the Debug Switch

So here it is, late on a Saturday, and I'm off in the morning to teach an Office 365 class. In a fit of overexcitement, I decided to try to build out the lab environment on my laptop before travelling to the class. The class is run as a long-day boot camp, and in the past the start of the course has been challenging, with issues relating to building out the environment. I hoped to avoid that this week!

This class is one of the first from Microsoft to utilise Azure. The student's "on premises" data centre is in Azure, and the course looks at adding Office 365 into the mix. It's a fantastic idea – the student just runs a few scripts and hey presto – their live environment (DC, Exchange, etc, etc, etc) is all built by script as part of the first lab. And amazingly – this just takes around two hours according to the lab manual. What could possibly go wrong?

Well – the last time, those two hours turned into six, as we had some errors at the start with the first script not running properly. Of course, I was able to troubleshoot and fix the issues, although it did take time. So I thought it might be a clever idea to re-test the scripts this time and maybe avoid the issues.

I downloaded the first VM and installed it onto my Windows 10 client, then started running the build scripts. All was going well, till I tried to run the script to create the DC. For several hours, each time I tried to create the VM, I would get the error: "CurrentStorageAccountName is not accessible. Ensure the current storage account is accessible and in the same location or affinity group as your cloud service." All the normal methods (i.e. searching Google and reading the articles found) did not help. None of the suggested fixes applied to my situation. After two hours I was stuck.

Then, by this time on the third page of Google results, I came across a trick that should have occurred to me earlier: use the -Debug switch on the call to New-AzureVM (the cmdlet that was failing). With it, I was able to see the HTTP traffic of the underlying REST management API leveraged by the Azure PowerShell cmdlets.
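In my case, that meant re-running the failing call with -Debug added (the parameter values here are illustrative, not those from the lab script):

# Re-run the failing cmdlet with -Debug to surface the underlying REST calls
New-AzureVM -ServiceName 'MyService' -Location 'West Europe' -VMs $Vm -Debug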

What I saw was the client asking for my storage account details – and Azure claiming that the storage account did not exist. On close inspection, I could see the storage account name being requested was almost, but not quite, the correct one. In creating the storage account, the script asked for user input (partner ID and student ID), then created a storage account with a unique per-student name – a nice touch. In my case, I entered a partner number beginning with a 0 (zero). And looking closely at the script, part of it strips off the leading zero and uses the zero-less storage account name for later operations – and of course that fails. To fix things, I just removed everything created thus far from Azure and re-ran the scripts with better input, and away I went.

There are two lessons here. First (as I tell all my students in almost every PowerShell class): all user input is evil unless tested and proved to the contrary. If you are writing a script that accepts user input, code defensively and assume the user is going to enter duff data. The script should have checked that the partner ID entered did not start with a zero – it's a production script, after all. Of course, I probably should have (and eventually did) use a partner number not starting with a zero, so the underlying cause was user error. Still, the lesson is that, given the chance, most users can do dumb things, and you need to write scripts accordingly.
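A minimal sketch of the kind of check the script could have made (the real lab script differs):

Function Test-PartnerId {
  # Throw early if the partner id is empty, non-numeric or has a leading zero
  param ([string] $PartnerId)
  If ($PartnerId -notmatch '^[1-9]\d*$') {
    Throw "Partner id must be numeric and must not start with a zero: [$PartnerId]"
  }
  $PartnerId
}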

The second is the value of the -Debug switch when using any Azure PowerShell cmdlet. There can be quite a lot of output, but the ability to see what the cmdlet is doing can be invaluable. In my case, it took only seconds to work out the problem once I'd seen the debug output from the call to New-AzureVM. I recommend you play with this as you get more familiar with Azure PowerShell – it can certainly help in troubleshooting other people's scripts!

Thursday, February 19, 2015

Azure Preview Portal - Improvements

A few days ago, Microsoft shipped a new version of the Azure Preview Portal – one of the two GUIs you can use to manage your Azure accounts, subscriptions and assets. The new preview portal has been in development for a while. I am hoping that sooner rather than later we'll have just one portal, but in the meantime, the improvements to the new portal are most welcome.

Leon Welicki, a PM on the Azure Websites team, has written a great article about the new portal describing the new features. See his blog article at: http://azure.microsoft.com/blog/2015/01/29/announcing-azure-preview-portal-improvements/

There are a lot of new features!


PowerShell V5 – Feb 2015 Preview

Continuing with the approach of regular updates to PowerShell V5, the PowerShell team yesterday published a new version. You can read about the version in the PowerShell team Blog: http://blogs.msdn.com/b/powershell/archive/2015/02/18/windows-management-framework-5-0-preview-february-2015-is-now-available.aspx.

The download can be found at the Microsoft Download Center: http://www.microsoft.com/en-us/download/details.aspx?id=45883. As noted in the team blog, this new version is only installable on Server 2012, Server 2012 R2 and Windows 8.1. I will also try it out on my latest build of Windows 10 and will report back once I get a chance.

The download site has three .MSU files (one for each of the OSes mentioned) plus a set of release notes. Be careful, as the names of the update files are similar! The download is not overly big (circa 10 MB for each MSU) and takes but a minute or so to install. But, since I had an earlier preview version loaded, the installation required a reboot.
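Once the machine is back, it takes only a second to confirm the new preview is live:

# Confirm the newly installed version
$PSVersionTable.PSVersion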


Wednesday, February 18, 2015

Free ebooks from MS Press on Azure

Microsoft Virtual Academy and Microsoft Press have joined forces and have issued a number of free e-books on Azure – you can download them from the web (as PDF) and enjoy them on your PC/tablet/Phone/etc. You can get the full set of books from here: http://www.microsoftvirtualacademy.com/ebooks#azure.

Not sure how long the books will remain free – or how many of them are still up to date. Given the fast pace of Azure development, these books are almost out of date before you get them. But having said that, they are still worth reading.

At present I am looking at the book: Rethinking Enterprise Storage: A Hybrid Cloud Model. Although the book is now 18 months old, there is some good thinking here. It's certainly helped me to re-evaluate how storage works in a hybrid model and why that model is so useful for my customers.

So get reading!


Tuesday, February 17, 2015

Azure IP Ranges

If you are setting up firewall exclusions related to Azure resources, it helps to know the Azure datacenter IP address ranges. Turns out – that's really pretty easy: just download the details from the Microsoft Download Centre. Go to the Azure Datacenter IP Ranges page and download the list. The actual deep link to the XML document containing the IP ranges is: http://go.microsoft.com/fwlink/?LinkId=390343. Speaking personally, I found that deep link a bit hard to see on the Datacenter IP Ranges page.

The list you can download from Microsoft contains all the Compute IP address ranges (including SQL ranges) used by the Azure datacenters around the world. Each section of the XML document specifies a geographic region and the IP address ranges associated with that region.

The download is an XML document containing the current IP address ranges for all Azure data centres around the world, except China. The document looks like this (when viewed from PowerShell ISE):

image
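If you'd rather work with the ranges programmatically, a short sketch like this pulls the document down and summarises it. The element names below match the published XML at the time of writing – verify them against the current download, as the schema may change:

# Download the Azure IP range XML and summarise the subnets per region
$Uri = 'http://go.microsoft.com/fwlink/?LinkId=390343'
[xml] $AzureIPs = (Invoke-WebRequest -Uri $Uri).Content
Foreach ($Region in $AzureIPs.AzurePublicIpAddresses.Region) {
  '{0}: {1} subnets' -f $Region.Name, @($Region.IpRange).Count
}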

The Windows Azure Datacenter IP Ranges in China are separately defined. The download centre enables you to download a separate list as the Chinese data centres are operated by 21Vianet.  You can get this document from here: https://www.microsoft.com/en-us/download/details.aspx?id=42064. It looks like this:

image

These IP address lists are published weekly. Microsoft also make a good security point: do not assume that all traffic originating from these IP address ranges is trustworthy!

Monday, February 16, 2015

Studying for Azure Exam 70-533?

If so, here's a study guide: http://vnext.azurewebsites.net/?p=10381. It works through each area covered in the exam and sets out the specifics of what is tested there. For each objective, there are links to more information about that area.

And, as the article points out, don't forget that if you take this exam before May 31st 2015, you get a free re-take of the exam should you not pass it first time.


Friday, February 13, 2015

Azure Backup Improved

In keeping with the near constant stream of improvements to all aspects of Azure, Microsoft has just announced some updates to Azure Backup.

Previously, there were several limits, including:

  • Azure Backup used only a single retention policy for backup data. This was limiting.
  • The number of backup copies was limited to 120, so daily backups covered only around the last four months.
  • Azure Backup had no way to get the initial backup data to Azure other than over the network – another limit if you have a lot of data and limited bandwidth.

But today, MS announced some major changes to Azure Backup.

  • You can now set multiple retention policies on backup data. The impact of this is that backup data can now be stored for multiple years, by maintaining more backup copies near term and fewer copies as the backup data ages.
  • The number of backup copies that can be stored in Azure has increased to 366 – a three-fold increase.
  • Azure Backup now integrates with the Azure Import service to send the initial backup data to an Azure data centre – that is, customers can ship the initial backup data on disk to the nearest Azure data centre. This can be a significant benefit if you want to back up a LARGE amount of fairly static data.

The latest update also fixes some older issues, such as not being able to back up more than 850 GB.


Tuesday, February 10, 2015

Azure Premium Storage

Before Christmas, Microsoft added a bunch of new features to Azure. One that I've only just noticed is Premium Storage. Azure provides several types of cloud storage (blobs, queues, tables and files) – with the files storage still in preview. Premium Storage, too, is in preview.

The basic idea of Azure Storage is that you can store your data in the cloud: whether that data is the VHD for an Azure VM, a message queue used to hook up different parts of an application, or whatever. Azure Storage is a fundamental building block you use to create cloud computing systems based on Azure.

The new Premium Storage feature allows you to store this data on SSD disks. This new storage option provides higher performance and lower disk latency. Not only that but, at least during the preview, Microsoft offers three types of SSD: P10, P20, and P30. These disks are 128 GB, 512 GB and 1 TB respectively. The bigger disks provide more IOPS and greater throughput.

I look forward to playing a bit more with Premium Storage! I suppose it goes without saying: you can easily use the Azure PowerShell cmdlets to manage this storage. I hope to generate a few scripts to demonstrate this!
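As a taster, creating a premium storage account in service management mode looks something like this (a sketch – it assumes your subscription is enabled for the preview and the chosen region supports premium storage; the account name is hypothetical):

# Create an SSD-backed storage account - Premium_LRS is the premium account type
New-AzureStorageAccount -StorageAccountName 'mypremiumsa' -Location 'West US' -Type 'Premium_LRS'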

For more details on Azure Storage, see Sirius Kuttiyan's recent blog post. For fuller details on Azure Storage pricing, see: http://azure.microsoft.com/en-gb/pricing/details/storage/. Premium Storage is still in preview and is offered at a bit of a discount. The 128 GB P10 disk, for example, is £5.47/month, while the 1 TB P30 is £37.54/month.

Monday, February 09, 2015

Another Way to Access Azure VMs with PowerShell

In a blog post over the weekend, I demonstrated how you can access an Azure VM from an on-premises workstation using PowerShell. To summarise the approach: you first need to trust the management certificate provided to the VM by Azure, then call Enter-PSSession with the DNS name of the service and the port opened for the PowerShell management endpoint, explicitly requesting SSL. While it took me a bit of time to work all this out (getting the cert downloaded was my biggest initial stumbling block), the approach is fairly simple.

But, like almost everything in PowerShell, it seems there is yet another way to do this. No sooner had I posted that blog article than I got a tweet:

image

And – the answer was, at that point, no. I'd not noticed that cmdlet! I don't know all of the circa 500 cmdlets (yet)! But it was a nice prod to take a look. Johan's suggestion was a good one, as the coding is simpler. With both methods, you need to create a credential object to pass to the remote machine (specifying the UserID you created when you first created the Azure VM, or some other administrator you have created on the remote Azure system). And you need to trust the certificate the Azure machine presents when negotiating SSL between your client system and the Azure server. Once you have those two done, you can enter the PSSession like this:

$PshEP = Get-AzureVM cookhamlo | Get-AzureEndpoint |
           Where-Object Name -eq 'PowerShell'
Enter-PSSession -ComputerName $VmName -Port $PshEP.Port -Credential $VmCred -UseSSL

Using Johan's suggestion, the coding would look like this:

$Azuri = Get-AzureWinRmUri -ServiceName $VmName                   
Enter-PsSession -ConnectionUri $Azuri.AbsoluteUri -Credential $VmCred

Having tried them, both approaches work perfectly fine (assuming you have a valid credential object and trust the remote system's management port certificate). But the second approach feels easier to me.
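For completeness: in both snippets, $VmCred is just an ordinary PowerShell credential holding an admin account that exists inside the VM, created along these lines:

# Build the credential object used by both approaches above
$VmCred = Get-Credential -Message 'Enter the admin credentials for the Azure VM'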

Saturday, February 07, 2015

Accessing Azure VMs with PowerShell

I've been doing quite a lot of work recently with Azure VMs and PowerShell. I've written some scripts to create a VM and to automate some common management functions on those Azure VMs (like re-sizing the VM, etc.). I'm slowly posting the actual scripts to my PowerShell scripts blog (http://pshscripts.blogspot.co.uk). In the work I'm doing and in conversations with customers, many of the Azure VMs being created are stand-alone. I see this as a great on-ramp to Azure, allowing the customer to dip their toe into the Azure water for pretty minor costs. Once they are happy, they can move on to linking their on-premises infrastructure to the Azure cloud. But in the meantime, those Azure VMs need to be managed – and of course, that means using PowerShell!

One of the first scripts I wrote was to create a simple, stand-alone VM in Azure. It took me quite a while, mainly because Azure is different to on-premises Hyper-V. I had to go through a small Azure learning curve. The first thing I want to do after creating an Azure VM is to manage it with PowerShell using remoting. You can do that, but there are a few barriers in the way (that would largely be absent in an on-premises Kerberos based authentication model!).

When you create an Azure VM, Azure creates a PowerShell endpoint to enable PowerShell management. Since the Azure VM is not in your on-premises AD, you use NTLM authentication (and authenticate against the in-VM user database). You need to provide a PowerShell credential object when logging into the remote session. But you know how to do that!!

The second issue is that, since the VM is in a different security realm, you need to do mutual authentication when creating the PowerShell session. For non-domain-joined systems, this means using SSL. Yes, there are ways around this or to simplify things, but there are security risks in doing so – so I'll stick with SSL. To use SSL, you need to trust the SSL certificate offered up by, in this case, the Azure VM.

When you create an Azure VM, you can either provide a certificate (e.g. one issued by a CA you trust), or Azure can create a self-signed certificate for the VM. To use the self-signed cert, you need to trust it. To do this, you just import the VM's default cert into your local host's trusted root store. Once in place, your system will trust the VM and complete authentication successfully.

I've automated the task of importing the cert with the Install-WinRmAzureVmCert function, which I posted tonight on my scripts blog. This script defines a function that takes an Azure VM name and Azure service name, and installs the VM's default self-signed cert into the local host's trusted root store. Using it looks like this:

$Vmname = 'CookhamLO.CloudApp.Net'
# Get the relevant Azure Subscription and set it
$SubscriptionName = (Get-AzureSubscription)[1].subscriptionname
Select-AzureSubscription $SubscriptionName
# And now install the cert
Install-WinRMAzureVMCert -CloudServiceName Cookhamlo  -VMName Cookhamlo

And once that is done, I just get the credential (I'll leave that up to the reader!), get the remote management endpoint to find the TCP port number to use for PowerShell remoting, then call Enter-PSSession. Like this:

image

As you can see at the bottom of the screenshot, I just use Enter-PSSession and am able to access the remote VM.
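In script form, the steps in the screenshot amount to roughly this (using the names from earlier in this post):

# Get the PowerShell endpoint to discover the remoting port, then connect
$PshEp = Get-AzureVM -ServiceName 'Cookhamlo' -Name 'Cookhamlo' |
           Get-AzureEndpoint | Where-Object Name -eq 'PowerShell'
Enter-PSSession -ComputerName $Vmname -Port $PshEp.Port -Credential $VmCred -UseSSL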

This is all pretty easy, albeit complicated by the security architecture of Windows and Azure. It's nice to know you can create a very secure remoting tunnel to your Azure VMs and manage them simply – just using PowerShell.


Wednesday, February 04, 2015

Creating Help Files for PowerShell – Sapien PowerShell Help Writer

In PowerShell, you can get cmdlet/script/function help in a couple of ways. First, you can use comment-based help – just put in a few carefully scripted comments and the Get-Help engine can provide help. This is fine for advanced functions, but if you want anything richer, including real cmdlet help text, you need to use MAML (Microsoft Assistance Markup Language). MAML is a variant of XML and, in my experience, is almost impossible for normal people to write using just Notepad (or another text editor).
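As a reminder, comment-based help is just a specially formatted comment block at the top of a function – a trivial example:

Function Get-Greeting {
<#
.SYNOPSIS
    Returns a greeting for a named user.
.EXAMPLE
    Get-Greeting -Name 'June'
#>
  param ([string] $Name = 'World')
  "Hello, $Name!"
}
# Get-Help Get-Greeting -Full now renders this as help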

There have been a few GUI-based tools over the years that purported to do this – but in my experience none of them has ever worked. I suspect Microsoft has some internal tools, but these have not been released outside Microsoft. Well – until now, that is!!

In her blog article entitled Introducing PowerShell Help Writer 2015, June Blender announces a new tool: PowerShell Help Writer (PHW), developed by Sapien. This tool is a fully featured help editor that makes it easy to write and manage complete and more complex help topics.

June's blog post has some more detail on the product, and you can find out even more by going over to Sapien's web site for PHW: http://www.sapien.com/software/powershell_helpwriter. Sadly, the tool is not free (it costs US$49.00). That's disappointing at one level – at that price, casual scripters, small businesses, etc. are unlikely to pay for the product. I continue to believe this tool, or something like it, should be produced by Microsoft as part of PowerShell or the PowerShell ISE.

Still, if you are writing enterprise cmdlets, or commercial products, then this tool is almost a given. Unless you are one of the very, very few who can write MAML!


Tuesday, February 03, 2015

Microsoft Cloud Platform Roadmap

In a nice change from the earlier Cone of Silence approach, Microsoft has begun to publish a detailed road map for its cloud platform. Published at http://www.microsoft.com/en-us/server-cloud/roadmap/recently-available.aspx?TabIndex=2, the road map shows features that have recently been released, are in public preview, are in development, or have been cancelled.

This means we can now see what MS are planning, and have begun to roll out in preview. Nice touch!


Monday, February 02, 2015

More Azure VM Sizes Available

At the beginning of January, Microsoft announced the general availability of bigger Azure VM sizes. The G-Series provides more RAM and more logical processors, combined with lots of SSD disk storage – these VM sizes should provide extraordinary performance. The G-Series currently scales from Standard_G1 (2 CPUs, 28 GB RAM, 412 GB SSD) through to Standard_G5 (32 CPUs, 448 GB RAM, 6.5 TB SSD). http://azure.microsoft.com/blog/2015/01/08/azure-is-now-bigger-faster-more-open-and-more-secure/ provides more capacity details for these new VMs.

These VM sizes are ideal for large database servers – not only SQL Server, but also MySQL, MongoDB, Cassandra, Cloudera, DataStax and others. Each VM can also have up to 64 TB of attached data disks! To deliver the compute grunt of these VMs, they feature the latest Intel CPUs (Intel® Xeon® processor E5 v3 family) and DDR4 memory.

When Microsoft first announced these new VM sizes, availability was restricted to the West US Azure region (but they were clear that they were working to add support in additional regions). To help me find where I can get a particular VM size, I wrote a script function that lists the regions offering that size. Here's the function:

Function wherecaniget {
  [CmdletBinding()]
  param ($VMSize)
  # Get Locations
  $Locations = Get-AzureLocation
  # Where is that VM size?
  Foreach ($loc in $Locations) {
    $ln = $loc.DisplayName
    $rs = $loc.VirtualMachineRoleSizes
    If ($rs -contains $VMSize) {$ln}
  }
}
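Using it is simple – for example, to see which regions currently offer the largest G-series size:

# Which regions offer a Standard_G5?
wherecaniget -VMSize 'Standard_G5'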

I notice that these new VM sizes are now available in the East US region as well.

image 

The rate of change is just awesome!


Sunday, February 01, 2015

Adding an Endpoint to an Azure VM

In a recent blog article, I showed you how you can create an Azure VM. If you are familiar with Hyper-V and the Hyper-V cmdlets, the approach is a little different (since you configure Azure differently). One aspect of creating a VM is how you, in effect, open ports to an Azure VM.

With Azure, you create an endpoint that, in effect, maps a public port (on the Azure service) to an internal port on your VM. The approach to doing this is pretty straightforward, although it's a pipelined pattern:

Get-AzureVM -ServiceName $ServiceName -Name $VmName |
  Add-AzureEndpoint -Name "Http" -Protocol "tcp" -PublicPort 80 -LocalPort 80 |
    Update-AzureVM

This pattern has you first get the Azure VM object, then add an Azure endpoint to it. The updated object is piped to Update-AzureVM, which commits the new endpoint to the VM. Of course, if the endpoint already exists, this pattern throws an exception. With Add-AzureEndpoint, you specify the protocol and the public and local ports. This 'one-liner' creates a public port and an internal (local) port to enable the VM to serve HTTP traffic.

I've created a simple script, over on http://pshscripts.blogspot.com, that implements a New-HttpVmEndpoint function which you can use to add a new HTTP endpoint to a virtual machine. This script omits some error handling you might wish to add. For example, you might want to check whether the endpoint already exists, or whether the VM and VM service exist. You could obviously extend the script to add other endpoints (e.g. HTTPS).
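A sketch of the first of those checks, building on the pattern above:

# Only add the Http endpoint if it is not already defined on the VM
$Vm = Get-AzureVM -ServiceName $ServiceName -Name $VmName
If ($Vm | Get-AzureEndpoint | Where-Object Name -eq 'Http') {
  "Endpoint 'Http' already exists on $VmName"
}
Else {
  $Vm | Add-AzureEndpoint -Name 'Http' -Protocol 'tcp' -PublicPort 80 -LocalPort 80 |
    Update-AzureVM
}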