Befriending Dragons

Transform Tech with Anti-bullying Cultures



PowerShell works for Amazon AWS S3 too!

More and more we have to work with data in many different locations. This week I got to work with S3 files that were moving to Azure blob storage. I was surprised to find that Amazon has published AWS cmdlets for PowerShell. It took me a little while to figure out the format and terminology so I’ll try to explain that and compare/contrast how we interact with storage in AWS and Azure. Today we will cover viewing the files.

Configure PowerShell

Well first, let’s get things set up. Install the Azure and AWS cmdlets for PowerShell. These examples will pass keys for everything so there’s no need to configure PowerShell with certificates to access the clouds.

Depending on your PowerShell version, the first time you use PowerShell after installing the AWS cmdlets you may need to run these cmdlets:

Add-Type -Path "C:\Program Files (x86)\AWS SDK for .NET\bin\Net45\AWSSDK.dll"
Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"

Connecting to Storage

S3

We’ll start with AWS S3. Each connection to S3 storage requires an AWS region (unless you use the default “US Standard”), an access id (unique identifier), a secret key, and a bucket. You are storing data within a specific region on an access point in a managed grouping called a bucket. The access id in S3 is equivalent to a storage account name in Azure. A bucket in S3 is roughly equivalent to a container in Azure.

$S3Bucket = "MyBucket"
$S3Key = "SecretKeyValue"
$S3AccessID = "AccessKey"
$AWSregion = "us-west-2"

Next let’s use those values to make a new client connection to S3. You define a configuration object that points to the full URL for the region, then pass that configuration object, the access id, and the secret key to a function that creates a client connection to S3. This sets the context for the entire session; the context does not have to be passed to the individual commands. Note that the URL changes depending on the region, for example https://s3-us-west-2.amazonaws.com.

Set-DefaultAWSRegion $AWSregion # auto-stored to $StoredAWSRegion
$AWSserviceURL = "https://s3-$AWSregion.amazonaws.com"
$config = New-Object Amazon.S3.AmazonS3Config
$config.ServiceURL = $AWSserviceURL
$S3Client = [Amazon.AWSClientFactory]::CreateAmazonS3Client($S3AccessID, $S3Key, $config)

Azure

Let’s compare that to how we connect to Azure blob storage. First you specify the location and credentials. The region is implied because the storage account name is unique across all regions. The container and secret key are similar in meaning to their S3 counterparts.

$storageAccountName = "MyStorageAccountName"
$storageAccountKey = "SecretKeyValue"
$containerName = "MyBucket"

Then you define the storage context, an object that holds the location and credentials. Alternatively you could set the default storage context for the session or for a particular profile’s connection to a given subscription.

$AzureContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey

View the Files

S3

Now you can get basic metadata about the S3 bucket:
Get-S3Bucket $S3Bucket
Get-S3BucketLocation $S3Bucket

Next let’s list the files in that bucket.

Get-S3Object -BucketName $S3Bucket

You can populate an array with the list; in this example I pull out just the name (key) of each file:
$S3FileList = (Get-S3Object -BucketName $S3Bucket).key

And you can filter the result set:
$S3FileList = (Get-S3Object -BucketName $S3Bucket | Where-Object {$_.lastmodified -lt “2/17/2015”}).Key
$S3FileList = (Get-S3Object -BucketName $S3Bucket | Where-Object {$_.key -like “*42*”}).Key
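Since these S3 files were headed to Azure, a handy next step is to pull them down to a local staging folder. Here is a minimal sketch, assuming a hypothetical C:\S3Staging folder and using the AWS Tools cmdlet Read-S3Object to download each file in the list:

# Hypothetical staging folder - adjust for your environment
$localPath = "C:\S3Staging"
foreach ($key in $S3FileList) {
    # Read-S3Object downloads one S3 object to a local file
    Read-S3Object -BucketName $S3Bucket -Key $key -File (Join-Path $localPath $key)
}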

Azure

For Azure we can do similar operations to view the files. This example lists all files in the container:

Get-AzureStorageBlob -Context $AzureContext -Container $containerName

You can also populate an array with the list:

$AzureList = Get-AzureStorageBlob -Context $AzureContext -Container $containerName

Or pull out just a single property:

(Get-AzureStorageBlob -Context $AzureContext -Container $containerName).Name

Or list just blobs that match a wildcard value:

Get-AzureStorageBlob -Context $AzureContext -Container $containerName -Blob *42*

My Work Here is Done

This intro to PowerShell for S3 opens up the door to many possibilities – data migrations, multi-cloud hybrid solutions, and whatever your imagination can conjure up! Today we reviewed how to view files, I’ll cover more in future posts. Happy PowerShelling!

Tip

When you open “Microsoft Azure PowerShell”, type ISE at the prompt to launch the Integrated Scripting Environment. It has IntelliSense, multiple script windows, and a cmdlet viewer.



Understanding WASB and Hadoop Storage in Azure

Yesterday we learned Why WASB Makes Hadoop on Azure So Very Cool. Now let’s dive deeper into Windows Azure storage and WASB. I’ll answer some of the common questions I get when people first try to understand how WASB is the same as and different from HDFS.

What is HDFS?

The Hadoop Distributed File System (HDFS) is one of the core Hadoop components; it is how Hadoop manages data and storage. At a high level, when you load a file into Hadoop the “name node” uses HDFS to chunk the file into blocks and spreads those blocks of data across the worker nodes within the cluster. Each chunk of data is stored on multiple nodes (assuming the replication factor is set to > 1) for higher availability. The name node knows where each chunk of data is stored, and that information is used by the job manager to allocate tasks and resources appropriately across nodes.

What is WASB?

Windows Azure Storage Blob (WASB) is an extension built on top of the HDFS APIs. The WASBS variation uses SSL certificates for improved security. It in many ways “is” HDFS. However, WASB creates a layer of abstraction that enables separation of storage. This separation is what enables your data to persist even when no clusters currently exist and enables multiple clusters plus other applications to access a single piece of data all at the same time. This increases functionality and flexibility while reducing costs and reducing the time from question to insight.

What is an Azure blob store, an Azure storage account, and an Azure container? For that matter, what is Azure again?

Azure is Microsoft’s cloud solution. A cloud is essentially a collection of host data centers that you don’t have to directly manage. You can request services from that cloud. For example, you can request virtual machines and storage, data services such as SQL Azure Database or HDInsight, or services such as Websites or Service Bus. In Azure you store blobs on containers within Azure storage accounts. You grant access to a storage account, you create collections at the container level, and you place blobs (files of any format) inside the containers. This illustration from Microsoft’s documentation helps to show the structure:

[Illustration: blobs live in containers within an Azure storage account]

How do I manage and configure block/chunk size and the replication factor with WASB?

You don’t. It’s not generally necessary. The data is stored in the Azure storage accounts, remaining accessible to many applications at once. Each blob (file) is replicated 3x within the data center. If you choose to use geo-replication on your account you also get 3 copies of the data in another data center within the same region. The data is chunked and distributed to nodes when a job is run. If you need to change the chunk size for memory related performance at run time that is still an option. You can pass in any Hadoop configuration parameter setting when you create the cluster or you can use the SET command for a given job.
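For example, here is a minimal sketch of a per-job override using the Invoke-Hive cmdlet covered in later posts; the parameter name mapred.max.split.size and the value are illustrative, not a recommendation:

# Override the split (chunk) size for this job only; the value shown is illustrative
Invoke-Hive -Query "SET mapred.max.split.size=268435456; SELECT COUNT(*) FROM hivesampletable;"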

Isn’t one of the selling points of Hadoop that the data sits with the compute? How does that work with WASB?

Just like with any Hadoop system the data is loaded into memory on the individual nodes at compute time (when the job runs). The difference with WASB is that the data is loaded from the storage accounts instead of from local disks. Given the way Azure data center backbones are built the performance is generally the same or better than if you used disks locally attached to the VMs.

How do I load data to Hadoop on Azure?

You use any of the many Azure data loading methods. There isn’t really anything special about loading data that will be used for Hadoop. As with data used by any other application there are some guidelines around directory structures, optimal numbers of files, and internal format but that is independent of data loading. Some common examples are AZCopy, CloudXplorer and other storage explorers, and SQL Server Integration Services (SSIS).

And yes, I will blog about those guidelines but not here. 🙂
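As a sketch of one loading method, the Azure PowerShell cmdlet Set-AzureStorageBlobContent uploads a local file to a container. The file name and blob path here are made up, and $AzureContext is a storage context like the one built in the earlier PowerShell post:

# Upload a local file as a blob; the "directory" is just part of the blob name
Set-AzureStorageBlobContent -File "C:\data\sales.csv" -Container $containerName `
    -Blob "SomeDirectory/sales.csv" -Context $AzureContext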

Can I have multiple Hadoop clusters pointing to one storage account?

Yes.

Can I have one Hadoop cluster pointing to multiple storage accounts?

Yes. Check!

See: Use Additional Storage Accounts with HDInsight Hive.

Can I have many Hadoop clusters pointing to multiple storage accounts?

Why, yes. Yes you can. Check!

Do I get to keep my data even if no Hadoop cluster currently exists?

What a fun day to say Yes. Check!

For a caveat see HDInsight: Hive Internal and External Tables Intro.

Is WASB available for any distribution of Hadoop other than HDInsight?

It is my pleasure to answer that with a resounding Yes. Check!

WASB is built into HDInsight (Microsoft’s Hadoop on Azure service) and is the default file system. WASB is also available in the Apache source code for Hadoop. Therefore when you install Hadoop, such as Hortonworks HDP or Cloudera EDH/CDH, on Azure VMs you can use WASB with some configuration changes to the cluster.

How do I manage files and directories?

Hive is the most common entry point for Hadoop jobs and with Hive you never point to a single file, you always point to a directory. If you are a stickler for details and want to point out that Azure doesn’t have directories, that’s technically true. However, Hadoop recognizes that a slash “/” is an indication of a directory. Therefore Hadoop treats the below Azure blob file as if it were AFile.txt in a directory structure of: SomeDirectory/ASubDirectory. But since you don’t access individual files in Hive you will reference either SomeDirectory or SomeDirectory/ASubDirectory.

Blob: wasb://YOURDefaultContainer@YOURStorageAccount.blob.core.windows.net/SomeDirectory/ASubDirectory/AFile.txt

You can add, remove, and modify files in the Azure blob store without regard to whether a Hadoop cluster exists. Each time a job runs it reads the data that currently exists in the directories it references. Hadoop itself can also write to files.
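To make that concrete, here is a sketch of a Hive external table pointing at the directory rather than a file; the table name and columns are hypothetical:

# A Hive external table references a directory; every file under it becomes table data
Invoke-Hive -Query @"
CREATE EXTERNAL TABLE SomeTable (col1 STRING, col2 INT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION 'wasb://YOURDefaultContainer@YOURStorageAccount.blob.core.windows.net/SomeDirectory/ASubDirectory';
"@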

What about ORCFile, Parquet, and AVRO?

They are specialized file formats often used within Hadoop but rarely used outside of it. There are performance advantages to using those formats for “write once, read many” data inside Hadoop, but chances are high that you won’t then be able to access the data without going through one of your Hadoop clusters.

Should I have lots of small files?

NO! No!

The full answer to “why” is too long for this post. The short answer is to use files that are many multiples of the in-memory chunk size, in the GB or TB size range. Whenever possible use fewer, larger files instead of many small files. If necessary stitch the files together.

That’s your storage lesson for today – please put your additional Hadoop on Azure storage questions in the comments or send me a tweet! Thanks for stopping by!

Cindy Gross – Neal Analytics: Big Data and Cloud Technical Fellow
@SQLCindy | @NealAnalytics | CindyG@NealAnalytics.com | http://smallbitesofbigdata.com

http://blogs.msdn.com/b/cindygross/archive/2015/02/04/understanding-wasb-and-hadoop-storage-in-azure.aspx

http://www.nealanalytics.com/understanding-wasb-and-hadoop-storage-in-azure/




Why WASB Makes Hadoop on Azure So Very Cool

Data. It’s all about the data. We want to make more data driven decisions. We want to keep more data so we can make better decisions. We want that data stored cheaply, easily accessible, and quickly ingested. Hadoop promises to help with all those things. However, when you deal with Hadoop on-premises you have a multi-step process to load the data. Azure and WASB to the rescue!

With a typical Hadoop installation you load your data to a staging location then you import it into the Hadoop Distributed File System (HDFS) within a single Hadoop cluster. That data is manipulated, massaged, and transformed. Then you may export some or all of the data back to a non-HDFS system (a SAN, a file share, a website).

What’s different in the cloud? With Azure you have Azure Blob Storage Accounts. Data can be stored there as blobs in any format. That data can be accessed by various applications – including Hadoop without first doing a separate load into HDFS! This is made possible because Microsoft used the public extensions available with HDFS to create the Windows Azure Storage Blobs (WASB) interface between Hadoop and the Azure blob storage. This WASB code is available for any distributor of Hadoop in the Apache source code and it is the default storage system in HDInsight – Microsoft’s Hadoop on Azure PaaS offering. It is also available for Hortonworks HDP on Azure VMs or Cloudera EDH/CDH on Azure VMs with some manual configuration steps.

With WASB you load your data to Azure blobs at any time – whether Hadoop clusters currently exist or not. That way you aren’t paying for Hadoop compute time simply to load data. You spin up one or more clusters, point them at the data sets (yes, multiple clusters pointing to same data!), and run your Hadoop jobs. When you don’t need the system for a while you take down your Hadoop cluster(s) and the data is still there. At any point, whether one or more Hadoop clusters are accessing the data or not, other applications can still access and manipulate the data. For example, you could have data sitting on an Azure storage account that is being added to by a SQL Server Integration Services (SSIS) job. At the same time someone is using Power Query to load that data into PowerPivot while a website inserts new data to the same location. Meanwhile your R&D department can be running highly intensive jobs that require a large cluster up for many days or weeks at a time, and your sales team can have a separate, smaller cluster that’s up for a few hours a day – all pointing at the same data!

With this separation of storage and compute you have simplified your data accessibility, reduced data movement and copies, and reduced the time it takes to have your data available! That all adds up to lower costs and a faster, more data-driven time to insight.

Cindy Gross – Neal Analytics: Big Data and Cloud Technical Fellow  
@SQLCindy | @NealAnalytics | CindyG@NealAnalytics.com | http://smallbitesofbigdata.com

http://www.nealanalytics.com/why-wasb-makes-hadoop-on-azure-so-very-cool/

http://blogs.msdn.com/b/cindygross/archive/2015/02/03/why-wasb-makes-hadoop-on-azure-so-very-cool.aspx



Azure Maximums and Resource Usage from PowerShell


Have you ever struggled to find out how many VM cores, HDInsight cores, storage accounts, or other Azure resources your subscription is set to allow or how many you actually use? Maybe you want to use this information in your automation scripts to avoid trying to create components for which you don’t have resources.


PowerShell to the rescue!

First a couple of key points. There are various maximums in Azure. Today we are talking about finding the currently configured maximums allowed for a specified subscription. There are default maximums (default limit) which you can increase for a given subscription by opening a billing support ticket. There are also hard maximums (maximum limit). However, with some products, such as HDInsight (Hadoop), you can get past some per-subscription maximums for dependent services by combining resources (storage accounts) from multiple subscriptions for a single HDInsight cluster. All the samples below find the current billing quota limitation and actual usage for the current subscription.

Let’s take a look at the information available on the subscription level cmdlet.

Start by checking which subscription is in focus / current for the PowerShell session.

(Get-AzureSubscription -Current).SubscriptionName

(Get-AzureSubscription -Current).CurrentStorageAccountName

If you need information on a different subscription either pass the subscription name (as defined on your client) for the cmdlets that support this or change the focus to a different subscription.

$SubName = “sqlcatwoman”

Select-AzureSubscription -SubscriptionName $SubName

Now we will look at the cores available for Azure virtual machines (VMs / IaaS). Note that HDInsight cores are tracked separately. Be careful with unexpected line wraps that may paste into your PowerShell window (or ISE) incorrectly. The below snippet is 1 comment line and 4 lines of code.

# How many cores are available to create new VMs (or increase size of existing VMs) for the current subscription?
[int]$maxVMCores     = (Get-AzureSubscription -Current -ExtendedDetails).MaxCoreCount
[int]$currentVMCores = (Get-AzureSubscription -Current -ExtendedDetails).CurrentCoreCount
[int]$availableCores = $maxVMCores - $currentVMCores
Write-Host "Cores available for VMs:" $availableCores

We can get similar information about cloud services:

#how many cloud (hosted) services are available on this subscription
[int]$maxAvl       = (Get-AzureSubscription -Current -ExtendedDetails).MaxHostedServices
[int]$currentUsed  = (Get-AzureSubscription -Current -ExtendedDetails).CurrentHostedServices
[int]$availableNow = $maxAvl - $currentUsed
Write-Host "Cloud services available:" $availableNow

Some limits and usage are available on cmdlets specific to a particular technology. For example, the HDInsight usage and maximums are available from the Get-AzureHDInsightProperties cmdlet. You can find details and samples on Get HDInsight Properties with PowerShell.

Other times we have to look at different cmdlets for different pieces of the information, such as for storage accounts:

#how many storage accounts are available on this subscription
[int]$maxAvl       = (Get-AzureSubscription -Current -ExtendedDetails).MaxStorageAccounts
[int]$currentUsed  = (Get-AzureStorageAccount).Count
[int]$availableNow = $maxAvl - $currentUsed
Write-Host "Storage Accounts available:" $availableNow

We can look at all the extended properties available for a subscription:

Get-AzureSubscription -Current -ExtendedDetails

If you know you have a particular component created and this cmdlet shows the “Current” value is zero, take a look at the Get-Azure… cmdlet for that particular type of resource and look for a “Current” value.

Another handy thing to look at is the overall information about what Azure regions exist and what services are available in each region:

Get-AzureLocation 

And you can pull off specific information:

Get-AzureLocation  | Select DisplayName
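For example, assuming the AvailableServices property on the objects Get-AzureLocation returns, you can check what a single region offers:

# Services available in one region; the region name is an example
(Get-AzureLocation | Where-Object { $_.DisplayName -eq "West US" }).AvailableServices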

I hope these small bites of PowerShell help save the day for you in some way!



Get HDInsight Properties with PowerShell

Small Bites of Big Data from AzureCAT

You’ve created your HDInsight Hadoop clusters and now you want to know exactly what you have out there in Azure. Maybe you want to pull the key information into a repository periodically as a reference for future troubleshooting, comparisons, or billing. Maybe you just need to get a quick look at your overall HDInsight usage. This is something you can easily automate with PowerShell.

Environment

First, open Windows Azure PowerShell or powershell_ise.exe.

Set some values for your environment:

$SubName = "YourSubscriptionName"
Select-AzureSubscription -SubscriptionName $SubName
Get-AzureSubscription -Current
$ClusterName = "HDInsightClusterName" #HDInsight cluster name

HDInsight Usage for the Subscription

Take a look at your overall HDInsight usage for this subscription:

Get-AzureHDInsightProperties

Get-AzureHDInsightProperties returns the number of clusters for this subscription, the total HDInsight cores used and available (for head nodes and data nodes), the Azure regions where HDInsight clusters can be created, and the HDInsight versions available for new clusters:

ClusterCount    : 2
CoresAvailable  : 122
CoresUsed       : 48
Locations       : {East US, North Europe, Southeast Asia, West Europe...}
MaxCoresAllowed : 170
Versions        : {1.6, 2.1, 3.0}

You can also pick out specific pieces of information and write them to a file, store them as variables, or use them elsewhere. This example simply outputs the values to the screen.

write-host '== Max HDInsight Cores for Sub: ' (Get-AzureHDInsightProperties).MaxCoresAllowed
write-host '== Cores Available:             ' (Get-AzureHDInsightProperties).CoresAvailable
write-host '== Cores Used:                  ' (Get-AzureHDInsightProperties).CoresUsed
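If you want the repository mentioned above, one quick sketch is to append a dated snapshot to a CSV (the path is hypothetical):

# Append a dated snapshot of HDInsight usage to a CSV for later comparison
Get-AzureHDInsightProperties |
    Select-Object ClusterCount, CoresUsed, CoresAvailable, MaxCoresAllowed,
        @{Name="CapturedOn"; Expression={Get-Date}} |
    Export-Csv -Path "C:\Reports\HDInsightUsage.csv" -Append -NoTypeInformation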

HDInsight Cluster Information

Get-AzureHDInsightCluster provides information about all existing HDInsight clusters for this subscription:

Get-AzureHDInsightCluster

As you can see, this cmdlet tells you the size, connection information, and version.

ClusterSizeInNodes    : 4
ConnectionUrl         : https://BigCAT.azurehdinsight.net
CreateDate            : 4/5/2014 3:37:23 PM
DefaultStorageAccount : sqlcatwomanwestus.blob.core.windows.net
HttpUserName          : Admin
Location              : West US
Name                  : BigCAT30
State                 : Running
StorageAccounts       : {}
SubscriptionId        : {YourSubID}
UserName              : Admin
Version               : 3.0.0.0.661685
VersionStatus         : Compatible

ClusterSizeInNodes    : 4
ConnectionUrl         : https://cgrosstest.azurehdinsight.net
CreateDate            : 5/5/2014 6:09:58 PM
DefaultStorageAccount : cgrosstest.blob.core.windows.net
HttpUserName          : Admin
Location              : West US
Name                  : cgrosstest
State                 : Running
StorageAccounts       : {sqlcatwomanwestus.blob.core.windows.net}
SubscriptionId        : {YourSubID}
UserName              : Admin
Version               : 3.0.2.0.727283
VersionStatus         : Compatible

You can also get information about just one HDInsight cluster at a time:

Get-AzureHDInsightCluster  -name $ClusterName

Or you can get very granular and look at specific properties, even some that aren’t in the default values:

write-host '== Default Storage Account:     ' `
(Get-AzureHDInsightCluster -Cluster $ClusterName).DefaultStorageAccount.StorageAccountName.split(".")[0]
write-host '== Default Container:           ' `
(Get-AzureHDInsightCluster -Cluster $ClusterName).DefaultStorageAccount.StorageContainerName

This information will be a valuable source of information for tracking past configurations, current usage, and planning. Enjoy your Hadooping!

Sample Script

# Cindy Gross 2014
# Get HDInsight properties
$SubName = "YourSubscriptionName"
Select-AzureSubscription -SubscriptionName $SubName
Get-AzureSubscription -Current
$ClusterName        = "YourHDInsightClusterName" #HDInsight cluster name

Get-AzureHDInsightProperties 
Get-AzureHDInsightCluster 
Get-AzureHDInsightCluster  -name $ClusterName
write-host '== Default Storage Account:     ' `
(Get-AzureHDInsightCluster -Cluster $ClusterName).DefaultStorageAccount.StorageAccountName.split(".")[0]
write-host '== Default Container:           ' `
(Get-AzureHDInsightCluster -Cluster $ClusterName).DefaultStorageAccount.StorageContainerName
write-host '== Max HDInsight Cores for Sub: ' (Get-AzureHDInsightProperties).MaxCoresAllowed
write-host '== Cores Available:             ' (Get-AzureHDInsightProperties).CoresAvailable
write-host '== Cores Used:                  ' (Get-AzureHDInsightProperties).CoresUsed



Use Additional Storage Accounts with HDInsight Hive

When you create an HDInsight Hadoop cluster you pass in one or more storage accounts and their associated keys. This allows you to access the files on all associated storage accounts from the cluster. If you want to use public storage that isn’t passed in at create time that’s easy – simply supply the storage account name each time you run a job. But how do you access data on private storage accounts that need an access key?

The steps are laid out in this wiki by Eric Hanson: Using an HDInsight Cluster with Alternate Storage Accounts and Metastores

http://social.technet.microsoft.com/wiki/contents/articles/23256.using-an-hdinsight-cluster-with-alternate-storage-accounts-and-metastores.aspx

I am providing a variable-based variation of the PowerShell sample for Hive. To set up PowerShell for use with Azure see Getting Started with Azure PowerShell Cmdlets–Subscription Management.

First you will set some values for your environment. If you use your default subscription you don’t need to pass in the subscription name and select it. However, you will always need to specify the HDInsight cluster name. In this example $undefinedStorageAccount is the name of an account that you want to access from a cluster but you didn’t define it when you created the cluster. You always need to specify which container to use for any given reference so you also need to define $undefinedContainer. If the storage account belongs to the current subscription you can simply ask Azure to return the key (#commented out in the example below) or you can paste in the key that someone has given you.

$subscriptionName = "LocalAzureSubscriptionName"
$clusterName = "HDInsightClusterName"
$undefinedStorageAccount = "AdditionalStorageAccount"
$undefinedContainer = "ContainerOnAdditionalStorageAccount"
#$undefinedStorageKey = Get-AzureStorageKey $undefinedStorageAccount | %{ $_.Primary }
$undefinedStorageKey = "YourActualAccessKeyFromAzurePortal"

Now choose which of your locally defined subscriptions to use:

Select-AzureSubscription -SubscriptionName $subscriptionName

Set the context of the cluster you want to use:

Use-AzureHDInsightCluster $clusterName

Now let’s check your HDInsight cluster properties.

$defaultStorageAccount  = (Get-AzureHDInsightCluster -Name $clusterName).DefaultStorageAccount.StorageAccountName #default/only storage account
$defaultContainerName   = (Get-AzureHDInsightCluster -Name $clusterName).DefaultStorageAccount.StorageContainerName
$definedStorageAccounts = (Get-AzureHDInsightCluster -Name $clusterName).StorageAccounts #no 2nd account is associated, no value is returned

Let’s check the values and verify that the storage account you want to use is not listed as either the DefaultStorageAccount (every cluster has one) or as one of the additional known storage accounts configured during provisioning (you may have zero, one, or many).

write-host "===Default storage account"
$defaultStorageAccount
write-host "===Default container name"
$defaultContainerName
write-host "===Other defined storage accounts for this cluster"
$definedStorageAccounts

Next we’ll get a non-recursive listing of the files in the default location:

invoke-hive "dfs -ls wasb://$defaultContainerName@$defaultStorageAccount/;" #default storage

And then try to get a listing for the private storage account that we have not associated with the cluster:

invoke-hive "dfs -ls wasb://$undefinedContainer@$undefinedStorageAccount/;" #not associated, errors

Because the storage account access key is not yet known you will see an error similar to this one:

Logging initialized using configuration in file:/C:/apps/dist/hive-0.12.0.2.0.7.0-1559/conf/hive-log4j.properties
ls: org.apache.hadoop.fs.azure.AzureException: Unable to access container xyz in account abc using anonymous credentials, 
and no credentials found for them  in the configuration.
Command failed with exit code = 1

But we can fix this! From PowerShell we can pass in “defines” statements to change configuration values, add libraries, etc.

$defines = @{}
$defines.Add("fs.azure.account.key.$undefinedStorageAccount.blob.core.windows.net", $undefinedStorageKey)
Invoke-Hive -Defines $defines -Query "dfs -ls wasb://$undefinedContainer@$undefinedStorageAccount.blob.core.windows.net/;"

The access key is only available to this Hive query, but now that I have the variables set I can pass it in to other queries as well. Happy Hiving!
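For example, a follow-up query could create an external table over data in the newly reachable account; the table name, schema, and directory below are hypothetical:

# Reuse the same $defines so this query can also authenticate to the extra account
Invoke-Hive -Defines $defines -Query @"
CREATE EXTERNAL TABLE ExtraData (line STRING)
LOCATION 'wasb://$undefinedContainer@$undefinedStorageAccount.blob.core.windows.net/SomeDirectory';
"@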

I hope you enjoyed this small bite of Big Data!



Getting Started with Azure PowerShell Cmdlets–Subscription Management

I’ve started using the Azure PowerShell cmdlets more often to manage virtual machines and HDInsight in Azure. Once you connect to a subscription everything just works. However, the initial steps to get one or more subscriptions configured to be used from your machine or understanding how to change subscription information on your machine can be confusing. Some of the docs are contradictory, outdated, or incomplete. Often they assume you are only a co-admin of one subscription. The below steps should get you going with Azure cmdlets whether you admin one or many subscriptions.

You need to enable your machine to talk to one or more Azure subscriptions. The first step is creating a certificate. Do NOT do this if you already used the PublishSettings commands unless you first use Remove-AzureSubscription (which removes the locally stored information about the specified subscription). Makecert is more secure than PublishSettings, especially if you (a given email address) have multiple co-administrators per subscription and/or you (a given email address) are a co-administrator of multiple subscriptions.

The steps to get going are documented in Shep’s blog “Cloud Spelunking, Managing Azure form your Desktop via PowerShell (the Setup)” http://blogs.msdn.com/b/sql_shep/archive/2013/03/29/cloud-spelunking-managing-azure-form-your-desktop-via-powershell.aspx. I’ll go a bit deeper and fill in a few additional details on what Shep calls the “hard” option.

Create a Certificate

If you have IIS, Visual Studio, or the Windows SDK you will have some variation of a “Developer Command Prompt” (or VS201x or Visual Studio Command Prompt). Open that command prompt with the “run as administrator” option. Replace YourCertName with a meaningful name and run the below command. The cert always goes to the cert store on your local machine – the last parameter is an optional file based copy of that certificate that we will need for the next step. If you don’t specify the location it goes to %windir%\system32. Be very protective of the .cer file – delete it once you have uploaded it. You can always generate another file if you need it.

makecert -sky exchange -r -n "CN=<YourCertName>" -pe -a sha1 -len 2048 -ss My "c:\temp\<YourCertName>.cer"

This certificate is yours – do not share it with others. If you want to reuse the certificate on other machines that you control, you can copy the .cer file to those machines and import them into the local certificate store on each machine. The .cer is just a copy, the actual certificate was loaded into your local certificate store (Manage Computer Certificates) by makecert.

Upload Certificate to Azure Subscription(s)

Generally you will not want to share certificates with others. Any certificate you use must be in your local certificate store (Manage Computer Certificates). The same certificate must also be uploaded to the portal and associated with each subscription you wish to manage from your machine.

From your local machine where you created the certificate in the above step:

  • Log in to the Azure Portal with an email address that is associated with the subscription you want to use from your own machine.
  • Scroll to the bottom of the left pane and choose “SETTINGS”


  • Choose “MANAGEMENT CERTIFICATES”


  • Click on the “UPLOAD” button in the middle of the bar at the bottom of the screen.


  • In the “Upload a management certificate” dialog navigate to the location specified in the last parameter above or %windir%\system32 if you didn’t specify a location. Choose the .cer file you just created with makecert (or export a certificate from the local certificate store – just make sure it has the right properties). If you have multiple subscriptions there is a 2nd drop down box where you need to choose the subscription that the certificate will be associated with.


  • Repeat for any additional subscriptions that you want to manage with the same certificate (or create one certificate per subscription for additional security granularity).

Install and Configure the Azure PowerShell Cmdlets

Follow the steps here to install the Azure Cmdlets. Basically you are selecting “Azure PowerShell” from the Web Platform Installer. You can also check in the Web Platform Installer for updated versions of the cmdlets.

A very common setting that many admins set is the RemoteSigned Execution Policy. This is less secure than AllSigned or Restricted but allows you to use most downloaded scripts.

Open Windows Azure PowerShell with the “run as admin” option and run:

Set-ExecutionPolicy RemoteSigned -Force
Get-ExecutionPolicy -List

If you see errors when setting the execution policy, search on your specific error or start with this blog: Set-ExecutionPolicy : Windows PowerShell updated your execution policy successfully, but the setting is overridden by a policy defined at a more specific scope!!! You may need to open “Edit Group Policy” (in Windows 8 that opens the Local Group Policy Editor) and make a change.  Sometimes you may need to set each individual scope, but process scope settings go back to the default when the process is closed:

Set-ExecutionPolicy RemoteSigned -Scope Process -Force

Then import the Azure cmdlets:

Import-Module Azure

You can close the PowerShell window, you no longer need to “run as admin”.

Enable PowerShell to use a Subscription via a Certificate

Repeat this section on each machine that will be used to execute PowerShell code. Also repeat for additional subscriptions on each machine.

Open Windows Azure PowerShell. Optionally type ISE to open the Integrated Scripting Environment where you can edit, save, and run collections of cmdlets.

First, set some variables. You will need to copy some basic settings from the Azure Management Portal. On the far left side of the portal, scroll all the way to the bottom and choose “SETTINGS” and “MANAGEMENT CERTIFICATES” (see the “Upload Certificate to Azure Subscription(s)” section of this blog for more details – you are copying from the same place where you uploaded the certificate). Choose the certificate you just uploaded. Don’t worry if the numbers are cut off on the screen; if you highlight and copy, you will get the whole value, even the part that doesn’t show on the screen. Replace the $subID and $thumbprint below – do not update $myCert as that is built from your other variables. Execute the code in the PowerShell window.

#copy SUBSCRIPTION ID from portal 
#lower left, settings, management certificates
$subID = "11111111-2222-3333-4444-555555555555"
#copy THUMBPRINT from portal 
#lower left, settings, management certificates
$thumbprint = "1234567891234567891234567891234567891234"
$myCert = Get-Item cert:\CurrentUser\My\$thumbprint

Now set the subscription name you will use to refer to this subscription from this machine. In most cases you will choose the NAME of the subscription from the portal but that is not required. The matching between your machine’s knowledge of the subscription and the subscription on Azure is done via the SUBSCRIPTION ID. Update $localSubName below and execute the code in the PowerShell window. Note that the local subscription name is case-sensitive.

#subname to be used locally
#usually you will choose the actual subscription name
#stored in %appdata%\Windows Azure PowerShell\WindowsAzureProfile.xml
$localSubName = "MyFavSub"

Now that you have set the values for your own environment, run the code to actually update your machine’s knowledge of the subscription. Note that I used the back tick “`” to specify that the command continues on a new line.

Set-AzureSubscription -SubscriptionName $localSubName `
-SubscriptionId $subID -Certificate $myCert

Some operations rely on a default storage account, so you may want to set the default storage account to use for each subscription.

#optionally set "current" storage account for this sub
$defaultStorageAccount = 'MyFavStorageAccount'
Set-AzureSubscription -SubscriptionName $localSubName `
-CurrentStorageAccount $defaultStorageAccount

Next you can set the default subscription that you will start with when you open PowerShell on this machine (note that we’ve changed from the Set cmdlet to the Select one):

Select-AzureSubscription -Default $localSubName

You can change which of the configured subscriptions is the current one:

Select-AzureSubscription -Current $localSubName

Check to see which subscription you are currently using:

Get-AzureSubscription -Current
(Get-AzureSubscription -Current).SubscriptionName

Verify that you can connect and list the services associated with the current subscription:

Get-AzureService | select ServiceName

Look at the Local Configuration

Now let’s look at what got updated on the local machine.

Open File Explorer and go to %appdata%\Windows Azure PowerShell. Open WindowsAzureProfile.xml in Notepad or your favorite editor. Here are a few of the key values for each subscription you have mapped on your machine:

IsDefault tells you which one is the default subscription for your machine

<IsDefault>true</IsDefault>

The thumbprint id is stored as the ManagementCertificate:

<ManagementCertificate>1234567891234567891234567891234567891234</ManagementCertificate>

The local name you chose for the subscription is stored in Name (to avoid confusion choose the name used in the portal):

<Name>MyFavSub</Name>

The subscription id is stored in SubscriptionId:

<SubscriptionId>11111111-2222-3333-4444-555555555555</SubscriptionId>

Remove Subscription

If you need to remove a subscription from your machine, whether because you no longer have access to it or because you want to change one of the properties such as the name or which certificate you use, you can use Remove-AzureSubscription. This updates your local %appdata%\Windows Azure PowerShell\WindowsAzureProfile.xml.

#RemoveSub
#Remove my machine's knowledge of a subscription 
#Removes info from %appdata%\Windows Azure PowerShell\WindowsAzureProfile.xml
Remove-AzureSubscription -SubscriptionName MyFavSub

Sample Script

Here is a handy dandy cut/paste version of the above PowerShell code to add a subscription and make it your default and current subscription:

#copy SUBSCRIPTION ID from portal
#lower left, settings, management certificates
$subID = "YourOwnSubID"
#copy THUMBPRINT from portal
#lower left, settings, management certificates
$thumbprint = "YourCertThumbprint"
$myCert = Get-Item cert:\CurrentUser\My\$thumbprint
#subname to be used locally
#usually you will choose the actual subscription name
#stored in %appdata%\Windows Azure PowerShell\WindowsAzureProfile.xml
$localSubName = "YourSubscriptionName"
#optionally set "current" storage account for this sub
$defaultStorageAccount = 'OptionalDefaultStorage'
Set-AzureSubscription -SubscriptionName $localSubName `
    -SubscriptionId $subID -Certificate $myCert
Set-AzureSubscription -SubscriptionName $localSubName `
    -CurrentStorageAccount $defaultStorageAccount
Select-AzureSubscription -Default $localSubName
Select-AzureSubscription -Current $localSubName
Get-AzureSubscription -Current
(Get-AzureSubscription -Current).SubscriptionName

You are Ready for PowerShell Gooey Goodness!

Woohoo! Now you can access your Azure subscriptions from your machine without entering ids and passwords. You can automate, simplify, and standardize any Azure activity that has an associated cmdlet! Happy PowerShelling!





Sample PowerShell Script: HDInsight Custom Create

This is a working script I use to create various HDInsight clusters. For a really reproducible, automated environment you would want to put this into a .ps1 script that accepts parameters (see here for an example). However, you may find the method below good for learning and experimenting. Replace all the “YOURxyz” sections with your actual information. Beware of oddities introduced by cut/paste such as spaces being replaced by line breaks or quotes being replaced by smart quotes. The # is a comment, some commands that you rarely run are commented out so remove the # to run them if you need them.

# This PowerShell script is meant to be a cut/paste of specific parts, it is NOT designed to be run as a whole.

# Do once after you install the cmdlets
#Get-AzurePublishSettingsFile
#Import-AzurePublishSettingsFile C:\Users\YOURDirectory\Downloads\YOURName-credentials.publishsettings

# Use if you admin more than one subscription
#Get-AzureAccount # This may be needed to log in to Azure
Select-AzureSubscription -SubscriptionName YOURSubscription
Get-AzureSubscription -Current

# Many things are easier in the ISE
ise

###############################################
### create clusters ###
###############################################

# Add your specific information here
# Previous failures may make a name unavailable for a while – check to see if previous cluster was partially created
$ClusterName = "YOURNewHDInsightClusterName" #the name you will give to your cluster
$Location = "YOURDataCenter" #cluster data center must be East US, West US, or North Europe (as of December 2013)
$NumOfNodes = 1 #start small
$StorageAcct1 = "YOURExistingStorageAccountName" #currently must be in same data center as the cluster
$DefaultContainer = "YOURExistingContainerName" #already exists on the storage account

# These variables are automatically set for you
$FullStorage1 = "${StorageAcct1}.blob.core.windows.net"
$Key1 = Get-AzureStorageKey $StorageAcct1 | %{ $_.Primary }
$SubID = Get-AzureSubscription -Current | %{ $_.SubscriptionId }
$SubName = Get-AzureSubscription -Current | %{ $_.SubscriptionName }
$Cert = Get-AzureSubscription -Current | %{ $_.Certificate }
$Creds = Get-Credential -Message "New admin account to be created for your HDInsight cluster" #this prompts you

###############################################
# Sample quick create
###############################################
# Equivalent of quick create
# The ` specifies that the cmd continues on the next line, beware of artificial line breaks added during cut/paste from the blog
New-AzureHDInsightCluster -Name $ClusterName -ClusterSizeInNodes $NumOfNodes -Subscription $SubID -Location "$Location" `
-DefaultStorageAccountName $FullStorage1 -DefaultStorageAccountKey $Key1 -DefaultStorageContainerName $DefaultContainer -Credential $Creds

###############################################
# Sample custom create
###############################################
#https://hadoopsdk.codeplex.com/wikipage?title=PowerShell%20Cmdlets%20for%20Cluster%20Management
# Most params are the same as quick create, use a new cluster name
# Pass in a 2nd storage account, a SQLAzure db for the metastore (assume same db for Oozie and Hive), add Avro library, some config values
# Execute all the variable settings from above

# This value is set for you, don’t change!
$configvalues = new-object 'Microsoft.WindowsAzure.Management.HDInsight.Cmdlet.DataObjects.AzureHDInsightHiveConfiguration'

# Add your specific information here
$ClusterName = "YOURNewHDInsightClusterName"
$StorageAcct2 = "YOURExistingStorageAccountName2"
$MetastoreAzureSQLDBName = "YOURExistingSQLAzureDBName"
$MetastoreAzureServerName = "YOURExistingSQLAzureServer.database.windows.net" #gives a DNS error if you don't use the full name
$configvalues.Configuration = @{ "hive.exec.compress.output"="true" }  #this is an example of a config value you may pass in

# These variables are automatically set for you
$FullStorage2 = "${StorageAcct2}.blob.core.windows.net"
$Key2 = Get-AzureStorageKey $StorageAcct2 | %{ $_.Primary }
$MetastoreCreds = Get-Credential -Message "existing id/password for your SQL Azure DB (metastore)" #This prompts for the existing id and password of your existing SQL Azure DB

# Add a config file value
# Add AVRO SerDe libraries for Hive (on storage 1)
$configvalues.AdditionalLibraries = new-object 'Microsoft.WindowsAzure.Management.HDInsight.Cmdlet.DataObjects.AzureHDInsightDefaultStorageAccount'
$configvalues.AdditionalLibraries.StorageAccountName = $FullStorage1
$configvalues.AdditionalLibraries.StorageAccountKey = $Key1
$configvalues.AdditionalLibraries.StorageContainerName = "hivelibs" #container called hivelibs must exist on specified storage account
# Create custom cluster
New-AzureHDInsightClusterConfig -ClusterSizeInNodes $NumOfNodes `
| Set-AzureHDInsightDefaultStorage -StorageAccountName $FullStorage1 -StorageAccountKey $Key1 -StorageContainerName $DefaultContainer `
| Add-AzureHDInsightStorage -StorageAccountName $FullStorage2 -StorageAccountKey $Key2 `
| Add-AzureHDInsightMetastore -SqlAzureServerName $MetastoreAzureServerName -DatabaseName $MetastoreAzureSQLDBName -Credential $MetastoreCreds -MetastoreType OozieMetastore `
| Add-AzureHDInsightMetastore -SqlAzureServerName $MetastoreAzureServerName -DatabaseName $MetastoreAzureSQLDBName -Credential $MetastoreCreds -MetastoreType HiveMetastore `
| Add-AzureHDInsightConfigValues -Hive $configvalues `
| New-AzureHDInsightCluster -Subscription $SubID -Location "$Location" -Name $ClusterName -Credential $Creds

###############################################
# get status, properties, etc.
###############################################
#$SubName = Get-AzureSubscription -Current | %{ $_.SubscriptionName }
Get-AzureHDInsightProperties -Subscription $SubName
Get-AzureHDInsightCluster -Subscription $SubName
Get-AzureHDInsightCluster -Subscription $SubName -name YOURClusterName

###############################################
# remove cluster
###############################################
#Remove-AzureHDInsightCluster -Name $ClusterName -Subscription $SubName



Your First HDInsight Cluster–Step by Step

Small Bites of Big Data from AZURECAT
Big Data Tech Training Series #1
Cindy Gross | Murshed Zaman

Sometimes it is just hard to get started. Have you been putting off your first foray into Hadoop? Are you not sure where to begin? Let’s get really basic.

Prerequisites:

Log on to the Windows Azure Portal https://manage.windowsazure.com

Go to storage https://manage.windowsazure.com/#Workspaces/StorageExtension/storage. Create a storage account in a location that is available to HDInsight (as of November 2013 that’s East US, West US, and North Europe). Do NOT choose an affinity group. If you choose to “Enable Geo-Replication” there will be an extra charge – it’s probably not necessary for a demo/test account as you have a limited amount of credit in the trial subscription. In the portal choose the STORAGE icon on the left. Then click on +NEW at the bottom. That opens a QUICK CREATE window. Enter a unique name for your storage, such as sqlcatwomanrules. It only allows lower case letters and numbers.


Now click on the HDInsight icon just below the storage icon. Choose QUICK CREATE. Enter a unique name for your HDInsight cluster. For a demo choose 4 data nodes. Enter a password that contains upper and lower case letters, a number, and a special character. Choose the storage account you created above. Once you click on “CREATE HDINSIGHT CLUSTER” it will take several minutes for the cluster to be deployed.


Once it completes you are ready to use your cluster!


If you won’t be using the cluster right away, go ahead and delete it (look for the icon at the bottom of the portal) to save compute time and money. You can easily recreate it when you need it.


Look for more blogs soon on customizing your cluster with CUSTOM CREATE or PowerShell and on automating deployment and jobs with PowerShell. In the meantime see if you can get Invoke-Hive working from PowerShell for some simple Hive commands such as:

Invoke-Hive "select * from hivesampletable limit 10"
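If Invoke-Hive complains that no cluster is selected, a minimal sketch (with an assumed cluster name) sets the cluster context first:

# Point the PowerShell session at your cluster, then run the Hive query
Use-AzureHDInsightCluster "YOURHDInsightClusterName"
Invoke-Hive "select * from hivesampletable limit 10"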

Big Data Technical Series:

Your First HDInsight Cluster–Step by Step

Automating HDInsight cluster creation with PowerShell



PowerShell for Azure cmdlets: Subscription was all Wacky

I was working on some HDInsight scripts in PowerShell and doing lots of experimenting. I’m not sure what exactly I did but all of a sudden everything stopped working. With lots of interruptions from meetings, chats, and lunch, I couldn’t retrace my steps. Everything seemed to fail on the Azure subscription information so I tried to get really basic – what did Get-AzureSubscription|%{$_.SubscriptionName} return? As it turns out, wacky garbage:

set-azuresubscription ?
!?
!
help
sqlcatwoman
?
Sqlcatwomanwestus

What I expected to see was my single subscription:

sqlcatwoman

So what happened? The Azure portal only shows one subscription. Obviously those other lines are not valid subscriptions – they look like the output of a help command or an error. Reinstalling the cmdlets, rebooting, and reimporting certificates didn’t help. I turned to my AzureCAT coworkers for help and @elcid98 pointed out this blog post that talks about how subscriptions are used in PowerShell:

Azure Subscriptions in PowerShell demystified http://yossidahan.wordpress.com/2012/12/12/azure-subscriptions-in-powershell-demystified/

This caught my attention: “The second file – DefaultSubscriptionData.xml – also lists the available subscriptions and the associated certificates’ thumbprints“. Ok, where is that file? A search finds it in

C:\Users\%username%\AppData\Roaming\Windows Azure PowerShell

I checked and sure enough, where I would expect just one entry I see multiple – and they’re named the same thing as the garbage in my output! I cleared out all but one entry to end up with this:

<?xml version="1.0" encoding="utf-8"?>
<Subscriptions xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="0" xmlns="urn:Microsoft.WindowsAzure.Management:WaPSCmdlets">
  <Subscription name="sqlcatwoman">
    <SubscriptionId>You don't get to see the real info!</SubscriptionId>
    <Thumbprint>Not here either!</Thumbprint>
    <ServiceEndpoint>https://management.core.windows.net</ServiceEndpoint>
  </Subscription>
</Subscriptions>

Hmmmm…. I still got an error from Get-AzureSubscription. Back to C:\Users\%username%\AppData\Roaming\Windows Azure PowerShell. What’s this? WindowsAzureProfile.xml also has all the same junk! I cleared out all the extras to end up with this:

<?xml version="1.0" encoding="utf-8"?>
<ProfileData xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.datacontract.org/2004/07/Microsoft.WindowsAzure.Commands.Utilities.Common">
  <DefaultEnvironmentName>AzureCloud</DefaultEnvironmentName>
  <Environments />
  <Subscriptions>
    <AzureSubscriptionData>
      <ActiveDirectoryEndpoint>https://login.windows.net/</ActiveDirectoryEndpoint>
      <ActiveDirectoryTenantId>More secrets!</ActiveDirectoryTenantId>
      <ActiveDirectoryUserId>cgross@microsoft.com</ActiveDirectoryUserId>
      <CloudStorageAccount i:nil="true" />
      <IsDefault>true</IsDefault>
      <LoginType i:nil="true" />
      <ManagementCertificate>Hiding this one too!</ManagementCertificate>
      <ManagementEndpoint>https://management.core.windows.net/</ManagementEndpoint>
      <Name>sqlcatwoman</Name>
      <RegisteredResourceProviders xmlns:d4p1="http://schemas.microsoft.com/2003/10/Serialization/Arrays" />
      <SubscriptionId>And more secrets</SubscriptionId>
    </AzureSubscriptionData>
  </Subscriptions>
</ProfileData>

Success! Get-AzureSubscription now returns just my single, valid subscription. All my other Azure cmdlets magically started working again. I don’t know how it got that way, but at least now I know where the subscription information is stored. I hope this helps someone else with their Azure subscription PowerShell scripting!