Hey Amazon, can Redshift do this?

We know the kinder, gentler, “we play nice with everyone” Microsoft would never egg AWS boss Andy Jassy on like this, but here's the deal: Today at Build 2015, Microsoft executive vice president for cloud and enterprise Scott Guthrie made a whole slew of “elastic” announcements — some involving technologies that will make data scientists pretty happy.

Most notable, from our point of view, is the Azure SQL Data Warehouse, an elastic Data Warehouse-as-a-Service with enterprise-class features.

"Unlike Redshift, you can independently adjust the amount of compute and storage you use in a SQL data warehouse. This allows you to... adjust your data warehouse in seconds... allowing you to increase and decrease pretty much at will," said Guthrie.

A First for Microsoft

It’s Microsoft’s first cloud data warehouse that can dynamically grow or shrink, enabling customers to pay only for the query performance they need, when they need it, at petabyte scale.

Its users won’t experience the pains found on other services such as Redshift, Amazon’s existing petabyte-scale data warehouse, according to Microsoft corporate vice president T.K. "Ranga" Rengarajan. There will be no need to pause the service as you grow your data warehouse on demand.
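The preview isn’t out yet, so the exact commands are anybody’s guess, but if the scaling model follows the ALTER DATABASE conventions SQL Server developers already know, resizing compute independently of storage could look something like the rough T-SQL sketch below. The database name and the 'DW'-style performance levels are our placeholders, not published syntax.

    -- Hypothetical sketch: this syntax, the 'DW' performance levels and the database
    -- name are placeholders, not published Azure SQL Data Warehouse commands.
    -- The idea: dial compute up for a heavy reporting window, then dial it back,
    -- without moving or resizing the underlying storage.
    ALTER DATABASE MyWarehouse MODIFY (SERVICE_OBJECTIVE = 'DW400'); -- scale compute up

    -- ... run the demanding queries ...

    ALTER DATABASE MyWarehouse MODIFY (SERVICE_OBJECTIVE = 'DW100'); -- scale compute back down

The appeal, per Guthrie, is that the storage bill stays put while compute flexes with demand.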

But that’s not all. With this announcement Microsoft is the only company to offer both on-premises and cloud data warehouses at scale.

“We are the innovators. We are unique in the world,” said Rengarajan during a pre-conference interview with CMSWire.

Azure SQL Data Warehouse is based on SQL Server’s massively parallel processing architecture and integrates with existing data tools, including Power BI for data visualization, Azure Machine Learning for advanced analytics, Azure Data Factory for event orchestration and Azure HDInsight, Microsoft’s managed big data service built on 100 percent Apache Hadoop.

The on-premises offering, the Microsoft Analytics Platform System (APS), which combines SQL Server Parallel Data Warehouse with HDInsight for Hadoop, was rated a Leader in Gartner’s most recent Magic Quadrant for Data Warehouse and Data Management Solutions for Analytics.

Data, including data streaming in from the Internet of Things, can now be turned into insights in seconds or minutes rather than hours or days.

“This is a time of great possibilities,” said Rengarajan. “Businesses can now take data and make a dramatic difference in the world.”

The preview for Azure SQL Data Warehouse will be available later this year.

Elastic Capabilities

But that’s hardly the only announcement Guthrie made from the stage. Microsoft’s Azure SQL Database, its database-as-a-service, will now have elastic capabilities. The benefit? Customers will no longer need to over-provision to prepare for peak demand. Instead, cloud ISVs and developers can use an elastic database pool to let hundreds, or even thousands, of databases share a single pool of resources.
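To make the pooling idea concrete, here is a rough sketch of what assigning tenant databases to a shared pool might look like in T-SQL. The pool-assignment syntax is our assumption, modeled on how Azure SQL Database expresses service objectives, and the database and pool names are invented.

    -- Hypothetical sketch: the ELASTIC_POOL assignment syntax is assumed, and the
    -- database and pool names are invented. The point is that many tenant databases
    -- draw from one shared pool of resources instead of each being sized for its peak.
    ALTER DATABASE CustomerDb001 MODIFY (SERVICE_OBJECTIVE = ELASTIC_POOL(name = SaasAppPool));
    ALTER DATABASE CustomerDb002 MODIFY (SERVICE_OBJECTIVE = ELASTIC_POOL(name = SaasAppPool));
    -- ...and so on, for hundreds or thousands of databases sharing SaasAppPool.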


“This will help customers work more efficiently with respect to (financial) resources,” said Rengarajan. The purchase model allows for control over price and performance against a group of databases. Key features include:

  1. Elastic database tools simplify building and managing applications that scale, so that building against one database, or thousands, is just as easy using familiar T-SQL and ADO.NET models.
  2. The client library and tools help speed time-to-market, and customers can run centralized query operations, such as reporting and data extraction, with unified results across a set of databases.
  3. Enhanced data security: Transparent Data Encryption provides real-time encryption and decryption of data and logs, and column-level security allows individual columns within a table to be encrypted.
  4. Support for full-text search within databases (a T-SQL sketch of features 3 and 4 follows this list).
  5. Easier migration with Service Tier Advisor, which helps customers identify the Azure SQL Database service tier that best fits their needs.
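Two of those features map onto T-SQL that SQL Server developers will recognize. The sketch below uses existing SQL Server syntax with invented database, table, index and catalog names; exact syntax and availability in Azure SQL Database may differ.

    -- Feature 3: Transparent Data Encryption -- encrypt a database and its logs at rest.
    -- (Existing SQL Server syntax; 'MyAppDb' is an invented name. On-premises SQL Server
    -- also requires creating a database encryption key first; Azure manages keys for you.)
    ALTER DATABASE MyAppDb SET ENCRYPTION ON;

    -- Feature 4: Full-text search -- index a text column, then query it with CONTAINS.
    -- (Existing SQL Server syntax; table, key index and catalog names are invented.)
    CREATE FULLTEXT CATALOG AppCatalog AS DEFAULT;
    CREATE FULLTEXT INDEX ON dbo.Articles(Body) KEY INDEX PK_Articles;

    SELECT Title
    FROM dbo.Articles
    WHERE CONTAINS(Body, 'elastic');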

These new capabilities are generally available to Microsoft customers today via the Azure Portal.

Microsoft’s Azure Data Lake

Microsoft went on to announce a new data lake storage service, which Guthrie described as a single place to gather data together, with self-scaling storage resources and enterprise-grade security, accessible through the Hadoop HDFS file system.

“Azure is the best place for your data,” said Rengarajan. His words sound less like a marketing pitch and more like a result he came away with after months of analysis. “You can now create a store of petabyte size, for every type of data in its native format.”

The data lake provides high throughput for massively parallel querying, unlimited storage and no capacity restrictions on individual files, and it works with Hortonworks, Cloudera and any other Hadoop distro, for that matter.

A New Era?

Microsoft’s announcements today will usher in a new breed of data use cases, according to Rengarajan.

If you buy Microsoft's pitch, businesses will be able to leverage data to create better relationships with customers and to build empathy and loyalty. They’ll be in a position to fashion new experiences in an IoT world well beyond tablets: a world filled with a new breed of watches, things that beep, HoloLenses and effects we have yet to imagine.

But the wins here at Build go beyond user experiences, Rengarajan said.

Developers who create them will become what he calls the “cool guys and gals,” because rather than struggling to learn new technologies, they’ll be working with familiar Microsoft tools and creating solutions ahead of their peers.