Data Warehouse Fast Track for SQL Server 2016

Microsoft Data Warehouse Fast Track for SQL Server 2016 is a joint effort between Microsoft and its hardware partners to deliver validated, pre-configured solutions that reduce the complexity of implementing a data warehouse on SQL Server Enterprise Edition.  The Data Warehouse Fast Track program provides flexibility of solutions and customer choice across hardware vendors' technologies, and uses the core capabilities of the Windows Server operating system and SQL Server to deliver a balanced SMP data warehouse with optimized performance.


The reference architectures are tested internally by Microsoft and consist of high-performance hardware and software configurations at various price, performance, and footprint tiers.  Data Warehouse Fast Track for SQL Server brings some great capabilities designed to support a modern data warehouse implementation where data and analytics can truly exist in the same solution, spanning cloud and on-premises.  These reference architectures have been available since SQL Server 2012.

The Data Warehouse Fast Track is not a replacement for APS (Analytics Platform System).  APS is an MPP (Massively Parallel Processing) data warehouse appliance which is designed as a pure data warehouse offering and scales to store and query petabytes of data.  In general, the initial database size at which APS makes sense over the Data Warehouse Fast Track is 150TB.


Data Warehouse Fast Track for SQL Server brings the optimal configuration of hardware and software together into a single packaged offering which is guaranteed to perform.  Balancing time to solution against cost, Data Warehouse Fast Track for SQL Server truly enables success 'out of the box': no arduous sizing or throughput calculations (this has all been done for you), simple purchasing and installation, fast performance and scalability, and total peace of mind.


There are certified reference architectures ranging from 6TB to 145TB across SQL Server 2014 and SQL Server 2016.  There is even a reference architecture that scales to 1.2PB!  To see the partners and their Data Warehouse Fast Track for SQL Server offerings, check out Data Warehouse Fast Track.  Keep in mind there is the HP Superdome X for high-end OLTP/DW that has up to 384 cores, 24TB of memory, and 92TB of disk space that can give you even more performance for an SMP solution.


Microsoft certification changes

A recent Microsoft blog post announced that they are releasing five new Microsoft Certified Solutions Expert (MCSE) and Developer (MCSD) specialties.  These credentials are aligned to Centers of Excellence, used by the Microsoft Partner Network to identify technical competencies that are widely recognizable by both Microsoft partners and customers.  All of these changes are being made without adding additional certification exams.


(each white circle in the image represents a single exam that needs to be taken)

The five new expert certifications are:

To earn each of these credentials, you must first earn a qualifying Microsoft Certified Solutions Associate (MCSA) certification and then pass a single additional exam from a list of electives associated with the corresponding Center of Excellence.  Click on the five links above to see the MCSA requirements and the electives.

The resulting MCSE or MCSD certification will be added to your transcript and will never expire.  Instead, the achievement date will signify your investment in continuing education on the technology.  Every year, you will have the opportunity to re-earn the certification by passing an additional exam from the list of electives, demonstrating your investment in broadening or deepening your skills in a given Center of Excellence.  Each time you earn the certification, a new certification entry will be added to your transcript.  This process will replace the existing recertification requirement of taking a specific recertification exam every 2 years (MCSD) or 3 years (MCSE) in order to prevent your certification from going inactive.

So instead of publishing “upgrade” exams that smash topics from multiple exams to basically test you on what’s changed since two or three years ago, you will have the choice of which additional elective exam you wish to take.  This new renewal method allows you to renew your certification while both staying current and learning something new.

Note that you can earn the corresponding new MCSE or MCSD certifications for 2016 without having to take any additional exams.  I found out about the changes when I received an email from Microsoft saying I had two new MCSE's: my "MCSE: Data Platform" and "MCSE: Business Intelligence" became "MCSE: Data Management and Analytics".  Also, passing "70-473 Designing and Implementing Cloud Data Platform Solutions", "70-475 Designing and Implementing Big Data Analytics Solutions", and "70-534 Architecting Microsoft Azure Solutions" qualified me for "MCSE: Cloud Platform and Infrastructure".

Another change: The three Azure certification exams (70-532, 70-533 and 70-534) used to earn you the full MCSD: Azure Solutions Architect certification.  However, this MCSD has gone away.  The three Azure certification exams are being integrated into the brand new MCSE and MCSD tracks “MCSD: App Builder” and “MCSE: Cloud Platform and Infrastructure”.

More info:

Microsoft Certification Changes and Goodbye to MCSD Azure Solutions Architect

Microsoft streamlines MCSE and MCSD certifications, eliminates requirement to retake exams

Microsoft makes massive changes to MCSE and MCSD


Cortana Intelligence Solutions

Cortana Intelligence Solutions is a new tool just released in public preview that enables users to rapidly discover, easily provision, quickly experiment with, and jumpstart production-grade analytical solutions using the Cortana Intelligence Suite (CIS).  It does so using preconfigured solutions, reference architectures and design patterns (I'll just call all these solutions "patterns" for short).  At the heart of each Cortana Intelligence Solution pattern is one or more ARM templates which describe the Azure resources to be provisioned in the user's Azure subscription.  Cortana Intelligence Solution patterns can be complex, with multiple ARM templates interspersed with custom tasks (Web Jobs) and/or manual steps (such as Power BI authorization in Stream Analytics job outputs).

So instead of having to manually go to the Azure web portal and provision many resources, these patterns will do it for you automatically.  Think of a pattern as a way to accelerate the process of building an end-to-end demo on top of CIS.  A deployed solution will provision your subscription with the necessary CIS components (i.e. Event Hub, Stream Analytics, HDInsight, Data Factory, Machine Learning, etc.) and build the relationships between them.

I’ll now go through the process of deploying a solution/pattern.

When you go to the main page of Cortana Intelligence Solutions, you can click on "Deployments" to see the deployments you already created, or you can click on "Solutions Gallery" or the "Get Started Now" button.  You will then be taken to the Cortana Intelligence Gallery (which is not new, but the "Solutions" link in the gallery is) and will be presented with four patterns to choose from (many more will be available soon).  I will now show you screenshots of what you will see when you choose a pattern:

I’ll choose the “Predictive Maintenance for Aerospace” pattern:


When I click on it I’ll then see a summary of the solution, including the estimated provisioning time:


It includes technical details and workflow:


Included is a nice solution diagram:


It also shows the services that will be used:


Then I hit the “Deploy” button and see a screen to fill out:


Once I hit the “Create” button I get to see the status of the deployment:


Clicking on the little “i” next to a running step gives me more details:


When this step finished I was sent an email with a link to the Azure ML experiment it created:


When the deployment finished it displayed some post-deployment instructions and info:


At the bottom of the instructions was a link to source code and a very detailed Technical Guide I could look at:


You can always view the post-deployment instructions later by clicking “Deployments” on the main page of the Cortana Intelligence Solution and clicking on the deployment name.

How do you delete your solution?  Make sure to delete the solution if you are not using it, to save costs.  Deleting your solution will delete all the components provisioned in your subscription when you deployed the solution.  To delete the solution, click on your solution name in the left panel of the solution template and click on delete.

Cortana Intelligence Solutions offer an improvement over Azure Quickstart Templates: each Azure Quickstart Template is a single ARM template.  A Cortana Intelligence Solution, on the other hand, consists of one or more ARM templates interspersed with custom "tasks".  This enables complex flows that involve creating, configuring and hydrating Azure resources in ways that are not possible through an ARM template alone.

Another advantage is some Cortana Intelligence Solutions have a “Try with your data” experience.  This allows a user to play with the solution without having to deploy it.  An example of this is the IT Anomaly Insights solution that actually uses an Anomaly Detection machine learning API in the back end.

Cortana Intelligence Solutions are similar in concept to Azure IoT Suite preconfigured solutions but have a much broader focus than just IoT and use more products.

I see Cortana Intelligence Solutions as not only a great time saver, but a way to use the proper reference architecture for the solutions you are looking to build.  It will make sure you are using the proper technologies and tools for your project so it will be a success.

More info:

Insanely Practical Patterns to Jump Start Your Analytics Solutions (video)

Drive transformative change with advanced analytics in Cortana Intelligence Suite and Microsoft R (video)

Dive into Predictive Maintenance using Cortana Intelligence Suite (video)


Making sense of Microsoft technology

In my role as a Data Platform Solution Architect (DPSA) at Microsoft, part of my responsibility is to keep up with all the Microsoft on-prem and cloud data-related technology and trends, as well as non-Microsoft technology and trends in areas such as Hadoop and NoSQL.  I work with Microsoft clients by first understanding their current data-related architectures and then educating them on which technologies and products they should consider in order to update their current architectures or to build new solutions.  There is a lot of knowledge transfer as most clients are so busy keeping what they have running that they are not aware of many of the products Microsoft has and how they all work together (I often say “they don’t know what they don’t know”).  I like to think of it as I help them put all the pieces of the puzzle together.  And as I mentioned in my previous blog, I try to show the clients The art of possible with the cloud.

It is a daunting task keeping up with all the technology as it changes so often.  Even though I spend half my days learning, I can barely keep my head above water, and that is with me just focusing on data-related products and not all the other Azure products such as networking, web and mobile app services, media services, etc. (we have "cloud solution architects" that cover those products).  To narrow down the technologies a client should consider, I will learn about their environment and ask a bunch of questions.  To help readers of my blog learn about the Microsoft technologies and which ones might be a good fit, I wanted to list a few documents and blog posts:

Azure Quick Start Guide by me.  This is a short overview with helpful links to most of the Azure data platform and analytics products

Microsoft BI and IM Design Guidance by Rod Colledge (Data Platform Solution Architect at Microsoft).  This document contains a detailed description of the data platform and analytics products for Azure and on-prem and includes example architectures.  This is an excellent document that will give you a true understanding of many of the Microsoft products and when best to use each

Ivan Kosyakov (Data Platform Technical Architect at Microsoft) blog: Decision Tree for Big Data solutions and Decision Tree for Machine Learning.  Also check out his glossary.  These are great blogs to help you narrow down which products to use based on your use case

Azure Info Hub: An excellent list of all the Azure products that is updated frequently.  Includes a short description of each product and the latest news, along with training videos, e-books, whitepapers, tools, and even StackOverflow discussions

Here are other useful blogs and presentations of mine:


Azure SQL Database vs SQL Data Warehouse

Relational databases vs Non-relational databases

Why use a data lake?


Should I move my database to the cloud?

How does Microsoft solve Big Data?


The art of possible with the cloud

One thing I try to do in my role with Microsoft is to get clients to think of possible use cases for building solutions in the Azure cloud.  To set off light bulbs in their heads.  Sometimes this is accomplished by showing them demos of existing solutions.  It's that old saying "they don't know what they don't know", so I try to open up their minds to ideas they never thought of.  There are so many technologies and tools that it is easy to miss some of them, so I thought I would list the most popular ones for building solutions in Azure:

  • Power BI: The most well-known tool for building reports and dashboards, but what is not so well-known is that there are "content packs" that allow you to connect to hundreds of services such as Salesforce, Microsoft Dynamics, and Google Analytics and automatically create reports and dashboards in seconds.  See Connect to services with content packs for Power BI (65 content packs as of 9/15/16).  Also, to give you ideas on the types of reports you can build in Power BI, check out these samples broken out by department or by industry.  In addition, there is a gallery of custom visuals built by the Power BI community that you can use.  You can keep up with the Power BI updates via these links: monthly Power BI Desktop updates, monthly mobile apps for Power BI, near-weekly Power BI Service updates, and Power BI Desktop data sources (63 data sources as of 9/15/16).  Finally, check out narratives for Power BI (a product extension that will tell a story about your data) and Power BI Solution Templates (set up end-to-end, enterprise-ready solutions for common BI problems in minutes)
  • Azure Machine Learning (Azure ML): A very easy way for designing, developing, testing and deploying predictive models.  What exactly does that mean?  Check out these use cases to get a better idea
  • Cognitive Services: A way to add vision, speech, language, knowledge, and search capabilities to your applications using intelligence APIs and SDKs.  How cool is that!  There are a ton of APIs that you can check out here
  • HoloLens: Augmented reality that enables you to interact with high‑definition holograms in your world.  In short, HoloLens is a smart-glasses headset in which the live presentation of physical real-world elements is incorporated with that of virtual elements (“holograms”) such that they are perceived to exist together in a shared environment.  We are all going to want to build solutions with this!  Here is one video and another to show you the art of possible.  Then check out some use cases.  Then get blown away by checking out holoportation
  • Microsoft Bots: Bots are bits of software that use artificial intelligence to converse in human terms.  Imagine a bot in Skype that performs a variety of tasks, like adding items to your calendar, booking travel or hotel rooms, or even pre-populating conversations to friends with text.  They are easy to build with the Microsoft Bot Framework, and you can even chat with Spock
  • Cortana Intelligence Gallery: The Cortana Intelligence Suite (CIS) is a set of tools that allow you to build end-to-end “big data” solutions.  The Cortana Intelligence Gallery is a community of developers and data scientists, including Microsoft, who have shared their analytics solutions built on CIS.  It’s a great way to see the art of possible, as well as a way to quickly build a solution by using an existing solution in the gallery as a starting point.  The gallery is categorized by Solutions, Experiments, Machine Learning APIs, Notebooks, Competitions, Tutorials, and Collections.  You can also browse by industry
  • Azure IoT Suite preconfigured solutions: IoT is the internetworking of physical devices, vehicles, buildings and other items—embedded with electronics, software, sensors, actuators, and network connectivity that enable these objects to collect and exchange data (see case studies).  The Azure IoT Suite preconfigured solutions are implementations of common IoT solution patterns that you can deploy to Azure.  You can use the preconfigured solutions as a starting point for your own IoT solutions or to learn about common patterns in IoT solution design and development
  • Microsoft Open Source:  Who would have ever thought Microsoft would embrace open source?  Well they do, big time!  You can build solutions in Azure using Linux as part of the Open Cloud.  There are over a thousand pre-configured software images in the Virtual Machines Marketplace and hundreds of Azure Quickstart templates that you can use to create a VM, many of which include open source software.  In a matter of minutes you can be building solutions such as a data lake in Hadoop, or combine open source software with Microsoft technologies to create a modern data warehouse

SQL Server 2016 real-time operational analytics

SQL Server 2016 introduces a very cool new feature called real-time operational analytics, which is the ability to run both analytics (OLAP) and OLTP workloads on the same database tables at the same time.  This allows you to eliminate the need for ETL and a data warehouse in some cases (using one system for OLAP and OLTP instead of creating two separate systems).  This will help to reduce complexity, cost, and data latency.

Real-time operational analytics targets the scenario of a single data source such as an enterprise resource planning (ERP) application on which you can run both the operational and the analytics workload.  This does not replace the need for a separate data warehouse when you need to integrate data from multiple sources before running the analytics workload or when you require extreme analytics performance using pre-aggregated data such as cubes.

Real-time operational analytics uses an updatable nonclustered columnstore index (NCCI).  The columnstore index maintains a copy of the data, so the OLTP and OLAP workloads run against separate copies of the data.  This minimizes the performance impact of both workloads running at the same time.  SQL Server automatically maintains index changes so that OLTP changes are always up-to-date for analytics.  This makes it possible and practical to run analytics in real-time on up-to-date data. This works for both disk-based and memory-optimized tables.

To accomplish this, all you need to do is create an NCCI on one or more tables that are needed for analytics.  The SQL Server query optimizer automatically chooses the NCCI for analytics queries while your OLTP workload continues to run using the same B-tree indexes as before.
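For example, here is a minimal sketch of what that looks like (the table and column names are hypothetical):

```sql
-- A hypothetical OLTP table; the OLTP workload keeps using the B-tree indexes.
CREATE TABLE dbo.SalesOrders (
    OrderId     int IDENTITY PRIMARY KEY,
    CustomerId  int NOT NULL,
    OrderDate   date NOT NULL,
    Amount      money NOT NULL
);

-- Adding the NCCI is the only step needed to enable real-time operational
-- analytics; the optimizer will route analytics queries to it automatically.
CREATE NONCLUSTERED COLUMNSTORE INDEX NCCI_SalesOrders
    ON dbo.SalesOrders (CustomerId, OrderDate, Amount);
```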


The analytics query performance with real-time operational analytics will not be as fast as you can get with a dedicated data warehouse, but the key benefit is the ability to do analytics in real-time.  Some businesses may choose to do real-time operational analytics while still maintaining a dedicated data warehouse for extreme analytics as well as incorporating data from other sources.

More info:

Get started with Columnstore for real time operational analytics

Real-Time Operational Analytics: DML operations and nonclustered columnstore index (NCCI) in SQL Server 2016

Real-Time Operational Analytics – Overview nonclustered columnstore index (NCCI)

Real-Time Operational Analytics Using In-Memory Technology

SQL Server 2016 Operational Analytics (video)

Real Time Operational Analytics in SQL Server 2016 (video)

Real-time Operational Analytics in SQL Server 2016 – Part 1


Azure SQL Database new performance level

A new performance level for Azure SQL Database was recently announced, called P15.  This new offering is more than twice as powerful as the next best offering, P11.  P15 offers 4,000 database transaction units (DTUs) where P11 offered 1,750 DTUs.  Also, the max concurrent workers and concurrent logins increased from 2,400 to 6,400; the max concurrent sessions stayed the same at 32,000; and the max In-Memory OLTP storage increased from 14GB to 32GB.
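If you want to move an existing database to the new level, here is a minimal sketch (the database name is hypothetical):

```sql
-- Scale an existing Azure SQL Database up to the P15 performance level.
ALTER DATABASE [MyAppDb] MODIFY (SERVICE_OBJECTIVE = 'P15');

-- Check the current service objective after the scale operation completes.
SELECT DATABASEPROPERTYEX('MyAppDb', 'ServiceObjective');
```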

More info:

Azure SQL Database new premium performance level P15 generally available


Azure SQL Database vs SQL Data Warehouse

I am sometimes asked to compare Azure SQL Database (SQL DB) to Azure SQL Data Warehouse (SQL DW).  The most important thing to remember is that SQL DB is for OLTP (i.e. applications with individual updates, inserts, and deletes) while SQL DW is strictly for OLAP (i.e. data warehouses).  So if you're going to build an OLTP solution, you would choose SQL DB.  However, both products can be used for building a data warehouse (OLAP).  With that in mind, here is a list of the differences:

I have other blogs that cover SQL DB and SQL DW.


What is the Lambda Architecture?

Lambda architecture is a data-processing architecture designed to handle massive quantities of data (i.e. "Big Data") by using both batch-processing and stream-processing methods.  The idea is to balance latency, throughput, scaling, and fault-tolerance by using batch processing to provide comprehensive and accurate views of batch data, while simultaneously using real-time stream processing to provide views of online data.  The two view outputs may be joined before presentation.

This allows for a way to bridge the gap between the historical single version of the truth and the highly sought-after "I want it now" real-time solution.  By combining traditional batch processing systems with stream consumption tools, the needs of both can be met with one solution.

The high-level overview of the Lambda architecture is expressed here:


A brief explanation of each layer:

Data Consumption: This is where you will import the data from all the various source systems, some of which may be streaming the data.  Others may only provide data once a day.

Stream Layer: It provides for incremental updating, making it the more complex layer.  It trades accuracy for low latency, looking at only recent data.  Data in here may be only seconds behind, but the trade-off is the data may not be clean.

Batch Layer: It looks at all the data at once and eventually corrects the data in the stream layer.  It is the single version of the truth, the trusted layer, where there is usually lots of ETL and a traditional data warehouse.  This layer is built using a predefined schedule, usually once or twice a day, including importing the data currently stored in the stream layer.

Presentation Layer: Think of it as the mediator, as it accepts queries and decides when to use the batch layer and when to use the stream layer.  Its preference would be the batch layer as that has the trusted data, but if you ask it for up-to-the-second data, it will pull from the stream layer.  So it's a balance of retrieving what we trust versus what we want right now.
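To make the mediator idea concrete, here is a hedged T-SQL sketch of a presentation-layer query (all table names are hypothetical): trust the batch layer up to its last load, and fill in anything newer from the stream layer:

```sql
-- Hypothetical audit table recording when the batch layer last loaded.
DECLARE @LastBatchLoad datetime2 =
    (SELECT MAX(LoadTime) FROM dbo.BatchLoadLog);

SELECT SensorId, Reading, ReadingTime
FROM dbo.BatchReadings                 -- batch layer: trusted, scheduled loads
WHERE ReadingTime <= @LastBatchLoad
UNION ALL
SELECT SensorId, Reading, ReadingTime
FROM dbo.StreamReadings                -- stream layer: recent, may be unclean
WHERE ReadingTime > @LastBatchLoad;
```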

A lambda architecture solution using Azure tools might look like this, using a vehicle with IoT sensors as an example:


In the above diagram, Event Hubs is used to ingest millions of events in real-time.  Stream Analytics is used for 1) real-time aggregations on data and 2) spooling data into long-term storage (SQL Data Warehouse) for batch processing.  Machine Learning is used in real-time for anomaly detection on tire pressure, oil level, engine temp, etc., to predict vehicles requiring maintenance.  The data in Azure Data Lake Storage is used for rich analytics using HDInsight and Machine Learning, orchestrated by Azure Data Factory (e.g. aggressive driving analysis over the past year).  Power BI and Cortana are used for the presentation layer, and Azure Data Catalog is the metadata repository for all the data sets.
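Stream Analytics jobs are written in a SQL-like language, so the real-time aggregation step above might look something like this sketch (the input/output aliases and field names are hypothetical):

```sql
-- Average tire pressure per vehicle over 60-second tumbling windows,
-- written to SQL Data Warehouse for the batch layer.
SELECT
    VehicleId,
    AVG(TirePressure) AS AvgTirePressure,
    System.Timestamp AS WindowEnd
INTO [sqldw-output]
FROM [eventhub-input] TIMESTAMP BY EventTime
GROUP BY VehicleId, TumblingWindow(second, 60)
```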

Using Hadoop technologies might provide a solution that looks like this:


Be aware this is a complicated architecture.  It will need a number of hardware resources and different code bases for each layer, with each possibly using different technologies/tools.  The complexity of the code can be 3-4 times that of a traditional data warehouse architecture.  So you will have to weigh the costs versus the benefit of being able to use data slightly newer than a standard data warehouse solution.

More info:

The Lambda architecture: principles for architecting realtime Big Data systems

How to beat the CAP theorem

Questioning the Lambda Architecture

Lambda Architecture: Low Latency Data in a Batch Processing World

Lambda Architecture for the DWH

Lambda Architecture: Design Simpler, Resilient, Maintainable and Scalable Big Data Solutions

The Lambda Architecture and Big Data Quality


Multi-tenant databases in the cloud

For companies that sell an on-prem software solution and are looking to move that solution to the cloud, a challenge arises in how to architect that solution in the cloud.  For example, say you have a software solution that stores patient data for hospitals.  You sign up hospitals, install the hardware and software and the associated databases on-prem (at the hospital or a co-location facility), and load their patient data.  Think of each hospital as a "tenant".  Now you want to move this solution to the cloud and get the many benefits that come with it, the biggest being the time to get a hospital up and running, which can go from months on-prem to hours in the cloud.  Now you have some choices: keep each hospital separate with its own VMs and databases ("single tenant"), or combine the data for all hospitals into one database ("multi-tenant").  As another example, you may simply be creating a PaaS application similar to Salesforce.  Here I'll describe the various cloud strategies using Azure SQL Database, which is for OLTP applications, and Azure SQL Data Warehouse, which is for OLAP applications (see Azure SQL Database vs SQL Data Warehouse):

Separate Servers\VMs

You create VMs for each tenant, essentially doing a "lift and shift" of the current on-premises solution.  This provides the best isolation possible and it's regularly done on-premises, but it's also the option that doesn't cut costs, since each tenant has its own server, SQL Server license, and so on.  Sometimes this is the only allowable option if your client contract states that their data will be virtual machine-isolated from other clients.  Some cons: table updates must be replicated across all the servers (i.e. updating reference tables), there is no resource sharing, and you need multiple backup strategies across all the servers.

Separate Databases

A new database is created and assigned when a tenant is provisioned.  You can land a number of databases on each VM (i.e. each VM handles ten tenants), or create a database using Azure SQL Database.  This is often used when you need to provide isolation for each customer, because you can associate different logins, permissions, and so on with each database.  If using Azure SQL Database, be aware the database size limit is 1TB.  If you have a client database that will exceed that, you can use sharding (via Elastic Database Tools) or use cross-database queries (see Scaling Azure SQL Database and Cross-database queries in Azure SQL Database) with row-level security (see Multi-tenant applications with elastic database tools and row-level security).  The lowest service tier for SQL Database has a max database size of 2GB, so you might be paying for storage that you don't really use.  If using Azure SQL Data Warehouse, you have no limit on database size.  Some other cons: a different connection pool is required per database, updates must be replicated across all the databases, there is no resource sharing (unless using Elastic Database Pools), and you need multiple backup strategies across all the databases.
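To give you an idea of the cross-database query option, here is a hedged elastic query sketch (the server, database, credential, and table names are all hypothetical, and it assumes a database master key and database scoped credential already exist):

```sql
-- Point at the remote tenant database.
CREATE EXTERNAL DATA SOURCE Tenant2Db WITH (
    TYPE = RDBMS,
    LOCATION = 'myserver.database.windows.net',
    DATABASE_NAME = 'Tenant2',
    CREDENTIAL = ElasticQueryCred
);

-- Map a local external table onto the remote table.
CREATE EXTERNAL TABLE dbo.Patients_Tenant2 (
    PatientId int,
    Name      nvarchar(100)
) WITH (DATA_SOURCE = Tenant2Db, SCHEMA_NAME = 'dbo', OBJECT_NAME = 'Patients');

-- Query across both tenant databases from one place.
SELECT 'Tenant1' AS Tenant, COUNT(*) AS PatientCount FROM dbo.Patients
UNION ALL
SELECT 'Tenant2', COUNT(*) FROM dbo.Patients_Tenant2;
```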

Separate Schemas

This is also a very good way to achieve multi-tenancy while sharing some resources, since everything is inside the same database but each tenant has its own schema.  That allows you to even customize a specific tenant without affecting others.  And you save costs by only paying for one database (which can fit on SQL Data Warehouse no matter what the size) or a handful of databases if using SQL Database (i.e. ten tenants per database).  Some of the cons: you need to replicate all the database objects in every schema, so the number of objects can increase indefinitely; updates must be replicated across all the schemas; the connection pool for the database must maintain a different connection per tenant (or set of credentials); a different user is required per tenant (which is stored at the server level); and you have to back up that user independently.
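A minimal sketch of this approach (the schema, table, and user names are hypothetical); each tenant gets its own schema plus a user whose default schema points at it:

```sql
-- One schema per tenant; the same objects are repeated in each schema.
CREATE SCHEMA Hospital042;
GO
CREATE TABLE Hospital042.Patients (
    PatientId int IDENTITY PRIMARY KEY,
    Name      nvarchar(100) NOT NULL
);
GO
-- A contained database user that resolves unqualified names to its own schema.
CREATE USER Hospital042User WITH PASSWORD = 'S7rong!Passw0rd',
    DEFAULT_SCHEMA = Hospital042;
GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::Hospital042 TO Hospital042User;
```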

A variation of this using SQL Database is to split the tenants over multiple databases, but not to use separate schemas, for performance reasons.  This is done by assigning a distinct set of tenants to each database using a partitioning strategy such as hash, range or list partitioning.  This data distribution strategy is oftentimes referred to as sharding.
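Here is a hedged sketch of a hash partitioning strategy (all table and database names are hypothetical); the application looks up a tenant's database in the map before opening a connection:

```sql
-- Map each tenant to one of four tenant databases.
CREATE TABLE dbo.TenantShardMap (
    TenantId  int PRIMARY KEY,
    ShardName sysname NOT NULL   -- e.g. 'TenantDb01' through 'TenantDb04'
);

-- Hash partitioning: derive the shard from the tenant id.
INSERT INTO dbo.TenantShardMap (TenantId, ShardName)
SELECT TenantId,
       CONCAT('TenantDb0', (ABS(CHECKSUM(TenantId)) % 4) + 1)
FROM dbo.Tenants;
```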

Row Isolation

Everything is shared in this option, server, database and even schema.  All the data for the tenants are within the same tables in one database.  The only way they are differentiated is based on a TenantId or some other column that exists on the table level.  Another big benefit is code changes: with this option you only have one spot to change code (i.e. table structure).  With the other options you will have to roll out code changes to many spots.  You will need to use row-level security or something similar when you need to limit the results to an individual tenant.  Or you can create views or use stored procedures to filter tenants.  You also have the benefit of ease-of-use and performance when you need to aggregate results over multiple tenants.  Azure SQL Data Warehouse is a great solution for this, as there is no limit to the database size.
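Here is a hedged sketch of row-level security for this model (the schema, table, and function names are hypothetical); the filter predicate compares each row's TenantId to a value the application stores in SESSION_CONTEXT:

```sql
CREATE SCHEMA Security;
GO
-- Inline table-valued function used as the filter predicate.
CREATE FUNCTION Security.fn_TenantPredicate (@TenantId int)
RETURNS TABLE
WITH SCHEMABINDING
AS RETURN
    SELECT 1 AS AllowRow
    WHERE @TenantId = CAST(SESSION_CONTEXT(N'TenantId') AS int);
GO
-- Every query against dbo.Patients is now silently filtered to one tenant.
CREATE SECURITY POLICY Security.TenantFilter
    ADD FILTER PREDICATE Security.fn_TenantPredicate(TenantId) ON dbo.Patients
    WITH (STATE = ON);
GO
-- The application sets the tenant for its connection:
EXEC sp_set_session_context @key = N'TenantId', @value = 42;
```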

But be aware that there is a limit of 32 concurrent queries and 1,024 concurrent connections, so if you have thousands of users who will be hitting the database at the same time, you may want to create data marts in Azure SQL Database or create SSAS cubes.  This limit was imposed because there is no resource governor or CPU query scheduler like there is in SQL Server.  But the benefit is each query gets its own resources and it won't affect other queries (i.e. you don't have to worry about a query taking all resources and blocking everyone else).  There are also resource classes that allow more memory and CPU cycles to be allocated to queries run by a given user so they run faster, with the trade-off that fewer concurrent queries can run.
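Resource classes are assigned by adding a user to one of the built-in database roles; a minimal sketch (the user name is hypothetical):

```sql
-- Give a reporting user a bigger memory grant per query (largerc), at the
-- cost of lowering how many queries can run concurrently.
EXEC sp_addrolemember 'largerc', 'ReportUser';

-- Put the user back on the default resource class later.
EXEC sp_droprolemember 'largerc', 'ReportUser';
```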

A great article that discusses the various multi-tenant models in detail and how multi-tenancy is supported with Azure SQL Database is Design Patterns for Multi-tenant SaaS Applications with Azure SQL Database.

As you can see, there are lots of options to consider!  It becomes a balance of cost, performance, ease-of-development, ease-of-use, and security.

More info:

Tips & Tricks to Build Multi-Tenant Databases with SQL Databases

Multitenancy in SQL Azure

Choosing a Multi-Tenant Data Architecture

Multi-Tenant Data Architecture

Multi-Tenant Data Isolation in SQL Azure

Multi Tenancy and Windows Azure
