Azure SQL Server 2016 VM

With Windows Server 2016 just released, now is the perfect time to build an Azure VM with SQL Server 2016 on Windows Server 2016.  In a matter of minutes you can be playing with and learning both platforms.  Below I document the steps I took to build the VM, along with the additional software I installed.  This is a fully loaded VM that I use for demos and to build small projects:

(Software updates as of 11/4/2016)

  1. Go to the Azure Portal, choose “New”, type in “SQL Server 2016”, and choose “SQL Server 2016 RTM Enterprise on Windows Server 2016”.  This will install SQL Server 2016 CU2 (13.0.2164.0)
  2. Follow the prompts to enter the info needed to build the VM.  I kept the default Azure “Resource Manager” (ARM) deployment model, chose the “East US” region, and picked “Standard DS3” for the virtual machine size.  I created a resource group called “SQLServer”, used an existing storage account I called “serrastoragessd” (premium-LRS, located in the East US region), enabled R Services, and created one data disk under “Storage configuration” in the SQL Server settings (create more disks for faster performance – please read Storage configuration for SQL Server VMs)
  3. After about 5 minutes your new VM will be ready.  I then connect to the VM, check for Windows updates, and install them
  4. On the Azure Portal, click on the VM and under “Support + Troubleshooting” you will see “Boot diagnostics”.  This shows you the boot screen of the VM so you can see if it is still performing Windows updates
  5. I then log in with SSMS and, in the server properties, change the server authentication to “SQL Server and Windows Authentication mode”.  I then create a SQL login with the sysadmin server role (see the script sketch after this list)
  6. Install the latest CU if needed: see SQL Server 2016 build numbers
  7. Get the latest SSMS version if needed: see Download SQL Server Management Studio (SSMS).  Latest is 16.5 (13.0.16000.28)
  8. Install Visual Studio Enterprise 2015 with Update 3 (14.0.25424.00 Update 3)
  9. Install SQL Server Data Tools (SSDT) 2015 GA for VS 2015 (14.0.61021.0)
  10. Install Office Professional Plus 2016 (make sure to choose 64-bit version)
  11. Install Visio Professional 2016 (make sure to choose 64-bit version)
  12. Install Chrome (Version 53.0.2785.143)
  13. Install Adobe Reader (Version 2015.020.20039)
  14. Install Java 8 (Update 111)
  15. Install Azure Data Lake Tools for Visual Studio (Version 2.2.21)
  16. Install Power BI Desktop (October update)
  17. Install Microsoft Data Migration Assistant (Version 2.0)
  18. Install Microsoft Database Experimentation Assistant Technical Preview (Version 1.0)
  19. Install Azure Storage Explorer (Version 0.8.5).  I use a Shared Access Signature to connect to my Azure storage
  20. Install Roboform (Version 7.9.22)
  21. Install DocumentDB Studio (Version 0.71)
  22. Install DocumentDB Data Migration Tool (Version 1.7)
  23. Install Data Warehouse Migration Utility Preview (Version 1)
  24. Install Azure SDK for .NET VS 2015 (Version 2.9.5)
  25. Install Microsoft Data Management Gateway (Version 2.4.6137.1)
  26. Install Red Gate Azure Explorer (Version 1.1.0.43)
  27. Install Chrome Postman (Version 4.8.1)
  28. Install Fiddler (Version 4.6.3.44034)
  29. Install Narratives for Power BI
  30. Install ZoomIt (Version 4.5)
  31. Install Microsoft R Open (Version 3.3.1).  Installs RGui
  32. Install RStudio (Version 1.0.44)

  33. Install R Tools for Visual Studio 2015 (RTVS) (Version 0.5)
  34. Download and restore the Wide-World-Importers sample databases (Version 1.0)
  35. Download and restore the AdventureWorks Sample Databases and Scripts for SQL Server 2016 CTP3
  36. Download and restore the Northwind database
  37. Download and restore sample databases from SQLskills
  38. Use Site Recovery to back up your VM daily
  39. When not in use, manually stop your VM, or use an Azure Marketplace solution or a graphical runbook (both use Azure Automation) to save money.  Note you will still be charged for storage if you created a data disk (see Azure Storage Pricing)
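
For step 5, here is a minimal sketch of creating the SQL login from a script rather than through the SSMS dialogs, using Python and pyodbc.  The login name and password are placeholders, and it assumes mixed-mode authentication has already been enabled and the ODBC Driver 13 for SQL Server is present on the VM:

```python
# Minimal sketch (step 5): create a SQL login and add it to the sysadmin server role.
# Assumes mixed-mode authentication is already enabled on the instance and that the
# ODBC Driver 13 for SQL Server is installed; login name and password are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 13 for SQL Server};"
    "SERVER=localhost;DATABASE=master;Trusted_Connection=yes;",
    autocommit=True,
)
cursor = conn.cursor()

# CREATE LOGIN does not accept parameter markers, so the statements are sent as literal T-SQL.
cursor.execute("CREATE LOGIN [demo_admin] WITH PASSWORD = 'ChangeMe!12345';")
cursor.execute("ALTER SERVER ROLE [sysadmin] ADD MEMBER [demo_admin];")

conn.close()
```

The same two statements could of course be run directly in an SSMS query window.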

PASS Summit Announcements: SQL DW free trial

Microsoft usually has some interesting announcements at the PASS Summit, and this year was no exception.  I’m writing a set of blogs covering the major announcements.  Next up is the free trial of Azure SQL Data Warehouse (SQL DW).

Azure SQL Data Warehouse is an enterprise-class, massively parallel processing (MPP) distributed database capable of processing petabyte volumes of both relational and non-relational data.  It is the industry’s first cloud data warehouse with grow, shrink, and pause-in-seconds capabilities and proven SQL functionality, and if you’ve not yet tested it out, you can now do so for free.
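
To make the grow/shrink part concrete, here is a minimal sketch of scaling a SQL DW database to a different DWU level from Python with pyodbc.  The server, database, credentials, and target service objective (“DW200”) are placeholders; the ALTER DATABASE statement is issued against the logical server’s master database:

```python
# Minimal sketch: scale an Azure SQL Data Warehouse database to 200 DWU.
# Server name, database name, and credentials are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 13 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=master;"
    "UID=myadmin;PWD=MyStr0ngP@ss;",
    autocommit=True,  # ALTER DATABASE cannot run inside a user transaction
)
conn.cursor().execute("ALTER DATABASE MySqlDw MODIFY (SERVICE_OBJECTIVE = 'DW200');")
conn.close()
```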

You can use this one-month free trial to do POCs and try out SQL DW with up to 200 DWU and 2TB of data.  You must sign up by December 31st, 2016.  Please note that once the one-month free trial is over, you will start getting billed at general availability pricing rates.  For more information on the free trial, and to sign up, go here.

This is a great promotion: without it you can get a free $200 Azure credit, but you will quickly hit that limit when using SQL DW.

For an excellent 5-part overview of SQL DW, check out Azure SQL Data Warehouse

More info:

One Month Free Trial for Azure Data Warehouse


PASS Summit Announcements: Power BI reports on-prem in SSRS

Microsoft usually has some interesting announcements at the PASS Summit, and this year was no exception.  I’m writing a set of blogs covering the major announcements.  Next up is the Technical Preview of Power BI reports on-prem in SSRS.

The Technical Preview is a pre-configured Virtual Machine in the Azure Marketplace that includes everything you need to get started, even sample reports and data.  With this update, you can visually explore data and create an interactive report using Power BI Desktop, and then publish that report to an on-premises report server (SQL Server Reporting Services).  You can then share the report with your coworkers so they can view and interact with it in their web browsers.

Check out the official announcement.  Get it now in the Azure Marketplace.  Excellent step-by-step tutorials are at Technical Preview of Power BI reports in SQL Server Reporting Services now available and Create Power BI reports in the SQL Server Reporting Services Technical Preview.  Post questions in the Reporting Services forum.  If you would prefer to run this technical preview on an on-premises server, you can provision the virtual machine, download its image as a .vhd file, and run it with Hyper-V (see How to run the Technical Preview of Power BI Reports in SQL Server Reporting Services on-prem using Hyper-V).


Previously you would use Power BI Desktop to build reports and publish them to the Power BI Service in the cloud.  This is a solution for those who do not want to publish their reports to the cloud.

This preview supports Power BI reports that connect “live” to Analysis Services models – both Tabular and Multidimensional (cubes).  Additional data sources will be added in a future preview.  There is a new feature in this version: the ability to add comments to reports.  Make sure to check out Ten things you might have missed in the Technical Preview of Power BI Reports in SQL Server Reporting Services.


Microsoft plans to release the production-ready version in the next SQL Server release wave.  They won’t be releasing it in a Service Pack, Cumulative Update, or other form of update for SSRS 2016.  The Technical Preview is effectively a pre-release of SSRS vNext.

More info:

First thoughts on Power BI on premises

Power BI Reports in SSRS Technical Preview

Power BI reports in SQL Server Reporting Services: Feedback on the Technical Preview


PASS Summit Announcements: Azure Analysis Services

Microsoft usually has some interesting announcements at the PASS Summit, and this year was no exception.  I’m writing a set of blogs covering the major announcements.  Perhaps the biggest one is the introduction of the Azure Analysis Services Public Preview (OLAP).

This is a PaaS for SQL Server Analysis Services (SSAS).  So it’s PaaS SSAS 🙂  Read the official announcement.

It is based on the analytics engine in SSAS.  For those not familiar with SSAS, it is an OLAP engine and BI modeling platform that enables developers and BI professionals to create BI Semantic Models that can power highly interactive and rich analytical experiences in BI tools (such as Power BI and Excel) and custom applications.  It allows for much faster query and reporting processing compared to going directly against a database or data warehouse.  It also creates a semantic model over the raw data to make it much easier for business users to explore the data.

Some of the main points:

  • Developers can create a server in seconds, choosing from the Developer (D1) or Standard (S1, S2, S4) service tiers.  Each tier comes with fixed capacity in terms of query processing units and model cache.  The developer tier (D1) supports up to 3GB model cache and the largest tier (S4) supports up to 100GB
  • The Standard tiers offer dedicated capacity for predictable performance and are recommended for production workloads.  The Developer tier is recommended for proof-of-concept, development, and test workloads
  • Administrators can pause and resume the server at any time.  No charges are incurred when the server is paused.  On the roadmap is to offer administrators the ability to scale up and down a server between the Standard tiers (not available currently)
  • Developers can use Azure Active Directory to manage user identity and role based security for their models
  • The service is currently available in the South-Central US and West Europe regions.  More regions will be added during the preview

Similarities with SSAS:

  • Developers can use SQL Server Data Tools (SSDT) in Visual Studio for creating models and deploying them to the service.  Administrators can manage the models using SQL Server Management Studio (SSMS) and investigate issues using SQL Server Profiler
  • Business users can consume the models in any major BI tool.  Supported Microsoft tools include Power BI, Excel, and SQL Server Reporting Services.  Other MDX compliant BI tools can also be used, after downloading and installing the latest drivers
  • The service currently supports tabular models (compatibility level 1200 only).  Support for multidimensional models will be considered for a future release, based on customer demand
  • Models can consume data from a variety of sources in Azure (e.g. Azure SQL Database, Azure SQL Data Warehouse) and on-premises (e.g. SQL Server, Oracle, Teradata).  Access to on-premises sources is made available through the on-premises data gateway
  • Models can be cached in a highly optimized in-memory engine to provide fast responses to interactive BI tools.  Alternatively, models can query the source directly using DirectQuery, thereby leveraging the performance and scalability of the underlying database or big data engine

Check out the pricing, the documentation, tutorial videos, and the top-rated feature requests.

Get started with the Azure Analysis Services preview by simply provisioning a resource in the Azure Portal or using Azure Resource Manager templates, and using that server name in your Visual Studio project.
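
As a concrete example of “using that server name”: Azure Analysis Services server names use an asazure:// URI, and the same string is what you point SSDT, SSMS, or Power BI at.  The region and server name below are made-up placeholders:

```python
# Minimal sketch: the Azure Analysis Services server name used as the deployment
# target in SSDT and as the server name in SSMS or Power BI.
# The region and server name are placeholders for illustration only.
region = "southcentralus"      # region the server was provisioned in
server = "myanalysisserver"    # name chosen when creating the resource

aas_server_name = "asazure://{0}.asazure.windows.net/{1}".format(region, server)
print(aas_server_name)   # asazure://southcentralus.asazure.windows.net/myanalysisserver
```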


More info:

Learn more about Azure Analysis Services

First Thoughts On Azure Analysis Services

Creating your first data model in Azure Analysis Services

Why a Semantic Layer Like Azure Analysis Services is Relevant (Part 1)


PASS Summit Announcements: DMA/DEA

Microsoft usually has some interesting announcements at the PASS Summit, and this year was no exception.  My next few blogs will cover the major announcements.  This first one is about the Data Migration Assistant (DMA) tool v 2.0 general availability and the technical preview of Database Experimentation Assistant (DEA).

In short, customers will be able to assess and upgrade their databases using DMA, and validate the target database’s performance using DEA, which will build higher confidence in these upgrades.

More details on DMA:

DMA enables you to upgrade to a modern data platform on-premises or in an Azure VM by detecting compatibility issues that can impact database functionality on your new version of SQL Server.  It recommends performance and reliability improvements for your target environment.  It also allows you to move not only your schema and data, but also uncontained objects from your source server to your target server.

DMA replaces all previous versions of SQL Server Upgrade Advisor and should be used for upgrades for most SQL Server versions.  Note that the SQL Server Upgrade Advisor allowed for migrations to SQL Database, but DMA does not yet support this.

DMA helps you assess and migrate the following components, and provides performance, security, and storage recommendations that the database can benefit from post-upgrade on the target SQL Server platform:

  • Schema of databases
  • Data
  • Server roles
  • SQL and Windows logins

After the successful upgrade, applications will be able to connect to the target SQL Server databases with the same credentials.  Click here for more info.

Microsoft Download Center links: Microsoft® Data Migration Assistant v2.0

More details on DEA:

The Technical Preview of DEA is the new A/B testing solution for SQL Server upgrades.  It enables customers to conduct experiments on database workloads across two versions of SQL Server.  Customers upgrading from older SQL Server versions (2005 and above) to any newer version of SQL Server will be able to use key performance insights, captured using a real-world workload, to build confidence about the upgrade among database administrators, IT management, and application owners by minimizing upgrade risks, enabling low-risk migrations.

The tool offers the following capabilities for workload comparison analysis and reporting:

  • Automated scripts to set up workload capture and replay of a production database (using existing SQL Server functionality: Distributed Replay and SQL tracing)
  • Statistical analysis of the traces collected on both the old and new instances
  • Visualization of the data through an analysis report with a rich user experience

Supported versions:

Source: SQL Server 2005, SQL Server 2008, SQL Server 2008 R2, SQL Server 2012, SQL Server 2014, and SQL Server 2016

Target: SQL Server 2012, SQL Server 2014, and SQL Server 2016

Click here for more info.

Microsoft Download Center links: Microsoft® Database Experimentation Assistant Technical Preview

More info:

Top 5 Announcements at PASS Summit 2016


Data Warehouse Fast Track for SQL Server 2016

Microsoft Data Warehouse Fast Track for SQL Server 2016 is a joint effort between Microsoft and its hardware partners to deliver validated, pre-configured solutions that reduce the complexity of implementing a data warehouse on SQL Server Enterprise Edition.  The Data Warehouse Fast Track program provides flexibility of solutions and customer choice across hardware vendors’ technologies, and uses the core capabilities of the Windows Server operating system and SQL Server to deliver a balanced SMP data warehouse with optimized performance.


The reference architectures are tested internally by Microsoft and consist of high performance hardware and software configurations at various price, performance and footprint tiers.  Data Warehouse Fast Track for SQL Server brings some great capabilities designed to support a modern data warehouse implementation where data and analytics can truly exist in the same solution, spanning cloud and on-premises.  These reference architectures have been available since SQL Server 2012.

The Data Warehouse Fast Track is not a replacement for APS (Analytics Platform System).  APS is an MPP (Massively Parallel Processing) data warehouse appliance which is designed as a pure data warehouse offering and scales to store and query petabytes of data.  In general, the initial database size at which APS makes sense over the Data Warehouse Fast Track is 150TB (measured as raw data, with the assumption of roughly 5:1 compression, or about 30TB on disk).


Data Warehouse Fast Track for SQL Server brings the optimal configuration of hardware and software together into a single packaged offering which is guaranteed to perform.  Balancing time to solution against cost, Data Warehouse Fast Track for SQL Server enables success ‘out of the box’: no arduous sizing or throughput calculations (this has all been done for you), simple purchasing and installation, fast performance and scalability, and peace of mind.


There are certified reference architectures ranging from 6TB to 145TB across SQL Server 2014 and SQL Server 2016.  There is even a reference architecture which scales to 1.2PB!  To see the partners and their Data Warehouse Fast Track for SQL Server offerings, check out Data Warehouse Fast Track.  Keep in mind there is also the HP Superdome X for high-end OLTP/DW, with up to 384 cores, 24TB of memory, and 92TB of disk space, which can give you even more performance for an SMP solution.


Microsoft certification changes

A recent Microsoft blog post announced that they are releasing five new Microsoft Certified Solutions Expert (MCSE) and Developer (MCSD) specialties.  These credentials are aligned to Centers of Excellence, used by the Microsoft Partner Network to identify technical competencies that are widely recognizable by both Microsoft partners and customers.  All of these changes are being made without adding additional certification exams.

[Image: the new MCSE/MCSD certification paths]

(the white circles in the image represent a single exam that needs to be taken)

The five new expert certifications are MCSE: Cloud Platform and Infrastructure, MCSE: Data Management and Analytics, MCSE: Mobility, MCSE: Productivity, and MCSD: App Builder.

To earn each of these credentials, you must first earn a qualifying Microsoft Certified Solutions Associate (MCSA) certification and then pass a single additional exam from a list of electives associated with the corresponding Center of Excellence.  Click on the five links above to see the MCSA requirements and the electives.

The resulting MCSE or MCSD certification will be added to your transcript and will never expire.  Instead, the achievement date will signify your investment in continuing education on the technology.  Every year, you will have the opportunity to re-earn the certification by passing an additional exam from the list of electives, demonstrating your investment in broadening or deepening your skills in a given Center of Excellence.  Each time you earn the certification, a new certification entry will be added to your transcript.  This process will replace the existing recertification requirement of taking a specific recertification exam every 2 years (MCSD) or 3 years (MCSE) in order to prevent your certification from going inactive.

So instead of publishing “upgrade” exams that smash topics from multiple exams to basically test you on what’s changed since two or three years ago, you will have the choice of which additional elective exam you wish to take.  This new renewal method allows you to renew your certification while both staying current and learning something new.

Note that you can earn the corresponding new MCSE or MCSD certifications for 2016 without having to take any additional exams.  I found out about the changes when I received an email from Microsoft saying I had two new MCSEs: my “MCSE: Data Platform” and “MCSE: Business Intelligence” became “MCSE: Data Management and Analytics”.  Also, passing “70-473 Designing and Implementing Cloud Data Platform Solutions”, “70-475 Designing and Implementing Big Data Analytics Solutions”, and “70-534 Architecting Microsoft Azure Solutions” qualified me for “MCSE: Cloud Platform and Infrastructure”.

Another change: The three Azure certification exams (70-532, 70-533 and 70-534) used to earn you the full MCSD: Azure Solutions Architect certification.  However, this MCSD has gone away.  The three Azure certification exams are being integrated into the brand new MCSE and MCSD tracks “MCSD: App Builder” and “MCSE: Cloud Platform and Infrastructure”.

More info:

Microsoft Certification Changes and Goodbye to MCSD Azure Solutions Architect

Microsoft streamlines MCSE and MCSD certifications, eliminates requirement to retake exams

Microsoft makes massive changes to MCSE and MCSD

MCSD and MCSE Titles Revamped


Cortana Intelligence Solutions

Cortana Intelligence Solutions is a new tool just released in public preview that enables users to rapidly discover, easily provision, quickly experiment with, and jumpstart production-grade analytical solutions using the Cortana Intelligence Suite (CIS).  It does so using preconfigured solutions, reference architectures, and design patterns (I’ll just call all of these “patterns” for short).  At the heart of each Cortana Intelligence Solution pattern is one or more ARM templates which describe the Azure resources to be provisioned in the user’s Azure subscription.  Cortana Intelligence Solution patterns can be complex, with multiple ARM templates interspersed with custom tasks (Web Jobs) and/or manual steps (such as Power BI authorization in Stream Analytics job outputs).

So instead of having to manually go to the Azure web portal and provision many resources, these patterns will do it for you automatically.  Think of a pattern as a way to accelerate the process of building an end-to-end demo on top of CIS.  A deployed solution will provision your subscription with the necessary CIS components (i.e. Event Hub, Stream Analytics, HDInsight, Data Factory, Machine Learning, etc.) and build the relationships between them.

I’ll now go through the process of deploying a solution/pattern.

When you go to the main page of the Cortana Intelligence Solution, you can click on “Deployments” to see the deployments you already created, or you can click on “Solutions Gallery” or the “Get Started Now” button.  You will then be taken to the Cortana Intelligence Gallery (which is not new, but the “Solutions” link in the gallery is) and will be presented with four patterns to choose from (many more will be available soon).  I will now show you screen shots of what you will see when you choose a pattern:

I’ll choose the “Predictive Maintenance for Aerospace” pattern:

[Screenshot: the “Predictive Maintenance for Aerospace” pattern in the Solutions Gallery]

When I click on it I’ll then see a summary of the solution, including the estimated provisioning time:

[Screenshot: solution summary, including estimated provisioning time]

It includes technical details and workflow:

[Screenshot: technical details and workflow]

Included is a nice solution diagram:

[Screenshot: solution diagram]

It also shows the services that will be used:

[Screenshot: services used by the solution]

Then I hit the “Deploy” button and see a screen to fill out:

[Screenshot: deployment form]

Once I hit the “Create” button I get to see the status of the deployment:

[Screenshot: deployment status]

Clicking on the little “i” next to a running step gives me more details:

[Screenshot: details for a running step]

When this step finished I was sent an email with a link to the Azure ML experiment it created:

[Screenshot: email link to the Azure ML experiment]

When the deployment finished it displayed some post-deployment instructions and info:

[Screenshot: post-deployment instructions]

At the bottom of the instructions was a link to source code and a very detailed Technical Guide I could look at:

[Screenshot: link to the source code and Technical Guide]

You can always view the post-deployment instructions later by clicking “Deployments” on the main page of the Cortana Intelligence Solution and clicking on the deployment name.

How do you delete your solution?  To save costs, make sure to delete the solution if you are not using it.  Deleting your solution will delete all the components provisioned in your subscription when you deployed the solution.  To delete the solution, click on your solution name in the left panel of the solution template and click on delete.

Cortana Intelligence Solutions offer an improvement over Azure Quickstart Templates: each Azure Quickstart Template is a single ARM template, while a Cortana Intelligence Solution is composed of one or more ARM templates interspersed with custom “tasks”.  This enables complex flows that involve creating, configuring, and hydrating Azure resources in ways that are not possible through an ARM template alone.
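
For comparison, here is a minimal sketch of what deploying a single ARM template (the Azure Quickstart Template case) looks like when scripted with the Azure CLI from Python.  The resource group, template file, and parameters file are placeholders, and it assumes the CLI is installed, you are logged in, and the resource group already exists:

```python
# Minimal sketch: deploy a single ARM template into an existing resource group
# using the Azure CLI.  Resource group and file names are placeholders; assumes
# 'az login' has already been run.
import subprocess

subprocess.check_call([
    "az", "group", "deployment", "create",
    "--resource-group", "cis-demo-rg",
    "--template-file", "azuredeploy.json",
    "--parameters", "@azuredeploy.parameters.json",
])
```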

Another advantage is some Cortana Intelligence Solutions have a “Try with your data” experience.  This allows a user to play with the solution without having to deploy it.  An example of this is the IT Anomaly Insights solution that actually uses an Anomaly Detection machine learning API in the back end.

Cortana Intelligence Solutions are similar in concept to Azure IoT Suite preconfigured solutions, but have a much broader focus than just IoT and use more products.

I see Cortana Intelligence Solutions as not only a great time saver, but a way to use the proper reference architecture for the solutions you are looking to build.  It will make sure you are using the proper technologies and tools for your project so it will be a success.

More info:

Insanely Practical Patterns to Jump Start Your Analytics Solutions (video)

Drive transformative change with advanced analytics in Cortana Intelligence Suite and Microsoft R (video)

Dive into Predictive Maintenance using Cortana Intelligence Suite (video)


Making sense of Microsoft technology

In my role as a Data Platform Solution Architect (DPSA) at Microsoft, part of my responsibility is to keep up with all the Microsoft on-prem and cloud data-related technology and trends, as well as non-Microsoft technology and trends in areas such as Hadoop and NoSQL.  I work with Microsoft clients by first understanding their current data-related architectures and then educating them on which technologies and products they should consider in order to update their current architectures or to build new solutions.  There is a lot of knowledge transfer as most clients are so busy keeping what they have running that they are not aware of many of the products Microsoft has and how they all work together (I often say “they don’t know what they don’t know”).  I like to think of it as I help them put all the pieces of the puzzle together.  And as I mentioned in my previous blog, I try to show the clients The art of possible with the cloud.

It is a daunting task keeping up with all the technology as it changes so often.  Even though I spend half my days learning, I can barely keep my head above water, and that is with me just focusing on data-related products and not all the other Azure products such as networking, web and mobile app services, media services, etc. (we have “cloud solution architects” that cover those products).  To narrow down the technologies a client should consider, I will learn about their environment and ask a bunch of questions.  To help readers of my blog learn about the Microsoft technologies and which ones might be a good fit, I wanted to list a few documents and blog posts:

Azure Quick Start Guide by me.  This is a short overview with helpful links to most of the Azure data platform and analytics products

Microsoft BI and IM Design Guidance by Rod Colledge (Data Platform Solution Architect at Microsoft).  This document contains a detailed description of the data platform and analytics products for Azure and on-prem and includes example architectures.  This is an excellent document that will give you a true understanding of many of the Microsoft products and when best to use each

Ivan Kosyakov (Data Platform Technical Architect at Microsoft) blog: Decision Tree for Big Data solutions and Decision Tree for Machine Learning.  Also check out his glossary.  These are great blogs to help you narrow down which products to use based on your use case

Azure Info Hub: An excellent list of all the Azure products that is updated frequently.  Includes a short description of each product and the latest news, along with training videos, e-books, whitepapers, tools, and even StackOverflow discussions

Interactive Azure Platform Big Picture – Lists most of the Azure products with a short description and links for more info

Here are other useful blogs and presentations of mine:

Blogs:

Azure SQL Database vs SQL Data Warehouse

Relational databases vs Non-relational databases

Why use a data lake?

Presentations:

Relational databases vs Non-relational databases

Should I move my database to the cloud?

How does Microsoft solve Big Data?


The art of possible with the cloud

One thing I try to do in my role with Microsoft is to get clients to think of possible use cases for building solutions in the Azure cloud.  To set off light bulbs in their heads.  Sometimes this is accomplished by showing them demos of existing solutions.  It’s that old saying “they don’t know what they don’t know”, so I try to open up their minds to ideas they never thought of.  There are so many technologies and tools that it is easy to miss some of them, so I thought I would list the most popular ones for building solutions in Azure:

  • Power BI: The most well-known tool for building reports and dashboards, but what is not so well-known is there are “content packs” that allow you to connect to hundreds of services such as Salesforce, Microsoft Dynamics, and Google Analytics and automatically create reports and dashboards in seconds.  See Connect to services with content packs for Power BI (65 content packs as of 9/15/16).  Also, to give you ideas on the types of reports you can build in Power BI, check out these samples broken out by department or by industry.  In addition, there is a gallery of custom visuals built by the Power BI community that you can use.  You can keep up with the Power BI updates via these links: monthly Power BI Desktop updates, monthly mobile apps for Power BI, near-weekly Power BI Service updates, and Power BI Desktop data sources (63 data sources as of 9/15/16).  Finally, check out narratives for Power BI (a product extension that will tell a story about your data) and Power BI Solution Templates (set up end-to-end, enterprise-ready solutions for common BI problems in minutes)
  • Azure Machine Learning (Azure ML): A very easy way for designing, developing, testing and deploying predictive models.  What exactly does that mean?  Check out these use cases to get a better idea
  • Cognitive Services: A way to add vision, speech, language, knowledge, and search capabilities to your applications using intelligence APIs and SDKs.  How cool is that!  There are a ton of APIs that you can check out here (a minimal example call is sketched after this list)
  • HoloLens: Augmented reality that enables you to interact with high‑definition holograms in your world.  In short, HoloLens is a smart-glasses headset in which the live presentation of physical real-world elements is incorporated with that of virtual elements (“holograms”) such that they are perceived to exist together in a shared environment.  We are all going to want to build solutions with this!  Here is one video and another to show you the art of possible.  Then check out some use cases.  Then get blown away by checking out holoportation
  • Microsoft Bots: Bots are bits of software that use artificial intelligence to converse in human terms.  Imagine a bot in Skype that performs a variety of tasks, like adding items to your calendar, booking travel or hotel rooms, or even pre-populating conversations to friends with text.  They are easy to build with the Microsoft Bot Framework, and you can even chat with Spock
  • Cortana Intelligence Gallery: The Cortana Intelligence Suite (CIS) is a set of tools that allow you to build end-to-end “big data” solutions.  The Cortana Intelligence Gallery is a community of developers and data scientists, including Microsoft, who have shared their analytics solutions built on CIS.  It’s a great way to see the art of possible, as well as a way to quickly build a solution by using an existing solution in the gallery as a starting point.  The gallery is categorized by Solutions, Experiments, Machine Learning APIs, Notebooks, Competitions, Tutorials, and Collections.  You can also browse by industry
  • Azure IoT Suite preconfigured solutions: IoT is the internetworking of physical devices, vehicles, buildings and other items—embedded with electronics, software, sensors, actuators, and network connectivity that enable these objects to collect and exchange data (see case studies).  The Azure IoT Suite preconfigured solutions are implementations of common IoT solution patterns that you can deploy to Azure.  You can use the preconfigured solutions as a starting point for your own IoT solutions or to learn about common patterns in IoT solution design and development
  • Azure Quickstarts from Microsoft partners: A set of advanced ARM templates that are designed to launch full-stack solutions comprising multiple artifacts from multiple software providers, saving manual integration steps and time
  • Cortana Intelligence Solutions: Enables users to rapidly discover, easily provision, quickly experiment with, and jumpstart production grade analytical solutions using the Cortana Intelligence Suite (CIS).  It does so using preconfigured solutions, reference architectures and design patterns, built using ARM templates by Microsoft.  See Cortana Intelligence Solutions
  • Microsoft Open Source:  Who would have ever thought Microsoft would embrace open source?  Well they do, big time!  You can build solutions in Azure using Linux as part of the Open Cloud.  There are over a thousand pre-configured software images in the Virtual Machines Marketplace and hundreds of Azure Quickstart templates that you can use to create a VM, many of which include open source software.  In a matter of minutes you can be building solutions such as a data lake in Hadoop, or combining open source software with Microsoft technologies to create a modern data warehouse
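
As mentioned in the Cognitive Services bullet above, here is a minimal sketch of calling one of the intelligence APIs over REST, in this case asking the Computer Vision “analyze” endpoint to describe an image.  The region, API version in the URL, subscription key, and image URL are placeholders/assumptions; check your own Cognitive Services resource for the exact endpoint:

```python
# Minimal sketch: describe an image with the Computer Vision API over REST.
# The region, API version, subscription key, and image URL are placeholders.
import requests

endpoint = "https://westus.api.cognitive.microsoft.com/vision/v1.0/analyze"
headers = {
    "Ocp-Apim-Subscription-Key": "<your-key-here>",
    "Content-Type": "application/json",
}
params = {"visualFeatures": "Description,Tags"}
body = {"url": "https://example.com/some-image.jpg"}

response = requests.post(endpoint, headers=headers, params=params, json=body)
response.raise_for_status()
print(response.json()["description"]["captions"][0]["text"])
```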