Build Schedule

Sessions Found: 22
Azure Cosmos DB is Microsoft's globally distributed database-as-a-service, which supports multiple database models, such as graph, key-value, and document, with the ability to add new models in the future.

In this hands-on session, attendees will learn about and use the features of Azure Cosmos DB, with these topics highlighted:

- Key Value Features
- Graph Features
- Document Features
- Column Family
- Scalability & Elasticity
- Global Distribution
- Tunable Consistency

It is the first cloud database to natively support a multitude of data models and popular query APIs, is built on a novel database engine capable of ingesting sustained volumes of data, and provides blazing-fast queries, all without having to deal with schema or index management. And it is the first cloud database to offer five well-defined consistency models, so you can choose just the right one for your app.
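As a taste of the document model and tunable consistency covered above, here is a minimal sketch using the azure-cosmos Python SDK; the account URL, key, database, container, and item shape are hypothetical placeholders rather than session material.

```python
# A minimal sketch of Cosmos DB's document model and per-client consistency choice.
# Account URL, key, database/container names, and the item are hypothetical.
from azure.cosmos import CosmosClient

client = CosmosClient(
    "https://myaccount.documents.azure.com:443/",
    credential="<account-key>",
    consistency_level="Session",      # one of the five well-defined consistency models
)
container = client.get_database_client("demo").get_container_client("devices")

# Document model: schema-free JSON items, indexed automatically.
container.upsert_item({"id": "sensor-42", "partitionKey": "sensor-42", "tempC": 21.5})

# SQL-like query API over the same items.
for item in container.query_items(
    query="SELECT c.id, c.tempC FROM c WHERE c.tempC > @t",
    parameters=[{"name": "@t", "value": 20}],
    enable_cross_partition_query=True,
):
    print(item["id"], item["tempC"])
```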
Speaker:

Accompanying Materials:

No material found.

Session Type:
Precon (480 minutes)

Track:
Track 1

Level: Intermediate

Session Code:

Date: August 26

Time: 1:45 PM - 9:45 PM

Room: L21-MPR01

2, 4, 8 & 16 - these aren't just numbers to count or calculate; they mark how SQL Server has evolved over the years!

Why Upgrade?
The Data Platform Upgrade topic has been a popular session that I've presented at major conferences such as Microsoft Tech-Ed (North America, Europe & India) and SQLbits since 2008.

In this session, we will take an in-depth, end-to-end look at the upgrade process, covering the essential phases, steps, and issues involved in upgrading SQL Server (2000 through 2012) to SQL Server 2014 (with a good overview of 2016 too), using best practices and available resources.

What to do and what not to do?

This is a popular session that I have been presenting since 2008 at MS Tech-Ed, SQL Saturday & SQLbits UK.
We will cover the complete upgrade cycle, including the preparation tasks, upgrade tasks, and post-upgrade tasks. Real-world examples from my consulting experience will expand on the why and how of such a solution.
Speaker:

Session Type:
Extended Session (90 minutes)

Track:
Track 2

Level: Advanced

Session Code:

Date: August 26

Time: 11:15 AM - 12:45 PM

Room: L21-MPR02

SQL Server 2017, released in quick succession after SQL Server 2016, brings some new features and improvements to the table. In this session we will look at a subset of the new and improved features in this release.

From analytics, availability, configuration, and performance, there is something for everyone in this session to introduce you to the latest version of SQL Server.
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Track 3

Level: Beginner

Session Code:

Date: August 26

Time: 2:45 PM - 3:45 PM

Room: L22-CF12

Abstract: In this session I will walk through some of the out-of-the-box SQL Server tools that you need to know, or never knew could be so much fun. A number of command-line utilities ship with SQL Server that we are rarely exposed to. I will take a tour of each of these tools and the scenarios in which real-world customers have used them in innovative ways.
Speaker:

Accompanying Materials:

No material found.

Session Type:
Regular Session (60 minutes)

Track:
Track 3

Level: Intermediate

Session Code:

Date: August 26

Time: 10:00 AM - 11:00 AM

Room: L22-CF12

Where do you begin with step 1 of data science on the way to becoming a data scientist? What are these data scientists up to? This is an introductory data science session for novices: learn how easily you can step into data science and start to become a professional.

How can we see and try a data scientist's statistical model in a familiar day-to-day tool like Microsoft SQL Server?

Thanks to Microsoft integrating Revolution R within SQL Server 2016, we all now have the opportunity to use R packages and see the results within SQL Server 2016, and also to utilize them from applications and/or Reporting Services.
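To make the SQL Server and R integration concrete, here is a minimal sketch (not from the session) that calls R through sp_execute_external_script from Python via pyodbc; the server, database, and dbo.FactSales table are hypothetical, and 'external scripts enabled' must be turned on for this to run.

```python
# A minimal sketch of running an R script inside SQL Server 2016+ and reading the result.
# Server, database, and dbo.FactSales are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=DemoDW;Trusted_Connection=yes"
)
cursor = conn.cursor()
cursor.execute("""
    EXEC sp_execute_external_script
        @language = N'R',
        @script = N'OutputDataSet <- data.frame(avg_sales = mean(InputDataSet$SalesAmount))',
        @input_data_1 = N'SELECT SalesAmount FROM dbo.FactSales'
    WITH RESULT SETS ((avg_sales FLOAT));
""")
print(cursor.fetchone().avg_sales)
```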
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Track 4

Level: Beginner

Session Code:

Date: August 26

Time: 10:00 AM - 11:00 AM

Room: L22-CF15

To ascertain the capabilities of a cloud computing platform, let us survey what is available and offered on Microsoft Azure.

Microsoft Azure has the ability to move, store, and analyze data within the cloud. It is essential to evaluate the many opportunities and options that Microsoft Azure offers for data insights. In this session, let us talk about strategies for data storage, data partitioning, and availability options with Azure, and tour how these Azure components can best help you achieve success with your Big Data platform.
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Track 2

Level: Intermediate

Session Code:

Date: August 26

Time: 1:45 PM - 2:45 PM

Room: L21-MPR02

A database professional is always challenged by monitoring and maintenance across the environment. Be it 10 servers or 1,000 servers, wouldn't it be easier to have a centralised solution?
In this session I will walk you through the simple steps of building a centralised database maintenance and monitoring solution. This solution lets a database professional query all the monitored instances, collect performance metrics, run customized scripts, and perform any maintenance task needed across the database environment.
Talk about an easy DBA life; this is definitely a simple solution you should build.
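As one possible shape of such a solution (not the speaker's actual implementation), here is a minimal Python sketch that loops over monitored instances and collects a couple of performance counters into a central repository; the instance names, CentralDBA database, and dbo.PerfSnapshot table are hypothetical.

```python
# A minimal central-collector sketch: pull counters from each monitored instance
# and store them in one repository database. All names are hypothetical.
import pyodbc

MONITORED_INSTANCES = ["SQLPROD01", "SQLPROD02"]   # hypothetical servers

central = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=SQLADMIN01;DATABASE=CentralDBA;Trusted_Connection=yes"
)
store = central.cursor()

for instance in MONITORED_INSTANCES:
    src = pyodbc.connect(
        f"DRIVER={{ODBC Driver 17 for SQL Server}};SERVER={instance};DATABASE=master;Trusted_Connection=yes"
    )
    rows = src.cursor().execute("""
        SELECT counter_name, cntr_value
        FROM sys.dm_os_performance_counters
        WHERE counter_name IN ('Batch Requests/sec', 'Page life expectancy');
    """).fetchall()
    src.close()

    for counter_name, cntr_value in rows:
        store.execute(
            "INSERT INTO dbo.PerfSnapshot (instance_name, counter_name, cntr_value, collected_at) "
            "VALUES (?, ?, ?, SYSUTCDATETIME());",
            instance, counter_name.strip(), int(cntr_value),
        )
    central.commit()
```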
Speaker:

Accompanying Materials:

No material found.

Session Type:
Regular Session (60 minutes)

Track:
Track 4

Level: Intermediate

Session Code:

Date: August 26

Time: 11:15 AM - 12:15 PM

Room: L22-CF15

The relational data warehouse, the heart of almost all Business Intelligence solutions, has been serving us for decades, and its implementation is well established and successful. It was always an on-premises database, but now it is available as a database-as-a-service with Microsoft Azure, called Azure SQL Data Warehouse. It is an MPP, cloud-based, scale-out relational database that supports workloads from a few hundred gigabytes to petabytes of data, and it addresses all modern requirements in data warehousing. Let's talk a little about the traditional on-premises data warehouse, and then discuss Azure SQL Data Warehouse and how to implement it.
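To illustrate the MPP, scale-out nature of the service, here is a minimal sketch of declaring a hash-distributed fact table with a clustered columnstore index in Azure SQL Data Warehouse; the server, credentials, and table definition are hypothetical placeholders.

```python
# A minimal sketch of an Azure SQL Data Warehouse table distributed across compute nodes.
# Server, credentials, and dbo.FactSales are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver.database.windows.net;"
    "DATABASE=MyDW;UID=loader;PWD=<password>"
)
conn.cursor().execute("""
    CREATE TABLE dbo.FactSales
    (
        SaleId      BIGINT        NOT NULL,
        CustomerKey INT           NOT NULL,
        SaleDate    DATE          NOT NULL,
        Amount      DECIMAL(18,2) NOT NULL
    )
    WITH (DISTRIBUTION = HASH(CustomerKey), CLUSTERED COLUMNSTORE INDEX);
""")
conn.commit()
```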
Speaker:

Accompanying Materials:

No material found.

Session Type:
Regular Session (60 minutes)

Track:
Track 2

Level: Intermediate

Session Code:

Date: August 26

Time: 10:00 AM - 11:00 AM

Room: L21-MPR02

In today's high-pressured business environments, the need for agile BI has forced users to produce information at an ever-increasing rate. The aspect of data quality is often overlooked, as the need to bring out information at a more rapid pace forces people to cut corners. More often than not, data quality takes a back seat until serious questions are raised about some of the information being published, and as a result questions are asked about the reliability of the decision-making process.

This session will show business users how DQS can assist their business in identifying, resolving, and managing data quality issues through a couple of clear and concise examples. The session will also show how the DQS engine can integrate with an SSIS back-end data warehousing package to assist in automating data quality issue resolution, again through clear and concise code examples.
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Track 4

Level: Intermediate

Session Code:

Date: August 26

Time: 2:45 PM - 3:45 PM

Room: L22-CF15

You may have heard the word "DevOps" and wondered whether it is just another buzzword and/or what it can do for you.

In this session I will demystify the concepts of DevOps and we will look at two aspects of it: Continuous Integration and Continuous Delivery.

Continuous Integration is the practice in which software developers frequently integrate their work with that of other members of the development team. It also involves automating tests around the integrated work.

Continuous Delivery is the next step after Continuous Integration in the deployment pipeline, and is the process of automating the deployment of software to test, staging, and production environments.

Database migrations and changes are an area that is typically not automated and does not utilise Continuous Delivery.

Through the use of a comprehensive live demo against a running production database, the audience will learn the benefits of Continuous Delivery and how to implement it in their database systems' deployment pipeline.
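As one way to picture database Continuous Delivery (not the session's demo tooling), here is a minimal Python sketch of a migration runner that applies versioned .sql scripts in order and records what has been applied; the dbo.SchemaVersions table and migrations/ folder layout are hypothetical, and each script is assumed to be a single batch (no GO separators).

```python
# A minimal migration-runner sketch: apply unapplied .sql scripts in order and record them,
# so every environment reaches the same schema automatically. Names are hypothetical.
import pathlib
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=AppDb;Trusted_Connection=yes"
)
cur = conn.cursor()
cur.execute("""
    IF OBJECT_ID('dbo.SchemaVersions') IS NULL
        CREATE TABLE dbo.SchemaVersions (
            script_name NVARCHAR(260) PRIMARY KEY,
            applied_at  DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME());
""")
conn.commit()

already_applied = {r.script_name for r in cur.execute("SELECT script_name FROM dbo.SchemaVersions;")}

for script in sorted(pathlib.Path("migrations").glob("*.sql")):   # e.g. 001_create_orders.sql
    if script.name in already_applied:
        continue
    cur.execute(script.read_text())                                # apply the migration
    cur.execute("INSERT INTO dbo.SchemaVersions (script_name) VALUES (?);", script.name)
    conn.commit()                                                  # one transaction per script
    print(f"applied {script.name}")
```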
Speaker:

Accompanying Materials:

No material found.

Session Type:
Regular Session (60 minutes)

Track:
Track 2

Level: Intermediate

Session Code:

Date: August 26

Time: 2:45 PM - 3:45 PM

Room: L21-MPR02

SQL Server 2017 and Azure SQL Database now enable you to describe a graph (a network structure) and store data in the form of the nodes and edges that make up the graph.
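For a flavour of the feature, here is a minimal sketch that creates node and edge tables and runs a MATCH query from Python via pyodbc; the Person and FriendOf tables and their data are hypothetical examples.

```python
# A minimal sketch of SQL Server 2017 graph tables: nodes, edges, and a MATCH query.
# Table and column names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=GraphDemo;Trusted_Connection=yes"
)
cur = conn.cursor()
cur.execute("CREATE TABLE dbo.Person (PersonId INT PRIMARY KEY, Name NVARCHAR(100)) AS NODE;")
cur.execute("CREATE TABLE dbo.FriendOf AS EDGE;")
cur.execute("INSERT INTO dbo.Person (PersonId, Name) VALUES (1, N'Amy'), (2, N'Ben');")
cur.execute("""
    INSERT INTO dbo.FriendOf ($from_id, $to_id)
    SELECT p1.$node_id, p2.$node_id
    FROM dbo.Person p1, dbo.Person p2
    WHERE p1.PersonId = 1 AND p2.PersonId = 2;
""")
conn.commit()

# Who is Amy a friend of?
for row in cur.execute("""
    SELECT p2.Name
    FROM dbo.Person p1, dbo.FriendOf f, dbo.Person p2
    WHERE MATCH(p1-(f)->p2) AND p1.Name = N'Amy';
"""):
    print(row.Name)
```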
Speaker:

Accompanying Materials:

No material found.

Session Type:
Regular Session (60 minutes)

Track:
Track 4

Level: Intermediate

Session Code:

Date: August 26

Time: 1:45 PM - 2:45 PM

Room: L22-CF15

Is data load performance into columnstore slowing you down? Should you drop nonclustered indexes before loading data? What batch size should you choose? How should you handle a large number of updates and deletes? A columnstore index can speed up the performance of analytics queries significantly, but are you getting the best performance possible? Come to this session to learn the best practices and the techniques customers have used to load data in parallel, with minimal or reduced logging, into a columnstore index. Specifically, we will cover data loads using BCP, Bulk Insert, SSIS, or the MERGE command, concurrent trickle inserts in IoT scenarios, as well as moving data from a staging table. We will also talk about how to diagnose performance issues in queries accessing a columnstore index and the steps you can take to troubleshoot them.
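As one example of the loading techniques discussed, here is a minimal sketch of a BULK INSERT into a clustered columnstore table with a rowgroup-sized batch; the database, table, and file path are hypothetical.

```python
# A minimal sketch of bulk-loading a clustered columnstore table. Batches of 1,048,576
# rows can be compressed directly into rowgroups instead of landing in the delta store;
# TABLOCK allows a parallel, minimally logged load under simple or bulk-logged recovery.
# Database, table, and file path are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=DemoDW;Trusted_Connection=yes",
    autocommit=True,
)
conn.cursor().execute("""
    BULK INSERT dbo.FactSales
    FROM 'C:\\loads\\factsales.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', BATCHSIZE = 1048576, TABLOCK);
""")
```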
Speaker:

Session Type:
Extended Session (90 minutes)

Track:
Track 1

Level: Intermediate

Session Code:

Date: August 26

Time: 11:15 AM - 12:45 PM

Room: L21-MPR01

In-Memory OLTP arrived in SQL Server 2014, along with several limitations.
It was still really useful, as long as your use case suited those limitations.
SQL Server 2016 removes several of these limitations and widens the use cases for In-Memory OLTP.
For example, even if you are not taking advantage of in-memory tables, you can probably gain performance improvements by replacing traditional table variables and both local and global temporary tables with in-memory implementations. Now you'll be able to say that table variable is truly in memory!
Come along to this session and learn about some of the features of In-Memory OLTP (or Hekaton, as a lot of people refer to it) and how you may be able to leverage these high-performance features in both new and existing applications.
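To show the table-variable replacement mentioned above, here is a minimal sketch that creates a memory-optimized table type and uses it as an in-memory table variable; the type name and column are hypothetical, and the database is assumed to already have a MEMORY_OPTIMIZED_DATA filegroup.

```python
# A minimal sketch of a memory-optimized table type (SQL Server 2016+) whose variables
# live in memory rather than in tempdb. Names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=AppDb;Trusted_Connection=yes",
    autocommit=True,
)
cur = conn.cursor()
cur.execute("""
    CREATE TYPE dbo.OrderIdList AS TABLE
    (
        OrderId INT NOT NULL,
        INDEX ix_OrderId NONCLUSTERED (OrderId)   -- memory-optimized types need an index
    )
    WITH (MEMORY_OPTIMIZED = ON);
""")
cur.execute("""
    SET NOCOUNT ON;
    DECLARE @ids dbo.OrderIdList;                 -- a truly in-memory table variable
    INSERT INTO @ids (OrderId) VALUES (1), (2), (3);
    SELECT COUNT(*) AS cnt FROM @ids;
""")
print(cur.fetchone().cnt)
```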
Speaker:

Accompanying Materials:

No material found.

Session Type:
Regular Session (60 minutes)

Track:
Track 4

Level: Beginner

Session Code:

Date: August 26

Time: 1:30 PM - 2:30 PM

Room: L21-MPR02

In this session we'll look over some of the things which you should be looking at within your virtual environment to ensure that you are getting the performance out of it that you should be.  This will include how to look for CPU performance issues at the host level.  We will also be discussing the Memory Balloon drivers and what they actually do, and how you should be configuring them, and why.  We'll discuss some of the memory sharing technologies which are built into vSphere and Hyper-V and how they relate to SQL Server.  Then we will finish up with some storage configuration options to look at.
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Track 3

Level: Beginner

Session Code:

Date: August 26

Time: 1:45 PM - 2:45 PM

Room: L22-CF12

The Query Store feature is a game changer for how SQL Server professionals troubleshoot query performance. It provides the ability to natively capture baseline data within SQL Server. Using the performance metrics the Query Store records, we can decide which Execution Plan SQL Server should use when executing a specific query.
During this session we will take a thorough look at the Query Store: its architecture, the built-in reporting, the DMVs, and the performance impact of enabling it. Whether you are a DBA or a developer, the Query Store has information that can help you.
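As a small illustration of that workflow, here is a minimal sketch that enables the Query Store, reads its runtime statistics, and forces a plan; the database name and the query_id/plan_id values are hypothetical placeholders.

```python
# A minimal sketch of the Query Store workflow: capture baselines, inspect runtime
# statistics, and force a chosen plan. AppDb and the IDs are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=AppDb;Trusted_Connection=yes",
    autocommit=True,
)
cur = conn.cursor()
cur.execute("ALTER DATABASE AppDb SET QUERY_STORE = ON;")   # start capturing baselines

# Which plans have the highest average duration?
for query_id, plan_id, avg_duration in cur.execute("""
    SELECT TOP (5) p.query_id, rs.plan_id, AVG(rs.avg_duration) AS avg_duration
    FROM sys.query_store_plan AS p
    JOIN sys.query_store_runtime_stats AS rs ON rs.plan_id = p.plan_id
    GROUP BY p.query_id, rs.plan_id
    ORDER BY avg_duration DESC;
"""):
    print(query_id, plan_id, avg_duration)

# Pin a known-good plan for a regressed query (IDs are placeholders).
cur.execute("EXEC sp_query_store_force_plan @query_id = 42, @plan_id = 7;")
```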
Speaker:

Accompanying Materials:

No material found.

Session Type:
Regular Session (60 minutes)

Track:
Track 2

Level: Intermediate

Session Code:

Date: August 26

Time: 4:00 PM - 5:00 PM

Room: L21-MPR02

The newest release of SQL Server brings several important bits that can make a difference for your HA and DR strategies. In this session, we will look at those improvements that concern Always On Availability Groups, and at other areas that help to keep your system available not just technically but also performance-wise. We will also look at the lesser-known automatic seeding option that has existed since SQL Server 2016, and at how new DMVs help to analyze and estimate backup requirements and effects. If you are unsure whether SQL Server 2017 is a distinguishing factor over former releases, and you have high demands for HA & DR, this session will give you the answers.
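As a small illustration of the automatic seeding option, here is a minimal sketch; the availability group, replica, and server names are hypothetical, and the GRANT step must be run on the secondary replica.

```python
# A minimal sketch of automatic seeding (SQL Server 2016+): the secondary is seeded over
# the network instead of by manual backup/restore. All names are hypothetical.
import pyodbc

primary = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=PRIMARY01;DATABASE=master;Trusted_Connection=yes",
    autocommit=True,
)
cur = primary.cursor()

# On the primary: switch the secondary replica to automatic seeding.
cur.execute("""
    ALTER AVAILABILITY GROUP [AG_Sales]
    MODIFY REPLICA ON N'SECONDARY01' WITH (SEEDING_MODE = AUTOMATIC);
""")

# On the secondary (run against SECONDARY01), allow the AG to create seeded databases:
#   ALTER AVAILABILITY GROUP [AG_Sales] GRANT CREATE ANY DATABASE;

# Monitor seeding attempts through the dedicated DMV.
for row in cur.execute(
    "SELECT start_time, current_state, failure_state_desc FROM sys.dm_hadr_automatic_seeding;"
):
    print(row.start_time, row.current_state, row.failure_state_desc)
```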
Speaker:

Accompanying Materials:

No material found.

Session Type:
Regular Session (60 minutes)

Track:
Track 1

Level: Beginner

Session Code:

Date: August 26

Time: 4:00 PM - 5:00 PM

Room: L21-MPR01

In this session you will learn about the new release of SQL Server on Linux: how to choose between Windows and Linux, the changes needed to make SQL Server run on Linux, and high availability and disaster recovery solutions.
Speaker:

Accompanying Materials:

No material found.

Session Type:
Regular Session (60 minutes)

Track:
Track 1

Level: Intermediate

Session Code:

Date: August 26

Time: 2:45 PM - 3:45 PM

Room: L21-MPR01

We will start with a basic overview of the encryption concepts supported by SQL Server, such as hashes, asymmetric keys, symmetric keys, certificates, encryption algorithms, the Service Master Key (SMK), the Database Master Key (DMK), and the Database Encryption Key (DEK).
Then we will work through a TDE overview: availability, what is and is not encrypted, implementation steps, and best practices.
Finally, we will explore the hidden mysteries of TDE and its decryption pitfalls.
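For reference, here is a minimal sketch of the standard TDE implementation steps; the certificate name, password, and AppDb database are hypothetical, and in practice the certificate and its private key must be backed up before relying on this.

```python
# A minimal sketch of enabling TDE: DMK and certificate in master, DEK in the user
# database, then encryption on. Names, password, and AppDb are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=master;Trusted_Connection=yes",
    autocommit=True,
)
cur = conn.cursor()
cur.execute("CREATE MASTER KEY ENCRYPTION BY PASSWORD = N'Str0ng!Passw0rd';")   # DMK in master
cur.execute("CREATE CERTIFICATE TdeCert WITH SUBJECT = N'TDE certificate';")
cur.execute("""
    USE AppDb;
    CREATE DATABASE ENCRYPTION KEY
        WITH ALGORITHM = AES_256
        ENCRYPTION BY SERVER CERTIFICATE TdeCert;
""")
cur.execute("ALTER DATABASE AppDb SET ENCRYPTION ON;")

# Check progress: encryption_state 3 means the database is encrypted.
for db_id, state in cur.execute(
    "SELECT database_id, encryption_state FROM sys.dm_database_encryption_keys;"
):
    print(db_id, state)
```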
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Track 1

Level: Intermediate

Session Code:

Date: August 26

Time: 4:00 PM - 5:00 PM

Room: L21-MPR01

SQL Server 2016 and SQL Server 2017 just work! But if you are an IT professional, you want to be armed with all the right diagnostics to react to a mission-critical problem or to tune the engine to the needs of your business. DevOps also requires the right tools to build and tune queries for maximum performance. SQL Server 2016 and 2017 have all the diagnostics you need built into the product. These are the foundation for great tools from vendors such as SentryOne, RedGate, SolarWinds, and Idera. We also have intelligence built into the engine, based on these diagnostics, to automate, learn, and adapt. In this session we will show you the wide variety of these diagnostics, with testimonies from vendors and SQL MVPs. You will learn why SQL Server diagnostics are the best in the industry, built in, and spanning all platforms across SQL Server, Azure, and Linux.
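As a taste of those built-in diagnostics, here is a minimal sketch that reads the wait-statistics DMV that monitoring vendors and DBAs build on; the server and the filter are hypothetical examples.

```python
# A minimal sketch of one built-in diagnostic: top waits from sys.dm_os_wait_stats.
# Server and filter choices are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=master;Trusted_Connection=yes"
)
for wait_type, wait_ms, tasks in conn.cursor().execute("""
    SELECT TOP (10) wait_type, wait_time_ms, waiting_tasks_count
    FROM sys.dm_os_wait_stats
    WHERE wait_type NOT LIKE 'SLEEP%'       -- ignore idle/sleep waits
    ORDER BY wait_time_ms DESC;
"""):
    print(f"{wait_type:<40} {wait_ms:>14} ms  {tasks} waits")
```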
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Track 1

Level: Intermediate

Session Code:

Date: August 26

Time: 10:00 AM - 11:00 AM

Room: L21-MPR01

One of the main drivers for digital transformation reported in the survey was optimizing customer experiences and engagement. More than 60 percent of executives identified delivering a superior experience for customers and creating new sources of customer value as important factors in determining an organization's success as a digital business. These goals aren't just pipe dreams, either: organizations are backing this assertion with serious budgetary investment, with half of respondents planning to invest in building applications in the next year that support the customer engagement model. So what role does data play in achieving these goals through a successful digital transformation? I will present the details of the answer to this question. I will cover how the data a company collects can be analyzed for insights that help the company personalize its interactions with customers. Data can be used to find trends around a customer's habits, which are typically indicative of their future behavior.
Speaker:

Accompanying Materials:

No material found.

Session Type:
Regular Session (60 minutes)

Track:
Track 3

Level: Intermediate

Session Code:

Date: August 26

Time: 4:00 PM - 5:00 PM

Room: L22-CF12
