Build Schedule

Sessions Found: 11
Getting acquainted with SQL Server starts out with the impression that the statement set is just the four of SELECT, INSERT, UPDATE, and DELETE. How simple; everyone can do that.

Then, as the database grows and orders of magnitude more transactions happen per second, the surprises arrive. For example, what exactly is an index scan, where does it occur, and does it really read the whole table? What is the scan count, and what does it mean when it is 0, 1, or more?

In this session we will explore the topic of scanning.
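The scan count the abstract mentions surfaces in the Messages tab when I/O statistics are enabled. A minimal T-SQL sketch (the table and column names are hypothetical demo objects, not from the session):

```sql
-- STATISTICS IO reports one line per table touched,
-- including its scan count and logical reads.
SET STATISTICS IO ON;

SELECT OrderID, OrderDate
FROM dbo.Orders              -- assumed demo table
WHERE CustomerID = 42;

-- Messages tab then shows something like:
-- Table 'Orders'. Scan count 1, logical reads ...
```

Whether that line reports a scan of the whole table or a seek into an index is exactly the kind of question the session digs into.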
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Application & Database Development

Level: Intermediate

Session Code:

Date: April 20

Time: 10:45 AM - 11:45 AM

Room: Neumann

The data warehouse and BI market is evolving rapidly with the appearance of new cloud-born technologies. We might assume that moving an existing Microsoft-based DWH to the cloud is an easy step, but when we dig a little deeper we see that there are many new technological choices and aspects to modernizing an existing DWH/BI system in the cloud. Not to mention starting everything from scratch in a new project designed specifically for the cloud, to utilize cloud flexibility and innovation as much as possible.
Which ETL tool should I use? Data Factory v2 with SSIS and BIML, or Azure Databricks-powered Dataflows? Or Power BI Dataflows? Which is the right choice for running OLAP workloads? Azure AS, or simply Power BI? When do I need Azure SQL DWH?
In the last couple of years I have helped many customers modernize their DWH landscape partially or fully in the cloud, and during my presentation I will share my findings and recipes for the most common situations I have met. You will have fun :)
Speaker:

Accompanying Materials:

No material found.

Session Type:
Regular Session (60 minutes)

Track:
BI Platform Architecture, Development & Administration

Level: Intermediate

Session Code:

Date: April 20

Time: 9:30 AM - 10:30 AM

Room: Neumann

Cosmos DB is a globally distributed, multi-model NoSQL database service designed for scalable, high-performance modern applications. Cosmos DB is delivered as a fully managed service with an enterprise-grade SLA. It supports querying documents using a familiar SQL syntax over hierarchical JSON documents. Azure Cosmos DB is a superset of the DocumentDB service: it allows you to store and query NoSQL data regardless of schema.
In this presentation, you will learn:
•	How to get started by provisioning a new database account
•	How to index documents
•	How to create applications using Cosmos DB (using the REST API or programming libraries for several popular languages)
•	Best practices for designing applications with Cosmos DB
•	Best practices for creating queries
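The "familiar SQL over hierarchical JSON" idea can be sketched with a Cosmos DB SQL API query; `c` is the conventional container alias, and the property names here are hypothetical:

```sql
-- Cosmos DB SQL API: query hierarchical JSON documents.
-- "c" aliases the container; properties below are illustrative.
SELECT c.id, c.address.city
FROM c
WHERE c.orderTotal > 100
ORDER BY c.orderDate DESC
```

Dotted paths like `c.address.city` reach into nested JSON, which is the key difference from querying flat relational rows.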
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Cloud Application Development & Deployment

Level: Advanced

Session Code:

Date: April 20

Time: 1:00 PM - 2:00 PM

Room: Neumann

Have you heard that knowing machine learning is the easiest way to get rich quickly? Let's test this statement. Kaggle is the place to do data science projects, so why not start there?
During this session we will solve a simple Kaggle competition. Actually, we will submit two solutions. The first is made with a super-duper deep neural network (the black-box approach). Then we will follow proven ML methodologies and solve the problem methodically. All that using SQL Server Machine Learning Services.
Minimum slides and maximum fun guaranteed.
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Advanced Analysis Techniques

Level: Intermediate

Session Code:

Date: April 20

Time: 10:45 AM - 11:45 AM

Room: Room 3

A common problem in database design is the implementation of entities that are physically different (for instance, they share some attributes but have additional, specific attributes) but should logically participate in the same relationships as one and the same.
For instance, the customer can either be a natural person (a person, for short), or a legal person (an organization or a company). The majority of the attributes of a natural person are, of course, different from the majority of the attributes of a legal person; however, from the perspective of how they participate in business operations, they need to be considered as equal.
In this session you will learn how to use specific native SQL Server functionalities to solve this particular problem: sparse columns, XML, JSON, and/or even User-defined CLR types – in an OLTP database, as well as in a star (or snowflake) schema data warehouse.
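One of the techniques the abstract names, sparse columns combined with a JSON column, can be sketched as a single table covering both subtypes. This is an illustrative guess at the kind of schema the session discusses, not its actual demo (all names are hypothetical):

```sql
-- One Customers table for both natural and legal persons.
-- Subtype-specific attributes are SPARSE (near-zero storage when NULL);
-- open-ended extras go into a JSON column (SQL Server 2016+).
CREATE TABLE dbo.Customers
(
    CustomerID      int IDENTITY PRIMARY KEY,
    CustomerType    char(1) NOT NULL
        CHECK (CustomerType IN ('P', 'O')),   -- Person / Organization
    FullName        nvarchar(200) NOT NULL,   -- shared attribute
    BirthDate       date SPARSE NULL,         -- natural-person only
    TaxNumber       nvarchar(20) SPARSE NULL, -- legal-person only
    ExtraAttributes nvarchar(max) NULL
        CONSTRAINT CK_Customers_Json CHECK (ISJSON(ExtraAttributes) = 1)
);
```

Both subtypes can then participate in the same foreign-key relationships through `CustomerID`, which is the logical unification the session is about.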
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Application & Database Development

Level: Advanced

Session Code:

Date: April 20

Time: 3:30 PM - 4:30 PM

Room: Neumann

Or: how a beginner IoT developer can visualize data and find insights in it.
If you create an IoT solution today, you have a variety of components available to mix and match into your solution, like LEGO bricks.
You get your hardware and firmware right and you get data from the sensors; now what? Of course you would like to have some data visualization, easy and fast, and of course you would like to learn something from that data, easy and fast. And it would be nice to have the results available on mobile devices: yes, yes, easy and fast.
Now enter our two titan technologies in a match to the death (oops, a data match) trying to achieve these goals: you guessed it, easy and fast (and cheap).
And we will see them clash, from real hardware, to the big Azure cloud, to mobile devices, trying to outmatch each other.
Speaker:

Accompanying Materials:

Session Type:
Regular Session (60 minutes)

Track:
Cloud Application Development & Deployment

Level: Intermediate

Session Code:

Date: April 20

Time: 2:15 PM - 3:15 PM

Room: Room 2

A simple and straightforward solution and process is always helpful when a database migration is planned, irrespective of whether it is on-premises or in the cloud.

Azure Database Migration Service (DMS) is here to perform lift-and-shift migrations to Azure SQL Database Managed Instance. Beyond migrating through the GUI, you can use scripting to scale migrations and plan minimal-downtime migrations from different data sources to SQL Server.

Let us jump into the Database Migration Service and see how you can utilize it to modernize your data estate onto fully managed services in Azure.
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Cloud Application Development & Deployment

Level: Intermediate

Session Code:

Date: April 20

Time: 10:45 AM - 11:45 AM

Room: Room 2

When a query runs slowly, we have debates between developers and admins about whose fault it is. Is the server too slow or badly configured? The admins' fault. Is the query badly written or the database design very poor? The developers' fault.
During this presentation we will learn how to determine whether a query can perform better and where to start looking for improvements based on the findings. It will include demos which will hopefully engage the audience in debate and show that working together (devs and admins) is the best option.
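A common starting point for "can this query perform better?" is the plan-cache statistics DMVs. A minimal sketch (standard SQL Server DMVs; not necessarily the session's own demo):

```sql
-- Top CPU consumers since the plan cache was last cleared.
SELECT TOP (10)
    qs.execution_count,
    qs.total_worker_time  / qs.execution_count AS avg_cpu_time,
    qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
    SUBSTRING(st.text, 1, 200)                 AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
```

Output like this gives devs and admins a shared, factual starting point for the debate the abstract describes.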
Speaker:

Accompanying Materials:

Session Type:
Regular Session (60 minutes)

Track:
Enterprise Database Administration & Deployment

Level: Intermediate

Session Code:

Date: April 20

Time: 3:30 PM - 4:30 PM

Room: Room 2

Get your fundamental knowledge and insights on how to establish and create your machine learning project with SQL Server 2016 or SQL Server 2017 using R or Python.

From retrieving data from SQL Server, through providing data insights with different data exploration techniques, to data operationalization, completing the journey with predictive modeling, data visualization, and presentation.
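The entry point for running R or Python inside SQL Server is `sp_execute_external_script`. A minimal sketch (requires Machine Learning Services installed and `external scripts enabled` set to 1; the source table is a hypothetical example):

```sql
-- Run Python in-database; InputDataSet / OutputDataSet are the
-- built-in data frames exchanged with the T-SQL query.
EXEC sp_execute_external_script
    @language = N'Python',
    @script = N'
import pandas as pd
# InputDataSet arrives as a pandas DataFrame;
# return summary statistics as the result set.
OutputDataSet = InputDataSet.describe().reset_index()
',
    @input_data_1 = N'SELECT SalesAmount FROM dbo.FactSales';  -- assumed demo table
```

The same pattern scales from simple exploration like this up to training and scoring predictive models without the data ever leaving the server.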
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Advanced Analysis Techniques

Level: Advanced

Session Code:

Date: April 20

Time: 2:15 PM - 3:15 PM

Room: Neumann

Security: we take it seriously. But as DBAs, we are not always up to date with every kind of information we store in our databases. We know that our databases, some tables, and even some specific columns (where we have multiple types of data, users, permissions, and so on) need to be protected even more than others.

But we need to know more. We have GDPR, ISO 27001, and SOC compliance around us, and nobody tells us what they exactly mean. We 'simply' need to be compliant (which is not simple). But we can use the power of SSMS together with Vulnerability Assessment and Security Center to collect all the necessary information, which we can use later to improve security and decrease risks for our databases. Even better, it will cost us (almost) nothing.

We will look at a hybrid environment where our data is both on-premises and in the cloud, but we use security tools for both.
Speaker:

Session Type:
Regular Session (60 minutes)

Track:
Enterprise Database Administration & Deployment

Level: Advanced

Session Code:

Date: April 20

Time: 1:00 PM - 2:00 PM

Room: Room 3

Correct design of cubes, dimensions, hierarchies, and attributes is vital for maintaining the performance of SSAS solutions.

Over the years spent building and operating OLAP cubes, I have collected quite a few best practices. In this session we will walk through the ten most important of these practices so that you can build them into your everyday work when building your own OLAP cubes on the Analysis Services platform. Sit in on this session and see how these tips and best practices can save you time too!
Speaker:

Accompanying Materials:

No material found.

Session Type:
Regular Session (60 minutes)

Track:
BI Platform Architecture, Development & Administration

Level: Intermediate

Session Code:

Date: April 20

Time: 9:30 AM - 10:30 AM

Room: Room 2
