Official Microsoft Learning Product
20462C: Administering Microsoft® SQL Server® Databases
Information in this document, including URL and other Internet Web site references, is subject to change without notice. Unless otherwise noted, the example companies, organizations, products, domain names, e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with any real company, organization, product, domain name, e-mail address, logo, person, place or event is intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation. Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.
The names of manufacturers, products, or URLs are provided for informational purposes only and Microsoft makes no representations and warranties, either expressed, implied, or statutory, regarding these manufacturers or the use of the products with any Microsoft technologies. The inclusion of a manufacturer or product does not imply endorsement of Microsoft of the manufacturer or product. Links may be provided to third party sites. Such sites are not under the control of Microsoft and Microsoft is not responsible for the contents of any linked site or any link contained in a linked site, or any changes or updates to such sites. Microsoft is not responsible for webcasting or any other form of transmission received from any linked site. Microsoft is providing these links to you only as a convenience, and the inclusion of any link does not imply endorsement of Microsoft of the site or the products contained therein. © 2014 Microsoft Corporation. All rights reserved.
Microsoft and the trademarks listed at http://www.microsoft.com/about/legal/en/us/IntellectualProperty/Trademarks/EN-US.aspx are trademarks of the Microsoft group of companies. All other trademarks are property of their respective owners.

Product Number: 20462C
Part Number: X19-32473
Released: 05/2014
MICROSOFT LICENSE TERMS
MICROSOFT INSTRUCTOR-LED COURSEWARE

These license terms are an agreement between Microsoft Corporation (or based on where you live, one of its affiliates) and you. Please read them. They apply to your use of the content accompanying this agreement which includes the media on which you received it, if any. These license terms also apply to Trainer Content and any updates and supplements for the Licensed Content unless other terms accompany those items. If so, those terms apply.

BY ACCESSING, DOWNLOADING OR USING THE LICENSED CONTENT, YOU ACCEPT THESE TERMS. IF YOU DO NOT ACCEPT THEM, DO NOT ACCESS, DOWNLOAD OR USE THE LICENSED CONTENT.

If you comply with these license terms, you have the rights below for each license you acquire.

1. DEFINITIONS.

a. “Authorized Learning Center” means a Microsoft IT Academy Program Member, Microsoft Learning Competency Member, or such other entity as Microsoft may designate from time to time.

b. “Authorized Training Session” means the instructor-led training class using Microsoft Instructor-Led Courseware conducted by a Trainer at or through an Authorized Learning Center.

c. “Classroom Device” means one (1) dedicated, secure computer that an Authorized Learning Center owns or controls that is located at an Authorized Learning Center’s training facilities that meets or exceeds the hardware level specified for the particular Microsoft Instructor-Led Courseware.

d. “End User” means an individual who is (i) duly enrolled in and attending an Authorized Training Session or Private Training Session, (ii) an employee of a MPN Member, or (iii) a Microsoft full-time employee.

e. “Licensed Content” means the content accompanying this agreement which may include the Microsoft Instructor-Led Courseware or Trainer Content.

f. “Microsoft Certified Trainer” or “MCT” means an individual who is (i) engaged to teach a training session to End Users on behalf of an Authorized Learning Center or MPN Member, and (ii) currently certified as a Microsoft Certified Trainer under the Microsoft Certification Program.

g. “Microsoft Instructor-Led Courseware” means the Microsoft-branded instructor-led training course that educates IT professionals and developers on Microsoft technologies. A Microsoft Instructor-Led Courseware title may be branded as MOC, Microsoft Dynamics or Microsoft Business Group courseware.

h. “Microsoft IT Academy Program Member” means an active member of the Microsoft IT Academy Program.

i. “Microsoft Learning Competency Member” means an active member of the Microsoft Partner Network program in good standing that currently holds the Learning Competency status.

j. “MOC” means the “Official Microsoft Learning Product” instructor-led courseware known as Microsoft Official Course that educates IT professionals and developers on Microsoft technologies.

k. “MPN Member” means an active Microsoft Partner Network program member in good standing.

l. “Personal Device” means one (1) personal computer, device, workstation or other digital electronic device that you personally own or control that meets or exceeds the hardware level specified for the particular Microsoft Instructor-Led Courseware.

m. “Private Training Session” means the instructor-led training classes provided by MPN Members for corporate customers to teach a predefined learning objective using Microsoft Instructor-Led Courseware. These classes are not advertised or promoted to the general public and class attendance is restricted to individuals employed by or contracted by the corporate customer.

n. “Trainer” means (i) an academically accredited educator engaged by a Microsoft IT Academy Program Member to teach an Authorized Training Session, and/or (ii) a MCT.

o. “Trainer Content” means the trainer version of the Microsoft Instructor-Led Courseware and additional supplemental content designated solely for Trainers’ use to teach a training session using the Microsoft Instructor-Led Courseware. Trainer Content may include Microsoft PowerPoint presentations, trainer preparation guide, train the trainer materials, Microsoft One Note packs, classroom setup guide and Pre-release course feedback form. To clarify, Trainer Content does not include any software, virtual hard disks or virtual machines.

2. USE RIGHTS. The Licensed Content is licensed not sold. The Licensed Content is licensed on a one copy per user basis, such that you must acquire a license for each individual that accesses or uses the Licensed Content.

2.1 Below are five separate sets of use rights. Only one set of rights apply to you.
a. If you are a Microsoft IT Academy Program Member:

i. Each license acquired on behalf of yourself may only be used to review one (1) copy of the Microsoft Instructor-Led Courseware in the form provided to you. If the Microsoft Instructor-Led Courseware is in digital format, you may install one (1) copy on up to three (3) Personal Devices. You may not install the Microsoft Instructor-Led Courseware on a device you do not own or control.

ii. For each license you acquire on behalf of an End User or Trainer, you may either:

1. distribute one (1) hard copy version of the Microsoft Instructor-Led Courseware to one (1) End User who is enrolled in the Authorized Training Session, and only immediately prior to the commencement of the Authorized Training Session that is the subject matter of the Microsoft Instructor-Led Courseware being provided, or

2. provide one (1) End User with the unique redemption code and instructions on how they can access one (1) digital version of the Microsoft Instructor-Led Courseware, or

3. provide one (1) Trainer with the unique redemption code and instructions on how they can access one (1) Trainer Content,

provided you comply with the following:

iii. you will only provide access to the Licensed Content to those individuals who have acquired a valid license to the Licensed Content,

iv. you will ensure each End User attending an Authorized Training Session has their own valid licensed copy of the Microsoft Instructor-Led Courseware that is the subject of the Authorized Training Session,

v. you will ensure that each End User provided with the hard-copy version of the Microsoft Instructor-Led Courseware will be presented with a copy of this agreement and each End User will agree that their use of the Microsoft Instructor-Led Courseware will be subject to the terms in this agreement prior to providing them with the Microsoft Instructor-Led Courseware. Each individual will be required to denote their acceptance of this agreement in a manner that is enforceable under local law prior to their accessing the Microsoft Instructor-Led Courseware,

vi. you will ensure that each Trainer teaching an Authorized Training Session has their own valid licensed copy of the Trainer Content that is the subject of the Authorized Training Session,

vii. you will only use qualified Trainers who have in-depth knowledge of and experience with the Microsoft technology that is the subject of the Microsoft Instructor-Led Courseware being taught for all your Authorized Training Sessions,

viii. you will only deliver a maximum of 15 hours of training per week for each Authorized Training Session that uses a MOC title, and

ix. you acknowledge that Trainers that are not MCTs will not have access to all of the trainer resources for the Microsoft Instructor-Led Courseware.
b. If you are a Microsoft Learning Competency Member:

i. Each license acquired on behalf of yourself may only be used to review one (1) copy of the Microsoft Instructor-Led Courseware in the form provided to you. If the Microsoft Instructor-Led Courseware is in digital format, you may install one (1) copy on up to three (3) Personal Devices. You may not install the Microsoft Instructor-Led Courseware on a device you do not own or control.

ii. For each license you acquire on behalf of an End User or Trainer, you may either:

1. distribute one (1) hard copy version of the Microsoft Instructor-Led Courseware to one (1) End User attending the Authorized Training Session and only immediately prior to the commencement of the Authorized Training Session that is the subject matter of the Microsoft Instructor-Led Courseware provided, or

2. provide one (1) End User attending the Authorized Training Session with the unique redemption code and instructions on how they can access one (1) digital version of the Microsoft Instructor-Led Courseware, or

3. you will provide one (1) Trainer with the unique redemption code and instructions on how they can access one (1) Trainer Content,

provided you comply with the following:

iii. you will only provide access to the Licensed Content to those individuals who have acquired a valid license to the Licensed Content,

iv. you will ensure that each End User attending an Authorized Training Session has their own valid licensed copy of the Microsoft Instructor-Led Courseware that is the subject of the Authorized Training Session,

v. you will ensure that each End User provided with a hard-copy version of the Microsoft Instructor-Led Courseware will be presented with a copy of this agreement and each End User will agree that their use of the Microsoft Instructor-Led Courseware will be subject to the terms in this agreement prior to providing them with the Microsoft Instructor-Led Courseware. Each individual will be required to denote their acceptance of this agreement in a manner that is enforceable under local law prior to their accessing the Microsoft Instructor-Led Courseware,

vi. you will ensure that each Trainer teaching an Authorized Training Session has their own valid licensed copy of the Trainer Content that is the subject of the Authorized Training Session,

vii. you will only use qualified Trainers who hold the applicable Microsoft Certification credential that is the subject of the Microsoft Instructor-Led Courseware being taught for your Authorized Training Sessions,

viii. you will only use qualified MCTs who also hold the applicable Microsoft Certification credential that is the subject of the MOC title being taught for all your Authorized Training Sessions using MOC,

ix. you will only provide access to the Microsoft Instructor-Led Courseware to End Users, and

x. you will only provide access to the Trainer Content to Trainers.
c. If you are a MPN Member:

i. Each license acquired on behalf of yourself may only be used to review one (1) copy of the Microsoft Instructor-Led Courseware in the form provided to you. If the Microsoft Instructor-Led Courseware is in digital format, you may install one (1) copy on up to three (3) Personal Devices. You may not install the Microsoft Instructor-Led Courseware on a device you do not own or control.

ii. For each license you acquire on behalf of an End User or Trainer, you may either:

1. distribute one (1) hard copy version of the Microsoft Instructor-Led Courseware to one (1) End User attending the Private Training Session, and only immediately prior to the commencement of the Private Training Session that is the subject matter of the Microsoft Instructor-Led Courseware being provided, or

2. provide one (1) End User who is attending the Private Training Session with the unique redemption code and instructions on how they can access one (1) digital version of the Microsoft Instructor-Led Courseware, or

3. you will provide one (1) Trainer who is teaching the Private Training Session with the unique redemption code and instructions on how they can access one (1) Trainer Content,

provided you comply with the following:

iii. you will only provide access to the Licensed Content to those individuals who have acquired a valid license to the Licensed Content,

iv. you will ensure that each End User attending a Private Training Session has their own valid licensed copy of the Microsoft Instructor-Led Courseware that is the subject of the Private Training Session,

v. you will ensure that each End User provided with a hard copy version of the Microsoft Instructor-Led Courseware will be presented with a copy of this agreement and each End User will agree that their use of the Microsoft Instructor-Led Courseware will be subject to the terms in this agreement prior to providing them with the Microsoft Instructor-Led Courseware. Each individual will be required to denote their acceptance of this agreement in a manner that is enforceable under local law prior to their accessing the Microsoft Instructor-Led Courseware,

vi. you will ensure that each Trainer teaching a Private Training Session has their own valid licensed copy of the Trainer Content that is the subject of the Private Training Session,

vii. you will only use qualified Trainers who hold the applicable Microsoft Certification credential that is the subject of the Microsoft Instructor-Led Courseware being taught for all your Private Training Sessions,

viii. you will only use qualified MCTs who hold the applicable Microsoft Certification credential that is the subject of the MOC title being taught for all your Private Training Sessions using MOC,

ix. you will only provide access to the Microsoft Instructor-Led Courseware to End Users, and

x. you will only provide access to the Trainer Content to Trainers.

d. If you are an End User: For each license you acquire, you may use the Microsoft Instructor-Led Courseware solely for your personal training use. If the Microsoft Instructor-Led Courseware is in digital format, you may access the Microsoft Instructor-Led Courseware online using the unique redemption code provided to you by the training provider and install and use one (1) copy of the Microsoft Instructor-Led Courseware on up to three (3) Personal Devices. You may also print one (1) copy of the Microsoft Instructor-Led Courseware. You may not install the Microsoft Instructor-Led Courseware on a device you do not own or control.

e. If you are a Trainer:

i. For each license you acquire, you may install and use one (1) copy of the Trainer Content in the form provided to you on one (1) Personal Device solely to prepare and deliver an Authorized Training Session or Private Training Session, and install one (1) additional copy on another Personal Device as a backup copy, which may be used only to reinstall the Trainer Content. You may not install or use a copy of the Trainer Content on a device you do not own or control. You may also print one (1) copy of the Trainer Content solely to prepare for and deliver an Authorized Training Session or Private Training Session.
ii. You may customize the written portions of the Trainer Content that are logically associated with instruction of a training session in accordance with the most recent version of the MCT agreement. If you elect to exercise the foregoing rights, you agree to comply with the following: (i) customizations may only be used for teaching Authorized Training Sessions and Private Training Sessions, and (ii) all customizations will comply with this agreement. For clarity, any use of “customize” refers only to changing the order of slides and content, and/or not using all the slides or content; it does not mean changing or modifying any slide or content.
2.2 Separation of Components. The Licensed Content is licensed as a single unit and you may not separate its components and install them on different devices.
2.3 Redistribution of Licensed Content. Except as expressly provided in the use rights above, you may not distribute any Licensed Content or any portion thereof (including any permitted modifications) to any third parties without the express written permission of Microsoft.

2.4 Third Party Notices. The Licensed Content may include third party content that Microsoft, not the third party, licenses to you under this agreement. Notices, if any, for the third party content are included for your information only.

2.5 Additional Terms. Some Licensed Content may contain components with additional terms, conditions, and licenses regarding its use. Any non-conflicting terms in those conditions and licenses also apply to your use of that respective component and supplements the terms described in this agreement.

3. LICENSED CONTENT BASED ON PRE-RELEASE TECHNOLOGY. If the Licensed Content’s subject matter is based on a pre-release version of Microsoft technology (“Pre-release”), then in addition to the other provisions in this agreement, these terms also apply:
a. Pre-Release Licensed Content. This Licensed Content subject matter is on the Pre-release version of the Microsoft technology. The technology may not work the way a final version of the technology will and we may change the technology for the final version. We also may not release a final version. Licensed Content based on the final version of the technology may not contain the same information as the Licensed Content based on the Pre-release version. Microsoft is under no obligation to provide you with any further content, including any Licensed Content based on the final version of the technology.

b. Feedback. If you agree to give feedback about the Licensed Content to Microsoft, either directly or through its third party designee, you give to Microsoft without charge, the right to use, share and commercialize your feedback in any way and for any purpose. You also give to third parties, without charge, any patent rights needed for their products, technologies and services to use or interface with any specific parts of a Microsoft technology, Microsoft product, or service that includes the feedback. You will not give feedback that is subject to a license that requires Microsoft to license its technology, technologies, or products to third parties because we include your feedback in them. These rights survive this agreement.

c. Pre-release Term. If you are a Microsoft IT Academy Program Member, Microsoft Learning Competency Member, MPN Member or Trainer, you will cease using all copies of the Licensed Content on the Pre-release technology upon (i) the date which Microsoft informs you is the end date for using the Licensed Content on the Pre-release technology, or (ii) sixty (60) days after the commercial release of the technology that is the subject of the Licensed Content, whichever is earliest (“Pre-release term”). Upon expiration or termination of the Pre-release term, you will irretrievably delete and destroy all copies of the Licensed Content in your possession or under your control.
4. SCOPE OF LICENSE. The Licensed Content is licensed, not sold. This agreement only gives you some rights to use the Licensed Content. Microsoft reserves all other rights. Unless applicable law gives you more rights despite this limitation, you may use the Licensed Content only as expressly permitted in this agreement. In doing so, you must comply with any technical limitations in the Licensed Content that only allow you to use it in certain ways. Except as expressly permitted in this agreement, you may not:

• access or allow any individual to access the Licensed Content if they have not acquired a valid license for the Licensed Content,
• alter, remove or obscure any copyright or other protective notices (including watermarks), branding or identifications contained in the Licensed Content,
• modify or create a derivative work of any Licensed Content,
• publicly display, or make the Licensed Content available for others to access or use,
• copy, print, install, sell, publish, transmit, lend, adapt, reuse, link to or post, make available or distribute the Licensed Content to any third party,
• work around any technical limitations in the Licensed Content, or
• reverse engineer, decompile, remove or otherwise thwart any protections or disassemble the Licensed Content except and only to the extent that applicable law expressly permits, despite this limitation.
5. RESERVATION OF RIGHTS AND OWNERSHIP. Microsoft reserves all rights not expressly granted to you in this agreement. The Licensed Content is protected by copyright and other intellectual property laws and treaties. Microsoft or its suppliers own the title, copyright, and other intellectual property rights in the Licensed Content.

6. EXPORT RESTRICTIONS. The Licensed Content is subject to United States export laws and regulations. You must comply with all domestic and international export laws and regulations that apply to the Licensed Content. These laws include restrictions on destinations, end users and end use. For additional information, see www.microsoft.com/exporting.
7. SUPPORT SERVICES. Because the Licensed Content is “as is”, we may not provide support services for it.
8. TERMINATION. Without prejudice to any other rights, Microsoft may terminate this agreement if you fail to comply with the terms and conditions of this agreement. Upon termination of this agreement for any reason, you will immediately stop all use of and delete and destroy all copies of the Licensed Content in your possession or under your control.
9. LINKS TO THIRD PARTY SITES. You may link to third party sites through the use of the Licensed Content. The third party sites are not under the control of Microsoft, and Microsoft is not responsible for the contents of any third party sites, any links contained in third party sites, or any changes or updates to third party sites. Microsoft is not responsible for webcasting or any other form of transmission received from any third party sites. Microsoft is providing these links to third party sites to you only as a convenience, and the inclusion of any link does not imply an endorsement by Microsoft of the third party site.
10. ENTIRE AGREEMENT. This agreement, and any additional terms for the Trainer Content, updates and supplements are the entire agreement for the Licensed Content, updates and supplements.
11. APPLICABLE LAW.

a. United States. If you acquired the Licensed Content in the United States, Washington state law governs the interpretation of this agreement and applies to claims for breach of it, regardless of conflict of laws principles. The laws of the state where you live govern all other claims, including claims under state consumer protection laws, unfair competition laws, and in tort.
b. Outside the United States. If you acquired the Licensed Content in any other country, the laws of that country apply.

12. LEGAL EFFECT. This agreement describes certain legal rights. You may have other rights under the laws of your country. You may also have rights with respect to the party from whom you acquired the Licensed Content. This agreement does not change your rights under the laws of your country if the laws of your country do not permit it to do so.
13. DISCLAIMER OF WARRANTY. THE LICENSED CONTENT IS LICENSED "AS-IS" AND "AS AVAILABLE." YOU BEAR THE RISK OF USING IT. MICROSOFT AND ITS RESPECTIVE AFFILIATES GIVES NO EXPRESS WARRANTIES, GUARANTEES, OR CONDITIONS. YOU MAY HAVE ADDITIONAL CONSUMER RIGHTS UNDER YOUR LOCAL LAWS WHICH THIS AGREEMENT CANNOT CHANGE. TO THE EXTENT PERMITTED UNDER YOUR LOCAL LAWS, MICROSOFT AND ITS RESPECTIVE AFFILIATES EXCLUDES ANY IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.
14. LIMITATION ON AND EXCLUSION OF REMEDIES AND DAMAGES. YOU CAN RECOVER FROM MICROSOFT, ITS RESPECTIVE AFFILIATES AND ITS SUPPLIERS ONLY DIRECT DAMAGES UP TO US$5.00. YOU CANNOT RECOVER ANY OTHER DAMAGES, INCLUDING CONSEQUENTIAL, LOST PROFITS, SPECIAL, INDIRECT OR INCIDENTAL DAMAGES.

This limitation applies to:

• anything related to the Licensed Content, services, content (including code) on third party Internet sites or third-party programs; and
• claims for breach of contract, breach of warranty, guarantee or condition, strict liability, negligence, or other tort to the extent permitted by applicable law.

It also applies even if Microsoft knew or should have known about the possibility of the damages. The above limitation or exclusion may not apply to you because your country may not allow the exclusion or limitation of incidental, consequential or other damages.
Please note: As this Licensed Content is distributed in Quebec, Canada, some of the clauses in this agreement are provided below in French.

Remarque : Ce contenu sous licence étant distribué au Québec, Canada, certaines des clauses dans ce contrat sont fournies ci-dessous en français.
EXONÉRATION DE GARANTIE. Le contenu sous licence visé par une licence est offert « tel quel ». Toute utilisation de ce contenu sous licence est à votre seule risque et péril. Microsoft n’accorde aucune autre garantie expresse. Vous pouvez bénéficier de droits additionnels en vertu du droit local sur la protection des consommateurs, que ce contrat ne peut modifier. Là où elles sont permises par le droit local, les garanties implicites de qualité marchande, d’adéquation à un usage particulier et d’absence de contrefaçon sont exclues.
LIMITATION DES DOMMAGES-INTÉRÊTS ET EXCLUSION DE RESPONSABILITÉ POUR LES DOMMAGES. Vous pouvez obtenir de Microsoft et de ses fournisseurs une indemnisation en cas de dommages directs uniquement à hauteur de 5,00 $ US. Vous ne pouvez prétendre à aucune indemnisation pour les autres dommages, y compris les dommages spéciaux, indirects ou accessoires et pertes de bénéfices.

Cette limitation concerne :

• tout ce qui est relié au contenu sous licence, aux services ou au contenu (y compris le code) figurant sur des sites Internet tiers ou dans des programmes tiers ; et
• les réclamations au titre de violation de contrat ou de garantie, ou au titre de responsabilité stricte, de négligence ou d’une autre faute dans la limite autorisée par la loi en vigueur.
Elle s’applique également, même si Microsoft connaissait ou devrait connaître l’éventualité d’un tel dommage. Si votre pays n’autorise pas l’exclusion ou la limitation de responsabilité pour les dommages indirects, accessoires ou de quelque nature que ce soit, il se peut que la limitation ou l’exclusion ci-dessus ne s’appliquera pas à votre égard.
EFFET JURIDIQUE. Le présent contrat décrit certains droits juridiques. Vous pourriez avoir d’autres droits prévus par les lois de votre pays. Le présent contrat ne modifie pas les droits que vous confèrent les lois de votre pays si celles-ci ne le permettent pas.

Revised July 2013
Acknowledgments
Microsoft Learning would like to acknowledge and thank the following for their contribution towards developing this title. Their effort at various stages in the development has ensured that you have a good classroom experience.
Design and Development
This course was designed and developed by Content Master, a division of CM Group Ltd. Content Master is a global provider of technical content and learning services.
Graeme Malcolm – Lead Content Developer
Graeme Malcolm is a Microsoft SQL Server subject matter expert and professional content developer at Content Master—a division of CM Group Ltd. As a Microsoft Certified Trainer, Graeme has delivered training courses on SQL Server since version 4.2; as an author, Graeme has written numerous books, articles, and training courses on SQL Server; and as a consultant, Graeme has designed and implemented business solutions based on SQL Server for customers all over the world.
Lin Joyner – Contributing Content Developer
Lin is an experienced SQL Server developer and administrator, having worked with SQL Server since version 6.0. She designs and writes SQL Server and .NET development training materials. Prior to joining Content Master, Lin was a professional trainer for five years, during which she held the MCT and MCSD certifications.
Contents

Module 1: Introduction to SQL Server 2014 Database Administration
    Module Overview 1-1
    Lesson 1: Database Administration Overview 1-2
    Lesson 2: Introduction to the SQL Server Platform 1-5
    Lesson 3: Database Management Tools and Techniques 1-11
    Lab: Using SQL Server Administrative Tools 1-16
    Module Review and Takeaways 1-21

Module 2: Installing and Configuring SQL Server 2014
    Module Overview 2-1
    Lesson 1: Planning SQL Server Installation 2-2
    Lesson 2: Installing SQL Server 2014 2-10
    Lesson 3: Post-Installation Configuration 2-15
    Lab: Installing SQL Server 2014 2-18
    Module Review and Takeaways 2-21

Module 3: Working with Databases and Storage
    Module Overview 3-1
    Lesson 1: Introduction to Data Storage with SQL Server 3-2
    Lesson 2: Managing Storage for System Databases 3-8
    Lesson 3: Managing Storage for User Databases 3-12
    Lesson 4: Moving Database Files 3-21
    Lesson 5: Configuring the Buffer Pool Extension 3-24
    Lab: Managing Database Storage 3-28
    Module Review and Takeaways 3-32

Module 4: Planning and Implementing a Backup Strategy
    Module Overview 4-1
    Lesson 1: Understanding SQL Server Recovery Models 4-2
    Lesson 2: Planning a Backup Strategy 4-8
    Lesson 3: Backing up Databases and Transaction Logs 4-15
    Lesson 4: Using Backup Options 4-24
    Lesson 5: Ensuring Backup Reliability 4-29
    Lab: Backing Up Databases 4-35
    Module Review and Takeaways 4-41

Module 5: Restoring SQL Server 2014 Databases
    Module Overview 5-1
    Lesson 1: Understanding the Restore Process 5-2
    Lesson 2: Restoring Databases 5-6
    Lesson 3: Advanced Restore Scenarios 5-11
    Lesson 4: Point-in-Time Recovery 5-17
    Lab: Restoring SQL Server Databases 5-21
    Module Review and Takeaways 5-25

Module 6: Importing and Exporting Data
    Module Overview 6-1
    Lesson 1: Introduction to Transferring Data 6-2
    Lesson 2: Importing and Exporting Data 6-9
    Lesson 3: Copying or Moving a Database 6-17
    Lab: Importing and Exporting Data 6-22
    Module Review and Takeaways 6-26

Module 7: Monitoring SQL Server 2014
    Module Overview 7-1
    Lesson 1: Introduction to Monitoring SQL Server 7-2
    Lesson 2: Dynamic Management Views and Functions 7-7
    Lesson 3: Performance Monitor 7-11
    Lab: Monitoring SQL Server 2014 7-15
    Module Review and Takeaways 7-18

Module 8: Tracing SQL Server Activity
    Module Overview 8-1
    Lesson 1: Tracing SQL Server Workload Activity 8-2
    Lesson 2: Using Traces 8-9
    Lab: Tracing SQL Server Workload Activity 8-18
    Module Review and Takeaways 8-22

Module 9: Managing SQL Server Security
    Module Overview 9-1
    Lesson 1: Introduction to SQL Server Security 9-2
    Lesson 2: Managing Server-Level Security 9-9
    Lesson 3: Managing Database-Level Principals 9-18
    Lesson 4: Managing Database Permissions 9-28
    Lab: Managing SQL Server Security 9-36
    Module Review and Takeaways 9-44

Module 10: Auditing Data Access and Encrypting Data
    Module Overview 10-1
    Lesson 1: Auditing Data Access in SQL Server 10-2
    Lesson 2: Implementing SQL Server Audit 10-7
    Lesson 3: Encrypting Databases 10-16
    Lab: Auditing Data Access and Encrypting Data 10-22
    Module Review and Takeaways 10-26

Module 11: Performing Ongoing Database Maintenance
    Module Overview 11-1
    Lesson 1: Ensuring Database Integrity 11-2
    Lesson 2: Maintaining Indexes 11-7
    Lesson 3: Automating Routine Database Maintenance 11-14
    Lab: Performing Ongoing Database Maintenance 11-17
    Module Review and Takeaways 11-20

Module 12: Automating SQL Server 2014 Management
    Module Overview 12-1
    Lesson 1: Automating SQL Server Management 12-2
    Lesson 2: Implementing SQL Server Agent Jobs 12-5
    Lesson 3: Managing SQL Server Agent Jobs 12-11
    Lesson 4: Managing Job Step Security Contexts 12-15
    Lesson 5: Managing Jobs on Multiple Servers 12-20
    Lab: Automating SQL Server Management 12-25
    Module Review and Takeaways 12-28

Module 13: Monitoring SQL Server 2014 with Notifications and Alerts
    Module Overview 13-1
    Lesson 1: Monitoring SQL Server Errors 13-2
    Lesson 2: Configuring Database Mail 13-6
    Lesson 3: Configuring Operators, Notifications, and Alerts 13-11
    Lab: Using Notifications and Alerts 13-17
    Module Review and Takeaways 13-20

Lab Answer Keys
    Module 1 Lab: Using SQL Server Administrative Tools L01-1
    Module 2 Lab: Installing SQL Server 2014 L02-1
    Module 3 Lab: Managing Database Storage L03-1
    Module 4 Lab: Backing Up Databases L04-1
    Module 5 Lab: Restoring SQL Server Databases L05-1
    Module 6 Lab: Importing and Exporting Data L06-1
    Module 7 Lab: Monitoring SQL Server 2014 L07-1
    Module 8 Lab: Tracing SQL Server Workload Activity L08-1
    Module 9 Lab: Managing SQL Server Security L09-1
    Module 10 Lab: Auditing Data Access and Encrypting Data L10-1
    Module 11 Lab: Performing Ongoing Database Maintenance L11-1
    Module 12 Lab: Automating SQL Server Management L12-1
    Module 13 Lab: Using Notifications and Alerts L13-1
About This Course
This section provides you with a brief description of the course, audience, required prerequisites, and course objectives.
Course Description
This five-day instructor-led course provides students with the knowledge and skills to maintain a Microsoft SQL Server 2014 database. The course focuses on teaching individuals how to use SQL Server 2014 product features and tools related to maintaining a database.
Audience
The primary audience for this course is individuals who administer and maintain SQL Server databases. These individuals perform database administration and maintenance as their primary area of responsibility, or work in environments where databases play a key role in their primary job.
The secondary audience for this course is individuals who develop applications that deliver content from SQL Server databases.
Student Prerequisites

This course requires that you meet the following prerequisites:
Basic knowledge of the Microsoft Windows operating system and its core functionality.
Working knowledge of Transact-SQL.
Working knowledge of relational databases.
Some experience with database design.
Students who attend this training can meet the prerequisites by attending the following courses, or obtaining equivalent knowledge and skills:
20461C: Querying Microsoft SQL Server
Course Objectives

After completing this course, students will be able to:
Describe core database administration tasks and tools.
Install and configure SQL Server 2014.
Configure SQL Server databases and storage.
Plan and implement a backup strategy.
Restore databases from backups.
Import and export data.
Monitor SQL Server.
Trace SQL Server activity.
Manage SQL Server security.
Audit data access and encrypt data.
Perform ongoing database maintenance.
Automate SQL Server maintenance with SQL Server Agent Jobs.
Configure Database Mail, alerts, and notifications.
Course Outline

This section provides an outline of the course:

Module 1: Introduction to SQL Server 2014 Database Administration
Module 2: Installing and Configuring SQL Server 2014
Module 3: Working with Databases and Storage
Module 4: Planning and Implementing a Backup Strategy
Module 5: Restoring SQL Server 2014 Databases
Module 6: Importing and Exporting Data
Module 7: Monitoring SQL Server 2014
Module 8: Tracing SQL Server Activity
Module 9: Managing SQL Server Security
Module 10: Auditing Data Access and Encrypting Data
Module 11: Performing Ongoing Database Maintenance
Module 12: Automating SQL Server 2014 Management
Module 13: Monitoring SQL Server 2014 with Notifications and Alerts
Course Materials
The following materials are included with your kit:
Course Handbook. A succinct classroom learning guide that provides all the critical technical information in a crisp, tightly-focused format, which is just right for an effective in-class learning experience.
Lessons: Guide you through the learning objectives and provide the key points that are critical to the success of the in-class learning experience.
Labs: Provide a real-world, hands-on platform for you to apply the knowledge and skills learned in the module.
Module Reviews and Takeaways: Provide improved on-the-job reference material to boost knowledge and skills retention.
Lab Answer Keys: Provide step-by-step lab solution guidance at your fingertips when it’s needed.
Lessons: Include detailed information for each topic, expanding on the content in the Course Handbook.
Labs: Include complete lab exercise information and answer keys in digital form to use during lab time.
Resources: Include well-categorized additional resources that give you immediate access to the most up-to-date premium content on TechNet, MSDN, and Microsoft Press.
Student Course Files: Include the Allfiles.exe, a self-extracting executable file that contains all the files required for the labs and demonstrations.
Course evaluation. At the end of the course, you will have the opportunity to complete an online evaluation to provide feedback on the course, training facility, and instructor.
To provide additional comments or feedback on the course, send e-mail to [email protected]. To inquire about the Microsoft Certification Program, send e-mail to [email protected].
Virtual Machine Environment
This section provides the information for setting up the classroom environment to support the business scenario of the course.
Virtual Machine Configuration

In this course, you will use Microsoft Hyper-V to perform the labs. The following table shows the role of each virtual machine used in this course:

Virtual machine: Role
20462C-MIA-DC: Domain Controller
20462C-MIA-SQL: SQL Server VM
Software Configuration

The following software is installed on each VM:
SQL Server 2014 (on the SQL Server VM)
Course Files
There are files associated with the labs in this course. The lab files are located in the folder D:\Labfiles on the student computers.
Classroom Setup

Each classroom computer will have the same virtual machine configured in the same way.
Course Hardware Level
To ensure a satisfactory student experience, Microsoft Learning requires a minimum equipment configuration for trainer and student computers in all Microsoft Certified Partner for Learning Solutions (CPLS) classrooms in which Official Microsoft Learning Product courses are taught. This course requires hardware level 6+.
Processor: Intel Virtualization Technology (Intel VT) or AMD Virtualization (AMD-V)
Hard Disk: Dual 120 GB hard disks, 7200 RPM SATA or better (striped)
RAM: 12 GB or higher
DVD/CD: DVD drive
Network adapter with Internet connectivity
Video Adapter/Monitor: 17-inch Super VGA (SVGA)
Microsoft Mouse or compatible pointing device
Sound card with amplified speakers
In addition, the instructor computer must be connected to a projection display device that supports SVGA 1024 x 768 pixels, 16-bit color.

Note: For the best classroom experience, a computer with solid-state disks (SSDs) is recommended. For optimal performance, adapt the setup instructions to install the 20462C-MIA-SQL virtual machine on a different physical disk from the other virtual machines to reduce disk contention.
Module 1: Introduction to SQL Server 2014 Database Administration

Contents:
    Module Overview 1-1
    Lesson 1: Database Administration Overview 1-2
    Lesson 2: Introduction to the SQL Server Platform 1-5
    Lesson 3: Database Management Tools and Techniques 1-11
    Lab: Using SQL Server Administrative Tools 1-16
    Module Review and Takeaways 1-21
Module Overview

This module introduces the Microsoft® SQL Server® 2014 platform. It describes the components, editions, and versions of SQL Server 2014, and the tasks that a database administrator commonly performs for a SQL Server instance.
Objectives

After completing this module, you will be able to:
Describe the SQL Server platform.
Describe common database administration tasks.
Use SQL Server administration tools.
Lesson 1
Database Administration Overview
Most organizations use software applications to manage business processes and activities, and these applications generally store data in a database. Organizations are increasingly reliant on applications and the data they store, and often databases are a “mission-critical” component of a business’s information technology (IT) infrastructure.
The role of a database administrator (DBA) includes a wide range of responsibilities and tasks that ensure that the databases an organization relies on are maintained and kept at optimum efficiency. This lesson describes some of these responsibilities and tasks, which will be explored in greater detail throughout the rest of this course.
Lesson Objectives

After completing this lesson, you will be able to:
Describe common characteristics of a database administrator.
Describe common database administration tasks.
Describe the importance of documentation in a database solution.
What Makes a Good Database Administrator?

While this course focuses on performing database maintenance tasks for a SQL Server 2014 database, it is important to consider the characteristics of a successful DBA. There are many thousands of DBAs working successfully throughout the world, each with their own experience and personality, but some common factors that contribute to success include:
Technological knowledge and skills. A good DBA not only requires in-depth knowledge of the database platform used to host the database, but also needs to be familiar with host operating system configuration, storage devices, and networking.
Business-awareness. While a DBA is a technical role, a good DBA typically understands the business context within which the database operates, and its role in supporting the business.
Organizational skills. Database systems can be complex, with a lot of components and subsystems to manage. Some tasks need to be performed at specific times, and a good DBA must keep track of these tasks while also responding to unexpected issues as they arise.
Ability to prioritize. When unexpected problems affect a database, application users and business stakeholders typically make demands on the DBA to resolve the situation based on their individual requirements. A good DBA must prioritize the resolution of issues based on factors such as service-level agreements (SLAs) with the business for database services, the number of users and systems affected, and the degree to which the problems are affecting ongoing operations.
Common Database Administration Tasks

Depending on the organization, a single DBA might be responsible for managing multiple database servers and databases, or multiple DBAs might each take responsibility for a specific application, database server, business unit, or geographic location. Regardless of how database management responsibility is apportioned, common tasks that a DBA must perform include:
Provisioning database servers and databases. This can involve installing and configuring instances of SQL Server on physical or virtual servers, or creating new virtual machines based on template images. It can also involve creating databases and allocating their data and log files to appropriate storage devices.
Maintaining database files and objects. After a database has been created and populated with data in tables and indexes, it requires ongoing maintenance to ensure it continues to perform optimally. This involves reducing any fragmentation that occurs in data files as records are added and deleted, ensuring that data files are kept at an appropriate size, and ensuring that the logical and physical data structures remain consistent.
Managing recovery in the event of database failure. Databases are often critical to business operations, and a core responsibility for a DBA is to plan an appropriate backup and recovery strategy for each database, ensure backups are performed, and restore the database in the event of a failure.
Importing and exporting data. Data is often transferred between systems, so DBAs often need to extract data from, or import data to, databases.
Applying security to data. An organization’s database servers often contain its most valuable asset: the data that enables the business to operate. Security breaches can be costly and time-consuming to trace and repair, and damaging to customer trust and confidence. A DBA must implement security policies that enable users to access the data they need, while ensuring that the business meets its legal compliance obligations, protects its assets, and mitigates the risks associated with security breaches.
Monitoring and troubleshooting database systems. Many database administration operations are reactive, in the sense that they involve taking action to troubleshoot and remediate a problem that has been identified. Successful DBAs also undertake a proactive approach, in which they monitor systems against an established baseline to try to detect potential problems before they impact data operations.
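Many of these tasks come down to short Transact-SQL commands. As a minimal illustration of the backup task described above, a full database backup might look like the following; the database name and backup path are purely illustrative:

-- Minimal full backup of a user database. WITH CHECKSUM verifies page
-- checksums as the backup is written; INIT overwrites the existing media set.
BACKUP DATABASE SalesDB
TO DISK = N'D:\Backups\SalesDB.bak'
WITH CHECKSUM, INIT;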
Documenting Database Management Procedures

One of the key attributes of a successful DBA is the ability to be organized. Most DBAs are familiar with the systems they manage, and the tasks that must be performed on a day-to-day basis. However, even the best DBAs do not rely purely on their memory. DBAs commonly compile and maintain documentation, often referred to as a “run book”, that includes information such as:
Configuration settings and file locations.
Personnel contact details.
Standard maintenance procedures and schedules.
Disaster recovery procedures.
While it may be unexciting, maintaining documentation for the database system is an important part of database administration. A detailed run book can be invaluable when a new DBA must take over responsibility for managing a database, or when an unexpected emergency occurs and the DBA is not present to deal with it. Even when the DBA is available to respond to a disaster such as the failure of a database server, having clearly documented steps to recover the database reduces the sense of panic and pressure, and enables a faster resolution of the problem.
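Much of the configuration information in a run book can be generated from the server itself. For example, the following query lists the logical name and physical location of every data and log file on the instance, a useful starting point for documenting file locations:

-- List every data and log file on the instance for run book documentation.
SELECT DB_NAME(database_id) AS database_name,
       name AS logical_name,
       type_desc AS file_type,
       physical_name
FROM sys.master_files
ORDER BY database_name;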
Lesson 2
Introduction to the SQL Server Platform
As a DBA, it is important to be familiar with the database management system used to store your data. SQL Server is a platform for developing business applications that are data focused. Rather than being a single monolithic application, SQL Server is structured as a series of components. It is important to understand the use of each of these.
You can install more than one copy of SQL Server on a server. Each of these is called an instance and can be configured and managed independently.
SQL Server ships in a variety of editions, each with a different set of capabilities for different scenarios. It is important to understand the target business cases for each of the SQL Server editions and how the evolution through a series of improving versions over many years results in today’s stable and robust platform.
Lesson Objectives

After completing this lesson, you will be able to:
Explain the role of each component that makes up the SQL Server platform.
Describe the functionality that SQL Server instances provide.
Explain the available SQL Server editions.
Explain how SQL Server has evolved.
SQL Server Components

SQL Server is a very good relational database engine, but it offers more than just that. It is a complete data platform comprising many components:

Database Engine: The SQL Server database engine is the heart of the SQL Server platform. It provides a high-performance, scalable relational database engine based on the SQL language that can be used to host Online Transaction Processing (OLTP) databases for business applications and data warehouse solutions. SQL Server 2014 also includes a memory-optimized database engine, which uses in-memory technology to improve performance for short-running transactions.

Analysis Services: SQL Server Analysis Services (SSAS) is an online analytical processing (OLAP) engine that works with analytic cubes and tables. It is used to implement enterprise BI solutions for data analysis and data mining.

Integration Services: SQL Server Integration Services (SSIS) is an extract, transform, and load (ETL) platform tool for orchestrating the movement of data in both directions between SQL Server components and external systems.

Reporting Services: SQL Server Reporting Services (SSRS) is a reporting engine based on web services, providing a web portal and end-user reporting tools. It can be installed in native mode, or integrated with Microsoft SharePoint Server.

Master Data Services: SQL Server Master Data Services (MDS) provides tooling and a hub for managing master or reference data.

Data Quality Services: SQL Server Data Quality Services (DQS) is a knowledge-driven data quality tool for data cleansing and matching.

StreamInsight: SQL Server StreamInsight provides a platform for building applications that perform complex event processing for streams of real-time data.

Full-Text Search: Full-Text Search is a feature of the database engine that provides a sophisticated semantic search facility for text-based data.

Replication: The SQL Server database engine includes Replication, a set of technologies for synchronizing data between servers to meet data distribution needs.

PowerPivot for SharePoint Server: PowerPivot for SharePoint is a specialized implementation of SQL Server Analysis Services that can be installed in a Microsoft SharePoint Server farm to enable tabular data modeling in shared Microsoft Excel workbooks. PowerPivot is also available natively in Excel.

Power View for SharePoint Server: Power View for SharePoint is a component of SQL Server Reporting Services when installed in SharePoint-Integrated mode. It provides an interactive data exploration, visualization, and presentation experience that encourages intuitive, impromptu reporting. Power View is also available natively in Excel.
SQL Server Instances

It is sometimes useful to install more than one copy of a SQL Server component on a single server. Many SQL Server components can be installed more than once as separate instances.

The ability to install multiple instances of SQL Server components on a single server is useful in a number of situations:
You may want to have different administrators or security environments for sets of databases. You can manage and secure each instance of SQL Server separately.
Some of your applications may require server configurations that are inconsistent or incompatible with the server requirements of other applications. You can configure each instance of SQL Server independently.
Your application databases might need different levels of service, particularly in relation to availability. You can use SQL Server instances to separate workloads with differing service level agreements (SLAs).
You might need to support different versions or editions of SQL Server.
Your applications might require different server-level collations. Although each database can have different collations, an application might be dependent on the collation of the tempdb database when the application is using temporary objects.
Different versions of SQL Server can also be installed side-by-side using multiple instances. This can assist when testing upgrade scenarios or performing upgrades.
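A quick way to see the collation dependency mentioned above is to compare a user database’s collation with that of tempdb, which inherits the server-level collation. This is a minimal sketch; the user database name is illustrative:

-- Compare the collation of a (hypothetical) user database with tempdb.
-- Applications that use temporary objects depend on the tempdb collation.
SELECT DATABASEPROPERTYEX('SalesDB', 'Collation') AS user_db_collation,
       DATABASEPROPERTYEX('tempdb', 'Collation') AS tempdb_collation;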
Default and Named Instances
Prior to SQL Server 2000, only a single copy of SQL Server could be installed on a server system. SQL Server was addressed by the name of the Windows server on which it was hosted. To maintain backward compatibility, this mode of connection is still supported and is known as the default instance. In internal configuration tools, a default instance of the database engine is named MSSQLSERVER.

Additional instances of SQL Server require an instance name that you can use in conjunction with the server name, and are known as named instances. If you want all your instances to be named instances, you do not need to install a default instance first. Not all components of SQL Server can be installed in more than one instance; a substantial change in SQL Server 2012 introduced multiple instance support for SQL Server Integration Services (SSIS). To access a named instance, client applications use the address ServerName\InstanceName. For example, a named instance called Test on a Windows server called APPSERVER1 would be addressed as APPSERVER1\Test.

There is no need to install SQL Server tools and utilities more than once on a server. You can use a single installation of the tools to manage and configure all instances.
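Once connected, you can confirm which instance is serving the connection by querying instance metadata. The following check works for both default and named instances:

-- Identify the machine and instance serving the current connection.
-- SERVERPROPERTY('InstanceName') returns NULL for a default instance.
SELECT SERVERPROPERTY('MachineName') AS machine_name,
       SERVERPROPERTY('InstanceName') AS instance_name,
       @@SERVERNAME AS server_name;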
Introduction to SQL Server 2014 Database Administration
SQL Server Editions SQL Server is available in a wide variety of editions, with different price points and levels of capability.
SQL Server Editions Each SQL Server edition is targeted to a specific business use case, as shown in the table on the next page:
Edition: Business Use Case
Parallel Data Warehouse: Uses massively parallel processing (MPP) to execute queries against vast amounts of data quickly. Parallel Data Warehouse systems are sold as a complete "appliance" rather than through standard software licenses.
Enterprise: Provides the highest levels of reliability for demanding workloads.
Business Intelligence: Adds business intelligence (BI) capabilities to the offerings from Standard edition.
Standard: Delivers a reliable, complete data management platform.
Express: Provides a free edition for lightweight web and small server-based applications.
Compact: Provides a free edition for stand-alone and occasionally connected mobile applications, optimized for a very small memory footprint.
Developer: Allows building, testing, and demonstrating all SQL Server functionality.
Web: Provides a secure, cost-effective, and scalable platform for public websites and applications.
Microsoft Azure SQL Database: Allows building and extending database applications on a cloud-based platform.
SQL Server Versions
SQL Server has been available for many years, yet it is rapidly evolving with new capabilities and features. It is a platform with a rich history of innovation achieved while maintaining strong levels of stability.
Early Versions
The earliest versions (1.0 and 1.1) were based on the OS/2 operating system. SQL Server 4.2 and later moved to the Microsoft Windows® operating system, initially on Windows NT.
Subsequent Versions
SQL Server 7.0 saw a significant rewrite of the product. Substantial advances were made in reducing the administration workload and OLAP Services (which later became Analysis Services) was introduced.
SQL Server 2000 featured support for multiple instances and collations. It also introduced support for data mining. After the product release, SQL Server Reporting Services (SSRS) was introduced as an add-on enhancement to the product, along with support for 64-bit processors. SQL Server 2005 provided another significant rewrite of many aspects of the product. Key changes included:
Support for non-relational data stored and queried as XML.
SQL Server Management Studio (SSMS) was released to replace several previous administrative tools.
SSIS replaced a former tool known as Data Transformation Services (DTS).
Another key addition was the introduction of support for objects created using the Common Language Runtime (CLR).
The T-SQL language was substantially enhanced, including structured exception handling.
Dynamic Management Views (DMVs) and Dynamic Management Functions (DMFs) were introduced to enable detailed health monitoring, performance tuning, and troubleshooting.
Substantial high availability improvements were included in the product; in particular, database mirroring was introduced.
Support for column encryption was introduced.
SQL Server 2008 also provided many enhancements:
Filestream support improved the handling of structured and semi-structured data.
Spatial data types were introduced.
Database compression and encryption technologies were added.
Specialized date- and time-related data types were introduced, including support for time zones within datetime data.
Full-text indexing was integrated directly within the database engine. (Previously full-text indexing was based on interfaces to operating system level services.)
A policy-based management framework was introduced to assist with a move to more declarative management practices, rather than reactive ones.
A Windows PowerShell® provider for SQL Server was introduced.
The enhancements and additions to the product in SQL Server 2008 R2 included:
Substantial enhancements to SSRS.
The introduction of advanced analytic capabilities with PowerPivot.
Improved multi-server management capabilities.
Support for managing reference data, provided by the introduction of Master Data Services.
The introduction of StreamInsight, which provides the ability to query data arriving at high speed, before storing it in a database.
The introduction of data-tier applications, which assist with packaging database applications as part of application development projects.
The enhancements and additions to the product in SQL Server 2012 included:
Further substantial enhancements to SSRS.
Substantial enhancements to SSIS.
The introduction of tabular data models into SQL Server Analysis Services (SSAS).
The migration of BI projects into Visual Studio 2010.
The introduction of the AlwaysOn enhancements to SQL Server High Availability.
The introduction of Data Quality Services.
Strong enhancements to the T-SQL language such as the addition of sequences, new error-handling capabilities, and new window functions.
The introduction of the FileTable.
The introduction of statistical semantic search.
Many general tooling improvements.
SQL Server 2014
SQL Server 2014 builds on the mission-critical capabilities of previous versions and provides even better performance, availability, scalability, and manageability. It provides new in-memory capabilities for OLTP and data warehousing, as well as new disaster recovery functionality through Microsoft Azure™.
Lesson 3
Database Management Tools and Techniques
A DBA for a SQL Server database has at their disposal a range of tools for managing different aspects of the database solution. It is important to be familiar with the available tools, and with the techniques you can use within them to manage a SQL Server database.
Lesson Objectives
After completing this lesson, you will be able to:
Describe common tools for managing a SQL Server database system.
Use SQL Server Management Studio to manage a database server and databases.
Run Transact-SQL statements to perform maintenance tasks.
Use the SQLCMD command line utility.
Use Windows PowerShell with SQL Server.
SQL Server Tools
SQL Server provides multiple tools that you can use to manage various aspects of the database system. These tools include:
SQL Server Management Studio (SSMS). This is the primary database management tool for SQL Server database servers. It provides a graphical user interface (GUI) and a Transact-SQL scripting interface for managing the database engine component and databases. Additionally, you can use SSMS to manage instances of SSAS, SSIS, and SSRS as well as cloud-based databases in Microsoft Azure SQL Database.
SQL Server Configuration Manager (SSCM). You can use SSCM to configure and control SQL Server services, and to manage server and client network protocols and aliases.
SQL Server Profiler. When you need to examine activity in a SQL Server database or SSAS data model, you can use SQL Server Profiler to record a trace that can be viewed or replayed. This enables you to troubleshoot problems or optimize database configuration based on actual usage patterns.
SQL Server Database Engine Tuning Advisor (DTA). A properly optimized database uses indexes and other structures to improve query performance. The DTA provides schema recommendations based on analysis of representative workloads, and can provide a useful starting point for database optimization.
SQL Server Import and Export Wizard. This graphical wizard simplifies the process of transferring data into or out of a SQL Server database.
The sqlcmd utility. Pronounced “SQL Command”, this is a command line tool that you can use to connect to a SQL Server instance and run Transact-SQL statements or scripts.
The bcp utility. BCP stands for Bulk Copy Program, and the bcp utility is a command line tool for importing and exporting data to and from SQL Server.
Additionally, SQL Server includes configuration and management tools for specific components such as Analysis Services, Reporting Services, Data Quality Services, and Master Data Services. You can also install SQL Server Data Tools (SSDT) and SQL Server Data Tools for Business Intelligence (SSDT-BI) add-ins in Microsoft Visual Studio, and use them to develop database and business intelligence (BI) solutions based on SQL Server components.
SQL Server Management Studio
SSMS is the primary tool for managing SQL Server databases. It is based on the Visual Studio shell used for software development projects, and supports the following features:
Object Explorer. This is a pane in which you can connect to SQL Server instances and manage the objects they contain. By default, when you open SSMS you are prompted to connect to a SQL Server instance, and this instance is displayed in Object Explorer. You can then connect to additional instances and view them concurrently.
Code Editor. You can manage database servers and objects using graphical interfaces (typically opened from Object Explorer), or you can enter and run Transact-SQL statements in the code editor pane. Using Transact-SQL code to perform management tasks enables you to save the commands as scripts, which can be re-executed at a later time or scheduled to run automatically. The code editor in SSMS supports IntelliSense, which provides auto-completion of statements and color-coding of keywords to improve script readability. You can also use snippets to simplify the creation of commonly used statements. SSMS also provides the ability to generate Transact-SQL code for most tasks that can be performed using graphical tools, making it easier to create reusable scripts for administrative tasks.
Solutions and Projects. You can use projects and solutions to keep related scripts, connections, and other documents together. This can make it easier to keep track of all the script files required to create and manage a database solution.
Reports. SSMS includes an extensible report interface that you can use to view detailed configuration and status information about SQL Server instances, databases, and other objects.
The sqlcmd Utility
The sqlcmd utility is a command line tool that you can use to run Transact-SQL statements or scripts in a SQL Server instance. You can use sqlcmd to automate database tasks from the command line, and to perform configuration and management tasks when SSMS is not available. In particular, you can use sqlcmd to open a dedicated administrator connection (DAC) to a server when standard connections are not possible.
Parameters
The sqlcmd utility provides parameters that you can use to configure connections and perform tasks. These parameters include:
-S server_name (connect to the specified server)
-d database_name (connect to the specified database)
-U login (log in as the specified login)
-P password (authenticate the login with the specified password)
-E (use a trusted connection for Windows authentication)
-A (open a dedicated administrator connection)
-i input_file (run the Transact-SQL code in the specified input file)
-o output_file (save the output in the specified output file)
-q "Transact-SQL query" (run the specified query)
-Q "Transact-SQL query" (run the specified query and exit)
-v var="value" (pass the specified variable to the input script)
The sqlcmd utility supports many more parameters. For a full list, enter sqlcmd -? at the command line.
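For example, the following commands show typical combinations of these parameters. The server name and file paths are illustrative placeholders:

sqlcmd -S MyServer -E -Q "SELECT @@VERSION"
sqlcmd -S MyServer -E -i C:\Scripts\Maintenance.sql -o C:\Scripts\Output.txt
sqlcmd -S MyServer -A

The first command runs a single query and exits; the second runs a script file and saves the output to a text file; the third opens a dedicated administrator connection.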
Using Transact-SQL to Perform Management Tasks
You can perform most administrative tasks in SSMS by using the graphical user interface. However, some tasks can only be performed by using Transact-SQL statements; and even if a task can be performed in a graphical interface, it is often sensible to use Transact-SQL code that can be saved as a script and re-executed later or run automatically by a scheduled job.
Note: Most of the graphical interfaces in SSMS provide a Script button that generates the equivalent Transact-SQL code to apply the options you have chosen in the UI.
Transact-SQL commands that you can use to perform management tasks include:
Explicit data definition language (DDL) statements. For example, you can use the Transact-SQL CREATE DATABASE statement to create a database, and the corresponding DROP DATABASE statement to delete a database.
System stored procedures and functions. SQL Server provides system stored procedures and functions that encapsulate common system configuration and management tasks. For example, you can use the sp_configure system stored procedure to set SQL Server instance configuration settings.
DBCC (Database Console Commands). DBCC commands are used to perform specific configuration and maintenance tasks, and to perform verification checks in a SQL Server database. For example, you can use the DBCC CHECKDB command to verify the logical and physical integrity of the objects in a database.
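To illustrate these categories, the following Transact-SQL batch shows one example of each. The database name SalesDB is a placeholder used only for this sketch:

-- DDL statement: create a database with default settings
CREATE DATABASE SalesDB;
GO
-- System stored procedure: list the current instance configuration settings
EXEC sp_configure;
GO
-- DBCC command: verify the logical and physical integrity of the database
DBCC CHECKDB (SalesDB);
GO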
Using Windows PowerShell to Manage SQL Server
Windows PowerShell is a command-line shell and scripting language designed for system administrators. It can be used to administer Windows, Microsoft server products, Microsoft Azure services, and non-Microsoft platforms such as UNIX. This gives administrators a common scripting language across servers and other devices.
Windows PowerShell Cmdlets and Modules In traditional shells, commands take the form of executable programs that range from the very simple (such as attrib.exe) to the very complex (such as netsh.exe). In contrast, Windows PowerShell cmdlets are very simple, object-based, single-function command-line tools built into the shell. You can use each cmdlet separately, and also combine them to perform complex tasks.
Cmdlets have a recognizable name format—a verb and noun separated by a dash (-), such as Get-Help, Get-Process, and Start-Service. The verb defines the action that the cmdlet performs; for example, "get" cmdlets retrieve data, "set" cmdlets establish or change data, "format" cmdlets format data, and "out" cmdlets direct the output to a specified destination. The noun specifies the object being acted upon; for example, Get-Service retrieves information about services. Cmdlets are packaged in modules, which can be installed on a computer and loaded into the PowerShell environment as required. You can use the Get-Module cmdlet to list available modules on a computer, and you can use Import-Module to load the modules you need. When you install SQL Server 2014 on a computer, the installation includes the SQLPS module, which includes cmdlets that you can use to manage SQL Server instances and objects.
Using SQLPS Module Cmdlets
The SQLPS module includes:
A SQL Server provider. This enables a simple navigation mechanism similar to file system paths. You can navigate and build paths similar to file system paths, where the drive is associated with a SQL Server management object model, and the nodes are based on the object model classes. You can then use PowerShell cmdlets such as Get-ChildItem to retrieve objects in the SQL Server object model. You can also use commands such as cd and dir to navigate the paths, similar to the way you navigate folders in a command prompt window.
A set of SQL Server cmdlets. The SQL Server cmdlets support actions such as running Transact-SQL statements with the Invoke-Sqlcmd cmdlet.
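As a brief sketch of how the provider and the cmdlets work together, the following PowerShell commands navigate to the databases on a default instance and run a query. The server name MyServer is a placeholder:

# Load the SQL Server module
Import-Module SQLPS -DisableNameChecking
# Navigate the SQL Server provider path to the databases on the default instance
Set-Location SQLServer:\SQL\MyServer\DEFAULT\Databases
# List the databases at the current path
Get-ChildItem
# Run a Transact-SQL statement by using the Invoke-Sqlcmd cmdlet
Invoke-Sqlcmd -ServerInstance "MyServer" -Query "SELECT @@VERSION"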
You can use the SQL Server 2014 Windows PowerShell components to manage instances of SQL Server 2000 or later. Instances of SQL Server 2005 must be running SP2 or later. Instances of SQL Server 2000 must be running SP4 or later. When the SQL Server 2014 Windows PowerShell components are used with earlier versions of SQL Server, they are limited to the functionality available in those versions.
PowerShell Interfaces
You can use Windows PowerShell to manage SQL Server in the following user interfaces:
The Windows PowerShell command prompt. This provides a console window in which you can run PowerShell cmdlets.
The Windows PowerShell interactive scripting environment (ISE). This provides a PowerShell script development environment that supports IntelliSense and other features to simplify script development.
SQL Server PowerShell. SQL Server includes a PowerShell console named SQLPS.exe in which the SQLPS module is pre-loaded. You can open this console from within SQL Server Management Studio and use it to run cmdlets interactively.
Lab: Using SQL Server Administrative Tools
Scenario
As a new database administrator at Adventure Works Cycles, you plan to familiarize yourself with the SQL Server instance and tools that you will use to manage databases.
Objectives
After completing this lab, you will be able to:
Use SQL Server Management Studio.
Use the sqlcmd utility.
Use Windows PowerShell with SQL Server.
Estimated Time: 45 Minutes
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
Exercise 1: Using SQL Server Management Studio
Scenario
The DBAs at Adventure Works Cycles use SQL Server Management Studio as the primary administrative tool for SQL Server databases. You therefore want to familiarize yourself with it.
The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. Use Object Explorer in SQL Server Management Studio
3. Create a Database
4. Run a Transact-SQL Query
5. Create a Project

Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Run Setup.cmd in the D:\Labfiles\Lab01\Starter folder as Administrator.
Task 2: Use Object Explorer in SQL Server Management Studio
1. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows authentication.
2. Ensure that Object Explorer is visible, and expand the Databases folder to view the databases that are hosted on the MIA-SQL instance.
3. View the Server Dashboard standard report for the MIA-SQL instance.
Task 3: Create a Database
1. Under MIA-SQL, right-click the Databases folder, and click New Database. Then create a new database named AWDatabase with the default settings.
2. View the databases listed under the Databases folder and verify that the new database has been created.
Task 4: Run a Transact-SQL Query
1. In SQL Server Management Studio, create a new query, and execute the following Transact-SQL code:
EXEC sp_helpdb AWDatabase;
2. View the query results, noting that they include information about the AWDatabase database you created in the previous task.
3. Save the script file as GetDBInfo.sql in the D:\Labfiles\Lab01\Starter folder.
Task 5: Create a Project
1. In SQL Server Management Studio, create a new project named AWProject in the D:\Labfiles\Lab01\Starter folder.
2. Ensure that Solution Explorer is visible, and add a new connection to the project. The new connection should be to the MIA-SQL database engine instance and it should use Windows authentication.
3. Add a new query to the project, and change its name to BackupDB.sql.
4. In Object Explorer, right-click the AWDatabase database you created previously, point to Tasks, and click Back Up.
5. In the Back Up Database – AWDatabase dialog box, in the Script drop-down list, select Script Action to Clipboard. Then cancel the backup operation.
6. Paste the contents of the clipboard into the empty BackupDB.sql script.
7. Edit the BackupDB.sql script to change the backup location to D:\Labfiles\Lab01\Starter\AWDatabase.bak.
8. Save all of the files in the solution, and then close the solution and minimize SQL Server Management Studio.
Results: At the end of this exercise, you will have created a SQL Server Management Studio project containing script files.
Exercise 2: Using the sqlcmd Utility
Scenario
DBAs at Adventure Works Cycles occasionally use sqlcmd to connect to SQL Server and perform maintenance tasks. You therefore want to familiarize yourself with sqlcmd.
The main tasks for this exercise are as follows:
1. Use sqlcmd Interactively
2. Use sqlcmd to Run a Script

Task 1: Use sqlcmd Interactively
1. Open a command prompt, and enter the following command to view details of all sqlcmd parameters:
sqlcmd -?
2. Enter the following command to start sqlcmd and connect to MIA-SQL using Windows authentication:
sqlcmd -S MIA-SQL -E
3. In the sqlcmd command line, enter the following commands to view the databases on MIA-SQL. Verify that these include the AWDatabase database you created in the previous exercise:
SELECT name FROM sys.sysdatabases;
GO
4. Enter the following command to exit sqlcmd:
Exit

Task 2: Use sqlcmd to Run a Script
1. In the command prompt window, enter the following command to use sqlcmd to run the GetDBInfo.sql script you created earlier in MIA-SQL:
sqlcmd -S MIA-SQL -E -i D:\Labfiles\Lab01\Starter\GetDBInfo.sql
Note that the query results are returned, but they are difficult to read in the command prompt screen.
2. Enter the following command to store the query output in a text file:
sqlcmd -S MIA-SQL -E -i D:\Labfiles\Lab01\Starter\GetDBInfo.sql -o D:\Labfiles\Lab01\Starter\DBinfo.txt
3. Use Notepad to view the contents of the D:\Labfiles\Lab01\Starter\DBinfo.txt file.
Results: At the end of this exercise, you will have used sqlcmd to manage a database.
Exercise 3: Using Windows PowerShell with SQL Server
Scenario
IT administrators at Adventure Works use Windows PowerShell to script configuration tasks across a range of services. You want to investigate how to use Windows PowerShell with SQL Server.
The main tasks for this exercise are as follows:
1. Use Windows PowerShell
2. Use PowerShell in SQL Server Management Studio
3. Create a PowerShell Script

Task 1: Use Windows PowerShell
1. On the taskbar, click the Windows PowerShell icon.
2. At the Windows PowerShell prompt, enter the following command:
Get-Process
3. Review the list of processes. In the ProcessName column, note the SQL Server processes. Then enter the following command to list only the processes with names beginning "SQL":
Get-Process SQL*
4. To find a way to sort the list, enter the following command:
Get-Help Sort
5. Review the help information, then enter the following command:
Get-Process SQL* | Sort-Object Handles
6. Verify that the list is now sorted by number of handles, and close Windows PowerShell.
Task 2: Use PowerShell in SQL Server Management Studio
1. In SQL Server Management Studio, in Object Explorer, right-click MIA-SQL, and then click Start PowerShell.
2. Enter the following command to show which modules are loaded, and verify that they include SQLPS and SQLASCMDLETS:
Get-Module
3. Enter the following command to set the current location to the MIA-SQL server:
Set-Location SQLServer:\SQL\MIA-SQL
4. Use the following command to display the SQL Server database engine instances on the server:
Get-ChildItem
5. Use the Set-Location cmdlet to change the current location to SQLServer:\SQL\MIA-SQL\DEFAULT\Databases.
6. Use the Get-ChildItem cmdlet to display the databases on the default instance.
7. Use the following command to execute a Transact-SQL statement that retrieves the server version:
Invoke-Sqlcmd "SELECT @@version"
8. Close the SQL Server PowerShell window and close SQL Server Management Studio without saving any files.
Task 3: Create a PowerShell Script
1. On the taskbar, right-click the Windows PowerShell icon and click Windows PowerShell ISE.
2. In the PowerShell command prompt, enter the following command to verify that the SQLPS module is not loaded:
Get-Module
3. Use the following command to load the SQLPS module, and then use the Get-Module cmdlet to verify that it has been loaded:
Import-Module SQLPS -DisableNameChecking
4. If the Commands pane is not visible, on the View menu, click Show Command Add-on. Then in the Commands pane, in the Modules list, select SQLPS and view the cmdlets in the module, noting that they include cmdlets to perform tasks such as backing up databases and starting SQL Server instances.
5. If the Script pane is not visible, click the Script drop-down arrow. Then, in the Script pane, type the following commands. (Hint: Use the IntelliSense feature.)
Import-Module SQLPS -DisableNameChecking
Set-Location SQLServer:\SQL\MIA-SQL\DEFAULT\Databases
Get-ChildItem | Select Name, Size, SpaceAvailable, IndexSpaceUsage | Out-GridView
6. Click Run Script. Then view the results in the window that is opened. (The script may take a few minutes to run.)
7. Close the output window, and modify the script as shown in the following example:
Import-Module SQLPS -DisableNameChecking
Set-Location SQLServer:\SQL\MIA-SQL\DEFAULT\Databases
Get-ChildItem | Select Name, Size, SpaceAvailable, IndexSpaceUsage | Out-File 'D:\Labfiles\Lab01\Starter\Databases.txt'
8. Save the script as GetDatabases.ps1 in the D:\Labfiles\Lab01\Starter folder. Then close the PowerShell ISE.
9. In the D:\Labfiles\Lab01\Starter folder, right-click GetDatabases.ps1 and click Run with PowerShell.
10. When the script has completed, open Databases.txt in Notepad to view the results. Then close Notepad.
Results: At the end of this task, you will have a PowerShell script that retrieves information about databases from SQL Server.
Module Review and Takeaways
This module introduced SQL Server 2014 and the tools that are commonly used to manage it. The rest of this course focuses on how to perform specific database management operations in a SQL Server database engine instance.
Review Question(s)
Question: When might you use each of the management tools you explored in the lab?
Module 2
Installing and Configuring SQL Server 2014
Contents:
Module Overview
Lesson 1: Planning SQL Server Installation
Lesson 2: Installing SQL Server 2014
Lesson 3: Post-Installation Configuration
Lab: Installing SQL Server 2014
Module Review and Takeaways
Module Overview
One of the key responsibilities of a database administrator (DBA) is to provision databases servers and databases. This includes planning and performing the installation of SQL Server on physical servers and virtual machines. This module explains how to assess resource requirements for SQL Server 2014 and how to install it.
Objectives
After completing this module, you will be able to:
Plan a SQL Server 2014 installation.
Install SQL Server 2014.
Perform post-installation configuration tasks.
Lesson 1
Planning SQL Server Installation
Before starting the installation process for SQL Server, it is important to discover how each of the requirements for a successful installation can be met. In this lesson, you will consider the specific requirements that SQL Server places on the hardware and software platforms on which it runs and learn about tools which you can use to pre-test your systems before installing SQL Server.
Lesson Objectives
After completing this lesson, you will be able to:
Describe considerations for installing SQL Server.
Describe the hardware and software requirements for SQL Server 2014.
Assess CPU and memory requirements for SQL Server.
Describe considerations for storage I/O and performance requirements for SQL Server.
Plan service accounts for SQL Server services.
Considerations for SQL Server Installation
When planning an installation of SQL Server, consider the following factors:
Required components and features. One of the most obvious factors to consider is which SQL Server components and features you need to install to meet your business requirements. While it may be tempting to install all available components just in case they are required, you should limit the installation to include only the components that are actually needed. This approach helps to minimize the attack surface area, reducing the opportunity for security vulnerabilities. It also minimizes the resource utilization overhead of the installation and simplifies administration by eliminating services and components that would otherwise need to be managed.
Hardware resource requirements. Depending on the components being installed and the workloads they must support, SQL Server can place substantial demands on the hardware resources of a server. A typical SQL Server database engine installation consumes CPU, memory, and storage I/O subsystem resources to meet the requirements of the applications using the database. In the case of most enterprise database solutions, the server (physical or virtual) is often dedicated to supporting the SQL Server installation. However, in some small organizations or departmental solutions, the database engine may need to co-exist with other applications and software services on the same server, competing with them for hardware resources. When planning an installation, you must be sure that the server where you intend to install SQL Server has sufficient spare capacity to support the database workload while providing the required levels of performance and concurrency.
Service account identities. Each SQL Server service is configured to run within the context of a specific Windows account. This account provides an identity for the service, and is used to authorize
access to system resources by the service. You should plan the accounts that will be used by the services in your SQL Server installation carefully to ensure that they meet the requirements of the “principle of least privilege”, which states that all authenticated accounts should have the minimum permissions and system rights they need to fulfil their function.
Default data file locations. The SQL Server database engine is used to store and manage data in databases, and these databases exist as files on storage devices. While you can specify the location of data files when you create a database, SQL Server uses a default location for internal system databases and new databases with no file location specified. You should set the default location to a folder on an appropriate storage device when you install SQL Server.
Server collation. SQL Server databases use a specified collation to control how character data is sorted and stored. You specify a collation at the server (instance) level, and unless you have specific character sort-order requirements, you should choose a collation that matches that used by the server on which you are installing SQL Server. You can override the server collation in individual databases if required, but the server collation is used for system databases including TempDB, which is used by all databases to store temporary objects.
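For example, you can check the collation at the server and database levels with Transact-SQL. The database name MyDatabase is a placeholder:

-- Server (instance) collation
SELECT SERVERPROPERTY('Collation');
-- Collation of an individual database
SELECT DATABASEPROPERTYEX('MyDatabase', 'Collation');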
Minimum Hardware and Software Requirements
SQL Server 2014 has specific hardware and software requirements.
Hardware Requirements
In earlier versions of SQL Server, it was necessary to focus on minimum requirements for processor, disk space, and memory. Nowadays, discussing minimum processor speeds and disk space requirements for the SQL Server components is pointless: even the slowest processor in a new laptop is fast enough to meet the minimum requirements for SQL Server.
Processors
In enterprise environments, the number of processors is now a much more significant issue. While it might seem desirable to add as many CPUs as possible, it is important to consider that there is a trade-off between the number of CPUs and license costs. Also, not all computer architectures support the addition of CPUs, so adding CPU resources might require architectural upgrades to computer systems, not just the additional CPUs. The following table shows the processor types required for the x86 and x64 processor architectures:
Architecture: Requirements
x86: At least a 1 GHz Pentium III-compatible processor
x64: At least a 1.4 GHz AMD Opteron, AMD Athlon 64, or Intel Xeon or Intel Pentium with EM64T support
Disk
The hardware requirements for SQL Server list the required disk space to install the product. These values are not the only ones to consider though, because the size of user databases can have a greater impact.
Disk subsystem performance, however, is critical. A typical SQL Server system today is I/O bound, if it is configured and working correctly. Note that a bottleneck is, in itself, not a bad thing. Any computer system with any task that it needs to perform will have a bottleneck somewhere. If another component of the server is the bottleneck (rather than the I/O subsystem), there is usually an underlying issue to resolve. It could be a lack of memory or something more subtle like a recompilation issue (that is, a situation where SQL Server is constantly recompiling code). Memory requirements for SQL Server are discussed in the next topic.
Software Requirements
Like any server product, SQL Server requires specific combinations of operating system and software in order to install.
Operating System
Even though it is possible to install versions of SQL Server on client operating systems such as Windows 7® (SP1) and Windows Vista® (SP2), the product is really designed for use on server operating systems such as the Windows Server series. Similarly, many higher-end editions of SQL Server also require higher-end editions of Windows. SQL Server Books Online provides a precise list of supported versions and editions.
It is strongly recommended that you avoid installing SQL Server on a domain controller. If you attempt this, the installation is not blocked, but the following limitations apply:
You cannot run SQL Server services on a domain controller under a local service account or a network service account.
After SQL Server is installed on a computer, you cannot change the computer from a domain member to a domain controller. You must uninstall SQL Server before you change the host computer to a domain controller.
After SQL Server is installed on a computer, you cannot change the computer from a domain controller to a domain member. You must uninstall SQL Server before you change the host computer to a domain member.
SQL Server Setup cannot create security groups or provision SQL Server service accounts on a readonly domain controller. In this scenario, Setup will fail.
SQL Server failover cluster instances are not supported where cluster nodes are domain controllers; in this configuration, installation is blocked.
Prerequisite Software
In earlier versions, the installer for SQL Server would preinstall most requirements as part of the installation process. This is no longer the case: the .NET Framework and Windows PowerShell must be preinstalled before running setup. The installer for SQL Server will install the SQL Server Native Client (SNAC) and the SQL Server setup support files. However, to minimize the installation time for SQL Server, particularly in busy production environments, it is useful to preinstall these components during any available planned downtime. Components such as the .NET Framework often require a reboot after installation, so preinstalling them can further reduce downtime during installations or upgrades.
General Software Requirements
The SQL Server installer is based on the Windows Installer 4.5 technology. You should consider installing Windows Installer 4.5 prior to the installation of SQL Server, to minimize SQL Server installation time. Several components of SQL Server have a requirement for the Internet Explorer® browser. These components include the Microsoft Management Console (MMC) add-in, SSMS, Business Intelligence Design Studio (BIDS), the Report Designer in BIDS, and any use of HTML Help.
Assessing CPU and Memory Requirements
Planning server resource requirements is not an easy task. There is no simple formula that allows you to calculate resource requirements based on measures such as database size and the number of connections, even though you may see references to these at times.
CPU
CPU utilization for a SQL Server system largely depends upon the types of queries that are running on it. Processor planning is often considered relatively straightforward, in that few system architectures provide fine-grained control of the available processor resources. Testing with realistic workloads is the best option.
Increasing the number of available CPUs will provide SQL Server with more scope for creating parallel query plans. Even without parallel query plans, SQL Server workloads will make good use of multiple processors when working with simple query workloads from a large number of concurrent users. Parallel query plans are particularly useful when large amounts of data are being scanned within large data warehouses.
Try to ensure that your server is dedicated to SQL Server whenever possible. Most servers that are running production workloads on SQL Server should have no other significant services running on the same system. This particularly applies to other server applications such as Microsoft Exchange Server.
Many new systems are based on Non-Uniform Memory Access (NUMA) architectures. In a traditional symmetric multiprocessing (SMP) system, all CPUs and memory are bound to a single system bus. The bus can become a bottleneck when additional CPUs are added. On a NUMA-based system, each set of CPUs has its own bus, complete with local memory. In some systems, the local bus might also include separate I/O channels. These CPU sets are called NUMA nodes. Each NUMA node can access the memory of other nodes but the local access to local memory is much faster. The best performance is achieved if the CPUs mostly access their own local memory. Windows and SQL Server are both NUMA aware and try to make use of these advantages.
Optimal NUMA configuration is highly dependent on the hardware. Special configurations in the system BIOS might be needed to achieve optimal performance. It is crucial to check with the hardware vendor for the optimal configuration for a SQL Server on the specific NUMA-based hardware.
Memory
The availability of large amounts of memory for SQL Server to use is now one of the most important factors when sizing systems.
While SQL Server will operate in relatively small amounts of memory, when memory configuration challenges arise, they tend to relate to the maximum, not the minimum, values. For example, the Express
Edition of SQL Server will not utilize more than 1 GB of memory, regardless of how much is physically installed in the system.
The majority of servers being installed today are 64-bit, with a single address space that can directly access large amounts of memory. The biggest challenge with 32-bit servers is that memory outside of the 4-GB "visible" address space (that is, the memory that can be directly accessed) is retrieved by using Address Windowing Extensions (AWE). While earlier versions of SQL Server allowed the use of AWE-based memory for the caching of data pages, SQL Server 2012 onwards no longer supports the use of AWE-based memory to increase the address space for 32-bit systems.
Storage I/O Considerations
The performance of SQL Server is tightly coupled to the performance of the I/O subsystem it is using. Current I/O systems are complex, so planning and testing the storage is a key task during the planning stage.
Determining Requirements
In the first phase of planning, the requirements of the application must be determined, including the I/O patterns that need to be satisfied. These include the frequency and size of reads and writes sent by the application. As a general rule, OLTP systems produce a high number of random I/O operations on the data files and sequential write operations on database log files. By comparison, data warehouse-based applications tend to generate large scans on data files.
Storage Styles
The second planning phase involves determining the style of storage to be used. With direct attached storage (DAS), it is easier to get good predictable performance. On storage area network (SAN) systems, more work is often required to get good performance, but SAN storage typically provides a wide variety of management capabilities and storage consolidation.
One particular challenge for SQL Server administrators is that SAN administrators are generally more concerned with the disk space that is allocated to applications rather than the performance requirements of individual files. Rather than attempting to discuss file layouts with a SAN administrator, try to concentrate on your performance requirements for specific files. Leave the decisions about how to achieve those goals to the SAN administrator. That is, focus on what is needed in these discussions rather than on how it can be achieved.
RAID Systems
In SAN-based systems, you will not often be concerned about the redundant array of independent disks (RAID) levels being used. If you have specified the required performance on a file basis, the SAN administrator will need to select appropriate RAID levels and physical disk layouts to achieve it. For DAS storage, you should become familiar with the different RAID levels. While other RAID levels exist, RAID levels 1, 5, and 10 are the most common ones used in SQL Server systems.
Number of Spindles
For most current systems, the number of drives (or spindles, even though the term is now somewhat dated), will matter more than the size of the disk. It is easy to find large disks that will hold substantial databases but often a single large disk will not be able to provide sufficient I/O operations per second or
enough data throughput (MB/sec) to be workable. Solid state drive (SSD) based systems are quickly changing the available options in this area.
Drive Caching
Read caching within disk drives is not particularly useful because SQL Server already manages its own caching system. It is unlikely that SQL Server will need to re-read a page from disk that it has recently written, unless the system is low on memory. Write caches can substantially improve SQL Server I/O performance, but make sure that hardware caches guarantee writes, even after a system failure. Many drive write caches cannot survive failures, and this can lead to database corruption.
Tools for Assessing Storage I/O Performance
To help you assess the proposed storage system for a SQL Server installation, you can use utilities to simulate the kinds of I/O workload your database instance is expected to support.
SQLIOSim
SQLIOSim is an unsupported utility that you can download from the Microsoft download site. It is designed to simulate the activity generated by SQL Server without the need to have SQL Server installed. This capability makes SQLIOSim a good tool for pre-testing server systems that are targets for running SQL Server.
SQLIOSim is a stand-alone tool that you can copy on to the server and run. It does not need to be installed on the target system by using an installer. SQLIOSim has both GUI and command-line execution options. While SQLIOSim is useful for stress-testing systems, it is not good for general performance testing. The tasks it performs vary between each execution of the utility, so no attempt should be made to compare the output of multiple executions directly, particularly in terms of timing.
SQLIO
SQLIO is another unsupported utility that you can download from the Microsoft download site. Unlike SQLIOSim, which is used for non-repeatable stress testing, SQLIO is designed to create entirely repeatable I/O patterns. You use a configuration file to determine the types of I/O operations to test. SQLIO then tests those types of operations specifically. A common way to use the two utilities together is to use SQLIOSim to find issues with certain types of I/O operations, and then use SQLIO to generate those specific problematic types of I/O operations while attempting to resolve the issues. SQLIO is also a stand-alone tool that does not require SQL Server to be installed on the system, and you should perform these tests before installing SQL Server. SQLIO only checks one I/O type at a time, which makes the interpretation of the results the most important task.
Planning Service Accounts
SQL Server components run as Windows services. When you install SQL Server, you must specify the accounts that the various SQL Server services will use. You can also configure service account settings after installing SQL Server by using the SQL Server Configuration Manager tool.
Note: You should not use the Windows Services tool to manage SQL Server service accounts.
SQL Server Services
When you install SQL Server, you can choose to install some or all of the available services. The available SQL Server services include:
SQL Server Database Services. This is the service for the SQL Server Database Engine.
SQL Server Agent. This service is responsible for automation, including running jobs and issuing alerts.
Analysis Services. This service provides online analytical processing (OLAP) and data-mining functionality.
Reporting Services. This service is involved in the creation and execution of reports.
Integration Services. This service performs the tasks associated with extract, transform, and load (ETL) operations.
Master Data Services. This service enables you to maintain master records of data items to ensure that data is represented consistently across the enterprise.
Data Quality Services. This service enables you to maintain data quality through operations such as data cleansing and de-duplication.
SQL Server Browser. By default, named instances use dynamic TCP port assignment, which means that when the SQL Server service starts, it selects a TCP port from the available ports. The SQL Server Browser service enables connectivity to named SQL Server instances when the connection uses dynamic port assignment in this way. The service is not required for connections to the default SQL Server instance, which uses port 1433, the well-known port number for SQL Server. For connections to named instances that use a specific TCP port number, or that specify a port number in the connection string, the SQL Server Browser service is not required and you can disable it (see the example after this list).
Full-text search. This service enables the indexing and searching of unstructured and semi-structured data.
SQL Writer. This service enables you to back up and restore databases by using the Windows Volume Shadow Copy Service (VSS). The service is disabled by default, and you should only enable it if you intend to use VSS backups.
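As a simple illustration of the SQL Server Browser behavior described above, the following sqlcmd commands contrast a connection by instance name with a connection by port. The server name, instance name, and port number are placeholders:

sqlcmd -S MyServer\MyInstance -E
sqlcmd -S MyServer,50000 -E

The first command relies on the SQL Server Browser service to resolve the instance name to its current dynamic port; the second connects directly to TCP port 50000 and works even if the Browser service is disabled.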
Choosing Service Accounts
There are several types of accounts that you can use for SQL Server services, so it is important to be aware of the differences between them when planning a secure SQL Server installation. The account that you choose for each service will require the correct privileges; however, you should avoid using accounts, such as administrative accounts, that have excessive privileges and rights. Follow the principle of least privilege when choosing service accounts, and never use an account that has greater privileges than it requires to do its job. When you use SQL Server Configuration Manager (or the SQL Server Setup installation program) to specify a service account, the account is automatically added to the relevant security group to ensure that it has the appropriate rights. Ideally, each service should run under its own dedicated account. This minimizes risk because, in the event of one account being compromised, the other services remain unaffected. The different types of accounts that you can use for SQL Server services include:
Domain user account. A non-administrator domain user account is a secure choice for service accounts in a domain environment.
Local user account. A non-administrator local user account is a secure choice for service accounts in a non-domain environment, such as a perimeter network.
Local System account. The Local System account is a highly privileged account that is used by various Windows services. Consequently, you should avoid using this account to run SQL Server services.
Network Service account. The Network Service account has fewer privileges than the Local System account, but it does enable a service to have access to network resources. However, because this account is often used by multiple services, including Windows services, you should avoid using it where possible. If your database server runs on Windows Server 2008 R2 or later, you can use a virtual service account instead (see below).
Managed service account. Managed service accounts are available if the host operating system is Windows Server 2008 R2 or later (Windows 7 also supports managed service accounts). SQL Server support for managed service accounts was introduced in SQL Server 2012. A managed service account is a type of domain account that is associated with a single server and which you can use to manage services. You cannot use a managed service account to log on to a server, so it is more secure than a domain user account. Additionally, unlike a domain user account, you do not need to manually manage passwords for managed service accounts. A domain administrator needs to create and configure a managed service account before you can use it.
Virtual service account. Virtual service accounts are available if the host operating system is Windows Server 2008 R2 or later (Windows 7 also supports virtual service accounts). SQL Server support for virtual service accounts was introduced in SQL Server 2012. A virtual service account is similar to a managed service account, except that it is a type of local account that you can use to manage services rather than a domain account. Unlike managed service accounts, an administrator does not need to create or configure a virtual service account. This is because a virtual service account is simply a virtualized instance of the built-in Network Service account with its own unique identifier.
Lesson 2
Installing SQL Server 2014
After making the decisions about your SQL Server configuration, you can proceed to installation. In this lesson, you will see the phases that installation proceeds through and how SQL Server checks your system for compatibility by using a tool known as the System Configuration Checker. For most users, the setup program will report that all was installed as expected. For the rare situations where this does not occur, you will also learn how to carry out post-installation checks and troubleshooting.
Lesson Objectives After completing this lesson, you will be able to:
Describe options for installing SQL Server.
Install SQL Server.
Perform an unattended installation.
Describe strategies for upgrading SQL Server.
Options for Installing SQL Server 2014
You can install SQL Server 2014 in different ways: by using the installation wizard, from the command prompt, by using a configuration file, and by using SysPrep. All these methods provide the same end result, but enable you to choose the most appropriate installation process for your environment.
Installation Wizard
The SQL Server installation wizard provides a simple user interface for installing SQL Server. It comprises multiple pages that gather the information required to install the product.
The installation wizard enables you to select all the components of SQL Server that you want to install. As well as using it to create a new installation on the server, you can also use it to add components to an existing one. Note: You must be a local administrator to run the installation wizard on the local computer. When installing from a remote share, you need read and execute permissions.
Command Prompt
You can also run the SQL Server setup program from the command prompt, using switches to specify the options that you require. You can enable users to fully interact with the setup program, to view the progress without requiring any input, or to run it in quiet mode without any user interface. Unless you are using a volume licensing or third-party agreement, users will always be required to confirm acceptance of the software license terms.
Configuration File
As well as using switches to provide information to the command prompt setup, you can also use a configuration file. This can simplify the task of installing identically-configured instances across your enterprise. The configuration file is a text file containing name/value pairs. You can create this file by running the installation wizard, selecting all your required options, and then, instead of installing the product, generating a configuration file of those options. Alternatively, you can take the configuration file from a previously successful installation. If you use a configuration file in conjunction with command prompt switches, the command prompt values will override any values in your configuration file.
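As a minimal sketch, a configuration file contains an [OPTIONS] section with name/value pairs such as the following. The values shown are illustrative, not a complete working file:

[OPTIONS]
ACTION="Install"
FEATURES=SQLENGINE
INSTANCENAME="MSSQLSERVER"
IACCEPTSQLSERVERLICENSETERMS="True"
QUIET="True"

You would then pass the file to setup at the command prompt:

setup.exe /ConfigurationFile=ConfigurationFile.ini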
SQL Server SysPrep
SQL Server SysPrep enables you to prepare an instance of SQL Server, and complete the installation later. This can help when you want to prepare instances on multiple computers or multiple instances on one computer. SQL Server 2014 extends the SysPrep functionality in previous versions to support clustered instances in command-line installations.
Performing an Installation of SQL Server 2014
You install the required components of SQL Server 2014 by running the SQL Server 2014 Setup program. There are many pages in the Installation Wizard, including:
Product Key. If you are using anything other than the evaluation edition, you must enter the product key for your copy of SQL Server 2014.
License Terms. You must accept the license terms in order to continue with the installation.
Product Updates. The installation process checks for any updates to prerequisite software.
Install Setup Files. The installation process installs the setup files that it requires to install SQL Server.
Install Rules. The installation process checks for known potential issues that can occur during setup and requires you to rectify any that it finds before continuing.
Setup Role. You must select the type of installation that you need to ensure that the process includes the correct feature components you require. The options are:
o SQL Server Feature Installation. This option installs the key components of SQL Server, including the database engine, Analysis Services, Reporting Services, and Integration Services.
o SQL Server PowerPivot for SharePoint. This option installs PowerPivot for SharePoint on a new or existing instance of SharePoint server.
o All Features with Defaults. This option installs all SQL Server features and uses the default options for the service accounts.
After you select one of these options, you can further customize the features to install on the next page of the wizard.
Feature Selection. You can use this page to select the exact features that you want to install. You can also specify where to install the instance features and the shared features.
Instance Configuration. You must specify whether to install a default or named instance (on the first installation) and if a named instance, the name that you want to use.
Server Configuration. Specify the service account details and startup type for each service that you are installing.
Configuration. You must configure component-specific settings for each component that you have selected to install.
Ready to Install. Use this page to review the options you have selected throughout the wizard prior to performing the installation.
Complete. When the installation is complete, you are likely to need to reboot the server.
Performing an Unattended Installation
In many organizations, script files for standard builds of software installations are created by senior IT administrators and used to ensure consistency throughout the organization. Unattended installations can help with the deployment of multiple identical installations of SQL Server across an enterprise. They can also provide for the delegation of the installation to another person.
Unattended Installation Methods
One option for performing an unattended installation of SQL Server is to create an .ini file containing the required setup information and pass it as a parameter to setup.exe at a command prompt. A second alternative is to pass all the required SQL Server setup details as parameters to the setup.exe program, rather than placing the parameters into an .ini file. If you combine an .ini file with command prompt parameters, the command prompt values override the values in the file. Both of the examples shown below use the second method. The first example shows a typical installation command and the second shows how an upgrade could be performed using the same method.
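The following commands are illustrative sketches rather than the original slide content. The switches used (/q, /ACTION, /FEATURES, /INSTANCENAME, /SQLSVCACCOUNT, /SQLSVCPASSWORD, /SQLSYSADMINACCOUNTS, and /IACCEPTSQLSERVERLICENSETERMS) are standard setup parameters, but the instance name, account, and password values are placeholders that you would replace with your own:

Setup.exe /q /ACTION=Install /FEATURES=SQLEngine /INSTANCENAME=MSSQLSERVER /SQLSVCACCOUNT="<Domain\Account>" /SQLSVCPASSWORD="<Password>" /SQLSYSADMINACCOUNTS="<Domain\Account>" /IACCEPTSQLSERVERLICENSETERMS

Setup.exe /q /ACTION=Upgrade /INSTANCENAME=MSSQLSERVER /IACCEPTSQLSERVERLICENSETERMS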
/q Switch
The "/q" switch shown in the examples specifies "quiet mode" – no user interface is provided. An alternative switch "/qs" specifies "quiet simple" mode. In the quiet simple mode, the installation runs and shows progress in the UI but does not accept any input.
Creating an .ini File
An .ini file for unattended installation can be created by using any text editor, such as Notepad. The SQL Server installation program creates a file called ConfigurationFile.ini in a folder that is named based upon the installation date and time, under the folder C:\Program Files\Microsoft SQL Server\120\Setup Bootstrap\Log. You can use this as a starting point for creating your own .ini file. The .ini file is composed of a single [Options] section containing multiple parameters, each relating to a different feature or configuration setting.
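As a sketch, a minimal [Options] section might resemble the following; the feature list and instance name are placeholder values. You would then pass the file to setup by using the /ConfigurationFile parameter, for example Setup.exe /ConfigurationFile=MyConfigurationFile.ini:

[OPTIONS]
ACTION="Install"
FEATURES=SQLENGINE
INSTANCENAME="MSSQLSERVER"
QUIET="True"
IACCEPTSQLSERVERLICENSETERMS="True"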
Upgrading SQL Server If you have an existing instance of a previous version of SQL Server, you can upgrade it to SQL Server 2014. There are two basic ways that SQL Server upgrades can be performed – there is no single preferred way to do this. Each method has benefits and limitations, and is appropriate in certain circumstances.
In-place Upgrades In-place upgrades occur when the installed version of SQL Server is directly replaced by a new version by the SQL Server setup program. This is an easier and highly automated, though riskier, method of upgrading. If an upgrade fails, it is much harder to return to the previous operating state. For most customers, the risk of this cannot be ignored.
When you are weighing up risk, you need to consider that it may not be the SQL Server upgrade itself that fails. Even if the SQL Server upgrade works as expected, the application might fail to operate as anticipated on the new version of SQL Server, and the need to recover the situation quickly will be just as important. In-place upgrades have the added advantage of minimizing the need for additional hardware resources and avoid the need to redirect client applications that are configured to work with the existing server.
Before performing an in-place upgrade, you should use the Upgrade Advisor tool to analyze existing instances and identify any issues that need to be addressed before upgrading. You can install the Upgrade Advisor from the Planning page of the SQL Server 2014 Installation Center.
Side-by-side Upgrades
Side-by-side upgrades are a safer alternative, as the original system is left in place and can be quickly returned to production should an upgrade issue arise. However, side-by-side upgrades involve extra work and more hardware resources. To perform a side-by-side upgrade, you will need enough hardware resources to provide for both the original and the new systems. Two common risks associated with side-by-side upgrades relate to the time taken to copy all the user databases to a new location and the space required to hold these copies. While most side-by-side upgrades are performed on separate servers, it is possible to install both versions of SQL Server on the same server during a side-by-side upgrade. However, side-by-side upgrades of versions with the same major build number (that is, SQL Server 2008 R2 and SQL Server 2008) on the same server are a special case. Because the major version number is identical, separate versions of the shared components cannot co-exist on the same server. Shared components will be upgraded.
Not all versions of SQL Server are supported when installed side-by-side. Consult SQL Server Books Online for a matrix of versions that are supported when installed together.
Hybrid Options
It is also possible to use some elements of an in-place upgrade and a side-by-side upgrade together. For example, rather than copying all the user databases, after installing the new version of SQL Server beside the old version, and migrating all the server objects such as logins, you could detach user databases from the old server instance and reattach them to the new one. Once user databases have been attached to a newer version of SQL Server, they cannot be reattached to an older version again, even if the database compatibility settings have not been upgraded. This is a risk that needs to be considered when using a hybrid approach.
Lesson 3
Post-Installation Configuration
When you have completed the installation of SQL Server, you can perform post-installation checks and configuration tasks to meet your specific requirements.
Lesson Objectives After completing this lesson, you will be able to:
Perform post-installation checks.
Configure SQL Server services and network protocols.
Manage SQL Server updates.
Performing Post-Installation Checks After SQL Server has been installed, the most important check is to make sure that all SQL Server services are running, by using the SQL Server Services node in SQL Server Configuration Manager (SSCM). Note: SQL Server services have names that differ slightly from their displayed names. For example, the service name for the default SQL Server service is MSSQLSERVER. You can view the actual service name by looking on the properties page for the service.
You do not need to check the contents of the SQL Server setup log files after installation, because the installer program will indicate any errors that occur and attempt to reverse any of the SQL Server setup that has been completed to that point. When errors occur during the SQL Server Setup phase, the installation of the SQL Server Native Client and the Setup Components is not reversed. Typically, you only need to view the setup log files in two scenarios:
If setup is failing and the error information displayed by the installer does not help you to resolve the issue.
If you contact Microsoft Product Support and they ask for detailed information.
If you do require the log files, you will find them in the %programfiles%\Microsoft SQL Server\120\Setup Bootstrap\Log folder.
Configuring Services and Network Protocols You use SQL Server Configuration Manager (SSCM) to configure SQL Server services, and the network libraries exposed by SQL Server services, as well as configuring how client connections are made to SQL Server.
Configuring Services You can use SSCM to control (that is, start, stop, and configure) each service independently, to set the startup mode (automatic, manual, or disabled) of each service, and to set the service account identity for each service. You can also set startup parameters to start SQL Server services with specific configuration settings for troubleshooting purposes. You also use SSCM to configure both server and client protocols and ports. SSCM provides two sets of network configurations—protocols that the server exposes and those used for making connections.
Configuring Server Network Ports and Listeners
You can configure each network endpoint that an instance of SQL Server exposes. This includes the determination of which network libraries are enabled and, for each one, the configuration of the network library. Typically, this will involve settings such as protocol port numbers. You should discuss the required network protocol configuration of SQL Server with your network administrator. There are three protocols available:
TCP/IP
Named pipes
Shared memory
The configuration for the TCP/IP protocol allows for different settings on each configured IP address if required, or a general set of configurations that are applied to all IP addresses.
Configuring Client Network Ports and Listeners
SQL Native Client (SNAC) is installed on the server as well as on client systems. When SQL Server management tools are installed on the server, they use the SNAC library to make connections to the SQL Server services that are on the same system. Every computer that has SNAC installed needs the ability to configure how that library will access SQL Server services. For this reason, in addition to server network configuration settings, SSCM includes client configuration nodes with which you can configure how client connections are made. Note that two sets of client configurations are provided and that they only apply to the computer where they are configured. One set is used for 32-bit applications; the other set is used for 64-bit applications. SSMS is a 32-bit application, even when SQL Server is installed as a 64-bit application.
Aliases Connecting to a SQL Server service can involve multiple settings such as server address, protocol, and port. If you hard-code these connection details in your client applications, and then any of the details change, your application will no longer work. To avoid this issue and to make the connection process simpler, you can use SSCM to create aliases for server connections.
You create a server alias and associate it with a server, protocol, and port (if required). Client applications can then connect to the alias without being concerned about how those connections are made.
Each client system that utilizes SNAC (including the server itself) can have one or more aliases configured. Aliases for 32-bit applications are configured independently of those for 64-bit applications.
Managing SQL Server Updates As with all software products over time, issues can be encountered with SQL Server. The product group is very responsive in fixing any identified issues by releasing software updates. SQL Server updates are released in several ways:
Hotfixes (also known as QFE or Quick Fix Engineering) are released to address urgent customer concerns. Due to the tight time constraints, only limited testing can be performed on these fixes, so they should only be applied to systems that are known to be experiencing the issues that they address.
Cumulative Updates (CUs) are periodic roll-up releases of hotfixes that have received further testing as a group.
Service Packs (SPs) are periodic releases where full regression testing has been performed. Microsoft recommends applying SPs to all systems after appropriate levels of organizational testing.
The simplest way to keep SQL Server up to date is to enable automatic updates from the Microsoft Update service. Larger organizations, or those with strict change-control processes, should exercise caution in applying automatic updates; updates should typically be applied to test or staging environments before being applied to production environments. SQL Server 2014 can also have product SPs slipstreamed into the installation process to avoid the need to apply them after installation.
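As an illustration, the setup Product Update parameters can pull updates from a local folder during installation. A command resembling the following sketch, where C:\Updates is a hypothetical folder containing the downloaded SP package, would slipstream those updates into the install:

Setup.exe /q /ACTION=Install /FEATURES=SQLEngine /INSTANCENAME=MSSQLSERVER /UpdateEnabled=TRUE /UpdateSource="C:\Updates" /IACCEPTSQLSERVERLICENSETERMS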
Lab: Installing SQL Server 2014 Scenario
You have been tasked with creating a new instance of SQL Server that will be used by the IT department as a test server for new applications.
Objectives After completing this lab, you will be able to:
Assess available resources.
Install SQL Server 2014.
Perform post-installation checks.
Estimated Time: 60 minutes
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
Exercise 1: Preparing to Install SQL Server

Scenario
You are preparing to install SQL Server 2014 for the IT department in Adventure Works Cycles. Before installing, you want to determine the readiness of the server hardware provisioned for the instance.

The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. View Hardware and Software Requirements
3. Run the System Configuration Checker
Task 1: Prepare the Lab Environment
1. Ensure that the MSL-TMG1, 20462C-MIA-DC, and 20462C-MIA-SQL virtual machines are all running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Run Setup.cmd in the D:\Labfiles\Lab02\Starter folder as Administrator.
Task 2: View Hardware and Software Requirements
1. Run Setup.exe in the C:\Server2014-x64-ENU folder to run the SQL Server installation program.
2. In the SQL Server Installation Center, on the Planning page, view the Hardware and Software Requirements.
Task 3: Run the System Configuration Checker
1. In the SQL Server Installation Center, on the Tools tab, use the System Configuration Checker to assess the computer’s readiness for a SQL Server installation.
2. Keep the SQL Server Installation Center window open. You will use it again in a later exercise.
Results: After this exercise, you should have run the SQL Server setup program and used the tools in the SQL Server Installation Center to assess the computer’s readiness for SQL Server installation.
Exercise 2: Installing SQL Server Scenario
The required configuration details for the new SQL Server 2014 instance you must install are described in the following table:

Item                         Configuration
Instance Name                SQLTEST
Features                     Database Engine only (excluding Replication, Full-Text, and DQS)
User database directory      M:\SQLTEST\Data
User database log directory  L:\SQLTEST\Logs
Service Accounts             ADVENTUREWORKS\ServiceAcct / Pa$$w0rd for all services
Startup                      Both SQL Server and SQL Server Agent should start manually
Server Collation             SQL_Latin1_General_CP1_CI_AS
Authentication Mode          Mixed
SA Password                  Pa$$w0rd
Administrative User          ADVENTUREWORKS\Student
Filestream Support           Disabled
The main tasks for this exercise are as follows:
1. Review the Installation Requirements
2. Install the SQL Server Instance
Task 1: Review the Installation Requirements
1. Review the requirements in the exercise scenario.
2. Verify that the required folders exist.
Task 2: Install the SQL Server Instance
1. Install the required instance of SQL Server:
   o On the Product Key page, select Evaluation edition, which does not require a product key.
   o On the Feature Selection page, select only the features that are required.
   o On the Server Configuration page, configure the service account name and password, the startup type for the SQL Server Agent and SQL Server Database Engine services, and verify the collation.
   o On the Database Engine Configuration page, configure the authentication mode and the SA password; add the current user (Student) to the SQL Server administrators list, specify the required data directories, and verify that Filestream is not enabled.
Results: After this exercise, you should have installed an instance of SQL Server.
Exercise 3: Performing Post-Installation Configuration Scenario
In this exercise, you will start the SQL Server service for the new instance and connect to it using SSMS to make sure that the instance works.

The main tasks for this exercise are as follows:
1. Start the SQL Server Service
2. Configure Network Protocols and Aliases
3. Verify Connectivity to SQL Server
Task 1: Start the SQL Server Service
1. Start SQL Server 2014 Configuration Manager, and view the properties of the SQL Server (SQLTEST) service.
2. Verify that the service is configured to log on as ADVENTUREWORKS\ServiceAcct. Then start the service.
Task 2: Configure Network Protocols and Aliases
1. In SQL Server Configuration Manager, view the SQL Server network protocols configuration for the SQLTEST instance and verify that the TCP/IP protocol is enabled.
2. View the SQL Server Native Client 32-bit client protocols and verify that the TCP/IP protocol is enabled. Then create an alias named Test that uses TCP/IP to connect to the MIA-SQL\SQLTEST instance from 32-bit clients.
3. View the SQL Server Native Client protocols and verify that the TCP/IP protocol is enabled. Then create an alias named Test that uses TCP/IP to connect to the MIA-SQL\SQLTEST instance from 64-bit clients.
Task 3: Verify Connectivity to SQL Server
1. Use sqlcmd to connect to the MIA-SQL\SQLTEST instance of SQL Server using a trusted connection (see the example after these steps), and run the following command to verify the instance name:

   SELECT @@ServerName;
   GO

2. Use SQL Server Management Studio to connect to the Test alias.
3. In SQL Server Management Studio, in Object Explorer:
   a. View the properties of the Test instance and verify that the value of the Name property is MIA-SQL\SQLTEST.
   b. Stop the Test service.
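For reference, the trusted connection in step 1 can be made with a command resembling the following sketch; the -E switch requests Windows (trusted) authentication, and -Q runs the query and then exits (you can also omit -Q and type the statements interactively):

sqlcmd -S MIA-SQL\SQLTEST -E -Q "SELECT @@ServerName;"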
Results: After this exercise, you should have started the SQL Server service and connected using SSMS.
Module Review and Takeaways
In this module, you have learned about considerations for installing SQL Server, how to install SQL Server, and how to perform post-installation configuration tasks.
Review Question(s) Question: What additional considerations do you think there are for installing additional named instances on a server where SQL Server is already installed?
Module 3
Working with Databases and Storage

Contents:
Module Overview
Lesson 1: Introduction to Data Storage with SQL Server
Lesson 2: Managing Storage for System Databases
Lesson 3: Managing Storage for User Databases
Lesson 4: Moving Database Files
Lesson 5: Configuring the Buffer Pool Extension
Lab: Managing Database Storage
Module Review and Takeaways
Module Overview
One of the most important roles for database administrators working with Microsoft® SQL Server® is the management of databases and storage. It is important to know how data is stored in databases, how to create databases, how to manage database files, and how to move them. Other tasks related to storage include managing the tempdb database and using fast storage devices to extend the SQL Server buffer pool cache.
Objectives After completing this module, you will be able to:
Describe how SQL Server stores data.
Manage storage for system databases.
Manage storage for user databases.
Move database files.
Configure the buffer pool extension.
Lesson 1
Introduction to Data Storage with SQL Server
Before you can create and manage databases effectively, you must understand how data is stored in them, know about the different types of files that SQL Server can use and where they should be placed, and know how to plan for ongoing file growth.
Lesson Objectives After completing this lesson, you will be able to:
Describe how data is stored in SQL Server.
Describe the considerations for disk storage devices.
Explain how specific redundant array of independent disks (RAID) systems work.
Determine appropriate file placement and the number of files for SQL Server databases.
Ensure sufficient file capacity and allow for ongoing growth.
How Data Is Stored in SQL Server SQL Server databases consist of a logical schema of tables and other objects in which data structures are used to organize records. This logical schema is physically stored in a set of files allocated for the database to use, with data records being written to pages within those files.
Database Files There are three types of database file used by SQL Server—primary data files, secondary data files, and transaction log files.
Primary Data Files
The primary data file is the starting point of the database. Every database has a single primary data file. As well as holding data pages, the primary data file holds pointers to the other files in the database. Primary data files typically use the file extension .mdf. The use of this file extension is not mandatory but is highly recommended.
Secondary Data Files
Secondary data files are optional, user-defined, additional data files that can be used to spread the data across more storage locations for performance and/or maintenance reasons. You can use secondary files to spread data across multiple disks by putting each file on a different disk drive. Additionally, if a database exceeds the maximum size for a single Windows file, you can use secondary data files so the database can continue to grow. The recommended extension for secondary data files is .ndf.
Transaction Log Files
Transaction log files (commonly referred to as simply log files) hold information that you can use to recover the database when necessary. There must be at least one log file for each database. All transactions are written to the log file using the write-ahead logging (WAL) mechanism to ensure the integrity of the database in case of a failure and to support rollbacks of transactions. The recommended extension for log files is .ldf.
When data pages need to be changed, they are fetched into memory and changed there. The “dirty pages” are then written to the transaction log in a synchronous manner. Later, during a background process known as a “checkpoint”, the dirty pages are written to the database files. For this reason, the pages contained in the transaction log are critical to the ability of SQL Server to recover the database to a known committed state. Transaction logs are discussed in detail in this course. Note: The log file is also used by other SQL Server features, such as transactional replication, database mirroring, and change data capture. These are advanced topics and beyond the scope of this course.
Pages and Extents Data files store data on pages, which are grouped into extents.
Data File Pages
Pages in a SQL Server data file are numbered sequentially, starting with zero for the first page. Each file in a database has a unique file ID number. To uniquely identify a page in a database, both the file ID and the page number are required. Each page is 8 KB in size. After allowing for header information that is needed on each page, there is a region of 8,096 bytes remaining for holding data. Data rows can hold fixed length and variable length column values. All fixed length columns of a data row need to fit on a single page, within an 8,060-byte limit. Data pages only hold data from a single database object, such as a table or an index.
Extents
Groups of eight contiguous pages are referred to as an extent. SQL Server uses extents to simplify the management of data pages. There are two types of extents:
Uniform extents. All pages within the extent contain data from only one object.
Mixed extents. The pages of the extent can hold data from different objects.
The first allocation for an object is at the page level, and always comes from a mixed extent. If they are free, other pages from the same mixed extent will be allocated to the object as needed. Once the object has grown bigger than its first extent, then all future allocations are from uniform extents. In both primary and secondary data files, a small number of pages is allocated to track the usage of extents in the file.
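As a simple sketch, you can observe the file IDs and sizes of the files in the current database by querying the sys.database_files catalog view; the size column is reported in 8-KB pages, so multiplying by 8 and dividing by 1,024 converts it to megabytes:

Viewing file IDs and sizes in pages

-- Lists each file in the current database with its ID and size.
SELECT file_id, name, type_desc, physical_name,
       size AS size_in_8kb_pages,
       size * 8 / 1024 AS size_in_mb
FROM sys.database_files;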
Considerations for Disk Storage Devices Typically, a database server will not have enough internal disks to enable it to deliver the required levels of performance—so many servers use an external array of disks to store data. In a disk array, magnetic disks and/or solid state devices are combined to provide redundancy and improved performance. Commonly used types of disk array include:
Direct Attached Storage (DAS)
When using DAS, disks are stored in an enclosure and connected to the server by a RAID controller. You can use the controller to create RAID arrays from the DAS disks. The Windows Server operating system will treat each RAID array as a storage volume, just as it would an internal disk. Typically, a DAS enclosure contains between eight and 24 drives. DAS offers very good levels of performance, particularly in terms of throughput, and is relatively inexpensive. However, DAS can be limiting because you typically cannot share the storage between multiple servers.
Storage Area Network (SAN)
In a SAN, disks are stored in enclosures and connected by a common network. The network can be either an Ethernet network, as is the case with internet SCSI (iSCSI) SANs, or a fiber channel network. Servers connect to SAN storage through host bus adapters (HBAs). Fiber channel SANs generally offer better performance, but are more expensive than iSCSI SANs. Although more expensive than DAS, a SAN’s principal benefit is that it enables storage to be shared between servers, which is necessary for configurations such as server clustering. In a SAN, it is common practice to ensure that components such as HBAs, ports, and switches are duplicated. This removes single points of failure and so helps to maintain service availability.
Windows Storage Pools
In Windows storage pools, you can group drives together in a pool, and then create storage spaces which are virtual drives. This enables you to use commodity storage hardware to create large storage spaces and add more drives when you run low on pool capacity. You can create storage pools from internal and external hard drives (including USB, SATA, and SAS) and from solid state drives.
RAID Levels Many storage solutions use RAID hardware to provide fault tolerance through data redundancy, and in some cases, to improve performance. You can also implement software-controlled RAID 0, RAID 1, and RAID 5 by using the Windows Server operating system, and other levels may be supported by third-party SANs. Commonly used types of RAID include:
RAID 0, disk striping. A stripe set consists of space from two or more disks that is combined into a single volume. The data is distributed evenly across all of the disks, which improves I/O performance; particularly when each disk device has its own hardware controller. RAID 0 offers no redundancy, and if a single disk fails then the volume becomes inaccessible.
RAID 1, disk mirroring. A mirror set is a logical storage volume that is based on space from two disks, with one disk storing a redundant copy of the data on the other. Mirroring can provide good read performance, but write performance can suffer. RAID 1 is expensive in terms of storage because 50 percent of the available disk space is used to store redundant data.
RAID 5, disk striping with parity. RAID 5 offers fault tolerance through the use of parity data that is written across all the disks in a striped volume comprising space from three or more disks. RAID 5 typically performs better than RAID 1. However, if a disk in the set fails, performance degrades. RAID 5 is less costly in terms of disk space than RAID 1 because parity data only requires the equivalent of one disk in the set to store it. For example, in an array of five disks, four would be available for data storage, which represents 80 percent of the total disk space.
RAID 10, mirroring with striping. In RAID 10, a non-fault tolerant RAID 0 stripe set is mirrored. This arrangement delivers the excellent read/write performance of RAID 0, combined with the fault tolerance of RAID 1. However, RAID 10 can be expensive to implement because, like RAID 1, 50 percent of the total space is used to store redundant data.
Consider the following points when planning files on RAID hardware:
Generally, RAID 10 offers the best combination of read/write performance and fault tolerance, but is the most costly solution.
Write operations on RAID 5 can sometimes be relatively slow compared to RAID 1 because of the need to calculate parity data. If you have a high proportion of write activity, therefore, RAID 5 might not be the best candidate.
Consider the cost per GB. For example, implementing a 500 GB database on a RAID 1 mirror set would require (at least) two 500 GB disks. Implementing the same database on a RAID 5 array would require substantially less storage space.
Many databases use a SAN, and the performance characteristics can vary between SAN vendors. For this reason, if you use a SAN, you should consult with your vendors to identify the optimal solution for your requirements.
Windows storage spaces enable you to create extensible RAID storage solutions that use commodity disks. This solution offers many of the benefits of a specialist SAN hardware solution, at a significantly lower cost.
Determining File Placement and Number of Files When you create a database, you must decide where to store the database files. The choice of storage location for database files is extremely important, as it can have a significant effect on performance, resiliency, recoverability, and manageability.
Isolating Data and Log Files It is important to isolate log and data files for both performance and recovery reasons. This isolation needs to be at the physical disk level.
Access Patterns
The access patterns of log and data files are very different. Access to log files consists primarily of sequential, synchronous writes, with occasional random reads. Access to data files is predominantly random and asynchronous. A single physical storage device does not tend to provide good response times when these two types of access are combined.
Recovery
While RAID volumes provide some protection from physical storage device failures, complete volume failures can still occur. If a SQL Server data file is lost, the database can be restored from a backup and the transaction log reapplied to recover the database to a recent point in time. If a SQL Server log file is lost, the database can be forced to recover from the data files, with the possibility of some data loss or inconsistency in the database. However, if both the data and log files are on a single disk subsystem that is lost, the recovery options usually involve restoring the database from an earlier backup and losing all transactions since that time. Isolating data and log files can help to avoid the worst impacts of drive subsystem failures. Note: Storage solutions use logical volumes as units of storage, and a common mistake is to place data files and log files on different volumes that are actually based on the same physical storage devices. When isolating data and log files, ensure that volumes on which you store data and log files are based on separate underlying physical storage devices.
Data File Management
Ideally, all data files that are defined for a database should be the same size. Data is spread evenly across all available data files. The main performance advantages from this are gained when the files are spread over different storage locations. Allocating multiple data files provides a number of management advantages, including:
The possibility of moving files and part of the data later.
A reduction in recovery time when separately restoring a database file (for example, if only part of the data is corrupt).
An increase in the parallelism in the I/O channel.
The ability to have databases larger than the maximum size of a single Windows file.
Number of Log Files
Unlike the way that SQL Server writes to data files, the SQL Server database engine only writes to a single log file at any point in time. Additional log files are only used when space is not available in the active log file.
Ensuring Sufficient File Capacity Capacity planning helps to ensure that your databases have access to the space required as they grow. Calculating the rate of database growth enables you to plan file sizes and file growth more easily and accurately. When planning the capacity that you will require, you should estimate the maximum size of the database, indexes, transaction log, and tempdb, through a predicted growth period.
For most sites, you should aim to create database files that are large enough to handle the data expected to be stored in the files over a 12-month period. If possible, base capacity planning on tests with the actual application(s) that will store data in the database. If this is not possible, consult the application developer or vendor to determine realistic data capacity requirements.
Autogrowth vs. Planned Growth
SQL Server can automatically expand a database according to growth parameters that were defined when the database files were created. While the options for autogrowth should be enabled to prevent downtime when unexpected growth occurs, it is important to avoid the need for SQL Server to ever autogrow the files. Instead, you should monitor file growth over time and ensure that files are large enough for several months or years.
Many administrators are concerned that larger database files will somehow increase the time it takes to perform backups. The size of a SQL Server backup is not related directly to the size of the database files as only pages that actually contain data are backed up. One significant issue that arises with autogrowth is a trade-off related to the size of the growth increments. If a large increment is specified, a significant delay can be experienced in the execution of the Transact-SQL statement that triggers the need for growth. If the specified increment is too small, the filesystem can become very fragmented and the database performance can suffer because the data files have been allocated in small chunks all over a disk subsystem.
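For example, rather than relying on autogrowth, you could pre-size a data file and set a fixed growth increment as a safety net. The following sketch assumes a hypothetical Sales database with a logical data file named Sales_dat:

Pre-sizing a data file

-- Grow the file to its planned size now, and use a fixed increment if autogrowth is ever needed.
ALTER DATABASE Sales
MODIFY FILE (NAME = Sales_dat, SIZE = 10GB, FILEGROWTH = 512MB);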
Log File Growth Planning
If the transaction log is not set up to expand automatically, it can soon run out of space when certain types of activity occur in the database. For example, performing large-scale bulk operations, such as bulk import or index creation, can cause the transaction log to fill rapidly.
As well as expanding the size of the transaction log, you can also truncate a log file. Truncating the log purges the file of inactive, committed, transactions and enables the SQL Server database engine to reuse this part of the transaction log. However, you should be careful when truncating the transaction log as doing so may affect the recoverability of the database in the event of a failure. Generally, log truncation is managed as part of a backup strategy.
Lesson 2
Managing Storage for System Databases SQL Server uses system databases to maintain internal metadata. Database administrators should be familiar with the SQL Server system databases and how to manage them.
Lesson Objectives After completing this lesson, you will be able to:
Describe each of the system databases in a SQL Server instance.
Move system database files.
Configure tempdb.
SQL Server System Databases In addition to the user databases that you create for applications, a SQL Server instance always contains five system databases—master, msdb, model, tempdb, and resource. These databases contain important metadata that is used internally by SQL Server, and you cannot drop any of them.
master The master database contains all system-wide information. Anything that is defined at the server instance level is typically stored in the master database. If the master database is damaged or corrupted, SQL Server will not start, so it is imperative to back it up on a regular basis.
msdb
The msdb database holds information about database maintenance tasks, and in particular it contains information used by the SQL Server Agent for maintenance automation, including jobs, operators, and alerts. It is also important to regularly back up the msdb database, to ensure that jobs, schedules, history for backups, restores, and maintenance plans are not lost. In earlier versions of SQL Server, SQL Server Integration Services (SSIS) packages were often stored within the msdb database. In SQL Server 2014, you should store them in the dedicated SSIS catalog database instead.
model
The model database is the template on which all user databases are established. Any new database uses the model database as a template. If you create any objects in the model database, they will then be present in all new databases on the server instance. Many sites never modify the model database. Note that, even though the model database does not seem overly important, SQL Server will not start if the model database is not present.
tempdb
The tempdb database holds temporary data. SQL Server re-creates this database every time it starts, so there is no need to perform a backup. In fact, there is no option to perform a backup of the tempdb database.
resource
The resource database is a read-only hidden database that contains system objects mapped to the sys schema in every database. This database also holds all system stored procedures, system views and system functions. In SQL Server versions before SQL Server 2005, these objects were defined in the master database.
Moving System Databases All system databases, except the resource database, can be moved to new locations to help balance I/O load. However, you need to approach moving system databases with caution; if this is performed incorrectly, SQL Server may be unable to start.
Moving the msdb, model, and tempdb Databases To move the msdb, model, and tempdb databases, perform the following steps (see the example after these steps):
1. For each file to be moved, execute the ALTER DATABASE … MODIFY FILE statement.
2. Stop the instance of SQL Server.
3. Move the files to the new location (this step is not necessary for tempdb, as its files are re-created automatically on startup).
4. Restart the instance of SQL Server.
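As an illustration of step 1, the following statements (a sketch that assumes tempdb's default logical file names, tempdev and templog, and a hypothetical T: volume) relocate the tempdb files; the move takes effect when the instance is restarted:

Moving the tempdb files

ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'T:\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'T:\templog.ldf');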
Moving the master Database
The process for moving the master database is different from the process for other databases. To move the master database, perform the following steps (an example of the edited startup parameters appears after these steps):
1. Open SQL Server Configuration Manager.
2. In the SQL Server Services node, right-click the instance of SQL Server, click Properties, and then click the Startup Parameters tab.
3. Edit the Startup Parameters values to point to the planned location for the master database data (-d parameter) and log (-l parameter) files.
4. Stop the instance of SQL Server.
5. Move the master.mdf and mastlog.ldf files to the new location.
6. Restart the instance of SQL Server.
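For example, startup parameters pointing to hypothetical M:\SystemData and L:\SystemLogs folders would resemble the following; note that each value is written with no space between the switch and the path:

-dM:\SystemData\master.mdf
-lL:\SystemLogs\mastlog.ldf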
Considerations for tempdb The performance of the tempdb database is critical to the overall performance of most SQL Server installations. The database consists of the following objects:
Internal objects Internal objects are used by SQL Server for its own operations. They include work tables for cursor or spool operations, temporary large object storage, work files for hash join or hash aggregate operations, and intermediate sort results.
Note: Working with internal objects is an advanced concept beyond the scope of this course.
Row versions
Transactions that are associated with snapshot-related transaction isolation levels can cause alternate versions of rows to be briefly maintained in a special row version store within tempdb. Row versions can also be produced by other features, such as online index rebuilds, Multiple Active Result Sets (MARS), and triggers.
User objects
Most objects that reside in the tempdb database are user-generated and consist of temporary tables, table variables, result sets of multi-statement table-valued functions, and other temporary row sets.
Planning tempdb Location and Size
By default, the data and log files for tempdb are stored in the same location as the files for all other system databases. If your SQL Server instance must support database workloads that make extensive use of temporary objects, you should consider moving tempdb to a dedicated volume to avoid fragmentation of data files, and set its initial size based on how much it is likely to be used. You can leave autogrowth enabled, but set the growth increment to be quite large to ensure that performance is not interrupted by frequent growth of tempdb. You can choose the location of tempdb files during installation, and you can move them later if required.
Because tempdb is used for so many purposes, it is difficult to predict its required size in advance. You should carefully test and monitor the sizes of your tempdb database in real-life scenarios for new installations. Running out of disk space in the tempdb database can cause significant disruptions in the SQL Server production environment and prevent applications that are running from completing their operations. You can use the sys.dm_db_file_space_usage dynamic management view to monitor the disk space that the files are using. Additionally, to monitor the page allocation or deallocation activity in tempdb at the session or task level, you can use the sys.dm_db_session_space_usage and sys.dm_db_task_space_usage dynamic management views. By default, the tempdb database automatically grows as space is required because the MAXSIZE of the files is set to UNLIMITED. Therefore, tempdb can continue growing until space on the disk that contains it is exhausted.
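For example, the following query (a sketch you can adapt) summarizes how tempdb space is currently split between user objects, internal objects, the version store, and free space; page counts are multiplied by 8 to report kilobytes:

Monitoring tempdb space usage

SELECT SUM(user_object_reserved_page_count) * 8 AS user_objects_kb,
       SUM(internal_object_reserved_page_count) * 8 AS internal_objects_kb,
       SUM(version_store_reserved_page_count) * 8 AS version_store_kb,
       SUM(unallocated_extent_page_count) * 8 AS free_space_kb
FROM tempdb.sys.dm_db_file_space_usage;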
Using Multiple Files
Increasing the number of files in tempdb can overcome I/O restrictions and avoid latch contention during page free space (PFS) scans as temporary objects are created and dropped, resulting in improved overall performance. However, do not create too many files, as this can degrade performance. As a general rule, 0.25 to 1 file per processor core is advised, with the ratio lower as the number of cores on the system increases. However, the optimal configuration can only be identified through testing under realistic workloads.
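As a sketch, a secondary tempdb data file can be added with a statement such as the following; the logical name tempdev2 and the T:\tempdb2.ndf path are hypothetical values, and the file should match the size and growth settings of the existing files:

ALTER DATABASE tempdb
ADD FILE (NAME = tempdev2, FILENAME = 'T:\tempdb2.ndf', SIZE = 1GB, FILEGROWTH = 512MB);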
Demonstration: Moving tempdb Files In this demonstration, you will see how to:
Move tempdb files.
Demonstration Steps
Move tempdb files
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod03 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows authentication.
4. In Object Explorer, expand Databases, expand System Databases, and then right-click tempdb and click Properties.
5. In the Database Properties dialog box, on the Files page, note the current files and their location. Then click Cancel.
6. Open the MovingTempdb.sql script file in the D:\Demofiles\Mod03 folder.
7. View the code in the script, and then click Execute. Note the message that is displayed after the code has run.
8. View the contents of T:\ and note that no files have been created in that location, because the SQL Server service has not yet been restarted.
9. In Object Explorer, right-click MIA-SQL and click Restart. When prompted to allow changes, to restart the service, and to stop the dependent SQL Server Agent service, click Yes.
10. View the contents of T:\ and note that the tempdb.mdf and tempdb.ldf files have been moved to this location.
11. Keep SQL Server Management Studio open for the next demonstration.
Lesson 3
Managing Storage for User Databases
User databases are non-system databases that you create for applications. Creating databases is a core competency for database administrators working with SQL Server. As well as understanding how to create them, you need to be aware of the impact of file initialization options and know how to alter existing databases. When creating databases, you also need to consider where the data and logs will be stored on the file system. You may also want to change this or provide additional storage when the database is in use. When databases become larger, there is a need to allocate the data across different volumes, rather than storing it in a single large disk volume. This allocation of data is configured using filegroups and used to address both performance and ongoing management needs within databases.
Lesson Objectives After completing this lesson, you will be able to:
Create user databases.
Configure database options.
Alter databases.
Manage database files.
Describe key features of filegroups.
Create and manage filegroups.
Creating User Databases You create databases by using either the user interface in SQL Server Management Studio (SSMS) or the CREATE DATABASE command in Transact-SQL. The CREATE DATABASE command offers more flexible options, but the user interface can be easier to use. This topic will concentrate on using the CREATE DATABASE command, but the information is equally applicable to the options available in the user interface in SSMS.
CREATE DATABASE
Database names must be unique within an instance of SQL Server and comply with the rules for identifiers. A database name is of the data type sysname, which is defined as nvarchar(128). This means that a database name can contain up to 128 characters, each drawn from the double-byte Unicode character set. However, although database names can be quite long, very long names quickly become awkward to work with.
Data Files
As discussed earlier in this module, a database must have a single primary data file and one log file. The ON and LOG ON clauses of the CREATE DATABASE command specify the name and path to use.
In the following code, a database named Sales is being created, comprising two files—a primary data file located at M:\Data\Sales.mdf and a log file located at L:\Logs\Sales.ldf:

Using the CREATE DATABASE statement

CREATE DATABASE Sales
ON
(NAME = Sales_dat,
 FILENAME = 'M:\Data\Sales.mdf',
 SIZE = 100MB,
 MAXSIZE = 500MB,
 FILEGROWTH = 20%)
LOG ON
(NAME = Sales_log,
 FILENAME = 'L:\Logs\Sales.ldf',
 SIZE = 20MB,
 MAXSIZE = UNLIMITED,
 FILEGROWTH = 10MB);
Each file includes a logical file name as well as a physical file path. Because operations in SQL Server use the logical filename to reference the file, the logical file name must be unique within each database.
In this example, the primary data file has an initial file size of 100 MB and a maximum file size of 500 MB. It will grow by 20 percent of its current size whenever autogrowth needs to occur. The log file has an initial file size of 20 MB and has no limit on maximum file size. Each time it needs to autogrow, it will grow by a fixed 10 MB allocation.
Collations and Default Values
If required, a specific collation can be allocated at the database level. If no collation is specified, it will default to the collation that was specified for the server instance during SQL Server installation. Keeping individual databases with the same collation as the server is considered a best practice. While it is possible to create a database by providing just the database name, this results in a database that is based on the model database—with the data and log files in the default locations—which is unlikely to be the configuration that you require.
Deleting Databases To delete (or “drop”) a database, right-click it in Object Explorer and click Delete, or use the DROP DATABASE Transact-SQL statement. Dropping a database automatically deletes all of its files. The following code example drops the Sales database:

Dropping a database

DROP DATABASE Sales;
Configuring Database Options Each database has a set of options that you can configure. These options are unique to each database and changing them for one database will not impact on any others. All database options are initially set from the configuration of the model database when you create a database. You can change them by using the SET clause of the ALTER DATABASE statement or by using the Properties page for each database in SSMS.
Categories of Options There are several categories of database options:
Auto options. Control certain automatic behaviors. As a general guideline, Auto Close and Auto Shrink should be turned off on most systems but Auto Create Statistics and Auto Update Statistics should be turned on.
Cursor options. Control cursor behavior and scope. In general, the use of cursors when working with SQL Server is not recommended, apart from particular applications such as utilities. Cursors are not discussed further in this course but it should be noted that their overuse is a common cause of performance issues.
Database availability options. Control whether the database is online or offline, who can connect to it, and whether or not it is in read-only mode.
Maintenance and recovery options.
o Recovery model. Database recovery models will be discussed in Module 4 of this course.
o Page verify. Early versions of SQL Server offered an option called Torn Page Detection. This option caused SQL Server to write a small bitmap across each disk drive sector within a database page. There are 512 bytes per sector, meaning that there are 16 sectors per database page (8 KB). This was a fairly crude yet reasonably effective way to detect a situation where only some of the sectors required to write a page were in fact written. In SQL Server 2005, a new CHECKSUM verification option was added. The use of this option causes SQL Server to calculate and add a checksum to each page as it is written and to recheck the checksum whenever a page is retrieved from disk.
Note: Page checksums are only added the next time that any page is written. Enabling the option does not cause every page in the database to be rewritten with a checksum.
Demonstration: Creating Databases In this demonstration, you will see how to:
Create a database by using SQL Server Management Studio.
Create a database by using the CREATE DATABASE statement.
Demonstration Steps
Create a Database by Using SQL Server Management Studio
1. Ensure that you have completed the previous demonstration. If not, start the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines, log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd, and run D:\Demofiles\Mod03\Setup.cmd as Administrator.
2. If SQL Server Management Studio is not open, start it and connect to the MIA-SQL database engine using Windows authentication.
3. In Object Explorer, right-click Databases and click New Database.
4. In the Database name box, type DemoDB1.
5. In the Database files list, note the default logical names, initial size, and autogrowth settings. Then change the Path and File Name by typing the following values:
   DemoDB1
   o Path: M:\Data\
   o File Name: DemoDB1.mdf
   DemoDB1_log
   o Path: L:\Logs\
   o File Name: DemoDB1.ldf
6. Click OK to create the new database.
7. Expand the Databases folder and then right-click DemoDB1 and click Properties.
8. On the Options tab, review the database options. Then click Cancel.

Create a Database by Using the CREATE DATABASE Statement
1. In SQL Server Management Studio, open the CreatingDatabases.sql script file from the D:\Demofiles\Mod03 folder.
2. Select the code under the comment Create a database and click Execute to create a database named DemoDB2.
3. Select the code under the comment View database info and click Execute. Then view the information that is returned.
4. Keep SQL Server Management Studio open for the next demonstration.
Altering User Databases You may need to modify a database when it is in use, for example, you might need to change the name or options. You can make such modifications by using the ALTER DATABASE Transact-SQL statement or by using SSMS. You can use the ALTER DATABASE statement to modify the files and filegroups of a database, the options, and the compatibility levels.
Altering Database Options You can modify database options by using the ALTER DATABASE SET statement, specifying the option name and, where applicable, the value to use. For example, you can set a database to read-only or read-write.

ALTER DATABASE … SET

ALTER DATABASE HistoricSales
SET READ_ONLY;
Note: Many of the database set options that you configure by using the ALTER DATABASE statement can be overridden using a session level set option. This enables users or applications to execute a SET statement to configure the setting just for the current session. Additional Reading: For more information about database set options, see ALTER DATABASE SET Options (Transact-SQL) in SQL Server Books Online.
Altering Database Compatibility Options If you want your database to be compatible with a specific version of SQL Server, you can use the SET COMPATIBILITY_LEVEL option with the ALTER DATABASE statement. You can set compatibility to SQL Server 2000 and later versions. The value that you specify for the compatibility level defines which previous version it should be compatible with.

ALTER DATABASE … SET COMPATIBILITY_LEVEL

ALTER DATABASE Sales
SET COMPATIBILITY_LEVEL = 100;
The values you can use are described in the following table:

Value   Version to be compatible with
80      SQL Server 2000
90      SQL Server 2005
100     SQL Server 2008 and SQL Server 2008 R2
110     SQL Server 2012
120     SQL Server 2014
Managing Database Files You may need to modify the structure of a database when it is in operation. The most common requirement is to add additional space by either expanding existing files or adding additional files. You might also need to drop a file. SQL Server will not allow you to drop a file that is currently in use in the database. Dropping a file is a two-step process—first the file has to be emptied, and then it can be removed.
Adding Space to a Database
By default, SQL Server automatically expands a database according to growth parameters that you define when you create the database files. You can also manually expand a database by allocating additional space to an existing database file or by creating a new file. You may have to expand the data or transaction log space if the existing files are becoming full.
If a database has already exhausted the space allocated to it and it cannot grow a data file automatically, error 1105 is raised. (The equivalent error number for the inability to grow a transaction log file is 9002.) This can happen if the database is not set to grow automatically or if there is not enough disk space on the hard drive.
Adding Files
One option for expanding the size of a database is to add files. You can do this by using either SSMS or by using the ALTER DATABASE … ADD FILE statement.
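For example, the following sketch adds a secondary data file to the Sales database used earlier in this lesson; the logical name Sales_dat2 and the N:\Data path are hypothetical values:

Adding a data file

ALTER DATABASE Sales
ADD FILE
(NAME = Sales_dat2,
 FILENAME = 'N:\Data\Sales2.ndf',
 SIZE = 500MB,
 FILEGROWTH = 100MB);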
Expanding Files When expanding a database, you must increase its size by at least 1 MB. Ideally, any file size increase should be much larger than this. Increases of 100 MB or more are common.
When you expand a database, the new space is immediately made available to either the data or transaction log file, depending on which file was expanded. When you expand a database, you should specify the maximum size to which the file is permitted to grow. This prevents the file from growing until disk space is exhausted. To specify a maximum size for the file, use the MAXSIZE parameter of the ALTER DATABASE statement, or use the Restrict filegrowth (MB) option when you use the Properties dialog box in SSMS to expand the database.
Transaction Log
If the transaction log is not set up to expand automatically, it can run out of space when certain types of activity occur in the database. In addition to expanding the size of the transaction log, the log file can be truncated. Truncating the log purges the file of inactive, committed transactions and allows the SQL Server database engine to reuse this unused part of the transaction log. If there are active transactions, the log file might not be able to be truncated and expanding it may be the only available option.
Dropping Database Files
Before you drop a database file, it must be empty of data. You can empty the file by using the EMPTYFILE option of the DBCC SHRINKFILE command, and then remove the file by using the ALTER DATABASE statement.
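The following is a minimal sketch, assuming a file with the hypothetical logical name SalesData3 in the Sales database:

Emptying and Removing a Database File
USE Sales;
GO
-- Move any data in the file to the other files in the same filegroup.
DBCC SHRINKFILE ('SalesData3', EMPTYFILE);
GO
-- Remove the now-empty file from the database.
ALTER DATABASE Sales REMOVE FILE SalesData3;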
Shrinking a Database
You can reduce the size of the files in a database by removing unused pages. Although the database engine reuses space effectively, there are times when a file no longer needs to be as large as it once was. Shrinking the file may then become necessary, but it should be considered a rarely used option. You can shrink both data and transaction log files, either manually (as a group or individually), or by setting the AUTO_SHRINK database option so that the database engine shrinks the files automatically in the background.
Methods for Shrinking
You can shrink a database or specific database files by using the DBCC SHRINKDATABASE and DBCC SHRINKFILE commands. DBCC SHRINKFILE is preferred because it provides much more control over the operation than DBCC SHRINKDATABASE.

Note: Shrinking a file usually involves moving pages within the files, which can take a long time, and regular shrinking of files tends to lead to regrowth of files. For this reason, even though SQL Server provides an option to automatically shrink databases, it should only rarely be used; enabling this option typically causes substantial fragmentation on the disk subsystem. It is best practice to only perform shrink operations if absolutely necessary.
TRUNCATEONLY
TRUNCATEONLY is an additional option of DBCC SHRINKFILE that releases all free space at the end of the file to the operating system, but does not perform any page movement inside the file. The data file is shrunk only to the last allocated extent. This option often does not shrink the file as effectively as a standard DBCC SHRINKFILE operation, but it is much faster and is less likely to cause substantial fragmentation.
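The following is a minimal sketch, assuming a data file with the hypothetical logical name SalesData3:

Shrinking a File with TRUNCATEONLY
USE Sales;
GO
-- Release unused space at the end of the file without moving any pages.
DBCC SHRINKFILE ('SalesData3', TRUNCATEONLY);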
Introduction to Filegroups
As you have seen in this module, databases consist of at least two files: a primary data file and a log file. To improve performance and manageability of large databases, you can add secondary files. Data files are always defined within a filegroup. Filegroups are named collections of data files that you can use to simplify data placement and administrative tasks such as backup and restore operations. Using filegroups may also improve database performance in some scenarios, because they enable you to spread database objects such as tables across multiple storage volumes.
Every database has a primary filegroup (named PRIMARY), and when you add secondary data files to the database, they automatically become part of the primary filegroup, unless you specify a different filegroup. When planning to use filegroups, consider the following facts:
A data file can belong to only one filegroup (log files are not part of any filegroup).
A filegroup can only be used by one database.
Using Filegroups for Data Manageability
You can use filegroups to control the placement of data based on management considerations. For example, if a database contains tables of read-only data, you can place these tables in a dedicated filegroup that is set to be read-only.
Additionally, you can back up and restore files and filegroups individually. This enables you to achieve faster backup times, because you only need to back up the files or filegroups that have changed, instead of backing up the entire database. Similarly, you can achieve efficiencies when it comes to restoring data. SQL Server also supports partial backups. A partial backup enables you to separately back up read-only and read/write filegroups. You can then use these backups to perform a piecemeal restore, which enables you to restore individual filegroups one by one, and bring the database back online filegroup by filegroup.

Note: You will learn more about partial backups and piecemeal restores later in this course.
Using Filegroups for Performance
When you create tables, you can specify a filegroup for the table data. If you do not specify a filegroup, the default filegroup will be used. The default filegroup is the primary filegroup, unless you configure a different filegroup as the default. By creating tables on specific filegroups, you can isolate heavily accessed tables from other tables, reducing contention and boosting performance.

When a filegroup contains multiple files, SQL Server can write to all of the files simultaneously, and it populates them by using a proportional fill strategy. Files that are the same size will have the same amount of data written to them, ensuring that they fill at a consistent rate. Files that are different sizes will have different amounts of data written to them to ensure that they fill up at a proportionally consistent rate.

The fact that SQL Server can write to filegroup files simultaneously enables you to use a filegroup to implement a simple form of striping. You can create a filegroup that contains two or more files, each of which is on a separate disk. When SQL Server writes to the filegroup, it can use the separate I/O channel for each disk concurrently, which results in faster write times.

Note: Generally, you should use filegroups primarily to improve manageability and rely on storage device configuration for I/O performance. However, when a striped storage volume is not available, using a filegroup to spread data files across physical disks can be an effective alternative.
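The following is a minimal sketch of placing a table on a specific filegroup by using the ON clause, assuming a hypothetical table and the Archive filegroup from the example later in this lesson:

Creating a Table on a Specific Filegroup
CREATE TABLE dbo.SalesHistory
(
    OrderID int NOT NULL,
    OrderDate date NOT NULL
)
ON Archive;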
Creating and Managing Filegroups
As a database administrator, you may be required to manage large databases that contain large numbers of files. Creating and managing filegroups makes it easier to control the physical storage of the database files and enables you to keep files with similar manageability or access requirements together.

Creating Filegroups
You can create additional filegroups and assign files to them as you create a database, or you can add new filegroups and files to an existing database.
In the following code example, the Sales database includes a primary filegroup containing a single file named sales.mdf, a filegroup named Transactions containing two files named sales_tran1.ndf and sales_tran2.ndf, and a filegroup named Archive containing two files named sales_archive1.ndf and sales_archive2.ndf.

Creating a Database with Multiple Filegroups
CREATE DATABASE Sales
ON PRIMARY
(NAME = 'Sales', FILENAME = 'D:\Data\sales.mdf', SIZE = 5MB, FILEGROWTH = 1MB),
FILEGROUP Transactions
(NAME = 'SalesTrans1', FILENAME = 'M:\Data\sales_tran1.ndf', SIZE = 50MB, FILEGROWTH = 10MB),
(NAME = 'SalesTrans2', FILENAME = 'N:\Data\sales_tran2.ndf', SIZE = 50MB, FILEGROWTH = 10MB),
FILEGROUP Archive
(NAME = 'HistoricData1', FILENAME = 'O:\Data\sales_archive1.ndf', SIZE = 200MB, FILEGROWTH = 10MB),
(NAME = 'HistoricData2', FILENAME = 'P:\Data\sales_archive2.ndf', SIZE = 200MB, FILEGROWTH = 10MB)
LOG ON
(NAME = 'Sales_log', FILENAME = 'L:\Logs\sales.ldf', SIZE = 10MB, FILEGROWTH = 1MB);
To add filegroups to an existing database, you can use the ALTER DATABASE … ADD FILEGROUP statement. You can then use the ALTER DATABASE … ADD FILE statement to add files to the new filegroup.
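The following is a minimal sketch, assuming the Sales database and a hypothetical filegroup named Staging with a hypothetical file path:

Adding a Filegroup and a File to an Existing Database
ALTER DATABASE Sales
ADD FILEGROUP Staging;
GO
ALTER DATABASE Sales
ADD FILE
(NAME = 'SalesStaging1', FILENAME = 'O:\Data\sales_staging1.ndf', SIZE = 50MB, FILEGROWTH = 10MB)
TO FILEGROUP Staging;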
Setting the Default Filegroup
Unless you specify otherwise, the primary filegroup is the default filegroup for a database. Any objects created without an explicit ON clause are created in the default filegroup. A recommended practice is to use the primary filegroup for internal system objects (which are created automatically with the database), and to add a secondary filegroup for user objects. If you adopt this practice, you should make the secondary filegroup the default filegroup for the database. You can do this by modifying the properties of the database in SSMS, or by using the ALTER DATABASE … MODIFY FILEGROUP statement. The following code example changes the default filegroup in the Sales database to the Transactions filegroup:

Changing the Default Filegroup
ALTER DATABASE Sales
MODIFY FILEGROUP Transactions DEFAULT;
Using Read-Only Filegroups
When a database contains a mixture of read/write and read-only data, you can use read-only filegroups to store tables containing data that will not be modified. This approach is particularly useful in large databases, such as data warehouses, as it enables you to employ a backup strategy that includes a single backup of read-only data, and regular backups that include only volatile data. To make a filegroup read-only, use the ALTER DATABASE … MODIFY FILEGROUP statement with the READONLY option. The following code example makes the Archive filegroup read-only:

Making a Filegroup Read-Only
ALTER DATABASE Sales
MODIFY FILEGROUP Archive READONLY;
To make a read-only filegroup writable, use the ALTER DATABASE … MODIFY FILEGROUP statement with the READWRITE option.
Lesson 4
Moving Database Files
As well as adding and removing files from a database, you may sometimes need to move database files, or even whole databases. You may also need to copy a database.
Lesson Objectives After completing this lesson, you will be able to:
Move user database files.
Detach and attach databases.
Use the Copy Database Wizard.
Moving User Database Files
You can move database files to a different location by using SSMS or the Transact-SQL ALTER DATABASE statement.

Note: Before you can move database files, you need to take the database offline.

When you move user database files, you need to use the logical name of the file, defined when you created the database. You can use the sys.database_files view to discover the logical names of the files in a database.
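The following is a minimal sketch that returns the logical and physical names of the files in the current database:

Viewing Logical File Names
USE AdventureWorks;
GO
SELECT name, physical_name
FROM sys.database_files;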
Using the ALTER DATABASE Statement
You can use the ALTER DATABASE statement to move database files within the same instance of SQL Server by including the MODIFY FILE clause in the statement. The following example shows how to move the data file for the AdventureWorks database:

Moving Database Files by Using the ALTER DATABASE Statement
ALTER DATABASE AdventureWorks SET OFFLINE;
-- Move the files on the file system before bringing the database back online.
ALTER DATABASE AdventureWorks
MODIFY FILE (NAME = AWDataFile, FILENAME = 'C:\AWDataFile.mdf');
ALTER DATABASE AdventureWorks SET ONLINE;
Detaching and Attaching Databases
SQL Server provides detach and attach functionality that you can use to remove a database from one instance of the database engine and add it to another. This is a commonly used technique for moving a database between instances.
Detaching Databases
You detach databases from an instance of SQL Server by using SSMS or the sp_detach_db stored procedure. Detaching a database does not remove the data from the data files or remove the data files from the server. It simply removes the metadata entries for that database from the system databases on that SQL Server instance. The detached database then no longer appears in the list of databases in SSMS or in the results of the sys.databases view. After you have detached a database, you can move or copy its files, and then attach it to another instance of SQL Server.
UPDATE STATISTICS
SQL Server maintains a set of statistics on the distribution of data in tables and indexes. As part of the detach process, you can specify an option to perform an UPDATE STATISTICS operation on table and index statistics. While this is useful if you are going to reattach the database as a read-only database, in general it is not a good option to use while detaching a database.
Detachable Databases
Not all databases can be detached. Databases that are replicated, mirrored, or in a suspect state cannot be detached.

Note: Replicated and mirrored databases are advanced topics beyond the scope of this course.
A more common problem that prevents a database from being detached is that open connections to the database exist at the time you attempt the operation. You must ensure that all connections are dropped before detaching the database. SSMS offers an option to force connections to be dropped during this operation.
Attaching Databases
SSMS also provides you with the ability to attach databases. You can also do this by using the CREATE DATABASE … FOR ATTACH statement.

Note: You may find many references to the sp_attach_db and sp_attach_single_file_db stored procedures. This older syntax has been replaced by the FOR ATTACH option of the CREATE DATABASE statement. Note also that there is no equivalent replacement for the sp_detach_db procedure.
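The following is a minimal sketch of moving a database between instances by detaching it and reattaching its files, assuming hypothetical file paths:

Detaching and Attaching a Database
EXEC sp_detach_db @dbname = N'Sales', @skipchecks = 'true';
GO
-- Copy or move the files, then attach them (typically on another instance).
CREATE DATABASE Sales
ON (FILENAME = 'M:\Data\sales.mdf'),
   (FILENAME = 'L:\Logs\sales.ldf')
FOR ATTACH;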
A common problem when databases are reattached is that database users can become orphaned. You will see how to deal with this problem in a later module.
Demonstration: Detaching and Attaching a Database In this demonstration, you will see how to:
Detach a database.
Attach a database.
Demonstration Steps Detach a Database
1. Ensure that you have completed the previous demonstrations in this module, and that you have created a database named DemoDB2.
2. In Object Explorer, right-click the Databases folder and click Refresh, and verify that the DemoDB2 database is listed.
3. Right-click DemoDB2, point to Tasks, and click Detach. Then in the Detach Database dialog box, select Drop Connections and Update Statistics, and click OK.
4. View the M:\Data and L:\Logs folders and verify that the DemoDB2.mdf and DemoDB2.ldf files have not been deleted.
Attach a Database
1. In SQL Server Management Studio, in Object Explorer, in the Connect drop-down list, click Database Engine. Then connect to the MIA-SQL\SQL2 database engine using Windows authentication.
2. In Object Explorer, under MIA-SQL\SQL2, expand Databases and view the databases on this instance.
3. In Object Explorer, under MIA-SQL\SQL2, right-click Databases and click Attach.
4. In the Attach Databases dialog box, click Add. Then in the Locate Database Files dialog box, select the M:\Data\DemoDB2.mdf database file and click OK.
5. In the Attach Databases dialog box, after you have added the primary database (.mdf) file, note that all of the database files are listed. Then click OK.
6. In Object Explorer, under MIA-SQL\SQL2, under Databases, verify that DemoDB2 is now listed.
Lesson 5
Configuring the Buffer Pool Extension
The topics in this module so far have discussed the storage of system and user database files. However, SQL Server 2014 also supports the use of high-performance storage devices, such as solid-state disks (SSDs), to extend the buffer pool (the in-memory cache of data pages).
Lesson Objectives After completing this lesson, you will be able to:
Describe the Buffer Pool Extension.
Explain the considerations needed when working with the Buffer Pool Extension.
Introduction to the Buffer Pool Extension
SQL Server uses a buffer pool of memory to cache data pages, reducing I/O demand and improving overall performance. As database workloads intensify over time, you can add more memory to maintain performance, but this solution is not always practical. Adding storage is often easier than adding memory, so SQL Server 2014 introduces the Buffer Pool Extension to enable you to use fast storage devices for buffer pool pages.
The Buffer Pool Extension is an extension for the SQL Server buffer pool that targets non-volatile storage devices, such as solid-state disk drives (SSDs). When the Buffer Pool Extension is enabled, SQL Server uses it for data pages in a similar way to the main buffer pool memory.
Only clean pages, containing data that is committed, are stored in the Buffer Pool Extension, ensuring that there is no risk of data loss in the event of a storage device failure. Additionally, if a storage device containing the Buffer Pool Extension fails, the extension is automatically disabled. You can easily re-enable the extension when the failed storage device has been replaced. The Buffer Pool Extension provides the following benefits:
The performance of online transaction processing (OLTP) applications with a high volume of read operations can improve significantly.
SSD devices are often less expensive per megabyte than physical memory, making this a cost-effective way to improve performance in I/O-bound databases.
The Buffer Pool Extension can be enabled easily, and requires no changes to existing applications.
Note: The Buffer Pool Extension is only available in 64-bit installations of SQL Server 2014 Enterprise Edition.
Considerations for Using the Buffer Pool Extension
The Buffer Pool Extension has been shown to improve the performance of OLTP databases. While database workloads can vary significantly, using the Buffer Pool Extension is typically beneficial when the following conditions are true:
The I/O workload consists of OLTP operations with a high volume of reads.
The database server contains up to 32 GB of physical memory.
The Buffer Pool Extension is configured to use a file that is between four and ten times the amount of physical memory in the server.
The Buffer Pool Extension file is stored on high throughput SSD storage.
Scenarios where the Buffer Pool Extension is unlikely to significantly improve performance include:
Data warehouse workloads.
OLTP workloads with a high volume of write operations.
Servers on which more than 64 GB of physical memory is available to SQL Server.
Working with the Buffer Pool Extension
To resize or relocate the Buffer Pool Extension file, you must disable the Buffer Pool Extension, and then re-enable it with the required configuration. When you disable the Buffer Pool Extension, SQL Server will have less buffer memory available, which may cause an immediate increase in memory pressure and I/O, resulting in performance degradation. You should therefore plan reconfiguration of the Buffer Pool Extension carefully to minimize disruption to application users. You can view the status of the buffer pool extension by querying the sys.dm_os_buffer_pool_extension_configuration dynamic management view. You can monitor its usage by querying the sys.dm_os_buffer_descriptors dynamic management view.
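The following is a minimal sketch of both queries:

Inspecting the Buffer Pool Extension
-- View the current configuration and state of the extension.
SELECT * FROM sys.dm_os_buffer_pool_extension_configuration;
GO
-- Count buffered pages, split by whether they are in the extension.
SELECT is_in_bpool_extension, COUNT(*) AS page_count
FROM sys.dm_os_buffer_descriptors
GROUP BY is_in_bpool_extension;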
Configuring the Buffer Pool Extension
To enable the Buffer Pool Extension, you must use the ALTER SERVER CONFIGURATION statement and specify the file name and size to be used for the Buffer Pool Extension file. The following code sample enables the Buffer Pool Extension with a size of 50 GB:

Enabling the Buffer Pool Extension
ALTER SERVER CONFIGURATION
SET BUFFER POOL EXTENSION ON
(FILENAME = 'E:\SSDCACHE\MYCACHE.BPE', SIZE = 50 GB);
To disable the Buffer Pool Extension, use the ALTER SERVER CONFIGURATION statement with the SET BUFFER POOL EXTENSION OFF clause.
Demonstration: Configuring the Buffer Pool Extension In this demonstration, you will see how to:
Enable the Buffer Pool Extension.
Verify Buffer Pool Extension configuration.
Disable the Buffer Pool Extension.
Demonstration Steps
Enable the Buffer Pool Extension
1. Ensure that you have completed the previous demonstration. If not, start the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines, log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd, and run D:\Demofiles\Mod03\Setup.cmd as Administrator.
2. If SQL Server Management Studio is not open, start it and connect to the MIA-SQL database engine using Windows authentication.
3. Open the script file ConfiguringBPE.sql in the D:\Demofiles\Mod03 folder.
4. Review the code under the comment Enable buffer pool extension, and note that it creates a Buffer Pool Extension file named MyCache.bpe in S:\. On a production system, this file location would typically be on an SSD device.
5. Use File Explorer to view the contents of the S:\ folder and note that no MyCache.bpe file exists.
6. In SQL Server Management Studio, select the code under the comment Enable buffer pool extension, and click Execute.

Verify Buffer Pool Extension Configuration
1. View the contents of the S:\ folder and note that the MyCache.bpe file now exists.
2. In SQL Server Management Studio, select the code under the comment View buffer pool extension details, and click Execute. Then note the information about the Buffer Pool Extension that is returned from the dynamic management view.
3. Select the code under the comment Monitor buffer pool extension, and click Execute. This dynamic management view shows all buffered pages, and the is_in_bpool_extension column indicates pages that are stored in the Buffer Pool Extension.
Disable the Buffer Pool Extension
1. In SQL Server Management Studio, select the code under the comment Disable buffer pool extension, and click Execute.
2. Select the code under the comment View buffer pool extension details, and click Execute. Then note the information about the Buffer Pool Extension that is returned from the dynamic management view.
3. Use File Explorer to view the contents of the S:\ folder and note that the MyCache.bpe file has been deleted.
Lab: Managing Database Storage
Scenario
As a database administrator at Adventure Works Cycles, you are responsible for managing system and user databases on the MIA-SQL instance of SQL Server. There are several new applications that require databases, which you must create and configure.
Objectives After completing this lab, you will be able to:
Configure tempdb.
Create databases.
Attach a database.
Estimated Time: 45 minutes
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
Exercise 1: Configuring tempdb Storage
Scenario
The application development team has notified you that some of the new applications will make extensive use of temporary objects. To support this requirement while minimizing I/O contention, you have decided to move the tempdb database files to a dedicated storage volume and increase the size of the data and log files.

The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. Configure tempdb Files

Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Run Setup.cmd in the D:\Labfiles\Lab03\Starter folder as Administrator.

Task 2: Configure tempdb Files
1. Use SQL Server Management Studio to view the properties of the tempdb system database on the MIA-SQL database engine instance, and note the current location and size of the database files.
2. Alter tempdb so that the database files match the following specification:
o tempdev: Size 10MB; File growth 5MB; Maximum size Unlimited; File name T:\tempdb.mdf
o templog: Size 5MB; File growth 1MB; Maximum size Unlimited; File name T:\templog.ldf
3. Restart the SQL Server service and verify that the changes have taken effect.
Results: After this exercise, you should have inspected and configured the tempdb database.
Exercise 2: Creating Databases
Scenario
The following two applications have been developed, and these applications require databases:
The Human Resources application is a simple solution for managing employee data. It is not expected to be used heavily or to grow substantially.
The Internet Sales application is a new e-commerce website, and must support a heavy workload that will capture a large volume of sales order data.
The main tasks for this exercise are as follows:
1. Create the HumanResources Database
2. Create the InternetSales Database
3. View Data File Information
Task 1: Create the HumanResources Database
1. Create a new database named HumanResources with the following database files:

Logical Name        Filegroup   Initial Size   Growth            Path
HumanResources      PRIMARY     50 MB          5MB / Unlimited   M:\Data\HumanResources.mdf
HumanResources_log  N/A (log)   5MB            1MB / Unlimited   L:\Logs\HumanResources.ldf
Task 2: Create the InternetSales Database
1. Create a new database named InternetSales with the following database files and filegroups:

Logical Name         Filegroup   Initial Size   Growth             Path
InternetSales        PRIMARY     5 MB           1 MB / Unlimited   M:\Data\InternetSales.mdf
InternetSales_data1  SalesData   100 MB         10 MB / Unlimited  M:\Data\InternetSales_data1.ndf
InternetSales_data2  SalesData   100 MB         10 MB / Unlimited  N:\Data\InternetSales_data2.ndf
InternetSales_log    N/A (log)   2 MB           10% / Unlimited    L:\Logs\InternetSales.ldf

2. Make the SalesData filegroup the default filegroup.
Task 3: View Data File Information
1. In SQL Server Management Studio, open the ViewFileInfo.sql script file in the D:\Labfiles\Lab03\Starter folder.
2. Execute the code under the comment View page usage and note the UsedPages and TotalPages values for the SalesData filegroup.
3. Execute the code under the comments Create a table on the SalesData filegroup and Insert 10,000 rows.
4. Execute the code under the comment View page usage again and verify that the data in the table is spread across the files in the filegroup.
Results: After this exercise, you should have created a new HumanResources database and an InternetSales database that includes multiple filegroups.
Exercise 3: Attaching a Database
Scenario
Business analysts at Adventure Works Cycles have developed a data warehouse that must be hosted on MIA-SQL. The analysts have supplied you with the database files so that you can attach the database. The database includes multiple filegroups, including a filegroup for archive data, which should be configured as read-only.

The main tasks for this exercise are as follows:
1. Attach the AWDataWarehouse Database
2. Configure Filegroups
Task 1: Attach the AWDataWarehouse Database
1. Move AWDataWarehouse.ldf from the D:\Labfiles\Lab03\Starter\ folder to the L:\Logs\ folder, and then move the following files from the D:\Labfiles\Lab03\Starter\ folder to the M:\Data\ folder:
o AWDataWarehouse.mdf
o AWDataWarehouse_archive.ndf
o AWDataWarehouse_current.ndf
2. Attach the AWDataWarehouse database by selecting the AWDataWarehouse.mdf file and ensuring that the other files are found automatically.
Task 2: Configure Filegroups
1. View the properties of the AWDataWarehouse database and note the filegroups it contains.
2. Set the Archive filegroup to read-only.
3. View the properties of the dbo.FactInternetSales table and verify that it is stored in the Current filegroup.
4. View the properties of the dbo.FactInternetSalesArchive table and verify that it is stored in the Archive filegroup.
5. Edit the dbo.FactInternetSales table and modify a record to verify that the table is updateable.
6. Edit the dbo.FactInternetSalesArchive table and attempt to modify a record to verify that the table is read-only.
Results: After this exercise, you should have attached the AWDataWarehouse database to MIA-SQL.
Module Review and Takeaways
In this module, you have learned how to manage storage for SQL Server databases and the Buffer Pool Extension.

Best Practice: When working with database storage, consider the following best practices:
Plan and test your file layout carefully.
Separate data and log files on the physical level.
Keep the data files of a database at the same size.
Create the database at an appropriate size so that it does not have to be expanded too often.
Shrink files only if absolutely necessary.
Set a filegroup other than PRIMARY as the default filegroup.
Review Question(s)
Question: Why is it typically sufficient to have one log file in a database?
Question: Why should only temporary data be stored in the tempdb system database?
Module 4
Planning and Implementing a Backup Strategy

Contents:
Module Overview 4-1
Lesson 1: Understanding SQL Server Recovery Models 4-2
Lesson 2: Planning a Backup Strategy 4-8
Lesson 3: Backing up Databases and Transaction Logs 4-15
Lesson 4: Using Backup Options 4-24
Lesson 5: Ensuring Backup Reliability 4-29
Lab: Backing Up Databases 4-35
Module Review and Takeaways 4-41
Module Overview
One of the most important aspects of a database administrator's role is ensuring that organizational data is reliably backed up, so that it is possible to recover the data if a failure occurs. Even though the computing industry has known about the need for reliable backup strategies for decades, and has discussed it at great length, tragic stories regarding data loss are still commonplace. A further problem is that, even when the strategies in place work as designed, the outcomes still regularly fail to meet an organization's operational requirements. In this module, you will consider how to create a strategy that is aligned with organizational needs, and learn how to perform the backup operations required by that strategy.
Objectives After completing this module, you will be able to:
Describe how database transaction logs function, and how they affect database recovery.
Plan a backup strategy for a SQL Server database.
Back up databases and transaction logs.
Perform copy-only, compressed, and encrypted backups.
Maximize backup reliability.
Lesson 1
Understanding SQL Server Recovery Models
Before you can plan a backup strategy, you must understand how SQL Server uses the transaction log to maintain data consistency, and how the recovery model of the database affects transaction log operations and the available backup options.
Lesson Objectives After completing this lesson, you will be able to:
Explain how the SQL Server transaction log operates.
Describe the transaction log file structure.
Configure database recovery models.
Implement capacity planning for transaction logs.
Configure checkpoint options.
Overview of SQL Server Transaction Logs
Two of the common requirements for transaction management in database management systems are the atomicity and durability of transactions. Atomicity requires that an entire transaction is committed or that no work at all is committed. Durability requires that, once a transaction is committed, it will survive system restarts, including those caused by system failures. SQL Server uses the transaction log to ensure both the atomicity and durability of transactions.
Write-Ahead Logging
When SQL Server needs to modify the data in a database page, it first checks if the page is present in the buffer cache. If the page is not present, it is read into the buffer cache. SQL Server then modifies the page in memory, writing redo and undo information to the transaction log. While this write is occurring, the “dirty” page in memory is locked until the write to the transaction log is complete. At regular intervals, a background checkpoint process flushes the dirty pages to the database, writing all the modified data to disk.
This process is known as write-ahead logging (WAL) because all log records are written to the log before the affected dirty pages are written to the data files and the transaction is committed. The WAL protocol ensures that the database can always be set to a consistent state after a failure. This recovery process will be discussed in detail later in this course, but its effect is that transactions that were committed before the failure occurred are guaranteed to be applied to the database. Those transactions that were “in flight” at the time of the failure, where work is partially complete, are undone. Writing all changes to the log file in advance also makes it possible to roll back transactions if required.
Transaction Rollback
SQL Server can use the information in the transaction log to roll back transactions that have only been partially completed. This ensures that transactions are not left in a partially-completed state. A transaction rollback may occur because of a request from a user or client application (such as the execution of a ROLLBACK TRANSACTION statement) or because a transaction is in a partially-completed state at the time of a system failure.
Transaction Log File Structure
It is essential that there is enough information in the transaction log to process rollback requests from users or applications, and to meet the needs of other SQL Server features, such as replication and change data capture.
Transaction Log Structure and Virtual Log Files
SQL Server writes to the transaction log in chronological order, in a circular fashion. Because the file may need to grow, it is internally divided into a set of virtual log files (VLFs), which have no fixed size. When the database engine is creating or extending a log file, it dynamically determines the appropriate size for the VLFs, depending on the size of the growth that is occurring. The values in the following table show how it calculates the number of VLFs to create:

Growth Increment               Number of VLFs
Less than or equal to 64 MB    4
Between 64 MB and 1 GB         8
Greater than 1 GB              16
When a log file write reaches the end of the existing log file, SQL Server starts writing again at the beginning, overwriting the log records currently stored. This mechanism works well, providing that the previous log records in that section of the log file have already been written to the database and freed up, or “truncated”. If they have not been truncated and the data is required, SQL Server tries to grow the size of the log file. If this is not possible (for example, if it is not configured for automatic growth or the disk is full) SQL Server fails the transaction and returns an error. If it is possible to grow the log file, SQL Server allocates new virtual log files, using the auto growth size increment in the log file configuration. Note: Instant File Initialization (IFI) cannot be used with transaction log files. This means that transactions can be blocked while log file growth occurs.
Log File Truncation
Truncation occurs either as part of the backup process for a transaction log or automatically, when using certain database recovery models. The entries in the log file are logically ordered by their Log Sequence Number (LSN), with the starting point of the oldest active transaction being the MinLSN value. When the log file is truncated, only data up to the MinLSN can be truncated; any entries with an LSN greater than the MinLSN must be retained for use in any potential recovery process. More precisely, truncation frees only the virtual log files that precede the VLF containing the minimum of: the start of the last checkpoint operation, the MinLSN, and the start of the oldest transaction that is not yet replicated (if replication is in use).

Note: Replication is beyond the scope of this course, but it is important to be aware that the configuration and state of replicated data can affect transaction log truncation.
Working with Recovery Models
SQL Server supports three types of database recovery model. All models preserve data in the event of a disaster, but there are important differences that you need to consider when selecting a model for your database. Choosing the appropriate recovery model is an important part of any backup strategy. The recovery model that you select for your database will determine many factors, including:
Maintenance processing overhead.
Exposure to potential loss.
The available backup types.
When choosing a recovery model for your database, you will need to consider the size of the database, the potential maintenance overhead, and the level of acceptable risk with regards to potential data loss.
Simple Recovery Model
The simple recovery model is the easiest to administer, but the riskiest for data loss. In this model, you do not back up the transaction log so, in the event of disaster, you can only recover up to the most recent database backup. Any changes made since then will be lost. If you decide to use the simple recovery model, you should ensure that you perform regular database backups to reduce any potential loss. However, the intervals should be long enough to keep the backup overhead from affecting production work. You can include differential backups in your strategy to help reduce the overhead.
Full Recovery Model
The full recovery model is the default for a database when you install SQL Server, though you can modify this by changing the recovery model for the model database. The full recovery model provides the normal database maintenance model for databases where durability of transactions is necessary.
The full recovery model requires log backups. It fully logs all transactions and retains the transaction log records until after they are backed up. The full recovery model enables a database to be recovered to the point of failure, assuming that the tail of the log can be backed up afterwards. The full recovery model also supports an option to restore individual data pages or restore to a specific point in time.
Bulk-logged Recovery Model
The bulk-logged recovery model can reduce the transaction logging requirements for many bulk operations. It is intended solely as an add-on to the full recovery model. For example, while executing certain large-scale bulk operations such as bulk import or index creation, a database can be switched temporarily to the bulk-logged recovery model. This temporary switch can increase performance by only logging extent allocations and reduce log space consumption.
The bulk-logged recovery model still requires transaction log backups because, like the full recovery model, the bulk-logged recovery model retains transaction log records until after they are backed up. The trade-offs are bigger log backups and increased work-loss exposure because the bulk-logged recovery model does not support point-in-time recovery. Note: One potentially surprising outcome is that the log backups can often be larger than the transaction logs. This is because SQL Server retrieves the modified extents from the data files while performing a log backup for minimally-logged data.
Capacity Planning for Transaction Logs
The recovery model that you choose for a database will impact the size of the log file. When using the simple recovery model, SQL Server truncates the log after each checkpoint. In the full and bulk-logged recovery models, the log is truncated after each log backup, to ensure that an unbroken chain of backup log files exists. The truncation of a log file happens only after a log backup; there is a common misconception that a full database backup breaks this chain of log file backups, but this is not true.
Determining Log File Size
It is very difficult to calculate the size requirements for log files. As with planning other aspects of SQL Server, monitoring during realistic testing is the best indicator.
Another common misconception is that the log file of a database in simple recovery model will not grow. This is also not the case. In simple recovery model, the transaction log needs to be large enough to hold all details from the oldest active transaction. Large or long-running transactions can cause the log file to need additional space.
Inability to Truncate a Log
Installed or in-use SQL Server features can also prevent you from truncating log files. For example, database mirroring, transactional replication, and change data capture can all affect the ability for the database engine to truncate log files. Note: Database mirroring, transactional replication, and change data capture are beyond the scope of this course.
You can use the log_reuse_wait_desc column in the sys.databases catalog view to identify the reason why you cannot truncate a log.
Identifying Truncation Issues
SELECT name, log_reuse_wait_desc
FROM sys.databases;

The log_reuse_wait values and their corresponding descriptions include:
0 = Nothing
1 = Checkpoint
2 = Log backup
3 = Active backup or restore
4 = Active transaction
5 = Database mirroring
6 = Replication
7 = Database snapshot creation
8 = Log scan
9 = Other (transient)

After resolving the reason that is shown, perform a log backup (if you are using the full recovery model) to truncate the log file; you can then use DBCC SHRINKFILE to reduce the size of the log file.

Note: If the log file does not reduce in size when using DBCC SHRINKFILE as part of the above steps, the active part of the log must have been at the end of the file at that point in time.
Working with Checkpoint Options
SQL Server has four types of checkpoint operation:
Automatic checkpoints occur in the background to meet the upper time limit suggested by the recovery interval server configuration option. Automatic checkpoints run to completion, but are throttled, based on the number of outstanding writes and whether the database engine detects an increase in write latency above 20 milliseconds.
Indirect checkpoints are issued in the background to meet a user-specified target recovery time for a given database. The default target recovery time is zero, which causes automatic checkpoint settings to be used on the database. If you have used the ALTER DATABASE statement to modify the TARGET_RECOVERY_TIME option to a value greater than zero, this value is used in place of the recovery interval specified for the server instance (see the example after this list).
Manual checkpoints are issued when you execute a Transact-SQL CHECKPOINT command. The manual checkpoint occurs in the current database for your connection. By default, manual checkpoints run to completion. The optional checkpoint duration parameter specifies a requested amount of time, in seconds, for the checkpoint to complete.
Internal checkpoints are issued by various server operations, such as backup and database snapshot creation, to guarantee that disk images match the current state of the log.
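The following is a minimal sketch of enabling indirect checkpoints by setting the target recovery time referenced above, assuming a database named Sales:

Setting a Target Recovery Time
ALTER DATABASE Sales
SET TARGET_RECOVERY_TIME = 60 SECONDS;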
You can specify a target duration for a manual checkpoint operation when you execute the CHECKPOINT statement. The following example requests a checkpoint that completes within five seconds:

Using the CHECKPOINT Statement
CHECKPOINT 5;
Demonstration: Logs and Full Recovery
In this demonstration, you will see how log truncation works in the full recovery model.
Demonstration Steps
Observe Log File Behavior in the Full Recovery Model
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod04 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows authentication.
4. In Object Explorer, expand Databases, right-click the LogTest database, and click Properties.
5. In the Database Properties - LogTest dialog box, on the Options page, verify that the Recovery model is set to Full. Then click Cancel.
6. Open the LogsAndFullRecovery.sql Transact-SQL file in the D:\Demofiles\Mod04 folder.
7. Select the code under the comment Perform a full database backup and click Execute.
8. Select the code under the comment View database file space and click Execute. Note the space used in the LogTest_log file (and note that log files have a type value of 1).
9. Select the code under the comment Insert data and click Execute to insert 5,000 rows.
10. Select the code under the comment View log file space and click Execute. Note that the space used in the LogTest_log file has increased.
11. Select the code under the comment Issue a checkpoint and click Execute to force SQL Server to perform a checkpoint and flush the modified pages to disk.
12. Select the code under the comment View log file space again and click Execute. Note that the space used in the LogTest_log file has not decreased.
13. Select the code under the comment Check log status and click Execute. Note that SQL Server is awaiting a log backup before the log file can be truncated.
14. Select the code under the comment Perform a log backup and click Execute.
15. Select the code under the comment Verify log file truncation and click Execute. Note that the space used in the LogTest_log file has decreased because the log has been truncated.
16. Keep SQL Server Management Studio open for the next demonstration.
Lesson 2
Planning a Backup Strategy
Now you have an understanding of SQL Server transaction logs and database recovery models, it is time to consider the types of backups that are available with SQL Server.
To effectively plan a backup strategy, you need to align your chosen combination of backup types to your business recovery requirements. Most organizations will need to use a combination of backup types rather than relying solely on just one.
Lesson Objectives After completing this lesson, you will be able to:
Describe the available Microsoft SQL Server backup types.
Describe key recovery objectives.
Describe a full database backup strategy.
Describe a differential backup strategy.
Describe a transaction log backup strategy.
Describe file, filegroup, and partial backup strategies.
Backup Types
SQL Server supports several backup types, which you can combine to implement the right backup and recovery strategy for a particular database, based on business requirements and recovery objectives.
Full Backups
A full backup of a database includes the data files and the active part of the transaction log. The first step in the backup is performing a CHECKPOINT operation. The active part of the transaction log includes all details from the oldest active transaction forward. A full backup represents the database at the time that the data reading phase of the backup was completed, and serves as your baseline in the event of a system failure. Full backups do not truncate the transaction log.
Differential Backups
A differential backup saves the data that has been changed since the last full backup. Differential backups are based on the data file contents rather than log file contents and contain extents that have been modified since the last full database backup. Differential backups are generally faster to restore than transaction log backups but they have less options available. For example, point-in-time recovery is not available unless differential backups are also combined with log file backups.
Transaction Log Backups
Transaction log backups record any database changes by backing up the log records from the transaction log. Point-in-time recovery is possible with transaction log backups, which are generally much smaller than full database backups; this means they can be run much more frequently. After the transaction log is backed up, the log records that have been backed up and are not in the currently active portion of the transaction log are truncated. Transaction log backups are not available in the simple recovery model.
File or Filegroup Backups
If performing a full database backup on very large databases is not practical, you can perform database file or filegroup backups to back up specific files or filegroups.
Partial Backups
If the database includes some read-only filegroups, you can simplify the backup process by using a partial backup. A partial backup is similar to a full backup, but it contains only the data in the primary filegroup, every read/write filegroup, and any specified read-only files. A partial backup of a read-only database contains only the primary filegroup.
Tail-log Backups
A transaction log backup taken just before a restore operation is called a tail-log backup. Typically, tail-log backups are taken after a disk failure that affects data files only. From SQL Server 2005 onwards, SQL Server has required that you take a tail-log backup before it will allow you to restore a database, to protect against inadvertent data loss.
Additionally, tail-log backups are often possible even when the data files from the database are no longer accessible.
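The following is a minimal sketch, assuming a database named Sales and a hypothetical backup path; the NO_TRUNCATE option allows the log to be backed up even if the database is damaged:

Taking a Tail-log Backup
BACKUP LOG Sales
TO DISK = 'R:\Backups\Sales_taillog.trn'
WITH NO_TRUNCATE;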
Copy Only Backups
SQL Server 2005 and later versions support copy-only backups, which are useful for taking copies of backups offsite or when performing online restore operations. Unlike other backups, a copy-only backup does not impact the overall backup and restore procedures for the database. All recovery models support copy-only data backups.
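The following is a minimal sketch, assuming a database named Sales and a hypothetical backup path:

Taking a Copy-Only Backup
BACKUP DATABASE Sales
TO DISK = 'R:\Backups\Sales_copy.bak'
WITH COPY_ONLY;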
Determining Recovery Objectives
When planning a backup strategy, there is always a trade-off between the level of safety that is guaranteed and the cost of the solution. Most businesses will say that they cannot afford to lose any data in any circumstances. While zero data loss is an admirable goal, it is generally not affordable or realistic. For this reason, there are two objectives that need to be established when discussing a backup strategy: a recovery time objective (RTO) and a recovery point objective (RPO). Part of the strategy might also involve the retrieval of data from other locations where copies are stored.
Recovery Time Objective
There is little point in having perfectly recoverable data if the time taken to achieve that target is too long. A backup strategy needs to have a RTO.
For example, consider the backup requirements of a major online bank. If the bank was unable to access any of the data in their systems, how long could it continue to function?
Now imagine that the bank was making full copies of all its data, but a full restore of that data would take two weeks to complete. This time includes finding the correct backup media, identifying a person with the authority to perform the restore, locating documentation related to the restore, and actually restoring the backups. What impact would a two-week outage have on the bank? The more important question is how long would an interruption to data access need to be, before the bank ceased to be viable? The key message with RTO is that a plan involving quick recovery with a small data loss might be more palatable to an organization than one that eliminates data loss but takes much longer to implement.
Recovery Point Objective
After a system has been recovered, hopefully in a timely manner, the next important question relates to how much data will have been lost. This is represented by the RPO.
For example, while a small business might conclude that restoring a backup from the previous night, with the associated loss of up to a day's work is an acceptable risk trade-off, a large business might see the situation very differently. It is common for large corporations to plan for zero committed data loss. This means that work that was committed to the database must be recovered but that it might be acceptable to lose work that was in process at the time a failure occurred.
Mapping to Business Strategy
The most important aspect of creating a backup strategy is that it must be designed in response to the business requirements and strategy. The backup strategy also needs to be communicated to the appropriate stakeholders within the organization. It is important to make sure that the expectations of the business users are managed, in line with the agreed strategy.
Organizations often deploy large numbers of databases. The RPO and RTO for each database might be different. This means that database administrators will often need to work with different backup strategies for different databases they are managing. Most large organizations have a method of categorizing the databases and applications in terms of importance to its core functions.
Business requirements will determine all aspects of the backup strategy, including how frequently backups need to occur, how much data is to be backed up each time, the type of media that the backups will be held on, and the retention and archival plans for the media.
Full Database Backup Strategies
A full database backup strategy involves regular full backups to preserve the database. If a failure occurs, the database can be restored to the state of the last full backup.
Full Database Backups
A full database backup backs up the whole database, as well as the portion of the transaction log covering changes that occurred while reading the data pages. Full database backups represent a copy of the database at the time that the data-reading phase of the backup finished, not at the time it started. Backups can be taken while the system is being used. At the end of the backup, SQL Server writes the transaction log entries that cover the period during which the backup was occurring into the backup.
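The following is a minimal sketch, assuming a database named Sales and a hypothetical backup path:

Taking a Full Database Backup
BACKUP DATABASE Sales
TO DISK = 'R:\Backups\Sales_full.bak';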
Common Usage Scenarios
For a small database that can be backed up quickly, the best practice is to use full database backups. However, as a database becomes larger, full backups take more time to complete and require more storage space. Therefore, for a large database, you might want to supplement full database backups in conjunction with other forms of backup.
Following each backup, under the simple recovery model, the database is exposed to potential work loss if a disaster was to occur. The work-loss exposure increases with each update until the next full backup, when it returns to zero and a new cycle of work-loss exposure begins. Scenarios that might be appropriate for using a full database backup strategy include:
Test systems.
Data warehouses where the data can be recovered from a source system and where the data in the data warehouse does not change regularly.
Systems where the data can be recovered from other sources.
Example on the Slide
In the example shown on the slide, a full backup is performed on Sunday, Monday, and Tuesday. This means that during the day on Monday, up to a full day of data is exposed to risk until the backup is performed. The same amount of exposure happens on Tuesday. After the Tuesday backup is carried out, the risk increases every day until the next Sunday backup is performed.
Transaction Log Backup Strategies
A backup strategy that involves transaction log backups must be combined with either a full database strategy or one that combines the use of full and differential database backups.
Transaction Log Backups
Transaction log backups save all data since the last log backup. Rather than reading database pages, transaction log backups read data from the transaction log. A backup strategy based on transaction log backups is appropriate for databases with frequent modifications.
When it is necessary to recover a database, the latest full database backup needs to be restored, along with the most recent differential backup (if one has been performed). After the database has been restored, transaction logs that have been backed up since that time are also then restored, in order. Because the restore works on a transactional basis, it is possible to restore a database to a specific point in time from the transactions stored in the log backup.
In addition to providing capabilities that let you restore the transactions that have been backed up, a transaction log backup truncates the transaction log. This enables VLFs in the transaction log to be reused. If you do not back up the log frequently enough, the log files can fill up.
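The following is a minimal sketch of a routine log backup, assuming a database named Sales and a hypothetical backup path:

Backing Up the Transaction Log
BACKUP LOG Sales
TO DISK = 'R:\Backups\Sales_log.trn';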
Example on the Slide
In the example shown on the slide, nightly full database backups are supplemented by periodic transaction log backups during the day. If the system fails, recovery can be made to the time of the last transaction log backup. If, however, only the database data files fail and a tail-log backup can be performed, no committed data loss will occur.
Combinations of Backup Types
Transaction log backups are typically much smaller than other backups, especially when they are performed regularly. Potential data loss can be minimized by a backup strategy that is based on transaction log backups in combination with other backup types. Because restoring from a long chain of log backups typically takes longer than restoring other backup types, it is often advisable to combine transaction log backups with periodic differential backups. During recovery, only the transaction log backups that were taken after the last differential backup need restoring.
Differential Backup Strategies
Differential backups are a good way to reduce potential work loss and maintenance overhead. When the proportion of a database that changes between backup intervals is much smaller than the entire size of the database, a differential backup strategy can be useful. However, if you have a very small database, differential backups may not save much time.
Differential Backups
From the time that a full backup occurs, SQL Server maintains a map of extents that have been modified. In a differential backup, SQL Server backs up only those extents that have changed. It is important to realize, though, that after the differential backup is performed, SQL Server does not clear that map of modified extents. The map is only cleared when a full backup occurs. This means that a second differential backup performed on a database will include all changes since the last full backup, not just those changes since the last differential backup.
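The following sketch (with illustrative file names) shows this behavior; the comments indicate what each differential backup contains:
Two Consecutive Differential Backups
BACKUP DATABASE AdventureWorks TO DISK = 'R:\Backups\AW_Full.bak' WITH INIT;
-- ...Monday's data changes occur here...
-- Contains Monday's changed extents
BACKUP DATABASE AdventureWorks TO DISK = 'R:\Backups\AW_Diff1.bak' WITH DIFFERENTIAL, INIT;
-- ...Tuesday's data changes occur here...
-- Contains Monday's AND Tuesday's changed extents, because only
-- a full backup clears the map of modified extents
BACKUP DATABASE AdventureWorks TO DISK = 'R:\Backups\AW_Diff2.bak' WITH DIFFERENTIAL, INIT;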
Common Usage Scenarios
Because they only save the data that has been changed since the last full database backup, differential backups are typically much faster and occupy less disk space than transaction log backups for the same period of time.
Differential database backups are especially useful when a subset of a database is modified more frequently than the remainder of the database. In these situations, differential database backups enable you to back up frequently without the overhead of full database backups.
Example on the Slide
In the example on the slide, a full database backup is taken at midnight on Sunday night (early Monday morning). Differential backups are then taken at midnight on each of the other nights of the week. The differential backup taken on Monday night would include all data changed during Monday. The differential backup taken on Tuesday night would include all data changed on Monday and Tuesday. The differential backup taken on Friday night would include all data changed on Monday, Tuesday, Wednesday, Thursday, and Friday. This means that differential backups can grow substantially in size between each full backup interval.
Combinations of Backups
Differential backups must be combined with other forms of backup. Because a differential backup saves all data changed since the last full backup was made, it cannot be taken unless a full backup has already been performed.
Another important aspect to consider is that, when a recovery is needed, multiple backups need to be restored to bring the system back online—rather than a single backup. This increases the risk exposure for an organization and must be considered when planning a backup strategy. Differential backups can also be used in combination with both full and transaction log backups.
Partial Backup Strategies
For very large databases that contain multiple data files and/or multiple filegroups, you can consider using a file or filegroup backup strategy to reduce the time it takes to perform backups. This strategy is useful for databases that have narrow backup windows, and it can also speed up recovery times, because if a single data file is lost or corrupted, you only need to restore that file or the filegroup that contains it, instead of the whole database. As with the other backup strategies, you initiate this strategy by taking a full database backup, but you then back up the data files or filegroups in turn. You can also back up transaction logs between file or filegroup backups, to improve recoverability.
Managing file and filegroup backups can be complex, and the loss of a single data file backup can cause serious problems, including making a database unrecoverable.
One way to simplify the process of backing up parts of a database is to use a partial backup, which backs up only the primary filegroup and the read/write filegroups. However, this is only recommended when the database contains enough data in read-only filegroups to make a substantial time and administrative saving. It is also recommended that you use partial backups in conjunction with the simple recovery model. Additional Reading: For more information about partial backups, see the topic Partial Backups (SQL Server) in SQL Server Books Online.
One of the key benefits of a partial backup strategy is that in the event of a failure, you can perform a piecemeal restore that makes data in the read/write filegroups available before the read-only filegroups have been restored. This enables you to reduce the time to recovery for workloads that do not require the data in the read-only filegroups. Note: Piecemeal restores are discussed in the next module of this course.
Example on the Slide
In the example shown on the slide, all read-only filegroups are backed up on Monday at midnight, along with a partial backup that includes only the primary filegroup and all read/write filegroups. At the end of each subsequent day, a partial differential backup is used to back up modified pages in read-write filegroups.
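A sketch of such a schedule in Transact-SQL might look like this (the database, filegroup, and file names are illustrative; the BACKUP options used here are covered later in this module):
A Partial Backup Schedule
-- Weekly: back up the read-only filegroups (assumes a filegroup named Archive)
BACKUP DATABASE LargeDB FILEGROUP = 'Archive'
TO DISK = 'R:\Backups\LargeDB_ReadOnly.bak' WITH INIT;
-- Weekly: partial backup of the primary and all read/write filegroups
BACKUP DATABASE LargeDB READ_WRITE_FILEGROUPS
TO DISK = 'R:\Backups\LargeDB_Partial.bak' WITH INIT;
-- Nightly: partial differential backup of extents changed in read/write filegroups
BACKUP DATABASE LargeDB READ_WRITE_FILEGROUPS
TO DISK = 'R:\Backups\LargeDB_PartialDiff.bak' WITH DIFFERENTIAL, INIT;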
Discussion: Planning Backup Strategies
Consider the following scenario, and identify an appropriate backup strategy for each database.
Database Characteristics
As a DBA for Adventure Works Cycles, you must manage the following databases:
HumanResources. This database is 200 MB in size. The typical rate of change (the volume of data modified) is around 2 MB per hour during office hours.
InternetSales. This database is 10 GB in size. The typical rate of change is 200 MB per hour, with changes occurring 24 hours a day, seven days a week.
AWDataWarehouse. This database is 150 GB in size, including 100 GB of archive data stored on a read-only filegroup, 50 GB of data stored on a writable secondary filegroup, and negligible system tables on the primary filegroup. Around 5 GB of new data is loaded into tables in the writable secondary filegroup each week in a single batch operation that starts at 5:00 on Saturday and takes two hours to complete. Each month after the last weekly load, 20 GB of old data in the writable secondary filegroup is moved to the read-only filegroup in an operation that takes around an hour.
The storage solution used for SQL Server backup devices supports a backup throughput of 150 MB per minute, and a restore throughput of 100 MB per minute.
Business Requirements
The following business requirements have been identified for the databases:
HumanResources. This database must never be unavailable for longer than an hour. In the event of a failure, the database must be recovered so that it includes all transactions that were completed up to the end of the previous working day.
InternetSales. This database must never be unavailable for more than two hours, and no more than 30 minutes of transactions can be lost.
AWDataWarehouse. This database must never be unavailable for more than 48 hours. In the event of failure, the database should be recovered to include the data loaded by the most recent batch load operation.
Office hours are from 08:00 to 18:00, Monday to Friday.
Backup Strategy Considerations
For each database, you must determine:
The appropriate recovery model.
The type(s) of backup to be performed.
The appropriate time(s) at which to perform the backup(s).
Lesson 3
Backing up Databases and Transaction Logs
Now that you have seen how to plan a backup strategy for a SQL Server system, you can learn how to perform SQL Server backups, including full and differential database backups, transaction log backups, and partial backups.
Lesson Objectives
After completing this lesson, you will be able to:
Use SQL Server Management Studio and the BACKUP command to perform backups.
Initialize backup media.
Back up databases.
Back up transaction logs.
Back up files and filegroups.
Introduction to SQL Server Backup
You can perform backups in SQL Server by using the BACKUP Transact-SQL statement or the graphical interface in SQL Server Management Studio (SSMS). Note: You can also use Windows PowerShell to perform a backup by using the Backup-SqlDatabase cmdlet in the SQLPS module. Simplified syntax for the BACKUP statement is shown below.
Simplified BACKUP Syntax
BACKUP { DATABASE | LOG } database_name
TO backup_device [ ,...n ]
[ WITH option [ ,...n ] ]
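For example, the following statement (with an illustrative file path) applies this syntax to perform a full database backup, naming the backup set and reporting progress every 10 percent:
Applying the Simplified Syntax
BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW.bak'
WITH NAME = 'AdventureWorks-Full Database Backup', STATS = 10;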
The SSMS graphical user interface includes the following pages, on which you can configure backup options:
General. Use this page to specify the database to be backed up, the backup type, the backup destination, and other general settings.
Media Options. Use this page to control how the backup is written to the backup device(s), for example overwriting or appending to existing backups.
Backup Options. Use this page to configure backup expiration, compression, and encryption.
In SQL Server, you can perform backups while other users continue working with the database, although those users might experience a performance impact due to the I/O load that the backup operation places on the system. SQL Server does place some limitations on the types of commands you can execute while a backup is being performed. For example, you cannot use the ALTER DATABASE command with the ADD FILE or REMOVE FILE options, or shrink a database, during a backup. Additionally, you cannot include the BACKUP command in either an explicit or an implicit transaction, or roll back a backup statement. You can only back up databases when they are online, but it is still possible to perform a backup of the transaction log when a database is damaged, if the log file itself is still intact. This is why it is so important to place data and log files on separate physical media.
Backup Timing
An important consideration when making a backup is to understand the timing associated with its contents, because the database may be in use while the backup is occurring. For example, if a backup starts at 22:00 and finishes at 01:00, does it contain a copy of the database as it was at 22:00, a copy as it was at 01:00, or a copy from a time between the start and finish?
SQL Server writes all data pages to the backup device in sequence, but uses the transaction log to track any pages that are modified while the backup is occurring. SQL Server then writes the relevant portion of the transaction log to the end of the backup. This process makes backups slightly larger than those produced by earlier versions of SQL Server, particularly if heavy update activity occurs at the same time as the backup. It also means that the backup contains a copy of the database as it was at a time just prior to the completion of the backup, not as it was when the backup started.
VSS and VDI
The Windows Volume Shadow Copy Service (VSS) and the Virtual Device Interface (VDI) programming interfaces are available for use with SQL Server. The main use for these interfaces is to enable third-party backup tools to work with SQL Server.
In very large systems, it is common to need to perform disk-to-disk imaging while the system is in operation, because standard SQL Server backups might take too long to be effective. The VDI programming interface enables an application to freeze SQL Server operations momentarily while it takes a consistent snapshot of the database files. This form of snapshot is commonly used in geographically distributed storage area network (SAN) replication systems.
Media Sets and Backup Sets
Before performing a backup, you should understand how backups are stored. A single backup is called a backup set, and is written to a media set, which can contain up to 64 backup devices.
Backup devices can be physical or logical. Physical backup devices are explicit files that you specify using a Universal Naming Convention (UNC) file path, and logical devices are named objects that reference files, providing a layer of abstraction that makes it easier to change backup device locations. Ultimately, both physical and logical devices reference a file location, which can be on the local file system, a network file share, a volume in a SAN or other storage device, or a blob container in Microsoft Azure storage.
Note: Direct backup to tape is not supported. If you want to store backups on tape, you should first write the backup to disk, and then copy the disk backup to tape. If a media set spans several backup devices, the backups are striped across the devices. Note: No parity device is used when striping. If two backup devices are used together, each receives half the backup, and both must be present when attempting to restore it.
Every backup operation to a media set must write to the same number and type of backup devices. Media sets and the backup devices are created the first time a backup is attempted on them. Media sets and backup sets can also be named at the time of creation and given a description. The backups on an individual device within a media set are referred to as a media family. The number of backup devices used for the media set determines the number of media families in a media set. For example, if a media set uses two backup devices, it contains two media families.
Media and Backup Set Initialization
Media sets can contain multiple backup sets, enabling you to append backups of different types (and even different databases) to the same backup devices. When you perform a backup, you must be careful to use the appropriate media set options to avoid inadvertently overwriting backup sets that you want to retain. Two key options of which you must be particularly aware are:
FORMAT / NOFORMAT. The FORMAT option is used to write a new media header on the backup devices used in the backup. This creates a new media set, breaking any existing media sets to which the backup devices currently belong and deleting any existing backup sets they contain. When you perform a backup to an existing backup device, SQL Server uses a default value of NOFORMAT to safeguard against accidental backup deletion. In SQL Server Management Studio, the FORMAT option is specified by selecting Back up to a new media set, and erase all existing backup sets in the Backup Database dialog box.
INIT / NOINIT. The INIT option retains the existing media header, but overwrites all existing backup sets in the media set. By default, SQL Server uses the NOINIT option to avoid accidental backup deletion. In SQL Server Management Studio, you can select Back up to the existing media set, and then select Append to the existing backup set to use the NOINIT option, or Overwrite all existing backups to use the INIT option.
As an example, consider the following code, which backs up a database to a media set that consists of two files. Assuming the files do not already exist, SQL Server creates them and uses them to define a new media set. The data from the backup is striped across the two files:
Initializing a media set
BACKUP DATABASE AdventureWorks
TO DISK = 'P:\SQLBackups\AW_1.bak',
   DISK = 'R:\SQLBackups\AW_2.bak';
Another backup could be made at a later time, to the same media set. The data from the second backup is again striped across the two files, and the header of the media set is updated to indicate that it now contains the two backups:
Appending a backup set to an existing media set
BACKUP DATABASE AdventureWorks
TO DISK = 'P:\SQLBackups\AW_1.bak',
   DISK = 'R:\SQLBackups\AW_2.bak'
WITH NOINIT;
Later, another backup is performed using the following command. This overwrites the two previous backups in the media set, so that it now contains only the new backup:
Overwriting existing backup sets in a media set
BACKUP DATABASE AdventureWorks
TO DISK = 'P:\SQLBackups\AW_1.bak',
   DISK = 'R:\SQLBackups\AW_2.bak'
WITH INIT;
If a user then tries to create another backup to only one of the backup files in the media set using the following code, SQL Server returns an error, because every backup to a media set must use the same backup devices:
Attempting to use a single device in an existing media set
BACKUP DATABASE AdventureWorksDW
TO DISK = 'P:\SQLBackups\AW_1.bak';
Before a member of the media set can be overwritten, the FORMAT option needs to be added to the WITH clause of the BACKUP command. This creates a new media set that contains a single file. The original media set, together with all of the backup sets it contains, is no longer valid.
Formatting a media set
BACKUP DATABASE AdventureWorksDW
TO DISK = 'P:\SQLBackups\AW_1.bak'
WITH FORMAT, INIT;
Use the FORMAT option to overwrite the contents of a backup file and break up the media set, but use it very carefully. Formatting one backup file of a media set renders the entire media set, and all of the backup sets it contains, unusable.
Performing Database Backups
Database backups can be full or differential.
Performing a Full Database Backup
A full database backup backs up all the data pages in the database and also saves the active portion of the transaction log. The following code makes a full database backup of the AdventureWorks database and stores it in a file named 'R:\Backups\AW.bak'. The INIT option creates the file if it does not already exist and overwrites it if it does.
Performing a Full Database Backup
BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW.bak'
WITH INIT;
Performing a Differential Database Backup
Although full database backups are ideal, the time taken to perform one can often outweigh the benefits it provides. This is particularly true when only a small percentage of the database changes between backups. In this scenario, differential backups are a sensible consideration. You can perform a differential backup by using SSMS or by using the DIFFERENTIAL option of the BACKUP statement. The following code makes a differential database backup of the AdventureWorks database and stores it in a file named 'R:\Backups\AW.bak'. The NOINIT option appends the backup to any existing backups in the media set.
Performing a Differential Database Backup
BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW.bak'
WITH DIFFERENTIAL, NOINIT;
How Differential Backups Work
SQL Server maintains a map of modified extents called the differential bitmap page. One differential bitmap page is maintained for each 4-GB section of every data file. Each time a full database backup is created, SQL Server clears the map. As the data in the data files changes, SQL Server updates the map and, when a differential backup runs, only extents that have changed since the last full database backup are backed up. If you run two consecutive differential backups, the second will contain all the extents that have changed since the last full database backup, not just the ones since the first differential backup. Note: You cannot create a differential database backup unless a full database backup has been taken first.
Performing Transaction Log Backups
Before a transaction log backup can be performed, the database must be using either the full or bulk-logged recovery model. In addition, a transaction log backup can only occur after a full database backup has been taken. Unless a database is set to the bulk-logged recovery model, a transaction log backup does not save any data pages from the database. The following code backs up the transaction log of the AdventureWorks database:
Performing a Transaction Log Backup
BACKUP LOG AdventureWorks
TO DISK = 'R:\Backups\AW.bak'
WITH NOINIT;
A transaction log backup finds the maximum log sequence number (LSN) of the last successful transaction log backup, and saves all log entries beyond that point up to the current maximum LSN. The process then truncates the transaction log as far as possible (unless the COPY_ONLY or NO_TRUNCATE option is specified). Log records from the oldest active transaction onward must be retained, in case the database needs to be recovered after a failure.
Log Record Chains
Before you can restore a database by using transaction log backups, an unbroken chain of log records must be available, from the last full database backup to the desired point of restoration. If there is a break, you can only restore up to the point where the backup chain is broken.
For example, imagine a scenario where you create a database, and later take a full backup. At this point, the database can be recovered. If the recovery model of the database is then changed to simple and subsequently switched back to full, a break in the log file chain has occurred. Even though a previous full database backup exists, the database can only be recovered up to the point of the last transaction log backup, prior to the change to simple recovery model. After switching from simple to full recovery model, you must perform a full database backup to create a starting point for transaction log backups.
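As an illustrative sketch (the backup file name is hypothetical), the broken chain and its repair look like this:
Restarting a Broken Log Chain
-- Switching to the simple recovery model breaks the log chain
ALTER DATABASE AdventureWorks SET RECOVERY SIMPLE;
-- Switching back to full does not repair the chain...
ALTER DATABASE AdventureWorks SET RECOVERY FULL;
-- ...so a full database backup is required as a new starting point for log backups
BACKUP DATABASE AdventureWorks TO DISK = 'R:\Backups\AW_NewChain.bak' WITH INIT;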
Backing Up the Tail-log
To recover a database to the point of failure, you must take a tail-log backup before starting a restore on an existing database. This ensures that all transactions are written to at least one backup, before they can be overwritten. The tail-log backup prevents work loss and keeps the log chain intact. When you are recovering a database to the point of a failure, the tail-log backup is often the last one of interest in the recovery plan. If you cannot back up the tail of the log, you can only recover a database to the end of the last backup that was created before the failure. Note: Not all restore scenarios require a tail-log backup. You do not need to have a tail-log backup if the recovery point is contained in an earlier log backup or if you are moving or replacing (overwriting) the database and do not need to restore it to a point of time after the most recent backup.
When performing a tail-log backup of a database that is currently online, you can use the NO_RECOVERY option to immediately place the database into a restoring state, preventing any more transactions from occurring until the database is restored. If the database is damaged, you can use the NO_TRUNCATE option, which causes the database engine to attempt the backup, regardless of the state of the database. This means that a backup taken while using the NO_TRUNCATE option might have incomplete metadata. If you are unable to back up the tail of the log using the NO_TRUNCATE option when the database is damaged, you can attempt a tail-log backup by specifying the CONTINUE_AFTER_ERROR option.
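As a sketch (the file name is illustrative), the two tail-log options described above can be used as follows:
Performing a Tail-log Backup
-- Tail-log backup of an online database; the database is left in the restoring state
BACKUP LOG AdventureWorks TO DISK = 'R:\Backups\AW_Tail.trn' WITH NORECOVERY;
-- Tail-log backup of a damaged database; the backup is attempted regardless of database state
BACKUP LOG AdventureWorks TO DISK = 'R:\Backups\AW_Tail.trn' WITH NO_TRUNCATE;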
Performing Partial and Filegroup Backups
When you need to manage an extremely large database, the time taken to perform full backups (and restore them in the event of failure) can have a detrimental effect on ongoing business operations. While transaction log backups and differential backups can ease this problem, for extremely large databases that use the simple recovery model (for which transaction log backups cannot be performed), you can choose to back up only the files and filegroups that contain volatile data, without including read-only files and filegroups in the backup. There are two techniques used to implement this kind of backup solution:
Partial backup. A partial backup backs up only the primary filegroup and filegroups that are set to read-write. You can also include specific read-only filegroups if required. The purpose of a partial backup is to enable you to easily back up the parts of a database that change, without having to plan the backup of specific files or filegroups. You can perform a full or differential partial backup.
File and Filegroup backups. A filegroup backup enables you to back up only selected files and filegroups in a database. This can be useful with very large databases that would take a long time to back up in full, because it enables you to back them up in phases. It’s also useful for databases that contain some read-only data, or data that changes at different rates, because it enables you to back up only the read-write data or to back up more frequently updated data more often.
The following code example performs a partial backup that includes the primary filegroup and all read/write filegroups:
A Partial Backup
BACKUP DATABASE LargeDB READ_WRITE_FILEGROUPS
TO DISK = 'R:\Backups\LrgRW.bak'
WITH INIT;
The following code example backs up specific filegroups. You can also use the FILE parameter to back up individual files:
A Filegroup Backup
BACKUP DATABASE LargeDB
FILEGROUP = 'LrgFG2'
TO DISK = 'R:\Backups\LrgFG2.bak'
WITH INIT;
Demonstration: Performing Backups
In this demonstration, you will see how to:
Perform a full database backup.
Perform a differential database backup.
Perform a transaction log backup.
Demonstration Steps
Perform a Full Database Backup
1. Ensure that you have performed the previous demonstration in this module. If not, start the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines, log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd, and in the D:\Demofiles\Mod04 folder, run Setup.cmd as Administrator.
2. If SQL Server Management Studio is not already open, start it and connect to the MIA-SQL database engine using Windows authentication.
3. In Object Explorer, under Databases, right-click AdventureWorks, point to Tasks, and click Back Up.
4. In the Back Up Database – AdventureWorks dialog box, ensure that Backup type is set to Full, and in the Destination section, select each existing file path and click Remove. Then click Add, and in the Select Backup Destination dialog box, enter the file name D:\Demofiles\Mod04\AW.bak and click OK.
5. In the Back Up Database – AdventureWorks dialog box, on the Media Options page, note that the default option is to append to an existing media set. In this case, there is no existing media set, so a new one will be created, and there are no existing backup sets to overwrite.
6. In the Back Up Database – AdventureWorks dialog box, on the Backup Options page, note the default backup name and expiration settings.
7. In the Back Up Database – AdventureWorks dialog box, in the Script drop-down list, click Script Action to a New Query Window. Then click OK.
8. When the backup has completed successfully, click OK.
9. In the query pane, view the Transact-SQL BACKUP statement that was used to back up the database.
10. View the D:\Demofiles\Mod04 folder and note the size of the AW.bak file.
Perform a Differential Backup
1. In SQL Server Management Studio, open the UpdatePrices.sql script file from the D:\Demofiles\Mod04 folder and click Execute. This script updates the Production.Product table in the AdventureWorks database.
2. In Object Explorer, under Databases, right-click AdventureWorks, point to Tasks, and click Back Up.
3. In the Back Up Database – AdventureWorks dialog box, in the Backup type list, select Differential. Then, in the Destination section, ensure that D:\Demofiles\Mod04\AW.bak is the only backup device listed.
4. In the Back Up Database – AdventureWorks dialog box, on the Media Options page, verify that the option to append to the existing media set is selected.
5. In the Back Up Database – AdventureWorks dialog box, on the Backup Options page, change the Name to AdventureWorks-Diff Database Backup.
6. In the Back Up Database – AdventureWorks dialog box, in the Script drop-down list, click Script Action to a New Query Window. Then click OK.
7. When the backup has completed successfully, click OK.
8. In the query pane, view the Transact-SQL BACKUP statement that was used to back up the database. Note that it includes the WITH DIFFERENTIAL option.
9. View the D:\Demofiles\Mod04 folder and note that the size of the AW.bak file has increased, but not by much; the second backup only includes the extents containing pages that were modified since the full backup.
Perform a Transaction Log Backup
1. In SQL Server Management Studio, switch to the UpdatePrices.sql script you opened previously and click Execute to update the Production.Product table in the AdventureWorks database again.
2. In Object Explorer, under Databases, right-click AdventureWorks, point to Tasks, and click Back Up.
3. In the Back Up Database – AdventureWorks dialog box, in the Backup type list, select Transaction Log. Then, in the Destination section, ensure that D:\Demofiles\Mod04\AW.bak is the only backup device listed.
4. In the Back Up Database – AdventureWorks dialog box, on the Media Options page, verify that the option to append to the existing media set is selected. Also verify that the option to truncate the transaction log is selected.
5. In the Back Up Database – AdventureWorks dialog box, on the Backup Options page, change the Name to AdventureWorks-Transaction Log Backup.
6. In the Back Up Database – AdventureWorks dialog box, in the Script drop-down list, click Script Action to a New Query Window. Then click OK.
7. When the backup has completed successfully, click OK.
8. In the query pane, view the Transact-SQL BACKUP statement that was used to back up the database. Note that this time the statement is BACKUP LOG.
9. View the D:\Demofiles\Mod04 folder and note that the size of the AW.bak file has increased, but not by much; the third backup only includes the transaction log entries for data modifications since the full backup.
10. Keep SQL Server Management Studio open for the next demonstration.
Lesson 4
Using Backup Options
SQL Server Backup provides a range of options that can help optimize your backup strategy, including the ability to perform a copy-only backup, compress backups, and encrypt backups. This lesson explores these options.
Lesson Objectives
After completing this lesson, you will be able to:
Perform a copy-only backup.
Use backup compression.
Use backup encryption.
Copy-Only Backups
A copy-only SQL Server backup is independent of the sequence of conventional SQL Server backups. Usually, taking a backup changes the database and affects how later backups are restored. However, there may be a need to take a backup for a special purpose without affecting the overall backup and restore procedures for the database. You can make copy-only backups of either the database or the transaction log. Restoring a copy-only full backup is the same as restoring any full backup.
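For example, the following statement (the file path is illustrative) takes a full copy-only backup. A copy-only full backup does not reset the differential base, and a copy-only log backup does not truncate the transaction log:
Performing a Copy-Only Backup
BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW_Copy.bak'
WITH COPY_ONLY, INIT;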
Compressing Backups
Backup files can quickly become very large, so SQL Server enables you to compress them. You can set the default backup compression behavior and also override this setting for individual backups. The following restrictions apply to compressed backups:
Compressed and uncompressed backups cannot co-exist in a media set.
Windows-based backups cannot share a media set with compressed SQL Server backups.
Versions of SQL Server earlier than SQL Server 2008 cannot read compressed backups. Although only certain editions of SQL Server can create compressed backups, every edition of SQL Server 2008 and later can restore them.
You can use the property pages for the server to view and configure the default backup compression setting. To compress a backup, you can use the WITH COMPRESSION option of the BACKUP statement.
Compressing a Backup
BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW_Comp.bak'
WITH COMPRESSION;
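As an alternative to the server property pages, you can set the server-wide default by using sp_configure, as in this sketch:
Setting the Default Backup Compression Behavior
-- Enable backup compression by default for all backups on this instance
EXEC sp_configure 'backup compression default', 1;
RECONFIGURE;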
If your default setting is to compress backups and you want to override this, use the NO_COMPRESSION option.
The compression level that can be achieved depends entirely upon how compressible the data in the database is. Some data compresses well, other data does not. A reduction in I/O and backup size of 30 to 50 percent is not uncommon in typical business systems. Note: Backup compression can be used on a database that is encrypted by using Transparent Database Encryption (TDE), though the compression rate will be minimal.
Impact of Compressed Backups on Performance
Because a compressed backup is smaller than an uncompressed backup of the same amount of data, compressing a backup typically reduces the amount of device I/O required and significantly decreases the duration of backups. However, any form of compression tends to increase Central Processing Unit (CPU) usage. The additional CPU resources that are consumed by the compression process may adversely impact concurrent operations on systems that are CPU bound. Most current SQL Server systems are I/O bound, rather than CPU bound, so the benefit of reducing I/O usually outweighs the increase in CPU requirements by a significant factor.
Impact of Compression on Recovery Time
Although a reduction in the time taken to perform backups can be beneficial, backups are usually performed while the system is being used and should not impact availability. However, compression benefits not only the backup process but also the restore process, and it can significantly improve the ability to meet recovery time objective (RTO) requirements.
Demonstration: Using Backup Compression
In this demonstration, you will see how to:
Use backup compression.
Demonstration Steps
Use Backup Compression
1. Ensure that you have performed the previous demonstration in this module.
2. In SQL Server Management Studio, in Object Explorer, under Databases, right-click AdventureWorks, point to Tasks, and click Back Up.
3. In the Back Up Database – AdventureWorks dialog box, ensure that Backup type is set to Full, and in the Destination section, select the existing file path and click Remove. Then click Add, and in the Select Backup Destination dialog box, enter the file name D:\Demofiles\Mod04\AW_Comp.bak and click OK.
4. In the Back Up Database – AdventureWorks dialog box, on the Media Options page, note that the default option is to append to an existing media set. In this case, there is no existing media set, so a new one will be created, and there are no existing backup sets to overwrite.
5. In the Back Up Database – AdventureWorks dialog box, on the Backup Options page, change the Name to AdventureWorks-Compressed Backup and, in the Set backup compression list, select Compress backup.
6. In the Back Up Database – AdventureWorks dialog box, in the Script drop-down list, click Script Action to a New Query Window. Then click OK.
7. When the backup has completed successfully, click OK.
8. In the query pane, view the Transact-SQL BACKUP statement that was used to back up the database, noting that the COMPRESSION option was specified.
9. View the D:\Demofiles\Mod04 folder and note the size of the AW_Comp.bak file. This should be significantly smaller than the AW.bak file was after the full database backup in the previous demonstration.
10. Keep SQL Server Management Studio open for the next demonstration.
Encrypting Backups
Backups are a fundamental requirement for protecting an organization’s data against hardware failure or natural disaster. However, the data in the backup may be sensitive, and you must ensure that the backup media is secured against unauthorized access to the data it contains. In most organizations, you can accomplish this goal by storing backup media in secured file system locations. However, it is common for organizations to use an off-site storage solution for backups to protect against loss of data in the event of a disaster that affects the entire site (for example, a flood or fire). In this kind of scenario, or when the data in the backup requires additional security for compliance reasons, you can encrypt backups so that they can only be restored on a SQL Server instance that contains the correct encryption key. Backup encryption in SQL Server is based on standard encryption algorithms, including AES 128, AES 192, AES 256, and Triple DES. To encrypt a backup, you must specify the algorithm you want to use and a certificate or asymmetric key that can be used to encrypt the data. To use backup encryption:
1. Create a database master key in the master database. This is a symmetric key that is used to protect all other encryption keys and certificates in the database.
2. Create a certificate or asymmetric key with which to encrypt the backup. You can create a certificate or asymmetric key in a SQL Server database engine instance by using the CREATE CERTIFICATE or CREATE ASYMMETRIC KEY statement. Note that asymmetric keys must reside in an extended key management (EKM) provider.
3. Perform the backup using the ENCRYPTION option (or selecting Encryption in the Back Up Database dialog box), and specifying the algorithm and certificate or asymmetric key to be used. When using the Back Up Database dialog box, you must select the option to back up to a new media set.
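Steps 1 and 2 might look like the following sketch (the password and certificate subject are illustrative):
Creating a Database Master Key and Certificate
USE master;
-- Step 1: create the database master key in the master database
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Pa$$w0rd_Backup!';
-- Step 2: create a certificate with which to encrypt backups
-- (protect both with BACKUP MASTER KEY and BACKUP CERTIFICATE before relying on them)
CREATE CERTIFICATE BackupCert WITH SUBJECT = 'Backup encryption certificate';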
You should back up the database master key and encryption keys to a secure location (separate from the backup media location) to enable you to restore the database to a different SQL Server instance in the event of a total server failure. The following example code backs up the AdventureWorks database using the AES 128 encryption algorithm and a certificate named BackupCert:
Using Backup Encryption
BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW_Encrypt.bak'
WITH FORMAT, INIT,
ENCRYPTION (ALGORITHM = AES_128, SERVER CERTIFICATE = [BackupCert]);
Demonstration: Using Backup Encryption
In this demonstration, you will see how to:
Create a database master key.
Create a certificate.
Encrypt a database backup.
Demonstration Steps
Create a Database Master Key
1. Ensure that you have performed the previous demonstration in this module.
2. In SQL Server Management Studio, open the EncyptionKeys.sql script file in the D:\Demofiles\Mod04 folder.
3. Select the code under the comment Create a database master key and click Execute.
4. Select the code under the comment Back up the database master key and click Execute.
Create a Certificate
1. Select the code under the comment Create a certificate and click Execute.
2. Select the code under the comment Back up the certificate and its private key and click Execute.
Encrypt a Database Backup
1. In Object Explorer, under Databases, right-click AdventureWorks, point to Tasks, and click Back Up.
2. In the Back Up Database – AdventureWorks dialog box, ensure that Backup type is set to Full, and in the Destination section, select the existing file path and click Remove. Then click Add, and in the Select Backup Destination dialog box, enter the file name D:\Demofiles\Mod04\AW_Encrypt.bak and click OK.
3. In the Back Up Database – AdventureWorks dialog box, on the Media Options page, select Back up to a new media set, and erase all existing backup sets. Then enter the new media set name Encrypted Backup.
4. In the Back Up Database – AdventureWorks dialog box, on the Backup Options page, change the Name to AdventureWorks-Encrypted Backup and, in the Set backup compression list, select Compress backup.
5. In the Encryption section, select Encrypt backup. Then ensure that the AES 128 algorithm is selected, and select the BackupCert certificate you created previously.
6. In the Back Up Database – AdventureWorks dialog box, in the Script drop-down list, click Script Action to a New Query Window. Then click OK.
7. When the backup has completed successfully, click OK.
8. In the query pane, view the Transact-SQL BACKUP statement that was used to back up the database, noting that the ENCRYPTION option was specified.
9. Keep SQL Server Management Studio open for the next demonstration.
Lesson 5
Ensuring Backup Reliability
No matter how many backups you perform, it is essential to ensure that they are readable and restorable; otherwise, the entire backup system is flawed. It is also important to be able to query information about your backups so that you can access the correct data when required.
In this lesson, you will learn how to verify a backup and ensure its integrity and how to retrieve backup history and header information.
Lesson Objectives
After completing this lesson, you will be able to:
Describe options for ensuring backup integrity.
View backup information.
Determining a Retention and Testing Policy for Backups
It is all too common for organizations to perform backup operations regularly, only to find, when the time comes to restore the backups, that they are not usable. Most of these problems can be alleviated by a good retention and testing plan. Your strategy must include plans for the retention of backups and for the locations where the media or backups should be retained. Several common problems arise, but these can easily be avoided.
Insufficient Copies of Backups. Your organization is dependent on the quality of backups should the need arise to restore them. The more copies of backups you have and the more pieces of media that are holding all the required data, the better the chance you have of being able to recover.
The worst option is generally regarded as creating a backup over your most recent backup. If the system fails during the backup, you will often then have lost both your data and your backup. Avoidance strategy: Make multiple copies of backups.
Insufficient Data on the Backups. Company A performed regular backups, yet recovery was never tested. The first time a real recovery was attempted, it was discovered that not all of the files that needed to be backed up were actually being backed up.
Avoidance strategy: Regular testing that data can be fully reconstructed from backups.
Unreadable Backups. Company B performed regular backups but did not test them. When recovery was attempted, none of the backups were readable. This is often caused by hardware failures, but can also be the result of inappropriate storage of media.
Avoidance strategy: Regular backup recovery testing.
Unavailable Hardware. Company C purchased a special tape drive to perform backups. When the time came to restore the backups, that special device no longer worked and the organization had no other way to read the backups, even if they were valid.
Avoidance strategy: Regular backup recovery testing.
Old Hardware. Company D performed regular backups and retained them for an appropriate period. When the time came to restore the backups, the company no longer possessed equipment that was capable of performing that operation.
Avoidance strategy: Regular backup recovery testing, combined with recovery and backup onto current devices.
Misaligned Hardware. Company E performed regular backups and even tested that they could undertake restore operations from the backups. However, because they tested the restores on the same device that performed the backups, they did not realize that the device was misaligned and it was the only one that could read those backups. When a restore was needed, the device that the backups were performed on had failed.
Avoidance strategy: Regular backup recovery testing on a separate system and physical device.
General Considerations
There are several general points to consider regarding the retention period of backups.
When a backup strategy calls for you to perform multiple types of backups, it is important to work out the combination of backups you will require.
Organizations might need to fulfill legal or compliance requirements regarding the retention of backups. In most cases, full database backups are kept for a longer time than other types.
Checking the consistency of databases by using the DBCC CHECKDB statement is a crucial part of database maintenance, and is discussed later in the course.
As well as deciding how long backups need to be retained, you will need to determine where they are kept. Part of the RTO needs to consider how long it takes to obtain the physical backup media if it needs to be restored.
You should also make sure that backups are complete. Are all files that are needed to recover the system (including external operating system files) being backed up?
Options for Ensuring Backup Integrity
SQL Server Backup includes options to help ensure the integrity of backups, reducing the risk of finding that backups are unusable when they are required to recover from a failure.
Mirrored Media Sets
A mirrored media set is a copy of the backup media set that you can optionally create in parallel with the main backup operation. It consists of two to four device mirrors, each containing the entire media set. You should configure each mirror with the same number of backup devices, which must be of the same device type.
Using the theory that it is better to have multiple copies of a backup, rather than a single copy, mirroring a media set can increase the availability of your data. However, it is important to realize that mirroring a media set exposes your system to a higher level of hardware failure risk, because a malfunction of any of the backup devices causes the entire backup operation to fail. You can create a mirrored backup set by using the MIRROR TO option of the BACKUP statement.
Creating a Mirrored Backup Set
BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW.bak'
MIRROR TO DISK = 'Q:\Backups\AW_M.bak'
WITH FORMAT, INIT;
Note: The mirrored media set functionality is only available in SQL Server Enterprise Edition.
WITH CHECKSUM Option
SQL Server enables you to calculate a checksum over an entire backup stream and write the value to the end of the backup. The WITH CHECKSUM option validates the page-level information (checksum or torn-page detection, if either is present) as well as the single checksum for the backup stream. It does, however, consume slightly more CPU resources during the backup process than a backup without checksum calculation. You can configure SQL Server to assess the checksum value, either during restore operations or during backup verification operations performed with the RESTORE VERIFYONLY command. You enable the checksum option by using the WITH CHECKSUM clause of the BACKUP statement.
Using a Checksum
BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW.bak'
WITH CHECKSUM;
Note: The COMPRESSION option also enables the CHECKSUM option automatically.
Backup Verification
To verify a backup, you can use the RESTORE VERIFYONLY statement, which checks the backup for validity but does not restore it. The statement performs the following checks:
Backup set is complete.
All volumes are readable.
Page identifiers are correct (to the same level as if it were about to write the data).
Checksum is valid (if present on the media).
Sufficient space exists on destination devices.
The checksum value can only be validated if the backup was performed with the WITH CHECKSUM option. Without the CHECKSUM option during backup, the verification options only check the metadata and not the actual backup data. The RESTORE VERIFYONLY statement is similar to the RESTORE statement and supports a subset of its arguments.
Verifying a Backup
RESTORE VERIFYONLY
FROM DISK = 'R:\Backups\AW.bak';
You can also perform verification steps by using the backup database task in SSMS. Best Practice: Consider verifying backups on a different system from the one where the backup was performed. This eliminates the situation where a backup is only readable on the source hardware.
Viewing Backup History
SQL Server tracks all backup activity in the following tables in the msdb database:
backupfile
backupfilegroup
backupmediafamily
backupmediaset
backupset
You can query these tables to retrieve information about backups that have been performed. SSMS also provides reports and logs of backup information. You can use system stored procedures to delete backup history.
Deleting Backup History
-- Delete all backup history from before 2009
EXEC sp_delete_backuphistory @oldest_date = '20090101';
-- Delete all backup history for the Sales database
EXEC sp_delete_database_backuphistory @database_name = 'Sales';
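To retrieve history rather than delete it, a query along these lines (a sketch joining two of the tables listed above) returns recent backups and their destinations:
Querying Backup History
SELECT bs.database_name,
       bs.type,                 -- D = full, I = differential, L = log
       bs.backup_start_date,
       bs.backup_finish_date,
       bmf.physical_device_name
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
    ON bs.media_set_id = bmf.media_set_id
ORDER BY bs.backup_finish_date DESC;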
Note: If a database is restored onto another server, the backup information is not restored with the database, as it is held in the msdb database of the original system.
Retrieving Backup Metadata
You can find information about a specific media set by executing the RESTORE statement with specific options:
RESTORE LABELONLY. Returns information about the backup media on a specified backup device.
RESTORE HEADERONLY. Returns the backup header information for all backup sets on a particular backup device.
RESTORE FILELISTONLY. Returns a list of the data and log files contained in a backup set.
Demonstration: Verifying Backups
In this demonstration, you will see how to:
View the Backup and Restore Events report.
Query backup history tables.
Verify backup media.
Demonstration Steps
View the Backup and Restore Events Report
1. Ensure that you have performed the previous demonstration in this module.
2. In SQL Server Management Studio, in Object Explorer, under Databases, right-click AdventureWorks, point to Reports, point to Standard Reports, and click Backup and Restore Events.
3. In the Backup and Restore Events [AdventureWorks] report, expand Successful Backup Operations and view the backup operations that have been performed for this database.
4. In the Device Type column, expand each of the Disk (temporary) entries to view details of the backup media set files.
Query Backup History Tables
1. In SQL Server Management Studio, open the VerifyingBackups.sql script file in the D:\Demofiles\Mod04 folder.
2. Select the code under the comment View backup history, and click Execute.
3. View the query results, which show the backups that have been performed for the AdventureWorks database.
Verify Backup Media
1. Select the code under the comment Use RESTORE HEADERONLY, and click Execute.
2. View the query results, which show the backups in the AW.bak backup device.
3. Select the code under the comment Use RESTORE FILELISTONLY, and click Execute.
4. View the query results, which show the database files contained in the backups.
5. Select the code under the comment Use RESTORE VERIFYONLY, and click Execute.
6. View the message that is returned, which should indicate that the backup is valid.
Lab: Backing Up Databases
Scenario
As a database administrator for Adventure Works Cycles, you are responsible for the HumanResources, InternetSales, and AWDataWarehouse databases. You must implement a backup solution for these databases, based on the backup requirements that have been provided.
Objectives
After completing this lab, you will be able to:
Implement a backup strategy based on full database backups.
Implement a backup strategy based on full, differential, and transaction log backups.
Implement a backup strategy based on filegroup and partial backups.
Estimated Time: 90 minutes
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
Exercise 1: Backing Up Databases
Scenario
The backup strategy for the HumanResources database is based on daily full database backups. You must perform tests of these backup operations and verify the backups. The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. Set the Recovery Model
3. Perform a Full Database Backup
4. Modify Data in the Database
5. Perform Another Full Database Backup
6. View the Backup and Restore Events Report
Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Run Setup.cmd in the D:\Labfiles\Lab04\Starter folder as Administrator.
Task 2: Set the Recovery Model
1. Based on the proposed backup strategy for the HumanResources database, determine the appropriate recovery model for the database.
2. Use SQL Server Management Studio to check the current recovery model of the database, and change it if necessary.
Task 3: Perform a Full Database Backup
1. Back up the HumanResources database to R:\Backups\HumanResources.bak.
   o Use a full backup.
   o Create a new media set with the name “Human Resources Backup”.
   o Name the backup set “HumanResources-Full Database Backup”.
   o Compress the backup.
2. Verify that the backup file has been created, and note its size.
Task 4: Modify Data in the Database
1. Update the Employee table in the HumanResources database using the following Transact-SQL code:
UPDATE HumanResources.dbo.Employee
SET PhoneNumber = '151-5551234'
WHERE BusinessEntityID = 259;
This code is in the Update HumanResources.sql file in the D:\Labfiles\Lab04\Starter folder.
Task 5: Perform Another Full Database Backup
1. Back up the HumanResources database to R:\Backups\HumanResources.bak.
   o Use a full backup.
   o Back up the database to the existing media set, and append the backup to the existing backup sets.
   o Name the backup set “HumanResources-Full Database Backup 2”.
   o Compress the backup.
2. Verify that the size of the backup file has increased.
Task 6: View the Backup and Restore Events Report
1. In SQL Server Management Studio, view the Backup and Restore Events report for the HumanResources database.
2. Verify that the report shows the two backups you have created (HumanResources-Full Database Backup and HumanResources-Full Database Backup 2).
Results: At the end of this exercise, you will have backed up the HumanResources database to R:\Backups\HumanResources.bak.
Exercise 2: Performing Database, Differential, and Transaction Log Backups
Scenario
The backup strategy for the InternetSales database uses a combination of full, differential, and transaction log backups. The main tasks for this exercise are as follows:
1. Set the Recovery Model
2. Perform a Full Database Backup
3. Modify Data in the Database
4. Perform a Transaction Log Backup
5. Modify Data in the Database
6. Perform a Differential Backup
7. Modify Data in the Database
8. Perform Another Transaction Log Backup
9. Verify Backup Media
Task 1: Set the Recovery Model
1. Based on the proposed backup strategy for the InternetSales database, determine the appropriate recovery model for the database.
2. Use SQL Server Management Studio to check the current recovery model of the database, and change it if necessary.
Task 2: Perform a Full Database Backup
1. Back up the InternetSales database to R:\Backups\InternetSales.bak.
   o Use a full backup.
   o Create a new media set with the name “Internet Sales Backup”.
   o Name the backup set “InternetSales-Full Database Backup”.
   o Compress the backup.
2. Verify that the backup file has been created, and note its size.
Task 3: Modify Data in the Database
1. Update the Product table in the InternetSales database using the following Transact-SQL code:
UPDATE InternetSales.dbo.Product
SET ListPrice = ListPrice * 1.1
WHERE ProductSubcategoryID = 1;
This code is in the Update InternetSales.sql file in the D:\Labfiles\Lab04\Starter folder.
Task 4: Perform a Transaction Log Backup
1. Back up the InternetSales database to R:\Backups\InternetSales.bak (a Transact-SQL equivalent is sketched after this task).
o Use a transaction log backup.
o Back up the log to the existing media set, and append the backup to the existing backup sets.
o Name the backup set “InternetSales-Transaction Log Backup”.
o Compress the backup.
2. Verify that the size of the backup file has increased.
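The following sketch shows one possible Transact-SQL equivalent of this log backup; the option combination is an assumed mapping of the SSMS settings, and the path and name come from the steps above.

-- Sketch: transaction log backup appended to the existing media set (assumed equivalent of the SSMS settings)
BACKUP LOG InternetSales
TO DISK = 'R:\Backups\InternetSales.bak'
WITH NOFORMAT, NOINIT,
    NAME = 'InternetSales-Transaction Log Backup',
    COMPRESSION;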
Task 5: Modify Data in the Database
1. Update the Product table in the InternetSales database using the following Transact-SQL code:

UPDATE InternetSales.dbo.Product
SET ListPrice = ListPrice * 1.1
WHERE ProductSubcategoryID = 2;

This code is in the Update InternetSales.sql file in the D:\Labfiles\Lab04\Starter folder.
Task 6: Perform a Differential Backup
1. Back up the InternetSales database to R:\Backups\InternetSales.bak.
o Use a differential backup.
o Back up the database to the existing media set, and append the backup to the existing backup sets.
o Name the backup set “InternetSales-Differential Backup”.
o Compress the backup.
2. Verify that the size of the backup file has increased.
Task 7: Modify Data in the Database
1. Update the Product table in the InternetSales database using the following Transact-SQL code:

UPDATE InternetSales.dbo.Product
SET ListPrice = ListPrice * 1.1
WHERE ProductSubcategoryID = 3;

This code is in the Update InternetSales.sql file in the D:\Labfiles\Lab04\Starter folder.
Task 8: Perform Another Transaction Log Backup
1. Back up the InternetSales database to R:\Backups\InternetSales.bak.
o Use a transaction log backup.
o Back up the log to the existing media set, and append the backup to the existing backup sets.
o Name the backup set “InternetSales-Transaction Log Backup 2”.
o Compress the backup.
2. Verify that the size of the backup file has increased.
Task 9: Verify Backup Media
1. Use the following query to verify that the backups you performed in this exercise are all on the backup device:

RESTORE HEADERONLY FROM DISK = 'R:\Backups\InternetSales.bak';
GO
2. Use the following query to identify the database files that are included in the backups:

RESTORE FILELISTONLY FROM DISK = 'R:\Backups\InternetSales.bak';
GO

3. Use the following query to verify that the backups are valid:

RESTORE VERIFYONLY FROM DISK = 'R:\Backups\InternetSales.bak';
GO
Results: At the end of this exercise, you will have backed up the InternetSales database to R:\Backups\InternetSales.bak.
Exercise 3: Performing a Partial Backup

Scenario
The AWDataWarehouse database is too large for a conventional backup strategy, so you have decided to use a partial backup strategy.

The main tasks for this exercise are as follows:
1. Set the Recovery Model
2. Back Up the Read-Only Filegroup
3. Perform a Partial Backup
4. Modify Data in the Database
5. Perform a Differential Partial Backup
6. Verify Backup Media
Task 1: Set the Recovery Model
1. Based on the proposed backup strategy for the AWDataWarehouse database, determine the appropriate recovery model for the database.
2. Use SQL Server Management Studio to check the current recovery model of the database, and change it if necessary.
Task 2: Back Up the Read-Only Filegroup
1. Use the following query to back up the read-only Archive filegroup:

BACKUP DATABASE AWDataWarehouse
FILEGROUP = 'Archive'
TO DISK = 'R:\Backups\AWDataWarehouse-Read-Only.bak'
WITH FORMAT, INIT, NAME = 'AWDataWarehouse-Archive', COMPRESSION;

2. Verify that the backup file AWDataWarehouse-Read-Only.bak has been created in the R:\Backups folder.
Task 3: Perform a Partial Backup
1. Use the following query to perform a partial backup of the AWDataWarehouse database:

BACKUP DATABASE AWDataWarehouse
READ_WRITE_FILEGROUPS
TO DISK = 'R:\Backups\AWDataWarehouse-Read-Write.bak'
WITH FORMAT, INIT, NAME = 'AWDataWarehouse-Active Data', COMPRESSION;

2. Verify that the backup file AWDataWarehouse-Read-Write.bak has been created in the R:\Backups folder.
Task 4: Modify Data in the Database
1. Add a record to the FactInternetSales table in the AWDataWarehouse database using the following Transact-SQL code:

INSERT INTO AWDataWarehouse.dbo.FactInternetSales
VALUES (1, 20080801, 11000, 5.99, 2.49);

This code is in the Update AWDataWarehouse.sql file in the D:\Labfiles\Lab04\Starter folder.
Task 5: Perform a Differential Partial Backup
1. Use the following query to perform a differential partial backup of the AWDataWarehouse database:

BACKUP DATABASE AWDataWarehouse
READ_WRITE_FILEGROUPS
TO DISK = 'R:\Backups\AWDataWarehouse-Read-Write.bak'
WITH DIFFERENTIAL, NOFORMAT, NOINIT, NAME = 'AWDataWarehouse-Active Data Diff', COMPRESSION;
Task 6: Verify Backup Media
1. Use the following query to view the backups on AWDataWarehouse-Read-Only.bak, and scroll to the right to view the BackupTypeDescription column:

RESTORE HEADERONLY FROM DISK = 'R:\Backups\AWDataWarehouse-Read-Only.bak';
GO

2. Use the following query to view the backups on AWDataWarehouse-Read-Write.bak, and scroll to the right to view the BackupTypeDescription column:

RESTORE HEADERONLY FROM DISK = 'R:\Backups\AWDataWarehouse-Read-Write.bak';
GO
Results: At the end of this exercise, you will have backed up the read-only filegroup in the AWDataWarehouse database to R:\Backups\AWDataWarehouse-Read-Only.bak, and you will have backed up the writable filegroups to R:\Backups\AWDataWarehouse-Read-Write.bak.
Module Review and Takeaways
In this module, you have learned how to create a backup strategy that is aligned with organizational needs, and seen how the transaction logging capabilities of SQL Server can help you to achieve an appropriate outcome.

Best Practice: Plan your backup strategy carefully.
Plan the backup strategy in conjunction with the business needs.
Choose the appropriate database recovery model.
Plan your transaction log size based on the transaction log backup frequency.
Consider using differential backups to speed recovery.
Consider compressing backups to reduce storage requirements and backup time.
Review Question(s)
Question: What are the unique features of transaction log restores?
Question: When might a full database backup strategy be adequate?
Module 5
Restoring SQL Server 2014 Databases

Contents:
Module Overview 5-1
Lesson 1: Understanding the Restore Process 5-2
Lesson 2: Restoring Databases 5-6
Lesson 3: Advanced Restore Scenarios 5-11
Lesson 4: Point-in-Time Recovery 5-17
Lab: Restoring SQL Server Databases 5-21
Module Review and Takeaways 5-25

Module Overview
In the previous module, you learned how to create backups of Microsoft® SQL Server® 2014 databases. A backup strategy might involve many different types of backup, so it is essential that you can restore each of them effectively.
You will often be restoring a database in an urgent situation. You must, however, ensure that you have a clear plan of how to proceed and successfully recover the database to the required state. A good plan and understanding of the restore process can help avoid making the situation worse. Some database restores are related to system failure. In these cases, you will want to return the system as close as possible to the state it was in prior to the failure. Some failures, though, are related to human error and you may wish to recover the system to a point prior to the error. The point-in-time recovery features of SQL Server 2014 can help you to achieve this.
User databases are more likely to be affected by system failures than system databases because they are typically much larger. However, system databases can be affected by failures, and special care needs to be taken when recovering them. In particular, you need to understand how to recover each system database because you cannot use the same process for all system databases. In this module, you will see how to restore user and system databases and how to implement point-in-time recovery.
Objectives After completing this module, you will be able to:
Explain the restore process.
Restore databases.
Perform advanced restore operations.
Perform a point-in-time recovery.
Lesson 1
Understanding the Restore Process
When you need to recover a database, it is essential to have a good plan to avoid causing further damage. After you have completed the preliminary step of attempting to create a tail-log backup, it is most important to determine which database backups to restore—and in which order.
Lesson Objectives After completing this lesson, you will be able to:
Describe the restore process.
Describe the different types of restores.
Decide on the backups to restore and in which order.
Phases of the Restore Process
Restoring a SQL Server database consists of three phases: data copy, redo, and undo. The combination of the redo and the undo phases is commonly referred to as the recovery of a database.
Data Copy The data copy phase is typically the longest in a database restore. Firstly, the data files from the database need to be retrieved from the backups. Before any data pages are restored, the restore process reads the header of the backup and SQL Server recreates the required data and log files. If instant file initialization (IFI) has not been enabled by granting rights to the SQL Server service account, the rewriting of the data files can take a substantial amount of time.
After the data and log files are recreated, the data files are restored from the full database backup. Data pages are retrieved from the backup in order and written to the data files. The log files need to be zeroed out before they can be used. This process can also take a substantial time if the log files are large.
If a differential backup is also being restored, SQL Server overwrites the extents in the data files with those contained in the differential backup.
Redo Phase
At the start of the redo phase, SQL Server retrieves details from the transaction log. In the simple recovery model, these details are retrieved from either the full database backup or the differential backup. In the full or bulk-logged recovery model, these log file details are supplemented by the contents of any transaction log backups that were taken after the full and differential database backups. In the redo phase, SQL Server rolls all changes that are contained within the transaction log details into the database pages, up to the recovery point. The recovery point is typically the latest time for which transactions exist in the log.
Undo Phase
The transaction log will likely include details of transactions that were not committed at the recovery point, which is typically the time of the failure. In the undo phase, SQL Server rolls back any of these uncommitted transactions.
Because the action of the undo phase involves rolling back uncommitted transactions and placing the database online, no more backups can be restored.
During the undo phase, the Enterprise edition of SQL Server will allow the database to come online and users to begin to access it. This capability is referred to as the fast recovery feature. Queries that attempt to access data that is still being undone are blocked until the undo phase is complete. This can potentially cause transactions to time out, but does mean that users can access the database sooner. In general, you cannot bring a database online until it has been recovered. The one exception to this is the fast recovery option, which allows users to access the database while the undo phase is continuing.
Recovery does not only occur during the execution of RESTORE commands. If a database is taken offline and then placed back into an ONLINE state, recovery of the database will also occur. The same recovery process takes place when SQL Server restarts.
Note: Other events that lead to database recovery include clustering or database mirroring failovers. Failover clustering and database mirroring are advanced topics that are beyond the scope of this course.
Types of Restores
The restore scenarios available for a database depend on its recovery model and the edition of SQL Server you are using.
Complete Database Restore in Simple Recovery Model
The most basic restore strategy for SQL Server databases is to restore and recover a full database backup. If available, you can restore the latest differential backup after the restore of the full database backup but before the recovery process for the database.
In most scenarios involving the simple recovery model, no differential backups are performed. In these cases, you only restore the last full database backup, and then the recovery phase returns the database to the state it was in at the time just prior to the full database backup being completed.
Complete Database Restore in Full Recovery Model
The most common restore strategy requires full or bulk-logged recovery model and involves restoring full, differential (if present), and log backups. While the aim of the restore is to recover the database to the latest point in time possible, options exist to restore the database to earlier points in time. These options will be discussed later in this module.
System Database Restore
Restoring system databases is possible but requires special processes to avoid further issues from occurring. For example, if a master database is left in an inconsistent state, SQL Server will refuse to start until the master database is recovered. The recovery of system databases will be discussed later in this module.
File or Filegroup Restore
SQL Server includes the functionality to restore filegroups or individual files, so if specific files in a database are corrupt or lost you can potentially reduce the overall time to recover the database. The recovery of individual files is only supported for read-only files when operating in simple recovery model, but you can use it for read-write files when using the bulk-logged or full recovery models. The recovery of individual files uses a process that is similar to the complete database restore process. It is discussed later in this module.
Piecemeal Restore
A piecemeal restore is used to restore and recover the database in stages, based on filegroups, rather than restoring the entire database at a single time. The first filegroup that must be restored is the primary filegroup, usually along with the read/write filegroups for which you want to prioritize recovery. You can then restore read-only filegroups. In SQL Server 2014, online piecemeal restore is only available in the Enterprise edition; in other editions, the database remains offline until the piecemeal restore sequence is complete.
Page Restore
Another advanced option is the ability to restore an individual data page. If an individual data page is corrupt, users will usually see either an 823 error or an 824 error when they execute a query that tries to access the page. You can try to recover the page using a page restore. If a user query tries to access the page after the restore starts, they will see error 829, which indicates the page is restoring. If the page restore is successful, user queries that access the page will again return results as expected. Page restores are supported under full and bulk-logged recovery models, but not under simple recovery model.
Online Restore
Online restore involves restoring data while the database is online. This is the default option for File, Page, and Piecemeal restores. In SQL Server 2014, online restore is only available in the Enterprise edition.
Preparations for Restoring Backups
Before restoring any database, it is important to attempt a tail-log backup, unless you are intending to replace the current state of the database. You can often perform a tail-log backup, even when damage has occurred to the data files of the database. The tail-log backup is critical when you need to restore the database to the latest possible point in time.
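For example, the following sketch attempts a tail-log backup of a damaged database; the database name and backup path are placeholders. The NO_TRUNCATE option allows the log to be backed up even when the database's data files are damaged.

-- Sketch: tail-log backup of a damaged database (placeholder name and path)
BACKUP LOG AdventureWorks
TO DISK = 'R:\Backups\AW-TailLog.bak'
WITH NO_TRUNCATE;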
Identifying Backups to Restore
The recovery of any database depends upon restoring the correct backups in the correct order. The normal process for restoring a database is:
1. Restore the latest full database backup as a base to work from. If only individual files are damaged or missing, you may be able to restore just those files.
2. If differential backups exist, you only need to restore the latest differential backup.
3. If transaction log backups exist, you need to restore all transaction log backups since the last differential backup. You also need to include the tail-log backup created at the start of the restore process, if the tail-log backup was successful. (This step does not apply to databases using the simple recovery model.)
Discussion: Determining Required Backups to Restore
The scenario on the slide describes the backup schedule for an organization and the timing of a failure. What restore process should you follow?
Lesson 2
Restoring Databases
Most restore operations involve restoring a full database backup, often followed by a differential backup and a sequence of transaction log backups. In this lesson, you will learn how to restore these types of backup and recover a database.
Lesson Objectives After completing this lesson, you will be able to:
Restore a full database backup.
Restore a differential backup.
Restore a transaction log backup.
Restoring a Full Database Backup
You can restore a database by using either SQL Server Management Studio (SSMS) or the RESTORE DATABASE statement in Transact-SQL.
Restoring a Database
The simplest recovery scenario is to restore a database from a single full database backup. If no subsequent differential or transaction log backups need to be applied, you can use the RECOVERY option to specify that SQL Server should complete the recovery process for the database and bring it online. If additional backups must be restored, you can prevent recovery from occurring by specifying the NORECOVERY option. If you do not specify either of these options, SQL Server uses RECOVERY as the default behavior.

In the following example, the AdventureWorks database is restored from the AW.bak backup media:

Restoring a Database from a Full Backup
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW.bak';
Replacing an Existing Database
SQL Server will not allow you to restore a database backup over an existing database if you have not performed a tail-log backup on the database. If you attempt to do this using SSMS, SQL Server will provide a warning and will automatically attempt to create a tail-log backup for you. If you need to perform the restore operation and you do not have a tail-log backup, you must specify the WITH REPLACE option.

In the following code example, the existing AdventureWorks database is replaced with the database in the AW.bak backup media:

RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH REPLACE;
Note: The WITH REPLACE option needs to be used with caution as it can lead to data loss.
Restoring Database Files to a Different Location
When you restore a database from another server, you might need to place the database files in different locations to those recorded in the backup from the original server. You might also need to do this if you are copying a database by using backup and restore. The WITH MOVE option enables you to specify new file locations.

In this example, the AdventureWorks database is being restored from another server. As well as specifying the source location for the media set, new locations for each database file are also specified in the RESTORE statement. Note that the MOVE option requires the specification of the logical file name, rather than the original physical file path.

WITH MOVE
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH MOVE 'AdventureWorks_Data' TO 'Q:\Data\AdventureWorks.mdf',
MOVE 'AdventureWorks_Log' TO 'U:\Logs\AdventureWorks.ldf';
Restoring a Database in Standby Mode
SQL Server provides the ability to view the contents of a database that has not been recovered, by using the option WITH STANDBY, instead of WITH NORECOVERY. After you restore a database by using the WITH STANDBY option, you can still apply further transaction log backups to the database. The STANDBY option is typically used to support Log Shipping scenarios, in which a secondary copy of a database is synchronized by reapplying the transactions in the transaction log of the primary database. You can also use the STANDBY option to enable inspection of data in a database that you do not want to bring online.
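The following sketch restores a database into standby mode; the undo file path is a placeholder. The undo file stores the information that SQL Server needs to reverse the undo pass, so that further transaction log backups can still be applied.

-- Sketch: restore in standby mode (undo file path is a placeholder)
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH STANDBY = 'R:\Backups\AW_Undo.dat';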
Restoring a Differential Backup
In a differential backup strategy, you must restore an initial full database backup and then restore the latest differential backup.
Database Recovery When Restoring Multiple Backups
As discussed previously, the recovery process in SQL Server is critical to the maintenance of transactional integrity. This requires that all transactions that had committed before the failure are recorded in the database and all transactions that had not committed are rolled back.
The RESTORE command includes an option to specify WITH RECOVERY or WITH NORECOVERY. The WITH RECOVERY option is the default action and does not need to be specified. This ensures that a database is brought online immediately after being restored from a full database backup. However, when your backup strategy requires you to restore additional backups subsequent to the full database backup, it is important to choose the correct option for each RESTORE command. The process is straightforward in most cases. All restores must be performed WITH NORECOVERY except the last restore, which must be WITH RECOVERY. Until the final backup is restored with the RECOVERY option, the database will display as (Restoring…) in SSMS.
There is no way to restore additional backups after a restore WITH RECOVERY has been processed. If you accidentally perform a restore using the WITH RECOVERY option, you must restart the entire restore sequence.
Restoring a Database from a Differential Backup
The command for restoring a differential backup is identical to that for restoring a full database backup. Differential backups are often appended to the same file as the full database backup (in which case you need to specify the specific file from the media set that you want to restore).
In this example, the AdventureWorks database is restored from the first file in the media set containing a full database backup. This media set is stored in the operating system file R:\Backups\AW.bak. The second file in the media set is the first differential backup, but the changes in this are also contained in the second differential backup in the third file. Therefore, the second RESTORE statement only needs to restore the contents of the third file.

Restoring Full and Differential Backups
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH FILE = 1, NORECOVERY;
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH FILE = 3, RECOVERY;
If both the full and differential backup sets are on the same backup media, SSMS automatically selects the required backups in the Restore Database dialog box and ensures that the appropriate recovery settings are applied.
Restoring Transaction Log Backups
You can restore the transaction logs for a database by using either SSMS or the RESTORE LOG statement in Transact-SQL. You should restore all log files, apart from the last log, using the WITH NORECOVERY option and then restore the last log file (which is often the tail-log backup) using the WITH RECOVERY option.
When using a backup strategy that involves transaction log backups, you must start by restoring the latest full database backup, followed by the latest differential backup if one exists, and then you must restore all transaction logs created since the last full or differential backup in chronological order. Any break in the chain of transaction logs will cause the restore process to fail and require you to restart the recovery from the beginning.

In the following example, the AdventureWorks database has failed but the log file was accessible, so a tail-log backup has been stored in AW-TailLog.bak. To restore the database, the latest full backup (backup set 1 in AW.bak) is restored using the NORECOVERY option, followed by the latest differential backup (backup set 3 in AW.bak), again with the NORECOVERY option. Then all subsequent planned transaction log backups (backup sets 4 and 5 in AW.bak) are restored in chronological order with the NORECOVERY option, before finally the tail-log backup (the only backup set in AW-TailLog.bak) is restored with the RECOVERY option.
Using the RESTORE LOG Statement
-- Restore last full database backup
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH FILE = 1, NORECOVERY;
-- Restore last differential backup
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH FILE = 3, NORECOVERY;
-- Restore planned log backups
RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH FILE = 4, NORECOVERY;
RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH FILE = 5, NORECOVERY;
-- Restore tail-log backup
RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW-TailLog.bak'
WITH RECOVERY;
Note: In the previous example, the log file was available after the database failed, so a tail-log backup could be taken. This enables the database to be recovered to the point of failure. Had the log file not been available, the last planned transaction log backup (backup set 5 in AW.bak) would have been restored using the RECOVERY option, and all transactions since that backup would have been lost.
Demonstration: Restoring Databases
In this demonstration, you will see how to:
Create a tail-log backup.
Restore a database.
Demonstration Steps
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod05 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows authentication.
4. In Object Explorer, expand Databases, and note that the AdventureWorks database is in a Recovery Pending state.
5. Click New Query and execute the following Transact-SQL code to attempt to bring the database online:

ALTER DATABASE AdventureWorks SET ONLINE;

6. Note the error message that is displayed. The AdventureWorks.mdf data file has been lost, so the database cannot be brought online.
7. Delete the ALTER DATABASE statement, and replace it with the following code to perform a tail-log backup:

BACKUP LOG AdventureWorks
TO DISK = 'D:\Demofiles\Mod05\AW-TailLog.bak'
WITH NO_TRUNCATE;
8. Click Execute, and view the resulting message to verify that the backup is successful.
Restore a Database
1. In Object Explorer, right-click the AdventureWorks database, point to Tasks, point to Restore, and click Database.
2. In the Restore Database – AdventureWorks dialog box, in the Source section, select Device and click the ellipses (...) button.
3. In the Select backup devices dialog box, click Add, and then in the Locate Backup File – MIA-SQL dialog box, select D:\Demofiles\Mod05\AW.bak and click OK.
4. In the Select backup devices dialog box, ensure that D:\Demofiles\Mod05\AW.bak is listed, and then click Add.
5. In the Locate Backup File – MIA-SQL dialog box, select D:\Demofiles\Mod05\AW-TailLog.bak and click OK.
6. In the Select backup devices dialog box, ensure that both D:\Demofiles\Mod05\AW.bak and D:\Demofiles\Mod05\AW-TailLog.bak are listed and click OK.
7. Note that the backup media contains a full backup, a differential backup, and a transaction log backup (these are the planned backups in AW.bak); and a copy-only transaction log backup (which is the tail-log backup in AW-TailLog.bak). All of these are automatically selected in the Restore column.
8. On the Options page, ensure that the Recovery state is set to RESTORE WITH RECOVERY.
9. In the Script drop-down list, click New Query Editor Window. Then click OK.
10. When the database has been restored successfully, click OK.
11. View the Transact-SQL code that was used to restore the database, noting that the full backup, the differential backup, and the first transaction log backup were restored using the NORECOVERY option. The restore operation for the tail-log backup used the default RECOVERY option to recover the database.
12. In Object Explorer, verify that the AdventureWorks database is now recovered and ready to use.
13. Close SQL Server Management Studio without saving any files.
Lesson 3
Advanced Restore Scenarios
The techniques discussed in the previous lesson cover most common restore scenarios. However, there are some more complex restore scenarios for which a DBA must be prepared. This lesson discusses restore scenarios for file and filegroup backups, encrypted backups, individual data pages, and system databases.
Lesson Objectives After completing this lesson, you will be able to:
Restore a file or filegroup backup (including a piecemeal restore).
Restore an encrypted backup.
Restore individual data pages.
Restore system databases.
Restoring File or Filegroup Backups
It can often be much quicker to restore a single file or filegroup than an entire database. You do not need to back up specific files or filegroups in order to restore them because SQL Server can extract the specific database files from a full or differential backup.
Restoring a File or Filegroup
Perform the following steps to restore an individual file or filegroup:
1. Create a tail-log backup of the active transaction log. (If you cannot do this because the log has been damaged, you must restore the whole database or restore to an earlier point in time.)
2. Restore each damaged file from the most recent file backup of that file.
3. Restore the most recent differential file backup, if any, for each restored file.
4. Restore transaction log backups in sequence, starting with the backup that covers the oldest of the restored files and ending with the tail-log backup created in step 1.
5. Recover the database.
You must restore the transaction log backups that were created after the file backups to bring the database back to a consistent state. The transaction log backups can be rolled forward quickly, because only the changes that relate to the restored files or filegroups are applied. Undamaged files are not copied and then rolled forward. However, you do still need to process the whole chain of log backups.
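The following sketch outlines this sequence for a single damaged file; the logical file name and backup paths are placeholders.

-- Sketch: restore one damaged file, then roll the log chain forward (placeholder names and paths)
RESTORE DATABASE AdventureWorks
FILE = 'AdventureWorks_Data'
FROM DISK = 'R:\Backups\AW.bak'
WITH NORECOVERY;

RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW-Log.bak'
WITH NORECOVERY;

-- Finish with the tail-log backup and recover the database
RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW-TailLog.bak'
WITH RECOVERY;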
Performing a Piecemeal Restore
As discussed in the previous module, when a database is extremely large, you can use filegroups to store inactive data on read-only filegroups, and use a partial backup strategy in which each read-only filegroup is backed up once, and only read/write filegroups are included in subsequent backups—significantly reducing the time taken to perform a full or differential backup.
One of the advantages of including read-only filegroups in a partial backup strategy is that it enables you to perform a piecemeal restore. In a piecemeal restore, you can recover read/write filegroups and make them available to users for querying before the recovery of read-only filegroups is complete. To perform a piecemeal restore (a Transact-SQL sketch follows this list):
1. Restore the latest partial full database backup, specifying the read/write filegroups to be restored and using the PARTIAL option to indicate that read-only filegroups will be restored separately.
2. Restore the latest partial differential backup, and log file backups if they exist. Use the RECOVERY option with the last RESTORE operation to recover the database. Data in read/write filegroups is now available.
3. Restore each read-only filegroup backup with the RECOVERY option to bring it online.
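The following sketch shows the shape of this sequence; the filegroup names and backup paths are placeholders.

-- Sketch: piecemeal restore (placeholder filegroup names and paths)
-- 1. Restore the primary and read/write filegroups from the partial full backup
RESTORE DATABASE AdventureWorks
FILEGROUP = 'ReadWriteFG'
FROM DISK = 'R:\Backups\AW-Partial.bak'
WITH PARTIAL, NORECOVERY;
-- 2. Restore the latest partial differential backup and recover; read/write data is now available
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW-Partial-Diff.bak'
WITH RECOVERY;
-- 3. Bring each read-only filegroup online from its own backup
RESTORE DATABASE AdventureWorks
FILEGROUP = 'ReadOnlyFG'
FROM DISK = 'R:\Backups\AW-ReadOnly.bak'
WITH RECOVERY;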
Restoring an Encrypted Backup
You can restore an encrypted backup to any SQL Server instance that hosts the certificate or key you used to encrypt the backup. This means that if you need to restore an encrypted backup to the server from which you backed up the database, you can restore the backup using the same procedure as for a non-encrypted backup, as long as the certificate or key still exists in the instance.
In many recovery scenarios however, the database must be restored onto a different SQL Server instance; for example, because the original instance has failed irretrievably or because you want to move the database to a different server. In this case, you must use the following procedure to restore the encrypted database:
1. Create a database master key for the master database. This does not need to be the same database master key that was used in the original instance, but if you are recovering from a complete server failure you can restore the original database master key from a backup.
2. Create a certificate or key from a backup. Use the CREATE CERTIFICATE or CREATE ASYMMETRIC KEY statement to create a certificate or key from the backup you created of the original key used to encrypt the database. The new certificate or key must have the same name as the original, and if you used a certificate, you must restore both the public certificate and the private key.
3. Restore the database. Now that the encryption key is available on the SQL Server instance, you can restore the database as normal.
The following code sample shows how to restore an encrypted database backup on a new SQL Server instance:

Restoring an Encrypted Backup
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Pa$$w0rd';
CREATE CERTIFICATE BackupCert
FROM FILE = 'K:\Backups\Backup.cer'
WITH PRIVATE KEY (
    DECRYPTION BY PASSWORD = 'CertPa$$w0rd',
    FILE = 'K:\Backups\Backup.key');
GO
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW_Encrypt.bak';
Demonstration: Restoring an Encrypted Backup
In this demonstration, you will see how to restore an encrypted backup.
Demonstration Steps
Restore an Encrypted Backup
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. If you did not complete the previous demonstration, in the D:\Demofiles\Mod05 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL\SQL2 database engine using Windows authentication.
4. In Object Explorer, under the MIA-SQL\SQL2 instance, expand Databases and view the existing databases on this instance.
5. Open the Restore Encrypted Backup.sql script file in the D:\Demofiles\Mod05 folder.
6. Select the code under the comment Try to restore an encrypted backup and click Execute. Note that this fails because the required certificate is not present.
7. Select the code under the comment Create a database master key for master and click Execute. This creates a database master key for the master database on MIA-SQL\SQL2.
8. Select the code under the comment Import the backed up certificate and click Execute. This creates a certificate from public and private key backups that were taken from the MIA-SQL instance.
9. Select the code under the comment Restore the encrypted database and click Execute. Note that this time the restore operation succeeds.
10. In Object Explorer, refresh the Databases folder and verify that the AdventureWorksEncrypt database has been restored.
11. Close SQL Server Management Studio.
Restoring Data Pages
In some scenarios, data pages in the database files can become corrupted. In this case, you can restore individual pages to repair the database, which may be faster than restoring the entire database or the affected file. Page restore is only supported for databases using the full or bulk-logged recovery model.
The first indication of a corrupt page is usually the occurrence of error 823 or 824 when querying a table. To identify potentially corrupt pages, you can use the DBCC CHECKDB command and query the suspect_pages system table in the msdb database. This table provides the page ID of each affected page, along with details of the potential problem. With this information, you can then restore damaged pages by using the RESTORE DATABASE Transact-SQL statement or by using the Restore Page dialog box in SQL Server Management Studio. To restore damaged pages while keeping the database online:
1. Restore one or more damaged pages from a full database backup.
2. Restore the latest differential backup if one exists.
3. Restore each subsequent transaction log backup with the NORECOVERY option.
4. Back up the transaction log.
5. Restore the transaction log with the RECOVERY option.
The final two steps of backing up and restoring the log are required to ensure that the final log sequence number (LSN) of the restored pages is set as the REDO target of the transaction log. Online page restore is only supported in SQL Server Enterprise Edition. You can perform an offline page restore by using the following procedure:
1. Back up the tail-log using the NORECOVERY option.
2. Restore one or more damaged pages from a full database backup.
3. Restore the latest differential backup if one exists.
4. Restore each subsequent transaction log backup with the NORECOVERY option.
5. Restore the tail-log with the RECOVERY option.
The following example code performs an online recovery of two pages:

Restoring a Page
-- Restore pages from the full backup
RESTORE DATABASE AdventureWorks PAGE='1:55, 1:207'
FROM DISK = 'R:\Backups\AdventureWorks.bak'
WITH FILE=1, NORECOVERY;
-- Apply the differential backup
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AdventureWorks.bak'
WITH FILE=3, NORECOVERY;
-- Restore subsequent transaction log backups
RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AdventureWorks.bak'
WITH FILE=4, NORECOVERY;
-- Backup the log
BACKUP LOG AdventureWorks TO DISK = 'R:\Backups\AW-Log.bak';
-- Restore the log to set the correct REDO LSN and recover
RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW-Log.bak'
WITH RECOVERY;
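To identify candidate pages for a restore like this one, you can query the suspect_pages table mentioned earlier. This is a minimal sketch; the table lives in msdb and the columns listed here are its documented columns.

-- Sketch: list pages that SQL Server has flagged as suspect
SELECT database_id, file_id, page_id, event_type, error_count, last_update_date
FROM msdb.dbo.suspect_pages;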
Recovering System Databases
The recovery process for all system databases is not identical. Each system database has specific recovery requirements:
master
The master database holds all system-level configurations. SQL Server requires the master database before a SQL Server instance can run at all. SQL Server cannot start without the master database, therefore if it is missing or corrupt, you cannot execute a standard RESTORE DATABASE command to restore it. Before starting to recover the master database, you must have access to a temporary master database so that the SQL Server instance will start. This temporary master database does not need to have the correct configuration as it will only be used to start up the instance to initiate the recovery process to restore the correct version of your master database. There are three ways that you can obtain a temporary master database:
You can use the SQL Server setup program to rebuild the system databases, either from the location that you installed SQL Server from or by running the setup program found at Microsoft SQL Server\120\Setup Bootstrap\SQLServer2014\setup.exe.
Note: Re-running the setup program will overwrite all your system databases, so you must ensure that they are regularly backed up and able to be restored after you have restored the master database.
You can use a file-level backup of the master database files to restore the master database. You must take this file-level backup when the master database is not in use—that is, when SQL Server is not running—or by using the VSS service.
Note: Copying the master database from another instance is not supported. The VSS service is beyond the scope of this course.
You can locate a copy of the master.mdf database file from the Templates folder located in the MSSQL\Binn folder for a SQL Server instance.
When you have created a temporary version of the master database, you can use the following procedure to recover the correct master database (a Transact-SQL sketch follows this list):
1. Start the server instance in single-user mode by using the –m startup option.
2. Use a RESTORE DATABASE statement to restore a full database backup of the master database. It is recommended that you execute the RESTORE DATABASE statement by using the sqlcmd utility. After restoring the master database, the instance of SQL Server will shut down and terminate your sqlcmd connection.
3. Remove the single-user startup parameter.
4. Restart SQL Server.
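The restore statement itself is the same RESTORE DATABASE syntax used elsewhere in this module, executed through sqlcmd while the instance is running in single-user mode. The following is a minimal sketch; the backup path is a placeholder, and WITH REPLACE is an assumption for overwriting the temporary master database.

-- Sketch: restore master in single-user mode via sqlcmd (placeholder path)
RESTORE DATABASE master
FROM DISK = 'R:\Backups\master.bak'
WITH REPLACE;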
model
The model database is the template for all databases that are created on the instance of SQL Server. When the model database is corrupt, the instance of SQL Server cannot start. This means that a normal restore command cannot be used to recover the model database if it becomes corrupted. In the case of a corrupt model database, you must start the instance with the -T3608 trace flag as a command-line parameter. This trace flag only starts the master database. When SQL Server is running, you can restore the model database by using the normal RESTORE DATABASE command.
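Once the instance has been started with the -T3608 trace flag, the restore itself is a normal RESTORE DATABASE statement. This is a minimal sketch; the backup path is a placeholder, and WITH REPLACE is an assumption for overwriting the corrupt copy.

-- Sketch: restore model after starting the instance with trace flag 3608 (placeholder path)
RESTORE DATABASE model
FROM DISK = 'R:\Backups\model.bak'
WITH REPLACE;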
msdb
SQL Server Agent uses the msdb database for scheduling alerts and jobs, and for recording details of operators. The msdb database also contains history tables, such as those that record details of backup and restore operations. If the msdb database becomes corrupt, SQL Server Agent will not start. You can restore the msdb database by using the RESTORE DATABASE statement as you would a user database, and then the SQL Server Agent service can be restarted.
resource
The resource database is read-only and contains copies of all system objects that ship with Microsoft SQL Server. This is a hidden database and you cannot perform backup operations on it. It can, however, be corrupted by failures in areas such as I/O subsystems or memory. If the resource database is corrupt, it can be restored by a file-level restore in Windows or by running the setup program for SQL Server.
tempdb
The tempdb database is a workspace for holding temporary or intermediate result sets. This database is recreated every time an instance of SQL Server starts, so there is no need to back up or restore it. When the server instance is shut down, any data in tempdb is permanently deleted.
Lesson 4
Point-in-Time Recovery
In the previous lesson, you learned how to recover a database to the latest point in time possible. However, there are occasions when you may need to recover the database to an earlier point in time. You have also learned that you can stop the restore process after any of the backups are restored and initiate the recovery of the database. While stopping the restore process after restoring an entire backup file provides a coarse level of control over the recovery point, SQL Server provides additional options that allow for more fine-grained control. In this lesson, you will learn about how point-in-time recovery works and how to use the options that it provides.
Lesson Objectives After completing this lesson, you will be able to:
Describe point-in-time recovery.
Use the STOPAT restore option.
Use the STOPATMARK restore option.
Perform a point-in-time recovery in SQL Server Management Studio.
Overview of Point-in-Time Recovery
You will mostly want to recover a database to the most recent point in time, though sometimes you may want to restore it to an earlier stage. SQL Server enables you to restore a database to a specified point in time and then recover it. You can either restore it to an exact time by using a datetime value or you can recover to a named transaction in the transaction log.
For either of these options to work, the database needs to use the full recovery model. SQL Server can only stop at points in the transaction log chain when the database is in full recovery model. If a database changes from the full recovery model to the bulk-logged recovery model to process bulk transactions—and is then changed back to the full recovery model—the recovery point cannot be in the time that the database was using the bulk-logged recovery model. If you attempt to specify a recovery point during which the database was using the bulk-logged recovery model, the restore will fail and an error will be returned.

Note: If a user error causes the inadvertent deletion of some data, you may not be aware of when the error actually occurred. Therefore, you will not know which log file contains the deletion and the point at which to recover the database. You can use the WITH STANDBY option on each log file restore and inspect the state of the database after each restore operation, to determine when the error occurred and when to recover the database.
STOPAT Option
You use the STOPAT option to specify a recovery point that is based on a datetime value. You might not know in advance which transaction log backup file contains transactions from the time where the recovery needs to occur. Therefore, the syntax of the RESTORE LOG command enables you to specify the RECOVERY option for each log restore command in the sequence.

In this example, the RECOVERY option is specified for each transaction log restore, but the actual recovery process will not take place until the time specified in the STOPAT option is located in the transaction log.

Using the STOPAT Option
RESTORE DATABASE database_name FROM full_backup WITH NORECOVERY;
RESTORE DATABASE database_name FROM differential_backup WITH NORECOVERY;
RESTORE LOG database_name FROM first_log_backup WITH STOPAT = time, RECOVERY;
… (additional log backups could be restored here)
RESTORE LOG database_name FROM final_log_backup WITH STOPAT = time, RECOVERY;
The behavior of the STOPAT and RECOVERY options is as follows:
If the specified time is earlier than the first time in the transaction log backup, the restore command fails and returns an error.
If the specified time is contained within the period covered by the transaction log backup, the restore command recovers the database at that time.
If the specified time is later than the last time contained in the transaction log backup, the restore command restores the logs, sends a warning message, and the database is not recovered, so that additional transaction log backups can be applied.
This behavior ensures that the database is recovered up to the requested point, even when STOPAT and RECOVERY are both specified with every restore.
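The following sketch shows a concrete STOPAT restore; the datetime value, backup set number, and path are placeholders.

-- Sketch: stop the restore at a specific point in time (placeholder values)
RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH FILE = 4, STOPAT = '2014-05-01 12:00:00', RECOVERY;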
STOPATMARK Option
If you require more precise control over the recovery point, you can use the STOPATMARK option of the RESTORE Transact-SQL statement. If you know in advance that you might need to recover to the point of a specific operation, you can place a mark in the transaction log to record the precise location.

This example starts a transaction with the name and transaction mark of UpdPrc and a description of Start of nightly update process:

Marking a Transaction
BEGIN TRAN UpdPrc WITH MARK 'Start of nightly update process';
If you do not know the name of a transaction that was marked, you can query the dbo.logmarkhistory table in the msdb database.
The STOPATMARK option is similar to the STOPAT option for the RESTORE command. SQL Server will stop at the named transaction mark and include the named transaction in the redo phase. If you wish to exclude the transaction (that is, restore everything up to the beginning of the named transaction), you can use the STOPBEFOREMARK option instead. If the transaction mark is not found in the transaction log backup that is being restored, the restore completes and the database is not recovered, so that other transaction log backups can be restored.

The main use for the STOPATMARK feature is to restore an entire set of databases to a mutually consistent state, at some earlier point in time. If you need to perform a backup of multiple databases, so that they can all be recovered to a consistent point, consider marking all the transaction logs before commencing the backups.

Note: You cannot use the stop at mark functionality in SSMS; it is only available by using the Transact-SQL statement.
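The following sketch recovers a database up to and including the marked transaction created in the earlier example; the backup path is a placeholder.

-- Sketch: stop at the named transaction mark (placeholder path)
RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH STOPATMARK = 'UpdPrc', RECOVERY;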
Performing a Point-in-Time Recovery by Using SQL Server Management Studio
SQL Server Management Studio provides a graphical user interface (GUI) that makes it easy to restore a database to a specific point in time. To perform a point-in-time recovery by using SSMS, follow the usual steps to open the Restore Database dialog box. On the General page, click Timeline to open the Backup Timeline dialog box, where you can configure a specific date and time where the restore should stop.
Demonstration: Performing a Point-in-Time Recovery
In this demonstration you will see how to:
Perform a point-in-time recovery.
Demonstration Steps
Perform a Point-In-Time Recovery
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows authentication.
3. In SQL Server Management Studio, open the Point-in-Time Restore.sql script file in the D:\Demofiles\Mod05 folder.
4. Select and execute the code under the comment Create a database and back it up. This creates a database with a single table, and performs a full backup.
5. Select and execute the code under the comment enter some data. This inserts a record into the Customers table.
6. Select and execute the code under the comment get the current time. This displays the current date and time. Make a note of the current time.
7. Wait until a minute has passed, and then select and execute the code under the comment get the current time again to verify that it is now at least a minute since you noted the time.
8. Select and execute the code under the comment enter some more data. This inserts a second record into the Customers table.
9. Select and execute the code under the comment backup the transaction log. This performs a transaction log backup of the database.
10. Close the query window.
11. In Object Explorer, expand Databases and verify that BackupDemo is listed (if not, right-click the Databases folder and click Refresh). Then right-click the BackupDemo database, point to Tasks, point to Restore, and click Database.
12. In the Restore Database – BackupDemo dialog box, click Timeline.
13. In the Backup Timeline: BackupDemo dialog box, select Specific date and time and set the Time value to the time you noted earlier (after the first row of data was inserted). Then click OK.
14. In the Restore Database – BackupDemo dialog box, click OK. When notified that the database has been restored successfully, click OK.
15. In Object Explorer, expand the BackupDemo database and its Tables folder. Then right-click dbo.Customers and click Select Top 1000 Rows. When the results are displayed, verify that the database was restored to the point in time after the first row of data was inserted, but before the second row was inserted.
16. Close SQL Server Management Studio without saving any files.
Lab: Restoring SQL Server Databases

Scenario
You are a DBA with responsibility for managing the HumanResources, InternetSales, and AWDataWarehouse databases. You have backed up the databases according to their individual backup strategies, and now you must recover them in the event of a failure.
Objectives
After completing this lab, you will be able to:
Restore a database from a full backup.
Restore a database from full, differential, and transaction log backups.
Perform a piecemeal restore of a large database with read-only filegroups.
Estimated Time: 60 minutes
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
Exercise 1: Restoring a Database Backup

Scenario
The HumanResources database has failed to come online, and you must determine the problem and recover it to its last backed-up state.

The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. Determine the Cause of the Failure
3. Restore the HumanResources Database
Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Run Setup.cmd in the D:\Labfiles\Lab05\Starter folder as Administrator.
Task 2: Determine the Cause of the Failure
1. Use SQL Server Management Studio to view the status of the HumanResources database on the MIA-SQL instance of SQL Server.
2. Use the following Transact-SQL query to try to bring the database online:

ALTER DATABASE HumanResources SET ONLINE;

3. Review the error message, and then check the contents of the M:\Data folder to determine if the HumanResources.mdf file is present. If not, the database cannot be brought online because the primary data file is lost.
Task 3: Restore the HumanResources Database
1. Restore the HumanResources database from the most recent full backup.
SQL Server should have retained the backup history for this database, and the Restore Database tool in SQL Server Management Studio should automatically select the second backup set in the R:\Backups\HumanResources.bak backup media set.
2. Verify that the database has been restored.
Results: After this exercise, you should have restored the HumanResources database.
Exercise 2: Restoring Database, Differential, and Transaction Log Backups

Scenario
The InternetSales database has failed to come online, and you must determine the problem and recover it to the most recent transaction possible.

The main tasks for this exercise are as follows:
1. Determine the Cause of the Failure
2. Perform a Tail-Log Backup
3. Restore the InternetSales Database
Task 1: Determine the Cause of the Failure
1. Use SQL Server Management Studio to view the status of the InternetSales database on the MIA-SQL instance of SQL Server.
2. Use the following Transact-SQL query to try to bring the database online:

ALTER DATABASE InternetSales SET ONLINE;

3. Review the error message, and then check the contents of the M:\Data folder to verify that the InternetSales.mdf file is present. This file has become corrupt, and has rendered the database unusable.
Task 2: Perform a Tail-Log Backup
1. Verify that the InternetSales_log.ldf transaction log file is present in the L:\Logs folder.
2. Use the following Transact-SQL code to back up the tail of the transaction log:

USE master;
BACKUP LOG InternetSales
TO DISK = 'R:\Backups\IS-TailLog.bak'
WITH NO_TRUNCATE;
Task 3: Restore the InternetSales Database
1. Restore the InternetSales database from the planned backups in R:\Backups\InternetSales.bak and the tail-log backup in R:\Backups\IS-TailLog.bak.
In this case, the backup history for the database has been lost, so you must specify the backup media sets for the existing planned backups as well as the tail-log backup you just took. The planned backups should be restored using the NORECOVERY option, and then the tail-log backup should be restored using the RECOVERY option.
2. Verify that the database has been restored.
Results: After this exercise, you should have restored the InternetSales database.
Exercise 3: Performing a Piecemeal Restore

Scenario
The AWDataWarehouse database has been accidentally dropped, and you must recover it as quickly as possible. It is acceptable to have the database partially recovered so that recent data can be accessed before archive data becomes available.

The main tasks for this exercise are as follows:
1. Begin a Piecemeal Restore
2. Restore Read/Write Filegroups and Bring the Database Online
3. Restore the Read-Only Filegroup
Task 1: Begin a Piecemeal Restore
1. Verify that the AWDataWarehouse database does not exist on MIA-SQL.
2. Use the following Transact-SQL code to initiate a piecemeal restore:

USE master;
RESTORE DATABASE AWDataWarehouse
FILEGROUP = 'Current'
FROM DISK = 'R:\Backups\AWDataWarehouse_Read-Write.bak'
WITH PARTIAL, FILE = 1, NORECOVERY;

This code restores the primary filegroup and the Current filegroup from a full database backup on the AWDataWarehouse_Read-Write.bak media set. The PARTIAL option indicates that only the primary and named read-write filegroups should be restored, and the NORECOVERY option leaves the database in a restoring state, ready for subsequent restore operations of the read-write filegroup data.
3. Refresh the Databases folder in Object Explorer to verify that the database is in a restoring state.
Task 2: Restore Read/Write Filegroups and Bring the Database Online
1. Use the following Transact-SQL code to restore a differential backup from the second backup set on the AWDataWarehouse_Read-Write.bak media set:

RESTORE DATABASE AWDataWarehouse
FROM DISK = 'R:\Backups\AWDataWarehouse_Read-Write.bak'
WITH FILE = 2, RECOVERY;

2. Verify that the database is now shown as online in Object Explorer, and that you can query the dbo.FactInternetSales table.
3. Verify that you cannot query the dbo.FactInternetSalesArchive table, because it is stored in a filegroup that has not yet been brought online.
Task 3: Restore the Read-Only Filegroup
1. Use the following Transact-SQL code to restore the read-only Archive filegroup from the AWDataWarehouse_Read-Only.bak media set:

RESTORE DATABASE AWDataWarehouse
FILEGROUP = 'Archive'
FROM DISK = 'R:\Backups\AWDataWarehouse_Read-Only.bak'
WITH RECOVERY;

2. Verify that you can now query the dbo.FactInternetSalesArchive table.
Results: After this exercise, you should have restored the AWDataWarehouse database.
Module Review and Takeaways
In this module, you learned how to restore databases from backups, including full database backups, differential database backups, and transaction log backups. You also learned how to restore individual filegroups and how to perform a piecemeal restore for a database that includes read/write and read-only filegroups. When planning a database recovery solution, consider the following best practices.
Don’t forget to back up the tail of the log before starting a restore sequence.
If available, use differential restore to reduce the time taken by the restore process.
Use file-level restores to speed up recovery when only some of the database files are corrupt.
Perform regular backups of the master, msdb, and model system databases.
Create a disaster recovery plan for your SQL Server and test restoring databases regularly.
Review Question(s)

Question: What are the three phases of the restore process?
Module 6
Importing and Exporting Data

Contents:
Module Overview 6-1
Lesson 1: Introduction to Transferring Data 6-2
Lesson 2: Importing and Exporting Data 6-9
Lesson 3: Copying or Moving a Database 6-17
Lab: Importing and Exporting Data 6-22
Module Review and Takeaways 6-26
Module Overview
While a great deal of the data residing in a Microsoft® SQL Server® system is entered directly by users who are running application programs, there is often a need to move data between SQL Server and other locations. SQL Server provides a set of tools that you can use to transfer data in and out. Some of these tools, such as the bcp utility and SQL Server Integration Services (SSIS), are external to the database engine. Other tools, such as the BULK INSERT statement and the OPENROWSET function, are implemented in the database engine. SQL Server also enables you to create data-tier applications (DACs), which package all the tables, views, and instance-level objects associated with a user database into a single unit of deployment.
In this module, you will explore these tools and techniques so that you can import and export data to and from SQL Server.
Objectives
After completing this module, you will be able to:
Describe tools and techniques for transferring data.
Import and export data.
Copy or move a database.
Lesson 1
Introduction to Transferring Data
The first step in learning to transfer data in and out of SQL Server is to become familiar with the processes involved, and with the tools that SQL Server provides to implement data transfer.
When large amounts of data need to be inserted into SQL Server tables, the default settings for constraints, triggers, and indexes are not likely to provide the best performance possible. You may achieve improved performance by controlling when the checks that are made by constraints are carried out and when the index pages for a table are updated.
Lesson Objectives
After completing this lesson, you will be able to:
Describe core data transfer concepts.
Describe the tools that SQL Server provides for data transfer.
Improve the performance of data transfers.
Disable and rebuild indexes.
Disable and re-enable constraints.
Overview of Data Transfer
Not all data can be entered row-by-row by database users. Often, data needs to be imported from external data sources, such as other database servers or files. Users also frequently request that data from database tables be exported to text files. In earlier modules, you have seen how collations can cause issues when misconfigured; correcting the collation of a database also often requires the export and re-import of its data.
Data Transfer Steps
Although not all data transfer requirements are identical, there is a standard process that most data transfer tasks follow. The main steps are:
Extracting data from a given data source.
Transforming the data in some way to make it suitable for the target system.
Loading the data into the target system.
Together, these three steps are commonly referred to as an Extract, Transform, Load (ETL) process, which can be implemented by the use of ETL tools. Note: In some situations, an Extract, Load, Transform (ELT) process might be more appropriate than an ETL process. For example, you may decide to perform data transformations after the data has been loaded into the database engine rather than before.
Extracting Data
While there are other options, extracting data typically involves executing queries on a source system to retrieve the data, or opening and reading source files. During the extraction process, there are two common aims:
To avoid excessive impact on the source system. For example, do not read entire tables of data when you only need to read selected rows or columns. Also, do not continually re-read the same data, and avoid the execution of statements that block users of the source system in any way.
To ensure the consistency of the data extraction. For example, do not include one row from the source system more than once in the output of the extraction.
Transforming Data
The transformation phase of an ETL process will generally involve several steps, such as the following:
Data might need to be cleansed. For example, you might need to remove erroneous data or provide default values for missing columns.
Lookups might need to be performed. For example, the input data might include the name of a customer, but the database might need an ID for the customer.
Data might need to be aggregated. For example, the input data might include every transaction that occurred on a given day, but the database might need only daily summary values.
Data might need to be de-aggregated. This is often referred to as data allocation. For example, the input data might include quarterly budgets, but the database might need daily budgets.
In addition to these common operations, data might need to be restructured in some way, for example by pivoting the data so that columns become rows, concatenating multiple source columns into a single column, or splitting a single source column into multiple columns.
Loading Data
After data is transformed into an appropriate format, you can load it into the target system. Instead of performing row-by-row insert operations for the data, you can use special options for loading data in bulk. Additionally, you can make temporary configuration changes to improve the performance of the load operation.
Available Tools for Data Transfer
SQL Server provides a set of tools for performing data transfer tasks. It is important to understand which tool is best suited to each type of scenario.
SQL Server Import and Export Wizard
SQL Server provides the Import and Export Wizard, which is a simple method of creating SQL Server Integration Services (SSIS) packages. SSIS is an ETL tool that ships with SQL Server. It can connect to a wide variety of data sources and destinations, and can perform complex transformations on data. SSIS provides many tasks and transformations out of the box, and can also be extended by the use of custom .NET Framework components and scripts.
Bulk Copy Program (bcp)
You can use the Bulk Copy Program (bcp) to import large numbers of new rows from an operating system data file into a SQL Server table, or to export data from a SQL Server table to an operating system file. Although you can use the bcp utility with a Transact-SQL queryout option, which specifies the rows to be exported, the normal use of bcp does not require any knowledge of Transact-SQL.
BULK INSERT
You can use the BULK INSERT Transact-SQL statement to import data directly from an operating system data file into a database table. The BULK INSERT statement differs from bcp in a number of ways. First, you execute the BULK INSERT statement from within Transact-SQL, whereas the bcp utility is a command-line utility. Also, while the bcp utility can be used for both import and export, the BULK INSERT statement can only be used for data import.
OPENROWSET (BULK)
OPENROWSET is a table-valued function that you can use to connect to and retrieve data from OLE-DB data sources. Full details of how to connect to the data source need to be provided as parameters to the OPENROWSET function. You can use OPENROWSET to connect to other types of database engine. SQL Server offers a special OLE-DB provider called BULK that you can use with the OPENROWSET function. The BULK provider enables the import of entire documents from the file system.
Improving the Performance of Data Transfers
If you enable constraints, indexes, and triggers on the tables that are the targets of data transfers, SQL Server checks the data values for every row that you import. This constant checking can substantially slow down SQL Server data transfers.
Disabling Constraints, Indexes, and Triggers
Rather than checking each value during the import process or updating each index for every row, you can improve overall performance by deferring constraint checking and index updating until all the data is loaded, and then performing that work once, at the end of the import process.
For example, consider a FOREIGN KEY constraint that ensures that the relevant customer does exist whenever a customer order is inserted into the database. While you could check this reference for each customer order, it is possible that a customer may have thousands of orders, resulting in thousands of checks. Instead of checking each value as it is inserted, you can check the customer reference as a single lookup after the overall import process completes—to cover all customer orders referring to that customer. Only CHECK and FOREIGN KEY constraints can be disabled. The process for disabling and re-enabling constraints will be discussed later in this lesson.
Similar to the way that avoiding lookups for FOREIGN KEY constraints during data import can improve performance, avoiding constant updating of indexes can have the same effect. In many cases, rebuilding the indexes after the import process is complete is much faster than updating the indexes as the rows are imported. The exception is when the table already contains many more rows than are being imported.

Triggers are commands that are executed when data is modified. It is important to decide whether the processing that the triggers perform would also be better performed in bulk after the import, rather than as each insert occurs.
Control Locking Behavior
By default, SQL Server manages the granularity of the locks it acquires during the execution of commands. SQL Server starts with row-level locking and only tries to escalate when a significant number of rows are locked within a table. Managing large numbers of locks occupies resources that could otherwise be used to minimize query execution time. Because tables that are the target of bulk-import operations are normally accessed only by the process that is importing the data, the advantage of row-level locking is often absent. For this reason, it may be advisable to lock the entire table by using a TABLOCK query hint during the import process, as shown in the following example.
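For example, the following sketch (the staging table name and file path are hypothetical, not part of the course environment) requests a table-level lock for the duration of a bulk load:

Using a TABLOCK Hint During a Bulk Import
BULK INSERT dbo.StagingOrders
FROM 'D:\Imports\neworders.txt'
WITH
(
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n',
    TABLOCK  -- take a single table-level lock instead of many row locks
);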
Use Minimal Logging Whenever Possible
Minimal logging is a special mode of operation that can provide substantial performance improvements, for example in bulk imports. As well as making the operations faster, minimal logging helps avoid excessive log growth during large import operations. Not all commands can use minimal logging. While not an exhaustive list, the items below indicate the types of conditions that must be met in order for minimal logging to apply (a sketch combining some of them follows the list):
The table is not being replicated.
Table locking is specified (using TABLOCK).
If the table has no clustered index but has one or more nonclustered indexes, data pages are always minimally logged. How index pages are logged, however, depends on whether the table is empty.
If the table is empty, index pages are minimally logged.
If the table is non-empty, index pages are fully logged.
If the table has a clustered index and is empty, both data and index pages are minimally logged.
If a table has a clustered index and is non-empty, data pages and index pages are both fully logged, regardless of the recovery model.
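As a sketch of how these conditions can be brought together (the database and table names here are illustrative), a large import might temporarily switch the database to the bulk-logged recovery model and specify TABLOCK:

Combining Recovery Model and Locking Settings for a Bulk Import
-- Switch to bulk-logged recovery for the duration of the load
ALTER DATABASE SalesStaging SET RECOVERY BULK_LOGGED;

BULK INSERT dbo.StagingOrders
FROM 'D:\Imports\neworders.txt'
WITH (FIELDTERMINATOR = '|', TABLOCK);

-- Return to full recovery, then take a log backup so that the log
-- backup chain continues to support point-in-time restores
ALTER DATABASE SalesStaging SET RECOVERY FULL;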
Disabling and Rebuilding Indexes
Prior to SQL Server 2005, you had to drop indexes to prevent them from being updated when the data in the table changed. The problem with dropping an index is that, when you need to put it back in place by recreating it, you need to know exactly how it was configured.

Disabling Indexes
In SQL Server 2005 and later, you can disable an index. Rather than dropping the index details from the database entirely, this option leaves the metadata about the index in place and just stops the index from being updated. Queries that are executed by users will not use disabled indexes. You can disable an index by using the graphical interface in SQL Server Management Studio (SSMS) or by using the ALTER INDEX Transact-SQL statement.

The following code example disables an index named idx_emailaddress on the dbo.Customer table:

Disabling an Index
ALTER INDEX idx_emailaddress ON dbo.Customer DISABLE;

You can also disable all of the indexes on a table, as shown in the following code example:

Disabling All Indexes on a Table
ALTER INDEX ALL ON dbo.Customer DISABLE;
Note: A clustered index defines how a table is structured. If a clustered index is disabled, the table becomes unusable until the index is rebuilt.
The major advantage of disabling an index instead of dropping it is that you can put the index back into operation by using a rebuild operation. When you rebuild an index, you do not need to know the details of how it is configured. This makes it much easier to create administrative scripts that stop indexes from being updated while large import or update operations take place, and that put the indexes back into operation after those operations have completed.
Rebuilding Indexes
After data has been imported, you can rebuild the indexes on a table by using the graphical tools in SSMS, the ALTER INDEX Transact-SQL statement, or the DBCC DBREINDEX command.

The following code example shows how to rebuild the idx_emailaddress index on the dbo.Customer table:

Rebuilding an Index
ALTER INDEX idx_emailaddress ON dbo.Customer REBUILD;

You can also use the ALL keyword with the ALTER INDEX statement to rebuild all indexes on a specified table, just as when disabling indexes.
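For example, the following statement rebuilds all of the indexes on the dbo.Customer table used in the previous examples:

Rebuilding All Indexes on a Table
ALTER INDEX ALL ON dbo.Customer REBUILD;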
The following code example shows how to use the DBCC DBREINDEX command to rebuild the idx_emailaddress index on the dbo.Customer table:

Rebuilding an Index with DBCC DBREINDEX
DBCC DBREINDEX ("dbo.Customer", "idx_EmailAddress");

Note: DBCC DBREINDEX is a deprecated feature; for new scripts, ALTER INDEX with the REBUILD option is the recommended alternative.
You can specify an empty string for the second parameter of DBCC DBREINDEX to rebuild all indexes on a table.

If a large volume of data has been loaded, it may be more efficient to recreate the index, dropping the existing one in the process. To recreate an index, replacing the existing one, you can use the CREATE INDEX statement with the DROP_EXISTING option, as shown in the following example:

Recreating an Index
CREATE INDEX idx_emailaddress ON dbo.Customer(EmailAddress) WITH (DROP_EXISTING = ON);
Disabling and Enabling Constraints
You can use constraints to define how SQL Server enforces data integrity rules. Two important kinds of constraint related to data loading are primary key and unique constraints. Primary key constraints define the column or columns that uniquely identify each row in a table, and unique constraints ensure that a column or set of columns does not contain duplicate values. SQL Server creates indexes to help it enforce these constraints.
Disabling Primary Key or Unique Constraints
To disable a primary key or unique constraint, you first need to disable the index that is associated with the constraint. This is typically only useful with nonclustered primary key constraints. When you re-enable the constraint, the associated indexes are automatically rebuilt. If duplicate values are found during the rebuild, re-enabling the constraint will fail. For this reason, if you disable these constraints while importing data, you need to be sure that the data being imported will not violate the rules that the constraints enforce.

Note: If a table has a primary key enforced with a clustered index, disabling the index associated with the constraint prevents access to any data in the table.
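As a sketch, assuming a nonclustered primary key enforced by an index named PK_Customer (a hypothetical name), the sequence looks like this:

Disabling and Re-Enabling a Primary Key Constraint Through Its Index
-- Disabling the index that enforces the constraint disables the constraint
ALTER INDEX PK_Customer ON dbo.Customer DISABLE;

-- ...perform the bulk import here...

-- Rebuilding the index re-enables the constraint; the rebuild fails
-- if the imported data contains duplicate key values
ALTER INDEX PK_Customer ON dbo.Customer REBUILD;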
Foreign Key and Check Constraints
You use foreign key constraints to make sure that entities in one table that are referred to by entities in another actually exist. For example, a supplier must exist before a purchase order can be entered. Foreign key constraints use primary key or unique constraints while checking the references. If you disable the primary key or unique constraint that a foreign key reference points to, the foreign key constraint is automatically disabled. However, when you re-enable the primary key or unique constraint, foreign key references that use these constraints are not automatically re-enabled.
You can use check constraints to limit the values that can be contained in a column, or the relationship between the values in multiple columns of a table. You can disable and enable both foreign key and check constraints by using the CHECK and NOCHECK options of the ALTER TABLE statement.

Disabling and Re-Enabling Foreign Key and Check Constraints
ALTER TABLE Person.Salary NOCHECK CONSTRAINT SalaryCap;
ALTER TABLE Person.Salary CHECK CONSTRAINT SalaryCap;
You can also disable or enable all constraints by replacing the constraint name with the ALL keyword.
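One caution when re-enabling constraints after an import: the CHECK CONSTRAINT option by itself does not validate rows that are already in the table, and the constraints remain marked as untrusted. To make SQL Server verify the existing data, use WITH CHECK, as in this sketch (reusing the table from the earlier example):

Re-Enabling All Constraints with Validation
ALTER TABLE Person.Salary WITH CHECK CHECK CONSTRAINT ALL;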
Lesson 2
Importing and Exporting Data
SQL Server provides a range of tools and techniques for importing and exporting data. In this lesson, you will explore these tools and learn how to use them to import and export data to and from a SQL Server database.
Lesson Objectives
After completing this lesson, you will be able to:
Use the SQL Server Import and Export Wizard.
Use the bcp utility.
Use the BULK INSERT statement.
Use the OPENROWSET function.
The SQL Server Import and Export Wizard
You can use the SQL Server Import and Export Wizard to copy data to and from any data source for which a managed .NET Framework data provider or a native OLE-DB provider is available. Examples include SQL Server, flat files, Microsoft Office Access®, Microsoft Office Excel®, and a wide variety of other database engines.

Note: On a 64-bit computer, SSIS setup installs the 64-bit version of the Import and Export Wizard. However, some data sources, such as Access and Excel, only have 32-bit providers. To use these data sources, you must install the 32-bit version of the Import and Export Wizard.
You can use the wizard to perform the data transfer immediately, and you can also save the SSIS package it generates for execution at a later time.
Running SSIS Packages
SSIS provides two utilities that you can use to run packages:
DTExec utility
You can use DTExec to run SSIS packages from the command line. You need to specify parameters including the server to use, the location of the package, environment variables, and input parameters. The utility reads the command line parameters, loads the package, configures the package options based on the parameters passed, and then runs the package. It returns an exit code signifying the success or failure of the package.
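For example, a minimal sketch (the package path is hypothetical) that runs a file-based package from the command line and reports the outcome based on the exit code:

Running a Package with DTExec
dtexec /f "D:\Packages\ExportCurrency.dtsx"
if %ERRORLEVEL% EQU 0 (echo Package succeeded) else (echo Package failed with code %ERRORLEVEL%)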
DtExecUI utility
The Execute Package Utility (DtExecUI) can run SSIS packages from SQL Server Management Studio (SSMS) or from the command prompt and is a GUI for the DTExec command prompt utility. The GUI simplifies the process of passing parameters to the utility and receiving exit codes.
You can also run SSIS packages from SQL Server Agent jobs. This enables you to automate and schedule the execution, either independently or as part of a larger job. You can configure the parameters for the package by using the New Job Step dialog box. Note: SQL Server Agent jobs are described in detail later in this course.
The Import and Export Wizard is based on SQL Server Integration Services (SSIS), which provides a comprehensive platform for building ETL solutions. The wizard itself provides minimal transformation capabilities: except for setting the name, the data type, and the data type properties of columns in new destination tables and files, it supports no column-level transformations. If you need to develop a more complex ETL solution, you should use the SQL Server Data Tools for BI (SSDT-BI) add-in for Visual Studio to create an SSIS project that consists of one or more SSIS packages.

Note: To learn more about using SSDT-BI to develop SSIS projects, attend course 20463C: Implementing a Data Warehouse with Microsoft SQL Server 2014.
Demonstration: Using the Import and Export Wizard
In this demonstration, you will see how to:
Use the Import and Export Wizard to export data.

Demonstration Steps
Use the Import and Export Wizard to Export Data
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod06 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows authentication.
4. In Object Explorer, expand Databases. Then right-click the AdventureWorks database, point to Tasks, and click Export Data.
5. On the Welcome to SQL Server Import and Export Wizard page, click Next.
6. On the Choose a Data Source page, in the Data source drop-down list, select SQL Server Native Client 11.0. Then ensure that the MIA-SQL server is selected, that Use Windows Authentication is selected, and that the AdventureWorks database is selected; and click Next.
7. On the Choose a Destination page, in the Destination drop-down list, select Flat File Destination. Then in the File name box type D:\Demofiles\Mod06\Currency.csv, clear the Column names in the first data row checkbox, and click Next.
8. On the Specify Table Copy or Query page, ensure that Copy data from one or more tables or views is selected, and click Next.
9. On the Configure Flat File Destination page, in the Source table or view list, select [Sales].[Currency]. Then ensure that the Row delimiter is {CR}{LF} and the Column delimiter is Comma {,}, and click Next.
10. On the Save and Run Package page, ensure that Run immediately is selected, and click Next.
11. On the Complete the Wizard page, click Finish. Then, when the execution is successful, click Close.
12. Start Excel, open the Currency.csv file in the D:\Demofiles\Mod06 folder, and view the data that has been exported. Then close Excel without saving the file.
The bcp Utility
You can use the bcp command-line utility to bulk copy data between an instance of Microsoft SQL Server and a data file in a user-specified format. You can use it to easily import large numbers of new rows into SQL Server tables, or to export data from tables into data files.
bcp Syntax
The syntax for the bcp utility is highly versatile and includes a large number of options. The general form of a bcp command specifies:
A table or view in a SQL Server database.
A direction (in when importing data into SQL Server, out when exporting data from SQL Server).
A local file name for the source (when importing) or destination (when exporting).
You can also use the queryout direction to specify that data is to be extracted from the database based on a Transact-SQL query. Additionally, the bcp utility supports the following commonly used parameters. Note that these parameters are case-sensitive:
-S server\instance: Specifies the SQL Server instance. The default is (local).
-d database_name: Specifies the database containing the table or view (you can also specify a fully qualified table or view name that includes the database and schema, for example, AdventureWorks.Sales.Currency).
-T: Specifies that a trusted connection should be used to connect using Windows authentication.
-U user_name: Specifies a user name for SQL Server authentication.
-P password: Specifies a password for SQL Server authentication.
-c: Specifies that the data file stores data in character format.
-w: Specifies that the data file stores data in Unicode character format.
-n: Specifies that the data file stores data in SQL Server native format.
-f format_file: Specifies a format file that defines the schema for the data.
-t delimiter: Specifies a field terminator for data in character format. The default is a tab.
-r delimiter: Specifies a row terminator for data in character format. The default is a new line.

Note: For a full list of parameters and syntax options, you can enter the command bcp -?
The following code example connects to the MIA-SQL SQL Server instance using Windows authentication, and exports the contents of the Sales.Currency table in the AdventureWorks database to a text file named Currency.csv, in which the data is saved in comma-delimited character format with a new line for each row:

Using bcp to Export Data
bcp AdventureWorks.Sales.Currency out D:\Currency.csv -S MIA-SQL -T -c -t , -r \n
Using Format Files
While you can export and import data easily by using delimited character data or SQL Server native format, there are some scenarios where you may need to use specific data formats for the data you are importing or exporting. If you run the bcp command without specifying any format information, the utility will then prompt you to specify the data type, prefix length, and delimiter for each field in the specified table or view and give you the option to save the specified schema as a format file that you can reuse in a later bcp operation. Format files define the schema of the data for your import/export operations, and can be defined as text or XML files.
To preemptively create a format file, use the format nul direction and specify the name of the format file you want to create. You can then interactively specify the data type, prefix length, and delimiter for each field in the specified table or view, and save the resulting schema in the format file. The default format file type is text, but you can use the -x parameter to create an XML format file. If you want to create a format file for character data with specific field and row terminators, you can specify them with the -c, -t, and -r parameters.

The following example shows how to use the bcp utility to create an XML-based format file named CurrencyFmt.xml based on the AdventureWorks.Sales.Currency table:

Creating a Format File
bcp AdventureWorks.Sales.Currency format nul -S MIA-SQL -T -c -t , -r \n -x -f D:\CurrencyFmt.xml
To use a format file when importing or exporting data, use the -f parameter.
The following example shows how to import the contents of Currency.csv into the Finance.dbo.Currency table. The in direction specifies the file to read, and the -f parameter specifies the format file to use:

Importing Data with a Format File
bcp Finance.dbo.Currency in D:\Currency.csv -S MIA-SQL -T -f D:\CurrencyFmt.xml
Demonstration: Using the bcp Utility
In this demonstration, you will see how to:
Use bcp to create a format file.
Use bcp to export data.

Demonstration Steps
Use bcp to Create a Format File
1. Ensure that you have completed the previous demonstration in this module.
2. Open a command prompt and type the following command to view the bcp syntax help:

bcp -?

3. In the command prompt window, enter the following command to create a format file:

bcp AdventureWorks.Sales.SalesTaxRate format nul -S MIA-SQL -T -c -t , -r \n -x -f D:\Demofiles\Mod06\TaxRateFmt.xml
4. Start Notepad and open TaxRateFmt.xml in the D:\Demofiles\Mod06 folder. View the XML format file, and then close Notepad.

Use bcp to Export Data
1. In the command prompt window, enter the following command to export data from SQL Server:

bcp AdventureWorks.Sales.SalesTaxRate out D:\Demofiles\Mod06\SalesTaxRate.csv -S MIA-SQL -T -f D:\Demofiles\Mod06\TaxRateFmt.xml

2. Close the command prompt.
3. Start Excel, open the SalesTaxRate.csv file in the D:\Demofiles\Mod06 folder, and view the data that has been exported. Then close Excel without saving the file.
The BULK INSERT Statement
The BULK INSERT statement loads data from a data file into a table. This functionality is similar to that provided by the in direction of the bcp command; however, the data file is read by the SQL Server process, not by an external utility. The BULK INSERT statement executes within a Transact-SQL batch. Because the data files are opened by a SQL Server process, data is not copied between client and SQL Server processes. By comparison, the bcp utility runs in a separate process, which produces a higher load on the server when run on the same system.

A key consideration for using the BULK INSERT statement is that file paths to source data must be accessible from the server where the SQL Server instance is running, and must use the correct drive letters for volumes as they are defined on the server. For example, when running the BULK INSERT statement in SQL Server Management Studio or sqlcmd from a client computer, the path C:\data\file.txt references a file on the C: volume of the server, not the client.
Constraints and Triggers
The BULK INSERT statement offers the CHECK_CONSTRAINTS and FIRE_TRIGGERS options that you can use to tell SQL Server to check constraints and fire triggers. Unlike the bcp utility, the default behavior of the BULK INSERT statement is not to apply CHECK and FOREIGN KEY constraints, and not to fire triggers on the target table during import operations. Also unlike bcp, you can execute the BULK INSERT statement from within a user-defined transaction, which gives you the ability to group BULK INSERT with other operations in a single transaction. Take care, however, to ensure that the size of the data batches you import within a single transaction is not excessive, or significant log file growth might occur, even when the database uses the simple recovery model.
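As a sketch (using the same illustrative table and file as the example that follows), these options are specified in the WITH clause of the statement:

Applying Constraints and Firing Triggers During a BULK INSERT
BULK INSERT AdventureWorks.Sales.OrderDetail
FROM 'F:\orders\neworders.txt'
WITH
(
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n',
    CHECK_CONSTRAINTS,  -- apply CHECK and FOREIGN KEY constraints during the load
    FIRE_TRIGGERS       -- execute INSERT triggers defined on the target table
);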
In the following example, new orders are inserted into the Sales.OrderDetail table from a text file on the file system:

Using the BULK INSERT Statement
BULK INSERT AdventureWorks.Sales.OrderDetail
FROM 'F:\orders\neworders.txt'
WITH
(
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n'
);
GO
Demonstration: Using the BULK INSERT Statement
In this demonstration, you will see how to:
Use the BULK INSERT statement to import data.

Demonstration Steps
Use the BULK INSERT Statement to Import Data
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Management Studio, in Object Explorer, under Databases, expand Finance. Then expand the Tables folder, right-click dbo.Currency, and click Select Top 1000 Rows.
3. View the query results, and verify that the dbo.Currency table is currently empty.
4. Click New Query, and in the new query pane, enter the following Transact-SQL code:

BULK INSERT Finance.dbo.Currency
FROM 'D:\Demofiles\Mod06\Currency.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);

5. Click Execute and note the number of rows affected.
6. Switch to the query pane that retrieves the top 1000 rows from the dbo.Currency table and click Execute to re-run the SELECT query. Note that the table is now populated with the same number of rows as you noted in the previous step.
The OPENROWSET Function
You can use the OPENROWSET function to access data through an OLE-DB provider, enabling you to use Transact-SQL queries to retrieve data from a wide range of external data sources. SQL Server includes a special OLE-DB provider called BULK, which enables you to access data in files. The same format files that are used with bcp and BULK INSERT can also be used with this provider.

The following example code inserts the contents of the Accounts.csv text file into the dbo.Accounts table, using the AccountsFmt.xml format file to determine the schema of the rows in the text file:

Using OPENROWSET to Insert Rows from a Text File
INSERT INTO dbo.Accounts
SELECT * FROM OPENROWSET (BULK 'D:\Accounts.csv', FORMATFILE = 'D:\AccountsFmt.xml') AS rows;

Note: The results returned by the OPENROWSET function must have a correlation name specified in an AS clause.
Similarly to the BULK INSERT statement, file paths used with the OPENROWSET function refer to volumes that are defined on the server.
Two key advantages of OPENROWSET compared to bcp are that it can be used in a query with a WHERE clause (to filter the rows that are loaded), and that it can be used in a SELECT statement that is not necessarily associated with an INSERT statement.
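For example, the following sketch reuses the Accounts.csv illustration above to retrieve only some of the rows without inserting anything (the Balance column is hypothetical and would be defined in the format file):

Filtering Rows with OPENROWSET
SELECT rows.*
FROM OPENROWSET (BULK 'D:\Accounts.csv', FORMATFILE = 'D:\AccountsFmt.xml') AS rows
WHERE rows.Balance > 0;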
Inserting BLOB Data with OPENROWSET
In addition to the import of data rows, the BULK provider offers three special options that enable it to import the entire file contents as a binary large object (BLOB) into a single column of a table. These special options are:
SINGLE_CLOB. This option reads an entire single-byte character-based file as a single value of data type varchar(max).
SINGLE_NCLOB. This option reads an entire double-byte character-based file as a single value of data type nvarchar(max).
SINGLE_BLOB. This option reads an entire binary file as a single value of data type varbinary(max).
In the following example, the data in the SignedAccounts.pdf file is inserted into the Document column of the dbo.AccountsDocuments table:

Inserting Data by Using the OPENROWSET Function
INSERT INTO dbo.AccountsDocuments(FiscalYear, Document)
SELECT 2013 AS FiscalYear, *
FROM OPENROWSET(BULK 'D:\SignedAccounts.pdf', SINGLE_BLOB) AS Document;
Note: To use OPENROWSET with OLE-DB providers other than BULK, the ad hoc distributed queries system configuration option must be enabled and the DisallowAdhocAccess registry entry for the OLE-DB provider must be explicitly set to 0. This registry key is typically located at HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSSQLServer\Providers\MSDASQL. When these options are not set, the default behavior does not allow for the ad hoc access that is required by the OPENROWSET function when working with external OLE-DB providers.
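A minimal sketch of enabling the server-wide option with sp_configure (the per-provider registry change described above must still be made separately):

Enabling Ad Hoc Distributed Queries
-- 'Ad Hoc Distributed Queries' is an advanced option, so expose it first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;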
Demonstration: Using the OPENROWSET Function
In this demonstration, you will see how to:
Use the OPENROWSET function to import data.

Demonstration Steps
Use the OPENROWSET Function to Import Data
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Management Studio, in Object Explorer, right-click the dbo.SalesTaxRate table in the Finance database, and click Select Top 1000 Rows.
3. View the query results, and verify that the dbo.SalesTaxRate table is currently empty.
4. Click New Query, and in the new query pane, enter the following Transact-SQL code:

INSERT INTO Finance.dbo.SalesTaxRate
SELECT * FROM OPENROWSET (BULK 'D:\Demofiles\Mod06\SalesTaxRate.csv', FORMATFILE = 'D:\Demofiles\Mod06\TaxRateFmt.xml') AS rows;

5. Click Execute and note the number of rows affected.
6. Switch to the query pane that retrieves the top 1000 rows from the dbo.SalesTaxRate table and click Execute to re-run the SELECT query. Note that the table is now populated with the same number of rows as you noted in the previous step.
Lesson 3
Copying or Moving a Database
The techniques discussed in this module so far enable you to import and export data to and from individual tables. However, in some scenarios you must copy or move an entire database from one SQL Server instance to another.
Lesson Objectives
After completing this lesson, you will be able to:
Describe options for moving or copying a database.
Use the Copy Database Wizard.
Export and Import Data-Tier Applications.
Options for Copying or Moving Databases
There are several ways to copy or move a database to another instance of SQL Server. These include:
Detach and attach. You can detach a database, move or copy its data and log files as required, and then attach the database on a different SQL Server instance.
Backup and restore. You can back up a database and then restore it on a different SQL Server instance.
The Copy Database Wizard. You can use the copy database wizard to copy a database and its dependencies to a different SQL Server instance.
Data-Tier Applications. You can export a database as a data-tier application, and import it on a different SQL Server instance.
The detach/attach and backup/restore techniques are easy to implement. However, only the database is copied, and the DBA needs to take care of all dependent objects such as logins, certificates, and other server-level objects.
The Copy Database Wizard
The Copy Database Wizard provides an easy-to-use wizard that can move or copy a database with all of its dependent objects, without any additional scripting. It is also possible to schedule the copy operation. The wizard provides two methods for copying or moving the database. It can be configured to use detach and attach, which is the fastest option, but has the downside that the source database must be offline while the detach/copy/attach is occurring. The second method uses the SQL Server Management Objects (SMO) programming library to create the objects and transfer the data. This is slower, but means that the source database can be kept online while copying.
If you select the move option in the wizard, it deletes the source database afterwards. If you select copy, the source database is left intact. With the detach and attach method, the wizard performs the task by detaching the database, moving or copying the data and log files on the file system, and then attaching the database back to an instance of SQL Server.

Note: If the source database is in use when the wizard tries to move or copy it, the operation is not performed. Running the Copy Database Wizard requires sysadmin privileges on both instances, and a network connection must be present.
Demonstration: Using the Copy Database Wizard
In this demonstration, you will see how to:
Use the Copy Database Wizard.

Demonstration Steps
Use the Copy Database Wizard
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Management Studio, in Object Explorer, in the Connect drop-down list, click Database Engine. Then connect to the MIA-SQL\SQL2 instance using Windows authentication.
3. In Object Explorer, under the MIA-SQL\SQL2 instance, expand Databases and verify that the AdventureWorks database is not listed.
4. In Object Explorer, under the MIA-SQL instance, right-click the AdventureWorks database, point to Tasks, and click Copy Database.
5. On the Welcome to the Copy Database Wizard page, click Next.
6. On the Select a Source Server page, ensure that MIA-SQL is selected with the Use Windows Authentication option, and click Next.
7. On the Select a Destination Server page, change the Destination server to MIA-SQL\SQL2 and select the Use Windows Authentication option. Then click Next.
8. On the Select the Transfer Method page, ensure that Use the detach and attach method is selected, and click Next.
9. On the Select Databases page, in the Copy column, ensure that the AdventureWorks database is selected. Then click Next.
10. On the Configure Destination Database (1 of 1) page, note the default settings for the database name and file locations. Then click Next.
11. On the Select Server Objects page, verify that Logins is listed in the Selected related objects list. Then click Next.
12. On the Configure the Package page, note the default package name and logging options. Then click Next.
13. On the Scheduling the Package page, ensure that Run immediately is selected, and click Next.
14. On the Completing the Wizard page, click Finish.
15. Wait for the operation to complete. Then click Close.
16. In Object Explorer, under the MIA-SQL\SQL2 instance, right-click Databases and click Refresh. Then verify that the AdventureWorks database has been copied to this instance.
Data-Tier Applications
Data-tier applications (known as DACs) provide a way to simplify the development, deployment, and management of database applications and their SQL Server instance-level dependencies, and they provide a useful way to package application databases for deployment. Installation and upgrade of DACs is automated. DACs are not designed for large line-of-business applications; the intention is to make it easy to install and upgrade large numbers of simpler applications.

Note: Data-tier applications do not support all SQL Server objects. For example, XML schema collections and SQL CLR-based objects are not supported. For this reason, not all databases are available for extraction to .dacpac files. When SQL Server is unable to perform a registration or extraction, the wizard displays the objects that are not supported.
Creating a Data-Tier Application
DACs provide a similar experience for installing and upgrading database applications, as occurs with Windows applications. A developer creates a DAC by using Visual Studio, and includes all required objects, and defines policies that limit how the application can be installed. For example, a deployment policy could indicate that an application can only be installed on SQL Server versions 10.5 and above. When the DAC project is built, the output is a .dacpac file containing the database schema and its dependent objects, which can be delivered to the database administrator for deployment. A single .dacpac file can be used to both install and upgrade an application and is portable across different environments such as development, test, staging, and production.
Extracting or Exporting a Data-Tier Application
Database administrators (DBAs) can extract a data-tier application from an existing database. This enables the DBA to manage the database and its dependencies as a unit, and also makes it possible to extract a .dacpac package containing the database schema and its dependencies for deployment to another SQL Server instance. Alternatively, DBAs can export the database to a .bacpac package, which is a DAC package that also includes the data already in the database. The export operation is performed in two phases. First, the package file is created in the same way as a .dacpac file, and then the data is bulk exported from the database into the package file.
Deploying or Importing a Data-Tier Application
To deploy a data-tier application from a .dacpac package, you can use the wizard provided in SQL Server Management Studio or you can create a PowerShell script to automate the deployment. Whichever approach you use, the database is created on the target SQL Server instance along with any instance-level dependencies that do not already exist. A versioned data-tier application that encompasses all of the components of the application is defined on the target instance and can be monitored and managed as a unit.
If you have previously exported a database, including its data as a .bacpac package, you can import it into a SQL Server instance. In this case, the DAC is deployed to the server along with its data.
Data-Tier Application Versions
One of the benefits of using a Data-Tier application is that the application is versioned and can be upgraded across the enterprise in a consistent, managed way. This is useful in large organizations where the same database application might be deployed in multiple sites or virtual servers. Application administrators can easily track which version of an application is installed in each location, and upgrade it to the latest version as required.
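As an alternative to the SSMS wizards shown in the following demonstration, the DacFx command-line tool SqlPackage.exe can perform the same export and import operations. The following is a minimal sketch, assuming SqlPackage.exe is installed and on the path, and using illustrative server, database, and file names:

Exporting and Importing a .bacpac with SqlPackage.exe
REM Export a database (schema and data) to a .bacpac file
SqlPackage.exe /Action:Export /SourceServerName:MIA-SQL /SourceDatabaseName:Finance /TargetFile:D:\Finance.bacpac

REM Import the .bacpac into another instance as a new database
SqlPackage.exe /Action:Import /SourceFile:D:\Finance.bacpac /TargetServerName:MIA-SQL\SQL2 /TargetDatabaseName:Finance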
Demonstration: Exporting and Importing a Data-tier Application
In this demonstration, you will see how to:
Export a data-tier application.
Import a data-tier application.
Demonstration Steps
Export a Data-Tier Application
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Management Studio, in Object Explorer, under the MIA-SQL instance, right-click the Finance database, point to Tasks, and click Export Data-tier Application. (Be careful to click Export Data-tier Application, and not Extract Data-tier Application.)
3. On the Introduction page, click Next.
4. On the Export Settings page, ensure that Save to local disk is selected and enter the path D:\Demofiles\Mod06\Finance.bacpac. Then click the Advanced tab, verify that all tables are selected, and click Next.
5. On the Summary page, click Finish.
6. Wait for the export operation to complete, and then click Close.
Import a Data-Tier Application
1. In SQL Server Management Studio, in Object Explorer, under the MIA-SQL\SQL2 instance, right-click the Databases folder and click Import Data-tier Application.
2. On the Introduction page, click Next.
3. On the Import Settings page, ensure that Import from local disk is selected and enter the path D:\Demofiles\Mod06\Finance.bacpac. Then click Next.
4. On the Database Settings page, review the default settings for the database name and file paths, and then click Next.
5. On the Summary page, click Finish.
6. Wait for the import operation to complete, and then click Close.
7. In Object Explorer, under the MIA-SQL\SQL2 instance, refresh the Databases folder if necessary, and verify that the Finance database has been imported.
Lab: Importing and Exporting Data

Scenario
You are a DBA at Adventure Works Cycles, with responsibility for the HumanResources, InternetSales, and AWDataWarehouse databases. One of your tasks is to import and export data to and from these databases as required.
Objectives
After completing this lab, you will be able to:
Use the SQL Server Import and Export Wizard to export data.
Use the bcp utility to import data.
Use BULK INSERT to import data.
Use OPENROWSET to import data.
Estimated Time: 45 minutes
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
Exercise 1: Using the SQL Server Import and Export Wizard

Scenario
The production manager has asked you to query the InternetSales database in order to summarize historical order volumes for the products Adventure Works sells, and to plan future production. You have created a Transact-SQL query, and the production manager has confirmed that it returns the required data. Now the production manager has asked you to provide the data in a Microsoft Excel workbook.

The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. Use the SQL Server Import and Export Wizard to Export Data

Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Run Setup.cmd in the D:\Labfiles\Lab06\Starter folder as Administrator.
Task 2: Use the SQL Server Import and Export Wizard to Export Data
1. Use the SQL Server Import and Export Wizard to export the required sales data from the InternetSales database on the MIA-SQL instance of SQL Server.
   o Export the data to an Excel file named Sales.xls in the D:\Labfiles\Lab06\Starter folder.
   o Include the column names in the first row.
   o Use the Transact-SQL query in the Query.sql script file (which is in the D:\Labfiles\Lab06\Starter folder) to extract the required data.

Results: After this exercise, you should have exported data from InternetSales to an Excel workbook named Sales.xls.
Exercise 2: Using the bcp Utility

Scenario
Adventure Works Cycles uses a recruitment agency to find new employees. The agency periodically exports details of potential candidates from its candidate database and sends the data to the Human Resources (HR) department in tab-delimited text format. The HR manager has asked you to import the candidate data into the dbo.JobCandidate table in the HumanResources database.

The main tasks for this exercise are as follows:
1. Create a Format File
2. Use bcp to Import Data

Task 1: Create a Format File
1. Use SQL Server Management Studio to view the contents of the dbo.JobCandidate table in the HumanResources database.
   o Note that the table includes some columns that contain Unicode characters.
2. Use the bcp utility to create an XML format file for the JobCandidate table.
   o Ensure that the format file uses Unicode character data with a tab (\t) field terminator and a new line (\n) row terminator.
   o Save the format file as JobCandidateFmt.xml in the D:\Labfiles\Lab06\Starter folder.

Task 2: Use bcp to Import Data
1. Use Notepad to view the contents of the JobCandidates.txt file in the D:\Labfiles\Lab06\Starter folder. Note that this file contains new candidate data in tab-delimited Unicode text.
2. Use the bcp utility to import the data in the JobCandidates.txt file into the dbo.JobCandidate table in the HumanResources database.
   o Use the format file you created in the previous task.
3. Use SQL Server Management Studio to view the contents of the dbo.JobCandidate table in the HumanResources database, and verify that the new candidate data has been imported.

Results: After this exercise, you should have created a format file named JobCandidateFmt.xml, and imported the contents of the JobCandidates.txt file into the HumanResources database.
Exercise 3: Using the BULK INSERT Statement

Scenario
Adventure Works Cycles sells to customers throughout the world, and the e-commerce application developers have updated the web site to support multiple currencies. A comma-delimited file containing currency conversion rates has been uploaded to the M:\ volume used by the database server, and you must import this data into the dbo.CurrencyRate table in the InternetSales database.

The main tasks for this exercise are as follows:
1. Disable Indexes
2. Use the BULK INSERT Statement to Import Data
3. Rebuild Indexes
Task 1: Disable Indexes
1. Use SQL Server Management Studio to view the contents of the dbo.CurrencyRate table in the InternetSales database, and verify that it is currently empty.
2. In Object Explorer, view the indexes that are defined on the dbo.CurrencyRate table.
3. Disable all indexes on the dbo.CurrencyRate table.

Task 2: Use the BULK INSERT Statement to Import Data
1. Use Excel to view the contents of the CurrencyRates.csv file in the M:\ folder, and note that it contains currency rate data in comma-delimited format.
2. In SQL Server Management Studio, use the BULK INSERT Transact-SQL statement to import the data from the CurrencyRates.csv file into the dbo.CurrencyRate table in the InternetSales database.
3. Verify that the data has been imported.

Task 3: Rebuild Indexes
1. Rebuild all indexes on the dbo.CurrencyRate table.

Results: After this exercise, you should have used the BULK INSERT statement to load data into the dbo.CurrencyRate table in the InternetSales database.
Exercise 4: Using the OPENROWSET Function

Scenario
The recruitment agency has sent more job candidate data, and you have decided to import only records for candidates who have supplied an email address.

The main tasks for this exercise are as follows:
1. Copy Data Files to the Server
2. Disable Indexes and Constraints
3. Use the OPENROWSET Function to Import Data
4. Re-Enable Indexes and Constraints
Task 1: Copy Data Files to the Server
1. Use Notepad to view the JobCandidates2.txt file in the D:\Labfiles\Lab06\Starter folder, and note that it contains data for three candidates, only two of whom have supplied email addresses.
2. Copy the JobCandidates2.txt and JobCandidateFmt.xml files from the D:\Labfiles\Lab06\Starter folder to the M:\ folder.
Note: In this lab environment, the client and server are the same. However, in a real environment you would need to upload data and format files from your local workstation to a volume that is accessible from the server. In this scenario, M: represents a volume in a SAN that would be accessible from the server.
Task 2: Disable Indexes and Constraints
1. Use SQL Server Management Studio to view the contents of the dbo.JobCandidate table in the HumanResources database.
2. In Object Explorer, view the indexes and constraints that are defined on the dbo.JobCandidate table.
3. Disable the nonclustered indexes on the dbo.JobCandidate table.
4. Disable all constraints on the dbo.JobCandidate table.
Task 3: Use the OPENROWSET Function to Import data 1.
2.
In SQL Server Management Studio, use the OPENROWSET function in an INSERT Transact-SQL statement to import the data from the JobCandidates2.txt file into the dbo.JobCandidate table in the HumanResources database. o
Use the BULK provider in the OPENROWSET function.
o
Specify the JobCandidateFmt.xml format file.
o
Use a WHERE clause to include only records where EmailAddress is not null.
Verify that the data has been imported.
Task 4: Re-Enable Indexes and Constraints
1. Rebuild the indexes you disabled on the dbo.JobCandidate table.
2. Re-enable all constraints on the dbo.JobCandidate table.
Results: After this exercise, you should have imported data from JobCandidates2.txt into the dbo.JobCandidate table in the HumanResources database.
Question: Why was it not necessary to disable constraints when importing the currency rates?
Question: If the dbo.JobCandidate table had included a column for a resume in Microsoft Word document format, which tool or command could you use to import the document into a column in a table?
Module Review and Takeaways
In this module, you have learned how to import and export data, and how to move or copy a database between SQL Server instances. When planning a data transfer solution, consider the following best practices:
Choose the right tool for bulk imports.
Use SSIS for complex transformations.
Use bcp or BULK INSERT for fast imports and exports.
Use OPENROWSET when data needs to be filtered before it gets inserted.
Try to achieve minimal logging to speed up data import.
Review Question(s)
Question: What other factors might you need to consider when importing or exporting data?
Module 7
Monitoring SQL Server 2014
Contents:
Module Overview
Lesson 1: Introduction to Monitoring SQL Server
Lesson 2: Dynamic Management Views and Functions
Lesson 3: Performance Monitor
Lab: Monitoring SQL Server 2014
Module Review and Takeaways
Module Overview
A database administrator (DBA) spends much of their time monitoring activity in databases and database servers in order to diagnose problems and identify changes in resource utilization requirements. SQL Server provides a range of tools and functionality you can use to monitor current activity and to record details of previous activity. This module explains how to use three of the most commonly used tools: Activity Monitor, dynamic management views and functions (DMVs and DMFs), and Performance Monitor.
Objectives
After completing this module, you will be able to:
Describe considerations for monitoring SQL Server and use Activity Monitor.
Use dynamic management views and functions to monitor SQL Server.
Use Performance Monitor to monitor SQL Server.
Lesson 1
Introduction to Monitoring SQL Server
SQL Server is a sophisticated software platform with a large number of subsystems and potential workloads, all of which make demands on system resources and affect how database applications perform in a multi-user environment. To effectively manage a SQL Server database solution, a DBA must be able to monitor the key metrics that affect the workloads that the solution must support, and use the information gained from monitoring to plan hardware capacity and troubleshoot performance problems.
Lesson Objectives
After completing this lesson, you will be able to:
Describe considerations for monitoring SQL Server.
Describe tools for monitoring SQL Server.
Use Activity Monitor to view current activity in a SQL Server instance.
SQL Server Monitoring Overview
As a DBA, you need to support the database workloads that the applications and services in your organization rely on. To do this effectively, you need to understand how those workloads are using server resources, identify any changes in resource utilization patterns, and diagnose the causes of any performance issues that may occur.
Why Monitor?
Important reasons to monitor SQL Server workloads include:
Diagnosing causes of performance issues. It is not uncommon for users to complain of slow performance when using a database application. By performance, users usually mean that the response time of the application (the time between a user submitting a request and seeing a response from the application) is slower than acceptable, though often the problem can be attributed to a bottleneck that affects the overall throughput of the system (the amount of data that can be processed for all concurrent workloads simultaneously). Effective monitoring is a key part of diagnosing and resolving these problems. For example, you might identify long-running queries that could be optimized by creating indexes or improving Transact-SQL code (to improve response time), or you might find that at peak times the server has insufficient physical memory to cope with demand (reducing throughput).
Detecting and resolving concurrency issues. When multiple users and applications access the same data, SQL Server uses locks to ensure data consistency. This can cause some requests to be blocked while waiting for another to complete, and can occasionally result in a deadlock, where two operations block one another. By monitoring current activity, you can identify processes that are blocking one another, and take action to resolve the problem if necessary. If your monitoring reveals consistent locking issues, you can then troubleshoot them by tracing workloads and analyzing the results. Techniques for accomplishing this are discussed in the next module.
Identifying changing trends in resource utilization. When a database server is initially provisioned, this is usually done after careful capacity planning to identify the hardware resources required to support the database workload. However, business processes, and the database workloads that support them, can change over time; so it is important to monitor resource utilization changes so that you can proactively upgrade hardware or consolidate workloads as the needs of the organization change.
Guidelines for Monitoring
When planning to monitor SQL Server workloads, consider the following guidelines:
Understand the workloads you need to support. Every workload has different requirements, both in terms of the resources required to support it and in terms of trends in demand between quiet and peak times. Before you can effectively support a database solution, you need to understand its workloads so that you can identify the relevant metrics to monitor, and prioritize resources based on the importance of each workload to the business.
Establish a baseline. A common mistake is to wait until there is a problem before monitoring the SQL Server solution. The problem with this approach is that without something to use as a comparison, the values obtained from monitoring are unlikely to help you identify what has changed since the system was operating acceptably. A better approach is to identify the key metrics that your workloads rely on, and record baseline values for these metrics when the system is operating normally. If you experience performance problems later, you can monitor the system and compare each metric to its baseline in order to identify significant changes that warrant further investigation to try to diagnose the problem.
Monitor regularly to track changes in key metrics. Instead of waiting for a problem to arise, it is generally better to proactively monitor the system on a regular basis to identify any trends that signify changes in the way the workloads consume resources. With this approach, you can plan server upgrades or application optimizations before they become critical.
Monitoring Tools for SQL Server
SQL Server provides a number of tools you can use to carry out performance monitoring and tuning. Each tool is useful in certain scenarios and you will often need to combine several of them to achieve the optimal results.
Activity Monitor. A component of SQL Server Management Studio that enables DBAs to view details of current activity in the database engine.
Dynamic Management Views and Functions. Database objects that provide insight into internal SQL Server operations.
Performance Monitor. A Windows administrative tool that you can use to record values for multiple performance counters over a period of time, and analyze the results in a variety of chart and report formats.
SQL Server Profiler. A tracing and profiling tool that you can use to record details of Transact-SQL and other events in a SQL Server workload, and then replay the events or use the trace as a source for database tuning.
SQL Trace. A lightweight, Transact-SQL based programming interface for tracing SQL Server activity.
Database Engine Tuning Advisor. A tool provided with SQL Server for tuning indexes and statistics based on a known workload.
Distributed Replay. An advanced tool for replaying workloads across a potentially distributed set of servers.
SQL Server Extended Events. A lightweight eventing architecture.
SQL Server Data Collection. An automated system for collecting, storing, and reporting performance data for multiple SQL Server instances.
SQL Server Utility Control Point. A centralized management portal for monitoring server health for multiple instances based on specific collection sets.
Microsoft System Center Operations Manager. An enterprise-wide infrastructure management solution that uses management packs to collect performance and health data from Windows and application services.
Activity Monitor
Activity Monitor is a tool in SQL Server Management Studio (SSMS) that shows information about processes, waits, I/O resource performance, and recent expensive queries. You can use it to investigate both current and recent historical issues. You must have the VIEW SERVER STATE permission to use Activity Monitor.
To start Activity Monitor, in SSMS, right-click the server name, and then click Activity Monitor. Activity Monitor displays five sections:
The Overview section contains graphical information about processor usage, waiting tasks, database I/O, and batch requests per second.
The Processes section includes detailed information on processes, their IDs, logins, databases, and commands. This section also shows details of processes that are blocking other processes.
The Resource Waits section shows categories of processes that are waiting for resources and information about the wait times.
The Data File I/O section shows information about the physical database files in use and their recent performance.
The Recent Expensive Queries section shows detailed information about the most expensive recent queries, and resources consumed by those queries. You can right-click the queries in this section to view either the query or an execution plan for the query.
You can filter data by clicking column headings and choosing the parameter for which you want to view information.
Demonstration: Using Activity Monitor
In this demonstration, you will see how to:
View server activity in Activity Monitor.
Troubleshoot a blocked process.
Demonstration Steps
View Server Activity in Activity Monitor
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod07 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using Windows authentication.
4. In Object Explorer, right-click the MIA-SQL SQL Server instance and click Activity Monitor.
5. In Activity Monitor, view the charts in the Overview section, which show background activity in the SQL Server instance.
6. Expand the Processes section and view the processes currently running in the SQL Server instance.
7. Click the filter icon for the Application column header, and filter the data to show only processes for the Microsoft SQL Server Management Studio application (you may need to widen the columns to read the headers).
8. Remove the filter to show all applications.
9. Expand the Resource Waits section and view the statistics for processes waiting on resources.
10. Expand the Data File I/O section and view the details of the database file I/O activity (you may need to wait for a few seconds while the data is collected and displayed).
11. Expand the Recent Expensive Queries section and view the list of queries that have consumed query processing resources.
Troubleshoot a Blocked Process
1. With Activity Monitor still open in SQL Server Management Studio, in the D:\Demofiles\Mod07 folder, run ActivityWorkload.cmd.
2. In SQL Server Management Studio, in Object Explorer, expand Databases, expand AdventureWorks, and expand Tables. Then right-click Production.Product and click Select Top 1000 Rows.
3. In the status bar at the bottom of the query pane, note that the query continues executing. Another process is preventing it from completing.
4. In the Activity Monitor pane, in the Processes section, filter the Task State column to show processes that are in a SUSPENDED state.
5. In the Blocked By column for the suspended process, note the ID of the process that is blocking this one.
6. Remove the filter on the Task State column to show all processes, and find the blocking process with the ID you identified in the previous step.
7. Note the value in the Head Blocker column for the blocking process. A value of 1 indicates that this process is the first one in a chain that is blocking others.
8. Right-click the blocking process and click Details. This displays the Transact-SQL code that is causing the block—in this case, a transaction that has been started but not committed or rolled back.
9. Click Kill Process, and when prompted to confirm, click Yes. After a few seconds, the Processes list should update to show no blocked processes.
10. Close the Activity Monitor pane, and verify that the query to retrieve the top 1000 rows from Production.Product has now completed successfully.
11. Close the command prompt window, but keep SQL Server Management Studio open for the next demonstration.
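The Processes grid is not the only way to see this information. The following query (a sketch using the sys.dm_exec_requests DMV, which is covered in the next lesson, and not part of the demonstration scripts) returns the same blocking relationships from Transact-SQL:
Finding Blocked Requests (sketch)
-- List requests that are currently blocked, and the sessions blocking them
SELECT r.session_id, r.blocking_session_id, r.wait_type, r.wait_time, r.wait_resource
FROM sys.dm_exec_requests AS r
WHERE r.blocking_session_id <> 0;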
Lesson 2
Dynamic Management Views and Functions
SQL Server provides a range of tools and features to support monitoring of server activity. Dynamic management views (DMVs) and dynamic management functions (DMFs) provide insights directly into the inner operations of the SQL Server database engine and are useful for monitoring.
Lesson Objectives
After completing this lesson, you will be able to:
Describe SQL Server DMVs and DMFs.
View dynamic state information by using DMVs and DMFs.
Overview of Dynamic Management Views and Functions
In earlier versions of SQL Server, database administrators often used third-party tools to monitor the internal state of SQL Server. Most third-party tools performed this monitoring by using extended stored procedures. This approach is not recommended because the extended stored procedures operate within the memory space of the SQL Server process. Poorly-written programs that operate in these memory regions can cause instability or crashes of SQL Server.
SQL Server 2005 and later offer dynamic management objects to provide insight into the inner operation of the database engine without needing to use extended stored procedures. Some of the objects have been created as views and are called DMVs. Other objects have been created as functions and are called DMFs.
Note: The information exposed by DMVs and DMFs is generally not persisted in the database, unlike the information in catalog views. The views and functions are virtual objects that return state information. That state is cleared when the server instance is restarted.
DMVs and DMFs return server state information that you can use to monitor the health of a server instance, diagnose problems, and tune performance. There are two types of DMV and DMF:
Server-scoped. These objects provide server-wide information. To use these objects, users require VIEW SERVER STATE permission on the server.
Database-scoped. These objects provide database-specific information. To use these objects, users require VIEW DATABASE STATE permission on the database.
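As a sketch, granting these permissions looks like the following; the login and user names here are hypothetical examples.
Granting Permissions for Dynamic Management Objects (sketch)
-- Server-scoped objects require VIEW SERVER STATE (hypothetical login)
GRANT VIEW SERVER STATE TO [ADVENTUREWORKS\MonitoringTeam];
-- Database-scoped objects require VIEW DATABASE STATE (hypothetical user)
USE InternetSales;
GRANT VIEW DATABASE STATE TO MonitoringUser;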
All DMVs and DMFs exist in the sys schema and follow the naming convention dm_%. They are defined in the hidden resource database, and are then mapped to the other databases. There are a great many DMVs and DMFs, covering a range of categories of system information. Some commonly used categories of DMVs and DMFs are described below:
sys.dm_exec_%. These objects provide information about connections, sessions, requests, and query execution. For example, sys.dm_exec_sessions provides one row for every session that is currently connected to the server.
sys.dm_os_%. These objects provide access to SQL OS-related information. For example, sys.dm_os_performance_counters provides access to SQL Server performance counters without needing to reach them using operating system tools.
sys.dm_tran_%. These objects provide access to transaction management. For example, sys.dm_tran_active_transactions gives details of currently active transactions.
sys.dm_io_%. These objects provide information on I/O processes. For example, sys.dm_io_virtual_file_stats shows details of I/O performance and statistics for each database file.
sys.dm_db_%. These objects provide database-scoped information. For example, sys.dm_db_index_usage_stats has information about how each index in the database has been used.
In addition to these core categories, there are DMVs and DMFs that provide information about specific SQL Server subsystems, such as security, Resource Governor, AlwaysOn Availability Groups, Service Broker, memory-optimized tables, and others.
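For example, a quick check of the current user sessions from the sys.dm_exec_% category might look like the following; the column selection is illustrative.
Listing User Sessions (sketch)
-- Current user sessions and a sample of their resource usage
SELECT session_id, login_name, status, cpu_time, memory_usage
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;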
Viewing Activity by Using Dynamic Management Views
Both DMVs and DMFs are viewable in Object Explorer—DMVs are in the System Views node for any database and DMFs are in the Table Valued Function node in the master database.
Note: DMVs and DMFs need to use the sys schema prefix when included in Transact-SQL statements; they cannot be referenced by using one-part names.
There are two basic types of dynamic management objects:
Objects that return real-time state information from the system.
Objects that return recent historical information.
Objects That Return Real-Time State Information from the System
Most DMVs and DMFs provide information about the current state of the system.
The following example shows how to use the sys.dm_exec_sessions and sys.dm_os_waiting_tasks views to return a list of user tasks that have been waiting for longer than three seconds:
Returning Real-Time Information
SELECT s.original_login_name, s.program_name, t.wait_type, t.wait_duration_ms
FROM sys.dm_os_waiting_tasks AS t
INNER JOIN sys.dm_exec_sessions AS s ON t.session_id = s.session_id
WHERE s.is_user_process = 1 AND t.wait_duration_ms > 3000;
In many cases, when a task is waiting, the cause of the wait will be some form of lock.
Note: Whenever a task has to wait for any resource, the task is sent to a waiting list. The task remains on that list until it receives a signal telling it that the requested resource is now available. The task is then returned to the running list, where it waits to be scheduled for execution again.
This type of wait analysis is very useful when tuning system performance, as it helps you to identify bottlenecks within the system.
Objects That Return Historical Information
The second type of dynamic management object returns historical information.
For example, the sys.dm_os_wait_stats view returns information about how often and how long any task had to wait for a specific wait type since the SQL Server instance started.
Returning Historical Information
SELECT *
FROM sys.dm_os_wait_stats
ORDER BY wait_time_ms DESC;
Another useful example of a historical function is sys.dm_io_virtual_file_stats(), which returns information about the performance of database files.
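One possible query (a sketch; the join follows the pattern used in this module's lab) resolves the file statistics to database and file names:
Returning File I/O Statistics (sketch)
-- I/O statistics per database file, with names resolved
SELECT DB_NAME(vfs.database_id) AS database_name,
       mf.name AS file_name,
       vfs.num_of_reads, vfs.num_of_writes,
       vfs.io_stall_read_ms, vfs.io_stall_write_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN sys.master_files AS mf
    ON vfs.database_id = mf.database_id AND vfs.file_id = mf.file_id;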
Demonstration: Querying Dynamic Management Views
In this demonstration, you will see how to:
View SQL Server service configuration settings.
View storage volume statistics.
View query statistics.
Demonstration Steps
View SQL Server Service Configuration Settings
1. If you did not complete the previous demonstration in this module, start the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines, log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd, and in the D:\Demofiles\Mod07 folder, run Setup.cmd as Administrator.
2. If SQL Server Management Studio is not already open, start it and connect to the MIA-SQL database engine instance using Windows authentication.
3. In SQL Server Management Studio, open the DMVs.sql script file in the D:\Demofiles\Mod07 folder.
4. Highlight the Transact-SQL statement under the comment View service information, and then click Execute.
5. Review the results, noting the values in the startup_type column for the SQL Server and SQL Server Agent services.
6. Highlight the Transact-SQL statement under the comment View registry information, and then click Execute.
7. Review the results. Note the value Start in the value_name column for the MSSQLSERVER and SQLSERVERAGENT registry keys and the corresponding values in the value_data column. These values are the registry equivalents of the startup_type column values returned by the sys.dm_server_services dynamic management view.
View Storage Volume Statistics
1. Select the Transact-SQL code under the comment View volume stats, noting that it retrieves data from the sys.sysdatabases and sys.master_files system tables as well as the sys.dm_os_volume_stats dynamic management function.
2. Click Execute and review the query results, which show the files for all databases in the instance, together with details about the disk volume on which they are hosted.
View Query Statistics
1. Highlight the Transact-SQL statement under the comment Empty the cache, and then click Execute.
2. Highlight the Transact-SQL statement under the comment get query stats, noting that this code uses the sys.dm_exec_query_stats DMV and the sys.dm_exec_sql_text DMF to return details about Transact-SQL queries that have been executed.
3. Click Execute and review the results, which show some background system queries, noting the various columns that are returned.
4. Highlight the Transact-SQL statement under the comment Execute a query, and then click Execute.
5. Re-highlight the Transact-SQL statement under the comment get query stats, and then click Execute to get the query stats again.
6. In the results, find the row with a SQLText column that contains the Transact-SQL statement that you executed and note the statistics returned for the query.
7. Re-highlight the Transact-SQL statement under the comment Execute a query, and then click Execute to run the query again.
8. Re-highlight the Transact-SQL statement under the comment get query stats, and then click Execute to get the query stats again.
9. Review the results for the query, noting that the query you executed now has an execution_count value of 2.
10. Close SQL Server Management Studio without saving any files.
Lesson 3
Performance Monitor
SQL Server Management Studio (SSMS) provides the Activity Monitor, which you can use to investigate both current and recent historical issues. The SQL Server processes also expose a set of performance-related objects and counters to the Windows Performance Monitor. These objects and counters enable you to monitor SQL Server as part of monitoring the entire server.
Lesson Objectives
After completing this lesson, you will be able to:
Describe Performance Monitor.
Work with SQL Server counters.
Work with data collector sets.
Introduction to Performance Monitor
Windows Performance Monitor is an operating system tool that brings together several previously disparate performance and monitoring tools. By consolidating several sources of information in one place, Windows Performance Monitor helps you obtain the information you need to diagnose server performance and instability issues. Windows Performance Monitor is the key tool for monitoring Windows systems. Because SQL Server runs on the Windows operating system, it is important to monitor at the server level, as well as at the database engine level, because problems in the database engine might be caused by issues at the operating system or hardware level.
Objects, Counters and Instances
Performance Monitor captures values for system metrics that you specify. Typically, these metrics measure the use of system resources such as the CPU, memory, or disk subsystems. The specific metrics that you can monitor are organized into the following hierarchy.
Objects. Objects represent coarse-grained resource categories, such as Processor, Memory, or Logical Disk. There are a great many built-in objects provided by Windows, and server applications such as SQL Server often install additional objects that relate to application-specific resources.
Counters. An object contains one or more counters, which are the metrics that you can monitor. For example, the Processor object includes the % Processor Time counter, which measures processor utilization.
Instances. Some counters can be captured for specific instances of objects. For example, when monitoring the % Processor Time counter in a multi-CPU server, you capture measurements for each individual CPU. Additionally, most multi-instance counters provide a _Total instance, which measures the counter for all instances combined.
When describing a system counter, the format Object: Counter (instance) is used. For example, the % Processor Time counter for the _Total instance of the Processor object is described as Processor: % Processor Time (_Total). When describing the _Total instance, or if an object has only one instance, the (instance) part is often omitted; for example Processor: % Processor Time.
Charts and Reports
Performance Monitor provides multiple ways to visualize the counter values that you have captured. These include:
A line chart.
A histogram bar chart.
A text-based report.
When viewing a chart in Performance Monitor, you can select individual counters in a list below the chart and see the last, average, minimum, and maximum values recorded for that counter within the selected timeframe. You can also highlight an individual counter to make it easier to identify in the chart. You can export data from all charts and reports, and you can save them as images.
SQL Server Counters
Many applications expose application-related counters to Performance Monitor. SQL Server exposes a large number of objects and counters, which use the following naming convention:
SQLServer: — default instances
MSSQL$<instance name>: — named instances
SQLAgent$<instance name>: — SQL Server Agent
You can also access the same counter values by using the sys.dm_os_performance_counters DMV.
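For example, the following query reads a counter value from Transact-SQL; the counter name shown is just one of many that you can retrieve this way.
Reading a Counter from the DMV (sketch)
-- Read a SQL Server performance counter without using Performance Monitor
SELECT object_name, counter_name, instance_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name = N'Batch Requests/sec';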
There are too many SQL Server objects and counters to discuss in detail in this course, and the specific counters you should monitor depend on the workload of your specific database environment. However, some commonly monitored SQL Server performance counters include:
SQLServer: Buffer Manager: Buffer cache hit ratio. This measures how frequently SQL Server found data pages in cache instead of having to fetch them from disk. Generally a high figure for this counter is desirable.
SQLServer: Plan Cache: Cache hit ratio. This measures how often a pre-compiled execution plan for a query could be used, saving the query processor from having to compile an execution plan. A high value for this is generally desirable.
SQLServer: SQL Statistics: Batch Requests / sec. This measures the number of Transact-SQL batches per second, and provides a good measure of how active SQL Server is.
SQLServer: Locks: Lock Requests / sec. This measures the number of lock requests SQL Server receives each second. A high value for this counter indicates that a lot of transactions are locking pages (usually to update data), and on its own is not necessarily an indication of a problem.
SQLServer: Locks: Average Wait Time (ms). This measures the average time that requests are waiting for locks to be released. A high value for this counter indicates that concurrent processes are blocking one another.
SQLServer: Memory Manager: Database Cache Memory (KB). This measures how much of its memory allocation SQL Server is using to cache data.
SQLServer: Memory Manager: Free Memory (KB). This measures how much of its memory allocation SQL Server is currently not using.
Data Collector Sets and Logs
Although you can use Performance Monitor interactively to add individual counters and view live measurements, it is most useful as a tool for logging counter values over a period of time, and viewing the results later.
Data Collector Sets
You can define data collector sets as reusable sets of counters, events, and system properties that you want to measure at different times. By using a data collector set to group specific performance metrics that are most appropriate for the database workloads you need to support, you can easily record baseline measurements that reflect typical and peak time periods, and then use the same data collector sets later to monitor the system and observe any changes.
Performance Logs
Each data collector set is configured to use a specific folder in which to save its logs. Logs are files containing the recorded measurements, and usually have dynamically generated names that include the date and time the log was saved. You can open log files in Performance Monitor, filtering them to a specific timespan if desired, and then view the logged counter values as charts or a report.
Demonstration: Using Performance Monitor
In this demonstration, you will see how to:
View performance counters.
Demonstration Steps
View Performance Counters
1. If you did not complete the previous demonstration in this module, start the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines, log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd, and in the D:\Demofiles\Mod07 folder, run Setup.cmd as Administrator.
2. Right-click the Start button and click Computer Management.
3. In Computer Management, expand Performance, expand Monitoring Tools, and click Performance Monitor.
4. In the toolbar, click the Add button (a green +).
5. In the list of objects, expand the Processor object, and select only the % Processor Time counter. Then in the Instances of selected object list, ensure that _Total is selected and click Add.
6. In the list of objects, expand the Memory object and select the Page Faults/sec counter. Then click Add.
7. In the list of objects, expand the SQLServer:Locks object, click the Average Wait Time (ms) counter, and then hold the Ctrl key and click the Lock Requests/sec and Lock Waits/sec counters. Then in the Instances of selected object list, ensure that _Total is selected and click Add.
8. In the list of objects, expand the SQLServer:Plan Cache object, and select the Cache Hit Ratio counter. Then in the Instances of selected object list, ensure that _Total is selected and click Add.
9. In the list of objects, expand the SQLServer:Transactions object, and select the Transactions counter. Then click Add.
10. In the Add Counters dialog box, click OK. Then observe the counters as they are displayed in Performance Monitor.
11. On the toolbar, click Freeze Display and note that the chart is paused. Then click Unfreeze Display to resume the chart.
12. On the toolbar, in the Change Graph Type list, select Histogram bar and view the resulting chart. Then in the Change Graph Type list, select Report and view the text-based report.
13. On the toolbar, in the Change Graph Type list, select Line to return to the original line chart view.
14. Click any of the counters in the list below the chart and on the toolbar click Highlight so that the selected counter is highlighted in the chart. Press the up and down arrow keys on the keyboard to change the selected counter.
15. In the D:\Demofiles\Mod07 folder, run PerformanceWorkload1.cmd and PerformanceWorkload2.cmd to generate some activity in the database.
16. In Performance Monitor, observe the effect on the counters as the workloads run.
17. Close both command prompt windows, and observe the effect that ending the workloads has on the Performance Monitor counters.
18. Close Computer Management.
Lab: Monitoring SQL Server 2014
Scenario
You are a database administrator (DBA) at Adventure Works Cycles with responsibility for the InternetSales database. One of your key tasks is to monitor this database to ensure that it is performing within expected parameters.
Objectives
After completing this lab, you will be able to:
Collect baseline metrics.
Monitor a database workload.
Estimated Time: 45 minutes
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
Exercise 1: Collecting Baseline Metrics
Scenario
The InternetSales database is the data source for a business-critical application. You need to record measurements of key performance metrics for this database to establish a performance baseline. The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. Create a Data Collector Set
3. Run the Data Collector Set
4. View the Logged Data
5. View Query and I/O Statistics
Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Run Setup.cmd in the D:\Labfiles\Lab07\Starter folder as Administrator.
Task 2: Create a Data Collector Set
1. In Computer Management, create a data collector set named SQL Server Workload.
2. Create the data collector set manually, and include the following performance counters:
o Processor: % Processor Time (_Total)
o Memory: Page Faults / sec
o SQLServer:Locks: Average Wait Time (ms) (_Total)
o SQLServer:Locks: Lock Waits / sec (_Total)
o SQLServer:Memory Manager: Database Cache Memory (KB)
o SQLServer:Memory Manager: Free Memory (KB)
o SQLServer:Plan Cache: Cache Hit Ratio (_Total)
o SQLServer:Transactions: Transactions
3. Configure the data collector set to save its logs in the D:\Labfiles\Lab07\Starter\Logs folder.
Task 3: Run the Data Collector Set
1. Start the SQL Server Workload data collector set you created in the previous task.
2. Run Baseline.ps1 in the D:\Labfiles\Lab07\Starter folder, changing the execution policy if prompted. This starts a baseline workload process that takes three minutes to run.
3. When the baseline workload has finished, stop the SQL Server Workload data collector set.
Task 4: View the Logged Data
1. Use Performance Monitor to view the log data that was recorded by the SQL Server Workload data collector set. This will be stored in a folder with a name similar to MIA-SQL_2014010101000001 in the D:\Labfiles\Lab07\Starter\Logs folder.
2. Add all counters from the log to the display, and explore the data in the available chart and report formats.
3. Save the Report view as an image in the D:\Labfiles\Lab07\Starter folder for future reference.
Task 5: View Query and I/O Statistics
1. Use SQL Server Management Studio to query DMVs and DMFs in the MIA-SQL instance and obtain the following information (a sketch of the first query follows this task):
o The top 5 queries by average reads. You can retrieve this information from the sys.dm_exec_query_stats DMV cross applied with the sys.dm_exec_sql_text DMF based on the sql_handle column.
o I/O stats for the files used by the InternetSales database. You can retrieve this from the sys.dm_io_virtual_file_stats DMF, joined with the sys.master_files system table on the database_id and file_id columns. Note that you can use the DB_NAME function to retrieve a database name from a database ID.
If you have difficulty creating the required queries, you can use the ones in the Query DMV.sql file in the D:\Labfiles\Lab07\Starter folder.
2. Save the results of your queries as comma-separated values (CSV) files in the D:\Labfiles\Lab07\Starter folder.
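As a reference, the first query might look like the following sketch; this is one possible formulation, not necessarily the one in the Query DMV.sql file, and the I/O statistics query can follow the join shown earlier in this module.
Top Queries by Average Reads (sketch)
-- Top 5 queries by average logical reads per execution
SELECT TOP 5
    qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
    qs.execution_count,
    st.text AS sql_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_logical_reads DESC;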
Results: At the end of this exercise, you will have a data collector set named SQL Server Workload, a log containing baseline measurements, and query and I/O statistics obtained from DMVs and DMFs.
Exercise 2: Monitoring a Workload
Scenario
The application developers of the Adventure Works e-commerce solution have added some new reporting capabilities to the application. You need to determine how these new capabilities have affected the workload for the InternetSales database. The main tasks for this exercise are as follows:
1. Run the Data Collector Set
2. View the Logged Data
3. View Query and I/O Statistics
Task 1: Run the Data Collector Set
1. Start the SQL Server Workload data collector set you created in the previous exercise.
2. Run Workload.ps1 in the D:\Labfiles\Lab07\Starter folder. This starts a workload process that takes three minutes to run.
3. When the workload has finished, stop the SQL Server Workload data collector set.
Task 2: View the Logged Data
1. Use Performance Monitor to view the new log data that was recorded by the SQL Server Workload data collector set. This will be stored in a folder with a name similar to MIA-SQL_2014010101-000002 in the D:\Labfiles\Lab07\Starter\Logs folder.
2. Explore the data in the available chart and report formats, noting any values that look consistently high.
3. Compare the Report view with the baseline image you saved previously, and identify which counters have changed significantly.
Task 3: View Query and I/O Statistics
1. Use SQL Server Management Studio to retrieve the top 5 queries by average reads and I/O statistics for the InternetSales database from the appropriate DMVs and DMFs.
2. Compare the results to the baseline results you saved earlier.
Results: At the end of this exercise, you will have a second log file containing performance metrics for the revised workload.
Question: Based on the results of your monitoring, what aspect of the database solution is most significantly affected by the changes to the workload?
Module Review and Takeaways
In this module, you learned how to use Activity Monitor, DMVs and DMFs, and Performance Monitor to monitor SQL Server activity. When monitoring SQL Server, consider the following best practices:
Identify the system resources that your database workload uses, and determine the key performance metrics that indicate how your database and server are performing.
Record baseline measurements for typical and peak workloads so that you have a basis for comparison when troubleshooting performance problems later.
Identify the DMVs and DMFs that return appropriate performance information for your workloads, and create reusable scripts that you can use to quickly check system performance.
Monitor the overall system periodically and compare the results with the baseline. This can help you detect trends that will eventually result in resource over-utilization before application performance is affected.
Review Question(s)
Question: How are dynamic management views and functions different from system tables?
Module 8
Tracing SQL Server Activity
Contents:
Module Overview
Lesson 1: Tracing SQL Server Workload Activity
Lesson 2: Using Traces
Lab: Tracing SQL Server Workload Activity
Module Review and Takeaways
Module Overview
While monitoring performance metrics provides a great way to assess the overall performance of a database solution, there are occasions when you need to perform more detailed analysis of the activity occurring within a SQL Server instance in order to troubleshoot problems and identify ways to optimize workload performance.
This module describes how to use SQL Server Profiler and SQL Trace stored procedures to capture information about SQL Server, and how to use that information to troubleshoot and optimize SQL Server workloads.
Objectives
After completing this module, you will be able to:
Trace activity in SQL Server.
Use captured traces to test, troubleshoot, and optimize database performance.
Lesson 1
Tracing SQL Server Workload Activity
Application workloads generate activity in SQL Server, which you can think of as a sequence of events that begin and end as the workload progresses. The ability to trace these events is a valuable tool for performance tuning, for troubleshooting and diagnostic purposes, and for replaying workloads in order to check the impact of performance changes against test systems or to test application workloads against newer versions of SQL Server.
Lesson Objectives
After completing this lesson, you will be able to:
Use SQL Server Profiler.
Describe commonly-used trace events.
Describe how columns are used in traces.
Work with trace templates.
View trace files.
Use SQL Trace.
SQL Server Profiler
SQL Server Profiler is a graphical tool that you can use to create, capture, and view details of events in a SQL Server instance. It captures the activity from client applications to SQL Server and stores it in a trace, which you can then analyze. SQL Server Profiler captures data when events occur, but only captures events that you specify in the trace definition. A variety of information (shown as a set of columns) is available when each event occurs, but you can again select which columns you want to include. Selecting events and columns each time you run SQL Server Profiler can become time-consuming, so there are existing templates that you can use or modify. You can also save your own selections as a new template.
Options for Saving Traces When a SQL Server Profiler trace is active, it loads the captured events into a graphical grid in the SQL Server Profiler user interface. Additionally, SQL Server Profiler can send the captured event details to either operating system files or database tables.
Capture to Files
Capturing to an operating system file is the most efficient option for SQL Server Profiler traces. When configuring file output, you need to supply a filename for the trace. The default file extension for a trace file is .trc. SQL Server Profiler defaults to a file size of 5 MB, which can be too small for some traces. A more realistic value on most large systems is between 500 MB and 5,000 MB, depending on the volume of activity to record.
When the allocated file size is full, SQL Server Profiler opens a new file using the previous filename with an integer appended to it and starts writing to the new file. This is called file rollover and is the default behavior for SQL Server Profiler. You can disable and enable file rollover in the Trace Properties dialog box. It is considered good practice to work with a large maximum file size and avoid the requirement for rollover files, unless there is a need to move the captured traces onto media such as DVDs or onto download sites that cannot work with larger files.
Capturing to Tables
SQL Server Profiler can also capture trace data to database tables. The underlying SQL Trace programming interface does not directly support output to tables. However, the SQL Server Profiler program can retrieve the event data into its graphical grid, and then write those rows to the specified database table.
Note: Writing trace data directly back to the SQL Server system that is being monitored can impact performance.
SQL Server Profiler also provides an option for saving existing captured event data displayed in the graphical grid to a database table.
Trace Events
The information that a trace records consists of sets of events. An event is an occurrence of an action in an instance of the SQL Server database engine. Events contain attributes which are listed as data columns. The events are grouped into categories of related event classes. The following list describes the most commonly-traced events.
SQL:BatchCompleted. When a batch of Transact-SQL statements is completed, the SQL:BatchCompleted event is fired. Note that there is also an event raised when the batch is first started, but the completed event contains more useful information, such as details of the resources used during the execution of the batch.
SQL:StmtCompleted. If tracing at the SQL batch level is too coarse, it is possible to retrieve details of each individual statement contained within the batch.
RPC:Completed. The RPC:Completed event is fired when a stored procedure finishes execution. There is a traceable event when the stored procedure starts but, similar to the SQL:BatchCompleted event, the RPC:Completed event is useful as it contains details of the resources used during the execution of the stored procedure. You can see a statement-by-statement breakdown of the resources that a stored procedure uses by using the SP:StmtCompleted event.
Audit Login/Audit Logout. Details of each login and logout event that occurs during the tracing activity can be included in your traces.
Deadlock Graph. Unhandled deadlocks often lead to errors being passed to end users from applications. If your system is suffering from deadlocks, the Deadlock Graph event fires when a deadlock occurs and captures details of what caused it. The details are captured into an XML document that you can view graphically in SQL Server Profiler.
Trace Columns and Filters
Data columns contain the attributes of events. SQL Server Profiler uses data columns in the trace output to describe events that are captured when the trace runs. SQL Server Profiler has a large set of potential columns but not every event writes values to all the possible columns. For example, in the SQL:BatchStarting event, the Reads, Writes, Duration, and CPU columns are not offered because the values are not available at the time of the event. These columns are available in the SQL:BatchCompleted event.
Note: You can group the output in the SQL Server Profiler graphical grid, based on column values.
You can define which columns to capture when you create the trace and you should minimize the number of columns that you capture to help reduce the overall size of the trace. You can also organize columns into related groups by using the Organize Columns function. One of the more interesting columns is TextData. Many events do not include it by default, but the values that it contains are very useful. For example, in the RPC:Completed event, the TextData column contains the Transact-SQL statement that executed the stored procedure.
Filtering Columns
You can set filters for each of the columns that you capture in a trace. It is important to ensure that you are only capturing events of interest by using filters to limit the events. Effective use of filters helps to minimize the overall size of the captured trace, helps to avoid overwhelming the server with tracing activity, and decreases the number of events that are contained in the trace to reduce complexity during analysis. Also, smaller traces are typically faster to analyze. The filters that you configure are only used if the event writes that particular column to the trace. For example, if you set a filter for DatabaseName = AdventureWorks and capture the Deadlock Graph event, all deadlock events will be shown because the DatabaseName column is not exposed.
Text-based columns can be filtered using a LIKE operator and wildcard characters. For example, you could filter the DatabaseName column on the expression LIKE Adventure%, which would include events from all databases with a name beginning with “Adventure”.
Note: If you need to create a filter based on the database that you want to trace activity against, tracing by DatabaseID is more efficient than tracing by DatabaseName. However, trace templates that filter by DatabaseID are less portable than those that do so by the name, because a database restored on another server will typically have a different DatabaseID.
Trace Templates
SQL Server Profiler offers predefined trace templates that enable you to easily configure the event classes you need for specific types of traces. The Standard template, for example, helps you to create a generic trace for recording logins, logouts, batches completed, and connection information. You can use this template to run traces without modification or as a starting point for additional ones with different event configurations.
You can use SQL Server Profiler to create your own templates that define the event classes and data columns to include in traces. After you define and save the template, you can run a trace that records the data for each event class you selected. You can also modify the existing templates to create custom versions for specific purposes.
You can create templates in SQL Server Profiler by creating a trace using the graphical interface—starting and stopping the trace at least once—and then saving the trace as a template.
Viewing Trace Files
When you write trace output to trace files, the results are not very easy to read and are somewhat difficult to parse in an application. SQL Server provides two options to make it easy to work with the contents of trace files:
You can open trace files in SQL Server Profiler, and then filter or group the output for analysis. SQL Server Profiler is especially useful for working with small trace files.
You can query the trace file by using the fn_trace_gettable system function. This approach enables you to use Transact-SQL to view the contents of trace files, and can be used to import data from trace files into tables for reporting and analysis. Importing the files into a table is particularly useful when you need to analyze large volumes of trace data, as you can add an index to the table to improve the speed of common queries against the captured data. You can then use Transact-SQL queries to analyze and filter the data.
The following Transact-SQL code retrieves all columns from a trace file:
Using fn_trace_gettable
SELECT *
FROM fn_trace_gettable('D:\Traces\adworks.trc', default);
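To support the import-and-index approach described above, a sketch like the following loads a trace file into a table and indexes a commonly queried column; the table and index names are hypothetical.
Importing a Trace File into a Table (sketch)
-- Import the trace file into a table for analysis
SELECT *
INTO dbo.TraceAnalysis
FROM fn_trace_gettable('D:\Traces\adworks.trc', default);
-- Index a frequently filtered column to speed up analysis queries
CREATE NONCLUSTERED INDEX IX_TraceAnalysis_Duration
ON dbo.TraceAnalysis (Duration);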
Demonstration: Using SQL Server Profiler
In this demonstration, you will see how to:
Use SQL Server Profiler to create a trace.
Run a trace and view the results.
Demonstration Steps
Use SQL Server Profiler to Create a Trace
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod08 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio, and connect to the MIA-SQL database engine instance using Windows authentication.
4. On the Tools menu, click SQL Server Profiler.
5. When SQL Server Profiler starts, connect to the MIA-SQL database engine instance using Windows authentication.
6. In the Trace Properties dialog box, on the General tab, set the following properties:
o Trace name: Demo Trace
o Use the template: TSQL
o Save to file: D:\Demofiles\Mod08\Demo Trace.trc
7. In the Trace Properties dialog box, on the Events Selection tab, note the events and columns that were automatically selected from the TSQL template.
8. Select Show all events, and under TSQL, select SQL:StmtCompleted. Then clear Show all events so that only the selected events, including the one you just selected, are shown.
9. Select Show all columns and select the Duration column for the SQL:StmtCompleted event.
10. Click the column header for the Database Name column, and in the Edit Filter dialog box, expand Like, enter AdventureWorks, and click OK. Then clear Show all columns so that only the selected columns are shown.
Run a Trace and View the Results
1. In the Trace Properties dialog box, click Run.
2. Observe the trace as it shows some background activity.
3. Switch back to SQL Server Management Studio, open the Query.sql script file in the D:\Demofiles\Mod08 folder, and click Execute. This script runs a query in the AdventureWorks database twenty times.
4. While the query is executing, switch back to SQL Server Profiler and observe the activity.
5. When the query has finished, in SQL Server Profiler, on the File menu, click Stop Trace.
6. In the trace, select any of the SQL:StmtCompleted events and note that the Transact-SQL code is shown in the bottom pane.
7. Keep SQL Server Profiler and SQL Server Management Studio open for the next demonstration.
SQL Trace
SQL Server Profiler is a graphical tool and it is important to realize that, depending on the options you choose, it can have significant performance impacts on the server being traced. SQL Trace is a library of system stored procedures that you can use for tracing when minimizing the performance impact of tracing is necessary. Internally, SQL Server Profiler uses the programming interface provided by SQL Trace.
SQL Trace is a feature running within the database engine to create and run traces using system stored procedures. Internally, SQL Server Profiler makes calls to the SQL Trace facility in SQL Server when SQL Server Profiler needs to create or manage traces.
Traces run in the process of the SQL Server database engine and can write events to a file or to an application using SQL Server Management Objects (SMO). The information you learned about how events, columns, and filtering work in SQL Server Profiler is directly applicable to how the same objects work within SQL Trace.
At first, implementing traces can appear difficult, as you need to make many stored procedure calls to define and run a trace. However, you can use the graphical interface in SQL Server Profiler to create a trace, and then script it for use with SQL Trace. Typically, very few changes need to be made to the SQL Trace script files that SQL Server Profiler creates—this generally involves such things as the path to output files.
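A scripted trace follows the pattern shown below. This is a minimal sketch using the documented SQL Trace procedures; the output file path is hypothetical, and a real exported script sets many more events and columns. In the SQL Trace numbering, event 12 is SQL:BatchCompleted, column 1 is TextData, and column 35 is DatabaseName.
Creating a Trace with SQL Trace (sketch)
-- Create a trace that writes to a file (SQL Server appends the .trc extension)
DECLARE @TraceID int;
EXEC sp_trace_create @TraceID OUTPUT, 0, N'D:\Traces\MyTrace', 50, NULL;
-- Capture the TextData column for the SQL:BatchCompleted event
EXEC sp_trace_setevent @TraceID, 12, 1, 1;
-- Filter to databases whose names begin with 'Adventure' (operator 6 is LIKE)
EXEC sp_trace_setfilter @TraceID, 35, 0, 6, N'Adventure%';
-- Start the trace; later, status 0 stops it and status 2 closes and deletes it
EXEC sp_trace_setstatus @TraceID, 1;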
SQL Trace vs SQL Server Profiler
It is important to understand the differences between SQL Trace and SQL Server Profiler and in which scenarios to use each tool.
You need to use system stored procedures to configure SQL Trace, whereas SQL Server Profiler provides a graphical interface for configuration and for controlling the tracing activity.
SQL Trace runs directly inside the database engine, whereas SQL Server Profiler runs on a client system (or on the server) and communicates to the database engine by using the SQL Trace procedures.
SQL Trace can write events to files or to applications using SMO, whereas SQL Server Profiler can write events to files or database tables.
SQL Trace is useful for long-running, performance-critical traces, or for very large traces that would significantly impact the performance of the target system. SQL Server Profiler is more commonly used for debugging on test systems, performing short-term analysis, or capturing small traces.
Note: The Server processes trace data option in SQL Server Profiler is not the same as scripting a trace and starting it directly through stored procedures. The option creates two traces—one that directly writes to a file and a second to send the events through SMO to SQL Server Profiler.
Traces do not automatically restart after the server instance restarts. Therefore, if a trace needs to run constantly, you should script the trace, write a stored procedure to launch it, and then mark the stored procedure as a startup procedure.
Demonstration: Using SQL Trace
In this demonstration, you will see how to:
Export a trace definition.
Configure and run a trace.
Demonstration Steps
Export a Trace Definition
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Profiler, with the Demo Trace still open, on the File menu, point to Export, point to Script Trace Definition, and click For SQL Server 2005 - 2014.
3. Save the exported trace script as DemoTrace.sql in the D:\Demofiles\Mod08 folder, and click OK when notified that the script has been saved.
4. Keep SQL Server Profiler open for the next demonstration.
Configure and Run a Trace
1. In SQL Server Management Studio, open the DemoTrace.sql script file in the D:\Demofiles\Mod08 folder (which you exported from SQL Server Profiler in the previous task).
2. View the Transact-SQL code, and in the line that begins exec @rc = sp_trace_create, replace InsertFileNameHere with D:\Demofiles\Mod08\SQLTraceDemo.
3. Click Execute to start the trace, and note the TraceID value that is returned.
4. Switch back to the Query.sql tab and click Execute to run the workload query.
5. When the query has finished, open StopTrace.sql in the D:\Demofiles\Mod08 folder.
6. In the StopTrace.sql script, under the comment Stop the trace, if necessary, modify the DECLARE statement to specify the TraceID value for the trace you started previously.
7. Select the code under the comment Stop the trace and click Execute. Setting the trace status to 0 stops the trace, and setting it to 2 closes the file and deletes the trace definition on the server.
8. Select the code under the comment View the trace and click Execute. Then review the traced events.
9. Close the StopTrace.sql and DemoTrace.sql tabs without saving any changes so that only the Query.sql tab remains open.
10. Keep SQL Server Management Studio open for the next demonstration.
Lesson 2
Using Traces
After you have used SQL Server Profiler or SQL Trace to capture a trace, you can use the trace to analyze workload activity and use the results of this analysis to troubleshoot problems and optimize performance. SQL Server provides a range of ways in which you can use traces, and these are discussed in this lesson.
Lesson Objectives
After completing this lesson, you will be able to:
Replay traces.
Use the Database Engine Tuning Advisor to generate recommendations from a trace.
Combine traces with Performance Monitor logs.
Use a trace to troubleshoot concurrency issues.
Replaying Traces
You can replay traces captured with SQL Server Profiler or SQL Trace to repeat the workload on a SQL Server instance. This enables you to validate changes that you are considering making to a system, or to test a workload against new hardware, indexes, or physical layout changes. You can also use it to test whether corrections that you implement actually solve the problem.
Replaying Traces in SQL Server Profiler
For a trace to be replayed, specific event classes and columns must be present. You can ensure that you include these by using the TSQL_Replay trace template. You must also set certain options in the Replay Configuration dialog box, such as the name of the server to replay the trace on and how many threads to use. The replay does not need to be performed against the same system that the trace events were captured on, but the system must be configured in a very similar way. This particularly applies to objects such as databases and logins.
Replaying Traces in Distributed Replay
SQL Server Profiler enables you to replay traces on a single computer, which may be useful for functional testing, but does not provide the ability to test a workload under realistic, scaled-out conditions. Distributed Replay is a set of tools provided with SQL Server that enables you to replay a trace on more than one computer, resulting in a more scalable solution that better simulates mission-critical workloads. The Distributed Replay environment consists of four key components:
Administration tool. Distributed Replay provides a console tool that you can use to start, monitor, and cancel replaying.
Replay controller. The controller is a Windows service which orchestrates the actions of the replay clients.
Replay clients. The replay client or clients are computers that run a Windows service to replay the workload against an instance of SQL Server.
Target server. The target server is the instance of SQL Server upon which the clients replay the traces.
Distributed Replay uses configuration information stored in XML files on the controller, the clients, and the computer running the administration tool. Each XML file contains configuration data for the relevant component of the system.
Controller configuration – DReplayController.config
The controller configuration includes information about the level of logging to use. By default, only critical messages are logged, but you can configure the controller to also log informational and warning messages.
Client configuration – DReplayClient.config
The client configuration includes information about the name of the controller, the folders in which to save dispatch and result files, and again, the logging level to use.
Administration tool configuration – DReplay.exe.preprocess.config
The administration tool configuration includes information about whether to include system session activities in the replay, and the maximum idle time to allow during the trace.
Replay configuration – DReplay.exe.replay.config
The replay configuration includes information on the name of the target server, whether to use connection pooling, and how many threads to use per client. It also contains elements to control the number of rows and result sets to record.
Additional Reading: For more information about the configuration files, their contents, and their locations, go to Configure Distributed Replay at http://technet.microsoft.com/library/ff878359(v=sql.120).aspx.
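To illustrate how these components are driven, the following commands are a minimal sketch of running a replay from the administration tool; the computer names and paths are assumptions:

REM Prepare the captured trace on the controller for replay.
DReplay.exe preprocess -m MIA-CTRL -i "D:\Traces\Workload.trc" -d "D:\DReplayWorking"

REM Replay the preprocessed trace against the target server from two clients,
REM capturing replay activity on the clients (-o).
DReplay.exe replay -m MIA-CTRL -d "D:\DReplayWorking" -s MIA-SQL -w MIA-CLI1,MIA-CLI2 -o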
The Database Engine Tuning Advisor
You can use the Database Engine Tuning Advisor (DTA) to gain insight into the existing indexing and partitioning structure of your databases, and to get recommendations for how to improve database performance by creating appropriate indexing and partitioning structures. In addition to optimizing your indexing structure, the Database Engine Tuning Advisor can recommend new physical data structures, including partitioning. The Database Engine Tuning Advisor also offers you the ability to tune across multiple servers and limit the amount of time the tuning algorithms run. It is available as both a command-line utility—enabling you to take advantage of advanced scripting options—and a graphical utility.
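As a sketch of the command-line form, the following invocation tunes a traced workload using Windows authentication (-E); the server, database, file paths, and session name are assumptions, and -A caps the tuning time in minutes:

dta -S MIA-SQL -E -D AdventureWorks ^
    -if "D:\Demofiles\Mod08\Demo Trace.trc" ^
    -s TuningDemoCLI ^
    -of "D:\Demofiles\Mod08\DTA Recommendations.sql" ^
    -A 30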
Workloads
The Database Engine Tuning Advisor utility analyzes the performance effects of workloads run against one or more databases. Typically, these workloads are obtained from traces captured by SQL Server Profiler or SQL Trace. After analyzing the effects of a workload on your databases, Database Engine Tuning Advisor provides recommendations for improving system performance. A workload is a set of Transact-SQL statements that executes against databases you want to tune. The workload source can be a file containing Transact-SQL statements, a trace file generated by SQL Server Profiler, or a table of trace information, again generated by SQL Server Profiler. You can also use SQL Server Management Studio (SSMS) to launch Database Engine Tuning Advisor to analyze an individual statement.
Recommendations
The recommendations that the Database Engine Tuning Advisor produces include suggested changes to the database such as new indexes, indexes that should be dropped, and, depending on the tuning options you set, partitioning recommendations. The recommendations appear as a set of Transact-SQL statements that will implement the suggested changes. You can view the Transact-SQL and save it for later review and execution, or you can choose to implement the recommended changes immediately.
Note: Be careful of applying changes to a database without detailed consideration, especially in production environments. Also, ensure that any analysis you perform is centered on appropriately sized workloads so that recommendations are not based on partial information.
The Database Engine Tuning Advisor provides a rich set of configuration options that enable you to customize the analysis it performs and how the optimization recommendations are made.
Running the Database Engine Tuning Advisor on sizeable workloads can take a long time, particularly on systems that also have large numbers of database objects. You can configure Database Engine Tuning Advisor to limit the time it will spend on analysis and to return the results it has obtained up to the time limit.
You can also configure which types of recommendations should be made, along with whether or not you wish to see recommendations that involve dropping existing objects.
Exploratory Analysis
You can also use the Database Engine Tuning Advisor to perform exploratory analysis, which involves a combination of manual and tool-assisted tuning. To perform exploratory analysis with the Database Engine Tuning Advisor, use the user-specified configuration feature. This enables you to specify the tuning configurations for existing and hypothetical physical design structures, such as indexes, indexed views, and partitioning. The benefit of specifying hypothetical structures is that you can evaluate their effects on your databases without incurring the overhead of implementing them first. You can create an XML configuration file to specify a hypothetical configuration, and then use it for analysis. You can perform the analysis, either in isolation or relative to the current configuration. You can also perform this type of analysis by using a command-line interface.
Tracing SQL Server Activity
Demonstration: Using the Database Engine Tuning Advisor
In this demonstration, you will see how to:
Configure a tuning session.
Generate recommendations.
Validate recommendations.
Demonstration Steps
Configure a Tuning Session
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Profiler, on the Tools menu, click Database Engine Tuning Advisor.
3. When the Database Engine Tuning Advisor starts, connect to the MIA-SQL database engine instance using Windows authentication.
4. In the Database Engine Tuning Advisor, in the Session name box, type Tuning Demo.
5. Under Workload, ensure that File is selected, and browse to the D:\Demofiles\Mod08\Demo Trace.trc file (which is where you saved the trace from SQL Server Profiler in the previous demonstration).
6. In the Database for workload analysis drop-down list, select AdventureWorks.
7. In the Select databases and tables to tune list, select AdventureWorks and note that 71 of 71 tables are selected. Then in the drop-down list of tables, select only the following tables:
o Product
o ProductCategory
o ProductSubcategory
o SalesOrderDetail
o SalesOrderHeader
8. On the Tuning Options tab, review the default options for recommendations. Then click Advanced Options, select Generate online recommendations where possible, and click OK.
Generate Recommendations
1. In the Database Engine Tuning Advisor, on the toolbar, click Start Analysis.
2. When the analysis is complete, on the Recommendations tab, review the index recommendations that the DTA has generated.
3. On the Reports tab, view the tuning summary and in the Select report list, select Statement detail report.
4. View the report and compare the Current Statement Cost value to the Recommended Statement Cost value (cost is an internal value that the SQL Server query processor uses to quantify the work required to process a query).
5. On the Actions menu, click Save Recommendations, save the recommendations script as DTA Recommendations.sql in the D:\Demofiles\Mod08 folder, and click OK when notified that the file was saved.
6. Close the Database Engine Tuning Advisor.
Validate Recommendations
1. In SQL Server Management Studio, highlight the SELECT statement in the Query.sql script, taking care not to highlight the GO 20 statement that follows it.
2. On the Query menu, click Display Estimated Execution Plan. This displays a breakdown of the tasks that the query processor will perform to process the query.
3. Note that the query processor suggests that there is at least one missing index that would improve query performance. Then hold the mouse over the SELECT icon at the left side of the query plan diagram and view the Estimated Subtree Cost value that is displayed in a tooltip.
4. In SQL Server Management Studio, open the DTA Recommendations.sql script you saved from the Database Engine Tuning Advisor in the D:\Demofiles\Mod08 folder. Then click Execute to implement the recommended indexes.
5. Switch back to the Query.sql tab, and highlight the SELECT statement, taking care once again not to highlight the GO 20 statement that follows it.
6. On the Query menu, click Display Estimated Execution Plan.
7. Note that the query processor no longer suggests that there is a missing index. Then hold the mouse over the SELECT icon at the left side of the query plan diagram and view the Estimated Subtree Cost value that is displayed in a tooltip.
Combining Traces with Performance Monitor Logs
SQL Server Profiler enables you to record details of events occurring within the SQL Server database engine, while Performance Monitor enables you to record system-wide performance metrics that measure resource utilization and server activity. By combining these two sources of information, you can gain a holistic view of how SQL Server workloads affect system resource utilization, and use this information to troubleshoot performance issues or plan server capacity.
Using SQL Server Profiler, you can open a data collector set log, choose the counters you want to correlate with a trace, and display the selected performance counters alongside the trace in the SQL Server Profiler graphical user interface (GUI). When you select an event in the trace window, a vertical red bar in the System Monitor data window pane of SQL Server Profiler indicates the performance log data that correlates with the selected trace event.
To correlate a trace with performance counters, open a trace file or table that contains the StartTime and EndTime data columns, and then in SQL Server Profiler, on the File menu, click Import Performance Data. You can then open a performance log, and select the System Monitor objects and counters that you want to correlate with the trace.
Tracing SQL Server Activity
Demonstration: Correlating a Trace with Performance Data
In this demonstration, you will see how to:
Correlate a trace with performance data.
Demonstration Steps
Correlate a Trace with Performance Data
1. Ensure that you have completed the previous demonstration in this module.
2. In the D:\Demofiles\Mod08 folder, double-click AWCounters.blg to open the log file in Performance Monitor.
3. View the line chart, noting the times shown along the bottom axis. Then close Performance Monitor.
4. In SQL Server Profiler, open the AWTrace.trc file in the D:\Demofiles\Mod08 folder and view the traced events, noting that the event times in the StartTime column match those in the Performance Monitor log.
5. In SQL Server Profiler, on the File menu, click Import Performance Data. Then open the AWCounters.blg log file in the D:\Demofiles\Mod08 folder.
6. In the Performance Counters Limit Dialog dialog box, select \\MIA-SQL (which selects all of the counters in the log file) and click OK.
7. Click the line chart at approximately the 3:15:35 PM marker. Note that the event in the trace that occurred at that time is selected, and the Transact-SQL statement that was executed is shown in the bottom pane.
8. Keep SQL Server Management Studio open for the next demonstration.
Troubleshooting Concurrency Issues
Many of the issues that arise in day-to-day database operations relate to concurrency issues as multiple users and applications attempt to access and modify the same data simultaneously. SQL Server uses a robust lock-based solution to ensure that all transactions result in consistent data, but this can occasionally result in processes being blocked by one another.
Locking in SQL Server
Before a database transaction acquires a dependency on the current state of a data element, such as a row, page, or table, it must protect itself from the effects of another transaction that attempts to modify the same data. The transaction does this by requesting a lock on the data element. Locks have different modes, such as shared or exclusive. SQL Server uses shared locks for reading data to enable other users to read the same data while ensuring that no one changes it. It uses exclusive locks during updates to ensure that no one else can read or write the potentially inconsistent data. The locking mode defines the level of dependency the transaction has on the data.
No transaction can be granted a lock that conflicts with the mode of a lock that has already been granted to another transaction on that same data. If a transaction requests a lock mode that does conflict, the database engine blocks the requesting transaction until the first lock is released. For UPDATE operations, SQL Server always holds locks until the end of the transaction. For SELECT operations, it holds the lock protecting the row for a period that depends upon the transaction isolation level setting. All locks still held when a transaction completes are released, regardless of whether the transaction commits or rolls back.
Locking is crucial for transaction processing and is normal behavior for the system. Problems only occur when locks are held for too long and other transactions are blocked in a similar way because of the locks being held.
Blocking
Blocking occurs when one process must wait for a resource that another process has locked. Blocking is a normal occurrence for systems and is not an issue, except when it is excessive.
You can monitor blocking in real time by using SQL Server Activity Monitor and by running Dynamic Management Views. Additionally, you can use the TSQL_Locks trace template in SQL Server Profiler to capture details of lock-related events in order to troubleshoot excessive blocking.
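For example, the following query (a minimal sketch using documented dynamic management view columns) lists requests that are currently blocked, the session blocking them, and what they are waiting for:

-- List blocked requests and the sessions blocking them.
SELECT r.session_id, r.blocking_session_id, r.wait_type, r.wait_time, r.wait_resource
FROM sys.dm_exec_requests AS r
WHERE r.blocking_session_id <> 0;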
Deadlocks
Deadlock errors are a special type of blocking error where SQL Server needs to intervene; otherwise the locks would never be released. The most common form of deadlock occurs when two transactions have locks on separate objects and each transaction requests a lock on the other transaction’s object. For example:
Task 1 holds a shared lock on row 1.
Task 2 holds a shared lock on row 2.
Task 1 requests an exclusive lock on row 2, but it cannot be granted until Task 2 releases the shared lock.
Task 2 requests an exclusive lock on row 1, but it cannot be granted until Task 1 releases the shared lock.
Each task must wait for the other to release the lock, which will never happen.
A deadlock can occur when several long-running transactions execute concurrently in the same database. A deadlock can also happen as a result of the order in which the optimizer processes a complex query, such as a join.
How SQL Server Ends a Deadlock
SQL Server ends a deadlock by automatically terminating one of the transactions. SQL Server does the following:
Chooses a deadlock victim. SQL Server gives priority to the process that has the highest deadlock priority. If both processes have the same deadlock priority, SQL Server rolls back the transaction that is the least costly to roll back.
Rolls back the transaction of the deadlock victim.
Notifies the deadlock victim’s application (with message number 1205).
Allows the other transaction to continue.
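You can influence which process is chosen as the victim by setting the session's deadlock priority, as in the following minimal sketch using the SET DEADLOCK_PRIORITY option:

-- A session running low-priority work can volunteer to be chosen as the victim.
SET DEADLOCK_PRIORITY LOW;
-- ... run the transaction ...
SET DEADLOCK_PRIORITY NORMAL;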
Note: In a multi-user environment, each client should check for message number 1205, which indicates that the transaction was rolled back. If message 1205 is found, the application should reconnect and try the transaction again.
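One way to implement this retry logic in Transact-SQL is sketched below; the retry count and the placeholder for the transaction body are assumptions:

DECLARE @retries int = 3;
WHILE @retries > 0
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION;
        -- ... data modifications that may be chosen as a deadlock victim ...
        COMMIT TRANSACTION;
        BREAK; -- success: exit the retry loop
    END TRY
    BEGIN CATCH
        IF XACT_STATE() <> 0 ROLLBACK TRANSACTION;
        IF ERROR_NUMBER() = 1205 AND @retries > 1
            SET @retries -= 1; -- deadlock victim (message 1205): try again
        ELSE
        BEGIN
            THROW; -- any other error, or out of retries: rethrow
        END
    END CATCH
END;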
Monitoring Deadlocks
Deadlocks are normally not logged. The only indication that a deadlock has occurred is that an error message is returned to the client that has been selected as the victim. You can monitor deadlocks by using SQL Server Profiler and/or SQL Trace. There are several deadlock events available, including the Deadlock Graph event.
Demonstration: Troubleshooting Deadlocks
In this demonstration, you will see how to:
Capture a trace based on the TSQL_Locks template.
View a deadlock graph.
Demonstration Steps
Capture a Trace Based on the TSQL_Locks Template
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Profiler, on the File menu, click New Trace. Then connect to the MIA-SQL database engine instance using Windows authentication.
3. In the Trace Properties dialog box, in the Trace name box, type Locks. Then in the Use the template drop-down list, select TSQL_Locks.
4. On the Events Selection tab, view the events that are selected in this template. Then click the column header for the Database Name column, and in the Edit Filter dialog box, expand Like, enter AdventureWorks, and click OK.
5. Click Run to start the trace.
6. While the trace is running, in the D:\Demofiles\Mod08 folder, run Deadlock.cmd. This will open two command prompt windows.
7. When both command prompt windows close, in SQL Server Profiler, on the File menu, click Stop Trace.
View a Deadlock Graph
1. In SQL Server Profiler, in the Locks trace, find the Deadlock graph event and select it.
2. In the bottom pane, view the deadlock graph, which shows that a deadlock occurred and one process was selected as the victim.
3. On the File menu, point to Export, point to Extract SQL Server Events, and click Extract Deadlock Events. Then save the deadlock events as Deadlocks in the D:\Demofiles\Mod08 folder.
4. Close SQL Server Profiler, and in SQL Server Management Studio, open the Deadlocks_1.xdl file in the D:\Demofiles\Mod08 folder. Note that you can view deadlock graph files in SQL Server Management Studio.
5. Hold the mouse pointer over each of the process circles to see the statements that they were executing as a tooltip. Note that the deadlock occurred because one process used a transaction to update records in the Production.Product table and then the Sales.SpecialOffer table, while the other process tried to update the same records in the opposite order.
6. Close SQL Server Management Studio without saving any files.
Lab: Tracing SQL Server Workload Activity
Scenario
You are a database administrator (DBA) for Adventure Works Cycles with responsibility for the InternetSales database. You need to optimize the indexes and statistics in this database to support the application workload.
Objectives
After completing this lab, you will be able to:
Capture activity using SQL Server Profiler.
Use the Database Engine Tuning Advisor.
Capture activity using SQL Trace stored procedures.
Estimated Time: 45 minutes
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
Exercise 1: Capturing a Trace in SQL Server Profiler
Scenario
You have identified a typical workload for the InternetSales application, and want to capture details of the individual events that occur during this workload so that you can use the captured trace as a basis for performance optimization.
The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. Create a SQL Server Profiler Trace
3. Capture Workload Events
Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Run Setup.cmd in the D:\Labfiles\Lab08\Starter folder as Administrator.
Task 2: Create a SQL Server Profiler Trace
1. Use SQL Server Profiler to create a trace on the MIA-SQL database engine instance.
2. Name the trace InternetSales Workload, and configure it to save the results to D:\Labfiles\Lab08\Starter\InternetSales Workload.trc.
3. Base the trace on the TSQL trace template, and add the SQL:StmtCompleted event.
4. Add the Duration column for the SQL:StmtCompleted event.
5. Filter the trace to include events where the DatabaseName column is like InternetSales.
Task 3: Capture Workload Events
1. Run the trace you defined in the previous task.
2. While the trace is running, run the Workload.ps1 PowerShell script in the D:\Labfiles\Lab08\Starter folder. This starts a workload in the InternetSales database that lasts for approximately three minutes.
3. Observe the activity in SQL Server Profiler while the workload is running.
4. Stop the trace when the workload has finished.
5. View the details of some of the SQL:StmtCompleted events that were captured to identify Transact-SQL statements in the workload.
Results: After this exercise, you should have captured a workload using SQL Server Profiler.
Exercise 2: Generating Database Tuning Recommendations
Scenario
You want to use the trace to identify any changes that could be made in the InternetSales database to improve performance.
The main tasks for this exercise are as follows:
1. Create a Tuning Session
2. Generate Recommendations
3. Validate the Recommendations
Task 1: Create a Tuning Session
1. Start the Database Engine Tuning Advisor and connect to the MIA-SQL database engine instance.
2. Create a tuning session named Tune InternetSales based on the trace file you captured previously. The session should analyze the workload in the InternetSales database and tune all tables in this database apart from dbo.CurrencyRate.
3. Configure the advanced tuning options to generate online recommendations where possible.
Task 2: Generate Recommendations
1. In the Database Engine Tuning Advisor, start the analysis of the traced workload.
2. When analysis is complete, review the recommendations and save them as a Transact-SQL script in the D:\Labfiles\Lab08\Starter folder.
Task 3: Validate the Recommendations
1. In the Database Engine Tuning Advisor, view the Event frequency report to identify the most frequently used query in the workload.
2. View the Statement detail report and compare the Current Statement Cost and Recommended Statement Cost values for the query you identified as being most frequently used.
3. Copy the statement string for the most frequently used statement to the clipboard, and then use SQL Server Management Studio to create a new query with a connection to the InternetSales database in the MIA-SQL instance and paste the copied statement.
4. Display the estimated execution plan for the query, and note any warnings about missing indexes and the estimated subtree cost for the root SELECT statement.
5. Open the recommendations script you saved from the Database Engine Tuning Advisor and run it to create the recommended indexes and statistics.
6. Return to the most frequently executed query, display the estimated execution plan again, and note any differences.
Results: After this exercise, you should have analyzed the trace in the Database Engine Tuning Advisor, and reviewed the recommendations.
Exercise 3: Using SQL Trace
Scenario
You have noticed that, when SQL Server Profiler is running on the production server, performance is reduced. You want to capture the performance metrics and reduce the impact on the server. In this exercise, you will capture activity using the SQL Trace stored procedures to lessen the impact on the server.
The main tasks for this exercise are as follows:
1. Export a SQL Trace Script
2. Run the Trace
3. View the Trace Results
Task 1: Export a SQL Trace Script
1. In SQL Server Profiler, export the InternetSales Workload trace you created previously as a trace definition for SQL Server 2005-2014.
2. Save the exported trace script in the D:\Labfiles\Lab08\Starter folder.
Task 2: Run the Trace
1. Open the exported trace script in SQL Server Management Studio, and modify the script to save the trace results as D:\Labfiles\Lab08\Starter\InternetSales.
2. Run the script to start the trace, and note the returned TraceID value.
3. While the trace is running, run the Workload.ps1 PowerShell script in the D:\Labfiles\Lab08\Starter folder. This starts a workload in the InternetSales database that lasts for approximately three minutes.
4. While the workload is running, in SQL Server Management Studio, create a new query that uses the following Transact-SQL code (replacing TraceID with the TraceID value for your trace). Do not execute the code until the workload has finished:
DECLARE @TraceID int = TraceID;
EXEC sp_trace_setstatus @TraceID, 0;
EXEC sp_trace_setstatus @TraceID, 2;
GO
5. When the workload finishes, run the Transact-SQL query you created in the previous step to stop the trace.
Task 3: View the Trace Results
1. In SQL Server Management Studio, use the following Transact-SQL code to retrieve the text data, start time, and duration for each SQL:StmtCompleted event in the trace file:
SELECT TextData, StartTime, Duration
FROM fn_trace_gettable('D:\Labfiles\Lab08\Starter\InternetSales.trc', default)
WHERE EventClass = 41;
Results: After this exercise, you should have captured a trace using SQL Trace.
Question: How do you think the Database Engine Tuning Advisor recommendations you implemented will affect overall performance of the workload?
Question: The workload you traced was defined to reflect common reporting functionality and includes only SELECT queries. Alongside this workload, the InternetSales database must process INSERT and UPDATE operations submitted by the e-commerce site. How will the recommendations you implemented affect these workloads?
Module Review and Takeaways
In this module, you learned how to use SQL Server Profiler and SQL Trace to trace SQL Server activity, and how to use traces to replay workloads, optimize databases, analyze performance, and troubleshoot concurrency issues. When tracing activity in SQL Server, consider the following best practices:
Use SQL Server Profiler to perform short traces for debugging and other purposes.
Use SQL Trace for large and long-running traces.
Use SQL Server Profiler to define traces and script them for SQL Trace.
Import trace data into a database table for advanced analysis.
Use Database Engine Tuning Advisor to analyze the database based on the overall workload you want to optimize, rather than focusing on individual queries.
Review Question(s)
Question: In what situations would you use SQL Trace rather than SQL Server Profiler?
Question: How would you test a workload after configuration changes?
Module 9
Managing SQL Server Security
Contents:
Module Overview 9-1
Lesson 1: Introduction to SQL Server Security 9-2
Lesson 2: Managing Server-Level Security 9-9
Lesson 3: Managing Database-Level Principals 9-18
Lesson 4: Managing Database Permissions 9-28
Lab: Managing SQL Server Security 9-36
Module Review and Takeaways 9-44
Module Overview
Appropriate protection of data is vital in any application, and understanding how to implement security at the server and individual database level is a key requirement for a database administrator (DBA). In this module, you will learn about the core concepts on which the SQL Server security architecture is based, and how to manage security at the server and database levels.
Objectives
After completing this module, you will be able to:
Describe core security concepts in the SQL Server security architecture.
Manage server-level security.
Manage database-level security principals.
Manage database permissions.
Lesson 1
Introduction to SQL Server Security
SQL Server is designed to be a secure data platform, and includes a range of security features. The security architecture in SQL Server is based on well-established principles, with which you should be familiar before configuring individual security settings.
Lesson Objectives
After completing this lesson, you will be able to:
Describe core security concepts.
Describe SQL Server securables.
Describe SQL Server principals.
Describe the SQL Server permissions system.
Security Concepts
Before learning how to configure security in SQL Server, it may be useful to explore some basic security concepts, and identify how they relate to SQL Server. Security is a major feature of all enterprise software systems, and many of the concepts relating to security are similar across multiple systems.
Securables, Principals, and Permissions
Security is generally concerned with allowing someone or something to access a resource and to perform one or more actions on it. For example, a network administrator might need to enable users to view the contents of a folder. In more general terms, the resource on which the action is to be performed is referred to as a securable, the “someone or something” that needs to perform the action is referred to as a principal, and the configuration that allows the action to be performed is referred to as a permission. In the previous example, the folder is the securable, the user is the principal, and the administrator must grant the user the “read” permission on the folder.
Security Hierarchies
Security architectures are often hierarchical, primarily to simplify management of permissions. In a hierarchical security architecture, securables can contain other securables (for example, a folder can contain files), and principals can contain other principals (for example, users can be added to a group). Permissions are usually inherited, both by hierarchies of securables (for example, granting “read” permission on a folder implicitly grants “read” permission on the files it contains), and by hierarchies of principals (for example, granting “read” permission to a group implicitly grants read permission to all users who are members of that group). Generally, inherited permissions can be explicitly overridden at different hierarchy levels to fine-tune access.
This hierarchical arrangement simplifies permission management in a number of ways:
Fewer individual permissions need to be granted, reducing the risk of misconfiguration. You can set the general permissions that are required at the highest level in the hierarchy, and only apply explicit overriding permissions further down the hierarchy to handle exceptional cases.
After the permissions have been set, they can be controlled through group membership. This makes it easier to manage permissions in environments where new users arrive and existing users leave or change roles.
When planning a security solution, consider the following best practices:
Provide each principal with only the permissions they actually need.
Use securable inheritance to minimize the number of implicit permissions that must be set in order to enable the required level of access.
Use principal containers such as groups to create a layer of abstraction between principals and permissions to access securables. Then use membership of these groups to control access to resources via the permissions you have defined. Changes in personnel should not require changes to permissions.
SQL Server Securables
SQL Server includes securables at multiple levels of a hierarchical architecture.
Server-Level Securables
In SQL Server, securables at the top level of the SQL Server instance are referred to as server-level objects. Some examples of server-level objects include:
Endpoints. These are network addressable interfaces that are used to connect to SQL Server.
Logins. These are security principals by which users and applications access SQL Server.
Server roles. These are security principals that can be used to contain multiple Logins in order to simplify permissions management.
Credentials. These can be used by SQL Server to access external resources, such as Microsoft Azure storage.
Note: Some securables are also principals. For example, a login is a principal that enables access to the SQL Server instance; but it is also a securable because there are actions that can be performed on it (such as disabling, or deleting it) that require permissions.
Database-Level Securables
In a database, there are objects that must be secured. These include:
Certificates. These are cryptographic keys that can be used to encrypt data or authenticate connections.
Users. These are principals that enable access to a database and the objects it contains.
Schemas. These are namespaces that are used to organize database objects.
Schemas define namespaces for database objects. Every database contains a schema named dbo, and database developers can create additional schemas to keep related objects together and simplify permissions management. Schemas contain core database objects, including:
Tables. These are the data structures that contain application data.
Views. These are pre-defined Transact-SQL queries that are used as a layer of abstraction over tables.
Indexes. These are structures that are used to improve query performance.
Stored procedures and functions. These are pre-defined Transact-SQL statements that are used to implement business logic in a database application.
SQL Server Principals
SQL Server uses different kinds of principals at the server level and database level.
Server-Level Principals
In order to gain access to a SQL Server instance, a user must use a server-level principal called a login. Logins can then be added to server-level roles.
Logins
SQL Server supports two kinds of login:
SQL Server logins. These are logins with security credentials that are defined in the master database. SQL Server authenticates these logins by verifying a password.
Windows logins. These reference security accounts that are managed by Windows, such as Windows users or groups. SQL Server does not authenticate these logins, but rather trusts Windows to verify their identity. For this reason, connections made to SQL Server using a Windows login are often referred to as trusted connections.
Note: It is also possible for logins to be created from certificates and keys, but this is an advanced topic beyond the scope of this course.
When using Windows logins, it is important to note that a Windows login in SQL Server can reference an individual Windows user, a domain global group defined in Active Directory, or a local group (either a domain local group in Active Directory or a local group defined on the Windows server hosting SQL Server). A Windows login that references a group implicitly enables all Windows users in that group to access the SQL Server instance.
Using Windows group logins can greatly simplify ongoing administration. Windows users are added to global groups based on their role within the organization, and global groups are added to local groups based on specific SQL Server access requirements. As new users arrive, and existing users change roles or leave the organization, access to SQL Server is controlled through group membership in Active Directory, and no additional changes need to be made within SQL Server. You should also note that basing logins on Windows groups can make testing and troubleshooting permissions issues within SQL Server more complex, but generally the long-term manageability benefits make this a worthwhile tradeoff.
Server-Level Roles
Server-level roles are security principals to which you can add logins in order to simplify permissions management. SQL Server 2014 supports two kinds of server-level role:
Fixed server-level roles. These are system-defined roles that are automatically granted the required permissions to perform specific server-level management tasks.
User-defined server roles. These are roles that DBAs can create in order to define custom server-level management groups.
Adding a login to a server role implicitly grants that login all of the permissions assigned to the role.
Database-Level Principals
Having access to the server does not (in itself) mean that a login has any access to user databases on the server. To enable logins to access a database, a mapping must exist between the login and a database user in that database. You can add database users to database-level roles to simplify permissions management.
Database Users
A database user is a database-level principal that is usually mapped to a login at the server level. Database users often have the same name as the logins that they are mapped to, but this is not required. You can even map logins to different user names in each database. You can think of a database user as being the identity that a login uses when accessing resources in a particular database.
Note: Mapping login names to database user names is considered a best practice.
Contained Database Users
There is a single exception to the rule that database users are mapped to server-level logins: contained databases. Contained databases are designed to be isolated from the SQL Server instance on which they are hosted, so users in a contained database do not map to logins. Instead, contained database users are either mapped to Windows accounts in the same way as Windows logins at the server level, or they are configured with a password and authenticated by SQL Server at the database level.
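A minimal sketch of both kinds of contained user is shown below; the database and user names are assumptions, and the database is presumed to have been created with CONTAINMENT = PARTIAL (with contained database authentication enabled at the instance level):

USE SalesDB;
GO
-- A contained user authenticated by the database itself (no server-level login).
CREATE USER SalesAppUser WITH PASSWORD = 'Str0ng!Pa$$w0rd';
GO
-- A contained user mapped directly to a Windows account.
CREATE USER [ADVENTUREWORKS\WebService];
GO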
Database-Level Roles
You can add database users (in both regular and contained databases) to database roles in order to simplify permissions management. SQL Server supports two kinds of database-level role:
Fixed database-level roles. These are system-defined roles that encapsulate permissions required to perform common tasks.
User-defined database roles. These are custom roles that DBAs can create to group users with similar access requirements.
Note: In an environment where all logins are based on Windows groups, database users based on these group logins behave in much the same way as roles, so you may choose to grant permissions directly to database users based on Windows groups rather than create user-defined roles. However, user-defined roles can be useful to combine users with similar permissions requirements when you are using a mixture of individual Windows logins and SQL Server logins.
Application Roles
An application role is a database-level principal that an application can activate in order to change its security context within the database. When an application role is active, SQL Server enforces the permissions that are applied to the application role and not those of the current database user.
SQL Server Permissions
SQL Server uses permissions to enable principals to perform actions. In SQL Server, some permissions relate to general actions that can be performed by executing a statement, and these are known as statement permissions. For example, at the server level, the CREATE DATABASE permission allows a principal to create a database. When managing statement permissions, you assign a permission to a principal.
Other permissions are based on actions that relate to securables, and these are known as object permissions. The specific permissions that relate to a securable depend on the actions that can be performed on it. For example, an endpoint has a CONNECT permission, tables and views have a SELECT permission, and a stored procedure has an EXECUTE permission. When managing object permissions, you assign a permission on a securable to a principal.
GRANT
A user who has not been granted a permission is unable to perform the action related to it. For example, users cannot SELECT data from tables if they have not been granted permission. In SQL Server, permissions are granted by using the GRANT statement. You can grant multiple permissions to multiple principals in a single GRANT statement, as shown in the following pseudo-code:
Using the GRANT Statement
-- Statement permission
GRANT permission, permission, …n TO principal, principal, …n;
-- Object permission
GRANT permission, permission, …n ON securable TO principal, principal, …n;
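As a concrete illustration, the following minimal sketch uses hypothetical principals and objects; the first statement grants a server-level statement permission (run in master), and the others grant object permissions at the schema and table level:

-- Statement permission at the server level.
GRANT CREATE DATABASE TO [ADVENTUREWORKS\DBAdmins];
-- Object permissions at the schema and table level.
GRANT SELECT ON SCHEMA::Sales TO SalesReaders;
GRANT SELECT, INSERT, UPDATE ON dbo.Products TO WebAppUser;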
Permissions can be granted explicitly or they can be inherited. Permission inheritance applies to principal hierarchies (in other words, if a database role is granted SELECT permission on a schema, all database users who are members of that role are implicitly granted SELECT permission on the schema); and also to securable hierarchies (for example, the database role that was granted SELECT permission at the schema level implicitly receives SELECT permission on all objects within the schema that support that permission).
Inherited permissions are cumulative. For example, if a database role has been granted SELECT permission on a schema, and a user who is a member of that database role has explicitly been granted UPDATE permission on a table within the schema, the user receives SELECT permission on the table (inherited through membership of a role that has SELECT permission on the parent schema) and UPDATE permission (directly granted to the user on the table).
WITH GRANT OPTION
When you grant permissions to a principal, you can also give them the right to re-grant the same permissions to other principals by using the WITH GRANT OPTION clause. This enables you to delegate the responsibility for managing permissions, but you should use it with caution, as you then lose control of the security of that securable.
DENY
An exception can be made to cumulative inherited permissions by using the DENY statement. A DENY statement explicitly denies a specific permission on a securable to a principal, and overrides any other explicit or inherited permissions that the principal may have been granted.
The format of a DENY statement mirrors that of a GRANT statement, as shown in the following pseudo-code:
Using the DENY Statement
-- Statement permission
DENY permission, permission, …n TO principal, principal, …n;
-- Object permission
DENY permission, permission, …n ON securable TO principal, principal, …n;
For example, a user who is a member of a database role that has SELECT permission on a schema automatically has SELECT permission on all tables and views in that schema. If the schema contains a table to which you do not want the user to have access, you can DENY SELECT permission on the table to the user. Even though the user has inherited SELECT permission through membership of the database role (which in turn has inherited SELECT permission from the parent schema), the user will not be able to query the table. Note that DENY permissions are inherited, and cannot be overridden further down the hierarchy. For example, if you deny a user SELECT permission on a schema, granting SELECT permission on an individual table in the schema will not allow the user to query that table.
Note: You should use DENY sparingly. A need to DENY many permissions tends to indicate a potential problem with your security design.
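The scenario described above might look like the following minimal sketch; the role, user, schema, and table names are hypothetical:

-- The role can read everything in the Sales schema...
GRANT SELECT ON SCHEMA::Sales TO SalesReaders;
-- ...but one sensitive table is explicitly denied to one member of the role.
DENY SELECT ON OBJECT::Sales.CreditCard TO [ADVENTUREWORKS\DanDrayton];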
REVOKE
To remove a previously granted or denied permission, you can use the REVOKE statement. Note that REVOKE removes only a specified explicit permission; it cannot be used to override inherited permissions. When you revoke a permission, you revoke it from a principal, as shown in the following pseudo-code:
Using the REVOKE Statement
-- Statement permission
REVOKE permission, permission, …n FROM principal, principal, …n;
-- Object permission
REVOKE permission, permission, …n ON securable FROM principal, principal, …n;
If you have granted a principal a permission and included the WITH GRANT OPTION clause (so the principal can grant the same permission to others), you can use the GRANT OPTION FOR clause of the REVOKE statement to revoke the ability to grant the permission without revoking the permission itself. You can also use the REVOKE statement with a CASCADE clause to revoke the permission from others who have been granted the permission by the specified principal.
Effective Permissions
The effective permissions for a given principal on a specific securable are the actual permissions that SQL Server will enforce based on:
Explicit permissions granted directly to the principal on the securable.
Permissions inherited from membership of a role.
Permissions inherited from a parent securable.
You can view the effective permissions for SQL Server logins and individual Windows logins in two ways:
In SSMS, view the Permissions tab of the properties dialog box for the securable, or the Securables tab of the properties dialog for the principal. Here you can select the combination of principal and securable you want to check and view the Effective tab of the Permissions pane.
In Transact-SQL, use the EXECUTE AS statement to impersonate a login (at the server level) or a user (at the database level), and query the sys.fn_my_permissions system function, specifying the securable for which you want to view effective permissions.
The following code sample shows how to impersonate the login ADVENTUREWORKS\RosieReeves and view the effective permissions on a table named dbo.Products:
Viewing Effective Permissions for a Database User
EXECUTE AS LOGIN = 'ADVENTUREWORKS\RosieReeves';
SELECT * FROM sys.fn_my_permissions('dbo.Products', 'Object');
REVERT;
Windows group logins (and users based on them) cannot be impersonated; so you cannot use either of these techniques to view effective permissions for a login based on a Windows group. Each Windows user can log in and query the sys.fn_my_permissions system function for themselves, but since Windows users can be added to more than one Windows group, the results for each Windows user may vary depending on the groups to which they belong.
Lesson 2
Managing Server-Level Security
Security implementation for SQL Server usually begins at the server level, where users are authenticated based on logins and organized into server-level roles to make it easier to manage permissions.
Lesson Objectives
After completing this lesson, you will be able to:
Describe common application security models for SQL Server.
Configure the SQL Server authentication mode.
Manage logins.
Manage server-level roles.
Manage server-level permissions.
Application Security Models
At the server level, you must ensure that only authenticated users can access the SQL Server instance. However, before you start creating logins, you must consider the application security architecture that the database must support.
The Trusted Server Application Security Model
The trusted server application model is commonly used in large-scale enterprise applications, web sites, and Internet services. In this model, a user accesses an application that stores data in a database. However, the application uses its own identity to access the database, not that of the user. Typically, the application authenticates individual users based on credentials they present when logging in, or based on their Windows credentials. However, the database server (in this example, a SQL Server instance) does not need to contain logins for the individual users of the application, web site, or service. Only the application itself needs a login, as the application is trusted to have authenticated and authorized its own users before accessing data on their behalf.
The Impersonation/Delegation Security Model
An alternative approach that is commonly used in business applications is for the application to use the user’s credentials to access the database server. In this model, the application may be running in the context of its own service account identity, but when connecting to the database server it impersonates the user for whom it is accessing data. This model therefore requires that the SQL Server instance contains a login for each individual application user, with potentially different permission requirements for each user.
Note: In technical terms, an application that retrieves data based on the user’s credentials uses impersonation to access a SQL Server instance on the same server, and delegation to access SQL Server on a remote server. Delegation requires substantial configuration, a discussion of which is beyond the scope of this course.
SQL Server Authentication Options
Determining the application security model helps you plan the number of distinct logins you must create and manage. After you have done this, you must determine how these logins will be authenticated. As previously discussed, SQL Server supports Windows logins and SQL Server logins. To determine which of these login types can connect to the SQL Server instance, you must select one of the following authentication modes:
Windows authentication mode. In Windows authentication, only users with Windows logins are permitted to connect to SQL Server. Windows authentication was formerly known as integrated authentication. SQL Server does not actually authenticate Windows logins, but instead allows access based on an access token that has been issued by Windows when the user logged in.
SQL Server and Windows authentication mode. In SQL Server and Windows authentication, users with Windows logins can access SQL Server, and users with SQL Server logins, which are directly authenticated by SQL Server, can access the instance. This mode is often called mixed authentication.
You can specify which type of authentication to use when you install SQL Server, and you can also change the mode after installation, although the change requires a restart of the instance to take effect.
Note: You generally change this setting by using SQL Server Management Studio (SSMS). However, the setting is stored in a single registry key, so you can also configure it by using a group policy within Windows.
If you install SQL Server using mixed mode authentication, setup enables a SQL Server login called sa. It is important to create a complex password for this login because it has administrative rights over the database server. If you install using Windows authentication mode, changing to mixed authentication later does not enable the sa login.
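If the sa login is not required, a common hardening step is to disable it and, optionally, rename it, as in the following minimal sketch (the new name is an assumption):

-- Disable the built-in sa login, then rename it to reduce its visibility.
ALTER LOGIN sa DISABLE;
ALTER LOGIN sa WITH NAME = [sa_disabled];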
Protocols for Authentication
Windows authentication is typically performed by using the Kerberos protocol. The Kerberos protocol is supported with SQL Server over the TCP/IP, named pipes, and shared memory network protocols.
The SQL Server Native Client (SNAC) provides encrypted authentication for SQL Server logins. If SQL Server does not have a Secure Sockets Layer (SSL) certificate installed by an administrator, SQL Server generates and self-signs a certificate for encrypting authentication traffic.
Note: Encrypted authentication only applies to clients running the SQL Server 2005 version of SNAC or later. If an earlier client that does not understand encrypted authentication tries to connect, by default SQL Server does not allow the connection. If this is a concern, you can use SQL Server Configuration Manager to disallow unencrypted authentication from down-level clients.
Managing Logins
You can create logins by using either Transact-SQL code or the GUI in SSMS. Because creating logins is a common operation, you may find a Transact-SQL script faster, more repeatable, and more accurate.
Creating Logins
To create a login by using SSMS, expand the Security node for the relevant server instance, right-click Logins, and then click New Login. Complete the details in the Login - New dialog box to configure the login that you require. Alternatively, you can create logins by using the CREATE LOGIN Transact-SQL statement.
In this example, a login named ADVENTUREWORKS\SalesReps is created for the Windows group of the same name. The default database for the user will be salesdb. If you do not specify this option, the default database is set to master.
Creating a Windows Login
CREATE LOGIN [ADVENTUREWORKS\SalesReps] FROM WINDOWS
WITH DEFAULT_DATABASE = [salesdb];
Note: Windows user and group names must be enclosed in square brackets because they contain a backslash character.
You create SQL Server logins in the same way. There are, however, additional arguments that are only relevant to SQL Server logins (for example, the PASSWORD argument). The following example shows how to create a login named DanDrayton and assign a password of Pa$$w0rd:
Creating a SQL Server Login
CREATE LOGIN DanDrayton WITH PASSWORD = 'Pa$$w0rd', DEFAULT_DATABASE = [salesdb];
SQL Server Login Security Policy
In a Windows-based environment, administrators can enable policies for Windows users that enforce password complexity and expiration. SQL Server can enforce similar restrictions for SQL Server logins.
When you create a SQL Server login, you can specify the following options to control how the password policy is enforced (a combined example follows the list):
MUST_CHANGE: SQL Server will prompt the user to change their password the next time they log on. You must ensure that whatever client application the user will use to connect to SQL Server supports this.
CHECK_POLICY = {ON | OFF}: Setting this value ON enforces the password complexity policy for this user. The default value for this setting is ON.
CHECK_EXPIRATION = {ON | OFF}: Setting this value ON enables password expiration, forcing the user to change their password at regular intervals. The default value for this setting is OFF.
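As an illustration, the following sketch creates a login that combines all three options; the login name TempAnalyst is an assumption for this example. Note that MUST_CHANGE requires CHECK_EXPIRATION and CHECK_POLICY to be ON:
Creating a Login with Password Policy Options
CREATE LOGIN TempAnalyst
WITH PASSWORD = 'Pa$$w0rd' MUST_CHANGE,
CHECK_POLICY = ON,
CHECK_EXPIRATION = ON;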
You can configure policy settings for a SQL Server login in SSMS, or in the CREATE LOGIN or ALTER LOGIN statement. The following code example modifies the DanDrayton login created earlier to explicitly disable policy checking:
Setting the Password Policy for a Login
ALTER LOGIN DanDrayton WITH CHECK_POLICY = OFF, CHECK_EXPIRATION = OFF;
The full application of account policy is not always desirable. For example, some applications use fixed credentials to connect to the server. Often, these applications do not support regular changing of login passwords. In these cases, it is common to disable password expiration for those logins. You can reset passwords by using SSMS or the ALTER LOGIN Transact-SQL statement (note that OLD_PASSWORD follows PASSWORD without a comma):
Changing a Password
ALTER LOGIN DanDrayton WITH PASSWORD = 'NewPa$$w0rd' OLD_PASSWORD = 'Pa$$w0rd';
Disabling and Deleting Logins
If a login will not be used for a period of time, you can disable it and then re-enable it later. If there is any chance that a login will be needed again in the future, it is better to disable it than to drop it.
Disabling a Login
The following code shows how to use the ALTER LOGIN statement to disable a login:
ALTER LOGIN DanDrayton DISABLE;
You can remove logins from a server by using the DROP LOGIN statement or SSMS. If a user is currently logged in, you cannot drop their login without first ending their session. In this example, a login is dropped from the server instance:
Dropping a Login
DROP LOGIN DanDrayton;
Managing Server-Level Roles
You can easily configure server rights for SQL Server logins and Windows logins by using server-level roles. All logins are automatically members of the public server-level role, which is a special role that represents all server-level principals. By default, the public role has CONNECT and VIEW ANY DATABASE permissions. You can change or assign additional permissions to the public role, but in general it is better to leave the default permissions and control access to server-level securables by adding logins to fixed server-level roles or user-defined server roles.
Fixed Server-Level Roles
Server-level roles provide an easy way to delegate administrative privileges to logins. Each server-level principal is automatically a member of the public server-level role, and you can add server-level principals to fixed server-level roles to confer administrative rights. The following list describes the fixed server-level roles and the permissions granted to role members:
sysadmin: This role grants permissions to perform any action, including adding members to the sysadmin role. For this reason, you should limit membership of this role as much as possible.
serveradmin: This role grants permissions to configure server-wide settings and to shut down the server.
securityadmin: This role grants permissions to manage logins, including the ability to create and drop logins and to assign permissions to logins. Members of this role can grant and deny server-level permissions to other users, and grant and deny database-level permissions to other users on any database to which they have access. Because of the ability to assign permissions to other users, membership of this role should be limited as much as possible.
processadmin: This role grants permissions to terminate processes running on the SQL Server instance.
setupadmin: This role grants permissions to add and remove linked servers and manage replication.
bulkadmin: This role grants permissions to execute the BULK INSERT statement.
diskadmin: This role grants permissions to manage disk files.
dbcreator: This role grants permissions to create, alter, and drop databases.
public: Every login is a member of public, and this cannot be changed. This role does not initially grant any administrative permissions. You can add permissions to this role; however, this is not advisable, because the permissions would be granted to every login.
It is very important that you follow the principle of least privilege when assigning roles to security principals. For example, imagine a user who needs permissions to shut down the server, end processes, manage disk files, and create, alter, and drop databases. You might consider it more straightforward to add the user's login to the sysadmin role rather than to the four roles that would be required to grant the necessary permissions. However, adding the login to the sysadmin role would confer excessive permissions, including the ability to add other members to the sysadmin role. Following the principle of least privilege prevents the awarding of unintended rights, and helps to keep servers secure.
Note: Unlike in earlier versions of SQL Server, the BUILTIN\administrators and Local System (NT AUTHORITY\SYSTEM) accounts are not automatically added as members of the sysadmin role, although you can add them manually if required. Note that this does not affect the ability of local administrators to access the database engine when it is in single user mode.
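For example, a least-privilege assignment for the user described above might add the login to exactly the four roles that cover the required tasks; the login name is illustrative:
Assigning Least-Privilege Role Memberships
ALTER SERVER ROLE serveradmin ADD MEMBER [ADVENTUREWORKS\ServerOps]; -- shut down the server
ALTER SERVER ROLE processadmin ADD MEMBER [ADVENTUREWORKS\ServerOps]; -- end processes
ALTER SERVER ROLE diskadmin ADD MEMBER [ADVENTUREWORKS\ServerOps]; -- manage disk files
ALTER SERVER ROLE dbcreator ADD MEMBER [ADVENTUREWORKS\ServerOps]; -- create, alter, and drop databases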
User-Defined Server Roles
If a fixed server-level role does not exactly match your security requirements, you should consider using user-defined server roles instead. You can create user-defined server roles with Transact-SQL or SQL Server Management Studio and then grant rights to your role. User-defined roles are very useful for defining the specific privileges required by your users and administrators, and for limiting those privileges to only those that are absolutely necessary. You can create user-defined server-level roles by using SSMS or the CREATE SERVER ROLE statement, as shown in the following example:
Creating a User-Defined Server Role
CREATE SERVER ROLE app_admin;
Managing Server-Level Role Membership
You can add logins to, and remove them from, server-level roles by using SSMS or the ALTER SERVER ROLE Transact-SQL statement.
In the following code example, the ADVENTUREWORKS\WebAdmins login is added to the app_admin server-level role:
Adding a Login to a Server-Level Role
ALTER SERVER ROLE app_admin ADD MEMBER [ADVENTUREWORKS\WebAdmins];
To remove a role member, use the ALTER SERVER ROLE statement with the DROP MEMBER clause. To view membership of fixed server-level roles and user-defined server roles, you can query the sys.server_role_members system view.
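For example, the following query joins sys.server_role_members to sys.server_principals to list each role with its members:
Viewing Server-Level Role Membership
SELECT r.name AS role_name, m.name AS member_name
FROM sys.server_role_members AS rm
JOIN sys.server_principals AS r ON rm.role_principal_id = r.principal_id
JOIN sys.server_principals AS m ON rm.member_principal_id = m.principal_id;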
Managing Server-Level Permissions
Permissions at the server level generally relate to administrative actions, such as creating databases, altering logins, or shutting down the server. Additionally, some fundamental permissions, such as CONNECT SQL (which allows a user to connect to the SQL Server database engine), are managed at this level.
Granting, Revoking, and Denying Server-Level Permissions
As with all permissions in SQL Server, you grant server-level permissions by using the GRANT statement, deny a specific permission by using the DENY statement, and revoke a previous GRANT or DENY by using the REVOKE statement. You can also manage server-level permissions in SSMS by editing the properties of a login or user-defined server role (you cannot change permissions for fixed server roles).
The following code example grants the ALTER ANY LOGIN permission to the app_admin user-defined server role:
Granting a Server-Level Permission
GRANT ALTER ANY LOGIN TO app_admin;
In general, you should manage server-level permissions based on membership of fixed server-level roles, or by granting permissions to user-defined server roles rather than directly to logins. The only exception to this recommendation is that, if all logins are based on Windows groups, you may choose to grant custom permissions directly to logins, because they offer the same manageability benefits as user-defined server-level roles.
Demonstration: Managing Server-Level Security
In this demonstration, you will see how to:
Set the authentication mode.
Create logins.
Manage server-level roles.
Manage server-level permissions.
Demonstration Steps
Set the Authentication Mode
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod09 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio, and connect to the MIA-SQL database engine using Windows authentication.
4. In Object Explorer, right-click the MIA-SQL instance and click Properties.
5. In the Server Properties – MIA-SQL dialog box, on the Security page, verify that SQL Server and Windows Authentication mode is selected. Then click Cancel.
Create Logins
1. In Object Explorer, expand Security, and expand Logins to view the logins that are currently defined on this server instance.
2. Right-click Logins and click New Login. Then, in the Login – New dialog box, next to the Login name box, click Search.
3. In the Select User or Group dialog box, click Object Types. Then, in the Object Types dialog box, select only Users and Groups and click OK.
4. In the Select User or Group dialog box, click Locations. Then, in the Locations dialog box, expand Entire Directory, select adventureworks.msft, and click OK.
5. In the Select User, Service Account, or Group dialog box, click Advanced. Then click Find Now. This produces a list of all users and groups in the Active Directory domain.
6. In the list of domain objects, select HumanResources_Users (this is a domain local group that contains multiple global groups, each of which in turn contains users), and then click OK.
7. In the Select User, Service Account, or Group dialog box, ensure that HumanResources_Users is listed, and click OK.
8. In the Login – New dialog box, in the Default database drop-down list, select AdventureWorks. Then click OK and verify that the ADVENTUREWORKS\HumanResources_Users login is added to the Logins folder in Object Explorer.
9. Right-click Logins and click New Login. Then, in the Login – New dialog box, enter the name Payroll_Application and select SQL Server authentication.
10. Enter and confirm the password Pa$$w0rd, and then clear the Enforce password expiration check box (which automatically clears the User must change password at next login check box).
11. In the Default database drop-down list, select AdventureWorks. Then click OK and verify that the Payroll_Application login is added to the Logins folder in Object Explorer.
12. Open the CreateLogins.sql script file in the D:\DemoFiles\Mod09 folder and review the code it contains, which creates a Windows login for the ADVENTUREWORKS\AnthonyFrizzell user and the ADVENTUREWORKS\Database_Managers local group, and a SQL Server login named Web_Application.
13. Click Execute. Then, when the script has completed successfully, refresh the Logins folder in Object Explorer and verify that the logins have been created.
Manage Server-Level Roles
1. In Object Explorer, expand Server Roles and view the server roles that are defined on this instance.
2. Right-click the serveradmin fixed server-level role and click Properties. Then, in the Server Role Properties – serveradmin dialog box, click Add.
3. In the Select Server Login or Role dialog box, click Browse, and in the Browse for Objects dialog box, select [ADVENTUREWORKS\Database_Managers] and click OK. Then, in the Select Server Login or Role dialog box, click OK.
4. In the Server Role Properties – serveradmin dialog box, ensure that [ADVENTUREWORKS\Database_Managers] is listed and click OK.
5. Open the ServerRoles.sql script file in the D:\DemoFiles\Mod09 folder and review the code it contains, which creates a user-defined server role named AW_securitymanager and adds the ADVENTUREWORKS\AnthonyFrizzell user to the new role.
6. Click Execute. Then, when the script has completed successfully, refresh the Server Roles folder in Object Explorer and verify that the role has been created.
7. Right-click the AW_securitymanager role and click Properties, and verify that ADVENTUREWORKS\AnthonyFrizzell is listed as a member. Then click Cancel.
Manage Server-Level Permissions
1. Open the ServerPermissions.sql script file in the D:\DemoFiles\Mod09 folder and review the code it contains, which grants ALTER ANY LOGIN permission to the AW_securitymanager server role.
2. Click Execute. Then, when the script has completed successfully, in Object Explorer, right-click the AW_securitymanager role and click Properties.
3. In the Server Role Properties - AW_securitymanager dialog box, on the General tab, view the selected securables. Then click Cancel.
4. In the Logins folder, right-click the ADVENTUREWORKS\AnthonyFrizzell login and click Properties.
5. In the Login Properties - ADVENTUREWORKS\AnthonyFrizzell dialog box, click the Securables tab. Then, above the Securables list, click Search, select All objects of the types and click OK, and select Logins and click OK.
6. In the Login Properties - ADVENTUREWORKS\AnthonyFrizzell dialog box, on the Securables tab, in the list of logins, select Payroll_Application.
7. In the Permissions for Payroll_Application list, on the Explicit tab, note that no explicit permissions on this login have been granted to ADVENTUREWORKS\AnthonyFrizzell. Then click the Effective tab and note that the ALTER permission has been inherited through membership of the AW_securitymanager role.
8. In the Login Properties - ADVENTUREWORKS\AnthonyFrizzell dialog box, click Cancel.
9. Leave SQL Server Management Studio open for the next demonstration.
Lesson 3
Managing Database-Level Principals
After creating a login, you must give it access to at least one database before it can be used to work with data. Generally, you only need to enable logins to access the databases that they need to work with. You can do this by creating a database user for the login in each database that it must access. In this lesson, you will see how to create and manage database-level principals, including database users and database roles.
Lesson Objectives
After completing this lesson, you will be able to:
Manage database users.
Manage database owner (dbo) and guest access.
Manage database roles.
Manage application roles.
Manage users in contained databases.
Managing Database Users
A login cannot access a database to which it has not been granted access. You grant database access to a login by creating a database user for it. You can create database users by using SSMS or Transact-SQL statements. To create a new database user in SSMS, expand the relevant database, expand the Security node, right-click the Users node, and then click New User. Complete the details in the Database User - New dialog box to configure the user you require. You can also create database users by using the CREATE USER statement.
Creating Database Users
USE salesdb;
CREATE USER SalesReps FOR LOGIN [ADVENTUREWORKS\SalesReps] WITH DEFAULT_SCHEMA = Sales;
CREATE USER DanDrayton FOR LOGIN DanDrayton;
CREATE USER WebUser FOR LOGIN [ADVENTUREWORKS\WebAppSvcAcct];
Note: The names of Windows logins must be enclosed in square brackets because they contain a backslash character.
Note that the first example includes a default schema. Schemas are namespaces used to organize objects in the database. If no default schema is specified, the user's default schema will be the built-in dbo schema. Note that in the third example, the username is different from the login with which it is associated.
You can remove users from a database by using the DROP USER statement or SSMS. However, you cannot drop a database user who owns any securable object (for example, tables or views).
Managing Mismatched Security Identifiers
When you create a SQL Server login, it is allocated both a name and a security identifier (SID). When you then create a database user for the login, both the name and the SID of the login are recorded in the database (you can see them in the sys.database_principals catalog view). If the database is then backed up and restored onto another server, the database user is still present in the database, but there may be no login on the server that matches it. If you then create a new login with the same name, mapping it to the database user will not work, because the new login has a different SID from the original login. To resolve the issue, you need to update the database user to link it to the new login on the server by using the ALTER USER statement with the WITH LOGIN clause.
Resolving Mismatched SIDs
ALTER USER DanDrayton WITH LOGIN = DanDrayton;
This solves the issue, but if you later restore the database onto the same or a different server, the problem will arise again. A better way of dealing with it is to avoid the problem altogether by specifying the original login's SID in the WITH SID clause when you create the login.
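A minimal sketch of this approach follows; the SID value shown is illustrative and must be copied from the original server:
Creating a Login with an Explicit SID
-- On the original server, retrieve the SID of the login
SELECT name, sid FROM sys.server_principals WHERE name = 'DanDrayton';
-- On the new server, re-create the login with the same SID
CREATE LOGIN DanDrayton
WITH PASSWORD = 'Pa$$w0rd',
SID = 0x241C11948AEEB749B0D22646DB1A19F2; -- value copied from the original server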
Managing dbo and guest Access
While it is generally true that a login cannot access a database without having been explicitly granted access through the creation of a database user, there are two exceptions. Each SQL Server database includes two special database users: dbo and guest.
dbo User
The dbo user is a special user who has permissions to perform all activities in the database. Any member of the sysadmin fixed server role (including the sa login when mixed mode authentication is used) who uses a database is mapped to the special database user called dbo. You cannot delete the dbo database user, and it is always present in every database.
Database Ownership
Like other objects in SQL Server, databases also have owners, and the database owner is mapped to the dbo user. The following example shows how you can change the owner of a database by using the ALTER AUTHORIZATION statement:
Changing the Database Owner
ALTER AUTHORIZATION ON DATABASE::salesdb TO [ADVENTUREWORKS\Database_Managers];
Any object created by a member of the sysadmin fixed server role will automatically have dbo as its owner. Owners of objects have full access to the objects and do not require explicit permissions before they can perform operations on those objects.
guest User
The guest user account enables logins that are not mapped to a database user in a particular database to gain access to that database. Login accounts assume the identity of the guest user when the following conditions are met:
The login has access to SQL Server, but does not have access to the database through its own database user mapping.
The guest account has been enabled.
You can add the guest account to a database to enable anyone with a valid SQL Server login to access it. The guest user is automatically a member of the public role. (Roles are discussed later in this lesson.) A guest user accesses a database in the following way:
SQL Server checks to see whether the login that is trying to access the database is mapped to a database user in that database. If it is, SQL Server grants the login access to the database as that database user.
If the login is not mapped to a database user, SQL Server then checks to see whether the guest database user is enabled. If it is, the login is granted access to the database as guest. If the guest account is not enabled, SQL Server denies access to the database for that login.
You cannot drop the guest user from a database, but you can prevent it from accessing the database by using the REVOKE CONNECT statement. Conversely, you can enable the guest account by using the GRANT CONNECT statement.
Enabling and Disabling the guest Account
REVOKE CONNECT FROM guest;
GRANT CONNECT TO guest;
Note: By default, the guest user is enabled in the master, msdb, and tempdb databases. You should not try to revoke the guest access in these databases.
Managing Database-Level Roles
Database roles enable you to group together database users that have the same resource access requirements or who require the same rights. Managing database-level roles is similar to managing server-level roles.
Fixed Database-Level Roles
Each database includes built-in fixed database-level roles with pre-defined rights that enable you to assign privileges for common scenarios. The following list describes the built-in fixed database-level roles:
db_owner: Members of this role have comprehensive rights over the database, equivalent to those of the database owner. This includes the right to fully manage the database and also to drop it, so you must use this role with caution.
db_securityadmin: Members of this role can manage permissions, role membership, and database-wide security settings.
db_accessadmin: Members of this role can manage access to the database for Windows logins and SQL Server logins.
db_backupoperator: Members of this role can back up the database. Note that this role does not grant the right to restore the database.
db_ddladmin: Members of this role can run any Data Definition Language (DDL) Transact-SQL command in the database. DDL is the portion of the Transact-SQL language that deals with creating, altering, and dropping database and SQL Server objects.
db_datawriter: Members of this role can change (INSERT, UPDATE, and DELETE) data in the database.
db_datareader: Members of this role can read data from all database tables.
db_denydatawriter: Members of this role cannot change (INSERT, UPDATE, or DELETE) data in the database.
db_denydatareader: Members of this role cannot read data from any database tables.
User-Defined Database Roles
You can also create user-defined database roles to enable you to set finer-grained permissions that you cannot achieve by using the fixed database-level roles.
You can create user-defined database-level roles by using SSMS or the CREATE ROLE statement, as shown in the following example:
Creating a User-Defined Database Role
CREATE ROLE product_reader;
Managing Database-Level Role Membership
You can add database users to, and remove them from, database-level roles by using SSMS or the ALTER ROLE Transact-SQL statement. In the following code example, the WebApp user is added to the product_reader role:
Adding a User to a Database-Level Role
ALTER ROLE product_reader ADD MEMBER WebApp;
To remove a role member, use the ALTER ROLE statement with the DROP MEMBER clause. To view membership of fixed and user-defined database roles, you can query the sys.database_role_members system view.
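For example, the following query lists each database role with its members:
Viewing Database-Level Role Membership
SELECT r.name AS role_name, m.name AS member_name
FROM sys.database_role_members AS rm
JOIN sys.database_principals AS r ON rm.role_principal_id = r.principal_id
JOIN sys.database_principals AS m ON rm.member_principal_id = m.principal_id;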
Demonstration: Managing Database Users and Roles
In this demonstration, you will see how to:
Create database users.
Manage database-level roles.
Demonstration Steps
Create Database Users
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Management Studio, in Object Explorer, expand Databases, expand the AdventureWorks database, and expand its Security folder. Then expand the Users folder and view the users currently defined in the database.
3. Right-click Users and click New User. Then, in the Database User – New dialog box, enter the user name Web_Application, the login name Web_Application, and the default schema Sales; and click OK.
4. Open the CreateUsers.sql script file in the D:\DemoFiles\Mod09 folder and review the code it contains, which creates users for the Payroll_Application, ADVENTUREWORKS\HumanResources_Users, and ADVENTUREWORKS\AnthonyFrizzell logins.
5. Click Execute. Then, when the script has completed successfully, refresh the Users folder in Object Explorer and verify that the users have been created.
Manage Database-Level Roles
1. In Object Explorer, expand the Roles folder, and then expand the Database Roles folder and view the database roles in the database.
2. Right-click the db_datareader role and click Properties. This is a fixed database-level role.
3. In the Database Role Properties – db_datareader dialog box, click Add. In the Select Database User or Role dialog box, enter AnthonyFrizzell and click OK. Then verify that AnthonyFrizzell is listed and click OK.
4. Right-click Database Roles and click New Database Role.
5. Enter the role name hr_reader, and click Add. In the Select Database User or Role dialog box, enter HumanResources_Users; Payroll_Application and click OK. Then verify that HumanResources_Users and Payroll_Application are listed and click OK.
6. Open the DatabaseRoles.sql script file in the D:\DemoFiles\Mod09 folder and review the code it contains, which creates roles named hr_writer and web_customer, and adds HumanResources_Users to the hr_writer role and Web_Application to the web_customer role.
7. Click Execute. Then, when the script has completed successfully, refresh the Database Roles folder in Object Explorer and verify that the roles have been created.
8. Keep SQL Server Management Studio open for the next demonstration.
Managing Application Roles
An application role is a database security principal that enables an application to activate an alternative (usually elevated) security context in order to obtain permissions required for a specific operation that are not granted to the current user.
For example, a point-of-sale application used in a supermarket checkout might use a SQL Server database to record sales transactions. A checkout operator can log in, using their own credentials, and enter details for each item being purchased. To do this, the user requires INSERT permission on the sales transactions table. In some cases, after completing the transaction a customer may want to add an additional item. This requires UPDATE permission in the sales transactions table, which has not been granted to the checkout operator. In this case, a supervisor could enter a code to approve the operation, at which point the application activates an application role that has the necessary permissions. After the update has been performed, the application role can be deactivated and the security context reverts to that of the checkout operator’s user account.
Creating and Using an Application Role
To create an application role, use the CREATE APPLICATION ROLE statement, specifying a password. The following code example creates an application role named sales_supervisor:
Creating an Application Role
CREATE APPLICATION ROLE sales_supervisor WITH PASSWORD = 'Pa$$w0rd';
After you have created an application role and assigned it the required permissions, it can be activated by executing the sp_setapprole stored procedure.
The following code example shows how to activate an application role:
Activating an Application Role
EXEC sp_setapprole 'sales_supervisor', 'Pa$$w0rd';
GO
An application role remains active until the user disconnects from SQL Server or it is deactivated by using the sp_unsetapprole stored procedure. However, to use sp_unsetapprole, you must specify a cookie that was generated when the application role was activated. The following code example shows how to retrieve a cookie when activating an application role, and how to deactivate the application role when it is no longer required (note that the cookie variable must be declared first):
Activating and Deactivating an Application Role
DECLARE @cookie varbinary(8000);
EXEC sp_setapprole 'sales_supervisor', 'Pa$$w0rd', @fCreateCookie = true, @cookie = @cookie OUTPUT;
-- Perform the operation that requires application role permissions
EXEC sp_unsetapprole @cookie;
When the application has completed the operation for which the application role’s permissions are required, it can deactivate the role and revert to the current user’s security context by executing the sp_unsetapprole stored procedure.
Demonstration: Using an Application Role
In this demonstration, you will see how to:
Create an application role.
Use an application role.
Demonstration Steps
Create an Application Role
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Management Studio, under the Roles folder for the AdventureWorks database, right-click Application Roles and click New Application Role.
3. In the Application Role – New dialog box, enter the role name pay_admin, enter the default schema HumanResources, enter and confirm the password Pa$$w0rd, and click OK.
Use an Application Role
1. Open the ApplicationRole.sql script file in the D:\DemoFiles\Mod09 folder. The code in this file displays the identity of the current user and login before, during, and after the activation of the pay_admin application role.
2. Right-click anywhere in the script window, point to Connection, and click Change Connection. Then connect to the MIA-SQL database engine using SQL Server authentication as Payroll_Application with the password Pa$$w0rd.
3. Click Execute and view the results. Note that the System Identity does not change (which may be important for auditing reasons), but that the DB Identity switched to pay_admin while the application role was active.
4. Close the ApplicationRole.sql query pane, but keep SQL Server Management Studio open for the next demonstration.
Managing Users for a Contained Database
Databases in a SQL Server instance typically have dependencies on server-level resources, most commonly logins. While this relationship between databases and the server instance on which they are hosted enables a hierarchical approach to security and management, there are some scenarios where it would be useful to completely isolate a database and its management from the server on which it resides. For example:
When it is necessary to move databases between different SQL Server instances. Entities that are external to the database, such as logins, are not moved along with the database.
When a database is in development and the developer does not know which instance will ultimately host the database.
When a database that participates in an AlwaysOn availability group is mirrored on multiple server instances, and it is useful to be able to failover to a secondary instance without having to synchronize server-level logins required to access the database.
A contained database is a database that is hosted on an instance of SQL Server, but which has no dependencies on the server instance. Because there are no dependencies, you can move the database between servers, or use it in availability group scenarios, without having to consider external factors such as logins.
Note: To learn about AlwaysOn Availability Groups and other high-availability techniques, attend course 20465C: Designing a Data Solution with Microsoft SQL Server.
Characteristics of Contained Databases
Contained databases include the following characteristics that isolate them from a server instance:
Contained databases store the metadata that defines the database. This information is usually stored only in the master database, but in a contained database it is also stored in the contained database itself.
All metadata uses the same collation settings.
The database contains users and can authenticate those users without reference to SQL Server logins. Authentication can be performed by the database itself, or by trusting users that have been authenticated by Windows.
Enabling Contained Databases
To use contained databases, you must enable contained database authentication at the level of the server instance. You can do this either by using the sp_configure stored procedure to enable the contained database authentication option, or by setting the Enable Contained Databases option to True in the Server Properties window in SQL Server Management Studio.
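For example, the following statements enable the option by using sp_configure:
Enabling Contained Database Authentication
EXEC sp_configure 'contained database authentication', 1;
RECONFIGURE;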
Creating Contained Databases
After you enable contained databases for an instance, you can configure user databases on that instance as contained. You can configure database containment settings by specifying the CONTAINMENT option of the CREATE DATABASE statement or the SET CONTAINMENT option of the ALTER DATABASE statement as PARTIAL. You can also configure a contained database from the properties window of a database in SQL Server Management Studio: on the Options page, in the Containment type field, select Partial.
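The following sketch shows both approaches; the SalesAppDB database name is illustrative:
Configuring Containment with Transact-SQL
-- Create a new partially contained database
CREATE DATABASE SalesAppDB CONTAINMENT = PARTIAL;
GO
-- Convert an existing database to partial containment
ALTER DATABASE salesdb SET CONTAINMENT = PARTIAL;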
Contained Users
After creating a contained database, you can create contained users for that database. These users can be one of two types:
Users with associated password. These users are authenticated by the database.
Users that are mapped to Windows user accounts. These users exist only in the database with no associated server-level login, and do not require the user to maintain a separate password. Instead of performing its own authentication, the database trusts Windows authentication.
You can create a contained user by using the CREATE USER statement in the context of a contained database.
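For example, a minimal sketch of both types of contained user; the names are illustrative:
Creating Contained Users
-- A contained user authenticated by the database itself
CREATE USER SalesApp WITH PASSWORD = 'Pa$$w0rd';
-- A contained user mapped to a Windows account, with no server-level login
CREATE USER [ADVENTUREWORKS\AnthonyFrizzell];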
Demonstration: Using a Contained Database
In this demonstration, you will see how to:
Create a contained database.
Create contained users.
Demonstration Steps
Create a Contained Database
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Management Studio, open the ContainedDatabase.sql script file in the D:\Demofiles\Mod09 folder.
3. Select the code under the comment Enable contained databases and click Execute. This code configures the server options to enable contained databases.
4. Select and execute the code under the comment Create a contained database. This code creates a database named ContainedDB with a CONTAINMENT setting of PARTIAL.
5. In Object Explorer, under MIA-SQL, refresh the Databases folder and verify that ContainedDB is listed.
6. Right-click ContainedDB and click Properties. Then, in the Database Properties – ContainedDB dialog box, on the Options tab, note that the Containment type is set to Partial, and click Cancel.
Create Contained Users
1. In the query window, select and execute the code under the comment Create contained users. This code creates two users in the ContainedDB database: a SQL Server user with a password, and a Windows user.
2. In Object Explorer, expand the ContainedDB database, expand Security, and expand Users. Note that the two contained users you created are listed.
3. In Object Explorer, under the server-level Security folder, refresh the Logins folder. Note that there are no logins for the users you created in the contained database.
4. Right-click anywhere in the query window, point to Connection, and click Change Connection.
5. In the Connect to Database Engine dialog box, ensure MIA-SQL is selected, in the Authentication drop-down list, select SQL Server Authentication, enter the login SalesApp and the password Pa$$w0rd, and then click Options.
6. On the Connection Properties tab, in the Connect to database box, ensure <default> is selected and click Connect. An error occurs because there is no server-level login named SalesApp. In the Connect to Database Engine window, click OK.
7. In the Connect to database box, type ContainedDB. Then click Connect. This connection succeeds because the user is defined in the database.
8. Close the ContainedDatabase.sql query window, but keep SQL Server Management Studio open for the next demonstration.
Lesson 4
Managing Database Permissions
After you have enabled access to a database by creating users, and organized users into roles, you can apply permissions to control how users access data and perform tasks in the database.
The fixed database-level roles provided with SQL Server already have some pre-defined permissions, and it's possible that you may be able to implement the security you need using only membership of these roles. However, most databases have more fine-grained security requirements than the fixed database-level roles alone provide. You should endeavor to use database roles to group users and minimize the number of individual explicit permissions you need to assign in order to secure the database.
Lesson Objectives
After completing this lesson, you will be able to:
Set database-level permissions.
Use schemas to organize database objects.
Assign table and view permissions.
Assign permissions for executable code.
Manage permissions for objects with ownership chains.
Database-Level Permissions
Similarly to the server level, permissions in a database can be statement permissions or object permissions. You manage both kinds of permission by using the GRANT, DENY, and REVOKE statements, as discussed previously in this module. Statement permissions at the database level generally govern data definition language (DDL) tasks, such as creating or altering users or roles. The following code example grants the db_dev database role permission to create tables, and grants the sales_admin database role permissions to alter existing roles and users:
Statement Permissions in the Database
GRANT CREATE TABLE TO db_dev;
GRANT ALTER ANY ROLE, ALTER ANY USER TO sales_admin;
At the database level, you can configure permissions on the following securables:
Users
Database roles
Application roles
Full text catalogs
Certificates
Asymmetric keys
Symmetric keys
Schemas
Within schemas, you can configure permissions on database objects such as:
Tables
Functions
Stored procedures
Views
Indexes
Constraints
You can use permissions to allow DDL operations on specific securables. In the following code example, the sales_admin database role is granted permission to alter the sales_supervisor application role:
Granting Permissions on a Securable at the Database Level
GRANT ALTER ON APPLICATION ROLE::sales_supervisor TO sales_admin;
You can also use permissions to allow data manipulation language (DML) operations on database objects. The following example grants SELECT permission on the dbo.ProductCategory and dbo.Product tables to the product_reader database role:
Granting DML Permissions
GRANT SELECT ON OBJECT::dbo.ProductCategory TO product_reader;
GRANT SELECT ON dbo.Product TO product_reader;
Note that for database objects that belong to a schema, the object type prefix OBJECT:: can be used, but this prefix is optional.
Viewing Effective Permissions
You can view the effective permissions that a user or role has by viewing the Securables tab of the Properties dialog box for that user or role.
Schemas
Schemas are naming and security boundaries within a database that contain database objects such as:
Tables
Functions
Stored procedures
Views
Indexes
Constraints
You can create a schema by using the CREATE SCHEMA statement, as shown in the following example:
Creating a Schema
CREATE SCHEMA sales;
There are a few built-in schemas in SQL Server. The dbo and guest users have associated schemas of their own names. The sys and INFORMATION_SCHEMA schemas are reserved for system objects; you cannot drop these schemas or create objects in them. You can return a list of all the schemas in a database by querying the sys.schemas view.
Schemas and Object Name Resolution
The name of the schema forms part of the multi-part naming convention for objects. The full name of any object is built up as Server.Database.Schema.Object; for example, MIA-SQL.salesdb.dbo.product. When using Transact-SQL code within the context of a database, object names are often abbreviated to include only the schema and object names; for example, dbo.product or sales.transaction. Although it is best practice to explicitly state at least the schema and object name when referencing a database object, you can specify only the object name (for example, Product) and rely on SQL Server to resolve the name to the correct object. When you create a user, you can optionally specify a default schema for that user. When a user executes Transact-SQL code that references an unqualified object name, SQL Server first tries to resolve the object name in the user's default schema (if they have one), and if it is not found there, SQL Server tries to find it in the dbo schema.
Note: A database can potentially contain multiple objects with the same unqualified name (for example, production.product, sales.product, and dbo.product). For this reason, you should use explicit two-part schema.object names, three-part database.schema.object names, or fully qualified server.database.schema.object names when referencing objects.
Schemas and Permission Inheritance
You can control access to the objects in a schema by setting explicit permissions on the objects themselves, but schemas enable you to define permissions in a more streamlined way, because any permissions that you apply to the schema are implicitly applied to the objects it contains. For example, if you grant the SELECT permission on a schema, you implicitly grant the SELECT permission on objects that support the SELECT permission (tables and views) in the schema.
The following code example grants INSERT permission on the sales schema to the sales_writer database role. Members of this role will implicitly be granted INSERT permission on all tables and views in the sales schema:
Granting Permissions on a Schema
GRANT INSERT ON SCHEMA::sales TO sales_writer;
Table and View Permissions
The data access permissions that apply to tables and views are:
SELECT. Principals require this permission to retrieve data from the table or view by using a SELECT statement.
INSERT. Principals need this permission to add new rows to a table or view by using the INSERT statement.
UPDATE. Principals need this permission to modify data in a table or view by using the UPDATE statement.
DELETE. Principals need this permission to remove rows from a table or view by using the DELETE statement.
REFERENCES. Principals need this permission in order to create a foreign-key relationship to a table if they have no other permissions on the table.
Note: In addition to these DML-related permissions, tables and views have DDL and administrative permissions, such as ALTER, CONTROL, TAKE OWNERSHIP, and VIEW DEFINITION.
You can view the effective permissions that selected users and roles have on a specific object by viewing the Permissions page in the Properties dialog box for that object in SSMS. Alternatively, you can view the effective permissions on selected objects that a specific database principal has by viewing the Securables tab of the Properties dialog box for that principal.
Column-Level Permissions
In addition to assigning permissions at table or view level, you can also allocate column-level permissions. This provides a more granular level of security for data in your database.
You do not need to execute separate GRANT or DENY statements for every column where you wish to assign permissions. Where a set of columns needs to be controlled in the same way, you can provide a list of columns in a single GRANT or DENY statement.
Using the GRANT and DENY Statements for Columns
GRANT SELECT ON production.product (Name, Price) TO web_customer;
GO
DENY SELECT ON production.product (Cost) TO web_customer;
GO
If you execute a DENY statement at table level for a user, and then execute a GRANT statement at column level, the user can still access the columns to which you grant access. However, if you then execute the table-level DENY statement again, the user is denied all permissions on the table, including on the columns to which they previously had access.
Executable Code Permissions
In addition to providing you with control over who accesses data in your database or the objects in your server, SQL Server enables you to control which users can execute code. Appropriate security control of code execution is an important aspect of your security architecture.
Stored Procedures
By default, users cannot execute stored procedures that other users create unless you grant them the EXECUTE permission on the stored procedure. In addition, they may also need permissions to access the objects that the stored procedure uses. You will discover more about this issue later in the lesson. In the following example, the web_customer role is granted EXECUTE permission on the sales.insert_order stored procedure:
Granting Execute Permissions on Stored Procedures
GRANT EXECUTE ON sales.insert_order TO web_customer;
User-Defined Functions
You also need to assign users permissions to execute user-defined functions (UDFs). The permissions that you need to assign depend on the type of UDF you are working with.
Scalar UDFs return a single value. Users accessing these functions require EXECUTE permission on the UDF.
Table-valued UDFs (TVFs) return a table of results rather than a single value. Accessing a TVF requires SELECT permission rather than EXECUTE permission, similar to the permissions on a table.
It is uncommon to directly update a TVF. It is possible, however, to assign INSERT, UPDATE, and DELETE permissions on one form of TVF known as an inline TVF—this particular form can be updated in some cases.
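For example, granting access to a TVF uses SELECT rather than EXECUTE; the function name sales.get_order_lines is an assumption for this sketch:
Granting Access to a Table-Valued Function
GRANT SELECT ON sales.get_order_lines TO web_customer;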
In addition to these permissions, there are scenarios where you also need to assign the REFERENCES permission to users so that they can correctly execute a UDF. These scenarios include functions which:
Are used in CHECK constraints.
Calculate values for DEFAULT constraints.
Calculate values for computed columns.
The following example shows how to grant execute permissions to the web_customer role on the dbo.calculate_tax function:
Granting Execute Permissions on UDFs
GRANT EXECUTE ON dbo.calculate_tax TO web_customer;
Managed Code
Managed code is .NET Framework code that ships in assemblies. Assemblies can exist as DLL or EXE files; however, you can only load assemblies in DLL files in SQL Server by using SQL Server CLR integration. Assemblies are registered in a SQL Server database by using the CREATE ASSEMBLY statement. After you load an assembly, the procedures, functions, and other managed code objects appear as standard objects in SQL Server and the standard object permissions apply. For example, users require EXECUTE permissions to run a stored procedure, whether it originates in an assembly or in Transact-SQL code.
Permission Sets
No matter what .NET Framework code is included in an assembly, the actions the code can execute are determined by the permission set specified when creating the assembly.
The SAFE permission set strictly limits the actions that the assembly can perform and inhibits it from accessing external system resources. Code using this permission set can access the local instance of SQL Server by using a direct access path, called a context connection. The SAFE permission set is the default.
The EXTERNAL_ACCESS permission set allows the code to access local and network resources, environment variables, and the registry. EXTERNAL_ACCESS is even necessary for accessing the same SQL Server instance if a connection is made through a network interface.
The UNSAFE permission set relaxes many standard controls over code and you should avoid using it.
The EXTERNAL_ACCESS and UNSAFE permission sets require additional setup. Simply specifying the EXTERNAL_ACCESS permission set when executing the CREATE ASSEMBLY statement is not enough. You need to either flag the database as TRUSTWORTHY (which is easy, but not recommended) or create an asymmetric key from the assembly file in the master database, create a login that maps to the key, and grant the login the EXTERNAL ACCESS ASSEMBLY permission.
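A sketch of the asymmetric key approach follows; the file path, key name, and login name are illustrative assumptions:
Enabling EXTERNAL_ACCESS with an Asymmetric Key
USE master;
-- Create an asymmetric key from the signed assembly file
CREATE ASYMMETRIC KEY SalesCLRKey FROM EXECUTABLE FILE = 'D:\Assemblies\SalesCLR.dll';
-- Create a login mapped to the key and grant it the required permission
CREATE LOGIN SalesCLRLogin FROM ASYMMETRIC KEY SalesCLRKey;
GRANT EXTERNAL ACCESS ASSEMBLY TO SalesCLRLogin;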
Ownership Chains
All database objects have owners. By default, the principal_id (owner) property of a new object is NULL, and the owner of the schema automatically owns the schema-scoped objects within it; an object with a NULL principal_id property therefore inherits its ownership from the schema that contains it. The best practice is to let all objects in a schema be owned by the schema owner in this way. When an object such as a stored procedure references another object, an ownership chain is established. An unbroken ownership chain exists when each object in the chain has the same owner. When an unbroken ownership chain exists, access to the underlying objects is permitted when access is permitted to the top-level object.
Having the same owner for all objects in a schema (which itself also has an owner) simplifies permission management, but it is still important to understand how ownership chain problems can occur and how to resolve them. Ownership chaining applies to stored procedures, views, and functions. The slide shows an example of how ownership chaining applies to views or stored procedures.
1. User1 has no permissions on the table owned by User2.
2. User2 creates a view that accesses the table and grants User1 permission to access the view. Access is granted because User2 is the owner of both the top-level object (the view) and the underlying object (the table).
3. User2 then creates a view that accesses a table owned by User3. Even if User2 has permission to access the table and grants User1 permission to use the view, User1 will be denied access because of the broken chain of ownership from the top-level object (the view) to the underlying object (the table).
4. However, if User3 grants User1 permissions directly on the underlying table, User1 can then access the view that User2 created to access that table.
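The following sketch summarizes the grants involved in steps 2 through 4; the object names are illustrative and assume each user owns a schema of the same name:
Ownership Chaining Grants
-- Unbroken chain (step 2): User2 owns both the view and the table,
-- so this grant alone lets User1 query the view
GRANT SELECT ON User2.OrdersView TO User1;
-- Broken chain (steps 3 and 4): the view references a table owned by User3,
-- so User1 also needs permission on the underlying table
GRANT SELECT ON User3.Orders TO User1;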
Demonstration: Managing Permissions
In this demonstration, you will see how to:
Set permissions.
View effective permissions.
Demonstration Steps
Set Permissions
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Management Studio, open the DatabasePermissions.sql script file in the D:\Demofiles\Mod09 folder.
3. Select the code under the comment Grant schema permissions and click Execute. This code grants SELECT permission on the HumanResources schema to the hr_reader database role, and INSERT, UPDATE, and EXECUTE permission on the HumanResources schema to the hr_writer database role.
4. Select the code under the comment Grant individual object permissions and click Execute. This code grants EXECUTE permission on the dbo.uspGetEmployeeManagers stored procedure to the hr_reader database role; INSERT permission on the Sales.SalesOrderHeader and Sales.SalesOrderDetail tables to the web_customer database role; and SELECT permission on the Production.vProductAndDescription view to the web_customer database role.
5. Select the code under the comment Override inherited permissions and click Execute. This code grants INSERT and UPDATE permission on the Sales schema to the AnthonyFrizzell user; grants UPDATE permission on the HumanResources.EmployeePayHistory table to the [Payroll_Application] user; grants UPDATE permission on the SalariedFlag column in the HumanResources.Employee table to the [Payroll_Application] user; and denies SELECT on the HumanResources.EmployeePayHistory table to the AnthonyFrizzell user.
View Effective Permissions
1. In Object Explorer, under the AdventureWorks database, expand Tables, right-click HumanResources.Employee and click Properties.
2. In the Table Properties – Employee dialog box, on the Permissions tab, note that the [Payroll_Application] user has been explicitly granted Update permission.
3. With the Payroll_Application user selected, view the permissions on the Effective tab, and note that this user has SELECT permission on the table, and UPDATE permission on the SalariedFlag column. The SELECT permission has been implicitly granted through membership of the hr_reader database role, which has inherited SELECT permission from permissions on the parent schema. The UPDATE permission was granted explicitly.
4. In the Table Properties – Employee dialog box, click Cancel. Then close SQL Server Management Studio.
Lab: Managing SQL Server Security
Scenario
You are a database administrator (DBA) at Adventure Works Cycles with responsibility for managing the InternetSales database. You must implement security for this database by creating the required server-level and database-level principals and by applying the required permissions.
Objectives
After completing this lab, you will be able to:
Manage server-level security.
Manage database-level security.
Test database access.
Estimated Time: 90 minutes
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
Exercise 1: Managing Server-Level Security
Scenario
The MIA-SQL SQL Server instance will be used to host application databases, including the InternetSales database.
It is a requirement that all corporate IT support personnel can alter any login and view any database on all SQL Server instances. They should have no additional administrative privileges on the SQL Server instance. The InternetSales database must be accessible by the following users:
IT support personnel.
Sales employees in North America, Europe, and Asia.
Sales managers.
An e-commerce web application that runs as the ADVENTUREWORKS\WebApplicationSvc service account.
A marketing application that runs on a non-Windows computer.
The ADVENTUREWORKS.MSFT domain includes the following global groups:
ADVENTUREWORKS\IT_Support: Contains all IT support personnel.
ADVENTUREWORKS\Sales_Asia: Contains all sales employees in Asia.
ADVENTUREWORKS\Sales_Europe: Contains all sales employees in Europe.
ADVENTUREWORKS\Sales_NorthAmerica: Contains all sales employees in North America.
ADVENTUREWORKS\Sales_Managers: Contains all sales managers.
The domain administrator has created the following domain local groups, with the members shown:
ADVENTUREWORKS\Database_Managers:
o ADVENTUREWORKS\IT_Support
ADVENTUREWORKS\InternetSales_Users:
o ADVENTUREWORKS\Sales_Asia
o ADVENTUREWORKS\Sales_Europe
o ADVENTUREWORKS\Sales_NorthAmerica
ADVENTUREWORKS\InternetSales_Managers:
o ADVENTUREWORKS\Sales_Managers
The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. Verify the Authentication Mode
3. Create Logins
4. Manage Server-Level Roles
Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Run Setup.cmd in the D:\Labfiles\Lab09\Starter folder as Administrator.
Task 2: Verify the Authentication Mode
1. Review the exercise scenario, and determine the appropriate SQL Server authentication mode for the requirements.
2. Ensure that the authentication mode for the MIA-SQL SQL Server instance is set appropriately to support the requirements.
Task 3: Create Logins
1. Review the exercise scenario, and determine the required logins.
2. Create the required logins, using the following guidelines:
o Create the minimum number of logins required to meet the requirements.
o Use Windows logins wherever possible.
o Any SQL Server logins should use the password Pa$$w0rd and should be subject to password policy restrictions. However, their passwords should not expire and they should not be required to change the password when they next log in.
o All logins should use InternetSales as the default database.
Note: A suggested solution for this exercise is provided in the CreateLogins.sql file in the D:\Labfiles\Lab09\Solution folder.
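The statements below are a minimal sketch of the pattern the suggested solution uses, not the full solution; the SQL Server login name is an illustrative assumption:

USE master;
-- Windows logins for the domain local groups
CREATE LOGIN [ADVENTUREWORKS\Database_Managers] FROM WINDOWS
WITH DEFAULT_DATABASE = InternetSales;
CREATE LOGIN [ADVENTUREWORKS\InternetSales_Users] FROM WINDOWS
WITH DEFAULT_DATABASE = InternetSales;
-- SQL Server login for the non-Windows marketing application
CREATE LOGIN MarketingApp WITH PASSWORD = 'Pa$$w0rd',
  CHECK_POLICY = ON,        -- enforce the Windows password policy
  CHECK_EXPIRATION = OFF,   -- password never expires; no forced change at next login
  DEFAULT_DATABASE = InternetSales;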
Task 4: Manage Server-Level Roles
1. Review the exercise scenario and determine the server-level security requirements.
2. Create any required user-defined server-level roles, add logins to server-level roles, and grant appropriate permissions to meet the requirements.
Note: A suggested solution for this exercise is provided in the ServerRoles.sql file in the D:\Labfiles\Lab09\Solution folder.
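A minimal sketch of the pattern (the role name is illustrative; compare your work with the provided ServerRoles.sql):

USE master;
CREATE SERVER ROLE login_manager;
-- Allow members to alter any login and view any database, and nothing more
GRANT ALTER ANY LOGIN TO login_manager;
GRANT VIEW ANY DATABASE TO login_manager;
ALTER SERVER ROLE login_manager ADD MEMBER [ADVENTUREWORKS\Database_Managers];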
Results: After this exercise, the authentication mode for the MIA-SQL SQL Server instance should support the scenario requirements, you should have created the required logins and server-level roles, and you should have granted the required server-level permissions.
Exercise 2: Managing Database-Level Security
Scenario
The InternetSales database contains the following schemas and database objects:
dbo schema:
o System objects
Sales schema:
o SalesOrderHeader table
o SalesOrderDetail table
Products schema:
o Product table
o ProductSubcategory table
o ProductCategory table
o vProductCatalog view
o ChangeProductPrice stored procedure
Customers schema:
o Customer table
The security requirements for the database are:
IT support personnel must be able to manage security in the database.
All sales employees and managers must be able to read all data in the Sales schema.
Sales managers must be able to insert and update any data in the Sales schema.
Sales managers must be able to execute any stored procedures in the Sales schema.
All sales employees, sales managers, and the marketing application must be able to read all data in the Customers schema.
The e-commerce application must be able to read data from the Products.vProductCatalog view.
The e-commerce application must be able to insert rows into the Sales.SalesOrderHeader and Sales.SalesOrderDetail tables.
Sales managers must be able to read all data in the Products schema.
Sales managers must be able to execute any stored procedures in the Products schema.
The marketing application must be able to read any data in the Products schema.
Data in the Sales schema must only be deleted by an application that has elevated privileges based on an additional password. The elevated privileges must enable the application to read, insert, update, and delete data as well as execute stored procedures in the Sales schema.
The main tasks for this exercise are as follows:
1. Create Database Users
2. Manage Database Roles
3. Assign Permissions
Task 1: Create Database Users
1. Review the exercise scenario and determine the required database users.
2. Create the required database users in the InternetSales database. Use the following default schemas:
o IT support personnel: dbo
o Marketing application: Customers
o All other users: Sales
Note: A suggested solution for this exercise is provided in the CreateUsers.sql file in the D:\Labfiles\Lab09\Solution folder.
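A minimal sketch of the pattern, using the login names assumed in the earlier sketches:

USE InternetSales;
CREATE USER [ADVENTUREWORKS\Database_Managers]
  FOR LOGIN [ADVENTUREWORKS\Database_Managers] WITH DEFAULT_SCHEMA = dbo;
CREATE USER MarketingApp
  FOR LOGIN MarketingApp WITH DEFAULT_SCHEMA = Customers;
CREATE USER WebApplicationSvc
  FOR LOGIN [ADVENTUREWORKS\WebApplicationSvc] WITH DEFAULT_SCHEMA = Sales;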
Task 2: Manage Database Roles
1. Review the exercise scenario and determine the database-level role requirements.
2. Add users to fixed database-level roles as required.
3. Create any required user-defined database-level roles, and add appropriate users to them.
4. Create any required application roles, assigning the password Pa$$w0rd.
Note: A suggested solution for this exercise is provided in the DatabaseRoles.sql file in the D:\Labfiles\Lab09\Solution folder.
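A minimal sketch of the pattern (role names are illustrative; the full solution is in DatabaseRoles.sql):

USE InternetSales;
-- User-defined role for sales employees
CREATE ROLE sales_reader;
ALTER ROLE sales_reader ADD MEMBER [ADVENTUREWORKS\InternetSales_Users];
-- Fixed role membership so IT support can manage security in the database
ALTER ROLE db_securityadmin ADD MEMBER [ADVENTUREWORKS\Database_Managers];
-- Application role for password-elevated deletions of sales data
CREATE APPLICATION ROLE sales_admin WITH PASSWORD = 'Pa$$w0rd', DEFAULT_SCHEMA = Sales;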
Task 3: Assign Permissions
1. Review the exercise scenario and determine the required permissions.
2. Apply the required permissions, granting the minimum number of explicit permissions possible while ensuring that users have only the privileges they require.
Note: A suggested solution for this exercise is provided in the DatabasePermissions.sql file in the D:\Labfiles\Lab09\Solution folder.
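A minimal sketch of the schema-level pattern that keeps explicit grants to a minimum (names follow the earlier sketches):

USE InternetSales;
GRANT SELECT ON SCHEMA::Sales TO sales_reader;
GRANT SELECT ON SCHEMA::Customers TO sales_reader;
-- The e-commerce application needs only the view and the two order tables
GRANT SELECT ON OBJECT::Products.vProductCatalog TO WebApplicationSvc;
GRANT INSERT ON OBJECT::Sales.SalesOrderHeader TO WebApplicationSvc;
GRANT INSERT ON OBJECT::Sales.SalesOrderDetail TO WebApplicationSvc;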
Results: After this exercise, you should have created the required database users and database-level roles, and assigned appropriate permissions.
Exercise 3: Testing Database Access
Scenario
You have created the required server and database principals, and applied appropriate permissions. Now you must verify that the permissions enable users to access the data they require, but do not permit any unnecessary data access.
The main tasks for this exercise are as follows:
1. Test IT Support Permissions
2. Test Marketing Application Permissions
3. Test Web Application Permissions
4. Test Sales Employee Permissions
5. Test Sales Manager Permissions
Task 1: Test IT Support Permissions
1. Open a command prompt and enter the following command (which opens the sqlcmd utility as ADVENTUREWORKS\AnthonyFrizzell):
runas /user:adventureworks\anthonyfrizzell /noprofile sqlcmd
2. When you are prompted for a password, enter Pa$$w0rd.
3. In the SQLCMD window, enter the following commands to verify your identity:
SELECT suser_name();
GO
4. Note that SQL Server identifies Windows group logins using their individual user account, even though there is no individual login for that user. ADVENTUREWORKS\AnthonyFrizzell is a member of the ADVENTUREWORKS\IT_Support global group, which is in turn a member of the ADVENTUREWORKS\Database_Managers domain local group.
5. In the SQLCMD window, execute an ALTER LOGIN statement to change the password of the login for the marketing application. Your code should look similar to this:
ALTER LOGIN login_name WITH PASSWORD = 'NewPa$$w0rd';
GO
6. In the SQLCMD window, enter an ALTER LOGIN command to disable the login for the e-commerce web application. Your command should look like this:
ALTER LOGIN login_name DISABLE;
GO
7. Close the SQLCMD window and maximize SQL Server Management Studio.
8. In SQL Server Management Studio, view the properties of the e-commerce application login and verify that the login is disabled. Then re-enable it.
Task 2: Test Marketing Application Permissions
1. In SQL Server Management Studio, create a new query.
2. Use the EXECUTE AS Transact-SQL statement to impersonate the login for the marketing application, and use the suser_name function to verify that the connection has changed security context. Your code should look similar to this:
EXECUTE AS LOGIN = 'login_name'
GO
SELECT suser_name();
GO
3. Query the sys.fn_my_permissions function to view the effective permissions on the Customers.Customer table in the InternetSales database. Your code should look like this:
USE InternetSales;
SELECT * FROM sys.fn_my_permissions('Customers.Customer', 'object');
GO
4. Execute a SELECT statement to verify that the marketing application can query the Customers.Customer table. For example:
SELECT * FROM Customers.Customer;
5. Execute an UPDATE statement and verify that the marketing application cannot update the Customers.Customer table. For example:
UPDATE Customers.Customer SET EmailAddress = NULL WHERE CustomerID = 1;
GO
6. Execute a SELECT statement to verify that the marketing application can query the Products.Product table. For example:
SELECT * FROM Products.Product;
7. Execute a SELECT statement to verify that the marketing application cannot query the Sales.SalesOrderHeader table. For example:
SELECT * FROM Sales.SalesOrderHeader;
8. Close SQL Server Management Studio without saving any files.
Task 3: Test Web Application Permissions
1. In a command prompt window, enter the following command to run sqlcmd as ADVENTUREWORKS\WebApplicationSvc:
runas /user:adventureworks\webapplicationsvc /noprofile sqlcmd
2. When you are prompted for a password, enter Pa$$w0rd.
3. Verify that you can query the Products.vProductCatalog view. For example, execute the following query:
SELECT ProductName, ListPrice FROM Products.vProductCatalog;
GO
4. Verify that you cannot query the Products.Product table. For example, execute the following query:
SELECT * FROM Products.Product;
GO
Task 4: Test Sales Employee Permissions
1. In a command prompt window, enter the following command to run sqlcmd as ADVENTUREWORKS\DanDrayton. This user is a member of the ADVENTUREWORKS\Sales_NorthAmerica global group, which is in turn a member of the ADVENTUREWORKS\InternetSales_Users domain local group:
runas /user:adventureworks\dandrayton /noprofile sqlcmd
2. When you are prompted for a password, enter Pa$$w0rd.
3. Verify that you can query the Sales.SalesOrderHeader table. For example, execute the following query:
SELECT SalesOrderNumber, TotalDue FROM Sales.SalesOrderHeader;
GO
4. Verify that you cannot update the Sales.SalesOrderHeader table. For example, execute the following query:
UPDATE Sales.SalesOrderHeader SET ShipDate = getdate() WHERE SalesOrderID = 45024;
GO
Task 5: Test Sales Manager Permissions
1. In a command prompt window, enter the following command to run sqlcmd as ADVENTUREWORKS\DeannaBall. This user is a member of the ADVENTUREWORKS\Sales_Managers global group, which is in turn a member of the ADVENTUREWORKS\InternetSales_Managers domain local group:
runas /user:adventureworks\deannaball /noprofile sqlcmd
2. When you are prompted for a password, enter Pa$$w0rd.
3. Verify that you can query the Sales.SalesOrderHeader table. For example, execute the following query:
SELECT SalesOrderNumber, TotalDue FROM Sales.SalesOrderHeader;
GO
4. Verify that you can update the Sales.SalesOrderHeader table. For example, execute the following query:
UPDATE Sales.SalesOrderHeader SET ShipDate = getdate() WHERE SalesOrderID = 45024;
GO
5. Verify that you cannot update the Products.Product table directly. For example, execute the following query:
UPDATE Products.Product SET ListPrice = 1999.00 WHERE ProductID = 1;
GO
6. Verify that you can use the Products.ChangeProductPrice stored procedure to update the Products.Product table. For example, execute the following query:
EXEC Products.ChangeProductPrice 1, 1999.00;
GO
7. Verify that you cannot delete data from the Sales.SalesOrderDetail table. For example, execute the following query:
DELETE Sales.SalesOrderDetail WHERE SalesOrderDetailID = 37747;
GO
8. If you created an application role to enable deletions of sales data, test it by using code like this:
EXEC sp_setapprole 'app_role_name', 'Pa$$w0rd'
GO
DELETE Sales.SalesOrderDetail WHERE SalesOrderDetailID = 37747;
GO
Results: After this exercise, you should have verified effective permissions in the MIA-SQL instance and the InternetSales database.
Question: Compare your solution to the scripts provided in the D:\Labfiles\Lab09\Solution folder. What did you do differently?
Question: What sort of login would be required for a user in a Windows domain that is not trusted by the domain in which SQL Server is installed?
Module Review and Takeaways
In this module, you have learned how to implement security in a SQL Server database engine instance. When implementing security in SQL Server, consider the following best practices:
Minimize the number of SQL Server logins.
Use Windows group logins to simplify ongoing management where possible.
Disable logins rather than dropping them if there is any chance that they will be needed again.
Ensure that expiry dates are applied to logins that are created for temporary purposes.
Use fixed server-level roles to delegate server-level management responsibility, and only create user-defined server-level roles if your specific administrative delegation solution requires them.
Disable the guest user in user databases unless you specifically require guest access.
Aim to grant the minimum number of explicit permissions possible to meet the security requirements, and use membership of roles and inheritance to ensure the correct effective permissions.
Ensure every user has only the permissions they actually require.
Review Question(s)
Question: Your organization needs to track data access by individual Windows users. Does this mean you cannot base logins on Windows groups?
Module 10
Auditing Data Access and Encrypting Data
Contents:
Module Overview
Lesson 1: Auditing Data Access in SQL Server
Lesson 2: Implementing SQL Server Audit
Lesson 3: Encrypting Databases
Lab: Auditing Data Access and Encrypting Data
Module Review and Takeaways
Module Overview
When configuring security for your Microsoft® SQL Server® systems, you need to ensure that you meet any of your organization’s compliance requirements for data protection. Organizations often need to adhere to industry-specific compliance policies, which mandate auditing of all data access. To address this requirement, SQL Server provides a range of options for implementing auditing. Another common compliance requirement is the encryption of data to protect against unauthorized data access in the event that access to the database files themselves is compromised. SQL Server supports this requirement by providing transparent data encryption (TDE). This module describes the available options for auditing in SQL Server, how to use and manage the SQL Server audit feature, and how to implement encryption.
Objectives
After completing this module, you will be able to:
Describe the options for auditing data access.
Implement SQL Server audit.
Manage SQL Server audit.
Implement Transparent Data Encryption.
Lesson 1
Auditing Data Access in SQL Server
SQL Server provides a variety of tools that you can use to audit data access. In general, no single tool meets all possible auditing requirements, and a combination of features often needs to be used. In this lesson, you will learn about the auditing options available.
Lesson Objectives
After completing this lesson, you will be able to:
Describe the need for auditing.
Describe the Common Criteria Audit feature.
Use triggers for auditing.
Use SQL Trace for auditing.
Discussion: Auditing Data Access
During this discussion, you will consider the following questions:
Why is auditing required?
What methods have you used for auditing?
What are the limitations of the methods you have used?
Which standards that require auditing does your organization need to comply with?
Common Criteria Auditing
Common Criteria is an international standard, ratified by more than 20 nations in 1999, that has superseded the US C2 rating as a requirement in most standards. It is maintained by the participating countries and has been adopted by the International Organization for Standardization (ISO).
Note: SQL Server 2014 and previous versions also support the C2 audit mode; however, you should update any applications using C2 audit mode to use the Common Criteria Certification.
Enabling Common Criteria Compliance in SQL Server
SQL Server provides the common criteria compliance enabled option, which can be set by using the sp_configure system stored procedure. The option is available in the Enterprise edition for production use. (It is also available in the Developer and Evaluation editions for non-production use.) In addition to enabling the common criteria compliance enabled option, you must also download and run a script that finishes configuring SQL Server to comply with Common Criteria Evaluation Assurance Level 4+ (EAL4+). You can download this script from the Microsoft SQL Server website. When the option is enabled and the script is run, three changes occur to how SQL Server operates:
Residual Information Protection (RIP). Memory is always overwritten with a known bit pattern before being reused.
Ability to view login statistics. Auditing of logins is automatically enabled.
Column GRANT does not override table DENY. This changes the default behavior of the permission system.
Note: The implementation of RIP increases security, but can negatively impact the performance of the system.
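A minimal sketch of enabling the option (an edition that supports it is assumed, the EAL4+ configuration script must still be downloaded and run, and the setting takes effect only after the instance restarts):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Enable Common Criteria compliance (requires a restart of the instance)
EXEC sp_configure 'common criteria compliance enabled', 1;
RECONFIGURE;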
SQL Trace
Many users attempt to use SQL Server Profiler for auditing because it enables tracing commands that are sent to SQL Server, as well as tracing the returned errors. However, SQL Server Profiler can have a significantly negative performance impact when it is run interactively on production systems. An alternative is to use SQL Trace, a set of system stored procedures that SQL Server Profiler itself utilizes. Executing these procedures directly offers a much more lightweight method of tracing, particularly when the events are well-filtered. Because SQL Trace can capture commands that are sent to the server, you can use it to audit those commands.
SQL Trace uses a server-side tracing mechanism to guarantee that no events are lost, as long as there is space available on the disk and no write errors occur. If the disk fills or write errors occur, the trace stops; SQL Server continues unless C2 audit mode is also enabled. The possibility of missing events needs to be considered when evaluating the use of SQL Trace for auditing purposes.
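As an illustration only, a minimal server-side trace that captures login events might be created with the SQL Trace procedures like this (the file path is an assumption; event ID 14 is Audit Login and column ID 11 is LoginName):

DECLARE @traceid int;
DECLARE @maxfilesize bigint = 5;
-- Create a stopped server-side trace that rolls over to a new file at 5 MB (option 2)
EXEC sp_trace_create @traceid OUTPUT, 2, N'D:\Traces\LoginAudit', @maxfilesize;
-- Capture the LoginName column (11) for the Audit Login event (14)
EXEC sp_trace_setevent @traceid, 14, 11, 1;
-- Set the trace status to started
EXEC sp_trace_setstatus @traceid, 1;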
DML Triggers
Triggers can play an important role in auditing. SQL Server supports several types of trigger, including data manipulation language (DML) triggers, which run when a user modifies data, and logon triggers, which enable you to track details of logons and roll back logons based on business or administrative logic. You can create triggers by using the CREATE TRIGGER statement and specifying whether the trigger should run on inserts, updates, deletes, or logons—and what action the trigger should take.
The following example shows a DML trigger that runs when an update occurs on a row in the dbo.Employee table and logs the original data in the dbo.EmployeeSalaryAudit table. You can access the original and new information by using the internal inserted and deleted tables in the trigger.
Creating a DML Trigger
CREATE TRIGGER TR_Employee_Salary_UPDATE
ON dbo.Employee
FOR UPDATE
AS
BEGIN
  SET NOCOUNT ON;
  IF UPDATE(Salary)
  BEGIN
    INSERT dbo.EmployeeSalaryAudit (EmployeeID, OldSalary, NewSalary, UpdatedBy, UpdatedAt)
    SELECT i.EmployeeID, d.Salary, i.Salary, suser_name(), getdate()
    FROM inserted AS i
    INNER JOIN deleted AS d ON i.EmployeeID = d.EmployeeID;
  END;
END;
GO
Triggers do have some limitations:
System performance can be significantly impacted by triggers running alongside the usual load on the server.
Users with appropriate permissions can disable triggers. This can cause a significant issue for auditing requirements.
You cannot create triggers that run in response to a SELECT statement.
Triggers have a nesting limit of 32 levels, beyond which they do not work.
Only limited ability to control trigger-firing order is provided. To be sure of capturing all the changes made by other triggers, an auditing trigger would normally need to be the last trigger that fires—but this cannot be guaranteed, as the sketch below shows.
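A minimal sketch of that limited control, using sp_settriggerorder with the example trigger from above (only the first and last trigger of each type can be pinned; the order of any others remains undefined):

-- Request that the auditing trigger fires last among UPDATE triggers on dbo.Employee
EXEC sp_settriggerorder
  @triggername = 'TR_Employee_Salary_UPDATE',
  @order = 'Last',
  @stmttype = 'UPDATE';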
Demonstration: Using DML Triggers for Auditing
In this demonstration, you will see how to create a DML trigger for auditing.
Demonstration Steps
Create a DML Trigger for Auditing
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod10 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using Windows authentication.
4. Open the DML Trigger.sql script file in the D:\Demofiles\Mod10 folder.
5. Select the code under the comment Create a log table, and click Execute. This creates a table named AuditRateChange in the HumanResources schema of the AdventureWorks database.
6. Select the code under the comment Create a trigger, and click Execute. This creates a trigger on the EmployeePayHistory table that fires on updates. When the Rate column is updated, a row is inserted into the AuditRateChange table.
7. Select the code under the comment Update a rate, and click Execute. This updates a rate in the EmployeePayHistory table.
8. Select the code under the comment View the audit log, and click Execute. This retrieves the logged details from the AuditRateChange table.
9. Keep SQL Server Management Studio open for the next demonstration.
SQL Server Audit
SQL Server Audit is the primary auditing tool in SQL Server 2014. It enables you to track server-level and database-level events on an instance of SQL Server and log these events to audit files or event logs. All editions of SQL Server support server-level auditing; Enterprise edition, Developer edition, and Evaluation edition additionally support database-level auditing.
SQL Server Audit uses the following objects:
Server Audit: Defines where to store the audit data.
Server Audit Specification: Collects many server-level action groups raised by Extended Events. One per audit.
Database Audit Specification: Collects database-level audit actions raised by Extended Events. One per database per audit.
Actions: Specific actions that can raise events and be added to the audit. For example, SELECT operations on a table.
Action Groups: Logical groups of actions to simplify creating audit specifications. For example, BACKUP_RESTORE_GROUP, which includes any backup or restore commands.
Target: Receives and stores the results of the audit. Can be a file, Windows Security event log, or Windows Application event log.
Extended Events
SQL Server audit is based on an eventing engine called Extended Events.
A wide variety of events occur within the SQL Server database engine. For example, when a user executes a query, the database engine may need to request additional memory or check permissions before the actual query is allowed to run. SQL Server uses the Extended Events feature that enables you to define the actions that SQL Server should take when events occur. When SQL Server executes its internal code, it checks to see if a user has defined an action that should be taken at that point in the code. If they have, SQL Server fires an event and sends details to a target location. Targets can be operating system files, memory-based ring buffers, or Windows® event logs. Extended Events is a lightweight eventing engine that has very little performance impact on the database engine that it is monitoring. You can use Extended Events for many purposes where you may previously have used SQL Trace.
Extended Events are important because SQL Server Audit is based on the Extended Events infrastructure. The eventing engine that Extended Events provides is not tied to particular types of events—the engine is written in such a way that it can process any type of event. Configurations of Extended Events ship in .exe or .dll files called packages. Packages are the unit of deployment and installation for Extended Events and contain all the objects that are part of a particular Extended Events configuration. SQL Server audit is a special package within Extended Events so you cannot change its internal configuration.
Extended Events uses specific terminology for the objects that it uses, as described in the following table:
Events: Points of interest during the execution of code.
Targets: Places to which the event data is sent, such as operating system files.
Actions: Responses that SQL Server can make to an event (for example, capturing execution plans to include in a trace).
Types: Definitions of the objects that Extended Events works with.
Predicates: Dynamic filters that SQL Server applies to the event capture.
Maps: Mappings of values to strings (for example, mapping codes to descriptions).
Lesson 2
Implementing SQL Server Audit
Preparing SQL Server audit for use requires that you configure a number of objects before you can create and run your audit, and then view the results. In this lesson, you will learn how to configure SQL Server audit, and how to create and use audits.
Lesson Objectives
After completing this lesson, you will be able to:
Configure SQL Server audit.
Detail the roles of audit actions and action groups.
Define audit targets.
Create audits.
Create server audit specifications.
Create database audit specifications.
SQL Server Audit Overview
SQL Server Audit is based on the following architecture:
Audits. An audit defines where and how audited events are logged. Each audit defines a target (such as a file or a Windows event log), the time interval before events must be written, and the action SQL Server should take if the target runs out of disk space.
Audit Specifications. An audit specification defines a set of events that should be included in an audit. Server audit specifications log events that affect the server, while database audit specifications log events within a specified database.
Actions and Action Groups. The events that can be included in an audit specification are based on pre-defined actions, which are grouped into action groups. SQL Server provides a comprehensive set of server-level action groups and database-level action groups, and you can audit user-defined actions by adding a user-defined action group to an audit specification at the server and database levels. Additionally, there are audit-level action groups that track changes to the auditing configuration itself. This ensures that when an administrator disables auditing, a record of this change is logged.
Creating an Audit
You can create audits by using the Transact-SQL CREATE SERVER AUDIT statement or SQL Server Management Studio (SSMS). There are several options you can configure, the key ones being listed below:
Name: User-friendly name to refer to the audit.
Queue delay: Time in milliseconds that SQL Server can buffer the audit results before flushing them to the target.
On failure: Action to take if the audit log is unavailable—continue, shut down the server, or fail auditable operations.
Audit destination: Location of the target.
Maximum file size: Maximum size in MB of each audit file.
Reserve disk space: Whether to reserve disk space for audit files in advance.
Note: The value you configure for the queue delay needs to be a trade-off between security and performance. A low value ensures that events are logged quickly and avoids the risk of losing items from the audit trail in the event of failure, but can result in a significant performance overhead.
Audit Targets
Audits can be sent to one of the following three targets:
A file. File output provides the highest performance and is the easiest option to configure.
Windows Application Event Log. Avoid sending too much detail to this log as network administrators tend to dislike applications that write too much content to any of the event logs. Do not use this target for sensitive data because any authenticated user can view the log.
Windows Security Event Log. This is the most secure option for auditing data, but you need to add the SQL Server service account to the Generate Security Audits policy before using it.
You should review the contents of the target that you use and archive its contents on a periodic basis.
The following code example creates and enables a server audit that uses a binary file as the target:
Creating a Server Audit
CREATE SERVER AUDIT HR_Audit
TO FILE (FILEPATH = '\\MIA-SQL\Audit\')
WITH (QUEUE_DELAY = 1000);
GO
ALTER SERVER AUDIT HR_Audit WITH (STATE = ON);
GO
Note: The filename that you provide to the FILEPATH parameter when creating a server audit is actually a path to a folder. SQL Server generates log files automatically and stores them in this location.
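For comparison, a sketch of an equivalent audit that writes to the Windows Application event log instead of a file (to use the Security log, the SQL Server service account must first be granted the Generate Security Audits right, as noted above):

CREATE SERVER AUDIT HR_Audit_AppLog
TO APPLICATION_LOG
WITH (QUEUE_DELAY = 1000, ON_FAILURE = CONTINUE);
GO
ALTER SERVER AUDIT HR_Audit_AppLog WITH (STATE = ON);
GO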
Creating a Server Audit Specification
After you create an audit, you can then create the server or database audit specification by using Transact-SQL statements or SSMS. When you create a server audit specification, it is automatically disabled, so you must remember to enable it when you are ready for it to start. A server audit specification details the actions to audit. Server-level actions are grouped into predefined action groups, including:
BACKUP_RESTORE_GROUP: Includes all backup and restore actions.
DATABASE_CHANGE_GROUP: Includes actions that create, alter, or drop a database.
FAILED_LOGIN_GROUP: Includes details of failed login attempts.
SUCCESSFUL_LOGIN_GROUP: Includes details of successful logins.
SERVER_OPERATION_GROUP: Includes actions that alter server settings.
Note: For a full list of server-level audit action groups, see “SQL Server Audit Action Groups and Actions” in SQL Server Books Online.
The following example shows how to create and enable a server audit specification that tracks failed and successful login attempts:
Creating a Server Audit Specification
CREATE SERVER AUDIT SPECIFICATION AuditLogins
FOR SERVER AUDIT SecurityAudit
ADD (FAILED_LOGIN_GROUP),
ADD (SUCCESSFUL_LOGIN_GROUP)
WITH (STATE = ON);
Creating Database Audit Specifications
Creating a database audit specification is similar to creating a server audit specification. The database audit specification details actions or action groups to audit at the database level. You can audit individual database-level actions, including:
SELECT: Occurs when a SELECT statement is issued.
INSERT: Occurs when an INSERT statement is issued.
DELETE: Occurs when a DELETE statement is issued.
UPDATE: Occurs when an UPDATE statement is issued.
EXECUTE: Occurs when an EXECUTE statement is issued.
Additionally, SQL Server defines database-level action groups, including:
APPLICATION_ROLE_CHANGE_PASSWORD_GROUP: Occurs when an application role password is changed.
DATABASE_OBJECT_ACCESS_GROUP: Includes all access to database objects.
DATABASE_PRINCIPAL_CHANGE_GROUP: Includes all events when a database-level principal is created, altered, or dropped.
SCHEMA_OBJECT_ACCESS_GROUP: Includes all access to objects in a specified schema.
SCHEMA_OBJECT_PERMISSION_CHANGE_GROUP: Includes all changes to object permissions.
Note: For a full list of database-level audit actions and action groups, see “SQL Server Audit Action Groups and Actions” in SQL Server Books Online.
The following example shows how to create a database audit specification that includes all database principal changes and all SELECT queries on objects in the HumanResources schema by members of the db_datareader fixed database-level role.
Creating a Database Audit Specification
USE AdventureWorks;
CREATE DATABASE AUDIT SPECIFICATION AdventureWorks_DBSecurity
FOR SERVER AUDIT SecurityAudit
ADD (DATABASE_PRINCIPAL_CHANGE_GROUP),
ADD (SELECT ON SCHEMA::HumanResources BY db_datareader)
WITH (STATE = ON);
User-Defined Audit Actions
The server-level and database-level actions and action groups that SQL Server provides enable you to audit many types of events that occur in SQL Server. However, they do not audit application-specific events, such as an entry in a Bonus column being higher than a specific threshold value. To audit custom events such as these, you can add the USER_DEFINED_AUDIT_GROUP action group to an audit specification, and call the sp_audit_write stored procedure in your application logic or in a trigger to write an event to the audit. The sp_audit_write stored procedure takes three parameters, which enable you to log a smallint to identify the specific event, a bit value to track whether the event succeeds, and a string to describe the event.
The following example shows how to call the sp_audit_write stored procedure from an insert trigger.
Calling sp_audit_write
CREATE TRIGGER HR.BonusChecker
ON HR.EmployeeBonus
AFTER INSERT
AS
DECLARE @bonus money, @empid integer, @msg nvarchar(4000);
SELECT @bonus = i.Bonus, @empid = i.EmployeeID FROM inserted i;
IF @bonus > 1000
BEGIN
  SET @msg = 'Employee ' + CAST(@empid AS varchar(50)) + ' bonus is over $1000';
  EXEC sp_audit_write @user_defined_event_id = 12,
    @succeeded = 1,
    @user_defined_information = @msg;
END
Note: You must ensure that all principals who may trigger custom audit actions have been granted EXECUTE permission on the sys.sp_audit_write stored procedure in the master database. The easiest way to ensure this is to grant EXECUTE permission on sys.sp_audit_write to public.
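For example, the blanket grant that the note describes is a single statement:

USE master;
GRANT EXECUTE ON sys.sp_audit_write TO public;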
Reading Audited Events
After an audit has been active, you will most likely want to review the events that have been audited. You can use the Event Viewer administrative tool in Windows to view events from audits that target Windows event logs. For logs that are sent to a file, SQL Server provides the sys.fn_get_audit_file Transact-SQL function, which returns the contents of all file-based logs in a specified folder as a table. The sys.fn_get_audit_file function takes three parameters—the file_pattern, the initial_file_name, and the audit_record_offset. The file_pattern can be in one of three formats:
\* which collects all audit files in the specified location.
\LoginsAudit_{GUID} which collects all audit files that have the specified name and GUID pair.
\LoginsAudit_{GUID}_00_29384.sqlaudit which collects a specific audit file.
The structure of the data that the sys.fn_get_audit_file function returns is quite detailed and you rarely need to view the data in all columns. This example shows how to retrieve some of the commonly-used columns.
Selecting Specific Columns
SELECT event_time, object_id, server_principal_name, database_name,
  schema_name, object_name, statement
FROM sys.fn_get_audit_file('\\MIA-SQL\AuditFiles\*', default, default);
Additional Reading: For a full explanation of the structure, see sys.fn_get_audit_file (Transact-SQL) in SQL Server Books Online.
The audit records produced by SQL Server need to be in a format that fits in system event logs, as well as in files. Because of this requirement, the record format is limited in size by the rules related to those event logging systems. Character fields are split into 4,000-character chunks that may be spread across a number of entries. This means that a single event can generate multiple audit entries, and a sequence_number column is provided to indicate the order of the rows that make up one event.
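A sketch of reading split records in order, assuming the file target shown earlier:

SELECT event_time, sequence_number, statement
FROM sys.fn_get_audit_file('\\MIA-SQL\AuditFiles\*', default, default)
ORDER BY event_time, sequence_number;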
Managing SQL Server Audit
After you have implemented SQL Server Audit, you must be able to manage audit configuration and troubleshoot potential issues.
Enabling and Disabling Auditing
You can use SQL Server Management Studio to enable and disable audits or individual audit specifications. Alternatively, you can use the ALTER SERVER AUDIT, ALTER SERVER AUDIT SPECIFICATION, and ALTER DATABASE AUDIT SPECIFICATION statements to set the STATE property to ON or OFF. The following code example disables the SecurityAudit audit.
Disabling an Audit
USE master;
ALTER SERVER AUDIT SecurityAudit WITH (STATE = OFF);
Viewing Audit Configuration Information
SQL Server provides the following dynamic management views (DMVs) and system views that can help you retrieve SQL Server Audit configuration information.
sys.dm_server_audit_status: Returns one row for each server audit, indicating the current state of the audit.
sys.dm_audit_actions: Returns a row for every audit action and action group.
sys.dm_audit_class_type_map: Returns a table that maps the class types to class descriptions.
sys.server_audits: Contains one row for each SQL Server audit in a server instance.
sys.server_file_audits: Contains extended information about the file audit type in a SQL Server audit.
sys.server_audit_specifications: Contains information about the server audit specifications in a SQL Server audit.
sys.server_audit_specification_details: Contains information about the server audit specification details (actions) in a SQL Server audit.
sys.database_audit_specifications: Contains information about the database audit specifications in a SQL Server audit.
sys.database_audit_specification_details: Contains information about the database audit specification details (actions) in a SQL Server audit.
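For example, a quick health check of all audits on the instance might look like this:

SELECT name, status_desc, audit_file_path
FROM sys.dm_server_audit_status;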
Considerations for SQL Server Audit There are several potential issues to consider with SQL Server audit:
Each audit is identified by a GUID. If you restore or attach a database on a server, SQL Server attempts to match the GUID in the database with the GUID of the audit on the server. If no match occurs, auditing will not work until you correct the issue by executing the CREATE SERVER AUDIT command to set the appropriate GUID.
If databases are attached to editions of SQL Server that do not support the same level of audit capability, the attach works but the audit is ignored.
Mirrored servers introduce a similar issue of mismatched GUIDs. The mirror partner must have a server audit with the same GUID. You can create this by using the CREATE SERVER AUDIT command and supplying the GUID value to match the one on the primary server.
You should consider the performance impact of audit writes and whether you need to minimize your audit list to maximize performance.
If disk space fills up, SQL Server may not start. In this situation, you may need to force entry to it by using a single-user startup with the -f startup parameter.
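A sketch of recreating an audit with a matching GUID on a restore or mirror target (the GUID shown is illustrative; use the value from the source server):

CREATE SERVER AUDIT SecurityAudit
TO FILE (FILEPATH = '\\MIA-SQL\Audit\')
WITH (AUDIT_GUID = '33e055a9-1924-47c3-9798-e8875a5bcd42');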
Demonstration: Using SQL Server Audit
In this demonstration, you will see how to:
Create an audit
Create a server audit specification
Create a database audit specification
View audited events
Demonstration Steps
Create an Audit
1. If you did not complete the previous demonstration, start the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines, log onto 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd, and run Setup.cmd in the D:\Demofiles\Mod10 folder as Administrator. Then start SQL Server Management Studio, and connect to MIA-SQL using Windows authentication.
2. In SQL Server Management Studio, open the Audit.sql script file in the D:\Demofiles\Mod10 folder.
3. Select the code under the comment Create an audit, and click Execute. This creates an audit that logs events to files in D:\Demofiles\Mod10\Audits.
4. In Object Explorer, expand Security, and expand Audits (if Audits is not expandable, refresh it and try again).
5. Double-click the AW_Audit audit you created and view its properties. Then click Cancel.
Create a Server Audit Specification
1. In SQL Server Management Studio, in the Audit.sql script, select the code under the comment Create a server audit specification and click Execute. This creates an audit specification for the AW_Audit audit that logs failed and successful login attempts.
2. In Object Explorer, refresh the Server Audit Specifications folder and expand it. Then double-click AW_ServerAuditSpec, view its properties, and click Cancel.
Create a Database Audit Specification
1. In SQL Server Management Studio, in the Audit.sql script, select the code under the comment Create a database audit specification and click Execute. This creates an audit specification for the AW_Audit audit that logs specific actions by individual principals on the HumanResources schema in the AdventureWorks database.
2. In Object Explorer, expand Databases, expand AdventureWorks, expand Security, and expand Database Audit Specifications (if Database Audit Specifications is not expandable, refresh it and try again).
3. Double-click AW_DatabaseAuditSpec and view its properties. Then click Cancel.
View Audited Events
1. Open a command prompt and enter the following command to run sqlcmd as ADVENTUREWORKS\ChadCorbitt. This user is a member of the ADVENTUREWORKS\Personnel global group, which in turn is a member of the ADVENTUREWORKS\HumanResources_Users domain local group:
runas /user:adventureworks\chadcorbitt /noprofile sqlcmd
2. When prompted for a password, enter Pa$$w0rd.
3. In the SQLCMD window, enter the following commands to query the HumanResources.Employee table:
SELECT LoginID, JobTitle FROM HumanResources.Employee;
GO
4. Close the SQLCMD and command prompt windows.
5. In the D:\Demofiles\Mod10\Audits folder, verify that an audit file has been created.
6. In SQL Server Management Studio, in the Audit.sql script, select the code under the comment View audited events and click Execute. This queries the files in the audit folder and displays the audited events (to simplify this demonstration, events logged for the Student user and the service account for SQL Server have been excluded).
7. Note that all events are logged with the server principal name ADVENTUREWORKS\ChadCorbitt despite the fact that this user accesses SQL Server through membership of a Windows group and does not have an individual login.
8. Keep SQL Server Management Studio open for the next demonstration.
Lesson 3
Encrypting Databases
With the focus of most data security being on threats posed by hackers or social engineers, an often overlooked aspect is the risk of the physical theft of data storage media such as disks and backup tapes. For this reason, many organizations are obliged by their security compliance policies to protect data by encrypting it. SQL Server 2014 includes two ways of encrypting data: Transparent Data Encryption (TDE) and Extensible Key Management (EKM). This lesson describes the considerations for using these encryption technologies.
Lesson Objectives
After completing this lesson, you will be able to:
Describe the architecture of TDE.
Configure TDE.
Move an encrypted database between servers.
Describe how to use EKM.
Transparent Data Encryption Overview
Transparent Data Encryption provides a way to secure data by rendering database files unreadable without the appropriate decryption keys. It adds an extra layer of protection to your database infrastructure by encrypting data files and log files without the need to reconfigure client applications, or for additional development effort. This is because the SQL Server instance itself performs the jobs of encrypting and decrypting data, so the process is transparent to applications and users. When SQL Server writes data pages to disk, they are encrypted; when they are read into memory, they are decrypted.
Note: When a database is configured to use TDE, CPU utilization for SQL Server may increase due to the overhead of encrypting and decrypting data pages.
Transparent Data Encryption Keys
TDE uses the following hierarchy of keys to encrypt and decrypt data.
Service Master Key (SMK). The SMK is created by Setup when the SQL Server instance is installed. The SMK encrypts and protects the Database Master Key for the master database. The SMK is itself encrypted by the Windows operating system Data Protection Application Programming Interface (DPAPI).
Database Master Key (DMK). The DMK for the master database is used to generate a certificate in the master database. SQL Server uses the SMK and a password that you specify to generate the DMK, and stores it in the master database. Note: You can use a password without the SMK to generate a DMK, although this is less secure.
Server Certificate. A server certificate is generated in the master database, and is used to encrypt an encryption key in each TDE-enabled database.
Database Encryption Key (DEK). A DEK in the user database is used to encrypt the entire database.
Configuring Transparent Data Encryption
To enable TDE, you must perform the following steps:
1. Create a database master key in the master database.
2. Create a server certificate in the master database.
3. Create a database encryption key in the user database you want to encrypt.
4. Enable encryption for the user database.
Creating a DMK in the master Database
You create the DMK by using the CREATE MASTER KEY Transact-SQL statement and supplying a password. The code example below creates a DMK in the master database.
Creating a DMK in the master Database
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Pa$$w0rd';
Creating a Server Certificate
After you have created a DMK in the master database, you can use the CREATE CERTIFICATE Transact-SQL statement to create the server certificate. SQL Server uses the DMK to generate the certificate and an associated private key, storing them in the master database. The code example below creates a server certificate called Security_Certificate.
Creating a Certificate in the master Database
USE master;
CREATE CERTIFICATE Security_Certificate WITH SUBJECT = 'DEK_Certificate';
You should back up both the server certificate and its associated private key as soon as you create them. This minimizes the risk of data loss that could occur if you encrypted a database and then lost access to the certificate and private key.
The following code example shows how to back up the server certificate and its private key.
Backing up a Certificate and its Private Key
BACKUP CERTIFICATE Security_Certificate TO FILE = 'D:\backups\security_certificate.cer'
WITH PRIVATE KEY (
  FILE = 'D:\backups\security_certificate.key',
  ENCRYPTION BY PASSWORD = 'CertPa$$w0rd');
Creating a DEK in a user Database
The DEK is a symmetric key that encrypts the database. Symmetric keys are two-way keys that can perform both encryption and decryption. The DEK is created and stored in the database that you want to encrypt by using the CREATE DATABASE ENCRYPTION KEY Transact-SQL statement.
The code example below creates a database encryption key that uses the AES_128 algorithm in the AdventureWorks database. The key is encrypted using the server certificate created in the previous step.
Creating a Database Encryption Key
USE AdventureWorks;
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_128
ENCRYPTION BY SERVER CERTIFICATE Security_Certificate;
Enabling Encryption for a Database
After you have created the necessary keys and certificates, you can enable encryption for a database by using the ALTER DATABASE Transact-SQL statement with the SET ENCRYPTION ON option. The Transact-SQL statement below enables encryption for the AdventureWorks database.
Enabling Encryption for a Database
ALTER DATABASE AdventureWorks SET ENCRYPTION ON;
To check whether a database is encrypted, you can query sys.databases. A value of 0 in the is_encrypted column indicates that the database is not encrypted; a value of 1 indicates that it is. The code example below queries sys.databases to show the encryption status of all databases on the server instance.
Querying sys.databases to Determine Encryption Status
USE master;
SELECT name, is_encrypted FROM sys.databases;
Moving Encrypted Databases
The primary reason for encrypting a database is to prevent unauthorized access to the data it contains in the event of the data files being compromised. You cannot simply attach the files from an encrypted database to another server and read its data. If you need to move an encrypted database to another server, you must also move the associated keys and certificates. The list below describes the high-level steps for moving a TDE-enabled database:
1. On the source server, detach the database that you want to move.
2. Copy or move the database files to the same location on the destination server.
3. Create a DMK in the master database on the destination server.
4. Use a CREATE CERTIFICATE Transact-SQL statement to generate a server certificate on the destination server from the backup of the original server certificate and its private key, as shown in the sketch after this list.
5. Attach the database on the destination server.
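A sketch of steps 3 and 4 on the destination server, assuming the certificate and private key backups created earlier (the destination DMK password is an illustrative assumption):

USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'DestPa$$w0rd';
-- Recreate the server certificate from the backup files
CREATE CERTIFICATE Security_Certificate
FROM FILE = 'D:\backups\security_certificate.cer'
WITH PRIVATE KEY (
  FILE = 'D:\backups\security_certificate.key',
  DECRYPTION BY PASSWORD = 'CertPa$$w0rd');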
Demonstration: Implementing Transparent Data Encryption
In this demonstration, you will see how to:
Create a database master key.
Create a server certificate.
Create a database encryption key.
Enable database encryption.
Demonstration Steps
Create a Database Master Key
1. If you did not complete the previous demonstration, start the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines, log onto 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd, and run Setup.cmd in the D:\Demofiles\Mod10 folder as Administrator. Then start SQL Server Management Studio, and connect to MIA-SQL using Windows authentication.
2. In SQL Server Management Studio, open the TDE.sql script file in the D:\Demofiles\Mod10 folder.
3. Select the code under the comment Create DMK and click Execute. This creates a database master key in the master database.
Create a Server Certificate
1. In the TDE.sql script, select the code under the comment Create server certificate and click Execute. This creates a certificate and then backs up the certificate and its private key.
Create a Database Encryption Key
1. In the TDE.sql script, select the code under the comment Create DEK and click Execute. This creates a database encryption key in the ConfidentialDB database.
Enable Database Encryption
1. In the TDE.sql script, select the code under the comment Enable encryption and click Execute. This enables encryption for the ConfidentialDB database, and retrieves database encryption status from the sys.databases table in the master database.
2. Review the query results, and verify that the is_encrypted value for ConfidentialDB is 1.
3. Close SQL Server Management Studio without saving any files.
Extensible Key Management
In an enterprise environment with high security compliance requirements, managing encryption keys at the individual database server level may not be practical. Many organizations have adopted an enterprise solution that enables them to manage encryption keys and certificates using vendor-specific hardware security modules (HSMs) to store keys securely. Extensible Key Management (EKM) support in SQL Server makes it possible to register modules from third-party vendors in SQL Server, enabling SQL Server to use the encryption keys stored on them. To use keys from a third-party EKM provider to implement TDE, you must perform the following tasks:
1. Enable the EKM provider enabled option in SQL Server. This is an advanced configuration option, so you will need to use sp_configure with the show advanced options option before you enable EKM, as shown in the following code example.
sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
sp_configure 'EKM provider enabled', 1;
GO
RECONFIGURE;
GO
2. Create a cryptographic provider from the file provided by the EKM provider.
CREATE CRYPTOGRAPHIC PROVIDER EKM_Provider
FROM FILE = 'S:\EKM_Files\EKMKey.dll';
3. Create a credential for system administrators.
CREATE CREDENTIAL EKM_Credential
WITH IDENTITY = 'EKM_Admin', SECRET = 'Pa$$w0rd'
FOR CRYPTOGRAPHIC PROVIDER EKM_Provider;
4. Add the credential to a login that will be used to configure encryption.
ALTER LOGIN [ADVENTUREWORKS\DBAdmin] ADD CREDENTIAL EKM_Credential;
5. Create an asymmetric key stored in the EKM provider.
USE master;
GO
CREATE ASYMMETRIC KEY EKM_Login_Key
FROM PROVIDER EKM_Provider
WITH ALGORITHM = RSA_512, PROVIDER_KEY_NAME = 'SQL_Server_Key';
6. Create a credential for the database engine to use when performing encryption and decryption.
CREATE CREDENTIAL EKM_TDE_Credential
WITH IDENTITY = 'TDE_DB', SECRET = 'Pa$$w0rd'
FOR CRYPTOGRAPHIC PROVIDER EKM_Provider;
7. Create a login used by TDE, and add the credential to it.
CREATE LOGIN EKM_Login FROM ASYMMETRIC KEY EKM_Login_Key;
GO
ALTER LOGIN EKM_Login ADD CREDENTIAL EKM_TDE_Credential;
8. Create a database encryption key for the database to be encrypted.
USE AdventureWorks;
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_128
ENCRYPTION BY SERVER ASYMMETRIC KEY EKM_Login_Key;
9. Enable TDE for the database.
ALTER DATABASE AdventureWorks SET ENCRYPTION ON;
Lab: Auditing Data Access and Encrypting Data
Scenario
You are a database administrator at Adventure Works Cycles. For compliance reasons, you must implement auditing of access to customer data in the InternetSales database, and you must protect sensitive personnel data by encrypting the HumanResources database.
Objectives
After completing this lab, you will be able to:
Implement auditing.
Implement transparent database encryption.
Estimated Time: 60 minutes
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
Exercise 1: Implementing Auditing
Scenario
The InternetSales database contains customer information, and as part of your commitment to customer data protection, your corporate compliance team requires that access to customer data is audited.
The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. Create an Audit
3. Create a Server Audit Specification
4. Create a Database Audit Specification
5. Implement a Custom Action
6. View Audited Events
Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Run Setup.cmd in the D:\Labfiles\Lab10\Starter folder as Administrator.
Task 2: Create an Audit
1. Create an audit named AW_Audit in the MIA-SQL database engine instance.
2. Enable the AW_Audit audit after you have created it.
Task 3: Create a Server Audit Specification
1. Create a server audit specification named AW_ServerAuditSpec. This audit specification should:
o Write events to the AW_Audit audit you created previously.
o Audit failed and successful logins.
o Be enabled immediately.
Task 4: Create a Database Audit Specification
1. Create a database audit specification in the InternetSales database. This audit specification should:
o Write events to the AW_Audit audit you created previously.
o Audit user-defined events.
o Audit SELECT actions on the Customers schema by the customers_reader database role.
o Audit SELECT, INSERT, UPDATE, and DELETE actions on the Customers schema by the sales_admin application role.
o Be enabled immediately.
Task 5: Implement a Custom Action 1.
MCT USE ONLY. STUDENT USE PROHIBITED
Administering Microsoft® SQL Server® Databases
Use the following code to create an UPDATE trigger on the Customers.Customer table in the InternetSales database. You can find this code in the Create Trigger.sql script file in the D:\Labfiles\Lab10\Starter folder.
CREATE TRIGGER Customers.Customer_Update ON Customers.Customer FOR UPDATE AS BEGIN IF UPDATE(EmailAddress) BEGIN DECLARE @msg NVARCHAR(4000); SET @msg = (SELECT i.CustomerID, d.EmailAddress OldEmail, i.EmailAddress NewEmail FROM inserted i JOIN deleted d ON i.CustomerID = d.CustomerID FOR XML PATH('EmailChange')) EXEC sp_audit_write @user_defined_event_id= 12, @succeeded = 1, @user_defined_information = @msg; END; END; GO
2. Grant EXECUTE permission on the sys.sp_audit_write system stored procedure to the public database role in the master database (as shown in the sketch below).
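A minimal sketch of this grant:

USE master;
GO
GRANT EXECUTE ON sys.sp_audit_write TO public;
GO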
Task 6: View Audited Events
1. In a command prompt, use the following command to run sqlcmd as ADVENTUREWORKS\VictoriaGray. This user is a member of the ADVENTUREWORKS\Sales_Europe global group, which in turn is a member of the ADVENTUREWORKS\InternetSales_Users domain local group.

runas /user:adventureworks\victoriagray /noprofile sqlcmd

2. When prompted for a password, enter Pa$$w0rd.
3. In sqlcmd, use the following commands to query the Customers.Customer table:

SELECT FirstName, LastName, EmailAddress FROM Customers.Customer;
GO
4. In sqlcmd, use the following commands to activate the sales_admin application role and update the Customers.Customer table:

EXEC sp_setapprole 'sales_admin', 'Pa$$w0rd';
UPDATE Customers.Customer SET EmailAddress = '[email protected]' WHERE CustomerID = 1699;
GO
5. In the D:\Labfiles\Lab10\Starter\Audits folder, verify that an audit file has been created.
6. In SQL Server Management Studio, use the following Transact-SQL code to query the files in the audit folder and display the audited events (events logged for the Student user and the service account for SQL Server have been excluded to simplify the results).

-- View audited events
SELECT event_time, action_id, succeeded, statement, user_defined_information,
       server_principal_name, database_name, schema_name, object_name
FROM sys.fn_get_audit_file('D:\Labfiles\Lab10\Starter\Audits\*', default, default)
WHERE server_principal_name NOT IN ('ADVENTUREWORKS\Student', 'ADVENTUREWORKS\ServiceAcct');
7. Note that all events are logged with the server principal name ADVENTUREWORKS\VictoriaGray, despite the fact that this user accesses SQL Server through membership of a Windows group and does not have an individual login. This identity is audited even when executing statements in the security context of an application role.
Results: After this exercise, you should have created an audit, a server audit specification, and a database audit specification.
Exercise 2: Implementing Transparent Database Encryption

Scenario
You want to ensure the security of the HumanResources database to prevent access to the data it contains in the event that the database files are compromised. To achieve this, you will configure transparent data encryption. The main tasks for this exercise are as follows:
1. Create Encryption Keys
2. Enable Database Encryption
3. Move the Database
Task 1: Create Encryption Keys
1. Create a database master key for the master database on the MIA-SQL instance of SQL Server (a combined Transact-SQL sketch for this task follows the list).
2. Create a certificate in the master database.
   o Name the certificate TDE_Server_Cert.
   o Assign the subject "TDE Server Certificate".
3. Back up the certificate and its private key.
   o Back up the certificate to a file named TDE_Server_Cert.cer in the D:\Labfiles\Lab10\Starter folder.
   o Back up the private key to a file named TDE_Server_Cert.key in the D:\Labfiles\Lab10\Starter folder.
   o Encrypt the private key backup with the password Pa$$w0rd.
4. Create a database encryption key in the HumanResources database.
   o Use the AES_128 encryption algorithm.
   o Encrypt the key using the server certificate you created in the previous step.
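One possible Transact-SQL implementation of this task, using the names and paths specified above:

USE master;
GO
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Pa$$w0rd';
GO
CREATE CERTIFICATE TDE_Server_Cert
WITH SUBJECT = 'TDE Server Certificate';
GO
BACKUP CERTIFICATE TDE_Server_Cert
TO FILE = 'D:\Labfiles\Lab10\Starter\TDE_Server_Cert.cer'
WITH PRIVATE KEY
(
    FILE = 'D:\Labfiles\Lab10\Starter\TDE_Server_Cert.key',
    ENCRYPTION BY PASSWORD = 'Pa$$w0rd'
);
GO
USE HumanResources;
GO
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_128
ENCRYPTION BY SERVER CERTIFICATE TDE_Server_Cert;
GO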
Task 2: Enable Database Encryption
1. Enable encryption for the HumanResources database (see the sketch after this task).
2. Query the sys.databases table to verify that encryption is enabled for the HumanResources database.
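A minimal sketch of this task:

ALTER DATABASE HumanResources SET ENCRYPTION ON;
GO
-- is_encrypted = 1 indicates that encryption is enabled
SELECT name, is_encrypted FROM sys.databases WHERE name = 'HumanResources';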
Task 3: Move the Database
1. Detach the HumanResources database from the MIA-SQL SQL Server instance.
2. Attempt to attach the HumanResources database to the MIA-SQL\SQL2 instance.
   o The primary database file is named HumanResources.mdf, in the M:\Data folder.
   o Attaching the database should fail because the certificate with which the database encryption key is protected does not exist on the MIA-SQL\SQL2 instance.
3. Create a database master key for the master database on the MIA-SQL\SQL2 instance.
4. Create a certificate named TDE_Server_Cert in the master database on the MIA-SQL\SQL2 instance from the backup certificate and private key files you created previously (see the sketch after this task).
5. Attach the HumanResources database to the MIA-SQL\SQL2 instance and verify that you can access the data it contains.
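For steps 3 and 4, one possible sketch (run against the MIA-SQL\SQL2 instance):

-- Run on the MIA-SQL\SQL2 instance
USE master;
GO
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Pa$$w0rd';
GO
CREATE CERTIFICATE TDE_Server_Cert
FROM FILE = 'D:\Labfiles\Lab10\Starter\TDE_Server_Cert.cer'
WITH PRIVATE KEY
(
    FILE = 'D:\Labfiles\Lab10\Starter\TDE_Server_Cert.key',
    DECRYPTION BY PASSWORD = 'Pa$$w0rd'
);
GO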
Results: After completing this exercise, you should have configured TDE and moved the encrypted HumanResources database to another instance of SQL Server.
Module Review and Takeaways

Best Practice: When planning to implement auditing, consider the following best practices:

Choose the option to shut down SQL Server on audit failure. There is usually little point in setting up auditing and then allowing situations where events can occur but are not audited. This is particularly important in high-security environments.

Make sure that file audits are placed on drives with large amounts of free disk space, and ensure that the available disk space is monitored on a regular basis.
Best Practice: When planning to implement database encryption, consider the following best practices:
Use a complex password to protect the database master key for the master database.
Ensure you back up certificates and private keys used to implement TDE, and store the backup files in a secure location.
If you need to implement data encryption on multiple servers in a large organization, consider using an EKM solution to manage encryption keys.
Review Question(s)
Question: What are the three targets for SQL Server audits?
Question: You may wish to audit actions by a DBA. How would you know if the DBA stopped the audit while performing covert actions?
Module 11
Performing Ongoing Database Maintenance

Contents:
Module Overview
Lesson 1: Ensuring Database Integrity
Lesson 2: Maintaining Indexes
Lesson 3: Automating Routine Database Maintenance
Lab: Performing Ongoing Database Maintenance
Module Review and Takeaways
Module Overview
The Microsoft® SQL Server® database engine is capable of running for long periods of time with minimal ongoing maintenance. However, obtaining the best outcomes from the database engine requires a schedule of routine maintenance operations. Database corruption is relatively rare but one of the most important tasks in the ongoing maintenance schedule is to check that no corruption has occurred in the database. Recovering from corruption depends upon its detection soon after it occurs. SQL Server indexes can also continue to work without any maintenance, but they will perform better if you periodically remove any fragmentation that occurs within them. SQL Server includes a Maintenance Plan Wizard to assist in creating SQL Server Agent jobs that perform these and other ongoing maintenance tasks.
Objectives
After completing this module, you will be able to:
Ensure database integrity by using DBCC CHECKDB.
Maintain indexes.
Configure Database Maintenance Plans.
Lesson 1
Ensuring Database Integrity

It is rare for the database engine to cause corruption directly. However, the database engine depends upon the hardware platform that it runs on—and that can cause corruption. In particular, issues in the memory and I/O subsystems can lead to corruption within databases.
If you do not detect corruption soon after it has occurred, further (and significantly more complex or troublesome) issues can arise. For example, there is little point attempting to recover a corrupt database from a set of backups where every backup contains a corrupted copy of the database. You can use the DBCC CHECKDB command to detect, and in some circumstances correct, database corruption. It is therefore important that you are familiar with how DBCC CHECKDB works.
Lesson Objectives
After completing this lesson, you will be able to:
Describe database integrity.
Use DBCC CHECKDB.
Describe the most common DBCC CHECKDB options.
Explain how to use the DBCC CHECKDB repair options.
Introduction to Database Integrity

There are two levels of integrity, known as physical and logical integrity.
Physical integrity: The data pages are written to the physical storage as SQL Server requested and can also be read correctly.
Logical integrity: The data within the pages is logically correct. For example, every index entry points to the correct data row and every data row is referenced by an index entry.
Without regular checking, database integrity cannot be ensured. A backup does not check logical integrity; it only verifies page checksums, and only when WITH CHECKSUM is specified. While the CHECKSUM database option is important, the checksum is only checked when the data is read, or when WITH CHECKSUM is specified during the backup. Archive data is not read frequently, so corruption in rarely accessed data can remain undetected for months if the database is not checked.
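For example, the following sketch enables the CHECKSUM page verification option and requests checksum validation during a backup; the backup file path is hypothetical:

-- Enable page checksums for the database
ALTER DATABASE AdventureWorks SET PAGE_VERIFY CHECKSUM;
GO
-- Verify page checksums as the backup is written (file path is hypothetical)
BACKUP DATABASE AdventureWorks
TO DISK = N'D:\Backups\AdventureWorks.bak'
WITH CHECKSUM;
GO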
Overview of DBCC CHECKDB

DBCC is a utility, provided by SQL Server, which supports a number of management facilities. In earlier documentation, you may see it referred to as the Database Consistency Checker. While checking the consistency of databases by using the CHECKDB option is a primary function of DBCC, it has many other uses. In current versions of the product, it is referred to as the Database Console Commands utility, to more closely reflect the wider variety of tasks that it can be used for.
DBCC CHECKDB
The CHECKDB option in the DBCC utility makes a thorough check of the structure of a database, to detect almost all forms of potential corruption. The functions that DBCC CHECKDB contains are also available as options that can be performed separately if required. The most important of these options are described below:

DBCC CHECKALLOC: Checks the consistency of disk space allocation structures for a specified database.

DBCC CHECKTABLE: Checks the pages associated with a specified table and the pointers between pages that are associated with the table. DBCC CHECKDB executes DBCC CHECKTABLE for every table in the database.

DBCC CHECKCATALOG: Checks the database catalog by performing logical consistency checks on the metadata tables in the database. These tables hold information that describes both system and user tables and other database objects. DBCC CHECKCATALOG does not check user tables.
DBCC CHECKDB also performs checks on other types of objects, such as the links for FILESTREAM objects and consistency checks on the Service Broker objects. Note: FILESTREAM and Service Broker are advanced topics that are beyond the scope of this course.
Repair Options
Even though DBCC CHECKDB has repair options, it is not always possible to repair a database without data loss. Usually, the best method for database recovery is to restore a backup of the database. This means that you should synchronize the execution of DBCC CHECKDB with your backup retention policy. This ensures that you can always restore a database from an uncorrupted database and that all required log backups since that time are available.
Online Concurrent Operations

DBCC CHECKDB can take a long time to execute and consumes considerable I/O and CPU resources, so you will often need to run it while the database is in use. For this reason, DBCC CHECKDB uses internal database snapshots to ensure that it works with a consistent view of the database. If the performance impact of running DBCC CHECKDB alongside normal database activity is too high, running DBCC CHECKDB against a restored backup of your database is an alternative option. This is not ideal, but it is better than not running DBCC CHECKDB at all.
Disk Space
The use of an internal snapshot causes DBCC CHECKDB to need additional disk space. DBCC CHECKDB creates hidden files (using NTFS Alternate Streams) on the same volumes as the database files are located. Sufficient free space on the volumes must be available for DBCC CHECKDB to run successfully. The amount of disk space required on the volumes depends upon how much data is changed during the execution of DBCC CHECKDB. DBCC CHECKDB also uses space in the tempdb database while executing. To provide an estimate of the amount of space required in tempdb, DBCC CHECKDB offers an ESTIMATEONLY option.
Backups and DBCC CHECKDB
It is considered a good practice to run DBCC CHECKDB on a database prior to performing a backup. This check helps to ensure that the backup contains a consistent version of the database.
DBCC CHECKDB Options

DBCC CHECKDB provides a number of options that alter its behavior while it is executing.
DBAs often use the PHYSICAL_ONLY option on production systems because it substantially reduces the time taken to run DBCC CHECKDB on large databases. If you regularly use the PHYSICAL_ONLY option, you still need to periodically run the full version of the utility. How often you perform the full version depends upon your specific business requirements.
You can use the NOINDEX option to specify not to perform the intensive checks of nonclustered indexes for user tables. This decreases the overall execution time but does not affect system tables because integrity checks are always performed on system table indexes. The assumption that you are making when using the NOINDEX option is that you can rebuild the nonclustered indexes if they become corrupt.
You can only use the EXTENDED_LOGICAL_CHECKS option when the database is at database compatibility level 100 (SQL Server 2008) or above. It performs detailed checks of the internal structure of objects such as CLR user-defined data types and spatial data types.
You can use the TABLOCK option to request that DBCC CHECKDB takes a table lock on each table while performing consistency checks, rather than using the internal database snapshots. This reduces the disk space requirements at the cost of preventing other users from updating the tables.
The ALL_ERRORMSGS and NO_INFOMSGS options only affect the output from the command, not the operations that the command performs.
The ESTIMATEONLY option estimates the space requirements in the tempdb database.
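To illustrate, the following hedged examples show some of these options in use against the AdventureWorks sample database:

-- Physical-only check: faster on large production databases
DBCC CHECKDB (N'AdventureWorks') WITH PHYSICAL_ONLY;
GO
-- Suppress informational messages so that only errors are reported
DBCC CHECKDB (N'AdventureWorks') WITH NO_INFOMSGS;
GO
-- Estimate the tempdb space required without performing the checks
DBCC CHECKDB (N'AdventureWorks') WITH ESTIMATEONLY;
GO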
DBCC CHECKDB Repair Options

Repairing a database should be a last resort. When a database is corrupt, it is typically better to restore it from a backup, after solving the cause of the corruption. You should back up a database before performing any repair option, and also find and resolve the reason for the corruption—otherwise it may well happen again soon after. As well as providing details of errors that have been found, the output of DBCC CHECKDB shows the repair option that you need to use to correct the problem. DBCC CHECKDB offers two repair options. For both options, the database needs to be in single user mode. The options are:
REPAIR_REBUILD rebuilds indexes and removes corrupt data pages. This option only works with certain mild forms of corruption and does not involve data loss.
REPAIR_ALLOW_DATA_LOSS will almost always produce data loss. It de-allocates the corrupt pages and changes others that reference the corrupt pages. After the operation completes, the database will be consistent, but only from a physical database integrity point of view. Significant loss of data could have occurred. Also, repair operations do not consider any of the constraints that may exist on or between tables. If the specified table is involved in one or more constraints, it is recommended that you execute DBCC CHECKCONSTRAINTS after running the repair operation.
In the example on the slide, four consistency errors were found and the REPAIR_ALLOW_DATA_LOSS option is needed to repair the database.
If the transaction log becomes corrupt, you can use a special option called an emergency mode repair. However, in that situation, it is strongly recommended to restore the database and you should only use the emergency mode repair when no backup is available.
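Where repair is unavoidable, the sequence typically looks like the following sketch (using the CorruptDB database from the demonstration that follows; back up the database first):

-- Last resort only: back up the database before attempting repair
ALTER DATABASE CorruptDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
GO
DBCC CHECKDB (N'CorruptDB', REPAIR_ALLOW_DATA_LOSS);
GO
ALTER DATABASE CorruptDB SET MULTI_USER;
GO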
Demonstration: Using DBCC CHECKDB

In this demonstration, you will see how to use the DBCC CHECKDB command.
Demonstration Steps
Use the DBCC CHECKDB Command
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod11 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using Windows authentication.
4. Open the DBCCCHECKDB.sql script file in the D:\Demofiles\Mod11 folder.
5. Select the code under the comment Run DBCC CHECKDB with default options and click Execute. This checks the integrity of the AdventureWorks database and displays detailed informational messages.
6. Select the code under the comment Run DBCC CHECKDB without informational messages and click Execute. This checks the integrity of the AdventureWorks database and only displays messages if errors are found.
7. Select the code under the comment Run DBCC CHECKDB against CorruptDB and click Execute. This checks the integrity of the CorruptDB database and identifies some consistency errors in the dbo.Orders table in this database. The last line of output tells you the minimum repair level required.
8. Select the code under the comment Try to access the Orders table and click Execute. This attempts to query the dbo.Orders table in CorruptDB, and returns an error because of a logical consistency issue.
9. Select the code under the comment Access a specific order and click Execute. This succeeds, indicating that only some data pages are affected by the consistency issue.
10. Select the code under the comment Repair the database and click Execute. Note that this technique is used only as a last resort when no valid backup is available. No guarantee on logical consistency in the database (such as foreign key constraints) is provided.
11. Select the code under the comment Access the Orders table and click Execute. This succeeds, indicating that the consistency issue has been repaired.
12. Select the code under the comment Check the internal database structure and click Execute. No error messages are displayed, indicating that the database structure is now consistent.
13. Select the code under the comment Check data loss and click Execute. Note that a number of order details records have no matching order records. The foreign-key constraint between these tables originally enforced a relationship, but some data has been lost.
Lesson 2
Maintaining Indexes

Another important aspect of SQL Server that requires ongoing maintenance for optimal performance is the management of indexes. Indexes are used to speed up operations where SQL Server needs to access data in a table. Over time, indexes can become fragmented, so the performance of database applications that use the indexes will be reduced. Defragmenting or rebuilding the indexes will restore the performance of the database. Index management options are often included in regular database maintenance plan schedules. Before learning how to set up the maintenance plans, it is important to understand more about how indexes work and how to maintain them.
Lesson Objectives
After completing this lesson, you will be able to:
Explain how indexes affect performance.
Describe the different types of SQL Server indexes.
Describe how indexes become fragmented.
Use FILLFACTOR and PAD_INDEX.
Explain the ongoing maintenance requirements for indexes.
Implement online index operations.
Describe how SQL Server creates and uses statistics.
How Indexes Affect Performance

Whenever SQL Server needs to access data in a table, it calculates whether to read all the pages (known as a table scan) or whether there are one or more indexes that will reduce the amount of effort required in locating the required rows. Consider a query that selects a single order based on the OrderID from the Orders table. Without an index, SQL Server has to scan all the orders in the table to find the particular one. With an index, SQL Server can find the row quickly by calculating which page the relevant OrderID is stored on. The difference here could be between getting to three or four pages of data compared with accessing hundreds.
Indexes can help to improve searching, sorting, and join performance, but they can also impact data modification performance, they require ongoing management, and they demand additional disk space. Occasionally, SQL Server will create its own temporary indexes to improve query performance. However, doing so is up to the optimizer and beyond the control of the database administrator or programmer, so these temporary indexes will not be discussed in this module. The temporary indexes are only used to improve a query plan, if no proper indexing already exists.
Types of SQL Server Indexes

Rather than storing rows of data as a heap (the term given to a table whose data pages are not stored in any specific logical order), you can design tables with an internal logical ordering. This is known as a clustered index.

Clustered Index

A table with a clustered index has a predefined order for rows within a page and for pages within the table. The order is based on a key made up of one or more columns. The key is commonly called a clustering key.
Because the rows of a table can only be in a single order, there can only be a single clustered index on a table. SQL Server uses an Index Allocation Map entry to point to a clustered index.
There is a common misconception that pages in a clustered index are physically stored in order. While this is possible in rare situations, it is not commonly the case. If it was true, fragmentation of clustered indexes would not exist. SQL Server tries to align physical and logical order while creating an index, but disorder can arise as data is modified. Consider searching for an OrderID with a value of 23678 in an index for OrderID. In the root page, SQL Server searches for the range that contains the value 23678. The entry for the range in the root page points to an index page in the next level. In this level, the range is divided into smaller ranges, again pointing to pages on the following level. This is done up to a point where every row can be referenced on its own. This final level is called the leaf node.
Index and data pages are linked within a logical hierarchy and also double-linked across all pages at the same level of the hierarchy, to assist when scanning across an index. For example, imagine a table with 10 extents and allocated page numbers 201 to 279, all linked in order. (Each extent contains eight pages.) If a page needed to be placed into the middle of the logical order, SQL Server finds an extent with a free page or allocates a new extent for the index. The page is logically linked into the correct position but it could be located anywhere within the database pages.
Nonclustered Index
A nonclustered index is a type of index that does not affect the layout of the data in the table in the way that a clustered index does. If the underlying table is a heap (that is, it has no clustered index), the leaf level of a nonclustered index contains pointers to where the data rows are stored. The pointers include a file number, a page number, and a slot number on the page. If the underlying table has a clustered index (that is, the pages and the data are logically linked in the order of a clustering key), the leaf level of a nonclustered index contains the clustering key. This is then used to seek through the pages of the clustered index to locate the desired rows.
Other Types of Index

SQL Server includes other types of index:
Integrated full-text search (iFTS) uses a special type of index that provides flexible searching of text.
The GEOMETRY and GEOGRAPHY data types use spatial indexes.
Primary and secondary XML indexes assist when querying XML data.
Large data warehouses can use columnstore indexes.
Index Fragmentation

Index fragmentation occurs over time as you insert and delete data in the table. For operations that read data, indexes perform best when each page is as full as possible. However, if your indexes initially start full (or relatively full), adding data to the table can cause the index pages to need splitting. Adding a new index entry to the end of an index is easy, but the process is more complicated if the entry needs to be made in the middle of an existing full index page.
Internal vs. External Fragmentation

There are two types of index fragmentation:
Internal fragmentation occurs when pages are not holding as much data as they are capable of. This often occurs when a page is split during an insert operation and can also happen when an update operation causes a row to be moved to another page. In either situation, empty space is left within pages.
External fragmentation occurs when pages that are logically sequenced are not held in sequenced page numbers. If a new index page needs to be allocated, it would be logically inserted into the correct location in the list of pages. In reality, though, it is likely to be placed at the end of the index. That means that a process needing to read the index pages in order must follow pointers to locate the pages. The process then involves accessing pages that are not sequential within the database.
Detecting Fragmentation
SQL Server provides a useful measure in the avg_fragmentation_in_percent column of the sys.dm_db_index_physical_stats dynamic management view. You can use this to analyze the level of fragmentation in your index and to decide whether to rebuild it. SQL Server Management Studio (SSMS) also provides details of index fragmentation in the properties page for each index.
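For example, the following query (a sketch for the current database) returns the fragmentation and page-density values discussed above; the DETAILED mode is required to populate avg_page_space_used_in_percent:

-- DETAILED mode populates page-density as well as fragmentation values
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.index_level,
       ips.avg_fragmentation_in_percent,
       ips.avg_page_space_used_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'DETAILED') AS ips
JOIN sys.indexes AS i
  ON ips.object_id = i.object_id AND ips.index_id = i.index_id;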
FILLFACTOR and PAD_INDEX

The availability of free space in an index page can have a significant effect on the performance of index update operations. If an index record must be inserted and there is no free space, a new index page needs to be created and the contents of the old page split across the two. This can affect performance if it happens too frequently.
You can alleviate the performance impacts of page splits by leaving empty space on each page when creating an index, including a clustered index. You can use the FILLFACTOR option when you create the index to define how full the index pages should be. FILLFACTOR defaults to zero, which means fill 100 percent. Any other value (including 100) is taken as the percentage of how full each page should be.
The following example shows how to create an index that is 70 percent full, leaving 30 percent free space on each page:

Using FILLFACTOR
ALTER TABLE Person.Contact
ADD CONSTRAINT PK_Contact_ContactID PRIMARY KEY CLUSTERED
(
    ContactID ASC
) WITH (PAD_INDEX = OFF, FILLFACTOR = 70);
GO
Note: The difference between the values zero and 100 can seem confusing. While both lead to the same outcome, 100 indicates that a specific FILLFACTOR value has been requested. The value zero indicates that no FILLFACTOR has been specified.
By default, the FILLFACTOR option only applies to leaf level pages in an index. You can use it in conjunction with the PAD_INDEX = ON option to cause the same free space to be allocated in the nonleaf levels of the index.
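For example, the following sketch creates a nonclustered index with free space reserved in both leaf and non-leaf pages; the index name and key column are hypothetical:

-- IX_Contact_LastName is a hypothetical index name
CREATE NONCLUSTERED INDEX IX_Contact_LastName
ON Person.Contact (LastName)
WITH (FILLFACTOR = 70, PAD_INDEX = ON);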
Ongoing Maintenance of Indexes

When SQL Server updates indexes during data modifications, the indexes can become fragmented. SQL Server provides two options for removing fragmentation from clustered and nonclustered indexes—rebuild and reorganize.
REBUILD

Rebuilding an index drops and recreates the index. This removes fragmentation, reclaims disk space by compacting the pages based on the specified or existing fill factor setting, and reorders the index rows in contiguous pages. When the option ALL is specified, SQL Server drops all indexes on the table and rebuilds them in a single operation. If any part of that fails, it rolls back the entire operation.
Because SQL Server performs rebuilds as logged, single operations, a single rebuild operation can use a large amount of space in the transaction log. To avoid this, you can change the recovery model of the database to the BULK_LOGGED or SIMPLE recovery model before performing the rebuild operation, so that it is a minimally-logged operation. A minimally-logged rebuild operation uses much less space in the transaction log and completes faster. Use the ALTER INDEX statement to rebuild an index.

Rebuilding an Index
ALTER INDEX CL_LogTime ON dbo.LogTime REBUILD;
REORGANIZE
Reorganizing an index uses minimal system resources. It defragments the leaf level of clustered and nonclustered indexes on tables by physically reordering the leaf-level pages to match the logical, left to right order of the leaf nodes. Reorganizing an index also compacts the index pages. The compaction is based on the existing fill factor value. It is possible to interrupt a reorganize without losing the work performed so far. For example, this means that, on a large index, you could configure partial reorganization to occur each day.
For heavily fragmented indexes (more than 30 percent), rebuilding is usually the most appropriate option to use. SQL Server maintenance plans include options to rebuild or reorganize indexes. If you do not use maintenance plans, it is important to build a job that regularly performs defragmentation of the indexes in your databases. You also use the ALTER INDEX statement to reorganize an index.

Reorganizing an Index
ALTER INDEX ALL ON dbo.LogTime REORGANIZE;
Online Index Operations

The Enterprise edition of SQL Server can perform index operations online while users are accessing the database. This is very important because many organizations have no available maintenance time windows during which to perform database maintenance operations such as index rebuilds.
When performing an online index rebuild operation, SQL Server creates a temporary mapping index that tracks data changes taking place while the index rebuild operation is occurring. For consistency, SQL Server takes a very brief shared lock on the object at the beginning and end of the operation. During the online rebuild operation, schema locks are held to prevent metadata changes. This means that users cannot change the structure of the table using commands such as ALTER TABLE while the online index rebuild operation is occurring.
Because of the extra work that needs to be performed, online index rebuild operations are typically slower than their offline counterparts. The following example shows how to rebuild an index online:

Rebuilding an Index Online
ALTER INDEX IX_Contact_EmailAddress ON Person.Contact
REBUILD WITH
(
    PAD_INDEX = OFF,
    FILLFACTOR = 80,
    ONLINE = ON,
    MAXDOP = 4
);
Note: Some indexes cannot be rebuilt online, including clustered and nonclustered indexes with large object data.
Updating Statistics

One of the main tasks that SQL Server performs when it is optimizing queries is deciding which indexes to use. This is based on statistics that SQL Server keeps about the distribution of the data in the index. SQL Server automatically updates these statistics when AUTO_UPDATE_STATISTICS is enabled in a database. This is the default setting and it is recommended that you do not disable this option.
For large tables, the AUTO_UPDATE_STATISTICS_ASYNC option instructs SQL Server to update statistics asynchronously instead of delaying query execution, where it would otherwise update an outdated statistic prior to query compilation. You can also update statistics on demand. Executing the command UPDATE STATISTICS against a table causes all statistics on the table to be updated. You can also run the sp_updatestats system stored procedure to update all statistics in a database.
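For example, the following statements update statistics for a single table and for the whole database; the table name follows the Person.Contact examples used earlier in this module:

-- Update all statistics on a single table
UPDATE STATISTICS Person.Contact;
GO
-- Update all statistics in the current database
EXEC sp_updatestats;
GO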
Demonstration: Maintaining Indexes

In this demonstration, you will see how to maintain indexes.
Demonstration Steps
Maintain Indexes
1. If you did not complete the previous demonstration in this module, ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd. Then, in the D:\Demofiles\Mod11 folder, run Setup.cmd as Administrator.
2. If SQL Server Management Studio is not already open, start it and connect to the MIA-SQL database engine instance using Windows authentication.
3. Open the MaintainingIndexes.sql script file in the D:\Demofiles\Mod11 folder.
4. Select the code under the comment Create a table with a primary key and click Execute. This creates a table with a primary key, which by default creates a clustered index on the primary key field.
5. Select the code under the comment Insert some data into the table and click Execute. This inserts 10,000 rows into the table.
6. Select the code under the comment Check fragmentation and click Execute. In the results, note the avg_fragmentation_in_percent and avg_page_space_used_in_percent values for each index level.
7. Select the code under the comment Modify the data in the table and click Execute. This updates the table.
8. Select the code under the comment Re-check fragmentation and click Execute. In the results, note that the avg_fragmentation_in_percent and avg_page_space_used_in_percent values for each index level have changed as the data pages have become fragmented.
9. Select the code under the comment Rebuild the table and its indexes and click Execute. This rebuilds the indexes on the table.
10. Select the code under the comment Check fragmentation again and click Execute. In the results, note that the avg_fragmentation_in_percent and avg_page_space_used_in_percent values for each index level indicate less fragmentation.
Lesson 3
Automating Routine Database Maintenance

You have seen how to manually perform some of the common database maintenance tasks that you will need to execute on a regular basis. SQL Server provides the Maintenance Plan Wizard, which you can use to create SQL Server Agent jobs that perform the most common database maintenance tasks.
While the Maintenance Plan Wizard makes this process easy to set up, it is important to realize that you can use the output of the wizard as a starting point for creating your own maintenance plans, or you can create plans from scratch.
Lesson Objectives
After completing this lesson, you will be able to:
Describe SQL Server maintenance plans.
Monitor database maintenance plans.
Overview of SQL Server Maintenance Plans

The SQL Server Maintenance Plan Wizard creates SQL Server Agent jobs that perform routine database maintenance tasks. It also schedules those jobs to ensure that your database is regularly backed up, performs well, and is checked for inconsistencies. The wizard creates SQL Server Integration Services (SSIS) packages that are executed by SQL Server Agent tasks. You can schedule many maintenance tasks to run automatically, including:
Database check integrity task.
Database shrink tasks.
Database index tasks, such as rebuilding and reorganizing.
Update statistics tasks.
History cleanup tasks.
Execute agent job tasks.
Backup tasks.
Cleanup tasks.
Note: You can create maintenance plans using one schedule for all tasks or with individual schedules for each one.
Monitoring Maintenance Plans

SQL Server implements maintenance plans by using SQL Server Agent jobs that run SSIS packages. Because they use SQL Server Agent jobs, you can monitor the maintenance plans by using the standard Job Activity Monitor in SSMS.
The results that the maintenance tasks generate are written to the maintenance plan tables, dbo.sysmaintplan_log and dbo.sysmaintplan_logdetail, in the msdb database. You can view the entries in these tables by querying them directly using Transact-SQL or by using the Log File Viewer. In addition, tasks can generate text-based reports, write them to the file system, and send them automatically to operators that have been defined in SQL Server Agent.
Note: You can use the cleanup tasks available in the maintenance plans to implement a retention policy for backup files, job history, maintenance plan report files, and msdb database table entries.
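As a hedged sketch, a query over the log tables described above might look like the following; the join on task_detail_id is an assumption about the msdb schema rather than something this course prescribes:

-- A sketch; the join column (task_detail_id) is an assumption
SELECT d.line1, d.start_time, d.end_time, d.succeeded, d.error_message
FROM msdb.dbo.sysmaintplan_log AS l
JOIN msdb.dbo.sysmaintplan_logdetail AS d
  ON l.task_detail_id = d.task_detail_id;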
Demonstration: Creating a Maintenance Plan

In this demonstration, you will see how to create a maintenance plan.
Demonstration Steps
Create a Maintenance Plan
1. If you did not complete the previous demonstration in this module, ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd. Then, in the D:\Demofiles\Mod11 folder, run Setup.cmd as Administrator.
2. If SQL Server Management Studio is not already open, start it and connect to the MIA-SQL database engine instance using Windows authentication.
3. In Object Explorer, under MIA-SQL, expand Management, right-click Maintenance Plans, and click Maintenance Plan Wizard.
4. In the Maintenance Plan Wizard window, click Next.
5. In the Select Plan Properties window, in the Name textbox type Daily Maintenance. Note the available scheduling options and click Change.
6. In the New Job Schedule window, in the Name textbox type Daily. In the Occurs drop-down list, click Daily. In the Occurs once at textbox, change the time to 3:00 AM, and click OK.
7. In the Select Plan Properties window, click Next. Then in the Select Maintenance Tasks page, select the following tasks and click Next:
   o Check Database Integrity
   o Reorganize Index
   o Update Statistics
   o Back Up Database (Full)
8. On the Select Maintenance Task Order page, click Next.
9. On the Define Database Check Integrity Task page, select the AdventureWorks database and click OK. Then click Next.
10. On the Define Reorganize Index Task page, select the AdventureWorks database and click OK, ensure that Tables and Views is selected, and click Next.
11. On the Define Update Statistics Task page, select the AdventureWorks database and click OK, ensure that Tables and Views is selected, and click Next.
12. On the Define Back Up Database (Full) Task page, select the AdventureWorks database and click OK. Then on the Destination tab, ensure that Create a backup file for every database is selected, change the Folder value to D:\Demofiles\Mod11\Backups\ and click Next.
13. On the Select Report Options page, ensure that Write a report to a text file is selected, change the Folder location to D:\Demofiles\Mod11\ and click Next.
14. On the Complete the Wizard page, click Finish. Then, when the operation has completed, click Close.
15. In Object Explorer, under Maintenance Plans, right-click Daily Maintenance and click Execute.
16. Wait a minute or so until the maintenance plan succeeds, and in the Execute Maintenance Plan dialog box, click Close. Then right-click Daily Maintenance and click View History.
17. In the Log File Viewer - MIA-SQL dialog box, expand the Date value for the Daily Maintenance plan to see the individual tasks.
18. Keep clicking Refresh and expanding the tasks until four tasks have been completed. Then click Close.
19. In the D:\Demofiles\Mod11 folder, view the Daily Maintenance_Subplan_1_xxxxx.txt file that has been created.
20. In the Backups folder, verify that a backup of the AdventureWorks database has been created.
Lab: Performing Ongoing Database Maintenance

Scenario
You are a database administrator (DBA) at Adventure Works Cycles, with responsibility for databases on the MIA-SQL SQL Server instance. You must perform the ongoing maintenance of databases on this instance, including ensuring database integrity and managing index fragmentation.
Objectives
After completing this lab, you will be able to:
Use DBCC CHECKDB.
Defragment indexes.
Create database maintenance plans.
Estimated Time: 45 minutes
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
Exercise 1: Managing Database Integrity

Scenario
There has been a disk failure in the I/O subsystem. The disk has been replaced, but you want to check the consistency of your existing databases. You will execute DBCC CHECKDB to verify the logical and physical integrity of the AWDataWarehouse, HumanResources, and InternetSales databases on the MIA-SQL instance. The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. Check Database Consistency
3. Repair a Corrupt Database
Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Run Setup.cmd in the D:\Labfiles\Lab11\Starter folder as Administrator.
Task 2: Check Database Consistency
1. In SQL Server Management Studio, use the DBCC CHECKDB command to check the integrity of the following databases:
   o AWDataWarehouse
   o HumanResources
   o InternetSales
2. Note any issues reported, and determine the minimum repair level required.
Task 3: Repair a Corrupt Database
1. Use the DBCC CHECKDB command to repair any databases for which integrity issues were discovered in the previous task. To accomplish this, you will need to:
   a. Alter the database to set it into single user mode.
   b. Run the DBCC CHECKDB command with the appropriate repair option.
   c. Alter the database to return it to multi-user mode.
2. Use the DBCC CHECKDB command to verify that the integrity issues have been resolved.
Results: After this exercise, you should have used the DBCC CHECKDB command to check database consistency, and corrected any issues that were found.
Exercise 2: Managing Index Fragmentation

Scenario
You have identified fragmentation in the Employees.Employee table in the HumanResources database, and you are sure that performance is decreasing as the amount of fragmentation increases. You will rebuild the indexes for any of the main database tables that are heavily fragmented. The main tasks for this exercise are as follows:
1. View Index Fragmentation
2. Defragment Indexes
Task 1: View Index Fragmentation
1. Query the sys.dm_db_index_physical_stats function to retrieve information about index fragmentation for indexes on the Employees.Employee table in the HumanResources database.
2. Note the avg_page_space_used_in_percent and avg_fragmentation_in_percent values for each index level.

Task 2: Defragment Indexes
1. Rebuild all indexes on the Employees.Employee table in the HumanResources database.
2. Query the sys.dm_db_index_physical_stats function to retrieve information about index fragmentation for indexes on the Employees.Employee table in the HumanResources database, to verify that index fragmentation has decreased.
Results: After this exercise, you should have rebuilt fragmented indexes.
Exercise 3: Implementing a Maintenance Plan

Scenario
You have also identified a degradation of performance in the application when proper index maintenance has not been performed. You want to ensure early detection of any consistency issues in the HumanResources database, and that index maintenance is automatically executed on a scheduled basis. To make sure this regular maintenance occurs, you will create a database maintenance plan to schedule core operations on a weekly basis. The main tasks for this exercise are as follows:
1. Create a Maintenance Plan
2. Run a Maintenance Plan
Task 1: Create a Maintenance Plan
1. Create a database maintenance plan for the HumanResources database. The maintenance plan should run every day at 6:00 PM, and perform the following operations:
   o Check the integrity of the HumanResources database.
   o Reorganize all indexes on all tables and views in the HumanResources database.
   o Update statistics on all tables and views in the HumanResources database.
   o Perform a full backup of the HumanResources database, storing the backup in the R:\Backups\ folder.
   o Write a report to a text file in the D:\Labfiles\Lab11\Starter folder.
Task 2: Run a Maintenance Plan
1. Run the maintenance plan you created in the previous task.
2. View the history of the maintenance plan in the Log File Viewer in SQL Server Management Studio.
3. When the maintenance plan has performed its four tasks, view the report it has generated.
4. Verify that a backup of the HumanResources database has been created in the R:\Backups folder.
Results: After this exercise, you should have created the required database maintenance plan.

Question: After discovering that the InternetSales database contained corrupt pages, what would have been a preferable solution to repairing the database with potential data loss?
Question: If you need to execute a maintenance plan with timing that cannot be accommodated by a single schedule, what can you do?
Module Review and Takeaways

In this module, you learned how to use the DBCC CHECKDB command to detect and repair database integrity issues. You then learned how to observe and repair index fragmentation. Finally, you learned how to use a maintenance plan to automate core maintenance tasks.

Best Practice: When planning ongoing database maintenance, consider the following best practices:
Run DBCC CHECKDB regularly.
Synchronize DBCC CHECKDB with your backup strategy.
If corruption occurs, consider restoring the database from a backup, and only repair the database as a last resort.
Defragment your indexes when necessary.
Update statistics on a schedule if you don’t want it to occur during normal operations.
Use maintenance plans to implement regular tasks.
Review Question(s)
Question: What option should you consider using when running DBCC CHECKDB against large production databases?
Module 12
Automating SQL Server 2014 Management

Contents:
Module Overview
Lesson 1: Automating SQL Server Management
Lesson 2: Implementing SQL Server Agent Jobs
Lesson 3: Managing SQL Server Agent Jobs
Lesson 4: Managing Job Step Security Contexts
Lesson 5: Managing Jobs on Multiple Servers
Lab: Automating SQL Server Management
Module Review and Takeaways
Module Overview
The tools provided by Microsoft® SQL Server® make administration easy when compared with other database engines. Even when tasks are easy to perform, though, it is common to need to repeat a task many times. Efficient database administrators learn to automate repetitive tasks. This can help avoid situations where an administrator forgets to execute a task at the required time. Perhaps more important, though, is that automating tasks helps to ensure that they are performed consistently each time they are executed.
This module describes how to use SQL Server Agent to automate jobs, how to configure security contexts for jobs, and how to implement multi-server jobs.
Objectives
After completing this module, you will be able to:
Describe methods for automating SQL Server management.
Create jobs, job step types, and schedules.
Manage SQL Server Agent jobs.
Configure job security contexts.
Configure master and target servers.
Lesson 1
Automating SQL Server Management
There are many benefits that you can gain from the automation of SQL Server management. Most of the benefits center on the reliable, consistent execution of routine management tasks. SQL Server is a flexible platform that provides a number of ways to automate management, but the most important tool for this is the SQL Server Agent. All database administrators working with SQL Server need to be familiar with the configuration and ongoing management of SQL Server Agent.
Lesson Objectives
After completing this lesson, you will be able to:
Explain the benefits of automating SQL Server management.
Describe the available options for automating SQL Server management and the framework that SQL Server Agent provides.
Describe SQL Server Agent.
Benefits of Automating SQL Server Management

All efficient database administrators automate their routine administrative tasks. Some benefits you can gain from the automation of SQL Server management include:
Reduced administrative load. Unfortunately, some administrators who work with SQL Server, Windows®, and other tools, see their roles in terms of a constant stream of repetitive administrative tasks. For example, a Windows administrator at a University department might receive regular requests to create a large number of user accounts. The administrator might be happy to create each account one by one, using the standard tooling. A more efficient administrator learns to write a script to create users and execute that instead of performing the operation manually.
The same sort of situation occurs with routine tasks in SQL Server. While you can perform these tasks individually or manually, efficient database administrators do not do this. They automate all their routine and repetitive tasks. Automation removes the repetitive workload from the administrators and enables them to manage larger numbers of systems or to perform higher-value tasks for the organization.
Reliable execution of routine tasks. When you perform routine tasks manually, there is always a chance that you might overlook a vital task. For example, a database administrator could forget to perform database backups. Automation enables administrators to focus on exceptions that occur during the routine tasks, rather than on the execution of the tasks.
Consistent execution of routine tasks. Another problem that can occur when you perform routine tasks manually is that you may not perform the tasks the same way each time. Imagine a situation where a database administrator archives some data from a set of production tables into a set of history tables every Monday morning. The new tables need to have the same name as the originals with a suffix that includes the current date.
While the administrator might remember to perform this task every Monday morning, there is a likelihood that one or more of the following errors could occur:
Copy the wrong tables.
Copy only some of the tables.
Forget what the correct date is when creating the suffix.
Format the date in the suffix incorrectly.
Copy data into the wrong archive table.
Anyone who has been involved in ongoing administration of systems will tell you that these and other problems would occur from time to time, even when the tasks are executed by experienced and reliable administrators. Automating routine tasks can assist greatly in making sure that they are performed consistently every time.
Proactive Management
After you automate routine tasks, it is possible that their execution fails but no one notices. For example, an automated backup of databases may fail, but this is not noticed until one of the backups is needed. As well as automating your routine tasks, you need to ensure that you create notifications telling you when the tasks fail, even if you cannot imagine a situation where they could. For example, you may create a backup strategy that produces database backups in a given folder. The job may run reliably for years until another administrator inadvertently deletes or renames the target folder. You need to know as soon as this problem occurs so that you can rectify the situation.
A proactive administrator will try to detect potential problems before they occur. For example, rather than receiving a notification that a job failed because a disk was full, an administrator might schedule regular checks of available disk space and make sure a notification is received when it is starting to get too low. SQL Server provides alerts on system and performance conditions for this type of scenario.
The SQL Server Agent Service

The primary method for automating management, administrative, and other routine tasks when working with SQL Server 2014 is to use the SQL Server Agent. The SQL Server Agent runs as a Windows service and needs to be running constantly to perform its main roles of executing jobs, firing alerts, and contacting operators. Because of this, you should configure it to begin automatically when the operating system starts. The default setting after installation is for SQL Server Agent to be started manually, so you need to change this before working with it.
You can configure the start mode for SQL Server Agent in the properties of the SQL Server Agent service in SQL Server Configuration Manager. There are three available start modes:
Automatic. The service begins when the operating system starts.
Disabled. The service will not start, even if you attempt to do so manually.
Manual. The service needs to be started manually.
You can also configure the SQL Server Agent service to restart automatically if it stops unexpectedly, by using the properties page for the SQL Server Agent in SQL Server Management Studio (SSMS). To restart automatically, the SQL Server Agent service account must be a member of the local Administrators group for the computer where SQL Server is installed—but this is not considered a best practice. A better option would be to use an external monitoring tool such as System Center Operations Manager to monitor and restart the SQL Server Agent service if necessary.
SQL Server Agent Objects

The SQL Server Agent supplies a management framework that is based on three core object types:
Jobs that you can use to automate tasks.
Alerts that you can use to respond to events.
Operators that define administrative contacts for alerts.
Jobs
You can use jobs to schedule command-line scripts, Windows PowerShell® scripts, Transact-SQL scripts, SQL Server Integration Service (SSIS) packages and so on. You can also use them to schedule a wide variety of task types, including tasks involved in the implementation of other SQL Server features. These include replication, Change Data Capture (CDC), Data Collection, and Policy Based Management (PBM). Note: Replication, CDC, and PBM are advanced topics that are beyond the scope of this course.
Alerts
The alert system provided by SQL Server Agent is capable of responding to a wide variety of alert types, including SQL Server error messages, SQL Server performance counter events, and Windows Management Instrumentation (WMI) alerts.
Operators
You can configure an action to happen in response to an alert, such as the execution of a SQL Server Agent job or sending a notification to an administrator. In SQL Server Agent, administrators that you can notify are called operators. One common way of notifying operators is by using Simple Mail Transfer Protocol (SMTP)-based email. (Alerts and operators are discussed later in this course.) Note: There are other SQL Server features that you can use to automate complex monitoring tasks, for example, Extended Events—but this is beyond the scope of this course.
Lesson 2
Implementing SQL Server Agent Jobs
Because SQL Server Agent is the primary tool for automating tasks within SQL Server, database administrators need to be proficient at creating and configuring SQL Server Agent jobs. You can create jobs to implement a variety of different types of task and categorize them for ease of management. In this lesson, you will learn how to create, schedule, and script jobs.
Lesson Objectives After completing this lesson, you will be able to:
Define jobs, job step types, and job categories.
Create job steps.
Schedule jobs for execution.
Defining Jobs, Job Step Types, and Job Categories
Creating a job involves putting together a series of steps that the job will execute, along with the workflow that determines which steps should be executed, and in which order. In most jobs, the steps will execute in sequential order, but you can control the order if required. For example, you may want one step to execute if the previous one succeeded, but a different step to execute if the previous one failed.
After you determine the steps, you need to decide when to execute the job. You will likely run most of your SQL Server Agent jobs on defined schedules. SQL Server provides the ability to create a flexible set of schedules that you can share between jobs.
It is important to learn to script jobs that have been created. This enables you to quickly recreate the job if a failure occurs and to reconstruct it in other environments. For example, you may create your jobs in a test environment, but then need to move them to your production environment.
Job Step Types
Every job step has an associated type that defines which operation to run. The most commonly used types include:
Executing a command-line script, batch of commands, or application.
Executing a Transact-SQL statement.
Executing a Windows PowerShell script.
Executing SQL Server Integration Services and Analysis Services commands and queries.
Note: While the ability to execute ActiveX® scripts is retained for backwards compatibility, this option is deprecated and you should not use it for new development.
Creating Jobs
You can use SSMS to create jobs, or you can execute the sp_add_job system stored procedure, together with other system stored procedures to add steps and schedules to the job. After you create the job, SQL Server stores the job definition in the msdb database, alongside all the other SQL Server Agent configuration. This example shows how to create a simple job.
Using sp_add_job
USE msdb;
GO
EXEC sp_add_job
    @job_name = 'HR database backup',
    @enabled = 1,
    @description = 'Backup the HR database';
GO
Job Categories
You can organize your jobs into categories, either by using the SQL Server built-in categories, such as Database Maintenance, or by defining your own. This is useful when you need to perform actions that are associated with jobs in a specific category. For example, you could create a job category called SQL Server 2005 Policy Check and write a PowerShell script to execute all the jobs in that category against your SQL Server 2005 servers.
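You can define your own categories by using the dbo.sp_add_category system stored procedure in the msdb database. The following is a minimal sketch of the scenario above; the category name is illustrative.
Creating a Job Category
USE msdb;
GO
EXEC dbo.sp_add_category
    @class = N'JOB',
    @type = N'LOCAL',
    @name = N'SQL Server 2005 Policy Check';
GO
You can then assign a job to the category through the @category_name parameter of sp_add_job or sp_update_job.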
Creating Job Steps
After you create a job, you can add job steps to it by using SSMS or by executing the sp_add_jobstep system stored procedure. The key arguments of the stored procedure are described below.
@job_id. Unique identification number of the job to which to add the step (specify this or @job_name, not both).
@job_name. User-friendly name of the job to which to add the step (specify this or @job_id, not both).
@step_id. Unique identification number of the job step, starting at 1 and incrementing by 1.
@step_name. User-friendly name of the job step.
@subsystem. Subsystem for SQL Server Agent to use to execute the command (for example, CMDEXEC or PowerShell).
@command. Command to execute.
@on_success_action. Action to perform if the step succeeds (for example, quit or go to the next step).
@on_fail_action. Action to perform if the step fails (for example, quit or go to the next step).
@retry_attempts. Number of retry attempts if the step fails.
@retry_interval. Time to wait between retry attempts, in minutes.
By default, SQL Server advances to the next job step upon success and stops when a job step fails. However, a job step can continue with any step defined in the job, using the success or failure flags. By configuring the action to occur on the success and failure of each job step, you can create a workflow that determines the overall logic flow of the job. Note that, as well as each job step having a defined outcome, the overall job reports an outcome. This means that, even though some job steps may succeed, the overall job might still report failure.
You can specify the number of times that SQL Server should attempt to retry execution of a job step if the step fails. You can also specify the retry interval (in minutes). For example, if the job step requires a connection to a remote server, you could define several retry attempts in case the connection fails. This example shows how to add a job step to a job.
Using sp_add_jobstep
USE msdb;
GO
EXEC sp_add_jobstep
    @job_name = 'HR database backup',
    @step_name = 'Set HR database to read only',
    @subsystem = 'TSQL',
    @command = 'ALTER DATABASE HR SET READ_ONLY',
    @retry_attempts = 2,
    @retry_interval = 2;
GO
Scheduling Jobs for Execution
You can define one or more schedules for every job and use them to start jobs at requested times. In addition to the standard recurring schedules, you can also assign a number of special recurrence types:
One time execution.
Start automatically when SQL Server Agent starts.
Start whenever the central processing unit (CPU) becomes idle.
When the recurrence pattern for a job is complex, you may need to create multiple schedules. While you can share each schedule between several jobs, you should avoid having too many starting at the same time.
Even though a job may have multiple schedules, SQL Server will limit it to a single concurrent execution. If you try to run a job manually while it is running as scheduled, SQL Server Agent refuses the request. Similarly, if a job is still running when it is scheduled to run again, SQL Server Agent refuses to let it start. The following example shows how to create a schedule named Shift 1 and attach it to the job created in earlier examples.
Using sp_add_schedule and sp_attach_schedule
USE msdb;
GO
EXEC sp_add_schedule
    @schedule_name = 'Shift 1',
    @freq_type = 4,
    @freq_interval = 1,
    @freq_subday_type = 0x8,
    @freq_subday_interval = 1,
    @active_start_time = 080000,
    @active_end_time = 170000;
GO
EXEC sp_attach_schedule
    @job_name = 'HR database backup',
    @schedule_name = 'Shift 1';
GO
Demonstration: Creating Jobs
In this demonstration, you will see how to:
Create a job.
Script a task to a job.
Generate scripts for existing jobs.
Demonstration Steps
Create a Job
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod12 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using Windows authentication.
4. In Object Explorer, expand SQL Server Agent and Jobs to view any existing jobs. Then right-click Jobs and click New Job.
5. In the New Job dialog box, on the General page, in the Name box, type Check AdventureWorks DB.
6. In the New Job dialog box, on the Steps page, click New.
7. In the New Job Step dialog box, on the General page, in the Step name box, type Make Folder. Then ensure that Operating system (CmdExec) is selected in the Type drop-down list and in the Command area, type the following command, which calls a batch file to create an empty folder named AdventureWorks in the D:\Demofiles\Mod12 folder.
D:\Demofiles\Mod12\MakeDir.cmd
8. In the New Job Step dialog box, click OK.
9. In the New Job dialog box, on the Steps page, click New.
10. In the New Job Step dialog box, on the General page, in the Step name box, type Get DB Info. Then ensure that Transact-SQL script (T-SQL) is selected in the Type drop-down list and in the Command area, type the following command.
EXEC sp_helpdb AdventureWorks;
11. In the New Job Step dialog box, on the Advanced page, in the Output file box, type D:\Demofiles\Mod12\AdventureWorks\DB_Info.txt. Then click OK.
12. In the New Job dialog box, on the Steps page, click New.
13. In the New Job Step dialog box, on the General page, in the Step name box, type Check DB. Then ensure that Transact-SQL script (T-SQL) is selected in the Type drop-down list and in the Command area, type the following command.
DBCC CHECK ('AdventureWorks');
14. In the New Job Step dialog box, on the Advanced page, in the Output file box, type D:\Demofiles\Mod12\AdventureWorks\CheckDB.txt. Then click OK.
15. In the New Job dialog box, on the Steps page, verify that the Start step is set to 1:Make Folder and note the On Success and On Failure actions for the steps in the job.
16. In the New Job dialog box, on the Schedules page, click New.
17. In the New Job Schedule dialog box, in the Name box, type Weekly Jobs; in the Frequency area, ensure that only Sunday is selected; and in the Daily frequency area, ensure that the option to occur once at 12:00 AM is selected. Then click OK.
18. In the New Job dialog box, click OK. Then verify that the job appears in the Jobs folder in Object Explorer.
Script a Task to a Job
1. In Object Explorer, expand Databases. Then right-click AdventureWorks, point to Tasks, and click Back Up.
2. In the Back Up Database - AdventureWorks dialog box, select the existing backup destination and click Remove. Then click Add and in the Select Backup Destination dialog box, in the File name box, type D:\Demofiles\Mod12\Backups\AdventureWorks.bak and click OK.
3. In the Back Up Database - AdventureWorks dialog box, in the Script drop-down list, select Script Action to Job.
4. In the New Job dialog box, on the General page, note the default name for the job (Back Up Database - AdventureWorks). Then on the Steps page, note that the job includes one Transact-SQL step named 1.
5. In the New Job dialog box, on the Schedules page, click Pick. Then in the Pick Schedule for Job Back Up Database - AdventureWorks dialog box, select the Weekly Jobs schedule you created previously and click OK.
6. In the New Job dialog box, click OK. Then, in the Back Up Database - AdventureWorks dialog box, click Cancel.
7. Verify that the job appears in the Jobs folder in Object Explorer.
Generate Scripts for Existing Jobs
1. In Object Explorer, right-click the Check AdventureWorks DB job, point to Script Job as, point to CREATE To, and click New Query Editor Window. This generates the Transact-SQL code necessary to create the job.
2. In Object Explorer, right-click the Back Up Database - AdventureWorks job, point to Script Job as, point to CREATE To, and click Clipboard. Then place the insertion point at the end of the Transact-SQL code in the query editor window and on the Edit menu, click Paste.
3. Save the Transact-SQL script as Create Jobs.sql in the D:\Demofiles\Mod12 folder. Using this technique to generate scripts to create jobs is a common way to ensure that jobs can be recreated if they are accidentally deleted or required on a different server.
4. Keep SQL Server Management Studio open for the next demonstration.
Lesson 3
Managing SQL Server Agent Jobs
When you automate administrative tasks, you need to ensure that they execute correctly. To help with this, SQL Server writes entries to history tables in the msdb database on the completion of each job.
In this lesson, you will learn how to query the history tables, as well as how to troubleshoot any issues that may occur.
Lesson Objectives After completing this lesson, you will be able to:
View job history.
Query SQL Server Agent-related system tables and views.
Troubleshoot failed jobs.
Viewing Job History
Every job and job step has an outcome. If either one fails, you need to review the job or step to find out why, and rectify the issue before the job runs again. SQL Server Agent tracks the outcomes of jobs and their steps in system tables in the msdb database. You can also choose to write this information to the Windows Application log or the SQL Server log. You can view the history for each job by using SSMS or by querying the job history tables. By default, the most recent 1,000 entries of job history are retained; however, you can configure this retention policy to base it instead on the age or size of the data.
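If you prefer Transact-SQL, the sp_help_jobhistory system stored procedure in msdb returns the same history information. The following is a minimal sketch, reusing the HR database backup job from the earlier examples.
Viewing Job History with Transact-SQL
USE msdb;
GO
EXEC dbo.sp_help_jobhistory
    @job_name = N'HR database backup',
    @mode = N'FULL';
GO
The @mode parameter accepts SUMMARY or FULL; FULL includes the message text for each step.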
The Object Explorer window in SSMS also provides a Job Activity Monitor. This displays a view of currently executing jobs and data showing the results of the previous execution, along with the scheduled time for the next execution of the job.
Querying SQL Server Agent-related System Tables and Views
There are many SQL Server Agent-related system tables and views that you can query to retrieve information about jobs, alerts, schedules, and operators. All the tables are stored in the msdb database in the dbo schema. For example, job history is held in the dbo.sysjobhistory table and there is a list of jobs in the dbo.sysjobs table. The following example queries the dbo.sysjobs and dbo.sysjobhistory tables to retrieve information about jobs that have run.
Querying Job Tables
SELECT j.name, jh.run_date, jh.run_time, jh.message
FROM msdb.dbo.sysjobhistory AS jh
INNER JOIN msdb.dbo.sysjobs AS j
    ON jh.job_id = j.job_id
WHERE jh.step_id = 0;
GO
Note: The WHERE clause specifies a step_id of 0. Job steps begin at one, not zero, but an entry in the dbo.sysjobhistory table is made with a job step_id of zero to record the overall outcome. The outcome of individual job steps can be obtained by querying step_id values greater than zero. Additional Reading: For more information about the system tables which store SQL Server Agent data, see SQL Server Agent Tables (Transact-SQL) in SQL Server Books Online.
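Building on the note above, a query such as the following, offered here as a sketch, returns the outcome of individual steps rather than overall job outcomes, filtered to failures only; a run_status value of 0 in dbo.sysjobhistory indicates a failed step.
Finding Failed Job Steps
SELECT j.name, jh.step_id, jh.step_name, jh.run_date, jh.run_time, jh.message
FROM msdb.dbo.sysjobhistory AS jh
INNER JOIN msdb.dbo.sysjobs AS j
    ON jh.job_id = j.job_id
WHERE jh.step_id > 0
    AND jh.run_status = 0;
GO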
Troubleshooting Failed Jobs
Jobs do not always execute as expected and will sometimes fail to execute at all. It is important to follow a consistent process when attempting to work out why a job is failing. There are four basic steps for troubleshooting jobs: checking SQL Server Agent status, reviewing job history, checking job execution, and checking access to dependencies.
Checking SQL Server Agent Status
If SQL Server Agent is not running, no jobs can run. Make sure the service is set to start automatically and, if jobs are failing, attempt to start it manually. If the service will still not start, check the following (a query for checking the service status appears after this list):
That the service account for the service is valid, that the password for the account has not changed, and that the account is not locked out. If any of these checks fail, the service will not start, and details of the problem will be written to the computer's System event log.
That the msdb database is online. If the msdb database is corrupt, suspect, or offline, SQL Server Agent will not start.
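Provided that the database engine itself is running, you can check the SQL Server Agent service status and start mode from Transact-SQL by querying the sys.dm_server_services dynamic management view (this requires the VIEW SERVER STATE permission). A minimal sketch:
Checking Service Status
SELECT servicename, startup_type_desc, status_desc, service_account
FROM sys.dm_server_services;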
Reviewing Job History
Review the job outcome to identify the last step that ran. If the job was unsuccessful because a job step failed, which is the most common situation, the error cannot be seen at the job level. You must review the outcome of the individual job step that failed.
Checking Job Execution
If SQL Server Agent is running but an individual job will not run, check the following items (the example query after this list shows one way to check them):
That the job is enabled. Disabled jobs will not run.
That the job is scheduled. The schedule may be incorrect or the time for the next scheduled execution may be in the future.
That the schedule is enabled. Both jobs and schedules can be disabled and a job will not run on a disabled schedule.
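A query such as the following, suggested here as a sketch, returns the enabled flags and next scheduled run for each job by joining dbo.sysjobs, dbo.sysjobschedules, and dbo.sysschedules:
Checking Job and Schedule Status
SELECT j.name AS JobName, j.enabled AS JobEnabled,
    s.name AS ScheduleName, s.enabled AS ScheduleEnabled,
    js.next_run_date, js.next_run_time
FROM msdb.dbo.sysjobs AS j
LEFT JOIN msdb.dbo.sysjobschedules AS js
    ON j.job_id = js.job_id
LEFT JOIN msdb.dbo.sysschedules AS s
    ON js.schedule_id = s.schedule_id;
Note that next_run_date and next_run_time are stored as integers in the forms YYYYMMDD and HHMMSS.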
Checking Access to Dependencies
Verify that all dependent objects in the job, such as databases, files, and procedures, are available. Jobs often run in a different security context from that of the user who created them; incorrect security settings are a common cause of job execution failures.
Demonstration: Viewing Job History
In this demonstration, you will see how to:
Run jobs.
Troubleshoot a failed job.
Demonstration Steps
Run Jobs
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Management Studio, in Object Explorer, right-click the Back Up Database - AdventureWorks job and click Start Job at Step. Then, when the job has completed successfully, click Close.
3. In Object Explorer, right-click the Check AdventureWorks DB job and click Start Job at Step. Then select step 1 and click Start. Note that the job fails, and click Close.
Troubleshoot a Failed Job
1. In Object Explorer, right-click the Back Up Database - AdventureWorks job and click View History.
2. In the Log File Viewer - MIA-SQL dialog box, expand the date for the most recent instance of the job, and note that all steps succeeded. Then click Close.
3. In Object Explorer, right-click the Check AdventureWorks DB job and click View History.
4. In the Log File Viewer - MIA-SQL dialog box, expand the date for the most recent instance of the job, and note that the third step failed.
5. Select the step that failed, and in the pane at the bottom of the dialog box, view the message that was returned. Then click Close.
6. In Object Explorer, double-click the Check AdventureWorks DB job. Then in the Job Properties - Check AdventureWorks DB dialog box, on the Steps page, select step 3 (Check DB) and click Edit.
7. In the Job Step Properties - Check DB dialog box, modify the command as follows and click OK.
DBCC CHECKDB ('AdventureWorks');
8. In the Job Properties - Check AdventureWorks DB dialog box, click OK.
9. In Object Explorer, right-click the Check AdventureWorks DB job and click Start Job at Step. Then select step 1 and click Start.
10. In Object Explorer, double-click Job Activity Monitor and note the Status of the Check AdventureWorks DB job.
11. Click Refresh until the Status changes to Idle, and verify that the Last Run Outcome for the job is Succeeded. Then click Close to close the Job Activity Monitor.
12. In the Start Jobs - MIA-SQL dialog box (which may be behind SQL Server Management Studio), verify that the job completed with a status of Success, and click Close.
13. In the D:\Demofiles\Mod12 folder, view the text files generated by the Check AdventureWorks DB job in the AdventureWorks folder, and verify that a backup file was created in the Backups folder by the Back Up Database - AdventureWorks job.
14. Keep SQL Server Management Studio open for the next demonstration.
Lesson 4
Managing Job Step Security Contexts
By default, SQL Server Agent executes job steps in the context of the SQL Server Agent service account. When a wide variety of job types must be automated using jobs, this account requires the appropriate permissions to perform every job step, which can result in a highly privileged account. To avoid the potential security implications of having a single account with a wide range of permissions, you can create proxy accounts with the minimal permissions required to perform specific tasks, and use them to run different categories of job step. This lesson discusses ways to control the security context of the steps in your jobs.
Lesson Objectives After completing this lesson, you will be able to:
Describe considerations for job step security contexts.
Create credentials.
Create proxies and associate them with job steps.
Job Step Security Contexts
When planning SQL Server Agent jobs, you must consider the security context under which the steps in those jobs will be executed.
Transact-SQL Job Steps
Generally, when SQL Server Agent runs a Transact-SQL job step, it impersonates the job owner. However, if the owner of the job is a member of the sysadmin fixed server role, the job step runs in the security context of the SQL Server Agent service account, unless the sysadmin chooses to have the step impersonate another user.
Other Job Step Types
For job step types that are not Transact-SQL based, a different security model is used. When a member of the sysadmin fixed server role runs a job step, by default it executes using the SQL Server Agent service account. However, because you may need to automate a variety of job step types, granting this account all the permissions required by every job step may introduce a security risk: it makes the SQL Server Agent service account a highly privileged account which, if compromised, could provide considerable access to the system.
Proxy Accounts
As an alternative to using the SQL Server Agent account, you can use a proxy account to associate a job step with a Windows identity by using an object called a credential. You can create proxy accounts for all available subsystems other than Transact-SQL steps. Using proxy accounts means that you can use different Windows identities to perform the various tasks required in jobs. It provides tighter security by avoiding the need for a single account to have all the permissions required to execute all jobs.
Credentials
A credential is a SQL Server object that contains the authentication information required to connect to a resource outside SQL Server. Most credentials contain a Windows user name and password. If multiple SQL Server logins require the same level of access to the same set of resources, you can map a single credential to each of them. However, you cannot map a SQL Server login to more than one credential. SQL Server automatically creates some credentials that are associated with specific endpoints; these are called system credentials and have names that are prefixed with two hash signs (##).
Creating Credentials
You create credentials by using the Transact-SQL CREATE CREDENTIAL statement or SSMS.
When you create a credential, you specify the password (termed the secret), and SQL Server encrypts it by using the service master key.
Creating a Credential
USE master;
GO
CREATE CREDENTIAL Agent_Export
WITH IDENTITY = N'ADVENTUREWORKS\Agent_Export',
    SECRET = 'Pa$$w0rd';
GO
After you create the credential, you can map it to a login by using the ALTER LOGIN statement.
Mapping a Credential to a Login
ALTER LOGIN User1 WITH CREDENTIAL = Agent_Export;
GO
Managing Credentials
SQL Server provides the sys.credentials system view to give catalog information about existing credentials. Because the password for a Windows account may change over time, you can update a credential with new values by using the ALTER CREDENTIAL statement. You need to supply both the user name and the password (that is, the secret) to the ALTER CREDENTIAL statement.
Altering Credentials
ALTER CREDENTIAL Agent_Export
WITH IDENTITY = N'ADVENTUREWORKS\Agent_Export',
    SECRET = 'NewPa$$w0rd';
GO
You can remove a credential by using the DROP CREDENTIAL statement.
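Continuing the example above, the following statement removes the Agent_Export credential.
Dropping a Credential
USE master;
GO
DROP CREDENTIAL Agent_Export;
GO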
Proxy Accounts
For a job step in a SQL Server Agent job to use a credential, you need to map the credential to a proxy account in SQL Server. There is a built-in set of proxy accounts and you can also create your own. SQL Server proxy accounts define the security context for a job step. SQL Server Agent uses the proxy account to access the security credentials for a Microsoft Windows user.
Note: The Windows user specified in the credential must have the Log on as a batch job permission on the computer where SQL Server is running.
Creating Proxy Accounts
You can create a proxy account by using the dbo.sp_add_proxy stored procedure in the msdb database, or by using SSMS.
Creating a Proxy Account
EXEC dbo.sp_add_proxy
    @proxy_name = 'Export_Proxy',
    @enabled = 1,
    @description = 'Proxy account for exporting data.',
    @credential_name = 'Agent_Export';
Creating a proxy account does not change existing permissions for the Windows account that is specified in the credential. For example, you can create a proxy account for a Windows account that does not have permission to connect to an instance of SQL Server. Job steps using that proxy account are still unable to connect to SQL Server.
A user must have permission to use a proxy account before they can specify it in a job step. By default, only members of the sysadmin fixed server role have permission to access all proxy accounts, but you can grant permissions to three types of security principals (an example follows this list):
SQL Server logins.
Server roles.
Roles within the msdb database.
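As referenced above, you grant a principal access to a proxy account by using the dbo.sp_grant_login_to_proxy stored procedure in msdb. The following is a minimal sketch, reusing the User1 login and Export_Proxy proxy from earlier examples.
Granting Access to a Proxy Account
USE msdb;
GO
EXEC dbo.sp_grant_login_to_proxy
    @login_name = N'User1',
    @proxy_name = N'Export_Proxy';
GO
The procedure also accepts @fixed_server_role or @msdb_role in place of @login_name to grant access to a role instead of an individual login.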
Proxy Accounts and Subsystems
A subsystem is a predefined object that represents a set of functionality available in SQL Server, for example, the operating system or PowerShell. SQL Server has built-in proxy accounts for each type of subsystem that it utilizes, and you can associate each proxy account with one or more subsystems.
Subsystems assist in providing security control because they segment the functions that are available to a proxy account. A job step that uses a proxy account can access the specified subsystems by using the security context of the Windows user. When SQL Server Agent runs a job step that uses a proxy account, it impersonates the credentials defined in the proxy account and runs the job step by using that security context. SQL Server Agent checks subsystem access for a proxy account every time a job step runs. If the security environment has changed and the proxy account no longer has access to the subsystem, the job step fails.
The following code example grants the Export_Proxy proxy account access to the SSIS subsystem.
Granting a Proxy Account Access to a Subsystem
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name = 'Export_Proxy',
    @subsystem_name = 'Dts';
Managing Proxy Accounts and Subsystems
SQL Server stores the configuration data for SQL Server Agent, including that for proxy accounts, in the msdb database. This database contains a set of system views that you can use to query information about the proxy accounts in your system:
dbo.sysproxies. Returns one row per proxy account.
dbo.sysproxylogin. Returns which SQL Server logins are associated with which proxy accounts.
dbo.syssubsystems. Returns one row per subsystem.
dbo.sysproxysubsystem. Returns which subsystem is used by each proxy account.
The following example shows how to return related information about credentials and their proxy accounts.
Reviewing Proxy Account and Credential Information
USE msdb;
GO
SELECT p.name AS ProxyName,
    c.name AS CredentialName,
    p.description AS ProxyDescription
FROM dbo.sysproxies AS p
INNER JOIN sys.credentials AS c
    ON p.credential_id = c.credential_id;
Demonstration: Configuring Security Context
In this demonstration, you will see how to:
Create a credential.
Create a proxy account.
Run a job step as a proxy account.
Demonstration Steps
Create a Credential
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Management Studio, in Object Explorer, under MIA-SQL, expand Security. Then right-click Credentials and click New Credential.
3. In the New Credential dialog box, enter the following details and click OK.
o Credential name: FileAgent
o Identity: MIA-SQL\FileAgent
o Password: Pa$$w0rd
o Confirm password: Pa$$w0rd
MIA-SQL\FileAgent is a local Windows user on the MIA-SQL server with minimal privileges.
Create a Proxy Account
1. In Object Explorer, under SQL Server Agent, expand Proxies.
2. Right-click Operating System (CmdExec) and click New Proxy.
3. In the New Proxy Account dialog box, in the Proxy name box type FileAgentProxy, and in the Credential name box type FileAgent. Then ensure that only the Operating system (CmdExec) subsystem is selected and click OK.
Run a Job Step as a Proxy Account
1. In Object Explorer, under SQL Server Agent, in the Jobs folder, double-click the Check AdventureWorks DB job.
2. In the Job Properties - Check AdventureWorks DB dialog box, on the Steps page, click step 1 (Make Folder) and click Edit.
3. In the Job Step Properties - Make Folder dialog box, in the Run as drop-down list, select FileAgentProxy. Then click OK.
4. In the Job Properties - Check AdventureWorks DB dialog box, click OK.
5. Right-click the Check AdventureWorks DB job, and click Start Job at Step. Then in the Start Job on MIA-SQL dialog box, ensure step 1 is selected and click Start.
6. When the job has completed successfully, click Close.
7. In the D:\Demofiles\Mod12 folder, right-click the AdventureWorks folder that was created by your job, and click Properties.
8. In the AdventureWorks properties dialog box, on the Security tab, click Advanced.
9. Note that the owner of the folder is MIA-SQL\FileAgent. This is the account that was used to create the folder. Then click Cancel to close the Advanced Security Settings for AdventureWorks dialog box, and click Cancel again to close the AdventureWorks Properties dialog box.
10. Keep SQL Server Management Studio open for the next demonstration.
Lesson 5
Managing Jobs on Multiple Servers
You may have jobs that run across multiple servers and would benefit from being automated centrally. SQL Server provides multiserver administration functionality that enables you to distribute jobs across your enterprise.
In this lesson, you will learn about the concepts behind multiserver administration and how to implement jobs across multiple servers.
Lesson Objectives After completing this lesson, you will be able to:
Describe the concepts of multiserver administration.
Explain the considerations for multiserver administration.
Run jobs on target servers.
Automate multiserver administration.
Multiserver Administration Concepts
Multiserver administration involves one master server, which stores the master copy of the jobs and distributes them to one or more target servers. The master server can also receive events from the target servers that update it with the status of the jobs they run. Each target server is assigned to one master server, to which it periodically connects to update its schedule of jobs and download any new ones. For example, you can create a backup job on a central server, distribute it to all servers in your organization, and then monitor the status of all the jobs from the central server.
Considerations for Multiserver Administration
There are a number of considerations that you must take into account before setting up multiserver administration:
Because the master server both distributes jobs to other servers and receives events from them, the master server role can add significant load. Consider creating the master server on a server that does not have a significant workload or network traffic requirement.
Each target server can only link to one master server. If you want to change your master server, you must first defect all target servers from the master server, and then enlist them all to the new master server.
The master server identifies each target server by name. Therefore, if you want to change the name of a target server, you must first defect it from the master server, rename it, and then enlist it back to the master server.
Because they need to communicate across the multiple servers, the SQL Server Agent service and SQL Server service must run using Windows domain accounts.
Configuring Master and Target Servers
You can configure master and target servers by using SSMS or Transact-SQL queries.
Using the Master Server Wizard
In SSMS, you can access the Master Server Wizard from the SQL Server Agent node in Object Explorer. The steps in the wizard enable you to specify an operator for the master server, select target servers from a list of registered servers, and create and configure any required logins.
Using Transact-SQL
You use the sp_msx_enlist stored procedure to enlist target servers with a master server. You run the procedure on each target server, passing the name of the master server. The following example, executed on each target server in turn, enlists that server with a master server named AWMaster.
Using sp_msx_enlist
USE msdb;
GO
EXEC dbo.sp_msx_enlist 'AWMaster';
GO
Running Jobs on Target Servers
After you configure the master and target servers, you can begin to distribute jobs from one to the other by using SSMS or Transact-SQL. On the master server, in SSMS, in the Properties dialog box for a job, you can select whether to target the local server or multiple servers, selecting which ones to run the job on. To distribute jobs to target servers by using Transact-SQL, use the sp_add_jobserver stored procedure as shown in the following example.
Distributing Jobs to Target Servers
USE msdb;
GO
EXEC dbo.sp_add_jobserver
    @job_name = 'HR database backup',
    @server_name = 'AWTarget1';
GO
If you change the definition of a multiserver job after distributing it to the target servers, you need to ensure that SQL Server adds the change to the download list from which the target servers are updated. You can do this by executing the sp_post_msx_operation stored procedure, passing the job_id of the job (shown here as the placeholder <job_id>), as in the following example.
Updating Distributed Jobs
USE msdb;
GO
EXEC dbo.sp_post_msx_operation 'INSERT', 'JOB', '<job_id>';
GO
You only need to execute this code when you add, update, or remove job steps or schedules; sp_update_job and sp_delete_job automatically add entries to the download list. Note: You can locate the job_id property by querying the dbo.sysjobhistory or dbo.sysjobs tables or by viewing the job properties in SSMS.
Demonstration: Configuring Multi-Server Jobs
In this demonstration, you will see how to:
Create master and target servers.
Create a job for target servers.
Run a job on a target server.
Demonstration Steps
Create Master and Target Servers
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Management Studio, in Object Explorer, under MIA-SQL, right-click SQL Server Agent, point to Multi Server Administration, and then click Make this a Master.
3. In the Master Server Wizard - MIA-SQL window, on the Welcome to the Master Server Wizard page, click Next.
4. On the Master Server Operator page, in the E-mail address box, type [email protected], and then click Next.
5. On the Target Servers page, click Add Connection.
6. In the Connect to Server dialog box, in the Server name box, type MIA-SQL\SQL2, and then click Connect.
7. On the Target Servers page, click Next.
8. In the Checking Server Compatibility window, click Close.
9. On the Master Server Login Credentials page, click Next.
10. On the Complete the Wizard page, click Finish. Then, when configuration is complete, click Close.
Create a Job for Target Servers
1. In Object Explorer, expand SQL Server Agent (MSX), and then expand Jobs.
2. Right-click Jobs and then click New Job.
3. In the New Job window, in the Name box, type Backup master database.
4. In the New Job window, on the Steps page, click New.
5. In the Step name box, type Backup master and ensure that the Transact-SQL (T-SQL) type is selected. Then, in the Command box, type the following Transact-SQL, then click Parse.
BACKUP DATABASE master TO DISK = 'master.bak';
Because no folder path is specified, the command will store the backup in the default backup folder for the SQL Server instance.
6. In the New Job Step window, click OK.
7. In the New Job window, on the Targets page, select Target multiple servers. Then select MIA-SQL\SQL2, and then click OK.
Run a Job on a Target Server
1. In Object Explorer, right-click the Backup master database job, and click Start Job at Step. Then, in the Start Jobs - MIA-SQL dialog box, click Close.
2. In Object Explorer, in the Connect drop-down list, click Database Engine. Then connect to the MIA-SQL\SQL2 instance using Windows authentication.
3. In Object Explorer, under MIA-SQL\SQL2, expand SQL Server Agent (TSX: MIA-SQL) and expand Jobs.
4. Right-click the Backup master database job on MIA-SQL\SQL2 and click View History.
5. Review the job history to verify that it has executed successfully, and then click Close.
6. Close SQL Server Management Studio without saving any files.
7. In File Explorer, browse to the C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2\MSSQL\Backup folder (clicking Continue if prompted) and verify that it contains a master.bak file created in the last few minutes.
8. Close File Explorer.
Lab: Automating SQL Server Management
Scenario
You are a database administrator (DBA) at Adventure Works Cycles, with responsibility for databases on the MIA-SQL instance of SQL Server. Routine tasks that must be performed on this instance have previously been performed manually, but you now plan to automate these tasks using SQL Server Agent.
Objectives
After completing this lab, you will be able to:
Create jobs.
Schedule jobs.
Configure job step security context.
Estimated Time: 45 minutes
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
Exercise 1: Creating a Job
Scenario
The HumanResources database must be backed up every day. Additionally, after the backup has been created, the backup file must be copied to a folder, which is automatically replicated to a cloud service for offsite storage of various backup files.
The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. Create a Job
3. Test the Job
4. Generate a Script for the Job
Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Run Setup.cmd in the D:\Labfiles\Lab12\Starter folder as Administrator.
Task 2: Create a Job
1. Create a job named Backup HumanResources in the MIA-SQL instance of SQL Server.
2. Add a Transact-SQL step that runs in the HumanResources database and executes the following command. The output from the command should be saved as a text file in the D:\Labfiles\Lab12\Starter folder.
BACKUP DATABASE HumanResources TO DISK = 'R:\Backups\HumanResources.bak';
3. Add an operating system command step that runs the following command to copy the backup file to the D:\Labfiles\Lab12\Starter folder.
Copy R:\Backups\HumanResources.bak D:\Labfiles\Lab12\Starter\HumanResources.bak /Y
4. Ensure that the job is configured to start with the Transact-SQL backup step.
Task 3: Test the Job
1. Run the Backup HumanResources job you created in the previous task.
2. View the history for the job and verify that it succeeded.
3. View the contents of the D:\Labfiles\Lab12\Starter folder and verify that it contains a text file containing the output from the backup step and a copy of the backup file.
Task 4: Generate a Script for the Job
1. Generate a Transact-SQL script to recreate the Backup HumanResources job, and save it in the D:\Labfiles\Lab12\Starter folder.
Results: After this exercise, you should have created a job named Backup HumanResources.
Exercise 2: Scheduling a Job
Scenario
You have created a job to back up the HumanResources database. Now you must schedule the job to run automatically each day.
The main tasks for this exercise are as follows:
1. Add a Schedule to the Job
2. Verify Scheduled Job Execution
Task 1: Add a Schedule to the Job
1. Enable the clock icon in the notification area of the task bar so that you can easily see the current system time in the MIA-SQL virtual machine.
2. Add a schedule to the Backup HumanResources job so that the job runs every day, one minute from the current system time.
3. Wait for the scheduled time, and then proceed with the next task.
Task 2: Verify Scheduled Job Execution
1. Use the Job Activity Monitor to view the status of the Backup HumanResources job.
2. When the job is idle, verify that the Last Run Outcome for the job is Succeeded, and that the Last Run time is the time that you scheduled previously.
Results: After this exercise, you should have created a schedule for the Backup HumanResources job.
Exercise 3: Configuring Job Step Security Contexts
Scenario
You have created and scheduled a job to back up the HumanResources database. However, you want to use security accounts with minimal permissions to perform the job steps.
The main tasks for this exercise are as follows:
1. Create a Credential
2. Create a Proxy Account
3. Configure Job Step Security Contexts
4. Test the Job
Task 1: Create a Credential
1. Create a credential named FileAgent_Credential for the MIA-SQL\FileAgent Windows account.
2. The password for this account is Pa$$w0rd.
Task 2: Create a Proxy Account
1. Create a proxy account named FileAgent_Proxy for the Operating System (CmdExec) subsystem.
2. The proxy account should use the credential you created in the previous step.
Task 3: Configure Job Step Security Contexts
1. Configure the advanced properties of the first step (which uses a Transact-SQL command to back up the database) in the Backup HumanResources job to run as the [Backup_User] user in the HumanResources database.
2. Configure the second step (which uses an operating system command to copy the backup file) in the Backup HumanResources job to run as the FileAgent_Proxy proxy account.
Task 4: Test the Job
1. Run the Backup HumanResources job and verify that it succeeds.
Results: After this exercise, you should have configured the Back Up Database step of the Backup HumanResources job to run as the Backup_User SQL Server user. You should have also created a credential named FileAgent_Credential and a proxy named FileAgent_Proxy to perform the Copy Backup File step of the Backup HumanResources job.
Question: Assuming you have administrative control over the local Windows user for the proxy account in this scenario, what considerations would apply to its properties?
Module Review and Takeaways
In this module, you learned how to create and manage SQL Server Agent jobs to automate database maintenance tasks.
Best Practice: When planning SQL Server Agent jobs, consider the following best practices.
Use SQL Server Agent jobs to schedule routine tasks.
Create custom categories to group your jobs.
Script your jobs for remote deployment, or in case you need to recreate them.
Use job history to review job and job step outcomes.
Use the Job Activity Monitor to monitor jobs in real time.
Apply the “principle of least privilege” when configuring job step execution identities to avoid creating highly-privileged user accounts.
Review Question(s)
Question: What functions do you currently perform manually that could be placed in a job?
Module 13
Monitoring SQL Server 2014 with Notifications and Alerts
Contents:
Module Overview 13-1
Lesson 1: Monitoring SQL Server Errors 13-2
Lesson 2: Configuring Database Mail 13-6
Lesson 3: Configuring Operators, Notifications, and Alerts 13-11
Lab: Using Notifications and Alerts 13-17
Module Review and Takeaways 13-20
Module Overview
One key aspect of managing Microsoft® SQL Server® in a proactive manner is to make sure you are aware of events that occur in the server, as they happen. SQL Server logs a wealth of information about issues and you can configure it to advise you automatically when these issues occur, by using alerts and notifications. The most common way that SQL Server database administrators receive details of events of interest is by email message. This module covers the configuration of Database Mail, alerts, and notifications.
Objectives After completing this module, you will be able to:
Monitor SQL Server errors.
Configure Database Mail.
Configure operators, alerts and notifications.
Lesson 1
Monitoring SQL Server Errors
It is important to understand the core aspects of errors as they apply to SQL Server. In particular, you need to consider the nature and locations of errors, as well as the data that they return. SQL Server records severe errors in the SQL Server error log, so it is important to know how to configure the log.
Lesson Objectives After completing this lesson, you will be able to:
Define SQL Server errors.
Describe error severity levels.
Configure the SQL Server error log.
What Is in an Error?
It might not be immediately obvious that a SQL Server error (or exception) is itself an object, and therefore has properties that you can access:
Error number. Unique identifying number.
Error message. String describing the cause of the error.
Severity. Int describing the seriousness of the error.
State. Int describing the condition of the error.
Procedure name. String containing the name of the stored procedure or trigger where the error occurred.
Line number. Int containing the line number at which the error occurred.
Error numbers are helpful when trying to locate information about the specific error, particularly when searching for information online. The following example shows how to use the sys.messages catalog view to retrieve a list of system-supplied error messages, showing the properties described above.
Viewing System Error Messages
SELECT *
FROM sys.messages
WHERE language_id = 1033
ORDER BY message_id;
Error messages can be localized and are returned in a number of languages, so the WHERE clause of this example limits the results to view only the English version.
Error Severity Levels
The severity of an error indicates the type of problem encountered by SQL Server. Low severity values are informational messages and do not indicate true errors. Error severities are grouped into ranges.
Values from 0 to 10
Values from 0 to 10 are informational messages raised when SQL Server needs to provide information associated with the running of a query. For example, consider the query SELECT COUNT(Color) FROM Production.Product. This query returns a count, but on the Messages tab in SQL Server Management Studio (SSMS), the message “Warning: Null value is eliminated by an aggregate or other SET operation” is also displayed. No error has occurred, but SQL Server warns that it ignored NULL values when counting the rows.
Values from 11 to 16
Values from 11 to 16 are used for errors that the user can correct. Typically, SQL Server uses them when it asserts that the code being executed contains an error. Errors in this range include:
11. Indicates that an object does not exist.
13. Indicates a transaction deadlock.
14. Indicates errors such as permission denied.
15. Indicates syntax errors.
Values from 17 to 19
Values from 17 to 19 are serious software errors that the user cannot correct. For example, severity 17 indicates that SQL Server has run out of resources (for example, memory or disk space).
Values above 19
Values above 19 tend to be very serious errors that normally involve either the hardware or SQL Server itself. It is common to ensure that all errors above 19 are logged and alerts generated on them.
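You can see which messages SQL Server logs automatically by checking the is_event_logged column of sys.messages. As a sketch, the following query lists the high-severity messages discussed above:
Listing High-Severity Messages
SELECT message_id, severity, is_event_logged, text
FROM sys.messages
WHERE language_id = 1033
    AND severity > 19
ORDER BY severity, message_id;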
Configuring the SQL Server Error Log
Important messages, particularly those considered to be severe error messages, are logged to both the Windows Application Event Log and the SQL Server error log. The sys.messages view shows the available error messages and indicates which ones will be logged by default. SQL Server writes its error logs to the Program Files\Microsoft SQL Server\MSSQL12.<instance_name>\MSSQL\Log folder, where <instance_name> is the name of the SQL Server instance. It names the log files ERRORLOG.n, where n is the log file number. The log files are text files that you can view by using any text editor or the Log Viewer provided by SSMS.
By default, SQL Server retains backups of the previous six logs and gives the most recent log backup the extension .1, the second most recent the extension .2, and so on. The current error log has no extension. You can configure the number of log files to retain by right-clicking the SQL Server Logs node in Object Explorer and clicking Configure. The log file cycles with every restart of the SQL Server instance. Occasionally, you might want to remove excessively large log files. You can use the sp_cycle_errorlog system stored procedure to close the existing log file and open a new one on demand. If there is a regular need to recycle the log file, you could create a SQL Server Agent job to execute the system stored procedure on a schedule. Cycling the log helps you to prevent the current error log from becoming too large.
Demonstration: Viewing the SQL Server Error Log
In this demonstration, you will see how to view the error log and how to cycle the log file.
Demonstration Steps
View the SQL Server Error Log
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod13 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using Windows authentication.
4. In Object Explorer, under the MIA-SQL instance, expand Management and expand SQL Server Logs. Then right-click Current and click View SQL Server Log.
5. Maximize the Log File Viewer - MIA-SQL window and view the log entries. Note that when you select a log entry, its details are shown in the bottom pane.
6. In the Select logs pane, expand SQL Server Agent and select Current. Then scroll the main log entries pane to the right until you can see the Log Type column and scroll down to find an entry with the log type SQL Server Agent.
7. When you have finished viewing the log entries, click Close.
8. Minimize SQL Server Management Studio and view the contents of the C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Log folder. If you are prompted to change your permissions to get access to the folder, click Continue. Note that the current SQL Server log is stored here in the file named ERRORLOG, and the current SQL Server Agent log is stored as SQLAGENT.OUT. The remaining log files contain log entries for other SQL Server components and services.
Cycle the Log File
1. In SQL Server Management Studio, click New Query.
2. In the query window, enter the following Transact-SQL code:
EXEC sys.sp_cycle_errorlog;
3. Click Execute.
4. In Object Explorer, right-click the Current SQL Server log and click View SQL Server Log.
5. Note that the log has been reinitialized, and then click Close.
Lesson 2
Configuring Database Mail
SQL Server needs to be able to advise administrators when issues arise that require their attention, such as the failure of a scheduled job or a significant error. Email is the most commonly-used mechanism for notifications from SQL Server. You can use the Database Mail feature of SQL Server to connect to an existing Simple Mail Transport Protocol (SMTP) server when SQL Server needs to send email. You can configure SQL Server with multiple email profiles and to control which users can utilize the email features of the product. It is important to be able to track and trace emails that have been sent so SQL Server enables you to configure a policy for their retention.
Lesson Objectives After completing this lesson, you will be able to:
Describe Database Mail.
Configure Database Mail profiles.
Configure Database Mail security.
Configure Database Mail retention.
Overview of Database Mail
Database Mail sends email by using an SMTP server. It is designed to be a reliable, scalable, secure, and supportable system. Messages are delivered asynchronously by a process outside of SQL Server to avoid impacting the performance of your database system. However, if the SMTP server is unavailable, SQL Server can queue the messages until the service is available again.
By default, the Database Mail stored procedures are disabled to reduce the surface area of SQL Server and, when they are enabled, only users who are members of the DatabaseMailUserRole database role in the msdb database can execute them. Database Mail logs email activity and also stores copies of all messages and attachments in the msdb database. You can use Database Mail to send emails as part of a SQL Server Agent job, in response to an alert, or on behalf of a user from a stored procedure.
You use the Database Mail Configuration Wizard to enable Database Mail and to configure accounts and profiles. A Database Mail account contains all the information that SQL Server needs to send an email message to the mail server. You must specify what type of authentication to use (Windows, basic, or anonymous), the email address, the email server name, type, and port number, and, if using authentication, the user name and password. SQL Server stores the configuration details in the msdb database, along with all other SQL Server Agent configuration data. SQL Server Agent also caches the profile information in memory, so it is possible to send email if the SQL Server database engine is no longer available.
Database Mail Profiles
A Database Mail profile is a collection of one or more Database Mail accounts. When there is more than one account in a profile, Database Mail tries to send email using the accounts in a predefined order, ensuring that if one email server is unresponsive, another can be used. Profiles can be private or public. Private profiles are strictly controlled and are only available to members of the sysadmin role or those granted permission by members of the sysadmin role. In contrast, any user who is a member of the DatabaseMailUserRole can use a public profile.
Mail Profiles
You can create multiple configurations by using different profiles. For example, you could create one profile to send mail to an internal SMTP server, using an internal email address, for mail sent by SQL Server Agent, and a second profile for a database application to send external email notifications to customers.
Each database user can have access to multiple profiles. If you do not specify a profile when sending an email, Database Mail uses the default profile. If both private and public profiles exist, precedence is given to a private default profile over a public one. If you do not specify a default profile or if a non-default profile should be used, you must specify the profile name you want to use as a parameter when sending mail.
The following example shows how to use the sp_send_dbmail system stored procedure to send an email using a specific profile.
Sending Mail
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'HR Administrator',
    @recipients = '[email protected]',
    @body = 'Daily backup completed successfully.',
    @subject = 'Daily backup status';
Database Mail Security
It is important to consider the service account that the SQL Server service will use when you are configuring Database Mail. If you configure SQL Server to run as the Local Service account, it does not have permission to make outgoing network connections, in which case Database Mail cannot contact an email server located on a different computer.
Database Mail Stored Procedures
To minimize the security surface of SQL Server, the system extended stored procedures for Database Mail are disabled by default. When you run the Database Mail Configuration Wizard, it automatically enables the procedures. If you wish to configure Database Mail manually, you can enable the Database Mail system extended stored procedures by using the sp_configure stored procedure, setting the Database Mail XPs option to the value of 1.
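If you script this step, remember that Database Mail XPs is an advanced option, so show advanced options must be enabled first. A minimal sketch:

Enabling the Database Mail XPs Option
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Database Mail XPs', 1;
RECONFIGURE;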
Security and Attachment Limitations
Not all SQL Server users can send emails. This ability is limited to members of the database role called DatabaseMailUserRole in the msdb database. Members of the sysadmin fixed server role can also send Database Mail. You can also limit the types and size of attachments that users can send in emails sent by Database Mail. You can configure this limitation by using the Database Mail Configuration Wizard or by calling the dbo.sysmail_configure_sp system stored procedure in the msdb database.
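For example, the following calls set a 2 MB attachment size limit and block some executable file types. This is a sketch; the extension list is illustrative, not a recommendation from this course.

Limiting Attachments
-- MaxFileSize is specified in bytes (2 MB shown here)
EXEC msdb.dbo.sysmail_configure_sp 'MaxFileSize', '2097152';
-- Prevent attachments with these file extensions (illustrative list)
EXEC msdb.dbo.sysmail_configure_sp 'ProhibitedExtensions', 'exe,dll,vbs,js';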
Database Mail Logs and Retention
SQL Server logs event messages in internal tables in the msdb database. You can configure how much information it logs by setting the logging level to one of the following:
Normal. Only logs errors.
Extended. Logs errors, warnings, and information messages.
Verbose. Logs extended messages plus success messages and a number of internal messages.
Note: You should only use the verbose level for troubleshooting purposes because it can generate a large volume of log entries which can fill the database and impact performance.
You configure the logging level parameter by using the Configure System Parameters dialog box of the Database Mail Configuration Wizard or by calling the dbo.sysmail_configure_sp stored procedure in the msdb database. You can view the logged messages by querying the dbo.sysmail_event_log table.

Internal tables in the msdb database also hold copies of the email messages and attachments that Database Mail sends, together with the current status of each message. Database Mail updates these tables as it processes each message. You can track the delivery status of an individual message by viewing information in the following:
dbo.sysmail_allitems
dbo.sysmail_sentitems
dbo.sysmail_unsentitems
dbo.sysmail_faileditems
To see details of email attachments, query the dbo.sysmail_mailattachments view.
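As a brief sketch of these system objects, the following statements set the logging level and then check the delivery status of recent messages:

Checking Database Mail Activity
-- LoggingLevel values: 1 = Normal, 2 = Extended, 3 = Verbose
EXEC msdb.dbo.sysmail_configure_sp 'LoggingLevel', '2';

-- Review recent messages and their delivery status
SELECT mailitem_id, subject, sent_status, send_request_date
FROM msdb.dbo.sysmail_allitems
ORDER BY send_request_date DESC;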
Because Database Mail retains the outgoing messages and their attachments, you need to plan a retention policy for this data. If the volume of Database Mail messages and related attachments is high, plan for substantial growth of the msdb database.
You can periodically delete messages to regain space and to comply with your organization's document retention policies. The following example shows how to delete messages, attachments, and log entries that are more than one month old:

Deleting Old Logs and Mail Items
USE msdb;
GO
DECLARE @CutoffDate datetime;
SET @CutoffDate = DATEADD(m, -1, SYSDATETIME());
EXECUTE dbo.sysmail_delete_mailitems_sp @sent_before = @CutoffDate;
EXECUTE dbo.sysmail_delete_log_sp @logged_before = @CutoffDate;
GO
You could schedule these commands to be executed periodically by creating a SQL Server Agent job.
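The following is a minimal sketch of such a job; the job, step, and schedule names are illustrative, and the schedule shown runs the cleanup on the first day of each month.

Scheduling the Cleanup
USE msdb;
GO
EXEC dbo.sp_add_job @job_name = N'Purge Database Mail History';
EXEC dbo.sp_add_jobstep
    @job_name = N'Purge Database Mail History',
    @step_name = N'Delete items older than one month',
    @subsystem = N'TSQL',
    @database_name = N'msdb',
    @command = N'DECLARE @CutoffDate datetime = DATEADD(m, -1, SYSDATETIME());
EXEC dbo.sysmail_delete_mailitems_sp @sent_before = @CutoffDate;
EXEC dbo.sysmail_delete_log_sp @logged_before = @CutoffDate;';
-- freq_type 16 = monthly; freq_interval 1 = day 1 of the month
EXEC dbo.sp_add_jobschedule
    @job_name = N'Purge Database Mail History',
    @name = N'Monthly cleanup',
    @freq_type = 16,
    @freq_interval = 1;
EXEC dbo.sp_add_jobserver @job_name = N'Purge Database Mail History';
GO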
Demonstration: Configuring Database Mail
In this demonstration, you will see how to:
Create a Database Mail profile.
Send a test e-mail.
Query Database Mail system tables.
Demonstration Steps

Create a Database Mail Profile
1. If you did not complete the previous demonstration in the module, start the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines, log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd, and in the D:\Demofiles\Mod13 folder, run Setup.cmd as Administrator.
2. If SQL Server Management Studio is not already open, start it and connect to the MIA-SQL database engine instance using Windows authentication.
3. In Object Explorer, under the MIA-SQL instance, under Management, right-click Database Mail, and click Configure Database Mail.
4. In the Welcome to Database Mail Configuration Wizard page, click Next.
5. In the Select Configuration Task page, select the option to set up Database Mail and click Next.
6. In the New Profile page, in the Profile name textbox type SQL Server Agent Profile, and click Add. Then, in the Add Account to profile 'SQL Server Agent Profile' dialog box, click New Account.
7. In the New Database Mail Account dialog box, enter the following details and click OK:
   o Account name: AdventureWorks Administrator
   o E-mail address: [email protected]
   o Display name: Administrator (AdventureWorks)
   o Reply e-mail: [email protected]
   o Server name: mia-sql.adventureworks.msft
8. In the New Profile page, click Next.
9. In the Manage Profile Security page, select Public for the SQL Server Agent Profile profile, and set its Default Profile setting to Yes. Then click Next.
10. In the Configure System Parameters page, click Next. Then, in the Complete the Wizard page, click Finish and when configuration is complete, click Close.

Send a Test E-Mail
1. In Object Explorer, right-click Database Mail and click Send Test E-Mail.
2. In the Send Test E-Mail from MIA-SQL dialog box, ensure that the SQL Server Agent Profile database mail profile is selected, and in the To textbox, enter [email protected]. Then click Send Test E-Mail.
3. View the contents of the C:\inetpub\mailroot\Drop folder, and verify that an email message has been created there.
4. Double-click the message to view it in Outlook. When you have read the message, close it and minimize the Drop folder window.
5. In the Database Mail Test E-Mail dialog box (which may be behind SQL Server Management Studio), click OK.
Query Database Mail System Tables
1. In SQL Server Management Studio, click New Query.
2. Enter the following Transact-SQL code and click Execute.
   SELECT * FROM msdb.dbo.sysmail_event_log;
   SELECT * FROM msdb.dbo.sysmail_mailitems;
3. View the results. The first result shows system events for Database Mail, and the second shows records of e-mail messages that have been sent.
4. Keep SQL Server Management Studio open for the next demonstration.
Lesson 3
Configuring Operators, Notifications, and Alerts
Many SQL Server systems have multiple administrators. SQL Server Agent enables you to configure operators that are associated with one or more administrators and to determine when to contact each of the operators—along with the method to use for that contact.
SQL Server can also detect many situations that might be of interest to administrators. You can configure alerts that are based on SQL Server errors or on system events such as low disk space availability, and then configure SQL Server to notify you of these situations.
Lesson Objectives
After completing this lesson, you will be able to:
Describe the role of operators in SQL Server Agent.
Describe SQL Server alerts.
Create alerts.
Configure alert actions.
Troubleshoot alerts and notifications.
Operators and Notifications
An operator in SQL Server Agent is an alias for a person or a group of people who can receive electronic notifications when jobs complete or when alerts are raised.

Note: Operators do not need to be Windows logins, SQL Server logins, or database users. For example, you could create an operator that is a reference to a pager address.
Configuring Operators
You can define new operators using either SSMS or the dbo.sp_add_operator system stored procedure. After you define the operator, you can view the definition by querying the dbo.sysoperators system table in the msdb database. You can configure three types of contact methods for each operator:
Email. An SMTP email address where notifications are sent. Where possible, it is desirable to use group email addresses rather than individual ones. You can also list multiple email addresses by separating them with a semicolon.
Pager email. An SMTP email address where a message can be sent during specified times (and days) during a week.
Net send address. Messenger address where a network message is sent.
Note: Pager and Net send notifications are deprecated, and should not be used for new development as they will be removed in a future version of SQL Server.
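As a minimal Transact-SQL sketch, the following creates an operator with an e-mail contact method; the operator name and address are hypothetical.

Using sp_add_operator
EXEC msdb.dbo.sp_add_operator
    @name = N'DBA Team',
    @enabled = 1,
    @email_address = N'dba.team@example.com'; -- hypothetical address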
Fail-Safe Operator
You can also define a fail-safe operator that is notified in the following circumstances:
The SQL Server Agent cannot access the tables that contain settings for operators and notifications in the msdb database.
A pager notification must be sent at a time when no operators configured to receive pager alerts are on duty.
Job Notifications
You can configure SQL Server Agent jobs to send messages to an operator on completion, failure, or success. Configuring jobs to send notifications on completion or success might lead to a large volume of e-mail notifications, so many DBAs prefer to be notified only if a job fails. However, for business-critical jobs, you might want to be notified regardless of the outcome to remove any doubt over the notification system itself.
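Job notifications can also be configured in Transact-SQL. The following sketch configures the job used in this module's demonstrations to e-mail the Student operator if it fails:

Configuring a Job Notification
EXEC msdb.dbo.sp_update_job
    @job_name = N'Back Up Database - AdventureWorks',
    @notify_level_email = 2, -- 1 = on success, 2 = on failure, 3 = on completion
    @notify_email_operator_name = N'Student';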
Demonstration: Configuring SQL Server Agent Operators
In this demonstration, you will see how to:
Enable a SQL Server Agent mail profile.
Create an operator.
Configure a job to notify an operator.
Demonstration Steps

Enable a SQL Server Agent Mail Profile
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Management Studio, in Object Explorer, right-click SQL Server Agent and click Properties.
3. In the SQL Server Agent Properties dialog box, on the Alert System page, select Enable mail profile and in the Mail profile drop-down list, select SQL Server Agent Profile. Then click OK.
4. In Object Explorer, right-click SQL Server Agent and click Restart. When prompted to confirm, click Yes.

Create an Operator
1. In Object Explorer, under SQL Server Agent, right-click Operators and click New Operator.
2. In the New Operator dialog box, in the Name box type Student, in the E-mail name box type [email protected], and click OK.
Configure a Job to Notify an Operator
1. In Object Explorer, under SQL Server Agent, expand Jobs and view the existing jobs.
2. Right-click the Back Up Database - AdventureWorks job and click Properties.
3. In the Job Properties - Back Up Database - AdventureWorks dialog box, on the Notifications tab, select E-mail, select Student, and select When the job completes. Then click OK.
4. In Object Explorer, expand the Operators folder, right-click Student and click Properties. On the Notifications page, select Jobs, and note the job notifications that have been defined for this operator. Then click Cancel.
5. Right-click the Back Up Database - AdventureWorks job and click Start Job at Step. Then, when the job has completed, click Close.
6. Under the Operators folder, right-click Student and click Properties. On the History page, note the most recent notification by e-mail attempt. Then click Cancel.
7. View the contents of the C:\inetpub\mailroot\Drop folder, and verify that a new email message has been created.
8. Double-click the most recent message to view it in Outlook. Then, when you have read the message, close it and minimize the Drop window.
9. Keep SQL Server Management Studio open for the next demonstration.
Overview of SQL Server Alerts
There are many events other than scheduled jobs occurring in a SQL Server system that are of interest to administrators. An alert is a SQL Server object defining a condition that requires attention, and a response that should be taken when the event occurs. You can define alerts to execute a job or to notify an operator when a particular event occurs or even when a performance threshold is exceeded.
SQL Server generates events and enters them into the Windows Application Event Log. On startup, SQL Server Agent registers itself as a callback service with the Application Log. This means that the Application Log will automatically notify SQL Server Agent when events of interest occur. This callback mechanism operates efficiently because SQL Server Agent does not need to continuously read (or poll) the Application Log to find events of interest. When the Application Log notifies SQL Server Agent of a logged event, SQL Server Agent compares the event to the alerts that you have defined. When SQL Server Agent finds a match, it fires the alert, which is an automated response to an event.

Note: You must configure SQL Server Agent to write messages to the Windows Application Event Log if they are to be used for SQL Server Agent alerts.
Alert Actions
You can create alerts to respond to individual error numbers or to all errors of a specific severity level. You can define the alert for all databases or for a specific database. You can also define the time delay between responses.

Note: It is considered good practice to configure notifications for all error messages with severity level 19 and above.
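As a sketch of this practice, the following loop creates one alert for each severity level from 19 to 25; the alert names and the five-minute delay between responses are illustrative choices.

Creating Alerts for Severity Levels 19 to 25
DECLARE @sev int = 19;
WHILE @sev <= 25
BEGIN
    DECLARE @alert_name sysname = N'Severity ' + CAST(@sev AS nvarchar(2)) + N' error';
    EXEC msdb.dbo.sp_add_alert
        @name = @alert_name,
        @severity = @sev,
        @delay_between_responses = 300; -- seconds
    SET @sev += 1;
END;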
System Events
In addition to monitoring SQL Server events, SQL Server Agent can also check conditions that are detected by Windows Management Instrumentation (WMI) events. The WMI Query Language (WQL) queries that retrieve the event data execute several times each minute, so it can take a few seconds for these alerts to fire. You can also configure performance condition alerts on any of the performance counters that SQL Server exposes.
Creating Alerts
You can create alerts by using SSMS or by calling the dbo.sp_add_alert system stored procedure. When defining an alert, you can also specify a SQL Server Agent job to start when the alert occurs. The following example shows how to create an alert that will respond to an error message with an ID of 9002:

Using sp_add_alert
EXEC msdb.dbo.sp_add_alert
    @name = N'AdventureWorks Transaction Log Full',
    @message_id = 9002,
    @delay_between_responses = 0,
    @database_name = N'AdventureWorks';
GO
Logged Events
You have seen that alerts will only fire for SQL Server errors if the error messages are written to the Windows Application Event Log. In general, errors with severity levels from 19 to 25 are automatically written to the Application Log, but this is not always the case. To check which messages are automatically written to the log, you can query the is_event_logged column in the sys.messages catalog view. Most events with severity levels less than 19 will only trigger alerts if you perform one of the following steps (see the sketch after this list):
Modify the error message by using the dbo.sp_altermessage system stored procedure to make it a logged message.
Raise the error in code by using the RAISERROR WITH LOG option.
Use the xp_logevent system extended stored procedure to force entries to be written to the log.
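The following statements sketch these options; message number 50001 is a hypothetical user-defined message and is assumed to have been created already with sp_addmessage.

Making Messages Logged
-- Check whether a message is written to the log automatically
SELECT message_id, severity, is_event_logged
FROM sys.messages
WHERE message_id = 9002 AND language_id = 1033;

-- Make a user-defined message (hypothetical ID 50001) a logged message
EXEC dbo.sp_altermessage 50001, 'WITH_LOG', 'true';

-- Force a single error to be logged when it is raised
RAISERROR (N'Nightly load failed to start.', 16, 1) WITH LOG;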
Configuring Alert Actions
You can configure two types of action to respond to an alert:
Execute a job
You can configure a SQL Server Agent job to execute in response to an alert. If you need to start multiple jobs, you must create a new job that starts each of the other jobs in turn, and then configure the alert response to run the new job.
Notify operators
You can define a list of operators to notify in response to an alert by running the dbo.sp_add_notification system stored procedure. When sending messages to operators about alerts, it is important to provide the operator with sufficient context so that they can determine the appropriate action to take. You can include tokens in the message to add detail. There are special tokens available for working with alerts, including:
   o A-DBN. Database name.
   o A-SVR. Server name.
   o A-ERR. Error number.
   o A-SEV. Error severity.
   o A-MSG. Error message.
By default, the inclusion of tokens is disabled for security reasons, but you can enable it in the properties of SQL Server Agent.
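For example, the following call adds an e-mail notification for the alert created earlier in this lesson, sending it to the Student operator used in the demonstrations:

Using sp_add_notification
EXEC msdb.dbo.sp_add_notification
    @alert_name = N'AdventureWorks Transaction Log Full',
    @operator_name = N'Student',
    @notification_method = 1; -- 1 = e-mail, 2 = pager, 4 = net send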
Troubleshooting Alerts and Notifications
When troubleshooting alerts and notifications, use the following process to identify the issue:

1. Ensure that SQL Server Agent is running.
   The Application Log will only send messages to SQL Server Agent when the agent is running. The Application Log does not hold a queue of notifications to deliver at a later time.
2. Check that the error message is written to the Application Log.
   For SQL Server event alerts, check that the error message is written to the Application Log. You should also make sure that the Application Log is configured with sufficient size to hold all the event log details.
3. Ensure that the alert is enabled.
   If the alert is disabled, it cannot fire.
4. Check that the alert was raised.
   If the alert does not appear to have been raised, make sure that the setting for the delay between responses is not set to too high a value.
5. Check whether the alert was raised but no action was taken.
   Check that the job configured to respond to the alert functions as expected. For operator notifications, check that Database Mail is working and that the SMTP server configuration is correct. Test the Database Mail profile that sends the notifications by manually sending mail from the profile used by SQL Server Agent.
Demonstration: Configuring SQL Server Agent Alerts
In this demonstration, you will see how to:
Create an alert.
Test an alert.
Demonstration Steps

Create an Alert
1. In SQL Server Management Studio, in Object Explorer, under SQL Server Agent, right-click Alerts and click New Alert.
2. In the New Alert dialog box, on the General page, enter the name Log Full Alert. In the Type dropdown list, note that you can configure alerts on WMI events, performance monitor conditions, and SQL Server events. Then select SQL Server event alert, select Error number, and enter the number 9002 (which is the error number raised by SQL Server when a database transaction log becomes full).
3. In the New Alert dialog box, on the Response page, select Notify operators and select the E-mail checkbox for the Student operator.
4. In the New Alert dialog box, on the Options page, under Include alert error text in, select E-mail. Then click OK.

Test an Alert
1. In SQL Server Management Studio, open the TestAlert.sql script file in the D:\Demofiles\Mod13 folder.
2. Click Execute and wait while the script fills a table in the TestAlertDB database. When the log file for that database is full, error 9002 occurs.
3. In Object Explorer, under the Alerts folder, right-click Log Full Alert and click Properties. Then on the History page, note the Date of last alert and Date of last response values and click Cancel.
4. View the contents of the C:\inetpub\mailroot\Drop folder, and verify that a new email message has been created.
5. Double-click the most recent message to view it in Outlook. Then, when you have read the message, close it and minimize the Drop window.
6. Close SQL Server Management Studio without saving any files.
Lab: Using Notifications and Alerts

Scenario
You are a database administrator (DBA) at Adventure Works Cycles with responsibility for the AWDataWarehouse, HumanResources, and InternetSales databases. You have configured jobs to automate backups of these databases, and now you want to implement notifications and alerts to help you manage the databases proactively.
Objectives
After completing this lab, you will be able to:
Configure Database Mail.
Configure operators and notifications.
Configure alerts.
Estimated Time: 45 minutes
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
Exercise 1: Configuring Database Mail

Scenario
All database administrators at Adventure Works use e-mail as a primary means of communication. You therefore plan to use Database Mail to enable e-mail notifications from SQL Server.

The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. Configure Database Mail
3. Test Database Mail
Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Run Setup.cmd in the D:\Labfiles\Lab13\Starter folder as Administrator.
Task 2: Configure Database Mail
1. Configure Database Mail in the MIA-SQL instance of SQL Server to add a new profile named SQL Server Agent Profile with the following account:
   o Account name: AdventureWorks Administrator
   o E-mail address: [email protected]
   o Display name: Administrator (AdventureWorks)
   o Reply e-mail: [email protected]
   o Server name: mia-sql.adventureworks.msft
   The new profile should be public, and it should be the default Database Mail profile.
Task 3: Test Database Mail
1. Send a test e-mail message from the Database Mail service to [email protected].
2. Verify that the test e-mail message is successfully delivered to the C:\inetpub\mailroot\Drop folder.
3. Query the dbo.sysmail_event_log and dbo.sysmail_mailitems tables in the msdb database to view Database Mail events and e-mail history.
Results: After this exercise, you should have configured Database Mail with a new profile named SQL Server Agent Profile.
Exercise 2: Implementing Operators and Notifications

Scenario
Now that you have configured Database Mail, you must create operators to receive notifications. You want to receive all notifications that concern the databases for which you are responsible at your [email protected] e-mail address, but you also need to configure a fail-safe operator so that notifications are sent to the DBA team alias ([email protected]) if there is a problem with the notification system.

You need to be notified if the jobs that back up the AWDataWarehouse and HumanResources databases fail. The InternetSales database is critical to the business, so for peace of mind you want to be notified when the jobs that back up this database and its transaction log complete, regardless of the outcome.

The main tasks for this exercise are as follows:
1. Create Operators
2. Configure the SQL Server Agent Mail Profile
3. Configure Job Notifications
4. Test Job Notifications
Task 1: Create Operators
1. Create a new operator named Student with the e-mail address [email protected].
2. Create a second operator named DBA Team with the e-mail address [email protected].
Task 2: Configure the SQL Server Agent Mail Profile
1. Configure the SQL Server Agent on MIA-SQL to use the SQL Server Agent Profile Database Mail profile.
2. Set the fail-safe operator to the DBA Team operator.
3. Restart the SQL Server Agent service.
Task 3: Configure Job Notifications
1. Configure the Back Up Database - AWDataWarehouse and Back Up Database - HumanResources jobs to notify the Student operator on failure.
2. Configure the Back Up Database - InternetSales and Back Up Log - InternetSales jobs to notify the Student operator on completion.
3. Verify the job notifications assigned to the Student operator by viewing its notification properties.
Task 4: Test Job Notifications
1. Run the Back Up Database - AWDataWarehouse job (which should fail), and the Back Up Database - HumanResources and Back Up Database - InternetSales jobs (which should succeed).
2. View the history properties of the Student operator to verify the most recent notification that was sent.
3. Verify that notification e-mail messages for the failure of the Back Up Database - AWDataWarehouse job and the completion of the Back Up Database - InternetSales job were successfully delivered to the C:\inetpub\mailroot\Drop folder.
Results: After this exercise, you should have created operators named Student and DBA Team, configured the SQL Server Agent service to use the SQL Server Agent Profile Database Mail profile, and configured the Back Up Database - AWDataWarehouse, Back Up Database - HumanResources, Back Up Database - InternetSales, and Back Up Log - InternetSales jobs to send notifications.
Exercise 3: Implementing Alerts

Scenario
The InternetSales database is business critical, and you want to ensure that it remains operational should its transaction log become full. You therefore want to configure an alert that will notify you if the log becomes full, and automatically run the job to back up the transaction log, which will truncate the log and keep the database online.

The main tasks for this exercise are as follows:
1. Create an Alert
2. Test the Alert
Task 1: Create an Alert
1. Create an alert named InternetSales Log Full Alert.
2. Configure the alert to run the Back Up Log - InternetSales job and send an e-mail that includes the error message to the Student operator if error number 9002 occurs in the InternetSales database.
Task 2: Test the Alert
1. Use the TestAlert.sql script in the D:\Labfiles\Lab13\Starter folder to fill the log in the InternetSales database.
2. View the history properties of the InternetSales Log Full Alert alert to verify the most recent alert and response.
3. Verify that notification e-mail messages for the full transaction log error and the completion of the Back Up Log - InternetSales job were successfully delivered to the C:\inetpub\mailroot\Drop folder.
Results: After this exercise, you should have created an alert named InternetSales Log Full Alert.

Question: Under what circumstances would e-mail notifications have been sent to the DBA Team operator you created?
Module Review and Takeaways
In this module, you learned how SQL Server logs errors and how you can use Database Mail and the notification system supported by the SQL Server Agent to manage database servers and databases proactively.

Best Practice: When planning notifications and alerts in SQL Server, consider the following best practices:
Use Database Mail and not SQL Mail (which is deprecated).
Configure different profiles for different usage scenarios.
Provide limited access to the ability to send e-mail messages from the database engine.
Implement a retention policy for Database Mail log and mail auditing.
Create a fail-safe operator.
Define Alerts for severe error messages.
Review Question(s)
Question: You want to designate a colleague in the IT team as an operator, but this colleague does not have a login in the SQL Server instance. Should you create one?

Question: You are planning to send notifications from SQL Server, and think it might be easier to use NET SEND notifications instead of e-mail. Why should you not do this?
Course Evaluation
Your evaluation of this course will help Microsoft understand the quality of your learning experience. Please work with your training provider to access the course evaluation form.
Microsoft will keep your answers to this survey private and confidential and will use your responses to improve your future learning experience. Your open and honest feedback is valuable and appreciated.
Module 1: Introduction to SQL Server 2014 Database Administration
Lab: Using SQL Server Administrative Tools

Exercise 1: Using SQL Server Management Studio

Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Labfiles\Lab01\Starter folder, right-click Setup.cmd and then click Run as administrator.
3. Click Yes when prompted to confirm that you want to run the command file, and then wait for the script to finish.
Task 2: Use Object Explorer in SQL Server Management Studio
1. On the task bar, start SQL Server Management Studio.
2. When prompted, connect to the MIA-SQL database engine using Windows authentication.
3. If Object Explorer is not visible, on the View menu, click Object Explorer.
4. In Object Explorer, under MIA-SQL, expand Databases and note the databases that are hosted on this database engine instance.
5. Right-click MIA-SQL, point to Reports, point to Standard Reports, and click Server Dashboard. Then view the server dashboard report for this instance.
Task 3: Create a Database
1. Under MIA-SQL, right-click the Databases folder, and click New Database.
2. In the New Database dialog box, enter the database name AWDatabase. Then click OK.
3. View the databases listed under the Databases folder and verify that the new database has been created.
Task 4: Run a Transact-SQL Query
1. In SQL Server Management Studio, on the toolbar, click New Query.
2. Enter the following Transact-SQL code:
   EXEC sp_helpdb AWDatabase;
3. Click Execute and view the results, which include information about the AWDatabase you created in the previous task.
4. Save the script file as GetDBInfo.sql in the D:\Labfiles\Lab01\Starter folder.
Task 5: Create a Project
1. In SQL Server Management Studio, on the File menu, point to New and click Project.
2. In the New Project dialog box, select SQL Server Scripts. Then save the project as AWProject in the D:\Labfiles\Lab01\Starter folder.
3. If Solution Explorer is not visible, on the View menu, click Solution Explorer.
4. In Solution Explorer, right-click Connections and click New Connection. Then connect to the MIA-SQL database engine using Windows authentication.
5. In Solution Explorer, right-click Queries and click New Query. Then, when the query is created, right-click SQLQuery1.sql, click Rename, and rename it to BackupDB.sql.
6. In Object Explorer, right-click the AWDatabase database you created previously, point to Tasks, and click Back Up.
7. In the Back Up Database – AWDatabase dialog box, in the Script drop-down list, select Script Action to Clipboard. Then click Cancel.
8. Paste the contents of the clipboard into the empty BackupDB.sql script.
9. Edit the BackupDB.sql script to change the backup location to D:\Labfiles\Lab01\Starter\AWDatabase.bak.
10. On the File menu, click Save All. Then on the File menu, click Close Solution.
11. Minimize SQL Server Management Studio.
Results: At the end of this exercise, you will have created a SQL Server Management Studio project containing script files.
Exercise 2: Using the sqlcmd Utility

Task 1: Use sqlcmd Interactively
1. Right-click the Start button and click Command Prompt.
2. In the command prompt window, enter the following command to view details of all sqlcmd parameters:
   sqlcmd -?
3. Enter the following command to start sqlcmd and connect to MIA-SQL using Windows authentication:
   sqlcmd -S MIA-SQL -E
4. In the sqlcmd command line, enter the following commands to view the databases on MIA-SQL. Verify that these include the AWDatabase database you created in the previous exercise.
   SELECT name FROM sys.sysdatabases;
   GO
5. Enter the following command to exit sqlcmd:
   Exit
Task 2: Use sqlcmd to Run a Script
1. In the command prompt window, enter the following command to use sqlcmd to run the GetDBInfo.sql script you created earlier in MIA-SQL:
   sqlcmd -S MIA-SQL -E -i D:\Labfiles\Lab01\Starter\GetDBInfo.sql
2. Note that the query results are returned, but they are difficult to read in the command prompt screen.
3. Enter the following command to store the query output in a text file:
   sqlcmd -S MIA-SQL -E -i D:\Labfiles\Lab01\Starter\GetDBinfo.sql -o D:\Labfiles\Lab01\Starter\DBinfo.txt
4. Enter the following command to view the text file that was created by sqlcmd:
   Notepad D:\Labfiles\Lab01\Starter\DBinfo.txt
5. View the results in the text file, and then close Notepad.
6. Close the command prompt window.
Results: At the end of this exercise, you will have used sqlcmd to manage a database.
Exercise 3: Using Windows PowerShell with SQL Server

Task 1: Use Windows PowerShell
1. On the taskbar, click the Windows PowerShell icon.
2. At the Windows PowerShell prompt, enter the following command:
   Get-Process
3. Review the list of processes. In the ProcessName column, note the SQL services.
4. Enter the following command to list only the processes with names beginning "SQL":
   Get-Process SQL*
5. To find a way to sort the list, enter the following command:
   Get-Help Sort
6. Review the help information, then enter the following command:
   Get-Process SQL* | Sort-Object Handles
7. Verify that the list is now sorted by number of handles.
8. Close Windows PowerShell.
Task 2: Using PowerShell in SQL Server Management Studio
1. In SQL Server Management Studio, in Object Explorer, right-click MIA-SQL, and then click Start PowerShell.
2. At the PowerShell prompt, enter the following command:
   Get-Module
3. Verify that SQLPS and SQLASCMDLETS are listed.
4. At the Windows PowerShell prompt, enter the following command:
   Set-Location SQLServer:\SQL\MIA-SQL
5. At the Windows PowerShell prompt, enter the following command to display the SQL Server database engine instances on the server:
   Get-ChildItem
6. At the Windows PowerShell prompt, enter the following command:
   Set-Location SQLServer:\SQL\MIA-SQL\DEFAULT\Databases
7. At the Windows PowerShell prompt, enter the following command to display the databases on the default instance:
   Get-ChildItem
8. At the Windows PowerShell prompt, enter the following command:
   Invoke-Sqlcmd "SELECT @@version"
9. Review the version information.
10. Close the SQL Server PowerShell window and close SQL Server Management Studio without saving any files.
Task 3: Create a PowerShell Script
1. On the task bar, right-click the Windows PowerShell icon and click Windows PowerShell ISE.
2. In the PowerShell command prompt, enter the following command:
   Get-Module
3. Verify that the SQLPS module is not loaded. Then enter the following command to load it:
   Import-Module SQLPS -DisableNameChecking
4. Enter the following command to verify that the SQLPS module is now loaded:
   Get-Module
5. If the Commands pane is not visible, on the View menu, click Show Command Add-on. Then in the Commands pane, in the Modules list, select SQLPS.
6. View the cmdlets in the module, noting that they include cmdlets to perform tasks such as backing up databases and starting SQL Server instances.
7. If the Script pane is not visible, click the Script drop-down arrow.
8. In the Script pane, type the following commands. (Hint: Use the IntelliSense feature.)
   Import-Module SQLPS -DisableNameChecking
   Set-Location SQLServer:\SQL\MIA-SQL\Default\Databases
   Get-ChildItem | Select Name, Size, SpaceAvailable, IndexSpaceUsage | Out-GridView
9. Click Run Script. Then view the results in the window that is opened. (The script may take a few minutes to run.)
10. Close the window, and modify the script as shown in the following example:
   Import-Module SQLPS -DisableNameChecking
   Set-Location SQLServer:\SQL\MIA-SQL\Default\Databases
   Get-ChildItem | Select Name, Size, SpaceAvailable, IndexSpaceUsage | Out-File 'D:\Labfiles\Lab01\Starter\Databases.txt'
11. Save the script as GetDatabases.ps1 in the D:\Labfiles\Lab01\Starter folder. Then close the PowerShell ISE.
12. In the D:\Labfiles\Lab01\Starter folder, right-click GetDatabases.ps1 and click Run with PowerShell.
13. When the script has completed, open Databases.txt in Notepad to view the results.
14. Close Notepad.
Results: At the end of this task, you will have a PowerShell script that retrieves information about databases from SQL Server.
Module 2: Installing and Configuring SQL Server 2014
Lab: Installing SQL Server 2014

Exercise 1: Preparing to Install SQL Server

Task 1: Prepare the Lab Environment
1. Ensure that the MSL-TMG1, 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Labfiles\Lab02\Starter folder, right-click Setup.cmd and then click Run as administrator.
3. Click Yes when prompted to confirm that you want to run the command file, and then wait for the script to finish.
Task 2: View Hardware and Software Requirements
1. In the C:\SQLServer2014-x64-ENU folder, run Setup.exe. In the User Access Control message box, click Yes.
2. In the SQL Server Installation Center, on the Planning page, click Hardware and Software Requirements.
3. In Internet Explorer, note that the documentation provides detailed information about hardware and software requirements for SQL Server 2014. Then close Internet Explorer.
Task 3: Run the System Configuration Checker
1. In the SQL Server Installation Center, on the Tools tab, click System Configuration Checker, and wait for the tool to start.
2. When the tool has run, click Show Details to view the checks that were performed.
3. Click OK to close SQL Server 2014 Setup.
4. Keep the SQL Server Installation Center window open. You will use it again in a later exercise.
Results: After this exercise, you should have run the SQL Server setup program and used the tools in the SQL Server Installation Center to assess the computer’s readiness for SQL Server installation.
Exercise 2: Installing SQL Server

Task 1: Review the Installation Requirements
1. Review the requirements in the exercise scenario.
2. Verify that the M:\SQLTEST\Data and L:\SQLTEST\Logs folders exist (if not, create them).

Task 2: Install the SQL Server Instance
1. In the SQL Server Installation Center window, on the Installation tab, click New SQL Server stand-alone installation or add features to an existing installation, and wait for SQL Server setup to start.
2. If the Microsoft Updates and Product Updates pages are displayed, clear any checkboxes and click Next.
3. On the Install Rules page, click Show details and note the list of rules that has been checked. If a warning about Windows Firewall is displayed, you can ignore it.
4. On the Install Rules page, click Next.
5. On the Installation Type page, ensure that Perform a new installation of SQL Server 2014 is selected and then click Next.
6. On the Product Key page, select Evaluation and click Next.
7. On the License Terms page, review the Microsoft Software License Terms, select I accept the license terms, and then click Next.
8. On the Setup Role page, ensure that SQL Server Feature Installation is selected, and then click Next.
9. On the Feature Selection page, under Instance Features, select Database Engine Services, and then click Next.
10. On the Instance Configuration page, ensure that Named instance is selected, type SQLTEST in the Named instance box, and then click Next.
11. On the Server Configuration page, on the SQL Server Agent and SQL Server Database Engine rows, enter the following values:
   o Account Name: ADVENTUREWORKS\ServiceAcct
   o Password: Pa$$w0rd
   o Startup Type: Manual
12. On the Collation tab, ensure that SQL_Latin1_General_CP1_CI_AS is selected and click Next.
13. On the Database Engine Configuration page, on the Server Configuration tab, in the Authentication Mode section, select Mixed Mode (SQL Server authentication and Windows authentication). Then enter and confirm the password Pa$$w0rd.
14. Click Add Current User to add the user ADVENTUREWORKS\Student (Student) to the list of administrators.
15. On the Data Directories tab, change the User database directory to M:\SQLTEST\Data.
16. Change the User database log directory to L:\SQLTEST\Logs.
17. On the FILESTREAM tab, ensure that Enable FILESTREAM for Transact-SQL access is not selected, and then click Next.
18. On the Ready to Install page, review the summary, and then click Install and wait for the installation to complete.
19. On the Complete page, click Close.
20. Close the SQL Server Installation Center window.
Results: After this exercise, you should have installed an instance of SQL Server.
Exercise 3: Performing Post-Installation Configuration

Task 1: Start the SQL Server Service
1. On the Start screen, click SQL Server 2014 Configuration Manager. In the User Account Control dialog box, click Yes.
2. In the left-hand pane of the SQL Server Configuration Manager window, click SQL Server Services.
3. In the right-hand pane, double-click SQL Server (SQLTEST).
4. In the SQL Server (SQLTEST) Properties dialog box, verify that the service is configured to log on as ADVENTUREWORKS\ServiceAcct and click Start. Then, when the service has started, click OK.
Task 2: Configure Network Protocols and Aliases
1. In SQL Server Configuration Manager, expand SQL Server Network Configuration, click Protocols for SQLTEST, and verify that the TCP/IP protocol is enabled for this instance of SQL Server.
2. In SQL Server Configuration Manager, expand SQL Native Client 11.0 Configuration (32bit), click Client Protocols, and verify that the TCP/IP protocol is enabled for 32-bit client applications.
3. Click Aliases, and note that there are currently no aliases defined for 32-bit clients. Then right-click Aliases and click New Alias.
4. In the Alias – New window, in the Alias Name text box, type Test.
5. In the Protocol drop-down list box, ensure that TCP/IP is selected.
6. In the Server text box, type MIA-SQL\SQLTEST and click OK.
7. In SQL Server Configuration Manager, expand SQL Native Client 11.0 Configuration, click Client Protocols, and verify that the TCP/IP protocol is enabled for 64-bit client applications.
8. Click Aliases, and note that there are currently no aliases defined for 64-bit clients. Then right-click Aliases and click New Alias.
9. In the Alias – New window, in the Alias Name text box, type Test.
10. In the Protocol drop-down list box, ensure that TCP/IP is selected.
11. In the Server text box, type MIA-SQL\SQLTEST and click OK.
12. Close SQL Server Configuration Manager.
Task 3: Verify Connectivity to SQL Server
1. Right-click the Start button and click Command Prompt.
2. At the command prompt, enter the following command to connect to the MIA-SQL\SQLTEST instance of SQL Server:
   sqlcmd -S MIA-SQL\SQLTEST -E
3. At the sqlcmd prompt, enter the following command to display the SQL Server instance name:
   SELECT @@ServerName;
   GO
4. Close the command prompt window.
5. Start SQL Server Management Studio, and when prompted, connect to the database engine named Test using Windows authentication.
6. In Object Explorer, right-click Test and click Properties. Then verify that the value of the Name property is MIA-SQL\SQLTEST and click Cancel.
7. In Object Explorer, right-click Test and click Stop. In the User Account Control message box, click Yes. Then, when prompted to confirm that you want to stop the MSSQL$SQLTEST service, click Yes.
8. When the service has stopped, close SQL Server Management Studio.
Results: After this exercise, you should have started the SQL Server service and connected using SSMS.
Module 3: Working with Databases and Storage
Lab: Managing Database Storage

Exercise 1: Configuring tempdb Storage

Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Labfiles\Lab03\Starter folder, right-click Setup.cmd, and then click Run as administrator.
3. Click Yes when prompted to confirm you want to run the command file, and wait for the script to finish.
Task 2: Configure tempdb Files
1. Start SQL Server Management Studio and connect to the MIA-SQL database engine by using Windows authentication.
2. In Object Explorer, expand Databases, expand System Databases, right-click tempdb, and click Properties.
3. On the Files page, view the current file settings. Then click Cancel.
4. On the toolbar, click New Query.
5. Enter the following statements and click Execute (alternatively, you can open the Configure TempDB.sql script file in the D:\Labfiles\Lab03\Solution folder):
   USE master;
   GO
   ALTER DATABASE tempdb MODIFY FILE
   (NAME = tempdev, SIZE = 10MB, FILEGROWTH = 5MB, FILENAME = 'T:\tempdb.mdf');
   ALTER DATABASE tempdb MODIFY FILE
   (NAME = templog, SIZE = 5MB, FILEGROWTH = 1MB, FILENAME = 'T:\templog.ldf');
   GO
6. In Object Explorer, right-click MIA-SQL and click Restart. When prompted to allow changes, to restart the service, and to stop the dependent SQL Server Agent service, click Yes.
7. View the contents of T:\ and note that the tempdb.mdf and templog.ldf files have been moved to this location.
8. In SQL Server Management Studio, in Object Explorer, right-click tempdb, and click Properties.
9. On the Files page, verify that the file settings have been modified. Then click Cancel.
10. Save the script file as Configure TempDB.sql in the D:\Labfiles\Lab03\Starter folder.
11. Keep SQL Server Management Studio open for the next exercise.
Results: After this exercise, you should have inspected and configured the tempdb database.
Exercise 2: Creating Databases

Task 1: Create the HumanResources Database
1. In SQL Server Management Studio, click New Query.
2. Enter the following statements and click Execute (alternatively, you can open the Create HumanResources.sql script file in the D:\Labfiles\Lab03\Solution folder):
   CREATE DATABASE HumanResources
   ON PRIMARY
   (NAME = 'HumanResources', FILENAME = 'M:\Data\HumanResources.mdf', SIZE = 50MB, FILEGROWTH = 5MB)
   LOG ON
   (NAME = 'HumanResources_log', FILENAME = 'L:\Logs\HumanResources.ldf', SIZE = 5MB, FILEGROWTH = 1MB);
   GO
3. In Object Explorer, right-click the Databases folder and click Refresh to confirm that the HumanResources database has been created.
4. Save the script file as Create HumanResources.sql in the D:\Labfiles\Lab03\Starter folder.
Task 2: Create the InternetSales Database
1. In SQL Server Management Studio, click New Query.
2. Enter the following statements and click Execute (alternatively, you can open the Create InternetSales.sql script file in the D:\Labfiles\Lab03\Solution folder):
   CREATE DATABASE InternetSales
   ON PRIMARY
   (NAME = 'InternetSales', FILENAME = 'M:\Data\InternetSales.mdf', SIZE = 5MB, FILEGROWTH = 1MB),
   FILEGROUP SalesData
   (NAME = 'InternetSales_data1', FILENAME = 'M:\Data\InternetSales_data1.ndf', SIZE = 100MB, FILEGROWTH = 10MB),
   (NAME = 'InternetSales_data2', FILENAME = 'N:\Data\InternetSales_data2.ndf', SIZE = 100MB, FILEGROWTH = 10MB)
   LOG ON
   (NAME = 'InternetSales_log', FILENAME = 'L:\Logs\InternetSales.ldf', SIZE = 2MB, FILEGROWTH = 10%);
   GO
3. Under the existing code, enter the following statements. Then select the statements you have just added, and click Execute.
   ALTER DATABASE InternetSales MODIFY FILEGROUP SalesData DEFAULT;
   GO
4. In Object Explorer, right-click the Databases folder and click Refresh to confirm that the InternetSales database has been created.
5. Save the script file as Create InternetSales.sql in the D:\Labfiles\Lab03\Starter folder.
Task 3: View Data File Information
1. In SQL Server Management Studio, open the ViewFileInfo.sql script file in the D:\Labfiles\Lab03\Starter folder.
2. Select the code under the comment View page usage and click Execute. This query retrieves data about the files in the InternetSales database.
3. Note the UsedPages and TotalPages values for the SalesData filegroup.
4. Select the code under the comment Create a table on the SalesData filegroup and click Execute.
5. Select the code under the comment Insert 10,000 rows and click Execute.
6. Select the code under the comment View page usage again and click Execute.
7. Note the UsedPages value for the SalesData filegroup, and verify that the data in the table is spread across the files in the filegroup.
Results: After this exercise, you should have created a new HumanResources database and an InternetSales database that includes multiple filegroups.
Exercise 3: Attaching a Database

Task 1: Attach the AWDataWarehouse Database
1. Move AWDataWarehouse.ldf from the D:\Labfiles\Lab03\Starter\ folder to the L:\Logs\ folder.
2. Move the following files from the D:\Labfiles\Lab03\Starter\ folder to the M:\Data\ folder:
   o AWDataWarehouse.mdf
   o AWDataWarehouse_archive.ndf
   o AWDataWarehouse_current.ndf
3. In SQL Server Management Studio, in Object Explorer, right-click Databases and click Attach.
4. In the Attach Databases dialog box, click Add. Then in the Locate Database Files dialog box, select the M:\Data\AWDataWarehouse.mdf database file and click OK.
5. In the Attach Databases dialog box, after you have added the primary database file, note that all of the database files are listed. Then click OK.
6. In Object Explorer, under Databases, verify that AWDataWarehouse is now listed.
Task 2: Configure Filegroups
1. In Object Explorer, right-click the AWDataWarehouse database and click Properties.
2. On the Filegroups page, view the filegroups used by the database.
3. Select the Read-Only checkbox for the Archive filegroup and click OK.
4. In Object Explorer, expand AWDataWarehouse, and expand Tables. Then right-click the dbo.FactInternetSales table and click Properties.
5. On the Storage page, verify that the dbo.FactInternetSales table is stored in the Current filegroup. Then click Cancel.
6. Right-click the dbo.FactInternetSalesArchive table and click Properties.
7. On the Storage page, verify that the dbo.FactInternetSalesArchive table is stored in the Archive filegroup. Then click Cancel.
8. In Object Explorer, right-click the dbo.FactInternetSales table and click Edit Top 200 Rows.
9. Change the SalesAmount value for the first record to 2500 and press Enter to update the record. Then close the dbo.FactInternetSales table.
10. In Object Explorer, right-click the dbo.FactInternetSalesArchive table and click Edit Top 200 Rows.
11. Change the SalesAmount value for the first record to 3500 and press Enter to update the record.
12. View the error message that is displayed and click OK. Then press Esc to cancel the update and close the dbo.FactInternetSalesArchive table.
Results: After this exercise, you should have attached the AWDataWarehouse database to MIA-SQL.
Module 4: Planning and Implementing a Backup Strategy
Lab: Backing Up Databases

Exercise 1: Backing Up Databases

Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Labfiles\Lab04\Starter folder, right-click Setup.cmd, and then click Run as administrator.
3. Click Yes when prompted to confirm you want to run the command file, and wait for the script to finish.
Task 2: Set the Recovery Model
1. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows authentication.
2. In Object Explorer, expand Databases. Then right-click HumanResources and click Properties.
3. In the Database Properties – HumanResources dialog box, on the Options page, in the Recovery model drop-down list, select Simple. Then click OK.
Task 3: Perform a Full Database Backup
1. In SQL Server Management Studio, in Object Explorer, under Databases, right-click HumanResources, point to Tasks, and click Back Up.
2. In the Back Up Database – HumanResources dialog box, ensure that Backup type is set to Full, and in the Destination section, select the existing file path and click Remove. Then click Add and in the Select Backup Destination dialog box, enter the file name R:\Backups\HumanResources.bak and click OK.
3. In the Back Up Database – HumanResources dialog box, on the Media Options page, select Back up to a new media set, and erase all existing backup sets. Then enter the new media set name HumanResources Backup.
4. In the Back Up Database – HumanResources dialog box, on the Backup Options page, note the default backup name. Then in the Set backup compression list, select Compress backup and click OK.
5. When the backup has completed successfully, click OK.
6. Verify that the backup file HumanResources.bak has been created in the R:\Backups folder, and note its size.
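For reference, the following is a minimal Transact-SQL sketch of the same backup; WITH FORMAT creates the new media set, matching the dialog box settings above.

Transact-SQL Equivalent
BACKUP DATABASE HumanResources
TO DISK = 'R:\Backups\HumanResources.bak'
WITH FORMAT, MEDIANAME = 'HumanResources Backup', COMPRESSION;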
Task 4: Modify Data in the Database
1. In SQL Server Management Studio, click New Query.
2. Enter the following Transact-SQL code in the query pane, and then click Execute. Alternatively, you can open the Update HumanResources.sql file in the D:\Labfiles\Lab04\Starter folder.
   UPDATE HumanResources.dbo.Employee
   SET PhoneNumber = '151-555-1234'
   WHERE BusinessEntityID = 259;
3. Note the number of rows affected, and then close the query pane without saving the file.
Task 5: Perform Another Full Database Backup
1. In SQL Server Management Studio, in Object Explorer, under Databases, right-click HumanResources, point to Tasks, and click Back Up.
2. In the Back Up Database – HumanResources dialog box, ensure that Backup type is set to Full, and in the Destination section, verify that R:\Backups\HumanResources.bak is listed.
3. In the Back Up Database – HumanResources dialog box, on the Media Options page, ensure that Back up to the existing media set and Append to the existing backup set are selected.
4. In the Back Up Database – HumanResources dialog box, on the Backup Options page, change the backup name to HumanResources-Full Database Backup 2. Then in the Set backup compression list, select Compress backup and click OK.
5. When the backup has completed successfully, click OK.
6. View the HumanResources.bak backup file in the R:\Backups folder, and verify that its size has increased.
Task 6: View the Backup and Restore Events Report
1. In SQL Server Management Studio, in Object Explorer, under Databases, right-click HumanResources, point to Reports, point to Standard Reports, and click Backup and Restore Events.
2. In the Backup and Restore Events [HumanResources] report, expand Successful Backup Operations and view the backup operations that have been performed for this database.
3. In the Device Type column, expand each of the Disk (temporary) entries to view details of the backup media set file.
4. Close the report pane.
Results: At the end of this exercise, you will have backed up the HumanResources database to R:\Backups\HumanResources.bak.
Exercise 2: Performing Database, Differential, and Transaction Log Backups

Task 1: Set the Recovery Model
1. In SQL Server Management Studio, in Object Explorer, expand Databases. Then right-click InternetSales and click Properties.
2. In the Database Properties – InternetSales dialog box, on the Options page, in the Recovery model drop-down list, ensure that Full is selected. Then click OK.
Task 2: Perform a Full Database Backup
1. In Object Explorer, under Databases, right-click InternetSales, point to Tasks, and click Back Up.
2. In the Back Up Database – InternetSales dialog box, ensure that Backup type is set to Full, and in the Destination section, select the existing file path and click Remove. Then click Add and in the Select Backup Destination dialog box, enter the file name R:\Backups\InternetSales.bak and click OK.
3. In the Back Up Database – InternetSales dialog box, on the Media Options page, select Back up to a new media set, and erase all existing backup sets. Then enter the new media set name InternetSales Backup.
4. In the Back Up Database – InternetSales dialog box, on the Backup Options page, note the default backup name. Then in the Set backup compression list, select Compress backup and click OK.
5. When the backup has completed successfully, click OK.
6. Verify that the backup file InternetSales.bak has been created in the R:\Backups folder, and note its size.
Task 3: Modify Data in the Database 1.
In SQL Server Management Studio, click New Query.
2.
Enter the following Transact-SQL code in the query pane, and then click Execute. Alternatively, you can open the Update InternetSales.sql file in the D:\Labfiles\Lab04\Starter folder, and select the first batch of code.
UPDATE InternetSales.dbo.Product
SET ListPrice = ListPrice * 1.1
WHERE ProductSubcategoryID = 1;
3.
Note the number of rows affected. Keep the script open; you will use it again in a later task.
Task 4: Perform a Transaction Log Backup
1.
In Object Explorer, under Databases, right-click InternetSales, point to Tasks, and click Back Up.
2.
In the Back Up Database – InternetSales dialog box, in the Backup type list, select Transaction Log, and in the Destination section, verify that R:\Backups\InternetSales.bak is listed.
3.
In the Back Up Database – InternetSales dialog box, on the Media Options page, ensure that Back up to the existing media set and Append to the existing backup set are selected.
4.
In the Back Up Database – InternetSales dialog box, on the Backup Options page, change the backup name to InternetSales-Transaction Log Backup. Then in the Set backup compression list, select Compress backup and click OK.
5.
When the backup has completed successfully, click OK.
6.
View the InternetSales.bak backup file in the R:\Backups folder, and verify that its size has increased.
Task 5: Modify Data in the Database
1.
Modify the Transact-SQL code in the query pane as follows, and then click Execute. Alternatively, you can open the Update InternetSales.sql file in the D:\Labfiles\Lab04\Starter folder, and select the second batch of code.
UPDATE InternetSales.dbo.Product
SET ListPrice = ListPrice * 1.1
WHERE ProductSubcategoryID = 2;
2.
Note the number of rows affected. Keep the script open; you will use it again in a later task.
Task 6: Perform a Differential Backup
1.
In Object Explorer, under Databases, right-click InternetSales, point to Tasks, and click Back Up.
2.
In the Back Up Database – InternetSales dialog box, in the Backup type list, select Differential, and in the Destination section, verify that R:\Backups\InternetSales.bak is listed.
3.
In the Back Up Database – InternetSales dialog box, on the Media Options page, ensure that Back up to the existing media set and Append to the existing backup set are selected.
4.
In the Back Up Database – InternetSales dialog box, on the Backup Options page, change the backup name to InternetSales-Differential Backup. Then in the Set backup compression list, select Compress backup and click OK.
5.
When the backup has completed successfully, click OK.
6.
View the InternetSales.bak backup file in the R:\Backups folder, and verify that its size has increased.
Task 7: Modify Data in the Database
1.
Modify the Transact-SQL code in the query pane as follows, and then click Execute. Alternatively, you can open the Update InternetSales.sql file in the D:\Labfiles\Lab04\Starter folder, and select the third batch of code.
UPDATE InternetSales.dbo.Product
SET ListPrice = ListPrice * 1.1
WHERE ProductSubcategoryID = 3;
2.
Note the number of rows affected. Then close the query pane without saving the file.
Task 8: Perform Another Transaction Log Backup
1.
In Object Explorer, under Databases, right-click InternetSales, point to Tasks, and click Back Up.
2.
In the Back Up Database – InternetSales dialog box, in the Backup type list, select Transaction Log, and in the Destination section, verify that R:\Backups\InternetSales.bak is listed.
3.
In the Back Up Database – InternetSales dialog box, on the Media Options page, ensure that Back up to the existing media set and Append to the existing backup set are selected.
4.
In the Back Up Database – InternetSales dialog box, on the Backup Options page, change the backup name to InternetSales-Transaction Log Backup 2. Then in the Set backup compression list, select Compress backup and click OK.
5.
When the backup has completed successfully, click OK.
6.
View the InternetSales.bak backup file in the R:\Backups folder, and verify that its size has increased.
Task 9: Verify Backup Media
1.
In SQL Server Management Studio, click New Query.
2.
Enter the following Transact-SQL code in the query pane, and then click Execute.
RESTORE HEADERONLY FROM DISK = 'R:\Backups\InternetSales.bak';
GO
3.
Verify that the backups you performed in this exercise are all listed.
4.
Modify the Transact-SQL code as follows, and then click Execute.
RESTORE FILELISTONLY FROM DISK = 'R:\Backups\InternetSales.bak';
GO
5.
Note the database files that are included in the backups.
6.
Modify the Transact-SQL code as follows, and then click Execute.
RESTORE VERIFYONLY FROM DISK = 'R:\Backups\InternetSales.bak';
GO
7.
Verify that the backup is valid. Then close the query pane without saving the file.
Results: At the end of this exercise, you will have backed up the InternetSales database to R:\Backups\InternetSales.bak.
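For reference, the backup sequence performed in this exercise corresponds approximately to the following Transact-SQL. This is a sketch only; the dialog boxes generate fuller statements:
BACKUP DATABASE InternetSales TO DISK = 'R:\Backups\InternetSales.bak'
WITH FORMAT, MEDIANAME = 'InternetSales Backup', COMPRESSION; -- full backup (position 1)
BACKUP LOG InternetSales TO DISK = 'R:\Backups\InternetSales.bak'
WITH NOINIT, COMPRESSION; -- transaction log backup (position 2)
BACKUP DATABASE InternetSales TO DISK = 'R:\Backups\InternetSales.bak'
WITH DIFFERENTIAL, NOINIT, COMPRESSION; -- differential backup (position 3)
BACKUP LOG InternetSales TO DISK = 'R:\Backups\InternetSales.bak'
WITH NOINIT, COMPRESSION; -- second transaction log backup (position 4)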
Exercise 3: Performing a Partial Backup
Task 1: Set the Recovery Model
1.
In SQL Server Management Studio, in Object Explorer, expand Databases. Then right-click AWDataWarehouse and click Properties.
2.
In the Database Properties – AWDataWarehouse dialog box, on the Options page, in the Recovery model drop-down list, select Simple. Then click OK.
Task 2: Back Up the Read-Only Filegroup
1.
In SQL Server Management Studio, click New Query.
2.
Enter the following Transact-SQL code in the query pane, and then click Execute.
BACKUP DATABASE AWDataWarehouse
FILEGROUP = 'Archive'
TO DISK = 'R:\Backups\AWDataWarehouse-Read-Only.bak'
WITH FORMAT, INIT, NAME = 'AWDataWarehouse-Archive', COMPRESSION;
3.
Verify that the backup file AWDataWarehouse-Read-Only.bak has been created in the R:\Backups folder.
Task 3: Perform a Partial Backup
1.
In SQL Server Management Studio, click New Query.
2.
Enter the following Transact-SQL code in the query pane, and then click Execute.
BACKUP DATABASE AWDataWarehouse READ_WRITE_FILEGROUPS
TO DISK = 'R:\Backups\AWDataWarehouse-Read-Write.bak'
WITH FORMAT, INIT, NAME = 'AWDataWarehouse-Active Data', COMPRESSION;
3.
Verify that the backup file AWDataWarehouse-Read-Write.bak has been created in the R:\Backups folder.
Task 4: Modify Data in the Database
1.
In SQL Server Management Studio, click New Query.
2.
Enter the following Transact-SQL code in the query pane, and then click Execute. Alternatively, you can open the Update AWDataWarehouse.sql file in the D:\Labfiles\Lab04\Starter folder.
INSERT INTO AWDataWarehouse.dbo.FactInternetSales
VALUES (1, 20080801, 11000, 5.99, 2.49);
3.
Note the number of rows affected, and then close the query pane without saving the file.
Task 5: Perform a Differential Partial Backup
1.
In SQL Server Management Studio, click New Query.
2.
Enter the following Transact-SQL code in the query pane, and then click Execute.
BACKUP DATABASE AWDataWarehouse READ_WRITE_FILEGROUPS
TO DISK = 'R:\Backups\AWDataWarehouse-Read-Write.bak'
WITH DIFFERENTIAL, NOFORMAT, NOINIT, NAME = 'AWDataWarehouse-Active Data Diff', COMPRESSION;
Task 6: Verify Backup Media
1.
In SQL Server Management Studio, click New Query.
2.
Enter the following Transact-SQL code in the query pane, and then click Execute.
RESTORE HEADERONLY FROM DISK = 'R:\Backups\AWDataWarehouse-Read-Only.bak';
GO
3.
View the backups on this backup media, and scroll to the right to view the BackupTypeDescription column.
4.
Modify the Transact-SQL code as follows, and then click Execute.
RESTORE HEADERONLY FROM DISK = 'R:\Backups\AWDataWarehouse-Read-Write.bak';
GO
5.
View the backups on this backup media, and scroll to the right to view the BackupTypeDescription column.
6.
Close SQL Server Management Studio without saving any script files.
Results: At the end of this exercise, you will have backed up the read-only filegroup in the AWDataWarehouse database to R:\Backups\AWDataWarehouse-Read-Only.bak, and the writable filegroups in the AWDataWarehouse database to R:\Backups\AWDataWarehouse-Read-Write.bak.
Module 5: Restoring SQL Server 2014 Databases
Lab: Restoring SQL Server Databases
Exercise 1: Restoring a Database Backup
Task 1: Prepare the Lab Environment
1.
Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2.
In the D:\Labfiles\Lab05\Starter folder, right-click Setup.cmd, and click Run as administrator.
3.
Click Yes when prompted to confirm you want to run the command file, and wait for the script to finish.
Task 2: Determine the Cause of the Failure
1.
Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows authentication.
2.
In Object Explorer, expand Databases, and note that the HumanResources database is in a Recovery Pending state.
3.
In SQL Server Management Studio, click New Query and execute the following Transact-SQL code to attempt to bring the database online.
ALTER DATABASE HumanResources SET ONLINE;
4.
Note the error message that is displayed. The database cannot be brought online because the primary data file is lost.
5.
View the contents of the M:\Data folder to verify that the HumanResources.mdf file is not present.
Task 3: Restore the HumanResources Database
1.
In SQL Server Management Studio, in Object Explorer, right-click the HumanResources database, point to Tasks, point to Restore, and click Database.
2.
In the Restore Database – HumanResources dialog box, note that the backup history for the database has been retained, and the most recent full backup is automatically selected.
3.
In the Script drop-down list, click New Query Editor Window. Then click OK.
4.
When the database has been restored successfully, click OK.
5.
View the Transact-SQL code that was used to restore the database, noting that the full backup was restored from file 2 in the R:\Backups\HumanResources.bak backup media set.
6.
In Object Explorer, verify that the HumanResources database is now recovered and ready to use.
Results: After this exercise, you should have restored the HumanResources database.
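The script generated by the Restore Database dialog box is equivalent to a statement along these lines; a sketch, with the file position taken from the note in step 5:
USE master;
RESTORE DATABASE HumanResources
FROM DISK = 'R:\Backups\HumanResources.bak'
WITH FILE = 2, RECOVERY;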
Exercise 2: Restoring Database, Differential, and Transaction Log Backups
Task 1: Determine the Cause of the Failure
1.
In SQL Server Management Studio, in Object Explorer, under Databases, note that the InternetSales database is in a Recovery Pending state.
2.
Click New Query and execute the following Transact-SQL code to attempt to bring the database online.
ALTER DATABASE InternetSales SET ONLINE;
3.
Note the error message that is displayed. There is a problem with the primary data file.
4.
View the contents of the M:\Data folder to verify that the InternetSales.mdf file is present. This file has become corrupt, and has rendered the database unusable.
Task 2: Perform a Tail-Log Backup
1.
View the contents of the L:\Logs folder and verify that the InternetSales_log.ldf file is present.
2.
In SQL Server Management Studio, click New Query and enter the following Transact-SQL code to back up the tail of the transaction log:
USE master;
BACKUP LOG InternetSales TO DISK = 'R:\Backups\IS-TailLog.bak'
WITH NO_TRUNCATE;
3.
Click Execute, and view the resulting message to verify that the backup is successful.
Task 3: Restore the InternetSales Database
1.
In SQL Server Management Studio, in Object Explorer, right-click the InternetSales database, point to Tasks, point to Restore, and click Database.
2.
In the Restore Database – InternetSales dialog box, note that only the tail-log backup is listed. The backup history for this database has been deleted.
3.
In the Restore Database – InternetSales dialog box, in the Source section, select Device and click the ellipsis (...) button.
4.
In the Select backup devices dialog box, click Add, and then in the Locate Backup File – MIA-SQL dialog box, select R:\Backups\InternetSales.bak and click OK.
5.
In the Select backup devices dialog box, ensure that R:\Backups\InternetSales.bak is listed, and then click Add.
6.
In the Locate Backup File – MIA-SQL dialog box, select R:\Backups\IS-TailLog.bak and click OK.
7.
In the Select backup devices dialog box, ensure that both R:\Backups\InternetSales.bak and R:\Backups\IS-TailLog.bak are listed and click OK.
8.
Note that the backup media contains a full backup, a differential backup, and a transaction log backup (these are the planned backups in InternetSales.bak); and a copy-only transaction log backup (which is the tail-log backup in IS-TailLog.bak). All of these are automatically selected in the Restore column.
9.
On the Options page, ensure that the Recovery state is set to RESTORE WITH RECOVERY.
10. In the Script drop-down list, click New Query Editor Window. Then click OK.
11. When the database has been restored successfully, click OK.
12. View the Transact-SQL code that was used to restore the database, noting that the full backup, the differential backup, and the first transaction log backup were restored using the NORECOVERY option. The restore operation for the tail-log backup used the default RECOVERY option to recover the database.
13. In Object Explorer, verify that the InternetSales database is now recovered and ready to use.
Results: After this exercise, you should have restored the InternetSales database.
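The generated restore script follows the standard full, differential, log, tail-log sequence. A sketch, assuming the backup set positions created in the previous lab (full = 1, differential = 3, log after the differential = 4):
USE master;
RESTORE DATABASE InternetSales FROM DISK = 'R:\Backups\InternetSales.bak' WITH FILE = 1, NORECOVERY;
RESTORE DATABASE InternetSales FROM DISK = 'R:\Backups\InternetSales.bak' WITH FILE = 3, NORECOVERY;
RESTORE LOG InternetSales FROM DISK = 'R:\Backups\InternetSales.bak' WITH FILE = 4, NORECOVERY;
RESTORE LOG InternetSales FROM DISK = 'R:\Backups\IS-TailLog.bak' WITH RECOVERY;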
Exercise 3: Performing a Piecemeal Restore
Task 1: Begin a Piecemeal Restore
1.
In SQL Server Management Studio, in Object Explorer, under Databases, verify that the AWDataWarehouse database is not listed.
2.
Click New Query and enter the following Transact-SQL code to start a partial restore of the database from the full backup set in position 1 in the AWDataWarehouse_Read-Write.bak media set:
USE master;
RESTORE DATABASE AWDataWarehouse FILEGROUP = 'Current'
FROM DISK = 'R:\Backups\AWDataWarehouse_Read-Write.bak'
WITH PARTIAL, FILE = 1, NORECOVERY;
3.
Click Execute, and view the resulting message to verify that the restore is successful.
4.
In Object Explorer, right-click the Databases folder and click Refresh; and verify that AWDataWarehouse is listed with a “Restoring” status.
Task 2: Restore Read/Write Filegroups and Bring the Database Online
1.
In SQL Server Management Studio, under the existing code in the query pane, enter the following Transact-SQL code to restore the differential backup set in position 2 in the AWDataWarehouse_Read-Write.bak media set:
RESTORE DATABASE AWDataWarehouse
FROM DISK = 'R:\Backups\AWDataWarehouse_Read-Write.bak'
WITH FILE = 2, RECOVERY;
2.
Select the code you just entered and click Execute, and view the resulting message to verify that the restore is successful.
3.
In Object Explorer, right-click the Databases folder and click Refresh; and verify that AWDataWarehouse is now shown as online.
4.
Expand the AWDataWarehouse database and its Tables folder. Then right-click dbo.FactInternetSales and click Select Top 1000 Rows. Note that you can retrieve data from this table, which is stored in the read/write Current filegroup.
5.
In Object Explorer, right-click dbo.FactInternetSalesArchive and click Select Top 1000 Rows. Note that you cannot retrieve data from this table, which is stored in the read-only Archive filegroup.
Task 3: Restore the Read-Only Filegroup
1.
In SQL Server Management Studio, switch to the query window containing the RESTORE statements you entered earlier. Then add the following Transact-SQL under the existing code:
RESTORE DATABASE AWDataWarehouse FILEGROUP = 'Archive'
FROM DISK = 'R:\Backups\AWDataWarehouse_Read-Only.bak'
WITH RECOVERY;
2.
Select the code you just entered and click Execute, and view the resulting message to verify that the restore is successful.
3.
In Object Explorer, right-click dbo.FactInternetSalesArchive and click Select Top 1000 Rows. Note that you can now retrieve data from this table, which is stored in the read-only Archive filegroup.
Results: After this exercise, you will have restored the AWDataWarehouse database.
Module 6: Importing and Exporting Data
Lab: Importing and Exporting Data
Exercise 1: Using the SQL Server Import and Export Wizard
Task 1: Prepare the Lab Environment
1.
Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2.
In the D:\Labfiles\Lab06\Starter folder, right-click Setup.cmd and click Run as administrator.
3.
Click Yes when prompted to confirm you want to run the command file, and wait for the script to finish.
Task 2: Use the SQL Server Import and Export Wizard to Export Data 1.
Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows authentication.
2.
In Object Explorer, expand Databases. Then right-click the InternetSales database, point to Tasks, and click Export Data.
3.
On the Welcome to SQL Server Import and Export Wizard page, click Next.
4.
On the Choose a Data Source page, in the Data source drop-down list, select SQL Server Native Client 11.0. Then ensure that the MIA-SQL server is selected, that Use Windows Authentication is selected, and that the InternetSales database is selected; and click Next.
5.
On the Choose a Destination page, in the Destination drop-down list, select Microsoft Excel. Then in the Excel file path box type D:\Labfiles\Lab06\Starter\Sales.xls, ensure that First row has column names is selected, and click Next.
6.
On the Specify Table Copy or Query page, select Write a query to specify the data to transfer and click Next.
7.
On the Provide a Source Query page, click Browse and open the Query.sql script file in the D:\Labfiles\Lab06\Starter folder. Then, on the Provide a Source Query page, click Next.
8.
On the Select Source Tables and Views page, replace 'Query' in the Destination column with 'Sales'. Then click Next.
9.
On the Review Data Type Mapping page, review the default mappings and click Next.
10. On the Save and Run Package page, ensure that Run immediately is selected, and click Next.
11. On the Complete the Wizard page, click Finish. Then, when the execution is successful, click Close.
12. Start Excel and open the Sales.xls file in the D:\Labfiles\Lab06\Starter folder and view the data that has been exported. Then close Excel without saving the file.
Results: After this exercise, you should have exported data from InternetSales to an Excel workbook named Sales.xls.
Exercise 2: Using the bcp Utility
Task 1: Create a Format File
1.
In SQL Server Management Studio, in Object Explorer, expand the HumanResources database and its Tables folder, and then right-click the dbo.JobCandidate table and click Select Top 1000 Rows.
2.
View the existing data in the table, noting that some of the columns include Unicode characters.
3.
Open a command prompt and enter the following command to create a format file:
bcp HumanResources.dbo.JobCandidate format nul -S MIA-SQL -T -w -t \t -r \n -x -f D:\Labfiles\Lab06\Starter\JobCandidateFmt.xml
4.
Start Notepad and open JobCandidateFmt.xml in the D:\Labfiles\Lab06\Starter folder. Then view the XML format file and close Notepad.
Task 2: Use bcp to Import Data 1.
Use Notepad to view the contents of the JobCandidates.txt file in the D:\Labfiles\Lab06\Starter folder. Note that this file contains new candidate data. Then close Notepad.
2.
In the command prompt window, enter the following command to import the new candidate data into the dbo.JobCandidate table in the HumanResources database.
bcp HumanResources.dbo.JobCandidate in D:\Labfiles\Lab06\Starter\JobCandidates.txt -S MIA-SQL -T -f D:\Labfiles\Lab06\Starter\JobCandidateFmt.xml
3.
Close the command prompt.
4.
In SQL Server Management Studio, re-execute the query that retrieves the top 1000 rows from the dbo.JobCandidate table and verify that the new data has been imported.
Results: After this exercise, you should have created a format file named JobCandidateFmt.xml, and imported the contents of the JobCandidates.txt file into the HumanResources database.
Exercise 3: Using the BULK INSERT Statement
Task 1: Disable Indexes
1.
In SQL Server Management Studio, in Object Explorer, expand the InternetSales database and its Tables folder, right-click dbo.CurrencyRate, and click Select Top 1000 Rows. Note that the table is currently empty.
2.
Expand the dbo.CurrencyRate table, and then expand its Indexes folder. Note that the table has indexes defined.
3.
Click New Query, and then in the new query pane, enter the following Transact-SQL code to disable indexes:
ALTER INDEX ALL ON InternetSales.dbo.CurrencyRate DISABLE;
GO
4.
Click Execute.
Task 2: Use the BULK INSERT Statement to Import Data
1.
Use Excel to view the contents of the CurrencyRates.csv file in the M:\ folder, and note that it contains currency rate data. Then close Excel.
2.
In SQL Server Management Studio, in the query pane, under the existing code to disable indexes, enter the following Transact-SQL code:
BULK INSERT InternetSales.dbo.CurrencyRate
FROM 'M:\CurrencyRates.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
3.
Click Execute and note the number of rows affected.
4.
Switch to the query pane that retrieves the top 1000 rows from the dbo.CurrencyRate table, remove the Top 1000 clause from the query, and click Execute to run the modified SELECT query. Note that the table is now populated with the same number of rows as you noted in the previous step.
Task 3: Rebuild Indexes 1.
In SQL Server Management Studio, in the query pane, under the existing code to import data, enter the following Transact-SQL code:
ALTER INDEX ALL ON InternetSales.dbo.CurrencyRate REBUILD;
GO
2.
Click Execute.
Results: After this exercise, you should have used the BULK INSERT statement to load data into the CurrencyRate table in the InternetSales database.
Exercise 4: Using the OPENROWSET Function
Task 1: Copy Data Files to the Server
1.
Use Notepad to view the JobCandidates2.txt file in the D:\Labfiles\Lab06\Starter folder and note that it contains data for three candidates, only two of which have supplied email addresses. Then close Notepad without saving the file.
2.
Copy the JobCandidates2.txt and JobCandidateFmt.xml files from the D:\Labfiles\Lab06\Starter folder to the M:\ folder.
Note: In this lab environment, the client and server are the same. However, in a real environment you would need to upload data and format files from your local workstation to a volume that is accessible from the server. In this scenario, M: represents a volume in a SAN that would be accessible from the server.
Task 2: Disable Indexes and Constraints
1.
In SQL Server Management Studio, in Object Explorer, under the HumanResources database, right-click the dbo.JobCandidate table and click Select Top 1000 Rows. Note the number of rows currently in the table.
2.
Expand the dbo.JobCandidate table, and then expand both its Constraints folder and its Indexes folder. Note that the table has indexes and constraints defined.
3.
Click New Query, and then in the new query pane, enter the following Transact-SQL code to disable the non-clustered indexes and constraints:
ALTER INDEX idx_JobCandidate_City ON HumanResources.dbo.JobCandidate DISABLE;
GO
ALTER INDEX idx_JobCandidate_CountryRegion ON HumanResources.dbo.JobCandidate DISABLE;
GO
ALTER TABLE HumanResources.dbo.JobCandidate NOCHECK CONSTRAINT ALL;
GO
4.
Click Execute.
Task 3: Use the OPENROWSET Function to Import Data
1.
In SQL Server Management Studio, in the query pane, under the existing code to disable indexes and constraints, enter the following Transact-SQL code:
INSERT INTO HumanResources.dbo.JobCandidate
SELECT * FROM OPENROWSET
(BULK 'M:\JobCandidates2.txt', FORMATFILE = 'M:\JobCandidateFmt.xml') AS rows
WHERE EmailAddress IS NOT NULL;
2.
Click Execute and note the number of rows affected.
3.
Switch to the query pane that retrieves the top 1000 rows from the dbo.JobCandidate table and click Execute to re-run the SELECT query. Verify that the records for candidates with an email address have been inserted.
Task 4: Re-Enable Indexes and Constraints
1.
In SQL Server Management Studio, in the query pane, under the existing code to import data, enter the following Transact-SQL code:
ALTER INDEX idx_JobCandidate_City ON HumanResources.dbo.JobCandidate REBUILD;
GO
ALTER INDEX idx_JobCandidate_CountryRegion ON HumanResources.dbo.JobCandidate REBUILD;
GO
ALTER TABLE HumanResources.dbo.JobCandidate CHECK CONSTRAINT ALL;
GO
2.
Click Execute.
Results: After this exercise, you should have imported data from JobCandidates2.txt into the dbo.JobCandidate table in the HumanResources database.
Module 7: Monitoring SQL Server 2014
Lab: Monitoring SQL Server 2014
Exercise 1: Collecting Baseline Metrics
Task 1: Prepare the Lab Environment
1.
Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2.
In the D:\Labfiles\Lab07\Starter folder, right-click Setup.cmd and click Run as administrator.
3.
Click Yes when prompted to confirm you want to run the command file, and wait for the script to finish.
Task 2: Create a Data Collector Set
1.
Right-click the Start button and click Computer Management.
2.
In Computer Management, expand Performance, expand Monitoring Tools, expand Data Collector Sets, and expand User Defined.
3.
If a data collector set named SQL Server Workload already exists (because you completed this lab previously), right-click it and click Delete. Then click Yes when prompted.
4.
Right-click User Defined, point to New, and click Data Collector Set.
5.
In the Create new Data Collector Set dialog box, enter the name SQL Server Workload. Then select Create manually (Advanced) and click Next.
6.
On the What type of data do you want to include? page, under Create data logs, select Performance counter, and then click Next.
7.
On the Which performance counters would you like to add? page, click Add.
8.
In the list of objects, expand the Processor object, and select only the % Processor Time counter. Then in the Instances of selected object list, ensure that _Total is selected and click Add.
9.
In the list of objects, expand the Memory object and select the Page Faults/sec counter. Then click Add.
10. In the list of objects, expand the SQLServer:Locks object, click the Average Wait Time (ms) counter, and then hold the Ctrl key and click the Lock Requests/sec and Lock Waits/sec counters. Then in the Instances of selected object list, ensure that _Total is selected and click Add.
11. In the list of objects, expand the SQLServer:Memory Manager object, click the Database Cache Memory (KB) counter, and then hold the Ctrl key and click the Free memory (KB) counter. Then click Add.
12. In the list of objects, expand the SQLServer:Plan Cache object, and select the Cache Hit Ratio counter. Then in the Instances of selected object list, ensure that _Total is selected and click Add.
13. In the list of objects, expand the SQLServer:Transactions object, and select the Transactions counter. Then click Add.
14. In the Add Counters dialog box, click OK. Then in the Create new Data Collector Set dialog box, on the Which performance counters would you like to add? page, click Next.
15. On the Where would you like the data to be saved? page, in the Root directory box, type D:\Labfiles\Lab07\Starter\Logs and then click Next.
16. On the Create the data collector set? page, ensure that Save and close is selected and click Finish.
Task 3: Run the Data Collector Set
1.
In Computer Management, right-click the SQL Server Workload data collector set you created in the previous task and click Start.
2.
In the D:\Labfiles\Lab07\Starter folder, right-click Baseline.ps1 and click Run with PowerShell. If you are prompted to change the execution policy, enter Y. This starts a baseline workload process that takes three minutes to run.
3.
When the PowerShell window closes, in Computer Management, right-click the SQL Server Workload data collector set and click Stop.
Task 4: View the Logged Data
1.
In Computer Management, under Performance, expand Monitoring Tools and click Performance Monitor.
2.
In Performance Monitor, on the toolbar, click View Log Data.
3.
In the Performance Monitor Properties dialog box, on the Source tab, select Log files and click Add. Browse to the D:\Labfiles\Lab07\Starter\Logs folder, open the folder with a name similar to MIA-SQL_2014010101-000001 and open the DataCollector01.blg log file. Then, in the Performance Monitor Properties dialog box, click OK.
4.
In Performance Monitor, in the toolbar, click the Add button (a green +).
5.
In the Add Counters dialog box, click Memory, and then hold the Ctrl key and click each other object to select them all. Then click Add to add all of the counters for all of the objects, and click OK.
6.
Click any of the counters in the list below the chart and on the toolbar click Highlight so that the selected counter is highlighted in the chart. Press the up and down arrow keys on the keyboard to change the selected counter. As you highlight the counters, note the Last, Average, Minimum, and Maximum values.
7.
On the toolbar, in the Change Graph Type list, select Report and view the text-based report, which shows the average value for each counter.
8.
Right-click anywhere in the report and click Save Image As. Then save the report image as BaselineAverages.gif in the D:\Labfiles\Lab07\Starter folder.
9.
On the toolbar, in the Change Graph Type list, select Line to return to the original line chart view.
10. Minimize Computer Management; you will return to it later.
Task 5: View Query and I/O Statistics
1.
Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using Windows authentication.
2.
In SQL Server Management Studio, open the Query DMV.sql script file in the D:\Labfiles\Lab07\Starter folder.
3.
Highlight the Transact-SQL statement under the comment Get top 5 queries by average reads, and then click Execute.
4.
View the results. They include SELECT queries that retrieve data from tables in the InternetSales database.
5.
In the Results pane, right-click any cell and click Save Results As. Then save the results as TopBaselineQueries.csv in the D:\Labfiles\Lab07 folder.
6.
In the query pane, highlight the Transact-SQL statement under the comment View IO Stats, and then click Execute.
7.
View the results. They include details of I/O activity for the files used by the InternetSales database.
8.
In the Results pane, right-click any cell and click Save Results As. Then save the results as BaselineIO.csv in the D:\Labfiles\Lab07 folder.
9.
Minimize SQL Server Management Studio.
Results: At the end of this exercise, you will have a data collector set named SQL Server Workload, a log containing baseline measurements, and query and I/O statistics obtained from DMVs and DMFs.
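The Query DMV.sql script is not reproduced in this answer key. Queries of the following shape would produce the results described above; this is a hypothetical sketch, and the course script may differ:
-- Top 5 queries by average logical reads.
SELECT TOP 5 qs.total_logical_reads / qs.execution_count AS avg_reads, st.text AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_reads DESC;
-- I/O statistics for the files used by the InternetSales database.
SELECT * FROM sys.dm_io_virtual_file_stats(DB_ID('InternetSales'), NULL);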
Exercise 2: Monitoring a Workload
Task 1: Run the Data Collector Set
1.
In Computer Management, in the left pane, right-click the SQL Server Workload data collector set you created previously, and click Start.
2.
In the D:\Labfiles\Lab07\Starter folder, right-click Workload.ps1 and click Run with PowerShell. This starts a database workload process that takes three minutes to run.
3.
When the PowerShell window closes, in Computer Management, right-click the SQL Server Workload data collector set and click Stop.
Task 2: View the Logged Data
1.
In Performance Monitor, on the toolbar, click View Log Data.
2.
In the Performance Monitor Properties dialog box, on the Source tab, ensure that Log files is selected, select any existing files and click Remove.
3.
In the Performance Monitor Properties dialog box, on the Source tab, click Add. Browse to the D:\Labfiles\Lab07\Starter\Logs folder, open the folder with a name similar to MIA-SQL_2014010101-000002 and open the DataCollector01.blg log file. Then, in the Performance Monitor Properties dialog box, click OK.
4.
View the line chart report, noting any counters that look consistently high.
5.
On the toolbar, in the Change Graph Type list, select Histogram bar and view the resulting chart. Then in the Change Graph Type list, select Report and view the text-based report.
6.
In the D:\Labfiles\Lab07\Starter folder, double-click the BaselineAverages.gif image you saved earlier to view the baseline metrics in Internet Explorer. Then compare the baseline averages with the figures in Performance Monitor.
7.
Close Internet Explorer and Computer Management.
Task 3: View Query and I/O Statistics
1.
In SQL Server Management Studio, in the Query DMV.sql script file highlight the Transact-SQL statement under the comment Get top 5 queries by average reads, and then click Execute.
2.
View the results. Then start Microsoft Excel and open the TopBaselineQueries.csv file you saved in the D:\Labfiles\Lab07 folder and compare the results to the queries that were identified during the baseline workload.
3.
In SQL Server Management Studio, in the query pane, highlight the Transact-SQL statement under the comment View IO Stats, and then click Execute.
4.
View the results. Then in Microsoft Excel, open the BaselineIO.csv file you saved in the D:\Labfiles\Lab07 folder and compare the results to the I/O statistics that were identified during the baseline workload.
5.
Close Excel and SQL Server Management Studio without saving any files.
Results: At the end of this exercise, you will have a second log file containing performance metrics for the revised workload.
Module 8: Tracing SQL Server Activity
Lab: Tracing SQL Server Workload Activity
Exercise 1: Capturing a Trace in SQL Server Profiler
Task 1: Prepare the Lab Environment
1.
Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2.
In the D:\Labfiles\Lab08\Starter folder, right-click Setup.cmd and then click Run as administrator.
3.
Click Yes when prompted to confirm that you want to run the command file, and then wait for the script to finish.
Task 2: Create a SQL Server Profiler Trace
1.
On the Start screen, type Profiler and then start SQL Server 2014 Profiler.
2.
In SQL Server Profiler, on the File menu, click New Trace. Then connect to the MIA-SQL database engine instance using Windows authentication.
3.
In the Trace Properties dialog box, on the General tab, set the following properties:
o Trace name: InternetSales Workload
o Use the template: TSQL
o Save to file: D:\Labfiles\Lab08\Starter\InternetSales Workload.trc
4.
In the Trace Properties dialog box, on the Events Selection tab, note the events and columns that were automatically selected from the TSQL template.
5.
Select Show all events, and under TSQL, select SQL:StmtCompleted. Then clear Show all events so that only the selected events, including the one you just selected, are shown.
6.
Select Show all columns and select the Duration column for the SQL:StmtCompleted event.
7.
Click the column header for the Database Name column, and in the Edit Filter dialog box, expand Like, enter InternetSales, and click OK. Then clear Show all columns so that only the selected columns are shown.
Task 3: Capture Workload Events
1.
In the Trace Properties dialog box, click Run.
2.
In the D:\Labfiles\Lab08\Starter folder, right-click Workload.ps1 and click Run with PowerShell. This starts a workload in the InternetSales database that lasts for approximately three minutes.
3.
While the workload is running, switch back to SQL Server Profiler and observe the activity.
4.
When the workload has finished, in SQL Server Profiler, on the File menu, click Stop Trace.
5.
In the trace, select any of the SQL:StmtCompleted events and note that the Transact-SQL code is shown in the bottom pane.
Results: After this exercise, you should have captured a workload using SQL Server Profiler.
Exercise 2: Generating Database Tuning Recommendations
Task 1: Create a Tuning Session
1.
In SQL Server Profiler, on the Tools menu, click Database Engine Tuning Advisor.
2.
When the Database Engine Tuning Advisor starts, connect to the MIA-SQL database engine instance using Windows authentication.
3.
In the Database Engine Tuning Advisor, in the Session name box, type Tune InternetSales.
4.
Under Workload, ensure that File is selected, and browse to the D:\Labfiles\Lab08\Starter\InternetSales Workload.trc file (which is where you saved the trace from SQL Server Profiler in the previous exercise).
5.
In the Database for workload analysis drop-down list, select InternetSales.
6.
In the Select databases and tables to tune list, select InternetSales and note that all of its tables are selected. Then in the drop-down list of tables, clear the checkbox for the CurrencyRate table.
7.
On the Tuning Options tab, review the default options for recommendations. Then click Advanced Options, select Generate online recommendations where possible, and click OK.
Task 2: Generate Recommendations
1.
In the Database Engine Tuning Advisor, on the toolbar, click Start Analysis.
2.
When the analysis is complete, on the Recommendations tab, review the index recommendations that the DTA has generated.
3.
On the Actions menu, click Save Recommendations, save the recommendations script as DTA Recommendations.sql in the D:\Labfiles\Lab08\Starter folder, and click OK when notified that the file was saved.
Task 3: Validate the Recommendations
1.
In the Database Engine Tuning Advisor, on the Reports tab for the Tune InternetSales session, view the tuning summary and in the Select report list, select Event frequency report.
2.
View the report and identify the most frequently used query in the workload.
3.
In the Select report list, select Statement detail report.
4.
View the report and compare the Current Statement Cost and Recommended Statement Cost values for the query you identified as being most frequently used.
5.
Select the Statement String cell for the most frequently used statement, and then right-click the selected cell and click Copy.
6.
Minimize the Database Engine Tuning Advisor and start SQL Server Management Studio. When prompted, connect to the MIA-SQL database engine instance using Windows authentication.
7.
In SQL Server Management Studio, click New Query and paste the statement you copied previously into the query window.
8.
In the Available Databases drop-down list, select InternetSales. Then on the Query menu, click Display Estimated Execution Plan. This displays a breakdown of the tasks that the query processor will perform to process the query.
9.
Note that the query processor suggests that there is at least one missing index that would improve query performance. Then hold the mouse over the SELECT icon at the left side of the query plan diagram and view the Estimated Subtree Cost value that is displayed in a tooltip.
10. In SQL Server Management Studio, open the DTA Recommendations.sql script you saved from the Database Engine Tuning Advisor in the D:\Labfiles\Lab08\Starter folder. Then click Execute to implement the recommended indexes.
11. Close the DTA Recommendations.sql tab, and return to the query and its estimated execution plan.
12. On the Query menu, click Display Estimated Execution Plan again, and note that the query processor no longer suggests that there is a missing index. Then hold the mouse over the SELECT icon at the left side of the query plan diagram and view the Estimated Subtree Cost value.
Results: After this exercise, you should have analyzed the trace in the Database Engine Tuning Advisor, and reviewed the recommendations.
Exercise 3: Using SQL Trace
Task 1: Export a SQL Trace Script
1.
In SQL Server Profiler, with the InternetSales Workload trace still open, on the File menu, point to Export, point to Script Trace Definition, and click For SQL Server 2005 - 2014.
2.
Save the exported trace script as InternetSales Trace.sql in the D:\Labfiles\Lab08\Starter folder, and click OK when notified that the script has been saved.
Task 2: Run the Trace
1.
In SQL Server Management Studio, open the InternetSales Trace.sql script file in the D:\Labfiles\Lab08\Starter folder (which you exported from SQL Server Profiler in the previous task).
2.
View the Transact-SQL code, and in the line that begins exec @rc = sp_trace_create, replace InsertFileNameHere with D:\Labfiles\Lab08\Starter\InternetSales.
3.
Click Execute to start the trace, and note the TraceID value that is returned.
4.
In the D:\Labfiles\Lab08\Starter folder, right-click Workload.ps1 and click Run with PowerShell. This starts a workload in the InternetSales database that lasts for approximately three minutes.
5.
While the workload is running, switch back to SQL Server Management Studio and click New Query. Then in the new query window, enter the following code, replacing TraceID with the TraceID value you noted when you started the trace. Do not execute the code yet.
DECLARE @TraceID int = TraceID;
EXEC sp_trace_setstatus @TraceID, 0;
EXEC sp_trace_setstatus @TraceID, 2;
GO
6.
When the Windows PowerShell window closes (indicating that the workload has finished), in SQL Server Management Studio, click Execute to stop the trace.
Task 3: View the Trace Results
1.
In SQL Server Management Studio, in the query pane, below the existing code, add the following Transact-SQL statement, which retrieves the text data, start time, and duration for each SQL:StmtCompleted event in the trace file.
SELECT TextData, StartTime, Duration
FROM fn_trace_gettable('D:\Labfiles\Lab08\Starter\InternetSales.trc', default)
WHERE EventClass = 41;
2.
Select the code you added in the previous step and click Execute. Then view the results.
3.
Close SQL Server Management Studio, SQL Server Profiler, and the Database Engine Tuning Advisor without saving any files.
Results: After this exercise, you should have captured a trace using SQL Trace.
Module 9: Managing SQL Server Security
Lab: Managing SQL Server Security
Exercise 1: Managing Server-Level Security
Task 1: Prepare the Lab Environment
1.
Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2.
In the D:\Labfiles\Lab09\Starter folder, right-click Setup.cmd and then click Run as administrator.
3.
Click Yes when prompted to confirm that you want to run the command file, and wait for the script to finish.
Task 2: Verify the Authentication Mode
1.
Start SQL Server Management Studio, and connect to the MIA-SQL database engine using Windows authentication.
2.
In Object Explorer, right-click the MIA-SQL instance and click Properties.
3.
In the Server Properties – MIA-SQL dialog box, on the Security page, verify that SQL Server and Windows Authentication mode is selected. Then click Cancel.
Task 3: Create Logins
1.
In SQL Server Management Studio, expand the MIA-SQL instance, expand Security, and expand Logins to view the existing logins in the instance.
2.
Open the CreateLogins.sql script in the D:\Labfiles\Lab09\Solution folder.
3.
Review the script, noting that it creates the following logins (a plausible T-SQL sketch follows this task):
o [ADVENTUREWORKS\Database_Managers]
o [ADVENTUREWORKS\WebApplicationSvc]
o [ADVENTUREWORKS\InternetSales_Users]
o [ADVENTUREWORKS\InternetSales_Managers]
o Marketing_Application
4.
Click Execute. Then, when the script has completed successfully, in Object Explorer, right-click Logins and click Refresh to verify that the logins have been created.
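The following is a plausible sketch of the statements CreateLogins.sql contains; the Solution script may differ in detail:
CREATE LOGIN [ADVENTUREWORKS\Database_Managers] FROM WINDOWS;
CREATE LOGIN [ADVENTUREWORKS\WebApplicationSvc] FROM WINDOWS;
CREATE LOGIN [ADVENTUREWORKS\InternetSales_Users] FROM WINDOWS;
CREATE LOGIN [ADVENTUREWORKS\InternetSales_Managers] FROM WINDOWS;
CREATE LOGIN Marketing_Application WITH PASSWORD = 'Pa$$w0rd'; -- password assumed; a later task changes it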
Task 4: Manage Server-Level Roles
1.
In SQL Server Management Studio, under Security, expand Server Roles to view the existing roles in the instance.
2.
Open the ServerRoles.sql script in the D:\Labfiles\Lab09\Solution folder.
3.
Review the script, noting that it performs the following actions (a plausible T-SQL sketch follows this task):
o Creates a new server-level role named application_admin.
o Adds the [ADVENTUREWORKS\Database_Managers] login to the application_admin role.
o Grants ALTER ANY LOGIN and VIEW ANY DATABASE permissions to the application_admin role.
4.
Click Execute. Then, when the script has completed successfully, in Object Explorer, right-click Server Roles and click Refresh to verify that the new role has been created.
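A plausible sketch of ServerRoles.sql (the Solution script may differ):
CREATE SERVER ROLE application_admin;
ALTER SERVER ROLE application_admin ADD MEMBER [ADVENTUREWORKS\Database_Managers];
GRANT ALTER ANY LOGIN TO application_admin;
GRANT VIEW ANY DATABASE TO application_admin;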
Results: After this exercise, the authentication mode for the MIA-SQL SQL Server instance should support the scenario requirements, you should have created the required logins and server-level roles, and you should have granted the required server-level permissions.
Exercise 2: Managing Database-Level Security
Task 1: Create Database Users
1.
In Object Explorer, expand Databases, expand the InternetSales database, and expand its Security folder. Then expand the Users folder and view the users currently defined in the database.
2.
Open the CreateUsers.sql script in the D:\Labfiles\Lab09\Solution folder.
3.
Review the script, noting that it creates the following users (a plausible T-SQL sketch follows this task):
o Marketing_Application
o WebApplicationSvc
o InternetSales_Users
o InternetSales_Managers
o Database_Managers
4.
Click Execute. Then, when the script has completed successfully, in Object Explorer, right-click Users and click Refresh to verify that the new users have been created.
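A plausible sketch of CreateUsers.sql (the Solution script may differ):
USE InternetSales;
CREATE USER Marketing_Application FOR LOGIN Marketing_Application;
CREATE USER WebApplicationSvc FOR LOGIN [ADVENTUREWORKS\WebApplicationSvc];
CREATE USER InternetSales_Users FOR LOGIN [ADVENTUREWORKS\InternetSales_Users];
CREATE USER InternetSales_Managers FOR LOGIN [ADVENTUREWORKS\InternetSales_Managers];
CREATE USER Database_Managers FOR LOGIN [ADVENTUREWORKS\Database_Managers];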
Task 2: Manage Database Roles
1.
In Object Explorer, expand the Roles folder, and then expand the Database Roles folder and view the database roles in the database.
2.
Open the DatabaseRoles.sql script in the D:\Labfiles\Lab09\Solution folder.
3.
Review the script, noting that it performs the following actions (a plausible T-SQL sketch follows this task):
o Creates database roles named sales_reader, sales_writer, customers_reader, products_reader, and web_application.
o Adds the Database_Managers user to the db_securityadmin fixed database-level role.
o Adds the InternetSales_Users and InternetSales_Managers users to the sales_reader role.
o Adds the InternetSales_Managers user to the sales_writer role.
o Adds the InternetSales_Users, InternetSales_Managers, and Marketing_Application users to the customers_reader role.
o Adds the InternetSales_Managers and Marketing_Application users to the products_reader role.
o Adds the WebApplicationSvc user to the web_application role.
o Creates an application role named sales_admin.
4.
Click Execute. Then, when the script has completed successfully, in Object Explorer, right-click Database Roles and click Refresh, and then expand Application Roles to verify that the new roles have been created.
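A plausible sketch of DatabaseRoles.sql; the Solution script may differ, and the application role password is taken from the sp_setapprole call used later in this lab:
USE InternetSales;
CREATE ROLE sales_reader;
CREATE ROLE sales_writer;
CREATE ROLE customers_reader;
CREATE ROLE products_reader;
CREATE ROLE web_application;
ALTER ROLE db_securityadmin ADD MEMBER Database_Managers;
ALTER ROLE sales_reader ADD MEMBER InternetSales_Users;
ALTER ROLE sales_reader ADD MEMBER InternetSales_Managers;
ALTER ROLE sales_writer ADD MEMBER InternetSales_Managers;
ALTER ROLE customers_reader ADD MEMBER InternetSales_Users;
ALTER ROLE customers_reader ADD MEMBER InternetSales_Managers;
ALTER ROLE customers_reader ADD MEMBER Marketing_Application;
ALTER ROLE products_reader ADD MEMBER InternetSales_Managers;
ALTER ROLE products_reader ADD MEMBER Marketing_Application;
ALTER ROLE web_application ADD MEMBER WebApplicationSvc;
CREATE APPLICATION ROLE sales_admin WITH PASSWORD = 'Pa$$w0rd';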
Task 3: Assign Permissions
1.
Open the DatabasePermissions.sql script in the D:\Labfiles\Lab09\Solution folder.
2.
Review the script, noting that it grants the following permissions (a plausible T-SQL sketch follows this task):
o SELECT on the Sales schema to sales_reader.
o INSERT, UPDATE, and EXECUTE on the Sales schema to sales_writer.
o SELECT, INSERT, UPDATE, DELETE, and EXECUTE on the Sales schema to sales_admin.
o SELECT on the Customers schema to customers_reader.
o SELECT on the Products schema to products_reader.
o EXECUTE on the Products schema to InternetSales_Managers.
o INSERT on Sales.SalesOrderHeader to web_application.
o INSERT on Sales.SalesOrderDetail to web_application.
o SELECT on Products.vProductCatalog to web_application.
3.
Click Execute.
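A plausible sketch of DatabasePermissions.sql (the Solution script may differ):
USE InternetSales;
GRANT SELECT ON SCHEMA::Sales TO sales_reader;
GRANT INSERT, UPDATE, EXECUTE ON SCHEMA::Sales TO sales_writer;
GRANT SELECT, INSERT, UPDATE, DELETE, EXECUTE ON SCHEMA::Sales TO sales_admin;
GRANT SELECT ON SCHEMA::Customers TO customers_reader;
GRANT SELECT ON SCHEMA::Products TO products_reader;
GRANT EXECUTE ON SCHEMA::Products TO InternetSales_Managers;
GRANT INSERT ON Sales.SalesOrderHeader TO web_application;
GRANT INSERT ON Sales.SalesOrderDetail TO web_application;
GRANT SELECT ON Products.vProductCatalog TO web_application;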
Results: After this exercise, you should have created the required database users and database-level roles, and assigned appropriate permissions.
Exercise 3: Testing Database Access
Task 1: Test IT Support Permissions
1.
Minimize SQL Server Management Studio and open a command prompt.
2.
In the command prompt window, enter the following command (which opens the sqlcmd utility as ADVENTUREWORKS\AnthonyFrizzell):
runas /user:adventureworks\anthonyfrizzell /noprofile sqlcmd
3.
When you are prompted for a password, enter Pa$$w0rd.
4.
In the SQLCMD window, enter the following commands to verify your identity:
SELECT suser_name();
GO
5.
Note that SQL Server identifies Windows group logins using their individual user account, even though there is no individual login for that user. ADVENTUREWORKS\AnthonyFrizzell is a member of the ADVENTUREWORKS\IT_Support global group, which is in turn a member of the ADVENTUREWORKS\Database_Managers domain local group for which you created a login.
6.
In the SQLCMD window, enter the following commands to alter the password of the Marketing_Application login:
ALTER LOGIN Marketing_Application WITH PASSWORD = 'NewPa$$w0rd';
GO
7.
In the SQLCMD window, enter the following commands to disable the ADVENTUREWORKS\WebApplicationSvc login:
ALTER LOGIN [ADVENTUREWORKS\WebApplicationSvc] DISABLE;
GO
8.
Close the SQLCMD window and maximize SQL Server Management Studio.
9.
In Object Explorer, right-click the Logins folder and click Refresh. Then right-click the ADVENTUREWORKS\WebApplicationSvc login and click Properties.
10. In the Login Properties - ADVENTUREWORKS\WebApplicationSvc dialog box, on the Status page, select Enabled and click OK to re-enable the login.
Task 2: Test Marketing Application Permissions
1.
In SQL Server Management Studio, click New Query.
2.
In the new query window, enter the following Transact-SQL code to impersonate the Marketing_Application login.
EXECUTE AS LOGIN = 'Marketing_Application';
GO
SELECT suser_name();
GO
3.
Click Execute and verify that the connection is executing in the context of the Marketing_Application login.
4.
Enter the following Transact-SQL code under the previous code:
USE InternetSales;
SELECT * FROM sys.fn_my_permissions('Customers.Customer', 'object');
GO
5.
Select the code you just entered and click Execute to view the effective permissions for Marketing_Application on the Customers.Customer table.
6.
Enter the following Transact-SQL code under the previous code:
SELECT * FROM Customers.Customer;
GO
7.
Select the code you just entered and click Execute to verify that the user can query the Customers.Customer table.
8.
Enter the following Transact-SQL code under the previous code:
UPDATE Customers.Customer SET EmailAddress = NULL WHERE CustomerID = 1;
GO
9.
Select the code you just entered and click Execute to verify that the user does not have UPDATE permission on the Customers.Customer table.
10. Enter the following Transact-SQL code under the previous code:
SELECT * FROM Products.Product;
GO
11. Select the code you just entered and click Execute to verify that the user can query the Products.Product table.
12. Enter the following Transact-SQL code under the previous code:
SELECT * FROM Sales.SalesOrderHeader;
GO
13. Select the code you just entered and click Execute to verify that the user does not have SELECT permission on the Sales.SalesOrderHeader table.
14. Close SQL Server Management Studio without saving any files.
Task 3: Test Web Application Permissions
1.
In the command prompt window, enter the following command to run sqlcmd as ADVENTUREWORKS\WebApplicationSvc:
runas /user:adventureworks\webapplicationsvc /noprofile sqlcmd
2.
When you are prompted for a password, enter Pa$$w0rd.
3.
In the SQLCMD window, enter the following commands to query the Products.vProductCatalog view:
SELECT ProductName, ListPrice FROM Products.vProductCatalog;
GO
4.
Verify that the user can query the Products.vProductCatalog view.
5.
In the SQLCMD window, enter the following commands to query the Products.Product table:
SELECT * FROM Products.Product;
GO
6.
Verify that the user does not have SELECT permission on the Products.Product table.
7.
Close the SQLCMD window.
Task 4: Test Sales Employee Permissions
1.
In the command prompt window, enter the following command to run sqlcmd as ADVENTUREWORKS\DanDrayton. This user is a member of the ADVENTUREWORKS\Sales_NorthAmerica global group, which is in turn a member of the ADVENTUREWORKS\InternetSales_Users domain local group.
runas /user:adventureworks\dandrayton /noprofile sqlcmd
2.
When you are prompted for a password, enter Pa$$w0rd.
3.
In the SQLCMD window, enter the following commands to query the Sales.SalesOrderHeader table:
SELECT SalesOrderNumber, TotalDue FROM Sales.SalesOrderHeader;
GO
4.
Verify that the user can query the Sales.SalesOrderHeader table.
5.
In the SQLCMD window, enter the following commands to update the Sales.SalesOrderHeader table:
UPDATE Sales.SalesOrderHeader SET ShipDate = getdate() WHERE SalesOrderID = 45024;
GO
6.
Verify that the user does not have UPDATE permission on the Sales.SalesOrderHeader table.
7.
Close the SQLCMD window.
Task 5: Test Sales Manager Permissions
1.
In the command prompt window, enter the following command to run sqlcmd as ADVENTUREWORKS\DeannaBall. This user is a member of the ADVENTUREWORKS\Sales_Managers global group, which is in turn a member of the ADVENTUREWORKS\InternetSales_Managers domain local group.
runas /user:adventureworks\deannaball /noprofile sqlcmd
2.
When you are prompted for a password, enter Pa$$w0rd.
3.
In the SQLCMD window, enter the following commands to query the Sales.SalesOrderHeader table:
SELECT SalesOrderNumber, TotalDue FROM Sales.SalesOrderHeader;
GO
4.
Verify that the user can query the Sales.SalesOrderHeader table.
5.
In the SQLCMD window, enter the following commands to update the Sales.SalesOrderHeader table:
UPDATE Sales.SalesOrderHeader SET ShipDate = getdate() WHERE SalesOrderID = 45024;
GO
6.
Verify that the user can update the Sales.SalesOrderHeader table.
7.
In the SQLCMD window, enter the following commands to update the Products.Product table:
UPDATE Products.Product SET ListPrice = 1999.00 WHERE ProductID = 1;
GO
8.
Verify that the user cannot update the Products.Product table.
9.
In the SQLCMD window, enter the following commands to call the Products.ChangeProductPrice stored procedure:
EXEC Products.ChangeProductPrice 1, 1999.00;
GO
10. Verify that one row is affected (because the user has EXECUTE permission on the Products.ChangeProductPrice stored procedure).
11. In the SQLCMD window, enter the following commands to delete a row from the Sales.SalesOrderDetail table:
DELETE Sales.SalesOrderDetail WHERE SalesOrderDetailID = 37747;
GO
12. Verify that the user cannot delete rows from the Sales.SalesOrderDetail table.
13. In the SQLCMD window, enter the following commands to activate the sales_admin application role and delete a row from the Sales.SalesOrderDetail table:
EXEC sp_setapprole 'sales_admin', 'Pa$$w0rd';
GO
DELETE Sales.SalesOrderDetail WHERE SalesOrderDetailID = 37747;
GO
14. Verify that one row is affected. This is possible because the sales_admin application role has DELETE permission on the Sales schema.
15. Close the SQLCMD window.
Results: After this exercise, you should have verified effective permissions in the MIA-SQL instance and the InternetSales database.
Module 10: Auditing Data Access and Encrypting Data
Lab: Auditing Data Access and Encrypting Data
Exercise 1: Implementing Auditing
Task 1: Prepare the Lab Environment
1.
Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2.
In the D:\Labfiles\Lab10\Starter folder, right-click Setup.cmd and then click Run as administrator.
3.
Click Yes when prompted to confirm that you want to run the command file, and then wait for the script to finish.
Task 2: Create an Audit
1.
Start SQL Server Management Studio, and connect to the MIA-SQL database engine instance using Windows authentication.
2.
In SQL Server Management Studio, open the Audit.sql script file in the D:\Labfiles\Lab10\Solution folder.
3.
Select the code under the comment Create an audit, and click Execute. This creates an audit that logs events to files in D:\Labfiles\Lab10\Starter\Audits.
4. In Object Explorer, expand Security, and expand Audits (if Audits is not expandable, refresh it and try again).
5. Double-click the AW_Audit audit you created and view its properties. Then click Cancel.
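The Create an audit code is provided in the lab's solution script rather than printed in these steps; a minimal sketch of what it likely contains (any audit options beyond the file path are assumptions) is:

   -- Create a file-based server audit and start it.
   CREATE SERVER AUDIT AW_Audit
   TO FILE (FILEPATH = 'D:\Labfiles\Lab10\Starter\Audits');
   GO
   ALTER SERVER AUDIT AW_Audit WITH (STATE = ON);
   GO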
Task 3: Create a Server Audit Specification
1. In SQL Server Management Studio, in the Audit.sql script, select the code under the comment Create a server audit specification and click Execute. This creates an audit specification for the AW_Audit audit that logs failed and successful login attempts (see the sketch at the end of this task).
2. In Object Explorer, refresh the Server Audit Specifications folder and expand it. Then double-click AW_ServerAuditSpec, view its properties, and click Cancel.
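A server audit specification that captures failed and successful logins, as described in step 1, likely looks like this:

   CREATE SERVER AUDIT SPECIFICATION AW_ServerAuditSpec
   FOR SERVER AUDIT AW_Audit
       ADD (FAILED_LOGIN_GROUP),
       ADD (SUCCESSFUL_LOGIN_GROUP)
   WITH (STATE = ON);
   GO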
Task 4: Create a Database Audit Specification
1. In SQL Server Management Studio, in the Audit.sql script, select the code under the comment Create a database audit specification and click Execute. This creates an audit specification for the AW_Audit audit that logs user-defined audit events in the InternetSales database, as well as specific actions by the customers_reader database role and the sales_admin application role in the Customers schema (see the sketch at the end of this task).
2. In Object Explorer, expand Databases, expand InternetSales, expand Security, and expand Database Audit Specifications.
3. Double-click AW_DatabaseAuditSpec and view its properties. Then click Cancel.
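A database audit specification matching the description in step 1 likely follows the pattern below; the specific audited actions shown here are assumptions, because the step does not list them:

   USE InternetSales;
   GO
   CREATE DATABASE AUDIT SPECIFICATION AW_DatabaseAuditSpec
   FOR SERVER AUDIT AW_Audit
       ADD (USER_DEFINED_AUDIT_GROUP),
       ADD (SELECT ON SCHEMA::Customers BY customers_reader),
       ADD (SELECT, UPDATE, DELETE ON SCHEMA::Customers BY sales_admin)
   WITH (STATE = ON);
   GO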
Task 5: Implement a Custom Action
1. In the Audit.sql script, select the code under the comment Create a trigger for a user-defined action and click Execute. This creates a trigger that writes a custom audit event when the EmailAddress column in the Customers.Customer table is updated (a sketch of such a trigger appears at the end of this task).
2. Select the code under the comment Grant permission on sp_audit_write and click Execute. This grants EXECUTE permission on the sp_audit_write stored procedure to the public role in the master database.
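A trigger that raises a user-defined audit event when the EmailAddress column changes might look like the following. The trigger name is hypothetical, but sp_audit_write with these parameters is the documented way to write USER_DEFINED_AUDIT_GROUP events:

   USE InternetSales;
   GO
   -- Hypothetical trigger name; fires after updates to Customers.Customer.
   CREATE TRIGGER Customers.trg_AuditEmailChange
   ON Customers.Customer AFTER UPDATE
   AS
   IF UPDATE(EmailAddress)
       EXEC sp_audit_write @user_defined_event_id = 1,
                           @succeeded = 1,
                           @user_defined_information = N'Customer email address updated';
   GO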
Task 6: View Audited Events
1. Open a command prompt and enter the following command to run sqlcmd as ADVENTUREWORKS\VictoriaGray. This user is a member of the ADVENTUREWORKS\Sales_Europe global group, which in turn is a member of the ADVENTUREWORKS\InternetSales_Users domain local group.

   runas /user:adventureworks\victoriagray /noprofile sqlcmd
2. When prompted for a password, enter Pa$$w0rd.
3. In the SQLCMD window, enter the following commands to query the Customers.Customer table:

   SELECT FirstName, LastName, EmailAddress FROM Customers.Customer;
   GO
4. In the SQLCMD window, enter the following commands to activate the sales_admin application role and update the Customers.Customer table:

   EXEC sp_setapprole 'sales_admin', 'Pa$$w0rd';
   UPDATE Customers.Customer SET EmailAddress = '[email protected]' WHERE CustomerID = 1699;
   GO
5. Close the SQLCMD and command prompt windows.
6. In the D:\Labfiles\Lab10\Starter\Audits folder, verify that an audit file has been created.
7. In SQL Server Management Studio, in the Audit.sql script, select the code under the comment View audited events and click Execute. This queries the files in the audit folder and displays the audited events (events logged for the Student user and the service account for SQL Server have been excluded to simplify the results; a sketch of such a query appears at the end of this task).
8. Note that all events are logged with the server principal name ADVENTUREWORKS\VictoriaGray despite the fact that this user accesses SQL Server through membership of a Windows group and does not have an individual login. This identity is audited even when executing statements in the security context of an application role.
9. Keep SQL Server Management Studio open for the next exercise.
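The View audited events query in step 7 presumably reads the audit files with the built-in sys.fn_get_audit_file function; a simplified version (the filter shown is an assumption) is:

   SELECT event_time, server_principal_name, database_principal_name, action_id, statement
   FROM sys.fn_get_audit_file('D:\Labfiles\Lab10\Starter\Audits\*', DEFAULT, DEFAULT)
   WHERE server_principal_name = 'ADVENTUREWORKS\VictoriaGray';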
Results: After this exercise, you should have created an audit, a server audit specification, and a database audit specification.
Exercise 2: Implementing Transparent Database Encryption
Task 1: Create Encryption Keys
1. In SQL Server Management Studio, open the TDE.sql script file in the D:\Labfiles\Lab10\Solution folder.
2. Select the code under the comment Create DMK and click Execute. This creates a database master key in the master database.
3. Select the code under the comment Create server certificate and click Execute. This creates a certificate.
4. Select the code under the comment Back up the certificate and click Execute. This backs up the certificate and its private key.
5. Select the code under the comment Create DEK and click Execute. This creates a database encryption key in the HumanResources database.
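The TDE.sql statements for this task probably follow the pattern below; the certificate name, file paths, and passwords are placeholders rather than the lab's actual values:

   USE master;
   CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
   CREATE CERTIFICATE TDE_Certificate WITH SUBJECT = 'TDE certificate';
   BACKUP CERTIFICATE TDE_Certificate
       TO FILE = 'D:\Labfiles\Lab10\Starter\TDE_Certificate.cer'
       WITH PRIVATE KEY (FILE = 'D:\Labfiles\Lab10\Starter\TDE_Certificate.pvk',
                         ENCRYPTION BY PASSWORD = '<strong password>');
   GO
   USE HumanResources;
   CREATE DATABASE ENCRYPTION KEY
       WITH ALGORITHM = AES_256
       ENCRYPTION BY SERVER CERTIFICATE TDE_Certificate;
   GO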
Task 2: Enable Database Encryption
1. In the TDE.sql script, select the code under the comment Enable encryption and click Execute. This enables encryption for the HumanResources database.
2. Select the code under the comment Verify encryption status and click Execute. This retrieves the database encryption status from the sys.databases catalog view in the master database.
3. Review the query results, and verify that the is_encrypted value for HumanResources is 1.
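The Enable encryption and Verify encryption status statements likely resemble:

   ALTER DATABASE HumanResources SET ENCRYPTION ON;

   SELECT name, is_encrypted FROM sys.databases WHERE name = 'HumanResources';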
Task 3: Move the Database
1. In Object Explorer, in the Databases folder, right-click HumanResources, point to Tasks, and click Detach. Then, in the Detach Database dialog box, click OK.
2. In Object Explorer, in the Connect drop-down list, click Database Engine. Then connect to MIA-SQL\SQL2 using Windows authentication.
3. In Object Explorer, under the MIA-SQL\SQL2 instance, right-click Databases and click Attach.
4. In the Attach Database dialog box, click Add, select the HumanResources.mdf file in the M:\Data folder, and click OK. Then click OK on the error message that is displayed because the certificate with which the database encryption key is protected does not exist on the MIA-SQL\SQL2 instance.
5. Open the MoveDB.sql script file in the D:\Labfiles\Lab10\Solution folder.
6. Select the code under the comment Create a database master key and click Execute. This creates a database master key in the master database on MIA-SQL\SQL2.
7. Select the code under the comment Create certificate from backup and click Execute. This creates a certificate in the master database on MIA-SQL\SQL2 from the backup files you created previously.
8. Select the code under the comment Attach database and click Execute. This attaches the HumanResources database on MIA-SQL\SQL2.
9. Select the code under the comment Test database and click Execute. This queries the Employees.Employee table in the HumanResources database.
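The statements behind steps 6 through 8 likely resemble the following; the file names and passwords are placeholders that must match the backup created earlier, and the log file may also need to be listed in the attach statement:

   USE master;
   CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
   CREATE CERTIFICATE TDE_Certificate
       FROM FILE = 'D:\Labfiles\Lab10\Starter\TDE_Certificate.cer'
       WITH PRIVATE KEY (FILE = 'D:\Labfiles\Lab10\Starter\TDE_Certificate.pvk',
                         DECRYPTION BY PASSWORD = '<strong password>');
   GO
   CREATE DATABASE HumanResources
       ON (FILENAME = 'M:\Data\HumanResources.mdf')
       FOR ATTACH;
   GO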
10. Review the query results. Then close SQL Server Management Studio without saving any files.
Results: After completing this exercise, you should have configured TDE and moved the encrypted HumanResources database to another instance of SQL Server.
Module 11: Performing Ongoing Database Maintenance
Lab: Performing Ongoing Database Maintenance
Exercise 1: Managing Database Integrity
Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Labfiles\Lab11\Starter folder, right-click the Setup.cmd file and then click Run as administrator.
3. Click Yes when prompted to confirm that you want to run the command file, and wait for the script to finish.
Task 2: Check Database Consistency
1. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using Windows authentication.
2. Open the DBCCCheckDB.sql script file in the D:\Labfiles\Lab11\Solution folder.
3. Select the code under the comment Check AWDataWarehouse and click Execute. This checks the integrity of the AWDataWarehouse database.
4. Select the code under the comment Check HumanResources and click Execute. This checks the integrity of the HumanResources database.
5. Select the code under the comment Check InternetSales and click Execute. This checks the integrity of the InternetSales database and identifies some consistency errors in the dbo.Orders table in this database. The last line of output tells you the minimum repair level required.
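Each of these checks is a DBCC CHECKDB call; for example (the NO_INFOMSGS option, which suppresses informational messages, is an assumption):

   DBCC CHECKDB (InternetSales) WITH NO_INFOMSGS;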
Task 3: Repair a Corrupt Database
1. In SQL Server Management Studio, in the DBCCCheckDB.sql script, select the code under the comment Repair the database and click Execute. This repairs the InternetSales database.
2. Select the code under the comment Check the internal database structure and click Execute. No error messages are displayed, indicating that the database structure is now consistent.
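Repairing a database requires single-user mode, so the Repair the database code probably looks something like this. REPAIR_ALLOW_DATA_LOSS is shown here as an assumption; it should only be used when DBCC CHECKDB reports it as the minimum repair level, because it can discard data:

   ALTER DATABASE InternetSales SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
   DBCC CHECKDB (InternetSales, REPAIR_ALLOW_DATA_LOSS);
   ALTER DATABASE InternetSales SET MULTI_USER;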
Results: After this exercise, you should have used the DBCC CHECKDB command to check database consistency, and corrected any issues that were found.
Exercise 2: Managing Index Fragmentation
Task 1: View Index Fragmentation
1. In SQL Server Management Studio, open the MaintainingIndexes.sql script file in the D:\Labfiles\Lab11\Solution folder.
2. Select the code under the comment Check fragmentation and click Execute.
3. In the results, note the avg_page_space_used_in_percent and avg_fragmentation_in_percent values for each index level.
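The Check fragmentation query is most likely based on the sys.dm_db_index_physical_stats dynamic management function; a version scoped to the dbo.Orders table (the table name is an assumption carried over from the previous exercise) is:

   SELECT index_id, index_level,
          avg_page_space_used_in_percent, avg_fragmentation_in_percent
   FROM sys.dm_db_index_physical_stats(
            DB_ID('InternetSales'), OBJECT_ID('InternetSales.dbo.Orders'),
            NULL, NULL, 'DETAILED');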
Task 2: Defragment Indexes
1. In SQL Server Management Studio, in the MaintainingIndexes.sql script, select the code under the comment Rebuild the indexes and click Execute. This rebuilds the indexes on the table.
2. Select the code under the comment Check fragmentation again and click Execute.
3. In the results, note the avg_page_space_used_in_percent and avg_fragmentation_in_percent values for each index level.
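The rebuild itself can be as simple as a single statement (again assuming dbo.Orders is the target table); rebuilding recreates each index, which removes fragmentation and restores the fill factor:

   ALTER INDEX ALL ON InternetSales.dbo.Orders REBUILD;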
Results: After this exercise, you should have rebuilt fragmented indexes.
Exercise 3: Implementing a Maintenance Plan
Task 1: Create a Maintenance Plan
1. In Object Explorer, under MIA-SQL, expand Management, right-click Maintenance Plans, and click Maintenance Plan Wizard.
2. In the Maintenance Plan Wizard window, click Next.
3. In the Select Plan Properties window, in the Name textbox, type HumanResources Maintenance. Note the available scheduling options and click Change.
4. In the New Job Schedule window, in the Name textbox, type Daily. In the Occurs drop-down list, click Daily. In the Occurs once at textbox, change the time to 6:00 PM, and click OK.
5. In the Select Plan Properties window, click Next. Then, on the Select Maintenance Tasks page, select the following tasks and click Next:
   o Check Database Integrity
   o Reorganize Index
   o Update Statistics
   o Back Up Database (Full)
6. On the Select Maintenance Task Order page, click Next.
7. On the Define Database Check Integrity Task page, select the HumanResources database and click OK. Then click Next.
8. On the Define Reorganize Index Task page, select the HumanResources database and click OK, ensure that Tables and Views is selected, and click Next.
9. On the Define Update Statistics Task page, select the HumanResources database and click OK, ensure that Tables and Views is selected, and click Next.
10. On the Define Back Up Database (Full) Task page, select the HumanResources database and click OK. Then, on the Destination tab, ensure that Create a backup file for every database is selected, change the Folder value to R:\Backups\, and click Next.
11. On the Select Report Options page, ensure that Write a report to a text file is selected, change the Folder location to D:\Labfiles\Lab11\Starter and click Next.
12. On the Complete the Wizard page, click Finish. Then when the operation has completed, click Close.
Task 2: Run a Maintenance Plan
1. In Object Explorer, under Maintenance Plans, right-click HumanResources Maintenance and click Execute.
2. Wait a minute or so until the maintenance plan succeeds, and in the Execute Maintenance Plan dialog box, click Close. Then right-click HumanResources Maintenance and click View History.
3. In the Log File Viewer - MIA-SQL dialog box, expand the Date value for the most recent run of the HumanResources Maintenance plan to see the individual tasks.
4. Keep clicking Refresh and expanding the tasks until four tasks have been completed. Then click Close.
5. In the D:\Labfiles\Lab11\Starter folder, view the HumanResources Maintenance_Subplan_1_xxxxx.txt file that has been created.
6. In the R:\Backups\ folder, verify that a backup of the HumanResources database has been created.
Results: After this exercise, you should have created the required database maintenance plan.
Module 12: Automating SQL Server 2014 Management
Lab: Automating SQL Server Management
Exercise 1: Creating a Job
Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Labfiles\Lab12\Starter folder, right-click the Setup.cmd file and then click Run as administrator.
3. Click Yes when prompted to confirm that you want to run the command file, and wait for the script to finish.
Task 2: Create a Job
1. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using Windows authentication.
2. In Object Explorer, expand SQL Server Agent and Jobs to view any existing jobs. Then right-click Jobs and click New Job.
3. In the New Job dialog box, on the General page, in the Name box, type Backup HumanResources.
4. In the New Job dialog box, on the Steps page, click New.
5. In the New Job Step dialog box, on the General page, in the Step name box, type Back Up Database. Then ensure that Transact-SQL script (T-SQL) is selected in the Type drop-down list, select HumanResources in the Database drop-down list, and in the Command area, type the following command:

   BACKUP DATABASE HumanResources TO DISK = 'R:\Backups\HumanResources.bak';
6. In the New Job Step dialog box, on the Advanced page, in the Output file box, type D:\Labfiles\Lab12\Starter\BackupLog.txt. Then click OK.
7. In the New Job dialog box, on the Steps page, click New.
8. In the New Job Step dialog box, on the General page, in the Step name box, type Copy Backup File. Then ensure that Operating system (CmdExec) is selected in the Type drop-down list, and in the Command area, type the following command, which copies the backup file to the D:\Labfiles\Lab12\Starter folder:

   Copy R:\Backups\HumanResources.bak D:\Labfiles\Lab12\Starter\HumanResources.bak /Y
9. In the New Job Step dialog box, click OK.
10. In the New Job dialog box, on the Steps page, verify that the Start step is set to 1:Back Up Database and note the On Success and On Failure actions for the steps in the job.
11. In the New Job dialog box, click OK. Then verify that the job appears in the Jobs folder in Object Explorer.
Task 3: Test the Job
1. In SQL Server Management Studio, in Object Explorer, right-click the Backup HumanResources job and click Start Job at Step.
2. In the Start Job on 'MIA-SQL' dialog box, ensure that step 1 (Back Up Database) is selected, and click Start. Then, when the job has completed successfully, click Close.
3. In Object Explorer, right-click the Backup HumanResources job and click View History.
4. In the Log File Viewer - MIA-SQL dialog box, expand the date for the most recent instance of the job, and note that all steps succeeded. Then click Close.
5. View the contents of the D:\Labfiles\Lab12\Starter folder and verify that it contains a text file named BackupLog.txt and a backup file named HumanResources.bak.
Task 4: Generate a Script for the Job
1. In Object Explorer, right-click the Backup HumanResources job, point to Script Job as, point to CREATE To, and click New Query Editor Window. This generates the Transact-SQL code necessary to create the job.
2. Save the Transact-SQL script as Create Backup HumanResources.sql in the D:\Labfiles\Lab12\Starter folder.
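The generated script calls the msdb job stored procedures; a heavily trimmed sketch of the pattern you should see in it is below (the real script includes many more parameters):

   USE msdb;
   EXEC dbo.sp_add_job @job_name = N'Backup HumanResources';
   EXEC dbo.sp_add_jobstep @job_name = N'Backup HumanResources',
       @step_name = N'Back Up Database', @subsystem = N'TSQL',
       @database_name = N'HumanResources',
       @command = N'BACKUP DATABASE HumanResources TO DISK = ''R:\Backups\HumanResources.bak'';',
       @on_success_action = 3;  -- 3 = go to the next step
   EXEC dbo.sp_add_jobstep @job_name = N'Backup HumanResources',
       @step_name = N'Copy Backup File', @subsystem = N'CmdExec',
       @command = N'Copy R:\Backups\HumanResources.bak D:\Labfiles\Lab12\Starter\HumanResources.bak /Y';
   EXEC dbo.sp_add_jobserver @job_name = N'Backup HumanResources', @server_name = N'(local)';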
Results: After this exercise, you should have created a job named Backup HumanResources.
Exercise 2: Scheduling a Job
Task 1: Add a Schedule to the Job
1. Right-click the taskbar and click Properties. Then, in the Taskbar and Navigation properties dialog box, next to Notification area, click Customize.
2. In the Notification Area Icons window, click Turn system icons on or off. Then, in the System Icons window, set the behavior for the Clock system icon to On and click OK.
3. Click OK to close the Notification Area Icons window, and click OK again to close the Taskbar and Navigation properties dialog box.
4. Note the time on the clock. This may not be correct for your geographical location.
5. In SQL Server Management Studio, in Object Explorer, double-click the Backup HumanResources job.
6. In the Job Properties - Backup HumanResources dialog box, on the Schedules page, click New.
7. In the New Job Schedule dialog box, in the Name box, type Daily Backup. In the Frequency area, in the Occurs list, select Daily; ensure that the Occurs once at option is selected; and set the time to one minute from the current system time as shown on the clock in the notification area. Then click OK.
8. In the Job Properties - Backup HumanResources dialog box, click OK. Then wait until the system clock shows the scheduled time.
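The same schedule can be created in Transact-SQL with msdb.dbo.sp_add_jobschedule; the start time below is a placeholder for the time you chose:

   EXEC msdb.dbo.sp_add_jobschedule
       @job_name = N'Backup HumanResources',
       @name = N'Daily Backup',
       @freq_type = 4,               -- 4 = daily
       @freq_interval = 1,           -- every 1 day
       @active_start_time = 180000;  -- HHMMSS, here 6:00 PM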
Task 2: Verify Scheduled Job Execution
1. In Object Explorer, double-click Job Activity Monitor and note the Status of the Backup HumanResources job.
2. If the job is still running, click Refresh until the Status changes to Idle.
3. Verify that the Last Run Outcome for the job is Succeeded, and that the Last Run time is the time that you scheduled previously. Then click Close to close the Job Activity Monitor.
Results: After this exercise, you should have created a schedule for the Backup HumanResources job.
Exercise 3: Configuring Job Step Security Contexts
Task 1: Create a Credential
1. In SQL Server Management Studio, in Object Explorer, under MIA-SQL, expand Security. Then right-click Credentials and click New Credential.
2. In the New Credential dialog box, enter the following details and click OK:
   o Credential name: FileAgent_Credential
   o Identity: MIA-SQL\FileAgent
   o Password: Pa$$w0rd
   o Confirm password: Pa$$w0rd
Task 2: Create a Proxy Account
1. In Object Explorer, under SQL Server Agent, expand Proxies.
2. Right-click Operating System (CmdExec) and click New Proxy.
3. In the New Proxy Account dialog box, in the Proxy name box, type FileAgent_Proxy, and in the Credential name box, type FileAgent_Credential. Then ensure that only the Operating system (CmdExec) subsystem is selected and click OK.
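The credential and proxy can also be created in Transact-SQL, which makes the relationship between them explicit:

   CREATE CREDENTIAL FileAgent_Credential
       WITH IDENTITY = 'MIA-SQL\FileAgent', SECRET = 'Pa$$w0rd';

   EXEC msdb.dbo.sp_add_proxy
       @proxy_name = N'FileAgent_Proxy',
       @credential_name = N'FileAgent_Credential';

   -- Allow the proxy to run Operating system (CmdExec) job steps.
   EXEC msdb.dbo.sp_grant_proxy_to_subsystem
       @proxy_name = N'FileAgent_Proxy',
       @subsystem_name = N'CmdExec';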
Task 3: Configure Job Step Security Contexts
1. In Object Explorer, under SQL Server Agent, in the Jobs folder, double-click the Backup HumanResources job.
2. In the Job Properties - Backup HumanResources dialog box, on the Steps page, click step 1 (Back Up Database) and click Edit.
3. In the Job Step Properties - Back Up Database dialog box, on the Advanced page, in the Run as user box, click the ellipsis (…).
4. In the Select User dialog box, click Browse, and in the Browse for Objects dialog box, select [Backup_User] and click OK. Then click OK in the Select User dialog box and the Job Step Properties - Back Up Database dialog box.
5. In the Job Properties - Backup HumanResources dialog box, on the Steps page, click step 2 (Copy Backup File) and click Edit.
6. In the Job Step Properties - Copy Backup File dialog box, in the Run as drop-down list, select FileAgent_Proxy. Then click OK.
7. In the Job Properties - Backup HumanResources dialog box, click OK.
Task 4: Test the Job
1. Right-click the Backup HumanResources job, and click Start Job at Step. Then, in the Start Job on 'MIA-SQL' dialog box, ensure step 1 is selected and click Start.
2. When the job has completed successfully, click Close.
Results: After this exercise, you should have configured the Back Up Database step of the Backup HumanResources job to run as the Backup_User SQL Server user. You should have also created a credential named FileAgent_Credential and a proxy named FileAgent_Proxy to perform the Copy Backup File step of the Backup HumanResources job.
Module 13: Monitoring SQL Server 2014 with Notifications and Alerts
Lab: Using Notifications and Alerts
Exercise 1: Configuring Database Mail
Task 1: Prepare the Lab Environment
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are both running, and then log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Labfiles\Lab13\Starter folder, right-click the Setup.cmd file and then click Run as administrator.
3. Click Yes when prompted to confirm that you want to run the command file, and wait for the script to finish.
Task 2: Configure Database Mail
1. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using Windows authentication.
2. In Object Explorer, under the MIA-SQL instance, expand Management, right-click Database Mail, and click Configure Database Mail.
3. In the Welcome to Database Mail Configuration Wizard page, click Next.
4. In the Select Configuration Task page, select the option to set up Database Mail and click Next.
5. In the New Profile page, in the Profile name textbox, type SQL Server Agent Profile, and click Add. Then, in the Add Account to profile 'SQL Server Agent Profile' dialog box, click New Account.
6. In the New Database Mail Account dialog box, enter the following details and click OK:
   o Account name: AdventureWorks Administrator
   o E-mail address: [email protected]
   o Display name: Administrator (AdventureWorks)
   o Reply e-mail: [email protected]
   o Server name: mia-sql.adventureworks.msft
7. In the New Profile page, click Next.
8. In the Manage Profile Security page, select Public for the SQL Server Agent Profile profile, and set its Default Profile setting to Yes. Then click Next.
9. In the Configure System Parameters page, click Next. Then, in the Complete the Wizard page, click Finish, and when configuration is complete, click Close.
Task 3: Test Database Mail
1. In Object Explorer, right-click Database Mail and click Send Test E-Mail.
2. In the Send Test E-Mail from MIA-SQL dialog box, ensure that the SQL Server Agent Profile database mail profile is selected, and in the To textbox, enter [email protected]. Then click Send Test Email.
3. View the contents of the C:\inetpub\mailroot\Drop folder, and verify that an email message has been created here.
4. Double-click the message to view it in Outlook. When you have read the message, close it and delete it, and then minimize the Drop folder window.
5. In the Database Mail Test E-Mail dialog box (which may be behind SQL Server Management Studio), click OK.
6. In SQL Server Management Studio, click New Query.
7. Enter the following Transact-SQL code and click Execute:

   SELECT * FROM msdb.dbo.sysmail_event_log;
   SELECT * FROM msdb.dbo.sysmail_mailitems;
8. View the results. The first result shows system events for Database Mail, and the second shows records of e-mail messages that have been sent.
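You can also send a message directly from Transact-SQL with msdb.dbo.sp_send_dbmail, which is useful for testing a profile; the recipient address below is a placeholder:

   EXEC msdb.dbo.sp_send_dbmail
       @profile_name = N'SQL Server Agent Profile',
       @recipients = N'<recipient address>',
       @subject = N'Database Mail test',
       @body = N'This is a test message.';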
Results: After this exercise, you should have configured Database Mail with a new profile named SQL Server Agent Profile.
Exercise 2: Implementing Operators and Notifications
Task 1: Create Operators
1. In Object Explorer, under SQL Server Agent, right-click Operators and click New Operator.
2. In the New Operator dialog box, in the Name box, type Student, in the E-mail name box, type [email protected], and click OK.
3. In Object Explorer, under SQL Server Agent, right-click Operators and click New Operator.
4. In the New Operator dialog box, in the Name box, type DBA Team, in the E-mail name box, type [email protected], and click OK.
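Operators can equally be created with msdb.dbo.sp_add_operator; the address below is a placeholder for the one used in the steps above:

   EXEC msdb.dbo.sp_add_operator
       @name = N'Student',
       @email_address = N'<operator e-mail address>';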
Task 2: Configure the SQL Server Agent Mail Profile
1. In SQL Server Management Studio, in Object Explorer, right-click SQL Server Agent and click Properties.
2. In the SQL Server Agent Properties dialog box, on the Alert System page, select Enable mail profile, and in the Mail profile drop-down list, select SQL Server Agent Profile.
3. In the SQL Server Agent Properties dialog box, select Enable fail-safe operator, in the Operator drop-down list select DBA Team, and for the Notify using setting, select E-mail. Then click OK.
4. In Object Explorer, right-click SQL Server Agent and click Restart. When prompted to confirm, click Yes.
Task 3: Configure Job Notifications
1. In Object Explorer, under SQL Server Agent, expand Jobs and view the existing jobs.
2. Right-click the Back Up Database - AWDataWarehouse job and click Properties.
3. In the Job Properties - Back Up Database - AWDataWarehouse dialog box, on the Notifications tab, select E-mail, select Student, and select When the job fails. Then click OK.
4. Right-click the Back Up Database - HumanResources job and click Properties.
5. In the Job Properties - Back Up Database - HumanResources dialog box, on the Notifications tab, select E-mail, select Student, and select When the job fails. Then click OK.
6. Right-click the Back Up Database - InternetSales job and click Properties.
7. In the Job Properties - Back Up Database - InternetSales dialog box, on the Notifications tab, select E-mail, select Student, and select When the job completes. Then click OK.
8. Right-click the Back Up Log - InternetSales job and click Properties.
9. In the Job Properties - Back Up Log - InternetSales dialog box, on the Notifications tab, select E-mail, select Student, and select When the job completes. Then click OK.
10. Expand the Operators folder, right-click Student, and click Properties. On the Notifications page, select Jobs, and note the job notifications that have been defined for this operator. Then click Cancel.
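Behind the dialog box, each of these notification settings corresponds to an msdb.dbo.sp_update_job call; for example:

   EXEC msdb.dbo.sp_update_job
       @job_name = N'Back Up Database - AWDataWarehouse',
       @notify_level_email = 2,   -- 2 = when the job fails; 3 = when it completes
       @notify_email_operator_name = N'Student';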
Task 4: Test Job Notifications
1. In Object Explorer, right-click the Back Up Database - AWDataWarehouse job and click Start Job at Step. Then, when the job has completed, note that it failed and click Close.
2. In Object Explorer, right-click the Back Up Database - HumanResources job and click Start Job at Step. Then, when the job has completed, note that it succeeded and click Close.
3. In Object Explorer, right-click the Back Up Database - InternetSales job and click Start Job at Step. Then, when the job has completed, note that it succeeded and click Close.
4. Under the Operators folder, right-click Student and click Properties. On the History page, note the most recent e-mail notification attempt. Then click Cancel.
5. In the C:\inetpub\mailroot\Drop folder, verify that new email messages have been created.
6. Open each of the messages and verify that they include a failure notification for the Back Up Database - AWDataWarehouse job and a completion notification for the Back Up Database - InternetSales job, but no notification regarding the Back Up Database - HumanResources job. Then close all e-mail messages and minimize the Drop window.
Results: After this exercise, you should have created operators named Student and DBA Team, configured the SQL Server Agent service to use the SQL Server Agent Profile Database Mail profile, and configured the Back Up Database - AWDataWarehouse, Back Up Database - HumanResources, Back Up Database - InternetSales, and Back Up Log - InternetSales jobs to send notifications.
Exercise 3: Implementing Alerts
Task 1: Create an Alert
1. In SQL Server Management Studio, in Object Explorer, under SQL Server Agent, right-click Alerts and click New Alert.
2. In the New Alert dialog box, on the General page, enter the name InternetSales Log Full Alert. In the Database name drop-down list, select InternetSales; and then select Error number, and enter the number 9002.
3. In the New Alert dialog box, on the Response page, select Execute job, and select the Back Up Log - InternetSales ([Uncategorized (Local)]) job. Then select Notify operators and select the E-mail checkbox for the Student operator.
4. In the New Alert dialog box, on the Options page, under Include alert error text in, select E-mail. Then click OK.
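The equivalent Transact-SQL uses msdb.dbo.sp_add_alert and msdb.dbo.sp_add_notification:

   EXEC msdb.dbo.sp_add_alert
       @name = N'InternetSales Log Full Alert',
       @message_id = 9002,
       @database_name = N'InternetSales',
       @include_event_description_in = 1,   -- 1 = include error text in e-mail
       @job_name = N'Back Up Log - InternetSales';

   EXEC msdb.dbo.sp_add_notification
       @alert_name = N'InternetSales Log Full Alert',
       @operator_name = N'Student',
       @notification_method = 1;            -- 1 = e-mail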
Task 2: Test the Alert
1. In SQL Server Management Studio, open the TestAlert.sql script file in the D:\Labfiles\Lab13\Starter folder.
2. Click Execute and wait while the script runs. When the log file for the InternetSales database is full, error 9002 occurs.
3. In Object Explorer, under the Alerts folder, right-click InternetSales Log Full Alert and click Properties. Then, on the History page, note the Date of last alert and Date of last response values and click Cancel.
4. In the C:\inetpub\mailroot\Drop folder, verify that two new email messages have been created.
5. Double-click the new email messages to view them in Outlook. They should include a notification that the transaction log was filled, and a notification that the Back Up Log - InternetSales job completed.
6. When you have read the messages, close them and close the Drop window.
7. Close SQL Server Management Studio without saving any files.
Results: After this exercise, you should have created an alert named InternetSales Log Full Alert.