
Monday 9 January 2017

Checking when a database was last backed up

It can be useful to periodically check when each database on a server was last backed up. The easiest way to do this for a single database is to right-click the database in SQL Server Management Studio (SSMS) and look at the top of the Database Properties page (see the screenshot below).

However, when there are several databases to check this can be quite laborious. SSMS actually uses the system table backupset to populate this part of the Properties page (you can verify this by running SQL Profiler just before opening the page).

[Screenshot: the 'Last Database Backup' section at the top of the Database Properties page in SSMS]
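If you just want to see the raw data SSMS reads from backupset, a quick query against the table for a single database might look like this (the database name is a placeholder):

SELECT TOP (10) database_name, type, backup_start_date, backup_finish_date
FROM   msdb.dbo.backupset
WHERE  database_name = 'YourDatabase'   -- placeholder: substitute your database name
ORDER BY backup_finish_date DESC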
I use a SQL script that joins this table to the backupmediafamily system table (to identify the file name of the backup) and returns the most recent backup of each type for each database, whether that's a full, transaction log, differential, filegroup or partial backup. Here is the script:

;WITH CTE_Backup AS
(
SELECT  database_name,backup_start_date,type,physical_device_name
       ,Row_Number() OVER(PARTITION BY database_name,BS.type 
        ORDER BY backup_start_date DESC) AS RowNum
FROM    msdb..backupset BS
JOIN    msdb.dbo.backupmediafamily BMF
ON      BS.media_set_id=BMF.media_set_id
)
SELECT      D.name
           ,ISNULL(CONVERT(VARCHAR,backup_start_date),'No backups') AS last_backup_time
           ,D.recovery_model_desc
           ,state_desc,
            CASE WHEN type ='D' THEN 'Full database'
            WHEN type ='I' THEN 'Differential database' 
            WHEN type ='L' THEN 'Log' 
            WHEN type ='F' THEN 'File or filegroup' 
            WHEN type ='G' THEN 'Differential file' 
            WHEN type ='P' THEN 'Partial' 
            WHEN type ='Q' THEN 'Differential partial' 
            ELSE 'Unknown' END AS backup_type 
           ,physical_device_name
FROM        sys.databases D
LEFT JOIN   CTE_Backup CTE
ON          D.name = CTE.database_name
AND         RowNum = 1
ORDER BY    D.name,type
As an aside, the 'Last Database Backup' shown in SSMS does not seem to include filegroup or partial backups, only full, differential or log backups. I'm not sure why this should be.
Identifying Databases Which Haven’t Been Backed Up Recently
The above query can return a lot of data if your server has many databases, so I've modified it to produce a list of databases that have had no backups in the last 7 days (you might want to change this to a shorter period, especially for production databases):
;WITH CTE_Backup AS
(
SELECT   database_name,backup_start_date,type,is_readonly,physical_device_name
        ,Row_Number() OVER(PARTITION BY database_name 
         ORDER BY backup_start_date DESC) AS RowNum
FROM     msdb..backupset BS
JOIN     msdb.dbo.backupmediafamily BMF
ON       BS.media_set_id=BMF.media_set_id
)
SELECT      D.name
           ,ISNULL(CONVERT(VARCHAR,backup_start_date),'No backups') AS last_backup_time
           ,D.recovery_model_desc
           ,state_desc
           ,physical_device_name
FROM        sys.databases D
LEFT JOIN   CTE_Backup CTE
ON          D.name = CTE.database_name
AND         RowNum = 1
WHERE       ( backup_start_date IS NULL OR backup_start_date < DATEADD(dd,-7,GetDate()) )
ORDER BY    D.name,type
This will produce a list of databases for investigation, though of course there may be a good reason for a database not being backed up; for instance, it's not possible to back up a database snapshot, and the secondary database in a log shipping configuration doesn't always need to be backed up either.
Missing Transaction Log Backups
Finally, I've modified the above query so that it reports all databases that use the full or bulk-logged recovery model and haven't had a transaction log backup in the last day:
;WITH CTE_Backup AS
(
SELECT   database_name,backup_start_date,type,is_readonly,physical_device_name
        ,Row_Number() OVER(PARTITION BY database_name,BS.type 
         ORDER BY backup_start_date DESC) AS RowNum
FROM     msdb..backupset BS
JOIN     msdb.dbo.backupmediafamily BMF
ON       BS.media_set_id=BMF.media_set_id
WHERE    type='L' 
)
SELECT      D.name
           ,ISNULL(CONVERT(VARCHAR,backup_start_date),'No log backups') AS last_backup_time
           ,D.recovery_model_desc
           ,state_desc
           ,physical_device_name
FROM        sys.databases D
LEFT JOIN   CTE_Backup CTE
ON          D.name = CTE.database_name
AND         RowNum = 1
WHERE       ( backup_start_date IS NULL OR backup_start_date < DATEADD(dd,-1,GetDate()) )
AND         recovery_model_desc != 'SIMPLE'
ORDER BY    D.name,type
I hope the above queries are of use in identifying databases where there is no recent backup in place. You really don't want to be in a position where this is only discovered when the backup is actually needed, i.e. after a disk failure or data corruption. Of course you should also be checking that your backups are valid, ideally by periodically restoring from a backup, or at the very least by checking the backups using the VERIFYONLY option (though there is really no substitute for doing an actual restore).
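As a minimal sketch of that VERIFYONLY check (the path is a placeholder, and WITH CHECKSUM only validates page checksums if the backup was taken with checksums enabled):

RESTORE VERIFYONLY
FROM DISK = 'D:\Backups\YourDatabase.bak'   -- placeholder path
WITH CHECKSUM;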


Script to retrieve SQL Server database backup history and no backups

Problem
There is a multitude of data to be mined from the Microsoft SQL Server system views. This data is used to present information back to the end user of SQL Server Management Studio (SSMS) and all the third party management tools that are available for SQL Server professionals. Be it database backup information, file statistics, indexing information, or one of the thousands of other metrics that the instance maintains, this data is readily available for direct querying and assimilation into your "home-grown" monitoring solutions as well. This tip focuses on the first of those metrics, database backup information: where it resides, how it is structured, and what data is available to be mined.
Solution
The msdb system database is the primary repository for storage of SQL Agent, backup, Service Broker, Database Mail, Log Shipping, restore, and maintenance plan metadata. We will be focusing on the handful of system views associated with database backups for this tip:
dbo.backupset: provides information concerning the most-granular details of the backup process
dbo.backupmediafamily: provides metadata for the physical backup files as they relate to backup sets
dbo.backupfile: this system view provides the most-granular information for the physical backup files
Based upon these tables, we can create a variety of queries to collect a detailed insight into the status of backups for the databases in any given SQL Server instance.
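The queries below use the first two views. If you also need per-file detail from dbo.backupfile (for example, which data and log files a backup set contained), a hedged sketch along these lines should work (the database name is a placeholder):

SELECT  BS.database_name,
        BS.backup_finish_date,
        BF.logical_name,
        BF.physical_name,
        BF.file_type           -- 'D' = data file, 'L' = log file
FROM    msdb.dbo.backupset BS
        INNER JOIN msdb.dbo.backupfile BF ON BF.backup_set_id = BS.backup_set_id
WHERE   BS.database_name = 'YourDatabase'  -- placeholder
ORDER BY BS.backup_finish_date DESC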

Database Backups for all databases For Previous Week
---------------------------------------------------------------------------------
--Database Backups for all databases For Previous Week
---------------------------------------------------------------------------------
SELECT
CONVERT(CHAR(100), SERVERPROPERTY('Servername')) AS Server,
msdb.dbo.backupset.database_name,
msdb.dbo.backupset.backup_start_date,
msdb.dbo.backupset.backup_finish_date,
msdb.dbo.backupset.expiration_date,
CASE msdb.dbo.backupset.type
WHEN 'D' THEN 'Database'
WHEN 'L' THEN 'Log'
END AS backup_type,
msdb.dbo.backupset.backup_size,
msdb.dbo.backupmediafamily.logical_device_name,
msdb.dbo.backupmediafamily.physical_device_name,
msdb.dbo.backupset.name AS backupset_name,
msdb.dbo.backupset.description
FROM msdb.dbo.backupmediafamily
INNER JOIN msdb.dbo.backupset ON msdb.dbo.backupmediafamily.media_set_id = msdb.dbo.backupset.media_set_id
WHERE (CONVERT(datetime, msdb.dbo.backupset.backup_start_date, 102) >= GETDATE() - 7)
ORDER BY
msdb.dbo.backupset.database_name,
msdb.dbo.backupset.backup_finish_date
Note: for readability the output was split into two screenshots.



Most Recent Database Backup for Each Database
-------------------------------------------------------------------------------------------
--Most Recent Database Backup for Each Database
-------------------------------------------------------------------------------------------
SELECT 
   CONVERT(CHAR(100), SERVERPROPERTY('Servername')) AS Server,
   msdb.dbo.backupset.database_name, 
   MAX(msdb.dbo.backupset.backup_finish_date) AS last_db_backup_date
FROM   msdb.dbo.backupmediafamily 
   INNER JOIN msdb.dbo.backupset ON msdb.dbo.backupmediafamily.media_set_id = msdb.dbo.backupset.media_set_id 
WHERE  msdb.dbo.backupset.type = 'D'
GROUP BY
   msdb.dbo.backupset.database_name 
ORDER BY 
   msdb.dbo.backupset.database_name


Most Recent Database Backup for Each Database - Detailed
You can join the two result sets together by using the following query in order to return more detailed information about the last database backup for each database. The LEFT JOIN allows you to match up grouped data with the detailed data from the previous query without having to include the fields you do not wish to group on in the query itself.
-------------------------------------------------------------------------------------------
--Most Recent Database Backup for Each Database - Detailed
-------------------------------------------------------------------------------------------
SELECT 
   A.[Server], 
   A.last_db_backup_date, 
   B.backup_start_date, 
   B.expiration_date,
   B.backup_size, 
   B.logical_device_name, 
   B.physical_device_name,  
   B.backupset_name,
   B.description
FROM
   (
   SELECT  
       CONVERT(CHAR(100), SERVERPROPERTY('Servername')) AS Server,
       msdb.dbo.backupset.database_name, 
       MAX(msdb.dbo.backupset.backup_finish_date) AS last_db_backup_date
   FROM    msdb.dbo.backupmediafamily 
       INNER JOIN msdb.dbo.backupset ON msdb.dbo.backupmediafamily.media_set_id = msdb.dbo.backupset.media_set_id 
   WHERE   msdb.dbo.backupset.type = 'D'
   GROUP BY
       msdb.dbo.backupset.database_name 
   ) AS A
   
   LEFT JOIN 

   (
   SELECT  
   CONVERT(CHAR(100), SERVERPROPERTY('Servername')) AS Server,
   msdb.dbo.backupset.database_name, 
   msdb.dbo.backupset.backup_start_date, 
   msdb.dbo.backupset.backup_finish_date,
   msdb.dbo.backupset.expiration_date,
   msdb.dbo.backupset.backup_size, 
   msdb.dbo.backupmediafamily.logical_device_name, 
   msdb.dbo.backupmediafamily.physical_device_name,  
   msdb.dbo.backupset.name AS backupset_name,
   msdb.dbo.backupset.description
FROM   msdb.dbo.backupmediafamily 
   INNER JOIN msdb.dbo.backupset ON msdb.dbo.backupmediafamily.media_set_id = msdb.dbo.backupset.media_set_id 
WHERE  msdb.dbo.backupset.type = 'D'
   ) AS B
   ON A.[server] = B.[server] AND A.[database_name] = B.[database_name] AND A.[last_db_backup_date] = B.[backup_finish_date]
ORDER BY 
   A.database_name
Note: for readability the output was split into two screenshots.

Databases Missing a Data (aka Full) Back-Up Within Past 24 Hours
At this point we've seen how to look at the history for databases that have been backed up. While this information is important, there is an aspect of backup metadata that is slightly more important: which of the databases you administer have not been getting backed up. The following query provides you with that information (with some caveats).
-------------------------------------------------------------------------------------------
--Databases Missing a Data (aka Full) Back-Up Within Past 24 Hours
-------------------------------------------------------------------------------------------
--Databases with data backup over 24 hours old
SELECT
   CONVERT(CHAR(100), SERVERPROPERTY('Servername')) AS Server,
   msdb.dbo.backupset.database_name,
   MAX(msdb.dbo.backupset.backup_finish_date) AS last_db_backup_date,
   DATEDIFF(hh, MAX(msdb.dbo.backupset.backup_finish_date), GETDATE()) AS [Backup Age (Hours)]
FROM    msdb.dbo.backupset
WHERE     msdb.dbo.backupset.type = 'D' 
GROUP BY msdb.dbo.backupset.database_name
HAVING      (MAX(msdb.dbo.backupset.backup_finish_date) < DATEADD(hh, - 24, GETDATE())) 

UNION 

--Databases without any backup history
SELECT     
   CONVERT(CHAR(100), SERVERPROPERTY('Servername')) AS Server, 
   master.dbo.sysdatabases.NAME AS database_name, 
   NULL AS [Last Data Backup Date], 
   9999 AS [Backup Age (Hours)] 
FROM
   master.dbo.sysdatabases LEFT JOIN msdb.dbo.backupset
       ON master.dbo.sysdatabases.name  = msdb.dbo.backupset.database_name
WHERE msdb.dbo.backupset.database_name IS NULL AND master.dbo.sysdatabases.name <> 'tempdb'
ORDER BY 
   database_name

https://www.mssqltips.com/tipimages/1601_06.jpg
Now let me explain those caveats, and this query. The first part of the query returns all records where the last full database backup is older than 24 hours from the current system date. This is then combined via the UNION statement with the second portion of the query, which returns information on all databases that have no backup history at all. I've taken the liberty of singling tempdb out from the result set since you do not back up that system database; it is recreated each time the SQL Server service is restarted. That is caveat #1. Caveat #2 is the arbitrary value I've assigned as the backup age for databases without any backup history. I've set that value at 9999 hours because in my environment I want to place a higher emphasis on those databases that have never been backed up.
Using this final query I produce a report via SQL Server Reporting Services that is distributed to the DBA Team on a daily basis that highlights any missed backups. That, however, is for another tip.
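A side note on the second half of that query: it uses the older master.dbo.sysdatabases compatibility view. On current versions you can get the same "databases without any backup history" list from sys.databases; a rough equivalent, offered as a sketch rather than part of the original report:

SELECT  CONVERT(CHAR(100), SERVERPROPERTY('Servername')) AS Server,
        D.name AS database_name,
        NULL AS [Last Data Backup Date],
        9999 AS [Backup Age (Hours)]
FROM    sys.databases D
        LEFT JOIN msdb.dbo.backupset BS ON BS.database_name = D.name
WHERE   BS.database_name IS NULL
        AND D.name <> 'tempdb'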


SQL SERVER – Finding Last Backup Time for All Databases
Here is the quick script I use to find the last backup time for all the databases on my server instance.



SELECT sdb.Name AS DatabaseName,
COALESCE(CONVERT(VARCHAR(12), MAX(bus.backup_finish_date), 101),'-') AS LastBackUpTime
FROM sys.sysdatabases sdb
LEFT OUTER JOIN msdb.dbo.backupset bus ON bus.database_name = sdb.name
GROUP BY sdb.Name


Identify when a SQL Server database was restored, the source and backup date
Problem
After restoring a database your users will typically run some queries to verify the data is as expected. However, there are times when your users may question whether the restore was done using the correct backup file. In this tip I will show you how to identify the file(s) that were used for the restore, when the backup actually occurred and when the database was restored.
Solution
The restore information is readily available inside the msdb database, making the solution as easy as a few lines of T-SQL.
When I ask people about how they verify their database restores I often get back a response that includes something similar to the following code:
RESTORE VERIFYONLY FROM DISK = 'G:\dbname.bak'
The above command simply returns this message when successful: "The backup set on file 1 is valid." Is that really useful for an end user who is complaining that the data is not correct? Chances are their complaint is not about whether the backup set was valid, but more specifically about your selection of the backup file, or the timing of the backup itself.
If the backup was done at the wrong time, or if you restored from the wrong backup file, then the end user may be seeing exactly that problem while reviewing the data. So, how do you provide some proof that you did the restore from the correct backup file? The following script can give you this information.
SELECT [rs].[destination_database_name],
[rs].[restore_date],
[bs].[backup_start_date],
[bs].[backup_finish_date],
[bs].[database_name] as [source_database_name],
[bmf].[physical_device_name] as [backup_file_used_for_restore]
FROM msdb..restorehistory rs
INNER JOIN msdb..backupset bs
ON [rs].[backup_set_id] = [bs].[backup_set_id]
INNER JOIN msdb..backupmediafamily bmf
ON [bs].[media_set_id] = [bmf].[media_set_id]
ORDER BY [rs].[restore_date] DESC
The script will return the following result set:
destination_database_name: The name of the database that was restored.
restore_date: The time at which the restore command was started.
backup_start_date: The time at which the backup command was started.
backup_finish_date: The time at which the backup command completed.
source_database_name: The name of the source database that was backed up.
backup_file_used_for_restore: The file(s) used in the RESTORE command.
Here is a screenshot of a sample result set returned by the script.
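If you only need the restore history for a single database, a hedged variation simply filters the same query (the database name is a placeholder):

SELECT [rs].[destination_database_name],
       [rs].[restore_date],
       [bs].[database_name] AS [source_database_name],
       [bmf].[physical_device_name] AS [backup_file_used_for_restore]
FROM msdb..restorehistory rs
INNER JOIN msdb..backupset bs ON [rs].[backup_set_id] = [bs].[backup_set_id]
INNER JOIN msdb..backupmediafamily bmf ON [bs].[media_set_id] = [bmf].[media_set_id]
WHERE [rs].[destination_database_name] = N'YourDatabase'   -- placeholder
ORDER BY [rs].[restore_date] DESC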


Issues with SQL Server Import and Export Wizard

The SQL Server Import and Export Wizard has been around for a while. It's a simple and very useful tool for importing or exporting data: you simply give the Wizard a data source and a destination and let it do the rest. But there can be instances where the Wizard fails to perform as designed and needs help to finish the basic import or export of data.
Problem
Moving data into SQL Server is something that most DBAs or Developers are faced with probably on a daily basis.  One simple way of doing this is by using the Import / Export wizard, but along with this option there are several other ways of loading data into SQL Server tables. Another common technique would be to use SSIS.  In this tip we take a look at some of these other options for importing data into SQL Server.
Solution
In addition to using the Import / Export wizard or SSIS to move data into SQL Server, there are also a few other options that are built into SQL Server. Some of these other options include bcp, BULK INSERT and OPENROWSET, as well as others. The following examples show you some of these different options for importing data: some can be used inline with your T-SQL code, while others are run from the command line.

bcp
This is one of the most widely used options. One reason for this is that it has been around for a while, so DBAs have become quite familiar with the command. It allows you to both import and export data, but is primarily used for text data formats. In addition, it is generally run from a Windows command prompt, but could also be called from a stored procedure by using xp_cmdshell or from an SSIS package.
Here is a simple command for importing data from file C:\ImportData.txt into table dbo.ImportTest.
bcp dbo.ImportTest in C:\ImportData.txt -c -T -SserverName\instanceName
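Since bcp works in both directions, a hedged export example along the same lines (the server, database and paths are placeholders) could be:

bcp databaseName.dbo.ImportTest out C:\ExportData.txt -c -T -SserverName\instanceName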
For more information about bcp click here.

BULK INSERT
BULK INSERT is a T-SQL command that allows you to import data directly from within SQL Server. This example imports data from file C:\ImportData.txt into table dbo.ImportTest.
BULK INSERT dbo.ImportTest
FROM 'C:\ImportData.txt'
WITH ( FIELDTERMINATOR =',', FIRSTROW = 2 )
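For larger files the same statement accepts a few more options; a hedged variation (the path is a placeholder) with an explicit row terminator and a table lock:

BULK INSERT dbo.ImportTest
FROM 'C:\ImportData.txt'
WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2, TABLOCK )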
For more information about BULK INSERT click here.

OPENROWSET
OPENROWSET is a T-SQL command that allows you to query data from other data sources directly from within SQL Server. By using this command along with an INSERT INTO statement we can load data from the specified data source into a SQL Server table.
This command will pull in all data from worksheet [Sheet1$]. By using the INSERT INTO command you can insert the query results into table dbo.ImportTest.
INSERT INTO dbo.ImportTest
SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=C:\ImportData.xls', [Sheet1$])
Here is another example where data is pulled from worksheet [Sheet1$] by using a SELECT * FROM command. Again, by using the INSERT INTO command you can insert the query results into table dbo.ImportTest.   The query can be any valid SQL query, so you can filter the columns and rows by using this option.
INSERT INTO dbo.ImportTest
SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=C:\ImportData.xls', 'SELECT * FROM [Sheet1$]')
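Note that ad hoc OPENROWSET/OPENDATASOURCE queries are disabled by default on many installations; if you get an 'Ad Hoc Distributed Queries' error, the option can be enabled (with the appropriate permissions) like this:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;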
For more information about OPENROWSET click here.

OPENDATASOURCE
OPENDATASOURCE is another T-SQL command that allows you to query data from other data sources directly from within SQL Server. It is similar to the OPENROWSET command.
INSERT INTO dbo.ImportTest
SELECT * FROM OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0',
'Data Source=C:\ImportData.xls;Extended Properties=Excel 8.0')...[Sheet1$]
For more information about OPENDATASOURCE click here.

OPENQUERY
Another option is OPENQUERY. This command also allows you to issue a T-SQL query to select data, and again with the INSERT INTO option we can load that data into our table. There are two steps to this process: first a linked server is set up, and then the query is issued using the OPENQUERY command. This option allows you to filter the columns and rows via the query that is issued against your linked data source.
EXEC sp_addlinkedserver 'ImportData',
   'Jet 4.0', 'Microsoft.Jet.OLEDB.4.0',
   'C:\ImportData.xls',
   NULL,
   'Excel 8.0'
GO

INSERT INTO dbo.ImportTest
SELECT *
FROM OPENQUERY(ImportData, 'SELECT * FROM [Sheet1$]')
For more information about OPENQUERY click here.

Linked Servers
Here is yet another option: set up a linked server and then issue a straight SQL statement against it. This again has two steps: first the linked server is set up, and second a SQL command is issued against the linked data source.
EXEC sp_addlinkedserver 'ImportData',
   'Jet 4.0', 'Microsoft.Jet.OLEDB.4.0',
   'C:\ImportData.xls',
   NULL,
   'Excel 8.0'
GO

INSERT INTO dbo.ImportTest
SELECT * FROM ImportData...Sheet1$
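Once the data has been loaded, the 'ImportData' linked server created above can be removed again; a minimal cleanup sketch:

EXEC sp_dropserver 'ImportData', 'droplogins'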
For more information about Linked Servers click here.
As you can see, right out of the box SQL Server offers many ways of importing data. Take a look at these different options to see which satisfies your database requirements.

The SQL Server Import and Export Wizard is useful for copying data from one data source (e.g. a SQL Server database) to another. Although the interface is fairly simple there are a few “gotchas” to be aware of. Here are a few issues I’ve found while loading data from one SQL Server database into another.
Identity Columns
The wizard doesn’t treat identity columns any differently to other columns, so will usually fail when trying to insert data into a table that has an identity column. However the error message can be a bit misleading :
- Validating (Error)
Messages

• Error 0xc0202049: Data Flow Task 1: Failure inserting into the read-only column "SystemInformationID".
(SQL Server Import and Export Wizard)

• Error 0xc0202045: Data Flow Task 1: Column metadata validation failed.
(SQL Server Import and Export Wizard)

• Error 0xc004706b: Data Flow Task 1: "component "Destination - BuildVersion" (28)" failed validation and returned validation status "VS_ISBROKEN".
(SQL Server Import and Export Wizard)

• Error 0xc004700c: Data Flow Task 1: One or more component failed validation.
(SQL Server Import and Export Wizard)
The issue here is that the 'SystemInformationID' column in this table is defined as an identity column, although that's not immediately obvious from the error message. Fortunately there is an easy solution: you can select an option in the wizard to allow specific values to be inserted into the identity column, much like you can with a SQL query. To do this, go to the 'Select Source Tables and Views' page and select the relevant tables, either by clicking the checkbox in the header to pick every table or by checking just the tables you want to copy. Make sure the tables you want are actually selected (i.e. highlighted in the list, not just checked). In my case I've just selected all the tables in my database:
[Screenshot: the 'Select Source Tables and Views' page with all tables checked and selected]
Click on the ‘Edit mappings…’ button towards the bottom of the screen and the following window should appear:
[Screenshot: the 'Edit mappings' window showing the 'Enable identity insert' option]
If you select 'Enable identity insert', as indicated in the picture above, then the values of any identity columns will simply be copied across from the source database.
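This wizard option achieves the same effect as the identity insert you would use in a hand-written query. As a hedged T-SQL sketch (the table and column names are hypothetical):

SET IDENTITY_INSERT dbo.SystemInformation ON;

INSERT INTO dbo.SystemInformation (SystemInformationID, SomeColumn)  -- hypothetical names
VALUES (1, 'copied row');       -- explicit value for the identity column

SET IDENTITY_INSERT dbo.SystemInformation OFF;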


Timestamp Columns
The wizard will also attempt to copy any timestamp columns in the same way it would a column of any other data type. Unfortunately timestamp columns can't be explicitly set to a specific value, so this will always fail. If you try it you will probably get an error message like this one (obviously the column name will be different for you):
- Validating (Error)
Messages

• Error 0xc0202048: Data Flow Task 1: Attempting insertion into the row version column "LastUpdated". Cannot insert into a row version column.
(SQL Server Import and Export Wizard)

• Error 0xc0202045: Data Flow Task 1: Column metadata validation failed.
(SQL Server Import and Export Wizard)

• Error 0xc004706b: Data Flow Task 1: "Destination 4 - Customer" failed validation and returned validation status "VS_ISBROKEN".
(SQL Server Import and Export Wizard)

• Error 0xc004700c: Data Flow Task 1: One or more component failed validation.
(SQL Server Import and Export Wizard)

• Error 0xc0024107: Data Flow Task 1: There were errors during task validation.
(SQL Server Import and Export Wizard)
Again the error is perhaps a bit misleading as there is no mention of 'timestamp', but row version is just a synonym for timestamp. The solution is not to copy any timestamp columns. To do this you just need to click 'Edit Mappings…' in the 'Select Source Tables and Views' screen for the table in question. This should display a screen similar to this one:
[Screenshot: the Column Mappings window showing the 'LastUpdated' timestamp column]
As you can see, in my table the 'LastUpdated' column is a timestamp column. To stop the error occurring, just set the destination to 'ignore' in the drop-down that appears when you click on that cell:
[Screenshot: the Column Mappings window with the destination for the 'LastUpdated' column set to 'ignore']
If there is more than one table with a timestamp you’ll need to repeat this for each table.
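The same restriction applies in plain T-SQL: a timestamp (rowversion) column can never be assigned a value, so an ordinary INSERT simply leaves it out and lets SQL Server generate it. A hedged sketch with hypothetical names:

-- 'LastUpdated' is the rowversion/timestamp column and is deliberately omitted
INSERT INTO dbo.Customer (CustomerName)
VALUES ('Example customer');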
Constraints
If the table has foreign key constraints on it then the chances are you will get a constraint failure message at some point. The Wizard does not load tables in any particular order to satisfy constraints, so it is quite possible that a foreign key table will be loaded before the table it refers to, causing a foreign key constraint failure. The error message will be something like: "The INSERT statement conflicted with the FOREIGN KEY constraint". The error in the Wizard will probably be similar to:
- Copying to [SalesLT].[ProductDescription] (Error)
Messages

• Information 0x402090e0: Data Flow Task 2: The final commit for the data insertion in "component "Destination 7 - ProductCategory" (207)" has ended.
(SQL Server Import and Export Wizard)

• Information 0x402090e0: Data Flow Task 2: The final commit for the data insertion in "component "Destination 5 - CustomerAddress" (31)" has ended.
(SQL Server Import and Export Wizard)

• Information 0x402090df: Data Flow Task 2: The final commit for the data insertion in "component "Destination 8 - ProductDescription" (262)" has started.
(SQL Server Import and Export Wizard)

• Information 0x402090e0: Data Flow Task 2: The final commit for the data insertion in "component "Destination 8 - ProductDescription" (262)" has ended.
(SQL Server Import and Export Wizard)

• Error 0xc0202009: Data Flow Task 2: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80004005 Description: "The statement has been terminated.".
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80004005 Description: "The INSERT statement conflicted with the FOREIGN KEY constraint "FK_Product_ProductModel_ProductModelID". The conflict occurred in database "AdventureWorksLT2008R2Copy", table "SalesLT.ProductModel", column 'ProductModelID'.".
(SQL Server Import and Export Wizard)

• Error 0xc0209029: Data Flow Task 2: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "input "Destination Input" (138)" failed because error code 0xC020907B occurred, and the error row disposition on "input "Destination Input" (138)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)

• Error 0xc0047022: Data Flow Task 2: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Destination 6 - Product" (125) failed with error code 0xC0209029 while processing input "Destination Input" (138). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
The easiest way to get around this is to disable constraints during the load and then re-enable them again afterwards.

To disable all constraints on a specific table run the following SQL (in this case for the Address table):
ALTER TABLE Address NOCHECK CONSTRAINT ALL
However you’ll need to disable constraints on all tables to be sure of avoiding errors. You can do this with the undocumented stored procedure sp_MSforeachtable as follows:
EXEC sp_MSforeachtable @command1="ALTER TABLE ? NOCHECK CONSTRAINT ALL"
This will disable all constraints on all tables. Once the data has been loaded, constraints can be re-enabled with the following:
EXEC sp_MSforeachtable @command1="ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL"
The ‘WITH CHECK’ option specifies that the data is validated against the constraint.
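It's worth confirming afterwards that nothing was left untrusted, since the optimizer cannot rely on untrusted constraints; a quick hedged check:

SELECT name, OBJECT_NAME(parent_object_id) AS table_name
FROM   sys.foreign_keys
WHERE  is_not_trusted = 1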
Triggers
If you have triggers in your database then these will not fire when using the import wizard. If you were relying on the trigger code running then you will need to do this manually once the data is imported.
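To work out which tables are affected, the destination database's DML triggers can be listed; a hedged sketch:

SELECT t.name AS trigger_name,
       OBJECT_NAME(t.parent_id) AS table_name,
       t.is_disabled
FROM   sys.triggers t
WHERE  t.parent_class_desc = 'OBJECT_OR_COLUMN'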


Added March 2016:
202 and 200 Data Conversion Errors

When importing or exporting data using a query, the VARCHAR data type is incorrectly recognised as a '200' data type and NVARCHAR as a '202' data type. This causes the wizard to fail, with a message similar to the one below in the 'Review Data Type Mapping' screen.
[Screenshot: the 'Review Data Type Mapping' screen showing the 200/202 data type conversion warnings]
This is a bug in some versions of SSIS and it only occurs when the data source is a query (rather than a table or view). The workaround I use is to insert the data from the query into a table and then use that table as the data source. An alternative is to create a view based on the query and use that as the data source. In either case the table or view can be deleted once the data has been imported.
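As a sketch of the view-based workaround (the view name and query are placeholders): create the view, point the wizard at it instead of the query, then drop it when the import is finished.

CREATE VIEW dbo.vWizardSource
AS
SELECT CustomerID, CustomerName   -- the query that was failing in the wizard
FROM dbo.Customer;
GO

-- run the Import and Export Wizard using dbo.vWizardSource as the source, then:
DROP VIEW dbo.vWizardSource;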