Microsoft SQL Server 2000 Data Transformation Services- P7

Part III: Other Data Movement and Manipulation Tasks — Chapter 11: The Bulk Insert Task

FIGURE 11.3 Text qualifiers are needed when commas occur in the data of a comma-delimited text file. Use the Transform Data task for these files.

The second way to create a format file is to use the bcp utility interactively. Open a Command Prompt and type in a bcp command. The following command could be used to generate the format file in Listing 11.1:

bcp pubs.dbo.stores out c:\temp\stores.txt -Usa

The bcp utility will ask you a number of questions about the fields in this bulk copy. One of the last questions is whether or not you want to create a format file. If you say yes, you will be asked for the host filename, which is used as the name of the format file that will be created.

Reconciling Differences Between the Source and the Destination

By default, a bulk insert takes data from the fields of a source file and puts it into the same number of fields, in the same order, in the data destination. If you don't have the same number of fields, or if the fields are in a different order, you usually have three options:

• Use a view in place of the destination table. Create the view so that its fields line up with the fields of the source text file. This is usually the easiest option to implement.
• Use a format file. This option is usually harder to implement, but it gives the most flexibility.
• Change the destination table so its fields match the fields in the text file.

Extra Fields in the Data Destination Table

You may have fields in the destination table that do not exist in the source text file, as shown in the following example. The destination is the Stores table in the Pubs database, which has the following fields:

stor_id, stor_name, stor_address, city, state, zip

The source text file is missing the last three fields:

1110Eric the Read Books                 788 Catamaugus Ave.
2220Barnum's                            567 Pasadena Ave.
3330News & Brews                        577 First St.

You could use the following view as the destination for this Bulk Insert task:

create view vwStoresForBulkInsertFewerFields
as
select stor_id, stor_name, stor_address
from stores

This code, along with the code for the following create table and create view items, is on the book's CD as BulkInsertCreateQueries.sql.

If you use this view, you still need to use a format file because this is a fixed-length text file. You could use the format file as it is generated by the DTS Designer.

You could also use the table itself as the destination for the Bulk Insert task. To do that, you would have to create a special format file, like this:

1. Create a temporary table that has the same structure as the source data file:

create table tmpStoresForBulkInsertFewerFields (
    [stor_id] [char] (4) NOT NULL,
    [stor_name] [varchar] (40) NULL,
    [stor_address] [varchar] (40) NULL
)

2. Generate a format file using the temporary table as the destination for the Bulk Insert task. Your generated format file will look like this:

8.0
3
1 SQLCHAR 0 4  ""     1 stor_id      SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 40 ""     2 stor_name    SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 40 "\r\n" 3 stor_address SQL_Latin1_General_CP1_CI_AS

3. Add the missing fields in the order they appear in the destination, using 0 for the column length and 0 for the column order field.

4. If you have a row delimiter (in this example, the newline characters), move it to the last line.

5. Change the number in the second row of the format file to the number of fields in the destination table.

When you are done, your format file should look like Listing 11.3.
LISTING 11.3 This Format File Accommodates Extra Fields in the Data Destination Table

8.0
6
1 SQLCHAR 0 4  ""     1 stor_id      SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 40 ""     2 stor_name    SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 40 ""     3 stor_address SQL_Latin1_General_CP1_CI_AS
4 SQLCHAR 0 0  ""     0 city         SQL_Latin1_General_CP1_CI_AS
5 SQLCHAR 0 0  ""     0 state        SQL_Latin1_General_CP1_CI_AS
6 SQLCHAR 0 0  "\r\n" 0 zip          SQL_Latin1_General_CP1_CI_AS

The files for this example are on the book's CD as FewerFieldsInSource.txt and FewerFieldsInSource.fmt.

Rearranging Fields When Moving from Source to Destination

The situation is easier when the source and the destination have the same fields, but in a different order. For example, you could have a text file to import into Stores that has the correct six fields, but the field order in this text file is stor_name, stor_id, stor_address, city, state, zip:

Eric the Read Books 1100788 Catamaugus Ave. Seattle   WA98056
Barnum's            2200567 Pasadena Ave.   Tustin    CA92789
News & Brews        3300577 First St.       Los Gatos CA96745

The view that you could create to use as the destination table is as follows:

create view vwStoresForBulkInsertRearrange
as
select stor_name, stor_id, stor_address, city, state, zip
from stores

If you want to do the rearranging with a format file, start by generating the normal file, which will look like Listing 11.4.

LISTING 11.4 A Generated Format File

7.0
6
1 SQLCHAR 0 40 "" 1 stor_name
2 SQLCHAR 0 4  "" 2 stor_id
3 SQLCHAR 0 40 "" 3 stor_address
4 SQLCHAR 0 20 "" 4 city
5 SQLCHAR 0 2  "" 5 state
6 SQLCHAR 0 5  "" 6 zip

The rows describing the fields in the format file must be in the order in which those fields appear in the source text file. The numbers in the sixth column, however, must reflect the actual order of those fields in the destination table.
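Under the covers, the Bulk Insert task runs the Transact-SQL BULK INSERT statement, so the format-file technique can be sketched directly in SQL. This is a sketch only: the file paths are hypothetical placeholders, and RearrangeFields.fmt stands for a format file whose sixth column has been renumbered to match the destination:

```sql
-- Sketch: bulk load a text file whose fields are in a different
-- order than the destination table, using an adjusted format file.
-- The paths below are placeholders, not from the book.
BULK INSERT pubs.dbo.stores
FROM 'c:\temp\RearrangeFields.txt'
WITH (
    FORMATFILE = 'c:\temp\RearrangeFields.fmt'
)
```

Pointing the statement at the vwStoresForBulkInsertRearrange view instead of the table is the view-based alternative described above; in that case the format file generated by the DTS Designer can be used unchanged.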
Listing 11.5 shows a format file that adjusts the order of fields that differ between the source and destination tables.

LISTING 11.5 Switching the Numbering in Column 6 Reorders Fields as They Enter the Destination Table

7.0
6
1 SQLCHAR 0 40 "" 2 stor_name
2 SQLCHAR 0 4  "" 1 stor_id
3 SQLCHAR 0 40 "" 3 stor_address
4 SQLCHAR 0 20 "" 4 city
5 SQLCHAR 0 2  "" 5 state
6 SQLCHAR 0 5  "" 6 zip

The files for this example are on the book's CD as RearrangeFields.txt and RearrangeFields.fmt.

Extra Fields in the Source Text File

If the text file being used as the source for a Bulk Insert task has more fields than the destination, using a view is not an option. The easiest way to handle this situation is to create the extra fields in the destination table. If you don't want to do that, you can use a format file.

In this example, your source text file has the six fields for the Stores table but also has three extra fields (stor_type, stor_descript, and manager_name):

1111,John Doe,Eric the Read Books,788 Catamaugus Ave.,Seattle,WA,98056,discount,good books
2222,Dave Smith,Barnum's,567 Pasadena Ave.,Tustin,CA,92789,historical,better books
3333,Jane Doe,News & Brews,577 First St.,Los Gatos,CA,96745,current events,best books

You could follow these steps:

1. Create a temporary table that has the same structure as the source data file:

create table tmpStoresForBulkInsertExtraFields (
    [stor_id] [char] (4) NOT NULL,
    [manager_name] char(40) NULL,
    [stor_name] [varchar] (40) NULL,
    [stor_address] [varchar] (40) NULL,
    [city] [varchar] (20) NULL,
    [state] [char] (2) NULL,
    [zip] [varchar] (50) NULL,
    [stor_type] char(40) NULL,
    [stor_descript] char(40) NULL
)

2. Generate a format file using the temporary table as the destination for the Bulk Insert task.
Your generated format file will look like this:

7.0
9
1 SQLCHAR 0 4  ","    1 stor_id
2 SQLCHAR 0 40 ","    2 manager_name
3 SQLCHAR 0 40 ","    3 stor_name
4 SQLCHAR 0 40 ","    4 stor_address
5 SQLCHAR 0 20 ","    5 city
6 SQLCHAR 0 2  ","    6 state
7 SQLCHAR 0 5  ","    7 zip
8 SQLCHAR 0 40 ","    8 stor_type
9 SQLCHAR 0 40 "\r\n" 9 stor_descript

3. Renumber the destination column order to reflect the actual order of fields in the destination. Set the value to 0 for those fields that don't exist in the destination. When you're done, the format file should look like Listing 11.6.

LISTING 11.6 Adding Additional Fields with a Format File

7.0
9
1 SQLCHAR 0 4  "" 1 stor_id
2 SQLCHAR 0 40 "" 0 manager_name
3 SQLCHAR 0 40 "" 2 stor_name
4 SQLCHAR 0 40 "" 3 stor_address
5 SQLCHAR 0 20 "" 4 city
6 SQLCHAR 0 2  "" 5 state
7 SQLCHAR 0 5  "" 6 zip
8 SQLCHAR 0 40 "" 0 stor_type
9 SQLCHAR 0 40 "" 0 stor_descript

The files for this sample are on the book's CD as ExtraFieldsInSource.txt and ExtraFieldsInSource.fmt.

Other Properties of the Bulk Insert Task

The Bulk Insert task has many additional properties. Most of them can be set on the Options tab of the Bulk Insert Task Properties dialog, as shown in Figure 11.4.

FIGURE 11.4 Many settings on the Options tab of the Bulk Insert Task Properties dialog greatly affect performance.

The code sample at the end of this chapter shows how to set all these properties in Visual Basic code.

Check Constraints

When this option is selected, the data is checked for compliance with all constraints as it is added to the destination table.
By default, constraints are ignored when adding records with a Bulk Insert:

Default value: False
Effect on performance: Decreases performance when selected
Object property: CheckConstraints
Equivalent parameter of the Bulk Insert command: CHECK_CONSTRAINTS
Equivalent parameter of bcp: -h "CHECK_CONSTRAINTS"

You enable constraints in Transact-SQL code with the CHECK parameter of the ALTER TABLE statement, and you can disable them with the NOCHECK parameter. Selecting or not selecting this property implements identical behavior for the Bulk Insert task, although other data modifications taking place at the same time will still have those constraints enforced.

The Bulk Insert task runs more quickly if the constraints are not checked. You can create an Execute SQL task that checks for and processes any records entered into the table that violate the table's constraints. Set this Execute SQL task to take place upon the successful completion of the Bulk Insert task.

NOTE

Triggers are never fired during the Bulk Insert task. You can activate the triggers when using the other two bulk copy tools:

Bulk Insert command parameter: FIRE_TRIGGERS
bcp parameter: -h "FIRE_TRIGGERS"

If you want to check constraints and fire triggers after a Bulk Insert task, you can use the following command:

Update tblCustomer set PhoneNumber = PhoneNumber

This command does not modify any data, but it does cause all the table's constraints to be enforced and all the update triggers to fire. The command will fail if any record in the table violates one of the constraints. All the update triggers will be run by this command. If you make all your insert triggers update triggers as well, this code activates all the triggers that were missed during the Bulk Insert.
If any of the triggers fails to complete successfully, this update command will also fail. You need more complex code to clean up the data if it fails this constraint and trigger test.

Keep Nulls

Selecting this option causes null values to be inserted into the destination table wherever there are empty values in the source. The default behavior is to insert the values that have been defined as defaults in the destination table wherever there are empty fields.

Default value: False
Effect on performance: Improves performance when selected
Object property: KeepNulls
Equivalent parameter of the Bulk Insert command: KEEPNULLS
Equivalent parameter of bcp: -k

A Bulk Insert task that keeps nulls could run faster. You can create an Execute SQL task after the Bulk Insert that applies the table's defaults. Here is a SQL statement that puts the default value into all the PhoneNumber fields that have null values:

Update tblCustomer set PhoneNumber = Default where PhoneNumber Is Null

This strategy assumes that there are no records in the PhoneNumber field where you intentionally want to place a Null value.

Enable Identity Insert

This option allows the insertion of values into an Identity column in the destination table.

Default value: False
Effect on performance: Negligible
Object property: KeepIdentity
Equivalent parameter of the Bulk Insert command: KEEPIDENTITY
Equivalent parameter of bcp: -E

There are three possible ways to handle a Bulk Insert into a table that has an identity column:

• If you want to ignore the values for the identity column in the source data file, leave the default setting of False for this property. The table's identity column will be filled with automatically generated values, as in a normal record insert.
• If you want to keep the values for the identity column that are in your source data file, select this option. SQL Server sets the IDENTITY_INSERT option on for the Bulk Insert and writes the values from the text file into the table.

• If your text file does not have a field for the identity column, you must use a format file. This format file must indicate that the identity field is to be skipped when importing data. The table's identity column will be filled with the automatically generated values.

Table Lock

SQL Server has a special locking mechanism that is available for bulk inserts. Enable this mechanism either by selecting this property or by using sp_tableoption to set the "table lock on bulk load" option to True.

Default value: False
Effect on performance: Significantly improves performance when selected
Object property: TableLock
Equivalent parameter of the Bulk Insert command: TABLOCK
Equivalent parameter of bcp: -h "TABLOCK"

When this special locking mechanism is enabled, a bulk insert acquires a bulk update lock. This lock allows other bulk inserts to take place at the same time but prevents any other processes from accessing the table.

If this property is not selected and the "table lock on bulk load" option is set to False, the Bulk Insert will acquire individual record locks. This significantly reduces the speed of the Bulk Insert task.

Sorted Data

By default, the Bulk Insert task processes the records in the data file as if they were in no particular order. Setting this property improves the performance of a bulk insert if the following three requirements are met:

• A clustered index exists on the table.
• The data file is in the same order as that clustered index.
• The order specified by the SortedData property matches the ordering of the table's clustered index.

Default value: Not selected (an empty string for the property value)
Effect on performance: Improves performance when selected, but only if all the requirements for its proper use are met
Object property: SortedData, which holds the string specifying the sort order
Equivalent parameter of the Bulk Insert command: ORDER
Equivalent parameter of bcp: -h "ORDER (<ordering string>)"

If the table does not have a clustered index, or if an ordering other than the clustered index is specified, this property is ignored.

The ordering string is constructed with the same syntax as the ORDER BY clause of a SQL statement. If the ordering of customers were alphabetical by city and oldest to youngest within a city, the ordering string would be:

City, Age DESC

Code Page

This option specifies the code page that has been used for the data in the source file. This property affects the Bulk Insert only in cases where there are characters with values less than 32 or greater than 127.

Default value: OEM
Other possible values: ACP, RAW, specific code page number
Effect on performance: Usually none
Object property: CodePage
Equivalent parameter of the Bulk Insert command: CODEPAGE
Equivalent parameter of bcp: -C

Data File Type

There are two choices to make for this property: between char and native data types, and between regular character fields and Unicode character fields. If you have Unicode data, you must use widechar or widenative to bulk insert your data.

Char and widechar are used for inserting data from a file that has character fields. Native and widenative use a variety of data types in their fields. These native files must be created by bulk copying data out of SQL Server with bcp. If you are using text files to transfer data between two SQL Server databases, using native mode improves performance.

Default value: char

Here are all the possible values, with their constants:

Constant                               Value
DTSBulkInsert_DataFileType_Char        0
DTSBulkInsert_DataFileType_Native      1
DTSBulkInsert_DataFileType_WideChar    2
DTSBulkInsert_DataFileType_WideNative  3

Effect on performance: Using native and widenative improves performance when you're using a text file to transfer data from one SQL Server to another
Object property: DataFileType
Equivalent parameter of the Bulk Insert command: DATAFILETYPE
Equivalent parameters of bcp: -n for native, -w for wide character

Insert Commit Size

By default, all records are inserted into the destination table as a single transaction. This property allows for fewer records to be included in each transaction. If a failure takes place during

[...]

When to Use the Copy SQL Server Objects Task

The Copy SQL Server Objects task moves data and/or database objects between SQL Server databases that are version 7.0 or later. It generates Transact-SQL scripts, which it then uses to move the database objects.

Consider using this task if the following are true:

• You don't want to copy a whole database. (If you do, consider using the Transfer Databases task.) [...]
• You don't want to manipulate data as it is being moved. (If you do, you have to use one of the transformation tasks.)
• You want to move database objects, not just data. You can create database objects with Execute SQL tasks, but the Copy SQL Server Objects task gives you a highly efficient development environment for doing that.
• You're using SQL Server 7.0 or SQL Server 2000 as the source and the destination [...]
[...] tasks and use Transform Data tasks to move your data.

If you just want to move tables and their data, you should usually use the Transfer Data task rather than the Copy SQL Server Objects task.

You can choose to transfer all, a selected subset, or none of the following types of SQL Server objects with this task:

• Tables, with or without the data in them
• Indexes
[...]

Set integrated security if the user name is an empty string or null:

IF @sServerUserName = '' OR @sServerUserName IS NULL
    SET @lFlags = 256
ELSE
    SET @lFlags = 0

Load the package:

SET @sMethod = 'LoadFromSQLServer'
EXEC @hResult = sp_OAMethod @hPkg, @sMethod, NULL,
    @ServerName = @sServerName,
    @ServerUserName = @sServerUserName,
    @ServerPassword = @sServerPassword,
    @Flags = @lFlags,
    @PackageName = @sPackageName

Always [...]

IN THIS CHAPTER
[...] Parameter for the Rowset
• Dynamically Modifying the SQL Statement
• Using the Execute SQL Task to Execute a DTS Package from a Remote Server
• Creating an Execute SQL Task in Visual Basic

Microsoft has significantly improved and extended the value of the Execute SQL task in SQL Server 2000. You can now do the following:

• Use global variables [...]

[...] Execute SQL Task

The transformation tasks allow you to perform rapid row-by-row processing of your data. The Execute SQL task gives you the power of SQL-oriented set processing, which will usually be even faster. If you can write your data transformation as a SQL statement and you don't need to use special processing for individual rows, you can usually use an Execute SQL task. You can use the Execute SQL [...]
sp_OADestroy [...]

sql = "Select Top 5 * from " & DTSGlobalVariables("TableName")
cus.SQLStatement = sql

Listing 12.1 contains an example of code that executes a package with a Transform Data task on a particular database server. Information about the transformation is captured in the calling package by using Execute SQL task output parameters [...] other packages.

CHAPTER 13: The Copy SQL Server Objects Task

IN THIS CHAPTER
• When to Use the Copy SQL Server Objects Task
• The Source and the Destination
• Transfer Choices
• Other Properties of the Copy SQL Server Objects Task
• Using Methods to Include Objects in the Transfer
• Creating a Copy SQL Server Objects Task in Visual Basic

[...]

The Copy SQL Server Objects task does not support any other data sources or destinations.

TIP

Our testing indicates that the Copy SQL Server Objects task is much slower than the Transform Data task using Copy Column transformations and Fast Load. This appears to be true for both large and small tables. See Chapter 28, "High-Performance DTS Packages," for the results of our testing. We used the SQL Server [...]

[...] developing queries for a SQL Server database, I usually use the Query Analyzer to create my SQL statements. The Query Designer in SQL Server 2000 provides an [...]
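The sp_OAMethod and sp_OADestroy fragments above belong to the chapter's pattern for executing a DTS package from Transact-SQL via the OLE Automation stored procedures. The following is a minimal end-to-end sketch of that sequence, not the book's full listing: the server and package names are placeholder assumptions, and the per-call error checking a production procedure needs is omitted.

```sql
DECLARE @hResult int     -- HRESULT returned by each OLE Automation call
DECLARE @hPkg    int     -- object token for the DTS.Package instance
DECLARE @lFlags  int

-- Create the DTS Package COM object.
EXEC @hResult = sp_OACreate 'DTS.Package', @hPkg OUTPUT

-- Flag value 256 requests integrated security when no user name is supplied.
SET @lFlags = 256

-- Load the package from SQL Server storage (names are placeholders).
EXEC @hResult = sp_OAMethod @hPkg, 'LoadFromSQLServer', NULL,
    @ServerName = '(local)',
    @ServerUserName = '',
    @ServerPassword = '',
    @Flags = @lFlags,
    @PackageName = 'MyPackage'

-- Run the package.
EXEC @hResult = sp_OAMethod @hPkg, 'Execute'

-- Always destroy the object token when finished.
EXEC sp_OADestroy @hPkg
```

Running a sketch like this requires permission to use the OLE Automation stored procedures, which is typically restricted to members of the sysadmin role.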

Posted: 20/10/2013, 17:15
