From: Jeff on 6 Jun 2010 07:21

Hi,

I'm interested in learning more about tuning the import of a large dataset, for example from MS Access into MS SQL Server 2005, with over 60,000 rows to be inserted into the table.

I think using bulk insert is better than using a loop of INSERT statements. I would be grateful if some of you could post a link here so I can read more about the benefits of bulk insert compared to a regular insert. I'm also interested in other links about best practices for importing large datasets.

And how do I turn off foreign key constraint checking for an INSERT statement?

I've googled but haven't found any good articles about it yet, though I might be using the wrong search terms.
From: Erland Sommarskog on 6 Jun 2010 11:07

Jeff (it_consultant1(a)hotmail.com.NOSPAM) writes:
> I'm interested in learning more about tuning the import of a large
> dataset, for example from MS Access into MS SQL Server 2005, with over
> 60,000 rows to be inserted into the table.
>
> I think using bulk insert is better than using a loop of INSERT
> statements. I would be grateful if some of you could post a link here so
> I can read more about the benefits of bulk insert compared to a regular
> insert. I'm also interested in other links about best practices for
> importing large datasets.

Using a loop of INSERT statements is about the slowest way to insert many rows. The simplest way to insert the rows may be to use a linked server:

   INSERT tbl (...)
      SELECT ... FROM MYACCESSDB...tbl WHERE ...

That is, if you are successful in setting up the linked server, which is not always trivial. (And since I have not worked with Access, I cannot help there.)

Another way is to export the rows from Access to a text file, and then read that file with the BULK INSERT statement or the command-line tool BCP. Bulk inserts are usually the fastest way to load data into SQL Server.

Note: many people with less experience of SQL Server say "bulk insert" when they talk about inserting many rows with INSERT ... SELECT. But in SQL Server, "bulk insert" or "bulk copy" refers to a special process for importing many rows, and we don't refer to a regular INSERT ... SELECT as a bulk insert.

> And how do I turn off foreign key constraint checking for an INSERT
> statement?

Why would you? If there are issues with the data you import, you are better off importing the data into a staging table and then moving the usable rows to the real target table.

Nevertheless, you can disable constraints. The syntax is not very clean. To disable a constraint:

   ALTER TABLE tbl NOCHECK CONSTRAINT constr

Or use ALL instead of a constraint name to disable all FK and CHECK constraints on the table. To re-enable a constraint:

   ALTER TABLE tbl WITH CHECK CHECK CONSTRAINT constr

Without the extra WITH CHECK clause, the constraint is not validated against the existing data when it is re-enabled, which means that the optimizer will not trust it.

--
Erland Sommarskog, SQL Server MVP, esquel(a)sommarskog.se

Links for SQL Server Books Online:
SQL 2008: http://msdn.microsoft.com/en-us/sqlserver/cc514207.aspx
SQL 2005: http://msdn.microsoft.com/en-us/sqlserver/bb895970.aspx
SQL 2000: http://www.microsoft.com/sql/prodinfo/previousversions/books.mspx
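To make the staging-table route above concrete, here is a rough sketch of how the pieces could fit together. All of the table, column and constraint names, the file path and the delimiters are made-up placeholders, not anything from the thread:

   -- Load the file exported from Access into a staging table first
   BULK INSERT dbo.OrdersStaging
      FROM 'C:\export\orders.txt'
      WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\n', TABLOCK);

   -- Optionally disable one FK constraint on the target table
   ALTER TABLE dbo.Orders NOCHECK CONSTRAINT FK_Orders_Customers;

   -- Move the usable rows to the real table in one set-based statement,
   -- rather than a row-by-row loop
   INSERT dbo.Orders (OrderID, CustomerID, OrderDate)
      SELECT OrderID, CustomerID, OrderDate
      FROM dbo.OrdersStaging;

   -- Re-enable the constraint; WITH CHECK validates the existing rows
   -- so the optimizer can trust the constraint again
   ALTER TABLE dbo.Orders WITH CHECK CHECK CONSTRAINT FK_Orders_Customers;

If the data is known to be clean, the NOCHECK/CHECK pair can simply be dropped; a single INSERT ... SELECT of 60,000 rows against an enabled constraint is normally quick anyway.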
From: Dan Guzman on 6 Jun 2010 16:29

> I'm interested in learning more about tuning the import of a large
> dataset, for example from MS Access into MS SQL Server 2005, with over
> 60,000 rows to be inserted into the table.

I agree with Erland that an INSERT loop is about the slowest way to accomplish the task.

Another method you might consider is an SSIS Data Flow task, which can bulk insert data directly into the target table. If you are not familiar enough with SSIS to create the needed package from scratch, you can launch an import wizard directly from SSMS by right-clicking the target database node in Object Explorer and selecting Tasks --> Import Data.... This will allow you to select the source Access database, the target table and the column mappings. You will have the option to save the generated package and/or execute it immediately.

With only 60,000 rows, I would personally leave the existing constraints and indexes in place. I would expect this task to take seconds, not minutes.

--
Hope this helps.

Dan Guzman
SQL Server MVP
http://weblogs.sqlteam.com/dang/
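If you save the package the wizard generates, it can be re-run from the command line with the dtexec utility that ships with SQL Server 2005; the package path below is only an assumed example:

   dtexec /F "C:\Packages\ImportAccessOrders.dtsx"

The same command can also be scheduled as a SQL Server Agent job step if the import has to be repeated.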
From: Geoff Schaller on 6 Jun 2010 18:23

Jeff,

We use the SSIS wizard to build the import package. At least then you can re-execute the package as often as needed (during the trials and then for the final import). The difficulties are data type conversions and relations, and the error messages you get on import are particularly unhelpful. It can be traumatic trying to design keys during this process, so we found it more practical to use two steps:

1. Import the raw data with SSIS, ignoring relationships.
2. Design the keys in SSMS and build a key script (a rough sketch of this step follows below).

In every Access database we found data corruption, duplicate keys and table links that didn't make sense. It took a while to build the initial conversion package, but it was easier once that was done. As the other guys indicated, SSIS uses bulk insert internally.

The only alternative is to export all the data from Access to a text file yourself (take care picking the column/row delimiters) and then import it with BULK INSERT. But even this will require a lot of experimentation.

Geoff Schaller
Software Objectives
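A minimal sketch of what step 2's key script might look like, together with a quick check for the duplicate keys Geoff mentions finding in the Access data. The table, column and constraint names are placeholder assumptions:

   -- Find duplicate key values in the raw import before adding the primary key
   SELECT OrderID, COUNT(*) AS cnt
   FROM dbo.Orders
   GROUP BY OrderID
   HAVING COUNT(*) > 1;

   -- Once duplicates are resolved, apply the keys designed in SSMS
   ALTER TABLE dbo.Orders
      ADD CONSTRAINT PK_Orders PRIMARY KEY (OrderID);

   ALTER TABLE dbo.Orders
      ADD CONSTRAINT FK_Orders_Customers
      FOREIGN KEY (CustomerID) REFERENCES dbo.Customers (CustomerID);

Any rows flagged by the duplicate check have to be cleaned up or moved aside first, or the primary key (and likewise any foreign key with orphaned values) will fail to be created.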