From: Tim P on
Hi

I need to set up a daily routine where I collect a csv from an ftp site (it
will always have the same file name but is overwritten each day) and then bulk
import the csv. The csv contains current job vacancies from an offline
system, so old and non-current vacancies don't feature; thus I need to drop
the old data and import all the new. The offline system can open a web URL,
which could be used as the trigger to start the process (the web site that
will use the data is classic ASP), but equally I figured I could use SQL
Server Agent to run a stored SSIS package at a set time in the middle of the
night.

Basically the job is:
Drop table with yesterday's data
Create new table
Retrieve csv from ftp address
Bulk import new data

Question is, can I do this using only SQL Server 2008? If so, any clues as to
how would be greatly appreciated. I'm at the limit of my knowledge here!

Thanks in advance for any ideas.

Tim
From: John Bell on
On Wed, 30 Jun 2010 09:35:22 -0700, Tim P
<TimP(a)discussions.microsoft.com> wrote:

Hi Tim

You should not need to drop the table each time, as its structure
should remain constant; just removing the existing data should be all
that is necessary.

I would also do it in the following order, as you may want to leave
old data in the table if the ftp process fails.

Retrieve csv from ftp address
Truncate the table (drop old data)
Bulk import new data
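
For the last two steps, a minimal T-SQL sketch would look something like
this (the table name dbo.JobVacancies and the local path
C:\Data\vacancies.csv are only placeholders, substitute your own):

    -- Assumed names for illustration only
    TRUNCATE TABLE dbo.JobVacancies;      -- remove yesterday's rows, keep the structure

    BULK INSERT dbo.JobVacancies
    FROM 'C:\Data\vacancies.csv'          -- wherever the ftp step saved the file
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        FIRSTROW        = 2,              -- assumes a header row; remove if there isn't one
        TABLOCK
    );

TRUNCATE TABLE keeps the table definition and only removes the rows,
which is why you do not need to drop and re-create anything.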

You can easily create an SSIS job for the last two items by using the
Import/Export wizard in SQL Server Management Studio and saving it as
an SSIS package. Start the wizard by right-clicking the database in
SSMS and choosing Tasks and then Import Data from the menu. For the
source use a flat file data source; the destination database should be
OK if you started the wizard on the correct database. You may want to
change the table name or edit the mappings to change the data types of
the destination columns. Your first run will create the table. After
that you will need to change the transformation so that it does not
drop and re-create the table; you can import the package into an SSIS
solution in SQL Server Business Intelligence Development Studio (BIDS)
to make the changes. See
http://msdn.microsoft.com/en-us/library/ms173767.aspx
Once the changes have been made the package will need to be
re-deployed to the server.
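
Once it is on the server, a SQL Server Agent job can run the package at
a set time overnight, as you suggested. A rough sketch, assuming the
package was saved to the file system as C:\Packages\ImportVacancies.dtsx
(the job name, path and time are all placeholders):

    USE msdb;
    GO
    -- Create the job and a single step that runs the SSIS package
    EXEC dbo.sp_add_job     @job_name  = N'Nightly vacancy import';
    EXEC dbo.sp_add_jobstep @job_name  = N'Nightly vacancy import',
                            @step_name = N'Run SSIS package',
                            @subsystem = N'SSIS',
                            @command   = N'/FILE "C:\Packages\ImportVacancies.dtsx"';

    -- Run it every day at 02:00 (active_start_time is HHMMSS)
    EXEC dbo.sp_add_schedule    @schedule_name = N'Nightly 2am',
                                @freq_type = 4,       -- daily
                                @freq_interval = 1,
                                @active_start_time = 20000;
    EXEC dbo.sp_attach_schedule @job_name = N'Nightly vacancy import',
                                @schedule_name = N'Nightly 2am';
    EXEC dbo.sp_add_jobserver   @job_name = N'Nightly vacancy import';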

To cover the first step, you can add an FTP task to the package so the
file is downloaded before the import runs; see:
http://msdn.microsoft.com/en-us/library/ms137656.aspx
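
If you would rather keep everything in T-SQL instead of using the FTP
task, one alternative (only if xp_cmdshell is enabled, which has
security implications) is to drive the Windows ftp.exe client from a
script file. Server name, credentials and paths below are placeholders:

    -- Contents of C:\Data\ftpget.txt (a plain ftp.exe script, placeholder values):
    --   open ftp.example.com
    --   myuser
    --   mypassword
    --   binary
    --   get vacancies.csv C:\Data\vacancies.csv
    --   bye

    -- Run the script from T-SQL; requires xp_cmdshell to be enabled
    EXEC master.dbo.xp_cmdshell 'ftp -s:C:\Data\ftpget.txt';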

John