From: Varun on
Hi all,
I have a master source .xls file with about 10,000 rows and around 28 columns,
so the data size is large.

I also have a distribution list, also in Excel. It lists the various filters
that need to be applied, and for each filter the result must be saved under a
different file name. My VBScript code opens the distribution list, goes
through it row by row, opens the source file, applies the filter, saves the
file under the new name, and closes the source. For the next row the whole
process is repeated.

The problem is that because I open the source file again and again, it takes a
long time. Is there a way to open the source file once, then for each row
apply the filter, delete the non-matching rows, save under a new name, undo
the last two actions on the source, and apply the next filter?

Please help me design a solution for this.
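One way to sketch what I am after (file paths, sheet layout, and column
numbers below are just placeholders for my real files): open the master once,
and instead of deleting rows and undoing, use AutoFilter and copy only the
visible rows into a fresh workbook each time, so the master is never modified.

```vbscript
' Sketch only -- paths, sheet layout, and column positions are assumptions.
Option Explicit

Dim xl, wbSrc, wbDist, wsSrc, wsDist, wbNew
Dim lastRow, r, filterValue, newName

Set xl = CreateObject("Excel.Application")
xl.DisplayAlerts = False

' Open both files once, outside the loop.
Set wbSrc  = xl.Workbooks.Open("C:\data\master.xls")
Set wbDist = xl.Workbooks.Open("C:\data\distribution.xls")
Set wsSrc  = wbSrc.Worksheets(1)
Set wsDist = wbDist.Worksheets(1)

lastRow = wsDist.Cells(wsDist.Rows.Count, 1).End(-4162).Row  ' -4162 = xlUp

For r = 2 To lastRow                          ' assume row 1 holds headers
    filterValue = wsDist.Cells(r, 1).Value    ' assumed: filter value in col A
    newName     = wsDist.Cells(r, 2).Value    ' assumed: target name in col B

    ' Filter in place instead of deleting rows, so nothing needs undoing.
    wsSrc.UsedRange.AutoFilter 1, filterValue ' assumed: filter on column 1

    ' Copy only the visible (matching) rows into a new workbook and save it.
    Set wbNew = xl.Workbooks.Add
    wsSrc.UsedRange.SpecialCells(12).Copy wbNew.Worksheets(1).Range("A1")  ' 12 = xlCellTypeVisible
    wbNew.SaveAs "C:\data\out\" & newName & ".xls"
    wbNew.Close False

    wsSrc.AutoFilterMode = False  ' clear the filter; master stays untouched
Next

wbDist.Close False
wbSrc.Close False
xl.Quit
```

This avoids the repeated open/close of the 10,000-row source entirely, since
each iteration only touches the filter and a small new workbook.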

From: Al Dunbar on


"Varun" <Varun(a)discussions.microsoft.com> wrote in message
news:B0E2A1CB-9863-4723-AAF8-76DD464B970A(a)microsoft.com...
> [...]

If you were to include some of your script perhaps your meaning would be
clearer.

/Al


From: Davo on
http://www.4guysfromrolla.com/webtech/010401-1.shtml
Thanks to the advice above, I have rewritten a few scripts to read a data file into an array first and then work from the array. This speeds things up noticeably for larger files (1000+ lines). It definitely speeds up searches, less so file writes (presumably because it works from RAM instead of the hard disk).
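The read-once-into-an-array pattern described above looks roughly like this
(the file name and the search term are placeholders):

```vbscript
' Sketch of reading a whole file into an array once, then searching in memory.
Option Explicit

Dim fso, ts, lines, i, hits
Set fso = CreateObject("Scripting.FileSystemObject")

' One disk read: pull the whole file into memory, then split into an array.
Set ts = fso.OpenTextFile("C:\data\big.txt", 1)   ' 1 = ForReading
lines = Split(ts.ReadAll, vbCrLf)
ts.Close

' All subsequent searches run against RAM, not the hard disk.
hits = 0
For i = 0 To UBound(lines)
    If InStr(1, lines(i), "searchterm", vbTextCompare) > 0 Then
        hits = hits + 1
    End If
Next

WScript.Echo hits & " matching lines"
```

Writes do not gain as much because the output still has to go back through
the disk at some point; only the repeated reads are eliminated.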

From: ekrengel on
On Dec 10 2009, 12:04 am, Varun <Va...(a)discussions.microsoft.com>
wrote:
> [...]

Yes, please post what you have, so we can help you further...