From: Cliff on 6 Jan 2010 19:46

Hey all,

I was looking for some guidance. I need to do some stat analysis on a production audit table. One problem is that there are no indexes on the audit table, and spooling it via sqlplus would take years.

Using Export on the server, the disk space required for the dmp file is about 40 GB (per the DBA). This space is not available on the UNIX box.

Would running EXP locally be a valid option?

I realize that any solution is going to take a while of processing time; it's just that I don't want to get 100 hours in and then find a fatal flaw.

Thoughts?

TIA,
Cliff
From: Mladen Gogala on 6 Jan 2010 22:44

On Wed, 06 Jan 2010 19:46:45 -0500, Cliff wrote:
> Hey all,
>
> I was looking for some guidance. I need to do some stat analysis on a
> production audit table. One problem is that there are no indexes on the
> audit table and spooling via sqlplus would take years.
>
> Using Export on the server, the disk space required for the dmp file is
> about 40 G ( per the DBA ). This space is not available on the UNIX
> box.
>
> Would running EXP locally be a valid option?

Locally? Running EXP is illegal in Texas. What do you mean by locally?

> I realize that any solution is going to take a while of processing time;
> it's just that I don't want to get 100-hours in and then find a fatal
> flaw.

You can run exp on another machine and connect over a TCP connection. The client exp executable must have the same major version number as the database server, and even then there are problems: a 9.2.0.1 exp executable cannot export a 9.2.0.8 database. Also, be aware of the different NLS conventions on Winduhs and Unix.

--
http://mgogala.byethost5.com
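A minimal sketch of running exp from a client machine over SQL*Net, as described above. The TNS alias (PRODDB), credentials, table name, and paths are placeholders and would need to match your own environment:

    exp audit_owner/secret@PRODDB tables=AUDIT_TABLE \
        file=/local_disk/audit_table.dmp log=audit_table.log \
        buffer=10485760

The dump file is written on the machine where exp runs, so the ~40 GB lands on the client's disk rather than the server's; the trade-off is that every row crosses the network.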
From: Robert Klemme on 7 Jan 2010 04:57

On 01/07/2010 01:46 AM, Cliff wrote:
> Hey all,
>
> I was looking for some guidance. I need to do some stat analysis on a
> production audit table. One problem is that there are no indexes on the
> audit table and spooling via sqlplus would take years.
>
> Using Export on the server, the disk space required for the dmp file is
> about 40 G ( per the DBA ). This space is not available on the UNIX box.
>
> Would running EXP locally be a valid option?
>
> I realize that any solution is going to take a while of processing time;
> it's just that I don't want to get 100-hours in and then find a fatal flaw.
>
> Thoughts?

Another option besides the ones already mentioned is to temporarily mount a file system of appropriate size on the _server_ and use Data Pump to export the table there. Then unmount the file system, or do whatever you need to do as part of your analysis.

You can also use Data Pump to copy the table directly from one instance to another over a database link, which avoids the large export file altogether. Then you can create indexes and do whatever analysis you want on the other DB server without impacting the production system. There are, however, some caveats; off the top of my head, the default character sets of the two databases should be identical.

If you need to do this analysis repeatedly, then a materialized view refreshed manually via a DB link could help.

Kind regards

robert

--
remember.guy do |as, often| as.you_can - without end
http://blog.rubybestpractices.com/
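A rough sketch of the instance-to-instance copy described above, assuming a 10g-or-later target where Data Pump is available; the link name, credentials, directory object, and schema names are placeholders:

    -- on the analysis instance, as a suitably privileged user
    create database link prod_link
      connect to audit_owner identified by secret
      using 'PRODDB';

Then, from the analysis host:

    impdp analysis_user/secret network_link=prod_link \
          tables=audit_owner.audit_table \
          directory=data_pump_dir logfile=audit_copy.log

Because the rows travel over the database link, no dump file is written on either side, which sidesteps the 40 GB disk-space problem entirely.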
From: Jonathan Lewis on 7 Jan 2010 05:37

<Cliff> wrote in message news:2010010619464516807-@news.giganews.com...
> Hey all,
>
> I was looking for some guidance. I need to do some stat analysis on a
> production audit table. One problem is that there are no indexes on the
> audit table and spooling via sqlplus would take years.
>
> Using Export on the server, the disk space required for the dmp file is
> about 40 G ( per the DBA ). This space is not available on the UNIX box.
>
> Would running EXP locally be a valid option?
>
> I realize that any solution is going to take a while of processing time;
> it's just that I don't want to get 100-hours in and then find a fatal
> flaw.
>
> Thoughts?
>
> TIA,
> Cliff

How are you planning to do the analysis? If you're looking at exp then presumably you're thinking of using imp on another database. Have you considered the possibility of using transportable tablespaces, or do the restrictions make that option impossible?

If you're going to use exp "locally" - which in your case means at a client machine - remember that you're going to have to pump all that data across a network, and that may introduce a different set of problems and need some adjustments to your SQL*Net configuration before you begin.

If SQL*Net issues don't worry you, and you're going to be moving the data into another Oracle database, then how about:

    create table audit_clone nologging as
    select (required columns) from audit_table@remote;

If you're sticking with exp/imp, remember that you can add a "where clause" to exp, so if there's some easy way to break the table into discrete data sets you could export it in several pieces - you'd have to scan the entire table to generate each file, but if your critical limit is the amount of filesystem space you can acquire at any one time, it gives you an option for doing the job in pieces.

--
Regards
Jonathan Lewis
http://jonathanlewis.wordpress.com
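Picking up the "export it in several pieces" idea: exp accepts a QUERY parameter, which is easiest to supply through a parameter file to avoid shell-quoting problems. A sketch, assuming a hypothetical AUDIT_DATE column to split on; schema, file names, and date ranges are placeholders. Contents of a parameter file, say piece1.par:

    tables=audit_owner.audit_table
    query="where audit_date < to_date('01-JUL-2009','DD-MON-YYYY')"
    file=audit_2009_h1.dmp
    log=audit_2009_h1.log

Then run:

    exp audit_owner/secret@PRODDB parfile=piece1.par

Each run still scans the whole table, as noted above, but each dump file only has to hold the rows its predicate selects, so no single file needs the full 40 GB.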
From: Thomas Kellerer on 7 Jan 2010 05:41

Mladen Gogala, 07.01.2010 04:44:
> Running EXP is illegal in Texas.

How can a state law prohibit running a "normal" computer program?