From: morris on
I'm asking on behalf of a network user in our group who runs SAS jobs on a
Unix server (Solaris, /bin/csh shell).

Apparently the connection is via an SSH terminal and the jobs are submitted in the
format
<command> <filename>

e.g. sas job1.sas

The jobs write no output to the user's home folder or any other location; all dataset
references are in the format work.tempNN

e.g. proc sql; create table work.temp1 as .....

or data temp1; set temp0;

At the end of the job, all output is written directly via FTP to a mainframe location.

OK, now the question:

Why would the user have only a couple of megabytes of files in the home folder, yet the
system vxquota reports

Disk quotas for xxxxx (uid 1234):
Filesystem        usage   quota   limit  timeleft  files  quota  limit  timeleft
/xx/home     2147483647  200000  250000               -1      0      0

We don't know where that massive storage exists, and the SAS job itself does not create
anything permanent.
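One thing worth checking: disk quotas are accounted per user per filesystem, not per directory, so files owned by that uid anywhere on the /xx/home filesystem count against the quota even if they sit outside the home directory. A rough sketch of the comparison (the mount point and uid on the real server are assumptions; the demo below uses a scratch directory so it can run anywhere):

```shell
# On the real server you would run something like (path/uid assumed):
#   find /xx/home -xdev -user 1234 -type f -exec du -k {} + \
#     | awk '{s+=$1} END {print s " KB owned by uid on this filesystem"}'

FS=$(mktemp -d)                       # stand-in for the mount point
mkdir -p "$FS/home/user" "$FS/tmp"
dd if=/dev/zero of="$FS/home/user/small.dat" bs=1024 count=10  2>/dev/null
dd if=/dev/zero of="$FS/tmp/hidden.dat"      bs=1024 count=500 2>/dev/null

# What the home directory alone shows (what an ls in $HOME sees):
du -sk "$FS/home/user"

# What actually counts toward the quota: everything on the filesystem
# owned by the user (here: everything under $FS, including /tmp).
find "$FS" -type f -exec du -k {} + | awk '{s+=$1} END {print s " KB total"}'
```

The point of the demo: the per-directory view and the per-filesystem view can differ by a large margin, which is exactly the symptom described.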



From: Patrick on
Hi

This is a question best asked of your SAS Admin, or, if you are the SAS
Admin, then ask SAS Tech Support.

I assume that <command> <filename> means <command> is a shell script
and <filename> is a text file containing a SAS script.

There will be a SAS log. The location of this log is either defined in
the <command> script or is the default location defined as part of
the installation (and documented in the SAS installation manual).

There are also SAS Server logs. Your SAS Admin should know where they
are.

I've worked in an environment like the one you describe, but nothing got
stored in the $HOME directory unless someone scripted it. But who
knows - maybe someone has pointed the SAS WORK directory to $HOME. SAS
WORK tables would then be stored there while the SAS batch job runs
and be destroyed at the end of the job.
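If you want to check that theory, the WORK path is set with the -work option in the SAS config file (commonly sasv9.cfg; the exact install path varies, so treat any path as an assumption), or from inside a session with proc options option=work; run;. A sketch against a mock config file:

```shell
# On a real install you might check the config, e.g. (path assumed):
#   grep -i '^-work' /usr/local/SAS/sasv9.cfg
# or, inside a SAS session:  proc options option=work; run;

# Demo with a mock config file:
CFG=$(mktemp)
cat > "$CFG" <<'EOF'
-memsize 2G
-work /saswork
-fullstimer
EOF

# Pull out where WORK points:
awk '$1 == "-work" {print "WORK points to: " $2}' "$CFG"
```

If that path turns out to be under /xx/home, Patrick's theory would explain the quota usage while the jobs run.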

HTH
Patrick
From: Patrick on
...and what I forgot: most likely the command will actually be

nohup <command> <filename> &

There will then be a nohup.out in the directory the user started the
script from, which might also give some information.
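Since nohup.out lands in whatever directory the job was started from, it is worth searching beyond $HOME for it. A sketch (the real invocation against /xx/home is shown as a comment with assumed path and uid; the demo uses a scratch directory):

```shell
# On the real server (path and uid are assumptions):
#   find /xx/home -xdev -name nohup.out -user 1234 -ls

# Demo: create a nohup.out in a nested scratch directory and locate it.
D=$(mktemp -d)
mkdir -p "$D/project/run1"
echo "SAS log output" > "$D/project/run1/nohup.out"
find "$D" -name nohup.out -type f
```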
From: morris on
Patrick wrote:
> I assume that <command> <filename> means <command> is a shell script
> and <filename> is a text file containing sas script.

no, incorrect.

As I wrote, all jobs are executed manually via an SSH terminal in the format

sas job1.sas

Nothing is running automatically. The user logs on, opens an SSH terminal, and types
the command.

There are no scripts being used, unless the command "sas" before the filename
(job1.sas) is itself a script.


> There will be a SAS log. The location of this log is either defined in
> this <command> script or is the default location as defined as part of
> the installation (and documented in the SAS installation manual).

Of course. The SAS logs show PROC SQL creating a work.temp1 and then a DATA _NULL_
step with an FTP of work.temp1.

All jobs complete normally and NOTHING is written into permanent SAS libraries other
than WORK.

So, unless WORK is stored permanently (which makes no sense!), the user does not write
SAS libraries and does not store any work files on Unix other than the short piece of
code in "job1.sas", which is just a text file with a PROC SQL step, a DATA _NULL_
step, and the FILE statement that sends the output to a Windows server (again,
nothing is stored or written to the Unix server).
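One classic way WORK space does linger: each SAS session on Unix creates a scratch directory under the WORK path (named along the lines of SAS_work..._<host>) and deletes it on a clean exit, but sessions that are killed can leave orphans behind; SAS ships a cleanwork utility to remove them. If WORK sits on the same filesystem as the home directories, those orphans would count against the user's quota. A sketch of spotting leftovers (paths and names are assumptions; the demo fakes the directories):

```shell
# On a real install you might run (path assumed):
#   du -sk /saswork/SAS_work* | sort -rn
# and then clean orphans with SAS's cleanwork utility.

WORKROOT=$(mktemp -d)                        # stand-in for the WORK path
mkdir -p "$WORKROOT/SAS_workA1B2C3D4_host1"  # pretend: orphaned session
mkdir -p "$WORKROOT/SAS_workE5F6A7B8_host1"
dd if=/dev/zero of="$WORKROOT/SAS_workA1B2C3D4_host1/temp1.sas7bdat" \
   bs=1024 count=200 2>/dev/null

# List leftover work directories with their sizes, largest first:
du -sk "$WORKROOT"/SAS_work* | sort -rn
```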

> and be destroyed at the end of the job.

That is the problem. The user has only a few megabytes of files, yet the vxquota
output reports a usage figure in the gigabytes.
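For what it's worth, 2147483647 is exactly 2^31 - 1, the largest 32-bit signed integer, which can be a sign of an overflowed or corrupted usage counter in the quota records rather than real usage; rebuilding the quota records for that filesystem may be the actual fix. The arithmetic, plus an independent way to total the user's files (the real find command's path and uid are assumptions):

```shell
# Independent check on the real server (path and uid assumed):
#   find /xx/home -xdev -user 1234 -type f -exec du -k {} + \
#     | awk '{s+=$1} END {print s " KB actually owned"}'

# The arithmetic on the reported figure:
awk 'BEGIN {
  max32 = 2^31 - 1
  print "2^31 - 1 =", max32
  # vxquota usage is in KB, so taken literally this would be:
  printf "as KB that would be %.1f TB\n", max32 / 1024 / 1024 / 1024
}'
```

If the independent total comes back tiny while the counter sits at exactly 2^31 - 1, that points at the counter, not at hidden files.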



From: morris on
> There will be a nohup.out in the directory where the user started the
> script from which also might give some information.

Incorrect.

The user's home directory contains no such file; I just looked.

Thanks for the comments.