IRIS2 Data Reduction - Troubleshooting

ORAC-DR is reporting that it cannot find a suitable calibration file

  • FLAT: IRIS2 flat-field calibration files need to have the same filter (and for spectroscopy, the same readout method) as the observations being calibrated.
  • DARK: IRIS2 dark calibration files need to have the same exposure time (as recorded in the FITS keyword EXPOSED) and number of cycles as the observations being calibrated.
  • READNOISE: A readnoise value needs to be calculated at the beginning of each night, from the Array_Tests sequence.
  • STANDARD: A standard star calibration observation needs to be taken in the same filter, and have the same number of axes, as the observations being calibrated.

The methods of remedying these problems depend on which calibration information is missing:

  • FLAT, DARK: Reduce a suitable flat-field or dark frame taken later in the night (see the example after this list, and the information below on reducing observations out of turn), then re-reduce the observations in question.
  • READNOISE: Reduce an AAO_Array_Tests sequence, then re-reduce the observations.
  • STANDARD: Reduce a suitable standard star observation, then re-reduce the observations in question.
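
For example (the observation numbers here are placeholders, and REDUCE_DARK is the recipe mentioned below), to reduce a dark taken later in the night as observation 45 and then re-reduce observations 71 to 79, you might type:

oracdr -list 45 REDUCE_DARK
oracdr -list 71:79

Omitting the recipe name from the second command should let ORAC-DR pick each observation's usual recipe from its headers.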

It is possible to force ORAC-DR to re-use calibration files from a previous night. For instance, the spectroscopic flats are reproducible enough that only one set should be required per run. To re-use flats from 26 Nov 2004 for 28 Nov 2004:
cd /iris2_reduce/iris2red/041128
cp ../041126/flat_* .
cp ../041126/index.flat .

ORAC-DR is 'hanging' - not reducing data, not responding - or keeps crashing on start-up.

While it's possible that ORAC-DR has truly hung, it's much more likely that it is doing CPU-intensive calculations. These typically come at the end of a sequence, when ORAC-DR is creating the final mosaic. If you wish to see what ORAC-DR is doing during these moments, start ORAC-DR with the -verbose option (in addition to the other command-line options). Also check the "Warnings" and "Errors" sections of the ORAC-DR GUI for any messages about missing files, recipes or primitives (the font colours can be hard to see).
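
For example (the starting observation number is just a placeholder; the other options are the ones used elsewhere on this page):

oracdr -from 144 -loop wait -verbose &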

If you wish to run ORAC-DR in a less CPU-intensive mode, try running with the _BASIC recipes. These recipes are designed to perform less intensive calculations, specifically in the mosaicking steps, where registration, object detection and matching are done. However, with fast PCs running Linux, ORAC-DR is able to keep up with the data rate from most programs using the full-fledged recipes.

If ORAC-DR won't start up at all, chances are that some rogue ORAC-DR/Starlink processes are still running in the background. The simplest way to eliminate these is with the all-powerful oracdr_nuke command.
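
For example:

oracdr_nuke

then restart ORAC-DR as you normally would.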

If ORAC-DR hangs soon after startup at the "Orac says: Pre-starting mandatory monoliths..." stage, then check that your /etc/hosts file contains the following line:

127.0.0.1 aaa.bbb.ccc localhost localhost.localdomain

where aaa.bbb.ccc is the full domain name of your computer, e.g. "lapsdr.aao.gov.au". Without this line, ORAC-DR cannot connect to GAIA.
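
For the example machine above, the line would therefore read:

127.0.0.1 lapsdr.aao.gov.au localhost localhost.localdomain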

JITTER_SELF_FLAT is too slow, I want to run JITTER_SELF_FLAT_BASIC

To re-reduce a set of observations, simply stop ORAC-DR (see above), then restart by adding the recipe name to the command-line options. For example, if you wish to reduce observations 71 to 79 with JITTER_SELF_FLAT_BASIC instead of JITTER_SELF_FLAT, type:

oracdr -list 71:79 JITTER_SELF_FLAT_BASIC

(similarly for any other recipe with a _BASIC variant). This command tells ORAC-DR to reduce only observations 71 through 79 (inclusive) with the JITTER_SELF_FLAT_BASIC recipe. Please note that all observations you specify will be reduced with the given recipe. If you had mistakenly included a dark frame in the above example (suppose you typed 70:79), it would not be reduced with the REDUCE_DARK recipe. Generally this is not a problem, as groups will still be properly reduced based on header keywords.

Also, you do not necessarily need to stop ORAC-DR. It is possible to start a parallel ORAC-DR session by typing the first startup command in a new xterm, then running ORAC-DR as described above. This allows real-time data reduction to continue while you run different recipes on pre-existing data. It will probably be slower than running a single instance of ORAC-DR, as both sessions have to share the same CPU. On a dual-processor machine (like aatssx) this may not be a problem.
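
As a sketch (assuming the data are from 28 Nov 2004, and that the oracdr_iris2 startup command takes the UT date in the form shown; adjust both to suit your data), the new xterm session might look like:

oracdr_iris2 20041128
oracdr -list 71:79 JITTER_SELF_FLAT_BASIC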

I aborted an observation and now ORAC-DR is waiting for the file to appear (or else ORAC-DR fell over and needs to be restarted)

Unfortunately, after aborting an IRIS2 run, the run number does not get reset, and there will be a missing file number. Simply stop ORAC-DR and restart it. For example, if you aborted run 143 and the next available file is 144, type

oracdr -from 144 -loop wait &

This will start ORAC-DR on observation 144 and continue as normal.

GAIA displays an image that is skewed

This problem was noticed during an IRIS2 commissioning run, and no solution is apparent. The data file itself is fine; GAIA simply has an undiagnosed problem displaying it.

ORAC-DR complains, or aborts when converting files to FITS

The usual error message is something like:

Err:!! No input files found matching the supplied specification. 
Err:!  FITS2NDF: Error converting a FITS file into an NDF. 
Err:!! SAI__ERROR: Error 
Err:Error creating symlink from ORAC_DATA_OUT to /data/sse/2/sdr/odi// 
Err: 
Exiting...

Chances are, the problem is actually that the wrong UT date for the files was given at startup. Re-issue the oracdr_iris2 command with the right UT date.
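
For example, if the data were actually taken on the UT date of 28 November 2004 (the date format shown is an assumption; use whatever form your local startup instructions specify):

oracdr_iris2 20041128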

Alternatively, you may get a warning like:

#335 Err: !! Command interpreter invoked by a call to "system" to execute an external
#335 Err: !     command returned an error status of 256.
#335 Err: !  Command was: $CONVERT_DIR/convertndf from  'TEXT' '/home/sdr/odo/'
#335 Err: !     'map_dist' '.txt' ' ' '/home/sdr/odo/t6054.TEMP_1.NDF_1'
#335 Err: !  NDF_EXIST: Unable to associate an NDF structurewith the 'MAP1' parameter.
#335 Err: !! Aborted attempt to associate an existing NDF structure with the 'MAP1'
#335 Err: !     parameter.
#335 Err: !  Error creating a new CmpMap.
#335 Err: !! PAR__ABORT: Parameter request aborted
#335 Err: Error in obeyw to monolith atools_mon (task=astcmpmap): 146703171
#335 Err: Recipe completed with error status = 146703171
#335 Err: Continuing but this may cause problems during group processing

but it may continue to process data, or it may crash altogether. This can happen if you have executed other Starlink tasks (e.g. from the CONVERT or KAPPA packages) from the terminal window where you started ORAC-DR. You will need to restart ORAC-DR from a fresh login.

ORAC-DR isn't doing what I expected, since I messed up one of the FITS keywords. How do I rectify this?

The solution is outlined in the Correcting Headers section of SUN/232. Since ORAC-DR only ever operates on the NDF copies of the data, the solution is to pass all the affected images through ORAC-DR first, using just the QUICK_LOOK recipe. This leaves NDF copies of all the input images in ORAC_DATA_OUT, where their headers can be corrected using the KAPPA fitsmod task. Some of the more important keywords can be modified directly as described above.
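
As a hedged sketch (the observation range, file name and keyword below are placeholders, and the exact fitsmod parameters should be checked against the KAPPA documentation), you might first run

oracdr -list 71:79 QUICK_LOOK

and then, in a separate terminal window with Starlink initialised (see the warning above about running Starlink tasks in the window where ORAC-DR was started), correct the offending keyword in the NDF copy left in ORAC_DATA_OUT:

kappa
fitsmod myframe edit=update keyword=EXPOSED value=10.0 comment=\$C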
