Submitting a job as a different user

We are all familiar with the SBMJOB command, but it’s rare that anyone looks beyond the first few parameters. Until you need to.

One overlooked parameter is USER which allows you to specify the name of the user profile for the job being submitted. Obviously, you need to have enough authority to use the profile in question, but if you do, the submitted job will then run with all the rights of that user. Doing this can also neatly solve potential object ownership issues.

Here’s an example. I need to duplicate a logical file in the production environment. The CRTDUPOBJ command will do this for me, but that leaves me as the owner of the new object, which can cause several object authority issues down the line.

Fortunately, I can do this:
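Something along these lines (the object and library names here are illustrative, not from a real system):

    sbmjob cmd(crtdupobj obj(ORDERSL1) fromlib(PRODLIB) objtype(*FILE) +
               tolib(PRODLIB) newobj(ORDERSL2)) +
           user(QPGMR)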


The CRTDUPOBJ command is executed under the QPGMR profile and the resulting object is owned by QPGMR.

SBMJOB is a remarkably flexible command and it’s well worth poking around the parameters to see just how much it is capable of.

Copying data between heterogeneous databases with CPYFRMIMPF

Last week I mentioned the CPYTOIMPF command and how easy it makes reformatting data so that it can be handled by other platforms. Not surprisingly, the IBM i makes it just as easy to handle data coming back with the CPYFRMIMPF command.

Using the same files as last time, the command looks like this:


This time, I am copying data from the MarketingReport.csv CSV file into a previously defined (and it must be previously defined) table, MRKRPT. I’m replacing the existing data and using the local and Windows conventions for the record and field delimiters.

All of this means that, regardless of where a set of data originated, making it available to applications on the IBM i really is a piece of cake.

Copying data between heterogeneous databases with CPYTOIMPF

One of the many things that the IBM i (and its predecessors) has always been good at is exchanging data with other platforms. And one thing that surprises me is that people often don’t realise just how easy sharing data can be.

How easy? This easy:
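Reconstructing the command from the description below (the path is illustrative):

    cpytoimpf fromfile(MRKRPT01) +
              tostmf('/home/PAUL/MarketingReport.csv') +
              mbropt(*REPLACE) rcddlm(*CRLF) +
              dtafmt(*DLM) flddlm(';') rmvblank(*TRAILING)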


In this example, I am copying a physical file (table) called MRKRPT01 to a CSV file called MarketingReport.csv in the IFS. I’m using MBROPT(*REPLACE) to ensure the new file overwrites the existing one, and RCDDLM(*CRLF) appends a Windows-standard carriage return and line feed to the end of each line.

Then things become interesting.

DTAFMT(*DLM) indicates that the data contains delimiter characters, just like a CSV file should. And FLDDLM(';') indicates what the delimiter character should be. By default, this is a comma but I’m in Belgium where standards are different, so I’m using a semi-colon instead.

The RMVBLANK(*TRAILING) removes the trailing blanks from the character fields, which ensures the output is as tidy as it looks.

I now have a perfectly formatted CSV file which can be distributed by FTP, SMTP or by sharing the folder so that end users can find the file themselves. And if you’re using QNTC you can change the TOSTMF parameter to something like ‘/QNTC/path/to/server/folder/file.csv’ in order to put the file wherever you like on the network.

Better file handling in CL with the CLOSE command

CL is an often undervalued part of the i developer’s toolkit, and one that IBM have been quietly enhancing over the years. It does have some quirks, however, the most annoying being that if you are reading a file, once you reach the end there is no way of repositioning the pointer and reading it again.

Actually, there is.

The command you need to know about is the Close Database File (CLOSE) command:

The Close Database File (CLOSE) command will explicitly close a database file that was implicitly opened by a Receive File (RCVF) command run within the same CL program or ILE CL procedure. A file with the same open identifier must have been declared by a Declare File (DCLF) command within the same CL program or ILE CL procedure.

After the CLOSE command is run, the next RCVF command for this file will implicitly reopen the file and read a record from the file.

That last line is the really important piece. After you close the file, the next RCVF command will re-open it and your file pointer is back at the first record.

Here’s an example:

    dcl &FROMFILE *char 50
    dcl &TOMBR *char 50 value('/qsys.lib/LSCLIB.lib/IMPORTWORK.file/IMPORTWORK.mbr')
    dcl &QSH *char 50
    dclf file(FTPFILES) opnid(LIST)

    /* Merge Original FTP data into Reload File                               */
    ovrdbf file(INPUT) tofile(LSCLIB/QFTPSRC) mbr(GETFULL)
    ovrdbf FILE(OUTPUT) tofile(LSCLIB/QFTPSRC) mbr(FTPLOG)
    ftp ''
    dltovr OUTPUT
    dltovr INPUT
    callsubr MERGEDATA

    /* Merge correction FTP data into Reload File                             */
    ovrdbf file(INPUT) tofile(LSCLIB/QFTPSRC) mbr(GETCORR)
    ovrdbf FILE(OUTPUT) tofile(LSCLIB/QFTPSRC) mbr(FTPLOG)
    ftp ''
    dltovr OUTPUT
    dltovr INPUT
    callsubr MERGEDATA

/* Merge retrieved FTP records with missing data into Reload File             */

    /* List the retrieved files                                               */
    qsh cmd('ls /home/PAUL/FTPIN/ > /home/PAUL/SentFiles.txt')
    clrpfm FTPFILES
    cpyfrmstmf fromstmf('/home/PAUL/SentFiles.txt') +
               tombr('/qsys.lib/LSCLIB.lib/FTPFILES.file/FTPFILES.mbr') +
               mbropt(*REPLACE)

    /* Then use this list to read through and import the files                */
    dowhile '1'
        rcvf opnid(LIST)
        monmsg msgid(CPF0864) exec(leave)

        /* Copy and process the stream file                                   */
        chgvar &FROMFILE value('/home/PAUL/FTPIN/' *tcat &LIST_FTPFILES)
        cpyfrmstmf fromstmf(&FROMFILE) tombr(&TOMBR) mbropt(*REPLACE)
        cpyf fromfile(LSCLIB/IMPORTWORK) tofile(LSCLIB/IMPORT) mbropt(*REPLACE)
        call LSCMRG001R parm(&LIST_FTPFILES)

        /* Delete the stream file once we've finished with it                 */
        chgvar &QSH value('rm /home/PAUL/FTPIN/' *tcat &LIST_FTPFILES)
        qsh cmd(&QSH)
    enddo

    /* Clean up and return                                                    */
    close opnid(LIST)
    qsh cmd('rm /home/PAUL/SentFiles.txt')



What I am doing here is retrieving a set of files from a customer’s FTP server, and then retrieving the corrections from the same server. After retrieving the files, I execute the MERGEDATA subroutine. This builds a list of the retrieved files and copies it to the FTPFILES physical file.

The DOWHILE loop copies each stream file into a physical file (IMPORT) and then calls program LSCMRG001R to merge the retrieved data into the enterprise application.

Note that once I leave the loop, I close the file. This means that I can reuse the MERGEDATA subroutine as many times as I need to, for whatever selection of files I happen to retrieve.

Better impact analysis with DSPPGMREF

The IBM i Control Language (CL) provides such a rich set of commands that even after almost 30 years, I can still discover something really useful. DSPPGMREF is a case in point:

The Display Program References (DSPPGMREF) command provides a list of the system objects referred to by the specified programs.

Usefully, I can select all programs in a library and direct the output to a database file:
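For example (the output file name, PGMREFS, is an illustrative choice):

    dsppgmref pgm(LSCLIB/*ALL) output(*OUTFILE) +
              outfile(LSCLIB/PGMREFS)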


This gives me a table showing every object referenced by every program in library LSCLIB which means that identifying the potential impact of (for example) a change to a table is only a simple query away.

The program or programs to search can be *ALL, as above, or a single program name or a generic name. And if I use this:


I’ve captured everything.

Using CPYSPLF to extract spool files via the IFS

I am currently without iSeries Navigator (or IACS, for that matter). I have a 5250 emulator, of course, but nothing else, which makes for an interesting challenge when I want to email a spool file to someone.

Fortunately, there’s CPYSPLF:

The Copy Spooled File (CPYSPLF) command copies the data records in the specified spooled file either to a user-defined physical database file or to a stream file. This allows the use of spooled files in applications using microfiche, data communications, or data processing.

It’s the stream file that makes it handy and allows me to do something like this:
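A sketch of the command (the job and spooled file names are illustrative, and the TOFILE(*TOSTMF) support needs a reasonably recent release of IBM i):

    cpysplf file(QSYSPRT) tofile(*TOSTMF) +
            job(123456/PAUL/REPORTJOB) splnbr(*LAST) +
            tostmf('/home/PAUL/OutputTextFile.txt')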


And now I have a text file, called OutputTextFile.txt, sitting in my home folder in the IFS. And that is just an FTP GET away.

Adding variables to ad-hoc SQL queries with CL

Last month I mentioned using REXX to build and execute ad-hoc SQL queries. For the sake of completeness, here is the same solution implemented in CL using the RUNSQL command.

The program does exactly the same as the previously described REXX script — extracts all records flagged with today’s date so that they can be reviewed later. And here it is:

dcl &date6 *char 6
dcl &date8 *char 8
dcl &statement *char 255

/* First retrieve the current system date (QDATE)                             */
/* and convert it to an 8 character value                                     */
rtvsysval sysval(QDATE) rtnvar(&date6)
cvtdat date(&date6) tovar(&date8) fromfmt(*sysval) tofmt(*yymd) tosep(*none)

/* Next, build the SQL statement                                              */
chgvar var(&statement) +
       VALUE('insert into EXPATPAUL1/INPUTDATA' *bcat +
             'select * from EXPATPAUL1/PRODDATA where dhdate =' *bcat +
             &date8)
runsql sql(&statement) commit(*none)


Using RGZPFM to sort physical files

It’s rare that you need to sort a database file (or table), but I have encountered a circumstance in which a sorted file had to be delivered. Being naturally lazy, I started looking around for the least-work method of achieving this and ended up looking at the RGZPFM command. This is a command I have used quite often to compress a physical file (it removes the deleted records), but this time around it was the reorganise part of Reorganise Physical File Member that I was interested in.

From the help text:

If a keyed file is identified in the Key file (KEYFILE) parameter, the system reorganizes the member by changing the physical sequence of the records in storage to either match the keyed sequence of the physical file member’s access path, or to match the access path of a logical file member that is defined over the physical file.

The important thing here is that any keyed file can be specified. So if I have an appropriately keyed logical file, I can very easily sort the physical.

Some names have been changed, but if I have a physical file called ITEMLIST and I want it sorted by field BARCODE, I need to first create an index:

create index ITEMLIST01          
on ITEMLIST (BARCODE);           
label on index ITEMLIST01        
is 'Logical on ITEMLIST (Keyed: BARCODE)'; 

Then I can easily reorder the physical file ITEMLIST into BARCODE sequence with the command:
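Assuming the files live in a library called LSCLIB (an illustrative name), the command would look something like:

    rgzpfm file(LSCLIB/ITEMLIST) keyfile(LSCLIB/ITEMLIST01 ITEMLIST01)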


The only thing to take note of is that RGZPFM will issue an error if the file has no records. The approach I took was to use RTVMBRD to retrieve the NBRCURRCD for the member; an alternative approach would be to use a MONMSG immediately after the RGZPFM.

Using Qshell and CL to work with Stream files in the IFS

It turns out that there is no simple way of generating a list of files currently residing in a folder on the IBM i IFS. Simple, in this case, would be a command like DSPLNK OUTPUT(*FILE). An API does exist, but the same results can be achieved both quickly and simply if you are willing to spend a bit of time in Qshell.

This post is a simplified reworking of an issue I encountered some time ago, but it’s worth documenting.

The issue is that I have a number of files being sent, via FTP, to a folder in the IFS. I need to be able to copy these files into an import file for subsequent processing and then archive them. The problem, of course, is that I don’t know what the file names will be beforehand.

Here’s a solution:

First create an extract file in QTEMP. You will need this in order to compile the program. If you are going to submit the compile in batch, you will need to create the file in some other library:

crtpf file(QTEMP/EXTRACTP) rcdlen(20)

And then the CL program:

dcl &fromfile *char 50                                                          
dcl &tombr *char 50 value('/qsys.lib/LSCLIB.lib/IMPORTP.file/IMPORTP.mbr')  
dcl &dltfile *char 100                                                          
dcl &cpyfile *char 100                                                          
dclf EXTRACTP                                                                   
/* First, I use QShell to list the files in the Extract folder                */
/* The output of this is redirected to ExtractedFiles.txt.                    */
qsh cmd('ls /home/PAUL/Extract/ > /home/PAUL/ExtractedFiles.txt')     
/* In order to use this information, create the extract file in qtemp and     */
/* copy the ExtractedFiles.txt stream file into it.                           */
/* If the file already exists, the MONMSG clears it.                          */
crtpf file(QTEMP/EXTRACTP) rcdlen(20)                                           
monmsg CPF7302 exec(clrpfm QTEMP/EXTRACTP)                                      
cpyfrmstmf fromstmf('/home/PAUL/ExtractedFiles.txt') +
           tombr('/qsys.lib/qtemp.lib/EXTRACTP.file/EXTRACTP.mbr') +
           mbropt(*REPLACE)
/* And now I can use QTEMP/EXTRACTP to drive my way through the Extract       */
/* folder                                                                     */
dowhile '1'
    rcvf
    monmsg msgid(CPF0864) exec(LEAVE)
    /* Copy the next stream file to the IMPORTP physical file                 */
    chgvar &fromfile value('/home/PAUL/Extract/' *tcat &EXTRACTP)          
    cpyfrmstmf fromstmf(&fromfile) tombr(&tombr) mbropt(*add)
    /* Copy the stream file to the Archive                                    */
    chgvar &cpyfile value('cp /home/PAUL/Extract/' *tcat &EXTRACTP +       
                    *bcat '/home/PAUL/Archive')                            
    qsh cmd(&cpyfile)                                                           
    /* Then remove the stream file from the Extract folder                    */
    chgvar &dltfile value('rm /home/PAUL/Extract/' *tcat &EXTRACTP)
    qsh cmd(&dltfile)
enddo
/* Clean up and exit                                                          */
qsh cmd('rm /home/PAUL/ExtractedFiles.txt')
dltf qtemp/EXTRACTP

It should go without saying that some of the names have been changed and that the above program should be treated as a sample only.

Being able to move information between the QShell/IFS and traditional i on Power environments is both useful and (in my experience) increasingly important. Although it does take a bit of thinking about, it isn’t difficult which is why I find that the oft-seen solution of “buy this tool” is both disappointing and (often) overkill.