Enterprise COBOL: Project Example

Oleg Kunitsyn
8 min read · Jan 31, 2021

This article completes the COBOL Programming Course by highlighting important aspects of software engineering, such as codebase modularity, dependencies, unit testing, mocking, DevOps on z/OS and auto-documentation. The modern approach is presented in the best possible manner: by example.

Photo by Hunter Haley on Unsplash

TL;DR

Download the archive from GitHub, unzip it and open the sales directory.

Preconditions

You have learned the basic principles, methods and standards of IBM Enterprise COBOL for z/OS, a proprietary COBOL compiler that implements a substantial part of COBOL 85, COBOL 2002 and COBOL 2014. We'll run the project on z/OS, a proprietary 64-bit operating system for IBM mainframes, introduced in October 2000 and backward compatible with functionality originating in the 1960s.

  • Make sure you have installed NPM, a package manager for JavaScript:
$ npm -v
  • You have an IBM mainframe account.

You can get one for training purposes, free of charge. Register at IBM and follow the instructions. You will receive a registration email with a USER ID, IP address and PORT. Then, log in to the Open Mainframe Project Slack workspace and add the zih app via the Apps menu. Post a message, e.g. Hi, and the app will ask for your email address and the USER ID you received. Post these details one by one and the bot will create your PASSWORD.

  • You’ll need COBOLget, a package manager for COBOL:
$ npm i -g cobolget
$ cobolget -v
  • You’ll need Zowe, an open-source management framework for mainframes, and your Zowe profile created by using the credentials above:
$ npm i -g @zowe/cli --ignore-scripts
$ zowe -V
$ zowe profiles create zosmf ztrial --host <IP> --port <PORT> --user <USER ID> --pass <PASSWORD> --reject-unauthorized false
Profile created successfully!
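
Optionally, verify that the new profile can reach the mainframe (this assumes z/OSMF is available on your instance):

$ zowe zosmf check status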

You may choose any text editor you like, but I recommend Visual Studio Code with the IBM Z Open Editor extension installed.

Specification

Good projects have a comprehensive description of functionality and limitations. The example project, Sales, will summarize fictional sales in a specific region. Sales data is provided in CSV format, where each row (except the header) describes one sale as follows:

Region,Country,Units Sold,Unit Price,Total Revenue

Region and Country are PIC X(48) values, Units Sold is PIC 9(9) numeric, and Unit Price and Total Revenue are PIC 9(9)V99 monetary decimals. The desired region is a PIC X(48) value. For simplicity, we'll keep the region filter predefined in the main program and assume that CSV rows cannot exceed 80 characters in length.
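
For illustration, here are a couple of rows in this layout (the figures are invented for the example and are not taken from the real dataset):

Region,Country,Units Sold,Unit Price,Total Revenue
Europe,Germany,10,9.99,99.90
Asia,Japan,250,4.50,1125.00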

Decomposition

Thanks to Zowe, COBOL programmers can structure a project however they like, free from the conventions and limitations of the mainframe. Modern software engineering tends to focus on the problem domain, delegating platform-specific tasks to the DevOps layer.

Let’s break down the specification into the functional units. Basically, the program:

  1. Reads the sales file.
  2. Parses rows one by one.
  3. Checks Region for a match.
  4. Aggregates Total Revenue.
  5. Displays the aggregated value.

Therefore, we can identify at least four units: three programs and one copybook. The entry-point program defines the region filter, invokes the Reader and displays the result. The Reader program encapsulates file (DataSet) operations and analyzes CSV records returned by the Parser. The Parser program converts raw CSV rows into records.

├── src
│   ├── parser.cbl
│   ├── reader.cbl
│   ├── sales.cbl
│   └── sales.cpy

In other words, the copybook is a structured form of the CSV row, shared by the Reader and the Parser. So let's put it in a CPY file:

01 csv-rec.
   05 Region       PIC X(48).
   05 Country      PIC X(48).
   05 UnitsSold    PIC 9(9).
   05 UnitPrice    PIC 9(9)V99.
   05 TotalRevenue PIC 9(9)V99.
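
With the copybook in place, the entry-point program is little more than a region filter, a call to the Reader and a DISPLAY. Below is a minimal sketch, assuming the calling contract shown later in the Mocking chapter and a hypothetical 'Europe' filter; the actual sales.cbl in the repository may differ:

IDENTIFICATION DIVISION.
PROGRAM-ID. SALES.
DATA DIVISION.
WORKING-STORAGE SECTION.
*> Hypothetical predefined region filter, as stated in the specification
01 where     PIC X(48) VALUE 'Europe'.
01 total     PIC 9(9)V99.
01 total-edt PIC 9(9).99.
PROCEDURE DIVISION.
    CALL "READER" USING where RETURNING total.
*>  Move to an edited field so the decimal point is visible
    MOVE total TO total-edt.
    DISPLAY "Total: " total-edt.
    GOBACK.
END PROGRAM SALES.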

Dependencies

Before Zowe separated the development and execution environments, Enterprise COBOL engineers used comment blocks like the following to document modifications:

*****************************************************************
* DATE CHANGED BY DESCRIPTION *
* -------- ------------ -------------------------------------*
* 99.99.99 Author Description *

This technique does not fully reflect the history of the program and cannot be used for retrospective analysis. In contrast, version control systems are the standard in software engineering nowadays, giving teams the freedom to collaborate across space, time and organizational boundaries. Code experiments, parallel versions, atomic modifications and code reviews help develop programs efficiently and at low risk.

The trend of splitting monolithic programs with an enormous number of SECTIONS into smaller yet reusable units wouldn't be possible without Semantic Versioning and Package Management. Although we didn't use any external COBOL code in our units, intrinsic functions alone are usually insufficient for more sophisticated projects.

Since 2020, COBOL has had its own package manager, COBOLget, which standardizes code reuse for open-source and proprietary contributors. The command-line tool automates the process of installing and upgrading libraries, resolving dependencies and integrating external COBOL code into projects. The COBOLget Registry accumulates packages written in the GnuCOBOL and Enterprise COBOL dialects, helping teams distribute public and private code.

The heart of COBOLget is the Manifest file modules.json, which describes the project and its dependencies.

...
"modules": [
    "src/parser.cbl",
    "src/reader.cbl",
    "src/sales.cbl"
],
"dialect": "entcobol",
"dependencies-debug": {
    "ecblunit": "*"
}
...

As you can see, the structure is very similar to an NPM manifest. The modules property lists the COBOL modules of the project, and the dialect property identifies the target platform. The purpose of the ecblunit dependency will be explained in the next chapter.

Testing

Good projects provide a reasonable degree of quality assurance: each unit must perform as designed. In our project, the Reader and the Parser are two units that we can isolate from the environment and cover with tests at the earliest stage of development.

├── tests
│   ├── tests.cbl
│   └── tests.jcl

Tools

IBM offers a proprietary unit-testing framework for that: zUnit. zUnit implements the xUnit standard, letting you develop, execute and evaluate tests in any z/OS language. The pipeline looks like this:

  • the Test Runner reads the Test Suite file;
  • the Test Runner calls the Test Cases, one by one;
  • ADDTESTS program adds test programs to the Test Case;
  • SETUP program allocates required resources;
  • the test program invokes the unit and asserts the result;
  • TEARDOWN program releases allocated resources.

Test Runner — a z/OS program that orchestrates testing process.
Test Suite — an XML file that lists Test Cases for the Test Runner.
Test Case — a COBOL program that invokes a unit.
Assertion — a COBOL condition that compares expected and actual values.
Test Fixture — COBOL programs for setting up, running and tearing down the tests.

The problem is that one minimal zUnit test requires over 100 lines of boilerplate code, which in simple cases might be excessive. Fortunately, there is an open-source alternative: ECBLUnit. In contrast to zUnit, this tool implements only the Test Runner and Assertion elements of the standard. Written in Enterprise COBOL, it is fully compatible with z/OS. ECBLUnit tests are independent COBOL programs, much simpler than zUnit ones. Since ECBLUnit is an ordinary COBOLget package, we can add it as a dependency:

$ cobolget add --debug ecblunit
$ cobolget update
$ cobolget install
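
To give a feel for the reduced boilerplate, here is a hypothetical ECBLUnit-style test of the Parser, mirroring the ECBLUREQ assertion used in the next chapter; the actual tests.cbl in the repository may differ:

IDENTIFICATION DIVISION.
PROGRAM-ID. TESTS.
DATA DIVISION.
WORKING-STORAGE SECTION.
COPY SALES.
*> Hypothetical fixture row; 80 characters, as assumed in the specification
01 csv-row          PIC X(80) VALUE 'Europe,Germany,10,9.99,99.90'.
01 expected-revenue PIC 9(9)V99 VALUE 99.90.
01 actual-revenue   PIC 9(9)V99.
PROCEDURE DIVISION.
    CALL "PARSER" USING csv-row RETURNING csv-rec.
    MOVE TotalRevenue TO actual-revenue.
*>  Assert that the expected and actual values are equal
    CALL "ECBLUREQ" USING
        BY CONTENT ADDRESS OF expected-revenue
        BY CONTENT ADDRESS OF actual-revenue
        BY CONTENT LENGTH OF expected-revenue.
    GOBACK.
END PROGRAM TESTS.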

Mocking

It seems that only the Parser is a completely isolated unit in our project. The Reader requires z/OS support to access the CSV file in a dataset. Nevertheless, we can isolate the Reader by compiling its alter ego: a Mock. The purpose of mocking is to focus on the code being tested and not on the behavior or state of its environment. The mock implements the contract of the original unit with fake internals:

IDENTIFICATION DIVISION.
PROGRAM-ID. READER.
DATA DIVISION.
WORKING-STORAGE SECTION.
COPY SALES.
*> Fake input instead of a real z/OS dataset
01 csv-row PIC X(48) VALUE 'Europe,Germany,10,9.99,99.90'.
LINKAGE SECTION.
01 where PIC X(48).
01 total PIC 9(9)V99.
PROCEDURE DIVISION USING where RETURNING total.
    CALL "PARSER" USING csv-row RETURNING csv-rec.
    MOVE TotalRevenue TO total.
    GOBACK.
END PROGRAM READER.

Since the testing JCL compiles the mock right after the original modules, the test program does not notice the substitution.

...
01 expected-total PIC 9(9)V99 VALUE 99.90.
...
CALL "READER" USING where RETURNING total.
CALL "ECBLUREQ" USING
    BY CONTENT ADDRESS OF expected-total
    BY CONTENT ADDRESS OF total
    BY CONTENT LENGTH OF expected-total.

This simple approach lets us simulate business logic of any complexity while avoiding unwanted dependencies. Let's build and test our units on z/OS:

$ cobolget run build
...
Modules modules.cpy and modules.cbl updated.
$ cobolget run test
...
OK
Tests: 001, Skipped: 000
Assertions: 002, Failures: 000, Exceptions: 000

DevOps

Don't worry if the commands above didn't work for you. Within the scripts in the Manifest you'll find all the scenarios necessary to align the development and execution environments. You can run these commands separately or in groups by submitting the name of a scenario. Replace all <USER-ID> entries in the scripts and try to create the mainframe datasets in one go:

$ cobolget run setup

If a dataset already exists, the batch will stop with an error. Then, you’ll need to create them one by one, for example:

$ cobolget run setup:RES

If everything went well, you can deploy and launch the Sales project on z/OS:

$ cobolget run build
$ cobolget run p
...
Total: 0033368932.11

Build, deploy, test and demo scenarios are up and running. Well done!

CI

The project follows Continuous Integration practices, where each modification of the code is tested in the repository. Among the files you will notice two workflow templates:

  • nodejs.yml — a GitHub workflow
  • .gitlab-ci.yml — a GitLab workflow

Both templates replicate the steps explained above and are alike in syntax. GitLab and GitHub will launch the workflows automatically and report overall progress in the Pipelines and Actions tabs, respectively. A non-zero exit code at any step will mark the workflow as failed.

If you decide to publish the project in your repository, don’t forget to declare HOST, PORT, USER and PASS variables in Settings->CI/CD->Variables on GitLab or in Settings->Actions secrets on GitHub.
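
As a rough sketch of what such a workflow might contain (the actual nodejs.yml in the repository may differ), a GitHub Actions job could install the tools, recreate the Zowe profile from those secrets and run the COBOLget scenarios:

name: CI
on: [push, pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # Install the toolchain used throughout this article
      - run: npm i -g cobolget @zowe/cli --ignore-scripts
      # Recreate the Zowe profile from repository secrets
      - run: zowe profiles create zosmf ztrial --host ${{ secrets.HOST }} --port ${{ secrets.PORT }} --user ${{ secrets.USER }} --pass ${{ secrets.PASS }} --reject-unauthorized false
      # Resolve dependencies, then build and test on z/OS
      - run: cobolget update && cobolget install
      - run: cobolget run build
      - run: cobolget run test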

Documentation

Software engineers are constantly reading code. Usually, the ratio of time spent reading versus writing is well over 10:1. Meaningful names of variables, sections and paragraphs, along with plenty of comment lines, explain to the reader what is happening and why. IBM recommends using the IDENTIFICATION DIVISION to provide descriptive information about the program. Here is a snippet inserted by IBM Z Open Editor:

*****************************************************************
IDENTIFICATION DIVISION.
PROGRAM-ID. MYPROG.
AUTHOR. MYNAME.
INSTALLATION. COBOL DEVELOPMENT CENTER.
DATE-WRITTEN. 01/01/08.
DATE-COMPILED. 01/01/08.
SECURITY. NON-CONFIDENTIAL.
*****************************************************************

In terms of modularity, however, this type of accompanying documentation does not describe the contract of the unit in a way that spares the reader a thorough study of its source code. For that, COBOL vendors introduced additional tags in comments, recognized by documentation generators. Once implemented, auto-documentation reduces the effort required to maintain separate documentation, lowers costs and improves quality, making it efficient to reflect changes in the codebase.

In 2020, Bruno Pacheco released coboldoc, an open-source tool that produces documentation from COBOL source code. The tool solves the problem of incompatible documentation standards among COBOL dialects. Coboldoc supports the Microfocus, MSDN and Free Format comment styles and can generate documentation in HTML and Markdown formats.

*>**
*> Detailed description.
*> @summary <text> short description.
*> @author <text> defines the author(s).
*> @license <text> defines the license.
*> @param <type> <name> defines the input.
*> @return <type> defines the output.
*>**
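
Applied to the Parser, for example, such a header could look as follows (a hypothetical sketch; the actual comments in the repository may differ):

*>**
*> Converts one raw CSV row into a structured sales record.
*> @summary CSV row parser.
*> @param PIC X(80) csv-row the raw CSV row.
*> @return csv-rec the parsed sales record.
*>**
IDENTIFICATION DIVISION.
PROGRAM-ID. PARSER.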

If you have experience with Javadoc and its successors, these tags will look familiar. Let's install Coboldoc and generate documentation for the sources:

$ npm i -g coboldoc
$ coboldoc -v
$ coboldoc generate src/*.cbl -o coboldoc

Conclusion

The 60-year-old programming language is still going strong. Micro Focus estimates that COBOL is used in 70% of global transaction processing systems, amounting to 220 billion lines of code. I hope the aspects highlighted in this article will inspire you to develop the ecosystem further. Thanks for reading!

Are you a COBOL enthusiast? Let's develop the missing libraries together: read the Enterprise COBOL: Package Tutorial.
