[net.lang.ada] APSE Trends

EBERARD@USC-ISIF.ARPA (Edward V. Berard) (06/09/86)

Recently, I have become aware of two trends regarding Ada Programming
Support Environments (APSEs). I find one of these trends encouraging
and the other disturbing. The first (i.e., the encouraging one) is
that Ada compiler vendors are now turning their attention to delivering
APSEs and APSE-like tools along with their compilers. The initial
efforts appear to be based on existing programming environments (e.g.,
VMS and UNIX), but there appears to be a growing recognition that Ada
technology is different enough from the technology associated with other
programming languages to merit some newer ideas.

The second trend (i.e., the disturbing one) has manifested itself in
a number of RFPs (Requests For Proposal) I have seen recently, e.g.,
the Army's WIS effort: the trend is to call any programming system
that contains an Ada compiler an APSE. I know that there are a number
of APSE-related activities currently under way in the Ada community, and
I wonder how important the following issues are:

   1) The idea of a common user interface,

   2) The idea of APSE portability achieved via mechanisms like the
      KAPSE (note that this issue may be distasteful to hardware
      vendors),

   3) The concept of DIANA,

   4) The concept of the CAIS,

   5) The STONEMAN model for the APSE, and

   6) Other issues contained in or suggested by APSE-QUESTIONS (see
      EV-Information). 

                                -- Ed Berard
                                   (301) 695 - 6960
-------

calland@nosc-tecr ("CALLAND") (06/17/86)

     I was very interested in Tony Alden's observation on porting
tools in a production environment.  The Navy is currently fielding
its CMS-2 support software (MTASS) using a similar though much less
sophisticated technique.  MTASS is written in FORTRAN/77 with all
host-dependent functions provided by the Common Interface Routines
(CIRs).  The CIRs are written in a combination of FORTRAN and
assembly as appropriate for each host and emulate target computer
arithmetic (e.g., Floating Add for the AN/UYK-43) as well as the
normal I/O and other operations.
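
     To make the layering concrete, here is a minimal sketch in Ada
of the kind of interface a CIR presents.  The names are invented for
illustration (this is not the MTASS specification, and the real CIRs
are FORTRAN and assembly):

    -- Host-independent interface; each host supplies its own body,
    -- so everything above this spec is unchanged across rehosts.
    package Common_Interface is

       type Target_Word is private;   -- AN/UYK-43 word, emulated on the host

       -- Emulated target arithmetic: results match the target machine
       -- bit for bit, regardless of the host's native floating point.
       function Float_Add (Left, Right : Target_Word) return Target_Word;

       -- Host-independent I/O; each body maps these onto VMS, EXEC 8,
       -- or IBM system services as appropriate.
       procedure Read_Record  (Unit : in Natural; Item : out Target_Word);
       procedure Write_Record (Unit : in Natural; Item : in Target_Word);

    private
       type Target_Word is new Integer;   -- placeholder host representation
    end Common_Interface;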

     Before the CIRs existed, it took six months to completely
rehost a revision release of MTASS (hosts include VAX/VMS, EXEC 8,
and of course the IBM OS's).  The rehosting time is now limited to
the amount of time it takes to ship the source tapes from the master
configuration management machine (EXEC 8) to the other hosts and
recompile them.

     The CIRs have also allowed a reduction in the amount of
certification testing performed on each rehost.  The full certification
test suite is performed on the master machine and a subset is performed
on each rehost.  With some exceptions due to the age of the code (parts
of which date back to 1975), there are so few porting problems that
porting is no longer a concern.  But it has taken three years to reach this
level of confidence.

     This CIR approach has not been trouble-free.  Developing the
spec was very difficult and time-consuming.  There were at least
four false starts, and that was starting from the known requirements
of an existing system.  At least two prototypes were constructed and
examined for interface design flaws.  And then implementation of
the production copy was worse.  As simple as the CIR interface is
(for the most part, the services are a subset of those provided
by FORTRAN/77), there were still problems in implementing the
complete interface and with keeping the interfaces compatible.
Suffice it to say that, as a minimum, you must have experts for all
hosts to implement this type of software, and you must get them
all together fairly often to ensure a compatible set of interfaces.

     The result has been successful for MTASS.  The CIRs themselves
are quite expensive but the costs of rehosting are quite low (with
much thanks due to ANSI).  CMS-2 applications developed using MTASS
have levels of transportability ranging from complete (for
CMS-2 and assembler source code) to controlled (for Linkage Editor and
Tape Builder command streams) to user-dependent (for Simulator
command streams).  A File Exchange System provides the mechanism for
exchange of all MTASS file types (including object code, COMPOOLs,
and linked executables) between all hosts without foreknowledge of
the destination host.
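
     The key to exchange without foreknowledge of the destination is
that each file carries enough self-description for any host to
interpret it.  A minimal sketch of that idea in Ada (the envelope
below is invented for illustration, not the actual File Exchange
System format):

    -- Hypothetical self-describing envelope; a receiving host reads
    -- the header first, so the sender need not know the destination.
    package File_Exchange is

       type File_Kind is (Source, Object_Code, Compool, Executable);

       type Envelope is record
          Kind         : File_Kind;
          Record_Count : Natural;
          Origin_Host  : String (1 .. 8);   -- fixed width, host-neutral
       end record;

    end File_Exchange;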

     My conclusions are that the CAIS is going to be successful, but
that it will take more work, money, and time than the casual observer
might think, and that the main obstacle to the sharing and exchange of
significant software systems will be legal and proprietary
restrictions rather than technical difficulties.  For examples of
the latter, I refer you to the restrictive notices that the USAF
places on its ICAM library and the disclaimer that Tony Alden placed
on his message regarding this subject.  You might also ask the
people responsible for the Navy's standard support software about
data rights and copyrights discussions they have with the Software
Engineering Institute and with SofTech; bring your lawyer as an
interpreter.

     Robert Calland
     NOSC

------

alden%jade@spp1.UUCP (06/18/86)

Ed,

Two of the issues you are wondering about that I consider important
are those dealing with DIANA and the KAPSE.  (You list the CAIS
separately from the KAPSE; I would remind you that the CAIS is just
one instance of a KAPSE, and happens to be a proposed MIL-STD.)

The issue of DIANA can be placed generically in the category of a
standard or common internal representation for Ada code.  Whether the
representation is DIANA or some other future standard, I think the
issue is a good one because it allows for the possibility of developing
small, very specific tools that work from a common database of
preparsed code.  For the large kinds of systems that Ada development is
supposed to address, it would be impractical from a throughput point of
view to have to compile code once to produce object code and once again
for each tool the developer uses.  It only makes sense that if there is
to be a host of tools that need to parse Ada code before they can do
their job, this parsing be done only once.  Given the current speeds of
compilation, a trend to reduce this overhead is welcome.
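
As a sketch of what this buys (the package and operations below are
hypothetical, not DIANA itself): the front end parses and stores each
unit once, and a small tool walks the stored form instead of carrying
its own parser.

    package Parsed_Units is

       type Tree is private;

       -- The front end stores each parsed unit once; every tool
       -- thereafter calls Open and walks the tree, with no re-parsing.
       function Open             (Unit_Name : String) return Tree;
       function Identifier_Count (T : Tree) return Natural;

    private
       type Tree is new Integer;   -- stand-in for the real representation
    end Parsed_Units;

    package body Parsed_Units is
       function Open (Unit_Name : String) return Tree is
       begin
          return 0;   -- stub; a real service would load the stored tree
       end Open;
       function Identifier_Count (T : Tree) return Natural is
       begin
          return 0;   -- stub
       end Identifier_Count;
    end Parsed_Units;

    -- A small, specific tool built on the shared representation:
    -- a few lines of traversal instead of a full Ada parser.
    with Parsed_Units; with Text_IO;
    procedure Count_Identifiers is
       T : Parsed_Units.Tree := Parsed_Units.Open ("SOME_UNIT");
    begin
       Text_IO.Put_Line (Natural'Image (Parsed_Units.Identifier_Count (T)));
    end Count_Identifiers;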

With respect to the KAPSE: first, my credentials (for what they are
worth).  I am one of the members of the TRW team developing a CAIS
prototype under contract to NOSC.  The stated value of a KAPSE is that
if tools are written to conform to the KAPSE for all so-called O.S.
dependencies, then when it comes time to rehost the toolset to a new
machine, the tools will be transportable with no (or at least minimal)
rework.  This is because the KAPSE would be ported and would absorb all
of the work.  For large toolsets, the expense of porting an entire tool
suite is likely to be more than that of porting the KAPSE.  Remember
that you must include the cost of testing each tool whenever you change
its code: on top of the cost of tweaking each tool's code, you must add
the cost of completely retesting it.  If you are porting just the
KAPSE, it is likely that the complexity of testing is lower because you
have only one program to test.  One unanswered question is how costly
it will be to rehost the KAPSE.  This of course depends on what
functions are in the KAPSE.  Experience with UNIX ports (Unix is
considered by many to be an example of a KAPSE) shows that ports to new
machines are extremely cost effective.  Why Unix is not viable for an
Ada KAPSE is another issue, not to be dealt with here.
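
A minimal sketch of the layering (the interface below is invented for
illustration and is not the CAIS): the tool touches the host only
through the KAPSE package, so rehosting means rewriting one body and
recompiling the tool unchanged.

    -- All O.S. dependencies live behind this spec; one body per host.
    package KAPSE_IO is
       type File is limited private;
       procedure Create (F : in out File; Name : in String);
       procedure Write  (F : in out File; Line : in String);
       procedure Close  (F : in out File);
    private
       type File is record
          Id : Integer;
       end record;
    end KAPSE_IO;

    -- A tool written only against KAPSE_IO: porting it is a
    -- recompilation, and only KAPSE_IO's body is retested per host.
    with KAPSE_IO;
    procedure Report_Tool is
       F : KAPSE_IO.File;
    begin
       KAPSE_IO.Create (F, "REPORT");
       KAPSE_IO.Write  (F, "one tested body per host, many portable tools");
       KAPSE_IO.Close  (F);
    end Report_Tool;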

Not mentioned yet is the cost of developing small new tools.  If
you have to reinvent the parsing wheel each time you need a tool that
has to run through Ada code, it is likely that it will not be cost
effective to develop useful small tools - the overhead of such
development would be too high.  Having this aspect standardized reduces
the overhead of maintenance and development because the development
community only has to learn one set of such routines, not a new set
for each toolset it needs to modify.

The underlying theme that ties both of the above issues together (the
issue of DIANA and the issue of a KAPSE) is centralization and
standardization.  Both approaches rely on the idea that if you have a
standard internal representation of Ada code that all tools agree on,
and a central place to put it with standard interfaces to the
repository, then tools can be integrated in such a way that their
combined power is greater than the additive power of the individual
tools.  This synergy is what I hope will make the whole effort
worthwhile.

	... Tony Alden
	    TRW
	    (213) 535-1624

P.S.

The above comments reflect my personal and academic views on the
subjects discussed and in no way represent the official views of TRW
or any obligations TRW has in the fulfillment of its contractual
obligations.