[comp.std.unix] Standards Update, IEEE 1003.3: Test Methods

jsh@usenix.org (04/13/90)

From: <jsh@usenix.org>


            An Update on UNIX* and C Standards Activities

                             January 1990

                 USENIX Standards Watchdog Committee

                   Jeffrey S. Haemer, Report Editor

IEEE 1003.3: Test Methods Update

Doris Lebovits <lebovits@attunix.att.com> reports on the January 8-12,
1990 meeting in New Orleans, LA:

Dot three's job is to do test methods for all of the other 1003
standards.  This was the working group's fifteenth meeting.  We
reviewed the ballot status of P1003.1 test methods, worked on P1003.2
test methods, and created a steering committee.

Review of ballot status and Dot two verification

The P1003.3 standard will consist of several parts: Part I is generic
test methods, and Part II is test methods for measuring P1003.1
conformance, including test assertions.  Part III of P1003.3 will
contain test methods and assertions for measuring P1003.2 conformance.
As other P1003 standards evolve, they will be covered as separate
parts in the P1003.3 standard.
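
To make the terminology concrete, here is a minimal sketch, not taken
from any draft, of the kind of check a Part II test assertion implies:
a statement about a P1003.1 interface paired with code that exercises
it.  The path name below is a made-up example.

    #include <errno.h>
    #include <stdio.h>
    #include <unistd.h>

    /* Illustrative only: checks the P1003.1-style assertion that
       chdir() on a nonexistent path returns -1 and sets errno to
       ENOENT.  The path name is hypothetical. */
    int
    main(void)
    {
        errno = 0;
        if (chdir("/this/path/should/not/exist") == -1 && errno == ENOENT)
            puts("PASS");
        else
            puts("FAIL");
        return 0;
    }

The real documents state each assertion in prose and prescribe its
test method separately; the point here is only the shape of the check.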

Each day was divided into two sessions: mornings were spent on technical
review of parts I and II, afternoons on writing assertions for
part III.  AT&T, NIST, OSF, Mindcraft, IBM, DEC, HP, Data General,
Cray Research, Unisys, Perennial and Unisoft Ltd.  were represented.
[Editor's complaint: I see no user representation at all.]

It took twelve meetings of the previous P1003.3 working group to
prepare the draft that is now balloting.  The technical review for the
Draft 10 ballot was completed.  Draft 11 was recirculated in late
February 1990, and balloting closed March 23, 1990.  The balloting group
has approximately ninety members.  X/OPEN submitted a list of assertions
for P1003.1a.  This list was included as an appendix to Draft 11.
Balloters were expected to review this appendix as part of their
ballot.  We anticipate an approved P1003.3 standard in the third
quarter of 1990.

This was the third meeting spent developing a verification standard
for the P1003.2 standard.  The P1003.2 assertion writing and
review were done in small groups.  Some of the assertions were based
upon P1003.2 Draft 9.

__________

  * UNIX is a registered trademark of AT&T in the U.S. and other
    countries.

A steering committee and some new officers

The chair, Roger Martin, instigated the creation of a test-methods
steering committee to help alleviate the increasing dot-three workload
that all the other, proliferating groups are creating.  The committee
will coordinate the activities of all test-methods groups, monitor the
groups' conformance to test methods, and write and approve Project
Authorization Requests (PARs).  Membership will be dynamic, limited to
four to six members, and new members will be chosen based on long-term
commitment, new ideas, and technical/managerial skills.  Roger
suggested an initial makeup -- Roger Martin (NIST, Steering Committee
Chair), Anita Mundkur (HP), Andrew Twigger (Unisoft), Bruce Weiner
(Mindcraft), and Lowell Johnson (Unisys) -- and the working group
approved.  It's a non-controversial mix of established P1003.3
members.

The Standards Executive Committee (SEC) has approved both the
committee and its membership.  Their first assignment is to document
procedures.

In addition, new officers were chosen for the P1003.2 Test Methods
activities.  Ray Wilkes (Unisys) is Chair; Jim Moe (Cray Research) is
Co-chair; Lowell Johnson (Unisys) is Secretary; and Andrew Twigger
(Unisoft Ltd.) is Technical Editor.

Volume-Number: Volume 19, Number 57

hlj@posix.COM (Hal Jespersen) (04/13/90)

From: hlj@posix.COM (Hal Jespersen)

In article <627@longway.TIC.COM>, <jsh@usenix.org> writes:
>            An Update on UNIX* and C Standards Activities
>                             January 1990
>                 USENIX Standards Watchdog Committee
>                   Jeffrey S. Haemer, Report Editor
>IEEE 1003.3: Test Methods Update
> ...
>Each day was divided into two sessions: mornings were spent on technical
>review of parts I and II, afternoons on writing assertions for
>part III.  AT&T, NIST, OSF, Mindcraft, IBM, DEC, HP, Data General,
>Cray Research, Unisys, Perennial and Unisoft Ltd.  were represented.
>[Editor's complaint: I see no user representation at all.]

On the contrary, most of these organizations _are_ users--of the test
suites to be produced.  How do you define "user", anyway?  If you mean
application developers who work in small companies, maybe you should
say "ISV".  If you mean people who don't develop software, but use
POSIX systems purely for services such as timesharing, office automation,
or vertical applications, I can easily imagine why their management
doesn't send them to POSIX.3 meetings or why they don't take vacation
time to go on their own.  But they can still be in the balloting group
if they are interested.


					Hal Jespersen
					POSIX Software Group
					447 Lakeview Way
					Redwood City, CA 94062
					Phone:	+1 (415) 364-3410
					FAX:	+1 (415) 364-4498
					UUCP:	uunet!posix!hlj
					 -or-	hlj@posix.COM

Volume-Number: Volume 19, Number 67

std-unix@longway.TIC.COM (Moderator, John S. Quarterman) (04/14/90)

From: Jason Zions <uunet!cnd.hp.com!jason>

[...]
>  AT&T, NIST, OSF, Mindcraft, IBM, DEC, HP, Data General,
>Cray Research, Unisys, Perennial and Unisoft Ltd.  were represented.
>[Editor's complaint: I see no user representation at all.]

I always thought of NIST as representing a (too?) large body of
users, i.e. all those agencies bound by FIPS.

Jason Zions
Hewlett-Packard Co.

Volume-Number: Volume 19, Number 64

jsh@usenix.org (Jeffrey S. Haemer) (10/02/90)

Submitted-by: jsh@usenix.org (Jeffrey S. Haemer)

           An Update on UNIX1-Related Standards Activities

                           October 1, 1990

                 USENIX Standards Watchdog Committee

                   Jeffrey S. Haemer, Report Editor

IEEE 1003.3: Test Methods

Doris Lebovits <lebovits@attunix.att.com> reports on the July 16-20
meeting in Danvers, MA:

Overview

Dot three's job is to do test methods for all of the other 1003
standards.  The group's work, whose first parts are now in ballot,
specifies the requirements for OS conformance testing for our industry
and for NIST.  This makes our balloting group, our technical
reviewers, and our schedules worth watching.  Pay attention, also, to
what comes out of the Steering Committee on Conformance Testing
(SCCT).  Their projects and decisions will be interesting and
important.

This was the working group's seventeenth meeting.  As usual, we
reviewed the ballot status of P1003.1 test methods, worked on P1003.2
test methods, and reviewed steering committee activities.  Technical
reviews were done on parts I and II, and the group developed assertions
for part III.  Participants from the usual companies attended (AT&T,
NIST, OSF, Mindcraft, IBM, DEC, HP, Data General, Cray Research,
Unisys, Perennial, and Unisoft, Ltd.), as did an assortment of P1003.2
members (see below).

Document structure

Currently, our evolving document has three parts: Part I is generic
test methods, Part II is test methods for measuring P1003.1
conformance, including test assertions, and Part III contains test
methods and assertions for measuring P1003.2 conformance.

After the ballot, each part will become a separate standard.  Part I
will be published as IEEE P1003.3, Part II as IEEE P1003.3.1, and Part
III as IEEE P1003.3.2.

__________

 1. UNIX is a registered trademark of UNIX System Laboratories in
    the United States and other countries.

Ballot status

Draft 11 of the current ballot, which was recirculated to the
(approximately) ninety-member balloting group late in February, closed
balloting March 23.  Of the respondents, 19 disapproved with
substantive negative comments.  This met the two-thirds response
requirement, but fell short of the needed two-thirds approval.

A recirculation ballot for P1003.3 Draft 12, which is the revision of
Part I of Draft 11, began August 28 and is expected to close September
28, 1990.  The recirculation of P1003.3.1 Draft 12 (Part II) will be
conducted at a later date.

On the first and last days, the technical reviewers worked on ballot
objections to Part I and Part II.  All Part I objections and most Part
II objections were resolved.  The definition of an untested assertion
was reviewed and a permanent rationale will be included in Part I.
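
That definition matters because a test method ultimately reports one
of a small, fixed set of result codes for each assertion, and an
untested assertion must be distinguishable from one that merely could
not be exercised on a particular system.  The sketch below is
illustrative only; the exact names and set of codes are those of the
P1003.3 drafts and may differ from what is shown here.

    /* Illustrative result codes for a single assertion; not the
       normative list from P1003.3. */
    enum assertion_result {
        RESULT_PASS,         /* assertion shown to hold */
        RESULT_FAIL,         /* assertion shown not to hold */
        RESULT_UNRESOLVED,   /* test started but could not finish */
        RESULT_UNSUPPORTED,  /* optional feature not provided here */
        RESULT_UNTESTED      /* no test exercises the assertion */
    };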

P1003.2 verification

This was our fifth meeting working on the verification standard for
the P1003.2 standard.  The assertion writing and review were done
jointly with the P1003.2 working group.

The whole P1003.3 and P1003.2 working groups worked jointly on
defining test assertions based on P1003.2 Draft 10.  They worked in
three small breakout groups.  The joint group (P1003.2 plus P1003.3)
also met in plenary session several times to discuss progress and
small-group issues.  Progress was slow in the beginning, since most of
the P1003.2 working group were not familiar with test assertions, but
by the end of the week we had discussed and resolved several issues.
Some examples (an illustrative test sketch follows the list):

   - Do we need to state explicitly, in P1003.3.2, assertions that
     duplicate those in P1003.3.1? (Yes.)

   - Must we test locale variables for every locale-sensitive
     interface?  (They should be tested when their behavior is clearly
     stated for a utility.)

   - Should assertions for multiple operands be consistent? (Yes.)
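
To give a flavor of what a Part III (P1003.2) assertion test can look
like in practice, here is a minimal sketch.  It is not taken from any
draft; it simply exercises one well-defined POSIX.2 utility behavior,
that "basename /usr/lib" writes "lib" followed by a newline, by
running the utility and comparing its output against the expected
string.

    #include <stdio.h>
    #include <string.h>

    /* Illustrative only: checks the assertion that
       "basename /usr/lib" writes "lib" followed by a newline. */
    int
    main(void)
    {
        char line[256];
        FILE *p = popen("basename /usr/lib", "r");

        if (p == NULL || fgets(line, sizeof line, p) == NULL) {
            puts("UNRESOLVED");    /* could not run the utility */
            if (p != NULL)
                pclose(p);
            return 1;
        }
        pclose(p);
        puts(strcmp(line, "lib\n") == 0 ? "PASS" : "FAIL");
        return 0;
    }

A real suite would wrap such checks in a harness that records a result
code for each assertion rather than just printing it.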

Lowell Johnson (Unisys) is Secretary of the P1003.2 Test Methods
activities, and Andrew Twigger (Unisoft Ltd) is Technical Editor.  Ray
Wilkes, the former Chair, has changed jobs and is no longer able to
attend regularly, so Roger Martin is actively looking for a
replacement.

Steering Committee on Conformance Testing (SCCT)

The SCCT is supposed to alleviate the increasing dot-three workload
that all the other proliferating groups are creating.  Their job is
coordinating the activities of all test-methods groups, monitoring
their conformance to test methods, and writing Project Authorization
Requests (PARs).  Currently, its members are Roger Martin (NIST,
Steering Committee Chair), Anita Mundkur (HP), Andrew Twigger (Unisoft
Ltd), Bruce Weiner (Mindcraft), Lowell Johnson (Unisys) and the newest
member, John Williams (GM).  The addition of a new member to the
steering committee is important, especially because John is from GM,
the largest user voice other than the U.S. government.

The steering committee did not have anything for the working group to
review.  It is still documenting procedures, and Roger is still
clarifying which standards the working group will address.

Volume-Number: Volume 21, Number 162