[comp.software-eng] Brand New Optimization Methods

EC13@LIVERPOOL.AC.UK (05/31/91)

Here is more information about the two new entropy-based vector optimization
methods and the two new simulated entropy methods... It explains in more
detail the contents of my thesis, with a brief introduction to the subject.

  This thesis explores the use of Shannon's (informational) entropy measure
and Jaynes' maximum entropy criterion in the solution of multicriteria
optimization problems, for Pareto set generation, and for seeking the
global minimum of single-criterion optimization problems. Conventional
approaches view optimization problems in terms of a topological domain
defined deterministically by function hypersurfaces. Part of this thesis is
a continuation of research aimed at developing a non-deterministic approach
to optimization problems through the use of the Maximum Entropy Principle
(MEP) and the Minimax Entropy Principle (MMEP). We treat the multicriteria
optimization problem as a statistical system; its solution can then be
interpreted as a sequence of transitions to equilibrium states, each
characterized by an entropy maximum depending upon a feasible entropy
parameter.
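
For readers new to the two ingredients, they can be stated compactly (in
LaTeX notation; this is standard background, not material specific to the
thesis). Shannon's entropy of a discrete distribution p = (p_1, ..., p_n) is

    S(p) = -\sum_{i=1}^{n} p_i \ln p_i,

and Jaynes' maximum entropy criterion selects, among all distributions
consistent with the known constraints, the one that maximizes S(p):

    \max_{p}\; S(p) \quad \text{s.t.} \quad \sum_i p_i = 1, \qquad \sum_i p_i E_i = \bar{E},

whose solution has the Boltzmann/Gibbs form p_i \propto e^{-\lambda E_i}.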

  This thesis also introduces, for the first time, simulated entropy for
obtaining the global minima of single-criterion minimization problems.
Several ways of using entropy in an optimization context are investigated,
and two simulated entropy algorithms are developed. Computational results
demonstrate that the algorithms proposed in this work are efficient and
reliable. The following parts of the present work are original:

1-)  An entropy-based method for generating the Pareto set is presented.
This new method yields additional insight into the nature of entropy as an
information-theoretic tool and clarifies some ambiguities in the literature
about its use in optimization.

2-)  A new stochastic technique for reaching the global minimum of
constrained single-criterion minimization problems is developed. The system
may be interpreted as a statistical thermodynamic one, characterized by
lowering the temperature of the system to its limit along the entropy
process transitions; in each transition, equilibrium is characterized by
maximizing the entropy at that temperature. This is known as Simulated
Entropy (a rough code analogy follows this list).

3-)  Two simulated entropy techniques are developed which seek the global
minimum of a deterministic objective function. The two techniques apply to
constrained minimization problems only.
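
To make the thermodynamic picture in item 2 concrete, here is a minimal
simulated-annealing-style loop in Python: a cooling schedule with an
equilibration sweep at each temperature. This is only an analogy -- it is
NOT the thesis's Simulated Entropy algorithm (whose entropy bookkeeping is
not reproduced in these posts), and every name and parameter value below is
an illustrative assumption.

import math
import random

def anneal(f, x0, step=0.1, t0=1.0, t_min=0.01, cool=0.9, sweeps=200):
    """Cool the 'temperature' toward its limit, equilibrating at each level."""
    x, fx = list(x0), f(x0)
    t = t0
    while t > t_min:
        for _ in range(sweeps):                    # equilibrate at temperature t
            y = [xi + random.uniform(-step, step) for xi in x]
            fy = f(y)
            # Metropolis rule: always accept improvements; accept worse
            # moves with Boltzmann probability exp(-(fy - fx) / t)
            if fy <= fx or random.random() < math.exp((fx - fy) / t):
                x, fx = y, fy
        t *= cool                                  # next equilibrium state
    return x, fx

# Toy usage: a one-dimensional function with several local minima.
best_x, best_f = anneal(lambda v: (v[0]**2 - 4)**2 + math.sin(5 * v[0]), [3.0])
print(best_x, best_f)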

  For multi-objective optimization problems, most, if not all, of the
available Pareto set techniques suffer several difficulties. Firstly, they
are computationally expensive, since in order to generate a representative
or entire Pareto set the preassigned vector of weighting coefficients,
bounds, etc. must be varied over a large number of combinations. Secondly,
Pareto solutions are obtained essentially at random, since the distribution
characteristics of these solutions are unknown. Thirdly, where there are
many criteria the amount of computation required may itself become a
difficulty, dramatically limiting the total number of Pareto solutions
obtained and hindering the selection of a preferred solution.

  For single-objective optimization problems, most, if not all, of the
available deterministic techniques terminate in a local minimum which
depends on the initial configuration supplied by the user, and they provide
no information about how far this local minimum deviates from the global
minimum.

  The first chapter of this thesis reviews previous work in Mathematical
Optimization. The motivation and scope of the present mathematical work are
also examined.

  In the second chapter, the concepts of single-criterion optimization and
multi-criteria optimization, and their mathematical formulations, are
introduced. A brief survey of some of the most widely used methods is given.

  The third chapter introduces the concept of entropy and the relationships
between informational entropy and the much better known classical
thermodynamic entropy. Entropy is used as a measure of uncertainty. The
development of Shannon's informational entropy is described, together with
its further development into entropy-based minimax methods for solving
multi-criteria minimization problems. The relationships among entropy,
simulated annealing and free energy in optimization are investigated
theoretically and are used to develop a new approach to global minimization
problems. In summary, two families of entropy-based methods are described
in detail: the first is used for generating Pareto solutions of
multi-criteria optimization problems, while the second is used for seeking
the global minimum of single-criterion optimization problems.

  In the fourth chapter, the set of entropy-based methods developed in
the previous chapter is tested, discussed and compared.

EC13@LIVERPOOL.AC.UK (06/01/91)

   Entropic Vector Optimization and Simulated Entropy: Theory & Applications
   -------------------------------------------------------------------------
I am including here the last chapter of that thesis, which appeared under
'Conclusions and Recommendations for Future Work'...

1-)CONCLUSIONS:
   ~~~~~~~~~~~~
This thesis has examined the use of Shannon's (informational) entropy measure
and Jaynes' maximum entropy principle in connection with the solution of
single-criterion and multi-criteria optimization problems. At first glance,
the two concepts, entropy and optimization, seem to have no direct link, as
Shannon entropy is essentially related to probabilities while optimization is
usually viewed in terms of a deterministic topological domain. To explore
possible links between them, an optimization problem has been simulated as a
statistical thermodynamic system that spontaneously approaches its
equilibrium state at a specified temperature, the equilibrium being
characterized by maximum entropy.
A line of attack:

         Entropy ------> Thermodynamic Equilibrium -------> Optimization

was then postulated. Several questions then arise about how to carry out
this simulation:

a) What are the micro-states of this statistical thermodynamic system in an
   optimization context?

b) What are the probabilities of the micro-states?
c) What characteristic do the two processes have in common?
d) What common law governs them?

In multi-criteria optimization, these questions are briefly answered as
follows:

a) Each micro-state corresponds to a criterion, which is optimized by
   randomly subtracting or adding a finite-amplitude step so that a Pareto
   set can be generated.
b) The multiplier associated with each criterion is interpreted as the
   probability of the system being in the corresponding micro-state.
c) The Pareto set generation process can be thought of as a sequence of
   feasible transitions of the system to its equilibrium states, so that the
   equilibria become the common characteristic of the two processes.
d) That the entropy of the system attains a maximum value in equilibrium
   states represents the common law governing the two processes.
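
One standard way to make (a) and (b) precise, following Jaynes' formalism
(the exact constraint set used in the thesis is not given in these posts, so
read this as a sketch in LaTeX notation):

    \max_{w}\;\; S(w) = -\sum_{k=1}^{m} w_k \ln w_k
    \quad \text{s.t.} \quad \sum_{k=1}^{m} w_k = 1, \qquad \sum_{k=1}^{m} w_k F_k(x) = E,

whose solution has the Gibbs form w_k = e^{-\lambda F_k(x)} / \sum_j e^{-\lambda F_j(x)}.
The multiplier \lambda is fixed by the level E, and sweeping \lambda (or E)
changes the least-biased weighting of the criteria, and hence the Pareto
point generated.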

 In single-criterion constrained minimization, these questions are briefly
answered as follows:

a) Each micro-state corresponds to a constraint, in which the objective
   function is minimized subject to all constraints randomly taking a
   finite-amplitude step from it.
b) The multiplier associated with each constraint is interpreted as the
   probability of the system being in the corresponding micro-state.
c) A minimizing process can be thought of as a sequence of transitions of the
   system to its equilibrium states, such that equilibrium becomes the common
   characteristic of the two processes.
d) That the entropy of the system attains a maximum value at an equilibrium
   state, but decreases monotonically during the minimization process,
   represents the common law governing the two processes.
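
The "minimaximization" in (d) has a well-known analytic counterpart: the
maximum of m values is the limit of an entropy-smoothed (log-sum-exp)
aggregate. In LaTeX notation,

    \max_{j} g_j(x) \;\le\; \frac{1}{P} \ln \sum_{j=1}^{m} e^{P g_j(x)}
    \;\le\; \max_{j} g_j(x) + \frac{\ln m}{P},

so the aggregate converges to the true maximum as P \to \infty. Whether the
thesis uses exactly this aggregate is not stated in these posts; it is the
standard entropy-based smoothing of a minimax problem.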

In the course of this study, the concept of entropy was further examined by
presenting a new entropy-based minimax method for generating Pareto solution
sets for multi-criteria optimization problems. The subject of simulated
entropy for SEEKING the global minimum of single-criterion constrained
minimization problems was explored and demonstrated by developing two
simulated entropy techniques. The main developments made in the present
study are summarized as follows:

1) Two entropy-based methods for generating Pareto solution sets were
   developed in terms of Jaynes' maximum entropy formalism. These new
   methods have provided additional insight into entropic optimization as
   well as affording a simple means of calculating the least-biased
   probabilities.

2) Two new entropy-based stochastic techniques were developed to reach the
   global minimum of constrained single-criterion minimization problems;
   they belong to a new class of algorithms called simulated entropy.

3) Numerical examples were presented, tested and compared using all the new
   entropy-based methods described in Chapter 3.


The present work has shown that there are links between entropy and
optimization. There is no doubt that good entropy-based optimization algorithms
can be devised based upon the present research. Several conclusions, drawn
from the present research, are summarised as follows:

1) The development of the new entropy-based methods has provided not only an
   alternative convenient solution strategy but also additional insights into
   entropic processes.

2) Uncertainty contained in the solution of multi-criteria optimization
   problems is similar to that contained in thermodynamic systems: thus it is
   reasonable to employ a statistical thermodynamic approach, i.e., the entropy
   maximization approach, to estimate multi-criteria multipliers.

3) Uncertainty contained in the solution of constrained minimization problems
   is also similar to that contained in thermodynamic systems, so it is
   reasonable to employ a statistical thermodynamic approach, i.e., the entropy
   minimaximization approach, to estimate the entropy multipliers.

4) The exploration of a new class of methods called simulated entropy
   methods has a significance far beyond the subject itself. It must be
   emphasized that it is the informational entropy approach which has made
   this possible: during the simulated entropy process, simulating the
   entropy closely parallels minimizing the maximum entropy at each
   configuration.

5) The entropic optimization methods developed in this thesis have a unique
   property which makes them very easy to program (a code sketch follows
   this list). This unique property may be formulated as follows:

       "No matter how many goals or constraints we have, only one parameter
        controls the process: the entropy parameter, P"

6) The development of the entropic optimization methods has enabled the
   mathematical optimization examples considered in Chapter 4 to be solved
   easily. The computer results have shown that the developed methods have
   fast and stable convergence. Through the development made here, it can be
   seen that the entropic optimization methods DESERVE TO BE MORE WIDELY
   RECOGNIZED THAN HITHERTO.

7) Exponentiation and the use of logarithms within the entropy function have
   very little influence on computer execution time, provided the code is
   tuned and the computer system is used at full efficiency.
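
Conclusion 5 can be illustrated with the standard entropy-based
(log-sum-exp) aggregate, which collapses any number of goals or constraints
into one smooth function controlled by a single parameter P. A minimal
sketch in Python -- the thesis's own update rule for P is not given in these
posts, so this is an illustration, not the author's code:

import math

def entropy_aggregate(values, P):
    """Smooth surrogate for max(values); tends to the true max as P grows."""
    m = max(values)  # subtract the max before exponentiating, for stability
    return m + math.log(sum(math.exp(P * (v - m)) for v in values)) / P

goals = [3.0, 1.5, 2.8, 2.9]
for P in (1.0, 5.0, 50.0):
    print(P, entropy_aggregate(goals, P))  # approaches max(goals) = 3.0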

2-)Recommendations for Future Work:
   ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The present work is exploratory. However, it has opened up new avenues in
the study of some classes of minimization problems, such as general
stochastic problems. Some potential research topics, made possible by the
present work, are summarized as follows:

1) The present work is mainly oriented towards providing a practical basis for
   using Shannon entropy through the Maximum Entropy Principle (MEP) and the
   Minimax Entropy Principle (MMEP) in an optimization context. It has left
   many aspects of theoretical algorithmic development to be explored. For
   example, the Maximin Entropy Principle.

2) Further practical refinement is also required to understand the minimum
   entropy principle and the maximin entropy principle more deeply and to
   extend their applications to more optimization areas.

3) It is now clear that the explored relationship between simulated entropy
   and simulated annealing bears directly on global minimization. Thus, the
   Minimax Entropy Principle (MMEP) could readily be adapted to solving
   large simulation problems of a stochastic nature. This has still to be
   explored.

4) Because of the explored relationship between simulation and entropy,
   and since simulation is an experimental arm of operations research,
   the author strongly believes that ENTROPY RESEARCH (ER) may be
   described as a scientific approach to decision making under
   uncertainty; as a result, its concept is so general that it is
   equally applicable to many other fields as well.

5) The roots of Operations Research can be traced back many decades to
   the Second World War. Because of the war effort, there was an urgent
   need to allocate scarce resources to the various military operations
   and to the activities within each operation in an effective manner.
   However, because of the strong waves of technological development,
   the world is being shaken, and a new strategic system of management is
   urgently needed. Entropy may be used efficiently for such development.
   In other words, operations research deals with what is available today,
   while entropy research (ER) deals with the unpredictable tomorrow,
   which is the situation we now face.

In conclusion, the main contribution to knowledge contained in this thesis
centers around the various demonstrations that informational entropy,
stochastic simulations and optimization processes are closely linked. Through
this work it is now possible to view the traditional deterministic,
topological interpretation of optimization processes as not the only
interpretation, but as just one possible interpretation. It is valid to treat
optimization processes in a probabilistic, information-theoretic way and to
develop solution methods from this interpretation. This new insight opens up
new avenues for research into optimization methods.


A. Sultan

EC13@LIVERPOOL.AC.UK (06/02/91)

Some of the recommendations for future work which I haven't mentioned in my
thesis, and which I hope to work on in the near future, are:

1)Using Shannon's informational entropy for developing a new algorithm for
  solving linear programming problems (LP).

2)Using Shannon's informational entropy for developing new simulated annealing
  algorithms, Entropic Simulated Annealing, for solving large size optimization
  problems.

3)Investigating the possibility of using Shannon's entropy for developing
  'Entropy Logic' which might be used to improve the performance of some
  industrial products.

4)In areas where: -results of all previous experimentation and data
                   collection must be processed statistically, and
                  -expert professional opinions must be sought and
                   considered,
  more research on what could be called 'Fuzzy Entropy' is urgently needed.

A. Sultan

romero@convex.com (Paco Romero) (06/03/91)

This fellow, EC13 from Liverpool, UK, wrote:

>   This thesis explores the use of Shannon's (informational) entropy measure
> and Jaynes' maximum entropy criterion in the solution of multicriteria
> optimization problems, for Pareto set generation, and for seeking the
>
>   <80 more lines of unbearably incomprehensible verbosity>
> .........

Excuse me sir, but did anybody ask you what your thesis was about? 
If every student who's writing about something obtuse, of limited
interest, and full of incomprehensible jargon posted an abstract,
this newsgroup would die in days.  Please spare us next time; your
advisor might care, but I don't give a shit about the 
Minimax Entropy Principle.   How pretentious!

-paco

dtj@sumac.cray.com (Dean Johnson) (06/03/91)

In article <1991Jun02.212900.14819@convex.com>, romero@convex.com (Paco Romero) writes:
|> This fellow, EC13 from Liverpool, UK, wrote:
|> 
|> >   This thesis explores the use of Shannon's (informational) entropy measure
|> > and Jaynes' maximum entropy criterion in the solution of multicriteria
|> > optimization problems, for Pareto set generation, and for seeking the
|> >
|> >   <80 more lines of unbearably incomprehensible verbosity>
|> > .........
|> 
|> Excuse me sir, but did anybody ask you what your thesis was about? 
|> If every student who's writing about something obtuse, of limited
|> interest, and full of incomprehensible jargon posted an abstract,
|> this newsgroup would die in days.  Please spare us next time; your
|> advisor might care, but I don't give a shit about the 
|> Minimax Entropy Principle.   How pretentious!

I actually made it through "This thesis explores" before I had a strong urge
to a) go into a coma, or b) throttle the first mathematician I saw. Can
anybody do a translation of the abstract into English? Preferably into
monosyllabic words, so that I can understand it. ;-)

	-Dean
--
	-Dean Johnson
	 Software Berserker/Rabid-Prototyping Specialist
	 Tools, Libraries, and Commands Group
	 Cray Research Inc.	Eagan, MN	(612) 683-5880

romero@convex.com (Paco Romero) (06/03/91)

>  more research on what could be called 'Fuzzy Entropy' is urgently needed.

I realize that, but would it be possible for you to use
a more specialized -and receptive, I dare to say- forum for these highly
boring transcendental issues?  I'm getting tired of your unsolicited and
meaningless verbiage.

You either shut up or all the software engineers here are going to go 
someplace else to talk about work.  

EC13@LIVERPOOL.AC.UK (06/04/91)

I am including here some numerical examples solved using the new
entropy-based methods...

1)In Vector Optimization to generate Pareto set:
------------------------------------------------
This example was taken from 'Multicriterion Optimization in Engineering',
Osyczka, Ellis Horwood, Chichester, 1984. It is a beam design example.

F1(X) = 0.785{x1 (6400 - x2**2) + (1000 - x1) (10000 - x2**2)} mm^3 ----> Min

F2(X) = 3.298 x 10^(-5) {[(1/(4.096 x 10^7 - x2**4)) - (1/(10^8 - x2**4))] * x1^3
        + (10^9/(10^8 - x2**4))} mm/N                               ----> Min

S.t.:  g1(X) = 180 - 9.87 x 10^6 * x1 / (4.096 x 10^7 - x2**4) >= 0
       g2(X) = 75.2 - x2 >= 0
       g3(X) = x2 - 40 >= 0
       x1, x2 >= 0

Using the 1st entropy-based method, which (like any weighted-sum approach)
can generate points only on convex portions of the Pareto set, I was able to
obtain 18 solutions, while using the 2nd entropy-based method I was able to
generate 53 solutions. The results are summarized below:
-------------------------------------------------------------------------------
        The Entropy-Based Weighted Objectives Function Method (EWOF)
-------------------------------------------------------------------------------
  No.           X=(x1,x2)                              F(X)=(F1,F2)
-------------------------------------------------------------------------------
  1             (165.29, 75.2)            (0.2943695e+07, 0.49924662e-03)
  2             (183.58, 74.609)          (0.2961524e+07, 0.495371176e-03)
  3             (192.75, 74.307)          (0.2970895e+07, 0.493597705e-03)
  4             (202.39, 73.986)          (0.2981043e+07, 0.491856365e-03)
  5             (212.51, 73.744)          (0.2992045e+07, 0.490157399e-03)
  6             (219.88, 73.392)          (0.3000292e+07, 0.489001861e-03)
  7             (223.67, 73.262)          (0.3004618e+07, 0.488433288e-03)
  8             (152.71, 40.0)            (0.6162431e+07, 0.340317842e-03)
  9             (145.21, 40.0)            (0.6183632e+07, 0.340058003e-03)
 10             (137.95, 40.0)            (0.6204248e+07, 0.33983076e-03)
 11             (132.72, 40.0)            (0.6218924e+07, 0.33968105e-03)
 12             (131.05, 40.0)            (0.6223628e+07, 0.33963611e-03)
 13             (124.50, 40.0)            (0.624214 e+07, 0.339468941e-03)
 14             (110.72, 40.0)            (0.6281057e+07, 0.339171384e-03)
 15             (101.02, 40.0)            (0.6308531e+07, 0.339000951e-03)
 16             (96.227, 40.0)            (0.6321995e+07, 0.338929007e-03)
 17             (74.569, 40.0)            (0.6385737e+07, 0.33867266e-03)
 18             (1.0, 40.0)               (0.6591174e+07, 0.33846451e-03)
-------------------------------------------------------------------------------
        The Entropy-Based Constrained Objective Functions Method (ECOF)
-------------------------------------------------------------------------------
  1             (165.30, 75.2)            (0.2943734e+07, 0.4992401 e-03)
  2             (182.30, 74.651)          (0.2960245e+07, 0.4956268 e-03)
  3             (194.01, 74.265)          (0.2972194e+07, 0.4933646 e-03)
  4             (206.16, 73.859)          (0.2985099e+07, 0.4912079 e-03)
  5             (217.11, 73.487)          (0.2997181e+07, 0.4894268 e-03)
  6             (221.51, 73.336)          (0.3002174e+07, 0.4887516 e-03)
  7             (221.43, 71.549)          (0.3205612e+07, 0.466343  e-03)
  8             (232.23, 71.191)          (0.3215188e+07, 0.4652811 e-03)
  9             (225.57, 67.174)          (0.3670394e+07, 0.4277397 e-03)
 10             (236.85, 66.252)          (0.3735082e+07, 0.4232714 e-03)
 11             (219.06, 66.058)          (0.3805428e+07, 0.4189061 e-03)
 12             (225.08, 65.402)          (0.3856105e+07, 0.4156467 e-03)
 13             (208.21, 64.240)          (0.4022139e+07, 0.4063430 e-03)
 14             (231.21, 62.351)          (0.4144831e+07, 0.3994876 e-03)
 15             (234.36, 62.070)          (0.4163319e+07, 0.3985558 e-03)
 16             (242.77, 61.416)          (0.4203013e+07, 0.3966531 e-03)
 17             (234.36, 58.967)          (0.4458198e+07, 0.3850318 e-03)
 18             (231.06, 58.955)          (0.4468607e+07, 0.3845755 e-03)
 19             (228.60, 58.361)          (0.4530231e+07, 0.3820444 e-03)
 20             (213.70, 57.052)          (0.4691012e+07, 0.3758790 e-03)
 21             (232.50, 55.351)          (0.4787910e+07, 0.3725172 e-03)
 22             (226.02, 54.971)          (0.4839180e+07, 0.3707132 e-03 )
 23             (216.04, 54.552)          (0.4903343e+07, 0.3685560 e-03)
 24             (228.51, 53.746)          (0.4936624e+07, 0.3675970 e-03)
 25             (228.51, 52.481)          (0.5042167e+07, 0.3644039 e-03)
 26             (228.51, 51.345)          (0.5134723e+07, 0.3617750 e-03)
 27             (228.51, 50.906)          (0.5169965e+07, 0.3608146 e-03)
 28             (228.51, 50.241)          (0.5222748e+07, 0.3594169 e-03)
 29             (228.51, 49.383)          (0.5289871e+07, 0.3577087 e-03)
 30             (193.97, 48.748)          (0.5436395e+07, 0.3538036 e-03)
 31             (202.94, 47.265)          (0.5522785e+07, 0.3518865 e-03 )
 32             (192.28, 46.483)          (0.5610471e+07, 0.3499517 e-03)
 33             (175.45, 46.399)          (0.5664181e+07, 0.3488639 e-03)
 34             (175.45, 44.620)          (0.5791270e+07, 0.3463724 e-03)
 35             (175.45, 42.870)          (0.5911458e+07, 0.3442250 e-03)
 36             (192.55, 40.0)            (0.6049859e+07, 0.3421793 e-03)
 37             (175.45, 40.0)            (0.6098175e+07, 0.3412750 e-03)
 38             (154.20, 40.0)            (0.6158251e+07, 0.3403721 e-03)
 39             (147.19, 40.0)            (0.6178043e+07, 0.3401239 e-03)
 40             (134.01, 40.0)            (0.6215292e+07, 0.3397167 e-03 )
 41             (132.52, 40.0)            (0.6219488e+07, 0.3396757 e-03)
 42             (126.21, 40.0)            (0.6237336e+07, 0.3395106 e-03)
 43             (115.18, 40.0)            (0.6268511e+07, 0.3392596 e-03)
 44             (109.69, 40.0)            (0.6284024e+07, 0.3391511 e-03)
 45             (103.31, 40.0)            (0.6302049e+07, 0.3390382 e-03)
 46             (92.308, 40.0)            (0.6333136e+07, 0.3388738 e-03)
 47             (85.273, 40.0)            (0.6353017e+07, 0.3387872 e-03)
 48             (77.439, 40.0)            (0.6375157e+07, 0.3387062 e-03)
 49             (51.448, 40.0)            (0.6448607e+07, 0.3385353 e-03 )
 50             (48.834, 40.0)            (0.6455994e+07, 0.3385250 e-03)
 51             (44.642, 40.0)            (0.6467846e+07, 0.3385106 e-03)
 52             (42.974, 40.0)            (0.6472553e+07, 0.3385057 e-03)
 53             (0.0,    40.0)            (0.6593964e+07, 0.3384645 e-03)
-------------------------------------------------------------------------------
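
For readers who want to spot-check individual rows of the tables above, here
is a minimal Python sketch that simply evaluates the two objectives at a
design point (it says nothing about how the points were generated, which is
the poster's method). It uses 4.096 x 10^7 in F2, which is the constant that
reproduces the tabulated values:

def F1(x1, x2):
    """First objective (units mm^3), as transcribed above."""
    return 0.785 * (x1 * (6400 - x2**2) + (1000 - x1) * (10000 - x2**2))

def F2(x1, x2):
    """Second objective (units mm/N), as transcribed above."""
    a = 1.0 / (4.096e7 - x2**4) - 1.0 / (1e8 - x2**4)
    return 3.298e-5 * (a * x1**3 + 1e9 / (1e8 - x2**4))

# EWOF row 1: expect roughly (0.2943695e+07, 0.49924662e-03)
print(F1(165.29, 75.2), F2(165.29, 75.2))
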
2)In Single-Criterion Minimization Using Simulated Entropy:
-----------------------------------------------------------
This example was taken from 'Lecture Notes in Economics and Mathematical
Systems', Hock W. and Schittkowski K., Springer-Verlag, 1981.

Minimize:

F(X) = x1**2 + 0.5*x2**2 + x3**2 + 0.5*x4**2 -
       x1*x3 + x3*x4 - x1 - 3*x2 + x3 - x4

S.t.:    - x1 - 2*x2 -   x3 - x4 + 5   >= 0
       - 3*x1 -   x2 - 2*x3 + x4 + 4   >= 0
                  x2 + 4*x3      - 1.5 >= 0
                  x1, x2, x3, x4 >= 0
-------------------------------------------------------------------------------
                             Simulated Entropy
-------------------------------------------------------------------------------
P        F         x1        x2        x3        x4     g1        g2       g3
(temp.)  (energy)
-------------------------------------------------------------------------------
1.5      -2.8795   0.00011   1.4091    0.3661    0.299  -1.5165  -2.1574 -1.374
1.4      -3.162    0.0001    1.5283    0.3126    0.285  -1.3461  -2.1308 -1.279
1.3      -3.4176   0.041     1.5785    0.273     0.296  -1.2331  -2.0483 -1.171
1.2      -3.6759   0.10919   1.6194    0.2575    0.31   -1.0924  -1.84   -1.15
1.1      -3.9249   0.14549   1.6635    0.2059    0.314  -1.0079  -1.8019 -0.987
1.0      -4.1706   0.2857    1.7041    0.18542   0.320  -0.8776  -1.6195 -0.946
0.9      -4.4028   0.2168    1.74      0.15398   0.331  -0.7729  -1.4979 -0.856
0.8 ***  -4.6366   0.31235   1.7787    0.11222   0.327  -0.691   -1.3868 -0.728
0.7      -4.8679   0.37211   1.8238    0.09149   0.333  -0.5556  -1.2102 -0.69
0.6      -5.0853   0.4255    1.8601    0.05629   0.333  -0.4654  -1.036  -0.585
0.5      -5.2948   0.48263   1.8925    0.02463   0.334  -0.3734  -0.9446 -0.491
0.4      -5.5005   0.5483    1.9251    0.0001    0.329  -0.272   -0.7591 -0.426
0.3      -5.6671   0.6199    1.9468    0.0001    0.336  -0.1506  -0.5291 -0.447
0.2      -5.7836   0.6751    1.9639    0.00086   0.331  -0.0651  -0.3401 -0.467
0.1      -5.8623   0.7306    1.962     0.0001    0.323  -0.0223  -0.1688 -0.462
0.075    -5.8662   0.7292    1.9635    0.0001    0.328  -0.0157  -0.1765 -0.464
0.05     -5.8867   0.73933   1.9814    0.0001    0.290  -0.0077  -0.0906 -0.482
0.025    -5.8971   0.74389   1.9911    0.0001    0.271  -0.0026  -0.0482 -0.492
-------------------------------------------------------------------------------
*** The minimum value given in the above-mentioned reference was F(X) = -4.682...
This confirms the need for the simulated entropy (SE) techniques to secure
the search for the global minimum.
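
The reference minimum quoted above (-4.682...) is easy to cross-check with
an off-the-shelf NLP solver. The Python sketch below does exactly that and
nothing more -- it is not the poster's Simulated Entropy code. (Two reading
notes on the table: per its header, P is the temperature and F the energy
rather than the raw objective, and the g columns appear with the opposite
sign to the >= 0 constraints as written.)

import numpy as np
from scipy.optimize import minimize

def f(x):
    x1, x2, x3, x4 = x
    return (x1**2 + 0.5 * x2**2 + x3**2 + 0.5 * x4**2
            - x1 * x3 + x3 * x4 - x1 - 3 * x2 + x3 - x4)

# SciPy's 'ineq' convention is fun(x) >= 0, matching the constraints above.
cons = [
    {'type': 'ineq', 'fun': lambda x: -x[0] - 2 * x[1] - x[2] - x[3] + 5},
    {'type': 'ineq', 'fun': lambda x: -3 * x[0] - x[1] - 2 * x[2] + x[3] + 4},
    {'type': 'ineq', 'fun': lambda x: x[1] + 4 * x[2] - 1.5},
]
res = minimize(f, x0=0.5 * np.ones(4), method='SLSQP',
               bounds=[(0, None)] * 4, constraints=cons)
print(res.x, res.fun)  # the objective should come out near -4.682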

A. Sultan

adam@visix.com (06/04/91)

Oh!  Now I get it.  He thinks this group is for the discussion of
software _for_ engineering, instead of the engineering of software.

Adam

dtj@sumac.cray.com (Dean Johnson) (06/05/91)

In article <91154.224250EC13@LIVERPOOL.AC.UK>, EC13@LIVERPOOL.AC.UK writes:
|> I am including here some numerical examples solved using the new
|> entropy-based methods...
|> 
|>    [snip: beam design problem statement and EWOF results table]

Ooooohhhhh.... now that hits the spot! My life is now complete, I can die a 
happy man ;-)
--
	-Dean Johnson
	 Software Berserker/Rabid-Prototyping Specialist
	 Tools, Libraries, and Commands Group
	 Cray Research Inc.	Eagan, MN	(612) 683-5880

alan@tivoli.UUCP (Alan R. Weiss) (06/05/91)

In article <91153.161332EC13@LIVERPOOL.AC.UK> EC13@LIVERPOOL.AC.UK writes:
>Some of the recommendations for future work which I haven't mentioned in my
>thesis, and which I hope to work on in the near future, are:
>
>1)Using Shannon's informational entropy for developing a new algorithm for
>  solving linear programming problems (LP).
>
>   [item 2 snipped]
>
>3)Investigating the possibility of using Shannon's entropy for developing
>  'Entropy Logic' which might be used to improve the performance of some
>  industrial products.
>
>   [item 4 snipped]
>
>A. Sultan

You forgot one tiny, teeny, little thing, Sultan:  where the hell is
your bibliography?  How do we check your research?  If the guy from
Cray can't parse this crap, :-) how can my little bitty SPARCstation2
hope to?  Still, for bullshit/lines of text I would definitely award
you a PhD.  I would *ALSO* never hire you, and neither would anyone
else I know.

Besides, dude, that Indian fellow came up with a breakthrough
linear programming algorithm that replaces the Simplex.  Why do you
wish to waste your life on this?

Wasn't Entropic Logic a science-fiction concept from Isaac Asimov's
story "Absolom?"  As I recall, it was fictional.

Ah, never mind.  I've blown off enough packets on this one ...



_______________________________________________________________________
Alan R. Weiss                           TIVOLI Systems, Inc.
E-mail: alan@tivoli.com                 6034 West Courtyard Drive,
E-mail: alan@whitney.tivoli.com	        Suite 210
Voice : (512) 794-9070                  Austin, Texas USA  78730
Fax   : (512) 794-0623
_______________________________________________________________________

EC13@LIVERPOOL.AC.UK (06/08/91)

In article <802@tivoli.UUCP>, alan@tivoli.UUCP (Alan R. Weiss) says:
>
>
>Besides, dude, that Indian fellow came up with a breakthrough
>linear programming algorithm that replaces the Simplex.  Why do you
>wish to waste your life on this?

Don't worry, the two vector optimization methods are original. Regarding LP,
I am aware of that Indian scientist's breakthrough, but can you tell me what
your breakthrough is?

>
>Wasn't Entropic Logic a science-fiction concept from Isaac Asimov's
>story "Absolom?"  As I recall, it was fictional.

Several people have been working on 'Entropic Optimization' for the last ten
years. If you think this is a waste of time, can you please tell me what you
do once you have identified your job?

>
>Ah, never mind.  I've blown off enough packets on this one ...

How much money do you make by blowing off such packets?... What a shame!!!

A. Sultan

EC13@LIVERPOOL.AC.UK (06/08/91)

In article <1991Jun4.143522.21773@visix.com>, adam@visix.com says:
>
>Oh!  Now I get it.  He thinks this group is for the discussion of
>software _for_ engineering, instead of the engineering of software.
>

Still, your comments are acceptable in comparison to those of Alan, Romero,
and Dtj, but still not good!!... Diplomatic comments would be preferable.

A. Sultan

EC13@LIVERPOOL.AC.UK (06/08/91)

In article <1991Jun03.051743.6747@convex.com>, romero@convex.com (Paco Romero)
says:
>
>I realize that, but would it be possible for you to use
>a more specialized -and receptive, I dare to say- forum for these highly
>boring transcendental issues?  I'm getting tired of your unsolicited and
>meaningless verbiage.

I don't know whether you are serious or just drunk... I bet it is the second.

A. Sultan

croes@imec.be (Kris Croes) (06/10/91)

In article <91158.213533EC13@LIVERPOOL.AC.UK> EC13@LIVERPOOL.AC.UK writes:
>In article <1991Jun03.051743.6747@convex.com>, romero@convex.com (Paco Romero)
>says:
>>
>> I'm getting tired of your unsolicited and meaningless verbiage.

Paco, I agree...

>
>I don't know whether you are serious or just drunk... I bet it is the second.
>
>A. Sultan

Mr. Sultan,

I don't know whether you are a stubborn mule or an arrogant fool.
I bet you are both.

You'd better use your "Brand New Optimization Methods" to optimize the
number of your postings in this newsgroup, comp.SOFTWARE-ENGINEERING, down
to the nice round number of ZERO.

Kris
-- 
--------
K. CROES - IMEC - Leuven - Belgium   croes@imec.be

The Demon King bites in your leg and you feel weaker.

johnb@searchtech.com (John Baldwin) (06/12/91)

   [Paraphrased for brevity]

   A.Sultan> are too!
   c.s-e>> Is not!
   A.Sultan> are too!
   c.s-e>> Is not!
   A.Sultan> are too!
   c.s-e>> Is not!
   A.Sultan> are too!
   c.s-e>> Is not!
   A.Sultan> are too!
   c.s-e>> Is not!


   Gentlemen!  Can we please reduce the noise level and restrict the
discussions here to matters concerning "How to Engineer a Software System"?

   There are other, more appropriate forums for discussing optimization
methods.  There are also other, more appropriate forums for flame wars.


-- 
John Baldwin              | johnb@searchtech.com
Search Technology, Inc.   | srchtec!johnb@gatech.edu
Atlanta, Georgia          | johnb%srchtec.uucp@mathcs.emory.edu

dtj@sumac.cray.com (Dean Johnson) (06/12/91)

In article <1991Jun11.202209.16788@searchtech.com>, johnb@searchtech.com (John Baldwin) writes:
|> 
|>    [Paraphrased for brevity]
|> 
|>    A.Sultan> are too!
|>    c.s-e>> Is not!
|>    A.Sultan> are too!
|>    c.s-e>> Is not!
|>    A.Sultan> are too!
|>    c.s-e>> Is not!
|>    A.Sultan> are too!
|>    c.s-e>> Is not!
|>    A.Sultan> are too!
|>    c.s-e>> Is not!
|> 
|> 
|>    Gentlemen!  Can we please reduce the noise level and restrict the
|> discussions here to matters concerning "How to Engineer a Software System"?
|> 

Precisely my point!
--
	-Dean Johnson
	 Software Berserker/Rabid-Prototyping Specialist
	 Tools, Libraries, and Commands Group
	 Cray Research Inc.	Eagan, MN	(612) 683-5880