vdb@hou2g.UUCP (R.VANDERBEI) (01/27/85)
The usual definition of uniform continuity goes as follows:
for every epsilon > 0, there exists a delta > 0 such that
|f(x) - f(y)| < epsilon whenever |x - y| < delta.
It is not hard (and so I leave it to you readers) to show that
the following condition is equivalent:
for every epsilon > 0, there exists a K < infinity such that
for all x and y, |f(x) - f(y)| < K |x - y| + epsilon.
This second formulation shows that uniform continuity is almost
the same as Lipschitz continuity (which corresponds to being able
to find a finite K even when epsilon is zero).
If you think about it (and especially, if you write down a proof),
you will see that the first condition is like a "differential"
statement and the second condition is like an "integral" version.
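The equivalence can be seen by chaining delta-steps along the segment
from x to y; here is a sketch of the argument the post alludes to
(my reconstruction, not taken verbatim from it):

```latex
% Forward direction: uniform continuity implies the second condition.
% Given \epsilon > 0, take the \delta of uniform continuity.  For
% x < y in the domain, set n = \lceil (y-x)/\delta \rceil and
% x_k = x + k\,(y-x)/n, so that |x_{k+1} - x_k| \le \delta.  Then
\begin{align*}
|f(x) - f(y)|
  &\le \sum_{k=0}^{n-1} |f(x_{k+1}) - f(x_k)|
   < n\,\epsilon \\
  &\le \Bigl(\frac{y-x}{\delta} + 1\Bigr)\epsilon
   = \frac{\epsilon}{\delta}\,|x-y| + \epsilon,
\end{align*}
% i.e. K = \epsilon/\delta works.  Conversely, given \epsilon > 0,
% take the K corresponding to \epsilon/2 and set
% \delta = \epsilon/(2K); then |x-y| < \delta gives
% |f(x) - f(y)| < K\delta + \epsilon/2 = \epsilon.
```

Note that the chaining step needs the intermediate points x_k to lie
in the domain of f, so it requires the domain to be an interval.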
The reason I was interested in this is that I wanted to know
whether a uniformly continuous function can grow more rapidly
than linearly at infinity. The second condition shows that it
cannot.
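The linear growth bound can be made explicit (my elaboration of the
argument, assuming 0 is in the domain):

```latex
% Fix \epsilon = 1 and let K be the corresponding constant from the
% second condition.  Taking y = 0 gives, for every x,
|f(x)| \le |f(0)| + |f(x) - f(0)| < |f(0)| + K\,|x| + 1,
% so f grows at most linearly at infinity.
```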
The second condition appears so simple and elegant (and the proof is
also straightforward) that you'd expect to be able to find it
in any textbook on analysis; however, I have not seen it. Does
anyone know a reference?
springer@iuvax.UUCP (01/29/85)
The equivalent definition that you propose works only when the function f
is defined on an arcwise connected set. A function can be uniformly
continuous on a set made up of any number of disconnected pieces, and in
such a case you cannot relate the values in the different pieces the way
your formulation does.

However, the corrected theorem, which says that a function f is uniformly
continuous on a connected set S if and only if for every epsilon > 0
there exists a finite number K such that for all points x and y in S,
|f(x) - f(y)| < K|x-y| + epsilon, is an interesting fact and would
actually make a nice exercise in an advanced calculus textbook.

....George Springer
    Indiana University
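Springer's caveat can be illustrated with a concrete construction (my own
example, not from the posts): take S to be the union of the intervals
[n, n + 1/2] for integers n >= 0, and let f be constantly n^2 on the n-th
piece. Since any two points of S within 1/4 of each other lie in the same
piece, f is uniformly continuous on S; yet |f(n) - f(0)| / |n - 0| = n is
unbounded, so no finite K can satisfy the proposed inequality.

```python
import math

def f(x):
    # f is defined only on S = union of [n, n + 0.5] for integers n >= 0;
    # on the n-th piece it takes the constant value n**2.
    n = math.floor(x)
    assert 0 <= x - n <= 0.5, "x must lie in S"
    return n * n

# Uniform continuity on S: with delta = 0.25, any two points of S within
# delta of each other lie in the same piece, where f is constant.
for n in range(5):
    a, b = n + 0.1, n + 0.3
    assert abs(f(a) - f(b)) == 0  # smaller than any epsilon

# But |f(x) - f(0)| / |x - 0| is unbounded, so for epsilon = 1 (say)
# no finite K gives |f(x) - f(0)| < K*|x - 0| + 1 on all of S:
ratios = [(f(n) - f(0)) / n for n in range(1, 6)]
print(ratios)  # grows like n: [1.0, 2.0, 3.0, 4.0, 5.0]
```

The gaps between the pieces are what let f jump without hurting uniform
continuity; on an interval, the chaining argument rules this out.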