[net.math] function continuous only at irrational points

poulo@mtuxo.UUCP (r.poulo) (05/29/85)

I saw (in Rudin's Principles of Mathematical Analysis, I think) an interesting
function showing what strange things can happen when discussing continuity and
what a powerful restriction it is to say that a function is continuous.  Unlike
all examples I have seen of functions that are everywhere continuous but nowhere
differentiable, this function is very simple.  It has the property that it is
continuous at all irrational points and discontinuous at all rational points.

If x is irrational let f(x) = 0.
If x is rational then write x = p/q in lowest terms with q > 0 (taking 0 = 0/1,
so f(0) = 1).  Let f(x) = 1/q.

In any interval f(x) jumps an infinite number of times.  Sample values
for some rationals are:

f(0)   = f(1)   = f(2)   = ... = 1.
f(1/2) = f(3/2) = f(5/2) = ... = 1/2.
f(1/5) = f(2/5) = f(3/5) = f(4/5) = 1/5;  f(5/5) = 1.
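For concreteness, here is a minimal sketch of the rational case in Python (a
modern rendering for illustration, not part of the definition; this is the
function sometimes called Thomae's function).  Fraction reduces p/q to lowest
terms with a positive denominator, which is exactly the reduction the
definition calls for; on the irrationals f is identically 0, so only the
rational case needs code.

from fractions import Fraction

def f(x: Fraction) -> Fraction:
    # For rational x = p/q in lowest terms, f(x) = 1/q.
    # Fraction stores lowest terms with denominator > 0, so
    # x.denominator is exactly the q of the definition above.
    return Fraction(1, x.denominator)

# These reproduce the sample values above:
assert f(Fraction(0))    == f(Fraction(1))    == f(Fraction(2))    == 1
assert f(Fraction(1, 2)) == f(Fraction(3, 2)) == f(Fraction(5, 2)) == Fraction(1, 2)
assert f(Fraction(1, 5)) == f(Fraction(4, 5)) == Fraction(1, 5)
assert f(Fraction(5, 5)) == 1    # 5/5 reduces to 1/1, hence f = 1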

Problems for you mathematicians out there:

1.  Prove that f(x) is continuous at irrational points and discontinuous at
    rational points.  (This one is real simple -- if your attempts take more
    than 30 seconds to explain you are doing it wrong.)

2.  See if you can devise a simple function that is discontinuous at irrational
    points and continuous at the rational points.  I have not succeeded, but that
    proves nothing.

3.  I have wondered if f(x) is possibly differentiable at the irrational points.
    (It is immediate that it is not differentiable at the rational points since
    it is not continuous there.)  This question immediately leads to number theory
    and particularly to the theory of rational approximations, about which I know
    little.  Perhaps someone can complete the discussion below.


Consider the expression  (f(x+dx) - f(x)) / dx.

Since we are considering irrational values of x we have f(x) = 0, so we can
consider the expression f(x+dx)/dx instead.  If we let dx --> 0 through rational
values, the sum x+dx is always irrational, f(x+dx) is 0, and the limit (taken
through rational points) is 0.  So if f is differentiable at x its derivative
must be 0, and we must also get a limit of 0 when dx --> 0 through appropriately
chosen irrational points that make the sum x+dx assume rational values.
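To keep the two cases straight, here they are written compactly (LaTeX
notation; this merely restates the paragraph above):

\[
\frac{f(x+dx) - f(x)}{dx} \;=\; \frac{f(x+dx)}{dx} \;=\;
\begin{cases}
0 & \text{if } dx \in \mathbb{Q} \text{ (then } x+dx \notin \mathbb{Q}),\\[4pt]
\dfrac{1/q}{\,p/q - x\,} & \text{if } x+dx = p/q \in \mathbb{Q} \text{ in lowest terms.}
\end{cases}
\]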

In particular let p_n / q_n be a sequence of rationals (in lowest terms) that
converges to x, and suppose that x+dx assumes these values.  Then
f(x+dx) = 1/q_n and dx = (p_n / q_n) - x.

The ratio f(x+dx)/dx therefore takes on the values 1 / (p_n - x*q_n).

Does anyone know enough about rational approximations to tell what happens to
the value p_n - x*q_n as n --> infinity?

Since the quotient is a ratio of two expressions that both --> 0, it is not
obvious what should happen.  If f were differentiable at the irrational points
it would be a fascinating example / counterexample to give to freshman calculus
classes.  However, I am 2/3 convinced that this is too much to ask and that the
function is not differentiable anywhere.
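For what it's worth, a quick numerical experiment is easy to set up.  The
sketch below (Python; x = sqrt(2) and its continued-fraction convergents
p_n / q_n are just illustrative choices, nothing from the discussion above)
prints p_n - x*q_n and the quotient 1/(p_n - x*q_n):

# Watch p_n - x*q_n along continued-fraction convergents of sqrt(2).
# sqrt(2) = [1; 2, 2, 2, ...], so the convergents satisfy
#   p_n = 2*p_{n-1} + p_{n-2},   q_n = 2*q_{n-1} + q_{n-2}.
from decimal import Decimal, getcontext

getcontext().prec = 60            # floats would lose these tiny differences
x = Decimal(2).sqrt()

p_prev, q_prev = 1, 0             # p_{-1}, q_{-1}
p, q = 1, 1                       # first convergent p_0/q_0 = 1/1

for n in range(1, 16):
    p, p_prev = 2 * p + p_prev, p
    q, q_prev = 2 * q + q_prev, q
    err = p - x * q               # the quantity asked about above
    print(f"n={n:2d}  p/q={p}/{q}  p-x*q={err:+.3e}  1/(p-x*q)={1/err:+.3e}")

If the numbers come out as I expect, p_n - x*q_n alternates in sign and shrinks
roughly like 1/q_{n+1}, so the quotient is on the order of +-q_{n+1} and grows
without bound instead of tending to 0.  That agrees with the classical fact
(Dirichlet's approximation theorem) that every irrational x admits infinitely
many p/q with |x - p/q| < 1/q^2, i.e. |p - x*q| < 1/q --> 0 -- which would make
the function non-differentiable at the irrational points too, in line with the
2/3 conviction above.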

As a final comment, note that the integral of f(x) over any interval is 0:
the rationals are countable, hence a set of measure zero, so f = 0 almost
everywhere.  (If you don't know measure theory this won't make sense, but no
flames please.)