From: Arturo Magidin on
On Jul 9, 6:13 pm, Gerry <ge...(a)math.mq.edu.au> wrote:
> On Jul 10, 7:47 am, Edward Green <spamspamsp...(a)netzero.com> wrote:
>
> > The closure of the union equals the union of the closures: this is
> > stated as a theorem to be proved shortly after defining the closure of
> > a set, and I've been beating my head against the silly thing. Can
> > anybody give me a hint in the right direction? Just how should I
> > approach such a problem?
>
> Suppose x is in the closure of the union. Then there's a sequence
> of points in the union converging to x. That sequence has
> a subsequence consisting entirely of points in one of the original
> sets.

This is quite clearly false. Suppose your sets are all finite, and
your sequence is never constant. How are you going to get a sequence
of points in a single set that converges to the point?

For an explicit example, take the union of {x} over all x in (0,1).
The closure of the union is [0,1]. An easy sequence of points in the
union converging to 0 is 1/n. Where is your subsequence that consists
entirely of points in one of the sets {x} and which converges to 0?
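
In symbols, this is just the observation that (restating the example above,
with cl{x} = {x} in the reals)

\[
\bigcup_{x \in (0,1)} \mathrm{cl}\{x\} = (0,1)
\qquad\text{while}\qquad
\mathrm{cl}\Bigl(\bigcup_{x \in (0,1)} \{x\}\Bigr) = \mathrm{cl}\,(0,1) = [0,1].
\]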

--
Arturo Magidin
From: William Elliot on
On Fri, 9 Jul 2010, Edward Green wrote:

> The closure of the union equals the union of the closures: this is
> stated as a theorem to be proved shortly after defining the closure of
> a set, and I've been beating my head against the silly thing.

That is false except for finite unions.
Doesn't the author make that clear?
For finite unions, just prove for all A,B,
cl(A \/ B) = cl A \/ cl B. (1)

That is the dual result of the easy theorem, for all A,B,
int(A /\ B) = int A /\ int B. (2)

and De Morgan's rules of complementation within the space.
Those are just the usual rules for sets, used together with the theorem
cl(S \ A) = S \ int A
where S is the space and int A is A^o, the interior of A.

(1) is the topological dual statement of (2) and can
be immediately derived from (2) by taking complements.
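
Spelled out, one way that complement-taking chain can go (a sketch, using
only (2), the identity cl(S \ A) = S \ int A quoted above, and De Morgan):

\begin{align*}
\mathrm{cl}(A \cup B)
 &= S \setminus \mathrm{int}\bigl(S \setminus (A \cup B)\bigr) \\
 &= S \setminus \mathrm{int}\bigl((S \setminus A) \cap (S \setminus B)\bigr)
   && \text{(De Morgan)} \\
 &= S \setminus \bigl(\mathrm{int}(S \setminus A) \cap \mathrm{int}(S \setminus B)\bigr)
   && \text{(by (2))} \\
 &= \bigl(S \setminus \mathrm{int}(S \setminus A)\bigr) \cup
    \bigl(S \setminus \mathrm{int}(S \setminus B)\bigr)
   && \text{(De Morgan)} \\
 &= \mathrm{cl}\,A \cup \mathrm{cl}\,B.
\end{align*}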

> Can anybody give me a hint in the right direction?
> Just how should I approach such a problem?

First prove int(A /\ B) = int A /\ int B.
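
One possible route for that first step, if int X is taken to be the
largest open set contained in X:

\[
\mathrm{int}(A \cap B) \subseteq \mathrm{int}\,A \cap \mathrm{int}\,B
\qquad\text{and}\qquad
\mathrm{int}\,A \cap \mathrm{int}\,B \subseteq \mathrm{int}(A \cap B),
\]

the first because int(A /\ B) is an open set contained in both A and B,
and int A, int B are the largest such; the second because int A /\ int B
is an open set contained in A /\ B.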
From: Gerry on
On Jul 10, 2:20 pm, Arturo Magidin <magi...(a)member.ams.org> wrote:
> On Jul 9, 6:13 pm, Gerry <ge...(a)math.mq.edu.au> wrote:
>
> > On Jul 10, 7:47 am, Edward Green <spamspamsp...(a)netzero.com> wrote:
>
> > > The closure of the union equals the union of the closures: this is
> > > stated as a theorem to be proved shortly after defining the closure of
> > > a set, and I've been beating my head against the silly thing. Can
> > > anybody give me a hint in the right direction? Just how should I
> > > approach such a problem?
>
> > Suppose x is in the closure of the union. Then there's a sequence
> > of points in the union converging to x. That sequence has
> > a subsequence consisting entirely of points in one of the original
> > sets.
>
> This is quite clearly false. Suppose your sets are all finite, and
> your sequence is never constant. How are you going to get a sequence
> of points in a single set that converges to the point?
>
> For an explicit example, take the union of {x} over all x in (0,1).
> The closure of the union is [0,1]. An easy sequence of points in the
> union converging to 0 is 1/n. Where is your subsequence that consists
> entirely of points in one of the sets {x} and which converges to 0?

I was reading the question as, closure of A-union-B equals
(closure of A) union (closure of B). OP said it was stated
as a theorem, so it seems reasonable to assume that it was
only ever meant to apply to two (or, at any rate, finitely
many) sets.
--
GM
From: Edward Green on
Many thanks to all who replied. If I still can't quite manage the
proof, no doubt I have only my own debility to blame.

On a note of clarification, yes, it was the finite (actually pairwise)
case that was being considered. Also, the problem appeared after we had barely
finished defining topologies, interiors and closures, so I don't think
any more advanced concepts, like metrics, are allowed (did I see some
metric creeping into some of the responses? I'm not sure).

Anyway, I'd like to ask a follow up question. I'm thinking there might
be a role for De Morgan's theorems somewhere, what with all the
complements flying around with this business of closed and open sets
(well, William Elliot said as much, I see). Just how hard is it to
prove De Morgan's laws in the case of infinite or even uncountable
index sets? Or perhaps that is not needed, but I'd like to know just
the same.

From: Gerry Myerson on
In article
<89a1177b-c1f3-46b8-a3ea-a0432cde1833(a)w30g2000yqw.googlegroups.com>,
Edward Green <spamspamspam3(a)netzero.com> wrote:

> Just how hard is it to prove De Morgan's laws in the case of infinite
> or even uncountable index sets?

Not very.

E.g., if x is in the complement of the intersection of a family of sets,
then there's at least one set it's not in, so it's in the union of the
complements.
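
Written out for an arbitrary family {A_i : i in I} of subsets of a space S
(the letters S and I are just notation chosen here), that argument and its
converse are the two directions of

\[
x \in S \setminus \bigcap_{i \in I} A_i
\iff \exists\, i \in I:\ x \notin A_i
\iff x \in \bigcup_{i \in I} \bigl(S \setminus A_i\bigr),
\]

with no restriction at all on the size of the index set I.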

--
Gerry Myerson (gerry(a)maths.mq.edi.ai) (i -> u for email)