From: Mark B on
OK thanks.

"Alexey Smirnov" <alexey.smirnov(a)gmail.com> wrote in message
news:af27d633-8be0-490d-a0c6-c68551da7f2e(a)r27g2000yqn.googlegroups.com...
On Mar 16, 12:48 pm, "Mark B" <none...(a)none.com> wrote:
> So if in the robots.txt I had:
>
> www.mysite.com/definitions/default.aspx?id=truck
> www.mysite.com/definitions/default.aspx?id=trunk
> www.mysite.com/definitions/default.aspx?id=try
>
> they'd all be stored separately in Google? It would be nice if they did --
> save us a lot of work and disk space.
>
> So I would need to programmatically re-write the robots.txt whenever
> another
> word was added to the database? Or would it suffice if my homepage had all
> these links on (created programmatically)?

The robots.txt file is used to tell search engine spiders which content
they should exclude from crawling. You don't need to list every single
URL there. To have all of your pages crawled, you should either delete
robots.txt or put just the following two lines in it:

User-agent: *
Disallow:
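
For contrast, a robots.txt that kept all spiders away from the
definitions pages would look like this (the /definitions/ path is taken
from the URLs quoted above):

User-agent: *
Disallow: /definitions/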

I don't think it would be a problem to enumerate all the links in that
file, but I'm fairly sure it won't do anything to improve your ranking.

From: Andrew Morton on
Mark B wrote:
> So if in the robots.txt I had:
>
> www.mysite.com/definitions/default.aspx?id=truck
> www.mysite.com/definitions/default.aspx?id=trunk
> www.mysite.com/definitions/default.aspx?id=try
>
> they'd all be stored separately in Google? It would be nice if they
> did -- save us a lot of work and disk space.
>
> So I would need to programmatically re-write the robots.txt whenever
> another word was added to the database? Or would it suffice if my
> homepage had all these links on (created programmatically)?

I think you're looking for sitemaps:

"About Sitemaps - Sitemaps are a way to tell Google about pages on your site
we might not otherwise discover..."
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=156184

(Works for other search engines too.)
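
For illustration, a minimal sitemap.xml listing the example URLs from
this thread might look like the following (by convention it sits at the
site root, e.g. www.mysite.com/sitemap.xml, and you could generate it
from the database so newly added words are picked up automatically):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mysite.com/definitions/default.aspx?id=truck</loc>
  </url>
  <url>
    <loc>http://www.mysite.com/definitions/default.aspx?id=trunk</loc>
  </url>
  <url>
    <loc>http://www.mysite.com/definitions/default.aspx?id=try</loc>
  </url>
</urlset>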

--
Andrew