mvhub-dev team mailing list archive
Message #00015
[Branch ~mvhub-dev/mvhub/trunk] Rev 350: merged add_robots.txt
Merge authors:
Dan MacNeil (omacneil)
Related merge proposals:
https://code.launchpad.net/~omacneil/mvhub/add_robots.txt/+merge/21205
proposed by: Dan MacNeil (omacneil)
review: Approve - Lee Goodrich (leegoodrich)
------------------------------------------------------------
revno: 350 [merge]
fixes bug(s): https://launchpad.net/bugs/493208
committer: Dan MacNeil <dan@xxxxxxxxxx>
branch nick: trunk
timestamp: Wed 2010-03-24 01:33:07 -0400
message:
merged add_robots.txt
modified:
app-mvhub/DocumentRoot/static/mvh/robots.txt
app-mvhub/DocumentRoot/static/nsp/robots.txt
app-mvhub/doc/checklists/move_to_production.txt
--
lp:mvhub
https://code.launchpad.net/~mvhub-dev/mvhub/trunk
=== modified file 'app-mvhub/DocumentRoot/static/mvh/robots.txt'
--- app-mvhub/DocumentRoot/static/mvh/robots.txt 2009-06-03 23:47:43 +0000
+++ app-mvhub/DocumentRoot/static/mvh/robots.txt 2010-03-12 01:35:37 +0000
@@ -1,9 +1,16 @@
-# robots.txt exists so the log doesn't fill with 'file not found'
-# if spiders start indexing test sites then we should
-# have
-# robots.txt for production that allows visits
-# robots.txt for developers/testers that doesn't
+# PURPOSE: control spidering of ALL test data so
+# googlebot doesn't get confused and
+# index test data
+# for development allow NO access
+# comment out two lines below
+# when in PRODUCTION
User-agent: *
Disallow: /
+# for production Disallow nothing
+# Allow all robots complete access
+# comment out two lines below
+# when in DEVELOPMENT
+# User-agent: *
+# Disallow:
=== modified file 'app-mvhub/DocumentRoot/static/nsp/robots.txt'
--- app-mvhub/DocumentRoot/static/nsp/robots.txt 2009-03-31 16:29:30 +0000
+++ app-mvhub/DocumentRoot/static/nsp/robots.txt 2010-03-12 01:35:37 +0000
@@ -1,9 +1,16 @@
-# robots.txt exists so the log doesn't fill with 'file not found'
-# if spiders start indexing test sites then we should
-# have
-# robots.txt for production that allows visits
-# robots.txt for developers/testers that doesn't
+# PURPOSE: control spidering of ALL test data so
+# googlebot doesn't get confused and
+# index test data
+# for development allow NO access
+# comment out two lines below
+# when in PRODUCTION
User-agent: *
-Allow: *
+Disallow: /
+# for production Disallow nothing
+# Allow all robots complete access
+# comment out two lines below
+# when in DEVELOPMENT
+# User-agent: *
+# Disallow:
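
Both robots.txt files are committed in their development form (deny everything). Following the comments on a production host flips the two pairs of directives, so each file would then read roughly as below; this is a sketch of the intended production state, not text from the committed diff:

    # for development allow NO access
    # comment out two lines below
    # when in PRODUCTION
    # User-agent: *
    # Disallow: /
    # for production Disallow nothing
    # Allow all robots complete access
    # comment out two lines below
    # when in DEVELOPMENT
    User-agent: *
    Disallow:

An empty Disallow value matches nothing, so this form grants every crawler full access.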
=== modified file 'app-mvhub/doc/checklists/move_to_production.txt'
--- app-mvhub/doc/checklists/move_to_production.txt 2010-01-11 15:13:49 +0000
+++ app-mvhub/doc/checklists/move_to_production.txt 2010-03-12 01:40:52 +0000
@@ -180,3 +180,11 @@
sudo $EDITOR /etc/apache2/sites-available/mvhub_production.data
sudo apache2ctl graceful
+
+___ perform IMPORTANT hand cleanup
+
+ # follow the comments in each file so
+ # that content can be indexed by
+ # spiders (Google)
+ sudo $EDITOR app-mvhub/DocumentRoot/static/nsp/robots.txt
+ sudo $EDITOR app-mvhub/DocumentRoot/static/mvh/robots.txt
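
The hand cleanup above can also be scripted. A minimal sketch, assuming GNU sed and that the robots.txt lines match the committed text exactly; it flips each file between its development and production forms by swapping which User-agent/Disallow pair is commented out:

    # Flip each robots.txt between its development (deny all) and
    # production (allow all) form; the t branches keep a line that was
    # just commented out from being uncommented again in the same pass.
    for f in app-mvhub/DocumentRoot/static/nsp/robots.txt \
             app-mvhub/DocumentRoot/static/mvh/robots.txt; do
        sudo sed -i '
            s|^User-agent: \*$|# User-agent: *|
            t
            s|^# User-agent: \*$|User-agent: *|
            t
            s|^Disallow: /$|# Disallow: /|
            t
            s|^# Disallow:$|Disallow:|
        ' "$f"
    done

Hand-editing as the checklist describes works just as well; the loop only saves the two editor sessions.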