Ah, a context start page with no inbound links, and therefore not crawlable, would do the job, I think... right?
Can you not just make a robots.txt document for the context and disallow things as you normally would? Google will hit the URL of that context and respect the robots.txt file, no?
We create a robots document in each of our contexts with a document type of text and never rely on a physical robots.txt file.
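For illustration, a minimal example of what such a per-context robots document might contain, assuming you want to keep the whole context out of search indexes (e.g. a development context); adjust the rules for contexts that should be crawled:

```text
# Served as /robots.txt within the context.
# Disallow-all: block every crawler from the entire context.
User-agent: *
Disallow: /
```

The document's content type should be text/plain, since crawlers expect robots.txt to be served as plain text.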
I am not sure because of the HTTP headers... I don't know if it works this way. It should work, but I have to be sure about it...
So it works in your case, right? Then it should work in my case too...
Hopefully development clouds will ship with a robots.txt that disallows indexing by default, as there is a risk of duplicate content otherwise.
Wow, that's a good hint, but is this possible for a single specific context within a multi-context environment too?
Can I use site_status as a context setting?
I will try that... great! That might be a safe solution...
I often do that for "test" or development environments. Logged-in Manager users see the site as usual, but everybody else just gets your offline message page.
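A sketch of that setup, assuming (as in stock MODX) that the system settings site_status and site_unavailable_page can be overridden per context via the context's Settings tab; the resource ID below is a made-up example:

```text
# Hypothetical per-context settings (Context -> Settings tab):
site_status           = 0     # take only this context offline
site_unavailable_page = 123   # ID of the "offline" message resource (example ID)
```

Anonymous visitors to that context then get the offline page, while the other contexts and logged-in Manager users are unaffected.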