-
It's not a loophole or a security issue; you just need to go to Studio --> Developer --> Menus --> Items and select 'System' in the first drop-down box and 'Site' in the second.
Now, for each menu item, click the Anyone link under the Visible To column and change it to the selected levels: Account, Standard, Moderator, Administrator, and Premium.
Do the same thing under Developer --> Pages and change the visibility for all of those pages. If you skip this step, anyone can still reach a page directly by URL, e.g. mysite.com/pages/posts-home. Once you set up visibility across your site, you will have nothing to worry about: the pages will no longer be listed in the menu, and if someone types the URL directly, the site will respond with ACCESS DENIED and display nothing.
All pages' Visibility is set to Anyone by default. It's up to us as builders to apply the tools properly for our needs and lock things down so the site is safe and secure for our members.
That's just another great feature and proof of how customizable the platform is. You can micro-manage every aspect of the site.
-
That is correct. You can leave the site open for a short period of exposure, then lock it down; after that, visitors will see Access Denied and will have to sign up in order to view.
-
After following these instructions, is there still a need for a robots.txt file? Or are we good with this very simple adjustment in Developer?
-
I would suggest doing both, since they tackle two different issues. Setting Visibility to selected levels ensures the page content cannot be viewed by non-members. The robots.txt file ensures that links to the content and pages cannot be indexed, and it is also helpful in blocking bots from indexing directories that contain scripts.
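For reference, a robots.txt file is just a plain-text file placed at the root of your site. The paths below are hypothetical examples, not the platform's actual directory names; substitute the directories you actually want to keep out of search engines:

```
# Hypothetical robots.txt sketch - adjust the paths to match your own site
User-agent: *
Disallow: /pages/     # example: keep member-only pages out of search results
Disallow: /scripts/   # example: block a directory that contains scripts
```

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but it does not actually block access to anything. The Visibility settings are what enforce the Access Denied response.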