Generate humans.txt and robots.txt for your web app with ease.
Introduction
The world has humans and robots. Humans build things (including robots) and deserve credit for what they build. Robots, on the other hand, do what they are built to do, and sometimes you want to exclude certain behaviors. This plugin helps you handle both with ease.
As the title says, it allows you to generate humans.txt and robots.txt for your web app.
Quick Start
Once the plugin is installed, click on Txt in the Settings page.
For Humans.txt
- Enable the Humans Txt option in the Settings sub-section.
- Create some human entries in the Humans sub-section.
- Add some entries with fields and values (for example, Field Name: Developer, Value: YourName for the Team section). You can find some examples on the humans.txt standard page or check out this plugin's screenshots.
- Click on the Goto Human Txt button to see the generated humans.txt file.
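For reference, an entry set like the example above would produce a humans.txt file along these lines (the section headings follow the humans.txt standard; the names and values are placeholders, not output captured from the plugin):

/* TEAM */
Developer: YourName
Location: YourCity, YourCountry

/* THANKS */
Name: The Open Source Community

/* SITE */
Last update: 2022/02/22
Standards: HTML5, CSS3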
For Robots.txt
- Enable the Robots Txt option in the Settings sub-section.
- Go to the Agents sub-section and add some web crawler and search agents. You can use the populate button to fetch a list of agents, either from the plugin's GitHub page (a list of common agents is provided) or from your own custom resource file (such as a .txt or .csv file).
- Now that the agents are available, navigate to the Robots sub-section and add some robot entries.
- You may select the desired robot and define different directives which will influence the behavior of the selected robot. You can find example robots.txt files via the Robots Exclusion standard page or check out Facebook's robots.txt for ideas.
- Click on the Goto Robots Txt button to see the generated robots.txt file.
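As a rough sketch of the kind of output this produces, a robots.txt built from one named agent plus a wildcard entry, each with a few directives, might look like this (the agent names, paths, and sitemap URL are placeholders):

User-agent: Googlebot
Disallow: /backend/
Allow: /blog/

User-agent: *
Disallow: /private/

Sitemap: https://example.com/sitemap.xml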
Enjoy!
Like the Sitemap plugin, the Txt plugin works out of the box and does not require any direct development to operate.
Adding the Txts to Web Pages
In addition to the generated Txt files, you may also want to add meta tags to the pages of your web app. Components have not been added to the plugin to generate these automatically, as it is a trivial task. Here's how you can do it:
To point to the location of humans.txt, add
<link type="text/plain" rel="author" href="{{ '/'|app }}/humans.txt" />
To prevent a page from being indexed, add
<meta name="robots" content="nofollow, noindex">
The terms all, index, nofollow, and noindex may be used as per the W3C standards, but there are search engines that consider other terms as well. Here's a list of possible terms:
index
Allows search engine robots to index the page; you don't have to add this to your pages, as it's the default.
noindex
Disallows search engines from showing this page in their results.
noimageindex
Disallows search engines from spidering images on that page. Of course, if images are linked to directly from elsewhere, Google can still index them, so using an X-Robots-Tag HTTP header is a better idea (see the example after this list).
none
This is a shortcut for noindex, nofollow, or basically saying to search engines: don't do anything with this page at all.
follow
Tells search engine robots to follow the links on the page, whether it can index the page or not.
nofollow
Tells search engine robots not to follow any links on the page at all.
noarchive
Prevents the search engines from showing a cached copy of this page.
nocache
Same as noarchive, but only used by MSN/Live.
nosnippet
Prevents the search engines from showing a snippet of this page in the search results and prevents them from caching the page.
noodp
Blocks search engines from using the description for this page in DMOZ (aka ODP) as the snippet for your page in the search results.
noydir
Blocks Yahoo! from using the description for this page in the Yahoo! directory as the snippet for your page in the search results. No other search engines use the Yahoo! directory for this purpose, so they don’t support the tag.
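For the X-Robots-Tag header mentioned under noimageindex, the directive is sent as an HTTP response header instead of a meta tag, so it also covers non-HTML resources such as images. A minimal sketch for Apache, assuming mod_headers is enabled and placed in .htaccess or the virtual host configuration, might be:

<FilesMatch "\.(png|jpe?g|gif)$">
  Header set X-Robots-Tag "noimageindex"
</FilesMatch>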
Reviews
- Ruth Cheesley (found the plugin useful on 5 Apr, 2019): Nice plugin - simple to use and does what it says on the tin.
- Tim (found the plugin useful on 22 Aug, 2017): Easy to integrate and configure.
Version History

Version | Description | Date
---|---|---
2.0.2 | Normalized Txt tables to improve efficiency. | Feb 22, 2022
2.0.1 | Adds New Line Txt Option | Feb 16, 2022
2.0.0 | Upgraded plugin for October 2.0. | Feb 16, 2022
1.1.3 | Upgraded plugin for October 2.0. | Feb 16, 2022
1.1.2 | Database maintenance. Removed all the unused timestamp columns. | Dec 08, 2016
1.1.1 | Upgraded Agents to use ImportExport behavior rather than custom code. | Dec 08, 2016
1.1.0 | Updated human field labels to repeater | Dec 07, 2016
1.0.9 | Download Txt Feature | Dec 06, 2016
1.0.8 | !!! Important bug fix | Oct 08, 2016
1.0.7 | Added new type for directive | Apr 07, 2016
1.0.6 | Added setting permissions. PR by @matissjanis | Sep 14, 2015
1.0.5 | Added compatibility with RainLab.Sitemap | Sep 06, 2015
1.0.4 | Minor Bug Fix | Sep 05, 2015
1.0.3 | Add seed data for agents table. | Feb 21, 2015
1.0.2 | Created Humans Table | Feb 17, 2015
1.0.1 | First version of Txt | Feb 17, 2015