
Generate humans.txt and robots.txt for your web app with ease.


The world has humans and robots. Humans build things (including robots) and deserve credit when they do. Robots, on the other hand, do what they are built to do, and sometimes you want to exclude certain behaviors. This plugin helps you do both with ease.

As the title says, it allows you to generate humans.txt and robots.txt for your web app.

Quick Start

Once the plugin is installed, click on Txt in the Settings page.

For Humans.txt

  • Enable the humans txt option in the Settings sub-section.
  • Create some human entries in the Humans sub-section.
  • Add entries with a field and a value (for example, FieldName: Developer, Value: YourName under Team). You can find examples on the Humans standard page or in this plugin's screenshots.
  • Click the Goto Human Txt button to see the generated humans.txt file.
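Given entries like the Developer/YourName example above, the generated file typically follows the humanstxt.org layout of commented sections and Field: Value lines. A sketch of possible output (all names and values are placeholders):

```text
/* TEAM */
Developer: YourName
Contact: yourname [at] example.com
Location: City, Country

/* SITE */
Standards: HTML5, CSS3
Components: October CMS
```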

For Robots.txt

  • Enable the robots txt option in the Settings sub-section.
  • Go to the Agents sub-section and add some web crawler and search agents. You can use the populate button to fetch a list of agents, either from the plugin's GitHub page (a list of common agents is provided) or from your own custom resource file (such as a .txt or .csv file).
  • Now that the agents are available, navigate to the Robots sub-section and add some robot entries.
  • Select the desired robot and define directives that influence its behavior. You can find example robots files on the Robots Exclusion standard page, or check out Facebook's robots.txt for ideas.
  • Click the Goto Robots Txt button to see the generated robots.txt file.
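Put together, the agents and directives you define render as standard Robots Exclusion rules. A sketch of possible output (agent names and paths are illustrative, not plugin defaults):

```text
# Applies to all crawlers
User-agent: *
Disallow: /admin
Allow: /

# Blocks one specific crawler entirely
User-agent: ExampleBot
Disallow: /
```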


Like the Sitemap plugin, the Txt plugin works out of the box and does not require any direct development to operate.

Adding the Txts to Web Pages

In addition to the generated Txts, you may also want to add meta tags to the pages of your web app. The plugin does not include components to generate these automatically, as it is a trivial task. Here's how you can do it:

To point to the location of humans.txt, add

<link type="text/plain" rel="author" href="{{ '/'|app }}/humans.txt" />

To prevent a page from being indexed, add

<meta name="robots" content="nofollow, noindex">

The terms all, index, nofollow, and noindex may be used as per the W3C standards, but some search engines consider other terms as well. Here's a list of possible terms:

  • index: Allows search engine robots to index the page. You don't have to add this to your pages, as it's the default.
  • noindex: Disallows search engines from showing this page in their results.
  • noimageindex: Disallows search engines from spidering images on that page. Of course, if images are linked to directly from elsewhere, Google can still index them, so using an X-Robots-Tag HTTP header is a better idea.
  • none: A shortcut for noindex, nofollow; basically telling search engines: don't do anything with this page at all.
  • follow: Tells search engine robots to follow the links on the page, whether it can index it or not.
  • nofollow: Tells search engine robots not to follow any links on the page at all.
  • noarchive: Prevents search engines from showing a cached copy of this page.
  • nocache: Same as noarchive, but only used by MSN/Live.
  • nosnippet: Prevents search engines from showing a snippet of this page in the search results and prevents them from caching the page.
  • noodp: Blocks search engines from using the description for this page in DMOZ (aka ODP) as the snippet for your page in the search results.
  • noydir: Blocks Yahoo! from using the description for this page in the Yahoo! directory as the snippet for your page in the search results. No other search engines use the Yahoo! directory for this purpose, so they don't support the tag.
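For directives that are better delivered as an HTTP header (as with the X-Robots-Tag mentioned above, which also covers non-HTML resources such as images), the header can be set at the server level. A minimal sketch (the header syntax is standard; the nginx location path is an illustrative assumption, not part of this plugin):

```text
# Raw HTTP response header
X-Robots-Tag: noindex, nofollow

# Equivalent nginx configuration (path is illustrative)
location /private/ {
    add_header X-Robots-Tag "noindex, nofollow";
}
```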


Changelog

  • Feb 22, 2022: Normalized Txt tables to improve efficiency.
  • Feb 16, 2022: Added New Line Txt option.
  • Feb 16, 2022: Upgraded plugin for October 2.0.
  • Dec 08, 2016: Database maintenance. Removed all the unused timestamp columns.
  • Dec 08, 2016: Upgraded Agents to use the ImportExport behavior instead of custom code.
  • Dec 07, 2016: Updated human field labels to use a repeater.
  • Dec 06, 2016: Added the Download Txt feature.
  • Oct 08, 2016: Important bug fix.
  • Apr 07, 2016: Added a new type for directives.
  • Sep 14, 2015: Added setting permissions. PR by @matissjanis.
  • Sep 06, 2015: Added compatibility with RainLab.Sitemap.
  • Sep 05, 2015: Minor bug fix.
  • Feb 21, 2015: Added seed data for the agents table.
  • Feb 17, 2015: Created Humans table.
  • Feb 17, 2015: First version of Txt.