Product support

Get help in the plugin support forum.

Generate humans.txt and robots.txt for your web app with ease.

Quick Start

Once the plugin is installed, click Txt on the Settings page.

For Humans.txt

  • Enable the Humans Txt option in the Settings sub-section.
  • Create some human entries in the Humans sub-section.
  • Add some entries with fields and values (for example, FieldName: Developer, Value: YourName for Team). You can find some examples on the Humans standard page or check out this plugin's screenshots.
  • Click the Goto Human Txt button to see the generated humans.txt file.
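The steps above boil down to turning field/value entries into a plain-text file. Here is a minimal sketch of that idea in Python; the section names and data model are illustrative assumptions, not the plugin's actual implementation:

```python
# Hypothetical sketch: render {section: [(field, value), ...]} into
# humans.txt-style text. Not the plugin's real code.
def build_humans_txt(sections):
    lines = []
    for section, entries in sections.items():
        lines.append(f"/* {section} */")          # section header comment
        lines.extend(f"{field}: {value}" for field, value in entries)
        lines.append("")                           # blank line between sections
    return "\n".join(lines)

entries = {
    "TEAM": [("Developer", "YourName"), ("Site", "example.com")],
    "THANKS": [("Name", "Contributors")],
}
print(build_humans_txt(entries))
```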

For Robots.txt

  • Enable the Robots Txt option in the Settings sub-section.
  • Go to the Agents sub-section and add some web crawler and search agents. You can use the Populate button to fetch a list of agents, either from the plugin's GitHub page (a list of common agents is provided) or from your own custom resource file (such as a .txt or .csv file).
  • Now that the agents are available, navigate to the Robots sub-section and add some robot entries.
  • Select the desired robot and define the directives that will influence its behavior. You can find some example robots files via the Robots Exclusion standard page, or check out Facebook's robots.txt for ideas.
  • Click the Goto Robots Txt button to see the generated robots.txt file.
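Conceptually, the generated robots.txt groups each agent's directives under a User-agent line. The sketch below shows that composition in Python; the data structures are assumptions for illustration, not the plugin's internals:

```python
# Hypothetical sketch: render [(user_agent, [(directive, path), ...]), ...]
# as robots.txt text. Not the plugin's real code.
def build_robots_txt(rules):
    blocks = []
    for agent, directives in rules:
        lines = [f"User-agent: {agent}"]
        lines += [f"{directive}: {path}" for directive, path in directives]
        blocks.append("\n".join(lines))
    return "\n\n".join(blocks) + "\n"

rules = [
    ("Googlebot", [("Disallow", "/admin/"), ("Allow", "/admin/public/")]),
    ("*", [("Disallow", "/tmp/")]),
]
print(build_robots_txt(rules))
```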

Crawler List Example

By default, the plugin populates the Agents section with popular web search crawlers. If you want, you can also download this importable CSV file and import it into the Agents table using the Import feature.
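To make the import format concrete, here is a hypothetical example of parsing an agents CSV in Python; the sample content and single-column layout are assumptions, and the real file's columns may differ:

```python
import csv
import io

# Hypothetical one-column CSV of agent names (layout is an assumption).
sample = "Googlebot\nBingbot\nDuckDuckBot\n"

agents = [row[0] for row in csv.reader(io.StringIO(sample)) if row]
print(agents)  # → ['Googlebot', 'Bingbot', 'DuckDuckBot']
```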

Enjoy!

Like the Sitemap plugin, the Txt plugin works out of the box and does not require any direct development to operate.

Adding the Txts to Web Pages

  • In addition to the generated txt files, you may also want to add meta tags to the pages of your web app. The plugin does not include components to generate these automatically, as it is a trivial task. Here's how you can do it:

To point to the location of humans.txt, add

<link type="text/plain" rel="author" href="{{ '/'|app }}/humans.txt" />

To prevent a page from being indexed, add

<meta name="robots" content="nofollow, noindex">

The terms all, index, nofollow, and noindex may be used as per the W3C's standards, but some search engines consider other terms as well. Here's a list of possible terms:

index

Allows search engine robots to index the page. You don't have to add this to your pages, as it's the default.

noindex

Disallows search engines from showing this page in their results.

noimageindex

Disallows search engines from spidering images on that page. Of course, if images are linked to directly from elsewhere, Google can still index them, so using an X-Robots-Tag HTTP header is a better idea.

none

This is a shortcut for noindex, nofollow; it basically tells search engines not to do anything with this page at all.

follow

Tells search engine robots to follow the links on the page, whether or not they can index it.

nofollow

Tells search engine robots not to follow any links on the page at all.

noarchive

Prevents the search engines from showing a cached copy of this page.

nocache

Same as noarchive, but only used by MSN/Live.

nosnippet

Prevents the search engines from showing a snippet of this page in the search results and prevents them from caching the page.

noodp

Blocks search engines from using the description for this page in DMOZ (aka ODP) as the snippet for your page in the search results.

noydir

Blocks Yahoo! from using the description for this page in the Yahoo! directory as the snippet for your page in the search results. No other search engines use the Yahoo! directory for this purpose, so they don’t support the tag.
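The terms above combine as a comma-separated list in a single meta tag. A small hypothetical helper (not part of the plugin) makes this concrete:

```python
# Hypothetical helper: join robots directive terms into one meta tag.
def robots_meta(*terms):
    return '<meta name="robots" content="{}">'.format(", ".join(terms))

print(robots_meta("noindex", "nofollow"))
# → <meta name="robots" content="noindex, nofollow">
```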

1.1.2

Database maintenance. Removed all the unused timestamp columns.

Dec 08, 2016

1.1.1

Upgraded Agents to use the ImportExport behavior instead of custom code.

Dec 08, 2016

1.1.0

Updated human field labels to use a repeater.

Dec 07, 2016

1.0.9

Download Txt Feature

Dec 06, 2016

1.0.8

!!! Important bug fix

Oct 08, 2016

1.0.7

Added new type for directive

Apr 07, 2016

1.0.6

Added setting permissions. PR by @matissjanis

Sep 14, 2015

1.0.5

Added compatibility with RainLab.Sitemap

Sep 06, 2015

1.0.4

Minor Bug Fix

Sep 05, 2015

1.0.3

Add seed data for agents table.

Feb 21, 2015

1.0.2

Created Humans Table

Feb 17, 2015

1.0.1

First version of Txt

Feb 17, 2015