{"id":954,"date":"2018-05-03T13:06:13","date_gmt":"2018-05-03T13:06:13","guid":{"rendered":"https:\/\/chemicloud.com\/kb\/?post_type=article&#038;p=954"},"modified":"2023-12-14T11:51:39","modified_gmt":"2023-12-14T11:51:39","slug":"create-robots-txt-file-cpanel","status":"publish","type":"ht_kb","link":"https:\/\/chemicloud.com\/kb\/article\/create-robots-txt-file-cpanel\/","title":{"rendered":"How to Create a robots.txt File in cPanel"},"content":{"rendered":"<article id=\"general\">If you&#8217;ve ever built your website, you may have heard of a <a href=\"https:\/\/chemicloud.com\/glossary\/term\/robots-txt\/\" target=\"_blank\" rel=\"noopener\">robots.txt<\/a> file and wondered, what is this file for? Well, you&#8217;re in the right place! Below, we will review this file and why it&#8217;s crucial.<\/article>\n<article>\n<h3 id=\"what-is-a-robots-txt-file\" style=\"text-align: justify;\">What is a robots.txt file?<\/h3>\n<p>First of all, the <em><strong>robots.txt<\/strong><\/em> is nothing more than a plain text file (ASCII or UTF-8) located in your domain <strong>root\u00a0directory<\/strong>, which blocks (or allows) search engines to access certain areas of your site. The <em>robots.txt<\/em> contains a simple set of commands (or directives), and it\u2019s typically applied to restrict crawler traffic onto your server, thus preventing unwanted resource usage.<\/p>\n<p>Search engines use so-called <strong>crawlers<\/strong> (or bots) to index parts of a website and return those as search results. You might want specific sensitive data stored on your server to be inaccessible for web searches. The robots.txt file helps you do just that.<\/p>\n<p style=\"text-align: justify;\"><strong>Note:<\/strong> Files or pages on your website are not entirely removed from crawlers if these files are indexed\/referenced from other websites. 
To prevent your URL from appearing in Google search results, you can password-protect the files directly on your server.<\/p>\n<\/article>\n<article id=\"creation\">\n<h3 id=\"how-to-create-the-robots-txt-file\" style=\"text-align: justify;\">How to create the robots.txt file<strong><br \/>\n<\/strong><\/h3>\n<p>To create your <em>robots.txt<\/em> file (if one does not already exist), follow these steps:<\/p>\n<p><strong>1.<\/strong> <a href=\"https:\/\/chemicloud.com\/kb\/article\/how-to-login-cpanel-whm\/\" target=\"_blank\" rel=\"noopener noreferrer\">Log into your cPanel account<\/a><\/p>\n<p><strong>2.<\/strong> Navigate to the <em>FILES<\/em> section and click on <strong>File Manager<\/strong><\/p>\n<figure id=\"attachment_7700\" aria-describedby=\"caption-attachment-7700\" style=\"width: 643px\" class=\"wp-caption alignnone\"><img loading=\"lazy\" decoding=\"async\" class=\"size-large wp-image-7700\" src=\"https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/cPanel-Files-File-Manager-1-783x295.png\" alt=\"\" width=\"643\" height=\"242\" srcset=\"https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/cPanel-Files-File-Manager-1-783x295.png 783w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/cPanel-Files-File-Manager-1-300x113.png 300w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/cPanel-Files-File-Manager-1-768x290.png 768w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/cPanel-Files-File-Manager-1-50x19.png 50w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/cPanel-Files-File-Manager-1-1536x580.png 1536w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/cPanel-Files-File-Manager-1-60x23.png 60w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/cPanel-Files-File-Manager-1-100x38.png 100w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/cPanel-Files-File-Manager-1.png 1728w\" sizes=\"auto, (max-width: 643px) 100vw, 643px\" \/><figcaption id=\"caption-attachment-7700\" class=\"wp-caption-text\">cPanel &gt; Files &gt; File Manager<\/figcaption><\/figure>\n<p><strong>3.<\/strong>\u00a0 In File Manager, browse to the website&#8217;s directory (e.g., public_html), then click \u201c<strong>File<\/strong>\u201d, type in \u201crobots.txt\u201d, and click \u201c<strong>Create New File<\/strong>\u201d.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-large wp-image-7701\" src=\"https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/robots.txt-file-783x387.png\" alt=\"\" width=\"643\" height=\"318\" srcset=\"https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/robots.txt-file-783x387.png 783w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/robots.txt-file-300x148.png 300w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/robots.txt-file-768x380.png 768w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/robots.txt-file-50x25.png 50w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/robots.txt-file-1536x759.png 1536w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/robots.txt-file-60x30.png 60w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/robots.txt-file-100x49.png 100w, https:\/\/chemicloud.com\/kb\/wp-content\/uploads\/2018\/05\/robots.txt-file.png 1708w\" sizes=\"auto, (max-width: 643px) 100vw, 643px\" \/><\/p>\n<p><strong>4.<\/strong> Now you are free to edit the content of this file by double-clicking on it.<\/p>\n<p style=\"text-align: justify;\"><strong>Note: <\/strong>You can create only <strong>one<\/strong> <em>robots.txt<\/em> file for each domain. Duplicates are not allowed on the same root path. Each domain or subdomain must contain its own <em>robots.txt<\/em> file. 
<\/p>\n<\/article>\n<article id=\"examples\">\n<h3 id=\"examples-of-usage-and-syntax-rules\" style=\"text-align: justify;\">Examples of usage and syntax rules<\/h3>\n<p style=\"text-align: justify;\">Usually, a <em>robots.txt<\/em> file contains one or more rules, each on its own line. Each rule blocks or allows a given crawler&#8217;s access to a specified file path or to the entire website.<\/p>\n<ul>\n<li style=\"text-align: justify;\">Block all crawlers (user-agents) from accessing the <em>logs<\/em> and <em>ssl<\/em> directories.<\/li>\n<\/ul>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\">User-agent: *\r\nDisallow: \/logs\/\r\nDisallow: \/ssl\/<\/pre>\n<ul>\n<li>Block all crawlers from indexing the whole site.<\/li>\n<\/ul>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\">User-agent: *\r\nDisallow: \/<\/pre>\n<ul>\n<li style=\"text-align: justify;\">Allow all user agents to access the entire site.<\/li>\n<\/ul>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\">User-agent: *\r\nAllow: \/<\/pre>\n<ul>\n<li style=\"text-align: justify;\">Block a specific crawler from indexing the whole site.<\/li>\n<\/ul>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\">User-agent: Bot1\r\nDisallow: \/<\/pre>\n<ul>\n<li style=\"text-align: justify;\">Allow a specific web crawler to index the site and block all others.<\/li>\n<\/ul>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\">User-agent: Googlebot\r\nDisallow:\r\nUser-agent: *\r\nDisallow: \/<\/pre>\n<\/article>\n<ul style=\"text-align: justify;\">\n<li>Under <em><strong>User-agent<\/strong>:\u00a0<\/em> you can type in a specific crawler name, or match all crawlers by typing the asterisk (*) symbol. Note that the wildcard does not cover Google&#8217;s AdsBot crawlers, which you need to name explicitly. 
Lists of crawler user-agent names are available online.<\/li>\n<li>Additionally, for the <strong><em>Allow<\/em><\/strong> and <strong><em>Disallow<\/em><\/strong> directives to apply only to a specific file or folder, you must always write its path between \u201c<strong>\/<\/strong>\u201d characters.<\/li>\n<li>Note that both directives are case-sensitive. It is also important to know that, by default, crawlers can access any page or directory that is not blocked by a <em>Disallow<\/em>: rule.<\/li>\n<\/ul>\n<p style=\"text-align: justify;\"><strong>Note: <\/strong>You can find complete rules and syntax examples <a href=\"https:\/\/support.google.com\/webmasters\/answer\/6062596?hl=en&amp;ref_topic=6061961\" rel=\"nofollow noopener\" target=\"_blank\">here<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>If you&#8217;ve ever built your own website, you may have heard of a robots.txt file and wondered, what is this file for? Well, you&#8217;re in the right place! Below, we will review this file and why it&#8217;s crucial. What is a robots.txt file? 
First of all, the robots.txt is nothing more&#8230;<\/p>\n","protected":false},"author":10,"featured_media":0,"comment_status":"open","ping_status":"closed","template":"","format":"standard","meta":{"_crdt_document":"","footnotes":""},"ht-kb-category":[188],"ht-kb-tag":[],"class_list":["post-954","ht_kb","type-ht_kb","status-publish","format-standard","hentry","ht_kb_category-support-resources"],"_links":{"self":[{"href":"https:\/\/chemicloud.com\/kb\/wp-json\/wp\/v2\/ht-kb\/954","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/chemicloud.com\/kb\/wp-json\/wp\/v2\/ht-kb"}],"about":[{"href":"https:\/\/chemicloud.com\/kb\/wp-json\/wp\/v2\/types\/ht_kb"}],"author":[{"embeddable":true,"href":"https:\/\/chemicloud.com\/kb\/wp-json\/wp\/v2\/users\/10"}],"replies":[{"embeddable":true,"href":"https:\/\/chemicloud.com\/kb\/wp-json\/wp\/v2\/comments?post=954"}],"version-history":[{"count":10,"href":"https:\/\/chemicloud.com\/kb\/wp-json\/wp\/v2\/ht-kb\/954\/revisions"}],"predecessor-version":[{"id":8097,"href":"https:\/\/chemicloud.com\/kb\/wp-json\/wp\/v2\/ht-kb\/954\/revisions\/8097"}],"wp:attachment":[{"href":"https:\/\/chemicloud.com\/kb\/wp-json\/wp\/v2\/media?parent=954"}],"wp:term":[{"taxonomy":"ht_kb_category","embeddable":true,"href":"https:\/\/chemicloud.com\/kb\/wp-json\/wp\/v2\/ht-kb-category?post=954"},{"taxonomy":"ht_kb_tag","embeddable":true,"href":"https:\/\/chemicloud.com\/kb\/wp-json\/wp\/v2\/ht-kb-tag?post=954"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
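The article's final example (allow Googlebot, block every other crawler) can be sanity-checked locally with Python's standard-library `urllib.robotparser`; this is a minimal sketch using the example rules verbatim, and `example.com` is a placeholder domain:

```python
from urllib.robotparser import RobotFileParser

# The rules from the article's last example: Googlebot gets an empty
# Disallow (everything allowed); all other crawlers are blocked entirely.
rules = [
    "User-agent: Googlebot",
    "Disallow:",
    "",
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Googlebot matches its own group, so any path is allowed.
print(parser.can_fetch("Googlebot", "https://example.com/any-page"))  # True

# Any other crawler falls through to the wildcard group and is blocked.
print(parser.can_fetch("SomeOtherBot", "https://example.com/any-page"))  # False
```

`parse()` works on a list of lines, so no network access is needed; `can_fetch(user_agent, url)` then reports whether that crawler may request the URL under the parsed rules.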