SpyREST

GET /robots.txt

Description

No description given

Examples

Example 1

No description given

Recorded at

2015-11-04 01:13:03 UTC

Request URL

GET /robots.txt

Request Headers

accept: */*
user-agent: Mozilla/5.0 (compatible; MJ12bot/v1.4.5; http://www.majestic12.co.uk/bot.php?+)
      
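The recorded request can be reproduced in a few lines of Python. The sketch below is illustrative only: the spyrest.com host is assumed from the page context, the standard library http.client is used, and the user-agent string is simply copied from the recorded headers.

import http.client

# Assumed host for the documented API; adjust as needed.
conn = http.client.HTTPConnection("spyrest.com")
conn.request(
    "GET",
    "/robots.txt",
    headers={
        "Accept": "*/*",
        # User-agent copied verbatim from the recorded example.
        "User-Agent": "Mozilla/5.0 (compatible; MJ12bot/v1.4.5; "
                      "http://www.majestic12.co.uk/bot.php?+)",
    },
)
response = conn.getresponse()
print(response.status, response.reason)   # expected: 200 OK
print(response.read().decode("utf-8"))    # the robots.txt body shown below
conn.close()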

Response Headers

transfer-encoding: chunked
accept-ranges: bytes
etag: "55b0893c-ca"
connection: close
last-modified: Thu, 23 Jul 2015 06:27:08 GMT
content-length: 202
content-type: text/plain
date: Wed, 04 Nov 2015 01:13:04 GMT
server: nginx/1.6.2
      
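The etag and last-modified values above are cache validators: sending them back as If-None-Match and If-Modified-Since should return 304 Not Modified while the file is unchanged. A minimal conditional-request sketch, again assuming the spyrest.com host:

import http.client

conn = http.client.HTTPConnection("spyrest.com")   # assumed host
conn.request(
    "GET",
    "/robots.txt",
    headers={
        # Validators taken from the recorded response headers above.
        "If-None-Match": '"55b0893c-ca"',
        "If-Modified-Since": "Thu, 23 Jul 2015 06:27:08 GMT",
    },
)
response = conn.getresponse()
print(response.status)   # 304 if unchanged, 200 with a fresh body otherwise
conn.close()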

Response Body

Shortened for readability

# See http://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site uncomment the next two lines:
# User-agent: *
# Disallow: /
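
A crawler would normally feed this file to a robots.txt parser rather than reading it by hand. Below is a sketch using Python's standard urllib.robotparser, assuming the same spyrest.com host and the MJ12bot user agent seen in the recorded request:

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://spyrest.com/robots.txt")   # assumed URL for the recorded host
rp.read()

# With every rule commented out, as in the body above, all paths are allowed.
print(rp.can_fetch("MJ12bot", "/"))     # True
print(rp.can_fetch("*", "/any/path"))   # True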