robotstxt (0.4.0)


A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker.

Provides functions to download and parse 'robots.txt' files, making it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
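The permission check described above can be sketched as follows. This is a minimal illustration using the package's exported `robotstxt()` constructor and the `paths_allowed()` convenience wrapper; the domain and paths are placeholders, and network access is assumed:

```r
library(robotstxt)

# Download and parse the domain's robots.txt once ...
rt <- robotstxt(domain = "wikipedia.org")

# ... then check one or more paths for a given bot;
# returns a logical vector, one element per path
rt$check(paths = c("/", "/w/"), bot = "*")

# For one-off checks, the wrapper downloads, parses, and
# checks in a single call
paths_allowed(paths = "/", domain = "wikipedia.org", bot = "*")
```

Creating the `robotstxt` object once and reusing its `$check()` method avoids re-downloading the file when many paths are tested against the same domain.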

Maintainer: Peter Meissner
Author(s): Peter Meissner [aut, cre], Oliver Keys [ctb], Rich Fitz John [ctb]

License: MIT + file LICENSE

Uses: httr, stringr, testthat, knitr, dplyr, rmarkdown

Released about 1 month ago.

2 previous versions

