robotstxt (0.5.2)


A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker.

https://github.com/ropenscilabs/robotstxt
http://cran.r-project.org/web/packages/robotstxt

Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
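For illustration, a minimal usage sketch built on the package's exported get_robotstxt() and paths_allowed() functions; "example.com" is a placeholder domain:

  # download and parse the robots.txt of a domain
  library(robotstxt)
  rtxt <- get_robotstxt(domain = "example.com")

  # check whether a given bot may access specific paths;
  # returns a logical vector, one element per path
  paths_allowed(
    paths  = c("/", "/private/"),
    domain = "example.com",
    bot    = "*"
  )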

Maintainer: Peter Meissner
Author(s): Peter Meissner [aut, cre], Oliver Keys [ctb], Rich Fitz John [ctb]

License: MIT + file LICENSE

Uses: future, httr, magrittr, spiderbar, stringr, testthat, knitr, dplyr, rmarkdown, covr
Reverse suggests: spiderbar

Released about 1 month ago.


4 previous versions





