These can be ignored: the websites return a 403 "Forbidden" status when the check script requests the headers to verify that the URLs are valid.
There is not much you can do unless the websites are under your control.
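For context, the behaviour can be reproduced locally with base R's curlGetHeaders() (available since R 3.2.0), which is, roughly, what the URL checker relies on; this is a minimal sketch and needs network access:

```r
## Ask the server for the response headers of one of the flagged URLs.
h <- curlGetHeaders("https://bioguide.congress.gov/")
## The HTTP status code is stored in the "status" attribute;
## servers that reject automated requests typically return 403 here,
## even though the URL opens fine in a browser.
attr(h, "status")
```

A 403 usually means the server blocks non-browser clients (e.g. by user agent), so the NOTE reflects the server's policy, not a broken link.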

Best,
Uwe Ligges



On 10.11.2024 06:56, Spencer Graves wrote:
Hello:


       I'm getting:


Found the following (possibly) invalid URLs:
   URL: https://bioguide.congress.gov/
     From: man/readDW_NOMINATE.Rd
     Status: 403
     Message: Forbidden
   URL: https://www.bls.gov/cps/
     From: inst/doc/UpdatingUSGDPpresidents.html
     Status: 403
     Message: Forbidden


       These are in:


https://win-builder.r-project.org/9SsxyKVoV7n1/00check.log


      Searching for "forbidden" in "Writing R Extensions" and in a web search has turned up nothing.


       These are only NOTEs. Should I ignore them when submitting to CRAN?


       Thanks,
       Spencer Graves

______________________________________________
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel