A couple of ways.

Using RCurl, you can set the curl option dirlistonly.
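
Something like this is the quick version (a sketch, untested here; note
the trailing slash on the URL, and that dirlistonly gives you names
only, no sizes or dates):

library(RCurl)

url <- "ftp://e4ftl01.cr.usgs.gov/MOTA/MCD15A3.005/"
# dirlistonly asks the FTP server for a name-only listing
filenames <- getURL(url, ftp.use.epsv = FALSE, dirlistonly = TRUE)
# getURL returns one long string; split it into a character vector
filenames <- strsplit(filenames, "\r*\n")[[1]]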

Otherwise, you can read the full listing and parse it yourself. I've
got some code around here to do that; a rough sketch is below.
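
Roughly this, assuming the server sends a unix-style "ls -l" listing
with the name in the last field (names containing spaces would need
more care):

library(RCurl)

url <- "ftp://e4ftl01.cr.usgs.gov/MOTA/MCD15A3.005/"
# without dirlistonly you get the full listing:
# permissions, owner, size, date, name
listing <- getURL(url, ftp.use.epsv = FALSE)
lines <- strsplit(listing, "\r*\n")[[1]]
# keep 'lines' if you want the sizes and dates too; the file or
# folder name is everything after the last space
filenames <- sub(".* ", "", lines)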

Steve

On Mon, Apr 9, 2012 at 11:27 AM, Jonathan Greenberg <j...@illinois.edu> wrote:

> R-helpers:
>
> I'd like to be able to store all the file information from an FTP site
> (e.g. file and folder names) through an R command.  Any ideas on how to
> do this?  Here's an example site to use:
>
> ftp://e4ftl01.cr.usgs.gov/MOTA/MCD15A3.005
>
> --j
>
> --
> Jonathan A. Greenberg, PhD
> Assistant Professor
> Department of Geography and Geographic Information Science
> University of Illinois at Urbana-Champaign
> 607 South Mathews Avenue, MC 150
> Urbana, IL 61801
> Phone: 415-763-5476
> AIM: jgrn307, MSN: jgrn...@hotmail.com, Gchat: jgrn307, Skype: jgrn3007
> http://www.geog.illinois.edu/people/JonathanGreenberg.html
>
