Re: [R] scraping with session cookies

2012-09-21 Thread CPV
> You can use the "getNodeSet" function to extract whatever links or text
> you want from that page.
>
> I hope this helps.
>
> Best,
> Heramb
>
> On Wed, Sep 19, 2012 at 10:26 PM, CPV wrote:
>
>> Thanks again,
>>
>> I run the scr…
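For reference, a minimal sketch of the getNodeSet approach Heramb describes, assuming the page content has already been downloaded into a character string (here called `page`); the XPath expressions and variable names are illustrative, not taken from the original posts:

    library(XML)

    # Parse the HTML fetched earlier (e.g. with getURLContent)
    doc <- htmlParse(page, asText = TRUE)

    # getNodeSet() takes an XPath expression; here we pull every link and its target
    link_nodes <- getNodeSet(doc, "//a")
    hrefs <- sapply(link_nodes, xmlGetAttr, "href")

    # Text content can be extracted the same way, e.g. all table cells
    cell_text <- sapply(getNodeSet(doc, "//td"), xmlValue)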

Re: [R] scraping with session cookies

2012-09-19 Thread CPV
> git clone g...@github.com:omegahat/RHTMLForms.git
>
> and that has the fixes for handling the degenerate forms with
> no arguments.
>
> D.
>
> On 9/19/12 7:51 AM, CPV wrote:
>> Thank you for your help Duncan,
>>
>> I have been trying what you suggested…
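The RHTMLForms build with those fixes lives on GitHub rather than CRAN, so after cloning it has to be installed from source. A hedged sketch of the install step, assuming a standard R source install (the working directory and the availability of devtools are assumptions, not part of the original exchange):

    # Run from the directory that contains the cloned RHTMLForms folder
    install.packages("RHTMLForms", repos = NULL, type = "source")

    # Alternatively, if the devtools package is available and the
    # omegahat/RHTMLForms repository is publicly readable:
    # devtools::install_github("omegahat/RHTMLForms")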

Re: [R] scraping with session cookies

2012-09-19 Thread CPV
> forms = getHTMLFormDescription(u, FALSE)
> fun = createFunction(forms[[1]])
>
> Then we can use
>
> fun(.curl = curl)
>
> instead of
>
> postForm(site, disclaimer_action="I Agree")
>
> This helps to abstract the details of the form.
>
> D.
>
> On 9/18/12 5:57 PM, CPV wrote:…
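Pieced together, Duncan's suggestion amounts to roughly the flow below. This is a sketch under the assumption that `u` is the URL of the disclaimer page, `curl` is a handle that keeps the session cookie, and the disclaimer form is the first form on the page; the URL is a placeholder:

    library(RCurl)
    library(RHTMLForms)

    u <- "http://www.example.com/disclaimer"      # placeholder URL
    # Reusing one handle keeps the session cookie across requests
    curl <- getCurlHandle(cookiefile = "", followlocation = TRUE)

    # Describe the forms on the page and turn the first one into an R function
    forms <- getHTMLFormDescription(u, FALSE)
    fun   <- createFunction(forms[[1]])

    # Submitting through the same handle preserves the cookie,
    # replacing the hand-written postForm() call
    fun(.curl = curl)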

[R] scraping with session cookies

2012-09-18 Thread CPV
Hi, I am starting to code in R and one of the things I want to do is scrape some data from the web. The problem I am having is that I cannot get past the disclaimer page (which produces a session cookie). I have been able to collect some ideas and combine them in the code below, but I…
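The code the poster refers to is cut off in this archive view. For context, the usual RCurl pattern for a disclaimer page that sets a session cookie looks roughly like the sketch below; the URL and the form field name are placeholders, not the poster's actual values:

    library(RCurl)
    library(XML)

    site <- "http://www.example.com/data"         # placeholder for the real site
    # cookiefile = "" tells libcurl to keep session cookies in memory
    curl <- getCurlHandle(cookiefile = "", followlocation = TRUE, useragent = "R")

    # Accept the disclaimer; the field name is site-specific
    postForm(site, disclaimer_action = "I Agree", curl = curl)

    # Later requests through the same handle carry the session cookie
    page <- getURLContent(site, curl = curl)
    doc  <- htmlParse(page, asText = TRUE)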