I'm plagued with this error:
Retrying connect to server: localhost/127.0.0.1:9000.

I'm trying to set up hadoop on a new machine, just a basic pseudo-distributed 
setup.  I've done this quite a few times on other machines, but this time I'm 
kinda stuck.  I formatted the namenode without obvious errors and ran 
start-all.sh with no errors to stdout.  However, the logs are full of the 
error above, and if I attempt to access HDFS (e.g., "hadoop fs -ls /") I get 
the same error.  Naturally, my core-site.xml sets fs.default.name to 
"hdfs://localhost:9000".
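
For reference, the relevant fragment of my core-site.xml is just the standard 
pseudo-distributed setting (in newer Hadoop releases the property is named 
fs.defaultFS, but the value is the same):

```xml
<!-- core-site.xml: points the default filesystem at the local NameNode -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```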

I assume something is wrong with /etc/hosts, but I'm not sure how to fix it.  
If "hostname" returns X and "hostname -f" returns Y, what should the 
corresponding entries in /etc/hosts look like?
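
For illustration, here is the kind of layout I'm imagining, with X and Y as 
placeholders for those two outputs and 192.168.1.10 as a purely hypothetical 
address -- I'm not sure which lines Hadoop actually requires:

```
# /etc/hosts -- X = `hostname`, Y = `hostname -f` (placeholders)
127.0.0.1     localhost
127.0.1.1     Y X    # Debian/Ubuntu convention; I've read this entry can
                     # confuse Hadoop, which may expect a routable address
192.168.1.10  Y X    # hypothetical static IP: canonical (fully-qualified)
                     # name first, short alias after
```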

Thanks for any help.

________________________________________________________________________________
Keith Wiley     [email protected]     keithwiley.com    music.keithwiley.com

"I used to be with it, but then they changed what it was.  Now, what I'm with
isn't it, and what's it seems weird and scary to me."
                                           --  Abe (Grandpa) Simpson
________________________________________________________________________________
