I was able to fix it. I had been using files that I transferred by FTP from my Windows machine to my Linux server. I reinstalled everything from the nutch-0.9 release directly on the server, and somehow that solved my problem. My guess is that it was a DOS/Unix line-ending issue or a file permission error introduced by the transfer.
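For anyone hitting the same thing, here is a quick way to check whether an FTP transfer left DOS line endings behind or dropped the execute bit. This is only a sketch: the directory and file name are illustrative (not from this thread), and the `sed -i` form assumes GNU sed as a stand-in for dos2unix.

```shell
#!/bin/sh
# Illustrative setup: simulate a shell script that arrived with CRLF
# endings (the directory and file name here are examples only).
CONF_DIR=/tmp/conf-check
mkdir -p "$CONF_DIR"
printf 'export JAVA_HOME=/usr/local/java\r\n' > "$CONF_DIR/hadoop-env.sh"

# List files that still contain carriage returns (DOS line endings)
grep -l "$(printf '\r')" "$CONF_DIR"/*.sh

# Strip the carriage returns in place (GNU sed; a stand-in for dos2unix)
sed -i 's/\r$//' "$CONF_DIR"/*.sh

# ASCII-mode FTP also tends to drop the execute bit; restore it
chmod +x "$CONF_DIR"/*.sh
```

Running this over the `bin/` and `conf/` directories of a transferred install should surface the same class of problem the reinstall fixed.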
Good point. I'm having the same problem and still haven't figured out how to fix it.

Mathijs Homminga

Emmanuel JOKE wrote:
It seems I'm having a lot of trouble trying to configure Hadoop on one machine. I followed the wiki tutorial and configured everything on a single machine. I started Hadoop using start-all.sh and it works; I get the following output:

starting namenode, logging to /data/sengine/search/logs/hadoop-nutch-namenode-node-n1.out
localhost: starting datanode, logging to /data/sengine/search/logs/hadoop-nutch-datanode-node-n1.out
cat: /data/sengine/search/bin/../conf/masters: No such file or directory
starting jobtracker, logging to /data/sengine/search/logs/hadoop-nutch-jobtracker-node-n1.out
localhost: starting tasktracker, logging to /data/sengine/search/logs/hadoop-nutch-tasktracker-node-n1.out

However, do you have any idea why I get an error about a file named "masters"?

In the process list I can see 3 processes up and running:

1 => /usr/local/java/bin/java -Xmx1000m -Dhadoop.log.dir=/data/sengine/search/logs -Dhadoop.log.file=hadoop-nutch-namenode-node-n1.log -Dhadoop.home.dir=/data/sengine/search -Dhadoop.id.str=nutch -Dhadoop.root.logger=INFO,console -Djava.library.path=/data/sen...
2 => /usr/local/java/bin/java -Xmx1000m -Dhadoop.log.dir=/data/sengine/search/logs -Dhadoop.log.file=hadoop-nutch-jobtracker-node-n1.log -Dhadoop.home.dir=/data/sengine/search -Dhadoop.id.str=nutch -Dhadoop.root.logger=INFO,console -Djava.library.path=/data/sengine/s...
3 => /usr/local/java/bin/java -Xmx1000m -Dhadoop.log.dir=/data/sengine/search/logs -Dhadoop.log.file=hadoop-nutch-tasktracker-node-n1.log -Dhadoop.home.dir=/data/sengine/search -Dhadoop.id.str=nutch -Dhadoop.root.logger=INFO,console -Djava.library.path=/data ...
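Regarding the `cat: .../conf/masters: No such file or directory` message: the DFS start script cats conf/masters (and conf/slaves) to decide which hosts to start daemons on, so the error simply means the file was never created. A minimal single-machine sketch (the real path on your box would be /data/sengine/search, per the output above; here a temp dir keeps the snippet safe to run anywhere):

```shell
#!/bin/sh
# Sketch for a single-machine setup. Replace the mktemp fallback with the
# real install dir (/data/sengine/search in this thread) on the actual box.
NUTCH_HOME=${NUTCH_HOME:-$(mktemp -d)}
mkdir -p "$NUTCH_HOME/conf"
echo localhost > "$NUTCH_HOME/conf/masters"   # secondary namenode host
echo localhost > "$NUTCH_HOME/conf/slaves"    # datanode/tasktracker hosts
```

With both files containing just `localhost`, start-all.sh should no longer complain, and all daemons start on the local machine.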
I tried to start crawling a website and got the following error:

$ bin/nutch crawl urls/nutch -dir crawl
/usr/local/java/bin/java -Xmx512m -Dhadoop.log.dir=/data/sengine/search/logs -Dhadoop.log.file=hadoop.log -Djava.library.path=/data/sengine/search/lib/native/Linux-i386-32 -classpath
/data/sengine/search/conf:/usr/local/java/lib/tools.jar:/data/sengine/search/build:/data/sengine/search/build/nutch-
1.0-dev.job:/data/sengine/search/build/test/classes:/data/sengine/search/nutch-*.job:/data/sengine/search/lib/commons-cli-2.0-SNAPSHOT.jar:/data/sengine/search/lib/commons-codec-1.3.jar:/data/sengine/search/lib/commons-httpclient-3.0.1.jar:/data/sengine/search/lib/commons-lang-2.1.jar:/data/sengine/search/lib/commons-logging-1.0.4.jar:/data/sengine/search/lib/commons-logging-api-1.0.4.jar:/data/sengine/search/lib/hadoop-0.12.2-core.jar:/data/sengine/search/lib/jakarta-oro-2.0.7.jar:/data/sengine/search/lib/jets3t-0.5.0.jar:/data/sengine/search/lib/jetty-5.1.4.jar:/data/sengine/search/lib/junit-3.8.1.jar:/data/sengine/search/lib/log4j-1.2.13.jar:/data/sengine/search/lib/lucene-core-2.1.0.jar:/data/sengine/search/lib/lucene-misc-2.1.0.jar:/data/sengine/search/lib/servlet-api.jar:/data/sengine/search/lib/taglibs-i18n.jar:/data/sengine/search/lib/xerces-2_6_2-apis.jar:/data/sengine/search/lib/xerces-2_6_2.jar:/data/sengine/search/lib/jetty-ext/ant.jar:/data/sengine/search/lib/jetty-ext/commons-el.jar:/data/sengine/search/lib/jetty-ext/jasper-compiler.jar:/data/sengine/search/lib/jetty-ext/jasper-runtime.jar:/data/sengine/search/lib/jetty-ext/jsp-api.jar
org.apache.nutch.crawl.Crawl urls/nutch -dir crawl
crawl started in: crawl
rootUrlDir = urls/nutch
threads = 10
depth = 5
Injector: starting
Injector: crawlDb: crawl/crawldb
Injector: urlDir: urls/nutch
Injector: Converting injected urls to crawl db entries.
task_0002_m_000000_0: log4j:ERROR setFile(null,true) call failed.
task_0002_m_000000_0: java.io.FileNotFoundException: /data/sengine/search/logs (Is a directory)
task_0002_m_000000_0:   at java.io.FileOutputStream.openAppend(Native Method)
task_0002_m_000000_0:   at java.io.FileOutputStream.<init>(FileOutputStream.java:177)
task_0002_m_000000_0:   at java.io.FileOutputStream.<init>(FileOutputStream.java:102)
task_0002_m_000000_0:   at org.apache.log4j.FileAppender.setFile(FileAppender.java:289)
task_0002_m_000000_0:   at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:163)
task_0002_m_000000_0:   at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:215)
task_0002_m_000000_0:   at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:256)
task_0002_m_000000_0:   at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:132)
task_0002_m_000000_0:   at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:96)
task_0002_m_000000_0:   at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:654)
task_0002_m_000000_0:   at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:612)
task_0002_m_000000_0:   at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:509)
task_0002_m_000000_0:   at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:415)
task_0002_m_000000_0:   at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:441)
task_0002_m_000000_0:   at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:468)
task_0002_m_000000_0:   at org.apache.log4j.LogManager.<clinit>(LogManager.java:122)
task_0002_m_000000_0:   at org.apache.log4j.Logger.getLogger(Logger.java:104)
task_0002_m_000000_0:   at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:229)
task_0002_m_000000_0:   at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:65)
task_0002_m_000000_0:   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
task_0002_m_000000_0:   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
task_0002_m_000000_0:   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
task_0002_m_000000_0:   at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
task_0002_m_000000_0:   at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:529)
task_0002_m_000000_0:   at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:235)
task_0002_m_000000_0:   at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:370)
task_0002_m_000000_0:   at org.apache.hadoop.mapred.TaskTracker.<clinit>(TaskTracker.java:82)
task_0002_m_000000_0:   at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1423)
task_0002_m_000000_0: log4j:ERROR Either File or DatePattern options are not set for appender [DRFA].
task_0002_m_000000_1: log4j:ERROR setFile(null,true) call failed.
task_0002_m_000000_1: java.io.FileNotFoundException: /data/sengine/search/logs (Is a directory)
task_0002_m_000000_1:   [identical stack trace to task attempt _0 above]
task_0002_m_000000_1: log4j:ERROR Either File or DatePattern options are not set for appender [DRFA].

Any idea why I'm getting this error? I can confirm that my log4j.properties file is well defined and located in the conf folder. Thanks in advance for your help.

Cheers,
E
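One reading of the two errors: the DRFA appender's File option is built from `${hadoop.log.dir}/${hadoop.log.file}`, and when `hadoop.log.file` is empty in the child task's JVM, File resolves to the bare logs directory, hence `FileNotFoundException: /data/sengine/search/logs (Is a directory)`. For comparison, a working stanza typically looks like the fragment below. This is a sketch modeled on Hadoop's stock log4j.properties, not your actual file; the property names match what appears in the java command lines and error output above.

```properties
# Defaults that the -Dhadoop.log.dir / -Dhadoop.log.file JVM options
# normally override; if hadoop.log.file is unset, DRFA's File becomes
# just the logs directory and setFile(null,true) fails as above.
hadoop.log.dir=.
hadoop.log.file=hadoop.log

log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}
log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
```

The second log4j error ("Either File or DatePattern options are not set for appender [DRFA]") follows directly from the first: once setFile fails, the appender is left without a usable File option.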
_______________________________________________
Nutch-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/nutch-general
