Repository: spark
Updated Branches:
  refs/heads/master 1a0da0ec5 -> 7d52777ef


Super minor: Close inputStream in SparkSubmitArguments

`Properties#load()` doesn't close the InputStream, so without an explicit 
close() the stream would only be released once it was GC'd.

Also changed `file.getName` to `file`, because getName shows only the 
filename. Using the file itself shows the full (possibly relative) path, 
which is less confusing when the file is not found.
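
The pattern in this commit can be sketched as follows. This is a minimal, 
self-contained version (the helper name `loadProps` is hypothetical, not the 
actual Spark method): the stream is closed in a `finally` block so it is 
released promptly on both success and failure, rather than waiting for GC.

```scala
import java.io.{File, FileInputStream}
import java.util.Properties
import scala.jdk.CollectionConverters._

// Hypothetical helper mirroring SparkSubmitArguments.getPropertiesFromFile:
// validate the file, load it as Java Properties, and always close the stream.
def loadProps(file: File): Seq[(String, String)] = {
  require(file.exists(), s"Properties file $file does not exist")
  require(file.isFile, s"Properties file $file is not a normal file")
  val inputStream = new FileInputStream(file)
  try {
    val properties = new Properties()
    properties.load(inputStream)
    properties.stringPropertyNames().asScala.toSeq
      .map(k => (k, properties.getProperty(k).trim))
  } finally {
    inputStream.close() // runs whether load succeeded or threw
  }
}
```

On Scala 2.13+ the same resource handling can also be written with 
`scala.util.Using(new FileInputStream(file)) { in => ... }`, which closes the 
stream automatically.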

Author: Aaron Davidson <[email protected]>

Closes #914 from aarondav/tiny and squashes the following commits:

db9d072 [Aaron Davidson] Super minor: Close inputStream in SparkSubmitArguments


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/7d52777e
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/7d52777e
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/7d52777e

Branch: refs/heads/master
Commit: 7d52777effd0ff41aed545f53d2ab8de2364a188
Parents: 1a0da0e
Author: Aaron Davidson <[email protected]>
Authored: Sat May 31 12:36:58 2014 -0700
Committer: Reynold Xin <[email protected]>
Committed: Sat May 31 12:36:58 2014 -0700

----------------------------------------------------------------------
 .../org/apache/spark/deploy/SparkSubmitArguments.scala   | 11 +++++++----
 1 file changed, 7 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/7d52777e/core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala b/core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala
index bf449af..153eee3 100644
--- a/core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala
@@ -381,16 +381,19 @@ private[spark] class SparkSubmitArguments(args: Seq[String]) {
 object SparkSubmitArguments {
   /** Load properties present in the given file. */
   def getPropertiesFromFile(file: File): Seq[(String, String)] = {
-    require(file.exists(), s"Properties file ${file.getName} does not exist")
+    require(file.exists(), s"Properties file $file does not exist")
+    require(file.isFile(), s"Properties file $file is not a normal file")
     val inputStream = new FileInputStream(file)
-    val properties = new Properties()
     try {
+      val properties = new Properties()
       properties.load(inputStream)
+      properties.stringPropertyNames().toSeq.map(k => (k, properties(k).trim))
     } catch {
       case e: IOException =>
-        val message = s"Failed when loading Spark properties file ${file.getName}"
+        val message = s"Failed when loading Spark properties file $file"
         throw new SparkException(message, e)
+    } finally {
+      inputStream.close()
     }
-    properties.stringPropertyNames().toSeq.map(k => (k, properties(k).trim))
   }
 }
