On Mon, 17 Apr 2006, Payal Rathod wrote:
> What is the difference between,
>
> >>> def f(x):
> ...     return x
> ...
> >>> f(4)
> 4
>
> >>> def f(x):
> ...     print x
> ...
> >>> f(4)
> 4
>
> Both give same results.

Clarification: both "show" the same results from the interpreter.  From
what you've seen so far, there's no visible difference.

One way to see the difference is to write functions that use other
functions.  An elementary-school example is finding the hypotenuse of a
right triangle.  Given two legs of lengths 'a' and 'b', we know that the
hypotenuse has this relationship:

    hypotenuse = sqrt(square(a) + square(b))

Let's write this out as a set of functions:

###################################################################
def hypo(a, b):
    """Given legs of lengths a and b, returns the hypotenuse."""
    return sqrt(square(a) + square(b))

def square(x):
    return x * x

def sqrt(x):
    return x ** 0.5
###################################################################

Experiment with this.  See what happens if you replace a return
statement with a print statement in square() or sqrt().

The key idea here is that print only shows us its output: it doesn't do
anything else with the value.  Printing is what's called a
"side effect".  And sometimes we don't care about seeing something as
much as we care about getting the related value back.  Above, hypo
doesn't care what square's result looks like when printed to you; it
needs the real value back so it can do its computation.

Does this make sense?

_______________________________________________
Tutor maillist  -  Tutor@python.org
http://mail.python.org/mailman/listinfo/tutor
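For anyone following along, here is one way to run the suggested experiment as a script.  The names square_returns and square_prints are my own, just to keep both versions side by side; print(...) is used so the same code runs under Python 2 and 3:

```python
def sqrt(x):
    return x ** 0.5

def square_returns(x):
    return x * x          # hands the value back to the caller

def square_prints(x):
    print(x * x)          # shows the value, but returns None

def hypo(a, b, square):
    return sqrt(square(a) + square(b))

# The returning version works as expected:
print(hypo(3, 4, square_returns))   # 5.0

# The printing version displays 9 and 16 on the screen, but each call
# returns None, so the addition inside hypo raises a TypeError.
try:
    hypo(3, 4, square_prints)
except TypeError as err:
    print("hypo failed:", err)
```

That TypeError is exactly the point: hypo never "sees" what print showed on the screen; it only sees the return value, which for a print-only function is None.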