[Tutor] evolutionary drift

2018-11-21 Thread Avi Gross
Steve,

You may be right. It often happens that someone has a (small) idea, perhaps 
very focused, and others chime in and try to apply it more widely, perhaps by 
making it more general, and it grows. Over the years, the earlier adopters may 
be seen almost as co-creators or even become the lead in the story. Tell me who 
seems to be associated with Apple and who did much of the technical work? I am 
not saying that Jobs did not have vision and marketing talent and an eye for 
style and so on. I am not even sure who came up with ideas back then. Another 
such person, Bill Gates, did do some programming of BASIC early on and so forth.

So, whatever the history of early Python (and predecessors) was, it may have 
begun as some enhancements and improvements of what came before and perhaps new 
paradigms. Others who saw it may have seen something that looked easier to 
teach. An obvious example, if it was there way back then, was removing lots of 
brackets used in other languages (such as {([])} ) and using indentation. Feels 
more natural to not-so-mathematical types.

But over the years the brackets have returned and worse. Like many languages, 
Python used what symbols it could find on the keyboard and then overloaded them 
horribly. Parentheses are used for grouping but also for tuples and to force 
invocation of a function and to hold the argument tuple (albeit no trailing 
comma is needed for a single argument as in other tuples) and presumably in 
other contexts such as within regular expressions. Periods can mean quite a few 
things as can asterisk and so on. There are few matched sets of characters on 
the keyboard. () is now used for tuples, among other things. [] is used for 
lists except when it is used for dictionary access as in cards[key] versus 
text[5] and {} is used for dictionaries except when you use [] but also for 
sets ...

Heck, they ran out of symbols. {} is an empty dictionary, so you have to say 
set() for an empty set (though an empty tuple, at least, can still be written 
as ()).
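To make that concrete, here is a small sketch of the symbol reuse described above; everything in it is standard Python:

```python
empty_dict = {}             # {} is an empty dict, never an empty set
empty_set = set()           # there is no literal for an empty set
empty_tuple = ()            # () does still give an empty tuple
one_tuple = (42,)           # the comma, not the parentheses, makes a tuple
grouped = (1 + 2) * 3       # here parentheses are pure grouping
d = {"key": "ace"}          # {} with pairs inside: a dict literal
cards = ["two", "three"]    # [] builds a list...
value = d["key"]            # ...and also indexes a dict
letter = cards[1]           # ...and a list, with the same syntax
```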

<> is not currently used as a matched set as it has many other uses like in 
comparisons. Some languages even use <> as the same as != or ~= to mean not 
equals. "" and '' and even `` are sort of used as a matched set in some 
languages (`` is in R, not Python), but the opening and closing characters are 
typographically identical, unlike word processors that substitute two distinct 
left and right quote glyphs.

EBCDIC had a few other symbols but many languages now use only ASCII symbols 
and have to combine them to make complex and even non-intuitive combinations as 
symbols. Try teaching a child in say C++ that:

X++==++Y is valid and even meaningful if read as:

X++ == ++Y

because it asks you to get the current value of X to compare to the NEW value 
of Y incremented and return a Boolean result and immediately thereafter, 
increment X. Heck, you can write (XY) and so on. I have seen languages with 
an = and == and even === alongside := and ::= and -> and --> and <- and <-- and 
more all to mean variations on a theme.

If we had started with many more symbols, in some ways it would be harder but 
in other ways easier. Mathematicians borrow symbols from lower case, upper case 
and script letters from languages like Greek and Latin but also from Hebrew as 
in, well, symbols not easy to include in a file containing normal text, such 
as aleph-null (and aleph-one, and infinitely more levels of infinity).

A simple teaching language that uses English words children know might either 
be verbose in places or long as in making you spell out DELETE instead of del. 
But quite a few languages are simple if you leave out most of the 
functionality. Yes, you need to explain why some things must end in a semicolon 
or be indented a certain way or require parentheses or ...
What you don't want to teach is a complex language like English. I was teaching 
my Dad as he prepared for Citizenship and he balked when told there were at 
least seven distinct ways you pronounce OUGH in common English words.

So, perhaps python can be used to teach basic programming for several reasons 
including no need to declare variables and their "types" in advance and having 
relatively intelligent basic types that easily get converted in the background 
as in various kinds of numbers. But when you use more and more features, it 
expands into areas where, frankly, there is no one right answer and choices are 
often made by fiat or by a majority. Mathematically, if you view a topic like 
multiple inheritance in classes and the kludges made to get around logical 
issues, you see what a mess it really is and how very hard it is to teach. 
Look at the double underscore notation (prefix-only) for variables, which 
renames them to be unique within a certain scope to avoid collisions in the 
mythical searched namespace.
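For the curious, that renaming can be watched in action. This is a minimal sketch (the class names are invented for illustration): a __name attribute in a class body is rewritten to _ClassName__name, so two classes in one hierarchy can each keep a "private" attribute of the same name.

```python
class Base:
    def __init__(self):
        self.__secret = "base"       # actually stored as _Base__secret

class Child(Base):
    def __init__(self):
        super().__init__()
        self.__secret = "child"      # stored as _Child__secret, so no clash

c = Child()
# Both mangled names coexist on the same instance:
names = sorted(vars(c))              # ['_Base__secret', '_Child__secret']
```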

I am not against such drift but at times wonder if a one-size-has-to-fit-all 
mentality is wise.

I had  a thought on an earlier topic by Asad. He wanted to write in Python what 
is effectively a member of the UNIX grep family. Specifically, fgrep (or

[Tutor] the fivefold path

2018-11-21 Thread Avi Gross
Mark,

Thanks for the expansion.

Yes, there are ever more ways to format text.

There can be an indefinite expansion beyond this in many ways.

I was thinking of ways to take bits and pieces from objects and string them 
together, perhaps inventing your own methods.

Anyone can come up with another meta-language, perhaps embed them in a module 
with classes, and use overloading methods to make it easier to use.
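As a sketch of that idea (the class name and choice of operators here are invented, not any real library), a tiny class can overload + and * to stitch formatted pieces together:

```python
class Piece:
    """A toy 'meta-language' fragment builder."""
    def __init__(self, text):
        self.text = str(text)
    def __add__(self, other):        # + glues on a piece or a plain string
        return Piece(self.text + str(other))
    def __mul__(self, n):            # * repeats a piece n times
        return Piece(self.text * n)
    def __str__(self):
        return self.text

banner = Piece("-") * 5 + " title " + Piece("-") * 5
```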

But worst of all is to simply find an external application and open up a 
connection to it such as with Popen() or a call to a URL that accepts some 
formatting string and returns something formatted.

Consider something like a UNIX Shell language like sh/csh/ksh or more modern 
incarnations like BASH that use a dollar sign in imaginative ways such as $PATH 
or `command args` to make substitutions. You can imagine invoking something 
like that while passing along parts needed and taking back the result. 

Now, arguably, those are not part of the language. But if you want to build a 
weird enough scenario, you could have a Rube Goldberg device that bounced 
around multiple servers on the internet running many kinds of OS and percolated 
some complex result back into a now formatted string. 

Diversity is nice but also often a distraction. Worse is when trying to read 
what others have written. On that topic, I note many people like to include 
short segments in their ENGLISH writing from other languages. Latin and French 
are common inserts as they are easy to read albeit not always to comprehend. 
Lawyers seem unable to communicate without lots of Latin nonsense and some 
musicians must toss in periodic tidbits of Italian and so on. Languages that 
use characters totally outside the English character set, or near relatives 
that have added characters with diacritical marks, are less often seen. How 
often do you see Greek or Hebrew? OK, I admit I see them often, but
my reading tastes are eclectic. I bet many would at least like a translation 
and/or transliteration next to such quotes or just the translation.

Back to programming. Nothing wrong with multiple ways if they offer more 
functionality that is useful.
But why constantly reinvent the wheel as an ellipse? Look at the issue of 
strings. There actually is a reason for the many variations in Python. Single 
and double quotes are pretty much identical with the exception that they make 
it easier to include the other symbol without backslashes. Triple quotes (three 
doubles) offer other improvements and so do several raw or Unicode variants. 
But strictly speaking, they are often more syntactic sugar for one or a few 
underlying storage mechanisms accompanied by access methods. Objects, of a 
sort. The f version mentioned as a formatting method is in a sense another such 
object with additional methods. You can well imagine more such things you can 
design into a language, albeit some may not be able to use existing symbols 
you have run out of, unless you want to use "'"'" stuff "'"'" (where you may 
not be able to read the nested " and ' marks) to have some exotic meaning, 
like formatting the string repeatedly in a loop till it does not change. For 
example, if $PATH evaluated to

something${DATE}something`echo "hello \$world!"`something

and $DATE expanded to $MONTH/$DAY/$YEAR

then repeated formatting till no more expansions happen could be another type 
of string. No, not asking for that especially since I can see easy ways to make 
infinite loops 😊
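Purely as an illustration of that "expand until stable" idea (the variable names and the ${NAME} syntax here are made up for the sketch, not any real shell), a fixed-point loop might look like:

```python
import re

# Pretend environment; a real shell does far more than this.
env = {"PATH": "something${DATE}something",
       "DATE": "${MONTH}/${DAY}/${YEAR}",
       "MONTH": "11", "DAY": "21", "YEAR": "2018"}

def expand(text, env):
    prev = None
    while text != prev:                 # loop until nothing changes
        prev = text
        text = re.sub(r"\$\{(\w+)\}",
                      lambda m: env.get(m.group(1), m.group(0)), text)
    return text

result = expand("${PATH}", env)
```

Of course, a self-referencing entry such as {"X": "${X}!"} would never reach a fixed point, which is exactly the infinite-loop hazard mentioned above.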

Mark, perhaps jokingly, asks what you can do for your language. Nice 
sentiment, but I find it less useful to be wedded or loyal to things, as it is 
always more interesting to see whether something newly developed is at least 
as good. Many ideas, when carried through to the point of absurdity, stop 
being as useful. I am 
thinking of some LISP variants with very nice mathematical/philosophical ideas 
including how an amazing number of things can be done recursively. So if you 
asked them to write a function in lisp that compares A and B, they don't bother 
putting the numbers in two registers like A and B and issuing a compare 
directive at machine language level. Instead, they write the LISP equivalent of 
this pseudocode:

Greater(A,B)
If A == 0, return FALSE
If B == 0, return TRUE
Otherwise, return Greater(A-1, B-1)

Try comparing A,B = 999,999,999,999, 1,000,000,000,000
How much memory does your computer have? The numbers are ONE apart with A being 
smaller. But the algorithm slavishly puts recursive function calls on the stack 
about a trillion times then unwinds. You say no problem, tail recursion allows 
replacing the function on the stack. Maybe. So all you need is a trillion times 
the CPU time or so for something done in a few cycles in most hardware.
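Transcribing that pseudocode into Python (assuming non-negative integers) shows the problem directly; CPython gives up with a RecursionError long before a trillion stack frames:

```python
def greater(a, b):
    """Is a > b, for non-negative integers, the slow recursive way."""
    if a == 0:
        return False     # a ran out first (or both did): a is not greater
    if b == 0:
        return True      # b ran out first: a is greater
    return greater(a - 1, b - 1)

# Fine for tiny inputs; greater(999_999_999_999, 1_000_000_000_000)
# would need about 10**12 nested calls and dies with RecursionError.
```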
(Parenthetically, I am not picking on LISP), as most modern) (computer 
languages)) allow (((equivalent)) code. )))

Half the fun when I taught LISP was having people get lost counting 
parentheses at a time that ed

Re: [Tutor] Pythonic way

2018-11-21 Thread Alan Gauld via Tutor
On 20/11/2018 22:35, Steven D'Aprano wrote:
> On Tue, Nov 20, 2018 at 08:22:01PM +, Alan Gauld via Tutor wrote:
> 
>> I think that's a very deliberate feature of Python going back
>> to its original purpose of being a teaching language that
>> can be used beyond the classroom.
> 
> I don't think that is correct -- everything I've read is that Guido 
> designed Python as a scripting language for use in the "Amoeba" 
> operating system.

I think both are true. Guido was working on Amoeba at the time
he built Python so it was an obvious choice of platform, although
he first built it on a Mac. But he didn't set out specifically to
build a scripting language for Amoeba but rather to build "a descendant
of ABC" which was also usable in the real world, specifically the C/Unix
world.

To do that he believed it had to address the major barriers to ABC,
which were 1) being capable of being extended and 2) improved I/O.

My source for that conclusion is Guido's foreword to
"Programming Python" 1st edition.

https://www.python.org/doc/essays/foreword/

Also his comments when he set up the "Computer Programming for
Everybody" initiative, which he led for several years. (Which
I can't locate...)

So I believe that ease of teaching was definitely part
of the original game plan and was definitely a factor in
some of the later developments around v2.0 or so.

-- 
Alan G
Author of the Learn to Program web site
http://www.alan-g.me.uk/
http://www.amazon.com/author/alan_gauld
Follow my photo-blog on Flickr at:
http://www.flickr.com/photos/alangauldphotos


___
Tutor maillist  -  Tutor@python.org
To unsubscribe or change subscription options:
https://mail.python.org/mailman/listinfo/tutor


Re: [Tutor] evolutionary drift

2018-11-21 Thread Alan Gauld via Tutor
On 21/11/2018 03:05, Avi Gross wrote:

> <> is not currently used as a matched set as it has many other uses like in
> comparisons. Some languages even use <> as the same as != or ~= to mean not
> equals.

Indeed, Python used to do the same but it was removed in Python 3.

> A simple teaching language that uses English words children know 

Probably the best example of that I have seen is Logo.
Although it does use [] and (), otherwise it's quite child
friendly. But ultimately that makes it less likeable in the
"grown up world"...

> Python can still be a great teaching language if kept to a subset 

That is true and it is still one of the very few
languages that I'd recommend for teaching. But sadly
its underbelly shows through very early for beginners.
For example in v1 Python range() returned a list.
That was easy to understand. Now range() returns
a "range object" - what the heck is that? and why
do we need it?
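For what it's worth, a few lines show both the beginner's surprise and the payoff of the lazy range object:

```python
r = range(5)
as_list = list(r)        # materialise it only when you really need a list
big = range(10**12)      # costs almost no memory, unlike a real list
# A range supports len(), indexing, and membership without expansion:
last = big[-1]
```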

Similarly with iterators.
Things that could easily be iterated over without
thought now require "iterators" and/or evaluate to
iterators. And the error messages tell you so
- but iterators are a concept wholly alien to
most beginners. And are quite hard to explain.
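A sketch of what the iterator protocol boils down to, which may help when explaining those error messages:

```python
it = iter([10, 20])          # ask the list for an iterator
first = next(it)             # each next() hands over one item...
second = next(it)
try:
    next(it)                 # ...until StopIteration says "no more"
    exhausted = False
except StopIteration:
    exhausted = True
```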

Those are just the two things that beginners most
frequently mail me about from my tutorial. There
are lots of other areas where Python implementation
now shines through in ways that trip beginners up.

> What is easy to teach to children? 

I'm not sure we should even focus on children.
It's more about teaching anyone (regardless of age) who
has no prior experience, and especially little or no
formal math background. Someone with good high school
math can be taught programming fairly easily. But
with no math foundation even basic concepts like
expressions and assignment become very difficult.

It may even be impossible. I used to think that
anyone could learn to program but over the last 30
years of trying I've come to the conclusion that its
not true. You need a brain that's wired a certain
way otherwise it just doesn't make any kind of sense.
Its like math. Some folks just can't understand
mathematical concepts, they are illogical and
non-sensical to them. They might learn  some basic
principles by rote but they never really see how
or why it is so. Same with programming. Some people
just don't get it. Thankfully those are a very
small minority!

-- 
Alan G
Author of the Learn to Program web site
http://www.alan-g.me.uk/
http://www.amazon.com/author/alan_gauld
Follow my photo-blog on Flickr at:
http://www.flickr.com/photos/alangauldphotos




[Tutor] origins bootstrapped.

2018-11-21 Thread Avi Gross
Alan has been involved with Python for a long time so he has more to offer
historically.

I don't see some things as either/or. You can start with one major
motivation and it morphs from a one-celled creature like an Amoeba to a
complex vertebrate like a Python which needs modules added so it can walk
around better.

OK, horrible analogy but interesting naming. So some say Guido started with
learning his ABC and then became educated enough to understand Monty Python
and reach for the holy grail.

OK, even worse. Time to get serious.

I have seen this on many projects, not just programming languages and
environments. Something fairly simple is imagined then perhaps prototyped.
Someone may notice that what was created may be used in another way if
perhaps expanded a bit. Someone then realizes they now have functionality
that can be used to create something else, in a form of bootstrapping. After
a while they have a collection of tools that can be combined to make
something more complex. The biological analogy above can be an example. No,
I am not saying that a distant ancestor of a snake like a python was an
amoeba. But they do share common ancestors they have both diverged from with
the amoeba remaining a single celled organism and the python descending from
something that became multi-cellular then differentiated into having
different kinds of cells in tissues and organs and became a somewhat
integrated whole that is possibly more than the sum of its parts. The ABC
analogy is also obvious. Once an alphabet is chosen and provisional meanings
given to each letter, it can grow and even adjust to making words and
sentences and even seemingly endless streams of consciousness like some of
my messages.

Python was built on top of other achievements that some people were learning
from. There were many steps along the way from building machines programmed
one byte at a time in binary (I hated a class that made me do that, as one 
error meant starting over) to various levels where a compiler and then an 
interpreter would parse things. We have been discussing using regular
expressions. Much of a language like python is having bits and pieces of
code written in ASCII or Unicode be parsed using hopefully unambiguous rules
into tokens that can be made into decision trees or whatever data structure.
That depends on being able to look for and find some sort of pattern in 
strings. I am not sure what python and others use, but it may be tools
similar to string search or regular expressions that allows them to
bootstrap.
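As it happens, CPython exposes its own tokenizer in the standard tokenize module, so that first stage can be watched directly (the sample source line below is invented for the demo):

```python
import io
import tokenize

src = "total = price * 1.2  # add tax\n"
tokens = [(tokenize.tok_name[tok.type], tok.string)
          for tok in tokenize.generate_tokens(io.StringIO(src).readline)]
# tokens holds pairs like ('NAME', 'total'), ('OP', '='), ('NUMBER', '1.2'),
# which later stages assemble into a syntax tree.
```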

Back when my group was programming in C, I was sent to Denver for a class in
Lex/Yacc to learn how to use C libraries that now look primitive. One was a
lexical analyzer and the other sort of a parser somewhat rudely named as Yet
Another Compiler-Compiler.

But today, what do most people use? Our tools improve, often by being a
wrapper to older tools and so on for multiple levels. New functionality is
added too.

Can I ask a question that I really want an opinion on? As a preface, I see
some think python as a formal language is being pushed by industry in
directions that may not meld as well for its use in other contexts like for
teaching students. How much of that is due to it being a relatively open and
free product? There are plenty of other applications that you pay for and
thus have to be responsive to the buyers to remain in business. Python has
many implementations, including some freer than others. Yet it has gone
through a bit of a bifurcation: many would like to see 2.X retained while
others wish everyone would migrate. Is there room for a smaller core
language that remains good for teaching purposes and that is small enough to
fit in a Raspberry Pi, while other versions are of industrial strength? Do we
already sort of have some of that?

I was thinking of how many languages and environments have been looking at
working using parallelism. Most people simply have no need for the
complication. When you add the ability to do multiprocessing within an
application using something like threads, you spend lots of time making sure
you literally lock down shared resources so they are used serially. You need
to make sure deadlocks do not lock up all your threads at once and race
conditions do not corrupt shared state. Lots
of added overhead is only worth it if you gain in the process. Add multiple
cores in your CPU, and you may need to handle more complications as they are
actually running in parallel, perhaps still sharing a common memory. Allow
it to use multiple processors around the world, and you need even more
elaborate control structures to synchronize all that.
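A minimal sketch of that serialising of shared resources, using the standard threading module (the counter example is invented):

```python
import threading

counter = 0
lock = threading.Lock()

def bump(n):
    global counter
    for _ in range(n):
        with lock:              # only one thread touches counter at a time
            counter += 1

threads = [threading.Thread(target=bump, args=(1000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Without the lock, the read-increment-write in counter += 1 could interleave between threads and lose updates.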

It definitely is worth doing but does everyone need it especially for
teaching an intro class?

I was thinking about the little project I mentioned the other day. Should
some of it be done in parallel using methods available? One part of the
problem was to read in N files into N pandas DataFrame objects. I knew that
I/O tends to be fairly slow and most programs take a nap while waiting. I

Re: [Tutor] origins bootstrapped.

2018-11-21 Thread Alan Gauld via Tutor
On 21/11/2018 16:31, Avi Gross wrote:
> Alan has been involved with Python for a long time so he has more to offer
> historically.

I'm not so sure about that, several folks on this list
have been around longer than me. And I don't follow the
main comp.lang.python list that closely.

I'm simply giving my perspective for whatever that may
be worth.

> OK, horrible analogy but interesting naming. So some say Guido started with
> learning his ABC and then became educated enough to understand Monty Python
> and reach for the holy grail.

Made me laugh out loud!

> Back when my group was programming in C, I was sent to Denver for a class in
> Lex/Yacc to learn how to use C libraries that now look primitive. One was a
> lexical analyzer and the other sort of a parser somewhat rudely named as Yet
> Another Compiler-Compiler.

Still powerful tools and in active use in several projects.
They were great for quickly bootstrapping a small bespoke language.

> some think python as a formal language is being pushed by industry in
> directions that may not meld as well for its use in other contexts like for
> teaching students. How much of that is due to it being a relatively open and
> free product? 

I think that's true but not necessarily bad.
It just takes the language in a different direction.
And as you said, that happens in many projects. They start
as one thing and end up someplace entirely different.
I remember one project that started out as a network
management system for a fairly obscure protocol and
wound up as both a customer service system for our Global
Corporate clients and as part of the monitoring system
for the English Channel Tunnel! Very different applications
of the same root code base.

> ...Is there room for a smaller core
> language that remains good for teaching purposes and that is small enough to
> fit in a Raspberry Pi, while other versions are of industrial strength? Do we
> already sort of have some of that?

We sort of have that. Python v3 certainly works well on the pi.
We could certainly have a smaller language for teaching but then
we had that in ABC and nobody used it. Students don't like
learning stuff that they can't use in the real world.

And if you want purity for beginners we already have Logo,
Scheme, Squeak/Scratch and a few others. But none of those
really work well in the wider world. Which is why I still
recommend python, warts and all.

> I was thinking of how many languages and environments have been looking at
> working using parallelism. Most people simply have no need 

Absolutely and for beginners a single thread is more than
enough to cope with.


> I was thinking about the little project I mentioned the other day. Should
> some of it be done in parallel using methods available? 

It sounded a lot like a job for the map-reduce paradigm.
Which is parallel where it can be and sequential where
it should be...
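A toy sketch of that map-reduce shape (sequential here; a real framework would farm the "map" stage out in parallel, then combine the summaries):

```python
from functools import reduce

chunks = [[1, 2, 3], [4, 5], [6]]            # pretend each chunk lives elsewhere
mapped = [sum(chunk) for chunk in chunks]    # "map": summarise each chunk
total = reduce(lambda a, b: a + b, mapped)   # "reduce": combine the summaries
```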

> An obvious speedup might be had by starting up N threads with each opening
> one file and doing what I said above into one shared process with N
> variables now available. But will it be faster?

Trying to calculate (or guess) this kind of thing in
advance is near impossible. The best solution is to
prototype and measure, making sure to do so on typical
data volumes.

That having been said if you know (or discover) that
you definitely need parallelism then its definitely worth
revisiting the design to ensure the data structures
and overall workflow are optimised for a parallel approach.
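For what it's worth, a prototype of the N-files-in-N-threads idea is only a few lines with concurrent.futures; this sketch creates its own throwaway files so it can run as-is (file I/O releases the GIL, so the reads can genuinely overlap):

```python
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor

# Create a few throwaway files to stand in for the real inputs.
tmpdir = tempfile.mkdtemp()
paths = []
for i in range(3):
    path = os.path.join(tmpdir, "data%d.txt" % i)
    with open(path, "w") as f:
        f.write("contents of file %d\n" % i)
    paths.append(path)

def load(path):
    with open(path) as f:
        return f.read()

with ThreadPoolExecutor(max_workers=len(paths)) as pool:
    results = list(pool.map(load, paths))   # result order matches paths
```

Whether this beats a plain loop still has to be measured, exactly as advised above.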

> ...I will pause and simply say that I opted not to bother
> as the darn program finished in 5 or 10 seconds.

Exactly so.
As the famous quote says, "Premature optimisation is..."

> For heavy industrial uses, like some of the applications in the cloud
> dealing with huge problems, it may well be worth it.

In many cases it's the only practical solution.
Almost all of my industrial programming has involved multi
processing and threading. Almost none (I think one) of my
personal programming projects has needed it.

-- 
Alan G
Author of the Learn to Program web site
http://www.alan-g.me.uk/
http://www.amazon.com/author/alan_gauld
Follow my photo-blog on Flickr at:
http://www.flickr.com/photos/alangauldphotos




Re: [Tutor] origins bootstrapped.

2018-11-21 Thread Mats Wichmann
On 11/21/18 5:54 PM, Alan Gauld via Tutor wrote:
> On 21/11/2018 16:31, Avi Gross wrote:

>> An obvious speedup might be had by starting up N threads with each opening
>> one file and doing what I said above into one shared process with N
>> variables now available. But will it be faster?
> 
> Trying to calculate (or guess) this kind of thing in
> advance is near impossible. The best solution is to
> prototype and measure, making sure to do so on typical
> data volumes.
> 
> That having been said if you know (or discover) that
> you definitely need parallelism then its definitely worth
> revisiting the design to ensure the data structures
> and overall workflow are optimised for a parallel approach.
> 
>> ...I will pause and simply say that I opted not to bother
>> as the darn program finished in 5 or 10 seconds.
> 
> Exactly so.
> As the famous quote says, "Premature optimisation is..."
> 
>> For heavy industrial uses, like some of the applications in the cloud
>> dealing with huge problems, it may well be worth it.
> 
> In many cases it's the only practical solution.
> Almost all of my industrial programming has involved multi
> processing and threading. Almost none (I think one) of my
> personal programming projects has needed it.
> 

People play all kinds of parallelism tricks with Python because Python
has a certain Impediment Which Shall Remain Nameless (except I'm certain
someone will mention it).

Anyway, it's one thing to try to decompose a massive problem; that's
interesting on a certain level (see some of the talks companies like
Google have done on scaling their services) but is really hard to
replicate at home.  But another use for non-linear programming, if you
want to call it that, is a task that just needs a different programming
model. That's where a lot of the async stuff with coroutines and event
loops that has been beefed up recently is quite interesting.  Even very
simple programs can run into cases where it may make sense, usually if
there are things you have to wait for and want to be able to do other
work while doing so.
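A minimal sketch of that model with asyncio, where asyncio.sleep stands in for real waiting such as network I/O:

```python
import asyncio

async def fetch(name, delay):
    await asyncio.sleep(delay)      # stands in for waiting on real I/O
    return name + " done"

async def main():
    # Both delays overlap, so this takes about one delay, not two.
    return await asyncio.gather(fetch("a", 0.02), fetch("b", 0.02))

results = asyncio.run(main())
```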

I actually got around to watching David Beazley's talk from a couple
years ago, and it was pretty impressive - something I'd flagged as
"watch later" I don't know how long ago, more than a year at least. Wish
I'd watched it earlier now!

I hate posting those obscure YouTube links where people don't know what
they are clicking on, so search for this title if interested:

David Beazley - Python Concurrency From the Ground Up
(it's from the 2015 PyCon)



Re: [Tutor] origins bootstrapped.

2018-11-21 Thread David Rock

> On Nov 21, 2018, at 10:31, Avi Gross  wrote:
> 
> Is there room for a smaller core
> language that remains good for teaching purposes and that is small enough to
> fit in a Raspberry Pi, while other versions are of industrial strength? Do we
> already sort of have some of that?

What comes stock on a Pi is more than sufficient (there's plenty of room for 
"standard" Python 2 and Python 3).

Micropython (https://micropython.org/) fits that category nicely for micro 
controllers, and Adafruit's version of it, CircuitPython, has a strong 
following: https://www.adafruit.com/circuitpython

These have been great to allow people to learn not only Python, but how to 
physically interact with the world outside the computer.


-- 
David Rock
da...@graniteweb.com






Re: [Tutor] origins bootstrapped.

2018-11-21 Thread Steven D'Aprano
On Wed, Nov 21, 2018 at 11:31:59AM -0500, Avi Gross wrote:

> Alan has been involved with Python for a long time so he has more to offer
> historically.

I've been involved with Python for a long time too. What exactly are you 
trying to say?


> Can I ask a question that I really want an opinion on? As a preface, I see
> some think python as a formal language is being pushed by industry in
> directions that may not meld as well for its use in other contexts like for
> teaching students.

I think there is always going to be tension between the needs of 
different users. Beginners need simplicity; expert, experienced 
programmers need power; both have very different ideas of what 
"readable code" means.

I don't think Python is being pushed in any direction by "industry". It 
is evolving according to the needs of the programmers who use it, some 
of whom may work for some industry or another.

> How much of that is due to it being a relatively open and
> free product? There are plenty of other applications that you pay for and
> thus have to be responsive to the buyers to remain in business. Python has
> many implementations including some freer than others.

I don't know of any non-free (free as in beer, or free as in speech) 
implementations of Python. Can you elaborate?


> Yet it has gone
> through a bit of a bifurcation and many would like to see 2.X retained and
> others wish everyone should migrate. Is there room for a smaller core
> language that remains good for teaching purposes and that is small enough to
> fit in a Raspberry Pi, while other versions are of industrial strength? Do we
> already sort of have some of that?

Standard CPython is light enough to run on fairly low-powered devices, 
including Raspberry Pi. For an even smaller footprint, you can use 
Micropython, which will run on embedded devices, although μPy does make 
some compromises that mean it is not a fully compliant Python 
implementation.

There are, or were, other small implementations:

- Pippy, Python for Palm (probably unmaintained by now...)
- Python for S60, for the Nokia S60 platform (likewise...)
- Pythonce, for Windows CE (who still uses WinCE?)
- PyMite for embedded devices
- Python-iPod
- Py4A and QPython (Android)
- TinyPy
- PyPad for the iPad
- Pycorn, Python running on bare hardware with no OS


> I was thinking of how many languages and environments have been looking at
> working using parallelism.
[...]
> It definitely is worth doing but does everyone need it especially for
> teaching an intro class?

Who teaches threading and parallelization in introductory classes?




-- 
Steve