On Apr 21, 5:58 am, Dustan <[EMAIL PROTECTED]> wrote:
> From my searches here, there is no equivalent to Java's
> StringTokenizer in Python, which seems like a real shame to me.
>
> However, str.split() works just as well, except for the fact that it
> creates the whole list in one go. I suggest an itersplit be introduced
> for lazy evaluation, if you don't want to use up resources, and it
> could be used just like Java's StringTokenizer.
>
> Comments?
If your delimiter is a non-empty string, you
can use an iterator like:
def it(S, sub):
    """Lazily yield the pieces of S between occurrences of sub."""
    start = 0
    sublen = len(sub)
    while True:
        idx = S.find(sub, start)
        if idx == -1:
            # No more delimiters: yield the tail and stop.
            yield S[start:]
            return
        else:
            yield S[start:idx]
            start = idx + sublen

target_string = 'abcabcabc'
for subs in it(target_string, 'b'):
    print subs
# prints: a, ca, ca, c (one per line)
For something more complex, you may be able to use re.finditer.
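For example (just a sketch of my own, with a made-up comma/whitespace
delimited string), matching the tokens themselves rather than the
delimiters gives you a lazy, StringTokenizer-like loop that skips runs
of delimiters:

import re

# Match runs of characters that are not delimiters (here: comma or
# whitespace), so empty fields are skipped, much like StringTokenizer.
for m in re.finditer(r'[^,\s]+', 'one, two,,three'):
    print m.group()
# prints: one, two, three (one per line)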
--
Hope this helps,
Steven