a guest May 16th, 2018 101 Never
def tokenize(phrase):
    """Split each word of a phrase into its prefixes (tokens).

    The result can be useful for autocomplete purposes:
    "hello world" => h, he, hel, hell, hello, w, wo, wor, worl, world
    """
    tokens = []
    for word in phrase.split():
        for i in range(len(word)):
            tokens.append(word[:i + 1])
    return tokens
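As a quick sanity check, here is a minimal sketch of the same prefix-splitting idea written as a list comprehension, with the expected output for the sample phrase:

```python
def tokenize(phrase):
    # Build every prefix of every whitespace-separated word.
    return [word[:i + 1] for word in phrase.split() for i in range(len(word))]

print(tokenize("hello world"))
# → ['h', 'he', 'hel', 'hell', 'hello', 'w', 'wo', 'wor', 'worl', 'world']
```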