def tokenize(phrase):
    """
    Split each word of a phrase into its prefixes (tokens).
    The result is useful for autocomplete purposes.
    Example:
        hello world => h,he,hel,hell,hello,w,wo,wor,worl,world
    """
    tokens = []
    for word in phrase.split():
        for i in range(len(word)):
            tokens.append(word[:i + 1])
    return tokens
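For context, a minimal sketch of how these prefix tokens could feed an autocomplete lookup. The `index` structure and the sample phrases are illustrative assumptions, not part of the original snippet:

```python
from collections import defaultdict

def tokenize(phrase):
    """Split each word of a phrase into its prefixes (tokens)."""
    tokens = []
    for word in phrase.split():
        for i in range(len(word)):
            tokens.append(word[:i + 1])
    return tokens

# Hypothetical autocomplete index: maps each prefix to the phrases containing it.
index = defaultdict(set)
for phrase in ["hello world", "help desk"]:
    for token in tokenize(phrase):
        index[token].add(phrase)

# Looking up a prefix returns every phrase with a word starting with it.
print(sorted(index["hel"]))  # → ['hello world', 'help desk']
```

Indexing every prefix trades storage for lookup speed: each query is a single dictionary access instead of a scan over all phrases.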