a guest, May 22nd, 2015
import tokenize

file = open("tokenize-example-1.py")

# tokeneater callback: called once per token with the token type, the
# token string, its start and end positions, and the full source line
def handle_token(type, token, (srow, scol), (erow, ecol), line):
    print "%d,%d-%d,%d:\t%s\t%s" % \
        (srow, scol, erow, ecol, tokenize.tok_name[type], repr(token))

tokenize.tokenize(
    file.readline,
    handle_token
    )

## 1,0-1,6: NAME 'import'
## 1,7-1,15: NAME 'tokenize'
## 1,15-1,16: NEWLINE '\012'
## 2,0-2,1: NL '\012'
## 3,0-3,4: NAME 'file'
## 3,5-3,6: OP '='
## 3,7-3,11: NAME 'open'
## 3,11-3,12: OP '('
## 3,12-3,35: STRING '"tokenize-example-1.py"'
## 3,35-3,36: OP ')'
## 3,36-3,37: NEWLINE '\012'
## ...
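
The example above uses the Python 2 tokeneater-callback form of tokenize.tokenize (and tuple parameters in the function signature, which Python 3 removed). A rough equivalent in Python 3 uses tokenize.generate_tokens, which yields TokenInfo tuples instead of invoking a callback; this is a sketch, tokenizing an in-memory string rather than the file above:

```python
import io
import tokenize

# Same formatting as the Python 2 handle_token callback, but called
# explicitly from a loop over the generated tokens.
def handle_token(tok_type, token, start, end, line):
    srow, scol = start
    erow, ecol = end
    print("%d,%d-%d,%d:\t%s\t%s" % (
        srow, scol, erow, ecol, tokenize.tok_name[tok_type], repr(token)))

source = "import tokenize\n"
for tok in tokenize.generate_tokens(io.StringIO(source).readline):
    handle_token(tok.type, tok.string, tok.start, tok.end, tok.line)
```

Note that Python 3 reports newline tokens as '\n' rather than the octal '\012' shown in the output above, and appends an ENDMARKER token at the end of the stream.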