mirror of
https://github.com/openpeeps/toktok
synced 2026-01-14 04:01:39 +00:00
Generic tokenizer written in the Nim language, powered by Nim's macros 👑

```
nimble install toktok
```
## 😍 Key Features
- ✨ Powered by Nim's Macros
- 🪄 Based on `std/lexbase` / Zero Regular Expression
- Compile-time generation using macro-based `TokenKind` enum, `lexbase`
- Runtime generation using `TokenKind` tables, `lexbase`
- Open Source | `MIT` License
> **Note**
> This is a generic lexer based on `std/streams`, `std/lexbase` and `std/macros`. It is meant to be used by higher-level parsers for writing any kind of tools or programs.
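To make the `std/lexbase` foundation concrete, here is a minimal hand-rolled tokenizer over that same base, using only the Nim standard library. This is an illustrative sketch, not toktok's API: the `Token`, `TokenKind` and `MiniLexer` names are invented for this example.

```nim
import std/[lexbase, streams]

type
  TokenKind = enum
    tkEOF, tkIdent, tkInt
  Token = object
    kind: TokenKind
    value: string
  # BaseLexer provides the buffer, position and line tracking
  MiniLexer = object of BaseLexer

proc getToken(lex: var MiniLexer): Token =
  # skip whitespace, letting lexbase handle line endings
  while true:
    case lex.buf[lex.bufpos]
    of ' ', '\t': inc lex.bufpos
    of '\c': lex.bufpos = lex.handleCR(lex.bufpos)
    of '\n': lex.bufpos = lex.handleLF(lex.bufpos)
    else: break
  case lex.buf[lex.bufpos]
  of lexbase.EndOfFile:
    result = Token(kind: tkEOF)
  of '0'..'9':
    # collect a run of digits as an integer token
    result.kind = tkInt
    while lex.buf[lex.bufpos] in {'0'..'9'}:
      result.value.add lex.buf[lex.bufpos]
      inc lex.bufpos
  else:
    # anything else up to the next whitespace is an identifier
    result.kind = tkIdent
    while lex.buf[lex.bufpos] notin {' ', '\t', '\c', '\n', lexbase.EndOfFile}:
      result.value.add lex.buf[lex.bufpos]
      inc lex.bufpos

var lex: MiniLexer
lex.open(newStringStream("let x 42"))
var toks: seq[Token]
while true:
  let t = lex.getToken
  if t.kind == tkEOF: break
  toks.add t
lex.close()
```

toktok generates the `TokenKind` enum and the `getToken` dispatch for you at compile time, so you never have to write this boilerplate by hand.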
> **Note**
> Compile with `-d:toktokdebug` to inspect the generated code.
## Quick Example
```nim
# Register your custom handlers
handlers:
  proc handleImport(lex: var Lexer, kind: TokenKind) =
    # tokenize `import x, y, z`
    lex.kind = kind

# Register your tokens
registerTokens defaultSettings:
  `const` = "const"
  `echo` = "hello"
  asgn = '=':     # `=` is tkAsgn
    eq = '='      # `==` is tkEQ
  excl = '!':
    ne = '='      # `!=` is tkNe
  at = '@':
    `import` = tokenizer(handleImport, "import")

# Tokenizing...
var
  tok = lexer.init(sample)
  prev: TokenTuple
  curr: TokenTuple = tok.getToken
  next: TokenTuple = tok.getToken

proc walk(tok: var Lexer) =
  prev = curr
  curr = next
  next = tok.getToken

while likely(curr.kind != tkEOF):
  if tok.hasError: break
  echo curr # use the `getToken` consumer to get tokens one by one
  walk tok
```
## TODO
- Runtime Token generation using tables/critbits
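A runtime keyword registry along those lines could be sketched with `std/critbits`, which keeps keys sorted and supports prefix queries. This is a hypothetical sketch of the idea, not toktok's actual design; the `keywords` table, `lookup` proc and token kind names are all invented for illustration.

```nim
import std/critbits

type
  TokenKind = enum
    tkUnknown, tkConst, tkEcho, tkImport

# hypothetical runtime registry mapping keyword strings to token kinds;
# unlike the macro-based approach, entries can be added after compilation
var keywords: CritBitTree[TokenKind]
keywords["const"] = tkConst
keywords["echo"] = tkEcho
keywords["import"] = tkImport

proc lookup(word: string): TokenKind =
  # identifiers not present in the registry fall back to tkUnknown
  if word in keywords: keywords[word]
  else: tkUnknown
```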
## ❤ Contributions & Support
- 🐛 Found a bug? Create a new Issue
- 👋 Wanna help? Fork it!
- 😎 Get €20 in cloud credits from Hetzner
- 🥰 Donate to OpenPeeps via PayPal address
## 🎩 License
MIT license. Made by Humans from OpenPeeps.
Copyright © 2023 OpenPeeps & Contributors — All rights reserved.