Appendix¶
c¶
-
quasiquotes.c.c = <quasiquotes.c.c object>¶
    Quasiquoter for inlining C.

    Parameters:
        keep_c : bool, optional
            Keep the generated .c files. Defaults to False.
        keep_so : bool, optional
            Keep the compiled .so files. Defaults to True.
        extra_compile_args : iterable[str or Flag]
            Extra command line arguments to pass to gcc.

    Notes
    You cannot pass arguments in the quasiquote syntax. You must construct a new instance of c and then use that as the quasiquoter. For example:

        with $c(keep_so=False):
            Py_None;

    is a syntax error. Instead, you must write:

        c_no_keep_so = c(keep_so=False)
        with $c_no_keep_so:
            Py_None;

    This is because of the way the quasiquotes lexer identifies quasiquote sections.

    Methods
        quote_stmt
        quote_expr
-
quasiquotes.c.cleanup(path='.', recurse=True)¶
    Remove cached shared objects and C code generated by the c quasiquoter.

    Parameters:
        path : str, optional
            The path to the directory that will be searched.
        recurse : bool, optional
            Should the search recurse through subdirectories of path.

    Returns:
        removed : list[str]
            The paths to the files that were removed.
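The documented behavior above can be sketched in plain Python. This is only an illustration of the search-and-remove pattern, not the library's actual implementation (which may, for example, only remove files it generated itself); the function name `cleanup_sketch` is hypothetical.

```python
import os

def cleanup_sketch(path=".", recurse=True):
    """Remove .c and .so files under *path*, returning the removed paths.

    A minimal sketch of a cleanup helper in the spirit of
    quasiquotes.c.cleanup; the real function's behavior may differ.
    """
    removed = []
    if recurse:
        walker = os.walk(path)
    else:
        # Only look at regular files directly inside *path*.
        entries = [
            name for name in os.listdir(path)
            if os.path.isfile(os.path.join(path, name))
        ]
        walker = [(path, [], entries)]
    for dirpath, _dirnames, filenames in walker:
        for name in filenames:
            if name.endswith((".c", ".so")):
                full = os.path.join(dirpath, name)
                os.remove(full)
                removed.append(full)
    return removed
```

The returned list mirrors the documented `removed : list[str]` value, so callers can log or inspect what was deleted.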
-
exception quasiquotes.c.CompilationError¶
    An exception that indicates that gcc failed to compile the given C code.
-
exception quasiquotes.c.CompilationWarning¶
    A warning that indicates that gcc warned when compiling the given C code.
fromfile¶
-
class quasiquotes.quasiquoter.fromfile(qq)¶
    Create a QuasiQuoter from an existing one that reads the body from a filename.

    Parameters:
        qq : QuasiQuoter
            The QuasiQuoter to wrap.

    Examples
    >>> from quasiquotes.quasiquoter import fromfile
    >>> from quasiquotes.c import c
    >>> include_c = fromfile(c)
    >>> # quote_expr on the contents of the file
    >>> [$include_c|mycode.c|]
    >>> # quote_stmt on the contents of the file
    >>> with $include_c:
    ...     mycode.c
Codec¶
-
class quasiquotes.codec.tokenizer.FuzzyTokenInfo¶
    A token info object that checks equality only on type and string.

    Parameters:
        type : int
            The enum for the token type.
        string : str
            The string representing the token.
        start, end, line : any, optional
            Ignored.
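The equality behavior described above can be illustrated with a small re-implementation. This is a sketch under the documented contract, not the library's actual class; the name `FuzzyToken` is hypothetical.

```python
from tokenize import NAME, TokenInfo

class FuzzyToken:
    """A token-info-like object that compares equal to any token
    with the same type and string, ignoring position information.
    A sketch of the contract documented for FuzzyTokenInfo."""

    def __init__(self, type, string, start=None, end=None, line=None):
        self.type = type
        self.string = string
        # Positional fields are accepted but never consulted.
        self.start = start
        self.end = end
        self.line = line

    def __eq__(self, other):
        # Only type and string participate in equality.
        return self.type == other.type and self.string == other.string

    def __ne__(self, other):
        return not self == other
```

This lets a tokenizer match a real `tokenize.TokenInfo` (which carries exact line and column positions) against a pattern token without caring where it occurred in the source.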
-
class quasiquotes.codec.tokenizer.PeekableIterator(stream)¶
    An iterator that can peek at the next n elements without consuming them.

    Parameters:
        stream : iterator
            The underlying iterator to pull from.

    Notes
    Peeking at n items will pull that many values into memory until they have been consumed with next.
    The underlying iterator should not be consumed while the PeekableIterator is in use.

    -
    lookahead_iter()¶
        Return an iterator that yields the next element and then consumes it.
        This is particularly useful for takewhile-style functions where you want to break when some predicate is matched but not consume the element that failed the predicate.

        Examples
        >>> it = PeekableIterator(iter((1, 2, 3)))
        >>> for n in it.lookahead_iter():
        ...     if n == 2:
        ...         break
        >>> next(it)
        2
    -
    peek(n=1)¶
        Return the next n elements of the iterator without consuming them.

        Parameters:
            n : int
                The number of elements to peek at.

        Returns:
            peeked : tuple
                The next n elements.

        Examples
        >>> it = PeekableIterator(iter((1, 2, 3, 4)))
        >>> it.peek(2)
        (1, 2)
        >>> next(it)
        1
        >>> it.peek(1)
        (2,)
        >>> next(it)
        2
        >>> next(it)
        3
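The documented `peek`/`lookahead_iter` semantics can be implemented with a small buffer. The sketch below follows the contract described above; it is not the library's actual implementation, and the class name `Peekable` is hypothetical.

```python
from collections import deque

class Peekable:
    """A peekable iterator sketch matching the documented contract
    of quasiquotes.codec.tokenizer.PeekableIterator."""

    def __init__(self, stream):
        self._stream = iter(stream)
        self._buffer = deque()  # values pulled ahead but not yet consumed

    def __iter__(self):
        return self

    def __next__(self):
        if self._buffer:
            return self._buffer.popleft()
        return next(self._stream)

    def peek(self, n=1):
        # Pull values into memory until we have n of them buffered.
        while len(self._buffer) < n:
            self._buffer.append(next(self._stream))
        return tuple(self._buffer)[:n]

    def lookahead_iter(self):
        # Yield each upcoming element *before* consuming it, so a
        # caller that breaks out of the loop does not lose the
        # element that failed its predicate.
        while True:
            try:
                value, = self.peek(1)
            except StopIteration:
                return
            yield value
            next(self)  # consume only after the caller asked to continue
```

Because `lookahead_iter` consumes an element only when the caller resumes the loop, breaking on a predicate leaves the failing element available to the next `next()` call, exactly as in the doctest above.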
-
quasiquotes.codec.tokenizer.quote_expr_tokenizer(name, start, tok_stream)¶
    Tokenizer for quote_expr.

    Parameters:
        name : str
            The name of the quasiquoter.
        start : TokenInfo
            The starting token.
        tok_stream : iterator of TokenInfo
            The token stream to pull from.
-
quasiquotes.codec.tokenizer.quote_stmt_tokenizer(name, start, tok_stream)¶
    Tokenizer for quote_stmt.

    Parameters:
        name : str
            The name of the quasiquoter.
        start : TokenInfo
            The starting token.
        tok_stream : iterator of TokenInfo
            The token stream to pull from.
-
quasiquotes.codec.tokenizer.tokenize(readline)¶
    Tokenizer for the quasiquotes language extension.

    Parameters:
        readline : callable
            A callable that returns the next line to tokenize.
-
quasiquotes.codec.tokenizer.tokenize_bytes(bs)¶
    Tokenize a bytes object.

    Parameters:
        bs : bytes
            The bytes to tokenize.
-
quasiquotes.codec.tokenizer.tokenize_string(cs)¶
    Tokenize a str object.

    Parameters:
        cs : str
            The string to tokenize.
-
quasiquotes.codec.tokenizer.transform_bytes(bs)¶
    Run bytes through the tokenizer and emit the pure Python representation.

    Parameters:
        bs : bytes
            The bytes to transform.

    Returns:
        transformed : bytes
            The pure Python representation of bs.
-
quasiquotes.codec.tokenizer.transform_string(cs)¶
    Run a str through the tokenizer and emit the pure Python representation.

    Parameters:
        cs : str
            The string to transform.

    Returns:
        transformed : bytes
            The pure Python representation of cs.
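The transform functions run source through the quasiquotes tokenizer and emit plain Python. As a rough illustration of the underlying tokenize-then-emit pipeline, here is a round trip through the standard library's tokenize module; this does not perform the quasiquote rewriting itself, and the function name `roundtrip` is hypothetical.

```python
import io
import tokenize

def roundtrip(source):
    """Tokenize a string and reassemble it from its tokens.

    Illustrates the tokenize-then-emit structure that a source
    transform like transform_string builds on; the quasiquotes
    tokenizer additionally rewrites $name quasiquote sections
    into plain-Python calls, which this sketch does not do.
    """
    tokens = tokenize.generate_tokens(io.StringIO(source).readline)
    return tokenize.untokenize(tokens)
```

Because `generate_tokens` yields full `TokenInfo` tuples with positions, `untokenize` can reproduce the original spacing; a real transform would instead replace the tokens inside each quasiquote section before emitting.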
Utilities¶
-
class quasiquotes.utils.shell.Executable(name)¶
    An executable from the shell.
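The one-line description above leaves the interface open, so the following wrapper is purely an assumption: a sketch of what a shell-executable object might look like, built on `subprocess.run`. The class name `ExecutableSketch` and its call signature are hypothetical, not the library's API.

```python
import subprocess

class ExecutableSketch:
    """A callable wrapper around a program on the shell path.

    A hypothetical sketch in the spirit of
    quasiquotes.utils.shell.Executable; the real class's
    interface is not documented here.
    """

    def __init__(self, name):
        self.name = name

    def __call__(self, *args):
        # Run the program with the given arguments and return
        # (returncode, stdout, stderr) as decoded strings.
        proc = subprocess.run(
            (self.name,) + args,
            capture_output=True,
            text=True,
        )
        return proc.returncode, proc.stdout, proc.stderr
```

Such a wrapper is handy for invoking tools like gcc from Python while keeping their output available for error reporting.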
-
quasiquotes.utils._traceback.new_tb(frame)¶
    Create a traceback object starting at the given stack frame.

    Parameters:
        frame : frame
            The frame to start the traceback from.

    Returns:
        tb : traceback
            The new traceback object.

    Notes
    This function creates a new traceback object through the C-API. Use at your own risk.
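Since Python 3.7, traceback objects can also be constructed in pure Python via `types.TracebackType`, without touching the C-API. The sketch below shows that alternative approach to the same result; the function name `new_tb_sketch` is hypothetical and this is not the library's implementation.

```python
import types

def new_tb_sketch(frame):
    """Create a traceback object starting at *frame*.

    Uses the types.TracebackType constructor (available since
    Python 3.7) instead of the C-API route that new_tb takes.
    """
    # Arguments: tb_next, tb_frame, tb_lasti, tb_lineno.
    return types.TracebackType(None, frame, frame.f_lasti, frame.f_lineno)
```

A traceback built this way can be attached to an exception with `exc.with_traceback(tb)` to make an error appear to originate at a chosen frame.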