This document provides a comprehensive reference of all functions in the nfo project, organized by module and category.
## nfo.decorators

### @log_call

Automatically logs function calls with arguments, return values, exceptions, and duration.
```python
@log_call
def my_function(arg1, arg2):
    return arg1 + arg2
```
**Parameters:**

- `func` (Optional[Callable]) - Function to decorate
- `level` (str) - Log level (default: "DEBUG")
- `logger` (Optional[Logger]) - Custom logger instance
- `max_repr_length` (Optional[int]) - Truncate long representations

**Returns:** Decorated function
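The mechanics of `@log_call` can be sketched with a minimal stand-in decorator (illustrative only, not nfo's actual implementation): time the call, log arguments and the result, and re-raise any exception after logging it.

```python
import functools
import logging
import time

def log_call_sketch(func):
    """Illustrative stand-in for @log_call (not nfo's implementation)."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        logger = logging.getLogger(func.__module__)
        start = time.perf_counter()
        try:
            result = func(*args, **kwargs)
        except Exception:
            # Log the failure, then propagate it unchanged.
            logger.exception("%s raised", func.__qualname__)
            raise
        duration_ms = (time.perf_counter() - start) * 1000
        logger.debug("%s args=%r kwargs=%r -> %r (%.2f ms)",
                     func.__qualname__, args, kwargs, result, duration_ms)
        return result
    return wrapper

@log_call_sketch
def add(a, b):
    return a + b
```

`functools.wraps` preserves the wrapped function's name and docstring, which matters for any decorator applied this way.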
### @catch

Like @log_call but suppresses exceptions and returns a default value.
```python
@catch(default=None)
def risky_function():
    return 1 / 0  # Returns None instead of raising
```
**Parameters:**

- `func` (Optional[Callable]) - Function to decorate
- `default` (Any) - Value to return on exception (default: None)
- `level` (str) - Log level (default: "ERROR")
- `logger` (Optional[Logger]) - Custom logger instance
- `max_repr_length` (Optional[int]) - Truncate long representations

**Returns:** Decorated function
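The suppression behavior can be sketched as follows (a simplified stand-in; the real decorator also logs the exception):

```python
import functools

def catch_sketch(default=None):
    """Illustrative stand-in for @catch: swallow exceptions, return default."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except Exception:
                # Where the real decorator would log at ERROR level.
                return default
        return wrapper
    return decorator

@catch_sketch(default=-1)
def divide(a, b):
    return a / b
```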
### @logged

Class decorator that automatically wraps all public methods with @log_call.
```python
@logged
class MyService:
    def method1(self): pass   # Will be logged
    def _private(self): pass  # Won't be logged
```
**Parameters:**

- `cls` (Optional[Type]) - Class to decorate
- `level` (str) - Log level for all methods
- `logger` (Optional[Logger]) - Custom logger instance
- `max_repr_length` (Optional[int]) - Truncate long representations

**Returns:** Decorated class
### @skip

Mark a public method to be excluded from @logged auto-wrapping.
```python
@logged
class MyService:
    @skip
    def health_check(self): pass  # Excluded from logging
```
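The interaction between `@logged` and `@skip` can be sketched with stand-in decorators (illustrative; the names and the `_skip_logging` marker attribute are assumptions, and this sketch records call names in a list instead of logging):

```python
import functools

calls = []  # Records wrapped-method invocations, standing in for real logging.

def skip_sketch(method):
    """Mark a method so the class decorator leaves it alone."""
    method._skip_logging = True
    return method

def logged_sketch(cls):
    """Wrap every public, unmarked method of cls."""
    for name, attr in list(vars(cls).items()):
        if name.startswith("_") or not callable(attr):
            continue
        if getattr(attr, "_skip_logging", False):
            continue
        @functools.wraps(attr)
        def wrapper(self, *args, _orig=attr, **kwargs):
            calls.append(_orig.__name__)
            return _orig(self, *args, **kwargs)
        setattr(cls, name, wrapper)
    return cls

@logged_sketch
class Service:
    def ping(self):
        return "pong"

    @skip_sketch
    def health_check(self):
        return "ok"
```

The `_orig=attr` default argument pins each wrapper to its own method, avoiding the classic late-binding closure bug inside the loop.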
## nfo.auto

### auto_log(*modules, **kwargs)

Automatically wrap all functions in specified modules with logging decorators.
```python
import mymodule

auto_log(mymodule, level="INFO", catch_exceptions=True)
```
**Parameters:**

- `*modules` - Module objects to instrument
- `level` (str) - Log level for all functions
- `catch_exceptions` (bool) - Use @catch instead of @log_call
- `default` (Any) - Default value for @catch
- `include_private` (bool) - Also wrap private functions
- `max_repr_length` (Optional[int]) - Truncate long representations

**Returns:** Number of functions patched
### auto_log_by_name(*module_names, **kwargs)

Like auto_log() but accepts module name strings.
```python
auto_log_by_name("myapp.api", "myapp.core", level="INFO")
```
**Parameters:**

- `*module_names` - Module names to instrument
- `**kwargs` - Same options as `auto_log()`

**Returns:** Number of functions patched
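What module-level instrumentation involves can be sketched as follows (the `auto_log_sketch` and `counting` helpers are hypothetical stand-ins, not nfo APIs): iterate over a module's public functions and replace each with a wrapped version.

```python
import functools
import types

def auto_log_sketch(module, wrapper_factory):
    """Replace each public function in *module* with a wrapped version."""
    patched = 0
    for name in dir(module):
        if name.startswith("_"):
            continue
        obj = getattr(module, name)
        if isinstance(obj, types.FunctionType):
            setattr(module, name, wrapper_factory(obj))
            patched += 1
    return patched

seen = []  # Stand-in for log output: records which functions were called.

def counting(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        seen.append(func.__name__)
        return func(*args, **kwargs)
    return wrapper

# Build a throwaway module to instrument.
mod = types.ModuleType("demo")
def square(x):
    return x * x
mod.square = square

patched = auto_log_sketch(mod, counting)
```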
## nfo.configure

### configure(**kwargs)

One-liner project setup with automatic environment variable support.
```python
configure(
    sinks=["sqlite:logs.db", "csv:logs.csv"],
    level="INFO",
    modules=["myapp.api", "myapp.core"]
)
```
**Parameters:**

- `name` (str) - Logger name (default: "nfo")
- `level` (str) - Log level (default: "DEBUG")
- `sinks` (List[Union[str, Sink]]) - Sink specifications or instances
- `modules` (List[str]) - Stdlib modules to bridge
- `propagate_stdlib` (bool) - Forward to stdlib loggers
- `environment` (str) - Environment tag
- `version` (str) - Application version
- `llm_model` (str) - LLM model for analysis
- `detect_injection` (bool) - Enable prompt injection detection
- `force` (bool) - Re-configure even if already configured

**Returns:** Configured Logger instance
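The idempotency contract implied by `force` can be sketched as follows (`configure_sketch` is a hypothetical stand-in; the real `configure()` builds a Logger rather than storing options):

```python
_state = {"logger": None}

def configure_sketch(force=False, **options):
    """Return the existing configuration unless force=True."""
    if _state["logger"] is not None and not force:
        return _state["logger"]
    _state["logger"] = options  # stand-in for a real Logger instance
    return _state["logger"]

first = configure_sketch(level="INFO")
second = configure_sketch(level="DEBUG")        # ignored: already configured
third = configure_sketch(force=True, level="DEBUG")
```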
## nfo.sinks

### SQLiteSink(db_path, table)

Persist logs to a SQLite database for querying.
```python
sink = SQLiteSink("logs.db", table="function_calls")
```
**Parameters:**

- `db_path` (Any) - Database file path
- `table` (str) - Table name (default: "logs")

### CSVSink(file_path)

Append logs to a CSV file.
```python
sink = CSVSink("logs.csv")
```
**Parameters:**

- `file_path` (Any) - CSV file path

### MarkdownSink(file_path)

Write human-readable Markdown logs.
```python
sink = MarkdownSink("logs.md")
```
**Parameters:**

- `file_path` (Any) - Markdown file path

### JSONSink(file_path, pretty)

Write structured JSON Lines output.
```python
sink = JSONSink("logs.jsonl", pretty=False)
```
**Parameters:**

- `file_path` (Any) - JSON file path
- `pretty` (bool) - Pretty-print JSON (default: False)

## nfo.prometheus

### PrometheusSink

Export function call metrics to Prometheus.
```python
sink = PrometheusSink(
    delegate=SQLiteSink("logs.db"),
    port=9090
)
```
**Parameters:**

- `delegate` (Optional[Sink]) - Downstream sink
- `port` (int) - Metrics server port
- `prefix` (str) - Metric name prefix

**Methods:**

- `get_metrics()` - Return current metrics in Prometheus format

## nfo.webhook

### WebhookSink

Send HTTP alerts to Slack, Discord, or Teams.
```python
sink = WebhookSink(
    url="https://hooks.slack.com/...",
    levels=["ERROR"],
    format="slack"
)
```
**Parameters:**

- `url` (str) - Webhook URL
- `delegate` (Optional[Sink]) - Downstream sink
- `levels` (List[str]) - Log levels to alert on
- `format` (str) - Payload format: "slack", "discord", "teams", or "raw"

## nfo.llm

### LLMSink

AI-powered log analysis via litellm.
```python
sink = LLMSink(
    model="gpt-4o-mini",
    delegate=SQLiteSink("logs.db"),
    detect_injection=True
)
```
**Parameters:**

- `model` (str) - LLM model name
- `delegate` (Optional[Sink]) - Downstream sink
- `detect_injection` (bool) - Scan for prompt injection
- `analyze_levels` (List[str]) - Levels to analyze (default: ["ERROR"])

## nfo.env

### EnvTagger

Auto-tag logs with environment, trace ID, and version.
```python
sink = EnvTagger(
    SQLiteSink("logs.db"),
    environment="prod",
    trace_id="abc123"
)
```
**Parameters:**

- `delegate` (Sink) - Downstream sink
- `environment` (Optional[str]) - Environment tag
- `trace_id` (Optional[str]) - Trace ID
- `version` (Optional[str]) - Application version

### DynamicRouter

Route logs to different sinks based on rules.
```python
router = DynamicRouter([
    (lambda e: e.level == "ERROR", SQLiteSink("errors.db")),
    (lambda e: e.environment == "prod", PrometheusSink())
])
```
**Parameters:**

- `rules` (List[tuple]) - (predicate, sink) pairs
- `default` (Optional[Sink]) - Default sink

### DiffTracker

Detect when function output changes between versions.
```python
sink = DiffTracker(SQLiteSink("logs.db"))
```
**Parameters:**

- `delegate` (Sink) - Downstream sink

## nfo.llm

### detect_prompt_injection(text)

Scan text for common prompt injection patterns.
```python
result = detect_prompt_injection("ignore previous instructions")
```
**Parameters:**

- `text` (str) - Text to scan

**Returns:** Optional[str] - Injection type if detected
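Pattern-based detection can be sketched with a small regex table (the patterns and the `detect_sketch` helper are illustrative assumptions; nfo's actual rule set is not documented here):

```python
import re

# Hypothetical patterns; real scanners use a broader, maintained rule set.
PATTERNS = {
    "instruction_override": re.compile(
        r"ignore (all |previous |prior )*instructions", re.IGNORECASE),
    "role_hijack": re.compile(r"you are now", re.IGNORECASE),
}

def detect_sketch(text):
    """Return the first matching injection type, or None."""
    for kind, pattern in PATTERNS.items():
        if pattern.search(text):
            return kind
    return None
```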
### scan_entry_for_injection(entry)

Scan a LogEntry's arguments for prompt injection.
```python
injection = scan_entry_for_injection(log_entry)
```
**Parameters:**

- `entry` (LogEntry) - Log entry to scan

**Returns:** Optional[str] - Injection type if detected
## nfo.env

### generate_trace_id()

Generate a new trace ID for distributed tracing.
```python
trace_id = generate_trace_id()
```

**Returns:** str - UUID-based trace ID
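A minimal sketch of UUID-based trace ID generation (illustrative; nfo's exact format is not documented here):

```python
import uuid

def generate_trace_id_sketch():
    # A UUID4 hex string is one common choice for trace IDs.
    return uuid.uuid4().hex
```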
## nfo.models

### safe_repr(value, max_length)

Safe string representation with truncation.
```python
repr_str = safe_repr(large_object, max_length=512)
```
**Parameters:**

- `value` (Any) - Value to represent
- `max_length` (Optional[int]) - Maximum length

**Returns:** str - Safe representation
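The truncation behavior can be sketched as follows (a stand-in, assuming a trailing ellipsis on truncation; nfo's exact formatting may differ):

```python
def safe_repr_sketch(value, max_length=None):
    """repr() that never raises and optionally truncates."""
    try:
        text = repr(value)
    except Exception as exc:
        # A broken __repr__ must not crash the logger.
        text = f"<unreprable: {exc!r}>"
    if max_length is not None and len(text) > max_length:
        text = text[:max_length] + "..."
    return text
```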
## nfo.__main__ (CLI)

### `nfo run -- <command>`

Run any command with automatic logging.
```shell
nfo run -- python script.py
nfo run -- bash deploy.sh prod
```
### `nfo logs [options]`

Query logs from the SQLite database.
```shell
nfo logs --errors --last 24h
nfo logs --function deploy -n 50
```
### `nfo serve [--port]`

Start the centralized HTTP logging service.
```shell
nfo serve --port 8080
```
### `nfo version`

Print the nfo version.

```shell
nfo version
```
## nfo.models

### LogEntry

Core data structure representing a function call log.
**Fields:**

- `timestamp` (datetime) - UTC timestamp
- `level` (str) - Log level (DEBUG/ERROR)
- `function_name` (str) - Qualified function name
- `module` (str) - Python module
- `args` (tuple) - Positional arguments
- `kwargs` (dict) - Keyword arguments
- `arg_types` (tuple) - Argument type names
- `kwarg_types` (dict) - Keyword argument type names
- `return_value` (Any) - Function return value
- `return_type` (str) - Return value type
- `exception` (Optional[str]) - Exception message
- `exception_type` (Optional[str]) - Exception class name
- `traceback` (Optional[str]) - Full traceback
- `duration_ms` (float) - Execution time in milliseconds
- `environment` (Optional[str]) - Environment tag
- `trace_id` (Optional[str]) - Trace ID
- `version` (Optional[str]) - Application version
- `llm_analysis` (Optional[str]) - LLM analysis result
- `extra` (dict) - Additional metadata

**Methods:**

- `now()` - Create timestamp
- `args_repr()` - Get truncated args representation
- `kwargs_repr()` - Get truncated kwargs representation
- `return_value_repr()` - Get truncated return value representation
- `as_dict()` - Convert to flat dictionary

## nfo.logger

### Logger

Central dispatcher for log entries.
**Methods:**

- `add_sink(sink)` - Register a sink
- `remove_sink(sink)` - Remove a sink
- `emit(entry)` - Send entry to all sinks
- `close()` - Close all sinks

## Sink interface

All sinks implement the same interface:
```python
class Sink:
    def write(self, entry: LogEntry) -> None:
        """Write a log entry."""
        pass

    def close(self) -> None:
        """Close the sink and release resources."""
        pass
```
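Given that interface, a custom sink is a small class. The sketch below is self-contained, using a stand-in for `LogEntry` and a hypothetical in-memory sink (handy for tests):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Stand-in for nfo's LogEntry, reduced to a few fields.
@dataclass
class LogEntry:
    function_name: str
    level: str = "DEBUG"
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class MemorySink:
    """Collects entries in a list instead of persisting them."""
    def __init__(self):
        self.entries = []

    def write(self, entry):
        self.entries.append(entry)

    def close(self):
        self.entries.clear()

sink = MemorySink()
sink.write(LogEntry(function_name="demo.add"))
```

Because the dispatcher only calls `write()` and `close()`, any object with those two methods can be registered as a sink.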
## Environment variables

nfo automatically reads these environment variables:
- `NFO_LEVEL` - Default log level
- `NFO_SINKS` - Comma-separated sink specifications
- `NFO_ENV` - Environment tag
- `NFO_VERSION` - Application version
- `NFO_LLM_MODEL` - LLM model name
- `OPENAI_API_KEY` - OpenAI API key (for LLM features)
- `NFO_WEBHOOK_URL` - Webhook URL for alerts
- `NFO_PROMETHEUS_PORT` - Prometheus metrics port
- `NFO_LOG_DIR` - Directory for log files
- `NFO_PORT` - HTTP service port

## Sink specification strings

String format for configure() and the CLI:
```
sqlite:path/to/db.db
csv:path/to/file.csv
md:path/to/file.md
json:path/to/file.jsonl
prometheus:9090
```
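Combining the environment variables above with these spec strings, a deployment might be configured entirely from the shell (values illustrative):

```shell
# Configure nfo via the environment, then run the instrumented command.
export NFO_LEVEL=INFO
export NFO_SINKS="sqlite:logs.db,json:logs.jsonl"
export NFO_ENV=prod
nfo run -- python script.py
```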
## Tips

- Use `@catch` where exceptions should be suppressed rather than raised
- Set `max_repr_length` for functions with large arguments