😫 The Pain Point
Your server generated 10GB of logs. You need to find all ERROR entries, count occurrences, and identify the most common issues. Scrolling through millions of lines is impossible.
🚀 Agentic Solution
A Log Parser that filters, aggregates, and summarizes log data.
Key Features:
- Pattern Matching: Custom regex for your log format.
- Level Filtering: ERROR, WARNING, INFO, DEBUG.
- Aggregation: Count by type, time period, source.
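The pattern-matching and level-filtering features above boil down to one regex. A minimal sketch for the bracketed `[TIMESTAMP] [LEVEL] Message` layout used later in this guide (`LOG_PATTERN` and `parse_line` are illustrative names, and the groups should be adjusted to your own format):

```python
import re

# Hypothetical pattern for a "[TIMESTAMP] [LEVEL] Message" layout;
# named groups make the extracted fields self-documenting.
LOG_PATTERN = re.compile(
    r"\[(?P<ts>[^\]]+)\]\s+\[(?P<level>[A-Z]+)\]\s+(?P<msg>.*)"
)

def parse_line(line):
    """Return (timestamp, level, message), or None for malformed lines."""
    m = LOG_PATTERN.match(line)
    if not m:
        return None
    return m.group("ts"), m.group("level"), m.group("msg")
```

Returning `None` for non-matching lines gives the caller a clean way to skip (and count) malformed entries instead of crashing mid-file.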
⚙️ Phase 1: Commander (Quick Fix)
For quick analysis.
Prompt:
“I have a log file `server.log` with the format `[TIMESTAMP] [LEVEL] Message`. Write a Python script to:
- Parse: Extract timestamp, level, message.
- Filter: Show only ERROR and WARNING entries.
- Aggregate: Count by level and by hour.
- Output: Save filtered entries to `errors.csv` and a summary to `summary.txt`. Handle malformed lines (skip with a warning).”
Result: Actionable insights from massive logs.
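A minimal sketch of the script the prompt above should produce. The file names (`server.log`, `errors.csv`, `summary.txt`) come from the prompt; the regex assumes the bracketed `[TIMESTAMP] [LEVEL] Message` layout, and `analyze` is an illustrative function name:

```python
import csv
import re
from collections import Counter

LOG_PATTERN = re.compile(
    r"\[(?P<ts>[^\]]+)\]\s+\[(?P<level>[A-Z]+)\]\s+(?P<msg>.*)"
)
KEEP_LEVELS = {"ERROR", "WARNING"}

def analyze(log_path, csv_path, summary_path):
    """Parse log_path; write ERROR/WARNING rows to csv_path, counts to summary_path."""
    by_level = Counter()
    by_hour = Counter()
    skipped = 0
    with open(log_path, encoding="utf-8") as src, \
         open(csv_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        writer.writerow(["timestamp", "level", "message"])
        for line in src:  # streams one line at a time, never the whole file
            m = LOG_PATTERN.match(line)
            if not m:
                skipped += 1  # malformed line: skip, but keep count
                continue
            ts, level, msg = m.group("ts"), m.group("level"), m.group("msg")
            by_level[level] += 1
            by_hour[ts[:13]] += 1  # "YYYY-MM-DD HH" bucket
            if level in KEEP_LEVELS:
                writer.writerow([ts, level, msg])
    with open(summary_path, "w", encoding="utf-8") as f:
        f.write("Counts by level:\n")
        for level, n in by_level.most_common():
            f.write(f"  {level}: {n}\n")
        f.write("Counts by hour:\n")
        for hour, n in sorted(by_hour.items()):
            f.write(f"  {hour}: {n}\n")
        f.write(f"Skipped malformed lines: {skipped}\n")
    return by_level, by_hour, skipped
```

Because it holds only counters in memory and writes matches as it goes, the same script handles a 10GB log as comfortably as a 10MB one.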
🏗️ Phase 2: Architect (Permanent Tool)
Engineering Prompt:
**Role:** Python Tool Developer
**Task:** Create a "Log Analyzer".
**Requirements:**
1. **GUI:**
* Select log file.
* Regex pattern builder for custom formats.
* Level filter checkboxes.
* Time range filter.
* Results table with search.
* Export options.
2. **Logic:**
* Stream parsing for large files.
* Pandas for aggregation.
* Matplotlib charts for visualization.
3. **Deliverables:**
* `log_analyzer.py`
* `run.bat`, `run.sh`
* `requirements.txt`
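For the "Pandas for aggregation" requirement, a minimal sketch of the counting step, assuming pandas is installed and the entries have already been parsed into dicts (`aggregate` is an illustrative name, not part of the prompt):

```python
import pandas as pd

def aggregate(entries):
    """Count parsed log entries by level and by hour bucket.

    entries: list of dicts with "timestamp", "level", "message" keys.
    """
    df = pd.DataFrame(entries)
    df["timestamp"] = pd.to_datetime(df["timestamp"])
    by_level = df["level"].value_counts()
    # Bucket by "YYYY-MM-DD HH" string to group entries per hour.
    by_hour = df.groupby(df["timestamp"].dt.strftime("%Y-%m-%d %H")).size()
    return by_level, by_hour
```

The two resulting Series plug straight into Matplotlib (`by_hour.plot(kind="bar")`) for the visualization requirement.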
🧠 Prompt Decoding
- Stream Parsing: Don’t load the entire file into memory; process it line by line.
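In Python, stream parsing falls out naturally from iterating over the file handle, which reads one line at a time. A minimal generator sketch (`iter_entries` and `LOG_PATTERN` are illustrative; the regex assumes the bracketed format from earlier):

```python
import re

LOG_PATTERN = re.compile(r"\[([^\]]+)\]\s+\[([A-Z]+)\]\s+(.*)")

def iter_entries(path, pattern=LOG_PATTERN):
    """Yield (timestamp, level, message) tuples lazily.

    Iterating over the open file reads line by line, so a 10GB log
    never needs to fit in memory; non-matching lines are skipped.
    """
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = pattern.match(line)
            if m:
                yield m.group(1), m.group(2), m.group(3)
```

Downstream code consumes it like any iterable, e.g. `sum(1 for _, lvl, _ in iter_entries("server.log") if lvl == "ERROR")`, without ever materializing the whole file.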
🛠️ Instructions
- Copy Prompt → Adjust regex for your format → Run.