# Examples & Use Cases
Real-world examples and practical use cases for Daily Vibe
This guide provides practical examples of how to use Daily Vibe in various scenarios, from personal development tracking to team knowledge management.
## Personal Development Tracking

### Daily Standup Preparation

Generate comprehensive summaries for your daily standups:

```bash
# Generate today's summary
daily-vibe analyze today --out ./standup

# Quick overview for team sharing
daily-vibe analyze today --json | jq -r '.summary.keyOutputs[]'
```

**Example output** (`./standup/daily.md`):
```markdown
# Daily Development Report - 2025-01-16

## Overview
- **Sessions**: 3 coding sessions (4.2 hours total)
- **Projects**: daily-vibe-cli, web-dashboard
- **Key Focus**: API integration, bug fixes

## Key Outputs
- ✅ Implemented OpenAI API integration
- ✅ Fixed authentication bug in user service
- ✅ Added error handling for network requests
- 🔧 Refactored database connection pooling

## Test Results
- **Passed**: 47/50 tests
- **Failed**: 3 tests (authentication edge cases)
- **Coverage**: 89% (+2% from yesterday)
```
### Weekly Learning Review

Track your learning progress and skill development:

```bash
# Analyze the past week
daily-vibe analyze range --from "7 days ago" --to today --out ./weekly-review

# Focus on knowledge extraction
cat ./weekly-review/knowledge.md
```

**Example knowledge output:**
```markdown
# Knowledge Extraction - Week of 2025-01-10

## Build/Compilation Issues

### TypeScript Configuration
**Problem**: Module resolution errors with path mapping
**Solution**: Updated tsconfig.json baseUrl and paths configuration
**Tags**: typescript, configuration, modules

### Dependency Management
**Problem**: Conflicting versions of @types/node
**Solution**: Pinned to specific version in package.json
**Tags**: npm, dependencies, typescript

## Code Implementation

### API Design Patterns
**Pattern**: Repository pattern with dependency injection
**Implementation**: Created abstract base repository class
**Benefits**: Improved testability, cleaner separation of concerns
**Tags**: design-patterns, architecture, typescript
```
### Monthly Progress Tracking

Create monthly development reports for performance reviews:

```bash
# Generate monthly report
MONTH_START="2025-01-01"
MONTH_END="2025-01-31"

daily-vibe analyze range --from "$MONTH_START" --to "$MONTH_END" --out ./monthly-report

# Extract key metrics
jq '.statistics' ./monthly-report/data.json
```
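If you prefer Python to `jq` for post-processing, a small helper can pull the same fields from `data.json`. This is a sketch that assumes the `statistics` keys shown elsewhere in this guide (`totalSessions`, `totalHours`); adjust the key names to match your actual output.

```python
import json

def summarize_stats(report: dict) -> str:
    """Format the statistics block of a Daily Vibe data.json for quick reading."""
    stats = report.get("statistics", {})
    sessions = stats.get("totalSessions", 0)
    hours = stats.get("totalHours", 0)
    return f"Sessions: {sessions}, Hours: {hours:.1f}"

# Usage: summarize_stats(json.load(open("./monthly-report/data.json")))
```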
## Team Collaboration

### Knowledge Base Building

Create a searchable knowledge base for your team:

```bash
#!/bin/bash
# team-knowledge-sync.sh

TEAM_MEMBERS=("alice" "bob" "charlie")
KNOWLEDGE_DIR="./team-knowledge"
DATE_RANGE="--from yesterday --to today"

mkdir -p "$KNOWLEDGE_DIR"

for member in "${TEAM_MEMBERS[@]}"; do
  echo "Processing $member's sessions..."

  # Each team member runs this on their machine
  if [ "$USER" == "$member" ]; then
    daily-vibe analyze range $DATE_RANGE \
      --out "$KNOWLEDGE_DIR/$member-$(date +%Y-%m-%d)" \
      --json
  fi
done

# Combine knowledge files
cat "$KNOWLEDGE_DIR"/*/knowledge.md > "$KNOWLEDGE_DIR/team-knowledge-$(date +%Y-%m-%d).md"
```
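The plain `cat` concatenation drops attribution. If you want each member's section labeled in the combined file, a small merge script (assuming the same `<member-date>/knowledge.md` layout the sync script produces) can insert a header per source:

```python
import glob
import os

def merge_knowledge(knowledge_dir: str, out_path: str) -> int:
    """Concatenate per-directory knowledge.md files, labeling each source.

    Returns the number of files merged.
    """
    count = 0
    with open(out_path, "w") as out:
        for path in sorted(glob.glob(os.path.join(knowledge_dir, "*", "knowledge.md"))):
            source = os.path.basename(os.path.dirname(path))
            out.write(f"\n## Source: {source}\n\n")
            with open(path) as f:
                out.write(f.read())
            count += 1
    return count
```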
Team configuration (`.dailyviberc.json`):

```json
{
  "llm": {
    "provider": "openai",
    "model": "gpt-4",
    "apiKey": "${OPENAI_API_KEY}"
  },
  "redact": {
    "enabled": true,
    "patterns": [
      "sk-[a-zA-Z0-9]{20,}",
      "internal\\.company\\.com",
      "prod-[a-zA-Z0-9-]+",
      "\\b[A-Za-z0-9._%+-]+@company\\.com\\b"
    ]
  },
  "analysis": {
    "includeSystemMessages": false,
    "minSessionDuration": 600
  }
}
```
The sync can also run on a schedule with GitHub Actions:

```yaml
# .github/workflows/knowledge-sync.yml
name: Team Knowledge Sync

on:
  schedule:
    - cron: '0 9 * * 1' # Every Monday at 9 AM
  workflow_dispatch:

jobs:
  knowledge-sync:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'

      - name: Install Daily Vibe
        run: npm install -g daily-vibe

      - name: Sync Team Knowledge
        env:
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
        run: |
          daily-vibe analyze range --from "7 days ago" --to today \
            --out ./weekly-knowledge --json

      - name: Create PR with Knowledge Update
        run: |
          # Create PR with updated knowledge base
          # Implementation depends on your workflow
```
### Code Review Preparation

Prepare detailed summaries for code reviews:

```bash
# Before creating a PR
daily-vibe analyze range --from "$(git log --format="%ad" --date=short -1)" --to today \
  --out ./pr-summary

# Extract key changes for the PR description
jq -r '.summary.keyOutputs | join("\n- ")' ./pr-summary/data.json
```
### Post-Mortem Analysis

Analyze sessions during incident resolution:

```bash
# Analyze the incident response session
INCIDENT_START="2025-01-15T14:30:00"
INCIDENT_END="2025-01-15T18:45:00"

daily-vibe analyze range --from "$INCIDENT_START" --to "$INCIDENT_END" \
  --out ./incident-postmortem --json

# Extract problem-solution patterns
jq '.knowledge.problemSolution' ./incident-postmortem/data.json
```
## Development Workflows

### Morning Routine

Automated morning development summary:

```bash
#!/bin/bash
# morning-summary.sh

echo "📅 Daily Development Summary - $(date +%Y-%m-%d)"
echo "================================================"

# Yesterday's accomplishments
daily-vibe analyze range --from yesterday --to yesterday --json | \
  jq -r '.summary.keyOutputs[] | "✅ " + .'

# Today's focus areas from todos
daily-vibe analyze yesterday --json | \
  jq -r '.todos.pending[] | "🎯 " + .'

echo ""
echo "🚀 Ready to start the day!"
```
### End of Sprint Analysis

Analyze development patterns across a sprint:

```bash
# Sprint analysis (2-week sprint)
SPRINT_START="2025-01-01"
SPRINT_END="2025-01-14"

daily-vibe analyze range --from "$SPRINT_START" --to "$SPRINT_END" \
  --out ./sprint-retrospective

# Generate sprint metrics
jq '{
  totalSessions: .statistics.totalSessions,
  avgSessionLength: .statistics.avgSessionLength,
  topProblems: .knowledge.problemSolution[0:5],
  testSuccessRate: .testResults.successRate
}' ./sprint-retrospective/data.json > ./sprint-metrics.json
```
### Debugging Session Analysis

Extract insights from debugging sessions:

```bash
# Analyze a specific debugging session
daily-vibe analyze range --from "2025-01-15T10:00" --to "2025-01-15T14:00" \
  --out ./debug-session --no-redact

# Extract error patterns and solutions
jq -r '.knowledge.problemSolution[] |
  select(.tags[] | contains("debug") or contains("error")) |
  "Problem: " + .problem + "\nSolution: " + .solution + "\n---"' \
  ./debug-session/data.json
```
## Automation & Integration

### CI/CD Integration

Integrate Daily Vibe into your CI/CD pipeline. Note that the `gh` CLI needs a token in GitHub Actions, so the steps that call it set `GH_TOKEN`:

```yaml
# .github/workflows/development-analysis.yml
name: Development Analysis

on:
  pull_request:
    branches: [main]
  workflow_dispatch:

jobs:
  analyze-development:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Setup Daily Vibe
        run: npm install -g daily-vibe

      - name: Configure Daily Vibe
        env:
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
        run: |
          daily-vibe config set --provider openai --api-key "$OPENAI_API_KEY"

      - name: Analyze PR Development Session
        env:
          GH_TOKEN: ${{ github.token }}
        run: |
          PR_START=$(gh pr view ${{ github.event.pull_request.number }} --json createdAt -q .createdAt)
          daily-vibe analyze range --from "$PR_START" --to now \
            --json --out ./pr-analysis

      - name: Comment PR with Analysis
        env:
          GH_TOKEN: ${{ github.token }}
        run: |
          jq -r '.summary.keyOutputs | join("\n- ")' ./pr-analysis/data.json | \
            gh pr comment ${{ github.event.pull_request.number }} --body-file -
```
Jenkins pipeline (`Jenkinsfile`):

```groovy
pipeline {
    agent any

    environment {
        OPENAI_API_KEY = credentials('openai-api-key')
    }

    stages {
        stage('Setup') {
            steps {
                sh 'npm install -g daily-vibe'
                sh 'daily-vibe config set --provider openai --api-key $OPENAI_API_KEY'
            }
        }

        stage('Analyze Development') {
            steps {
                script {
                    def buildStart = new Date(currentBuild.startTimeInMillis).format("yyyy-MM-dd'T'HH:mm:ss")
                    sh "daily-vibe analyze range --from '${buildStart}' --to now --out ./build-analysis"
                }
            }
        }

        stage('Archive Analysis') {
            steps {
                archiveArtifacts artifacts: 'build-analysis/**', allowEmptyArchive: true
            }
        }
    }
}
```
GitLab CI (`.gitlab-ci.yml`):

```yaml
stages:
  - analyze
  - report

variables:
  DAILY_VIBE_LLM_PROVIDER: openai
  DAILY_VIBE_LLM_API_KEY: $OPENAI_API_KEY

analyze_development:
  stage: analyze
  image: node:18
  script:
    - npm install -g daily-vibe
    - daily-vibe analyze range --from "$CI_PIPELINE_CREATED_AT" --to now --json --out ./analysis
  artifacts:
    paths:
      - analysis/
    expire_in: 1 week

generate_report:
  stage: report
  dependencies:
    - analyze_development
  script:
    - cat analysis/daily.md >> "$CI_MERGE_REQUEST_IID.md"
  only:
    - merge_requests
```
### Slack Integration

Send daily summaries to Slack:

```bash
#!/bin/bash
# slack-daily-report.sh

SLACK_WEBHOOK_URL="https://hooks.slack.com/services/YOUR/WEBHOOK/URL"

# Generate today's report
daily-vibe analyze today --json > /tmp/daily-report.json

# Extract summary
SUMMARY=$(jq -r '.summary.overview' /tmp/daily-report.json)
KEY_OUTPUTS=$(jq -r '.summary.keyOutputs | join("\n• ")' /tmp/daily-report.json)

# Send to Slack
curl -X POST -H 'Content-type: application/json' \
  --data "{
    \"text\": \"📊 Daily Development Summary\",
    \"blocks\": [
      {
        \"type\": \"section\",
        \"text\": {
          \"type\": \"mrkdwn\",
          \"text\": \"*Overview:* $SUMMARY\"
        }
      },
      {
        \"type\": \"section\",
        \"text\": {
          \"type\": \"mrkdwn\",
          \"text\": \"*Key Outputs:*\n• $KEY_OUTPUTS\"
        }
      }
    ]
  }" \
  "$SLACK_WEBHOOK_URL"
```
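One caveat with the shell script: the payload is assembled by string interpolation, so a summary containing quotes or backslashes produces invalid JSON. A Python sketch of the same idea (same assumed `data.json` fields) lets `json.dumps` handle the escaping:

```python
import json
from urllib import request

def build_slack_payload(report: dict) -> dict:
    """Build a Slack Block Kit payload from a Daily Vibe report dict."""
    outputs = "\n".join("• " + item for item in report["summary"]["keyOutputs"])
    return {
        "text": "📊 Daily Development Summary",
        "blocks": [
            {"type": "section",
             "text": {"type": "mrkdwn",
                      "text": f"*Overview:* {report['summary']['overview']}"}},
            {"type": "section",
             "text": {"type": "mrkdwn",
                      "text": f"*Key Outputs:*\n{outputs}"}},
        ],
    }

def post_to_slack(webhook_url: str, report: dict) -> None:
    """POST the payload; json.dumps escapes quotes and newlines safely."""
    req = request.Request(
        webhook_url,
        data=json.dumps(build_slack_payload(report)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)
```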
### Notion Integration

Sync reports to a Notion database:

```python
# notion-sync.py
import json
from datetime import datetime

import requests

NOTION_TOKEN = "your_notion_token"
DATABASE_ID = "your_database_id"

def sync_daily_report():
    # Load Daily Vibe output
    with open('./reports/data.json', 'r') as f:
        report_data = json.load(f)

    # Prepare Notion page data
    notion_data = {
        "parent": {"database_id": DATABASE_ID},
        "properties": {
            "Date": {
                "date": {"start": datetime.now().strftime("%Y-%m-%d")}
            },
            "Summary": {
                "rich_text": [{"text": {"content": report_data["summary"]["overview"]}}]
            },
            "Key Outputs": {
                "rich_text": [{"text": {"content": "\n".join(report_data["summary"]["keyOutputs"])}}]
            },
            "Session Count": {
                "number": report_data["statistics"]["totalSessions"]
            }
        },
        "children": [
            {
                "object": "block",
                "type": "paragraph",
                "paragraph": {
                    "rich_text": [{"type": "text", "text": {"content": "Generated by Daily Vibe"}}]
                }
            }
        ]
    }

    # Create the Notion page
    response = requests.post(
        "https://api.notion.com/v1/pages",
        headers={
            "Authorization": f"Bearer {NOTION_TOKEN}",
            "Content-Type": "application/json",
            "Notion-Version": "2022-06-28"
        },
        json=notion_data
    )

    if response.status_code == 200:
        print("✅ Report synced to Notion")
    else:
        print(f"❌ Failed to sync: {response.text}")

if __name__ == "__main__":
    sync_daily_report()
```
## Advanced Use Cases

### Multi-Project Analysis

Analyze sessions across multiple projects:

```bash
#!/bin/bash
# multi-project-analysis.sh

PROJECTS=("web-app" "mobile-app" "api-service")
# Use an array so "7 days ago" survives word splitting when expanded
DATE_RANGE=(--from "7 days ago" --to today)

for project in "${PROJECTS[@]}"; do
  echo "Analyzing $project..."

  # Configure project-specific data sources
  cat > ".dailyviberc.$project.json" << EOF
{
  "dataSources": {
    "claudeCode": {
      "paths": ["~/.claude/projects/$project/**/*.jsonl"]
    }
  }
}
EOF

  # Analyze the project
  daily-vibe --config ".dailyviberc.$project.json" \
    analyze range "${DATE_RANGE[@]}" \
    --out "./analysis/$project"
done

# Combine project analyses
python3 << 'EOF'
import json
import glob

combined_data = {
    "projects": {},
    "totals": {"sessions": 0, "hours": 0}
}

for project_file in glob.glob("./analysis/*/data.json"):
    project_name = project_file.split("/")[2]
    with open(project_file) as f:
        data = json.load(f)
    combined_data["projects"][project_name] = data
    combined_data["totals"]["sessions"] += data["statistics"]["totalSessions"]
    combined_data["totals"]["hours"] += data["statistics"]["totalHours"]

with open("./combined-analysis.json", "w") as f:
    json.dump(combined_data, f, indent=2)

print(f"📊 Combined Analysis: {combined_data['totals']['sessions']} sessions across {len(combined_data['projects'])} projects")
EOF
```
### Learning Path Tracking

Track your learning progress in specific technologies:

```bash
#!/bin/bash
# learning-tracker.sh

TECHNOLOGIES=("typescript" "react" "node.js" "docker")

for tech in "${TECHNOLOGIES[@]}"; do
  echo "📚 Learning Progress for $tech"
  echo "================================"

  # Extract knowledge related to the technology
  daily-vibe analyze range --from "30 days ago" --to today --json | \
    jq --arg tech "$tech" -r '
      .knowledge.problemSolution[] |
      select(.tags[] | ascii_downcase | contains($tech)) |
      "📝 " + .problem + " → " + .solution
    '

  echo ""
done
```
### Performance Analysis

Analyze development performance patterns:

```python
# performance-analysis.py
import json
import subprocess
from datetime import datetime, timedelta

import matplotlib.pyplot as plt

def analyze_performance_trends():
    """Analyze development performance over time."""
    # Collect data for the last 30 days, oldest first so the charts
    # read chronologically left to right
    performance_data = []

    for i in range(29, -1, -1):
        date = datetime.now() - timedelta(days=i)
        date_str = date.strftime("%Y-%m-%d")

        try:
            # Run Daily Vibe analysis for a single day
            result = subprocess.run([
                'daily-vibe', 'analyze', 'range',
                '--from', date_str,
                '--to', date_str,
                '--json'
            ], capture_output=True, text=True)

            if result.returncode == 0:
                data = json.loads(result.stdout)
                performance_data.append({
                    'date': date_str,
                    'sessions': data['statistics']['totalSessions'],
                    'hours': data['statistics']['totalHours'],
                    'test_success_rate': data.get('testResults', {}).get('successRate', 0)
                })
        except Exception as e:
            print(f"Error analyzing {date_str}: {e}")

    # Generate performance charts
    dates = [d['date'] for d in performance_data]
    sessions = [d['sessions'] for d in performance_data]
    hours = [d['hours'] for d in performance_data]

    fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(12, 8))

    # Sessions over time
    ax1.plot(dates, sessions, marker='o')
    ax1.set_title('Development Sessions Over Time')
    ax1.set_ylabel('Sessions per Day')
    ax1.tick_params(axis='x', rotation=45)

    # Hours over time
    ax2.plot(dates, hours, marker='s', color='orange')
    ax2.set_title('Development Hours Over Time')
    ax2.set_ylabel('Hours per Day')
    ax2.set_xlabel('Date')
    ax2.tick_params(axis='x', rotation=45)

    plt.tight_layout()
    plt.savefig('performance-trends.png')
    print("📈 Performance trends saved to performance-trends.png")

if __name__ == "__main__":
    analyze_performance_trends()
```
## Best Practices

### Regular Analysis Schedule

Set up automated regular analysis:

```bash
# Add to crontab (crontab -e)

# Daily analysis at 6 PM
0 18 * * * /usr/local/bin/daily-vibe analyze today --out ~/reports/daily/$(date +\%Y-\%m-\%d)

# Weekly analysis every Monday at 9 AM
0 9 * * 1 /usr/local/bin/daily-vibe analyze range --from "7 days ago" --to yesterday --out ~/reports/weekly/week-$(date +\%U)

# Monthly analysis on the 1st of each month
0 10 1 * * /usr/local/bin/daily-vibe analyze range --from "1 month ago" --to yesterday --out ~/reports/monthly/$(date +\%Y-\%m)
```
### Configuration Management

Maintain different configurations for different contexts.

Personal projects (relaxed redaction):

```json
{
  "llm": {
    "provider": "openai",
    "model": "gpt-4",
    "apiKey": "${PERSONAL_OPENAI_KEY}"
  },
  "redact": {
    "enabled": false
  },
  "analysis": {
    "includeSystemMessages": true
  }
}
```
Work projects (strict redaction):

```json
{
  "llm": {
    "provider": "anthropic",
    "model": "claude-3-sonnet-20240229",
    "apiKey": "${WORK_CLAUDE_KEY}"
  },
  "redact": {
    "enabled": true,
    "patterns": [
      "company\\.com",
      "internal-[a-zA-Z0-9-]+",
      "prod-[a-zA-Z0-9-]+",
      "staging-[a-zA-Z0-9-]+",
      "\\b[A-Za-z0-9._%+-]+@company\\.com\\b"
    ]
  },
  "analysis": {
    "includeSystemMessages": false,
    "minSessionDuration": 900
  }
}
```
Cost-conscious setup (generic OpenAI-compatible provider):

```json
{
  "llm": {
    "provider": "generic",
    "baseUrl": "https://api.deepseek.com/v1",
    "model": "deepseek-coder",
    "apiKey": "${DEEPSEEK_API_KEY}"
  },
  "redact": {
    "enabled": true,
    "patterns": [
      "sk-[a-zA-Z0-9]{20,}",
      "\\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\\.[A-Za-z]{2,}\\b"
    ]
  }
}
```
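Before committing a shared configuration, it can help to sanity-check that every redact pattern at least compiles as a regular expression. The sketch below uses Python's `re` module; the exact regex dialect Daily Vibe accepts isn't documented here, so treat this as a rough lint rather than a guarantee:

```python
import json
import re

def invalid_redact_patterns(config_path: str) -> list:
    """Return the redact patterns in a config file that fail to compile."""
    with open(config_path) as f:
        config = json.load(f)
    bad = []
    for pattern in config.get("redact", {}).get("patterns", []):
        try:
            re.compile(pattern)
        except re.error:
            bad.append(pattern)
    return bad

# Usage: invalid_redact_patterns(".dailyviberc.json") returning [] means all patterns compile
```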
> **Tip:** Use different configurations for different contexts to optimize cost, privacy, and analysis quality.
## Next Steps

Now that you've seen various examples, you can:
- Set up automation for your workflow
- Contribute improvements to Daily Vibe
- Join the community to share your use cases