MySQL Analysis with an AI-Powered CLI Tool
As MySQL DBAs, we often live in a Linux terminal window, and we enjoy free options when they're available. This post shows an approach that lets us stay in the terminal and still use an AI-powered tool. You can adapt it to call other AI providers directly, but I set this example up to use aimlapi.com because it brings multiple AI models to your terminal for free with limited use, or at very low cost for heavier testing.
Note: I'm not a paid spokesperson for AIMLAPI or anything - this is just an easy example to highlight the idea.
The Problem
You're looking at a legacy database with hundreds of tables, each with complex relationships and questionable design decisions made years ago. The usual process involves:
- Manual schema inspection
- Cross-referencing documentation (if it exists)
- Running multiple EXPLAIN queries
- Consulting best practice guides
- Seeking second opinions from colleagues
This takes time and you often miss things.
A CLI-Based Approach
We can take advantage of AI directly from the CLI for any number of tasks; MySQL analysis is just one example of how this fits into daily database work. By combining MySQL's native introspection capabilities with AI models, all accessible through a simple command-line interface, we get insights without leaving the terminal. AIMLAPI provides free, limited-use access to over 100 AI models, and for heavier testing the costs remain very reasonable.
The Tool: AIMLAPI CLI
Here's a bash script that provides access to 100+ AI models through a single interface:
#!/bin/bash
# AIMLAPI CLI tool with access to 100+ AI models
# File: ~/.local/bin/aiml
# Configuration
DEFAULT_MODEL=${AIMLAPI_DEFAULT_MODEL:-"gpt-4o"}
MAX_TOKENS=${AIMLAPI_MAX_TOKENS:-2000}
TEMPERATURE=${AIMLAPI_TEMPERATURE:-0.7}
BASE_URL="https://api.aimlapi.com"
ENDPOINT="v1/chat/completions"
# Color codes for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
PURPLE='\033[0;35m'
CYAN='\033[0;36m'
NC='\033[0m' # No Color
# Function to print colored output
print_info() { echo -e "${BLUE}[INFO]${NC} $1"; }
print_success() { echo -e "${GREEN}[SUCCESS]${NC} $1"; }
print_warning() { echo -e "${YELLOW}[WARNING]${NC} $1"; }
print_error() { echo -e "${RED}[ERROR]${NC} $1"; }
print_model() { echo -e "${PURPLE}[MODEL]${NC} $1"; }
# Popular model shortcuts
declare -A MODEL_SHORTCUTS=(
# OpenAI Models
["gpt4"]="gpt-4o"
["gpt4o"]="gpt-4o"
["gpt4mini"]="gpt-4o-mini"
["o1"]="o1-preview"
["o3"]="openai/o3-2025-04-16"
# Claude Models
["claude"]="claude-3-5-sonnet-20241022"
["claude4"]="anthropic/claude-sonnet-4"
["opus"]="claude-3-opus-20240229"
["haiku"]="claude-3-5-haiku-20241022"
["sonnet"]="claude-3-5-sonnet-20241022"
# DeepSeek Models
["deepseek"]="deepseek-chat"
["deepseek-r1"]="deepseek/deepseek-r1"
["reasoner"]="deepseek-reasoner"
# Google Models
["gemini"]="gemini-2.0-flash"
["gemini2"]="gemini-2.0-flash"
["gemini15"]="gemini-1.5-pro"
# Meta Llama Models
["llama"]="meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo"
["llama405b"]="meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo"
# Qwen Models
["qwen"]="qwen-max"
["qwq"]="Qwen/QwQ-32B"
# Grok Models
["grok"]="x-ai/grok-beta"
["grok3"]="x-ai/grok-3-beta"
# Specialized Models
["coder"]="Qwen/Qwen2.5-Coder-32B-Instruct"
)
# Function to resolve model shortcuts
resolve_model() {
local model="$1"
if [[ -n "${MODEL_SHORTCUTS[$model]}" ]]; then
echo "${MODEL_SHORTCUTS[$model]}"
else
echo "$model"
fi
}
# Function to create JSON payload using jq for proper escaping
create_json_payload() {
local model="$1"
local prompt="$2"
local system_prompt="$3"
local temp_file=$(mktemp)
echo "$prompt" > "$temp_file"
if [ -n "$system_prompt" ]; then
jq -n --arg model "$model" \
--rawfile prompt "$temp_file" \
--arg system "$system_prompt" \
--argjson max_tokens "$MAX_TOKENS" \
--argjson temperature "$TEMPERATURE" \
'{
model: $model,
messages: [{role: "system", content: $system}, {role: "user", content: $prompt}],
max_tokens: $max_tokens,
temperature: $temperature
}'
else
jq -n --arg model "$model" \
--rawfile prompt "$temp_file" \
--argjson max_tokens "$MAX_TOKENS" \
--argjson temperature "$TEMPERATURE" \
'{
model: $model,
messages: [{role: "user", content: $prompt}],
max_tokens: $max_tokens,
temperature: $temperature
}'
fi
rm -f "$temp_file"
}
# Function to call AIMLAPI
call_aimlapi() {
local prompt="$1"
local model="$2"
local system_prompt="$3"
if [ -z "$AIMLAPI_API_KEY" ]; then
print_error "AIMLAPI_API_KEY not set"
return 1
fi
model=$(resolve_model "$model")
local json_file=$(mktemp)
create_json_payload "$model" "$prompt" "$system_prompt" > "$json_file"
local response_file=$(mktemp)
local http_code=$(curl -s -w "%{http_code}" -X POST "${BASE_URL}/${ENDPOINT}" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $AIMLAPI_API_KEY" \
--data-binary @"$json_file" \
-o "$response_file")
if [ "$http_code" -ne 200 ] && [ "$http_code" -ne 201 ]; then
print_error "HTTP Error $http_code"
cat "$response_file" >&2
rm -f "$json_file" "$response_file"
return 1
fi
local content=$(jq -r '.choices[0].message.content // empty' "$response_file" 2>/dev/null)
if [ -z "$content" ]; then
content=$(jq -r '.choices[0].text // .message.content // .content // empty' "$response_file" 2>/dev/null)
fi
if [ -z "$content" ]; then
local error_msg=$(jq -r '.error.message // .error // empty' "$response_file" 2>/dev/null)
if [ -n "$error_msg" ]; then
echo "API Error: $error_msg"
else
echo "Error: Unable to parse response from API"
fi
else
echo "$content"
fi
rm -f "$json_file" "$response_file"
}
# Main function with argument parsing
main() {
local model="$DEFAULT_MODEL"
local system_prompt=""
local prompt=""
local piped_input=""
if [ -p /dev/stdin ]; then
piped_input=$(cat)
fi
# Parse arguments
while [[ $# -gt 0 ]]; do
case $1 in
-m|--model)
model="$2"
shift 2
;;
-s|--system)
system_prompt="$2"
shift 2
;;
*)
prompt="$*"
break
;;
esac
done
# Handle input
if [ -n "$piped_input" ] && [ -n "$prompt" ]; then
prompt="$prompt
Here is the data to analyze:
$piped_input"
elif [ -n "$piped_input" ]; then
prompt="Please analyze this data:
$piped_input"
elif [ -z "$prompt" ]; then
echo "Usage: aiml [options] \"prompt\""
echo " command | aiml [options]"
exit 1
fi
local resolved_model=$(resolve_model "$model")
print_info "Querying $resolved_model..."
local response=$(call_aimlapi "$prompt" "$model" "$system_prompt")
echo ""
print_model "Response from $resolved_model:"
echo "----------------------------------------"
echo "$response"
echo "----------------------------------------"
}
# Check dependencies
check_dependencies() {
command -v curl >/dev/null 2>&1 || { print_error "curl required but not installed."; exit 1; }
command -v jq >/dev/null 2>&1 || { print_error "jq required but not installed."; exit 1; }
}
check_dependencies
main "$@"
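One detail in the script worth highlighting: the payload is built with jq rather than shell string interpolation, so prompts containing quotes, backslashes, or newlines (which DDL output is full of) are escaped correctly. A quick standalone check of that behavior:

```shell
# Build the same kind of chat payload the script sends, from a prompt
# that would break naive string interpolation into JSON:
prompt='analyze this DDL: `Name` char(35) NOT NULL DEFAULT ""
(second line of the prompt)'
payload=$(jq -n --arg model "gpt-4o" --arg prompt "$prompt" \
  '{model: $model, messages: [{role: "user", content: $prompt}]}')
# The prompt round-trips intact, newline and quotes included:
echo "$payload" | jq -r '.messages[0].content'
```

Had the script interpolated $prompt into a JSON string by hand, the embedded double quotes and newline would have produced an invalid request body.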
This script provides access to various AI models through simple shortcuts like claude4, gpt4, grok3, and so on. AIMLAPI offers free, limited-use access to all of these models, with reasonable costs for additional testing - good for DBAs who want to experiment without breaking the budget.
Script Features
The full script includes comprehensive help (the listing above is trimmed to the core request path). Here's what aiml --help shows:
AIMLAPI CLI Tool - Access to 100+ AI Models
==============================================
Usage: aiml [OPTIONS] "prompt"
command | aiml [OPTIONS]
Core Options:
-m, --model MODEL Model to use (default: gpt-4o)
-t, --tokens NUMBER Max tokens (default: 2000)
-T, --temperature FLOAT Temperature 0.0-2.0 (default: 0.7)
-s, --system PROMPT System prompt for model behavior
Input/Output Options:
-f, --file FILE Read prompt from file
-o, --output FILE Save response to file
-r, --raw Raw output (no formatting/colors)
Information Options:
-l, --list List popular model shortcuts
--get-models Fetch all available models from API
-c, --config Show current configuration
-v, --verbose Enable verbose output
-d, --debug Show debug information
-h, --help Show this help
Basic Examples:
aiml "explain quantum computing"
aiml -m claude "review this code"
aiml -m deepseek-r1 "solve this math problem step by step"
aiml -m grok3 "what are the latest AI developments?"
aiml -m coder "optimize this Python function"
Pipe Examples:
ps aux | aiml "analyze these processes"
netstat -tuln | aiml "explain these network connections"
cat error.log | aiml -m claude "diagnose these errors"
git diff | aiml -m coder "review these code changes"
df -h | aiml "analyze disk usage and suggest cleanup"
File Operations:
aiml -f prompt.txt -o response.txt
aiml -f large_dataset.csv -m llama405b "analyze this data"
cat script.py | aiml -m coder -o review.md "code review"
Model Categories & Shortcuts:
OpenAI: gpt4, gpt4mini, o1, o3
Claude: claude, opus, haiku, sonnet, claude4
DeepSeek: deepseek, deepseek-r1, reasoner
Google: gemini, gemini2, gemma
Meta: llama, llama3, llama4, llama405b
Qwen: qwen, qwen2, qwq
Grok: grok, grok3, grok3mini
Coding: coder, codestral
Advanced Usage:
aiml -m claude -s "You are a security expert" "audit this code"
aiml -m deepseek-r1 -t 3000 "complex reasoning task"
aiml -v -m grok3 "verbose query with detailed logging"
aiml -d "debug mode to troubleshoot API issues"
Model Discovery:
aiml -l # Show popular shortcuts
aiml --get-models # Fetch all available models from API
aiml --config # Show current configuration
Environment Variables:
AIMLAPI_API_KEY - Your AIMLAPI key (required)
AIMLAPI_DEFAULT_MODEL - Default model (optional)
AIMLAPI_MAX_TOKENS - Default max tokens (optional)
AIMLAPI_TEMPERATURE - Default temperature (optional)
Pro Tips:
• Use coder for programming tasks and code reviews
• Use deepseek-r1 for complex reasoning and math problems
• Use claude4 for detailed analysis and long-form content
• Use grok3 for current events and real-time information
• Use gpt4mini for quick questions to save on API costs
• Pipe command output directly: command | aiml "analyze this"
• Use -v for verbose output to see what model is being used
• Use --get-models to see all 100+ available models
Access to 100+ AI models through one simple interface!
Example: The City Table
Here's how this works with an actual MySQL table analysis. I'll analyze the City table from the classic World sample database (see the Example Databases section at https://dev.mysql.com/doc/index-other.html) using three different AI models.
The Command
mysql --login-path=klarson world -e "show create table City\G" | \
aiml --model claude4 "Using a MySQL expert point of view analyze this table"
This command:
- Extracts the table structure from MySQL
- Pipes it to our AI tool
- Gets analysis from Claude Sonnet 4
Results
Claude Sonnet 4 Analysis
Claude 4 provided a well-organized analysis:
Strengths:
- Proper AUTO_INCREMENT primary key for InnoDB efficiency
- Foreign key constraints maintaining referential integrity
- Appropriate indexing strategy for common queries
Issues Found:
- Storage Inefficiency: Using CHAR(35) for variable-length city names wastes space
- Character Set Limitation: latin1 charset inadequate for international city names
- Suboptimal Indexing: name_key index only covers first 5 characters
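The prefix-index point is easy to demonstrate without a database: many city names share their first five characters, so a Name(5) prefix index (like the name_key index flagged above) can't discriminate between them. A quick local illustration:

```shell
# Count how many of these city names collide on their first 5 chars:
printf '%s\n' Springfield Springdale Springville Newton Newport |
  cut -c1-5 | sort | uniq -c | sort -rn
```

Any lookup for Springfield still has to read and compare the full rows for every "Sprin" match the index returns.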
Suggested Improvements:
-- Claude's suggested optimized structure
CREATE TABLE `City` (
`ID` int NOT NULL AUTO_INCREMENT,
`Name` VARCHAR(35) NOT NULL,
`CountryCode` CHAR(3) NOT NULL,
`District` VARCHAR(20) NOT NULL,
`Population` int UNSIGNED NOT NULL DEFAULT '0',
PRIMARY KEY (`ID`),
KEY `CountryCode` (`CountryCode`),
KEY `name_idx` (`Name`),
KEY `country_name_idx` (`CountryCode`, `Name`),
KEY `population_idx` (`Population`),
CONSTRAINT `city_ibfk_1` FOREIGN KEY (`CountryCode`)
REFERENCES `Country` (`Code`) ON DELETE RESTRICT ON UPDATE CASCADE
) ENGINE=InnoDB AUTO_INCREMENT=4080
DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
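If you would rather migrate the existing table in place than recreate it, the equivalent ALTERs look roughly like this. This is a sketch, not Claude's verbatim output: both statements rewrite the table, so test on a copy first. Note also that InnoDB requires matching character sets on the two sides of a foreign key, so the parent Country table has to be converted as well, with FK checks disabled for the session while the charsets are temporarily out of sync:

```shell
mysql world <<'SQL'
-- Charset conversion touches the FK columns on both sides, so
-- disable FK checks, convert both tables, then re-enable.
SET foreign_key_checks = 0;
ALTER TABLE Country CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
ALTER TABLE City
  MODIFY Name VARCHAR(35) NOT NULL,
  MODIFY District VARCHAR(20) NOT NULL,
  MODIFY Population INT UNSIGNED NOT NULL DEFAULT 0;
ALTER TABLE City CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
SET foreign_key_checks = 1;
SQL
```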
Grok 3 Beta Analysis (The Comprehensive Reviewer)
mysql --login-path=klarson world -e "show create table City\G" | \
aiml --model grok3 "Using a MySQL expert point of view analyze this table"
Grok 3 provided an exhaustive, detailed analysis covering:
Technical Deep Dive:
- Performance Impact Analysis: Evaluated the partial index limitation in detail
- Storage Engine Benefits: Confirmed InnoDB choice for transactional integrity
- Data Type Optimization: Detailed space-saving recommendations with examples
Advanced Considerations:
- Full-text indexing recommendations for city name searches
- Character set migration procedures with specific commands
- Partitioning strategies for large datasets
Implementation Guidelines:
-- Grok's character set migration suggestion
ALTER TABLE City CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
-- Full-text index recommendation
ALTER TABLE City ADD FULLTEXT INDEX name_fulltext (Name);
GPT-4o Analysis (The Practical Advisor)
mysql --login-path=klarson world -e "show create table City\G" | \
aiml --model gpt4 "Using a MySQL expert point of view analyze this table"
GPT-4o focused on practical, immediately actionable improvements:
Pragmatic Assessment:
- Validated the AUTO_INCREMENT primary key design
- Confirmed foreign key constraint benefits for data integrity
- Identified character set limitations for global applications
Ready-to-Implement Suggestions:
- Specific ALTER TABLE commands for immediate optimization
- Query pattern analysis recommendations
- Index effectiveness evaluation criteria
The Power of Multi-Model Analysis
What makes this approach valuable is getting three distinct perspectives:
- Claude 4: Provides detailed, structured analysis with concrete code solutions
- Grok 3: Offers comprehensive coverage with advanced optimization strategies
- GPT-4o: Delivers practical, immediately actionable recommendations
Each model brings unique strengths:
- Different focal points: Storage optimization vs. performance vs. maintainability
- Varying depth levels: From quick wins to architectural improvements
- Diverse analysis styles: Structured vs. comprehensive vs. practical
Implementing the Workflow
Setup Instructions
1. Install Dependencies:
# Install required tools
sudo apt install curl jq mysql-client
# Create the script directory and save the script above as ~/.local/bin/aiml
mkdir -p ~/.local/bin
# Make the script executable (and ensure ~/.local/bin is on your PATH)
chmod +x ~/.local/bin/aiml
2. Configure API Access:
# Get your free AIMLAPI key from https://aimlapi.com (free tier with limited use)
export AIMLAPI_API_KEY="your-free-api-key-here"
echo 'export AIMLAPI_API_KEY="your-free-api-key-here"' >> ~/.bashrc
3. Test the Setup:
# Verify configuration
aiml --config
# Test basic functionality
echo "SELECT VERSION();" | aiml "explain this SQL"
Practical Usage Patterns
Quick Table Analysis
# Analyze a specific table
mysql -e "SHOW CREATE TABLE users\G" mydb | \
aiml -m claude4 "Analyze this MySQL table structure"
Compare Different Model Perspectives
# Get multiple viewpoints on the same table
TABLE_DDL=$(mysql -e "SHOW CREATE TABLE orders\G" ecommerce)
echo "$TABLE_DDL" | aiml -m claude4 "MySQL table analysis"
echo "$TABLE_DDL" | aiml -m grok3 "Performance optimization review"
echo "$TABLE_DDL" | aiml -m gpt4 "Practical improvement suggestions"
Analyze Multiple Tables
# Quick analysis of all tables in a database (-Ns drops the header row)
mysql -Ns -e "SHOW TABLES;" mydb | \
while read table; do
echo "=== Analyzing $table ==="
mysql -e "SHOW CREATE TABLE $table\G" mydb | \
aiml -m gpt4mini "Quick assessment of this table"
done
Index Analysis
# Review index usage and optimization
mysql -e "SHOW INDEX FROM tablename;" database | \
aiml -m deepseek "Suggest index optimizations for this MySQL table"
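SHOW INDEX output is wide and repetitive, so condensing it locally before piping keeps the prompt small and focused. A sketch that folds each index into a single "index: columns" line - the printf stands in for real mysql -B batch output (tab-separated; only the first five SHOW INDEX columns are used):

```shell
# Condense SHOW INDEX rows to one line per index before piping to AI:
printf '%s\t%s\t%s\t%s\t%s\n' \
  Table Non_unique Key_name Seq_in_index Column_name \
  City 0 PRIMARY 1 ID \
  City 1 CountryCode 1 CountryCode \
  City 1 country_name_idx 1 CountryCode \
  City 1 country_name_idx 2 Name |
awk -F'\t' 'NR > 1 { cols[$3] = cols[$3] ($4 > 1 ? "+" : "") $5 }
            END { for (k in cols) print k ": " cols[k] }' | sort
```

In real use, replace the printf with mysql -B -e "SHOW INDEX FROM tablename" database and pipe the condensed list into aiml.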
Query Performance Analysis
# Review current activity for potential slow queries
mysql -e "SHOW PROCESSLIST;" | \
aiml -m grok3 "Identify potential performance issues in these MySQL processes"
Why AIMLAPI Makes This Possible for DBAs
Free Access with Reasonable Costs: AIMLAPI provides free access with limited use to over 100 AI models, with very reasonable pricing for additional testing. This makes it perfect for DBAs who want to experiment without committing to expensive subscriptions.
Model Diversity: Access to models from different providers (OpenAI, Anthropic, Google, Meta, etc.) means you get varied perspectives and expertise areas.
No Vendor Lock-in: You can experiment with different models to find what works best for your specific needs without long-term commitments.
Terminal-Native: Stays in your comfortable Linux environment where you're already doing your MySQL work.
Model Selection Guide
Different models excel at different aspects of MySQL analysis:
# For detailed structural analysis
aiml -m claude4 "Comprehensive table structure review"
# For performance-focused analysis
aiml -m grok3 "Performance optimization recommendations"
# For quick, practical suggestions
aiml -m gpt4 "Immediate actionable improvements"
# For complex reasoning about trade-offs
aiml -m deepseek-r1 "Complex optimization trade-offs analysis"
# For cost-effective quick checks
aiml -m gpt4mini "Brief table assessment"
Beyond MySQL: Other CLI Examples
Since we can pipe any command output to the AI tool, here are some other useful examples:
System Administration
# Analyze system processes
ps aux | aiml "what processes are using most resources?"
# Check disk usage
df -h | aiml "analyze disk usage and suggest cleanup"
# Network connections
netstat -tuln | aiml "explain these network connections"
# System logs
tail -50 /var/log/syslog | aiml "any concerning errors in these logs?"
File and Directory Analysis
# Large files
find /var -size +100M | aiml "organize these large files by type"
# Permission issues
ls -la /etc/mysql/ | aiml "check these file permissions for security"
# Configuration review
cat /etc/mysql/my.cnf | aiml "review this MySQL configuration"
Log Analysis
# Apache logs
tail -100 /var/log/apache2/error.log | aiml "summarize these web server errors"
# Auth logs
grep "Failed password" /var/log/auth.log | aiml "analyze these failed login attempts"
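For sensitive logs you may prefer to pre-digest locally and hand the AI only a summary, rather than shipping raw log lines to an external API. A dry run of that idea on a few sample auth.log lines (sample data, not a real log):

```shell
# Summarize failed logins by source IP before involving the AI:
cat <<'EOF' | grep "Failed password" | awk '{print $(NF-3)}' | sort | uniq -c | sort -rn
Jul  1 10:00:01 host sshd[123]: Failed password for root from 203.0.113.9 port 4242 ssh2
Jul  1 10:00:05 host sshd[124]: Failed password for admin from 203.0.113.9 port 4243 ssh2
Jul  1 10:00:09 host sshd[125]: Accepted password for kl from 198.51.100.7 port 4244 ssh2
EOF
```

Piping that one-line-per-IP summary into aiml gives the model the signal it needs without exposing every raw entry.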
The point is you can pipe almost anything to get quick analysis without leaving your terminal.
Custom System Prompts
Tailor the analysis to your specific context:
# E-commerce focus
aiml -m claude4 -s "You are analyzing tables for a high-traffic e-commerce site" \
"Review this table for scalability"
# Security focus
aiml -m grok3 -s "You are a security-focused database analyst" \
"Security assessment of this table structure"
# Legacy system focus
aiml -m gpt4 -s "You are helping migrate a legacy system to modern MySQL" \
"Modernization recommendations for this table"
Automated Reporting
# Generate a comprehensive database analysis report
DB_NAME="production_db"
REPORT_FILE="analysis_$(date +%Y%m%d).md"
echo "# Database Analysis Report for $DB_NAME" > "$REPORT_FILE"
echo "Generated on $(date)" >> "$REPORT_FILE"
for table in $(mysql -Ns -e "SHOW TABLES;" "$DB_NAME"); do
echo "" >> "$REPORT_FILE"
echo "## Table: $table" >> "$REPORT_FILE"
mysql -e "SHOW CREATE TABLE $table\G" "$DB_NAME" | \
aiml -m claude4 "Provide concise analysis of this MySQL table" >> "$REPORT_FILE"
done
Performance Optimization Workflow
# Comprehensive performance analysis
mysql -e "SHOW CREATE TABLE heavy_table\G" db | \
aiml -m grok3 "Performance bottleneck analysis"
# Follow up with index suggestions
mysql -e "SHOW INDEX FROM heavy_table;" db | \
aiml -m deepseek "Index optimization strategy"
# Get implementation plan
aiml -m gpt4 "Create step-by-step implementation plan for these optimizations"
Real Benefits of This Approach
- Speed: Get expert-level analysis in seconds instead of hours
- Multiple Perspectives: Different models catch different issues
- Learning Tool: Each analysis teaches you something new about MySQL optimization
- Cost-Effective: Thanks to AIMLAPI's free tier and reasonable pricing, this powerful analysis is accessible
- Consistency: Repeatable analysis across different tables and databases
- Documentation: Easy to generate reports and share findings with teams
Tips for Best Results
- Start with Structure: Always begin with SHOW CREATE TABLE for comprehensive analysis
- Use Specific Prompts: The more specific your request, the better the analysis
- Compare Models: Different models excel at different aspects - use multiple perspectives
- Validate Suggestions: Always test AI recommendations in development environments first
- Iterate: Use follow-up questions to dive deeper into specific recommendations
Getting Started Today
The beauty of this approach is its simplicity and cost-effectiveness. With just a few commands, you can:
- Get your free AIMLAPI key from https://aimlapi.com (includes free tier)
- Install the script (5 minutes)
- Start analyzing your MySQL tables immediately
- Experiment with different models to see which ones work best for your needs
- Use the free tier for regular analysis, pay only for heavy testing
Windows Users (Quick Option)
I'm not a Windows person, but if you need to run this on Windows, the simplest approach is:
- Install WSL2 (Windows Subsystem for Linux)
- Install Ubuntu from Microsoft Store
- Follow the Linux setup above inside WSL2
This gives you a proper Linux environment where the script will work exactly as designed.
This isn't about replacing DBA expertise - it's about augmenting it while staying in your terminal environment. The AI provides rapid analysis and catches things you might miss, while you provide the context and make the final decisions.
Whether you're working with a single table or a complex database with hundreds of tables, this workflow scales to meet your needs. And since AIMLAPI provides free access with reasonable costs for additional use, you can experiment and find the perfect combination for your specific use cases without budget concerns.
The combination of MySQL's powerful introspection capabilities with AI analysis creates a workflow that's both practical and cost-effective for DBAs. Give it a try on your next database optimization project - you might be surprised at what insights emerge, all while staying in your comfortable terminal environment.
@AnotherMySQLDBA