Hey there! I’m a Frontend Developer - my world is usually React and TypeScript, not SQL queries or database schemas. Honestly, my experience with backend databases has been pretty limited (and, let’s be real, a little intimidating).
But recently, I found myself needing to dive deep into backend database analysis and bug fixes - tasks I’d normally leave to the experts. Why does this matter? Because if someone like me, who lives and breathes frontend, can tackle complex backend work with a little help from AI, then anyone can! My story is proof that with the right tools, the lines between frontend and backend aren’t as rigid as they seem.
Let me show you how AI turned my “I can’t do this” into “I shipped it today”—and why you might want to try it too.
This page documents how I used an AI coding assistant to complete complex database analysis, bug investigation, and full-stack bug fixes. What would have taken me 2-3 days (or required backend team help) was completed in 4 hours.
Project: Learning Journey Template Display Feature
Stack: Java/Spring Boot backend, GraphQL, PostgreSQL, React frontend
Our enablement team needed to:
Query database to find users in specific learning programs
Understand complex mentor-mentee relationships
Fix a bug where template info returned NULL in API queries
Display learning journey template names on user profiles
My challenge: As a frontend dev, I'm not comfortable with:
Writing complex SQL queries
Debugging backend Java code
Understanding database schemas
Setting up database connections
What I Did: Got database credentials from our ops tool
What I Asked AI:
"Help me connect to the database with these credentials"
[Provided connection info with host, port, username, password]
What AI Did:
Created connection scripts for my terminal
Generated setup guides for DataGrip, DBeaver, psql
Tested the connection and verified it worked
Created documentation for future reference
Result: Connected in 5 minutes instead of 30+ minutes googling and troubleshooting.
What I Learned: AI can handle environment setup that would normally require reading docs.
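For the curious, the same connection can be sketched in plain Java with JDBC. The host, port, and database name below are placeholders (not our real credentials), and the actual connection requires the PostgreSQL JDBC driver on the classpath:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class DbConnect {
    // Builds a standard PostgreSQL JDBC URL from its parts.
    public static String buildJdbcUrl(String host, int port, String database) {
        return "jdbc:postgresql://" + host + ":" + port + "/" + database;
    }

    // Opens a connection; needs the PostgreSQL JDBC driver on the classpath.
    public static Connection connect(String host, int port, String db,
                                     String user, String password) throws SQLException {
        return DriverManager.getConnection(buildJdbcUrl(host, port, db), user, password);
    }

    public static void main(String[] args) {
        // Placeholder values, not real credentials.
        System.out.println(buildJdbcUrl("localhost", 5432, "enablement"));
    }
}
```

GUI tools like DataGrip and DBeaver ask for exactly these same pieces, so the URL format is worth knowing either way.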
What I Asked:
"Find user IDs of people in the Revenue learning journey"
What AI Did:
```sql
SELECT u.user_id, c.cohort_name, t.template_name
FROM cohorts c
JOIN cohort_members cm ON c.id = cm.cohort_id
JOIN users u ON cm.user_id = u.id
JOIN user_templates ut ON u.id = ut.user_id
JOIN templates t ON ut.template_id = t.id
WHERE c.type = 'REVENUE' AND c.deleted = false;
```
Output:
CSV with 125 users
User IDs list (one per line)
Grouped by cohort
Template information included
Result: Complete user list in 5 minutes vs 1+ hour of manual SQL writing (which I'd have struggled with).
What I Learned: AI understands table relationships from schema and writes complex JOINs.
What I Asked:
"Find users who mentor other users but also have their own mentor"
What AI Did:
Analyzed database schema to understand relationships
Wrote complex JOIN query across 5 tables
Found 3 users with dual roles (mentor + mentee)
Discovered 3-level mentor chains
Key Finding:
Mentor Chain:

```
User A (Top Mentor)
  └─> User B   ← Has dual role!
        └─> User C (Mentee)
```
Result: Complex analysis in 10 minutes that would have taken me 2+ hours (if I could even figure it out).
What I Learned: AI can handle multi-level relationship queries that are hard to visualize.
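The dual-role check boils down to one idea: a user who appears as a mentor in one relationship row and as a mentee in another. A minimal in-memory sketch of that logic (all names and IDs made up, not the real schema):

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public class DualRoles {
    // One mentorship edge: mentor -> mentee (IDs are hypothetical).
    public record Mentorship(long mentorId, long menteeId) {}

    // Users who mentor someone AND have a mentor themselves.
    public static Set<Long> findDualRoles(List<Mentorship> edges) {
        Set<Long> mentors = edges.stream()
                .map(Mentorship::mentorId)
                .collect(Collectors.toCollection(HashSet::new));
        Set<Long> mentees = edges.stream()
                .map(Mentorship::menteeId)
                .collect(Collectors.toSet());
        mentors.retainAll(mentees); // intersection = users with both roles
        return mentors;
    }

    public static void main(String[] args) {
        // The chain from the finding: A(1) -> B(2) -> C(3); B has the dual role.
        var chain = List.of(new Mentorship(1, 2), new Mentorship(2, 3));
        System.out.println(findDualRoles(chain)); // prints [2]
    }
}
```

In SQL the same intersection is a self-join on the mentorship table; the in-memory version just makes the set logic easier to see.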
What I Asked:
"Fields templateId and templateName return null in GraphQL profile query, but the search endpoint returns them correctly. Why?"
What AI Did:
Verified data exists in database (wrote and ran query)
Checked GraphQL schema files
Traced code flow through multiple Java files
Found root cause: Different endpoints used different data mappers
Created detailed analysis document
Root Cause Found:
```
// Search endpoint (works) ✅
UserMapper.mapUsersWithTemplates()
    → Fetches template data

// Profile endpoint (broken) ❌
UserMapper.mapBasicProfile()
    → Does NOT fetch template data
```
Result: Root cause identified in 20 minutes vs potentially days of debugging.
What I Learned: AI can trace code flow across files and identify logic differences.
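The divergence between the two endpoints can be sketched as two mapper paths over the same user, only one of which populates the template fields. Class, method, and data names here are illustrative, not the real UserMapper API:

```java
import java.util.Map;

public class MapperSketch {
    // Minimal profile shape; only the field relevant to the bug.
    public record Profile(long userId, String templateName) {}

    // Stand-in for the templates table (made-up data).
    static final Map<Long, String> TEMPLATES = Map.of(42L, "Revenue Journey");

    // Search path (works): fetches template data alongside the user.
    public static Profile mapUserWithTemplates(long userId) {
        return new Profile(userId, TEMPLATES.get(userId));
    }

    // Profile path (the bug): never looks up template data, so it stays null.
    public static Profile mapBasicProfile(long userId) {
        return new Profile(userId, null);
    }

    public static void main(String[] args) {
        System.out.println(mapUserWithTemplates(42L).templateName()); // Revenue Journey
        System.out.println(mapBasicProfile(42L).templateName());      // null
    }
}
```

The bug pattern is common: two endpoints share a model but not a mapper, so a field silently exists on one path and not the other.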
What I Asked:
"Create a git branch and fix this bug by adding template data fetching to the profile endpoint"
What AI Did:
Created feature branch: fix-template-null-in-profile
Modified UserService.java (added 13 lines)
Injected required repository dependency
Implemented template data fetching with null safety
Committed changes with proper message
Code Added:
```java
// Populate templateId and templateName
if (userProfile != null && userProfile.user() != null) {
    Long userId = userProfile.user().getId();
    List<Object[]> templateData = templateRepo.findUserTemplates(Set.of(userId));
    if (!templateData.isEmpty()) {
        Object[] row = templateData.get(0);
        userProfile.user().setTemplateId((Long) row[1]);
        userProfile.user().setTemplateName((String) row[2]);
    }
}
```
Result: Backend fix implemented in 30 minutes by a frontend dev! 🎉
What I Learned: AI writes code following existing patterns and best practices.
Challenge: CI/CD pipeline complained about code coverage < 80%
What I Asked:
"Pipeline says coverage is 57.1%. Add unit tests to get above 80%"
What AI Did:
Added 2 unit tests for new functionality
Fixed compilation errors iteratively
Added proper mocks for dependencies
Ensured >80% coverage
Tests Added:
```java
@Test
void testUserProfileWithTemplateData() {
    // Test that templateId and templateName are populated
}

@Test
void testUserProfileWithoutTemplateData() {
    // Test that null values are handled gracefully
}
```
Iterations:
First attempt: Compilation errors (missing imports)
Second attempt: Test failures (missing mocks)
Third attempt: ✅ All tests passing
Result: Tests written and passing in 1 hour vs 2-3 hours manually.
What I Learned: AI iterates on errors and fixes them, just like pair programming.
What I Asked:
"Create an epic and tickets for frontend and backend work"
What AI Did:
Created Epic: "Display Template Name on User Profiles"
Created Backend ticket with technical details (completed)
Created Frontend ticket with acceptance criteria and test users
Linked everything together
Result: Complete epic structure in 5 minutes vs 30+ minutes manual entry.
What I Learned: AI can format tickets with proper structure and links.
What I Asked:
"Create a PR for this branch"
What AI Did:
Created PR with comprehensive description
Included test cases and expected results
Added before/after examples
Linked to ticket
Result: Professional PR created in 2 minutes.
What I Asked:
"Create a Confluence page documenting the mentor relationships we found"
What AI Did:
Created structured Confluence page
Included all findings, mentor chains, diagrams
Added SQL queries for reproducibility
Formatted with tables and clear sections
Additional Docs Created:
Test user guides (who to use for testing which scenario)
Database connection guides (for next time)
Root cause analysis document
Result: Professional documentation in 30 minutes vs 2+ hours of writing.
What made this work so well:
Context across the stack: AI followed the chain from database schema → backend code → GraphQL schema → frontend, connected business requirements to technical implementation, and explained backend concepts in frontend-friendly terms.
Patient iteration: it fixed compilation errors on each attempt, adjusted queries based on results, and added missing mocks when tests failed. No frustration, it just kept trying.
Quality by default: it followed existing code patterns, added proper error handling, created comprehensive tests, and wrote clean commit messages.
Teaching along the way: it explained why certain JOINs were needed, showed me backend patterns, taught me testing patterns, and made me a better full-stack dev.
Me: "Find users in Revenue program with template info"
AI: Immediately produced:
Multi-table JOIN query
Correct filtering for soft-deleted records
Output in multiple formats (CSV, TXT, MD)
Test user recommendations
Me: "Tests failing with PermissionDenied error"
AI: Diagnosed and fixed:
Identified missing security mock
Added correct mock pattern from other tests
Verified fix would work in CI/CD
Committed with descriptive message
Me: "Why do we need this JOIN?"
AI: Explained:
Table relationship diagram
Why LEFT JOIN vs INNER JOIN
What data would be missing without it
Alternative query approaches
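That LEFT vs INNER JOIN distinction is easy to see with a tiny in-memory analogue, with maps standing in for tables (all data made up): an inner join drops users with no template row, while a left join keeps them with a null template.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class JoinDemo {
    // INNER JOIN analogue: keep only users that have a matching template row.
    public static Map<Long, String> innerJoin(Map<Long, String> users,
                                              Map<Long, String> templates) {
        Map<Long, String> out = new LinkedHashMap<>();
        for (Long id : users.keySet()) {
            if (templates.containsKey(id)) {
                out.put(id, templates.get(id));
            }
        }
        return out;
    }

    // LEFT JOIN analogue: keep every user; template is null when no row matches.
    public static Map<Long, String> leftJoin(Map<Long, String> users,
                                             Map<Long, String> templates) {
        Map<Long, String> out = new LinkedHashMap<>();
        for (Long id : users.keySet()) {
            out.put(id, templates.get(id)); // null when absent
        }
        return out;
    }

    public static void main(String[] args) {
        Map<Long, String> users = new LinkedHashMap<>();
        users.put(1L, "alice");
        users.put(2L, "bob"); // bob has no template row
        Map<Long, String> templates = new LinkedHashMap<>();
        templates.put(1L, "Revenue Journey");

        System.out.println(innerJoin(users, templates)); // {1=Revenue Journey}
        System.out.println(leftJoin(users, templates));  // {1=Revenue Journey, 2=null}
    }
}
```

This is why a query that should list every user needs LEFT JOINs onto optional data like templates: an INNER JOIN would silently hide users who haven't been assigned one.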
Be specific:
❌ "Help with database"
✅ "Find user IDs of people in the Revenue learning journey with their template names"
Give context:
Share complete error messages
Reference specific files and line numbers
Show what you've already tried
Explain your technical background
Iterate together:
Let AI fix compilation errors
Provide test results for debugging
Refine queries based on output
Ask "why" when you don't understand
Verify everything:
Review generated code (don't blindly trust)
Test queries in non-prod first
Validate test coverage
Check security implications
Keep learning:
Ask AI to explain unfamiliar concepts
Request alternative approaches
Understand the "why" not just the "how"
Save patterns for future use
Where AI excelled:
✅ Complex SQL query generation
✅ Backend code pattern recognition
✅ Root cause analysis across layers
✅ Documentation creation
✅ Test writing
✅ Iterative debugging
✅ Teaching/explaining concepts
What still needed me:
Business requirements definition
Design decisions
Final code review
Production deployment decisions
Security verification
Prioritization
Great use cases for AI:
Database analysis (especially if you're not a DB expert)
Bug investigation (across unfamiliar codebases)
Full-stack features (when you're primarily frontend/backend)
Unit test creation (saves tons of time)
Documentation (generates professional docs)
Learning new tech (explains while building)
Where to stay cautious:
Critical security decisions
Architectural choices
Production data operations
Compliance-related changes
Using AI transformed what would have been:
❌ "I need backend team help" (multi-day delay)
❌ "I can't do this myself" (skill gap)
❌ "This will take forever" (learning curve)
Into:
✅ "I can investigate this myself" (immediate start)
✅ "I can fix backend bugs" (skill expansion)
✅ "I can ship this today" (velocity)
Would I do it again? Absolutely. The time savings and skill development made this a huge win.
Should you try it? Yes! If you're willing to:
Review and understand the code (don't just copy-paste)
Test thoroughly
Ask questions when unsure
Take responsibility for the output
Then AI can help you work across the full stack confidently.
Anuj Shah