# Project Breakdown and Test-Driven Development Guide for LLM

## Task Preparation Workflow

### User Interaction Trigger

When a user requests "please prepare my tasks", follow this precise workflow:

#### Task Preparation Steps

1. Implementation Plan Creation
   - [ ] Create an implementation plan
   - [ ] Save to "instructions" directory
   - [ ] Use .md format with comprehensive checklist
2. Project Decomposition
   - [ ] Break implementation into distinct phases
   - [ ] Save each phase as a numerically named .md file
   - [ ] Store phase files in "instructions" subdirectory
3. Detailed Breakdown Process
   - [ ] For each phase, create a task list
   - [ ] Generate a checklist for tasks
   - [ ] Save task lists in respective phase folders
4. Granular Task Specification
   - [ ] For each task, identify subtasks
   - [ ] Create subtask checklists
   - [ ] Save subtask lists in phase folders

## Project Decomposition Methodology

### Phase 1: Comprehensive Project Analysis

1. Project Scope Definition
   - [ ] Identify core project objectives
   - [ ] List all major functional requirements
   - [ ] Create a high-level system architecture diagram
2. Granular Decomposition Process
   - Break down the project into hierarchical components:
     a. Major Functional Modules
     b. Specific Features
     c. Discrete Tasks
     d. Atomic Subtasks

### Component Specification Workflow

1. Subtask Component Identification
   - [ ] List required:
     * Functions
     * Types
     * Classes
     * Other necessary components
2. Detailed Specification for Each Component
   - [ ] Write a comprehensive function specification
   - [ ] Define precise input types
   - [ ] Define exact return types
   - [ ] Create detailed test cases

### Test-Driven Development (TDD) Workflow

#### Test Specification Precedence

1. Test Definition Requirements
   - [ ] Define tests BEFORE implementation
   - [ ] Specify success criteria explicitly
   - [ ] Create comprehensive test cases covering:
     * Normal use cases
     * Edge cases
     * Error scenarios
     * Performance expectations
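The test-definition requirements above can be sketched in Python. This is a minimal illustration, not part of the guide itself: the `slugify` function, its behavior, and the specific test cases are all hypothetical. The point is the ordering — the tests exist (and fail) before any implementation is written.

```python
import re
import unittest

# Step 1: the tests are written FIRST, directly from the test specification,
# covering a normal case, an edge case, and an error scenario.
# slugify() and its expected behavior are hypothetical examples.
class TestSlugify(unittest.TestCase):
    def test_standard_scenario(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_edge_case_empty_string(self):
        self.assertEqual(slugify(""), "")

    def test_error_scenario_non_string(self):
        with self.assertRaises(TypeError):
            slugify(42)

# Step 2: only after the tests exist (and initially fail with a NameError)
# is the minimal implementation written to make them pass.
def slugify(text):
    if not isinstance(text, str):
        raise TypeError("slugify() expects a string")
    # Lowercase, then collapse runs of non-alphanumerics into single hyphens.
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

if __name__ == "__main__":
    unittest.main(exit=False)
```

Running the file before step 2 exists demonstrates the required failing test; running it afterwards confirms the minimal implementation satisfies the specification.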
#### Test Specification Template

```markdown
## Test Specification for [Task Name]

### Objectives
- Precise description of expected behavior

### Input Specifications
- Types
- Constraints
- Valid/invalid input ranges

### Expected Outputs
- Exact return types
- Expected value ranges
- Error handling specifications

### Test Cases
1. Standard Scenario
   - Input: [detailed input]
   - Expected Output: [detailed output]
   - Pass Criteria: [exact specifications]
2. Edge Case Scenario
   - Input: [boundary condition]
   - Expected Behavior: [precise specification]
3. Error Scenario
   - Input: [invalid input]
   - Expected Error: [specific error type/message]

### Success Criteria
- [ ] All test cases pass
- [ ] 100% input validation
- [ ] No unexpected side effects
```

### Development and Branch Management

#### Implementation Constraints

1. Development Flow

   ```
   [Test Specification] → [Write Failing Test] → [Implement Minimal Code] → [Run Tests] → [Refactor If Needed] → [Confirm All Tests Pass]
   ```

2. Branching Strategy
   - [ ] Never work on the main branch
   - [ ] Create a branch for each phase
   - [ ] Create sub-branches for each task
   - [ ] Create sub-sub-branches for subtasks
3. Merge and Commit Guidelines
   - [ ] Commit after each test passes
   - [ ] Merge subtask branches into their task branch
   - [ ] Merge task branches into their phase branch
   - [ ] Merge phase branches into main
   - [ ] Do NOT delete any branches after merging

#### Progression Rules

- Do NOT move to the next task until:
  * All tests for the current task pass
  * Code meets all specified criteria
  * No known issues remain
  * All items on the checklist are checked off

### Workflow Tracking

- [ ] Always work only on the current task
- [ ] Continuously update and mark off:
  * The subtask checklist
  * The task checklist
  * The phase checklist

## Key Reminders for LLM

- ALWAYS define tests first
- Break work down into the smallest possible tasks
- Validate each component thoroughly
- Maintain clear, traceable progress
- Do NOT proceed without passing tests

### Documentation Requirements

Each task must have:
1. Detailed test specification
2. Implementation code
3. Passing test results
4. Explanation of approach

## Execution Guidance

- Start with the most fundamental tasks
- Complete lower-level subtasks before moving up
- Validate each component in isolation
- Ensure predictable, testable implementations
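As a closing illustration, the file layout called for by the Task Preparation Steps can be sketched as a small scaffolding script. This is a sketch under assumptions: the guide only mandates an "instructions" directory, numerically named phase .md files, and per-phase folders of task checklists — the phase/task names, the `implementation_plan.md` filename, and the `scaffold_instructions` helper are all hypothetical.

```python
from pathlib import Path

def scaffold_instructions(root: str, phases: dict[str, list[str]]) -> list[Path]:
    """Sketch of the "instructions" layout from the Task Preparation Steps.

    Phase and task names are hypothetical; only the "instructions"
    directory and numerically named phase .md files come from the guide.
    """
    created = []
    base = Path(root) / "instructions"
    base.mkdir(parents=True, exist_ok=True)

    # Top-level implementation plan with a checklist entry per phase.
    plan = base / "implementation_plan.md"
    plan.write_text(
        "# Implementation Plan\n\n"
        + "".join(f"- [ ] Phase {i}: {name}\n" for i, name in enumerate(phases, 1))
    )
    created.append(plan)

    for i, (phase, tasks) in enumerate(phases.items(), 1):
        # Numerically named phase file, plus a folder for its task checklists.
        phase_file = base / f"{i:02d}_{phase}.md"
        phase_file.write_text(
            f"# Phase {i}: {phase}\n\n"
            + "".join(f"- [ ] {task}\n" for task in tasks)
        )
        created.append(phase_file)

        phase_dir = base / f"{i:02d}_{phase}"
        phase_dir.mkdir(exist_ok=True)
        for j, task in enumerate(tasks, 1):
            task_file = phase_dir / f"{j:02d}_{task}.md"
            task_file.write_text(f"# Task: {task}\n\n- [ ] Define tests first\n")
            created.append(task_file)
    return created

# Usage (hypothetical phases and tasks):
# scaffold_instructions(".", {"setup": ["init_repo"], "core": ["parser", "api"]})
```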