# ClearGrow Controller Test Infrastructure

## Overview

Unit and integration tests for the ClearGrow ESP32-S3 controller firmware.

- Framework: Unity (ESP-IDF integrated)
- Test Locations: `test/unit/`, `test/integration/`, `test/hardware/`
- Total Tests: 550+ test cases across 16+ components

## Quick Start

```bash
# Set required test credentials (see Test Configuration section below)
export CLEARGROW_TEST_API_KEY="your-test-api-key"
export CLEARGROW_TEST_JWT_SECRET="your-test-jwt-secret"

# Run all tests (validates test structure)
./test/run_tests.sh

# Run specific component tests
./test/run_tests.sh automation

# Check test coverage
./test/run_coverage.sh

# Generate detailed coverage report (gcov/lcov)
./test/run_coverage.sh --gcov
```

## Test Configuration

Tests require credentials to be configured via environment variables or Kconfig. Environment variables take precedence over Kconfig settings.
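
For credential-dependent tests, a skip guard keeps a missing secret from surfacing as a confusing failure. A minimal sketch, assuming a host-based run where `getenv()` can see the exported variable (on-target builds typically bake the value in at compile time); the test body itself is illustrative:

```c
#include <stdlib.h>
#include "unity.h"

void test_network_api_auth_with_configured_key(void) {
    // Skip cleanly instead of failing when the secret is absent.
    const char *api_key = getenv("CLEARGROW_TEST_API_KEY");
    if (api_key == NULL || api_key[0] == '\0') {
        TEST_IGNORE_MESSAGE("CLEARGROW_TEST_API_KEY not set; skipping");
    }
    // ... authenticate against the test backend using api_key ...
}
```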

### Required Environment Variables

| Variable | Purpose | Required For |
|---|---|---|
| `CLEARGROW_TEST_API_KEY` | API key for network API authentication tests | Unit tests |
| `CLEARGROW_TEST_JWT_SECRET` | JWT signing secret for authentication tests | Unit tests |
| `CLEARGROW_TEST_WIFI_SSID` | WiFi network SSID | Hardware tests |
| `CLEARGROW_TEST_WIFI_PASSWORD` | WiFi network password (empty for open networks) | Hardware tests |

### Setting Up Test Credentials

#### Option 1: Environment Variables (Recommended for CI/CD)

```bash
# For unit tests
export CLEARGROW_TEST_API_KEY="your-secure-test-api-key"
export CLEARGROW_TEST_JWT_SECRET="your-secure-jwt-secret-min-32-chars"

# For hardware tests (additional)
export CLEARGROW_TEST_WIFI_SSID="YourTestNetwork"
export CLEARGROW_TEST_WIFI_PASSWORD="YourTestPassword"
```

#### Option 2: Kconfig (For local development)

```bash
idf.py menuconfig
# Navigate to: ClearGrow Test Configuration
# Set test credentials under the appropriate submenu
```
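
Values set through menuconfig are emitted into `sdkconfig.h` as `CONFIG_*` macros and can be read at compile time. The symbol name below is an illustrative guess, not a confirmed Kconfig entry:

```c
#include "sdkconfig.h"

// Hypothetical symbol name -- verify against the actual Kconfig menu.
#ifdef CONFIG_CLEARGROW_TEST_WIFI_SSID
static const char *s_test_wifi_ssid = CONFIG_CLEARGROW_TEST_WIFI_SSID;
#endif
```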

### Security Notes

- Never commit test credentials to source control
- Use secure secret management for CI/CD (GitHub Secrets, etc.)
- Test credentials should be unique to your test environment
- For hardware tests, use a dedicated test WiFi network
For detailed coverage information, see COVERAGE.md

## Directory Structure

```
test/
├── run_tests.sh                     # Main test runner script
├── run_coverage.sh                  # Coverage report generator
├── README.md                        # This file
├── COVERAGE.md                      # Coverage reporting guide
├── sdkconfig.coverage               # Coverage build configuration
├── unit/                            # Unit tests (per-component)
│   ├── CMakeLists.txt               # Unit test build configuration
│   ├── test_automation.c
│   ├── test_common.c
│   ├── test_controller_sync.c
│   ├── test_data_logger.c
│   ├── test_display.c
│   ├── test_network_api.c
│   ├── test_ota_manager.c
│   ├── test_provisioning.c
│   ├── test_security.c
│   ├── test_sensor_hub.c
│   ├── test_settings.c
│   ├── test_spi_bus.c
│   ├── test_tflite_runner.c
│   ├── test_watchdog.c
│   └── test_wifi_manager.c
├── integration/                     # Integration tests
│   ├── test_tflite_integration.c
│   ├── test_sensor_automation.c
│   ├── test_thread_probe.c
│   └── test_wifi_network_api.c
└── hardware/                        # Hardware-in-the-loop tests
    ├── README.md                    # Hardware test setup guide
    ├── run_hardware_tests.sh        # Hardware test runner
    ├── pytest_hardware_tests.py     # Pytest test suite
    ├── test_hardware_display.c      # Display & touch tests
    ├── test_hardware_wifi.c         # WiFi radio tests
    ├── test_hardware_thread_rcp.c   # Thread RCP tests
    └── test_hardware_spi_sd.c       # SD card SPI tests
```

## Writing Tests

### Test File Template

```c
#include "unity.h"
#include "component_header.h"

// Optional: Mock includes
// #include "mock_nvs.h"

static const char *TAG = "test_component";

void setUp(void) {
    // Called before each test
}

void tearDown(void) {
    // Called after each test
}

void test_function_should_return_success_on_valid_input(void) {
    // Arrange
    int input = 42;

    // Act
    int result = component_function(input);

    // Assert
    TEST_ASSERT_EQUAL(ESP_OK, result);
}

void test_function_should_fail_on_null_pointer(void) {
    // Act & Assert
    TEST_ASSERT_EQUAL(ESP_ERR_INVALID_ARG, component_function(NULL));
}

// Test runner (for standalone execution)
void app_main(void) {
    UNITY_BEGIN();
    RUN_TEST(test_function_should_return_success_on_valid_input);
    RUN_TEST(test_function_should_fail_on_null_pointer);
    UNITY_END();
}
```
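
When tests are built as an ESP-IDF test app, they can instead self-register with the Unity component's `TEST_CASE` macro and be selected from the interactive test menu, avoiding the manual `RUN_TEST` list:

```c
// Self-registering variant (ESP-IDF Unity component); the body reuses the
// illustrative component_function from the template above.
TEST_CASE("component returns success on valid input", "[component]")
{
    TEST_ASSERT_EQUAL(ESP_OK, component_function(42));
}
```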

### Naming Conventions

- Test files: `test_<component>.c`
- Test functions: `test_<function>_should_<expected_behavior>`
- Use descriptive names that explain what's being tested
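
A quick illustration with made-up names:

```c
// Good: names the unit, the action, and the expected outcome.
void test_settings_save_should_persist_value(void);

// Weak: a failure report for this name says nothing about what broke.
void test_settings_2(void);
```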

### Common Assertions

```c
TEST_ASSERT_TRUE(condition);
TEST_ASSERT_FALSE(condition);
TEST_ASSERT_EQUAL(expected, actual);
TEST_ASSERT_EQUAL_STRING(expected, actual);
TEST_ASSERT_NULL(pointer);
TEST_ASSERT_NOT_NULL(pointer);
TEST_ASSERT_EQUAL_MEMORY(expected, actual, size);
TEST_ASSERT_FLOAT_WITHIN(delta, expected, actual);
```

## Mocking

### Existing Mock Patterns

The codebase includes mocks for:

- LVGL: `test_lvgl_mocks.c` - UI widget mocking
- NVS: Mock NVS operations for settings tests
- SD Card: Mock file operations for data_logger tests
- OTA: Mock update operations
- Security: Mock crypto operations

### Creating Mocks

```c
// In test file or separate mock file
static int mock_function_call_count = 0;
static esp_err_t mock_function_return = ESP_OK;

esp_err_t mocked_function(void) {
    mock_function_call_count++;
    return mock_function_return;
}

// In setUp()
void setUp(void) {
    mock_function_call_count = 0;
    mock_function_return = ESP_OK;
}

// In test
void test_caller_invokes_function_once(void) {
    caller_that_uses_function();
    TEST_ASSERT_EQUAL(1, mock_function_call_count);
}
```
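
The substitution itself typically happens at link time: the test build compiles the mock in place of the real implementation. A follow-on sketch that drives the error path through the same mock, assuming `caller_that_uses_function()` propagates its dependency's error code (names reused from above):

```c
void test_caller_propagates_dependency_failure(void) {
    mock_function_return = ESP_ERR_NO_MEM;  // force the dependency to fail
    TEST_ASSERT_EQUAL(ESP_ERR_NO_MEM, caller_that_uses_function());
}
```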

## Running Tests on Hardware

### Unit Tests on Hardware

For actual ESP32-S3 execution of unit tests:

```bash
# Build with test configuration
idf.py -T test/unit build

# Flash and run
idf.py -p /dev/ttyUSB0 flash monitor
```

### Hardware-in-the-Loop Tests

For comprehensive hardware validation with real peripherals:

```bash
# Run all hardware tests using pytest
cd test/hardware
./run_hardware_tests.sh

# Run specific hardware test categories
./run_hardware_tests.sh --display   # Display & touch only
./run_hardware_tests.sh --wifi      # WiFi only
./run_hardware_tests.sh --thread    # Thread RCP only

# Or run individual test applications (from the repo root)
idf.py -C test/hardware/test_hardware_display build
idf.py -C test/hardware/test_hardware_display -p /dev/ttyUSB0 flash monitor
```

See `hardware/README.md` for the complete hardware test setup guide.

### TFLite Integration Tests

The TFLite integration tests require the ML feature to be enabled:

```bash
# Build with ML feature enabled
idf.py -D CONFIG_CLEARGROW_ENABLE_ML=y build

# Flash and run TFLite integration test
idf.py -p /dev/ttyUSB0 flash monitor
```

Note: The TFLite integration test (`test_tflite_integration.c`) uses the actual placeholder model from `models/anomaly_detector.tflite` (30.6 KB). This test validates:

- Model loading from embedded flash
- Real TensorFlow Lite Micro inference execution
- Tensor arena allocation in PSRAM
- Output shape and value validation
- Memory leak detection across multiple inferences

Unlike the unit test (`test_tflite_runner.c`), which uses mocks, this integration test exercises the complete TFLite stack, including all TensorFlow Lite Micro operations.
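
As a rough sketch of the leak check's shape (the `run_single_inference()` helper is hypothetical; the real test lives in `test_tflite_integration.c`):

```c
#include "esp_system.h"  // esp_get_free_heap_size()
#include "unity.h"

void test_repeated_inference_does_not_leak(void) {
    uint32_t heap_before = esp_get_free_heap_size();
    for (int i = 0; i < 10; i++) {
        run_single_inference();  // hypothetical wrapper around one TFLite run
    }
    uint32_t heap_after = esp_get_free_heap_size();
    // Tolerate small allocator noise, but catch a steady per-inference leak.
    TEST_ASSERT_UINT_WITHIN(256, heap_before, heap_after);
}
```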

## Component Test Status

| Component | Tests | Status |
|---|---|---|
| automation | 4 | ✅ |
| common | 5 | ✅ |
| controller_sync | 4 | ✅ |
| data_logger | 8 | ✅ |
| display | 6 | ✅ |
| network_api | 10 | ✅ |
| ota_manager | 7 | ✅ |
| provisioning | 5 | ✅ |
| security | 6 | ✅ |
| sensor_hub | 12 | ✅ |
| settings | 8 | ✅ |
| spi_bus | 4 | ✅ |
| tflite_runner | 4 unit + 5 integration | ✅ |
| thread_manager | - | ⚠️ Integration only |
| ui | - | ⚠️ Manual testing |
| watchdog | 5 | ✅ |
| wifi_manager | 6 | ✅ |

## Test Categories

| Category | Location | Count | Purpose |
|---|---|---|---|
| Unit Tests | `test/unit/` | 550+ | Isolated component testing with mocks |
| Integration Tests | `test/integration/` | 4+ | Multi-component interactions |
| Hardware Tests | `test/hardware/` | 25+ | Real hardware validation (HIL) |

## Future Enhancements

- CI/CD Integration: GitHub Actions workflow for automated testing
- Code Coverage: gcov/lcov integration for coverage metrics
- Host-Based Testing: Run tests on development machine without hardware
- Fuzz Testing: Randomized input testing for edge cases