Understanding GoodBarber's Architecture

Component Overview

GoodBarber operates on a hybrid model of pre-defined UI modules, back-office logic, and API integrations. Apps can fetch data via GoodBarber Collections, JSON feeds, or Custom Connectors, and each integration path comes with its own caching and sync behavior.

Offline Caching and Sync

By default, GoodBarber caches content for offline use, but this can create consistency issues when Custom Connectors pull from rapidly changing data sources. Sync intervals and update triggers are opaque and offer little direct configuration, so predictable refresh behavior requires careful design.

Symptoms of Sync Failures

Common Issues Encountered

  • Stale content appearing even after API data changes
  • Push notifications not updating corresponding views
  • Offline mode reloading incorrect or partial datasets
  • Slow loading due to redundant API calls

Root Cause Diagnostics

1. Debugging the Custom Connector

Custom Connectors act as intermediaries between GoodBarber and external APIs. Failures often stem from malformed JSON structures or misinterpreted HTTP status codes.

{"items": [
  {"title": "Product A", "description": "Latest stock info", "id": "123"},
  {"title": "Product B", "description": "New arrival", "id": "124"}
]}

Ensure the endpoint returns HTTP 200 and responds in under 300 ms; slow or failing responses cause a fallback to cached values.
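
Before pointing GoodBarber at the endpoint, it is worth scripting a quick check for exactly these conditions. A minimal sketch follows, with a placeholder feed URL; it verifies the status code, the latency guideline above, and the top-level 'items' array:

import json
import time
import urllib.request

FEED_URL = "https://api.example.com/products"  # placeholder for your connector's endpoint

start = time.monotonic()
with urllib.request.urlopen(FEED_URL, timeout=5) as resp:
    elapsed_ms = (time.monotonic() - start) * 1000
    body = resp.read()
    # The connector expects HTTP 200; anything else risks a cache fallback.
    assert resp.status == 200, f"unexpected status {resp.status}"

print(f"latency: {elapsed_ms:.0f} ms")  # aim for the sub-300 ms guideline

data = json.loads(body)
# 'items' should be a top-level array of objects with consistent keys.
assert isinstance(data.get("items"), list), "'items' must be a top-level array"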

2. Inspecting Sync Schedules

Syncs do not occur in real time, and cached values can linger until users manually refresh. You can simulate API updates with a local JSON service and log request timestamps to verify sync behavior.
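
One way to set this up is a throwaway local server that stamps each response with the current time and logs every incoming request; comparing the logged timestamps with what the app displays reveals the actual sync lag. A stdlib-only sketch (the port and item data are arbitrary, and if GoodBarber's infrastructure rather than a local test client fetches the feed, you would expose this through a tunnel or staging host):

import json
from datetime import datetime, timezone
from http.server import BaseHTTPRequestHandler, HTTPServer

class MockFeed(BaseHTTPRequestHandler):
    def do_GET(self):
        # Stamp each response so the app-visible content reveals
        # exactly which fetch produced it.
        now = datetime.now(timezone.utc).isoformat()
        payload = json.dumps({"items": [
            {"id": "123", "title": "Product A", "description": f"served at {now}"},
        ]}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, fmt, *args):
        # Timestamped request log: compare these entries against what
        # the app displays to measure sync lag.
        print(datetime.now(timezone.utc).isoformat(), fmt % args)

HTTPServer(("", 8000), MockFeed).serve_forever()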

Step-by-Step Fixes

1. Design an Update Flag

Implement a versioning mechanism in your JSON feed so that GoodBarber detects data changes reliably.

{"version": "20250801", "items": [ ... ]}

2. Force Manual Refresh

Use the back-office option to enable pull-to-refresh on the list view. Alternatively, inject a dummy field into the payload to simulate a data change and force a sync.
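
A sketch of the dummy-field approach, with an illustrative _refresh field name; any field whose value differs between responses will do:

import json
import time

def with_cache_buster(feed: dict) -> dict:
    # A value that differs on every response makes the payload look
    # "new" to any client comparing raw content.
    feed["_refresh"] = int(time.time())  # illustrative field name
    return feed

feed = {"version": "20250801", "items": []}
print(json.dumps(with_cache_buster(feed)))

Note the trade-off: a per-response field defeats caching entirely, so reserve it for views where staleness hurts more than the extra load.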

3. Normalize Push Notification Triggers

When using push to signal content changes, link push metadata to specific JSON routes or update timestamps.

{"push": {"target": "/products", "update_time": "2025-08-03T12:00:00Z"}}

Architectural Best Practices

Idempotency and Error Recovery

Ensure your external API behaves idempotently: the duplicate requests that sync retries produce should yield identical responses. This prevents GoodBarber from crashing or misbehaving when a request is repeated.
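
One concrete pattern is to make the response a pure function of stored state and expose a content-derived ETag, so retried requests always see byte-identical results. A sketch, with the items list standing in for real state:

import hashlib
import json

def render_response(items: list) -> tuple[bytes, dict]:
    # Pure function of state: the same items always yield the same
    # bytes and the same ETag, so duplicate requests are harmless.
    body = json.dumps({"items": items}, sort_keys=True).encode()
    etag = hashlib.sha256(body).hexdigest()[:16]
    headers = {"Content-Type": "application/json", "ETag": etag}
    return body, headers

body, headers = render_response([{"id": "123", "title": "Product A"}])
print(headers["ETag"])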

Payload Size Optimization

Keep JSON payloads under 1 MB. Chunk large datasets and use pagination parameters to avoid incomplete or truncated syncs.
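
A minimal pagination sketch; the page and per_page parameter names are illustrative, so use whatever your connector is configured to send:

import json

ITEMS = [{"id": str(i), "title": f"Product {i}"} for i in range(1, 5001)]

def paginate(items: list, page: int, per_page: int = 100) -> dict:
    start = (page - 1) * per_page
    chunk = items[start:start + per_page]
    return {
        "page": page,
        "per_page": per_page,
        "total": len(items),
        "items": chunk,  # each response stays well under the 1 MB guideline
    }

print(json.dumps(paginate(ITEMS, page=2))[:120])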

Monitoring via Proxy Layer

Insert a reverse proxy (e.g., Nginx or an API gateway) to log incoming GoodBarber requests. This helps you track request frequency, headers, and deviations from expected behavior.
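
If standing up Nginx is overkill for a first look, a small forwarding proxy captures the same signal. A stdlib-only sketch, where UPSTREAM is a placeholder for your real API origin:

import urllib.request
from datetime import datetime, timezone
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.error import HTTPError

UPSTREAM = "https://api.example.com"  # placeholder for your real API origin

class LoggingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # Record when GoodBarber polls, which path, and with which headers.
        print(datetime.now(timezone.utc).isoformat(), self.path, dict(self.headers))
        try:
            with urllib.request.urlopen(UPSTREAM + self.path, timeout=10) as resp:
                status, body = resp.status, resp.read()
                ctype = resp.headers.get("Content-Type", "application/json")
        except HTTPError as err:
            # Pass upstream errors through unchanged so failures stay visible.
            status, body, ctype = err.code, err.read(), "application/json"
        self.send_response(status)
        self.send_header("Content-Type", ctype)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("", 8080), LoggingProxy).serve_forever()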

Conclusion

Sync and data consistency issues in GoodBarber stem from the platform's hybrid caching behavior and its limited control over external data refreshes. By adopting architectural patterns like versioning, push-sync correlation, and payload optimization, developers can improve reliability and deliver a smoother UX. Proactive monitoring and debugging through test endpoints and proxy logs are essential for enterprise-grade deployments.

FAQs

1. How do I reduce API load when GoodBarber repeatedly polls my endpoint?

Implement cache headers and set rate limits via your proxy. GoodBarber may re-request data based on user sessions or background refresh patterns.
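
For example, your endpoint (or the proxy in front of it) could advertise a short freshness window; the values here are illustrative:

# Illustrative response headers: repeat polls within 60 seconds can be
# answered from HTTP caches instead of hitting the origin.
CACHE_HEADERS = {
    "Cache-Control": "public, max-age=60",
    "ETag": '"a1b2c3d4"',  # placeholder; derive from content in practice
}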

2. Can GoodBarber handle real-time updates via WebSockets?

No. GoodBarber does not support WebSocket or SSE connections natively. You must rely on periodic syncs or push notifications.

3. Why do push-triggered views not reflect updated API content?

Push only brings users to the view; it doesn't guarantee a data refresh. You must tie push metadata to an updated content timestamp or version key.

4. Is it possible to bypass GoodBarber caching completely?

Not entirely. You can reduce cache retention by enabling manual refresh and using dummy field changes, but background caching remains active for performance.

5. How do I debug JSON parsing errors in GoodBarber connectors?

Use an online JSON validator and set up a staging API that returns mock data. Log all responses and verify that 'items' is a top-level array with consistent keys.
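
That check is easy to script. A sketch, where the file path is a placeholder for a captured staging response:

import json

def validate_feed(raw: str) -> list[str]:
    problems = []
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    items = data.get("items")
    if not isinstance(items, list):
        return ["'items' must be a top-level array"]
    # Every item should expose the same keys, or views may render unevenly.
    keys = {frozenset(item) for item in items if isinstance(item, dict)}
    if len(keys) > 1:
        problems.append(f"inconsistent item keys: {sorted(map(sorted, keys))}")
    return problems

raw = open("staging_response.json").read()  # placeholder path
print(validate_feed(raw) or "feed looks structurally sound")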