Overview

This guide demonstrates how to efficiently run multiple Spongecake desktop containers concurrently. It covers how to:

  • Launch containers simultaneously.
  • Collect and process results from concurrent tasks.
  • Extend this pattern to various parallel workflows.

We’ll use an example that looks up the cheapest flight for each of several weekends in a given month, then identifies which weekend is the cheapest to fly.

Key Concepts

  • Multiple Containers: Each task runs in an isolated container with unique port assignments to prevent conflicts.
  • Parallel Execution: Python’s ThreadPoolExecutor runs the per-container tasks at the same time instead of one after another.
  • Aggregating Results: Concurrent tasks return individual results that can be combined or analyzed further, such as finding the cheapest price.
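
At its core, this is just the standard ThreadPoolExecutor pattern. The sketch below shows the bare shape using only the standard library; run_task is a placeholder for whatever work each container performs.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_task(task_id: int) -> str:
    # Placeholder: start a container, perform the work, stop the container.
    return f"result for task {task_id}"

# Submit one call per task, then gather results as each one finishes.
with ThreadPoolExecutor(max_workers=4) as executor:
    futures = {executor.submit(run_task, i): i for i in range(4)}
    results = {futures[f]: f.result() for f in as_completed(futures)}
```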

Example: Cheapest Weekend Flight

This example checks multiple weekends for the cheapest flights concurrently, aggregating the results to identify the lowest-priced weekend.
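
The original example code isn’t reproduced here, but the sketch below shows how it might look. The Desktop class, its constructor arguments (name, vnc_port, api_port), and the action(input_text=...) call are assumptions based on this guide’s description; check the Spongecake documentation for the exact API of your installed version. The route and prompt wording are arbitrary.

```python
import re
from concurrent.futures import ThreadPoolExecutor, as_completed

from spongecake import Desktop  # assumed import path

MONTH = "May"
WEEKENDS = [1, 2, 3, 4]  # weekend numbers to check

def check_weekend(weekend):
    """Start an isolated container, ask it for the cheapest flight, return the raw reply."""
    # Offset the ports per weekend so containers don't collide (assumed constructor kwargs).
    desktop = Desktop(
        name=f"flights-weekend-{weekend}",
        vnc_port=5900 + weekend,
        api_port=8000 + weekend,
    )
    desktop.start()
    try:
        prompt = (
            f"Find the cheapest round-trip flight from SFO to JFK for weekend {weekend} "
            f"of {MONTH} and reply with just the price, e.g. '$420'."
        )
        return weekend, str(desktop.action(input_text=prompt))  # assumed agent call
    finally:
        desktop.stop()  # always clean up the container, even if the task fails

def parse_price(text):
    """Pull the first dollar amount out of the agent's reply, if any."""
    match = re.search(r"\$\s*([\d,]+(?:\.\d{2})?)", text)
    return float(match.group(1).replace(",", "")) if match else None

results = {}
with ThreadPoolExecutor(max_workers=len(WEEKENDS)) as executor:
    futures = {executor.submit(check_weekend, w): w for w in WEEKENDS}
    for future in as_completed(futures):
        weekend = futures[future]
        try:
            _, raw = future.result()
            results[weekend] = raw
        except Exception as exc:  # one failed container doesn't sink the others
            print(f"Weekend {weekend} failed: {exc}")

prices = {w: parse_price(raw) for w, raw in results.items()}
prices = {w: p for w, p in prices.items() if p is not None}
if prices:
    best = min(prices, key=prices.get)
    print(f"Cheapest weekend: {best} at ${prices[best]:.2f}")
```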

Explanation

  1. Container per Weekend
    For each weekend number, we start a unique Spongecake container, specifying different ports. This prevents port collisions and ensures each container is fully isolated.

  2. Aggregating Output
    Results from each weekend’s check are collected in the results dictionary. We also parse them to find the cheapest numeric value. This same pattern applies to any scenario where you combine or compare data returned by concurrent tasks.

  3. Isolation & Cleanup
    Each container is started and stopped independently. If one container fails, it doesn’t impact the other containers. You can build in more robust error handling by wrapping each task in your own exception handlers.


Extending This Pattern

  • Different Task Types
    Instead of fetching flight prices, each container could perform any custom action (e.g., scraping hotel info, running integration tests, or collecting data from multiple APIs).

  • Scaling
    You can spin up as many containers as needed, each using different ports or letting Spongecake handle port assignment. If you have many tasks, simply add them to the executor queue.

  • Aggregating Complex Data
    Here we simply pick a numeric minimum. In other scenarios, you might compile large datasets, compute averages, or merge JSON results. The concurrency pattern remains the same.

  • Robust Error Handling
    The example logs errors or invalid data. You can customize error-handling routines (e.g., retries, fallback containers) depending on your reliability needs, as sketched below.
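
As one hedged illustration, a small retry wrapper can be layered on top of the task function from the example above (the attempt count and delay are arbitrary):

```python
import time

def with_retries(fn, *args, attempts=3, delay=5.0):
    """Call fn(*args); on failure, wait and retry, re-raising after the last attempt."""
    for attempt in range(1, attempts + 1):
        try:
            return fn(*args)
        except Exception as exc:
            if attempt == attempts:
                raise
            print(f"Attempt {attempt} failed ({exc}); retrying in {delay}s")
            time.sleep(delay)

# Usage with the example above, e.g.:
# executor.submit(with_retries, check_weekend, weekend)
```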