Unison Cloud offers a managed pool of cloud-based Unison nodes that can execute distributed computations. There's no separate packaging or deployment step: to use it, you just call a function, passing it the distributed computation you want to run. For instance, the example below forks two parallel computations at randomly chosen locations in the default pool. Calling cloud.run serializes the computation, syncs any missing dependencies on the fly, and runs it in the cloud.
main = do
  r = cloud.run do
    t1 = Remote.forkAt !pool.default '(1 + 1)
    t2 = Remote.forkAt !pool.default '(2 + 2)
    Remote.await t1 + Remote.await t2
  printLine ("Result was: " ++ Nat.toText r)
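The same `Remote.forkAt`/`Remote.await` pattern generalizes to a dynamic number of tasks. Here's a hedged sketch (not from the official docs) that forks one task per input element and sums the results; it assumes base-library helpers like `List.map` and `List.foldLeft` with their usual signatures:

```unison
-- Sketch only: fork one remote task per element, then await them all.
-- Assumes List.map and List.foldLeft from Unison's base library.
sumSquares : [Nat] ->{Remote} Nat
sumSquares ns =
  tasks = List.map (n -> Remote.forkAt !pool.default '(n * n)) ns
  List.foldLeft (+) 0 (List.map Remote.await tasks)

main = do
  r = cloud.run '(sumSquares [1, 2, 3, 4])
  printLine ("Sum of squares: " ++ Nat.toText r)
```

Because the tasks are forked before any is awaited, they can run in parallel across the pool, just as in the two-task example above.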
The current unison.cloud API is tailored for batch computations (as in Hadoop, Spark, or miscellaneous ETL jobs), but we plan to add support for launching async background jobs, recurring or scheduled jobs, resilient long-running workflows (as in Temporal), and autoscaled resilient microservices.
We'd like to hear from folks who have ideas for use cases in any of these areas and who are interested in partnering with us on a pilot project using Unison and Unison Cloud.