Ecosystem
WDL has a rich, distributed ecosystem of interconnected developer tools and execution engines that ensures (a) users can quickly write high-quality, idiomatic workflows, and (b) executing those workflows at scale in any computation environment is a breeze.
TIP
The WDL ecosystem is rapidly evolving, and, while we're always looking to expand the list of known ecosystem tools, sometimes tools get missed. If you know of a tool that needs to be listed here but isn't, we encourage you to create a pull request and let us know!
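To ground the tool listings below, here is a minimal sketch of the kind of workflow these tools operate on. The task and workflow names are hypothetical and chosen for illustration only; the sketch follows the WDL v1.1 specification.

```wdl
version 1.1

# A single task: run a shell command in a container and capture its output.
task say_hello {
    input {
        String name
    }

    command <<<
        echo "Hello, ~{name}!"
    >>>

    output {
        String greeting = read_string(stdout())
    }

    runtime {
        container: "ubuntu:22.04"
    }
}

# A workflow that calls the task and exposes its output.
workflow hello {
    input {
        String name = "WDL"
    }

    call say_hello { input: name = name }

    output {
        String greeting = say_hello.greeting
    }
}
```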
Execution Engines
The following table lists the known execution engines in alphabetical order.
Engine | Local | HPC | Cloud |
---|---|---|---|
AWS HealthOmics (supports WDL v1.1) | Hosted Platform | Slurm, IBM LSF | Amazon AWS, Microsoft Azure, Google Cloud
Cromwell (supports WDL v1.0) | Binary/Executable | Slurm, IBM LSF | Amazon AWS, Microsoft Azure, Google Cloud
dxCompiler (supports WDL v1.1 and v2.0) | Binary/Executable | Slurm, IBM LSF | Amazon AWS, Microsoft Azure, Google Cloud (all via DNAnexus)
miniwdl (supports WDL v1.1) | Binary/Executable | Slurm (plugin), IBM LSF (plugin) | Amazon AWS (plugin), Microsoft Azure, Google Cloud
Terra (supports WDL v1.0) | Hosted Platform | Slurm, IBM LSF | Amazon AWS, Microsoft Azure, Google Cloud
Toil (supports WDL v1.1) | Binary/Executable | Slurm, IBM LSF | Amazon AWS, Microsoft Azure, Google Cloud
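Part of what lets one workflow move between a laptop, an HPC cluster, and the cloud is that resource requirements are declared in a task's `runtime` section rather than in engine-specific configuration; each engine translates those declarations for its backend. A hedged sketch (the task is hypothetical, and support for individual attributes varies by engine and WDL version):

```wdl
version 1.1

task count_lines {
    input {
        File infile
    }

    command <<<
        wc -l < ~{infile}
    >>>

    output {
        Int n_lines = read_int(stdout())
    }

    # Engines map these declarations onto their backend: a local container
    # runtime, an HPC scheduler such as Slurm or IBM LSF, or a cloud batch
    # service. Exact attribute support varies by engine.
    runtime {
        container: "ubuntu:22.04"
        cpu: 1
        memory: "2 GiB"
    }
}
```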
IDE Support
Extensions and other IDE support tools, sorted by editor.
Name | Editor | Supports |
---|---|---|
Sprocket (LSP) | Editors with LSP | Formatting, linting, snippets, syntax highlighting, and validation. |
wdl-mode | Emacs | Syntax highlighting. |
poly-wdl | Emacs | Integration with polymode. |
Winstanley WDL | JetBrains | Linting and syntax highlighting. |
wdl-sublime | Sublime Text | Syntax highlighting. |
wdl-vim | Vim | Syntax highlighting. |
Sprocket (extension) | Visual Studio Code | Formatting, linting, snippets, syntax highlighting, and validation. |
Syntax Highlighter | Visual Studio Code | Syntax highlighting. |
Development Tools
The following tools enhance the experience of working with WDL, sorted by category.
Name | Category | Description |
---|---|---|
wdl-tests | Conformance testing | Conformance tests for WDL execution engines. |
wdl-aid | Documentation generation | "Automatic input generation for WDL workflows." |
wdldoc | Documentation generation | "Create WDL documentation using Markdown." |
wdl-packager | Package management | "Package a WDL and imports into a zip file." |
pytest-wdl | Testing | "WDL plugin for pytest." |
pytest-workflow | Testing | Testing framework for workflow languages (including WDL). |
Community Workflows
The following is an incomplete list of large WDL workflow repositories sorted by name.
Name | Organization | Description |
---|---|---|
BioWDL | LUMC | "Bioinformatics workflows and tasks, written in WDL." BioWDL is a large GitHub organization that contains the WDL workflows developed at LUMC (link). |
Chan Zuckerberg | Chan Zuckerberg Initiative | Official repository for the WDL workflows developed at the Chan Zuckerberg Initiative for the CZID platform (link). |
Dockstore | Multiple | Dockstore describes itself as "an app store for bioinformatics"; it is an open source platform for sharing analytical tools and workflows. WDL is one of the supported languages. |
ENCODE | ENCODE Consortium | Official repository of the ENCODE Data Coordinating Center's Uniform Processing Pipelines. These pipelines are designed to "create high-quality, consistent, and reproducible data" for the ENCODE project. |
GATK | Broad Institute | Official GATK best practices workflows developed at and published by the Broad Institute's Data Sciences Platform. |
PacBio | Pacific Biosciences | Official repository for the best practices workflows for PacBio data. |
St. Jude Cloud | St. Jude Children's Research Hospital | Official repository for data processing pipelines used on St. Jude Cloud (link). |
Theiagen | Theiagen Genomics | Official repository of Theiagen's WDL workflows. |
WARP | Broad Institute | WARP stands for "WDL Analysis Research Pipelines" and contains cloud-optimized pipelines for processing biological data from the Broad Institute Data Sciences Platform. |
WILDS | Fred Hutch | Official repositories of the Workflows for Integration of Large Data and Software (WILDS) developed at the Fred Hutch Data Science Lab. |