diff --git a/docs/process.md b/docs/process.md
index c700706e9d..4354987b14 100644
--- a/docs/process.md
+++ b/docs/process.md
@@ -821,6 +821,96 @@ In general, multiple input channels should be used to process *combinations* of
 
 See also: {ref}`channel-types`.
 
+(process-typed-inputs)=
+
+### Typed inputs
+
+:::{versionadded} 24.10.0
+:::
+
+Typed inputs are an alternative way to define process inputs with standard types. This approach has a number of benefits:
+
+- A typed input can validate the values that it receives at runtime and raise an error if there is a type mismatch.
+
+- Whereas a `path` input relies on a custom `arity` option to distinguish between a single file and a list of files, with typed inputs the distinction is trivial: `Path` vs `List<Path>`.
+
+- Typed inputs enable the use of custom record types (i.e. defined using `@ValueObject` or `record`), which makes the code much easier to read and understand compared to `tuple` inputs.
+
+A typed input is simply a variable declaration, i.e. `<type> <name>`. Here are some examples:
+
+```groovy
+input:
+int my_int
+String my_string
+Path my_file
+List<Path> my_files
+```
+
+In the above example:
+
+- `my_int` and `my_string` are treated like `val` inputs; they are defined in the process body as variables
+
+- `my_file` and `my_files` are treated like `path` inputs; they are defined as variables and their files are staged into the task directory
+
+One of the most important capabilities enabled by typed inputs is the use of custom record types. Here is an example:
+
+```groovy
+@ValueObject
+class Sample {
+    String id
+    List<Path> reads
+}
+
+process foo {
+    input:
+    Sample my_sample
+
+    // ...
+}
+```
+
+In this example, `Sample` is a record type with two members, `id` and `reads`. The `Sample` input in process `foo` will be provided as a variable to the process body, where its members can be accessed as `my_sample.id` and `my_sample.reads`. Additionally, because `my_sample.reads` is a collection of files (given by its type `List<Path>`), it will be staged into the task directory like a `path` input.
+
+Environment variables and standard input can be defined using the new `env` and `stdin` directives. Building on the previous example:
+
+```groovy
+process foo {
+    env('SAMPLE_ID') { my_sample.id }
+    env('FIRST_READ_FILE') { my_sample.reads[0]?.name }
+    stdin { my_sample.reads[0] }
+
+    input:
+    Sample my_sample
+
+    // ...
+}
+```
+
+In the above example:
+
+- The sample id will be exported to the `SAMPLE_ID` variable in the task environment
+- The name of the first sample read file will be exported to the `FIRST_READ_FILE` variable in the task environment
+- The contents of the first sample read file will be provided as standard input to the task
+
+By default, file inputs are automatically inferred from the types and staged into the task directory. Alternatively, the `stageAs` directive can be used to stage files under a different name, similar to using the `name` or `stageAs` option with a `path` input. For example:
+
+```groovy
+process foo {
+    stageAs('*.fastq') { my_sample.reads }
+
+    input:
+    Sample my_sample
+
+    // ...
+}
+```
+
+In this case, `my_sample.reads` will be staged as `*.fastq`, overriding the default behavior.
+
+:::{note}
+While the `env`, `stageAs`, and `stdin` directives are provided as a convenience, it is usually easier to simply rely on the default file staging behavior, and to use the input variables directly in the task script.
+:::
+
 (process-output)=
 
 ## Outputs
 
@@ -1202,6 +1292,79 @@ The following options are available for all process outputs:
 
 : Defines the {ref}`channel topic <channel-topic>` to which the output will be sent.
 
+(process-typed-outputs)=
+
+### Typed outputs
+
+:::{versionadded} 24.10.0
+:::
+
+Typed outputs are an alternative way to define process outputs with standard types. This approach has a number of benefits:
+
+- A typed output clearly describes the expected structure of the output, which makes it easier to use the output in downstream operations.
+
+- Whereas a `path` output relies on a custom `arity` option to distinguish between a single file and a list of files, with typed outputs the distinction is trivial: `Path` vs `List<Path>`.
+
+- Typed outputs enable the use of custom record types (i.e. defined using `@ValueObject` or `record`), which makes the code much easier to read and understand compared to `tuple` outputs.
+
+A typed output is simply a variable declaration with an optional assignment, i.e. `<type> <name> [= <value>]`. Here are some examples:
+
+```groovy
+output:
+int my_int
+String my_string = my_input
+Path my_file = path('file1.txt')
+List<Path> my_files = path('*.txt')
+```
+
+In the above example:
+
+- `my_int` and `my_string` are treated like `val` outputs; they are assigned to the variables `my_int` and `my_input`, which are expected to be defined in the process body
+
+- `my_file` and `my_files` are treated like `path` outputs; they are assigned to a file or list of files based on a matching pattern using the `path()` method
+
+- The output variable names correspond to the `emit` option for process outputs
+
+One of the most important capabilities enabled by typed outputs is the use of custom record types. Here is an example:
+
+```groovy
+@ValueObject
+class Sample {
+    String id
+    List<Path> reads
+}
+
+process foo {
+    input:
+    String id
+
+    output:
+    Sample my_sample = new Sample(id, path('*.fastq'))
+
+    // ...
+}
+```
+
+In this example, `Sample` is a record type with two members, `id` and `reads`. The `Sample` output will be constructed from the `id` input variable and the collection of task output files matching the pattern `*.fastq`.
+
+In addition to the `path()` method, there are also the `env()`, `eval()`, and `stdout()` methods for extracting environment variables, the results of eval commands, and the standard output from the task environment. For example:
+
+```groovy
+process foo {
+    // ...
+
+    output:
+    String my_env = env('MY_VAR')
+    String my_eval = eval('bash --version')
+    String my_stdout = stdout()
+    List my_tuple = [ env('MY_VAR'), eval('bash --version'), stdout() ]
+
+    // ...
+}
+```
+
+As shown in the above examples, output values can be any expression, including lists, maps, records, and even function calls.
+
 ## When
 
 The `when` block allows you to define a condition that must be satisfied in order to execute the process. The condition can be any expression that returns a boolean value.
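The two documentation hunks above describe typed inputs and typed outputs separately. The following sketch is purely illustrative and is not part of the patch: it assumes that typed inputs are bound from channels in the same positional way as traditional `val`/`path` inputs, and it reuses the `Sample` record pattern from the examples above. The process name `TRIM`, the `trim_reads.sh` command, and the file names are hypothetical.

```groovy
import java.nio.file.Path

// Hypothetical record type, mirroring the `Sample` examples in the docs above.
@ValueObject
class Sample {
    String id
    List<Path> reads
}

process TRIM {
    input:
    Sample my_sample    // typed input: validated at runtime, `reads` files are staged

    output:
    // typed output: rebuild the record from the task's output files
    Sample trimmed = new Sample(my_sample.id, path('*.trimmed.fastq'))

    script:
    """
    # trim_reads.sh is a placeholder command
    trim_reads.sh ${my_sample.reads.join(' ')}
    """
}

workflow {
    // assumes a record instance can be sent through a channel like any other value
    samples = Channel.of(
        new Sample('sampleA', [file('a_1.fastq'), file('a_2.fastq')])
    )
    TRIM(samples)
    // the typed output name `trimmed` doubles as the emit name
    TRIM.out.trimmed.view { s -> "${s.id}: ${s.reads}" }
}
```

If the runtime type checking behaves as described in the documentation, feeding `TRIM` a channel of plain strings would fail with a type mismatch rather than silently producing a malformed task.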
diff --git a/modules/nextflow/src/main/groovy/nextflow/Session.groovy b/modules/nextflow/src/main/groovy/nextflow/Session.groovy index 5ca23559d7..61c61fddb0 100644 --- a/modules/nextflow/src/main/groovy/nextflow/Session.groovy +++ b/modules/nextflow/src/main/groovy/nextflow/Session.groovy @@ -55,6 +55,7 @@ import nextflow.processor.ErrorStrategy import nextflow.processor.TaskFault import nextflow.processor.TaskHandler import nextflow.processor.TaskProcessor +import nextflow.script.dsl.ProcessConfigBuilder import nextflow.script.BaseScript import nextflow.script.ProcessConfig import nextflow.script.ProcessFactory @@ -927,7 +928,7 @@ class Session implements ISession { * @return {@code true} if the name specified belongs to the list of process names or {@code false} otherwise */ protected boolean checkValidProcessName(Collection processNames, String selector, List errorMessage) { - final matches = processNames.any { name -> ProcessConfig.matchesSelector(name, selector) } + final matches = processNames.any { name -> ProcessConfigBuilder.matchesSelector(name, selector) } if( matches ) return true @@ -938,6 +939,7 @@ class Session implements ISession { errorMessage << message.toString() return false } + /** * Register a shutdown hook to close services when the session terminates * @param Closure diff --git a/modules/nextflow/src/main/groovy/nextflow/ast/NextflowDSLImpl.groovy b/modules/nextflow/src/main/groovy/nextflow/ast/NextflowDSLImpl.groovy index 7396a5ab16..75c6bc3df4 100644 --- a/modules/nextflow/src/main/groovy/nextflow/ast/NextflowDSLImpl.groovy +++ b/modules/nextflow/src/main/groovy/nextflow/ast/NextflowDSLImpl.groovy @@ -26,6 +26,7 @@ import nextflow.NF import nextflow.script.BaseScript import nextflow.script.BodyDef import nextflow.script.IncludeDef +import nextflow.script.LazyVar import nextflow.script.TaskClosure import nextflow.script.TokenEvalCall import nextflow.script.TokenEnvCall @@ -35,7 +36,6 @@ import nextflow.script.TokenStdinCall import nextflow.script.TokenStdoutCall import nextflow.script.TokenValCall import nextflow.script.TokenValRef -import nextflow.script.TokenVar import org.codehaus.groovy.ast.ASTNode import org.codehaus.groovy.ast.ClassCodeVisitorSupport import org.codehaus.groovy.ast.ClassNode @@ -47,6 +47,7 @@ import org.codehaus.groovy.ast.expr.BinaryExpression import org.codehaus.groovy.ast.expr.CastExpression import org.codehaus.groovy.ast.expr.ClosureExpression import org.codehaus.groovy.ast.expr.ConstantExpression +import org.codehaus.groovy.ast.expr.DeclarationExpression import org.codehaus.groovy.ast.expr.Expression import org.codehaus.groovy.ast.expr.GStringExpression import org.codehaus.groovy.ast.expr.ListExpression @@ -212,8 +213,8 @@ class NextflowDSLImpl implements ASTTransformation { else if( arg instanceof VariableExpression ) { // the name of the component i.e. process, workflow, etc to import final component = arg.getName() - // wrap the name in a `TokenVar` type - final token = createX(TokenVar, new ConstantExpression(component)) + // wrap the name in a `LazyVar` type + final token = createX(LazyVar, new ConstantExpression(component)) // create a new `IncludeDef` object newArgs.addExpression(createX(IncludeDef, token)) } @@ -221,8 +222,8 @@ class NextflowDSLImpl implements ASTTransformation { def cast = (CastExpression)arg // the name of the component i.e. 
process, workflow, etc to import final component = (cast.expression as VariableExpression).getName() - // wrap the name in a `TokenVar` type - final token = createX(TokenVar, new ConstantExpression(component)) + // wrap the name in a `LazyVar` type + final token = createX(LazyVar, new ConstantExpression(component)) // the alias to give it final alias = constX(cast.type.name) newArgs.addExpression( createX(IncludeDef, token, alias) ) @@ -512,6 +513,7 @@ class NextflowDSLImpl implements ASTTransformation { * - collect all the statement after the 'exec:' label */ def source = new StringBuilder() + List paramStatements = [] List execStatements = [] List whenStatements = [] @@ -535,7 +537,10 @@ class NextflowDSLImpl implements ASTTransformation { if( stm instanceof ExpressionStatement ) { fixLazyGString( stm ) fixStdinStdout( stm ) - convertInputMethod( stm.getExpression() ) + if( stm.expression instanceof DeclarationExpression ) + convertInputDeclaration( stm ) + else if( stm.expression instanceof MethodCallExpression ) + convertInputMethod( (MethodCallExpression)stm.expression ) } break @@ -543,7 +548,10 @@ class NextflowDSLImpl implements ASTTransformation { if( stm instanceof ExpressionStatement ) { fixLazyGString( stm ) fixStdinStdout( stm ) - convertOutputMethod( stm.getExpression() ) + if( stm.expression instanceof DeclarationExpression ) + paramStatements.addAll( convertOutputDeclaration( stm ) ) + else if( stm.expression instanceof MethodCallExpression ) + convertOutputMethod( (MethodCallExpression)stm.expression ) } break @@ -669,6 +677,10 @@ class NextflowDSLImpl implements ASTTransformation { stm.visit(new TaskCmdXformVisitor(unit)) } + // prepend additional param statements + paramStatements.addAll(block.statements) + block.statements = paramStatements + if (!done) { log.trace "Invalid 'process' definition -- Process must terminate with string expression" int line = methodCall.lineNumber @@ -859,59 +871,44 @@ class NextflowDSLImpl implements ASTTransformation { } } - /* - * handle *input* parameters - */ - protected void convertInputMethod( Expression expression ) { - log.trace "convert > input expression: $expression" - - if( expression instanceof MethodCallExpression ) { - - def methodCall = expression as MethodCallExpression - def methodName = methodCall.getMethodAsString() - def nested = methodCall.objectExpression instanceof MethodCallExpression - log.trace "convert > input method: $methodName" + protected void convertInputDeclaration( ExpressionStatement stmt ) { + // don't throw error if not method because it could be an implicit script statement + if( stmt.expression !instanceof DeclarationExpression ) + return - if( methodName in ['val','env','file','each','set','stdin','path','tuple'] ) { - //this methods require a special prefix - if( !nested ) - methodCall.setMethod( new ConstantExpression('_in_' + methodName) ) + final decl = (DeclarationExpression)stmt.expression + if( decl.isMultipleAssignmentDeclaration() ) { + syntaxError(decl, "Invalid process input statement, possible syntax error") + return + } - fixMethodCall(methodCall) - } + // NOTE: doint this in semantic analysis causes null pointer exception + final var = decl.variableExpression + stmt.expression = callThisX( + '_typed_in_param', + args(constX(var.name), classX(var.type)) + ) + } - /* - * Handles a GString a file name, like this: - * - * input: - * file x name "$var_name" from q - * - */ - else if( methodName == 'name' && isWithinMethod(expression, 'file') ) { - varToConstX(methodCall.getArguments()) - } + 
private static final List VALID_INPUT_METHODS = ['val','env','file','path','stdin','each','tuple'] - // invoke on the next method call - if( expression.objectExpression instanceof MethodCallExpression ) { - convertInputMethod(methodCall.objectExpression) - } - } + protected void convertInputMethod( MethodCallExpression methodCall ) { + final methodName = methodCall.getMethodAsString() + log.trace "convert > input method: $methodName" - else if( expression instanceof PropertyExpression ) { - // invoke on the next method call - if( expression.objectExpression instanceof MethodCallExpression ) { - convertInputMethod(expression.objectExpression) - } + final caller = methodCall.objectExpression + if( caller !instanceof VariableExpression || caller.getText() != 'this' ) { + syntaxError(methodCall, "Invalid process input statement, possible syntax error") + return } - } - - protected boolean isWithinMethod(MethodCallExpression method, String name) { - if( method.objectExpression instanceof MethodCallExpression ) { - return isWithinMethod(method.objectExpression as MethodCallExpression, name) + if( methodName !in VALID_INPUT_METHODS ) { + syntaxError(methodCall, "Invalid process input method '${methodName}'") + return } - return method.getMethodAsString() == name + methodCall.setMethod( new ConstantExpression('_in_' + methodName) ) + fixMethodCall(methodCall) } /** @@ -944,34 +941,56 @@ class NextflowDSLImpl implements ASTTransformation { } } - protected void convertOutputMethod( Expression expression ) { - log.trace "convert > output expression: $expression" + protected List convertOutputDeclaration( ExpressionStatement stmt ) { + // don't throw error if not method because it could be an implicit script statement + if( stmt.expression !instanceof DeclarationExpression ) + return + + final decl = (DeclarationExpression)stmt.expression + log.trace "convert > output declaration: $decl" - if( !(expression instanceof MethodCallExpression) ) { + if( decl.isMultipleAssignmentDeclaration() ) { + syntaxError(decl, "Invalid process output statement, possible syntax error") return } - def methodCall = expression as MethodCallExpression - def methodName = methodCall.getMethodAsString() - def nested = methodCall.objectExpression instanceof MethodCallExpression - log.trace "convert > output method: $methodName" + final var = decl.variableExpression + final rhs = decl.rightExpression ?: var + stmt.expression = callThisX( + '_typed_out_param', + new ArgumentListExpression( + constX(var.name), + classX(var.type), + closureX(new ExpressionStatement(rhs)) + ) + ) + + // infer unstaging directives from AST + final visitor = new ProcessOutputVisitor(unit) + rhs.visit(visitor) + return visitor.statements + } - if( methodName in ['val','env','eval','file','set','stdout','path','tuple'] && !nested ) { - // prefix the method name with the string '_out_' - methodCall.setMethod( new ConstantExpression('_out_' + methodName) ) - fixMethodCall(methodCall) - fixOutEmitAndTopicOptions(methodCall) - } + private static final List VALID_OUTPUT_METHODS = ['val','env','eval','file','path','stdout','tuple'] + + protected void convertOutputMethod( MethodCallExpression methodCall ) { + final methodName = methodCall.getMethodAsString() + log.trace "convert > output method: $methodName" - else if( methodName in ['into','mode'] ) { - fixMethodCall(methodCall) + final caller = methodCall.objectExpression + if( caller !instanceof VariableExpression || caller.getText() != 'this' ) { + syntaxError(methodCall, "Invalid process output 
statement, possible syntax error") + return } - // continue to traverse - if( methodCall.objectExpression instanceof MethodCallExpression ) { - convertOutputMethod(methodCall.objectExpression) + if( methodName !in VALID_OUTPUT_METHODS ) { + syntaxError(methodCall, "Invalid process output method '${methodName}'") + return } + methodCall.setMethod( new ConstantExpression('_out_' + methodName) ) + fixMethodCall(methodCall) + fixOutEmitAndTopicOptions(methodCall) } private boolean withinTupleMethod @@ -1047,7 +1066,7 @@ class NextflowDSLImpl implements ASTTransformation { protected Expression varToStrX( Expression expr ) { if( expr instanceof VariableExpression ) { def name = ((VariableExpression) expr).getName() - return createX( TokenVar, new ConstantExpression(name) ) + return createX( LazyVar, new ConstantExpression(name) ) } else if( expr instanceof PropertyExpression ) { // transform an output declaration such @@ -1094,7 +1113,7 @@ class NextflowDSLImpl implements ASTTransformation { return createX( TokenStdoutCall ) else - return createX( TokenVar, new ConstantExpression(name) ) + return createX( LazyVar, new ConstantExpression(name) ) } if( expr instanceof MethodCallExpression ) { diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/EnvInParam.groovy b/modules/nextflow/src/main/groovy/nextflow/ast/ProcessInputPathXform.groovy similarity index 59% rename from modules/nextflow/src/main/groovy/nextflow/script/params/EnvInParam.groovy rename to modules/nextflow/src/main/groovy/nextflow/ast/ProcessInputPathXform.groovy index 2ca7e121d6..d7a5a0f401 100644 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/EnvInParam.groovy +++ b/modules/nextflow/src/main/groovy/nextflow/ast/ProcessInputPathXform.groovy @@ -14,20 +14,16 @@ * limitations under the License. */ -package nextflow.script.params +package nextflow.ast -import groovy.transform.InheritConstructors +import java.lang.annotation.ElementType +import java.lang.annotation.Retention +import java.lang.annotation.RetentionPolicy +import java.lang.annotation.Target +import org.codehaus.groovy.transform.GroovyASTTransformationClass -/** - * Represents a process *environment* input parameter - * - * @author Paolo Di Tommaso - */ -@InheritConstructors -class EnvInParam extends BaseInParam { - - @Override - String getTypeName() { 'env' } - -} +@Retention(RetentionPolicy.SOURCE) +@Target(ElementType.METHOD) +@GroovyASTTransformationClass(classes = [ProcessInputPathXformImpl]) +@interface ProcessInputPathXform {} diff --git a/modules/nextflow/src/main/groovy/nextflow/ast/ProcessInputPathXformImpl.groovy b/modules/nextflow/src/main/groovy/nextflow/ast/ProcessInputPathXformImpl.groovy new file mode 100644 index 0000000000..b1daa98997 --- /dev/null +++ b/modules/nextflow/src/main/groovy/nextflow/ast/ProcessInputPathXformImpl.groovy @@ -0,0 +1,202 @@ +/* + * Copyright 2013-2024, Seqera Labs + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package nextflow.ast + +import java.lang.reflect.Field +import java.lang.reflect.Modifier +import java.lang.reflect.ParameterizedType +import java.nio.file.Path + +import static org.codehaus.groovy.ast.tools.GeneralUtils.* + +import groovy.transform.CompileStatic +import groovy.util.logging.Slf4j +import org.codehaus.groovy.ast.ASTNode +import org.codehaus.groovy.ast.ClassCodeVisitorSupport +import org.codehaus.groovy.ast.ClassNode +import org.codehaus.groovy.ast.Parameter +import org.codehaus.groovy.ast.VariableScope +import org.codehaus.groovy.ast.expr.ArgumentListExpression +import org.codehaus.groovy.ast.expr.ClassExpression +import org.codehaus.groovy.ast.expr.ClosureExpression +import org.codehaus.groovy.ast.expr.ConstantExpression +import org.codehaus.groovy.ast.expr.Expression +import org.codehaus.groovy.ast.expr.MethodCallExpression +import org.codehaus.groovy.ast.expr.PropertyExpression +import org.codehaus.groovy.ast.expr.VariableExpression +import org.codehaus.groovy.ast.stmt.BlockStatement +import org.codehaus.groovy.ast.stmt.ExpressionStatement +import org.codehaus.groovy.ast.stmt.Statement +import org.codehaus.groovy.control.CompilePhase +import org.codehaus.groovy.control.SourceUnit +import org.codehaus.groovy.syntax.SyntaxException +import org.codehaus.groovy.transform.ASTTransformation +import org.codehaus.groovy.transform.GroovyASTTransformation + +/** + * Inject file staging directives for file inputs in + * process definitions. + * + * Must be done during semantic analysis so that the + * necessary type information is available. + * + * @author Ben Sherman + */ +@Slf4j +@CompileStatic +@GroovyASTTransformation(phase = CompilePhase.SEMANTIC_ANALYSIS) +class ProcessInputPathXformImpl implements ASTTransformation { + + @Override + void visit(ASTNode[] astNodes, SourceUnit unit) { + new DslCodeVisitor(unit).visitClass((ClassNode)astNodes[1]) + } + + @CompileStatic + static class DslCodeVisitor extends ClassCodeVisitorSupport { + + private SourceUnit unit + + DslCodeVisitor(SourceUnit unit) { + this.unit = unit + } + + @Override + protected SourceUnit getSourceUnit() { unit } + + @Override + void visitMethodCallExpression(MethodCallExpression methodCall) { + if( methodCall.objectExpression?.getText() != 'this' ) + return + + final methodName = methodCall.getMethodAsString() + if( methodName != 'process' ) + return + + final args = methodCall.arguments as ArgumentListExpression + final lastArg = args.expressions.size()>0 ? 
args.getExpression(args.expressions.size()-1) : null + + if( lastArg !instanceof ClosureExpression ) { + syntaxError(lastArg, "Invalid process definition, possible syntax error") + return + } + + final closure = (ClosureExpression)lastArg + final block = (BlockStatement)closure.code + + List paramStatements = [] + String currentLabel = null + + for( final stmt : block.statements ) { + currentLabel = stmt.statementLabel ?: currentLabel + if( currentLabel != 'input' ) + continue + if( stmt !instanceof ExpressionStatement ) + continue + final stmtX = (ExpressionStatement)stmt + if( stmtX.expression !instanceof MethodCallExpression ) + continue + final call = (MethodCallExpression)stmtX.expression + if( call.methodAsString != '_typed_in_param' ) + continue + final paramArgs = (ArgumentListExpression)call.arguments + assert paramArgs.size() == 2 + assert paramArgs[0] instanceof ConstantExpression + assert paramArgs[1] instanceof ClassExpression + final varName = ((ConstantExpression)paramArgs[0]).text + final varType = ((ClassExpression)paramArgs[1]).type + + // infer staging directives via reflection + final var = new VariableExpression(varName, varType) + paramStatements.addAll( emitPathInputDecls(var, new TypeDef(var.type)) ) + } + + // prepend additional param statements + paramStatements.addAll(block.statements) + block.statements = paramStatements + } + + protected List emitPathInputDecls( Expression expr, TypeDef typeDef ) { + List result = [] + final type = typeDef.type + + if( isPathType(typeDef) ) { + log.trace "inferring staging directive for path input: ${expr.text}" + final block = new BlockStatement() + block.addStatement( new ExpressionStatement(expr) ) + final closure = new ClosureExpression(Parameter.EMPTY_ARRAY, block) + closure.variableScope = new VariableScope(block.variableScope) + result << new ExpressionStatement( callThisX( 'stageAs', args(closure) ) ) + } + else if( isRecordType(type) ) { + for( final field : type.getDeclaredFields() ) { + if( /* !field.isAccessible() || */ Modifier.isStatic(field.getModifiers()) ) + continue + log.trace "inspecting record type ${type.name}: field=${field.name}, type=${field.type.name}" + result.addAll( emitPathInputDecls(new PropertyExpression(expr, field.name), new TypeDef(field)) ) + } + } + + return result + } + + protected boolean isRecordType(Class type) { + // NOTE: custom parser will be able to detect record types more elegantly + log.trace "is ${type.name} a record type? ${type.package?.name == ''}" + return type.package && type.package.name == '' + } + + protected boolean isPathType(TypeDef typeDef) { + final type = typeDef.type + + log.trace "is ${type.simpleName} a Path? ${type.name == 'java.nio.file.Path'}" + if( Path.isAssignableFrom(type) ) + return true + if( Collection.isAssignableFrom(type) && typeDef.genericTypes ) { + final genericType = typeDef.genericTypes.first() + log.trace "is ${type.simpleName}<${genericType.simpleName}> a Collection? 
${genericType.name == 'java.nio.file.Path'}" + return Path.isAssignableFrom(genericType) + } + return false + } + + protected void syntaxError(ASTNode node, String message) { + int line = node.lineNumber + int coln = node.columnNumber + unit.addError( new SyntaxException(message,line,coln)) + } + + } + + private static class TypeDef { + Class type + List genericTypes + + TypeDef(ClassNode classNode) { + this.type = classNode.getPlainNodeReference().getTypeClass() + if( classNode.getGenericsTypes() ) + this.genericTypes = classNode.getGenericsTypes().collect( el -> el.getType().getPlainNodeReference().getTypeClass() ) + } + + TypeDef(Field field) { + this.type = field.type + if( field.genericType instanceof ParameterizedType ) + this.genericTypes = field.genericType.getActualTypeArguments() as List + } + } + +} diff --git a/modules/nextflow/src/main/groovy/nextflow/ast/ProcessOutputVisitor.groovy b/modules/nextflow/src/main/groovy/nextflow/ast/ProcessOutputVisitor.groovy new file mode 100644 index 0000000000..d5be7f7826 --- /dev/null +++ b/modules/nextflow/src/main/groovy/nextflow/ast/ProcessOutputVisitor.groovy @@ -0,0 +1,144 @@ +/* + * Copyright 2013-2024, Seqera Labs + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package nextflow.ast + +import groovy.transform.CompileStatic +import org.codehaus.groovy.ast.ClassCodeVisitorSupport +import org.codehaus.groovy.ast.expr.ArgumentListExpression +import org.codehaus.groovy.ast.expr.Expression +import org.codehaus.groovy.ast.expr.MapExpression +import org.codehaus.groovy.ast.expr.MethodCallExpression +import org.codehaus.groovy.ast.expr.VariableExpression +import org.codehaus.groovy.ast.stmt.ExpressionStatement +import org.codehaus.groovy.ast.stmt.Statement +import org.codehaus.groovy.control.SourceUnit + +import static org.codehaus.groovy.ast.tools.GeneralUtils.* + +/** + * Extract unstaging directives from process output + * + * @author Ben Sherman + */ +@CompileStatic +class ProcessOutputVisitor extends ClassCodeVisitorSupport { + + final SourceUnit sourceUnit + + final List statements = [] + + private int evalCount = 0 + + private int pathCount = 0 + + ProcessOutputVisitor(SourceUnit unit) { + this.sourceUnit = unit + } + + @Override + void visitMethodCallExpression(MethodCallExpression call) { + extractDirective(call) + super.visitMethodCallExpression(call) + } + + void extractDirective(MethodCallExpression call) { + if( call.objectExpression?.text != 'this' ) + return + + if( call.arguments !instanceof ArgumentListExpression ) + return + + final name = call.methodAsString + final args = (ArgumentListExpression)call.arguments + + /** + * env(name) -> _typed_out_env(name) + */ + if( name == 'env' ) { + if( args.size() != 1 ) + return + + statements << makeDirective( + '_typed_out_env', + args[0] + ) + } + + /** + * eval(cmd) -> _typed_out_eval(key) { cmd } + * -> eval(key) + */ + else if( name == 'eval' ) { + if( args.size() != 1 ) + return + + final key = constX("nxf_out_eval_${evalCount++}".toString()) + + statements << 
makeDirective( + '_typed_out_eval', + key, + closureX(new ExpressionStatement(args[0])) + ) + + call.arguments = new ArgumentListExpression(key) + } + + /** + * path(opts, pattern) -> _typed_out_path(opts, key) { pattern } + * -> path(key) + */ + else if( name == 'path' ) { + def opts = null + def pattern + if( args.size() == 1 ) { + pattern = args[0] + } + else if( args.size() == 2 ) { + opts = args[0] + pattern = args[1] + } + else return + + final key = constX("\$file${pathCount++}".toString()) + + statements << makeDirective( + '_typed_out_path', + opts ?: new MapExpression(), + key, + closureX(new ExpressionStatement(pattern)) + ) + + call.arguments = new ArgumentListExpression(key) + } + } + + Statement makeDirective(String name, Expression... args) { + new ExpressionStatement( + new MethodCallExpression( + new VariableExpression('this'), + name, + new ArgumentListExpression(args) + ) + ) + } + + @Override + protected SourceUnit getSourceUnit() { + return sourceUnit + } + +} diff --git a/modules/nextflow/src/main/groovy/nextflow/dag/DAG.groovy b/modules/nextflow/src/main/groovy/nextflow/dag/DAG.groovy index 0289f948a8..69d0c1239c 100644 --- a/modules/nextflow/src/main/groovy/nextflow/dag/DAG.groovy +++ b/modules/nextflow/src/main/groovy/nextflow/dag/DAG.groovy @@ -30,15 +30,8 @@ import nextflow.NF import nextflow.extension.CH import nextflow.extension.DataflowHelper import nextflow.processor.TaskProcessor -import nextflow.script.params.DefaultInParam -import nextflow.script.params.DefaultOutParam -import nextflow.script.params.EachInParam -import nextflow.script.params.InParam -import nextflow.script.params.InputsList -import nextflow.script.params.OutParam -import nextflow.script.params.OutputsList -import nextflow.script.params.TupleInParam -import nextflow.script.params.TupleOutParam +import nextflow.script.ProcessInputs +import nextflow.script.ProcessOutputs import java.util.concurrent.atomic.AtomicLong @@ -96,9 +89,9 @@ class DAG { * @param inputs The list of inputs entering in the process * @param outputs the list of outputs leaving the process */ - void addProcessNode( String label, InputsList inputs, OutputsList outputs, TaskProcessor process=null ) { + void addProcessNode( String label, ProcessInputs inputs, ProcessOutputs outputs, TaskProcessor process=null ) { assert label - assert inputs + assert inputs!=null assert outputs addVertex( Type.PROCESS, label, normalizeInputs(inputs), normalizeOutputs(outputs), process ) } @@ -112,7 +105,7 @@ class DAG { */ void addOperatorNode( String label, inputs, outputs, List operators=null ) { assert label - assert inputs + assert inputs!=null addVertex(Type.OPERATOR, label, normalizeChannels(inputs), normalizeChannels(outputs), operators ) } @@ -235,32 +228,18 @@ class DAG { } - private List normalizeInputs( InputsList inputs ) { + private List normalizeInputs( ProcessInputs inputs ) { - inputs - .findAll { !( it instanceof DefaultInParam) } - .collect { InParam p -> new ChannelHandler(channel: p.rawChannel, label: inputName0(p)) } - - } - - private String inputName0(InParam param) { - if( param instanceof TupleInParam ) return null - if( param instanceof EachInParam ) return null - return param.name + inputs.collect { p -> + new ChannelHandler(channel: p.getChannel(), label: p.getName()) + } } - private List normalizeOutputs( OutputsList outputs ) { + private List normalizeOutputs( ProcessOutputs outputs ) { - def result = [] - for(OutParam p :outputs) { - if( p instanceof DefaultOutParam ) - break - final it = p.getOutChannel() - if( 
it!=null ) - result << new ChannelHandler(channel: it, label: p instanceof TupleOutParam ? null : p.name) + outputs.collect { p -> + new ChannelHandler(channel: p.getChannel(), label: p.getName()) } - - return result } private List normalizeChannels( entry ) { diff --git a/modules/nextflow/src/main/groovy/nextflow/dag/NodeMarker.groovy b/modules/nextflow/src/main/groovy/nextflow/dag/NodeMarker.groovy index c7885ee71a..7cce95ab86 100644 --- a/modules/nextflow/src/main/groovy/nextflow/dag/NodeMarker.groovy +++ b/modules/nextflow/src/main/groovy/nextflow/dag/NodeMarker.groovy @@ -21,8 +21,8 @@ import groovyx.gpars.dataflow.operator.DataflowProcessor import nextflow.Global import nextflow.Session import nextflow.processor.TaskProcessor -import nextflow.script.params.InputsList -import nextflow.script.params.OutputsList +import nextflow.script.ProcessInputs +import nextflow.script.ProcessOutputs /** * Helper class to mark DAG node with the proper labels * @@ -46,7 +46,7 @@ class NodeMarker { * @param inputs The list of inputs entering in the process * @param outputs the list of outputs leaving the process */ - static void addProcessNode( TaskProcessor process, InputsList inputs, OutputsList outputs ) { + static void addProcessNode( TaskProcessor process, ProcessInputs inputs, ProcessOutputs outputs ) { if( session && session.dag && !session.aborted ) session.dag.addProcessNode( process.name, inputs, outputs, process ) } diff --git a/modules/nextflow/src/main/groovy/nextflow/exception/ProcessEvalException.groovy b/modules/nextflow/src/main/groovy/nextflow/exception/ProcessEvalException.groovy index a2babd2b2e..995112411a 100644 --- a/modules/nextflow/src/main/groovy/nextflow/exception/ProcessEvalException.groovy +++ b/modules/nextflow/src/main/groovy/nextflow/exception/ProcessEvalException.groovy @@ -1,5 +1,5 @@ /* - * Copyright 2013-2023, Seqera Labs + * Copyright 2013-2024, Seqera Labs * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. diff --git a/modules/nextflow/src/main/groovy/nextflow/extension/MergeWithEachOp.groovy b/modules/nextflow/src/main/groovy/nextflow/extension/MergeWithEachOp.groovy new file mode 100644 index 0000000000..926942d3ac --- /dev/null +++ b/modules/nextflow/src/main/groovy/nextflow/extension/MergeWithEachOp.groovy @@ -0,0 +1,150 @@ +/* + * Copyright 2013-2024, Seqera Labs + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package nextflow.extension + +import java.util.concurrent.atomic.AtomicInteger + +import groovy.transform.CompileStatic +import groovy.util.logging.Slf4j +import groovyx.gpars.dataflow.DataflowQueue +import groovyx.gpars.dataflow.DataflowReadChannel +import groovyx.gpars.dataflow.DataflowVariable +import groovyx.gpars.dataflow.DataflowWriteChannel +import nextflow.Channel +/** + * Operator for merging many source channels into a single channel, + * with the option to combine channels that are marked as "iterators". 
+ * + * @see ProcessDef#collectInputs(Object[]) + * + * @author Ben Sherman + */ +@Slf4j +@CompileStatic +class MergeWithEachOp { + + private List sources + + private List iterators + + /** + * List of queues to receive values from source channels. + */ + private List queues = [] + + /** + * Mask of source channels that are singletons. + */ + private List singletons + + /** + * True when all source channels are singletons and therefore + * the operator should emit a singleton channel. + */ + private boolean emitSingleton + + /** + * True when all source channels are iterators and therefore + * the operator should simply emit the combinations. + */ + private boolean emitCombination + + private transient List combinations + + MergeWithEachOp(List sources, List iterators) { + this.sources = sources + this.iterators = iterators + this.queues = sources.collect( ch -> [] ) + this.singletons = sources.collect( ch -> !CH.isChannelQueue(ch) ) + this.emitSingleton = iterators.size() == 0 && singletons.every() + this.emitCombination = iterators.size() > 0 && singletons.every() + } + + DataflowWriteChannel apply() { + final target = emitSingleton + ? new DataflowVariable() + : new DataflowQueue() + final counter = new AtomicInteger(sources.size()) + for( int i = 0; i < sources.size(); i++ ) + DataflowHelper.subscribeImpl( sources[i], eventsMap(i, target, counter) ) + + return target + } + + private Map eventsMap(int index, DataflowWriteChannel target, AtomicInteger counter) { + final opts = new LinkedHashMap(2) + opts.onNext = this.&take.curry(target, index) + opts.onComplete = { + if( counter.decrementAndGet() == 0 && !emitSingleton && !emitCombination ) + target.bind(Channel.STOP) + } + return opts + } + + private synchronized void take(DataflowWriteChannel target, int index, Object value) { + queues[index].add(value) + + // wait until every source has a value + if( queues.any(q -> q.size() == 0) ) + return + + // emit singleton value if every source is a singleton + if( emitSingleton ) { + final args = queues.collect(q -> q.first()) + target.bind(args) + return + } + + // emit combinations once if every source is an iterator + if( emitCombination ) { + emit(target) + target.bind(Channel.STOP) + return + } + + // otherwise emit as many items as are available + while( queues.every(q -> q.size() > 0) ) + emit(target) + } + + private void emit(DataflowWriteChannel target) { + // emit the next item if there are no iterators + if( iterators.size() == 0 ) { + final args = (0.. + singletons[i] ? queues[i].first() : queues[i].pop() + ) + target.bind(args) + return + } + + // otherwise emit an item for every iterator combination + if( combinations == null ) + combinations = iterators.collect( i -> queues[i].first() ).combinations() + + final args = (0.. + i in iterators + ? null + : singletons[i] ? queues[i].first() : queues[i].pop() + ) + for( List entries : combinations ) { + for( int k = 0; k < entries.size(); k++ ) + args[iterators[k]] = entries[k] + + target.bind(new ArrayList(args)) + } + } +} diff --git a/modules/nextflow/src/main/groovy/nextflow/processor/ForwardClosure.groovy b/modules/nextflow/src/main/groovy/nextflow/processor/ForwardClosure.groovy deleted file mode 100644 index 7b1a8e01d0..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/processor/ForwardClosure.groovy +++ /dev/null @@ -1,107 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.processor - -import groovy.transform.CompileStatic -import groovyx.gpars.dataflow.operator.DataflowProcessor - -/** - * Implements the closure which *combines* all the iteration - * - * @param numOfInputs Number of in/out channel - * @param indexes The list of indexes which identify the position of iterators in the input channels - * @return The closure implementing the iteration/forwarding logic - */ -@CompileStatic -class ForwardClosure extends Closure { - - final private Integer len - - final private int numOfParams - - final private List indexes - - ForwardClosure(int len, List indexes) { - super(null, null); - this.len = len - this.numOfParams = len+1 - this.indexes = indexes - } - - @Override - int getMaximumNumberOfParameters() { - numOfParams - } - - @Override - Class[] getParameterTypes() { - def result = new Class[numOfParams] - for( int i=0; i cmb = itr.combinations() - - for( int i=0; i files, TaskRun task ) { + void apply( TaskRun task ) { - if( !files || !enabled ) + if( !task.outputFiles || !enabled ) return if( !path ) @@ -306,7 +305,7 @@ class PublishDir { this.stageInMode = task.config.stageInMode this.task = task - apply0(files) + apply0(task.outputFiles) } protected void apply1(Path source, boolean inProcess ) { diff --git a/modules/nextflow/src/main/groovy/nextflow/processor/TaskConfig.groovy b/modules/nextflow/src/main/groovy/nextflow/processor/TaskConfig.groovy index 25c24b7d20..63e8db1057 100644 --- a/modules/nextflow/src/main/groovy/nextflow/processor/TaskConfig.groovy +++ b/modules/nextflow/src/main/groovy/nextflow/processor/TaskConfig.groovy @@ -16,13 +16,12 @@ package nextflow.processor -import nextflow.util.CmdLineOptionMap - import static nextflow.processor.TaskProcessor.* import java.nio.file.Path import groovy.transform.CompileStatic +import groovy.transform.PackageScope import nextflow.Const import nextflow.ast.NextflowDSLImpl import nextflow.exception.AbortOperationException @@ -31,8 +30,10 @@ import nextflow.executor.BashWrapperBuilder import nextflow.executor.res.AcceleratorResource import nextflow.executor.res.DiskResource import nextflow.k8s.model.PodOptions +import nextflow.script.LazyMap import nextflow.script.TaskClosure import nextflow.util.CmdLineHelper +import nextflow.util.CmdLineOptionMap import nextflow.util.Duration import nextflow.util.MemoryUnit /** @@ -522,195 +523,3 @@ class TaskConfig extends LazyMap implements Cloneable { } } - -/** - * A map that resolve closure and gstring in a lazy manner - */ -@CompileStatic -class LazyMap implements Map { - - /** The target map holding the values */ - @Delegate - private Map target - - /** The context map against which dynamic properties are resolved */ - private Map binding - - private boolean dynamic - - protected boolean isDynamic() { dynamic } - - protected void setDynamic(boolean val) { dynamic = val } - - protected Map getBinding() { binding } - - protected void setBinding(Map map) { this.binding = map } - - protected Map getTarget() { target } - - protected void setTarget(Map obj) { this.target = obj } - - LazyMap() { - target = new HashMap<>() - } - 
- LazyMap( Map entries ) { - assert entries != null - target = new HashMap<>() - putAll(entries) - } - - /** - * Resolve a directive *dynamic* value i.e. defined with a closure or lazy string - * - * @param name The directive name - * @param value The value to be resolved - * @return The resolved value - */ - protected resolve( String name, value ) { - - /* - * directive with one value and optional named parameter are converted - * to a list object in which the first element is a map holding the named parameters - * and the second is the directive value - */ - if( value instanceof ConfigList ) { - def copy = new ArrayList(value.size()) - for( Object item : value ) { - if( item instanceof Map ) - copy.add( resolveParams(name, item as Map) ) - else - copy.add( resolveImpl(name, item) ) - } - return copy - } - - /* - * resolve the values in a map object - * note: 'ext' property is meant for extension attributes - * as it should be preserved as LazyMap - */ - else if( value instanceof Map && name!='ext' ) { - return resolveParams(name, value) - } - - /* - * simple value - */ - else { - return resolveImpl(name, value) - } - - } - - /** - * Resolve directive *dynamic* named params - * - * @param name The directive name - * @param value The map holding the named params - * @return A map in which dynamic params are resolved to the actual value - */ - private resolveParams( String name, Map value ) { - - final copy = new LinkedHashMap() - final attr = (value as Map) - for( Entry entry : attr.entrySet() ) { - copy[entry.key] = resolveImpl(name, entry.value, true) - } - return copy - } - - /** - * Resolve a directive dynamic value - * - * @param name The directive name - * @param value The value to be resolved - * @param param When {@code true} points that it is a named parameter value, thus closure are only cloned - * @return The resolved directive value - */ - private resolveImpl( String name, value, boolean param=false ) { - - if( value instanceof Closure ) { - def copy = value.cloneWith(getBinding()) - if( param ) { - return copy - } - - try { - return copy.call() - } - catch( MissingPropertyException e ) { - if( getBinding() == null ) throw new IllegalStateException("Directive `$name` doesn't support dynamic value (or context not yet initialized)") - else throw e - } - } - - else if( value instanceof GString ) { - return value.cloneAsLazy(getBinding()).toString() - } - - return value - } - - /** - * Override the get method in such a way that {@link Closure} values are resolved against - * the {@link #binding} map - * - * @param key The map entry key - * @return The associated value - */ - Object get( key ) { - return getValue(key) - } - - Object getValue(Object key) { - final value = target.get(key) - return resolve(key as String, value) - } - - Object put( String key, Object value ) { - if( value instanceof Closure ) { - dynamic |= true - } - else if( value instanceof GString ) { - for( int i=0; i put(k as String, v) } - } - - @Override - String toString() { - final allKeys = keySet() - final result = new ArrayList(allKeys.size()) - for( String key : allKeys ) { result << "$key: ${getProperty(key)}".toString() } - result.join('; ') - } - -} - -@CompileStatic -class ConfigList implements List { - - @Delegate - private List target - - ConfigList() { - target = [] - } - - ConfigList(int size) { - target = new ArrayList(size) - } - - ConfigList(Collection items) { - target = new ArrayList(items) - } - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/processor/TaskEnvCollector.groovy 
b/modules/nextflow/src/main/groovy/nextflow/processor/TaskEnvCollector.groovy new file mode 100644 index 0000000000..4e21763f66 --- /dev/null +++ b/modules/nextflow/src/main/groovy/nextflow/processor/TaskEnvCollector.groovy @@ -0,0 +1,80 @@ +/* + * Copyright 2013-2024, Seqera Labs + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package nextflow.processor + +import java.nio.file.Path +import java.util.regex.Matcher + +import groovy.transform.CompileStatic +import nextflow.exception.ProcessEvalException +/** + * Implements the collection of environment variables + * from the environment of a task execution. + * + * @author Paolo Di Tommaso + * @author Ben Sherman + */ +@CompileStatic +class TaskEnvCollector { + + private Path workDir + + private Map evalCmds + + TaskEnvCollector(Path workDir, Map evalCmds) { + this.workDir = workDir + this.evalCmds = evalCmds + } + + Map collect() { + final env = workDir.resolve(TaskRun.CMD_ENV).text + final result = new HashMap(50) + Matcher matcher + // `current` represents the current capturing env variable name + String current = null + for( String line : env.readLines() ) { + // Opening condition: + // line should match a KEY=VALUE syntax + if( !current && (matcher = (line=~/([a-zA-Z_][a-zA-Z0-9_]*)=(.*)/)) ) { + final key = matcher.group(1) + final value = matcher.group(2) + if (!key) continue + result.put(key, value) + current = key + } + // Closing condition: + // line should match /KEY/ or /KEY/=exit_status + else if( current && (matcher = (line=~/\/${current}\/(?:=exit:(\d+))?/)) ) { + final status = matcher.group(1) as Integer ?: 0 + // when exit status is defined and it is a non-zero, it should be interpreted + // as a failure of the execution of the output command; in this case the variable + // holds the std error message + if( evalCmds != null && status ) { + final cmd = evalCmds.get(current) + final out = result[current] + throw new ProcessEvalException("Unable to evaluate output", cmd, out, status) + } + // reset current key + current = null + } + else if( current && line != null ) { + result[current] += '\n' + line + } + } + return result + } +} diff --git a/modules/nextflow/src/main/groovy/nextflow/processor/TaskFileCollector.groovy b/modules/nextflow/src/main/groovy/nextflow/processor/TaskFileCollector.groovy new file mode 100644 index 0000000000..b5b00cab01 --- /dev/null +++ b/modules/nextflow/src/main/groovy/nextflow/processor/TaskFileCollector.groovy @@ -0,0 +1,149 @@ +/* + * Copyright 2013-2024, Seqera Labs + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package nextflow.processor + +import java.nio.file.LinkOption +import java.nio.file.Path +import java.nio.file.NoSuchFileException + +import groovy.transform.CompileStatic +import groovy.util.logging.Slf4j +import nextflow.exception.IllegalArityException +import nextflow.exception.MissingFileException +import nextflow.file.FileHelper +import nextflow.file.FilePatternSplitter +import nextflow.script.ProcessFileOutput +/** + * Implements the collection of files from the work directory + * of a task execution. + * + * @author Paolo Di Tommaso + * @author Ben Sherman + */ +@Slf4j +@CompileStatic +class TaskFileCollecter { + + private ProcessFileOutput param + + private TaskRun task + + private Path workDir + + TaskFileCollecter(ProcessFileOutput param, TaskRun task) { + this.param = param + this.task = task + this.workDir = task.getTargetDir() + } + + Object collect() { + final List allFiles = [] + final filePatterns = param.getFilePatterns(task.context, workDir) + boolean inputsExcluded = false + + for( String filePattern : filePatterns ) { + List result = null + + final splitter = param.glob ? FilePatternSplitter.glob().parse(filePattern) : null + if( splitter?.isPattern() ) { + result = fetchResultFiles(filePattern, workDir) + if( result && !param.includeInputs ) { + result = excludeStagedInputs(task, result) + log.trace "Process ${task.lazyName()} > after removing staged inputs: ${result}" + inputsExcluded |= (result.size()==0) + } + } + else { + final path = param.glob ? splitter.strip(filePattern) : filePattern + final file = workDir.resolve(path) + final exists = checkFileExists(file) + if( exists ) + result = List.of(file) + else + log.debug "Process `${task.lazyName()}` is unable to find [${file.class.simpleName}]: `$file` (pattern: `$filePattern`)" + } + + if( result ) + allFiles.addAll(result) + + else if( !param.optional && (!param.arity || param.arity.min > 0) ) { + def msg = "Missing output file(s) `$filePattern` expected by process `${task.lazyName()}`" + if( inputsExcluded ) + msg += " (note: input files are not included in the default matching set)" + throw new MissingFileException(msg) + } + } + + if( !param.isValidArity(allFiles.size()) ) + throw new IllegalArityException("Incorrect number of output files for process `${task.lazyName()}` -- expected ${param.arity}, found ${allFiles.size()}") + + return allFiles.size()==1 && param.isSingle() ? allFiles[0] : allFiles + } + + /** + * Collect the file(s) matching the specified name or glob pattern + * in the given task work directory. + * + * @param pattern + * @param workDir + */ + protected List fetchResultFiles(String pattern, Path workDir) { + final opts = [ + relative: false, + hidden: param.hidden ?: pattern.startsWith('.'), + followLinks: param.followLinks, + maxDepth: param.maxDepth, + type: param.type ? param.type : ( pattern.contains('**') ? 
'file' : 'any' )
+        ]
+
+        List files = []
+        try {
+            FileHelper.visitFiles(opts, workDir, pattern) { Path it -> files.add(it) }
+        }
+        catch( NoSuchFileException e ) {
+            throw new MissingFileException("Cannot access directory: '$workDir'", e)
+        }
+
+        return files.sort()
+    }
+
+    /**
+     * Remove each path in the given list whose name matches the name of
+     * an input file for the specified {@code TaskRun}
+     *
+     * @param task
+     * @param collectedFiles
+     */
+    protected List excludeStagedInputs(TaskRun task, List collectedFiles) {
+
+        final List allStagedFiles = task.inputFiles.collect { it.stageName }
+        final List result = new ArrayList<>(collectedFiles.size())
+
+        for( int i = 0; i < collectedFiles.size(); i++ ) {
+            final file = collectedFiles.get(i)
+            final relativeName = workDir.relativize(file).toString()
+            if( !allStagedFiles.contains(relativeName) )
+                result.add(file)
+        }
+
+        return result
+    }
+
+    protected boolean checkFileExists(Path file) {
+        param.followLinks ? file.exists() : file.exists(LinkOption.NOFOLLOW_LINKS)
+    }
+}
diff --git a/modules/nextflow/src/main/groovy/nextflow/processor/TaskOutputCollector.groovy b/modules/nextflow/src/main/groovy/nextflow/processor/TaskOutputCollector.groovy
new file mode 100644
index 0000000000..ffeb83106f
--- /dev/null
+++ b/modules/nextflow/src/main/groovy/nextflow/processor/TaskOutputCollector.groovy
@@ -0,0 +1,139 @@
+/*
+ * Copyright 2013-2024, Seqera Labs
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package nextflow.processor
+
+import java.nio.file.Path
+
+import groovy.transform.CompileDynamic
+import groovy.transform.CompileStatic
+import groovy.transform.Memoized
+import groovy.util.logging.Slf4j
+import nextflow.exception.MissingFileException
+import nextflow.exception.MissingValueException
+import nextflow.script.ProcessOutputs
+import nextflow.script.ScriptType
+import org.codehaus.groovy.runtime.InvokerHelper
+/**
+ * Implements the resolution of task outputs
+ *
+ * @author Ben Sherman
+ */
+@Slf4j
+@CompileStatic
+class TaskOutputCollector implements Map {
+
+    private ProcessOutputs declaredOutputs
+
+    private boolean optional
+
+    private TaskRun task
+
+    @Delegate
+    private Map delegate
+
+    TaskOutputCollector(ProcessOutputs declaredOutputs, boolean optional, TaskRun task) {
+        this.declaredOutputs = declaredOutputs
+        this.optional = optional
+        this.task = task
+        this.delegate = task.context
+    }
+
+    /**
+     * Get an environment variable from the task environment.
+     *
+     * @param name
+     */
+    String env(String name) {
+        final result = env0(task.workDir).get(name)
+
+        if( result == null && !optional )
+            throw new MissingValueException("Missing environment variable: $name")
+
+        return result
+    }
+
+    /**
+     * Get the result of an eval command from the task environment.
+     *
+     * @param name
+     */
+    String eval(String name) {
+        final evalCmds = task.getOutputEvals()
+        final result = env0(task.workDir, evalCmds).get(name)
+
+        if( result == null && !optional )
+            throw new MissingValueException("Missing result of eval command: '${evalCmds.get(name)}'")
+
+        return result
+    }
+
+    @Memoized(maxCacheSize = 10_000)
+    static private Map env0(Path workDir, Map evalCmds=null) {
+        new TaskEnvCollector(workDir, evalCmds).collect()
+    }
+
+    /**
+     * Get a file or list of files from the task work directory.
+     *
+     * @param key
+     */
+    Object path(String key) {
+        final param = declaredOutputs.getFiles().get(key)
+        final result = new TaskFileCollecter(param, task).collect()
+
+        if( result instanceof Path )
+            task.outputFiles.add(result)
+        else if( result instanceof Collection )
+            task.outputFiles.addAll(result)
+
+        return result
+    }
+
+    /**
+     * Get the standard output of the task.
+     */
+    Object stdout() {
+        final value = task.@stdout
+
+        if( value == null && task.type == ScriptType.SCRIPTLET )
+            throw new IllegalArgumentException("Missing 'stdout' for process > ${task.lazyName()}")
+
+        if( value instanceof Path && !value.exists() )
+            throw new MissingFileException("Missing 'stdout' file: ${value.toUriString()} for process > ${task.lazyName()}")
+
+        return value instanceof Path ? ((Path)value).text : value?.toString()
+    }
+
+    /**
+     * Get a variable from the task context.
+     *
+     * @param name
+     */
+    @Override
+    @CompileDynamic
+    Object get(Object name) {
+        if( name == 'stdout' )
+            return stdout()
+
+        try {
+            return InvokerHelper.getProperty(delegate, name)
+        }
+        catch( MissingPropertyException e ) {
+            throw new MissingValueException("Missing variable in process output: ${e.property}")
+        }
+    }
+}
diff --git a/modules/nextflow/src/main/groovy/nextflow/processor/TaskProcessor.groovy b/modules/nextflow/src/main/groovy/nextflow/processor/TaskProcessor.groovy
index 3e332569ab..cf02149172 100644
--- a/modules/nextflow/src/main/groovy/nextflow/processor/TaskProcessor.groovy
+++ b/modules/nextflow/src/main/groovy/nextflow/processor/TaskProcessor.groovy
@@ -19,13 +19,11 @@ import static nextflow.processor.ErrorStrategy.*
 import java.lang.reflect.InvocationTargetException
 import java.nio.file.FileSystems
-import java.nio.file.LinkOption
 import java.nio.file.NoSuchFileException
 import java.nio.file.Path
 import java.nio.file.Paths
 import java.util.concurrent.atomic.AtomicBoolean
 import java.util.concurrent.atomic.AtomicInteger
-import java.util.concurrent.atomic.AtomicIntegerArray
 import java.util.concurrent.atomic.LongAdder
 import java.util.regex.Matcher
 import java.util.regex.Pattern
@@ -76,33 +74,15 @@ import nextflow.extension.CH
 import nextflow.extension.DataflowHelper
 import nextflow.file.FileHelper
 import nextflow.file.FileHolder
-import nextflow.file.FilePatternSplitter
 import nextflow.file.FilePorter
 import nextflow.script.BaseScript
 import nextflow.script.BodyDef
+import nextflow.script.LazyHelper
 import nextflow.script.ProcessConfig
 import nextflow.script.ScriptMeta
 import nextflow.script.ScriptType
 import nextflow.script.TaskClosure
 import nextflow.script.bundle.ResourcesBundle
-import nextflow.script.params.BaseOutParam
-import nextflow.script.params.CmdEvalParam
-import nextflow.script.params.DefaultOutParam
-import nextflow.script.params.EachInParam
-import nextflow.script.params.EnvInParam
-import nextflow.script.params.EnvOutParam
-import nextflow.script.params.FileInParam
-import nextflow.script.params.FileOutParam
-import nextflow.script.params.InParam
-import nextflow.script.params.MissingParam
-import nextflow.script.params.OptionalParam
-import nextflow.script.params.OutParam
-import nextflow.script.params.StdInParam
-import nextflow.script.params.StdOutParam
-import nextflow.script.params.TupleInParam
-import nextflow.script.params.TupleOutParam
-import nextflow.script.params.ValueInParam
-import nextflow.script.params.ValueOutParam
 import nextflow.util.ArrayBag
 import nextflow.util.BlankSeparatedList
 import nextflow.util.CacheHelper
@@ -209,10 +189,6 @@ class TaskProcessor {
      */
     protected volatile boolean completed
 
-    protected boolean allScalarValues
-
-    protected boolean hasEachParams
-
     /**
      * The state is maintained by using an agent
      */
@@ -229,10 +205,9 @@ class TaskProcessor {
     protected boolean singleton
 
     /**
-     * Track the status of input ports. When 1 the port is open (waiting for data),
-     * when 0 the port is closed (ie. received the STOP signal)
+     * Whether the process has been closed (ie. received the STOP signal)
      */
-    protected AtomicIntegerArray openPorts
+    protected AtomicBoolean closed = new AtomicBoolean(false)
 
     /**
     * Process ID number. The first is 1, the second 2 and so on ..
@@ -399,57 +374,12 @@ class TaskProcessor {
         log.warn(msg)
     }
 
-    /**
-     * Launch the 'script' define by the code closure as a local bash script
-     *
-     * @param code A {@code Closure} returning a bash script e.g.
-     *              <pre>
-     *              {
-     *                 """
-     *                 #!/bin/bash
-     *                 do this ${x}
-     *                 do that ${y}
-     *                 :
-     *                 """
-     *              }
-     *
-     * @return {@code this} instance
-     */
-    def run() {
+    def run(DataflowReadChannel source) {
 
         // -- check that the task has a body
         if ( !taskBody )
             throw new IllegalStateException("Missing task body for process `$name`")
 
-        // -- check that input tuple defines at least two elements
-        def invalidInputTuple = config.getInputs().find { it instanceof TupleInParam && it.inner.size()<2 }
-        if( invalidInputTuple )
-            checkWarn "Input `tuple` must define at least two elements -- Check process `$name`"
-
-        // -- check that output tuple defines at least two elements
-        def invalidOutputTuple = config.getOutputs().find { it instanceof TupleOutParam && it.inner.size()<2 }
-        if( invalidOutputTuple )
-            checkWarn "Output `tuple` must define at least two elements -- Check process `$name`"
-
-        /**
-         * Verify if this process run only one time
-         */
-        allScalarValues = config.getInputs().allScalarInputs()
-        hasEachParams = config.getInputs().any { it instanceof EachInParam }
-
-        /*
-         * Normalize input channels
-         */
-        config.fakeInput()
-
-        /*
-         * Normalize the output
-         * - even though the output may be empty, let return the stdout as output by default
-         */
-        if ( config.getOutputs().size() == 0 ) {
-            config.fakeOutput()
-        }
-
         // the state agent
         state = new Agent<>(new StateObj(name))
         state.addListener { StateObj old, StateObj obj ->
@@ -471,7 +401,7 @@ class TaskProcessor {
         session.processRegister(this)
 
         // create the underlying dataflow operator
-        createOperator()
+        createOperator(source)
 
         session.notifyProcessCreate(this)
 
@@ -479,31 +409,19 @@ class TaskProcessor {
          * When there is a single output channel, return let returns that item
          * otherwise return the list
          */
-        def result = config.getOutputs().channels
+        final result = config.getOutputs().getChannels()
         return result.size() == 1 ? result[0] : result
     }
 
-    /**
-     * Template method which extending classes have to override in order to
-     * create the underlying *dataflow* operator associated with this processor
-     *
-     * See {@code DataflowProcessor}
-     */
+    protected void createOperator(DataflowReadChannel source) {
+        // determine whether the process is executed only once
+        this.singleton = !CH.isChannelQueue(source)
 
-    protected void createOperator() {
-        def opInputs = new ArrayList(config.getInputs().getChannels())
+        // create inputs with control channel
+        final control = CH.queue()
+        control.bind(Boolean.TRUE)
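+        // an initial control message is required for the operator to fire; the interceptor
+        // binds further control messages (or a poison pill) as source items arrive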
 
-        /*
-         * check if there are some iterators declaration
-         * the list holds the index in the list of all *inputs* for the {@code each} declaration
-         */
-        List iteratorIndexes = []
-        config.getInputs().eachWithIndex { param, index ->
-            if( param instanceof EachInParam ) {
-                log.trace "Process ${name} > got each param: ${param.name} at index: ${index} -- ${param.dump()}"
-                iteratorIndexes << index
-            }
-        }
+        final opInputs = [source, control]
 
         /**
          * The thread pool used by GPars. The thread pool to be used is set in the static
@@ -511,99 +429,24 @@ class TaskProcessor {
          */
         final PGroup group = Dataflow.retrieveCurrentDFPGroup()
 
-        /*
-         * When one (or more) {@code each} are declared as input, it is created an extra
-         * operator which will receive the inputs from the channel (excepts the values over iterate)
-         *
-         * The operator will *expand* the received inputs, iterating over the user provided value and
-         * forwarding the final values the the second *parallel* processor executing the user specified task
-         */
-        if( iteratorIndexes ) {
-            log.debug "Creating *combiner* operator for each param(s) at index(es): ${iteratorIndexes}"
-
-            // don't care about the last channel, being the control channel it doesn't bring real values
-            final size = opInputs.size()-1
-
-            // the iterator operator needs to executed just one time
-            // thus add a dataflow queue binding a single value and then a stop signal
-            def termination = new DataflowQueue<>()
-            termination << Boolean.TRUE
-            opInputs[size] = termination
-
-            // the channel forwarding the data from the *iterator* process to the target task
-            final linkingChannels = new ArrayList(size)
-            size.times { linkingChannels[it] = new DataflowQueue() }
-
-            // the script implementing the iterating process
-            final forwarder = new ForwardClosure(size, iteratorIndexes)
-
-            // instantiate the iteration process
-            def DataflowOperator op1
-            def stopAfterFirstRun = allScalarValues
-            def interceptor = new BaseProcessInterceptor(opInputs, stopAfterFirstRun)
-            def params = [inputs: opInputs, outputs: linkingChannels, maxForks: 1, listeners: [interceptor]]
-            session.allOperators << (op1 = new DataflowOperator(group, params, forwarder))
-            // fix issue #41
-            start(op1)
-
-            // set as next inputs the result channels of the iteration process
-            // adding the 'control' channel removed previously
-            opInputs = new ArrayList(size+1)
-            opInputs.addAll( linkingChannels )
-            opInputs.add( config.getInputs().getChannels().last() )
-        }
-
-        /*
-         * finally create the operator
-         */
         // note: do not specify the output channels in the operator declaration
         // this allows us to manage them independently from the operator life-cycle
-        this.singleton = allScalarValues && !hasEachParams
-        this.openPorts = createPortsArray(opInputs.size())
-        config.getOutputs().setSingleton(singleton)
-        def interceptor = new TaskProcessorInterceptor(opInputs, singleton)
+        def interceptor = new TaskProcessorInterceptor(source, control, singleton)
         def params = [inputs: opInputs, maxForks: session.poolSize, listeners: [interceptor] ]
-        def invoke = new InvokeTaskAdapter(this, opInputs.size())
-        session.allOperators << (operator = new DataflowOperator(group, params, invoke))
+        this.operator = new DataflowOperator(group, params, this.&invokeTask)
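+        // the operator passes each set of incoming messages to invokeTask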
 
         // notify the creation of a new vertex the execution DAG
         NodeMarker.addProcessNode(this, config.getInputs(), config.getOutputs())
 
-        // fix issue #41
-        start(operator)
-    }
-
-    private start(DataflowProcessor op) {
-        if( !NF.dsl2 ) {
-            op.start()
-            return
-        }
+        // register the operator and defer its start to the session igniter
+        session.allOperators << operator
         session.addIgniter {
             log.debug "Starting process > $name"
-            op.start()
+            operator.start()
         }
     }
 
-    private AtomicIntegerArray createPortsArray(int size) {
-        def result = new AtomicIntegerArray(size)
-        for( int i=0; i $name with params=$params; values=$values"
 
@@ -612,13 +455,8 @@ class TaskProcessor {
         // -- set the task instance as the current in this thread
         currentTask.set(task)
 
-        // -- validate input lengths
-        validateInputTuples(values)
-
         // -- map the inputs to a map and use to delegate closure values interpolation
-        final secondPass = [:]
-        int count = makeTaskContextStage1(task, secondPass, values)
-        makeTaskContextStage2(task, secondPass, count)
+        resolveTaskInputs(task, values)
 
         // verify that `when` guard, when specified, is satisfied
         if( !checkWhenGuard(task) )
@@ -642,28 +480,6 @@ class TaskProcessor {
         checkCachedOrLaunchTask(task, hash, resumable)
     }
 
-    @Memoized
-    private List getDeclaredInputTuple() {
-        getConfig().getInputs().ofType(TupleInParam)
-    }
-
-    protected void validateInputTuples( List values ) {
-
-        def declaredSets = getDeclaredInputTuple()
-        for( int i=0; i
-            if( param instanceof TupleInParam )
-                param.inner.each { task.setInput(it)  }
-            else if( param instanceof EachInParam )
-                task.setInput(param.inner)
-            else
-                task.setInput(param)
-        }
-
-        config.getOutputs().each { OutParam param ->
-            if( param instanceof TupleOutParam ) {
-                param.inner.each { task.setOutput(it) }
-            }
-            else
-                task.setOutput(param)
-        }
-
         return task
     }
 
@@ -844,17 +640,7 @@ class TaskProcessor {
         }
 
         // -- when store path is set, only output params of type 'file' can be specified
-        final ctx = task.context
-        def invalid = task.getOutputs().keySet().any {
-            if( it instanceof ValueOutParam ) {
-                return !ctx.containsKey(it.name)
-            }
-            if( it instanceof FileOutParam ) {
-                return false
-            }
-            return true
-        }
-        if( invalid ) {
+        if( config.getOutputs().getFiles().size() == 0 ) {
             checkWarn "[${safeTaskName(task)}] StoreDir can only be used when using 'file' outputs"
             return false
         }
@@ -877,7 +663,7 @@ class TaskProcessor {
             task.cached = true
             session.notifyTaskCached(new StoredTaskHandler(task))
 
-            // -- now bind the results
+            // -- now emit the results
             finalizeTask0(task)
             return true
         }
@@ -943,16 +729,16 @@ class TaskProcessor {
         }
 
         try {
-            // -- expose task exit status to make accessible as output value
+            // -- set task properties in order to resolve outputs
+            task.workDir = folder
+            task.stdout = stdoutFile
             task.config.exitStatus = exitCode
             // -- check if all output resources are available
-            collectOutputs(task, folder, stdoutFile, task.context)
+            collectOutputs(task)
 
             // set the exit code in to the task object
             task.cached = true
             task.hash = hash
-            task.workDir = folder
-            task.stdout = stdoutFile
             if( exitCode != null ) {
                 task.exitStatus = exitCode
             }
@@ -962,7 +748,7 @@ class TaskProcessor {
             if( entry )
                 session.notifyTaskCached(new CachedTaskHandler(task,entry.trace))
 
-            // -- now bind the results
+            // -- now emit the results
             finalizeTask0(task)
             return true
         }
@@ -970,6 +756,7 @@ class TaskProcessor {
             log.trace "[${safeTaskName(task)}] Missed cache > ${e.getMessage()} -- folder: $folder"
             task.exitStatus = Integer.MAX_VALUE
             task.workDir = null
+            task.stdout = null
             return false
         }
     }
@@ -1364,433 +1151,87 @@ class TaskProcessor {
      */
     @CompileStatic
     protected void publishOutputs( TaskRun task ) {
-        final publishList = task.config.getPublishDir()
-        if( !publishList ) {
-            return
-        }
+        final publishers = task.config.getPublishDir()
 
-        for( PublishDir pub : publishList ) {
-            publishOutputs0(task, pub)
-        }
-    }
+        for( PublishDir publisher : publishers ) {
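+            // unless set explicitly, overwrite published files only when the task was actually re-executed (not restored from the cache)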
+            if( publisher.overwrite == null )
+                publisher.overwrite = !task.cached
 
-    private void publishOutputs0( TaskRun task, PublishDir publish ) {
-
-        if( publish.overwrite == null ) {
-            publish.overwrite = !task.cached
-        }
-
-        HashSet files = []
-        def outputs = task.getOutputsByType(FileOutParam)
-        for( Map.Entry entry : outputs ) {
-            final value = entry.value
-            if( value instanceof Path ) {
-                files.add((Path)value)
-            }
-            else if( value instanceof Collection ) {
-                files.addAll(value)
-            }
-            else if( value != null ) {
-                throw new IllegalArgumentException("Unknown output file object [${value.class.name}]: ${value}")
-            }
+            publisher.apply(task)
         }
-
-        publish.apply(files, task)
     }
 
     /**
-     * Bind the expected output files to the corresponding output channels
-     * @param processor
+     * Emit the expected outputs to the corresponding output channels
      */
-    synchronized protected void bindOutputs( TaskRun task ) {
-
-        // -- creates the map of all tuple values to bind
-        Map tuples = [:]
-        for( OutParam param : config.getOutputs() ) {
-            tuples.put(param.index, [])
-        }
-
-        // -- collects the values to bind
-        for( OutParam param: task.outputs.keySet() ){
-            def value = task.outputs.get(param)
-
-            switch( param ) {
-            case StdOutParam:
-                log.trace "Process $name > normalize stdout param: $param"
-                value = value instanceof Path ? value.text : value?.toString()
-
-            case OptionalParam:
-                if( !value && param instanceof OptionalParam && param.optional ) {
-                    final holder = [] as MissingParam; holder.missing = param
-                    tuples[param.index] = holder
-                    break
-                }
-
-            case EnvOutParam:
-            case ValueOutParam:
-            case DefaultOutParam:
-                log.trace "Process $name > collecting out param: ${param} = $value"
-                tuples[param.index].add(value)
-                break
+    synchronized protected void emitOutputs( TaskRun task ) {
 
-            default:
-                throw new IllegalArgumentException("Illegal output parameter type: $param")
-            }
-        }
-
-        // bind the output
+        // -- emit the output
         if( isFair0 ) {
-            fairBindOutputs0(tuples, task)
+            fairEmitOutputs0(task.outputs, task)
         }
         else {
-            bindOutputs0(tuples)
+            emitOutputs0(task.outputs)
         }
 
-        // -- finally prints out the task output when 'debug' is true
+        // -- finally print the task output when 'debug' is true
         if( task.config.debug ) {
             task.echoStdout(session)
         }
     }
 
-    protected void fairBindOutputs0(Map emissions, TaskRun task) {
+    protected void fairEmitOutputs0(List emissions, TaskRun task) {
         synchronized (isFair0) {
             // decrement -1 because tasks are 1-based
             final index = task.index-1
-            // store the task emission values in a buffer
+            // store the task output values in a buffer
             fairBuffers[index-currentEmission] = emissions
-            // check if the current task index matches the expected next emission index
+            // check if the current task index matches the expected next output index
             if( currentEmission == index ) {
                 while( emissions!=null ) {
-                    // bind the emission values
-                    bindOutputs0(emissions)
+                    // emit the output values
+                    emitOutputs0(emissions)
                     // remove the head and try with the following
                     fairBuffers.remove(0)
-                    // increase the index of the next emission
+                    // increase the index of the next output
                     currentEmission++
-                    // take the next emissions 
+                    // take the next output
                     emissions = fairBuffers[0]
                 }
             }
         }
     }
 
-    protected void bindOutputs0(Map tuples) {
-        // -- bind out the collected values
-        for( OutParam param : config.getOutputs() ) {
-            final outValue = tuples[param.index]
-            if( outValue == null )
-                throw new IllegalStateException()
+    protected void emitOutputs0(List outputs) {
+        // -- emit the output values
+        for( int i = 0; i < config.getOutputs().size(); i++ ) {
+            final param = config.getOutputs()[i]
+            final value = outputs[i]
 
-            if( outValue instanceof MissingParam ) {
-                log.debug "Process $name > Skipping output binding because one or more optional files are missing: $outValue.missing"
+            if( value == null ) {
+                log.debug "Process $name > Skipping output binding because one or more optional files are missing: ${param.name}"
                 continue
             }
 
-            log.trace "Process $name > Binding out param: ${param} = ${outValue}"
-            bindOutParam(param, outValue)
+            // clone collection values before emitting them so that downstream operators
+            // cannot modify the list used internally by the task processor
+            // see https://github.com/nextflow-io/nextflow/issues/3768
+            log.trace "Process $name > Emitting output: ${param.name} = ${value}"
+            final copy = value instanceof Collection && value instanceof Cloneable ? value.clone() : value
+            param.getChannel().bind(copy)
         }
     }
 
-    protected void bindOutParam( OutParam param, List values ) {
-        log.trace "<$name> Binding param $param with $values"
-        final x = values.size() == 1 ? values[0] : values
-        final ch = param.getOutChannel()
-        if( ch != null ) {
-            // create a copy of the output list of operation made by a downstream task
-            // can modify the list which is used internally by the task processor
-            // and result in a potential error. See https://github.com/nextflow-io/nextflow/issues/3768
-            final copy = x instanceof List && x instanceof Cloneable ? x.clone() : x
-            // emit the final value
-            ch.bind(copy)
-        }
-    }
-
-    protected void collectOutputs( TaskRun task ) {
-        collectOutputs( task, task.getTargetDir(), task.@stdout, task.context )
-    }
-
     /**
      * Once the task has completed this method is invoked to collected all the task results
      *
      * @param task
      */
-    final protected void collectOutputs( TaskRun task, Path workDir, def stdout, Map context ) {
-        log.trace "<$name> collecting output: ${task.outputs}"
-
-        for( OutParam param : task.outputs.keySet() ) {
-
-            switch( param ) {
-                case StdOutParam:
-                    collectStdOut(task, (StdOutParam)param, stdout)
-                    break
-
-                case FileOutParam:
-                    collectOutFiles(task, (FileOutParam)param, workDir, context)
-                    break
-
-                case ValueOutParam:
-                    collectOutValues(task, (ValueOutParam)param, context)
-                    break
-
-                case EnvOutParam:
-                    collectOutEnvParam(task, (EnvOutParam)param, workDir)
-                    break
-
-                case CmdEvalParam:
-                    collectOutEnvParam(task, (CmdEvalParam)param, workDir)
-                    break
-
-                case DefaultOutParam:
-                    task.setOutput(param, DefaultOutParam.Completion.DONE)
-                    break
-
-                default:
-                    throw new IllegalArgumentException("Illegal output parameter: ${param.class.simpleName}")
-
-            }
-        }
-
-        // mark ready for output binding
+    protected void collectOutputs( TaskRun task ) {
+        task.outputs = config.getOutputs().collect( param -> param.resolve(task) )
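+        // each declared output is resolved against the completed task (e.g. output files, env vars, eval results, stdout)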
         task.canBind = true
     }
 
-    protected void collectOutEnvParam(TaskRun task, BaseOutParam param, Path workDir) {
-
-        // fetch the output value
-        final outCmds =  param instanceof CmdEvalParam ? task.getOutputEvals() : null
-        final val = collectOutEnvMap(workDir,outCmds).get(param.name)
-        if( val == null && !param.optional )
-            throw new MissingValueException("Missing environment variable: $param.name")
-        // set into the output set
-        task.setOutput(param,val)
-        // trace the result
-        log.trace "Collecting param: ${param.name}; value: ${val}"
-
-    }
-
-    /**
-     * Parse the `.command.env` file which holds the value for `env` and `cmd`
-     * output types
-     *
-     * @param workDir
-     *      The task work directory that contains the `.command.env` file
-     * @param outEvals
-     *      A {@link Map} instance containing key-value pairs
-     * @return
-     */
-    @CompileStatic
-    @Memoized(maxCacheSize = 10_000)
-    protected Map collectOutEnvMap(Path workDir, Map outEvals) {
-        final env = workDir.resolve(TaskRun.CMD_ENV).text
-        final result = new HashMap(50)
-        Matcher matcher
-        // `current` represent the current capturing env variable name
-        String current=null
-        for(String line : env.readLines() ) {
-            // Opening condition:
-            // line should match a KEY=VALUE syntax
-            if( !current && (matcher = (line=~/([a-zA-Z_][a-zA-Z0-9_]*)=(.*)/)) ) {
-                final k = matcher.group(1)
-                final v = matcher.group(2)
-                if (!k) continue
-                result.put(k,v)
-                current = k
-            }
-            // Closing condition:
-            // line should match /KEY/  or  /KEY/=exit_status
-            else if( current && (matcher = (line=~/\/${current}\/(?:=exit:(\d+))?/)) ) {
-                final status = matcher.group(1) as Integer ?: 0
-                // when exit status is defined and it is a non-zero, it should be interpreted
-                // as a failure of the execution of the output command; in this case the variable
-                // holds the std error message
-                if( outEvals!=null && status ) {
-                    final cmd = outEvals.get(current)
-                    final out = result[current]
-                    throw new ProcessEvalException("Unable to evaluate output", cmd, out, status)
-                }
-                // reset current key
-                current = null
-            }
-            else if( current && line!=null) {
-                result[current] += '\n' + line
-            }
-        }
-        return result
-    }
-
-    /**
-     * Collects the process 'std output'
-     *
-     * @param task The executed process instance
-     * @param param The declared {@link StdOutParam} object
-     * @param stdout The object holding the task produced std out object
-     */
-    protected void collectStdOut( TaskRun task, StdOutParam param, def stdout ) {
-
-        if( stdout == null && task.type == ScriptType.SCRIPTLET ) {
-            throw new IllegalArgumentException("Missing 'stdout' for process > ${safeTaskName(task)}")
-        }
-
-        if( stdout instanceof Path && !stdout.exists() ) {
-            throw new MissingFileException("Missing 'stdout' file: ${stdout.toUriString()} for process > ${safeTaskName(task)}")
-        }
-
-        task.setOutput(param, stdout)
-    }
-
-    protected void collectOutFiles( TaskRun task, FileOutParam param, Path workDir, Map context ) {
-
-        final List allFiles = []
-        // type file parameter can contain a multiple files pattern separating them with a special character
-        def entries = param.getFilePatterns(context, task.workDir)
-        boolean inputsRemovedFlag = false
-        // for each of them collect the produced files
-        for( String filePattern : entries ) {
-            List result = null
-
-            def splitter = param.glob ? FilePatternSplitter.glob().parse(filePattern) : null
-            if( splitter?.isPattern() ) {
-                result = fetchResultFiles(param, filePattern, workDir)
-                // filter the inputs
-                if( result && !param.includeInputs ) {
-                    result = filterByRemovingStagedInputs(task, result, workDir)
-                    log.trace "Process ${safeTaskName(task)} > after removing staged inputs: ${result}"
-                    inputsRemovedFlag |= (result.size()==0)
-                }
-            }
-            else {
-                def path = param.glob ? splitter.strip(filePattern) : filePattern
-                def file = workDir.resolve(path)
-                def exists = checkFileExists(file, param.followLinks)
-                if( exists )
-                    result = List.of(file)
-                else
-                    log.debug "Process `${safeTaskName(task)}` is unable to find [${file.class.simpleName}]: `$file` (pattern: `$filePattern`)"
-            }
-
-            if( result )
-                allFiles.addAll(result)
-
-            else if( !param.optional && (!param.arity || param.arity.min > 0) ) {
-                def msg = "Missing output file(s) `$filePattern` expected by process `${safeTaskName(task)}`"
-                if( inputsRemovedFlag )
-                    msg += " (note: input files are not included in the default matching set)"
-                throw new MissingFileException(msg)
-            }
-        }
-
-        if( !param.isValidArity(allFiles.size()) )
-            throw new IllegalArityException("Incorrect number of output files for process `${safeTaskName(task)}` -- expected ${param.arity}, found ${allFiles.size()}")
-
-        task.setOutput( param, allFiles.size()==1 && param.isSingle() ? allFiles[0] : allFiles )
-
-    }
-
-    protected boolean checkFileExists(Path file, boolean followLinks) {
-        followLinks ? file.exists() : file.exists(LinkOption.NOFOLLOW_LINKS)
-    }
-
-    protected void collectOutValues( TaskRun task, ValueOutParam param, Map ctx ) {
-
-        try {
-            // fetch the output value
-            final val = param.resolve(ctx)
-            // set into the output set
-            task.setOutput(param,val)
-            // trace the result
-            log.trace "Collecting param: ${param.name}; value: ${val}"
-        }
-        catch( MissingPropertyException e ) {
-            throw new MissingValueException("Missing value declared as output parameter: ${e.property}")
-        }
-
-    }
-
-    /**
-     * Collect the file(s) with the name specified, produced by the execution
-     *
-     * @param workDir The job working path
-     * @param namePattern The file name, it may include file name wildcards
-     * @return The list of files matching the specified name in lexicographical order
-     * @throws MissingFileException when no matching file is found
-     */
-    @PackageScope
-    List fetchResultFiles( FileOutParam param, String namePattern, Path workDir ) {
-        assert namePattern
-        assert workDir
-
-        List files = []
-        def opts = visitOptions(param, namePattern)
-        // scan to find the file with that name
-        try {
-            FileHelper.visitFiles(opts, workDir, namePattern) { Path it -> files.add(it) }
-        }
-        catch( NoSuchFileException e ) {
-            throw new MissingFileException("Cannot access directory: '$workDir'", e)
-        }
-
-        return files.sort()
-    }
-
-    /**
-     * Given a {@link FileOutParam} object create the option map for the
-     * {@link FileHelper#visitFiles(java.util.Map, java.nio.file.Path, java.lang.String, groovy.lang.Closure)} method
-     *
-     * @param param A task {@link FileOutParam}
-     * @param namePattern A file glob pattern
-     * @return A {@link Map} object holding the traverse options for the {@link FileHelper#visitFiles(java.util.Map, java.nio.file.Path, java.lang.String, groovy.lang.Closure)} method
-     */
-    @PackageScope
-    Map visitOptions( FileOutParam param, String namePattern ) {
-        final opts = [:]
-        opts.relative = false
-        opts.hidden = param.hidden ?: namePattern.startsWith('.')
-        opts.followLinks = param.followLinks
-        opts.maxDepth = param.maxDepth
-        opts.type = param.type ? param.type : ( namePattern.contains('**') ? 'file' : 'any' )
-        return opts
-    }
-
-    /**
-     * Given a list of {@code Path} removes all the hidden file i.e. the ones which names starts with a dot char
-     * @param files A list of {@code Path}
-     * @return The result list not containing hidden file entries
-     */
-    @PackageScope
-    List filterByRemovingHiddenFiles( List files ) {
-        files.findAll { !it.getName().startsWith('.') }
-    }
-
-    /**
-     * Given a list of {@code Path} removes all the entries which name match the name of
-     * file used as input for the specified {@code TaskRun}
-     *
-     * See TaskRun#getStagedInputs
-     *
-     * @param task
-     *      A {@link TaskRun} object representing the task executed
-     * @param collectedFiles
-     *      Collection of candidate output files
-     * @return
-     *      List of the actual output files (not including any input matching an output file name pattern)
-     */
-    @PackageScope
-    List filterByRemovingStagedInputs( TaskRun task, List collectedFiles, Path workDir ) {
-
-        // get the list of input files
-        final List allStaged = task.getStagedInputs()
-        final List result = new ArrayList<>(collectedFiles.size())
-
-        for( int i=0; i
-
-            // add the value to the task instance
-            def val = param.decodeInputs(values)
-
-            switch(param) {
-                case ValueInParam:
-                    contextMap.put( param.name, val )
-                    break
-
-                case FileInParam:
-                    secondPass[param] = val
-                    return // <-- leave it, because we do not want to add this 'val' at this stage
-
-                case StdInParam:
-                case EnvInParam:
-                    // nothing to do
-                    break
+    final protected void resolveTaskInputs( TaskRun task, List values ) {
 
-                default:
-                    throw new IllegalStateException("Unsupported input param type: ${param?.class?.simpleName}")
-            }
+        final inputs = config.getInputs()
+        final ctx = task.context
 
-            // add the value to the task instance context
-            task.setInput(param, val)
+        // -- add input params to task context
+        for( int i = 0; i < inputs.size(); i++ ) {
+            final expectedType = inputs[i].type
+            final actualType = values[i].class
+            if( expectedType != null && !expectedType.isAssignableFrom(actualType) )
+                log.warn "[${safeTaskName(task)}] invalid argument type at index ${i} -- expected a ${expectedType.simpleName} but got a ${actualType.simpleName}"
+            ctx.put(inputs[i].getName(), values[i])
         }
 
-        return count
-    }
+        // -- resolve local variables
+        for( def entry : inputs.getVariables() )
+            ctx.put(entry.key, LazyHelper.resolve(ctx, entry.value))
 
-    final protected void makeTaskContextStage2( TaskRun task, Map secondPass, int count ) {
+        // -- resolve environment vars
+        for( def entry : inputs.getEnv() )
+            task.env.put(entry.key, LazyHelper.resolve(ctx, entry.value))
 
-        final ctx = task.context
-        final allNames = new HashMap()
+        // -- resolve stdin
+        task.stdin = LazyHelper.resolve(ctx, inputs.stdin)
 
-        final FilePorter.Batch batch = session.filePorter.newBatch(executor.getStageDir())
+        // -- resolve input files
+        final allNames = new HashMap()
+        int count = 0
+        final batch = session.filePorter.newBatch(executor.getStageDir())
 
-        // -- all file parameters are processed in a second pass
-        //    so that we can use resolve the variables that eventually are in the file name
-        for( Map.Entry entry : secondPass.entrySet() ) {
-            final param = entry.getKey()
-            final val = entry.getValue()
-            final fileParam = param as FileInParam
-            final normalized = normalizeInputToFiles(val, count, fileParam.isPathQualifier(), batch)
-            final resolved = expandWildcards( fileParam.getFilePattern(ctx), normalized )
+        for( def param : config.getInputs().getFiles() ) {
+            final val = param.resolve(ctx)
+            final normalized = normalizeInputToFiles(val, count, param.isPathQualifier(), batch)
+            final resolved = expandWildcards( param.getFilePattern(ctx), normalized )
 
             if( !param.isValidArity(resolved.size()) )
                 throw new IllegalArityException("Incorrect number of input files for process `${safeTaskName(task)}` -- expected ${param.arity}, found ${resolved.size()}")
 
-            ctx.put( param.name, singleItemOrList(resolved, param.isSingle(), task.type) )
+            // add to context if the path was declared with a variable name
+            if( param.name )
+                ctx.put( param.name, singleItemOrList(resolved, param.isSingle(), task.type) )
+
             count += resolved.size()
+
             for( FileHolder item : resolved ) {
                 Integer num = allNames.getOrCreate(item.stageName, 0) +1
                 allNames.put(item.stageName,num)
             }
 
-            // add the value to the task instance context
-            task.setInput(param, resolved)
+            task.inputFiles.addAll(resolved)
         }
 
         // -- set the delegate map as context in the task config
         //    so that lazy directives will be resolved against it
         task.config.context = ctx
 
-        // check conflicting file names
+        // -- check conflicting file names
         def conflicts = allNames.findAll { name, num -> num>1 }
         if( conflicts ) {
             log.debug("Process $name > collision check staging file names: $allNames")
@@ -2163,17 +1591,6 @@ class TaskProcessor {
         session.filePorter.transfer(batch)
     }
 
-    final protected void makeTaskContextStage3( TaskRun task, HashCode hash, Path folder ) {
-
-        // set hash-code & working directory
-        task.hash = hash
-        task.workDir = folder
-        task.config.workDir = folder
-        task.config.hash = hash.toString()
-        task.config.name = task.getName()
-
-    }
-
     final protected HashCode createTaskHashKey(TaskRun task) {
 
         List keys = [ session.uniqueId, name, task.source ]
@@ -2181,11 +1598,16 @@ class TaskProcessor {
         if( task.isContainerEnabled() )
             keys << task.getContainerFingerprint()
 
-        // add all the input name-value pairs to the key generator
-        for( Map.Entry it : task.inputs ) {
-            keys.add( it.key.name )
-            keys.add( it.value )
-        }
+        // add the resolved task inputs (values, environment, files, stdin) to the hash key
+        final inputVars = getTaskInputVars(task)
+        if( inputVars )
+            keys.add(inputVars.entrySet())
+        if( task.env )
+            keys.add(task.env.entrySet())
+        if( task.inputFiles )
+            keys.add(task.inputFiles)
+        if( task.stdin )
+            keys.add(task.stdin)
 
         // add all variable references in the task script but not declared as input/output
         def vars = getTaskGlobalVars(task)
@@ -2287,8 +1709,17 @@ class TaskProcessor {
         log.info(buffer.toString())
     }
 
+    protected Map getTaskInputVars(TaskRun task) {
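+        // collect the values of all non-file inputs from the task context -- they contribute to the task hash key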
+        final result = [:]
+        final inputs = config.getInputs()
+        final inputVars = inputs.getNames() - inputs.getFiles()*.getName()
+        for( String var : inputVars )
+            result.put(var, task.context.get(var))
+        return result
+    }
+
     protected Map getTaskGlobalVars(TaskRun task) {
-        final result = task.getGlobalVars(ownerScript.binding)
+        final result = task.getGlobalVars(ownerScript.getBinding())
         final directives = getTaskExtensionDirectiveVars(task)
         result.putAll(directives)
         return result
@@ -2316,7 +1747,12 @@ class TaskProcessor {
     final protected void submitTask( TaskRun task, HashCode hash, Path folder ) {
         log.trace "[${safeTaskName(task)}] actual run folder: ${folder}"
 
-        makeTaskContextStage3(task, hash, folder)
+        // set hash-code & working directory
+        task.hash = hash
+        task.workDir = folder
+        task.config.workDir = folder
+        task.config.hash = hash.toString()
+        task.config.name = task.getName()
 
         // add the task to the collection of running tasks
         executor.submit(task)
@@ -2343,7 +1779,7 @@ class TaskProcessor {
 
     /**
      * Finalize the task execution, checking the exit status
-     * and binding output values accordingly
+     * and emitting output values accordingly
      *
      * @param task The {@code TaskRun} instance to finalize
      */
@@ -2395,17 +1831,16 @@ class TaskProcessor {
 
     /**
      * Finalize the task execution, checking the exit status
-     * and binding output values accordingly
+     * and emitting output values accordingly
      *
      * @param task The {@code TaskRun} instance to finalize
-     * @param producedFiles The map of files to be bind the outputs
      */
     private void finalizeTask0( TaskRun task ) {
         log.trace "Finalize process > ${safeTaskName(task)}"
 
-        // -- bind output (files)
+        // -- emit outputs
         if( task.canBind ) {
-            bindOutputs(task)
+            emitOutputs(task)
             publishOutputs(task)
         }
 
@@ -2436,69 +1871,23 @@ class TaskProcessor {
 
         def statusStr = !completed && !terminated ? 'status=ACTIVE' : ( completed && terminated ? 'status=TERMINATED' : "completed=$completed; terminated=$terminated" )
         result << "  $statusStr\n"
-        // add extra info about port statuses
-        for( int i=0; i inputs
+    class TaskProcessorInterceptor extends DataflowEventAdapter {
 
-        final boolean stopAfterFirstRun
-
-        final int len
+        final DataflowReadChannel source
 
         final DataflowQueue control
 
-        final int first
-
-        BaseProcessInterceptor( List inputs, boolean stop ) {
-            this.inputs = new ArrayList<>(inputs)
-            this.stopAfterFirstRun = stop
-            this.len = inputs.size()
-            this.control = (DataflowQueue)inputs.get(len-1)
-            this.first = inputs.findIndexOf { CH.isChannelQueue(it) }
-        }
-
-        @Override
-        Object messageArrived(final DataflowProcessor processor, final DataflowReadChannel channel, final int index, final Object message) {
-            if( len == 1 || stopAfterFirstRun ) {
-                // -- kill itself
-                control.bind(PoisonPill.instance)
-            }
-            else if( index == first ) {
-                // the `if` condition guarantees only and only one signal message (the true value)
-                // is bound to the control message for a complete set of input values delivered
-                // to the process -- the control message is need to keep the process running
-                control.bind(Boolean.TRUE)
-            }
-
-            return message
-        }
-    }
+        final boolean singleton
 
-    /**
-     *  Intercept dataflow process events
-     */
-    class TaskProcessorInterceptor extends BaseProcessInterceptor {
-
-        TaskProcessorInterceptor(List inputs, boolean stop) {
-            super(inputs, stop)
+        TaskProcessorInterceptor(DataflowReadChannel source, DataflowQueue control, boolean singleton) {
+            this.source = source
+            this.control = control
+            this.singleton = singleton
         }
 
         @Override
@@ -2513,11 +1902,10 @@ class TaskProcessor {
             final params = new TaskStartParams(TaskId.next(), indexCount.incrementAndGet())
             final result = new ArrayList(2)
             result[0] = params
-            result[1] = messages
+            result[1] = messages.first()
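+            // the first message is the source value; the second comes from the control channel and can be discarded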
             return result
         }
 
-
         @Override
         void afterRun(DataflowProcessor processor, List messages) {
             // apparently auto if-guard instrumented by @Slf4j is not honoured in inner classes - add it explicitly
@@ -2528,14 +1916,16 @@ class TaskProcessor {
 
         @Override
         Object messageArrived(final DataflowProcessor processor, final DataflowReadChannel channel, final int index, final Object message) {
-            // apparently auto if-guard instrumented by @Slf4j is not honoured in inner classes - add it explicitly
-            if( log.isTraceEnabled() ) {
-                def channelName = config.getInputs()?.names?.get(index)
-                def taskName = currentTask.get()?.name ?: name
-                log.trace "<${taskName}> Message arrived -- ${channelName} => ${message}"
+            if( singleton ) {
+                // -- kill the process
+                control.bind(PoisonPill.instance)
+            }
+            else {
+                // -- send a control message for each new source item to keep the process running
+                control.bind(Boolean.TRUE)
             }
 
-            super.messageArrived(processor, channel, index, message)
+            return message
         }
 
         @Override
@@ -2553,7 +1943,7 @@ class TaskProcessor {
                 // apparently auto if-guard instrumented by @Slf4j is not honoured in inner classes - add it explicitly
                 if( log.isTraceEnabled() )
                     log.trace "<${name}> Poison pill arrived; port: $index"
-                openPorts.set(index, 0) // mark the port as closed
+                closed.set(true) // mark the process as closed
                 state.update { StateObj it -> it.poison() }
             }
 
diff --git a/modules/nextflow/src/main/groovy/nextflow/processor/TaskRun.groovy b/modules/nextflow/src/main/groovy/nextflow/processor/TaskRun.groovy
index 69e217291e..1753a10842 100644
--- a/modules/nextflow/src/main/groovy/nextflow/processor/TaskRun.groovy
+++ b/modules/nextflow/src/main/groovy/nextflow/processor/TaskRun.groovy
@@ -22,6 +22,7 @@ import java.nio.file.Path
 import java.util.concurrent.ConcurrentHashMap
 
 import com.google.common.hash.HashCode
+import groovy.transform.Memoized
 import groovy.transform.PackageScope
 import groovy.util.logging.Slf4j
 import nextflow.Session
@@ -35,19 +36,12 @@ import nextflow.exception.ProcessUnrecoverableException
 import nextflow.file.FileHelper
 import nextflow.file.FileHolder
 import nextflow.script.BodyDef
+import nextflow.script.LazyHelper
 import nextflow.script.ScriptType
 import nextflow.script.TaskClosure
 import nextflow.script.bundle.ResourcesBundle
-import nextflow.script.params.CmdEvalParam
-import nextflow.script.params.EnvInParam
-import nextflow.script.params.EnvOutParam
-import nextflow.script.params.FileInParam
-import nextflow.script.params.FileOutParam
-import nextflow.script.params.InParam
-import nextflow.script.params.OutParam
-import nextflow.script.params.StdInParam
-import nextflow.script.params.ValueOutParam
 import nextflow.spack.SpackCache
+import nextflow.util.ArrayBag
 /**
  * Models a task instance
  *
@@ -85,39 +79,40 @@ class TaskRun implements Cloneable {
     TaskProcessor processor
 
     /**
-     * Holds the input value(s) for each task input parameter
+     * The map of input environment vars
+     *
+     * @see TaskProcessor#resolveTaskInputs(TaskRun, List)
      */
-    Map inputs = [:]
+    Map env = [:]
 
     /**
-     * Holds the output value(s) for each task output parameter
+     * The list of input files
+     *
+     * @see TaskProcessor#resolveTaskInputs(TaskRun, List)
      */
-    Map outputs = [:]
-
-
-    void setInput( InParam param, Object value = null ) {
-        assert param
-
-        inputs[param] = value
-
-        // copy the value to the task 'input' attribute
-        // it will be used to pipe it to the process stdin
-        if( param instanceof StdInParam) {
-            stdin = value
-        }
-    }
-
-    void setOutput( OutParam param, Object value = null ) {
-        assert param
-        outputs[param] = value
-    }
-
+    List inputFiles = new ArrayBag()
 
     /**
      * The value to be piped to the process stdin
+     *
+     * @see TaskProcessor#resolveTaskInputs(TaskRun, List)
      */
     def stdin
 
+    /**
+     * The list of resolved task outputs
+     *
+     * @see TaskProcessor#collectOutputs(TaskRun)
+     */
+    List outputs = []
+
+    /**
+     * The list of resolved output files
+     *
+     * @see ProcessOutput.ResolverContext#path(String)
+     */
+    Set outputFiles = []
+
     /**
      * The exit code returned by executing the task script
      */
@@ -403,109 +398,31 @@ class TaskRun implements Cloneable {
      * Check whenever there are values to be cached
      */
     boolean hasCacheableValues() {
-
-        if( config?.isDynamic() )
-            return true
-
-        for( OutParam it : outputs.keySet() ) {
-            if( it.class == ValueOutParam ) return true
-            if( it.class == FileOutParam && ((FileOutParam)it).isDynamic() ) return true
-        }
-
-        return false
-    }
-
-    Map<InParam, List<FileHolder>> getInputFiles() {
-        (Map<InParam, List<FileHolder>>) getInputsByType( FileInParam )
-    }
-
-    /**
-     * Return the list of all input files staged as inputs by this task execution
-     */
-    List getStagedInputs()  {
-        getInputFiles()
-                .values()
-                .flatten()
-                .collect { it.stageName }
+        return body.type != ScriptType.SCRIPTLET
     }
 
     /**
      * @return A map object containing all the task input files as  pairs
      */
     Map getInputFilesMap() {
-
-        final allFiles = getInputFiles().values()
-        final result = new HashMap(allFiles.size())
-        for( List entry : allFiles ) {
-            if( entry ) for( FileHolder it : entry ) {
-                result[ it.stageName ] = it.storePath
-            }
-        }
-
+        final result = [:]
+        for( FileHolder it : inputFiles )
+            result.put(it.stageName, it.storePath)
         return result
     }
 
     /**
-     * Look at the {@code nextflow.script.FileOutParam} which name is the expected
-     *  output name
-     *
+     * Get the list of expected output file patterns.
      */
+    @Memoized
     List getOutputFilesNames() {
-        cache0.computeIfAbsent('outputFileNames', (it)-> getOutputFilesNames0())
-    }
-
-    private List getOutputFilesNames0() {
-        def result = []
-
-        for( FileOutParam param : getOutputsByType(FileOutParam).keySet() ) {
+        final declaredOutputs = processor.config.getOutputs()
+        final result = []
+        for( def param : declaredOutputs.files.values() )
             result.addAll( param.getFilePatterns(context, workDir) )
-        }
-
         return result.unique()
     }
 
-    /**
-     * Get the map of *input* objects by the given {@code InParam} type
-     *
-     * @param types One or more subclass of {@code InParam}
-     * @return An associative array containing all the objects for the specified type
-     */
-    def  Map getInputsByType( Class... types ) {
-
-        def result = [:]
-        for( def it : inputs ) {
-            if( types.contains(it.key.class) )
-                result << it
-        }
-        return result
-    }
-
-    /**
-     * Get the map of *output* objects by the given {@code InParam} type
-     *
-     * @param types One or more subclass of {@code InParam}
-     * @return An associative array containing all the objects for the specified type
-     */
-    def  Map getOutputsByType( Class... types ) {
-        def result = [:]
-        for( def it : outputs ) {
-            if( types.contains(it.key.class) )
-                result << it
-        }
-        return result
-    }
-
-    /**
-     * @return A map containing the task environment defined as input declaration by this task
-     */
-    protected Map getInputEnvironment() {
-        final Map environment = [:]
-        getInputsByType( EnvInParam ).each { param, value ->
-            environment.put( param.name, value?.toString() )
-        }
-        return environment
-    }
-
     /**
      * @return A map representing the task execution environment
      */
@@ -515,7 +432,7 @@ class TaskRun implements Cloneable {
         // IMPORTANT: when copying the environment map a LinkedHashMap must be used to preserve
         // the insertion order of the env entries (ie. export FOO=1; export BAR=$FOO)
         final result = new LinkedHashMap( getProcessor().getProcessEnvironment() )
-        result.putAll( getInputEnvironment() )
+        result.putAll( env )
         return result
     }
 
@@ -587,29 +504,16 @@ class TaskRun implements Cloneable {
     }
 
     List getOutputEnvNames() {
-        final items = getOutputsByType(EnvOutParam)
-        if( !items )
-            return List.of()
-        final result = new ArrayList(items.size())
-        for( EnvOutParam it : items.keySet() ) {
-            if( !it.name ) throw new IllegalStateException("Missing output environment name - offending parameter: $it")
-            result.add(it.name)
-        }
-        return result
+        final declaredOutputs = processor.config.getOutputs()
+        return new ArrayList(declaredOutputs.getEnv())
     }
 
-    /**
-     * @return A {@link Map} instance holding a collection of key-pairs
-     * where the key represents a environment variable name holding the command
-     * output and the value the command the executed.
-     */
     Map getOutputEvals() {
-        final items = getOutputsByType(CmdEvalParam)
-        final result = new LinkedHashMap(items.size())
-        for( CmdEvalParam it : items.keySet() ) {
-            if( !it.name ) throw new IllegalStateException("Missing output eval name - offending parameter: $it")
-            result.put(it.name, it.getTarget(context))
-        }
+        final declaredOutputs = processor.config.getOutputs()
+        final evalCmds = declaredOutputs.getEval()
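+        // eval commands are stored as lazy expressions and resolved against the task context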
+        final result = new LinkedHashMap(evalCmds.size())
+        for( String name : evalCmds.keySet() )
+            result.put(name, LazyHelper.resolve(context, evalCmds[name]))
         return result
     }
 
@@ -866,10 +770,7 @@ class TaskRun implements Cloneable {
         final result = new HashMap(variableNames.size())
         final processName = name
 
-        def itr = variableNames.iterator()
-        while( itr.hasNext() ) {
-            final varName = itr.next()
-
+        for( def varName : variableNames ) {
             final p = varName.indexOf('.')
             final baseName = p !=- 1 ? varName.substring(0,p) : varName
 
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/BaseScript.groovy b/modules/nextflow/src/main/groovy/nextflow/script/BaseScript.groovy
index 03a200e71e..36ddb661d3 100644
--- a/modules/nextflow/src/main/groovy/nextflow/script/BaseScript.groovy
+++ b/modules/nextflow/src/main/groovy/nextflow/script/BaseScript.groovy
@@ -19,10 +19,13 @@ package nextflow.script
 import java.lang.reflect.InvocationTargetException
 import java.nio.file.Paths
 
+import groovy.transform.CompileStatic
 import groovy.util.logging.Slf4j
 import nextflow.NextflowMeta
 import nextflow.Session
 import nextflow.exception.AbortOperationException
+import nextflow.script.dsl.ProcessDsl
+import nextflow.script.dsl.WorkflowBuilder
 import nextflow.secret.SecretsLoader
 
 /**
@@ -31,6 +34,7 @@ import nextflow.secret.SecretsLoader
  * @author Paolo Di Tommaso 
  */
 @Slf4j
+@CompileStatic
 abstract class BaseScript extends Script implements ExecutionContext {
 
     private Session session
@@ -91,31 +95,53 @@ abstract class BaseScript extends Script implements ExecutionContext {
         binding.setVariable( 'secrets', SecretsLoader.secretContext() )
     }
 
-    protected process( String name, Closure body ) {
-        final process = new ProcessDef(this,body,name)
+    /**
+     * Define a process.
+     *
+     * @param name
+     * @param rawBody
+     */
+    protected void process(String name, Closure rawBody) {
+        final builder = new ProcessDsl(this, name)
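+        // the process body is evaluated with the DSL builder as delegate so that its declarations are applied to the builder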
+        final copy = (Closure)rawBody.clone()
+        copy.delegate = builder
+        copy.resolveStrategy = Closure.DELEGATE_FIRST
+        final taskBody = copy.call()
+        final process = builder.withBody(taskBody).build()
         meta.addDefinition(process)
     }
 
     /**
-     * Workflow main entry point
+     * Define an anonymous workflow.
      *
-     * @param body The implementation body of the workflow
-     * @return The result of workflow execution
+     * @param rawBody
      */
-    protected workflow(Closure workflowBody) {
-        // launch the execution
-        final workflow = new WorkflowDef(this, workflowBody)
-        // capture the main (unnamed) workflow definition
+    protected void workflow(Closure rawBody) {
+        final workflow = workflow0(null, rawBody)
         this.entryFlow = workflow
-        // add it to the list of workflow definitions
         meta.addDefinition(workflow)
     }
 
-    protected workflow(String name, Closure workflowDef) {
-        final workflow = new WorkflowDef(this,workflowDef,name)
+    /**
+     * Define a named workflow.
+     *
+     * @param name
+     * @param rawBody
+     */
+    protected void workflow(String name, Closure rawBody) {
+        final workflow = workflow0(name, rawBody)
         meta.addDefinition(workflow)
     }
 
+    protected WorkflowDef workflow0(String name, Closure rawBody) {
+        final builder = new WorkflowBuilder(this, name)
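+        // workflow bodies use the same delegation pattern as processes: the closure is applied to the builder before building the definition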
+        final copy = (Closure)rawBody.clone()
+        copy.delegate = builder
+        copy.resolveStrategy = Closure.DELEGATE_FIRST
+        final body = copy.call()
+        return builder.withBody(body).build()
+    }
+
     protected IncludeDef include( IncludeDef include ) {
         if(ExecutionStack.withinWorkflow())
             throw new IllegalStateException("Include statement is not allowed within a workflow definition")
@@ -241,7 +267,7 @@ abstract class BaseScript extends Script implements ExecutionContext {
             return
 
         if( session?.ansiLog )
-            log.info(String.printf(msg, arg))
+            log.info(String.format(msg, arg))
         else
             super.printf(msg, arg)
     }
@@ -252,7 +278,7 @@ abstract class BaseScript extends Script implements ExecutionContext {
             return
 
         if( session?.ansiLog )
-            log.info(String.printf(msg, args))
+            log.info(String.format(msg, args))
         else
             super.printf(msg, args)
     }
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/ChannelOut.groovy b/modules/nextflow/src/main/groovy/nextflow/script/ChannelOut.groovy
index bb18518bb9..ac238b8623 100644
--- a/modules/nextflow/src/main/groovy/nextflow/script/ChannelOut.groovy
+++ b/modules/nextflow/src/main/groovy/nextflow/script/ChannelOut.groovy
@@ -20,8 +20,6 @@ import groovy.transform.CompileStatic
 import groovy.util.logging.Slf4j
 import groovyx.gpars.dataflow.DataflowWriteChannel
 import nextflow.exception.DuplicateChannelNameException
-import nextflow.script.params.OutParam
-import nextflow.script.params.OutputsList
 import static nextflow.ast.NextflowDSLImpl.OUT_PREFIX
 /**
  * Models the output of a process or a workflow component returning
@@ -52,12 +50,12 @@ class ChannelOut implements List {
         this.channels = Collections.unmodifiableMap(new LinkedHashMap(channels))
     }
 
-    ChannelOut(OutputsList outs) {
+    ChannelOut(ProcessOutputs outs) {
         channels = new HashMap<>(outs.size())
         final onlyWithName = new ArrayList(outs.size())
-        for( OutParam param : outs ) {
-            final ch = param.getOutChannel()
-            final name = param.channelEmitName
+        for( ProcessOutput param : outs ) {
+            final ch = param.getChannel()
+            final name = param.getName()
             onlyWithName.add(ch)
             if(name) {
                 if(channels.containsKey(name)) throw new DuplicateChannelNameException("Output channel name `$name` is used more than one time")
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/IncludeDef.groovy b/modules/nextflow/src/main/groovy/nextflow/script/IncludeDef.groovy
index 08e6e5566e..6b91d6d716 100644
--- a/modules/nextflow/src/main/groovy/nextflow/script/IncludeDef.groovy
+++ b/modules/nextflow/src/main/groovy/nextflow/script/IncludeDef.groovy
@@ -56,15 +56,11 @@ class IncludeDef {
     @PackageScope Map addedParams
     private Session session
 
-    IncludeDef(TokenVar token, String alias=null) {
+    IncludeDef(LazyVar token, String alias=null) {
         def component = token.name; if(alias) component += " as $alias"
         def msg = "Unwrapped module inclusion is deprecated -- Replace `include $component from './MODULE/PATH'` with `include { $component } from './MODULE/PATH'`"
-        if( NF.isDsl2() )
-            throw new DeprecationException(msg)
-        log.warn msg
 
-        this.modules = new ArrayList<>(1)
-        this.modules << new Module(token.name, alias)
+        throw new DeprecationException(msg)
     }
 
     protected IncludeDef(List modules) {
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/LazyHelper.groovy b/modules/nextflow/src/main/groovy/nextflow/script/LazyHelper.groovy
new file mode 100644
index 0000000000..e28ea92118
--- /dev/null
+++ b/modules/nextflow/src/main/groovy/nextflow/script/LazyHelper.groovy
@@ -0,0 +1,285 @@
+/*
+ * Copyright 2013-2024, Seqera Labs
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package nextflow.script
+
+import groovy.transform.CompileStatic
+import groovy.transform.EqualsAndHashCode
+import groovy.transform.ToString
+/**
+ * Helper methods for lazy evaluation.
+ *
+ * @author Paolo Di Tommaso 
+ * @author Ben Sherman 
+ */
+@CompileStatic
+class LazyHelper {
+
+    /**
+     * Evaluate a lazy expression against a given binding.
+     *
+     * @param binding
+     * @param value
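+     * @return The resolved value; for example, resolving a {@link LazyVar}
+     *         against a map binding, {@code LazyHelper.resolve([foo: 1], new LazyVar('foo'))}
+     *         yields {@code 1}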
+     */
+    static Object resolve(Object binding, Object value) {
+        if( value instanceof LazyAware )
+            return value.resolve(binding)
+
+        if( value instanceof Closure )
+            return value.cloneWith(binding).call()
+
+        if( value instanceof GString )
+            return value.cloneAsLazy(binding).toString()
+
+        return value
+    }
+
+}
+
+/**
+ * Interface for types that can be lazily evaluated
+ */
+interface LazyAware {
+    Object resolve(Object binding)
+}
+
+/**
+ * A list that can be lazily evaluated
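+ * by resolving each element against a given binding, e.g.
+ * {@code new LazyList([new LazyVar('x'), 'y']).resolve([x: 1]) == [1, 'y']}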
+ */
+@CompileStatic
+class LazyList implements LazyAware, List {
+
+    @Delegate
+    private List target
+
+    LazyList() {
+        target = []
+    }
+
+    LazyList(int size) {
+        target = new ArrayList(size)
+    }
+
+    LazyList(Collection items) {
+        target = new ArrayList(items)
+    }
+
+    @Override
+    Object resolve(Object binding) {
+        final result = new ArrayList(target.size())
+        for( def item : target )
+            result.add(LazyHelper.resolve(binding, item))
+        return result
+    }
+
+}
+
+/**
+ * A map whose values can be lazily evaluated
+ */
+@CompileStatic
+class LazyMap implements Map {
+
+    /** The target map holding the values */
+    @Delegate
+    private Map target
+
+    /** The context map against which dynamic properties are resolved */
+    private Map binding
+
+    private boolean dynamic
+
+    boolean isDynamic() { dynamic }
+
+    protected void setDynamic(boolean val) { dynamic = val }
+
+    protected Map getBinding() { binding }
+
+    void setBinding(Map map) { this.binding = map }
+
+    protected Map getTarget() { target }
+
+    protected void setTarget(Map obj) { this.target = obj }
+
+    LazyMap() {
+        target = new HashMap<>()
+    }
+
+    LazyMap( Map entries ) {
+        assert entries != null
+        target = new HashMap<>()
+        putAll(entries)
+    }
+
+    /**
+     * Resolve a directive *dynamic* value i.e. defined with a closure or lazy string
+     *
+     * @param name The directive name
+     * @param value The value to be resolved
+     * @return The resolved value
+     */
+    protected resolve( String name, value ) {
+
+        /*
+         * directive with one value and optional named parameter are converted
+         * to a list object in which the first element is a map holding the named parameters
+         * and the second is the directive value
+         */
+        if( value instanceof LazyList ) {
+            def copy = new ArrayList(value.size())
+            for( Object item : value ) {
+                if( item instanceof Map )
+                    copy.add( resolveParams(name, item as Map) )
+                else
+                    copy.add( resolveImpl(name, item) )
+            }
+            return copy
+        }
+
+        /*
+         * resolve the values in a map object, preserving
+         * lazy maps as they are
+         */
+        else if( value instanceof Map && value !instanceof LazyMap ) {
+            return resolveParams(name, value)
+        }
+
+        /*
+         * simple value
+         */
+        else {
+            return resolveImpl(name, value)
+        }
+
+    }
+
+    /**
+     * Resolve directive *dynamic* named params
+     *
+     * @param name The directive name
+     * @param value The map holding the named params
+     * @return A map in which dynamic params are resolved to the actual value
+     */
+    private Map resolveParams( String name, Map value ) {
+
+        final copy = new LinkedHashMap()
+        final attr = (value as Map)
+        for( Entry entry : attr.entrySet() ) {
+            copy[entry.key] = resolveImpl(name, entry.value, true)
+        }
+        return copy
+    }
+
+    /**
+     * Resolve a directive dynamic value
+     *
+     * @param name The directive name
+     * @param value The value to be resolved
+     * @param param When {@code true} points that it is a named parameter value, thus closure are only cloned
+     * @return The resolved directive value
+     */
+    private resolveImpl( String name, value, boolean param=false ) {
+
+        if( value instanceof LazyVar ) {
+            return binding.get(value.name)
+        }
+
+        else if( value instanceof Closure ) {
+            def copy = value.cloneWith(getBinding())
+            if( param ) {
+                return copy
+            }
+
+            try {
+                return copy.call()
+            }
+            catch( MissingPropertyException e ) {
+                if( getBinding() == null ) throw new IllegalStateException("Directive `$name` doesn't support dynamic value (or context not yet initialized)")
+                else throw e
+            }
+        }
+
+        else if( value instanceof GString ) {
+            return value.cloneAsLazy(getBinding()).toString()
+        }
+
+        return value
+    }
+
+    /**
+     * Override the get method in such a way that {@link Closure} values are resolved against
+     * the {@link #binding} map
+     *
+     * @param key The map entry key
+     * @return The associated value
+     */
+    Object get( key ) {
+        return getValue(key)
+    }
+
+    Object getValue(Object key) {
+        final value = target.get(key)
+        return resolve(key as String, value)
+    }
+
+    Object put( String key, Object value ) {
+        if( value instanceof Closure ) {
+            dynamic |= true
+        }
+        else if( value instanceof GString ) {
+            for( int i=0; i<value.valueCount; i++ )
+                if( value.values[i] instanceof Closure )
+                    dynamic |= true
+        }
+        return target.put(key, value)
+    }
+
+    @Override
+    void putAll( Map entries ) {
+        entries.each { k, v -> put(k as String, v) }
+    }
+
+    @Override
+    String toString() {
+        final allKeys = keySet()
+        final result = new ArrayList(allKeys.size())
+        for( String key : allKeys ) { result << "$key: ${getProperty(key)}".toString() }
+        result.join('; ')
+    }
+
+}
+
+/**
+ * A variable that can be lazily evaluated
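+ * by looking up its name in a map binding, e.g.
+ * {@code new LazyVar('x').resolve([x: 42]) == 42}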
+ */
+@CompileStatic
+@EqualsAndHashCode
+@ToString
+class LazyVar implements LazyAware {
+    String name
+
+    LazyVar(String name) {
+        this.name = name
+    }
+
+    @Override
+    Object resolve(Object binding) {
+        if( binding !instanceof Map )
+            throw new IllegalArgumentException("Can't resolve lazy var `$name` because the given binding is not a map")
+
+        return ((Map)binding).get(name)
+    }
+}
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/ArityParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/PathArityAware.groovy
similarity index 87%
rename from modules/nextflow/src/main/groovy/nextflow/script/params/ArityParam.groovy
rename to modules/nextflow/src/main/groovy/nextflow/script/PathArityAware.groovy
index 3c1a425288..6587619981 100644
--- a/modules/nextflow/src/main/groovy/nextflow/script/params/ArityParam.groovy
+++ b/modules/nextflow/src/main/groovy/nextflow/script/PathArityAware.groovy
@@ -14,7 +14,7 @@
  * limitations under the License.
  */
 
-package nextflow.script.params
+package nextflow.script
 
 import groovy.transform.CompileStatic
 import groovy.transform.EqualsAndHashCode
@@ -26,15 +26,20 @@ import nextflow.exception.IllegalArityException
  * @author Ben Sherman 
  */
 @CompileStatic
-trait ArityParam {
+trait PathArityAware {
 
     Range arity
 
     Range getArity() { arity }
 
-    def setArity(String value) {
+    def setArity(Object value) {
+        if( value !instanceof String )
+            throw new IllegalArityException("Path arity should be a string containing a number (e.g. '1') or a range (e.g. '1..*')")
+        
+        value = (String)value
+
         if( value.isInteger() ) {
-            def n = value.toInteger()
+            final n = value.toInteger()
             this.arity = new Range(n, n)
             return this
         }
@@ -52,7 +57,7 @@ trait ArityParam {
             }
         }
 
-        throw new IllegalArityException("Path arity should be a number (e.g. '1') or a range (e.g. '1..*')")
+        throw new IllegalArityException("Path arity should be a string containing a number (e.g. '1') or a range (e.g. '1..*')")
     }
 
     /**
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/ProcessConfig.groovy b/modules/nextflow/src/main/groovy/nextflow/script/ProcessConfig.groovy
index 67bac675a7..474120b40c 100644
--- a/modules/nextflow/src/main/groovy/nextflow/script/ProcessConfig.groovy
+++ b/modules/nextflow/src/main/groovy/nextflow/script/ProcessConfig.groovy
@@ -16,39 +16,13 @@
 
 package nextflow.script
 
-import static nextflow.util.CacheHelper.*
-
-import java.util.regex.Pattern
-
 import groovy.transform.PackageScope
 import groovy.util.logging.Slf4j
 import nextflow.Const
-import nextflow.ast.NextflowDSLImpl
-import nextflow.exception.ConfigParseException
-import nextflow.exception.IllegalConfigException
-import nextflow.exception.IllegalDirectiveException
 import nextflow.executor.BashWrapperBuilder
-import nextflow.processor.ConfigList
 import nextflow.processor.ErrorStrategy
 import nextflow.processor.TaskConfig
-import nextflow.script.params.CmdEvalParam
-import nextflow.script.params.DefaultInParam
-import nextflow.script.params.DefaultOutParam
-import nextflow.script.params.EachInParam
-import nextflow.script.params.EnvInParam
-import nextflow.script.params.EnvOutParam
-import nextflow.script.params.FileInParam
-import nextflow.script.params.FileOutParam
-import nextflow.script.params.InParam
-import nextflow.script.params.InputsList
-import nextflow.script.params.OutParam
-import nextflow.script.params.OutputsList
-import nextflow.script.params.StdInParam
-import nextflow.script.params.StdOutParam
-import nextflow.script.params.TupleInParam
-import nextflow.script.params.TupleOutParam
-import nextflow.script.params.ValueInParam
-import nextflow.script.params.ValueOutParam
+import static nextflow.util.CacheHelper.HashMode
 
 /**
  * Holds the process configuration properties
@@ -58,64 +32,6 @@ import nextflow.script.params.ValueOutParam
 @Slf4j
 class ProcessConfig implements Map, Cloneable {
 
-    static final public transient LABEL_REGEXP = ~/[a-zA-Z]([a-zA-Z0-9_]*[a-zA-Z0-9]+)?/
-
-    static final public List DIRECTIVES = [
-            'accelerator',
-            'afterScript',
-            'arch',
-            'beforeScript',
-            'cache',
-            'conda',
-            'cpus',
-            'container',
-            'containerOptions',
-            'cleanup',
-            'clusterOptions',
-            'debug',
-            'disk',
-            'echo', // deprecated
-            'errorStrategy',
-            'executor',
-            'ext',
-            'fair',
-            'machineType',
-            'queue',
-            'label',
-            'maxSubmitAwait',
-            'maxErrors',
-            'maxForks',
-            'maxRetries',
-            'memory',
-            'module',
-            'penv',
-            'pod',
-            'publishDir',
-            'scratch',
-            'shell',
-            'spack',
-            'storeDir',
-            'tag',
-            'time',
-            // input-output qualifiers
-            'file',
-            'val',
-            'each',
-            'env',
-            'secret',
-            'stdin',
-            'stdout',
-            'stageInMode',
-            'stageOutMode',
-            'resourceLabels'
-    ]
-
-    /**
-     * Names of directives that can be used more than once in the process definition
-     */
-    @PackageScope
-    static final List repeatableDirectives = ['label','module','pod','publishDir']
-
     /**
      * Default directives values
      */
@@ -146,27 +62,16 @@ class ProcessConfig implements Map, Cloneable {
      */
     private String processName
 
-    /**
-     * When {@code true} a {@link MissingPropertyException} is thrown when
-     * trying to access a property not existing
-     */
-    private boolean throwExceptionOnMissingProperty
-
     /**
      * List of process input definitions
      */
-    private inputs = new InputsList()
+    private ProcessInputs inputs
 
     /**
      * List of process output definitions
      */
-    private outputs = new OutputsList()
+    private ProcessOutputs outputs
 
-    /**
-     * Initialize the taskConfig object with the defaults values
-     *
-     * @param script The owner {@code BaseScript} configuration object
-     */
     protected ProcessConfig( BaseScript script ) {
         ownerScript = script
         configProperties = new LinkedHashMap()
@@ -205,68 +110,13 @@ class ProcessConfig implements Map, Cloneable {
         return this
     }
 
-    /**
-     * Enable special behavior to allow the configuration object
-     * invoking directive method from the process DSL
-     *
-     * @param value {@code true} enable capture mode, {@code false} otherwise
-     * @return The object itself
-     */
-    @PackageScope
-    ProcessConfig throwExceptionOnMissingProperty( boolean value ) {
-        this.throwExceptionOnMissingProperty = value
+    ProcessConfig setInputs(ProcessInputs inputs) {
+        this.inputs = inputs
         return this
     }
 
-    private void checkName(String name) {
-        if( DIRECTIVES.contains(name) )
-            return
-        if( name == NextflowDSLImpl.PROCESS_WHEN )
-            return
-        if( name == NextflowDSLImpl.PROCESS_STUB )
-            return
-
-        String message = "Unknown process directive: `$name`"
-        def alternatives = DIRECTIVES.closest(name)
-        if( alternatives.size()==1 ) {
-            message += '\n\nDid you mean of these?'
-            alternatives.each {
-                message += "\n        $it"
-            }
-        }
-        throw new IllegalDirectiveException(message)
-    }
-
-    Object invokeMethod(String name, Object args) {
-        /*
-         * This is need to patch #497 -- what is happening is that when in the config file
-         * is defined a directive like `memory`, `cpus`, etc in by using a closure,
-         * this closure is interpreted as method definition and it get invoked if a
-         * directive with the same name is defined in the process definition.
-         * To avoid that the offending property is removed from the map before the method
-         * is evaluated.
-         */
-        if( configProperties.get(name) instanceof Closure )
-            configProperties.remove(name)
-
-        this.metaClass.invokeMethod(this,name,args)
-    }
-
-    def methodMissing( String name, def args ) {
-        checkName(name)
-
-        if( args instanceof Object[] ) {
-            if( args.size()==1 ) {
-                configProperties[ name ] = args[0]
-            }
-            else {
-                configProperties[ name ] = args.toList()
-            }
-        }
-        else {
-            configProperties[ name ] = args
-        }
-
+    ProcessConfig setOutputs(ProcessOutputs outputs) {
+        this.outputs = outputs
         return this
     }
 
@@ -292,375 +142,30 @@ class ProcessConfig implements Map, Cloneable {
             default:
                 if( configProperties.containsKey(name) )
                     return configProperties.get(name)
-                else if( throwExceptionOnMissingProperty )
-                    throw new MissingPropertyException("Unknown variable '$name'", name, null)
                 else
                     return null
         }
 
     }
 
-    Object put( String name, Object value ) {
-
-        if( name in repeatableDirectives  ) {
-            final result = configProperties.get(name)
-            configProperties.remove(name)
-            this.metaClass.invokeMethod(this, name, value)
-            return result
-        }
-        else {
-            return configProperties.put(name,value)
-        }
-    }
-
     @PackageScope
     BaseScript getOwnerScript() { ownerScript }
 
+    @PackageScope
+    String getProcessName() { processName }
+
     TaskConfig createTaskConfig() {
         return new TaskConfig(configProperties)
     }
 
-    /**
-     * Apply the settings defined in the configuration file for the given annotation label, for example:
-     *
-     * ```
-     * process {
-     *     withLabel: foo {
-     *         cpus = 1
-     *         memory = 2.gb
-     *     }
-     * }
-     * ```
-     *
-     * @param configDirectives
-     *      A map object modelling the setting defined defined by the user in the nextflow configuration file
-     * @param labels
-     *      All the labels representing the object holding the configuration setting to apply
-     */
-    protected void applyConfigSelectorWithLabels(Map configDirectives, List labels ) {
-        final prefix = 'withLabel:'
-        for( String rule : configDirectives.keySet() ) {
-            if( !rule.startsWith(prefix) )
-                continue
-            final pattern = rule.substring(prefix.size()).trim()
-            if( !matchesLabels(labels, pattern) )
-                continue
-
-            log.debug "Config settings `$rule` matches labels `${labels.join(',')}` for process with name $processName"
-            def settings = configDirectives.get(rule)
-            if( settings instanceof Map ) {
-                applyConfigSettings(settings)
-            }
-            else if( settings != null ) {
-                throw new ConfigParseException("Unknown config settings for process labeled ${labels.join(',')} -- settings=$settings ")
-            }
-        }
-    }
-
-    static boolean matchesLabels( List labels, String pattern ) {
-        final isNegated = pattern.startsWith('!')
-        if( isNegated )
-            pattern = pattern.substring(1).trim()
-
-        final regex = Pattern.compile(pattern)
-        for (label in labels) {
-            if (regex.matcher(label).matches()) {
-                return !isNegated
-            }
-        }
-
-        return isNegated
-    }
-
-    protected void applyConfigSelectorWithName(Map configDirectives, String target ) {
-        final prefix = 'withName:'
-        for( String rule : configDirectives.keySet() ) {
-            if( !rule.startsWith(prefix) )
-                continue
-            final pattern = rule.substring(prefix.size()).trim()
-            if( !matchesSelector(target, pattern) )
-                continue
-
-            log.debug "Config settings `$rule` matches process $processName"
-            def settings = configDirectives.get(rule)
-            if( settings instanceof Map ) {
-                applyConfigSettings(settings)
-            }
-            else if( settings != null ) {
-                throw new ConfigParseException("Unknown config settings for process with name: $target  -- settings=$settings ")
-            }
-        }
-    }
-
-    static boolean matchesSelector( String name, String pattern ) {
-        final isNegated = pattern.startsWith('!')
-        if( isNegated )
-            pattern = pattern.substring(1).trim()
-        return Pattern.compile(pattern).matcher(name).matches() ^ isNegated
-    }
-
-    /**
-     * Apply the process configuration provided in the nextflow configuration file
-     * to the process instance
-     *
-     * @param configProcessScope The process configuration settings specified
-     *      in the configuration file as {@link Map} object
-     * @param simpleName The process name
-     */
-    void applyConfig(Map configProcessScope, String baseName, String simpleName, String fullyQualifiedName) {
-        // -- Apply the directives defined in the config object using the`withLabel:` syntax
-        final processLabels = this.getLabels() ?: ['']
-        this.applyConfigSelectorWithLabels(configProcessScope, processLabels)
-
-        // -- apply setting defined in the config file using the process base name
-        this.applyConfigSelectorWithName(configProcessScope, baseName)
-
-        // -- apply setting defined in the config file using the process simple name
-        if( simpleName && simpleName!=baseName )
-            this.applyConfigSelectorWithName(configProcessScope, simpleName)
-
-        // -- apply setting defined in the config file using the process qualified name (ie. with the execution scope)
-        if( fullyQualifiedName && (fullyQualifiedName!=simpleName || fullyQualifiedName!=baseName) )
-            this.applyConfigSelectorWithName(configProcessScope, fullyQualifiedName)
-
-        // -- Apply defaults
-        this.applyConfigDefaults(configProcessScope)
-
-        // -- check for conflicting settings
-        if( this.scratch && this.stageInMode == 'rellink' ) {
-            log.warn("Directives `scratch` and `stageInMode=rellink` conflict with each other -- Enforcing default stageInMode for process `$simpleName`")
-            this.remove('stageInMode')
-        }
-    }
-
-    void applyConfigLegacy(Map configProcessScope, String processName) {
-        applyConfig(configProcessScope, processName, null, null)
-    }
-
-
-    /**
-     * Apply the settings defined in the configuration file to the actual process configuration object
-     *
-     * @param settings
-     *      A map object modelling the setting defined defined by the user in the nextflow configuration file
-     */
-    protected void applyConfigSettings(Map settings) {
-        if( !settings )
-            return
-
-        for( Entry entry: settings ) {
-            if( entry.key.startsWith("withLabel:") || entry.key.startsWith("withName:"))
-                continue
-
-            if( !DIRECTIVES.contains(entry.key) )
-                log.warn "Unknown directive `$entry.key` for process `$processName`"
-
-            if( entry.key == 'params' ) // <-- patch issue #242
-                continue
-
-            if( entry.key == 'ext' ) {
-                if( this.getProperty('ext') instanceof Map ) {
-                    // update missing 'ext' properties found in 'process' scope
-                    def ext = this.getProperty('ext') as Map
-                    entry.value.each { String k, v -> ext[k] = v }
-                }
-                continue
-            }
-
-            this.put(entry.key,entry.value)
-        }
-    }
-
-    /**
-     * Apply the process settings defined globally in the process config scope
-     *
-     * @param processDefaults
-     *      A map object representing the setting to be applied to the process
-     *      (provided it does not already define a different value for
-     *      the same config setting).
-     *
-     */
-    protected void applyConfigDefaults( Map processDefaults ) {
-        for( String key : processDefaults.keySet() ) {
-            if( key == 'params' )
-                continue
-            final value = processDefaults.get(key)
-            final current = this.getProperty(key)
-            if( key == 'ext' ) {
-                if( value instanceof Map && current instanceof Map ) {
-                    final ext = current as Map
-                    value.each { k,v -> if(!ext.containsKey(k)) ext.put(k,v) }
-                }
-            }
-            else if( !this.containsKey(key) || (DEFAULT_CONFIG.containsKey(key) && current==DEFAULT_CONFIG.get(key)) ) {
-                this.put(key, value)
-            }
-        }
-    }
-
-    /**
-     * Type shortcut to {@code #configProperties.inputs}
-     */
-    InputsList getInputs() {
+    ProcessInputs getInputs() {
         inputs
     }
 
-    /**
-     * Type shortcut to {@code #configProperties.outputs}
-     */
-    OutputsList getOutputs() {
+    ProcessOutputs getOutputs() {
         outputs
     }
 
-    /**
-     * Implements the process {@code debug} directive.
-     */
-    ProcessConfig debug( value ) {
-        configProperties.debug = value
-        return this
-    }
-
-    /**
-     * Implements the process {@code echo} directive for backwards compatibility.
-     *
-     * note: without this method definition {@link BaseScript#echo} will be invoked
-     */
-    ProcessConfig echo( value ) {
-        log.warn1('The `echo` directive has been deprecated - use to `debug` instead')
-        configProperties.debug = value
-        return this
-    }
-
-    /// input parameters
-
-    InParam _in_val( obj ) {
-        new ValueInParam(this).bind(obj)
-    }
-
-    InParam _in_file( obj ) {
-        new FileInParam(this).bind(obj)
-    }
-
-    InParam _in_path( Map opts=null, obj ) {
-        new FileInParam(this)
-                .setPathQualifier(true)
-                .setOptions(opts)
-                .bind(obj)
-    }
-
-    InParam _in_each( obj ) {
-        new EachInParam(this).bind(obj)
-    }
-
-    InParam _in_tuple( Object... obj ) {
-        new TupleInParam(this).bind(obj)
-    }
-
-    InParam _in_stdin( obj = null ) {
-        def result = new StdInParam(this)
-        if( obj ) result.bind(obj)
-        result
-    }
-
-    InParam _in_env( obj ) {
-        new EnvInParam(this).bind(obj)
-    }
-
-
-    /// output parameters
-
-    OutParam _out_val( Object obj ) {
-        new ValueOutParam(this).bind(obj)
-    }
-
-    OutParam _out_val( Map opts, Object obj ) {
-        new ValueOutParam(this)
-                .setOptions(opts)
-                .bind(obj)
-    }
-
-    OutParam _out_env( Object obj ) {
-        new EnvOutParam(this).bind(obj)
-    }
-
-    OutParam _out_env( Map opts, Object obj ) {
-        new EnvOutParam(this)
-                .setOptions(opts)
-                .bind(obj)
-    }
-
-    OutParam _out_eval(Object obj ) {
-        new CmdEvalParam(this).bind(obj)
-    }
-
-    OutParam _out_eval(Map opts, Object obj ) {
-        new CmdEvalParam(this)
-            .setOptions(opts)
-            .bind(obj)
-    }
-
-    OutParam _out_file( Object obj ) {
-        // note: check that is a String type to avoid to force
-        // the evaluation of GString object to a string
-        if( obj instanceof String && obj == '-' )
-            new StdOutParam(this).bind(obj)
-
-        else
-            new FileOutParam(this).bind(obj)
-    }
-
-    OutParam _out_path( Map opts=null, Object obj ) {
-        // note: check that is a String type to avoid to force
-        // the evaluation of GString object to a string
-        if( obj instanceof String && obj == '-' ) {
-            new StdOutParam(this)
-                    .setOptions(opts)
-                    .bind(obj)
-        }
-        else {
-            new FileOutParam(this)
-                    .setPathQualifier(true)
-                    .setOptions(opts)
-                    .bind(obj)
-        }
-    }
-
-    OutParam _out_tuple( Object... obj ) {
-        new TupleOutParam(this) .bind(obj)
-    }
-
-    OutParam _out_tuple( Map opts, Object... obj ) {
-        new TupleOutParam(this)
-                .setOptions(opts)
-                .bind(obj)
-    }
-
-    OutParam _out_stdout( Map opts ) {
-        new StdOutParam(this)
-                .setOptions(opts)
-                .bind('-')
-    }
-
-    OutParam _out_stdout( obj = null ) {
-        def result = new StdOutParam(this).bind('-')
-        if( obj ) {
-            result.into(obj)
-        }
-        result
-    }
-
-    /**
-     * Defines a special *dummy* input parameter, when no inputs are
-     * provided by the user for the current task
-     */
-    void fakeInput() {
-        new DefaultInParam(this)
-    }
-
-    void fakeOutput() {
-        new DefaultOutParam(this)
-    }
-
     boolean isCacheable() {
         def value = configProperties.cache
         if( value == null )
@@ -679,73 +184,6 @@ class ProcessConfig implements Map, Cloneable {
         HashMode.of(configProperties.cache) ?: HashMode.DEFAULT()
     }
 
-    protected boolean isValidLabel(String lbl) {
-        def p = lbl.indexOf('=')
-        if( p==-1 )
-            return LABEL_REGEXP.matcher(lbl).matches()
-
-        def left = lbl.substring(0,p)
-        def right = lbl.substring(p+1)
-        return LABEL_REGEXP.matcher(left).matches() && LABEL_REGEXP.matcher(right).matches()
-    }
-
-    /**
-     * Implements the process {@code label} directive.
-     *
-     * Note this directive  can be specified (invoked) more than one time in
-     * the process context.
-     *
-     * @param lbl
-     *      The label to be attached to the process.
-     * @return
-     *      The {@link ProcessConfig} instance itself.
-     */
-    ProcessConfig label(String lbl) {
-        if( !lbl ) return this
-
-        // -- check that label has a valid syntax
-        if( !isValidLabel(lbl) )
-            throw new IllegalConfigException("Not a valid process label: $lbl -- Label must consist of alphanumeric characters or '_', must start with an alphabetic character and must end with an alphanumeric character")
-
-        // -- get the current label, it must be a list
-        def allLabels = (List)configProperties.get('label')
-        if( !allLabels ) {
-            allLabels = new ConfigList()
-            configProperties.put('label', allLabels)
-        }
-
-        // -- avoid duplicates
-        if( !allLabels.contains(lbl) )
-            allLabels.add(lbl)
-        return this
-    }
-
-    /**
-     * Implements the process {@code label} directive.
-     *
-     * Note this directive  can be specified (invoked) more than one time in
-     * the process context.
-     *
-     * @param map
-     *      The map to be attached to the process.
-     * @return
-     *      The {@link ProcessConfig} instance itself.
-     */
-    ProcessConfig resourceLabels(Map map) {
-        if( !map )
-            return this
-
-        // -- get the current sticker, it must be a Map
-        def allLabels = (Map)configProperties.get('resourceLabels')
-        if( !allLabels ) {
-            allLabels = [:]
-        }
-        // -- merge duplicates
-        allLabels += map
-        configProperties.put('resourceLabels', allLabels)
-        return this
-    }
-
     Map getResourceLabels() {
         (configProperties.get('resourceLabels') ?: Collections.emptyMap()) as Map
     }
@@ -767,240 +205,8 @@ class ProcessConfig implements Map, Cloneable {
             throw new IllegalArgumentException("Unexpected value for directive `fair` -- offending value: $value")
     }
 
-    ProcessConfig secret(String name) {
-        if( !name )
-            return this
-
-        // -- get the current label, it must be a list
-        def allSecrets = (List)configProperties.get('secret')
-        if( !allSecrets ) {
-            allSecrets = new ConfigList()
-            configProperties.put('secret', allSecrets)
-        }
-
-        // -- avoid duplicates
-        if( !allSecrets.contains(name) )
-            allSecrets.add(name)
-        return this
-    }
-
     List getSecret() {
         (List) configProperties.get('secret') ?: Collections.emptyList()
     }
 
-    /**
-     * Implements the process {@code module} directive.
-     *
-     * See also http://modules.sourceforge.net
-     *
-     * @param moduleName
-     *      The module name to be used to execute the process.
-     * @return
-     *      The {@link ProcessConfig} instance itself.
-     */
-    ProcessConfig module( moduleName ) {
-        // when no name is provided, just exit
-        if( !moduleName )
-            return this
-
-        def result = (List)configProperties.module
-        if( result == null ) {
-            result = new ConfigList()
-            configProperties.put('module', result)
-        }
-
-        if( moduleName instanceof List )
-            result.addAll(moduleName)
-        else
-            result.add(moduleName)
-        return this
-    }
-
-    /**
-     * Implements the {@code errorStrategy} directive
-     *
-     * @see ErrorStrategy
-     *
-     * @param strategy
-     *      A string representing the error strategy to be used.
-     * @return
-     *      The {@link ProcessConfig} instance itself.
-     */
-    ProcessConfig errorStrategy( strategy ) {
-        if( strategy instanceof CharSequence && !ErrorStrategy.isValid(strategy) ) {
-            throw new IllegalArgumentException("Unknown error strategy '${strategy}' ― Available strategies are: ${ErrorStrategy.values().join(',').toLowerCase()}")
-        }
-
-        configProperties.put('errorStrategy', strategy)
-        return this
-    }
-
-    /**
-     * Allow the user to specify publishDir directive as a map eg:
-     *
-     *     publishDir path:'/some/dir', mode: 'copy'
-     *
-     * @param params
-     *      A map representing the the publishDir setting
-     * @return
-     *      The {@link ProcessConfig} instance itself
-     */
-    ProcessConfig publishDir(Map params) {
-        if( !params )
-            return this
-
-        def dirs = (List)configProperties.get('publishDir')
-        if( !dirs ) {
-            dirs = new ConfigList()
-            configProperties.put('publishDir', dirs)
-        }
-
-        dirs.add(params)
-        return this
-    }
-
-    /**
-     * Allow the user to specify publishDir directive with a path and a list of named parameters, eg:
-     *
-     *     publishDir '/some/dir', mode: 'copy'
-     *
-     * @param params
-     *      A map representing the publishDir properties
-     * @param target
-     *      The target publishDir path
-     * @return
-     *      The {@link ProcessConfig} instance itself
-     */
-    ProcessConfig publishDir(Map params, target) {
-        params.put('path', target)
-        publishDir( params )
-    }
-
-    /**
-     * Allow the user to specify the publishDir as a string path, eg:
-     *
-     *      publishDir '/some/dir'
-     *
-     * @param target
-     *      The target publishDir path
-     * @return
-     *      The {@link ProcessConfig} instance itself
-     */
-    ProcessConfig publishDir( target ) {
-        if( target instanceof List ) {
-            for( Object item : target ) { publishDir(item) }
-        }
-        else if( target instanceof Map ) {
-            publishDir( target as Map )
-        }
-        else {
-            publishDir([path: target])
-        }
-        return this
-    }
-
-    /**
-     * Allow use to specify K8s `pod` options
-     *
-     * @param entry
-     *      A map object representing pod config options
-     * @return
-     *      The {@link ProcessConfig} instance itself
-     */
-    ProcessConfig pod( Map entry ) {
-
-        if( !entry )
-            return this
-
-        def allOptions = (List)configProperties.get('pod')
-        if( !allOptions ) {
-            allOptions = new ConfigList()
-            configProperties.put('pod', allOptions)
-        }
-
-        allOptions.add(entry)
-        return this
-
-    }
-
-    ProcessConfig accelerator( Map params, value )  {
-        if( value instanceof Number ) {
-            if( params.limit==null )
-                params.limit=value
-            else if( params.request==null )
-                params.request=value
-        }
-        else if( value != null )
-            throw new IllegalArgumentException("Not a valid `accelerator` directive value: $value [${value.getClass().getName()}]")
-        accelerator(params)
-        return this
-    }
-
-    ProcessConfig accelerator( value ) {
-        if( value instanceof Number )
-            configProperties.put('accelerator', [limit: value])
-        else if( value instanceof Map )
-            configProperties.put('accelerator', value)
-        else if( value != null )
-            throw new IllegalArgumentException("Not a valid `accelerator` directive value: $value [${value.getClass().getName()}]")
-        return this
-    }
-
-    /**
-     * Allow user to specify `disk` directive as a value with a list of options, eg:
-     *
-     *     disk 375.GB, type: 'local-ssd'
-     *
-     * @param opts
-     *      A map representing the disk options
-     * @param value
-     *      The default disk value
-     * @return
-     *      The {@link ProcessConfig} instance itself
-     */
-    ProcessConfig disk( Map opts, value )  {
-        opts.request = value
-        return disk(opts)
-    }
-
-    /**
-     * Allow user to specify `disk` directive as a value or a list of options, eg:
-     *
-     *     disk 100.GB
-     *     disk request: 375.GB, type: 'local-ssd'
-     *
-     * @param value
-     *      The default disk value or map of options
-     * @return
-     *      The {@link ProcessConfig} instance itself
-     */
-    ProcessConfig disk( value ) {
-        if( value instanceof Map || value instanceof Closure )
-            configProperties.put('disk', value)
-        else
-            configProperties.put('disk', [request: value])
-        return this
-    }
-
-    ProcessConfig arch( Map params, value )  {
-        if( value instanceof String ) {
-            if( params.name==null )
-                params.name=value
-        }
-        else if( value != null )
-            throw new IllegalArgumentException("Not a valid `arch` directive value: $value [${value.getClass().getName()}]")
-        arch(params)
-        return this
-    }
-
-    ProcessConfig arch( value ) {
-        if( value instanceof String )
-            configProperties.put('arch', [name: value])
-        else if( value instanceof Map )
-            configProperties.put('arch', value)
-        else if( value != null )
-            throw new IllegalArgumentException("Not a valid `arch` directive value: $value [${value.getClass().getName()}]")
-        return this
-    }
-
 }
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/ProcessDef.groovy b/modules/nextflow/src/main/groovy/nextflow/script/ProcessDef.groovy
index 3c54f0e426..073080c451 100644
--- a/modules/nextflow/src/main/groovy/nextflow/script/ProcessDef.groovy
+++ b/modules/nextflow/src/main/groovy/nextflow/script/ProcessDef.groovy
@@ -18,16 +18,15 @@ package nextflow.script
 
 import groovy.transform.CompileStatic
 import groovy.util.logging.Slf4j
+import groovyx.gpars.dataflow.DataflowReadChannel
 import nextflow.Const
 import nextflow.Global
+import nextflow.NF
 import nextflow.Session
 import nextflow.exception.ScriptRuntimeException
 import nextflow.extension.CH
-import nextflow.script.params.BaseInParam
-import nextflow.script.params.BaseOutParam
-import nextflow.script.params.EachInParam
-import nextflow.script.params.InputsList
-import nextflow.script.params.OutputsList
+import nextflow.extension.MergeWithEachOp
+import nextflow.script.dsl.ProcessConfigBuilder
 
 /**
  * Models a nextflow process definition
@@ -62,32 +61,28 @@ class ProcessDef extends BindableDef implements IterableDef, ChainableDef {
      */
     private String baseName
 
-    /**
-     * The closure holding the process definition body
-     */
-    private Closure rawBody
-
     /**
      * The resolved process configuration
      */
-    private transient ProcessConfig processConfig
+    private ProcessConfig config
 
     /**
      * The actual process implementation
      */
-    private transient BodyDef taskBody
+    private BodyDef taskBody
 
     /**
      * The result of the process execution
      */
     private transient ChannelOut output
 
-    ProcessDef(BaseScript owner, Closure body, String name ) {
+    ProcessDef(BaseScript owner, String name, BodyDef body, ProcessConfig config) {
         this.owner = owner
-        this.rawBody = body
         this.simpleName = name
         this.processName = name
         this.baseName = name
+        this.taskBody = body
+        this.config = config
     }
 
     static String stripScope(String str) {
@@ -95,32 +90,15 @@ class ProcessDef extends BindableDef implements IterableDef, ChainableDef {
     }
 
     protected void initialize() {
-        log.trace "Process config > $processName"
-        assert processConfig==null
-
-        // the config object
-        processConfig = new ProcessConfig(owner,processName)
-
-        // Invoke the code block which will return the script closure to the executed.
-        // As side effect will set all the property declarations in the 'taskConfig' object.
-        processConfig.throwExceptionOnMissingProperty(true)
-        final copy = (Closure)rawBody.clone()
-        copy.setResolveStrategy(Closure.DELEGATE_FIRST)
-        copy.setDelegate(processConfig)
-        taskBody = copy.call() as BodyDef
-        processConfig.throwExceptionOnMissingProperty(false)
-        if ( !taskBody )
-            throw new ScriptRuntimeException("Missing script in the specified process block -- make sure it terminates with the script string to be executed")
-
         // apply config settings to the process
-        processConfig.applyConfig((Map)session.config.process, baseName, simpleName, processName)
+        new ProcessConfigBuilder(config).applyConfig((Map)session.config.process, baseName, simpleName, processName)
     }
 
     @Override
     ProcessDef clone() {
         def result = (ProcessDef)super.clone()
-        result.@taskBody = taskBody?.clone()
-        result.@rawBody = (Closure)rawBody?.clone()
+        result.@taskBody = taskBody.clone()
+        result.@config = config.clone()
         return result
     }
 
@@ -130,12 +108,13 @@ class ProcessDef extends BindableDef implements IterableDef, ChainableDef {
         def result = clone()
         result.@processName = name
         result.@simpleName = stripScope(name)
+        result.@config.processName = name
         return result
     }
 
-    private InputsList getDeclaredInputs() { processConfig.getInputs() }
+    private ProcessInputs getDeclaredInputs() { config.getInputs() }
 
-    private OutputsList getDeclaredOutputs() { processConfig.getOutputs() }
+    private ProcessOutputs getDeclaredOutputs() { config.getOutputs() }
 
     BaseScript getOwner() { owner }
 
@@ -145,7 +124,7 @@ class ProcessDef extends BindableDef implements IterableDef, ChainableDef {
 
     String getBaseName() { baseName }
 
-    ProcessConfig getProcessConfig() { processConfig }
+    ProcessConfig getProcessConfig() { config }
 
     ChannelOut getOut() {
         if( output==null )
@@ -162,66 +141,86 @@ class ProcessDef extends BindableDef implements IterableDef, ChainableDef {
         return "Process `$name` declares ${expected} input ${ch} but ${actual} were specified"
     }
 
+    /**
+     * Bind the process invocation arguments to the declared inputs
+     * and combine them into a single channel of task inputs.
+     */
+    private DataflowReadChannel collectInputs(Object[] args0) {
+        final args = ChannelOut.spread(args0)
+        if( args.size() != declaredInputs.size() )
+            throw new ScriptRuntimeException(missMatchErrMessage(processName, declaredInputs.size(), args.size()))
+
+        // emit value channel if process has no inputs
+        if( args.size() == 0 ) {
+            final source = CH.value()
+            source.bind([])
+            return source
+        }
+
+        // create input channels
+        for( int i = 0; i < declaredInputs.size(); i++ )
+            declaredInputs[i].bind(args[i])
+
+        // combine input channels
+        final count = declaredInputs.count( param -> CH.isChannelQueue(param) && !param.isIterator() )
+        if( count > 1 ) {
+            final msg = "Process `$processName` received multiple queue channel inputs which will be implicitly mergeed -- consider combining them explicitly with `combine` or `join`, or converting single-item chennels into value channels with `collect` or `first`"
+            if( NF.isStrictMode() )
+                throw new ScriptRuntimeException(msg)
+            log.warn(msg)
+        }
+
+        final iterators = (0..<declaredInputs.size()).findAll( i -> declaredInputs[i].isIterator() )
+        return CH.getReadChannel(new MergeWithEachOp(declaredInputs.getChannels(), iterators).apply())
+    }
+
+    /**
+     * Create the output channels for the declared outputs, falling back
+     * to stdout when no outputs are defined.
+     */
+    private void collectOutputs(boolean singleton) {
+        // emit stdout if no outputs are defined
+        if( declaredOutputs.size() == 0 ) {
+            declaredOutputs.setDefault()
+            return
+        }
+
+        // check for feedback channels
+        final feedbackChannels = getFeedbackChannels()
+        if( feedbackChannels && feedbackChannels.size() != declaredOutputs.size() )
+            throw new ScriptRuntimeException("Process `$processName` inputs and outputs do not have the same cardinality - Feedback loop is not supported"  )
+
+        for( int i=0; i<declaredOutputs.size(); i++ )
+            declaredOutputs[i].setChannel( feedbackChannels ? feedbackChannels[i] : CH.create(singleton) )
+    }
+
-        assert declaredOutputs.size()>0, "Process output should contains at least one channel"
-        return output = new ChannelOut(copyOuts)
+        return output = new ChannelOut(declaredOutputs)
     }
 
 }
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/ProcessFactory.groovy b/modules/nextflow/src/main/groovy/nextflow/script/ProcessFactory.groovy
index c4b776f318..c6133a1b13 100755
--- a/modules/nextflow/src/main/groovy/nextflow/script/ProcessFactory.groovy
+++ b/modules/nextflow/src/main/groovy/nextflow/script/ProcessFactory.groovy
@@ -23,6 +23,8 @@ import nextflow.Session
 import nextflow.executor.Executor
 import nextflow.executor.ExecutorFactory
 import nextflow.processor.TaskProcessor
+import nextflow.script.dsl.ProcessConfigBuilder
+import nextflow.script.dsl.ProcessDsl
 /**
  *  Factory class for {@TaskProcessor} instances
  *
@@ -83,21 +85,20 @@ class ProcessFactory {
         assert body
         assert config.process instanceof Map
 
-        // -- the config object
-        final processConfig = new ProcessConfig(owner, name)
+        final builder = new ProcessDsl(owner, name)
         // Invoke the code block which will return the script closure to the executed.
         // As side effect will set all the property declarations in the 'taskConfig' object.
-        processConfig.throwExceptionOnMissingProperty(true)
         final copy = (Closure)body.clone()
         copy.setResolveStrategy(Closure.DELEGATE_FIRST)
-        copy.setDelegate(processConfig)
+        copy.setDelegate(builder)
         final script = copy.call()
-        processConfig.throwExceptionOnMissingProperty(false)
         if ( !script )
             throw new IllegalArgumentException("Missing script in the specified process block -- make sure it terminates with the script string to be executed")
 
         // -- apply settings from config file to process config
-        processConfig.applyConfigLegacy((Map)config.process, name)
+        final processConfig = builder.getConfig()
+
+        new ProcessConfigBuilder(processConfig).applyConfig((Map)config.process, name, null, null)
 
         // -- get the executor for the given process config
         final execObj = executorFactory.getExecutor(name, processConfig, script, session)
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/ProcessFileInput.groovy b/modules/nextflow/src/main/groovy/nextflow/script/ProcessFileInput.groovy
new file mode 100644
index 0000000000..bb4dfc93a9
--- /dev/null
+++ b/modules/nextflow/src/main/groovy/nextflow/script/ProcessFileInput.groovy
@@ -0,0 +1,86 @@
+/*
+ * Copyright 2013-2024, Seqera Labs
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package nextflow.script
+
+import groovy.transform.CompileStatic
+
+/**
+ * Models a process file input, which defines a file
+ * or set of files to be staged into a task work directory.
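+ *
+ * For example, a declaration such as {@code path x, stageAs: 'data.txt'} can be
+ * modelled as a file input whose value is a lazy reference to {@code x} and
+ * whose stage pattern is {@code 'data.txt'}.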
+ *
+ * @author Ben Sherman 
+ */
+@CompileStatic
+class ProcessFileInput implements PathArityAware {
+
+    /**
+     * Lazy expression (e.g. lazy var, closure, GString) which
+     * defines which files to stage in terms of the task inputs.
+     * It is evaluated for each task against the task context.
+     */
+    private Object value
+
+    /**
+     * Optional name which, if specified, will be added to the task
+     * context as an escape-aware list of paths.
+     */
+    private String name
+
+    /**
+     * Flag to support legacy `file` input.
+     */
+    private boolean pathQualifier
+
+    /**
+     * File pattern which defines how the input files should be named
+     * when they are staged into a task directory.
+     */
+    private Object filePattern
+
+    ProcessFileInput(Object value, String name, boolean pathQualifier, Map opts) {
+        this.value = value
+        this.name = name
+        this.pathQualifier = pathQualifier
+
+        for( Map.Entry entry : opts )
+            setProperty(entry.key, entry.value)
+    }
+
+    void setStageAs(CharSequence value) {
+        this.filePattern = value
+    }
+
+    Object resolve(Map ctx) {
+        return LazyHelper.resolve(ctx, value)
+    }
+
+    String getName() {
+        return name
+    }
+
+    boolean isPathQualifier() {
+        return pathQualifier
+    }
+
+    String getFilePattern(Map ctx) {
+        if( filePattern != null )
+            return LazyHelper.resolve(ctx, filePattern)
+        else
+            return filePattern = '*'
+    }
+
+}
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/ProcessFileOutput.groovy b/modules/nextflow/src/main/groovy/nextflow/script/ProcessFileOutput.groovy
new file mode 100644
index 0000000000..4d364cc8e2
--- /dev/null
+++ b/modules/nextflow/src/main/groovy/nextflow/script/ProcessFileOutput.groovy
@@ -0,0 +1,149 @@
+/*
+ * Copyright 2013-2024, Seqera Labs
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package nextflow.script
+
+import java.nio.file.Path
+
+import groovy.transform.CompileStatic
+import groovy.util.logging.Slf4j
+import nextflow.exception.IllegalFileException
+import nextflow.file.FilePatternSplitter
+import nextflow.util.BlankSeparatedList
+/**
+ * Models a process file output, which defines a file
+ * or set of files to be unstaged from a task work directory.
+ *
+ * @author Paolo Di Tommaso 
+ * @author Ben Sherman 
+ */
+@Slf4j
+@CompileStatic
+class ProcessFileOutput implements PathArityAware {
+
+    /**
+     * Lazy expression (e.g. lazy var, closure, GString) which
+     * defines which files to unstage from the task directory.
+     * It will be evaluated for each task against the task directory.
+     */
+    private Object target
+
+    /**
+     * Flag to support legacy `file` output.
+     */
+    private boolean pathQualifier
+
+    /**
+     * When true it will not fail if no files are found.
+     */
+    boolean optional
+
+    /**
+     * When true it follows symbolic links during directories tree traversal, otherwise they are managed as files (default: true)
+     */
+    boolean followLinks = true
+
+    /**
+     * When true the specified name is interpreted as a glob pattern (default: true)
+     */
+    boolean glob = true
+
+    /**
+     * When {@code true} star wildcard (*) matches hidden files (files starting with a dot char)
+     * By default it does not, coherently with linux bash rule
+     */
+    boolean hidden
+
+    /**
+     * When {@code true} file pattern includes input files as well as output files.
+     * By default a file pattern matches only against files produced by the process, not
+     * the ones received as input
+     */
+    boolean includeInputs
+
+    /**
+     * Maximum number of directory levels to visit (default: no limit)
+     */
+    Integer maxDepth
+
+    /**
+     * The type of path to output, either 'file', 'dir' or 'any'
+     */
+    String type
+
+    ProcessFileOutput(Object target, boolean pathQualifier, Map opts) {
+        this.target = target
+        this.pathQualifier = pathQualifier
+
+        for( Map.Entry entry : opts )
+            setProperty(entry.key, entry.value)
+    }
+
+    boolean isPathQualifier() {
+        return pathQualifier
+    }
+
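+    /**
+     * Get the file patterns to unstage for a given task, resolving the
+     * target expression against the task context. The resolved value may be
+     * a single path, a collection of paths, or (for legacy `file` outputs)
+     * a string of names separated by ':'.
+     *
+     * @param context The task context
+     * @param workDir The task work directory used to relativize absolute paths
+     * @return The list of file patterns relative to the task work directory
+     */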
+    List getFilePatterns(Map context, Path workDir) {
+        final entry = LazyHelper.resolve(context, target)
+
+        if( !entry )
+            return []
+
+        // -- single path
+        if( entry instanceof Path )
+            return [ relativize(entry, workDir) ]
+
+        // -- multiple paths
+        if( entry instanceof BlankSeparatedList || entry instanceof List )
+            return entry.collect( path -> relativize(path.toString(), workDir) )
+
+        // -- literal file names separated by ':' (legacy `file` output)
+        final nameString = entry.toString()
+        if( !pathQualifier && nameString.contains(':') )
+            return nameString.split(/:/).collect { String it-> relativize(it, workDir) }
+
+        // -- literal file name
+        return [ relativize(nameString, workDir) ]
+    }
+
+    protected String relativize(String path, Path workDir) {
+        if( !path.startsWith('/') )
+            return path
+
+        final dir = workDir.toString()
+        if( !path.startsWith(dir) )
+            throw new IllegalFileException("File `$path` is outside the scope of the process work directory: $workDir")
+
+        if( path.length()-dir.length()<2 )
+            throw new IllegalFileException("Missing output file name")
+
+        return path.substring(dir.size()+1)
+    }
+
+    protected String relativize(Path path, Path workDir) {
+        if( !path.isAbsolute() )
+            return glob ? FilePatternSplitter.GLOB.escape(path) : path
+
+        if( !path.startsWith(workDir) )
+            throw new IllegalFileException("File `$path` is outside the scope of the process work directory: $workDir")
+
+        if( path.nameCount == workDir.nameCount )
+            throw new IllegalFileException("Missing output file name")
+
+        final rel = path.subpath(workDir.getNameCount(), path.getNameCount())
+        return glob ? FilePatternSplitter.GLOB.escape(rel) : rel
+    }
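+
+    // Illustrative example: given a work directory `/work/ab/123`, the path
+    // `/work/ab/123/out/result.txt` is returned as `out/result.txt` (glob-escaped
+    // when `glob` is enabled), while `/tmp/other.txt` raises an IllegalFileException.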
+}
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/ProcessInput.groovy b/modules/nextflow/src/main/groovy/nextflow/script/ProcessInput.groovy
new file mode 100644
index 0000000000..7dade2ffcf
--- /dev/null
+++ b/modules/nextflow/src/main/groovy/nextflow/script/ProcessInput.groovy
@@ -0,0 +1,123 @@
+/*
+ * Copyright 2013-2024, Seqera Labs
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package nextflow.script
+
+import groovy.transform.CompileStatic
+import groovyx.gpars.dataflow.DataflowBroadcast
+import groovyx.gpars.dataflow.DataflowReadChannel
+import groovyx.gpars.dataflow.DataflowVariable
+import groovyx.gpars.dataflow.expression.DataflowExpression
+import nextflow.extension.CH
+import nextflow.extension.ToListOp
+
+/**
+ * Models a process input.
+ *
+ * @author Ben Sherman 
+ */
+@CompileStatic
+class ProcessInput implements Cloneable {
+
+    /**
+     * Parameter name under which the input value for each task
+     * will be added to the task context.
+     */
+    private String name
+
+    /**
+     * Parameter type which is used to validate task inputs
+     */
+    private Class type
+
+    /**
+     * Input channel which is created when the process is invoked
+     * in a workflow.
+     */
+    private DataflowReadChannel channel
+
+    /**
+     * Flag to support `each` input
+     */
+    private boolean iterator
+
+    ProcessInput(String name, Class type) {
+        this.name = name
+        this.type = type
+    }
+
+    String getName() {
+        return name
+    }
+
+    Class getType() {
+        return type
+    }
+
+    void bind(Object value) {
+        this.channel = getInChannel(value)
+    }
+
+    private DataflowReadChannel getInChannel(Object value) {
+        if( value == null )
+            throw new IllegalArgumentException('A process input channel evaluates to null')
+
+        if( iterator )
+            value = getIteratorChannel(value)
+
+        if( value instanceof DataflowReadChannel || value instanceof DataflowBroadcast )
+            return CH.getReadChannel(value)
+
+        final result = CH.value()
+        result.bind(value)
+        return result
+    }
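+
+    // Note: a dataflow queue or broadcast is adapted with CH.getReadChannel(),
+    // while any other non-null value is wrapped into a value channel, e.g.
+    // (illustrative, `someQueueChannel` is hypothetical):
+    //
+    //   input.bind(42)                 // value channel emitting 42
+    //   input.bind(someQueueChannel)   // read view over the existing channel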
+
+    private DataflowReadChannel getIteratorChannel(Object value) {
+        def result
+        if( value instanceof DataflowExpression ) {
+            result = value
+        }
+        else if( CH.isChannel(value) ) {
+            def read = CH.getReadChannel(value)
+            result = new ToListOp(read).apply()
+        }
+        else {
+            result = new DataflowVariable()
+            result.bind(value)
+        }
+
+        return result.chainWith { it instanceof Collection || it == null ? it : [it] }
+    }
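+
+    // Note: for `each` inputs the source is collected into a single list value
+    // (via ToListOp) so that the process can iterate over it, while a scalar
+    // value is wrapped into a single-element list by the trailing chainWith closure.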
+
+    DataflowReadChannel getChannel() {
+        return channel
+    }
+
+    void setIterator(boolean iterator) {
+        this.iterator = iterator
+    }
+
+    boolean isIterator() {
+        return iterator
+    }
+
+    @Override
+    ProcessInput clone() {
+        (ProcessInput)super.clone()
+    }
+
+}
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/ProcessInputs.groovy b/modules/nextflow/src/main/groovy/nextflow/script/ProcessInputs.groovy
new file mode 100644
index 0000000000..11160ea9b7
--- /dev/null
+++ b/modules/nextflow/src/main/groovy/nextflow/script/ProcessInputs.groovy
@@ -0,0 +1,104 @@
+/*
+ * Copyright 2013-2024, Seqera Labs
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package nextflow.script
+
+import groovyx.gpars.dataflow.DataflowReadChannel
+
+/**
+ * Models the process inputs.
+ *
+ * @author Ben Sherman 
+ */
+class ProcessInputs implements List<ProcessInput>, Cloneable {
+
+    @Delegate
+    private List<ProcessInput> params = []
+
+    /**
+     * Input variables which will be evaluated for each task
+     * in terms of the task inputs and added to the task context.
+     */
+    private Map<String,Object> vars = [:]
+
+    /**
+     * Environment variables which will be evaluated for each
+     * task against the task context and added to the task
+     * environment.
+     */
+    private Map<String,Object> env = [:]
+
+    /**
+     * Input files which will be evaluated for each task
+     * against the task context and staged into the task
+     * directory.
+     */
+    private List<ProcessFileInput> files = []
+
+    /**
+     * Lazy expression which will be evaluated for each task
+     * against the task context and provided as the standard
+     * input to the task.
+     */
+    Object stdin
+
+    void addParam(String name, Class type=null) {
+        add(new ProcessInput(name, type))
+    }
+
+    void addVariable(String name, Object value) {
+        vars.put(name, value)
+    }
+
+    void addEnv(String name, Object value) {
+        env.put(name, value)
+    }
+
+    void addFile(ProcessFileInput file) {
+        files.add(file)
+    }
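+
+    // Illustrative mapping (assuming the ProcessDsl semantics introduced by this
+    // change): a declaration such as
+    //
+    //   input:
+    //   tuple val(id), path(reads)
+    //
+    // yields one ProcessInput parameter plus a variable entry for `id` and a
+    // ProcessFileInput entry (with its own variable) for `reads`.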
+
+    List<String> getNames() {
+        return params*.getName()
+    }
+
+    List<DataflowReadChannel> getChannels() {
+        return params*.getChannel()
+    }
+
+    Map<String,Object> getVariables() {
+        return vars
+    }
+
+    Map<String,Object> getEnv() {
+        return env
+    }
+
+    List<ProcessFileInput> getFiles() {
+        return files
+    }
+
+    @Override
+    ProcessInputs clone() {
+        def result = (ProcessInputs)super.clone()
+        result.params = new ArrayList<>(params.size())
+        for( ProcessInput param : params ) {
+            result.params.add(param.clone())
+        }
+        return result
+    }
+
+}
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/ProcessOutput.groovy b/modules/nextflow/src/main/groovy/nextflow/script/ProcessOutput.groovy
new file mode 100644
index 0000000000..a9452b5276
--- /dev/null
+++ b/modules/nextflow/src/main/groovy/nextflow/script/ProcessOutput.groovy
@@ -0,0 +1,127 @@
+/*
+ * Copyright 2013-2024, Seqera Labs
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package nextflow.script
+
+import groovy.transform.CompileStatic
+import groovy.util.logging.Slf4j
+import groovyx.gpars.dataflow.DataflowWriteChannel
+import nextflow.processor.TaskOutputCollector
+import nextflow.processor.TaskRun
+import nextflow.util.ConfigHelper
+/**
+ * Models a process output.
+ *
+ * @author Ben Sherman 
+ */
+@Slf4j
+@CompileStatic
+class ProcessOutput implements Cloneable {
+
+    /**
+     * List of declared outputs of the parent process.
+     */
+    private ProcessOutputs declaredOutputs
+
+    /**
+     * Lazy expression (e.g. lazy var, closure, GString) which
+     * defines the output value in terms of the task context,
+     * including environment variables, files, and standard output.
+     * It will be evaluated for each task after it is executed. 
+     */
+    private Object target
+
+    /**
+     * Optional parameter name under which the output channel
+     * is made available in the process outputs (i.e. `.out`).
+     */
+    private String name
+
+    /**
+     * Optional parameter type which is used to validate
+     * task outputs
+     */
+    private Class type
+
+    /**
+     * Optional channel topic which this output channel will
+     * be sent to.
+     */
+    private String topic
+
+    /**
+     * When true, a task will not fail if any environment
+     * vars or files for this output are missing.
+     */
+    private boolean optional
+
+    /**
+     * Output channel which is created when the process is invoked
+     * in a workflow.
+     */
+    private DataflowWriteChannel channel
+
+    ProcessOutput(ProcessOutputs declaredOutputs, Object target, Map opts) {
+        this.declaredOutputs = declaredOutputs
+        this.target = target
+
+        for( Map.Entry entry : opts )
+            setProperty(entry.key, entry.value)
+    }
+
+    void setName(String name) {
+        if( !ConfigHelper.isValidIdentifier(name) ) {
+            final msg = "Output name '$name' is not valid -- Make sure it starts with an alphabetic or underscore character and it does not contain any blank, dot or other special characters"
+            throw new IllegalArgumentException(msg)
+        }
+        this.name = name
+    }
+
+    String getName() {
+        return name
+    }
+
+    void setTopic(String topic) {
+        if( !ConfigHelper.isValidIdentifier(topic) ) {
+            final msg = "Output topic '$topic' is not valid -- Make sure it starts with an alphabetic or underscore character and it does not contain any blank, dot or other special characters"
+            throw new IllegalArgumentException(msg)
+        }
+        this.topic = topic
+    }
+
+    String getTopic() {
+        return topic
+    }
+
+    void setChannel(DataflowWriteChannel channel) {
+        this.channel = channel
+    }
+
+    DataflowWriteChannel getChannel() {
+        return channel
+    }
+
+    Object resolve(TaskRun task) {
+        final ctx = new TaskOutputCollector(declaredOutputs, optional, task)
+        return LazyHelper.resolve(ctx, target)
+    }
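+
+    // Note: the target is resolved against a TaskOutputCollector, so -- assuming
+    // the collector semantics -- a `LazyVar('stdout')` target yields the task
+    // standard output, while a path-based target resolves to the output file(s)
+    // collected under the corresponding key.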
+
+    @Override
+    ProcessOutput clone() {
+        (ProcessOutput)super.clone()
+    }
+
+}
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/ProcessOutputs.groovy b/modules/nextflow/src/main/groovy/nextflow/script/ProcessOutputs.groovy
new file mode 100644
index 0000000000..e0fcb7497d
--- /dev/null
+++ b/modules/nextflow/src/main/groovy/nextflow/script/ProcessOutputs.groovy
@@ -0,0 +1,105 @@
+/*
+ * Copyright 2013-2024, Seqera Labs
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package nextflow.script
+
+import groovyx.gpars.dataflow.DataflowQueue
+import groovyx.gpars.dataflow.DataflowWriteChannel
+
+/**
+ * Models the process outputs.
+ *
+ * @author Ben Sherman 
+ */
+class ProcessOutputs implements List<ProcessOutput>, Cloneable {
+
+    @Delegate
+    private List<ProcessOutput> params = []
+
+    /**
+     * Environment variables which will be exported from the
+     * task environment for each task and made available to
+     * process outputs.
+     */
+    private Set<String> env = []
+
+    /**
+     * Shell commands which will be executed in the task environment
+     * for each task and whose output will be made available
+     * to process outputs. The key corresponds to the environment
+     * variable to which the command output will be saved.
+     */
+    private Map<String,Object> eval = [:]
+
+    /**
+     * Output files which will be unstaged from the task
+     * directory for each task and made available to process
+     * outputs.
+     */
+    private Map<String,ProcessFileOutput> files = [:]
+
+    void addParam(Object target, Map opts) {
+        add(new ProcessOutput(this, target, opts))
+    }
+
+    void setDefault() {
+        final param = new ProcessOutput(this, new LazyVar('stdout'), [:])
+        param.setChannel(new DataflowQueue())
+        params.add(param)
+    }
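+
+    // Note: setDefault() is presumably invoked when a process declares no outputs,
+    // so that `.out` still provides a single queue channel carrying the task
+    // standard output (hence the LazyVar('stdout') target).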
+
+    void addEnv(String name) {
+        env.add(name)
+    }
+
+    void addEval(String name, Object value) {
+        eval.put(name, value)
+    }
+
+    void addFile(String key, ProcessFileOutput file) {
+        files.put(key, file)
+    }
+
+    List<String> getNames() {
+        return params*.getName()
+    }
+
+    List<DataflowWriteChannel> getChannels() {
+        return params*.getChannel()
+    }
+
+    Set<String> getEnv() {
+        return env
+    }
+
+    Map<String,Object> getEval() {
+        return eval
+    }
+
+    Map<String,ProcessFileOutput> getFiles() {
+        return files
+    }
+
+    @Override
+    ProcessOutputs clone() {
+        def result = (ProcessOutputs)super.clone()
+        result.params = new ArrayList<>(params.size())
+        for( ProcessOutput param : params )
+            result.add(param.clone())
+        return result
+    }
+
+}
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/ScriptParser.groovy b/modules/nextflow/src/main/groovy/nextflow/script/ScriptParser.groovy
index 8931cf9b03..dd7287e49c 100644
--- a/modules/nextflow/src/main/groovy/nextflow/script/ScriptParser.groovy
+++ b/modules/nextflow/src/main/groovy/nextflow/script/ScriptParser.groovy
@@ -26,6 +26,7 @@ import nextflow.Session
 import nextflow.ast.NextflowDSL
 import nextflow.ast.NextflowXform
 import nextflow.ast.OpXform
+import nextflow.ast.ProcessInputPathXform
 import nextflow.exception.ScriptCompilationException
 import nextflow.extension.FilesEx
 import nextflow.file.FileHelper
@@ -124,6 +125,7 @@ class ScriptParser {
         config.addCompilationCustomizers( new ASTTransformationCustomizer(NextflowDSL))
         config.addCompilationCustomizers( new ASTTransformationCustomizer(NextflowXform))
         config.addCompilationCustomizers( new ASTTransformationCustomizer(OpXform))
+        config.addCompilationCustomizers( new ASTTransformationCustomizer(ProcessInputPathXform))
 
         if( session?.debug )
             config.debug = true
diff --git a/modules/nextflow/src/main/groovy/nextflow/script/ScriptTokens.groovy b/modules/nextflow/src/main/groovy/nextflow/script/ScriptTokens.groovy
index 72498b2452..2bc814361a 100644
--- a/modules/nextflow/src/main/groovy/nextflow/script/ScriptTokens.groovy
+++ b/modules/nextflow/src/main/groovy/nextflow/script/ScriptTokens.groovy
@@ -20,20 +20,6 @@ import groovy.transform.CompileStatic
 import groovy.transform.EqualsAndHashCode
 import groovy.transform.ToString
 import groovy.transform.TupleConstructor
-/**
- * Presents a variable definition in the script context.
- *
- * @author Paolo Di Tommaso 
- */
-@ToString
-@EqualsAndHashCode
-@TupleConstructor
-class TokenVar {
-
-    /** The variable name */
-    String name
-
-}
 
 /**
  *  A token used by the DSL to identify a 'file' declaration in a 'tuple' parameter, for example:
@@ -62,7 +48,7 @@ class TokenPathCall {
 
     TokenPathCall(target) {
         this.target = target
-        this.opts = Collections.emptyMap()
+        this.opts = [:]
     }
 
     TokenPathCall(Map opts, target) {
@@ -72,26 +58,26 @@ class TokenPathCall {
 }
 
 /**
- * An object of this class replace the {@code stdin} token in input map declaration. For example:
+ * An object of this class replace the {@code stdin} token in input tuple declaration. For example:
  * 
  * input:
- *   map( stdin, .. ) from x
+ *   tuple( stdin, .. ) from x
  * 
  *
  * @see nextflow.ast.NextflowDSLImpl
- * @see nextflow.script.params.TupleInParam#bind(java.lang.Object[])
+ * @see nextflow.script.dsl.ProcessDsl#_in_tuple(java.lang.Object[])
  */
 class TokenStdinCall { }
 
 /**
- * An object of this class replace the {@code stdout} token in input map declaration. For example:
+ * An object of this class replace the {@code stdout} token in input tuple declaration. For example:
  * 
  * input:
- *   map( stdout, .. ) into x
+ *   tuple( stdout, .. ) into x
  * 
* * @see nextflow.ast.NextflowDSLImpl - * @see nextflow.script.params.TupleOutParam#bind(java.lang.Object[]) + * @see nextflow.script.dsl.ProcessDsl#_out_tuple(java.util.Map,java.lang.Object[]) */ class TokenStdoutCall { } diff --git a/modules/nextflow/src/main/groovy/nextflow/script/WorkflowDef.groovy b/modules/nextflow/src/main/groovy/nextflow/script/WorkflowDef.groovy index b540a53451..ffa8f02d94 100644 --- a/modules/nextflow/src/main/groovy/nextflow/script/WorkflowDef.groovy +++ b/modules/nextflow/src/main/groovy/nextflow/script/WorkflowDef.groovy @@ -52,18 +52,12 @@ class WorkflowDef extends BindableDef implements ChainableDef, IterableDef, Exec private WorkflowBinding binding - WorkflowDef(BaseScript owner, Closure rawBody, String name=null) { + WorkflowDef(BaseScript owner, String name, BodyDef body, List takes, List emits) { this.owner = owner this.name = name - // invoke the body resolving in/out params - final copy = (Closure)rawBody.clone() - final resolver = new WorkflowParamsResolver() - copy.setResolveStrategy(Closure.DELEGATE_FIRST) - copy.setDelegate(resolver) - this.body = copy.call() - // now it can access the parameters - this.declaredInputs = new ArrayList<>(resolver.getTakes().keySet()) - this.declaredOutputs = new ArrayList<>(resolver.getEmits().keySet()) + this.body = body + this.declaredInputs = takes + this.declaredOutputs = emits this.variableNames = getVarNames0() } @@ -208,49 +202,3 @@ class WorkflowDef extends BindableDef implements ChainableDef, IterableDef, Exec } } - -/** - * Hold workflow parameters - */ -@Slf4j -@CompileStatic -class WorkflowParamsResolver { - - static final private String TAKE_PREFIX = '_take_' - static final private String EMIT_PREFIX = '_emit_' - - - Map takes = new LinkedHashMap<>(10) - Map emits = new LinkedHashMap<>(10) - - @Override - def invokeMethod(String name, Object args) { - if( name.startsWith(TAKE_PREFIX) ) - takes.put(name.substring(TAKE_PREFIX.size()), args) - - else if( name.startsWith(EMIT_PREFIX) ) - emits.put(name.substring(EMIT_PREFIX.size()), args) - - else - throw new MissingMethodException(name, WorkflowDef, args) - } - - private Map argsToMap(Object args) { - if( args && args.getClass().isArray() ) { - if( ((Object[])args)[0] instanceof Map ) { - def map = (Map)((Object[])args)[0] - return new HashMap(map) - } - } - Collections.emptyMap() - } - - private Map argToPublishOpts(Object args) { - final opts = argsToMap(args) - if( opts.containsKey('saveAs')) { - log.warn "Workflow publish does not support `saveAs` option" - opts.remove('saveAs') - } - return opts - } -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/dsl/ProcessBuilder.groovy b/modules/nextflow/src/main/groovy/nextflow/script/dsl/ProcessBuilder.groovy new file mode 100644 index 0000000000..7fb9615d46 --- /dev/null +++ b/modules/nextflow/src/main/groovy/nextflow/script/dsl/ProcessBuilder.groovy @@ -0,0 +1,451 @@ +/* + * Copyright 2013-2024, Seqera Labs + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package nextflow.script.dsl + +import java.util.regex.Pattern + +import groovy.util.logging.Slf4j +import nextflow.ast.NextflowDSLImpl +import nextflow.exception.IllegalConfigException +import nextflow.exception.IllegalDirectiveException +import nextflow.exception.ScriptRuntimeException +import nextflow.processor.ErrorStrategy +import nextflow.script.ProcessInputs +import nextflow.script.ProcessOutputs +import nextflow.script.BaseScript +import nextflow.script.BodyDef +import nextflow.script.LazyList +import nextflow.script.ProcessConfig +import nextflow.script.ProcessDef + +/** + * Builder for {@link ProcessDef}. + * + * @author Ben Sherman + */ +@Slf4j +class ProcessBuilder { + + static final List DIRECTIVES = [ + 'accelerator', + 'afterScript', + 'arch', + 'beforeScript', + 'cache', + 'cleanup', + 'clusterOptions', + 'conda', + 'container', + 'containerOptions', + 'cpus', + 'debug', + 'disk', + 'echo', // deprecated + 'errorStrategy', + 'executor', + 'ext', + 'fair', + 'label', + 'machineType', + 'maxErrors', + 'maxForks', + 'maxRetries', + 'maxSubmitAwait', + 'memory', + 'module', + 'penv', + 'pod', + 'publishDir', + 'queue', + 'resourceLabels', + 'scratch', + 'secret', + 'shell', + 'spack', + 'stageInMode', + 'stageOutMode', + 'storeDir', + 'tag', + 'time' + ] + + protected BaseScript ownerScript + + protected String processName + + protected BodyDef body + + protected ProcessConfig config + + ProcessBuilder(BaseScript ownerScript, String processName) { + this.ownerScript = ownerScript + this.processName = processName + this.config = new ProcessConfig(ownerScript, processName) + } + + ProcessBuilder(ProcessConfig config) { + this.ownerScript = config.getOwnerScript() + this.processName = config.getProcessName() + this.config = config + } + + Object invokeMethod(String name, Object args) { + /* + * This is need to patch #497 -- what is happening is that when in the config file + * is defined a directive like `memory`, `cpus`, etc in by using a closure, + * this closure is interpreted as method definition and it get invoked if a + * directive with the same name is defined in the process definition. + * To avoid that the offending property is removed from the map before the method + * is evaluated. + */ + if( config.get(name) instanceof Closure ) + config.remove(name) + + this.metaClass.invokeMethod(this,name,args) + } + + def methodMissing( String name, def args ) { + checkName(name) + + if( args instanceof Object[] ) + config.put(name, args.size()==1 ? args[0] : args.toList()) + else + config.put(name, args) + } + + private void checkName(String name) { + if( DIRECTIVES.contains(name) ) + return + if( name == NextflowDSLImpl.PROCESS_WHEN ) + return + if( name == NextflowDSLImpl.PROCESS_STUB ) + return + + String message = "Unknown process directive: `$name`" + def alternatives = DIRECTIVES.closest(name) + if( alternatives.size()==1 ) { + message += '\n\nDid you mean one of these?' 
+ alternatives.each { + message += "\n $it" + } + } + throw new IllegalDirectiveException(message) + } + + /// DIRECTIVES + + void accelerator( Map params, value ) { + if( value instanceof Number ) { + if( params.limit==null ) + params.limit=value + else if( params.request==null ) + params.request=value + } + else if( value != null ) + throw new IllegalArgumentException("Not a valid `accelerator` directive value: $value [${value.getClass().getName()}]") + accelerator(params) + } + + void accelerator( value ) { + if( value instanceof Number ) + config.put('accelerator', [limit: value]) + else if( value instanceof Map ) + config.put('accelerator', value) + else if( value != null ) + throw new IllegalArgumentException("Not a valid `accelerator` directive value: $value [${value.getClass().getName()}]") + } + + void arch( Map params, value ) { + if( value instanceof String ) { + if( params.name==null ) + params.name=value + } + else if( value != null ) + throw new IllegalArgumentException("Not a valid `arch` directive value: $value [${value.getClass().getName()}]") + arch(params) + } + + void arch( value ) { + if( value instanceof String ) + config.put('arch', [name: value]) + else if( value instanceof Map ) + config.put('arch', value) + else if( value != null ) + throw new IllegalArgumentException("Not a valid `arch` directive value: $value [${value.getClass().getName()}]") + } + + void debug(boolean value) { + config.debug = value + } + + /** + * Implements the {@code disk} directive, e.g.: + * + * disk 375.GB, type: 'local-ssd' + * + * @param opts + * @param value + */ + void disk( Map opts, value ) { + opts.request = value + disk(opts) + } + + /** + * Implements the {@code disk} directive, e.g.: + * + * disk 100.GB + * disk request: 375.GB, type: 'local-ssd' + * + * @param value + */ + void disk( value ) { + if( value instanceof Map || value instanceof Closure ) + config.put('disk', value) + else + config.put('disk', [request: value]) + } + + /** + * Implements the {@code echo} directive for backwards compatibility. + * + * note: without this method definition {@link BaseScript#echo} will be invoked + * + * @param value + */ + void echo( value ) { + log.warn1('The `echo` directive has been deprecated - use `debug` instead') + config.put('debug', value) + } + + /** + * Implements the {@code errorStrategy} directive. + * + * @param strategy + */ + void errorStrategy( CharSequence strategy ) { + if( !ErrorStrategy.isValid(strategy) ) + throw new IllegalArgumentException("Unknown error strategy '${strategy}' ― Available strategies are: ${ErrorStrategy.values().join(',').toLowerCase()}") + + config.put('errorStrategy', strategy) + } + + /** + * Implements the {@code label} directive. + * + * This directive can be specified (invoked) more than once in + * the process definition. 
+ * + * @param lbl + */ + void label(String lbl) { + if( !lbl ) return + + // -- check that label has a valid syntax + if( !isValidLabel(lbl) ) + throw new IllegalConfigException("Not a valid process label: $lbl -- Label must consist of alphanumeric characters or '_', must start with an alphabetic character and must end with an alphanumeric character") + + // -- get the current label, it must be a list + def allLabels = (List)config.get('label') + if( !allLabels ) { + allLabels = new LazyList() + config.put('label', allLabels) + } + + // -- avoid duplicates + if( !allLabels.contains(lbl) ) + allLabels.add(lbl) + } + + private static final Pattern LABEL_REGEXP = ~/[a-zA-Z]([a-zA-Z0-9_]*[a-zA-Z0-9]+)?/ + + protected static boolean isValidLabel(String lbl) { + def p = lbl.indexOf('=') + if( p==-1 ) + return LABEL_REGEXP.matcher(lbl).matches() + + def left = lbl.substring(0,p) + def right = lbl.substring(p+1) + return LABEL_REGEXP.matcher(left).matches() && LABEL_REGEXP.matcher(right).matches() + } + + /** + * Implements the {@code module} directive. + * + * See also http://modules.sourceforge.net + * + * @param value + */ + void module( String value ) { + if( !value ) return + + def result = (List)config.module + if( result == null ) { + result = new LazyList() + config.put('module', result) + } + + result.add(value) + } + + /** + * Implements the {@code pod} directive. + * + * @param entry + */ + void pod( Map entry ) { + if( !entry ) return + + def allOptions = (List)config.get('pod') + if( !allOptions ) { + allOptions = new LazyList() + config.put('pod', allOptions) + } + + allOptions.add(entry) + } + + /** + * Implements the {@code publishDir} directive as a map eg: + * + * publishDir path: '/some/dir', mode: 'copy' + * + * This directive can be specified (invoked) multiple times in + * the process definition. + * + * @param params + */ + void publishDir(Map params) { + if( !params ) return + + def dirs = (List)config.get('publishDir') + if( !dirs ) { + dirs = new LazyList() + config.put('publishDir', dirs) + } + + dirs.add(params) + } + + /** + * Implements the {@code publishDir} directive as a path with named parameters, eg: + * + * publishDir '/some/dir', mode: 'copy' + * + * @param params + * @param path + */ + void publishDir(Map params, CharSequence path) { + params.put('path', path) + publishDir( params ) + } + + /** + * Implements the {@code publishDir} directive as a string path, eg: + * + * publishDir '/some/dir' + * + * @param target + */ + void publishDir( target ) { + if( target instanceof List ) { + for( Object item : target ) { publishDir(item) } + } + else if( target instanceof Map ) { + publishDir( target as Map ) + } + else { + publishDir([path: target]) + } + } + + /** + * Implements the {@code resourceLabels} directive. + * + * This directive can be specified (invoked) multiple times in + * the process definition. + * + * @param map + */ + void resourceLabels(Map map) { + if( !map ) return + + // -- get the current sticker, it must be a Map + def allLabels = (Map)config.get('resourceLabels') + if( !allLabels ) { + allLabels = [:] + } + // -- merge duplicates + allLabels += map + config.put('resourceLabels', allLabels) + } + + /** + * Implements the {@code secret} directive. + * + * This directive can be specified (invoked) multiple times in + * the process definition. 
+ * + * @param name + */ + void secret(String name) { + if( !name ) return + + // -- get the current label, it must be a list + def allSecrets = (List)config.get('secret') + if( !allSecrets ) { + allSecrets = new LazyList() + config.put('secret', allSecrets) + } + + // -- avoid duplicates + if( !allSecrets.contains(name) ) + allSecrets.add(name) + } + + /// SCRIPT + + ProcessBuilder withInputs(ProcessInputs inputs) { + config.inputs = inputs + return this + } + + ProcessBuilder withOutputs(ProcessOutputs outputs) { + config.outputs = outputs + return this + } + + ProcessBuilder withBody(Closure closure, String section, String source='', List values=null) { + withBody(new BodyDef(closure, source, section, values)) + } + + ProcessBuilder withBody(BodyDef body) { + this.body = body + return this + } + + ProcessConfig getConfig() { + return config + } + + ProcessDef build() { + if ( !body ) + throw new ScriptRuntimeException("Missing script in the specified process block -- make sure it terminates with the script string to be executed") + return new ProcessDef(ownerScript, processName, body, config) + } + +} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/dsl/ProcessConfigBuilder.groovy b/modules/nextflow/src/main/groovy/nextflow/script/dsl/ProcessConfigBuilder.groovy new file mode 100644 index 0000000000..f522368287 --- /dev/null +++ b/modules/nextflow/src/main/groovy/nextflow/script/dsl/ProcessConfigBuilder.groovy @@ -0,0 +1,231 @@ +/* + * Copyright 2013-2024, Seqera Labs + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package nextflow.script.dsl + +import java.util.regex.Pattern + +import groovy.util.logging.Slf4j +import nextflow.exception.ConfigParseException +import nextflow.script.ProcessConfig + +/** + * Builder for {@link ProcessConfig}. + * + * @author Ben Sherman + */ +@Slf4j +class ProcessConfigBuilder extends ProcessBuilder { + + ProcessConfigBuilder(ProcessConfig config) { + super(config) + } + + /** + * Apply process config settings from the config file to a process. + * + * @param configDirectives + * @param baseName + * @param simpleName + * @param fullyQualifiedName + */ + void applyConfig(Map configDirectives, String baseName, String simpleName, String fullyQualifiedName) { + // -- apply settings defined in the config object using the`withLabel:` syntax + final processLabels = config.getLabels() ?: [''] + applyConfigSelectorWithLabels(configDirectives, processLabels) + + // -- apply settings defined in the config file using the process base name + applyConfigSelectorWithName(configDirectives, baseName) + + // -- apply settings defined in the config file using the process simple name + if( simpleName && simpleName!=baseName ) + applyConfigSelectorWithName(configDirectives, simpleName) + + // -- apply settings defined in the config file using the process fully qualified name (ie. 
with the execution scope) + if( fullyQualifiedName && (fullyQualifiedName!=simpleName || fullyQualifiedName!=baseName) ) + applyConfigSelectorWithName(configDirectives, fullyQualifiedName) + + // -- apply defaults + applyConfigDefaults(configDirectives) + + // -- check for conflicting settings + if( config.scratch && config.stageInMode == 'rellink' ) { + log.warn("Directives `scratch` and `stageInMode=rellink` conflict with each other -- Enforcing default stageInMode for process `$simpleName`") + config.remove('stageInMode') + } + } + + /** + * Apply the config settings in a label selector, for example: + * + * ``` + * process { + * withLabel: foo { + * cpus = 1 + * memory = 2.gb + * } + * } + * ``` + * + * @param configDirectives + * @param labels + */ + protected void applyConfigSelectorWithLabels(Map configDirectives, List labels) { + final prefix = 'withLabel:' + for( String rule : configDirectives.keySet() ) { + if( !rule.startsWith(prefix) ) + continue + final pattern = rule.substring(prefix.size()).trim() + if( !matchesLabels(labels, pattern) ) + continue + + log.debug "Config settings `$rule` matches labels `${labels.join(',')}` for process with name $processName" + final settings = configDirectives.get(rule) + if( settings instanceof Map ) { + applyConfigSettings(settings) + } + else if( settings != null ) { + throw new ConfigParseException("Unknown config settings for process labeled ${labels.join(',')} -- settings=$settings ") + } + } + } + + static boolean matchesLabels(List labels, String pattern) { + final isNegated = pattern.startsWith('!') + if( isNegated ) + pattern = pattern.substring(1).trim() + + final regex = Pattern.compile(pattern) + for (label in labels) { + if (regex.matcher(label).matches()) { + return !isNegated + } + } + + return isNegated + } + + /** + * Apply the config settings in a name selector, for example: + * + * ``` + * process { + * withName: foo { + * cpus = 1 + * memory = 2.gb + * } + * } + * ``` + * + * @param configDirectives + * @param target + */ + protected void applyConfigSelectorWithName(Map configDirectives, String target) { + final prefix = 'withName:' + for( String rule : configDirectives.keySet() ) { + if( !rule.startsWith(prefix) ) + continue + final pattern = rule.substring(prefix.size()).trim() + if( !matchesSelector(target, pattern) ) + continue + + log.debug "Config settings `$rule` matches process $processName" + def settings = configDirectives.get(rule) + if( settings instanceof Map ) { + applyConfigSettings(settings) + } + else if( settings != null ) { + throw new ConfigParseException("Unknown config settings for process with name: $target -- settings=$settings ") + } + } + } + + static boolean matchesSelector(String name, String pattern) { + final isNegated = pattern.startsWith('!') + if( isNegated ) + pattern = pattern.substring(1).trim() + return Pattern.compile(pattern).matcher(name).matches() ^ isNegated + } + + + /** + * Apply config settings to a process. 
+ * + * @param settings + */ + protected void applyConfigSettings(Map settings) { + if( !settings ) + return + + for( def entry : settings ) { + if( entry.key.startsWith("withLabel:") || entry.key.startsWith("withName:")) + continue + + if( !DIRECTIVES.contains(entry.key) ) + log.warn "Unknown directive `$entry.key` for process `$processName`" + + if( entry.key == 'params' ) // <-- patch issue #242 + continue + + if( entry.key == 'ext' ) { + if( config.getProperty('ext') instanceof Map ) { + // update missing 'ext' properties found in 'process' scope + def ext = config.getProperty('ext') as Map + entry.value.each { String k, v -> ext[k] = v } + } + continue + } + + putWithRepeat(entry.key, entry.value) + } + } + + /** + * Apply the global settings in the process config scope to a process. + * + * @param defaults + */ + protected void applyConfigDefaults( Map defaults ) { + for( String key : defaults.keySet() ) { + if( key == 'params' ) + continue + final value = defaults.get(key) + final current = config.getProperty(key) + if( key == 'ext' ) { + if( value instanceof Map && current instanceof Map ) { + final ext = current as Map + value.each { k,v -> if(!ext.containsKey(k)) ext.put(k,v) } + } + } + else if( !config.containsKey(key) || (ProcessConfig.DEFAULT_CONFIG.containsKey(key) && current==ProcessConfig.DEFAULT_CONFIG.get(key)) ) { + putWithRepeat(key, value) + } + } + } + + private static final List REPEATABLE_DIRECTIVES = ['label','module','pod','publishDir'] + + protected void putWithRepeat( String name, Object value ) { + if( name in REPEATABLE_DIRECTIVES ) { + config.remove(name) + this.metaClass.invokeMethod(this, name, value) + } + else { + config.put(name, value) + } + } + +} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/dsl/ProcessDsl.groovy b/modules/nextflow/src/main/groovy/nextflow/script/dsl/ProcessDsl.groovy new file mode 100644 index 0000000000..5b798a5bc3 --- /dev/null +++ b/modules/nextflow/src/main/groovy/nextflow/script/dsl/ProcessDsl.groovy @@ -0,0 +1,428 @@ +/* + * Copyright 2013-2024, Seqera Labs + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package nextflow.script.dsl + +import groovy.transform.CompileDynamic +import groovy.transform.CompileStatic +import nextflow.processor.TaskOutputCollector +import nextflow.script.BaseScript +import nextflow.script.LazyAware +import nextflow.script.LazyList +import nextflow.script.LazyVar +import nextflow.script.ProcessDef +import nextflow.script.ProcessFileInput +import nextflow.script.ProcessFileOutput +import nextflow.script.ProcessInputs +import nextflow.script.ProcessOutput +import nextflow.script.ProcessOutputs +import nextflow.script.TokenEnvCall +import nextflow.script.TokenEvalCall +import nextflow.script.TokenFileCall +import nextflow.script.TokenPathCall +import nextflow.script.TokenStdinCall +import nextflow.script.TokenStdoutCall +import nextflow.script.TokenValCall + +/** + * Implements the process DSL. 
+ * + * @author Paolo Di Tommaso + * @author Ben Sherman + */ +@CompileStatic +class ProcessDsl extends ProcessBuilder { + + private ProcessInputs inputs = new ProcessInputs() + + private ProcessOutputs outputs = new ProcessOutputs() + + ProcessDsl(BaseScript ownerScript, String processName) { + super(ownerScript, processName) + } + + /// TYPED INPUTS / OUTPUTS + + void env(String name, Object source) { + inputs.addEnv(name, source) + } + + void stageAs(Object source) { + inputs.addFile(new ProcessFileInput(source, null, true, [:])) + } + + void stageAs(String stageAs, Object source) { + inputs.addFile(new ProcessFileInput(source, null, true, [stageAs: stageAs])) + } + + void stdin(Object source) { + inputs.stdin = source + } + + void _typed_in_param(String name, Class type) { + inputs.addParam(name, type) + } + + void _typed_out_env(String name) { + outputs.addEnv(name) + } + + void _typed_out_eval(String name, CharSequence cmd) { + outputs.addEval(name, cmd) + } + + void _typed_out_path(Map opts=[:], String key, Object target) { + outputs.addFile(key, new ProcessFileOutput(target, true, opts)) + } + + void _typed_out_param(String name, Class type, Object target) { + final opts = [ + name: name, + optional: type != null && Optional.isAssignableFrom(type), + type: type + ] + outputs.addParam(target, opts) + } + + /// INPUTS + + void _in_each(LazyVar var) { + _in_val(var) + inputs.last().setIterator(true) + } + + void _in_each(TokenFileCall file) { + _in_file(file.target) + inputs.last().setIterator(true) + } + + void _in_each(TokenPathCall path) { + _in_path(path.target) + inputs.last().setIterator(true) + } + + void _in_env(LazyVar var) { + final param = "\$in${inputs.size()}".toString() + inputs.addParam(param) + inputs.addEnv(var.name, new LazyVar(param)) + } + + void _in_file(Object source) { + final param = _in_path0(source, false, [:]) + inputs.addParam(param) + } + + void _in_path(Map opts=[:], Object source) { + final param = _in_path0(source, true, opts) + inputs.addParam(param) + } + + private String _in_path0(Object source, boolean pathQualifier, Map opts) { + if( !opts.stageAs && opts.name ) + opts.stageAs = opts.remove('name') + + if( source instanceof LazyVar ) { + final var = (LazyVar)source + inputs.addFile(new ProcessFileInput(var, var.name, pathQualifier, opts)) + return var.name + } + else if( source instanceof CharSequence ) { + final param = "\$in${inputs.size()}" + if( !opts.stageAs ) + opts.stageAs = source + inputs.addFile(new ProcessFileInput(new LazyVar(param), null, pathQualifier, opts)) + return param + } + else + throw new IllegalArgumentException() + } + + void _in_stdin() { + final param = "\$in${inputs.size()}".toString() + inputs.addParam(param) + inputs.stdin = new LazyVar(param) + } + + void _in_stdin(LazyVar var) { + inputs.addParam(var.name) + inputs.stdin = var + } + + @CompileDynamic + void _in_tuple(Object... 
elements) { + if( elements.length < 2 ) + throw new IllegalArgumentException("Input `tuple` must define at least two elements -- Check process `$processName`") + + final param = "\$in${inputs.size()}".toString() + inputs.addParam(param) + + for( int i = 0; i < elements.length; i++ ) { + final item = elements[i] + + if( item instanceof LazyVar ) { + final var = (LazyVar)item + throw new IllegalArgumentException("Unqualified input value declaration is not allowed - replace `tuple ${var.name},..` with `tuple val(${var.name}),..`") + } + else if( item instanceof TokenValCall && item.val instanceof LazyVar ) { + inputs.addVariable(item.val.name, new LazyTupleElement(param, i)) + } + else if( item instanceof TokenEnvCall && item.val instanceof LazyVar ) { + inputs.addEnv(item.val.name, new LazyTupleElement(param, i)) + } + else if( item instanceof TokenFileCall ) { + final name = _in_path0(item.target, false, [:]) + inputs.addVariable(name, new LazyTupleElement(param, i)) + } + else if( item instanceof TokenPathCall ) { + final name = _in_path0(item.target, true, item.opts) + inputs.addVariable(name, new LazyTupleElement(param, i)) + } + else if( item instanceof Map ) { + throw new IllegalArgumentException("Unqualified input file declaration is not allowed - replace `tuple $item,..` with `tuple path(${item.key}, stageAs:'${item.value}'),..`") + } + else if( item instanceof GString ) { + throw new IllegalArgumentException("Unqualified input file declaration is not allowed - replace `tuple \"$item\".. with `tuple path(\"$item\")..`") + } + else if( item instanceof TokenStdinCall || item == '-' ) { + inputs.stdin = new LazyTupleElement(param, i) + } + else if( item instanceof String ) { + throw new IllegalArgumentException("Unqualified input file declaration is not allowed - replace `tuple '$item',..` with `tuple path('$item'),..`") + } + else + throw new IllegalArgumentException() + } + } + + void _in_val(LazyVar var) { + inputs.addParam(var.name) + } + + /// OUTPUTS + + void _out_env(Map opts=[:], Object target) { + if( opts.emit ) + opts.name = opts.remove('emit') + + final name = _out_env0(target) + outputs.addEnv(name) + outputs.addParam(new LazyEnvCall(name), opts) + } + + String _out_env0(Object target) { + if( target instanceof LazyVar ) + return target.name + else if( target instanceof CharSequence ) + return target.toString() + else + throw new IllegalArgumentException("Unexpected environment output definition - it should be either a string or a variable identifier - offending value: ${target?.getClass()?.getName()}") + } + + void _out_eval(Map opts=[:], CharSequence cmd) { + if( opts.emit ) + opts.name = opts.remove('emit') + + final name = _out_eval0(cmd) + outputs.addParam(new LazyEvalCall(name), opts) + } + + private String _out_eval0(CharSequence cmd) { + final name = "nxf_out_eval_${outputs.eval.size()}" + outputs.addEval(name, cmd) + return name + } + + void _out_file(Object target) { + // note: check that is a String type to avoid to force + // the evaluation of GString object to a string + if( target instanceof String && target == '-' ) { + _out_stdout() + return + } + + final key = _out_path0(target, false, [:]) + outputs.addParam(new LazyPathCall(key), [:]) + } + + void _out_path(Map opts=[:], Object target) { + // note: check that is a String type to avoid to force + // the evaluation of GString object to a string + if( target instanceof String && target == '-' ) { + _out_stdout(opts) + return + } + + // separate param options from path options + final paramOpts = [optional: 
opts.optional] + if( opts.emit ) + paramOpts.name = opts.remove('emit') + + final key = _out_path0(target, true, opts) + outputs.addParam(new LazyPathCall(key), paramOpts) + } + + private String _out_path0(Object target, boolean pathQualifier, Map opts) { + final key = "\$file${outputs.files.size()}" + outputs.addFile(key, new ProcessFileOutput(target, pathQualifier, opts)) + return key + } + + void _out_stdout(Map opts=[:]) { + if( opts.emit ) + opts.name = opts.remove('emit') + + outputs.addParam(new LazyVar('stdout'), opts) + } + + @CompileDynamic + void _out_tuple(Map opts=[:], Object... elements) { + if( elements.length < 2 ) + throw new IllegalArgumentException("Output `tuple` must define at least two elements -- Check process `$processName`") + + // separate param options from path options + if( opts.emit ) + opts.name = opts.remove('emit') + + // make lazy list with tuple elements + final target = new LazyList(elements.size()) + + for( int i = 0; i < elements.length; i++ ) { + final item = elements[i] + + if( item instanceof LazyVar ) { + throw new IllegalArgumentException("Unqualified output value declaration is not allowed - replace `tuple ${item.name},..` with `tuple val(${item.name}),..`") + } + else if( item instanceof TokenValCall ) { + target << item.val + } + else if( item instanceof TokenEnvCall ) { + final name = _out_env0(item.val) + outputs.addEnv(name) + target << new LazyEnvCall(name) + } + else if( item instanceof TokenEvalCall ) { + final name = _out_eval0(item.val) + target << new LazyEvalCall(name) + } + else if( item instanceof TokenFileCall ) { + // file pattern can be a String or GString + final key = _out_path0(item.target, false, [optional: opts.optional]) + target << new LazyPathCall(key) + } + else if( item instanceof TokenPathCall ) { + // file pattern can be a String or GString + final key = _out_path0(item.target, true, item.opts + [optional: opts.optional]) + target << new LazyPathCall(key) + } + else if( item instanceof GString ) { + throw new IllegalArgumentException("Unqualified output path declaration is not allowed - replace `tuple \"$item\",..` with `tuple path(\"$item\"),..`") + } + else if( item instanceof TokenStdoutCall || item == '-' ) { + target << new LazyVar('stdout') + } + else if( item instanceof String ) { + throw new IllegalArgumentException("Unqualified output path declaration is not allowed - replace `tuple '$item',..` with `tuple path('$item'),..`") + } + else + throw new IllegalArgumentException("Invalid `tuple` output parameter declaration -- item: ${item}") + } + + outputs.addParam(target, opts) + } + + void _out_val(Map opts=[:], Object target) { + outputs.addParam(target, opts) + } + + /// BUILD + + ProcessDef build() { + config.setInputs(inputs) + config.setOutputs(outputs) + super.build() + } + +} + +@CompileStatic +class LazyTupleElement extends LazyVar { + int index + + LazyTupleElement(String name, int index) { + super(name) + this.index = index + } + + @Override + Object resolve(Object binding) { + final tuple = super.resolve(binding) + if( tuple instanceof List ) + return tuple[index] + else + throw new IllegalArgumentException("Lazy binding of `${name}[${index}]` failed because `${name}` is not a tuple") + } +} + +@CompileStatic +class LazyEnvCall implements LazyAware { + String name + + LazyEnvCall(String name) { + this.name = name + } + + @Override + Object resolve(Object binding) { + if( binding !instanceof TaskOutputCollector ) + throw new IllegalStateException() + + ((TaskOutputCollector)binding).env(name) + } +} + 
+@CompileStatic +class LazyEvalCall implements LazyAware { + String name + + LazyEvalCall(String name) { + this.name = name + } + + @Override + Object resolve(Object binding) { + if( binding !instanceof TaskOutputCollector ) + throw new IllegalStateException() + + ((TaskOutputCollector)binding).eval(name) + } +} + +@CompileStatic +class LazyPathCall implements LazyAware { + String key + + LazyPathCall(String key) { + this.key = key + } + + @Override + Object resolve(Object binding) { + if( binding !instanceof TaskOutputCollector ) + throw new IllegalStateException() + + ((TaskOutputCollector)binding).path(key) + } +} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/dsl/WorkflowBuilder.groovy b/modules/nextflow/src/main/groovy/nextflow/script/dsl/WorkflowBuilder.groovy new file mode 100644 index 0000000000..cd79b3f3a7 --- /dev/null +++ b/modules/nextflow/src/main/groovy/nextflow/script/dsl/WorkflowBuilder.groovy @@ -0,0 +1,73 @@ +/* + * Copyright 2013-2024, Seqera Labs + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package nextflow.script.dsl + +import groovy.transform.CompileStatic +import groovy.util.logging.Slf4j +import nextflow.script.BaseScript +import nextflow.script.BodyDef +import nextflow.script.WorkflowDef +/** + * Implements the workflow builder DSL. + * + * @author Ben Sherman + */ +@Slf4j +@CompileStatic +class WorkflowBuilder { + + static final private String TAKE_PREFIX = '_take_' + static final private String EMIT_PREFIX = '_emit_' + + private BaseScript owner + private String name + private BodyDef body + private Map takes = new LinkedHashMap<>(10) + private Map emits = new LinkedHashMap<>(10) + + WorkflowBuilder(BaseScript owner, String name=null) { + this.owner = owner + this.name = name + } + + @Override + def invokeMethod(String name, Object args) { + if( name.startsWith(TAKE_PREFIX) ) + takes.put(name.substring(TAKE_PREFIX.size()), args) + + else if( name.startsWith(EMIT_PREFIX) ) + emits.put(name.substring(EMIT_PREFIX.size()), args) + + else + throw new MissingMethodException(name, WorkflowDef, args) + } + + WorkflowBuilder withBody(BodyDef body) { + this.body = body + return this + } + + WorkflowDef build() { + new WorkflowDef( + owner, + name, + body, + new ArrayList<>(takes.keySet()), + new ArrayList<>(emits.keySet()) + ) + } +} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/BaseInParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/BaseInParam.groovy deleted file mode 100644 index a7ccdb1d6e..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/BaseInParam.groovy +++ /dev/null @@ -1,201 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovy.util.logging.Slf4j -import groovyx.gpars.dataflow.DataflowBroadcast -import groovyx.gpars.dataflow.DataflowQueue -import groovyx.gpars.dataflow.DataflowReadChannel -import nextflow.NF -import nextflow.exception.ProcessException -import nextflow.exception.ScriptRuntimeException -import nextflow.extension.CH -import nextflow.script.ProcessConfig -import nextflow.script.TokenVar -/** - * Model a process generic input parameter - * - * @author Paolo Di Tommaso - */ - -@Slf4j -abstract class BaseInParam extends BaseParam implements InParam { - - protected fromObject - - protected bindObject - - protected owner - - /** - * The channel to which the input value is bound - */ - private inChannel - - /** - * @return The input channel instance used by this parameter to receive the process inputs - */ - DataflowReadChannel getInChannel() { - init() - return inChannel - } - - BaseInParam( ProcessConfig config ) { - this(config.getOwnerScript().getBinding(), config.getInputs()) - } - - /** - * @param script The global script object - * @param obj - */ - BaseInParam( Binding binding, List holder, short ownerIndex = -1 ) { - super(binding,holder,ownerIndex) - } - - abstract String getTypeName() - - protected DataflowReadChannel inputValToChannel( value ) { - checkFromNotNull(value) - - if( this instanceof DefaultInParam ) { - assert value instanceof DataflowQueue - return value - } - - if ( value instanceof DataflowReadChannel || value instanceof DataflowBroadcast ) { - return CH.getReadChannel(value) - } - - final result = CH.value() - result.bind(value) - return result - } - - - /** - * Lazy parameter initializer. 
- * - * @return The parameter object itself - */ - @Override - protected void lazyInit() { - - if( fromObject == null && (bindObject == null || bindObject instanceof GString || bindObject instanceof Closure ) ) { - throw new IllegalStateException("Missing 'bind' declaration in input parameter") - } - - // fallback on the bind object if the 'fromObject' is not defined - if( fromObject == null ) { - fromObject = bindObject - } - - // initialize the *inChannel* object based on the 'target' attribute - def result - if( fromObject instanceof TokenVar ) { - // when the value is a variable reference - // - use that name for the parameter itself - // - get the variable value in the script binding - result = getScriptVar(fromObject.name) - } - else if( fromObject instanceof Closure ) { - result = fromObject.call() - } - else { - result = fromObject - } - - inChannel = inputValToChannel(result) - } - - /** - * @return The parameter name - */ - String getName() { - if( bindObject instanceof TokenVar ) - return bindObject.name - - if( bindObject instanceof String ) - return bindObject - - if( bindObject instanceof Closure ) - return '__$' + this.toString() - - throw new IllegalArgumentException("Invalid process input definition") - } - - BaseInParam bind( Object obj ) { - this.bindObject = obj - return this - } - - private void checkFromNotNull(obj) { - if( obj != null ) return - def message = 'A process input channel evaluates to null' - def name = null - if( bindObject instanceof TokenVar ) - name = bindObject.name - else if( bindObject instanceof CharSequence ) - name = bindObject.toString() - if( name ) - message += " -- Invalid declaration `${getTypeName()} $name`" - throw new IllegalArgumentException(message) - } - - void setFrom( obj ) { - checkFromNotNull(obj) - fromObject = obj - } - - Object getRawChannel() { - if( CH.isChannel(fromObject) ) - return fromObject - if( CH.isChannel(inChannel) ) - return inChannel - throw new IllegalStateException("Missing input channel") - } - - def decodeInputs( List inputs ) { - final UNDEF = -1 as short - def value = inputs[index] - - if( mapIndex == UNDEF || owner instanceof EachInParam ) - return value - - if( mapIndex != UNDEF ) { - def result - if( value instanceof Map ) { - result = value.values() - } - else if( value instanceof Collection ) { - result = value - } - else { - result = [value] - } - - try { - return result[mapIndex] - } - catch( IndexOutOfBoundsException e ) { - throw new ProcessException(e) - } - } - - return value - } - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/BaseOutParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/BaseOutParam.groovy deleted file mode 100644 index c811fa06fc..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/BaseOutParam.groovy +++ /dev/null @@ -1,201 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -package nextflow.script.params - -import groovy.transform.PackageScope -import groovy.util.logging.Slf4j -import groovyx.gpars.dataflow.DataflowWriteChannel -import nextflow.NF -import nextflow.extension.CH -import nextflow.script.ProcessConfig -import nextflow.script.TokenVar -import nextflow.util.ConfigHelper -/** - * Model a process generic output parameter - * - * @author Paolo Di Tommaso - */ -@Slf4j -abstract class BaseOutParam extends BaseParam implements OutParam { - - /** The out parameter name */ - protected String nameObj - - protected intoObj - - protected List outChannels = new ArrayList<>(10) - - @PackageScope - boolean singleton - - String channelEmitName - - String channelTopicName - - BaseOutParam( Binding binding, List list, short ownerIndex = -1) { - super(binding,list,ownerIndex) - } - - BaseOutParam( ProcessConfig config ) { - super(config.getOwnerScript().getBinding(), config.getOutputs()) - } - - Object clone() { - final copy = (BaseOutParam)super.clone() - copy.outChannels = new ArrayList<>(10) - return copy - } - - void lazyInit() { - - if( intoObj instanceof TokenVar || intoObj instanceof TokenVar[] ) { - throw new IllegalArgumentException("Not a valid output channel argument: $intoObj") - } - else if( intoObj != null ) { - lazyInitImpl(intoObj) - } - else if( nameObj instanceof String ) { - lazyInitImpl(nameObj) - } - - } - - @PackageScope - void setSingleton( boolean value ) { - this.singleton = value - } - - @PackageScope - void lazyInitImpl( def target ) { - final channel = (target != null) - ? outputValToChannel(target) - : null - - if( channel ) { - outChannels.add(channel) - } - } - - /** - * Creates a channel variable in the script context - * - * @param channel it can be a string representing a channel variable name in the script context. If - * the variable does not exist it creates a {@code DataflowVariable} in the script with that name. - * If the specified {@code value} is a {@code DataflowWriteChannel} object, use this object - * as the output channel - * - * @param factory The type of the channel to create, either {@code DataflowVariable} or {@code DataflowQueue} - * @return The created (or specified) channel instance - */ - final protected DataflowWriteChannel outputValToChannel( Object channel ) { - - if( channel instanceof String ) { - // the channel is specified by name - def local = channel - - // look for that name in the 'script' context - channel = binding.hasVariable(local) ? binding.getVariable(local) : null - if( channel instanceof DataflowWriteChannel ) { - // that's OK -- nothing to do - } - else { - if( channel == null ) { - log.trace "Creating new output channel > $local" - } - else { - log.warn "Output channel `$local` overrides another variable with the same name declared in the script context -- Rename it to avoid possible conflicts" - } - - // instantiate the new channel - channel = CH.create( singleton ) - - // bind it to the script on-fly - if( local != '-' && binding ) { - // bind the outputs to the script scope - binding.setVariable(local, channel) - } - } - } - - if( channel instanceof DataflowWriteChannel ) { - return channel - } - - throw new IllegalArgumentException("Invalid output channel reference") - } - - - BaseOutParam bind( def obj ) { - if( obj instanceof TokenVar ) - this.nameObj = obj.name - - else - this.nameObj = ( obj?.toString() ?: null ) - - return this - } - - void setInto( Object obj ) { - intoObj = obj - } - - DataflowWriteChannel getOutChannel() { - init() - return outChannels ? 
outChannels.get(0) : null - } - - String getName() { - if( nameObj != null ) - return nameObj.toString() - throw new IllegalStateException("Missing 'name' property in output parameter") - } - - @Override - BaseOutParam setOptions(Map opts) { - super.setOptions(opts) - return this - } - - BaseOutParam setEmit( value ) { - if( isNestedParam() ) - throw new IllegalArgumentException("Output `emit` option is not allowed in tuple components") - if( !value ) - throw new IllegalArgumentException("Missing output `emit` name") - if( !ConfigHelper.isValidIdentifier(value) ) { - final msg = "Output emit '$value' is not a valid name -- Make sure it starts with an alphabetic or underscore character and it does not contain any blank, dot or other special characters" - if( NF.strictMode ) - throw new IllegalArgumentException(msg) - log.warn(msg) - } - this.channelEmitName = value - return this - } - - BaseOutParam setTopic( String name ) { - if( isNestedParam() ) - throw new IllegalArgumentException("Output `topic` option it not allowed in tuple components") - if( !name ) - throw new IllegalArgumentException("Missing output `topic` name") - if( !ConfigHelper.isValidIdentifier(name) ) { - final msg = "Output topic '$name' is not a valid name -- Make sure it starts with an alphabetic or underscore character and it does not contain any blank, dot or other special characters" - throw new IllegalArgumentException(msg) - } - - this.channelTopicName = name - return this - } -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/BaseParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/BaseParam.groovy deleted file mode 100644 index cd470b35fd..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/BaseParam.groovy +++ /dev/null @@ -1,170 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovy.util.logging.Slf4j -import nextflow.exception.ScriptRuntimeException -import nextflow.script.TokenVar -/** - * Base class for input/output parameters - * - * @author Paolo Di Tommaso - */ -@Slf4j -abstract class BaseParam implements Cloneable { - - /** - * The binding context to resolve param variables - */ - final protected Binding binding - - protected List holder - - /** - * The param declaration index in the input/output block - * Note the index do not change for nested parameters ie. declared in the same tuple param - */ - final short index - - /** - * The nested index of tuple composed parameters or -1 when it's a top level param ie. 
not a tuple element - */ - final short mapIndex - - private boolean initialized - - BaseParam ( Binding binding, List holder, int ownerIndex = -1 ) { - this.binding = binding - this.holder = holder - - /* - * by default the index is got from 'holder' current size - * and the mapIndex is =1 (not defined) - */ - if( ownerIndex == -1 ) { - index = holder.size() - mapIndex = -1 - } - - /* - * when the owner index is provided (not -1) it is used as - * the main index and the map index is got from the 'holder' size - */ - else { - index = ownerIndex - mapIndex = holder.size() - } - - // add the the param to the holder list - holder.add(this) - } - - @Override - Object clone() { - final copy = (BaseParam)super.clone() - copy.holder = this.holder!=null ? new ArrayList(holder) : new ArrayList() - return copy - } - - String toString() { - def p = mapIndex == -1 ? index : "$index:$mapIndex" - return "${getTypeSimpleName()}<$p>" - } - - String getTypeSimpleName() { - this.class.simpleName.toLowerCase() - } - - /** - * Lazy initializer - */ - protected abstract void lazyInit() - - /** - * Initialize the parameter fields if needed - */ - final void init() { - if( initialized ) return - lazyInit() - - // flag as initialized - initialized = true - } - - - /** - * Get the value of variable {@code name} in the script context - * - * @param name The variable name - * @param strict If {@code true} raises a {@code MissingPropertyException} when the specified variable does not exist - * @return The variable object - */ - protected getScriptVar(String name, boolean strict ) { - if( binding.hasVariable(name) ) { - return binding.getVariable(name) - } - - if( strict ) - throw new MissingPropertyException(name,this.class) - - return null - } - - protected getScriptVar( String name ) { - getScriptVar(name,true) - } - - protected BaseParam setOptions(Map opts) { - if( !opts ) - return this - - for( Map.Entry entry : opts ) { - setProperty(entry.key, entry.value) - } - return this - } - - boolean isNestedParam() { - return mapIndex >= 0 - } - - /** - * Report missing method calls as possible syntax errors. - */ - def methodMissing( String name, def args ) { - throw new ScriptRuntimeException("Invalid function call `${name}(${argsToString0(args)})` -- possible syntax error") - } - - private String argsToString0(args) { - if( args instanceof Object[] ) - args = Arrays.asList(args) - if( args instanceof List ) { - final result = new ArrayList() - for( def it : args ) - result.add(argsToString1(it)) - return result.join(',') - } - return argsToString1(args) - } - - private String argsToString1(arg) { - if( arg instanceof TokenVar ) - return arg.name - else - return String.valueOf((Object)arg) - } -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/CmdEvalParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/CmdEvalParam.groovy deleted file mode 100644 index 786f3fbaea..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/CmdEvalParam.groovy +++ /dev/null @@ -1,61 +0,0 @@ -/* - * Copyright 2013-2023, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
- * See the License for the specific language governing permissions and - * limitations under the License. - * - */ - -package nextflow.script.params - -import java.util.concurrent.atomic.AtomicInteger - -import groovy.transform.InheritConstructors -import groovy.transform.Memoized - -/** - * Model process `output: eval PARAM` definition - * - * @author Paolo Di Tommaso - */ -@InheritConstructors -class CmdEvalParam extends BaseOutParam implements OptionalParam { - - private static AtomicInteger counter = new AtomicInteger() - - private Object target - - private int count - - { - count = counter.incrementAndGet() - } - - String getName() { - return "nxf_out_eval_${count}" - } - - BaseOutParam bind( def obj ) { - if( obj !instanceof CharSequence ) - throw new IllegalArgumentException("Invalid argument for command output: $this") - // the target value object - target = obj - return this - } - - @Memoized - String getTarget(Map context) { - return target instanceof GString - ? target.cloneAsLazy(context).toString() - : target.toString() - } -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/DefaultInParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/DefaultInParam.groovy deleted file mode 100644 index d17b49c303..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/DefaultInParam.groovy +++ /dev/null @@ -1,42 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import nextflow.extension.CH -import nextflow.script.ProcessConfig -/** - * Model a process default input parameter - * - * @author Paolo Di Tommaso - */ -final class DefaultInParam extends ValueInParam { - - @Override - String getTypeName() { 'default' } - - DefaultInParam(ProcessConfig config) { - super(config) - // This must be a dataflow queue channel to which - // just a value is bound -- No STOP value has to be emitted - // because this channel is used to control to process termination - // See TaskProcessor.BaseProcessInterceptor#messageArrived - final channel = CH.queue() - channel.bind(Boolean.TRUE) - setFrom(channel) - bind('$') - } -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/DefaultOutParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/DefaultOutParam.groovy deleted file mode 100644 index 1857a46cb7..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/DefaultOutParam.groovy +++ /dev/null @@ -1,36 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
- * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - - -import groovyx.gpars.dataflow.DataflowQueue -import nextflow.script.ProcessConfig -/** - * Model a process default output parameter - * - * @author Paolo Di Tommaso - */ -final class DefaultOutParam extends BaseOutParam { - - static enum Completion { DONE } - - DefaultOutParam(ProcessConfig config ) { - super(config) - bind('-') - setInto(new DataflowQueue()) - } -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/EachInParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/EachInParam.groovy deleted file mode 100644 index 5313a16e94..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/EachInParam.groovy +++ /dev/null @@ -1,103 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovy.transform.InheritConstructors -import groovy.transform.PackageScope -import groovy.util.logging.Slf4j -import groovyx.gpars.dataflow.DataflowReadChannel -import groovyx.gpars.dataflow.DataflowVariable -import groovyx.gpars.dataflow.expression.DataflowExpression -import nextflow.extension.CH -import nextflow.extension.ToListOp -import nextflow.script.TokenFileCall -import nextflow.script.TokenPathCall - -/** - * Represents a process input *iterator* parameter - * - * @author Paolo Di Tommaso - */ -@InheritConstructors -@Slf4j -class EachInParam extends BaseInParam { - - @Override String getTypeName() { 'each' } - - private List inner = [] - - String getName() { '__$'+this.toString() } - - Object clone() { - final copy = (EachInParam)super.clone() - copy.@inner = new ArrayList<>(inner.size()) - for( BaseInParam p : inner ) { - copy.@inner.add((BaseInParam)p.clone()) - } - return copy - } - - EachInParam bind( def obj ) { - final nested = createNestedParam(obj) - nested.owner = this - this.bindObject = nested.bindObject - return this - } - - protected BaseInParam createNestedParam(obj) { - if( obj instanceof TokenFileCall ) { - return new FileInParam(binding, inner, index) - .bind(obj.target) - } - - if( obj instanceof TokenPathCall ) { - return new FileInParam(binding, inner, index) - .setPathQualifier(true) - .bind(obj.target) - } - - return new ValueInParam(binding, inner, index) - .bind(obj) - } - - InParam getInner() { inner[0] } - - @Override - protected DataflowReadChannel inputValToChannel( value ) { - def variable = normalizeToVariable( value ) - super.inputValToChannel(variable) - } - - @PackageScope - DataflowReadChannel normalizeToVariable( value ) { - def result - if( value instanceof DataflowExpression ) { - result = value - } - else if( CH.isChannel(value) ) { - def read = CH.getReadChannel(value) - result = new ToListOp(read).apply() - } - else { - result = new DataflowVariable() - result.bind(value) - } - - return result.chainWith { it instanceof Collection || it == null ? 
it : [it] } - } - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/EnvOutParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/EnvOutParam.groovy deleted file mode 100644 index 70086a89ba..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/EnvOutParam.groovy +++ /dev/null @@ -1,54 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovy.transform.InheritConstructors -import nextflow.script.TokenVar - -/** - * Model process `output: env PARAM` definition - * - * @author Paolo Di Tommaso - */ -@InheritConstructors -class EnvOutParam extends BaseOutParam implements OptionalParam { - - protected target - - String getName() { - return nameObj ? super.getName() : null - } - - BaseOutParam bind( def obj ) { - // the target value object - target = obj - - // retrieve the variable name to be used to fetch the value - if( obj instanceof TokenVar ) { - this.nameObj = obj.name - } - else if( obj instanceof CharSequence ) { - this.nameObj = obj.toString() - } - else { - throw new IllegalArgumentException("Unexpected environment output definition - it should be either a string or a variable identifier - offending value: ${obj?.getClass()?.getName()}") - } - - return this - } - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/FileInParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/FileInParam.groovy deleted file mode 100644 index fb93eb1701..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/FileInParam.groovy +++ /dev/null @@ -1,149 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovy.transform.InheritConstructors -import groovy.util.logging.Slf4j -import nextflow.NF -import nextflow.script.TokenVar - -/** - * Represents a process *file* input parameter - * - * @author Paolo Di Tommaso - */ -@Slf4j -@InheritConstructors -class FileInParam extends BaseInParam implements ArityParam, PathQualifier { - - protected filePattern - - private boolean pathQualifier - - @Override String getTypeName() { pathQualifier ? 
'path' : 'file' } - - @Override String getTypeSimpleName() { getTypeName() + "inparam" } - - /** - * Define the file name - */ - FileInParam name( obj ) { - if( pathQualifier ) - throw new MissingMethodException("name", this.class, [String] as Object[]) - - if( obj instanceof String ) { - filePattern = obj - return this - } - - if( obj instanceof GString ) { - filePattern = obj - return this - } - - throw new IllegalArgumentException() - } - - String getName() { - if( bindObject instanceof Map ) { - assert !pathQualifier - def entry = bindObject.entrySet().first() - return entry?.key - } - - if( bindObject instanceof GString ) { - return '__$' + this.toString() - } - - return super.getName() - } - - @Override - BaseInParam bind( obj ) { - if( pathQualifier && obj instanceof Map ) - throw new IllegalArgumentException("Input `path` does not allow such arguments: ${obj.entrySet().collect{"${it.key}:${it.value}"}.join(',')}") - super.bind(obj) - return this - } - - String getFilePattern(Map ctx = null) { - - if( filePattern != null ) - return resolve(ctx,filePattern) - - if( bindObject instanceof Map ) { - assert !pathQualifier - def entry = bindObject.entrySet().first() - return resolve(ctx, entry?.value) - } - - if( bindObject instanceof TokenVar ) - return filePattern = '*' - - if( bindObject != null ) - return resolve(ctx, bindObject) - - return filePattern = '*' - } - - private resolve( Map ctx, value ) { - if( value instanceof GString ) { - value.cloneAsLazy(ctx) - } - - else if( value instanceof Closure ) { - return ctx.with(value) - } - - else - return value - } - - @Override - FileInParam setPathQualifier(boolean flag) { - pathQualifier = flag - return this - } - - @Override - boolean isPathQualifier() { pathQualifier } - - @Override - FileInParam setOptions(Map opts) { - (FileInParam)super.setOptions(opts) - } - - /** - * Defines the `stageAs:` option to define the input file stage name pattern - * - * @param value - * A string representing the target file name or a file name pattern - * ie. containing the star `*` or question mark wildcards - * @return - * The param instance itself - */ - FileInParam setStageAs(String value) { - this.filePattern = value - return this - } - - FileInParam setName(String value) { - this.filePattern = value - return this - } - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/FileOutParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/FileOutParam.groovy deleted file mode 100644 index 6e6badeabd..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/FileOutParam.groovy +++ /dev/null @@ -1,222 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -package nextflow.script.params - -import java.nio.file.Path - -import groovy.transform.InheritConstructors -import groovy.transform.PackageScope -import groovy.util.logging.Slf4j -import nextflow.NF -import nextflow.exception.IllegalFileException -import nextflow.file.FilePatternSplitter -import nextflow.script.TokenVar -import nextflow.util.BlankSeparatedList -/** - * Model a process *file* output parameter - * - * @author Paolo Di Tommaso - */ -@Slf4j -@InheritConstructors -class FileOutParam extends BaseOutParam implements OutParam, ArityParam, OptionalParam, PathQualifier { - - /** - * ONLY FOR TESTING DO NOT USE - */ - protected FileOutParam(Map params) { - super(new Binding(), []) - } - - /** - * The character used to separate multiple names (pattern) in the output specification - * - * This is only used by `file` qualifier. It's not supposed to be used anymore - * by the new `path` qualifier. - * - */ - @Deprecated - String separatorChar = ':' - - /** - * When {@code true} star wildcard (*) matches hidden files (files starting with a dot char) - * By default it does not, coherently with linux bash rule - */ - boolean hidden - - /** - * When {@code true} file pattern includes input files as well as output files. - * By default a file pattern matches only against files produced by the process, not - * the ones received as input - */ - boolean includeInputs - - /** - * The type of path to output, either {@code file}, {@code dir} or {@code any} - */ - String type - - /** - * Maximum number of directory levels to visit (default: no limit) - */ - Integer maxDepth - - /** - * When true it follows symbolic links during directories tree traversal, otherwise they are managed as files (default: true) - */ - boolean followLinks = true - - boolean glob = true - - private GString gstring - private Closure dynamicObj - private String filePattern - private boolean pathQualifier - - /** - * @return {@code true} when the file name is parametric i.e contains a variable name to be resolved, {@code false} otherwise - */ - boolean isDynamic() { dynamicObj || gstring != null } - - @Override - BaseOutParam bind( obj ) { - - if( obj instanceof GString ) { - gstring = obj - return this - } - - if( obj instanceof TokenVar ) { - this.nameObj = obj.name - dynamicObj = { delegate.containsKey(obj.name) ? 
delegate.get(obj.name): obj.name } - return this - } - - if( obj instanceof Closure ) { - dynamicObj = obj - return this - } - - this.filePattern = obj.toString() - return this - } - - List getFilePatterns(Map context, Path workDir) { - - def entry = null - if( dynamicObj ) { - entry = context.with(dynamicObj) - } - else if( gstring != null ) { - def strict = (getName() == null) - try { - entry = gstring.cloneAsLazy(context) - } - catch( MissingPropertyException e ) { - if( strict ) - throw e - } - } - else { - entry = filePattern - } - - if( !entry ) - return [] - - if( entry instanceof Path ) - return [ relativize(entry, workDir) ] - - // handle a collection of files - if( entry instanceof BlankSeparatedList || entry instanceof List ) { - return entry.collect { relativize(it.toString(), workDir) } - } - - // normalize to a string object - final nameString = entry.toString() - if( separatorChar && nameString.contains(separatorChar) ) { - return nameString.split(/\${separatorChar}/).collect { String it-> relativize(it, workDir) } - } - - return [relativize(nameString, workDir)] - - } - - @PackageScope String getFilePattern() { filePattern } - - @PackageScope - static String clean(String path) { - while (path.startsWith('/') ) { - path = path.substring(1) - } - return path - } - - @PackageScope - String relativize(String path, Path workDir) { - if( !path.startsWith('/') ) - return path - - final dir = workDir.toString() - if( !path.startsWith(dir) ) - throw new IllegalFileException("File `$path` is outside the scope of the process work directory: $workDir") - - if( path.length()-dir.length()<2 ) - throw new IllegalFileException("Missing output file name") - - return path.substring(dir.size()+1) - } - - @PackageScope - String relativize(Path path, Path workDir) { - if( !path.isAbsolute() ) - return glob ? FilePatternSplitter.GLOB.escape(path) : path - - if( !path.startsWith(workDir) ) - throw new IllegalFileException("File `$path` is outside the scope of the process work directory: $workDir") - - if( path.nameCount == workDir.nameCount ) - throw new IllegalFileException("Missing output file name") - - final rel = path.subpath(workDir.getNameCount(), path.getNameCount()) - return glob ? FilePatternSplitter.GLOB.escape(rel) : rel - } - - /** - * Override the default to allow null as a value name - * @return - */ - String getName() { - return nameObj ? super.getName() : null - } - - @Override - FileOutParam setPathQualifier(boolean flag) { - pathQualifier = flag - separatorChar = null - return this - } - - @Override - boolean isPathQualifier() { pathQualifier } - - @Override - FileOutParam setOptions(Map opts) { - (FileOutParam)super.setOptions(opts) - } - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/InParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/InParam.groovy deleted file mode 100644 index 0551ddbe3b..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/InParam.groovy +++ /dev/null @@ -1,40 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
- * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovyx.gpars.dataflow.DataflowReadChannel - -/** - * Basic interface for *all* input parameters - * - * @author Paolo Di Tommaso - */ -interface InParam extends Cloneable { - - String getName() - - DataflowReadChannel getInChannel() - - Object getRawChannel() - - short index - - short mapIndex - - def decodeInputs( List values ) - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/InputsList.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/InputsList.groovy deleted file mode 100644 index 2aee0d671d..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/InputsList.groovy +++ /dev/null @@ -1,65 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovy.util.logging.Slf4j -import groovyx.gpars.dataflow.DataflowReadChannel -import nextflow.extension.CH - - -/** - * Container to hold all process outputs - * - * @author Paolo Di Tommaso - */ -@Slf4j -class InputsList implements List, Cloneable { - - @Override - InputsList clone() { - def result = (InputsList)super.clone() - result.target = new ArrayList<>(target.size()) - for( InParam param : target ) { - result.target.add((InParam)param.clone()) - } - return result - } - - @Delegate - private List target = new LinkedList<>() - - List getChannels() { - target.collect { InParam it -> it.getInChannel() } - } - - List getNames() { target *. name } - - - def List ofType( Class clazz ) { - (List) target.findAll { it.class == clazz } - } - - boolean allScalarInputs() { - for( InParam param : target ) { - if( CH.isChannelQueue(param.inChannel) ) - return false - } - return true - } - -} - diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/MissingParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/MissingParam.groovy deleted file mode 100644 index fb04bfd620..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/MissingParam.groovy +++ /dev/null @@ -1,29 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -package nextflow.script.params - - -/** - * Placeholder trait to mark a missing optional output parameter - * - * @author Paolo Di Tommaso - */ -trait MissingParam { - - OutParam missing - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/OptionalParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/OptionalParam.groovy deleted file mode 100644 index cda12e8250..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/OptionalParam.groovy +++ /dev/null @@ -1,36 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - - -/** - * Implements an optional file output option - * - * @author Paolo Di Tommaso - */ -trait OptionalParam { - - boolean optional - - boolean getOptional() { optional } - - def optional( boolean value ) { - this.optional = value - return this - } - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/OutParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/OutParam.groovy deleted file mode 100644 index 907b2fe944..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/OutParam.groovy +++ /dev/null @@ -1,45 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovyx.gpars.dataflow.DataflowWriteChannel - -/** - * Model a process generic input parameter - * - * @author Paolo Di Tommaso - */ - -interface OutParam extends Cloneable { - - /** - * @return The parameter name getter - */ - String getName() - - /** - * @return The output channel instance - */ - DataflowWriteChannel getOutChannel() - - short getIndex() - - String getChannelEmitName() - - String getChannelTopicName() - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/OutputsList.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/OutputsList.groovy deleted file mode 100644 index a50a48a7c5..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/OutputsList.groovy +++ /dev/null @@ -1,56 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovyx.gpars.dataflow.DataflowWriteChannel - - -/** - * Container to hold all process outputs - * - * @author Paolo Di Tommaso - */ -class OutputsList implements List, Cloneable { - - @Override - OutputsList clone() { - def result = (OutputsList)super.clone() - result.target = new ArrayList<>(target.size()) - for( OutParam param : target ) - result.add((OutParam)param.clone()) - return result - } - - @Delegate - private List target = new LinkedList<>() - - List getChannels() { - final List result = new ArrayList<>(target.size()) - for(OutParam param : target) { result.add(param.getOutChannel()) } - return result - } - - List getNames() { target *. name } - - def List ofType( Class... classes ) { - (List) target.findAll { it.class in classes } - } - - void setSingleton( boolean value ) { - for( OutParam param : target ) { ((BaseOutParam)param).singleton = value } - } -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/PathQualifier.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/PathQualifier.groovy deleted file mode 100644 index 9f19f40137..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/PathQualifier.groovy +++ /dev/null @@ -1,30 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -/** - * Path qualifier marker interface - * - * @author Paolo Di Tommaso - */ -interface PathQualifier { - - def setPathQualifier(boolean flag) - - boolean isPathQualifier() - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/StdInParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/StdInParam.groovy deleted file mode 100644 index 80a0fe22ed..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/StdInParam.groovy +++ /dev/null @@ -1,38 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -package nextflow.script.params - -import groovy.transform.InheritConstructors -import groovy.transform.ToString - - -/** - * Represents a process *stdin* input parameter - * - * @author Paolo Di Tommaso - */ -@InheritConstructors -@ToString(includePackage=false, includeSuper = true) -class StdInParam extends BaseInParam { - - String getName() { '-' } - - @Override - String getTypeName() { 'stdin' } - -} - diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/StdOutParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/StdOutParam.groovy deleted file mode 100644 index 34cf93ded6..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/StdOutParam.groovy +++ /dev/null @@ -1,27 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovy.transform.InheritConstructors - -/** - * Model the process *stdout* parameter - * - * @author Paolo Di Tommaso - */ -@InheritConstructors -class StdOutParam extends BaseOutParam { } diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/TupleInParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/TupleInParam.groovy deleted file mode 100644 index d58a97f925..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/TupleInParam.groovy +++ /dev/null @@ -1,106 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovy.transform.InheritConstructors -import nextflow.script.TokenEvalCall -import nextflow.script.TokenEnvCall -import nextflow.script.TokenFileCall -import nextflow.script.TokenPathCall -import nextflow.script.TokenStdinCall -import nextflow.script.TokenValCall -import nextflow.script.TokenVar - -/** - * Models a tuple of input parameters - * - * @author Paolo Di Tommaso - */ -@InheritConstructors -class TupleInParam extends BaseInParam { - - protected List inner = [] - - @Override String getTypeName() { 'tuple' } - - List getInner() { inner } - - @Override - TupleInParam clone() { - final copy = (TupleInParam)super.clone() - copy.@inner = new ArrayList<>(inner.size()) - for( BaseInParam p : inner ) { - copy.@inner.add((BaseInParam)p.clone()) - } - return copy - } - - String getName() { '__$'+this.toString() } - - TupleInParam bind(Object... 
obj ) { - - for( def item : obj ) { - - if( item instanceof TokenVar ) { - throw new IllegalArgumentException("Unqualified input value declaration is not allowed - replace `tuple ${item.name},..` with `tuple val(${item.name}),..`") - } - else if( item instanceof TokenFileCall ) { - newItem(FileInParam).bind( item.target ) - } - else if( item instanceof TokenPathCall ) { - newItem(FileInParam) - .setPathQualifier(true) - .setOptions(item.opts) - .bind( item.target ) - } - else if( item instanceof Map ) { - throw new IllegalArgumentException("Unqualified input file declaration is not allowed - replace `tuple $item,..` with `tuple path(${item.key}, stageAs:'${item.value}'),..`") - } - else if( item instanceof TokenValCall ) { - newItem(ValueInParam).bind(item.val) - } - else if( item instanceof TokenEnvCall ) { - newItem(EnvInParam).bind(item.val) - } - else if( item instanceof TokenEvalCall ) { - throw new IllegalArgumentException('Command input declaration is not supported') - } - else if( item instanceof TokenStdinCall ) { - newItem(StdInParam) - } - else if( item instanceof GString ) { - throw new IllegalArgumentException("Unqualified input file declaration is not allowed - replace `tuple \"$item\".. with `tuple path(\"$item\")..`") - } - else if( item == '-' ) { - newItem(StdInParam) - } - else if( item instanceof String ) { - throw new IllegalArgumentException("Unqualified input file declaration is not allowed - replace `tuple '$item',..` with `tuple path('$item'),..`") - } - else - throw new IllegalArgumentException() - } - - return this - - } - - private T newItem( Class type ) { - type.newInstance(binding, inner, index) - } - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/TupleOutParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/TupleOutParam.groovy deleted file mode 100644 index 91df753b8c..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/TupleOutParam.groovy +++ /dev/null @@ -1,104 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovy.transform.InheritConstructors -import nextflow.script.TokenEvalCall -import nextflow.script.TokenEnvCall -import nextflow.script.TokenFileCall -import nextflow.script.TokenPathCall -import nextflow.script.TokenStdoutCall -import nextflow.script.TokenValCall -import nextflow.script.TokenVar -/** - * Model a set of process output parameters - * - * @author Paolo Di Tommaso - */ -@InheritConstructors -class TupleOutParam extends BaseOutParam implements OptionalParam { - - protected List inner = new ArrayList<>(10) - - String getName() { toString() } - - List getInner() { inner } - - TupleOutParam clone() { - final copy = (TupleOutParam)super.clone() - copy.inner = new ArrayList<>(10) - for( BaseOutParam p : inner ) { - copy.inner.add(p.clone()) - } - return copy - } - - TupleOutParam bind(Object... 
obj ) { - - for( def item : obj ) { - if( item instanceof TokenVar ) { - throw new IllegalArgumentException("Unqualified output value declaration is not allowed - replace `tuple ${item.name},..` with `tuple val(${item.name}),..`") - } - else if( item instanceof TokenValCall ) { - create(ValueOutParam).bind(item.val) - } - else if( item instanceof TokenEnvCall ) { - create(EnvOutParam).bind(item.val) - } - else if( item instanceof TokenEvalCall ) { - create(CmdEvalParam).bind(item.val) - } - else if( item instanceof GString ) { - throw new IllegalArgumentException("Unqualified output path declaration is not allowed - replace `tuple \"$item\",..` with `tuple path(\"$item\"),..`") - } - else if( item instanceof TokenStdoutCall || item == '-' ) { - create(StdOutParam).bind('-') - } - else if( item instanceof String ) { - throw new IllegalArgumentException("Unqualified output path declaration is not allowed - replace `tuple '$item',..` with `tuple path('$item'),..`") - } - else if( item instanceof TokenFileCall ) { - // note that 'filePattern' can be a string or a GString - create(FileOutParam).bind(item.target) - } - else if( item instanceof TokenPathCall ) { - // note that 'filePattern' can be a string or a GString - create(FileOutParam) - .setPathQualifier(true) - .setOptions(item.opts) - .bind(item.target) - } - else - throw new IllegalArgumentException("Invalid `tuple` output parameter declaration -- item: ${item}") - } - - return this - } - - protected T create(Class type) { - type.newInstance(binding,inner,index) - } - - @Override - void lazyInit() { - super.lazyInit() - inner.each { opt -> - if( opt instanceof FileOutParam ) opt.optional(this.optional) - } - } - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/ValueInParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/ValueInParam.groovy deleted file mode 100644 index dad42ea27a..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/ValueInParam.groovy +++ /dev/null @@ -1,33 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovy.transform.InheritConstructors - - -/** - * Represents a process *value* input parameter - * - * @author Paolo Di Tommaso - */ -@InheritConstructors -class ValueInParam extends BaseInParam { - - @Override - String getTypeName() { 'val' } - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/script/params/ValueOutParam.groovy b/modules/nextflow/src/main/groovy/nextflow/script/params/ValueOutParam.groovy deleted file mode 100644 index 20d427d864..0000000000 --- a/modules/nextflow/src/main/groovy/nextflow/script/params/ValueOutParam.groovy +++ /dev/null @@ -1,74 +0,0 @@ -/* - * Copyright 2013-2024, Seqera Labs - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package nextflow.script.params - -import groovy.transform.InheritConstructors -import nextflow.script.TokenVar -import org.codehaus.groovy.runtime.InvokerHelper - - -/** - * Model a process *value* output parameter - * - * @author Paolo Di Tommaso - */ -@InheritConstructors -class ValueOutParam extends BaseOutParam { - - protected target - - String getName() { - return nameObj ? super.getName() : null - } - - BaseOutParam bind( def obj ) { - // the target value object - target = obj - - // retrieve the variable name to be used to fetch the value - if( obj instanceof TokenVar ) { - this.nameObj = obj.name - } - - return this - } - - /** - * Given the {@link nextflow.processor.TaskContext} object resolve the actual value - * to which this param is bound - * - * @param context An instance of {@link nextflow.processor.TaskContext} holding the task evaluation context - * @return The actual value to which this out param is bound - */ - def resolve( Map context ) { - - switch( target ) { - case TokenVar: - return InvokerHelper.getProperty(context,target.name) - - case Closure: - return target.cloneWith(context).call() - - case GString: - return target.cloneAsLazy(context).toString() - - default: - return target - } - } - -} diff --git a/modules/nextflow/src/main/groovy/nextflow/util/LoggerHelper.groovy b/modules/nextflow/src/main/groovy/nextflow/util/LoggerHelper.groovy index fc8bd68841..96f0b9f3fe 100644 --- a/modules/nextflow/src/main/groovy/nextflow/util/LoggerHelper.groovy +++ b/modules/nextflow/src/main/groovy/nextflow/util/LoggerHelper.groovy @@ -69,6 +69,7 @@ import nextflow.script.ChainableDef import nextflow.script.ComponentDef import nextflow.script.CompositeDef import nextflow.script.FunctionDef +import nextflow.script.ProcessDef import nextflow.script.ScriptMeta import nextflow.script.WorkflowBinding import nextflow.script.WorkflowDef @@ -758,7 +759,7 @@ class LoggerHelper { if( type instanceof Class ) { if( DataflowWriteChannel.isAssignableFrom(type) || DataflowReadChannel .isAssignableFrom(type) ) return 'channel type' - if( ComponentDef.isAssignableFrom(type) ) + if( ProcessDef.isAssignableFrom(type) ) return 'process type' if( FunctionDef.isAssignableFrom(type) ) return 'function type' diff --git a/modules/nextflow/src/test/groovy/nextflow/dag/DAGTest.groovy b/modules/nextflow/src/test/groovy/nextflow/dag/DAGTest.groovy index 31d1594a5e..1a21f83442 100644 --- a/modules/nextflow/src/test/groovy/nextflow/dag/DAGTest.groovy +++ b/modules/nextflow/src/test/groovy/nextflow/dag/DAGTest.groovy @@ -22,10 +22,10 @@ import groovyx.gpars.dataflow.DataflowChannel import groovyx.gpars.dataflow.DataflowQueue import groovyx.gpars.dataflow.DataflowBroadcast import groovyx.gpars.dataflow.DataflowVariable -import nextflow.script.params.InputsList; -import nextflow.script.params.InParam; -import nextflow.script.params.OutputsList; -import nextflow.script.params.OutParam; +import nextflow.script.ProcessInput +import nextflow.script.ProcessInputs +import nextflow.script.ProcessOutput +import nextflow.script.ProcessOutputs import nextflow.Session /** * @@ -293,12 +293,18 @@ 
class DAGTest extends Specification { def dag = new DAG() - def pInList = new InputsList() - def ip1 = Mock(InParam) { rawChannel >> chC } + def pInList = new ProcessInputs() + def ip1 = Mock(ProcessInput) { + getChannel() >> chC + getName() >> 'in1' + } pInList.add( ip1 ) - def pOutList = new OutputsList() - def op1 = Mock(OutParam) { getOutChannel() >> chE } + def pOutList = new ProcessOutputs() + def op1 = Mock(ProcessOutput) { + getChannel() >> chE + getName() >> 'out1' + } pOutList.add( op1 ) when: diff --git a/modules/nextflow/src/test/groovy/nextflow/script/params/ArityParamTest.groovy b/modules/nextflow/src/test/groovy/nextflow/processor/PathArityAwareTest.groovy similarity index 84% rename from modules/nextflow/src/test/groovy/nextflow/script/params/ArityParamTest.groovy rename to modules/nextflow/src/test/groovy/nextflow/processor/PathArityAwareTest.groovy index bd4f50ee29..15d4723184 100644 --- a/modules/nextflow/src/test/groovy/nextflow/script/params/ArityParamTest.groovy +++ b/modules/nextflow/src/test/groovy/nextflow/processor/PathArityAwareTest.groovy @@ -14,7 +14,7 @@ * limitations under the License. */ -package nextflow.script.params +package nextflow.script import spock.lang.Specification import spock.lang.Unroll @@ -22,17 +22,17 @@ import spock.lang.Unroll * * @author Ben Sherman */ -class ArityParamTest extends Specification { +class PathArityAwareTest extends Specification { - static class DefaultArityParam implements ArityParam { - DefaultArityParam() {} + static class PathArity implements PathArityAware { + PathArity() {} } @Unroll def testArity () { when: - def param = new DefaultArityParam() + def param = new PathArity() param.setArity(VALUE) then: param.arity.min == MIN @@ -50,7 +50,7 @@ class ArityParamTest extends Specification { def testArityRange () { when: - def range = new ArityParam.Range(MIN, MAX) + def range = new PathArityAware.Range(MIN, MAX) then: range.contains(2) == TWO range.toString() == STRING diff --git a/modules/nextflow/src/test/groovy/nextflow/processor/TaskConfigTest.groovy b/modules/nextflow/src/test/groovy/nextflow/processor/TaskConfigTest.groovy index 9f377cff78..0240f9b1a5 100644 --- a/modules/nextflow/src/test/groovy/nextflow/processor/TaskConfigTest.groovy +++ b/modules/nextflow/src/test/groovy/nextflow/processor/TaskConfigTest.groovy @@ -102,45 +102,36 @@ class TaskConfigTest extends Specification { given: def config - def local when: - config = new ProcessConfig([:]) - config.module 't_coffee/10' - config.module( [ 'blast/2.2.1', 'clustalw/2'] ) - local = config.createTaskConfig() + config = new TaskConfig() + config.module = ['t_coffee/10', 'blast/2.2.1', 'clustalw/2'] then: - local.module == ['t_coffee/10', 'blast/2.2.1', 'clustalw/2'] - local.getModule() == ['t_coffee/10','blast/2.2.1', 'clustalw/2'] + config.module == ['t_coffee/10', 'blast/2.2.1', 'clustalw/2'] + config.getModule() == ['t_coffee/10','blast/2.2.1', 'clustalw/2'] when: - config = new ProcessConfig([:]) - config.module 'a/1' - config.module 'b/2:c/3' - local = config.createTaskConfig() + config = new TaskConfig() + config.module = ['a/1', 'b/2:c/3'] then: - local.module == ['a/1','b/2','c/3'] + config.module == ['a/1','b/2','c/3'] when: - config = new ProcessConfig([:]) - config.module { 'a/1' } - config.module { 'b/2:c/3' } - config.module 'd/4' - local = config.createTaskConfig() - local.setContext([:]) + config = new TaskConfig() + config.module = ['a/1', 'b/2:c/3', 'd/4'] + config.setContext([:]) then: - local.module == ['a/1','b/2','c/3', 'd/4'] + config.module 
== ['a/1','b/2','c/3', 'd/4'] when: - config = new ProcessConfig([:]) - config.module = 'b/2:c/3' - local = config.createTaskConfig() + config = new TaskConfig() + config.module = ['b/2:c/3'] then: - local.module == ['b/2','c/3'] - local.getModule() == ['b/2','c/3'] + config.module == ['b/2','c/3'] + config.getModule() == ['b/2','c/3'] } @@ -462,14 +453,13 @@ class TaskConfigTest extends Specification { def 'should create publishDir object' () { setup: - def script = Mock(BaseScript) - ProcessConfig process - PublishDir publish + def config + def publish when: - process = new ProcessConfig(script) - process.publishDir '/data' - publish = process.createTaskConfig().getPublishDir()[0] + config = new TaskConfig() + config.publishDir = [[path: '/data']] + publish = config.getPublishDir()[0] then: publish.path == Paths.get('/data').complete() publish.pattern == null @@ -477,9 +467,9 @@ class TaskConfigTest extends Specification { publish.mode == null when: - process = new ProcessConfig(script) - process.publishDir '/data', overwrite: false, mode: 'copy', pattern: '*.txt' - publish = process.createTaskConfig().getPublishDir()[0] + config = new TaskConfig() + config.publishDir = [[path: '/data', overwrite: false, mode: 'copy', pattern: '*.txt']] + publish = config.getPublishDir()[0] then: publish.path == Paths.get('/data').complete() publish.pattern == '*.txt' @@ -487,18 +477,17 @@ class TaskConfigTest extends Specification { publish.mode == PublishDir.Mode.COPY when: - process = new ProcessConfig(script) - process.publishDir '/my/data', mode: 'copyNoFollow' - publish = process.createTaskConfig().getPublishDir()[0] + config = new TaskConfig() + config.publishDir = [[path: '/my/data', mode: 'copyNoFollow']] + publish = config.getPublishDir()[0] then: publish.path == Paths.get('//my/data').complete() publish.mode == PublishDir.Mode.COPY_NO_FOLLOW when: - process = new ProcessConfig(script) - process.publishDir '/here' - process.publishDir '/there', pattern: '*.fq' - def dirs = process.createTaskConfig().getPublishDir() + config = new TaskConfig() + config.publishDir = [[path: '/here'], [path: '/there', pattern: '*.fq']] + def dirs = config.getPublishDir() then: dirs.size() == 2 dirs[0].path == Paths.get('/here') @@ -515,7 +504,7 @@ class TaskConfigTest extends Specification { when: config = new TaskConfig() - config.publishDir = [ [path: "${-> foo }/${-> bar }", mode: "${-> x }"] ] as ConfigList + config.publishDir = [ [path: "${-> foo }/${-> bar }", mode: "${-> x }"] ] as LazyList config.setContext( foo: 'world', bar: 'hello', x: 'copy' ) then: config.getPublishDir() == [ PublishDir.create(path: 'world/hello', mode: 'copy') ] @@ -549,20 +538,18 @@ class TaskConfigTest extends Specification { def 'should configure pod options'() { - given: - def script = Mock(BaseScript) - when: - def process = new ProcessConfig(script) - process.pod secret: 'foo', mountPath: '/this' - process.pod secret: 'bar', env: 'BAR_XXX' + def config = new TaskConfig() + config.pod = [ + [secret: 'foo', mountPath: '/this'], + [secret: 'bar', env: 'BAR_XXX'] ] then: - process.get('pod') == [ + config.get('pod') == [ [secret: 'foo', mountPath: '/this'], [secret: 'bar', env: 'BAR_XXX'] ] - process.createTaskConfig().getPodOptions() == new PodOptions([ + config.getPodOptions() == new PodOptions([ [secret: 'foo', mountPath: '/this'], [secret: 'bar', env: 'BAR_XXX'] ]) @@ -571,20 +558,19 @@ class TaskConfigTest extends Specification { def 'should get gpu resources' () { given: - def script = Mock(BaseScript) + def config = new 
TaskConfig() + def res when: - def process = new ProcessConfig(script) - process.accelerator 5 - def res = process.createTaskConfig().getAccelerator() + config.accelerator = [request: 5, limit: 5] + res = config.getAccelerator() then: res.limit == 5 res.request == 5 when: - process = new ProcessConfig(script) - process.accelerator 5, limit: 10, type: 'nvidia' - res = process.createTaskConfig().getAccelerator() + config.accelerator = [request: 5, limit: 10, type: 'nvidia'] + res = config.getAccelerator() then: res.request == 5 res.limit == 10 @@ -593,36 +579,23 @@ class TaskConfigTest extends Specification { def 'should configure secrets'() { - given: - def script = Mock(BaseScript) - when: - def process = new ProcessConfig(script) - process.secret 'alpha' - process.secret 'omega' + def config = new TaskConfig() + config.secret = ['alpha', 'omega'] then: - process.getSecret() == ['alpha', 'omega'] + config.getSecret() == ['alpha', 'omega'] and: - process.createTaskConfig().secret == ['alpha', 'omega'] - process.createTaskConfig().getSecret() == ['alpha', 'omega'] + config.secret == ['alpha', 'omega'] + config.getSecret() == ['alpha', 'omega'] } def 'should configure resourceLabels options'() { - given: - def script = Mock(BaseScript) - - when: - def process = new ProcessConfig(script) - process.resourceLabels( region: 'eu-west-1', organization: 'A', user: 'this', team: 'that' ) - - then: - process.get('resourceLabels') == [region: 'eu-west-1', organization: 'A', user: 'this', team: 'that'] - when: - def config = process.createTaskConfig() + def config = new TaskConfig() + config.resourceLabels = [region: 'eu-west-1', organization: 'A', user: 'this', team: 'that'] then: config.getResourceLabels() == [region: 'eu-west-1', organization: 'A', user: 'this', team: 'that'] config.getResourceLabelsAsString() == 'region=eu-west-1,organization=A,user=this,team=that' diff --git a/modules/nextflow/src/test/groovy/nextflow/processor/TaskProcessorTest.groovy b/modules/nextflow/src/test/groovy/nextflow/processor/TaskProcessorTest.groovy index 751feeb03f..2ce9ddec2e 100644 --- a/modules/nextflow/src/test/groovy/nextflow/processor/TaskProcessorTest.groovy +++ b/modules/nextflow/src/test/groovy/nextflow/processor/TaskProcessorTest.groovy @@ -23,6 +23,7 @@ import java.nio.file.Paths import java.util.concurrent.ExecutorService import groovyx.gpars.agent.Agent +import groovyx.gpars.dataflow.DataflowReadChannel import nextflow.Global import nextflow.ISession import nextflow.Session @@ -60,7 +61,8 @@ class TaskProcessorTest extends Specification { super(name, new NopeExecutor(session: session), session, script, taskConfig, new BodyDef({}, '..')) } - @Override protected void createOperator() { } + @Override + protected void createOperator(DataflowReadChannel source) { } } @@ -110,7 +112,7 @@ class TaskProcessorTest extends Specification { when: def session = new Session([env: [X:"1", Y:"2"]]) session.setBaseDir(home) - def processor = new DummyProcessor('task1', session, Mock(BaseScript), Mock(ProcessConfig)) + def processor = new DummyProcessor('task1', session, Mock(BaseScript), new ProcessConfig([:])) def builder = new ProcessBuilder() builder.environment().putAll( processor.getProcessEnvironment() ) then: @@ -122,7 +124,7 @@ class TaskProcessorTest extends Specification { when: session = new Session([env: [X:"1", Y:"2", PATH:'/some']]) session.setBaseDir(home) - processor = new DummyProcessor('task1', session, Mock(BaseScript), Mock(ProcessConfig)) + processor = new DummyProcessor('task1', session, 
Mock(BaseScript), new ProcessConfig([:])) builder = new ProcessBuilder() builder.environment().putAll( processor.getProcessEnvironment() ) then: diff --git a/modules/nextflow/src/test/groovy/nextflow/processor/TaskRunTest.groovy b/modules/nextflow/src/test/groovy/nextflow/processor/TaskRunTest.groovy index 80bcbf3458..e0b8a8b0e5 100644 --- a/modules/nextflow/src/test/groovy/nextflow/processor/TaskRunTest.groovy +++ b/modules/nextflow/src/test/groovy/nextflow/processor/TaskRunTest.groovy @@ -27,9 +27,9 @@ import nextflow.container.resolver.ContainerInfo import nextflow.executor.Executor import nextflow.file.FileHolder import nextflow.script.BodyDef +import nextflow.script.LazyVar import nextflow.script.ScriptBinding import nextflow.script.TaskClosure -import nextflow.script.TokenVar import nextflow.script.params.EnvInParam import nextflow.script.params.EnvOutParam import nextflow.script.params.FileInParam @@ -63,8 +63,8 @@ class TaskRunTest extends Specification { def list = [] task.setInput( new StdInParam(binding,list) ) - task.setInput( new FileInParam(binding, list).bind(new TokenVar('x')), 'file1' ) - task.setInput( new FileInParam(binding, list).bind(new TokenVar('y')), 'file2' ) + task.setInput( new FileInParam(binding, list).bind(new LazyVar('x')), 'file1' ) + task.setInput( new FileInParam(binding, list).bind(new LazyVar('y')), 'file2' ) task.setInput( new EnvInParam(binding, list).bind('z'), 'env' ) @@ -119,7 +119,7 @@ class TaskRunTest extends Specification { def task = new TaskRun() def list = [] - def x = new ValueInParam(binding, list).bind( new TokenVar('x') ) + def x = new ValueInParam(binding, list).bind( new LazyVar('x') ) def y = new FileInParam(binding, list).bind('y') task.setInput(x, 1) @@ -137,7 +137,7 @@ class TaskRunTest extends Specification { def task = new TaskRun() def list = [] - def x = new ValueInParam(binding, list).bind( new TokenVar('x') ) + def x = new ValueInParam(binding, list).bind( new LazyVar('x') ) def y = new FileInParam(binding, list).bind('y') def z = new FileInParam(binding, list).bind('z') @@ -159,7 +159,7 @@ class TaskRunTest extends Specification { def list = [] when: - def i1 = new ValueInParam(binding, list).bind( new TokenVar('x') ) + def i1 = new ValueInParam(binding, list).bind( new LazyVar('x') ) def o1 = new FileOutParam(binding,list).bind('file_out.alpha') def o2 = new ValueOutParam(binding,list).bind( 'x' ) def o3 = new FileOutParam(binding,list).bind('file_out.beta') @@ -203,7 +203,7 @@ class TaskRunTest extends Specification { * file with parametric name => true */ when: - def s3 = new FileOutParam(binding, list).bind( new TokenVar('y') ) + def s3 = new FileOutParam(binding, list).bind( new LazyVar('y') ) def task3 = new TaskRun() task3.setOutput(s3) then: @@ -687,8 +687,8 @@ class TaskRunTest extends Specification { def 'should return output env names' () { given: - def env1 = new EnvOutParam(new Binding(),[]).bind(new TokenVar('FOO')) - def env2 = new EnvOutParam(new Binding(),[]).bind(new TokenVar('BAR')) + def env1 = new EnvOutParam(new Binding(),[]).bind(new LazyVar('FOO')) + def env2 = new EnvOutParam(new Binding(),[]).bind(new LazyVar('BAR')) def task = new TaskRun() task.outputs.put(env1, null) task.outputs.put(env2, null) diff --git a/modules/nextflow/src/test/groovy/nextflow/scm/ProviderConfigTest.groovy b/modules/nextflow/src/test/groovy/nextflow/scm/ProviderConfigTest.groovy index bba86cbc9a..de861a7610 100644 --- a/modules/nextflow/src/test/groovy/nextflow/scm/ProviderConfigTest.groovy +++ 
b/modules/nextflow/src/test/groovy/nextflow/scm/ProviderConfigTest.groovy @@ -16,6 +16,8 @@ package nextflow.scm +import java.nio.file.Files + import spock.lang.Specification import spock.lang.Unroll @@ -242,4 +244,30 @@ class ProviderConfigTest extends Specification { 'paolo0758/nf-azure-repo/_git/nf-azure-repo' | 'https://dev.azure.com' | 'paolo0758/nf-azure-repo' } + def 'should get default config path' () { + given: + ProviderConfig.env.remove('NXF_SCM_FILE') + + when: + def path = ProviderConfig.getScmConfigPath() + then: + path.toString() == "${System.getProperty('user.home')}/.nextflow/scm" + + } + + def 'should get custom config path' () { + given: + def cfg = Files.createTempFile('test','config') + ProviderConfig.env.NXF_SCM_FILE = cfg.toString() + + when: + def path = ProviderConfig.getScmConfigPath() + then: + path.toString() == cfg.toString() + + cleanup: + ProviderConfig.env.remove('NXF_SCM_FILE') + cfg.delete() + } + } diff --git a/modules/nextflow/src/test/groovy/nextflow/script/ProcessConfigTest.groovy b/modules/nextflow/src/test/groovy/nextflow/script/ProcessConfigTest.groovy index 411baeb2ef..dd71e3f6ce 100644 --- a/modules/nextflow/src/test/groovy/nextflow/script/ProcessConfigTest.groovy +++ b/modules/nextflow/src/test/groovy/nextflow/script/ProcessConfigTest.groovy @@ -16,20 +16,11 @@ package nextflow.script -import java.nio.file.Files - -import nextflow.scm.ProviderConfig import spock.lang.Specification import spock.lang.Unroll -import nextflow.exception.IllegalDirectiveException import nextflow.processor.ErrorStrategy -import nextflow.script.params.FileInParam -import nextflow.script.params.StdInParam -import nextflow.script.params.StdOutParam -import nextflow.script.params.ValueInParam import nextflow.util.Duration -import nextflow.util.MemoryUnit import static nextflow.util.CacheHelper.HashMode /** * @@ -64,45 +55,12 @@ class ProcessConfigTest extends Specification { then: config.tag == 'val 1' - // setting list values - when: - config.tag 1,2,3 - then: - config.tag == [1,2,3] - - // setting named parameters attribute - when: - config.tag field1:'val1', field2: 'val2' - then: - config.tag == [field1:'val1', field2: 'val2'] - // generic value assigned like a 'plain' property when: config.tag = 99 then: config.tag == 99 - // maxDuration property - when: - config.time '1h' - then: - config.time == '1h' - config.createTaskConfig().time == new Duration('1h') - - // maxMemory property - when: - config.memory '2GB' - then: - config.memory == '2GB' - config.createTaskConfig().memory == new MemoryUnit('2GB') - - when: - config.stageInMode 'copy' - config.stageOutMode 'move' - then: - config.stageInMode == 'copy' - config.stageOutMode == 'move' - } @Unroll @@ -145,16 +103,6 @@ class ProcessConfigTest extends Specification { } - def 'should throw MissingPropertyException' () { - when: - def script = Mock(BaseScript) - def config = new ProcessConfig(script).throwExceptionOnMissingProperty(true) - def x = config.hola - - then: - thrown(MissingPropertyException) - } - def 'should check property existence' () { @@ -171,59 +119,6 @@ class ProcessConfigTest extends Specification { } - def 'should create input directives' () { - - setup: - def script = Mock(BaseScript) - def config = new ProcessConfig(script) - - when: - config._in_file([infile:'filename.fa']) - config._in_val('x').setFrom(1) - config._in_stdin() - - then: - config.getInputs().size() == 3 - - config.inputs.get(0) instanceof FileInParam - config.inputs.get(0).name == 'infile' - (config.inputs.get(0) as 
FileInParam).filePattern == 'filename.fa' - - config.inputs.get(1) instanceof ValueInParam - config.inputs.get(1).name == 'x' - - config.inputs.get(2).name == '-' - config.inputs.get(2) instanceof StdInParam - - config.inputs.names == [ 'infile', 'x', '-' ] - config.inputs.ofType( FileInParam ) == [ config.getInputs().get(0) ] - - } - - def 'should create output directives' () { - - setup: - def script = Mock(BaseScript) - def config = new ProcessConfig(script) - - when: - config._out_stdout() - config._out_file(new TokenVar('file1')).setInto('ch1') - config._out_file(new TokenVar('file2')).setInto('ch2') - config._out_file(new TokenVar('file3')).setInto('ch3') - - then: - config.outputs.size() == 4 - config.outputs.names == ['-', 'file1', 'file2', 'file3'] - config.outputs.ofType(StdOutParam).size() == 1 - - config.outputs[0] instanceof StdOutParam - config.outputs[1].name == 'file1' - config.outputs[2].name == 'file2' - config.outputs[3].name == 'file3' - - } - def 'should set cache attribute'() { @@ -273,542 +168,4 @@ class ProcessConfigTest extends Specification { } - def 'should create PublishDir object' () { - - setup: - BaseScript script = Mock(BaseScript) - ProcessConfig config - - when: - config = new ProcessConfig(script) - config.publishDir '/data' - then: - config.get('publishDir')[0] == [path:'/data'] - - when: - config = new ProcessConfig(script) - config.publishDir '/data', mode: 'link', pattern: '*.bam' - then: - config.get('publishDir')[0] == [path: '/data', mode: 'link', pattern: '*.bam'] - - when: - config = new ProcessConfig(script) - config.publishDir path: '/data', mode: 'link', pattern: '*.bam' - then: - config.get('publishDir')[0] == [path: '/data', mode: 'link', pattern: '*.bam'] - } - - def 'should throw InvalidDirectiveException'() { - - given: - def script = Mock(BaseScript) - def config = new ProcessConfig(script) - - when: - config.hello 'world' - - then: - def e = thrown(IllegalDirectiveException) - e.message == - ''' - Unknown process directive: `hello` - - Did you mean of these? 
- shell - ''' - .stripIndent().trim() - } - - def 'should set process secret'() { - when: - def config = new ProcessConfig([:]) - then: - config.getSecret() == [] - - when: - config.secret('foo') - then: - config.getSecret() == ['foo'] - - when: - config.secret('bar') - then: - config.secret == ['foo', 'bar'] - config.getSecret() == ['foo', 'bar'] - } - - def 'should set process labels'() { - when: - def config = new ProcessConfig([:]) - then: - config.getLabels() == [] - - when: - config.label('foo') - then: - config.getLabels() == ['foo'] - - when: - config.label('bar') - then: - config.getLabels() == ['foo','bar'] - } - - def 'should apply resource labels config' () { - given: - def config = new ProcessConfig(Mock(BaseScript)) - expect: - config.getResourceLabels() == [:] - - when: - config.resourceLabels([foo: 'one', bar: 'two']) - then: - config.getResourceLabels() == [foo: 'one', bar: 'two'] - - when: - config.resourceLabels([foo: 'new one', baz: 'three']) - then: - config.getResourceLabels() == [foo: 'new one', bar: 'two', baz: 'three'] - - } - - def 'should check a valid label' () { - - expect: - new ProcessConfig([:]).isValidLabel(lbl) == result - - where: - lbl | result - 'foo' | true - 'foo1' | true - '1foo' | false - '_foo' | false - 'foo1_' | false - 'foo_1' | true - 'foo-1' | false - 'foo.1' | false - 'a' | true - 'A' | true - '1' | false - '_' | false - 'a=b' | true - 'a=foo' | true - 'a=foo_1' | true - 'a=foo_' | false - '_=foo' | false - '=a' | false - 'a=' | false - 'a=1' | false - - } - - @Unroll - def 'should match selector: #SELECTOR with #TARGET' () { - expect: - ProcessConfig.matchesSelector(TARGET, SELECTOR) == EXPECTED - - where: - SELECTOR | TARGET | EXPECTED - 'foo' | 'foo' | true - 'foo' | 'bar' | false - '!foo' | 'bar' | true - 'a|b' | 'a' | true - 'a|b' | 'b' | true - 'a|b' | 'z' | false - 'a*' | 'a' | true - 'a*' | 'aaaa' | true - 'a*' | 'bbbb' | false - } - - def 'should apply config setting for a process label' () { - given: - def settings = [ - 'withLabel:short' : [ cpus: 1, time: '1h'], - 'withLabel:!short' : [ cpus: 32, queue: 'cn-long'], - 'withLabel:foo' : [ cpus: 2 ], - 'withLabel:foo|bar': [ disk: '100GB' ], - 'withLabel:gpu.+' : [ cpus: 4 ], - ] - - when: - def process = new ProcessConfig([:]) - process.applyConfigSelectorWithLabels(settings, ['short']) - then: - process.cpus == 1 - process.time == '1h' - process.size() == 2 - - when: - process = new ProcessConfig([:]) - process.applyConfigSelectorWithLabels(settings, ['long']) - then: - process.cpus == 32 - process.queue == 'cn-long' - process.size() == 2 - - when: - process = new ProcessConfig([:]) - process.applyConfigSelectorWithLabels(settings, ['foo']) - then: - process.cpus == 2 - process.disk == '100GB' - process.queue == 'cn-long' - process.size() == 3 - - when: - process = new ProcessConfig([:]) - process.applyConfigSelectorWithLabels(settings, ['bar']) - then: - process.cpus == 32 - process.disk == '100GB' - process.queue == 'cn-long' - process.size() == 3 - - when: - process = new ProcessConfig([:]) - process.applyConfigSelectorWithLabels(settings, ['gpu-1']) - then: - process.cpus == 4 - process.queue == 'cn-long' - process.size() == 2 - - } - - - def 'should apply config setting for a process name' () { - given: - def settings = [ - 'withName:alpha' : [ cpus: 1, time: '1h'], - 'withName:delta' : [ cpus: 2 ], - 'withName:delta|gamma' : [ disk: '100GB' ], - 'withName:omega.+' : [ cpus: 4 ], - ] - - when: - def process = new ProcessConfig([:]) - 
process.applyConfigSelectorWithName(settings, 'xx') - then: - process.size() == 0 - - when: - process = new ProcessConfig([:]) - process.applyConfigSelectorWithName(settings, 'alpha') - then: - process.cpus == 1 - process.time == '1h' - process.size() == 2 - - when: - process = new ProcessConfig([:]) - process.applyConfigSelectorWithName(settings, 'delta') - then: - process.cpus == 2 - process.disk == '100GB' - process.size() == 2 - - when: - process = new ProcessConfig([:]) - process.applyConfigSelectorWithName(settings, 'gamma') - then: - process.disk == '100GB' - process.size() == 1 - - when: - process = new ProcessConfig([:]) - process.applyConfigSelectorWithName(settings, 'omega_x') - then: - process.cpus == 4 - process.size() == 1 - } - - - def 'should apply config process defaults' () { - - when: - def process = new ProcessConfig(Mock(BaseScript)) - - // set process specific settings - process.queue = 'cn-el6' - process.memory = '10 GB' - - // apply config defaults - process.applyConfigDefaults( - queue: 'def-queue', - container: 'ubuntu:latest' - ) - - then: - process.queue == 'cn-el6' - process.container == 'ubuntu:latest' - process.memory == '10 GB' - process.cacheable == true - - - - when: - process = new ProcessConfig(Mock(BaseScript)) - // set process specific settings - process.container = null - // apply process defaults - process.applyConfigDefaults( - queue: 'def-queue', - container: 'ubuntu:latest', - maxRetries: 5 - ) - then: - process.queue == 'def-queue' - process.container == null - process.maxRetries == 5 - - - - when: - process = new ProcessConfig(Mock(BaseScript)) - // set process specific settings - process.maxRetries = 10 - // apply process defaults - process.applyConfigDefaults( - queue: 'def-queue', - container: 'ubuntu:latest', - maxRetries: 5 - ) - then: - process.queue == 'def-queue' - process.container == 'ubuntu:latest' - process.maxRetries == 10 - } - - - def 'should apply pod configs' () { - - when: - def process = new ProcessConfig([:]) - process.applyConfigDefaults( pod: [secret: 'foo', mountPath: '/there'] ) - then: - process.pod == [ - [secret: 'foo', mountPath: '/there'] - ] - - when: - process = new ProcessConfig([:]) - process.applyConfigDefaults( pod: [ - [secret: 'foo', mountPath: '/here'], - [secret: 'bar', mountPath: '/there'] - ] ) - - then: - process.pod == [ - [secret: 'foo', mountPath: '/here'], - [secret: 'bar', mountPath: '/there'] - ] - - } - - def 'should clone config object' () { - - given: - def config = new ProcessConfig(Mock(BaseScript)) - - when: - config.queue 'cn-el6' - config.container 'ubuntu:latest' - config.memory '10 GB' - config._in_val('foo') - config._in_file('sample.txt') - config._out_file('result.txt') - - then: - config.queue == 'cn-el6' - config.container == 'ubuntu:latest' - config.memory == '10 GB' - config.getInputs().size() == 2 - config.getOutputs().size() == 1 - - when: - def copy = config.clone() - copy.queue 'long' - copy.container 'debian:wheezy' - copy.memory '5 GB' - copy._in_val('bar') - copy._out_file('sample.bam') - - then: - copy.queue == 'long' - copy.container == 'debian:wheezy' - copy.memory == '5 GB' - copy.getInputs().size() == 3 - copy.getOutputs().size() == 2 - - // original config is not affected - config.queue == 'cn-el6' - config.container == 'ubuntu:latest' - config.memory == '10 GB' - config.getInputs().size() == 2 - config.getOutputs().size() == 1 - } - - def 'should apply accelerator config' () { - - given: - def process = new ProcessConfig(Mock(BaseScript)) - - when: - 
process.accelerator 5 - then: - process.accelerator == [limit: 5] - - when: - process.accelerator request: 1, limit: 5, type: 'nvida' - then: - process.accelerator == [request: 1, limit: 5, type: 'nvida'] - - when: - process.accelerator 5, type: 'nvida' - then: - process.accelerator == [limit: 5, type: 'nvida'] - - when: - process.accelerator 1, limit: 5 - then: - process.accelerator == [request: 1, limit:5] - - when: - process.accelerator 5, request: 1 - then: - process.accelerator == [request: 1, limit:5] - } - - def 'should apply disk config' () { - - given: - def process = new ProcessConfig(Mock(BaseScript)) - - when: - process.disk '100 GB' - then: - process.disk == [request: '100 GB'] - - when: - process.disk '375 GB', type: 'local-ssd' - then: - process.disk == [request: '375 GB', type: 'local-ssd'] - - when: - process.disk request: '375 GB', type: 'local-ssd' - then: - process.disk == [request: '375 GB', type: 'local-ssd'] - } - - def 'should apply architecture config' () { - - given: - def process = new ProcessConfig(Mock(BaseScript)) - - when: - process.arch 'linux/x86_64' - then: - process.arch == [name: 'linux/x86_64'] - - when: - process.arch 'linux/x86_64', target: 'zen3' - then: - process.arch == [name: 'linux/x86_64', target: 'zen3'] - - when: - process.arch name: 'linux/x86_64', target: 'zen3' - then: - process.arch == [name: 'linux/x86_64', target: 'zen3'] - } - - - def 'should get default config path' () { - given: - ProviderConfig.env.remove('NXF_SCM_FILE') - - when: - def path = ProviderConfig.getScmConfigPath() - then: - path.toString() == "${System.getProperty('user.home')}/.nextflow/scm" - - } - - def 'should get custom config path' () { - given: - def cfg = Files.createTempFile('test','config') - ProviderConfig.env.NXF_SCM_FILE = cfg.toString() - - when: - def path = ProviderConfig.getScmConfigPath() - then: - path.toString() == cfg.toString() - - cleanup: - ProviderConfig.env.remove('NXF_SCM_FILE') - cfg.delete() - } - - def 'should not apply config on negative label' () { - given: - def settings = [ - 'withLabel:foo': [ cpus: 2 ], - 'withLabel:!foo': [ cpus: 4 ], - 'withLabel:!nodisk_.*': [ disk: '100.GB'] - ] - - when: - def p1 = new ProcessConfig([label: ['foo', 'other']]) - p1.applyConfig(settings, "processName", null, null) - then: - p1.cpus == 2 - p1.disk == '100.GB' - - when: - def p2 = new ProcessConfig([label: ['foo', 'other', 'nodisk_label']]) - p2.applyConfig(settings, "processName", null, null) - then: - p2.cpus == 2 - !p2.disk - - when: - def p3 = new ProcessConfig([label: ['other', 'nodisk_label']]) - p3.applyConfig(settings, "processName", null, null) - then: - p3.cpus == 4 - !p3.disk - - } - - def 'should throw exception for invalid error strategy' () { - when: - def process1 = new ProcessConfig(Mock(BaseScript)) - process1.errorStrategy 'abort' - - then: - def e1 = thrown(IllegalArgumentException) - e1.message == "Unknown error strategy 'abort' ― Available strategies are: terminate,finish,ignore,retry" - - } - - def 'should not throw exception for valid error strategy or closure' () { - when: - def process1 = new ProcessConfig(Mock(BaseScript)) - process1.errorStrategy 'retry' - - then: - def e1 = noExceptionThrown() - - when: - def process2 = new ProcessConfig(Mock(BaseScript)) - process2.errorStrategy 'terminate' - - then: - def e2 = noExceptionThrown() - - when: - def process3 = new ProcessConfig(Mock(BaseScript)) - process3.errorStrategy { task.exitStatus==14 ? 
'retry' : 'terminate' } - - then: - def e3 = noExceptionThrown() - } } diff --git a/modules/nextflow/src/test/groovy/nextflow/script/ProcessDefTest.groovy b/modules/nextflow/src/test/groovy/nextflow/script/ProcessDefTest.groovy index 6b8fc4805a..131d60786d 100644 --- a/modules/nextflow/src/test/groovy/nextflow/script/ProcessDefTest.groovy +++ b/modules/nextflow/src/test/groovy/nextflow/script/ProcessDefTest.groovy @@ -12,8 +12,8 @@ class ProcessDefTest extends Specification { given: def OWNER = Mock(BaseScript) - def BODY = { -> null } - def proc = new ProcessDef(OWNER, BODY, 'foo') + def BODY = new BodyDef({->}, 'echo hello') + def proc = new ProcessDef(OWNER, 'foo', BODY, new ProcessConfig([:])) when: def copy = proc.cloneWithName('foo_alias') @@ -22,8 +22,8 @@ class ProcessDefTest extends Specification { copy.getSimpleName() == 'foo_alias' copy.getBaseName() == 'foo' copy.getOwner() == OWNER - copy.rawBody.class == BODY.class - !copy.rawBody.is(BODY) + copy.taskBody.class == BODY.class + !copy.taskBody.is(BODY) when: copy = proc.cloneWithName('flow1:flow2:foo') @@ -32,8 +32,8 @@ class ProcessDefTest extends Specification { copy.getSimpleName() == 'foo' copy.getBaseName() == 'foo' copy.getOwner() == OWNER - copy.rawBody.class == BODY.class - !copy.rawBody.is(BODY) + copy.taskBody.class == BODY.class + !copy.taskBody.is(BODY) } def 'should apply process config' () { @@ -47,11 +47,11 @@ class ProcessDefTest extends Specification { 'withName:flow1:flow2:flow3:bar': [memory: '8GB'] ] ] - def BODY = {-> - return new BodyDef({->}, 'echo hello') + def BODY = new BodyDef({->}, 'echo hello') + def proc = new ProcessDef(OWNER, 'foo', BODY, new ProcessConfig([:])) + proc.session = Mock(Session) { + getConfig() >> CONFIG } - def proc = new ProcessDef(OWNER, BODY, 'foo') - proc.session = Mock(Session) { getConfig() >> CONFIG } when: def copy = proc.clone() diff --git a/modules/nextflow/src/test/groovy/nextflow/script/ScriptMetaTest.groovy b/modules/nextflow/src/test/groovy/nextflow/script/ScriptMetaTest.groovy index 0a9a9c8c25..eb0ab70eba 100644 --- a/modules/nextflow/src/test/groovy/nextflow/script/ScriptMetaTest.groovy +++ b/modules/nextflow/src/test/groovy/nextflow/script/ScriptMetaTest.groovy @@ -45,8 +45,8 @@ class ScriptMetaTest extends Dsl2Spec { given: def script = new FooScript(new ScriptBinding()) - def proc1 = new ProcessDef(script, Mock(Closure), 'proc1') - def proc2 = new ProcessDef(script, Mock(Closure), 'proc2') + def proc1 = new ProcessDef(script, 'proc1', Mock(BodyDef), new ProcessConfig([:])) + def proc2 = new ProcessDef(script, 'proc2', Mock(BodyDef), new ProcessConfig([:])) def func1 = new FunctionDef(name: 'func1', alias: 'func1') def work1 = new WorkflowDef(name:'work1') @@ -83,19 +83,19 @@ class ScriptMetaTest extends Dsl2Spec { // defs in the root script def func1 = new FunctionDef(name: 'func1', alias: 'func1') - def proc1 = new ProcessDef(script1, Mock(Closure), 'proc1') + def proc1 = new ProcessDef(script1, 'proc1', Mock(BodyDef), new ProcessConfig([:])) def work1 = new WorkflowDef(name:'work1') meta1.addDefinition(proc1, func1, work1) // defs in the second script imported in the root namespace def func2 = new FunctionDef(name: 'func2', alias: 'func2') - def proc2 = new ProcessDef(script2, Mock(Closure), 'proc2') + def proc2 = new ProcessDef(script2, 'proc2', Mock(BodyDef), new ProcessConfig([:])) def work2 = new WorkflowDef(name:'work2') meta2.addDefinition(proc2, func2, work2) // defs in the third script imported in a separate namespace def func3 = new FunctionDef(name: 
'func3', alias: 'func3') - def proc3 = new ProcessDef(script2, Mock(Closure), 'proc3') + def proc3 = new ProcessDef(script2, 'proc3', Mock(BodyDef), new ProcessConfig([:])) def work3 = new WorkflowDef(name:'work3') meta3.addDefinition(proc3, func3, work3) @@ -205,7 +205,7 @@ class ScriptMetaTest extends Dsl2Spec { // import module into main script def func2 = new FunctionDef(name: 'func1', alias: 'func1') - def proc2 = new ProcessDef(script2, Mock(Closure), 'proc1') + def proc2 = new ProcessDef(script2, 'proc1', Mock(BodyDef), new ProcessConfig([:])) def work2 = new WorkflowDef(name: 'work1') meta2.addDefinition(proc2, func2, work2) @@ -215,7 +215,7 @@ class ScriptMetaTest extends Dsl2Spec { // attempt to define duplicate components in main script def func1 = new FunctionDef(name: 'func1', alias: 'func1') - def proc1 = new ProcessDef(script1, Mock(Closure), 'proc1') + def proc1 = new ProcessDef(script1, 'proc1', Mock(BodyDef), new ProcessConfig([:])) def work1 = new WorkflowDef(name: 'work1') when: diff --git a/modules/nextflow/src/test/groovy/nextflow/script/dsl/ProcessBuilderTest.groovy b/modules/nextflow/src/test/groovy/nextflow/script/dsl/ProcessBuilderTest.groovy new file mode 100644 index 0000000000..ff2682e7ee --- /dev/null +++ b/modules/nextflow/src/test/groovy/nextflow/script/dsl/ProcessBuilderTest.groovy @@ -0,0 +1,640 @@ +/* + * Copyright 2013-2024, Seqera Labs + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package nextflow.script.dsl + +import spock.lang.Specification +import spock.lang.Unroll + +import nextflow.exception.IllegalDirectiveException +import nextflow.script.params.FileInParam +import nextflow.script.params.StdInParam +import nextflow.script.params.StdOutParam +import nextflow.script.params.ValueInParam +import nextflow.script.BaseScript +import nextflow.script.LazyVar +import nextflow.script.ProcessConfig +import nextflow.util.Duration +import nextflow.util.MemoryUnit +/** + * + * @author Paolo Di Tommaso + */ +class ProcessBuilderTest extends Specification { + + def 'should set directives' () { + + setup: + def builder = new ProcessBuilder(Mock(BaseScript), null) + def config = builder.getConfig() + + // setting list values + when: + builder.tag 1,2,3 + then: + config.tag == [1,2,3] + + // setting named parameters attribute + when: + builder.tag field1:'val1', field2: 'val2' + then: + config.tag == [field1:'val1', field2: 'val2'] + + // maxDuration property + when: + builder.time '1h' + then: + config.time == '1h' + config.createTaskConfig().time == new Duration('1h') + + // maxMemory property + when: + builder.memory '2GB' + then: + config.memory == '2GB' + config.createTaskConfig().memory == new MemoryUnit('2GB') + + when: + builder.stageInMode 'copy' + builder.stageOutMode 'move' + then: + config.stageInMode == 'copy' + config.stageOutMode == 'move' + + } + + + def 'should create input directives' () { + + setup: + def builder = new ProcessBuilder(Mock(BaseScript), null) + def config = builder.getConfig() + + when: + builder._in_file([infile:'filename.fa']) + builder._in_val('x').setFrom(1) + builder._in_stdin() + + then: + config.getInputs().size() == 3 + + config.inputs.get(0) instanceof FileInParam + config.inputs.get(0).name == 'infile' + (config.inputs.get(0) as FileInParam).filePattern == 'filename.fa' + + config.inputs.get(1) instanceof ValueInParam + config.inputs.get(1).name == 'x' + + config.inputs.get(2).name == '-' + config.inputs.get(2) instanceof StdInParam + + config.inputs.names == [ 'infile', 'x', '-' ] + config.inputs.ofType( FileInParam ) == [ config.getInputs().get(0) ] + + } + + def 'should create output directives' () { + + setup: + def builder = new ProcessBuilder(Mock(BaseScript), null) + def config = builder.getConfig() + + when: + builder._out_stdout() + builder._out_file(new LazyVar('file1')).setInto('ch1') + builder._out_file(new LazyVar('file2')).setInto('ch2') + builder._out_file(new LazyVar('file3')).setInto('ch3') + + then: + config.outputs.size() == 4 + config.outputs.names == ['-', 'file1', 'file2', 'file3'] + config.outputs.ofType(StdOutParam).size() == 1 + + config.outputs[0] instanceof StdOutParam + config.outputs[1].name == 'file1' + config.outputs[2].name == 'file2' + config.outputs[3].name == 'file3' + + } + + def 'should create PublishDir object' () { + + setup: + def builder = new ProcessBuilder(Mock(BaseScript), null) + def config = builder.getConfig() + + when: + builder.publishDir '/data' + then: + config.get('publishDir').last() == [path:'/data'] + + when: + builder.publishDir '/data', mode: 'link', pattern: '*.bam' + then: + config.get('publishDir').last() == [path: '/data', mode: 'link', pattern: '*.bam'] + + when: + builder.publishDir path: '/data', mode: 'link', pattern: '*.bam' + then: + config.get('publishDir').last() == [path: '/data', mode: 'link', pattern: '*.bam'] + } + + def 'should throw IllegalDirectiveException'() { + + given: + def builder = new ProcessBuilder(Mock(BaseScript), null) + + when: + 
builder.hello 'world' + + then: + def e = thrown(IllegalDirectiveException) + e.message == + ''' + Unknown process directive: `hello` + + Did you mean one of these? + shell + ''' + .stripIndent().trim() + } + + def 'should set process secret'() { + when: + def builder = new ProcessBuilder(Mock(BaseScript), null) + def config = builder.getConfig() + then: + config.getSecret() == [] + + when: + builder.secret 'foo' + then: + config.getSecret() == ['foo'] + + when: + builder.secret 'bar' + then: + config.secret == ['foo', 'bar'] + config.getSecret() == ['foo', 'bar'] + } + + def 'should set process labels'() { + when: + def builder = new ProcessBuilder(Mock(BaseScript), null) + def config = builder.getConfig() + then: + config.getLabels() == [] + + when: + builder.label 'foo' + then: + config.getLabels() == ['foo'] + + when: + builder.label 'bar' + then: + config.getLabels() == ['foo','bar'] + } + + def 'should apply resource labels config' () { + given: + def builder = new ProcessBuilder(Mock(BaseScript), null) + def config = builder.getConfig() + expect: + config.getResourceLabels() == [:] + + when: + builder.resourceLabels foo: 'one', bar: 'two' + then: + config.getResourceLabels() == [foo: 'one', bar: 'two'] + + when: + builder.resourceLabels foo: 'new one', baz: 'three' + then: + config.getResourceLabels() == [foo: 'new one', bar: 'two', baz: 'three'] + + } + + def 'should check a valid label' () { + + expect: + ProcessBuilder.isValidLabel(lbl) == result + + where: + lbl | result + 'foo' | true + 'foo1' | true + '1foo' | false + '_foo' | false + 'foo1_' | false + 'foo_1' | true + 'foo-1' | false + 'foo.1' | false + 'a' | true + 'A' | true + '1' | false + '_' | false + 'a=b' | true + 'a=foo' | true + 'a=foo_1' | true + 'a=foo_' | false + '_=foo' | false + '=a' | false + 'a=' | false + 'a=1' | false + + } + + @Unroll + def 'should match selector: #SELECTOR with #TARGET' () { + expect: + ProcessBuilder.matchesSelector(TARGET, SELECTOR) == EXPECTED + + where: + SELECTOR | TARGET | EXPECTED + 'foo' | 'foo' | true + 'foo' | 'bar' | false + '!foo' | 'bar' | true + 'a|b' | 'a' | true + 'a|b' | 'b' | true + 'a|b' | 'z' | false + 'a*' | 'a' | true + 'a*' | 'aaaa' | true + 'a*' | 'bbbb' | false + } + + def 'should apply config setting for a process label' () { + given: + def settings = [ + 'withLabel:short' : [ cpus: 1, time: '1h'], + 'withLabel:!short' : [ cpus: 32, queue: 'cn-long'], + 'withLabel:foo' : [ cpus: 2 ], + 'withLabel:foo|bar': [ disk: '100GB' ], + 'withLabel:gpu.+' : [ cpus: 4 ], + ] + + when: + def config = new ProcessConfig([:]) + new ProcessBuilder(config).applyConfigSelectorWithLabels(settings, ['short']) + then: + config.cpus == 1 + config.time == '1h' + config.size() == 2 + + when: + config = new ProcessConfig([:]) + new ProcessBuilder(config).applyConfigSelectorWithLabels(settings, ['long']) + then: + config.cpus == 32 + config.queue == 'cn-long' + config.size() == 2 + + when: + config = new ProcessConfig([:]) + new ProcessBuilder(config).applyConfigSelectorWithLabels(settings, ['foo']) + then: + config.cpus == 2 + config.disk == '100GB' + config.queue == 'cn-long' + config.size() == 3 + + when: + config = new ProcessConfig([:]) + new ProcessBuilder(config).applyConfigSelectorWithLabels(settings, ['bar']) + then: + config.cpus == 32 + config.disk == '100GB' + config.queue == 'cn-long' + config.size() == 3 + + when: + config = new ProcessConfig([:]) + new ProcessBuilder(config).applyConfigSelectorWithLabels(settings, ['gpu-1']) + then: + config.cpus == 4 + config.queue == 
'cn-long' + config.size() == 2 + + } + + + def 'should apply config setting for a process name' () { + given: + def settings = [ + 'withName:alpha' : [ cpus: 1, time: '1h'], + 'withName:delta' : [ cpus: 2 ], + 'withName:delta|gamma' : [ disk: '100GB' ], + 'withName:omega.+' : [ cpus: 4 ], + ] + + when: + def config = new ProcessConfig([:]) + new ProcessBuilder(config).applyConfigSelectorWithName(settings, 'xx') + then: + config.size() == 0 + + when: + config = new ProcessConfig([:]) + new ProcessBuilder(config).applyConfigSelectorWithName(settings, 'alpha') + then: + config.cpus == 1 + config.time == '1h' + config.size() == 2 + + when: + config = new ProcessConfig([:]) + new ProcessBuilder(config).applyConfigSelectorWithName(settings, 'delta') + then: + config.cpus == 2 + config.disk == '100GB' + config.size() == 2 + + when: + config = new ProcessConfig([:]) + new ProcessBuilder(config).applyConfigSelectorWithName(settings, 'gamma') + then: + config.disk == '100GB' + config.size() == 1 + + when: + config = new ProcessConfig([:]) + new ProcessBuilder(config).applyConfigSelectorWithName(settings, 'omega_x') + then: + config.cpus == 4 + config.size() == 1 + } + + + def 'should apply config process defaults' () { + + when: + def builder = new ProcessBuilder(Mock(BaseScript), null) + builder.queue 'cn-el6' + builder.memory '10 GB' + builder.applyConfigDefaults( + queue: 'def-queue', + container: 'ubuntu:latest' + ) + def config = builder.getConfig() + + then: + config.queue == 'cn-el6' + config.container == 'ubuntu:latest' + config.memory == '10 GB' + config.cacheable == true + + + + when: + builder = new ProcessBuilder(Mock(BaseScript), null) + builder.container null + builder.applyConfigDefaults( + queue: 'def-queue', + container: 'ubuntu:latest', + maxRetries: 5 + ) + config = builder.getConfig() + then: + config.queue == 'def-queue' + config.container == null + config.maxRetries == 5 + + + + when: + builder = new ProcessBuilder(Mock(BaseScript), null) + builder.maxRetries 10 + builder.applyConfigDefaults( + queue: 'def-queue', + container: 'ubuntu:latest', + maxRetries: 5 + ) + config = builder.getConfig() + then: + config.queue == 'def-queue' + config.container == 'ubuntu:latest' + config.maxRetries == 10 + } + + + def 'should apply pod configs' () { + + when: + def builder = new ProcessBuilder(Mock(BaseScript), null) + builder.applyConfigDefaults( pod: [secret: 'foo', mountPath: '/there'] ) + then: + builder.getConfig().pod == [ + [secret: 'foo', mountPath: '/there'] + ] + + when: + builder = new ProcessBuilder(Mock(BaseScript), null) + builder.applyConfigDefaults( pod: [ + [secret: 'foo', mountPath: '/here'], + [secret: 'bar', mountPath: '/there'] + ] ) + then: + builder.getConfig().pod == [ + [secret: 'foo', mountPath: '/here'], + [secret: 'bar', mountPath: '/there'] + ] + + } + + def 'should clone config object' () { + + when: + def builder = new ProcessBuilder(Mock(BaseScript), null) + def config = builder.getConfig() + builder.queue 'cn-el6' + builder.container 'ubuntu:latest' + builder.memory '10 GB' + builder._in_val('foo') + builder._in_file('sample.txt') + builder._out_file('result.txt') + + then: + config.queue == 'cn-el6' + config.container == 'ubuntu:latest' + config.memory == '10 GB' + config.getInputs().size() == 2 + config.getOutputs().size() == 1 + + when: + def copy = config.clone() + builder = new ProcessBuilder(copy) + builder.queue 'long' + builder.container 'debian:wheezy' + builder.memory '5 GB' + builder._in_val('bar') + builder._out_file('sample.bam') + + then: + 
copy.queue == 'long' + copy.container == 'debian:wheezy' + copy.memory == '5 GB' + copy.getInputs().size() == 3 + copy.getOutputs().size() == 2 + + // original config is not affected + config.queue == 'cn-el6' + config.container == 'ubuntu:latest' + config.memory == '10 GB' + config.getInputs().size() == 2 + config.getOutputs().size() == 1 + } + + def 'should apply accelerator config' () { + + given: + def builder = new ProcessBuilder(Mock(BaseScript), null) + def config = builder.getConfig() + + when: + builder.accelerator 5 + then: + config.accelerator == [limit: 5] + + when: + builder.accelerator request: 1, limit: 5, type: 'nvida' + then: + config.accelerator == [request: 1, limit: 5, type: 'nvida'] + + when: + builder.accelerator 5, type: 'nvida' + then: + config.accelerator == [limit: 5, type: 'nvida'] + + when: + builder.accelerator 1, limit: 5 + then: + config.accelerator == [request: 1, limit:5] + + when: + builder.accelerator 5, request: 1 + then: + config.accelerator == [request: 1, limit:5] + } + + def 'should apply disk config' () { + + given: + def builder = new ProcessBuilder(Mock(BaseScript), null) + def config = builder.getConfig() + + when: + builder.disk '100 GB' + then: + config.disk == [request: '100 GB'] + + when: + builder.disk '375 GB', type: 'local-ssd' + then: + config.disk == [request: '375 GB', type: 'local-ssd'] + + when: + builder.disk request: '375 GB', type: 'local-ssd' + then: + config.disk == [request: '375 GB', type: 'local-ssd'] + } + + def 'should apply architecture config' () { + + given: + def builder = new ProcessBuilder(Mock(BaseScript), null) + def config = builder.getConfig() + + when: + builder.arch 'linux/x86_64' + then: + config.arch == [name: 'linux/x86_64'] + + when: + builder.arch 'linux/x86_64', target: 'zen3' + then: + config.arch == [name: 'linux/x86_64', target: 'zen3'] + + when: + builder.arch name: 'linux/x86_64', target: 'zen3' + then: + config.arch == [name: 'linux/x86_64', target: 'zen3'] + } + + def 'should not apply config on negative label' () { + given: + def settings = [ + 'withLabel:foo': [ cpus: 2 ], + 'withLabel:!foo': [ cpus: 4 ], + 'withLabel:!nodisk_.*': [ disk: '100.GB'] + ] + + when: + def config = new ProcessConfig(label: ['foo', 'other']) + new ProcessConfigBuilder(config).applyConfig(settings, "processName", null, null) + then: + config.cpus == 2 + config.disk == '100.GB' + + when: + config = new ProcessConfig(label: ['foo', 'other', 'nodisk_label']) + new ProcessConfigBuilder(config).applyConfig(settings, "processName", null, null) + then: + config.cpus == 2 + !config.disk + + when: + config = new ProcessConfig(label: ['other', 'nodisk_label']) + new ProcessConfigBuilder(config).applyConfig(settings, "processName", null, null) + then: + config.cpus == 4 + !config.disk + + } + + def 'should throw exception for invalid error strategy' () { + when: + def builder = new ProcessBuilder(Mock(BaseScript), null) + builder.errorStrategy 'abort' + + then: + def e = thrown(IllegalArgumentException) + e.message == "Unknown error strategy 'abort' ― Available strategies are: terminate,finish,ignore,retry" + + } + + def 'should not throw exception for valid error strategy or closure' () { + when: + def builder = new ProcessBuilder(Mock(BaseScript), null) + builder.errorStrategy 'retry' + + then: + noExceptionThrown() + + when: + builder = new ProcessBuilder(Mock(BaseScript), null) + builder.errorStrategy 'terminate' + + then: + noExceptionThrown() + + when: + builder = new ProcessBuilder(Mock(BaseScript), null) + 
builder.errorStrategy { task.exitStatus==14 ? 'retry' : 'terminate' } + + then: + noExceptionThrown() + } +} diff --git a/modules/nextflow/src/test/groovy/nextflow/script/params/CmdEvalParamTest.groovy b/modules/nextflow/src/test/groovy/nextflow/script/params/CmdEvalParamTest.groovy index 39cbc26322..b591c384fd 100644 --- a/modules/nextflow/src/test/groovy/nextflow/script/params/CmdEvalParamTest.groovy +++ b/modules/nextflow/src/test/groovy/nextflow/script/params/CmdEvalParamTest.groovy @@ -1,5 +1,5 @@ /* - * Copyright 2013-2023, Seqera Labs + * Copyright 2013-2024, Seqera Labs * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. diff --git a/modules/nextflow/src/test/groovy/nextflow/script/params/ParamsInTest.groovy b/modules/nextflow/src/test/groovy/nextflow/script/params/ParamsInTest.groovy index 9f3e412d85..b7c8270e63 100644 --- a/modules/nextflow/src/test/groovy/nextflow/script/params/ParamsInTest.groovy +++ b/modules/nextflow/src/test/groovy/nextflow/script/params/ParamsInTest.groovy @@ -24,6 +24,7 @@ import groovyx.gpars.dataflow.DataflowQueue import groovyx.gpars.dataflow.DataflowVariable import nextflow.Channel import nextflow.exception.ScriptRuntimeException +import nextflow.script.PathArityAware import nextflow.processor.TaskProcessor import spock.lang.Timeout import test.Dsl2Spec @@ -631,7 +632,7 @@ class ParamsInTest extends Dsl2Spec { in0.inChannel instanceof DataflowVariable in0.inChannel.val == ['aaa'] in0.inner.name == 'x' - in0.inner.owner == in0 + in0.inner.owner.toString() == in0.toString() in1.class == EachInParam in1.name == '__$eachinparam<1>' @@ -639,14 +640,14 @@ class ParamsInTest extends Dsl2Spec { in1.inChannel.val == [1,2] in1.inner.name == 'p' in1.inner instanceof ValueInParam - in1.inner.owner == in1 + in1.inner.owner.toString() == in1.toString() in2.class == EachInParam in2.name == '__$eachinparam<2>' in2.inChannel.val == [1,2,3] in2.inner instanceof ValueInParam in2.inner.name == 'z' - in2.inner.owner == in2 + in2.inner.owner.toString() == in2.toString() in3.class == EachInParam in3.name == '__$eachinparam<3>' @@ -654,7 +655,7 @@ class ParamsInTest extends Dsl2Spec { in3.inChannel.val == ['file-a.txt'] in3.inner instanceof FileInParam in3.inner.name == 'foo' - in3.inner.owner == in3 + in3.inner.owner.toString() == in3.toString() in4.class == EachInParam in4.name == '__$eachinparam<4>' @@ -663,7 +664,7 @@ class ParamsInTest extends Dsl2Spec { in4.inner instanceof FileInParam in4.inner.name == 'bar' in4.inner.filePattern == 'bar' - in4.inner.owner == in4 + in4.inner.owner.toString() == in4.toString() } @@ -738,21 +739,21 @@ class ParamsInTest extends Dsl2Spec { in0.inChannel.val == FILE in0.index == 0 in0.isPathQualifier() - in0.arity == new ArityParam.Range(1, 1) + in0.arity == new PathArityAware.Range(1, 1) in1.name == 'f1' in1.filePattern == '*' in1.inChannel.val == FILE in1.index == 1 in1.isPathQualifier() - in1.arity == new ArityParam.Range(1, 2) + in1.arity == new PathArityAware.Range(1, 2) in2.name == '*.fa' in2.filePattern == '*.fa' in2.inChannel.val == FILE in2.index == 2 in2.isPathQualifier() - in2.arity == new ArityParam.Range(1, Integer.MAX_VALUE) + in2.arity == new PathArityAware.Range(1, Integer.MAX_VALUE) in3.name == 'file.txt' in3.filePattern == 'file.txt' @@ -940,19 +941,19 @@ class ParamsInTest extends Dsl2Spec { in0.inChannel instanceof DataflowVariable in0.inChannel.val == ['file-a.txt'] in0.inner instanceof FileInParam - (in0.inner as FileInParam).name == 
'foo' - (in0.inner as FileInParam).owner == in0 - (in0.inner as FileInParam).isPathQualifier() + in0.inner.name == 'foo' + in0.inner.owner.toString() == in0.toString() + in0.inner.isPathQualifier() in1.class == EachInParam in1.name == '__$eachinparam<1>' in1.inChannel instanceof DataflowVariable in1.inChannel.val == ['file-x.fa'] in1.inner instanceof FileInParam - (in1.inner as FileInParam).name == 'bar' - (in1.inner as FileInParam).filePattern == 'bar' - (in1.inner as FileInParam).owner == in1 - (in1.inner as FileInParam).isPathQualifier() + in1.inner.name == 'bar' + in1.inner.filePattern == 'bar' + in1.inner.owner.toString() == in1.toString() + in1.inner.isPathQualifier() } diff --git a/modules/nextflow/src/test/groovy/nextflow/script/params/ParamsOutTest.groovy b/modules/nextflow/src/test/groovy/nextflow/script/params/ParamsOutTest.groovy index ed18f2faf9..b7d0a69d5b 100644 --- a/modules/nextflow/src/test/groovy/nextflow/script/params/ParamsOutTest.groovy +++ b/modules/nextflow/src/test/groovy/nextflow/script/params/ParamsOutTest.groovy @@ -22,7 +22,8 @@ import java.nio.file.Path import groovyx.gpars.dataflow.DataflowVariable import nextflow.processor.TaskContext -import nextflow.script.TokenVar +import nextflow.script.LazyVar +import nextflow.script.PathArityAware import nextflow.util.BlankSeparatedList import test.Dsl2Spec /** @@ -658,7 +659,7 @@ class ParamsOutTest extends Dsl2Spec { * val x */ when: - param.target = new TokenVar('x') + param.target = new LazyVar('x') then: param.resolve(createTaskContext([x:'foo'])) == 'foo' @@ -966,7 +967,7 @@ class ParamsOutTest extends Dsl2Spec { !out0.getGlob() !out0.getOptional() !out0.getIncludeInputs() - out0.getArity() == new ArityParam.Range(1, 1) + out0.getArity() == new PathArityAware.Range(1, 1) and: out1.getMaxDepth() == 5 @@ -977,7 +978,7 @@ class ParamsOutTest extends Dsl2Spec { out1.getGlob() out1.getOptional() out1.getIncludeInputs() - out1.getArity() == new ArityParam.Range(0, Integer.MAX_VALUE) + out1.getArity() == new PathArityAware.Range(0, Integer.MAX_VALUE) } def 'should set file options' () { diff --git a/modules/nextflow/src/testFixtures/groovy/test/TestParser.groovy b/modules/nextflow/src/testFixtures/groovy/test/TestParser.groovy index 525917f43f..53a01fa810 100644 --- a/modules/nextflow/src/testFixtures/groovy/test/TestParser.groovy +++ b/modules/nextflow/src/testFixtures/groovy/test/TestParser.groovy @@ -80,14 +80,6 @@ class TestParser { @InheritConstructors static class TestTaskProcessor extends TaskProcessor { - @Override - def run () { - // this is needed to mimic the out channels normalisation - // made by the real 'run' method - check the superclass - if ( config.getOutputs().size() == 0 ) { - config.fakeOutput() - } - } } @InheritConstructors diff --git a/modules/nf-commons/src/main/nextflow/io/ValueObject.groovy b/modules/nf-commons/src/main/nextflow/io/ValueObject.groovy index e04357acc7..7400d82166 100644 --- a/modules/nf-commons/src/main/nextflow/io/ValueObject.groovy +++ b/modules/nf-commons/src/main/nextflow/io/ValueObject.groovy @@ -32,7 +32,7 @@ import groovy.transform.Immutable * @author Paolo Di Tommaso */ @AutoClone -@Immutable(copyWith=true) +@Immutable(copyWith=true, knownImmutableClasses=[java.nio.file.Path]) @SerializableObject @AnnotationCollector(mode = AnnotationCollectorMode.PREFER_EXPLICIT_MERGED) @Retention(RetentionPolicy.RUNTIME) diff --git a/tests/blast-parallel-dsl2.nf b/tests/blast-parallel-dsl2.nf index a2b3addbd7..9220684d2a 100644 --- a/tests/blast-parallel-dsl2.nf +++ 
b/tests/blast-parallel-dsl2.nf @@ -14,7 +14,7 @@ process blast { path 'query.fa' output: - path top_hits + path 'top_hits' """ blastp -db ${db} -query query.fa -outfmt 6 > blast_result @@ -27,7 +27,7 @@ process blast { */ process extract { input: - path top_hits + path 'top_hits' output: path 'sequences' diff --git a/tests/collect_and_merge.nf b/tests/collect_and_merge.nf index b77a742373..ee4120b9c1 100644 --- a/tests/collect_and_merge.nf +++ b/tests/collect_and_merge.nf @@ -27,7 +27,7 @@ process algn { each seq_id output: - tuple val(barcode), val(seq_id), file('bam'), file('bai') + tuple val(barcode), val(seq_id), path('bam'), path('bai') """ echo BAM $seq_id - $barcode > bam @@ -44,7 +44,7 @@ process merge { debug true input: - tuple val(barcode), val(seq_id), file(bam: 'bam?'), file(bai: 'bai?') + tuple val(barcode), val(seq_id), path(bam, stageAs: 'bam?'), path(bai, stageAs: 'bai?') """ echo barcode: $barcode diff --git a/tests/singleton.nf b/tests/singleton.nf index 70f4bde70a..3775d290e2 100644 --- a/tests/singleton.nf +++ b/tests/singleton.nf @@ -17,7 +17,7 @@ process foo { output: - file x + path 'x' ''' echo -n Hello > x @@ -26,7 +26,7 @@ process foo { process bar { input: - file x + path x val y """ diff --git a/tests/workdir-with-blank.nf b/tests/workdir-with-blank.nf index bfb6b99121..874716c5d2 100644 --- a/tests/workdir-with-blank.nf +++ b/tests/workdir-with-blank.nf @@ -20,7 +20,7 @@ process foo { each x output: - file result_data + path 'result_data' """ echo Hello $x > result_data