scala - Reproduce task dependencies in actual practice


The sbt task documentation shows an example of using task dependencies. It is simple and artificial, but it works! I reproduced it in project/Build.scala without problems.

Note that I chose the global scope so that the tasks are available in any project and any configuration:

import sbt._
import Keys._

object TestBuild extends Build {
  lazy val sampleTask = taskKey[Int]("A sample task")
  lazy val intTask = taskKey[Int]("An int task")

  override lazy val settings = super.settings ++ Seq(
    intTask := 1 + 2,
    sampleTask := intTask.value + 1
  )
}

Now I'm trying to do something useful and enrich the existing sbt key definitions with a task that collects the names of the compiled classes:

import sbt._
import Keys._
import sbt.inc.Analysis
import xsbti.api.ClassLike
import xsbt.api.Discovery.{isConcrete, isPublic}

object TestBuild extends Build {
  lazy val debugAPIs = taskKey[List[String]]("List of top-level definitions")

  override lazy val settings = super.settings ++ Seq(
    debugAPIs := getAllTop( compile.value )
  )

  private def getAllTop(analysis : Analysis) : List[String] =
    Tests.allDefs(analysis).toList collect {
      case c : ClassLike if isConcrete(c) && isPublic(c) => c.name
    }
}

Now I get an error from sbt:

Reference to undefined setting:

  {.}/*:compile from {.}/*:debugAPIs (/home/sbt/project/Build.scala:11)

So I have two questions:

  • How should I define the debugAPIs task so that it is available for all projects and configurations?
  • How can I reproduce this error in a synthetic configuration?

I'm actually more interested in the second question. I want a deep understanding of how sbt works, because I'd like to write a plugin for it.

The problem is that you try to access a key value without a proper scope.

The documentation gives us a hint here:

By default, the keys associated with compiling, packaging, and running are scoped to a configuration and therefore may work differently in each configuration. The most obvious examples are the task keys compile, package, and run; but all the keys which affect those keys (such as source-directories or scalac-options or full-classpath) are also scoped to the configuration.
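To make that concrete, here is a minimal sketch (not part of the original question; the countSources key is made up for illustration) of one key resolving to different values depending on the configuration axis:

import sbt._
import Keys._

object ScopingSketch extends Build {
  // sources is a standard sbt key that holds different values in the
  // Compile and Test configurations of the same project.
  lazy val countSources = taskKey[(Int, Int)]("Number of main and test sources")

  lazy val root = project.in(file(".")).settings(
    countSources := ((sources in Compile).value.size, (sources in Test).value.size)
  )
}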

Let's first focus on a simple example. Maybe it doesn't make much sense, but it illustrates the problem well. Let's assume we want to redefine the compile task itself:

override lazy val settings = super.settings ++ Seq(
  compile := { compile.value }
)

Running this in sbt will give an error, more or less like this:

[error]   {.}/*:compile from {.}/*:compile (/tmp/q-23723818/project/Build.scala:12)
[error]      Did you mean compile:compile ?

We didn't specify the scope, so sbt picked the defaults. The project was set to ThisBuild (meaning no specific project) and the configuration was set to Global. The setting was undefined in that context. It's important to understand that the key is not the setting: a key can exist without a scope, but the value of a key is always attached to a scope. Note that if sbt doesn't find a value in the requested scope, it can delegate to other scopes, but that is a different topic.
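As a small, hedged illustration of that delegation (this sketch is not from the original answer; the project names are invented): a value defined only in the build-level settings is still found from each project's scope.

import sbt._
import Keys._

object DelegationSketch extends Build {
  // version is defined only once, in the build-level settings.
  override lazy val settings = super.settings ++ Seq(
    version := "0.1.0-SNAPSHOT"
  )

  // Neither project defines version itself; asking for core/version or
  // util/version still works because sbt delegates from the project scope
  // up to the build scope.
  lazy val core = project.in(file("core"))
  lazy val util = project.in(file("util"))
}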

How can we check this? It turns out to be quite simple. Let's ignore the error and let sbt start.

If you type inspect compile you'll see that inspect looks at compile:compile, where the value is defined. You can force it to look in a specific scope, e.g. inspect {.}/*:compile, which is the scope that gave us the error:

> inspect {.}/*:compile
[info] No entry for key.

Indeed, it's undefined.

How do we solve the issue? We have to give sbt the scope we're looking for. Naively you could try to add a configuration scope:

// this will not work
override lazy val settings = super.settings ++ Seq(
  compile in Compile := { (compile in Compile).value }
)

Well, there is no global compile; there is only a compile per project. We can overcome the issue by not overriding the global settings, but the settings specific to a project, and specifying the Compile configuration there:

lazy val root = project.in(file(".")).settings(Seq(
  compile in Compile := { (compile in Compile).value }
): _*)

This will work, but what if we want the compile value regardless of which project it belongs to? This is where ScopeFilter comes in handy. Back to the original example: let's assume we want compile's Analysis object from all projects.

import sbt._
import Keys._
import sbt.inc.Analysis
import xsbti.api.ClassLike
import xsbt.api.Discovery.{isConcrete, isPublic}

object TestBuild extends Build {

  val debugAPIs = taskKey[Seq[String]]("List of top-level definitions")

  val compileInAnyProject = ScopeFilter(inAnyProject, inConfigurations(Compile))

  override lazy val settings = super.settings ++ Seq(
    debugAPIs := {
      getAllTop(compile.all(compileInAnyProject).value)
    }
  )

  private def getAllTop(analyses : Seq[Analysis]) : Seq[String] =
    analyses.flatMap { analysis =>
      Tests.allDefs(analysis) collect { case c : ClassLike if isConcrete(c) && isPublic(c) => c.name }
    }
}

What we created is a ScopeFilter filtering for any project, and within those projects for the Compile configuration. Then we looked up all the compile values.

You can configure the ScopeFilter to match your needs and filter for specific projects, configurations, or even tasks. The key to understanding the problem is to remember that in sbt all settings are scoped.
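To illustrate, here is a hedged sketch of a few other ScopeFilter shapes (the project layout and the aggregating keys are invented for this example):

import sbt._
import Keys._

object FilterSketch extends Build {
  lazy val core = project.in(file("core"))
  lazy val util = project.in(file("util"))
  lazy val root = project.in(file(".")).aggregate(core, util)

  // only the core project, only its Compile configuration
  val onlyCoreCompile = ScopeFilter(inProjects(core), inConfigurations(Compile))
  // every project, both main and test configurations
  val compileAndTest = ScopeFilter(inAnyProject, inConfigurations(Compile, Test))

  val coreMainSources = taskKey[Seq[File]]("Main sources of the core project")
  val allSources      = taskKey[Seq[File]]("Main and test sources of all projects")

  override lazy val settings = super.settings ++ Seq(
    coreMainSources := sources.all(onlyCoreCompile).value.flatten,
    allSources      := sources.all(compileAndTest).value.flatten
  )
}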

Edit

You have asked how it comes that compile is not defined globally yet is available in every project. This is because there is Defaults.defaultSettings, which defines it, and each project includes it. If you removed super.settings from your build definition you'd see that, among others, compile is undefined.
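To sketch that (hedged; this assumes the sbt 0.13-era API, and the exact set of defaults varies by version): a project only knows about compile and friends because Defaults.defaultSettings ends up in its settings, which you could also add explicitly.

import sbt._
import Keys._

object DefaultsSketch extends Build {
  // compile, package, run, etc. come from Defaults.defaultSettings;
  // here they are added to the project explicitly.
  lazy val root = Project("root", file("."))
    .settings(Defaults.defaultSettings: _*)
}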

As to whether you should do it this way: overriding settings in a plugin is in general discouraged in Plugin Best Practices. I recommend reading it, together with the Plugins chapter. That should give you an idea of how to proceed.

You can get multiple values from multiple scopes by defining a new task that returns them. For example, to get the analyses together with the project they belong to, you could use the following piece of code:

object TestBuild extends Build {

  val debugAPIs = taskKey[Seq[(String, String)]]("List of top-level definitions")

  val compileInAnyProject = ScopeFilter(inAnyProject, inConfigurations(Compile))

  override lazy val settings = super.settings ++ Seq(
    debugAPIs := {
      getAllTop(analysisWithProject.all(compileInAnyProject).value)
    }
  )

  lazy val analysisWithProject = Def.task { (thisProject.value, compile.value) }

  private def getAllTop(analyses : Seq[(ResolvedProject, Analysis)]) : Seq[(String, String)] =
    analyses.flatMap { case (project, analysis) =>
      Tests.allDefs(analysis) collect { case c : ClassLike if isConcrete(c) && isPublic(c) => (project.id, c.name) }
    }
}
