diff --git a/README.md b/README.md
index e69de29b..79fa4cf1 100644
--- a/README.md
+++ b/README.md
@@ -0,0 +1,96 @@
+
+
+
+# Questions and Answers #
+
+This section covers the main ideas of DataForge in the form of questions and answers.
+
+## General ##
+
+**Q:** I have a lot of data to analyze. The analysis process is complicated, has many stages, and the data flow is not always obvious. To top it off, the data is huge, so I don't want to perform operations I don't need (calculate something I won't use, or calculate the same thing twice). And yes, I need it to run in parallel, and probably on a remote computer. By the way, I am sick and tired of scripts that modify other scripts that control scripts. Can you help me?
+
+**A:** Yes, that is precisely the problem DataForge was made to solve. It allows one to perform automated data manipulations with automatic optimization and parallelization. Importantly, data processing recipes are written declaratively, so it is quite easy to run a computation on a remote station. DataForge also guarantees reproducibility of analysis results. A small illustration follows.
+
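+For example, with the lazy `Data` API from this changeset, a two-stage computation is declared up front and nothing is evaluated until the resulting goal is actually started (a minimal sketch; the numbers are arbitrary):
+
+```kotlin
+// Data.static wraps an existing value; pipe declares a lazy, dependency-tracked step.
+val raw: Data<DoubleArray> = Data.static(doubleArrayOf(1.0, 2.0, 3.0))
+val mean: Data<Double> = raw.pipe { values -> values.average() } // not computed yet
+```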
+
+**Q:** How does it work?
+
+**A:** At the core of DataForge lies the idea of a **metadata processor**. It builds on the observation that in order to analyze something you need the data itself plus some additional information about what the data represents and what result the user wants. This additional information is called metadata, and it can be organized in a regular structure (a tree of values, not unlike XML or JSON). The important thing is that this structure leaves no place for user instructions (or scripts): the idea behind DataForge is that one does not need imperative commands. The framework configures itself according to the input metadata and decides which operations should be performed, in the most efficient way.
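+
+As a sketch, such a metadata tree could be written with the `buildMeta` DSL from the dataforge-meta module. The keys here are hypothetical and the exact builder syntax lives in dataforge-meta, so treat this as an illustration of the shape, not a verbatim API reference:
+
+```kotlin
+// A declarative recipe: no imperative commands, only a tree of values.
+val recipe = buildMeta {
+    "data" to {
+        "pattern" to ".*\\.csv"
+    }
+    "task" to {
+        "name" to "fit"
+        "model" to "gauss"
+    }
+}
+```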
+
+
+**Q:** But where does it get the algorithms it uses?
+
+**A:** Of course the algorithms must be written somewhere. No magic here. The logic is written in specialized modules. Some modules are provided out of the box by the system core; others have to be developed for a specific problem.
+
+
+**Q:** So I still need to write code? What is the difference then?
+
+**A:** Yes, someone still needs to write the code, but not necessarily you. Simple operations can be performed using the provided core logic. Your group can also have one programmer writing the logic while everyone else uses it without any real programming expertise. Moreover, the framework is organized in such a way that someone writing additional logic does not need to think about complicated things like parallel computing, resource handling, logging or caching. Most of that is handled by DataForge. To give a flavour, a custom module boils down to a plugin like the sketch below.
+
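+A minimal sketch of such a module, based on the `AbstractPlugin` API from this changeset (the plugin name, target and provided value are hypothetical):
+
+```kotlin
+class StatisticsPlugin : AbstractPlugin() {
+    override val tag = PluginTag("statistics")
+
+    // Logic is exposed by name; users refer to it from metadata instead of writing code.
+    override fun provideTop(target: String): Map<Name, Any> = when (target) {
+        "function" -> mapOf("mean".toName() to { xs: DoubleArray -> xs.average() })
+        else -> emptyMap()
+    }
+}
+```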
+
+## Platform ##
+
+**Q:** Which platform does DataForge use? Which operating systems does it work on?
+
+**A:** DataForge is mostly written in Java and uses the JVM as its platform. It works on any system that supports a JVM (meaning almost any modern system, excluding some mobile platforms).
+
+
+**Q:** But Java... it is slow!
+
+**A:** [It is not](https://stackoverflow.com/questions/2163411/is-java-really-slow/2163570#2163570). It lacks some hardware-specific optimizations and needs additional time to start (due to the nature of JIT compilation), but otherwise it is at least as fast as other languages traditionally used in science. More importantly, its memory safety, tooling support and vast ecosystem make it a number-one candidate for a data analysis framework.
+
+
+
+**Q:** Can I use my C++/Fortran/Python code in DataForge?
+
+**A:** Yes, as long as the code can be called from Java. Most common languages have a bridge for Java access. Compiled C/Fortran libraries pose no problems at all, and Python code can be called via one of the existing Python-Java interfaces. It is also planned to implement remote method invocation for common languages, so your Python or, say, Julia code could run in its native environment. The metadata processor paradigm makes this much easier to do.
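+
+For compiled libraries, one common route is a JNA-style binding (a generic JVM sketch, not a DataForge API; the library name and function are hypothetical):
+
+```kotlin
+import com.sun.jna.Library
+import com.sun.jna.Native
+
+// Binding to a hypothetical compiled C/Fortran library "analysis"
+// that exports: double integrate(double a, double b)
+interface AnalysisLib : Library {
+    fun integrate(a: Double, b: Double): Double
+}
+
+val lib: AnalysisLib = Native.load("analysis", AnalysisLib::class.java)
+val area = lib.integrate(0.0, 1.0) // callable from any JVM code, including DataForge modules
+```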
+
+
+
+## Features ##
+
+**Q:** What other features does DataForge provide?
+
+**A:** Alongside metadata processing (and a lot of tools for metadata manipulation and layering), DataForge has two additional important concepts:
+
+* **Modularisation**. Unlike many other frameworks, DataForge is intrinsically modular. The only mandatory part is a rather tiny core module; everything else can be customized.
+
+* **Context encapsulation**. Every DataForge task is executed in some context. The context isolates the task's environment, serves as a base for dependency injection, and specifies how the task interacts with the external world (see the sketch after this list).
+
+
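+A sketch of context encapsulation, using the `Global.context` builder added in this changeset (the context and plugin names are hypothetical):
+
+```kotlin
+val analysis = Global.context("analysis") {
+    // Plugins loaded here are isolated inside this context.
+    plugin("statistics")
+}
+```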
+
+
+**Q:** OK, but now I want to work directly with my measuring devices. How can I do that?
+
+**A:** The [dataforge-control](${site.url}/docs.html#control) module provides interfaces for interacting with hardware. Out of the box it supports safe communication with TCP/IP and COM/tty based devices. Specific device declarations can be added via additional modules. It is also possible to maintain data storage with the [dataforge-storage](${site.url}/docs.html#storage) module.
+
+
+
+**Q:** Declarations and metadata are good, but I want my scripts back!
+
+**A:** We can do that. [GRIND](${site.url}/docs.html#grind) provides a shell-like environment called GrindShell. It allows running imperative scripts with full access to all DataForge functionality. GRIND scripts are context-encapsulated. There are also convenient feature wrappers called helpers that can be loaded into the shell when new feature modules are added.
+
+
+
+## Misc ##
+
+**Q:** So everything looks great. Can I replace ROOT or my other data analysis framework with DataForge?
+
+**A:** One must note that DataForge is made for analysis, not for visualisation. Its visualisation and user-interaction capabilities are rather limited compared to frameworks like ROOT, JAS3 or DataMelt. The idea is to provide a reliable API and core functionality. In fact, JAS3 and DataMelt could be used as front-ends for the DataForge mechanics. It is planned to add an interface to ROOT via JFreeHep AIDA.
+
+
+
+**Q:** How does DataForge compare to cluster computation frameworks like Hadoop or Spark?
+
+**A:** Again, it is not the purpose of DataForge to replace cluster software. DataForge has some internal parallelism mechanics and implementations, but they are most certainly inferior to specially developed programs. Still, DataForge is not tied to a single implementation: your favourite parallel processing tool can still be used as a back-end for DataForge, with the full benefit of its configuration tools and integrations, and with no performance overhead.
+
+
+
+**Q:** Is it possible to use DataForge in notebook mode?
+
+**A:** Yes, it is. DataForge can be used as-is from the [beaker/beakerx](http://beakernotebook.com/) Groovy kernel with minor adjustments. It is planned to provide a separate DataForge kernel for `beakerx` that will automatically start a dedicated GRIND shell.
+
+
+
+**Q:** Can I use DataForge on a mobile platform?
+
+**A:** DataForge is modular. The core and most of the API are quite compact, so they can be used in Android applications. Some modules are designed for desktop use and cannot run on other platforms. The iPhone does not support Java and therefore can only use client-side DataForge applications.
diff --git a/build.gradle.kts b/build.gradle.kts
index 8494144e..016a350e 100644
--- a/build.gradle.kts
+++ b/build.gradle.kts
@@ -1,18 +1,20 @@
-val dataforgeVersion by extra("0.1.2")
+plugins {
+ id("scientifik.mpp") version "0.1.4" apply false
+ id("scientifik.publish") version "0.1.4" apply false
+}
+
+val dataforgeVersion by extra("0.1.3")
+
+val bintrayRepo by extra("dataforge")
+val githubProject by extra("dataforge-core")
allprojects {
- repositories {
- jcenter()
- maven("https://kotlin.bintray.com/kotlinx")
- }
-
group = "hep.dataforge"
version = dataforgeVersion
}
subprojects {
if (name.startsWith("dataforge")) {
- apply(plugin = "npm-bintray")
- apply(plugin = "npm-artifactory")
+ apply(plugin = "scientifik.publish")
}
}
\ No newline at end of file
diff --git a/buildSrc/build.gradle.kts b/buildSrc/build.gradle.kts
deleted file mode 100644
index 1ebbdf4d..00000000
--- a/buildSrc/build.gradle.kts
+++ /dev/null
@@ -1,20 +0,0 @@
-plugins {
- `kotlin-dsl`
-}
-
-repositories {
- gradlePluginPortal()
- jcenter()
-}
-
-val kotlinVersion = "1.3.31"
-
-// Add plugins used in buildSrc as dependencies, also we should specify version only here
-dependencies {
- implementation("org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlinVersion")
- implementation("org.jfrog.buildinfo:build-info-extractor-gradle:4.9.5")
- implementation("com.jfrog.bintray.gradle:gradle-bintray-plugin:1.8.4")
- implementation("org.jetbrains.dokka:dokka-gradle-plugin:0.9.18")
- implementation("com.moowork.gradle:gradle-node-plugin:1.3.1")
- implementation("org.openjfx:javafx-plugin:0.0.7")
-}
diff --git a/buildSrc/settings.gradle.kts b/buildSrc/settings.gradle.kts
deleted file mode 100644
index e69de29b..00000000
diff --git a/buildSrc/src/main/kotlin/Versions.kt b/buildSrc/src/main/kotlin/Versions.kt
deleted file mode 100644
index 883af120..00000000
--- a/buildSrc/src/main/kotlin/Versions.kt
+++ /dev/null
@@ -1,9 +0,0 @@
-// Instead of defining runtime properties and use them dynamically
-// define version in buildSrc and have autocompletion and compile-time check
-// Also dependencies itself can be moved here
-object Versions {
- val ioVersion = "0.1.8"
- val coroutinesVersion = "1.2.1"
- val atomicfuVersion = "0.12.6"
- val serializationVersion = "0.11.0"
-}
diff --git a/buildSrc/src/main/kotlin/dokka-publish.gradle.kts b/buildSrc/src/main/kotlin/dokka-publish.gradle.kts
deleted file mode 100644
index 318e08ae..00000000
--- a/buildSrc/src/main/kotlin/dokka-publish.gradle.kts
+++ /dev/null
@@ -1,59 +0,0 @@
-import org.jetbrains.dokka.gradle.DokkaTask
-
-plugins {
- kotlin("multiplatform")
- id("org.jetbrains.dokka")
- `maven-publish`
-}
-
-kotlin {
-
- val dokka by tasks.getting(DokkaTask::class) {
- outputFormat = "html"
- outputDirectory = "$buildDir/javadoc"
- jdkVersion = 8
-
- kotlinTasks {
- // dokka fails to retrieve sources from MPP-tasks so we only define the jvm task
- listOf(tasks.getByPath("compileKotlinJvm"))
- }
- sourceRoot {
- // assuming only single source dir
- path = sourceSets["commonMain"].kotlin.srcDirs.first().toString()
- platforms = listOf("Common")
- }
- // although the JVM sources are now taken from the task,
- // we still define the jvm source root to get the JVM marker in the generated html
- sourceRoot {
- // assuming only single source dir
- path = sourceSets["jvmMain"].kotlin.srcDirs.first().toString()
- platforms = listOf("JVM")
- }
- }
-
- val javadocJar by tasks.registering(Jar::class) {
- dependsOn(dokka)
- archiveClassifier.set("javadoc")
- from("$buildDir/javadoc")
- }
-
- publishing {
-
- // publications.filterIsInstance<MavenPublication>().forEach { publication ->
-// if (publication.name == "kotlinMultiplatform") {
-// // for our root metadata publication, set artifactId with a package and project name
-// publication.artifactId = project.name
-// } else {
-// // for targets, set artifactId with a package, project name and target name (e.g. iosX64)
-// publication.artifactId = "${project.name}-${publication.name}"
-// }
-// }
-
- targets.all {
- val publication = publications.findByName(name) as MavenPublication
-
- // Patch publications with fake javadoc
- publication.artifact(javadocJar.get())
- }
- }
-}
\ No newline at end of file
diff --git a/buildSrc/src/main/kotlin/js-test.gradle.kts b/buildSrc/src/main/kotlin/js-test.gradle.kts
deleted file mode 100644
index 61759a28..00000000
--- a/buildSrc/src/main/kotlin/js-test.gradle.kts
+++ /dev/null
@@ -1,44 +0,0 @@
-import com.moowork.gradle.node.npm.NpmTask
-import com.moowork.gradle.node.task.NodeTask
-import org.gradle.kotlin.dsl.*
-import org.jetbrains.kotlin.gradle.tasks.Kotlin2JsCompile
-
-plugins {
- id("com.moowork.node")
- kotlin("multiplatform")
-}
-
-node {
- nodeModulesDir = file("$buildDir/node_modules")
-}
-
-val compileKotlinJs by tasks.getting(Kotlin2JsCompile::class)
-val compileTestKotlinJs by tasks.getting(Kotlin2JsCompile::class)
-
-val populateNodeModules by tasks.registering(Copy::class) {
- dependsOn(compileKotlinJs)
- from(compileKotlinJs.destinationDir)
-
- kotlin.js().compilations["test"].runtimeDependencyFiles.forEach {
- if (it.exists() && !it.isDirectory) {
- from(zipTree(it.absolutePath).matching { include("*.js") })
- }
- }
-
- into("$buildDir/node_modules")
-}
-
-val installMocha by tasks.registering(NpmTask::class) {
- setWorkingDir(buildDir)
- setArgs(listOf("install", "mocha"))
-}
-
-val runMocha by tasks.registering(NodeTask::class) {
- dependsOn(compileTestKotlinJs, populateNodeModules, installMocha)
- setScript(file("$buildDir/node_modules/mocha/bin/mocha"))
- setArgs(listOf(compileTestKotlinJs.outputFile))
-}
-
-tasks["jsTest"].dependsOn(runMocha)
-
-
diff --git a/buildSrc/src/main/kotlin/npm-artifactory.gradle.kts b/buildSrc/src/main/kotlin/npm-artifactory.gradle.kts
deleted file mode 100644
index d792dffb..00000000
--- a/buildSrc/src/main/kotlin/npm-artifactory.gradle.kts
+++ /dev/null
@@ -1,38 +0,0 @@
-import groovy.lang.GroovyObject
-import org.jfrog.gradle.plugin.artifactory.dsl.PublisherConfig
-import org.jfrog.gradle.plugin.artifactory.dsl.ResolverConfig
-
-plugins {
- id("com.jfrog.artifactory")
-}
-
-artifactory {
- val artifactoryUser: String? by project
- val artifactoryPassword: String? by project
- val artifactoryContextUrl = "http://npm.mipt.ru:8081/artifactory"
-
- setContextUrl(artifactoryContextUrl)//The base Artifactory URL if not overridden by the publisher/resolver
- publish(delegateClosureOf<PublisherConfig> {
- repository(delegateClosureOf<GroovyObject> {
- setProperty("repoKey", "gradle-dev-local")
- setProperty("username", artifactoryUser)
- setProperty("password", artifactoryPassword)
- })
-
- defaults(delegateClosureOf<GroovyObject> {
- invokeMethod("publications", arrayOf("jvm", "js", "kotlinMultiplatform", "metadata"))
- //TODO: This property is not available for ArtifactoryTask
- //setProperty("publishBuildInfo", false)
- setProperty("publishArtifacts", true)
- setProperty("publishPom", true)
- setProperty("publishIvy", false)
- })
- })
- resolve(delegateClosureOf<ResolverConfig> {
- repository(delegateClosureOf<GroovyObject> {
- setProperty("repoKey", "gradle-dev")
- setProperty("username", artifactoryUser)
- setProperty("password", artifactoryPassword)
- })
- })
-}
diff --git a/buildSrc/src/main/kotlin/npm-bintray.gradle.kts b/buildSrc/src/main/kotlin/npm-bintray.gradle.kts
deleted file mode 100644
index b152d163..00000000
--- a/buildSrc/src/main/kotlin/npm-bintray.gradle.kts
+++ /dev/null
@@ -1,97 +0,0 @@
-@file:Suppress("UnstableApiUsage")
-
-import com.jfrog.bintray.gradle.BintrayExtension.PackageConfig
-import com.jfrog.bintray.gradle.BintrayExtension.VersionConfig
-
-// Old bintray.gradle script converted to real Gradle plugin (precompiled script plugin)
-// It now has own dependencies and support type safe accessors
-// Syntax is pretty close to what we had in Groovy
-// (excluding Property.set and bintray dynamic configs)
-
-plugins {
- id("com.jfrog.bintray")
- `maven-publish`
-}
-
-val vcs = "https://github.com/mipt-npm/kmath"
-
-// Configure publishing
-publishing {
- repositories {
- maven("https://bintray.com/mipt-npm/scientifik")
- }
-
- // Process each publication we have in this project
- publications.filterIsInstance<MavenPublication>().forEach { publication ->
-
- // use type safe pom config GSL insterad of old dynamic
- publication.pom {
- name.set(project.name)
- description.set(project.description)
- url.set(vcs)
-
- licenses {
- license {
- name.set("The Apache Software License, Version 2.0")
- url.set("http://www.apache.org/licenses/LICENSE-2.0.txt")
- distribution.set("repo")
- }
- }
- developers {
- developer {
- id.set("MIPT-NPM")
- name.set("MIPT nuclear physics methods laboratory")
- organization.set("MIPT")
- organizationUrl.set("http://npm.mipt.ru")
- }
-
- }
- scm {
- url.set(vcs)
- }
- }
-
- }
-}
-
-bintray {
- // delegates for runtime properties
- val bintrayUser: String? by project
- val bintrayApiKey: String? by project
- user = bintrayUser ?: System.getenv("BINTRAY_USER")
- key = bintrayApiKey ?: System.getenv("BINTRAY_API_KEY")
- publish = true
- override = true // for multi-platform Kotlin/Native publishing
-
- // We have to use delegateClosureOf because bintray supports only dynamic groovy syntax
- // this is a problem of this plugin
- pkg(delegateClosureOf<PackageConfig> {
- userOrg = "mipt-npm"
- repo = "scientifik"
- name = "scientifik.kmath"
- issueTrackerUrl = "https://github.com/mipt-npm/kmath/issues"
- setLicenses("Apache-2.0")
- vcsUrl = vcs
- version(delegateClosureOf<VersionConfig> {
- name = project.version.toString()
- vcsTag = project.version.toString()
- released = java.util.Date().toString()
- })
- })
-
- tasks {
- bintrayUpload {
- dependsOn(publishToMavenLocal)
- doFirst {
- setPublications(project.publishing.publications
- .filterIsInstance<MavenPublication>()
- .filter { !it.name.contains("-test") && it.name != "kotlinMultiplatform" }
- .map {
- println("""Uploading artifact "${it.groupId}:${it.artifactId}:${it.version}" from publication "${it.name}""")
- it.name //https://github.com/bintray/gradle-bintray-plugin/issues/256
- })
- }
- }
-
- }
-}
diff --git a/buildSrc/src/main/kotlin/npm-multiplatform.gradle.kts b/buildSrc/src/main/kotlin/npm-multiplatform.gradle.kts
deleted file mode 100644
index 671986b8..00000000
--- a/buildSrc/src/main/kotlin/npm-multiplatform.gradle.kts
+++ /dev/null
@@ -1,86 +0,0 @@
-import org.gradle.kotlin.dsl.*
-
-plugins {
- kotlin("multiplatform")
- `maven-publish`
-}
-
-
-kotlin {
- jvm {
- compilations.all {
- kotlinOptions {
- jvmTarget = "1.8"
- }
- }
- }
-
- js {
- compilations.all {
- kotlinOptions {
- metaInfo = true
- sourceMap = true
- sourceMapEmbedSources = "always"
- moduleKind = "commonjs"
- }
- }
-
- compilations.named("main") {
- kotlinOptions {
- main = "call"
- }
- }
- }
-
- sourceSets {
- val commonMain by getting {
- dependencies {
- api(kotlin("stdlib"))
- }
- }
- val commonTest by getting {
- dependencies {
- implementation(kotlin("test-common"))
- implementation(kotlin("test-annotations-common"))
- }
- }
- val jvmMain by getting {
- dependencies {
- api(kotlin("stdlib-jdk8"))
- }
- }
- val jvmTest by getting {
- dependencies {
- implementation(kotlin("test"))
- implementation(kotlin("test-junit"))
- }
- }
- val jsMain by getting {
- dependencies {
- api(kotlin("stdlib-js"))
- }
- }
- val jsTest by getting {
- dependencies {
- implementation(kotlin("test-js"))
- }
- }
- }
-
- targets.all {
- sourceSets.all {
- languageSettings.progressiveMode = true
- languageSettings.enableLanguageFeature("InlineClasses")
- }
- }
-
- apply(plugin = "dokka-publish")
-
- // Apply JS test configuration
- val runJsTests by ext(false)
-
- if (runJsTests) {
- apply(plugin = "js-test")
- }
-
-}
diff --git a/dataforge-context/build.gradle.kts b/dataforge-context/build.gradle.kts
index f454e2ef..896e7b89 100644
--- a/dataforge-context/build.gradle.kts
+++ b/dataforge-context/build.gradle.kts
@@ -1,34 +1,31 @@
plugins {
- `npm-multiplatform`
+ id("scientifik.mpp")
}
description = "Context and provider definitions"
-val coroutinesVersion: String = Versions.coroutinesVersion
+val coroutinesVersion: String = Scientifik.coroutinesVersion
kotlin {
- jvm()
- js()
-
sourceSets {
val commonMain by getting {
dependencies {
api(project(":dataforge-meta"))
api(kotlin("reflect"))
- api("io.github.microutils:kotlin-logging-common:1.6.10")
+ api("io.github.microutils:kotlin-logging-common:1.7.2")
api("org.jetbrains.kotlinx:kotlinx-coroutines-core-common:$coroutinesVersion")
}
}
val jvmMain by getting {
dependencies {
- api("io.github.microutils:kotlin-logging:1.6.10")
+ api("io.github.microutils:kotlin-logging:1.7.2")
api("ch.qos.logback:logback-classic:1.2.3")
api("org.jetbrains.kotlinx:kotlinx-coroutines-core:$coroutinesVersion")
}
}
val jsMain by getting {
dependencies {
- api("io.github.microutils:kotlin-logging-js:1.6.10")
+ api("io.github.microutils:kotlin-logging-js:1.7.2")
api("org.jetbrains.kotlinx:kotlinx-coroutines-core-js:$coroutinesVersion")
}
}
diff --git a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/AbstractPlugin.kt b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/AbstractPlugin.kt
index 9a091ad1..c9268790 100644
--- a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/AbstractPlugin.kt
+++ b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/AbstractPlugin.kt
@@ -3,6 +3,7 @@ package hep.dataforge.context
import hep.dataforge.meta.EmptyMeta
import hep.dataforge.meta.Meta
import hep.dataforge.names.Name
+import hep.dataforge.names.toName
abstract class AbstractPlugin(override val meta: Meta = EmptyMeta) : Plugin {
private var _context: Context? = null
@@ -18,7 +19,9 @@ abstract class AbstractPlugin(override val meta: Meta = EmptyMeta) : Plugin {
this._context = null
}
- override fun provideTop(target: String, name: Name): Any? = null
+ override fun provideTop(target: String): Map<Name, Any> = emptyMap()
- override fun listNames(target: String): Sequence<Name> = emptySequence()
+ companion object {
+ fun <T : Named> Collection<T>.toMap(): Map<Name, T> = associate { it.name.toName() to it }
+ }
}
\ No newline at end of file
diff --git a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/Context.kt b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/Context.kt
index cf6bb662..80746456 100644
--- a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/Context.kt
+++ b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/Context.kt
@@ -26,8 +26,10 @@ import kotlin.jvm.JvmName
* Since plugins could contain mutable state, context has two states: active and inactive. No changes are allowed to active context.
* @author Alexander Nozik
*/
-open class Context(final override val name: String, val parent: Context? = Global) : Named, MetaRepr, Provider,
- CoroutineScope {
+open class Context(
+ final override val name: String,
+ val parent: Context? = Global
+) : Named, MetaRepr, Provider, CoroutineScope {
private val config = Config()
@@ -59,19 +61,11 @@ open class Context(final override val name: String, val parent: Context? = Globa
override val defaultTarget: String get() = Plugin.PLUGIN_TARGET
- override fun provideTop(target: String, name: Name): Any? {
+ override fun provideTop(target: String): Map<Name, Any> {
return when (target) {
- Plugin.PLUGIN_TARGET -> plugins[PluginTag.fromString(name.toString())]
- Value.TYPE -> properties[name]?.value
- else -> null
- }
- }
-
- override fun listNames(target: String): Sequence<Name> {
- return when (target) {
- Plugin.PLUGIN_TARGET -> plugins.asSequence().map { it.name.toName() }
- Value.TYPE -> properties.values().map { it.first }
- else -> emptySequence()
+ Value.TYPE -> properties.sequence().toMap()
+ Plugin.PLUGIN_TARGET -> plugins.sequence(true).associateBy { it.name.toName() }
+ else -> emptyMap()
}
}
@@ -116,11 +110,11 @@ open class Context(final override val name: String, val parent: Context? = Globa
}
}
-/**
- * A sequences of all objects provided by plugins with given target and type
- */
fun Context.content(target: String): Map<Name, Any> = content<Any>(target)
+/**
+ * A map of all objects provided by plugins with given target and type
+ */
@JvmName("typedContent")
inline fun <reified T : Any> Context.content(target: String): Map<Name, T> =
plugins.flatMap { plugin ->
@@ -148,14 +142,18 @@ object Global : Context("GLOBAL", null) {
private val contextRegistry = HashMap<String, Context>()
/**
- * Get previously builder context o builder a new one
+ * Get previously built context
*
* @param name
* @return
*/
- fun getContext(name: String): Context {
- return contextRegistry.getOrPut(name) { Context(name) }
+ fun getContext(name: String): Context? {
+ return contextRegistry[name]
}
+
+ fun context(name: String, parent: Context = this, block: ContextBuilder.() -> Unit = {}): Context =
+ ContextBuilder(name, parent).apply(block).build()
+
}
diff --git a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/ContextBuilder.kt b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/ContextBuilder.kt
index 203edd2c..92840862 100644
--- a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/ContextBuilder.kt
+++ b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/ContextBuilder.kt
@@ -18,11 +18,15 @@ class ContextBuilder(var name: String = "@anonimous", val parent: Context = Glob
plugins.add(plugin)
}
- fun plugin(tag: PluginTag, action: MetaBuilder.() -> Unit) {
+ fun plugin(tag: PluginTag, action: MetaBuilder.() -> Unit = {}) {
plugins.add(PluginRepository.fetch(tag, buildMeta(action)))
}
- fun plugin(name: String, group: String = "", version: String = "", action: MetaBuilder.() -> Unit) {
+ fun plugin(builder: PluginFactory<*>, action: MetaBuilder.() -> Unit = {}) {
+ plugins.add(builder.invoke(buildMeta(action)))
+ }
+
+ fun plugin(name: String, group: String = "", version: String = "", action: MetaBuilder.() -> Unit = {}) {
plugin(PluginTag(name, group, version), action)
}
diff --git a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/PluginManager.kt b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/PluginManager.kt
index bcce8eb6..dad483a8 100644
--- a/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/PluginManager.kt
+++ b/dataforge-context/src/commonMain/kotlin/hep/dataforge/context/PluginManager.kt
@@ -51,7 +51,10 @@ class PluginManager(override val context: Context) : ContextAware, Iterable<Plugin> {
- operator fun <T : Any> get(type: KClass<T>, recursive: Boolean = true): T? =
- get(recursive) { type.isInstance(it) } as T?
+ operator fun <T : Any> get(type: KClass<T>, tag: PluginTag? = null, recursive: Boolean = true): T? =
+ get(recursive) { type.isInstance(it) && (tag == null || tag.matches(it.tag)) } as T?
- inline fun <reified T : Any> get(recursive: Boolean = true): T? = get(T::class, recursive)
+ inline fun <reified T : Any> get(tag: PluginTag? = null, recursive: Boolean = true): T? =
+ get(T::class, tag, recursive)
/**
* Load given plugin into this manager and return loaded instance.
- * Throw error if plugin of the same class already exists in manager
+ * Throw error if plugin of the same type and tag already exists in manager.
*
* @param plugin
* @return
@@ -75,10 +79,12 @@ class PluginManager(override val context: Context) : ContextAware, Iterable<Plugin> {
fun <T : Plugin> load(plugin: T): T {
if (context.isActive) error("Can't load plugin into active context")
- if (get(plugin::class, false) != null) {
- throw RuntimeException("Plugin of type ${plugin::class} already exists in ${context.name}")
+ if (get(plugin::class, plugin.tag, recursive = false) != null) {
+ error("Plugin of type ${plugin::class} already exists in ${context.name}")
} else {
- loadDependencies(plugin)
+ for (tag in plugin.dependsOn()) {
+ fetch(tag, true)
+ }
logger.info { "Loading plugin ${plugin.name} into ${context.name}" }
plugin.attach(context)
@@ -87,11 +93,14 @@ class PluginManager(override val context: Context) : ContextAware, Iterable<Plugin> {
+ fun <T : Plugin> load(factory: PluginFactory<T>, meta: Meta = EmptyMeta): T =
+ load(factory(meta))
+
+ fun <T : Plugin> load(factory: PluginFactory<T>, metaBuilder: MetaBuilder.() -> Unit): T =
+ load(factory, buildMeta(metaBuilder))
/**
* Remove a plugin from [PluginManager]
@@ -107,22 +116,11 @@ class PluginManager(override val context: Context) : ContextAware, Iterable<Plugin> {
- loaded == null -> load(PluginRepository.fetch(tag,meta))
- loaded.meta == meta -> loaded // if meta is the same, return existing plugin
- else -> throw RuntimeException("Can't load plugin with tag $tag. Plugin with this tag and different configuration already exists in context.")
- }
- }
-
- fun load(factory: PluginFactory<*>, meta: Meta = EmptyMeta): Plugin{
- val loaded = get(factory.tag, false)
+ fun <T : Plugin> fetch(factory: PluginFactory<T>, recursive: Boolean = true, meta: Meta = EmptyMeta): T {
+ val loaded = get(factory.type, factory.tag, recursive)
return when {
loaded == null -> load(factory(meta))
loaded.meta == meta -> loaded // if meta is the same, return existing plugin
@@ -130,42 +128,12 @@ class PluginManager(override val context: Context) : ContextAware, Iterable load(type: KClass, meta: Meta = EmptyMeta): T {
- val loaded = get(type, false)
- return when {
- loaded == null -> {
- val plugin = PluginRepository.list().first { it.type == type }.invoke(meta)
- if (type.isInstance(plugin)) {
- @Suppress("UNCHECKED_CAST")
- load(plugin as T)
- } else {
- error("Corrupt type information in plugin repository")
- }
- }
- loaded.meta == meta -> loaded // if meta is the same, return existing plugin
- else -> throw RuntimeException("Can't load plugin with type $type. Plugin with this type and different configuration already exists in context.")
- }
- }
-
- inline fun <reified T : Plugin> load(noinline metaBuilder: MetaBuilder.() -> Unit = {}): T {
- return load(T::class, buildMeta(metaBuilder))
- }
-
- fun load(name: String, meta: Meta = EmptyMeta): Plugin {
- return load(PluginTag.fromString(name), meta)
- }
+ fun <T : Plugin> fetch(
+ factory: PluginFactory<T>,
+ recursive: Boolean = true,
+ metaBuilder: MetaBuilder.() -> Unit
+ ): T = fetch(factory, recursive, buildMeta(metaBuilder))
override fun iterator(): Iterator<Plugin> = plugins.iterator()
- /**
- * Get a plugin if it exists or load it with given meta if it is not.
- */
- inline fun <reified T : Plugin> getOrLoad(noinline metaBuilder: MetaBuilder.() -> Unit = {}): T {
- return get(true) ?: load(metaBuilder)
- }
-
}
\ No newline at end of file
diff --git a/dataforge-context/src/commonMain/kotlin/hep/dataforge/provider/Provider.kt b/dataforge-context/src/commonMain/kotlin/hep/dataforge/provider/Provider.kt
index 657f272d..b1d769e2 100644
--- a/dataforge-context/src/commonMain/kotlin/hep/dataforge/provider/Provider.kt
+++ b/dataforge-context/src/commonMain/kotlin/hep/dataforge/provider/Provider.kt
@@ -17,7 +17,6 @@ package hep.dataforge.provider
import hep.dataforge.names.Name
import hep.dataforge.names.toName
-import kotlin.jvm.JvmName
/**
* A marker utility interface for providers.
@@ -42,29 +41,21 @@ interface Provider {
/**
- * Provide a top level element for this [Provider] or return null if element is not present
+ * A map of direct children for specific target
*/
- fun provideTop(target: String, name: Name): Any?
-
- /**
- * [Sequence] of available names with given target. Only top level names are listed, no chain path.
- *
- * @param target
- * @return
- */
- fun listNames(target: String): Sequence<Name>
+ fun provideTop(target: String): Map<Name, Any>
}
fun Provider.provide(path: Path, targetOverride: String? = null): Any? {
if (path.length == 0) throw IllegalArgumentException("Can't provide by empty path")
val first = path.first()
- val top = provideTop(targetOverride ?: first.target ?: defaultTarget, first.name)
+ val target = targetOverride ?: first.target ?: defaultTarget
+ val res = provideTop(target)[first.name] ?: return null
return when (path.length) {
- 1 -> top
+ 1 -> res
else -> {
- when (top) {
- null -> null
- is Provider -> top.provide(path.tail!!, targetOverride = defaultChainTarget)
+ when (res) {
+ is Provider -> res.provide(path.tail!!, targetOverride = defaultChainTarget)
else -> throw IllegalStateException("Chain path not supported: child is not a provider")
}
}
@@ -86,14 +77,11 @@ inline fun <reified T : Any> Provider.provide(target: String, name: String): T? =
provide(target, name.toName())
/**
- * A top level content with names
+ * Typed top level content
*/
-fun Provider.top(target: String): Map<Name, Any> = top<Any>(target)
-
-@JvmName("typedTop")
inline fun <reified T : Any> Provider.top(target: String): Map<Name, T> {
- return listNames(target).associate {
- it to (provideTop(target, it) as? T ?: error("The element $it is declared but not provided"))
+ return provideTop(target).mapValues {
+ it.value as? T ?: error("The type of element $it is ${it::class} but ${T::class} is expected")
}
}
diff --git a/dataforge-context/src/commonTest/kotlin/hep/dataforge/context/ContextTest.kt b/dataforge-context/src/commonTest/kotlin/hep/dataforge/context/ContextTest.kt
index 1d37c69e..c77439d6 100644
--- a/dataforge-context/src/commonTest/kotlin/hep/dataforge/context/ContextTest.kt
+++ b/dataforge-context/src/commonTest/kotlin/hep/dataforge/context/ContextTest.kt
@@ -12,17 +12,10 @@ class ContextTest {
class DummyPlugin : AbstractPlugin() {
override val tag get() = PluginTag("test")
- override fun provideTop(target: String, name: Name): Any? {
- return when (target) {
- "test" -> return name
- else -> super.provideTop(target, name)
- }
- }
-
- override fun listNames(target: String): Sequence<Name> {
- return when (target) {
- "test" -> sequenceOf("a", "b", "c.d").map { it.toName() }
- else -> super.listNames(target)
+ override fun provideTop(target: String): Map<Name, Any> {
+ return when (target) {
+ "test" -> listOf("a", "b", "c.d").associate { it.toName() to it.toName() }
+ else -> emptyMap()
}
}
}
diff --git a/dataforge-context/src/jvmMain/kotlin/hep/dataforge/provider/Types.kt b/dataforge-context/src/jvmMain/kotlin/hep/dataforge/provider/Types.kt
index d6ed723d..dfe81ce0 100644
--- a/dataforge-context/src/jvmMain/kotlin/hep/dataforge/provider/Types.kt
+++ b/dataforge-context/src/jvmMain/kotlin/hep/dataforge/provider/Types.kt
@@ -34,9 +34,7 @@ inline fun <reified T : Any> Provider.provideByType(name: Name): T? {
inline fun <reified T : Any> Provider.top(): Map<Name, T> {
val target = Types[T::class]
- return listNames(target).associate { name ->
- name to (provideByType(name) ?: error("The element $name is declared but not provided"))
- }
+ return top(target)
}
/**
diff --git a/dataforge-data/build.gradle.kts b/dataforge-data/build.gradle.kts
index 7ebb46ce..793f551b 100644
--- a/dataforge-data/build.gradle.kts
+++ b/dataforge-data/build.gradle.kts
@@ -1,17 +1,14 @@
plugins {
- `npm-multiplatform`
+ id("scientifik.mpp")
}
-val coroutinesVersion: String = Versions.coroutinesVersion
+val coroutinesVersion: String = Scientifik.coroutinesVersion
kotlin {
- jvm()
- js()
sourceSets {
val commonMain by getting{
dependencies {
api(project(":dataforge-meta"))
- api(kotlin("reflect"))
api("org.jetbrains.kotlinx:kotlinx-coroutines-core-common:$coroutinesVersion")
}
}
@@ -19,6 +16,7 @@ kotlin {
val jvmMain by getting{
dependencies {
api("org.jetbrains.kotlinx:kotlinx-coroutines-core:$coroutinesVersion")
+ api(kotlin("reflect"))
}
}
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Action.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Action.kt
index 228522dc..cf030c75 100644
--- a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Action.kt
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Action.kt
@@ -1,7 +1,6 @@
package hep.dataforge.data
import hep.dataforge.meta.Meta
-import hep.dataforge.names.Name
/**
* A simple data transformation on a data node
@@ -34,19 +33,3 @@ infix fun <T : Any, I : Any, R : Any> Action<T, I>.then(action: Action<I, R>): Action<T, R> {
}
}
-
-///**
-// * An action that performs the same transformation on each of input data nodes. Null results are ignored.
-// * The transformation is non-suspending because it is lazy.
-// */
-//class PipeAction(val transform: (Name, Data, Meta) -> Data?) : Action {
-// override fun invoke(node: DataNode, meta: Meta): DataNode = DataNode.build {
-// node.data().forEach { (name, data) ->
-// val res = transform(name, data, meta)
-// if (res != null) {
-// set(name, res)
-// }
-// }
-// }
-//}
-
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/CoroutineMonitor.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/CoroutineMonitor.kt
new file mode 100644
index 00000000..8cbd6192
--- /dev/null
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/CoroutineMonitor.kt
@@ -0,0 +1,48 @@
+package hep.dataforge.data
+
+import kotlinx.coroutines.CoroutineScope
+import kotlinx.coroutines.Job
+import kotlin.coroutines.CoroutineContext
+
+/**
+ * A monitor of goal state that can be accessed only from inside the goal
+ */
+class CoroutineMonitor : CoroutineContext.Element {
+ override val key: CoroutineContext.Key<*> get() = CoroutineMonitor
+
+ var totalWork: Double = 1.0
+ var workDone: Double = 0.0
+ var status: String = ""
+
+ /**
+ * Mark the goal as started
+ */
+ fun start() {
+
+ }
+
+ /**
+ * Mark the goal as completed
+ */
+ fun finish() {
+ workDone = totalWork
+ }
+
+ companion object : CoroutineContext.Key<CoroutineMonitor>
+}
+
+class Dependencies(val values: Collection<Job>) : CoroutineContext.Element {
+ override val key: CoroutineContext.Key<*> get() = Dependencies
+
+ companion object : CoroutineContext.Key<Dependencies>
+}
+
+val CoroutineContext.monitor: CoroutineMonitor? get() = this[CoroutineMonitor]
+val CoroutineScope.monitor: CoroutineMonitor? get() = coroutineContext.monitor
+
+val Job.dependencies: Collection<Job> get() = this[Dependencies]?.values ?: emptyList()
+
+val Job.totalWork: Double get() = dependencies.sumByDouble { it.totalWork } + (monitor?.totalWork ?: 0.0)
+val Job.workDone: Double get() = dependencies.sumByDouble { it.workDone } + (monitor?.workDone ?: 0.0)
+val Job.status: String get() = monitor?.status ?: ""
+val Job.progress: Double get() = workDone / totalWork
\ No newline at end of file
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Data.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Data.kt
index 20957824..9b0d9027 100644
--- a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Data.kt
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/Data.kt
@@ -1,14 +1,17 @@
package hep.dataforge.data
+import hep.dataforge.meta.EmptyMeta
import hep.dataforge.meta.Meta
import hep.dataforge.meta.MetaRepr
import kotlinx.coroutines.CoroutineScope
+import kotlin.coroutines.CoroutineContext
+import kotlin.coroutines.EmptyCoroutineContext
import kotlin.reflect.KClass
/**
* A data element characterized by its meta
*/
-interface Data<out T : Any> : MetaRepr {
+interface Data<out T : Any> : Goal<T>, MetaRepr {
/**
* Type marker for the data. The type is known before the calculation takes place so it could be checked.
*/
@@ -18,52 +21,148 @@ interface Data<out T : Any> : MetaRepr {
*/
val meta: Meta
- /**
- * Lazy data value
- */
- val goal: Goal
-
override fun toMeta(): Meta = meta
companion object {
const val TYPE = "data"
- fun <T : Any> of(type: KClass<out T>, goal: Goal<T>, meta: Meta): Data<T> = DataImpl(type, goal, meta)
+ operator fun <T : Any> invoke(
+ type: KClass<out T>,
+ meta: Meta = EmptyMeta,
+ context: CoroutineContext = EmptyCoroutineContext,
+ dependencies: Collection<Data<*>> = emptyList(),
+ block: suspend CoroutineScope.() -> T
+ ): Data<T> = DynamicData(type, meta, context, dependencies, block)
- inline fun <reified T : Any> of(goal: Goal<T>, meta: Meta): Data<T> = of(T::class, goal, meta)
+ operator inline fun <reified T : Any> invoke(
+ meta: Meta = EmptyMeta,
+ context: CoroutineContext = EmptyCoroutineContext,
+ dependencies: Collection<Data<*>> = emptyList(),
+ noinline block: suspend CoroutineScope.() -> T
+ ): Data<T> = invoke(T::class, meta, context, dependencies, block)
- fun <T : Any> of(name: String, type: KClass<out T>, goal: Goal<T>, meta: Meta): Data<T> =
- NamedData(name, of(type, goal, meta))
+ operator fun <T : Any> invoke(
+ name: String,
+ type: KClass<out T>,
+ meta: Meta = EmptyMeta,
+ context: CoroutineContext = EmptyCoroutineContext,
+ dependencies: Collection<Data<*>> = emptyList(),
+ block: suspend CoroutineScope.() -> T
+ ): Data<T> = NamedData(name, invoke(type, meta, context, dependencies, block))
- inline fun <reified T : Any> of(name: String, goal: Goal<T>, meta: Meta): Data<T> =
- of(name, T::class, goal, meta)
+ operator inline fun <reified T : Any> invoke(
+ name: String,
+ meta: Meta = EmptyMeta,
+ context: CoroutineContext = EmptyCoroutineContext,
+ dependencies: Collection<Data<*>> = emptyList(),
+ noinline block: suspend CoroutineScope.() -> T
+ ): Data<T> =
+ invoke(name, T::class, meta, context, dependencies, block)
- fun <T : Any> static(scope: CoroutineScope, value: T, meta: Meta): Data<T> =
- DataImpl(value::class, Goal.static(scope, value), meta)
+ fun <T : Any> static(value: T, meta: Meta = EmptyMeta): Data<T> =
+ StaticData(value, meta)
+ }
+}
+
+
+fun <R : Any, T : R> Data<T>.cast(type: KClass<R>): Data<R> {
+ return object : Data<R> by this {
+ override val type: KClass<out R> = type
}
}
/**
* Upcast a [Data] to a supertype
*/
-inline fun <reified R : Any, T : R> Data<T>.cast(): Data<R> {
- return Data.of(R::class, goal, meta)
-}
+inline fun <reified R : Any, T : R> Data<T>.cast(): Data<R> = cast(R::class)
-fun <R : Any, T : R> Data<T>.cast(type: KClass<R>): Data<R> {
- return Data.of(type, goal, meta)
-}
-suspend fun <T : Any> Data<T>.await(): T = goal.await()
-
-/**
- * Generic Data implementation
- */
-private class DataImpl<T : Any>(
+class DynamicData<T : Any>(
 override val type: KClass<out T>,
- override val goal: Goal<T>,
- override val meta: Meta
-) : Data<T>
+ override val meta: Meta = EmptyMeta,
+ context: CoroutineContext = EmptyCoroutineContext,
+ dependencies: Collection<Data<*>> = emptyList(),
+ block: suspend CoroutineScope.() -> T
+) : Data<T>, DynamicGoal<T>(context, dependencies, block)
+
+class StaticData<T : Any>(
+ value: T,
+ override val meta: Meta = EmptyMeta
+) : Data<T>, StaticGoal<T>(value) {
+ override val type: KClass<out T> get() = value::class
+}
class NamedData<out T : Any>(val name: String, data: Data<T>) : Data<T> by data
+fun <T : Any, R : Any> Data<T>.pipe(
+ outputType: KClass<out R>,
+ coroutineContext: CoroutineContext = EmptyCoroutineContext,
+ meta: Meta = this.meta,
+ block: suspend CoroutineScope.(T) -> R
+): Data<R> = DynamicData(outputType, meta, coroutineContext, listOf(this)) {
+ block(await(this))
+}
+
+
+/**
+ * Create a data pipe
+ */
+inline fun <T : Any, reified R : Any> Data<T>.pipe(
+ coroutineContext: CoroutineContext = EmptyCoroutineContext,
+ meta: Meta = this.meta,
+ noinline block: suspend CoroutineScope.(T) -> R
+): Data<R> = DynamicData(R::class, meta, coroutineContext, listOf(this)) {
+ block(await(this))
+}
+
+/**
+ * Create a joined data.
+ */
+inline fun <T : Any, reified R : Any> Collection<Data<T>>.join(
+ coroutineContext: CoroutineContext = EmptyCoroutineContext,
+ meta: Meta,
+ noinline block: suspend CoroutineScope.(Collection<T>) -> R
+): Data<R> = DynamicData(
+ R::class,
+ meta,
+ coroutineContext,
+ this
+) {
+ block(map { this.run { it.await(this) } })
+}
+
+fun <K, T : Any, R : Any> Map<K, Data<T>>.join(
+ outputType: KClass<out R>,
+ coroutineContext: CoroutineContext = EmptyCoroutineContext,
+ meta: Meta,
+ block: suspend CoroutineScope.(Map<K, T>) -> R
+): DynamicData<R> = DynamicData(
+ outputType,
+ meta,
+ coroutineContext,
+ this.values
+) {
+ block(mapValues { it.value.await(this) })
+}
+
+
+/**
+ * A joining of multiple data into a single one
+ * @param K type of the map key
+ * @param T type of the input goal
+ * @param R type of the result goal
+ */
+inline fun <K, T : Any, reified R : Any> Map<K, Data<T>>.join(
+ coroutineContext: CoroutineContext = EmptyCoroutineContext,
+ meta: Meta,
+ noinline block: suspend CoroutineScope.(Map<K, T>) -> R
+): DynamicData<R> = DynamicData(
+ R::class,
+ meta,
+ coroutineContext,
+ this.values
+) {
+ block(mapValues { it.value.await(this) })
+}
+
+
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataFilter.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataFilter.kt
index 6c920f57..a23b550d 100644
--- a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataFilter.kt
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataFilter.kt
@@ -20,10 +20,10 @@ class DataFilter(override val config: Config) : Specific {
* Apply meta-based filter to given data node
*/
fun <T : Any> DataNode<T>.filter(filter: DataFilter): DataNode<T> {
- val sourceNode = filter.from?.let { getNode(it.toName()) } ?: this@filter
+ val sourceNode = filter.from?.let { get(it.toName()).node } ?: this@filter
val regex = filter.pattern.toRegex()
val targetNode = DataTreeBuilder(type).apply {
- sourceNode.data().forEach { (name, data) ->
+ sourceNode.dataSequence().forEach { (name, data) ->
if (name.toString().matches(regex)) {
this[name] = data
}
diff --git a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataNode.kt b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataNode.kt
index 02fc6a9e..a407b512 100644
--- a/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataNode.kt
+++ b/dataforge-data/src/commonMain/kotlin/hep/dataforge/data/DataNode.kt
@@ -1,8 +1,26 @@
package hep.dataforge.data
import hep.dataforge.names.*
+import kotlinx.coroutines.CoroutineScope
+import kotlinx.coroutines.Job
+import kotlinx.coroutines.launch
+import kotlin.collections.component1
+import kotlin.collections.component2
+import kotlin.collections.set
import kotlin.reflect.KClass
+sealed class DataItem<out T : Any> {
+ abstract val type: KClass<out T>
+
+ class Node<out T : Any>(val value: DataNode<T>) : DataItem<T>() {
+ override val type: KClass<out T> get() = value.type
+ }
+
+ class Leaf<out T : Any>(val value: Data<T>) : DataItem<T>() {
+ override val type: KClass<out T> get() = value.type
+ }
+}
+
/**
* A tree-like data structure grouped into the node. All data inside the node must inherit its type
*/
@@ -13,93 +31,89 @@ interface DataNode<out T : Any> {
*/
val type: KClass<out T>
- /**
- * Get the specific data if it exists
- */
- operator fun get(name: Name): Data<T>?
-
- /**
- * Get a subnode with given name if it exists.
- */
- fun getNode(name: Name): DataNode<T>?
-
- /**
- * Walk the tree upside down and provide all data nodes with full names
- */
- fun data(): Sequence<Pair<Name, Data<T>>>
-
- /**
- * A sequence of all nodes in the tree walking upside down, excluding self
- */
- fun nodes(): Sequence<Pair<Name, DataNode<T>>>
-
- operator fun iterator(): Iterator<Pair<Name, Data<T>>> = data().iterator()
+ val items: Map<NameToken, DataItem<T>>
companion object {
const val TYPE = "dataNode"
fun <T : Any> build(type: KClass<out T>, block: DataTreeBuilder<T>.() -> Unit) =
- DataTreeBuilder(type).apply(block).build()
+ DataTreeBuilder(type).apply(block).build()
fun <T : Any> builder(type: KClass<out T>) = DataTreeBuilder(type)
}
}
-internal sealed class DataTreeItem<out T : Any> {
- class Node<out T : Any>(val tree: DataTree<T>) : DataTreeItem<T>()
- class Value<out T : Any>(val value: Data<T>) : DataTreeItem<T>()
+val <T : Any> DataItem<T>?.node: DataNode<T>? get() = (this as? DataItem.Node<T>)?.value
+val <T : Any> DataItem<T>?.data: Data<T>? get() = (this as? DataItem.Leaf<T>)?.value
+
+/**
+ * Start computation for all goals in data node
+ */
+fun DataNode<*>.startAll(scope: CoroutineScope): Unit = items.values.forEach {
+ when (it) {
+ is DataItem.Node<*> -> it.value.startAll(scope)
+ is DataItem.Leaf<*> -> it.value.start(scope)
+ }
}
+fun DataNode<*>.joinAll(scope: CoroutineScope): Job = scope.launch {
+ startAll(scope)
+ items.forEach {
+ when (val value = it.value) {
+ is DataItem.Node -> value.value.joinAll(this).join()
+ is DataItem.Leaf -> value.value.await(scope)
+ }
+ }
+}
+
+operator fun <T : Any> DataNode<T>.get(name: Name): DataItem<T>? = when (name.length) {
+ 0 -> error("Empty name")
+ 1 -> (items[name.first()] as? DataItem.Leaf)
+ else -> get(name.first()!!.asName()).node?.get(name.cutFirst())
+}
+
+/**
+ * Sequence of all children including nodes
+ */
+fun <T : Any> DataNode<T>.asSequence(): Sequence<Pair<Name, DataItem<T>>> = sequence {
+ items.forEach { (head, item) ->
+ yield(head.asName() to item)
+ if (item is DataItem.Node) {
+ val subSequence = item.value.asSequence()
+ .map { (name, data) -> (head.asName() + name) to data }
+ yieldAll(subSequence)
+ }
+ }
+}
+
+/**
+ * Sequence of data entries
+ */
+fun <T : Any> DataNode<T>.dataSequence(): Sequence<Pair<Name, Data<T>>> = sequence {
+ items.forEach { (head, item) ->
+ when (item) {
+ is DataItem.Leaf -> yield(head.asName() to item.value)
+ is DataItem.Node -> {
+ val subSequence = item.value.dataSequence()
+ .map { (name, data) -> (head.asName() + name) to data }
+ yieldAll(subSequence)
+ }
+ }
+ }
+}
+
+operator fun <T : Any> DataNode<T>.iterator(): Iterator<Pair<Name, DataItem<T>>> = asSequence().iterator()
+
class DataTree<out T : Any> internal constructor(
 override val type: KClass<out T>,
- private val items: Map<NameToken, DataTreeItem<T>>
+ override val items: Map<NameToken, DataItem<T>>
) : DataNode<T> {
//TODO add node-level meta?
-
- override fun get(name: Name): Data<T>? = when (name.length) {
- 0 -> error("Empty name")
- 1 -> (items[name.first()] as? DataTreeItem.Value)?.value
- else -> getNode(name.first()!!.asName())?.get(name.cutFirst())
- }
-
- override fun getNode(name: Name): DataTree<T>? = when (name.length) {
- 0 -> this
- 1 -> (items[name.first()] as? DataTreeItem.Node)?.tree
- else -> getNode(name.first()!!.asName())?.getNode(name.cutFirst())
- }
-
- override fun data(): Sequence<Pair<Name, Data<T>>> {
- return sequence {
- items.forEach { (head, tree) ->
- when (tree) {
- is DataTreeItem.Value -> yield(head.asName() to tree.value)
- is DataTreeItem.Node -> {
- val subSequence =
- tree.tree.data().map { (name, data) -> (head.asName() + name) to data }
- yieldAll(subSequence)
- }
- }
- }
- }
- }
-
- override fun nodes(): Sequence<Pair<Name, DataNode<T>>> {
- return sequence {
- items.forEach { (head, tree) ->
- if (tree is DataTreeItem.Node) {
- yield(head.asName() to tree.tree)
- val subSequence =
- tree.tree.nodes().map { (name, node) -> (head.asName() + name) to node }
- yieldAll(subSequence)
- }
- }
- }
- }
}
private sealed class DataTreeBuilderItem<out T : Any> {
 class Node<T : Any>(val tree: DataTreeBuilder<T>) : DataTreeBuilderItem<T>()
- class Value<T : Any>(val value: Data<T>) : DataTreeBuilderItem<T>()
+ class Leaf<T : Any>(val value: Data<T>) : DataTreeBuilderItem<T>()
}
/**
@@ -115,7 +129,7 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {
operator fun set(token: NameToken, data: Data<T>) {
if (map.containsKey(token)) error("Tree entry with name $token is not empty")
- map[token] = DataTreeBuilderItem.Value(data)
+ map[token] = DataTreeBuilderItem.Leaf(data)
}
private fun buildNode(token: NameToken): DataTreeBuilder {
@@ -152,6 +166,11 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {
operator fun set(name: Name, node: DataNode<T>) = set(name, node.builder())
+ operator fun set(name: Name, item: DataItem<T>) = when (item) {
+ is DataItem.Node -> set(name, item.value.builder())
+ is DataItem.Leaf -> set(name, item.value)
+ }
+
/**
* Append data to node
*/
@@ -162,14 +181,16 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {
*/
infix fun String.to(node: DataNode<T>) = set(toName(), node)
+ infix fun String.to(item: DataItem<T>) = set(toName(), item)
+
/**
* Build and append node
*/
infix fun String.to(block: DataTreeBuilder.() -> Unit) = set(toName(), DataTreeBuilder(type).apply(block))
- fun update(node: DataNode<T>){
- node.data().forEach {
+ fun update(node: DataNode<T>) {
+ node.dataSequence().forEach {
//TODO check if the place is occupied
this[it.first] = it.second
}
@@ -178,8 +199,8 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {
fun build(): DataTree<T> {
val resMap = map.mapValues { (_, value) ->
when (value) {
- is DataTreeBuilderItem.Value -> DataTreeItem.Value(value.value)
- is DataTreeBuilderItem.Node -> DataTreeItem.Node(value.tree.build())
+ is DataTreeBuilderItem.Leaf -> DataItem.Leaf(value.value)
+ is DataTreeBuilderItem.Node -> DataItem.Node(value.tree.build())
}
}
return DataTree(type, resMap)
@@ -190,27 +211,20 @@ class DataTreeBuilder