Merge pull request #15 from mipt-npm/dev
0.1.3
commit 632cc8dd9c

README.md (96)

@@ -0,0 +1,96 @@
# Questions and Answers #

In this section we cover the main ideas of DataForge in the form of questions and answers.

## General ##

**Q:** I have a lot of data to analyze. The analysis process is complicated, requires many stages, and the data flow is not always obvious. To top it off, the data size is huge, so I don't want to perform operations I don't need (calculate something I won't use, or calculate something twice). And yes, I need it to run in parallel, and probably on a remote computer. By the way, I am sick and tired of scripts that modify other scripts that control scripts. Can you help me?

**A:** Yes, that is precisely the problem DataForge was made to solve. It performs automated data manipulations with automatic optimization and parallelization. The important thing is that data processing recipes are written declaratively, so it is quite easy to run computations on a remote station. DataForge also guarantees reproducibility of analysis results.

<hr>
**Q:** How does it work?

**A:** At the core of DataForge lies the idea of a **metadata processor**. It builds on the observation that in order to analyze something you need the data itself plus some additional information about what that data represents and what the user wants as a result. This additional information is called metadata, and it can be organized in a regular structure (a tree of values, not unlike XML or JSON). The important thing is that this distinction leaves no place for user instructions (or scripts). Indeed, the idea behind DataForge is that one does not need imperative commands: the framework configures itself according to the input metadata and decides which operations should be performed in the most efficient way.
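
To make the idea concrete, here is a minimal self-contained Kotlin sketch of such a metadata tree. All names here (`Meta`, `MetaNode`, `node`, `value`) are invented for this illustration and are not the actual DataForge API:

```kotlin
// Illustration only (not the real DataForge Meta API): metadata is a tree
// of values, and the framework configures itself from this structure
// instead of executing user scripts.
sealed class Meta
data class MetaValue(val value: Any) : Meta()
data class MetaNode(val items: Map<String, Meta>) : Meta()

fun node(vararg items: Pair<String, Meta>): MetaNode = MetaNode(items.toMap())
fun value(v: Any): MetaValue = MetaValue(v)

/** Resolve a dot-separated path like "task.parallel" in the tree. */
fun Meta.get(path: String): Any? = path.split('.')
    .fold(this as Meta?) { m, key -> (m as? MetaNode)?.items?.get(key) }
    ?.let { (it as? MetaValue)?.value ?: it }

fun main() {
    val meta = node(
        "data" to node("path" to value("/data/run1"), "format" to value("csv")),
        "task" to node("name" to value("fit"), "parallel" to value(true))
    )
    // A metadata processor would read this declarative structure
    // and decide what to run, instead of being told how to run it.
    check(meta.get("task.parallel") == true)
    check(meta.get("data.format") == "csv")
}
```

The point of the sketch is that the whole "recipe" is plain data: it can be serialized, sent to a remote machine, diffed, or cached, which is what enables reproducibility and automatic optimization.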
<hr>

**Q:** But where do the algorithms come from?

**A:** Of course, the algorithms must be written somewhere; there is no magic here. The logic lives in specialized modules. Some modules are provided out of the box by the system core; others need to be developed for a specific problem.

<hr>

**Q:** So I still need to write the code? What is the difference then?

**A:** Yes, someone still needs to write the code, but not necessarily you. Simple operations can be performed using the provided core logic. Your group can also have one programmer writing the logic while everyone else uses it without any real programming expertise. Moreover, the framework is organized in such a way that when someone writes additional logic, they do not need to think about complicated things like parallel computing, resource handling, logging, or caching. Most of that is handled by DataForge.

<hr>
## Platform ##

**Q:** Which platform does DataForge use? Which operating systems does it work on?

**A:** DataForge is mostly written in Java and uses the JVM as its platform. It works on any system that supports a JVM (meaning almost any modern system, excluding some mobile platforms).

<hr>
**Q:** But Java... it is slow!

**A:** [It is not](https://stackoverflow.com/questions/2163411/is-java-really-slow/2163570#2163570). It lacks some hardware-specific optimizations and needs some additional time to start (due to JIT compilation), but otherwise it is at least as fast as other languages traditionally used in science. More importantly, its memory safety, tooling support and vast ecosystem make it a prime candidate for a data analysis framework.

<hr>
**Q:** Can I use my C++/Fortran/Python code in DataForge?

**A:** Yes, as long as the code can be called from Java. Most common languages have a bridge for Java access. There are no problems at all with compiled C/Fortran libraries. Python code can be called via one of the existing Python-Java interfaces. It is also planned to implement remote method invocation for common languages, so your Python or, say, Julia code could run in its native environment. The metadata processor paradigm makes this much easier to do.

<hr>
## Features ##
**Q:** What other features does DataForge provide?

**A:** Alongside metadata processing (and a lot of tools for metadata manipulation and layering), DataForge has two additional important concepts:

* **Modularisation**. Contrary to many other frameworks, DataForge is intrinsically modular. The only mandatory part is a rather tiny core module; everything else can be customized.

* **Context encapsulation**. Every DataForge task is executed in some context. The context isolates the task's environment, works as a dependency-injection base, and specifies the task's interaction with the external world.
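
As a rough illustration of context encapsulation, the following self-contained Kotlin sketch shows a task resolving its dependencies from a context instead of from global state. `SimpleContext`, `resolve` and `runTask` are invented names for this sketch, not the real DataForge API:

```kotlin
// Illustration only: a context isolates a task's environment and acts
// as a tiny dependency-injection container.
interface Logger { fun log(msg: String): String }

class SimpleContext(
    val name: String,
    private val services: Map<String, Any> = emptyMap()
) {
    /** Look up a service by key, failing loudly if it is not provided. */
    @Suppress("UNCHECKED_CAST")
    fun <T> resolve(key: String): T = services[key] as? T
        ?: error("Service '$key' is not available in context '$name'")
}

fun runTask(context: SimpleContext): String {
    // The task never touches globals; everything comes from its context.
    val logger: Logger = context.resolve("logger")
    return logger.log("task finished in ${context.name}")
}

fun main() {
    val ctx = SimpleContext(
        name = "workspace",
        services = mapOf("logger" to object : Logger {
            override fun log(msg: String) = "[workspace] $msg"
        })
    )
    check(runTask(ctx) == "[workspace] task finished in workspace")
}
```

Because the task only sees what its context provides, the same task can be re-run elsewhere (another machine, another environment) simply by building an equivalent context.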
<hr>
**Q:** OK, but now I want to work directly with my measuring devices. How can I do that?

**A:** The [dataforge-control](${site.url}/docs.html#control) module provides interfaces to interact with hardware. Out of the box it supports safe communication with TCP/IP and COM/tty based devices. Specific device declarations can be added via additional modules. It is also possible to maintain data storage with the [dataforge-storage](${site.url}/docs.html#storage) module.
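
For a flavor of what line-oriented TCP device communication involves, here is a hedged, self-contained Kotlin sketch tested against a local echo server standing in for a device. `TcpPort` is an invented name for this illustration, not the actual dataforge-control API:

```kotlin
import java.net.ServerSocket
import java.net.Socket
import kotlin.concurrent.thread

// Illustration only: a minimal line-oriented TCP "device port" of the
// kind a control module would wrap with safety and configuration logic.
class TcpPort(host: String, port: Int) : AutoCloseable {
    private val socket = Socket(host, port)
    private val writer = socket.getOutputStream().bufferedWriter()
    private val reader = socket.getInputStream().bufferedReader()

    /** Send one command line and block until the device answers one line. */
    fun query(command: String): String {
        writer.write(command)
        writer.newLine()
        writer.flush()
        return reader.readLine()
    }

    override fun close() = socket.close()
}

fun main() {
    // Stand-in "device": a local server that acknowledges every line.
    val server = ServerSocket(0)
    thread(isDaemon = true) {
        val client = server.accept()
        val out = client.getOutputStream()
        client.getInputStream().bufferedReader().forEachLine { line ->
            out.write("ACK $line\n".toByteArray())
            out.flush()
        }
    }
    TcpPort("localhost", server.localPort).use { port ->
        check(port.query("*IDN?") == "ACK *IDN?")
    }
}
```

A real control module would layer timeouts, reconnection and declarative device configuration (via metadata) on top of a primitive like this.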
<hr>
**Q:** Declarations and metadata are good, but I want my scripts back!

**A:** We can do that. [GRIND](${site.url}/docs.html#grind) provides a shell-like environment called GrindShell. It allows running imperative scripts with full access to all DataForge functionality. Grind scripts are basically context-encapsulated. There are also convenient feature wrappers called helpers that can be loaded into the shell when new feature modules are added.

<hr>
## Misc ##
**Q:** So everything looks great, can I replace my ROOT / other data analysis framework with DataForge?

**A:** One must note that DataForge is made for analysis, not for visualisation. Its visualisation and user-interaction capabilities are rather limited compared to frameworks like ROOT, JAS3 or DataMelt. The idea is to provide a reliable API and core functionality. In fact, JAS3 and DataMelt could be used as frontends for DataForge mechanics. It is planned to add an interface to ROOT via JFreeHep AIDA.

<hr>
**Q:** How does DataForge compare to cluster computation frameworks like Hadoop or Spark?

**A:** Again, it is not the purpose of DataForge to replace cluster software. DataForge has some internal parallelism mechanics and implementations, but they are most certainly worse than specially developed programs. Still, DataForge is not fixed on one single implementation: your favourite parallel processing tool can still be used as a back-end for DataForge, with the full benefit of its configuration tools and integrations and no performance overhead.

<hr>
**Q:** Is it possible to use DataForge in notebook mode?

**A:** Yes, it is. DataForge can be used as-is from the [beaker/beakerx](http://beakernotebook.com/) Groovy kernel with minor additional adjustments. It is planned to provide a separate DataForge kernel for `beakerx` which will automatically start a specific GRIND shell.

<hr>
**Q:** Can I use DataForge on a mobile platform?

**A:** DataForge is modular. The core and most of the API are pretty compact, so they can be used in Android applications. Some modules are designed for PC and cannot be used on other platforms. iPhone does not support Java and therefore can only use client-side DataForge applications.
@@ -1,18 +1,20 @@
val dataforgeVersion by extra("0.1.2")

allprojects {
    repositories {
        jcenter()
        maven("https://kotlin.bintray.com/kotlinx")
plugins {
    id("scientifik.mpp") version "0.1.4" apply false
    id("scientifik.publish") version "0.1.4" apply false
}

val dataforgeVersion by extra("0.1.3")

val bintrayRepo by extra("dataforge")
val githubProject by extra("dataforge-core")

allprojects {
    group = "hep.dataforge"
    version = dataforgeVersion
}

subprojects {
    if (name.startsWith("dataforge")) {
        apply(plugin = "npm-bintray")
        apply(plugin = "npm-artifactory")
        apply(plugin = "scientifik.publish")
    }
}
@@ -1,20 +0,0 @@
plugins {
    `kotlin-dsl`
}

repositories {
    gradlePluginPortal()
    jcenter()
}

val kotlinVersion = "1.3.31"

// Add plugins used in buildSrc as dependencies; versions should be specified only here
dependencies {
    implementation("org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlinVersion")
    implementation("org.jfrog.buildinfo:build-info-extractor-gradle:4.9.5")
    implementation("com.jfrog.bintray.gradle:gradle-bintray-plugin:1.8.4")
    implementation("org.jetbrains.dokka:dokka-gradle-plugin:0.9.18")
    implementation("com.moowork.gradle:gradle-node-plugin:1.3.1")
    implementation("org.openjfx:javafx-plugin:0.0.7")
}
@@ -1,9 +0,0 @@
// Instead of defining runtime properties and using them dynamically,
// define versions in buildSrc to get autocompletion and compile-time checks.
// The dependencies themselves can also be moved here.
object Versions {
    val ioVersion = "0.1.8"
    val coroutinesVersion = "1.2.1"
    val atomicfuVersion = "0.12.6"
    val serializationVersion = "0.11.0"
}
@@ -1,59 +0,0 @@
import org.jetbrains.dokka.gradle.DokkaTask

plugins {
    kotlin("multiplatform")
    id("org.jetbrains.dokka")
    `maven-publish`
}

kotlin {

    val dokka by tasks.getting(DokkaTask::class) {
        outputFormat = "html"
        outputDirectory = "$buildDir/javadoc"
        jdkVersion = 8

        kotlinTasks {
            // dokka fails to retrieve sources from MPP-tasks so we only define the jvm task
            listOf(tasks.getByPath("compileKotlinJvm"))
        }
        sourceRoot {
            // assuming only single source dir
            path = sourceSets["commonMain"].kotlin.srcDirs.first().toString()
            platforms = listOf("Common")
        }
        // although the JVM sources are now taken from the task,
        // we still define the jvm source root to get the JVM marker in the generated html
        sourceRoot {
            // assuming only single source dir
            path = sourceSets["jvmMain"].kotlin.srcDirs.first().toString()
            platforms = listOf("JVM")
        }
    }

    val javadocJar by tasks.registering(Jar::class) {
        dependsOn(dokka)
        archiveClassifier.set("javadoc")
        from("$buildDir/javadoc")
    }

    publishing {

        // publications.filterIsInstance<MavenPublication>().forEach { publication ->
        //     if (publication.name == "kotlinMultiplatform") {
        //         // for our root metadata publication, set artifactId with a package and project name
        //         publication.artifactId = project.name
        //     } else {
        //         // for targets, set artifactId with a package, project name and target name (e.g. iosX64)
        //         publication.artifactId = "${project.name}-${publication.name}"
        //     }
        // }

        targets.all {
            val publication = publications.findByName(name) as MavenPublication

            // Patch publications with fake javadoc
            publication.artifact(javadocJar.get())
        }
    }
}
@@ -1,44 +0,0 @@
import com.moowork.gradle.node.npm.NpmTask
import com.moowork.gradle.node.task.NodeTask
import org.gradle.kotlin.dsl.*
import org.jetbrains.kotlin.gradle.tasks.Kotlin2JsCompile

plugins {
    id("com.moowork.node")
    kotlin("multiplatform")
}

node {
    nodeModulesDir = file("$buildDir/node_modules")
}

val compileKotlinJs by tasks.getting(Kotlin2JsCompile::class)
val compileTestKotlinJs by tasks.getting(Kotlin2JsCompile::class)

val populateNodeModules by tasks.registering(Copy::class) {
    dependsOn(compileKotlinJs)
    from(compileKotlinJs.destinationDir)

    kotlin.js().compilations["test"].runtimeDependencyFiles.forEach {
        if (it.exists() && !it.isDirectory) {
            from(zipTree(it.absolutePath).matching { include("*.js") })
        }
    }

    into("$buildDir/node_modules")
}

val installMocha by tasks.registering(NpmTask::class) {
    setWorkingDir(buildDir)
    setArgs(listOf("install", "mocha"))
}

val runMocha by tasks.registering(NodeTask::class) {
    dependsOn(compileTestKotlinJs, populateNodeModules, installMocha)
    setScript(file("$buildDir/node_modules/mocha/bin/mocha"))
    setArgs(listOf(compileTestKotlinJs.outputFile))
}

tasks["jsTest"].dependsOn(runMocha)
@@ -1,38 +0,0 @@
import groovy.lang.GroovyObject
import org.jfrog.gradle.plugin.artifactory.dsl.PublisherConfig
import org.jfrog.gradle.plugin.artifactory.dsl.ResolverConfig

plugins {
    id("com.jfrog.artifactory")
}

artifactory {
    val artifactoryUser: String? by project
    val artifactoryPassword: String? by project
    val artifactoryContextUrl = "http://npm.mipt.ru:8081/artifactory"

    setContextUrl(artifactoryContextUrl) // The base Artifactory URL if not overridden by the publisher/resolver
    publish(delegateClosureOf<PublisherConfig> {
        repository(delegateClosureOf<GroovyObject> {
            setProperty("repoKey", "gradle-dev-local")
            setProperty("username", artifactoryUser)
            setProperty("password", artifactoryPassword)
        })

        defaults(delegateClosureOf<GroovyObject> {
            invokeMethod("publications", arrayOf("jvm", "js", "kotlinMultiplatform", "metadata"))
            //TODO: This property is not available for ArtifactoryTask
            //setProperty("publishBuildInfo", false)
            setProperty("publishArtifacts", true)
            setProperty("publishPom", true)
            setProperty("publishIvy", false)
        })
    })
    resolve(delegateClosureOf<ResolverConfig> {
        repository(delegateClosureOf<GroovyObject> {
            setProperty("repoKey", "gradle-dev")
            setProperty("username", artifactoryUser)
            setProperty("password", artifactoryPassword)
        })
    })
}
@@ -1,97 +0,0 @@
@file:Suppress("UnstableApiUsage")

import com.jfrog.bintray.gradle.BintrayExtension.PackageConfig
import com.jfrog.bintray.gradle.BintrayExtension.VersionConfig

// Old bintray.gradle script converted to a real Gradle plugin (precompiled script plugin)
// It now has its own dependencies and supports type-safe accessors
// Syntax is pretty close to what we had in Groovy
// (excluding Property.set and bintray dynamic configs)

plugins {
    id("com.jfrog.bintray")
    `maven-publish`
}

val vcs = "https://github.com/mipt-npm/kmath"

// Configure publishing
publishing {
    repositories {
        maven("https://bintray.com/mipt-npm/scientifik")
    }

    // Process each publication we have in this project
    publications.filterIsInstance<MavenPublication>().forEach { publication ->

        // use the type-safe pom config DSL instead of the old dynamic one
        publication.pom {
            name.set(project.name)
            description.set(project.description)
            url.set(vcs)

            licenses {
                license {
                    name.set("The Apache Software License, Version 2.0")
                    url.set("http://www.apache.org/licenses/LICENSE-2.0.txt")
                    distribution.set("repo")
                }
            }
            developers {
                developer {
                    id.set("MIPT-NPM")
                    name.set("MIPT nuclear physics methods laboratory")
                    organization.set("MIPT")
                    organizationUrl.set("http://npm.mipt.ru")
                }
            }
            scm {
                url.set(vcs)
            }
        }
    }
}

bintray {
    // delegates for runtime properties
    val bintrayUser: String? by project
    val bintrayApiKey: String? by project
    user = bintrayUser ?: System.getenv("BINTRAY_USER")
    key = bintrayApiKey ?: System.getenv("BINTRAY_API_KEY")
    publish = true
    override = true // for multi-platform Kotlin/Native publishing

    // We have to use delegateClosureOf because bintray supports only dynamic Groovy syntax;
    // this is a problem of this plugin
    pkg(delegateClosureOf<PackageConfig> {
        userOrg = "mipt-npm"
        repo = "scientifik"
        name = "scientifik.kmath"
        issueTrackerUrl = "https://github.com/mipt-npm/kmath/issues"
        setLicenses("Apache-2.0")
        vcsUrl = vcs
        version(delegateClosureOf<VersionConfig> {
            name = project.version.toString()
            vcsTag = project.version.toString()
            released = java.util.Date().toString()
        })
    })

    tasks {
        bintrayUpload {
            dependsOn(publishToMavenLocal)
            doFirst {
                setPublications(project.publishing.publications
                    .filterIsInstance<MavenPublication>()
                    .filter { !it.name.contains("-test") && it.name != "kotlinMultiplatform" }
                    .map {
                        println("""Uploading artifact "${it.groupId}:${it.artifactId}:${it.version}" from publication "${it.name}""")
                        it.name // https://github.com/bintray/gradle-bintray-plugin/issues/256
                    })
            }
        }
    }
}
@@ -1,86 +0,0 @@
import org.gradle.kotlin.dsl.*

plugins {
    kotlin("multiplatform")
    `maven-publish`
}

kotlin {
    jvm {
        compilations.all {
            kotlinOptions {
                jvmTarget = "1.8"
            }
        }
    }

    js {
        compilations.all {
            kotlinOptions {
                metaInfo = true
                sourceMap = true
                sourceMapEmbedSources = "always"
                moduleKind = "commonjs"
            }
        }

        compilations.named("main") {
            kotlinOptions {
                main = "call"
            }
        }
    }

    sourceSets {
        val commonMain by getting {
            dependencies {
                api(kotlin("stdlib"))
            }
        }
        val commonTest by getting {
            dependencies {
                implementation(kotlin("test-common"))
                implementation(kotlin("test-annotations-common"))
            }
        }
        val jvmMain by getting {
            dependencies {
                api(kotlin("stdlib-jdk8"))
            }
        }
        val jvmTest by getting {
            dependencies {
                implementation(kotlin("test"))
                implementation(kotlin("test-junit"))
            }
        }
        val jsMain by getting {
            dependencies {
                api(kotlin("stdlib-js"))
            }
        }
        val jsTest by getting {
            dependencies {
                implementation(kotlin("test-js"))
            }
        }
    }

    targets.all {
        sourceSets.all {
            languageSettings.progressiveMode = true
            languageSettings.enableLanguageFeature("InlineClasses")
        }
    }

    apply(plugin = "dokka-publish")

    // Apply JS test configuration
    val runJsTests by ext(false)

    if (runJsTests) {
        apply(plugin = "js-test")
    }

}
@@ -1,34 +1,31 @@
plugins {
    `npm-multiplatform`
    id("scientifik.mpp")
}

description = "Context and provider definitions"

val coroutinesVersion: String = Versions.coroutinesVersion
val coroutinesVersion: String = Scientifik.coroutinesVersion

kotlin {
    jvm()
    js()

    sourceSets {
        val commonMain by getting {
            dependencies {
                api(project(":dataforge-meta"))
                api(kotlin("reflect"))
                api("io.github.microutils:kotlin-logging-common:1.6.10")
                api("io.github.microutils:kotlin-logging-common:1.7.2")
                api("org.jetbrains.kotlinx:kotlinx-coroutines-core-common:$coroutinesVersion")
            }
        }
        val jvmMain by getting {
            dependencies {
                api("io.github.microutils:kotlin-logging:1.6.10")
                api("io.github.microutils:kotlin-logging:1.7.2")
                api("ch.qos.logback:logback-classic:1.2.3")
                api("org.jetbrains.kotlinx:kotlinx-coroutines-core:$coroutinesVersion")
            }
        }
        val jsMain by getting {
            dependencies {
                api("io.github.microutils:kotlin-logging-js:1.6.10")
                api("io.github.microutils:kotlin-logging-js:1.7.2")
                api("org.jetbrains.kotlinx:kotlinx-coroutines-core-js:$coroutinesVersion")
            }
        }
@@ -3,6 +3,7 @@ package hep.dataforge.context
import hep.dataforge.meta.EmptyMeta
import hep.dataforge.meta.Meta
import hep.dataforge.names.Name
import hep.dataforge.names.toName

abstract class AbstractPlugin(override val meta: Meta = EmptyMeta) : Plugin {
    private var _context: Context? = null
@@ -18,7 +19,9 @@ abstract class AbstractPlugin(override val meta: Meta = EmptyMeta) : Plugin {
        this._context = null
    }

    override fun provideTop(target: String, name: Name): Any? = null
    override fun provideTop(target: String): Map<Name, Any> = emptyMap()

    override fun listNames(target: String): Sequence<Name> = emptySequence()
    companion object {
        fun <T : Named> Collection<T>.toMap(): Map<Name, T> = associate { it.name.toName() to it }
    }
}
@@ -26,8 +26,10 @@ import kotlin.jvm.JvmName
 * Since plugins could contain mutable state, context has two states: active and inactive. No changes are allowed to an active context.
 * @author Alexander Nozik
 */
open class Context(final override val name: String, val parent: Context? = Global) : Named, MetaRepr, Provider,
    CoroutineScope {
open class Context(
    final override val name: String,
    val parent: Context? = Global
) : Named, MetaRepr, Provider, CoroutineScope {

    private val config = Config()

@@ -59,19 +61,11 @@ open class Context(final override val name: String, val parent: Context? = Globa

    override val defaultTarget: String get() = Plugin.PLUGIN_TARGET

    override fun provideTop(target: String, name: Name): Any? {
    override fun provideTop(target: String): Map<Name, Any> {
        return when (target) {
            Plugin.PLUGIN_TARGET -> plugins[PluginTag.fromString(name.toString())]
            Value.TYPE -> properties[name]?.value
            else -> null
        }
    }

    override fun listNames(target: String): Sequence<Name> {
        return when (target) {
            Plugin.PLUGIN_TARGET -> plugins.asSequence().map { it.name.toName() }
            Value.TYPE -> properties.values().map { it.first }
            else -> emptySequence()
            Value.TYPE -> properties.sequence().toMap()
            Plugin.PLUGIN_TARGET -> plugins.sequence(true).associateBy { it.name.toName() }
            else -> emptyMap()
        }
    }

@@ -116,11 +110,11 @@ open class Context(final override val name: String, val parent: Context? = Globa
    }
}

/**
 * A sequence of all objects provided by plugins with given target and type
 */
fun Context.content(target: String): Map<Name, Any> = content<Any>(target)

/**
 * A map of all objects provided by plugins with given target and type
 */
@JvmName("typedContent")
inline fun <reified T : Any> Context.content(target: String): Map<Name, T> =
    plugins.flatMap { plugin ->

@@ -148,14 +142,18 @@ object Global : Context("GLOBAL", null) {
    private val contextRegistry = HashMap<String, Context>()

    /**
     * Get previously builder context o builder a new one
     * Get previously built context
     *
     * @param name
     * @return
     */
    fun getContext(name: String): Context {
        return contextRegistry.getOrPut(name) { Context(name) }
    fun getContext(name: String): Context? {
        return contextRegistry[name]
    }

    fun context(name: String, parent: Context = this, block: ContextBuilder.() -> Unit = {}): Context =
        ContextBuilder(name, parent).apply(block).build()

}
@@ -18,11 +18,15 @@ class ContextBuilder(var name: String = "@anonimous", val parent: Context = Glob
        plugins.add(plugin)
    }

    fun plugin(tag: PluginTag, action: MetaBuilder.() -> Unit) {
    fun plugin(tag: PluginTag, action: MetaBuilder.() -> Unit = {}) {
        plugins.add(PluginRepository.fetch(tag, buildMeta(action)))
    }

    fun plugin(name: String, group: String = "", version: String = "", action: MetaBuilder.() -> Unit) {
    fun plugin(builder: PluginFactory<*>, action: MetaBuilder.() -> Unit = {}) {
        plugins.add(builder.invoke(buildMeta(action)))
    }

    fun plugin(name: String, group: String = "", version: String = "", action: MetaBuilder.() -> Unit = {}) {
        plugin(PluginTag(name, group, version), action)
    }

@@ -51,7 +51,10 @@ class PluginManager(override val context: Context) : ContextAware, Iterable<Plug

    /**
     * Find a loaded plugin via its class
     * Find a loaded plugin via its class. This method does not check if the result is unique and just returns the first
     * plugin matching the class condition.
     * For safe search, provide a tag, since tags are checked on load and plugins with the same tag are not allowed
     * in the same context.
     *
     * @param tag
     * @param type
@@ -59,15 +62,16 @@ class PluginManager(override val context: Context) : ContextAware, Iterable<Plug
     * @return
     */
    @Suppress("UNCHECKED_CAST")
    operator fun <T : Plugin> get(type: KClass<T>, recursive: Boolean = true): T? =
        get(recursive) { type.isInstance(it) } as T?
    operator fun <T : Any> get(type: KClass<T>, tag: PluginTag? = null, recursive: Boolean = true): T? =
        get(recursive) { type.isInstance(it) && (tag == null || tag.matches(it.tag)) } as T?

    inline fun <reified T : Plugin> get(recursive: Boolean = true): T? = get(T::class, recursive)
    inline fun <reified T : Any> get(tag: PluginTag? = null, recursive: Boolean = true): T? =
        get(T::class, tag, recursive)

    /**
     * Load given plugin into this manager and return the loaded instance.
     * Throw error if plugin of the same class already exists in manager
     * Throw error if plugin of the same type and tag already exists in the manager.
     *
     * @param plugin
     * @return
@@ -75,10 +79,12 @@ class PluginManager(override val context: Context) : ContextAware, Iterable<Plug
    fun <T : Plugin> load(plugin: T): T {
        if (context.isActive) error("Can't load plugin into active context")

        if (get(plugin::class, false) != null) {
            throw RuntimeException("Plugin of type ${plugin::class} already exists in ${context.name}")
        if (get(plugin::class, plugin.tag, recursive = false) != null) {
            error("Plugin of type ${plugin::class} already exists in ${context.name}")
        } else {
            loadDependencies(plugin)
            for (tag in plugin.dependsOn()) {
                fetch(tag, true)
            }

            logger.info { "Loading plugin ${plugin.name} into ${context.name}" }
            plugin.attach(context)
@@ -87,11 +93,14 @@ class PluginManager(override val context: Context) : ContextAware, Iterable<Plug
        }
    }

    private fun loadDependencies(plugin: Plugin) {
        for (tag in plugin.dependsOn()) {
            load(tag)
        }
    }
    /**
     * Load a plugin using its factory
     */
    fun <T : Plugin> load(factory: PluginFactory<T>, meta: Meta = EmptyMeta): T =
        load(factory(meta))

    fun <T : Plugin> load(factory: PluginFactory<T>, metaBuilder: MetaBuilder.() -> Unit): T =
        load(factory, buildMeta(metaBuilder))

    /**
     * Remove a plugin from [PluginManager]
@@ -107,22 +116,11 @@ class PluginManager(override val context: Context) : ContextAware, Iterable<Plug
    }

    /**
     * Get plugin instance via plugin resolver and load it.
     * Get an existing plugin with given meta or load a new one using the provided factory
     *
     * @param tag
     * @return
     */
    fun load(tag: PluginTag, meta: Meta = EmptyMeta): Plugin {
        val loaded = get(tag, false)
        return when {
            loaded == null -> load(PluginRepository.fetch(tag, meta))
            loaded.meta == meta -> loaded // if meta is the same, return existing plugin
            else -> throw RuntimeException("Can't load plugin with tag $tag. Plugin with this tag and different configuration already exists in context.")
        }
    }

    fun load(factory: PluginFactory<*>, meta: Meta = EmptyMeta): Plugin {
        val loaded = get(factory.tag, false)
    fun <T : Plugin> fetch(factory: PluginFactory<T>, recursive: Boolean = true, meta: Meta = EmptyMeta): T {
        val loaded = get(factory.type, factory.tag, recursive)
        return when {
            loaded == null -> load(factory(meta))
            loaded.meta == meta -> loaded // if meta is the same, return existing plugin
@@ -130,42 +128,12 @@ class PluginManager(override val context: Context) : ContextAware, Iterable<Plug
        }
    }

    /**
     * Load plugin by its class and meta. Ignore if plugin with this meta is already loaded.
     * Throw an exception if there exists a plugin with the same type but different meta
     */
    fun <T : Plugin> load(type: KClass<T>, meta: Meta = EmptyMeta): T {
        val loaded = get(type, false)
        return when {
            loaded == null -> {
                val plugin = PluginRepository.list().first { it.type == type }.invoke(meta)
                if (type.isInstance(plugin)) {
                    @Suppress("UNCHECKED_CAST")
                    load(plugin as T)
                } else {
                    error("Corrupt type information in plugin repository")
                }
            }
            loaded.meta == meta -> loaded // if meta is the same, return existing plugin
            else -> throw RuntimeException("Can't load plugin with type $type. Plugin with this type and different configuration already exists in context.")
        }
    }

    inline fun <reified T : Plugin> load(noinline metaBuilder: MetaBuilder.() -> Unit = {}): T {
        return load(T::class, buildMeta(metaBuilder))
    }

    fun load(name: String, meta: Meta = EmptyMeta): Plugin {
        return load(PluginTag.fromString(name), meta)
    }
    fun <T : Plugin> fetch(
        factory: PluginFactory<T>,
        recursive: Boolean = true,
        metaBuilder: MetaBuilder.() -> Unit
    ): T = fetch(factory, recursive, buildMeta(metaBuilder))

    override fun iterator(): Iterator<Plugin> = plugins.iterator()

    /**
     * Get a plugin if it exists or load it with given meta if it is not.
     */
    inline fun <reified T : Plugin> getOrLoad(noinline metaBuilder: MetaBuilder.() -> Unit = {}): T {
        return get(true) ?: load(metaBuilder)
    }

}
@@ -17,7 +17,6 @@ package hep.dataforge.provider

import hep.dataforge.names.Name
import hep.dataforge.names.toName
import kotlin.jvm.JvmName

/**
* A marker utility interface for providers.
@@ -42,29 +41,21 @@ interface Provider {


/**
* Provide a top level element for this [Provider] or return null if element is not present
* A map of direct children for specific target
*/
fun provideTop(target: String, name: Name): Any?

/**
* [Sequence] of available names with given target. Only top level names are listed, no chain path.
*
* @param target
* @return
*/
fun listNames(target: String): Sequence<Name>
fun provideTop(target: String): Map<Name, Any>
}

fun Provider.provide(path: Path, targetOverride: String? = null): Any? {
if (path.length == 0) throw IllegalArgumentException("Can't provide by empty path")
val first = path.first()
val top = provideTop(targetOverride ?: first.target ?: defaultTarget, first.name)
val target = targetOverride ?: first.target ?: defaultTarget
val res = provideTop(target)[first.name] ?: return null
return when (path.length) {
1 -> top
1 -> res
else -> {
when (top) {
null -> null
is Provider -> top.provide(path.tail!!, targetOverride = defaultChainTarget)
when (res) {
is Provider -> res.provide(path.tail!!, targetOverride = defaultChainTarget)
else -> throw IllegalStateException("Chain path not supported: child is not a provider")
}
}
@@ -86,14 +77,11 @@ inline fun <reified T : Any> Provider.provide(target: String, name: String): T?
provide(target, name.toName())

/**
* A top level content with names
* Typed top level content
*/
fun Provider.top(target: String): Map<Name, Any> = top<Any>(target)

@JvmName("typedTop")
inline fun <reified T : Any> Provider.top(target: String): Map<Name, T> {
return listNames(target).associate {
it to (provideTop(target, it) as? T ?: error("The element $it is declared but not provided"))
return provideTop(target).mapValues {
it.value as? T ?: error("The type of element $it is ${it::class} but ${T::class} is expected")
}
}

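A minimal sketch of the new contract (the `ConstantsProvider` class and the "constant" target are invented, and the remaining `Provider` members are assumed to keep their default implementations): `provideTop` now returns the whole child map for a target, and the typed `top` helper checks each value's type:

```kotlin
// Hypothetical provider exposing a single target.
class ConstantsProvider : Provider {
    override fun provideTop(target: String): Map<Name, Any> = when (target) {
        "constant" -> mapOf("pi".toName() to 3.1415926)
        else -> emptyMap()
    }
}

// Typed access: every value is cast to Double, or a descriptive error is raised.
val constants: Map<Name, Double> = ConstantsProvider().top("constant")
```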
@@ -12,17 +12,10 @@ class ContextTest {
class DummyPlugin : AbstractPlugin() {
override val tag get() = PluginTag("test")

override fun provideTop(target: String, name: Name): Any? {
override fun provideTop(target: String): Map<Name, Any> {
return when(target){
"test" -> return name
else -> super.provideTop(target, name)
}
}

override fun listNames(target: String): Sequence<Name> {
return when (target) {
"test" -> sequenceOf("a", "b", "c.d").map { it.toName() }
else -> super.listNames(target)
"test" -> listOf("a", "b", "c.d").associate { it.toName() to it.toName() }
else -> emptyMap()
}
}
}
@@ -34,9 +34,7 @@ inline fun <reified T : Any> Provider.provideByType(name: Name): T? {

inline fun <reified T : Any> Provider.top(): Map<Name, T> {
val target = Types[T::class]
return listNames(target).associate { name ->
name to (provideByType<T>(name) ?: error("The element $name is declared but not provided"))
}
return top(target)
}

/**
@@ -1,17 +1,14 @@
plugins {
`npm-multiplatform`
id("scientifik.mpp")
}

val coroutinesVersion: String = Versions.coroutinesVersion
val coroutinesVersion: String = Scientifik.coroutinesVersion

kotlin {
jvm()
js()
sourceSets {
val commonMain by getting{
dependencies {
api(project(":dataforge-meta"))
api(kotlin("reflect"))
api("org.jetbrains.kotlinx:kotlinx-coroutines-core-common:$coroutinesVersion")
}
}
@@ -19,6 +16,7 @@ kotlin {
val jvmMain by getting{
dependencies {
api("org.jetbrains.kotlinx:kotlinx-coroutines-core:$coroutinesVersion")
api(kotlin("reflect"))
}
}

@@ -1,7 +1,6 @@
package hep.dataforge.data

import hep.dataforge.meta.Meta
import hep.dataforge.names.Name

/**
* A simple data transformation on a data node
@@ -34,19 +33,3 @@ infix fun <T : Any, I : Any, R : Any> Action<T, I>.then(action: Action<I, R>): A
}
}


///**
// * An action that performs the same transformation on each of input data nodes. Null results are ignored.
// * The transformation is non-suspending because it is lazy.
// */
//class PipeAction<in T : Any, out R : Any>(val transform: (Name, Data<T>, Meta) -> Data<R>?) : Action<T, R> {
// override fun invoke(node: DataNode<T>, meta: Meta): DataNode<R> = DataNode.build {
// node.data().forEach { (name, data) ->
// val res = transform(name, data, meta)
// if (res != null) {
// set(name, res)
// }
// }
// }
//}

@@ -0,0 +1,48 @@
package hep.dataforge.data

import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Job
import kotlin.coroutines.CoroutineContext

/**
* A monitor of goal state that could be accessed only from inside the goal
*/
class CoroutineMonitor : CoroutineContext.Element {
override val key: CoroutineContext.Key<*> get() = CoroutineMonitor

var totalWork: Double = 1.0
var workDone: Double = 0.0
var status: String = ""

/**
* Mark the goal as started
*/
fun start() {

}

/**
* Mark the goal as completed
*/
fun finish() {
workDone = totalWork
}

companion object : CoroutineContext.Key<CoroutineMonitor>
}

class Dependencies(val values: Collection<Job>) : CoroutineContext.Element {
override val key: CoroutineContext.Key<*> get() = Dependencies

companion object : CoroutineContext.Key<Dependencies>
}

val CoroutineContext.monitor: CoroutineMonitor? get() = this[CoroutineMonitor]
val CoroutineScope.monitor: CoroutineMonitor? get() = coroutineContext.monitor

val Job.dependencies: Collection<Job> get() = this[Dependencies]?.values ?: emptyList()

val Job.totalWork: Double get() = dependencies.sumByDouble { it.totalWork } + (monitor?.totalWork ?: 0.0)
val Job.workDone: Double get() = dependencies.sumByDouble { it.workDone } + (monitor?.workDone ?: 0.0)
val Job.status: String get() = monitor?.status ?: ""
val Job.progress: Double get() = workDone / totalWork
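A sketch of how a goal body might report progress through these context elements (the work loop is invented; only `monitor`, `totalWork`, `workDone`, and `finish` come from the file above, and `kotlinx.coroutines.coroutineScope` is assumed to be imported):

```kotlin
suspend fun crunchNumbers() = coroutineScope {
    // The monitor is non-null only when the coroutine was started
    // with a CoroutineMonitor in its context.
    monitor?.totalWork = 100.0
    repeat(100) {
        // ... one unit of hypothetical work ...
        monitor?.let { it.workDone += 1.0 }
    }
    monitor?.finish()
}
```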
@@ -1,14 +1,17 @@
package hep.dataforge.data

import hep.dataforge.meta.EmptyMeta
import hep.dataforge.meta.Meta
import hep.dataforge.meta.MetaRepr
import kotlinx.coroutines.CoroutineScope
import kotlin.coroutines.CoroutineContext
import kotlin.coroutines.EmptyCoroutineContext
import kotlin.reflect.KClass

/**
* A data element characterized by its meta
*/
interface Data<out T : Any> : MetaRepr {
interface Data<out T : Any> : Goal<T>, MetaRepr {
/**
* Type marker for the data. The type is known before the calculation takes place so it could be checked.
*/
@@ -18,52 +21,148 @@ interface Data<out T : Any> : MetaRepr {
*/
val meta: Meta

/**
* Lazy data value
*/
val goal: Goal<T>

override fun toMeta(): Meta = meta

companion object {
const val TYPE = "data"

fun <T : Any> of(type: KClass<out T>, goal: Goal<T>, meta: Meta): Data<T> = DataImpl(type, goal, meta)
operator fun <T : Any> invoke(
type: KClass<out T>,
meta: Meta = EmptyMeta,
context: CoroutineContext = EmptyCoroutineContext,
dependencies: Collection<Data<*>> = emptyList(),
block: suspend CoroutineScope.() -> T
): Data<T> = DynamicData(type, meta, context, dependencies, block)

inline fun <reified T : Any> of(goal: Goal<T>, meta: Meta): Data<T> = of(T::class, goal, meta)
operator inline fun <reified T : Any> invoke(
meta: Meta = EmptyMeta,
context: CoroutineContext = EmptyCoroutineContext,
dependencies: Collection<Data<*>> = emptyList(),
noinline block: suspend CoroutineScope.() -> T
): Data<T> = invoke(T::class, meta, context, dependencies, block)

fun <T : Any> of(name: String, type: KClass<out T>, goal: Goal<T>, meta: Meta): Data<T> =
NamedData(name, of(type, goal, meta))
operator fun <T : Any> invoke(
name: String,
type: KClass<out T>,
meta: Meta = EmptyMeta,
context: CoroutineContext = EmptyCoroutineContext,
dependencies: Collection<Data<*>> = emptyList(),
block: suspend CoroutineScope.() -> T
): Data<T> = NamedData(name, invoke(type, meta, context, dependencies, block))

inline fun <reified T : Any> of(name: String, goal: Goal<T>, meta: Meta): Data<T> =
of(name, T::class, goal, meta)
operator inline fun <reified T : Any> invoke(
name: String,
meta: Meta = EmptyMeta,
context: CoroutineContext = EmptyCoroutineContext,
dependencies: Collection<Data<*>> = emptyList(),
noinline block: suspend CoroutineScope.() -> T
): Data<T> =
invoke(name, T::class, meta, context, dependencies, block)

fun <T : Any> static(scope: CoroutineScope, value: T, meta: Meta): Data<T> =
DataImpl(value::class, Goal.static(scope, value), meta)
fun <T : Any> static(value: T, meta: Meta = EmptyMeta): Data<T> =
StaticData(value, meta)
}
}


fun <R : Any, T : R> Data<T>.cast(type: KClass<R>): Data<R> {
return object : Data<R> by this {
override val type: KClass<out R> = type
}
}

/**
* Upcast a [Data] to a supertype
*/
inline fun <reified R : Any, reified T : R> Data<T>.cast(): Data<R> {
return Data.of(R::class, goal, meta)
}
inline fun <reified R : Any, T : R> Data<T>.cast(): Data<R> = cast(R::class)

fun <R : Any, T : R> Data<T>.cast(type: KClass<R>): Data<R> {
return Data.of(type, goal, meta)
}

suspend fun <T : Any> Data<T>.await(): T = goal.await()

/**
* Generic Data implementation
*/
private class DataImpl<out T : Any>(
class DynamicData<T : Any>(
override val type: KClass<out T>,
override val goal: Goal<T>,
override val meta: Meta
) : Data<T>
override val meta: Meta = EmptyMeta,
context: CoroutineContext = EmptyCoroutineContext,
dependencies: Collection<Data<*>> = emptyList(),
block: suspend CoroutineScope.() -> T
) : Data<T>, DynamicGoal<T>(context, dependencies, block)

class StaticData<T : Any>(
value: T,
override val meta: Meta = EmptyMeta
) : Data<T>, StaticGoal<T>(value) {
override val type: KClass<out T> get() = value::class
}

class NamedData<out T : Any>(val name: String, data: Data<T>) : Data<T> by data

fun <T : Any, R : Any> Data<T>.pipe(
outputType: KClass<out R>,
coroutineContext: CoroutineContext = EmptyCoroutineContext,
meta: Meta = this.meta,
block: suspend CoroutineScope.(T) -> R
): Data<R> = DynamicData(outputType, meta, coroutineContext, listOf(this)) {
block(await(this))
}


/**
* Create a data pipe
*/
inline fun <T : Any, reified R : Any> Data<T>.pipe(
coroutineContext: CoroutineContext = EmptyCoroutineContext,
meta: Meta = this.meta,
noinline block: suspend CoroutineScope.(T) -> R
): Data<R> = DynamicData(R::class, meta, coroutineContext, listOf(this)) {
block(await(this))
}

/**
* Create a joined data.
*/
inline fun <T : Any, reified R : Any> Collection<Data<T>>.join(
coroutineContext: CoroutineContext = EmptyCoroutineContext,
meta: Meta,
noinline block: suspend CoroutineScope.(Collection<T>) -> R
): Data<R> = DynamicData(
R::class,
meta,
coroutineContext,
this
) {
block(map { this.run { it.await(this) } })
}

fun <K, T : Any, R : Any> Map<K, Data<T>>.join(
outputType: KClass<out R>,
coroutineContext: CoroutineContext = EmptyCoroutineContext,
meta: Meta,
block: suspend CoroutineScope.(Map<K, T>) -> R
): DynamicData<R> = DynamicData(
outputType,
meta,
coroutineContext,
this.values
) {
block(mapValues { it.value.await(this) })
}


/**
* A joining of multiple data into a single one
* @param K type of the map key
* @param T type of the input goal
* @param R type of the result goal
*/
inline fun <K, T : Any, reified R : Any> Map<K, Data<T>>.join(
coroutineContext: CoroutineContext = EmptyCoroutineContext,
meta: Meta,
noinline block: suspend CoroutineScope.(Map<K, T>) -> R
): DynamicData<R> = DynamicData(
R::class,
meta,
coroutineContext,
this.values
) {
block(mapValues { it.value.await(this) })
}

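The new factories compose as in this sketch (the values are invented; the calls mirror the API above). Construction is cheap because every `Data` is now a cold `Goal` that only runs when some scope awaits it:

```kotlin
val raw: Data<Int> = Data.static(21)

// pipe produces a new lazy Data with `raw` as its only dependency.
val doubled: Data<Int> = raw.pipe { it * 2 }

// join merges a collection into a single Data.
val sum: Data<Int> = listOf(raw, doubled).join(meta = EmptyMeta) { it.sum() }

// Nothing executes until awaited from a scope, e.g.:
// runBlocking { println(sum.await(this)) }
```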
@@ -20,10 +20,10 @@ class DataFilter(override val config: Config) : Specific {
* Apply meta-based filter to given data node
*/
fun <T : Any> DataNode<T>.filter(filter: DataFilter): DataNode<T> {
val sourceNode = filter.from?.let { getNode(it.toName()) } ?: this@filter
val sourceNode = filter.from?.let { get(it.toName()).node } ?: this@filter
val regex = filter.pattern.toRegex()
val targetNode = DataTreeBuilder(type).apply {
sourceNode.data().forEach { (name, data) ->
sourceNode.dataSequence().forEach { (name, data) ->
if (name.toString().matches(regex)) {
this[name] = data
}
@@ -1,8 +1,26 @@
package hep.dataforge.data

import hep.dataforge.names.*
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Job
import kotlinx.coroutines.launch
import kotlin.collections.component1
import kotlin.collections.component2
import kotlin.collections.set
import kotlin.reflect.KClass

sealed class DataItem<out T : Any> {
abstract val type: KClass<out T>

class Node<out T : Any>(val value: DataNode<T>) : DataItem<T>() {
override val type: KClass<out T> get() = value.type
}

class Leaf<out T : Any>(val value: Data<T>) : DataItem<T>() {
override val type: KClass<out T> get() = value.type
}
}

/**
* A tree-like data structure grouped into the node. All data inside the node must inherit its type
*/
@@ -13,93 +31,89 @@ interface DataNode<out T : Any> {
*/
val type: KClass<out T>

/**
* Get the specific data if it exists
*/
operator fun get(name: Name): Data<T>?

/**
* Get a subnode with given name if it exists.
*/
fun getNode(name: Name): DataNode<T>?

/**
* Walk the tree upside down and provide all data nodes with full names
*/
fun data(): Sequence<Pair<Name, Data<T>>>

/**
* A sequence of all nodes in the tree walking upside down, excluding self
*/
fun nodes(): Sequence<Pair<Name, DataNode<T>>>

operator fun iterator(): Iterator<Pair<Name, Data<T>>> = data().iterator()
val items: Map<NameToken, DataItem<T>>

companion object {
const val TYPE = "dataNode"

fun <T : Any> build(type: KClass<out T>, block: DataTreeBuilder<T>.() -> Unit) =
DataTreeBuilder<T>(type).apply(block).build()
DataTreeBuilder(type).apply(block).build()

fun <T : Any> builder(type: KClass<out T>) = DataTreeBuilder(type)
}
}

internal sealed class DataTreeItem<out T : Any> {
class Node<out T : Any>(val tree: DataTree<T>) : DataTreeItem<T>()
class Value<out T : Any>(val value: Data<T>) : DataTreeItem<T>()
val <T : Any> DataItem<T>?.node: DataNode<T>? get() = (this as? DataItem.Node<T>)?.value
val <T : Any> DataItem<T>?.data: Data<T>? get() = (this as? DataItem.Leaf<T>)?.value

/**
* Start computation for all goals in data node
*/
fun DataNode<*>.startAll(scope: CoroutineScope): Unit = items.values.forEach {
when (it) {
is DataItem.Node<*> -> it.value.startAll(scope)
is DataItem.Leaf<*> -> it.value.start(scope)
}
}

fun DataNode<*>.joinAll(scope: CoroutineScope): Job = scope.launch {
startAll(scope)
items.forEach {
when (val value = it.value) {
is DataItem.Node -> value.value.joinAll(this).join()
is DataItem.Leaf -> value.value.await(scope)
}
}
}

operator fun <T : Any> DataNode<T>.get(name: Name): DataItem<T>? = when (name.length) {
0 -> error("Empty name")
1 -> (items[name.first()] as? DataItem.Leaf)
else -> get(name.first()!!.asName()).node?.get(name.cutFirst())
}

/**
* Sequence of all children including nodes
*/
fun <T : Any> DataNode<T>.asSequence(): Sequence<Pair<Name, DataItem<T>>> = sequence {
items.forEach { (head, item) ->
yield(head.asName() to item)
if (item is DataItem.Node) {
val subSequence = item.value.asSequence()
.map { (name, data) -> (head.asName() + name) to data }
yieldAll(subSequence)
}
}
}

/**
* Sequence of data entries
*/
fun <T : Any> DataNode<T>.dataSequence(): Sequence<Pair<Name, Data<T>>> = sequence {
items.forEach { (head, item) ->
when (item) {
is DataItem.Leaf -> yield(head.asName() to item.value)
is DataItem.Node -> {
val subSequence = item.value.dataSequence()
.map { (name, data) -> (head.asName() + name) to data }
yieldAll(subSequence)
}
}
}
}

operator fun <T : Any> DataNode<T>.iterator(): Iterator<Pair<Name, DataItem<T>>> = asSequence().iterator()

class DataTree<out T : Any> internal constructor(
override val type: KClass<out T>,
private val items: Map<NameToken, DataTreeItem<T>>
override val items: Map<NameToken, DataItem<T>>
) : DataNode<T> {
//TODO add node-level meta?

override fun get(name: Name): Data<T>? = when (name.length) {
0 -> error("Empty name")
1 -> (items[name.first()] as? DataTreeItem.Value)?.value
else -> getNode(name.first()!!.asName())?.get(name.cutFirst())
}

override fun getNode(name: Name): DataTree<T>? = when (name.length) {
0 -> this
1 -> (items[name.first()] as? DataTreeItem.Node)?.tree
else -> getNode(name.first()!!.asName())?.getNode(name.cutFirst())
}

override fun data(): Sequence<Pair<Name, Data<T>>> {
return sequence {
items.forEach { (head, tree) ->
when (tree) {
is DataTreeItem.Value -> yield(head.asName() to tree.value)
is DataTreeItem.Node -> {
val subSequence =
tree.tree.data().map { (name, data) -> (head.asName() + name) to data }
yieldAll(subSequence)
}
}
}
}
}

override fun nodes(): Sequence<Pair<Name, DataNode<T>>> {
return sequence {
items.forEach { (head, tree) ->
if (tree is DataTreeItem.Node) {
yield(head.asName() to tree.tree)
val subSequence =
tree.tree.nodes().map { (name, node) -> (head.asName() + name) to node }
yieldAll(subSequence)
}
}
}
}
}

private sealed class DataTreeBuilderItem<out T : Any> {
class Node<T : Any>(val tree: DataTreeBuilder<T>) : DataTreeBuilderItem<T>()
class Value<T : Any>(val value: Data<T>) : DataTreeBuilderItem<T>()
class Leaf<T : Any>(val value: Data<T>) : DataTreeBuilderItem<T>()
}

/**
@@ -115,7 +129,7 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {

operator fun set(token: NameToken, data: Data<T>) {
if (map.containsKey(token)) error("Tree entry with name $token is not empty")
map[token] = DataTreeBuilderItem.Value(data)
map[token] = DataTreeBuilderItem.Leaf(data)
}

private fun buildNode(token: NameToken): DataTreeBuilder<T> {
@@ -152,6 +166,11 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {

operator fun set(name: Name, node: DataNode<T>) = set(name, node.builder())

operator fun set(name: Name, item: DataItem<T>) = when (item) {
is DataItem.Node<T> -> set(name, item.value.builder())
is DataItem.Leaf<T> -> set(name, item.value)
}

/**
* Append data to node
*/
@@ -162,6 +181,8 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {
*/
infix fun String.to(node: DataNode<T>) = set(toName(), node)

infix fun String.to(item: DataItem<T>) = set(toName(), item)

/**
* Build and append node
*/
@@ -169,7 +190,7 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {


fun update(node: DataNode<T>) {
node.data().forEach {
node.dataSequence().forEach {
//TODO check if the place is occupied
this[it.first] = it.second
}
@@ -178,8 +199,8 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {
fun build(): DataTree<T> {
val resMap = map.mapValues { (_, value) ->
when (value) {
is DataTreeBuilderItem.Value -> DataTreeItem.Value(value.value)
is DataTreeBuilderItem.Node -> DataTreeItem.Node(value.tree.build())
is DataTreeBuilderItem.Leaf -> DataItem.Leaf(value.value)
is DataTreeBuilderItem.Node -> DataItem.Node(value.tree.build())
}
}
return DataTree(type, resMap)
@@ -190,27 +211,20 @@ class DataTreeBuilder<T : Any>(private val type: KClass<out T>) {
* Generate a mutable builder from this node. Node content is not changed
*/
fun <T : Any> DataNode<T>.builder(): DataTreeBuilder<T> = DataTreeBuilder(type).apply {
data().forEach { (name, data) -> this[name] = data }
dataSequence().forEach { (name, data) -> this[name] = data }
}

/**
* Start computation for all goals in data node
*/
fun DataNode<*>.startAll() = data().forEach { (_, data) -> data.goal.start() }

fun <T : Any> DataNode<T>.filter(predicate: (Name, Data<T>) -> Boolean): DataNode<T> = DataNode.build(type) {
data().forEach { (name, data) ->
dataSequence().forEach { (name, data) ->
if (predicate(name, data)) {
this[name] = data
}
}
}

fun <T: Any> DataNode<T>.first(): Data<T> = data().first().second
fun <T : Any> DataNode<T>.first(): Data<T>? = dataSequence().first().second

/**
* Check that node is compatible with given type meaning that each element could be cast to the type
*/
expect fun DataNode<*>.checkType(type: KClass<*>)

//fun <T : Any, R: T> DataNode<T>.filterIsInstance(type: KClass<R>): DataNode<R> = filter{_,data -> type.}
@@ -4,116 +4,102 @@ import kotlinx.coroutines.*
import kotlin.coroutines.CoroutineContext
import kotlin.coroutines.EmptyCoroutineContext

/**
* A special deferred with explicit dependencies and some additional information like progress and unique id
*/
interface Goal<out T> : Deferred<T>, CoroutineScope {
val scope: CoroutineScope
override val coroutineContext get() = scope.coroutineContext

interface Goal<out T> {
val dependencies: Collection<Goal<*>>
/**
* Returns current running coroutine if the goal is started
*/
val result: Deferred<T>?

val totalWork: Double get() = dependencies.sumByDouble { it.totalWork } + (monitor?.totalWork ?: 0.0)
val workDone: Double get() = dependencies.sumByDouble { it.workDone } + (monitor?.workDone ?: 0.0)
val status: String get() = monitor?.status ?: ""
val progress: Double get() = workDone / totalWork
/**
* Get ongoing computation or start a new one.
* Does not guarantee thread safety. In case of multi-thread access, could create orphan computations.
*/
fun startAsync(scope: CoroutineScope): Deferred<T>

suspend fun CoroutineScope.await(): T = startAsync(this).await()

/**
* Reset the computation
*/
fun reset()

companion object {
/**
* Create goal wrapping static value. This goal is always completed
*/
fun <T> static(scope: CoroutineScope, value: T): Goal<T> =
StaticGoalImpl(scope, CompletableDeferred(value))

}
}

/**
* A monitor of goal state that could be accessed only from inside the goal
*/
class GoalMonitor : CoroutineContext.Element {
override val key: CoroutineContext.Key<*> get() = GoalMonitor
fun Goal<*>.start(scope: CoroutineScope): Job = startAsync(scope)

var totalWork: Double = 1.0
var workDone: Double = 0.0
var status: String = ""
val Goal<*>.isComplete get() = result?.isCompleted ?: false

/**
* Mark the goal as started
*/
fun start() {
suspend fun <T> Goal<T>.await(scope: CoroutineScope): T = scope.await()

}

/**
* Mark the goal as completed
*/
fun finish() {
workDone = totalWork
}

companion object : CoroutineContext.Key<GoalMonitor>
}

val CoroutineScope.monitor: GoalMonitor? get() = coroutineContext[GoalMonitor]

private class GoalImpl<T>(
override val scope: CoroutineScope,
override val dependencies: Collection<Goal<*>>,
deferred: Deferred<T>
) : Goal<T>, Deferred<T> by deferred

private class StaticGoalImpl<T>(override val scope: CoroutineScope, deferred: CompletableDeferred<T>) : Goal<T>,
Deferred<T> by deferred {
open class StaticGoal<T>(val value: T) : Goal<T> {
override val dependencies: Collection<Goal<*>> get() = emptyList()
override val status: String get() = ""
override val totalWork: Double get() = 0.0
override val workDone: Double get() = 0.0
override val result: Deferred<T> = CompletableDeferred(value)

override fun startAsync(scope: CoroutineScope): Deferred<T> = result

override fun reset() {
//doNothing
}
}

open class DynamicGoal<T>(
val coroutineContext: CoroutineContext = EmptyCoroutineContext,
override val dependencies: Collection<Goal<*>> = emptyList(),
val block: suspend CoroutineScope.() -> T
) : Goal<T> {

final override var result: Deferred<T>? = null
private set

/**
* Create a new [Goal] with given [dependencies] and execution [block]. The block takes monitor as parameter.
* The goal block runs in a supervised scope, meaning that when it fails, it won't affect external scope.
*
* **Important:** Unlike regular deferred, the [Goal] is started lazily, so the actual calculation is called only when result is requested.
* Get ongoing computation or start a new one.
* Does not guarantee thread safety. In case of multi-thread access, could create orphan computations.
*/
fun <R> CoroutineScope.createGoal(
dependencies: Collection<Goal<*>>,
context: CoroutineContext = EmptyCoroutineContext,
block: suspend CoroutineScope.() -> R
): Goal<R> {
val deferred = async(context + GoalMonitor(), start = CoroutineStart.LAZY) {
dependencies.forEach { it.start() }
monitor?.start()
//Running in supervisor scope in order to allow manual error handling
return@async supervisorScope {
block().also {
monitor?.finish()
override fun startAsync(scope: CoroutineScope): Deferred<T> {
val startedDependencies = this.dependencies.map { goal ->
goal.startAsync(scope)
}
return result ?: scope.async(coroutineContext + CoroutineMonitor() + Dependencies(startedDependencies)) {
startedDependencies.forEach { deferred ->
deferred.invokeOnCompletion { error ->
if (error != null) cancel(CancellationException("Dependency $deferred failed with error: ${error.message}"))
}
}
block()
}.also { result = it }
}

return GoalImpl(this, dependencies, deferred)
/**
* Reset the computation
*/
override fun reset() {
result?.cancel()
result = null
}
}

/**
* Create a one-to-one goal based on existing goal
*/
fun <T, R> Goal<T>.pipe(
context: CoroutineContext = EmptyCoroutineContext,
coroutineContext: CoroutineContext = EmptyCoroutineContext,
block: suspend CoroutineScope.(T) -> R
): Goal<R> = createGoal(listOf(this), context) { block(await()) }
): Goal<R> = DynamicGoal(coroutineContext, listOf(this)) {
block(await(this))
}

/**
* Create a joining goal.
* @param scope the scope for resulting goal. By default use first goal in list
*/
fun <T, R> Collection<Goal<T>>.join(
scope: CoroutineScope = first(),
context: CoroutineContext = EmptyCoroutineContext,
coroutineContext: CoroutineContext = EmptyCoroutineContext,
block: suspend CoroutineScope.(Collection<T>) -> R
): Goal<R> = scope.createGoal(this, context) {
block(map { it.await() })
): Goal<R> = DynamicGoal(coroutineContext, this) {
block(map { this.run { it.await(this) } })
}

/**
@@ -123,9 +109,9 @@ fun <T, R> Collection<Goal<T>>.join(
* @param R type of the result goal
*/
fun <K, T, R> Map<K, Goal<T>>.join(
scope: CoroutineScope = values.first(),
context: CoroutineContext = EmptyCoroutineContext,
coroutineContext: CoroutineContext = EmptyCoroutineContext,
block: suspend CoroutineScope.(Map<K, T>) -> R
): Goal<R> = scope.createGoal(this.values, context) {
block(mapValues { it.value.await() })
): Goal<R> = DynamicGoal(coroutineContext, this.values) {
block(mapValues { it.value.await(this) })
}

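The reworked lifecycle can be exercised as in this sketch (the arithmetic is invented; `runBlocking` is assumed from kotlinx.coroutines): a goal stays cold until some scope requests it, and `reset` drops the cached `Deferred` so the computation can run again:

```kotlin
val base: Goal<Int> = StaticGoal(10)
val squared: Goal<Int> = base.pipe { it * it }

runBlocking {
    // First await starts the dependency chain and caches the Deferred in `result`.
    val value = squared.await(this)
    // reset cancels and forgets the cached computation; a later await recomputes.
    squared.reset()
}
```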
@@ -1,75 +0,0 @@
-/*
- * Copyright 2015 Alexander Nozik.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package hep.dataforge.data
-
-import hep.dataforge.meta.Meta
-import hep.dataforge.meta.get
-import hep.dataforge.meta.string
-
-interface GroupRule {
-    operator fun <T : Any> invoke(node: DataNode<T>): Map<String, DataNode<T>>
-}
-
-/**
- * The class to builder groups of content with annotation defined rules
- *
- * @author Alexander Nozik
- */
-object GroupBuilder {
-
-    /**
-     * Create grouping rule that creates groups for different values of value
-     * field with name [key]
-     *
-     * @param key
-     * @param defaultTagValue
-     * @return
-     */
-    fun byValue(key: String, defaultTagValue: String): GroupRule = object : GroupRule {
-        override fun <T : Any> invoke(node: DataNode<T>): Map<String, DataNode<T>> {
-            val map = HashMap<String, DataTreeBuilder<T>>()
-
-            node.data().forEach { (name, data) ->
-                val tagValue = data.meta[key]?.string ?: defaultTagValue
-                map.getOrPut(tagValue) { DataNode.builder(node.type) }[name] = data
-            }
-
-            return map.mapValues { it.value.build() }
-        }
-    }
-
-    // @ValueDef(key = "byValue", required = true, info = "The name of annotation value by which grouping should be made")
-    // @ValueDef(
-    //     key = "defaultValue",
-    //     def = "default",
-    //     info = "Default value which should be used for content in which the grouping value is not presented"
-    // )
-    fun byMeta(config: Meta): GroupRule {
-        //TODO expand grouping options
-        return config["byValue"]?.string?.let {
-            byValue(
-                it,
-                config["defaultValue"]?.string ?: "default"
-            )
-        }
-            ?: object : GroupRule {
-                override fun <T : Any> invoke(node: DataNode<T>): Map<String, DataNode<T>> = mapOf("" to node)
-            }
-    }
-}
@@ -0,0 +1,68 @@
+/*
+ * Copyright 2015 Alexander Nozik.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package hep.dataforge.data
+
+import hep.dataforge.meta.Meta
+import hep.dataforge.meta.get
+import hep.dataforge.meta.string
+
+interface GroupRule {
+    operator fun <T : Any> invoke(node: DataNode<T>): Map<String, DataNode<T>>
+
+    companion object {
+        /**
+         * Create grouping rule that creates groups for different values of value
+         * field with name [key]
+         *
+         * @param key
+         * @param defaultTagValue
+         * @return
+         */
+        fun byValue(key: String, defaultTagValue: String): GroupRule = object : GroupRule {
+            override fun <T : Any> invoke(node: DataNode<T>): Map<String, DataNode<T>> {
+                val map = HashMap<String, DataTreeBuilder<T>>()
+
+                node.dataSequence().forEach { (name, data) ->
+                    val tagValue = data.meta[key]?.string ?: defaultTagValue
+                    map.getOrPut(tagValue) { DataNode.builder(node.type) }[name] = data
+                }
+
+                return map.mapValues { it.value.build() }
+            }
+        }
+
+        // @ValueDef(key = "byValue", required = true, info = "The name of annotation value by which grouping should be made")
+        // @ValueDef(
+        //     key = "defaultValue",
+        //     def = "default",
+        //     info = "Default value which should be used for content in which the grouping value is not presented"
+        // )
+        fun byMeta(config: Meta): GroupRule {
+            //TODO expand grouping options
+            return config["byValue"]?.string?.let {
+                byValue(
+                    it,
+                    config["defaultValue"]?.string ?: "default"
+                )
+            }
+                ?: object : GroupRule {
+                    override fun <T : Any> invoke(node: DataNode<T>): Map<String, DataNode<T>> = mapOf("" to node)
+                }
+        }
+    }
+}
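The `byValue` rule partitions a data node by the value of a metadata key, falling back to a default tag when the key is absent. The same partitioning logic can be sketched over plain maps (a hypothetical stdlib-only stand-in, not the dataforge types):

```kotlin
// Hypothetical stand-in: item metadata as a plain Map<String, String>.
data class Item(val name: String, val meta: Map<String, String>)

// Group items by the value of [key] in their meta, using [defaultTagValue]
// when the key is absent, mirroring the logic of GroupRule.byValue above.
fun groupByValue(items: List<Item>, key: String, defaultTagValue: String): Map<String, List<Item>> =
    items.groupBy { it.meta[key] ?: defaultTagValue }

fun main() {
    val items = listOf(
        Item("a", mapOf("type" to "raw")),
        Item("b", mapOf("type" to "processed")),
        Item("c", emptyMap())
    )
    val groups = groupByValue(items, "type", "default")
    println(groups.keys) // [raw, processed, default]
}
```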
@@ -6,8 +6,6 @@ import hep.dataforge.meta.MetaBuilder
 import hep.dataforge.meta.builder
 import hep.dataforge.names.Name
 import hep.dataforge.names.toName
-import kotlin.coroutines.CoroutineContext
-import kotlin.coroutines.EmptyCoroutineContext
 import kotlin.reflect.KClass

@@ -31,7 +29,7 @@ class JoinGroupBuilder<T : Any, R : Any>(val actionMeta: Meta) {
     */
    fun byValue(tag: String, defaultTag: String = "@default", action: JoinGroup<T, R>.() -> Unit) {
        groupRules += { node ->
-            GroupBuilder.byValue(tag, defaultTag).invoke(node).map {
+            GroupRule.byValue(tag, defaultTag).invoke(node).map {
                JoinGroup<T, R>(it.key, it.value).apply(action)
            }
        }
@@ -78,7 +76,6 @@ class JoinGroupBuilder<T : Any, R : Any>(val actionMeta: Meta) {
 class JoinAction<T : Any, R : Any>(
     val inputType: KClass<T>,
     val outputType: KClass<R>,
-    val context: CoroutineContext = EmptyCoroutineContext,
     private val action: JoinGroupBuilder<T, R>.() -> Unit
 ) : Action<T, R> {

@@ -89,17 +86,13 @@ class JoinAction<T : Any, R : Any>(

         val laminate = Laminate(group.meta, meta)

-        val goalMap: Map<Name, Goal<T>> = group.node
-            .data()
-            .associate { it.first to it.second.goal }
+        val dataMap = group.node.dataSequence().associate { it }

         val groupName: String = group.name;

         val env = ActionEnv(groupName.toName(), laminate.builder())

-        val goal = goalMap.join(context = context) { group.result.invoke(env, it) }
-
-        val res = Data.of(outputType, goal, env.meta)
+        val res: DynamicData<R> = dataMap.join(outputType, meta = laminate) { group.result.invoke(env, it) }

         set(env.name, res)
     }
@@ -2,8 +2,6 @@ package hep.dataforge.data

 import hep.dataforge.meta.*
 import hep.dataforge.names.Name
-import kotlin.coroutines.CoroutineContext
-import kotlin.coroutines.EmptyCoroutineContext
 import kotlin.reflect.KClass

 class ActionEnv(val name: Name, val meta: Meta)
@@ -27,7 +25,6 @@ class PipeBuilder<T, R>(var name: Name, var meta: MetaBuilder) {
 class PipeAction<T : Any, R : Any>(
     val inputType: KClass<T>,
     val outputType: KClass<R>,
-    val context: CoroutineContext = EmptyCoroutineContext,
     private val block: PipeBuilder<T, R>.() -> Unit
 ) : Action<T, R> {

@@ -35,7 +32,7 @@ class PipeAction<T : Any, R : Any>(
         node.checkType(inputType)

         return DataNode.build(outputType) {
-            node.data().forEach { (name, data) ->
+            node.dataSequence().forEach { (name, data) ->
                 //merging data meta with action meta (data meta is primary)
                 val oldMeta = meta.builder().apply { update(data.meta) }
                 // creating environment from old meta and name
@@ -46,10 +43,9 @@ class PipeAction<T : Any, R : Any>(
                 val newName = builder.name
                 //getting new meta
                 val newMeta = builder.meta.seal()
-                //creating a goal with custom context if provided
-                val goal = data.goal.pipe(context) { builder.result(env, it) }
+                val newData = data.pipe(outputType, meta = newMeta) { builder.result(env, it) }
                 //setting the data node
-                this[newName] = Data.of(outputType, goal, newMeta)
+                this[newName] = newData
             }
         }
     }
@@ -57,9 +53,8 @@ class PipeAction<T : Any, R : Any>(

 inline fun <reified T : Any, reified R : Any> DataNode<T>.pipe(
     meta: Meta,
-    context: CoroutineContext = EmptyCoroutineContext,
     noinline action: PipeBuilder<T, R>.() -> Unit
-): DataNode<R> = PipeAction(T::class, R::class, context, action).invoke(this, meta)
+): DataNode<R> = PipeAction(T::class, R::class, action).invoke(this, meta)
@@ -7,8 +7,6 @@ import hep.dataforge.meta.builder
 import hep.dataforge.names.Name
 import hep.dataforge.names.toName
 import kotlin.collections.set
-import kotlin.coroutines.CoroutineContext
-import kotlin.coroutines.EmptyCoroutineContext
 import kotlin.reflect.KClass

@@ -37,7 +35,6 @@ class SplitBuilder<T : Any, R : Any>(val name: Name, val meta: Meta) {
 class SplitAction<T : Any, R : Any>(
     val inputType: KClass<T>,
     val outputType: KClass<R>,
-    val context: CoroutineContext = EmptyCoroutineContext,
     private val action: SplitBuilder<T, R>.() -> Unit
 ) : Action<T, R> {

@@ -45,7 +42,7 @@ class SplitAction<T : Any, R : Any>(
         node.checkType(inputType)

         return DataNode.build(outputType) {
-            node.data().forEach { (name, data) ->
+            node.dataSequence().forEach { (name, data) ->

                 val laminate = Laminate(data.meta, meta)

@@ -58,9 +55,7 @@ class SplitAction<T : Any, R : Any>(

                 rule(env)

-                val goal = data.goal.pipe(context = context) { env.result(it) }
-
-                val res = Data.of(outputType, goal, env.meta)
+                val res = data.pipe(outputType, meta = env.meta) { env.result(it) }

                 set(env.name, res)
             }
         }
@@ -1,13 +1,23 @@
 package hep.dataforge.data

-import hep.dataforge.names.Name
+import hep.dataforge.meta.Meta
+import hep.dataforge.names.NameToken
+import kotlinx.coroutines.CoroutineScope
+import kotlinx.coroutines.Deferred
 import kotlin.reflect.KClass
 import kotlin.reflect.full.isSubclassOf

-fun <T : Any, R : Any> Data<T>.safeCast(type: KClass<R>): Data<R>? {
-    return if (type.isSubclassOf(type)) {
-        @Suppress("UNCHECKED_CAST")
-        Data.of(type, goal as Goal<R>, meta)
+fun <T : Any, R : Any> Data<T>.safeCast(type: KClass<out R>): Data<R>? {
+    return if (this.type.isSubclassOf(type)) {
+        return object : Data<R> {
+            override val meta: Meta get() = this@safeCast.meta
+            override val dependencies: Collection<Goal<*>> get() = this@safeCast.dependencies
+            override val result: Deferred<R>? get() = this@safeCast.result as Deferred<R>
+            override fun startAsync(scope: CoroutineScope): Deferred<R> = this@safeCast.startAsync(scope) as Deferred<R>
+            override fun reset() = this@safeCast.reset()
+            override val type: KClass<out R> = type
+        }
     } else {
         null
     }
@@ -17,7 +27,7 @@ fun <T : Any, R : Any> Data<T>.safeCast(type: KClass<R>): Data<R>? {
 * Filter a node by data and node type. Resulting node and its subnodes is guaranteed to have border type [type],
 * but could contain empty nodes
 */
-fun <T : Any, R : Any> DataNode<T>.cast(type: KClass<R>): DataNode<R> {
+fun <T : Any, R : Any> DataNode<T>.cast(type: KClass<out R>): DataNode<R> {
     return if (this is CastDataNode) {
         origin.cast(type)
     } else {
@@ -28,19 +38,18 @@ fun <T : Any, R : Any> DataNode<T>.cast(type: KClass<R>): DataNode<R> {
 inline fun <T : Any, reified R : Any> DataNode<T>.cast(): DataNode<R> = cast(R::class)

 class CastDataNode<out T : Any>(val origin: DataNode<Any>, override val type: KClass<out T>) : DataNode<T> {
-    override fun get(name: Name): Data<T>? =
-        origin[name]?.safeCast(type)
-
-    override fun getNode(name: Name): DataNode<T>? {
-        return origin.getNode(name)?.cast(type)
-    }
-
-    override fun data(): Sequence<Pair<Name, Data<T>>> =
-        origin.data().mapNotNull { pair ->
-            pair.second.safeCast(type)?.let { pair.first to it }
-        }
-
-    override fun nodes(): Sequence<Pair<Name, DataNode<T>>> =
-        origin.nodes().map { it.first to it.second.cast(type) }
+    override val items: Map<NameToken, DataItem<T>> by lazy {
+        origin.items.mapNotNull { (key, item) ->
+            when (item) {
+                is DataItem.Leaf -> {
+                    (item.value.safeCast(type))?.let {
+                        key to DataItem.Leaf(it)
+                    }
+                }
+                is DataItem.Node -> {
+                    key to DataItem.Node(item.value.cast(type))
+                }
+            }
+        }.associate { it }
+    }
 }
@@ -1,8 +0,0 @@
-package hep.dataforge.data
-
-import kotlinx.coroutines.runBlocking
-
-/**
- * Block the thread and get data content
- */
-fun <T : Any> Data<T>.get(): T = runBlocking { await() }
@@ -1,8 +1,14 @@
 package hep.dataforge.data

 import kotlinx.coroutines.runBlocking
+import kotlin.reflect.KClass
+import kotlin.reflect.full.isSuperclassOf

 /**
  * Block the thread and get data content
  */
 fun <T : Any> Data<T>.get(): T = runBlocking { await() }

+/**
+ * Check that node is compatible with given type meaning that each element could be cast to the type
+ */
@@ -1,57 +1,26 @@
 plugins {
-    `npm-multiplatform`
+    id("scientifik.mpp")
 }

-description = "IO for meta"
+description = "IO module"

+scientifik{
+    serialization = true
+    io = true
+}

-val ioVersion: String = Versions.ioVersion
-val serializationVersion: String = Versions.serializationVersion

 kotlin {
-    jvm()
-    js()
     sourceSets {
-        val commonMain by getting{
+        commonMain{
             dependencies {
                 api(project(":dataforge-meta"))
-                //implementation 'org.jetbrains.kotlin:kotlin-reflect'
-                api("org.jetbrains.kotlinx:kotlinx-serialization-runtime-common:$serializationVersion")
-                api("org.jetbrains.kotlinx:kotlinx-io:$ioVersion")
+                api(project(":dataforge-context"))
             }
         }
-        val commonTest by getting {
-            dependencies {
-                implementation("org.jetbrains.kotlin:kotlin-test-common")
-                implementation("org.jetbrains.kotlin:kotlin-test-annotations-common")
-            }
-        }
-        val jvmMain by getting {
-            dependencies {
-                api("org.jetbrains.kotlinx:kotlinx-serialization-runtime:$serializationVersion")
-                api("org.jetbrains.kotlinx:kotlinx-io-jvm:$ioVersion")
-            }
-        }
-        val jvmTest by getting {
-            dependencies {
-                implementation("org.jetbrains.kotlin:kotlin-test")
-                implementation("org.jetbrains.kotlin:kotlin-test-junit")
-            }
-        }
-        val jsMain by getting {
-            dependencies {
-                api("org.jetbrains.kotlinx:kotlinx-serialization-runtime-js:$serializationVersion")
-                api("org.jetbrains.kotlinx:kotlinx-io-js:$ioVersion")
-            }
-        }
-        val jsTest by getting {
-            dependencies {
-                implementation("org.jetbrains.kotlin:kotlin-test-js")
-            }
-        }
-        // iosMain {
-        // }
-        // iosTest {
-        // }
+        jsMain{
+            dependencies{
+                api(npm("text-encoding"))
+            }
+        }
     }
 }
@@ -0,0 +1,85 @@
+package hep.dataforge.io
+
+import kotlinx.io.core.ByteReadPacket
+import kotlinx.io.core.Input
+import kotlinx.io.core.buildPacket
+import kotlinx.io.core.readBytes
+
+/**
+ * A source of binary data
+ */
+interface Binary {
+    /**
+     * The size of binary in bytes
+     */
+    val size: ULong
+
+    /**
+     * Read a continuous [Input] from this binary starting from the beginning.
+     * The input is automatically closed on scope close.
+     * Some implementations may forbid calling this twice; in that case the second call throws an exception.
+     */
+    fun <R> read(block: Input.() -> R): R
+}
+
+/**
+ * A [Binary] with additional random access functionality. By default it allows multiple [read] operations.
+ */
+@ExperimentalUnsignedTypes
+interface RandomAccessBinary : Binary {
+    /**
+     * Read at most [size] bytes starting at [from] offset from the beginning of the binary.
+     * This method could be called multiple times simultaneously.
+     */
+    fun <R> read(from: UInt, size: UInt = UInt.MAX_VALUE, block: Input.() -> R): R
+
+    override fun <R> read(block: Input.() -> R): R = read(0.toUInt(), UInt.MAX_VALUE, block)
+}
+
+fun Binary.readAll(): ByteReadPacket = read {
+    ByteReadPacket(this.readBytes())
+}
+
+@ExperimentalUnsignedTypes
+fun RandomAccessBinary.readPacket(from: UInt, size: UInt): ByteReadPacket = read(from, size) {
+    ByteReadPacket(this.readBytes())
+}
+
+@ExperimentalUnsignedTypes
+object EmptyBinary : RandomAccessBinary {
+
+    override val size: ULong = 0.toULong()
+
+    override fun <R> read(from: UInt, size: UInt, block: Input.() -> R): R {
+        error("The binary is empty")
+    }
+}
+
+@ExperimentalUnsignedTypes
+class ArrayBinary(val array: ByteArray) : RandomAccessBinary {
+    override val size: ULong get() = array.size.toULong()
+
+    override fun <R> read(from: UInt, size: UInt, block: Input.() -> R): R {
+        return ByteReadPacket(array, from.toInt(), size.toInt()).block()
+    }
+}
+
+/**
+ * Read given binary as an object using the given format
+ */
+fun <T : Any> Binary.readWith(format: IOFormat<T>): T = format.run {
+    read {
+        readThis()
+    }
+}
+
+/**
+ * Write this object to a binary
+ * TODO make a lazy binary that does not use an intermediate array
+ */
+fun <T : Any> T.writeWith(format: IOFormat<T>): Binary = format.run {
+    val packet = buildPacket {
+        writeThis(this@writeWith)
+    }
+    return@run ArrayBinary(packet.readBytes())
+}
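`RandomAccessBinary` adds offset-based reads on top of `Binary`, and `ArrayBinary` backs that contract with an in-memory array. The same contract can be sketched without kotlinx-io, over a plain `ByteArray` (a hypothetical simplified analogue, not the real API):

```kotlin
// Hypothetical dependency-free analogue of RandomAccessBinary:
// reads are expressed over ByteArray slices instead of kotlinx-io Input.
interface SimpleBinary {
    val size: Int
    fun <R> read(from: Int = 0, size: Int = Int.MAX_VALUE, block: (ByteArray) -> R): R
}

class SimpleArrayBinary(private val array: ByteArray) : SimpleBinary {
    override val size: Int get() = array.size

    override fun <R> read(from: Int, size: Int, block: (ByteArray) -> R): R {
        // Clamp the requested window to the actual array bounds,
        // so size defaults to "everything from the offset on".
        val end = minOf(array.size.toLong(), from.toLong() + size.toLong()).toInt()
        return block(array.copyOfRange(from, end))
    }
}

fun main() {
    val binary = SimpleArrayBinary("dataforge".toByteArray())
    val tail = binary.read(from = 4) { String(it) }
    println(tail) // forge
}
```

Passing the bytes to a `block` instead of returning them mirrors the scoped-read design above: the caller never holds the underlying input beyond the read.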
@@ -1,5 +1,6 @@
 package hep.dataforge.io

+import hep.dataforge.descriptors.NodeDescriptor
 import hep.dataforge.meta.*
 import hep.dataforge.values.*
 import kotlinx.io.core.Input
@@ -8,12 +9,11 @@ import kotlinx.io.core.readText
 import kotlinx.io.core.writeText

 object BinaryMetaFormat : MetaFormat {
-    override fun write(obj: Meta, out: Output) {
-        out.writeMeta(obj)
-    }
+    override val name: String = "bin"
+    override val key: Short = 0x4249//BI

-    override fun read(input: Input): Meta {
-        return (input.readMetaItem() as MetaItem.NodeItem).node
+    override fun Input.readMeta(descriptor: NodeDescriptor?): Meta {
+        return (readMetaItem() as MetaItem.NodeItem).node
     }

     private fun Output.writeChar(char: Char) = writeByte(char.toByte())
@@ -70,7 +70,7 @@ object BinaryMetaFormat : MetaFormat {
         }
     }

-    private fun Output.writeMeta(meta: Meta) {
+    override fun Output.writeMeta(meta: Meta, descriptor: NodeDescriptor?) {
         writeChar('M')
         writeInt(meta.items.size)
         meta.items.forEach { (key, item) ->
@@ -80,7 +80,7 @@ object BinaryMetaFormat : MetaFormat {
                     writeValue(item.value)
                 }
                 is MetaItem.NodeItem -> {
-                    writeMeta(item.node)
+                    writeThis(item.node)
                 }
             }
         }
@@ -91,9 +91,9 @@ object BinaryMetaFormat : MetaFormat {
         return readText(max = length)
     }

     @Suppress("UNCHECKED_CAST")
     private fun Input.readMetaItem(): MetaItem<MetaBuilder> {
-        val keyChar = readByte().toChar()
-        return when (keyChar) {
+        return when (val keyChar = readByte().toChar()) {
             'S' -> MetaItem.ValueItem(StringValue(readString()))
             'N' -> MetaItem.ValueItem(Null)
             '+' -> MetaItem.ValueItem(True)
@@ -123,10 +123,3 @@ object BinaryMetaFormat : MetaFormat {
         }
     }
 }
-
-class BinaryMetaFormatFactory : MetaFormatFactory {
-    override val name: String = "bin"
-    override val key: Short = 0x4249//BI
-
-    override fun build(): MetaFormat = BinaryMetaFormat
-}
@@ -1,13 +1,13 @@
 package hep.dataforge.io

+import hep.dataforge.meta.Laminate
 import hep.dataforge.meta.Meta
 import hep.dataforge.meta.get
 import hep.dataforge.meta.string
-import kotlinx.io.core.Input

 interface Envelope {
     val meta: Meta
-    val data: Input?
+    val data: Binary?

     companion object {

@@ -23,11 +23,7 @@ interface Envelope {
     }
 }

-class SimpleEnvelope(override val meta: Meta, val dataProvider: () -> Input?) : Envelope{
-    override val data: Input?
-        get() = dataProvider()
-
-}
+class SimpleEnvelope(override val meta: Meta, override val data: Binary?) : Envelope

 /**
  * The purpose of the envelope
@@ -50,3 +46,21 @@ val Envelope.dataType: String? get() = meta[Envelope.ENVELOPE_DATA_TYPE_KEY].str
  */
 val Envelope.description: String? get() = meta[Envelope.ENVELOPE_DESCRIPTION_KEY].string

+/**
+ * An envelope, which wraps an existing envelope and adds one or several additional layers of meta
+ */
+class ProxyEnvelope(val source: Envelope, vararg meta: Meta) : Envelope {
+    override val meta: Laminate = Laminate(*meta, source.meta)
+    override val data: Binary? get() = source.data
+}
+
+/**
+ * Add a few meta layers to an existing envelope
+ */
+fun Envelope.withMetaLayers(vararg layers: Meta): Envelope {
+    return when {
+        layers.isEmpty() -> this
+        this is ProxyEnvelope -> ProxyEnvelope(source, *layers, *this.meta.layers.toTypedArray())
+        else -> ProxyEnvelope(this, *layers)
+    }
+}
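`ProxyEnvelope` stacks meta layers so that layers added later take precedence over the source envelope's meta. The first-match-wins lookup rule can be sketched with plain maps (a hypothetical `Laminate` stand-in, not the dataforge class):

```kotlin
// Hypothetical stand-in for Laminate: a first-match-wins stack of meta layers.
class LayeredMeta(val layers: List<Map<String, String>>) {
    // The first layer that defines the key wins,
    // mirroring Laminate(*meta, source.meta) above.
    operator fun get(key: String): String? =
        layers.mapNotNull { it[key] }.firstOrNull()
}

fun main() {
    val sourceMeta = mapOf("type" to "raw", "origin" to "detector")
    val overlay = mapOf("type" to "calibrated")
    val meta = LayeredMeta(listOf(overlay, sourceMeta))
    println(meta["type"])   // calibrated
    println(meta["origin"]) // detector
}
```

Because lookup is layered rather than merged, wrapping an envelope never mutates or copies the source meta.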
@@ -0,0 +1,31 @@
+package hep.dataforge.io
+
+import hep.dataforge.context.Named
+import hep.dataforge.io.EnvelopeFormat.Companion.ENVELOPE_FORMAT_TYPE
+import hep.dataforge.meta.Meta
+import hep.dataforge.provider.Type
+import kotlinx.io.core.Input
+import kotlinx.io.core.Output
+
+/**
+ * A partially read envelope with meta, but without data
+ */
+@ExperimentalUnsignedTypes
+data class PartialEnvelope(val meta: Meta, val dataOffset: UInt, val dataSize: ULong?)
+
+@Type(ENVELOPE_FORMAT_TYPE)
+interface EnvelopeFormat : IOFormat<Envelope>, Named {
+    fun Input.readPartial(formats: Collection<MetaFormat> = IOPlugin.defaultMetaFormats): PartialEnvelope
+
+    fun Input.readEnvelope(formats: Collection<MetaFormat> = IOPlugin.defaultMetaFormats): Envelope
+
+    override fun Input.readThis(): Envelope = readEnvelope()
+
+    fun Output.writeEnvelope(envelope: Envelope, format: MetaFormat = JsonMetaFormat)
+
+    override fun Output.writeThis(obj: Envelope) = writeEnvelope(obj)
+
+    companion object {
+        const val ENVELOPE_FORMAT_TYPE = "envelopeFormat"
+    }
+}
@@ -0,0 +1,38 @@
+package hep.dataforge.io
+
+import kotlin.reflect.KClass
+
+/**
+ * A descriptor for a specific type of functions
+ */
+interface FunctionSpec<T : Any, R : Any> {
+    val inputType: KClass<T>
+    val outputType: KClass<R>
+}
+
+/**
+ * A server that could produce asynchronous function values
+ */
+interface FunctionServer {
+    /**
+     * Call a function with given name and descriptor
+     */
+    suspend fun <T : Any, R : Any, D : FunctionSpec<T, R>> call(name: String, descriptor: D, arg: T): R
+
+    /**
+     * Resolve a function descriptor for given types
+     */
+    fun <T : Any, R : Any> resolveType(inputType: KClass<out T>, outputType: KClass<out R>): FunctionSpec<T, R>
+
+    /**
+     * Get a generic suspended function with given name and descriptor
+     */
+    operator fun <T : Any, R : Any, D : FunctionSpec<T, R>> get(name: String, descriptor: D): (suspend (T) -> R) =
+        { call(name, descriptor, it) }
+}
+
+suspend inline fun <reified T : Any, reified R : Any> FunctionServer.call(name: String, arg: T): R =
+    call(name, resolveType(T::class, R::class), arg)
+
+inline operator fun <reified T : Any, reified R : Any> FunctionServer.get(name: String): (suspend (T) -> R) =
+    get(name, resolveType(T::class, R::class))
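The `get` operator above turns a named remote call into an ordinary function value. A toy in-memory server shows the intended call pattern (a hypothetical, non-suspending miniature with simplified signatures, not the real `FunctionServer`):

```kotlin
// Hypothetical non-suspending miniature of FunctionServer:
// functions are registered by name and retrieved as ordinary lambdas.
class ToyFunctionServer {
    private val functions = mutableMapOf<String, (Int) -> Int>()

    fun register(name: String, body: (Int) -> Int) {
        functions[name] = body
    }

    fun call(name: String, arg: Int): Int =
        functions[name]?.invoke(arg) ?: error("No function named '$name'")

    // Mirrors FunctionServer.get: wrap a named call as a plain function value.
    operator fun get(name: String): (Int) -> Int = { call(name, it) }
}

fun main() {
    val server = ToyFunctionServer()
    server.register("square") { it * it }
    val square = server["square"] // indexing yields a callable
    println(square(7)) // 49
}
```

In the real interface the returned value is a `suspend` function, so a remote invocation looks exactly like a local call at the use site.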
@@ -1,10 +1,14 @@
 package hep.dataforge.io

-import kotlinx.io.core.Input
-import kotlinx.io.core.Output
+import kotlinx.io.core.*

 /**
  * An interface for serialization facilities
  */
 interface IOFormat<T : Any> {
-    fun write(obj: T, out: Output)
-    fun read(input: Input): T
+    fun Output.writeThis(obj: T)
+    fun Input.readThis(): T
 }

+fun <T : Any> IOFormat<T>.writePacket(obj: T): ByteReadPacket = buildPacket { writeThis(obj) }
+fun <T : Any> IOFormat<T>.writeBytes(obj: T): ByteArray = buildPacket { writeThis(obj) }.readBytes()
@@ -0,0 +1,37 @@
+package hep.dataforge.io
+
+import hep.dataforge.context.AbstractPlugin
+import hep.dataforge.context.PluginFactory
+import hep.dataforge.context.PluginTag
+import hep.dataforge.context.content
+import hep.dataforge.meta.Meta
+import hep.dataforge.names.Name
+import kotlin.reflect.KClass
+
+class IOPlugin(meta: Meta) : AbstractPlugin(meta) {
+    override val tag: PluginTag get() = Companion.tag
+
+    val metaFormats by lazy {
+        context.content<MetaFormat>(MetaFormat.META_FORMAT_TYPE).values
+    }
+
+    fun metaFormat(key: Short): MetaFormat? = metaFormats.find { it.key == key }
+    fun metaFormat(name: String): MetaFormat? = metaFormats.find { it.name == name }
+
+    override fun provideTop(target: String): Map<Name, Any> {
+        return when (target) {
+            MetaFormat.META_FORMAT_TYPE -> defaultMetaFormats.toMap()
+            EnvelopeFormat.ENVELOPE_FORMAT_TYPE -> defaultEnvelopeFormats.toMap()
+            else -> super.provideTop(target)
+        }
+    }
+
+    companion object : PluginFactory<IOPlugin> {
+        val defaultMetaFormats: List<MetaFormat> = listOf(JsonMetaFormat, BinaryMetaFormat)
+        val defaultEnvelopeFormats = listOf(TaggedEnvelopeFormat)
+
+        override val tag: PluginTag = PluginTag("io", group = PluginTag.DATAFORGE_GROUP)
+        override val type: KClass<out IOPlugin> = IOPlugin::class
+        override fun invoke(meta: Meta): IOPlugin = IOPlugin(meta)
+    }
+}
@@ -1,6 +1,10 @@
 package hep.dataforge.io

+import hep.dataforge.descriptors.ItemDescriptor
+import hep.dataforge.descriptors.NodeDescriptor
+import hep.dataforge.descriptors.ValueDescriptor
 import hep.dataforge.meta.Meta
+import hep.dataforge.meta.MetaBase
 import hep.dataforge.meta.MetaItem
 import hep.dataforge.names.NameToken
 import hep.dataforge.names.toName
@@ -10,17 +14,23 @@ import kotlinx.io.core.Output
 import kotlinx.io.core.readText
 import kotlinx.io.core.writeText
 import kotlinx.serialization.json.*
+import kotlin.collections.component1
+import kotlin.collections.component2
+import kotlin.collections.set

 object JsonMetaFormat : MetaFormat {

-    override fun write(obj: Meta, out: Output) {
-        val str = obj.toJson().toString()
-        out.writeText(str)
+    override val name: String = "json"
+    override val key: Short = 0x4a53//"JS"
+
+    override fun Output.writeMeta(meta: Meta, descriptor: NodeDescriptor?) {
+        val json = meta.toJson(descriptor)
+        writeText(json.toString())
     }

-    override fun read(input: Input): Meta {
-        val str = input.readText()
+    override fun Input.readMeta(descriptor: NodeDescriptor?): Meta {
+        val str = readText()
         val json = Json.plain.parseJson(str)

         if (json is JsonObject) {
@@ -31,7 +41,7 @@ object JsonMetaFormat : MetaFormat {
     }
 }

-fun Value.toJson(): JsonElement {
+fun Value.toJson(descriptor: ValueDescriptor? = null): JsonElement {
     return if (isList()) {
         JsonArray(list.map { it.toJson() })
     } else {
@@ -44,48 +54,96 @@ fun Value.toJson(
     }
 }

-fun Meta.toJson(): JsonObject {
-    val map = this.items.mapValues { entry ->
-        val value = entry.value
-        when (value) {
-            is MetaItem.ValueItem -> value.value.toJson()
-            is MetaItem.NodeItem -> value.node.toJson()
-        }
-    }.mapKeys { it.key.toString() }
+//Use these methods to customize JSON key mapping
+private fun NameToken.toJsonKey(descriptor: ItemDescriptor?) = toString()
+
+private fun NodeDescriptor?.getDescriptor(key: String) = this?.items?.get(key)
+
+fun Meta.toJson(descriptor: NodeDescriptor? = null): JsonObject {
+
+    //TODO search for same name siblings and arrange them into arrays
+    val map = this.items.entries.associate { (name, item) ->
+        val itemDescriptor = descriptor?.items?.get(name.body)
+        val key = name.toJsonKey(itemDescriptor)
+        val value = when (item) {
+            is MetaItem.ValueItem -> {
+                item.value.toJson(itemDescriptor as? ValueDescriptor)
+            }
+            is MetaItem.NodeItem -> {
+                item.node.toJson(itemDescriptor as? NodeDescriptor)
+            }
+        }
+        key to value
+    }
     return JsonObject(map)
 }

-fun JsonObject.toMeta() = JsonMeta(this)
-
-class JsonMeta(val json: JsonObject) : Meta {
-
-    private fun JsonPrimitive.toValue(): Value {
-        return when (this) {
-            JsonNull -> Null
-            else -> this.content.parseValue() // Optimize number and boolean parsing
-        }
-    }
-
-    private operator fun MutableMap<String, MetaItem<JsonMeta>>.set(key: String, value: JsonElement) = when (value) {
-        is JsonPrimitive -> this[key] = MetaItem.ValueItem(value.toValue())
-        is JsonObject -> this[key] = MetaItem.NodeItem(value.toMeta())
+fun JsonObject.toMeta(descriptor: NodeDescriptor? = null): JsonMeta = JsonMeta(this, descriptor)
+
+fun JsonPrimitive.toValue(descriptor: ValueDescriptor?): Value {
+    return when (this) {
+        JsonNull -> Null
+        else -> this.content.parseValue() // Optimize number and boolean parsing
+    }
+}
+
+fun JsonElement.toMetaItem(descriptor: ItemDescriptor? = null): MetaItem<JsonMeta> = when (this) {
+    is JsonPrimitive -> {
+        val value = this.toValue(descriptor as? ValueDescriptor)
+        MetaItem.ValueItem(value)
+    }
+    is JsonObject -> {
+        val meta = toMeta(descriptor as? NodeDescriptor)
+        MetaItem.NodeItem(meta)
+    }
+    is JsonArray -> {
+        if (this.all { it is JsonPrimitive }) {
+            val value = if (isEmpty()) {
+                Null
+            } else {
+                ListValue(
+                    map<JsonElement, Value> {
+                        //We already checked that all values are primitives
+                        (it as JsonPrimitive).toValue(descriptor as? ValueDescriptor)
+                    }
+                )
+            }
+            MetaItem.ValueItem(value)
+        } else {
+            json {
+                "@value" to this@toMetaItem
+            }.toMetaItem(descriptor)
+        }
+    }
+}
+
+class JsonMeta(val json: JsonObject, val descriptor: NodeDescriptor? = null) : MetaBase() {

     @Suppress("UNCHECKED_CAST")
+    private operator fun MutableMap<String, MetaItem<JsonMeta>>.set(key: String, value: JsonElement): Unit {
+        val itemDescriptor = descriptor.getDescriptor(key)
+        //use name from descriptor in case descriptor name differs from json key
+        val name = itemDescriptor?.name ?: key
+        return when (value) {
+            is JsonPrimitive -> {
+                this[name] = MetaItem.ValueItem(value.toValue(itemDescriptor as? ValueDescriptor)) as MetaItem<JsonMeta>
+            }
+            is JsonObject -> {
+                this[name] = MetaItem.NodeItem(value.toMeta(itemDescriptor as? NodeDescriptor))
+            }
+            is JsonArray -> {
+                when {
+                    value.all { it is JsonPrimitive } -> {
+                        val listValue = ListValue(
+                            value.map {
+                                //We already checked that all values are primitives
-                                (it as JsonPrimitive).toValue()
+                                (it as JsonPrimitive).toValue(itemDescriptor as? ValueDescriptor)
+                            }
+                        )
-                        this[key] = MetaItem.ValueItem(listValue)
+                        this[name] = MetaItem.ValueItem(listValue) as MetaItem<JsonMeta>
+                    }
+                    else -> value.forEachIndexed { index, jsonElement ->
-                        when (jsonElement) {
-                            is JsonObject -> this["$key[$index]"] = MetaItem.NodeItem(JsonMeta(jsonElement))
-                            is JsonPrimitive -> this["$key[$index]"] = MetaItem.ValueItem(jsonElement.toValue())
-                            is JsonArray -> TODO("Nested arrays not supported")
+                        this["$name[$index]"] = jsonElement.toMetaItem(itemDescriptor)
+                    }
+                }
+            }
+        }
+    }
@@ -98,10 +156,3 @@ class JsonMeta(val json: JsonObject) : Meta {
         map.mapKeys { it.key.toName().first()!! }
     }
 }
-
-class JsonMetaFormatFactory : MetaFormatFactory {
-    override val name: String = "json"
-    override val key: Short = 0x4a53//"JS"
-
-    override fun build() = JsonMetaFormat
-}
@ -1,33 +1,49 @@
package hep.dataforge.io

import hep.dataforge.context.Named
import hep.dataforge.descriptors.NodeDescriptor
import hep.dataforge.io.MetaFormat.Companion.META_FORMAT_TYPE
import hep.dataforge.meta.Meta
import kotlinx.io.core.BytePacketBuilder
import kotlinx.io.core.ByteReadPacket
import kotlinx.io.core.toByteArray
import hep.dataforge.provider.Type
import kotlinx.io.core.*

/**
* A format for meta serialization
*/
interface MetaFormat: IOFormat<Meta>

/**
* ServiceLoader compatible factory
*/
interface MetaFormatFactory {
val name: String
@Type(META_FORMAT_TYPE)
interface MetaFormat : IOFormat<Meta>, Named {
override val name: String
val key: Short

fun build(): MetaFormat
override fun Output.writeThis(obj: Meta) {
writeMeta(obj, null)
}

fun Meta.asString(format: MetaFormat = JsonMetaFormat): String {
val builder = BytePacketBuilder()
format.write(this, builder)
return builder.build().readText()
override fun Input.readThis(): Meta = readMeta(null)

fun Output.writeMeta(meta: Meta, descriptor: NodeDescriptor? = null)
fun Input.readMeta(descriptor: NodeDescriptor? = null): Meta

companion object {
const val META_FORMAT_TYPE = "metaFormat"
}
}

fun Meta.toString(format: MetaFormat): String = buildPacket {
format.run { writeThis(this@toString) }
}.readText()

fun Meta.toBytes(format: MetaFormat = JsonMetaFormat): ByteReadPacket = buildPacket {
format.run { writeThis(this@toBytes) }
}

fun MetaFormat.parse(str: String): Meta {
return read(ByteReadPacket(str.toByteArray()))
return ByteReadPacket(str.toByteArray()).readThis()
}

fun MetaFormat.fromBytes(packet: ByteReadPacket): Meta {
return packet.readThis()
}
@ -0,0 +1,53 @@
package hep.dataforge.io

import hep.dataforge.meta.Config
import hep.dataforge.meta.Meta
import hep.dataforge.meta.toConfig
import hep.dataforge.names.Name
import hep.dataforge.names.toName
import kotlinx.serialization.*
import kotlinx.serialization.internal.StringDescriptor
import kotlinx.serialization.json.JsonObjectSerializer

@Serializer(Name::class)
object NameSerializer : KSerializer<Name> {
override val descriptor: SerialDescriptor = StringDescriptor

override fun deserialize(decoder: Decoder): Name {
return decoder.decodeString().toName()
}

override fun serialize(encoder: Encoder, obj: Name) {
encoder.encodeString(obj.toString())
}
}

/**
* Serializer for meta
*/
@Serializer(Meta::class)
object MetaSerializer : KSerializer<Meta> {
override val descriptor: SerialDescriptor = JsonObjectSerializer.descriptor

override fun deserialize(decoder: Decoder): Meta {
//currently just delegates serialization to the json serializer
return JsonObjectSerializer.deserialize(decoder).toMeta()
}

override fun serialize(encoder: Encoder, obj: Meta) {
JsonObjectSerializer.serialize(encoder, obj.toJson())
}
}

@Serializer(Config::class)
object ConfigSerializer : KSerializer<Config> {
override val descriptor: SerialDescriptor = JsonObjectSerializer.descriptor

override fun deserialize(decoder: Decoder): Config {
return JsonObjectSerializer.deserialize(decoder).toMeta().toConfig()
}

override fun serialize(encoder: Encoder, obj: Config) {
JsonObjectSerializer.serialize(encoder, obj.toJson())
}
}
@ -0,0 +1,81 @@
package hep.dataforge.io

import kotlinx.io.core.*

@ExperimentalUnsignedTypes
object TaggedEnvelopeFormat : EnvelopeFormat {
const val VERSION = "DF03"
private const val START_SEQUENCE = "#~"
private const val END_SEQUENCE = "~#\r\n"
private const val TAG_SIZE = 26u

override val name: String get() = VERSION

private fun Tag.toBytes(): ByteReadPacket = buildPacket(24) {
writeText(START_SEQUENCE)
writeText(VERSION)
writeShort(metaFormatKey)
writeUInt(metaSize)
writeULong(dataSize)
writeText(END_SEQUENCE)
}

private fun Input.readTag(): Tag {
val start = readTextExactBytes(2)
if (start != START_SEQUENCE) error("The input is not an envelope")
val version = readTextExactBytes(4)
if (version != VERSION) error("Wrong version of DataForge: expected $VERSION but found $version")
val metaFormatKey = readShort()
val metaLength = readUInt()
val dataLength = readULong()
return Tag(metaFormatKey, metaLength, dataLength)
}

override fun Output.writeEnvelope(envelope: Envelope, format: MetaFormat) {
val metaBytes = format.writeBytes(envelope.meta)
val tag = Tag(format.key, metaBytes.size.toUInt(), envelope.data?.size ?: 0.toULong())
writePacket(tag.toBytes())
writeFully(metaBytes)
envelope.data?.read { copyTo(this@writeEnvelope) }
}

/**
* Read an envelope from input into memory
*
* @param input an input to read from
* @param formats a collection of meta formats to resolve
*/
override fun Input.readEnvelope(formats: Collection<MetaFormat>): Envelope {
val tag = readTag()

val metaFormat = formats.find { it.key == tag.metaFormatKey }
?: error("Meta format with key ${tag.metaFormatKey} not found")

val metaPacket = ByteReadPacket(readBytes(tag.metaSize.toInt()))
val meta = metaFormat.run { metaPacket.readThis() }

val dataBytes = readBytes(tag.dataSize.toInt())

return SimpleEnvelope(meta, ArrayBinary(dataBytes))
}

override fun Input.readPartial(formats: Collection<MetaFormat>): PartialEnvelope {
val tag = readTag()

val metaFormat = formats.find { it.key == tag.metaFormatKey }
?: error("Meta format with key ${tag.metaFormatKey} not found")

val metaPacket = ByteReadPacket(readBytes(tag.metaSize.toInt()))
val meta = metaFormat.run { metaPacket.readThis() }

return PartialEnvelope(meta, TAG_SIZE + tag.metaSize, tag.dataSize)
}

private data class Tag(
val metaFormatKey: Short,
val metaSize: UInt,
val dataSize: ULong
)
}
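For reference, the envelope tag written by `Tag.toBytes` above is a fixed binary layout: start marker `#~`, version `DF03`, a short meta-format key, an unsigned int meta size, an unsigned long data size, and the end marker `~#\r\n`. A language-neutral sketch of the same layout in Python (the helper name is illustrative; big-endian field order is assumed, matching the kotlinx.io default):

```python
import struct

def build_tag(meta_format_key: int, meta_size: int, data_size: int) -> bytes:
    """Pack the envelope tag fields in the order Tag.toBytes writes them."""
    return struct.pack(
        ">2s4sHIQ4s",     # big-endian: 2 + 4 + 2 + 4 + 8 + 4 = 24 bytes
        b"#~",            # START_SEQUENCE
        b"DF03",          # VERSION
        meta_format_key,  # writeShort(metaFormatKey)
        meta_size,        # writeUInt(metaSize)
        data_size,        # writeULong(dataSize)
        b"~#\r\n",        # END_SEQUENCE
    )

tag = build_tag(0x4a53, 128, 1024)  # 0x4a53 is the "JS" key from JsonMetaFormatFactory
assert len(tag) == 24
```

These fields sum to 24 bytes, which matches `buildPacket(24)` above; note that the `TAG_SIZE` constant in the same file is `26u`, so one of the two numbers appears inconsistent in the source.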
@ -1,6 +1,9 @@
package hep.dataforge.io

import hep.dataforge.meta.buildMeta
import hep.dataforge.meta.*
import kotlinx.serialization.json.JsonPrimitive
import kotlinx.serialization.json.json
import kotlinx.serialization.json.jsonArray
import kotlin.test.Test
import kotlin.test.assertEquals

@ -12,10 +15,11 @@ class MetaFormatTest {
"node" to {
"b" to "DDD"
"c" to 11.1
"array" to doubleArrayOf(1.0, 2.0, 3.0)
}
}
val string = meta.asString(BinaryMetaFormat)
val result = BinaryMetaFormat.parse(string)
val bytes = meta.toBytes(BinaryMetaFormat)
val result = BinaryMetaFormat.fromBytes(bytes)
assertEquals(meta, result)
}

@ -29,9 +33,41 @@ class MetaFormatTest {
"array" to doubleArrayOf(1.0, 2.0, 3.0)
}
}
val string = meta.asString(JsonMetaFormat)
val string = meta.toString(JsonMetaFormat)
val result = JsonMetaFormat.parse(string)
assertEquals(meta, result)

assertEquals<Meta>(meta, meta.seal())

meta.items.keys.forEach {
if (meta[it] != result[it]) error("${meta[it]} != ${result[it]}")
}

assertEquals<Meta>(meta, result)
}

@Test
fun testJsonToMeta() {
val json = jsonArray {
//top level array
+jsonArray {
+JsonPrimitive(88)
+json {
"c" to "aasdad"
"d" to true
}
}
+"value"
+jsonArray {
+JsonPrimitive(1.0)
+JsonPrimitive(2.0)
+JsonPrimitive(3.0)
}
}
val meta = json.toMetaItem().node!!

assertEquals(true, meta["@value[0].@value[1].d"].boolean)
assertEquals("value", meta["@value[1]"].string)
assertEquals(listOf(1.0, 2.0, 3.0), meta["@value[2]"].value?.list?.map { it.number.toDouble() })
}

}
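The `testJsonToMeta` case above relies on the `@value` wrapping rule from `JsonElement.toMetaItem`: an array that is not all-primitive is re-wrapped as a node whose items get indexed `@value[i]` keys, while an all-primitive array collapses to a single list value. A Python sketch of the resulting key scheme (the `flatten` helper is illustrative, not part of DataForge):

```python
def is_primitive_array(v):
    return isinstance(v, list) and all(not isinstance(x, (list, dict)) for x in v)

def flatten(value, prefix="@value"):
    """Map a JSON value to dotted/indexed meta keys, mirroring toMetaItem."""
    if isinstance(value, dict):
        out = {}
        for k, sub in value.items():
            out.update(flatten(sub, f"{prefix}.{k}"))
        return out
    if isinstance(value, list) and not is_primitive_array(value):
        out = {}
        for i, v in enumerate(value):
            if isinstance(v, list) and not is_primitive_array(v):
                # a nested mixed array is wrapped again under "@value"
                out.update(flatten(v, f"{prefix}[{i}].@value"))
            else:
                out.update(flatten(v, f"{prefix}[{i}]"))
        return out
    return {prefix: value}  # primitive or all-primitive array (ListValue)

m = flatten([[88, {"c": "aasdad", "d": True}], "value", [1.0, 2.0, 3.0]])
assert m["@value[0].@value[1].d"] is True   # same keys the test asserts
assert m["@value[1]"] == "value"
assert m["@value[2]"] == [1.0, 2.0, 3.0]
```

The three assertions reproduce exactly the keys checked by `testJsonToMeta` above.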
@ -0,0 +1,33 @@
package hep.dataforge.io

import hep.dataforge.meta.buildMeta
import hep.dataforge.names.toName
import kotlinx.serialization.json.Json
import kotlin.test.Test
import kotlin.test.assertEquals

class MetaSerializerTest {
@Test
fun testMetaSerialization() {
val meta = buildMeta {
"a" to 22
"node" to {
"b" to "DDD"
"c" to 11.1
"array" to doubleArrayOf(1.0, 2.0, 3.0)
}
}

val string = Json.indented.stringify(MetaSerializer, meta)
val restored = Json.plain.parse(MetaSerializer, string)
assertEquals(restored, meta)
}

@Test
fun testNameSerialization() {
val name = "a.b.c".toName()
val string = Json.indented.stringify(NameSerializer, name)
val restored = Json.plain.parse(NameSerializer, string)
assertEquals(restored, name)
}
}
@ -0,0 +1,21 @@
package hep.dataforge.io

import kotlinx.io.core.ByteReadPacket
import kotlinx.io.core.Input
import java.nio.channels.FileChannel
import java.nio.file.Files
import java.nio.file.Path
import java.nio.file.StandardOpenOption

@ExperimentalUnsignedTypes
class FileBinary(val path: Path, private val offset: UInt = 0u, size: ULong? = null) : RandomAccessBinary {

override val size: ULong = size ?: (Files.size(path).toULong() - offset).toULong()

override fun <R> read(from: UInt, size: UInt, block: Input.() -> R): R {
FileChannel.open(path, StandardOpenOption.READ).use {
val buffer = it.map(FileChannel.MapMode.READ_ONLY, (from + offset).toLong(), size.toLong())
return ByteReadPacket(buffer).block()
}
}
}
@ -0,0 +1,43 @@
package hep.dataforge.io

import hep.dataforge.meta.Meta
import kotlinx.io.nio.asInput
import kotlinx.io.nio.asOutput
import java.nio.file.Files
import java.nio.file.Path
import java.nio.file.StandardOpenOption

class FileEnvelope internal constructor(val path: Path, val format: EnvelopeFormat) : Envelope {
//TODO do not like this constructor. Hope to replace it later

private val partialEnvelope: PartialEnvelope

init {
val input = Files.newByteChannel(path, StandardOpenOption.READ).asInput()
partialEnvelope = format.run { input.readPartial() }
}

override val meta: Meta get() = partialEnvelope.meta

override val data: Binary? = FileBinary(path, partialEnvelope.dataOffset, partialEnvelope.dataSize)
}

fun Path.readEnvelope(format: EnvelopeFormat) = FileEnvelope(this, format)

fun Path.writeEnvelope(
envelope: Envelope,
format: EnvelopeFormat = TaggedEnvelopeFormat,
metaFormat: MetaFormat = JsonMetaFormat
) {
val output = Files.newByteChannel(
this,
StandardOpenOption.WRITE,
StandardOpenOption.CREATE,
StandardOpenOption.TRUNCATE_EXISTING
).asOutput()

with(format) {
output.writeEnvelope(envelope, metaFormat)
}
}
@ -1,10 +1,5 @@
plugins {
`npm-multiplatform`
id("scientifik.mpp")
}

description = "Meta definition and basic operations on meta"

kotlin {
jvm()
js()
}
@ -1,5 +1,29 @@
package hep.dataforge.descriptors

import hep.dataforge.descriptors.Described.Companion.DESCRIPTOR_NODE
import hep.dataforge.meta.Meta
import hep.dataforge.meta.get
import hep.dataforge.meta.node

/**
* An object which provides its descriptor
*/
interface Described {
val descriptor: NodeDescriptor

companion object {
const val DESCRIPTOR_NODE = "@descriptor"
}
}

/**
* If the meta node supplies an explicit descriptor, return it; otherwise try to use the descriptor node from the meta itself
*/
val Meta.descriptor: NodeDescriptor?
get() {
return if (this is Described) {
descriptor
} else {
get(DESCRIPTOR_NODE).node?.let { NodeDescriptor.wrap(it) }
}
}
@ -1,27 +1,133 @@
/*
* Copyright 2018 Alexander Nozik.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package hep.dataforge.descriptors

import hep.dataforge.meta.*
import hep.dataforge.names.NameToken
import hep.dataforge.names.toName
import hep.dataforge.values.False
import hep.dataforge.values.True
import hep.dataforge.values.Value
import hep.dataforge.values.ValueType

sealed class ItemDescriptor(override val config: Config) : Specific {

/**
* The name of this item
*/
var name: String by string { error("Anonymous descriptors are not allowed") }

/**
* True if same-name siblings with this name are allowed
*/
var multiple: Boolean by boolean(false)

/**
* The item description
*/
var info: String? by string()

/**
* A list of tags for this item. Tags are used to customize item usage
*/
var tags: List<String> by value { value ->
value?.list?.map { it.string } ?: emptyList()
}

/**
* True if the item is required
*/
abstract var required: Boolean
}

/**
* Descriptor for a meta node. Could contain additional information for viewing
* and editing.
*
* @author Alexander Nozik
*/
class NodeDescriptor(config: Config) : ItemDescriptor(config) {

/**
* True if the node is required
*/
override var required: Boolean by boolean { default == null }

/**
* The default for this node. Null if there is no default.
*/
var default: Meta? by node()

/**
* The list of value descriptors
*/
val values: Map<String, ValueDescriptor>
get() = config.getAll(VALUE_KEY.toName()).entries.associate { (name, node) ->
name to ValueDescriptor.wrap(node.node ?: error("Value descriptor must be a node"))
}

fun value(name: String, descriptor: ValueDescriptor) {
if (items.keys.contains(name)) error("The key $name already exists in descriptor")
val token = NameToken(VALUE_KEY, name)
config[token] = descriptor.config
}

/**
* Add a value descriptor using a builder block
*/
fun value(name: String, block: ValueDescriptor.() -> Unit) {
value(name, ValueDescriptor.build { this.name = name }.apply(block))
}

/**
* The map of children node descriptors
*/
val nodes: Map<String, NodeDescriptor>
get() = config.getAll(NODE_KEY.toName()).entries.associate { (name, node) ->
name to wrap(node.node ?: error("Node descriptor must be a node"))
}

fun node(name: String, descriptor: NodeDescriptor) {
if (items.keys.contains(name)) error("The key $name already exists in descriptor")
val token = NameToken(NODE_KEY, name)
config[token] = descriptor.config
}

fun node(name: String, block: NodeDescriptor.() -> Unit) {
node(name, build { this.name = name }.apply(block))
}

val items: Map<String, ItemDescriptor> get() = nodes + values

//override val descriptor: NodeDescriptor = empty("descriptor")

companion object : Specification<NodeDescriptor> {

// const val ITEM_KEY = "item"
const val NODE_KEY = "node"
const val VALUE_KEY = "value"

override fun wrap(config: Config): NodeDescriptor = NodeDescriptor(config)

//TODO infer descriptor from spec
}
}

/**
* A descriptor for meta value
*
@ -29,7 +135,15 @@ import hep.dataforge.values.ValueType
*
* @author Alexander Nozik
*/
class ValueDescriptor(override val config: Config) : Specific {
class ValueDescriptor(config: Config) : ItemDescriptor(config) {

/**
* True if the value is required
*/
override var required: Boolean by boolean { default == null }

/**
* The default for this value. Null if there is no default.
@ -42,34 +156,6 @@ class ValueDescriptor(override val config: Config) : Specific {
this.default = Value.of(v)
}

/**
* True if multiple values with this name are allowed.
*/
var multiple: Boolean by boolean(false)

/**
* True if the value is required
*/
var required: Boolean by boolean { default == null }

/**
* Value name
*/
var name: String by string { error("Anonymous descriptors are not allowed") }

/**
* The value info
*/
var info: String? by string()

/**
* A list of allowed ValueTypes. Empty if any value type is allowed
*
@ -83,10 +169,6 @@ class ValueDescriptor(override val config: Config) : Specific {
this.type = listOf(*t)
}

var tags: List<String> by value { value ->
value?.list?.map { it.string } ?: emptyList()
}

/**
* Check if a given value is allowed here: the type should be allowed and,
* if it is, the value should be within the allowed values
@ -126,7 +208,7 @@ class ValueDescriptor(override val config: Config) : Specific {
override fun wrap(config: Config): ValueDescriptor = ValueDescriptor(config)

inline fun <reified E : Enum<E>> enum(name: String) =
ValueDescriptor.build {
build {
this.name = name
type(ValueType.STRING)
this.allowedValues = enumValues<E>().map { Value.of(it.name) }
@ -1,128 +0,0 @@
/*
* Copyright 2018 Alexander Nozik.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

/*
* To change this license header, choose License Headers in Project Properties.
* To change this template file, choose Tools | Templates
* and open the template in the editor.
*/
package hep.dataforge.descriptors

import hep.dataforge.meta.*
import hep.dataforge.names.NameToken
import hep.dataforge.names.toName

/**
* Descriptor for meta node. Could contain additional information for viewing
* and editing.
*
* @author Alexander Nozik
*/
class NodeDescriptor(override val config: Config) : Specific {

/**
* The name of this node
*/
var name: String by string { error("Anonymous descriptors are not allowed") }

/**
* The default for this node. Null if there is no default.
*/
var default: Meta? by node()

/**
* True if multiple children with this nodes name are allowed. Anonymous
* nodes are always single
*/
var multiple: Boolean by boolean(false)

/**
* True if the node is required
*/
var required: Boolean by boolean { default == null }

/**
* The node description
*/
var info: String? by string()

/**
* A list of tags for this node. Tags used to customize node usage
*/
var tags: List<String> by value { value ->
value?.list?.map { it.string } ?: emptyList()
}

/**
* The list of value descriptors
*/
val values: Map<String, ValueDescriptor>
get() = config.getAll("value".toName()).entries.associate { (name, node) ->
name to ValueDescriptor.wrap(node.node ?: error("Value descriptor must be a node"))
}

fun value(name: String, descriptor: ValueDescriptor) {
val token = NameToken("value", name)
config[token] = descriptor.config
}

/**
* Add a value descriptor using block for
*/
fun value(name: String, block: ValueDescriptor.() -> Unit) {
value(name, ValueDescriptor.build { this.name = name }.apply(block))
}

/**
* The map of children node descriptors
*/
val nodes: Map<String, NodeDescriptor>
get() = config.getAll("node".toName()).entries.associate { (name, node) ->
name to NodeDescriptor.wrap(node.node ?: error("Node descriptor must be a node"))
}

fun node(name: String, descriptor: NodeDescriptor) {
val token = NameToken("node", name)
config[token] = descriptor.config
}

fun node(name: String, block: NodeDescriptor.() -> Unit) {
node(name, NodeDescriptor.build { this.name = name }.apply(block))
}

//override val descriptor: NodeDescriptor = empty("descriptor")

companion object : Specification<NodeDescriptor> {

override fun wrap(config: Config): NodeDescriptor = NodeDescriptor(config)

}
}
@ -3,18 +3,61 @@ package hep.dataforge.meta
import hep.dataforge.names.Name
import hep.dataforge.names.NameToken
import hep.dataforge.names.asName
import hep.dataforge.names.plus

//TODO add validator to configuration

data class MetaListener(
val owner: Any? = null,
val action: (name: Name, oldItem: MetaItem<*>?, newItem: MetaItem<*>?) -> Unit
)

/**
* Mutable meta representing object state
*/
open class Config : MutableMetaNode<Config>() {
class Config : AbstractMutableMeta<Config>() {

private val listeners = HashSet<MetaListener>()

private fun itemChanged(name: Name, oldItem: MetaItem<*>?, newItem: MetaItem<*>?) {
listeners.forEach { it.action(name, oldItem, newItem) }
}

/**
* Add a change listener to this meta. The owner is declared so that its listeners can be removed later. A listener without an owner can not be removed.
*/
fun onChange(owner: Any?, action: (Name, MetaItem<*>?, MetaItem<*>?) -> Unit) {
listeners.add(MetaListener(owner, action))
}

/**
* Remove all listeners belonging to the given owner
*/
fun removeListener(owner: Any?) {
listeners.removeAll { it.owner === owner }
}

override fun replaceItem(key: NameToken, oldItem: MetaItem<Config>?, newItem: MetaItem<Config>?) {
if (newItem == null) {
_items.remove(key)
if (oldItem != null && oldItem is MetaItem.NodeItem<Config>) {
oldItem.node.removeListener(this)
}
} else {
_items[key] = newItem
if (newItem is MetaItem.NodeItem) {
newItem.node.onChange(this) { name, oldChild, newChild ->
itemChanged(key + name, oldChild, newChild)
}
}
}
itemChanged(key.asName(), oldItem, newItem)
}

/**
* Attach a configuration node instead of creating one
*/
override fun wrap(name: Name, meta: Meta): Config = meta.toConfig()
override fun wrapNode(meta: Meta): Config = meta.toConfig()

override fun empty(): Config = Config()

@ -29,7 +72,7 @@ fun Meta.toConfig(): Config = this as? Config ?: Config().also { builder ->
this.items.mapValues { entry ->
val item = entry.value
builder[entry.key.asName()] = when (item) {
is MetaItem.ValueItem -> MetaItem.ValueItem(item.value)
is MetaItem.ValueItem -> item.value
is MetaItem.NodeItem -> MetaItem.NodeItem(item.node.toConfig())
}
}
@ -41,6 +84,6 @@ interface Configurable {

fun <T : Configurable> T.configure(meta: Meta): T = this.apply { config.update(meta) }

fun <T : Configurable> T.configure(action: Config.() -> Unit): T = this.apply { config.apply(action) }
fun <T : Configurable> T.configure(action: MetaBuilder.() -> Unit): T = configure(buildMeta(action))

open class SimpleConfigurable(override val config: Config) : Configurable
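The `replaceItem` override in `Config` above is what makes the tree observable: when a node item is attached, the parent subscribes to the child and re-fires the child's events with the child's key prefixed to the reported name, so a listener on the root sees the full path of every change. A minimal Python sketch of that propagation (class and method names are illustrative, not DataForge API):

```python
class Node:
    """Observable tree node: child edits bubble up with the key prefixed."""
    def __init__(self):
        self.items = {}
        self.listeners = []  # list of (owner, action) pairs

    def on_change(self, owner, action):
        self.listeners.append((owner, action))

    def remove_listener(self, owner):
        self.listeners = [(o, a) for o, a in self.listeners if o is not owner]

    def _fire(self, name, old, new):
        for _, action in self.listeners:
            action(name, old, new)

    def set_item(self, key, value):
        old = self.items.get(key)
        self.items[key] = value
        if isinstance(value, Node):
            # subscribe to the child and re-fire with the key prefixed
            value.on_change(self, lambda n, o, nw: self._fire(f"{key}.{n}", o, nw))
        self._fire(key, old, value)

events = []
root = Node()
root.on_change("test", lambda name, old, new: events.append(name))
child = Node()
root.set_item("node", child)
child.set_item("b", 42)
print(events)  # ['node', 'node.b']
```

Editing the child directly still notifies the root listener with the composed name, which is what `itemChanged(key + name, ...)` accomplishes in the Kotlin code.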
@ -1,36 +0,0 @@
package hep.dataforge.meta

import kotlin.properties.ReadWriteProperty
import kotlin.reflect.KProperty

/*
* Extra delegates for special cases
*/

/**
* A delegate for a string list
*/
class StringListConfigDelegate(
val config: Config,
private val key: String? = null,
private val default: List<String> = emptyList()
) :
ReadWriteProperty<Any?, List<String>> {
override fun getValue(thisRef: Any?, property: KProperty<*>): List<String> {
return config[key ?: property.name]?.value?.list?.map { it.string } ?: default
}

override fun setValue(thisRef: Any?, property: KProperty<*>, value: List<String>) {
val name = key ?: property.name
config[name] = value
}
}

fun Configurable.stringList(vararg default: String = emptyArray(), key: String? = null) =
StringListConfigDelegate(config, key, default.toList())

fun <T : Metoid> Metoid.child(key: String? = null, converter: (Meta) -> T) = ChildDelegate(meta, key, converter)

fun <T : Configurable> Configurable.child(key: String? = null, converter: (Meta) -> T) =
MutableMorphDelegate(config, key, converter)
@ -7,7 +7,7 @@ import hep.dataforge.names.NameToken
*
*
*/
class Laminate(layers: List<Meta>) : Meta {
class Laminate(layers: List<Meta>) : MetaBase() {

val layers: List<Meta> = layers.flatMap {
if (it is Laminate) {
@ -17,9 +17,9 @@ class Laminate(layers: List<Meta>) : Meta {
}
}

constructor(vararg layers: Meta) : this(layers.asList())
constructor(vararg layers: Meta?) : this(layers.filterNotNull())

override val items: Map<NameToken, MetaItem<out Meta>>
override val items: Map<NameToken, MetaItem<Meta>>
get() = layers.map { it.items.keys }.flatten().associateWith { key ->
layers.asSequence().map { it.items[key] }.filterNotNull().let(replaceRule)
}
@ -79,4 +79,14 @@ class Laminate(layers: List<Meta>) : Meta {
}
}

/**
* Create a new [Laminate] adding given layer to the top
*/
fun Laminate.withTop(meta: Meta): Laminate = Laminate(listOf(meta) + layers)

/**
* Create a new [Laminate] adding given layer to the bottom
*/
fun Laminate.withBottom(meta: Meta): Laminate = Laminate(layers + meta)

//TODO add custom rules for Laminate merge
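The `items` getter above resolves each key across the layers through `replaceRule`; with a first-non-null rule the topmost layer that defines a key wins, which is why `withTop` shadows existing values while `withBottom` supplies fallbacks. A minimal Python sketch of that lookup (assuming the first-non-null replace rule; the helper name is illustrative):

```python
def laminate_get(layers, key):
    """Return the value from the first (topmost) layer that defines the key."""
    for layer in layers:
        if key in layer:
            return layer[key]
    return None

base = {"a": 1, "b": 2}
top = {"b": 20}
layers = [top, base]  # withTop prepends, so `top` shadows `base`
print(laminate_get(layers, "b"))  # 20
print(laminate_get(layers, "a"))  # 1
```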
@@ -14,9 +14,14 @@ import hep.dataforge.values.boolean
 * * a [ValueItem] (leaf)
 * * a [NodeItem] (node)
 */
sealed class MetaItem<M : Meta> {
    data class ValueItem<M : Meta>(val value: Value) : MetaItem<M>()
    data class NodeItem<M : Meta>(val node: M) : MetaItem<M>()
sealed class MetaItem<out M : Meta> {
    data class ValueItem(val value: Value) : MetaItem<Nothing>() {
        override fun toString(): String = value.toString()
    }

    data class NodeItem<M : Meta>(val node: M) : MetaItem<M>() {
        override fun toString(): String = node.toString()
    }
}

/**
@@ -35,10 +40,19 @@ interface MetaRepr {
 * * Same name siblings are supported via elements with the same [Name] but different queries
 */
interface Meta : MetaRepr {
    val items: Map<NameToken, MetaItem<out Meta>>
    /**
     * Top level items of meta tree
     */
    val items: Map<NameToken, MetaItem<*>>

    override fun toMeta(): Meta = this

    override fun equals(other: Any?): Boolean

    override fun hashCode(): Int

    override fun toString(): String

    companion object {
        const val TYPE = "meta"
        /**
@@ -50,12 +64,7 @@ interface Meta : MetaRepr {

/* Get operations*/

/**
 * Fast [String]-based accessor for item map
 */
operator fun <T> Map<NameToken, T>.get(body: String, query: String = ""): T? = get(NameToken(body, query))

operator fun Meta?.get(name: Name): MetaItem<out Meta>? {
operator fun Meta?.get(name: Name): MetaItem<*>? {
    if (this == null) return null
    return name.first()?.let { token ->
        val tail = name.cutFirst()
@@ -66,13 +75,13 @@ operator fun Meta?.get(name: Name): MetaItem<out Meta>? {
    }
}

operator fun Meta?.get(token: NameToken): MetaItem<out Meta>? = this?.items?.get(token)
operator fun Meta?.get(key: String): MetaItem<out Meta>? = get(key.toName())
operator fun Meta?.get(token: NameToken): MetaItem<*>? = this?.items?.get(token)
operator fun Meta?.get(key: String): MetaItem<*>? = get(key.toName())

/**
 * Get all items matching given name.
 */
fun Meta.getAll(name: Name): Map<String, MetaItem<out Meta>> {
fun Meta.getAll(name: Name): Map<String, MetaItem<*>> {
    val root = when (name.length) {
        0 -> error("Can't use empty name for that")
        1 -> this
@@ -88,22 +97,37 @@ fun Meta.getAll(name: Name): Map<String, MetaItem<out Meta>> {
        ?: emptyMap()
}

fun Meta.getAll(name: String): Map<String, MetaItem<out Meta>> = getAll(name.toName())
fun Meta.getAll(name: String): Map<String, MetaItem<*>> = getAll(name.toName())

/**
 * Get a sequence of [Name]-[Value] pairs
 */
fun Meta.values(): Sequence<Pair<Name, Value>> {
    return items.asSequence().flatMap { entry ->
        val item = entry.value
    return items.asSequence().flatMap { (key, item) ->
        when (item) {
            is ValueItem -> sequenceOf(entry.key.asName() to item.value)
            is NodeItem -> item.node.values().map { pair -> (entry.key.asName() + pair.first) to pair.second }
            is ValueItem -> sequenceOf(key.asName() to item.value)
            is NodeItem -> item.node.values().map { pair -> (key.asName() + pair.first) to pair.second }
        }
    }
}

operator fun Meta.iterator(): Iterator<Pair<Name, Value>> = values().iterator()
/**
 * Get a sequence of all [Name]-[MetaItem] pairs for all items including nodes
 */
fun Meta.sequence(): Sequence<Pair<Name, MetaItem<*>>> {
    return sequence {
        items.forEach { (key, item) ->
            yield(key.asName() to item)
            if (item is NodeItem<*>) {
                yieldAll(item.node.sequence().map { (innerKey, innerItem) ->
                    (key + innerKey) to innerItem
                })
            }
        }
    }
}

operator fun Meta.iterator(): Iterator<Pair<Name, MetaItem<*>>> = sequence().iterator()
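The null-propagating `Meta?.get(name)` above splits the first token off the name and descends recursively. The same resolution scheme can be sketched self-containedly over nested maps, with names simplified to dot-separated strings (a toy model, not the library's `Name` machinery):

```kotlin
// Sketch of recursive name resolution: "a.b.c" splits into head "a" and tail "b.c";
// a null receiver yields null, mirroring the Meta?.get extension.
fun Map<String, Any>?.resolve(name: String): Any? {
    if (this == null) return null
    val head = name.substringBefore('.')
    val tail = name.substringAfter('.', missingDelimiterValue = "")
    val item = this[head]
    // When the tail is empty we are at the leaf; otherwise descend into a child node
    @Suppress("UNCHECKED_CAST")
    return if (tail.isEmpty()) item else (item as? Map<String, Any>).resolve(tail)
}
```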
/**
 * A meta node that ensures that all of its descendants has at least the same type
@@ -115,7 +139,7 @@ interface MetaNode<M : MetaNode<M>> : Meta {
/**
 * Get all items matching given name.
 */
fun <M : MetaNode<M>> MetaNode<M>.getAll(name: Name): Map<String, MetaItem<M>> {
fun <M : MetaNode<M>> M.getAll(name: Name): Map<String, MetaItem<M>> {
    val root: MetaNode<M>? = when (name.length) {
        0 -> error("Can't use empty name for that")
        1 -> this
@@ -133,7 +157,8 @@ fun <M : MetaNode<M>> MetaNode<M>.getAll(name: Name): Map<String, MetaItem<M>> {

fun <M : MetaNode<M>> M.getAll(name: String): Map<String, MetaItem<M>> = getAll(name.toName())

operator fun <M : MetaNode<M>> MetaNode<M>.get(name: Name): MetaItem<M>? {
operator fun <M : MetaNode<M>> MetaNode<M>?.get(name: Name): MetaItem<M>? {
    if (this == null) return null
    return name.first()?.let { token ->
        val tail = name.cutFirst()
        when (tail.length) {
@@ -143,23 +168,43 @@ operator fun <M : MetaNode<M>> MetaNode<M>.get(name: Name): MetaItem<M>? {
    }
}

operator fun <M : MetaNode<M>> MetaNode<M>?.get(key: String): MetaItem<M>? = this?.let { get(key.toName()) }
operator fun <M : MetaNode<M>> MetaNode<M>?.get(key: String): MetaItem<M>? = if (this == null) {
    null
} else {
    this[key.toName()]
}

operator fun <M : MetaNode<M>> MetaNode<M>?.get(key: NameToken): MetaItem<M>? = if (this == null) {
    null
} else {
    this[key.asName()]
}

/**
 * Equals, hashcode and to string for any meta
 */
abstract class MetaBase : Meta {

    override fun equals(other: Any?): Boolean = if (other is Meta) {
        this.items == other.items
//        val items = items
//        val otherItems = other.items
//        (items.keys == otherItems.keys) && items.keys.all {
//            items[it] == otherItems[it]
//        }
    } else {
        false
    }

    override fun hashCode(): Int = items.hashCode()

    override fun toString(): String = items.toString()
}

/**
 * Equals and hash code implementation for meta node
 */
abstract class AbstractMetaNode<M : MetaNode<M>> : MetaNode<M> {
    override fun equals(other: Any?): Boolean {
        if (this === other) return true
        if (other !is Meta) return false

        return this.items == other.items
    }

    override fun hashCode(): Int {
        return items.hashCode()
    }
}
abstract class AbstractMetaNode<M : MetaNode<M>> : MetaNode<M>, MetaBase()

/**
 * The meta implementation which is guaranteed to be immutable.
@@ -174,13 +219,14 @@ class SealedMeta internal constructor(override val items: Map<NameToken, MetaIte
 */
fun Meta.seal(): SealedMeta = this as? SealedMeta ?: SealedMeta(items.mapValues { entry -> entry.value.seal() })

@Suppress("UNCHECKED_CAST")
fun MetaItem<*>.seal(): MetaItem<SealedMeta> = when (this) {
    is MetaItem.ValueItem -> MetaItem.ValueItem(value)
    is MetaItem.NodeItem -> MetaItem.NodeItem(node.seal())
    is ValueItem -> this
    is NodeItem -> NodeItem(node.seal())
}

object EmptyMeta : Meta {
    override val items: Map<NameToken, MetaItem<out Meta>> = emptyMap()
object EmptyMeta : MetaBase() {
    override val items: Map<NameToken, MetaItem<*>> = emptyMap()
}

/**
@@ -188,8 +234,8 @@ object EmptyMeta : Meta {
 */

val MetaItem<*>?.value
    get() = (this as? MetaItem.ValueItem)?.value
        ?: (this?.node?.get(VALUE_KEY) as? MetaItem.ValueItem)?.value
    get() = (this as? ValueItem)?.value
        ?: (this?.node?.get(VALUE_KEY) as? ValueItem)?.value

val MetaItem<*>?.string get() = value?.string
val MetaItem<*>?.boolean get() = value?.boolean
@@ -211,8 +257,8 @@ val MetaItem<*>?.stringList get() = value?.list?.map { it.string } ?: emptyList(
val <M : Meta> MetaItem<M>?.node: M?
    get() = when (this) {
        null -> null
        is MetaItem.ValueItem -> error("Trying to interpret value meta item as node item")
        is MetaItem.NodeItem -> node
        is ValueItem -> error("Trying to interpret value meta item as node item")
        is NodeItem -> node
    }

/**
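The move to `MetaItem<out M>` with `ValueItem : MetaItem<Nothing>` is what lets a single value item be used wherever an item of any node type is expected: `Nothing` is a subtype of every type, so covariance makes `MetaItem<Nothing>` assignable to `MetaItem<M>` for any `M`. A standalone sketch of the same variance trick (the `Item` names here are illustrative, not the library's):

```kotlin
// Covariant sealed hierarchy: Leaf carries no tree type, so Item<Nothing>
// is assignable to Item<T> for every T thanks to the `out` modifier.
sealed class Item<out T> {
    data class Leaf(val value: String) : Item<Nothing>()
    data class Node<T>(val child: T) : Item<T>()
}

// Accepts Item<List<Int>>, yet a Leaf (an Item<Nothing>) may be passed in.
fun describe(item: Item<List<Int>>): String = when (item) {
    is Item.Leaf -> "leaf:" + item.value
    is Item.Node -> "node:" + item.child.size
}
```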
@@ -7,8 +7,8 @@ import hep.dataforge.values.Value
/**
 * DSL builder for meta. Is not intended to store mutable state
 */
class MetaBuilder : MutableMetaNode<MetaBuilder>() {
    override fun wrap(name: Name, meta: Meta): MetaBuilder = meta.builder()
class MetaBuilder : AbstractMutableMeta<MetaBuilder>() {
    override fun wrapNode(meta: Meta): MetaBuilder = if (meta is MetaBuilder) meta else meta.builder()
    override fun empty(): MetaBuilder = MetaBuilder()

    infix fun String.to(value: Any) {
@@ -29,6 +29,25 @@ class MetaBuilder : MutableMetaNode<MetaBuilder>() {
    infix fun String.to(metaBuilder: MetaBuilder.() -> Unit) {
        this@MetaBuilder[this] = MetaBuilder().apply(metaBuilder)
    }

    infix fun Name.to(value: Any) {
        if (value is Meta) {
            this@MetaBuilder[this] = value
        }
        this@MetaBuilder[this] = Value.of(value)
    }

    infix fun Name.to(meta: Meta) {
        this@MetaBuilder[this] = meta
    }

    infix fun Name.to(value: Iterable<Meta>) {
        this@MetaBuilder[this] = value.toList()
    }

    infix fun Name.to(metaBuilder: MetaBuilder.() -> Unit) {
        this@MetaBuilder[this] = MetaBuilder().apply(metaBuilder)
    }
}

/**
@@ -39,11 +58,19 @@ fun Meta.builder(): MetaBuilder {
    items.mapValues { entry ->
        val item = entry.value
        builder[entry.key.asName()] = when (item) {
            is MetaItem.ValueItem -> MetaItem.ValueItem<MetaBuilder>(item.value)
            is MetaItem.ValueItem -> item.value
            is MetaItem.NodeItem -> MetaItem.NodeItem(item.node.builder())
        }
    }
}

/**
 * Build a [MetaBuilder] using given transformation
 */
fun buildMeta(builder: MetaBuilder.() -> Unit): MetaBuilder = MetaBuilder().apply(builder)

/**
 * Build meta using given source meta as a base
 */
fun buildMeta(source: Meta, builder: MetaBuilder.() -> Unit): MetaBuilder = source.builder().apply(builder)
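The `infix fun String.to(...)` overloads give `MetaBuilder` its DSL shape (`"key" to value`, with a lambda overload for nested nodes). The pattern can be imitated self-containedly over a plain map; `TreeBuilder` and `buildTree` are hypothetical stand-ins, not DataForge names:

```kotlin
// Mini builder DSL in the MetaBuilder style: member-extension `to` overloads
// shadow kotlin.to inside the builder block and route values either to a
// scalar slot or to a nested builder.
class TreeBuilder {
    val entries = mutableMapOf<String, Any>()

    infix fun String.to(value: Any) {
        entries[this] = value
    }

    infix fun String.to(block: TreeBuilder.() -> Unit) {
        entries[this] = TreeBuilder().apply(block).entries
    }
}

fun buildTree(block: TreeBuilder.() -> Unit): Map<String, Any> =
    TreeBuilder().apply(block).entries
```

The design point is that a member extension declared on the builder wins over the stdlib `kotlin.to` while the builder is the implicit receiver, which is what makes the bare `"key" to value` syntax work inside the block.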
@@ -0,0 +1,170 @@
package hep.dataforge.meta

import hep.dataforge.names.Name

/**
 * A transformation for meta item or a group of items
 */
interface TransformationRule {

    /**
     * Check if this transformation should be applied to a given item
     */
    fun matches(name: Name, item: MetaItem<*>?): Boolean

    /**
     * Select all items to be transformed. Item could be a value as well as node
     *
     * @return a sequence of item paths to be transformed
     */
    fun selectItems(meta: Meta): Sequence<Name> =
        meta.sequence().filter { matches(it.first, it.second) }.map { it.first }

    /**
     * Apply transformation for a single item (Node or Value) and return resulting tree with absolute path
     */
    fun <M : MutableMeta<M>> transformItem(name: Name, item: MetaItem<*>?, target: M): Unit
}

/**
 * A transformation which keeps all elements matching [selector] unchanged.
 */
data class KeepTransformationRule(val selector: (Name) -> Boolean) : TransformationRule {
    override fun matches(name: Name, item: MetaItem<*>?): Boolean {
        return selector(name)
    }

    override fun selectItems(meta: Meta): Sequence<Name> =
        meta.sequence().map { it.first }.filter(selector)

    override fun <M : MutableMeta<M>> transformItem(name: Name, item: MetaItem<*>?, target: M) {
        if (selector(name)) target[name] = item
    }
}

/**
 * A transformation which transforms element with specific name
 */
data class SingleItemTransformationRule(
    val from: Name,
    val transform: MutableMeta<*>.(Name, MetaItem<*>?) -> Unit
) : TransformationRule {
    override fun matches(name: Name, item: MetaItem<*>?): Boolean {
        return name == from
    }

    override fun selectItems(meta: Meta): Sequence<Name> = sequenceOf(from)

    override fun <M : MutableMeta<M>> transformItem(name: Name, item: MetaItem<*>?, target: M) {
        if (name == this.from) {
            target.transform(name, item)
        }
    }
}

data class RegexItemTransformationRule(
    val from: Regex,
    val transform: MutableMeta<*>.(name: Name, MatchResult, MetaItem<*>?) -> Unit
) : TransformationRule {
    override fun matches(name: Name, item: MetaItem<*>?): Boolean {
        return from.matches(name.toString())
    }

    override fun <M : MutableMeta<M>> transformItem(name: Name, item: MetaItem<*>?, target: M) {
        val match = from.matchEntire(name.toString())
        if (match != null) {
            target.transform(name, match, item)
        }
    }
}

/**
 * A set of [TransformationRule] to either transform static meta or create dynamically updated [MutableMeta]
 */
inline class MetaTransformation(val transformations: Collection<TransformationRule>) {

    /**
     * Produce new meta using only those items that match transformation rules
     */
    fun transform(source: Meta): Meta = buildMeta {
        transformations.forEach { rule ->
            rule.selectItems(source).forEach { name ->
                rule.transformItem(name, source[name], this)
            }
        }
    }

    /**
     * Transform a meta, replacing all elements found in rules with transformed entries
     */
    fun apply(source: Meta): Meta = buildMeta(source) {
        transformations.forEach { rule ->
            rule.selectItems(source).forEach { name ->
                remove(name)
                rule.transformItem(name, source[name], this)
            }
        }
    }

    /**
     * Listens for changes in the source node and translates them into second node if transformation set contains a corresponding rule.
     */
    fun <M : MutableMeta<M>> bind(source: Config, target: M) {
        source.onChange(target) { name, _, newItem ->
            transformations.forEach { t ->
                if (t.matches(name, newItem)) {
                    t.transformItem(name, newItem, target)
                }
            }
        }
    }

    companion object {
        fun make(block: MetaTransformationBuilder.() -> Unit): MetaTransformation =
            MetaTransformationBuilder().apply(block).build()
    }
}

/**
 * A builder for a set of transformation rules
 */
class MetaTransformationBuilder {
    val transformations = HashSet<TransformationRule>()

    /**
     * Keep all items with name satisfying the criteria
     */
    fun keep(selector: (Name) -> Boolean) {
        transformations.add(KeepTransformationRule(selector))
    }

    /**
     * Keep specific item (including its descendants)
     */
    fun keep(name: Name) {
        keep { it == name }
    }

    /**
     * Keep nodes by regex
     */
    fun keep(regex: String) {
        transformations.add(RegexItemTransformationRule(regex.toRegex()) { name, _, metaItem ->
            setItem(name, metaItem)
        })
    }

    /**
     * Move an item from [from] to [to], optionally applying [operation] if defined
     */
    fun move(from: Name, to: Name, operation: (MetaItem<*>?) -> Any? = { it }) {
        transformations.add(
            SingleItemTransformationRule(from) { _, item ->
                set(to, operation(item))
            }
        )
    }

    fun build() = MetaTransformation(transformations)
}
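The transformation set above selects matching paths from a source and writes transformed items into a target. The same keep/move logic can be sketched self-containedly over string-keyed maps (a toy model with hypothetical names, not the library API):

```kotlin
// Toy transformation rules: each rule either keeps keys matching a selector
// or moves one key to a new name, mirroring KeepTransformationRule and move().
sealed class Rule {
    data class Keep(val selector: (String) -> Boolean) : Rule()
    data class Move(val from: String, val to: String) : Rule()
}

// Analogue of MetaTransformation.transform: only items touched by a rule
// make it into the result.
fun transform(source: Map<String, Any>, rules: List<Rule>): Map<String, Any> {
    val target = mutableMapOf<String, Any>()
    for (rule in rules) {
        when (rule) {
            is Rule.Keep -> source.filterKeys(rule.selector).forEach { (k, v) -> target[k] = v }
            is Rule.Move -> source[rule.from]?.let { target[rule.to] = it }
        }
    }
    return target
}
```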
@@ -3,17 +3,11 @@ package hep.dataforge.meta
import hep.dataforge.names.*
import hep.dataforge.values.Value

internal data class MetaListener(
    val owner: Any? = null,
    val action: (name: Name, oldItem: MetaItem<*>?, newItem: MetaItem<*>?) -> Unit
)


interface MutableMeta<M : MutableMeta<M>> : MetaNode<M> {
    override val items: Map<NameToken, MetaItem<M>>
    operator fun set(name: Name, item: MetaItem<M>?)
    fun onChange(owner: Any? = null, action: (Name, MetaItem<*>?, MetaItem<*>?) -> Unit)
    fun removeListener(owner: Any? = null)
    operator fun set(name: Name, item: MetaItem<*>?)
//    fun onChange(owner: Any? = null, action: (Name, MetaItem<*>?, MetaItem<*>?) -> Unit)
//    fun removeListener(owner: Any? = null)
}

/**
@@ -21,111 +15,102 @@ interface MutableMeta<M : MutableMeta<M>> : MetaNode<M> {
 *
 * Changes in Meta are not thread safe.
 */
abstract class MutableMetaNode<M : MutableMetaNode<M>> : AbstractMetaNode<M>(), MutableMeta<M> {
    private val listeners = HashSet<MetaListener>()

    /**
     * Add change listener to this meta. Owner is declared to be able to remove listeners later. Listener without owner could not be removed
     */
    override fun onChange(owner: Any?, action: (Name, MetaItem<*>?, MetaItem<*>?) -> Unit) {
        listeners.add(MetaListener(owner, action))
    }

    /**
     * Remove all listeners belonging to given owner
     */
    override fun removeListener(owner: Any?) {
        listeners.removeAll { it.owner === owner }
    }

    private val _items: MutableMap<NameToken, MetaItem<M>> = HashMap()
abstract class AbstractMutableMeta<M : MutableMeta<M>> : AbstractMetaNode<M>(), MutableMeta<M> {
    protected val _items: MutableMap<NameToken, MetaItem<M>> = HashMap()

    override val items: Map<NameToken, MetaItem<M>>
        get() = _items

    protected fun itemChanged(name: Name, oldItem: MetaItem<*>?, newItem: MetaItem<*>?) {
        listeners.forEach { it.action(name, oldItem, newItem) }
    }
    //protected abstract fun itemChanged(name: Name, oldItem: MetaItem<*>?, newItem: MetaItem<*>?)

    protected open fun replaceItem(key: NameToken, oldItem: MetaItem<M>?, newItem: MetaItem<M>?) {
        if (newItem == null) {
            _items.remove(key)
            oldItem?.node?.removeListener(this)
        } else {
            _items[key] = newItem
            if (newItem is MetaItem.NodeItem) {
                newItem.node.onChange(this) { name, oldChild, newChild ->
                    itemChanged(key + name, oldChild, newChild)
                }
            //itemChanged(key.asName(), oldItem, newItem)
            }
        }
        itemChanged(key.asName(), oldItem, newItem)

    @Suppress("UNCHECKED_CAST")
    protected fun wrapItem(item: MetaItem<*>?): MetaItem<M>? = when (item) {
        null -> null
        is MetaItem.ValueItem -> item
        is MetaItem.NodeItem -> MetaItem.NodeItem(wrapNode(item.node))
    }

    /**
     * Transform given meta to node type of this meta tree
     * @param name the name of the node where meta should be attached. Needed for correct assignment validators and styles
     * @param meta the node itself
     */
    internal abstract fun wrap(name: Name, meta: Meta): M
    protected abstract fun wrapNode(meta: Meta): M

    /**
     * Create empty node
     */
    internal abstract fun empty(): M

    override operator fun set(name: Name, item: MetaItem<M>?) {
    override operator fun set(name: Name, item: MetaItem<*>?) {
        when (name.length) {
            0 -> error("Can't setValue meta item for empty name")
            1 -> {
                val token = name.first()!!
                replaceItem(token, get(name), item)
                replaceItem(token, get(name), wrapItem(item))
            }
            else -> {
                val token = name.first()!!
                //get existing or create new node. Query is ignored for new node
                val child = this.items[token]?.node
                    ?: empty().also { this[token.body.toName()] = MetaItem.NodeItem(it) }
                child[name.cutFirst()] = item
                if (items[token] == null) {
                    replaceItem(token, null, MetaItem.NodeItem(empty()))
                }
                items[token]?.node!![name.cutFirst()] = item
            }
        }
    }
}

fun <M : MutableMeta<M>> MutableMeta<M>.remove(name: Name) = set(name, null)
fun <M : MutableMeta<M>> MutableMeta<M>.remove(name: String) = remove(name.toName())

fun <M : MutableMeta<M>> MutableMeta<M>.setValue(name: Name, value: Value) = set(name, MetaItem.ValueItem(value))
fun <M : MutableMeta<M>> MutableMeta<M>.setItem(name: String, item: MetaItem<M>) = set(name.toName(), item)
fun <M : MutableMeta<M>> MutableMeta<M>.setValue(name: String, value: Value) =
@Suppress("NOTHING_TO_INLINE")
inline fun MutableMeta<*>.remove(name: Name) = set(name, null)
@Suppress("NOTHING_TO_INLINE")
inline fun MutableMeta<*>.remove(name: String) = remove(name.toName())

fun MutableMeta<*>.setValue(name: Name, value: Value) =
    set(name, MetaItem.ValueItem(value))

fun MutableMeta<*>.setValue(name: String, value: Value) =
    set(name.toName(), MetaItem.ValueItem(value))

fun <M : MutableMeta<M>> MutableMeta<M>.setItem(token: NameToken, item: MetaItem<M>?) = set(token.asName(), item)
fun MutableMeta<*>.setItem(name: Name, item: MetaItem<*>?) {
    when (item) {
        null -> remove(name)
        is MetaItem.ValueItem -> setValue(name, item.value)
        is MetaItem.NodeItem<*> -> setNode(name, item.node)
    }
}

fun <M : MutableMetaNode<M>> MutableMetaNode<M>.setNode(name: Name, node: Meta) =
    set(name, MetaItem.NodeItem(wrap(name, node)))
fun MutableMeta<*>.setItem(name: String, item: MetaItem<*>?) = setItem(name.toName(), item)

fun <M : MutableMetaNode<M>> MutableMetaNode<M>.setNode(name: String, node: Meta) = setNode(name.toName(), node)
fun MutableMeta<*>.setNode(name: Name, node: Meta) =
    set(name, MetaItem.NodeItem(node))

fun MutableMeta<*>.setNode(name: String, node: Meta) = setNode(name.toName(), node)

/**
 * Universal set method
 */
operator fun <M : MutableMetaNode<M>> M.set(name: Name, value: Any?) {
operator fun MutableMeta<*>.set(name: Name, value: Any?) {
    when (value) {
        null -> remove(name)
        is MetaItem<*> -> when (value) {
            is MetaItem.ValueItem<*> -> setValue(name, value.value)
            is MetaItem.NodeItem<*> -> setNode(name, value.node)
        }
        is MetaItem<*> -> setItem(name, value)
        is Meta -> setNode(name, value)
        is Specific -> setNode(name, value.config)
        else -> setValue(name, Value.of(value))
    }
}

operator fun <M : MutableMetaNode<M>> M.set(name: NameToken, value: Any?) = set(name.asName(), value)
operator fun MutableMeta<*>.set(name: NameToken, value: Any?) = set(name.asName(), value)

operator fun <M : MutableMetaNode<M>> M.set(key: String, value: Any?) = set(key.toName(), value)
operator fun MutableMeta<*>.set(key: String, value: Any?) = set(key.toName(), value)

/**
 * Update existing mutable node with another node. The rules are following:
@@ -133,10 +118,9 @@ operator fun <M : MutableMetaNode<M>> M.set(key: String, value: Any?) = set(key.
 * * node updates node and replaces anything but node
 * * node list updates node list if number of nodes in the list is the same and replaces anything otherwise
 */
fun <M : MutableMetaNode<M>> M.update(meta: Meta) {
fun <M : MutableMeta<M>> M.update(meta: Meta) {
    meta.items.forEach { entry ->
        val value = entry.value
        when (value) {
        when (val value = entry.value) {
            is MetaItem.ValueItem -> setValue(entry.key.asName(), value.value)
            is MetaItem.NodeItem -> (this[entry.key.asName()] as? MetaItem.NodeItem)?.node?.update(value.node)
                ?: run { setNode(entry.key.asName(), value.node) }
@@ -146,10 +130,10 @@ fun <M : MutableMetaNode<M>> M.update(meta: Meta) {

/* Same name siblings generation */

fun <M : MutableMeta<M>> M.setIndexed(
fun MutableMeta<*>.setIndexedItems(
    name: Name,
    items: Iterable<MetaItem<M>>,
    indexFactory: MetaItem<M>.(index: Int) -> String = { it.toString() }
    items: Iterable<MetaItem<*>>,
    indexFactory: MetaItem<*>.(index: Int) -> String = { it.toString() }
) {
    val tokens = name.tokens.toMutableList()
    val last = tokens.last()
@@ -160,21 +144,21 @@ fun <M : MutableMeta<M>> M.setIndexed(
    }
}

fun <M : MutableMetaNode<M>> M.setIndexed(
fun MutableMeta<*>.setIndexed(
    name: Name,
    metas: Iterable<Meta>,
    indexFactory: MetaItem<M>.(index: Int) -> String = { it.toString() }
    indexFactory: MetaItem<*>.(index: Int) -> String = { it.toString() }
) {
    setIndexed(name, metas.map { MetaItem.NodeItem(wrap(name, it)) }, indexFactory)
    setIndexedItems(name, metas.map { MetaItem.NodeItem(it) }, indexFactory)
}

operator fun <M : MutableMetaNode<M>> M.set(name: Name, metas: Iterable<Meta>) = setIndexed(name, metas)
operator fun <M : MutableMetaNode<M>> M.set(name: String, metas: Iterable<Meta>) = setIndexed(name.toName(), metas)
operator fun MutableMeta<*>.set(name: Name, metas: Iterable<Meta>): Unit = setIndexed(name, metas)
operator fun MutableMeta<*>.set(name: String, metas: Iterable<Meta>): Unit = setIndexed(name.toName(), metas)

/**
 * Append the node with a same-name-sibling, automatically generating numerical index
 */
fun <M : MutableMetaNode<M>> M.append(name: Name, value: Any?) {
fun MutableMeta<*>.append(name: Name, value: Any?) {
    require(!name.isEmpty()) { "Name could not be empty for append operation" }
    val newIndex = name.last()!!.index
    if (newIndex.isNotEmpty()) {
@@ -185,4 +169,4 @@ fun <M : MutableMetaNode<M>> M.append(name: Name, value: Any?) {
    }
}

fun <M : MutableMetaNode<M>> M.append(name: String, value: Any?) = append(name.toName(), value)
fun MutableMeta<*>.append(name: String, value: Any?) = append(name.toName(), value)
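`AbstractMutableMeta.set` handles a multi-token name by creating the missing intermediate node first and then recursing with the tail of the name. The recursion can be sketched over nested mutable maps (simplified to dot-separated string keys; `deepSet` is a hypothetical name):

```kotlin
// Sketch of deep set: for "a.b.c", ensure node "a" exists, then recurse with "b.c".
fun deepSet(target: MutableMap<String, Any>, name: String, value: Any) {
    val head = name.substringBefore('.')
    val tail = name.substringAfter('.', missingDelimiterValue = "")
    if (tail.isEmpty()) {
        target[head] = value
    } else {
        // Get the existing child node or create an empty one, as AbstractMutableMeta.set does
        @Suppress("UNCHECKED_CAST")
        val child = target.getOrPut(head) { mutableMapOf<String, Any>() } as MutableMap<String, Any>
        deepSet(child, tail, value)
    }
}
```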
@@ -1,11 +1,13 @@
package hep.dataforge.meta

/**
 * Marker interface for specifications
 * Marker interface for classes with specifications
 */
interface Specific : Configurable {
    operator fun get(name: String): MetaItem<Config>? = config[name]
}
interface Specific : Configurable

//TODO separate mutable config from immutable meta to allow free wrapping of meta

operator fun Specific.get(name: String): MetaItem<*>? = config[name]

/**
 * Allows to apply custom configuration in a type safe way to simple untyped configuration.
@@ -29,6 +31,7 @@ interface Specification<T : Specific> {
 */
fun wrap(config: Config): T

//TODO replace by free wrapper
fun wrap(meta: Meta): T = wrap(meta.toConfig())
}

@@ -59,4 +62,4 @@ fun <C : Specific, S : Specification<C>> S.createStyle(action: C.() -> Unit): Me
fun <C : Specific> Specific.spec(
    spec: Specification<C>,
    key: String? = null
) = MutableMorphDelegate(config, key) { spec.wrap(it) }
): MutableMorphDelegate<Config, C> = MutableMorphDelegate(config, key) { spec.wrap(it) }
@@ -11,7 +11,11 @@ import kotlin.reflect.KProperty
 * @param base - unchangeable base
 * @param style - the style
 */
class Styled(val base: Meta, val style: Config = Config().empty()) : MutableMeta<Styled> {
class Styled(val base: Meta, val style: Config = Config().empty()) : AbstractMutableMeta<Styled>() {
    override fun wrapNode(meta: Meta): Styled = Styled(meta)

    override fun empty(): Styled = Styled(EmptyMeta)

    override val items: Map<NameToken, MetaItem<Styled>>
        get() = (base.items.keys + style.items.keys).associate { key ->
            val value = base.items[key]
@@ -19,10 +23,10 @@ class Styled(val base: Meta, val style: Config = Config().empty()) : MutableMeta
            val item: MetaItem<Styled> = when (value) {
                null -> when (styleValue) {
                    null -> error("Should be unreachable")
                    is MetaItem.ValueItem -> MetaItem.ValueItem(styleValue.value)
                    is MetaItem.NodeItem -> MetaItem.NodeItem(Styled(style.empty(), styleValue.node))
                    is MetaItem.ValueItem -> styleValue
                }
                is MetaItem.ValueItem -> MetaItem.ValueItem(value.value)
                is MetaItem.ValueItem -> value
                is MetaItem.NodeItem -> MetaItem.NodeItem(
                    Styled(value.node, styleValue?.node ?: Config.empty())
                )
@@ -30,7 +34,7 @@ class Styled(val base: Meta, val style: Config = Config().empty()) : MutableMeta
            key to item
        }

    override fun set(name: Name, item: MetaItem<Styled>?) {
    override fun set(name: Name, item: MetaItem<*>?) {
        if (item == null) {
            style.remove(name)
        } else {
@@ -38,12 +42,12 @@ class Styled(val base: Meta, val style: Config = Config().empty()) : MutableMeta
        }
    }

    override fun onChange(owner: Any?, action: (Name, before: MetaItem<*>?, after: MetaItem<*>?) -> Unit) {
    fun onChange(owner: Any?, action: (Name, before: MetaItem<*>?, after: MetaItem<*>?) -> Unit) {
        //TODO test correct behavior
        style.onChange(owner) { name, before, after -> action(name, before ?: base[name], after ?: base[name]) }
    }

    override fun removeListener(owner: Any?) {
    fun removeListener(owner: Any?) {
        style.removeListener(owner)
    }
}
@ -1,5 +1,6 @@
|
||||
package hep.dataforge.meta
|
||||
|
||||
import hep.dataforge.values.DoubleArrayValue
import hep.dataforge.values.Null
import hep.dataforge.values.Value
import kotlin.jvm.JvmName
@@ -10,25 +11,24 @@ import kotlin.jvm.JvmName
 /**
  * A property delegate that uses custom key
  */
-fun Configurable.value(default: Any = Null, key: String? = null) =
+fun Configurable.value(default: Any = Null, key: String? = null): MutableValueDelegate<Config> =
     MutableValueDelegate(config, key, Value.of(default))

-fun <T> Configurable.value(default: T? = null, key: String? = null, transform: (Value?) -> T) =
-    MutableValueDelegate(config, key, Value.of(default)).transform(reader = transform)
+fun <T> Configurable.value(
+    default: T? = null,
+    key: String? = null,
+    writer: (T) -> Value = { Value.of(it) },
+    reader: (Value?) -> T
+): ReadWriteDelegateWrapper<Value?, T> =
+    MutableValueDelegate(config, key, default?.let { Value.of(it) }).transform(reader = reader, writer = writer)

-fun Configurable.stringList(key: String? = null) =
-    value(key) { it?.list?.map { value -> value.string } ?: emptyList() }
-
-fun Configurable.numberList(key: String? = null) =
-    value(key) { it?.list?.map { value -> value.number } ?: emptyList() }
-
-fun Configurable.string(default: String? = null, key: String? = null) =
+fun Configurable.string(default: String? = null, key: String? = null): MutableStringDelegate<Config> =
     MutableStringDelegate(config, key, default)

-fun Configurable.boolean(default: Boolean? = null, key: String? = null) =
+fun Configurable.boolean(default: Boolean? = null, key: String? = null): MutableBooleanDelegate<Config> =
     MutableBooleanDelegate(config, key, default)

-fun Configurable.number(default: Number? = null, key: String? = null) =
+fun Configurable.number(default: Number? = null, key: String? = null): MutableNumberDelegate<Config> =
     MutableNumberDelegate(config, key, default)

 /* Number delegates*/
@@ -111,3 +111,26 @@ fun <T : Specific> Configurable.spec(spec: Specification<T>, key: String? = null

 fun <T : Specific> Configurable.spec(builder: (Config) -> T, key: String? = null) =
     MutableMorphDelegate(config, key) { specification(builder).wrap(it) }

+/*
+ * Extra delegates for special cases
+ */
+
+fun Configurable.stringList(key: String? = null): ReadWriteDelegateWrapper<Value?, List<String>> =
+    value(emptyList(), key) { it?.list?.map { value -> value.string } ?: emptyList() }
+
+fun Configurable.numberList(key: String? = null): ReadWriteDelegateWrapper<Value?, List<Number>> =
+    value(emptyList(), key) { it?.list?.map { value -> value.number } ?: emptyList() }
+
+/**
+ * A special delegate for double arrays
+ */
+fun Configurable.doubleArray(key: String? = null): ReadWriteDelegateWrapper<Value?, DoubleArray> =
+    value(doubleArrayOf(), key) {
+        (it as? DoubleArrayValue)?.value
+            ?: it?.list?.map { value -> value.number.toDouble() }?.toDoubleArray()
+            ?: doubleArrayOf()
+    }

 fun <T : Configurable> Configurable.child(key: String? = null, converter: (Meta) -> T) =
     MutableMorphDelegate(config, key, converter)
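To see how the delegate extensions above are meant to be consumed, here is an illustrative sketch. The `MyElement` class is hypothetical and assumes only the `Configurable` API visible in this diff:

```kotlin
// Illustrative only: MyElement is a hypothetical Configurable implementation.
// Each property is backed by the config tree through the delegates in this diff.
class MyElement(override val config: Config) : Configurable {
    var enabled by boolean(default = true)    // MutableBooleanDelegate<Config>
    var label by string(default = "unnamed")  // MutableStringDelegate<Config>
    var points by doubleArray()               // falls back to doubleArrayOf() when absent
}
```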
@@ -113,8 +113,11 @@ class SafeEnumDelegate<E : Enum<E>>(

 //Child node delegate

-class ChildDelegate<T>(val meta: Meta, private val key: String? = null, private val converter: (Meta) -> T) :
-    ReadOnlyProperty<Any?, T?> {
+class ChildDelegate<T>(
+    val meta: Meta,
+    private val key: String? = null,
+    private val converter: (Meta) -> T
+) : ReadOnlyProperty<Any?, T?> {
     override fun getValue(thisRef: Any?, property: KProperty<*>): T? {
         return meta[key ?: property.name]?.node?.let { converter(it) }
     }
@@ -164,6 +167,8 @@ inline fun <reified E : Enum<E>> Meta.enum(default: E, key: String? = null) =
     SafeEnumDelegate(this, key, default) { enumValueOf(it) }

+fun <T : Metoid> Metoid.child(key: String? = null, converter: (Meta) -> T) = ChildDelegate(meta, key, converter)

 /* Read-write delegates */

 class MutableValueDelegate<M : MutableMeta<M>>(
@@ -327,7 +332,7 @@ class MutableSafeEnumvDelegate<M : MutableMeta<M>, E : Enum<E>>(

 //Child node delegate

-class MutableNodeDelegate<M : MutableMetaNode<M>>(
+class MutableNodeDelegate<M : MutableMeta<M>>(
     val meta: M,
     private val key: String? = null
 ) : ReadWriteProperty<Any?, Meta?> {
@@ -340,7 +345,7 @@ class MutableNodeDelegate<M : MutableMetaNode<M>>(
     }
 }

-class MutableMorphDelegate<M : MutableMetaNode<M>, T : Configurable>(
+class MutableMorphDelegate<M : MutableMeta<M>, T : Configurable>(
     val meta: M,
     private val key: String? = null,
     private val converter: (Meta) -> T
@@ -390,7 +395,7 @@ fun <M : MutableMeta<M>> M.boolean(default: Boolean? = null, key: String? = null
 fun <M : MutableMeta<M>> M.number(default: Number? = null, key: String? = null) =
     MutableNumberDelegate(this, key, default)

-fun <M : MutableMetaNode<M>> M.node(key: String? = null) = MutableNodeDelegate(this, key)
+fun <M : MutableMeta<M>> M.node(key: String? = null) = MutableNodeDelegate(this, key)

 @JvmName("safeString")
 fun <M : MutableMeta<M>> M.string(default: String, key: String? = null) =
@@ -6,10 +6,9 @@ package hep.dataforge.names
  * The name is a dot separated list of strings like `token1.token2.token3`.
  * Each token could contain additional index in square brackets.
  */
-inline class Name constructor(val tokens: List<NameToken>) {
+class Name(val tokens: List<NameToken>) {

-    val length
-        get() = tokens.size
+    val length get() = tokens.size

     /**
      * First token of the name or null if it is empty
@@ -35,6 +34,23 @@ inline class Name constructor(val tokens: List<NameToken>) {

     override fun toString(): String = tokens.joinToString(separator = NAME_SEPARATOR) { it.toString() }

+    override fun equals(other: Any?): Boolean {
+        return when (other) {
+            is Name -> this.tokens == other.tokens
+            is NameToken -> this.length == 1 && this.tokens.first() == other
+            else -> false
+        }
+    }
+
+    override fun hashCode(): Int {
+        return if (tokens.size == 1) {
+            tokens.first().hashCode()
+        } else {
+            tokens.hashCode()
+        }
+    }

     companion object {
         const val NAME_SEPARATOR = "."
     }
@@ -51,32 +67,55 @@ data class NameToken(val body: String, val index: String = "") {
         if (body.isEmpty()) error("Syntax error: Name token body is empty")
     }

+    private fun String.escape() =
+        replace("\\", "\\\\")
+            .replace(".", "\\.")
+            .replace("[", "\\[")
+            .replace("]", "\\]")

     override fun toString(): String = if (hasIndex()) {
-        "$body[$index]"
+        "${body.escape()}[$index]"
     } else {
-        body
+        body.escape()
     }

     fun hasIndex() = index.isNotEmpty()
 }

 /**
  * Convert a [String] to name parsing it and extracting name tokens and index syntax.
  * This operation is rather heavy so it should be used with care in high performance code.
  */
 fun String.toName(): Name {
     if (isBlank()) return EmptyName
     val tokens = sequence {
         var bodyBuilder = StringBuilder()
         var queryBuilder = StringBuilder()
         var bracketCount: Int = 0
+        var escape: Boolean = false
         fun queryOn() = bracketCount > 0

-        asSequence().forEach {
+        for (it in this@toName) {
             when {
+                escape -> {
+                    if (queryOn()) {
+                        queryBuilder.append(it)
+                    } else {
+                        bodyBuilder.append(it)
+                    }
+                    escape = false
+                }
+                it == '\\' -> {
+                    escape = true
+                }
                 queryOn() -> {
                     when (it) {
                         '[' -> bracketCount++
                         ']' -> bracketCount--
                     }
                     if (queryOn()) queryBuilder.append(it)
-                } else {
-                    when (it) {
+                }
+                else -> when (it) {
                     '.' -> {
                         yield(NameToken(bodyBuilder.toString(), queryBuilder.toString()))
                         bodyBuilder = StringBuilder()
@@ -96,6 +135,14 @@ fun String.toName(): Name {
     return Name(tokens.toList())
 }

+/**
+ * Convert the [String] to a [Name] by simply wrapping it in a single name token without parsing.
+ * The input string could contain dots and braces, but they are just escaped, not parsed.
+ */
+fun String.asName(): Name {
+    return NameToken(this).asName()
+}

 operator fun NameToken.plus(other: Name): Name = Name(listOf(this) + other.tokens)

 operator fun Name.plus(other: Name): Name = Name(this.tokens + other.tokens)
@@ -121,5 +168,20 @@ fun Name.withIndex(index: String): Name {
     return Name(tokens)
 }

 /**
  * Fast [String]-based accessor for item map
  */
 operator fun <T> Map<NameToken, T>.get(body: String, query: String = ""): T? = get(NameToken(body, query))

 operator fun <T> Map<Name, T>.get(name: String) = get(name.toName())
 operator fun <T> MutableMap<Name, T>.set(name: String, value: T) = set(name.toName(), value)

 /* Name comparison operations */

 fun Name.startsWith(token: NameToken): Boolean = first() == token

 fun Name.endsWith(token: NameToken): Boolean = last() == token

 fun Name.startsWith(name: Name): Boolean = tokens.subList(0, name.length) == name.tokens

 fun Name.endsWith(name: Name): Boolean = tokens.subList(length - name.length, length) == name.tokens
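The escaping logic added above distinguishes two ways of turning a string into a `Name`. A brief sketch of the resulting semantics, using only the API visible in this diff:

```kotlin
// Sketch of the parsing semantics (not part of the commit itself).
val parsed = "token1.token2[2].token3".toName()  // three tokens; the second has index "2"
val escaped = "token\\.one.token2".toName()      // two tokens: "token.one" and "token2"
val wrapped = "token.one".asName()               // a single token; the dot is escaped, not parsed
// The round trip is stable because NameToken.toString() re-escapes special characters:
// parsed.toString().toName() == parsed
```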
@@ -43,6 +43,8 @@ interface Value {
     val list: List<Value>
         get() = listOf(this)

+    override fun equals(other: Any?): Boolean

     companion object {
         const val TYPE = "value"
@@ -82,6 +84,8 @@ object Null : Value {
     override val string: String get() = "@null"

     override fun toString(): String = value.toString()

+    override fun equals(other: Any?): Boolean = other === Null
 }

 /**
@@ -97,9 +101,12 @@ object True : Value {
     override val value: Any? get() = true
     override val type: ValueType get() = ValueType.BOOLEAN
     override val number: Number get() = 1.0
-    override val string: String get() = "+"
+    override val string: String get() = "true"

     override fun toString(): String = value.toString()

+    override fun equals(other: Any?): Boolean = other === True
 }

 /**
@@ -109,7 +116,11 @@ object False : Value {
     override val value: Any? get() = false
     override val type: ValueType get() = ValueType.BOOLEAN
     override val number: Number get() = -1.0
-    override val string: String get() = "-"
+    override val string: String get() = "false"

     override fun toString(): String = True.value.toString()

+    override fun equals(other: Any?): Boolean = other === False
 }

 val Value.boolean get() = this == True || this.list.firstOrNull() == True || (type == ValueType.STRING && string.toBoolean())
@@ -122,12 +133,12 @@ class NumberValue(override val number: Number) : Value {
     override fun equals(other: Any?): Boolean {
         if (other !is Value) return false
         return when (number) {
-            is Short -> number == other.number.toShort()
-            is Long -> number == other.number.toLong()
-            is Byte -> number == other.number.toByte()
-            is Int -> number == other.number.toInt()
-            is Float -> number == other.number.toFloat()
-            is Double -> number == other.number.toDouble()
+            is Short -> number.toShort() == other.number.toShort()
+            is Long -> number.toLong() == other.number.toLong()
+            is Byte -> number.toByte() == other.number.toByte()
+            is Int -> number.toInt() == other.number.toInt()
+            is Float -> number.toFloat() == other.number.toFloat()
+            is Double -> number.toDouble() == other.number.toDouble()
             else -> number.toString() == other.number.toString()
         }
     }
@@ -148,7 +159,7 @@ class StringValue(override val string: String) : Value {

     override fun hashCode(): Int = string.hashCode()

-    override fun toString(): String = value.toString()
+    override fun toString(): String = "\"${value.toString()}\""
 }

 class EnumValue<E : Enum<*>>(override val value: E) : Value {
@@ -177,11 +188,14 @@ class ListValue(override val list: List<Value>) : Value {
     override val number: Number get() = list.first().number
     override val string: String get() = list.first().string

-    override fun toString(): String = value.toString()
+    override fun toString(): String = list.joinToString(prefix = "[", postfix = "]")

+    override fun equals(other: Any?): Boolean {
+        if (this === other) return true
+        if (other !is Value) return false
+        if (other is DoubleArrayValue) {
+
+        }
+        return list == other.list
+    }

@@ -206,9 +220,6 @@ fun String.asValue(): Value = StringValue(this)

 fun Iterable<Value>.asValue(): Value = ListValue(this.toList())

-//TODO maybe optimized storage performance
-fun DoubleArray.asValue(): Value = ListValue(map { NumberValue(it) })

 fun IntArray.asValue(): Value = ListValue(map { NumberValue(it) })

 fun LongArray.asValue(): Value = ListValue(map { NumberValue(it) })
@@ -254,16 +265,3 @@ fun String.parseValue(): Value {
     //Give up and return a StringValue
     return StringValue(this)
 }

-class LazyParsedValue(override val string: String) : Value {
-    private val parsedValue by lazy { string.parseValue() }
-
-    override val value: Any?
-        get() = parsedValue.value
-    override val type: ValueType
-        get() = parsedValue.type
-    override val number: Number
-        get() = parsedValue.number
-
-    override fun toString(): String = value.toString()
-}
@@ -0,0 +1,47 @@
+package hep.dataforge.values
+
+
+/**
+ * A value built from string which content and type are parsed on-demand
+ */
+class LazyParsedValue(override val string: String) : Value {
+    private val parsedValue by lazy { string.parseValue() }
+
+    override val value: Any? get() = parsedValue.value
+    override val type: ValueType get() = parsedValue.type
+    override val number: Number get() = parsedValue.number
+
+    override fun toString(): String = string
+
+    override fun equals(other: Any?): Boolean = other is Value && this.parsedValue == other
+}
+
+fun String.lazyParseValue(): LazyParsedValue = LazyParsedValue(this)
+
+/**
+ * A performance optimized version of list value for doubles
+ */
+class DoubleArrayValue(override val value: DoubleArray) : Value {
+    override val type: ValueType get() = ValueType.NUMBER
+    override val number: Double get() = value.first()
+    override val string: String get() = value.first().toString()
+    override val list: List<Value> get() = value.map { NumberValue(it) }
+
+    override fun equals(other: Any?): Boolean {
+        if (this === other) return true
+        if (other !is Value) return false
+
+        return when (other) {
+            is DoubleArrayValue -> value.contentEquals(other.value)
+            else -> list == other.list
+        }
+    }
+
+    override fun hashCode(): Int {
+        return value.contentHashCode()
+    }
+
+    override fun toString(): String = list.joinToString(prefix = "[", postfix = "]")
+}
+
+fun DoubleArray.asValue(): DoubleArrayValue = DoubleArrayValue(this)
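The point of the new `DoubleArrayValue` is to keep the raw `DoubleArray` unboxed while remaining comparable with the generic list representation. A brief sketch, using only the API in this diff:

```kotlin
// Sketch: the raw array is stored as-is; the generic path boxes each element.
val fast = doubleArrayOf(1.0, 2.0, 3.0).asValue()          // DoubleArrayValue, no boxing
val boxed = listOf(1.0, 2.0, 3.0).map { NumberValue(it) }  // boxed element values
// The equals contract above makes fast.list == boxed, and two DoubleArrayValues
// compare via contentEquals on the underlying arrays.
```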
@@ -0,0 +1,31 @@
+package hep.dataforge.descriptors
+
+import hep.dataforge.values.ValueType
+import kotlin.test.Test
+import kotlin.test.assertEquals
+
+class DescriptorTest {
+
+    val descriptor = NodeDescriptor.build {
+        node("aNode") {
+            info = "A root demo node"
+            value("b") {
+                info = "b number value"
+                type(ValueType.NUMBER)
+            }
+            node("otherNode") {
+                value("otherValue") {
+                    type(ValueType.BOOLEAN)
+                    default(false)
+                    info = "default value"
+                }
+            }
+        }
+    }
+
+    @Test
+    fun testAllowedValues() {
+        val allowed = descriptor.nodes["aNode"]?.values?.get("b")?.allowedValues
+        assertEquals(allowed, emptyList())
+    }
+}
@@ -0,0 +1,22 @@
+package hep.dataforge.meta
+
+import kotlin.test.Test
+import kotlin.test.assertEquals
+
+class MutableMetaTest{
+    @Test
+    fun testRemove(){
+        val meta = buildMeta {
+            "aNode" to {
+                "innerNode" to {
+                    "innerValue" to true
+                }
+                "b" to 22
+                "c" to "StringValue"
+            }
+        }.toConfig()
+
+        meta.remove("aNode.c")
+        assertEquals(meta["aNode.c"], null)
+    }
+}
@@ -2,6 +2,8 @@ package hep.dataforge.names

 import kotlin.test.Test
 import kotlin.test.assertEquals
+import kotlin.test.assertFalse
+import kotlin.test.assertTrue

 class NameTest {
     @Test
@@ -16,4 +18,24 @@ class NameTest {
         val name2 = "token1".toName() + "token2[2].token3"
         assertEquals(name1, name2)
     }
+
+    @Test
+    fun comparisonTest(){
+        val name1 = "token1.token2.token3".toName()
+        val name2 = "token1.token2".toName()
+        val name3 = "token3".toName()
+        assertTrue { name1.startsWith(name2) }
+        assertTrue { name1.endsWith(name3) }
+        assertFalse { name1.startsWith(name3) }
+    }
+
+    @Test
+    fun escapeTest(){
+        val escapedName = "token\\.one.token2".toName()
+        val unescapedName = "token\\.one.token2".asName()
+
+        assertEquals(2, escapedName.length)
+        assertEquals(1, unescapedName.length)
+        assertEquals(escapedName, escapedName.toString().toName())
+    }
+}
@@ -29,15 +29,16 @@ fun Meta.toDynamic(): dynamic {
     return res
 }

-class DynamicMeta(val obj: dynamic) : Meta {
+class DynamicMeta(val obj: dynamic) : MetaBase() {
     private fun keys() = js("Object.keys(this.obj)") as Array<String>

     private fun isArray(@Suppress("UNUSED_PARAMETER") obj: dynamic): Boolean =
         js("Array.isArray(obj)") as Boolean

     @Suppress("UNCHECKED_CAST")
     private fun asItem(obj: dynamic): MetaItem<DynamicMeta>? {
         if (obj == null) return MetaItem.ValueItem(Null)
-        return when (jsTypeOf(obj)) {
+        return when (jsTypeOf(obj as? Any)) {
             "boolean" -> MetaItem.ValueItem(Value.of(obj as Boolean))
             "number" -> MetaItem.ValueItem(Value.of(obj as Number))
             "string" -> MetaItem.ValueItem(Value.of(obj as String))
@@ -1,12 +1,10 @@
 plugins {
-    `npm-multiplatform`
+    id("scientifik.mpp")
 }

 val htmlVersion by rootProject.extra("0.6.12")

 kotlin {
-    jvm()
-    js()
     sourceSets {
         val commonMain by getting {
             dependencies {
@@ -27,8 +27,8 @@ class HtmlOutput<T : Any>(override val context: Context, private val consumer: T
         } else {
             val value = cache[obj::class]
             if (value == null) {
-                val answer = context.top<HtmlBuilder<*>>().values
-                    .filter { it.type.isInstance(obj) }.firstOrNull()
+                val answer =
+                    context.top<HtmlBuilder<*>>(HTML_CONVERTER_TYPE).values.firstOrNull { it.type.isInstance(obj) }
                 if (answer != null) {
                     cache[obj::class] = answer
                     answer
@@ -40,6 +40,7 @@ class HtmlOutput<T : Any>(override val context: Context, private val consumer: T
             }
         }
         context.launch(Dispatchers.Output) {
+            @Suppress("UNCHECKED_CAST")
            (builder as HtmlBuilder<T>).run { consumer.render(obj) }
        }
    }
@@ -1,10 +1,8 @@
 plugins {
-    `npm-multiplatform`
+    id("scientifik.mpp")
 }

 kotlin {
-    jvm()
-    js()
     sourceSets {
         val commonMain by getting{
             dependencies {
@@ -1,6 +1,9 @@
 package hep.dataforge.output

-import hep.dataforge.context.*
+import hep.dataforge.context.AbstractPlugin
+import hep.dataforge.context.Context
+import hep.dataforge.context.PluginFactory
+import hep.dataforge.context.PluginTag
+import hep.dataforge.context.PluginTag.Companion.DATAFORGE_GROUP
 import hep.dataforge.meta.EmptyMeta
 import hep.dataforge.meta.Meta
@@ -13,7 +16,7 @@ import kotlin.reflect.KClass
 /**
  * A manager for outputs
  */
-interface OutputManager : Plugin {
+interface OutputManager {

     /**
      * Get an output specialized for given type, name and stage.
@@ -1,5 +1,5 @@
 plugins {
-    `npm-multiplatform`
+    id("scientifik.mpp")
 }

 kotlin {
@@ -0,0 +1,4 @@
+package hep.dataforge.scripting
+
+internal object Placeholder {
+}
@@ -8,6 +8,7 @@ import hep.dataforge.workspace.WorkspaceBuilder
 import java.io.File
 import kotlin.script.experimental.api.*
 import kotlin.script.experimental.host.toScriptSource
+import kotlin.script.experimental.jvm.defaultJvmScriptingHostConfiguration
 import kotlin.script.experimental.jvm.dependenciesFromCurrentContext
 import kotlin.script.experimental.jvm.jvm
 import kotlin.script.experimental.jvmhost.BasicJvmScriptingHost
@@ -24,6 +25,7 @@ object Builders {
         jvm {
             dependenciesFromCurrentContext(wholeClasspath = true)
         }
+        hostConfiguration(defaultJvmScriptingHostConfiguration)
     }

     val evaluationConfiguration = ScriptEvaluationConfiguration {
@@ -1,5 +1,5 @@
 plugins {
-    `npm-multiplatform`
+    id("scientifik.mpp")
 }

 kotlin {
@@ -10,6 +10,7 @@ kotlin {
             dependencies {
                 api(project(":dataforge-context"))
                 api(project(":dataforge-data"))
+                api(project(":dataforge-output"))
             }
         }
     }
@@ -2,11 +2,11 @@ package hep.dataforge.workspace

 import hep.dataforge.context.Context
 import hep.dataforge.context.Global
+import hep.dataforge.context.content
 import hep.dataforge.data.DataNode
 import hep.dataforge.meta.Meta
 import hep.dataforge.names.Name
 import hep.dataforge.names.toName
-import hep.dataforge.provider.top


 /**
@@ -18,8 +18,9 @@ class SimpleWorkspace(
     override val targets: Map<String, Meta>,
     tasks: Collection<Task<Any>>
 ) : Workspace {

     override val tasks: Map<Name, Task<*>> by lazy {
-        context.top<Task<*>>(Task.TYPE) + tasks.associate { it.name.toName() to it }
+        context.content<Task<*>>(Task.TYPE) + tasks.associate { it.name.toName() to it }
     }

     companion object {
@@ -8,6 +8,7 @@ package hep.dataforge.workspace
 import hep.dataforge.data.DataFilter
 import hep.dataforge.data.DataTree
 import hep.dataforge.data.DataTreeBuilder
+import hep.dataforge.data.dataSequence
 import hep.dataforge.meta.*
 import hep.dataforge.names.EmptyName
 import hep.dataforge.names.Name
@@ -51,7 +52,7 @@ data class TaskModel(
  */
 fun TaskModel.buildInput(workspace: Workspace): DataTree<Any> {
     return DataTreeBuilder(Any::class).apply {
-        dependencies.asSequence().flatMap { it.apply(workspace).data() }.forEach { (name, data) ->
+        dependencies.asSequence().flatMap { it.apply(workspace).dataSequence() }.forEach { (name, data) ->
             //TODO add concise error on replacement
             this[name] = data
         }
@@ -3,6 +3,7 @@ package hep.dataforge.workspace
 import hep.dataforge.context.ContextAware
 import hep.dataforge.data.Data
 import hep.dataforge.data.DataNode
+import hep.dataforge.data.dataSequence
 import hep.dataforge.meta.Meta
 import hep.dataforge.meta.MetaBuilder
 import hep.dataforge.meta.buildMeta
@@ -29,27 +30,16 @@ interface Workspace : ContextAware, Provider {
      */
     val tasks: Map<Name, Task<*>>

-    override fun provideTop(target: String, name: Name): Any? {
+    override fun provideTop(target: String): Map<Name, Any> {
         return when (target) {
-            "target", Meta.TYPE -> targets[name.toString()]
-            Task.TYPE -> tasks[name]
-            Data.TYPE -> data[name]
-            DataNode.TYPE -> data.getNode(name)
-            else -> null
+            "target", Meta.TYPE -> targets.mapKeys { it.key.toName() }
+            Task.TYPE -> tasks
+            Data.TYPE -> data.dataSequence().toMap()
+            //DataNode.TYPE -> data.nodes.toMap()
+            else -> emptyMap()
         }
     }

-    override fun listNames(target: String): Sequence<Name> {
-        return when (target) {
-            "target", Meta.TYPE -> targets.keys.asSequence().map { it.toName() }
-            Task.TYPE -> tasks.keys.asSequence().map { it }
-            Data.TYPE -> data.data().map { it.first }
-            DataNode.TYPE -> data.nodes().map { it.first }
-            else -> emptySequence()
-        }
-    }


     /**
      * Invoke a task in the workspace utilizing caching if possible
      */
@@ -64,15 +54,6 @@ interface Workspace : ContextAware, Provider {
         }
     }

-    // /**
-    //  * Invoke a task in the workspace utilizing caching if possible
-    //  */
-    // operator fun <R : Any> Task<R>.invoke(targetName: String): DataNode<R> {
-    //     val target = targets[targetName] ?: error("A target with name $targetName not found in ${this@Workspace}")
-    //     context.logger.info { "Running ${this.name} on $target" }
-    //     return invoke(target)
-    // }

     companion object {
         const val TYPE = "workspace"
     }
@@ -8,8 +8,6 @@ import hep.dataforge.data.DataTreeBuilder
 import hep.dataforge.meta.*
 import hep.dataforge.names.Name
 import hep.dataforge.names.toName
-import kotlinx.coroutines.CoroutineScope
-import kotlinx.coroutines.GlobalScope

 @TaskBuildScope
 interface WorkspaceBuilder {
@@ -26,7 +24,7 @@ interface WorkspaceBuilder {
 /**
  * Set the context for future workspcace
  */
-fun WorkspaceBuilder.context(name: String, block: ContextBuilder.() -> Unit = {}) {
+fun WorkspaceBuilder.context(name: String = "WORKSPACE", block: ContextBuilder.() -> Unit = {}) {
     context = ContextBuilder(name, parentContext).apply(block).build()
 }

@@ -36,14 +34,14 @@ fun WorkspaceBuilder.data(name: Name, data: Data<Any>) {

 fun WorkspaceBuilder.data(name: String, data: Data<Any>) = data(name.toName(), data)

-fun WorkspaceBuilder.static(name: Name, data: Any, scope: CoroutineScope = GlobalScope, meta: Meta = EmptyMeta) =
-    data(name, Data.static(scope, data, meta))
+fun WorkspaceBuilder.static(name: Name, data: Any, meta: Meta = EmptyMeta) =
+    data(name, Data.static(data, meta))

-fun WorkspaceBuilder.static(name: Name, data: Any, scope: CoroutineScope = GlobalScope, block: MetaBuilder.() -> Unit = {}) =
-    data(name, Data.static(scope, data, buildMeta(block)))
+fun WorkspaceBuilder.static(name: Name, data: Any, block: MetaBuilder.() -> Unit = {}) =
+    data(name, Data.static(data, buildMeta(block)))

-fun WorkspaceBuilder.static(name: String, data: Any, scope: CoroutineScope = GlobalScope, block: MetaBuilder.() -> Unit = {}) =
-    data(name, Data.static(scope, data, buildMeta(block)))
+fun WorkspaceBuilder.static(name: String, data: Any, block: MetaBuilder.() -> Unit = {}) =
+    data(name, Data.static(data, buildMeta(block)))

 fun WorkspaceBuilder.data(name: Name, node: DataNode<Any>) {
     this.data[name] = node
@@ -0,0 +1,19 @@
+package hep.dataforge.workspace
+
+import hep.dataforge.context.AbstractPlugin
+import hep.dataforge.names.Name
+import hep.dataforge.names.toName
+
+/**
+ * An abstract plugin with some additional boilerplate to effectively work with workspace context
+ */
+abstract class WorkspacePlugin : AbstractPlugin() {
+    abstract val tasks: Collection<Task<*>>
+
+    override fun provideTop(target: String): Map<Name, Any> {
+        return when(target){
+            Task.TYPE -> tasks.associateBy { it.name.toName() }
+            else -> emptyMap()
+        }
+    }
+}
@@ -0,0 +1,14 @@
+package hep.dataforge.workspace
+
+import hep.dataforge.data.Data
+import hep.dataforge.io.Envelope
+import hep.dataforge.io.IOFormat
+import hep.dataforge.io.readWith
+import kotlin.reflect.KClass
+
+/**
+ * Convert an [Envelope] to a data via given format. The actual parsing is done lazily.
+ */
+fun <T : Any> Envelope.toData(type: KClass<out T>, format: IOFormat<T>): Data<T> = Data(type, meta) {
+    data?.readWith(format) ?: error("Can't convert envelope without data to Data")
+}
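To make the laziness of `Envelope.toData` concrete, here is a hedged sketch; `myEnvelope` and `MyFormat` are hypothetical placeholders, only `toData` itself comes from this diff:

```kotlin
// Sketch (myEnvelope and MyFormat are hypothetical): the returned Data carries the
// envelope meta immediately, but the binary payload is only parsed when the Data
// is actually evaluated, which is what "the actual parsing is done lazily" means.
val data: Data<DoubleArray> = myEnvelope.toData(DoubleArray::class, MyFormat)
```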
@@ -27,7 +27,7 @@ class TaskBuilder(val name: String) {
         val localData = if (from.isEmpty()) {
             node
         } else {
-            node.getNode(from.toName()) ?: return null
+            node[from.toName()].node ?: return null
         }
         return transform(workspace.context, model, localData)
     }
@@ -207,7 +207,7 @@ class TaskBuilder(val name: String) {
     }
 }

-fun task(name: String, builder: TaskBuilder.() -> Unit): GenericTask<Any> {
+fun Workspace.Companion.task(name: String, builder: TaskBuilder.() -> Unit): GenericTask<Any> {
     return TaskBuilder(name).apply(builder).build()
 }
@@ -0,0 +1,92 @@
+package hep.dataforge.workspace
+
+import hep.dataforge.data.Data
+import hep.dataforge.descriptors.NodeDescriptor
+import hep.dataforge.io.*
+import hep.dataforge.meta.EmptyMeta
+import hep.dataforge.meta.Meta
+import kotlinx.coroutines.Dispatchers
+import kotlinx.coroutines.coroutineScope
+import kotlinx.coroutines.withContext
+import kotlinx.io.nio.asInput
+import kotlinx.io.nio.asOutput
+import java.nio.file.Files
+import java.nio.file.Path
+import java.nio.file.StandardOpenOption
+import kotlin.reflect.KClass
+
+/**
+ * Read meta from file in a given [format]
+ */
+suspend fun Path.readMeta(format: MetaFormat, descriptor: NodeDescriptor? = null): Meta {
+    return withContext(Dispatchers.IO) {
+        format.run {
+            Files.newByteChannel(this@readMeta, StandardOpenOption.READ)
+                .asInput()
+                .readMeta(descriptor)
+        }
+    }
+}
+
+/**
+ * Write meta to file in a given [format]
+ */
+suspend fun Meta.write(path: Path, format: MetaFormat, descriptor: NodeDescriptor? = null) {
+    withContext(Dispatchers.IO) {
+        format.run {
+            Files.newByteChannel(path, StandardOpenOption.WRITE, StandardOpenOption.CREATE_NEW)
+                .asOutput()
+                .writeMeta(this@write, descriptor)
+        }
+    }
+}
+
+/**
+ * Read data with supported envelope format and binary format. If envelope format is null, then read binary directly from file.
+ * @param type explicit type of data read
+ * @param format binary format
+ * @param envelopeFormat the format of envelope. If null, file is read directly
+ * @param metaFile the relative file for optional meta override
+ * @param metaFileFormat the meta format for override
+ */
+suspend fun <T : Any> Path.readData(
+    type: KClass<out T>,
+    format: IOFormat<T>,
+    envelopeFormat: EnvelopeFormat? = null,
+    metaFile: Path = resolveSibling("$fileName.meta"),
+    metaFileFormat: MetaFormat = JsonMetaFormat
+): Data<T> {
+    return coroutineScope {
+        val externalMeta = if (Files.exists(metaFile)) {
+            metaFile.readMeta(metaFileFormat)
+        } else {
+            null
+        }
+        if (envelopeFormat == null) {
+            Data(type, externalMeta ?: EmptyMeta) {
+                withContext(Dispatchers.IO) {
+                    format.run {
+                        Files.newByteChannel(this@readData, StandardOpenOption.READ)
+                            .asInput()
+                            .readThis()
+                    }
+                }
+            }
+        } else {
+            withContext(Dispatchers.IO) {
+                readEnvelope(envelopeFormat).let {
+                    if (externalMeta == null) {
+                        it
+                    } else {
+                        it.withMetaLayers(externalMeta)
+                    }
+                }.toData(type, format)
+            }
+        }
+    }
+}
+
+//suspend fun <T : Any> Path.writeData(
+//    data: Data<T>,
+//    format: IOFormat<T>,
+//    )
@@ -1,14 +1,33 @@
package hep.dataforge.workspace

import hep.dataforge.context.PluginTag
import hep.dataforge.data.first
import hep.dataforge.data.get
import hep.dataforge.meta.boolean
import hep.dataforge.meta.get
import org.junit.Test
import kotlin.test.assertEquals
import kotlin.test.assertTrue


class SimpleWorkspaceTest {
    val testPlugin = object : WorkspacePlugin() {
        override val tag: PluginTag = PluginTag("test")

        val contextTask = Workspace.task("test") {
            pipe<Any, Unit> {
                context.logger.info { "Test: $it" }
            }
        }
        override val tasks: Collection<Task<*>> = listOf(contextTask)
    }

    val workspace = SimpleWorkspace.build {

        context {
            plugin(testPlugin)
        }

        repeat(100) {
            static("myData[$it]", it)
        }
@@ -19,6 +38,9 @@ class SimpleWorkspaceTest {
            allData()
        }
        pipe<Int, Int> { data ->
            if (meta["testFlag"].boolean == true) {
                println("flag")
            }
            context.logger.info { "Starting square on $data" }
            data * data
        }
@@ -39,13 +61,13 @@ class SimpleWorkspaceTest {
            allData()
        }
        joinByGroup<Int, Double> { context ->
-           group("even", filter = { name, data -> name.toString().toInt() % 2 == 0 }) {
+           group("even", filter = { name, _ -> name.toString().toInt() % 2 == 0 }) {
                result { data ->
                    context.logger.info { "Starting even" }
                    data.values.average()
                }
            }
-           group("odd", filter = { name, data -> name.toString().toInt() % 2 == 1 }) {
+           group("odd", filter = { name, _ -> name.toString().toInt() % 2 == 1 }) {
                result { data ->
                    context.logger.info { "Starting odd" }
                    data.values.average()
@@ -62,12 +84,27 @@ class SimpleWorkspaceTest {
            data["even"]!! - data["odd"]!!
            }
        }

        target("empty") {}
    }

    @Test
    fun testWorkspace() {
        val node = workspace.run("sum")
        val res = node.first()
-       assertEquals(328350, res.get())
+       assertEquals(328350, res?.get())
    }

    @Test
    fun testMetaPropagation() {
        val node = workspace.run("sum") { "testFlag" to true }
        val res = node.first()?.get()
    }

    @Test
    fun testPluginTask() {
        val tasks = workspace.tasks
        assertTrue { tasks["test.test"] != null }
        //val node = workspace.run("test.test", "empty")
    }
}
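The expected value `328350` in `testWorkspace` is not arbitrary: the workspace holds 100 static items with values 0..99, and the "sum" pipeline squares and sums them. A quick standalone check of the arithmetic, independent of the workspace machinery:

```kotlin
fun main() {
    // Sum of squares of 0..99, matching the 100 static data points in the test.
    val bruteForce = (0..99).map { it * it }.sum()
    // Closed form: n(n+1)(2n+1)/6 with n = 99.
    val closedForm = 99 * 100 * 199 / 6
    println(bruteForce) // 328350
    println(closedForm) // 328350
}
```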
@@ -1,16 +1,18 @@
pluginManagement {
    repositories {
        mavenLocal()
        jcenter()
        gradlePluginPortal()
        maven("https://dl.bintray.com/kotlin/kotlin-eap")
        maven("https://dl.bintray.com/kotlin/kotlinx")
        maven("https://dl.bintray.com/mipt-npm/scientifik")
        maven("https://dl.bintray.com/mipt-npm/dev")
    }
    resolutionStrategy {
        eachPlugin {
            when (requested.id.id) {
                "kotlinx-atomicfu" -> useModule("org.jetbrains.kotlinx:atomicfu-gradle-plugin:${requested.version}")
                "kotlin-multiplatform" -> useModule("org.jetbrains.kotlin:kotlin-gradle-plugin:${requested.version}")
                "kotlin2js" -> useModule("org.jetbrains.kotlin:kotlin-gradle-plugin:${requested.version}")
                "org.jetbrains.kotlin.frontend" -> useModule("org.jetbrains.kotlin:kotlin-frontend-plugin:0.0.45")
                "scientifik.mpp", "scientifik.publish" -> useModule("scientifik:gradle-tools:${requested.version}")
            }
        }
    }
@@ -25,7 +27,7 @@ include(
    ":dataforge-context",
    ":dataforge-data",
    ":dataforge-output",
-   ":dataforge-output:dataforge-output-html",
+   ":dataforge-output-html",
    ":dataforge-workspace",
    ":dataforge-scripting"
)
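The `resolutionStrategy` block above maps bare plugin ids to the Maven modules that implement them, so a subproject's build script can apply a plugin by id alone. A hypothetical subproject `build.gradle.kts` fragment (the version string here is illustrative, not taken from this commit):

```kotlin
// build.gradle.kts of a subproject: the id resolves via the
// eachPlugin mapping in settings.gradle to scientifik:gradle-tools.
plugins {
    id("scientifik.publish") version "0.1.3"
}
```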