Merge remote-tracking branch 'spc/master'
Commit a39634fd51
CHANGELOG.md (28 lines changed)
@@ -14,6 +14,34 @@

 ### Security

+## 0.10.0 - 2025-01-19
+
+### Added
+
+- Coroutine exception logging in context
+- `ObservableMutableMetaSerializer`
+- `MutableMetaView` - a Meta wrapper that creates nodes only when it or its children are changed.
+
+### Changed
+
+- Simplify inheritance logic in `MutableTypedMeta`
+- Full rework of `DataTree` and associated interfaces (`DataSource`, `DataSink`, etc.).
+- Filter data by type is moved from `dataforge-data` to `dataforge-workspace` to avoid a reflection dependency.
+
+### Deprecated
+
+- MetaProvider `spec` is replaced by `readable`. `listOfSpec` is replaced with `listOfReadable`.
+
+### Removed
+
+- Implicit IO format resolver in `IOPlugin` and `FileWorkspaceCache`. There is no guarantee that only one format is present in the context for each type.
+- Dependencies on `atomicfu` and `kotlin.reflect` from `dataforge-data` to improve performance.
+
+### Fixed
+
+- `NameToken` parsing.
+- Top-level string list meta conversion.
+
 ## 0.9.0 - 2024-06-04

 ### Added
README.md (89 lines changed)
@@ -1,7 +1,70 @@

 [![JetBrains Research](https://jb.gg/badges/research.svg)](https://confluence.jetbrains.com/display/ALL/JetBrains+on+GitHub)
 [![DOI](https://zenodo.org/badge/148831678.svg)](https://zenodo.org/badge/latestdoi/148831678)

+## Publications
+
+* [A general overview](https://doi.org/10.1051/epjconf/201817705003)
+* [An application in the "Troitsk nu-mass" experiment](https://doi.org/10.1088/1742-6596/1525/1/012024)
+
+## Video
+
+* [A presentation on the application of DataForge (legacy version) to the Troitsk nu-mass analysis](https://youtu.be/OpWzLXUZnLI?si=3qn7EMruOHMJX3Bc)
+
+## Questions and Answers
+
+In this section, we try to cover the main ideas of DataForge in the form of questions and answers.
+
+### General
+
+**Q**: I have a lot of data to analyze. The analysis process is complicated, requires a lot of stages, and the data flow is not always obvious. The data size is also huge, so I don't want to perform operations I don't need (calculate something I won't use, or calculate something twice). I need the analysis to run in parallel, and probably on a remote computer. And I am sick and tired of scripts that modify other scripts that control scripts. Could you help me?
+
+**A**: Yes, that is precisely the problem DataForge was made to solve. It performs automated data manipulations with optimization and parallelization. Importantly, data processing recipes are written declaratively, so it is quite easy to run computations on a remote station. DataForge also guarantees reproducibility of analysis results.
+
+**Q**: How does it work?
+
+**A**: At the core of DataForge lies the idea of a metadata processor. It builds on the fact that to analyze something you need the data itself plus additional information about what that data represents and what the user wants as a result. This additional information is called metadata and can be organized in a regular structure (a tree of values similar to XML or JSON). The important point is that this distinction leaves no place for user instructions (or scripts): the idea behind DataForge is that one does not need imperative commands. The framework configures itself according to the input metadata and decides which operations should be performed in the most efficient way.
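For illustration, a minimal metadata tree built with the `dataforge-meta` builder DSL might look like the sketch below. The task and key names are invented for this example, not a real DataForge task specification:

```kotlin
import space.kscience.dataforge.meta.Meta

// A declarative "recipe": a description of what the data is and what the
// user wants, with no imperative instructions. All keys are illustrative.
val taskMeta = Meta {
    "task" put "fit"
    "fit" put {
        "model" put "gauss"
        "range.from" put 0.0
        "range.to" put 10.0
    }
}
```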
+
+**Q**: But where does it take the algorithms from?
+
+**A**: Of course, the algorithms must be written somewhere. No magic here. The logic is written in specialized modules. Some modules are provided out of the box with the system core; some need to be developed for a specific problem.
+
+**Q**: So I still need to write the code? What is the difference then?
+
+**A**: Yes, someone still needs to write the code. But not necessarily you. Simple operations can be performed using the provided core logic. Your group can also have one programmer writing the logic and everyone else using it without any real programming expertise. The framework is organized in such a way that someone writing additional logic does not need to think about complicated things like parallel computing, resource handling, logging, or caching. Most of that is done by DataForge.
+
+### Platform
+
+**Q**: Which platform does DataForge use? Which operating systems does it work on?
+
+**A**: DataForge is mostly written in multiplatform Kotlin and can be used on JVM, JS, and native targets. Some modules and functions are supported only on the JVM.
+
+**Q**: Can I use my C++/Fortran/Python code in DataForge?
+
+**A**: Yes, as long as the code can be called from Java. Most common languages have a bridge for Java access. There are no problems at all with compiled C/Fortran libraries. Python code can be called via one of the existing Python-Java interfaces. It is also planned to implement remote method invocation for common languages, so your Python, or, say, Julia, code could run in its native environment. The metadata processor paradigm makes it much easier to do so.
+
+### Features
+
+**Q**: What other features does DataForge provide?
+
+**A**: Alongside metadata processing (and a lot of tools for metadata manipulation and layering), DataForge has two additional important concepts:
+
+* **Modularisation**. Contrary to many other frameworks, DataForge is intrinsically modular. The mandatory part is a rather tiny core module. Everything else can be customized.
+
+* **Context encapsulation**. Every DataForge task is executed in some context. The context isolates the environment for the task, works as a dependency-injection base, and specifies how the task interacts with the external world (see the sketch after this list).
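A minimal sketch of such an isolated context, using the `Context` factory from `dataforge-context`. The property name is invented, and the `properties` builder is assumed here rather than confirmed by this commit:

```kotlin
import space.kscience.dataforge.context.Context

// Each context isolates its environment and acts as a dependency-injection
// root. Contexts form a hierarchy with Global as the default parent.
val analysis = Context("analysis") {
    properties {
        "workDirectory" put "./work" // illustrative property
    }
}
```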
+
+### Misc
+
+**Q**: Everything looks great, so can I replace my ROOT / other data analysis framework with DataForge?
+
+**A**: Note that DataForge is made for analysis, not for visualization. The visualization and user-interaction capabilities of DataForge are rather limited compared to frameworks like ROOT, JAS3, or DataMelt. The idea is to provide a reliable API and core functionality. The [VisionForge](https://git.sciprog.center/kscience/visionforge) project aims to provide tools for both 2D and 3D visualization, both locally and remotely.
+
+**Q**: How does DataForge compare to cluster computation frameworks like Apache Spark?
+
+**A**: Replacing cluster computing software is not the purpose of DataForge. It has some internal parallelism mechanics and implementations, but they are most certainly worse than specially developed programs. Still, DataForge is not fixed on one single implementation: your favourite parallel processing tool can be used as a back-end, with the full benefit of DataForge's configuration tools and integrations and no performance overhead.
+
+**Q**: Is it possible to use DataForge in notebook mode?
+
+**A**: [Kotlin Jupyter](https://github.com/Kotlin/kotlin-jupyter) allows using any JVM program in notebook mode. A dedicated module for DataForge is a work in progress.
+
 ### [dataforge-context](dataforge-context)

@@ -14,16 +77,31 @@

 > **Maturity**: EXPERIMENTAL

 ### [dataforge-io](dataforge-io)
-> IO module
+> Serialization foundation for Meta objects and Envelope processing.
 >
 > **Maturity**: EXPERIMENTAL
 >
+> **Features:**
+> - [IO format](dataforge-io/src/commonMain/kotlin/space/kscience/dataforge/io/IOFormat.kt) : A generic API for reading something from a binary representation and writing it to a Binary.
+> - [Binary](dataforge-io/src/commonMain/kotlin/space/kscience/dataforge/io/Binary.kt) : Multi-read random-access binary.
+> - [Envelope](dataforge-io/src/commonMain/kotlin/space/kscience/dataforge/io/Envelope.kt) : API and implementations for a combined data and metadata format.
+> - [Tagged envelope](dataforge-io/src/commonMain/kotlin/space/kscience/dataforge/io/TaggedEnvelope.kt) : Implementation of a binary-friendly envelope format with a machine-readable tag and forward size declaration.
+> - [Tagless envelope](dataforge-io/src/commonMain/kotlin/space/kscience/dataforge/io/TaglessEnvelope.kt) : Implementation of a text-friendly envelope format with text separators for sections.

 ### [dataforge-meta](dataforge-meta)
-> Meta definition and basic operations on meta
+> Core Meta and Name manipulation module
 >
 > **Maturity**: DEVELOPMENT
 >
+> **Features:**
+> - [Meta](dataforge-meta/src/commonMain/kotlin/space/kscience/dataforge/meta/Meta.kt) : **Meta** is the representation of the basic DataForge concept: metadata. It could also be called a meta-value tree.
+> - [Value](dataforge-meta/src/commonMain/kotlin/space/kscience/dataforge/meta/Value.kt) : **Value** is a sum type for different meta values.
+> - [Name](dataforge-meta/src/commonMain/kotlin/space/kscience/dataforge/names/Name.kt) : **Name** is an identifier used to access tree-like structures.
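A short sketch of Name manipulation; all helpers used below (`Name.parse`, `first`, `cutFirst`, `asName`, `plus`) appear in this repository's sources:

```kotlin
import space.kscience.dataforge.names.Name
import space.kscience.dataforge.names.asName
import space.kscience.dataforge.names.plus

// Names address nodes in tree-like structures; tokens are separated by dots
// and may carry an index in square brackets.
val name = Name.parse("detector.channel[2].voltage")

val token = name.first()              // first token: detector
val tail = name.cutFirst()            // channel[2].voltage
val prefixed = "raw".asName() + name  // raw.detector.channel[2].voltage
```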

 ### [dataforge-scripting](dataforge-scripting)
 > Scripting definition for workspace generation
 >
 > **Maturity**: PROTOTYPE

@@ -31,6 +109,11 @@

 >
 > **Maturity**: EXPERIMENTAL

+### [dataforge-io/dataforge-io-proto](dataforge-io/dataforge-io-proto)
+> ProtoBuf Meta representation
+>
+> **Maturity**: PROTOTYPE
+
 ### [dataforge-io/dataforge-io-yaml](dataforge-io/dataforge-io-yaml)
 > YAML meta converters and Front Matter envelope format
 >
@@ -9,7 +9,7 @@ plugins {

 allprojects {
     group = "space.kscience"
-    version = "0.9.0"
+    version = "0.10.0"
 }

 subprojects {

@@ -22,6 +22,12 @@ subprojects {
     }
 }

+dependencies {
+    subprojects.forEach {
+        dokka(it)
+    }
+}
+
 readme {
     readmeTemplate = file("docs/templates/README-TEMPLATE.md")
 }

@@ -32,7 +38,7 @@ ksciencePublish {
         useSPCTeam()
     }
     repository("spc", "https://maven.sciprog.center/kscience")
-    sonatype("https://oss.sonatype.org")
+    central()
 }

 apiValidation {
@@ -6,7 +6,7 @@ Context and provider definitions

 ## Artifact:

-The Maven coordinates of this project are `space.kscience:dataforge-context:0.9.0-dev-1`.
+The Maven coordinates of this project are `space.kscience:dataforge-context:0.10.0`.

 **Gradle Kotlin DSL:**
 ```kotlin

@@ -16,6 +16,6 @@ repositories {
 }

 dependencies {
-    implementation("space.kscience:dataforge-context:0.9.0-dev-1")
+    implementation("space.kscience:dataforge-context:0.10.0")
 }
 ```
@@ -282,6 +282,7 @@ public final class space/kscience/dataforge/provider/Path : java/lang/Iterable,

 public final class space/kscience/dataforge/provider/Path$Companion {
     public final fun parse-X5wN5Vs (Ljava/lang/String;)Ljava/util/List;
+    public final fun serializer ()Lkotlinx/serialization/KSerializer;
 }

 public final class space/kscience/dataforge/provider/PathKt {
@@ -13,11 +13,10 @@ kscience {
     useSerialization()
     commonMain {
         api(projects.dataforgeMeta)
-        api(spclibs.atomicfu)
     }
     jvmMain {
-        api(kotlin("reflect"))
-        api("org.slf4j:slf4j-api:1.7.30")
+        api(spclibs.kotlin.reflect)
+        api(spclibs.slf4j)
     }
 }
@@ -1,5 +1,6 @@
 package space.kscience.dataforge.context

+import kotlinx.coroutines.CoroutineExceptionHandler
 import kotlinx.coroutines.CoroutineScope
 import kotlinx.coroutines.Job
 import kotlinx.coroutines.SupervisorJob

@@ -67,7 +68,9 @@ public open class Context internal constructor(

     override val coroutineContext: CoroutineContext by lazy {
         (parent ?: Global).coroutineContext.let { parenContext ->
-            parenContext + coroutineContext + SupervisorJob(parenContext[Job])
+            parenContext + coroutineContext + SupervisorJob(parenContext[Job]) + CoroutineExceptionHandler { _, throwable ->
+                logger.error(throwable) { "Exception in context $name" }
+            }
         }
     }
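A sketch of what this change means in practice, assuming some `Context` instance named `context` (`Context` implements `CoroutineScope`, and the `SupervisorJob` keeps siblings alive):

```kotlin
import kotlinx.coroutines.launch

// Hypothetical usage: an uncaught exception in a child coroutine is now
// routed to the context logger instead of disappearing with the job.
context.launch {
    error("boom") // logged as "Exception in context <context name>"
}
```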

@@ -15,15 +15,37 @@
  */
 package space.kscience.dataforge.provider

+import kotlinx.serialization.KSerializer
+import kotlinx.serialization.Serializable
+import kotlinx.serialization.builtins.serializer
+import kotlinx.serialization.descriptors.SerialDescriptor
+import kotlinx.serialization.encoding.Decoder
+import kotlinx.serialization.encoding.Encoder
 import space.kscience.dataforge.names.Name
 import space.kscience.dataforge.names.parseAsName
 import kotlin.jvm.JvmInline

+private object PathSerializer : KSerializer<Path> {
+
+    override val descriptor: SerialDescriptor
+        get() = String.serializer().descriptor
+
+    override fun serialize(encoder: Encoder, value: Path) {
+        encoder.encodeString(value.toString())
+    }
+
+    override fun deserialize(decoder: Decoder): Path {
+        return Path.parse(decoder.decodeString())
+    }
+}
+
 /**
  * Path interface.
  *
  */
 @JvmInline
+@Serializable(PathSerializer::class)
 public value class Path(public val tokens: List<PathToken>) : Iterable<PathToken> {

     override fun iterator(): Iterator<PathToken> = tokens.iterator()

@@ -33,6 +55,7 @@ public value class Path(public val tokens: List<PathToken>) : Iterable<PathToken

     public companion object {
         public const val PATH_SEGMENT_SEPARATOR: String = "/"
+
         public fun parse(path: String): Path = Path(path.split(PATH_SEGMENT_SEPARATOR).map { PathToken.parse(it) })
     }
 }
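A sketch of the round trip the new `PathSerializer` enables, assuming `Path.toString()` renders tokens joined with the `/` separator:

```kotlin
import kotlinx.serialization.encodeToString
import kotlinx.serialization.json.Json

// Path is now serialized through its string form.
val path = Path.parse("a/b")
val json = Json.encodeToString(path)      // a plain JSON string
val restored: Path = Json.decodeFromString(json)
```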

@@ -17,7 +17,7 @@ package space.kscience.dataforge.context

 import java.util.*
 import kotlin.reflect.KClass
-import kotlin.reflect.full.cast
+import kotlin.reflect.cast

 public class ClassLoaderPlugin(private val classLoader: ClassLoader) : AbstractPlugin() {
     override val tag: PluginTag = PluginTag("classLoader", PluginTag.DATAFORGE_GROUP)
@@ -8,25 +8,30 @@ import space.kscience.dataforge.misc.DfType
 import space.kscience.dataforge.misc.Named
 import space.kscience.dataforge.names.Name
 import kotlin.reflect.KClass
+import kotlin.reflect.KType
 import kotlin.reflect.full.findAnnotation
+import kotlin.reflect.typeOf

 @DFExperimental
 public val KClass<*>.dfType: String
     get() = findAnnotation<DfType>()?.id ?: simpleName ?: ""

+@DFExperimental
+public val KType.dfType: String
+    get() = findAnnotation<DfType>()?.id ?: (classifier as? KClass<*>)?.simpleName ?: ""
+
 /**
  * Provide an object with given name inferring target from its type using [DfType] annotation
  */
 @DFExperimental
 public inline fun <reified T : Any> Provider.provideByType(name: String): T? {
-    val target = T::class.dfType
+    val target = typeOf<T>().dfType
     return provide(target, name)
 }

 @DFExperimental
 public inline fun <reified T : Any> Provider.top(): Map<Name, T> {
-    val target = T::class.dfType
+    val target = typeOf<T>().dfType
     return top(target)
 }

@@ -35,15 +40,15 @@ public inline fun <reified T : Any> Provider.top(): Map<Name, T> {
  */
 @DFExperimental
 public inline fun <reified T : Any> Context.gather(inherit: Boolean = true): Map<Name, T> =
-    gather<T>(T::class.dfType, inherit)
+    gather<T>(typeOf<T>().dfType, inherit)

 @DFExperimental
 public inline fun <reified T : Any> PluginBuilder.provides(items: Map<Name, T>) {
-    provides(T::class.dfType, items)
+    provides(typeOf<T>().dfType, items)
 }

 @DFExperimental
 public inline fun <reified T : Any> PluginBuilder.provides(vararg items: Named) {
-    provides(T::class.dfType, *items)
+    provides(typeOf<T>().dfType, *items)
 }
@@ -6,7 +6,7 @@

 ## Artifact:

-The Maven coordinates of this project are `space.kscience:dataforge-data:0.9.0-dev-1`.
+The Maven coordinates of this project are `space.kscience:dataforge-data:0.10.0`.

 **Gradle Kotlin DSL:**
 ```kotlin

@@ -16,6 +16,6 @@ repositories {
 }

 dependencies {
-    implementation("space.kscience:dataforge-data:0.9.0-dev-1")
+    implementation("space.kscience:dataforge-data:0.10.0")
 }
 ```
@@ -9,10 +9,7 @@ kscience {
     wasm()
     useCoroutines()
     dependencies {
-        api(spclibs.atomicfu)
         api(projects.dataforgeMeta)
-        //Remove after subtype moved to stdlib
-        api(kotlin("reflect"))
     }
 }
@@ -1,12 +1,7 @@
 package space.kscience.dataforge.actions

 import kotlinx.coroutines.CoroutineScope
-import kotlinx.coroutines.flow.collect
-import kotlinx.coroutines.flow.onEach
-import space.kscience.dataforge.data.DataSink
-import space.kscience.dataforge.data.DataTree
-import space.kscience.dataforge.data.DataUpdate
-import space.kscience.dataforge.data.launchUpdate
+import space.kscience.dataforge.data.*
 import space.kscience.dataforge.meta.Meta
 import space.kscience.dataforge.misc.UnsafeKType
 import space.kscience.dataforge.names.Name

@@ -31,25 +26,25 @@ public abstract class AbstractAction<T, R>(
     /**
      * Generate initial content of the output
      */
-    protected abstract fun DataSink<R>.generate(
+    protected abstract fun DataBuilderScope<R>.generate(
         source: DataTree<T>,
         meta: Meta,
-    )
+    ): Map<Name, Data<R>>

     /**
      * Update part of the data set using provided data
      *
      * @param source the source data tree in case we need several data items to update
-     * @param meta the metadata used for the whole data tree
+     * @param actionMeta the metadata used for the whole data tree
      * @param updatedData an updated item
      */
     protected open suspend fun DataSink<R>.update(
         source: DataTree<T>,
-        meta: Meta,
-        updatedData: DataUpdate<T>,
+        actionMeta: Meta,
+        updateName: Name,
     ) {
         //by default regenerate the whole data set
-        generate(source, meta)
+        writeAll(generate(source, actionMeta))
     }

 @OptIn(UnsafeKType::class)
@@ -57,13 +52,21 @@ public abstract class AbstractAction<T, R>(
         source: DataTree<T>,
         meta: Meta,
         updatesScope: CoroutineScope
-    ): DataTree<R> = DataTree(outputType) {
-        generate(source, meta)
+    ): DataTree<R> = DataTree(
+        dataType = outputType,
+        scope = updatesScope,
+        initialData = DataBuilderScope<R>().generate(source, meta)
+    ) {

-        //propagate updates
-        launchUpdate(updatesScope) {
-            source.updates.onEach { update ->
-                update(source, meta, update)
-            }.collect()
+        val updateSink = DataSink<R> { name, data ->
+            write(name, data)
+        }
+
+        with(updateSink) {
+            source.updates.collect {
+                update(source, meta, it)
+            }
         }
     }
@@ -37,6 +37,7 @@ public class MapActionBuilder<T, R>(
     /**
      * Set unsafe [outputType] for the resulting data. Be sure that it is correct.
      */
+    @UnsafeKType
     public fun <R1 : R> result(outputType: KType, f: suspend ActionEnv.(T) -> R1) {
         this.outputType = outputType
         result = f;

@@ -45,6 +46,7 @@ public class MapActionBuilder<T, R>(
     /**
      * Calculate the result of goal
      */
+    @OptIn(UnsafeKType::class)
     public inline fun <reified R1 : R> result(noinline f: suspend ActionEnv.(T) -> R1): Unit = result(typeOf<R1>(), f)
 }

@@ -54,22 +56,21 @@ public class MapAction<T, R>(
     private val block: MapActionBuilder<T, R>.() -> Unit,
 ) : AbstractAction<T, R>(outputType) {

-    private fun DataSink<R>.mapOne(name: Name, data: Data<T>?, meta: Meta) {
+    private fun mapOne(name: Name, data: Data<T>?, meta: Meta): Pair<Name, Data<R>?> {
         //fast return for null data
         if (data == null) {
-            put(name, null)
-            return
+            return name to null
         }
         // Creating a new environment for action using **old** name, old meta and task meta
         val env = ActionEnv(name, data.meta, meta)

         //applying transformation from builder
         val builder = MapActionBuilder<T, R>(
-            name,
-            data.meta.toMutableMeta(), // using data meta
-            meta,
-            data.type,
-            outputType
+            name = name,
+            meta = data.meta.toMutableMeta(), // using data meta
+            actionMeta = meta,
+            dataType = data.type,
+            outputType = outputType
         ).apply(block)

         //getting new name

@@ -82,21 +83,26 @@ public class MapAction<T, R>(
             builder.result(env, data.await())
         }
         //setting the data node
-        put(newName, newData)
+        return newName to newData
     }

-    override fun DataSink<R>.generate(source: DataTree<T>, meta: Meta) {
-        source.forEach { mapOne(it.name, it.data, meta) }
+    override fun DataBuilderScope<R>.generate(source: DataTree<T>, meta: Meta): Map<Name, Data<R>> = buildMap {
+        source.forEach { data ->
+            val (name, data) = mapOne(data.name, data, meta)
+            if (data != null) {
+                check(name !in keys) { "Data with key $name already exist in the result" }
+                put(name, data)
+            }
+        }
     }

     override suspend fun DataSink<R>.update(
         source: DataTree<T>,
-        meta: Meta,
-        updatedData: DataUpdate<T>,
+        actionMeta: Meta,
+        updateName: Name,
     ) {
-        mapOne(updatedData.name, updatedData.data, meta)
+        val (name, data) = mapOne(updateName, source.read(updateName), actionMeta)
+        write(name, data)
     }
 }
@@ -3,6 +3,8 @@ package space.kscience.dataforge.actions

 import space.kscience.dataforge.data.*
 import space.kscience.dataforge.meta.Meta
 import space.kscience.dataforge.meta.MutableMeta
+import space.kscience.dataforge.meta.get
+import space.kscience.dataforge.meta.string
 import space.kscience.dataforge.misc.DFBuilder
 import space.kscience.dataforge.misc.UnsafeKType
 import space.kscience.dataforge.names.Name

@@ -13,7 +15,7 @@ import kotlin.reflect.typeOf

 public class JoinGroup<T, R>(
     public var name: String,
-    internal val set: DataTree<T>,
+    internal val data: DataTree<T>,
     @PublishedApi internal var outputType: KType,
 ) {

@@ -41,12 +43,17 @@ public class ReduceGroupBuilder<T, R>(
     private val groupRules: MutableList<(DataTree<T>) -> List<JoinGroup<T, R>>> = ArrayList();

     /**
-     * introduce grouping by meta value
+     * Group by a meta value
      */
-    public fun byValue(tag: String, defaultTag: String = "@default", action: JoinGroup<T, R>.() -> Unit) {
+    @OptIn(UnsafeKType::class)
+    public fun byMetaValue(tag: String, defaultTag: String = "@default", action: JoinGroup<T, R>.() -> Unit) {
         groupRules += { node ->
-            GroupRule.byMetaValue(tag, defaultTag).gather(node).map {
-                JoinGroup<T, R>(it.key, it.value, outputType).apply(action)
+            val groups = mutableMapOf<String, MutableMap<Name, Data<T>>>()
+            node.forEach { data ->
+                groups.getOrPut(data.meta[tag]?.string ?: defaultTag) { mutableMapOf() }.put(data.name, data)
+            }
+            groups.map { (key, dataMap) ->
+                JoinGroup<T, R>(key, dataMap.asTree(node.dataType), outputType).apply(action)
             }
         }
     }

@@ -84,11 +91,11 @@ internal class ReduceAction<T, R>(
 ) : AbstractAction<T, R>(outputType) {
     //TODO optimize reduction. Currently, the whole action recalculates on push

-    override fun DataSink<R>.generate(source: DataTree<T>, meta: Meta) {
+    override fun DataBuilderScope<R>.generate(source: DataTree<T>, meta: Meta): Map<Name, Data<R>> = buildMap {
         ReduceGroupBuilder<T, R>(meta, outputType).apply(action).buildGroups(source).forEach { group ->
-            val dataFlow: Map<Name, Data<T>> = group.set.asSequence().fold(HashMap()) { acc, value ->
+            val dataFlow: Map<Name, Data<T>> = group.data.asSequence().fold(HashMap()) { acc, value ->
                 acc.apply {
-                    acc[value.name] = value.data
+                    acc[value.name] = value
                 }
             }
@@ -7,7 +7,6 @@ import space.kscience.dataforge.meta.MutableMeta
 import space.kscience.dataforge.meta.toMutableMeta
 import space.kscience.dataforge.names.Name
-import space.kscience.dataforge.names.parseAsName
 import kotlin.collections.set
 import kotlin.reflect.KType
 import kotlin.reflect.typeOf

@@ -48,7 +47,7 @@ internal class SplitAction<T, R>(
     private val action: SplitBuilder<T, R>.() -> Unit,
 ) : AbstractAction<T, R>(outputType) {

-    private fun DataSink<R>.splitOne(name: Name, data: Data<T>?, meta: Meta) {
+    private fun splitOne(name: Name, data: Data<T>?, meta: Meta): Map<Name, Data<R>?> = buildMap {
         val laminate = Laminate(data?.meta, meta)

         val split = SplitBuilder<T, R>(name, data?.meta ?: Meta.EMPTY).apply(action)

@@ -76,16 +75,26 @@ internal class SplitAction<T, R>(
         }
     }

-    override fun DataSink<R>.generate(source: DataTree<T>, meta: Meta) {
-        source.forEach { splitOne(it.name, it.data, meta) }
+    override fun DataBuilderScope<R>.generate(
+        source: DataTree<T>,
+        meta: Meta
+    ): Map<Name, Data<R>> = buildMap {
+        source.forEach {
+            splitOne(it.name, it, meta).forEach { (name, data) ->
+                check(name !in keys) { "Data with key $name already exist in the result" }
+                if (data != null) {
+                    put(name, data)
+                }
+            }
+        }
     }

     override suspend fun DataSink<R>.update(
         source: DataTree<T>,
-        meta: Meta,
-        updatedData: DataUpdate<T>,
+        actionMeta: Meta,
+        updateName: Name,
     ) {
-        splitOne(updatedData.name, updatedData.data, meta)
+        writeAll(splitOne(updateName, source.read(updateName), actionMeta))
     }
 }
@@ -3,6 +3,7 @@ package space.kscience.dataforge.data

 import kotlinx.coroutines.flow.Flow
 import kotlinx.coroutines.flow.filter
 import space.kscience.dataforge.meta.Meta
+import space.kscience.dataforge.misc.DFInternal
 import space.kscience.dataforge.names.Name
 import space.kscience.dataforge.names.NameToken
 import space.kscience.dataforge.names.plus

@@ -17,24 +18,15 @@ public fun interface DataFilter {
     }
 }

-public fun DataFilter.accepts(update: DataUpdate<*>): Boolean = accepts(update.name, update.data?.meta, update.type)
-
-public fun <T, DU : DataUpdate<T>> Sequence<DU>.filterData(predicate: DataFilter): Sequence<DU> = filter { data ->
-    predicate.accepts(data)
-}
-
-public fun <T, DU : DataUpdate<T>> Flow<DU>.filterData(predicate: DataFilter): Flow<DU> = filter { data ->
-    predicate.accepts(data)
-}
-
 public fun <T> DataSource<T>.filterData(
-    predicate: DataFilter,
+    dataFilter: DataFilter,
 ): DataSource<T> = object : DataSource<T> {
     override val dataType: KType get() = this@filterData.dataType

     override fun read(name: Name): Data<T>? =
-        this@filterData.read(name)?.takeIf { predicate.accepts(name, it.meta, it.type) }
+        this@filterData.read(name)?.takeIf {
+            dataFilter.accepts(name, it.meta, it.type)
+        }
 }

 /**
@@ -43,8 +35,12 @@ public fun <T> DataSource<T>.filterData(
 public fun <T> ObservableDataSource<T>.filterData(
     predicate: DataFilter,
 ): ObservableDataSource<T> = object : ObservableDataSource<T> {
-    override val updates: Flow<DataUpdate<T>>
-        get() = this@filterData.updates.filter { predicate.accepts(it) }
+    override val updates: Flow<Name>
+        get() = this@filterData.updates.filter {
+            val data = read(it)
+            predicate.accepts(it, data?.meta, data?.type ?: dataType)
+        }

     override val dataType: KType get() = this@filterData.dataType

@@ -52,10 +48,14 @@ public fun <T> ObservableDataSource<T>.filterData(
         this@filterData.read(name)?.takeIf { predicate.accepts(name, it.meta, it.type) }
 }

-internal class FilteredDataTree<T>(
-    val source: DataTree<T>,
-    val filter: DataFilter,
-    val branch: Name,
+/**
+ * A [DataTree] filtered by branch and some criterion, possibly changing resulting type
+ */
+@DFInternal
+public class FilteredDataTree<T>(
+    public val source: DataTree<T>,
+    public val filter: DataFilter,
+    public val branch: Name,
     override val dataType: KType = source.dataType,
 ) : DataTree<T> {

@@ -70,41 +70,13 @@ internal class FilteredDataTree<T>(
             ?.filter { !it.value.isEmpty() }
             ?: emptyMap()

-    override val updates: Flow<DataUpdate<T>>
-        get() = source.updates.filter { filter.accepts(it) }
+    override val updates: Flow<Name>
+        get() = source.updates.filter {
+            val data = read(it)
+            filter.accepts(it, data?.meta, data?.type ?: dataType)
+        }
 }

 public fun <T> DataTree<T>.filterData(
     predicate: DataFilter,
-): DataTree<T> = FilteredDataTree(this, predicate, Name.EMPTY)
-
-///**
-// * Generate a wrapper data set with a given name prefix appended to all names
-// */
-//public fun <T : Any> DataTree<T>.withNamePrefix(prefix: Name): DataSet<T> = if (prefix.isEmpty()) {
-//    this
-//} else object : DataSource<T> {
-//
-//    override val dataType: KType get() = this@withNamePrefix.dataType
-//
-//    override val coroutineContext: CoroutineContext
-//        get() = (this@withNamePrefix as? DataSource)?.coroutineContext ?: EmptyCoroutineContext
-//
-//    override val meta: Meta get() = this@withNamePrefix.meta
-//
-//
-//    override fun iterator(): Iterator<NamedData<T>> = iterator {
-//        for (d in this@withNamePrefix) {
-//            yield(d.data.named(prefix + d.name))
-//        }
-//    }
-//
-//    override fun get(name: Name): Data<T>? =
-//        name.removeFirstOrNull(name)?.let { this@withNamePrefix.get(it) }
-//
-//    override val updates: Flow<Name> get() = this@withNamePrefix.updates.map { prefix + it }
-//}
-//
+): FilteredDataTree<T> = FilteredDataTree(this, predicate, Name.EMPTY)
@@ -15,40 +15,41 @@
  */
 package space.kscience.dataforge.data

 import space.kscience.dataforge.meta.Meta
 import space.kscience.dataforge.meta.get
 import space.kscience.dataforge.meta.string
 import space.kscience.dataforge.misc.DFExperimental
-import space.kscience.dataforge.misc.UnsafeKType
 import space.kscience.dataforge.names.Name
+import space.kscience.dataforge.names.NameToken
+import space.kscience.dataforge.names.plus
 import kotlin.reflect.KType

-public interface GroupRule {
-    public fun <T> gather(set: DataTree<T>): Map<String, DataTree<T>>
+/**
+ * Interface that define rename rule for [Data]
+ */
+@DFExperimental
+public fun interface DataRenamer {
+    public fun rename(name: Name, meta: Meta, type: KType): Name

     public companion object {

         /**
-         * Create grouping rule that creates groups for different values of value
-         * field with name [key]
-         *
-         * @param key
-         * @param defaultTagValue
-         * @return
+         * Prepend name token `key[tagValue]` to data name
          */
-        @OptIn(UnsafeKType::class)
-        public fun byMetaValue(
+        public fun groupByMetaValue(
             key: String,
             defaultTagValue: String,
-        ): GroupRule = object : GroupRule {
+        ): DataRenamer = object : DataRenamer {

-            override fun <T> gather(
-                set: DataTree<T>,
-            ): Map<String, DataTree<T>> {
-                val map = HashMap<String, MutableDataTree<T>>()
-
-                set.forEach { data ->
-                    val tagValue: String = data.meta[key]?.string ?: defaultTagValue
-                    map.getOrPut(tagValue) { MutableDataTree(set.dataType) }.put(data.name, data.data)
-                }
-
-                return map
+            override fun rename(
+                name: Name,
+                meta: Meta,
+                type: KType
+            ): Name {
+                val tagValue: String = meta[key]?.string ?: defaultTagValue
+                return NameToken(key, tagValue).plus(name)
             }
         }
     }
 }
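A sketch of the new renamer's behavior, following the implementation above: instead of materializing a separate tree per group as the old `GroupRule.gather` did, grouping is now expressed lazily as a name transformation.

```kotlin
val renamer = DataRenamer.groupByMetaValue("type", "@default")

// For an item named "spectrum1" whose meta has `type = "calibration"`,
// rename(...) prepends the grouping token:
//   "type[calibration].spectrum1"
// Items without the tag fall back to "type[@default].spectrum1".
```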
@@ -1,70 +1,51 @@
 package space.kscience.dataforge.data

-import kotlinx.coroutines.CoroutineScope
-import kotlinx.coroutines.Job
-import kotlinx.coroutines.channels.BufferOverflow
 import kotlinx.coroutines.flow.Flow
 import kotlinx.coroutines.flow.MutableSharedFlow
 import kotlinx.coroutines.flow.mapNotNull
-import kotlinx.coroutines.launch
 import space.kscience.dataforge.misc.UnsafeKType
 import space.kscience.dataforge.names.*
 import kotlin.reflect.KType
 import kotlin.reflect.typeOf

-public interface DataSink<in T> {
-    /**
-     * Put data without notification
-     */
-    public fun put(name: Name, data: Data<T>?)
-
-    /**
-     * Put data and propagate changes downstream
-     */
-    public suspend fun update(name: Name, data: Data<T>?)
+/**
+ * A marker scope for data builders
+ */
+public interface DataBuilderScope<in T> {
+    public companion object : DataBuilderScope<Nothing>
 }

+@Suppress("UNCHECKED_CAST")
+public fun <T> DataBuilderScope(): DataBuilderScope<T> = DataBuilderScope as DataBuilderScope<T>
+
 /**
- * Launch continuous update using
+ * Asynchronous data sink
  */
-public fun <T> DataSink<T>.launchUpdate(
-    scope: CoroutineScope,
-    updater: suspend DataSink<T>.() -> Unit,
-): Job = scope.launch {
-    object : DataSink<T> {
-        override fun put(name: Name, data: Data<T>?) {
-            launch {
-                this@launchUpdate.update(name, data)
-            }
-        }
-
-        override suspend fun update(name: Name, data: Data<T>?) {
-            this@launchUpdate.update(name, data)
-        }
-    }.updater()
+public fun interface DataSink<in T> : DataBuilderScope<T> {
+    /**
+     * Put data and notify listeners if needed
+     */
+    public suspend fun write(name: Name, data: Data<T>?)
 }

 /**
  * A mutable version of [DataTree]
  */
 public interface MutableDataTree<T> : DataTree<T>, DataSink<T> {
     override var data: Data<T>?

     override val items: Map<NameToken, MutableDataTree<T>>

-    public fun getOrCreateItem(token: NameToken): MutableDataTree<T>
-
-    public operator fun set(token: NameToken, data: Data<T>?)
-
-    override fun put(name: Name, data: Data<T>?): Unit = set(name, data)
-}
-
-public tailrec operator fun <T> MutableDataTree<T>.set(name: Name, data: Data<T>?): Unit {
-    when (name.length) {
-        0 -> this.data = data
-        1 -> set(name.first(), data)
-        else -> getOrCreateItem(name.first())[name.cutFirst()] = data
-    }
+//
+//    public fun getOrCreateItem(token: NameToken): MutableDataTree<T>
+//
+//    public suspend fun put(token: NameToken, data: Data<T>?)
+//
+//    override suspend fun put(name: Name, data: Data<T>?): Unit {
+//        when (name.length) {
+//            0 -> this.data = data
+//            1 -> put(name.first(), data)
+//            else -> getOrCreateItem(name.first()).put(name.cutFirst(), data)
+//        }
+//    }
 }

@@ -81,65 +62,58 @@ private class MutableDataTreeRoot<T>(
     override val dataType: KType,
 ) : MutableDataTree<T> {

-    override val updates = MutableSharedFlow<DataUpdate<T>>(100, onBufferOverflow = BufferOverflow.DROP_LATEST)
-
-    override val items = HashMap<NameToken, MutableDataTree<T>>()
+    override val updates = MutableSharedFlow<Name>()

     inner class MutableDataTreeBranch(val branchName: Name) : MutableDataTree<T> {

         override var data: Data<T>? = null
-            private set

         override val items = HashMap<NameToken, MutableDataTree<T>>()

-        override val updates: Flow<DataUpdate<T>> = this@MutableDataTreeRoot.updates.mapNotNull { update ->
-            update.name.removeFirstOrNull(branchName)?.let {
-                DataUpdate(update.data?.type ?: dataType, it, update.data)
-            }
+        override val updates: Flow<Name> = this@MutableDataTreeRoot.updates.mapNotNull { update ->
+            update.removeFirstOrNull(branchName)
         }

         override val dataType: KType get() = this@MutableDataTreeRoot.dataType

-        override fun getOrCreateItem(token: NameToken): MutableDataTree<T> =
-            items.getOrPut(token) { MutableDataTreeBranch(branchName + token) }
-
-        override fun set(token: NameToken, data: Data<T>?) {
-            val subTree = getOrCreateItem(token)
-            subTree.data = data
-        }
-
-        override suspend fun update(name: Name, data: Data<T>?) {
-            if (name.isEmpty()) {
-                this.data = data
-                this@MutableDataTreeRoot.updates.emit(DataUpdate(data?.type ?: dataType, branchName + name, data))
-            } else {
-                getOrCreateItem(name.first()).update(name.cutFirst(), data)
+        override suspend fun write(
+            name: Name,
+            data: Data<T>?
+        ) {
+            when (name.length) {
+                0 -> {
+                    this.data = data
+                    this@MutableDataTreeRoot.updates.emit(branchName)
+                }
+
+                else -> {
+                    val token = name.first()
+                    items.getOrPut(token) { MutableDataTreeBranch(branchName + token) }.write(name.cutFirst(), data)
+                }
             }
         }
     }

     override var data: Data<T>? = null
-        private set

     override val items = HashMap<NameToken, MutableDataTree<T>>()

-    override fun getOrCreateItem(token: NameToken): MutableDataTree<T> = items.getOrPut(token) {
-        MutableDataTreeBranch(token.asName())
-    }
-
-    override fun set(token: NameToken, data: Data<T>?) {
-        val subTree = getOrCreateItem(token)
-        subTree.data = data
-    }
-
-    override suspend fun update(name: Name, data: Data<T>?) {
-        if (name.isEmpty()) {
-            this.data = data
-            updates.emit(DataUpdate(data?.type ?: dataType, name, data))
-        } else {
-            getOrCreateItem(name.first()).update(name.cutFirst(), data)
+    override suspend fun write(
+        name: Name,
+        data: Data<T>?
+    ) {
+        when (name.length) {
+            0 -> {
+                this.data = data
+                this@MutableDataTreeRoot.updates.emit(Name.EMPTY)
+            }
+
+            else -> {
+                val token = name.first()
+                items.getOrPut(token) { MutableDataTreeBranch(token.asName()) }.write(name.cutFirst(), data)
+            }
         }
     }
 }

@@ -151,7 +125,7 @@ public fun <T> MutableDataTree(
 ): MutableDataTree<T> = MutableDataTreeRoot<T>(type)

 /**
- * Create and initialize a observable mutable data tree.
+ * Create and initialize an observable mutable data tree.
  */
 @OptIn(UnsafeKType::class)
 public inline fun <reified T> MutableDataTree(
@@ -1,7 +1,6 @@
 package space.kscience.dataforge.data

-import kotlinx.coroutines.flow.Flow
-import kotlinx.coroutines.flow.emptyFlow
+import kotlinx.coroutines.flow.*
 import space.kscience.dataforge.meta.Meta
 import space.kscience.dataforge.names.*
 import kotlin.contracts.contract

@@ -30,11 +29,21 @@ public interface DataSource<out T> {
 public interface ObservableDataSource<out T> : DataSource<T> {

     /**
-     * Flow updates made to the data
+     * Names of updated elements.
+     * Data updates with the same names could be glued together.
+     *
+     * Updates are considered critical.
+     * The producer will suspend unless all updates are consumed.
      */
-    public val updates: Flow<DataUpdate<T>>
+    public val updates: Flow<Name>
 }

+public suspend fun <T> ObservableDataSource<T>.awaitData(name: Name): Data<T> =
+    read(name) ?: updates.filter { it == name }.mapNotNull { read(name) }.first()
+
+public suspend fun <T> ObservableDataSource<T>.awaitData(name: String): Data<T> =
+    awaitData(name.parseAsName())
+
 /**
  * A tree like structure for data holding
  */

@@ -51,17 +60,16 @@ public interface DataTree<out T> : ObservableDataSource<T> {
     /**
      * Flow updates made to the data
      */
-    override val updates: Flow<DataUpdate<T>>
+    override val updates: Flow<Name>

     public companion object {
-        private object EmptyDataTree :
-            DataTree<Nothing> {
+        private object EmptyDataTree : DataTree<Nothing> {
             override val data: Data<Nothing>? = null
             override val items: Map<NameToken, EmptyDataTree> = emptyMap()
             override val dataType: KType = typeOf<Unit>()

             override fun read(name: Name): Data<Nothing>? = null
-            override val updates: Flow<DataUpdate<Nothing>> get() = emptyFlow()
+            override val updates: Flow<Name> get() = emptyFlow()
         }

         public val EMPTY: DataTree<Nothing> = EmptyDataTree
@@ -32,7 +32,7 @@ public interface Goal<out T> {
     public companion object
 }

-public fun Goal<*>.launch(coroutineScope: CoroutineScope): Job = async(coroutineScope)
+public fun Goal<*>.launchIn(coroutineScope: CoroutineScope): Job = async(coroutineScope)

 public suspend fun <T> Goal<T>.await(): T = coroutineScope { async(this).await() }

@@ -64,11 +64,14 @@ public open class LazyGoal<T>(
     /**
      * Get ongoing computation or start a new one.
      * Does not guarantee thread safety. In case of multi-thread access, could create orphan computations.
-     * If [GoalExecutionRestriction] is present in the [coroutineScope] context, the call could produce a error a warning
+     * If [GoalExecutionRestriction] is present in the [coroutineScope] context, the call could produce an error or a warning
      * depending on the settings.
+     *
+     * If [Goal] is already started on a different scope, it is not restarted.
      */
     @OptIn(DFExperimental::class)
-    override fun async(coroutineScope: CoroutineScope): Deferred<T> {
+    override fun async(coroutineScope: CoroutineScope): Deferred<T> = deferred ?: run {

         val log = coroutineScope.coroutineContext[GoalLogger]
         // Check if context restricts goal computation
         coroutineScope.coroutineContext[GoalExecutionRestriction]?.let { restriction ->

@@ -85,13 +88,14 @@ public open class LazyGoal<T>(
         val startedDependencies = dependencies.map { goal ->
             goal.async(coroutineScope)
         }
-        return deferred ?: coroutineScope.async(
+
+        coroutineScope.async(
             coroutineContext
                 + CoroutineMonitor()
                 + Dependencies(startedDependencies)
                 + GoalExecutionRestriction(GoalExecutionRestrictionPolicy.NONE) // Remove restrictions on goal execution
         ) {
-            //cancel execution if error encountered in one of dependencies
+            //cancel execution if error encountered in one of the dependencies
             startedDependencies.forEach { deferred ->
                 deferred.invokeOnCompletion { error ->
                     if (error != null) this.cancel(CancellationException("Dependency $deferred failed with error: ${error.message}"))
@@ -8,7 +8,7 @@ import space.kscience.dataforge.meta.copy

 private class MetaMaskData<T>(val origin: Data<T>, override val meta: Meta) : Data<T> by origin

 /**
- * A data with overriden meta. It reflects original data computed state.
+ * A data with overridden meta. It reflects original data computed state.
  */
 public fun <T> Data<T>.withMeta(newMeta: Meta): Data<T> = if (this is MetaMaskData) {
     MetaMaskData(origin, newMeta)
@@ -3,38 +3,16 @@ package space.kscience.dataforge.data

 import space.kscience.dataforge.meta.isEmpty
 import space.kscience.dataforge.misc.Named
 import space.kscience.dataforge.names.Name
-import kotlin.reflect.KType
-
-/**
- * An interface implementing a data update event.
- *
- * If [data] is null, then corresponding element should be removed.
- */
-public interface DataUpdate<out T> : Named {
-    public val type: KType
-    override val name: Name
-    public val data: Data<T>?
-}
-
-public fun <T> DataUpdate(type: KType, name: Name, data: Data<T>?): DataUpdate<T> = object : DataUpdate<T> {
-    override val type: KType = type
-    override val name: Name = name
-    override val data: Data<T>? = data
-}

 /**
  * A data coupled to a name.
  */
-public interface NamedData<out T> : DataUpdate<T>, Data<T> {
-    override val data: Data<T>
-}
+public interface NamedData<out T> : Data<T>, Named

 public operator fun NamedData<*>.component1(): Name = name
 public operator fun <T> NamedData<T>.component2(): Data<T> = data

 private class NamedDataImpl<T>(
     override val name: Name,
-    override val data: Data<T>,
+    val data: Data<T>,
 ) : Data<T> by data, NamedData<T> {
     override fun toString(): String = buildString {
         append("NamedData(name=\"$name\"")

@@ -49,7 +27,7 @@ private class NamedDataImpl<T>(
 }

 public fun <T> Data<T>.named(name: Name): NamedData<T> = if (this is NamedData) {
-    NamedDataImpl(name, this.data)
+    NamedDataImpl(name, this)
 } else {
     NamedDataImpl(name, this)
 }

dataforge-data/src/commonMain/kotlin/space/kscience/dataforge/data/StaticDataBuilder.kt (new file, 63 lines)

@@ -0,0 +1,63 @@
+package space.kscience.dataforge.data
+
+import space.kscience.dataforge.meta.Meta
+import space.kscience.dataforge.meta.MutableMeta
+import space.kscience.dataforge.misc.UnsafeKType
+import space.kscience.dataforge.names.Name
+import space.kscience.dataforge.names.parseAsName
+import space.kscience.dataforge.names.plus
+import kotlin.reflect.KType
+import kotlin.reflect.typeOf
+
+
+public fun interface StaticDataBuilder<T> : DataBuilderScope<T> {
+    public fun data(name: Name, data: Data<T>)
+}
+
+private class DataMapBuilder<T> : StaticDataBuilder<T> {
+    val map = mutableMapOf<Name, Data<T>>()
+
+    override fun data(name: Name, data: Data<T>) {
+        if (map.containsKey(name)) {
+            error("Duplicate key '$name'")
+        } else {
+            map.put(name, data)
+        }
+    }
+}
+
+public fun <T> StaticDataBuilder<T>.data(name: String, data: Data<T>) {
+    data(name.parseAsName(), data)
+}
+
+public inline fun <T, reified T1 : T> StaticDataBuilder<T>.value(
+    name: String,
+    value: T1,
+    metaBuilder: MutableMeta.() -> Unit = {}
+) {
+    data(name, Data(value, Meta(metaBuilder)))
+}
+
+public fun <T> StaticDataBuilder<T>.node(prefix: Name, block: StaticDataBuilder<T>.() -> Unit) {
+    val map = DataMapBuilder<T>().apply(block).map
+    map.forEach { (name, data) ->
+        data(prefix + name, data)
+    }
+}
+
+public fun <T> StaticDataBuilder<T>.node(prefix: String, block: StaticDataBuilder<T>.() -> Unit) =
+    node(prefix.parseAsName(), block)
+
+public fun <T> StaticDataBuilder<T>.node(prefix: String, tree: DataTree<T>) {
+    tree.forEach { data ->
+        data(prefix.parseAsName() + data.name, data)
+    }
+}
+
+@UnsafeKType
+public fun <T> DataTree.Companion.static(type: KType, block: StaticDataBuilder<T>.() -> Unit): DataTree<T> =
+    DataMapBuilder<T>().apply(block).map.asTree(type)
+
+@OptIn(UnsafeKType::class)
+public inline fun <reified T> DataTree.Companion.static(noinline block: StaticDataBuilder<T>.() -> Unit): DataTree<T> =
+    static(typeOf<T>(), block)
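A usage sketch for the new static builder, using only the declarations from this file (the names and values are illustrative):

```kotlin
val tree: DataTree<String> = DataTree.static {
    value("a", "hello")
    node("sub") {
        value("b", "world") { "tag" put "demo" }
    }
}
// The result is an immutable tree: "a" and "sub.b" resolve to static data,
// and a duplicate name fails fast with "Duplicate key".
```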
@ -1,134 +1,113 @@
|
||||
package space.kscience.dataforge.data
|
||||
|
||||
import kotlinx.coroutines.CoroutineScope
|
||||
import kotlinx.coroutines.Job
|
||||
import kotlinx.coroutines.flow.launchIn
|
||||
import kotlinx.coroutines.flow.onEach
|
||||
import space.kscience.dataforge.meta.Meta
|
||||
import space.kscience.dataforge.meta.MutableMeta
|
||||
import space.kscience.dataforge.names.*
|
||||
import space.kscience.dataforge.names.Name
|
||||
import space.kscience.dataforge.names.asName
|
||||
import space.kscience.dataforge.names.isEmpty
|
||||
import space.kscience.dataforge.names.plus
|
||||
|
||||
|
||||
public fun <T> DataSink<T>.put(value: NamedData<T>) {
|
||||
put(value.name, value.data)
|
||||
public suspend fun <T> DataSink<T>.write(value: NamedData<T>) {
|
||||
write(value.name, value)
|
||||
}
|
||||
|
||||
public inline fun <T> DataSink<T>.putAll(
|
||||
public inline fun <T> DataSink<T>.writeAll(
|
||||
prefix: Name,
|
||||
block: DataSink<T>.() -> Unit,
|
||||
) {
|
||||
if (prefix.isEmpty()) {
|
||||
apply(block)
|
||||
} else {
|
||||
val proxyDataSink = object :DataSink<T>{
|
||||
override fun put(name: Name, data: Data<T>?) {
|
||||
this@putAll.put(prefix + name, data)
|
||||
}
|
||||
|
||||
override suspend fun update(name: Name, data: Data<T>?) {
|
||||
this@putAll.update(prefix + name, data)
|
||||
}
|
||||
|
||||
}
|
||||
val proxyDataSink = DataSink<T> { name, data -> this@writeAll.write(prefix + name, data) }
|
||||
|
||||
proxyDataSink.apply(block)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
public inline fun <T> DataSink<T>.putAll(
|
||||
public inline fun <T> DataSink<T>.writeAll(
|
||||
prefix: String,
|
||||
block: DataSink<T>.() -> Unit,
|
||||
): Unit = putAll(prefix.asName(), block)
|
||||
): Unit = writeAll(prefix.asName(), block)
|
||||
|
||||
|
||||
public fun <T> DataSink<T>.put(name: String, value: Data<T>) {
|
||||
put(Name.parse(name), value)
|
||||
public suspend fun <T> DataSink<T>.write(name: String, value: Data<T>) {
|
||||
write(Name.parse(name), value)
|
||||
}
|
||||
|
||||
public fun <T> DataSink<T>.putAll(name: Name, tree: DataTree<T>) {
|
||||
putAll(name) { putAll(tree.asSequence()) }
|
||||
public suspend fun <T> DataSink<T>.writeAll(name: Name, tree: DataTree<T>) {
|
||||
writeAll(name) { writeAll(tree.asSequence()) }
|
||||
}
|
||||
|
||||
|
||||
public fun <T> DataSink<T>.putAll(name: String, tree: DataTree<T>) {
|
||||
putAll(Name.parse(name)) { putAll(tree.asSequence()) }
|
||||
public suspend fun <T> DataSink<T>.writeAll(name: String, tree: DataTree<T>) {
|
||||
writeAll(Name.parse(name)) { writeAll(tree.asSequence()) }
|
||||
}
|
||||
|
||||
/**
|
||||
* Produce lazy [Data] and emit it into the [MutableDataTree]
|
||||
*/
|
||||
public inline fun <reified T> DataSink<T>.putValue(
|
||||
public suspend inline fun <reified T> DataSink<T>.writeValue(
|
||||
name: String,
|
||||
meta: Meta = Meta.EMPTY,
|
||||
noinline producer: suspend () -> T,
|
||||
) {
|
||||
val data = Data(meta, block = producer)
|
||||
put(name, data)
|
||||
write(name, data)
|
||||
}
|
||||
|
||||
public inline fun <reified T> DataSink<T>.putValue(
|
||||
public suspend inline fun <reified T> DataSink<T>.writeValue(
|
||||
name: Name,
|
||||
meta: Meta = Meta.EMPTY,
|
||||
noinline producer: suspend () -> T,
|
||||
) {
|
||||
val data = Data(meta, block = producer)
|
||||
put(name, data)
|
||||
write(name, data)
|
||||
}
|
||||
|
||||
/**
|
||||
* Emit static data with the fixed value
|
||||
*/
|
||||
public inline fun <reified T> DataSink<T>.putValue(
|
||||
public suspend inline fun <reified T> DataSink<T>.writeValue(
|
||||
name: Name,
|
||||
value: T,
|
||||
meta: Meta = Meta.EMPTY,
|
||||
): Unit = put(name, Data.wrapValue(value, meta))
|
||||
): Unit = write(name, Data.wrapValue(value, meta))
|
||||
|
||||
public inline fun <reified T> DataSink<T>.putValue(
|
||||
public suspend inline fun <reified T> DataSink<T>.writeValue(
|
||||
name: String,
|
||||
value: T,
|
||||
meta: Meta = Meta.EMPTY,
|
||||
): Unit = put(name, Data.wrapValue(value, meta))
|
||||
): Unit = write(name, Data.wrapValue(value, meta))
|
||||
|
||||
public inline fun <reified T> DataSink<T>.putValue(
|
||||
public suspend inline fun <reified T> DataSink<T>.writeValue(
|
||||
name: String,
|
||||
value: T,
|
||||
metaBuilder: MutableMeta.() -> Unit,
|
||||
): Unit = put(Name.parse(name), Data.wrapValue(value, Meta(metaBuilder)))
|
||||
): Unit = write(Name.parse(name), Data.wrapValue(value, Meta(metaBuilder)))
|
||||
|
||||
public suspend inline fun <reified T> DataSink<T>.updateValue(
|
||||
name: Name,
|
||||
value: T,
|
||||
meta: Meta = Meta.EMPTY,
|
||||
): Unit = update(name, Data.wrapValue(value, meta))
|
||||
|
||||
public suspend inline fun <reified T> DataSink<T>.updateValue(
|
||||
name: String,
|
||||
value: T,
|
||||
meta: Meta = Meta.EMPTY,
|
||||
): Unit = update(name.parseAsName(), Data.wrapValue(value, meta))
|
||||
|
||||
public fun <T> DataSink<T>.putAll(sequence: Sequence<NamedData<T>>) {
|
||||
public suspend fun <T> DataSink<T>.writeAll(sequence: Sequence<NamedData<T>>) {
|
||||
sequence.forEach {
|
||||
put(it.name, it.data)
|
||||
write(it)
|
||||
}
|
||||
}
|
||||
|
||||
public fun <T> DataSink<T>.putAll(tree: DataTree<T>) {
|
||||
putAll(tree.asSequence())
|
||||
public suspend fun <T> DataSink<T>.writeAll(map: Map<Name, Data<T>?>) {
|
||||
map.forEach { (name, data) ->
|
||||
write(name, data)
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Copy given data set and mirror its changes to this [DataSink] in [this@setAndObserve]. Returns an update [Job]
|
||||
* Copy all data from [this] and mirror changes if they appear. Suspends indefinitely.
|
||||
*/
|
||||
public fun <T : Any> DataSink<T>.putAllAndWatch(
|
||||
scope: CoroutineScope,
|
||||
branchName: Name = Name.EMPTY,
|
||||
public suspend fun <T : Any> MutableDataTree<T>.writeAllAndWatch(
|
||||
source: DataTree<T>,
|
||||
): Job {
|
||||
putAll(branchName, source)
|
||||
return source.updates.onEach {
|
||||
update(branchName + it.name, it.data)
|
||||
}.launchIn(scope)
|
||||
prefix: Name = Name.EMPTY,
|
||||
) {
|
||||
writeAll(prefix, source)
|
||||
source.updates.collect {
|
||||
write(prefix + it, source.read(it))
|
||||
}
|
||||
}
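// Usage sketch (editor's illustration, hypothetical): mirroring one tree into another
// with the suspending API above. Assumes kotlinx.coroutines and dataforge-data imports.
suspend fun CoroutineScope.mirrorExample() {
    val source = MutableDataTree<Int>()
    val target = MutableDataTree<Int>()

    source.writeValue("a", 1) // wraps the value via Data.wrapValue and writes it

    // writeAllAndWatch suspends indefinitely, so collect updates in a child coroutine
    val mirror = launch { target.writeAllAndWatch(source, "mirror".asName()) }

    source.writeValue("b", 2) // propagated to target under "mirror.b"
    mirror.cancel()
}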
|
@ -1,5 +1,6 @@
|
||||
package space.kscience.dataforge.data
|
||||
|
||||
import kotlinx.coroutines.CoroutineScope
|
||||
import space.kscience.dataforge.meta.*
|
||||
import space.kscience.dataforge.misc.UnsafeKType
|
||||
import space.kscience.dataforge.names.Name
|
||||
@ -36,7 +37,6 @@ public fun <T, R> Data<T>.transform(
|
||||
}
|
||||
|
||||
|
||||
|
||||
/**
|
||||
* Lazily transform this data to another data. By convention [block] should not use external data (be pure).
|
||||
* @param coroutineContext additional [CoroutineContext] elements used for data computation.
|
||||
@ -77,7 +77,6 @@ internal fun Iterable<Data<*>>.joinMeta(): Meta = Meta {
|
||||
}
|
||||
|
||||
|
||||
|
||||
@PublishedApi
|
||||
internal fun Map<*, Data<*>>.joinMeta(): Meta = Meta {
|
||||
forEach { (key, data) ->
|
||||
@ -201,34 +200,50 @@ public inline fun <T, reified R> Iterable<NamedData<T>>.foldNamedToData(
|
||||
|
||||
|
||||
@UnsafeKType
|
||||
public suspend fun <T, R> DataTree<T>.transform(
|
||||
public fun <T, R> DataTree<T>.transformEach(
|
||||
outputType: KType,
|
||||
metaTransform: MutableMeta.() -> Unit = {},
|
||||
coroutineContext: CoroutineContext = EmptyCoroutineContext,
|
||||
block: suspend (NamedValueWithMeta<T>) -> R,
|
||||
): DataTree<R> = DataTree<R>(outputType){
|
||||
//quasi-synchronous processing of elements in the tree
|
||||
asSequence().forEach { namedData: NamedData<T> ->
|
||||
val newMeta = namedData.meta.toMutableMeta().apply(metaTransform).seal()
|
||||
val d = Data(outputType, newMeta, coroutineContext, listOf(namedData)) {
|
||||
block(namedData.awaitWithMeta())
|
||||
scope: CoroutineScope,
|
||||
metaTransform: MutableMeta.(name: Name) -> Unit = {},
|
||||
compute: suspend (NamedValueWithMeta<T>) -> R,
|
||||
): DataTree<R> = DataTree<R>(
|
||||
outputType,
|
||||
scope,
|
||||
initialData = asSequence().associate { namedData: NamedData<T> ->
|
||||
val newMeta = namedData.meta.toMutableMeta().apply {
|
||||
metaTransform(namedData.name)
|
||||
}.seal()
|
||||
val newData = Data(outputType, newMeta, scope.coroutineContext, listOf(namedData)) {
|
||||
compute(namedData.awaitWithMeta())
|
||||
}
|
||||
namedData.name to newData
|
||||
}
|
||||
) {
|
||||
updates.collect { name ->
|
||||
val data: Data<T>? = read(name)
|
||||
if (data == null) write(name, null) else {
|
||||
val newMeta = data.meta.toMutableMeta().apply {
|
||||
metaTransform(name)
|
||||
}.seal()
|
||||
val d = Data(outputType, newMeta, scope.coroutineContext, listOf(data)) {
|
||||
compute(NamedValueWithMeta(name, data.await(), data.meta))
|
||||
}
|
||||
write(name, d)
|
||||
}
|
||||
put(namedData.name, d)
|
||||
}
|
||||
}
|
||||
|
||||
@OptIn(UnsafeKType::class)
|
||||
public suspend inline fun <T, reified R> DataTree<T>.transform(
|
||||
noinline metaTransform: MutableMeta.() -> Unit = {},
|
||||
coroutineContext: CoroutineContext = EmptyCoroutineContext,
|
||||
public inline fun <T, reified R> DataTree<T>.transformEach(
|
||||
scope: CoroutineScope,
|
||||
noinline metaTransform: MutableMeta.(name: Name) -> Unit = {},
|
||||
noinline block: suspend (NamedValueWithMeta<T>) -> R,
|
||||
): DataTree<R> = this@transform.transform(typeOf<R>(), metaTransform, coroutineContext, block)
|
||||
): DataTree<R> = transformEach(typeOf<R>(), scope, metaTransform, block)
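// Usage sketch (editor's illustration, hypothetical): derive a tree of doubled values.
// The result keeps following `source`, because transformEach re-collects its updates in
// `scope`. Assumes NamedValueWithMeta exposes a `value` property.
fun doubleAll(scope: CoroutineScope, source: DataTree<Int>): DataTree<Int> =
    source.transformEach(scope) { it.value * 2 }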
|
||||
|
||||
public inline fun <T> DataTree<T>.forEach(block: (NamedData<T>) -> Unit) {
|
||||
asSequence().forEach(block)
|
||||
}
|
||||
|
||||
// DataSet reduction
|
||||
// DataSet snapshot reduction
|
||||
|
||||
@PublishedApi
|
||||
internal fun DataTree<*>.joinMeta(): Meta = Meta {
|
||||
@ -238,6 +253,10 @@ internal fun DataTree<*>.joinMeta(): Meta = Meta {
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Reduce the current snapshot of the [DataTree] to a single [Data].
|
||||
* Even if the tree changes later, only the current data set is used.
|
||||
*/
|
||||
public inline fun <T, reified R> DataTree<T>.reduceToData(
|
||||
meta: Meta = joinMeta(),
|
||||
coroutineContext: CoroutineContext = EmptyCoroutineContext,
|
||||
|
@ -1,8 +1,11 @@
|
||||
package space.kscience.dataforge.data
|
||||
|
||||
import kotlinx.coroutines.CoroutineScope
|
||||
import kotlinx.coroutines.flow.Flow
|
||||
import kotlinx.coroutines.flow.MutableSharedFlow
|
||||
import kotlinx.coroutines.flow.SharedFlow
|
||||
import kotlinx.coroutines.flow.mapNotNull
|
||||
import kotlinx.coroutines.launch
|
||||
import kotlinx.coroutines.sync.Mutex
|
||||
import kotlinx.coroutines.sync.withLock
|
||||
import space.kscience.dataforge.misc.UnsafeKType
|
||||
@ -14,7 +17,7 @@ import kotlin.reflect.typeOf
|
||||
private class FlatDataTree<T>(
|
||||
override val dataType: KType,
|
||||
private val dataSet: Map<Name, Data<T>>,
|
||||
private val sourceUpdates: Flow<DataUpdate<T>>,
|
||||
private val sourceUpdates: SharedFlow<Name>,
|
||||
private val prefix: Name,
|
||||
) : DataTree<T> {
|
||||
override val data: Data<T>? get() = dataSet[prefix]
|
||||
@ -26,14 +29,13 @@ private class FlatDataTree<T>(
|
||||
|
||||
override fun read(name: Name): Data<T>? = dataSet[prefix + name]
|
||||
|
||||
override val updates: Flow<DataUpdate<T>> =
|
||||
sourceUpdates.mapNotNull { update ->
|
||||
update.name.removeFirstOrNull(prefix)?.let { DataUpdate(dataType, it, update.data) }
|
||||
}
|
||||
override val updates: Flow<Name> = sourceUpdates.mapNotNull { update ->
|
||||
update.removeFirstOrNull(prefix)
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* A builder for static [DataTree].
|
||||
* A builder for [DataTree].
|
||||
*/
|
||||
private class DataTreeBuilder<T>(
|
||||
private val type: KType,
|
||||
@ -44,28 +46,21 @@ private class DataTreeBuilder<T>(
|
||||
|
||||
private val mutex = Mutex()
|
||||
|
||||
private val updatesFlow = MutableSharedFlow<DataUpdate<T>>()
|
||||
private val updatesFlow = MutableSharedFlow<Name>()
|
||||
|
||||
override fun put(name: Name, data: Data<T>?) {
|
||||
if (data == null) {
|
||||
map.remove(name)
|
||||
} else {
|
||||
map[name] = data
|
||||
}
|
||||
}
|
||||
|
||||
override suspend fun update(name: Name, data: Data<T>?) {
|
||||
override suspend fun write(name: Name, data: Data<T>?) {
|
||||
mutex.withLock {
|
||||
if (data == null) {
|
||||
map.remove(name)
|
||||
} else {
|
||||
map.put(name, data)
|
||||
map[name] = data
|
||||
}
|
||||
}
|
||||
updatesFlow.emit(DataUpdate(data?.type ?: type, name, data))
|
||||
updatesFlow.emit(name)
|
||||
}
|
||||
|
||||
public fun build(): DataTree<T> = FlatDataTree(type, map, updatesFlow, Name.EMPTY)
|
||||
fun build(): DataTree<T> = FlatDataTree(type, map, updatesFlow, Name.EMPTY)
|
||||
}
|
||||
|
||||
/**
|
||||
@ -74,17 +69,32 @@ private class DataTreeBuilder<T>(
|
||||
@UnsafeKType
|
||||
public fun <T> DataTree(
|
||||
dataType: KType,
|
||||
generator: DataSink<T>.() -> Unit,
|
||||
): DataTree<T> = DataTreeBuilder<T>(dataType).apply(generator).build()
|
||||
scope: CoroutineScope,
|
||||
initialData: Map<Name, Data<T>> = emptyMap(),
|
||||
updater: suspend DataSink<T>.() -> Unit,
|
||||
): DataTree<T> = DataTreeBuilder<T>(dataType, initialData).apply {
|
||||
scope.launch(GoalExecutionRestriction(GoalExecutionRestrictionPolicy.ERROR)) {
|
||||
updater()
|
||||
}
|
||||
}.build()
|
||||
|
||||
/**
|
||||
* Create a data tree.
|
||||
*/
|
||||
@OptIn(UnsafeKType::class)
|
||||
public inline fun <reified T> DataTree(
|
||||
noinline generator: DataSink<T>.() -> Unit,
|
||||
): DataTree<T> = DataTree(typeOf<T>(), generator)
|
||||
scope: CoroutineScope,
|
||||
initialData: Map<Name, Data<T>> = emptyMap(),
|
||||
noinline updater: suspend DataSink<T>.() -> Unit,
|
||||
): DataTree<T> = DataTree(typeOf<T>(), scope, initialData, updater)
|
||||
|
||||
@UnsafeKType
|
||||
public fun <T> DataTree(type: KType, data: Map<Name, Data<T>>): DataTree<T> =
|
||||
DataTreeBuilder(type, data).build()
|
||||
|
||||
@OptIn(UnsafeKType::class)
|
||||
public inline fun <reified T> DataTree(data: Map<Name, Data<T>>): DataTree<T> =
|
||||
DataTree(typeOf<T>(), data)
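// Usage sketch (editor's illustration, hypothetical): the updater runs as a coroutine in
// `scope`, so the resulting tree may keep changing for as long as the scope is alive.
fun countdownTree(scope: CoroutineScope): DataTree<Int> = DataTree<Int>(scope) {
    repeat(3) { writeValue("tick[$it]", it) }
}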
|
||||
|
||||
/**
|
||||
* Represent this flat data map as a [DataTree] without copying it
|
||||
@ -102,7 +112,7 @@ public inline fun <reified T> Map<Name, Data<T>>.asTree(): DataTree<T> = asTree(
|
||||
|
||||
@UnsafeKType
|
||||
public fun <T> Sequence<NamedData<T>>.toTree(type: KType): DataTree<T> =
|
||||
DataTreeBuilder(type, associate { it.name to it.data }).build()
|
||||
DataTreeBuilder(type, associate { it.name to it }).build()
|
||||
|
||||
|
||||
/**
|
||||
|
@ -1,8 +1,7 @@
|
||||
package space.kscience.dataforge.data
|
||||
|
||||
import kotlinx.coroutines.Job
|
||||
import kotlinx.coroutines.flow.collect
|
||||
import kotlinx.coroutines.flow.take
|
||||
import kotlinx.coroutines.launch
|
||||
import kotlinx.coroutines.test.runTest
|
||||
import space.kscience.dataforge.names.asName
|
||||
import kotlin.test.Test
|
||||
@ -13,13 +12,13 @@ import kotlin.time.Duration.Companion.milliseconds
|
||||
internal class DataTreeBuilderTest {
|
||||
@Test
|
||||
fun testTreeBuild() = runTest(timeout = 500.milliseconds) {
|
||||
val node = DataTree<Any> {
|
||||
putAll("primary") {
|
||||
putValue("a", "a")
|
||||
putValue("b", "b")
|
||||
val node = DataTree.static<Any> {
|
||||
node("primary") {
|
||||
value("a", "a")
|
||||
value("b", "b")
|
||||
}
|
||||
putValue("c.d", "c.d")
|
||||
putValue("c.f", "c.f")
|
||||
value("c.d", "c.d")
|
||||
value("c.f", "c.f")
|
||||
}
|
||||
assertEquals("a", node["primary.a"]?.await())
|
||||
assertEquals("b", node["primary.b"]?.await())
|
||||
@ -30,20 +29,18 @@ internal class DataTreeBuilderTest {
|
||||
|
||||
@Test
|
||||
fun testDataUpdate() = runTest(timeout = 500.milliseconds) {
|
||||
val updateData = DataTree<Any> {
|
||||
putAll("update") {
|
||||
put("a", Data.wrapValue("a"))
|
||||
put("b", Data.wrapValue("b"))
|
||||
}
|
||||
val updateData = DataTree.static<Any> {
|
||||
data("a", Data.wrapValue("a"))
|
||||
data("b", Data.wrapValue("b"))
|
||||
}
|
||||
|
||||
val node = DataTree<Any> {
|
||||
putAll("primary") {
|
||||
putValue("a", "a")
|
||||
putValue("b", "b")
|
||||
val node = DataTree.static<Any> {
|
||||
node("primary") {
|
||||
value("a", "a")
|
||||
value("b", "b")
|
||||
}
|
||||
putValue("root", "root")
|
||||
putAll(updateData)
|
||||
value("root", "root")
|
||||
node("update", updateData)
|
||||
}
|
||||
|
||||
assertEquals("a", node["update.a"]?.await())
|
||||
@ -57,17 +54,20 @@ internal class DataTreeBuilderTest {
|
||||
val subNode = MutableDataTree<Int>()
|
||||
|
||||
val rootNode = MutableDataTree<Int>() {
|
||||
job = putAllAndWatch(this@runTest, "sub".asName(), subNode)
|
||||
job = launch {
|
||||
writeAllAndWatch(subNode, "sub".asName())
|
||||
}
|
||||
}
|
||||
|
||||
repeat(10) {
|
||||
subNode.updateValue("value[$it]", it)
|
||||
subNode.writeValue("value[$it]", it)
|
||||
}
|
||||
|
||||
rootNode.updates.take(10).collect()
|
||||
assertEquals(9, rootNode["sub.value[9]"]?.await())
|
||||
assertEquals(8, rootNode["sub.value[8]"]?.await())
|
||||
|
||||
assertEquals(9, subNode.awaitData("value[9]").await())
|
||||
assertEquals(8, subNode.awaitData("value[8]").await())
|
||||
assertEquals(9, rootNode.awaitData("sub.value[9]").await())
|
||||
assertEquals(8, rootNode.awaitData("sub.value[8]").await())
|
||||
println("finished")
|
||||
job?.cancel()
|
||||
}
|
||||
}
|
@ -1,3 +1,5 @@
|
||||
@file:Suppress("CONTEXT_RECEIVERS_DEPRECATED")
|
||||
|
||||
package space.kscience.dataforge.data
|
||||
|
||||
import space.kscience.dataforge.names.Name
|
||||
@ -7,15 +9,15 @@ import space.kscience.dataforge.names.Name
|
||||
* Append data to node
|
||||
*/
|
||||
context(DataSink<T>)
|
||||
public infix fun <T : Any> String.put(data: Data<T>): Unit =
|
||||
put(Name.parse(this), data)
|
||||
public suspend infix fun <T : Any> String.put(data: Data<T>): Unit =
|
||||
write(Name.parse(this), data)
|
||||
|
||||
/**
|
||||
* Append node
|
||||
*/
|
||||
context(DataSink<T>)
|
||||
public infix fun <T : Any> String.putAll(dataSet: DataTree<T>): Unit =
|
||||
putAll(this, dataSet)
|
||||
public suspend infix fun <T : Any> String.putAll(dataSet: DataTree<T>): Unit =
|
||||
writeAll(this, dataSet)
|
||||
|
||||
/**
|
||||
* Build and append node
|
||||
@ -23,5 +25,5 @@ public infix fun <T : Any> String.putAll(dataSet: DataTree<T>): Unit =
|
||||
context(DataSink<T>)
|
||||
public infix fun <T : Any> String.putAll(
|
||||
block: DataSink<T>.() -> Unit,
|
||||
): Unit = putAll(Name.parse(this), block)
|
||||
): Unit = writeAll(Name.parse(this), block)
|
||||
|
||||
|
@ -1,8 +1,6 @@
|
||||
package space.kscience.dataforge.data
|
||||
|
||||
import kotlinx.coroutines.flow.collect
|
||||
import kotlinx.coroutines.flow.onEach
|
||||
import kotlinx.coroutines.flow.take
|
||||
import kotlinx.coroutines.ExperimentalCoroutinesApi
|
||||
import kotlinx.coroutines.test.runTest
|
||||
import space.kscience.dataforge.actions.Action
|
||||
import space.kscience.dataforge.actions.invoke
|
||||
@ -12,41 +10,40 @@ import kotlin.test.Test
|
||||
import kotlin.test.assertEquals
|
||||
import kotlin.time.Duration.Companion.milliseconds
|
||||
|
||||
@OptIn(DFExperimental::class)
|
||||
@OptIn(DFExperimental::class, ExperimentalCoroutinesApi::class)
|
||||
internal class ActionsTest {
|
||||
@Test
|
||||
fun testStaticMapAction() = runTest(timeout = 500.milliseconds) {
|
||||
val data: DataTree<Int> = DataTree {
|
||||
fun testStaticMapAction() = runTest(timeout = 200.milliseconds) {
|
||||
val plusOne = Action.mapping<Int, Int> {
|
||||
result { it + 1 }
|
||||
}
|
||||
|
||||
val data: DataTree<Int> = DataTree.static {
|
||||
repeat(10) {
|
||||
putValue(it.toString(), it)
|
||||
value(it.toString(), it)
|
||||
}
|
||||
}
|
||||
|
||||
val plusOne = Action.mapping<Int, Int> {
|
||||
result { it + 1 }
|
||||
}
|
||||
val result = plusOne(data)
|
||||
assertEquals(2, result["1"]?.await())
|
||||
|
||||
assertEquals(5, result.awaitData("4").await())
|
||||
}
|
||||
|
||||
@Test
|
||||
fun testDynamicMapAction() = runTest(timeout = 500.milliseconds) {
|
||||
val source: MutableDataTree<Int> = MutableDataTree()
|
||||
|
||||
fun testDynamicMapAction() = runTest(timeout = 200.milliseconds) {
|
||||
val plusOne = Action.mapping<Int, Int> {
|
||||
result { it + 1 }
|
||||
}
|
||||
|
||||
val result = plusOne(source)
|
||||
val source: MutableDataTree<Int> = MutableDataTree()
|
||||
|
||||
val result: DataTree<Int> = plusOne(source)
|
||||
|
||||
repeat(10) {
|
||||
source.updateValue(it.toString(), it)
|
||||
source.writeValue(it.toString(), it)
|
||||
}
|
||||
|
||||
result.updates.take(10).onEach { println(it.name) }.collect()
|
||||
|
||||
assertEquals(2, result["1"]?.await())
|
||||
assertEquals(5, result.awaitData("4").await())
|
||||
}
|
||||
|
||||
}
|
@ -2,11 +2,20 @@
|
||||
|
||||
IO module
|
||||
|
||||
## Features
|
||||
|
||||
- [IO format](src/commonMain/kotlin/space/kscience/dataforge/io/IOFormat.kt) : A generic API for reading an object from a binary representation and writing it to a Binary.
|
||||
- [Binary](src/commonMain/kotlin/space/kscience/dataforge/io/Binary.kt) : Multi-read random access binary.
|
||||
- [Envelope](src/commonMain/kotlin/space/kscience/dataforge/io/Envelope.kt) : API and implementations for combined data and metadata format.
|
||||
- [Tagged envelope](src/commonMain/kotlin/space/kscience/dataforge/io/TaggedEnvelope.kt) : Implementation for binary-friendly envelope format with machine readable tag and forward size declaration.
|
||||
- [Tagless envelope](src/commonMain/kotlin/space/kscience/dataforge/io/TaglessEnvelope.kt) : Implementation for text-friendly envelope format with text separators for sections.
|
||||
|
||||
|
||||
## Usage
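
A minimal example (editor's sketch, mirroring the envelope builder used in this repository's tests; the key names are illustrative):

```kotlin
import kotlinx.io.writeString
import space.kscience.dataforge.io.Envelope

// Build an envelope: metadata goes into the `meta` block, the binary payload into `data`.
val envelope = Envelope {
    meta {
        "experiment.run" put 42
    }
    data {
        writeString("payload")
    }
}
```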
|
||||
|
||||
## Artifact:
|
||||
|
||||
The Maven coordinates of this project are `space.kscience:dataforge-io:0.9.0-dev-1`.
|
||||
The Maven coordinates of this project are `space.kscience:dataforge-io:0.10.0`.
|
||||
|
||||
**Gradle Kotlin DSL:**
|
||||
```kotlin
|
||||
@ -16,6 +25,6 @@ repositories {
|
||||
}
|
||||
|
||||
dependencies {
|
||||
implementation("space.kscience:dataforge-io:0.9.0-dev-1")
|
||||
implementation("space.kscience:dataforge-io:0.10.0")
|
||||
}
|
||||
```
|
||||
|
@ -4,7 +4,7 @@ plugins {
|
||||
|
||||
description = "IO module"
|
||||
|
||||
val ioVersion = "0.3.1"
|
||||
val ioVersion = "0.6.0"
|
||||
|
||||
kscience {
|
||||
jvm()
|
||||
@ -22,6 +22,60 @@ kscience {
|
||||
}
|
||||
}
|
||||
|
||||
readme{
|
||||
readme {
|
||||
maturity = space.kscience.gradle.Maturity.EXPERIMENTAL
|
||||
|
||||
description = """
|
||||
Serialization foundation for Meta objects and Envelope processing.
|
||||
""".trimIndent()
|
||||
|
||||
feature(
|
||||
"io-format",
|
||||
ref = "src/commonMain/kotlin/space/kscience/dataforge/io/IOFormat.kt",
|
||||
name = "IO format"
|
||||
) {
|
||||
"""
|
||||
A generic API for reading an object from a binary representation and writing it to a Binary.
|
||||
|
||||
Similar to KSerializer, but without schema.
|
||||
""".trimIndent()
|
||||
}
|
||||
|
||||
feature(
|
||||
"binary",
|
||||
ref = "src/commonMain/kotlin/space/kscience/dataforge/io/Binary.kt",
|
||||
name = "Binary"
|
||||
) {
|
||||
"Multi-read random access binary."
|
||||
}
|
||||
|
||||
feature(
|
||||
"envelope",
|
||||
ref = "src/commonMain/kotlin/space/kscience/dataforge/io/Envelope.kt",
|
||||
name = "Envelope"
|
||||
) {
|
||||
"""
|
||||
API and implementations for combined data and metadata format.
|
||||
""".trimIndent()
|
||||
}
|
||||
|
||||
feature(
|
||||
"envelope.tagged",
|
||||
ref = "src/commonMain/kotlin/space/kscience/dataforge/io/TaggedEnvelope.kt",
|
||||
name = "Tagged envelope"
|
||||
) {
|
||||
"""
|
||||
Implementation for binary-friendly envelope format with machine readable tag and forward size declaration.
|
||||
""".trimIndent()
|
||||
}
|
||||
|
||||
feature(
|
||||
"envelope.tagless",
|
||||
ref = "src/commonMain/kotlin/space/kscience/dataforge/io/TaglessEnvelope.kt",
|
||||
name = "Tagged envelope"
|
||||
) {
|
||||
"""
|
||||
Implementation for text-friendly envelope format with text separators for sections.
|
||||
""".trimIndent()
|
||||
}
|
||||
}
|
21
dataforge-io/dataforge-io-proto/README.md
Normal file
@ -0,0 +1,21 @@
|
||||
# Module dataforge-io-proto
|
||||
|
||||
ProtoBuf meta IO
|
||||
|
||||
## Usage
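
A round-trip example (editor's sketch, mirroring `ProtoBufTest` in this module):

```kotlin
import kotlinx.io.Buffer
import space.kscience.dataforge.io.proto.ProtoMetaFormat
import space.kscience.dataforge.meta.Meta

val meta = Meta { "a" put 22 }

val buffer = Buffer()
ProtoMetaFormat.writeTo(buffer, meta)           // encode the Meta tree as ProtoBuf bytes
val restored = ProtoMetaFormat.readFrom(buffer) // decode it back into a Meta wrapper

check(meta == restored)
```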
|
||||
|
||||
## Artifact:
|
||||
|
||||
The Maven coordinates of this project are `space.kscience:dataforge-io-proto:0.10.0`.
|
||||
|
||||
**Gradle Kotlin DSL:**
|
||||
```kotlin
|
||||
repositories {
|
||||
maven("https://repo.kotlin.link")
|
||||
mavenCentral()
|
||||
}
|
||||
|
||||
dependencies {
|
||||
implementation("space.kscience:dataforge-io-proto:0.10.0")
|
||||
}
|
||||
```
|
33
dataforge-io/dataforge-io-proto/build.gradle.kts
Normal file
@ -0,0 +1,33 @@
|
||||
plugins {
|
||||
id("space.kscience.gradle.mpp")
|
||||
id("com.squareup.wire") version "4.9.9"
|
||||
}
|
||||
|
||||
description = "ProtoBuf meta IO"
|
||||
|
||||
kscience {
|
||||
jvm()
|
||||
// js()
|
||||
dependencies {
|
||||
api(projects.dataforgeIo)
|
||||
api("com.squareup.wire:wire-runtime:4.9.9")
|
||||
}
|
||||
useSerialization {
|
||||
protobuf()
|
||||
}
|
||||
}
|
||||
|
||||
wire {
|
||||
kotlin {
|
||||
sourcePath {
|
||||
srcDir("src/commonMain/proto")
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
readme {
|
||||
maturity = space.kscience.gradle.Maturity.PROTOTYPE
|
||||
description = """
|
||||
ProtoBuf Meta representation
|
||||
""".trimIndent()
|
||||
}
|
@ -0,0 +1,32 @@
|
||||
package space.kscience.dataforge.io.proto
|
||||
|
||||
import kotlinx.io.Sink
|
||||
import kotlinx.io.Source
|
||||
import kotlinx.io.readByteArray
|
||||
import okio.ByteString
|
||||
import okio.ByteString.Companion.toByteString
|
||||
import space.kscience.dataforge.io.Envelope
|
||||
import space.kscience.dataforge.io.EnvelopeFormat
|
||||
import space.kscience.dataforge.io.asBinary
|
||||
import space.kscience.dataforge.io.proto.ProtoEnvelope
|
||||
import space.kscience.dataforge.io.toByteArray
|
||||
import space.kscience.dataforge.meta.Meta
|
||||
|
||||
|
||||
public object ProtoEnvelopeFormat : EnvelopeFormat {
|
||||
override fun readFrom(source: Source): Envelope {
|
||||
val protoEnvelope = ProtoEnvelope.ADAPTER.decode(source.readByteArray())
|
||||
return Envelope(
|
||||
meta = protoEnvelope.meta?.let { ProtoMetaWrapper(it) } ?: Meta.EMPTY,
|
||||
data = protoEnvelope.dataBytes.toByteArray().asBinary()
|
||||
)
|
||||
}
|
||||
|
||||
override fun writeTo(sink: Sink, obj: Envelope) {
|
||||
val protoEnvelope = ProtoEnvelope(
|
||||
obj.meta.toProto(),
|
||||
obj.data?.toByteArray()?.toByteString() ?: ByteString.EMPTY
|
||||
)
|
||||
sink.write(ProtoEnvelope.ADAPTER.encode(protoEnvelope))
|
||||
}
|
||||
}
|
@ -0,0 +1,76 @@
|
||||
package space.kscience.dataforge.io.proto
|
||||
|
||||
import kotlinx.io.Sink
|
||||
import kotlinx.io.Source
|
||||
import kotlinx.io.readByteArray
|
||||
import space.kscience.dataforge.io.MetaFormat
|
||||
import space.kscience.dataforge.io.proto.ProtoMeta
|
||||
import space.kscience.dataforge.meta.*
|
||||
import space.kscience.dataforge.meta.descriptors.MetaDescriptor
|
||||
import space.kscience.dataforge.names.NameToken
|
||||
|
||||
internal class ProtoMetaWrapper(private val proto: ProtoMeta) : Meta {
|
||||
|
||||
private fun ProtoMeta.ProtoValue.toValue(): Value? = when {
|
||||
stringValue != null -> stringValue.asValue()
|
||||
booleanValue != null -> booleanValue.asValue()
|
||||
doubleValue != null -> doubleValue.asValue()
|
||||
floatValue != null -> floatValue.asValue()
|
||||
int32Value != null -> int32Value.asValue()
|
||||
int64Value != null -> int64Value.asValue()
|
||||
bytesValue != null -> bytesValue.toByteArray().asValue()
|
||||
listValue != null -> listValue.values.mapNotNull { it.toValue() }.asValue()
|
||||
float64ListValue != null -> float64ListValue.values.map { it.asValue() }.asValue()
|
||||
else -> null
|
||||
}
|
||||
|
||||
override val value: Value?
|
||||
get() = proto.protoValue?.toValue()
|
||||
|
||||
|
||||
override val items: Map<NameToken, Meta>
|
||||
get() = proto.items.entries.associate { NameToken.parse(it.key) to ProtoMetaWrapper(it.value) }
|
||||
|
||||
override fun toString(): String = Meta.toString(this)
|
||||
|
||||
override fun equals(other: Any?): Boolean = Meta.equals(this, other as? Meta)
|
||||
|
||||
override fun hashCode(): Int = Meta.hashCode(this)
|
||||
}
|
||||
|
||||
internal fun Meta.toProto(): ProtoMeta {
|
||||
|
||||
|
||||
fun Value.toProto(): ProtoMeta.ProtoValue = when (type) {
|
||||
ValueType.NULL -> ProtoMeta.ProtoValue()
|
||||
|
||||
ValueType.NUMBER -> when (value) {
|
||||
is Int, is Short, is Byte -> ProtoMeta.ProtoValue(int32Value = int)
|
||||
is Long -> ProtoMeta.ProtoValue(int64Value = long)
|
||||
is Float -> ProtoMeta.ProtoValue(floatValue = float)
|
||||
else -> {
|
||||
// LoggerFactory.getLogger(ProtoMeta::class.java)
|
||||
// .warn("Unknown number type ${value} encoded as Double")
|
||||
ProtoMeta.ProtoValue(doubleValue = double)
|
||||
}
|
||||
}
|
||||
|
||||
ValueType.STRING -> ProtoMeta.ProtoValue(stringValue = string)
|
||||
ValueType.BOOLEAN -> ProtoMeta.ProtoValue(booleanValue = boolean)
|
||||
ValueType.LIST -> ProtoMeta.ProtoValue(listValue = ProtoMeta.ProtoValueList(list.map { it.toProto() }))
|
||||
}
|
||||
|
||||
return ProtoMeta(
|
||||
protoValue = value?.toProto(),
|
||||
items.entries.associate { it.key.toString() to it.value.toProto() }
|
||||
)
|
||||
}
|
||||
|
||||
public object ProtoMetaFormat : MetaFormat {
|
||||
override fun writeMeta(sink: Sink, meta: Meta, descriptor: MetaDescriptor?) {
|
||||
sink.write(ProtoMeta.ADAPTER.encode(meta.toProto()))
|
||||
}
|
||||
|
||||
override fun readMeta(source: Source, descriptor: MetaDescriptor?): Meta =
|
||||
ProtoMetaWrapper(ProtoMeta.ADAPTER.decode(source.readByteArray()))
|
||||
}
|
@ -0,0 +1,35 @@
|
||||
syntax = "proto3";
|
||||
package space.kscience.dataforge.io.proto;
|
||||
|
||||
message ProtoMeta {
|
||||
message ProtoValue {
|
||||
oneof value {
|
||||
string stringValue = 2;
|
||||
bool booleanValue = 3;
|
||||
double doubleValue = 4;
|
||||
float floatValue = 5;
|
||||
int32 int32Value = 6;
|
||||
int64 int64Value = 7;
|
||||
bytes bytesValue = 8;
|
||||
ProtoValueList listValue = 9;
|
||||
Float64List float64ListValue = 10;
|
||||
}
|
||||
}
|
||||
|
||||
message ProtoValueList {
|
||||
repeated ProtoValue values = 1;
|
||||
}
|
||||
|
||||
message Float64List {
|
||||
repeated double values = 1 [packed=true];
|
||||
}
|
||||
|
||||
ProtoValue protoValue = 1;
|
||||
|
||||
map<string, ProtoMeta> items = 2;
|
||||
}
|
||||
|
||||
message ProtoEnvelope {
|
||||
ProtoMeta meta = 1;
|
||||
bytes dataBytes = 2;
|
||||
}
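// Editor's note: with this schema, the Meta { "a" put 22 } tree used in the tests encodes
// as ProtoMeta{ items = { "a": ProtoMeta{ protoValue = { int32Value: 22 } } } }.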
|
@ -0,0 +1,83 @@
|
||||
package space.kscience.dataforge.io.proto
|
||||
|
||||
import kotlinx.io.writeString
|
||||
import space.kscience.dataforge.io.Envelope
|
||||
import space.kscience.dataforge.io.toByteArray
|
||||
import space.kscience.dataforge.meta.Meta
|
||||
import space.kscience.dataforge.meta.asValue
|
||||
import space.kscience.dataforge.meta.get
|
||||
import kotlin.test.Test
|
||||
import kotlin.test.assertContentEquals
|
||||
import kotlin.test.assertEquals
|
||||
|
||||
class ProtoBufTest {
|
||||
|
||||
@Test
|
||||
fun testProtoBufMetaFormat(){
|
||||
val meta = Meta {
|
||||
"a" put 22
|
||||
"node" put {
|
||||
"b" put "DDD"
|
||||
"c" put 11.1
|
||||
"d" put {
|
||||
"d1" put {
|
||||
"d11" put "aaa"
|
||||
"d12" put "bbb"
|
||||
}
|
||||
"d2" put 2
|
||||
}
|
||||
"array" put doubleArrayOf(1.0, 2.0, 3.0)
|
||||
"array2d" put listOf(
|
||||
doubleArrayOf(1.0, 2.0, 3.0).asValue(),
|
||||
doubleArrayOf(1.0, 2.0, 3.0).asValue()
|
||||
).asValue()
|
||||
}
|
||||
}
|
||||
val buffer = kotlinx.io.Buffer()
|
||||
ProtoMetaFormat.writeTo(buffer,meta)
|
||||
val result = ProtoMetaFormat.readFrom(buffer)
|
||||
|
||||
// println(result["a"]?.value)
|
||||
|
||||
meta.items.keys.forEach {
|
||||
assertEquals(meta[it],result[it],"${meta[it]} != ${result[it]}")
|
||||
}
|
||||
|
||||
assertEquals(meta, result)
|
||||
}
|
||||
|
||||
@Test
|
||||
fun testProtoBufEnvelopeFormat(){
|
||||
val envelope = Envelope{
|
||||
meta {
|
||||
"a" put 22
|
||||
"node" put {
|
||||
"b" put "DDD"
|
||||
"c" put 11.1
|
||||
"d" put {
|
||||
"d1" put {
|
||||
"d11" put "aaa"
|
||||
"d12" put "bbb"
|
||||
}
|
||||
"d2" put 2
|
||||
}
|
||||
"array" put doubleArrayOf(1.0, 2.0, 3.0)
|
||||
"array2d" put listOf(
|
||||
doubleArrayOf(1.0, 2.0, 3.0).asValue(),
|
||||
doubleArrayOf(1.0, 2.0, 3.0).asValue()
|
||||
).asValue()
|
||||
}
|
||||
}
|
||||
data {
|
||||
writeString("Hello world!")
|
||||
}
|
||||
}
|
||||
|
||||
val buffer = kotlinx.io.Buffer()
|
||||
ProtoEnvelopeFormat.writeTo(buffer,envelope)
|
||||
val result = ProtoEnvelopeFormat.readFrom(buffer)
|
||||
|
||||
assertEquals(envelope.meta, result.meta)
|
||||
assertContentEquals(envelope.data?.toByteArray(), result.data?.toByteArray())
|
||||
}
|
||||
}
|
@ -0,0 +1,51 @@
|
||||
package space.kscience.dataforge.io.proto
|
||||
|
||||
import kotlinx.io.writeString
|
||||
import space.kscience.dataforge.io.Envelope
|
||||
import space.kscience.dataforge.meta.asValue
|
||||
import kotlin.concurrent.thread
|
||||
import kotlin.time.measureTime
|
||||
|
||||
public fun main() {
|
||||
val envelope = Envelope {
|
||||
meta {
|
||||
"a" put 22
|
||||
"node" put {
|
||||
"b" put "DDD"
|
||||
"c" put 11.1
|
||||
"d" put {
|
||||
"d1" put {
|
||||
"d11" put "aaa"
|
||||
"d12" put "bbb"
|
||||
}
|
||||
"d2" put 2
|
||||
}
|
||||
"array" put doubleArrayOf(1.0, 2.0, 3.0)
|
||||
"array2d" put listOf(
|
||||
doubleArrayOf(1.0, 2.0, 3.0).asValue(),
|
||||
doubleArrayOf(1.0, 2.0, 3.0).asValue()
|
||||
).asValue()
|
||||
}
|
||||
}
|
||||
data {
|
||||
writeString("Hello world!")
|
||||
}
|
||||
}
|
||||
|
||||
val format = ProtoEnvelopeFormat
|
||||
|
||||
measureTime {
|
||||
val threads = List(100) {
|
||||
thread {
|
||||
repeat(100000) {
|
||||
val buffer = kotlinx.io.Buffer()
|
||||
format.writeTo(buffer, envelope)
|
||||
// println(buffer.size)
|
||||
val r = format.readFrom(buffer)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
threads.forEach { it.join() }
|
||||
}.also { println(it) }
|
||||
}
|
@ -6,7 +6,7 @@ YAML meta IO
|
||||
|
||||
## Artifact:
|
||||
|
||||
The Maven coordinates of this project are `space.kscience:dataforge-io-yaml:0.9.0-dev-1`.
|
||||
The Maven coordinates of this project are `space.kscience:dataforge-io-yaml:0.10.0`.
|
||||
|
||||
**Gradle Kotlin DSL:**
|
||||
```kotlin
|
||||
@ -16,6 +16,6 @@ repositories {
|
||||
}
|
||||
|
||||
dependencies {
|
||||
implementation("space.kscience:dataforge-io-yaml:0.9.0-dev-1")
|
||||
implementation("space.kscience:dataforge-io-yaml:0.10.0")
|
||||
}
|
||||
```
|
||||
|
@ -11,14 +11,14 @@ kscience {
|
||||
dependencies {
|
||||
api(projects.dataforgeIo)
|
||||
}
|
||||
useSerialization{
|
||||
useSerialization {
|
||||
yamlKt()
|
||||
}
|
||||
}
|
||||
|
||||
readme{
|
||||
readme {
|
||||
maturity = space.kscience.gradle.Maturity.PROTOTYPE
|
||||
description ="""
|
||||
description = """
|
||||
YAML meta converters and Front Matter envelope format
|
||||
""".trimIndent()
|
||||
}
|
||||
|
@ -2,7 +2,6 @@ package space.kscience.dataforge.io
|
||||
|
||||
import space.kscience.dataforge.meta.Laminate
|
||||
import space.kscience.dataforge.meta.Meta
|
||||
import space.kscience.dataforge.meta.get
|
||||
import space.kscience.dataforge.meta.string
|
||||
import space.kscience.dataforge.names.Name
|
||||
import space.kscience.dataforge.names.asName
|
||||
@ -34,7 +33,9 @@ public interface Envelope {
|
||||
}
|
||||
}
|
||||
|
||||
internal class SimpleEnvelope(override val meta: Meta, override val data: Binary?) : Envelope
|
||||
internal class SimpleEnvelope(override val meta: Meta, override val data: Binary?) : Envelope {
|
||||
override fun toString(): String = "Envelope(meta=$meta, data=$data)"
|
||||
}
|
||||
|
||||
public fun Envelope(meta: Meta, data: Binary?): Envelope = SimpleEnvelope(meta, data)
|
||||
|
||||
|
@ -6,28 +6,11 @@ import space.kscience.dataforge.io.IOFormatFactory.Companion.IO_FORMAT_TYPE
|
||||
import space.kscience.dataforge.io.MetaFormatFactory.Companion.META_FORMAT_TYPE
|
||||
import space.kscience.dataforge.meta.Meta
|
||||
import space.kscience.dataforge.meta.string
|
||||
import space.kscience.dataforge.misc.UnsafeKType
|
||||
import space.kscience.dataforge.names.Name
|
||||
import kotlin.reflect.KType
|
||||
import kotlin.reflect.typeOf
|
||||
|
||||
public class IOPlugin(meta: Meta) : AbstractPlugin(meta) {
|
||||
override val tag: PluginTag get() = Companion.tag
|
||||
|
||||
public val ioFormatFactories: Collection<IOFormatFactory<*>> by lazy {
|
||||
context.gather<IOFormatFactory<*>>(IO_FORMAT_TYPE).values
|
||||
}
|
||||
|
||||
@Suppress("UNCHECKED_CAST")
|
||||
@UnsafeKType
|
||||
public fun <T> resolveIOFormat(type: KType, meta: Meta): IOFormat<T>? =
|
||||
ioFormatFactories.singleOrNull { it.type == type }?.build(context, meta) as? IOFormat<T>
|
||||
|
||||
@OptIn(UnsafeKType::class)
|
||||
public inline fun <reified T> resolveIOFormat(meta: Meta = Meta.EMPTY): IOFormat<T>? =
|
||||
resolveIOFormat(typeOf<T>(), meta)
|
||||
|
||||
|
||||
public val metaFormatFactories: Collection<MetaFormatFactory> by lazy {
|
||||
context.gather<MetaFormatFactory>(META_FORMAT_TYPE).values
|
||||
}
|
||||
|
@ -1,12 +0,0 @@
|
||||
package space.kscience.dataforge.io
|
||||
|
||||
/**
|
||||
* An object that could respond to external messages asynchronously
|
||||
*/
|
||||
public interface Responder {
|
||||
/**
|
||||
* Send a request and wait for response for this specific request
|
||||
*/
|
||||
public suspend fun respond(request: Envelope): Envelope
|
||||
}
|
||||
|
@ -15,8 +15,6 @@ import java.nio.file.Path
|
||||
import java.nio.file.StandardOpenOption
|
||||
import kotlin.io.path.inputStream
|
||||
import kotlin.math.min
|
||||
import kotlin.reflect.full.isSupertypeOf
|
||||
import kotlin.reflect.typeOf
|
||||
import kotlin.streams.asSequence
|
||||
|
||||
|
||||
@ -79,14 +77,6 @@ public fun Path.rewrite(block: Sink.() -> Unit): Unit {
|
||||
|
||||
public fun EnvelopeFormat.readFile(path: Path): Envelope = readFrom(path.asBinary())
|
||||
|
||||
/**
|
||||
* Resolve IOFormat based on type
|
||||
*/
|
||||
@Suppress("UNCHECKED_CAST")
|
||||
public inline fun <reified T : Any> IOPlugin.resolveIOFormat(): IOFormat<T>? =
|
||||
ioFormatFactories.find { it.type.isSupertypeOf(typeOf<T>()) } as IOFormat<T>?
|
||||
|
||||
|
||||
public val IOPlugin.Companion.META_FILE_NAME: String get() = "@meta"
|
||||
public val IOPlugin.Companion.DATA_FILE_NAME: String get() = "@data"
|
||||
|
||||
|
@ -2,11 +2,18 @@
|
||||
|
||||
Meta definition and basic operations on meta
|
||||
|
||||
## Features
|
||||
|
||||
- [Meta](src/commonMain/kotlin/space/kscience/dataforge/meta/Meta.kt) : **Meta** is the representation of the basic DataForge concept: metadata. It can also be seen as a meta-value tree.
|
||||
- [Value](src/commonMain/kotlin/space/kscience/dataforge/meta/Value.kt) : **Value** is a sum type for different meta values.
|
||||
- [Name](src/commonMain/kotlin/space/kscience/dataforge/names/Name.kt) : **Name** is an identifier used to access a tree-like structure.
|
||||
|
||||
|
||||
## Usage
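
A minimal example (editor's sketch) tying the features above together: a `Meta` tree whose nodes are addressed by `Name` and hold `Value`s:

```kotlin
import space.kscience.dataforge.meta.Meta

// "detector.efficiency" is parsed into a two-token Name addressing a nested node.
val meta = Meta {
    "detector.efficiency" put 0.95
    "detector.label" put "forward"
}
```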
|
||||
|
||||
## Artifact:
|
||||
|
||||
The Maven coordinates of this project are `space.kscience:dataforge-meta:0.9.0-dev-1`.
|
||||
The Maven coordinates of this project are `space.kscience:dataforge-meta:0.10.0`.
|
||||
|
||||
**Gradle Kotlin DSL:**
|
||||
```kotlin
|
||||
@ -16,6 +23,6 @@ repositories {
|
||||
}
|
||||
|
||||
dependencies {
|
||||
implementation("space.kscience:dataforge-meta:0.9.0-dev-1")
|
||||
implementation("space.kscience:dataforge-meta:0.10.0")
|
||||
}
|
||||
```
|
||||
|
@ -53,8 +53,10 @@ public final class space/kscience/dataforge/meta/ExoticValuesKt {
|
||||
public static synthetic fun doubleArray$default (Lspace/kscience/dataforge/meta/MetaProvider;[DLspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lkotlin/properties/ReadOnlyProperty;
|
||||
public static synthetic fun doubleArray$default (Lspace/kscience/dataforge/meta/MutableMetaProvider;[DLspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lkotlin/properties/ReadWriteProperty;
|
||||
public static final fun getByteArray (Lspace/kscience/dataforge/meta/Meta;)[B
|
||||
public static final fun getByteArray (Lspace/kscience/dataforge/meta/MetaConverter$Companion;)Lspace/kscience/dataforge/meta/MetaConverter;
|
||||
public static final fun getByteArray (Lspace/kscience/dataforge/meta/Value;)[B
|
||||
public static final fun getDoubleArray (Lspace/kscience/dataforge/meta/Meta;)[D
|
||||
public static final fun getDoubleArray (Lspace/kscience/dataforge/meta/MetaConverter$Companion;)Lspace/kscience/dataforge/meta/MetaConverter;
|
||||
public static final fun getDoubleArray (Lspace/kscience/dataforge/meta/Value;)[D
|
||||
public static final fun lazyParseValue (Ljava/lang/String;)Lspace/kscience/dataforge/meta/LazyParsedValue;
|
||||
}
|
||||
@ -217,6 +219,7 @@ public final class space/kscience/dataforge/meta/MetaConverter$Companion {
|
||||
public final fun getMeta ()Lspace/kscience/dataforge/meta/MetaConverter;
|
||||
public final fun getNumber ()Lspace/kscience/dataforge/meta/MetaConverter;
|
||||
public final fun getString ()Lspace/kscience/dataforge/meta/MetaConverter;
|
||||
public final fun getStringList ()Lspace/kscience/dataforge/meta/MetaConverter;
|
||||
public final fun getValue ()Lspace/kscience/dataforge/meta/MetaConverter;
|
||||
public final fun valueList (Lkotlin/jvm/functions/Function1;Lkotlin/jvm/functions/Function1;)Lspace/kscience/dataforge/meta/MetaConverter;
|
||||
public static synthetic fun valueList$default (Lspace/kscience/dataforge/meta/MetaConverter$Companion;Lkotlin/jvm/functions/Function1;Lkotlin/jvm/functions/Function1;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MetaConverter;
|
||||
@ -248,6 +251,8 @@ public final class space/kscience/dataforge/meta/MetaDelegateKt {
|
||||
public static final fun int (Lspace/kscience/dataforge/meta/MetaProvider;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static synthetic fun int$default (Lspace/kscience/dataforge/meta/MetaProvider;ILspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static synthetic fun int$default (Lspace/kscience/dataforge/meta/MetaProvider;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static final fun listOfReadable (Lspace/kscience/dataforge/meta/Meta;Lspace/kscience/dataforge/meta/MetaReader;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static synthetic fun listOfReadable$default (Lspace/kscience/dataforge/meta/Meta;Lspace/kscience/dataforge/meta/MetaReader;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static final fun listOfSpec (Lspace/kscience/dataforge/meta/Meta;Lspace/kscience/dataforge/meta/MetaReader;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static synthetic fun listOfSpec$default (Lspace/kscience/dataforge/meta/Meta;Lspace/kscience/dataforge/meta/MetaReader;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static final fun long (Lspace/kscience/dataforge/meta/MetaProvider;JLspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
@ -264,6 +269,10 @@ public final class space/kscience/dataforge/meta/MetaDelegateKt {
|
||||
public static synthetic fun number$default (Lspace/kscience/dataforge/meta/MetaProvider;Ljava/lang/Number;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static synthetic fun number$default (Lspace/kscience/dataforge/meta/MetaProvider;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static synthetic fun number$default (Lspace/kscience/dataforge/meta/MetaProvider;Lspace/kscience/dataforge/names/Name;Lkotlin/jvm/functions/Function0;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static final fun readable (Lspace/kscience/dataforge/meta/MetaProvider;Lspace/kscience/dataforge/meta/MetaReader;Ljava/lang/Object;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static final fun readable (Lspace/kscience/dataforge/meta/MetaProvider;Lspace/kscience/dataforge/meta/MetaReader;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static synthetic fun readable$default (Lspace/kscience/dataforge/meta/MetaProvider;Lspace/kscience/dataforge/meta/MetaReader;Ljava/lang/Object;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static synthetic fun readable$default (Lspace/kscience/dataforge/meta/MetaProvider;Lspace/kscience/dataforge/meta/MetaReader;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static final fun spec (Lspace/kscience/dataforge/meta/MetaProvider;Lspace/kscience/dataforge/meta/MetaReader;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static synthetic fun spec$default (Lspace/kscience/dataforge/meta/MetaProvider;Lspace/kscience/dataforge/meta/MetaReader;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
public static final fun string (Lspace/kscience/dataforge/meta/MetaProvider;Ljava/lang/String;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MetaDelegate;
|
||||
@ -322,6 +331,10 @@ public final class space/kscience/dataforge/meta/MetaReaderKt {
|
||||
public static final fun readValue (Lspace/kscience/dataforge/meta/MetaReader;Lspace/kscience/dataforge/meta/Value;)Ljava/lang/Object;
|
||||
}
|
||||
|
||||
public abstract interface class space/kscience/dataforge/meta/MetaRefStore : space/kscience/dataforge/meta/descriptors/Described {
|
||||
public abstract fun getRefs ()Ljava/util/List;
|
||||
}
|
||||
|
||||
public abstract interface class space/kscience/dataforge/meta/MetaRepr {
|
||||
public abstract fun toMeta ()Lspace/kscience/dataforge/meta/Meta;
|
||||
}
|
||||
@ -411,7 +424,9 @@ public final class space/kscience/dataforge/meta/MutableMetaDelegateKt {
|
||||
public static synthetic fun boolean$default (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static synthetic fun boolean$default (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/names/Name;Lkotlin/jvm/functions/Function0;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static synthetic fun boolean$default (Lspace/kscience/dataforge/meta/MutableMetaProvider;ZLspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static final fun convertable (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/meta/MetaConverter;Ljava/lang/Object;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static final fun convertable (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/meta/MetaConverter;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static synthetic fun convertable$default (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/meta/MetaConverter;Ljava/lang/Object;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static synthetic fun convertable$default (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/meta/MetaConverter;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static final fun double (Lspace/kscience/dataforge/meta/MutableMetaProvider;DLspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static final fun double (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
@ -433,9 +448,7 @@ public final class space/kscience/dataforge/meta/MutableMetaDelegateKt {
|
||||
public static final fun long (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static synthetic fun long$default (Lspace/kscience/dataforge/meta/MutableMetaProvider;JLspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static synthetic fun long$default (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static final fun node (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/names/Name;Lspace/kscience/dataforge/meta/MetaConverter;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static final fun node (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/names/Name;Lspace/kscience/dataforge/meta/descriptors/MetaDescriptor;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static synthetic fun node$default (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/names/Name;Lspace/kscience/dataforge/meta/MetaConverter;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static synthetic fun node$default (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/names/Name;Lspace/kscience/dataforge/meta/descriptors/MetaDescriptor;ILjava/lang/Object;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static final fun number (Lspace/kscience/dataforge/meta/MutableMetaProvider;Ljava/lang/Number;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
public static final fun number (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableMetaDelegate;
|
||||
@ -507,8 +520,16 @@ public final class space/kscience/dataforge/meta/MutableMetaSerializer : kotlinx
|
||||
public fun serialize (Lkotlinx/serialization/encoding/Encoder;Lspace/kscience/dataforge/meta/MutableMeta;)V
|
||||
}
|
||||
|
||||
public final class space/kscience/dataforge/meta/MutableMetaViewKt {
|
||||
public static final fun view (Lspace/kscience/dataforge/meta/MutableMeta;Ljava/lang/String;)Lspace/kscience/dataforge/meta/MutableMeta;
|
||||
public static final fun view (Lspace/kscience/dataforge/meta/MutableMeta;Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableMeta;
|
||||
}
|
||||
|
||||
public abstract interface class space/kscience/dataforge/meta/MutableTypedMeta : space/kscience/dataforge/meta/MutableMeta, space/kscience/dataforge/meta/TypedMeta {
|
||||
public abstract fun get (Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableTypedMeta;
|
||||
public synthetic fun get (Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/Meta;
|
||||
public synthetic fun get (Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableMeta;
|
||||
public fun get (Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableTypedMeta;
|
||||
public synthetic fun get (Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/TypedMeta;
|
||||
public abstract fun getOrCreate (Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableTypedMeta;
|
||||
}
|
||||
|
||||
@ -546,16 +567,24 @@ public final class space/kscience/dataforge/meta/ObservableMetaWrapperKt {
|
||||
}
|
||||
|
||||
public abstract interface class space/kscience/dataforge/meta/ObservableMutableMeta : space/kscience/dataforge/meta/MutableMeta, space/kscience/dataforge/meta/MutableTypedMeta, space/kscience/dataforge/meta/ObservableMeta {
|
||||
public synthetic fun get (Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/Meta;
|
||||
public synthetic fun get (Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableMeta;
|
||||
public synthetic fun get (Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/MutableTypedMeta;
|
||||
public fun get (Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/ObservableMutableMeta;
|
||||
public synthetic fun get (Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/TypedMeta;
|
||||
public abstract fun getOrCreate (Lspace/kscience/dataforge/names/Name;)Lspace/kscience/dataforge/meta/ObservableMutableMeta;
|
||||
public static final field Companion Lspace/kscience/dataforge/meta/ObservableMutableMeta$Companion;
|
||||
public fun getSelf ()Lspace/kscience/dataforge/meta/ObservableMutableMeta;
|
||||
public synthetic fun getSelf ()Lspace/kscience/dataforge/meta/TypedMeta;
|
||||
}
|
||||
|
||||
public final class space/kscience/dataforge/meta/ObservableMutableMeta$Companion {
|
||||
public final fun serializer ()Lkotlinx/serialization/KSerializer;
|
||||
}
|
||||
|
||||
public final class space/kscience/dataforge/meta/ObservableMutableMetaSerializer : kotlinx/serialization/KSerializer {
|
||||
public static final field INSTANCE Lspace/kscience/dataforge/meta/ObservableMutableMetaSerializer;
|
||||
public synthetic fun deserialize (Lkotlinx/serialization/encoding/Decoder;)Ljava/lang/Object;
|
||||
public fun deserialize (Lkotlinx/serialization/encoding/Decoder;)Lspace/kscience/dataforge/meta/ObservableMutableMeta;
|
||||
public fun getDescriptor ()Lkotlinx/serialization/descriptors/SerialDescriptor;
|
||||
public synthetic fun serialize (Lkotlinx/serialization/encoding/Encoder;Ljava/lang/Object;)V
|
||||
public fun serialize (Lkotlinx/serialization/encoding/Encoder;Lspace/kscience/dataforge/meta/ObservableMutableMeta;)V
|
||||
}
|
||||
|
||||
public final class space/kscience/dataforge/meta/RegexItemTransformationRule : space/kscience/dataforge/meta/TransformationRule {
|
||||
public fun <init> (Lkotlin/text/Regex;Lkotlin/jvm/functions/Function4;)V
|
||||
public final fun component1 ()Lkotlin/text/Regex;
|
||||
@ -596,9 +625,9 @@ public final class space/kscience/dataforge/meta/SchemeKt {
|
||||
public static final fun listOfScheme (Lspace/kscience/dataforge/meta/Scheme;Lspace/kscience/dataforge/meta/SchemeSpec;Lspace/kscience/dataforge/names/Name;)Lkotlin/properties/ReadWriteProperty;
|
||||
public static synthetic fun listOfScheme$default (Lspace/kscience/dataforge/meta/MutableMeta;Lspace/kscience/dataforge/meta/SchemeSpec;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lkotlin/properties/ReadWriteProperty;
|
||||
public static synthetic fun listOfScheme$default (Lspace/kscience/dataforge/meta/Scheme;Lspace/kscience/dataforge/meta/SchemeSpec;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lkotlin/properties/ReadWriteProperty;
|
||||
public static final fun scheme (Lspace/kscience/dataforge/meta/MutableMeta;Lspace/kscience/dataforge/meta/SchemeSpec;Lspace/kscience/dataforge/names/Name;)Lkotlin/properties/ReadWriteProperty;
|
||||
public static final fun scheme (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/meta/SchemeSpec;Lspace/kscience/dataforge/names/Name;)Lkotlin/properties/ReadWriteProperty;
|
||||
public static final fun scheme (Lspace/kscience/dataforge/meta/Scheme;Lspace/kscience/dataforge/meta/SchemeSpec;Lspace/kscience/dataforge/names/Name;)Lkotlin/properties/ReadWriteProperty;
|
||||
public static synthetic fun scheme$default (Lspace/kscience/dataforge/meta/MutableMeta;Lspace/kscience/dataforge/meta/SchemeSpec;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lkotlin/properties/ReadWriteProperty;
|
||||
public static synthetic fun scheme$default (Lspace/kscience/dataforge/meta/MutableMetaProvider;Lspace/kscience/dataforge/meta/SchemeSpec;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lkotlin/properties/ReadWriteProperty;
|
||||
public static synthetic fun scheme$default (Lspace/kscience/dataforge/meta/Scheme;Lspace/kscience/dataforge/meta/SchemeSpec;Lspace/kscience/dataforge/names/Name;ILjava/lang/Object;)Lkotlin/properties/ReadWriteProperty;
|
||||
public static final fun schemeOrNull (Lspace/kscience/dataforge/meta/MutableMeta;Lspace/kscience/dataforge/meta/SchemeSpec;Lspace/kscience/dataforge/names/Name;)Lkotlin/properties/ReadWriteProperty;
|
||||
public static final fun schemeOrNull (Lspace/kscience/dataforge/meta/Scheme;Lspace/kscience/dataforge/meta/SchemeSpec;Lspace/kscience/dataforge/names/Name;)Lkotlin/properties/ReadWriteProperty;
|
||||
|
@ -7,13 +7,57 @@ kscience {
|
||||
js()
|
||||
native()
|
||||
wasm()
|
||||
useSerialization{
|
||||
useSerialization {
|
||||
json()
|
||||
}
|
||||
}
|
||||
|
||||
description = "Meta definition and basic operations on meta"
|
||||
|
||||
readme{
|
||||
readme {
|
||||
maturity = space.kscience.gradle.Maturity.DEVELOPMENT
|
||||
|
||||
description = """
|
||||
Core Meta and Name manipulation module
|
||||
""".trimIndent()
|
||||
|
||||
feature(
|
||||
"meta",
|
||||
ref = "src/commonMain/kotlin/space/kscience/dataforge/meta/Meta.kt",
|
||||
name = "Meta"
|
||||
) {
|
||||
"""
|
||||
**Meta** is the representation of the basic DataForge concept: metadata. It can also be seen as a meta-value tree.
|
||||
|
||||
Each Meta node can have its own Value as well as a map of named child items.
|
||||
|
||||
""".trimIndent()
|
||||
}
|
||||
|
||||
feature(
|
||||
"value",
|
||||
ref = "src/commonMain/kotlin/space/kscience/dataforge/meta/Value.kt",
|
||||
name = "Value"
|
||||
) {
|
||||
"""
|
||||
**Value** is a sum type for different meta values.
|
||||
|
||||
The following types are implemented in core (custom ones are also available):
|
||||
* null
|
||||
* boolean
|
||||
* number
|
||||
* string
|
||||
* list of values
|
||||
""".trimIndent()
|
||||
}
|
||||
|
||||
feature(
|
||||
"name",
|
||||
ref = "src/commonMain/kotlin/space/kscience/dataforge/names/Name.kt",
|
||||
name = "Name"
|
||||
) {
|
||||
"""
|
||||
**Name** is an identifier used to access a tree-like structure.
|
||||
""".trimIndent()
|
||||
}
|
||||
}
|
@ -34,9 +34,9 @@ private fun Meta.toJsonWithIndex(descriptor: MetaDescriptor?, index: String?): J
|
||||
val childDescriptor = descriptor?.nodes?.get(body)
|
||||
if (list.size == 1) {
|
||||
val (token, element) = list.first()
|
||||
//do not add an empty element
|
||||
val child: JsonElement = element.toJsonWithIndex(childDescriptor, token.index)
|
||||
if(token.index == null) {
|
||||
//do not add an empty element
|
||||
val child: JsonElement = element.toJsonWithIndex(childDescriptor, token.index)
|
||||
if (token.index == null) {
|
||||
body to child
|
||||
} else {
|
||||
body to JsonArray(listOf(child))
|
||||
@ -106,7 +106,7 @@ private fun JsonElement.toValueOrNull(descriptor: MetaDescriptor?): Value? = whe
|
||||
private fun MutableMap<NameToken, SealedMeta>.addJsonElement(
|
||||
key: String,
|
||||
element: JsonElement,
|
||||
descriptor: MetaDescriptor?
|
||||
descriptor: MetaDescriptor?,
|
||||
) {
|
||||
when (element) {
|
||||
is JsonPrimitive -> put(NameToken(key), Meta(element.toValue(descriptor)))
|
||||
@ -117,8 +117,11 @@ private fun MutableMap<NameToken, SealedMeta>.addJsonElement(
|
||||
} else {
|
||||
val indexKey = descriptor?.indexKey ?: Meta.INDEX_KEY
|
||||
element.forEachIndexed { serial, childElement ->
|
||||
val index = (childElement as? JsonObject)?.get(indexKey)?.jsonPrimitive?.content
|
||||
|
||||
val index = (childElement as? JsonObject)
|
||||
?.get(indexKey)?.jsonPrimitive?.content
|
||||
?: serial.toString()
|
||||
|
||||
val child: SealedMeta = when (childElement) {
|
||||
is JsonObject -> childElement.toMeta(descriptor)
|
||||
is JsonArray -> {
|
||||
@ -133,12 +136,14 @@ private fun MutableMap<NameToken, SealedMeta>.addJsonElement(
|
||||
Meta(childValue)
|
||||
}
|
||||
}
|
||||
|
||||
is JsonPrimitive -> Meta(childElement.toValue(null))
|
||||
}
|
||||
put(NameToken(key, index), child)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
is JsonObject -> {
|
||||
val indexKey = descriptor?.indexKey ?: Meta.INDEX_KEY
|
||||
val index = element[indexKey]?.jsonPrimitive?.content
|
||||
@ -160,11 +165,15 @@ public fun JsonObject.toMeta(descriptor: MetaDescriptor? = null): SealedMeta {
|
||||
public fun JsonElement.toMeta(descriptor: MetaDescriptor? = null): SealedMeta = when (this) {
|
||||
is JsonPrimitive -> Meta(toValue(descriptor))
|
||||
is JsonObject -> toMeta(descriptor)
|
||||
is JsonArray -> SealedMeta(null,
|
||||
linkedMapOf<NameToken, SealedMeta>().apply {
|
||||
addJsonElement(Meta.JSON_ARRAY_KEY, this@toMeta, null)
|
||||
}
|
||||
)
|
||||
is JsonArray -> if (all { it is JsonPrimitive }) {
|
||||
Meta(map { it.toValueOrNull(descriptor) ?: error("Unreachable: should not contain objects") }.asValue())
|
||||
} else {
|
||||
SealedMeta(null,
|
||||
linkedMapOf<NameToken, SealedMeta>().apply {
|
||||
addJsonElement(Meta.JSON_ARRAY_KEY, this@toMeta, null)
|
||||
}
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
//
|
||||
|
@ -11,7 +11,7 @@ import space.kscience.dataforge.misc.DFExperimental
/**
 * A converter of generic object to and from [Meta]
 */
-public interface MetaConverter<T>: MetaReader<T> {
+public interface MetaConverter<T> : MetaReader<T> {

    /**
     * A descriptor for resulting meta

@ -116,6 +116,12 @@ public interface MetaConverter<T>: MetaReader<T> {
        override fun convert(obj: E): Meta = Meta(obj.asValue())
    }

+   public val stringList: MetaConverter<List<String>> = object : MetaConverter<List<String>> {
+       override fun convert(obj: List<String>): Meta = Meta(obj.map { it.asValue() }.asValue())
+
+       override fun readOrNull(source: Meta): List<String>? = source.stringList
+   }
+
    public fun <T> valueList(
        writer: (T) -> Value = { Value.of(it) },
        reader: (Value) -> T,
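A short sketch of the added converter in use; `read` comes from the `MetaReader` super-interface:

```kotlin
import space.kscience.dataforge.meta.*

fun stringListRoundTrip() {
    val meta: Meta = MetaConverter.stringList.convert(listOf("a", "b", "c"))
    val restored: List<String> = MetaConverter.stringList.read(meta)
    check(restored == listOf("a", "b", "c"))
}
```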
@ -24,20 +24,45 @@ public fun MetaProvider.node(
    }
}

/**
- * Use [metaReader] to read the Meta node
+ * Use [reader] to read the Meta node
 */
-public fun <T> MetaProvider.spec(
-   metaReader: MetaReader<T>,
+public fun <T> MetaProvider.readable(
+   reader: MetaReader<T>,
    key: Name? = null,
): MetaDelegate<T?> = object : MetaDelegate<T?> {
-   override val descriptor: MetaDescriptor? get() = metaReader.descriptor
+   override val descriptor: MetaDescriptor? get() = reader.descriptor

    override fun getValue(thisRef: Any?, property: KProperty<*>): T? {
-       return get(key ?: property.name.asName())?.let { metaReader.read(it) }
+       return get(key ?: property.name.asName())?.let { reader.read(it) }
    }
}

+/**
+ * Use [reader] to read the Meta node or return [default] if the node does not exist
+ */
+public fun <T> MetaProvider.readable(
+   reader: MetaReader<T>,
+   default: T,
+   key: Name? = null,
+): MetaDelegate<T> = object : MetaDelegate<T> {
+   override val descriptor: MetaDescriptor? get() = reader.descriptor
+
+   override fun getValue(thisRef: Any?, property: KProperty<*>): T {
+       return get(key ?: property.name.asName())?.let { reader.read(it) } ?: default
+   }
+}
+
+/**
+ * Use [reader] to read the Meta node
+ */
+@Deprecated("Replace with readable", ReplaceWith("readable(metaReader, key)"))
+public fun <T> MetaProvider.spec(
+   reader: MetaReader<T>,
+   key: Name? = null,
+): MetaDelegate<T?> = readable(reader, key)
+
/**
 * Use object serializer to transform it to Meta and back
 */

@ -45,34 +70,51 @@ public fun <T> MetaProvider.spec(
public inline fun <reified T> MetaProvider.serializable(
    key: Name? = null,
    descriptor: MetaDescriptor? = null,
-): MetaDelegate<T?> = spec(MetaConverter.serializable(descriptor), key)
+): MetaDelegate<T?> = readable(MetaConverter.serializable(descriptor), key)
+
+@DFExperimental
+public inline fun <reified T> MetaProvider.serializable(
+   key: Name? = null,
+   default: T,
+   descriptor: MetaDescriptor? = null,
+): MetaDelegate<T> = readable(MetaConverter.serializable(descriptor), default, key)

@Deprecated("Use convertable", ReplaceWith("convertable(converter, key)"))
public fun <T> MetaProvider.node(
    key: Name? = null,
    converter: MetaReader<T>,
-): ReadOnlyProperty<Any?, T?> = spec(converter, key)
+): ReadOnlyProperty<Any?, T?> = readable(converter, key)

/**
- * Use [converter] to convert a list of same name siblings meta to object
+ * Use [reader] to convert a list of same name siblings meta to object
 */
-public fun <T> Meta.listOfSpec(
-   converter: MetaReader<T>,
+public fun <T> Meta.listOfReadable(
+   reader: MetaReader<T>,
    key: Name? = null,
): MetaDelegate<List<T>> = object : MetaDelegate<List<T>> {
    override fun getValue(thisRef: Any?, property: KProperty<*>): List<T> {
        val name = key ?: property.name.asName()
-       return getIndexed(name).values.map { converter.read(it) }
+       return getIndexed(name).values.map { reader.read(it) }
    }

-   override val descriptor: MetaDescriptor? = converter.descriptor?.copy(multiple = true)
+   override val descriptor: MetaDescriptor? = reader.descriptor?.copy(multiple = true)
}

+/**
+ * Use [converter] to convert a list of same name siblings meta to object
+ */
+@Deprecated("Replace with readingList", ReplaceWith("readingList(converter, key)"))
+public fun <T> Meta.listOfSpec(
+   converter: MetaReader<T>,
+   key: Name? = null,
+): MetaDelegate<List<T>> = listOfReadable(converter, key)
+
@DFExperimental
public inline fun <reified T> Meta.listOfSerializable(
    key: Name? = null,
    descriptor: MetaDescriptor? = null,
-): MetaDelegate<List<T>> = listOfSpec(MetaConverter.serializable(descriptor), key)
+): MetaDelegate<List<T>> = listOfReadable(MetaConverter.serializable(descriptor), key)

/**
 * A property delegate that uses custom key
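How the new default-aware delegate reads in practice — a minimal sketch, assuming `Meta` implements `MetaProvider` and using `MetaConverter.int`, which is referenced elsewhere in this commit:

```kotlin
import space.kscience.dataforge.meta.*

class Settings(provider: MetaProvider) {
    // resolves the "timeout" child if present, otherwise falls back to 30
    val timeout: Int by provider.readable(MetaConverter.int, default = 30)
}
```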
@ -1,17 +1,19 @@
package space.kscience.dataforge.meta

+import kotlinx.serialization.json.Json
import space.kscience.dataforge.meta.descriptors.Described
import space.kscience.dataforge.meta.descriptors.MetaDescriptor
import space.kscience.dataforge.meta.descriptors.MetaDescriptorBuilder
import space.kscience.dataforge.misc.DFExperimental
import space.kscience.dataforge.names.Name
import space.kscience.dataforge.names.asName
+import space.kscience.dataforge.names.startsWith
import kotlin.properties.PropertyDelegateProvider
import kotlin.properties.ReadOnlyProperty

/**
- * A reference to a read-only value of type [T] inside [MetaProvider]
+ * A reference to a read-only value of type [T] inside [MetaProvider] or a writable value in [MutableMetaProvider]
 */
@DFExperimental
public data class MetaRef<T>(

@ -20,28 +22,77 @@ public data class MetaRef<T>(
    override val descriptor: MetaDescriptor? = converter.descriptor,
) : Described

/**
 * Get a value from provider by [ref] or return null if node with given name is missing
 */
@DFExperimental
public operator fun <T> MetaProvider.get(ref: MetaRef<T>): T? = get(ref.name)?.let { ref.converter.readOrNull(it) }

/**
 * Set a value in a mutable provider by [ref]
 */
@DFExperimental
public operator fun <T> MutableMetaProvider.set(ref: MetaRef<T>, value: T) {
    set(ref.name, ref.converter.convert(value))
}

-@DFExperimental
-public class MetaSpec(
-   private val configuration: MetaDescriptorBuilder.() -> Unit = {},
-) : Described {
-   private val refs: MutableList<MetaRef<*>> = mutableListOf()
-
-   private fun registerRef(ref: MetaRef<*>) {
-       refs.add(ref)
-   }
+/**
+ * Observe changes to a specific property via the given [ref].
+ *
+ * This listener should be removed in the same way as [ObservableMeta.onChange].
+ *
+ * @param callback an action to be performed on each change of the item. Null means that the item is not present or malformed.
+ */
+@DFExperimental
+public fun <T : Any> ObservableMeta.onValueChange(owner: Any?, ref: MetaRef<T>, callback: (T?) -> Unit) {
+   onChange(owner) { name ->
+       if (name.startsWith(ref.name)) {
+           get(name)?.let { value ->
+               callback(ref.converter.readOrNull(value))
+           }
+       }
+   }
+}
+
+/**
+ * Remove a node corresponding to [ref] from a mutable provider if it exists
+ */
+@DFExperimental
+public fun MutableMetaProvider.remove(ref: MetaRef<*>) {
+   remove(ref.name)
+}
+
+/**
+ * Base storage of [MetaRef]
+ */
+@OptIn(DFExperimental::class)
+public interface MetaRefStore : Described {
+   public val refs: List<MetaRef<*>>
+}
+
+/**
+ * A base class for [Meta] specification that stores references to meta nodes.
+ */
+@DFExperimental
+public abstract class MetaSpec : MetaRefStore {
+   private val _refs: MutableList<MetaRef<*>> = mutableListOf()
+   override val refs: List<MetaRef<*>> get() = _refs
+
+   /**
+    * Register a ref in this specification
+    */
+   protected fun registerRef(ref: MetaRef<*>) {
+       _refs.add(ref)
+   }

    /**
     * Create and register a ref by property name and provided converter.
     * By default, uses descriptor from the converter
     */
    public fun <T> item(
        converter: MetaConverter<T>,
-       descriptor: MetaDescriptor? = converter.descriptor,
        key: Name? = null,
+       descriptor: MetaDescriptor? = converter.descriptor,
    ): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<T>>> =
        PropertyDelegateProvider { _, property ->
            val ref = MetaRef(key ?: property.name.asName(), converter, descriptor)

@ -51,6 +102,11 @@ public class MetaSpec(
        }
    }

+   /**
+    * Override to provide a custom [MetaDescriptor]
+    */
+   protected open fun MetaDescriptorBuilder.buildDescriptor(): Unit = Unit
+
    override val descriptor: MetaDescriptor by lazy {
        MetaDescriptor {
            refs.forEach { ref ->

@ -58,7 +114,108 @@ public class MetaSpec(
                node(ref.name, ref.descriptor)
            }
-           configuration()
+           buildDescriptor()
        }
    }
}

+/**
+ * Register an item using a [descriptorBuilder] to customize the descriptor
+ */
+@DFExperimental
+public fun <T> MetaSpec.item(
+   converter: MetaConverter<T>,
+   key: Name? = null,
+   descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
+): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<T>>> = item(converter, key, MetaDescriptor {
+   converter.descriptor?.let { from(it) }
+   descriptorBuilder()
+})
+
+//utility methods to add different nodes
+
+@DFExperimental
+public fun MetaSpec.metaItem(
+   key: Name? = null,
+   descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
+): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<Meta>>> =
+   item(MetaConverter.meta, key, descriptorBuilder)
+
+@DFExperimental
+public fun MetaSpec.string(
+   key: Name? = null,
+   descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
+): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<String>>> =
+   item(MetaConverter.string, key, descriptorBuilder)
+
+@DFExperimental
+public fun MetaSpec.boolean(
+   key: Name? = null,
+   descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
+): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<Boolean>>> =
+   item(MetaConverter.boolean, key, descriptorBuilder)
+
+@DFExperimental
+public fun MetaSpec.stringList(
+   key: Name? = null,
+   descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
+): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<List<String>>>> =
+   item(MetaConverter.stringList, key, descriptorBuilder)
+
+@DFExperimental
+public fun MetaSpec.float(
+   key: Name? = null,
+   descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
+): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<Float>>> =
+   item(MetaConverter.float, key, descriptorBuilder)
+
+@DFExperimental
+public fun MetaSpec.double(
+   key: Name? = null,
+   descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
+): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<Double>>> =
+   item(MetaConverter.double, key, descriptorBuilder)
+
+@DFExperimental
+public fun MetaSpec.int(
+   key: Name? = null,
+   descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
+): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<Int>>> =
+   item(MetaConverter.int, key, descriptorBuilder)
+
+@DFExperimental
+public fun MetaSpec.long(
+   key: Name? = null,
+   descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
+): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<Long>>> =
+   item(MetaConverter.long, key, descriptorBuilder)
+
+@DFExperimental
+public fun MetaSpec.doubleArray(
+   key: Name? = null,
+   descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
+): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<DoubleArray>>> =
+   item(MetaConverter.doubleArray, key, descriptorBuilder)
+
+@DFExperimental
+public fun MetaSpec.byteArray(
+   key: Name? = null,
+   descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
+): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<ByteArray>>> =
+   item(MetaConverter.byteArray, key, descriptorBuilder)
+
+@DFExperimental
+public inline fun <reified E : Enum<E>> MetaSpec.enum(
+   key: Name? = null,
+   noinline descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
+): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<E>>> =
+   item(MetaConverter.enum(), key, descriptorBuilder)
+
+@DFExperimental
+public inline fun <reified T> MetaSpec.serializable(
+   key: Name? = null,
+   jsonEncoder: Json = Json,
+   noinline descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
+): PropertyDelegateProvider<MetaSpec, ReadOnlyProperty<MetaSpec, MetaRef<T>>> =
+   item(MetaConverter.serializable(jsonEncoder = jsonEncoder), key, descriptorBuilder)
@ -45,4 +45,21 @@ public object MutableMetaSerializer : KSerializer<MutableMeta> {
    override fun serialize(encoder: Encoder, value: MutableMeta) {
        encoder.encodeSerializableValue(MetaSerializer, value)
    }
}

+/**
+ * A serializer for [ObservableMutableMeta]
+ */
+public object ObservableMutableMetaSerializer : KSerializer<ObservableMutableMeta> {
+
+   override val descriptor: SerialDescriptor = MetaSerializer.descriptor
+
+   override fun deserialize(decoder: Decoder): ObservableMutableMeta {
+       val meta = decoder.decodeSerializableValue(MetaSerializer)
+       return ((meta as? MutableMeta) ?: meta.toMutableMeta()).asObservable()
+   }
+
+   override fun serialize(encoder: Encoder, value: ObservableMutableMeta) {
+       encoder.encodeSerializableValue(MetaSerializer, value)
+   }
+}
@ -159,7 +159,17 @@ public interface MutableTypedMeta<M : MutableTypedMeta<M>> : TypedMeta<M>, Mutab
     */
    @DFExperimental
    public fun attach(name: Name, node: M)
-   override fun get(name: Name): M?
+
+   override fun get(name: Name): M? {
+       tailrec fun M.find(name: Name): M? = if (name.isEmpty()) {
+           self
+       } else {
+           items[name.firstOrNull()!!]?.find(name.cutFirst())
+       }
+
+       return self.find(name)
+   }

    override fun getOrCreate(name: Name): M
}

@ -388,7 +398,7 @@ public fun MutableMeta.reset(newMeta: Meta) {
    (items.keys - newMeta.items.keys).forEach {
        remove(it.asName())
    }
-   newMeta.items.forEach { (token, item)->
+   newMeta.items.forEach { (token, item) ->
        set(token, item)
    }
}
@ -54,9 +54,25 @@ public fun <T> MutableMetaProvider.convertable(
    }
}

@Deprecated("Use convertable", ReplaceWith("convertable(converter, key)"))
public fun <T> MutableMetaProvider.node(key: Name? = null, converter: MetaConverter<T>): MutableMetaDelegate<T?> =
    convertable(converter, key)

+public fun <T> MutableMetaProvider.convertable(
+   converter: MetaConverter<T>,
+   default: T,
+   key: Name? = null,
+): MutableMetaDelegate<T> = object : MutableMetaDelegate<T> {
+
+   override val descriptor: MetaDescriptor? get() = converter.descriptor
+
+   override fun getValue(thisRef: Any?, property: KProperty<*>): T {
+       val name = key ?: property.name.asName()
+       return get(name)?.let { converter.read(it) } ?: default
+   }
+
+   override fun setValue(thisRef: Any?, property: KProperty<*>, value: T) {
+       val name = key ?: property.name.asName()
+       set(name, value?.let { converter.convert(it) })
+   }
+}

/**
 * Use object serializer to transform it to Meta and back.

@ -66,7 +82,14 @@ public fun <T> MutableMetaProvider.node(key: Name? = null, converter: MetaConver
public inline fun <reified T> MutableMetaProvider.serializable(
    descriptor: MetaDescriptor? = null,
    key: Name? = null,
-): MutableMetaDelegate<T?> = convertable(MetaConverter.serializable(descriptor), key)
+): MutableMetaDelegate<T?> = convertable<T>(MetaConverter.serializable(descriptor), key)
+
+@DFExperimental
+public inline fun <reified T> MutableMetaProvider.serializable(
+   descriptor: MetaDescriptor? = null,
+   default: T,
+   key: Name? = null,
+): MutableMetaDelegate<T> = convertable(MetaConverter.serializable(descriptor), default, key)

/**
 * Use [converter] to convert a list of same name siblings meta to object and back.
dataforge-meta/src/commonMain/kotlin/space/kscience/dataforge/meta/MutableMetaView.kt (new file, 47 lines)

@ -0,0 +1,47 @@
package space.kscience.dataforge.meta

import space.kscience.dataforge.names.Name
import space.kscience.dataforge.names.NameToken
import space.kscience.dataforge.names.parseAsName
import space.kscience.dataforge.names.plus

/**
 * A [Meta] child proxy that creates required nodes on value write
 */
private class MutableMetaView(
    val origin: MutableMeta,
    val path: Name
) : MutableMeta {

    override val items: Map<NameToken, MutableMeta>
        get() = origin[path]?.items ?: emptyMap()

    override var value: Value?
        get() = origin[path]?.value
        set(value) {
            origin[path] = value
        }

    override fun getOrCreate(name: Name): MutableMeta = MutableMetaView(origin, path + name)

    override fun set(name: Name, node: Meta?) {
        if (origin[path + name] == null && node?.isEmpty() == true) return
        origin[path + name] = node
    }

    override fun equals(other: Any?): Boolean = Meta.equals(this, other as? Meta)

    override fun hashCode(): Int = Meta.hashCode(this)

    override fun toString(): String = Meta.toString(this)
}

/**
 * Create a view of this [MutableMeta] node that creates child items only when their values are written.
 *
 * The difference between this method and regular [getOrCreate] is that [getOrCreate] always creates and attaches a node
 * even if it is empty.
 */
public fun MutableMeta.view(name: Name): MutableMeta = MutableMetaView(this, name)

public fun MutableMeta.view(name: String): MutableMeta = view(name.parseAsName())
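In short, writes of empty nodes are no-ops and real writes materialize the path on demand; the new `MutableMetaViewTest` later in this diff exercises the same contract:

```kotlin
import space.kscience.dataforge.meta.*
import space.kscience.dataforge.names.asName

fun viewDemo() {
    val meta = MutableMeta()
    val view = meta.view("a".asName())

    view["b"] = Meta.EMPTY    // writing an empty node attaches nothing to the origin
    view.value = 22.asValue() // writing a value creates "a" in the origin on demand
}
```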
@ -1,10 +1,8 @@
package space.kscience.dataforge.meta

import kotlinx.serialization.Serializable
import space.kscience.dataforge.misc.ThreadSafe
import space.kscience.dataforge.names.Name
-import space.kscience.dataforge.names.cutFirst
-import space.kscience.dataforge.names.firstOrNull
-import space.kscience.dataforge.names.isEmpty

internal data class MetaListener(

@ -38,21 +36,10 @@ public interface ObservableMeta : Meta {
/**
 * A [Meta] which is both observable and mutable
 */
+@Serializable(ObservableMutableMetaSerializer::class)
@MetaBuilderMarker
public interface ObservableMutableMeta : ObservableMeta, MutableMeta, MutableTypedMeta<ObservableMutableMeta> {

    override val self: ObservableMutableMeta get() = this

    override fun getOrCreate(name: Name): ObservableMutableMeta
-
-   override fun get(name: Name): ObservableMutableMeta? {
-       tailrec fun ObservableMutableMeta.find(name: Name): ObservableMutableMeta? = if (name.isEmpty()) {
-           this
-       } else {
-           items[name.firstOrNull()!!]?.find(name.cutFirst())
-       }
-
-       return find(name)
-   }
}

internal abstract class AbstractObservableMeta : ObservableMeta {
@ -59,7 +59,7 @@ private class ObservableMetaWrapper(

    fun removeNode(name: Name): Meta? {
        val oldMeta = get(name)
-       //don't forget to remove listener
+       //remember to remove listener
        oldMeta?.removeListener(this)

        return oldMeta
@ -221,13 +221,14 @@ public fun <T : Scheme> Configurable.updateWith(
/**
 * A delegate that uses a [MetaReader] to wrap a child of this provider
 */
-public fun <T : Scheme> MutableMeta.scheme(
+public fun <T : Scheme> MutableMetaProvider.scheme(
    spec: SchemeSpec<T>,
    key: Name? = null,
): ReadWriteProperty<Any?, T> = object : ReadWriteProperty<Any?, T> {
    override fun getValue(thisRef: Any?, property: KProperty<*>): T {
        val name = key ?: property.name.asName()
-       return spec.write(getOrCreate(name))
+       val node = get(name) ?: MutableMeta().also { set(name, it) }
+       return spec.write(node)
    }

    override fun setValue(thisRef: Any?, property: KProperty<*>, value: T) {
@ -8,6 +8,9 @@ import kotlinx.serialization.descriptors.element
import kotlinx.serialization.encoding.Decoder
import kotlinx.serialization.encoding.Encoder

+/**
+ * A serializer for [Value]
+ */
public object ValueSerializer : KSerializer<Value> {
    private val listSerializer by lazy { ListSerializer(ValueSerializer) }
|
||||
override fun hashCode(): Int = string.hashCode()
|
||||
}
|
||||
|
||||
/**
|
||||
* Read this string as lazily parsed value
|
||||
*/
|
||||
public fun String.lazyParseValue(): LazyParsedValue = LazyParsedValue(this)
|
||||
|
||||
/**
|
||||
@ -47,6 +50,10 @@ public class DoubleArrayValue(override val value: DoubleArray) : Value, Iterable
|
||||
override fun iterator(): Iterator<Double> = value.iterator()
|
||||
}
|
||||
|
||||
|
||||
/**
|
||||
* A zero-copy wrapping of this [DoubleArray] in a [Value]
|
||||
*/
|
||||
public fun DoubleArray.asValue(): Value = if (isEmpty()) Null else DoubleArrayValue(this)
|
||||
|
||||
public val Value.doubleArray: DoubleArray
|
||||
@ -75,7 +82,17 @@ public fun MutableMetaProvider.doubleArray(
|
||||
reader = { it?.doubleArray ?: doubleArrayOf(*default) },
|
||||
)
|
||||
|
||||
private object DoubleArrayMetaConverter : MetaConverter<DoubleArray> {
|
||||
override fun readOrNull(source: Meta): DoubleArray? = source.doubleArray
|
||||
|
||||
override fun convert(obj: DoubleArray): Meta = Meta(obj.asValue())
|
||||
}
|
||||
|
||||
public val MetaConverter.Companion.doubleArray: MetaConverter<DoubleArray> get() = DoubleArrayMetaConverter
|
||||
|
||||
/**
|
||||
* A [Value] wrapping a [ByteArray]
|
||||
*/
|
||||
public class ByteArrayValue(override val value: ByteArray) : Value, Iterable<Byte> {
|
||||
override val type: ValueType get() = ValueType.LIST
|
||||
override val list: List<Value> get() = value.map { NumberValue(it) }
|
||||
@ -123,4 +140,12 @@ public fun MutableMetaProvider.byteArray(
|
||||
key,
|
||||
writer = { ByteArrayValue(it) },
|
||||
reader = { it?.byteArray ?: byteArrayOf(*default) },
|
||||
)
|
||||
)
|
||||
|
||||
private object ByteArrayMetaConverter : MetaConverter<ByteArray> {
|
||||
override fun readOrNull(source: Meta): ByteArray? = source.byteArray
|
||||
|
||||
override fun convert(obj: ByteArray): Meta = Meta(obj.asValue())
|
||||
}
|
||||
|
||||
public val MetaConverter.Companion.byteArray: MetaConverter<ByteArray> get() = ByteArrayMetaConverter
|
@ -11,9 +11,18 @@ public fun Value.isNull(): Boolean = this == Null
public fun Value.isList(): Boolean = this.type == ValueType.LIST

public val Value.boolean: Boolean
-   get() = this == True
-           || this.list.firstOrNull() == True
-           || (type == ValueType.STRING && string.toBoolean())
+   get() = when (type) {
+       ValueType.NUMBER -> int > 0
+       ValueType.STRING -> string.toBoolean()
+       ValueType.BOOLEAN -> this === True
+       ValueType.LIST -> list.singleOrNull()?.boolean == true
+       ValueType.NULL -> false
+   }
+
+//        this == True
+//                || this.list.firstOrNull() == True
+//                || (type == ValueType.STRING && string.toBoolean())
+//                || (type == ValueType.)

public val Value.int: Int get() = number.toInt()
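The rewrite makes the coercion total over `ValueType`; roughly:

```kotlin
import space.kscience.dataforge.meta.*

fun booleanCoercion() {
    check(1.asValue().boolean)            // NUMBER: true when int > 0
    check("true".asValue().boolean)       // STRING: delegated to String.toBoolean()
    check(listOf(True).asValue().boolean) // LIST: a single boolean element is unwrapped
    check(!Null.boolean)                  // NULL: always false
}
```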
@ -27,4 +27,4 @@ public object NameIndexComparator : Comparator<String?> {
public fun Meta.getIndexedList(name: Name): List<Meta> = getIndexed(name).entries.sortedWith(
    //sort by index
    compareBy(space.kscience.dataforge.names.NameIndexComparator) { it.key }
-).map{it.value}
+).map { it.value }
@ -67,10 +67,29 @@ public class NameToken(public val body: String, public val index: String? = null
     * Parse a name token from a string
     */
    public fun parse(string: String): NameToken {
-       val body = string.substringBefore('[')
-       val index = string.substringAfter('[', "")
-       if (index.isNotEmpty() && !index.endsWith(']')) error("NameToken with index must end with ']'")
-       return NameToken(body, index.removeSuffix("]"))
+       var indexStart = -1
+       var indexEnd = -1
+       string.forEachIndexed { index, c ->
+           when (c) {
+               '[' -> when {
+                   indexStart >= 0 -> error("Second opening bracket not allowed in NameToken: $string")
+                   else -> indexStart = index
+               }
+
+               ']' -> when {
+                   indexStart < 0 -> error("Closing index bracket could not be used before opening bracket in NameToken: $string")
+                   indexEnd >= 0 -> error("Second closing bracket not allowed in NameToken: $string")
+                   else -> indexEnd = index
+               }
+
+               else -> if (indexEnd >= 0) error("Symbols not allowed after index in NameToken: $string")
+           }
+       }
+       if (indexStart >= 0 && indexEnd < 0) error("Opening bracket without closing bracket not allowed in NameToken: $string")
+       return NameToken(
+           if (indexStart >= 0) string.substring(0, indexStart) else string,
+           if (indexStart >= 0) string.substring(indexStart + 1, indexEnd) else null
+       )
    }
}
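The stricter parser accepts at most one `[index]` suffix and rejects everything else:

```kotlin
import space.kscience.dataforge.names.NameToken

fun parseDemo() {
    NameToken.parse("token[22]")  // body = "token", index = "22"
    NameToken.parse("token-body") // body = "token-body", index = null
    NameToken.parse("token[22")   // error: opening bracket without closing bracket
    NameToken.parse("token[22]x") // error: symbols after index
}
```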
@ -0,0 +1,17 @@
package space.kscience.dataforge.meta

import kotlin.test.Test
import kotlin.test.assertEquals

class ConvertersTest {

    @Test
    fun stringListConversion() {
        val list = listOf("A", "B", "C")
        val meta = MetaConverter.stringList.convert(list)
        val json = meta.toJson()
        val reconstructedMeta = json.toMeta()
        val reconstructed = MetaConverter.stringList.read(reconstructedMeta)
        assertEquals(list, reconstructed)
    }
}
@ -0,0 +1,35 @@
package space.kscience.dataforge.meta

import kotlinx.serialization.Serializable
import space.kscience.dataforge.misc.DFExperimental
import kotlin.test.Test
import kotlin.test.assertEquals

@DFExperimental
internal class MetaRefTest {

    @Serializable
    data class XY(val x: Double, val y: Double)

    object TestMetaSpec : MetaSpec() {
        val integer by int { description = "Integer value" }
        val string by string { description = "String value" }
        val custom by item(MetaConverter.serializable<XY>()) { description = "custom value" }
    }

    @Test
    fun specWriteRead() = with(TestMetaSpec) {
        val meta = MutableMeta()

        meta[integer] = 22
        meta[string] = "33"
        val xy = XY(33.0, -33.0)
        meta[custom] = xy

        val sealed = meta.seal()

        assertEquals(22, sealed[integer])
        assertEquals("33", sealed[string])
        assertEquals(xy, sealed[custom])
    }
}
dataforge-meta/src/commonTest/kotlin/space/kscience/dataforge/meta/MutableMetaViewTest.kt (new file, 25 lines)

@ -0,0 +1,25 @@
package space.kscience.dataforge.meta

import space.kscience.dataforge.names.asName
import kotlin.test.Test
import kotlin.test.assertEquals
import kotlin.test.assertTrue

class MutableMetaViewTest {
    @Test
    fun metaView() {
        val meta = MutableMeta()
        val view = meta.view("a".asName())

        view["b"] = Meta.EMPTY

        assertTrue { meta.items.isEmpty() }

        view["c"] = Meta {
            "d" put 22
        }

        assertEquals(22, meta["a.c.d"].int)
    }
}
@ -56,10 +56,22 @@ class NameTest {

    val token2 = NameToken.parse("token-body")
    assertEquals("token-body", token2.body)
-   assertEquals("", token2.index)
+   assertEquals(null, token2.index)

+   // val token3 = NameToken.parse("[token-index]")
+   // assertEquals("", token3.body)
+   // assertEquals("token-index", token3.index)
+
+   assertFails {
+       NameToken.parse("[token-index]")
+   }
+
+   assertFails {
+       NameToken.parse("token[22")
+   }
+
+   assertFails {
+       NameToken.parse("token[22]ddd")
+   }
}
@ -1,68 +0,0 @@
public final class hep/dataforge/output/ConsoleOutputManager : hep/dataforge/context/AbstractPlugin, hep/dataforge/output/OutputManager {
    public static final field Companion Lhep/dataforge/output/ConsoleOutputManager$Companion;
    public fun <init> ()V
    public fun get (Lkotlin/reflect/KClass;Lhep/dataforge/names/Name;Lhep/dataforge/names/Name;Lhep/dataforge/meta/Meta;)Lhep/dataforge/output/Renderer;
    public fun getTag ()Lhep/dataforge/context/PluginTag;
}

public final class hep/dataforge/output/ConsoleOutputManager$Companion : hep/dataforge/context/PluginFactory {
    public fun getTag ()Lhep/dataforge/context/PluginTag;
    public fun getType ()Lkotlin/reflect/KClass;
    public fun invoke (Lhep/dataforge/meta/Meta;Lhep/dataforge/context/Context;)Lhep/dataforge/output/ConsoleOutputManager;
    public synthetic fun invoke (Lhep/dataforge/meta/Meta;Lhep/dataforge/context/Context;)Ljava/lang/Object;
}

public final class hep/dataforge/output/DefaultTextFormat : hep/dataforge/output/TextFormat {
    public static final field INSTANCE Lhep/dataforge/output/DefaultTextFormat;
    public fun getPriority ()I
    public fun getType ()Lkotlin/reflect/KClass;
    public fun render (Ljava/lang/Appendable;Ljava/lang/Object;Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
}

public final class hep/dataforge/output/OutputJVMKt {
    public static final fun getOutput (Lkotlinx/coroutines/Dispatchers;)Lkotlinx/coroutines/CoroutineDispatcher;
}

public abstract interface class hep/dataforge/output/OutputManager {
    public abstract fun get (Lkotlin/reflect/KClass;Lhep/dataforge/names/Name;Lhep/dataforge/names/Name;Lhep/dataforge/meta/Meta;)Lhep/dataforge/output/Renderer;
}

public final class hep/dataforge/output/OutputManager$DefaultImpls {
    public static synthetic fun get$default (Lhep/dataforge/output/OutputManager;Lkotlin/reflect/KClass;Lhep/dataforge/names/Name;Lhep/dataforge/names/Name;Lhep/dataforge/meta/Meta;ILjava/lang/Object;)Lhep/dataforge/output/Renderer;
}

public final class hep/dataforge/output/OutputManagerKt {
    public static final fun getCONSOLE_RENDERER ()Lhep/dataforge/output/Renderer;
    public static final fun getOutput (Lhep/dataforge/context/Context;)Lhep/dataforge/output/OutputManager;
    public static final fun render (Lhep/dataforge/output/OutputManager;Ljava/lang/Object;Lhep/dataforge/names/Name;Lhep/dataforge/names/Name;Lhep/dataforge/meta/Meta;)V
    public static synthetic fun render$default (Lhep/dataforge/output/OutputManager;Ljava/lang/Object;Lhep/dataforge/names/Name;Lhep/dataforge/names/Name;Lhep/dataforge/meta/Meta;ILjava/lang/Object;)V
}

public abstract interface class hep/dataforge/output/Renderer : hep/dataforge/context/ContextAware {
    public abstract fun render (Ljava/lang/Object;Lhep/dataforge/meta/Meta;)V
}

public final class hep/dataforge/output/Renderer$DefaultImpls {
    public static fun getLogger (Lhep/dataforge/output/Renderer;)Lmu/KLogger;
    public static synthetic fun render$default (Lhep/dataforge/output/Renderer;Ljava/lang/Object;Lhep/dataforge/meta/Meta;ILjava/lang/Object;)V
}

public abstract interface class hep/dataforge/output/TextFormat {
    public static final field Companion Lhep/dataforge/output/TextFormat$Companion;
    public static final field TEXT_RENDERER_TYPE Ljava/lang/String;
    public abstract fun getPriority ()I
    public abstract fun getType ()Lkotlin/reflect/KClass;
    public abstract fun render (Ljava/lang/Appendable;Ljava/lang/Object;Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
}

public final class hep/dataforge/output/TextFormat$Companion {
    public static final field TEXT_RENDERER_TYPE Ljava/lang/String;
}

public final class hep/dataforge/output/TextRenderer : hep/dataforge/output/Renderer {
    public fun <init> (Lhep/dataforge/context/Context;Ljava/lang/Appendable;)V
    public fun getContext ()Lhep/dataforge/context/Context;
    public fun getLogger ()Lmu/KLogger;
    public fun render (Ljava/lang/Object;Lhep/dataforge/meta/Meta;)V
}
@ -1,15 +0,0 @@
plugins {
    id("space.kscience.gradle.mpp")
    id("space.kscience.gradle.native")
}

kotlin {
    sourceSets {
        val commonMain by getting{
            dependencies {
                api(project(":dataforge-context"))
                //api(project(":dataforge-io"))
            }
        }
    }
}
@ -1,75 +0,0 @@
package space.kscience.dataforge.output

import space.kscience.dataforge.context.*
import space.kscience.dataforge.context.PluginTag.Companion.DATAFORGE_GROUP
import space.kscience.dataforge.meta.Meta
import space.kscience.dataforge.names.Name
import kotlinx.coroutines.CoroutineDispatcher
import kotlinx.coroutines.Dispatchers
import kotlin.reflect.KClass

/**
 * A manager for outputs
 */
public interface OutputManager {

    /**
     * Get an output specialized for given type, name and stage.
     * @param stage represents the node or directory for the output. Empty means root node.
     * @param name represents the name inside the node.
     * @param meta configuration for [Renderer] (not for rendered object)
     */
    public fun <T : Any> getOutputContainer(
        type: KClass<out T>,
        name: Name,
        stage: Name = Name.EMPTY,
        meta: Meta = Meta.EMPTY
    ): Renderer<T>
}

/**
 * Get an output manager for a context
 */
public val Context.output: OutputManager get() = plugins.get() ?: ConsoleOutputManager()

/**
 * Get an output with given [name], [stage] and reified content type
 */
public inline fun <reified T : Any> OutputManager.getOutputContainer(
    name: Name,
    stage: Name = Name.EMPTY,
    meta: Meta = Meta.EMPTY
): Renderer<T> {
    return getOutputContainer(T::class, name, stage, meta)
}

/**
 * Directly render an object using the most suitable renderer
 */
public fun OutputManager.render(obj: Any, name: Name, stage: Name = Name.EMPTY, meta: Meta = Meta.EMPTY): Unit =
    getOutputContainer(obj::class, name, stage).render(obj, meta)

/**
 * System console output.
 * The [CONSOLE_RENDERER] is used when no other [OutputManager] is provided.
 */
public val CONSOLE_RENDERER: Renderer<Any> = Renderer { obj, meta -> println(obj) }

public class ConsoleOutputManager : AbstractPlugin(), OutputManager {
    override val tag: PluginTag get() = ConsoleOutputManager.tag

    override fun <T : Any> getOutputContainer(type: KClass<out T>, name: Name, stage: Name, meta: Meta): Renderer<T> = CONSOLE_RENDERER

    public companion object : PluginFactory<ConsoleOutputManager> {
        override val tag: PluginTag = PluginTag("output.console", group = DATAFORGE_GROUP)

        override val type: KClass<ConsoleOutputManager> = ConsoleOutputManager::class

        override fun invoke(meta: Meta, context: Context): ConsoleOutputManager = ConsoleOutputManager()
    }
}

/**
 * A dispatcher for output tasks.
 */
public expect val Dispatchers.Output: CoroutineDispatcher
@ -1,21 +0,0 @@
package space.kscience.dataforge.output

import space.kscience.dataforge.context.ContextAware
import space.kscience.dataforge.meta.Meta

/**
 * A generic way to render any object in the output.
 *
 * An object could be rendered either in append or overlay mode. The mode is decided by the [Renderer]
 * based on its configuration and provided meta
 *
 */
public fun interface Renderer<in T : Any> {
    /**
     * Render specific object with configuration.
     *
     * By convention actual render is called in asynchronous mode, so this method should never
     * block execution
     */
    public fun render(obj: T, meta: Meta)
}
@ -1,78 +0,0 @@
package space.kscience.dataforge.output

import space.kscience.dataforge.context.Context
import space.kscience.dataforge.meta.Meta
import space.kscience.dataforge.output.TextFormat.Companion.TEXT_RENDERER_TYPE
import space.kscience.dataforge.provider.Type
import space.kscience.dataforge.provider.top
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.launch
import kotlin.reflect.KClass
import kotlin.reflect.KType

/**
 * A text or binary renderer based on [Output]
 */
@Type(TEXT_RENDERER_TYPE)
@Deprecated("Bad design")
public interface TextFormat {
    /**
     * The priority of this renderer compared to other renderers
     */
    public val priority: Int
    /**
     * The type of the content served by this renderer
     */
    public val type: KClass<*>

    public suspend fun Appendable.render(obj: Any)

    public companion object {
        public const val TEXT_RENDERER_TYPE: String = "dataforge.textRenderer"
    }
}

@Deprecated("Bad design")
public object DefaultTextFormat : TextFormat {
    override val priority: Int = Int.MAX_VALUE
    override val type: KClass<*> = Any::class

    override suspend fun Appendable.render(obj: Any) {
        append(obj.toString() + "\n")
    }
}

/**
 * A text-based renderer
 */
@Deprecated("Bad design")
public class TextRenderer(override val context: Context, private val output: Appendable) : Renderer<Any> {
    private val cache = HashMap<KClass<*>, TextFormat>()

    /**
     * Find the first [TextFormat] matching the given object type.
     */
    override fun render(obj: Any, meta: Meta) {
        val format: TextFormat = if (obj is CharSequence) {
            DefaultTextFormat
        } else {
            val value = cache[obj::class]
            if (value == null) {
                val answer =
                    context.top<TextFormat>(TEXT_RENDERER_TYPE).values.firstOrNull { it.type.isInstance(obj) }
                if (answer != null) {
                    cache[obj::class] = answer
                    answer
                } else {
                    DefaultTextFormat
                }
            } else {
                value
            }
        }
        context.launch(Dispatchers.Output) {
            format.run { output.render(obj) }
        }
    }
}
@ -1,7 +0,0 @@
package space.kscience.dataforge.output

import kotlinx.coroutines.CoroutineDispatcher
import kotlinx.coroutines.Dispatchers

public actual val Dispatchers.Output: CoroutineDispatcher get() = Default

@ -1,6 +0,0 @@
package space.kscience.dataforge.output

import kotlinx.coroutines.CoroutineDispatcher
import kotlinx.coroutines.Dispatchers

public actual val Dispatchers.Output: CoroutineDispatcher get() = IO

@ -1,6 +0,0 @@
package space.kscience.dataforge.output

import kotlinx.coroutines.CoroutineDispatcher
import kotlinx.coroutines.Dispatchers

public actual val Dispatchers.Output: CoroutineDispatcher get() = Dispatchers.Default
@ -1,12 +1,12 @@
# Module dataforge-scripting

Scripting definition for workspace generation

## Usage

## Artifact:

-The Maven coordinates of this project are `space.kscience:dataforge-scripting:0.9.0-dev-1`.
+The Maven coordinates of this project are `space.kscience:dataforge-scripting:0.10.0`.

**Gradle Kotlin DSL:**
```kotlin

@ -16,6 +16,6 @@ repositories {
}

dependencies {
-    implementation("space.kscience:dataforge-scripting:0.9.0-dev-1")
+    implementation("space.kscience:dataforge-scripting:0.10.0")
}
```
@ -2,22 +2,24 @@ plugins {
    id("space.kscience.gradle.mpp")
}

-kscience{
+description = "Scripting definition for workspace generation"
+
+kscience {
    jvm()
    commonMain {
        api(projects.dataforgeWorkspace)
        implementation(kotlin("scripting-common"))
    }
-   jvmMain{
+   jvmMain {
        implementation(kotlin("scripting-jvm-host"))
        implementation(kotlin("scripting-jvm"))
    }
-   jvmTest{
+   jvmTest {
        implementation(spclibs.logback.classic)
    }
}

-readme{
+readme {
    maturity = space.kscience.gradle.Maturity.PROTOTYPE
}
@ -6,7 +6,7 @@

## Artifact:

-The Maven coordinates of this project are `space.kscience:dataforge-workspace:0.9.0-dev-1`.
+The Maven coordinates of this project are `space.kscience:dataforge-workspace:0.10.0`.

**Gradle Kotlin DSL:**
```kotlin

@ -16,6 +16,6 @@ repositories {
}

dependencies {
-    implementation("space.kscience:dataforge-workspace:0.9.0-dev-1")
+    implementation("space.kscience:dataforge-workspace:0.10.0")
}
```
@ -2,6 +2,8 @@ plugins {
    id("space.kscience.gradle.mpp")
}

+description = "A framework for pull-based data processing"
+
kscience {
    jvm()
    js()
|
||||
package space.kscience.dataforge.workspace
|
||||
|
||||
import kotlinx.coroutines.withContext
|
||||
import space.kscience.dataforge.data.DataSink
|
||||
import space.kscience.dataforge.data.DataBuilderScope
|
||||
import space.kscience.dataforge.data.DataTree
|
||||
import space.kscience.dataforge.data.GoalExecutionRestriction
|
||||
import space.kscience.dataforge.data.MutableDataTree
|
||||
import space.kscience.dataforge.meta.Meta
|
||||
import space.kscience.dataforge.meta.MetaReader
|
||||
import space.kscience.dataforge.meta.MetaRepr
|
||||
@ -29,10 +29,10 @@ public interface Task<T> : Described {
|
||||
public val fingerprint: String get() = hashCode().toString(radix = 16)
|
||||
|
||||
/**
|
||||
* Compute a [TaskResult] using given meta. In general, the result is lazy and represents both computation model
|
||||
* and a handler for actual result
|
||||
* Compute a [TaskResult] using given meta. In general, the result is lazy and represents both the computation model
|
||||
* and a handler for the actual result
|
||||
*
|
||||
* @param workspace a workspace to run task in
|
||||
* @param workspace a workspace to run the task in
|
||||
* @param taskName the name of the task in this workspace
|
||||
* @param taskMeta configuration for current stage computation
|
||||
*/
|
||||
@ -62,12 +62,12 @@ public interface TaskWithSpec<T, C : Any> : Task<T> {
|
||||
// block: C.() -> Unit = {},
|
||||
//): TaskResult<T> = execute(workspace, taskName, spec(block))
|
||||
|
||||
public class TaskResultBuilder<T>(
|
||||
public class TaskResultScope<in T>(
|
||||
public val resultType: KType,
|
||||
public val workspace: Workspace,
|
||||
public val taskName: Name,
|
||||
public val taskMeta: Meta,
|
||||
private val dataSink: DataSink<T>,
|
||||
) : DataSink<T> by dataSink
|
||||
) : DataBuilderScope<T>
|
||||
|
||||
/**
|
||||
* Create a [Task] that composes a result using [builder]. Only data from the workspace could be used.
|
||||
@ -77,10 +77,11 @@ public class TaskResultBuilder<T>(
|
||||
* @param descriptor of meta accepted by this task
|
||||
* @param builder for resulting data set
|
||||
*/
|
||||
@UnsafeKType
|
||||
public fun <T : Any> Task(
|
||||
resultType: KType,
|
||||
descriptor: MetaDescriptor? = null,
|
||||
builder: suspend TaskResultBuilder<T>.() -> Unit,
|
||||
builder: suspend TaskResultScope<T>.() -> DataTree<T>,
|
||||
): Task<T> = object : Task<T> {
|
||||
|
||||
override val descriptor: MetaDescriptor? = descriptor
|
||||
@ -89,23 +90,19 @@ public fun <T : Any> Task(
|
||||
workspace: Workspace,
|
||||
taskName: Name,
|
||||
taskMeta: Meta,
|
||||
): TaskResult<T> {
|
||||
): TaskResult<T> = withContext(GoalExecutionRestriction() + workspace.goalLogger) {
|
||||
//TODO use safe builder and check for external data on add and detects cycles
|
||||
@OptIn(UnsafeKType::class)
|
||||
val dataset = MutableDataTree<T>(resultType).apply {
|
||||
TaskResultBuilder(workspace, taskName, taskMeta, this).apply {
|
||||
withContext(GoalExecutionRestriction() + workspace.goalLogger) {
|
||||
builder()
|
||||
}
|
||||
}
|
||||
}
|
||||
return workspace.wrapResult(dataset, taskName, taskMeta)
|
||||
val dataset = TaskResultScope<T>(resultType, workspace, taskName, taskMeta).builder()
|
||||
|
||||
|
||||
workspace.wrapResult(dataset, taskName, taskMeta)
|
||||
}
|
||||
}
|
||||
|
||||
@OptIn(UnsafeKType::class)
|
||||
public inline fun <reified T : Any> Task(
|
||||
descriptor: MetaDescriptor? = null,
|
||||
noinline builder: suspend TaskResultBuilder<T>.() -> Unit,
|
||||
noinline builder: suspend TaskResultScope<T>.() -> DataTree<T>,
|
||||
): Task<T> = Task(typeOf<T>(), descriptor, builder)
|
||||
|
||||
|
||||
@ -117,13 +114,11 @@ public inline fun <reified T : Any> Task(
|
||||
* @param specification a specification for task configuration
|
||||
* @param builder for resulting data set
|
||||
*/
|
||||
|
||||
|
||||
@Suppress("FunctionName")
|
||||
public fun <T : Any, C : MetaRepr> Task(
|
||||
resultType: KType,
|
||||
specification: MetaReader<C>,
|
||||
builder: suspend TaskResultBuilder<T>.(C) -> Unit,
|
||||
builder: suspend TaskResultScope<T>.(C) -> DataTree<T>,
|
||||
): TaskWithSpec<T, C> = object : TaskWithSpec<T, C> {
|
||||
override val spec: MetaReader<C> = specification
|
||||
|
||||
@ -134,15 +129,15 @@ public fun <T : Any, C : MetaRepr> Task(
|
||||
): TaskResult<T> = withContext(GoalExecutionRestriction() + workspace.goalLogger) {
|
||||
//TODO use safe builder and check for external data on add and detects cycles
|
||||
val taskMeta = configuration.toMeta()
|
||||
|
||||
@OptIn(UnsafeKType::class)
|
||||
val dataset = MutableDataTree<T>(resultType).apply {
|
||||
TaskResultBuilder(workspace, taskName, taskMeta, this).apply { builder(configuration) }
|
||||
}
|
||||
val dataset = TaskResultScope<T>(resultType, workspace, taskName, taskMeta).builder(configuration)
|
||||
|
||||
workspace.wrapResult(dataset, taskName, taskMeta)
|
||||
}
|
||||
}
|
||||
|
||||
public inline fun <reified T : Any, C : MetaRepr> Task(
|
||||
specification: MetaReader<C>,
|
||||
noinline builder: suspend TaskResultBuilder<T>.(C) -> Unit,
|
||||
noinline builder: suspend TaskResultScope<T>.(C) -> DataTree<T>,
|
||||
): Task<T> = Task(typeOf<T>(), specification, builder)
|
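A sketch of a task under the new contract: the suspending `builder` runs inside `TaskResultScope` and returns the `DataTree` itself. `DataTree.static { ... }` follows the builder used by `WorkspaceBuilder.data` later in this diff; the `static(name, value)` helper inside the block is an assumption about `StaticDataBuilder`:

```kotlin
import space.kscience.dataforge.data.DataTree
import space.kscience.dataforge.data.static

val echo = Task<String> {
    // the scope exposes workspace, taskName, taskMeta and resultType
    DataTree.static {
        static("greeting", "hello from $taskName") // hypothetical static(name, value) helper
    }
}
```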
@ -6,7 +6,7 @@ import kotlinx.coroutines.joinAll
import kotlinx.coroutines.launch
import space.kscience.dataforge.data.DataTree
import space.kscience.dataforge.data.asSequence
-import space.kscience.dataforge.data.launch
+import space.kscience.dataforge.data.launchIn
import space.kscience.dataforge.meta.Meta
import space.kscience.dataforge.names.Name

@ -33,9 +33,9 @@ public fun <T> Workspace.wrapResult(data: DataTree<T>, taskName: Name, taskMeta:
/**
 * Start computation for all data elements of this node.
 * The resulting [Job] is completed only when all of them are completed.
 */
-public fun TaskResult<*>.launch(scope: CoroutineScope): Job {
+public fun TaskResult<*>.launchIn(scope: CoroutineScope): Job {
    val jobs = asSequence().map {
-       it.data.launch(scope)
+       it.launchIn(scope)
    }.toList()
    return scope.launch { jobs.joinAll() }
}
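Renamed to match the `Flow.launchIn` convention; usage stays a one-liner. `workspace.produce(...)` is assumed from the Workspace API:

```kotlin
import kotlinx.coroutines.coroutineScope

suspend fun runAll(workspace: Workspace) = coroutineScope {
    val result = workspace.produce("echo") // assumed: resolves a task result by name
    result.launchIn(this).join()           // starts every Data and awaits completion
}
```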
@ -4,20 +4,17 @@ import space.kscience.dataforge.actions.Action
import space.kscience.dataforge.context.Context
import space.kscience.dataforge.context.ContextBuilder
import space.kscience.dataforge.context.Global
-import space.kscience.dataforge.data.DataSink
import space.kscience.dataforge.data.DataTree
-import space.kscience.dataforge.data.MutableDataTree
+import space.kscience.dataforge.data.StaticDataBuilder
+import space.kscience.dataforge.data.static
import space.kscience.dataforge.meta.*
import space.kscience.dataforge.meta.descriptors.MetaDescriptor
import space.kscience.dataforge.meta.descriptors.MetaDescriptorBuilder
import space.kscience.dataforge.misc.DFBuilder
-import space.kscience.dataforge.misc.UnsafeKType
import space.kscience.dataforge.names.Name
import space.kscience.dataforge.names.asName
import kotlin.collections.set
import kotlin.properties.PropertyDelegateProvider
import kotlin.properties.ReadOnlyProperty
-import kotlin.reflect.typeOf

public data class TaskReference<T>(public val taskName: Name, public val task: Task<T>) : DataSelector<T> {

@ -42,7 +39,7 @@ public interface TaskContainer {
public inline fun <reified T : Any> TaskContainer.registerTask(
    name: String,
    descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
-   noinline builder: suspend TaskResultBuilder<T>.() -> Unit,
+   noinline builder: suspend TaskResultScope<T>.() -> DataTree<T>,
): Unit = registerTask(Name.parse(name), Task(MetaDescriptor(descriptorBuilder), builder))

/**

@ -51,7 +48,7 @@ public inline fun <reified T : Any> TaskContainer.registerTask(
public inline fun <reified T : Any> TaskContainer.buildTask(
    name: String,
    descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
-   noinline builder: suspend TaskResultBuilder<T>.() -> Unit,
+   noinline builder: suspend TaskResultScope<T>.() -> DataTree<T>,
): TaskReference<T> {
    val theName = Name.parse(name)
    val descriptor = MetaDescriptor(descriptorBuilder)

@ -62,7 +59,7 @@ public inline fun <reified T : Any> TaskContainer.buildTask(

public inline fun <reified T : Any> TaskContainer.task(
    descriptor: MetaDescriptor,
-   noinline builder: suspend TaskResultBuilder<T>.() -> Unit,
+   noinline builder: suspend TaskResultScope<T>.() -> DataTree<T>,
): PropertyDelegateProvider<Any?, ReadOnlyProperty<Any?, TaskReference<T>>> = PropertyDelegateProvider { _, property ->
    val taskName = Name.parse(property.name)
    val task = Task(descriptor, builder)

@ -75,7 +72,7 @@ public inline fun <reified T : Any> TaskContainer.task(
 */
public inline fun <reified T : Any, C : MetaRepr> TaskContainer.task(
    specification: MetaReader<C>,
-   noinline builder: suspend TaskResultBuilder<T>.(C) -> Unit,
+   noinline builder: suspend TaskResultScope<T>.(C) -> DataTree<T>,
): PropertyDelegateProvider<Any?, ReadOnlyProperty<Any?, TaskReference<T>>> = PropertyDelegateProvider { _, property ->
    val taskName = Name.parse(property.name)
    val task = Task(specification, builder)

@ -88,7 +85,7 @@ public inline fun <reified T : Any, C : MetaRepr> TaskContainer.task(
 */
public inline fun <reified T : Any> TaskContainer.task(
    noinline descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
-   noinline builder: suspend TaskResultBuilder<T>.() -> Unit,
+   noinline builder: suspend TaskResultScope<T>.() -> DataTree<T>,
): PropertyDelegateProvider<Any?, ReadOnlyProperty<Any?, TaskReference<T>>> =
    task(MetaDescriptor(descriptorBuilder), builder)

@ -102,15 +99,15 @@ public inline fun <T : Any, reified R : Any> TaskContainer.action(
    noinline descriptorBuilder: MetaDescriptorBuilder.() -> Unit = {},
): PropertyDelegateProvider<Any?, ReadOnlyProperty<Any?, TaskReference<R>>> =
    task(MetaDescriptor(descriptorBuilder)) {
-       result(action.execute(from(selector), taskMeta.copy(metaTransform), workspace))
+       action.execute(from(selector), taskMeta.copy(metaTransform), workspace)
    }

public class WorkspaceBuilder(
    private val parentContext: Context = Global,
) : TaskContainer {
    private var context: Context? = null
-   @OptIn(UnsafeKType::class)
-   private val data = MutableDataTree<Any?>(typeOf<Any?>())
-
+   private var data: DataTree<Any?>? = null
    private val targets: HashMap<String, Meta> = HashMap()
    private val tasks = HashMap<Name, Task<*>>()
    private var cache: WorkspaceCache? = null

@ -125,8 +122,8 @@ public class WorkspaceBuilder(
    /**
     * Define intrinsic data for the workspace
     */
-   public fun data(builder: DataSink<Any?>.() -> Unit) {
-       data.apply(builder)
+   public fun data(builder: StaticDataBuilder<Any?>.() -> Unit) {
+       data = DataTree.static(builder)
    }

    /**

@ -152,7 +149,7 @@ public class WorkspaceBuilder(
    val postProcess: suspend (TaskResult<*>) -> TaskResult<*> = { result ->
        cache?.cache(result) ?: result
    }
-   return WorkspaceImpl(context ?: parentContext, data, targets, tasks, postProcess)
+   return WorkspaceImpl(context ?: parentContext, data ?: DataTree.EMPTY, targets, tasks, postProcess)
}
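Workspace data is now declared once as a static tree instead of mutating a sink. A sketch, assuming the `Workspace { }` builder function and, again, a hypothetical `static(name, value)` helper on `StaticDataBuilder`:

```kotlin
val workspace = Workspace {
    data {
        static("a", 1)
        static("b", 2)
    }
}
```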
@@ -1,20 +1,20 @@
 package space.kscience.dataforge.workspace

-import space.kscience.dataforge.actions.Action
 import space.kscience.dataforge.context.PluginFactory
-import space.kscience.dataforge.data.DataTree
-import space.kscience.dataforge.data.forEach
-import space.kscience.dataforge.data.putAll
-import space.kscience.dataforge.data.transform
-import space.kscience.dataforge.meta.*
+import space.kscience.dataforge.data.*
+import space.kscience.dataforge.meta.Meta
+import space.kscience.dataforge.meta.MutableMeta
+import space.kscience.dataforge.meta.copy
+import space.kscience.dataforge.meta.remove
 import space.kscience.dataforge.misc.DFExperimental
 import space.kscience.dataforge.misc.UnsafeKType
 import space.kscience.dataforge.names.Name
 import space.kscience.dataforge.names.plus

 /**
  * A task meta without a node corresponding to the task itself (removing a node with name of the task).
  */
-public val TaskResultBuilder<*>.defaultDependencyMeta: Meta
+public val TaskResultScope<*>.defaultDependencyMeta: Meta
     get() = taskMeta.copy {
         remove(taskName)
     }
@@ -25,12 +25,12 @@ public val TaskResultBuilder<*>.defaultDependencyMeta: Meta
  * @param selector a workspace data selector. Could be either task selector or initial data selector.
  * @param dependencyMeta meta used for selector. The same meta is used for caching. By default, uses [defaultDependencyMeta].
  */
-public suspend fun <T> TaskResultBuilder<*>.from(
+public suspend fun <T> TaskResultScope<*>.from(
     selector: DataSelector<T>,
     dependencyMeta: Meta = defaultDependencyMeta,
 ): DataTree<T> = selector.select(workspace, dependencyMeta)

-public suspend inline fun <T, reified P : WorkspacePlugin> TaskResultBuilder<*>.from(
+public suspend inline fun <T, reified P : WorkspacePlugin> TaskResultScope<*>.from(
     plugin: P,
     dependencyMeta: Meta = defaultDependencyMeta,
     selectorBuilder: P.() -> TaskReference<T>,
@@ -50,7 +50,7 @@ public suspend inline fun <T, reified P : WorkspacePlugin> TaskResultBuilder<*>.
  * @param dependencyMeta meta used for selector. The same meta is used for caching. By default, uses [defaultDependencyMeta].
  * @param selectorBuilder a builder of task from the plugin.
  */
-public suspend inline fun <reified T, reified P : WorkspacePlugin> TaskResultBuilder<*>.from(
+public suspend inline fun <reified T, reified P : WorkspacePlugin> TaskResultScope<*>.from(
     pluginFactory: PluginFactory<P>,
     dependencyMeta: Meta = defaultDependencyMeta,
     selectorBuilder: P.() -> TaskReference<T>,
@@ -64,7 +64,7 @@ public suspend inline fun <reified T, reified P : WorkspacePlugin> TaskResultBui
     return res as TaskResult<T>
 }

-public val TaskResultBuilder<*>.allData: DataSelector<*>
+public val TaskResultScope<*>.allData: DataSelector<*>
     get() = DataSelector { workspace, _ -> workspace.data }

 /**
@@ -76,44 +76,50 @@ public val TaskResultScope<*>.allData: DataSelector<*>
  * @param dataMetaTransform additional transformation of individual data meta.
  * @param action process individual data asynchronously.
  */
 @OptIn(UnsafeKType::class)
 @DFExperimental
-public suspend inline fun <T, reified R> TaskResultBuilder<R>.transformEach(
+public suspend fun <T, R> TaskResultScope<R>.transformEach(
     selector: DataSelector<T>,
     dependencyMeta: Meta = defaultDependencyMeta,
     dataMetaTransform: MutableMeta.(name: Name) -> Unit = {},
-    crossinline action: suspend (arg: T, name: Name, meta: Meta) -> R,
-) {
-    from(selector, dependencyMeta).forEach { data ->
-        val meta = data.meta.toMutableMeta().apply {
-            taskMeta[taskName]?.let { taskName.put(it) }
-            dataMetaTransform(data.name)
-        }
-
-        val res = data.transform(meta, workspace.context.coroutineContext) {
-            action(it, data.name, meta)
-        }
-
-        put(data.name, res)
-    }
-}
-
-/**
- * Set given [dataSet] as a task result.
- */
-public fun <T> TaskResultBuilder<T>.result(dataSet: DataTree<T>) {
-    this.putAll(dataSet)
-}
-
-/**
- * Use provided [action] to fill the result
- */
-@DFExperimental
-public suspend inline fun <T, reified R> TaskResultBuilder<R>.actionFrom(
-    selector: DataSelector<T>,
-    action: Action<T, R>,
-    dependencyMeta: Meta = defaultDependencyMeta,
-) {
-    putAll(action.execute(from(selector, dependencyMeta), dependencyMeta, workspace))
-}
+    action: suspend NamedValueWithMeta<T>.() -> R,
+): DataTree<R> = from(selector, dependencyMeta).transformEach<T, R>(
+    resultType,
+    workspace.context,
+    metaTransform = { name ->
+        taskMeta[taskName]?.let { taskName put it }
+        dataMetaTransform(name)
+    }
+) {
+    action(it)
+}
+
+@OptIn(UnsafeKType::class)
+public fun <R> TaskResultScope<R>.result(data: Data<R>): DataTree<R> = DataTree.static(resultType) {
+    data(Name.EMPTY, data)
+}
+
+@OptIn(UnsafeKType::class)
+public fun <R> TaskResultScope<R>.result(builder: StaticDataBuilder<R>.() -> Unit): DataTree<R> =
+    DataTree.static(resultType, builder)
+
+///**
+// * Set given [dataSet] as a task result.
+// */
+//public fun <T> TaskResultBuilder<T>.result(dataSet: DataTree<T>) {
+//    putAll(dataSet)
+//}
+
+///**
+// * Use provided [action] to fill the result
+// */
+//@DFExperimental
+//public suspend inline fun <T, reified R> TaskResultScope<R>.actionFrom(
+//    selector: DataSelector<T>,
+//    action: Action<T, R>,
+//    dependencyMeta: Meta = defaultDependencyMeta,
+//) {
+//    putAll(action.execute(from(selector, dependencyMeta), dependencyMeta, workspace))
+//}
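With `TaskResultScope` the task body becomes an expression returning a `DataTree`, and `transformEach` exposes each datum through a `NamedValueWithMeta` receiver. A sketch of a task in the new style, following the pattern used in the tests further down this diff (the task name is hypothetical):

```kotlin
// Inside a WorkspaceBuilder:
val square by task<Int> {
    // `value`, `name`, and `meta` are properties of the NamedValueWithMeta<Int> receiver
    transformEach(dataByType<Int>()) { value * value }
}
```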
@@ -3,17 +3,25 @@ package space.kscience.dataforge.workspace
 import space.kscience.dataforge.actions.AbstractAction
 import space.kscience.dataforge.data.*
 import space.kscience.dataforge.meta.Meta
 import space.kscience.dataforge.names.Name
 import kotlin.reflect.KType

-internal class CachingAction<T>(type: KType, private val caching: (NamedData<T>) -> NamedData<T>) :
-    AbstractAction<T, T>(type) {
-    override fun DataSink<T>.generate(source: DataTree<T>, meta: Meta) {
+internal class CachingAction<T>(
+    type: KType, private val caching: (NamedData<T>) -> NamedData<T>
+) : AbstractAction<T, T>(type) {
+
+    override fun DataBuilderScope<T>.generate(
+        source: DataTree<T>,
+        meta: Meta
+    ): Map<Name, Data<T>> = buildMap {
         source.forEach {
-            put(caching(it))
+            val cached = caching(it)
+            put(cached.name, cached)
         }
     }

-    override suspend fun DataSink<T>.update(source: DataTree<T>, meta: Meta, updatedData: DataUpdate<T>) {
-        put(updatedData.name, updatedData.data?.named(updatedData.name)?.let(caching))
+    override suspend fun DataSink<T>.update(source: DataTree<T>, actionMeta: Meta, updateName: Name) {
+        val updatedData = source.read(updateName)
+        write(updateName, updatedData?.named(updateName)?.let(caching))
     }
 }
dataforge-workspace/src/jvmMain/kotlin/space/kscience/dataforge/workspace/FileDataTree.kt (new file, 196 lines)
@@ -0,0 +1,196 @@
package space.kscience.dataforge.workspace

import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.Job
import kotlinx.coroutines.channels.awaitClose
import kotlinx.coroutines.flow.*
import kotlinx.coroutines.isActive
import kotlinx.coroutines.launch
import space.kscience.dataforge.data.Data
import space.kscience.dataforge.data.DataTree
import space.kscience.dataforge.data.StaticData
import space.kscience.dataforge.io.*
import space.kscience.dataforge.meta.Meta
import space.kscience.dataforge.meta.copy
import space.kscience.dataforge.names.Name
import space.kscience.dataforge.names.NameToken
import space.kscience.dataforge.names.asName
import space.kscience.dataforge.names.plus
import java.nio.file.*
import java.nio.file.attribute.BasicFileAttributes
import java.nio.file.spi.FileSystemProvider
import kotlin.io.path.*
import kotlin.reflect.KType
import kotlin.reflect.typeOf


public class FileDataTree(
    public val io: IOPlugin,
    public val path: Path,
    private val monitor: Boolean = false
) : DataTree<Binary> {
    override val dataType: KType = typeOf<Binary>()

    /**
     * Read data with supported envelope format and binary format. If the envelope format is null, then read binary directly from file.
     * The operation is blocking since it must read the meta header. The reading of envelope body is lazy
     */
    private fun readFileAsData(
        path: Path,
    ): Data<Binary> {
        val envelope = io.readEnvelopeFile(path, true)
        val updatedMeta = envelope.meta.copy {
            FILE_PATH_KEY put path.toString()
            FILE_EXTENSION_KEY put path.extension

            val attributes = path.readAttributes<BasicFileAttributes>()
            FILE_UPDATE_TIME_KEY put attributes.lastModifiedTime().toInstant().toString()
            FILE_CREATE_TIME_KEY put attributes.creationTime().toInstant().toString()
        }
        return StaticData(
            typeOf<Binary>(),
            envelope.data ?: Binary.EMPTY,
            updatedMeta
        )
    }

    private fun readFilesFromDirectory(
        path: Path
    ): Map<NameToken, FileDataTree> = path.listDirectoryEntries().filterNot { it.name.startsWith("@") }.associate {
        NameToken.parse(it.nameWithoutExtension) to FileDataTree(io, it)
    }

    override val data: Data<Binary>?
        get() = when {
            path.isRegularFile() -> {
                //TODO process zip
                readFileAsData(path)
            }

            path.isDirectory() -> {
                //FIXME find data and meta in a single pass instead of two

                val dataBinary: Binary? = path.listDirectoryEntries().find {
                    it.fileName.nameWithoutExtension == IOPlugin.DATA_FILE_NAME
                }?.asBinary()

                val meta: Meta? = path.listDirectoryEntries().find {
                    it.fileName.nameWithoutExtension == IOPlugin.META_FILE_NAME
                }?.let {
                    io.readMetaFileOrNull(it)
                }

                if (dataBinary != null || meta != null) {
                    StaticData(
                        typeOf<Binary>(),
                        dataBinary ?: Binary.EMPTY,
                        meta ?: Meta.EMPTY
                    )
                } else {
                    null
                }
            }

            else -> {
                null
            }
        }

    override val items: Map<NameToken, DataTree<Binary>>
        get() = when {
            path.isDirectory() -> readFilesFromDirectory(path)
            path.isRegularFile() && path.extension == "zip" -> {
                //Using an explicit Zip file system to avoid bizarre compatibility bugs
                val fsProvider = FileSystemProvider.installedProviders().find { it.scheme == "jar" }
                    ?: error("Zip file system provider not found")
                val fs = fsProvider.newFileSystem(path, emptyMap<String, Any>())
                readFilesFromDirectory(fs.rootDirectories.single())
            }

            else -> emptyMap()
        }

    override val updates: Flow<Name> = if (monitor) {
        callbackFlow<Name> {
            val watchService: WatchService = path.fileSystem.newWatchService()

            fun Path.toName() = Name(map { NameToken.parse(it.nameWithoutExtension) })

            fun monitor(childPath: Path): Job {
                val key: WatchKey = childPath.register(
                    watchService, arrayOf(
                        StandardWatchEventKinds.ENTRY_DELETE,
                        StandardWatchEventKinds.ENTRY_MODIFY,
                        StandardWatchEventKinds.ENTRY_CREATE,
                    )
                )

                return launch {
                    while (isActive) {
                        for (event: WatchEvent<*> in key.pollEvents()) {
                            val eventPath = event.context() as Path
                            if (event.kind() === StandardWatchEventKinds.ENTRY_CREATE) {
                                monitor(eventPath)
                            } else {
                                send(eventPath.relativeTo(path).toName())
                            }
                        }
                        key.reset()
                    }
                }
            }

            monitor(path)

            awaitClose {
                watchService.close()
            }

        }.flowOn(Dispatchers.IO).shareIn(io.context, SharingStarted.WhileSubscribed())
    } else {
        emptyFlow()
    }

    public companion object {
        public val FILE_KEY: Name = "file".asName()
        public val FILE_PATH_KEY: Name = FILE_KEY + "path"
        public val FILE_EXTENSION_KEY: Name = FILE_KEY + "extension"
        public val FILE_CREATE_TIME_KEY: Name = FILE_KEY + "created"
        public val FILE_UPDATE_TIME_KEY: Name = FILE_KEY + "updated"
        public const val DF_FILE_EXTENSION: String = "df"
        public val DEFAULT_IGNORE_EXTENSIONS: Set<String> = setOf(DF_FILE_EXTENSION)
    }
}

public fun IOPlugin.readDirectory(path: Path, monitor: Boolean = false): FileDataTree =
    FileDataTree(this, path, monitor)


///**
// * @param resources The names of the resources to read.
// * @param classLoader The class loader to use for loading the resources. By default, it uses the current thread's context class loader.
// */
//@DFExperimental
//public fun DataSink<Binary>.resources(
//    io: IOPlugin,
//    resource: String,
//    vararg otherResources: String,
//    classLoader: ClassLoader = Thread.currentThread().contextClassLoader,
//) {
//    //create a file system if necessary
//    val uri = Thread.currentThread().contextClassLoader.getResource("common")!!.toURI()
//    try {
//        uri.toPath()
//    } catch (e: FileSystemNotFoundException) {
//        FileSystems.newFileSystem(uri, mapOf("create" to "true"))
//    }
//
//    listOf(resource, *otherResources).forEach { r ->
//        val path = classLoader.getResource(r)?.toURI()?.toPath() ?: error(
//            "Resource with name $r is not resolved"
//        )
//        io.readAsDataTree(r.asName(), path)
//    }
//}
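A short sketch of consuming `FileDataTree` through the `readDirectory` entry point above. The path is hypothetical; with `monitor = true` the `updates` flow would emit the names of changed files:

```kotlin
import java.nio.file.Path
import space.kscience.dataforge.context.Global
import space.kscience.dataforge.data.forEach
import space.kscience.dataforge.io.IOPlugin

suspend fun listBinaries() {
    val io = Global.request(IOPlugin)            // obtain the IO plugin from a context
    val tree = io.readDirectory(Path.of("data")) // lazy DataTree<Binary> over the file layout
    tree.forEach { data ->
        println(data.name)                       // names mirror files; "@"-prefixed entries are skipped
    }
}
```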
@@ -15,6 +15,7 @@ import space.kscience.dataforge.data.Data
 import space.kscience.dataforge.data.await
 import space.kscience.dataforge.data.named
 import space.kscience.dataforge.io.*
+import space.kscience.dataforge.meta.Meta
 import space.kscience.dataforge.misc.DFExperimental
 import space.kscience.dataforge.misc.UnsafeKType
 import space.kscience.dataforge.names.withIndex
@@ -24,11 +25,7 @@ import kotlin.io.path.div
 import kotlin.io.path.exists
 import kotlin.reflect.KType

-public class JsonIOFormat<T>(private val type: KType) : IOFormat<T> {
-
-    @Suppress("UNCHECKED_CAST")
-    private val serializer: KSerializer<T> = serializer(type) as KSerializer<T>
-
+public class JsonIOFormat<T>(public val serializer: KSerializer<T>) : IOFormat<T> {
     override fun readFrom(source: Source): T = Json.decodeFromString(serializer, source.readString())

     override fun writeTo(sink: Sink, obj: T) {
@@ -36,12 +33,11 @@ public class JsonIOFormat<T>(private val type: KType) : IOFormat<T> {
     }
 }

 /**
  * An [IOFormat] based on Protobuf representation of the serializeable object.
  */
 @OptIn(ExperimentalSerializationApi::class)
-public class ProtobufIOFormat<T>(private val type: KType) : IOFormat<T> {
-
-    @Suppress("UNCHECKED_CAST")
-    private val serializer: KSerializer<T> = serializer(type) as KSerializer<T>
-
+public class ProtobufIOFormat<T>(public val serializer: KSerializer<T>) : IOFormat<T> {
     override fun readFrom(source: Source): T = ProtoBuf.decodeFromByteArray(serializer, source.readByteArray())

     override fun writeTo(sink: Sink, obj: T) {
@@ -49,19 +45,39 @@ public class ProtobufIOFormat<T>(private val type: KType) : IOFormat<T> {
     }
 }

+public interface IOFormatResolveStrategy {
+    public fun <T> resolve(type: KType, meta: Meta): IOFormat<T>
+
-public class FileWorkspaceCache(public val cacheDirectory: Path) : WorkspaceCache {
+    public companion object {
+        public val PROTOBUF: IOFormatResolveStrategy = object : IOFormatResolveStrategy {
+            @Suppress("UNCHECKED_CAST")
+            override fun <T> resolve(
+                type: KType,
+                meta: Meta
+            ): IOFormat<T> = ProtobufIOFormat(serializer(type) as KSerializer<T>)
+        }

-// private fun <T : Any> TaskData<*>.checkType(taskType: KType): TaskData<T> = this as TaskData<T>
+        public val JSON: IOFormatResolveStrategy = object : IOFormatResolveStrategy {
+            @Suppress("UNCHECKED_CAST")
+            override fun <T> resolve(
+                type: KType,
+                meta: Meta
+            ): IOFormat<T> = JsonIOFormat(serializer(type) as KSerializer<T>)
+        }
+    }
+}

+public class FileWorkspaceCache(
+    public val cacheDirectory: Path,
+    private val ioFormatResolveStrategy: IOFormatResolveStrategy,
+) : WorkspaceCache {


     @OptIn(DFExperimental::class, UnsafeKType::class)
     override suspend fun <T> cache(result: TaskResult<T>): TaskResult<T> {
         val io = result.workspace.context.request(IOPlugin)

-        val format: IOFormat<T> = io.resolveIOFormat(result.dataType, result.taskMeta)
-            ?: ProtobufIOFormat(result.dataType)
-            ?: error("Can't resolve IOFormat for ${result.dataType}")
+        val format: IOFormat<T> = ioFormatResolveStrategy.resolve<T>(result.dataType, result.taskMeta)


         val cachingAction: Action<T, T> = CachingAction(result.dataType) { data ->
@@ -104,4 +120,7 @@ public class FileWorkspaceCache(public val cacheDirectory: Path) : WorkspaceCach
     }
 }

-public fun WorkspaceBuilder.fileCache(cacheDir: Path): Unit = cache(FileWorkspaceCache(cacheDir))
+public fun WorkspaceBuilder.fileCache(
+    cacheDir: Path,
+    ioFormatResolveStrategy: IOFormatResolveStrategy = IOFormatResolveStrategy.PROTOBUF
+): Unit = cache(FileWorkspaceCache(cacheDir, ioFormatResolveStrategy))
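Since the implicit format resolver is gone, the cache format is now an explicit choice. A configuration sketch, assuming the `Workspace { ... }` builder context used elsewhere in this diff:

```kotlin
import java.nio.file.Files

val cachedWorkspace = Workspace {
    // PROTOBUF is the default strategy; JSON trades cache size for human-readable files.
    fileCache(
        Files.createTempDirectory("df-cache"),
        ioFormatResolveStrategy = IOFormatResolveStrategy.JSON,
    )
}
```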
dataforge-workspace/src/jvmMain/kotlin/space/kscience/dataforge/workspace/InMemoryWorkspaceCache.kt (2 changed lines)
@@ -28,7 +28,7 @@ public class InMemoryWorkspaceCache : WorkspaceCache {
         val cachedData = cache.getOrPut(TaskResultId(result.taskName, result.taskMeta)){
             HashMap()
         }.getOrPut(data.name){
-            data.data
+            data
         }
         cachedData.checkType<T>(result.dataType).named(data.name)
     }
@@ -1,7 +1,6 @@
-package space.kscience.dataforge.data
+package space.kscience.dataforge.workspace

-import kotlinx.coroutines.flow.Flow
-import kotlinx.coroutines.flow.filter
+import space.kscience.dataforge.data.*
 import space.kscience.dataforge.misc.DFInternal
 import space.kscience.dataforge.names.Name
 import kotlin.reflect.KType
@@ -22,16 +21,6 @@ private fun <R> Data<*>.castOrNull(type: KType): Data<R>? =
     }
 }

-@Suppress("UNCHECKED_CAST")
-@DFInternal
-public fun <R> Sequence<DataUpdate<*>>.filterByDataType(type: KType): Sequence<NamedData<R>> =
-    filter { it.type.isSubtypeOf(type) } as Sequence<NamedData<R>>
-
-@Suppress("UNCHECKED_CAST")
-@DFInternal
-public fun <R> Flow<DataUpdate<*>>.filterByDataType(type: KType): Flow<NamedData<R>> =
-    filter { it.type.isSubtypeOf(type) } as Flow<NamedData<R>>
-
 /**
  * Select all data matching given type and filters. Does not modify paths
  *
@@ -42,7 +31,7 @@ public fun <R> Flow<DataUpdate<*>>.filterByDataType(type: KType): Flow<NamedData
 public fun <R> DataTree<*>.filterByType(
     type: KType,
     branch: Name = Name.EMPTY,
-    filter: DataFilter = DataFilter.EMPTY,
+    filter: DataFilter = DataFilter.Companion.EMPTY,
 ): DataTree<R> {
     val filterWithType = DataFilter { name, meta, dataType ->
         filter.accepts(name, meta, dataType) && dataType.isSubtypeOf(type)
@@ -56,7 +45,7 @@ public fun <R> DataTree<*>.filterByType(
 @OptIn(DFInternal::class)
 public inline fun <reified R : Any> DataTree<*>.filterByType(
     branch: Name = Name.EMPTY,
-    filter: DataFilter = DataFilter.EMPTY,
+    filter: DataFilter = DataFilter.Companion.EMPTY,
 ): DataTree<R> = filterByType(typeOf<R>(), branch, filter = filter)

 /**
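A sketch of the typed selection this file provides, which now lives in `dataforge-workspace` because it depends on `kotlin.reflect`. The trailing lambda is a `DataFilter`, and the name-based condition is a hypothetical example:

```kotlin
import space.kscience.dataforge.data.DataTree

// Keep only Int-typed data whose names pass an extra filter; paths are not modified.
fun selectInts(tree: DataTree<*>): DataTree<Int> = tree.filterByType<Int> { name, _, _ ->
    !name.toString().startsWith("service") // hypothetical name-based condition
}
```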
@@ -1,188 +0,0 @@
package space.kscience.dataforge.workspace

import kotlinx.coroutines.*
import space.kscience.dataforge.data.Data
import space.kscience.dataforge.data.DataSink
import space.kscience.dataforge.data.StaticData
import space.kscience.dataforge.io.*
import space.kscience.dataforge.meta.Meta
import space.kscience.dataforge.meta.copy
import space.kscience.dataforge.misc.DFExperimental
import space.kscience.dataforge.names.Name
import space.kscience.dataforge.names.NameToken
import space.kscience.dataforge.names.asName
import space.kscience.dataforge.names.plus
import java.nio.file.*
import java.nio.file.attribute.BasicFileAttributes
import java.nio.file.spi.FileSystemProvider
import kotlin.io.path.*
import kotlin.reflect.typeOf


public object FileData {
    public val FILE_KEY: Name = "file".asName()
    public val FILE_PATH_KEY: Name = FILE_KEY + "path"
    public val FILE_EXTENSION_KEY: Name = FILE_KEY + "extension"
    public val FILE_CREATE_TIME_KEY: Name = FILE_KEY + "created"
    public val FILE_UPDATE_TIME_KEY: Name = FILE_KEY + "updated"
    public const val DF_FILE_EXTENSION: String = "df"
    public val DEFAULT_IGNORE_EXTENSIONS: Set<String> = setOf(DF_FILE_EXTENSION)

}


/**
 * Read data with supported envelope format and binary format. If the envelope format is null, then read binary directly from file.
 * The operation is blocking since it must read the meta header. The reading of envelope body is lazy
 */
public fun IOPlugin.readFileData(
    path: Path,
): Data<Binary> {
    val envelope = readEnvelopeFile(path, true)
    val updatedMeta = envelope.meta.copy {
        FileData.FILE_PATH_KEY put path.toString()
        FileData.FILE_EXTENSION_KEY put path.extension

        val attributes = path.readAttributes<BasicFileAttributes>()
        FileData.FILE_UPDATE_TIME_KEY put attributes.lastModifiedTime().toInstant().toString()
        FileData.FILE_CREATE_TIME_KEY put attributes.creationTime().toInstant().toString()
    }
    return StaticData(
        typeOf<Binary>(),
        envelope.data ?: Binary.EMPTY,
        updatedMeta
    )
}

public fun DataSink<Binary>.file(io: IOPlugin, name: Name, path: Path) {
    if (!path.isRegularFile()) error("Only regular files could be handled by this function")
    put(name, io.readFileData(path))
}

public fun DataSink<Binary>.directory(
    io: IOPlugin,
    name: Name,
    path: Path,
) {
    if (!path.isDirectory()) error("Only directories could be handled by this function")
    //process root data

    var dataBinary: Binary? = null
    var meta: Meta? = null
    Files.list(path).forEach { childPath ->
        val fileName = childPath.fileName.toString()
        if (fileName == IOPlugin.DATA_FILE_NAME) {
            dataBinary = childPath.asBinary()
        } else if (fileName.startsWith(IOPlugin.META_FILE_NAME)) {
            meta = io.readMetaFileOrNull(childPath)
        } else if (!fileName.startsWith("@")) {
            val token = if (childPath.isRegularFile() && childPath.extension in FileData.DEFAULT_IGNORE_EXTENSIONS) {
                NameToken(childPath.nameWithoutExtension)
            } else {
                NameToken(childPath.name)
            }

            files(io, name + token, childPath)
        }
    }

    //set data if it is relevant
    if (dataBinary != null || meta != null) {
        put(
            name,
            StaticData(
                typeOf<Binary>(),
                dataBinary ?: Binary.EMPTY,
                meta ?: Meta.EMPTY
            )
        )
    }
}

public fun DataSink<Binary>.files(
    io: IOPlugin,
    name: Name,
    path: Path,
) {
    if (path.isRegularFile() && path.extension == "zip") {
        //Using explicit Zip file system to avoid bizarre compatibility bugs
        val fsProvider = FileSystemProvider.installedProviders().find { it.scheme == "jar" }
            ?: error("Zip file system provider not found")
        val fs = fsProvider.newFileSystem(path, emptyMap<String, Any>())

        files(io, name, fs.rootDirectories.first())
    }
    if (path.isRegularFile()) {
        file(io, name, path)
    } else {
        directory(io, name, path)
    }
}


private fun Path.toName() = Name(map { NameToken.parse(it.nameWithoutExtension) })

public fun DataSink<Binary>.monitorFiles(
    io: IOPlugin,
    name: Name,
    path: Path,
    scope: CoroutineScope = io.context,
): Job {
    files(io, name, path)
    return scope.launch(Dispatchers.IO) {
        val watchService = path.fileSystem.newWatchService()

        path.register(
            watchService,
            StandardWatchEventKinds.ENTRY_DELETE,
            StandardWatchEventKinds.ENTRY_MODIFY,
            StandardWatchEventKinds.ENTRY_CREATE
        )

        do {
            val key = watchService.take()
            if (key != null) {
                for (event: WatchEvent<*> in key.pollEvents()) {
                    val eventPath = event.context() as Path
                    if (event.kind() == StandardWatchEventKinds.ENTRY_DELETE) {
                        put(eventPath.toName(), null)
                    } else {
                        val fileName = eventPath.fileName.toString()
                        if (!fileName.startsWith("@")) {
                            files(io, name, eventPath)
                        }
                    }
                }
                key.reset()
            }
        } while (isActive && key != null)
    }

}

/**
 * @param resources The names of the resources to read.
 * @param classLoader The class loader to use for loading the resources. By default, it uses the current thread's context class loader.
 */
@DFExperimental
public fun DataSink<Binary>.resources(
    io: IOPlugin,
    resource: String,
    vararg otherResources: String,
    classLoader: ClassLoader = Thread.currentThread().contextClassLoader,
) {
    //create a file system if necessary
    val uri = Thread.currentThread().contextClassLoader.getResource("common")!!.toURI()
    try {
        uri.toPath()
    } catch (e: FileSystemNotFoundException) {
        FileSystems.newFileSystem(uri, mapOf("create" to "true"))
    }

    listOf(resource, *otherResources).forEach { r ->
        val path = classLoader.getResource(r)?.toURI()?.toPath() ?: error(
            "Resource with name $r is not resolved"
        )
        files(io, r.asName(), path)
    }
}
@@ -1,7 +1,6 @@
 package space.kscience.dataforge.workspace

 import space.kscience.dataforge.data.DataTree
-import space.kscience.dataforge.data.filterByType
 import space.kscience.dataforge.meta.Meta
 import space.kscience.dataforge.misc.DFExperimental
 import space.kscience.dataforge.names.Name
@@ -15,14 +14,14 @@ import space.kscience.dataforge.names.matches
  * Select the whole data set from the workspace filtered by type.
  */
 @OptIn(DFExperimental::class)
-public inline fun <reified T : Any> TaskResultBuilder<*>.dataByType(namePattern: Name? = null): DataSelector<T> =
+public inline fun <reified T : Any> TaskResultScope<*>.dataByType(namePattern: Name? = null): DataSelector<T> =
     DataSelector<T> { workspace, _ ->
         workspace.data.filterByType { name, _, _ ->
             namePattern == null || name.matches(namePattern)
         }
     }

-public suspend inline fun <reified T : Any> TaskResultBuilder<*>.fromTask(
+public suspend inline fun <reified T : Any> TaskResultScope<*>.fromTask(
     task: Name,
     taskMeta: Meta = Meta.EMPTY,
 ): DataTree<T> = workspace.produce(task, taskMeta).filterByType()
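A sketch chaining tasks through the selectors above: `fromTask` runs another task by name and filters its output by type. The task names here are hypothetical:

```kotlin
val sumOfSquares by task<Int> {
    val squares: DataTree<Int> = fromTask(Name.parse("square")) // run the upstream task
    result(squares.foldToData(0) { acc, data -> acc + data.value })
}
```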
@@ -2,7 +2,9 @@ package space.kscience.dataforge.workspace

 import kotlinx.coroutines.Dispatchers
 import kotlinx.coroutines.withContext
-import space.kscience.dataforge.data.*
+import space.kscience.dataforge.data.DataTree
+import space.kscience.dataforge.data.forEach
+import space.kscience.dataforge.data.meta
 import space.kscience.dataforge.io.*
 import space.kscience.dataforge.misc.DFExperimental
 import space.kscience.dataforge.names.Name
@@ -32,8 +34,8 @@ public suspend fun <T : Any> IOPlugin.writeDataDirectory(
     } else if (!Files.isDirectory(path)) {
         error("Can't write a node into file")
     }
-    dataSet.forEach { (name, data) ->
-        val childPath = path.resolve(name.tokens.joinToString("/") { token -> token.toStringUnescaped() })
+    dataSet.forEach { data ->
+        val childPath = path.resolve(data.name.tokens.joinToString("/") { token -> token.toStringUnescaped() })
         childPath.parent.createDirectories()
         val envelope = data.toEnvelope(format)
         if (envelopeFormat != null) {
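The write side mirrors `readDirectory`: each datum becomes an envelope file whose path is derived from its name. A round-trip sketch, assuming a `DataTree<String>` and `StringIOFormat` as in the tests below:

```kotlin
import java.nio.file.Files

suspend fun roundTrip(io: IOPlugin, dataNode: DataTree<String>) {
    val dir = Files.createTempDirectory("df_data_node")
    io.writeDataDirectory(dir, dataNode, StringIOFormat) // one envelope file per datum
    val restored = io.readDirectory(dir)                 // lazy DataTree<Binary> over the same layout
}
```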
dataforge-workspace/src/jvmTest/kotlin/space/kscience/dataforge/workspace/CachingWorkspaceTest.kt (16 changed lines)
@@ -3,7 +3,7 @@ package space.kscience.dataforge.workspace
 import kotlinx.coroutines.coroutineScope
 import kotlinx.coroutines.test.runTest
 import org.junit.jupiter.api.Test
-import space.kscience.dataforge.data.putValue
+import space.kscience.dataforge.data.value
 import space.kscience.dataforge.meta.Meta
 import space.kscience.dataforge.meta.boolean
 import space.kscience.dataforge.meta.get
@@ -22,14 +22,14 @@ internal class CachingWorkspaceTest {
         data {
             //statically initialize data
             repeat(5) {
-                putValue("myData[$it]", it)
+                value("myData[$it]", it)
             }
         }

         inMemoryCache()

         val doFirst by task<Any> {
-            transformEach(allData) { _, name, _ ->
+            transformEach(allData) {
                 firstCounter++
                 println("Done first on $name with flag=${taskMeta["flag"].boolean}")
             }
@@ -39,7 +39,7 @@ internal class CachingWorkspaceTest {
             transformEach(
                 doFirst,
                 dependencyMeta = if (taskMeta["flag"].boolean == true) taskMeta else Meta.EMPTY
-            ) { _, name, _ ->
+            ) {
                 secondCounter++
                 println("Done second on $name with flag=${taskMeta["flag"].boolean ?: false}")
             }
@@ -52,11 +52,11 @@ internal class CachingWorkspaceTest {
         val secondC = workspace.produce("doSecond")
         //use coroutineScope to wait for the result
         coroutineScope {
-            first.launch(this)
-            secondA.launch(this)
-            secondB.launch(this)
+            first.launchIn(this)
+            secondA.launchIn(this)
+            secondB.launchIn(this)
             //repeat to check caching
-            secondC.launch(this)
+            secondC.launchIn(this)
         }

         assertEquals(10, firstCounter)
@@ -20,14 +20,12 @@ class DataPropagationTestPlugin : WorkspacePlugin() {
         val result: Data<Int> = selectedData.foldToData(0) { result, data ->
             result + data.value
         }
-        put("result", result)
+        result(result)
     }


     val singleData by task<Int> {
-        workspace.data.filterByType<Int>()["myData[12]"]?.let {
-            put("result", it)
-        }
+        result(workspace.data.filterByType<Int>()["myData[12]"]!!)
     }


@@ -47,7 +45,7 @@ class DataPropagationTest {
         }
         data {
             repeat(100) {
-                putValue("myData[$it]", it)
+                value("myData[$it]", it)
             }
         }
     }
@@ -55,12 +53,12 @@ class DataPropagationTest {
     @Test
     fun testAllData() = runTest {
         val node = testWorkspace.produce("Test.allData")
-        assertEquals(4950, node.content.asSequence().single().await())
+        assertEquals(4950, node.content.data?.await())
     }

     @Test
     fun testSingleData() = runTest {
         val node = testWorkspace.produce("Test.singleData")
-        assertEquals(12, node.content.asSequence().single().await())
+        assertEquals(12, node.content.data?.await())
     }
 }
@@ -12,7 +12,6 @@ import space.kscience.dataforge.io.*
 import space.kscience.dataforge.io.yaml.YamlPlugin
 import space.kscience.dataforge.meta.get
 import space.kscience.dataforge.misc.DFExperimental
-import space.kscience.dataforge.names.Name
 import java.nio.file.Files
 import kotlin.io.path.deleteExisting
 import kotlin.io.path.fileSize
@@ -22,13 +21,13 @@ import kotlin.test.assertEquals


 class FileDataTest {
-    val dataNode = DataTree<String> {
-        putAll("dir") {
-            putValue("a", "Some string") {
+    val dataNode = DataTree.static<String> {
+        node("dir") {
+            value("a", "Some string") {
                 "content" put "Some string"
             }
         }
-        putValue("b", "root data")
+        value("b", "root data")
//        meta {
//            "content" put "This is root meta node"
//        }
@@ -51,10 +50,10 @@ class FileDataTest {
         val dir = Files.createTempDirectory("df_data_node")
         io.writeDataDirectory(dir, dataNode, StringIOFormat)
         println(dir.toUri().toString())
-        val data = DataTree {
-            files(io, Name.EMPTY, dir)
-        }
-        val reconstructed = data.transform { (_, value) -> value.toByteArray().decodeToString() }
+        val data = io.readDirectory(dir)
+        val reconstructed = data.transformEach(this) { (_, value) ->
+            value.toByteArray().decodeToString()
+        }
         assertEquals(dataNode["dir.a"]?.meta?.get("content"), reconstructed["dir.a"]?.meta?.get("content"))
         assertEquals(dataNode["b"]?.await(), reconstructed["b"]?.await())
     }
@@ -68,8 +67,9 @@ class FileDataTest {
         zip.deleteExisting()
         io.writeZip(zip, dataNode, StringIOFormat)
         println(zip.toUri().toString())
-        val reconstructed = DataTree { files(io, Name.EMPTY, zip) }
-            .transform { (_, value) -> value.toByteArray().decodeToString() }
+        val reconstructed = io.readDirectory(zip).transformEach(this) { (_, value) ->
+            value.toByteArray().decodeToString()
+        }
         assertEquals(dataNode["dir.a"]?.meta?.get("content"), reconstructed["dir.a"]?.meta?.get("content"))
         assertEquals(dataNode["b"]?.await(), reconstructed["b"]?.await())
dataforge-workspace/src/jvmTest/kotlin/space/kscience/dataforge/workspace/FileWorkspaceCacheTest.kt (8 changed lines)
@@ -3,7 +3,7 @@ package space.kscience.dataforge.workspace
 import kotlinx.coroutines.ExperimentalCoroutinesApi
 import kotlinx.coroutines.test.runTest
 import org.junit.jupiter.api.Test
-import space.kscience.dataforge.data.putValue
+import space.kscience.dataforge.data.value
 import space.kscience.dataforge.misc.DFExperimental
 import java.nio.file.Files

@@ -16,17 +16,17 @@ class FileWorkspaceCacheTest {
         data {
             //statically initialize data
             repeat(5) {
-                putValue("myData[$it]", it)
+                value("myData[$it]", it)
             }
         }
         fileCache(Files.createTempDirectory("dataforge-temporary-cache"))

         val echo by task<String> {
-            transformEach(dataByType<String>()) { arg, _, _ -> arg }
+            transformEach(dataByType<String>()) { value }
         }
     }

-        workspace.produce("echo").launch(this)
+        workspace.produce("echo").launchIn(this)

     }
 }
@@ -37,9 +37,9 @@ internal object TestPlugin : WorkspacePlugin() {

     val test by task {
         // type is inferred
-        transformEach(dataByType<Int>()) { arg, _, _ ->
-            logger.info { "Test: $arg" }
-            arg
+        transformEach(dataByType<Int>()) {
+            logger.info { "Test: $value" }
+            value
         }

     }
@@ -62,42 +62,42 @@ internal class SimpleWorkspaceTest {
         data {
             //statically initialize data
             repeat(100) {
-                putValue("myData[$it]", it)
+                value("myData[$it]", it)
             }
         }

         val filterOne by task<Int> {
             val name by taskMeta.string { error("Name field not defined") }
-            from(testPluginFactory) { test }[name]?.let { source: Data<Int> ->
-                put(name, source)
-            }
+            result(from(testPluginFactory) { test }[name]!!)
         }

         val square by task<Int> {
-            transformEach(dataByType<Int>()) { arg, name, meta ->
+            transformEach(dataByType<Int>()) {
                 if (meta["testFlag"].boolean == true) {
                     println("Side effect")
                 }
                 workspace.logger.info { "Starting square on $name" }
-                arg * arg
+                value * value
             }
         }

         val linear by task<Int> {
-            transformEach(dataByType<Int>()) { arg, name, _ ->
+            transformEach(dataByType<Int>()) {
                 workspace.logger.info { "Starting linear on $name" }
-                arg * 2 + 1
+                value * 2 + 1
             }
         }

         val fullSquare by task<Int> {
             val squareData = from(square)
             val linearData = from(linear)
-            squareData.forEach { data ->
-                val newData: Data<Int> = data.combine(linearData[data.name]!!) { l, r ->
-                    l + r
+            result {
+                squareData.forEach { data ->
+                    val newData: Data<Int> = data.combine(linearData[data.name]!!) { l, r ->
+                        l + r
+                    }
+                    data(data.name, newData)
                 }
-                put(data.name, newData)
             }
         }

@@ -106,7 +106,7 @@ internal class SimpleWorkspaceTest {
             val res = from(square).foldToData(0) { l, r ->
                 l + r.value
             }
-            put("sum", res)
+            result(res)
         }

         val averageByGroup by task<Int> {
@@ -116,13 +116,15 @@ internal class SimpleWorkspaceTest {
                 l + r.value
             }

-            put("even", evenSum)
             val oddSum = workspace.data.filterByType<Int> { name, _, _ ->
                 name.toString().toInt() % 2 == 1
             }.foldToData(0) { l, r ->
                 l + r.value
             }
-            put("odd", oddSum)
+            result {
+                data("even", evenSum)
+                data("odd", oddSum)
+            }
         }

         val delta by task<Int> {
@@ -132,15 +134,17 @@ internal class SimpleWorkspaceTest {
             val res = even.combine(odd) { l, r ->
                 l - r
             }
-            put("res", res)
+            result(res)
         }

         val customPipe by task<Int> {
-            workspace.data.filterByType<Int>().forEach { data ->
-                val meta = data.meta.toMutableMeta().apply {
-                    "newValue" put 22
+            result {
+                workspace.data.filterByType<Int>().forEach { data ->
+                    val meta = data.meta.toMutableMeta().apply {
+                        "newValue" put 22
+                    }
+                    data(data.name + "new", data.transform { (data.meta["value"].int ?: 0) + it })
                 }
-                put(data.name + "new", data.transform { (data.meta["value"].int ?: 0) + it })
             }
         }

@@ -148,16 +152,16 @@ internal class SimpleWorkspaceTest {
     }

     @Test
-    fun testWorkspace() = runTest(timeout = 100.milliseconds) {
+    fun testWorkspace() = runTest(timeout = 200.milliseconds) {
         val node = workspace.produce("sum")
-        val res = node.asSequence().single()
-        assertEquals(328350, res.await())
+        val res = node.data
+        assertEquals(328350, res?.await())
     }

     @Test
-    fun testMetaPropagation() = runTest(timeout = 100.milliseconds) {
+    fun testMetaPropagation() = runTest(timeout = 200.milliseconds) {
         val node = workspace.produce("sum") { "testFlag" put true }
-        val res = node["sum"]!!.await()
+        val res = node.data?.await()
     }

     @Test
@@ -175,7 +179,7 @@ internal class SimpleWorkspaceTest {
             """
                 Name: ${it.name}
                 Meta: ${it.meta}
-                Data: ${it.data.await()}
+                Data: ${it.await()}
             """.trimIndent()
         )
     }
@@ -186,7 +190,7 @@ internal class SimpleWorkspaceTest {
         val node = workspace.produce("filterOne") {
             "name" put "myData[12]"
         }
-        assertEquals(12, node.asSequence().first().await())
+        assertEquals(12, node.data?.await())
     }

 }
docs/templates/README-TEMPLATE.md (vendored, 65 changed lines)
@@ -1,6 +1,69 @@
[![JetBrains Research](https://jb.gg/badges/research.svg)](https://confluence.jetbrains.com/display/ALL/JetBrains+on+GitHub)
[![DOI](https://zenodo.org/badge/148831678.svg)](https://zenodo.org/badge/latestdoi/148831678)

![DataForge](docs/images/forge.svg)

## Publications

* [A general overview](https://doi.org/10.1051/epjconf/201817705003)
* [An application in the "Troitsk nu-mass" experiment](https://doi.org/10.1088/1742-6596/1525/1/012024)

## Video

* [A presentation on the application of DataForge (legacy version) to Troitsk nu-mass analysis.](https://youtu.be/OpWzLXUZnLI?si=3qn7EMruOHMJX3Bc)

## Questions and Answers

In this section, we will try to cover DataForge's main ideas in the form of questions and answers.

### General

**Q**: I have a lot of data to analyze. The analysis process is complicated, requires a lot of stages, and the data flow is not always obvious. Also, the data size is huge, so I don't want to perform operations I don't need (calculate something I won't need or calculate something twice). I need it to be performed in parallel and probably on a remote computer. By the way, I am sick and tired of scripts that modify other scripts that control scripts. Could you help me?

**A**: Yes, that is precisely the problem DataForge was made to solve. It allows performing automated data manipulations with optimization and parallelization. The important thing is that data processing recipes are made in a declarative way, so it is quite easy to perform computations on a remote station. Also, DataForge guarantees reproducibility of analysis results.

**Q**: How does it work?

**A**: At the core of DataForge lies the idea of a metadata processor. It utilizes the fact that to analyze something you need the data itself and some additional information about what that data represents and what the user wants as a result. This additional information is called metadata and can be organized in a regular structure (a tree of values similar to XML or JSON). The important thing is that this distinction leaves no place for user instructions (or scripts). Indeed, the idea of DataForge logic is that one does not need imperative commands. The framework configures itself according to the input metadata and decides which operations should be performed in the most efficient way.

**Q**: But where does it take the algorithms to use?

**A**: Of course, algorithms must be written somewhere. No magic here. The logic is written in specialized modules. Some modules are provided out of the box at the system core, some need to be developed for a specific problem.

**Q**: So I still need to write the code? What is the difference then?

**A**: Yes, someone still needs to write the code. But not necessarily you. Simple operations can be performed using the provided core logic. Also, your group can have one programmer writing the logic and all the others using it without any real programming expertise. The framework is organized in such a way that when one writes some additional logic, they do not need to think about complicated things like parallel computing, resource handling, logging, caching, etc. Most of these things are done by DataForge.

### Platform

**Q**: Which platform does DataForge use? Which operating systems does it work on?

**A**: DataForge is mostly written in Kotlin multiplatform and can be used on JVM, JS, and native targets. Some modules and functions are supported only on the JVM.

**Q**: Can I use my C++/Fortran/Python code in DataForge?

**A**: Yes, as long as the code can be called from Java. Most common languages have a bridge for Java access. There are no problems at all with compiled C/Fortran libraries. Python code can be called via one of the existing Python-Java interfaces. It is also planned to implement remote method invocation for common languages, so your Python, or, say, Julia, code could run in its native environment. The metadata processor paradigm makes it much easier to do so.

### Features

**Q**: What other features does DataForge provide?

**A**: Alongside metadata processing (and a lot of tools for metadata manipulation and layering), DataForge has two additional important concepts:

* **Modularisation**. Contrary to many other frameworks, DataForge is intrinsically modular. The mandatory part is a rather tiny core module. Everything else can be customized.

* **Context encapsulation**. Every DataForge task is executed in some context. The context isolates the environment for the task and also works as a dependency injection base and specifies the interaction of the task with the external world.

### Misc

**Q**: So everything looks great; can I replace my ROOT / other data analysis framework with DataForge?

**A**: One must note that DataForge is made for analysis, not for visualization. The visualization and user interaction capabilities of DataForge are rather limited compared to frameworks like ROOT, JAS3 or DataMelt. The idea is to provide a reliable API and core functionality. The [VisionForge](https://git.sciprog.center/kscience/visionforge) project aims to provide tools for both 2D and 3D visualization, both locally and remotely.

**Q**: How does DataForge compare to cluster computation frameworks like Apache Spark?

**A**: It is not the purpose of DataForge to replace cluster computing software. DataForge has some internal parallelism mechanics and implementations, but they are most certainly worse than specially developed programs. Still, DataForge is not fixed on one single implementation. Your favourite parallel processing tool could still be used as a back-end for DataForge, with the full benefit of configuration tools and integrations, and no performance overhead.

**Q**: Is it possible to use DataForge in notebook mode?

**A**: [Kotlin jupyter](https://github.com/Kotlin/kotlin-jupyter) allows using any JVM program in notebook mode. The dedicated module for DataForge is a work in progress.

${modules}
@@ -5,5 +5,6 @@ org.gradle.jvmargs=-Xmx4096m

 kotlin.mpp.stability.nowarn=true
 kotlin.native.ignoreDisabledTargets=true
+org.jetbrains.dokka.experimental.gradle.pluginMode=V2Enabled

-toolsVersion=0.15.4-kotlin-2.0.0
+toolsVersion=0.16.1-kotlin-2.1.0
gradle/wrapper/gradle-wrapper.properties (vendored, 2 changed lines)
@@ -1,5 +1,5 @@
 distributionBase=GRADLE_USER_HOME
 distributionPath=wrapper/dists
-distributionUrl=https\://services.gradle.org/distributions/gradle-8.6-bin.zip
+distributionUrl=https\://services.gradle.org/distributions/gradle-8.12-bin.zip
 zipStoreBase=GRADLE_USER_HOME
 zipStorePath=wrapper/dists