Compare commits: 78ff8d4f6e ... c20ac1ffb8 (10 commits)

Commits (SHA1):
c20ac1ffb8
ac1cddcae7
cadfc20e54
48c7c27af4
307d4cb18a
c2d2813622
5e33bdbce7
ce6da61fcf
f7d9aff838
d683170a73
LICENSE (new file, 201 lines)
@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.

"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.

3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:

(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and

(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and

(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.

You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS

APPENDIX: How to apply the Apache License to your work.

To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.

Copyright [yyyy] [name of copyright owner]

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
@ -120,7 +120,7 @@ abstract class AbstractPluginLoader : PluginLoader {
|
||||
|
||||
|
||||
protected fun compare(p1: PluginFactory, p2: PluginFactory): Int {
|
||||
return Integer.compare(p1.tag.getInt("priority", 0), p2.tag.getInt("priority", 0))
|
||||
return p1.tag.getInt("priority", 0).compareTo(p2.tag.getInt("priority", 0))
|
||||
}
|
||||
|
||||
override fun listTags(): List<PluginTag> {
|
||||
|
@ -88,7 +88,7 @@ open class DefaultEnvelopeReader : EnvelopeReader {
|
||||
val dataLength = tag.dataSize
|
||||
if (metaLength < 0 || dataLength < 0) {
|
||||
LoggerFactory.getLogger(javaClass).error("Can't lazy read infinite data or meta. Returning non-lazy envelope")
|
||||
return read(file)
|
||||
return read(Files.newInputStream(file))
|
||||
}
|
||||
|
||||
val metaBuffer = ByteBuffer.allocate(metaLength)
|
||||
|
@ -215,7 +215,7 @@ interface Name : Comparable<Name> {
|
||||
return of(segments[0])
|
||||
}
|
||||
|
||||
return of(Stream.of(*segments).filter { it -> !it.isEmpty() }.map<Name>{ of(it) }.toList())
|
||||
return of(Stream.of(*segments).filter { it -> it.isNotEmpty() }.map { of(it) }.toList())
|
||||
}
|
||||
|
||||
fun joinString(vararg segments: String): String {
|
||||
|
@ -31,9 +31,7 @@ interface ValueProvider {
|
||||
fun optValue(path: String): Optional<Value>
|
||||
|
||||
|
||||
fun getValue(path: String): Value {
|
||||
return optValue(path).orElseThrow<NameNotFoundException> { NameNotFoundException(path) }
|
||||
}
|
||||
fun getValue(path: String): Value = optValue(path).orElseThrow { NameNotFoundException(path) }
|
||||
|
||||
@Provides(BOOLEAN_TARGET)
|
||||
|
||||
|
@ -150,13 +150,11 @@ class JFreeChartFrame : XYPlotFrame(), FXPlotFrame, Serializable {
|
||||
return Range(meta.getDouble("lower", java.lang.Double.NEGATIVE_INFINITY), meta.getDouble("upper", java.lang.Double.POSITIVE_INFINITY))
|
||||
}
|
||||
|
||||
private fun getAxis(axisMeta: Meta): ValueAxis {
|
||||
return when (axisMeta.getString("type", "number").lowercase()) {
|
||||
private fun getAxis(axisMeta: Meta): ValueAxis = when (axisMeta.getString("type", "number").lowercase()) {
|
||||
"log" -> getLogAxis(axisMeta)
|
||||
"time" -> getDateAxis(axisMeta)
|
||||
else -> getNumberAxis(axisMeta)
|
||||
}
|
||||
}
|
||||
|
||||
override fun updateAxis(axisName: String, axisMeta: Meta, plotMeta: Meta) {
|
||||
val axis = getAxis(axisMeta)
|
||||
|
@ -112,37 +112,29 @@ class PlotGroup(override val name: String, descriptor: NodeDescriptor = NodeDes
|
||||
}
|
||||
|
||||
@ProvidesNames(PLOT_TARGET)
|
||||
fun list(): Stream<String> {
|
||||
return stream().map { it.first.toString() }
|
||||
}
|
||||
fun list(): Stream<String> = stream().map { it.first.toString() }
|
||||
|
||||
/**
|
||||
* Recursive stream of all plots excluding intermediate nodes
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
fun stream(recursive: Boolean = true): Stream<Pair<Name, Plottable>> {
|
||||
return plots.stream().flatMap {
|
||||
fun stream(recursive: Boolean = true): Stream<Pair<Name, Plottable>> = plots.stream().flatMap {
|
||||
if (recursive && it is PlotGroup) {
|
||||
it.stream().map { pair -> Pair(Name.ofSingle(it.name) + pair.first, pair.second) }
|
||||
} else {
|
||||
Stream.of(Pair(Name.ofSingle(it.name), it))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@Provides(PLOT_TARGET)
|
||||
operator fun get(name: String): Plottable? {
|
||||
return get(Name.of(name))
|
||||
}
|
||||
operator fun get(name: String): Plottable? = get(Name.of(name))
|
||||
|
||||
operator fun get(name: Name): Plottable? {
|
||||
return when {
|
||||
name.length == 0 -> this
|
||||
name.length == 1 -> plots.find { it.name == name.unescaped }
|
||||
operator fun get(name: Name): Plottable? = when (name.length) {
|
||||
0 -> this
|
||||
1 -> plots.find { it.name == name.unescaped }
|
||||
else -> (get(name.cutLast()) as? PlotGroup)?.get(name.last)
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* * Add plottable if it is absent,
|
||||
@ -203,9 +195,7 @@ class PlotGroup(override val name: String, descriptor: NodeDescriptor = NodeDes
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
override fun iterator(): Iterator<Plottable> {
|
||||
return this.plots.iterator()
|
||||
}
|
||||
override fun iterator(): Iterator<Plottable> = this.plots.iterator()
|
||||
|
||||
class Wrapper : hep.dataforge.io.envelopes.Wrapper<PlotGroup> {
|
||||
|
||||
|
@ -35,7 +35,10 @@ import hep.dataforge.storage.StorageManager
|
||||
import kotlinx.coroutines.async
|
||||
import kotlinx.coroutines.awaitAll
|
||||
import kotlinx.coroutines.runBlocking
|
||||
import java.nio.file.*
|
||||
import java.nio.file.Files
|
||||
import java.nio.file.Path
|
||||
import java.nio.file.Paths
|
||||
import java.nio.file.StandardOpenOption
|
||||
import kotlin.streams.asSequence
|
||||
|
||||
/**
|
||||
@ -57,7 +60,12 @@ interface FileStorageElementType : StorageElementType, Named {
|
||||
/**
|
||||
* Read given path as [FileStorageElement] with given parent. Returns null if path does not belong to storage
|
||||
*/
|
||||
suspend fun read(context: Context, path: Path, parent: StorageElement? = null): FileStorageElement?
|
||||
suspend fun read(
|
||||
context: Context,
|
||||
path: Path,
|
||||
parent: StorageElement? = null,
|
||||
readMeta: Meta? = null,
|
||||
): FileStorageElement?
|
||||
}
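A minimal caller sketch for the extended read signature above, assuming only the API shown in this hunk; the helper name and its parameters (storageType, somePath, knownMeta) are placeholders, not part of the diff.

// Pass a meta that was already resolved elsewhere so the element type does not re-read it from disk.
suspend fun readWithKnownMeta(
    storageType: FileStorageElementType,
    context: Context,
    somePath: Path,
    knownMeta: Meta?,
): FileStorageElement? = storageType.read(context, somePath, parent = null, readMeta = knownMeta)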
|
||||
|
||||
class FileStorage(
|
||||
@ -95,12 +103,12 @@ class FileStorage(
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Creating a watch service or reusing one from parent
|
||||
*/
|
||||
private val watchService: WatchService by lazy {
|
||||
(parent as? FileStorage)?.watchService ?: path.fileSystem.newWatchService()
|
||||
}
|
||||
// /**
|
||||
// * Creating a watch service or reusing one from parent
|
||||
// */
|
||||
// private val watchService: WatchService by lazy {
|
||||
// (parent as? FileStorage)?.watchService ?: path.fileSystem.newWatchService()
|
||||
// }
|
||||
|
||||
//TODO actually watch for file change
|
||||
|
||||
@ -117,15 +125,13 @@ class FileStorage(
|
||||
fun resolveMeta(
|
||||
path: Path,
|
||||
metaReader: (Path) -> Meta? = { EnvelopeType.infer(it)?.reader?.read(it)?.meta },
|
||||
): Meta? {
|
||||
return if (Files.isDirectory(path)) {
|
||||
): Meta? = if (Files.isDirectory(path)) {
|
||||
Files.list(path).asSequence()
|
||||
.find { it.fileName.toString() == "meta.df" || it.fileName.toString() == "meta" }
|
||||
?.let(metaReader)
|
||||
} else {
|
||||
metaReader(path)
|
||||
}
|
||||
}
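A short usage sketch for resolveMeta as it is called later in NumassDirectory; somePath is a placeholder and the default envelope-based reader is assumed.

// For a directory, meta is looked up in a "meta.df"/"meta" child file; for a plain file it is read from its envelope.
val meta: Meta? = FileStorage.resolveMeta(somePath)
val name: String = meta?.optString("name").nullable ?: somePath.fileName.toString()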
|
||||
|
||||
fun createMetaEnvelope(meta: Meta): Envelope {
|
||||
return EnvelopeBuilder().meta(meta).setEnvelopeType(META_ENVELOPE_TYPE).build()
|
||||
@ -167,8 +173,13 @@ class FileStorage(
|
||||
}
|
||||
}
|
||||
|
||||
override suspend fun read(context: Context, path: Path, parent: StorageElement?): FileStorageElement? {
|
||||
val meta = resolveMeta(path)
|
||||
override suspend fun read(
|
||||
context: Context,
|
||||
path: Path,
|
||||
parent: StorageElement?,
|
||||
readMeta: Meta?,
|
||||
): FileStorageElement? {
|
||||
val meta = readMeta ?: resolveMeta(path)
|
||||
val name = meta?.optString("name").nullable ?: path.fileName.toString()
|
||||
val type = meta?.optString("type").nullable?.let {
|
||||
context.load<StorageManager>().getType(it)
|
||||
|
@ -299,7 +299,7 @@ class TableLoaderType : FileStorageElementType {
|
||||
})
|
||||
}
|
||||
|
||||
override suspend fun read(context: Context, path: Path, parent: StorageElement?): FileStorageElement? {
|
||||
override suspend fun read(context: Context, path: Path, parent: StorageElement?, readMeta: Meta?): FileStorageElement {
|
||||
val envelope = EnvelopeReader.readFile(path)
|
||||
|
||||
val name = envelope.meta.optString("name").nullable ?: path.fileName.toString()
|
||||
|
@ -106,13 +106,11 @@ class ProtoNumassPoint(override val meta: Meta, val protoBuilder: () -> NumassPr
|
||||
this.data.stream
|
||||
}
|
||||
|
||||
fun fromEnvelope(envelope: Envelope): ProtoNumassPoint {
|
||||
return ProtoNumassPoint(envelope.meta) {
|
||||
fun fromEnvelope(envelope: Envelope): ProtoNumassPoint = ProtoNumassPoint(envelope.meta) {
|
||||
envelope.dataStream().use {
|
||||
NumassProto.Point.parseFrom(it)
|
||||
}
|
||||
}
|
||||
}
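A usage sketch for the new expression-body fromEnvelope above; the envelope value is assumed to come from EnvelopeReader/NumassEnvelopeType as elsewhere in this diff.

// The protobuf payload is parsed lazily, only when the point data is first accessed.
val point = ProtoNumassPoint.fromEnvelope(envelope)
point.events.forEach { event -> println(event) } // same access pattern as the test main() further down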
|
||||
|
||||
// fun readFile(path: String, context: Context = Global): ProtoNumassPoint {
|
||||
// return readFile(context.getFile(path).absolutePath)
|
||||
|
@ -82,7 +82,7 @@ object NumassDataUtils {
|
||||
}
|
||||
|
||||
fun read(envelope: Envelope): NumassPoint =
|
||||
if (envelope.meta.hasMeta("dpp_params") || envelope.meta.hasMeta("tqdc")) {
|
||||
if (envelope.meta.hasMeta("dpp_params") || envelope.meta.hasMeta("channels") || envelope.meta.hasMeta("tqdc")) {
|
||||
ProtoNumassPoint.fromEnvelope(envelope)
|
||||
} else {
|
||||
ClassicNumassPoint(envelope)
|
||||
|
@ -56,9 +56,9 @@ class NumassDataLoader(
|
||||
override fun getConnectionHelper(): ConnectionHelper = _connectionHelper
|
||||
|
||||
|
||||
override val meta: Meta by lazy {
|
||||
override val meta: Meta get() {
|
||||
val metaPath = path.resolve("meta")
|
||||
NumassEnvelopeType.infer(metaPath)?.reader?.read(metaPath)?.meta ?: Meta.empty()
|
||||
return NumassEnvelopeType.infer(metaPath)?.reader?.read(metaPath)?.meta ?: Meta.empty()
|
||||
}
|
||||
|
||||
override suspend fun getHvData(): Table? {
|
||||
@ -73,7 +73,6 @@ class NumassDataLoader(
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
private val pointEnvelopes: List<Envelope> by lazy {
|
||||
Files.list(path)
|
||||
.filter { it.fileName.toString().startsWith(POINT_FRAGMENT_NAME) }
|
||||
|
@ -15,31 +15,74 @@
|
||||
*/
|
||||
package inr.numass.data.storage
|
||||
|
||||
import hep.dataforge.connections.ConnectionHelper
|
||||
import hep.dataforge.context.Context
|
||||
import hep.dataforge.context.Global
|
||||
import hep.dataforge.events.Event
|
||||
import hep.dataforge.events.EventBuilder
|
||||
import hep.dataforge.io.envelopes.Envelope
|
||||
import hep.dataforge.meta.Meta
|
||||
import hep.dataforge.nullable
|
||||
import hep.dataforge.storage.StorageElement
|
||||
import hep.dataforge.storage.StorageManager
|
||||
import hep.dataforge.storage.files.FileStorage
|
||||
import hep.dataforge.storage.files.FileStorageElement
|
||||
import hep.dataforge.storage.files.FileStorageElementType
|
||||
import inr.numass.data.NumassEnvelopeType
|
||||
import kotlinx.coroutines.runBlocking
|
||||
import java.nio.file.Files
|
||||
import java.nio.file.Path
|
||||
|
||||
class EnvelopeStorageElement(
|
||||
override val context: Context,
|
||||
override val name: String,
|
||||
override val meta: Meta,
|
||||
override val path: Path,
|
||||
override val parent: StorageElement?,
|
||||
) : FileStorageElement {
|
||||
private val _connectionHelper by lazy { ConnectionHelper(this) }
|
||||
|
||||
override fun getConnectionHelper(): ConnectionHelper = _connectionHelper
|
||||
|
||||
val envelope: Envelope? get() = NumassEnvelopeType.infer(path)?.reader?.read(path)
|
||||
}
|
||||
|
||||
|
||||
/**
|
||||
* Numass storage directory. Works as a normal directory, but creates a numass loader from each directory with meta
|
||||
*/
|
||||
class NumassDirectory : FileStorage.Directory() {
|
||||
override val name: String = NUMASS_DIRECTORY_TYPE
|
||||
|
||||
override suspend fun read(context: Context, path: Path, parent: StorageElement?): FileStorageElement? {
|
||||
val meta = FileStorage.resolveMeta(path){ NumassEnvelopeType.infer(it)?.reader?.read(it)?.meta }
|
||||
override suspend fun read(
|
||||
context: Context,
|
||||
path: Path,
|
||||
parent: StorageElement?,
|
||||
readMeta: Meta?,
|
||||
): FileStorageElement? {
|
||||
val meta = readMeta ?: FileStorage.resolveMeta(path) { NumassEnvelopeType.infer(it)?.reader?.read(it)?.meta }
|
||||
return if (Files.isDirectory(path) && meta != null) {
|
||||
NumassDataLoader(context, parent, path.fileName.toString(), path)
|
||||
} else {
|
||||
super.read(context, path, parent)
|
||||
val name = meta?.optString("name").nullable ?: path.fileName.toString()
|
||||
val type = meta?.optString("type").nullable?.let {
|
||||
context.load<StorageManager>().getType(it)
|
||||
} as? FileStorageElementType
|
||||
if (type == null || type is FileStorage.Directory) {
|
||||
// Read path as directory if type not found and path is directory
|
||||
if (Files.isDirectory(path)) {
|
||||
FileStorage(context, name, meta ?: Meta.empty(), path, parent, this)
|
||||
} else {
|
||||
EnvelopeStorageElement(context, name, meta ?: Meta.empty(), path, parent)
|
||||
}
|
||||
} else {
|
||||
//Otherwise, delegate to the type
|
||||
type.read(context, path, parent)
|
||||
}.also {
|
||||
if (it != null && parent == null) {
|
||||
context.load<StorageManager>().register(it)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
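A caller sketch mirroring MainView further below; INSTANCE and the cast to Storage are taken from that code, while the context value and the concrete path are placeholders.

// Directories with a readable meta become NumassDataLoader sets; otherwise a path falls back to a
// nested FileStorage, an EnvelopeStorageElement, or whatever element type its meta declares.
context.launch {
    val storage = NumassDirectory.INSTANCE.read(context, Paths.get("/data/numass/storage")) as? Storage
}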
|
||||
|
@ -10,7 +10,8 @@ fun main() {
|
||||
Global.output = FXOutputManager()
|
||||
JFreeChartPlugin().startGlobal()
|
||||
|
||||
val file = File("C:\\Users\\darksnake\\Desktop\\test-data\\p20211012142003(20s)").toPath()
|
||||
val file = File("D:\\Work\\Numass\\data\\test\\7.df").toPath()
|
||||
println(file)
|
||||
val point = ProtoNumassPoint.readFile(file)
|
||||
|
||||
point.events.forEach {
|
||||
|
@ -19,7 +19,7 @@ application {
|
||||
mainClass.set("inr.numass.viewer.Viewer")
|
||||
}
|
||||
|
||||
version = "0.6.0"
|
||||
version = "0.6.3"
|
||||
|
||||
description = "The viewer for numass data"
|
||||
|
||||
@ -30,6 +30,7 @@ dependencies {
|
||||
}
|
||||
|
||||
val addJvmArgs = listOf(
|
||||
"-XX:+UseZGC",
|
||||
"--add-exports=javafx.graphics/com.sun.glass.ui=ALL-UNNAMED",
|
||||
"--add-opens=javafx.graphics/com.sun.javafx.css=ALL-UNNAMED",
|
||||
"--add-opens=javafx.graphics/com.sun.javafx.scene=ALL-UNNAMED",
|
||||
@ -39,6 +40,10 @@ val addJvmArgs = listOf(
|
||||
"--add-opens=javafx.controls/com.sun.javafx.scene.control.behavior=ALL-UNNAMED",
|
||||
"--add-opens=javafx.controls/javafx.scene.control.skin=ALL-UNNAMED",
|
||||
"--add-exports=javafx.controls/com.sun.javafx.scene.control.inputmap=ALL-UNNAMED",
|
||||
"--add-exports=javafx.controls/com.sun.javafx.scene.control.behavior=ALL-UNNAMED",
|
||||
"--add-exports=javafx.base/com.sun.javafx.event=ALL-UNNAMED",
|
||||
"--add-opens=javafx.controls/javafx.scene.control.skin=ALL-UNNAMED",
|
||||
"--add-opens=javafx.graphics/javafx.scene=ALL-UNNAMED"
|
||||
)
|
||||
|
||||
application {
|
||||
@ -57,8 +62,28 @@ runtime {
|
||||
"javafx.controls"
|
||||
)
|
||||
jpackage {
|
||||
//installerType = "deb"
|
||||
jvmArgs = addJvmArgs
|
||||
//imageOptions = listOf("--linux-deb-maintainer", "nozik.aa@mipt.ru", "--linux-menu-group", "Science")
|
||||
val currentOs = org.gradle.internal.os.OperatingSystem.current()
|
||||
installerOptions = installerOptions + listOf("--vendor", "MIPT-NPM lab")
|
||||
|
||||
if (currentOs.isWindows) {
|
||||
installerOptions = installerOptions + listOf(
|
||||
"--win-menu",
|
||||
"--win-menu-group", "Numass",
|
||||
"--win-dir-chooser",
|
||||
"--win-shortcut"
|
||||
)
|
||||
} else if (currentOs.isLinux) {
|
||||
installerType = "deb"
|
||||
installerOptions = installerOptions + listOf(
|
||||
"--linux-package-name", "numass-viewer",
|
||||
"--linux-shortcut",
|
||||
"--linux-deb-maintainer", "nozik.aa@mipt.ru",
|
||||
"--linux-menu-group", "Science",
|
||||
"--linux-shortcut"
|
||||
)
|
||||
}
|
||||
}
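Assuming the runtime { jpackage { ... } } block above comes from the Badass Runtime Gradle plugin, the branch selects a Windows installer with menu and shortcut options on Windows hosts and a deb package with the listed Linux options on Linux hosts; the installer itself would then be produced by that plugin's jpackage Gradle task (the exact task path depends on the project layout).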
|
||||
launcher {
|
||||
jvmArgs = addJvmArgs
|
||||
|
@ -2,35 +2,31 @@ package inr.numass.viewer
|
||||
|
||||
import hep.dataforge.configure
|
||||
import hep.dataforge.fx.dfIcon
|
||||
import hep.dataforge.fx.except
|
||||
import hep.dataforge.fx.plots.PlotContainer
|
||||
import hep.dataforge.fx.runGoal
|
||||
import hep.dataforge.fx.ui
|
||||
import hep.dataforge.goals.Goal
|
||||
import hep.dataforge.names.Name
|
||||
import hep.dataforge.plots.PlotGroup
|
||||
import hep.dataforge.plots.Plottable
|
||||
import hep.dataforge.plots.data.DataPlot
|
||||
import hep.dataforge.plots.jfreechart.JFreeChartFrame
|
||||
import hep.dataforge.tables.Adapters
|
||||
import inr.numass.data.analyzers.NumassAnalyzer
|
||||
import inr.numass.data.analyzers.withBinning
|
||||
import inr.numass.data.api.NumassPoint
|
||||
import javafx.beans.Observable
|
||||
import javafx.beans.binding.DoubleBinding
|
||||
import javafx.beans.property.SimpleBooleanProperty
|
||||
import javafx.beans.property.SimpleIntegerProperty
|
||||
import javafx.beans.property.SimpleObjectProperty
|
||||
import javafx.collections.FXCollections
|
||||
import javafx.collections.MapChangeListener
|
||||
import javafx.collections.ObservableMap
|
||||
import javafx.scene.control.CheckBox
|
||||
import javafx.scene.control.ChoiceBox
|
||||
import javafx.scene.image.ImageView
|
||||
import kotlinx.coroutines.Dispatchers
|
||||
import kotlinx.coroutines.*
|
||||
import kotlinx.coroutines.javafx.JavaFx
|
||||
import tornadofx.*
|
||||
|
||||
class AmplitudeView : View(title = "Numass amplitude spectrum plot", icon = ImageView(dfIcon)) {
|
||||
|
||||
private val pointCache by inject<PointCache>()
|
||||
private val dataController by inject<DataController>()
|
||||
private val data get() = dataController.points
|
||||
|
||||
private val frame = JFreeChartFrame().configure {
|
||||
"title" to "Detector response plot"
|
||||
@ -54,17 +50,17 @@ class AmplitudeView : View(title = "Numass amplitude spectrum plot", icon = Imag
|
||||
}.setType<DataPlot>()
|
||||
}
|
||||
|
||||
val binningProperty = SimpleObjectProperty(20)
|
||||
var binning by binningProperty
|
||||
val binningProperty = SimpleIntegerProperty(2)
|
||||
var binning: Int by binningProperty
|
||||
|
||||
val normalizeProperty = SimpleBooleanProperty(true)
|
||||
var normalize by normalizeProperty
|
||||
|
||||
|
||||
private val container = PlotContainer(frame).apply {
|
||||
val binningSelector: ChoiceBox<Int> = ChoiceBox(FXCollections.observableArrayList(1, 2, 8, 16, 32, 50)).apply {
|
||||
private val plotContainer = PlotContainer(frame).apply {
|
||||
val binningSelector: ChoiceBox<Int> = ChoiceBox(FXCollections.observableArrayList(1, 2, 8, 16, 32)).apply {
|
||||
minWidth = 0.0
|
||||
selectionModel.selectLast()
|
||||
selectionModel.select(binning as Int?)
|
||||
binningProperty.bind(this.selectionModel.selectedItemProperty())
|
||||
}
|
||||
val normalizeSwitch: CheckBox = CheckBox("Normalize").apply {
|
||||
@ -74,61 +70,48 @@ class AmplitudeView : View(title = "Numass amplitude spectrum plot", icon = Imag
|
||||
addToSideBar(0, binningSelector, normalizeSwitch)
|
||||
}
|
||||
|
||||
private val data: ObservableMap<String, NumassPoint> = FXCollections.observableHashMap()
|
||||
private val plots: ObservableMap<String, Goal<Plottable>> = FXCollections.observableHashMap()
|
||||
private val plotJobs: ObservableMap<String, Job> = FXCollections.observableHashMap()
|
||||
|
||||
val isEmpty = booleanBinding(data) { isEmpty() }
|
||||
|
||||
private val progress = object : DoubleBinding() {
|
||||
init {
|
||||
bind(plots)
|
||||
}
|
||||
|
||||
override fun computeValue(): Double {
|
||||
return plots.values.count { it.isDone }.toDouble() / data.size;
|
||||
bind(plotJobs)
|
||||
}
|
||||
|
||||
override fun computeValue(): Double = plotJobs.values.count { it.isCompleted }.toDouble() / plotJobs.size
|
||||
}
|
||||
|
||||
init {
|
||||
data.addListener { _: Observable ->
|
||||
invalidate()
|
||||
data.addListener(MapChangeListener { change ->
|
||||
val key = change.key.toString()
|
||||
if (change.wasAdded()) {
|
||||
replotOne(key, change.valueAdded)
|
||||
} else if (change.wasRemoved()) {
|
||||
plotJobs[key]?.cancel()
|
||||
plotJobs.remove(key)
|
||||
frame.plots.remove(Name.ofSingle(key))
|
||||
progress.invalidate()
|
||||
}
|
||||
})
|
||||
|
||||
binningProperty.onChange {
|
||||
frame.plots.clear()
|
||||
plots.clear()
|
||||
invalidate()
|
||||
replot()
|
||||
}
|
||||
|
||||
normalizeProperty.onChange {
|
||||
frame.plots.clear()
|
||||
plots.clear()
|
||||
invalidate()
|
||||
replot()
|
||||
}
|
||||
|
||||
container.progressProperty.bind(progress)
|
||||
plotContainer.progressProperty.bind(progress)
|
||||
}
|
||||
|
||||
override val root = borderpane {
|
||||
center = container.root
|
||||
private fun replotOne(key: String, point: DataController.CachedPoint) {
|
||||
plotJobs[key]?.cancel()
|
||||
frame.plots.remove(Name.ofSingle(key))
|
||||
plotJobs[key] = app.context.launch {
|
||||
withContext(Dispatchers.JavaFx) {
|
||||
progress.invalidate()
|
||||
}
|
||||
|
||||
/**
|
||||
* Put or replace current plot with name `key`
|
||||
*/
|
||||
operator fun set(key: String, point: NumassPoint) {
|
||||
data[key] = point
|
||||
}
|
||||
|
||||
fun addAll(data: Map<String, NumassPoint>) {
|
||||
this.data.putAll(data);
|
||||
}
|
||||
|
||||
private fun invalidate() {
|
||||
data.forEach { (key, point) ->
|
||||
plots.getOrPut(key) {
|
||||
runGoal<Plottable>(app.context, "loadAmplitudeSpectrum_$key", Dispatchers.IO) {
|
||||
val valueAxis = if (normalize) {
|
||||
NumassAnalyzer.COUNT_RATE_KEY
|
||||
} else {
|
||||
@ -136,9 +119,9 @@ class AmplitudeView : View(title = "Numass amplitude spectrum plot", icon = Imag
|
||||
}
|
||||
val adapter = Adapters.buildXYAdapter(NumassAnalyzer.CHANNEL_KEY, valueAxis)
|
||||
|
||||
val channels = pointCache.getChannelSpectra(key, point)
|
||||
val channels = point.channelSpectra.await()
|
||||
|
||||
return@runGoal if (channels.size == 1) {
|
||||
val plot = if (channels.size == 1) {
|
||||
DataPlot.plot(
|
||||
key,
|
||||
channels.values.first().withBinning(binning),
|
||||
@ -146,7 +129,7 @@ class AmplitudeView : View(title = "Numass amplitude spectrum plot", icon = Imag
|
||||
)
|
||||
} else {
|
||||
val group = PlotGroup.typed<DataPlot>(key)
|
||||
channels.forEach { key, spectrum ->
|
||||
channels.forEach { (key, spectrum) ->
|
||||
val plot = DataPlot.plot(
|
||||
key.toString(),
|
||||
spectrum.withBinning(binning),
|
||||
@ -156,47 +139,29 @@ class AmplitudeView : View(title = "Numass amplitude spectrum plot", icon = Imag
|
||||
}
|
||||
group
|
||||
}
|
||||
} ui { plot ->
|
||||
withContext(Dispatchers.JavaFx) {
|
||||
frame.add(plot)
|
||||
progress.invalidate()
|
||||
} except {
|
||||
}
|
||||
}.apply {
|
||||
invokeOnCompletion {
|
||||
runLater{
|
||||
progress.invalidate()
|
||||
}
|
||||
}
|
||||
plots.keys.filter { !data.containsKey(it) }.forEach { remove(it) }
|
||||
}
|
||||
}
|
||||
|
||||
fun clear() {
|
||||
data.clear()
|
||||
plots.values.forEach{
|
||||
it.cancel()
|
||||
private fun replot() {
|
||||
frame.plots.clear()
|
||||
plotJobs.forEach { (_, job) -> job.cancel() }
|
||||
plotJobs.clear()
|
||||
|
||||
data.forEach { (key, point) ->
|
||||
replotOne(key.toString(), point)
|
||||
}
|
||||
plots.clear()
|
||||
invalidate()
|
||||
}
|
||||
|
||||
/**
|
||||
* Remove the plot and cancel loading task if it is in progress.
|
||||
*/
|
||||
fun remove(name: String) {
|
||||
frame.plots.remove(Name.ofSingle(name))
|
||||
plots[name]?.cancel()
|
||||
plots.remove(name)
|
||||
data.remove(name)
|
||||
progress.invalidate()
|
||||
override val root = borderpane {
|
||||
center = plotContainer.root
|
||||
}
|
||||
|
||||
/**
|
||||
* Set frame content to the given map. All keys not in the map are removed.
|
||||
*/
|
||||
fun setAll(map: Map<String, NumassPoint>) {
|
||||
plots.clear();
|
||||
//Remove obsolete keys
|
||||
data.keys.filter { !map.containsKey(it) }.forEach {
|
||||
remove(it)
|
||||
}
|
||||
this.addAll(map);
|
||||
}
|
||||
|
||||
}
|
||||
|
@ -0,0 +1,176 @@
|
||||
package inr.numass.viewer
|
||||
|
||||
import hep.dataforge.context.ContextAware
|
||||
import hep.dataforge.meta.Meta
|
||||
import hep.dataforge.names.Name
|
||||
import hep.dataforge.storage.tables.TableLoader
|
||||
import hep.dataforge.tables.Adapters
|
||||
import hep.dataforge.tables.ListTable
|
||||
import hep.dataforge.tables.Table
|
||||
import hep.dataforge.tables.TableFormatBuilder
|
||||
import hep.dataforge.utils.Misc
|
||||
import hep.dataforge.values.ValueMap
|
||||
import inr.numass.data.analyzers.NumassAnalyzer
|
||||
import inr.numass.data.analyzers.TimeAnalyzer
|
||||
import inr.numass.data.api.NumassPoint
|
||||
import inr.numass.data.api.NumassSet
|
||||
import inr.numass.data.storage.NumassDataLoader
|
||||
import javafx.beans.property.SimpleObjectProperty
|
||||
import javafx.collections.FXCollections
|
||||
import javafx.collections.ObservableList
|
||||
import javafx.collections.ObservableMap
|
||||
import kotlinx.coroutines.*
|
||||
import tornadofx.*
|
||||
import java.nio.file.*
|
||||
import java.nio.file.attribute.BasicFileAttributes
|
||||
import kotlin.math.floor
|
||||
|
||||
|
||||
class DataController : Controller(), ContextAware {
|
||||
override val context get() = app.context
|
||||
|
||||
val analyzer = TimeAnalyzer()
|
||||
|
||||
inner class CachedPoint(point: NumassPoint) {
|
||||
val length = point.length
|
||||
|
||||
val voltage = point.voltage
|
||||
|
||||
val index = point.index
|
||||
|
||||
val meta = point.meta
|
||||
|
||||
val channelSpectra: Deferred<Map<Int, Table>> = context.async(start = CoroutineStart.LAZY) {
|
||||
point.channels.mapValues { (_, value) -> analyzer.getAmplitudeSpectrum(value) }
|
||||
}
|
||||
|
||||
val spectrum: Deferred<Table> = context.async(start = CoroutineStart.LAZY){
|
||||
analyzer.getAmplitudeSpectrum(point)
|
||||
}
|
||||
|
||||
val timeSpectrum: Deferred<Table> = context.async(start = CoroutineStart.LAZY) {
|
||||
val cr = spectrum.await().sumOf {
|
||||
it.getValue(NumassAnalyzer.COUNT_KEY).int
|
||||
}.toDouble() / point.length.toMillis() * 1000
|
||||
|
||||
val binNum = 200
|
||||
//inputMeta.getInt("binNum", 1000);
|
||||
val binSize = 1.0 / cr * 10 / binNum * 1e6
|
||||
//inputMeta.getDouble("binSize", 1.0 / cr * 10 / binNum * 1e6)
|
||||
|
||||
val format = TableFormatBuilder()
|
||||
.addNumber("x", Adapters.X_VALUE_KEY)
|
||||
.addNumber(NumassAnalyzer.COUNT_KEY, Adapters.Y_VALUE_KEY)
|
||||
.build()
|
||||
|
||||
ListTable.Builder(format).rows(
|
||||
analyzer.getEventsWithDelay(point, Meta.empty())
|
||||
.map { it.second.toDouble() / 1000.0 }
|
||||
.groupBy { floor(it / binSize) }
|
||||
.toSortedMap()
|
||||
.map {
|
||||
ValueMap.ofPairs("x" to it.key, "count" to it.value.count())
|
||||
}
|
||||
).build()
|
||||
}
|
||||
}
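A usage sketch for the lazily computed spectra above; dataController, somePoint and the point name are placeholders. With cr counted events per second, binSize = 1.0 / cr * 10 / binNum * 1e6 appears to spread about ten mean inter-event intervals over binNum = 200 bins, expressed in microseconds.

// Both spectra are lazily started Deferred values: nothing is computed until the first await(),
// and later calls reuse the cached result for the same CachedPoint.
app.context.launch {
    val cached = dataController.getCachedPoint(Name.ofSingle("p1"), somePoint)
    val amplitude: Table = cached.spectrum.await()
    val perChannel: Map<Int, Table> = cached.channelSpectra.await()
}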
|
||||
|
||||
private val cache = Misc.getLRUCache<Name, CachedPoint>(400)
|
||||
|
||||
fun getCachedPoint(id: Name, point: NumassPoint): CachedPoint = cache.getOrPut(id) { CachedPoint(point) }
|
||||
|
||||
fun getSpectrumAsync(id: Name, point: NumassPoint): Deferred<Table> =
|
||||
getCachedPoint(id, point).spectrum
|
||||
|
||||
suspend fun getChannelSpectra(id: Name, point: NumassPoint): Map<Int, Table> =
|
||||
getCachedPoint(id, point).channelSpectra.await()
|
||||
|
||||
val sets: ObservableMap<Name, NumassSet> = FXCollections.observableHashMap()
|
||||
val points: ObservableMap<Name, CachedPoint> = FXCollections.observableHashMap()
|
||||
val sc: ObservableMap<Name, TableLoader> = FXCollections.observableHashMap()
|
||||
|
||||
val files: ObservableList<Path> = FXCollections.observableArrayList()
|
||||
|
||||
val watchPathProperty = SimpleObjectProperty<Path?>()
|
||||
|
||||
private var watchJob: Job? = null
|
||||
|
||||
init {
|
||||
watchPathProperty.onChange { watchPath ->
|
||||
watchJob?.cancel()
|
||||
if (watchPath != null) {
|
||||
Files.list(watchPath).toList()
|
||||
.filter {
|
||||
!Files.isDirectory(it) && it.fileName.toString()
|
||||
.startsWith(NumassDataLoader.POINT_FRAGMENT_NAME)
|
||||
}
|
||||
.sortedBy { file ->
|
||||
val attr = Files.readAttributes(file, BasicFileAttributes::class.java)
|
||||
attr.creationTime()
|
||||
}.forEach { path ->
|
||||
try {
|
||||
runLater {
|
||||
files.add(path)
|
||||
}
|
||||
} catch (x: Throwable) {
|
||||
app.context.logger.error("Error during dynamic point read", x)
|
||||
}
|
||||
}
|
||||
val watcher = watchPath.fileSystem.newWatchService()
|
||||
watchJob = app.context.launch(Dispatchers.IO) {
|
||||
watcher.use { watcher ->
|
||||
watchPath.register(watcher, StandardWatchEventKinds.ENTRY_CREATE)
|
||||
while (isActive) {
|
||||
val key: WatchKey = watcher.take()
|
||||
for (event: WatchEvent<*> in key.pollEvents()) {
|
||||
if (event.kind() == StandardWatchEventKinds.ENTRY_CREATE) {
|
||||
val path: Path = event.context() as Path
|
||||
runLater {
|
||||
files.add(watchPath.resolve(path))
|
||||
}
|
||||
}
|
||||
}
|
||||
key.reset()
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
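A sketch of driving the watcher set up above; the directory path is a placeholder.

// Setting the watch path first lists existing point fragments (files whose names start with
// NumassDataLoader.POINT_FRAGMENT_NAME, sorted by creation time) and then appends files reported
// by ENTRY_CREATE events; clear() resets the path, which cancels the watch job.
dataController.watchPathProperty.set(Paths.get("/data/numass/current-acquisition"))
// ...
dataController.clear()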
|
||||
|
||||
fun clear() {
|
||||
cache.clear()
|
||||
sets.clear()
|
||||
points.clear()
|
||||
sc.clear()
|
||||
watchPathProperty.set(null)
|
||||
}
|
||||
|
||||
|
||||
fun addPoint(id: Name, point: NumassPoint): CachedPoint {
|
||||
val newPoint = getCachedPoint(id, point)
|
||||
points[id] = newPoint
|
||||
return newPoint
|
||||
}
|
||||
|
||||
fun addSet(id: Name, set: NumassSet) {
|
||||
sets[id] = set
|
||||
}
|
||||
|
||||
fun addSc(id: Name, set: TableLoader) {
|
||||
sc[id] = set
|
||||
}
|
||||
|
||||
fun remove(id: Name) {
|
||||
points.remove(id)
|
||||
sets.remove(id)
|
||||
sc.remove(id)
|
||||
}
|
||||
|
||||
//
|
||||
// fun addAllPoints(points: Map<String, NumassPoint>) {
|
||||
// TODO()
|
||||
// }
|
||||
|
||||
|
||||
}
|
@ -0,0 +1,93 @@
|
||||
package inr.numass.viewer
|
||||
|
||||
import hep.dataforge.asName
|
||||
import hep.dataforge.fx.dfIconView
|
||||
import hep.dataforge.io.envelopes.Envelope
|
||||
import hep.dataforge.names.AlphanumComparator
|
||||
import hep.dataforge.names.Name
|
||||
import inr.numass.data.NumassDataUtils
|
||||
import inr.numass.data.NumassEnvelopeType
|
||||
import inr.numass.data.api.NumassPoint
|
||||
import inr.numass.data.storage.NumassDataLoader
|
||||
import kotlinx.coroutines.launch
|
||||
import tornadofx.*
|
||||
import java.nio.file.Path
|
||||
|
||||
class DirectoryWatchView : View(title = "Numass storage", icon = dfIconView) {
|
||||
|
||||
private val dataController by inject<DataController>()
|
||||
|
||||
private val ampView: AmplitudeView by inject()
|
||||
private val timeView: TimeView by inject()
|
||||
|
||||
// private val files: ObservableList<DataController.CachedPoint> =
|
||||
// FXCollections.observableArrayList<DataController.CachedPoint>().apply {
|
||||
// bind(dataController.points) { _, v -> v }
|
||||
// }
|
||||
|
||||
private fun readPointFile(path: Path): NumassPoint {
|
||||
val envelope: Envelope = NumassEnvelopeType.infer(path)?.reader?.read(path)
|
||||
?: kotlin.error("Can't read point file")
|
||||
return NumassDataUtils.read(envelope)
|
||||
}
|
||||
|
||||
//private class PointContainer(val path: Path, val checkedProperty: BooleanProperty = SimpleBooleanProperty())
|
||||
|
||||
override val root = splitpane {
|
||||
listview(dataController.files.sorted { l, r -> AlphanumComparator.compare(l.toString(), r.toString()) }) {
|
||||
cellFormat { path: Path ->
|
||||
if (path.fileName.toString().startsWith(NumassDataLoader.POINT_FRAGMENT_NAME)) {
|
||||
val name = Name.of(path.map { it.toString().asName() })
|
||||
text = null
|
||||
graphic = checkbox(path.fileName.toString()).apply {
|
||||
isSelected = dataController.points.containsKey(name)
|
||||
selectedProperty().onChange {
|
||||
if (it) {
|
||||
app.context.launch {
|
||||
dataController.addPoint(name, readPointFile(path))
|
||||
}
|
||||
} else {
|
||||
dataController.remove(name)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// app.context.launch {
|
||||
// val point = readPointFile(path)
|
||||
// val cachedPoint = dataController.addPoint(path.toString().asName(), point)
|
||||
//
|
||||
// //val point = dataController.getCachedPoint(value.toString())
|
||||
// withContext(Dispatchers.JavaFx) {
|
||||
// contextMenu = ContextMenu().apply {
|
||||
// item("Info") {
|
||||
// action {
|
||||
// PointInfoView(cachedPoint).openModal(escapeClosesWindow = true)
|
||||
// }
|
||||
// }
|
||||
// }
|
||||
// }
|
||||
// }
|
||||
|
||||
} else {
|
||||
text = path.fileName.toString()
|
||||
graphic = null
|
||||
contextMenu = null
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
tabpane {
|
||||
tab("Amplitude spectra") {
|
||||
content = ampView.root
|
||||
isClosable = false
|
||||
//visibleWhen(ampView.isEmpty.not())
|
||||
}
|
||||
tab("Time spectra") {
|
||||
content = timeView.root
|
||||
isClosable = false
|
||||
//visibleWhen(ampView.isEmpty.not())
|
||||
}
|
||||
}
|
||||
setDividerPosition(0, 0.3);
|
||||
}
|
||||
}
|
@ -11,11 +11,8 @@ import hep.dataforge.plots.data.TimePlot
|
||||
import hep.dataforge.plots.jfreechart.JFreeChartFrame
|
||||
import hep.dataforge.tables.Adapters
|
||||
import inr.numass.data.api.NumassSet
|
||||
import javafx.collections.FXCollections
|
||||
import javafx.collections.MapChangeListener
|
||||
import javafx.collections.ObservableMap
|
||||
import javafx.scene.image.ImageView
|
||||
import kotlinx.coroutines.Dispatchers
|
||||
import tornadofx.*
|
||||
|
||||
|
||||
@ -24,6 +21,9 @@ import tornadofx.*
|
||||
*/
|
||||
class HVView : View(title = "High voltage time plot", icon = ImageView(dfIcon)) {
|
||||
|
||||
private val dataController by inject<DataController>()
|
||||
private val data get() = dataController.sets
|
||||
|
||||
private val frame = JFreeChartFrame().configure {
|
||||
"xAxis.title" to "time"
|
||||
"xAxis.type" to "time"
|
||||
@ -44,23 +44,22 @@ class HVView : View(title = "High voltage time plot", icon = ImageView(dfIcon))
|
||||
center = PlotContainer(frame).root
|
||||
}
|
||||
|
||||
private val data: ObservableMap<String, NumassSet> = FXCollections.observableHashMap()
|
||||
val isEmpty = booleanBinding(data) { data.isEmpty() }
|
||||
|
||||
init {
|
||||
data.addListener { change: MapChangeListener.Change<out String, out NumassSet> ->
|
||||
data.addListener { change: MapChangeListener.Change<out Name, out NumassSet> ->
|
||||
isEmpty.invalidate()
|
||||
if (change.wasRemoved()) {
|
||||
frame.plots.remove(Name.ofSingle(change.key))
|
||||
frame.plots.remove(change.key)
|
||||
}
|
||||
if (change.wasAdded()) {
|
||||
runLater { container.progress = -1.0 }
|
||||
runGoal(app.context,"hvData[${change.key}]", Dispatchers.IO) {
|
||||
runGoal(app.context,"hvData[${change.key}]") {
|
||||
change.valueAdded.getHvData()
|
||||
} ui { table ->
|
||||
if (table != null) {
|
||||
((frame[change.key] as? DataPlot)
|
||||
?: DataPlot(change.key, adapter = Adapters.buildXYAdapter("timestamp", "value")).also { frame.add(it) })
|
||||
((frame[change.key.toString()] as? DataPlot)
|
||||
?: DataPlot(change.key.toString(), adapter = Adapters.buildXYAdapter("timestamp", "value")).also { frame.add(it) })
|
||||
.fillData(table)
|
||||
}
|
||||
|
||||
@ -71,18 +70,4 @@ class HVView : View(title = "High voltage time plot", icon = ImageView(dfIcon))
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
operator fun set(id: String, set: NumassSet) {
|
||||
data[id] = set
|
||||
}
|
||||
|
||||
fun remove(id: String) {
|
||||
data.remove(id);
|
||||
}
|
||||
|
||||
fun clear() {
|
||||
data.clear()
|
||||
}
|
||||
|
||||
|
||||
}
|
||||
|
@ -1,5 +1,6 @@
|
||||
package inr.numass.viewer
|
||||
|
||||
import hep.dataforge.asName
|
||||
import hep.dataforge.fx.dfIconView
|
||||
import hep.dataforge.fx.except
|
||||
import hep.dataforge.fx.runGoal
|
||||
@ -29,7 +30,7 @@ import java.nio.file.Path
|
||||
|
||||
class MainView : View(title = "Numass viewer", icon = dfIconView) {
|
||||
|
||||
private val pointCache by inject<PointCache>()
|
||||
private val dataController by inject<DataController>()
|
||||
|
||||
val storageView by inject<StorageView>()
|
||||
|
||||
@ -38,16 +39,123 @@ class MainView : View(title = "Numass viewer", icon = dfIconView) {
|
||||
// addLogHandler(context.logger)
|
||||
// }
|
||||
|
||||
private val pathProperty = SimpleObjectProperty<Path>()
|
||||
private var path: Path by pathProperty
|
||||
private val pathProperty = SimpleObjectProperty<Path?>()
|
||||
|
||||
private val contentViewProperty = SimpleObjectProperty<UIComponent>()
|
||||
var contentView: UIComponent? by contentViewProperty
|
||||
private var contentView: UIComponent? by contentViewProperty
|
||||
private val spectrumView by inject<SpectrumView>()
|
||||
private val amplitudeView by inject<AmplitudeView>()
|
||||
private val directoryWatchView by inject<DirectoryWatchView>()
|
||||
|
||||
init {
|
||||
contentViewProperty.onChange {
|
||||
root.center = it?.root
|
||||
}
|
||||
}
|
||||
|
||||
private fun loadDirectory(path: Path){
|
||||
app.context.launch {
|
||||
dataController.clear()
|
||||
runLater {
|
||||
pathProperty.set(path)
|
||||
contentView = null
|
||||
}
|
||||
if (Files.exists(path.resolve(NumassDataLoader.META_FRAGMENT_NAME))) {
|
||||
//build set view
|
||||
runGoal(app.context, "viewer.load.set[$path]") {
|
||||
title = "Load set ($path)"
|
||||
message = "Building numass set..."
|
||||
NumassDataLoader(app.context, null, path.fileName.toString(), path)
|
||||
} ui { loader: NumassDataLoader ->
|
||||
contentView = spectrumView
|
||||
dataController.addSet(loader.name.asName(), loader)
|
||||
|
||||
} except {
|
||||
alert(
|
||||
type = Alert.AlertType.ERROR,
|
||||
header = "Error during set loading",
|
||||
content = it.toString()
|
||||
).show()
|
||||
}
|
||||
} else {
|
||||
//build storage
|
||||
app.context.launch {
|
||||
val storageElement =
|
||||
NumassDirectory.INSTANCE.read(app.context, path) as? Storage
|
||||
withContext(Dispatchers.JavaFx) {
|
||||
contentView = storageView
|
||||
storageView.storageProperty.set(storageElement)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private fun loadFile(path: Path){
|
||||
app.context.launch {
|
||||
dataController.clear()
|
||||
runLater {
|
||||
pathProperty.set(path)
|
||||
contentView = null
|
||||
}
|
||||
//Reading individual file
|
||||
val envelope = try {
|
||||
NumassFileEnvelope(path)
|
||||
} catch (ex: Exception) {
|
||||
runLater {
|
||||
alert(
|
||||
type = Alert.AlertType.ERROR,
|
||||
header = "Can't load DF envelope from file $path",
|
||||
content = ex.toString()
|
||||
).show()
|
||||
}
|
||||
null
|
||||
}
|
||||
|
||||
envelope?.let {
|
||||
//try to read as point
|
||||
val point = NumassDataUtils.read(it)
|
||||
runLater {
|
||||
contentView = amplitudeView
|
||||
dataController.addPoint(path.fileName.toString().asName(), point)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private fun watchDirectory(path: Path){
|
||||
app.context.launch {
|
||||
dataController.clear()
|
||||
runLater {
|
||||
pathProperty.set(path)
|
||||
contentView = directoryWatchView
|
||||
dataController.watchPathProperty.set(path)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
override val root = borderpane {
|
||||
prefHeight = 600.0
|
||||
prefWidth = 800.0
|
||||
top {
|
||||
//bypass top configuration bar and only watch directory
|
||||
app.parameters.named["directory"]?.let{ pathString ->
|
||||
val path = Path.of(pathString).toAbsolutePath()
|
||||
watchDirectory(path)
|
||||
toolbar{
|
||||
prefHeight = 40.0
|
||||
label("Watching $path") {
|
||||
padding = Insets(0.0, 0.0, 0.0, 10.0)
|
||||
font = Font.font("System Bold", 13.0)
|
||||
}
|
||||
pane {
|
||||
hgrow = Priority.ALWAYS
|
||||
}
|
||||
}
|
||||
return@top
|
||||
}
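Assuming the named parameters here are standard JavaFX application parameters (--key=value), this branch lets the viewer start directly in watch mode, for example by launching it as numass-viewer --directory=/data/numass/current-acquisition (the launcher name and path are placeholders); the toolbar with the load buttons below is then skipped entirely.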
|
||||
|
||||
toolbar {
|
||||
prefHeight = 40.0
|
||||
button("Load directory") {
|
||||
@ -71,12 +179,7 @@ class MainView : View(title = "Numass viewer", icon = dfIconView) {
|
||||
|
||||
if (rootDir != null) {
|
||||
NumassProperties.setNumassProperty("numass.viewer.lastPath", rootDir.absolutePath)
|
||||
app.context.launch {
|
||||
runLater {
|
||||
path = rootDir.toPath()
|
||||
}
|
||||
load(rootDir.toPath())
|
||||
}
|
||||
loadDirectory(rootDir.toPath())
|
||||
}
|
||||
} catch (ex: Exception) {
|
||||
NumassProperties.setNumassProperty("numass.viewer.lastPath", null)
|
||||
@ -101,13 +204,38 @@ class MainView : View(title = "Numass viewer", icon = dfIconView) {
|
||||
if (file != null) {
|
||||
NumassProperties.setNumassProperty("numass.viewer.lastPath",
|
||||
file.parentFile.absolutePath)
|
||||
app.context.launch {
|
||||
runLater {
|
||||
path = file.toPath()
|
||||
loadFile(file.toPath())
|
||||
}
|
||||
load(file.toPath())
|
||||
} catch (ex: Exception) {
|
||||
NumassProperties.setNumassProperty("numass.viewer.lastPath", null)
|
||||
error("Error", content = "Failed to laod file with message: ${ex.message}")
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
button("Watch directory") {
|
||||
action {
|
||||
val chooser = DirectoryChooser()
|
||||
chooser.title = "Select directory to watch"
|
||||
val homeDir = NumassProperties.getNumassProperty("numass.viewer.lastPath")
|
||||
try {
|
||||
if (homeDir == null) {
|
||||
chooser.initialDirectory = File(".").absoluteFile
|
||||
} else {
|
||||
val file = File(homeDir)
|
||||
if (file.isDirectory) {
|
||||
chooser.initialDirectory = file
|
||||
} else {
|
||||
chooser.initialDirectory = file.parentFile
|
||||
}
|
||||
}
|
||||
|
||||
val dir = chooser.showDialog(primaryStage.scene.window)
|
||||
|
||||
if (dir != null) {
|
||||
NumassProperties.setNumassProperty("numass.viewer.lastPath", dir.absolutePath)
|
||||
watchDirectory(dir.toPath())
|
||||
}
|
||||
} catch (ex: Exception) {
|
||||
NumassProperties.setNumassProperty("numass.viewer.lastPath", null)
|
||||
error("Error", content = "Failed to laod file with message: ${ex.message}")
|
||||
@ -131,71 +259,4 @@ class MainView : View(title = "Numass viewer", icon = dfIconView) {
|
||||
bottom = statusBar
|
||||
}
|
||||
|
||||
init {
|
||||
contentViewProperty.onChange {
|
||||
root.center = it?.root
|
||||
}
|
||||
}
|
||||
|
||||
private suspend fun load(path: Path) {
|
||||
runLater {
|
||||
contentView = null
|
||||
}
|
||||
pointCache.clear()
|
||||
if (Files.isDirectory(path)) {
|
||||
if (Files.exists(path.resolve(NumassDataLoader.META_FRAGMENT_NAME))) {
|
||||
//build set view
|
||||
runGoal(app.context, "viewer.load.set[$path]", Dispatchers.IO) {
|
||||
title = "Load set ($path)"
|
||||
message = "Building numass set..."
|
||||
NumassDataLoader(app.context, null, path.fileName.toString(), path)
|
||||
} ui { loader: NumassDataLoader ->
|
||||
contentView = SpectrumView().apply {
|
||||
clear()
|
||||
set(loader.name, loader)
|
||||
}
|
||||
} except {
|
||||
alert(
|
||||
type = Alert.AlertType.ERROR,
|
||||
header = "Error during set loading",
|
||||
content = it.toString()
|
||||
).show()
|
||||
}
|
||||
} else {
|
||||
//build storage
|
||||
app.context.launch {
|
||||
val storageElement = NumassDirectory.INSTANCE.read(app.context, path) as Storage
|
||||
withContext(Dispatchers.JavaFx) {
|
||||
contentView = storageView
|
||||
storageView.storageProperty.set(storageElement)
|
||||
}
|
||||
}
|
||||
}
|
||||
} else {
|
||||
//Reading individual file
|
||||
val envelope = try {
|
||||
NumassFileEnvelope(path)
|
||||
} catch (ex: Exception) {
|
||||
runLater {
|
||||
alert(
|
||||
type = Alert.AlertType.ERROR,
|
||||
header = "Can't load DF envelope from file $path",
|
||||
content = ex.toString()
|
||||
).show()
|
||||
}
|
||||
null
|
||||
}
|
||||
|
||||
envelope?.let {
|
||||
//try to read as point
|
||||
val point = NumassDataUtils.read(it)
|
||||
runLater {
|
||||
contentView = AmplitudeView().apply {
|
||||
set(path.fileName.toString(), point)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
@@ -1,64 +0,0 @@
/*
 * Copyright 2018 Alexander Nozik.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package inr.numass.viewer

import hep.dataforge.tables.Table
import hep.dataforge.utils.Misc
import inr.numass.data.analyzers.SimpleAnalyzer
import inr.numass.data.api.NumassPoint
import kotlinx.coroutines.Deferred
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.async
import tornadofx.*


private val analyzer = SimpleAnalyzer()


class PointCache : Controller() {
    private val context = app.context

    inner class CachedPoint(point: NumassPoint) {
        val length = point.length

        val voltage = point.voltage

        val meta = point.meta

        val channelSpectra: Deferred<Map<Int, Table>> = context.async(Dispatchers.IO) {
            point.channels.mapValues { (_, value) -> analyzer.getAmplitudeSpectrum(value) }
        }

        val spectrum: Deferred<Table> = context.async(Dispatchers.IO) {
            analyzer.getAmplitudeSpectrum(point)
        }
    }

    private val cache = Misc.getLRUCache<String, CachedPoint>(400)

    fun getCachedPoint(id: String, point: NumassPoint): CachedPoint = cache.getOrPut(id) { CachedPoint(point) }

    fun getSpectrumAsync(id: String, point: NumassPoint): Deferred<Table> =
        getCachedPoint(id, point).spectrum

    suspend fun getChannelSpectra(id: String, point: NumassPoint): Map<Int, Table> =
        getCachedPoint(id, point).channelSpectra.await()

    fun clear() {
        cache.clear()
    }
}
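For orientation, a minimal usage sketch of the PointCache API deleted above (an editorial illustration, not part of the changeset; the point id string and the calling context are hypothetical). Repeated lookups with the same id reuse one memoized CachedPoint, so each amplitude spectrum is analyzed at most once on Dispatchers.IO and can be awaited wherever it is needed:

// Hypothetical sketch, not repository code: `cache` and `point` are assumed to be
// supplied by the surrounding view, and the id follows the "set/voltage[index]" pattern.
import hep.dataforge.tables.Table
import inr.numass.data.api.NumassPoint

suspend fun awaitSpectrum(cache: PointCache, point: NumassPoint): Table {
    val cached = cache.getCachedPoint("set_8/14000.0[3]", point) // memoized in the LRU cache
    return cached.spectrum.await()                               // suspends until the IO-dispatched analysis completes
}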
@@ -9,7 +9,7 @@ import tornadofx.*
import tornadofx.controlsfx.borders
import tornadofx.controlsfx.toGlyph

class PointInfoView(val cachedPoint: PointCache.CachedPoint) : MetaViewer(cachedPoint.meta) {
class PointInfoView(val cachedPoint: DataController.CachedPoint) : MetaViewer(cachedPoint.meta) {
    val countProperty = SimpleIntegerProperty(0)
    var count by countProperty

@@ -5,18 +5,15 @@ import hep.dataforge.fx.dfIcon
import hep.dataforge.fx.plots.PlotContainer
import hep.dataforge.fx.runGoal
import hep.dataforge.fx.ui
import hep.dataforge.names.Name
import hep.dataforge.plots.PlotGroup
import hep.dataforge.plots.data.DataPlot
import hep.dataforge.plots.jfreechart.JFreeChartFrame
import hep.dataforge.storage.tables.TableLoader
import hep.dataforge.storage.tables.asTable
import hep.dataforge.tables.Adapters
import hep.dataforge.tables.Table
import javafx.collections.FXCollections
import javafx.collections.MapChangeListener
import javafx.collections.ObservableMap
import javafx.scene.image.ImageView
import kotlinx.coroutines.Dispatchers
import tornadofx.*

/**
@@ -24,6 +21,9 @@ import tornadofx.*
 */
class SlowControlView : View(title = "Numass slow control view", icon = ImageView(dfIcon)) {

    private val dataController by inject<DataController>()
    private val data get() = dataController.sc

    private val plot = JFreeChartFrame().configure {
        "xAxis.type" to "time"
        "yAxis.type" to "log"
@@ -33,22 +33,21 @@ class SlowControlView : View(title = "Numass slow control view", icon = ImageVie
        center = PlotContainer(plot).root
    }

    val data: ObservableMap<String, TableLoader> = FXCollections.observableHashMap();
    val isEmpty = booleanBinding(data) {
        data.isEmpty()
    }

    init {
        data.addListener { change: MapChangeListener.Change<out String, out TableLoader> ->
        data.addListener { change: MapChangeListener.Change<out Name, out TableLoader> ->
            if (change.wasRemoved()) {
                plot.remove(change.key)
                plot.remove(change.key.toString())
            }
            if (change.wasAdded()) {
                runGoal(app.context, "loadTable[${change.key}]", Dispatchers.IO) {
                    val plotData = getData(change.valueAdded)
                runGoal(app.context, "loadTable[${change.key}]") {
                    val plotData = change.valueAdded.asTable().await()
                    val names = plotData.format.namesAsArray().filter { it != "timestamp" }

                    val group = PlotGroup(change.key)
                    val group = PlotGroup(change.key.toString())

                    names.forEach {
                        val adapter = Adapters.buildXYAdapter("timestamp", it);
@@ -68,21 +67,4 @@ class SlowControlView : View(title = "Numass slow control view", icon = ImageVie
        }
    }

    private suspend fun getData(loader: TableLoader): Table {
        //TODO add query
        return loader.asTable().await()
    }

    operator fun set(id: String, loader: TableLoader) {
        this.data[id] = loader
    }

    fun remove(id: String) {
        this.data.remove(id)
    }

    fun clear() {
        data.clear()
    }

}
@@ -10,17 +10,13 @@ import hep.dataforge.tables.Adapters
import inr.numass.data.analyzers.countInWindow
import inr.numass.data.api.NumassSet
import javafx.beans.property.SimpleIntegerProperty
import javafx.collections.FXCollections
import javafx.collections.MapChangeListener
import javafx.collections.ObservableMap
import javafx.geometry.Insets
import javafx.geometry.Orientation
import javafx.scene.image.ImageView
import javafx.util.converter.NumberStringConverter
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.*
import kotlinx.coroutines.javafx.JavaFx
import kotlinx.coroutines.launch
import kotlinx.coroutines.withContext
import org.controlsfx.control.RangeSlider
import tornadofx.*
import java.util.concurrent.atomic.AtomicInteger
@@ -33,7 +29,8 @@ import kotlin.math.sqrt
 */
class SpectrumView : View(title = "Numass spectrum plot", icon = ImageView(dfIcon)) {

    private val pointCache by inject<PointCache>()
    private val dataController by inject<DataController>()
    private val data get() = dataController.sets

    private val frame = JFreeChartFrame().configure {
        "xAxis.title" to "U"
@@ -44,7 +41,6 @@ class SpectrumView : View(title = "Numass spectrum plot", icon = ImageView(dfIco
    }
    private val container = PlotContainer(frame)


    private val loChannelProperty = SimpleIntegerProperty(500).apply {
        addListener { _ -> updateView() }
    }
@@ -55,9 +51,7 @@ class SpectrumView : View(title = "Numass spectrum plot", icon = ImageView(dfIco
    }
    private var upChannel by upChannelProperty


    private val data: ObservableMap<String, NumassSet> = FXCollections.observableHashMap()
    val isEmpty = booleanBinding(data) { data.isEmpty() }
    private val isEmpty = booleanBinding(data) { data.isEmpty() }

    override val root = borderpane {
        top {
@@ -103,9 +97,9 @@ class SpectrumView : View(title = "Numass spectrum plot", icon = ImageView(dfIco
    }

    init {
        data.addListener { change: MapChangeListener.Change<out String, out NumassSet> ->
        data.addListener { change: MapChangeListener.Change<out Name, out NumassSet> ->
            if (change.wasRemoved()) {
                frame.plots.remove(Name.ofSingle(change.key))
                frame.plots.remove(Name.ofSingle(change.key.toString()))
            }

            if (change.wasAdded()) {
@@ -121,18 +115,19 @@ class SpectrumView : View(title = "Numass spectrum plot", icon = ImageView(dfIco
        val totalProgress = data.values.stream().mapToInt { it.points.size }.sum()

        data.forEach { (name, set) ->
            val plot: DataPlot =
                frame.plots[Name.ofSingle(name)] as DataPlot? ?: DataPlot(name).apply { frame.add(this) }
            val plot: DataPlot = frame.plots[name] as DataPlot? ?: DataPlot(name.toString()).apply {
                frame.add(this)
            }

            app.context.launch {
                val points = set.points.map {
                    pointCache.getCachedPoint("$name/${it.voltage}[${it.index}]", it)
                val points = set.points.map { point ->
                    dataController.getCachedPoint(Name.join("$name", "${point.voltage}[${point.index}]"), point).also {
                        it.spectrum.start()
                    }
                }.map { cachedPoint ->
                    val count = cachedPoint.spectrum.await().countInWindow(loChannel.toShort(), upChannel.toShort())
                    val seconds = cachedPoint.length.toMillis() / 1000.0
                    launch(Dispatchers.JavaFx) {
                        container.progress = progress.incrementAndGet().toDouble() / totalProgress
                    }

                    Adapters.buildXYDataPoint(
                        cachedPoint.voltage,
                        (count / seconds),
@@ -146,16 +141,4 @@ class SpectrumView : View(title = "Numass spectrum plot", icon = ImageView(dfIco
                }
            }
        }
    }

    operator fun set(key: String, value: NumassSet) {
        data[key] = value
    }

    fun remove(key: String) {
        data.remove(key)
    }

    fun clear() {
        data.clear()
    }
}
@@ -1,15 +1,19 @@
package inr.numass.viewer

import hep.dataforge.asName
import hep.dataforge.fx.dfIconView
import hep.dataforge.fx.meta.MetaViewer
import hep.dataforge.meta.Meta
import hep.dataforge.meta.Metoid
import hep.dataforge.names.AlphanumComparator
import hep.dataforge.names.Name
import hep.dataforge.storage.Storage
import hep.dataforge.storage.files.FileTableLoader
import hep.dataforge.storage.tables.TableLoader
import inr.numass.data.NumassDataUtils
import inr.numass.data.api.NumassPoint
import inr.numass.data.api.NumassSet
import inr.numass.data.storage.EnvelopeStorageElement
import inr.numass.data.storage.NumassDataLoader
import javafx.beans.property.SimpleBooleanProperty
import javafx.beans.property.SimpleObjectProperty
@@ -25,7 +29,7 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {
    val storageProperty = SimpleObjectProperty<Storage>()
    val storage by storageProperty

    private val pointCache by inject<PointCache>()
    private val dataController by inject<DataController>()

    private val ampView: AmplitudeView by inject()
    private val timeView: TimeView by inject()
@@ -33,27 +37,15 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {
    private val hvView: HVView by inject()
    private val scView: SlowControlView by inject()

    // private var watcher: WatchService? = null


    fun clear() {
        //watcher?.close()
        ampView.clear()
        timeView.clear()
        spectrumView.clear()
        hvView.clear()
        scView.clear()
    }

    private inner class Container(val id: String, val content: Any) {
    private inner class Container(val name: Name, val content: Any) {
        val checkedProperty = SimpleBooleanProperty(false)
        var checked by checkedProperty

        val infoView: UIComponent by lazy {
            when (content) {
                is NumassPoint -> PointInfoView(pointCache.getCachedPoint(id, content))
                is Metoid -> MetaViewer(content.meta, title = "Meta view: $id")
                else -> MetaViewer(Meta.empty(), title = "Meta view: $id")
                is NumassPoint -> PointInfoView(dataController.getCachedPoint(name, content))
                is Metoid -> MetaViewer(content.meta, title = "Meta view: $name")
                else -> MetaViewer(Meta.empty(), title = "Meta view: $name")
            }
        }

@@ -64,27 +56,23 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {
            when (content) {
                is NumassPoint -> {
                    if (selected) {
                        ampView[id] = content
                        timeView[id] = content
                        dataController.addPoint(name, content)
                    } else {
                        ampView.remove(id)
                        timeView.remove(id)
                        dataController.remove(name)
                    }
                }
                is NumassSet -> {
                    if (selected) {
                        spectrumView[id] = content
                        hvView[id] = content
                        dataController.addSet(name, content)
                    } else {
                        spectrumView.remove(id)
                        hvView.remove(id)
                        dataController.remove(name)
                    }
                }
                is TableLoader -> {
                    if (selected) {
                        scView[id] = content
                        dataController.addSc(name, content)
                    } else {
                        scView.remove(id)
                        dataController.remove(name)
                    }
                }
            }
@@ -97,9 +85,19 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {

        val children: ObservableList<Container>? by lazy {
            when (content) {
                is Storage -> content.getChildren().map {
                is Storage -> content.getChildren().mapNotNull {
                    if (it is EnvelopeStorageElement) {
                        it.envelope?.let { envelope ->
                            try {
                                buildContainer(NumassDataUtils.read(envelope), this)
                            } catch (ex: Exception) {
                                null
                            }
                        }
                    } else {
                        buildContainer(it, this)
                }.sortedWith(Comparator.comparing({ it.id }, AlphanumComparator)).asObservable()
                    }
                }.sortedWith(Comparator.comparing({ it.name.toString() }, AlphanumComparator)).asObservable()
                is NumassSet -> content.points
                    .sortedBy { it.index }
                    .map { buildContainer(it, this) }
@@ -112,38 +110,6 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {

        private var watchJob: Job? = null

        // private fun toggleWatch(watch: Boolean) {
        //     if (watch) {
        //         if (watchJob != null && content is NumassDataLoader) {
        //             watchJob = app.context.launch(Dispatchers.IO) {
        //                 val key: WatchKey = content.path.register(watcher!!, ENTRY_CREATE)
        //                 coroutineContext[Job]?.invokeOnCompletion {
        //                     key.cancel()
        //                 }
        //                 while (watcher != null && isActive) {
        //                     try {
        //                         key.pollEvents().forEach { event ->
        //                             if (event.kind() == ENTRY_CREATE) {
        //                                 val path: Path = event.context() as Path
        //                                 if (path.fileName.toString().startsWith(NumassDataLoader.POINT_FRAGMENT_NAME)) {
        //                                     val envelope: Envelope = NumassEnvelopeType.infer(path)?.reader?.read(path)
        //                                         ?: kotlin.error("Can't read point file")
        //                                     val point = NumassDataUtils.read(envelope)
        //                                     children!!.add(buildContainer(point, this@Container))
        //                                 }
        //                             }
        //                         }
        //                     } catch (x: Throwable) {
        //                         app.context.logger.error("Error during dynamic point read", x)
        //                     }
        //                 }
        //             }
        //         }
        //     } else {
        //         watchJob?.cancel()
        //         watchJob = null
        //     }
        // }
    }


@@ -151,21 +117,16 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {
        treeview<Container> {
            //isShowRoot = false
            storageProperty.onChange { storage ->
                clear()
                dataController.clear()
                if (storage == null) return@onChange
                root = TreeItem(Container(storage.name, storage))
                root = TreeItem(Container(storage.name.asName(), storage))
                root.isExpanded = true
                lazyPopulate(leafCheck = {
                    !it.value.hasChildren
                }) {
                    it.value.children
                }
                // watcher?.close()
                // watcher = if (storage is FileStorage) {
                //     storage.path.fileSystem.newWatchService()
                // } else {
                //     null
                // }
            }

            cellFormat { value: Container ->
@@ -193,7 +154,7 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {
                    }
                }
                else -> {
                    text = value.id
                    text = value.name.toString()
                    graphic = null
                }
            }
@@ -208,11 +169,6 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {
                        value.infoView.openModal(escapeClosesWindow = true)
                    }
                }
                // if(value.content is NumassDataLoader) {
                //     checkmenuitem("Watch") {
                //         selectedProperty().bindBidirectional(value.watchedProperty)
                //     }
                // }
            }
        }
    }
@@ -256,19 +212,19 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {

    private fun buildContainer(content: Any, parent: Container): Container =
        when (content) {
            is Storage -> Container(content.fullName.toString(), content)
            is Storage -> Container(content.fullName, content)
            is NumassSet -> {
                val id: String = if (content is NumassDataLoader) {
                    content.fullName.unescaped
                val id: Name = if (content is NumassDataLoader) {
                    content.fullName
                } else {
                    content.name
                    content.name.asName()
                }
                Container(id, content)
            }
            is NumassPoint -> {
                Container("${parent.id}/${content.voltage}[${content.index}]", content)
                Container("${parent.name}/${content.voltage}[${content.index}]".asName(), content)
            }
            is FileTableLoader -> Container(content.path.toString(), content)
            is FileTableLoader -> Container(Name.of(content.path.map { it.toString().asName() }), content)
            else -> throw IllegalArgumentException("Unknown content type: ${content::class.java}");
        }
}
@@ -2,30 +2,28 @@ package inr.numass.viewer

import hep.dataforge.configure
import hep.dataforge.fx.dfIcon
import hep.dataforge.fx.except
import hep.dataforge.fx.plots.PlotContainer
import hep.dataforge.fx.runGoal
import hep.dataforge.fx.ui
import hep.dataforge.goals.Goal
import hep.dataforge.meta.Meta
import hep.dataforge.names.Name
import hep.dataforge.plots.Plottable
import hep.dataforge.plots.data.DataPlot
import hep.dataforge.plots.jfreechart.JFreeChartFrame
import hep.dataforge.tables.Adapters
import hep.dataforge.values.ValueMap
import inr.numass.data.analyzers.TimeAnalyzer
import inr.numass.data.api.NumassPoint
import javafx.beans.Observable
import hep.dataforge.tables.Table
import javafx.beans.binding.DoubleBinding
import javafx.collections.FXCollections
import javafx.collections.MapChangeListener
import javafx.collections.ObservableMap
import javafx.scene.image.ImageView
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.Job
import kotlinx.coroutines.javafx.JavaFx
import kotlinx.coroutines.launch
import kotlinx.coroutines.withContext
import tornadofx.*

class TimeView : View(title = "Numass time spectrum plot", icon = ImageView(dfIcon)) {

    private val dataController by inject<DataController>()

    private val frame = JFreeChartFrame().configure {
        "title" to "Time plot"
        node("xAxis") {
@@ -47,129 +45,65 @@ class TimeView : View(title = "Numass time spectrum plot", icon = ImageView(dfIc
        }.setType<DataPlot>()
    }

    // val stepProperty = SimpleDoubleProperty()
    // var step by stepProperty
    //
    // private val container = PlotContainer(frame).apply {
    //     val binningSelector: ChoiceBox<Int> = ChoiceBox(FXCollections.observableArrayList(1, 5, 10, 20, 50)).apply {
    //         minWidth = 0.0
    //         selectionModel.selectLast()
    //         stepProperty.bind(this.selectionModel.selectedItemProperty())
    //     }
    //     addToSideBar(0, binningSelector)
    // }

    private val container = PlotContainer(frame)

    private val data: ObservableMap<String, NumassPoint> = FXCollections.observableHashMap()
    private val plots: ObservableMap<String, Goal<Plottable>> = FXCollections.observableHashMap()
    //private val data: ObservableMap<String, NumassPoint> = FXCollections.observableHashMap()
    private val data get() = dataController.points
    private val plotJobs: ObservableMap<String, Job> = FXCollections.observableHashMap()

    val isEmpty = booleanBinding(data) { isEmpty() }

    private val progress = object : DoubleBinding() {
        init {
            bind(plots)
            bind(plotJobs)
        }

        override fun computeValue(): Double {
            return plots.values.count { it.isDone }.toDouble() / data.size;
        }
        override fun computeValue(): Double = plotJobs.values.count { it.isCompleted }.toDouble() / data.size

    }

    init {
        data.addListener { _: Observable ->
            invalidate()
        data.addListener(MapChangeListener { change ->
            val key = change.key.toString()
            if (change.wasAdded()) {
                replotOne(key, change.valueAdded)
            } else if (change.wasRemoved()) {
                plotJobs[key]?.cancel()
                plotJobs.remove(key)
                frame.plots.remove(Name.ofSingle(key))
                progress.invalidate()
            }
        })

    }

    override val root = borderpane {
        center = container.root
    }

    /**
     * Put or replace current plot with name `key`
     */
    operator fun set(key: String, point: NumassPoint) {
        data[key] = point
    }
    private fun replotOne(key: String, point: DataController.CachedPoint) {
        plotJobs[key]?.cancel()
        plotJobs[key] = app.context.launch {
            try {
                val histogram: Table = point.timeSpectrum.await()

    fun addAll(data: Map<String, NumassPoint>) {
        this.data.putAll(data);
    }

    private val analyzer = TimeAnalyzer();


    private fun invalidate() {
        data.forEach { key, point ->
            plots.getOrPut(key) {
                runGoal<Plottable>(app.context, "loadAmplitudeSpectrum_$key", Dispatchers.IO) {

                    val initialEstimate = analyzer.analyze(point)
                    val cr = initialEstimate.getDouble("cr")

                    val binNum = 200//inputMeta.getInt("binNum", 1000);
                    val binSize = 1.0 / cr * 10 / binNum * 1e6//inputMeta.getDouble("binSize", 1.0 / cr * 10 / binNum * 1e6)

                    val histogram = analyzer.getEventsWithDelay(point, Meta.empty())
                        .map { it.second.toDouble() / 1000.0 }
                        .groupBy { Math.floor(it / binSize) }
                        .toSortedMap()
                        .map {
                            ValueMap.ofPairs("x" to it.key, "count" to it.value.count())
                        }

                    DataPlot(key, adapter = Adapters.buildXYAdapter("x", "count"))
                val plot = DataPlot(key, adapter = Adapters.buildXYAdapter("x", "count"))
                    .configure {
                        "showLine" to true
                        "showSymbol" to false
                        "showErrors" to false
                        "connectionType" to "step"
                    }.fillData(histogram)

                } ui { plot ->
                withContext(Dispatchers.JavaFx) {
                    frame.add(plot)
                    progress.invalidate()
                } except {
                }
            } finally {
                withContext(Dispatchers.JavaFx) {
                    progress.invalidate()
                }
            }
            plots.keys.filter { !data.containsKey(it) }.forEach { remove(it) }
        }
    }

    fun clear() {
        data.clear()
        plots.values.forEach {
            it.cancel()
        }
        plots.clear()
        invalidate()
    }

    /**
     * Remove the plot and cancel loading task if it is in progress.
     */
    fun remove(name: String) {
        frame.plots.remove(Name.ofSingle(name))
        plots[name]?.cancel()
        plots.remove(name)
        data.remove(name)
        progress.invalidate()
    }

    /**
     * Set frame content to the given map. All keys not in the map are removed.
     */
    fun setAll(map: Map<String, NumassPoint>) {
        plots.clear();
        //Remove obsolete keys
        data.keys.filter { !map.containsKey(it) }.forEach {
            remove(it)
        }
        this.addAll(map);
    }

}
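A side note on the delay-histogram binning inside TimeView above (an editorial illustration, not text from the repository): the bin width is derived from the estimated count rate `cr` so that 200 bins cover roughly ten mean waiting times, converted to microseconds. A worked example with a hypothetical rate:

// Worked example of the binSize expression used in TimeView; the rate value is made up.
fun main() {
    val cr = 1000.0                             // estimated count rate, events per second
    val binNum = 200                            // number of histogram bins
    val binSize = 1.0 / cr * 10 / binNum * 1e6  // (10 / cr) seconds spread over binNum bins, in µs
    println(binSize)                            // 50.0 µs per bin, so the histogram spans about 10 ms of delays
}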
@@ -8,6 +8,7 @@ import hep.dataforge.fx.dfIcon
import javafx.stage.Stage
import org.slf4j.LoggerFactory
import tornadofx.*
import kotlin.system.exitProcess

/**
 * Created by darksnake on 14-Apr-17.
@@ -28,6 +29,7 @@ class Viewer : App(MainView::class) {
        context.close()
        Global.terminate();
        super.stop()
        exitProcess(0)
    }
}

@@ -1,68 +0,0 @@
package inr.numass.viewer.test

import hep.dataforge.context.Global
import hep.dataforge.fx.dfIcon
import hep.dataforge.nullable
import hep.dataforge.tables.Table
import inr.numass.data.api.NumassPoint
import inr.numass.data.api.NumassSet
import inr.numass.data.storage.NumassDirectory
import inr.numass.viewer.AmplitudeView
import inr.numass.viewer.HVView
import inr.numass.viewer.SpectrumView
import javafx.application.Application
import javafx.scene.image.ImageView
import kotlinx.coroutines.launch
import tornadofx.*
import java.io.File
import java.util.concurrent.ConcurrentHashMap

class ViewerComponentsTestApp : App(ViewerComponentsTest::class)

class ViewerComponentsTest : View(title = "Numass viewer test", icon = ImageView(dfIcon)) {

    //val rootDir = File("D:\\Work\\Numass\\data\\2017_05\\Fill_2")

    //val set: NumassSet = NumassStorageFactory.buildLocal(rootDir).provide("loader::set_8", NumassSet::class.java).orElseThrow { RuntimeException("err") }


    private val cache: MutableMap<NumassPoint, Table> = ConcurrentHashMap()
    val context = Global

    val amp: AmplitudeView by inject(params = mapOf("cache" to cache))//= AmplitudeView(immutable = immutable)
    val sp: SpectrumView by inject(params = mapOf("cache" to cache))
    val hv: HVView by inject()

    override val root = borderpane {
        top {
            button("Click me!") {
                action {
                    context.launch {
                        val set: NumassSet = NumassDirectory.INSTANCE.read(Global, File("D:\\Work\\Numass\\data\\2017_05\\Fill_2").toPath())
                            ?.provide("loader::set_2", NumassSet::class.java).nullable
                            ?: kotlin.error("Error")
                        update(set)
                    }
                }
            }
        }
        center {
            tabpane {
                tab("amplitude", amp.root)
                tab("spectrum", sp.root)
                tab("hv", hv.root)
            }
        }
    }

    fun update(set: NumassSet) {
        amp.setAll(set.points.filter { it.voltage != 16000.0 }.associateBy { "point_${it.voltage}" })
        sp["test"] = set
        hv[set.name] = set
    }
}


fun main(args: Array<String>) {
    Application.launch(ViewerComponentsTestApp::class.java, *args)
}