Compare commits
No commits in common. "c20ac1ffb818037fd1de83c5a01078d69132e57c" and "78ff8d4f6e814eef9b7d429b0bf7f69e5a65efc6" have entirely different histories.
c20ac1ffb8 ... 78ff8d4f6e

LICENSE (201 lines removed)
@@ -1,201 +0,0 @@
- [Entire file removed: the standard Apache License, Version 2.0 (January 2004, http://www.apache.org/licenses/) text, comprising the TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION (Sections 1-9) and the APPENDIX on how to apply the license, including the "Copyright [yyyy] [name of copyright owner]" boilerplate notice.]
@@ -120,7 +120,7 @@ abstract class AbstractPluginLoader : PluginLoader {


    protected fun compare(p1: PluginFactory, p2: PluginFactory): Int {
-        return p1.tag.getInt("priority", 0).compareTo(p2.tag.getInt("priority", 0))
+        return Integer.compare(p1.tag.getInt("priority", 0), p2.tag.getInt("priority", 0))
    }

    override fun listTags(): List<PluginTag> {
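For reference (not part of the diff): the two sides of the hunk above are behaviourally equivalent, since Kotlin's `a.compareTo(b)` on Int and `Integer.compare(a, b)` return the same sign. A minimal, self-contained Kotlin sketch, where `FakeFactory` is an illustrative stand-in for `PluginFactory`:

    // Illustrative only: FakeFactory stands in for PluginFactory.
    data class FakeFactory(val priority: Int)

    fun compareKotlinStyle(p1: FakeFactory, p2: FakeFactory): Int =
        p1.priority.compareTo(p2.priority)          // idiomatic Kotlin member call

    fun compareJavaStyle(p1: FakeFactory, p2: FakeFactory): Int =
        Integer.compare(p1.priority, p2.priority)   // java.lang.Integer static helper

    fun main() {
        val low = FakeFactory(priority = 0)
        val high = FakeFactory(priority = 10)
        // Both agree on the sign, so any priority-based sort orders plugins identically.
        println(compareKotlinStyle(low, high) < 0)  // true
        println(compareJavaStyle(low, high) < 0)    // true
    }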
@@ -88,7 +88,7 @@ open class DefaultEnvelopeReader : EnvelopeReader {
        val dataLength = tag.dataSize
        if (metaLength < 0 || dataLength < 0) {
            LoggerFactory.getLogger(javaClass).error("Can't lazy read infinite data or meta. Returning non-lazy envelope")
-            return read(Files.newInputStream(file))
+            return read(file)
        }

        val metaBuffer = ByteBuffer.allocate(metaLength)
@@ -215,7 +215,7 @@ interface Name : Comparable<Name> {
            return of(segments[0])
        }

-        return of(Stream.of(*segments).filter { it -> it.isNotEmpty() }.map { of(it) }.toList())
+        return of(Stream.of(*segments).filter { it -> !it.isEmpty() }.map<Name>{ of(it) }.toList())
    }

    fun joinString(vararg segments: String): String {
@@ -31,7 +31,9 @@ interface ValueProvider {
    fun optValue(path: String): Optional<Value>


-    fun getValue(path: String): Value = optValue(path).orElseThrow { NameNotFoundException(path) }
+    fun getValue(path: String): Value {
+        return optValue(path).orElseThrow<NameNotFoundException> { NameNotFoundException(path) }
+    }

    @Provides(BOOLEAN_TARGET)

@@ -150,10 +150,12 @@ class JFreeChartFrame : XYPlotFrame(), FXPlotFrame, Serializable {
        return Range(meta.getDouble("lower", java.lang.Double.NEGATIVE_INFINITY), meta.getDouble("upper", java.lang.Double.POSITIVE_INFINITY))
    }

-    private fun getAxis(axisMeta: Meta): ValueAxis = when (axisMeta.getString("type", "number").lowercase()) {
-        "log" -> getLogAxis(axisMeta)
-        "time" -> getDateAxis(axisMeta)
-        else -> getNumberAxis(axisMeta)
+    private fun getAxis(axisMeta: Meta): ValueAxis {
+        return when (axisMeta.getString("type", "number").lowercase()) {
+            "log" -> getLogAxis(axisMeta)
+            "time" -> getDateAxis(axisMeta)
+            else -> getNumberAxis(axisMeta)
+        }
    }

    override fun updateAxis(axisName: String, axisMeta: Meta, plotMeta: Meta) {
@@ -112,28 +112,36 @@ class PlotGroup(override val name: String, descriptor: NodeDescriptor = NodeDes
    }

    @ProvidesNames(PLOT_TARGET)
-    fun list(): Stream<String> = stream().map { it.first.toString() }
+    fun list(): Stream<String> {
+        return stream().map { it.first.toString() }
+    }

    /**
     * Recursive stream of all plots excluding intermediate nodes
     *
     * @return
     */
-    fun stream(recursive: Boolean = true): Stream<Pair<Name, Plottable>> = plots.stream().flatMap {
-        if (recursive && it is PlotGroup) {
-            it.stream().map { pair -> Pair(Name.ofSingle(it.name) + pair.first, pair.second) }
-        } else {
-            Stream.of(Pair(Name.ofSingle(it.name), it))
+    fun stream(recursive: Boolean = true): Stream<Pair<Name, Plottable>> {
+        return plots.stream().flatMap {
+            if (recursive && it is PlotGroup) {
+                it.stream().map { pair -> Pair(Name.ofSingle(it.name) + pair.first, pair.second) }
+            } else {
+                Stream.of(Pair(Name.ofSingle(it.name), it))
+            }
        }
    }

    @Provides(PLOT_TARGET)
-    operator fun get(name: String): Plottable? = get(Name.of(name))
+    operator fun get(name: String): Plottable? {
+        return get(Name.of(name))
+    }

-    operator fun get(name: Name): Plottable? = when (name.length) {
-        0 -> this
-        1 -> plots.find { it.name == name.unescaped }
-        else -> (get(name.cutLast()) as? PlotGroup)?.get(name.last)
+    operator fun get(name: Name): Plottable? {
+        return when {
+            name.length == 0 -> this
+            name.length == 1 -> plots.find { it.name == name.unescaped }
+            else -> (get(name.cutLast()) as? PlotGroup)?.get(name.last)
+        }
    }

    /**
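The `stream(recursive = true)` change above keeps the same recursive flattening logic in both forms. A standalone sketch of that pattern, using a hypothetical `Node`/`Group` tree instead of the repository's `Plottable`/`PlotGroup` and `Name` types:

    import java.util.stream.Stream

    // Hypothetical tree types; the repository uses Plottable/PlotGroup and Name instead.
    sealed class Node(val name: String)
    class Leaf(name: String) : Node(name)
    class Group(name: String, val children: List<Node>) : Node(name) {
        // Recursively flatten the tree into (path, node) pairs, as PlotGroup.stream does.
        fun stream(): Stream<Pair<String, Node>> = children.stream().flatMap { child ->
            if (child is Group) {
                child.stream().map { (path, node) -> "${child.name}.$path" to node }
            } else {
                Stream.of(child.name to child)
            }
        }
    }

    fun main() {
        val tree = Group("root", listOf(Leaf("a"), Group("g", listOf(Leaf("b"), Leaf("c")))))
        tree.stream().forEach { (path, _) -> println(path) }  // prints: a, g.b, g.c
    }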
@@ -195,7 +203,9 @@ class PlotGroup(override val name: String, descriptor: NodeDescriptor = NodeDes
     *
     * @return
     */
-    override fun iterator(): Iterator<Plottable> = this.plots.iterator()
+    override fun iterator(): Iterator<Plottable> {
+        return this.plots.iterator()
+    }

    class Wrapper : hep.dataforge.io.envelopes.Wrapper<PlotGroup> {

@@ -35,10 +35,7 @@ import hep.dataforge.storage.StorageManager
 import kotlinx.coroutines.async
 import kotlinx.coroutines.awaitAll
 import kotlinx.coroutines.runBlocking
-import java.nio.file.Files
-import java.nio.file.Path
-import java.nio.file.Paths
-import java.nio.file.StandardOpenOption
+import java.nio.file.*
 import kotlin.streams.asSequence

 /**
@@ -60,12 +57,7 @@ interface FileStorageElementType : StorageElementType, Named {
    /**
     * Read given path as [FileStorageElement] with given parent. Returns null if path does not belong to storage
     */
-    suspend fun read(
-        context: Context,
-        path: Path,
-        parent: StorageElement? = null,
-        readMeta: Meta? = null,
-    ): FileStorageElement?
+    suspend fun read(context: Context, path: Path, parent: StorageElement? = null): FileStorageElement?
 }

 class FileStorage(
@@ -83,9 +75,9 @@ class FileStorage(

    override fun getChildren(): Collection<StorageElement> = runBlocking {
        Files.list(path).toList().map { path ->
-            async {
+            async{
                type.read(context, path, this@FileStorage).also {
-                    if (it == null) {
+                    if(it == null){
                        logger.warn("Can't read $path")
                    }
                }
@@ -103,12 +95,12 @@ class FileStorage(
        }
    }

-//    /**
-//     * Creating a watch service or reusing one from parent
-//     */
-//    private val watchService: WatchService by lazy {
-//        (parent as? FileStorage)?.watchService ?: path.fileSystem.newWatchService()
-//    }
+    /**
+     * Creating a watch service or reusing one from parent
+     */
+    private val watchService: WatchService by lazy {
+        (parent as? FileStorage)?.watchService ?: path.fileSystem.newWatchService()
+    }

    //TODO actually watch for file change

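The block restored above lazily creates a WatchService and reuses the parent storage's instance when one exists. A minimal sketch of that idea with a hypothetical `Dir` class (the real code lives in FileStorage and additionally registers paths and polls events):

    import java.nio.file.FileSystems
    import java.nio.file.WatchService

    // Hypothetical stand-in for FileStorage: children share the root's watch service.
    class Dir(private val parent: Dir? = null) {
        val watchService: WatchService by lazy {
            parent?.watchService ?: FileSystems.getDefault().newWatchService()
        }
    }

    fun main() {
        val root = Dir()
        val child = Dir(parent = root)
        // The child walks up to the root instead of opening a second service.
        println(child.watchService === root.watchService)  // true
    }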
@@ -125,12 +117,14 @@ class FileStorage(
        fun resolveMeta(
            path: Path,
            metaReader: (Path) -> Meta? = { EnvelopeType.infer(it)?.reader?.read(it)?.meta },
-        ): Meta? = if (Files.isDirectory(path)) {
-            Files.list(path).asSequence()
-                .find { it.fileName.toString() == "meta.df" || it.fileName.toString() == "meta" }
-                ?.let(metaReader)
-        } else {
-            metaReader(path)
+        ): Meta? {
+            return if (Files.isDirectory(path)) {
+                Files.list(path).asSequence()
+                    .find { it.fileName.toString() == "meta.df" || it.fileName.toString() == "meta" }
+                    ?.let(metaReader)
+            } else {
+                metaReader(path)
+            }
        }

        fun createMetaEnvelope(meta: Meta): Envelope {
@@ -173,13 +167,8 @@ class FileStorage(
            }
        }

-        override suspend fun read(
-            context: Context,
-            path: Path,
-            parent: StorageElement?,
-            readMeta: Meta?,
-        ): FileStorageElement? {
-            val meta = readMeta ?: resolveMeta(path)
+        override suspend fun read(context: Context, path: Path, parent: StorageElement?): FileStorageElement? {
+            val meta = resolveMeta(path)
            val name = meta?.optString("name").nullable ?: path.fileName.toString()
            val type = meta?.optString("type").nullable?.let {
                context.load<StorageManager>().getType(it)
@@ -299,7 +299,7 @@ class TableLoaderType : FileStorageElementType {
        })
    }

-    override suspend fun read(context: Context, path: Path, parent: StorageElement?, readMeta: Meta?): FileStorageElement {
+    override suspend fun read(context: Context, path: Path, parent: StorageElement?): FileStorageElement? {
        val envelope = EnvelopeReader.readFile(path)

        val name = envelope.meta.optString("name").nullable ?: path.fileName.toString()
@@ -106,9 +106,11 @@ class ProtoNumassPoint(override val meta: Meta, val protoBuilder: () -> NumassPr
            this.data.stream
        }

-        fun fromEnvelope(envelope: Envelope): ProtoNumassPoint = ProtoNumassPoint(envelope.meta) {
-            envelope.dataStream().use {
-                NumassProto.Point.parseFrom(it)
+        fun fromEnvelope(envelope: Envelope): ProtoNumassPoint {
+            return ProtoNumassPoint(envelope.meta) {
+                envelope.dataStream().use {
+                    NumassProto.Point.parseFrom(it)
+                }
            }
        }

@@ -82,7 +82,7 @@ object NumassDataUtils {
    }

    fun read(envelope: Envelope): NumassPoint =
-        if (envelope.meta.hasMeta("dpp_params") || envelope.meta.hasMeta("channels") || envelope.meta.hasMeta("tqdc")) {
+        if (envelope.meta.hasMeta("dpp_params") || envelope.meta.hasMeta("tqdc")) {
            ProtoNumassPoint.fromEnvelope(envelope)
        } else {
            ClassicNumassPoint(envelope)
@@ -56,9 +56,9 @@ class NumassDataLoader(
    override fun getConnectionHelper(): ConnectionHelper = _connectionHelper


-    override val meta: Meta get() {
+    override val meta: Meta by lazy {
        val metaPath = path.resolve("meta")
-        return NumassEnvelopeType.infer(metaPath)?.reader?.read(metaPath)?.meta ?: Meta.empty()
+        NumassEnvelopeType.infer(metaPath)?.reader?.read(metaPath)?.meta ?: Meta.empty()
    }

    override suspend fun getHvData(): Table? {
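The `meta` change above swaps a custom getter for `by lazy`: the getter re-reads the envelope on every access, while `lazy` reads it once and caches the result. A small self-contained sketch of that difference (the fake reader below is illustrative, not the repository's NumassEnvelopeType):

    // Fake reader used only to count how often the body runs.
    class Loader(private val read: () -> String) {
        val metaEveryTime: String get() = read()      // runs read() on each access
        val metaCached: String by lazy { read() }     // runs read() once, then caches
    }

    fun main() {
        var calls = 0
        val loader = Loader { calls++; "meta #$calls" }
        println(loader.metaEveryTime)  // meta #1
        println(loader.metaEveryTime)  // meta #2 (re-read)
        println(loader.metaCached)     // meta #3 (first and only lazy read)
        println(loader.metaCached)     // meta #3 (cached)
    }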
@@ -73,6 +73,7 @@ class NumassDataLoader(
        }
    }

+
    private val pointEnvelopes: List<Envelope> by lazy {
        Files.list(path)
            .filter { it.fileName.toString().startsWith(POINT_FRAGMENT_NAME) }
@@ -15,74 +15,31 @@
  */
 package inr.numass.data.storage

-import hep.dataforge.connections.ConnectionHelper
 import hep.dataforge.context.Context
 import hep.dataforge.context.Global
 import hep.dataforge.events.Event
 import hep.dataforge.events.EventBuilder
-import hep.dataforge.io.envelopes.Envelope
 import hep.dataforge.meta.Meta
-import hep.dataforge.nullable
 import hep.dataforge.storage.StorageElement
-import hep.dataforge.storage.StorageManager
 import hep.dataforge.storage.files.FileStorage
 import hep.dataforge.storage.files.FileStorageElement
-import hep.dataforge.storage.files.FileStorageElementType
 import inr.numass.data.NumassEnvelopeType
 import kotlinx.coroutines.runBlocking
 import java.nio.file.Files
 import java.nio.file.Path

-class EnvelopeStorageElement(
-    override val context: Context,
-    override val name: String,
-    override val meta: Meta,
-    override val path: Path,
-    override val parent: StorageElement?,
-) : FileStorageElement {
-    private val _connectionHelper by lazy { ConnectionHelper(this) }
-
-    override fun getConnectionHelper(): ConnectionHelper = _connectionHelper
-
-    val envelope: Envelope? get() = NumassEnvelopeType.infer(path)?.reader?.read(path)
-}
-
-
 /**
  * Numass storage directory. Works as a normal directory, but creates a numass loader from each directory with meta
  */
 class NumassDirectory : FileStorage.Directory() {
     override val name: String = NUMASS_DIRECTORY_TYPE

-    override suspend fun read(
-        context: Context,
-        path: Path,
-        parent: StorageElement?,
-        readMeta: Meta?,
-    ): FileStorageElement? {
-        val meta = readMeta ?: FileStorage.resolveMeta(path) { NumassEnvelopeType.infer(it)?.reader?.read(it)?.meta }
+    override suspend fun read(context: Context, path: Path, parent: StorageElement?): FileStorageElement? {
+        val meta = FileStorage.resolveMeta(path){ NumassEnvelopeType.infer(it)?.reader?.read(it)?.meta }
         return if (Files.isDirectory(path) && meta != null) {
             NumassDataLoader(context, parent, path.fileName.toString(), path)
         } else {
-            val name = meta?.optString("name").nullable ?: path.fileName.toString()
-            val type = meta?.optString("type").nullable?.let {
-                context.load<StorageManager>().getType(it)
-            } as? FileStorageElementType
-            if (type == null || type is FileStorage.Directory) {
-                // Read path as directory if type not found and path is directory
-                if (Files.isDirectory(path)) {
-                    FileStorage(context, name, meta ?: Meta.empty(), path, parent, this)
-                } else {
-                    EnvelopeStorageElement(context, name, meta ?: Meta.empty(), path, parent)
-                }
-            } else {
-                //Otherwise, delegate to the type
-                type.read(context, path, parent)
-            }.also {
-                if (it != null && parent == null) {
-                    context.load<StorageManager>().register(it)
-                }
-            }
+            super.read(context, path, parent)
         }
     }

@@ -93,8 +50,8 @@ class NumassDirectory : FileStorage.Directory() {
        /**
         * Simple read for scripting and debug
         */
-        fun read(context: Context = Global, path: String): FileStorageElement? {
-            return runBlocking { INSTANCE.read(context, context.getDataFile(path).absolutePath) }
+        fun read(context: Context = Global, path: String): FileStorageElement?{
+            return runBlocking { INSTANCE.read(context, context.getDataFile(path).absolutePath)}
        }
    }
 }
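The `read(context, path)` helper above bridges suspending storage code into plain blocking code for scripts via `runBlocking`. A standalone sketch of the same bridge, assuming kotlinx-coroutines on the classpath (as the surrounding code already has); `suspendingRead` and the file name are stand-ins, not the repository's API:

    import kotlinx.coroutines.delay
    import kotlinx.coroutines.runBlocking

    suspend fun suspendingRead(path: String): String {
        delay(10)                 // pretend suspending I/O
        return "contents of $path"
    }

    // Blocking convenience wrapper, mirroring the "simple read for scripting and debug" idea.
    fun readForScripts(path: String): String = runBlocking {
        suspendingRead(path)
    }

    fun main() {
        println(readForScripts("data/point_1.df"))
    }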
@@ -107,7 +64,7 @@ class NumassDataPointEvent(meta: Meta) : Event(meta) {

    override fun toString(): String {
        return String.format("(%s) [%s] : pushed numass data file with name '%s' and size '%d'",
                time().toString(), sourceTag(), fileName, fileSize)
    }

    companion object {
@@ -122,9 +79,9 @@ class NumassDataPointEvent(meta: Meta) : Event(meta) {

    fun builder(source: String, fileName: String, fileSize: Int): EventBuilder<*> {
        return EventBuilder.make("numass.storage.pushData")
                .setSource(source)
                .setMetaValue(FILE_NAME_KEY, fileName)
                .setMetaValue(FILE_SIZE_KEY, fileSize)
    }
 }

@@ -10,8 +10,7 @@ fun main() {
    Global.output = FXOutputManager()
    JFreeChartPlugin().startGlobal()

-    val file = File("D:\\Work\\Numass\\data\\test\\7.df").toPath()
-    println(file)
+    val file = File("C:\\Users\\darksnake\\Desktop\\test-data\\p20211012142003(20s)").toPath()
    val point = ProtoNumassPoint.readFile(file)

    point.events.forEach {
@@ -19,7 +19,7 @@ application {
    mainClass.set("inr.numass.viewer.Viewer")
 }

-version = "0.6.3"
+version = "0.6.0"

 description = "The viewer for numass data"

@@ -30,7 +30,6 @@ dependencies {
 }

 val addJvmArgs = listOf(
-    "-XX:+UseZGC",
    "--add-exports=javafx.graphics/com.sun.glass.ui=ALL-UNNAMED",
    "--add-opens=javafx.graphics/com.sun.javafx.css=ALL-UNNAMED",
    "--add-opens=javafx.graphics/com.sun.javafx.scene=ALL-UNNAMED",
@@ -40,10 +39,6 @@ val addJvmArgs = listOf(
    "--add-opens=javafx.controls/com.sun.javafx.scene.control.behavior=ALL-UNNAMED",
    "--add-opens=javafx.controls/javafx.scene.control.skin=ALL-UNNAMED",
    "--add-exports=javafx.controls/com.sun.javafx.scene.control.inputmap=ALL-UNNAMED",
-    "--add-exports=javafx.controls/com.sun.javafx.scene.control.behavior=ALL-UNNAMED",
-    "--add-exports=javafx.base/com.sun.javafx.event=ALL-UNNAMED",
-    "--add-opens=javafx.controls/javafx.scene.control.skin=ALL-UNNAMED",
-    "--add-opens=javafx.graphics/javafx.scene=ALL-UNNAMED"
 )

 application {
@@ -62,28 +57,8 @@ runtime {
        "javafx.controls"
    )
    jpackage {
-        //installerType = "deb"
        jvmArgs = addJvmArgs
-        val currentOs = org.gradle.internal.os.OperatingSystem.current()
-        installerOptions = installerOptions + listOf("--vendor", "MIPT-NPM lab")
-
-        if (currentOs.isWindows) {
-            installerOptions = installerOptions + listOf(
-                "--win-menu",
-                "--win-menu-group", "Numass",
-                "--win-dir-chooser",
-                "--win-shortcut"
-            )
-        } else if (currentOs.isLinux) {
-            installerType = "deb"
-            installerOptions = installerOptions + listOf(
-                "--linux-package-name", "numass-viewer",
-                "--linux-shortcut",
-                "--linux-deb-maintainer", "nozik.aa@mipt.ru",
-                "--linux-menu-group", "Science",
-                "--linux-shortcut"
-            )
-        }
+        //imageOptions = listOf("--linux-deb-maintainer", "nozik.aa@mipt.ru", "--linux-menu-group", "Science")
    }
    launcher {
        jvmArgs = addJvmArgs
@@ -2,31 +2,35 @@ package inr.numass.viewer

 import hep.dataforge.configure
 import hep.dataforge.fx.dfIcon
+import hep.dataforge.fx.except
 import hep.dataforge.fx.plots.PlotContainer
+import hep.dataforge.fx.runGoal
+import hep.dataforge.fx.ui
+import hep.dataforge.goals.Goal
 import hep.dataforge.names.Name
 import hep.dataforge.plots.PlotGroup
+import hep.dataforge.plots.Plottable
 import hep.dataforge.plots.data.DataPlot
 import hep.dataforge.plots.jfreechart.JFreeChartFrame
 import hep.dataforge.tables.Adapters
 import inr.numass.data.analyzers.NumassAnalyzer
 import inr.numass.data.analyzers.withBinning
+import inr.numass.data.api.NumassPoint
+import javafx.beans.Observable
 import javafx.beans.binding.DoubleBinding
 import javafx.beans.property.SimpleBooleanProperty
-import javafx.beans.property.SimpleIntegerProperty
 import javafx.beans.property.SimpleObjectProperty
 import javafx.collections.FXCollections
-import javafx.collections.MapChangeListener
 import javafx.collections.ObservableMap
 import javafx.scene.control.CheckBox
 import javafx.scene.control.ChoiceBox
 import javafx.scene.image.ImageView
-import kotlinx.coroutines.*
-import kotlinx.coroutines.javafx.JavaFx
+import kotlinx.coroutines.Dispatchers
 import tornadofx.*

 class AmplitudeView : View(title = "Numass amplitude spectrum plot", icon = ImageView(dfIcon)) {
-    private val dataController by inject<DataController>()
-    private val data get() = dataController.points
+    private val pointCache by inject<PointCache>()

    private val frame = JFreeChartFrame().configure {
        "title" to "Detector response plot"
@@ -50,17 +54,17 @@ class AmplitudeView : View(title = "Numass amplitude spectrum plot", icon = Imag
        }.setType<DataPlot>()
    }

-    val binningProperty = SimpleIntegerProperty(2)
-    var binning: Int by binningProperty
+    val binningProperty = SimpleObjectProperty(20)
+    var binning by binningProperty

    val normalizeProperty = SimpleBooleanProperty(true)
    var normalize by normalizeProperty


-    private val plotContainer = PlotContainer(frame).apply {
-        val binningSelector: ChoiceBox<Int> = ChoiceBox(FXCollections.observableArrayList(1, 2, 8, 16, 32)).apply {
+    private val container = PlotContainer(frame).apply {
+        val binningSelector: ChoiceBox<Int> = ChoiceBox(FXCollections.observableArrayList(1, 2, 8, 16, 32, 50)).apply {
            minWidth = 0.0
-            selectionModel.select(binning as Int?)
+            selectionModel.selectLast()
            binningProperty.bind(this.selectionModel.selectedItemProperty())
        }
        val normalizeSwitch: CheckBox = CheckBox("Normalize").apply {
@@ -70,98 +74,129 @@ class AmplitudeView : View(title = "Numass amplitude spectrum plot", icon = Imag
        addToSideBar(0, binningSelector, normalizeSwitch)
    }

-    private val plotJobs: ObservableMap<String, Job> = FXCollections.observableHashMap()
+    private val data: ObservableMap<String, NumassPoint> = FXCollections.observableHashMap()
+    private val plots: ObservableMap<String, Goal<Plottable>> = FXCollections.observableHashMap()

+    val isEmpty = booleanBinding(data) { isEmpty() }

    private val progress = object : DoubleBinding() {
        init {
-            bind(plotJobs)
+            bind(plots)
+        }

+        override fun computeValue(): Double {
+            return plots.values.count { it.isDone }.toDouble() / data.size;
        }

-        override fun computeValue(): Double = plotJobs.values.count { it.isCompleted }.toDouble() / plotJobs.size
    }

    init {
-        data.addListener(MapChangeListener { change ->
-            val key = change.key.toString()
-            if (change.wasAdded()) {
-                replotOne(key, change.valueAdded)
-            } else if (change.wasRemoved()) {
-                plotJobs[key]?.cancel()
-                plotJobs.remove(key)
-                frame.plots.remove(Name.ofSingle(key))
-                progress.invalidate()
-            }
-        })
+        data.addListener { _: Observable ->
+            invalidate()
+        }

        binningProperty.onChange {
-            replot()
+            frame.plots.clear()
+            plots.clear()
+            invalidate()
        }

        normalizeProperty.onChange {
-            replot()
+            frame.plots.clear()
+            plots.clear()
+            invalidate()
        }

-        plotContainer.progressProperty.bind(progress)
-    }
-
-    private fun replotOne(key: String, point: DataController.CachedPoint) {
-        plotJobs[key]?.cancel()
-        frame.plots.remove(Name.ofSingle(key))
-        plotJobs[key] = app.context.launch {
-            withContext(Dispatchers.JavaFx) {
-                progress.invalidate()
-            }
-            val valueAxis = if (normalize) {
-                NumassAnalyzer.COUNT_RATE_KEY
-            } else {
-                NumassAnalyzer.COUNT_KEY
-            }
-            val adapter = Adapters.buildXYAdapter(NumassAnalyzer.CHANNEL_KEY, valueAxis)
-
-            val channels = point.channelSpectra.await()
-
-            val plot = if (channels.size == 1) {
-                DataPlot.plot(
-                    key,
-                    channels.values.first().withBinning(binning),
-                    adapter
-                )
-            } else {
-                val group = PlotGroup.typed<DataPlot>(key)
-                channels.forEach { (key, spectrum) ->
-                    val plot = DataPlot.plot(
-                        key.toString(),
-                        spectrum.withBinning(binning),
-                        adapter
-                    )
-                    group.add(plot)
-                }
-                group
-            }
-            withContext(Dispatchers.JavaFx) {
-                frame.add(plot)
-            }
-        }.apply {
-            invokeOnCompletion {
-                runLater{
-                    progress.invalidate()
-                }
-            }
-        }
-    }
-
-    private fun replot() {
-        frame.plots.clear()
-        plotJobs.forEach { (_, job) -> job.cancel() }
-        plotJobs.clear()
-
-        data.forEach { (key, point) ->
-            replotOne(key.toString(), point)
-        }
+        container.progressProperty.bind(progress)
    }

    override val root = borderpane {
-        center = plotContainer.root
+        center = container.root
    }

+    /**
+     * Put or replace current plot with name `key`
+     */
+    operator fun set(key: String, point: NumassPoint) {
+        data[key] = point
+    }
+
+    fun addAll(data: Map<String, NumassPoint>) {
+        this.data.putAll(data);
+    }
+
+    private fun invalidate() {
+        data.forEach { (key, point) ->
+            plots.getOrPut(key) {
+                runGoal<Plottable>(app.context, "loadAmplitudeSpectrum_$key", Dispatchers.IO) {
+                    val valueAxis = if (normalize) {
+                        NumassAnalyzer.COUNT_RATE_KEY
+                    } else {
+                        NumassAnalyzer.COUNT_KEY
+                    }
+                    val adapter = Adapters.buildXYAdapter(NumassAnalyzer.CHANNEL_KEY, valueAxis)
+
+                    val channels = pointCache.getChannelSpectra(key, point)
+
+                    return@runGoal if (channels.size == 1) {
+                        DataPlot.plot(
+                            key,
+                            channels.values.first().withBinning(binning),
+                            adapter
+                        )
+                    } else {
+                        val group = PlotGroup.typed<DataPlot>(key)
+                        channels.forEach { key, spectrum ->
+                            val plot = DataPlot.plot(
+                                key.toString(),
+                                spectrum.withBinning(binning),
+                                adapter
+                            )
+                            group.add(plot)
+                        }
+                        group
+                    }
+                } ui { plot ->
+                    frame.add(plot)
+                    progress.invalidate()
+                } except {
+                    progress.invalidate()
+                }
+            }
+            plots.keys.filter { !data.containsKey(it) }.forEach { remove(it) }
+        }
+    }
+
+    fun clear() {
+        data.clear()
+        plots.values.forEach{
+            it.cancel()
+        }
+        plots.clear()
+        invalidate()
+    }
+
+    /**
+     * Remove the plot and cancel loading task if it is in progress.
+     */
+    fun remove(name: String) {
+        frame.plots.remove(Name.ofSingle(name))
+        plots[name]?.cancel()
+        plots.remove(name)
+        data.remove(name)
+        progress.invalidate()
+    }
+
+    /**
+     * Set frame content to the given map. All keys not in the map are removed.
+     */
+    fun setAll(map: Map<String, NumassPoint>) {
+        plots.clear();
+        //Remove obsolete keys
+        data.keys.filter { !map.containsKey(it) }.forEach {
+            remove(it)
+        }
+        this.addAll(map);
+    }
+
 }
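The removed side of the hunk above kept one coroutine Job per plotted point and cancelled it before scheduling a replacement, while the added side reaches the same goal through cached Goals. A standalone sketch of the removed Job-per-key pattern (names below are illustrative, not the viewer's API):

    import kotlinx.coroutines.*

    // One running Job per key; scheduling a new render cancels the previous one.
    class Replotter(private val scope: CoroutineScope) {
        private val jobs = mutableMapOf<String, Job>()

        fun replotOne(key: String, render: suspend () -> Unit) {
            jobs[key]?.cancel()                    // drop a stale computation for this key
            jobs[key] = scope.launch { render() }
        }

        fun replotAll(points: Map<String, suspend () -> Unit>) {
            jobs.values.forEach { it.cancel() }
            jobs.clear()
            points.forEach { (key, render) -> replotOne(key, render) }
        }
    }

    fun main() = runBlocking {
        val replotter = Replotter(this)
        replotter.replotOne("p1") { delay(50); println("first render") }
        replotter.replotOne("p1") { delay(10); println("second render replaces the first") }
        delay(100)  // only the second message prints; the first Job was cancelled
    }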
@@ -1,176 +0,0 @@
-package inr.numass.viewer
-
-import hep.dataforge.context.ContextAware
-import hep.dataforge.meta.Meta
-import hep.dataforge.names.Name
-import hep.dataforge.storage.tables.TableLoader
-import hep.dataforge.tables.Adapters
-import hep.dataforge.tables.ListTable
-import hep.dataforge.tables.Table
-import hep.dataforge.tables.TableFormatBuilder
-import hep.dataforge.utils.Misc
-import hep.dataforge.values.ValueMap
-import inr.numass.data.analyzers.NumassAnalyzer
-import inr.numass.data.analyzers.TimeAnalyzer
-import inr.numass.data.api.NumassPoint
-import inr.numass.data.api.NumassSet
-import inr.numass.data.storage.NumassDataLoader
-import javafx.beans.property.SimpleObjectProperty
-import javafx.collections.FXCollections
-import javafx.collections.ObservableList
-import javafx.collections.ObservableMap
-import kotlinx.coroutines.*
-import tornadofx.*
-import java.nio.file.*
-import java.nio.file.attribute.BasicFileAttributes
-import kotlin.math.floor
-
-
-class DataController : Controller(), ContextAware {
-    override val context get() = app.context
-
-    val analyzer = TimeAnalyzer()
-
-    inner class CachedPoint(point: NumassPoint) {
-        val length = point.length
-
-        val voltage = point.voltage
-
-        val index = point.index
-
-        val meta = point.meta
-
-        val channelSpectra: Deferred<Map<Int, Table>> = context.async(start = CoroutineStart.LAZY) {
-            point.channels.mapValues { (_, value) -> analyzer.getAmplitudeSpectrum(value) }
-        }
-
-        val spectrum: Deferred<Table> = context.async(start = CoroutineStart.LAZY){
-            analyzer.getAmplitudeSpectrum(point)
-        }
-
-        val timeSpectrum: Deferred<Table> = context.async(start = CoroutineStart.LAZY) {
-            val cr = spectrum.await().sumOf {
-                it.getValue(NumassAnalyzer.COUNT_KEY).int
-            }.toDouble() / point.length.toMillis() * 1000
-
-            val binNum = 200
-            //inputMeta.getInt("binNum", 1000);
-            val binSize = 1.0 / cr * 10 / binNum * 1e6
-            //inputMeta.getDouble("binSize", 1.0 / cr * 10 / binNum * 1e6)
-
-            val format = TableFormatBuilder()
-                .addNumber("x", Adapters.X_VALUE_KEY)
-                .addNumber(NumassAnalyzer.COUNT_KEY, Adapters.Y_VALUE_KEY)
-                .build()
-
-            ListTable.Builder(format).rows(
-                analyzer.getEventsWithDelay(point, Meta.empty())
-                    .map { it.second.toDouble() / 1000.0 }
-                    .groupBy { floor(it / binSize) }
-                    .toSortedMap()
-                    .map {
-                        ValueMap.ofPairs("x" to it.key, "count" to it.value.count())
-                    }
-            ).build()
-        }
-    }
-
-    private val cache = Misc.getLRUCache<Name, CachedPoint>(400)
-
-    fun getCachedPoint(id: Name, point: NumassPoint): CachedPoint = cache.getOrPut(id) { CachedPoint(point) }
-
-    fun getSpectrumAsync(id: Name, point: NumassPoint): Deferred<Table> =
-        getCachedPoint(id, point).spectrum
-
-    suspend fun getChannelSpectra(id: Name, point: NumassPoint): Map<Int, Table> =
-        getCachedPoint(id, point).channelSpectra.await()
-
-    val sets: ObservableMap<Name, NumassSet> = FXCollections.observableHashMap()
-    val points: ObservableMap<Name, CachedPoint> = FXCollections.observableHashMap()
-    val sc: ObservableMap<Name, TableLoader> = FXCollections.observableHashMap()
-
-    val files: ObservableList<Path> = FXCollections.observableArrayList()
-
-    val watchPathProperty = SimpleObjectProperty<Path?>()
-
-    private var watchJob: Job? = null
-
-    init {
-        watchPathProperty.onChange { watchPath ->
-            watchJob?.cancel()
-            if (watchPath != null) {
-                Files.list(watchPath).toList()
-                    .filter {
-                        !Files.isDirectory(it) && it.fileName.toString()
-                            .startsWith(NumassDataLoader.POINT_FRAGMENT_NAME)
-                    }
-                    .sortedBy { file ->
-                        val attr = Files.readAttributes(file, BasicFileAttributes::class.java)
-                        attr.creationTime()
-                    }.forEach { path ->
-                        try {
-                            runLater {
-                                files.add(path)
-                            }
-                        } catch (x: Throwable) {
-                            app.context.logger.error("Error during dynamic point read", x)
-                        }
-                    }
-                val watcher = watchPath.fileSystem.newWatchService()
-                watchJob = app.context.launch(Dispatchers.IO) {
-                    watcher.use { watcher ->
-                        watchPath.register(watcher, StandardWatchEventKinds.ENTRY_CREATE)
-                        while (isActive) {
-                            val key: WatchKey = watcher.take()
-                            for (event: WatchEvent<*> in key.pollEvents()) {
-                                if (event.kind() == StandardWatchEventKinds.ENTRY_CREATE) {
-                                    val path: Path = event.context() as Path
-                                    runLater {
-                                        files.add(watchPath.resolve(path))
-                                    }
-                                }
-                            }
-                            key.reset()
-                        }
-                    }
-                }
-            }
-        }
-    }
-
-    fun clear() {
-        cache.clear()
-        sets.clear()
-        points.clear()
-        sc.clear()
-        watchPathProperty.set(null)
-    }
-
-
-    fun addPoint(id: Name, point: NumassPoint): CachedPoint {
-        val newPoint = getCachedPoint(id, point)
-        points[id] = newPoint
-        return newPoint
-    }
-
-    fun addSet(id: Name, set: NumassSet) {
-        sets[id] = set
-    }
-
-    fun addSc(id: Name, set: TableLoader) {
-        sc[id] = set
-    }
-
-    fun remove(id: Name) {
-        points.remove(id)
-        sets.remove(id)
-        sc.remove(id)
-    }
-
-    //
-    // fun addAllPoints(points: Map<String, NumassPoint>) {
-    //     TODO()
-    // }
-
-
-}
@@ -1,93 +0,0 @@
-package inr.numass.viewer
-
-import hep.dataforge.asName
-import hep.dataforge.fx.dfIconView
-import hep.dataforge.io.envelopes.Envelope
-import hep.dataforge.names.AlphanumComparator
-import hep.dataforge.names.Name
-import inr.numass.data.NumassDataUtils
-import inr.numass.data.NumassEnvelopeType
-import inr.numass.data.api.NumassPoint
-import inr.numass.data.storage.NumassDataLoader
-import kotlinx.coroutines.launch
-import tornadofx.*
-import java.nio.file.Path
-
-class DirectoryWatchView : View(title = "Numass storage", icon = dfIconView) {
-
-    private val dataController by inject<DataController>()
-
-    private val ampView: AmplitudeView by inject()
-    private val timeView: TimeView by inject()
-
-    // private val files: ObservableList<DataController.CachedPoint> =
-    //     FXCollections.observableArrayList<DataController.CachedPoint>().apply {
-    //         bind(dataController.points) { _, v -> v }
-    //     }
-
-    private fun readPointFile(path: Path): NumassPoint {
-        val envelope: Envelope = NumassEnvelopeType.infer(path)?.reader?.read(path)
-            ?: kotlin.error("Can't read point file")
-        return NumassDataUtils.read(envelope)
-    }
-
-    //private class PointContainer(val path: Path, val checkedProperty: BooleanProperty = SimpleBooleanProperty())
-
-    override val root = splitpane {
-        listview(dataController.files.sorted { l, r -> AlphanumComparator.compare(l.toString(), r.toString()) }) {
-            cellFormat { path: Path ->
-                if (path.fileName.toString().startsWith(NumassDataLoader.POINT_FRAGMENT_NAME)) {
-                    val name = Name.of(path.map { it.toString().asName() })
-                    text = null
-                    graphic = checkbox(path.fileName.toString()).apply {
-                        isSelected = dataController.points.containsKey(name)
-                        selectedProperty().onChange {
-                            if (it) {
-                                app.context.launch {
-                                    dataController.addPoint(name, readPointFile(path))
-                                }
-                            } else {
-                                dataController.remove(name)
-                            }
-                        }
-                    }
-
-                    // app.context.launch {
-                    //     val point = readPointFile(path)
-                    //     val cachedPoint = dataController.addPoint(path.toString().asName(), point)
-                    //
-                    //     //val point = dataController.getCachedPoint(value.toString())
-                    //     withContext(Dispatchers.JavaFx) {
-                    //         contextMenu = ContextMenu().apply {
-                    //             item("Info") {
-                    //                 action {
-                    //                     PointInfoView(cachedPoint).openModal(escapeClosesWindow = true)
-                    //                 }
-                    //             }
-                    //         }
-                    //     }
-                    // }
-
-                } else {
-                    text = path.fileName.toString()
-                    graphic = null
-                    contextMenu = null
-                }
-            }
-        }
-
-        tabpane {
-            tab("Amplitude spectra") {
-                content = ampView.root
-                isClosable = false
-                //visibleWhen(ampView.isEmpty.not())
-            }
-            tab("Time spectra") {
-                content = timeView.root
-                isClosable = false
-                //visibleWhen(ampView.isEmpty.not())
-            }
-        }
-        setDividerPosition(0, 0.3);
-    }
-}
@@ -11,8 +11,11 @@ import hep.dataforge.plots.data.TimePlot
 import hep.dataforge.plots.jfreechart.JFreeChartFrame
 import hep.dataforge.tables.Adapters
 import inr.numass.data.api.NumassSet
+import javafx.collections.FXCollections
 import javafx.collections.MapChangeListener
+import javafx.collections.ObservableMap
 import javafx.scene.image.ImageView
+import kotlinx.coroutines.Dispatchers
 import tornadofx.*


@ -21,9 +24,6 @@ import tornadofx.*
|
|||||||
*/
|
*/
|
||||||
class HVView : View(title = "High voltage time plot", icon = ImageView(dfIcon)) {
|
class HVView : View(title = "High voltage time plot", icon = ImageView(dfIcon)) {
|
||||||
|
|
||||||
private val dataController by inject<DataController>()
|
|
||||||
private val data get() = dataController.sets
|
|
||||||
|
|
||||||
private val frame = JFreeChartFrame().configure {
|
private val frame = JFreeChartFrame().configure {
|
||||||
"xAxis.title" to "time"
|
"xAxis.title" to "time"
|
||||||
"xAxis.type" to "time"
|
"xAxis.type" to "time"
|
||||||
@ -44,22 +44,23 @@ class HVView : View(title = "High voltage time plot", icon = ImageView(dfIcon))
|
|||||||
center = PlotContainer(frame).root
|
center = PlotContainer(frame).root
|
||||||
}
|
}
|
||||||
|
|
||||||
|
private val data: ObservableMap<String, NumassSet> = FXCollections.observableHashMap()
|
||||||
val isEmpty = booleanBinding(data) { data.isEmpty() }
|
val isEmpty = booleanBinding(data) { data.isEmpty() }
|
||||||
|
|
||||||
init {
|
init {
|
||||||
data.addListener { change: MapChangeListener.Change<out Name, out NumassSet> ->
|
data.addListener { change: MapChangeListener.Change<out String, out NumassSet> ->
|
||||||
isEmpty.invalidate()
|
isEmpty.invalidate()
|
||||||
if (change.wasRemoved()) {
|
if (change.wasRemoved()) {
|
||||||
frame.plots.remove(change.key)
|
frame.plots.remove(Name.ofSingle(change.key))
|
||||||
}
|
}
|
||||||
if (change.wasAdded()) {
|
if (change.wasAdded()) {
|
||||||
runLater { container.progress = -1.0 }
|
runLater { container.progress = -1.0 }
|
||||||
runGoal(app.context,"hvData[${change.key}]") {
|
runGoal(app.context,"hvData[${change.key}]", Dispatchers.IO) {
|
||||||
change.valueAdded.getHvData()
|
change.valueAdded.getHvData()
|
||||||
} ui { table ->
|
} ui { table ->
|
||||||
if (table != null) {
|
if (table != null) {
|
||||||
((frame[change.key.toString()] as? DataPlot)
|
((frame[change.key] as? DataPlot)
|
||||||
?: DataPlot(change.key.toString(), adapter = Adapters.buildXYAdapter("timestamp", "value")).also { frame.add(it) })
|
?: DataPlot(change.key, adapter = Adapters.buildXYAdapter("timestamp", "value")).also { frame.add(it) })
|
||||||
.fillData(table)
|
.fillData(table)
|
||||||
}
|
}
|
||||||
|
|
||||||
@ -70,4 +71,18 @@ class HVView : View(title = "High voltage time plot", icon = ImageView(dfIcon))
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
|
operator fun set(id: String, set: NumassSet) {
|
||||||
|
data[id] = set
|
||||||
|
}
|
||||||
|
|
||||||
|
fun remove(id: String) {
|
||||||
|
data.remove(id);
|
||||||
|
}
|
||||||
|
|
||||||
|
fun clear() {
|
||||||
|
data.clear()
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
}
|
}
|
||||||
|
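The change above replaces HVView's shared DataController lookup with a view-owned, String-keyed ObservableMap plus set/remove/clear operators, so callers now push NumassSet instances straight into the view. A minimal usage sketch of that new API (the call site, the find<HVView>() lookup and the "set_2" id are illustrative assumptions, not taken from the diff):

    // inside some TornadoFX component that already has a loaded NumassSet in `loadedSet`
    val hvView = find<HVView>()          // assuming the view is resolvable through the TornadoFX scope
    hvView["set_2"] = loadedSet          // fires the MapChangeListener above and plots the HV data
    hvView.remove("set_2")               // drops the corresponding plot from the frame
    hvView.clear()                       // removes all plots at once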
@@ -1,6 +1,5 @@
 package inr.numass.viewer

-import hep.dataforge.asName
 import hep.dataforge.fx.dfIconView
 import hep.dataforge.fx.except
 import hep.dataforge.fx.runGoal
@@ -30,7 +29,7 @@ import java.nio.file.Path

 class MainView : View(title = "Numass viewer", icon = dfIconView) {

-    private val dataController by inject<DataController>()
+    private val pointCache by inject<PointCache>()

     val storageView by inject<StorageView>()

@@ -39,123 +38,16 @@ class MainView : View(title = "Numass viewer", icon = dfIconView) {
     //    addLogHandler(context.logger)
     // }

-    private val pathProperty = SimpleObjectProperty<Path?>()
+    private val pathProperty = SimpleObjectProperty<Path>()
+    private var path: Path by pathProperty

     private val contentViewProperty = SimpleObjectProperty<UIComponent>()
-    private var contentView: UIComponent? by contentViewProperty
+    var contentView: UIComponent? by contentViewProperty
-    private val spectrumView by inject<SpectrumView>()
-    private val amplitudeView by inject<AmplitudeView>()
-    private val directoryWatchView by inject<DirectoryWatchView>()
-
-    init {
-        contentViewProperty.onChange {
-            root.center = it?.root
-        }
-    }
-
-    private fun loadDirectory(path: Path){
-        app.context.launch {
-            dataController.clear()
-            runLater {
-                pathProperty.set(path)
-                contentView = null
-            }
-            if (Files.exists(path.resolve(NumassDataLoader.META_FRAGMENT_NAME))) {
-                //build set view
-                runGoal(app.context, "viewer.load.set[$path]") {
-                    title = "Load set ($path)"
-                    message = "Building numass set..."
-                    NumassDataLoader(app.context, null, path.fileName.toString(), path)
-                } ui { loader: NumassDataLoader ->
-                    contentView = spectrumView
-                    dataController.addSet(loader.name.asName(), loader)
-
-                } except {
-                    alert(
-                        type = Alert.AlertType.ERROR,
-                        header = "Error during set loading",
-                        content = it.toString()
-                    ).show()
-                }
-            } else {
-                //build storage
-                app.context.launch {
-                    val storageElement =
-                        NumassDirectory.INSTANCE.read(app.context, path) as? Storage
-                    withContext(Dispatchers.JavaFx) {
-                        contentView = storageView
-                        storageView.storageProperty.set(storageElement)
-                    }
-                }
-            }
-        }
-    }
-
-    private fun loadFile(path: Path){
-        app.context.launch {
-            dataController.clear()
-            runLater {
-                pathProperty.set(path)
-                contentView = null
-            }
-            //Reading individual file
-            val envelope = try {
-                NumassFileEnvelope(path)
-            } catch (ex: Exception) {
-                runLater {
-                    alert(
-                        type = Alert.AlertType.ERROR,
-                        header = "Can't load DF envelope from file $path",
-                        content = ex.toString()
-                    ).show()
-                }
-                null
-            }
-
-            envelope?.let {
-                //try to read as point
-                val point = NumassDataUtils.read(it)
-                runLater {
-                    contentView = amplitudeView
-                    dataController.addPoint(path.fileName.toString().asName(), point)
-                }
-            }
-        }
-    }
-
-    private fun watchDirectory(path: Path){
-        app.context.launch {
-            dataController.clear()
-            runLater {
-                pathProperty.set(path)
-                contentView = directoryWatchView
-                dataController.watchPathProperty.set(path)
-            }
-        }
-    }
-

     override val root = borderpane {
         prefHeight = 600.0
         prefWidth = 800.0
         top {
-            //bypass top configuration bar and only watch directory
-            app.parameters.named["directory"]?.let{ pathString ->
-                val path = Path.of(pathString).toAbsolutePath()
-                watchDirectory(path)
-                toolbar{
-                    prefHeight = 40.0
-                    label("Watching $path") {
-                        padding = Insets(0.0, 0.0, 0.0, 10.0)
-                        font = Font.font("System Bold", 13.0)
-                    }
-                    pane {
-                        hgrow = Priority.ALWAYS
-                    }
-                }
-                return@top
-            }
-
             toolbar {
                 prefHeight = 40.0
                 button("Load directory") {
@@ -179,7 +71,12 @@ class MainView : View(title = "Numass viewer", icon = dfIconView) {

                             if (rootDir != null) {
                                 NumassProperties.setNumassProperty("numass.viewer.lastPath", rootDir.absolutePath)
-                                loadDirectory(rootDir.toPath())
+                                app.context.launch {
+                                    runLater {
+                                        path = rootDir.toPath()
+                                    }
+                                    load(rootDir.toPath())
+                                }
                             }
                         } catch (ex: Exception) {
                             NumassProperties.setNumassProperty("numass.viewer.lastPath", null)
@@ -204,38 +101,13 @@ class MainView : View(title = "Numass viewer", icon = dfIconView) {
                             if (file != null) {
                                 NumassProperties.setNumassProperty("numass.viewer.lastPath",
                                         file.parentFile.absolutePath)
-                                loadFile(file.toPath())
-                            }
-                        } catch (ex: Exception) {
-                            NumassProperties.setNumassProperty("numass.viewer.lastPath", null)
-                            error("Error", content = "Failed to laod file with message: ${ex.message}")
-                        }
-                    }
-                }
-
-                button("Watch directory") {
-                    action {
-                        val chooser = DirectoryChooser()
-                        chooser.title = "Select directory to watch"
-                        val homeDir = NumassProperties.getNumassProperty("numass.viewer.lastPath")
-                        try {
-                            if (homeDir == null) {
-                                chooser.initialDirectory = File(".").absoluteFile
-                            } else {
-                                val file = File(homeDir)
-                                if (file.isDirectory) {
-                                    chooser.initialDirectory = file
-                                } else {
-                                    chooser.initialDirectory = file.parentFile
+                                app.context.launch {
+                                    runLater {
+                                        path = file.toPath()
+                                    }
+                                    load(file.toPath())
                                 }
                             }

-                            val dir = chooser.showDialog(primaryStage.scene.window)
-
-                            if (dir != null) {
-                                NumassProperties.setNumassProperty("numass.viewer.lastPath", dir.absolutePath)
-                                watchDirectory(dir.toPath())
-                            }
                         } catch (ex: Exception) {
                             NumassProperties.setNumassProperty("numass.viewer.lastPath", null)
                             error("Error", content = "Failed to laod file with message: ${ex.message}")
@@ -259,4 +131,71 @@ class MainView : View(title = "Numass viewer", icon = dfIconView) {
         bottom = statusBar
     }

+    init {
+        contentViewProperty.onChange {
+            root.center = it?.root
+        }
+    }
+
+    private suspend fun load(path: Path) {
+        runLater {
+            contentView = null
+        }
+        pointCache.clear()
+        if (Files.isDirectory(path)) {
+            if (Files.exists(path.resolve(NumassDataLoader.META_FRAGMENT_NAME))) {
+                //build set view
+                runGoal(app.context, "viewer.load.set[$path]", Dispatchers.IO) {
+                    title = "Load set ($path)"
+                    message = "Building numass set..."
+                    NumassDataLoader(app.context, null, path.fileName.toString(), path)
+                } ui { loader: NumassDataLoader ->
+                    contentView = SpectrumView().apply {
+                        clear()
+                        set(loader.name, loader)
+                    }
+                } except {
+                    alert(
+                        type = Alert.AlertType.ERROR,
+                        header = "Error during set loading",
+                        content = it.toString()
+                    ).show()
+                }
+            } else {
+                //build storage
+                app.context.launch {
+                    val storageElement = NumassDirectory.INSTANCE.read(app.context, path) as Storage
+                    withContext(Dispatchers.JavaFx) {
+                        contentView = storageView
+                        storageView.storageProperty.set(storageElement)
+                    }
+                }
+            }
+        } else {
+            //Reading individual file
+            val envelope = try {
+                NumassFileEnvelope(path)
+            } catch (ex: Exception) {
+                runLater {
+                    alert(
+                        type = Alert.AlertType.ERROR,
+                        header = "Can't load DF envelope from file $path",
+                        content = ex.toString()
+                    ).show()
+                }
+                null
+            }

+            envelope?.let {
+                //try to read as point
+                val point = NumassDataUtils.read(it)
+                runLater {
+                    contentView = AmplitudeView().apply {
+                        set(path.fileName.toString(), point)
+                    }
+                }
+            }
+        }
+    }

 }
@@ -0,0 +1,64 @@
+/*
+ * Copyright 2018 Alexander Nozik.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package inr.numass.viewer
+
+import hep.dataforge.tables.Table
+import hep.dataforge.utils.Misc
+import inr.numass.data.analyzers.SimpleAnalyzer
+import inr.numass.data.api.NumassPoint
+import kotlinx.coroutines.Deferred
+import kotlinx.coroutines.Dispatchers
+import kotlinx.coroutines.async
+import tornadofx.*
+
+
+private val analyzer = SimpleAnalyzer()
+
+
+class PointCache : Controller() {
+    private val context = app.context
+
+    inner class CachedPoint(point: NumassPoint) {
+        val length = point.length
+
+        val voltage = point.voltage
+
+        val meta = point.meta
+
+        val channelSpectra: Deferred<Map<Int, Table>> = context.async(Dispatchers.IO) {
+            point.channels.mapValues { (_, value) -> analyzer.getAmplitudeSpectrum(value) }
+        }
+
+        val spectrum: Deferred<Table> = context.async(Dispatchers.IO) {
+            analyzer.getAmplitudeSpectrum(point)
+        }
+    }
+
+    private val cache = Misc.getLRUCache<String, CachedPoint>(400)
+
+    fun getCachedPoint(id: String, point: NumassPoint): CachedPoint = cache.getOrPut(id) { CachedPoint(point) }
+
+    fun getSpectrumAsync(id: String, point: NumassPoint): Deferred<Table> =
+        getCachedPoint(id, point).spectrum
+
+    suspend fun getChannelSpectra(id: String, point: NumassPoint): Map<Int, Table> =
+        getCachedPoint(id, point).channelSpectra.await()
+
+    fun clear() {
+        cache.clear()
+    }
+}
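PointCache (added above) memoizes per-point analysis results in an LRU map keyed by a string id; each CachedPoint schedules its amplitude spectra on Dispatchers.IO as Deferred values, so repeated requests reuse the same computation. A consumer sketch under those assumptions (the id string and the surrounding component are hypothetical, not from the diff):

    // inside a TornadoFX component that has a NumassPoint at hand
    val pointCache: PointCache by inject()
    val cached = pointCache.getCachedPoint("set_2/14000.0[3]", point)
    app.context.launch {
        val spectrum = cached.spectrum.await()          // full amplitude spectrum, computed once and cached
        val perChannel = cached.channelSpectra.await()  // Map<Int, Table>: one spectrum per detector channel
        // ...feed the tables into a DataPlot, as the views in this diff do
    }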
@@ -9,7 +9,7 @@ import tornadofx.*
 import tornadofx.controlsfx.borders
 import tornadofx.controlsfx.toGlyph

-class PointInfoView(val cachedPoint: DataController.CachedPoint) : MetaViewer(cachedPoint.meta) {
+class PointInfoView(val cachedPoint: PointCache.CachedPoint) : MetaViewer(cachedPoint.meta) {
     val countProperty = SimpleIntegerProperty(0)
     var count by countProperty

@@ -5,15 +5,18 @@ import hep.dataforge.fx.dfIcon
 import hep.dataforge.fx.plots.PlotContainer
 import hep.dataforge.fx.runGoal
 import hep.dataforge.fx.ui
-import hep.dataforge.names.Name
 import hep.dataforge.plots.PlotGroup
 import hep.dataforge.plots.data.DataPlot
 import hep.dataforge.plots.jfreechart.JFreeChartFrame
 import hep.dataforge.storage.tables.TableLoader
 import hep.dataforge.storage.tables.asTable
 import hep.dataforge.tables.Adapters
+import hep.dataforge.tables.Table
+import javafx.collections.FXCollections
 import javafx.collections.MapChangeListener
+import javafx.collections.ObservableMap
 import javafx.scene.image.ImageView
+import kotlinx.coroutines.Dispatchers
 import tornadofx.*

 /**
@@ -21,9 +24,6 @@ import tornadofx.*
  */
 class SlowControlView : View(title = "Numass slow control view", icon = ImageView(dfIcon)) {

-    private val dataController by inject<DataController>()
-    private val data get() = dataController.sc
-
     private val plot = JFreeChartFrame().configure {
         "xAxis.type" to "time"
         "yAxis.type" to "log"
@@ -33,21 +33,22 @@ class SlowControlView : View(title = "Numass slow control view", icon = ImageVie
         center = PlotContainer(plot).root
     }

+    val data: ObservableMap<String, TableLoader> = FXCollections.observableHashMap();
     val isEmpty = booleanBinding(data) {
         data.isEmpty()
     }

     init {
-        data.addListener { change: MapChangeListener.Change<out Name, out TableLoader> ->
+        data.addListener { change: MapChangeListener.Change<out String, out TableLoader> ->
             if (change.wasRemoved()) {
-                plot.remove(change.key.toString())
+                plot.remove(change.key)
             }
             if (change.wasAdded()) {
-                runGoal(app.context,"loadTable[${change.key}]") {
-                    val plotData = change.valueAdded.asTable().await()
+                runGoal(app.context,"loadTable[${change.key}]", Dispatchers.IO) {
+                    val plotData = getData(change.valueAdded)
                     val names = plotData.format.namesAsArray().filter { it != "timestamp" }

-                    val group = PlotGroup(change.key.toString())
+                    val group = PlotGroup(change.key)

                     names.forEach {
                         val adapter = Adapters.buildXYAdapter("timestamp", it);
@@ -67,4 +68,21 @@ class SlowControlView : View(title = "Numass slow control view", icon = ImageVie
         }
     }

+    private suspend fun getData(loader: TableLoader): Table {
+        //TODO add query
+        return loader.asTable().await()
+    }
+
+    operator fun set(id: String, loader: TableLoader) {
+        this.data[id] = loader
+    }
+
+    fun remove(id: String) {
+        this.data.remove(id)
+    }
+
+    fun clear() {
+        data.clear()
+    }
+
 }
@@ -10,13 +10,17 @@ import hep.dataforge.tables.Adapters
 import inr.numass.data.analyzers.countInWindow
 import inr.numass.data.api.NumassSet
 import javafx.beans.property.SimpleIntegerProperty
+import javafx.collections.FXCollections
 import javafx.collections.MapChangeListener
+import javafx.collections.ObservableMap
 import javafx.geometry.Insets
 import javafx.geometry.Orientation
 import javafx.scene.image.ImageView
 import javafx.util.converter.NumberStringConverter
-import kotlinx.coroutines.*
+import kotlinx.coroutines.Dispatchers
 import kotlinx.coroutines.javafx.JavaFx
+import kotlinx.coroutines.launch
+import kotlinx.coroutines.withContext
 import org.controlsfx.control.RangeSlider
 import tornadofx.*
 import java.util.concurrent.atomic.AtomicInteger
@@ -29,8 +33,7 @@ import kotlin.math.sqrt
  */
 class SpectrumView : View(title = "Numass spectrum plot", icon = ImageView(dfIcon)) {

-    private val dataController by inject<DataController>()
-    private val data get() = dataController.sets
+    private val pointCache by inject<PointCache>()

     private val frame = JFreeChartFrame().configure {
         "xAxis.title" to "U"
@@ -41,6 +44,7 @@ class SpectrumView : View(title = "Numass spectrum plot", icon = ImageView(dfIco
     }
     private val container = PlotContainer(frame)


     private val loChannelProperty = SimpleIntegerProperty(500).apply {
         addListener { _ -> updateView() }
     }
@@ -51,7 +55,9 @@ class SpectrumView : View(title = "Numass spectrum plot", icon = ImageView(dfIco
     }
     private var upChannel by upChannelProperty

-    private val isEmpty = booleanBinding(data) { data.isEmpty() }
+    private val data: ObservableMap<String, NumassSet> = FXCollections.observableHashMap()
+    val isEmpty = booleanBinding(data) { data.isEmpty() }

     override val root = borderpane {
         top {
@@ -97,9 +103,9 @@ class SpectrumView : View(title = "Numass spectrum plot", icon = ImageView(dfIco
     }

     init {
-        data.addListener { change: MapChangeListener.Change<out Name, out NumassSet> ->
+        data.addListener { change: MapChangeListener.Change<out String, out NumassSet> ->
             if (change.wasRemoved()) {
-                frame.plots.remove(Name.ofSingle(change.key.toString()))
+                frame.plots.remove(Name.ofSingle(change.key))
             }

             if (change.wasAdded()) {
@@ -115,19 +121,18 @@ class SpectrumView : View(title = "Numass spectrum plot", icon = ImageView(dfIco
         val totalProgress = data.values.stream().mapToInt { it.points.size }.sum()

         data.forEach { (name, set) ->
-            val plot: DataPlot = frame.plots[name] as DataPlot? ?: DataPlot(name.toString()).apply {
-                frame.add(this)
-            }
+            val plot: DataPlot =
+                frame.plots[Name.ofSingle(name)] as DataPlot? ?: DataPlot(name).apply { frame.add(this) }

             app.context.launch {
-                val points = set.points.map { point ->
-                    dataController.getCachedPoint(Name.join("$name","${point.voltage}[${point.index}]"), point).also {
-                        it.spectrum.start()
-                    }
+                val points = set.points.map {
+                    pointCache.getCachedPoint("$name/${it.voltage}[${it.index}]", it)
                 }.map { cachedPoint ->
                     val count = cachedPoint.spectrum.await().countInWindow(loChannel.toShort(), upChannel.toShort())
                     val seconds = cachedPoint.length.toMillis() / 1000.0
+                    launch(Dispatchers.JavaFx) {
+                        container.progress = progress.incrementAndGet().toDouble() / totalProgress
+                    }
                     Adapters.buildXYDataPoint(
                         cachedPoint.voltage,
                         (count / seconds),
@@ -141,4 +146,16 @@ class SpectrumView : View(title = "Numass spectrum plot", icon = ImageView(dfIco
             }
         }
     }
+
+    operator fun set(key: String, value: NumassSet) {
+        data[key] = value
+    }
+
+    fun remove(key: String) {
+        data.remove(key)
+    }
+
+    fun clear() {
+        data.clear()
+    }
 }
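SpectrumView's update pass above turns every cached point into a count-rate point: events inside the selected amplitude window [loChannel, upChannel) are counted and divided by the point length in seconds. The kotlin.math.sqrt import visible in the hunk context suggests the error bar is the usual Poisson estimate; the helper below only illustrates that arithmetic and is not code from the repository:

    import kotlin.math.sqrt

    // rate and an assumed Poisson error for one spectrum point
    fun ratePoint(countInWindow: Long, lengthMillis: Long): Pair<Double, Double> {
        val seconds = lengthMillis / 1000.0                     // point length, as in cachedPoint.length
        val rate = countInWindow / seconds                      // the (count / seconds) value plotted above
        val error = sqrt(countInWindow.toDouble()) / seconds    // assumption: sqrt(N) counting error
        return rate to error
    }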
@@ -1,19 +1,15 @@
 package inr.numass.viewer

-import hep.dataforge.asName
 import hep.dataforge.fx.dfIconView
 import hep.dataforge.fx.meta.MetaViewer
 import hep.dataforge.meta.Meta
 import hep.dataforge.meta.Metoid
 import hep.dataforge.names.AlphanumComparator
-import hep.dataforge.names.Name
 import hep.dataforge.storage.Storage
 import hep.dataforge.storage.files.FileTableLoader
 import hep.dataforge.storage.tables.TableLoader
-import inr.numass.data.NumassDataUtils
 import inr.numass.data.api.NumassPoint
 import inr.numass.data.api.NumassSet
-import inr.numass.data.storage.EnvelopeStorageElement
 import inr.numass.data.storage.NumassDataLoader
 import javafx.beans.property.SimpleBooleanProperty
 import javafx.beans.property.SimpleObjectProperty
@@ -29,7 +25,7 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {
     val storageProperty = SimpleObjectProperty<Storage>()
     val storage by storageProperty

-    private val dataController by inject<DataController>()
+    private val pointCache by inject<PointCache>()

     private val ampView: AmplitudeView by inject()
     private val timeView: TimeView by inject()
@@ -37,15 +33,27 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {
     private val hvView: HVView by inject()
     private val scView: SlowControlView by inject()

-    private inner class Container(val name: Name, val content: Any) {
+    // private var watcher: WatchService? = null


+    fun clear() {
+        //watcher?.close()
+        ampView.clear()
+        timeView.clear()
+        spectrumView.clear()
+        hvView.clear()
+        scView.clear()
+    }
+
+    private inner class Container(val id: String, val content: Any) {
         val checkedProperty = SimpleBooleanProperty(false)
         var checked by checkedProperty

         val infoView: UIComponent by lazy {
             when (content) {
-                is NumassPoint -> PointInfoView(dataController.getCachedPoint(name, content))
-                is Metoid -> MetaViewer(content.meta, title = "Meta view: $name")
-                else -> MetaViewer(Meta.empty(), title = "Meta view: $name")
+                is NumassPoint -> PointInfoView(pointCache.getCachedPoint(id, content))
+                is Metoid -> MetaViewer(content.meta, title = "Meta view: $id")
+                else -> MetaViewer(Meta.empty(), title = "Meta view: $id")
             }
         }

@@ -56,23 +64,27 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {
             when (content) {
                 is NumassPoint -> {
                     if (selected) {
-                        dataController.addPoint(name, content)
+                        ampView[id] = content
+                        timeView[id] = content
                     } else {
-                        dataController.remove(name)
+                        ampView.remove(id)
+                        timeView.remove(id)
                     }
                 }
                 is NumassSet -> {
                     if (selected) {
-                        dataController.addSet(name, content)
+                        spectrumView[id] = content
+                        hvView[id] = content
                     } else {
-                        dataController.remove(name)
+                        spectrumView.remove(id)
+                        hvView.remove(id)
                     }
                 }
                 is TableLoader -> {
                     if (selected) {
-                        dataController.addSc(name, content)
+                        scView[id] = content
                     } else {
-                        dataController.remove(name)
+                        scView.remove(id)
                     }
                 }
             }
@@ -85,19 +97,9 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {

         val children: ObservableList<Container>? by lazy {
             when (content) {
-                is Storage -> content.getChildren().mapNotNull {
-                    if (it is EnvelopeStorageElement) {
-                        it.envelope?.let { envelope ->
-                            try {
-                                buildContainer(NumassDataUtils.read(envelope), this)
-                            } catch (ex: Exception) {
-                                null
-                            }
-                        }
-                    } else {
-                        buildContainer(it, this)
-                    }
-                }.sortedWith(Comparator.comparing({ it.name.toString() }, AlphanumComparator)).asObservable()
+                is Storage -> content.getChildren().map {
+                    buildContainer(it, this)
+                }.sortedWith(Comparator.comparing({ it.id }, AlphanumComparator)).asObservable()
                 is NumassSet -> content.points
                     .sortedBy { it.index }
                     .map { buildContainer(it, this) }
@@ -110,6 +112,38 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {

         private var watchJob: Job? = null

+        // private fun toggleWatch(watch: Boolean) {
+        //     if (watch) {
+        //         if (watchJob != null && content is NumassDataLoader) {
+        //             watchJob = app.context.launch(Dispatchers.IO) {
+        //                 val key: WatchKey = content.path.register(watcher!!, ENTRY_CREATE)
+        //                 coroutineContext[Job]?.invokeOnCompletion {
+        //                     key.cancel()
+        //                 }
+        //                 while (watcher != null && isActive) {
+        //                     try {
+        //                         key.pollEvents().forEach { event ->
+        //                             if (event.kind() == ENTRY_CREATE) {
+        //                                 val path: Path = event.context() as Path
+        //                                 if (path.fileName.toString().startsWith(NumassDataLoader.POINT_FRAGMENT_NAME)) {
+        //                                     val envelope: Envelope = NumassEnvelopeType.infer(path)?.reader?.read(path)
+        //                                         ?: kotlin.error("Can't read point file")
+        //                                     val point = NumassDataUtils.read(envelope)
+        //                                     children!!.add(buildContainer(point, this@Container))
+        //                                 }
+        //                             }
+        //                         }
+        //                     } catch (x: Throwable) {
+        //                         app.context.logger.error("Error during dynamic point read", x)
+        //                     }
+        //                 }
+        //             }
+        //         }
+        //     } else {
+        //         watchJob?.cancel()
+        //         watchJob = null
+        //     }
+        // }
     }


@@ -117,16 +151,21 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {
         treeview<Container> {
             //isShowRoot = false
             storageProperty.onChange { storage ->
-                dataController.clear()
+                clear()
                 if (storage == null) return@onChange
-                root = TreeItem(Container(storage.name.asName(), storage))
+                root = TreeItem(Container(storage.name, storage))
                 root.isExpanded = true
                 lazyPopulate(leafCheck = {
                     !it.value.hasChildren
                 }) {
                     it.value.children
                 }
+                // watcher?.close()
+                // watcher = if (storage is FileStorage) {
+                //     storage.path.fileSystem.newWatchService()
+                // } else {
+                //     null
+                // }
             }

             cellFormat { value: Container ->
@@ -154,7 +193,7 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {
                     }
                 }
                 else -> {
-                    text = value.name.toString()
+                    text = value.id
                     graphic = null
                 }
             }
@@ -169,6 +208,11 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {
                         value.infoView.openModal(escapeClosesWindow = true)
                     }
                 }
+                // if(value.content is NumassDataLoader) {
+                //     checkmenuitem("Watch") {
+                //         selectedProperty().bindBidirectional(value.watchedProperty)
+                //     }
+                // }
             }
         }
     }
@@ -212,19 +256,19 @@ class StorageView : View(title = "Numass storage", icon = dfIconView) {

     private fun buildContainer(content: Any, parent: Container): Container =
         when (content) {
-            is Storage -> Container(content.fullName, content)
+            is Storage -> Container(content.fullName.toString(), content)
             is NumassSet -> {
-                val id: Name = if (content is NumassDataLoader) {
-                    content.fullName
+                val id: String = if (content is NumassDataLoader) {
+                    content.fullName.unescaped
                 } else {
-                    content.name.asName()
+                    content.name
                 }
                 Container(id, content)
             }
             is NumassPoint -> {
-                Container("${parent.name}/${content.voltage}[${content.index}]".asName(), content)
+                Container("${parent.id}/${content.voltage}[${content.index}]", content)
             }
-            is FileTableLoader -> Container(Name.of(content.path.map { it.toString().asName() }), content)
+            is FileTableLoader -> Container(content.path.toString(), content)
             else -> throw IllegalArgumentException("Unknown content type: ${content::class.java}");
         }
 }
@@ -2,28 +2,30 @@ package inr.numass.viewer

 import hep.dataforge.configure
 import hep.dataforge.fx.dfIcon
+import hep.dataforge.fx.except
 import hep.dataforge.fx.plots.PlotContainer
+import hep.dataforge.fx.runGoal
+import hep.dataforge.fx.ui
+import hep.dataforge.goals.Goal
+import hep.dataforge.meta.Meta
 import hep.dataforge.names.Name
+import hep.dataforge.plots.Plottable
 import hep.dataforge.plots.data.DataPlot
 import hep.dataforge.plots.jfreechart.JFreeChartFrame
 import hep.dataforge.tables.Adapters
-import hep.dataforge.tables.Table
+import hep.dataforge.values.ValueMap
+import inr.numass.data.analyzers.TimeAnalyzer
+import inr.numass.data.api.NumassPoint
+import javafx.beans.Observable
 import javafx.beans.binding.DoubleBinding
 import javafx.collections.FXCollections
-import javafx.collections.MapChangeListener
 import javafx.collections.ObservableMap
 import javafx.scene.image.ImageView
 import kotlinx.coroutines.Dispatchers
-import kotlinx.coroutines.Job
-import kotlinx.coroutines.javafx.JavaFx
-import kotlinx.coroutines.launch
-import kotlinx.coroutines.withContext
 import tornadofx.*

 class TimeView : View(title = "Numass time spectrum plot", icon = ImageView(dfIcon)) {

-    private val dataController by inject<DataController>()

     private val frame = JFreeChartFrame().configure {
         "title" to "Time plot"
         node("xAxis") {
@@ -45,65 +47,129 @@ class TimeView : View(title = "Numass time spectrum plot", icon = ImageView(dfIc
         }.setType<DataPlot>()
     }

+    // val stepProperty = SimpleDoubleProperty()
+    // var step by stepProperty
+    //
+    // private val container = PlotContainer(frame).apply {
+    //     val binningSelector: ChoiceBox<Int> = ChoiceBox(FXCollections.observableArrayList(1, 5, 10, 20, 50)).apply {
+    //         minWidth = 0.0
+    //         selectionModel.selectLast()
+    //         stepProperty.bind(this.selectionModel.selectedItemProperty())
+    //     }
+    //     addToSideBar(0, binningSelector)
+    // }

     private val container = PlotContainer(frame)

-    //private val data: ObservableMap<String, NumassPoint> = FXCollections.observableHashMap()
-    private val data get() = dataController.points
-    private val plotJobs: ObservableMap<String, Job> = FXCollections.observableHashMap()
+    private val data: ObservableMap<String, NumassPoint> = FXCollections.observableHashMap()
+    private val plots: ObservableMap<String, Goal<Plottable>> = FXCollections.observableHashMap()

     val isEmpty = booleanBinding(data) { isEmpty() }

     private val progress = object : DoubleBinding() {
         init {
-            bind(plotJobs)
+            bind(plots)
         }

-        override fun computeValue(): Double = plotJobs.values.count { it.isCompleted }.toDouble() / data.size
+        override fun computeValue(): Double {
+            return plots.values.count { it.isDone }.toDouble() / data.size;
+        }

     }

     init {
-        data.addListener(MapChangeListener { change ->
-            val key = change.key.toString()
-            if (change.wasAdded()) {
-                replotOne(key, change.valueAdded)
-            } else if(change.wasRemoved()){
-                plotJobs[key]?.cancel()
-                plotJobs.remove(key)
-                frame.plots.remove(Name.ofSingle(key))
-                progress.invalidate()
-            }
-        })
+        data.addListener { _: Observable ->
+            invalidate()
+        }

     }

     override val root = borderpane {
         center = container.root
     }

-    private fun replotOne(key: String, point: DataController.CachedPoint) {
-        plotJobs[key]?.cancel()
-        plotJobs[key] = app.context.launch {
-            try {
-                val histogram: Table = point.timeSpectrum.await()
-
-                val plot = DataPlot(key, adapter = Adapters.buildXYAdapter("x", "count"))
-                    .configure {
-                        "showLine" to true
-                        "showSymbol" to false
-                        "showErrors" to false
-                        "connectionType" to "step"
-                    }.fillData(histogram)
-                withContext(Dispatchers.JavaFx) {
-                    frame.add(plot)
-                }
-            } finally {
-                withContext(Dispatchers.JavaFx) {
-                    progress.invalidate()
-                }
-            }
-        }
-    }
+    /**
+     * Put or replace current plot with name `key`
+     */
+    operator fun set(key: String, point: NumassPoint) {
+        data[key] = point
+    }
+
+    fun addAll(data: Map<String, NumassPoint>) {
+        this.data.putAll(data);
+    }
+
+    private val analyzer = TimeAnalyzer();
+
+    private fun invalidate() {
+        data.forEach { key, point ->
+            plots.getOrPut(key) {
+                runGoal<Plottable>(app.context, "loadAmplitudeSpectrum_$key", Dispatchers.IO) {
+
+                    val initialEstimate = analyzer.analyze(point)
+                    val cr = initialEstimate.getDouble("cr")
+
+                    val binNum = 200//inputMeta.getInt("binNum", 1000);
+                    val binSize = 1.0 / cr * 10 / binNum * 1e6//inputMeta.getDouble("binSize", 1.0 / cr * 10 / binNum * 1e6)
+
+                    val histogram = analyzer.getEventsWithDelay(point, Meta.empty())
+                        .map { it.second.toDouble() / 1000.0 }
+                        .groupBy { Math.floor(it / binSize) }
+                        .toSortedMap()
+                        .map {
+                            ValueMap.ofPairs("x" to it.key, "count" to it.value.count())
+                        }
+
+                    DataPlot(key, adapter = Adapters.buildXYAdapter("x", "count"))
+                        .configure {
+                            "showLine" to true
+                            "showSymbol" to false
+                            "showErrors" to false
+                            "connectionType" to "step"
+                        }.fillData(histogram)
+
+                } ui { plot ->
+                    frame.add(plot)
+                    progress.invalidate()
+                } except {
+                    progress.invalidate()
+                }
+            }
+        }
+        plots.keys.filter { !data.containsKey(it) }.forEach { remove(it) }
+    }
+
+    fun clear() {
+        data.clear()
+        plots.values.forEach {
+            it.cancel()
+        }
+        plots.clear()
+        invalidate()
+    }
+
+    /**
+     * Remove the plot and cancel loading task if it is in progress.
+     */
+    fun remove(name: String) {
+        frame.plots.remove(Name.ofSingle(name))
+        plots[name]?.cancel()
+        plots.remove(name)
+        data.remove(name)
+        progress.invalidate()
+    }
+
+    /**
+     * Set frame content to the given map. All keys not in the map are removed.
+     */
+    fun setAll(map: Map<String, NumassPoint>) {
+        plots.clear();
+        //Remove obsolete keys
+        data.keys.filter { !map.containsKey(it) }.forEach {
+            remove(it)
+        }
+        this.addAll(map);
+    }

 }
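The new invalidate() above bins event delays with a rate-dependent width: with binNum = 200 it takes binSize = 1/cr * 10 / binNum * 1e6, converts each delay to microseconds and groups by floor(delay / binSize). A standalone sketch of that binning, assuming the delays are already given in microseconds (the helper itself is illustrative, not part of the diff):

    import kotlin.math.floor

    // histogram of event delays, mirroring the grouping used for the "x"/"count" columns above
    fun delayHistogram(delaysMicros: List<Double>, cr: Double, binNum: Int = 200): Map<Double, Int> {
        val binSize = 1.0 / cr * 10 / binNum * 1e6              // same bin-width formula as the diff
        return delaysMicros
            .groupBy { floor(it / binSize) }                    // bin index, used as the "x" value
            .toSortedMap()
            .mapValues { (_, inBin) -> inBin.size }             // event count per bin
    }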
@@ -8,7 +8,6 @@ import hep.dataforge.fx.dfIcon
 import javafx.stage.Stage
 import org.slf4j.LoggerFactory
 import tornadofx.*
-import kotlin.system.exitProcess

 /**
  * Created by darksnake on 14-Apr-17.
@@ -29,7 +28,6 @@ class Viewer : App(MainView::class) {
         context.close()
         Global.terminate();
         super.stop()
-        exitProcess(0)
     }
 }

@@ -0,0 +1,68 @@
+package inr.numass.viewer.test
+
+import hep.dataforge.context.Global
+import hep.dataforge.fx.dfIcon
+import hep.dataforge.nullable
+import hep.dataforge.tables.Table
+import inr.numass.data.api.NumassPoint
+import inr.numass.data.api.NumassSet
+import inr.numass.data.storage.NumassDirectory
+import inr.numass.viewer.AmplitudeView
+import inr.numass.viewer.HVView
+import inr.numass.viewer.SpectrumView
+import javafx.application.Application
+import javafx.scene.image.ImageView
+import kotlinx.coroutines.launch
+import tornadofx.*
+import java.io.File
+import java.util.concurrent.ConcurrentHashMap
+
+class ViewerComponentsTestApp : App(ViewerComponentsTest::class)
+
+class ViewerComponentsTest : View(title = "Numass viewer test", icon = ImageView(dfIcon)) {
+
+    //val rootDir = File("D:\\Work\\Numass\\data\\2017_05\\Fill_2")
+
+    //val set: NumassSet = NumassStorageFactory.buildLocal(rootDir).provide("loader::set_8", NumassSet::class.java).orElseThrow { RuntimeException("err") }
+
+
+    private val cache: MutableMap<NumassPoint, Table> = ConcurrentHashMap()
+    val context = Global
+
+    val amp: AmplitudeView by inject(params = mapOf("cache" to cache))//= AmplitudeView(immutable = immutable)
+    val sp: SpectrumView by inject(params = mapOf("cache" to cache))
+    val hv: HVView by inject()
+
+    override val root = borderpane {
+        top {
+            button("Click me!") {
+                action {
+                    context.launch {
+                        val set: NumassSet = NumassDirectory.INSTANCE.read(Global, File("D:\\Work\\Numass\\data\\2017_05\\Fill_2").toPath())
+                            ?.provide("loader::set_2", NumassSet::class.java).nullable
+                            ?: kotlin.error("Error")
+                        update(set)
+                    }
+                }
+            }
+        }
+        center {
+            tabpane {
+                tab("amplitude", amp.root)
+                tab("spectrum", sp.root)
+                tab("hv", hv.root)
+            }
+        }
+    }
+
+    fun update(set: NumassSet) {
+        amp.setAll(set.points.filter { it.voltage != 16000.0 }.associateBy { "point_${it.voltage}" })
+        sp["test"] = set
+        hv[set.name] = set
+    }
+}
+
+
+fun main(args: Array<String>) {
+    Application.launch(ViewerComponentsTestApp::class.java, *args)
+}