clearly defined anatomical orientation #208
@dyf thanks for raising this important issue! I think what you are describing could be expressed via the proposed coordinate systems and transformations. If you have imaging data that was acquired / stored in instrument coordinates, but can be transformed to anatomical coordinates via some transformation, you basically have two coordinate systems:

```json
"coordinateSystems" : [
    {
        "name" : "instrument",
        "axes": [
            {"name": "z", "type": "space", "unit": "micrometer"},
            {"name": "y", "type": "space", "unit": "micrometer"},
            {"name": "x", "type": "space", "unit": "micrometer"}
        ]
    },
    {
        "name" : "my_reference_brain",
        "axes": [
            {"name": "anterior-to-posterior", "type": "space", "unit": "micrometer"},
            {"name": "inferior-to-superior", "type": "space", "unit": "micrometer"},
            {"name": "left-to-right", "type": "space", "unit": "micrometer"}
        ]
    }
]
```

The transformation from one coordinate system to the other would then be declared alongside them. Would this work for you @dyf?
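To make the two-coordinate-system idea concrete, here is a minimal Python sketch of reordering a point from the instrument axis order into the anatomical axis order above. The axis correspondence (`z` to inferior-to-superior, etc.) is an assumption for illustration, not something the spec defines.

```python
# Sketch: map instrument-order coordinates (z, y, x) onto a hypothetical
# anatomical axis order. The AXIS_MAP correspondence is an illustrative
# assumption, not defined by the NGFF spec.

INSTRUMENT_AXES = ("z", "y", "x")
ANATOMICAL_AXES = ("anterior-to-posterior", "inferior-to-superior", "left-to-right")

# Assumed correspondence between instrument and anatomical axes.
AXIS_MAP = {
    "z": "inferior-to-superior",
    "y": "anterior-to-posterior",
    "x": "left-to-right",
}

def instrument_to_anatomical(point):
    """Reorder a (z, y, x) point into the anatomical axis order."""
    named = dict(zip(INSTRUMENT_AXES, point))
    by_anatomy = {AXIS_MAP[k]: v for k, v in named.items()}
    return tuple(by_anatomy[a] for a in ANATOMICAL_AXES)

print(instrument_to_anatomical((1.0, 2.0, 3.0)))  # (AP, IS, LR) order
```

A real transformation would of course also carry scaling, rotation, or shear; this only shows how named axes remove the guesswork about which array dimension is which anatomical direction.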
Yup, what @d-v-b described is how I had in mind to describe anatomical coordinates using the v0.5 spec - coming soon.
Timely discussion, as we just migrated the content of the wiki page to Read the Docs.
@dyf wrote
Regarding … In that vein, your point about …
Wow, amazed at the quick response, thanks all! How about a more domain-agnostic name, e.g. …? That said, I'm open to putting this into an extension (e.g. an anatomy or neuroanatomy extension). It's most important (to me) that downstream applications know what terms to expect and how they are defined, so either the core spec or an easily discoverable extension would be fine. Pardon my ignorance: is there an extension mechanism already?
In talking to the good folks of Get Your Brain Together (late one night), I proposed adding, either as part of the existing coordinate system or perhaps via a dedicated subclass of existing systems, a field that is defined by an enum or ontology managed by a community outside of the NGFF process. The "name" fields in the above examples could almost be used for this purpose, but several issues
could benefit from having additional type information that the medical community can detect and "do the right thing" with. I think the question is whether this should be a specifically anatomical extension, or whether there's a way to add this type information more generically. (Oops, comments were added while I was writing. I think this nonetheless holds: I don't think names alone are sufficient. Another possible option would be a community-specific prefixing mechanism, but this would likely become unwieldy.)
@joshmoore I agree that names are insufficient for the reasons you give. Adding a new field that refers to an enum/ontology would be great. Different communities would be able to use different controlled terms by indicating which ontology they use. Validation may get tricky, but it's at least a start.
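As a sketch of what enum-based validation could look like, assuming a hypothetical per-axis field name (`anatomicalOrientation`) and the term list floated later in this thread; neither is part of the spec:

```python
# Hypothetical controlled vocabulary for per-axis anatomical orientation.
# The term list follows the Brain Image Library style discussed in this
# thread; the field name "anatomicalOrientation" is an assumption.

ORIENTATION_TERMS = {
    "left-to-right", "right-to-left",
    "anterior-to-posterior", "posterior-to-anterior",
    "inferior-to-superior", "superior-to-inferior",
}

def validate_axes(axes):
    """Raise ValueError if any axis carries an unknown orientation term."""
    for ax in axes:
        term = ax.get("anatomicalOrientation")
        if term is not None and term not in ORIENTATION_TERMS:
            raise ValueError(
                f"unknown orientation {term!r} on axis {ax.get('name')!r}"
            )

validate_axes([
    {"name": "z", "type": "space",
     "anatomicalOrientation": "inferior-to-superior"},
])
print("ok")
```

Swapping `ORIENTATION_TERMS` for a different community's term set is exactly the "indicate which ontology you use" step described above.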
I had a look at how the CF conventions handle this: https://cfconventions.org/Data/cf-conventions/cf-conventions-1.10/cf-conventions.html#standard-name. As I understand it, a physical quantity (like length) can be associated with a `standard_name`.
I think the … @joshmoore said:
Can you elaborate on these concerns? I don't really understand how namespace collision with another community would be a problem -- presumably all that matters is that community X can save and load their data. If community Y uses the same metadata names as X, why is that bad? Perhaps I'm missing something. And I didn't understand the second concern at all.
I like adding a … I can't speak for @joshmoore, but for me a primary goal is to remove undocumented assumptions so that tools can reliably interpret the orientation of images. If multiple subcommunities have different, disparately (or un-)documented conventions for how to interpret …
I was primarily referring to collisions either on the namespace prefix itself, or on the key if there is no namespace prefix. What's bad is if communities X and Y cannot tell whether the data came from the other community.
I may have misunderstood @dyf's example, but if detection is based on the use of a unique value as the …
In my opinion, …
I think this issue relates to #203, insofar as we are thinking about giving a "proper name" to a measurement / quantity (anatomical coordinates can be thought of as special lengths). Depending on how big the ontology is going to be, I wonder if we should consider requiring that it be versioned and stored inside the zarr metadata under the appropriate namespace (maybe under an …).
It would be great to be able to use DICOM headers for lossless conversion between OME-NGFF and WSI or other DICOM objects. A lot of the concepts mentioned in this thread already have well-standardized representations in DICOM, and it would be a shame to devise something from scratch that duplicates the concepts in an incompatible way. I know a lot of people think DICOM is complex and hard to use, but in my experience it's the data that's complex, so any standard will be complex, and we might as well use the one we have. It may not be realistic to assume that everyone will use DICOM, but the ability to losslessly transcode between formats seems like a very valuable goal.
Yes, we put axis units next to anatomical orientation in our metadata schema.
It's quite small. I would love to be able to include it here so that we can use it for validation. |
I wonder if …
That might make things more self-describing for humans, but not for the programs that will need to understand it a priori. Further, if this is a general mechanism that we want to use for the interpretation of other fields in the future, I fear the burden of inlining will grow.
What do you mean by "include it here", @dyf? i.e. in this issue?
💯 for working towards interoperability, @pieper. Can you show a snippet of what you think that header information would look like?
Can you share an example of that, @dyf? I think it would be useful to work towards a collection of N similar JSON blurbs, encoding the same information, that we can start referring to by name in this discussion.
Hmm… these are a bit subjective, no? I feel like …
I think the alternative to making a dataset self-describing is to use links, but links can break, either because the content at the URI has moved, or because internet connectivity is unreliable. I'd be a little uncomfortable requiring an internet connection for validating an OME-NGFF with ontologies in it, at least if the ontology information is small enough to fit in JSON. Hopefully an example ontology document can clear up some of these issues.
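As a sketch of the self-describing alternative: a small, versioned ontology inlined in the metadata, so validation needs no network access. Every key name here (`ontologies`, `terms`, etc.) is an assumption for illustration, not part of any spec.

```python
# Sketch: inline a small, versioned controlled vocabulary directly in the
# metadata so offline validation is possible. All key names are assumed.

metadata = {
    "ontologies": [
        {
            "name": "anatomical-orientation",
            "version": "0.1.0",
            "terms": [
                "left-to-right", "right-to-left",
                "anterior-to-posterior", "posterior-to-anterior",
                "inferior-to-superior", "superior-to-inferior",
            ],
        }
    ]
}

# Collect every allowed term from the inlined ontologies.
terms = {t for o in metadata["ontologies"] for t in o["terms"]}
print(sorted(terms)[0])
```

The version field is what would let a validator detect stale copies of the vocabulary without following a link.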
@d-v-b I want to include the terms that are in the issue (e.g. …). If you are looking for a relevant external source, see: https://openminds.ebrains.eu/v3/ --> controlledTerms --> anatomicalAxisOrientation. They describe each term in JSON-LD, e.g.: RAS, RAI. The difference is that I am asking for these 3-letter codes to be broken up so we can describe each axis clearly and independently (rather than assuming that all arrays are 3-dimensional). We could very easily package these terms up into a JSON file and add them to the repository.
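For illustration, breaking a 3-letter code into per-axis terms might look like the following sketch. It assumes the "points toward" reading of each letter (R means the axis increases toward the right), which is exactly the ambiguity noted earlier in this thread, so the mapping itself is the assumption being made explicit:

```python
# Sketch: expand a 3-letter orientation code (e.g. "RAS") into explicit
# per-axis terms. Assumes each letter names the direction the axis
# points TOWARD; some ecosystems use the opposite ("from") convention,
# which is precisely why explicit terms are being proposed here.

LETTER_TO_TERM = {
    "R": "left-to-right",         "L": "right-to-left",
    "A": "posterior-to-anterior", "P": "anterior-to-posterior",
    "S": "inferior-to-superior",  "I": "superior-to-inferior",
}

def expand_code(code):
    """One explicit term per letter; works for any number of axes."""
    return [LETTER_TO_TERM[c] for c in code.upper()]

print(expand_code("RAS"))
```

Note that `expand_code` also handles 2-letter codes, which is the point about not assuming all arrays are 3-dimensional.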
FYI, the DICOM coordinate systems and orientations may be relevant when something is patient-relative. The coordinate system is used when the origin of an image (its top left hand corner, TLHC) and the unit vectors defining the orientation of its rows and columns are to be described. DICOM is positive in the LPS (left-posterior-superior) directions. For 2D images (such as a mammogram), the row and column directions of an image are defined categorically in a patient-relative sense. These are described at: …
Note that quadrupeds as well as bipeds are accounted for (theoretically). See also this (outdated) explanation: … There are also (US) volume-relative and slide-relative coordinate systems: …
Do folks have an opinion on the format and location of the controlled vocabulary? I am planning to open a draft PR for this in the next week or so. |
It would be nice if the file format could store standard coded-entry data as specified in the DICOM standard (and used in all clinical imaging applications). You can find all the intricate details in the DICOM standard, but in practice it is quite simple: an entry is specified by these 3 strings: the code value, the coding scheme designator, and the code meaning.
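The three-string coded entry could be sketched as a simple record like this. The field names mirror the DICOM concepts just listed; the example values are illustrative (a SNOMED CT code), not prescribed by anything in this thread:

```python
# Sketch of a DICOM-style coded entry as a plain record. Field names
# mirror the DICOM triplet (Code Value, Coding Scheme Designator,
# Code Meaning); the example values below are illustrative only.

from dataclasses import dataclass

@dataclass(frozen=True)
class CodedEntry:
    code_value: str                # e.g. a SNOMED CT identifier
    coding_scheme_designator: str  # which terminology the code comes from
    code_meaning: str              # human-readable label

anatomy = CodedEntry("12738006", "SCT", "Brain structure")
print(anatomy.coding_scheme_designator)
```

Because the scheme designator travels with the code, two communities can use overlapping numeric codes without ambiguity, which speaks to the namespace-collision concern raised earlier.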
+1 to @lassoan's suggestion of adopting the …
Did all the talk of DICOM derail this conversation? I would like to encode some microCT data that is currently in nrrd format and I'm curious if anyone has settled on a convention for storing the 'space-directions' and related metadata. From what I can see the ITK implementation only stores spacing and origin. If there is no existing convention I'll be glad to make up my own. |
I took the liberty to … in #253.
@thewtex thanks for working on that PR 👍 I do have a comment that I'm not sure whether to put here or there, but I'll start here, and if people agree it's worth addressing we could try to adapt the PR. My issue is that the PR as currently written assumes that the imaging axes will always be along one of these defined anatomical directions (e.g. right-to-left or rostral-to-caudal). While this is often the case, it's also likely that the imaging will be rotated with respect to these axes, or even sheared for some scan types. This is where the nrrd concepts of 'space' and 'space directions' are useful. What you have already documented would be really good for the definitions of the … These situations come up frequently in medical imaging, for example MRs acquired on a tilted plane, or CTs with a shear due to gantry tilt. But they can also arise if microscopy images need to be spatially correlated with macroscopic anatomy. I'm afraid I find it hard to follow the discussion in this PR related to transforms, but my fear is that if we don't have a clear way of describing common image acquisition geometries, the logic around transforms will end up with extra complexities.
As I understand it, the ngff file will define multiple coordinate systems and transforms between them. ITK will specify "image" and "physical" coordinate systems and an affine transform between them. Each axis of the "physical" coordinate system will have …
Yes, I agree that's how it should work, but when I read the current PR text it doesn't come across that way. To me it says that if, for example, you had a coronal acquisition, you would define that using the … Maybe we can have worked-out examples for common imaging scenarios so that it's clearer how these anatomical labels and transforms should be used together. To me, it's good for the information about anatomical mapping to be included in the transform, while the low-level pixel container only talks about memory layout. I.e. at the zarr level you are only talking about rows, columns, slices, blocks, etc., but then the transform introduces the idea that you are mapping from these indices into a particular physical space (such as "LPS" or "RAS"). To say it another way, the concept of "inferior-to-superior" should exist within the transform to physical space, not as a label assigned to the z axis of a data array.
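To illustrate the tilted/sheared case in nrrd style, here is a sketch of an index-to-physical mapping where the anatomical meaning lives entirely in the transform, not on the array axes. The direction vectors and the assumed LPS target space are made-up illustrative values:

```python
# Sketch: nrrd/ITK-style index-to-physical mapping. Each row of
# space_directions is the physical step (in an assumed LPS space) taken
# per unit step along one index axis. The 0.1 component models a
# gantry-tilt shear: slices advance obliquely, so no single index axis
# aligns with a single anatomical direction. All values are made up.

origin = (0.0, 0.0, 0.0)
space_directions = (
    (0.5, 0.0, 0.0),   # step per row index
    (0.0, 0.5, 0.1),   # step per column index (sheared)
    (0.0, 0.0, 2.0),   # step per slice index
)

def index_to_physical(i, j, k):
    """physical = origin + i*dir0 + j*dir1 + k*dir2"""
    return tuple(
        origin[d]
        + i * space_directions[0][d]
        + j * space_directions[1][d]
        + k * space_directions[2][d]
        for d in range(3)
    )

print(index_to_physical(0, 1, 0))  # column step lands obliquely
```

A per-axis label like "inferior-to-superior" cannot describe the sheared second row here, which is the argument for putting orientation in the transform.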
Okay, sounds like the right direction. I'll take another look when all this stuff is merged. |
The anatomical orientation of an array is a critical piece of metadata for downstream analysis, particularly for the increasingly common task of aligning acquired images to an atlas for anatomical quantification and standardized comparison to other data.
Currently the NGFF spec includes coordinate transformations, but the anatomical orientation of the sample once the transformation is applied is unspecified. As a result, tools simply make assumptions about orientation, which leads to wasted time and erroneous results. In systems with a fair amount of anatomical symmetry like the brain, it is impossible to retroactively inspect data to understand the orientation in which it was acquired. A place in the spec where we are explicit about anatomical orientation will allow acquisition and analysis tools to stop making assumptions.
I propose we add a field with a controlled vocabulary for anatomical orientation. Some prior art:

- The ITK ecosystem uses 3-letter acronyms to describe anatomical orientation. For example `RAS` corresponds to right, anterior, superior. This works, however the acronyms are ambiguous: I personally continually have to look up whether `R` means left-to-right or right-to-left.
- Nifti's coordinate transforms are assumed to map data into `RAS`. This approach also works, however it relies on users and data generators being familiar with the Nifti spec and abiding by it.
- The Brain Image Library asks for a more explicit definition of anatomical orientation. Submitters choose, for each axis, from a controlled vocabulary that resembles the following:
  - `left-to-right`
  - `right-to-left`
  - `anterior-to-posterior`
  - `posterior-to-anterior`
  - `inferior-to-superior`
  - `superior-to-inferior`

We have adopted this at the Allen Institute for Neural Dynamics in our data schema. We may consider adding `dorsal-to-ventral`, `ventral-to-dorsal`, `rostral-to-caudal`, and `caudal-to-rostral` to this vocabulary.

At the recent Get Your Brain Together hackathon hosted at the Allen Institute, this was discussed at length.

Please consider adding an `anatomicalOrientation` field to `axes` metadata. Because this would be a controlled vocabulary, I recommend separating it from `longName`, which is uncontrolled (see #142). I am of course also open to this living elsewhere. Should this have a default, I suggest it be `RAS` to be consistent with Nifti.