NeteSOWDraft v1 (Tom Goddard, Aug 20, 2020)
== NIAID SOW draft Dec, 2019 ==

Group meeting Dec 19, 2019

R01 ChimeraX proposal
~2 FTE

1. NIH 3D pipeline using ChimeraX (Eric) ~.8
   * NIH 3D Print Exchange pipeline using Chimera will be migrated to the new NIH 3D pipeline for both printing and VR.  The new pipeline will use ChimeraX (see the pipeline export sketch after this list). Current scripts are those named Chimera*.py in their GitHub repository.
   * Current Chimera pipeline uses these features not in ChimeraX (from Elaine); unclear if any are “must haves”:
      * PubChem fetch
      * VRML export - Meghan’s 2016 email said of X3D and VRML “we only need X3D”
      * Import of fchk, gro, mol, sdf files
      * PDB biounit fetch
      * Combine structures, maybe not needed if only for molecular surfaces
      * Coulombic coloring, requires charge assignment
   * Export of other formats (possibly Collada, glTF (ASCII), FBX, OBJ with texture colors), X3D enhancements.
   * Sequence conservation coloring
   * Fetch sequence annotations (UniProt, domains, disease-associated mutations)
   * Read and visualize segmentation models.
2. Human Biomolecular Atlas Program (HuBMAP) multiscale visualization (Tom) ~.1
   * Models from medical imaging, 3D light microscopy, 3D electron microscopy
   * Connection with Nils Gehlenborg at Harvard, HIVE
   * Tom G sent email to Phil Cruz to ask for more details about BCBB’s role in HuBMAP
   * Oct 2019 Nature overview article on HuBMAP:
      * “focus of HuBMAP [is] on spatial molecular mapping”
      * “We anticipate that the first round of data will be released in the summer of 2020”
      * “HuBMAP, in collaboration with other NIH programs, plans to hold a joint meeting with the Human Cell Atlas initiative to identify and work on areas of harmonization and collaboration during the spring of 2020.”
      * “To ensure that browsers and visualization tools from HuBMAP are valuable, the consortium will work closely with anatomists, pathologists, and visualization and user experience experts, including those with expertise in virtual or augmented reality.”
      * “Ultimately, we hope to catalyse novel views on the organization of tissues, regarding not only which types of cells are neighbouring one another, but also the gene and protein expression patterns that define these cells, their phenotypes, and functional interactions. In addition to encouraging the establishment of intra- and extra-consortium collaborations that align with HuBMAP’s overall mission, we envision an easily accessible, publicly available user interface through which data can be used to visualize molecular landscapes at the single-cell level, pathways and networks for molecules of interest, and spatial and temporal changes across a given cell type of interest. Researchers will also be able to browse, search, download, and analyse the data in standard formats with rich metadata that, over time, will enable users to query and analyse datasets across similar programs.”
3. Segmentation capabilities for medical imaging, 3D light microscopy, 3D electron microscopy (Tom) ~.25
   * Interactive SimpleITK use in ChimeraX (see the segmentation sketch after this list)
   * Allow loading, visualizing, creating, measuring and saving segmentations.
   * Support the new EMDB-SFF segmentation file format from the EM Databank
4. Medical imaging (Tom) ~.25 - ~.5
   * Metadata browser for DICOM files (see the DICOM metadata sketch after this list)
   * Support radiologist collaborator needs
   * Measuring changes over time: alignment, volume measurement, difference coloring
   * Enhanced lighting for improved perception of details.
5. Virtual reality (Conrad) ~.25 - ~.50
   * Improve multi-person VR beyond work accomplished in SOW #2
      * Any participant can bring in new model
      * Audio chat, third-party or built-in solution
      * Localizable connection server solution
   * Recording VR sessions (for VR playback? Or conventional video playback?)
6. Drug docking (Conrad)
   * VR UI
   * Turn on and off surface display
   * Show hydrogen bonds - for induced-fit docking results? (we already have H-bonds and clashes for the single-receptor, multi-ligand case)
   * Local minimization (use OpenMM), needs ligand parameterization, DockPrep/Antechamber (see the minimization sketch after this list)
7. Documentation and training materials for the above new capabilities (Elaine) ~.4
   * NIH 3D pipeline
      * Provide ongoing advice on ChimeraX commands due to syntax and parameter differences (lighting, cartoon style, etc.)
   * Segmentation capabilities
   * Medical imaging
   * Virtual reality
8. Administration ~.1
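The pipeline export sketch referenced in item 1, as a rough illustration only: it assumes the script runs inside ChimeraX (e.g. via the runscript command, which supplies the session object), and the entry ID, output names, and command choices are placeholders rather than the actual NIH 3D pipeline code.

{{{#!python
# Minimal ChimeraX export sketch for the NIH 3D pipeline idea (item 1).
# Entry ID and file names are placeholders.
from chimerax.core.commands import run

def export_for_print_and_vr(session, entry_id, out_prefix):
    run(session, f"open {entry_id}")             # open/fetch the structure to export
    run(session, "hide atoms; show cartoons")    # simple default presentation
    run(session, "surface")                      # molecular surface needed for printing
    run(session, "lighting soft")                # lighting syntax differs from Chimera
    run(session, f"save {out_prefix}.stl")       # mesh for 3D printing
    run(session, f"save {out_prefix}.glb")       # binary glTF for VR/AR viewers
}}}

Driving ChimeraX through command strings keeps such a script close to the user-level commands Elaine documents, which may ease migrating the existing Chimera*.py scripts.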
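For the SimpleITK segmentation work in item 3, a generic sketch of the threshold-label-measure loop such a tool could wrap interactively; file names and the Otsu threshold choice are placeholders, not an existing ChimeraX tool.

{{{#!python
# Generic SimpleITK segmentation sketch (item 3): threshold, label, measure, save.
# File names are placeholders.
import SimpleITK as sitk

image = sitk.ReadImage("volume.nrrd")            # CT/MRI or microscopy volume
binary = sitk.OtsuThreshold(image, 0, 1)         # automatic foreground (1) / background (0)
labels = sitk.ConnectedComponent(binary)         # one integer label per connected region

stats = sitk.LabelShapeStatisticsImageFilter()   # per-region measurements
stats.Execute(labels)
for label in stats.GetLabels():
    print(label, stats.GetPhysicalSize(label))   # region volume in physical units

sitk.WriteImage(labels, "segmentation.nrrd")     # save the label map for visualization
}}}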
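For the DICOM metadata browser in item 4, a sketch of the header-only reading a browser could start from, using pydicom; the directory layout and the handful of tags printed are assumptions about what to show first, with the rest of the data set available on request.

{{{#!python
# Header-only DICOM reading for a metadata browser (item 4).
# Directory layout and the chosen tags are illustrative.
from pathlib import Path
import pydicom

def summarize_series(directory):
    for path in sorted(Path(directory).glob("*.dcm")):
        ds = pydicom.dcmread(path, stop_before_pixels=True)   # skip pixel data
        print(path.name,
              ds.get("Modality", "?"),
              ds.get("SeriesDescription", "?"),
              ds.get("SliceThickness", "?"),
              ds.get("StudyDate", "?"))
}}}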
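For the local minimization in item 6, a minimal OpenMM sketch that energy-minimizes a prepared receptor; the input file and force field are placeholders, and it sidesteps the ligand parameterization question noted above (Antechamber/GAFF or similar would be needed for that).

{{{#!python
# Minimal OpenMM local minimization sketch (item 6).  Assumes a prepared,
# protonated receptor; ligand parameterization is the unresolved part.
from openmm.app import PDBFile, ForceField, Simulation, NoCutoff
from openmm import LangevinMiddleIntegrator
from openmm.unit import kelvin, picosecond, picoseconds

pdb = PDBFile("receptor.pdb")                    # placeholder input from Dock Prep
forcefield = ForceField("amber14-all.xml")
system = forcefield.createSystem(pdb.topology, nonbondedMethod=NoCutoff)
integrator = LangevinMiddleIntegrator(300*kelvin, 1/picosecond, 0.004*picoseconds)
simulation = Simulation(pdb.topology, system, integrator)
simulation.context.setPositions(pdb.positions)
simulation.minimizeEnergy(maxIterations=100)     # local minimization only
}}}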
== NIAID SOW Ideas – Dec 2019 ==

Initial discussion of next SOW goals at the Dec 5, 2019 group meeting, followed up Dec 12, 2019 with a video conference with Phil Cruz, Meghan McCarthy, and Darrell Hurt.

* MSC/NIAID suggestions
   * Migration of the NIH 3D Print Exchange to NIH 3D (use ChimeraX)
   * Updates to ChimeraX to support the pipeline (Elaine’s missing pieces)
   * See below (scripts are all on GitHub)
   * Extended to VR/AR as well as 3D printing
      * Need to be able to handle differences
   * Allow uploads of ChimeraX session files?
   * glTF text form support
   * Learning about the Python interface to ChimeraX
   * Additional file types (Collada, etc.)
* previous SOW Google doc with crossouts
* previous SOW discussion on ChimeraX wiki
* feature list in DICOM viewer notes (some are done or partially done, some are linked to tickets)
* many possibilities relating to segmentation
   * see especially the ChimeraX wiki link above
   * investigate use of machine learning to identify and annotate features
   * They are extending the ITK toolkit to microscopy images
   * Focus has been on medical imaging
* TomG: try to steer more toward data types with which we are more familiar, e.g. light microscopy and EM
   * Particularly as an overlap with SimpleITK uses for segmentation
* better support for multi-person VR sessions, hard to make reliable
   * firewall issues
   * VPN
   * AWS - rendezvous service? Conrad will investigate (a rendezvous sketch appears after this list).
   * if our own service, we don't want everybody on it
   * investigate audio services (possibly integrated)
   * enable any participant to load new data without deleting others' data
   * material for multi-person VR sessions for remote training
* advancing VR/ViewDockX capabilities
   * Dock Prep
      * Antechamber?
   * anything else?
   * A lot of general interest.  Great VR use case.
   * Turn on and off surface display
   * Hydrogen bonding
   * Local minimization (use OpenMM)
      * How do we get a force field for the ligands?
   * Kudos for the tape measure tool
* 3D printing: enable porting the workflow from Chimera to ChimeraX
   * any tie into segmentation?
   * are they still interested in pursuing this?
* Tom F: avoid overlap with NIGMS grant
* Sequence analysis/MAV?
   * coloring by conservation (conserved regions may be better vaccine targets, e.g. universal flu vaccine as in the PBS VR segment from June 2019)
   * fetch annotations or info (UniProt, domains, disease-associated mutations, ...) (see the UniProt fetch sketch at the end of this page)
* hierarchy & smart level of detail in large and complex data and segmentations
   * segmentation browser
   * Support easy transitions between different levels of detail
   * Google Earth-style navigation for the models
   * Look at potential synergy with HuBMAP
* DICOM enhancements
   * DICOM browser showing data hierarchy, metadata
      * Show the 5 most important things, but support showing more metadata at the user's request
   * better/more presets for different tissue types in DICOM data
   * provide “smart” initial coloring (bones, organs, tissue types)
   * provide photorealistic lighting to visualize 3D images and volumes
* Wendell Lim's engineered cells - infectious disease tie-in? Opportunity to get NIAID more interested in light (optical) microscopy. Max Krummel.
* recording VR sessions (augmented reality, etc.)
* HTML documentation for any of these new features, of course
* Educational uses of ChimeraX and VR (e.g. undergraduate education -- multi-user especially)
   * Would an animation tool fit here?
* Other types of measurement (coupled with segmentation)
   * Volume measurement
   * Comparison capabilities (e.g. progression over time)
* Other sources of collaborators for medical imaging VR
   * Dmitri, other VA collaborators, Viv in addition to Beth
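A sketch of the rendezvous service idea from the multi-person VR item above: a small relay that forwards each participant's messages to the other connected participants, so peers behind firewalls or VPNs only need an outbound connection.  This is a generic asyncio relay, not the existing ChimeraX multi-person VR implementation; the port and newline framing are placeholder choices, and a real service would also need authentication and per-session rooms so that not everybody ends up on it.

{{{#!python
# Generic rendezvous/relay sketch for multi-person VR sessions.
# Port and newline framing are placeholders; no authentication is implemented.
import asyncio

clients = set()                                  # one writer stream per participant

async def handle(reader, writer):
    clients.add(writer)
    try:
        while True:
            line = await reader.readline()       # one message per line
            if not line:                         # participant disconnected
                break
            for other in list(clients):
                if other is not writer:          # relay to everyone else
                    other.write(line)
                    await other.drain()
    finally:
        clients.discard(writer)
        writer.close()

async def main():
    server = await asyncio.start_server(handle, "0.0.0.0", 52194)   # placeholder port
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
}}}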

Other crossouts from previous SOW:
* select and/or highlight voxels in volumetric data representations. Save the selection, and apply visualization commands to just the selection.
* provide “smart” initial coloring (bones, organs, tissue types)
* provide photorealistic lighting to visualize 3D images and volumes
* additional material may be presented in video format, e.g., “How-to” screen capture videos, mixed reality video capture showing person and data for tutorials and for explaining research results to the public
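For the sequence annotation idea above (fetch UniProt domains and disease-associated variants), a small sketch of pulling feature annotations from the UniProt REST API with the requests library; the accession is just an example and the feature types kept are an assumption about what a ChimeraX coloring or labeling tool would want.

{{{#!python
# Sketch of fetching UniProt sequence annotations (domains, natural variants).
# Accession and the feature types kept are illustrative; a ChimeraX tool would
# map these sequence ranges onto chain residues.
import requests

def fetch_features(accession, wanted=("Domain", "Natural variant")):
    url = f"https://rest.uniprot.org/uniprotkb/{accession}.json"
    entry = requests.get(url, timeout=30).json()
    for feature in entry.get("features", []):
        if feature["type"] in wanted:
            start = feature["location"]["start"]["value"]
            end = feature["location"]["end"]["value"]
            yield feature["type"], start, end, feature.get("description", "")

for ftype, start, end, note in fetch_features("P0DTC2"):   # example accession
    print(ftype, start, end, note)
}}}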