NIAID-UCSF 2024 Contract Statement of Work -- March 1 - September 30, 2024
This SOW is split into two parts. Part 1 is intended to be accomplished in the first 6 months of the contract. If specific items are completed, items from part 2 will be brought forward. Should NIAID extend the contract to 1 year, those items in part 2 not completed during the first 6 months will be worked on.
Part 1
- General ChimeraX improvements to support NIAID-specific requirements to assist NIAID personnel to transition away from the unsupported legacy Chimera program, e.g.:
- Investigate and improve ChimeraX usability for very wide displays and touch screens, with a specific focus on the BioViz lab wall display
- Worms depiction
- More GUIs (notably, copy/combine, 2D labels)
- Support for showing thermal ellipsoids
- Support the NIH 3D pipeline development, including any changes to ChimeraX to support ongoing development
- Continuing support for NIH3D as needed
- Updating workflows
- Quick submits for AlphaFold database entries
- Improve GLTF output to include structure hierarchy
- Investigate adding support for ChimeraX sessions in NIH3D
- Both uploading and downloading
- Need to check on any possible security issues
- Extend virtual reality support
- Implement use of pass-through video with the Quest 2/3/Pro for multi-person sessions in ChimeraX VR
- Improve molecular viewer for standalone headsets such as Quest 2/3.
- Add “disable/enable buttons” commands to better support handing off the controls to another user and prevent the scene from being inadvertently changed.
- Explore pedagogical benefits of ChimeraX in VR vs. flat screen
- Provide support to the University of Indiana (Andi), UCSF, and NIAID to conduct a task analysis comparing VR vs. flat screen for understanding biological macromolecules as needed
- Medical Images
- Improve presets for medical images
- Move medical imaging functionality to toolshed
- Explore re-skinning the UI when switching to medical imaging
- Support for automated segmentation
- Investigate adding TotalSegmentator
- Outreach
- Instructional material and tools documentation.
- Detailed instructions for all features shall be provided in a user manual.
- Written user guides and tutorials shall be available as HTML pages.
- Improve documentation for multi-person VR
- Attendance at meetings or workshops as required by NIAID
- Administration
- Submit monthly written reports of accomplishments
Part 2
- General ChimeraX improvements to support NIAID-specific requirements to assist NIAID personnel to transition away from the unsupported legacy Chimera program, e.g.:
- Energy minimization
- Support for MD analysis
- Improve the altloc explorer
- Rewrite ViewDockX
- Read VRML/X3D
- Support the NIH 3D pipeline development, including any changes to ChimeraX to support ongoing development
- Continuing support for NIH3D as needed
- Adding support for TotalSegmentator
- Investigate adding support for ChimeraX sessions in NIH3D
- Both uploading and downloading
- Need to check on any possible security issues
- Extend virtual reality support
- Improve user experience in ChimeraX VR, e.g.
- Implement a VR ergonomic toolbar and Model panel user interface.
- Add support for a voice interface in VR mode
- Explore pedagogical benefits of ChimeraX in VR vs. flat screen
- Collaborate with the University of Indiana (Andi), UCSF, and NIAID to conduct a task analysis comparing VR vs. flat screen for understanding biological macromolecules
- Medical Images
- Implement new rendering and lighting modes for medical images
- Continue improvements to the DICOM reader by including more data types such as segmentations, and making it more robust by testing against the NCI TCIA repository.
- Improve VR experience for medical images
- Easier manipulation of windowing and leveling, especially for complex curves
- Particularly support for fine-grained changes
- Improve segmentation tool by adding commands to support multi-person VR
- Investigate adding support for 2D views in VR
- General usability improvements for using ChimeraX with medical images driven by TCIA data
- Support for automated segmentation
- Add support for ML-based tumor segmentation tool
- Outreach
- Instructional material and tools documentation.
- Detailed instructions for all features shall be provided in a user manual.
- Written user guides and tutorials shall be available as HTML pages.
- Create videos demonstrating new capabilities.
- Present webinar and workshop tutorials to train users on existing and new capabilities.
- Create video tutorials for how to use multi-person VR.
- Do outreach using VR in particular (live presentations)
- Improve documentation for multi-person VR
- Attendance at meetings or workshops as required by NIAID
- Administration
- Submit monthly written reports of accomplishments
Meeting Minutes
11/21/24
Phil, Meghan, Darrell; TomG, Eric
Passthrough VR: Darrell asked about the value of passthrough in VR. Tom says he always uses passthrough, especially when showing VR to new visitors. It is conducive to discussions while in VR even if only one person is wearing a headset.
VR study by Andy Bueckle. Darrell is hopeful that Andy, with UCSF help, can complete a VR usability study in 2025. It has been delayed by many other projects.
Enterprise VR: Tom mentioned the ShiftAll MeganeX superlight 8K headset (https://en.shiftall.net/products/meganex8k), expected in 2025 at $1900, and said he read a favorable in-depth review (https://skarredghost.com/2024/10/30/meganex-superlight-8k-hands-on/). A main conclusion of the review was that high-priced headsets ($1000-$10000) are viable for enterprise applications because those deployments also use custom-developed software, with developer salaries roughly 20x the cost of the VR headset.
NIAID BioArt: Darrell says BioArt (https://bioart.niaid.nih.gov) got lots of publicity recently for its 2D images (viruses, etc.) and uses mostly SVG graphics. Tom suggested that perhaps we could produce SVG images from NIH 3D models.
Scooter leave: Tom explained that Scooter is on leave and it could be weeks or months before he returns; the timing is uncertain. Darrell said it is important to continue sending monthly reports to Nada Midani; otherwise payment cannot be issued. Darrell, Phil, and Meghan expressed their sympathy for Scooter.
February visit: Darrell said he probably will not be able to visit SF in February, but Phil and Meghan can. When told about Scooter's current leave, he suggested we might reschedule the trip. He said BCBB is flexible about arranging trips on short notice (subject to personal constraints) and has the funding to travel.
11/7/24
Phil; TomG, Eric, Zach, Elaine (last half)
Elaine was late (sorry) so didn't take notes for the first half.
Plan was to discuss February San Francisco visit but Darrell and Scooter out today. Phil: Are there UCSF researchers to talk to? Maybe we can work on VR meeting coordinate system alignment with Quest 3.
Previously discussed: the lenticular display (which uses VR) was not working for a while, but is now. The user is Krishnan Raman at a company, Biocryst, in Birmingham, AL. Still, the molecule is squished vertically, so it is not fully working.
NIH3D AlphaFold quick submit update will be released to the public soon! The user interface has been improved to handle the greatly increased number of models.
Phil has also used the Foldseek tool's Ligands option and found it useful. When is 1.9 supposed to be out? ChimeraX team: before the end of the year.
NIAID has ChimeraX 1.8 in their software store, so users can install it without NIAID IT support. They want to do the same for 1.9 when it comes out.
Phil: can we view a NAMD trajectory in VR? TomG: there are various tips to improve performance, e.g. sticks instead of ribbons. ISOLDE has pretty good performance because it just does the MD in a localized area. ChimeraX also has a right-mouse jiggle function but it calculates over the whole structure. There is a "play coordinates" right mouse mode to play a trajectory that can be used in VR. Maybe it would help to display a zone as a one-time thing (rather than at every frame). Eric: the waters may drift away and you wouldn't be able to see the new ones coming in.
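A minimal sketch of those tips as a ChimeraX Python script (file names, model numbers, the residue name LIG, and the 8 Å zone are illustrative assumptions; the session variable is predefined when ChimeraX runs a .py file):
    from chimerax.core.commands import run
    # Open a NAMD/CHARMM trajectory (PSF topology plus DCD coordinates)
    run(session, "open sim.psf coords sim.dcd")
    # Sticks are cheaper to render in VR than ribbons
    run(session, "hide #1 cartoons; show #1 atoms; style #1 stick")
    # One-time zone: show only atoms near a (hypothetical) ligand residue named LIG
    run(session, "hide #1 atoms; show #1:LIG :<8 atoms")
    # Play the trajectory (also available as a right-mouse / hand-controller mode)
    run(session, "coordset slider #1")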
10/24/24
canceled due to BioViz lab event
10/10/24
Phil, Mike Bopf, Bhinnata, Kristen; Elaine, Eric, Zach, TomG
Phil: tried to use "vr meeting" the other day and the Amazon server that it uses was down! So we are inspired to get our own going. Tom fixed the vr meeting server. It was down because Amazon AWS rebooted the virtual machine, first time in 4 years.
Phil: re NIH3D pipeline, what about that color-by-chain issue? Eric: I fixed it and sent email; it was due to the biological assembly using a table with incorrect chain information. Mike: I didn't get a chance to look at it yet. Eric: you just need to get the updated version of the presets.
Kristen: also for NIH3D, there was an issue in that glb vertex coloring or something about material properties is now done differently (maybe happened in July?) and this affects our workflow. Probably we should really change our workflow instead of reverting how the GLB is done, since it's now better in other ways, but there may not be time for us to do that. I'm looking at replacing trimesh with Blender to avoid this problem. So I'm conflicted whether to ask you for a way to turn off this material change temporarily. TomG: in September I fixed a bug where objects had no material color, so I put it in, but I don't see a change in July. You can send me files and I'll take a look. Kristen: maybe I'll see whether I can just fix this in our workflows. I'll work on it at least one more day. Phil: getting rid of trimesh may be a good thing. TomG, Elaine: maybe the gltf save "pruneVertexColors" or "textureColors" options would have an effect, but hard to tell without Tom looking at the files.
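If the material change does need to be bypassed temporarily, something along these lines could be tried from the ChimeraX Python shell (a sketch only, using the save options Tom and Elaine named above; the option values are guesses to experiment with, not a confirmed fix):
    from chimerax.core.commands import run
    # Export GLB keeping per-vertex colors rather than relying on material/texture colors
    run(session, "save scene.glb textureColors false pruneVertexColors false")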
Phil: need to discuss agenda for our visit to SF, but that should probably wait for when Darrell, Meghan, and Scooter are participating.
Elaine, TomG: we have a long downtime coming up, all next week Mon 10/14 through Friday and possibly that weekend. Toolshed etc. will not be available.
Phil: what's new? Elaine: coming soon, Cavity Finder tool. Pockets are shown as clouds of dots (pseudoatomic models). TomG: there's finding similar structures with Foldseek that can copy ligands from hundreds of similar structures to a query. Elaine: not yet documented, though. Eric, Elaine: Piet has a Schol-AR bundle on the toolshed, and he's made progress on the animation tool.
9/26/24
Darrell, Phil, Mike Bopf, Bhinnata, Meghan, Kristen; Elaine, Eric, Zach
The trip went well, but Phil and Meghan got sick afterward. Phil used ChimeraX a bit; students had it installed. Unrelated to the Senegal trip, one student from Mali is coming to do a postdoc with NIAID. In Senegal, some attendees were from other countries, such as Mali and Uganda. There was a hackathon, and some students made virtual posters in Nduvo. Attendance and engagement were high, and one instructor from Mali wanted Phil's slides to use in her own teaching.
Phil: Re NIH3D workflow, testing Eric's latest changes and haven't noticed any issues yet. There are so many outputs, it's easy to get lost, but that is something we need to sort out for ourselves. Each one seems useful in its own way. Thanks to Eric for all his efforts (clapping hands icon from Bhinnata)! Release may be sometime next week. Phil: I still need to write the release notes.
Phil: shows current NIH3D in progress, looks great, different thumbnails at the bottom allow quickly switching the 3D view. We need to put them in a well-defined order to present to the users. Kristen: I thought we did that, but maybe it's not implemented yet. Phil: any other issues? Kristen: not that I know of.
Darrell asks about the user base of ChimeraX. Elaine: cryoEM people are most of the cutting-edge papers, but we also have a significant number of general molecular modeling people, especially recently for viewing AlphaFold predictions. Eric: we are more dominant in cryoEM, as there is more competition for general molecular modeling. Darrell: PyMOL can be obtained through package managers instead of a download from a website; any developments along those lines? Zach: Greg and I were working on changing licensing to agreement upon first use rather than at download, which would then enable different ways of getting ChimeraX.
Kristen: is the GLTF center-each-node option in the daily build? Elaine and Eric: after consulting documentation, daily build usage and code notes, yes, and true by default.
Phil reports a problem with alphafold fetch; it works for us in the same version of ChimeraX. Eric figured out it was Phil's connection to UniProt, since using "open" on a UniProt sequence also doesn't work for him. Phil will report a bug. Followup: this was caused by his VPN, since the problem went away when he turned it off. Ticket #16027
At first we were confused that clicking "Fetch" on the AlphaFold tool runs the command "alphafold match" instead of "alphafold fetch", but it makes sense upon reviewing the documentation: "alphafold match" accepts more ways of specifying the query, not just a UniProt name/ID, and will get the top hit from a fast k-mer sequence search if there is no exact match.
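For reference, the two commands differ roughly as follows (a sketch; the UniProt name and model spec are placeholders):
    from chimerax.core.commands import run
    # Fetch a prediction directly from the AlphaFold Database by UniProt name or accession
    run(session, "alphafold fetch ldlr_human")
    # Match: find the best AlphaFold Database hit for an existing chain's sequence
    run(session, "alphafold match #1/A")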
Zach: Scooter will be gone when we're supposed to send the next report. To whom is that sent? Darrell: nada@…
9/12/24
No meeting: Darrell, Phil, Victor, and Meghan in Senegal, Scooter on vacation. NIH3D pipeline work in progress.
8/29/24
Tom, Eric, Elaine, Zach, Piet, Scooter; Phil, Meghan, Mike, Kristen
Phil: our senior leadership came in this week; they work with several structural biology centers. We were impressed that one of them could lead a VR ChimeraX session showing an AIDS drug. It went quite well except for trying to prevent the tethers of the four people from getting tangled up. Our wifi is in flux so we had to use the cables.
Scooter: is there a grant opportunity with those higher-ups? Phil: possibly but no specific knowledge. Scooter: we are considering R24 mechanism vs. or in addition to R01, so it would be helpful to build connections to any receptive parties. Phil: they want to get ChimeraX VR in all of the structural biology centers. Will see if that can lead to funding opportunities.
Phil: They plan to use ChimeraX VR in their retreat Sept 18. 10 headsets at once. One difficulty is how to position the model for so many participants, any thoughts? Tom: with interactive ChimeraX (not LookSee)? Phil: Yes. Tom: how do you draw the boundary? If SteamVR, everybody should see the model floating in the same place in the room. If there is a rectangle rather than square, it considers the viewer as facing a long side, and the order of drawing the four boundaries matters. Phil: does the size matter? Tom: maybe not, but people would need to specify the same center. With Oculus Metalink, not sure how it would work. Phil, Meghan: we'll be in Africa most of the month and not much time to work on setup. Only Sept 16-17. We plan to have the 10 people seated and not move around much. Maybe we'll make the area small, like 2x2 meters. Phil: if we draw the boundary with the user in middle, where would the model be placed? Tom: in the middle, so it would be right on top of the user. Tom: I've thought about having a separate small box for each user, not intersecting, but that may not be good for following a presenter that is farther away.
Meghan: somebody's virtual head was blocking somebody else's view. Certainly for a large number of people at the retreat, we should turn off the virtual head images. Tom: each person would need to turn it off for themselves, the way it looks currently.
Tom: if the coordinate systems aren't aligned it is very confusing, then the avatar representations are in misleading positions and collisions/tangles are even more likely.
Phil: what if everybody is trying to look from the same vantage point at the same time? Like down a pore. Tom: would need to use either common coordinate system but with pass-through video so you don't bump into each other, or separate coordinate systems with non-overlapping boundaries.
Tom: at least LookSee has a built-in method to align coordinate systems, and you don't have to worry about tethering. People would still have to take turns if they need to see from exactly the same vantage point when their coordinate systems are aligned. Phil: I'll be typing ChimeraX commands for interactive VR. Tom: you can make multiple models for LookSee ahead of time and switch between them. Phil: I've tried that approach too.
Meghan: the tethers must go but we won't be able to do it before the 18th, unfortunately.
Phil: got very bad skyline effect trying to use passthrough at home (tethered), but it might be my home laptop. Haven't tried in the bioviz lab 4000 series yet. Tom: passthrough with tether was working fine for me on both 3000 and 4000 series. Try in the lab. If problems, may require more troubleshooting.
Scooter: let's let Kristen ask her question. Kristen: Bhinnata wants to know where Eric is on the alphafold pipeline options, since we have a release in the first week of October. Eric: stuck because it's unclear in which cases output should be omitted. All residues vs. high-confidence only, and each has several kinds of coloring. Drop outputs when there are <17 high-confidence residues in the largest contiguous piece, but these pieces are different for surface vs. ribbon. Phil: separate tests for ribbons and surfaces. If the ribbon is big enough, output all relevant colorings of ribbons. If the surface is big enough, output all colorings of surfaces. Eric: given that information, it should be a week or two to get it done.
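A plain-Python restatement of that rule (illustrative only; the threshold of 17 comes from the discussion above, and the function name is invented, not the actual pipeline code):
    MIN_HIGH_CONF_RESIDUES = 17   # from the discussion above
    def representations_to_output(largest_ribbon_piece, largest_surface_piece):
        """Decide which representation families to export, given the number of
        high-confidence residues in the largest contiguous piece of each."""
        outputs = []
        if largest_ribbon_piece >= MIN_HIGH_CONF_RESIDUES:
            outputs.append("ribbon")    # all relevant ribbon colorings
        if largest_surface_piece >= MIN_HIGH_CONF_RESIDUES:
            outputs.append("surface")   # all relevant surface colorings
        return outputs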
Phil: the good news is that our test outputs from the current presets look great!
8/15/24
Eric, Elaine, Zach; Phil, Mike Bopf, Kristen, Darrell (last part)
Phil: short meeting today. Re NIH3D, discussion of Eric's email showing that the "contacts" approach is fine for identifying a contiguous structure with molecular surface or VDW spheres, but not ribbons or other representations if sidechains are not shown. Phil: I think it's OK to get rid of the smaller disconnected piece in the ribbons case. Eric: I can use an alternative method of identifying discontinuous parts in the case of ribbons/struts and only show the largest one. Kristen: I've seen a few with floaters like ions. Eric: printable presets should hide ions. Send us the info when you see cases like this.
Kristen: for NIH3D, we figured out that we don't need ChimeraX to have TotalSegmentator. We've been able to use it directly as a python package. However, you may want to include it for other reasons. Zach: it may be in the final SOW but deprioritized. Elaine: I see "add support for ML-based tumor segmentation tool" in the SOW, don't know if that's the same thing.
Darrell: when were we thinking about a visit? Phil: February, we think, was the conclusion. Can you send us information about places to stay (within budget and in a convenient location)?
8/1/24
Elaine, Eric, Zach, Piet, Scooter, Tom G; Phil, Darrell, Mike Bopf (first part), Kristen
Re NIH3D pipeline:
Phil: what was the high-confidence cutoff for AlphaFold? Eric: 50, could easily be changed. Phil: could the printable output only show the single biggest piece? Eric: doable. Phil: another separate rule is that if high confidence is <17 residues total, do not create output (visible or printable). Elaine: for the biggest piece, what about separate domains that are not contiguous covalently but that are in contact? Scooter: let's start with just the covalent connectivity because it's simpler. Eric: could do some kind of distance-based contact test, NxN where N is the number of contiguous pieces. Phil: can we use the "PAE domains"? TomG: probably too complicated. Could just do NxN contacts where N is the number of residues and then take the biggest connected/contacting piece. Elaine: get a contact RIN and then use the largest connected component.
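That NxN-contact / largest-connected-component idea could be prototyped from the ChimeraX Python shell along these lines (a rough sketch, not pipeline code: the 4.5 Å atom-contact cutoff, the helper name, and the use of scipy are assumptions):
    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.sparse import coo_matrix
    from scipy.sparse.csgraph import connected_components
    from chimerax.atomic import all_atomic_structures
    from chimerax.core.commands import run
    def select_largest_contact_piece(session, cutoff=4.5):
        """Select the largest set of residues connected by atom-atom contacts."""
        struct = all_atomic_structures(session)[0]
        atoms = struct.atoms
        res_list = list(struct.residues)
        res_index = {r: i for i, r in enumerate(res_list)}
        atom_res = np.array([res_index[r] for r in atoms.residues])
        # Atom pairs within the cutoff become residue-residue contact edges
        pairs = cKDTree(atoms.scene_coords).query_pairs(cutoff, output_type="ndarray")
        ri, rj = atom_res[pairs[:, 0]], atom_res[pairs[:, 1]]
        n = len(res_list)
        graph = coo_matrix((np.ones(len(ri), bool), (ri, rj)), shape=(n, n))
        _, labels = connected_components(graph, directed=False)
        biggest = np.bincount(labels).argmax()
        keep = [r for r, lab in zip(res_list, labels) if lab == biggest]
        spec = "".join("/%s:%d" % (r.chain_id, r.number) for r in keep)
        run(session, "select " + spec)   # command string may be long for big models
        return keep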
Kristen: Eric's previous fix for the virus chain-names issue looks good.
Phil: question about the contacts command, can we list only residue-residue contacts? Elaine: no, it is atom-atom; you can use contacts to select atoms and then list the selected residues, but that would lose the residue pairing information. Eric: can add an option to the command. Phil: sounds good.
Phil: plans to incorporate AlphaFold3? TomG: their website only allows limited calculations and there is no API for programmatic access. However, can upload a json to set up a run on their server. So one potential thing ChimeraX could do is generate that json file, but it may be difficult especially for ligands, and not worth it with the current restrictions on what calculations can be done. So I'm inclined to wait and hope that a less restrictive version becomes available. Phil: I miss the interactive PAE. TomG: currently (v1.8+) you can download the results and then open the AlphaFold3 PAE data and structure in ChimeraX. Phil: OK, then I'm satisfied!
Scooter: re Schol-AR, we met with Tyler and got feedback. Next step is to put a "beta" bundle in the Toolshed and get some more testing. Elaine: yes, please use it; you (Kristen and Phil) are more qualified than I am to test this feature.
Scooter: we have the raytracing lighting for volumes now, take a look. Elaine: see the Medical Image tab of the toolbar, icon looks like a 3D pac-man.
Greg: I've added dark mode support. Elaine: the User Guide pages also support this.
Darrell: did 2nd-half-of-year funding go through yet? Scooter: still in progress, on my to-do list. Want to get it done without a gap, so it's urgent (possibly later today). Have been busy with hardware upgrades. Darrell: saw on "X" the new ChimeraX raycasting and worms representations; they look good. Unfortunately, our budget going forward will be flat; we need to keep spending no higher next year than this year and may need to reduce hours in the next contract.
Scooter: green light from UCSF for setting up endowment for Tom Ferrin's position. Position will be open soon, we want to inform relevant parties and start getting interest.
Darrell: we would like to visit you, should start planning soon. Would like to bring Neta Filip-Granite (cryoEM expert). Scooter: maybe Jan/Feb. Kristen: hopefully I won't be snowed in. Scooter: maybe we can do some kind of seminar concurrently. TomG: is your setup also for ET? Darrell: yes. TomG: there is a relatively new Chan-Zuckerberg institute focusing on EM/ET.
7/18/24
Elaine, Eric, Zach, Piet; Phil, Bhinnata, Andi, Kristen, Meghan, Mike Bopf
Phil: any questions from Eric about the NIH3D issues/requests? Eric: not really, but haven't had time to address the issue with struts being calculated for hidden (low-pLDDT) parts of alphafold models. Have a plan to fix it. Haven't had a chance to work on the other issues: missing outputs (#15611) and atom coloring showing on surfaces (#15612). Elaine: maybe the latter is atoms shown in sphere style protruding through the molecular surface.
We have some progress to report: Zach and med imaging, Piet and Schol-AR.
Piet: after getting your API key, you can see your projects via the ChimeraX tool, and update them directly. Kristen: I had issues with GLB vertex coloring. Piet: they support GLB currently but there are some issues with coloring.
Zach: I've been implementing raycasting for medical images, which gives a higher-quality appearance more like other medical image viewers. Eric: is there a way to get rid of the moire pattern? Elaine: does rendering mode XYZ slices vs. perpendicular slices (icons) affect that? Zach: no, they aren't applicable to raycasting. Zach: there are a few different approaches for getting rid of the moire pattern, just haven't tried them yet. This raycasting is faster than 3D Slicer's. Phil, Kristen: should show this to Dave Chen.
Zach: have you had a chance to try the Segmentations tool lately? I made several improvements to UI. Phil: not yet, but you've inspired me to do it soon. Have been busy helping with several specific research projects lately.
6/20/24
Elaine, Eric, Zach, TomG, Scooter; Phil, Kristen, Meghan, Bhinnata, Mike Bopf
Phil: to get the coordinates of a point, I found it useful to add a helium atom and then move it in VR. Used "measure center" to report its coordinates. TomG: is there a more convenient approach? Elaine: maybe a setting for balloon help to show coordinates instead of spec, or an entry in the context menu, or a "log coordinates" mouse mode. TomG: similar features are the spheres in Local EM Fitting and Map Eraser. Phil: VR makes placement easier.
Kristen: we filed tickets for more alphafold presets and something about coloring by chain instead of by polymer. Eric: the latter was for something like a virus assembly, if from pdbe_bio you get different chain IDs than if you use "sym" to generate the assembly. Discussion of what coloring they actually want. Unfortunately with color bypolymer, similar sequences don't necessarily give similar colors. Discussion of which presets will be needed for the alphafold models.
Discussion of VR w/ and w/o cable, w/ and w/o passthrough video.
Meghan: how much do you engage with other VR users at UCSF? Is there a special interest group? TomG: no, we are not really connected, especially with very different applications (not molecular structure). Meghan: Adam Gazzaley? TomG: we know of him but haven't interacted re VR.
TomG: have been working on viewing different kinds of data in VR, like light-sheet microscopy. Phil: spatial omics? a "growth area" discussed in the world economic forum. Zach: somebody who attended our workshop at NIAID was doing spatial omics. Discussion of what exactly spatial omics includes. Phil: 3D detection in cells of "whatever"
Next meeting Jul 18 (Jul 4 is holiday)
6/6/24
Scooter, Eric, Elaine, Zach, Greg; Mike Bopf, Kristen, Bhinnata, Darrell, Meghan
Eric demoed some of the NIH3D alphafold presets he's been working on. Has separate sets of alphafold presets showing all residues vs. high-confidence (pLDDT) residues only. Removed some of the printable ones as per request. He also worked on the script to fetch from alphafold. Still working on the case where all residues are high confidence (to only put out one instead of two that are the same). Kristen: some of the "spaghetti" will not be printable, will need to work on our warning text for these alphafold models. Mike asked if this is the same script that produced outputs for the PDB models, Eric said yes.
Scooter: we're 3 months into this 6-month period. Should we go forward and start thinking about paperwork for the next 6 months? Darrell: I should know by next Monday. Scooter: let's think about another visit during that period. Darrell: will keep that in mind as well, and let you know if feasible. Maybe we could instead visit you, and avoid the UC surcharge. Kristen: I would definitely volunteer to come. Scooter: OK, so maybe we could host the next meeting. You could give an NIH 3D presentation.
Zach asks if Kristen had disseminated some DICOM info that he'd given her earlier, she said not yet. Scooter: by the way, we will be working with an intern on the Schol-AR project.
5/23/24
Elaine, Eric, Scooter, Zach; Phil, Kristen, Andi (briefly), Mike Bopf
Scooter is concerned that we haven't heard anything about the supposedly current contract. Meghan is away today and we didn't get any updates from her since the last meeting. Phil will send her email CC-ing Darrell and Scooter. Scooter: it's possible it's already come to UCSF but nobody has done anything with it.
Eric has implemented several of the alphafold presets requested in the previous meeting, including coloring by PAE domain. For printable ribbon, may need to use a longer max length with the "struts" command, say 20. Tried with ldlr_human. The disordered (low confidence) parts are so long, Phil is thinking we just shouldn't bother with the printable ribbon form of the alphafold-all structures (all = including low confidence). We can keep the printable surface one for these structures.
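A sketch of the longer-strut idea (the value of 20 comes from the discussion above; check the struts documentation for the exact option names and defaults):
    from chimerax.core.commands import run
    # Connect CA atoms with struts, allowing struts up to 20 A long for printability
    run(session, "struts @CA length 20")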
Phil and Eric discuss details of the File... Save dialog and why it looks different than the File... Open dialog.
Zach: I've made a lot of updates in the Segmentations tool and associated features. Works with segmentations read from file, can have multiple ones each associated with their original dataset. New segmentations command and mousemode/hand-controller mode toggle icons.
Zach demos the new stuff to general acclaim and Kristen wants to see how responsive segmentation is compared to slicer. Zach: it is more responsive, thanks to Tom G's graphics work. Phil recently demoed these features and will be doing more demos soon.
Mike: question for Eric, the bug you fixed, is that specific to the Mac? Eric: yes, that setting applies to Mac only. Mike: timeframe for these alphafold updates? Eric: a couple of weeks, need to do more on the presets and then update the script itself to use them.
Phil won't be here next time, will be in Croatia.
5/9/24
Elaine, Eric, Zach, Scooter; Phil, Meghan, Mike Bopf, Bhinnata, Andi Bueckle, Kristen Browne
Discussion of preparations for their poster presentation next week. Wanted to use VR along with the poster, but had an issue trying to set up their own wifi for "vr meeting" and alignment of two sessions. Phil ended up using his phone as a hotspot.
Phil: we had internal discussions on what we need for alphafold presets for NIH3D. Shows a table summarizing the names, display styles, coloring, with and without high confidence (probably pLDDT >= 50 although that may change). Also shares the table as Word and PDF docs, Elaine downloaded and emailed to Eric.
Eric demonstrates the first two alphafold presets that he'd already worked up, ribbons colored by pLDDT for viz and for printing.
Phil: these alphafold presets are only for quick-submit, i.e. user just specifies uniprot ID and we do the fetch. Eric: better to give us (the script) the uniprot ID than the structure, so that the fetch can also get the corresponding PAE, etc. Elaine: use "alphafold fetch" with "pae true" and then another command "alphafold pae" with "colorDomains true". Phil: Mike, we will be getting the additional PAE file (JSON) and it needs to be accessible to ChimeraX.
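Elaine's suggested command sequence, roughly as it might appear in the quick-submit script (a sketch; the UniProt name and model number are placeholders):
    from chimerax.core.commands import run
    # Fetch the AlphaFold Database prediction plus its PAE (JSON) file
    run(session, "alphafold fetch ldlr_human pae true")
    # Color the structure by PAE-derived domains
    run(session, "alphafold pae #1 colorDomains true")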
Scooter: we haven't gotten an official confirmation of the contract. We were given the nod to work "at risk" but that was a while ago. Meghan: I will touch base with Nada and see what information I can get on status. Scooter: or maybe UCSF has just failed to notify us.
Some discussion of NIAID contract and funding opportunities for Andi.
4/25/24
TomG, Eric, Zach, Scooter; Phil, Meghan, Andi Bueckle, Mike Bopf, Bhinnata
Phil: our "take your child to work day" activity was starting a ChimeraX VR meeting, and there was already a meeting name that suggested others had the same idea! Went well, kids enjoyed it. Meghan: we're tired from doing demos today. Andi: I have a VR demo (Apple Vision Pro) if we have time.
Phil: latest on working with passthrough? TomG: haven't changed it in a few weeks. Scooter tried it recently and found that a developer account is needed to change a necessary setting for passthrough. TomG: related to security.
Scooter: I made a long list of requests to TomG and Zach, can share with NIAID folks. I would bring up a menu to the right of where I was viewing the scene. Most importantly, I wanted multiple panels to stick together so they didn't have to be moved individually. Also I kept bumping my hand on the physical screen trying to select with the cone. It would be nice to have an interface for selecting from a distance, like a laser pointer. Resizing panels was confusing.
Phil: during one session on one machine, I would always get the checkbox below the one I was clicking on. Not very reproducible. It was the "show/hide" checkbox in the Model Panel. Meghan: maybe low battery on controller? TomG: there is a conversion between VR space location and desktop screen location, and maybe it was off by a certain vertical amount.
TomG: we recently added a button lockout so that you can hand over controls without messing everything up. Need a new daily build. Phil: now we can easily push the new version to multiple machines.
3D Workflow. Mike: how many visualization output files are we going to get for alphafold structures? Phil: that ball's in my court, to decide on which representations and their filenames. Will let Eric know to work on new presets. Phil: and then the script needs to take the outputs from the presets and write out the additional files.
Over to Andi for demo: sharing screen from Vision Pro, shows augmented reality of organs in shiny clear bubbles floating around his office. There are interaction panels, toggle switch to explode the organ with labels on all of its bits, and toggle back to put it back together. left/right on the other interaction panel chooses a different organ to focus on.
Discussion of VR user interface improvements, like showing the command line in the headset (hard to see via passthrough). Meghan: a Cisco upgrade will be very expensive. Scooter: wifi 7 has been released. Meghan: but will the headsets support it, and will we be allowed to use it here? Scooter: I just mean that you may want to investigate other options before paying Cisco big bucks for an upgrade. TomG: in my experience, passthrough requires a cable connection for good performance. I don't think it is a hardware issue (lack of wifi bandwidth), but a Meta software problem with using Air Link. The developer option allows passthrough with Quest Link (the cable); it doesn't mention Air Link. TomG: you do sharing with larger numbers of headsets than we do; wifi 7 may help in that case. Meghan: a cable is a nuisance with rolling chairs and might get damaged.
The rest of today's notes are text that Meghan sent through the Zoom chat:
USB:
- 192.6Mbps peak on Google Earth at defaults (200Mbps)
- 188.2Mbps peak on ChimeraX at defaults (200Mbps)
- 413.3Mbps peak on ChimeraX with max set to 500Mbps
- 846.2Mbps peak on Medical Holodeck with max set to 960Mbps
Testing wired connection and pushing bitrates to the limit
Used Wireshark with USB-PCAP (requires special install file that gets moved into Wireshark program files) see https://desowin.org/usbpcap/tour.html
Next step: same testing wired/wireless focused on normal day-to-day use
Have a baseline to measure against
Torrey will investigate use of Cisco Meraki cloud-based WLC
4/11/24
TomG, Eric, Zach, Greg, Elaine, Scooter; Phil, Kristen, Meghan, Mike Bopf, Bhinnata
Zach: did you see the email I sent about a radiology conference later this year, are you going? Meghan: someone will go but we may not be hosting a booth. May be useful for you to attend just to learn, keep it in mind.
Scooter: we still don't have a contract finalized even though we were told it was fine to start the work. I'll email and ask for status.
Meghan: what's going on with workflows? Kristen: we need to figure out what we want re AlphaFold, still haven't gotten a list of specific requests. Quick submit is when the user enters an accession code (will be obtained using ChimeraX fetch) rather than uploading their own structure.
Scooter: we should talk about AlphaFold and GLTF. We would like to add a "publish to Schol-AR" feature in ChimeraX using REST interface. Developer Tyler Ard is a medical imaging guy and has given us some feedback about the ChimeraX tools. We would like to visit him at USC, give demos, discuss features, and try to engage his colleagues as well.
TomG: Meghan's powerpoint gltf example used an older ChimeraX. Newer ChimeraX (Oct 2023) gltf export has better treatment of color, so looks better in powerpoint. Discussion of whether the vrmls will then need correction since those converted from the older gltfs are fine. Meghan: let's test with newer ChimeraX. Phil: it looks good enough with newer ChimeraX that we don't need any special preset for gltf export for powerpoint.
Kristen: another possibility is color STL. Greg: recommend staying away from STL. Kristen, Meghan: or 3MF or USD (Universal Scene Description). Kristen: there are various flavors, USDZ etc.
TomG: should NIH3D put in lighting, or leave that to the end user's application? Kristen: probably better to leave it out so that you don't get stuck with both lighting from the exported file and the application, and no reasonable way to remove lights.
Discussion of lighting direction vs. camera direction. If there is a headlight you avoid dark sides of models, but that approach can overly wash out white or other light-colored models.
Phil: for AlphaFold, we probably want at least 2 colorings, by PAE domain and by pLDDT. Tom: pLDDT is well-known, but PAE-domain coloring is more specific to ChimeraX and may require more explanation, or at least thought as to whether you really want to make it available. Parts of more than one chain can be assigned to the same PAE domain. Phil: for quick submit, it will be only one protein. My instinct is that we still want to include it for alphafold database entries, along with a good explanation. Phil: the other issue is whether to hide low-pLDDT spaghetti. What is a reasonable cutoff? Tom: pLDDT 50. Tom: keep in mind that PAE values are in a separate file from the structure. ChimeraX alphafold fetch gets both files.
Current thinking is we need more presets: both with and without spaghetti, 2 alphafold coloring schemes, and all the existing color schemes of both ribbons and surfaces. Elaine: do you really need a combinatorially expanded set of presets, when some of these differences would just be a single command? Eric: how about a modifier preset? Then still simple but less duplicative. To get a specific view, you'd use preset A followed by modifier preset B.
Phil: do you need any more specifics? Eric: do you need missing-structure pseudobonds where the spaghetti is hidden or excised? What about surfaces? Will think about the implementation details. Tom: I have various doubts: (a) getting rid of the low-pLDDT parts can remove parts of helices, even buried cores, etc., and give weird results; (b) pLDDT coloring is useful for ribbons but probably not surfaces; I have not seen it used for surfaces. Others: it would simplify the preset situation to omit some of these options.
Eric: I'll work up some possibilities and put it in a bundle for further feedback.
3/28/24
Scooter, Eric, Zach, Elaine; Andi Bueckle, Phil, Kristen, Darrell, Meghan, Bhinnata
Some discussion of Andi's scary (almost-real-looking) avatar since he is Zooming using the Vision Pro. The varying amounts of transparency and the unrealistic mouth movements are the giveaway.
Phil: Kristen had an interesting idea to output files for Mesoscape. Scooter: I talked to some people at VIZBI... planning a collaboration for publishing data to Schol-AR; the developer (Tyler) happens to also be a radiologist and may be helpful in evaluating our medical image stuff. Discussed maybe having a ChimeraX REST interface to publish to Schol-AR. Kristen and Phil: we may be interested in adding a connection to Schol-AR from NIH3D.
Scooter: Eric has a present for you, especially Darrell. (Zoom technical difficulties, disconnected, reconnected.) Eric demonstrates worms, everybody is enthusiastic. Darrell: would be fun to try 3D printing. Might need struts. Kristen: is this in other programs? Elaine: yes, Pymol, Chimera, etc. Phil: re NIH3D I also wanted to discuss the quick-submit workflows for alphafold... we need to decide which are the standard outputs: pLDDT and PAE domain coloring, ribbons and surfaces, maybe hide low-pLDDT parts? Transparency? Elaine: pLDDT coloring shows low confidence as red, or it could be gray in combination with PAE domain coloring. In ChimeraX you can specify all residues with bfactor (pLDDT is in the bfactor field) greater than some value.
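A minimal sketch of that kind of output recipe (the pLDDT cutoff of 50 and the UniProt name are taken from the discussions above; the actual presets would differ):
    from chimerax.core.commands import run
    run(session, "alphafold fetch ldlr_human")
    # pLDDT is stored in the B-factor field; color by it with the AlphaFold palette
    run(session, "color bfactor palette alphafold")
    # Hide the low-confidence "spaghetti" (pLDDT < 50)
    run(session, "hide @@bfactor<50 cartoons")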
Scooter: this all ties into what metadata we want to have in the GLTF output. Groups of residues predefined, e.g. "high pLDDT" or by chain ID or domain, etc. Phil: Darrell, maybe you can help us decide on the standard set of outputs. Darrell: should be useful and include ones that are harder for some people to generate on their own, which used to be the case for ESP coloring. We'll have to put our heads together and decide on an edited set. Kristen: maybe we could offer checkbox choices of which outputs the user wants.
Phil: ChimeraX gltf outputs look washed out when embedded into Microsoft documents (Powerpoint and Word). This is due to their oversaturated lighting model, which is unlikely to be addressed by Microsoft, so we have to try to work around it. I can generate ChimeraX gltf files that look better after embedding in such documents by using different settings in ChimeraX (I have my own preset for this, which uses "color modify" to lighten the colors), but that may require yet another set of output options. Kristen: I hacked Powerpoint to get around it but it's not trivial. See this forum thread on the Microsoft lighting: https://answers.microsoft.com/en-us/msoffice/forum/all/3d-model-lighting-inside-powerpoint/d0c0c316-8019-4c25-b0f6-86500e512f91 ... The suggested solution is what I've done in the past to fix the lighting.
Meghan posted another link in the chat https://answers.microsoft.com/en-us/msoffice/forum/all/powerpoint-uses-gltf-but-doesnt-support/002d1f4e-061d-4ab6-a692-ed217945724a
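Phil's lightening preset is presumably something along these lines (a guess at the general idea only; the "color modify" option name and amount are assumptions to check against that command's documentation, and the file name is a placeholder):
    from chimerax.core.commands import run
    # Lighten colors before export so Microsoft's lighting model doesn't wash them out
    run(session, "color modify #1 lightness +20")   # option name/amount are assumptions
    run(session, "save model_for_ppt.glb")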
Meghan: are there ways to fix the gltf after it's output? Blender, but having to download Blender is another barrier. Kristen: maybe NIH3D could have an "optimize gltf for Microsoft documents" service that runs our own Blender. Phil, Darrell: it may be a useful utility. Darrell: do gltf outputs from ChimeraX include lights? Kristen: I just checked, and no, these gltf files do not include lights or a camera.
Darrell: we might also look at providing U3D which can be embedded in PDFs. Greg: it hasn't been used much. Darrell: probably because it is rather difficult to generate. Meghan posted this link in the chat https://helpx.adobe.com/acrobat/using/adding-3d-models-pdfs-acrobat.html