Hi Matthias,

   The D435 depth-sensing camera is noisy -- it can't determine depth values for all pixels. It judges depth from the stereo overlap of two IR cameras, so plain backgrounds or repeating patterns confuse it. It has an IR projector that scatters thousands of dots around the room to help, but the depth can still be erratic, and that shows up as flickering patches in the video. So if you are going to try this, be aware that the technology isn't easy to use yet.

   There is an Intel RealSense L515 lidar camera shipping maybe in June at $349, more expensive than the $180 D435. It uses a "time of flight" method to get depth at each pixel -- basically it times how many picoseconds a light pulse it sends out takes to return (for scale, light covers about 0.3 mm per picosecond, so the round trip to something 1 m away takes roughly 6,700 ps). I have not tried this type of camera; it may be less noisy.
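   If you want to see the noise for yourself before building anything around it, a minimal check like the sketch below (using Intel's pyrealsense2 Python bindings; it assumes the RealSense SDK and a connected D435, and the stream settings are just example values) reports what fraction of pixels come back with no depth in each frame -- those holes are what cause the flicker.

# Rough noise check, separate from the ChimeraX workflow: report what fraction
# of D435 depth pixels are invalid (reported as 0) in each of ~30 frames.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)  # example stream settings
pipeline.start(config)
try:
    for i in range(30):                                   # about one second of frames
        frames = pipeline.wait_for_frames()
        depth = np.asanyarray(frames.get_depth_frame().get_data())  # uint16, 0 means "no depth"
        print(f"frame {i}: {(depth == 0).mean():.1%} of pixels have no depth value")
finally:
    pipeline.stop()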
   Tom


On Mar 12, 2020, at 11:19 PM, Matthias Wolf <matthias.wolf@oist.jp> wrote:

Thanks, Elaine and Tom!

This is very informative and inspirational. I will get a RealSense camera and try it out.

 Matthias


From: Tom Goddard
Sent: Friday, March 13, 2020 3:45 AM
To: ChimeraX Users Help <chimerax-users@cgl.ucsf.edu>
Cc: Matthias Wolf <matthias.wolf@oist.jp>
Subject: Re: [chimerax-users] VR movie

Hi Matthias,

   The link Elaine mentioned describes how I made the augmented reality video. Briefly, I use an Intel RealSense D435 depth-sensing camera that captures video of me and the room, and ChimeraX blends that video with the molecular models in real time using the ChimeraX realsense tool (obtained from the ChimeraX menu Tools / More Tools...). So the desktop display shows the ChimeraX graphics with the room video and molecules in real time, and I just screen-capture that using Flashback Pro 5. There is a little more optional hardware -- I locate the RealSense camera in the room using a Vive Tracker mounted on the camera. While in VR I see a rectangular screen, positioned where the camera is, showing live what the camera sees blended with the models. So as I look at the RealSense camera I see exactly what is being recorded and can frame the molecules and myself in the video. I do not use the headset cameras for pass-through video.
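   For a sense of what the camera hands to a tool like that, here is a minimal sketch (again with the pyrealsense2 bindings, not the ChimeraX realsense tool itself) that grabs one color frame with depth aligned to it and flags the pixels near enough to be the person in front -- the per-pixel depth is what makes blending the room video with rendered models possible. The 1.5 m cutoff is just an example value.

# Illustration only: fetch one aligned color + depth frame from a D435 and
# build a "near" mask of pixels within 1.5 m that could be composited in
# front of rendered graphics.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
profile = pipeline.start(config)

depth_scale = profile.get_device().first_depth_sensor().get_depth_scale()  # depth units -> meters
align = rs.align(rs.stream.color)            # map depth pixels onto the color image
try:
    frames = align.process(pipeline.wait_for_frames())
    depth_m = np.asanyarray(frames.get_depth_frame().get_data()) * depth_scale
    color = np.asanyarray(frames.get_color_frame().get_data())   # HxWx3 BGR image
    near = (depth_m > 0) & (depth_m < 1.5)   # valid pixels closer than 1.5 m
    print(f"color image {color.shape}: {near.mean():.1%} of pixels would sit in front of the models")
finally:
    pipeline.stop()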
   The motion of the spike binding domain in the coronavirus video is a morph of PDB 6acg, 6acj, and 6ack, three conformations seen by cryoEM.
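   In case it helps anyone reproduce that kind of morph, here is a rough sketch of the steps as ChimeraX commands driven from its Python shell. The model numbers, frame count, and mouse-mode binding are assumptions to check against the ChimeraX documentation, not the exact commands used for the video.

# Rough sketch of setting up a conformational morph like the spike one above,
# run in ChimeraX's built-in Python shell where "session" is predefined.
# Model numbers and options are assumptions -- see the ChimeraX "morph" and
# mouse-modes documentation for the real details.
from chimerax.core.commands import run

run(session, "open 6acg")     # three cryoEM conformations of the spike
run(session, "open 6acj")
run(session, "open 6ack")
run(session, "morph #1,2,3 frames 60")   # interpolated trajectory opened as a new model
# Bind a mouse/VR-controller mode that steps through the morph frames by
# dragging, as Elaine describes below ("play coordinates").
run(session, 'ui mousemode right "play coordinates"')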
   Here is another augmented reality video I made, on opioids:

   https://youtu.be/FCotNi6213w

   I think this augmented reality capture can be very useful for presenting results about 3D structures in science publications, as supplementary material, or for the public. A few people have said they are getting the depth-sensing camera to try it, but I don't know of anyone who has done it yet.

   Tom


On Mar 12, 2020, at 10:35 AM, Elaine Meng wrote:

Hi Matthias,
Tom wrote a nice summary of his process for making mixed-reality videos here:

<https://www.cgl.ucsf.edu/chimerax/data/mixed-reality-nov2019/mrhowto.html>

That may address the first and third questions.

As for the middle question, there is a mouse mode (or VR hand-controller button mode) for bond rotation. However, my guess is that he instead made a morph trajectory between the two conformations beforehand and was using the "play coordinates" mouse mode (flipping through different sets of coordinates in a trajectory model)... Tom would have to confirm whether my guess is correct. One would generally use the bond-rotation mode when zoomed in on atoms/bonds shown as sticks, so that it is easy to start the drag on a specific bond.

Mouse modes and their toolbar icons:
<http://rbvi.ucsf.edu/chimerax/docs/user/tools/mousemodes.html>

I hope this helps,
Elaine
-----
Elaine C. Meng, Ph.D.
UCSF Chimera(X) team
Department of Pharmaceutical Chemistry
University of California, San Francisco


On Mar 12, 2020, at 4:16 AM, Matthias Wolf wrote:

Hi Tom,

I really liked your CoV movie: https://www.youtube.com/watch?v=dKNbRRRFhqY&feature=youtu.be
It's a new way of storytelling.
Although we have used ChimeraX in the lab with a Vive and Vive Pro for about 2 years, it's usually one person at a time, with a dark background. But your way opens up interactive VR to a larger audience (even if they don't get to enjoy the stereoscopic 3D). And it's cool.

I have some questions:
  • How did you overlay the ChimeraX viewport, synced with the live camera feed showing yourself? Did you use a frame grabber on a different PC to capture the full-screen ChimeraX VR viewport while simultaneously recording the camera video stream, e.g. using Adobe Premiere?
  • How did you control flipping out the outer spike domain with your hand controller? I guess you assigned control of a torsional angle in the atomic model to a mouse mode?
  • Did you enable the headset cameras to orient yourself in the room?

Thanks for continuing to improve ChimeraX and VR!

Matthias