cryoSPARC Documentation
- 1 Connection Instructions
- 1.1 SCU - BRB Cluster
- 1.1.1 Prerequisites
- 1.1.2 Connection - Mac or Linux
- 1.1.3 Connection - Windows
- 1.2 SCU - Weill Greenberg Cluster
- 1.2.1 Prerequisites
- 1.2.2 Connection - Mac or Linux
- 1.2.3 Connection - Windows
- 2 cryoSPARC Notices - Things to Know
- 2.1 Process Ownership
- 2.2 Permissions Required
- 2.3 CPU vs GPU
- 3 Current Lab Ports
- 4 Contact
Connection Instructions
SCU - BRB Cluster
Prerequisites
An active WCMC CWID with access to the scu-login nodes is required to access the cryoSPARC BRB instance. More information can be found here.
cryoSPARC access must be configured by SCU admins prior to connection. To request access, email scu@med.cornell.edu; access is granted upon email confirmation from the lab P.I.
SSH key setup is highly recommended for seamless access to the environment; a minimal example is sketched below.
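As a sketch, one common way to set up keys from a Mac or Linux terminal (assuming the standard OpenSSH client and that ssh-copy-id is installed; replace cwid with your own CWID):
ssh-keygen -t ed25519
ssh-copy-id cwid@scu-login02.med.cornell.edu
The first command generates a key pair (accept the default location; a passphrase is optional), and the second installs the public key on the login node so future logins are key-based.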
Connection - Mac or Linux
Modify ~/.ssh/config on your local workstation to include the following entries:
Host scu-login02
    HostName scu-login02.med.cornell.edu
    ServerAliveInterval 60
    TCPKeepAlive yes

Host scu-vis1
    ProxyCommand ssh -W %h:%p scu-login02
Connect to the WCM VPN Network
Run the following from your local terminal (replace HOSTNAME and PORT with your lab's specific hostname and port number from the Current Lab Ports table below):
ssh cwid@HOSTNAME -L PORT:localhost:PORT
Enter http://localhost:PORT in your local browser and create your credentials using the registration token that was sent to your email.
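For example, a member of the Xin-Yun Huang lab (port 39000 on scu-login02.med.cornell.edu, per the Current Lab Ports table below) would run something like:
ssh cwid@scu-login02.med.cornell.edu -L 39000:localhost:39000
and then browse to http://localhost:39000.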
Connection - Windows
Follow the SCU Windows SSH Forwarding guide found here, updating the port numbers to match your lab's entry in the Current Lab Ports table below.
Enter http://localhost:PORT in your local browser and create your credentials using the registration token that was sent to your email.
SCU - Weill Greenberg Cluster
Prerequisites
An active SCU account with access to the named gateway nodes (pascal, aphrodite) is required to access the cryoSPARC Greenberg instance.
cryoSPARC access must be configured by SCU admins prior to connection. To request access, email scu@med.cornell.edu; access is granted upon email confirmation from the lab P.I.
SSH key setup is highly recommended for seamless access to the environment (see the SSH key sketch under the BRB prerequisites).
Connection - Mac or Linux
Modify ~/.ssh/config on your local workstation to include the following entries:
Host pascal
    HostName pascal.med.cornell.edu
    ServerAliveInterval 60
    TCPKeepAlive yes

Host node164
    HostName node164.panda.pbtech
    ServerAliveInterval 60
    TCPKeepAlive yes
    ProxyCommand ssh -A -W %h:%p pascal.med.cornell.edu
Run the following from your local terminal (replace PORT with your lab's specific port number here):
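A sketch of the command, assuming the Greenberg cryoSPARC instance runs on node164 (the host defined in the SSH config above):
ssh cwid@node164 -L PORT:localhost:PORT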
Enter http://localhost:PORT in your local browser and create your credentials using the registration token that was sent to your email.
Connection - Windows
Follow the SCU Windows SSH Forwarding guide found here, updating the port numbers to match your lab's entry in the Current Lab Ports table below.
Enter http://localhost:PORT in your local browser and create your credentials using the registration token that was sent to your email.
cryoSPARC Notices - Things to Know
Process Ownership
All cryoSPARC calculations are mediated by the user account "cryosparc_lab". This account runs cryoSPARC, manages the cryoSPARC database, submits jobs to Slurm, and performs other tasks. It belongs to the lab's Linux group, so all files and directories that cryoSPARC interacts with or writes to MUST have Linux group read/write permissions (a sketch of setting these is shown below).
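As a sketch, granting those group permissions on an existing project directory might look like the following (using the example path from the next section; replace xyzlab with your lab's actual Linux group):
chgrp -R xyzlab /athena/xyzlab/scratch/userA/cryosparc_stuff
chmod -R g+rwX /athena/xyzlab/scratch/userA/cryosparc_stuff
The capital X grants group execute/search only on directories (and on files that are already executable), so cryosparc_lab can traverse the tree and read/write its contents.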
Permissions Required
cryoSPARC requires the group read/write permissions described above. As a result, every member of the lab can read and write any file that is readable/writable by the user 'cryosparc_lab'. This is not necessarily bad (assuming a certain level of trust amongst lab members); for example, if UserA is running cryoSPARC calculations in /athena/xyzlab/scratch/userA/cryosparc_stuff, then UserB could delete that entire directory if they wished. One way to limit this exposure is to create a dedicated directory in which to run cryoSPARC calculations (thereby isolating group access), as sketched below. Once the calculations are complete, you could conceivably move the completed results into a more restricted directory (however, read the documentation carefully about migrating projects, as I don't know how cryoSPARC will respond if additional work is needed later). Still, assuming everyone in the lab behaves ethically, you don't have to worry much about this.
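A minimal sketch of that approach, using a hypothetical directory name cryosparc_projects and the placeholder group xyzlab:
mkdir -p /athena/xyzlab/scratch/userA/cryosparc_projects
chgrp xyzlab /athena/xyzlab/scratch/userA/cryosparc_projects
chmod 2770 /athena/xyzlab/scratch/userA/cryosparc_projects
The 2770 mode gives the owner and group full access, blocks everyone else, and sets the setgid bit so new files created by cryosparc_lab inherit the lab group. When a project is truly finished, moving it out and running chmod -R go-w on the destination removes group write access.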
CPU vs GPU
It's clear for most jobs whether they require GPUs or not (obviously, if the calculation has an input for the # of GPUs available, then it needs GPUs!). However, some job types are ambiguous. We recommend assuming that the calculation does not require GPUs (if it's not clear) and submitting to `cryo-cpu`. If the calculation does require GPUs, it will almost immediately error out complaining about `pycuda`; if you see this, just clear the job and submit to a GPU queue. If trial-and-error is not your thing, the cryoSPARC documentation describes this in detail for each calculation.
Current Lab Ports
| Lab | Port | Hostname |
| --- | --- | --- |
| Xin-Yun Huang | 39000 | scu-login02.med.cornell.edu |
| Crina Nimigean | 39100 | scu-login02.med.cornell.edu |
| Alessio Accardi | 39200 | scu-login02.med.cornell.edu |
| Olga Boudker | 39300 | scu-login02.med.cornell.edu |
| Paul Riegelhaupt | 39400 | scu-vis1 |
| Joel Meyerson | 39500 | scu-login02.med.cornell.edu |
| Harel Weinstein | 39600 | scu-login02.med.cornell.edu |
| Joshua Levitz | 39700 | scu-vis1 |
| Devrim Acehan | 39800 | scu-vis1 |
| David Eliezer | 39900 | scu-vis1 |
| Simon Scheuring | 40000 | scu-vis1 |
| Edwin Carl Fluck | 40100 | scu-vis1 |
| Olaf Sparre Andersen | 40200 | scu-login02.med.cornell.edu |
| Peter A Goldstein | 40300 | scu-vis1 |
| Rie Nygaard | 40400 | scu-login02.med.cornell.edu |
Contact
Please contact SCU@med.cornell.edu with any questions or concerns.