Public Genomics Institute Infrastructure Ready for Migration




The new Genomics Institute public infrastructure is ready to begin account creation and data migration. It consists of the shared compute server '''courtyard.gi.ucsc.edu''' attached to a home directory and file storage server. We will keep the SDSC public '''kolossus''' server and attached storage available '''until June 29th''' to allow time for migrating data, and will add additional compute after this migration period. We're working on the private infrastructure (the replacement for pod) and anticipate having it available for migration in the next few weeks, with a similar migration window; we will email with updates.


==Getting An Account==
Starting Tuesday, May 29th, contact the SysAdmin at ''cluster-admin@soe.ucsc.edu'' to set up an account on the new system.
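Once your account is active you can log in to the new shared compute server over SSH. A minimal example, assuming your username on the new system matches your UCSC username:

<pre>
ssh <your-username>@courtyard.gi.ucsc.edu
</pre>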


Each account includes a backed-up home directory, initially limited to '''30GB'''.
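To check how much of that home directory limit you are currently using, one standard approach is ''du'' (this reports disk usage, not the quota limit itself):

<pre>
# Summarize total disk usage of your home directory, human-readable.
du -sh ~
</pre>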


==Large Data Storage==
If you require more space, provide the admin with the name of a PI or funded project. We will create a shared directory under /public/groups/<PI or project name> that you and others associated with the same PI or project can access. If you work with multiple PIs or projects, you will have access to each of their shared directories.


The lab/project may organize data under that directory in any structure, but the total size of a PI/project top-level directory under groups will generally be limited to '''10TB''' during migration while we get a better idea of how much data all groups require. This storage sits on reliable RAID6 hardware, but it is not backed up and should be used primarily for data you are actively working with. For backup and long-term archival we suggest setting up an AWS account and using Glacier.
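As a sketch of the suggested Glacier workflow, assuming you have installed and configured the AWS CLI under your own account and created an S3 bucket (the bucket name below is a hypothetical placeholder):

<pre>
# Bundle a project directory, then upload the archive to S3 with the Glacier storage class.
tar czf mydata.tar.gz /public/groups/<PI or project name>/mydata
aws s3 cp mydata.tar.gz s3://<your-archive-bucket>/ --storage-class GLACIER
</pre>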


==Migrating Data==
Once you have an account, you can migrate data from the old infrastructure to the new via ''rsync''. For large shared storage, please coordinate with others in your lab and/or project. For help with this, contact ''cluster-admin@soe.ucsc.edu''.
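A typical invocation, run from the new server and pulling from the old one (the source hostname and paths below are placeholders; substitute your own):

<pre>
# -a preserves permissions/timestamps, -v is verbose, -P shows progress and resumes partial transfers.
rsync -avP <your-username>@<kolossus-hostname>:/path/to/old/data/ /public/groups/<PI or project name>/data/
</pre>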
