Every time a Workspace is started or restarted, the data saved in your previous session (everything in the persistent directory /floyd/home) is copied onto the new machine running your Workspace session. This copy is tracked by the Loading data step (see the picture above).

As a rule of thumb, this step takes about 20 seconds for each GB of data saved in the previous session. For example, the slow workspace in the picture above takes about 12 minutes to complete the Loading data step.
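The rule of thumb above can be sketched as a quick estimate. This is an illustration, not an official formula: the function name is ours, and the 36 GB figure is back-derived from the 12-minute example (720 s at 20 s per GB).

```python
def estimate_loading_seconds(storage_gb):
    """Rule of thumb from this article: ~20 seconds per GB saved in /floyd/home."""
    return storage_gb * 20

# A workspace that saved ~36 GB in its previous session:
seconds = estimate_loading_seconds(36)
print(seconds / 60)  # → 12.0 minutes, matching the example above
```

The same arithmetic explains why a code-only workspace (a few tens of MB) starts almost instantly.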

Note: the same rule of thumb applies to data mounting as well.

How can I reduce the Loading data time?

You can reduce this time by separating your data from your code. It's unlikely that you will generate more than 20 MB of code. Upload your data as a dataset (or as different dataset versions) and attach your datasets on demand. This way you can start working on your code without waiting long for the Loading data step. Of course, you will still have to wait for the data to mount before you can use it.

If your workflow allows it, our advice is to keep your Workspace storage as small as possible (less than 1 GB) and to use FloydHub datasets as your main source of data.
