
Getting Started

Using Datalab in Google Cloud

The easiest way to use Google Cloud Datalab is on Google Cloud Platform. Head over to the Datalab site and deploy an instance into a Cloud Project, so you can easily work with data in other cloud services such as BigQuery and deploy your data pipelines for execution on the cloud.

Using Datalab locally

Datalab is built and packaged as a Docker container, so you will need Docker configured and running locally. If you're on Mac or Windows, the easiest way to get Docker is via the Docker Toolbox. Download and install that, then open the Kitematic app it installs; Kitematic will create a 'default' VM and start the Docker server.
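
If you prefer to check from a terminal rather than Kitematic, you can point your shell at the 'default' VM and confirm the Docker daemon responds (a minimal sketch, assuming the Docker Toolbox defaults):

eval "$(docker-machine env default)"
docker info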

Add a port mapping to the 'default' VM so you can use localhost for Datalab:

VBoxManage modifyvm default --natpf1 "datalab,tcp,,8081,,8081"
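
You can confirm the rule was added (assuming the VM is named 'default'):

VBoxManage showvminfo default | grep datalab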

Clone the Datalab repo, build it and run it:

git clone https://github.com/GoogleCloudPlatform/datalab.git
cd datalab
# Set up the environment variables used by the build scripts
source ./tools/initenv.sh
# Build the sources from a clean tree
rm -rf build/
cd sources/
./build.sh
# Build the Datalab container image and run it locally
cd ../containers/datalab
./build.sh
./run.sh

Then open your browser to http://localhost:8081.

Note that to use any Google Cloud functionality you will need to set the project ID. You can do this by calling:

set_project_id('myproject')

within a cell in your notebook, or by setting an environment variable before running Datalab (in which case the project will be used as the default for all notebooks):

PROJECT_ID='myproject' ./run.sh

Replace 'myproject' with an appropriate ID for your use.

Using Datalab on GCE

First, build the container and push it to a container registry (see the previous section for building; for an example of how to push the container, look at the stage.sh script instead of the run.sh script).
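
For example, pushing to Google Container Registry might look like the following (a minimal sketch: the local tag datalab and the gcr.io path are assumptions, and $PROJECT_ID should be set to your project as in the next step):

docker tag datalab gcr.io/$PROJECT_ID/datalab
gcloud docker push gcr.io/$PROJECT_ID/datalab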

Make sure to set the project to the one you want to use for Datalab:

PROJECT_ID=my-datalab-project  # Set an appropriate ID here
gcloud config set project $PROJECT_ID

You need to create a persistent disk for storing your notebooks. You can do that with:

ZONE=us-central1-a  # pick a zone here
SIZE=500GB  # Choose an appropriate size
DISKNAME=my-datalab-disk  # Choose an appropriate name

gcloud compute disks create --size=$SIZE --zone=$ZONE $DISKNAME
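
You can confirm the disk was created and check its properties with:

gcloud compute disks describe $DISKNAME --zone $ZONE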

Then create a containers.yaml file. In the example below, replace IMAGE with the path to the container image on GCR, Docker Hub, or wherever you publish your container, and replace DISKNAME with the disk name from above (a sed one-liner for this is shown after the file).

apiVersion: v1
kind: Pod
metadata:
  name: datalab
spec:
  containers:
    - name: datalab
      image: IMAGE
      command: ['/datalab/run.sh']
      imagePullPolicy: IfNotPresent
      ports:
        - containerPort: 8080
          hostPort: 8080
      volumeMounts:
        - mountPath: /content
          name: datalab-volume
  volumes:
  - name: datalab-volume
    gcePersistentDisk:
      pdName: DISKNAME
      fsType: ext4
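
Rather than editing the file by hand, you can substitute the two placeholders with sed (a sketch, assuming you saved the template above as containers.yaml.in and pushed the image to the gcr.io path from earlier):

sed -e "s|IMAGE|gcr.io/$PROJECT_ID/datalab|" -e "s|DISKNAME|$DISKNAME|" containers.yaml.in > containers.yaml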

You can then use the YAML file to create a running compute instance on GCE:

LABEL=my-datalab  # pick a name here
ZONE=us-central1-a  # pick a zone here
MACHINE_TYPE=n1-highmem-2  # pick a machine type here; we recommend decent memory but not many cores

gcloud compute instances create $LABEL --image container-vm --metadata-from-file google-container-manifest=containers.yaml --zone $ZONE --machine-type $MACHINE_TYPE --scopes cloud-platform
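
The instance takes a little while to start; you can check its status with:

gcloud compute instances describe $LABEL --zone $ZONE | grep status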

Expose the port through ssh tunneling:

gcloud compute ssh --zone $ZONE --ssh-flag="-L" --ssh-flag="8082:localhost:8080" $LABEL

This will open a tunnel from localhost:8082 on your machine to port 8080 on the GCE instance. The tunnel will stay open until you close the SSH shell. If you have just created the instance, it has to do a docker pull to fetch the container, which can take about four minutes on first run. You can check whether Datalab is ready by running:

sudo docker ps

in the SSH shell. You should see one or two containers running; once there are two, Datalab is ready. You can now connect to http://localhost:8082 on your local machine to start using Datalab.
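
If you would rather not re-run that check by hand, a small polling loop in the SSH shell does the same thing (a sketch; the two-container threshold is the readiness criterion described above):

until [ "$(sudo docker ps -q | wc -l)" -ge 2 ]; do
  sleep 15
done
echo 'Datalab is ready'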

Note that on first use you will (i) have to click through a EULA page, and (ii) may have to enable the Resource Manager API for your project if it is not already enabled; this is needed so Datalab can get the project ID for the service account (it also enables the new project-listing APIs used for that).

When you no longer need the GCE instance, you can delete it:

gcloud compute instances delete $LABEL 

The persistent disk will still have your notebooks, and you can create a new compute instance later and reuse the disk. If you want to delete the disk, including your notebooks, use:

gcloud compute disks delete $DISKNAME

You can create periodic snapshots of the disk for backup purposes; see the Compute Engine documentation on disk snapshots for details on how to do that.
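
For example, a one-off snapshot could be created like this (the snapshot name is just an illustration):

gcloud compute disks snapshot $DISKNAME --zone $ZONE --snapshot-names datalab-backup-1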
