
Google BigQuery API client library

Project description

Idiomatic Python client for Google BigQuery


Quick Start

$ pip install --upgrade google-cloud-bigquery

For more information on setting up your Python development environment, such as installing pip and virtualenv on your system, please refer to the Python Development Environment Setup Guide for Google Cloud Platform.
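
After installing, you can sanity-check the installation by importing the library and printing its version (a minimal check; the version printed will be whatever pip installed):

from google.cloud import bigquery

# Confirms the package imports and reports the installed version.
print(bigquery.__version__)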

Authentication

With google-cloud-python we try to make authentication as painless as possible. Check out the Authentication section in our documentation to learn more. You may also find the authentication document shared by all the google-cloud-* libraries to be helpful.
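
For example, the client can pick up Application Default Credentials automatically, or you can point it at a service account key file explicitly. A minimal sketch (the key file path below is a placeholder, not part of this library):

from google.cloud import bigquery

# Uses Application Default Credentials, e.g. the
# GOOGLE_APPLICATION_CREDENTIALS environment variable or gcloud auth.
client = bigquery.Client()

# Alternatively, load credentials explicitly from a service account
# key file (replace the placeholder path with your own).
client = bigquery.Client.from_service_account_json(
    '/path/to/service-account-key.json')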

Using the API

Querying massive datasets can be time-consuming and expensive without the right hardware and infrastructure. Google BigQuery (BigQuery API docs) solves this problem by enabling super-fast SQL queries against append-mostly tables, using the processing power of Google’s infrastructure.

Create a dataset

from google.cloud import bigquery
from google.cloud.bigquery import Dataset

client = bigquery.Client()

dataset_ref = client.dataset('dataset_name')
dataset = Dataset(dataset_ref)
dataset.description = 'my dataset'
dataset = client.create_dataset(dataset)  # API request
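
To confirm the dataset exists, you can fetch it back by reference. A short sketch, reusing the client and dataset_ref from above:

dataset = client.get_dataset(dataset_ref)  # API request
print(dataset.dataset_id, dataset.description)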

Load data from CSV

import csv

from google.cloud import bigquery
from google.cloud.bigquery import LoadJobConfig
from google.cloud.bigquery import SchemaField

client = bigquery.Client()

SCHEMA = [
    SchemaField('full_name', 'STRING', mode='required'),
    SchemaField('age', 'INTEGER', mode='required'),
]
table_ref = client.dataset('dataset_name').table('table_name')

load_config = LoadJobConfig()
load_config.skip_leading_rows = 1
load_config.schema = SCHEMA

# Contents of csv_file.csv:
#     Name,Age
#     Tim,99
with open('csv_file.csv', 'rb') as readable:
    load_job = client.load_table_from_file(
        readable, table_ref, job_config=load_config)  # API request
load_job.result()  # Waits for the load job to complete
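
To verify the load, you can fetch the destination table and inspect its row count. A short sketch, reusing table_ref from above:

destination_table = client.get_table(table_ref)  # API request
print('Loaded {} rows into {}.'.format(
    destination_table.num_rows, destination_table.table_id))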

Perform a query

from google.cloud import bigquery

client = bigquery.Client()

QUERY = (
    'SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` '
    'WHERE state = "TX" '
    'LIMIT 100')
query_job = client.query(QUERY)  # API request
rows = query_job.result()  # Waits for query to finish

for row in rows:
    print(row.name)
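
Queries can also take parameters instead of embedding literal values in the SQL string. The sketch below shows one way to do this with QueryJobConfig and ScalarQueryParameter, reusing the client from above:

# Same query as above, but with the state code passed as a parameter.
PARAM_QUERY = (
    'SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` '
    'WHERE state = @state '
    'LIMIT 100')
job_config = bigquery.QueryJobConfig()
job_config.query_parameters = [
    bigquery.ScalarQueryParameter('state', 'STRING', 'TX'),
]
query_job = client.query(PARAM_QUERY, job_config=job_config)  # API request

for row in query_job.result():  # Waits for query to finish
    print(row.name)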

See the google-cloud-python BigQuery API documentation to learn how to connect to BigQuery using this client library.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

google-cloud-bigquery-1.4.0.tar.gz (145.7 kB)

Uploaded: Source

Built Distribution

google_cloud_bigquery-1.4.0-py2.py3-none-any.whl (76.3 kB)

Uploaded: Python 2, Python 3

File details

Details for the file google-cloud-bigquery-1.4.0.tar.gz.

File metadata

File hashes

Hashes for google-cloud-bigquery-1.4.0.tar.gz:

SHA256: 76e35bd61bcd996b7f87b403123ab829b653dfed9d6e46f0c026714b37217bcd
MD5: d3af0e6945fbd3cc5ef2c9085c0c05cc
BLAKE2b-256: 1891e4769c985eb76793186d2d1899abf4fff991f11d8e8cbd209699a7b09a61


File details

Details for the file google_cloud_bigquery-1.4.0-py2.py3-none-any.whl.

File metadata

File hashes

Hashes for google_cloud_bigquery-1.4.0-py2.py3-none-any.whl:

SHA256: cadd1d27e12ae719e0ffad2cbcaa0ebf185381321e299d7b9f388a4a67f5576e
MD5: c56d917a7842cda07640e523601516c1
BLAKE2b-256: 0f9f45a7e4d1731d6b2cc0f6011d763fa4eec85956515306f47ecc50b38bdf6d

