
How to Connect to a 3rd Party’s Google Big Query using Node.js on Mac using a Service Account Key

One of our clients provided us with access to their Google Big Query the other day. We only received the credentials. We had no idea what was in the project or dataset.
They provided us with credentials in JSON format that looked like this:


{
"type": "service_account",
"project_id": "allcode-dev",
"private_key_id": "xxxxxxxxxxxxxxxxx",
"private_key": "-----BEGIN PRIVATE KEY-----nyyyyyyyyyyy",
"client_email": "[email protected]",
"client_id": "yyyyyy",
"auth_uri": "http://accounts.google.com/o/oauth2/auth",
"token_uri": "http://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "http://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "http://www.googleapis.com/robot/v1/metadata/x509/allcode-gbq.iam.gserviceaccount.com"
}

The credentials are a service account key.
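
As an aside, the same key can also be used without touching environment variables: the @google-cloud/bigquery client accepts the key file path and project ID directly in its constructor. Here is a minimal sketch, assuming the JSON above is saved locally as allcode-dev-key.json (that filename is just an illustration):

    const {BigQuery} = require('@google-cloud/bigquery');

    // Point the client at the service account key explicitly instead of
    // relying on the GOOGLE_APPLICATION_CREDENTIALS environment variable.
    const bigqueryClient = new BigQuery({
      projectId: 'allcode-dev',
      keyFilename: './allcode-dev-key.json', // hypothetical path to the key file
    });

For this walkthrough, though, I stuck with the environment variable approach below.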
I took the following steps to validate connectivity to this GBQ instance.

  1. I set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of the JSON service account file in my ~/.bash_profile:
     export GOOGLE_APPLICATION_CREDENTIALS="$HOME/Downloads/service-account-file.json"
  2. Next, I confirmed the environment variable was set by echoing it:
     echo $GOOGLE_APPLICATION_CREDENTIALS
  3. I installed NVM using brew and then used it to install the latest Node.js:
     brew update
     brew install nvm
     nvm install node
  4. I installed the BigQuery client library for Node.js from npm:
     npm install --save @google-cloud/bigquery
  5. Next, I wrote a test_connect.js whose sole purpose was to query the INFORMATION_SCHEMA metadata views to see what tables exist in a dataset. The contents of test_connect.js are:
    const {BigQuery} = require('@google-cloud/bigquery');

    async function queryMetaData() {
      // Create a client. It picks up the service account key from the
      // GOOGLE_APPLICATION_CREDENTIALS environment variable set above.
      const bigqueryClient = new BigQuery();

      // The SQL query to run. mydataset is a placeholder for the dataset name.
      const sqlQuery = `SELECT
        * EXCEPT(is_typed)
      FROM
        mydataset.INFORMATION_SCHEMA.TABLES`;

      const options = {
        query: sqlQuery,
        // Location must match that of the dataset(s) referenced in the query.
        location: 'US',
      };

      // Run the query
      const [rows] = await bigqueryClient.query(options);

      console.log('Query Results:');
      rows.forEach(row => {
        // INFORMATION_SCHEMA.TABLES exposes columns such as table_name and table_type.
        console.log(`table: ${row.table_name}, type: ${row.table_type}`);
      });
    }

    queryMetaData();
    
  6. When I executed the script with
    node test_connect.js

     I received the following error stating that the service account does not have the permission required to create query jobs in the project.

    
    (node:76490) UnhandledPromiseRejectionWarning: Error: Access Denied: Project allcode-dev:
    User does not have bigquery.jobs.create permission in project allcode-dev.
        at new ApiError (/Users/joelgarcia/node_modules/@google-cloud/common/build/src/util.js:58:15)
        at Util.parseHttpRespBody (/Users/joelgarcia/node_modules/@google-cloud/common/build/src/util.js:193:38)
        at Util.handleResp (/Users/joelgarcia/node_modules/@google-cloud/common/build/src/util.js:134:117)
        at /Users/joelgarcia/node_modules/@google-cloud/common/build/src/util.js:432:22
        at onResponse (/Users/joelgarcia/node_modules/retry-request/index.js:206:7)
        at /Users/joelgarcia/node_modules/teeny-request/build/src/index.js:233:13
        at processTicksAndRejections (internal/process/task_queues.js:97:5)
    (node:76490) UnhandledPromiseRejectionWarning: Unhandled promise rejection.
    This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch().
    
  7. It looks like the credentials work, but the service account doesn't yet have the permissions it needs. Progress! (See the notes after these steps for handling the error cleanly and for the roles the client will need to grant.)
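
A side note on that UnhandledPromiseRejectionWarning: it appears because test_connect.js calls queryMetaData() without handling a rejected promise. A minimal tweak, replacing the bare queryMetaData(); call at the end of the script, surfaces the error cleanly instead:

    queryMetaData().catch(err => {
      // Print the underlying BigQuery error (here, the missing
      // bigquery.jobs.create permission) and exit with a failure code.
      console.error('Query failed:', err.message);
      process.exit(1);
    });

The actual fix has to come from the client's side: the service account needs a role that includes bigquery.jobs.create (for example, roles/bigquery.jobUser) plus read access to the data (for example, roles/bigquery.dataViewer). Once that is granted, and since we still don't know what datasets exist in the project, a small sketch like the following could enumerate them before querying INFORMATION_SCHEMA. It uses the client's getDatasets() method; the file name is just an illustration.

    // list_datasets.js - a hypothetical follow-up script to discover dataset names
    const {BigQuery} = require('@google-cloud/bigquery');

    async function listDatasets() {
      const bigqueryClient = new BigQuery();
      // Returns the datasets in the project the credentials belong to.
      const [datasets] = await bigqueryClient.getDatasets();
      console.log('Datasets:');
      datasets.forEach(dataset => console.log(dataset.id));
    }

    listDatasets().catch(err => console.error(err.message));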

Joel Garcia

Joel Garcia has been building AllCode since 2015. He’s an innovative, hands-on executive with a proven record of designing, developing, and operating Software-as-a-Service (SaaS), mobile, and desktop solutions. Joel has expertise in HealthTech, VoIP, and cloud-based solutions. Joel has experience scaling multiple start-ups for successful exits to IMS Health and Golden Gate Capital, as well as working at mature, industry-leading software companies. He’s held executive engineering positions in San Francisco at TidalWave, LittleCast, Self Health Network, LiveVox acquired by Golden Gate Capital, and Med-Vantage acquired by IMS Health.
