
OCLC, Inc. Senior Data Engineer in Dublin, Ohio

You have a life. We like that about you.

 

At OCLC, we believe you'll do the best work of your life when you're living the best life possible.

 

We work hard to build the technology that connects thousands of today's libraries. But we also work hard to make a job at OCLC a meaningful part of a balanced life, not a substitute for one.

 

Technology with a Purpose. OCLC supports thousands of libraries in making information more accessible and more useful to people around the world. OCLC provides shared technology services, original research and community programs that help libraries meet the ever-evolving needs of their users, institutions, and communities. With office locations around the globe, OCLC employees are dedicated to offering premier services and software to help libraries.

 

The Job Details are as follows:

Data Engineers are responsible for bringing robust, efficient, and integrated data models and products to life and sit at the intersection of business teams, Data Analysts, and Software Engineers.

Responsibilities

- Collaborate with team members to collect business requirements, design data models, and define successful analytics outcomes.

- Build and maintain data pipelines from internal databases and SaaS applications.

- Design and develop dbt code to extend OCLC's Enterprise Data Model that meets the OCLC Data Team's internal standards for style, maintainability, and best practices for a high-scale database environment. Maintain and advocate for these standards through code reviews and best-practice socialization.

- Create and maintain architecture and systems documentation.

- Provide data modeling expertise to the Data Science and Analytics teams through code reviews, pairing, and training to help deliver optimal, DRY, and scalable database designs and SQL queries.

- Develop and maintain data mapping specifications based on the results of data analysis and functional requirements.

- Build automated Extract, Transform & Load (ETL) jobs based on data mapping specifications.

- Maintain metadata structures needed for building reusable Extract, Transform & Load (ETL) components.

- Identify and resolve impediments to efficiency and enable the entire Data Program to iterate faster.

- Help promote data innovation across OCLC with a willingness to experiment and to confront hard and complex problems.

- Profile and analyze data for pipeline quality assurance, and own the QA process.

- Conduct exploratory data analysis and generate visual summaries of data. Identify data quality issues proactively.

 

Qualifications

- 4+ years performing in a Data Engineering or Data Ops role.

- 4+ years of experience designing, implementing, operating, and extending commercial enterprise dimensional models.

- 4+ years working with a large-scale Data Warehouse, preferably in a cloud environment.

- 2+ years building and deploying data solutions with cloud providers such as AWS, Azure, or GCP.

- 2+ years of experience with Big Data / Hadoop / Spark / Hive / NoSQL database engines (e.g., Cassandra or HBase).

- 2+ years of experience using ETL tools such as dbt to build and maintain data pipelines.

- 4+ years writing SQL.
