
Positions

Data Engineer - 12770

Pleasanton, United States

Roles and Responsibilities (listed in order of importance)

  • Build and maintain a config-driven, Spark-based data framework in GCP.
  • Build, integrate, and deploy data processing solutions into the BlackLine application in collaboration with product management, cloud, engineering, and data platform teams.
  • Create and maintain REST API endpoints in GCP.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement process automation and data delivery.
  • Build infrastructure for optimal extraction, transformation, and loading of data from a wide variety of data sources.
  • Execute extract, transform, and load (ETL) operations on large datasets, including data identification, mapping, aggregation, conditioning, cleansing, and analysis.


Technical/Specialized Knowledge, Skills, and Abilities:

  • 4-5 years' experience with SQL, Python, and Spark is a must.
  • 4-6 years of ETL experience using Python.
  • Practical experience with Git version control.
  • Strong familiarity with GCP and SQL Server.
  • Comfortable working with open-source tools in Unix/Linux environments.
  • Experience with data warehousing, data modeling, and database design.
  • Works independently with minimal supervision.


Apply Today