#199817

Data Engineer

Remote (Pacific Standard Time)
Overview

Placement Type:

Temporary

Salary:

$59.57 to $66.19 an hour

Start Date:

02.03.2025

The Senior Data Engineer will collaborate with product owners, developers, database architects, data analysts, visualization developers, and data scientists on data initiatives, ensuring that data delivery and architecture remain optimal and consistent across ongoing projects. The right candidate will be excited by the prospect of building and optimizing integrated, aggregated data objects to architect and support our next generation of products and data initiatives.

Responsibilities: 

  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional and non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing for greater scalability
  • Provide comprehensive documentation and knowledge transfer to Production Support
  • Work with Production Support to analyze and fix production issues
  • Participate in an Agile/Scrum methodology to deliver high-quality software releases every two weeks through sprints
  • Refine and plan stories and deliver them on time
  • Analyze requirements documents and source-to-target mappings

Must Have: 

  • 5+ years of experience designing, developing, and supporting complex data pipelines
  • 5+ years of Spark experience in batch and streaming modes
  • 5+ years of advanced SQL experience for analyzing and interacting with data
  • 5+ years of experience in big data environments such as Databricks and AWS EMR
  • 3+ years of experience scripting in Python
  • 3+ years of experience working in cloud environments such as AWS
  • Strong understanding of solution and technical design
  • Experience building scalable, high-performance cloud data lake solutions
  • Experience with relational SQL and tools such as Snowflake
  • Familiarity with data warehouse concepts
  • Experience performance-tuning with large datasets
  • Experience with source control tools such as GitHub and related development processes
  • Experience with workflow scheduling tools such as Airflow or Databricks Workflows
  • Strong problem-solving and analytical mindset
  • Ability to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders
  • Must be self-directed and comfortable supporting the data needs of the product roadmap

Good Skills to Have:

  • Experience building streaming solutions using Spark Structured Streaming and Kafka
  • Experience with and knowledge of Databricks
  • Experience with semantic modeling and cube solutions such as AAS or AtScale

The target hiring compensation range for this role is $59.57 to $66.19 an hour. Compensation is based on several factors including, but not limited to, education, relevant work experience, relevant certifications, and location.

About Aquent Talent:

Aquent Talent connects the best talent in marketing, creative, and design with the world’s biggest brands.

Our eligible talent gets access to amazing benefits like subsidized health, vision, and dental plans; paid sick leave; and retirement plans with a match. We also offer free online training through Aquent Gymnasium. More information on our benefits is available.

Aquent is an equal-opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other legally protected characteristics. We’re about creating an inclusive environment—one where different backgrounds, experiences, and perspectives are valued, and everyone can contribute, grow their careers, and thrive.