18 Sep 2020

Full-Time Data Platform Technical Architect

DoubleVerify – New York, New York, United States

Job Description

In this role, you will build an online analytics platform that provides insights and data to DoubleVerify's clients and internal users. The platform ingests billions of events per day and makes them available for consumption through an intuitive and highly performant UI. It also allows clients to access data programmatically while enforcing a tight data governance strategy. To achieve this, you will create, test and roll out robust, fully productionized code, and you will guide and collaborate with a talented, curious and highly motivated group of engineers and engineering leads.



  • Architect, design and build a big data online analytics platform that processes tens of TB per day, serves thousands of clients and supports advanced analytic workloads
  • Personally write scalable, high-quality code in an object-oriented or functional style
  • Continuously review code, watch for design violations, provide meaningful and relevant feedback to developers, and stay up to date with system changes
  • Collaborate with multiple cross-disciplinary teams and provide technical leadership, guidance and coaching to engineers through brainstorming sessions, design reviews, and pair-programming
  • Define technical standards for CI/CD, quality, monitoring, security, modifiability, extensibility and maintainability of data processing software
  • Continuously explore the technical landscape of the Big Data ecosystem through POCs and analysis of various frameworks and technologies



Requirements:

  • 8+ years of experience building and operating mission-critical, data-intensive distributed systems
  • 5+ years of hands-on coding experience in Scala, Java, .NET Core or Python
  • 3+ years of experience creating complex data pipelines using SQL and analyzing data
  • Ability to architect and design complex software systems while adhering to fundamental principles and best practices
  • In-depth understanding and hands-on experience with distributed columnar data stores
  • Experience with BigQuery data modeling, performance tuning and cost optimizations
  • Experience following and advocating state-of-the-art SDLC processes
  • BS/MS degree in Computer Science or a related field

Additional skills:

  • Problem solver
  • Production operations and monitoring

How to Apply

Apply at the link below

Job Categories: Equal Opportunities. Job Types: Full-Time. Job Tags: big data, BigQuery, Kafka, Python, Scala, Spark, and SQL. Salaries: 100,000 and above.
