13 Jan 2021

Temporary Cloud Data Engineer (21-00685)

Axelon Services Corporation – Woodcliff Lake, New Jersey, United States

Job Description

Cloud Data Engineer
Woodcliff Lake, NJ – will start remote
12 Months

Position Purpose/Scope

  • Provides complete application lifecycle development, deployment, and operations support for large-scale batch and real-time data processing pipelines using cloud technologies.
  • Collaborates with product owners, data scientists, business analysts and software engineers to design and build solutions to ingest, transform, store, and export data in a cloud environment while maintaining security, scalability, and personal data protection.

Position Responsibilities/Accountabilities

  • Implements and enhances complex data processing pipelines with a focus on collecting, parsing, cleaning, managing and analyzing large data sets that produce valuable business insights and discoveries.
  • Determines the infrastructure, services, and software required to build advanced data ingestion and transformation pipelines and solutions in the cloud.
  • Assists data scientists and data analysts with data preparation, exploration and analysis activities.
  • Applies problem solving experience and knowledge of advanced algorithms to build high-performance, parallel, and distributed solutions.
  • Performs code and solution review activities and recommends enhancements that improve efficiency, performance, and stability and reduce support costs.
  • Applies the latest DevOps and Agile methodologies to improve delivery time.
  • Works with Scrum teams in daily stand-ups, providing frequent progress updates.
  • Supports the application, including incident and problem management.
  • Performs debugging and triage of incidents and problems, and deploys fixes to restore services.
  • Documents requirements and configurations and clarifies ambiguous specs.
  • Performs other duties as assigned by management.


Education

  • Bachelor’s degree in Computer Science, Mathematics, or Engineering, or the equivalent of 4 years of related professional IT experience.


Experience

  • 3+ years of enterprise software engineering experience with object-oriented design, coding, and testing patterns, as well as experience engineering (commercial or open source) software platforms and large-scale data infrastructure solutions.
  • 3+ years of software engineering and architecture experience within a cloud environment (Azure, AWS).
  • 3+ years of enterprise data engineering experience within any “Big Data” environment (preferred).
  • 3+ years of software development experience using Python.
  • 3+ years of experience working in an Agile environment (Scrum, Lean or Kanban).


Cloud Platforms

  • Amazon Web Services (Preferred)

Licenses and/or Certifications

  • AWS certifications (Preferred)


Technical Experience

  • 3+ years of experience working on large-scale data integration and analytics projects, using cloud (e.g. AWS Redshift, S3, EC2, Glue, Kinesis, EMR), big data (Hadoop), and orchestration (e.g. Apache Airflow) technologies.
  • 3+ years of experience implementing distributed data processing pipelines using Apache Spark.
  • 3+ years of experience designing relational/NoSQL databases and data warehouse solutions.
  • 2+ years of experience writing and optimizing SQL queries in a business environment with large-scale, complex datasets.
  • 2+ years of Unix/Linux operating system knowledge (including shell programming).
  • 2+ years of experience with automation/configuration management tools such as Terraform, Puppet, or Chef.
  • 2+ years of experience in container development and management using Docker.
  • 2+ years of experience with continuous integration tools (e.g. Jenkins).


Languages and Tools

  • SQL, Python, Spark
  • Basic knowledge of machine learning algorithms and data visualization tools such as Microsoft Power BI and Tableau

How to Apply

Please send your updated resume to

Job Types: Temporary.
