
23 Aug 2021

Temporary Data Engineer (6 Yrs) (21-24251)

iris.chen@axelon.com
Woodcliff Lake, New Jersey, United States

Job Description

Data Engineer (6 Yrs)
Woodcliff Lake, NJ
4 Months
Will start remote; may need to go onsite at least 3 days a week if the office opens back up

I. Position Purpose/Scope: What are the key objectives of the position?
• Provides complete application lifecycle development, deployment, and operations support for
large-scale batch and real-time data processing pipelines using cloud technologies.
• Collaborates with product owners, data scientists, business analysts and software engineers to
design and build solutions to ingest, transform, store, and export data in a cloud environment
while maintaining security, scalability, and personal data protection.
II. Position Responsibilities/Accountabilities: List the major duties/accountabilities to achieve
the position's key objectives.
• Implements and enhances complex data processing pipelines with a focus on collecting, parsing,
cleaning, managing and analyzing large data sets that produce valuable business insights and
discoveries.
• Determines the infrastructure, services, and software required to build advanced data
ingestion & transformation pipelines and solutions in the cloud.
• Assists data scientists and data analysts with data preparation, exploration and analysis activities.
• Applies problem-solving experience and knowledge of advanced algorithms to build
high-performance, parallel, and distributed solutions.
• Performs code and solution review activities and recommends enhancements that improve
efficiency, performance, and stability, and reduce support costs.
• Applies the latest DevOps and Agile methodologies to improve delivery time.
• Works with Scrum teams in daily stand-ups, providing frequent progress updates.
• Supports applications, including incident and problem management.
• Performs debugging and triage of incidents and problems and deploys fixes to restore services.
• Documents requirements and configurations and clarifies ambiguous specs.
• Performs other duties as assigned by management.
III. Position Competencies
A) Education
Bachelor’s degree in Computer Science, Mathematics, or Engineering, or
the equivalent of 4 years of related professional IT experience.
B) Experience
• 3+ years of enterprise software engineering experience with object-oriented
design, coding, and testing patterns, as well as experience in engineering (commercial or open
source) software platforms and large-scale data infrastructure solutions.
• 3+ years of software engineering and architecture experience within a cloud environment (Azure,
AWS).
• 3+ years of enterprise data engineering experience within any “Big Data” environment
(preferred).
• 3+ years of software development experience using Python.
• 2+ years of experience working in an Agile environment (Scrum, Lean or Kanban).
C) Training
Amazon Web Services (Preferred)
D) Licenses and/or Certifications
AWS certifications (Preferred)
E) Knowledge/Skills/Abilities: Basic = less than 1 year of experience/training needed;
Intermediate = 1-3 years of experience/some training may be needed; Advanced = 3-5
years’ experience/no training needed; Expert = 5+ years’ experience/able to train others.
• 3+ years of experience working on large-scale data integration and analytics projects, including
cloud (e.g., AWS Redshift, S3, EC2, Glue, Kinesis, EMR) and data-orchestration (e.g.,
Oozie, Apache Airflow) technologies
• 3+ years of experience in implementing distributed data processing pipelines using Apache Spark
• 3+ years of experience in designing relational/NoSQL databases and data warehouse solutions
• 2+ years of experience in writing and optimizing SQL queries in a business environment with
large-scale, complex datasets
• 2+ years of Unix/Linux operating system knowledge (including shell programming).
• 1+ years of experience in automation/configuration management tools such as Terraform, Puppet
or Chef.
• 1+ years of experience in container development and management using Docker.
• Languages: SQL, Python, Spark. Strong experience with data handling using Python is required.
• Basic knowledge of continuous integration tools (e.g. Jenkins).
• Basic knowledge of machine learning algorithms and data visualization tools such as Microsoft
Power BI and Tableau
IV. Primary Work Location/Shift: (If > 1 location, indicate % of time spent at each location.)
Woodcliff Lake: 90%
Other BMW locations: 10%
V. Physical Demands: (Examples are 1) Transports materials or inspects building, equipment
and other items; 2) operates computers or other office equipment; and 3) sit or stand for
prolonged periods of time. Physical demands should be described in terms of what has
to be done and not in terms of physical or mental attributes.) Identify the key physical
demands required to perform the essential function(s) of the job and indicate whether the
responsibility is performed occasionally (“O”), frequently (“F”), or constantly (“C”).
• Operates computers (“C”)
• Sits at desk for prolonged periods of time (“C”)
• Operates other office equipment (“O”)
• Travels Internationally (“O”)
VI. Environmental Demands: (Examples are 1) Works with irritant chemicals, gases, etc.;
2) Works shifts or variable work hours; 3) Wears protective clothing or respiratory
protective equipment; 4) Works in extreme heat or cold conditions; 5) Exposure to loud
noises; 6) Performs work in a normal office setting). Identify the key environmental
demands required to perform the essential function(s) of the job and indicate whether the
responsibility is performed occasionally (“O”), frequently (“F”), or constantly (“C”).
• Performs work in office areas (“C”)
• Works normal shift schedules (“C”)
• Works overtime and on-call activities (“O”)


How to Apply

Please send resumes to Iris.chen@axelon.com to apply.

Job Types: Temporary.

