This job listing has expired and may no longer be relevant!
16 Jun 2022

Temporary Sr Big Data Architect

Hornet Staffing – Posted by Hornet Staffing – Anywhere

Job Description

Job Type: Contract

Contract length: 1 year

Summary:

The Hadoop Architect is responsible for designing and building big data databases and enabling high-performing, complex queries. The Hadoop Architect champions data as a strategic asset, taking innovative approaches to data sharing and data management that include practical, feasible alternatives sensitive to cost containment while adhering to strategy. The Hadoop Architect also serves the team as an advisor on best practices and contributes to information management strategy-setting, developing and implementing the principles, models, designs, blueprints, standards, and guidelines that ensure big data information is flexible, scalable, cost-effective, consistent, usable, and secure, and that it adds value to the business.

Essential Functions

  • Apply agile methodology to deliver software solutions
  • Design, develop, document, and architect Hadoop applications
  • Manage and monitor Hadoop log files
  • Develop MapReduce code that runs seamlessly on Hadoop clusters
  • Apply working knowledge of SQL, NoSQL, data warehousing, and database administration
  • Serve as an expert in newer technologies such as Apache Spark and Scala programming (see the sketch after this list)
  • Maintain complete knowledge of the Hadoop ecosystem and Hadoop Common
  • Convert hard-to-grasp technical requirements into outstanding designs
  • Design web services for swift data tracking and high-speed querying
  • Test software prototypes, propose standards, and smoothly transfer them to operations
  • Demonstrate proficiency in Hadoop distribution platforms
  • Demonstrate expertise in Java, Hive, HBase, and related tools
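
The items above on MapReduce development and Apache Spark/Scala expertise can be made concrete with a small example. Below is a minimal word-count sketch in Scala using Spark's RDD API, which expresses the classic MapReduce pattern (map to key/value pairs, then reduce by key). The application name and HDFS paths are illustrative placeholders, not details from this posting.

    import org.apache.spark.sql.SparkSession

    // Minimal word-count sketch: the classic MapReduce pattern (map, shuffle, reduce)
    // expressed with Spark's RDD API. Paths and the app name are hypothetical.
    object WordCountSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("word-count-sketch")
          .getOrCreate()

        val lines = spark.sparkContext.textFile("hdfs:///data/input")  // hypothetical input path

        val counts = lines
          .flatMap(_.split("\\s+"))   // map: split each line into words
          .map(word => (word, 1))     // map: emit (word, 1) pairs
          .reduceByKey(_ + _)         // reduce: sum the counts for each word

        counts.saveAsTextFile("hdfs:///data/output")  // hypothetical output path
        spark.stop()
      }
    }

A job like this would typically be packaged as a JAR and submitted with spark-submit to a YARN-managed cluster on one of the Hadoop distributions named under Job Duties.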

Job Duties:

  • Work with huge volumes of data to derive business intelligence
  • Analyze data, uncover information, derive insights, and propose data-driven strategies
  • Apply knowledge of object-oriented languages such as Java, C++, and Python
  • Apply an understanding of database theories, structures, categories, properties, and best practices
  • Install, configure, maintain, and secure Hadoop
  • Bring an analytical mind and the ability to learn, unlearn, and relearn concepts
  • Work hands-on with Hadoop distribution platforms such as Hortonworks, Cloudera, MapR, and others
  • Take end-to-end responsibility for the Hadoop life cycle in the organization
  • Act as the bridge between data scientists, engineers, and organizational needs
  • Perform in-depth requirement analyses and select the appropriate work platform
  • Apply full knowledge of Hadoop architecture and HDFS
  • Apply working knowledge of MapReduce, HBase, Pig, Java, and Hive (a Hive query sketch follows this list)
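
To give a sense of the Hive and Spark knowledge listed above, here is a brief Scala sketch that runs a HiveQL-style aggregation through Spark SQL, assuming the cluster is configured with a Hive metastore; the table and column names are hypothetical.

    import org.apache.spark.sql.SparkSession

    // Sketch of querying a Hive-managed table from Spark SQL.
    // Assumes Hive support is available on the cluster; table/column names are hypothetical.
    object HiveQuerySketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("hive-query-sketch")
          .enableHiveSupport()   // requires a configured Hive metastore
          .getOrCreate()

        // Aggregate event counts per user from a hypothetical Hive table.
        val result = spark.sql(
          """SELECT user_id, COUNT(*) AS event_count
            |FROM events
            |GROUP BY user_id""".stripMargin)

        result.show(20)
        spark.stop()
      }
    }

The same query could be run directly in Hive; routing it through Spark SQL simply keeps both sketches in one language.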

Experience:

  • 8+ years of related work experience, or an equivalent combination of transferable experience and education
  • IT development/programming/coding and architecture within a Hadoop environment

Required Education:

  • Bachelor’s degree in an IT-related field, or relevant work experience

How to Apply

For immediate consideration, please send your resume directly to Lory Weir at lory@hornetstaffing.com. You can view all of our open positions at www.hornetstaffing.com.

Job Types: Temporary.
