The recruitment team at Myticas Consulting is looking for an experienced Hadoop Big Data Engineer who would be interested in a contract opportunity offered within the Lake Forest, Illinois region.
The position is a key member of the team, working alongside data scientists to solve complex analytics use cases.
He or she will be hands-on in identifying, analyzing, obtaining, understanding, and moving big data sets through the Big Data ecosystem.
The role is responsible for moving large data sets from multiple sources, ingesting them into the Data Science lab environment, and ultimately delivering 'information sets' to data scientists.
The role includes hands-on data acquisition and integration work using the full Hadoop stack including Sqoop, HBase, Hive, Oozie, Flume, NiFi, etc., as well as other Big Data technologies.
The ideal candidate is a technologist with strong business acumen and a passion for data and big data technologies, applied to solving complex business problems.
The role focuses on developing robust Hadoop applications to support data science projects.
- Hadoop administration skills (an asset).
- Experience with ETL and UNIX.
- Expertise in ksh scripting and data manipulation.
- Experience with Teradata and SQL.
- Advanced skills (a minimum of 3 years) with MapReduce, HBase, Hive, Sqoop, and Flume.
- Experience with Mahout and R.
Candidates interested in this role should send us an updated resume in confidence. Our team will review all applicants and follow up at the conclusion of the review process.
To apply for this job email your details to email@example.com.