
Urgent Need: Hadoop Data Engineer (Local or Nearby Candidates Only)

USM
Contract
On-site
Reston, Virginia, United States

Job Description

GC and Citizens

Hi Friends,

Hope you are doing great.

I have an urgent requirement from one of my esteemed clients. I would appreciate it if you could review the requirement below and send me your consultant's updated profile ASAP.

Job Title: Data Engineer

Location: Reston, VA (Local or Nearby)

Duration: 4+ months

Requirements:

• Lead implementation teams from concept to completion by leveraging the best practices of Big Data

• Participate in the analysis, architecture and design of data hub

• Build robust Big Data solution systems with an eye on long-term maintenance and support of the application

• Work as part of a team to design and develop code, scripts, and data pipelines that leverage structured and unstructured data integrated from multiple sources

• Leverage reusable code modules to solve problems across the team, including data preparation and transformation, and data export and synchronization

• Design and develop automated test cases that verify solution feasibility and interoperability, to include performance assessments

• Act as a Big Data delivery liaison with the infrastructure, security, application development, and testing teams

• Help drive cross-team design and development through technical leadership and mentoring

• Keep current on the latest Big Data technologies and products through hands-on evaluations and in-depth research

• Consult and advise solution architects on enterprise-wide analytics solutions that include a data hub component

• Work with the Project Manager on detailed planning and risk/issue escalation

Qualifications

Requirements for Big Data Engineer, Healthcare Analytics (Senior):

• 4+ years of experience working with batch-processing tools in the Hadoop tech stack (e.g., MapReduce, YARN, Pig, Hive, HDFS, Oozie)

• 4+ years of experience working with tools in the stream-processing tech stack (e.g., Spark, Storm, Samza, Kafka, Avro)

• Experience developing applications that work with NoSQL stores (e.g., Elasticsearch, HBase, Cassandra, MongoDB, CouchDB)

• Experience developing for TB-level data stores and/or 10Gbps+ ingest speeds

• High-capacity data ingest into Hadoop or Spark is highly desired

• Hands-on experience with at least one major Hadoop distribution, such as Cloudera, Hortonworks, MapR, or IBM BigInsights

• Experience with system usage and optimization tools such as Splunk is a plus

• At least 4 years of experience delivering enterprise IT solutions as a solutions architect

• 8+ years of experience with SQL and at least two major RDBMSs

• 5+ years as a systems integrator with Linux systems and shell scripting

• 8+ years of data-related benchmarking, performance analysis, and tuning

• 5+ years of Java experience

• Solid programming experience, with a preference for Java or Python

• DBA and/or Data Modeling experience

• Experience with operational and business-level metadata management

• Bachelor's degree in Computer Science, Information Systems, Information Technology, or a related field, plus 6+ years of software development/DW & BI experience

• Healthcare experience is a plus

• Excellent verbal and written communication skills

Love to Have:

• Hands-on experience with Cloudera 4.5 and higher, Hortonworks 2.1 and higher, or MapR 4.01 and higher

• Experience with MapReduce solution design and development

• ETL solution experience, preferably on Hadoop

• Experience with industry-leading Business Intelligence tools

Additional Information

All your information will be kept confidential according to EEO guidelines. Please send profiles to alih(at)usmsystems.com, contact number 703-955-3955.