Lead Data Engineer in Charlotte, NC at Signature Consultants

Date Posted: 6/22/2020

Job Description

Lead Data Engineer Summary:

Signature Consultants has a Lead Data Engineer opportunity in Charlotte, NC for a 12+ month contract.

Description/Responsibilities:

  • Lead Data Engineer/Tech Lead with extensive experience on the Hadoop platform and related tools, and with leading teams to deliver complex products
  • Build new data pipelines, identify existing data gaps and provide automated solutions to deliver analytical capabilities and enriched data to applications
  • Responsible for obtaining data from the System of Record and establishing batch or real-time data feeds to provide analysis in an automated fashion (a brief illustrative sketch follows this list)
  • Develop Hadoop applications to analyze massive data collections
  • Develop a processing framework to detect conditions in the data
  • Develop techniques supporting trending and analytic decision-making processes
  • Apply Hadoop technologies for responsive front-end experience
  • Develop within established security guidelines
  • Collaborate with application developers, database architects, data analysts and data scientists to ensure optimal data delivery architecture throughout ongoing projects/operations
  • Design, build, and manage analytics infrastructure that can be used by data analysts, data scientists, and non-technical data consumers, enabling the analytics functions of the big data platform
  • Develop, construct, test, and maintain architectures, such as databases and large-scale processing systems, that analyze and process data in the way the Analytics organization requires
  • Develop highly scalable data management interfaces and software components, employing a range of programming languages and tools
  • Work closely with a team of Data Science staff to take existing or new models and convert them into scalable analytical solutions
  • Design, document, build, test and deploy data pipelines that assemble large complex datasets from various sources and integrate them into a unified view
  • Identify, design, and implement operational improvements: automating manual processes, data quality checks, error handling and recovery, re-designing infrastructure as needed
  • Create data models that will allow analytics and business teams to derive insights about customer behaviors
  • Ensure systems meet business requirements and industry practices
  • Research opportunities for data acquisition and new uses for existing data
  • Develop data set processes for data modeling, mining and production
  • Integrate data management technologies and software engineering tools into existing structures
  • Employ a variety of languages and tools (e.g. scripting languages)
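
As a purely illustrative sketch (not part of the role description), the kind of automated batch feed from a System of Record described above might look like the following in PySpark; the application name, HDFS path, column names, and output table are hypothetical:

    from pyspark.sql import SparkSession, functions as F

    # Hypothetical names throughout: the path, columns, and output table are illustrative only.
    spark = (SparkSession.builder
             .appName("sor-batch-feed")
             .enableHiveSupport()
             .getOrCreate())

    # Read the latest extract from the System of Record landing zone on HDFS.
    sor = spark.read.parquet("hdfs:///data/raw/system_of_record/accounts")

    # Light data-quality checks and enrichment before publishing for analytics.
    clean = (sor
             .dropDuplicates(["account_id"])
             .filter(F.col("account_id").isNotNull())
             .withColumn("load_date", F.current_date()))

    # Expose the result as a Hive table that analysts and data scientists can query.
    clean.write.mode("overwrite").saveAsTable("analytics.accounts_enriched")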

Requirements:

  • Degree in Computer Science, Engineering, or related field
  • 5+ years of Hadoop experience, preferably gained on a Data and Analytics solutions team
  • Self-starter who works with minimal supervision and is able to work on a team with diverse skill sets
  • Ability to comprehend customer requests and provide the correct solution
  • Understanding of distributed computing principles
  • Strong analytical mind for taking on complicated problems
  • Willingness to dig into and resolve potential issues
  • Familiarity with the Agile Methodology
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including SQL Server and Cassandra
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.

Desired:

Professional experience using the following (a brief illustrative sketch follows the list):

  • PySpark
  • Hive Programming
  • Sqoop (utilizing various data sources)
  • HBase/Phoenix
  • File formats (e.g., ORC, Parquet, Avro, JSON)
  • Bitbucket/Jenkins
  • Workflow scheduling tools such as CA ESP Automation
  • HDFS
  • MapReduce and YARN
  • Oozie
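
Purely as an illustration of the file-format experience listed above (not a requirement of this posting), a PySpark session can read and convert several of these formats directly; the HDFS paths below are hypothetical:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("format-examples").getOrCreate()

    # Hypothetical HDFS paths; ORC, Parquet, and JSON readers ship with Spark.
    orc_df = spark.read.orc("hdfs:///data/raw/events_orc")
    parquet_df = spark.read.parquet("hdfs:///data/raw/events_parquet")
    json_df = spark.read.json("hdfs:///data/raw/events_json")
    # Avro typically requires the external spark-avro package.

    # Converting between formats is straightforward once the data is a DataFrame.
    orc_df.write.mode("overwrite").parquet("hdfs:///data/curated/events")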

About Signature Consultants, LLC

Headquartered in Fort Lauderdale, Florida, Signature Consultants was established in 1997 with a singular focus: to provide clients and consultants with superior staffing solutions. For the ninth consecutive year, Signature was voted one of the "Best Staffing Firms to Work For" and is now the 15th largest IT staffing firm in the United States (source: Staffing Industry Analysts). With 28 locations throughout North America, Signature annually deploys thousands of consultants to support, run, and manage its clients' technology needs. Signature offers IT staffing, consulting, managed solutions, and direct placement services. For more information on the company, please visit www.sigconsult.com. Signature Consultants is the parent company of Hunter Hollis and Madison Gunn.

Those authorized to work in the U.S. are encouraged to apply.