Big Data Engineer/Manager

Hadoop, Bigdata, Spark, Scala, Kafka, Python
Full Time
Depends On Experience
Telecommuting not available. Travel required up to 100%.

Job Description

Founded in 1998, Matrix Technology Group is an ERP and IT consulting services provider.

Matrix Technology Group provides services in ERP, BI, and application development. Our staff's passion and dedication set us apart from other IT firms. Our dynamic team is focused on our clients' needs and geared to work with consultants and clients to achieve higher performance. We want to work with you, and we welcome candidates who are talented, passionate, dedicated, and ambitious to grow.

One of our highly esteemed clients has an immediate need for a strong Big Data Engineer (permanent role).

Job Description is as follows:

Big Data Engineer
Locations:
IL - Chicago, IN - Indianapolis, MI - Detroit, MN - Minneapolis, MN - St Paul, OH - Cincinnati, OH - Columbus, OH - Cleveland, OH - Toledo, MO - Kansas City, MO - St Louis, WI - Milwaukee – Midwest
DC - Washington, MD - Baltimore, VA - Arlington, VA - Richmond, GA - Atlanta, NC - Charlotte, FL - Tampa, FL - Miami, FL - Orlando – Southeast
OK - Oklahoma City, AZ - Phoenix, CO - Denver, TX - Austin, TX - Dallas, TX - Houston – Southwest
CT - New Haven, CT - Hartford, DE - Wilmington, MA - Boston, NJ - Jersey City, NJ - Murray Hill, NJ - Florham Park, NY - New York, PA - Philadelphia, PA - Pittsburgh – Northeast
CA - Los Angeles, CA - Los Alamitos, CA - Norwalk, CA - El Segundo, CA - Sacramento, CA - San Diego, CA - San Francisco, CA - San Jose, WA - Seattle, OR - Portland – West

Basic Qualifications
Bachelor's degree in Computer Science, Engineering, or Technical Science, or 3 years of IT/programming experience
Minimum 1 year of experience architecting, implementing, and successfully operationalizing large-scale data solutions in production environments using the Hadoop and NoSQL ecosystem, on-premises or in the cloud (AWS, Google, or Azure), using relevant technologies such as NiFi, Spark, Kafka, HBase, Hive, Cassandra, EMR, Kinesis, BigQuery, Dataproc, Azure Data Lake, etc.
Minimum 1 year of experience architecting data and building performant data models at scale for the Hadoop/NoSQL ecosystem of data stores to support different business consumption patterns off a centralized data platform
Minimum 1 year of Spark/MapReduce/ETL processing experience, including Java, Python, Scala, or Talend, for data analysis of production Big Data applications
Minimum 1 year of experience architecting and industrializing data lakes or real-time platforms for an enterprise, enabling business applications and usage at scale
Minimum 2 years of experience designing and implementing relational data models with RDBMSs, and an understanding of the challenges in these environments

Preferred Skills
Minimum 1 year of experience implementing SQL on Hadoop solutions using tools like Presto, AtScale and others
Minimum 1 year of experience implementing large scale BI/Visualization solutions on Big Data platforms
Minimum 1 year of experience implementing large scale secure cloud data solutions using AWS data and analytics services e.g. S3, EMR, Redshift
Minimum 1 year of experience implementing large scale secure cloud data solutions using Google data and analytics services e.g. BigQuery, DataProc
Minimum 1 year of experience building data management (metadata, lineage, tracking, etc.) and governance solutions for modern data platforms that use Hadoop and NoSQL, on-premises or on the AWS, Google, or Azure cloud
Minimum 1 year of experience securing Hadoop/NoSQL-based modern data platforms, on-premises or on the AWS, Google, or Azure cloud
Minimum 1 year of experience re-architecting and rationalizing traditional data warehouses with Hadoop or NoSQL technologies on-premises, or transitioning them to the AWS or Google cloud
Experience implementing data wrangling and data blending solutions for enabling self-service solutions using tools such as Trifacta, Paxata
Minimum 1 year of industry systems development and implementation experience, OR minimum 2 years of experience in data loading, acquisition, storage, transformation, and analysis
Minimum 1 year of experience using ETL tools such as Talend or Informatica within a Big Data environment to perform large-scale, metadata-integrated data transformation
Minimum 1 year of building Business Catalogs or Data Marketplaces on top of a Hybrid data platform containing Big Data technologies

If your skills match the above requirements, kindly forward your resume in Word format along with your expected rate/salary and a contact number.

(Feel free to reach me at 908-279-1280 or e-mail me: sbhoite at rate matrixonweb.com)


Dice Id : mategr
Position Id : BD-SB-6776

Similar Positions

Big Data Developer
  • Entelli Consulting LLC
  • Lombard, IL
Big Data Developer
  • Asen
  • Chicago, IL
Senior Big Data Engineer
  • TransUnion
  • Chicago, IL
Big Data Lead/Big Data architect
  • IT Trailblazers, LLC.
  • Chicago, IL
Hadoop Big Data Solution Architect
  • Momento USA LLC
  • Deerfield, IL
BigData Engineer/Lead
  • Wipro Ltd.
  • Rolling Meadows, IL
Big Data Hadoop Developer
  • Creospan
  • Chicago, IL
Spark Developer
  • Pegasus Knowledge Solutions
  • Schaumburg, IL
Sr. Big Data / Hadoop Developer
  • Hi-Tech Solutions, Inc.
  • Chicago, IL
Big Data Architect
  • TekLink International Inc.
  • Warrenville, IL
Lead Big Data Developer / Architect
  • The Judge Group
  • Chicago, IL
Sr Data Engineer
  • Interactive Business Systems
  • Downers Grove, IL