Big Data Engineer/Architect/Consultant - Location negotiable

Full Time
Depends On Experience
Telecommuting not available. Travel not required.

Job Description

Founded in 1998, Matrix Technology Group is an ERP and IT consulting services provider.
Matrix Technology Group provides services in ERP, BI, and application development. Our staff's passion and dedication set us apart from other IT firms. Our team is dynamic, focused on our clients' needs, and geared to work with consultants and clients to achieve higher performance. We want to work with you, and we welcome candidates who are talented, passionate, dedicated, and ambitious to grow.
One of our highly esteemed clients has an immediate need for a strong Big Data Engineer/Architect/Consultant - Location negotiable

Job Description is as follows:

Big Data Engineer/Architect/Consultant - Location negotiable
IL - Chicago, IN - Indianapolis, MI - Detroit, MN - Minneapolis, MN - St Paul, OH - Cincinnati, OH - Columbus, OH - Cleveland, OH - Toledo, MO - Kansas City, MO - St Louis, WI - Milwaukee – Midwest
DC - Washington, MD - Baltimore, VA - Arlington, VA - Richmond, GA - Atlanta, NC - Charlotte, FL - Tampa, FL - Miami, FL - Orlando – Southeast
OK - Oklahoma City, AZ - Phoenix, CO - Denver, TX - Austin, TX - Dallas, TX - Houston – Southwest
CT - New Haven, CT - Hartford, DE - Wilmington, MA - Boston, NJ - Jersey City, NJ - Murray Hill, NJ - Florham Park, NY - New York, PA - Philadelphia, PA - Pittsburgh – Northeast
CA - Los Angeles, CA - Los Alamitos, CA - Norwalk, CA - El Segundo, CA - Sacramento, CA - San Diego, CA - San Francisco, CA - San Jose, WA - Seattle, OR - Portland – West

Data Engineers at the Consultant level will be responsible for the architecture, design, and implementation of Hadoop- and NoSQL-based full-scale solutions that include data acquisition, storage, transformation, security, data management, and data analysis using these technologies. A solid understanding of the infrastructure planning, scaling, design, and operational considerations unique to Hadoop, NoSQL, and other emerging data technologies is required. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to identify and apply Hadoop and NoSQL solutions to data challenges and deliver better data solutions across industries.

Responsibilities include the following:
•Design and implement data management for Hadoop/NoSQL in a hybrid environment
•Design and implement large scale data architectures using Hadoop/NoSQL in a hybrid environment
•Data profiling and data analysis using emerging data technologies
•Design, implement and deploy data loaders to ingest data into Hadoop/NoSQL
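For context on the data-loader responsibility above, a minimal illustrative sketch follows. This is not part of the posting's requirements; it assumes a CSV source and uses a plain in-memory dict as a stand-in for a NoSQL store such as HBase or Cassandra, whose real client libraries would replace it in practice.

```python
import csv
import io

def load_batches(reader, store, key_field, batch_size=2):
    """Read dict rows from `reader`, buffer them, and flush in batches.

    Batching mirrors how real loaders amortize write overhead when
    ingesting into Hadoop/NoSQL stores; `store` here is a plain dict
    standing in for a key-value database.
    """
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) >= batch_size:
            _flush(batch, store, key_field)
            batch = []
    if batch:  # flush the final partial batch
        _flush(batch, store, key_field)
    return store

def _flush(batch, store, key_field):
    for row in batch:
        store[row[key_field]] = row  # upsert keyed by the chosen field

# Hypothetical sample input, so the sketch is self-contained and runnable.
source = io.StringIO("id,name\n1,alpha\n2,beta\n3,gamma\n")
store = load_batches(csv.DictReader(source), {}, key_field="id")
print(sorted(store))  # -> ['1', '2', '3']
```

A production loader would swap the dict for a real client (e.g. an HBase or Cassandra session) and the StringIO source for files landing in a staging area, but the buffer-and-flush shape stays the same.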

Basic Qualifications
•Bachelor's degree in Computer Science, Engineering, or Technical Science, or 3 years of IT/programming experience
•Minimum 1 year of architecting, implementing and successfully operationalizing large scale data solutions in production environments using Hadoop and NoSQL ecosystem on premise or on Cloud (AWS, Google or Azure) using many of the relevant technologies such as Nifi, Spark, Kafka, HBase, Hive, Cassandra, EMR, Kinesis, BigQuery, DataProc, Azure Data Lake etc.
•Minimum 1 year of architecting data and building performant data models at scale for the Hadoop/NoSQL ecosystem of data stores to support different business consumption patterns off a centralized data platform
•Minimum 1 year of Spark/MR/ETL processing, using Java, Python, Scala, or Talend, for data analysis of production Big Data applications
•Minimum 1 year of architecting and industrializing data lakes or real-time platforms for an enterprise enabling business applications and usage at scale
•Minimum 2 years designing and implementing relational data models working with RDBMS, and an understanding of the challenges in these environments

Preferred Skills
•Minimum 1 year of experience implementing SQL on Hadoop solutions using tools like Presto, AtScale and others
•Minimum 1 year of experience implementing large scale BI/Visualization solutions on Big Data platforms
•Minimum 1 year of experience implementing large scale secure cloud data solutions using AWS data and analytics services e.g. S3, EMR, Redshift
•Minimum 1 year of experience implementing large scale secure cloud data solutions using Google data and analytics services e.g. BigQuery, DataProc
•Minimum 1 year of experience building data management (metadata, lineage, tracking, etc.) and governance solutions for modern data platforms that use Hadoop and NoSQL on premise or on AWS, Google, or Azure cloud
•Minimum 1 year of experience securing Hadoop/NoSQL based modern data platforms on-premise or on AWS, Google, Azure cloud
•Minimum 1 year of re-architecting and rationalizing traditional data warehouses with Hadoop or NoSQL technologies on premise, or transitioning them to AWS or Google clouds
•Experience implementing data wrangling and data blending solutions for enabling self-service solutions using tools such as Trifacta, Paxata
•Minimum 1 year of industry systems development and implementation experience, OR minimum 2 years of data loading, acquisition, storage, transformation, and analysis
•Minimum 1 year of using Talend, Informatica, or similar ETL tools within a Big Data environment to perform large-scale, metadata-integrated data transformation
•Minimum 1 year of building Business Catalogs or Data Marketplaces on top of a Hybrid data platform containing Big Data technologies

Additional responsibilities include the following:
•Architect modern data solutions in a hybrid environment of traditional and modern data technologies such as Hadoop, NoSQL
•Create technical and operational architectures for these solutions incorporating Hadoop, NoSQL and other modern data technologies
•Implement and deploy custom solutions/applications using Hadoop/NoSQL
•Lead and guide implementation teams and provide technical subject matter expertise in support of the following:
-Designing, implementing and deploying ETL to load data into Hadoop/NoSQL
-Security implementation of Hadoop/NoSQL solutions
-Managing data in Hadoop/NoSQL co-existing with traditional data technologies in a hybrid environment
-Troubleshooting production issues with Hadoop/NoSQL
-Performance tuning of a Hadoop/NoSQL environment

If your skills match the above requirements, kindly forward your resume in Word format along with your expected rate/salary and contact number.
Feel free to reach me at 908-279-1199 or e-mail me: araut at matrixonweb dot com

(US citizens and those authorized to work in the US are encouraged to apply. We are unable to sponsor H1B Candidates at this time)

Dice Id : mategr
Position Id : 2481-AR
