Overview
$60/hr on C2C
Accepts corp to corp applications
Contract - W2
Contract - 1 day(s)
Skills
Python
AWS
ETL
Hadoop
Core Java
Big Data
Spark
Hive
Job Details
Requirement 1:
Position: Big Data Engineer with Core Java, Python, and AWS
Location: Remote - Atlanta, GA (on-site once a month)
Duration: Long-Term Contract
Job Description:
- Strong working experience with Python, AWS, UI development (JSP, Angular), Java, Spring, Spring Boot, and databases (Oracle, PostgreSQL, Aurora)
- Solid data engineering experience with EMR, PySpark, Redshift, and Glue; serverless experience (Lambda, Step Functions); and containerization (ECS with Fargate)
- Nice to have: SAS knowledge, DevOps knowledge (Jenkins, Bitbucket, Terraform/UCD/CloudFormation), test automation
Requirement 2:
Job Title: Big Data Engineer
Location: Hybrid - Intuit MTV office, 2-3 days a week
Core Skills: AWS/Google Cloud Platform; Python; Spark; Redshift; excellent communication skills; proactive, demonstrated thought leadership
Minimum Experience: 9+ years
- Experience in custom ETL design, implementation, and maintenance.
- Experience working with big data technologies (Hadoop, Hive, Spark, etc.).
- Experience with cloud services (preferably AWS) such as EMR, Redshift, Kinesis, and S3.
- Experience in the data warehouse space.
- Experience with schema design and dimensional data modeling.
- Experience in writing SQL statements.
- Proficiency in at least one programming language (preferably Python).
- Excellent communication skills; proactive, demonstrated thought leadership.