Are you an experienced, passionate pioneer in technology - a solutions builder and roll-up-your-sleeves technologist who wants a collaborative, think-tank environment where you can share new ideas with colleagues every day - without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center - we are breaking the mold of the typical Delivery Center.
Our US Delivery Centers have been growing since 2014, with significant continued growth on the horizon. Interested? Read more about our opportunity below.

Responsibilities
Function as an integrator between business needs and technology solutions, helping to create technology solutions that meet clients' requirements. Be responsible for developing and testing solutions that align with clients' systems strategy, requirements, and design, as well as supporting system implementation. Manage the data pipeline process from acquisition through ingestion, storage, and provisioning of data to the point of impact by modernizing and enabling new capabilities. Facilitate data integration in traditional and Hadoop environments by assessing clients' enterprise IT environments. Guide clients to the future-state IT environment that supports their long-term business goals. Enhance business drivers through enterprise-scale applications that enable visualization, consumption, and monetization of both structured and unstructured data.

The Team
Deloitte Consulting's Analytics & Cognitive offering leverages the power of analytics, robotics, and cognitive technologies to uncover hidden relationships in vast troves of data, create and manage large-scale organizational intelligence, and generate insights that catalyze growth and efficiencies.

Qualifications

Required
Strong technical expertise in most of the following:
- Hadoop (Cloudera distribution)
- Spark with Scala or Python programming
- Hive tuning, bucketing, partitioning, UDFs, and UDAFs
- NoSQL databases such as HBase, MongoDB, or Cassandra
- Experience and knowledge working with Kafka, Spark Streaming, Sqoop, Oozie, Airflow, Control-M, Presto, NoSQL, and SQL
- Expert-level usage of Jenkins and GitHub preferred
- Knowledge of working in the financial/insurance domain
- 6+ years of professional work experience
- Strong technical skills including understanding of software development principles
- Hands-on programming experience
- Must live a commutable distance to one of the following cities: Atlanta, GA; Austin, TX; Boston, MA; Charlotte, NC; Chicago, IL; Cincinnati, OH; Cleveland, OH; Dallas, TX; Detroit, MI; Gilbert, AZ; Houston, TX; Indianapolis, IN; Kansas City, MO; Lake Mary, FL; Los Angeles, CA; Mechanicsburg, PA; Miami, FL; McLean, VA; Minneapolis, MN; Nashville, TN; Orange County, CA; Philadelphia, PA; Phoenix, AZ; Pittsburgh, PA; Rosslyn, VA; Sacramento, CA; St. Louis, MO; San Diego, CA; Seattle, WA; Tallahassee, FL; Tampa, FL; or be willing to relocate to one of the following USDC locations: Gilbert, AZ; Lake Mary, FL; Mechanicsburg, PA.
- Limited Immigration sponsorship may be available.
- Travel up to 10% annually.
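The Hive tuning skills listed above (partitioning, bucketing) can be illustrated with a minimal DDL sketch; the table and column names here are illustrative assumptions, not part of any specific client environment:

```sql
-- Illustrative only: a transactions table partitioned by ingest date
-- and bucketed by customer id for join and sampling efficiency.
CREATE TABLE transactions (
  txn_id      STRING,
  customer_id STRING,
  amount      DECIMAL(12,2)
)
PARTITIONED BY (dt STRING)
CLUSTERED BY (customer_id) INTO 32 BUCKETS
STORED AS ORC;

-- Partition pruning: only the dt='2024-01-01' directory is scanned.
SELECT SUM(amount) FROM transactions WHERE dt = '2024-01-01';
```

Partitioning keeps each query's scan to the relevant date directories, while bucketing gives a stable hash distribution on customer_id that can speed up joins against other tables bucketed the same way.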
Preferred
- 3+ years' experience working with the Big Data ecosystem, including tools such as Hadoop, Spark, MapReduce, Sqoop, HBase, Hive, and Impala
- Proficiency in one or more modern programming languages like Python or Scala
- Experience with data lake and data hub implementations
- Knowledge of AWS or Azure platforms
- Knowledge of techniques for designing Hadoop-based file layouts optimized to meet business needs
- Able to translate business requirements into logical and physical file structure design
- Ability to build and test solutions in an agile delivery manner
- Ability to articulate reasons behind the design choices being made
- Bachelor of Science in Computer Science, Engineering, or MIS, or equivalent experience
- Any big data certification is a plus
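The file-layout skills above (designing Hadoop-style partitioned layouts and translating requirements into physical file structure) can be sketched in plain Python; the record fields and partition keys are hypothetical examples, not a specific client schema:

```python
import csv
from pathlib import Path
from tempfile import mkdtemp

# Hypothetical records; in practice these would arrive from an ingestion job.
records = [
    {"dt": "2024-01-01", "region": "us-east", "order_id": "1", "amount": "19.99"},
    {"dt": "2024-01-01", "region": "us-west", "order_id": "2", "amount": "5.49"},
    {"dt": "2024-01-02", "region": "us-east", "order_id": "3", "amount": "42.00"},
]

def write_partitioned(records, base_dir, partition_keys=("dt", "region")):
    """Write records into a Hive-style partitioned layout:
    base_dir/dt=.../region=.../part-0000.csv

    Partition column values are encoded in the directory path and
    therefore not repeated inside the data files.
    """
    base = Path(base_dir)
    groups = {}
    for rec in records:
        key = tuple((k, rec[k]) for k in partition_keys)
        groups.setdefault(key, []).append(
            {k: v for k, v in rec.items() if k not in partition_keys}
        )
    for key, rows in groups.items():
        part_dir = base.joinpath(*(f"{k}={v}" for k, v in key))
        part_dir.mkdir(parents=True, exist_ok=True)
        with open(part_dir / "part-0000.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=sorted(rows[0]))
            writer.writeheader()
            writer.writerows(rows)
    return base

out = write_partitioned(records, mkdtemp())
# A query filtering on dt/region can prune whole directories from the scan.
print(sorted(p.relative_to(out).as_posix() for p in out.rglob("part-0000.csv")))
```

This mirrors what engines like Hive and Spark do when writing partitioned tables: a filter on a partition column maps to skipping entire directories, which is the main lever behind the layout-design bullets above.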