Join our team of over 5,000 employees globally who provide the data-driven insights and next-generation technology that help millions of people find, buy and protect the homes they love. From the beginning, CoreLogic has been driven by a single purpose - to innovate and create solutions that solve our clients' toughest challenges in the housing market. CoreLogic is the trusted source for property intelligence, with deep knowledge of the powerful economic, social, and environmental forces that promote healthy housing markets and thriving communities.
We apply that same dedication to creating a diverse and inclusive work culture that inspires innovation and bold thinking. A place where individuals can work on small teams, feel valued, and directly impact the real estate industry. We believe our team members are the best in the business, and we will continue to recruit, retain, develop and reward our most important asset - our people!
Working under minimal supervision, the Big Data Developer will be responsible for designing, developing, updating and supporting data pipelines and applications from a broad, system-wide perspective.
Big Data Developers work on trillions of bytes of data each day and are responsible for coding and programming data pipelines using technologies such as Java, Kafka, Apache Beam, Elastic and Spark, along with several databases (e.g., BigQuery). Big Data Developers are expected to be an integral part of a platform development team, competent to work on virtually all phases of application development and capable of independently carrying out all essential development tasks.
Level of work assigned is complex and broad in scope. This position, in partnership with TPM, business partners/product owners, gathers information and analyzes needs to determine feasibility of client requests. This position also takes an active mentoring role, and provides design scope and specifications to less experienced team members.
Main Job Responsibilities include:
Design, develop and implement scalable, high-performance data pipelines that extract, transform and load data into an enterprise repository and information products that help the organization reach its strategic goals. Help ensure our technological infrastructure operates seamlessly in support of our business objectives.
Consult with product owners/business partners and translate complex technical and functional requirements into detailed designs. Evaluate feasibility and make recommendations, considering factors such as customer requirements, time constraints and system limitations. Serve as a mentor to junior staff by conducting technical training sessions and reviewing project outputs.
Explore and research new and alternative Big Data technologies and platforms. Evaluate, recommend and apply these technologies, disseminating information throughout the team/department. This includes documenting large and complex assignments for knowledge transfer and developing expertise in multiple areas. Mentor other team members in your area(s) of expertise. Provide technical guidance on a wide range of systems/projects.
Support, maintain and document software functionality. Make recommendations on and influence engineering processes and methods. Provide operational support on complex/escalated issues to diagnose and resolve incidents in production data pipelines. Incidents tend to be fewer but more complex, requiring analysis of the issue, determination of the additional resources needed to resolve it, and an in-depth system perspective.
Education, Experience, Knowledge and Skills
• BS Degree or equivalent work experience in a software engineering discipline
• Typically has 3-6 years' experience in an applicable software development environment
• Use skills as a seasoned, experienced professional to work on all phases of development within broadly assigned technical discipline
• Proficiency in object-oriented/functional programming languages such as Python, Java or Scala
• Experience with streaming/messaging frameworks such as Kafka and RabbitMQ
• Hands on experience with Cloud Platforms (AWS, GCP, or Azure)
• Experience in designing and implementing large-scale event-driven architectures
• Understanding of data warehousing and data modeling techniques
• Experience building and optimizing big data pipelines, architectures and data sets
• Experience with the following tools and technologies:
-Hadoop, Spark, Hive, Elasticsearch
-Various database and big data technologies
-Google Cloud Dataflow, Apache Beam, Google AutoML
• Ability to develop and write technical specifications
• Strong communication skills and interest in pair programming environment
• Driven to excel in areas of technical expertise and expand base of knowledge
• Coaching and teaching skills to mentor less experienced team members
• Excellent analytical and problem management skills
• Domain specific industry experience in Real Estate, Insurance, or Mortgage a plus
• Good interpersonal skills and positive attitude
CoreLogic's Diversity Commitment:
CoreLogic is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone's unique contributions, experiences and values. We offer an empowered work environment that encourages creativity, initiative and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences.
EOE AA M/F/Veteran/Disability:
CoreLogic is an Equal Opportunity/Affirmative Action employer committed to attracting and retaining the best-qualified people available, without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, disability or status as a veteran of the Armed Forces, or any other basis protected by federal, state or local law. CoreLogic maintains a Drug-Free Workplace.
Please apply on our website for consideration.
Connect with us on social media! Click on the quick links below to find out more about our company and associates.