Solutions Architect

$175,000 - $175,000

Full Time


    • Airflow
    • Apache Flink
    • Cortex
    • Data Warehouse
    • DevOps
    • ERP
    • ElasticSearch
    • GitHub
    • Kafka
    • Kanban

    Job Description

    Details of the role / application, as appropriate:

    • Architect and design a data observability solution to ensure smooth, trouble-free end-to-end data operations
    • Manage relationships with key stakeholders (Director and VP+ level) across platform, back-end, and client engineering; program management; and operational excellence
    • Think beyond conventional approaches to lead solutioning and define holistic, enterprise-level data observability solutions


    Key responsibilities and expected “output”:

    • Solutions architecture for Data Observability.


    “Required” tech stack and other related details:

    • 10+ years of experience architecting and delivering large enterprise Data Warehouse, BI, and Analytics solutions
    • 5+ years of experience as an enterprise architect with exposure to application, BI, DevOps, infrastructure, and data platforms, both on-premise and in the cloud
    • 3+ years of experience architecting analytics solutions using Google BigQuery, PySpark, Cloud Scheduler, Airflow, Hadoop, Scala, and Tableau
    • 3+ years of experience analyzing logs and telemetry to detect and resolve issues related to data quality and schema changes
    • 3+ years of experience using machine learning to detect anomalies and reduce false positives
    • 3+ years of experience managing software and system delivery for BI/DevOps/SRE/Observability platforms
    • Demonstrated experience building systems using agile development methodologies (Scrum, Kanban, or similar)
    • Deep knowledge of high-velocity data streaming, storage, aggregation, and analysis platforms (Kafka, Apache Flink, ElasticSearch, Prometheus/Cortex, and comparable systems)
    • Knowledge of operational tooling (CloudWatch, Splunk, SQL, Kibana, GitHub, Jenkins, Grafana, APMs, and comparable) and how engineers interact with data using these tools to gain insight into their systems


    “Good to have” tech stack and other related details:



    Any other information considered “critical” / “useful” (domain / sub-domain experience, knowledge, certifications):

    Check for familiarity with enterprise applications like Salesforce, Oracle ERP, SAP ERP, etc.; this is a basic requirement.