Overview
- Work arrangement: On Site
- Compensation: Depends on Experience
- Employment type: Contract - W2
Skills
- Snowflake
- dbt
Job Details
Job Title: Data Architect
Location: Cincinnati, OH (Onsite)
Mode: Contract on W2
**Must have: Snowflake and dbt**
Job Summary
We are seeking a highly skilled Data Architect. The ideal candidate will design, implement, and manage data architectures that enable scalable AI/ML solutions to solve real-world business problems.
Key Responsibilities
- Design and implement scalable data architectures to support AI/ML workloads.
- Work closely with Data Scientists, ML Engineers, and Business teams to understand requirements and convert them into technical solutions.
- Build and maintain data pipelines (ETL/ELT) to prepare structured and unstructured data for modeling.
- Ensure data quality, governance, privacy, and compliance standards are met.
- Design data lakes, warehouses, and MLOps pipelines using tools like Databricks, Snowflake, AWS/Google Cloud Platform/Azure, etc.
- Optimize data storage and processing to support large-scale AI/ML training and inference.
- Define and manage metadata, master data, and data catalog solutions.
- Evaluate and implement AI/ML tools and frameworks (e.g., TensorFlow, PyTorch, MLflow).
- Lead data modeling, schema design, and performance tuning.
- Collaborate with DevOps for CI/CD of ML models and data workflows.
Required Qualifications
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 5+ years of experience in data architecture and engineering.
- Strong expertise in AI/ML concepts and model lifecycle.
- Proficiency in SQL, Python, and data modeling techniques.
- Experience with big data platforms: Hadoop, Spark, Kafka, etc.
- Hands-on experience with cloud platforms (AWS, Azure, or Google Cloud Platform) and services like S3, Redshift, BigQuery, etc.
- Experience in building and managing MLOps pipelines.
- Familiarity with containerization (Docker, Kubernetes) and workflow tools like Airflow.
- Experience with data governance tools (e.g., Collibra, Alation).
- Knowledge of AI/ML model interpretability and ethical AI practices.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.