Overview
On Site
Contract - W2
Contract - 8+ month(s)
Skills
Hadoop
Spark
Data Architecture
Databricks
Snowflake
ETL/ELT
Job Details
IKCON TECHNOLOGIES INC delivers exceptional IT services and solutions that give clients a definite edge over competitors while promoting the highest standards of quality. We are currently looking for a Data Scientist/Architect-Principal for one of our clients in Mason, OH. If you are actively looking for opportunities, please send us your updated resume with your contact details.
"U.S. Citizens and those authorized to work in the U.S. are encouraged to apply."
JOB TITLE | Data Scientist/Architect-Principal
CITY | Mason
STATE | OH
TAX TERMS | W2
EXPERIENCE | 20-25 years
INTERVIEW MODE | Teams Video Call/Telephonic
Job duties:
- Define and implement data architecture best practices, including data modeling, data pipelines, and data governance.
- Design scalable and efficient data solutions using cloud platforms (AWS, Azure) and modern data technologies (Snowflake, Databricks, Hadoop).
- Establish and maintain data standards, ensuring data integrity, security, and compliance (HIPAA, GDPR, PII).
- Lead the development and deployment of machine learning models, AI algorithms, and predictive analytics solutions.
- Drive innovation in data science by exploring advanced techniques in big data analytics.
- Optimize data processing workflows to improve model performance and reduce computational costs.
- Mentor and guide data engineers, data scientists, and analysts, ensuring alignment with business goals.
- Collaborate with cross-functional teams (engineering, product, business) to translate business needs into data-driven solutions.
- Advocate for data-driven decision-making across the organization.
- Oversee the design and maintenance of ETL/ELT pipelines for structured and unstructured data.
- Work with big data frameworks (Spark, Kafka, Autoloader) to handle high-volume data processing.
- Optimize data storage solutions, ensuring cost-effectiveness and performance efficiency.
- Develop data pipelines using Kafka and Storm to store data into HDFS.
- Present findings and insights to executive leadership, influencing strategic decisions.
- Define KPIs and metrics to measure the success of data initiatives.
- Ensure alignment between data strategy and business objectives.
- Prepare project tasks; mentor and lead team members.
- Lead sprint planning and product backlog grooming sessions.
- Track and report on project progress to ensure the project is executed without delays.
- Ensure project scope, timelines, and effort are properly documented, reviewed, and monitored.
- Schedule weekly status meetings and document meeting minutes, risks, and action plans.
- Review Change Controls and release schedules.