Overview
Remote
$55 - $60
Accepts corp to corp applications
Contract - W2
Contract - 12 Month(s)
Skills
Access Control
Accountability
Agile
Analytics
Cloud Computing
Collaboration
Communication
Continuous Delivery
Continuous Integration
DAX
Data Architecture
Data Flow
Windows PowerShell
Scripting
Semantics
Soft Skills
Sprint
Storage
Project Delivery
Microsoft SQL Server
Migration
Netezza
Optimization
Oracle
Management
Encryption
Extract
Transform
Load
Extraction
GitHub
HIPAA
Data Processing
Data Quality
Data Warehouse
Database
DevOps
ELT
Data Governance
Data Integration
Data Integrity
Data Lake
Data Management
HIS
Meta-data Management
Microsoft
Microsoft Azure
Microsoft Power BI
Privacy
Python
Real-time
Regulatory Compliance
Reporting
SQL
Teradata
Job Details
Job Description of Role
a) Act as the single point of contact for the client for technical delivery
b) Design and develop a robust data architecture that supports the organization's data needs, including data lake and data warehouse components
c) Define Data Architecture framework components – Integration, Data Management, and Data Consumption
d) Create conceptual, logical, and physical data models to ensure data integrity and consistency
e) Establish and enforce data governance policies and standards to maintain data quality and security
f) Design and implement data integration solutions to extract, transform, and load data from various sources into the data lake and data warehouse
g) Leverage Azure and Fabric services to build scalable and cost-effective data solutions
h) Optimize data queries and ETL processes to ensure efficient data access and analysis
i) Collaborate with data engineers, analysts, and business users to understand their requirements and translate them into technical solutions
j) Should be able to understand requirements and proactively clarify any doubts/assumptions
k) Should be a team player
l) Should be flexible with working hours (collaborate with the team in India, in the IST time zone)
m) Should be self-motivated and passionate about becoming a technology expert
n) Should be accountable for their work and committed to deadlines
o) Should be able to set an implementation path for projects
p) Should be able to provide effort estimations for projects
q) Should be able to conduct customer demos

Key skills: Microsoft Fabric, DBX, Data Lake, Data Factory, Azure, Data Warehouse, Modern Data Platforms Migration

Primary (Must-have skills)
Excellent verbal and written communication; should be able to speak confidently throughout the conversation.
Experience working with customer Business/IT teams.

Data Architecture Design:
1. Expertise in designing scalable and modular data architectures (Data Lake + Data Warehouse)
2. Experience creating conceptual, logical, and physical data models

Azure & Microsoft Fabric:
3. Strong knowledge of Microsoft Fabric components (Lakehouse, Dataflows, Pipelines, etc.)
4. Proficiency with Azure services: Synapse, Data Factory, Data Lake Storage, SQL DB, etc.

ETL/ELT & Data Integration:
5. Experience designing and implementing data pipelines for extraction, transformation, and loading from diverse sources
6. Familiarity with both batch and real-time data processing

Data Governance & Management:
7. Understanding of data quality frameworks, metadata management, lineage, and access control
8. Ability to define and enforce governance policies

Performance Optimization:
9. Tuning ETL jobs, optimizing data models, and improving query performance

Stakeholder Communication:
10. Ability to translate business requirements into technical solutions
11. Experience serving as a single point of contact for clients on technical delivery

Modern Data Platforms Migration:
12. Experience modernizing legacy data warehouses to cloud-native platforms

Well-Architected Framework:
13. Best design/coding practices
14. Experience working in a challenging environment with unclear requirements and contributing collaboratively with the team to refine them
15. Ability to offer innovative ideas and solutions that are modern and effective

Secondary Skills (Good to have)
1. Legacy DW Experience: Familiarity with traditional platforms like Teradata, Netezza, Oracle, or SQL Server.
2. Power BI: Working knowledge of Power BI for reporting, semantic model design, and DAX.
3. CI/CD & DevOps: Exposure to deploying data pipelines via Azure DevOps or GitHub Actions.
4. Security & Compliance: Understanding of data privacy, encryption, and regulatory standards (e.g., GDPR, HIPAA).
5. Data Catalogs: Experience with Microsoft Purview or other metadata management tools.
6. Agile Methodology: Comfort with Agile project delivery and sprint-based planning.
7. Scripting & Automation: Ability to use SQL, PowerShell, or Python for automation or tooling support.

Certifications:
Preferred: Microsoft Certified: Azure Data Engineer Associate
Additional beneficial certifications: DP-600 (Microsoft Fabric Analytics Engineer)/DP-700, DP-203 (Azure Data Engineer), Azure Solutions Architect Expert, or equivalent cloud/data certifications.

Soft skills/other skills (if any)
1) Should have good oral and written communication.
2) Should be a good team player.
3) Should be proactive and adaptive.
4) Good experience working with customer teams.
5) Should have a good attitude at work.