Role: Data Architect III
Location: Las Vegas, NV 89113
Duration: Full-time / Direct Hire
Must pass a background/drug screen
------------------------------------------
Peak17's direct client is seeking an experienced Data Architect III to work onsite in their state-of-the-art corporate offices in Las Vegas, NV. This is an excellent direct hire opportunity for someone looking to be part of architecting their data modeling platform from the ground up. Enterprise Data Modeling experience, along with strong communication skills for interacting with the business side, is key to success in this role.
Position Summary:
This role is for an experienced Data Architect with a background in Software and Data Engineering who will assist in designing, developing, and deploying data-driven solutions as part of a strategic data transformation effort. The candidate will join a team of Data Architects and Engineers responsible for optimizing and transforming our data architecture, infrastructure, operations, and related functions. This team will work with developers, architects, business/data analysts, and data scientists on data initiatives and will ensure optimal data solutions.
Essential Job Functions:
- Technical leadership in Information and Data Architecture, working with Enterprise Data Architects leveraging the TOGAF architecture methodology with oversight of Domain Modeling, Logical Data Modeling and Physical Data Model implementation.
- Apply modern data management toolsets and coding methods to design, build, implement, and optimize data solutions of all types - including data warehouses, data lakes, ODS, streaming data, analytic and BI/visualizations, etc.
- Translate business issues and needs into Data and System requirements and Architect the management of data assets and their flow through the enterprise.
- Architect and Design Data Services (DaaS) for data consumption and manipulation throughout the Data Ecosystem, applying the "contract first" design principle and using APIs, Microservices, Microbatch, ELT Pipelines, and other methods.
- Transform legacy data structures and processes to modern, capable, and secure solutions in a hybrid cloud setup.
- Apply Data Engineering & Design best practices to architect solutions, using a deep understanding of various data formats and database design approaches.
- Architect Data Storage solutions for OLTP Systems (CRM, etc.), Analytics Platforms, Data Lakes, and Data Warehouses, using Relational Database and Object Storage methods tailored to best fit the need.
- Architect a best-practice data ingestion framework for batch and real-time data flows, and develop tooling to increase scale, accuracy, and automation in data pipelines that integrate with decisioning, AI/NLP, and consuming systems.
- Architect Data Catalogue and Metadata Management.
- Architect and Model for Master Data Management.
- Architect for various Analytics Methods, including Descriptive, Diagnostic, Predictive, and Prescriptive, and Capabilities including Real-time Analytics, Advanced Analytics, and Machine Learning (ML/AI/NLP).
- Work with Enterprise Architects and Information Security Architects to design highly secure data platform ecosystem by designing controls and protection strategies.
- Enable application performance and modernization by creating appropriate data capabilities to match.
- Determine best-in-breed Tools and Technologies, leveraging CNCF-backed Open Source, Managed Solutions and Engineered solutions where applicable.
Qualifications:
- 5+ years of experience in data analysis, engineering, architecture, and operations roles, including experience with transformational efforts.
- Strong database skills with RDBMS (e.g., Oracle, SQL) as well as modern relational and unstructured data sources (such as NoSQL), including cloud services (AWS/GCP/Azure). Hands-on experience with these tools is strongly preferred.
- Experience with tools such as (or similar to) Hadoop Stack, Airflow, Kafka, NiFi, PostgreSQL, Oracle, SQL Server, and Elasticsearch (ELK); JSON, Parquet, Avro, and other data storage formats; Tableau, Superset, and other visualization tools; Apache Atlas and other data-centric Apache packages.
- Extensive Knowledge of Design Patterns for Software and Data Engineering.
- Experience in on-prem and hybrid cloud infrastructure, including service and cost optimization.
- Experience with production and analytics data, batch and real time / streaming, etc.
- Experience in regulated industries preferred (such as financial services, insurance, healthcare, etc.).
- Familiarity with optimization tools and techniques, including Bayesian modeling and a variety of machine learning techniques.
- Ability to manage large programs and projects is essential.
For immediate consideration, please submit your resume in Word format, along with daytime contact information. LOCAL CANDIDATES ONLY PLEASE, unless you are willing to relocate at your own expense. Client is unable to provide H-1B Visa sponsorship at this time. All submittals will be treated confidentially. Selected candidate may be asked to pass a comprehensive background, credit, and/or drug screening. Principals only, no third parties please.
Peak17 Consulting (est. 2008) provides organizations of all sizes with high-quality, cost-effective staffing services. Clients turn to Peak17 for expertise in operational staffing and placement of Accounting/Finance, Human Resources, and Marketing professionals, as well as Information Technology resources.
Peak17 Consulting is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, gender expression, national origin, protected veteran status, or any other basis protected by applicable law, and will not be discriminated against on the basis of disability.
In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification document form upon hire.