Job Purpose:
The Contractor shall provide Information Technology (IT) staff augmentation services in support of the Department's Office of Environmental Accountability & Transparency. The Contractor must provide the qualified Data Modeler candidate proposed in its Response, and selected by the Department, who has the experience and abilities outlined below.

About the Candidate: Qualifications

Education
- Bachelor's or master's degree in Data Science, Environmental Science, Information Science, or another Information Technology major, or equivalent work experience.

Mandatory Knowledge, Skills, and Abilities:
This position requires the following experience and/or knowledge:
- 3-5 years' experience in data engineering, including designing and implementing data pipelines and ETL processes. (Proficiency level 4)
- Proficiency in programming languages such as Python and SQL. (Proficiency level 3)
- Strong analytical and problem-solving skills, with the ability to analyze complex datasets and extract actionable insights. (Proficiency level 4)
- Knowledge of relational database design and data modeling. (Proficiency level 3)
- Experience with implementing data warehouses, data lakes, or data lakehouses. (Proficiency level 3)
- Ability to establish and maintain effective working relationships with others. (Proficiency level 3)
- Ability to work independently. (Proficiency level 3)
- Ability to determine work priorities and ensure proper completion of work assignments. (Proficiency level 4)
- Ability to communicate effectively, both verbally and in writing. (Proficiency level 3)

Preferred Knowledge, Skills, and Abilities:
- 3-5 years' experience with Alteryx Designer.
- Familiarity with environmental science, water quality, or related fields.
- Experience with business intelligence tools such as Qlik Sense.

Primary Job Duties and Tasks:
- Design, implement, and maintain robust data pipelines and architectures in Alteryx Designer.
- Create and maintain logical data models in Oracle SQL Developer Data Modeler.
- Read, write, and update data.
- Create and maintain the ETL code repository.
- Perform ad hoc cleansing of data sets as needed.
- Develop and implement data quality control procedures to ensure the accuracy, completeness, and consistency of environmental data.
- Define data quality standards and metrics.
- Execute procedures to monitor data quality.
- Identify data issues and propose remediation plans.
- Optimize data processing workflows and algorithms for efficiency, scalability, and reliability.
- Ensure compliance with data privacy regulations and security best practices in data handling, storage, and transmission.
- Stay current with emerging technologies, tools, and methodologies in data engineering and environmental science.
- Collaborate with data scientists and analysts to optimize models and algorithms for data quality, security, and governance.
- Monitor and tune data systems, identify and resolve performance bottlenecks, and implement caching and indexing strategies to enhance query performance.
- Transform raw data into a usable format by applying data cleansing, aggregation, filtering, and enrichment techniques.
- Establish governance of the data and algorithms used for analysis, analytical applications, and automated decision making.
- Provide leadership, guidance, and mentorship to junior staff members and colleagues, fostering a culture of continuous learning, innovation, and excellence in data and analytics (D&A) practices.