Job Details
MUST BE LOCAL TO WASHINGTON, D.C.
Overview: The position requires a quality-focused, detail-oriented Data Analyst with a demonstrated track record of delivering solutions for high-priority business goals and strategic management decisions. The incumbent should be a skilled troubleshooter and problem solver, able to translate business processes and problem statements into requirements, conversant with all stages of the SDLC, and able to deliver results within tight timelines. The incumbent will be expected to analyze the activities of a particular business unit or line of business.
2. Essential Job Functions:
* Interface with the business client to understand business needs and requirements, and help develop and maintain the client relationship
* Identify business and functional requirements by working with application end users, and lead the collection, analysis, documentation, and coordination of those requirements
* Build stakeholder consensus so that everyone is in agreement and can visualize the proposed solution
* Develop and maintain business cases, requirements, use cases, test plans, test strategies, test cases, and operational procedures and plans
* Document business processes and workflows; develop and maintain business process models
* Craft business cases to evaluate the feasibility of technology initiatives
* Design and execute test cases for application development and implementation projects
* Collaborate with IT professionals to determine whether solutions already exist (internally or externally) or whether new solutions are feasible to meet business requirements
* Leverage rapid prototyping approaches to present as-is/to-be processes and workflows and rough designs of the proposed solution
* Maintain a comprehensive chronological trail of requirements and agreements, and actively contribute to the project change control process
* Document and manage issues and actions
* Provide overall support to ensure the successful design, testing, and implementation of applications that support the business unit; also provide support in developing training materials and conducting training
* Document and manage issues and actions for IT applications and projects
* Prepare and deliver presentations using MS PowerPoint, Visio, and other tools, and clearly present ideas to stakeholders and management
* Participate in the evaluation of new products or initiatives to determine the required technology support
* Develop, deploy, and maintain business capability models according to the Bank's institutional methodology
3. Responsibilities:
* Lead and manage a team, co-located in Washington, DC and Chennai, to engineer, implement, review, and recommend cost-effective data management solutions, and develop code to automate various aspects of data management, with the goals of improving data reliability, efficiency, and quality
* Provide leadership in the design and engineering of data pipelines that are flexible, scalable, secure, and cost-effective, both on premises and in the cloud, to meet the growing needs of the Bank's data landscape
* Maintain accountability for the integrity of the data pipelines and ensure smooth daily operations, for example keeping the Bank's reporting data available at all times with well-established testing protocols, and troubleshoot any technical issues that arise
* Ensure that design, architecture, and security reviews of the data engineering framework and solutions are in line with industry best practices and ITS standards, and represent good practice
* Continue to innovate and establish a cloud-based data engineering framework that accommodates both "traditional" structured data sources and "non-traditional" data sources, supporting current and emerging needs
* Research opportunities for acquiring data tools and for new uses of existing data
* Work closely with other teams, including Platform Owners, Product Owners, Solution Architects, Data Use, and Data Governance, to achieve the best outcome
* Ensure alignment and partnership with internal client stakeholders and vendors, establishing strong linkages with their service and product teams to support activities covering on-premises and cloud technologies such as Azure PaaS services, Informatica Intelligent Cloud Services (IICS), Tableau, Tibco Data Virtualization, Collibra, Informatica MDM, SAP BusinessObjects, and Power BI
4. Educational Qualifications and Experience:
* Master's degree with 8 years' experience, OR an equivalent combination of education and experience, in a relevant discipline such as Computer Science
* Minimum 5 years of experience in each of the following areas: (i) developing options, roadmaps, and architectures; (ii) large enterprise systems, integration, and application development; (iii) managing teams; (iv) managing procurement processes (e.g., RFPs)
* Experience designing and deploying high-performance production services with robust monitoring and logging practices, and demonstrated ability to build and interact with large data processing pipelines, distributed data stores, and distributed file systems
5. Required Skills/Abilities:
* Good working knowledge of cloud platforms, covering Azure, and of on-premises platforms, covering traditional data management databases, data governance tools, and virtualization software
* Experience managing large teams: staffing, skills development, and organizing and operationalizing teams to deliver value
* Experience developing options, roadmaps, evaluations, and decision frameworks for complex enterprise solutions
* Demonstrated experience working and navigating in large, matrixed organizations with multi-layered governance structures, complex IT landscapes, and diverse client bases
* Excellent grasp of industry best practices in the data management domain, with experience successfully putting theory into practice in complex IT and business environments
* Organized, agile, persistent, and proactive, with the ability to juggle multiple tasks within tight deadlines
* Delivers information effectively in support of a team or workgroup; excellent communication, writing/documentation, and facilitation skills
* Proven ability to collaborate with other team members across boundaries and contribute productively to the team's work and output, demonstrating respect for different points of view
* Strong diplomatic, interpersonal, and teamwork skills to cultivate effective, productive client relationships and partnerships across organizational boundaries
* Takes personal ownership and accountability for meeting deadlines and achieving agreed-upon results, and has the personal organization to do so
* Ability to juggle multiple tasks in a fast-paced environment, and the maturity to participate in multiple complex programs at the same time in an agile environment
1. Azure-Specific Data Services: Proficiency in the core Azure ecosystem is paramount. This includes services like Azure Data Factory for building and orchestrating data pipelines, Azure Databricks for big data processing with Apache Spark, and Azure Synapse Analytics for data warehousing and analytics. A strong understanding of Azure storage options, such as Azure Data Lake Storage and Azure SQL Database, is also essential.
2. ETL/ELT and Data Pipeline Development: A fundamental skill is the ability to design, build, and maintain data pipelines. This involves using ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) processes to move data from various sources, clean and transform it, and then load it into a data store for analysis.
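The extract-transform-load pattern described above can be sketched in a few lines. This is a minimal, hypothetical illustration using pandas and an in-memory SQLite database as a stand-in for a real source and data store; the table and column names are invented for the example.

```python
import sqlite3

import pandas as pd

# Extract: in a real pipeline this would read from files, APIs, or source systems.
raw = pd.DataFrame({
    "order_id": [1, 2, 3, 3],                       # note the duplicate row
    "amount": ["10.50", "20.00", "5.25", "5.25"],   # amounts arrive as strings
    "region": ["east", "WEST", "east", "east"],     # inconsistent casing
})

# Transform: deduplicate, correct types, and standardize values.
clean = (
    raw.drop_duplicates(subset="order_id")
       .assign(amount=lambda df: df["amount"].astype(float),
               region=lambda df: df["region"].str.lower())
)

# Load: write the cleaned data into an analytical store (SQLite here).
conn = sqlite3.connect(":memory:")
clean.to_sql("orders", conn, index=False)

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 35.75
```

In an ELT variant, the raw data would be loaded first and the transformation step would run inside the target store, typically as SQL.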
3. Programming and Scripting: A solid grasp of programming is crucial.
* SQL is a must-have for querying and manipulating data in relational databases.
* Python is the most widely used programming language for data engineering, thanks to its versatility and extensive libraries (such as Pandas and NumPy) for data manipulation.
* Knowledge of other languages like Scala or Java can also be beneficial, especially for working with big data technologies such as Apache Spark.
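The first two skills are typically used side by side: SQL does set-based aggregation in the database, and Python/pandas takes over for further manipulation. A small hedged sketch (the `sales` table and its values are hypothetical, with SQLite standing in for a real database):

```python
import sqlite3

import pandas as pd

# Build a tiny illustrative table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('east', 100.0), ('west', 250.0), ('east', 50.0);
""")

# SQL performs the aggregation inside the database...
df = pd.read_sql_query(
    "SELECT region, SUM(amount) AS total FROM sales GROUP BY region ORDER BY region",
    conn,
)

# ...and pandas continues the analysis in Python.
df["share"] = df["total"] / df["total"].sum()
print(df)
#   region  total  share
# 0   east  150.0  0.375
# 1   west  250.0  0.625
```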
4. Data Modeling and Warehousing: Data engineers need to understand how to design efficient data structures. This includes knowledge of data modeling concepts like star schemas and normalization, as well as experience with data warehousing principles to create a foundation for effective business intelligence and analytics.
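The star-schema idea can be made concrete with a tiny example: a narrow fact table of measures keyed to a descriptive dimension table, which a BI query joins and aggregates. The table and column names below are illustrative only, with SQLite standing in for a real warehouse.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension table: one row per product, descriptive attributes only.
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    );
    -- Fact table: narrow rows of measures, keyed to the dimension.
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        sale_date   TEXT,
        amount      REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
    INSERT INTO fact_sales VALUES (1, '2024-01-05', 10.0),
                                  (2, '2024-01-06', 20.0),
                                  (1, '2024-01-07', 15.0);
""")

# A typical BI query: join the fact to its dimension and aggregate.
rows = conn.execute("""
    SELECT d.product_name, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.product_name
    ORDER BY d.product_name
""").fetchall()
print(rows)  # [('Gadget', 20.0), ('Widget', 25.0)]
```

Keeping descriptive attributes in the dimension (rather than repeated in every fact row) is what makes the schema compact to store and intuitive to query.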
Dexian is a leading provider of staffing, IT, and workforce solutions with over 12,000 employees and 70 locations worldwide. As one of the largest IT staffing companies and the 2nd largest minority-owned staffing company in the U.S., Dexian was formed in 2023 through the merger of DISYS and Signature Consultants. Combining the best elements of its core companies, Dexian's platform connects talent, technology, and organizations to produce game-changing results that help everyone achieve their ambitions and goals.
Dexian's brands include Dexian DISYS, Dexian Signature Consultants, Dexian Government Solutions, Dexian Talent Development, and Dexian IT Solutions. Visit our website to learn more.
Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.