
Data Engineer

BPM LLP · United States · Full-time
$80,000 per year

Job Description

BPM – where caring and community are in our company DNA; we are always striving to be our best selves; and we're compelled to ask the questions that lead to innovation. As a Data Engineer, you will help drive BPM's data-driven culture by building scalable, secure, and high-quality data pipelines that support insights and decision-making across the firm. You'll play an integral role in delivering, maintaining, and optimizing our data infrastructure in a cutting-edge Azure environment.

 

Working with BPM means using your experiences, broadening your skills, and reaching your full potential in work and life—while also making a positive difference for your clients, colleagues, and communities. Our shared entrepreneurial spirit drives us to see and do things differently. Our passion for people makes BPM a place where everyone feels welcome, valued, and part of something bigger. Because People Matter.


What you get:


  • Total rewards package: from flexible work arrangements to personalized benefit structures and financial compensation options that give you choice and flexibility
  • Well-being resources: interactive wellness platform and incentives, employee assistance program and mental health resources, and Colleague Resource Groups (CRGs)
  • Balance & flexibility: 14 Firm Holidays (including 2 floating), Flex PTO, paid family leave, winter break, summer hours, and remote work options
  • Professional development opportunities: a learning culture with CPA exam resources and bonuses, tuition reimbursement, a coach program, and workshops through BPM University



About BPM:


BPM LLP is one of the 40 largest public accounting and advisory firms in the United States with a global team of over 1,200 colleagues. A Certified B Corp, the Firm works with clients in the agribusiness, consumer business, financial and professional services, life science, nonprofit, wine and craft beverage, real estate and technology industries. BPM’s diverse perspectives, expansive expertise, and progressive solutions come together to create exceptional experiences for individuals and businesses around the world. To learn more, visit our website.



For this position, you will have:
  • Undergraduate degree in data or computer science, IT, statistics, or mathematics preferred
  • Minimum of 2 years of experience as a Data Engineer in a Databricks environment
  • Specific expertise in Databricks Delta Lake, notebooks, and clusters
  • Data Vault Modeling experience
  • Knowledge of big data technologies such as Hadoop, Spark, and Kafka
  • Strong understanding of relational data structures, theories, principles, and practices
  • Proficiency in Python and SQL programming languages
  • Strong understanding of data modeling, algorithms, and data transformation strategies for data science consumption
  • Experience with performance metric monitoring and improvement
  • Experience analyzing and specifying technical and business requirements
  • Ability to create consistent requirements documentation in both technical and user-friendly language
  • Excellent critical thinking skills and understanding of relationships between data and business intelligence
  • Strong communication skills with technical and non-technical audiences
  • Ability to work remotely and collaborate with a geographically distributed team
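
To give candidates a flavor of the Data Vault modeling requirement above, here is a minimal sketch of a hub/satellite pair, shown with SQLite for portability rather than Databricks; all table and column names are hypothetical, not BPM's actual schema:

```python
import sqlite3
import hashlib
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")

# Data Vault separates business keys (hubs) from descriptive,
# history-tracked attributes (satellites).
conn.executescript("""
CREATE TABLE hub_client (
    client_hk   TEXT PRIMARY KEY,   -- hashed surrogate of the business key
    client_id   TEXT NOT NULL,      -- the business key itself
    load_ts     TEXT NOT NULL,
    record_src  TEXT NOT NULL
);
CREATE TABLE sat_client_details (
    client_hk   TEXT NOT NULL REFERENCES hub_client(client_hk),
    load_ts     TEXT NOT NULL,
    name        TEXT,
    industry    TEXT,
    PRIMARY KEY (client_hk, load_ts)
);
""")

def hash_key(business_key: str) -> str:
    """Deterministic surrogate key, a common Data Vault convention."""
    return hashlib.md5(business_key.encode()).hexdigest()

now = datetime.now(timezone.utc).isoformat()
hk = hash_key("CLIENT-001")
conn.execute("INSERT INTO hub_client VALUES (?, ?, ?, ?)",
             (hk, "CLIENT-001", now, "crm_extract"))
conn.execute("INSERT INTO sat_client_details VALUES (?, ?, ?, ?)",
             (hk, now, "Acme Vineyards", "wine and craft beverage"))

row = conn.execute("""
    SELECT h.client_id, s.name
    FROM hub_client h JOIN sat_client_details s USING (client_hk)
""").fetchone()
print(row)  # ('CLIENT-001', 'Acme Vineyards')
```

The same hub/satellite split carries over directly to Delta Lake tables, where satellite history pairs naturally with time travel.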


In this position, you will:
  • Support BPM's culture of data, representing the firm's approach to data management, stewardship, lineage, architecture, collection, storage, and utilization for delivering analytic results
  • Build and maintain trusted business relationships that contribute to BPM's data culture
  • Stay current with the latest technologies and methodologies with a pragmatic mindset
  • Participate in technology roadmaps and maintain data pipeline and tool documentation

Data Pipeline Development
  • Build, maintain, and govern data pipelines in an Azure environment with best-of-breed technology
  • Develop pipelines to the data lakehouse, ensuring scalability, reliability, security, and usability for insights and decision-making
  • Develop, deploy, and support high-quality, fault-tolerant data pipelines
  • Build infrastructure for optimal extraction, loading, and transformation of data from various sources
  • Support architecture for observing, cataloging, and governing data
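
The "fault-tolerant" bullet above can be sketched, in a generic way that is not specific to BPM's stack, as a retry-with-backoff wrapper around a flaky extraction step (the `extract_batch` source here is invented for illustration):

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Re-run fn on failure with exponential backoff; re-raise after the last attempt."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Hypothetical flaky source: fails twice, then succeeds.
calls = {"n": 0}
def extract_batch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source outage")
    return [{"id": 1}, {"id": 2}]

rows = with_retries(extract_batch)
print(len(rows), calls["n"])  # 2 3
```

In production this pattern is usually delegated to the orchestrator's retry policy rather than hand-rolled, but the idea is the same.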

ETL / ELT

  • Build and optimize ELT functionality using Python, dbt, and SQL
  • Monitor and troubleshoot ELT processes to ensure accuracy and reliability
  • Implement development best practices including technical design reviews, test plans, peer code reviews, and documentation
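
To make the ELT bullets concrete: dbt expresses the "T" as SQL models run inside the warehouse, and the same load-then-transform pattern can be sketched with the standard library alone (the staging and mart table names below are illustrative, not the firm's actual models):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# "EL": land raw records in a staging table as-is.
conn.execute("CREATE TABLE stg_invoices (client TEXT, amount REAL)")
conn.executemany("INSERT INTO stg_invoices VALUES (?, ?)",
                 [("acme", 100.0), ("acme", 50.0), ("globex", 75.0)])

# "T": transform in SQL, the way a dbt model would,
# materializing an aggregate for downstream analytics.
conn.execute("""
CREATE TABLE mart_client_revenue AS
SELECT client, SUM(amount) AS total_revenue
FROM stg_invoices
GROUP BY client
""")

rows = conn.execute(
    "SELECT * FROM mart_client_revenue ORDER BY client").fetchall()
print(rows)  # [('acme', 150.0), ('globex', 75.0)]
```

Monitoring this kind of model then amounts to asserting expectations (row counts, uniqueness, freshness) against the materialized table, which is what dbt tests do.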

Data Governance & Security

  • Implement data governance and access controls to ensure data security and compliance
  • Collaborate with security to implement encryption, authentication, and authorization mechanisms
  • Monitor and audit data access to maintain data privacy and integrity
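
The governance bullets above can be sketched as a toy access-control check that records every attempt for auditing; the roles, datasets, and policy here are invented for illustration, not BPM's actual controls:

```python
from datetime import datetime, timezone

# Hypothetical policy: which roles may read which datasets.
POLICY = {
    "client_financials": {"data_engineer", "audit_partner"},
    "wine_inventory": {"data_engineer", "analyst"},
}
audit_log = []

def read_dataset(user, role, dataset):
    """Allow or deny access per POLICY, logging every attempt."""
    allowed = role in POLICY.get(dataset, set())
    audit_log.append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user, "dataset": dataset, "allowed": allowed,
    })
    if not allowed:
        raise PermissionError(f"{user} ({role}) may not read {dataset}")
    return f"rows from {dataset}"

print(read_dataset("pat", "analyst", "wine_inventory"))
try:
    read_dataset("pat", "analyst", "client_financials")
except PermissionError:
    pass
print([e["allowed"] for e in audit_log])  # [True, False]
```

In the Databricks environment named earlier, the equivalent controls are typically table ACLs and audit logging handled by the platform rather than application code.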

Collaboration & Communication

  • Collaborate with cross-functional stakeholders and IT to deliver meaningful outcomes
  • Profile data sources and understand data relationships to support analytics



Company Information

Location: San Francisco, California, United States

Type: Hybrid