About AB InBev
AB InBev is the leading global brewer and one of the world’s top 5 consumer product companies. With over 500 beer brands, we’re number one or two in many of the world’s top beer markets: North America, Latin America, Europe, Asia, and Africa.
BEES is an e-commerce and SaaS company, created by AB InBev, on a mission to transform the traditional sales model by putting customers at its heart. Whether you are a small to medium sized retailer or a supplying partner, BEES provides you with the tools, data, and insights you need to help your business thrive.
Today, more than 2 million retailers across 10+ countries use BEES every month to browse for products, place orders, earn rewards, arrange deliveries, manage invoices, and access business insights, all from one place.
Learn more about how BEES keeps business buzzing at www.bees.com
Our team is in search of a Data Engineer to be a leader in our Global Revenue Management & Commercial organization and to help build and maintain our data engineering infrastructure, geared towards solving business problems and providing commercial value for ABI. This position reports directly to the Global Director of eCommerce Data Engineering and can sit remotely anywhere in the United States.
About the Team
We’re rethinking the way AB InBev does business with its retail customers and creating digital experiences to serve them. You will be joining a new digital organization within AB InBev consisting of digital strategy, product, design, analytics, operations and engineering. This organization is responsible for building the products and platforms that transform our traditional sales operations across the world.
About the Job
The Data Engineer will create, oversee, and maintain commercial and external data pipelines, working closely with our commercial, product, and technology leads to standardize our commercial and external data. This person should have experience building data integrations with sales and ERP systems such as SAP, and maintaining centralized data warehouses and data dictionaries for multiple commercial stakeholders.
Main stakeholders will include sales functions (commercial reporting, promo analytics, and sales algorithms) and customer data product leadership.
- Create and maintain optimal data pipeline architecture; own all data sources (transactional, internal, and external data within our e-commerce organization) and ensure their accuracy to support commercial functions in our digital platforms
- Take ownership of understanding the business and operational problems at hand and how to best solve them through a data-driven approach
- Execute and maintain the strategy for our data warehousing and pipelines to cater to specific business objectives, from implementation to maintenance
- Be a key leader in our global organization, working fluidly across ABI’s global and local teams and functions (sales, finance, marketing, product, IT, etc.) to make data-driven initiatives successful in an efficient way
- Prioritize and find the most efficient path towards solving complex, ambiguous business problems with data, keeping a mindset of simplicity, robustness and speed above all
- Have strong communication skills: able to clearly understand business objectives and translate them into an end-to-end data strategy and its execution
- Have experience working in a central analytics or data strategy role within a large enterprise environment
- Have experience operating a production-level data warehouse and/or data lake
- Live and breathe data – providing the best, most correct data is your obsession
- Have experience working with both highly technical and non-technical profiles on a day-to-day basis
Our current tech stack includes:
- Azure (Active Directory, Data Factory, Data Share, DevOps, Event Hub, Key Vault, Storage accounts/Blob containers/ADLS Gen2)
- New Relic
- Expertise in Python, C++, and/or a JVM language
- Familiarity with SQL
- Experience with data warehouses such as Google BigQuery, Snowflake, or Redshift
- Experience managing end-to-end data warehousing and pipeline delivery of commercial data for multiple analytics and BI stakeholders
- Experience deriving structured data from unstructured sources and maintaining SLAs for delivery
- Experience working in a full Data Engineering team: QA, Data Engineers, Data Analysts, DBAs, etc.
- Experience working for an international organization with teams distributed across geographies and time zones
- BA/BS degree (Computer Science, Software/Computer Engineering, Information Systems, Statistics, or similar technical field)
- Working knowledge of “Big Data” technologies such as MapReduce and/or Spark
- Experience building cloud-based solutions (AWS, Azure, GCP)
- Understanding of CI/CD and DevOps best practices
- Familiarity with Data Governance and related concepts, e.g., lineage, quality, integrity, security
- Familiarity with containerization technologies, e.g., Docker, Kubernetes
- Experience with workflow orchestration tools, e.g., Airflow, Luigi
- Functional programming experience
- Familiarity with ML development lifecycle and related tools/libraries
- Experience building streaming data pipelines using Kafka, Kinesis, Spark, and/or Flink
- API design expertise