Senior Data Engineer, 2

Posted 13 Hours Ago
New York, NY
Senior level
AdTech • Consumer Web • Digital Media • eCommerce • Marketing Tech • News + Entertainment
Dotdash Meredith is America's largest digital and print publisher, with over 40 brands and 200 million monthly users.
The Role
The Senior Data Engineer will build and optimize data integration pipelines for data lakes and warehouses. Responsibilities include collaborating with stakeholders, enhancing data transformations, ensuring data integrity, and championing coding standards. The role calls for coding and maintaining a multi-cloud infrastructure, as well as working closely with analytics engineers.
Summary Generated by Built In

About The Position | Major goals and objectives and location requirements 

Dotdash Meredith is a leading digital media company that owns and operates a portfolio of highly respected brands across various verticals, including lifestyle, health, finance, and more. With a commitment to providing high-quality content and innovative digital experiences, Dotdash Meredith reaches millions of users globally and continues to drive growth and engagement across its platforms.

Dotdash Meredith is looking for a Senior Data Engineer, 2 with strong Python and SQL skills to join the Data Operations team. The successful candidate will help build the data integration pipelines that feed our data lakes and warehouses while maintaining data quality and integrity in our data stores. We are looking for someone who is a great team player but can also work independently. This person will also work closely with key stakeholders to understand and implement business requirements and ensure data deliverables are met.

Remote or Hybrid 3x a month 

In-office Expectations: This position offers remote work flexibility; however, if you reside within a commutable distance of one of our offices in New York, Des Moines, Birmingham, Los Angeles, Chicago, or Seattle, the expectation is to work from the office three times per month.

 

About The Position's Contributions:

Weight %

Accountabilities, Actions and Expected Measurable Results

60%

  • You will enhance our systems by building new data integration pipelines and adding new data to our data lakes and warehouses while continuously optimizing them.

  • You will work with internal team members as well as stakeholders to scope out business requirements and see data deliverables through to completion, where they will be consumed via our Looker platform.

  • You will continuously look for ways to improve our data transformations and data consumption processes so that our systems are running efficiently, and our customers are able to use and analyze our data quickly and effectively.

 

40%

  • You will champion coding standards and best practices by actively participating in code reviews and working to improve our internal tools and build processes.

  • You will work to ensure the security and stability of our infrastructure in a multi-cloud environment.

  • You will collaborate with our Analytics engineers to ensure data integrity and the quality of our data deliverables.

The Role’s Minimum Qualifications and Job Requirements

Education:

  • Degree in a quantitative field, such as computer science, statistics, mathematics, engineering, data science, or equivalent experience.


Experience:

  • A minimum of five years of experience in building and optimizing data pipelines with Python.

  • You have experience writing complex SQL queries to analyze data.

  • Window functions and nested subqueries are second nature to you.

  • You have solid experience with at least one cloud service platform (GCP and AWS preferred).

  • You've worked with data at scale using Apache Spark, Beam or a similar framework.

  • You're familiar with data streaming architectures built on technologies like Pub/Sub and Apache Kafka.

  • You are eager to learn about new tech stacks, big data technologies, data pipelining architectures, and more, and to bring your findings to the team to help optimize our systems.

Specific Knowledge, Skills, Certifications and Abilities:

  • Strong Python and SQL skills.

  • Experience with Google Cloud Platform is a plus.

 

% Travel Required (Approximate): 0%

Top Skills

Python
SQL

The Company
New York, NY
3,500 Employees
Hybrid Workplace
Year Founded: 2021

What We Do

Dotdash Meredith's vibrant brands help over 200 million users each month find answers, solve problems, and get inspired. Dotdash is among the largest and fastest growing publishers online, and has won over 50 awards in the last year alone, including Digiday's 2020 Publisher of the Year. Dotdash Meredith brands include Verywell, Investopedia, The Balance, The Spruce, Simply Recipes, Serious Eats, Byrdie, Brides, People, Food & Wine, Shape, Entertainment Weekly, Travel & Leisure, Better Homes and Gardens, Southern Living, Health, InStyle, Parents, EatingWell, Magnolia Journal, MyDomaine, Lifewire, TripSavvy, Liquor.com, and TreeHugger.


Dotdash Meredith Offices

Hybrid Workspace

Employees engage in a combination of remote and on-site work.

Our employees work three days in office across our NYC, Des Moines, Birmingham, Seattle, LA, and Chicago locations. A couple of our orgs have remote teams (please refer to job descriptions). We accommodate various exemptions and situations.

