We are a small but rapidly growing team, building out our development and data analytics capabilities to inform decisions and direction for our business while also shaping our products. We are looking for a Data Engineer who will not only build data pipelines to move data efficiently and reliably across systems, but also build the next generation of data tools that let us take full advantage of that data.

In this role, your work will broadly influence the company’s products, data consumers, and other stakeholders. We are looking for a candidate who can apply their experience with 1) moving data, 2) object-based storage, 3) performing ETL on data, and/or 4) data visualization.

What you will do:

  • Design and build a multidimensional data warehouse
  • Build and maintain the core data model, data pipelines, core data metrics, and data quality
  • Work directly with stakeholders across multiple functions (Product, Marketing, Alliance Management, Market Research) to define needs and requirements
  • Champion data warehousing best practices
  • Develop and build infrastructure in an AWS cloud environment
  • Build data expertise and own data quality for the data pipelines
  • Design and develop new systems and tools that enable folks to consume and understand data faster
  • Provide expert advice and education in the usage and interpretation of data systems to end consumers of the data

Basic qualifications:

  • B.S. or B.A. in computer science, math, economics, engineering or another related technical field
  • 5+ years of SQL experience as applied to ETL tools (Informatica, Kettle, Talend, Vertica, Pentaho, etc.)
  • Experience with relational databases and NoSQL infrastructure

Preferred qualifications:

  • 5+ years of experience building and optimizing ‘big data’ pipelines, architectures, and data sets
  • 5+ years of experience with MongoDB, AWS cloud services, scripting languages, Google Analytics, and other big data tools
  • Advanced knowledge and experience working with databases, preferably MongoDB
  • A successful history of manipulating, processing, and extracting value from large, disconnected datasets
  • Excellent communication skills, including the ability to identify and communicate data-driven insights
  • Familiarity with Google’s stack: Google Analytics, Google Tag Manager, and Google Data Studio


We’re looking for people who want to make an impact, not just an impression. Joining MyHealthTeams means working side by side with the architect of our platform (web and mobile) and being here from the ground up as we integrate NLP into our operations. We are looking for people with a mind wired for computer science who can learn, grow, and level up our company. There is huge opportunity to grow here. If you’re looking for a big corporate team, this isn’t it. If you’re looking for visibility and the chance to make a difference to the product, the platform, and the lives of the people who use it, join MyHealthTeams.

About us:

MyHealthTeams is a lean startup (currently 25 people, and growing) that creates social networks for people facing chronic health conditions. We’re backed by VC funding and an investment from CVS Health. We believe that if you’re diagnosed with a disease such as MS, lupus, breast cancer, or diabetes (to name a few of our 32 social networks), it should be easy to find the best people around to help you. We develop partnerships that empower and are transparent to our members (nearly 2 million strong, and growing), and we never share our members’ personally identifying information.

If you are interested in this position, please send your résumé and cover letter to DevJobs@MyHealthTeams.com with the subject line: Principal Data Warehouse Engineer.